    #190899

    Hello,
    I just received an e-mail from Google:
    Over the last 24 hours, Googlebot encountered 4 errors while attempting to access your robots.txt. To ensure that we didn’t crawl any pages listed in that file, we postponed our crawl. Your site’s overall robots.txt error rate is 100.0%.

    If I type http://www.mydomaine.com/robots.txt in the browser,
    this is what appears:
    User-agent: *
    Disallow: /siteone/wp-admin/
    Disallow: /siteone/wp-includes/

    Why can't Google see it?
    What can I do to solve this problem?
    Is this even a problem?

    Thanks

    #190964

    Hi manzo!

    Try the following code instead:

    User-agent: *
    Disallow: /wp-admin
    Disallow: /wp-includes

    Best regards,
    Peter

    #190996
    This reply has been marked as private.
    #191786

    Hi!

    Please try the following code:

    
    User-agent: *
    Disallow: /cgi-bin
    Disallow: /wp-admin
    Disallow: /wp-includes
    Disallow: /wp-content/plugins
    Disallow: /wp-content/cache
    Disallow: /wp-content/themes
    Disallow: /trackback
    Disallow: /feed
    

    I validated it with http://www.searchenginepromotionhelp.com/m/robots-text-tester/robots-checker.php and it seems to work: http://www.clipular.com/c/5419162805469184.png?k=Ci1mASEEcST2NEHva0mSwBcQzgo

    When I try to validate your file, I get an error in the last line.
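    If you want to double-check without the online tool, Python's built-in urllib.robotparser does roughly the same thing. This is only a sketch and it uses the placeholder domain from the first post, so swap in the real URL:

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the live robots.txt the same way a well-behaved crawler would.
    rp = RobotFileParser("http://www.mydomaine.com/robots.txt")
    rp.read()

    # can_fetch() returns True when the URL is allowed and False when it is blocked.
    print(rp.can_fetch("Googlebot", "http://www.mydomaine.com/"))           # expected: True
    print(rp.can_fetch("Googlebot", "http://www.mydomaine.com/wp-admin/"))  # expected: False with the rules above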

    The /siteone/ directory must not appear in the robots.txt paths, because the site URL already uses /siteone/ as its root directory and crawlers cannot access the /siteone/ directory directly. Otherwise this URL would also work, but it does not: http://danielzahner.ch/siteone/robots.txt
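    To illustrate that: a crawler matches the Disallow paths against the URL path it actually requests, so a rule that starts with /siteone/ simply never applies. A small sketch with Python's urllib.robotparser (the domain and URL are placeholders):

    from urllib.robotparser import RobotFileParser

    url = "http://www.mydomaine.com/wp-admin/install.php"

    # Rules that keep the install sub-directory in the path -- they never match.
    with_prefix = RobotFileParser()
    with_prefix.parse(["User-agent: *", "Disallow: /siteone/wp-admin/"])
    print(with_prefix.can_fetch("Googlebot", url))     # True, i.e. the URL is NOT blocked

    # The same rules without the prefix -- they block the URL as intended.
    without_prefix = RobotFileParser()
    without_prefix.parse(["User-agent: *", "Disallow: /wp-admin/"])
    print(without_prefix.can_fetch("Googlebot", url))  # False, i.e. the URL is blocked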

    Enfold itself certainly does not generate a robots.txt file, but it seems that WordPress does. You can try this plugin to change its content: http://wordpress.org/plugins/wp-robots-txt/
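    And if you want to see exactly what WordPress hands out to a crawler (status code and content), something like the following helps; the domain is again the placeholder from the first post and the user-agent string is only an example:

    from urllib.request import Request, urlopen

    req = Request(
        "http://www.mydomaine.com/robots.txt",
        headers={"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
    )

    # Anything other than a 200 status would explain the crawl errors Google reported.
    with urlopen(req, timeout=10) as resp:
        print(resp.status)
        print(resp.read().decode("utf-8"))  # the rules the crawler actually receives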

    Regards,
    Peter

The topic ‘Googlebot can’t access your site’ is closed to new replies.