We are using the Enfold theme and in our Google Webmaster Tools account, Google has crawled and thrown back 403 crawl errors on the following directory:
I know it’s a good thing to allow Google to crawl the CSS, JS, and image files inside the framework folders, but I don’t think we want it crawling the PHP files, since we already block direct access to them at the server level. Is this just a matter of disallowing that php directory in our robots.txt file? However, there are additional CSS and JS files inside that php directory.
I also notice that the following was also crawled by Google and is throwing back a 403 crawl error:
And it throws this fatal error in the browser:
Fatal error: Call to undefined function do_action() in /home/…/public_html/wp-content/themes/enfold/framework/avia_framework.php on line 28
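Presumably this happens because the file is being requested directly, outside WordPress, so core functions like do_action() are never loaded. For comparison, many theme files prevent direct access with a guard along these lines (an illustrative snippet, not Enfold’s actual code):

```php
<?php
// Illustrative guard only, not taken from Enfold's source.
// ABSPATH is defined by WordPress during a normal page load,
// so a direct request to this file exits before anything runs.
if ( ! defined( 'ABSPATH' ) ) {
    exit;
}
```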
What is the best practice for preventing Google from crawling these files, and, for that matter, for blocking any website visitors too?
Is this something we can also close off from our .htaccess file?
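For instance, would something like the following work? This is just a sketch, with the path assumed from the default Enfold layout, and it matches the 403 responses we are already seeing:

```apacheconf
# In the site-root .htaccess: return 403 for any direct request
# to a PHP file under the theme's framework directory.
# Path assumed from the default Enfold layout.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteRule ^wp-content/themes/enfold/framework/.*\.php$ - [F,L]
</IfModule>
```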
Thank you for using Enfold.
You can disallow the framework > php folder (and the other internal folders) via robots.txt. It is true that the directory contains JS and CSS files, but they are not used for the front-end rendering of the site. Only allow the scripts and stylesheets that contribute to the front end, so that crawlers can render the page properly.
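As a minimal sketch of that advice (the path is assumed from the default Enfold layout), a robots.txt in the site root could look like this; everything not listed, including the front-end CSS and JS, stays crawlable by default:

```
# robots.txt in the site root, path assumed from the default Enfold layout
User-agent: *
Disallow: /wp-content/themes/enfold/framework/php/
```

Note that robots.txt only asks crawlers not to fetch those URLs; blocking human visitors still has to happen at the server level, e.g. via .htaccess.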