Viewing 30 posts - 1 through 30 (of 34 total)


    After a lot of time investigating this problem, I finally found why Google Webmaster Tools is showing me nearly 50 errors. All of them say that URLs containing ?s= return a soft 404 (small sample):


    It is very frustrating to worry about SEO and then watch the effort be undone by little things like this.

    If I go to Enfold -> Header -> Extra Elements and uncheck “If enabled a search Icon will be appended to the main menu that allows the users to perform an ‘AJAX’ Search”, the search box disappears and I can solve the problem: no more URLs with ?s= in GWT.

    But I want the search box on my site. Could you please remove ?s= from the href of the menu item? This is what causes the situation (I mark in bold what you should change):

    href="?s=" data-avia-search-tooltip=

    Complete code:

    <li id="menu-item-search" class="noMobile menu-item menu-item-search-dropdown menu-item-avia-special">
    		<a href="?s=" data-avia-search-tooltip="<form action=&quot;; id=&quot;searchform&quot; method=&quot;get&quot; class=&quot;&quot;>
    		<input type=&quot;submit&quot; value=&quot;&quot; id=&quot;searchsubmit&quot; class=&quot;button avia-font-entypo-fontello&quot; />
    		<input type=&quot;text&quot; id=&quot;s&quot; name=&quot;s&quot; value=&quot;&quot; placeholder='Search' />
    </form>" aria-hidden='true' data-av_icon='' data-av_iconfont='entypo-fontello'><span class="avia_hidden_link_text">Search</span></a></li>

    Thanks a lot,

    • This topic was modified 3 years, 10 months ago by  fromcouch. Reason: link nightmare

    Hey Victor,

    Please post us your login credentials (in the “private data” field), so we can take a look at your backend.

    Login credentials include:

    • The URL to the login screen.
    • A valid username (with full administration capabilities).
    • As well as a password for that username.
    • Permission to deactivate plugins, if necessary.

    Best regards,


    Credentials in Private field.

    Please use our test environment; it has the same problem.



    You can and should block search pages, or any URL containing the search query, in the robots.txt file.

    User-agent: *
    Disallow: /search
    Disallow: /*?s=


    Best regards,


    That is not a valid solution; the problem is still there.
    I think it is better if you change line 126 of functions-enfold.php and replace

    with a valid one:

    the final line will be:
    <a href="#" data-avia-search-tooltip="'.$form.'" '.av_icon_string('search').'><span class="avia_hidden_link_text">'.__('Search','avia_framework').'</span></a>



    Thanks for the update.
    The disallow method is proper, as it is also allowed by Google, but we will update Kriesi with what you suggest and, if it is possible, he will do it.

    Thank you



    Hi fromcouch

    thanks for raising this issue – I thought I should “second” the issue.

    So, having used enfold themes on some sites for around 3 years I suddenly get a lot of soft 404 errors for pages ending in /*?s= that appear on google console/webmaster tools.

    I will try adding the disallow rules to the robots.txt file.

    However, any idea why this has suddenly become an issue – sometime in the last few weeks?




    However, any idea why this has suddenly become an issue – sometime in the last few weeks?

    Really, no idea. Google changes its mind quickly on this kind of question. Another possibility is that Kriesi changed something in the Enfold theme that raises this 404.

    I really don’t know. I have been using Enfold for a year and this is the first time I have seen this.



    I’m also experiencing this issue, as I suspect is every website using Enfold with search. Are we certain there is not another solution besides altering the robots.txt file? This was not happening prior to updating Enfold to the newest version.

    Also, I received the notification below from Google this morning, and all of the URLs in Google’s soft 404 report end in /?s=.

    Increase in “soft-404” pages on “my website name here”

    To: Webmaster of “my website name here”

    Googlebot identified a significant increase in the number of URLs on “my website name here” that should return a 404 (not found) error, but currently don’t. This can cause a bad experience for your users, who might have been looking for a specific page, but end up elsewhere on your website. This misconfiguration can also prevent Google from showing the correct page in search results.

    Recommended Actions:

    Identify the URLs with errors
    Open the Crawl Errors report in your Search Console account to review the list of sample URLs.
    Check Crawl Errors
    Fix the issue
    Check your server and CMS settings to make sure that these URLs return a 404 (Not Found) or 410 (Gone) HTTP response code in response to requests for non-existent pages. You may need help from your server administrator or hoster for this step.
    Verify the fix
    Once you’ve fixed the URLs with errors, make sure that Googlebot can access and see your content properly, or that they return a proper error result code. You can verify this using Fetch as Google.
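    For reference, Google’s “return a real 404” recommendation can be approximated in WordPress. This is only a sketch of that idea, not official Enfold code; the helper name is my own, and it assumes you add it to a child theme’s functions.php:

```php
<?php
// Hypothetical helper: true when the request is an "empty search"
// (a ?s= parameter that is present but blank).
function is_empty_search_request( array $query_vars ) {
    return isset( $query_vars['s'] ) && '' === trim( (string) $query_vars['s'] );
}

// Only hook into WordPress when it is actually loaded.
if ( function_exists( 'add_action' ) ) {
    add_action( 'template_redirect', function () {
        if ( is_empty_search_request( $_GET ) ) {
            global $wp_query;
            $wp_query->set_404();  // Mark the main query as a 404.
            status_header( 404 );  // Send a real 404 HTTP status code.
            nocache_headers();
        }
    } );
}
```

    With something like this in place, requests for /?s= would answer with an HTTP 404, which is what the Google email asks for.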


    I’m having the same issue. And I’m using Enfold on about 30 sites. Granted, not all of those sites use the search field in the menu bar option, but many do. Please do let us know whether or not this issue will be addressed in the next theme update. Thanks.


    Hi Ismael,

    In your recommendation for adding to the robots.txt file you include: Disallow: /*?s=

    Is there a purpose for the asterisk? I’m asking because the URLs in my GWT soft 404 report do not show an asterisk and I continue to accrue soft 404 errors despite the fact that I have added your recommended text to my robots.txt file. Also, I have verified that GWT has the correct robots.txt file for my site. Here is a sample link from my GWT:


    Thanks for your help!



    I got the solution from the SO forum and modified it so that it will only affect the search query.


    Google may take its time to remove the URLs that you have blocked from the search index. The extra URLs may still be indexed for months. You can speed the process up by using the “Remove URLs” feature in webmaster tools after they have been blocked. But that is a manual process where you have to paste in each individual URL that you want to have removed.


    Best regards,


    I also wanted to chime in and say I’m getting similar URL queries in Search Console. I’m also using the Enfold theme. I have used the robots file to block these search results but am still getting these 404 errors. It would be best to get rid of the bad code (as suggested by fromcouch) to allow Google to crawl the site more efficiently. In my opinion, the removal tool should not be used to get rid of crawl errors (there is even a disclaimer at the bottom of the help page that says not to use it for this).



    We would be glad to help you with this customization, but at the moment there is no easy way to do this by using a small custom code snippet, so I am afraid it’s out of the scope of our support.

    But please feel free to request such feature at our feature request system:

    This system allows us to keep track of user suggestions and lets you vote on the feature you would like to see the most. I am afraid though there is no guarantee that a feature will get implemented. If that’s something you really need you can always try to hire a developer for the task :)

    Best regards,


    Hi, I have also suddenly received the same warning email from Google saying that I have a sudden increase in soft 404 errors, and they are all URLs ending in “/?s=”. After slightly panicking I found this thread; it seems I’m not alone with this issue.
    I’ve added, as recommended above, the following to my robots.txt:
    Disallow: /search
    Disallow: /*?s=

    thanks for this info.

    Google’s email at the end said “Once you’ve fixed the URLs with errors, make sure that Googlebot can access and see your content properly, or that they return a proper error result code. You can verify this using Fetch as Google.”

    Do I need to do this now, or just wait? How long does it take?
    Many thanks



    We are not really sure how long, but it should take a while. However, you can ask Google to recrawl the URL.




    Hi Andy,

    In a previous thread you said…

    Most people want their search results appearing in Google


    I still don’t understand this claim. Why would a URL that causes a Soft 404 error be beneficial?

    Every error created by Enfold’s search box contains empty searches… that go to an error page or to the same page (duplicate content)?



    I’m sorry for the confusion. 404 URLs are neither beneficial nor detrimental to a site, but it is recommended to block search pages or search results. Please follow the solution that we provided in this thread.

    Related articles:

    Best regards,


    Ok, Thanks!



    I will close the thread here, as the details have been provided many times.
    Thank you very much for that.

    Best regards,


    Hi guys!
    Any news on this? The 404s are not as big a deal as the waste of Google’s crawl budget. Good content cannot be crawled, and for large sites it will definitely affect Search indexing (it already is).

    I have tried to fix it with robots.txt but it doesn’t solve the problem.

    So I thought I would share Yoast’s article: (Yoast’s plugin does block indexing of non-existent search pages, but it doesn’t cover Enfold’s search functionality.) This is such a shame because the menu search functionality on mobile is absolutely terrific, but I am evaluating what’s more important.

    If anybody has a code solution, please feel free to share.




    Hi Havi,

    The solution is to add these lines to your robots.txt. I have not had any soft 404 errors in 3 months.

    Disallow: /*?wordfence_logHuman=
    Disallow: /*?s=
    Disallow: /*?redirect_to=


    Thank you @Coreyfocusgroup. I have never used the Wordfence plugin. Does Enfold use it?

    I’m going to keep the crawling of redirects. GoogleBot only crawls 302s and 307s every time. 301s are only crawled once.

    Also, I have already implemented Disallow: /*?s= to no avail. The reality is that you need to allow Googlebot to crawl and add a noindex directive to the “page”, but as there is no page, that is hard to do. This happens because the bot does not enter any data to search for. So if the code stated that, when the search field is empty, the result should be noindex, nofollow, it would be a cleaner fix.
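    That “noindex when the search field is empty” idea could be sketched in a child theme’s functions.php. This is just my assumption of how it might look, not tested Enfold code, and the function name is hypothetical:

```php
<?php
// Hypothetical sketch: build a noindex,nofollow robots meta tag when the
// search term is empty, so crawlers drop the empty ?s= "page" from the index.
function empty_search_robots_meta( $search_term ) {
    if ( '' === trim( (string) $search_term ) ) {
        return '<meta name="robots" content="noindex, nofollow" />';
    }
    return '';  // Real searches are left alone.
}

// Only hook into WordPress when it is actually loaded.
if ( function_exists( 'add_action' ) ) {
    add_action( 'wp_head', function () {
        if ( is_search() ) {
            echo empty_search_robots_meta( get_search_query() ) . "\n";
        }
    } );
}
```

    Unlike a robots.txt rule, this still lets Googlebot crawl the URL, which is what a noindex directive needs in order to be seen.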

    I will test it on a smaller site. Meanwhile I disabled the tooltip on the large site, deleted all of the soft 404s (to save crawling budget) and will see if they re-appear.

    I may modify the code like @fromcouch indicated changing the ?s= for # on the small site and test if it works.

    I’ll share my findings in about 2 weeks.

    I do hope one of these solutions takes care of it. :)

    Thanks again!!




    Hi Havi,

    I’m simply giving you an example of the syntax that works. Each of these URL parameters was causing soft 404 errors, and these commands stopped it. You also need to make sure your robots.txt is accessible. You can test it in Google Webmaster (Console) for your preferred domain. My preferred domain format is (SSL with no www).

    I’m not sure “redirect_to” is what you think it is. It’s a URL parameter, not a directive. It has nothing to do with a 301 or any other 3xx redirect. I have a dozen redirects set up, and none of those show up in the soft 404 errors shown in Google Webmaster. The “redirect_to” example URLs I’m seeing in Google Webmaster seem to have something to do with wp-login.php and certainly should not be crawled.

    This definitely works… Is Google Webmaster (Console) set up to “Let Googlebot decide” for “s”?
    Disallow: /*?s=

    Example URLs that show up for “redirect_to”

    Wordfence is a commonly used security plugin and is not part of Enfold.

    Disallow: /*?wordfence_logHuman=
    Disallow: /*?s=
    Disallow: /*?redirect_to=


    Hi Havi,

    I just used Fetch as Google in Google Webmaster (Console), and none of my page redirects are being blocked – all can be indexed.

    Only URLs with “redirect_to” are being blocked and those don’t seem to have anything to do with 300 level redirects. I assume it’s something being generated by Enfold or another plugin. But it’s definitely not blocking any page redirects that I have setup.


    I have not tested it yet because my first solution was to deactivate the search nav in my WordPress.
    But this may work (add it to the functions.php of a child theme):

    add_filter( 'wp_nav_menu_items', function( $items, $args ) {
        // Swap the empty search href for a harmless anchor so crawlers
        // don't follow ?s= into a soft 404.
        return str_replace( 'href="?s="', 'href="#"', $items );
    }, 9999, 2 );


    That can be a nice solution.
    Feel free to share any other suggestions that we can implement.

    Best regards,


    I’m also getting this problem.

    Will this be fixed in future updates?



    Add this line to your robots.txt:

    Disallow: /*?s=


    @corefocusgroup, please stop; as was discussed before, using robots.txt isn’t a valid solution for us. Or did you not read the whole thread?

