Tagged: google search, google webmaster, url parameter
October 31, 2016 at 1:47 pm #706140
Hi,
After a lot of time investigating this problem, I finally found why Google Webmaster Tools is showing me nearly 50 errors. All of them say that URLs ending in ?s= return a soft 404 (small sample):
blog/tag/web/?s=
portfolio/plugin-moodle-conector-woocommerce/?s=
blog/tag/thebeerproject/?s=
portfolio/products-short-url/?s=
It is very annoying to worry about SEO and see the efforts wasted on little things like this.
If I go to Enfold -> Header -> Extra Elements and uncheck "If enabled a search Icon will be appended to the main menu that allows the users to perform an 'AJAX' Search",
the search box disappears and the problem is solved. No more URLs with ?s= in GWT.
But I want the search box on my site. Could you please remove the ?s= from the href in the menu item? This is what causes the situation (the part you would change is shown below):
href="?s=" data-avia-search-tooltip=
Complete code:
<li id="menu-item-search" class="noMobile menu-item menu-item-search-dropdown menu-item-avia-special">
  <a href="?s=" data-avia-search-tooltip="<form action="http://www.myurl.net/" id="searchform" method="get" class="">
    <div>
      <input type="submit" value="" id="searchsubmit" class="button avia-font-entypo-fontello" />
      <input type="text" id="s" name="s" value="" placeholder='Search' />
    </div>
  </form>" aria-hidden='true' data-av_icon='' data-av_iconfont='entypo-fontello'><span class="avia_hidden_link_text">Search</span></a>
</li>
Thanks a lot,
Victor
October 31, 2016 at 4:43 pm #706285
Hey Victor,
Please post us your login credentials (in the “private data” field), so we can take a look at your backend.
Login credentials include:
- The URL to the login screen.
- A valid username (with full administration capabilities).
- As well as a password for that username.
- Permission to deactivate plugins if necessary.
Best regards,
Jordan
November 1, 2016 at 12:55 am #706518
Credentials are in the private data field.
Please use our test environment. It has the same problem.
November 3, 2016 at 3:31 am #707487
Hi,
You can and should disallow or block search pages, or any URL containing the search query, in the robots.txt file.
User-agent: *
Disallow: /search
Disallow: /*?s=
// https://kriesi.at/support/topic/links-to-404-pages/#post-588813
Best regards,
Ismael
November 3, 2016 at 5:26 pm #707875
Not a valid solution; the problem is still there.
I think it is better if you change line 126 of functions-enfold.php, replacing
?s=
with a valid
#
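For reference, the unpatched line would then presumably be the following (an inference from the final line below and the rendered markup above, rather than a verbatim copy of the theme file):
<a href="?s=" data-avia-search-tooltip="'.$form.'" '.av_icon_string('search').'><span class="avia_hidden_link_text">'.__('Search','avia_framework').'</span></a>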
The final line will be:
<a href="#" data-avia-search-tooltip="'.$form.'" '.av_icon_string('search').'><span class="avia_hidden_link_text">'.__('Search','avia_framework').'</span></a>
November 3, 2016 at 8:18 pm #707932
Hi!
Thanks for the update.
The disallow method is proper, as it is also sanctioned by Google, but we will update Kriesi with what you suggest and, if it is possible, he will do it.
Thank you
Cheers!
Basilis
November 8, 2016 at 11:18 am #709698
Hi fromcouch,
Thanks for raising this issue – I thought I should "second" it.
So, having used Enfold themes on some sites for around 3 years, I suddenly get a lot of soft 404 errors in Google Console/Webmaster Tools for pages ending in ?s=.
I will try adding the disallow rules to the robots file.
However, any idea why this has suddenly become an issue, sometime in the last few weeks?
Regards
Richard
November 8, 2016 at 12:14 pm #709735
However, any idea why this has suddenly become an issue, sometime in the last few weeks?
Really no idea. Google changes its mind quickly on these kinds of questions. Another possibility is that Kriesi changed something in the Enfold theme that raised these 404s.
I really don't know. I have been using Enfold for a year and this is the first time I have seen this.
Regards,
November 8, 2016 at 7:30 pm #709937
I'm also experiencing this issue, as I suspect is every website using Enfold with search. Are we certain there is not another solution besides altering the robots.txt file? This was not happening prior to updating Enfold to the newest version.
Also, I received the notification below from Google this morning, and all of the URLs in Google's soft 404 report end in /?s=.
Increase in “soft-404” pages on “my website name here”
To: Webmaster of “my website name here”
Googlebot identified a significant increase in the number of URLs on “my website name here” that should return a 404 (not found) error, but currently don’t. This can cause a bad experience for your users, who might have been looking for a specific page, but end up elsewhere on your website. This misconfiguration can also prevent Google from showing the correct page in search results.
Recommended Actions:
1. Identify the URLs with errors
Open the Crawl Errors report in your Search Console account to review the list of sample URLs.
Check Crawl Errors
2. Fix the issue
Check your server and CMS settings to make sure that these URLs return a 404 (Not Found) or 410 (Gone) HTTP response code in response to requests for non-existent pages. You may need help from your server administrator or hoster for this step.
3. Verify the fix
Once you've fixed the URLs with errors, make sure that Googlebot can access and see your content properly, or that they return a proper error result code. You can verify this using Fetch as Google.
November 9, 2016 at 9:20 pm #710445
I'm having the same issue. And I'm using Enfold on about 30 sites. Granted, not all of those sites use the search field in the menu bar option, but many do. Please do let us know whether or not this issue will be addressed in the next theme update. Thanks.
November 9, 2016 at 10:16 pm #710463
Hi Ismael,
In your recommendation for adding to the robots.txt file you include: Disallow: /*?s=
Is there a purpose for the asterisk? I’m asking because the URLs in my GWT soft 404 report do not show an asterisk and I continue to accrue soft 404 errors despite the fact that I have added your recommended text to my robots.txt file. Also, I have verified that GWT has the correct robots.txt file for my site. Here is a sample link from my GWT:
what-is-the-annual-income-for-a-travel-nurse/?s=
Thanks for your help!
November 11, 2016 at 2:03 pm #711135
Hey!
I got the solution from the SO forum and modified it so that it only affects the search query. (The asterisk is a wildcard: /*?s= matches ?s= after any path, which is why the rule covers a URL like the sample above.)
// http://stackoverflow.com/questions/19113788/google-disable-certain-querystring-in-robots-txt
Google may take its time to remove the URLs that you have blocked from the search index. The extra URLs may still be indexed for months. You can speed the process up by using the “Remove URLs” feature in webmaster tools after they have been blocked. But that is a manual process where you have to paste in each individual URL that you want to have removed.
// https://support.google.com/webmasters/answer/1663419?hl=en
Best regards,
Ismael
November 15, 2016 at 1:39 am #712266
I also wanted to chime in and say I'm getting similar URL queries in Search Console. I'm also using the Enfold theme. I have used the robots file to block these search results but am still getting these 404 errors. It would be best to get rid of the bad code (as suggested by fromcouch) to allow Google to crawl the site more efficiently. In my opinion, the removal tool should not be used to get rid of crawl errors (there is even a disclaimer at the bottom of the help page that says not to use it for this).
November 16, 2016 at 4:36 pm #713013
Hi,
We would be glad to help you with this customization, but at the moment there is no easy way to do it using a small custom code snippet, so I am afraid it's out of the scope of our support.
But please feel free to request such feature at our feature request system: https://kriesi.at/support/enfold-feature-requests/
This system allows us to keep track of user suggestions and lets you vote on the features you would like to see most. I am afraid, though, there is no guarantee that a feature will get implemented. If that's something you really need, you can always try to hire a developer for the task :)
Best regards,
Andy
December 5, 2016 at 7:10 pm #720617
Hi, I have also suddenly received the same warning email from Google saying that I have a sudden increase in soft 404 errors, and they are all URLs ending in /?s=. After slightly panicking I found this thread; it seems I'm not alone with this issue.
I've added, as recommended above, the following to my robots.txt:
Disallow: /search
Disallow: /*?s=
Thanks for this info.
Google's email at the end said: "Once you've fixed the URLs with errors, make sure that Googlebot can access and see your content properly, or that they return a proper error result code. You can verify this using Fetch as Google."
Do I need to do this now or just wait? How long does it take?
Many thanks
December 6, 2016 at 6:22 am #720848
Hi!
We are not really sure how long, but it should take a while. However, you can ask Google to recrawl the URL.
// https://support.google.com/webmasters/answer/6065812?hl=en
Cheers!
Ismael
May 9, 2017 at 5:21 am #789960
Hi Andy,
In a previous thread you said…
Most people want their search results appearing in Google
search-icon-causes-soft-404-errors
I still don’t understand this claim. Why would a URL that causes a Soft 404 error be beneficial?
Every error created by Enfold's search box contains an empty search… one that goes to an error page or to the same page (duplicate content)?
/my-sample-url/?s=
May 10, 2017 at 7:52 am #790884
Hi,
I'm sorry for the confusion. 404 URLs are neither beneficial nor detrimental to a site, but it is recommended to block search pages and search results. Please follow the solution that we provided in this thread.
Related articles:
// https://www.labnol.org/internet/block-google-with-robots-txt/28861/
// https://varvy.com/block-unuseful-pages.html
Best regards,
Ismael
May 10, 2017 at 11:53 am #791008
Ok, thanks!
May 11, 2017 at 7:53 pm #791867
Hi,
I will close the thread here, as the details have been provided many times.
Thank you very much for that.
Best regards,
Basilis
July 23, 2017 at 6:53 am #828603
Hi guys!
Any news on this? The 404s are not as big a deal as the waste of Google's crawl budget. Good content cannot be crawled, and for large sites it will definitely affect search indexing (it already is). I have tried to fix it with robots.txt but it doesn't solve the problem.
So I thought I would share Yoast's article: https://yoast.com/blocking-your-sites-search-results/ (Yoast's plugin does block indexing of non-existent search pages, but it doesn't cover Enfold's search functionality). This is such a shame, because the menu search functionality on mobile is absolutely terrific, but I am evaluating what's more important.
If anybody has a code solution, please feel free to share.
Warmly,
Havi
July 23, 2017 at 8:39 am #828622
Hi Havi,
The solution is to add these lines to your robots.txt. I have not had any soft 404 errors in 3 months.
Disallow: /*?wordfence_logHuman=
Disallow: /*?s=
Disallow: /*?redirect_to=
July 23, 2017 at 8:59 am #828624
Thank you @corefocusgroup. I never used the WordFence plugin. Does Enfold use it?
I’m going to keep the crawling of redirects. GoogleBot only crawls 302s and 307s every time. 301s are only crawled once.
Also, I have already implemented Disallow: /*?s= to no avail. The reality is that you need to allow Googlebot to crawl the "page" and add a noindex directive to it, but as there is no real page, that is hard to do. This happens because the bot does not enter any data to search for. So if the code said that when the search field is empty the page is noindex, nofollow, it would be a cleaner fix.
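A minimal sketch of that idea, assuming it goes in a child theme's functions.php (the hook and conditionals are standard WordPress, but this is untested against Enfold itself):
add_action( 'wp_head', function () {
	// is_search() is true for any ?s= request; get_search_query()
	// returns the search term, so an empty string means an empty search.
	if ( is_search() && '' === trim( get_search_query() ) ) {
		echo '<meta name="robots" content="noindex, nofollow" />' . "\n";
	}
} );
Note that for a crawler to actually see this directive, the URL must not also be blocked in robots.txt; a blocked URL is never fetched, so the meta tag is never read, which matches the point above about needing to let Googlebot crawl.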
I will test it on a smaller site. Meanwhile, I disabled the tooltip on the large site, deleted all of the soft 404s (to save crawl budget) and will see if they reappear.
I may modify the code as @fromcouch indicated, changing the ?s= to # on the small site, and test if it works.
I’ll share my findings in about 2 weeks.
I do hope one of these solutions takes care of it. :)
Thanks again!!
Warmly,
Havi
July 23, 2017 at 4:10 pm #828709
Hi Havi,
I'm simply giving you an example of the syntax that works. Each of these URL parameters was causing soft 404 errors, and these commands stopped it. You also need to make sure your robots.txt is accessible. You can test it in Google Webmaster (Console) for your preferred domain. My preferred domain format is https://mydomain.com (SSL with no www).
I'm not sure the "redirect_to" is what you think it is. It's a URL parameter, not a directive. It has nothing to do with 301 or any other 3xx redirect. I have a dozen redirects set up and none of those show up in the soft 404 errors shown in Google Webmaster. The "redirect_to" example URLs I'm seeing in Google Webmaster seem to have something to do with wp-login.php and certainly should not be crawled.
This definitely works… Is Google Webmaster (Console) set up to "Let Googlebot decide" for "s"?
Disallow: /*?s=
Example URLs that show up for "redirect_to":
wp-login.php?redirect_to=https%3A%2F%2Fmydomain.com%2Fmycompany-backs-a-winning-team%2Fimage-2v2%2F
WordFence is a commonly used security plugin and is not part of Enfold.
Disallow: /*?wordfence_logHuman=
Disallow: /*?s=
Disallow: /*?redirect_to=
July 23, 2017 at 4:34 pm #828713
Hi Havi,
I just used Fetch as Google in Google Webmaster (Console) and none of my page redirects are being blocked; all can be indexed.
Only URLs with "redirect_to" are being blocked, and those don't seem to have anything to do with 300-level redirects. I assume it's something being generated by Enfold or another plugin. But it's definitely not blocking any page redirects that I have set up.
July 23, 2017 at 9:59 pm #828777I’m not tested yet because my first solution was deactivate search nav in my wordpress.
But this maybe works (add to functions of child theme)add_filter( 'wp_nav_menu_items', function($items, $args) { return str_replace('href="?s="', 'href="#"', $items); }, 9999, 2 );
July 27, 2017 at 10:31 pm #830832
Hi,
That can be a nice solution.
Feel free to share any other suggestions that we can implement.
Best regards,
Basilis
August 8, 2017 at 3:57 pm #835896
I'm also getting this problem.
Will this be fixed in future updates?
August 8, 2017 at 4:34 pm #835905
Hi,
Add this line to your robots.txt:
Disallow: /*?s=
August 12, 2017 at 4:58 pm #837952
@corefocusgroup, stop, please. This was discussed before: using robots.txt isn't a valid solution for us. Or did you not read the whole thread?