    #802112

    Hi,
    My site is creating duplicate URLs with /?s= at the end, and they're being crawled when I run site audits.
    I found a few suggestions for blocking /?s= URLs from showing up in search results, but I'm wondering whether there's now a better fix than adding the directives below to robots.txt, or whether there's a way to avoid creating these URLs altogether.

    User-agent: *
    Disallow: /search
    Disallow: /*?s=

    I recently switched from HTTP to HTTPS, and trying to fix all of my mixed content errors is a nightmare with all of these extra links.
    Why are they being created, and how can I best stop it?

    Thanks

    #803253

    Hey vidaelf,

    Thank you for using Enfold.

    Did you enable the search icon on the site? These URLs are created whenever visitors use the search feature. The robots.txt directives that you mentioned above should prevent bots from crawling pages with the search query.
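
    If you also want to keep these search result pages out of the index entirely, note that a robots.txt Disallow only stops crawling; URLs that are already known can still be indexed. As an untested sketch, something like the following in a child theme's functions.php uses the core wp_robots filter (available since WordPress 5.7) to mark search results noindex:

    // Sketch: mark WordPress search result pages (?s=) noindex via the
    // core wp_robots filter (WordPress 5.7+).
    add_filter( 'wp_robots', function ( $robots ) {
        if ( is_search() ) {
            $robots['noindex'] = true; // keep search results out of the index
            $robots['follow']  = true; // still follow links on the page
        }
        return $robots;
    } );

    Keep in mind that crawlers can only see the noindex directive if they are allowed to fetch the page, so this approach and the robots.txt Disallow rules are usually alternatives rather than a combination.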

    Best regards,
    Ismael
