Viewing 8 posts - 1 through 8 (of 8 total)
    #1122476

    after searching for my site on Google:
    nacionga (check in private links)
    https://www.google.com/search (check in private links)

    with these two links, anyone can access the Enfold theme files:
    http://nacionga (check in private links)
    http://nacionga (check in private links)

    I tried a 301 redirect to the main domain, but the site goes down with that.
    Then I tried robots.txt, but I still get these results on Google, and the folders are still accessible :s

    So I want to ask: is this a security risk for my site?
    I'm quite scared about this :(
    Thanks for any reply.
    Jesseo
    (sorry about my English, I'm trying my best)

    #1122633

    Hey Jesse,

    What exactly did you try in robots.txt? That should be the correct solution, but you can't expect the search results to be updated straight away; you would have to give it at least a few weeks.
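
    For reference, a minimal robots.txt that asks crawlers to skip the theme directory might look like this (a sketch assuming a standard WordPress install; the exact path should match your site):

    ```
    # Ask all crawlers not to index the theme directory.
    # Note: this only discourages crawling, it does not block access.
    User-agent: *
    Disallow: /wp-content/themes/
    ```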

    Best regards,
    Rikard

    #1122755

    Hi Rikard,
    this is my robots.txt:
    # robots.txt for http://www.naciongamerx.com/ and friends
    #
    # Please note: There are a lot of pages on this site, and there are
    # some misbehaved spiders out there that go _way_ too fast. If you’re
    # irresponsible, your access to the site may be blocked.
    #

    # Observed spamming large amounts of https://en.wikipedia.org/?curid=NNNNNN
    # and ignoring 429 ratelimit responses, claims to respect robots:
    # http://mj12bot.com/
    User-agent: MJ12bot
    Disallow: /

    # advertising-related bots:
    # User-agent: Mediapartners-Google*
    # Disallow: /

    # Crawlers that are kind enough to obey, but which we’d rather not have
    # unless they’re feeding search engines.
    User-agent: UbiCrawler
    Disallow: /

    User-agent: DOC
    Disallow: /

    User-agent: Zao
    Disallow: /

    # Some bots are known to be trouble, particularly those designed to copy
    # entire sites. Please obey robots.txt.
    User-agent: sitecheck.internetseer.com
    Disallow: /

    User-agent: Zealbot
    Disallow: /

    User-agent: MSIECrawler
    Disallow: /

    User-agent: SiteSnagger
    Disallow: /

    User-agent: WebStripper
    Disallow: /

    User-agent: WebCopier
    Disallow: /

    User-agent: Fetch
    Disallow: /

    User-agent: Offline Explorer
    Disallow: /

    User-agent: Teleport
    Disallow: /

    User-agent: TeleportPro
    Disallow: /

    User-agent: WebZIP
    Disallow: /

    User-agent: linko
    Disallow: /

    User-agent: HTTrack
    Disallow: /

    User-agent: Microsoft.URL.Control
    Disallow: /

    User-agent: Xenu
    Disallow: /

    User-agent: larbin
    Disallow: /

    User-agent: libwww
    Disallow: /

    User-agent: ZyBORG
    Disallow: /

    User-agent: Download Ninja
    Disallow: /

    # Misbehaving: requests much too fast:
    User-agent: fast
    Disallow: /

    #
    # Sorry, wget in its recursive mode is a frequent problem.
    # Please read the man page and use it properly; there is a
    # --wait option you can use to set the delay between hits,
    # for instance.
    #
    User-agent: wget
    Disallow: /

    #
    # The ‘grub’ distributed client has been *very* poorly behaved.
    #
    User-agent: grub-client
    Disallow: /

    #
    # Doesn’t follow robots.txt anyway, but…
    #
    User-agent: k2spider
    Disallow: /

    #
    # Hits many times per second, not acceptable
    # http://www.nameprotect.com/botinfo.html
    User-agent: NPBot
    Disallow: /

    # A capture bot, downloads gazillions of pages with no public benefit
    # http://www.webreaper.net/
    User-agent: WebReaper
    Disallow: /

    # Wayback Machine: defaults and whether to index user-pages
    # FIXME: Complete the removal of this block, per T7582.
    # User-agent: archive.org_bot
    # Allow: /

    #
    # Friendly, low-speed bots are welcome viewing article pages, but not
    # dynamically-generated pages please.
    #
    # Inktomi’s “Slurp” can read a minimum delay between hits; if your
    # bot supports such a thing using the ‘Crawl-delay’ or another
    # instruction, please let us know.
    #
    # There is a special exception for API mobileview to allow dynamic
    # mobile web & app views to load section content.
    # These views aren’t HTTP-cached but use parser cache aggressively
    # and don’t expose special: pages etc.
    #
    # Another exception is for REST API documentation, located at
    # /api/rest_v1/?doc.
    #
    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-content/themes

    I understand that with this, search engines will not index those folders, but my concern is: if someone accesses those folders directly, is that a security risk?
    Because the 301 doesn't work; when I set up a 301, my site says: database connection error
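
    For what it's worth, a site-wide 301 in .htaccess will also redirect WordPress's own internal requests, which can produce errors like the one described. A sketch of a redirect scoped only to the theme directory itself (Apache .htaccess; assumes the host runs Apache, which should be confirmed with the provider) might be:

    ```
    # Hypothetical sketch: send requests for the theme directory
    # (the directory itself, not the files inside it, so the theme's
    # CSS/JS assets keep loading) back to the homepage.
    RedirectMatch 301 ^/wp-content/themes/?$ /
    ```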

    #1123301

    Hi,

    Unfortunately, it would require quite some time and customization of the theme to achieve this, so I am sorry to tell you that this is not covered by our support. However, if it’s really important for you to get this done, you can always hire a freelancer to do the job for you :)

    Best regards,
    Basilis

    #1123314

    mmmm…

    #1123897

    Hi,

    Thank you for the update.

    Where are you hosting the site? Is it an IIS server? You can ask your hosting provider to disable the directory listing or directory browsing option to prevent access to these directories.
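
    On Apache hosts, directory listing can usually also be switched off directly from .htaccess without waiting for the provider; a minimal sketch (assuming the host allows .htaccess overrides):

    ```
    # Disable directory listing: requests for folders without an
    # index file return 403 Forbidden instead of a file index.
    Options -Indexes
    ```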

    Best regards,
    Ismael

    #1124454

    Hi Ismael and Basilis,
    I forgot to thank you for the last answer, I'm sorry, but I've barely slept for months because I'm trying to build my site. I bought your theme 4 years ago, and after all these years I will finally release my first commercial site. I'm excited about it XD NacionGamerX.com
    So, back to the point: my hosting provider is Hostmonster.
    Let me ask them if they have that option.

    Guys, I'm trying the SportsPress Pro plugin; it has a lot of functions, but it's missing many functions required for an eSports manager site like naciongamerx.
    Maybe you could make some extension for Enfold that works with it, paid of course.

    Or something like ProfileGrid, also paid; I would pay for that for sure XD

    #1124540

    Hi jesserojas,

    We do not provide such services but you can hire a freelancer here.

    Best regards,
    Victoria
