Robots.txt file blocking all products

  • JulieAndrus
    replied
    All is good. It just took a while for the robots.txt file to be updated. Thanks for your help.

  • bzeltzer
    replied
    Your robots.txt file is located at www.site.com/robots.txt, so you can check it there. If it's not updated, you should be able to edit it directly over FTP; it's in the top-level "web" folder. There's also another file, www.site.com/robots_ssl.txt, that needs to be edited.
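
    If it helps, here is a quick standard-library Python sketch (with www.site.com standing in as a placeholder for the actual store domain) that fetches both files so you can compare what the server is currently serving:

    import urllib.request

    # www.site.com is a placeholder; substitute the real store domain.
    BASE = "https://www.site.com"

    for path in ("/robots.txt", "/robots_ssl.txt"):
        with urllib.request.urlopen(BASE + path) as resp:
            print(f"--- {path} ---")
            print(resp.read().decode("utf-8", errors="replace"))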

  • JulieAndrus
    replied
    I've done that multiple times since yesterday. Thanks though.

  • bzeltzer
    replied
    Try clearing the site cache in general settings.

  • JulieAndrus
    replied
    Thanks. I think the new robots.txt file hasn't been updated on the servers yet, because when I test it in Google, it still shows the default robots file without the Googlebot code:

    Sitemap: https://www.----.com/sitemap.xml

    # Disallow all crawlers access to certain pages.
    User-agent: *
    Disallow: /checkout.asp
    Disallow: /add_cart.asp
    Disallow: /view_cart.asp
    Disallow: /error.asp
    Disallow: /shipquote.asp
    Disallow: /rssfeed.asp
    Disallow: /mobile/

    I guess I'll wait a little longer to see if the updated robots.txt file allows my product pages to be indexed.
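
    One way to rule out a stale cached copy is to request the live file directly. This is a rough sketch with Python's standard library; www.site.com is a placeholder, and the no-cache headers are only a best-effort hint to intermediate caches:

    import urllib.request

    # Placeholder domain; the headers ask intermediate caches to revalidate
    # rather than serve a stale copy.
    req = urllib.request.Request(
        "https://www.site.com/robots.txt",
        headers={"Cache-Control": "no-cache", "Pragma": "no-cache"},
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.read().decode("utf-8", errors="replace"))

    If this prints the updated rules while Google's tester still shows the old ones, the server side has caught up and Google simply hasn't re-fetched the file yet.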

  • kelly
    replied
    I had that problem briefly as well; in my case it may have been an https vs. http mismatch. Go to Google Search Console and check bot access: https://www.google.com/webmasters/tools/googlebot-fetch

    I have also found that it can take up to 24 hours for changes on 3dcart servers to be reflected by Bing and Google.
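
    To test both schemes in one go, here is a small sketch using Python's standard urllib.robotparser; www.site.com and /product.asp are placeholders for the real domain and a real product page:

    from urllib import robotparser

    # Placeholders: substitute the real store domain and a real product URL.
    for scheme in ("http", "https"):
        rp = robotparser.RobotFileParser()
        rp.set_url(f"{scheme}://www.site.com/robots.txt")
        rp.read()
        url = f"{scheme}://www.site.com/product.asp"
        print(scheme, "Googlebot allowed:", rp.can_fetch("Googlebot", url))

    If one scheme reports allowed and the other blocked, the two robots files are out of sync.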

  • JulieAndrus
    started a topic Robots.txt file blocking all products

    Robots.txt file blocking all products

    Yesterday I restored the default robots.txt file on my website. (It's on a Core theme.) I then added the code for the Google bots so that the robots.txt file looks like this:

    Sitemap: https://www.-----/sitemap.xml

    # Disallow all crawlers access to certain pages.
    User-agent: *
    Disallow: /checkout.asp
    Disallow: /add_cart.asp
    Disallow: /view_cart.asp
    Disallow: /error.asp
    Disallow: /shipquote.asp
    Disallow: /rssfeed.asp
    Disallow: /mobile/

    User-agent: Googlebot
    Disallow:

    User-agent: Googlebot-image
    Disallow:

    User-agent: AdsBot-Google
    Disallow:

    Now, my Google Merchant Center account shows that my products are disapproved because "Desktop page not crawlable due to robots.txt" and "Mobile page not crawlable due to robots.txt." In addition, if I use the URL Inspection tool in Search Console, it shows "URL is not available to Google... It cannot be indexed... Blocked by robots.txt."

    What is wrong with my robots.txt file?
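
    For what it's worth, the rules as posted should let Googlebot through: the empty Disallow under User-agent: Googlebot gives it its own group, and a crawler follows the most specific group that matches it. A quick check with Python's standard urllib.robotparser, parsing an abbreviated copy of the rules above, bears that out:

    from urllib import robotparser

    # Abbreviated copy of the posted rules; an empty Disallow means
    # "allow everything" for that agent.
    rules = [
        "User-agent: *",
        "Disallow: /checkout.asp",
        "Disallow: /mobile/",
        "",
        "User-agent: Googlebot",
        "Disallow:",
    ]

    rp = robotparser.RobotFileParser()
    rp.parse(rules)

    print(rp.can_fetch("Googlebot", "/checkout.asp"))     # True: Googlebot group allows all
    print(rp.can_fetch("SomeOtherBot", "/checkout.asp"))  # False: falls back to the * group

    So if Google still reports the pages as blocked, it is more likely reading an older copy of the file than objecting to these rules.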

