
1009 pages blocked by robots.txt???


  • 1009 pages blocked by robots.txt???

    Since our switch to 3DCart, our number of indexed pages (according to Google Webmaster Tools) has dropped from approximately 2,415 pages indexed with 6,948 pages blocked down to 439 pages indexed (as of last week) with 1,009 pages blocked.

    We have over 1,500 products. Something is wrong. We've not changed our robots.txt file at all.

    Any thoughts?!?!

    *EDIT* - could it be our 301 redirects? We have all of our old site URLs redirecting to our 3DCart URLs. When we moved we had a little over 1,200 products, though, so the numbers don't quite line up.

    How long should we keep the redirects?
    Last edited by DJGearForLess; 03-20-2014, 07:40 PM.
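One way to spot-check that the old URLs are still returning proper 301s is to fetch their headers without following the redirect. A minimal sketch using Python's standard library; the URL in the commented example is a placeholder, not one of the poster's actual pages:

```python
import http.client
from urllib.parse import urlparse

def check_redirect(url):
    """Fetch headers for `url` without following redirects and
    return (status code, Location header)."""
    parts = urlparse(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc)
    # http.client does not follow redirects, so we see the raw 301.
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    status, loc = resp.status, resp.getheader("Location")
    conn.close()
    return status, loc

# A healthy permanent redirect returns 301 plus a Location header
# pointing at the new 3DCart URL (placeholder URL below):
# status, loc = check_redirect("https://www.example.com/old-product-page")
```

A 302 or 404 here, instead of a 301, would explain indexed pages dropping out.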

  • #2
    The redirects should not affect blocked pages.

    In Google Webmaster Tools, you can see which pages were blocked, and you can test individual pages yourself against your robots.txt rules.

    I would start there.
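Testing individual pages against the robots rules can also be scripted. A minimal sketch using Python's stdlib robots.txt parser; the rules and paths below are hypothetical stand-ins, not the store's real file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; swap in your store's actual file.
ROBOTS_TXT = """\
User-agent: *
Disallow: /cart/
Disallow: /search/
Allow: /
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Check whether Googlebot may fetch each page (placeholder paths).
for path in ["/product/123", "/cart/checkout", "/search/?q=mixer"]:
    verdict = "allowed" if rp.can_fetch("Googlebot", path) else "blocked"
    print(path, "->", verdict)
```

Running every URL you expect to be indexed through a check like this makes it obvious which rule is catching them.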

    There have been forum discussions about how long to keep redirects. Run a search on the forum for lots of info.


    • #3
      Thanks for the reply! We saw the errors, and they were all related to our old site URLs. One thing we found is that Webmaster Tools was still referencing our old sitemap file; it showed 10 pages indexed with 2,600+ errors...

      We uploaded our new sitemap last night, and this morning we had 1,559 of 1,642 pages indexed.

      Hopefully this begins to solve some of our issues. Going to work on getting things cleaned up a LOT more this weekend.

      The blocked-pages stats run on a two- or three-day lag, so we'll have to keep checking. We think our 3DCart robots.txt file may have been blocking the pages referenced by our old sitemap, so Google was simply giving up instead of crawling down through our site from our homepage.
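That kind of sitemap-vs-robots conflict can be cross-checked mechanically by parsing the sitemap and running each `<loc>` URL through the robots rules. A sketch under assumed inputs (both the sitemap entries and the robots.txt below are placeholders, not the store's real files):

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

# Hypothetical sitemap and robots.txt; substitute your store's real files.
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/product/123</loc></url>
  <url><loc>https://example.com/admin/settings</loc></url>
</urlset>
"""
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Collect every sitemap URL that robots.txt blocks for Googlebot.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP_XML)
blocked = [loc.text for loc in root.findall(".//sm:loc", ns)
           if not rp.can_fetch("Googlebot", urlparse(loc.text).path)]
print("blocked sitemap URLs:", blocked)
```

If this list is non-empty, the sitemap is advertising pages the crawler is told not to fetch, which matches the "blocked" counts Webmaster Tools was reporting.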

      Ugh. So much to keep track of. Thanks again!