Robots.txt

  • Robots.txt

    We recently switched our site to 3D. Our old URLs are xxx.htm and the new site uses xxx.html.
    If I include Disallow: /file.htm in the robots.txt file, will that do the trick for getting the search engines to remove the old URLs?
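
    As an aside on what Disallow actually does: robots.txt only tells crawlers what not to fetch; it does not redirect visitors, remove a page from the index, or tell search engines where the content moved. A minimal sketch using Python's standard-library robots.txt parser (example.com and the path are placeholders taken from the question):

        import urllib.robotparser

        # Simulate the robots.txt rule from the question.
        rp = urllib.robotparser.RobotFileParser()
        rp.parse([
            "User-agent: *",
            "Disallow: /file.htm",
        ])

        # The rule only blocks crawling of the old URL; it does not remove it
        # from the index or point search engines at the new page.
        print(rp.can_fetch("*", "https://www.example.com/file.htm"))  # False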

  • #2
    Don't ever do that.

    You need to set up 301 redirects to map as many of your old URLs as possible to your new URLs. Not doing so will negatively impact your SEO rankings and the like.

    Go to: Marketing --> SEO Tools --> Edit Page Redirects
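
    Once the redirects are in place, it is worth spot-checking that each old URL really returns a 301 to the matching new one. A minimal sketch using the requests library (the URLs in the mapping are placeholders for your own list):

        import requests

        # Hypothetical old -> new pairs; replace with your real mapping.
        redirects = {
            "https://www.example.com/file.htm": "https://www.example.com/file.html",
        }

        for old, new in redirects.items():
            resp = requests.get(old, allow_redirects=False)
            # Note: some servers return a relative Location header; adjust the
            # comparison if yours does.
            location = resp.headers.get("Location")
            if resp.status_code == 301 and location == new:
                print("OK  ", old)
            else:
                print("FAIL", old, resp.status_code, location)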

    • #3
      You really need to do 301 redirects and research them before you go live. I can attest that if you disregard this, you will lose every bit of traffic you have. We went from 8-10K visitors a month to 2K-3K a month, and we are not recovering very fast even after setting up hundreds of 301s. We are also now hammering out SEO improvements as fast as we can to try to pick up traffic we never had in the first place. It's so time-consuming to fix after you've screwed it all up, so do it correctly the first time!

      Another approach would be to make the new URLs match the old ones using the custom URL features. The only ones that probably won't be possible are the manufacturer pages, but it may be doable for those as well. All of these changes need to be made in bulk using the CSV export/import functions.
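
      If you go the bulk route, a short script can generate the old-to-new mapping to feed into a CSV import. This is a sketch only; the file name, column headers, and exact format your import tool expects are assumptions to verify first:

          import csv

          # Hypothetical list of old paths; in practice, export these from the old site.
          old_paths = ["/file.htm", "/about-us.htm", "/products/widget.htm"]

          with open("redirect-map.csv", "w", newline="") as f:
              writer = csv.writer(f)
              writer.writerow(["old_url", "new_url"])  # header names are assumptions
              for old in old_paths:
                  # Map xxx.htm to xxx.html; leave anything else unchanged.
                  new = old[:-4] + ".html" if old.endswith(".htm") else old
                  writer.writerow([old, new])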
