Duplicate Title Tag Problem


  • Duplicate Title Tag Problem

    As of today, Google Webmaster Tools reports a large and growing number of duplicate title tags. The title tag from the home page is being reproduced on the My Account page and on any page produced by a user search.

    It reports over 200 pages with duplicate Title Tags on results pages created by the search feature. Apparently, these are orphan pages created by the search.

    It reports over 150 duplicate Title Tags from pages produced by the Waiting Lists.

    It also reports duplicate Title Tags from the Manufacturer pages, Products By Price pages, and Categories with more than one page (category_1, category_2, etc.)

    We submitted a ticket. This is an SEO issue that can affect page rankings. It seems that the search results pages should not be appearing at all. I wonder if 3dCart recently began caching these pages. Furthermore, why would these pages, as well as the My Account and Waiting List pages, have title tags?

    As for the Manufacturer pages, these should be designed to work like category pages so that we can organize the products into subcategories with their own Title Tags. It is baffling that such an obvious need has not been implemented, especially since the Manufacturer pages already have many of the attributes of the category pages except the ability to create subcategories and the footer. We have made this request in the Feature Requests and have called for it in this forum for a number of years.

    Is anyone aware of a change to the software? Should some of this be handled by the robots.txt file? Is anyone else having this issue? You can check Google Webmaster Tools under Diagnostics > HTML Suggestions to see if this is an issue for you.
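
    A minimal sketch of what handling this in robots.txt might look like. The paths below are assumptions based on typical 3dcart-style URLs, not confirmed ones; verify the actual URLs on your own store before blocking anything, since a wrong pattern can hide real pages from Google:

        User-agent: *
        # Assumed paths -- check your own store's URLs first
        Disallow: /search.asp        # internal search result pages
        Disallow: /myaccount.asp     # My Account page
        Disallow: /waitinglist.asp   # Waiting List pages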
    Last edited by Luxlife; 04-16-2011, 12:16 AM.
    Luxlife

  • #2
    Did you figure out this issue? I am having the same problem: the category pages are indexing the old pages like category_21_01 and such instead of just the SEO URL.
    http://www.epicmoviecostumes.com



    • #3
      This appears to have been a Google problem or sitemap problem. It corrected itself a few days later.
      Luxlife



      • #4
        Is anyone still having problems with this? My site still shows duplicate title tags.
        http://www.epicmoviecostumes.com



        • #5
          We have this problem -- 702 duplicate title tags -- and when I look at the list (in Webmaster Tools) I see most of them refer to multi-page lists (category, manufacturer, price range, search results).

          Anybody have any suggestions for a fix?



          • #6
            Duplicate Title Tag Problem

            Hi,

            A few days ago I found and installed a WordPress plugin called SEO No Duplicate by Thaya Kareeson (just type "seo no duplicate" into the "get new plugins" search to find it).
            Whether this plugin will do the job I am not sure yet (I think it will). It will probably take a little bit of time to find out if it works or not.



            • #7
              Duplicate title tag solution

              http://forums.3dcart.com/optimizing-...gs-google.html



              • #8
                Really, was there a point to commenting on a dead thread? Commenting on active threads gets you more attention and help.

                David
                David's Gifts and Things

                Wholesale Gifts, Home Decorating, Jewelry and More

                Quality, Selection, Value Always

                The more you buy the more you save!



                • #9
                  Thanks, Mitch,

                  I was hoping someone would bring this up again at some point. I read your other post, and I think the suggestion you received from support about adding "Disallow: /Rubber-Perches_c_174-*.html" to the robots.txt file -- while not the answer you were looking for -- might actually be the solution to my issue.

                  I've never understood how to use the robots.txt file properly, but now I'm going to research it a little more and see if that disallow + wildcard method may work for me.
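
                  For anyone following along, here is roughly what that wildcard approach looks like as a sketch. The "Rubber-Perches_c_174" stub comes from the quoted support suggestion; the idea, as I understand it, is to block the paginated duplicates (the -2.html, -3.html, ... pages) while leaving the main category page crawlable. Adapt it to your own URLs rather than copying it verbatim:

                      User-agent: *
                      # Blocks paginated copies such as /Rubber-Perches_c_174-2.html
                      # while the main /Rubber-Perches_c_174.html stays crawlable
                      Disallow: /Rubber-Perches_c_174-*.html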
                  --mimib



                  • #10
                    robots.txt

                    Just remember: if you clear your cache, the links go away.



                    • #11
                      Dead threads

                      David

                      My FB friend Jim C & I appointed a new "Chief of Facebook Spam Police" this week.

                      I haven't run it by Jim, but I'm sure he wouldn't mind if we appointed you "Chief of 3dcart Forum Spam Police".

                      You guys can compete, and the winner gets to watch over Twitter.



                      • #12
                        Robots.txt

                        mimib

                        As I explained to support, I did the "clear cache" solution first -- if I did both, I wouldn't know which one was effective.

                        That said, I await your results, because since I cleared the cache and resubmitted the sitemap, the duplicate title tag list has grown in WMT.

                        Looks like we're on our own here.



                        • #13
                          I've tried the "clear cache" method and it hasn't helped.

                          I really do want to understand better how to use that robots.txt file, but I'm timid, as it seems I could do more harm than good if I use it incorrectly. It will probably be a while before I can play around with it more, though.

                          For now, I've moved the problem to my low-priority list, as Google Webmaster Tools doesn't seem to put much weight on the issue ("When Googlebot crawled your site, it found some issues with your content. These issues won't prevent your site from appearing in Google search results, but addressing them may help your site's user experience and performance."). I still have 2000+ crawl errors to chip away at first. :-0

                          Sooooo much to do!!!

                          --mimi



                          • #14
                            dupe content

                            I agree about the robots.txt -- especially with a wildcard, you could get into trouble.

                            With regard to your crawl errors: have you moved your site recently? We just put more than 13,000 301s in place but still have issues.

                            We hand-mapped about 4,000 -- that took two of us three days -- and the rest we're dropping onto various landing pages.
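
                            Since "putting 301s in place" can sound abstract: 3dcart is hosted, so the exact mechanism depends on what the platform exposes, but purely as an illustration of the general mechanism, this is what hand-mapped 301s look like in generic Apache .htaccess form. The paths are made-up examples, not our actual URLs:

                                # Each rule maps one old URL to its new home (301 = moved permanently)
                                # Example paths only -- substitute your own old/new pairs
                                Redirect 301 /old-widgets_c_21.html   /Widgets_c_21.html
                                Redirect 301 /product_p_1234.html     /Blue-Widget_p_1234.html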

                            We have a ticket open with Bing WMT, as the 301s are working but are returning 404s, and no one has an answer. This is happening on both Bing & Google.

                            This is what our crawl errors look like since we implemented the last 301s -- Bing is a head-scratcher. Google crawls us on average 2,100 pages per day:

                            Bing crawl errors 2204 - 12/4/11
                            Bing crawl errors 2079 - 12/12/11
                            Bing crawl errors 1600 - 12/15/11
                            Bing crawl errors 968 - 12/18/11
                            Bing crawl errors 916 - 12/21/11
                            Bing crawl errors 2411 - 12/22/11
                            Bing crawl errors 768 - 12/23/11

                            Google crawl errors 4852 - 12/12/11
                            Google crawl errors 4833 - 12/15/11
                            Google crawl errors 4763 - 12/18/11
                            Google crawl errors 4112 - 12/21/11
                            Google crawl errors 4100 - 12/22/11



                            • #15
                              We switched from NetSuite last spring. We started with over 13,000 crawl errors, which I've managed (through redirects and setting parameters) to whittle down to our current 2,400. The remaining errors are mostly NetSuite numeric URLs that I failed to identify before switching (and parameters that Google has yet to recognize).

