
Google Email: Important Update For Google Shopping Product Page Crawl Requirements



    I think they are talking about 'product_type'... I must have ours set up wrong.


    Thank you for your participation in Google Shopping. In order to check for quality issues and to help ensure a good experience for our users, we routinely crawl landing pages and images submitted by merchants to Google Shopping.

    We’re reaching out to you today because we’ve encountered some errors while trying to crawl the landing pages included in your data feed. Some of our recent attempts to crawl your landing pages were unsuccessful because the robots.txt file for your website restricts access to these pages.

    Beginning on September 17, 2013, if we cannot crawl the landing page of an item, we will no longer show this item in Google Shopping results. Items we’re unable to crawl will be displayed in the Data Quality tab of your Merchant Center account.

    In order for us to access your site and images, ensure that your robots.txt file allows both user-agents "Googlebot" and "Googlebot-image" to crawl your site. You can do this by adding the following two lines to your file:

    User-agent: Googlebot

    User-agent: Googlebot-image

    Please ensure that your robots.txt file does not prevent us from crawling your items so that these items can continue to appear on Google Shopping.
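One way to sanity-check a robots.txt file before Google re-crawls is Python's standard urllib.robotparser. The rules string below is a made-up example (not any particular site's file); it shows the two user-agent stanzas from the email with an empty Disallow, which permits crawling everywhere:

```python
# Check whether a robots.txt file blocks Googlebot, using Python's
# standard-library urllib.robotparser. The rules below are an example only.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Googlebot
Disallow:

User-agent: Googlebot-image
Disallow:

User-agent: *
Disallow: /checkout/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot gets an explicit entry with an empty Disallow, so product
# landing pages are crawlable...
print(parser.can_fetch("Googlebot", "/skateboards/complete-skateboard-123"))  # True

# ...while bots without their own entry fall back to the "*" stanza,
# which keeps them out of /checkout/.
print(parser.can_fetch("SomeOtherBot", "/checkout/"))  # False
```

If the first check prints False against your real robots.txt, some rule is blocking the landing pages Google listed in the Data Quality tab.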

    To learn more about robots.txt files, please visit

    To learn more about our crawling process and common issues, please visit


    The Google Shopping Team

    © 2013 Google Inc. 1600 Amphitheatre Parkway, Mountain View, CA 94043

    You have received this mandatory email service announcement to update you about important changes to your Google Merchant Center account.

  • #2
    I don't see anything in there about product_type. They are saying that they cannot access the product pages that you have submitted in your feed file. Either the pages are not there, or you are blocking the Google bot in your robots.txt file.


    • #3
      I thought product_type was the path for your products' landing pages. Like for a Complete Skateboard from us it would be "Skateboards > Complete Skateboards". I figured that was how they even found our landing pages.

      Anyway we use the default 3D robots.txt, but I added what they mention in the email. Maybe that will help?

      I'm guessing I'm the only one who got this (we also got the email on all 5 of our 3D sites, so I definitely did something wrong)
      Last edited by Spesh; 09-06-2013, 10:49 AM.


      • #4
        One of the fields in the data feed should be a direct URL to your product page.


        • #5
          That would be funny if we didn't have a link to the product, but sadly it's not that easy of a fix.

          Here this is what Google put for product_type

          This attribute also indicates the category of the product being submitted, but you can provide your own classification. Unlike the 'Google product category', you can include more than one 'product type' attribute value if products apply to more than one category. Please include the full category string. For example, if your products belong in Refrigerators, list the full string: Home & Garden > Kitchen & Dining > Kitchen Appliances > Refrigerators. Any separator such as > or / may be used.

          When to include: Strongly recommended for all items if you have a categorization for your items.
          In 3D they have it default to something with a Top in the title. We switched it to an extra field so we could go past the top category and add a second (like Skateboard > Complete Skateboard rather than just Skateboard).

          I thought this was the attribute Google used to tell it where your landing pages actually are for each product, and that for some reason it couldn't follow it back. Not sure how I figured they would do that without a link, but I was wrong anyway.
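To make the mix-up concrete, here is a minimal sketch of the relevant feed fields, assuming a simple tab-separated feed. The field names (id, title, link, product_type) follow the Google Shopping feed attributes discussed above; the SKU and URL are invented for illustration:

```python
# Sketch of one Google Shopping feed row (hypothetical values).
# The point from this thread: 'link' is the URL Google crawls for the
# landing page; 'product_type' is only a category label, never a URL.
feed_row = {
    "id": "SKU-1001",                                   # made-up SKU
    "title": "Complete Skateboard",
    "link": "http://www.example.com/complete-skateboard-p/sku-1001.htm",
    "product_type": "Skateboards > Complete Skateboards",  # classification only
}

# Serialize as one tab-separated header + data line.
header = "\t".join(feed_row.keys())
line = "\t".join(feed_row.values())
print(header)
print(line)
```

So a robots.txt rule blocking the URLs in the link column is what triggers the crawl errors in the email; product_type has no effect on crawling.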

          Hopefully the robots.txt change they suggested will fix the issue.