Yesterday I restored the default robots.txt file on my website. (It's on a Core theme.) I then added the rules for the Google bots, so the robots.txt file now looks like this:
Sitemap: https://www.-----/sitemap.xml
# Disallow all crawlers access to certain pages.
User-agent: *
Disallow: /checkout.asp
Disallow: /add_cart.asp
Disallow: /view_cart.asp
Disallow: /error.asp
Disallow: /shipquote.asp
Disallow: /rssfeed.asp
Disallow: /mobile/
User-agent: Googlebot
Disallow:
User-agent: Googlebot-image
Disallow:
User-agent: AdsBot-Google
Disallow:
Now my Google Merchant Center account shows that my products are disapproved because of "Desktop page not crawlable due to robots.txt" and
"Mobile page not crawlable due to robots.txt." In addition, if I use the URL Inspection tool in Search Console, it shows "URL is not available to Google ... It cannot be indexed ... Blocked by robots.txt."
What is wrong with my robots.txt file?