Google issue: Indexed, though blocked by robots.txt

This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.
5 years ago
On Google Search Console, I get the warning "Indexed, though blocked by robots.txt" for the following pages:

https://www.store.com/cart
https://www.store.com/compare products
https://www.store.com/order/history
https://www.store.com/custom/info
https://www.store.com/customer/addresses

There are three ways to solve this issue. One is to insert <meta name="robots" content="noindex"> into the affected pages. Another, which is experimental, is to use a Noindex: rule in robots.txt rather than Disallow:. Finally, there's the "X-Robots-Tag: noindex" HTTP response header. I wouldn't use the second one, as it is unclear whether Google will continue to support it. Note that for the meta tag or the header to take effect, Google must be able to crawl the page, so the robots.txt block would have to be lifted for those URLs.
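As an illustration of the third option, here is roughly what the header approach could look like on an Apache server (the paths and config placement are assumptions; nginx and other servers have equivalent directives):

```apacheconf
# Hypothetical Apache config: send a noindex header for the cart
# and order-history pages. Requires mod_headers to be enabled.
<LocationMatch "^/(cart|order/history)">
    Header set X-Robots-Tag "noindex"
</LocationMatch>
```

Googlebot treats this header the same as the meta tag, and it also works for non-HTML resources such as PDFs.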

My question is: how do I make any of these modifications to the affected pages?
5 years ago
Those pages are blocked in robots.txt by default because they are transactional pages with variable content, and it is better to keep them out of the Google index.
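For reference, a default robots.txt that blocks pages like these typically looks something like this (the exact rules depend on your store software; these paths are illustrative):

```
User-agent: *
Disallow: /cart
Disallow: /compare
Disallow: /order/
Disallow: /customer/
```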

Just ignore the warning about them in Google Search Console; you will be fine.

Btw, some SEO folks recommend removing every URL from robots.txt for security reasons, so that other robots or hackers can't easily discover all your protected pages, but they will find them anyway if they want to. Don't worry about it.