In Google Search Console, I get the warning "Indexed, though blocked by robots.txt" for the following pages:
https://www.store.com/cart
https://www.store.com/compare products
https://www.store.com/order/history
https://www.store.com/custom/info
https://www.store.com/customer/addresses
There are three ways to solve this issue. The first is to insert <meta name="robots" content="noindex"> into the <head> of the affected pages. The second, which is experimental, is to use the Noindex: directive in robots.txt rather than Disallow:. The third is to send an X-Robots-Tag: noindex HTTP header for those URLs. I wouldn't use the second option, as it is unclear whether Google will continue to support it.
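For reference, here is roughly what I think the third option would look like if the site runs on Apache (this is a sketch, not something I've tested; it assumes Apache with mod_headers enabled, and it would go in the virtual-host or server config rather than .htaccess, since LocationMatch isn't allowed there):

```apache
# Sketch: send "X-Robots-Tag: noindex" for the pages
# Search Console flagged. Assumes mod_headers is enabled.
<LocationMatch "^/(cart|order/history|custom/info|customer/addresses)">
    Header set X-Robots-Tag "noindex"
</LocationMatch>
```

If this is on the right track, I'd still need the equivalent for my actual server setup.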
My question is: how do I apply any of these modifications to the affected pages?