Strangely, I have tried two approaches in version 3.7. With the second approach I got an error page, and with the first approach I only got the default action result and my robots.custom.txt file was ignored. I'm confused.
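For context, as far as I can tell nopCommerce 3.7 serves /robots.txt from an MVC action rather than a physical file, and it only swaps in robots.custom.txt when that file sits in the site root. Roughly like this (a simplified sketch from memory, with approximate names; the real code is in CommonController.RobotsTextFile):

```csharp
using System.Text;
using System.Web.Hosting;
using System.Web.Mvc;

// Simplified sketch of how nopCommerce 3.x appears to serve /robots.txt
// (class, method, and rule details are approximations, not the exact source).
public class RobotsSketchController : Controller
{
    public ActionResult RobotsTextFile()
    {
        var sb = new StringBuilder();
        sb.AppendLine("User-agent: *");
        // ...the default Disallow rules are appended here...

        // If robots.custom.txt exists in the site root, its content
        // replaces the generated text entirely.
        var customFile = HostingEnvironment.MapPath("~/robots.custom.txt");
        if (customFile != null && System.IO.File.Exists(customFile))
        {
            sb.Clear();
            sb.Append(System.IO.File.ReadAllText(customFile));
        }

        return Content(sb.ToString(), "text/plain");
    }
}
```

So the file has to be in the web root of the deployed site (next to web.config), not in a theme or content folder; otherwise the default generated text is returned.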
I'm using nop 3.7, and you seem to know a bit about this. I have 27,000 products. Google Webmaster Tools reports that 27,000 URLs have been submitted, approximately 14,000 are indexed, and 13,000 are blocked by robots.txt. Do you know of any reason why robots.txt would block those URLs from being indexed?
Bing Webmaster Tools is even worse: only 615 URLs are being submitted to the index from sitemap.xml. I have resubmitted many times with exactly the same results.
In your opinion, would it be better to comment out the sitemap.xml and robots.txt lines in web.config and use flat files for them in the website root?
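For anyone else reading, the lines in question should be the handler entries under `<system.webServer><handlers>` that route those two paths into MVC. From memory they look roughly like the following, but the exact names and attributes may differ in your web.config, so treat this as an approximation:

```xml
<!-- Approximate entries; verify the exact names/paths in your own web.config -->
<add name="RobotsTxt" path="robots.txt" verb="*"
     type="System.Web.Handlers.TransferRequestHandler"
     preCondition="integratedMode,runtimeVersionv4.0" />
<add name="SitemapXml" path="sitemap.xml" verb="*"
     type="System.Web.Handlers.TransferRequestHandler"
     preCondition="integratedMode,runtimeVersionv4.0" />
```

If you comment these out, IIS's static file handler serves physical robots.txt and sitemap.xml files from the root instead of the generated ones. The trade-off is that you lose the automatically generated sitemap, so with 27,000 products you would need to regenerate the flat file yourself whenever the catalog changes.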
Yes!! You were right, I had done something wrong. robots.custom.txt works fine in the root. But what should I do about the sitemap? sitemap.custom.xml does not work. Is there any way to provide a custom sitemap without modifying web.config?
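As far as I can tell, 3.7 has no sitemap.custom.xml equivalent: /sitemap.xml is always generated by the sitemap action. One workaround that avoids touching web.config is a small controller action that serves a hand-maintained file when it exists. This is just a rough, untested sketch; the controller name and the sitemap.custom.xml filename are my own inventions, not anything nopCommerce looks for:

```csharp
using System.Web.Hosting;
using System.Web.Mvc;

// Rough sketch (not verified against the 3.7 source): serve a hand-maintained
// sitemap.custom.xml from the site root if present, otherwise return 404.
public class CustomSitemapController : Controller
{
    public ActionResult SitemapXml()
    {
        var path = HostingEnvironment.MapPath("~/sitemap.custom.xml");
        if (path != null && System.IO.File.Exists(path))
            return Content(System.IO.File.ReadAllText(path), "text/xml");

        return HttpNotFound();
    }
}
```

You would still need to point the existing "sitemap.xml" route at this action, for example from a plugin's route provider registered with a higher priority than the built-in one, or by editing the route registration in Nop.Web, so it isn't entirely source-free; but it does keep web.config untouched.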