robots.txt blocking all pages from being crawled or fetched since upgrade from ver 4.2 to ver 4.6

one week ago
Hi all,
I'm wondering if anyone has had the same problem.
Since I upgraded a store (www.brasaovens.com) from ver 4.2 to ver 4.6, Google Search Console is reporting the error "Blocked by robots.txt" on every single page.

My breadcrumbs and snippets have also dropped.
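For anyone who wants to double-check locally, here is a minimal sketch (Python 3 standard library only; the product URL is a hypothetical placeholder, substitute any page that Search Console flags) that tests a URL against the live robots.txt the way a crawler would:

from urllib.robotparser import RobotFileParser

# Fetch and parse the site's live robots.txt.
rp = RobotFileParser()
rp.set_url("https://www.brasaovens.com/robots.txt")
rp.read()

# Hypothetical page path; replace with any URL that Google
# Search Console reports as "Blocked by robots.txt".
url = "https://www.brasaovens.com/en/some-product"

print(rp.can_fetch("*", url))          # generic crawler
print(rp.can_fetch("Googlebot", url))  # Google's crawler

If both print True, the live file allows crawling, and the Search Console report may be based on an older cached copy of robots.txt.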

one week ago
I am running into a similar issue. Every URL is allowed for crawling, but Google is reporting that some of the URLs are blocked.
one week ago
I don't know if this helps, but I'm adding it to the conversation anyway since it's another clue:
On another site of mine, where I made the same upgrade from ver 4.2 to ver 4.60.6 and which uses the same theme from nop-templates.com, I have 2 languages (PT and ES), and the robots.txt starts with:

User-agent: *
Sitemap: https://prginox.com/pt/sitemap.xml
Sitemap: https://prginox.com/es/sitemap.xml
Host: https://prginox.com/
Disallow: /admin
Disallow: /bin/
...


On brasaovens I have 5 languages (PT, ES, FR, EN, DE), and the robots.txt starts with:

User-agent: *
Sitemap: https://www.brasaovens.com/sitemap.xml
Host: https://www.brasaovens.com/
Disallow: /admin
Disallow: /bin/
...

NOTE that the first website, which has 2 languages, lists 2 sitemaps, while the second lists only one. Since it has 5 languages, shouldn't it have 5 sitemaps?
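To compare what the two stores actually declare, here is a quick sketch (plain Python 3, no third-party packages; the two domains are the ones quoted above) that prints the Sitemap lines from each robots.txt:

from urllib.request import urlopen

# Print the Sitemap: lines declared in each store's robots.txt.
for site in ("https://prginox.com", "https://www.brasaovens.com"):
    with urlopen(site + "/robots.txt") as resp:
        body = resp.read().decode("utf-8", errors="replace")
    sitemaps = [line.strip() for line in body.splitlines()
                if line.lower().startswith("sitemap:")]
    print(site, "->", sitemaps or "no Sitemap lines found")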
4 days ago
Hello,

We have the same issue. A lot of our pages are reported as blocked by robots.txt after upgrading from 4.2 to 4.6.

Any solution?
4 days ago
A single sitemap was introduced in 4.40, see details here. The problem with robots.txt was solved later, so this should work fine.
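For what it's worth, a single /sitemap.xml can still cover several languages if it is a sitemap index pointing at per-language files. Whether the store's sitemap is structured that way is only an assumption, but a short sketch like this (Python 3 standard library) lists whatever <loc> entries the file references:

from urllib.request import urlopen
import xml.etree.ElementTree as ET

# Standard sitemap XML namespace.
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# Fetch the single sitemap and print every <loc> it contains;
# in a sitemap index these are the child (e.g. per-language) sitemaps.
with urlopen("https://www.brasaovens.com/sitemap.xml") as resp:
    root = ET.fromstring(resp.read())

for loc in root.iter(NS + "loc"):
    print(loc.text)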
4 days ago
Thanks for your answer, but we have neither a multilanguage site nor a sitemap. Some valid URLs are just blocked by robots.txt, or at least that's what Google Search Console says.