Googlebot getting errors when crawling intermittently

6 years ago
Hi everyone,

We've recently launched a new site at https://www.cheshiremouldings.co.uk and we are having some strange problems with crawlers, especially googlebot.

Sometimes, when Google crawls the site naturally it will index a page, but it appears unable to render any of the content and displays it like this in the SERPs:

[screenshot of the SERP listing]

If we then fetch it manually, it responds with 'temporarily unavailable', and eventually it works:

[screenshot of the fetch result]

This is a very bizarre occurrence, as it only seems to happen some of the time. Looking at a page in Google's cache, it appears the bot landed on a page that immediately shows an 'error, this page needs to reload' message and was then redirected to a URL with flushnop=true appended. This occasionally happens to visitors as well.
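In case it helps anyone reproduce this, below is a rough sketch of how we've been checking what a crawler gets back; the Googlebot-style user-agent string and the homepage URL are just examples, and the final URL is what we watch for the flushnop=true redirect:

import urllib.error
import urllib.request

# Request the page the way a crawler might, then report the status code
# and the final URL after any redirects (e.g. one ending in flushnop=true).
req = urllib.request.Request(
    "https://www.cheshiremouldings.co.uk/",
    headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
)
try:
    with urllib.request.urlopen(req, timeout=30) as resp:
        print(resp.status, resp.geturl())
except urllib.error.HTTPError as err:
    print("HTTP error:", err.code)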

Any help would be greatly appreciated.

We have reviewed robots.txt and nothing in there is blocking the meta description or anything else critical, for that matter.
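For reference, this is roughly how we verified that; a minimal check using Python's standard robots.txt parser, with our own URLs as the example:

from urllib.robotparser import RobotFileParser

# Parse the live robots.txt and ask whether Googlebot may fetch a given page.
rp = RobotFileParser("https://www.cheshiremouldings.co.uk/robots.txt")
rp.read()
print(rp.can_fetch("Googlebot", "https://www.cheshiremouldings.co.uk/"))  # True means allowed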
6 years ago
Does anyone know what could cause flushnop=true?
6 years ago
cheshiremouldings wrote:
Hi,

Have you tried asking on a WordPress board? That store is using WordPress, not nopCommerce.

Regards,
Tomasz
This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.