Hi everyone,
We've recently launched a new site at https://www.cheshiremouldings.co.uk and we are having some strange problems with crawlers, especially Googlebot.
Sometimes, when Google crawls the site naturally, it will index a page but, from the looks of it, is unable to render any of the content, and this is what shows in the SERPs.
If we then fetch the page manually, it responds with 'temporarily unavailable', and only after repeated attempts does it eventually work.
This is a very bizarre occurrence, as it only seems to happen sometimes. When we've visited a page in Google's cache, the bot has landed on a page that immediately shows an 'error, this page needs to reload' message and is then redirected to flushnop=true. This occasionally happens to real visitors as well.
Any help would be greatly appreciated
We have reviewed robots.txt and nothing in it is blocking the meta description, or anything critical for that matter.
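In case it helps anyone reproduce our check, here is a quick sketch of how we verified the robots.txt rules using Python's standard library. The rules shown here are hypothetical placeholders, not our actual file; you'd paste in the real contents from https://www.cheshiremouldings.co.uk/robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents -- substitute the real file's rules here
robots_txt = """
User-agent: *
Disallow: /cart/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check whether Googlebot is allowed to fetch specific URLs
print(rp.can_fetch("Googlebot", "https://www.cheshiremouldings.co.uk/"))
print(rp.can_fetch("Googlebot", "https://www.cheshiremouldings.co.uk/cart/checkout"))
```

With the placeholder rules above, the homepage comes back as allowed and anything under /cart/ as disallowed, which matches what we see for our real file: nothing critical is blocked.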