I've been getting these too, in two v1.5 deployments (not in any 1.4). They aren't proper stores, just working samples.
I'm also getting a lot of "Length cannot be less than zero. Parameter name: length":
Log type: Unknown
Severity: 11
Message: Length cannot be less than zero. Parameter name: length
Exception: System.ArgumentOutOfRangeException: Length cannot be less than zero.
Parameter name: length
   at System.String.InternalSubStringWithChecks(Int32 startIndex, Int32 length, Boolean fAlwaysCopy)
   at System.Web.Handlers.AssemblyResourceLoader.System.Web.IHttpHandler.ProcessRequest(HttpContext context)
   at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
   at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)
IP address: 62.24.129.13
Customer:
Page URL: http://domainname/webresource.axd?d=v22wlqpvkejm1rlzwvk2ia2&t=633750414833416323
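For what it's worth, the stack trace points at a `Substring` call inside `AssemblyResourceLoader` being handed a negative length. A rough Python sketch of how that can happen (the key format and the function names here are my assumption for illustration, not the actual ASP.NET source): if a crawler truncates the `d` query string, the decoded resource key can lose its separator, `IndexOf`/`find` returns -1, and that -1 ends up as the length argument:

```python
def substring(s: str, start: int, length: int) -> str:
    """Mimic .NET String.Substring(start, length) argument checking."""
    if length < 0:
        raise ValueError("Length cannot be less than zero. Parameter name: length")
    return s[start:start + length]

def parse_resource_key(decoded: str) -> tuple[str, str]:
    """Hypothetical sketch: split an 'assembly|resource' key the way a
    resource handler might. With a truncated URL the separator is gone,
    find() returns -1, and the negative length reaches substring(),
    reproducing the exact exception message from the log."""
    pos = decoded.find("|")                 # -1 when the separator is missing
    assembly = substring(decoded, 0, pos)   # raises when pos == -1
    resource = decoded[pos + 1:]
    return assembly, resource
```

So `parse_resource_key("MyAssembly|styles.css")` succeeds, while a truncated key like `"MyAssembl"` raises the "Length cannot be less than zero" error, which would fit the theory that crawlers requesting cut-off `webresource.axd` URLs are the trigger.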
Could you please explain to me how to use "what Google sees"? I googled it but could not find a solution.
Open the Google Webmaster Tools pages. There should be a link called Diagnostics with a submenu option Crawl errors (the names might differ slightly, since I'm looking at it in Dutch).
If everything is OK, the only pages in the list(s) are pages that are blocked by the robots.txt file. Any other crawl errors will be displayed there as well.
The reason I asked: everyone who is seeing these strange errors in the system log probably has one thing in common. We have all submitted our websites to Google Webmaster Tools, and I suspect the Google crawler is triggering these error messages while crawling the site's pages.
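If that theory holds, one possible workaround (my suggestion, not something confirmed in this thread) would be to keep crawlers away from the .axd handler URLs entirely via robots.txt. Googlebot supports the `*` wildcard in Disallow rules, so a minimal sketch would be:

```
User-agent: *
Disallow: /*.axd
```

Note this only stops well-behaved crawlers from requesting those URLs; it does not fix whatever is truncating them in the first place.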
This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.