Extremely high database load after upgrading to 3.8 - did I do something wrong? Need help ASAP

7 years ago
Try the following in ProductService.cs. This is only a temporary fix, since it might hurt SEO. I also made some changes in Google Webmaster Tools.

filterableSpecificationAttributeOptionIds = new List<int>();

//CUSTOM CODE START
//short-circuit spec-filtered ("?specs=...") requests coming from search engine
//bots, so they never reach the expensive spec-filtering database work
if (_workContext.CurrentCustomer.IsSearchEngineAccount())
{
    if (HttpContext.Current.Request.Url.AbsoluteUri.Contains("specs"))
    {
        //return an empty page of results instead of hitting the database
        return new PagedList<Product>(new List<Product>(), 0, 1);
    }
}
//CUSTOM CODE END

//search by keyword

Google Webmaster Tools:
http://i.imgur.com/xz1c3dt.png

I'm not sure if ignoring specs on indexing will hurt the SEO, but so far it does not seem so.
7 years ago
Thank you Lintho for the quick post and suggested temporary fix.

Our database CPU usage has returned to a more normal state without applying the fix. I'm watching it closely, and if it happens again I will use your fix and post the results.

Thanks again!
7 years ago
We run nop 3.8 on Azure Web Apps with Azure SQL. We only have about 200 products.

On the App Server side:
We were on S1. Memory avg was 70%. CPU avg was around 30%, but we just scaled up to an S2 to see the result.
Our longest-duration server responses (measured by percentile_duration_50) were from GET Catalog/Category (3.1 sec) and GET Product/ProductDetails (2.6 sec). Is caching used in these methods, or could it be? As our pricing and product content doesn't change that much, I'd rather have these pages cached and do a forced clear of the cache when we update. Could you implement a page-level cache on the action, varied by params?
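A minimal sketch of the kind of action-level caching we mean, using ASP.NET MVC 5's built-in output cache (this is just an illustration, not nop's actual CatalogController; the action signature is simplified):

using System.Web.Mvc;
using System.Web.UI;

public class CatalogController : Controller
{
    //cache the rendered page per unique parameter combination for one hour;
    //a forced cache clear would still be needed when products or prices change
    [OutputCache(Duration = 3600, VaryByParam = "*", Location = OutputCacheLocation.Server)]
    public ActionResult Category(int categoryId, int? pagenumber)
    {
        //...build the category model as usual...
        return View();
    }
}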

We used this Insights Analytics query:

//Estimate the percentile distribution of an expression.
requests
| where timestamp >= now(-7d)
| where toint(duration) > 0
| summarize count(id), avg(duration), percentiles(duration,20,50,75) by name
| where count_id > 5
| order by percentile_duration_50  desc



On the SQL side:
We were on SQL S1 and found our DTU % to range between 40% and 80%. We just upgraded to S2 to see the result.

Using Query Performance Insight, we too saw that INSERT INTO #FilterableSpecs was the number-one long-running query. Our page load times were around 10 seconds (the server-response portion alone was about 4 sec). Is there some feature we can turn off that utilizes this stored procedure, to reduce its impact?
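For context, a rough sketch of the call that drives this work (based on nop 3.8's ProductService.SearchProducts; the real overload has many more parameters, so treat the exact signature as an assumption and check IProductService): the #FilterableSpecs population in ProductLoadAllPaged only happens when the caller asks for the filterable spec option IDs, so a caller that doesn't need the filter facets can skip it.

IList<int> filterableSpecs;
var products = _productService.SearchProducts(
    out filterableSpecs,
    //false skips the INSERT INTO #FilterableSpecs branch in the stored procedure
    loadFilterableSpecificationAttributeOptionIds: false,
    categoryIds: new List<int> { categoryId }, //categoryId = the category being viewed
    pageIndex: 0,
    pageSize: 20);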

Plugin Changes
I use the plugin suite from Nop-Templates. I found that disabling the Ajax Filters, Sales Campaigns, and Product Ribbons plugins helped some, but I haven't had time to measure the result yet. The first two are hit on every product detail load and were adding about 1 sec.

The Net
Azure has some great advantages for hosting web apps and Nop users. The auto scaling, and the fact that you don't need to be bothered with the whole server-admin side, are great time and money savers. Having Nop work well on their PaaS platform would greatly help spread continued Nop adoption, but only if it is performant. To be a major contender with Magento, it needs to scale for the bigger players. It'd be great to see a concerted effort on the part of the Nop team to optimize Nop for this platform. I love Nop, but even before hosting on Azure (we had a dedicated VM with Nop and SQL Server on the same machine), it has never been what I'd call fast... and we only run 200 products. It needs to be snappy!
7 years ago
GreatAdventure.ca wrote:
Thank you Lintho for the quick post and suggested temporary fix.

Our database CPU usage has returned to a more normal state without applying the fix. I'm watching it closely, and if it happens again I will use your fix and post the results.

Thanks again!


The issue has flared up again. We have moved from an S2 database to an S3, but the CPU usage is continuing to climb and is currently sitting at 70%. The query identified in the first post by Lintho seems to be the primary cause.

I haven't used Lintho's temporary fix yet, as my parameter numbers in Google Webmaster Tools are nowhere near as high as what Lintho was getting. Plus, I'm also concerned about the potential search engine impact.

My question is:

Is this long-running query a nopCommerce issue? An Azure issue? Or a plugin issue?
7 years ago
As far as I see it, it is a performance problem with Azure SQL. Google bots kept hitting us 3 times per second. Should nopCommerce handle that on an Azure database? I would think so. How can it be acceptable to require a local SQL Server on a VM to run nopCommerce once bots start hammering the site? 3 requests per second is not a lot. I posted the numbers on the Google forums, and people were confused as to how 3 requests a second could be a problem. That's nothing.

And if we load test other sites with 200 concurrent users, it's NO problem. Do the same with nopCommerce and the SQL DTU flies through the roof after seconds. I have no idea why.
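For anyone who wants to reproduce this, here is a crude sketch of what a "200 concurrent users" test means here (the URL is a placeholder; a proper load-testing tool would pace and ramp the requests):

using System;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class LoadTest
{
    static void Main()
    {
        //.NET defaults to 2 concurrent connections per host - raise it first
        ServicePointManager.DefaultConnectionLimit = 200;
        var url = "https://your-shop.example.com/some-category"; //placeholder URL
        using (var client = new HttpClient())
        {
            //fire 200 GET requests at once, roughly simulating 200 simultaneous
            //users, then watch the DTU graph in the Azure portal while this runs
            var tasks = Enumerable.Range(0, 200)
                .Select(_ => client.GetAsync(url))
                .ToArray();
            Task.WaitAll(tasks);
            Console.WriteLine(tasks.Count(t => t.Result.IsSuccessStatusCode) + " requests succeeded");
        }
    }
}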
7 years ago
You could watch this video:
https://www.youtube.com/watch?v=AuX5un-1f0Q

The sound is not great, but here is the presentation: http://downloads.nopcommerce.com/days2016/popov.pdf

It seems that 50 concurrent users for 2 minutes is a lot to handle; you need at least S2 Standard to keep response times at a decent level. It still does not explain why a site without any load (at least in analytics) keeps using 100% DTU. It has to be the Google bots, and they only visit the site 3 times per second.
7 years ago
lintho wrote:
As far as I see it, it is a performance problem with Azure SQL. Google bots kept hitting us 3 times per second.


Thanks Lintho, but why all of a sudden? We have been using Nop on Azure for years, and did the 3.8 upgrade several months ago without any increase in DTU/CPU. We had always used an S2 database, and DTU/CPU never went over 20%; the large majority of the time it was under 5%.

Nothing seems to have changed with regard to Google bots from what I can tell.

Then around Nov 12, the DTU/CPU issues started. Around this same time (I think on the night of the 11th), I made the following changes to our site.

- Updated our Nop-Templates themes and plugins to the newest build (the same themes and plugins that we had always used).
     - Themes
        - Alfresco; Traction
     - Plugins
        - AjaxCart; AjaxFilters; AjaxFiltersPro; AnywhereSliders; Attachments; CloudZoom; CustomerReminders; HtmlWidgets; InstantSearch; JCarousel; MegaMenu; NopQuickTabs; PrevNextProduct; ProductRibbons; QuickView; RichBlog & SmartSEO

- I also added the following new Theme and Plugins
     - Theme: Pavilion (Nop-Templates) - changed one of our multistores from Alfresco to Pavilion
     - Plugins: SmartProductCollection (Nop-Templates); MSSQLProvider (FoxNetSoft); nopAccelerateCDN (NopAccelerate)

I have since uninstalled the MSSQLProvider plugin.

For additional information, when I look at the stored procedures in the database there are three that are very similar to the long running query. They are:

- dbo.ProductLoadAllPaged
- dbo.ProductLoadAllPagedNopAjaxFilters
- dbo.ProductLoadAllPagegNopAjaxFilters

I asked Nop-Templates support about the bottom two, and they told me that the bottom one "is a left-over from some of the upgrades. You can remove it without any problems because this stored procedure is not used." Unfortunately, I don't know how to remove it.
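In case it helps anyone in the same spot: an unused stored procedure can be removed with a plain DROP PROCEDURE statement from SQL Server Management Studio or the Azure portal's query editor. A sketch of doing the same from a one-off C# utility (the connection string is a placeholder; use the one from your Settings.txt):

using System.Data.SqlClient;

var connectionString = "Server=tcp:yourserver.database.windows.net;Database=yourdb;User ID=...;Password=..."; //placeholder
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    //guard with OBJECT_ID so this is safe to run even if the proc is already gone
    var sql = "IF OBJECT_ID('[dbo].[ProductLoadAllPagegNopAjaxFilters]', 'P') IS NOT NULL " +
              "DROP PROCEDURE [dbo].[ProductLoadAllPagegNopAjaxFilters]";
    using (var command = new SqlCommand(sql, connection))
    {
        command.ExecuteNonQuery();
    }
}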

Instead of bots, could the issue be with one of the three stored procedures?
7 years ago
GreatAdventure.ca wrote:

Thanks Lintho, but why all of a sudden? We have been using Nop on Azure for years, and did the 3.8 upgrade several months ago without any increase in DTU/CPU. We had always used an S2 database, and DTU/CPU never went over 20%; the large majority of the time it was under 5%.



I know, right! This is what's bugging me too. I have a huge thread on this with the nopCommerce team, where they asked me to change the stored procedure, disable plugins, etc., and nothing helped.

I tried uninstalling and removing the stored procedure, and nothing changed for me.

I would love to get to the bottom of this ASAP.

I hope more people end up with the same problem, so we can eventually figure out what's wrong.
7 years ago
lintho wrote:

I would love to get to the bottom of this ASAP.

I hope more people end up with the same problem, so we can eventually figure out what's wrong.


Me too, this is very frustrating.


It's probably not related, but I'll mention it in case it is and it helps someone trying to solve this mystery. In my "online customers" list, I often have one or two people from China. What is strange is that the URL parameters on the "last visited page" always show specs followed by three sets of numbers, such as "?specs=220,715,700".

All of the other "online customers" are either from Canada or the US (our typical customer base), and those always have the SEO-friendly URLs for the last page visited, not the parameter with three sets of numbers.

Also, if I copy and paste the strange parameter URL into a browser, it goes to the particular category but no items are displayed (example: https://www.adventuregear.ca/knives-accessories?specs=220,715,700).
7 years ago
We are seeing the same thing. Google is indexing every specs parameter combination as a separate, unique URL. From what it looks like, the specs query parameter is used to navigate to a category and then filter on certain specifications. It is used by Nop-Templates.com's Ajax Filters plugin (not sure if base Nop has this feature as well?). So if you use Ajax Filters and select certain specifications to filter by, it will generate URLs of the form http://<your-site.com>/<categoryname>?specs=220,715,700, etc.

I'm not sure how Google knows which combinations of spec IDs to use in generating these URLs, as they are dynamically generated based on user filtering, but regardless, that is what is happening. And since the combinations and permutations of these spec IDs can produce a very large number of unique URLs, you get Googlebot pounding your website for each one.

If you go into Google Webmaster Tools > Crawl > URL Parameters, you will see this specs parameter, as well as others like orderby, pagesize, pagenumber, etc. From my reading and understanding, I believe you should change the crawl setting for most of these parameters (all except the pagenumber parameter) to "No: Doesn't affect page content", which Google describes as follows:

"Select this option if this parameter can be set to any value without changing the page content. For example, select this option if the parameter is a session ID. If many URLs differ only in this parameter, Googlebot will crawl one representative URL."

This is what I want: basically, ignore the specs parameter (and the others) and just crawl one representative URL. Today it states it is monitoring 106,109 unique URLs that have the specs parameter... and we only have < 200 products! Google will see and index all our products simply by crawling the root category pages, except for pages 2 and onward. For that reason I left the pagenumber parameter on the "Let Googlebot decide" setting, so it will hopefully treat http://<your-site.com>/<categoryname>?pagenumber=2, http://<your-site.com>/<categoryname>?pagenumber=3, http://<your-site.com>/<categoryname>?pagenumber=4, etc. as separate unique URLs, crawl each of them, and hence capture all products.
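If the Webmaster Tools setting isn't enough, another option (a sketch of an application-side approach, not something built into nop or the Ajax Filters plugin) is to mark spec-filtered pages noindex, so any crawler drops these permutations from its index while still following links to the canonical category URL:

using System.Web.Mvc;

public class NoIndexSpecFilterAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        //if the request carries a ?specs= filter, flag the view to render
        //<meta name="robots" content="noindex, follow" /> in the <head>
        var specs = filterContext.HttpContext.Request.QueryString["specs"];
        if (!string.IsNullOrEmpty(specs))
            filterContext.Controller.ViewBag.NoIndex = true;
        base.OnActionExecuting(filterContext);
    }
}

Decorate the category action with [NoIndexSpecFilter] and have the layout emit the meta tag when ViewBag.NoIndex is set.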

This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.