Extremely high database load after upgrading to 3.8 - did I do something wrong? Need help ASAP

7 years ago
GreatAdventure.ca wrote:


Thank you, Hristian, for clarifying. I agree, it's not a Nop-Templates issue, but an issue with a bot stuck on a long-running nopCommerce query. In my case that bot is Chinese, whereas lintho's bot is from Google. Thank you for the suggestion.


Do you think it's stuck in some loop, or could it be that the method is just accessed a ton?
7 years ago
Nop-Templates.com wrote:
Hi guys,

Because I'm seeing our plugin (Ajax Filters) mentioned quite a few times, I would like to clear things up.

When a URL containing "?specs=220,715,700" is loaded, nopCommerce uses its own stored procedure (which is where the mentioned slow query comes from) to load the products. Our plugin does not call its own stored procedure until filtering is actually applied by a customer (calling "?specs=***" would not trigger anything in our plugin). Also, when you filter with the Ajax Filters plugin, we append the applied filters after the hash symbol (computers#/specFilters=2m!#-!6, for example), which is client-side only (it exists only on the customer's machine) and cannot be indexed by the search engines.

Keep in mind that if you really have a search engine refreshing your site a few times a second, then it is normal to see 100% DTU usage. Running this slow query so frequently is simply "expensive", and that is what keeps the DTU usage at maximum. My suggestion is to blacklist the Chinese IPs that the "search engine" is using.

Hope this helps!

Regards,
Hristian


Do you guys use the stored procedure 'ProductLoadAllPaged' to generate the filterable options on first load? If not, I could remove that part from the procedure.
7 years ago
lintho wrote:


But what if it's Google, Bing, or some other useful crawler?


The IP address from China hurting my database belonged to Baiduspider. Apparently Baidu is a search engine in China, but I'm OK with blocking it since I don't have customers in China.

lintho wrote:


Do you think its stuck in some loop, or could it be that the method is accessed a ton?


Sorry, I don't know. I'm not an expert, or even a developer for that matter; I'm just a business owner who liked nopCommerce and the idea of hosting it in the cloud.

Either way, whether it's stuck in a loop or just accessed a lot, everything points to the stored procedure being the issue, especially if a bot or spider gets caught in it.

chadwixk wrote:
Yes! I have that Chinese visitor as well. Can I maybe set up a firewall rule in Azure to block the Chinese IP addresses?


I added the following to my web.config file between the <system.webServer> tags:

      <security>
         <ipSecurity>
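            <!-- IPs listed here are denied by default; the 255.255.255.0 subnet mask blocks the whole 180.76.15.* range -->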
            <add ipAddress="180.76.15.0" subnetMask="255.255.255.0" />
         </ipSecurity>
      </security>

and now the CPU usage on my database has dropped to below 1%. Time to switch back to S2.

I do think I will have to look more into ModSecurity as a better long-term solution, and would love to hear from anyone who has installed it on their Azure Web App.
7 years ago
They may use different IP addresses, though. The other thing I found was to use IIS rewrite rules, which look at the user agent string rather than a specific IP address. Here is an example to put in your web.config file within the <system.webServer> tags:


    <rewrite>
      <rules>
        <rule name="RequestBlockingRuleforCrawlers" stopProcessing="true">
            <match url=".*" />
            <conditions>
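               <!-- any request whose User-Agent contains one of these bot names (matching is case-insensitive by default) gets the 403 below -->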
               <add input="{HTTP_USER_AGENT}" pattern="Baiduspider|Yandex|YoudaoBot|sogou|NaverBot|moget|ichiro" />
            </conditions>
            <action type="CustomResponse" statusCode="403"
               statusReason="Forbidden: Access is denied."
               statusDescription="You do not have permission to view this page." />
         </rule>
        
      </rules>
    </rewrite>


Note: this requires IIS to have the URL Rewrite module installed and configured. Azure Web Apps has it by default.
7 years ago
lintho wrote:

Do you guys use the stored procedure 'ProductLoadAllPaged' to generate the filterable options on first load? If not, I could remove that part from the procedure.


Hi,

No, we are not using the 'ProductLoadAllPaged' stored procedure to generate the filterable options. You can remove the part where the stored procedure builds @FilterableSpecificationAttributeOptionIds and see if that makes a difference.

In other words, remove this:

DECLARE @sql_filterableSpecs nvarchar(max)
SET @sql_filterableSpecs = '
  INSERT INTO #FilterableSpecs ([SpecificationAttributeOptionId])
  SELECT DISTINCT [psam].SpecificationAttributeOptionId
  FROM [Product_SpecificationAttribute_Mapping] [psam] WITH (NOLOCK)
    WHERE [psam].[AllowFiltering] = 1
    AND [psam].[ProductId] IN (' + @sql + ')'

EXEC sp_executesql @sql_filterableSpecs


Regards,
Hristian
7 years ago
FYI, I tried the change recommended by Microsoft (quoted below)... it didn't seem to help at all, and maybe even made things a little worse.

When is this script actually used, and why? Is it only when the ?specs=21,33,24 etc. category URLs are called? Is there a setting that turns it off? I see this in the stored proc:
--prepare filterable specification attribute option identifier (if requested)
    IF @LoadFilterableSpecificationAttributeOptionIds = 1


So what determines if @LoadFilterableSpecificationAttributeOptionIds = 1? Some setting?

It is this specific stored proc that accounts for the large DTU utilization % in Azure, slows down the site, and forces me onto a much higher DB plan. If it weren't for this, I could probably run on S0 ($15/mo) or S1 ($30/mo) instead of S3 ($150/mo)... that's a big difference!
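
(Side note, in case it helps anyone verify this on their own database: below is a rough sketch of a DMV query that lists the top CPU consumers on an Azure SQL database, so you can confirm it really is this stored procedure at the top. Nothing here is nopCommerce-specific, and I'm assuming you can run ad-hoc queries against the database.)

    -- rough sketch: top 10 statements by total CPU time, from the plan cache DMVs
    SELECT TOP (10)
        qs.execution_count,
        qs.total_worker_time / 1000 AS total_cpu_ms,
        SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
            ((CASE qs.statement_end_offset
                WHEN -1 THEN DATALENGTH(st.text)
                ELSE qs.statement_end_offset END
              - qs.statement_start_offset) / 2) + 1) AS statement_text
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY qs.total_worker_time DESC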

lintho wrote:
Answer from Microsoft

Our current hypothesis is that the issue is related to compilations of the query you already identified and shared in the nopCommerce forum.

I had a look at the public nopCommerce code repository in order to speed up the analysis and found that this query seems to be called by the stored procedure ProductLoadAllPaged (https://github.com/nopSolutions/nopCommerce/blob/develop/src/Presentation/Nop.Web/App_Data/Install/SqlServer.StoredProcedures.sql).

Please note this may or may not be the code you are running; I am assuming you are using the latest code base published in the repository.

The relevant code section is:

--prepare filterable specification attribute option identifier (if requested)
IF @LoadFilterableSpecificationAttributeOptionIds = 1
BEGIN
    CREATE TABLE #FilterableSpecs
    (
        [SpecificationAttributeOptionId] int NOT NULL
    )

    DECLARE @sql_filterableSpecs nvarchar(max)
    SET @sql_filterableSpecs = '
        INSERT INTO #FilterableSpecs ([SpecificationAttributeOptionId])
        SELECT DISTINCT [psam].SpecificationAttributeOptionId
        FROM [Product_SpecificationAttribute_Mapping] [psam] WITH (NOLOCK)
        WHERE [psam].[AllowFiltering] = 1
        AND [psam].[ProductId] IN (' + @sql + ')'

    EXEC sp_executesql @sql_filterableSpecs

According to our hypothesis (which we are still working on and which is not confirmed yet), the issue arises when the table #FilterableSpecs is created in the stored procedure scope and is then referenced in the scope of the dynamic query.

The dynamic query performs a single statement to retrieve a result set and put it into the #FilterableSpecs temporary table.

There is an alternative form that achieves the same result and avoids referencing the temporary table in the dynamic code.

The alternative form would be something like this (the changes: the dynamic SQL now only does the SELECT, and the result is loaded via INSERT ... EXEC):

--prepare filterable specification attribute option identifier (if requested)
IF @LoadFilterableSpecificationAttributeOptionIds = 1
BEGIN
    CREATE TABLE #FilterableSpecs
    (
        [SpecificationAttributeOptionId] int NOT NULL
    )

    DECLARE @sql_filterableSpecs nvarchar(max)
    SET @sql_filterableSpecs = '
        SELECT DISTINCT [psam].SpecificationAttributeOptionId
        FROM [Product_SpecificationAttribute_Mapping] [psam] WITH (NOLOCK)
        WHERE [psam].[AllowFiltering] = 1
        AND [psam].[ProductId] IN (' + @sql + ')'

    INSERT #FilterableSpecs ([SpecificationAttributeOptionId]) EXEC(@sql_filterableSpecs)

Can you please check with nopCommerce if this may be a valid alternative and see if there is any way to test it?

As a further note, you may disable forced parameterization

Any thoughts on this from the nopCommerce team? Is my current stored procedure wrong in any way?
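
(For reference, the "forced parameterization" Microsoft mentions at the end is a database-level option. A minimal sketch of checking it and switching it back to the default, where 'MyNopDb' is just a placeholder for your own database name:)

    -- check whether forced parameterization is currently enabled (1 = forced)
    SELECT name, is_parameterization_forced
    FROM sys.databases
    WHERE name = 'MyNopDb'

    -- switch back to the default (simple) parameterization
    ALTER DATABASE [MyNopDb] SET PARAMETERIZATION SIMPLE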
7 years ago
I added this rewrite rule in web.config to redirect any
mysite.com/folder?specs=2,33,42,45
to just
mysite.com/folder


        <rule name="SpecsSpiderCrawl" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <add input="{QUERY_STRING}" pattern="specs=.*" />
          </conditions>
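          <!-- {R:0} is the full path matched by ".*"; appendQueryString="false" drops the ?specs=... query string from the redirect target -->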
          <action type="Redirect" url="http://{HTTP_HOST}/{R:0}" appendQueryString="false" redirectType="Permanent" />
        </rule>


We don't use spec filtering, so that should be OK. If you do use it and only want to apply this redirect to requests from bots, you could include the following within the <conditions> tag:

<add input="{HTTP_USER_AGENT}" pattern="googlebot|bingbot|Baiduspider|Yandex|YoudaoBot|sogou|NaverBot|moget|ichiro" />


Bot user agent reference
7 years ago
chadwixk wrote:
I added this rewrite rule in web.config to redirect any
mysite.com/folder?specs=2,33,42,45
to just
mysite.com/folder


        <rule name="SpecsSpiderCrawl" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <add input="{QUERY_STRING}" pattern="specs=.*" />
          </conditions>
          <action type="Redirect" url="http://{HTTP_HOST}/{R:0}" appendQueryString="false" redirectType="Permanent" />
        </rule>


We don't use spec filtering, so that should be OK. If you do use it and only want to apply this redirect to requests from bots, you could include the following within the <conditions> tag:

<add input="{HTTP_USER_AGENT}" pattern="googlebot|bingbot|Baiduspider|Yandex|YoudaoBot|sogou|NaverBot|moget|ichiro" />


Bot user agent reference


You could also try what Nop-Templates wrote three posts ago.

Did your redirect change help, btw?
7 years ago
I did try the suggestion from Nop-Templates, and some quick tests seemed to show some improvement, but nothing great.

The rewrite rules have worked wonders. This is on the S2 size in Azure, and I'm very pleased with the result.

This was the result of redirecting the China bots and the spec URLs.
7 years ago

chadwixk wrote:
I did try the suggestion from Nop-Templates, and some quick tests seemed to show some improvement, but nothing great.

The rewrite rules have worked wonders. This is on the S2 size in Azure, and I'm very pleased with the result.

This was the result of redirecting the China bots and the spec URLs.

This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.