Performance testing nopCommerce 3.70 on Azure using Redis + Blob + Azure SQL

7 years ago
No problem. Yes, nop will store your original full-size image along with the thumbnails. The problem is that internally the binary image is loaded into memory from the database or disk the first time it is accessed. If you store original pictures in the database, the full binary is returned EVERY TIME the picture record is accessed. In both cases this is a waste of resources, since all the pictures are already in blob storage. Once the site has 'warmed up' and the page results are cached, you may see reasonable performance with pictures stored on disk. We have several thousand products, which are constantly being added to. We also deploy updates quite often, which clears the cache.

www.openforvintage.com

Conclusion: if you have a lot of products or high-res images, you do not want to store them in the database, and if you want to scale your site to large numbers of users, you don't want them on the web server either. Successful ecommerce sites need to serve pages in 3-4 seconds or less. You can get around this to an extent by throwing all your money at Azure, but who wants to do that?
7 years ago
pdunleavy75 wrote:
No problem. Yes, nop will store your original full-size image along with the thumbnails. The problem is that internally the binary image is loaded into memory from the database or disk the first time it is accessed. If you store original pictures in the database, the full binary is returned EVERY TIME the picture record is accessed. In both cases this is a waste of resources, since all the pictures are already in blob storage. Once the site has 'warmed up' and the page results are cached, you may see reasonable performance with pictures stored on disk. We have several thousand products, which are constantly being added to. We also deploy updates quite often, which clears the cache.

www.openforvintage.com

Conclusion: if you have a lot of products or high-res images, you do not want to store them in the database, and if you want to scale your site to large numbers of users, you don't want them on the web server either. Successful ecommerce sites need to serve pages in 3-4 seconds or less. You can get around this to an extent by throwing all your money at Azure, but who wants to do that?


3-4 seconds seems like a lot. We tend to aim for below 1 second.
7 years ago
Ok, gotcha now. Thanks for clarifying!
7 years ago
Still related: I should point out that images uploaded into nop are not web-optimised or shrunk. This compounds the performance problem. Any thumbnails generated from your images will also be too large. Our product listing page with around 50 products is a 6 MB download!
I'm going to develop a plugin to intercept picture inserts and use something like tinypng to optimise the images before they are imported. I can't find a plugin in the marketplace that does this, so let me know if anyone is interested in it when I finish.
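In the meantime, the basic idea can be sketched as a pre-storage optimisation step. This is a hedged example using the stock System.Drawing API rather than the tinypng service, and the class and method names are purely illustrative, not an actual nopCommerce plugin hook:

```csharp
using System;
using System.Drawing;
using System.Drawing.Drawing2D;
using System.Drawing.Imaging;
using System.IO;
using System.Linq;

// Sketch: shrink an uploaded image to a maximum dimension and re-encode
// as JPEG at ~80% quality before nop stores it. A real plugin would wire
// this into the picture insert path (or call the tinypng HTTP API instead).
public static class ImageOptimizer
{
    public static byte[] Optimize(byte[] original, int maxSize = 1600)
    {
        using (var input = new MemoryStream(original))
        using (var image = Image.FromStream(input))
        {
            // Scale down only; never upscale small images.
            var scale = Math.Min(1.0, (double)maxSize / Math.Max(image.Width, image.Height));
            var w = (int)(image.Width * scale);
            var h = (int)(image.Height * scale);

            using (var resized = new Bitmap(w, h))
            using (var g = Graphics.FromImage(resized))
            {
                g.InterpolationMode = InterpolationMode.HighQualityBicubic;
                g.DrawImage(image, 0, 0, w, h);

                // Re-encode as JPEG at quality 80 to cut the payload size.
                var jpeg = ImageCodecInfo.GetImageEncoders()
                    .First(c => c.FormatID == ImageFormat.Jpeg.Guid);
                var args = new EncoderParameters(1);
                args.Param[0] = new EncoderParameter(Encoder.Quality, 80L);

                using (var output = new MemoryStream())
                {
                    resized.Save(output, jpeg, args);
                    return output.ToArray();
                }
            }
        }
    }
}
```

Since thumbnails are generated from the stored original, shrinking the original once at upload time also shrinks every derived thumbnail.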
7 years ago
Yes, it seems the separation of app and SQL servers in the Azure App Service and Azure SQL setup is definitely the biggest issue. Using Application Insights we were able to run some reports to sample the results of a category view server request and its resulting dependencies. It showed that the DB dependency calls were 2/3 of the total server request time. For example, a category view server request took 3.5 sec, and 2.3 sec of that was the 40 database calls that were made.

Entity Framework and lazy loading make it very easy as a developer to build a chatty application against the database. We have an ERP web app that we built as well, running on Azure. We had to go through each of the slow server requests and run them with SQL Profiler to find all the code resulting in additional DB calls. In general we did more eager loading and stored results in variables in order to reduce DB calls in our methods. The results were quite impressive. In some cases, poor coding had things like a foreach loop over an IQueryable, resulting in each entity being pulled individually from the DB, and then, due to lazy loading, each of those requests spawning multiple additional requests to get related table data. The result would be hundreds of DB calls that slowed the app to a crawl.
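The foreach-over-IQueryable trap can be sketched like this, using hypothetical Product/Category entities (not nopCommerce's actual domain model) and assuming EF 6:

```csharp
using System.Data.Entity; // EF 6; provides the Include() extension
using System.Linq;

public class Product
{
    public int Id { get; set; }
    public virtual Category Category { get; set; } // virtual => lazy-loaded proxy
}

public class Category
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class CatalogReader
{
    private readonly DbContext _db;
    public CatalogReader(DbContext db) { _db = db; }

    // CHATTY: one query for the list, then one extra query per product
    // the first time its lazy-loaded Category is touched (N+1 calls).
    public int CountShoesChatty()
    {
        var count = 0;
        foreach (var p in _db.Set<Product>())   // query 1
            if (p.Category.Name == "Shoes")     // +1 query per product
                count++;
        return count;
    }

    // EAGER: Include() turns the related load into a JOIN, and ToList()
    // materializes everything in a single round-trip.
    public int CountShoesEager()
    {
        return _db.Set<Product>()
                  .Include(p => p.Category)
                  .ToList()                     // 1 query total
                  .Count(p => p.Category.Name == "Shoes");
    }
}
```

With a 100 ms network round-trip to Azure SQL, the chatty version costs roughly (N + 1) × 100 ms for N products, while the eager version stays near a single round-trip.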

When I get some time, I am planning to try this optimization on the nop 3.8 category view methods to see if those 40 DB calls can be cut down to 3 to 5. I'm sure it would have a tremendous impact.

ArielDeil wrote:
The conclusion of my test was that you can run 20 concurrent users on a single machine with a maximum response time of 2 sec with 2 CPUs and 8 GB RAM. I tested this with 10 servers and got 200 concurrent users.


One thing to point out is that 20 concurrent users means you have 20 HTTP requests on the server at once. The actual number of users surfing your website (reading product details, checking images, etc.) is usually 4 times this number, so you might have 80 users visiting your website.


A few things to consider:
- SQL Azure slows things down because all SQL data has to travel over the network
- Redis will also slow down the website, for the same reason
- Using Blob storage for files does not seem to affect the speed

Conclusion:
The best setup for nopCommerce is: local SQL server and local memory cache.

If you MUST handle more users than your server can hold, then you have to split nopCommerce across more than one server, and you should use Redis and a central SQL server.


In my case the actual project was not worth it, my client moved to another platform because of the cost of servers.
7 years ago
chadwixk wrote:
Yes, it seems the separation of app and SQL servers in the Azure App Service and Azure SQL setup is definitely the biggest issue. Using Application Insights we were able to run some reports to sample the results of a category view server request and its resulting dependencies. It showed that the DB dependency calls were 2/3 of the total server request time. For example, a category view server request took 3.5 sec, and 2.3 sec of that was the 40 database calls that were made.

Entity Framework and lazy loading make it very easy as a developer to build a chatty application against the database. We have an ERP web app that we built as well, running on Azure. We had to go through each of the slow server requests and run them with SQL Profiler to find all the code resulting in additional DB calls. In general we did more eager loading and stored results in variables in order to reduce DB calls in our methods. The results were quite impressive. In some cases, poor coding had things like a foreach loop over an IQueryable, resulting in each entity being pulled individually from the DB, and then, due to lazy loading, each of those requests spawning multiple additional requests to get related table data. The result would be hundreds of DB calls that slowed the app to a crawl.

When I get some time, I am planning to try this optimization on the nop 3.8 category view methods to see if those 40 DB calls can be cut down to 3 to 5. I'm sure it would have a tremendous impact.

The conclusion of my test was that you can run 20 concurrent users on a single machine with a maximum response time of 2 sec with 2 CPUs and 8 GB RAM. I tested this with 10 servers and got 200 concurrent users.


One thing to point out is that 20 concurrent users means you have 20 HTTP requests on the server at once. The actual number of users surfing your website (reading product details, checking images, etc.) is usually 4 times this number, so you might have 80 users visiting your website.


A few things to consider:
- SQL Azure slows things down because all SQL data has to travel over the network
- Redis will also slow down the website, for the same reason
- Using Blob storage for files does not seem to affect the speed

Conclusion:
The best setup for nopCommerce is: local SQL server and local memory cache.

If you MUST handle more users than your server can hold, then you have to split nopCommerce across more than one server, and you should use Redis and a central SQL server.


In my case the actual project was not worth it, my client moved to another platform because of the cost of servers.


There are 40 DB calls for ONE category view?
7 years ago
Yes! You can see it in the Azure Portal if you have Application Insights installed for your web app. Click App Insights; on the resulting blade, click one of the rows under the Request Performance section (I click the first, which is all requests in the 3-7 second range). In the resulting blade, click a specific request, and finally you'll see a list of all the remote dependencies called, in timeline format.

There are about 40 in a category view of mine. Most of the DB calls are short (15 ms +/-), but the query in question, the one that does the insert into #FilterableSpecs, takes about 1.1 seconds by itself. The sum total is 2.3 seconds for all DB calls in this single request.

I also wrote this query for App Insights Analytics, which groups and sums all dependency requests for an overall server request. I'm still doing some double checks to confirm it is reporting correctly, but it shows some strange ones, like a simple product detail view that made 202 database calls!


// Join slow server requests (3-7 sec bucket) to their SQL dependency calls,
// counting the calls and summing their duration per request.
(requests
| where itemType == 'request' and timestamp >= datetime(2016-12-13T17:33:16.000Z) and performanceBucket == '3sec-7sec'
| project reqTimestamp=timestamp, requestName=name, reqId=id, url, reqDuration=duration, operation_Id
)
| join
(dependencies
| where type == 'SQL'
| summarize nbrDepCalls=count(id), sumDepDuration=sum(duration) by operation_Id, depName=name
) on operation_Id
| project reqId, reqTimestamp, url, reqDuration, depName, nbrDepCalls, sumDepDuration
| order by reqTimestamp
7 years ago
chadwixk wrote:
Yes! You can see it in the Azure Portal if you have Application Insights installed for your web app. Click App Insights; on the resulting blade, click one of the rows under the Request Performance section (I click the first, which is all requests in the 3-7 second range). In the resulting blade, click a specific request, and finally you'll see a list of all the remote dependencies called, in timeline format.

There are about 40 in a category view of mine. Most of the DB calls are short (15 ms +/-), but the query in question, the one that does the insert into #FilterableSpecs, takes about 1.1 seconds by itself. The sum total is 2.3 seconds for all DB calls in this single request.

I also wrote this query for App Insights Analytics, which groups and sums all dependency requests for an overall server request. I'm still doing some double checks to confirm it is reporting correctly, but it shows some strange ones, like a simple product detail view that made 202 database calls!


// Join slow server requests (3-7 sec bucket) to their SQL dependency calls,
// counting the calls and summing their duration per request.
(requests
| where itemType == 'request' and timestamp >= datetime(2016-12-13T17:33:16.000Z) and performanceBucket == '3sec-7sec'
| project reqTimestamp=timestamp, requestName=name, reqId=id, url, reqDuration=duration, operation_Id
)
| join
(dependencies
| where type == 'SQL'
| summarize nbrDepCalls=count(id), sumDepDuration=sum(duration) by operation_Id, depName=name
) on operation_Id
| project reqId, reqTimestamp, url, reqDuration, depName, nbrDepCalls, sumDepDuration
| order by reqTimestamp


That is quite a lot! Do you have a list of the requests made? Do we know what #FilterableSpecs does? Is it only there to populate the filter options?
7 years ago
Unfortunately the Azure reporting integration between Application Insights and Query Insights doesn't let you see the actual database commands issued for each DB call; it only knows that a DB call was issued and its duration.

I don't know exactly what that stored procedure is doing at this point. I haven't had a chance to dig into it that deeply as of yet, but hope to soon.

lintho wrote:

That is quite a lot! Do you have a list of the requests made? Do we know what #FilterableSpecs does? Is it only there to populate the filter options?
7 years ago
We have developed several large websites on the Azure platform. The key to success in a cloud environment is to minimize round-trips over the network; higher-performance infrastructure cannot fix the root cause.

The critical technique for performance in a Web/SQL Azure application is multi-recordset calls to stored procedures in SQL Server, so that there is one round-trip from the web server to the SQL Server (each trip is 100 ms minimum in Azure) per page request. That single stored procedure returns every dataset needed for the page view in one call; we then typecast and read the result sets from the returned data. This alone is the difference between a 15 second page view time and 250 ms. Applying the same methodology to Redis and Blob/file systems, you will see repeatable and highly scalable response times on the order of <500 ms.

We have 'tweaked' nopCommerce this way to achieve performance on Azure with a single server of 1,000 requests per second at an average response time of 600 ms, or consistently <300 ms at 50 requests per second (S2 App Service / DS11 SQL Web VM). This change, as well as async calls to everything, should be the highest-priority development task for nopCommerce. Then it would be world-class and really showcase Microsoft technologies.
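The multi-recordset pattern described above can be sketched with ADO.NET's SqlDataReader. The stored procedure name, parameter, and column names here are hypothetical; the point is that one ExecuteReader call returns several result sets read back with NextResult():

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// Sketch: one stored procedure returns every dataset a category page
// needs, so the page costs a single network round-trip instead of ~40.
public static class CategoryPageLoader
{
    public static void Load(string connectionString, int categoryId)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.CategoryPageData", conn)) // hypothetical proc
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@CategoryId", categoryId);
            conn.Open();

            using (var reader = cmd.ExecuteReader())
            {
                // Result set 1: the products in the category.
                var products = new List<string>();
                while (reader.Read())
                    products.Add(reader.GetString(reader.GetOrdinal("Name")));

                // Result set 2: filterable specification options.
                reader.NextResult();
                var specs = new List<string>();
                while (reader.Read())
                    specs.Add(reader.GetString(0));

                // Result set 3, 4, ...: breadcrumb, price ranges, and so on,
                // each read the same way after another NextResult() call.
                reader.NextResult();
            }
        }
    }
}
```

Everything the page needs travels back in one response, so the per-trip latency of Azure is paid once rather than once per query.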
This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.