Performance testing nopCommerce 3.70 on Azure using Redis + Blob + Azure SQL

7 years ago
Maybe it is wise to consider Cloudflare as a front-end cache.
7 years ago
rkotulan wrote:
Any updates on this? I've just moved our nopCommerce website and database to Azure Web App, Azure SQL and blob storage. And performance is extremely poor, had to go into upper Premium tier for the database (700 USD/mo) to get acceptable load times around 1,5 sec, today on a dedicated server with IIS and SQL server on the same load time is less than 1s.

Hi mgustafsson,

Why did you decide to move to Azure? How many products do you have in your store? On which page do you measure load time?

I recently ran a load test on our shop with 150,000 products.
Test environment:
dedicated server, Xeon E3-1230 V5 3.6 GHz
16 GB RAM, SSD

I used https://loader.io and measured access to a category page and one product page.

I got a 1.7 s average load time at 1 user per second.

Please share your experience and measurement methodology.

Best regards,
Rudolf


Basically, for ease of DevOps: we are about to outgrow our current dedicated server and would like to be able to scale out and up easily when needed. Azure's continuous delivery also seems like a nice fit for us.

We have 50K products and around 350 categories. So far I've only looked at the load time of the home page in Chrome DevTools; at first the results were very poor. However, after creating nonclustered indexes on the GenericAttribute (EntityId, KeyGroup) and Customer (CustomerGuid) tables, the pressure on the database seemed to decrease dramatically.
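For anyone wanting to try the same fix, the indexes described above might look something like this in T-SQL (the index names are my own; verify the table and column names against your own nopCommerce 3.7 database before running, and test on a copy first):

```sql
-- Speeds up the per-request lookups nopCommerce makes on these tables
CREATE NONCLUSTERED INDEX IX_GenericAttribute_EntityId_KeyGroup
    ON [GenericAttribute] ([EntityId], [KeyGroup]);

CREATE NONCLUSTERED INDEX IX_Customer_CustomerGuid
    ON [Customer] ([CustomerGuid]);
```

On Azure SQL you can check whether the indexes are actually being used afterwards via the query store or the missing-index DMVs.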

I'm now on Azure's S0 Standard (10 DTUs) price plan and the site seems to load as fast as on the dedicated server. Before creating the indexes I had to be on the P2 Premium (250 DTUs) plan in order to get decent load times.

I have also set up a SQL Server VM in Azure, as Greg Smyth suggested, and am going to run some stress tests later this week to determine the best setup for us. I'll get back when I have those results.

(Edit: We are on nopCommerce 3.7)
7 years ago
We have a couple of nop sites running on Azure, and one thing that becomes a significant issue is storing pictures in the database. Before a page is cached, nop will pull a page of picture records from the database, including the binary stored in the field. This can send your DTUs through the roof and slow the site to a grinding pace (especially if a crawler visits the site just after a deployment).

Now obviously you can change the media settings to store your pictures on the web server; however, this is bad form for Azure, which has blob storage for exactly this task, and if you scale out with another node, or lose the one you have, you're going to be in big trouble. So I have created a partial class to extend AzurePictureService.cs so that when you say 'store pictures on file system' it actually treats this as blob storage.

Another point to note is that images added to blob storage do not have a cache policy set, which Google will downgrade you for. If this code is suitable, maybe it can go into the next version of nop? Another useful feature would be support for a CDN, which requires pictures to be served from a different endpoint.

namespace Nop.Services.Media
{
    public partial class AzurePictureService : PictureService
    {
        // PD: Overrides so nop will treat 'file system' as blob storage for original pictures.
        // These would normally live in "~/content/images/"; we'll use the same container as the
        // thumbnails (container_thumb is the container field declared in the main partial class).

        protected override void SavePictureInFile(int pictureId, byte[] pictureBinary, string mimeType)
        {
            var lastPart = GetFileExtensionFromMimeType(mimeType);
            var fileName = $"{pictureId.ToString("0000000")}_0.{lastPart}";
            var blockBlob = container_thumb.GetBlockBlobReference(fileName);
            blockBlob.UploadFromByteArray(pictureBinary, 0, pictureBinary.Length);

            // custom code to set cache duration and keep google happy
            const string cacheControl = "max-age=3600, must-revalidate";
            blockBlob.Properties.CacheControl = cacheControl;
            blockBlob.SetProperties();
        }

        protected override void DeletePictureOnFileSystem(Picture picture)
        {
            var lastPart = GetFileExtensionFromMimeType(picture.MimeType);
            var fileName = $"{picture.Id.ToString("0000000")}_0.{lastPart}";
            var blockBlob = container_thumb.GetBlockBlobReference(fileName);
            if (blockBlob.Exists()) blockBlob.Delete();
        }

        protected override byte[] LoadPictureFromFile(int pictureId, string mimeType)
        {
            var lastPart = GetFileExtensionFromMimeType(mimeType);
            var fileName = $"{pictureId.ToString("0000000")}_0.{lastPart}";
            var blockBlob = container_thumb.GetBlockBlobReference(fileName);

            if (!blockBlob.Exists()) return new byte[0];

            // Exists() fetches the blob's attributes, so Properties.Length now holds the
            // actual blob size. (StreamWriteSizeInBytes is the write block size, not the
            // blob length, so using it here would allocate a wrongly sized buffer.)
            var byteData = new byte[blockBlob.Properties.Length];
            blockBlob.DownloadToByteArray(byteData, 0);
            return byteData;
        }
    }
}
7 years ago
We are experiencing the same issues for all ten of our webstores while doing load testing.

I have now contacted Microsoft and the nopCommerce team (we have premium support for both), asking what the solution could be. Forcing us onto a local SQL instance isn't really a solution, and since that is the current workaround, this seems to be a problem with the current architecture. That matters, considering everyone wants the cheapest possible solution for their hosting.

I wish the nopCommerce team could work together with Microsoft on this particular issue. I don't know if the nopCommerce team is already communicating with Microsoft, but I have asked in my support ticket if that would be possible. I am certain Microsoft would cooperate, since they want everyone to use Azure, and right now this is an issue for all .NET users who want to have a great webstore (which nopCommerce is!).
7 years ago
If anyone is interested in the results... I deployed the AzurePictureService code to our live site and hit 'transfer pictures' on the media settings screen. Six hours later, all images are now in blob storage. However, server response time will still not be as fast as storing images on the web server. I noticed that the method to move pictures actually flags them all as 'new', which has the undesirable effect of deleting and regenerating all thumbnails the next time a page is requested on the front end. It also seems to leave them flagged as new, so this then happens on every request. I manually set IsNew to false for all pictures in SQL.
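The manual fix mentioned above would be a one-line update along these lines (assuming the standard nopCommerce Picture table; check your schema and take a backup before running it on a live database):

```sql
-- Clear the 'new' flag so nop stops deleting and regenerating thumbnails on every request
UPDATE [Picture] SET [IsNew] = 0 WHERE [IsNew] = 1;
```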

Another important thing is to hack the main PictureService.GetPictureUrl method. You will see that this class always loads the original picture binary into memory from the database or file system, regardless of whether it is actually needed (primarily to check whether the binary is null and serve up the default picture). I've moved the call to LoadPictureBinary() so it is only made if the picture is new or the thumbnail doesn't exist.
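A rough sketch of the change described above, assuming nop 3.7's GetPictureUrl flow (the helper names GeneratedThumbExists and GetThumbUrl below are illustrative placeholders, not the exact 3.7 source):

```csharp
// Inside GetPictureUrl: defer loading the original binary until we know
// the thumbnail actually needs (re)generating.
string thumbFileName = $"{picture.Id:0000000}_{targetSize}.{lastPart}";
if (picture.IsNew || !GeneratedThumbExists(thumbFileName))
{
    // Only now hit the database / blob storage for the full-size binary
    byte[] pictureBinary = LoadPictureBinary(picture);
    if (pictureBinary == null || pictureBinary.Length == 0)
        return GetDefaultPictureUrl(targetSize);

    // ...resize the binary and save the thumbnail as before...
}
// For the common case (thumbnail already exists) we never touch the original
return GetThumbUrl(thumbFileName);
```

The point of the change is that on a warm site the common path becomes a cheap existence check, rather than pulling a multi-megabyte original out of the database for every picture on every uncached page.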

Time to first byte for a cached page is now just under 2 seconds. All the heavy lifting for pictures has been taken off the database and web server, but I do feel this is still pretty slow for an expensive hosting package (£150 per month). Nop 3.7, SQL Azure S3, Web App S2.

Going to try upgrading to 3.8 next month and see if that yields better results.
7 years ago
How is your code different from what Nop already does to use Azure blob storage for pictures?

pdunleavy75 wrote:
If anyone is interested in the results... I deployed the azurepictureservice code to our live site and hit 'transfer pictures' in the media settings screen. 6 hours later ! all images are now in blob storage. However, server response time will still not be as fast as storing images on the web server. I noticed that the method to move pictures actually flags them all as 'new', this has the undesirable effect of deleting and regenerating all thumbnails the next time you request a page on the front end. It also seems to leave them flagged as new so this now happens every time. I manually set isnew for all pictures to false in SQL.

Another important thing is to hack the main pictureservice.getpictureurl method. You will see that this class always loads the original picture binary into memory from database or file system regardless of whether it is actually needed (primarily to check if the binary is null and serve up the default picture). I've moved the call to 'LoadPictureBinary()' so it is only done if the picture is new or the thumbnail doesn't exist.

Time to the first byte for a cached page is now just under 2 seconds. All the heavy lifting for pictures has now been taken off the database and web server but I do feel this is still pretty slow for an expensive hosting package (£150 per month). Nop 3.7, SQL Azure S3, Web App S2.

Going to try upgrading to 3.8 next month and see if that yields better results.
7 years ago
chadwixk wrote:
How is your code different from what Nop already does to use Azure blob storage for pictures?



I think he commented on that further up.

He is on 3.7. Maybe 3.7 didn't have Azure blob storage for pictures.
7 years ago
Nop 3.7 and 3.8 both support storing picture thumbnails in blob storage, but not the original picture you uploaded, which could be a few megabytes. The originals are all pulled out of the database or from the file system every time you access a product listing page that isn't cached, which is a performance killer on Azure; hence the code and tips.
7 years ago
Thanks for explaining the reasoning behind this change, pdunleavy75. I am trying to validate it on our side, but I see the original image being served from blob storage as well... maybe I am not looking at the right thing? In the pic below you can see where I am looking in Chrome Dev Tools to see where the original image is served from, which shows it is from blob storage in this case. Is this the image you are talking about that is being loaded from the file system? And even if it were, I am not understanding how that could be a large load on the DB server (assuming you store images in the DB), as it would only be for one image on a product detail page. The additional product images (besides the main image, i.e. image 2, 3, 4, etc.) do not pull from the server until you actually click on their thumbnails, so it's not as if multiple original images are pulled at once at page load, which might cause a performance hit if you had a lot of images.

You can see the thumbnails for that main product image, which have the thumb size as a suffix to the file name, i.e. _100, _600, which do also show being served from blob storage and not file system.

Note I'm not trying to be critical, just trying to understand so we can evaluate if it would be beneficial to implement this change in our code as well.

pdunleavy75 wrote:
Nop 3.7 and 3.8 both support storing picture thumbnails in blob storage but not the original picture you uploaded. Could be a few megabytes. These are all pulled out of the database or from the file system every time you access a product listing page that isn't cached. Which is a performance killer on Azure, hence the code and tips.
7 years ago
I didn't make any changes after applying Azure blob storage, except clearing the cache inside the admin area.
This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.