Performance optimization needed - NOPCOMMERCE IS TOO SLOW!

10 years later
Hi,
I would like to raise again an old and big problem that has existed since nop v2: NOPCOMMERCE IS TOO SLOW!
I know that with a fresh installation and sample data the response times are acceptable, but that is not a realistic scenario. I have more products, with more variants, and many customers, orders and discounts.

.NET is a wonderful, high-performance platform running on powerful servers, ASP.NET MVC is lightweight compared to ASP, SQL Server is an amazing tool, and EF is...
But I'm currently getting very bad performance from nopCommerce. Each version announces performance improvements, but I can't see them. Today I migrated one of my websites to nop 3.3 hoping for an improvement, but it still takes from 0.8 to 5 seconds to load a page :(
I'm running on the server itself, so it's really the execution time.

My suggestion is that performance optimization should be the main goal of the next release. There are many ways to optimize the software, but all I can see in each version is more caching. That is one direction, but all the others are ignored.
For example:

- The worst offender in all of nop is the ProductLoadAllPaged stored procedure. In general, temp tables should be avoided where possible: because they are created in the tempdb database, they add overhead for SQL Server and slow overall performance. Can you imagine that on a category page containing 20 grouped products, ProductLoadAllPaged is called at least 21 times, and each call creates and drops a table in tempdb...
Proposition: create specific procedures for specific situations. For example, I created a procedure to load associated products; there is no need to create a temp table just to run a simple query, and I got an immediate and visible improvement! ProductLoadAllPaged should only be used for the search page.


- Use more stored procedures, for the public area only! EF is perfect for admin pages, where loading time is not critical. But for public pages, and particularly the catalog, each query should be optimized


- Lazy loading could be avoided in some situations. For my 20 grouped products, where each product has 4 associated products, I get 80 queries like: SELECT ... FROM [dbo].[Discount_AppliedToProducts] AS [Extent1] INNER JOIN [dbo].[Discount] AS [Extent2] ON [Extent1].[Discount_Id] = [Extent2].[Id]
Proposition: manually preload some data, or use the Include method in LINQ queries (see the first sketch after this list)

- The number of database requests should be reasonable and controlled; currently we can have 100 requests for just one page!

- Some processes are called twice or more in the code. Example: in GetDiscountAmount we call GetPreferredDiscount, and a few lines later we call GetFinalPrice. Both methods look for the applied discount.
Proposition: for this example, GetFinalPrice could expose a new parameter 'out Discount appliedDiscount', which is needed in many cases (see the second sketch after this list)


- More than 90% of the data read by EF will never be updated; it is just loaded to be displayed. Yet we never use the .AsNoTracking() feature, so thousands of change-tracking entries are instantiated, initialized, hold references to all the loaded objects, and then go to the garbage collector.
Proposition: add a boolean read-only parameter to each GetXXXObject method, and use .AsNoTracking() on the repository (see the third sketch after this list)
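
To make the lazy-loading point concrete, here is a first sketch using eager loading with Include. It assumes an EF6 DbContext-based model where Product exposes ParentGroupedProductId and an AppliedDiscounts navigation collection (as in the nop domain model); the loader class itself is purely illustrative, not existing nop code:

using System.Collections.Generic;
using System.Data.Entity;   // EF6: brings the lambda-based Include() extension
using System.Linq;

public class AssociatedProductLoader
{
    private readonly DbContext _context;

    public AssociatedProductLoader(DbContext context)
    {
        _context = context;
    }

    // One SQL statement loads the products together with their discounts,
    // instead of one extra lazy-loading query per product.
    public IList<Product> LoadWithDiscounts(int parentGroupedProductId)
    {
        return _context.Set<Product>()
            .Include(p => p.AppliedDiscounts)   // eager load the navigation property
            .Where(p => p.ParentGroupedProductId == parentGroupedProductId)
            .ToList();
    }
}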

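The second sketch shows a possible shape for the GetFinalPrice proposition. The parameter list below is hypothetical and only illustrates adding the out parameters; it is not the actual nop 3.3 signature:

// Hypothetical overload sketched against IPriceCalculationService, where GetFinalPrice lives.
// Existing overloads would delegate to this one, so the applied discount is resolved only once
// and callers such as GetDiscountAmount no longer have to look it up a second time.
decimal GetFinalPrice(Product product,
    Customer customer,
    decimal additionalCharge,
    bool includeDiscounts,
    int quantity,
    out decimal discountAmount,
    out Discount appliedDiscount);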

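The third sketch shows the AsNoTracking idea at the repository level. EfRepository here is a simplified stand-in for the nop repository, and TableNoTracking is an illustrative member name, not an existing API:

using System.Data.Entity;
using System.Linq;

// Simplified repository sketch; the real nop EfRepository<T> has more members.
public partial class EfRepository<T> where T : class
{
    private readonly DbContext _context;

    public EfRepository(DbContext context)
    {
        _context = context;
    }

    // Tracked entities, for admin/update scenarios.
    public virtual IQueryable<T> Table
    {
        get { return _context.Set<T>(); }
    }

    // Read-only entities: no change-tracking entries are created,
    // which saves memory and CPU on catalog pages.
    public virtual IQueryable<T> TableNoTracking
    {
        get { return _context.Set<T>().AsNoTracking(); }
    }
}
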
As you can see, I am focused on database access, because I know it well, and all these points have a real impact on performance.
But I think other efforts could be made:
  - in plugin management, to load/unload plugins without restarting the application
  - in dependency injection, to avoid so many first-chance exceptions at startup
  - etc.

I hope the community will contribute more ideas, and I hope the dev team will agree to work on this. Thank you to them for their patience.

Regards,
Nicolas
10 years later
+1
Your proposed improvements are very interesting.
10 years later
On the same subject, this article gives some more pointers concerning EF:

http://programmers.stackexchange.com/questions/117357/is-entity-framework-suitable-for-high-traffic-websites

Some of the ideas seem good:
- Very carefully review your data access with SQL Profiler and check that your LINQ queries correctly use LINQ to Entities instead of LINQ to Objects
- Very carefully use advanced EF optimization features like MergeOption.NoTracking
- Use ESQL in some cases
- Pre-compile queries that are executed often
- etc...
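
As a minimal sketch of the MergeOption.NoTracking point, assuming an EF6 DbContext-based model with a mapped Product entity (names are illustrative); this is the ObjectContext-level counterpart of AsNoTracking:

using System.Collections.Generic;
using System.Data.Entity;
using System.Data.Entity.Core.Objects;      // MergeOption, ObjectSet<T>
using System.Data.Entity.Infrastructure;    // IObjectContextAdapter
using System.Linq;

public static class NoTrackingQueries
{
    // Drops down to the underlying ObjectContext and disables identity-map merging,
    // so the loaded entities are never attached to the change tracker.
    public static List<Product> LoadAssociated(DbContext context, int parentGroupedProductId)
    {
        var objectContext = ((IObjectContextAdapter)context).ObjectContext;
        ObjectSet<Product> products = objectContext.CreateObjectSet<Product>();
        products.MergeOption = MergeOption.NoTracking;

        return products
            .Where(p => p.ParentGroupedProductId == parentGroupedProductId)
            .ToList();
    }
}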
10 years later
Your link is a must-read!
10 years later
nicolas.muniere wrote:
Hi,

For example:

- The worst offender in all of nop is the ProductLoadAllPaged stored procedure. In general, temp tables should be avoided where possible: because they are created in the tempdb database, they add overhead for SQL Server and slow overall performance. Can you imagine that on a category page containing 20 grouped products, ProductLoadAllPaged is called at least 21 times, and each call creates and drops a table in tempdb...
Proposition: create specific procedures for specific situations. For example, I created a procedure to load associated products; there is no need to create a temp table just to run a simple query, and I got an immediate and visible improvement! ProductLoadAllPaged should only be used for the search page.


Would you please elaborate on this? How are grouped products related to the calling of ProductLoadAllPaged, and how exactly do you end up with 21 calls to the procedure when it should be just one?

Thanks
10 years later
Nop-Templates.com wrote:
Hi,

Would you please elaborate on this? How are grouped products related to the calling of ProductLoadAllPaged, and how exactly do you end up with 21 calls to the procedure when it should be just one?

Thanks


In the Catalog controller, line 350, in PrepareProductOverviewModels, SearchProducts is called to load the associated products. Some people do not have many grouped products, so it has no impact on their performance. But if you have some... it's terrible.


I replaced this call by:
var associatedProducts = _productService.GetAssociatedProducts(
    storeId: _storeContext.CurrentStore.Id,
    parentGroupedProductId: product.Id);


Add this to IProductService:
IList<Product> GetAssociatedProducts(
    int storeId,
    int parentGroupedProductId);


Add this to ProductService:

public virtual IList<Product> GetAssociatedProducts(
    int storeId,
    int parentGroupedProductId)
{
    var pParentGroupedProductId = _dataProvider.GetParameter();
    pParentGroupedProductId.ParameterName = "ParentGroupedProductId";
    pParentGroupedProductId.Value = parentGroupedProductId;
    pParentGroupedProductId.DbType = DbType.Int32;

    var pStoreId = _dataProvider.GetParameter();
    pStoreId.ParameterName = "StoreId";
    pStoreId.Value = !_catalogSettings.IgnoreStoreLimitations ? storeId : 0;
    pStoreId.DbType = DbType.Int32;

    //invoke stored procedure
    var products = _dbContext.ExecuteStoredProcedureList<Product>(
        "ProductLoadAssociated",
        pStoreId,
        pParentGroupedProductId);

    //return products
    return new List<Product>(products);
}


Here is the stored procedure:

CREATE PROCEDURE [dbo].[ProductLoadAssociated]
(
  @StoreId                int = 0,
  @ParentGroupedProductId int
)
AS
BEGIN
  IF @StoreId > 0
  BEGIN
    SELECT
      p.*
    FROM
      Product p WITH (NOLOCK)
    WHERE
      p.VisibleIndividually = 0
      AND p.ParentGroupedProductId = @ParentGroupedProductId
      --filter by store
      AND (p.LimitedToStores = 0 OR EXISTS (
        SELECT 1 FROM [StoreMapping] sm WITH (NOLOCK)
        WHERE [sm].EntityId = p.Id AND [sm].EntityName = 'Product' AND [sm].StoreId = @StoreId
        ))
  END
  ELSE
  BEGIN
    SELECT
      p.*
    FROM
      Product p WITH (NOLOCK)
    WHERE
      p.VisibleIndividually = 0
      AND p.ParentGroupedProductId = @ParentGroupedProductId
  END
END
GO



It's VERY efficient.
10 years later
Using this trick AND removing a big part of my discounts, I now have acceptable performance... between 0.6 and 3 seconds

Milen, thanks for your theme; you can find an integration here (nop 3.3): http://www.3d-extensions-cheveux.com/

It takes more than 1.5 GB of memory and maxes out the processors. If you put this on Azure, it will cost $1 a page :)

Ohhh I'm joking ;)

Andrei, do you think there could be a global project to improve performance?
Perhaps you do not agree with my observations, but I would just like to know!

Thanks,
Nicolas
10 years later
nicolas.muniere wrote:
Andrei, do you think there could be a global project to improve performance?
Perhaps you do not agree with my observations, but I would just like to know!

Hi Nicolas,

Thanks. I absolutely agree with you.

Regarding this particular SP improvement: I would rather implement it with LINQ and use _storeMappingService and _aclService once the products are loaded (the results of these services are cached anyway). But the issue is that it misses a lot of other validations.

As you may know, each new version has some enhancements. I also have an internal list of performance improvements. All of them are quite minor (cache "something", refactor "something", simplify "something"). But the problem is that all of them together won't give more than a 10% performance improvement. It won't be significant. What can really help and solve the performance issue is entity caching between requests (not presentation-layer caching). It can be done in one of the following ways (a dilemma for me for the last several months):

1. Wait until EF starts supporting second-level caching (work item here). The easiest way to go. Not many changes would be needed. But when will it be supported? Nobody knows.

2. Move from EF to Dapper (work item here). This way performance could be much better than in the first case. The first issue is that it would require SIGNIFICANT source code refactoring. It could seriously affect existing developers. The second issue is that almost every ASP.NET developer is familiar with EF, and I presume that only 2-3% of them have worked with Dapper. Of course, we can talk about the learning curve and so on. But it could seriously affect product popularity.
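
To make the comparison concrete, here is a minimal sketch of what such a query could look like with Dapper, assuming the nop Product table and columns and relying on Dapper's by-name column-to-property mapping:

using System.Collections.Generic;
using System.Data.SqlClient;
using System.Linq;
using Dapper;

public static class DapperExample
{
    // One parameterized SQL statement, materialized straight into POCOs,
    // with no change tracker and no LINQ query translation step.
    public static IList<Product> LoadAssociatedProducts(string connectionString, int parentGroupedProductId)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            return connection.Query<Product>(
                "SELECT * FROM Product WHERE ParentGroupedProductId = @parentId AND VisibleIndividually = 0",
                new { parentId = parentGroupedProductId }).ToList();
        }
    }
}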

P.S. I consider NHibernate a dying dinosaur
10 years later
I am very relieved to know that you take this problem seriously, but I don't quite agree on how to resolve it.

When a process is unusually long, we can add as many caches as we want; it does not change the fact that the process is too long.

The solutions I propose address the root of the problem, in order to consume fewer resources from the outset.

The examples I give are very specific: unnecessary tracking of objects (6% additional processing according to MSDN), inappropriate lazy loading, duplicate calls, etc.

On the other hand, layered caches will become a real problem when nopCommerce moves to the cloud. They will raise the question of data refreshing, because multiple instances of the site will run in parallel.

SQL Server already has several caches (the execution plan cache, the data cache, etc.) and it is already very efficient. Every cache we add only consumes additional resources. Our slowness problem is directly related to how we access SQL Server:
- dynamic LINQ queries
- inappropriate lazy loading
- unused change tracking

I have proved this repeatedly, and the changes I made in this direction brought immediate and impressive results in my stores.

The adaptation work is not very big; it is just a matter of changing our strategy for reading data on front-office catalog pages, and all our performance problems will simply disappear. Especially as these solutions do not require the developer community to learn a new technology; they only use what EF already provides:
- avoid inappropriate lazy loading
- do not use the change tracker when it is not needed
- split ProductLoadAllPaged into multiple simple, specialized and very fast procedures
I don't want to kill EF, just to help it go faster sometimes.

It would not be difficult to maintain 10 more stored procedures to be absolutely sure of getting results.

And what about the multiple calls to the same code?

Thanks for reading this,

Nicolas
10 years later
nicolas.muniere wrote:
And what about the multiple calls to the same code?

Hi Nicolas,

Please clarify
This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.