nopCommerce 1.7 cache not working?! Possible solution to 1.7 slowness?

13 years ago
I actually really like the LUCENE.net idea for a variety of things... very cool
13 years ago
thought wrote:


I'll see if I can get something working over the weekend.

The other nice thing about using lucene is ease of facet searching and creating facets from your catalogue


Have just started running some tests myself.

Shame someone hasn't done an EF port of NHibernate.Search. Would save some work.
13 years ago
With thanks to a friend of mine I have uploaded a lucene based project:

http://fastnop.adept.thought.co.uk/category/30-computers.aspx

Currently ONLY the categories are working from the index. As you can see, though, price range filters are created from the index, as are facets, regardless of category depth.

Next will be manufacturers and search pages.

We have created a project on Codeplex called 'FastNop'. It's not published yet, but we will publish it once we have got a bit further along.
13 years ago
"The SQL Server Service Broker for the current database is not enabled"

Really eager to see this one.  I wasn't really that overjoyed with rolling back.

Jared Nielsen
13 years ago
FUZION wrote:
"The SQL Server Service Broker for the current database is not enabled"

Really eager to see this one.  I wasn't really that overjoyed with rolling back.

Jared Nielsen

Try refreshing a couple of times
13 years ago
I've been looking into Lucene.net, Memcached and System.Runtime.Caching.

I think the point we have all agreed on is that our UI should talk to the caching layer (where possible).

Although I think Lucene should definitely be considered for the search index of nopCommerce, I don't think it is the best solution for caching. In most cases we just want to easily cache an object and retrieve it from the cache. Lucene adds additional work since we need to first index our content and, when retrieving our object, cast the hit documents back into our business objects (I'm assuming this is how you have done it in FastNop - either that or you have changed the UI to use the Lucene documents instead?).

For simple object caching I think we should start with the System.Runtime.Caching MemoryCache. The nice thing about this approach is that we can easily scale up to use Memcached (examples to follow next week).

In my tests all I had to do was create a simple service layer on top of our existing business layer. For each entity we store an object dictionary containing all entities from our database:


    public class CategoryService
    {
        ICacheProvider _cache;
        private const string CATEGORY_CACHE_KEY = "categories";
        
        public CategoryService()
        {
            _cache = new MemoryCacheProvider();
        }

        public IList<Category> Categories
        {
            get
            {
                if (_cache.IsSet(CATEGORY_CACHE_KEY))
                    return _cache.Get<IDictionary<int, Category>>(CATEGORY_CACHE_KEY).Values.ToList();

                var db = ObjectContextHelper.CurrentObjectContext;
                var query = db.Categories;
                query.MergeOption = System.Data.Objects.MergeOption.NoTracking;

                // Materialize once so the query only hits the database a single time
                var categories = query.ToList();
                _cache.Set(CATEGORY_CACHE_KEY, categories.ToDictionary(x => x.CategoryId), 30);

                return categories;
            }
        }

        public Category GetCategoryById(int categoryId) {
            return this.Categories.SingleOrDefault(x => x.CategoryId == categoryId);
        }

        public List<Category> GetBreadCrumb(int categoryId) {
            var breadCrumb = new List<Category>();
            var category = GetCategoryById(categoryId);
            while (category != null && !category.Deleted && category.Published)
            {
                breadCrumb.Add(category);
                category = GetCategoryById(category.ParentCategoryId);
            }
            breadCrumb.Reverse();
            return breadCrumb;
        }

        public List<Category> GetCategoriesByParentCategory(int parentCategoryId) {
            return this.Categories.Where(x => x.ParentCategoryId == parentCategoryId).ToList();
        }

        public List<Category> GetHomePageCategories() {
            return this.Categories.Where(x => x.ShowOnHomePage).ToList();
        }
    }


As you can see, we will probably need to move many of the methods currently located in the manager classes. Effectively we would be simplifying our Manager (repository) classes - which is how it should be.

Then it is just a case of updating our UI to use our services instead of the existing Manager classes.
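For reference, the ICacheProvider and MemoryCacheProvider types used above aren't shown in the post. A minimal sketch, assuming the Set duration is in minutes and using the System.Runtime.Caching MemoryCache, might look like this:

```csharp
using System;
using System.Runtime.Caching;

// Hypothetical cache abstraction matching the calls used in CategoryService
public interface ICacheProvider
{
    bool IsSet(string key);
    T Get<T>(string key);
    void Set(string key, object value, int cacheTimeMinutes);
}

// In-process implementation backed by System.Runtime.Caching.MemoryCache
public class MemoryCacheProvider : ICacheProvider
{
    private readonly ObjectCache _cache = MemoryCache.Default;

    public bool IsSet(string key)
    {
        return _cache.Contains(key);
    }

    public T Get<T>(string key)
    {
        return (T)_cache[key];
    }

    public void Set(string key, object value, int cacheTimeMinutes)
    {
        // Evict the entry after the given number of minutes (assumed unit)
        var policy = new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.UtcNow.AddMinutes(cacheTimeMinutes)
        };
        _cache.Set(key, value, policy);
    }
}
```

Scaling up to Memcached would then just mean writing another ICacheProvider implementation and swapping it in.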

When we update an entity we can update the cached item too:


            if (cache.IsSet("categories")) {
                var categories = cache.Get<IDictionary<int, Category>>("categories");
                categories[category.CategoryId] = category;
            }


This will provide a caching layer that is easy to implement (the category implementation took about 30 minutes) and scalable (easy to implement Memcached).

Since we will only use the cache for reading data on the storefront (we can continue to use the HttpContext.Items collection in admin) this will avoid the problems we experienced previously.
13 years ago
nopCommerce team | retroviz wrote:
I've been looking into Lucene.net, Memcached and System.Runtime.Caching.

I think the point we have all agreed on is that our UI should talk to the caching layer (where possible).


Although I think Lucene should definitely be considered for the search index of nopCommerce, I don't think it is the best solution for caching. In most cases we just want to easily cache an object and retrieve it from the cache. Lucene adds additional work since we need to first index our content and, when retrieving our object, cast the hit documents back into our business objects (I'm assuming this is how you have done it in FastNop - either that or you have changed the UI to use the Lucene documents instead?).




Lucene creates an index which is then queried to retrieve documents; these documents are then cast back as Products/Categories etc.

As you rightly say, we all agree the UI should be talking to a cached object rather than the DB - this is essential for load speed. However, the reason I think Lucene is a better solution is that it not only increases speed but also adds functionality. Lucene is a search engine, so you effectively turn categories into a faceted search.

The way the Lucene helper has been created in FastNop (this work is thanks to another guy, who will be named on the FastNop site once it is published) is to create events on ProductUpdate, ProductDeleted etc. On these events the Lucene helper creates a queue of documents that need to be updated in the index. This effectively means the cache is never out of date.
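The event-driven queue described above might be sketched like this (names are hypothetical - the actual FastNop helper isn't shown; the reindex delegate stands in for rebuilding a Lucene document for one entity):

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;

// Hypothetical sketch of the update-queue pattern: entity events enqueue ids,
// and a background worker drains the queue into the search index.
public class IndexUpdateQueue
{
    private readonly ConcurrentQueue<int> _pending = new ConcurrentQueue<int>();
    private readonly Action<int> _reindex; // e.g. rebuilds the Lucene document for an id

    public IndexUpdateQueue(Action<int> reindex)
    {
        _reindex = reindex;
    }

    // Called from ProductUpdate / ProductDeleted event handlers
    public void Enqueue(int productId)
    {
        _pending.Enqueue(productId);
    }

    // Called periodically by a background worker; returns the number processed
    public int Flush()
    {
        int processed = 0;
        int id;
        while (_pending.TryDequeue(out id))
        {
            _reindex(id);
            processed++;
        }
        return processed;
    }
}
```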

I still think there should be a caching layer even for admin system. I still think the example you gave should be implemented.

Slowness is not acceptable in today's world; everyone is too impatient, including me! :)
13 years ago
I agree, Lucene adds additional functionality (it is a search engine after all) but I just don't think it's right for the entire caching layer as it adds too much overhead to create and maintain the index, especially for caching entire objects.

I definitely think Lucene should be used for search and can be used nicely alongside the cache. We can index the things we want to search on like product descriptors, then hit the cache to retrieve the full object if necessary - similar to how NHibernate.Search works with second level caching.
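As a rough illustration of that pattern (all types here are hypothetical stand-ins, not Lucene.Net or NHibernate.Search APIs): the index holds only ids plus searchable text, and full objects are hydrated from the cache by id:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative sketch: search the "index" for ids, then fetch full objects
// from the "cache". Both collections are simple in-memory stand-ins.
public class SearchThenCache
{
    // Stand-in for a search index: searchable text mapped to product ids
    private readonly List<KeyValuePair<string, int>> _index = new List<KeyValuePair<string, int>>();
    // Stand-in for the object cache, keyed by product id
    private readonly Dictionary<int, string> _cache = new Dictionary<int, string>();

    public void Add(int id, string searchableText, string fullObject)
    {
        _index.Add(new KeyValuePair<string, int>(searchableText.ToLowerInvariant(), id));
        _cache[id] = fullObject;
    }

    // Step 1: query the index for matching ids.
    // Step 2: hydrate full objects from the cache only when needed.
    public List<string> Search(string term)
    {
        var ids = _index
            .Where(e => e.Key.Contains(term.ToLowerInvariant()))
            .Select(e => e.Value);
        return ids.Where(id => _cache.ContainsKey(id))
                  .Select(id => _cache[id])
                  .ToList();
    }
}
```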

Also, I wired up Memcached this afternoon and aside from the setup of Memcached itself, the implementation took about a minute - not bad for a fully distributed caching layer.
13 years ago
Since I switched to 1.8 my site is faster - that is a site upgraded from 1.5 to 1.6, then 1.7.

A new clean install of 1.8 is even faster, though.

To be honest, speed is about my only issue :)
13 years ago
nopCommerce team | a.m. wrote:
I'm working on more configurable caching in upcoming nopCommerce 1.90. If I start using static caching (like it was used in nopCommerce 1.60), then we get the error related to navigation properties and using distinct ObjectContexts:

To reproduce follow the next steps:
1. Load a shipping method (go to a shipping method details page). So now we have it cached.
2. Then go to shipping methods page (Admin Area > Configuration > Shipping > Shipping Methods) and try to modify several country restrictions.
3. You'll get the following error 'An object with the same key already exists in the ObjectStateManager. The ObjectStateManager cannot track multiple objects with the same key' because the shipping method that you try to restrict is already cached, but it belongs to another object context (object context created in one of previous request - step 1).

I really don't know how to resolve this issue. Any advice would be appreciated


I had the same issue, and I spent a lot of time testing and figuring out what was wrong.

I finally came to this solution: I modified every Get method with a ById name so that it detaches the selected object:

public static Product GetProductById(int productId)
{
    ...
    var context = ObjectContextHelper.CurrentObjectContext;
    var query = from p in context.Products
                where p.ProductId == productId
                select p;
    var product = query.SingleOrDefault();

    // Detach the entity from this ObjectContext so it can safely be cached.
    // Possible as long as we use Attach() before performing updates.
    if (product != null)
        context.Products.Detach(product);

    return product;
}
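For the update side hinted at by the Attach() comment, a sketch might look like the following (not runnable standalone - it assumes the EF4 ObjectContext and the same nopCommerce helpers; UpdateProduct is hypothetical):

```csharp
// Hypothetical counterpart: re-attach the detached entity before saving.
public static void UpdateProduct(Product product)
{
    var context = ObjectContextHelper.CurrentObjectContext;
    context.Products.Attach(product);
    // Attach() adds the entity as Unchanged, so mark it Modified explicitly
    context.ObjectStateManager.ChangeObjectState(product, System.Data.EntityState.Modified);
    context.SaveChanges();
}
```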


In my context it solves the problem, and I then cached the objects in the static cache.

Let me know if that solves the issue for you.
This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.