I've been looking into Lucene.net, Memcached and System.Runtime.Caching.
I think the point we have all agreed on is that our UI should talk to the caching layer (where possible).
Although I think Lucene should definitely be considered for the search index of nopCommerce, I don't think it is the best solution for caching. In most cases we just want to cache an object and retrieve it again with minimal ceremony. Lucene adds extra work: we have to index our content first, and when retrieving an object we have to map the hit documents back into our business objects (I'm assuming this is how you have done it in FastNop - either that or you have changed the UI to use the Lucene documents instead?).
For simple object caching I think we should start with the System.Runtime.Caching MemoryCache. The nice thing about this approach is that we can easily scale up to use Memcached (examples to follow next week).
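For reference, the ICacheProvider abstraction I have in mind is roughly the following (the names are illustrative; the MemoryCacheProvider is just a thin wrapper over System.Runtime.Caching.MemoryCache):

```csharp
using System;
using System.Runtime.Caching;

// Minimal cache abstraction; a Memcached-backed provider could implement
// the same interface later without touching the service layer.
public interface ICacheProvider
{
    bool IsSet(string key);
    T Get<T>(string key);
    void Set(string key, object value, int cacheTimeMinutes);
}

// Wraps the default System.Runtime.Caching.MemoryCache instance
public class MemoryCacheProvider : ICacheProvider
{
    private readonly ObjectCache _cache = MemoryCache.Default;

    public bool IsSet(string key)
    {
        return _cache.Contains(key);
    }

    public T Get<T>(string key)
    {
        // Note: throws if the key is missing and T is a value type;
        // callers are expected to check IsSet first
        return (T)_cache[key];
    }

    public void Set(string key, object value, int cacheTimeMinutes)
    {
        var policy = new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(cacheTimeMinutes)
        };
        _cache.Set(key, value, policy);
    }
}
```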
In my tests all I had to do was create a simple service layer on top of our existing business layer. For each entity type we cache a dictionary, keyed by ID, containing all entities of that type from the database:
public class CategoryService
{
    private readonly ICacheProvider _cache;
    private const string CATEGORY_CACHE_KEY = "categories";

    public CategoryService()
    {
        _cache = new MemoryCacheProvider();
    }

    public IList<Category> Categories
    {
        get
        {
            if (_cache.IsSet(CATEGORY_CACHE_KEY))
                return _cache.Get<IDictionary<int, Category>>(CATEGORY_CACHE_KEY).Values.ToList();

            var db = ObjectContextHelper.CurrentObjectContext;
            var categories = db.Categories;
            // No change tracking needed for read-only, cached entities
            categories.MergeOption = System.Data.Objects.MergeOption.NoTracking;

            // Materialise the query once, cache for 30 minutes, then return
            var dictionary = categories.ToDictionary(x => x.CategoryId);
            _cache.Set(CATEGORY_CACHE_KEY, dictionary, 30);
            return dictionary.Values.ToList();
        }
    }

    public Category GetCategoryById(int categoryId)
    {
        return this.Categories.SingleOrDefault(x => x.CategoryId == categoryId);
    }

    public List<Category> GetBreadCrumb(int categoryId)
    {
        var breadCrumb = new List<Category>();
        var category = GetCategoryById(categoryId);
        while (category != null && !category.Deleted && category.Published)
        {
            breadCrumb.Add(category);
            category = GetCategoryById(category.ParentCategoryId);
        }
        breadCrumb.Reverse();
        return breadCrumb;
    }

    public List<Category> GetCategoriesByParentCategory(int parentCategoryId)
    {
        return this.Categories.Where(x => x.ParentCategoryId == parentCategoryId).ToList();
    }

    public List<Category> GetHomePageCategories()
    {
        return this.Categories.Where(x => x.ShowOnHomePage).ToList();
    }
}
As you can see, we will probably need to move many of the methods currently located in the manager classes into this service layer. Effectively we would be simplifying our Manager (repository) classes - which is how it should be.
Then it is just a case of updating our UI to use our services instead of the existing Manager classes.
When we update an entity we can update the cached item too (using the same cache key constant as the service):

if (_cache.IsSet(CATEGORY_CACHE_KEY))
{
    var categories = _cache.Get<IDictionary<int, Category>>(CATEGORY_CACHE_KEY);
    categories[category.CategoryId] = category;
}
This gives us a caching layer that is easy to implement (the category implementation took about 30 minutes) and scalable (swapping in Memcached later is straightforward).
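To give a rough flavour of the scaling path ahead of next week's examples, a Memcached-backed provider implementing the same ICacheProvider interface could look something like this (a sketch assuming the Enyim.Caching client and a memcached node configured in app.config; note that cached entities would need to be serializable):

```csharp
using System;
using Enyim.Caching;
using Enyim.Caching.Memcached;

// Memcached-backed implementation of the same ICacheProvider interface.
// The service layer doesn't change - only the provider it is constructed with.
public class MemcachedCacheProvider : ICacheProvider
{
    private readonly MemcachedClient _client = new MemcachedClient();

    public bool IsSet(string key)
    {
        object value;
        return _client.TryGet(key, out value);
    }

    public T Get<T>(string key)
    {
        return _client.Get<T>(key);
    }

    public void Set(string key, object value, int cacheTimeMinutes)
    {
        // StoreMode.Set overwrites any existing value for the key
        _client.Store(StoreMode.Set, key, value, TimeSpan.FromMinutes(cacheTimeMinutes));
    }
}
```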
Since we will only use the cache for reading data on the storefront (we can continue to use the HttpContext.Items collection in admin), this will avoid the problems we experienced previously.