Distributed Caching

This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.
12 years ago
I've managed to get distributed caching working - both as an implementation of ICacheManager for the model objects and also using an implementation of OutputCacheProvider along with the DevTrends DonutCache attribute. I'm using a Memcached server to perform the caching.

It's running really nicely - even after restarting my local ASP.NET process, the site still starts up and retrieves all its content from the distributed cache, meaning the site is 'warmed up' even after a restart. Performance-wise, it seems much faster. Response time is improved and it's running a lot less code for each HTTP request, meaning CPU load is reduced.

This was actually pretty simple to get up and running and offers some real benefits - it would allow nopCommerce to scale, something I think it struggles with now.

In an environment with a network load balancer, a number of web servers + a number of caching servers etc., this would simply allow you to throw more power at the solution by introducing another web or caching server to the cluster.

I know nopCommerce isn't really geared to this end of the market yet, but it certainly could be if a few things happen in the source. It probably wouldn't apply for most people who are using the solution in a simple hosted environment, but I think if the foundations are laid then it has the possibility of becoming a credible enterprise solution.

Issue:

Memcached and other distributed caching providers serialize objects in order to send them over TCP. Memcached will currently throw an exception 'The type <Some.Nop.Type> is not marked as serializable.' when trying to cache objects - most Nop models and settings aren't marked as serializable - simple fix though..

* Work item: Mark all items (models \ settings) that could possibly be cached with the [System.Serializable] attribute.
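For example, adding the attribute to a settings class is a one-line change (ExampleSettings is a hypothetical class used for illustration - any Nop model or settings class that might be cached would get the same treatment):

```csharp
using System;

namespace Nop.Core.Domain.Example
{
    // [Serializable] lets Memcached (or any provider that serializes objects
    // to send them over the wire) store instances of this class.
    [Serializable]
    public class ExampleSettings
    {
        public bool Enabled { get; set; }
        public int CacheTimeMinutes { get; set; }
    }
}
```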


I'm happy to send what I've done to get distributed caching working - it's not much...a few new references + config changes + Serializable attributes!

Obviously, you'd need a Memcached server to see it in operation.

Thanks
12 years ago
Hi

We are embarking on this road shortly. We have heavily customised nop (1.9) for a client and the server is currently taking 14m hits per month (on the nop site alone). We're moving this to Rackspace Cloud with load balancing and caching server(s). If you could let me know what you've done, that would be fab.

thanks
Sean
12 years ago
It would be great if you could share your changes. Please find more info about contributing here
12 years ago
I'll create a fork & get involved!
12 years ago
I've created a changeset in my fork - most of the changes are just adding [Serializable] attributes to classes.

http://nopcommerce.codeplex.com/SourceControl/network/forks/morley/MorleyFork/changeset/changes/e1585a51f95f


I've included the DevTrends.MvcDonutCache NuGet package and have wired that into a few controller actions. I added a variety of profiles to outputCacheProfiles in web.config and added an override of GetVaryByCustomString to Global.asax.
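The Global.asax override might look something like this (a sketch, not the exact changeset code - the profile name "Language" and the working-language lookup are illustrative, using Nop's standard IWorkContext service):

```csharp
using System;
using System.Web;
using Nop.Core;
using Nop.Core.Infrastructure;

// In Global.asax.cs - ASP.NET calls this for any output-cached action that
// uses VaryByCustom; returning a different string yields a different cache entry.
public override string GetVaryByCustomString(HttpContext context, string custom)
{
    if (string.Equals(custom, "Language", StringComparison.OrdinalIgnoreCase))
    {
        // Key the cache entry on the customer's working language, so each
        // language gets its own cached output.
        var workContext = EngineContext.Current.Resolve<IWorkContext>();
        return "Language=" + workContext.WorkingLanguage.Id;
    }
    return base.GetVaryByCustomString(context, custom);
}
```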

Forgetting all the distributed stuff, this seems to have got partial output caching working based on customer language selection using the default AspNetInternalProvider.

I added this in a few places I thought wouldn't affect the site too much but could demonstrate how to use it. See CommonController for some obvious ones, and BlogController and TopicController for some multi-language examples.

For distributed caching, I've created a MemcachedOutputCacheProvider in Nop.Core.Caching. To get this up and running, you need to do the following:

1) Add the IP address of your Memcached server into the enyim.com web.config section where it's marked 'your-server-address'.

2) Under the caching section, uncomment the DistributedCacheProvider item.

3) On the outputCache attribute, change defaultProvider from AspNetInternalProvider to DistributedCacheProvider

4) Add a new setting to the settings table:


--new setting
IF NOT EXISTS (SELECT 1 FROM [Setting] WHERE [name] = N'distributedcachesettings.enabled')
BEGIN
  INSERT [Setting] ([Name], [Value])
  VALUES (N'distributedcachesettings.enabled', N'False')
END
GO
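Taken together, the web.config changes for steps 1-3 would look roughly like this (the section layout is an assumption based on the standard Enyim client and ASP.NET caching schemas; 11211 is Memcached's default port):

```xml
<!-- Step 1: point the Enyim client at your Memcached server -->
<enyim.com>
  <memcached>
    <servers>
      <add address="your-server-address" port="11211" />
    </servers>
  </memcached>
</enyim.com>

<!-- Steps 2 and 3: register the provider and make it the default -->
<caching>
  <outputCache defaultProvider="DistributedCacheProvider">
    <providers>
      <add name="DistributedCacheProvider"
           type="Nop.Core.Caching.MemcachedOutputCacheProvider, Nop.Core" />
    </providers>
  </outputCache>
</caching>
```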


This is just a proof of concept, so I'm interested to see what you think. At the least, I think the Donut Caching library offers a better caching mechanism - particularly when it comes to invalidating the output cache. Have a look in Nop.Web.Framework - I've included an attribute InvalidateOutputCacheAttribute that you can add to controller actions to cause an action to automatically invalidate the cached output of another action.
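A minimal sketch of how such an attribute could work - the exact implementation is in the fork; this version assumes the DevTrends OutputCacheManager API for evicting donut-cached entries:

```csharp
using System.Web.Mvc;
using DevTrends.MvcDonutCaching;

// Illustrative sketch: after the decorated action runs, evict the cached
// output of another controller/action via the DonutCache manager.
public class InvalidateOutputCacheAttribute : ActionFilterAttribute
{
    private readonly string _controller;
    private readonly string _action;

    public InvalidateOutputCacheAttribute(string controller, string action)
    {
        _controller = controller;
        _action = action;
    }

    public override void OnActionExecuted(ActionExecutedContext filterContext)
    {
        // RemoveItem evicts the cached entry for the named controller action
        new OutputCacheManager().RemoveItem(_controller, _action);
        base.OnActionExecuted(filterContext);
    }
}
```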

The output cache provider using Enyim.Memcached would certainly offer some cool distributed caching options...but you'd probably want all the config for that to happen in the admin console rather than in web.config.
12 years ago
Thanks
12 years ago
Final check-in on this subject to my fork:

http://nopcommerce.codeplex.com/SourceControl/network/forks/morley/MorleyFork/changeset/changes/b3fb2e80134a#

I've also added a VaryByCustom for CustomerId that should allow some customer-specific output caching for certain elements - such as home page products and the mini shopping cart.

On the shopping cart controller, I've included the [InvalidateOutputCache] attribute on a number of actions to demonstrate how actions can invalidate different cached items - e.g. RemoveCartItem can invalidate the MiniShoppingCart action.
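Usage on a cart action might look like this (a sketch - the parameter name and redirect target are illustrative, not copied from the changeset):

```csharp
// Removing an item changes the mini cart, so evict its cached output
// before the customer next requests it.
[InvalidateOutputCache("ShoppingCart", "MiniShoppingCart")]
public ActionResult RemoveCartItem(int shoppingCartItemId)
{
    // ... remove the item from the customer's cart, then redisplay it ...
    return RedirectToAction("Cart");
}
```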


In a high-volume site (i.e. thousands of customers), this would cause some heavy activity for customer-specific output caching. I guess you'd need to weigh the performance gains against the additional load?

In a low-volume site, I think it could offer some decent performance gains with very little trade-off.



Again - interested to see what people think. The combination of language- and customer-specific output caching should massively reduce the amount of work required for each HTTP request.


Cheers
7 years ago
Hi!

Any news about it?