Orchard 1.4 Performance

Topics: Core, Customizing Orchard, General, Troubleshooting
Apr 4, 2012 at 3:10 AM
Edited Apr 4, 2012 at 3:13 AM

Orchard Gurus,

We're working on a very large project involving Orchard 1.4, and performance is a key issue; there are obvious performance problems within Orchard. Running a simple 10-user load test for 20 minutes against the "Welcome to Orchard" page drives CPU to 100% on a 4-core box, and every page refresh generates several queries. This is not acceptable for moving forward into production. We've made sure we're running a stable version of Orchard under Full Trust, debugging is off, and the IIS app pool idle timeout and recycling settings are taken care of.

We tried the second-level NHibernate ASP.NET cache (Contrib.DbCache), and that didn't help much either: with the second-level cache in place we still see queries reaching SQL Server, and CPU shoots up to 90%+ under load. We also looked into page-level caching with Contrib.Cache, and it's not a viable option for us. Our site's pages don't change much structurally, but they display different data per request; user-related publishing is minimal.

We're evaluating multiple options for adding caching within Orchard to improve throughput, and we need input from the gurus. Our goal is to cache the entire site settings and widget content part/metadata so we don't even have to go to the database, or even to the second-level cache:

  1. Add caching at the IRepository implementation level to cache whatever we can
  2. Add caching in DefaultContentManager to cache content items
  3. Shape caching (not sure if this is useful)
  4. Something else?
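For reference, option 2 could be sketched with Orchard's built-in `ICacheManager`/`ISignals` services (as they exist in Orchard 1.x). This is only a hedged sketch, not working production code: the service class and signal names below are hypothetical, and invalidation would still need to be wired into the relevant content handlers.

```csharp
// Hedged sketch: caching content items above the content manager,
// invalidated by a signal triggered on publish/update. Uses Orchard's
// ICacheManager / ISignals (Orchard 1.x); class and signal names are
// hypothetical, for illustration only.
using Orchard.Caching;
using Orchard.ContentManagement;

public class CachedContentService {
    private readonly IContentManager _contentManager;
    private readonly ICacheManager _cacheManager;
    private readonly ISignals _signals;

    public CachedContentService(
        IContentManager contentManager,
        ICacheManager cacheManager,
        ISignals signals) {
        _contentManager = contentManager;
        _cacheManager = cacheManager;
        _signals = signals;
    }

    public ContentItem GetCached(int id) {
        // The entry stays cached until the monitored signal fires.
        return _cacheManager.Get("content-" + id, ctx => {
            ctx.Monitor(_signals.When("content-" + id + "-changed"));
            return _contentManager.Get(id);
        });
    }

    // Call this from a publish/update handler to evict the entry.
    public void Invalidate(int id) {
        _signals.Trigger("content-" + id + "-changed");
    }
}
```

The hard part, as discussed below, is not the caching itself but making sure every write path triggers the invalidation signal.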

Any advice?

Thanks.

Coordinator
Apr 4, 2012 at 4:35 AM

How many queries?

Contrib.DbCache doesn't work, don't use it. Use output caching instead (Contrib.Cache module). If you can't use that (but I'm not sure I understand why from your explanations), you'll have to profile and do your own optimization.

Shape caching for example has been shown to work on largish sites: http://chrisbower.com/2011/05/31/shape-method-caching/
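The pattern in that post boils down to caching the output of an expensive shape method. A hedged sketch (not the article's exact code; the shape name, cache key, and five-minute expiry are all illustrative):

```csharp
// Sketch of shape method caching, after the approach in the linked
// post: cache what an expensive [Shape] method produces, keyed by its
// inputs. Shape name and cache duration are illustrative only.
using System;
using System.Web;
using System.Web.Caching;
using Orchard.DisplayManagement;

public class CachedShapes {
    [Shape]
    public IHtmlString ExpensiveWidget(string Key) {
        var cacheKey = "ExpensiveWidget-" + Key;
        var cached = HttpRuntime.Cache.Get(cacheKey) as IHtmlString;
        if (cached == null) {
            cached = new HtmlString(BuildMarkup(Key)); // expensive work
            HttpRuntime.Cache.Insert(
                cacheKey, cached, null,
                DateTime.UtcNow.AddMinutes(5),      // absolute expiry
                Cache.NoSlidingExpiration);
        }
        return cached;
    }

    private static string BuildMarkup(string key) {
        return "<div>" + HttpUtility.HtmlEncode(key) + "</div>";
    }
}
```

Time-based expiry sidesteps the invalidation problem at the cost of serving slightly stale markup for up to the expiry window.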

Apr 4, 2012 at 4:48 AM

Are you running DB and web server on the same machine when you do the load test? CPU usage by itself isn't that useful in doing load or perf testing. What requests per second are you getting? 

Apr 4, 2012 at 5:26 AM

bertrandleroy wrote:

How many queries?

Contrib.DbCache doesn't work, don't use it. Use output caching instead (Contrib.Cache module). If you can't use that (but I'm not sure I understand why from your explanations), you'll have to profile and do your own optimization.

Shape caching for example has been shown to work on largish sites: http://chrisbower.com/2011/05/31/shape-method-caching/

The basic "Welcome to Orchard" page generates roughly 7 to 8 queries on every page refresh, for no apparent reason, since there is no added content or widget on the page. We can't use Contrib.Cache because it caches at the page level, and our pages display data dynamically based on various parameters, so every request might output the same HTML structure but with different data values and images.

I've been reading a lot on this forum about performance, and one thing keeps coming up: "profile and do your own optimization." But no one clarifies why performance is so bad right out of the gate. What are your thoughts on caching at the IRepository and/or ContentManager level?

Apr 4, 2012 at 5:34 AM
TheMonarch wrote:

Are you running DB and web server on the same machine when you do the load test? CPU usage by itself isn't that useful in doing load or perf testing. What requests per second are you getting? 

No, they're running on different machines connected over a 1 Gb LAN. We're only seeing 70 req/sec on a 4-core, 8 GB box for 10, 20, and 40 concurrent users. The concerning part is that those throughput numbers are without anything added from our side.

What do you recommend?

Coordinator
Apr 4, 2012 at 6:13 AM
Edited Apr 4, 2012 at 6:16 AM

1. Profile

2. Go from there.

7 to 8 queries is very little for a CMS without caching. What are your pages varying on?

Performance is excellent out of the gate, especially when you compare it to other CMSs on .NET or other platforms.

Caching at repository level or content manager is difficult because of invalidation, but of course not impossible.

Apr 4, 2012 at 11:36 PM

Are you sure performance even needs improving? If you got 70req/s for 40 users that's not too bad. Did you stop at 40 because the CPU usage got to >90%, or did you start receiving errors? Does your load testing script include any wait time or does it fire off requests as soon as the previous one is done? 

What metric (e.g. requests/sec under X concurrent users) would it take to satisfy your perf. requirements?

Apr 4, 2012 at 11:38 PM

I'm not sure why you can't use the Cache module. You said the page content doesn't vary by user, so does it vary by URL? The cache module can handle that from what I've read. Or maybe you can extend the cache module to vary by whatever input your site uses that prevents you from using Cache module out of the box.

Apr 5, 2012 at 2:38 AM
bertrandleroy wrote:

1. Profile

2. Go from there.

7 to 8 queries is very little for a CMS without caching. What are your pages varying on?

Performance is excellent out of the gate, especially when you compare it to other CMSs on .NET or other platforms.

Caching at repository level or content manager is difficult because of invalidation, but of course not impossible.

No added pages yet; the test was done on plain vanilla Orchard. CPU usage for 10 concurrent users concerns me. Do you have baseline performance numbers for different hardware configurations?

Apr 5, 2012 at 2:59 AM
TheMonarch wrote:

Are you sure performance even needs improving? If you got 70req/s for 40 users that's not too bad. Did you stop at 40 because the CPU usage got to >90%, or did you start receiving errors? Does your load testing script include any wait time or does it fire off requests as soon as the previous one is done? 

What metric (e.g. requests/sec under X concurrent users) would it take to satisfy your perf. requirements?

Yes, it definitely needs improvement. 90% CPU usage for 10 concurrent users over a 20-minute test is not acceptable for any site on the internet. We stopped further testing because we weren't seeing any improvement between the 20- and 40-concurrent-user tests. The script doesn't have any wait time between requests; we use two client machines to drive the total user load, and we wait 30 minutes before ramping up users.

It's not so much about our requirements; the base product itself should run without chewing up server resources, which is not the case right now.

Based on my current understanding, shapes are one culprit, and the NHibernate database queries are another.

How's performance looking for your site?

Apr 5, 2012 at 3:39 AM

As I said before, CPU usage means nothing by itself. You should be looking at requests per second. If you got to 70r/s that's not *terrible*. It's been about 4 years since I did major performance tests, and on the (pretty large) projects I was on at the time 70r/s on a single server would have been considered decent to good. If you can achieve your target requests/sec does it matter what the CPU use is? That's why I asked what your target metric is. CPU usage alone is a terrible target metric to use. I do get your concern that there might not be enough performance headroom for when you add your site's custom data and logic. 

One question I have, is that earlier you said that you get 70 req/s with up to 40 concurrent users. Why do you keep worrying then, about the cpu usage for 10 concurrent users? If you were at 90% at 10 users it's surprising that you were able to get to 40 concurrent users with the same requests/sec; that seems strange to me. 

I haven't tested performance for Orchard on my site yet. I did look around to see what the biggest sites are that use Orchard and then looked at their traffic stats on alexa.com. With a brand new site like mine I have no idea if it will grow large enough for performance to become a concern. If/when it does, I'm confident I'll be able to handle it with various things: continual improvements in the Orchard codebase, caching modules in the gallery (like Contrib.Cache), custom caching, or other custom optimizations. Basically it's good enough for me right now and I'm not going to worry about it until I have to.

Apr 5, 2012 at 3:41 AM

Also, why is Contrib.Cache not viable? What input(s) are you varying data by, if not url or user details? 

Apr 5, 2012 at 3:55 AM
TheMonarch wrote:

Also, why is Contrib.Cache not viable? What input(s) are you varying data by, if not url or user details? 

Because it does page-level caching, and our pages vary their data by parameters:

www.foo.com/getmesomething?myparam=1&id=2

or

www.foo.com/getmesomething?myparam=3&id=4

We'll also reuse some of the widgets on multiple pages, with data varying by parameters.
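If the data only varies by query string, one way around this would be extending the output cache to vary its key by the query string. A hedged sketch of such a key builder (plain .NET, no Orchard dependency; the class name is hypothetical and this is not how Contrib.Cache actually computes its keys):

```csharp
// Sketch: building a cache key that varies by query string, so
// /getmesomething?myparam=1&id=2 and ?myparam=3&id=4 get separate
// cache entries. Parameters are sorted so ?a=1&b=2 and ?b=2&a=1
// map to the same entry.
using System;
using System.Linq;
using System.Web;

public static class CacheKeys {
    public static string ForUrl(Uri url) {
        var qs = HttpUtility.ParseQueryString(url.Query);
        var parts = qs.AllKeys
            .Where(k => k != null)
            .OrderBy(k => k, StringComparer.Ordinal)
            .Select(k => k + "=" + qs[k]);
        return url.AbsolutePath + "?" + string.Join("&", parts);
    }
}
```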

Apr 5, 2012 at 6:19 AM
Edited Apr 5, 2012 at 6:20 AM

Ah. For some reason I thought the Contrib.Cache module would vary by QueryString params as well. Looks like it's using Request.ToRootUrlString() to get the url it uses for the cache key -- I'm not sure if that method includes querystring, but I thought it did. I've only used it once or twice, so I'll take your word for it. 

Apr 5, 2012 at 4:57 PM
Edited Apr 5, 2012 at 5:27 PM

Anyone know if/how Contrib.Cache handles widgets? If you do output caching for a page w/ widgets, does the Cache module have some kind of Donut (Hole) caching? 

Also, what's up with the nhibernate caching? Why doesn't it work? Is it impossible to fix right now due to a technical constraint or just that no one has gotten around to it yet? 

Coordinator
Apr 7, 2012 at 7:27 AM

No. It caches the whole page, at least for now.

NHib caching won't work because of constraints in the version we're currently using.

Coordinator
Apr 7, 2012 at 7:42 AM

Oh and not taking the QS into account would be a bug. I would file it.

Coordinator
Apr 7, 2012 at 7:48 AM

@Kamin: if you have performance goals that are unrealistic for any CMS, maybe you shouldn't use a CMS and go for custom development.