I won't go into specifics on the server specs, since I know there's no single right answer for this. But I've been doing load testing today with Apache's ab command.
I got about 70 requests per second (1,000 requests with 100 concurrent users) on a page that loads from 4 different DB tables and does some manipulation with the data, so it's a fairly heavy page.
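For the record, the invocation looked something like this (hostname and path are placeholders):

    ab -n 1000 -c 100 http://your-server/your-heavy-page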
The server isn't used for anything else yet, and since it's still in development the only load on it is me. But the application will be used daily by many users.
But is this enough? Or should I even worry, as long as it stays above some X requests per second?
I'm thinking I shouldn't worry, but I'd like some tips on this.
Answer
70 requests per second works out to roughly 252,000 page renders per hour.
If you assume that the average browsing session on your site is 10 pages deep, you can support about 25,000 uniques per hour.
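The arithmetic, spelled out:

    70 requests/second * 3,600 seconds/hour = 252,000 pages/hour
    252,000 pages/hour / 10 pages per visit = 25,200, call it 25,000 visits/hour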
You should probably check these numbers against your expected visitor count, which should be available from the folks on the business side.
Many of the sites I work on see about 50% of their daily traffic in a roughly 3-hour peak period each day. If that holds for your site (it depends on the kind of content you serve and on your audience), you should be able to support around 150,000 unique visits per day.
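Spelled out:

    25,000 visits/hour * 3 peak hours = 75,000 visits in the peak window
    75,000 visits / 0.50 of daily traffic = 150,000 visits/day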
These are pretty good numbers; I think you should be fine. It's wise to look into opcode caching and database tuning now, but remember: premature optimization is the root of all evil. Monitor the site, look for hotspots, and wait for traffic to grow before embarking on an expensive optimization effort for a problem you may not have.
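A crude but effective first pass at hotspot hunting, assuming a standard Linux box (hostname and path are placeholders again):

    # terminal 1: re-run the load test
    ab -n 1000 -c 100 http://your-server/your-heavy-page

    # terminal 2: sample CPU vs. I/O wait once per second to see whether
    # the page code or the database's disk access is the bottleneck
    vmstat 1

If the us/sy columns are pegged, the page code (and opcode caching) is where to look first; if wa dominates, start with the database.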