There can be any number of problems.
For instance, there's a limit to the number of requests Apache (the web server in this instance) can handle at a time, and SQL databases have a connection limit of their own. When the limits are set the wrong way round (SQL greater than Apache) you'll get timeouts; if it's the other way around (Apache greater than SQL) you'll get database error messages.
This could be alleviated by increasing the number of users the server and the database can each handle (obviously you have to balance the two so neither becomes a bottleneck).
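As a rough sketch of what "balancing the two" looks like in practice (the numbers here are placeholders, and the right values depend entirely on your hardware and traffic), the Apache side lives in httpd.conf and the MySQL side in my.cnf:

    # httpd.conf (prefork MPM) - how many requests Apache will serve at once
    <IfModule mpm_prefork_module>
        ServerLimit          150
        MaxRequestWorkers    150    # called MaxClients on older Apache releases
    </IfModule>

    # my.cnf - how many connections MySQL will accept
    [mysqld]
    max_connections = 160           # a little headroom above Apache's worker count

If every Apache worker can hold open a database connection, max_connections needs to be at least as big as MaxRequestWorkers, plus some headroom for anything else that talks to the database; get it backwards and you're back in the timeout/DB-error situation above.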
Are you using SSL at all? Do you have a redundancy network in place, or are you trying to pile all your connections through one "gateway"?
I'm basing my questions on what I learnt when I ran a dedicated server online, hosting a number of services (Apache/SQL/IRC/Sendmail/Newsgroup). The dedicated server I used was connected to a backbone in what I'd term a redundancy network, which basically means that when people accessed the server, some of its content was served up by cache systems to spread the overall weight across those machines. It was also possible for people to connect directly to the system, which I tested using an IRC server; however, the number of people and the bandwidth they used would crash it (well, technically a DoS, denial of service). This wasn't deliberate, it just happened through popularity.
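If you want to try the caching idea on a single box before putting dedicated cache machines in front of it, Apache's own mod_cache can serve repeat requests from disk instead of hitting the application and the database every time. This is only a minimal sketch, assuming Apache 2.4 with mod_cache_disk available; the paths and expiry time are placeholders:

    # httpd.conf - cache rendered pages on disk so repeat hits skip the backend
    LoadModule cache_module modules/mod_cache.so
    LoadModule cache_disk_module modules/mod_cache_disk.so

    CacheEnable disk /
    CacheRoot /var/cache/apache2/mod_cache_disk
    CacheDirLevels 2
    CacheDirLength 1
    CacheDefaultExpire 300    # seconds to keep a copy when the response sets no expiry

The redundancy network I described was essentially the same trick pushed out to separate machines, which also spreads the bandwidth load instead of piling it all through one gateway.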