Of the half-dozen VPS/dedicated hosts I've used over the past few years, none has come with anything close to a workable web server setup. They usually just hand you a default config file and expect you to figure it out on your own...which is good in some ways, since it forces you to learn, but sometimes you just want something that works. Perhaps I've just been unlucky with providers and other services have saner setups.
For example, my current dedicated host seemed to just use the default config file for Apache 1.3. MaxClients was set to 255 when the server's 512MB of RAM could support maybe 50 clients at most. KeepAlive was on, with KeepAliveTimeout set to an extremely high 10 seconds, when it should be 1 or 2 at most, or even off entirely on a high-traffic site. The long timeout kept connections open far too long, which in turn caused more child processes to be spawned to handle incoming connections. Within an hour the server would run out of RAM, start swapping like crazy, and performance dropped through the floor. By researching and tweaking the config I was able to improve the traffic the server could handle by a factor of 100 or so.
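To illustrate, a tuned config for a 512MB box might look something like the fragment below. The directives are the real Apache 1.3 ones, but the specific numbers are assumptions; you'd want to adjust them based on how much memory each of your own Apache processes actually uses:

```
# Cap concurrent children so Apache's total memory stays under physical RAM.
# At roughly 8MB per child, 50 children is about 400MB, leaving headroom for the OS.
MaxClients          50

# Keep persistent connections, but release idle ones quickly.
KeepAlive           On
KeepAliveTimeout    2

# Recycle children periodically so any memory leaks stay contained.
MaxRequestsPerChild 1000
```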
I'd guess something similar is happening on your server, although with your relatively low traffic it may be something else. If you're running Apache 1.3, check the settings mentioned above and tweak them; Apache 2.0 and other web servers have equivalent knobs. Google should turn up plenty of articles on tuning Apache performance. You can also use the 'top' command to see exactly what is taking up memory...it may well be a bunch of other things you don't actually need. One VPS I had came with dozens of optional services running that I could have removed.
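As a rough sanity check, you can estimate a safe MaxClients yourself: measure the average resident size of one Apache child (with top, or ps sorted by RSS), then divide your spare RAM by it. A sketch, where every number is an illustrative assumption for a 512MB box rather than a measurement:

```shell
# Illustrative numbers only -- measure your own with `top` or `ps aux --sort=-rss`.
total_ram_mb=512      # physical RAM on the server
reserved_mb=100       # headroom for the OS, database, etc.
per_child_mb=8        # average resident size of one Apache child

# Integer division gives a conservative ceiling for MaxClients.
echo $(( (total_ram_mb - reserved_mb) / per_child_mb ))   # → 51
```

With these assumed numbers you'd land on a MaxClients around 50, which lines up with the figure above; the point is the arithmetic, not the exact values.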