@C-Octothorpe said:
Depending on the application size, you'd probably save yourself a lot of time and effort by buying a profiler rather than just guessing where the bottlenecks are and optimizing in the wrong places. You think it's DB access when really it could be that some idiot wrote code to loop through a collection of 50k items 4 times over before getting back to the page which turns out only renders the first 20 items of that collection.
That kind of thing is one of the known problems, actually. For starters, one of the contractors-of-yore rolled his own pagination implementation, and it doesn't actually paginate: it still transfers entire, large database tables just to get the first n records. On top of that, there are grids that call the database on every row instead of just once, pulling back entire data sets only to filter down to a single row. And there are Entity Framework calls littered with irrelevant .Include() calls, dragging in all manner of useless data. The contractors who pulled that one had no idea how to use Entity Framework appropriately (which gave EF a bad name around here, and is why we're not allowed to use it in new code right now), and they either didn't test their code or didn't care that it was slower than a turtle going for a leisurely stroll up a mountain.
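For the curious, the first two anti-patterns are easy to sketch. The codebase is .NET/EF, but the shape of the problem is language-agnostic, so here's a minimal sketch in Python with sqlite3; the Orders table, column names, and row counts are all made up for illustration:

```python
import sqlite3

# Hypothetical schema and data; names and sizes are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Orders (Id INTEGER PRIMARY KEY, CustomerId INTEGER, Total REAL)")
conn.executemany("INSERT INTO Orders (CustomerId, Total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(50_000)])

# Anti-pattern 1: "pagination" that pulls the whole table, then slices in memory.
def first_page_bad(page_size=20):
    rows = conn.execute("SELECT Id, CustomerId, Total FROM Orders ORDER BY Id").fetchall()
    return rows[:page_size]  # 50k rows transferred to render 20

# Fix: let the database do the paging.
def first_page_good(page_size=20):
    return conn.execute(
        "SELECT Id, CustomerId, Total FROM Orders ORDER BY Id LIMIT ?",
        (page_size,)).fetchall()

# Anti-pattern 2: one query per grid row (the classic N+1).
def totals_bad(customer_ids):
    return {cid: conn.execute(
                "SELECT SUM(Total) FROM Orders WHERE CustomerId = ?",
                (cid,)).fetchone()[0]
            for cid in customer_ids}

# Fix: one grouped query for the whole grid.
def totals_good(customer_ids):
    placeholders = ",".join("?" * len(customer_ids))
    return dict(conn.execute(
        f"SELECT CustomerId, SUM(Total) FROM Orders "
        f"WHERE CustomerId IN ({placeholders}) GROUP BY CustomerId",
        customer_ids).fetchall())
```

The .Include() abuse is the same family of problem: asking the database for far more data than the page will ever render.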
There might be other problems, too, that a profiler would identify, at least within the code. We have done a fair amount of profiling on the actual DB itself using plain old Microsoft tools. There are some long-standing WTFs in the schema design; it was great fun when the decision not to make UserName unique (so that "migration would be easier") blew up after some database optimization. But the stuff written in the last six months seems to be performing well overall.
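For reference, the fix for that last one is a single constraint. A sketch with a hypothetical Users table (sqlite3 standing in for the real database):

```python
import sqlite3

db = sqlite3.connect(":memory:")
# The real schema left UserName non-unique "for easier migration";
# the constraint that would have prevented the blow-up looks like this.
db.execute("CREATE TABLE Users (Id INTEGER PRIMARY KEY, UserName TEXT NOT NULL UNIQUE)")
db.execute("INSERT INTO Users (UserName) VALUES ('alice')")
try:
    db.execute("INSERT INTO Users (UserName) VALUES ('alice')")
    duplicate_allowed = True
except sqlite3.IntegrityError:
    duplicate_allowed = False  # the constraint rejects the duplicate at write time
```

With the constraint in place, duplicates fail loudly at insert time instead of silently accumulating until an optimization pass trips over them.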