Schroeder last edited by
I get a huge kick out of this site every day over lunch. Figured I'd see if anyone can provide some input to my current situation... I recently moved to a company that has embraced "enterprise practices" and built their own business framework (in .NET). That's all well and good, until it comes time to deploy and maintain the applications.
The data access components, business processes, utilities etc. are spread between hundreds of assemblies. That forces us to deploy those assemblies to the GAC so that we can distribute changes to the underlying framework between the various applications that use it (without recompiling each of those apps). And, of course, that means a machine.config file that's chock full of redirects between the various versions. It also significantly blurs the boundaries between the various applications since they're all so tightly coupled.
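For anyone who hasn't lived this: a single binding redirect in machine.config looks something like the fragment below (the assembly name, token, and version numbers here are made up for illustration). Now imagine one of these per assembly, per framework release, hundreds of times over:

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <!-- "Acme.Framework.Data" is a hypothetical framework assembly -->
        <assemblyIdentity name="Acme.Framework.Data"
                          publicKeyToken="b77a5c561934e089"
                          culture="neutral" />
        <!-- Each framework release means updating redirects like this
             in machine.config for every app that shares the GAC'd copy -->
        <bindingRedirect oldVersion="1.0.0.0-1.4.0.0"
                         newVersion="1.5.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```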
A good example is when I decided to use a very minor piece of functionality in that framework... I created one reference to a shared component, and the next thing I know my app is loading about 25 different assemblies, which takes forever because of all the binding redirects, JIT compilation, etc.
I can accept that shared code and enterprise level solutions are tough, but I've got to believe that there's a better way of doing things. My thinking is that the entire framework should be boiled down to several assemblies which would then allow us to start using build scripts (via NAnt or something) and ditch the GAC and the associated "redirect hell" since it becomes easier to package the framework and then let the individual apps worry about getting the updates. Am I nuts?? Am I living a WTF?
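To make the "boil it down and script the build" idea concrete, here's a rough sketch of what a NAnt build file for that could look like. This is just my guess at the shape of it; the project and assembly names are hypothetical, and your source layout will obviously differ:

```xml
<!-- Sketch of a NAnt build file: compile the framework into one
     assembly and package it so each app pulls updates itself,
     no GAC or machine.config redirects involved -->
<project name="EnterpriseFramework" default="package">
  <property name="out.dir" value="build" />

  <target name="build">
    <!-- One library instead of hundreds of GAC'd assemblies -->
    <csc target="library" output="${out.dir}/Acme.Framework.dll">
      <sources>
        <include name="src/**/*.cs" />
      </sources>
    </csc>
  </target>

  <target name="package" depends="build">
    <!-- Zip the output so individual apps can grab the update -->
    <zip zipfile="${out.dir}/framework.zip">
      <fileset basedir="${out.dir}">
        <include name="*.dll" />
      </fileset>
    </zip>
  </target>
</project>
```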
I suppose the question comes down to how frequently your enterprise framework is released. If the application-to-framework release ratio is anywhere close to (or below) one, then it doesn't make a whole lot of sense to be deploying countless side-by-side versions in the GAC. And if your ratio is that close, your framework is probably too fat (which it sounds like it is, anyway).
As far as slow apps go, I don't think there's any way around that. If your framework were designed differently, such that each assembly referenced only a core assembly to do its thing (look at the .NET Framework), it'd be a totally different story.
One thing you should look into for your automated builds is BusyBeeBuilder. I found it oh so much easier than NAnt.
bmschkerke last edited by
Hundreds of assemblies would be slow in any language that you choose to deploy in -- C++ DLLs would provide no benefit at that kind of scope. I cannot fathom how someone would see this as a good idea. The interdependencies alone would drive me insane.

I'm doing something similar now, but I'm using declarative programming on the WinForms side to attach to the business objects. The business objects are pushed through up to five tiers, depending upon what request is being received (reporting, data entry, etc). The only shared assembly is the definition of the business objects. Everything else is a discrete system -- web services are used to communicate between layers (and not Remoting, unfortunately - I'd like to do a compression sink inside .NET Remoting, but I desire interoperability more). Is the system "slower" because of the web services? Sure. But the maintenance, the ability to replace and repair our system like Legos... that's so much nicer. And if someone wants to complain about an extra 70ms or so... well, their monitor might explode.