.NET app memory usage help



  • I'll kick this off by saying that I've some experience in .NET, but I don't have a compsci degree, so bear with me.

    I've developed a small and simple C# .NET 2.0 Windows app that basically caches some database data and provides a query mechanism against it. Quite simply, it downloads a bunch of stuff (about 1.3MB worth), whacks it in a SQLite db (using the System.Data.SQLite library), and then there's a notification icon, a small query form which shows the results in a DataGridView, and an options form. The only other thing is a keyboard hook so that I can use a shortcut key combo to show and hide the query form.

    Now, I know that if you set up a simple forms project and run it, the VM usage is about 25MB, which goes down to a few MB when the main form is minimized. However, my app seems to balloon up past 50MB after a few queries have been done. I'm not quite sure what's using up this memory - my data access class is a thread-safe singleton, and I throw data around in DataTables, which I try to dispose of along the way when I don't need them. I was under the impression that the .NET framework did some sort of sane garbage collection, and I can see that in action sometimes when I do a large query followed by a small query, and the VM usage drops down to something like 41MB.
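    For reference, the "query into a DataTable, dispose along the way" pattern described above usually looks something like this with System.Data.SQLite. This is only a sketch - the method and class names are invented for illustration, not taken from the actual app:

```csharp
// Hypothetical sketch of querying SQLite into a DataTable.
// QueryHelper and RunQuery are invented names for illustration.
using System.Data;
using System.Data.SQLite;

public static class QueryHelper
{
    public static DataTable RunQuery(SQLiteConnection conn, string sql)
    {
        using (SQLiteCommand cmd = new SQLiteCommand(sql, conn))
        using (SQLiteDataAdapter adapter = new SQLiteDataAdapter(cmd))
        {
            DataTable table = new DataTable();
            adapter.Fill(table);
            return table; // caller should drop/dispose this when the grid no longer needs it
        }
    }
}
```

    One caveat: Dispose() on a DataTable releases very little by itself (it comes from MarshalByValueComponent); what actually lets the GC reclaim the rows is dropping all references to the table, e.g. clearing the DataGridView's DataSource.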

     Does anyone have any tips/reading/resources/applications that I can use to figure out why so much memory is being allocated, and any way I can reduce it?

     Thanks

     



  • Sounds absolutely normal for .NET applications with database connectivity.  I rarely develop applications that don't interface with a database, and all of them use 47-57MB of RAM (physical and virtual) once a connection has been made.  My guess is that the extra memory usage has to do with loading the database drivers, maintaining the connectivity when idle, and reserving space for query data.  As long as the memory usage doesn't slowly increase over time, you can be fairly certain you have no memory leaks.

    As far as what applications to use to display the resources being used by an application, I use Sysinternals' Process Explorer, which Microsoft bought.  If you go to www.sysinternals.com, you'll get redirected to the Microsoft page - it's under Processes & Threads.  If you right-click on a .NET application and go to Properties, there is a .NET tab that has a whole bunch of statistics related to the runtime.  The other option is to use a profiler.  I've heard good things about Red Gate's ANTS Profiler, but I haven't used it.




  • Thanks, I'll take a look at that. Just to allay my fears, there's nothing inherently wrong with having a single data connection and locking it whenever I need to use it is there? I imagine I could pool the connections, but that might suck even more memory...



  • Why do you care that much about memory usage?



  • I don't think I care excessively, I simply thought it unusual.



  • @growse said:

    Thanks, I'll take a look at that. Just to allay my fears, there's nothing inherently wrong with having a single data connection and locking it whenever I need to use it is there? I imagine I could pool the connections, but that might suck even more memory...

    In a client application such as you describe, I'd say no.  Most of my client apps use a singleton data access component.  I'd steer clear of connection pooling unless you're dealing with a server-side component or you have very unusual requirements.  The native .NET libraries for OleDb, SqlClient, and OracleClient already perform connection reuse, eliminating the overhead of establishing connections.  The Close() method on a connection doesn't actually close the connection to the database; it just marks it as inactive.  A subsequent command to the same resource should wake up the existing internal connection to do the work.  IMO, add-on data libraries for other RDBMSs should exhibit the same behavior, but you never know.
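    A minimal sketch of the pattern being blessed here - one shared connection, one lock serializing access to it. The class name, connection string and API are invented for illustration only:

```csharp
// Illustrative thread-safe singleton wrapping a single shared connection.
// DataAccess, the connection string, and ExecuteScalar's shape are made up.
using System.Data.SQLite;

public sealed class DataAccess
{
    private static readonly DataAccess instance = new DataAccess();
    private readonly SQLiteConnection conn;
    private readonly object sync = new object();

    private DataAccess()
    {
        conn = new SQLiteConnection("Data Source=:memory:");
        conn.Open();
    }

    public static DataAccess Instance
    {
        get { return instance; }
    }

    public object ExecuteScalar(string sql)
    {
        // Serialize all access to the single shared connection.
        lock (sync)
        {
            using (SQLiteCommand cmd = new SQLiteCommand(sql, conn))
            {
                return cmd.ExecuteScalar();
            }
        }
    }
}
```

    The lock is the whole trick: for a single-user client app the contention is negligible, and you avoid both pooling overhead and the "two threads on one connection" failure mode.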

     

     



  • Times are just changing. My old Mac clone (StarMax 4000/200) has 144 MB of RAM. In this, I would run, simultaneously, iCab, Outlook Express (the nice Mac version), AIM, ICQ, ShadowIRC, lili_Pad, SimpleText, BBEdit 6, Photoshop 5, REALbasic 3.5.1, MacHTTP (Web server), Rumpus 3 (pro FTP server), MacASP, error acgi, SoundApp, Mac TCP Watcher, Fetch, Picture Viewer and UptimeMac. The Mac would be on 24/7.

    Running this much in Linux/KDE or Windows 2000 would be unthinkable. I struggle to get close to this in Windows 2000 with 512 MB. Visual C# 2005 Express isn't too bad an IDE (the live error checking is helpful) but I've never actually seen an IDE that ever beat Delphi 1 on my 486 in Windows 3.11, and that PC had, what, 12 MB RAM? Visual C# needs about 70 MB, 12 more for a permanently open VShost whatsit, and then another 40 for the astonishingly useless online help.

    Most OS 9 apps contained debugging symbols (visible if they crash when you have MacsBug installed) and as far as I am aware, Mac OS never provided a C or C++ runtime, so every app must have had the necessary parts of the runtime compiled in. (I don't know, but if it did ship with C runtime, it's not visible anywhere.)

    It pains me to see Thunderbird eating over 30 MB just to read e-mail, and I caught Pidgin eating over 70 MB the other day. (I am pretty convinced that GTK+ in Windows leaks RAM like a sieve.) As much as I love Process Explorer, I caught that using 40 MB the other day. Using my PC just for Web surfing, e-mail, IM and music, I'm using 395 MB right now. I can't even see where it's all gone -- not all memory in use registers in Process Explorer outside of the commit charge counter. (The figure earlier was over 600 MB, so the kernel must have leaked like diarrhoea when I installed .NET the other day.)

    I suppose, though, if your PC has like 4 GB of RAM, you can afford to let .NET throw it all down the drain like it did with all of your money.



  • @Daniel Beardsmore said:

    Using my PC just for Web surfing, e-mail, IM and music, I'm using 395 MB right now. I can't even see where it's all gone -- not all memory in use registers in Process Explorer outside of the commit charge counter. (The figure earlier was over 600 MB, so the kernel must have leaked like diarrhoea when I installed .NET the other day.)

    I suppose, though, if your PC has like 4 GB of RAM, you can afford to let .NET throw it all down the drain like it did with all of your money.

    Every operating system supports the concept of virtual memory. Just because an app requires, say, 100 MB doesn't mean that there is only 412 MB of your 512 MB RAM left. 



  • @Daniel Beardsmore said:

    Times are just changing. My old Mac clone (StarMax 4000/200) has 144 MB of RAM. In this, I would run, simultaneously, iCab, Outlook Express (the nice Mac version), AIM, ICQ, ShadowIRC, lili_Pad, SimpleText, BBEdit 6, Photoshop 5, REALbasic 3.5.1, MacHTTP (Web server), Rumpus 3 (pro FTP server), MacASP, error acgi, SoundApp, Mac TCP Watcher, Fetch, Picture Viewer and UptimeMac. The Mac would be on 24/7.

    Running this much in Linux/KDE or Windows 2000 would be unthinkable. I struggle to get close to this in Windows 2000 with 512 MB. Visual C# 2005 Express isn't too bad an IDE (the live error checking is helpful) but I've never actually seen an IDE that ever beat Delphi 1 on my 486 in Windows 3.11, and that PC had, what, 12 MB RAM? Visual C# needs about 70 MB, 12 more for a permanently open VShost whatsit, and then another 40 for the astonishingly useless online help.

    If you don't run KDE or gnome, a modern unix host gets a lot done in very little space. Glancing at what's currently running on my desktop:

    • Print server (cups, 5Mb)
    • NFS server (nothing really, kernel-side)
    • Mail server (exim4, 2Mb)
    • Samba (5Mb)
    • Tor endpoint (10Mb)
    • NTP server (1.5Mb)
    • Miscellaneous X processes (about 20Mb total)
    • mpd (audio player daemon, 7Mb)
    • xfce window manager and panel (40Mb, mostly on behalf of other applications)
    • gkrellm (system static monitor, 11Mb)
    • xemacs21 (development kitchen sink, 23Mb)
    • terminals (18Mb total)
    • transmission (bittorrent client, 50Mb, several large torrents active)
    • spamassassin (50Mb)
    • clamav (60Mb)
    • firefox (piece of crap, 330Mb of cache-bloat)
    • mutt (mail reader, 3Mb)
    There'd normally be a bit more than that, but I had a power cut last week and haven't cluttered it much since then. Aside from firefox's inane memory cache behaviour, most of the memory usage is spamassassin and clamav holding their entire rule databases in memory, and if this wasn't my development workstation those would have been pushed off to a server someplace. Also, I'm quoting the maximum memory requirements, not the actual usage - a lot of that will be shared between the various applications.

    Windows and .NET waste memory, because Microsoft don't care about wasting your memory (they don't pay for it). A properly designed system doesn't have to (KDE and gnome waste a lot of memory on graphical crud). Times aren't really changing at all - it's just people doing things carelessly, same as it always has been.



  • ...The native .NET libraries for OleDb, SqlClient, and OracleClient already perform connection reuse, eliminating the overhead of establishing connections.  The Close() method on a connection doesn't actually close the connection to the database; it just marks it as inactive.  A subsequent command to the same resource should wake up the existing internal connection to do the work....

    Umm, yes.  It is called connection pooling and is the default behaviour for the SqlClient provider (I don't know about the others).  If you don't want this behaviour then you must specify Pooling=false in the connection string.
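    For example (the server and database names here are made up):

```csharp
// SqlClient pools connections by default; add Pooling=false to the
// connection string to opt out. MYSERVER/MyDb are placeholder names.
string pooled   = "Data Source=MYSERVER;Initial Catalog=MyDb;Integrated Security=SSPI";
string unpooled = "Data Source=MYSERVER;Initial Catalog=MyDb;Integrated Security=SSPI;Pooling=false";
```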

    When checking memory usage you must separate the working set of the process from the actual amount of memory it is using.  At work we have use of Spotlight, which I have pointed at my PC to see how the application we are developing is doing on memory, CPU etc.  It normally reports a virtual address space of around 180MB just for starting up.  Its actual committed virtual memory at this point is around 40MB, and in-use physical RAM is normally 2MB.  Windows is lazy about reclaiming memory: if process 1 requests 100MB, then it gets it.  When it releases it, its working set won't shrink.  Windows will only shrink its working set when either the process asks for it to be shrunk (as happens when you minimise a window) or Windows thinks it is getting short of RAM.
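    For what it's worth, you can read several of those counters for your own process from .NET 2.0 itself. A small sketch - the names below are the System.Diagnostics.Process property names, which won't exactly match what Spotlight or Task Manager label them:

```csharp
// Print the different memory figures for the current process.
using System;
using System.Diagnostics;

class MemStats
{
    static void Main()
    {
        using (Process p = Process.GetCurrentProcess())
        {
            Console.WriteLine("Working set (resident):  {0:N0} KB", p.WorkingSet64 / 1024);
            Console.WriteLine("Private bytes:           {0:N0} KB", p.PrivateMemorySize64 / 1024);
            Console.WriteLine("Virtual address space:   {0:N0} KB", p.VirtualMemorySize64 / 1024);
            Console.WriteLine("Managed (GC) heap:       {0:N0} KB", GC.GetTotalMemory(false) / 1024);
        }
    }
}
```

    Running this makes the point above concrete: virtual address space dwarfs private bytes, which in turn dwarfs the managed heap that your code actually allocated.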

    Hopefully, someone with more experience of lower-level programming will now come forward and correct that.  I think my basic point is that if average memory usage is staying level for your app over a long time and performance is okay (i.e. no excessive paging), then don't fret about memory usage figures.



  • @asuffield said:

    Times aren't really changing at all - it's just people doing things carelessly, same as it always has been.

    Of course they are. This kind of carelessness was not always possible. At one stage, people had to do a proper job because nothing less would suffice. Computers were expensive and not very well specified (since that was all we had) so programs had to be written properly. Far enough back, in assembler, and they were reliable and didn't crash. Quite a contrast. We've got ahead of ourselves -- technology has advanced much faster than people have figured out how to cope with it, and the price has come down enough that enough people can keep shovelling in the cash to keep up. Then, people lose track of the concept of efficiency. Buy a new PC, that will sort it out.

    @ammoQ said:

    Every operating system supports the concept of virtual memory. Just because an app requires, say, 100 MB doesn't mean that there is only 412 MB of your 512 MB RAM left. 

    That is not true. Virtual memory is an advanced addressing technique. Mac OS 9 supports page swapping ("Virtual Memory") but not virtual memory. It took me a long time to realise that the reason Mac OS 9 lacks protected memory is because certain OS vendors are too stupid to give page swapping the correct title. Mac "Virtual Memory" isn't, hence no protected memory.

    Besides, I don't like being RAM-overdrawn. I was already pretty overdrawn in OS 9, and it amused yet annoyed me that with my new near-silent hard drive, I couldn't tell a system hung from paging apart from one hung from a crash, unless I stopped my music and listened really closely. Windows 2000 has -- on my PC -- severe problems with a heavily overdrawn system. Attempting to initialise the sound drivers (when starting to play audio) causes a temporary system hang and drive thrashing. Up to a point (say, 500 MB RAM used out of 512) you can run a program in the background that locks the sound driver in use by playing a loop of silence. By the time you're going over the 512 MB limit, just skipping along a track will have the same effect, and my Sound Hack app is useless - having the sound driver locked in RAM by another process has no effect any more.

    Mac users prior to X are well known for trying to avoid paging. The Mac trick is to leave it on, but set the swap file to the size of RAM + 1 MB (the lowest it will go). Running page swapping enables other advanced memory services like (I think?) sharing libraries between processes, so you want it enabled but then don't allow it to swap anything out. I'd disable the swap file entirely in Windows and stop all paging if I could, but I just cannot do this with "only" 512 MB RAM. Windows 2000 could be pretty horrible at page swapping, but by SP4 they seemed to have mostly fixed that. Even so, it's really best not to use any more RAM than you can help. Windows isn't happy about it. If I've gone over my 512 MB, things are starting to look a bit grim. By the time they're heading towards 700 MB, I'm in trouble.

    I set Mac OS 9 to pretend to have 288 MB (twice 144) and I can use up most of that without too much pain. Mostly Photoshop sits there forever being paged back in, over and over.


  • @Daniel Beardsmore said:

    @ammoQ said:

    Every operating system supports the concept of virtual memory. Just because an app requires, say, 100 MB doesn't mean that there is only 412 MB of your 512 MB RAM left. 

    That is not true. Virtual memory is an advanced addressing technique. Mac OS 9 supports page swapping ("Virtual Memory") but not virtual memory. It took me a long time to realise that the reason Mac OS 9 lacks protected memory is because certain OS vendors are too stupid to give page swapping the correct title. Mac "Virtual Memory" isn't, hence no protected memory.

    Conversely, the embedded platform running Microware OS-9 that I used to deal with at work had no virtual memory or page swapping.


  • @growse said:

    I was under the impression that the .NET framework did some sort of sane garbage collection, and I can see that in action sometimes when I do a large query followed by a small query, and the VM usage drops down to something like 41MB.

    .NET avoids doing garbage collection unless it's necessary, e.g. when running low on memory, because it's time-consuming and all threads are suspended while it happens. I believe there's some sort of concurrent GC, but I can't see it actually moving memory while the threads are running.

    It's highly unlikely you have a memory leak unless you're using a COM component which leaks memory. The way to "reduce" memory use is to avoid too many allocations in the first place, the classic example being to use StringBuilder (with initial capacity set to the expected final length) instead of string concatenation in a loop.
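    A quick sketch of that classic example (the helper name is invented): concatenation in a loop allocates a new string on every pass, while a StringBuilder sized up front allocates its buffer once.

```csharp
// Pre-sizing the StringBuilder avoids both the per-iteration string
// allocations and the builder's own buffer regrowth. Joiner is a made-up name.
using System.Text;

static class Joiner
{
    public static string Concat(string[] parts)
    {
        int total = 0;
        foreach (string s in parts) total += s.Length;

        StringBuilder sb = new StringBuilder(total); // capacity = expected final length
        foreach (string s in parts) sb.Append(s);
        return sb.ToString();
    }
}
```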

