Stress testing Visual Studio 2010 – Part 2
In the first part of this series I started talking about our general approach to stress testing Visual Studio. In this post I'll talk about which parameters we measure. In the next post I'll explain how we measure them and what tools we use.
VM
The main metric we track is the virtual memory of the devenv.exe process. Virtual memory describes the address space of the process, and it can be backed by RAM or by the page file on disk. Until a region of the address space is committed, it doesn't consume any actual memory.
Our primary goal is to not let VM grow over 1.5 GB, because otherwise the address space on a 32-bit OS becomes so fragmented that we're unable to allocate new chunks of memory and crash with OOM (Out-Of-Memory). Fortunately, we're now meeting this goal even on very large projects. Typical Visual Studio 2010 VM size on startup is anywhere from 300-500 MB. Note, however, that this doesn't mean VS consumes that much RAM. It merely means that the process has reserved this much address space, which it may or may not later fill with "real memory" (committed memory that actually consumes RAM or page file).
Our secondary goal is that VM doesn't keep growing if VS is not working with new data (i.e. that there are no memory leaks).
If VM grows in our tests, it typically grows in chunks of approximately 15-20 MB. VM growth is usually caused by the private/committed memory growing.
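For illustration, here is a minimal sketch (assuming a single running devenv.exe and using the System.Diagnostics.Process API) that samples the virtual memory size we track; note that it reports reserved address space, not RAM actually in use:

```csharp
using System;
using System.Diagnostics;

class VmSample
{
    static void Main()
    {
        // Find the first running devenv.exe instance (assumed to be the VS under test).
        Process devenv = Process.GetProcessesByName("devenv")[0];

        // VirtualMemorySize64 is the size of the reserved address space,
        // not the amount of physical memory actually consumed.
        Console.WriteLine("VM: {0:N0} MB", devenv.VirtualMemorySize64 / (1024 * 1024));
    }
}
```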
Working set
Working set is the subset of virtual memory currently resident in RAM. If memory is swapped out to the page file, the working set may drop to almost nothing. This is the memory that gets freed when you minimize a Windows application or call SetProcessWorkingSetSize. It's not a very useful counter, because the working set size can differ dramatically depending on how much memory is swapped out to the page file.
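To make the distinction concrete, here is a small sketch that trims a process's own working set via SetProcessWorkingSetSize(-1, -1): the working set counter drops, while virtual and private memory are unaffected. This is only an illustration of the counter's behavior, not something our stress tests do:

```csharp
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

class WorkingSetDemo
{
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetProcessWorkingSetSize(IntPtr process, IntPtr min, IntPtr max);

    static void Main()
    {
        Process self = Process.GetCurrentProcess();
        Console.WriteLine("Working set before: {0:N0} KB", self.WorkingSet64 / 1024);

        // (-1, -1) means "trim the working set as much as possible".
        SetProcessWorkingSetSize(self.Handle, (IntPtr)(-1), (IntPtr)(-1));

        self.Refresh();
        Console.WriteLine("Working set after:  {0:N0} KB", self.WorkingSet64 / 1024);
    }
}
```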
Private bytes
A more useful measure is the actual amount of memory consumed by this particular instance of the application and not shared with other processes. It can be backed by physical memory (RAM) or by the page file, and it serves as a more exact measurement of what your process actually consumes.
https://www.itwriting.com/dotnetmem.php
Process\Private Bytes shows all private bytes used by the process (including native memory).
Committed bytes
The .NET CLR Memory\# Total Committed Bytes counter shows the portion of private bytes used for the managed heap.
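Both of these are regular performance counters, so they can be sampled from perfmon or from code. A minimal sketch, assuming the counter instance name is simply "devenv" (with multiple instances running, Windows appends #1, #2, and so on):

```csharp
using System;
using System.Diagnostics;

class MemoryCounters
{
    static void Main()
    {
        // Private bytes of the whole process, managed and native.
        var privateBytes = new PerformanceCounter("Process", "Private Bytes", "devenv");

        // The part of private bytes committed for the managed (GC) heap.
        var committedBytes = new PerformanceCounter(".NET CLR Memory", "# Total committed Bytes", "devenv");

        Console.WriteLine("Private Bytes:           {0:N0} MB", privateBytes.NextValue() / (1024 * 1024));
        Console.WriteLine("# Total committed Bytes: {0:N0} MB", committedBytes.NextValue() / (1024 * 1024));
    }
}
```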
Bytes in all 4 managed heaps
Another useful measurement is the consumed managed memory (the .NET CLR Memory\# Bytes in all Heaps counter). The CLR garbage collector currently has 3 generations: the gen-0 heap, the gen-1 heap, and the gen-2 heap (GC.MaxGeneration currently returns 2). Moreover, for objects larger than 85,000 bytes there is a special fourth heap called the large object heap (LOH). The LOH is not compacted.
Objects in gen-0 are short lived, young, recently allocated instances. As they survive garbage collections, they get promoted to the first and then the second generation. Gen-2 heap contains long lived objects that aren’t collected very often. If you have a memory leak, your leaked objects will most likely end up being promoted to the Gen-2 heap and will stay there.
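A tiny sketch illustrating the generations and the large object heap threshold just described (the 85,000-byte cutoff applies to a single allocation):

```csharp
using System;

class GenerationsDemo
{
    static void Main()
    {
        Console.WriteLine("GC.MaxGeneration = {0}", GC.MaxGeneration); // prints 2

        byte[] small = new byte[1000];    // allocated in gen 0
        byte[] large = new byte[100000];  // > 85,000 bytes, goes straight to the large object heap

        Console.WriteLine("small is in gen {0}", GC.GetGeneration(small)); // 0
        Console.WriteLine("large is in gen {0}", GC.GetGeneration(large)); // reported as gen 2 (LOH)

        GC.Collect();

        // Surviving objects get promoted to the next generation.
        Console.WriteLine("small is now in gen {0}", GC.GetGeneration(small)); // 1
    }
}
```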
GC Handles
Since VS is a mixed managed/native application (increasingly managed with every release), COM interop still plays an important role. System.Runtime.InteropServices.GCHandle is a managed structure that provides a means to access a managed object from unmanaged code. You use GC Handles to prevent the GC from collecting a managed object that is only being referenced from native code. GCHandles are also used to pin an object in memory and prevent the GC from moving it. A typical VS process has around 30,000-40,000 GC Handles, which is normal. However, if we see GC Handles grow to 70,000 and beyond, we likely have a managed/native memory leak somewhere.
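For illustration, the two common uses look roughly like this; every GCHandle.Alloc must be balanced by a Free, which is exactly where handle leaks come from:

```csharp
using System;
using System.Runtime.InteropServices;

class GCHandleDemo
{
    static void Main()
    {
        byte[] buffer = new byte[256];

        // Keep the object alive while unmanaged code references it.
        GCHandle normal = GCHandle.Alloc(buffer);

        // Pin it so the GC can't move it; the address can be handed to native code.
        GCHandle pinned = GCHandle.Alloc(buffer, GCHandleType.Pinned);
        IntPtr address = pinned.AddrOfPinnedObject();
        Console.WriteLine("Pinned at 0x{0:X}", address.ToInt64());

        // Forgetting these Free calls is how GC Handle counts creep up over time.
        pinned.Free();
        normal.Free();
    }
}
```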
CCW
The number of CCWs (COM-Callable Wrappers) is another COM interop counter that we're tracking. If it grows above 2,000, I'll likely start investigating a leak. CCWs and GC Handles sometimes leak together (if one is leaking, the other often is too).
GDI handles
GDI handles are objects used by the operating system to describe brushes, pens, and other elements of the GDI drawing system.
A Windows process can allocate at most 10,000 GDI handles.
If there is a GDI leak, we usually notice it very quickly, because 10,000 handles get exhausted relatively fast. If you notice drawing weirdness or controls that don't refresh correctly, check the GDI objects column in Task Manager for your process. Tracking down GDI leaks is relatively easy: find which operation increases the number of GDI handles, then go through the code and search for places where you forget to call ReleaseDC, DeleteObject, and the like.
Many people are surprised that Visual Studio 2010, a WPF application, can still leak GDI handles. The explanation is that some remaining parts of the UI are still written in native code on top of GDI, and those are typically the parts that leak, if anything does. WPF doesn't consume GDI objects and hence can't leak them.
USER objects
USER objects are very similar to GDI objects in nature; they too are allocated by the operating system (for windows, menus, and the like). To be frank, I've never seen them leak.
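Besides the Task Manager columns, both counts (GDI and USER objects) can be read programmatically via the GetGuiResources Win32 API. A sketch, assuming devenv.exe is the process of interest:

```csharp
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

class GuiHandleCount
{
    const uint GR_GDIOBJECTS = 0;   // count of GDI objects
    const uint GR_USEROBJECTS = 1;  // count of USER objects

    [DllImport("user32.dll")]
    static extern uint GetGuiResources(IntPtr hProcess, uint uiFlags);

    static void Main()
    {
        // "devenv" is assumed to be the process under test.
        Process devenv = Process.GetProcessesByName("devenv")[0];

        Console.WriteLine("GDI objects:  {0}", GetGuiResources(devenv.Handle, GR_GDIOBJECTS));
        Console.WriteLine("USER objects: {0}", GetGuiResources(devenv.Handle, GR_USEROBJECTS));
    }
}
```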
Handles
OS handles are typically created when you open a file, a registry key, a semaphore, or a mutex, or when you create a thread, for instance. The code should close the handle after using it (ever forgotten to close a file?). Handles can be watched in Task Manager; also check out the SysInternals Handle tool.
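In managed code the usual defense is the using statement (or try/finally), which guarantees the underlying OS handle gets closed even on an exception path. A trivial sketch:

```csharp
using System;
using System.IO;
using Microsoft.Win32;

class HandleHygiene
{
    static void Main()
    {
        // Dispose() closes the underlying file handle even if an exception is thrown.
        using (FileStream file = File.OpenRead(@"C:\Windows\win.ini"))
        {
            Console.WriteLine("File length: {0}", file.Length);
        }

        // The same pattern applies to registry keys, mutexes, events, and other handle owners.
        using (RegistryKey key = Registry.CurrentUser.OpenSubKey("Software"))
        {
            Console.WriteLine("Subkey count: {0}", key.SubKeyCount);
        }
    }
}
```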
Threads
The number of threads is worth watching because a typical .NET thread reserves 1 MB for its stack. Threads and VM are typically related: if you see a sudden 1 or 2 MB jump in VM, check the thread count; it's likely the runtime has started one or two new threads.
VS typically has around 30-50 threads. We're working to bring this number down wherever we can, because it's generally better not to start your own threads and instead use the thread pool (ThreadPool.QueueUserWorkItem) or abstractions such as the TPL (Task Parallel Library). Speaking of which, if you have a chance to take Jeffrey Richter's training on threading, go take it. It's an awesome experience.
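For illustration, queuing work to the shared pool instead of starting a dedicated thread looks roughly like this (both calls run on pool threads, so no extra 1 MB stack is reserved per work item):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class ThreadPoolDemo
{
    static void Main()
    {
        // Classic thread pool API.
        ThreadPool.QueueUserWorkItem(state =>
            Console.WriteLine("work item on pool thread {0}", Thread.CurrentThread.ManagedThreadId));

        // TPL (Task Parallel Library) on top of the same pool.
        Task task = Task.Factory.StartNew(() =>
            Console.WriteLine("task on pool thread {0}", Thread.CurrentThread.ManagedThreadId));

        task.Wait();
        Thread.Sleep(100); // give the queued work item a chance to run before exiting
    }
}
```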
In the next post I will talk about how exactly we measure these values and what tools we’re using for this.
Comments
- Anonymous
May 03, 2010
Thanks for the interesting and detailed post, Kirill. I stumbled upon it after searching for VS 2010 problems. I've been using the released version of VS2010 to build entity data models and ASP.NET web applications since April 12th and I'm generally impressed. However, I've had to shut it down and restart it several times. At startup of a web project, VS2010 takes up almost twice as much memory (private working set) as the same project in VWD Express. After opening and working with pretty simple (about 30 tables, 20 entities) database storage schemas and entity models, devenv.exe starts taking up what seems like a ton of memory (over 200 MB). It gets pretty unresponsive, and right-click latency before the menu pops up increases to the point of frustration. When I attempt to close the project, devenv.exe memory goes up to over 500 MB before finally terminating after about 5 minutes. After about a minute or so, the screen image and the icon on the taskbar disappear, but devenv.exe stays in memory a lot longer. So far, I don't feel like I've done anything very "stressful" and VS seems slow and hoggish on memory. I have yet to experiment with vastly more complicated schemas (e.g. with > 100 tables) and entity models, but I presume that Microsoft has. Perhaps there is an issue with my particular system or install, but all else seems well and VWD 2008 still works fine. In any case, I would appreciate additional performance improvements, especially with the user interface responsiveness.