Are you a candidate to run Visual Studio 2010 on a 64-bit OS?
Brian Harry just posted an article on configuring an ideal Visual Studio development machine. You can read about it here: https://blogs.msdn.com/bharry/archive/2010/04/29/your-visual-studio-2010-dream-machine.aspx. By the way, if you scroll down and peruse the comments that customers have posted there, you will see recommendations for several other configuration options, some of which go well beyond the simpler and less expensive ones Brian discussed. If you rely on Visual Studio in your daily work, you may want to give serious consideration to some of these advanced configuration options.
One of the topics Brian addresses is the benefit of running Visual Studio on a 64-bit OS. We’d like to drill into that topic a little deeper in this post.
Visual Studio’s devenv.exe process still runs as a 32-bit process. Under a 32-bit OS, a 32-bit process can only address up to 2 GB of private virtual memory. (For the sake of simplicity, we are going to ignore the /3GB boot option for 32-bit Windows machines.) The remaining 2 GB of the 4 GB virtual address space is reserved for system addresses. This 2 GB maximum is an architectural limit on the size of a 32-bit process. All the code and data that gets loaded has to fit in this 2 GB virtual memory space. This may seem a little strange, but as developers working on Visual Studio, we consider your code – and forms, XAML, DBML, DGML, PDBs for debugging, TFS work items, and whatever else your solution loads – to be the data Visual Studio has to load and process. The problem arises when Visual Studio needs to load some combination of our code and your data that exceeds this 2 GB architectural limit.
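If you want to see this limit for yourself, the Win32 GlobalMemoryStatusEx API reports how much user-mode virtual address space the calling process gets. Here is a minimal C# sketch (our own illustration, not part of the test scenario below) that you could compile as an x86 console app; on a 32-bit OS it reports roughly 2 GB, while a large-address-aware 32-bit process such as devenv.exe sees roughly 4 GB on 64-bit Windows.

```csharp
using System;
using System.Runtime.InteropServices;

// Minimal sketch: report the user-mode virtual address space available
// to the current process via GlobalMemoryStatusEx.
class AddressSpaceReport
{
    [StructLayout(LayoutKind.Sequential)]
    struct MEMORYSTATUSEX
    {
        public uint dwLength;
        public uint dwMemoryLoad;
        public ulong ullTotalPhys;
        public ulong ullAvailPhys;
        public ulong ullTotalPageFile;
        public ulong ullAvailPageFile;
        public ulong ullTotalVirtual;
        public ulong ullAvailVirtual;
        public ulong ullAvailExtendedVirtual;
    }

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool GlobalMemoryStatusEx(ref MEMORYSTATUSEX buffer);

    static void Main()
    {
        var status = new MEMORYSTATUSEX();
        status.dwLength = (uint)Marshal.SizeOf(typeof(MEMORYSTATUSEX));
        if (GlobalMemoryStatusEx(ref status))
        {
            // ullTotalVirtual is the size of the user-mode portion of this
            // process's virtual address space: ~2 GB for a 32-bit process on
            // a 32-bit OS, ~4 GB for a large-address-aware 32-bit process
            // on 64-bit Windows.
            Console.WriteLine("Total user-mode virtual address space: {0:N0} MB",
                status.ullTotalVirtual / (1024 * 1024));
            Console.WriteLine("Unreserved virtual address space:      {0:N0} MB",
                status.ullAvailVirtual / (1024 * 1024));
        }
    }
}
```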
In practice, memory fragmentation prevents a 32-bit process like devenv.exe from ever reaching its 2 GB architectural limit. Due to fragmentation, when a 32-bit process’s address space starts to grow into the 1.7–1.8 GB range, the risk that a virtual memory allocation request will fail starts to increase sharply. When a virtual memory allocation request fails, Visual Studio encounters an Out of Memory error and crashes.
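To see why fragmentation, rather than the raw 2 GB number, is what actually bites, you can walk a process’s address space with VirtualQuery and look at the largest contiguous free region, which is the space any single large allocation has to fit into. The following is a rough sketch of that idea, not code from our test scenario; compile it as x86 so it models the view a 32-bit process like devenv.exe has.

```csharp
using System;
using System.Runtime.InteropServices;

// Sketch: walk the bottom 4 GB of this process's address space with
// VirtualQuery and report the largest contiguous free region. Due to
// fragmentation this number is usually much smaller than the total free
// virtual memory, which is why big allocations start failing well before
// the 2 GB limit is reached.
class LargestFreeRegion
{
    [StructLayout(LayoutKind.Sequential)]
    struct MEMORY_BASIC_INFORMATION
    {
        public UIntPtr BaseAddress;
        public UIntPtr AllocationBase;
        public uint AllocationProtect;
        public UIntPtr RegionSize;
        public uint State;
        public uint Protect;
        public uint Type;
    }

    const uint MEM_FREE = 0x10000;

    [DllImport("kernel32.dll")]
    static extern UIntPtr VirtualQuery(UIntPtr address,
        out MEMORY_BASIC_INFORMATION buffer, UIntPtr length);

    static void Main()
    {
        ulong address = 0;
        ulong largestFree = 0;
        uint mbiSize = (uint)Marshal.SizeOf(typeof(MEMORY_BASIC_INFORMATION));
        MEMORY_BASIC_INFORMATION mbi;

        // Walk region by region until VirtualQuery fails at the top of the
        // user-mode address space (bounded at 4 GB for a 32-bit view).
        while (address < uint.MaxValue &&
               VirtualQuery((UIntPtr)address, out mbi, (UIntPtr)mbiSize) != UIntPtr.Zero)
        {
            if (mbi.State == MEM_FREE && (ulong)mbi.RegionSize > largestFree)
                largestFree = (ulong)mbi.RegionSize;
            address = (ulong)mbi.BaseAddress + (ulong)mbi.RegionSize;
        }

        Console.WriteLine("Largest contiguous free region: {0:N0} MB",
            largestFree / (1024 * 1024));
    }
}
```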
Obviously, this is a scenario you want to avoid. The cleanest way to get around the problem is to run Visual Studio on a 64-bit OS. On a 64-bit version of Windows, the private area of a 32-bit, large-address-aware process like devenv.exe expands to encompass the full 4 GB virtual memory addressing range. Due to fragmentation, you can’t quite get to the 4 GB upper limit either, but the effect of the change is to allow devenv.exe to address twice as much private virtual memory. Please don’t take this as a challenge, but we have yet to see a customer scenario that exhausts the full 4 GB address range.
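A quick way to tell whether you are already getting this benefit is to check the bitness of the OS and of your process; .NET 4, which ships with Visual Studio 2010, exposes both directly. A trivial, purely illustrative check:

```csharp
using System;

// Is this a 32-bit process running on 64-bit Windows, and therefore
// eligible for the ~4 GB private address space described above?
class BitnessCheck
{
    static void Main()
    {
        Console.WriteLine("64-bit OS:      {0}", Environment.Is64BitOperatingSystem);
        Console.WriteLine("64-bit process: {0}", Environment.Is64BitProcess);
    }
}
```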
So, if you think you could be a good candidate for running Visual Studio 2010 under a 64-bit OS, here is what to look for. We will run through a scenario we ran recently on the final RTM version of Visual Studio 2010 Ultimate. The test project is a smallish web solution with a handful of simple ASP.NET pages that use LINQ to query the MS SQL Server AdventureWorks demo database. This is a version of an app one of us built last year, originally for stress testing some Visual Studio components he was working on. We would characterize this app as “borderline realistic.” For example, the AdventureWorks database has fifty or so inter-related tables, and its SalesOrderDetail table contains in excess of 120,000 rows. So it is not a trivial “Hello World” type of app, but the web forms themselves are pretty basic, lacking many of the UI components that you are likely to put into a real-world e-commerce application.
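To give a flavor of the kind of data access the test pages perform, here is an illustrative LINQ to SQL query against the SalesOrderDetail table. It is a hand-rolled stand-in rather than code from the actual test app; the real app uses designer-generated (.dbml) classes, and the connection string below is an assumption.

```csharp
using System;
using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Linq;

// Hand-rolled mapping for the two columns this query needs; the test app
// gets equivalent classes from its .dbml designer file.
[Table(Name = "Sales.SalesOrderDetail")]
class SalesOrderDetail
{
    [Column] public int ProductID;
    [Column] public short OrderQty;
}

class Program
{
    static void Main()
    {
        // Assumed connection string; adjust for your SQL Server instance.
        string connection =
            @"Data Source=.\SQLEXPRESS;Initial Catalog=AdventureWorks;Integrated Security=True";

        using (var db = new DataContext(connection))
        {
            var details = db.GetTable<SalesOrderDetail>();

            // Top ten products by units sold, aggregated server-side over
            // the 120,000+ row SalesOrderDetail table.
            var topProducts =
                (from d in details
                 group d by d.ProductID into g
                 orderby g.Sum(x => (int)x.OrderQty) descending
                 select new { ProductID = g.Key, Units = g.Sum(x => (int)x.OrderQty) })
                .Take(10);

            foreach (var p in topProducts)
                Console.WriteLine("Product {0}: {1} units", p.ProductID, p.Units);
        }
    }
}
```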
Table 1 shows the main steps in our test scenario and the amount of virtual memory Visual Studio consumed at the end of each step. The tool we recommend for measuring virtual memory usage at the process level is VMMap, one of the free Sysinternals utilities. With VMMap, we can see the overall virtual memory usage of the devenv.exe process, broken down according to different types of allocations, which is something we will drill into in a moment.
Here is a summary of the steps of the scenario we ran and the amount of virtual memory allocated by the devenv.exe process at the end of each step.
| Step | Scenario | VM usage (MB) |
|------|----------|---------------|
| 1 | Open VS with an empty solution | 300 |
| 2 | Open VS with the web solution: Solution Explorer, Team Explorer, and Server Explorer open; connected to TFS; one .cs file open in the VS editor; one simple web form open in Split mode; one (.dbml) Data Designer open; one TFS work item open | 910 |
| 3 | Step 2, plus F5 to debug the solution and run to a breakpoint; IntelliTrace active at its low (default) setting | 975 |
| 4 | Step 3, run to the breakpoint, then step 10x and run to a 2nd breakpoint | 1060 |
| 5 | Step 4, with Resharper installed | > 1300 |
Table 1. Virtual memory usage of Visual Studio at the end of each scenario step.
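VMMap remains the right tool for this kind of analysis because it breaks the total down by allocation type, but if you just want to spot-check numbers like those in Table 1 during your own sessions, the process-level counters exposed by System.Diagnostics are enough. A small sketch (our own illustration, not part of the scenario):

```csharp
using System;
using System.Diagnostics;

// Sample the memory counters of any running devenv.exe processes. These
// totals will not match VMMap's per-category breakdown exactly, but they
// are a quick way to watch overall virtual memory and working set growth.
class DevenvMemorySnapshot
{
    static void Main()
    {
        foreach (var p in Process.GetProcessesByName("devenv"))
        {
            Console.WriteLine("devenv.exe PID {0}", p.Id);
            Console.WriteLine("  Virtual bytes:    {0:N0} MB", p.VirtualMemorySize64 / (1024 * 1024));
            Console.WriteLine("  Private bytes:    {0:N0} MB", p.PrivateMemorySize64 / (1024 * 1024));
            Console.WriteLine("  Working set:      {0:N0} MB", p.WorkingSet64 / (1024 * 1024));
            Console.WriteLine("  Peak working set: {0:N0} MB", p.PeakWorkingSet64 / (1024 * 1024));
        }
    }
}
```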
In Step 5, we repeated the full scenario, but this time with Resharper, a popular Visual Studio add-in, installed. As Figure 1 shows, at that point the virtual memory footprint of Visual Studio 2010 exceeds 1.3 GB, and committed bytes exceed 1.2 GB. While that is not a particularly dangerous amount of virtual memory for Visual Studio to consume, it is still enough to start to make you wary.
Figure 1. VMMap report on Visual Studio’s virtual memory usage for Step 5 in the scenario.
From the standpoint of Visual Studio responsiveness, it is worth noting that the high-water mark for the devenv.exe working set was just under 600 MB in Step 5. So this is a scenario that can still fit readily on a machine with the minimum 1 GB of RAM, and it should perform quite well on a machine with 2 GB of RAM.
Drilling into the virtual memory footprint using the VMMap report shown in Figure 1, we see that the largest single contributor is image files, the code that Visual Studio loads. For the scenario in Step 5, Visual Studio needs to load 775 MB of executable code. We zoomed the VMMap detail report in on the Image data and then sorted by image file size. You can see that many of the image files Visual Studio loads are quite large, in excess of 10 MB. Meanwhile, private data areas accounted for only about 220 MB of virtual memory.
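You can do a rough version of that same drill-down in code by enumerating the modules loaded into a running devenv.exe process and sorting them by size. This is only a sketch: compile it to match devenv.exe’s bitness (x86) so module enumeration works, and expect the totals to differ somewhat from VMMap’s Image category.

```csharp
using System;
using System.Diagnostics;
using System.Linq;

// List the image files (DLLs/EXEs) loaded into a live devenv.exe process,
// largest first, and call out the ones over 10 MB.
class DevenvLoadedImages
{
    static void Main()
    {
        var devenv = Process.GetProcessesByName("devenv").FirstOrDefault();
        if (devenv == null)
        {
            Console.WriteLine("devenv.exe is not running.");
            return;
        }

        var modules = devenv.Modules.Cast<ProcessModule>()
                            .OrderByDescending(m => m.ModuleMemorySize);

        long totalBytes = 0;
        foreach (ProcessModule m in modules)
        {
            totalBytes += m.ModuleMemorySize;
            if (m.ModuleMemorySize > 10 * 1024 * 1024)
                Console.WriteLine("{0,-40} {1,8:N1} MB",
                    m.ModuleName, m.ModuleMemorySize / (1024.0 * 1024.0));
        }
        Console.WriteLine("Total image size: {0:N0} MB", totalBytes / (1024 * 1024));
    }
}
```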
One thing to understand about the image files that get loaded in Visual Studio: the more components of the IDE you use, the more code gets loaded. And once it is loaded, Visual Studio has no mechanism to unload components that you are no longer using. The image file footprint just keeps growing throughout your Visual Studio session.
So, take this simple scenario, apply it to a very large project or solution, and exercise a few more Visual Studio components like architectural diagrams and performance profiling, and you may quickly be up against the architectural limit of a 2 GB private process virtual address space. As Visual Studio’s virtual memory usage approaches that limit, the product becomes unstable on a 32-bit OS.
As Brian’s blog discusses, to avoid possible instability problems, the workaround is to install and run Visual Studio on a 64-bit version of the OS. Hopefully, this discussion will help you understand better whether you are a good candidate to run Visual Studio on a 64-bit version of the OS. Running VMMap to gain a more detailed look at Visual Studio’s virtual memory usage in your specific environment can also be useful.
-- David Berg and Mark Friedman
posted May 3, 2010