Analyzing a performance report in Visual Studio Team System 2005 Part 1: The Summary View

In my previous series of blog posts I outlined how to create and configure a new profiling session in Visual Studio Team System 2005. Now I'm going to look at using the various analysis views provided by the IDE to diagnose a performance issue in an application. After running your first profiling session, you should have a .VSP report file in your Performance Explorer window under the "Reports" folder. The IDE will automatically open the VSP file for you after running a performance session, or you can open the file manually by double-clicking it. While the VSP file opens, a progress bar will keep you informed of the loading progress. VSP files, especially those from instrumentation mode, can be very large, so if they are taking longer to load than you'd like, switch to sampling mode or choose a smaller scenario to profile.

 

For a demo application, I downloaded a rational numbers class from GotDotNet. This class has a function that creates and factors several large rational numbers. For my profiling scenario I launched the application, clicked the performance button (in the application, not the IDE) to launch that function, and then closed the app after the results were reported. If you want, you can download the project and add a performance session to it to follow along with the analysis in these walkthroughs.
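To give a feel for the kind of code being profiled, here is a minimal sketch of a rational-number class. This is my own illustrative reconstruction, not the actual GotDotNet source; apart from the reduce() name, which shows up in the profiler output below, every member here is an assumption:

```csharp
// Hypothetical sketch of a rational-number class like the GotDotNet
// sample. Only the reduce() name comes from the profiler output;
// everything else is illustrative.
using System;

class Rational
{
    private long num;
    private long den;

    public Rational(long numerator, long denominator)
    {
        num = numerator;
        den = denominator;
        reduce();
    }

    // Put the fraction in lowest terms using Euclid's GCD algorithm.
    // A routine like this runs after every construction and arithmetic
    // operation, which is why it can dominate a profile.
    private void reduce()
    {
        long a = Math.Abs(num);
        long b = Math.Abs(den);
        while (b != 0)
        {
            long t = b;
            b = a % b;
            a = t;
        }
        if (a > 1)
        {
            num /= a;
            den /= a;
        }
    }
}
```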

 

The VSP file opens like a new source file page in the IDE (shown below). A row of buttons along the bottom of the VSP file switches between the analysis views. The first view opened is the summary view, which provides a quick and easy way to see where the biggest performance issues are cropping up in your program. The summary view looks different in instrumentation mode than in sampling mode (check a previous walkthrough for more info on instrumentation and sampling), so I will cover the two modes separately.

[Screenshot: a VSP report file open in the IDE]

When first profiling an application it is best to start out in sampling mode. Sampling mode perturbs your application less and collects a smaller amount of data. By sampling first you can narrow down the number of binaries you need to look at in instrumentation mode. The screenshot below shows the summary view from a typical sampling run of our performance session.

[Screenshot: summary view from a sampling run]

The summary view for sampling lists the top three functions (this number can be adjusted in the properties page) in terms of inclusive samples and exclusive samples. An inclusive sample is one taken in the listed function or in one of its sub-functions, while an exclusive sample is one taken in the listed function itself. While the inclusive numbers can be useful, the fastest way to find a performance issue is to look at the functions with the highest exclusive counts; that is where most of your processing time is going. Looking at the top exclusive functions, we see that Rational.reduce() is the big offender: almost eighty percent of all samples were taken in that function.
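To make the inclusive/exclusive distinction concrete, here is a small hypothetical example (the function names are invented for illustration):

```csharp
// Hypothetical illustration of inclusive vs. exclusive samples.
class SampleDemo
{
    static void Outer()
    {
        // Samples taken while this line itself runs count as
        // exclusive samples for Outer().
        Helper();
        // Samples taken anywhere inside Helper() still count as
        // inclusive samples for Outer(), because Outer() is on the stack.
    }

    static void Helper()
    {
        // A busy loop here accumulates exclusive samples for Helper();
        // those same samples are inclusive for both Helper() and Outer().
        for (int i = 0; i < 1000000; i++) { }
    }

    static void Main() { Outer(); }
}
```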


In instrumentation mode, the summary view contains three sections: "most called functions," "functions with most individual work," and "functions taking longest." The "most called functions" section lists the functions that were called the most times, along with their call counts. "Functions with most individual work" details the functions with the most time spent inside the function itself, excluding sub-functions. Finally, "functions taking longest" shows the functions that took the most time including all their sub-functions. The screenshot below shows the instrumentation summary view for our demo application. The nice thing about instrumentation is that it captures every function entry and exit, so you get exact counts and timings to analyze, not just samples.

[Screenshot: summary view from an instrumentation run]

Looking at "functions with most individual work," we can confirm what we learned from sampling: Rational.reduce() is the function doing the most work. But we can also glean another useful bit of information from this page. In "most called functions" we see that System.Array.GetUpperBound() is being called almost 87,000 times, far more than any other function.
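We haven't dug into the source yet, so the snippet below is purely illustrative: it shows one common pattern that produces call counts like this (GetPrimeTable() and Process() are hypothetical stand-ins, not functions from the demo), along with the usual fix of hoisting the bound into a local.

```csharp
// Illustrative only: one way a very large GetUpperBound call count arises.
// System.Array.GetUpperBound(0) returns the highest index of dimension 0.
using System;

class BoundsDemo
{
    // Hypothetical stand-ins; the real work happens in the Rational class.
    static int[] GetPrimeTable() { return new int[87000]; }
    static void Process(int value) { }

    static void Main()
    {
        int[] primes = GetPrimeTable();

        // This form re-evaluates GetUpperBound(0) on every iteration,
        // producing one call per element:
        for (int i = 0; i <= primes.GetUpperBound(0); i++)
        {
            Process(primes[i]);
        }

        // Hoisting the bound into a local removes the repeated calls:
        int upper = primes.GetUpperBound(0);
        for (int i = 0; i <= upper; i++)
        {
            Process(primes[i]);
        }
    }
}
```

Whether something like this is actually what's happening in the Rational class is exactly the kind of question the deeper analysis views can answer.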

 

By using the summary views we were able to isolate two possible performance issues in our code. First, we learned that Rational.reduce() takes up the majority of the processing time. Second, we learned that System.Array.GetUpperBound() is called a very large number of times. In the next installment I'll show how to get more information about these possible hotspots using some of the other views provided with the profiler.
