Performance Engineering
Recently, many people have asked me about the performance of LINQ. The questions have ranged from the broad, "How can I analyze the performance of code using LINQ?", to the very specific, "Why is the performance of this code sample not what I expected?"
I want to address these questions, but before I dive into them individually, I want to set up a framework for discussing performance. The framework consists of ideas that are well known and well understood, yet people often forget them when they talk or think about performance.
Performance is critical to the success of any application, and engineering performance is a fundamental part of what developers should be doing on a day-to-day basis. Great performance doesn't just happen. It must have requirements, be designed, implemented, and tested just like any other feature. Individuals and teams must foster a culture of performance.
Performance Requirements
Rico Mariani wrote a great post about why performance must be measured in context. That means that performance goals should be made in relation to what the user sees and cares about. When possible, these goals should then be translated into lower-level goals that are easier to quantify and directly measure, but never take your eye off the original goals. Often when performance is tuned in one dimension it degrades in another. It is like trying to cram a whole bunch of stuff into a tiny space: you may push it in here, but it pops out over there. So how does a programmer know which dimension is most important? Consider the performance in context. Furthermore, the performance tuning that a programmer makes might not even matter when considered in context, so spend your time where it counts. Finally, it is much easier to weigh trade-offs between performance requirements and other requirements when the performance requirements are in context. Set up a performance budget in terms that are material and important to the customer.
Designing for Performance
Designing for performance is important. Many performance problems can be fixed as they are identified, but some are deeply rooted in poor design choices made early on that are difficult to resolve later. Care must be taken, though, because often these poor design choices were themselves made for the sake of "better performance". For example, in an application with a lot of data parallelism, it doesn't make sense to introduce so much complexity with stateful operations that concurrency is precluded.
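To make that concrete, here is a minimal sketch (hypothetical code, not from any real application) contrasting a stateful operation, which forces strictly sequential evaluation, with a stateless one that a parallel implementation such as PLINQ could partition freely:

    using System;
    using System.Linq;

    class StatefulVersusStateless
    {
        static void Main()
        {
            int[] data = Enumerable.Range(0, 1000).ToArray();

            // Stateful: the lambda mutates shared state, so the elements must
            // be processed strictly in order; the work cannot be partitioned
            // across threads without changing the results.
            int runningTotal = 0;
            int[] prefixSums = data.Select(x => runningTotal += x).ToArray();

            // Stateless: each element is computed independently, so a parallel
            // implementation is free to split the input into chunks and
            // process them concurrently.
            int[] squares = data.Select(x => x * x).ToArray();

            Console.WriteLine(prefixSums[999]);  // 499500
            Console.WriteLine(squares[999]);     // 998001
        }
    }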
Most of these poor design choices are caused by the design not solving the right problem (performance requirements) or because the requirements changed over time and the earlier design made it difficult to adapt to the new requirements.
Understanding Performance
There are at least two basic tools for understanding performance.
1. Asymptotic Analysis
I am sure that my readers know and love asymptotic analysis. It is important that developers know what they are doing when they call a System.* function: calling a function with O(n) time complexity n times yields O(n^2) behavior. Know what is going on in terms of both space and time.
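As a concrete sketch (hypothetical code, deduplicating a list two ways; the class name and counts are made up), the first loop below calls List<T>.Contains, which is O(n), once per element, turning an innocent-looking linear loop into a quadratic one. A HashSet<T> brings it back to linear:

    using System;
    using System.Collections.Generic;
    using System.Diagnostics;

    class HiddenQuadratic
    {
        static void Main()
        {
            var source = new List<int>();
            for (int i = 0; i < 50000; i++) source.Add(i); // worst case: all distinct

            var watch = Stopwatch.StartNew();

            // List<T>.Contains scans the list, so it is O(n); calling it once
            // per element makes this loop O(n^2) overall.
            var unique = new List<int>();
            foreach (int item in source)
                if (!unique.Contains(item))
                    unique.Add(item);

            Console.WriteLine("List.Contains: " + watch.Elapsed);

            watch = Stopwatch.StartNew();

            // HashSet<T>.Add is O(1) on average, so the same logic is O(n).
            var seen = new HashSet<int>();
            var distinct = new List<int>();
            foreach (int item in source)
                if (seen.Add(item))
                    distinct.Add(item);

            Console.WriteLine("HashSet: " + watch.Elapsed);
        }
    }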
2. Benchmarks / Profilers
So often when a performance problem exists, developers will convince themselves that they know what the problem is without accurately measuring the behavior of the system. They will proceed to "fix" the problem only to discover that what they thought would be of great benefit is in fact unimportant.
I've done that. When C# 1.0 was in beta back in 2000, I was working at a small company that decided to try out this new .NET stuff. My first application was a good-sized application that included a custom scripting engine. The performance wasn't great, and so I set about fixing it. I was sure that I knew what the problem was, so I began improving some algorithms in various complex parts of the system. But I found that it didn't help matters at all.
At this point, I broke out a profiler and began measuring the actual behavior of the system. The surprising result (at the time) was that pretty much all of the performance problems came from a small loop that was building up the text buffer for the scripting engine. It was using the + operator on strings. Since this was my first .NET application, I had no idea that concatenating hundreds of thousands of strings was a bad idea.
When I changed the code to use a StringBuilder instead, it sailed. I made a few other improvements (all of which were targeted and very small), and the application was running fine.
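For readers who haven't hit this before, here is a small repro of that kind of loop (the string and the count are made up for illustration; the original scripting engine was far bigger). Each += allocates a new string and copies the whole buffer, so the first loop is O(n^2), while StringBuilder appends in amortized O(1), making the second loop O(n); the gap widens dramatically as the count grows toward the hundreds of thousands mentioned above:

    using System;
    using System.Diagnostics;
    using System.Text;

    class Concatenation
    {
        static void Main()
        {
            const int count = 20000;

            var watch = Stopwatch.StartNew();
            string buffer = "";
            for (int i = 0; i < count; i++)
                buffer += "line\n";          // copies the entire buffer each time: O(n^2)
            Console.WriteLine("operator+:     " + watch.Elapsed);

            watch = Stopwatch.StartNew();
            var builder = new StringBuilder();
            for (int i = 0; i < count; i++)
                builder.Append("line\n");    // amortized O(1) per append: O(n) overall
            string result = builder.ToString();
            Console.WriteLine("StringBuilder: " + watch.Elapsed +
                              " (" + result.Length + " chars)");
        }
    }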
Now the point of all of this is not that the + operator on strings should not be used. If that were true then we would make the C# compiler issue a warning when you used it. The point is that a programmer should be aware of the costs and tradeoffs involved with a programming decision and act accordingly. Measurement is a powerful tool. Knowing where the problems lie is the key to success. Measure early and often.
The result of profiling is a large set of various statistics about an application. If the measurements are taken in context then they are a powerful tool. As with any statistics, what is being measured is as important as the actual measurement.
Analyzing an application is very rewarding. You are able to see the material benefits of any improvement in a very quantitative way. Apply the scientific method when analyzing a problem: once you suspect you know what the problem is, treat that suspicion as a hypothesis and test it by observing the data (profiling data) from an operation (the suspect code) across multiple trials.
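Here is a minimal sketch of what "multiple trials" can look like in practice (the harness, names, and workload are hypothetical, not a substitute for a real profiler): time the suspect code several times so that a difference is observed consistently rather than inferred from a single noisy run.

    using System;
    using System.Diagnostics;

    class TrialRunner
    {
        // Time an operation across several trials so that a difference can be
        // observed consistently rather than inferred from a single noisy run.
        static void Measure(string name, Action operation, int trials)
        {
            operation(); // warm-up run so JIT compilation doesn't skew trial 1
            for (int i = 0; i < trials; i++)
            {
                var watch = Stopwatch.StartNew();
                operation();
                watch.Stop();
                Console.WriteLine("{0}, trial {1}: {2} ms",
                                  name, i + 1, watch.ElapsedMilliseconds);
            }
        }

        static void Main()
        {
            Measure("suspect code", delegate
            {
                string s = "";
                for (int i = 0; i < 20000; i++) s += "x";
            }, 5);
        }
    }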
Try to think big and understand not only what the problem is but why it is a problem. The best solutions attack not only the symptoms but the fundamental causes of the problem. Look to understand the nature of the problem.
Some Good Performance Resources
Rico Mariani's (Visual Studio Performance Architect) Blog
Vance Morrison's (CLR Performance Architect) Blog
Comments
Anonymous
March 20, 2007
LINQ to Objects just does nested loops. There's no optimisation at all. Not recommended for performance-critical situations.
Anonymous
March 21, 2007
Last week the Microsoft MVPs converged on Redmond from all corners of the globe. It was a great occasion.
Anonymous
March 21, 2007
Waiting for your next posts about performance. I met Erik at QCon last week; he talked about some of the ideas behind PLinq. We had a good talk, and I was happy talking to him. I told him that I am waiting for you to post some examples explaining the PLinq idea, and I also told him how helpful your blog is :) He talked about designing a single tier and deploying on multiple tiers! I quite like the idea, but there should be some rules, I guess, so that it can be distributed. Anyway, I hoped that you'd be there, Wes, but didn't find you :( You can find some notes taken live and photos from the presentation at http://sadekdrobi.com/?p=116 I also liked Erik's last talk about the gap between business people and developers :)
Anonymous
March 21, 2007
Erik is an awesome guy. I always enjoy talking with him. I have so much that I want to blog about...now if I could only finish polishing Orcas so I had the time.
Anonymous
March 21, 2007
Wes, you mentioned you might find out if Joe Duffy could post a status update on PLINQ; would that be possible? Or is PLINQ something that Microsoft / Joe just aren't ready to talk openly about yet? Oh, and when's your blog series on writing LINQ providers due? [Apologies for the 'hungry' email :-)] Kind regards, tom
Anonymous
March 21, 2007
No apologies necessary. I passed along the request. I want to write about LINQ providers soon. The problem is my backlog of stuff is about 6 posts in the future and I have been working hard to polish up Orcas. I just hit my personal ZBB (zero bug bounce) tonight. So I expect to be able to make some progress in the near future.
Anonymous
March 21, 2007
Ah, your ever-hungry customers. We want our cake and eat it too! ;-) Awesome to hear Orcas is coming along well! Perhaps a partial method could be defined that would allow a Wes.Fork() operation, allowing you to (concurrently, obviously) write, code, and spend time with the family! :-)