Software Security: When good neighborhoods go bad
I guess this entry would be a zombie blog, since I wrote part of it live and part of it after the session (Zombies are 1/2 live and 1/2 dead, get it? Groan. Haven't finished my latte yet; joke-u-lator not yet running at full steam).
I'm now sitting in Martyn Lovell's C++ security talk. Martyn is the dev lead for the C++ libraries team. When we're not in Vegas, he sits a couple doors down from me. Martyn discusses the inherent insecurity of much of the C runtime library (CRT), how security needs have changed development processes at Microsoft, and how the VC++ compiler and libraries can help you with security. Notably, the VC++ 2005 compiler can provide good protection against buffer overflow-type exploits, and the libraries offer more secure versions of unsafe CRT functions, better parameter value checking, safer STL iterators, and the like. Good stuff.
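To make that concrete, here's a minimal sketch of the safer CRT calls he described, assuming the Visual C++ 2005 CRT (the copy_name function and the buffer size are just illustrative):

    #include <cstdio>
    #include <cstring>

    void copy_name(char* dest, size_t destSize, const char* untrusted)
    {
        // Classic CRT: nothing stops strcpy from writing past the end of dest.
        // strcpy(dest, untrusted);                 // potential buffer overrun

        // Secure CRT: the destination size travels with the pointer, so an
        // oversized source is treated as an error (the CRT's invalid-parameter
        // handling kicks in) rather than silently smashing adjacent memory.
        strcpy_s(dest, destSize, untrusted);
    }

    int main()
    {
        char buffer[32];
        copy_name(buffer, sizeof(buffer), "hello from Vegas");
        std::printf("%s\n", buffer);
        return 0;
    }

The compiler nudges you in this direction, too: call the old strcpy in VC++ 2005 and you get a deprecation warning pointing you at the _s variant.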
I've been interested in the topic of software development security for years. I've written and spoken a bit about it, trying to impress on developers around the world the importance of building hardened software. I spoke to Martyn after the talk and we both bemoaned the fact that far too many developers still don't "get" the importance of writing secure software.
It occurred to me that the world of software development security is like a good neighborhood gone bad. When you moved into the neighborhood, there was little crime, the neighbors all knew one another, and everyone left their front door unlocked. Over the course of a few years, the neighborhood has gone to pot: break-ins and burglaries are commonplace, neighbors no longer trust one another, and con artists abound. Despite this, some residents have yet to be victimized, and they're so busy with their lives that they haven't looked up to see how bad things have gotten, so they've never been moved to take action to help change things.
Such is the state of software development. We used to write software with the utmost concern for user productivity and ran our schedules to get the maximum number of features into the product. Now the bad guys look for exactly that kind of well-intentioned software and use it to commit crimes: ID theft, DoS zombies, viruses, worms, and other nasties. Some software organizations choose to build software securely in order to curtail these industry-wide problems. Others believe it can't happen to them, or that it's not their problem, and continue merrily on their way.
Wrong, wrong, wrong! The less trust users can put in their computers, the worse off all of us who write software for a living will be. The more people can depend on their computers, the more they will do so, the more software they will buy, and the stronger our industry becomes. Conversely, if they can't depend on their computers, our whole industry ultimately weakens.
As I said, there are companies that do indeed get it. Almost all information security companies get it because their brand depends on them getting it. Most companies that build server-based software get it because they're among the most vulnerable. ISVs that have had their software exploited in a very public way tend to get it. Companies that have had to spend a lot of money defending against attacks also get it. There are even a number of enlightened companies that don't fit into any of these categories that still get it.
But there are a scary number of companies that don't get it. I hear lots of excuses... "We're desktop software. We don't talk to the network. If they compromise our software, they already have control of the console anyway." The bottom line is that if your application deals with any data that is ever outside of its complete control, it's vulnerable. For example, if your software listens on the network, reads from the network, reads files, takes keyboard or mouse input, or processes Windows messages, it is potentially vulnerable. If one of your customers is compromised when using your application to open a malformed file, it's your fault. You probably won't be legally liable (as the law is written today, anyway), but it certainly won't be good for PR, your brand, or your product sales.
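Here's a hedged sketch of that malformed-file scenario, assuming a hypothetical record format with a length prefix (read_record and the sizes are made up for illustration). The point is that a length read from a file is attacker-controlled data and has to be validated before it's used:

    #include <cstddef>
    #include <cstdio>

    bool read_record(std::FILE* f)
    {
        unsigned char header[4];
        if (std::fread(header, 1, sizeof(header), f) != sizeof(header))
            return false;

        // Length claimed by the file -- the attacker picks this number.
        std::size_t claimed = (std::size_t(header[0]) << 24) |
                              (std::size_t(header[1]) << 16) |
                              (std::size_t(header[2]) << 8)  |
                               std::size_t(header[3]);

        char payload[256];

        // Unsafe: trusting 'claimed' blindly lets a malformed file overrun payload.
        // std::fread(payload, 1, claimed, f);

        // Safer: check the untrusted value against the real buffer size first.
        if (claimed > sizeof(payload))
            return false;                   // reject the malformed record

        return std::fread(payload, 1, claimed, f) == claimed;
    }

The same principle applies to network packets, window messages, and anything else that crosses your application's trust boundary.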
Luckily, the mitigation strategies are pretty straightforward. You have to care about security. Developers in your organization need to learn how to design and write secure software. Make security a part of your process. And take advantage of security tools, many of which are already built into your compiler and libraries.
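On the library side, a lot of that checking is already there if you use it. A minimal sketch, assuming VC++ 2005's standard library (where checked iterators are on by default via _SECURE_SCL):

    #include <cstdio>
    #include <stdexcept>
    #include <vector>

    int main()
    {
        std::vector<int> scores(4, 0);

        // at() validates the index and throws instead of corrupting memory.
        try {
            scores.at(10) = 42;             // out of range
        } catch (const std::out_of_range&) {
            std::puts("index rejected");
        }

        // With checked iterators, walking an iterator past end() and
        // dereferencing it is caught at run time instead of silently
        // touching whatever sits next to the vector's storage.
        return 0;
    }

Add the compiler's buffer security checks (the /GS switch, on by default in VC++ 2005) and you get a decent safety net essentially for free.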
It's time to join neighborhood watch.