Do SDETs dream of electric sheep?
I read Joel Spolsky's blog from time to time. I am a little embarrassed to admit that I took the following entry a bit personally.
Apparently—and this is all based on blog rumors and innuendo—Microsoft has had a long term policy of eliminating all software testers who don’t know how to write code, replacing them with what they call SDETs, Software Development Engineers in Test, programmers who write automated testing scripts.
The old testers at Microsoft checked lots of things: they checked if fonts were consistent and legible... they checked whether the screen flickered when you did things, they looked at how the UI flowed, they considered how easy the software was to use, how consistent the wording was, they worried about performance, they checked the spelling and grammar of all the error messages, and they spent a lot of time making sure that the user interface was consistent from one part of the product to another, because a consistent user interface is easier to use than an inconsistent one.
None of those things could be checked by automated scripts. And so one result of the new emphasis on automated testing was that the Vista release of Windows was extremely inconsistent and unpolished. Lots of obvious problems got through in the final product… none of which was a “bug” by the definition of the automated scripts, but every one of which contributed to the general feeling that Vista was a downgrade from XP.
- from Joel On Software
I wanted to take a little time to cut through the blog rumors and innuendo and shed some light on the SDET role at Microsoft.
I don't think it's really accurate to call the automated tests that we write "scripts". We write programs to test programs, and sometimes those programs have quite a bit of complexity. We have programs that put Windows Mobile through its paces for hours at a time, making calls, browsing the web, and sending e-mail. Beyond obvious things like calling APIs with various arguments and checking their return values, we have automation that runs features against enormous data sets and measures their performance. We have automation that runs through entire end user scenarios and verifies that everything works. We have automation that can check each screen of the product against a database of previously saved images and e-mail me if a single pixel moves out of place. In some situations, writing the automation to test a feature can be more technically challenging than writing the feature in the first place.
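To give you a flavor of what that last kind of check looks like, here's a minimal sketch in Python using the open source Pillow imaging library. To be clear, this is not our actual harness - the directory layout, names, and reporting here are made up for illustration - but the core idea of diffing a captured screen against an approved baseline is the same.

```python
# Simplified sketch of baseline screenshot comparison. NOT our internal
# tooling; directory names are hypothetical. Requires Pillow (PIL).
from pathlib import Path
from PIL import Image, ImageChops

BASELINE_DIR = Path("baselines")   # previously approved screenshots
CAPTURE_DIR = Path("captures")     # screenshots from the current test run

def diff_report(name):
    """Return a description of what changed, or None if the screens match."""
    baseline = Image.open(BASELINE_DIR / (name + ".png")).convert("RGB")
    capture = Image.open(CAPTURE_DIR / (name + ".png")).convert("RGB")
    if baseline.size != capture.size:
        return "%s: size changed %s -> %s" % (name, baseline.size, capture.size)
    # difference() is black wherever the two images agree; getbbox() returns
    # the bounding box of the non-black region, or None if nothing changed.
    bbox = ImageChops.difference(baseline, capture).getbbox()
    if bbox is not None:
        return "%s: pixels changed inside region %s" % (name, bbox)
    return None

if __name__ == "__main__":
    for baseline_file in sorted(BASELINE_DIR.glob("*.png")):
        report = diff_report(baseline_file.stem)
        if report:
            print(report)   # a real harness would e-mail the owner or file a bug
```

In practice the diff itself is the easy part; most of the engineering goes into managing the baselines - screens legitimately change every build, so a real system needs a workflow for reviewing and approving new images, masking volatile regions like clocks, and covering every supported screen size and language.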
In the end, our software isn't used by robots. People buy our software and people use it, so we need to make sure that people will like it. Joel is totally correct that there are some dimensions that are difficult to measure with automation. We use a variety of means to mitigate that risk, such as:
- bug bashes - a broad set of people set aside a block of time to explore the product or dig into a specific feature
- ad-hoc testing - spending time exploring your feature manually or with the aid of tools
- end user betas - wide scale beta testing for external users
- enterprise TAP programs - focused betas for our enterprise customers
- usability studies - we bring in users and observe how they interact with the product
- dogfood - we use the product to do our jobs, every day
The dogfood program is probably one of our best sources for that kind of bug - if you think the flames and vitriol directed at our products on the Internet are bad, you should feel the wrath of a Microsoft employee who is cranky that something is too slow or takes too many clicks to use. Factors that Joel mentions, like perf, usability, consistency, and flow, are all in bounds for the developer, program manager, and tester to file as bugs against a feature, but we also get plenty of feedback along those lines from the dogfood programs. They're definitely all "bugs" by our definition of quality.
As an SDET, you get free rein to write code that will push our products to the breaking point, as well as to put yourself in the customer's shoes and make sure that we delight them. Does that interest you? We'd like to hear from you.
Further reading:
- Are you a good enough developer to be a Microsoft SDET?
- 3 days in the Life of a SDET
- A Day in the Life... of an SDET
- Tester Center - check this out. It's our new site, all about the technical side of testing.
Nitpicker's Corner:
- I didn't work on Vista. I can't make authoritative statements about it.
- I didn't make the decision to migrate from STE to SDE/T and I can't make official comments on it.
- It's very likely that I can't do anything about a particular dialog box that may be misaligned in a Microsoft product.
Comments are moderated - please be civil and on-topic.
Scott - SDE/T, Windows Mobile Security
Comments
Anonymous
December 23, 2007
The only big concern I have with "dogfooding" by MSofties on the MS campus is that it is a very narrowly focused segment of users, IMHO. I expect you typically are working with Exchange and Server Sync, which may or may not be representative of the majority of users. It is not representative of me. I'm stuck with WMDC. (No need to send a sympathy card.) I also understand that internal dogfood releases often come quite rapidly, which really doesn't give the victim a chance to do much but play with what it is, how it is. That is, you don't get much chance to customize your device as a regular user might, to see the issues that arise. I appreciate part of this is that you don't have the time, because you love your jobs and are at them 16 hours a day. The problem is that keeping a calendar and email sync'd with Exchange is not where the issues are. The solution to that would be to have folks whose sole job it was to play with the latest release as a typical user might, i.e. not someone with a 'real' job at MS, and possibly not on the MS network. That sounds like the test environment you got rid of.
Sven, MS MVP Mobile Devices
Anonymous
December 24, 2007
The comment has been removed
Anonymous
December 25, 2007
The comment has been removed
Anonymous
December 26, 2007
The comment has been removed
Anonymous
February 11, 2008
When all's said and done, the fact remains that somewhere around 90% of an SDET's time is spent on automation-related stuff, which has whatever consequences it has.
Anonymous
February 11, 2008
I don't know that I'd call that a fact. I certainly don't spend 90% of my time maintaining automation.
Anonymous
March 02, 2008
It may not be a fact, but it certainly is a perception. Whenever a dev files an "Ad-Hoc" bug, (s)he can't help but think: "Why didn't a tester find this bug first? Is it because they're busy maintaining automation instead of clicking around?" Let me give you an example - for a while in one team, developers were filing a majority of the setup bugs. Testers were using scripts to install on clean machines that got rebuilt every day, while developers often had dirty machines, and sometimes even ran setup interactively through the GUI. This is not a good state of affairs, especially since setup forms a customer's first impression of your software! (And since customers' machines are never as clean as a fresh install.) Now, Windows Mobile has less of a setup problem, but there's probably an equivalent.
Anonymous
April 03, 2008
I was just hired as an SDET intern at Microsoft, and while I'm certainly glad that Microsoft takes testing seriously, I believe that there are some serious problems with Vista (which, mind you, I use every day as my primary OS):
- UI inconsistency. For example, there's lots of UI left over from XP for preferences (display settings, the system control panel, mouse control panel). Sometimes control panels open in a separate window. Sometimes they don't.
- UI annoyance for experienced users. Vista makes it too hard, for example, to set up a static IP for a network adapter: Start->Control Panel->Network and Internet->Network and Sharing Center->Manage Network Connections->Right click on connection->Properties->Elevate->Click on IPv4->Click Properties. That's 10 clicks. In XP, it's Start->Control Panel->Network Connections->Right click on connection->Properties->Click on IPv4->Properties. That's 6 clicks, 4 fewer than Vista.
- Vista is an I/O HOG. After I close a RAM-hungry application (games in particular), Vista tries to fill the freed memory using SuperFetch. But even though the I/Os are at a low priority, they turn any sequential I/O that the user is doing (e.g. copying a file) into random I/O. Sometimes SearchIndexer does this, too.
- Vista takes WAY too long to shut down. 20-30 seconds on some systems I use.
- UAC elevations need a progress bar when checking signatures on big executables. It's confusing to double-click on a big installer and have to wait 2 minutes for elevation, with no indication of what's going on.
Vista's #1 problem is performance. It's not that it's slow, it's that it's inconsistent.