Breaking into Apple
Larry's in a curmudgeonly mood today, so...
Over the weekend, I noticed this post on Digg: "How I learned to break into Apple and Code for them without Permission".
It's an "interesting" story, and I have to say that I was aghast when I read it. And my jaw dropped even further when I read the Digg comments about it (yeah, I know - self selecting trolls, just like on /., but...).
For some reason, the people reading the story actually thought it was COOL!
Now think about it. These people were so dedicated to their product that they were willing to BREAK THE LAW to get it into Apple's OS.
That's cool (sort-of, if you're willing to condone breaking the law).
But what does it say about Apple's internal controls when the QA process for the final disks lets things like this happen:
Once again, my sanity was saved by the kindness of a stranger. At 2:00 one morning, a visitor appeared in my office: the engineer responsible for making the PowerPC system disk master. <snip> He told me that if I gave him our software the day before the production run began, it could appear on the Golden Master disk. Then, before anyone realized it was there, thirty thousand units with our software on the disks would be boxed in a warehouse. (In retrospect, he may have been joking. But we didn't know that, so it allowed us to move forward with confidence.)
Wow. So the contents of the software on the FINAL MASTERS for the operating system can change on the whim of a single release manager? Doesn't anyone ever check this stuff?
In addition, what was the state of Apple's physical security? Admittedly this was 1994, and things were somewhat looser, but still... I'm sorry, but if you've got random ex-employees running around the halls spending HOURS with access to your CORPORATE NETWORK, what does that say about the level of security in your physical plant? Apple got hugely lucky that these guys weren't bent on corporate espionage.
Don't get me wrong: I'm sure that Graphing Calculator is a SERIOUSLY cool app, and it's clear that at some point a formal decision was made to include it in the product, at which point it got the attention it deserved (as the article mentions, Apple eventually decided to ship it).
Think of what would have happened if Apple hadn't: once the decision was made to include it in the product, the project got test resources, localization resources, usability testing, etc. None of that would have happened if it had continued as a "skunkworks" project, and they'd have shipped a product with serious flaws.
And this is a model we're supposed to admire?
Comments
- Anonymous
August 07, 2006
Larry, come on, it's not that clear cut. I used to work at Apple; I met Ron there and heard the story firsthand. I was also acquainted with Greg.
0. These guys were known quantities to important individuals on the system software team at Apple. If they weren't, they'd have been kicked out of the building.
1. Controls of various sorts were much looser at Apple then. They've been tightened up considerably since.
2. People in glass houses. Do you really want to encourage Slashdot trolls? Post about law breaking on an MS blog. - Anonymous
August 07, 2006
The comment has been removed - Anonymous
August 07, 2006
Chris, to me it is. Apple CANCELED their project. Apple made an executive decision to not produce the product.
And the developers decided to override it by BREAKING INTO APPLE.
There are a bazillion other ways they could have handled it: Building it on their own and trying to convince Apple to put it in the product, etc. But no, they chose to STEAL from Apple. And the policies of Apple were sufficiently lax that they let them get away with it.
John, that's apples and oranges. There's a HUGE difference between an Easter egg inserted by employees and someone breaking into a building and STEALING company resources for a skunkworks project. And security holes have absolutely nothing to do with an almost total lack of control over process and facilities. - Anonymous
August 07, 2006
The comment has been removed - Anonymous
August 07, 2006
Think about it though. These guys were so loyal (in their minds) to this company and its OS that they were willing to bypass security to make the product better.
The question isn't why Apple was so bad at keeping these types of people out. It should be why Microsoft can't engender the same type of loyalty and fanaticism that Apple and the MacOS can.
Just like the commercials with the two computer guys. The Mac guy is self-assured to the point of cockiness. He believes he is the best OS out there with everyone else following his lead. Mac fanatics believe this. They see the OS as something that they personally have a stake in.
From a project management perspective, having rogue developers and secret deliveries is the last thing you want because it brings in too many unknown variables (QA being a huge issue, as you mentioned). But knowing that people are breaking down your doors to make your product better is a huge boost to the community surrounding that product. This is one major positive point to OSS. The community rallies around the project because in a real sense they own the project.
Is it insane that this type of thing was allowed to occur? You bet. But it is also incredibly cool. - Anonymous
August 07, 2006
Larry, I see your point and I agree in a "they broke the rules and must be spanked" way. Yet, I have to admire their dedication. They were looking for every possible way to ship their bits. Big companies like Microsoft look for reasons to NOT ship bits. Mark Lucovsky's article about shipping software comes to mind:
http://mark-lucovsky.blogspot.com/2005_02_01_mark-lucovsky_archive.html - Anonymous
August 07, 2006
Lauren, that's a very valid point, and you ARE right, the passion shown by these guys is very impressive. I just seriously disagree with their methods, and the environment that allowed those methods to succeed.
David, the answer depends on who's making the cool toy. In general, a team in Windows owns each of the features. There are also different vehicles for releasing products. You already mentioned the PowerToy example - because PowerToys aren't supported, the quality bar for them is much lower. Another example of a distribution mechanism is the Windows Ultimate Extras stuff, which has a much higher quality bar.
For getting a feature into the product (and continuing your example, adding the PowerToy Calculator would be considered a feature), you need a series of specs (functional spec, design spec, test plan, threat model) for the app, and you need dev and test signed up to work on it. You also need approval from the set of people who are authorized to approve the feature. There's other stuff that has to happen as well, but that's well out of scope for a comment. - Anonymous
August 07, 2006
Larry,
If you had known that the bug you fixed would have caused all the havoc during the infamous "text to speech" demo you recently blogged about, and had an opportunity, by breaking into an office at MS, to place a fixed version on the demo machine before the demo happened, would you have? - Anonymous
August 07, 2006
The comment has been removed - Anonymous
August 07, 2006
"We wanted to release a Windows version as part of Windows 98, but sadly, Microsoft has effective building security."
I know this to be true from personal experience. I was working on a Saturday once and wore an IBM T-shirt from my last job. A security guard stopped me and asked to see my badge. :-)
Lauren: Microsoft does have that kind of fanatical loyalty to its products among some of the employees, but sometimes it gets dampened by bureaucracy. - Anonymous
August 07, 2006
I think you're focusing on the negative in a hero story from olden days, when standards weren't the same. It doesn't smell nice coming from an MS developer. In fact, it stinks. - Anonymous
August 07, 2006
Here's another way of looking at it.
A security system succeeds if it lets in the good guys (people who contribute) and keeps out the bad guys (people who destroy). Apple's bureaucracy, as described in the story, failed at this by keeping these guys out even though they had a positive contribution to make and were willing to do so at no one's expense. The reality at Apple - the people working there - corrected this system failure informally by ignoring the bureaucracy where it was harmful. The rules wouldn't have been ignored had they been generally perceived as good.
What Larry is proposing is that a system designed to be suboptimal should perform suboptimally, as designed. I propose that such anecdotes serve to teach us that systems should be designed around people, and not the other way around. If a company has people able to produce good work and willing to do it for free when the company's financial reality isn't good enough to payroll the work, then the company and everyone else is better off to the extent that it manages to capture this creativity, and worse off to the extent that it blocks it.
Bureaucracies are usually created with the most common cases in mind, not the exceptions. To the extent that bureaucracies fail to gracefully handle exceptions, people that comprise those bureaucracies should compensate. After all, a foolish consistency is the hobgoblin of little minds. - Anonymous
August 07, 2006
You are being a bit curmudgeonly today. Both American and hacker cultures make heroes of people who bend and break the rules. If you break the rules and accomplish something generally perceived to be "good", you're a hero. You can even engage in outright crime and be admired for your daring if you're bold enough (D. B. Cooper). The Graphing Calculator story is old history (thanks to Internet time); people are admiring people who wrote Good Software and weren't afraid to Break The Rules To Do It. (Sounds like an ad for a movie.) So, yeah, what they did was cool; just like D. B. Cooper.
I don't think people are quite as irrational as you might fear. The author wisely didn't post the story for several years. If it were revealed that something like this had happened last year, I believe the response would be different. While a subset might still admire the work, many more people would worry about the security of the operating system. - Anonymous
August 07, 2006
"Over the weekend, I noticed this post on Digg: "How I learned to break into Apple and Code for them without Permission"."
Gawd, it makes you wonder how many other people were slipping into the place to write software. - Anonymous
August 08, 2006
Why is anyone getting worked up over this? It happened 12 years ago. What's the point, that 12 years ago the QA controls Apple had in place were bad? 12 years ago everything in software was bad, or perhaps I should say even worse than now. - Anonymous
August 08, 2006
The comment has been removed - Anonymous
August 08, 2006
In answer to the question you ended with, no, this is not a model we're supposed to admire. It's not a model at all; it isn't meant to describe the way Apple planned in the past, or plans now, to develop software. We're supposed to be amused by the pluck of the rogue programmers and the others who helped them, and, I think, to wonder a bit at how Apple so bungled the dismissal of its contractors that they kept coming to work for months. - Anonymous
August 08, 2006
The comment has been removed - Anonymous
August 08, 2006
I figured this comment thread would be interesting. There are two factors that come up for me.
The first is developers placing their judgment ahead of the users and the responsible organization. Whatever the sense of commitment and sense of rightness, subverting the system removes all checks and involves the users and customers without anything like their consent. I first noticed this back when "tyranny of the technician" became part of the language in conjunction with more serious matters (like arms-control policies and building things like Star Wars). There is considerable arrogance around such actions, it would seem, or in the case at hand, gross carelessness justified by the rightness of a course of action.
My first thought was about a later experience. For a time I was in an organization that built serious hardware products with embedded computer systems. As a manufacturing company, it had a pretty serious product-development process that ran from product conception to turnover to manufacturing. There was also a larger customer-delivery process. I was impressed to learn of the second, but as a participant in the first, I quickly recognized that engineering was perturbing the second without any righteous (let alone virtuous) feedback from the customer-delivery side regarding maintainability, usability, etc. The engineers were pretty much making it up. And it was interesting to see some pretty major projects get cancelled late in product development because of a serious operability failing.
I think programmers often disdain the serious non-development considerations that enter into deploying and supporting a software product. I know I did. I still tend to approach software for its own sake. It may be great for me and satisfying, but it is not the same as putting in the extra effort it takes to satisfy real customers and users with a reliable product.
It would be great to have a lighter-weight means for putting software into people's hands where it is not worth it to either the developers or the users to carry the burden of productization and hardening, as well as the costs raised by commercial deployment and support. It looks like Microsoft is slowly introducing such alternative pathways. I don't know what Apple is doing or what the Apple community is doing.
I confess I was amused when I first read of the incident some time ago. Thanks, Larry, for pointing out that there are deeper lessons that we can bring into our present-day conduct.
- Dennis - Anonymous
August 08, 2006
What a fascinating discussion!
I gave these issues a lot of thought at the time and have over the years since. I still ponder the ethical ambiguity of the situation. It is only trespassing if we did not have permission. We were given permission later, retroactively making it legitimate. But there was no way we could know that at the time. I certainly told myself that we were "doing the right thing" and believed that. But at the same time, I was self-aware enough to realize that that was just frustration and bull-headedness and my ego talking. Even looking back at it now, I'm still not sure if I pulled one over on them, or if I was horribly naive and taken advantage of, or perhaps both.
It was certainly arrogant and egotistic of me at the beginning to think that I knew what was in Apple's best interest better than the folks tasked with making project decisions. Had I been arrested I would have accepted the consequences knowing full well that I had earned them. Yet, when push came to shove and we came clean with what we had been doing, management confirmed my judgment and agreed that it was worthwhile and ought to ship.
We could not have done any of this without the support of a lot of people who knew us, trusted us, and helped us at every step along the way. We were not strangers breaking in during the dark of night.
And yes, to Alan: the events are 13 years old, and I wrote the story 10 years ago, but I waited to publish it until less than 2 years ago, now that Apple is no longer "beleaguered" and the story is just an amusing bit of history. The Apple of 1993 was a place of chaos, lacking in vision and leadership at the very top, entirely unlike the Apple of today. Nothing like this would happen there today - nor would there be any need to operate that way.
As to the question of Quality, we took that extremely seriously as professionals. So much so that for nearly a decade after we finished, Apple used Graphing Calculator to test sick machines. Apple repair folks would run Graphing Calculator in Demo mode overnight, and if it crashed, classify that as a hardware failure. I like to think of that as the theoretical limit of software stability.
>And this is a model we're supposed to...
I was not trying to preach. If anything, the tale is a Rorschach test. Everyone reacts to it differently, reflecting their own experience. For me, I wrote the story as a memoir. I like to reminisce.
By the way, I've continued working on it ever since. There is a Windows release now. Send me an e-mail if you'd like to play with it, or if it's not too late and you know Microsoft folks looking for extras to go in Vista. ;)
Best regards,
Ron - Anonymous
August 10, 2006
Imagine working on a painting for months only to never finish it, or on a car for years only to have it taken away, or on a symphony you never get to complete, or a house you almost finish... I can say firsthand that one of the worst things you can do to a programmer is to not let them finish their work.
I work for a company right now that is extremely successful, but we're just pulling our developers out of a rut. The issue was that some folks had them working on a few projects in a row where the plug was pulled. It's had a terrible effect on their morale and pride in the products they are developing.
A good developer is like a good craftsman. The satisfaction of the job only comes once it is complete. This isn't about QA, stealing, breaking in... it's about passion and vision.
Dare I say that it's also about 'open source' development and its success. - Anonymous
August 11, 2006
The comment has been removed - Anonymous
August 20, 2006
The comment has been removed - Anonymous
August 27, 2006
Remember: laws create criminals. By definition.