Put a ruler to the blueprint... is it useful?
My favorite possession in high school was my drafting board. Yep... I was a geek, even then. I was going to be the next Frank Lloyd Wright (or at least, I wanted to die trying). I fell in love with architecture in a high-school drafting class and was hooked. I had notebook after notebook filled with sketches of floor plans and perspective drawings. The year was 1979. Good times.
So when I was talking to a fellow architect recently about one of our team meetings, I realized that I had a good thing back then, something that I don't have today in my current incarnation of 'Architect.' When I created a set of blueprints for a house, it was accurate. I was a careful person because I had to be.
You see, the goal of a blueprint is that I can give a package of drawings to a builder and literally walk away. The house that he or she builds should (if I did my job well) come out looking A LOT like the picture on the paper. Not identical, mind you. There will be minor gaps, and the builder may have to make a compromise or two, but for the most part, I should be able to walk through the finished house and find everything pretty much where I put it on paper.
If the builder had a question about the amount of carpet to order for a room, for instance, he could whip out a ruler and measure the size of the room on the blueprint. If the scale was 1/2" to the foot, and the room, on paper, measured 6 inches wide, the builder KNEW he could order 12 feet of carpet. (Of course, he would order 13 feet... just in case).
The point is that the diagram was so accurate that the builder would not have to ask me for anything he could get by whipping out a ruler and measuring the drawing on the paper.
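The carpet arithmetic above is just a scale conversion. As a sketch (the function name and the one-foot safety margin are my own, for illustration):

```python
def real_width_feet(paper_inches, scale_inches_per_foot=0.5):
    """Convert a measurement taken off the blueprint into real-world feet.

    A 1/2" scale means every half inch on paper is one foot of house.
    """
    return paper_inches / scale_inches_per_foot

# The room from the example: 6 inches on paper at 1/2" scale.
width = real_width_feet(6)     # 12.0 feet of carpet needed
carpet_to_order = width + 1    # order a foot extra... just in case
```

That is the whole appeal of an accurate blueprint: the answer is derivable from the drawing alone, with no call back to the architect.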
Why don't we have this kind of accuracy in our architectural models?
Is that something we should strive for? This is not an MDA question. This is an accuracy question.
In your opinion, gentle reader, what level of accuracy should the architectural model go to?
Comments
Anonymous
August 16, 2007
My intuitive answer is that it depends on the level of uncertainty and the possibility of disruptions/surprises coming from the environment. Even in the case of building a house, I would say that there is a fair amount of uncertainty and surprise that you have to be prepared for. The more uncertainty there is, the less architectural detail can be justified. But there are cases when blueprints are absolutely accurate - and must be. I'm thinking of computer architecture (hardware, circuits, motherboards, etc.). In the case of systems architecture, however, we, unfortunately, have to be prepared for a lot of uncertainty and surprises. Virtually any requirement could be misinterpreted, added, or made obsolete. Instead, systems architecture has to rely more on principles, strategies, standards, best practices, communication, agility, and more.
Anonymous
August 16, 2007
Only as accurate as the model needs to be in order to effectively communicate to the people reading the model. Any accuracy past the "effective communication tool" bar has diminishing returns. Of course, measuring that "effective communication" level will vary based on the person you are creating it for. That would be my general answer. Another possibility might be: As far as is needed to accurately ascertain or predict the amount and types of resources needed when reifying the model. Resources may include people, man-hours, servers, bandwidth, etc. Of course, edge conditions (such as systems with low tolerance for latency, or safety-critical systems) would probably be different.
Anonymous
August 17, 2007
The comment has been removed
Anonymous
August 17, 2007
Hi Nick, My wife is a building architect, and we had an interesting dialog a couple of months ago comparing Software Architecture to Building Architecture as a response to a blog entry comparing the two. The conversation was captured in this blog from Robin Mestre found here http://blogs.msdn.com/robinm/archive/2007/03/26/if-building-architects-had-to-work-like-it-architects.aspx The result, as you can probably imagine, was that Software Architecture is VERY immature in comparison. The reason I'm bringing this up is that it's hard for me to offer an answer to your question, because the differences between the mature standards and techniques in the Building Industry are so far ahead of the standards and techniques in the Software Industry that I'm not sure where to begin. Anyway, I like the challenge you present. It's time (or past time, if you follow Pat Helland) to seriously look at how we in the Software Industry can improve our maturity to reach the level of sophistication our brothers in the Building Industry have achieved.
Anonymous
August 17, 2007
@Nick, The often-used "builder" analogy also puts a bit of context on your question. We know that the process is serial in nature (first blueprints, then construction). We know that the architect is typically the skilled (highly paid/highly trained) labor. We know that the construction work is performed by low-skilled (low-paid) labor. Both are specialized in skill. It also assumes that the knowledge of the future is somewhat certain (thus it is a predictive process). Building a bridge or a house is pretty predictive (although a 10-yr construction project probably exhibits some unpredictability). If you change the above assumptions, an architectural model can be a completely different beast. If you rip out the serial process phases (and mash them together), the blueprint becomes more of a "snapshot in time". This usually works well where the team is not highly specialized. The skill level for each person is raised (and the team is flat instead of specialized). This approach works well where the future is uncertain (and likely to change). Thus the adaptive vs. predictive approach. The model reflects the thing (think documentation) instead of the thing reflecting the model (think specification).
Anonymous
August 17, 2007
The comment has been removed
Anonymous
August 17, 2007
The comment has been removed
Anonymous
August 17, 2007
The comment has been removed
Anonymous
August 17, 2007
@Bob, Ultimately, you are describing the effects that Moore's law (and friends) has had on us as software professionals. If the construction industry changed at the pace that software changed, I think there would be mass confusion (or at least, lots of "legacy buildings".) Combine that with the changing software needs of today's business and you have a mess on your hands. In terms of business need, office facilities don't change that much (especially when compared to software requirements). Sure, they often have the need to scale capacity (sound familiar?), but fundamentally, they are stable in what they need. Heck, they will often meet the needs of the next business after the original is long gone.
Anonymous
August 17, 2007
I like the idea of having as accurate a blueprint as time can afford. Unfortunately, the blueprint is often the lowest common denominator of the attitudes, experience, skill, and knowledge of the architects involved. I once had an architect from a big French telecommunications company ask me what UML was when I suggested we use it for documenting our work and thinking. As the conversation progressed and I suggested he should learn UML, he gave a bit of a snort and nothing more came of it. Except, we then spent many minutes and hours getting ourselves on the same page every time we had to meet. I've found blueprints work well when the team is committed to improving its common language, refining this as the project(s) progress(es), and dumping stuff that doesn't work. So accuracy is good, but in my experience it's only as good as the consensus. And most often, it's an uphill battle. Like making my kids eat their greens.
Anonymous
August 17, 2007
The comment has been removed
Anonymous
August 18, 2007
@Nick, Just remember that some businesses can't wait for a year-and-a-half+ development project to complete. The all-or-nothing "builder" approach forces up-front stability in the project. It also radically alters the financial picture of the project (in terms of when the investment reaches payoff and investment ROI). While the predictive approach may work well for a project like Sql Server, my experience tells me that it's not one size fits all. My stakeholders (and their shareholders) would rather start recouping investment (and getting new business) with something that works in a month or two, even if it isn't the polished/finished product. I personally haven't seen any "big specification here, walk away, build in India" projects. Maybe they are more common in Very Large Enterprises (my current client only has a few hundred developers on staff, and the offshore outsourcing isn't the low-skilled labor described in this conversation). The real waste shows up when we try to force the predictive approach on an adaptive project. It's important to see both, know the strengths/weaknesses of each, and apply them intelligently to the given context (which ultimately shapes our definition of architectural model). To be fair to us as software professionals, I think we would have to compare most development projects to something like building the Guggenheim (a project where everything's completely custom). The difference between two random buildings is not nearly as significant as the difference between two random software projects. And what about those guys doing renovations? What happens to a building project when the size of the renovations is several times the original scope of the project? ;-)
Anonymous
August 18, 2007
The comment has been removed
Anonymous
August 20, 2007
The comment has been removed
Anonymous
August 21, 2007
The comment has been removed
Anonymous
August 21, 2007
Nick, Interesting - a couple of comments back you said "I'm an agilist". This, for me, is the root of why software architecture is different - why do we need to be agile? To give a building architect a similar experience to a software architect, you'd have to do this:
- repeal all building regulations
- repeal all zoning/siting regulations
- allow competitors to target the project's requirements, e.g. buy up the concrete works and change the blend so my design will no longer work
- make the project brief something like "build any structure anywhere using any construction method, so long as it increases my sales or lowers my costs"
In this environment, the project failure rate would skyrocket, long planning cycles and solid blueprints would vanish, and building architects would be going to conferences to talk about how to be more agile! On the other hand, if you gave a software architect a brief specifying the technology platform, the deployment mechanisms, the precise functionality required, and full access to a couple of thousand other systems that use various approaches to do exactly the same thing, I think you'd find your architect could hand you back a pretty solid blueprint in short order. This is in fact exactly what happens with things like avionics and critical embedded systems, most of which fall over about as often as the Great Pyramids. I don't want to trivialize what building architects do, but their environment is simpler, better understood (at the business level), and has vastly fewer degrees of freedom. Outside of a few specialized domains, the market for software has consistently chosen functionality over reliability and adaptability over predictability. Which is why we need to be agile, and the users are also kept on their toes dodging falling bricks.
Anonymous
August 21, 2007
@Jaime, I guess I'm not sure of your point: should we be more agile because the market is immature, or should we try to make the market more mature so that we don't need to be agile anymore? Or both? For me: both. We need to be agile. However, rather than simply adapting to the nuttiness, we can also push back to create structures and mature frameworks so that we can quickly solve the needs without sacrificing quality. I disagree that the market for software has consistently chosen functionality over reliability and adaptability over predictability. If that were the case, Unix would never have come on the scene in the server market. The "selling point" that drives Unix is not the software cost. It is the notion that it is more reliable or secure (and therefore more predictable). The market has dynamics that certainly place a value on providing at least the 'minimum' functionality over a partial but reliable solution. On the other hand, if multiple products meet the minimum bar, then adding features will not trump reliability or predictability. Adding XML to SQL would have meant nothing if that tool didn't provide reliable and predictable performance in managing data. So let's not kid ourselves. Agility is necessary now because our field is still young. We cannot engineer a quality solution in a reasonable period of time. But our current Agile methods are first and foremost a coping mechanism. We need to cope with our inability to quickly engineer a solution. As soon as we can speed up engineering, then Agile developers will be the first people to use 'quick engineering' to replace 'quick craftsmanship.' Agilists are the advance guard of excellence in our field. I intend to help speed up engineering by managing complexity and evangelizing the use of an appropriate level of accuracy and consistency in our work. And when that happens, I'm sure the agilists will lead the way to mass adoption.
Anonymous
August 22, 2007
I have put some of my own thoughts on this subject on my own blog.
Anonymous
August 24, 2007
Love your analogy. I believe that how accurate the model needs to be depends upon the company implementing it. Small companies have different calls to action and drivers than larger ones. Public companies have greater compliance concerns than private ones. Etc... But I will share an analogy that I use in my mind, and in my daily evangelising of the topic: that of Apollo 13. Too young to watch the news live, I have to settle for the movie. The amazing thing to me was that they could take their test bed, (quickly) know exactly how much voltage each item used, and then tell the astronauts what they could turn on, and in what order, to have the minimum spaceship necessary to make it home. There are a million reasons why it's both unreasonable and impractical to deliver this level of accuracy when designing a SOA, but I see no reason why it can't be the objective.
Anonymous
August 24, 2007
@David, Your analogy sounds compelling, but I'm not sure I understand the point. Are you asking that our level of metadata about our systems be so good that we can emulate the system's behavior and thereby create a virtual test environment? Can you post a reply or blog something to go into more detail?
Anonymous
August 28, 2007
The comment has been removed
Anonymous
August 29, 2007
The comment has been removed
Anonymous
August 29, 2007
The comment has been removed
Anonymous
August 30, 2007
The comment has been removed