Toby's perfect day

I glossed over an important point at the end of my last post when estimating the user stories.

Toby does a quick tally and comes up with 18 story points. Assuming four ideal hours per working day, it should take him 36 days to complete the project.

For people who haven't been working with Scrum, admitting to only four hours of real work per day probably borders on the sacrilegious. Let me show how I arrived at this seemingly small number.

Back in November, Mitch Lacey wrote a post about ideal days. In it, he shared an experience about a project that ran into problems because the manager directly converted ideal days to calendar days. The problem is that although a task may take 2 (ideal) days to complete, this will rarely (if ever) equal 2 calendar days. This led to a discussion about what an ideal day really was. Eric Jorgensen got the ball rolling:

A while back, I worked on a project where I created my estimates based on what amounted to 2-3 hours of “real development” each day. I arrived at this number by a very simple approach:

8 hours * R1 * R2 * R3

· R1: Figure out the number of developer days in a year; subtract vacation, sick time, all-day meetings, morale events, etc.; then divide the result by the total number of developer days to get a ratio, which we call R1. This is about 0.85.

· R2: Look backwards over the previous month and determine the number of hours spent in meetings. Divide this by the total hours at work during the same time frame. This is R2. This number varies over the year, but my experience is that 0.85 is about average.

· R3: My estimates are always wrong by a factor of two. I don’t know why, but that’s the way it has been ever since I started programming. For me, R3 = 0.5. (Note: Working longer hours decreases R3, so you can’t get done faster by working longer.)

8 * 0.85 * 0.85 * 0.5 = 2.9 hours.
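
Eric's arithmetic is easy to reproduce in code. Here's a minimal Python sketch (the helper name real_hours_per_day is mine, not anything from the thread):

    def real_hours_per_day(workday_hours, *factors):
        """Multiply a nominal workday by a chain of reduction factors (R1, R2, ...)."""
        for f in factors:
            workday_hours *= f
        return workday_hours

    # Eric's factors: R1 = 0.85 (vacation etc.), R2 = 0.85 (meetings), R3 = 0.5 (estimation error)
    print(round(real_hours_per_day(8, 0.85, 0.85, 0.5), 1))  # -> 2.9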

This seemed like an interesting take on things, so I added my €0.02:

I particularly like the explanation of R3 :) I know that when translating our estimated hours into calendar days on the last project I was on, we used 4 real hours/day - and then 5 when the project seemed to be slipping, which strangely didn’t help it go any faster.

At the start, I too was a bit uncomfortable coming to terms with only doing 4 hours of real work per day, but once I started paying attention to how much time was spent in meetings (R2) and following unnecessary processes / dealing with environmental issues [build server, etc.] (R4?) I could easily see how much time was lost during a day. Factoring in anything you’ve mentioned in R1 (especially vacation) never even entered my mind – those were usually dealt with completely separately.

Personally, I found that R3 averaged over all my work items in a given sprint was closer to .8, but individual work item estimates could vary wildly. Perhaps it was the constant pairing that helped us define estimates a bit better than we would have by ourselves. In our case, though, it was R4 that was the real killer - I would use another multiplier of .8 there.

8 hours * .85 * .85 * .8 * .8 = 3.7 hours.

Even then, notice that management’s forced estimate of 4 (and then 5) hours per day was already optimistic. You can’t help but wonder how much better expectations could have been managed had a smaller number been used from the start.

With the way you’ve explained it, no one should be uncomfortable admitting that they only code 3-4 hours a day out of 8.
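
Running my own set of factors through the same arithmetic (just a one-line Python check):

    # R1 = 0.85, R2 = 0.85, R3 = 0.8, R4 = 0.8 (process / environment overhead)
    print(round(8 * 0.85 * 0.85 * 0.8 * 0.8, 2))  # -> 3.7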

After that the free-for-all started :) Here are some more replies from the thread:

  • Jon Pincus added about 10% overhead for R4, and kept R3 at 50%. He also mentioned that the industry average is something like 18 hours of "development work" per week.
  • Eric Brechner combined all the R values into a 'load factor' with values of 33% for his team and ~42% for the industry.
  • Alex Kosykh introduced R5 for multitasking, where R5 = 1/(1.5 * number of tasks), so you lose time to context switching (see the sketch after this list).
  • Shyam Habarakada added R6 as a penalty for being higher up in the org tree. The more time you spend managing, the less time you have available to code.
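
Alex's R5 is the only addition above with an explicit formula, so here's a minimal sketch of it exactly as stated (the function name is mine); notice how quickly multitasking erodes an already-short day:

    def context_switch_factor(num_tasks):
        """Alex Kosykh's R5 from the thread: 1 / (1.5 * number of tasks)."""
        return 1 / (1.5 * num_tasks)

    # Effect of R5 on a 4-real-hour day:
    for tasks in (1, 2, 3):
        print(tasks, round(4 * context_switch_factor(tasks), 2))
    # 1 -> 2.67 hours, 2 -> 1.33 hours, 3 -> 0.89 hours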

It was a very lively and enjoyable discussion, but obviously there is no exact formula, nor do we really need one in the end. The main point to take away is that, on average, we work far less than eight hours per day and it's OK to admit that. We can then easily draw the following conclusion:

Anyone who bases their estimates on developers doing 'real' work for eight hours per day either (a) dooms the project to failure from the start or (b) condemns the team to working overtime to meet the deadlines.

I've reproduced the discussion to show where Toby's original estimate of four ideal hours/day comes from, but in the end it doesn't matter how many ideal hours are in a day. What really matters is velocity.

Velocity is the number of story points that a team can complete in an iteration. A full list of what defines a task as 'complete' will be the subject of a later post, but in its simplest form it means the feature is in a shippable state. Toby's initial estimate of his velocity is 2.5 points/week, or 10 story points/sprint (assuming four-week sprints).

After the first sprint is over, Toby will know if he really can complete ten story points in four weeks and use that as the basis for determining how much work he can finish in the following sprint.
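
As a rough cross-check on Toby's numbers, here's a sketch that ties the two views together. It assumes one story point corresponds to one ideal day of eight hours, which is what the 36-day figure from the last post implies:

    import math

    STORY_POINTS = 18
    IDEAL_HOURS_PER_POINT = 8  # assumption: one point is roughly one ideal day
    REAL_HOURS_PER_DAY = 4     # from the discussion above
    VELOCITY = 10              # Toby's initial guess, points per four-week sprint

    print(STORY_POINTS * IDEAL_HOURS_PER_POINT / REAL_HOURS_PER_DAY)  # -> 36.0 working days
    print(math.ceil(STORY_POINTS / VELOCITY))  # -> 2 sprints (the second only half full)

Either way you slice it, the plan comes out at roughly seven to eight weeks, which is exactly why the velocity figure, not the ideal-hour bookkeeping, is the thing worth tracking.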

Comments

  • Anonymous
    June 28, 2007
    Nice work fellas. I'm new to SCRUM but the contract environment I am currently working in employs it everywhere. This will be a great help this week as we work out the whole estimation issue.
  • Anonymous
    July 01, 2007
    Estimating is a black art :) But at least there are ways we can go about making slightly more informed decisions about the near future. I highly recommend reading Steve McConnell's book on the subject (http://www.stevemcconnell.com/est.htm). If you've got any questions or experiences to share feel free to write back; I'd love to discuss it further.