Adventures in Spaghetti code

Predicting the time it will take to complete a task

I've made a chart of how long it takes me to complete each task versus how long I estimated the task would take. I have a dataset of 42 tasks. For each task, I have a specific unit test in mind that needs to pass before I consider the task complete. Sometimes I decide to do more than I originally intended, sometimes less. Sometimes I abandon the task and move on to something new; I do not factor abandoned tasks into my calculations.

Here's how I come up with my metric. If the task takes longer than estimated, I divide the actual length by the estimated length and multiply the result by -1. If the task takes less time than estimated, I do the reverse and divide the estimated length by the actual length. So if I estimate that a task will take 10 hours and I finish in 5, the score is 2. If it's the other way around, the score is -2.

The chart below plots a moving average of the last 5 tasks. On average, for every 1 hour I predict a task will take, it takes 1 hour and 40 minutes to complete. There was one stretch where two tasks in a row took 16 and 18 times longer than I thought, so I really got slammed there.
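The scoring rule and the 5-task moving average can be sketched in a few lines of Python. This is a minimal sketch of my own devising, not the exact code behind the chart; the function names `accuracy_score` and `moving_average` are just illustrative.

```python
def accuracy_score(estimated_hours, actual_hours):
    """Score one estimate per the ratio rule described above:
    negative when the task ran over, positive when it finished early."""
    if actual_hours > estimated_hours:
        return -(actual_hours / estimated_hours)
    return estimated_hours / actual_hours

def moving_average(scores, window=5):
    """Average of the last `window` scores at each point, as plotted."""
    out = []
    for i in range(len(scores)):
        recent = scores[max(0, i - window + 1): i + 1]
        out.append(sum(recent) / len(recent))
    return out

# The two worked examples from the text:
print(accuracy_score(10, 5))   # estimated 10h, took 5h  -> 2.0
print(accuracy_score(5, 10))   # estimated 5h, took 10h  -> -2.0
```

Note that a perfect estimate scores 1.0 (or -1.0), never 0, since the ratio of actual to estimated time is at least 1 in one direction or the other.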

2 Comments On This Entry



24 August 2018 - 09:45 AM
Interesting use of metrics. It looks like you're converging towards zero, so that's a pretty cool trend.


28 August 2018 - 08:09 PM
Actually, I just made some pretty bad predictions. I predicted a task would take 3 hours and it took 18. And another task I predicted at 10 hours ended up taking 30.

