Random Acts of IT Project Management

Project Management for Information Technology

Posts Tagged ‘Estimating’

Are Accurate Estimates a Myth?

Posted by iammarchhare on 22 July 2009

Yesterday, I pointed you to Robert Merrill’s post about “A Tale of two processes” comparing waterfall and Agile.  So, it seems appropriate today to point you to another of his posts “Software sanity: Accurate estimates and other myths” to round things out.

What is an “accurate” estimate, anyhow?  What is a “good” estimate?  Merrill writes:

Most people equate “good” with “accurate.” I equate “good” with “unbiased and complete, with reasonable accuracy given the time spent on estimating.” This is within every organization’s reach, relatively quickly. Getting incrementally more accurate may not even be worth the effort. Instead, we need to learn to account for estimation errors when selecting and running projects. We will never be accurate enough to let us off the governance and methodology hook.

You can read the rest here.

Posted in Estimating | Tagged: , , , | Comments Off on Are Accurate Estimates a Myth?

Avoiding Project Estimating Mistakes and Constructive Pessimism

Posted by iammarchhare on 10 July 2009

It happens to everyone sooner or later: someone makes an error in estimating a task or set of tasks.  The causes can be many.  Sometimes, the error can even be fatal to the timeline of the project.  How can this be avoided?

Erik Eckel of TechRepublic says that his “Three tips for avoiding project estimating mistakes” are: 1. Confirm all assumptions, 2. Don’t expect trouble-free projects (aka plan for “unknown unknowns”) and 3. Specify exactly what estimates include (aka put it all in writing).  He goes into each of these in his article.

The Defensive Mentality

I’ve been an advocate for “defensive programming” for years now.  Developers are notorious for being overly optimistic.  They can’t fail.  After all, it is their code, and their code always works, right?  I know this attitude well.  I was a software engineer myself for a while.  Needless to say, my code did not always work right the first time.  I had to learn to expect that something will go wrong and plan for it.  Just as a driver learns defensive driving and plans ahead because accidents happen even to good drivers, the developer must plan for strange inputs and react accordingly.  The developer needs to come up with a test plan before coding.  The assumptions should be stated not only in design documents but also as code comments first.
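To make the mindset concrete, here is a minimal sketch of defensive input handling.  The function name and validation rules are illustrative, not from any particular codebase: the point is that each assumption about the input is stated and checked rather than taken on faith.

```python
def parse_quantity(raw: str) -> int:
    """Parse a user-supplied quantity, defensively.

    Assumptions are stated and checked explicitly instead of
    trusting that the input is a well-formed positive integer.
    """
    if raw is None:
        raise ValueError("quantity is required")
    text = raw.strip()
    if not text.isdigit():
        raise ValueError(f"quantity must be a non-negative integer, got {raw!r}")
    value = int(text)
    if value == 0:
        raise ValueError("quantity must be at least 1")
    return value
```

Each `raise` is also a test case waiting to be written, which is why the test plan falls out almost for free when the code is written this way.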

I know what you are thinking, as I have thought it myself: That is going to add a lot of overhead to the project.  The plain and simple truth is that it reduces effort on the project.  Sure, your estimates will go up.  Sure, management is going to question why a module takes so long to code.  The answer is that it actually reduces coding time and, better yet, wasteful time produced by context switching, because there is less rework!  It puts the effort closer to the front of the project, where it is cheaper.  Have you looked to see how much rework occurs on projects?  How much of this was even planned?  Wouldn’t you like to reduce it, planned or not?

The project manager needs to be aware of this and prod people to adopt a defensive mentality.  Basically, when you look at Eckel’s three points, they boil down to being “constructively pessimistic”.  In other words, you expect bad things to happen and plan accordingly.  Even if you use defensive techniques, the PM needs to be pessimistic too and build rework into the project.  To reword Eckel’s list slightly, the PM needs to challenge assumptions (pessimism that assumptions, particularly about inputs and outputs, reflect reality), not expect a lack of problems (general pessimism that all will go as planned without any unit testing or rework) and be specific in communication (pessimism about understanding and being clearly understood).

No Fear

Eckel actually has a 4th tip, though.  Near the end of his article, he states, “Analysis paralysis isn’t just for politicians and leaders in other industries — it affects IT managers as well.”  Fear, in his view, makes you less efficient.  As we discussed yesterday in “Keys to Successful IT Projects”, this fear can lead to not making important decisions that would otherwise make the project successful.

So, while I advocate a certain amount of pessimism, don’t make the mistake of rooting the pessimism in grid-locking fear.  Constructive pessimism, however, simply acknowledges that things will likely go wrong and forces you to make a plan for the event.

Of course, you should plan for success as well.  After all, if you are being constructively pessimistic about your constructive pessimism, you’ll also realize that some things go right, so you need to plan for those events as well!

OK, I’m teasing, but only slightly.  Each event will either be a success or not.  What do you do if A goes according to plan?  What do you do if A does not go according to plan?  Always have a back-pocket plan if something of significance doesn’t work out.

Reality Check

Developers are notorious for being overly optimistic.  I have found that many infrastructure engineers are just as optimistic.  They don’t plan for failure, so they don’t know what to do when it hits them.  Many PMs have come up from the technical ranks.  Unfortunately, these PMs sometimes still carry much of the same optimism they had before.

“Constructive pessimism” is the antidote to naive optimism.  Naive optimism doesn’t plan for things to go wrong at all.  Constructive pessimism says, “Things will go wrong, but here is what we will do about it.”  It provides a reality check to the team.

The funny thing is that this actually increases confidence in the overall project because contingencies have been identified that provide a safety net for the team.

Posted in Estimating, People Management, PM Basics | Tagged: , , , , , , , , , | Comments Off on Avoiding Project Estimating Mistakes and Constructive Pessimism

Stupid Estimate Mistakes

Posted by iammarchhare on 22 June 2009

Or, rather, how to avoid them.  Project management is largely about estimates and hitting them, after all.  TechRepublic did an article on the “10 ways to avoid stupid project estimates” (PDF format).  I particularly like #9:

Penalize the bad estimate, not that a task estimate is too long. When estimates are shortened, it’s generally because that shorter number is what’s expected. That is where the “If all goes well, it will take…” comes from. Be serious. When was the last time “all went well”? There’s nothing wrong with qualifying an estimate, but adding an unrealistic assumption as a way to give a bad estimate only hurts the project.

My experience has been that IT folk, but especially programmers, are an optimistic lot.  The fresher they are out of school, the more optimistic they tend to be.  Why?  In school, the programming assignments were always the type that made it easy to hit the target.  They were geared towards the topic being learned, and often the answer was spoon-fed to them.  Note, I am NOT criticizing the fact that students are not hit over the head with a huge reality stick!  However, it does shade their perceptions when they graduate.  Unfortunately, by the time their estimates get good, they tend to move into more senior non-programming roles.

Some additional methods for getting more accurate estimates:

Three-point estimates.  Take the best case, most likely and worst case amounts of time.  You know the drill here: Estimate = (best case + 4 x most likely + worst case) / 6.  Problem?  Many view this as a time-consuming way to get estimates.  Some senior or more experienced developers may even view this as a waste of time.

Group sanity check.  If you have a mix of experienced and inexperienced developers, it might be handy to have a meeting to go over the proposed schedule before it is presented outside of a developer team.  I have seen this done twice, and I wish more organizations took the time to do this.  You have to give enough time for reviewing the schedule beforehand and create an atmosphere that encourages questions.  Problem?  It will not work correctly if management has a target date and effort already in mind, unless developers or team leads are willing to defend any deviation.
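The three-point (PERT) formula above is simple enough to sketch in a few lines.  The sanity check on the inputs is my own addition; the example numbers are made up:

```python
def pert_estimate(best: float, likely: float, worst: float) -> float:
    """Three-point (PERT) estimate: a weighted mean that
    leans heavily on the most likely case."""
    if not best <= likely <= worst:
        raise ValueError("expected best <= likely <= worst")
    return (best + 4 * likely + worst) / 6

# A task with best case 2 days, most likely 4, worst case 12:
# (2 + 16 + 12) / 6 = 5 days
estimate = pert_estimate(2, 4, 12)
```

Note how the worst case of 12 days only pulls the estimate from 4 to 5 days; the weighting is what keeps a single scary outlier from dominating the schedule.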

The TechRepublic paper really identifies that “reward and penalize” is one of the keys (#9).  That is, hold people accountable for estimates.  To do this, you also have to give feedback (#8).  Even then, it won’t do any good unless you develop a culture of continuous improvement (similar to point #10).  Unless people are motivated to improve, they will not!

Posted in Estimating | Tagged: , , , , | Comments Off on Stupid Estimate Mistakes

Earned Value Management Now More Important Than Ever

Posted by iammarchhare on 19 May 2009

Containing costs is a really big deal in this economy.  CEOs and CIOs are not likely to react favorably to unexpected project cost overruns.  Project managers (PMs) do not want to get caught in this squeeze.  Early detection and mitigation are key factors in a project’s success.

As stated in the PM Hut article “Utilizing Earned Value Management During Economic Downturn”:

Utilizing EVM techniques does not prevent project cost overruns, but it does provide project managers with data for more effective cost and risk management, which has become increasingly important to corporations. Risks that are identified through the use of EVM provide early warning signals that imminent project risks exist.

~ Smith, Kevin.  (13 November 2008).  Utilizing Earned Value Management During Economic Downturn.  Retrieved 12 May 2009 from http://www.pmhut.com/utilizing-earned-value-management-during-economic-downturn.

While the article’s main premise is that company executives will (or will need to) pay more attention to EVM, I haven’t worked anywhere where that was a major problem.

What I find useful is to maintain two EVM calculations.  If you have a tool like CA Clarity, then it has a built-in EVM calculator that is rolled up into the various dashboards.  If you do not, then consider which EVM method you want to use to present to the world.  I would pick either the percentage or 50-50 method to present the most accurate data at that snapshot in time.  I would consider this my “tracking EVM” because it is the one being reported and the one everyone will be evaluating you on (officially or unofficially).

However, you, as the PM, need to know even before the rest of the world knows.  Therefore, I have always kept a 100% method EVM spreadsheet to update on a weekly basis to gauge when the project is even thinking about falling behind schedule.  Using this method, you are either late or not on a task.  There is no middle ground.  If you are even one day late, severe penalties are built into the calculations.  Then, you can dig in to find out what is going on underneath the calculations.
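The difference between the two calculations is easy to see in a sketch.  The task data below is invented, and this is a simplification of earned value (it credits value per task by the three crediting rules the post mentions, nothing more):

```python
def earned_value(tasks, method="percent"):
    """Earned value for a list of (budgeted_cost, fraction_complete) tasks.

    method="percent": credit value in proportion to completion.
    method="50/50":   half credit once started, the rest at completion.
    method="0/100":   no credit until a task is 100% done -- the harsh
                      early-warning view described above.
    """
    ev = 0.0
    for budget, pct in tasks:
        if method == "percent":
            ev += budget * pct
        elif method == "50/50":
            ev += budget if pct >= 1.0 else (budget * 0.5 if pct > 0 else 0.0)
        elif method == "0/100":
            ev += budget if pct >= 1.0 else 0.0
        else:
            raise ValueError(f"unknown method: {method}")
    return ev

tasks = [(1000, 1.0), (2000, 0.5), (500, 0.9)]
# percent: 1000 + 1000 + 450 = 2450
# 50/50:   1000 + 1000 + 250 = 2250
# 0/100:   1000 (only the finished task counts)
```

With the same underlying data, the 0/100 view reports less than half the earned value of the percent view, which is exactly why it flags slippage before the tracking EVM does.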

Posted in PM Basics, Tracking | Tagged: , , , , , , , | Comments Off on Earned Value Management Now More Important Than Ever

Better Estimating Through Software Sizing

Posted by iammarchhare on 18 May 2009

How do you know how long it will take?  Gut feel is how many people do their estimating.

Late last year, I attended a webinar that intrigued me.  I had heard of function point analysis (FPA) before, but I didn’t know much about it.  I decided to look into it more.

Software sizing is the software engineering term for estimating the size of a piece of software (whether a component or an entire application).  These estimates can then be used in project management activities.  Software sizing processes are a requirement for CMMI level 2.

Lines of Code

One of the original measurements for coding projects was Lines of Code (LOC).  When procedural languages were the norm, it gave a rough estimate of effort based upon the developer’s output.  With OO software, though, it is a less useful measure, and so it has fallen out of favor in recent times.

In the 1970s, IBM tapped Allan Albrecht to come up with a better tool for estimation.  The result was published in 1979.  He came up with a way of measuring based upon 5 areas:  internal logical files, external interface files, external inputs, external outputs and external queries.  The Code Project has a two-part posting that goes into more detail on function point analysis.  Unfortunately, it appears there were supposed to be additional postings that did not occur.

One of the complaints leveled against such measurement is the amount of time required to do the measurements.  However, an experienced person can document one person-year’s worth of effort in about one day.  While some criticisms of function point analysis may be valid, “others are excuses for avoiding the hard work of measurement” (Total Metrics).  There are far too many organizations that would avoid procedures in the estimation process if it took an hour because it would “take too long”.

To me, the biggest disadvantages are the requirements of previous measurements and specialized training.  Previous measurements can be substituted with industry standards, though you will lose the impact of organizational maturity and influences.  Training and experience increase the accuracy of estimates by the estimator.

You also have a catch-22 situation in that functional requirements need to be detailed enough to make an accurate estimate.  No matter the method of estimating, you’ll have this problem, anyhow.  Estimates are improved through the progressive elaboration of requirements.

Both of these disadvantages are quite likely to discourage, rather than encourage, a more systematic approach to estimation.  In addition, FPA is not without its critics for other reasons.  For one thing, best practices in software and the way software is developed are pretty far removed from the 1970s, when FPA was developed.

In addition, project management itself has changed a lot since then.  The concept of FPA might have worked fine with monolithic waterfall projects.  However, with the adoption of Agile by many organizations, such detailed analysis prohibits change rather than encouraging it.

Use Case Points

One alternative to pure FPA is estimation built upon the number and complexity of use cases.  There are tools that can make this much easier, and anyone who understands use cases already can put together an estimate with little additional training.

There is a Windows tool you can use to estimate size with use cases called EZ Estimation at http://ezestimation.googlepages.com/.  I downloaded it, and it looks like a pretty decent estimator that can be used in the requirements gathering phase.
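A bare-bones version of the use case point calculation can be sketched as follows.  The actor and use case weights are the commonly published ones; the example counts, the neutral adjustment factors and the 20 hours-per-point productivity figure are illustrative assumptions, not a recommendation:

```python
# Commonly published weights for use case point (UCP) counting.
ACTOR_WEIGHTS = {"simple": 1, "average": 2, "complex": 3}
USE_CASE_WEIGHTS = {"simple": 5, "average": 10, "complex": 15}

def use_case_points(actors, use_cases, tcf=1.0, ecf=1.0):
    """Unadjusted actor + use case weights, scaled by the technical
    complexity factor (TCF) and environmental complexity factor (ECF)."""
    uaw = sum(ACTOR_WEIGHTS[a] for a in actors)    # unadjusted actor weight
    uucw = sum(USE_CASE_WEIGHTS[u] for u in use_cases)  # unadjusted use case weight
    return (uaw + uucw) * tcf * ecf

# Two simple actors and one complex actor, three average use cases,
# with neutral adjustment factors: UAW = 5, UUCW = 30, UCP = 35.
ucp = use_case_points(["simple", "simple", "complex"], ["average"] * 3)
hours = ucp * 20  # an often-cited starting figure of 20 staff-hours per UCP
```

In practice the TCF and ECF are themselves computed from weighted questionnaires, which is where a tool like the one above earns its keep; the point here is only that the arithmetic, once the use cases are classified, is trivial.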

Conclusions

A good plan today is better than a perfect plan tomorrow.

~ George S. Patton

One thing to keep in mind is that any initial estimate is going to be wrong.  That is why progressive elaboration is pointed out in the PMBOK.  One place I worked realized this and broke all but the smallest of projects out into an analysis phase and a construction phase.  The gateway for the construction phase was how the estimate stacked up against the analysis estimate and whether or not the project was still worth it.

The beauty of Agile, of course, is that estimates are adjusted as more is learned.  Estimates become more accurate over the life of the project.

If time allows, however, it would seem prudent to do enough analysis upfront to hit that middle ground of estimation so that less is left out in the end.  Writing use cases and using estimating tools based upon them seems to me to be the most reasonable approach.  The larger the project, the more this approach makes sense.  In an Agile environment, this would be done once to get the best possible overall estimate, but user stories, backlogs and adjustments after a sprint would still be carried out on a normal basis.  The key would be “appropriate detail” in use cases.

I would love to hear anyone’s experience with these.  I have a feeling it depends a lot upon the type of project, the type of customer and the overall project size.


Sources:

  1. Buglione, Luigi.  (25 July 2008).  Functional Size Measurement.  Retrieved 12 May 2009 from http://www.geocities.com/lbu_measure/fpa/fpa.htm.
  2. Cohn, Mike.  (2005).  Estimating With Use Case Points.  Retrieved 12 May 2009 from http://www.methodsandtools.com/archive/archive.php?id=25.
  3. Function point.  (n.d.).  Retrieved 12 May 2009, from http://en.wikipedia.org/wiki/Function_points.
  4. s.kushal.  (11 Mar 2007).  Function Point and Function Point Analysis.  Message posted to http://www.codeproject.com/KB/architecture/Function_Point.aspx.
  5. Software Composition Technologies, Inc.  (June 2003).  Function Point FAQ. Retrieved 12 May 2009 from http://www.royceedwards.com/floating_function_point_faq/about_function_point_analysis.htm.
  6. Software Sizing.  (n.d.).  Retrieved 12 May 2009, from http://en.wikipedia.org/wiki/Software_Size.
  7. Total Metrics.  (June 2007).  Methods for Software Sizing: How to Decide which Method to Use. Retrieved 12 May 2009 from http://www.totalmetrics.com/function-point-resources/downloads/Why-use-Function-Points.pdf.

Posted in Estimating, PM Basics | Tagged: , , , , , , , , , , | Comments Off on Better Estimating Through Software Sizing