Earned Value Management Revisited: The unknown past and future of this useful technique

Reading some of the posts I have been writing recently, it has dawned on me that it has been a while since I’ve posted articles related to core project management topics.  One of the goals of this site is to provide “insights that take you beyond the normal of what is provided in the traditional project management community,” and while that goal will always be the emphasis, I can’t forget that readers visit because the theme of the site is project management.  With that said, I will also make sure to post articles on the core “meat and potatoes” tools, techniques, and methods we have all come to love and despise, and to struggle and triumph with, in our day-to-day management of projects!

And having just completed a few rounds of in-depth PM training, one of which I did onsite at the Port of LA on Earned Value Management for UCLA Extension’s project management certificate program, it never ceases to amaze me how much I learn each time I do such sessions.  In fact, I have to admit feeling a bit guilty that I learn so much and get paid to do it.  This is why I am honored and grateful to have this as part of my core career.  But back to business: I want to talk about a few new things I learned about one of the most well-known, yet underutilized or just flat-out misused, quantitative techniques: Earned Value Management.

I won’t go into the basics of this technique; if you need a refresher, there are great resources on the web such as this Wikipedia page.  But one resource I found very insightful is the book “Earned Value Project Management” by Fleming and Koppelman, which is in its 4th edition and considered one of the essential texts for EVM in the industry.  I used this book for the first time in the UCLA Ext. class because of its readability and the fact that, rather than just reviewing the equations, it gives you insights on how best to use the technique.  If you’re just learning EVM, I can’t say I would quite recommend this book unless you have an instructor who will go over the basics and provide background, but if you’ve been in the profession and banged your head against trying to do EVM, this book is rock solid!

One very interesting insight I learned from the book comes from the chapter reviewing the historical genealogy of EVM, where it was mentioned that as far back as 1890, some plant engineers outlined a “three-dimensional” approach to performance efficiency on the factory floor.  Industrial engineers would measure “planned standards” against “earned standards” achieved and against “actual expenses.”  This was due to the influence of none other than Frederick Winslow Taylor of the famed “scientific management” movement, outlined in his infamous book “The Principles of Scientific Management,” published in 1911.  The very interesting and ironic idea is that plant engineers and managers were probably doing a better job of EVM than we are now!  Of course, in all fairness, the business environment was much simpler back then and they did not have the kind of technologies we have now, but in my opinion this restriction actually makes you more effective, since more care has to be taken to quantify the data.  These days we are so reliant on IT that we take it for granted and are not as detailed and careful about the data going in, resulting in the old “garbage in, garbage out” phenomenon.  More on this point in another post…
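
To make the parallel concrete, here is a small made-up example (the numbers are mine, not from the book): suppose a week of shop-floor work was planned at 100 standard hours, the crew earned 90 standard hours of output, and the actual expense was 110 hours of labor.  Earned versus actual gives a cost efficiency of 90/110 ≈ 0.82, and earned versus planned gives a schedule efficiency of 90/100 = 0.90, which are exactly the CPI and SPI ratios modern EVM computes from EV, AC, and PV.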

But given the sophistication of our current technologies and the advancement of business and project management practices (though one could strongly argue that in many ways we have regressed in this regard!), we should be looking at ways to expand this useful technique.  For me, the core techniques of EVM are driven by a deterministic, linear time-series model, but the forecasting components, especially the Estimate at Completion (EAC), could benefit from an extension using simulations of a stochastic process described by random variables, so that the forecasts are determined probabilistically rather than deterministically.  (I think the To Complete Performance Index (TCPI) could benefit as well, but I have not thought that through in detail yet.)
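
For reference, here is a minimal sketch of the standard deterministic EAC and TCPI calculations (the CPI-based EAC formula; the input figures are made up purely for illustration):

```python
# Minimal sketch of the standard deterministic EVM forecasting formulas.
# The input figures are made up purely for illustration.

BAC = 1_000_000.0   # Budget at Completion (total planned value)
EV = 400_000.0      # cumulative Earned Value to date
AC = 450_000.0      # cumulative Actual Cost to date

CPI = EV / AC                     # cost performance index to date
EAC = AC + (BAC - EV) / CPI       # CPI-based Estimate at Completion
TCPI = (BAC - EV) / (BAC - AC)    # efficiency needed to finish on the original budget

print(f"CPI  = {CPI:.2f}")
print(f"EAC  = {EAC:,.0f}")
print(f"TCPI = {TCPI:.2f}")
```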

A perfect candidate for Monte Carlo simulation!  Using both EVM and MCS provides a way to capture both the deterministic and the probabilistic EAC.  This allows you to do re-estimations and re-baselines for your project in the planning phase, as well as readjustments within the early completion periods of your project.  As this post from John Goodpasture outlines (a small simulation sketch follows the quote):

The facts are AC (actual cost), BAC (budget at completion), and cumulative EV. The cumulative CPI is a historical fact, but it’s only an estimate of future performance. That’s where the MCS comes in. MCS results may shape our idea about this estimate when compared to the EVM linear equation calculations… Once the MCS value is determined, the equation above is reworked to solve for the future CPI. Now you have two CPI’s: one from the EVM estimate, and one from the MCS re-engineering. What to do now?  The conservative thing is to pick the worst case. The management thing is to determine what needs to be done or changed to bring the CPI into an acceptable range, and then do it.
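
Here is a minimal sketch of what that re-engineering could look like.  The spread of future CPI (a triangular distribution around the cumulative CPI) and the numbers are my own assumptions for illustration, not something prescribed by Goodpasture’s post:

```python
import random

# Minimal Monte Carlo sketch of a probabilistic EAC. The figures and the
# distribution choice are illustrative assumptions only.

BAC = 1_000_000.0   # Budget at Completion
EV = 400_000.0      # cumulative Earned Value
AC = 450_000.0      # cumulative Actual Cost
cpi_to_date = EV / AC

# Assume the CPI on the remaining work varies around the CPI achieved to date
# (triangular: pessimistic low, optimistic high, most likely = CPI to date).
# In practice these bounds would come from your risk analysis.
def sample_future_cpi():
    return random.triangular(0.7 * cpi_to_date, 1.1 * cpi_to_date, cpi_to_date)

trials = sorted(AC + (BAC - EV) / sample_future_cpi() for _ in range(10_000))
eac_p50 = trials[len(trials) // 2]
eac_p80 = trials[int(len(trials) * 0.8)]

# Rework the EAC equation to back out the future CPI implied by the simulated
# EAC, as the quote describes: EAC = AC + (BAC - EV) / CPI_future.
implied_cpi_p80 = (BAC - EV) / (eac_p80 - AC)

print(f"Deterministic EAC: {AC + (BAC - EV) / cpi_to_date:,.0f}")
print(f"MCS EAC (P50 / P80): {eac_p50:,.0f} / {eac_p80:,.0f}")
print(f"Implied future CPI at P80: {implied_cpi_p80:.2f}")
```

Comparing the deterministic EAC against the P50/P80 values, or the CPI achieved to date against the implied future CPI, is exactly the “what to do now” decision the quote describes.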

Here’s a pretty good graphic I found that illustrates what this would look like:

[Figure: EVM with Monte Carlo simulation. Image by VBKom]

In a future article, I will discuss another integration with EVM: statistical process control (namely, Control Charts) to capture the variances and, ultimately, to detect anomalies or the trending of those anomalies, providing a more rigorous indicator of the quality of your projects’ performance.  I’ve actually implemented this integration on two large-scale IT infrastructure projects, so my observations will be first-hand and real-world.
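
As a preview, here is a minimal sketch of the kind of control-chart logic involved, using period-by-period CPI values and classic 3-sigma limits.  The data and the six-period baseline are made up for illustration and are not from those projects:

```python
from statistics import mean, stdev

# Minimal control-chart sketch over period-by-period CPI values. The series and
# the six-period baseline are made up for illustration only.
cpi_by_period = [1.02, 0.98, 1.01, 0.97, 0.99, 1.03, 0.95, 0.78, 0.80, 0.76]

baseline = cpi_by_period[:6]       # assume the first six periods were stable
center = mean(baseline)
sigma = stdev(baseline)
ucl = center + 3 * sigma           # classic 3-sigma control limits
lcl = center - 3 * sigma

for period, cpi in enumerate(cpi_by_period, start=1):
    flag = "  <-- out of control" if not (lcl <= cpi <= ucl) else ""
    print(f"Period {period:2d}: CPI = {cpi:.2f}{flag}")

print(f"Center = {center:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
```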
