Guest Blog By: Michael Gallo
In part 1, our guest blogger discussed the wisdom of sharpening your pencil on cost build-ups before applying margin, as well as the wisdom of asking the right questions to ensure that your bid, and your delivery of services, are solving the right problem. Here, in part 2, we explore how to be more competitive by learning from previous projects and using those lessons to win the next one.
(These are the comments of Michael Gallo, owner and principal consultant of Valerisys Consulting, www.valerisys.com. He has over twenty years of experience in cost analysis, forecasting, and risk-based decision metrics.)
I've been a career cost analyst for over 20 years on the buyer side, working for either program offices or agencies charged with assessing the credibility of 'the number' (i.e., a program's cost/budget). Over time, two things become apparent to a cost analyst.
First, it doesn't take long for a career cost analyst to develop instincts that tell them which programs are doomed to fail. Second, program managers already know they're underfunded and are swimming against the current. More analysis telling the program manager that they're underfunded is just an additional statement of the obvious.
The high-profile program fiascos that occur are almost always associated with some sort of development (Navy A-12, Denver Airport baggage system, Boston's 'Big Dig' tunnel, NRO FIA program, Air Force SBIRS program, Marine Corps Advanced Amphibious Assault Vehicle, Army's Future Combat System, IRS Tax System Modernization, FAA radar system, DHS SBInet). Policy-makers charged with oversight tend to conclude that the forecasting of realistic and credible budgets is inadequate. They believe the answer is to ask for more detailed data and definition, more thorough cost analyses, and tighter adherence to rigorous forecasting processes.
I don't agree with that.
There is already a dedicated cadre of very smart and hard-working professionals who commit their entire careers to improving models for forecasting budget requirements. In the DoD, some programs may be subjected to four levels of detailed cost analysis and review as the program wends its way through the acquisition system. The problem isn't 'the number'. Applying more analysis, more detailed forecasts, and more oversight will not address the core problems of failed programs and budget overruns. Rather, to borrow a phrase from computer science, we're wrestling with 'garbage in, garbage out'.
As a community, we're failing to adequately address the inputs into forecast models. We're failing to define and articulate a development project as a series of questions that elicit answers about what's known, unknown, and unknowable in terms of what's really needed to successfully produce a system that satisfies the customer's requirements.
We've applied factory-based industrial engineering approaches to planning and forecasting complex development projects. That approach assumes the existence of known, repeatable, well-defined processes. The reality of a development program is that solutions aren't completely known at the start. We're fundamentally failing, during the project planning stages, to articulate and address the unknowns and unknowables that arise in development projects. Instead, we ask for specific and detailed plans for use in managing and forecasting these projects. (In some cases we allow for some 'fuzziness' in our estimates of time and resources to address 'risk'.) In fact, if a vendor fails to demonstrate (some might say fake) that they know what they're doing by providing a very specific and detailed plan, they may limit their chances of winning the work.
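To make the 'fuzziness' idea concrete: one common (though by no means sufficient) way cost analysts express uncertainty is a three-point estimate, such as the classic PERT formula. The sketch below is an illustration of that general technique, not a prescription from this article; the task and the numbers are hypothetical.

```python
def pert_estimate(optimistic: float, most_likely: float, pessimistic: float) -> float:
    """Classic PERT (beta-distribution) expected value for a task estimate."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

def pert_std_dev(optimistic: float, pessimistic: float) -> float:
    """Rough PERT standard deviation, a crude measure of the estimate's spread."""
    return (pessimistic - optimistic) / 6

# Hypothetical task: labor hours for an integration effort.
expected = pert_estimate(120, 180, 360)  # 200.0 hours
spread = pert_std_dev(120, 360)          # 40.0 hours
```

Note that a single expected value with a spread still only papers over uncertainty; it does not, by itself, address the unknowns and unknowables the text describes.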
Companies know they must demonstrate past performance and must build credible budget forecasts grounded in data. Buyers ask for 'past performance' information to confirm that the vendor has successfully performed similar work for other customers on previous projects. Buyers also want to know that the vendor's bid is credible, so they thoroughly review the bidder's 'basis of estimate'. Unfortunately, too much emphasis is placed on results and not enough on understanding the vendor's ability to anticipate, address, and overcome the risks, unknowns, and unknowables that arise in any development project.
A successfully tested and mature system design reflects not only the satisfaction of customer requirements; it also reflects the decisions the vendor made to address specific problems and issues encountered during the design process. It even reflects the vendor's understanding of what kinds of specific tests must be performed on the system. Unfortunately, that knowledge (the answer to the question of why the design is the way it is) isn't necessarily captured comprehensively and systematically by the vendor. The artifacts might exist (test plans, algorithms, design specifications, etc.), but the wisdom (the “why?”) most likely remains locked in the heads of their very talented staff as tacit knowledge.
What does all this mean for the small business that develops products for the federal market? I believe that in order to win and prosper, its project planning strategies must address winnability, profitability, and executability. All three must be present or the business will not prosper. Hard accounting and engineering data is a good start toward better project planning, but it isn't enough on its own. Your bid and your general offering need all three.
More Advice: If a vendor can learn to systematically extract more knowledge and insight from their projects (both the successes AND the failures), I believe they will gain a tremendous advantage in the marketplace to plan, forecast, and win projects. Develop a robust planning and forecasting system. Feed it hard data like costs, hours, materials, schedule, and product metrics and features. Then supplement the system with additional knowledge and insights. Elicit answers to the following kinds of questions to better understand how employees anticipate, define, and tackle uncertainty in their projects:
· What have we learned previously (from similar or precursor projects)?
· What specific knowledge and data were acquired, and how are they being leveraged now?
· What problems and risks have we and our team encountered? How are we able to address them now?
· Could those problems and risks have been foreseen? Show how you will anticipate them in the future and how (if selected) the agency will benefit as a result.
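The answers to questions like these can be captured in a structured form and fed back into the planning and forecasting system alongside the hard data. As one hypothetical sketch (the schema, field names, and numbers are all illustrative assumptions, not anything prescribed by this article), each lesson could record the risk encountered, whether it was foreseeable, how it was resolved, and the estimate-versus-actual effort, so that past overruns can calibrate the next bid:

```python
from dataclasses import dataclass

@dataclass
class LessonRecord:
    """One captured lesson from a completed project (hypothetical schema)."""
    project: str
    risk: str            # the problem or risk encountered
    foreseeable: bool    # could it have been anticipated?
    resolution: str      # how the team addressed it
    est_hours: float     # hours originally estimated for the affected task
    actual_hours: float  # hours actually spent

    @property
    def overrun_factor(self) -> float:
        # Ratio of actual to estimated effort for this task.
        return self.actual_hours / self.est_hours

def average_overrun(lessons: list[LessonRecord]) -> float:
    """Mean overrun factor -- a crude calibration input for the next bid."""
    return sum(l.overrun_factor for l in lessons) / len(lessons)

# Illustrative history from two hypothetical past projects.
history = [
    LessonRecord("Sensor upgrade", "vendor API churn", True,
                 "pinned the interface spec early", 400, 520),
    LessonRecord("Radar retrofit", "late test-range access", False,
                 "added a second shift", 300, 360),
]
factor = average_overrun(history)  # about 1.25: adjust similar estimates upward
```

The quantitative part (the overrun factor) is the easy half; the qualitative fields (`risk`, `foreseeable`, `resolution`) are what preserve the “why?” that otherwise stays locked in staff heads as tacit knowledge.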
Conclusion to the two-part series on outmaneuvering your competition:
Being small can sometimes mean you have an opportunity to be smarter, more nimble, and perhaps more honest about your organization’s strengths and weaknesses. That is what we’re advocating. If your firm is small, leverage that to your advantage by learning as much as possible from both project successes and the occasional setback. A project setback isn’t a true failure if it contributes to organizational knowledge that positions your company for the next big win. As a result, your company will stand out from the rest.