This is the accessible text file for GAO report number GAO-06-110 
entitled 'Best Practices: Better Support of Weapon System Program 
Managers Needed to Improve Outcomes' which was released on December 1, 
2005. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to the Subcommittee on Readiness and Management Support, 
Committee on Armed Services, U.S. Senate: 

United States Government Accountability Office: 

GAO: 

November 2005: 

Best Practices: 

Better Support of Weapon System Program Managers Needed to Improve 
Outcomes: 

GAO-06-110:

GAO Highlights: 

Highlights of GAO-06-110, a report to the Subcommittee on Readiness and 
Management Support, Committee on Armed Services, U.S. Senate: 

Why GAO Did This Study: 

The Department of Defense (DOD) relies on a relatively small cadre of 
officials to develop and deliver weapon systems. In view of the 
importance of DOD’s investment in weapon systems, we have undertaken an 
extensive body of work that examines DOD’s acquisition issues from a 
perspective that draws lessons learned from the best commercial product 
development efforts to see if they apply to weapon system acquisitions. 
In response to a request from the Chairman and Ranking Minority Member 
of the Subcommittee on Readiness and Management Support, Senate 
Committee on Armed Services, this report assesses (1) how successful 
commercial companies position their program managers, (2) how DOD 
positions its program managers, and (3) underlying reasons for the 
differences. In compiling this report, GAO conducted a survey of 
program managers. See GAO-06-112SP. 

What GAO Found: 

U.S. weapons are among the best in the world, but the programs to 
acquire them often take significantly longer and cost more money than 
promised and often deliver fewer quantities and capabilities than 
planned. It is not unusual for estimates of time and money to be off by 
20 to 50 percent. When costs and schedules increase, quantities are 
cut, and the value for the warfighter--as well as the value of the 
investment dollar--is reduced. 

When we examined private-sector companies that developed complex, 
technical products similar to DOD's, we found that their success hinged 
on the tone set by leadership and on disciplined, knowledge-based 
processes for product development and execution. More specifically, 
long before the initiation of a new program, senior company leaders 
made critical investment decisions about the firm’s mix of products so 
that they could commit to programs they determined best fit within 
their overall goals. These decisions considered long-term needs versus 
wants as well as affordability and sustainability. Once high-level 
investment decisions were made, senior leaders ensured that programs 
did not begin unless they had a business case demonstrating that the 
resources needed to execute the program--time, technology, money, and 
people--were in hand. Once a business case was established, senior leaders tasked 
program managers with executing that business case for each new product 
from initiation to delivery, but required their program managers to use 
a knowledge-based product development process that demanded appropriate 
demonstrations of technology, designs, and processes at critical 
junctures. The program manager was empowered to execute the business 
case, but also held accountable for delivering the right product at the 
right time for the right cost. Requiring the program manager to stay 
throughout the length of a project was a principal means of enforcing 
accountability. Overall, by providing the right foundation and support 
for program managers, the companies we visited were able to 
consistently deliver quality products within targets, and in turn, 
transform themselves into highly competitive organizations. 

DOD program managers are put in a very different situation. DOD 
leadership rarely separates long-term wants from needs based on 
credible, future threats. As a result, DOD starts many more programs 
than it can afford--creating a competition for funds that pressures 
program managers to produce optimistic cost estimates and to 
overpromise capabilities. Moreover, our work has shown that DOD allows 
programs to begin without establishing a formal business case. And once 
they begin, requirements and funding change over time. In fact, program 
managers personally consider requirements and funding instability--which 
occur throughout the program--to be their biggest obstacles to success. 
Program managers also believe that they are not sufficiently empowered 
to execute their programs, and that because much remains outside of 
their span of control, they cannot be held accountable. 

What GAO Recommends: 

GAO recommends the Secretary of Defense develop an investment strategy 
to prioritize needed capabilities; require senior stakeholders to 
formally commit to business cases for new weapon system developments; 
and develop a process to instill and sustain accountability for 
successful program outcomes. DOD agreed with our recommendations. 

www.gao.gov/cgi-bin/getrpt?GAO-06-110
www.gao.gov/cgi-bin/getrpt?GAO-06-112SP 

To view the full product, including the scope
and methodology, click on the links above.
For more information, contact Michael J. Sullivan at (202) 512-4841 or 
sullivanm@gao.gov. 

[End of section]

Contents: 

Letter: 

Executive Summary: 

Purpose: 

Background: 

Results in Brief: 

Best Practice: Corporate Leadership and Disciplined, Knowledge-Based 
Processes Are Critical to Program Manager Success: 

DOD: Critical Support Factors Are Missing: 

Differences in Incentives Contribute to Differences in Support for 
Program Managers: 

Chapter 1: Introduction: 

Long-Standing Problems Hamper Weapons Systems Acquisitions: 

DOD Program Managers Are Central Executors of the Acquisition Process: 

Legislation to Improve Program Manager Proficiency: 

Objectives, Scope, and Methodology: 

Chapter 2: Senior Leader Support and Disciplined Knowledge-Based 
Processes Are Critical Enablers for Program Managers: 

Senior Leadership Provides Program Managers with a Strong Foundation 
for Success: 

Knowledge-Based Process Followed to Execute Programs: 

Continued Senior Leadership during Product Development Further Enabled 
Success: 

Chapter 3: DOD Is Not Supporting Its Program Managers Effectively: 

Senior Leadership Does Not Provide a Strong Foundation for Success: 

Execution in DOD Does Not Provide Adequate Support and Accountability: 

Senior Leader Support during Execution: 

Chapter 4: Basic Incentives Drive Differences in How Program Managers 
Are Supported and Held Accountable: 

Definition of Success: 

Means for Success: 

Other Differences Put Additional Pressures on DOD Program Managers: 

Chapter 5: Conclusions and Recommendations: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendix I: Comments from the Department of Defense: 

Appendix II: GAO Staff Acknowledgments: 

Related GAO Products: 

Tables: 

Table 1: Acquisition Categories: 

Table 2: Are Best Practices Present in DOD? 

Table 3: Technology Maturity and Program Outcomes: 

Table 4: Are Best Practices Present in DOD? 

Table 5: Program Manager Views on Formal vs. Informal Authority: 

Figures: 

Figure 1: Critical Support and Accountability Factors: 

Figure 2: 2005 Toyota Avalon: 

Figure 3: Siemens Bi-Plane AXIOM Artis: 

Figure 4: Best Practice Roles, Responsibilities, and Behaviors of 
Senior Managers: 

Figure 5: Breakdowns in Support and Accountability Factors: 

Figure 6: Highlights of Program Manager Comments Regarding Competition 
for Funding: 

Figure 7: To What Extent Were the Parameters of Your Program Reasonable 
at Program Start? 

Figure 8: How Program Managers Responded to an Open-ended Question on 
What Were the Biggest Obstacles They Faced: 

Figure 9: Highlights of Program Manager Comments on What Types of 
Authority They Need: 

Figure 10: Key Differences in Definition of Success and Resulting 
Behaviors: 

Figure 11: Commercial vs. DOD Oversight Environments: 

Abbreviations: 

DAWIA: Defense Acquisition Workforce Improvement Act: 

DOD: Department of Defense: 

OSD: Office of the Secretary of Defense: 

PEO: program executive officer: 

United States Government Accountability Office: 

Washington, DC 20548: 

November 30, 2005: 

The Honorable John Ensign: 
Chairman: 
The Honorable Daniel K. Akaka: 
Ranking Minority Member: 
Subcommittee on Readiness and Management Support: 
Committee on Armed Services: 
United States Senate: 

As you requested, this report examines how program managers in the 
Department of Defense are supported and how they are held accountable 
for program outcomes. It compares department policies and practices to 
those of leading commercial companies we visited and discusses actions 
DOD could take to improve the accountability of program managers and 
provide them with timely support as they manage the development of 
complex systems. We make recommendations to the Secretary of Defense to 
(1) develop an investment strategy to prioritize needed capabilities, 
(2) require, for each new program, that senior level stakeholders 
formally commit to a business case for program approval at the start of 
a new program, and (3) implement a process to instill and sustain 
accountability for successful program outcomes. 

We are sending copies of this report to the Secretary of Defense; the 
Secretary of the Army; the Secretary of the Navy; the Secretary of the 
Air Force; the Director, Missile Defense Agency; the Director of the 
Office of Management and Budget; and interested congressional 
committees. We will also make copies available to others upon request. 
In addition, the report will be available at no charge on the GAO Web 
site at http://www.gao.gov. 

If you have any questions regarding this report, please call me at 
(202) 512-4841. Staff acknowledgments are listed in appendix II. 

Signed by:

Michael J. Sullivan: 
Director, Acquisition and Sourcing Management: 

[End of section] 

Executive Summary: 

Purpose: 

The Department of Defense (DOD) plans to increase its investment in the 
research, development, and procurement of new weapon systems from $144 
billion in fiscal year 2005 to $185 billion in fiscal year 2009. U.S. 
weapons are among the best in the world, but the programs to acquire 
them often take significantly longer and cost more money than promised 
and often deliver fewer quantities and capabilities than planned. 
It is not unusual for estimates of time and money to be off by 20 to 50 
percent. When costs and schedules increase, quantities are cut, and the 
value for the warfighter--as well as the value of the investment 
dollar--is reduced. 

In view of the importance of DOD's investment in weapon systems, we 
have undertaken an extensive body of work that examines DOD's 
acquisition issues from a different, more cross-cutting perspective-- 
one that draws lessons learned from the best commercial product 
development efforts to see if they apply to weapon system acquisitions. 
In response to a request from the Chairman and Ranking Minority Member 
of the Subcommittee on Readiness and Management Support, Senate 
Committee on Armed Services, this report assesses (1) how successful 
commercial companies position their program managers, (2) how DOD 
positions its program managers, and (3) underlying reasons for the 
differences. 

Background: 

DOD relies on a relatively small cadre of military and civilian 
officials--known as program managers--to lead the development and 
delivery of its weapon systems. The responsibility placed on this group 
is enormous. The systems that program managers are responsible for 
range from highly complex and sophisticated aircraft, missile 
interceptors, submarines, and space-based sensors, to new communication 
and ground control systems that support and interconnect this 
equipment, to smaller, less complex systems that support the 
warfighter. In these times of asymmetric threats and netcentricity, 
individual weapon system investments are getting larger and more 
complex. The development process itself is very challenging as many 
systems require successful management and coordination of a broad array 
of military service and DOD officials, outside suppliers, internal and 
external oversight entities, as well as technical, business, 
contracting, and management expertise. Moreover, in many cases, weapon 
systems are also expected to incorporate technologies that push the 
state-of-the-art while operating in harsh and even untested 
environments--adding daunting technical challenges to the already 
existing business, management, and logistical challenges. Lastly, GAO 
has reported that many of the business processes that support weapons 
development--strategic planning and budgeting, human capital 
management, infrastructure, financial management, information 
technology, and contracting--are beset with pervasive, decades-old 
management problems, which include outdated organizational structures, 
systems, and processes.[Footnote 1] 

Weapon system program managers are the central executors of the 
acquisition process. They are responsible for all aspects of 
development and delivery of a new system and for assuring that systems 
are high quality, affordable, supportable, and effective. In carrying 
out this responsibility, they are also responsible for balancing 
factors that influence cost, schedule, and performance. DOD employs 
about 729 program managers to run its weapons programs. Both military 
officers and civilians serve as program managers, but the majority are 
from the military. DOD's program managers typically report to program 
executive officers (PEO) who are charged with overseeing the execution 
of a portfolio of related systems. PEOs, in turn, typically report to a 
military service acquisition executive, who reports to a service 
secretary, or for some programs, the PEO reports to the Defense 
Acquisition Executive. 

Results in Brief: 

Program managers from the leading companies we spoke with believed that 
two critical enablers--(1) support from top leadership and (2) 
disciplined, knowledge-based processes for product development 
execution--empowered them to succeed in delivering new products when 
needed within cost, quality, and performance targets originally set by 
the company. Long before the initiation of a new product development, 
senior company leaders made critical strategic investment decisions 
about the firm's mix of products and the return on investment they 
might yield. Once high-level investment decisions were made, senior leaders 
ensured that programs did not begin unless they had a business case 
that demonstrated the program was aligned with the company's goals and 
that resources were in-hand to execute the program--that is, time, 
technology, money, and people. Once a business case was established, 
senior leaders tasked program managers with executing that business 
case for each new product from initiation to delivery, but required 
their program managers to use a knowledge-based product development 
process that demanded appropriate demonstrations of technology, 
designs, and processes at critical junctures. The program manager was 
empowered to execute the business case, but also held accountable for 
delivering the right product at the right time for the right cost. 
Throughout execution, company senior leaders supported their program 
managers by encouraging open and honest communication and continually 
assured that the right levels of resources and management attention 
were available for the project. 

While DOD has taken action in recent years to better position programs 
for success, it puts its program managers in a very different 
situation. Program managers themselves believe that rather than making 
strategic investment decisions, DOD starts more programs than it can 
afford and rarely prioritizes them for funding purposes. The result is 
a competition for funds that creates pressures to produce optimistic 
cost and schedule estimates and to overpromise capability. Our own work 
has shown that many programs begin without a business case, that is, 
without adequate knowledge about technology, time, and cost and without 
demonstrating that the program itself is the optimal approach for 
achieving a needed capability. Moreover, once programs begin, the 
program manager is not empowered to execute the program. In particular, 
program managers cannot veto new requirements, control funding, or 
control staff. In fact, program managers personally consider 
requirements and funding instability to be their biggest obstacles to 
success. Program managers also believe that they are not sufficiently 
supported once programs begin. In fact, they must continually advocate 
for their programs in order to sustain support. Our past reports also 
show that programs are incentivized to suppress bad news and to 
produce optimistic estimates--largely due to continual competition for 
funding. 

Many of these differences can be attributed to how success is defined 
within the commercial and DOD environment. Success for the commercial 
world is straightforward and simple: maximize profit. In turn, this 
means selling products to customers at the right price, the right time, 
and the right cost. With this imperative in hand, companies have no 
choice but to adopt processes and cultures that emphasize basing 
decisions on knowledge, reducing risks prior to undertaking new 
efforts, producing realistic cost and schedule estimates, and assuring 
consistency and quality pervade all efforts. At first glance, DOD's 
definition of success is very similar: deliver capability to the 
warfighter at the right price, the right time, and the right cost. But, 
for various reasons, it is clear that the implied definition for 
success is to attract funds for new programs and to keep funds for 
ongoing programs. While the annual appropriations process and the wide 
variety of mission demands placed on DOD contribute to this condition, 
DOD has made matters worse by not making hard tradeoff decisions to 
ensure it does not pursue more programs than it can afford. Once 
attracting funds becomes "success," harmful practices emerge. For 
example, it is not in a program manager's interest to develop accurate 
estimates of cost, schedule, and technology readiness, because honest 
assessments could result in lost funding. Delayed testing becomes 
preferred over early testing because that will keep "bad news" at bay. 
In turn, knowing that the data being reported to them may not be 
reliable, senior leaders believe they cannot trust it and must impose 
multiple oversight mechanisms. Any attempts to improve policy and processes 
eventually succumb to funding competition because no one wants to risk 
loss of support. 

We are making recommendations to DOD to better position program 
managers for success. These recommendations focus on providing the 
strategic leadership needed to lay the right foundation for starting 
programs, ensuring that an executable business case is delivered to 
program managers, and holding program managers accountable for 
successful outcomes. It is important to note that the success of 
all of our recommendations hinges on DOD's ability to instill more 
discipline and leadership over the investment process. After a review 
of a draft of this report, DOD concurred with our recommendations and 
provided some additional comments. The full text of DOD's comments may 
be found in appendix I. 

Best Practice: Corporate Leadership and Disciplined, Knowledge-Based 
Processes Are Critical to Program Manager Success: 

At all of the companies we visited, support for program managers began 
well before they were assigned to a new product development effort-- 
with high-level strategic planning and investment decisions and 
concerted efforts to make sure that any new initiative the company 
undertook was achievable within the time, money, and other resources 
the company had available. Technology development and program advocacy 
were also generally kept out of a program manager's domain. Once new 
efforts got off the ground, program managers were empowered to manage 
resources, and encouraged to bring up problems, propose solutions, and 
consult with senior leaders without fear of losing their support. At 
the same time, however, they were expected to base their decisions on 
hard data and to assure the right knowledge was in-hand before 
proceeding into the next phases of development. They were also held 
accountable for their choices, though companies generally found that 
with good pre-program decisions, a good launch, a sound, disciplined 
process for execution, and continued support, there was little need to 
punish or remove their program managers. Ultimately, as long as a 
program manager could deliver the right product at the right time for 
the right cost, he or she was incentivized to do so without interference from 
above. 

According to commercial program managers we spoke with, the most 
critical support factors included the following: 

* Investment strategies. Each of the companies we visited followed a 
rigorous process to forecast market needs against company resources, 
economic trends, available technologies, and its own strategic vision. 
These exercises culminated in short- and long-term investment strategies 
that provided program managers with confidence that the company was 
committed to their particular program and showed them where the project 
fit within overall corporate goals. 

* Evolutionary development. All of the companies followed an 
incremental path toward meeting market needs rather than attempting to 
satisfy all needs in a single step. This provided program managers with 
more achievable requirements, which, in turn, facilitated shorter cycle 
times. With shorter cycle times, the companies could ensure both 
program managers and senior leaders stayed with programs throughout the 
duration. 

* Matching requirements to resources. Once specific product concepts 
were identified, the companies worked vigorously to close gaps between 
requirements (customer needs) and resources--time, money, and 
technology. In effect, this took the investment strategy down to a 
project level, assuring that the program manager would be well 
positioned to execute within cost and schedule. 

* Matching the right people to the program. All of the companies we 
visited took steps to ensure that they assigned the right people to the 
right programs. These included long-term efforts to train and groom 
technical staff into program managers, mentoring on the part of senior 
leaders with program management experience, handpicking program 
managers based on their expertise and experience, and supporting 
program managers with teams of highly qualified functional and 
technical experts. 

* Knowledge-driven development decisions. Once a new product 
development began, program managers and senior leaders used 
quantifiable data and demonstrable knowledge to make go/no-go 
decisions. These covered critical program facets such as cost, 
schedule, technology readiness, design readiness, production readiness, 
and relationships with suppliers. Development was not allowed to 
proceed until certain thresholds were met, for example, a high 
percentage of engineering drawings completed or production processes 
under statistical control. Development processes were also continually 
tailored based on lessons learned. Program managers themselves placed 
high value on these requirements, as they ensured programs were well 
positioned to move into subsequent phases and were less likely to 
encounter disruptive problems. 

* Empowerment. At all the companies we visited, program managers were 
empowered to make decisions as to whether programs were ready to move 
forward and to resolve problems and implement solutions. They could 
redirect available funding, if needed. They could change team members. 
Prior to development, they often had a say in what requirements they 
would be handed. 

* Accountability. With authority came accountability. Program managers 
at all of the companies we visited were held accountable for their 
choices. To assure accountability, senior leaders set goals that were 
clear to the entire project team and provided incentives for program 
managers and others to meet those goals. 

* Tenure. All of the companies we visited required that program 
managers stay on until the end of the program. This was a primary means 
of assuring accountability. 

* Continued senior leadership. Beyond empowerment, program managers 
credited senior leaders with other vital forms of support: their 
commitment to programs was unwavering, they trusted their program 
managers, they encouraged them to share bad news, and they fostered 
collaboration and communication. At the end 
of the day, it was the senior leaders' job to anticipate and remove 
obstacles and provide the right levels of support so that the path was 
cleared for the program manager to execute the program. 

DOD: Critical Support Factors Are Missing: 

At DOD, program managers are not put in a position to deliver a product 
within estimates, nor are they held accountable when they fail to do 
so. While senior leaders work hard to 
develop a short- and long-term vision for the defense of the United 
States, these visions are rarely translated into realistic investment 
strategies that assure the right mix of programs is being pursued. 
Moreover, while recognized in policy as a best practice, DOD does not 
always make sure that there is a business case for new initiatives. 
Lastly, program managers are not empowered to execute programs once 
they begin or held accountable when programs get off track. 

The primary problem, cited by many program managers and verified by 
GAO's work, is that DOD starts more programs than it can afford and 
does not prioritize programs for funding. This creates an environment 
where programs must continually compete for funding. Before programs 
are even started, advocates are incentivized to underestimate both cost 
and schedule and overpromise capability. 

A second problem is that gaps between resources and requirements are 
not closed before or even during program development. For example, we 
have reported that DOD allows many programs to go forward without 
knowing whether critical technologies--such as satellite's main sensor, 
a fighter aircraft's stealth technology, a new tank's networking 
capability--can work as intended. Invariably, when programs start with 
such unknowns, they spend a great deal of time and money later on 
fixing technical glitches while simultaneously trying to get other 
program aspects on track. One reason programs begin with immature 
technologies is that program advocates rush to start the acquisition 
program because doing so assures at least an initial commitment 
of funding. Compounding this problem is the fact that acquisition 
programs tend to attract funds over other activities, including science 
and technology efforts that ultimately support acquisition. As a 
result, program managers are incentivized to take on tasks that really 
should be accomplished within a laboratory environment, where it is 
easier and cheaper to discover and address technical problems. 

A third problem is that program managers themselves are not empowered 
to execute their programs. First, they have little control over funding 
and they cannot count on funding to be stable. When funding is taken 
away, program managers often find themselves in a negative spiral of 
funding-related problems--particularly because they have already made 
commitments to contractors based on certain anticipated levels of 
funding. Second, they cannot veto new requirements. Faced with long 
development life cycles and promising technology advances, users often 
ask for new or better capabilities as a program proceeds forward. 
Program managers themselves are not always empowered to say "no" to 
demands that may overly stretch their programs, and few senior leaders 
above them have been willing to do so. In addition, program managers 
have little authority over staffing and little ability to shift funds 
within the program. With so much outside their span of control, program 
managers say that DOD is unable to hold them accountable when programs 
get off track. Another reason that it is difficult to hold program 
managers accountable is that their tenure is relatively short. The 
problems being encountered today may well be the result of a poor 
decision made years ago by another program manager. 

DOD has tried to improve its processes and policies to better position 
programs for success. For example, policies embrace the concept of 
closing gaps between requirements and resources before launching new 
programs, and DOD is making changes to requirements setting and funding 
processes in an attempt to strengthen investment decisions. At this 
point, however, program managers do not see trade-offs being made in 
the front-end of product development that would ensure DOD could fully 
commit to their programs and allow program managers themselves to focus 
solely on executing their programs. The level of trust, collaboration, 
and communication is low, while the level of oversight and second-
guessing is high. 

Differences in Incentives Contribute to Differences in Support for 
Program Managers: 

Differences between how program managers are supported and held 
accountable are rooted in differences in incentives and resulting 
behaviors. This begins with the definition of success. The commercial 
firms we studied concluded their survival hinged on their ability to 
increase their market share, which, in turn, meant developing higher 
quality products, at the lowest possible price, and delivering them in 
a timely fashion--preferably before their competitors could do the 
same. This imperative meant that they had no choice but to narrow the 
gap between requirements and resources in a manner that not only 
ensured they met their market targets but also consumed resources 
efficiently. It also meant that they had no 
choice but to fully support the development effort, instill strategic 
planning and prioritization, work collaboratively, follow a knowledge- 
based process that makes product development manageable, and, 
ultimately, make everyone accountable for success. Ultimately, the 
companies developed processes that embodied these tenets for success. 
At the strategic level, these included rigorous planning and 
prioritization to ensure the right mix of products was pursued, and 
strong systems engineering to help them establish a realistic business 
case. At the tactical level, companies created product development 
processes that required certain thresholds of knowledge to be gained 
before a decision to proceed was made. 

In theory, DOD's success likewise hinges on its ability to deliver high-
quality weapons to the warfighter in a timely fashion. But in practice, 
success is defined as the ability of a program to win support and 
attract funds. Of course, there are reasons for this disconnect. 
Corporate revenue is generated by customer sales while DOD's funding is 
dependent on annual appropriations. Corporations go out of business 
when their product development efforts do not succeed; DOD does not. 
Selling products to customers is the single focus of a private-sector 
company while DOD is charged with a myriad of important missions--each 
of which also competes for budget share. Nevertheless, these conditions 
create a vastly different set of processes and behaviors affecting 
program managers. Program managers are incentivized, for example, to be 
optimistic and suppress bad news because doing otherwise could result 
in a loss of support and funding and further damage their program. In 
short, unknowns become acceptable and desirable rather than 
unacceptable, as they are in the corporate environment. And 
accountability becomes much more difficult to define. 

[End of section] 

Chapter 1: 
Introduction: 

DOD plans to spend about $1.3 trillion for its major programs between 
2005 and 2009 and increase its investment in research and development 
during that period by about 28 percent--from $144 billion to $185 
billion. Although DOD's weapons are widely regarded as unrivaled in 
superiority, DOD has not received a predictable return on investment in 
major weapon systems acquisitions. For decades, many of DOD's weapon 
systems acquisitions have experienced large cost increases and extended 
schedules, which, in turn, have jeopardized performance and, more 
broadly, undermined DOD's buying power. 

To help better position DOD to successfully field weapons, we have 
undertaken a body of work over the past decade that has examined 
lessons learned from the best commercial product development efforts to 
see if they can be applied to DOD weapon system development. Leading 
commercial firms have developed increasingly sophisticated products in 
significantly less time and at lower costs. Our previous best practices 
reports[Footnote 2] have examined such topics as matching resources 
with requirements, controlling total ownership costs, effective use of 
testing, and product development. This report examines the program 
manager's role and the mechanisms that DOD and leading commercial 
companies use to position program managers for success and hold them 
accountable. As the central executor of the acquisition process, DOD 
depends on its program managers to efficiently and effectively run its 
wide range of complex weapon systems acquisitions. 

The challenge that program managers now face is massive. Weapon systems 
themselves are becoming increasingly sophisticated and interdependent 
and, therefore, more complicated and difficult to develop. At the same 
time, however, DOD is faced with threats that are constantly evolving, 
requiring quicker development cycles and more flexibility within 
weapons programs. Moreover, many of the business processes that support 
weapons development--strategic planning and budgeting, human capital 
management, infrastructure, financial management, information 
technology, and contracting--are beset with pervasive, decades-old 
management problems, including outdated organizational structures, 
systems, and processes. In fact, these areas--along with weapons system 
acquisitions--are on GAO's high-risk list of major government programs 
and operations. Lastly, while DOD plans to considerably ramp up weapons 
system spending in the next 5 years in an effort to dramatically 
transform how it carries out its military operations, it is likely to 
face considerable pressure to reduce its investment in new weapons as 
the nation addresses long-term fiscal imbalances. 

Long-Standing Problems Hamper Weapons Systems Acquisitions: 

While DOD's acquisition process has produced weapons that are among the 
best in the world, it also consistently yields undesirable 
consequences--such as cost increases, late deliveries to the 
warfighter, and performance shortfalls. Such problems have been 
highlighted, for example, in our past reviews of DOD's F/A-22 Raptor, 
Space-Based Infrared System, Airborne Laser, the Joint Strike Fighter, 
and other programs. Our past work has found that problems occur because 
DOD's weapon programs do not capture early on the requisite knowledge 
that is needed to efficiently and effectively manage program risks. For 
example, programs move forward with unrealistic cost and schedule 
estimates, lack clearly defined and stable requirements, use immature 
technologies to launch the product development, and fail to solidify 
design and manufacturing processes at appropriate junctures in 
development. 

When costs and schedules increase, quantities are cut and the value for 
the warfighter, as well as the value of the investment dollar, is 
reduced. Moreover, in these times of asymmetric threats and 
netcentricity, individual weapon system investments are getting larger 
and more complex. Just 4 years ago, the top five weapon systems cost 
about $281 billion; today, in the same base year dollars, the top five 
weapon systems cost about $521 billion.[Footnote 3] If these 
megasystems are managed with traditional margins of error, the 
financial consequences--particularly the ripple effects on other 
programs--can be dire. 

DOD has long recognized such problems and initiated numerous 
improvement efforts. In fact, between 1949 and 1986, five commissions 
studied issues such as cycle time and cost increases as well as the 
acquisition workforce. DOD has also undertaken a number of acquisition 
reforms. Specifically, DOD has restructured its acquisition policy to 
incorporate attributes of a knowledge-based acquisition model and has 
reemphasized the discipline of systems engineering. In addition, DOD 
recently introduced new policies to strengthen its budgeting and 
requirements determination processes in order to plan and manage 
systems based on joint warfighting capabilities. While these policy 
changes are positive steps, we recently testified that implementation 
in individual programs has not occurred because of inherent funding, 
management, and cultural factors that lead managers to develop business 
cases for new programs that overpromise on cost, delivery, and 
performance of weapon systems. 

DOD Program Managers Are Central Executors of the Acquisition Process: 

DOD relies on a cadre of military and civilian officials--known as 
program managers--to lead the development and delivery of hundreds of 
weapon systems and subsystems. The services report a combined total of 
729 program managers currently executing programs at all acquisition 
category levels. The systems that program managers are responsible for 
range from highly sophisticated air, land, sea, and space-based systems 
to smaller, less complex communications or support equipment that 
interconnects or supports larger systems. Program managers are 
responsible for ensuring that these systems are reliable, affordable, 
supportable, and effective. They carry out multiple roles and 
responsibilities and are expected to have a working knowledge of such 
diverse areas as contracting, budgeting, systems engineering, and 
testing. 

DOD classifies its acquisition programs into categories based upon a 
number of factors, such as their size, cost, complexity, and importance. 
The largest, most complex, and most expensive programs generally fall under 
the responsibility of the Under Secretary of Defense (Acquisition, 
Technology and Logistics) while less complex and risky programs are 
overseen by the service or component acquisition executive. Table 1 
provides more details. 

Table 1: Acquisition Categories: 

Acquisition category: Category I;
Definition: Research, development, test, and evaluation > $365M 
Procurement > $2.19B;
Milestone decision authority: ID: Under Secretary of Defense 
(Acquisition, Technology and Logistics);
IC: Service Acquisition Executive;
Program examples: Future Combat System; DD(X) Destroyer; B-1 Aircraft. 

Acquisition category: Category II;
Definition: Research, development, test, and evaluation > $140M 
Procurement > $660M;
Milestone decision authority: Service or Component Acquisition 
Executive;
Program examples: All Source Analysis System; KC-130J Aircraft; Joint 
Surveillance and Target Attack Radar System. 

Acquisition category: Category III;
Definition: No fiscal criteria;
Program examples: 10k W Auxiliary Power Unit; Assault Breaching 
Vehicle; C-5 Avionics. 

Acquisition category: Category IV;
Definition: No fiscal criteria (Navy and Marine Corps only);
Program examples: C-130 Night Vision Lighting; Advanced Recovery 
Control System. 

Source: GAO.

Note: Category I systems are referred to as "programs" and smaller 
related subsystems are called "projects" or "products." For example, 
the Air Force's B-1 aircraft system, a category IC program, includes 
category II and III projects that may have a designated manager. 
Category ID and IC programs are distinguished by their milestone 
decision authority. 

[End of table] 

Program managers typically supervise a large staff of engineers, 
contracting personnel, logisticians, and business, financial, and 
administrative personnel. The number of people assigned to program 
offices varies widely and depends on factors such as the complexity of 
the system, the category level, and the availability of staff. For 
example, the Joint Strike Fighter, a category ID program, is managing 
the development of three configurations of a new aircraft for the Navy, 
Marines, and Air Force, and currently has about 200 government and 
international personnel assigned. By contrast, the Light Utility 
Helicopter, a category II project relying largely on commercial 
off-the-shelf components, has a staff of 34. 

To successfully deliver a weapon system to the user, program managers 
must also work with a range of individuals outside their sphere of 
influence, such as those charged with independent cost estimating, 
testing, funding, requirements writing, security, and interoperability. 
Simultaneously, the program manager is responsible 
for overseeing, integrating, and evaluating the defense contractor's 
work as the development progresses. Moreover, some program managers 
lead international teams. For example, the Joint Strike Fighter Program 
Office, in addition to the military, civilian, and contract team 
members, has eight international partners and approximately 40 
international team members. 

The majority of DOD program managers for category I programs are 
military officers at the rank of colonel or (Navy) captain. Subsystem 
program managers are usually lower in rank and report directly to the 
system program manager. DOD also employs civilian program managers, 
usually GS-15s, for its category I programs. As a rule, program managers 
report to a Program Executive Officer--a civilian at the senior 
executive level or military officer at the general officer rank--who 
typically manages a portfolio of related weapon systems. However, some 
program executive officers are responsible for a single large program, 
such as the Joint Strike Fighter or the F-22 aircraft. One level up 
from the program executive officer is the Service Acquisition 
Executive, a civilian (often a political appointee) who reports to the 
service Secretary. Programs classified as category ID report through 
the defense acquisition executive, the Under Secretary of Defense 
(Acquisition, Technology and Logistics), who serves as their milestone 
decision authority. 

Legislation to Improve Program Manager Proficiency: 

Program manager training and tenure are now governed by legislation 
known as the Defense Acquisition Workforce Improvement Act 
(DAWIA),[Footnote 4] enacted in 1990 after studies showed that a key 
problem affecting acquisitions was that program managers did not stay 
in their positions long enough to be accountable for outcomes and that 
many simply lacked the training and experience needed to assume their 
leadership roles. Congress amended the law in the fiscal year 2004 and 
2005 defense authorization acts to allow the Secretary of Defense more 
flexibility to tailor tenure, experience, and education qualifications 
for program managers. 

The act specifically created a formal acquisition corps and defined 
educational, experience, and tenure criteria needed for key positions, 
including program managers as well as contracting officers and others 
involved in the acquisition process. The act also provided for the 
establishment of a defense acquisition university to provide 
educational development and training for acquisition personnel. Under 
DOD regulations program managers are required to attend training and 
meet course requirements through the university in order to meet 
certification requirements for the program management track. 

There are three progressive certification levels: basic, intermediate, 
and advanced. Program managers of major defense acquisition programs 
are required to have Level 3 certification, which requires four years 
of acquisition experience and an advanced-level Defense Acquisition 
University course in program management. DOD prefers that individuals 
with Level 3 certification have a Master's degree in engineering, 
systems acquisition management, or business administration, and 
complete additional external coursework in relevant fields. 

Objectives, Scope, and Methodology: 

The Chairman and the Ranking Member, Subcommittee on Readiness and 
Management Support, Senate Committee on Armed Services, requested that 
we examine best practices and DOD procedures for factors that affect 
program manager effectiveness. Our overall objectives for this report 
were to (1) identify best practices that have enabled organizations to 
successfully position their program managers for success, (2) identify 
DOD practices for supporting program managers and holding them 
accountable, and (3) compare and contrast DOD and commercial practices 
in order to identify possible improvements to DOD practices. 

To identify the best practices and processes that commercial companies 
employ to position their program managers for success, we used a case 
study methodology. We selected companies that, like DOD, research, 
develop, and field products, using program managers as the central 
executors of the programs. Selection of the companies was also based 
upon recognition by the American Productivity and Quality Center and 
the Project Management Institute and the recommendations of experts. 
Below are descriptions of the three companies that are specifically 
featured in this report. 

* Toyota Motor Manufacturing of North America, Inc. 

Toyota Motor Manufacturing of North America, Inc., the third-largest 
automobile producer and fifth-largest industrial company in the world, 
designs, manufactures, and markets cars, trucks, and buses worldwide. 
In 2005, the company reported total net sales of 
$172.7 billion. We met with individuals involved with the development 
of the 2005 Toyota Avalon, a full-size sedan, at Toyota Motor 
Manufacturing in Erlanger, Kentucky. 

* Siemens Medical Solutions USA, Inc. 

Siemens Medical Solutions is one of the world's largest suppliers in 
the healthcare industry. Siemens Medical manufactures and markets a 
wide range of medical equipment, including magnetic resonance imaging 
systems, radiation therapy equipment, ultrasound equipment, and patient 
monitoring systems. We met with individuals from the Angiography, 
Cardiology, and Neurology business unit, located in Hoffman Estates, 
Illinois. 

* Motorola, Inc. 

Motorola is a Fortune 100 global communications leader that provides 
seamless mobility products and solutions across broadband, embedded 
systems and wireless networks. Seamless mobility harnesses the power of 
technology convergence and enables smarter, faster, cost-effective, 
flexible communication in homes, autos, workplaces and all spaces in 
between. Motorola had sales of $31.3 billion in 2004. We visited its 
offices in Arlington Heights, Illinois, and discussed program 
management practices and processes with representatives from the 
Networks sector. 

In addition to the three companies featured in this report, we visited 
two additional successful firms to assess whether they employed similar 
processes and practices for program management. These include Molson 
Coors Brewing Company and Wells Fargo. Both companies have undertaken 
projects that reflect some of the complexity and challenges that a DOD 
weapon systems program would face. For example, we met with managers of 
a Molson project intended to automate day-to-day marketing operations 
for digital assets. We also met with Wells Fargo officials who 
developed an electronic imaging process for paperless check clearance. 
At both companies, we also discussed broader corporate investment 
processes that supported these particular internal projects as well as 
the companies' main service lines. 

For each of the five companies, we interviewed senior management 
officials and program managers to gather consistent information about 
processes, practices, and metrics the companies use to support program 
managers and hold them accountable. In addition to the case studies, we 
synthesized information from GAO's past best practices work about 
product development. 

We also examined key best practices studies related to program 
management, including studies from organizations such as the Project 
Management Institute and the American Productivity and Quality Center. 
Moreover, we relied on our previous best practice studies, which have 
examined incentives and pressures affecting weapon system programs, the 
optimal levels of knowledge needed to successfully execute programs, 
and complementary management practices and processes that have helped 
commercial and DOD programs to reduce costs and cycle time. 

In order to determine DOD practices for supporting program managers and 
holding them accountable, we conducted five separate focus groups 
between July and October 2004. Each group was composed of program 
managers from one of the services or the Missile Defense Agency. A 
total of 28 acquisition category I program managers representing a 
range of DOD programs were identified by their respective services for 
the meetings, held in separate locations in Huntsville, Ala.; El Segundo, 
Calif.; Dayton, Ohio; Arlington, Va.; and Ft. Belvoir, Va. For each focus 
group, the facilitators introduced discussion topics to discover how 
program managers define success, as well as what they are accountable 
for and how they are held accountable. In addition, participants were 
asked to discuss how program managers are supported and what obstacles 
they encounter in performing their duties. 

We analyzed the content of focus group transcripts and used the themes 
we identified to design a survey to gather information about 
acquisition category I and II program managers' perceptions about 
factors that assist or block their success and to more clearly define 
other issues in the DOD acquisition process that affect program manager 
effectiveness. We elicited input from several experts, including 
retired program managers, active-duty members with program management 
experience, and senior acquisition officials, who reviewed the 
questions and provided feedback on the draft survey. 

We pretested the survey with five program managers. During the pretest 
we asked the program managers questions to determine whether (1) the 
survey questions were clear, (2) the terms used were precise, (3) the 
questionnaire placed an undue burden on the respondents, and (4) the 
questions were unbiased. We then incorporated their comments into the 
survey, finalized the questions, and sent the web-based survey to 
acquisition category I and II program managers. We selected the 
category I and II program managers because they manage the more complex 
and expensive programs. We identified the program managers through 
consultation with each of the services. The survey consisted of open- 
ended and close-ended questions concerning support for program managers 
and how they are held accountable for program outcomes. Originally we 
e-mailed 237 program managers but later determined that 52 should not be 
included because they managed programs other than acquisition category 
I and II. Of the 185 remaining program managers, we received completed 
surveys from 69 percent. 

The surveys were conducted using self-administered electronic 
questionnaires posted on the World Wide Web. We sent e-mail 
notifications to all acquisition category I and II program managers on 
April 12, 2005. We then sent each potential respondent a unique 
password and username by e-mail to ensure that only members of the 
target population could participate in the survey. To encourage 
respondents to complete the questionnaire, we began sending e-mail 
messages to prompt each nonrespondent between April 26, 2005, and May 
19, 2005. Additionally, the team contacted nonrespondents through 
telephone calls between May 31, 2005, and July 12, 2005. We closed the 
survey on July 19, 2005. 

In this report we discuss some of the results obtained from the survey. 
A more complete tabulation of survey questions together with tables 
indicating the levels of response can be found on our Web site at 
GAO-06-112SP. The survey contained close-ended questions and open-ended 
questions. We conducted a content analysis of the open-ended questions 
and constructed tables showing the results of the analysis arranged 
into broad categories. Some of the respondents to our survey provided 
more than one answer to the open-ended questions. All responses that 
indicated equally important factors were tabulated in the appropriate 
categories. However, because some respondents provided more than one 
answer, the percentages may add up to more than 100 percent of 
respondents. The web-based report does not contain all the results from 
the survey. For example, we do not report responses for questions about 
demographics, some open-ended questions, or questions with high item 
nonresponse rates. 

In addition to the focus groups and survey, we conducted in-depth 
interviews with individual program managers, program executive officers 
from across the services, as well as program managers from Boeing and 
Lockheed Martin for two major weapon systems. To further assess the 
conditions and environment program managers were operating in, we 
relied on previous GAO reports. For example, we relied on a recent 
study of space acquisition problems that incorporated interviews with 
more than 40 individuals, including experienced program managers, 
program executive officials, officials responsible for science and 
technology activities, and former and current officials within the 
Office of the Secretary of Defense who have specific responsibility for 
space system oversight or more general weapon system oversight. 

To further determine relevant DOD policies and practices, we analyzed 
documents describing the roles and responsibilities of program 
managers, acquisition force career management, promotion rates, 
performance reporting, and training requirements. Moreover, we analyzed 
relevant legislation and the DOD 5000 series of directives and 
instructions. We also interviewed career acquisition service officials, 
Defense Acquisition University course managers, and the Director of 
Training. We reviewed studies from the RAND Corporation, the Center for 
Strategic and International Studies, and the Defense Science Board, 
among others, on weapons system program management and acquisition 
issues as well as studies performed by past commissions focused on 
acquisition reform. 

We conducted our review between April 2004 and November 2005 in 
accordance with generally accepted government auditing standards. 

[End of section] 

Chapter 2: Senior Leader Support and Disciplined Knowledge-Based 
Processes Are Critical Enablers for Program Managers: 

Program managers from the leading companies we spoke with believed that 
two critical enablers--(1) support from top leadership and (2) 
disciplined, knowledge-based processes for strategic investment, 
program selection, and product development execution--empowered them to 
succeed in delivering new products when needed within cost, quality, 
and performance targets originally set by the company. At all of the 
companies we visited, corporate leadership began at a strategic level, 
long before the initiation of a new product development, with senior 
company leaders making critical strategic investment decisions about 
the firm's mix of products and the return on investment they may yield. 
Once high-level investment decisions were made, senior leaders ensured 
that new programs did not begin until there was a business case for 
them--that is, assurance that the program fit with the corporation's 
goals and investment strategy and that resources were available to 
execute it. Once a business case had been made, senior leaders selected 
program managers and tasked them with executing the program. They 
required program managers to use a knowledge-based product development 
process that demanded appropriate demonstrations of technology, 
designs, and processes at critical junctures, empowered them as 
appropriate to execute the program, and held them accountable for 
delivering it within estimates. Even so, program managers were still 
supported by senior leaders, who encouraged open and honest 
communication and continually ensured that the right levels of 
resources and management attention were available for the project. 
Figure 1 maps critical support and accountability 
factors. 

Figure 1: Critical Support and Accountability Factors: 

[See PDF for image] 

[End of figure] 

Senior Leadership Provides Program Managers with a Strong Foundation 
for Success: 

At each of the companies we visited, senior leaders invested a great 
deal of time and effort positioning new development efforts for 
success. Before even considering initiating a new project, senior 
leaders made high-level trade-off decisions among their long-term 
corporate goals, projected resources, market needs, and alternative 
ways of meeting those needs. These efforts culminated in investment 
strategies that assured that the company could fully commit to any 
product development effort it pursued. With a broad strategy in place, 
senior leaders would then begin concept development for potential new 
products, analyzing proposed products in terms of what requirements 
could be achieved today versus future versions of the product and what 
resources would be needed--not just in terms of cost, but in terms of 
technologies, time, and people. Once a specific concept was selected, 
senior leaders would follow rigorous systems engineering processes to 
narrow the gap between requirements and resources to a point where they 
were assured that they were pursuing a product that would meet market 
needs and could be developed within cost and schedule goals. The end 
point of this process was a sound business case that senior leaders 
could then hand off to a program manager--who was then empowered to 
deliver the product on time and within cost. Program managers 
themselves highly valued this support because it ensured the companies 
were committed to their particular efforts, reduced the level of 
unknowns that they were facing, and kept them focused solely on 
executing their programs. Put more simply, they believed senior leaders 
consistently provided a sound foundation on which they could launch 
their programs. 

The most critical characteristics of the strategic leadership provided 
include the following: 

* Investment strategies. Because there are more product ideas than 
there is funding to pursue them, the commercial companies we visited 
used a knowledge-based process to make decisions about which product 
development efforts to invest in. They began by developing an 
investment strategy that supported a corporate vision. To determine the 
most profitable mix of new products, companies analyzed factors such as 
customer needs, available technology, and available resources. 
Companies ensured that decisions to start new product developments fit 
within the investment strategy, which determined project priority and 
provided a basis for trade-off decisions against competing projects. 
Program managers found their company's use 
of investment strategies helpful because it gave them confidence that 
their project had commitment from their organization and from their top 
leaders and managers and clearly identified where their project stood 
within the company's overall investment portfolio and funding 
priorities. 

* Evolutionary development. All of the companies generally followed an 
evolutionary path toward meeting market needs rather than attempting to 
satisfy all needs in a single step. In effect, the companies evolved 
products, continuously improving their performance as new technologies 
and methods allowed. These evolutionary improvements eventually result 
in the full desired capability, but in multiple steps, delivering 
enhanced capability to the customer more quickly through a series of 
interim products. For example, the 2005 Avalon involved redesign of 
about 60 percent of the vehicle, but component sections such as the 
electronics and such features as the keyless ignition system and the 
reclining rear seat were either developed by suppliers or had been used 
on the Lexus. By using this method, the company changed the Avalon's 
overall design and functionality by increments. At a more strategic 
level, Toyota maintains ongoing research into such technology areas as 
alternative-fueled automobiles and the environmental implications of 
automotive developments, which feeds into its long-term planning. Our 
previous work has found that this 
approach reduces the amount of risk in the development of each 
increment, facilitating greater success in meeting cost, schedule, and 
performance requirements. The approach permits program managers to 
focus more on design and manufacturing with a limited array of new 
content and technologies in a program. It also ensures that the company 
has the requisite knowledge for a product's design before investing in 
the development of manufacturing processes and facilities. Conversely, 
our past work has found that organizations that set exceedingly high 
technology advancement goals invariably spend more time and money than 
anticipated trying to address technology-related challenges amid other 
product development activities, including design and production 
stabilization. 

* Matching of Requirements and Resources. The companies we visited were 
able to achieve their overall investment goals by matching requirements 
to resources--that is, time, money, technology, and people--before 
undertaking a new development effort. Any gaps that existed were 
relatively small, and it was the program manager's job to quickly close 
them as development began. More specifically: 

* The companies had already extensively researched and defined 
requirements to ensure that they were achievable given available 
resources before initiating new efforts. 

* Technologies were mature at the start of a program, that is, they had 
been proven to work as intended. More ambitious technology development 
efforts were assigned to corporate research departments until they were 
ready to be added to future generations (increments) of the product. In 
rare instances when less mature technologies were being pursued, the 
company accepted the additional risk and planned for it. 

* Companies committed to fully fund projects before they began. Not one 
of the program managers we spoke with mentioned funding as a problem, 
either at the beginning of a development effort or throughout it. 
Funding was a 
given once senior leaders had committed to their project. 

* Systems engineering was typically used to close gaps between 
resources and requirements before launching the development process. As 
our previous work has shown, requirements analysis, the first phase of 
any robust systems engineering regimen, is a process that enables the 
product developer to translate customer wants into specific product 
features for which requisite technological, software, engineering, and 
production capabilities can be identified. Once these are identified, a 
developer can assess its own capabilities to determine if gaps exist, 
and then analyze and resolve them through investments, alternate 
designs, and, ultimately, trade-offs. The companies we visited allowed 
their engineers to analyze and weigh in on the customers' needs as 
determined by their marketers. 

Our previous best practice work has consistently found the practice of 
matching requirements and resources prior to initiating a new program 
to be a hallmark of successful companies. Simply put, we have found 
that when wants and resources are matched before a product development 
is started, the development is more likely to meet performance, cost, 
and schedule objectives. When this match does not take place at the 
start of a program, programs typically encounter problems such as 
increased costs, schedule delays, and performance shortfalls as they 
try to meet requirements during product development. Program managers 
we spoke with for this review specifically cited this process as an 
enabler for their own success because it ensured they were in a good 
position to commit to cost and schedule estimates that were attainable, 
and it did not require them to perform "heroic" efforts to overcome 
problems resulting from large gaps between wants and resources, such as 
technology challenges or funding shortages. 

In addition to these critical strategic enablers, program managers at 
the companies also stated that senior leaders made concerted efforts to 
match program manager skills and experience to appropriate projects and 
to train and mentor program managers. In selecting program managers 
themselves, the companies placed high value on strong leadership 
qualities, including decision making skills, diplomacy, communication 
skills, ability to motivate others, and integrity, as well as how 
individual personalities fit with the job or team. Most of the program 
managers we spoke with had been groomed for their positions through 
formal training on budgeting, scheduling, and regulatory compliance and 
other aspects of program management; informal mentoring by senior 
executives or experienced program managers; and by being placed in 
positions that gradually increased their management responsibilities. 
In addition, many of the program managers we spoke with possessed 
considerable technical experience. In fact, they often started at the 
company as engineers. The companies we visited were similarly 
deliberate in developing and deploying teams of functional experts to 
support a program manager. In some cases, the teams reported directly 
to the program manager. In others, they reported to their respective 
home units and worked collaboratively with the program managers. In 
either case, the program managers themselves valued the support they 
were getting from these teams--particularly because they enabled the 
program manager to employ a broad array of expertise from day one of 
the development effort and to facilitate an exchange of ideas. The 
program managers we spoke with believed that their functional teams 
were also highly skilled--to the point where they could easily delegate 
major tasks. 

Strategic Leadership at Toyota and Siemens Medical: 

Strategic leadership for the development of Toyota's Avalon luxury 
sedan ties back to conscious decisions made by senior leaders in Japan 
when they built a Toyota facility in the United States 25 years ago. To 
assure that the vehicles could be made to the same levels of quality as 
those in Japan, Toyota replicated its manufacturing facilities, used 
Japanese suppliers, and sent its managers to the United States to 
supervise development. As U.S. employees gained experience and 
demonstrated their capability, the reliance on Japanese suppliers and 
personnel gradually decreased. A second step Toyota took was to 
replicate its training and mentoring of program managers--pairing them 
with more experienced chief engineers, who oversee long-term planning 
across projects, and even bringing them to Japan to study how Toyota 
approached development. 

To support all of its new development efforts, senior leaders have 
developed an overall strategic plan--which takes a long- and short-term 
investment perspective. Over the long run, the plan envisions the 
company achieving significant advancements in capabilities, such as 
alternatively fueled engines, through incremental improvements to 
technologies. Over the short run, a specific vehicle development 
program uses a marketing analysis of the features customers desire in 
new models, and the staff determines whether a market exists for a 
certain type of product at a certain price. In establishing a business 
case for the Avalon, Toyota embarked on a formal concept development 
effort, which was led by a chief engineer. The chief engineer, a high- 
level executive, was largely responsible for setting the vision for the 
new Avalon, securing the resources needed for the development effort 
prior to 
initiating the development program, and working with representatives 
from its sales division to make sure that the design and technologies 
being pursued still fit within market needs--not just in terms of cost, 
but in terms of vehicle features. A variety of functional experts were 
consulted during this phase, though the chief engineer had the most 
formal authority over concept development. At the conclusion of this 
effort, Toyota decided to take on a very extensive redesign of the 
Avalon but also set a goal of bringing the vehicle to market in only 18 
months. Redesign features included a reinforced body, improvements to 
the engine and to the braking system, as well as features customers 
desired such as a keyless ignition system and reclining rear seats. 
Toyota leadership also decided to include mature technologies, often 
borrowed from other vehicle lines or purchased from outside 
suppliers. Once the design was approved, day-to-day project management 
shifted to the chief production engineer, whose responsibility it was 
to see the vehicle through production to distribution. 

Figure 2: 2005 Toyota Avalon: 

[See PDF for image] 

[End of figure] 

Corporate leadership at Siemens Medical took a similar shape in the 
development of new medical equipment. For example, senior leaders 
developed an overall investment strategy, based largely on researching 
their customers' technology needs as well as their own technology 
readiness, the direction their competitors were going in, economic 
trends, and projected manpower resources. From these assessments, a 
team within Siemens developed a portfolio of potential new projects to 
pursue, which upper management then prioritized based on their 
potential profit and how well they fit with corporate goals and 
projected resources. Ultimately, senior leaders produce a short-term 
(1-year) investment plan as well as a longer-term (3- to 5-year) plan. 
Once a 
specific project is selected, Siemens employs systems engineering 
practices to narrow down the gap between customer requirements and 
resources--working with both business and technical managers. A 
"product manager" is charged with making trade-offs between 
requirements, schedule, and cost prior to initiating product 
development and is held accountable for systems engineering decisions 
made to balance requirements with resources for the business case. This 
person sits at a relatively high level within the company and possesses 
marketing and business expertise. A "project manager" who reports to 
the product manager is ultimately assigned to execute the business 
case, but he or she plays a role in the concept development by 
participating in trade-off decisions and raising concerns about how 
decisions can be executed. 

At Siemens Medical, many project managers begin by serving as the 
technical leader working with three to five people in systems 
engineering or another technical area of a project. As the technical 
team lead, they gain experience with scheduling, communicating, and 
managing people. Over time the individual is given more 
responsibilities such as becoming a subsystem project leader; as the 
manager gains experience, he or she transitions to handling cross- 
functional areas including business, budgeting, staffing, technology, 
and testing. 

Siemens Medical project managers are also given formal training, 
including courses on regulatory and quality requirements as well as 
courses that help program managers learn about their management styles. 
In addition, Siemens ensures that project managers are well-trained on 
risk management so that they can identify and mitigate potential risks 
at the beginning of the project. Also, since project managers function 
within a centralized project management department, they are mentored 
both by the head of the department and by their peers. 

Figure 3: Siemens Bi-Plane AXIOM Artis: 

[See PDF for image] 

[End of figure] 

Knowledge-Based Process Followed to Execute Programs: 

Once a new development effort began, program managers were empowered to 
execute the business case and were held accountable for doing so. At 
all of the companies we visited, program managers believed that 
following a disciplined, knowledge-based development process and 
continued support from senior leaders were essential to their success. 
The process itself was typically characterized by a series of gates or 
milestone decisions, which demanded that programs assess readiness and 
remaining risk within key sectors of the program as well as overall 
cost and schedule issues, and it required go/no-go decisions to be made 
fairly quickly. The most important aspect of the process, in the view 
of the program managers, was that it empowered them to make decisions 
about design and manufacturing trade-offs, supplier base, staffing on 
the program team, etc.--as long as they were within the parameters of 
the original business case. At the same time, the process held program 
managers accountable and set clear goals and incentives. 

Common critical characteristics of the knowledge-based process followed 
to execute programs include the following: 

* Knowledge-driven development decisions. Once a new product 
development began, program managers and senior leaders used 
quantifiable data and demonstrable knowledge to make go/no-go 
decisions. These covered critical facets of the program such as cost, 
schedule, technology readiness, design readiness, production readiness, 
and relationships with suppliers. Development was not allowed to 
proceed until certain thresholds were met, for example, a high 
proportion of engineering drawings completed or production processes 
under statistical control. Program managers themselves placed high 
value on these requirements, as it ensured they were well positioned to 
move into subsequent phases and were less likely to encounter 
disruptive problems. 

* Empowerment for program managers to make decisions. At all the 
companies we visited, program managers were empowered to make decisions 
on the direction of the program and to resolve problems and implement 
solutions. They could make trade-offs among schedule, cost, and 
performance features, as long as they stayed within the confines of the 
original business case. When the business case changed, senior leaders 
were brought in for consultation--at this point, they could become 
responsible for trade-off decisions. 

* Accountability. Program managers at all the companies we visited were 
held accountable for their choices. Sometimes this accountability was 
shared with the program team and/or senior leaders. Sometimes, it 
resided solely with the program manager on the belief that the company 
had 
provided the necessary levels of support. In all cases, the process 
itself clearly spelled out what the program manager was accountable 
for--the specific cost, performance, schedule, and other goals that 
needed to be achieved. 

* Tenure. To further ensure accountability, program managers were also 
required to stay with a project to its end. Sometimes senior leaders 
were also required to stay. At the same time, program managers were 
incentivized to succeed. If they met or exceeded their goals, they 
received substantial bonuses and/or salary increases. Awards could also 
be obtained if the company as a whole met larger objectives. In all 
cases, companies refrained from removing a program manager in the midst 
of a program. Instead, they chose first to assess whether more support 
was needed in terms of resources for the program or support and 
training for the program manager. 

Other important aspects within the development process included the 
following: 

* Common templates and tools to support data gathering and analysis. 
These tools included databases of demonstrated, historical cost, 
schedule, quality, test, and performance data that helped program 
managers produce metrics as well as standard forms and guidance for 
conducting the meetings. Program managers valued these tools because 
they greatly reduced the time needed to prepare for milestone meetings. 
In no case did program managers believe they were spending time 
collecting data that was valuable to senior management but not to them. 

* Common processes that supported product development. The companies 
generally found that requiring program managers to employ similar risk 
management, project management, requirements approval, testing, quality 
management, problem resolution, and other processes enabled them to 
add discipline and consistency to product development. Some 
companies were certified by professional organizations as achieving the 
highest level of proficiency within supporting development processes. 
For example, Motorola was certified as a level 5 software development 
organization by Carnegie Mellon's Software Engineering Institute. 

* Lessons learned. All of the companies we visited continually refined 
and enhanced their development process via some sort of lessons-learned 
process. The program managers themselves placed a great deal of value 
on these processes--as they were seen as the primary means for learning 
how to tailor the process to better fit a project and to prevent the 
same mistakes from recurring. 

Program managers also cited flexibility as an enabling quality of their 
processes. All of the companies allowed their processes to be tailored 
as needed. Milestones that were deemed unnecessary could be dropped. 
More often, however, additional meetings were added to gain consensus 
on how to address particular problems. Another enabling factor was that 
their processes ensured decisionmakers were not flooded with data. 
Often, program teams boiled down data into one or two pages, using 
simple metrics depicting status and risk on critical facets of the 
program such as cost, schedule, technology readiness, design readiness, 
and production readiness. Program managers valued the process of 
translating detailed data into higher level metrics because it required 
them to think about their programs in more strategic terms and focus on 
the highest problem areas. 

Knowledge-Based Development at Motorola and Toyota: 

Motorola's development process consists of 16 milestones or 
"gates"--the first five of which pertain to the processes employed to 
develop a product concept and the business case. The remaining 11 
gates cover the execution of the business case, from project 
initiation, to systems requirements definition, design readiness, 
testing, and controlled introduction, to full deployment. Each gate 
demands an array of 
indicators on status and progress, including resources, cost, scope, 
risk, and schedule. A centralized database helps program managers 
produce this data and allows users to obtain data at any time and at 
any level of detail that they need. For meetings themselves, program 
managers are required to produce a set of "vital few" performance 
measures relating to cost, quality, program status, and customer 
satisfaction. At the gates themselves, program managers discuss the 
status of the program with senior leaders, but they are ultimately 
responsible for making decisions on whether to proceed to the next 
phase. In the past, program managers did not have this responsibility 
and acted more as administrators than leaders, according to senior 
executives. With less responsibility and accountability, programs were 
not managed as well--often employing disjointed management processes 
with less attention to efficiency and effectiveness. By increasing 
program managers' ownership of and accountability for the project, 
senior 
leaders found that they were more incentivized to meet and exceed cost, 
schedule, and performance goals. To support this change, the company 
also adopted common supporting processes, including configuration 
management, design, training, testing, defect prevention, quality 
management, supplier management, and system upgrades. The common 
processes assured that program managers employed the same set of 
quality controls and that the deployed tools and guidance enabled 
program managers to reduce cycle times as well as to produce better, 
more consistent management data. 

Toyota's process consists of eight key milestones--starting with a 
lessons-learned gate. At this point, senior leaders and project teams 
formally review what worked well and not so well in the prior 
development effort and assess whether the process needs to be tailored 
as a result. The Avalon program manager told us that these 
"reflections" are not taken lightly; they are developed through a very 
detailed and soul-searching process during which people have to openly 
admit errors and inadequacies so that better processes and procedures 
can be devised. The next gate, "image," represents the process by 
which the chief engineer derives the business case. Once he is done, 
direct supervision of the project is transferred to a "chief production 
engineer," who is charged with executing the business case, although 
the chief engineer continues to be involved in the 
development. The next few milestones come as the car is designed, 
prototyped, tested, and put through quality assurance. The last 
milestone, the production stage, also contains a customer feedback 
phase, which is used to refine the next development effort. 

Within the business case itself, Toyota places highest importance on 
schedule because a number of other vehicle development efforts are 
dependent on the same resources and staff being used by the current 
effort. As a result, the chief production engineer is more inclined to 
make trade-offs that favor schedule over other factors. At each 
milestone meeting, the chief production engineer reviews the status of 
the program with senior leaders, focusing first on what problems are 
occurring and what his solutions are for overcoming them. The meeting 
itself employs streamlined reporting with simple indicators of 
remaining risk on critical facets of the program--specifically, a 
circle, meaning low remaining risk and okay to proceed; a triangle, 
meaning there are problems but they can be fixed; or an "x," meaning 
there is a problem without a solution. The chief production engineer is 
responsible for making decisions as to how to proceed at these 
milestones, unless there is a problem that significantly affects the 
business case. If so, senior leaders become more involved in the 
decision-making rather than simply advising the chief production 
engineer. 

While the Toyota process employs only eight formal milestones, the 
chief production engineer actually involves functional experts, senior 
executives, and other stakeholders in frequent meetings to make 
tactical decisions about the program. For example, the Avalon chief 
production engineer told us that he held "obeya" (literally "big room," 
signifying that all inputs are desired) meetings twice a week, which 
involved all functional areas as well as "nemawashi" (literally binding 
the roots together, signifying gathering facts and moving toward a 
decision) meetings before a formal milestone meeting--at which 
functional officials consulted with each other to identify problems and 
develop potential solutions that would be presented to senior leaders 
at the milestone. Overall, the accountability for meeting the Avalon 
program's goals was shared between the chief production engineer, the 
functional team, and senior executives. At Toyota, senior leaders 
assume that the processes they have in place will work, and if a 
process is not delivering a suitable quality outcome, it is the shared 
responsibility of managers and staff to resolve the issue. If 
performance issues arose, senior leaders attempted to address them 
first through training, mentoring, and additional support, rather than 
removing the program manager. 

Continued Senior Leadership during Product Development Further Enabled 
Success: 

Empowering program managers to make decisions in executing the business 
case was seen as the most significant type of support provided by 
senior leaders. But program managers themselves pointed to other types 
of support that made it easier for them to succeed. Primarily, senior 
leaders did the following: 

* Provided unwavering commitment to the development effort. At all the 
firms we visited, senior leaders were champions of the project 
throughout its life and fully committed to supporting it. When 
significant problems arose that jeopardized the business case, they 
found ways to address those problems, rather than rejecting the program 
in favor of another one. 

* Trusted their program managers. Senior leaders trusted the 
information being provided by the program manager as well as his or her 
expertise. This reduced the need for instilling additional layers of 
oversight that could slow down the program. At the same time, however, 
senior leaders took personal responsibility for assuring their program 
managers had the knowledge and capability needed to succeed--in some 
cases, by personally mentoring them for a long period of time. 

* Encouraged program managers to share bad news. Senior leaders went 
out of their way to encourage program managers to share problems. In 
fact, program managers were often expected to discuss problems before 
anything else at key milestones. And, in some cases, program managers 
were evaluated based on their ability to identify and share problems. 
At the same time, senior leaders expected their program managers to 
come up with solutions--to take ownership over their efforts. 

* Encouraged collaboration and communication. Senior leaders spent a 
great deal of time breaking down stovepipes and other barriers to 
sharing information. The Avalon chief production engineer, in fact, 
told us that Toyota's development processes alone were much like those 
of other automobile manufacturers he had worked for. What separated 
Toyota from 
the others was its emphasis on open information exchange, cooperation, 
and collaboration. He believed that this was the key enabler for 
Toyota's superior systems integration. 

Figure 4: Best Practice Roles, Responsibilities, and Behaviors of 
Senior Managers: 

[See PDF for image] 

[End of figure] 

[End of section] 

Chapter 3: DOD Is Not Supporting Its Program Managers Effectively: 

While DOD's leadership has taken actions in recent years that it hopes 
will 
better position programs and improve planning and budgeting, it is 
still not effectively positioning or supporting program managers for 
success. For example, rather than making strategic investment 
decisions, DOD starts more programs than it can afford and rarely 
prioritizes them for funding purposes. The result is a competition for 
funds that creates pressures to produce optimistic cost and schedule 
estimates and to overpromise capability. Moreover, our work has shown 
that DOD often starts programs without establishing a business case. 
Specifically, technologies are not always mature at start, requirements 
are not fully defined, and cost and schedule estimates are not always 
realistic. In addition, program managers are not empowered to execute 
programs. They cannot veto requirements and they do not control funding 
or other resources. In fact, program managers who responded to our 
survey personally consider requirements and funding instability to be 
their biggest obstacles to success. Program managers also believe that 
they are not sufficiently supported once programs begin. In particular, 
they believe that program decisions are based on funding needs of other 
programs rather than demonstrable knowledge; they lack tools needed to 
enable them to provide leadership consistent with cost, schedule, and 
performance information; they are not trusted; they are not encouraged 
to share bad news; and they must continually advocate for their 
programs in order to sustain commitment. 

Figure 5: Breakdowns in Support and Accountability Factors: 

[See PDF for image] 

[End of figure] 

Senior Leadership Does Not Provide a Strong Foundation for Success: 

According to program managers we interviewed as well as comments to our 
survey and our past reviews, senior leadership within DOD does not 
provide a strong foundation for success. While DOD is adept at 
developing long-term visions and strategic plans, it does not develop 
realistic, integrated investment strategies for weapons acquisitions to 
carry out these plans. Instead, more programs are started than can be 
funded and too many programs must compete for funding, which, in turn, 
creates incentives to produce overly optimistic estimates and to 
overpromise capability. Moreover, when faced with a lower budget, 
program managers believe that senior executives within the Office of 
the Secretary of Defense (OSD) and the services would rather make 
across-the-board cuts to a span of programs than make hard decisions 
as to which ones to keep and which ones to cancel or cut back. Our work 
continues to show that, while DOD has adopted evolutionary development 
in its policies, programs are being encouraged to pursue significant 
leaps in capability. In addition, DOD's policy now encourages programs 
to match resources to requirements before program initiation, but 
program managers reported in our survey that requirements and funding 
are not stabilized and were the biggest obstacles to their success. 
Further, while program managers believe their training has been 
adequate, they also believe that mentoring has been uneven and that 
they could benefit from tours of duty inside the Pentagon, for example, 
in offices that oversee budget or financial management. Table 2 
highlights differences between strategic senior leadership support 
within the commercial companies we visited and DOD. 

Table 2: Are Best Practices Present in DOD? 

Best practices: Develop long-term vision and investment strategy;
DOD: DOD has a long-term vision but not an investment strategy. The 
lack of an investment strategy has created competition for funding and 
spurred unrealistically low cost estimates, optimistic schedules, and 
suppression of bad news. 

Best practices: Adopt evolutionary path toward meeting customer needs;
DOD: DOD has adopted evolutionary development in policy but not in 
practice. 

Best practices: Match requirements and resources before starting new 
product development;
DOD: DOD has encouraged achieving match in policy but not in practice. 
Requirements are not stable; funding commitments are not enforced;
key technologies are not matured before development. Requirements and 
funding are biggest obstacles in view of program managers. 

Source: GAO.

[End of table] 

Investment Strategy and Evolutionary Development: 

DOD is attempting to address some of the problems identified, but it is 
too early to determine how effective its solutions are. For example, it 
is implementing a new requirements-setting process--known as the 
Joint Capabilities Integration and Development System--in an attempt to 
bring more discipline to investment decisions. The system is organized 
around key functional concepts and areas, such as command and control, 
force application, battlespace awareness, and focused logistics. For 
each area, boards of high-ranking military and civilian officials 
identify long-term joint needs and make high-level trade-offs on how 
those needs should be met. Once specific programs are proposed, the 
process is designed to encourage a more evolutionary approach by 
allowing requirements setters the flexibility to define requirements in 
terms of capabilities as well as to defer final requirements 
formulation to later in the development process. DOD has also been 
attempting to implement complementary planning and budgeting processes-
-for example, by asking the military services to plan budgets around 
guidance that takes a joint perspective and by taking a portfolio 
planning approach. However, there is no evidence to date that shows 
these enhancements are providing DOD with a sound investment strategy 
as well as the right controls for enforcing that strategy. 

While some program managers we spoke with believed the process's focus 
on capabilities versus requirements promised more flexibility, program 
managers' comments to our survey show that they still widely believed 
they were operating in an environment of unfair competition for 
funding. Figure 6 highlights specific views. 

Figure 6: Highlights of Program Manager Comments Regarding Competition 
for Funding: 

[See PDF for image] 

[End of figure] 

DOD has also adopted policies that encourage evolutionary 
development.[Footnote 5] However, our reviews continue to find that 
programs are still pursuing significant leaps in capabilities. For 
example, we reported this year[Footnote 6] that the Joint Strike 
Fighter acquisition strategy was striving to achieve the ultimate 
fighter capability within a single product development increment, and 
that it had bypassed early opportunities to trade or defer to later 
increments those features and capabilities that could not be readily 
met. We also testified[Footnote 7] that while DOD's space acquisition 
policy states its preference for evolutionary development, programs 
still attempt to achieve significant leaps in one step. 

Matching Resources to Requirements: 

In recent years, DOD has changed its acquisition policy to encourage 
decisionmakers to match requirements to resources before starting a new 
program. For example, the policy specifically encourages that 
technologies be demonstrated in a relevant environment before being 
included in a program; that a full funding commitment be made to a 
program before it is started; and that requirements be informed by the 
systems engineering process. Concurrently, DOD's new requirements 
process is designed to instill more discipline during initial 
requirements development and postpone final determination of 
requirements to assure that requirements being set are achievable. 

In practice, however, our work has shown that there are still 
significant gaps between requirements and technology resources when 
programs begin. Our most recent annual assessment of major weapon 
systems programs,[Footnote 8] for example, showed that only 15 percent 
of the programs we reviewed began development having demonstrated that 
all of their technologies were mature. More often than not, programs 
had to worry about maturing technologies well into system development, 
when they should have focused on maturing system design and preparing 
for production. These assessments also show that programs that started 
development with mature technologies experienced lower development and 
unit cost increases than those programs that started with immature 
technologies. Table 3 provides some examples. 

Table 3: Technology Maturity and Program Outcomes: 

Program: Advanced Threat Infrared Countermeasures/Common Missile 
Warning System;
Percent increase in R&D (first full estimate to latest estimate): 5.6%;
Percent of critical technologies and associated maturity level at 
development start: 50% (3 of 6) at 6 or higher. 

Program: C-5 Reliability Enhancement and Reengining Program;
Percent increase in R&D (first full estimate to latest estimate): 2.1%;
Percent of critical technologies and associated maturity level at 
development start: 100% (11 of 11) at 6 or higher. 

Program: DD(X) Destroyer;
Percent increase in R&D (first full estimate to latest estimate): 
417.3%;
Percent of critical technologies and associated maturity level at 
development start: 25% (3 of 12) at 6 or higher. 

Program: Future Combat System;
Percent increase in R&D (first full estimate to latest estimate): 50.8%;
Percent of critical technologies and associated maturity level at 
development start: 32% (17 of 52) at 6 or higher. 

Program: Joint Strike Fighter;
Percent increase in R&D (first full estimate to latest estimate): 30.1%;
Percent of critical technologies and associated maturity level at 
development start: 25% (2 of 8) at 6 or higher. 

Source: GAO. 

Note: A technology readiness level of 7 or higher at program launch is 
considered a best practice; a technology readiness level of 6 or higher 
is the DOD standard.

[End of table] 

Although the majority of respondents to our survey believed that the 
initial baselines of their programs were reasonable, a significant 
group--about 24 percent of program managers--responded that their 
program parameters were not reasonable at the start. Forty-five program 
managers responded that their program had been re-baselined one or more 
times for cost and schedule increases, and 18 percent said one or more 
key technologies fell below a readiness level of 7, the level at which 
a technology is proven to work in an operational environment. 
Respondents also noted that the most frequently missing critical skill 
was systems engineering--a key function for matching requirements to 
the technologies needed and for providing reasonable baselines at the 
beginning of development. In addition, in written comments and 
individual interviews, program managers noted pressure to agree to cost 
commitments that could be attained only if programs enjoyed higher-
level support. They also noted that requirements were often not fully 
defined at the onset of a program, and many pointed out that users and 
stakeholders often did not stick to the agreements they made when 
programs were launched, especially if technologies did not mature as 
planned. 

Figure 7: To What Extent Were the Parameters of Your Program Reasonable 
at Program Start? 

[See PDF for image] 

[End of figure] 

Figure 8: How Program Managers Responded to an Open-ended Question on 
What Were the Biggest Obstacles They Faced: 

[See PDF for image] 

[End of figure] 

Program managers' views were mixed when it came to whether human 
capital resources were well matched to new programs. They cited major 
improvements in DOD's training programs and credited cross-functional 
teams as a valuable resource. They also generally believed they 
personally had the right mix of experience and training to do their 
jobs well. Ninety-four percent of the program managers responding to 
our survey reported that they had been certified at the highest level 
for program management by DOD's Defense Acquisition University. More 
than 80 percent also believed they had adequate training in the areas 
of systems engineering, business processes, contracting, management, 
program representation, cost control, and planning and budgeting. 
Slightly fewer, about 76 percent, believed they had enough leadership 
training. In addition, about 92 percent said that they believed that 
their service consistently assigned people with the skills and 
experience to be effective program managers. 

At the same time, however, program managers' comments and interviews 
with program executive officers pointed to critical skill shortages 
among the staff that support them--including program management, 
systems engineering, cost estimating, and software development. Some of 
these officials attributed the shortages to shifts in emphasis from 
oversight of contractor operations to insight into them. Lastly, about 
18 percent of the program managers who provided written comments cited 
shortcomings in their career path--such as the lack of opportunities at 
the general officer level and requirements to move often--as 
disincentives; 13 percent cited the lack of financial incentives. Some 
program managers also noted that DOD loses opportunities to retain 
valuable experience, merely because there are no formal incentives for 
military officers to stay on as program managers after they are 
eligible for retirement. Civilians in program management also cited a 
lack of career opportunities; one problem cited was having to find 
their next job, in contrast to military program managers, whose 
subsequent job is presented to them. 

Execution in DOD Does Not Provide Adequate Support and Accountability: 

According to program managers and our past reviews, the execution 
process does not provide adequate support and accountability. In 
particular, knowledge-based development processes are not employed, 
program managers are not empowered to execute, and they are not held 
accountable for delivering programs within targets. 

More specifically, DOD has encouraged following knowledge-based 
development processes in its acquisition policy but not always in 
practice. The acquisition process itself mirrors many aspects of the 
processes used by commercial companies. For example, it requires a 
variety of senior, 
functional, and program-level personnel to come together, assess 
progress, identify problems, and make go/no-go decisions at key points 
in development. It encourages oversight personnel to base these 
decisions on quantifiable data and demonstrated knowledge. To enhance 
product development, DOD has also been attempting to adopt and improve 
policies in areas such as software development and systems engineering. 

However, program managers who responded to our survey believe that the 
acquisition process does not enable them to succeed because it does not 
empower them to make decisions on whether the program is ready to 
proceed forward or even to make relatively small trade-offs between 
resources and requirements as unexpected problems are encountered. 
Program managers assert that they are also not able to make shifts in 
personnel to respond to changes affecting the program. At the same 
same time, program managers commented that requirements continue to be 
added as the program progresses and funding instability continues 
throughout. These two factors alone cause the greatest disruption to 
programs, according to program managers. Compounding this problem is 
the fact that because acquisition programs tend to attract funds over 
other activities, including science and technology efforts that 
ultimately support acquisition, program managers are incentivized to 
take on tasks that really should be accomplished within a laboratory 
environment, where it is easier and cheaper to discover and address 
technical problems. 

With many factors out of their span of control, program managers in our 
focus groups also commented that it was difficult to hold them 
accountable for mistakes. In addition, in their written comments to the 
survey, many program managers expressed frustration with the time 
required of them to answer continual demands for information from 
oversight officials--many of which did not seem to add value. Some 
program managers in fact estimated that they spent more than 50 percent 
of their time producing, tailoring, and explaining status information 
to others. 

More broadly, in interviews and written comments, many program managers 
and program executive officials said that they did not believe that DOD's 
acquisition process really supported or enabled them. Instead, they 
viewed the process as cumbersome and the information produced as non- 
strategic. When strategic plans or useful analyses were produced, they 
were done so apart from the acquisition process. 

Our own reviews have pointed to a number of structural problems with 
the acquisition process.[Footnote 9] In particular, while DOD's 
acquisition policy has embraced best practice criteria for making 
decisions, it does not yet have the necessary controls to ensure 
knowledge is used for decision-making purposes. As a result, programs 
can move forward into design, integration, and production phases 
without demonstrating that they are ready to do so. Without a means to 
ensure 
programs and senior managers are adhering to the process, the process 
itself can become an empty exercise--and, in the view of program 
managers, a time-consuming one. 

Table 4 highlights differences between DOD and commercial knowledge- 
based development support and accountability factors--collectively from 
the perspective of program managers, our past reports, and observations 
we made during the course of the review. 

Table 4: Are Best Practices Present in DOD? 

Best practices: Base decisions on quantifiable data and demonstrated 
knowledge;
DOD: DOD policy encourages decisions to be based on quantifiable data 
and demonstrated knowledge, but this is not happening in practice. 

Best practices: Empower program managers to make decisions;
DOD: Program managers say they are not empowered in the same way as 
commercial companies. They do not control resources. They do not have 
authority to move programs to next phases. 

Best practices: Hold program managers accountable;
DOD: Difficult to enforce accountability. 

Best practices: Program managers stay through execution;
DOD: Tenure has been lengthened, but program managers generally do not 
stay after 3 to 4 years. 

Source: GAO.

[End of table] 

Data Supporting Oversight and Management Decisions: 

We reported that while DOD's acquisition policy has embraced best 
practice criteria for making decisions, it does not yet have the 
necessary controls to ensure demonstrable data is used for decision- 
making purposes. We recommended that DOD assure that program launch 
decisions capture knowledge about cost and schedule estimates based on 
analysis from a preliminary design using systems engineering tools. In 
transitioning from system integration to system demonstration, we 
recommended that DOD ensure the capture of knowledge about the 
completion of engineering drawings; completion of subsystem and system 
design reviews; agreement from all stakeholders that the drawings are 
complete; and identification of critical manufacturing processes, among 
other indicators. And in transition to production, we recommended that 
DOD capture knowledge about production and availability of 
representative prototypes along with statistical process control data. 

We recommended adopting these controls because, in our view, they would 
help set program managers up for success by (1) empowering them with 
demonstrated knowledge as they move toward production and (2) bringing 
accountability to their positions and making the business case more 
understandable. Without these types of controls, the process can become 
an empty and time-consuming exercise in the view of program managers. 
At present, our reports continue to show that programs are allowed to 
proceed without really showing that they are ready to do so. In our most 
recent annual assessment of major weapon systems, for example, only 42 
percent of programs had achieved design stability at design review and 
almost none of the programs in production or nearing production planned 
to assure production reliability through statistical control of key 
processes. 

Our survey also indicated that a relatively small percentage of 
programs used knowledge indicators that successful commercial companies 
use. For example, in responding to our survey, only 32 percent of 
program managers said they used design drawing completion extensively 
to measure design maturity; only 26 percent said they used production 
process controls to a great extent. Even fewer program managers 
reported that their immediate supervisor used these measures 
extensively to evaluate progress. 

In our survey and interviews, program managers and program executive 
officers also frequently commented that they spend too much time 
preparing data for oversight purposes that is not strategic or very 
useful to them. In fact, more than 90 percent of survey respondents 
said that they spent either a moderate, great, or very great extent of 
their time representing their program to outsiders and developing and 
generating information about program progress. In addition, program 
managers told us that they do not have standard tools for preparing 
program-status data. Instead, they must hand-tailor data to the 
requester's particular demands for format and level of detail. The Air 
Force was cited by some program managers as taking initiative in 
developing a database (known as the System Management Analysis 
Reporting Tool) that could save time in answering internal oversight 
demands for data, but they also wanted to be able to use such a tool to 
answer outside demands. While program managers said they were spending 
a great deal of time reporting on program status to outsiders, some 
program executive officers and program managers also commented that 
they had to separately produce data, analyses, and strategic plans for 
their own purposes in order to keep their programs on track--the types 
of plans and analyses that they used were simply not called for by the 
process itself. One program executive officer said that he used three 
documents--the approved program baseline, the acquisition strategy, and 
the test plan--to evaluate the program manager's plans; all of these 
documents and many more are required under current acquisition 
planning, but these three were of most significance. In addition, the 
executive officer held a one-day review per quarter with each program 
manager and reviewed metrics such as earned value, use of award fee, 
contract growth, and schedule variation. 

Program Manager Authority: 

In several key areas, program managers said that they do not have the 
necessary authority to overcome obstacles and make trade-offs to 
achieve program goals. About 60 percent of the program managers that 
responded to our survey said that program managers should have more 
authority to manage their programs--particularly when it comes to 
funding, deciding when programs are ready to proceed to the next phase, 
and shifting staff. In interviews and written comments, some program 
managers commented that they were seeking the ability to make 
relatively small trade-offs--for example, moving a staff member from 
one section of a program to another or shifting a small amount of 
funds from procurement accounts to research and development accounts--
while others advocated for greater authority, as long as their program 
stayed on track. In addition, program managers often commented that 
they should have a larger role in requirements decisions that are made 
after a program is started--specifically, the ability to veto new 
requirements that would put too much strain on the program. A few 
program managers we interviewed, however, believed that they did have 
sufficient authority and that many program managers have not learned 
how to exercise it or are risk averse. Others commented that program 
managers were simply not allowed by senior managers to exercise their 
authority. At the same time, program executive officers, who manage a 
set of programs, commented in interviews that they also lacked 
authority over simple matters such as moving staff or shifting small 
amounts of funds. Lastly, in our focus groups and in written comments, 
program managers who specifically worked for the Missile Defense Agency 
indicated that they did have authority to make trade-offs among cost, 
schedule, and performance and to set requirements for the business 
case. They found that this authority alone greatly separated their 
current positions from past program manager positions and consistently 
cited it as a major enabler. 

Table 5 shows how program managers answered survey questions regarding 
the types of formal and informal authority they have. Figure 9 
highlights comments that were provided by program managers. 

Table 5: Program Manager Views on Formal vs. Informal Authority: 

In percent: Type of authority: Developing program requirements;
I have formal authority[A]: 10%;
I have informal authority: 82%;
No authority: 7%. 

In percent: Type of authority: Changes in program requirements;
I have formal authority[A]: 13%;
I have informal authority: 85%;
No authority: 2%. 

In percent: Type of authority: Flexibility within program to reallocate 
funding;
I have formal authority[A]: 81%;
I have informal authority: 15%;
No authority: 5%. 

In percent: Type of authority: Developing technology;
I have formal authority[A]: 42%;
I have informal authority: 45%;
No authority: 9%. 

In percent: Type of authority: Setting testing requirements;
I have formal authority[A]: 48%;
I have informal authority: 49%;
No authority: 2%. 

In percent: Type of authority: Selecting contractor sources;
I have formal authority[A]: 48%;
I have informal authority: 33%;
No authority: 11%. 

In percent: Type of authority: Administering contracts;
I have formal authority[A]: 60%;
I have informal authority: 37%;
No authority: 3%. 

In percent: Type of authority: Addressing difficulties in meeting 
requirements;
I have formal authority[A]: 66%;
I have informal authority: 31%;
No authority: 2%. 

Source: GAO. 

[A] Note: Numbers may not total 100 percent due to rounding.

[End of table] 

Figure 9: Highlights of Program Manager Comments on What Types of 
Authority They Need: 

[See PDF for image] 

[End of figure] 

Accountability: 

Program manager views with regard to accountability are mixed. In our 
interviews and our focus groups, many program managers stated they 
personally held themselves accountable; however, many also commented 
that it is difficult to be accountable when so much is outside their 
span of control. During our focus groups, program managers cited 
sporadic instances when program managers were removed from their 
positions or forced to retire. They also cited instances when a program 
manager was promoted, even though the program was experiencing 
difficulties. In their written comments for our survey, program 
managers often commented that it was a disincentive that senior leaders 
who were negatively affecting their programs were not being held 
accountable. 

We observed some key differences between the commercial companies we 
visited and DOD when it comes to accountability. 

* Commercial companies make it very clear who is accountable on a 
program and for what. Goals that must be achieved are clearly spelled 
out and understood by the entire program team. In DOD, it is not always 
clear who is responsible. Moreover, the expectations set for program 
managers by their supervisors may not necessarily match up with the 
goals of their program--particularly when the program manager is a 
military officer who reports to both a PEO and another commanding 
official. 

* Program managers and senior managers in the commercial sector are 
required to stay with programs until they are done; at DOD they are 
not. 

* Program managers in the commercial sector are incentivized to stay 
with programs and be accountable for them--principally through 
empowerment and financial incentives, but also through their desire to 
help the company achieve its goals. At DOD, program managers strongly 
asserted that they are incentivized to help the warfighter, but few 
said they were incentivized by financial or promotion incentives or by 
empowerment. 

Senior Leader Support during Execution: 

In commenting on senior leader support during program execution, 
program managers had mixed views on whether they received sustained 
commitment from their program executive officers, but widely believed 
that they did not receive sustained commitment from other senior 
leaders and stakeholders--unless their programs enjoyed priority and 
support from very high level officials, Congress, or the President. 
More often than not, programs struggled to compete for funding and were 
continually beset by changing demands from users. Others noted that 
while DOD is emphasizing jointness in programs more and more, 
collaboration among senior leaders needed to achieve jointness is not 
always happening. Some program managers lamented that they felt they 
were not respected in DOD, while others believed their service was 
taking some positive actions to put program managers on a par with 
military officers in operational positions. 

Program managers were also troubled by constant demands for information 
for oversight purposes as well as interruptions from stakeholders (for 
example, in department-wide budget or testing offices) that seemed to 
be non-value-added. As we noted earlier, over 90 percent of the survey 
respondents said that they spent either a moderate, great, or very 
great extent of their time representing their program to outsiders and 
developing and generating information about program progress. 

Several program managers also cited reluctance on the part of senior 
managers to hear bad news. Our past reviews have similarly noted that 
the overall competition for funding in DOD spurs program managers to 
suppress bad news because it can result in funding cuts. 

[End of section] 

Chapter 4: Basic Incentives Drive Differences in How Program Managers 
Are Supported and Held Accountable: 

Differences between DOD and leading companies in how program managers 
are supported and held accountable are rooted in differences in 
incentives and resulting behaviors. This begins with the definition of 
success. The commercial firms we studied concluded their survival 
hinged on their ability to increase their market share, which, in turn, 
meant developing higher-quality products, at the lowest possible price, 
and delivering them in a timely fashion--preferably before their 
competitors could do the same. This imperative meant that they had no 
choice but to narrow the gap between requirements and resources in a 
manner that not only ensured they met their market targets, but did so 
in a manner that consumed resources efficiently. It also meant that 
they had no choice but to fully support the development effort, to 
instill strategic planning and prioritization, to work collaboratively, 
to follow a knowledge-based process that makes product development 
manageable, and, ultimately, to assign accountability to all involved for 
success or failure. In theory, DOD's success likewise hinges on its 
ability to deliver high quality weapons to the warfighter in a timely 
fashion. But in practice, the implied definition of success is the 
ability of a program to win support and attract funds. Of course, there 
are reasons for this disconnect. Corporate revenue is generated by 
customer sales while DOD's funding is dependent on annual 
appropriations. Corporations go out of business when their product 
development efforts do not succeed; DOD does not. Selling products to 
customers is the single focus of a private-sector company while DOD is 
charged with a myriad of important missions--each of which also 
competes for budget share. As a result, program managers are 
incentivized to overpromise on performance because it makes their 
program stand out from others. They are incentivized to underestimate 
cost and schedule and to suppress bad news because doing otherwise 
could result in a loss of support and funding and further damage their 
program. In short, unknowns become acceptable and desirable rather than 
unacceptable as they are in the corporate environment. And 
accountability becomes much more difficult to define. 

Figure 10: Key Differences in Definition of Success and Resulting 
Behaviors: 

[See PDF for image] 

[End of figure] 

Definition of Success: 

Success for the commercial world is straightforward and simple: 
maximize profit. In turn, this means selling products to customers at 
the right price, right time, and right cost. Each of the commercial 
companies we visited enjoyed success to this end, but at some point in 
time, as competitors made gains, markets tightened, and the pace of 
technology changes grew faster, they realized they needed to do more to 
be successful. Toyota decided it needed to expand its role in the world 
marketplace, and this need persisted as competition grew stronger over 
the years. For Siemens, this realization came in the 1990s, when its 
Medical Division took a hard look at the profitability of its medical 
devices; for Motorola, it came in the 1980s, when it began losing market 
share for its communication devices. To turn themselves around, all 
three companies chose not to depend on technology breakthroughs or 
exotic marketing, but rather to improve their position by looking 
inward at how they approached development. Each found that there was 
room for improvement, starting with corporate cultures and ending with 
processes and controls. In Toyota's case, emphasis was largely placed 
on collaboration and consistency. In Siemens' case, emphasis was placed 
on quality, particularly because its medical products come under 
extensive Food and Drug Administration regulations. For Motorola, 
emphasis was placed on empowerment and commonality, particularly in the 
processes that support product development like software development. 

At DOD, success is often formally defined in terms similar to the 
commercial world's: deliver high-quality products to customers (the 
warfighter) at the right time and the right cost. Virtually all program 
managers we spoke with first defined success in terms of enabling 
warfighters and doing so in a timely and cost-efficient manner. But 
when the point was pursued further, it became clear that the implied 
definition for success in DOD is attracting funds for new programs, and 
keeping funds for ongoing programs. Program managers themselves say 
they spend enormous amounts of time retaining support for their efforts 
and that their focus is largely on keeping funds stable. They also 
observe that DOD starts more programs than it can afford to begin with, 
which merely sets the stage for competition and resulting behaviors. As 
noted earlier, there are factors that contribute to how success is 
defined in practice, including the fact that DOD depends on annual 
appropriations and it must fund a wide variety of missions beyond 
weapon systems development. However, according to program managers, the 
willingness to make trade-off decisions alone would help DOD mitigate 
these circumstances. 

Means for Success: 

Regardless of where they placed greatest emphasis, each company we 
studied adopted processes and support mechanisms that emphasized the 
following: 

* risk reduction, 

* knowledge-based decisionmaking, 

* discipline, 

* collaboration, 

* trust, 

* commitment, 

* consistency, 

* realism, and 

* accountability. 

Such characteristics were seen as absolutely essential to gaining 
strength in the marketplace. With limited opportunities to invest in 
new product development efforts, companies understand it is essential, 
for example, that they know they are pursuing efforts that will 
optimize profits. Therefore, estimates of costs and technology maturity 
must be accurate and they must be used for making decisions. 
Consistency and discipline are integral to assuring that successful 
efforts can be repeated. Ultimately, these characteristics translate 
into processes that help companies develop products quicker, cheaper, 
and better. At the strategic level, processes include accurate, 
strategic planning and prioritization to ensure the right mix of 
products are pursued; investment strategies that prioritize projects 
for funding; and strong systems engineering to help them establish a 
realistic business case that levels market needs with available 
resources prior to beginning a product development. At the tactical 
level, this includes knowledge-based developments that center on 
designing and manufacturing products that will sell well enough to make 
an acceptable profit. This combination of focused leadership and 
disciplined processes promotes positive behaviors: an insistence that 
technology development take place separately from product development 
programs and that trade-offs between requirements and resources be made 
before beginning a program; an atmosphere of early candor and openness 
from everyone as to potential program risks; realistic, knowledge-based 
cost and schedule estimates to support full funding decisions; and 
early testing, which allows "red lights" for problems that must be 
proven solved before they can be changed to "green lights." 

Once attracting and sustaining funds becomes a part of the definition 
of success, as it is at DOD, different values and behaviors emerge. For 
example, it is not necessarily in a program manager's interest to 
develop accurate estimates of cost, schedule, and technology readiness, 
because honest assessments could result in lost funding. Delayed 
testing becomes preferred over early testing since that will keep "bad 
news" at bay. 

Ultimately, no matter how well-intentioned or what improvements are 
made, DOD's processes and support mechanisms eventually play into 
funding competition. On paper, the requirements process may emphasize 
realism and the importance of incremental development, but in practice, 
it consistently encourages programs to promise performance features 
that significantly distinguish them from other systems. Likewise, 
changes may be made to make the funding process more strategic, but 
because there are still many programs competing for funds, it 
encourages cost and schedule estimates to be comparatively soft with 
little benefit from systems engineering tradeoffs. By favoring 
acquisition programs over science and technology efforts, the funding 
process also encourages programs to take on technology development that 
should be carried out in research labs. Lastly, the acquisition process 
may adopt world-class criteria for making decisions, but because it is 
much easier to attract funds for a formal weapons program than funds 
for the exercise of gaining knowledge about technologies, the process 
encourages programs to move forward despite risks with the assumption 
that programs can resolve technical, design, or production "glitches" 
later on. Significant unknowns are accepted in this environment. 
Delivering a product late and over cost does not necessarily threaten 
program success. The cumulative effect of these pressures is 
unrealistic cost and schedule estimates at the outset of a program 
that are breached, sometimes very significantly, by the time the weapon 
system is fielded. 

Other Differences Put Additional Pressures on DOD Program Managers: 

There are other environmental differences that put additional pressures 
on program managers within DOD. They include layers of internal and 
external oversight that come with DOD's stewardship responsibilities, 
personnel rules that make it more difficult to manage human capital and 
hold people accountable, laws and regulations that place additional 
constraints on an acquisition, and the mere size and scope of DOD, 
which adds significant challenges to communicating and collaborating 
effectively. 

For example, as shown in figure 11, commercial companies we visited 
tended to have fairly streamlined oversight. No matter at what level 
program managers resided, they had access to top executives who were 
empowered to help them make go/no-go decisions. Beyond this structure, 
the companies were subject to a degree of oversight from shareholders, 
but that oversight did not extend to day-to-day management of program 
development activities. At DOD, by contrast, program managers operate 
under many layers of oversight--both internally and externally. These 
include Congress, the President, the Secretary of Defense, and a 
myriad of functional agencies, as well as the military services--all of 
which have a say in DOD's overall budget as well as in funding for 
specific programs. 
Moreover, within these confines, leaders at all levels shift 
frequently. Much of this oversight is necessary for carrying out 
stewardship responsibilities for public money, but studies conducted by 
a variety of commissions assessing acquisition problems through the 
years have consistently found opportunities to reduce oversight 
layers, streamline oversight processes, and protect programs from 
frequent leadership changes. Program managers themselves understood the 
need for oversight but found that responding to oversight demands took 
too much of their time. They also identified ways to make it easier to 
work within this environment, including developing databases to support 
internal and external oversight requests, empowering program managers 
to make more day-to-day decisions, and making stakeholders more 
accountable. 

Figure 11: Commercial vs. DOD Oversight Environments: 

[See PDF for image] 

[End of figure] 

Program managers also cited several trends that have increased the 
pressures they face. These include (1) DOD's movement toward developing 
technically complex families of weapon systems as one package (system 
of systems), which they believe vastly increases management challenges 
and makes it more difficult to oversee contractors, and (2) DOD's 
reduction of its acquisition workforce over the past decade, which has 
made it more difficult to carry out day-to-day responsibilities and 
retain technical and business expertise. Overall, however, program 
managers consistently attributed their problems more to competition for 
funding than to these other factors. 

[End of section] 

Chapter 5: Conclusions and Recommendations: 

Like the commercial world, DOD has a mandate to deliver high-quality 
products to its customers at the right time and at the right price. 
Quality and time are especially critical to maintain DOD's superiority 
over others, to counter quickly changing threats, and to better protect 
and enable the warfighter. Cost is critical given DOD's stewardship 
over taxpayer money, long-term budget forecasts which indicate that the 
nation will not be able to sustain its current level of investment in 
weapon systems, and DOD's desire to dramatically transform the way it 
conducts military operations. At this time, however, DOD is not 
positioned to deliver high-quality products in a timely and 
cost-efficient fashion. It is not unusual to see cost increases that 
add up to tens of millions of dollars, schedule delays that add up to 
years, and large and expensive programs that are continually 
rebaselined. 
Recognizing this dilemma, DOD has tried to embrace best practices in 
its policies, instill more discipline in requirements setting, 
strengthen training for program managers, and lengthen program manager 
tenures. It has also reorganized offices that support and oversee 
programs, required programs to use independent cost estimates and 
systems engineering, and alternately relaxed and strengthened oversight 
of contractors in an effort to extract better performance from them. 
Yet despite these and many other actions, 
programs are still running over cost and behind schedule and, in some 
cases, changes have merely added tasks for program managers without 
adding value. 

Our work shows that this condition will likely persist until DOD 
provides a better foundation on which program managers can launch 
programs and more consistent and steadfast support once it commits to 
programs. At the core of this solution is developing and enforcing an 
investment strategy that prioritizes programs based on customer needs 
and DOD's long-term vision and reduces the burden of advocacy on the 
part of the program manager. DOD will always face funding uncertainties 
because of the environment it operates in, but it has an opportunity to 
greatly mitigate the attendant risks by separating long-term wants from 
needs, matching them against the technologies available today, 
tomorrow, and decades from now, and being realistic in determining what 
resources can be counted on. Without an investment strategy, all other 
improvements 
will likely succumb to the negative incentives and behaviors that come 
with continual competition for funding. With an investment strategy, 
senior leaders will be better positioned to formally commit to a 
business case that assures new programs fit in with priorities, that 
they begin with adequate knowledge about technology, time, and cost, 
and that they will follow a knowledge-based approach as they move into 
design and production. Another core enabler for improving program 
managers' chances for success lies in leadership's ability to implement 
evolutionary, incremental acquisition programs that have limited cycle 
times from beginning to delivery of the weapon system. This allows DOD 
to align program managers' tenures to delivery dates, thereby 
substantially increasing accountability for successful outcomes. 

Once senior leaders do their part--by providing program managers with 
an executable business case and committing their full support to a 
program--they can begin to empower program managers more and hold them 
accountable. By embracing a model for supporting program managers that 
incorporates all these elements, DOD can achieve the same outcomes for 
its weapons programs as other high-performing organizations. 

Recommendations for Executive Action: 

We recommend that the Secretary of Defense take the following actions 
to ensure program managers are well positioned to successfully execute 
and be held accountable for weapon acquisitions: 

* DOD should develop an investment strategy that, at a minimum, 

* determines the priority order of needed capabilities with a corollary 
assessment of the resources--that is, dollars, technologies, time, and 
people--needed to achieve these capabilities. The remaining 
capabilities should be set out separately as desirable, resources 
permitting. 

* lays out incremental product development programs for achieving 
desired capabilities, and: 

* establishes controls to ensure that requirements, funding, and 
acquisition processes will work together so that DOD will sustain its 
commitment to its priority programs. 

As DOD works to develop the strategy, it should take an interim step by 
identifying priorities for programs that are already past milestone B 
(the formal start of development). Once the strategy is complete, it 
should be used by the Office of the Secretary of Defense to prepare and 
assess annual budget proposals as well as to balance funding between 
science and technology efforts and acquisition efforts to ensure that 
robust technology development efforts are conducted, but outside the 
acquisition program until reaching maturity. 

* For each new major weapons program, require that senior-level 
officials from the requirements, science and technology, program 
management, and testing communities, as well as the Office of the 
Comptroller, formally commit to a business case prior to approving a 
program at milestone B. At a minimum, the business case should 
demonstrate that: 

* a requirement exists that warrants a materiel solution consistent 
with national military strategy, 

* an independent analysis of alternatives has been conducted: 

* the developer has the requisite technical knowledge to meet the 
requirement, 

* the developer has a knowledge-based product development and 
production plan that will attain high levels of design and production 
maturity, 

* reasonable estimates have been developed to execute the product 
development and production plan, and: 

* funding is available to execute the plan. 

* Develop and implement a process to instill and sustain accountability 
for successful program outcomes. At a minimum, this should consider: 

* matching program manager tenure with delivery of a product or for 
system design and demonstration, 

* tailoring career paths and performance management systems to 
incentivize longer tenures, 

* empowering program managers to execute their programs, including an 
examination of whether and how much additional authority can be 
provided over funding, staffing, and approving requirements proposed 
after milestone B, 

* developing and providing automated tools to enhance management and 
oversight as well as to reduce time required to prepare status 
information. 

Agency Comments and Our Evaluation: 

In commenting on a draft of our report, DOD's Acting Director for 
Procurement and Acquisition Policy concurred with our recommendations. 
In doing so, DOD asserted it was already taking actions to address our 
recommendations, notably by reviewing its overall approach to 
acquisition governance and investment decisionmaking as part of its 
Quadrennial Defense Review due in February 2006 and identifying ways to 
more effectively implement existing policies. DOD also stated that it 
intends to develop a plan to address challenges in acquisition 
manpower, including program manager tenure and career paths, and to 
enhance its information management systems to facilitate oversight and 
management decisions. As underscored in our report, DOD 
has attempted similar efforts in the past--that is, reviewed its 
approach to governance and investment decisions and policies--without 
achieving significant improvements. This is because DOD has not ensured 
that such actions were executed in tandem with (1) instilling more 
leadership and discipline in investment decisionmaking, in both the 
short and long term and (2) instilling accountability--by requiring key 
senior officials to sign a business case, based on systems engineering 
knowledge, prior to every new acquisition as well as by matching 
program managers' tenure to cycle time. Therefore, in pursuing the 
actions it identifies in its response to our report, we believe that 
DOD should address the important questions of who should be held 
accountable for acquisition decisions; how much deviation from the 
original business case is allowed before it is no longer considered 
valid and the investment reconsidered; and what the penalty is when 
investments fail to meet promised warfighter needs. 

The full text of the department's response is in appendix I. 

[End of section] 

Appendix I: Comments from the Department of Defense: 

Acquisition Technology And Logistics: 

Office Of The Under Secretary Of Defense: 

3000 Defense Pentagon Washington, DC 20301-3000: 

Nov. 22, 2005: 

DPAP/PAIC: 

Mr. Michael Sullivan: 
Director, Acquisition and Sourcing Management: 
U.S. Government Accountability Office: 
441 G Street, N.W.: 
Washington, D.C. 20548: 

Dear Mr. Sullivan: 

This is the Department of Defense response to the GAO draft report, 
`BEST PRACTICES: Better Support of Weapon System Program Managers 
Needed to Improve Outcomes,' dated October 21, 2005, (GAO Code 120320/ 
GAO-06-110). I concur with the recommendations in the report and have 
provided some amplifying discussion. 

Domenic Cipicchio: 
Acting Director, Defense Procurement and Acquisition Policy: 

Attachment: As stated: 

GAO Draft Report - Dated October 21, 2005 GAO Code 120320/GAO-06-110: 

"Best Practices: Better Support Of Weapon System Program Managers 
Needed To Improve Outcomes" 

Department Of Defense Comments To The Recommendations: 

Recommendation 1: The GAO recommended that the Secretary of Defense 
direct the DoD to develop an investment strategy that at a minimum, 

* determines the priority order of needed capabilities with a corollary 
assessment of the resources--that is, dollars, technologies, time, and 
people--needed to achieve these capabilities. The remaining 
capabilities should be set out separately as desirable, resources 
permitting. 

* lays out incremental product development programs for achieving 
desired capabilities, and: 

* establishes controls to ensure that requirements, funding, and 
acquisition processes will work together so that DoD will sustain its 
commitment to its priority programs. 

Also, as DoD works to develop the strategy, it should take an interim 
step by identifying priorities for programs that are already past 
milestone B. Once the strategy is complete, it should be used by the 
Office of the Secretary of Defense to prepare and assess annual budget 
proposals as well as to balance funding between science and technology 
efforts and acquisition efforts to ensure that robust technology 
development efforts are conducted, but outside the acquisition program 
until reaching maturity. (p. 61/GAO Draft Report): 

DOD Response: Concur: 

DoD is currently reviewing its overall approach to department 
acquisition governance with the objective of (1) refining the mechanism 
for prioritizing materiel acquisition proposals and improving the 
alignment between corporate commitment and associated resource 
allocation; (2) effectively implementing existing evolutionary 
acquisition policy; (3) ensuring that current statutory and regulatory 
controls operate effectively; and (4) creating a more complementary 
relationship between technology development and acquisition. The 
ongoing Quadrennial Defense Review (QDR) and related efforts will 
address these issues. The QDR will be completed in February 2006. 
Subsequently, DoD will develop an implementation plan and schedule for 
actions that will address these issues. 

Recommendation 2: The GAO recommended that the Secretary of Defense, 
for each new weapons program, require that senior-level officials from 
the requirements, science and technology, program management, testing 
communities as well as the Office of the Comptroller formally commit to 
a business case prior to approving a program at milestone B. At a 
minimum, the business case should demonstrate that: 

* a requirement exists that warrants a materiel solution consistent 
with national military strategy, 

* an independent analysis of alternatives has been conducted: 

* the developer has the requisite technical knowledge to meet the 
requirement, 

* the developer has a knowledge-based product development and 
production plan that will attain high levels of design and production 
maturity, 

* reasonable estimates have been developed to execute the product 
development and production plan, and: 

* funding is available to execute the plan. (pages 61 & 62/GAO Draft 
Report): 

DOD Response: Concur: 

Current DoD policy requires a number of criteria to be met before a 
program may be formally initiated. These include: (1) a Joint 
Requirements Oversight Council validated requirement that is consistent 
with national military strategy; (2) an Analysis of Alternatives that 
has been completed and assessed; (3) evidence of technology maturity 
that has been independently assessed; (4) an approved acquisition 
strategy; (5) an approved acquisition program baseline; (6) a 
completed, independent cost estimate; and (7) full funding. Mechanisms 
to improve the effectiveness of these policies are being considered as 
part of the QDR. 

Recommendation 3: The GAO recommended that the Secretary of Defense 
direct the DoD to develop and implement a process to instill and 
sustain accountability for successful program outcomes. At a minimum, 
this should consider: 

* matching program manager tenure with delivery of a product or for 
system design and demonstration, 

* tailoring career paths and performance management systems to 
incentivize longer tenures, 

* empowering program managers to execute their programs, including an 
examination of whether and how much additional authority can be 
provided over funding, staffing, and approving requirements proposed 
after milestone B, and 

* developing and providing automated tools to enhance management and 
oversight as well as to reduce time required to prepare status 
information. (p. 62/GAO Draft Report): 

DOD Response: Concur: 

DoD is currently engaged in the development of a manpower strategy 
designed to satisfy our current and future acquisition manpower 
challenges. That strategy will be comprehensive, and consider such 
issues as program manager tenure and career paths. This strategy will 
be developed by the end of FY 2006. In addition, the department is 
considering policy designed to reduce requirements growth after program 
initiation and is in the process of developing a transparent and 
efficient information management system intended to provide management 
with accurate and current program information. 

[End of section] 

Appendix II: GAO Staff Acknowledgments: 

GAO Contacts: 

Michael J. Sullivan (202) 512-4841 or sullivanm@gao.gov: 

Staff Acknowledgments: 

Greg Campbell, Cristina Chaplain, Ron La Due Lake, Sigrid McGinty, Jean 
McEwen, Carol Mebane, Guisseli Reyes, Lesley Rinner, Lisa Simon, 
Bradley Trainor, and Michele Williamson. 

[End of section] 

Related GAO Products: 

DOD Acquisition Outcomes: A Case for Change. GAO-06-257T. Washington, 
D.C.: November 15, 2005. 

Defense Acquisitions: Stronger Management Practices Are Needed to 
Improve DOD's Software-Intensive Weapon Acquisitions. GAO-04-393. 
Washington, D.C.: March 1, 2004. 

Best Practices: Setting Requirements Differently Could Reduce Weapon 
Systems' Total Ownership Costs. GAO-03-57. Washington, D.C.: February 
11, 2003. 

Best Practices: Capturing Design and Manufacturing Knowledge Early 
Improves Acquisition Outcomes. GAO-02-701. Washington, D.C.: July 15, 
2002. 

Defense Acquisitions: DOD Faces Challenges in Implementing Best 
Practices. GAO-02-469T. Washington, D.C.: February 27, 2002. 

Best Practices: Better Matching of Needs and Resources Will Lead to 
Better Weapon System Outcomes. GAO-01-288. Washington, D.C.: March 8, 
2001. 

Best Practices: A More Constructive Test Approach Is Key to Better 
Weapon System Outcomes. GAO/NSIAD-00-199. Washington, D.C.: July 31, 
2000. 

Defense Acquisition: Employing Best Practices Can Shape Better Weapon 
System Decisions. GAO/T-NSIAD-00-137. Washington, D.C.: April 26, 2000. 

Best Practices: DOD Training Can Do More to Help Weapon System Programs 
Implement Best Practices. GAO/NSIAD-99-206. Washington, D.C.: August 
16, 1999. 

Best Practices: Better Management of Technology Development Can Improve 
Weapon System Outcomes. GAO/NSIAD-99-162. Washington, D.C.: July 30, 
1999. 

Defense Acquisitions: Best Commercial Practices Can Improve Program 
Outcomes. GAO/T-NSIAD-99-116. Washington, D.C.: March 17, 1999. 

Defense Acquisition: Improved Program Outcomes Are Possible. GAO/T- 
NSIAD-98-123. Washington, D.C.: March 17, 1998. 

Best Practices: DOD Can Help Suppliers Contribute More to Weapon System 
Programs. GAO/NSIAD-98-87. Washington, D.C.: March 17, 1998. 

Best Practices: Successful Application to Weapon Acquisition Requires 
Changes in DOD's Environment. GAO/NSIAD-98-56. Washington, D.C.: 
February 24, 1998. 

Best Practices: Commercial Quality Assurance Practices Offer 
Improvements for DOD. GAO/NSIAD-96-162. Washington, D.C.: August 26, 
1996. 

FOOTNOTES 

[1] Defense Management: Key Elements Needed to Successfully Transform 
DOD Business Operations, GAO-05-629T (Washington, D.C.: April 28, 2005) 
and High-Risk Series: An Update, GAO-05-207 (Washington, D.C.: January 
2005). 

[2] A complete list of best practices reports is at the end of this 
report. 

[3] These figures represent the costs for the top five weapon systems 
in 2001 and the top five in 2005. For 2001, these systems were F/A-22 
Raptor, DDG-51 Guided Missile Destroyer, Virginia Class Submarine, C-17 
Globemaster III, and the F/A 18 E/F, Naval Strike Fighter. The 2005 
systems include the Joint Strike Fighter, Future Combat System, F/A-22 
Raptor, DDG-51 Guided Missile Destroyer, and the Virginia Class 
Submarine. 

[4] 10 U.S.C. § 1701 et seq. (P.L. 101-510, Div. A, Title XII (November 
5, 1990)). 

[5] DOD Directive 5000.1, the Defense Acquisition System (May 2003) and 
DOD Instruction 5000.2 Operation of the Defense Acquisition System (May 
2003). The directive establishes evolutionary acquisition strategies as 
the preferred approach to satisfying DOD's operational needs. The 
directive also requires program managers to provide knowledge about key 
aspects of a system at key points in the acquisition process. The 
instruction implements the policy and establishes detailed policy for 
evolutionary acquisition. 

[6] Tactical Aircraft: Opportunity to Reduce Risks in the Joint Strike 
Fighter Program with Different Acquisition Strategy, GAO-05-271 
(Washington, D.C.: March 15, 2005). 

[7] Space Acquisitions: Stronger Development Practices and Investment 
Planning Needed to Address Continuing Problems, GAO-05-891T 
(Washington, D.C.: July 12, 2005). 

[8] Defense Acquisitions: Assessments of Selected Major Weapon 
Programs, GAO-05-301 (Washington, D.C.: March 31, 2005). 

[9] Defense Acquisitions: DOD's Revised Policy Emphasizes Best 
Practices but More Controls Are Needed, GAO-04-53 (Washington, D.C.: 
Nov. 17, 2003). 

GAO's Mission: 

The Government Accountability Office, the investigative arm of 
Congress, exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO's commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO's Web site (www.gao.gov) contains 
abstracts and full-text files of current reports and testimony and an 
expanding archive of older products. The Web site features a search 
engine to help you locate documents using key words and phrases. You 
can print these documents in their entirety, including charts and other 
graphics. 

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as "Today's Reports," on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order 
GAO Products" heading. 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. Government Accountability Office 

441 G Street NW, Room LM 

Washington, D.C. 20548: 

To order by Phone: 

Voice: (202) 512-6000: 

TDD: (202) 512-2537: 

Fax: (202) 512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: www.gao.gov/fraudnet/fraudnet.htm 

E-mail: fraudnet@gao.gov 

Automated answering system: (800) 424-5454 or (202) 512-7470: 

Public Affairs: 

Jeff Nelligan, managing director, 

NelliganJ@gao.gov 

(202) 512-4800 

U.S. Government Accountability Office, 

441 G Street NW, Room 7149 

Washington, D.C. 20548: