This is the accessible text file for GAO report number GAO-03-600 
entitled 'Missile Defense: Additional Knowledge Needed in Developing 
System for Intercepting Long-Range Missiles' which was released on 
September 23, 2003.

This text file was formatted by the U.S. General Accounting Office 
(GAO) to be accessible to users with visual impairments, as part of a 
longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately.

Report to the Ranking Minority Member, Subcommittee on Financial 
Management, the Budget, and International Security, Committee on 
Governmental Affairs, U.S. Senate:

United States General Accounting Office:

GAO:

August 2003:

Missile Defense:

Additional Knowledge Needed in Developing System for Intercepting 
Long-Range Missiles:

GAO-03-600:

GAO Highlights:

Highlights of GAO-03-600, a report to the Ranking Minority Member, 
Subcommittee on Financial Management, the Budget, and International 
Security, Committee on Governmental Affairs, U.S. Senate 

Why GAO Did This Study:

A number of countries hostile to the United States and its allies have 
or will soon have missiles capable of delivering nuclear, biological, 
or chemical weapons. To counter this threat, the Department of 
Defense’s (DOD’s) Missile Defense Agency (MDA) is developing a system 
to defeat ballistic missiles.

MDA expects to spend $50 billion over the next 5 years to develop and 
field this system. A significant portion of these funds will be 
invested in the Ground-based Midcourse Defense (GMD) element. To field 
elements as soon as practicable, MDA has adopted an acquisition 
strategy whereby capabilities are upgraded as new technologies become 
available and is implementing it in 2-year blocks.

Given the risks inherent to this strategy, GAO was asked to determine 
when MDA plans to demonstrate the maturity of technologies critical to 
the performance of GMD’s Block 2004 capability and to identify the 
estimated costs to develop and field the GMD element and any 
significant risks with the estimate.

What GAO Found:

GMD is a sophisticated weapon system being developed to protect the 
United States against limited attacks by long-range ballistic 
missiles. It consists of a collection of radars and a weapon component—
a three-stage booster and exoatmospheric kill vehicle—integrated by a 
centralized control system that formulates battle plans and directs 
the operation of GMD components. Successful performance of these 
components is dependent on 10 critical technologies.

MDA expects to demonstrate the maturity of most of these technologies 
before fielding the GMD element, which is scheduled to begin in 
September 2004. However, the agency has accepted higher cost and 
schedule risks by beginning integration of the element’s components 
before these technologies have matured. So far, MDA has matured two 
critical GMD technologies. If development and testing progress as 
planned, MDA expects to demonstrate the maturity of five other 
technologies by the second quarter of fiscal year 2004.

The radar technologies are the least mature. MDA intends to 
demonstrate the maturity of an upgraded early warning radar in 
California in the first quarter of fiscal year 2005 and a sea-based 
radar in the Pacific Ocean in the fourth quarter of that year. 
Although MDA does not plan to demonstrate the maturity of the 
technology of the early warning radar in Alaska, which will serve as 
the primary fire control radar, through its own integrated flight 
tests, it may be able to do so through the anticipated launch of 
foreign test missiles.

MDA estimates that it will spend about $21.8 billion between 1997 and 
2009 to develop the GMD element. This estimate includes $7.8 billion 
to develop and field the GMD Block 2004 capability. For example, the 
funds will be used to install interceptors at two sites, upgrade 
existing radars and testing infrastructure, and develop the sea-based 
X-band radar. We found that MDA has incurred a greater risk of cost 
growth because for more than a year the agency was not able to rely 
fully on data from its primary tool for monitoring whether the GMD 
contractor has been performing work within cost and on schedule. In 
February 2002, MDA modified the prime contract to reflect an increased 
scope of work for developing GMD. It was not until July 2003 that the 
agency completed a review to ensure that the data were fully reliable.


What GAO Recommends:

GAO is recommending that DOD (1) explore options to demonstrate the 
effectiveness of the Cobra Dane radar and (2) establish procedures to 
help ensure that data from MDA's monitoring system are reliable. DOD 
concurred with GAO’s first recommendation and partially concurred with 
GAO’s second.

www.gao.gov/cgi-bin/getrpt?GAO-03-600.

To view the full product, including the scope and methodology, click 
on the link above. For more information, contact Robert E. Levin at 
(202) 512-4841 or levinr@gao.gov.

[End of section]

Contents:

Letter:

Results in Brief:

Background:

MDA Expects to Demonstrate the Maturity of Most GMD Technologies before 
September 2004:

MDA Has Risked Cost Growth Because It Could Not Fully Rely on Data from 
Its System for Monitoring Contractor Performance:

Conclusions:

Recommendations for Executive Action:

Agency Comments and Our Evaluation:

Appendix I: Scope and Methodology:

Appendix II: Comments from the Department of Defense:

Appendix III: Technology Readiness Level Assessment Matrix:

Appendix IV: Importance of Earned Value Management:

Appendix V: GAO Contact and Staff Acknowledgments:

Tables:

Table 1: Technology Readiness Levels of GMD Critical Technologies:

Table 2: Estimated Cost to Develop and Field GMD:

Table 3: 32 Criteria for Earned Value Management Systems:

Figures:

Figure 1: Components of GMD:

Figure 2: Notional GMD Concept of Operations:

Figure 3: Tasks GMD Plans to Accomplish for the GMD Block 2004 Project:

Abbreviations:

BMDO: Ballistic Missile Defense Organization:

CPR: Cost Performance Report:

DCMA: Defense Contract Management Agency:

EVM: Earned Value Management:

GMD: Ground-based Midcourse Defense:

IBR: integrated baseline review:

IFT: integrated flight test:

MDA: Missile Defense Agency:

NMD: National Missile Defense:

TRL: technology readiness level:

United States General Accounting Office:

Washington, DC 20548:

August 21, 2003:

The Honorable Daniel K. Akaka 
Ranking Minority Member 
Subcommittee on Financial Management, the Budget, and International 
Security 
Committee on Governmental Affairs 
United States Senate:

Dear Senator Akaka:

Hostile states, including those that sponsor terrorism, are investing 
significant resources to develop and deploy ballistic missiles of 
increasing range and sophistication that could be used against the 
United States, our deployed forces, and our allies. At least 25 
countries now have, or are in the process of acquiring, missiles 
capable of delivering nuclear, biological, or chemical weapons. To 
counter this threat, in December 2002 the President of the United 
States directed the Department of Defense (DOD) to begin fielding a 
ballistic missile defense system in 2004.

The Missile Defense Agency (MDA) within DOD is responsible for 
developing this system, including the Ground-based Midcourse Defense 
(GMD) element,[Footnote 1] which is being developed to protect the 
United States against long-range ballistic missiles. MDA is also 
building an integrated testing infrastructure--or "test bed"--with the 
newly designated GMD element as its centerpiece. MDA expects to spend 
nearly $50 billion in research and development funds between fiscal 
years 2004 and 2009 to develop and field a ballistic missile defense 
system. A significant percentage of the $50 billion will be invested in 
the GMD element.

GMD is a sophisticated weapon system that will rely on state-of-the-art 
technologies that have been under development for a number of years. 
GMD will use space-based sensors to provide early warning of missile 
launches; ground-based radars to identify and refine the tracks of 
threatening warheads and associated objects; ground-based interceptors 
(each consisting of a three-stage booster and exoatmospheric kill 
vehicle) to destroy warheads; and a centralized control system that 
formulates battle plans and directs the operation of GMD components for 
carrying out the missile defense mission.

To meet the technical challenge of developing both the integrated 
system and the GMD element, MDA has adopted a "capabilities-based" 
acquisition strategy and is implementing it in 2-year development 
blocks. This approach is designed to field elements as soon as 
practicable and to improve the effectiveness of fielded elements by 
upgrading their capability as new technologies become available or as 
the threat warrants. Block 2004 will be the first block fielded, 
followed by Blocks 2006 and 2008. Although GMD's Block 2004 capability 
is expected to be fielded beginning in September 2004, MDA plans to 
upgrade that capability through the end of 2005.[Footnote 2]

Because development and fielding of GMD involves substantial technical 
challenges and a major investment, you asked us to review technical and 
cost issues related to the GMD element. Specifically, we determined 
when MDA plans to demonstrate the maturity[Footnote 3] of technologies 
critical to the performance of GMD's Block 2004 capability. We also 
identified the estimated costs to develop and field the GMD element and 
any significant risks associated with the estimate.

Our scope and methodology are included in appendix I. Although we 
assessed the maturity of specific GMD critical technologies, the scope 
of this review did not include an evaluation of MDA's test plans for 
demonstrating GMD's ability to operate as a system overall. Our 
detailed assessment of GMD system-level testing is included in a 
classified report that we issued in June 2003 to other congressional 
requesters.

Results in Brief:

MDA expects to demonstrate the maturity of most of the ten technologies 
critical to GMD's initial performance before fielding of the element 
begins in September 2004. However, the agency has accepted a higher 
risk of cost growth and schedule slips by beginning the integration of 
the element's components before these technologies have been 
demonstrated. So far, MDA has matured two critical GMD technologies--
the infrared sensors of the kill vehicle[Footnote 4] and the fire 
control software of the battle management component.[Footnote 5] But if 
development and testing progress as planned, MDA expects to demonstrate 
the maturity of five others--resident in the kill vehicle, interceptor 
boosters, and the battle management component--by the second quarter of 
fiscal year 2004. MDA intends to demonstrate the maturity of an 
upgraded early warning radar--located at Beale Air Force Base, 
California--in the first quarter of fiscal year 2005 and a sea-based X-
band radar, located in the Pacific Ocean, in the fourth quarter of that 
year. MDA does not plan to demonstrate through its own integrated 
flight tests the maturity of a technology resident in the Cobra Dane 
radar located in Alaska, which will serve as the element's primary 
radar when GMD is first fielded. Agency officials told us that they may 
be able to test the radar through the anticipated launch of foreign 
test missiles. However, it is not clear that testing Cobra Dane in this 
manner will provide all of the information that a dedicated test 
provides because MDA will not control the configuration of the target 
or the flight environment.

MDA estimates that it will spend about $21.8 billion between 1997 and 
2009 to develop the GMD element. This estimate includes $7.8 billion to 
develop and field the GMD Block 2004 capability and to develop the GMD 
portion of the test bed between 2002 and 2005. For example, the funds 
will be used to install interceptors at Fort Greely, Alaska, and 
Vandenberg Air Force Base, California; upgrade existing radars and the 
test bed infrastructure; and develop the sea-based X-band radar.

MDA has incurred a greater risk of cost growth because for more than a 
year the agency was not able to rely fully on the data from its primary 
tool for monitoring whether the GMD contractor was performing work 
within cost and on schedule--the prime contractor's Earned Value 
Management (EVM) system.[Footnote 6] In February 2002, MDA modified 
GMD's contract to bring it into line with the agency's new 
capabilities-based acquisition strategy. It took several months to 
establish an interim cost baseline[Footnote 7] against which to measure 
the contractor's performance and 13 months to complete revisions to the 
baseline. Also, MDA and the contractor did not complete a review until 
July 2003 to ensure that the revised baseline was accurate and that 
contractor personnel were correctly using it to measure performance. 
This review was of particular importance because an earlier review 
revealed significant deficiencies in the contractor's development and 
use of the initial contract baseline. Until this review was completed, 
MDA could not be certain that it could rely fully on the data from its 
EVM system to recognize and correct potential problems in time to 
prevent significant cost increases and schedule delays.
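The cost and schedule indicators that an EVM system produces follow 
standard, published formulas. The Python sketch below shows the 
textbook arithmetic with illustrative numbers; the figures are not 
drawn from actual GMD contract data.

```python
# Minimal sketch of the standard Earned Value Management (EVM)
# indicators. Formulas are the textbook EVM definitions; the sample
# figures below are illustrative only, not GMD contract data.

def evm_metrics(bcws, bcwp, acwp):
    """Compute core EVM indicators.

    bcws: budgeted cost of work scheduled (planned value)
    bcwp: budgeted cost of work performed (earned value)
    acwp: actual cost of work performed (actual cost)
    """
    return {
        "cost_variance": bcwp - acwp,      # negative = over cost
        "schedule_variance": bcwp - bcws,  # negative = behind schedule
        "cpi": bcwp / acwp,                # cost performance index
        "spi": bcwp / bcws,                # schedule performance index
    }

m = evm_metrics(bcws=100.0, bcwp=90.0, acwp=110.0)
print(m["cost_variance"])   # -20.0: work cost more than its budgeted value
print(round(m["cpi"], 2))   # 0.82: each dollar spent earned about 82 cents of work
```

Without an accurate, stable baseline (the bcws figures), these 
indicators cannot be trusted, which is why the revised-baseline review 
described above mattered.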

We are making recommendations that MDA (1) consider adding a test of 
the effectiveness of the radar in Alaska; and (2) ensure that 
procedures are in place that will increase MDA's confidence in data 
from its EVM system. DOD concurred with our first recommendation and 
partially concurred with the second. In commenting on the draft report, 
DOD stated that the feasibility of these procedures will be determined 
and that a portion of the work is already being accomplished.

Background:

The concept of using a missile to destroy another missile (hit-to-kill) 
has been explored since the mid-1950's, but it was not until 1984 that 
the first such intercept achieved its objective. Between the mid-1980's 
and late-1990's the United States conducted a number of experiments 
designed to demonstrate that it was possible to hit one missile with 
another. In 1997, the Ballistic Missile Defense Organization (BMDO) 
established the National Missile Defense (NMD) Joint Program Office. 
The program office was directed to demonstrate by 1999 a system that 
could protect the United States from attacks of intercontinental 
ballistic missiles and to be in a position to deploy the system by 
2003 if the threat warranted. The initial system consisted of space- 
and ground-based sensors, early warning radars, interceptors, and battle 
management functions.

The program underwent additional changes as the new decade began. 
In September 2000, the President decided to defer deployment of the 
NMD system, but development of the system continued with the goal of 
being ready to deploy the system when directed. This action was 
followed in 2001 by BMDO's redirection of the prime contractor's 
efforts from developing and deploying an NMD system to developing an 
integrated test bed with the newly designated GMD system as its 
centerpiece. The Secretary of Defense, in January 2002, renamed BMDO as 
MDA and consolidated all ballistic missile defense programs under the 
new agency. Former missile defense acquisition programs became elements 
of a single ballistic missile defense system. These changes were 
followed in December 2002 by the President's directive to begin 
fielding in 2004 a ballistic missile defense system, which included 
components of the GMD element already under development.

The GMD element is intended to protect the United States against long-
range ballistic missiles in the midcourse phase of their flight. This 
is the point outside the atmosphere where the motors that boost an 
enemy missile into space have stopped burning and the deployed warhead 
follows a predictable path toward its target. Compared to the boost and 
terminal phases, this stage of flight offers the largest window of 
opportunity for interception and allows the GMD element a longer time 
to track and engage a target.

As illustrated in figure 1, GMD will rely on a broad array of 
components to track and intercept missiles. Figure 2 provides a 
notional concept of how these components will operate once they are 
fully integrated into the GMD element.

Figure 1: Components of GMD:

[See PDF for image]

[End of figure]

Figure 2: Notional GMD Concept of Operations:

[See PDF for image]

Note: The concept of operations assumes weapons release authority has 
been previously granted by the President of the United States or the 
Secretary of Defense. Missile flight times may be too brief to ask for 
permission to launch interceptors and engage the enemy.

[End of figure]

MDA Expects to Demonstrate the Maturity of Most GMD Technologies before 
September 2004:

MDA is gaining the knowledge it needs to have confidence that 
technologies critical to the GMD Block 2004 capability will work as 
intended. Two of the ten technologies essential to the Block 2004 
capability have already been incorporated into actual prototype 
hardware and have been demonstrated to function as expected in an 
operational environment.[Footnote 8] Other technologies are reaching 
this level of maturity. If development and testing proceed as planned, 
MDA will demonstrate the maturity of five additional technologies by 
the second quarter of fiscal year 2004 and two critical radar 
technologies during fiscal year 2005. MDA believes that its best 
opportunity to demonstrate the maturity of the tenth technology, which 
is critical to GMD's primary radar, may come through the anticipated 
flight tests of foreign missiles.

Our work over the years has found that making a decision to begin 
system integration of a capability before the maturity of all critical 
technologies has been demonstrated increases the program's cost, 
schedule, and performance risks. Because the President directed DOD to 
begin fielding a ballistic missile defense system in 2004, MDA began 
GMD system integration with technologies whose maturity has not been 
demonstrated. As a result, there is a greater likelihood that critical 
technologies will not work as intended in planned flight tests. If this 
occurs, MDA may have to spend additional funds in an attempt to 
identify and correct problems by September 2004 or accept a less 
capable system.[Footnote 9]

Importance of Maturing Technology:

Successful developers follow "knowledge-based acquisition" practices 
to get quality products to the customer as quickly and cost effectively 
as possible. As a part of meeting this goal, developers focus their 
technology programs on maturing technologies that have the realistic 
potential for being incorporated into the product under consideration. 
Accordingly, successful developers spend time to mature technology in a 
technology setting, where costs are typically not as great, and they do 
not move forward with product development--the initiation of a program 
to fully design, integrate, and demonstrate a product for production--
until essential technologies are sufficiently mature.

An analytical tool called technology readiness levels 
(TRLs),[Footnote 10] which has been used by DOD and the National 
Aeronautics and Space Administration, can be used to assess the 
maturity of a technology as well as the risk that the technology poses 
if it is included in a product's development. The nine readiness 
levels are associated 
in a product's development. The nine readiness levels are associated 
with progressing levels of technological maturity and demonstrated 
performance relative to a particular application--starting with paper 
studies of applied scientific principles (TRL 1) and ending with a 
technology that has been "flight proven" on an actual system through 
successful mission operations (TRL 9). Additional details on TRLs are 
shown in appendix III.

TRLs provide a gauge of how much knowledge the program office has 
on the progress or status of a particular technology and are based on 
two principal factors: (1) the fidelity of demonstration hardware, 
including design maturity and level of functionality achieved; and 
(2) the extent and realism of the environment in which the technology 
has been demonstrated.
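The nine-level scale and MDA's preferred TRL 7 bar can be summarized 
in a small lookup, sketched here in Python. The one-line descriptions 
are paraphrased from the standard DOD/NASA definitions (see app. III); 
the helper function and its name are illustrative, not MDA's.

```python
# Illustrative summary of the nine technology readiness levels (TRLs).
# Descriptions are paraphrased from the standard definitions; the
# threshold check mirrors MDA's stated preference for TRL 7 before
# including a technology in a block configuration.

TRL_DESCRIPTIONS = {
    1: "Basic principles observed and reported",
    2: "Technology concept or application formulated",
    3: "Analytical and experimental proof of concept",
    4: "Component validation in a laboratory environment",
    5: "Component validation in a relevant environment",
    6: "Prototype demonstration in a relevant environment",
    7: "Prototype demonstration in an operational environment",
    8: "Actual system completed and flight qualified",
    9: "Actual system flight proven through mission operations",
}

def ready_for_block(trl, threshold=7):
    """MDA's preferred bar: TRL 7 or higher before fielding."""
    return trl >= threshold

# Example: a technology assessed at TRL 6 falls short of the bar.
print(TRL_DESCRIPTIONS[6])
print(ready_for_block(6))  # False
```

The jump from level 6 to level 7 is the step this report repeatedly 
returns to: it requires demonstration in an operational rather than 
merely relevant environment.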

MDA recognizes the value of beginning system integration with mature 
technology and of using TRLs to assess the maturity of technology 
proposed for a block configuration. In particular, MDA prefers to 
include new technology in a block configuration only if the technology 
has reached a TRL 7; that is, only if prototype hardware with the 
desired form, fit, and function has been proved in an operational 
environment. However, MDA retains the flexibility to include less 
mature technology in a block configuration if that technology offers a 
significant benefit in performance and the risk of retaining it is 
acceptable and properly managed.

Readiness Levels of GMD Element Technologies:

Through technical discussions with the GMD joint program office and 
its prime contractor, we identified ten critical GMD technologies and 
jointly assessed the readiness level of each. The critical technologies 
are resident in the exoatmospheric kill vehicle, the boosters, the 
battle management, command, and control component, and in the element's 
radars. In 7 of 10 cases, we agreed with the program office and the 
GMD prime contractor on the maturity level of the element's critical 
technologies. The differences in the remaining three cases, as 
discussed in detail below, were primarily due to interpretation of TRL 
definitions. The program office and its contractor rated the two 
booster technologies and one radar technology at higher readiness 
levels than, in our opinion, MDA had demonstrated.

Most critical GMD technologies are currently at TRLs 5 and 6. At TRL 5, 
the technology's development is nearing completion, but it has not been 
applied or fitted for the intended product. At this point, the 
technology has been incorporated into a high-fidelity 
breadboard[Footnote 11] that has been tested in a laboratory or 
relevant environment[Footnote 12]. Although this demonstrates the 
functionality of the technology to some extent, the hardware is not 
necessarily of the form and fit (configuration) that would be 
integrated into the final product. A new application of existing 
technology is usually assessed at a TRL 5, because the technology has 
not been demonstrated in the relevant environment for the new 
application. TRL 6 begins the true "fitting" or application of the 
technology to the intended product. To reach this level, technology 
must be a part of a representative prototype that is very close to the 
form, fit, and function of that needed for the intended product. 
Reaching a TRL 6 requires a major step in a technology's demonstrated 
readiness; that is, the prototype must be tested in a high-fidelity 
laboratory environment or demonstrated in a restricted but relevant 
environment.

Two of the ten GMD technologies were assessed at a TRL 7, the level 
that successful developers insist upon before initiating product 
development. To reach this level, a pre-production prototype of the 
technology must demonstrate its expected functionality in an 
operational environment. If development and testing proceed as planned 
by MDA, we judge that most of the technologies (7 of 10) will be at a 
TRL 7 after the completion of integrated flight test (IFT)-14,[Footnote 
13] which is scheduled for the second quarter of fiscal year 2004. 
Table 1 summarizes our assessment of the TRL for each critical 
technology as of June 2003 and the date at which MDA anticipates each 
technology will reach TRL 7. A detailed discussion of each critical 
technology follows.

Table 1: Technology Readiness Levels of GMD Critical Technologies:

[See PDF for image]

Source: GAO analysis of GMD data.

Note: Information provided in the table--the configuration of flight 
test events and associated date--is as of June 2003 and is subject to 
change.

[A] Assumes technology development and demonstrations will have been 
successful.

[End of table]

Exoatmospheric Kill Vehicle Technologies:

The exoatmospheric kill vehicle is the weapon component of the GMD 
interceptor that attempts to detect and destroy the threat reentry 
vehicle through a hit-to-kill impact. The prime contractor identified 
three critical technologies pertaining to the operation of the 
exoatmospheric kill vehicle. They include the following:

* Infrared seeker, which is the "eyes" of the kill vehicle. The seeker 
is designed to support kill vehicle functions like tracking and target 
discrimination. The primary subcomponents of the seeker are the 
infrared sensors, a telescope, and the cryostat that cools down the 
sensors.

* On-board discrimination, which is needed to identify the true warhead 
from among decoys and associated objects. Discrimination is a critical 
function of the hit-to-kill mission that requires the successful 
execution of a sequence of functions, including target detection, 
target tracking, and the estimation of object features. As such, 
successful operation of the infrared seeker is a prerequisite for 
discrimination.

* Guidance, navigation, and control subsystem, which is a combination 
of hardware and software that enables the kill vehicle to track its 
position and velocity in space and to physically steer itself into the 
designated target.

All three kill vehicle technologies have been demonstrated to some 
extent in actual integrated flight tests on near-production-
representative kill vehicles. The infrared seeker has reached a TRL 7, 
because a configuration very much like that to be fielded has been 
demonstrated in previous integrated flight tests, and only minor design 
upgrades are planned to reach the Block 2004 configuration. The 
remaining two kill vehicle technologies are at a TRL 6, because their 
functionality is being upgraded and the technologies have yet to be 
incorporated into the kill vehicle and demonstrated in an operational 
environment.

The on-board discrimination technology has not yet reached TRL 7 
because MDA has not tested a "knowledge database" that is expected to 
increase the kill vehicle's discrimination capability. The purpose of 
the database is to enable the kill vehicle to distinguish the 
characteristics of threatening objects from those of nonthreatening 
objects. MDA expects to test the database for the first time in IFT-14.

Because on-board discrimination is a software-intensive technology, 
its performance under all flight conditions can only be evaluated 
through ground testing, but flight-testing is needed to validate the 
software's operation in a real-world environment. The discrimination 
capability 
that will be tested in IFT-14 is expected to be fielded as part of the 
Block 2004 capability. Therefore, IFT-14 should demonstrate the 
technology's maturity if the test shows that the kill vehicle achieves 
its discrimination objective.[Footnote 14]

Similarly, the guidance, navigation, and control technology will also 
increase to a TRL 7 if the technology achieves its objectives in IFT-
14. The inertial measurement unit, an important component of the 
guidance, navigation, and control subsystem that enables the kill 
vehicle to track its position and velocity, has not yet been tested in 
the severe environments (e.g., vibrations and accelerations) induced by 
the operational booster. This will be first attempted when one of the 
new operational boosters is used in IFT-14. In addition to testing the 
inertial measurement unit, IFT-14 will also test the upgraded divert 
hardware (used to actively steer the kill vehicle to its target) that 
is expected to be part of the Block 2004 configuration.

Booster Technologies:

The integrated booster stack is the part of the GMD interceptor that is 
composed of rocket motors needed to deliver and deploy the kill vehicle 
into a desired intercept trajectory. For all flight tests to date, a 
two-stage surrogate booster called the payload launch vehicle has been 
used.

In July 1998, the GMD prime contractor began developing a new 
three-stage booster for the GMD program, known as the "Boost Vehicle", 
from commercial off-the-shelf components. However, the contractor 
encountered difficulty. By the time the booster was flight tested in 
August 2001, it was already about 18 months behind schedule. The first 
booster flight test met its objectives, but the second booster tested 
drifted off course and had to be destroyed 30 seconds after launch.

Subsequently, MDA altered its strategy for acquiring a new booster for 
the interceptor. Instead of relying on a single contractor, MDA 
authorized the GMD prime contractor to develop a second source for the 
booster by awarding a subcontract to another contractor. If development 
of the boosters proceeds as planned, both boosters will be part of the 
Block 2004 capability. One booster is known as "BV+" and the other as 
"OSC Lite".

The BV+ Booster:

The prime contractor ultimately transferred development of the boost 
vehicle to a subcontractor who is currently developing a variant--known 
as "BV+"--for the GMD element. The program office and GMD 
contractor rated the BV+ at a TRL 7. The prime contractor reasoned 
that the extent of the legacy program and its one successful flight 
test should allow for this rating. However, given the limited testing 
to date, we assessed the BV+ booster currently at a TRL 6; that is, the 
technology has been demonstrated in a restricted flight environment 
using hardware close in form, fit, and function to that which will be 
fielded in 2004. We believe the contractor's assessment is too high at 
this time, because the step from TRL 6 to TRL 7 is significant in terms 
of the fidelity of the demonstration environment. However, the first 
test of a full configuration BV+ booster will occur with IFT-13A, which 
is scheduled for the first quarter of fiscal year 2004. In our opinion, 
the BV+ booster will reach TRL 7 at this time if the booster works as 
planned.

The "OSC Lite" Booster:

The second booster under development is referred to as "OSC Lite". This 
booster, which is essentially the Taurus Lite missile that carries 
satellites into low-earth orbit, will be reconfigured for the GMD 
element. Although the booster was recently tested only under 
restricted flight conditions, GMD's prime contractor believes that the 
legacy development of the Taurus Lite missile is sufficient to prove 
that the OSC Lite has reached TRL 7. However, in our opinion, because 
the test was conducted with hardware configured as it was in the Taurus 
missile, not as it will be configured for GMD's Block 2004, the 
booster's maturity level is comparable to that of the BV+. The first 
flight test of a full configuration OSC Lite booster is scheduled for 
IFT-13B in the first quarter of fiscal year 2004. We believe that if 
the booster performs as intended in this test, it will reach TRL 7.

Battle Management Command, Control, and Communications Technologies:

The battle management component is the integrating and controlling 
component of the GMD element. Prime contractor officials identified and 
assessed the following sub-components as critical technologies:

* GMD fire control software, which analyzes the threat, plans 
engagements, and tasks components of the GMD element to execute a 
mission.

* In-flight interceptor communications system, which enables the GMD 
fire control component to communicate with the exoatmospheric kill 
vehicle while in flight.

The two battle management technologies have been demonstrated to some 
extent in actual integrated flight tests, and both are near their Block 
2004 design. We determined that the GMD fire control software has 
currently achieved a TRL 7 and the in-flight interceptor communications 
system has reached a TRL 6. Prime contractor officials concur with our 
assessment.

The fire control software is nearing expected functionality and prior 
software builds have been demonstrated in GMD flight tests. Only minor 
design changes will be made to address interfacing issues (linking the 
fire control component with other GMD components) before the 
software reaches the operational configuration of Block 2004. As a 
software-intensive technology, the performance of the fire control 
software throughout the entire "flight envelope" can only be evaluated 
through ground testing. Ground testing is well underway at both the 
Joint National Integration Center at Schriever Air Force Base, 
Colorado, and at the prime contractor's integration laboratory in 
Huntsville, Alabama.

The second technology associated with the battle management component 
is the in-flight interceptor communications system. Even though the 
pointing accuracy and communications capability of this technology 
were demonstrated in previous flight tests, the operational hardware to 
be fielded by 2004 is expected to operate at a different uplink 
frequency than the legacy hardware used in these past flight 
tests.[Footnote 15] Accordingly, we assessed the in-flight interceptor 
communications system at a TRL 6. The first integrated flight test to 
include an operational-like build of this technology is IFT-14, and if 
the technology meets its objectives in this flight test, TRL 7 would be 
achieved.

Radar Technologies:

The GMD contractor initially identified the sea-based X-band radar as 
the only radar-related critical technology. Since its initial 
assessment in September 2002, the contractor has now agreed with us 
that the Beale upgraded early warning radar and the Cobra Dane radar 
are also critical technologies of the GMD element. The contractor and 
the GMD program office assessed the Beale and Cobra Dane radars at a 
TRL 5, because the technology, especially mission software, is still 
under development and has not yet been demonstrated in a relevant 
flight environment.[Footnote 16] The contractor assessed the sea-based 
X-band radar at a TRL 6. As discussed below, we agree with their 
assessment of the Beale and Cobra Dane radars but rated the sea-based 
X-band radar as a TRL 5.

The early warning radar at Beale Air Force Base has participated in 
integrated flight tests in a missile-defense role using legacy hardware 
and developmental software. Design and development of operational 
builds of the software are progressing, but such builds have only been 
tested in a simulated environment. Therefore, we assessed the Beale 
radar technology at a TRL 5--an assessment driven by software 
considerations. The conversion of the early warning radar at Beale to 
an upgraded early warning radar, which consists of minor hardware and 
significant software upgrades, is planned for completion sometime 
during the middle of fiscal year 2004. After this time, the Beale radar 
can take part in flight-testing in its upgraded configuration. MDA 
currently plans to demonstrate the upgraded Beale technology in a 
non-intercept flight test, known as a radar certification flight,[Footnote 
17] in the first quarter of fiscal year 2005. The Beale radar will be 
demonstrated at a TRL 7 if the objectives of this flight test 
are achieved.

The Cobra Dane radar is currently being used in a surveillance mode 
to collect data on selected intercontinental ballistic missile test 
launches out of Russia and does not require real-time data processing 
and communications capabilities. To achieve a defensive capability by 
September 2004, the Cobra Dane radar is being upgraded to perform 
both of these tasks. This upgrade, which requires a number of software 
modifications, is designed to enable Cobra Dane to detect and track 
enemy targets much as the Beale upgraded early warning radar does. 
Although the hardware component of the Cobra Dane radar is mature and 
will undergo only minor updating, Cobra Dane's mission software is 
being revised for this application. The revision includes reuse of 
existing software and development of new software so that the Cobra 
Dane radar can be integrated into the GMD architecture.

Upgrades to the Cobra Dane radar are due to be completed at the 
beginning of 2004. After the software is developed and ground tested, 
the radar can reach a TRL 6, but it is uncertain when the radar will 
reach a TRL 7. Because of other funding and scheduling priorities, MDA 
has no plans through fiscal year 2007 for using this radar in 
integrated flight tests; such tests would require air- or sea-launched 
targets that are not currently part of the test program. Unless the 
current test program is modified, the only opportunities for 
demonstrating Cobra Dane in an operational environment would come from 
flight tests of foreign missiles. MDA officials anticipate that such 
opportunities will occur. However, it is not clear that testing Cobra 
Dane in this manner will provide all of the information that a 
dedicated test provides because MDA will not control the configuration 
of the target or the flight environment.

The sea-based X-band radar is being built as part of the Block 2004 
capability and scheduled for completion in 2005. It will be built from 
demonstrated technologies--a sea-based platform and the prototype 
X-band radar currently being used in the GMD test program. Prime 
contractor officials told us that they consider the risk associated 
with the construction and checkout of the radar as primarily a 
programmatic, rather than technical, risk and believe that the sea-
based X-band radar has reached a TRL 6. The contractor also stated that 
the initial operational build of the radar software is developed and 
currently being tested at the contractor's integration laboratory. We 
assessed the sea-based X-band radar as a TRL 5 because the radar has 
not yet been built and because constructing a radar from an existing 
design and placing it on a sea-based platform is a new application of 
existing technology. For example, severe wind and sea conditions may 
affect the radar's functionality--conditions that cannot be replicated 
in a laboratory. As a result, developers cannot be sure that the sea-
based X-band radar will work as intended until it is demonstrated in 
this new environment. However, both we and the contractor agree that 
the maturity level of the sea-based X-band radar will increase to a TRL 
7 if it achieves its test objectives in IFT-18 (scheduled for the 
fourth quarter of fiscal year 2005).

MDA Has Risked Cost Growth Because It Could Not Fully Rely on Data from 
Its System for Monitoring Contractor Performance:

From the program's inception in 1997[Footnote 18] through 2009, MDA 
expects to spend about $21.8 billion to develop the GMD element. About 
$7.8 billion of the estimated cost will be needed between 2002 and 2005 
to develop and field the Block 2004 GMD capability and to develop the 
GMD portion of the test bed.[Footnote 19] However, MDA has incurred a 
greater risk of cost increases because for more than a year MDA was not 
sure that it could rely fully upon data from the prime contractor's 
Earned Value Management (EVM) system,[Footnote 20] which provides 
program managers and others with early warning of problems that could 
cause cost and schedule growth.

GMD Development Costs:

Before the restructuring of the GMD program in 2002, about $6.2 billion 
was spent (between 1997 and 2001) to develop a ground-based defense 
capability. MDA estimates it will need an additional $7.8 billion 
between 2002 and 2005 to, among other tasks, install interceptors at 
Fort Greely, Alaska, and at Vandenberg Air Force Base, California; 
upgrade existing radars and test bed infrastructure; and develop the 
sea-based X-band radar that will be added in the fourth quarter of 
fiscal year 2005. In addition, MDA will invest an additional 
$7.8 billion between fiscal year 2004 and 2009 to continue efforts 
begun under Block 2004, such as enhancing capability and expanding the 
test bed. Table 2, below, provides details on the funding requirements 
by block and by fiscal year, and figure 3 provides examples of specific 
Block 2004 tasks.

Table 2: Estimated Cost to Develop and Field GMD:

[See PDF for image]

Source: Ballistic Missile Defense Budget, Midcourse Defense Segment, 
February 2003.

[End of table]


Figure 3: Tasks GMD Plans to Accomplish for the GMD Block 2004 Project:

[See PDF for image]

[End of figure]

MDA did not include the following costs in its Block 2004 estimate:

* The cost to recruit, hire, and train military personnel to operate 
the initial defensive capability and provide site security at various 
locations, which MDA estimates at an additional $13.4 million (half in 
fiscal year 2003 and half in fiscal year 2004). Additional costs to 
cover these personnel for the life of the program, from 2005 onward, 
were also omitted.

* The cost to maintain equipment and facilities was not included.

* Systems engineering and national team costs--which benefit all 
elements, including GMD, and cannot be divided among the elements--were 
not included in MDA's budget.

MDA's Insight into Potential Cost Growth Was Limited by the Agency's 
Inability to Rely Fully on Data from Earned Value Management System:

Because a significant portion of MDA's Block 2004 GMD cost estimate is 
the cost of work being performed by the element's prime contractor, 
MDA's ability to closely monitor its contractor's performance is 
critical to controlling costs. The tool that MDA, and many DOD 
entities, have chosen for this purpose is the EVM system. This system 
uses contractor reported data to provide program managers and others 
with timely information on a contractor's ability to perform work 
within estimated cost and schedule. It does so by examining variances 
reported in contractor cost performance reports between the actual cost 
and time of performing work tasks and the budgeted or estimated cost 
and time. While this tool can provide insightful information to 
managers, MDA's use of it has been hampered by several factors. 
Principally, although major contract modifications were made in 
February 2002, it took until July 2003 for MDA to complete a review to 
confirm the reliability of data from the EVM system. An earlier review 
of a similar nature revealed significant deficiencies in the 
contractor's formulation and collection of EVM data. Until a new review 
was completed, MDA could not be sure about its ability to rely fully 
upon this data to identify potential problems in time to prevent 
significant cost growth and schedule delays.
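The variance analysis at the heart of EVM can be sketched briefly. The 
function and dollar figures below are a hypothetical illustration of 
the standard earned value measures (planned value, earned value, and 
actual cost), not actual GMD contract data:

```python
# Illustrative earned value management (EVM) variance calculations.
# All dollar figures are hypothetical, not actual GMD contract data.

def evm_variances(bcws, bcwp, acwp):
    """Compute standard EVM variances from the three core measures:
    bcws: budgeted cost of work scheduled (planned value)
    bcwp: budgeted cost of work performed (earned value)
    acwp: actual cost of work performed (actual cost)
    """
    cost_variance = bcwp - acwp       # negative => over cost
    schedule_variance = bcwp - bcws   # negative => behind schedule
    cpi = bcwp / acwp                 # cost performance index
    spi = bcwp / bcws                 # schedule performance index
    return cost_variance, schedule_variance, cpi, spi

# A task with $100M of work scheduled by now, $90M of work actually
# earned, at an actual cost of $110M:
cv, sv, cpi, spi = evm_variances(bcws=100.0, bcwp=90.0, acwp=110.0)
print(cv, sv)   # -20.0 -10.0: over cost and behind schedule
```

Negative variances like these are the "early warning" signals the 
report refers to: they surface months before the overrun appears in 
the final contract cost.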

Baseline Revised over 13-Month Period:

An accurate, valid, and current performance management baseline is 
needed to perform useful analyses using EVM. The baseline identifies 
and defines work tasks, designates and assigns organizational 
responsibility for each task, schedules the work task in accordance 
with established targets, and allocates budget to the scheduled work. 
According to DOD guidance,[Footnote 21] a performance management 
baseline should be in place as early as possible after the contractor 
is authorized to proceed. Although the guidance does not define how 
quickly the contractor should establish a baseline, experts generally 
agree that it should be in place, on average, within 3 months after a 
contract is awarded or modified.

About a year before the Secretary of Defense directed MDA to adopt an 
evolutionary acquisition strategy, the agency awarded a new contract 
for the development of a National Missile Defense system. In February 
2002, MDA modified this contract to redirect the contractor's efforts. 
Instead of developing a missile defense system that met all of the 
requirements of the war fighter, as the initial contract required, the 
modification directed the contractor to develop the first GMD 
increment, or block, which was to be a ballistic missile test bed with 
GMD as its centerpiece.

Following the contract's modification, the contractor established an 
interim baseline in June 2002. This baseline was developed by adding 
budgets for near-term new work to the original baseline. Because the 
cost of the work being added to the baseline had not yet been 
negotiated, the contractor based the budgets on the cost proposed to 
MDA, as directed by DOD guidelines. The contractor implemented the 
baseline almost within the 3-month time frame recommended by experts. 
In the time between the modification and the development of the interim 
baseline, MDA authorized the contractor to begin work and spend a 
specified amount of money, and MDA paid the contractor about $390 
million during this period.

An option that MDA could have used to help validate the interim 
baseline was to have the Defense Contract Management Agency 
(DCMA)[Footnote 22] verify contractor work packages and track the 
movement of funds between the unpriced work account and the baseline. 
However, neither MDA nor DCMA initiated these actions. In its technical 
comments on a draft of this report, DOD pointed out that during the 
negotiation process, MDA reviews prime and subcontractor proposal data 
that include engineering labor hours, material, and cost estimates. DOD 
further noted that these estimates eventually form a basis for the work 
packages that make up the data for the performance management baseline. 
We agree that these costs will eventually be associated with the work 
packages that make up the baseline. However, a joint contractor and MDA 
review of the initial GMD baseline concluded that even though these 
costs were otherwise fair and reasonable, some work packages that the 
contractor developed for the original contract's baseline did not 
correctly reflect the work directed by MDA. An independent review 
of work packages included in the interim baseline would have increased 
the likelihood that the work packages were being properly developed and 
that their budget and schedule were appropriate.

The contractor completed all revisions to the baseline for the prime 
contractor and all five subcontractors by March 2003, 3 months 
after negotiating the cost of the modification and 13 months after 
authorizing the work to begin. The contracting officer explained that 
it took until December 2002 to negotiate the 2002 contract change 
because the additional work was extremely complex, and, as a result, 
the modification needed to be vetted through many subcontractors that 
support the prime.

Baseline Review Completed in July 2003:

The DOD guidance states that an integrated baseline review (IBR) is to 
be conducted within 6 months of award of a new contract or major change 
to an existing contract.[Footnote 23] The review verifies the technical 
content of the baseline. It also ensures that contractor personnel 
understand and have been adequately trained to collect EVM data. The 
review also verifies the accuracy of the related budget and schedules, 
ensures that risks have been properly identified, assesses the 
contractor's ability to properly implement EVM, and determines if the 
work identified by the contractor meets the program's objectives. The 
government's program manager and technical staff carry out this review 
with their contractor counterparts.

Completing an IBR of the new baseline has been of particular importance 
because the July 2001 IBR for the initial contract identified more than 
300 deficiencies in the contractor's formulation and execution of the 
baseline. For example, the contractor had not defined a critical path 
for the overall effort, many tasks did not have sufficient milestones 
that would allow the contractor to objectively measure performance, and 
contractor personnel who were responsible for reporting earned value 
were making mistakes in measuring actual performance against the 
baseline.

MDA began a review in March 2003 of the contractor's new baseline, 
which reflected the contract modification. Completing this IBR took 
until July 2003 because of the complexity of the program and the many 
subcontractors that were involved. Although the review team found fewer 
problems with the contractor's formulation and execution of the new 
baseline, problems were identified. For example, the IBR showed that in 
some cases the baseline did not reflect the new statement of work. 
Also, both the prime contractor and subcontractors improperly allocated 
budget to activities that indirectly affect a work product (known as 
level of effort activities) when they could have associated these 
activities with a discrete end product. Because of the way these 
activities are accounted for, this designation could mask true cost 
variances.

Management Reserve Used to Offset Expected Cost Overruns at 
Contract Completion:

Before the IBR was underway, DCMA recognized another problem with the 
contractor's EVM reports. In its December 2002 cost performance report, 
the contractor reported that it expected no cost overrun at contract 
completion. This implied that the program was not experiencing any 
problems that could result in significant cost or schedule growth. 
However, DCMA stated that October 2002 was the second month in a row 
that the contractor had used management reserve funds to offset a 
significant negative cost variance.[Footnote 24] DCMA emphasized that 
this is not the intended purpose of management reserves. (Management 
reserves are a part of the total project budget intended to be used to 
fund work anticipated but not currently defined.) DCMA officials told 
us that while this is not a prohibited practice, most programs wait 
until their work is almost completed, that is, 80 to 90 percent 
complete, before making a judgment that the management reserve would 
not be needed for additional undefined work and could be applied to 
unfavorable contract cost variances.
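The concern DCMA raised can be seen in a simplified sketch: when 
management reserve is applied against a negative cost variance, the 
estimate at completion (EAC) reported to managers can show no overrun 
even though the underlying work remains over cost. The figures below 
are hypothetical, not GMD contract data, and the EAC formula is the 
simplest textbook form:

```python
# Hypothetical sketch of how applying management reserve can mask a
# cost overrun in the reported estimate at completion (EAC).
# All figures are illustrative only, not GMD contract data.

budget_at_completion = 1000.0     # total budgeted cost ($M)
management_reserve = 50.0         # reserve for anticipated but undefined work
cumulative_cost_variance = -40.0  # work to date is $40M over cost

# Left alone, the variance flows into the projected completion cost:
eac_unmasked = budget_at_completion - cumulative_cost_variance
print(eac_unmasked)  # 1040.0: a projected $40M overrun

# If reserve is used to offset the variance, the report shows no
# overrun, even though the underlying work is still over cost:
offset = min(management_reserve, -cumulative_cost_variance)
reported_variance = cumulative_cost_variance + offset
eac_reported = budget_at_completion - reported_variance
print(reported_variance, eac_reported)  # 0.0 1000.0
```

This is why DCMA flagged the practice: the reported zero variance hides 
a real $40 million overrun, and the reserve is no longer available for 
the undefined future work it was budgeted to cover.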

Conclusions:

Because of the President's direction to begin fielding a ballistic 
missile defense system in 2004, the MDA took a higher risk approach by 
beginning GMD system integration before knowing whether its critical 
technologies were mature. If development and testing progress as 
planned, however, MDA expects to have demonstrated the maturity of 7 of 
the 10 critical GMD technologies before the element is initially 
fielded in September 2004 and 2 others during fiscal year 2005. If 
technologies do not achieve their objectives during testing, MDA may 
have to spend additional funds in an attempt to identify and correct 
problems by September 2004 or accept a less capable system.

Because of other funding and scheduling priorities, MDA does not plan 
to demonstrate through integrated flight tests whether the Cobra Dane 
radar's software can process and communicate data on the location of 
enemy missiles in "real time." Although tests using sea- or air-launched 
targets before September 2004 would provide otherwise unavailable 
information on the software's performance, we recognize those tests 
would be costly and funds have not been allocated for that purpose. We 
also recognize that the most cost efficient means of testing the Cobra 
Dane radar is through launches involving foreign test missiles. 
However, we believe it would be useful for MDA to consider whether the 
increased confidence provided by a planned test event outweighs other 
uses for those funds.

MDA is investing a significant amount of money to achieve an 
operational capability during the first block of GMD's development, and 
the agency expects to continue investing in the element's improvement 
over the next several years. Because MDA is also developing other 
elements and must balance its investment in each, it needs an accurate 
GMD cost estimate. If it is used as intended, the EVM system can be an 
effective means of monitoring one of GMD's largest costs, the cost of 
having a contractor develop the GMD system. It is understandable that 
the dynamic changes in MDA's acquisition strategy led to major contract 
modifications, which made it more difficult for the contractor to 
establish a stable baseline. However, in this environment, it is even 
more important that MDA find ways to ensure the integrity of the 
interim baselines and to quickly determine that revised baselines can 
be fully relied on to identify potential problems before they 
significantly affect the program's cost.

Recommendations for Executive Action:

To increase its confidence that the Ground-based Midcourse Defense 
element fielded in 2004 will operate as intended, we recommend that the 
Secretary of Defense direct the Director, Missile Defense Agency, to 
explore its options for demonstrating the upgraded Cobra Dane radar in 
its new ballistic missile defense role in a real-world environment 
before September 2004.

To improve MDA's oversight of the GMD element and to provide the 
Congress with the best available information for overseeing the 
program, we recommend that the Secretary of Defense direct the 
Director, Missile Defense Agency, to:

* ensure that when a contractor is authorized to begin new work before 
a price is negotiated, DCMA validates the performance measurement 
baseline to the extent possible by (1) tracking the movement of budget 
from the authorized, unpriced work account into the baseline, 
(2) verifying that the work packages accurately reflect the new work 
directed, and (3) reporting the results of this effort to MDA; and:

* strive to initiate and complete an integrated baseline review (IBR) 
of any major contract modifications within 6 months.

Agency Comments and Our Evaluation:

DOD's comments on our draft report are reprinted in appendix II. 
DOD concurred with our first recommendation. DOD stated that MDA is 
exploring its options for demonstrating, prior to 2004, the upgraded 
Cobra Dane radar in a real-world environment. However, DOD noted that 
because it takes considerable time to develop and produce targets and 
to conduct safety and environmental assessments, completing a Cobra 
Dane radar test before September 2004 would be very challenging. DOD 
concluded that "targets of opportunity" (flight tests of foreign 
missiles) and ground testing may provide the best means to demonstrate 
the radar's maturity in the near term.

DOD partially concurred with our second recommendation. In responding 
to the first part of recommendation two, DOD stated that MDA and the 
DCMA will jointly determine the feasibility of tracking the budget for 
authorized, unpriced work into the baseline and will concurrently 
assess work package data while establishing the formal performance 
measurement baseline. DOD also stated that a selected portion of this 
work is already being accomplished by DCMA. We continue to believe in 
the feasibility of our recommendation. DCMA officials told us that they 
could monitor the movement of budget into the baseline and verify the 
work packages associated with the budget. In addition, the guidelines 
state that surveillance may be accomplished through sampling of 
internal and external data. We believe that if DCMA sampled the data as 
it is transferred into the baseline, the implementation of this 
recommendation should not be burdensome.

In responding to the second part of recommendation two, DOD stated that 
MDA will continue to adhere to current DOD policy by starting an IBR of 
any major contract modification within 6 months. MDA correctly pointed 
out that DOD's Interim Defense Acquisition Guidebook only requires a 
review be initiated within 6 months (180 days) after a contract is 
awarded or a major modification is issued. However, DOD's Earned Value 
Management Implementation Guide states that such a review is conducted 
within 6 months. Similar language is found in the applicable clause 
from the GMD contract,[Footnote 25] which states that such reviews 
shall be scheduled as early as practicable and should be conducted 
within 180 calendar days after the incorporation of major 
modifications. While we understand the difficulty of conducting reviews 
within 180 days when the contract is complex and many subcontractors 
are involved, we believe that it is important for the government to 
complete an IBR as soon as possible to ensure accurate measurement of 
progress toward the program's cost, schedule, and performance goals.

DOD also provided technical comments to this report, which we 
considered and implemented as appropriate. In its technical comments, 
for example, DOD expressed particular concern that our draft report 
language asserting MDA's inability to rely on the EVM system was 
unsupported and misleading. DOD also stated that its prime contractor's 
EVM system is reliable. It stated, for example, that MDA has reviewed, 
and continues to review on a monthly basis, the contractor's cost 
performance reports and that the prime contractor's EVM system and 
accounting systems have been fully certified and validated by DCMA. We 
modified our report to better recognize MDA's ability to use and trust 
the EVM system. However, we still believe that MDA would benefit from 
taking additional measures to increase its confidence in the accuracy 
of its interim baselines. Also, when the revised baseline is in place, 
a review of its formulation and execution is necessary before MDA can 
confidently and fully rely on data from the EVM system.

We conducted our review from December 2001 through August 2003 in 
accordance with generally accepted government auditing standards. As 
arranged with your staff, unless you publicly announce its contents 
earlier, we plan no further distribution of this report until 30 days 
from its issue date. At that time, we plan to provide copies of this 
report to interested congressional committees, the Secretary of 
Defense, and the Director, Missile Defense Agency. We will make copies 
available to others upon request. In addition, the report will be 
available at no charge on the GAO Web site at http://www.gao.gov/.

If you or your staff have any questions concerning this report, please 
contact me on (202) 512-4841. Major contributors to this report are 
listed in appendix V.

Sincerely yours,

Robert E. Levin 
Director 
Acquisition and Sourcing Management:

Signed by Robert E. Levin: 

[End of section]

Appendix I: Scope and Methodology:

To determine when MDA plans to demonstrate the maturity of technologies 
critical to the performance of GMD's Block 2004 capability, we reviewed 
their critical technologies using technology readiness levels (TRLs) 
developed by the National Aeronautics and Space Administration and used 
by DOD. We did so by asking contractor officials at the Boeing System 
Engineering and Integration Office in Arlington, Virginia, to identify 
the most critical technologies and to assess the level of maturity of 
each technology using definitions developed by the National Aeronautics 
and Space Administration. We reviewed these assessments along with 
program documents, such as the results of recent flight tests and 
discussed the results with contractor and agency officials in order to 
reach a consensus, where appropriate, on the readiness level for each 
technology and identify the reasons for any disagreements.

In reviewing the agency's current cost estimate to develop the first 
block of the GMD element and its test bed, we reviewed and analyzed 
budget backup documents, cost documents, and selected acquisition 
reports for the GMD program extending over a period of several years. 
We also met with program officials responsible for managing the 
development and fielding of the GMD Block 2004 capability. For example, 
we met with officials from the GMD Joint Program Office in Arlington, 
Virginia, and Huntsville, Alabama; and the Office of the Deputy 
Assistant for Program Integration at the MDA, Arlington, Virginia.

To determine whether there were any significant risks associated with 
the estimate, we met with agency officials responsible for determining 
the cost of the GMD element to find out if there were costs that were 
omitted, but should have been included, in the estimate. We also 
analyzed data from cost performance reports that the GMD contractor 
developed for the MDA. We reviewed data from the GMD element and 
contracting officials and conducted interviews to discuss the data. 
Although we did not independently verify the accuracy of the cost 
performance reports we received from MDA, the data were assessed 
independently by DCMA.

[End of section]

Appendix II: Comments from the Department of Defense:

OFFICE OF THE UNDER SECRETARY OF DEFENSE:

3000 DEFENSE PENTAGON WASHINGTON, DC 20301-3000:

ACQUISITION, TECHNOLOGY AND LOGISTICS:

6 JUL 2003:

Mr. R. E. Levin:

Managing Director, Acquisition and Sourcing Management: 
U.S. General Accounting Office:

441 G Street, NW Washington, D.C. 20548:

Dear Mr. Levin:

This is the Department of Defense (DoD) response to the General 
Accounting Office (GAO) draft report, "MISSILE DEFENSE: Additional 
Knowledge Needed in Developing System for Intercepting Long-Range 
Missiles," dated June 19, 2003 (GAO Code 120109/GAO 03-600). The 
Department appreciates the opportunity to comment on the draft report.

The Department concurs with the first recommendation and partially 
concurs with the second. MDA will continue to adhere to current DoD 
policy by starting an integrated baseline review of any major 
modification within six (6) months after contract definitization. 
Specific comments for each recommendation are enclosed. We are also 
providing recommendations for factual corrections in a separate 
enclosure. My point of contact for this report is Lt Col Christina N. 
Walton, USAF, (703) 697-5385, christina.walton@osd.mil.

Sincerely,

Mark D. Schaeffer: 
Principal Deputy, Defense Systems:

Signed by Mark D. Schaeffer: 

Enclosures: As Stated:

GAO DRAFT REPORT - DATED JUNE 19, 2003 GAO CODE 120109/GAO-03-600:

"MISSILE DEFENSE: ADDITIONAL KNOWLEDGE NEEDED IN DEVELOPING SYSTEM FOR 
INTERCEPTING LONG-RANGE MISSILES":

DEPARTMENT OF DEFENSE COMMENTS TO THE RECOMMENDATIONS:

RECOMMENDATION 1: GAO recommends that "To increase its confidence that 
the Ground-based Midcourse Defense element fielded in 2004 will operate 
as intended, we recommend that MDA explore its options for 
demonstrating the upgraded Cobra Dane radar in its new ballistic 
missile defense role in a real-world environment before September 
2004.":

DOD RESPONSE: Concur. MDA is considering the addition of an integrated 
flight test prior to September 30, 2004, that would prove-out the 
upgrades that are underway to the Cobra Dane radar at Shemya, Alaska. 
However, the lead time for adding radar tests with dedicated targets is 
considerable and depends on the time needed for target development and/
or production, as well as the requisite safety and environmental 
assessments. As a result, accomplishing such testing before September 
2004 would be very challenging. Consequently, targets-of-opportunity 
and ground testing may provide the best avenue for demonstrating the 
radar in the near term.

RECOMMENDATION 2: To improve MDA's oversight of the Ground-based 
Midcourse Defense element and to provide Congress with more reliable 
information for overseeing the program, we recommend that MDA:

* Ensure that, when a contractor is authorized to begin new work before 
a price is negotiated, DCMA validates the performance measurement 
baseline to the extent possible by (1) tracking the movement of budget 
from the authorized, un-priced work account into the baseline, (2) 
verifying that the work packages accurately reflect the new work 
directed, and (3) reporting the results of this effort to MDA.

* Strive to initiate and complete an integrated baseline review of any 
major contract modifications within 6 months.

DOD RESPONSE: Partially Concur. MDA and DCMA will jointly determine the 
feasibility of performing reviews of the transfer of budget for 
authorized, un-priced work into the baseline and assess work package 
data concurrently with the establishment of the formal performance 
measurement baseline. A selected portion of this work is already being 
accomplished by DCMA contractor surveillance activities. During the 
period that the contractor is establishing the Performance Measurement 
Baseline (PMB), there are reviews of contractor estimating and 
accounting processes including earned value management. MDA will 
continue to adhere to current DoD policy by starting an integrated 
baseline review of any major modification within six (6) months after 
contract definitization.

DCMA Boeing Anaheim Comments: DCMA's early activities to validate the 
PMB included participation in all IBR activities (both at the prime 
contractor and at subcontractors), membership in all cost and financial 
Integrated Process Teams, attendance at program management reviews, 
review of the prime contractor's and subcontractors' proposals, 
appointment of a warranted Administrative Contracting Officer, and 
certification of all of the contractor's systems, including the 
Accounting System, Anaheim Site Billing System, Anaheim Site Estimating 
System, and Indirect & ODC System.

[End of section]

Appendix III: Technology Readiness Level Assessment Matrix:

Technology readiness level (TRL): 1. Basic principles observed and 
reported; Description: Lowest level of technology readiness. 
Scientific research begins to be translated into applied research and 
development. Examples might include paper studies of a technology's 
basic properties; Hardware /software: None (paper studies and 
analysis); Demonstration environment: None.

Technology readiness level (TRL): 2. Technology concept and/or 
application formulated; Description: Invention begins. Once basic 
principles are observed, practical applications can be invented. The 
application is speculative, and there is no proof or detailed analysis 
to support the assumption. Examples are still limited to paper 
studies; Hardware /software: None (paper studies and analysis); 
Demonstration environment: None.

Technology readiness level (TRL): 3. Analytical and experimental 
critical function and/or characteristic proof of concept; Description: 
Active research and development is initiated. This includes analytical 
studies and laboratory studies to physically validate analytical 
predictions of separate elements of the technology. Examples include 
components that are not yet integrated or representative; Hardware /
software: Analytical studies and demonstration of nonscale individual 
components (pieces of subsystem); Demonstration environment: Lab.

Technology readiness level (TRL): 4. Component and/or breadboard 
validation in laboratory environment; Description: Basic 
technological components are integrated to establish that the pieces 
will work together. This is relatively "low fidelity" compared to the 
eventual system. Examples include integration of "ad hoc" hardware in a 
laboratory; Hardware /software: Low fidelity breadboard. Integration 
of nonscale components to show pieces will work together. Not fully 
functional or form or fit but representative of technically feasible 
approach suitable for flight articles; Demonstration environment: Lab.

Technology readiness level (TRL): 5. Component and/or breadboard 
validation in relevant environment; Description: Fidelity of 
breadboard technology increases significantly. The basic technological 
components are integrated with reasonably realistic supporting elements 
so that the technology can be tested in a simulated environment. 
Examples include "high fidelity" laboratory integration of components; 
Hardware /software: High fidelity breadboard. Functionally equivalent 
but not necessarily form and/or fit (size, weight, materials, etc). 
Should be approaching appropriate scale. May include integration of 
several components with reasonably realistic support elements/
subsystems to demonstrate functionality; Demonstration environment: 
Lab demonstrating functionality but not form and fit. May include 
flight-demonstrating breadboard in surrogate aircraft. Technology 
ready for detailed design studies.

Technology readiness level (TRL): 6. System/subsystem model or 
prototype demonstration in a relevant environment; Description: 
Representative model or prototype system, which is well beyond the 
breadboard tested for TRL 5, is tested in a relevant environment. 
Represents a major step up in a technology's demonstrated readiness. 
Examples include testing a prototype in a high fidelity laboratory 
environment or in simulated operational environment; Hardware /
software: Prototype. Should be very close to form, fit, and function. 
Probably includes the integration of many new components and realistic 
supporting elements/subsystems if needed to demonstrate full 
functionality of the subsystem; Demonstration environment: High-
fidelity lab demonstration or limited/restricted flight demonstration 
for a relevant environment. Integration of technology is well defined.

Technology readiness level (TRL): 7. System prototype demonstration in 
an operational environment; Description: Prototype near or at planned 
operational system. Represents a major step up from TRL 6, requiring 
the demonstration of an actual system prototype in an operational 
environment, such as in an aircraft, on a vehicle or in space. Examples 
include testing the prototype in a test bed aircraft; Hardware /
software: Prototype. Should be form, fit and function integrated with 
other key supporting elements/subsystems to demonstrate full 
functionality of subsystem; Demonstration environment: Flight 
demonstration in representative operational environment such as flying 
test bed or demonstrator aircraft. Technology is well substantiated 
with test data.

Technology readiness level (TRL): 8. Actual system completed and 
"flight qualified" through test and demonstration; Description: 
Technology has been proven to work in its final form and under expected 
conditions. In almost all cases, this TRL represents the end of true 
system development. Examples include developmental test and evaluation 
of the system in its intended weapon system to determine if it meets 
design specifications; Hardware /software: Flight-qualified hardware; 
Demonstration environment: Developmental test and evaluation in the 
actual system application.

Technology readiness level (TRL): 9. Actual system "flight proven" 
through successful mission operations; Description: Actual 
application of the technology in its final form and under mission 
conditions, such as those encountered in operational test and 
evaluation. In almost all cases, this is the end of the last "bug 
fixing" aspects of true system development. Examples include using the 
system under operational mission conditions; Hardware /software: 
Actual system in final form; Demonstration environment: Operational 
test and evaluation in operational mission conditions.

Source: GAO and its analysis of National Aeronautics and Space 
Administration data.

Note: GAO information based on U.S. General Accounting Office, Missile 
Defense: Knowledge-Based Decision Making Needed to Reduce Risks in 
Developing Airborne Laser, GAO-02-631 (Washington, D.C.: June 2002).

[End of table]
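The TRL matrix above is, in practice, a lookup table used to score a 
technology's maturity. The following is an illustrative sketch (not part 
of the report) of how an assessment might encode it; the descriptions are 
abbreviated, and the TRL 7 threshold reflects GAO's best-practices 
standard that maturity for product development requires a prototype 
proven in a realistic operational environment.

```python
# Abbreviated TRL definitions from the matrix above (illustrative only).
TRL_LEVELS = {
    1: "Basic principles observed and reported",
    2: "Technology concept and/or application formulated",
    3: "Analytical and experimental proof of concept",
    4: "Component and/or breadboard validation in laboratory environment",
    5: "Component and/or breadboard validation in relevant environment",
    6: "System/subsystem prototype demonstration in a relevant environment",
    7: "System prototype demonstration in an operational environment",
    8: "Actual system 'flight qualified' through test and demonstration",
    9: "Actual system 'flight proven' through successful mission operations",
}

def is_mature_for_product_development(trl):
    """GAO best practices treat a prototype proven in an operational
    environment (TRL 7) as the maturity needed to begin product
    development or systems integration."""
    return trl >= 7

print(is_mature_for_product_development(6))  # False
print(is_mature_for_product_development(7))  # True
```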

[End of section]

Appendix IV: Importance of Earned Value Management:

Pulling together essential cost, schedule, and technical information in 
a meaningful, coherent fashion is always a challenge for any program. 
Without this information, management of the program will be fragmented, 
presenting a distorted view of program status. For several decades, DOD 
has compared the value of work performed to the work's actual cost. 
This measurement is referred to as Earned Value Management (EVM). 
Earned value goes beyond the two-dimensional approach of comparing 
budgeted costs to actuals. It attempts to compare the value of work 
accomplished during a given period with the work scheduled for that 
period. By using the value of completed work as a basis for estimating 
the cost and time needed to complete the program, the earned value 
concept should alert program managers to potential problems early in 
the program.
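The variance calculations at the heart of the earned value concept can 
be sketched as follows. This is an illustrative example with invented 
dollar figures, not data or code from the report:

```python
# Sketch of the basic earned value comparison described above: the value
# of work performed versus the work scheduled (schedule variance) and
# versus its actual cost (cost variance). All figures are hypothetical.

def earned_value_variances(planned_value, earned_value, actual_cost):
    """Return (schedule_variance, cost_variance) in dollars.

    planned_value: budgeted cost of work scheduled for the period
    earned_value:  budgeted cost of work actually performed
    actual_cost:   actual cost of the work performed
    """
    schedule_variance = earned_value - planned_value  # negative = behind schedule
    cost_variance = earned_value - actual_cost        # negative = over cost
    return schedule_variance, cost_variance

# A program planned to accomplish $10M of work in a period, accomplished
# $8M worth, and spent $9M doing it:
sv, cv = earned_value_variances(10_000_000, 8_000_000, 9_000_000)
print(sv)  # -2000000: $2M of scheduled work was not accomplished
print(cv)  # -1000000: the completed work cost $1M more than budgeted
```

Both variances being negative is the early warning the concept is 
designed to provide: the program is behind schedule and over cost.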

In 1996, in response to acquisition reform initiatives, DOD 
reemphasized the importance of earned value in program management and 
adopted 32 criteria for evaluating the quality of management systems. 
These 32 criteria are organized into 5 basic categories: organization, 
planning and budgeting, accounting considerations, analysis and 
management reports, and revisions and data maintenance. The 32 criteria 
are listed in table 3. In general terms, the criteria require 
contractors to (1) define the contractual scope of work using a work 
breakdown structure; (2) identify organizational responsibility for the 
work; (3) integrate internal management subsystems; (4) schedule and 
budget authorized work; (5) measure the progress of work based on 
objective indicators; (6) collect the cost of labor and materials 
associated with the work performed; (7) analyze any variances from 
planned cost and schedules; (8) forecast costs at contract completion; 
and (9) control changes.

Table 3: 32 Criteria for Earned Value Management Systems:

Organization:

1. Define the authorized work elements for the program. A work 
breakdown structure, tailored for effective internal management 
control, is commonly used in this process.

2. Identify the program organizational structure, including the major 
subcontractors responsible for accomplishing the authorized work, and 
define the organizational elements in which work will be planned and 
controlled.

3. Provide for the integration of the company's planning, scheduling, 
budgeting, work authorization, and cost accumulation processes with 
each other and, as appropriate, the program work breakdown structure 
and the program organizational structure.

4. Identify the company organization or function responsible for 
controlling overhead (indirect costs).

5. Provide for integration of the program work breakdown structure and 
the program organizational structure in a manner that permits cost and 
schedule performance measurement by elements of either or both 
structures as needed.

Planning and Budgeting:

6. Schedule the authorized work in a manner that describes the sequence 
of work and identifies significant task interdependencies required to 
meet the requirements of the program.

7. Identify physical products, milestones, technical performance goals, 
or other indicators that will be used to measure progress.

8. Establish and maintain a time-phased budget baseline, at the control 
account level, against which program performance can be measured. 
Budget for far-term efforts may be held in higher-level accounts until 
an appropriate time for allocation at the control account level. 
Initial budgets established for performance measurement will be based 
on either internal management goals or the external customer-negotiated 
target cost including estimates for authorized but undefinitized work. 
On government contracts, if an over target baseline is used for 
performance measurement reporting purposes, prior notification must be 
provided to the customer.

9. Establish budgets for authorized work with identification of 
significant cost elements (labor, material, etc.) as needed for 
internal management and for control of subcontractors.

10. To the extent it is practical to identify the authorized work in 
discrete work packages, establish budgets for this work in terms of 
dollars, hours, or other measurable units. Where the entire control 
account is not subdivided into work packages, identify the far term 
effort in larger planning packages for budget and scheduling purposes.

11. Provide that the sum of all work package budgets plus planning 
package budgets within a control account equals the control account 
budget.

12. Identify and control level of effort activity by time-phased 
budgets established for this purpose. Only that effort which is 
unmeasurable or for which measurement is impractical may be classified 
as level of effort.

13. Establish overhead budgets for each significant organizational 
component of the company for expenses that will become indirect costs. 
Reflect in the program budgets, at the appropriate level, the amounts 
in overhead pools that are planned to be allocated to the program as 
indirect costs.

14. Identify management reserves and undistributed budget.

15. Provide that the program target cost goal is reconciled with the 
sum of all internal program budgets and management reserves.

Accounting Considerations:

16. Record direct costs in a manner consistent with the budgets in a 
formal system controlled by the general books of account.

17. When a work breakdown structure is used, summarize direct costs 
from control accounts into the work breakdown structure without 
allocation of a single control account to two or more work breakdown 
structure elements.

18. Summarize direct costs from the control accounts into the 
contractor's organizational elements without allocation of a single 
control account to two or more organizational elements.

19. Record all indirect costs which will be allocated to the contract.

20. Identify unit costs, equivalent units costs, or lot costs when 
needed.

21. For EVMS, the material accounting system will provide for: (1) 
Accurate cost accumulation and assignment of costs to control accounts 
in a manner consistent with the budgets using recognized, acceptable, 
costing techniques. (2) Cost performance measurement at the point in 
time most suitable for the category of material involved, but no 
earlier than the time of progress payments or actual receipt of 
material. (3) Full accountability of all material purchased for the 
program including the residual inventory.

Analysis and Management Reports:

22. At least on a monthly basis, generate the following information at 
the control account and other levels as necessary for management 
control using actual cost data from, or reconcilable with, the 
accounting system: (1) Comparison of the amount of planned budget and 
the amount of budget earned for work accomplished. This comparison 
provides the schedule variance. (2) Comparison of the amount of the 
budget earned and the actual (applied where appropriate) direct costs 
for the same work. This comparison provides the cost variance.

23. Identify, at least monthly, the significant differences between 
both planned and actual schedule performance and planned and actual 
cost performance, and provide the reasons for the variances in the 
detail needed by program management.

24. Identify budgeted and applied (or actual) indirect costs at the 
level and frequency needed by management for effective control, along 
with the reasons for any significant variances.

25. Summarize the data elements and associated variances through the 
program organization and/or work breakdown structure to support 
management needs and any customer reporting specified in the contract.

26. Implement managerial actions taken as the result of earned value 
information.

27. Develop revised estimates of cost at completion based on 
performance to date, commitment values for material, and estimates of 
future conditions. Compare this information with the performance 
measurement baseline to identify variances at completion important to 
company management and any applicable customer reporting requirements 
including statements of funding requirements.

Revisions and Data Maintenance:

28. Incorporate authorized changes in a timely manner, recording the 
effects of such changes in budgets and schedules. In the directed 
effort prior to negotiation of a change, base such revisions on the 
amount estimated and budgeted to the program organizations.

29. Reconcile current budgets to prior budgets in terms of changes to 
the authorized work and internal replanning in the detail needed by 
management for effective control.

30. Control retroactive changes to records pertaining to work performed 
that would change previously reported amounts for actual costs, earned 
value, or budgets. Adjustments should be made only for correction of 
errors, routine accounting adjustments, effects of customer or 
management directed changes, or to improve the baseline integrity and 
accuracy of performance measurement data.

31. Prevent revisions to the program budget except for authorized 
changes.

32. Document changes to the performance measurement baseline.

Source: Interim Defense Acquisition Guidebook, app. 4.

Note: In the Interim Defense Acquisition Guidebook, DOD states that 
these guidelines are reproduced from the American National Standards 
Institute (ANSI)/Electronic Industries Alliance (EIA) EVM System 
Standard (ANSI/EIA-748-98), Chapter 2 (May 19, 1998).

[End of table]

The criteria have become the standard for EVM and have also been 
adopted by major U.S. government agencies, industry, and the governments 
of Canada and Australia. The full application of EVM system criteria is 
appropriate for large cost reimbursable contracts where the government 
bears the cost risk. For such contracts, the management discipline 
described by the criteria is essential. In addition, data from an EVM 
system have been proven to provide objective reports of contract 
status, allowing numerous indices and performance measures to be 
calculated. These can then be used to develop accurate estimates of 
anticipated costs at completion, providing early warning of impending 
schedule delays and cost overruns.

The standard format for tracking earned value is through a Cost 
Performance Report (CPR). The CPR is a monthly compilation of cost, 
schedule and technical data which displays the performance measurement 
baseline, any cost and schedule variances from that baseline, the 
amount of management reserve used to date, the portion of the contract 
that is authorized unpriced work, and the contractor's latest revised 
estimate to complete the program.

As a result, the CPR can be used as an effective management tool 
because it provides the program manager with early warning of potential 
cost and schedule overruns. Using data from the CPR, a program manager 
can assess trends in cost and schedule performance. This information is 
useful because trends tend to continue and can be difficult to reverse. 
Studies have shown that once programs are 15 percent complete, the 
performance indicators are indicative of the final outcome. For 
example, a CPR showing a negative trend for schedule status would 
indicate that the program is behind schedule. By analyzing the CPR, one 
could determine the cause of the schedule problem such as delayed 
flight tests, changes in requirements, or test problems because the CPR 
contains a section that describes the reasons for the negative status. 
A negative schedule condition is a cause for concern, because it can be 
a predictor of later cost problems since additional spending is often 
necessary to resolve problems. For instance, if a program finishes 6 
months later than planned, additional costs will be expended to cover 
the salaries of personnel and their overhead beyond what was originally 
expected. CPR data provides the basis for independent assessments of a 
program's cost and schedule status and can be used to project final 
costs at completion in addition to determining when a program should be 
completed.
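The projection of final cost from CPR data described above can be 
sketched with the common cost-performance-index method. This is an 
illustrative example with invented figures, not the report's data or a 
prescribed DOD formula:

```python
# Hypothetical sketch: project cost at completion from CPR data by
# assuming the cost efficiency achieved to date (CPI) continues.

def cost_performance_index(earned_value, actual_cost):
    """CPI < 1 means each budgeted dollar of work cost more than a dollar."""
    return earned_value / actual_cost

def estimate_at_completion(budget_at_completion, earned_value, actual_cost):
    """Independent estimate at completion: actual cost so far plus the
    remaining budgeted work, deflated by the cost efficiency to date."""
    cpi = cost_performance_index(earned_value, actual_cost)
    remaining_work = budget_at_completion - earned_value
    return actual_cost + remaining_work / cpi

# A contract budgeted at $500M has earned $150M of work at an actual
# cost of $180M (CPI of about 0.83):
eac = estimate_at_completion(500_000_000, 150_000_000, 180_000_000)
print(round(eac))  # 600000000: a projected overrun of roughly $100M
```

A CPI below 1 inflates the cost of all remaining work, which is why a 
negative cost trend early in a program is difficult to recover from.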

Examining a program's management reserve is another way that a program 
can use a CPR to determine potential issues early on. Management 
reserves, which are funds that may be used as needed, provide 
flexibility to cope with problems or unexpected events. EVM experts 
agree that transfers of management reserve should be tracked and 
reported because they are often problem indicators. An alarming 
situation arises if the CPR shows that the management reserve is being 
used at a faster pace than the program is progressing toward 
completion. For example, a problem would be indicated if a program has 
used 80 percent of its management reserve but only completed 40 percent 
of its work. A program's management reserve should contain at least 10 
percent of the cost to complete a program so that funds will always be 
available to cover future unexpected problems that are more likely to 
surface as the program moves into the testing and evaluation phase.
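The management reserve check described above reduces to a simple ratio. 
The following sketch is illustrative only, using the 80 percent/40 
percent example from the text:

```python
# Sketch of the management reserve warning described above: compare the
# share of reserve consumed against the share of work completed.

def reserve_burn_ratio(reserve_used_pct, work_complete_pct):
    """A ratio above 1 means reserve is being consumed faster than the
    program is progressing toward completion."""
    return reserve_used_pct / work_complete_pct

def reserve_warning(reserve_used_pct, work_complete_pct, threshold=1.0):
    """Flag the alarming situation: reserve burn outpacing progress."""
    return reserve_burn_ratio(reserve_used_pct, work_complete_pct) > threshold

# The text's example: 80 percent of reserve spent, 40 percent of work done.
print(reserve_burn_ratio(80, 40))  # 2.0
print(reserve_warning(80, 40))     # True: reserve usage is double the progress
```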

[End of section]

Appendix V: GAO Contact and Staff Acknowledgments:

GAO Contact:

Barbara Haynes (256) 922-7500:

Acknowledgments:

In addition to the individual named above Yvette Banks, Myra Watts 
Butler, Cristina Chaplain, Roger Corrado, Jennifer Echard, Dayna 
Foster, Matt Lea, Karen Richey, and Randy Zounes made key contributions 
to this report.

FOOTNOTES

[1] In January 2002, the Secretary of Defense created the Missile 
Defense Agency and consolidated all ballistic missile defense programs 
under the new agency. Former missile defense acquisition programs are 
now referred to as elements of a single ballistic missile defense 
system.

[2] The intended performance of the Block 2004 capability is described 
in a classified annex to this report.

[3] Technological maturity for starting product development or systems 
integration is achieved when prototype hardware with the desired form, 
fit, and function has been proven in a realistic operational 
environment. See U.S. General Accounting Office, Best Practices: Better 
Management of Technology Development Can Improve Weapon System 
Outcomes, GAO/NSIAD-99-162 (Washington, D.C.: July 1999).

[4] The kill vehicle is the weapon component of the GMD element that 
attempts to detect and destroy threat warheads through "hit-to-kill" 
impacts.

[5] The battle management component is the integrating and controlling 
component of the GMD element. The fire control software plans 
engagements and tasks GMD components to execute a missile defense 
mission.

[6] The EVM system is a management tool widely used by DOD to compare 
the value of contractor's work performed to the work's actual cost. The 
tool measures the contractor's actual progress against its expected 
progress and enables the government and contractor to estimate the 
program's remaining cost.

[7] An interim baseline is often established by the contractor when the 
government has authorized work, but the requirements and terms of the 
work have not yet been negotiated. Until negotiations are completed, 
the contractor develops a baseline using proposed cost that has been 
divided among work packages with associated budgets and schedule.

[8] An operational environment is a real-world environment (e.g., 
flight demonstration) that addresses all of the operational 
requirements and specifications demanded of the final product.

[9] U.S. General Accounting Office, Missile Defense: Knowledge-Based 
Practices Being Adopted, but Risks Remain, GAO-03-441 (Washington, 
D.C.: Apr. 30, 2003). This report presents our analysis of MDA's new 
approach for developing missile defense technology.

[10] U.S. General Accounting Office, Best Practices: Better Management 
of Technology Development Can Improve Weapon System Outcomes, GAO/
NSIAD-99-162 (Washington, D.C.: July 1999).

[11] A breadboard is a collection of integrated components that provide 
a representation of a system/subsystem that can be used to determine 
concept feasibility and to develop technical data. A breadboard is 
typically configured for laboratory use to demonstrate the technical 
principles of immediate interest.

[12] A relevant environment is defined as a testing environment that 
simulates key aspects of the operational environment.

[13] Integrated flight tests of the GMD element are real-world 
demonstrations of system performance during which an interceptor is 
launched to engage and intercept a mock warhead above the atmosphere.

[14] See classified annex for further details.

[15] See classified annex for further details.

[16] The hardware of the Beale and Cobra Dane radars is mature since 
both are currently in operation for other missions, namely, integrated 
tactical warning and technical intelligence, respectively. Adding the 
ballistic missile defense mission to these radars requires primarily 
software-related development and testing.

[17] Ground testing of interim software builds to be mounted on the 
Beale radar is ongoing.

[18] We calculated program cost from 1997 forward because the National 
Missile Defense program was established at that time.

[19] The cost to develop and field the initial GMD capability and the 
ballistic missile defense test bed is funded in MDA's budget within the 
Defense Wide Research, Development, Test and Evaluation appropriation. 
MDA is not requesting any procurement, military construction, or 
military personnel funds for this effort.

[20] The EVM system is a management tool widely used by DOD to compare 
the value of contractor's work performed to the work's actual cost. The 
tool measures the contractor's actual progress against its expected 
progress and enables the government and contractor to estimate the 
program's remaining cost.

[21] Department of Defense, Earned Value Management Implementation 
Guide (Washington, D.C.: Dec. 1996, as revised, p. 10).

[22] DCMA is the agency that DOD has given responsibility for 
validating contractors' Earned Value data.

[23] Earned Value Management Implementation Guide, pp. 34 and 36.

[24] Defense Contract Management Agency, Ground-Based Midcourse Defense 
Monthly Assessment Report Contract No. HQ0006-01-C-0001 for Missile 
Defense Agency (Seal Beach, Calif.: Dec. 2002, p. 10). DCMA reported 
that cost performance reports were giving "… a misleading feeling that 
everything in the program is OK. For the 2nd month in a row, [the prime 
contractor] has covered up a significant Variance-at-Completion (-
$107,800K) … by taking money out of Management Reserve (MR). This is 
not the intended purpose of using MR funds. [The prime contractor] is 
reporting a $0 Variance-At-Completion [VAC] by subtracting $107,800K 
from MR to reduce VAC to $0. Based on prior performance to date, this 
could be an indication of a trend for growth of the EAC [estimate-at-
completion]."

[25] Defense Federal Acquisition Regulation Supplement clause 252.234-
7001, EVM System (March 1998).

GAO's Mission:

The General Accounting Office, the investigative arm of Congress, 
exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO's commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO's Web site (www.gao.gov) contains 
abstracts and full-text files of current reports and testimony and an 
expanding archive of older products. The Web site features a search 
engine to help you locate documents using key words and phrases. You 
can print these documents in their entirety, including charts and other 
graphics.

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as "Today's Reports," on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order 
GAO Products" heading.

Order by Mail or Phone:

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to:

U.S. General Accounting Office
441 G Street NW, Room LM
Washington, D.C. 20548:

To order by Phone:

Voice: (202) 512-6000:

TDD: (202) 512-2537:

Fax: (202) 512-6061:

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact:

Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov

Automated answering system: (800) 424-5454 or (202) 512-7470:

Public Affairs:

Jeff Nelligan, managing director, NelliganJ@gao.gov, (202) 512-4800:

U.S. General Accounting Office, 441 G Street NW, Room 7149, 
Washington, D.C. 20548: