This is the accessible text file for GAO report number GAO-10-629 
entitled 'NextGen Air Transportation System: FAA's Metrics Can Be Used 
to Report on Status of Individual Programs, but Not of Overall NextGen 
Implementation or Outcomes' which was released on July 27, 2010. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as 
part of a longer term project to improve GAO products' accessibility. 
Every attempt has been made to maintain the structural and data 
integrity of the original printed product. Accessibility features, 
such as text descriptions of tables, consecutively numbered footnotes 
placed at the end of the file, and the text of agency comment letters, 
are provided but may not exactly duplicate the presentation or format 
of the printed version. The portable document format (PDF) file is an 
exact electronic replica of the printed version. We welcome your 
feedback. Please E-mail your comments regarding the contents or 
accessibility features of this document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to Congressional Committees: 

United States Government Accountability Office: 
GAO: 

July 2010: 

NextGen Air Transportation System: 

FAA's Metrics Can Be Used to Report on Status of Individual Programs, 
but Not of Overall NextGen Implementation or Outcomes: 

GAO-10-629: 

GAO Highlights: 

Highlights of GAO-10-629, a report to congressional committees. 

Why GAO Did This Study: 

To prepare for forecasted air traffic growth, the Federal Aviation 
Administration (FAA), in partnership with other federal agencies and 
the aviation industry, is planning and implementing the Next 
Generation Air Transportation System (NextGen), a new satellite-based 
air traffic management system that will replace the current radar-
based system and is expected to enhance the safety and capacity of the 
air transport system. 

GAO was asked to review FAA’s metrics for (1) tracking the status of 
NextGen programs and the implementation of NextGen capabilities, the 
reliability of those metrics, and any limitations or gaps and (2) 
measuring the performance and outcomes of NextGen capabilities that 
are implemented and any limitations. GAO analyzed FAA program progress 
reports and associated metrics for monitoring. GAO also reviewed 
agency performance and accountability reports and discussed internal 
performance reporting methods with FAA officials. 

What GAO Found: 

FAA has metrics that allow it to monitor the progress of its programs 
for acquiring software and hardware. These metrics include Earned 
Value Management (EVM) measurements that show how well a program is 
meeting its planned cost and schedule targets for system development. 
Previous GAO reports have identified issues with FAA’s implementation 
of EVM, which continue to affect the accuracy and reliability of some 
of FAA’s program status reports. For example, for one acquisition 
program, FAA implemented EVM metrics only for the contractor’s 
performance and not for the government’s. As a result, the EVM data 
did not pick up delays that occurred after the contractor delivered 
the system, and the EVM system did not provide early warnings of delays 
and potential cost overruns. In addition, GAO’s previous work has 
shown that FAA is not able to report on how slippage in one program’s 
schedule or budget will ultimately affect the implementation of other 
NextGen acquisition programs or operational capabilities whose 
progress depends on the completion of the first program. GAO has made 
recommendations to address these issues, which FAA and the Department 
of Transportation have begun to implement. FAA has also designated 
specific positions within the NextGen Integration and Implementation 
Office--known as solution set coordinators--to monitor and track 
progress toward implementing portfolios of operational improvements in 
the national airspace system. However, the role of the coordinators 
and the process for resolving any disputes across FAA lines of 
business have not been clearly defined or delineated, and it is 
uncertain whether the processes in place in this portfolio management 
structure will strengthen oversight and create a greater likelihood 
that required activities are completed on time. 

FAA has broad goals for NextGen as a whole, such as increasing 
capacity and reducing noise and emissions, but has not yet developed 
specific goals and outcome-based performance metrics to track the 
impact of and benefits realized from the entire NextGen endeavor. The 
agency has multiple efforts underway to develop such metrics: FAA’s 
Air Traffic Organization (ATO), which manages the air traffic control 
system, has started to compile and review a set of metrics for 
measuring outcomes and performance associated with NextGen 
improvements. These metrics are likely to measure such things as the 
extent to which improvements increase throughput at airports, reduce 
emissions, and reduce flight times, but they are in the early stages 
of development. Recently, FAA also committed to developing performance 
metrics with industry, but it has no timeline or action plan for 
completing this effort. Separately, the Joint Planning and Development 
Office (JPDO), which is responsible for the long-term planning for 
NextGen and partnering with other federal agencies, has been working 
to develop a list of potential metrics, which range from fuel consumed 
per distance flown to curb-to-curb travel time. Without specific goals 
and metrics for the performance of NextGen as a whole, together with a 
timeline and action plan for implementation, it is not clear whether 
NextGen technologies, systems, and capabilities will achieve desired 
outcomes and be completed within the planned time frames.  

What GAO Recommends: 

The FAA Administrator should clarify dispute resolution processes 
within FAA’s portfolio management structure, and develop a timeline 
and action plan to agree with stakeholders on a list of specific goals 
and outcome-based performance metrics for NextGen.  DOT agreed to 
consider GAO’s recommendations and provided technical comments that 
GAO incorporated as appropriate. 

View [hyperlink, http://www.gao.gov/products/GAO-10-629] or key 
components. For more information, contact Gerald Dillingham, Ph.D., at 
(202) 512-2834 or dillinghamg@gao.gov. 

[End of section] 

Contents: 

Letter: 

Background: 

FAA Has Metrics to Report on Program Status but Does Not Have Metrics 
to Measure Overall Implementation of NextGen Capabilities: 

Metrics Have Yet to Be Developed to Measure the Performance of NextGen 
Improvements in Relation to Specific NextGen Goals, but Some 
Performance Metrics Are Available for Specific Programs: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments: 

Appendix I: Objectives, Scope, and Methodology: 

Appendix II: Further Information on the ERAM Program: 

Appendix III: GAO Contact and Staff Acknowledgments: 

Tables: 

Table 1: Status of NextGen's Six Transformational Programs: 

Table 2: Description of FAA's NextGen Solution Sets: 

Table 3: Selection of JPDO's Proposed Performance Metrics: 

Table 4: Selection of FAA Proposed Performance Metrics: 

Table 5: Alignment of NextGen Activities with Existing Flight Plan 
Metrics: 

Table 6: ERAM Status for Achieving Key Program Milestones: 

Figures: 

Figure 1: NextGen Governmental Organizational Structure: 

Figure 2: Program Assessment Report for ERAM as of June 2010: 

Figure 3: Program Assessment Report for ADS-B as of June 2010: 

Figure 4: Implementation Status of Prior Selected GAO Recommendations 
to FAA and DOT as of June 2010: 

Figure 5: Current Phase of ERAM Testing at FAA's En Route Centers as 
of May 2010: 

[End of section] 

United States Government Accountability Office: 
Washington, DC 20548: 

July 27, 2010: 

Congressional Committees: 

Air traffic is growing, and with it, congestion and flight delays, 
which can cause significant economic losses. The Federal Aviation 
Administration (FAA) predicts that, by 2025, the number of passengers 
will increase 57 percent--from about 700 million to about 1.1 billion 
per year--and the number of flights from about 80,000 to more than 
95,000 every 24 hours. Today's air transportation system will be 
strained to meet these air traffic demands, especially on some routes 
to and from major cities and hubs, but improvements to the national 
airspace system can mitigate the anticipated increase in flight delays 
and any resulting decrease in economic productivity. Accordingly, FAA 
and other federal agencies have worked in partnership to develop a 
plan for the Next Generation Air Transportation System (NextGen). 
[Footnote 1] NextGen involves every aspect of air transportation, from 
arrival at the airport to departure from the destination airport. 
NextGen requires the acquisition of new integrated systems (software 
and hardware), flight procedures, aircraft performance capabilities, 
and supporting infrastructure to transform the current air 
transportation system into one that uses satellite-based surveillance 
and navigation and network-centric operations.[Footnote 2] These 
acquisition programs and their associated improvements are intended to 
increase the efficiency and capacity of the air transportation system 
while maintaining its safety so that it can accommodate anticipated 
future growth. The initial planning for NextGen, starting with Vision 
100[Footnote 3] in 2003, focused on implementing improvements through 
2025. More recently, FAA has emphasized improvements that can be 
implemented in the near term and midterm, defined as between 2012 and 
2018.[Footnote 4] At the same time, stakeholders and Members of 
Congress have expressed concerns about the pace of FAA's 
implementation of NextGen, citing the schedule delays that plagued 
FAA's previous air traffic control modernization efforts and led GAO 
to place air traffic modernization on its High-Risk List from 1996 
through 2008. Additionally, given increasing demands for a more 
effective, transparent, and accountable federal government, it is 
important that federal agencies establish meaningful goals for 
improving performance, monitor progress in achieving their goals, and 
use information about performance to make decisions that can improve 
results. Metrics are important for demonstrating progress toward 
achieving goals and providing information on which to base 
organizational and management decisions. 

In light of the scale and complexity of NextGen implementation and 
concerns about past modernization efforts and the pace of 
implementation, you asked us to review the metrics and process FAA 
uses to monitor the status of NextGen implementation. To do so, we 
examined (1) FAA's metrics for tracking the status of NextGen 
acquisition programs and the implementation of NextGen capabilities, 
the reliability of these metrics and the data underlying them, and any 
limitations or gaps in FAA's efforts to track the status of NextGen 
implementation; and (2) how FAA currently measures the performance of 
NextGen programs and capabilities, FAA's progress in developing a full 
suite of metrics to measure the outcomes and performance of NextGen 
capabilities once implemented, and any limitations or gaps in FAA's 
approach to developing these metrics. 

To determine what programmatic metrics FAA has available for 
monitoring NextGen programs and programs critical to NextGen 
implementation, we reviewed reports used to justify programs prior to 
investment, progress reports submitted by program managers, and 
reports based on the FAA database that houses program information. We 
also interviewed FAA acquisition and finance officials and selected 
NextGen program managers to understand how program managers develop 
and report their metrics to internal and external stakeholders and to 
gain an understanding of the database that houses this information. To 
learn how FAA plans to monitor and measure progress toward 
implementing NextGen operational capabilities--beyond the status of 
acquisition programs--we reviewed documents that outline FAA's 
solution set organization and management approach, and interviewed 
officials involved in coordinating and managing solution sets. 
[Footnote 5] To determine the reliability of these programmatic 
metrics and to analyze the extent of any gaps or limitations, we 
reviewed past GAO reports on FAA's acquisition process and the 
reliability of the data FAA uses to develop its metrics, as well as 
the implementation status of prior recommendations. We reviewed 
program and process reviews from FAA's acquisition offices to identify 
key areas of FAA's internal oversight focus and key findings reached 
in such reviews about FAA's acquisition procedures and policies. To 
determine FAA's progress in developing metrics for measuring the 
outcomes of NextGen improvements, we first reviewed how FAA currently 
reports on its performance, both internally and externally, and how 
information on the performance of specific NextGen improvements is 
incorporated into those metrics. We reviewed FAA's performance and 
accountability reports and discussed internal performance reporting 
methods with relevant FAA officials. Specifically, we reviewed FAA's 
Flight Plan, Performance and Accountability Report, NextGen 
Implementation Plans published in 2009 and 2010, Enterprise 
Architecture, and reports to the Office of Management and Budget (OMB) 
(known as "Exhibit 300" reports).[Footnote 6] To understand FAA's 
approach and progress toward developing a suite of NextGen metrics, we 
interviewed FAA officials with responsibilities for NextGen planning 
and implementation, particularly officials within the Air Traffic 
Organization (ATO) and the Joint Planning and Development Office 
(JPDO) responsible for modeling NextGen benefits and developing 
NextGen performance metrics. To evaluate metrics that FAA is 
considering, we compared proposed metrics with key attributes of 
successful performance metrics that we identified in past GAO 
work.[Footnote 7] We also interviewed several key stakeholders for 
NextGen, including representatives from airlines, equipment 
manufacturers, federal partner agencies, and the air traffic 
controllers union to get their views on the metrics they deem most 
appropriate to measure the performance of NextGen. 

In this report, we discuss two types of metrics: programmatic and 
performance. Programmatic metrics are used to track the progress of 
programs or capabilities, and include such things as time, cost, and 
schedule. For instance, some programs use earned value management 
(EVM), a technique for showing how well a program is meeting cost and 
schedule milestones.[Footnote 8] In contrast, performance metrics 
measure the impact or results of a program or activity once it is 
implemented relative to desired outcomes or goals, such as reductions 
in delays or fuel consumption and increased throughput at an airport. 
[Footnote 9] Effective performance metrics require baselining, or 
determining the current status of whatever is being measured, so that 
targets can then be set. These metrics will, if developed well, 
measure how well something is progressing toward its intended target. 
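
To illustrate baselining in the simplest terms, the sketch below (in 
Python, with hypothetical delay figures that are not FAA data) 
computes progress from a measured baseline toward a set target: 

  # Minimal sketch of a baselined performance metric; the delay figures
  # are hypothetical and for illustration only, not FAA data.
  def progress_toward_target(baseline: float, target: float, current: float) -> float:
      """Fraction of the distance from baseline to target achieved so far."""
      return (current - baseline) / (target - baseline)

  # Example: average delay baselined at 12.0 minutes with a target of
  # 9.0 minutes; a current value of 10.5 minutes is halfway there.
  print(progress_toward_target(baseline=12.0, target=9.0, current=10.5))  # 0.5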

We conducted this work from June 2009 through July 2010 in accordance 
with generally accepted government auditing standards. Those standards 
require that we plan and perform the work to obtain sufficient, 
appropriate evidence to provide a reasonable basis for our findings 
and conclusions based on our audit objectives. We believe that the 
evidence obtained provides a reasonable basis for our findings and 
conclusions based on our audit objectives. Appendix I contains more 
detailed information on our objectives, scope, and methodology. 

Background: 

FAA is currently tracking over 30 acquisitions related to 
modernization and improvement of the national airspace. While not all 
of these acquisition programs are considered to be "NextGen" programs, 
several are instrumental to NextGen implementation. Examples of 
instrumental programs include the En Route Automation Modernization 
(ERAM); the Airport Surface Detection Equipment-Model X (ASDE-X), 
which increases runway safety and airport efficiency by putting in 
place tools to improve operations, surveillance, and data sharing on 
the airport surface (e.g., runways and taxiways); and the Wide Area 
Augmentation System (WAAS), which provides aircraft more accurate 
position information for more direct flight paths and precision 
approaches to airports. In particular, ERAM, a new system 
architecture, will replace the current En Route computer system and 
its backup and is considered to be the backbone that will support 
NextGen. ERAM is meant to provide all of today's functionality and add 
new capabilities needed to support the transformation to NextGen. 

Besides these three instrumental programs, FAA has identified six 
major acquisition programs that it considers to be transformational 
NextGen programs, as follows: 

* Automatic Dependent Surveillance Broadcast (ADS-B) is a satellite- 
based information broadcasting system that is designed, along with GPS-
based navigation technologies, to enable more precise control of 
aircraft during en route flight, approach, and descent. 

* System-Wide Information Management (SWIM) is the information 
management architecture for the national airspace system, acting as 
its "World Wide Web." SWIM will manage surveillance, weather, and 
flight data, as well as aeronautical and system status information, 
and will provide the information securely to users. 

* NextGen Data Communications (Data Comm) is intended to provide a 
digital communications link for two-way exchanges between controllers 
and flight crews for air traffic control clearances, instructions, 
advisories, flight crew requests, and reports. 

* NextGen Network Enabled Weather (NNEW) is planned to serve as the 
core of the NextGen weather support services and provide a common 
weather picture across the national airspace system. 

* National Airspace Voice Switch (NVS) is to replace existing switches 
and provide the foundation for all air-to-ground and ground-to-ground 
voice communications in the future air traffic control environment. 

* Collaborative Air Traffic Management Technologies (CATMT) 
encompasses the development of systems to distribute and manage 
aeronautical information, manage airspace reservations, and manage 
flight information from preflight to postflight analysis. 

Acquisition programs are overseen by program offices within ATO and 
headed by program managers who are responsible for gathering and 
reporting programmatic data to FAA's acquisition tracking database, 
known as the Simplified Program Information Reporting and Evaluation 
(SPIRE) tool, which FAA uses to track and report the progress of all 
approved acquisitions toward their schedule and cost performance 
targets. Detailed cost, schedule, and 
EVM metrics are developed, and reporting begins after the Joint 
Resources Council (JRC) approves a program for funding.[Footnote 10] 
As table 1 shows, three NextGen transformational programs, ADS-B, 
SWIM, and CATMT, have received final investment approval and are being 
reported with EVM and associated metrics, while the three remaining 
programs, DataComm, NVS, and NextGen Weather, have not yet received 
such approval. For the time being, their progress is being tracked 
against schedule milestones. 

Table 1: Status of NextGen's Six Transformational Programs: 

Program: ADS-B; 
Status of latest investment decision: Final investment; 
Date of decision: June 2006; 
Next major milestone: Under deployment at several sites (at and near 
airports); full deployment of ground-based transceivers expected in FY 
2013. 

Program: SWIM; 
Status of latest investment decision: Final investment; 
Date of decision: June 2009; 
Next major milestone: Deployment of capabilities expected to start in 
FY 2010. 

Program: Collaborative Air Traffic Management Technologies (CATMT); 
Status of latest investment decision: Final investment; 
Date of decision: Sept 2008 (Work package 2) Jan. 2010 (Work package 
3); 
Next major milestone: Integration of weather data in 2011. 

Program: DataComm; 
Status of latest investment decision: Final investment decision 
Segment 1; 
Date of decision: Expected September 2011; 
Next major milestone: Deployment schedule not baselined until final 
investment decision, expected in FY 2016. 

Program: NAS Voice Switch (NVS); 
Status of latest investment decision: Final investment decision 
Segment 1; 
Date of decision: Expected August 2012; 
Next major milestone: Market survey of potential contractors scheduled 
for FY 2010. 

Program: NextGen Weather; 
Status of latest investment decision: Initial and final investment 
decision have yet to be scheduled; 
Date of decision: FY 2012-2015; 
Next major milestone: [TBD]. 

Source: FAA. 

[End of table] 

ATO's NextGen and Operations Planning Office is leading near-term (now 
through 2015) and midterm (2015 through 2018) NextGen planning and 
implementation efforts.[Footnote 11] These efforts are guided by the 
NextGen Implementation Plan, which identifies the NextGen capabilities 
that are to be implemented between 2012 and 2018.[Footnote 12] NextGen 
capabilities are defined in portfolios of related operational 
improvements called solution sets, which together will bring about 
the midterm system. FAA currently is managing seven solution sets, 
described in table 2. 

Table 2: Description of FAA's NextGen Solution Sets: 

Solution set: Initiate Trajectory-Based Operations; 
Description of solution set: This will lead to a shift from current 
"clearance-based" to "trajectory-based" air traffic control and will 
enable aircraft to fly negotiated flight paths that take both 
controller and pilot preferences and optimal airspace system 
performance into consideration. 

Solution set: Increase Arrivals and Departures at High-Density 
Airports; 
Description of solution set: This will improve arrival and departure 
capacity at airports with heavily used airspace. 

Solution set: Increase Flexibility in the Terminal Environment; 
Description of solution set: This will increase access and help manage 
the separation of aircraft in and around airports and allow for 
improved management of aircraft on the airport surface, as well as 
improved access to runways in low visibility. 

Solution set: Improve Collaborative Air Traffic Management; 
Description of solution set: This will support a more flexible air 
traffic system capable of adjustments to routings or altitude to match 
airspace and airport capacity, and accommodate controller and pilot 
preferences to the maximum extent possible. 

Solution set: Reduce Weather Impact; 
Description of solution set: This will support integration of a broad 
range of weather information into air traffic control decision-making. 

Solution set: Improve Safety, Security and Environmental Performance; 
Description of solution set: This will deploy an automated system to 
identify airborne security threats and communicate that information to 
the appropriate agency. 

Solution set: Transform Facilities; 
Description of solution set: This will support planning for future 
NextGen facilities.[A] 

Source: FAA. 

[A] This definition is limited to activities funded for fiscal year 
2010. 

[End of table] 

Each of the solution sets includes or will include numerous 
acquisition programs and a variety of other activities that will be 
carried out across offices within ATO, such as the Office of 
Operations, and several other lines of business across FAA, such as 
the Office of Aviation Safety, the Office of Airports, and others. For 
example, implementing the solution set Increase Flexibility in the 
Terminal Environment requires that ADS-B, DataComm, SWIM, ERAM, NNEW, 
and other programs be implemented; flight procedures be developed by 
the Flight Procedure Standards Branch; safety analyses be conducted by 
ATO; and requirements and standards be developed by the Flight 
Technology and Procedures Division, among numerous other actions. 

FAA has created a new position--solution set coordinator--to 
coordinate and manage the implementation of each solution set across 
the agency. While solution set coordinators manage the day-to-day 
implementation of solution sets, the NextGen Management Board, which 
includes the heads of ATO and the key agency lines of business, 
oversees NextGen implementation efforts within FAA and has the 
authority to force timely resolution of emerging NextGen 
implementation issues.[Footnote 13] The Board's role is to measure the 
progress of deployments and of key activities that support decision-
making; ensure essential resources are available, including 
reprioritizing resources as necessary; issue policies and guidance; 
and identify officials--like program managers--within organizations 
who will be accountable for delivering system changes. 

JPDO is responsible for the long-term planning and development for 
NextGen and, as such, is involved in modeling the costs, benefits, and 
risks associated with alternative scenarios of NextGen implementation 
over the long term. Originally chartered in Vision 100[Footnote 14] to 
plan and coordinate the transition to NextGen, JPDO began to focus on 
planning for NextGen beyond 2018 after organizational changes were 
made in May 2008.[Footnote 15] JPDO has recently undergone a 
leadership change, has been repositioned within FAA's organization, 
and now reports directly to the FAA Deputy Administrator--who is the 
FAA executive in charge of NextGen. JPDO is also responsible for 
ensuring and fostering interagency coordination and collaboration and 
is closely tied to the Senior Policy Committee--the governing body for 
NextGen, chaired by the Secretary of Transportation and made up of 
cabinet-level officials from the partner agencies. Figure 1 shows the 
current governmental organization surrounding NextGen activities. 

Figure 1: NextGen Governmental Organizational Structure: 

[Refer to PDF for image: illustration of Organizational Structure] 

Top level: 
Secretary of Transportation: 
* FAA Administrator; 
* FAA Deputy Administrator; 
* NextGen Management Board. 

Second level: 
* Senior Policy Committee (Secretary of Transportation, chairman): 
reports to Secretary of Transportation; 
* JPDO board: reports to Senior Policy Committee; 
* NextGen liaison to Secretary of Transportation and JPDO Director: 
connected to each of the top level entities and JPDO Board; 
* Joint Planning and Development Office (JPDO): connected to NextGen 
liaison. 

Third level: 
Partner agencies: connected to all second level entities: 
* Department of Commerce (National Oceanic and Atmospheric 
Administration); 
* Department of Defense; 
* Department of Homeland Security; 
* National Aeronautics and Space Administration; 
* White House Office of Science and Technology Policy. 

Fourth level, connected to NextGen Management Board: ATO offices with 
NextGen responsibilities: 
Chief Operating Officer, Air Traffic Organization: 
* Financial Services; 
* NextGen and Operations Planning; 
* Safety; 
* Strategy and Performance; 
* Operations: 
- En Route and Oceanic; 
- System Operations; 
- Terminal; 
- Technical Operations. 

Fourth level, connected to NextGen Management Board: Other lines of 
business with NextGen responsibilities: 
* Airports; 
* Aviation Safety; 
* Financial Services/Chief Financial Officer; 
* Information Services/Chief Information Officer; 
* International; 
* Policy, Planning, and Environment; 
* Regions and Center Operations. 

Sources: FAA and JPDO. 

[End of figure] 

The participation of industry and other stakeholders is critically 
important to the success of NextGen's implementation. Numerous venues 
exist for stakeholders to participate, although our prior work has 
shown that industry stakeholders, the air traffic controller and 
aviation safety specialist unions, and the partner agencies have 
participated unevenly for a variety of reasons.[Footnote 16] The 
NextGen Institute, a JPDO mechanism designed to involve private-sector 
expertise, tools, and facilities in the development and implementation 
of NextGen, has been leaderless since March 2010, when its head 
resigned, and a replacement has yet to be named. 
Last year, FAA requested that RTCA create a NextGen Midterm 
Implementation Task Force to reach consensus within the aviation 
community on the operational improvements that can be implemented by 
2018 and would be most beneficial to users.[Footnote 17] The Task 
Force focused on maximizing benefits in the near term and paid 
particular attention to aligning its recommendations with how aircraft 
operators decide to invest in aircraft equipment. On September 9, 
2009, the Task Force issued its final report, which contained a list 
of recommendations for FAA. FAA agreed with the Task Force's 
recommendations and worked with the Task Force to incorporate and 
address them in its plans. In January 2010, FAA released its response 
to the Task Force, which outlines FAA's specific responses, including 
an action plan detailing when certain tasks will 
be completed. A key recommendation of the Task Force was for FAA to 
work with industry to develop performance metrics to show the progress 
and benefits of NextGen. 

FAA Has Metrics to Report on Program Status but Does Not Have Metrics 
to Measure Overall Implementation of NextGen Capabilities: 

FAA Uses Programmatic Metrics to Provide Updates on Program Status, 
but Additional Information and Context Could Help Observers and 
Overseers Understand Problems: 

FAA's SPIRE tool can organize information provided by program managers 
into various reports that give high-level indications of a program's 
status. For example, figures 2 and 3 show the program assessment 
reports for ERAM and ADS-B, respectively. These reports use a color- 
coded chart to summarize program managers' assessments of performance 
for cost and schedule indicators, with green signifying that the 
program is on target, yellow that there are potential issues with 
meeting targets, and red that there is significant risk that the 
target will not be met. These reports also contain space for the 
program managers to clarify the status of the program and 
implementation concerns and to highlight any other issues deemed 
necessary, such as problems and corrective actions for keeping 
milestones and costs within the established targets. 
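
As an illustration of how such a color-coded summary can be rolled up 
from individual indicators, consider the following sketch; the status 
labels follow the reports' key, but the "worst status wins" rollup 
rule is our assumption for illustration, not FAA's documented SPIRE 
logic: 

  # Hypothetical red/yellow/green rollup in the style of FAA's program
  # assessment reports; the rollup rule is assumed, not FAA's actual logic.
  GREEN, YELLOW, RED = "no issues", "some issues", "serious issues"
  SEVERITY = {GREEN: 0, YELLOW: 1, RED: 2}

  def rollup(indicator_statuses):
      """Summarize a list of indicator statuses: the worst status wins."""
      return max(indicator_statuses, key=SEVERITY.get)

  # Example: schedule indicators like those reported for ERAM in figure 2.
  print(rollup([GREEN, RED, RED]))  # serious issues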

Figure 2: Program Assessment Report for ERAM as of June 2010: 

[Refer to PDF for image: illustration of report] 

Key: 
Metric indicates no issues: Green; 
Metric indicates some issues: Yellow; 
Metric indicates serious issues: Red. 

Program Phase: Deployment; 
June 2010: 

Program assessment: 
Financial: Metric indicates no issues;
Schedule: Metric indicates serious issues;
Technical: Metric indicates some issues;
Resources: Metric indicates no issues;
External interest: Metric indicates some issues;
Program manager: Metric indicates some issues. 

Supporting indicators: 

Financial: Cost Performance Index (CPI); 
Measure of cost performance: 1.02. 

Financial: To Complete Performance Index (TCPI); 
Measure of cost performance: TCPI = 0.85. 

Financial: Cost Variance At Completion (CVAC); 
Measure of cost performance: Metric indicates no issues. 

Financial: Obligation rate; 
Measure of cost performance: [Symbol]. 

Schedule: Schedule Performance Index (SPI); 
Measure of schedule performance: 1.00. 

Schedule: Schedule Variance At Completion (SVAC); 
Measure of schedule performance: Metric indicates serious issues. 

Schedule: APB standard level 1 milestones; 
Measure of schedule performance: Metric indicates serious issues. 

Schedule: Fiscal year schedule goals; 
Measure of schedule performance: Metric indicates serious issues. 

Technical: Requirements stability; 
Measure of technical performance: Metric indicates no issues. 

Technical: System defects; 
Measure of technical performance: Metric indicates some issues. 

Technical: Test results; 
Measure of technical performance: Metric indicates some issues. 

Technical: Deployment; 
Measure of technical performance: Metric indicates some issues. 

Technical: Cumulative mitigated risk impact; 
Measure of technical performance: Metric indicates serious issues. 

Technical: Performance Variance At Completion (PVAC); 
Measure of technical performance: Metric indicates no issues. 

Resources: Funding; 
Measure of appropriate funding: Metric indicates no issues. 

External interest: Under review by DOT, IG, GAO, or other; 
External review activity: Metric indicates some issues. 

Program manager comments: 

The approved 2003 schedule for the ERAM program was to reach full 
operational status at the first operational site, the Salt Lake City 
(ZLC) Air Route Traffic Control Center by the end of December 2009. 
However, continuing resolution of PRs needed for the key site to 
transition to continuous operations continues to delay this milestone. 
These PRs could not have been discovered in the Technical Center 
Laboratory environment and only became evident when the system was 
tested operationally at the key sites. The Salt Lake City key site has 
run on ERAM operationally three separate times for periods of seven 
days from late January through March 2010. During these periods there 
were no delays, separation violations, safety impacts, or catastrophic 
failures of ERAM. However, there were issues noted which required work-
arounds in operational procedures and required additional staffing to 
conduct operations. While this was workable for short durations of up 
to a week, these issues precluded sustained continuous operations due 
to the impact on staffing. 

The Western Service Center has developed a list (the NAS Operations 
List or NOL) containing PRs that are needed in order to sustain 
continuous operations at the ZLC key site. These PRs are being fixed 
and packaged in three incremental software releases. The initial two 
releases have been delivered to the key sites and both Salt Lake City 
and Seattle have run for 4 hours on the midshift on the first software 
release with no new critical issues identified. The key sites are 
expected to conduct operational runs on the 2nd release in early 
August and the last software release is planned to be delivered during 
August as well. Once enough operational experience is gained on these 
releases the schedule to an In-Service Decision (ISD), including 
conduct of independent operational test and evaluation, and the 
resultant waterfall deployment schedule will be determined. The key 
sites are expected to come up on a continuous operational basis on 
ERAM by November. The current projection for an In-Service Decision is 
the 2nd quarter of FY 2011. 

The program office is currently assessing the cost impact to the 
program baseline as a result of the schedule delay in achieving ISD 
and the waterfall deployment at all ARTCCs. 

Source: FAA. 

[End of figure] 

Figure 3: Program Assessment Report for ADS-B as of June 2010: 

[Refer to PDF for image: illustration of report] 

Key: 
Metric indicates no issues: Green; 
Metric indicates some issues: Yellow. 

Program Phase: Production; 
June 2010: 

Program assessment: 
Financial: Metric indicates no issues; 
Schedule: Metric indicates no issues; 
Technical: Metric indicates no issues; 
Resources: Metric indicates no issues;
External interest: Metric indicates some issues;
Program manager: Metric indicates no issues. 

Supporting indicators: 

Financial: Cost Performance Index (CPI); 
Measure of cost performance: 1.04. 

Financial: To Complete Performance Index (TCPI); 
Measure of cost performance: TCPI = 0.97. 

Financial: Cost Variance At Completion (CVAC); 
Measure of cost performance: Metric indicates no issues. 

Financial: Obligation rate; 
Measure of cost performance: [Symbol]. 

Schedule: Schedule Performance Index (SPI); 
Measure of schedule performance: 0.98. 

Schedule: Schedule Variance At Completion (SVAC); 
Measure of schedule performance: Metric indicates no issues. 

Schedule: APB standard level 1 milestones; 
Measure of schedule performance: Metric indicates some issues. 

Schedule: Fiscal year schedule goals; 
Measure of schedule performance: Metric indicates no issues. 

Technical: Requirements stability; 
Measure of technical performance: Metric indicates no issues. 

Technical: System defects; 
Measure of technical performance: N/A. 

Technical: Test results; 
Measure of technical performance: Metric indicates no issues. 

Technical: Deployment; 
Measure of technical performance: Metric indicates no issues. 

Technical: Cumulative mitigated risk impact; 
Measure of technical performance: Metric indicates no issues. 

Technical: Performance Variance At Completion (PVAC); 
Measure of technical performance: Metric indicates no issues. 

Resources: Funding; 
Measure of appropriate funding: Metric indicates no issues. 

External interest: Under review by DOT, IG, GAO, or other; 
External review activity: Metric indicates some issues. 

Program manager comments: 

External Interest is yellow because the Program has on-going OIG and 
GAO audits. 

Juneau IOC was achieved on 04/28/2010. Philadelphia IOC was achieved 
on March 28. Louisville IOC was achieved on November 19, 2009. Gulf of 
Mexico IOC was achieved on December 17, 2009. 

The Final rule was published in the Federal Register on 5/28/10. 

ISD due in September and is on track. ISR checklist is 61% complete. 
Some automation issues are being worked. 

Source: FAA. 

[End of figure] 

As figures 2 and 3 show, FAA data and reports can indicate potential 
problems, and the program manager's comment box can provide additional 
information on their nature. For example, the report on ERAM in figure 
2 shows schedule performance in red--specifically, the program is 
behind in meeting key milestones--and technical performance in yellow, 
including system defects that need to be addressed and could affect 
the cumulative risk and the level 1 milestones associated with the 
program. The program manager comment box provides more detailed 
information on the nature of the defects and problems associated with 
the program--in this case, interruptions in flight data processor 
software. These reports also indicate how well the program is 
performing from an EVM perspective; that is, they show how well the 
program is meeting its planned cost and schedule targets for system 
development. In these reports, the information on the left side of the 
chart indicates performance relative to EVM. "CPI" values greater than 
1 on the chart indicate that the program is performing well, in that 
for every dollar spent, more than a dollar of value is received, 
whereas a "CPI" of less than 1 would indicate that less than a dollar 
of value is received for every dollar spent. Similarly, an "SPI" value 
of 1.00 indicates that work is being accomplished at the planned rate, 
while an "SPI" value of less than 1 indicates that work is behind 
schedule to some degree. 
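
For readers who want the arithmetic behind these indices, the standard 
earned value formulas can be sketched as follows (in Python). The 
underlying quantities are planned value (PV), earned value (EV), 
actual cost (AC), and budget at completion (BAC); the dollar amounts 
below are hypothetical, chosen only to reproduce the index values 
reported for ERAM in figure 2: 

  # Standard EVM index formulas; the dollar amounts (in millions) are
  # hypothetical, chosen to reproduce ERAM's reported indices in figure 2.
  def cpi(ev, ac):
      """Cost Performance Index: value earned per dollar spent."""
      return ev / ac

  def spi(ev, pv):
      """Schedule Performance Index: work done relative to work planned."""
      return ev / pv

  def tcpi(bac, ev, ac):
      """To Complete Performance Index: cost efficiency needed on the
      remaining work to finish within the budget at completion."""
      return (bac - ev) / (bac - ac)

  ev, ac, pv, bac = 102.0, 100.0, 102.0, 113.3
  print(round(cpi(ev, ac), 2))        # 1.02 (more than a dollar of value per dollar spent)
  print(round(spi(ev, pv), 2))        # 1.0 (work proceeding at the planned rate)
  print(round(tcpi(bac, ev, ac), 2))  # 0.85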

While we did not specifically review the validity of EVM metrics for 
these programs as part of this engagement, we have previously reported 
issues with FAA's implementation of EVM. Our review of FAA's reports 
and interviews with program managers and other officials indicate 
that these issues continue to affect the accuracy and reliability of 
some of FAA's program status reports. For example, in 2008, we 
reported that FAA did not apply sound earned value techniques to the 
full ERAM program baseline.[Footnote 18] In particular, FAA had 
rigorous EVM processes to govern contractor deliverables for ERAM, but 
it did not have the same processes in place for government work. As a 
result, FAA could not ensure that the earned value data reported for 
the total program were reliable. The consequences of this risk can be 
seen in figure 2. While the figure shows that milestones for the ERAM 
program as a whole are not being met, the contract-level EVM data do 
not reflect the ongoing schedule issues affecting the program at this 
time. In another recent GAO engagement, we found that because 
EVM is not applied at the full program level for ERAM and only at the 
contractor level, it is unclear in FAA's reporting whether delays will 
affect the program's overall costs.[Footnote 19] The ERAM program 
manager told us that the overall program cost is likely to be in 
excess of what was originally planned due to the ongoing software 
defects and schedule slippages the program is currently experiencing, 
but FAA's program assessment report for ERAM does not indicate this 
issue. If the EVM processes had been implemented appropriately, the 
EVM data could likely have provided an early warning of the problems 
the program is currently experiencing and enabled managers to take 
timely and aggressive action to mitigate them. (For a further 
discussion on ERAM and a more complete description of issues involved 
in its implementation, see appendix II). For other programs like ADS-
B, EVM data will be more accurate because EVM metrics have been 
established for the full program. 

In other previous work, we found that FAA's measurement and reporting 
of its acquisition performance could mask budget increases and 
schedule delays that could negatively affect the transition to 
NextGen.[Footnote 20] Consequently, budget increases and delays in one 
program that could slow the implementation of NextGen capabilities may 
not be apparent to Congress or aviation stakeholders. Our review of 
FAA's reports and interviews with program managers and other officials 
indicate that these problems persist. For example, according to the 
ERAM program manager, ERAM's schedule will affect the implementation 
schedule for ADS-B if ERAM's implementation extends beyond April 2011. 
Generally, the individual program offices 
understand their programs' dependence on ERAM's implementation, but 
FAA has not developed a full listing of how ERAM schedule slippages 
could affect or put other programs' implementation schedules at risk 
or delay the implementation of capabilities or improvements.[Footnote 
21] We recommended that FAA improve the usefulness of ATO's 
acquisition performance reporting by including information in the 
agency's Performance and Accountability Report or elsewhere on the 
potential effect of any budget or schedule slippages on the overall 
transition to NextGen. This recommendation remains open, as FAA has 
not definitively indicated how it will track slippages that will 
affect other dependent NextGen programs. Currently, FAA manages its 
acquisitions using its Acquisition Management System (AMS), which 
establishes policy and guidance for life-cycle acquisition management; 
however, AMS was not designed for managing NextGen programs in an 
integrated way. To assist in managing its NextGen portfolios, FAA is 
employing a solution set management approach, discussed in the next 
section of this report, which is designed to monitor all the 
activities of a particular operational improvement to ensure 
integration is on track. As this approach is more fully implemented, 
it will likely clarify the impact of slippages in one program's 
schedule on the implementation status of other NextGen programs and 
operational capabilities. 

In addition to the issues described above, we have made several 
recommendations to FAA and DOT on acquisition performance measurement 
and reporting systems, and FAA has made many improvements in response, 
as shown in figure 4. 

Figure 4: Implementation Status of Prior Selected GAO Recommendations 
to FAA and DOT as of June 2010: 

[Refer to PDF for image: illustrated table] 

GAO Report: GAO-08-42 (Dec. 2007): 
Recommendation (summarized): Improve the objectivity, reliability, and 
inclusion of core programs in ATO’s acquisition performance measures; 
Agency responsible: FAA; 
Status of recommendation: Closed (implemented). 

GAO Report: GAO-08-42 (Dec. 2007): 
Recommendation (summarized): Improve the clarity of ATO’s annual 
acquisition performance measurement process by disclosing in its 
Performance and Accountability Reports that the measurement for on-
budget performance covers 8 months and is measured against the most 
recently approved budget baselines; 
Agency responsible: FAA; 
Status of recommendation: Closed (implemented). 

GAO Report: GAO-08-42 (Dec. 2007): 
Recommendation (summarized): Include information on any mitigation 
plans ATO has developed to lessen the effects of program slippages on 
the implementation of NextGen systems; 
Agency responsible: FAA; 
Status of recommendation: Open but in process. 

GAO Report: GAO-08-756 (July 2008): 
Recommendation (summarized): Modify acquisition policies governing EVM 
to require the use of a product-oriented standard work breakdown 
structure; 
Agency responsible: FAA; 
Status of recommendation: Closed (implemented). 

GAO Report: GAO-08-756 (July 2008): 
Recommendation (summarized): Modify acquisition policies governing EVM 
to enforce existing EVM training requirements and expand these 
requirements to include senior executives responsible for investment 
oversight and program staff responsible for program oversight; 
Agency responsible: FAA; 
Status of recommendation: Open but in process. 

GAO Report: GAO-08-756 (July 2008): 
Recommendation (summarized): Modify acquisition policies governing EVM 
to define acceptable reasons for rebaselining and when seeking to 
rebaseline a program, require (1) a root cause analysis to determine 
why significant cost and schedule variances occurred, and (2) 
mitigation plans to address the root cause; 
Agency responsible: FAA; 
Status of recommendation: Closed (implemented). 

GAO Report: GAO-08-756 (July 2008): 
Recommendation (summarized): Improve FAA’s oversight processes by 
including an evaluation of contractors’ performance data as part of 
FAA’s program assessment criteria; 
Agency responsible: FAA; 
Status of recommendation: Closed (implemented). 

GAO Report: GAO-08-756 (July 2008): 
Recommendation (summarized): Modify acquisition policies governing EVM 
to enforce existing EVM training requirements and expand these 
requirements to include senior executives responsible for investment 
oversight and program staff responsible for program oversight; 
Agency responsible: FAA; 
Status of recommendation: Closed (implemented). 

GAO Report: GAO-10-2 (Dec. 2009): 
Recommendation (summarized): Modify policies governing EVM to ensure 
that they address the weaknesses that we identified. Direct managers 
of key system acquisition programs to implement the EVM practices; 
Agency responsible: DOT; 
Status of recommendation: Open. 

GAO Report: GAO-10-2 (Dec. 2009): 
Recommendation (summarized): Direct managers of key system acquisition 
programs to take action to reverse current negative performance 
trends, as shown in the earned value data, to mitigate the potential 
cost and schedule overruns; 
Agency responsible: DOT; 
Status of recommendation: Open. 

Source: GAO review of DOT and FAA responses to past recommendations. 

[End of figure] 

NextGen Solution Set Approach Encompasses Program Metric Data and 
Other Initiatives and Processes but Has Yet to Be Fully Developed: 

The NextGen solution set organization and structure hold promise for 
monitoring NextGen implementation, but are still under development, 
and questions about appropriate roles have yet to be resolved. The 
solution set management team will be responsible for monitoring all 
aspects of NextGen implementation by tracking schedule and budget 
data, as well as changes in policies and processes affecting such 
things as certifications, standards, and staffing levels. Within the 
management team, solution set coordinators will be in charge of 
collecting and monitoring the status of all aspects of operational 
improvements and supporting activities within their solution sets. The 
coordinator's area of responsibility is vast and shifting because each 
solution set encompasses numerous capital acquisitions, programs, 
projects, and processes handled by various FAA offices. As of April 
20, 2010, FAA had filled four permanent coordinator positions, one 
position was filled through a temporary assignment, and one position 
was vacant. In addition, the position of solution set manager, who 
oversees all the solution set coordinators, was being filled on a 
temporary basis. Filling key positions with qualified personnel is an 
ongoing challenge for FAA, as we have previously reported.[Footnote 22] 

To support its monitoring of solution set activities, FAA is 
developing two key tools, the portfolio management tool and project-
level agreements. The portfolio management tool is a database for 
tracking and monitoring key milestones and the status of funding that 
has been obligated and committed for individual budget line items. 
According to FAA officials, currently about 50 percent of the programs 
and projects that receive funding are loaded into the portfolio 
management tool; none of these are linked to specific NextGen 
operational improvements. This process has been slowed, in 
part, as FAA continues to staff the office. Once the remaining 
programs and projects are loaded into the portfolio management tool 
and linked to operational improvements, the solution set coordinators 
can monitor and report progress at a portfolio level. According to FAA 
officials, efforts to load the necessary information into the tool are 
ongoing and are expected to be completed by the first quarter of 
fiscal year 2011. Project-level agreements are annual agreements 
between the NextGen Integration and Implementation Office and the 
performing service organization (e.g., the Air Traffic Organization-
Terminal for Flexible Terminals and Airports program, or FAA's Office 
of Environment and Energy for the Advanced Noise/Emissions Reduction 
program) for monitoring and reporting milestones and obligations to 
FAA management, OMB, and other stakeholders. In total, there are 95 
project-level agreements for fiscal years 2009 and 2010--49 for fiscal 
year 2009 and 46 for fiscal year 2010--of which 86 had been signed as 
of July 15, 2010. It is unclear whether the remaining nine agreements 
will be signed by the end of fiscal year 2010, as planned, because, 
according to FAA officials, work was slowed during the first part of 
fiscal year 2010 by issues associated with operating under a 
continuing resolution. 
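
To make the tool's structure concrete, the sketch below models the 
kind of record it might hold: a budget line item with committed and 
obligated funding, linked to milestones and, once loaded, to an 
operational improvement. The field names and layout are our 
illustration, not FAA's actual database schema: 

  # Illustrative model of a portfolio-management-tool record; the field
  # names and structure are assumptions, not FAA's actual schema.
  from dataclasses import dataclass, field
  from typing import List, Optional

  @dataclass
  class Milestone:
      name: str
      due: str               # e.g., "FY 2011 Q1"
      complete: bool = False

  @dataclass
  class BudgetLineItem:
      program: str
      operational_improvement: Optional[str]  # None until linked, as for about half the items
      funding_committed: float
      funding_obligated: float
      milestones: List[Milestone] = field(default_factory=list)

      def obligation_rate(self) -> float:
          """Share of committed funding that has been obligated."""
          return self.funding_obligated / self.funding_committed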

Our analysis of the solution set structure raises questions about 
whether sufficient processes are in place to strengthen oversight and 
create a greater likelihood that actions required of various lines of 
business to produce operational improvements are implemented in a 
timely fashion across the agency. The project-level 
agreement outlines the key responsibilities of the performing 
organization, such as reporting information in the portfolio 
management tool and managing obligations and milestones. If a dispute 
arises or the performing organization does not perform its functions 
in a timely manner, the agreement provides for informal resolution 
between the signatories (i.e., the solution set coordinator and the 
performing organization). According to FAA 
officials, if a dispute could not be resolved, it would be brought to 
the NextGen Management Board, which has authority to resolve any 
issues, such as ensuring that appropriate resources are available or 
issuing policy and guidance to force timely resolution. However, FAA 
has no written policy for resolving this type of dispute beyond what 
is described in the project-level agreement. Given that the solution 
set model is relatively new, there is little experience to draw upon 
to understand the impact of tasks not being completed on time or 
funding not being spent properly. 

Metrics Have Yet to Be Developed to Measure the Performance of NextGen 
Improvements in Relation to Specific NextGen Goals, but Some 
Performance Metrics Are Available for Specific Programs: 

FAA Is Considering a Number of NextGen Performance Metrics, but Little 
Progress Has Been Made: 

FAA has broad goals for NextGen, such as enhancing safety, reducing 
aviation's environmental impact, and increasing operations and 
efficiency, but specific goals for NextGen as a whole have yet to be 
determined and FAA has not agreed on a set of overall performance 
metrics that it can use to measure progress. In order to measure 
outcomes and performance as implementation progresses, the Senior 
Policy Committee--which is the interagency governing body for NextGen--
will need to identify milestones or performance goals for NextGen as a 
whole across federal partner agencies. Relative to the broad goals 
outlined for NextGen, FAA will then need to identify a set of metrics 
and begin collecting baseline performance information against which to 
measure the effects of its NextGen activities. 

GAO has identified criteria for sound performance management for 
federal agencies that may assist FAA as it continues to develop 
specific NextGen performance goals and metrics. According to previous 
GAO work, agencies that have been successful in measuring performance 
had performance measures that demonstrate results, are limited to the 
vital few, cover multiple priorities, and provide useful information 
for decision making.[Footnote 23] Furthermore, GAO work cited specific 
attributes[Footnote 24] that are key to successful performance 
measures: 

* Linkage--Measure is aligned with division and agencywide goals and 
mission and clearly communicated throughout the organization. 

* Clarity--Measure is clearly stated, and the name and definition are 
consistent with the methodology used to calculate it. 

* Measurable target--Measure has a numerical goal. 

* Objectivity--Measure is reasonably free from significant bias or 
manipulation. 

* Reliability--Measure produces the same result under similar 
conditions. 

* Core program activities--Measures cover the activities that an 
entity is expected to perform to support the intent of the program. 

* Limited overlap--Measure should provide new information beyond that 
provided by other measures. 

* Balance--Balance exists when a suite of measures ensures that an 
organization's various priorities are covered. 

* Governmentwide priorities--Each measure should cover a priority such 
as quality, timeliness, and cost of service. 

Having performance metrics with these attributes will help FAA 
management and stakeholders, such as Congress, make decisions about 
how to fund and monitor the progress of NextGen. 

While there are currently no agreed-upon NextGen performance goals or 
metrics available, JPDO and ATO are working to develop them. First, 
JPDO has developed a list of potential performance metrics for 
measuring progress toward the goals of federal partner agencies--not 
just FAA--as shown in table 3. 

Table 3: Selection of JPDO's Proposed Performance Metrics: 

Performance area: Expand capacity; 
Metrics: Throughput and average delay.
Metrics: Difference in delays between good and bad weather.
Metrics: Cancellations and consequent passenger delay time.
Metrics: Curb-to-curb travel time. 

Performance area: Safety; 
Metrics: Percentage of proposed improvements evaluated for safety. 

Performance area: Environment; 
Metrics: Fuel consumed per unit of distance flown. 

Performance area: National defense; 
Metrics: Reduced flight time from flexible use of special use airspace. 

Performance area: Security; 
Metrics: Time passengers spend in airport security. 

Source: JPDO. 

[End of table] 

Second, ATO recently stated in the 2010 NextGen Implementation Plan 
that it would begin to consider what NextGen performance metrics are 
feasible for both FAA and industry. ATO officials told us that FAA is 
forming a team of staff from FAA and MITRE[Footnote 25] to develop 
metrics as part of the agency's response to a recommendation from the 
RTCA Task Force. However, this effort has only recently begun, and no 
timeline or action plan has yet been established. Under the direction 
of the NextGen Management Board, this group will be charged with 
identifying performance metrics in consultation with industry. 
According to an FAA official, one of the group's first tasks will be 
to review an extensive list of several hundred potential metrics that 
FAA has considered in the past and to recommend those metrics that the 
group considers the most appropriate for use. Table 4 shows an initial 
list of FAA's proposed metrics; the list will be revised as the work 
group gets under way. Additionally, FAA has modeling efforts under way 
to estimate the impacts of NextGen technologies on safety, the 
environment, and total delay reduction. Specifically, FAA estimates 
that, in aggregate, planned NextGen technologies--including 
performance-based procedures and planned runway improvements--will 
reduce delays by about 21 percent by 2019, as measured against taking 
no action, and will save 1.4 billion gallons of fuel from air traffic 
operations. These modeling efforts are preliminary and still under 
development, and the estimated benefits are not currently performance 
targets for planned NextGen improvements. 
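
As a simplified illustration of how such an aggregate estimate is 
expressed, the sketch below (in Python) computes a percentage delay 
reduction against a do-nothing baseline; the delay figures are 
hypothetical placeholders, not FAA model outputs. 

# Simplified sketch: percentage delay reduction measured against a
# "do nothing" baseline. All figures are hypothetical placeholders.

baseline_delay_hours_2019 = 1_000_000  # projected delay, no improvements
improved_delay_hours_2019 = 790_000    # projected delay with NextGen

reduction = baseline_delay_hours_2019 - improved_delay_hours_2019
percent_reduction = 100 * reduction / baseline_delay_hours_2019
print(f"Estimated delay reduction: {percent_reduction:.0f} percent")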

Table 4: Selection of FAA Proposed Performance Metrics: 

Performance area: Access and equity; 
Performance indicators: 
Fleet penetration; 
Number of airports with enhanced NextGen capabilities; 
Number of centers with enhanced NextGen capabilities. 

Performance area: Capacity; 
Performance indicators: 
Peak-hour throughput at airports; 
Peak-hour airspace throughput; 
Difference between throughput and demand. 

Performance area: Cost-effectiveness; 
Performance indicators: 
Number of air traffic controllers per operation; 
Number of air traffic controllers per flight hour; 
Air traffic controller cost per operation; 
Air traffic controller cost per flight hour; 
Air traffic controller cost per flight mile; 
Number of air traffic controllers per facility or facility type. 

Performance area: Efficiency; 
Performance indicators: 
Delay; 
Excess fuel consumed; 
Excess distance flown; 
Distance flown at suboptimal altitude; 
Peak-hour average taxi-time. 

Performance area: Environment; 
Performance indicators: 
Aircraft emissions below 3,000 feet; 
Full flight emissions; 
Terminal noise contour area; 
Noise population exposure; 
Temperature change, premature mortality, and noise exposure. 

Performance area: Flexibility; 
Performance indicators: 
Percent of flight trajectory flown at optimal parameters; 
Percent of user requests granted. 

Performance area: Predictability; 
Performance indicators: 
Delay variance; 
Variance of excess fuel consumed; 
Variance of excess distance flown; 
Variance of distance flown at suboptimal altitude; 
Variance of peak-hour average taxi-time; 
Variance of demand; 
Variance of peak-hour throughput. 

Performance area: Safety; 
Performance indicators: 
Number of accidents per operation; 
Number of losses of separation per operation; 
Number of pilot deviations per operation; 
Number of air traffic management induced accidents; 
Runway incursions. 

Source: FAA. 

Note: The metrics in this table were extracted from FAA's draft 
NextGen performance assessment plan. FAA acknowledged that some of 
these metrics could not be quantified empirically and were more 
appropriate for cost-benefit analysis. 

[End of table] 
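
To make the mechanics of a few of the proposed indicators concrete, 
here is a minimal sketch (in Python), assuming hypothetical per-flight 
records; the field layout and values are invented for illustration and 
do not reflect any FAA data source. 

from statistics import mean, pvariance

# Hypothetical per-flight records: (delay in minutes, distance actually
# flown in nautical miles, optimal great-circle distance in nautical
# miles). Values are invented for illustration.
flights = [
    (12, 910, 880),
    (35, 1450, 1390),
    (0, 615, 610),
    (22, 980, 930),
]

delays = [delay for delay, _, _ in flights]
excess_distance = [flown - optimal for _, flown, optimal in flights]

print(f"Average delay: {mean(delays):.1f} minutes")   # efficiency area
print(f"Delay variance: {pvariance(delays):.1f}")     # predictability area
print(f"Average excess distance: {mean(excess_distance):.1f} nm")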

While this list of metrics is preliminary and will be changed as the 
work group proceeds, we identified some areas in the course of our 
work that do not appear on this list. For example, one industry 
stakeholder suggested that average gate-to-gate times for city pairs 
would be a useful metric, as several NextGen improvements are meant to 
shorten both the time spent flying between two cities and the time 
spent taxiing and waiting at the airport. Furthermore, we 
recently recommended that FAA develop airport-specific on-time 
performance targets to better prioritize its actions and demonstrate 
their benefits.[Footnote 26] The Senate FAA reauthorization bill 
[Footnote 27] also proposes metrics that are not included here, such 
as flown versus filed flight times for key city pairs. 

Lastly, developing and agreeing on the right set of goals and metrics 
is difficult because many aspects of performance and actions that will 
influence outcomes are not exclusively under FAA's control. For 
example, to assess its progress in achieving benefits associated with 
implementing performance-based navigation procedures, FAA currently 
measures the number of procedures it develops annually. While FAA may 
be moving away from this approach, stakeholders argue that, by not 
measuring outcomes associated with those procedures, such as improved 
runway utilization and reduced travel times, FAA has not developed 
procedures that have the most significant benefit.[Footnote 28] 
However, achieving the benefits of the new procedures requires actions 
both outside and within FAA. Outside FAA, airlines must train their 
pilots and crews to use the procedures and equip their aircraft for 
flight using the procedures. Similarly, JPDO, as the federal agency 
coordinator for NextGen, shares responsibility for developing and 
agreeing on several proposed metrics whose outcomes will be affected 
by the actions of multiple agencies and stakeholders. For example, 
the outcome of JPDO's proposed metric, "time passengers spend in 
airport security," will be influenced not only by security procedures 
from the Department of Homeland Security, but also by airport 
configurations and airline scheduling patterns, among other things. In 
these cases, developing and agreeing on metrics will require 
collaboration with partner agencies, airports, industry, and a variety 
of other stakeholders, and will also require commitments from other 
parties to take responsibility for various outcomes and aspects of 
performance. 

FAA Reports Some Performance Metrics for Existing Modernization and 
NextGen Programs to OMB, but These Metrics Are Not Always Outcome 
Oriented: 

FAA reports quarterly to OMB on metrics identified in Exhibit 300 
reports for major acquisitions approved through the Joint Resources 
Council, but these metrics are not always outcome based or focused on 
the performance of the system; therefore, they do not always clearly 
indicate progress toward performance goals.[Footnote 29] For example, 
one metric for ERAM is related to achieving the capability to utilize 
64 ground radar sensors as compared with 24 under the current system. 
Performance measures should clearly represent or be related to the 
performance they are designed to assess.[Footnote 30] In this case, 
the metric measures an output from the ERAM system-utilization of 
radar sensors-but does not measure any outcome of having the 
capability to utilize more radar sensors-such as improved adherence 
to aircraft separation standards and a resulting increase in capacity 
or reduction in congestion. Such an outcome would indicate progress 
toward an FAA strategic goal to increase capacity. Other metrics FAA 
reports are focused on outcomes and will show progress toward goals. 
For example, another ERAM metric-to be measured in 2012-is for 10 
percent fewer flight delays to be attributable to ERAM as compared 
with the average annual number of flight delays attributable to its 
predecessor system between 2000 and 2008. This metric will allow FAA 
to measure progress toward FAA's goal to improve on-time arrivals. 
Hence, the metric is clearly related to the performance that it is 
designed to assess and it identifies a baseline from which to measure 
progress. 
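
A minimal sketch (in Python) of how such a baseline-referenced target 
could be checked follows; the annual delay counts are hypothetical 
stand-ins, not FAA data. 

# Hypothetical sketch of the ERAM delay metric: 10 percent fewer delays
# attributable to ERAM than the 2000-2008 average attributable to the
# predecessor system. The counts below are invented placeholders.

predecessor_delays = {year: 5_000 + 100 * (year - 2000)
                      for year in range(2000, 2009)}  # 2000-2008

baseline = sum(predecessor_delays.values()) / len(predecessor_delays)
target = 0.90 * baseline  # 10 percent fewer than the baseline average

eram_delays_2012 = 4_800  # hypothetical observed value
print(f"Baseline: {baseline:.0f} delays per year; target: {target:.0f}")
print(f"Target met: {eram_delays_2012 <= target}")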

Performance metrics for NextGen programs are also identified in 
Exhibit 300[Footnote 31] reports to OMB and include a similar mix of 
outcome and output measures. For example, the report on the ADS-B 
program identifies a number of clear and specific outcome-based 
performance metrics, such as reducing passenger delay hours by 28 
percent in the Gulf of Mexico, or maintaining a 13 percent reduction 
in the accident rate in Alaska. However, for other aspects of the ADS-
B program, metrics are related to outputs with no corresponding link 
to outcomes or goals. For example, one metric is to maintain the 
service that transmits weather information via ADS-B, but there are no 
metrics associated with the outcome or benefit of having such services 
available, the quality of those services, or how the availability of 
those services furthers progress toward any of FAA's stated goals, 
such as reducing the impact of weather on delays. Without links to 
outcomes and goals, metrics will not help to measure progress toward 
those goals and the agency may not emphasize the quality of the 
services it provides or the resulting benefits to users. In this case, 
additional metrics, such as the rate at which aircraft operators 
subscribe to ADS-B services or the rates of satisfaction reported by 
users of the information, would provide FAA and observers with more 
information to indicate the performance of the program and the 
benefits derived from public expenditures. 

Information on NextGen Outcomes and Performance Is Limited in FAA 
Performance Reporting: 

In addition to reporting to OMB, FAA uses its annual Flight Plan and 
Performance and Accountability Report to communicate its performance 
and activities. However, these documents discuss only a few NextGen 
capabilities and programs that are expected to affect existing 
agencywide metrics, and they do not include any performance information 
specific to NextGen capabilities that are currently being 
implemented.[Footnote 32] For example, for one of FAA's metrics- 
decreasing the commercial air carrier fatality rate-the Flight Plan 
reports that the deployment of ADS-B will help drive the commercial 
fatality rate down.[Footnote 33] As discussed in the previous section, 
one of the performance metrics associated with ADS-B is to maintain a 
13 percent reduction in the accident rate in Alaska. However, the performance 
and accountability report does not currently indicate whether or how 
the ongoing deployment of ADS-B has affected the accident rate in 
Alaska. Such information would help stakeholders understand the 
progress of ADS-B on the performance metric. Outcome goals should be 
included in the annual performance plan whenever possible and annual 
performance plans should identify performance goals that cover all of 
the program activities in an agency's budget.[Footnote 34] 

In other cases, NextGen improvements are meant to enhance performance 
in certain areas, but the Flight Plan and Performance and 
Accountability Report do not mention those planned improvements. For 
example, some air traffic modernization and NextGen activities-such as 
implementing performance-based navigation procedures[Footnote 35]-are 
meant to increase aviation fuel efficiency, which is one of the 
performance metrics in FAA's Flight Plan, yet these reports include no 
discussion of the activities under way that are intended to affect this 
metric. Table 5 identifies the current metrics described in the Flight 
Plan for the goals of increasing safety and capacity; indicates 
whether NextGen activities are included in those metrics; and shows 
our analysis of whether NextGen activities are captured by the 
performance reports. 

Table 5: Alignment of NextGen Activities with Existing Flight Plan 
Metrics: 

Flight Plan goal: Increase safety: 

Flight Plan performance metric: Commercial air carrier fatality rate; 
NextGen activities captured (per FAA reports): [Check]; 
Areas NextGen will affect: [Check]. 

Flight Plan performance metric: General aviation fatal accidents; 
NextGen activities captured (per FAA reports): [Check]; 
Areas NextGen will affect: [Check]. 

Flight Plan performance metric: Alaska accidents; 
NextGen activities captured (per FAA reports): [Check]; 
Areas NextGen will affect: [Check]. 

Flight Plan performance metric: Runway incursions; 
NextGen activities captured (per FAA reports): [Empty]; 
Areas NextGen will affect: [Check]. 

Flight Plan performance metric: Commercial space launches; 
NextGen activities captured (per FAA reports): [Empty]; 
Areas NextGen will affect: [Empty]. 

Flight Plan performance metric: Operational errors; 
NextGen activities captured (per FAA reports): [Check]; 
Areas NextGen will affect: [Check]. 

Flight Plan performance metric: Safety management system; 
NextGen activities captured (per FAA reports): [Empty]; 
Areas NextGen will affect: [Check]. 

Flight Plan goal: Increase capacity: 

Flight Plan performance metric: Average daily airport capacity (35 OEP 
airports); 
NextGen activities captured (per FAA reports): [Check]; 
Areas NextGen will affect: [Check]. 

Flight Plan performance metric: Average daily capacity (7 metro areas); 
NextGen activities captured (per FAA reports): [Check]; 
Areas NextGen will affect: [Check]. 

Flight Plan performance metric: Annual service volume; 
NextGen activities captured (per FAA reports): [Empty]; 
Areas NextGen will affect: [Check]. 

Flight Plan performance metric: Adjusted operational availability; 
NextGen activities captured (per FAA reports): [Empty]; 
Areas NextGen will affect: [Check]. 

Flight Plan performance metric: Noise exposure; 
NextGen activities captured (per FAA reports): [Empty]; 
Areas NextGen will affect: [Check]. 

Flight Plan performance metric: Aviation fuel efficiency; 
NextGen activities captured (per FAA reports): [Empty]; 
Areas NextGen will affect: [Check]. 

Flight Plan performance metric: NAS on-time arrivals; 
NextGen activities captured (per FAA reports): [Check]; 
Areas NextGen will affect: [Check]. 

Source: GAO analysis of FAA Flight Plan and NextGen Implementation 
Plan. 

[End of table] 

Further clarification and consistency in reporting the outcomes and 
performance of new technologies and capabilities as they are deployed, 
and how those activities will further affect progress toward 
agencywide goals, would provide users with additional context to 
discern the impact of ongoing air traffic modernization and NextGen 
activities. Recently, FAA has begun an initiative that aims to align 
NextGen activities and performance with FAA's Flight Plan and expects 
to deliver a report by early 2011. 

Conclusions: 

NextGen is an undertaking of significant breadth and complexity and 
touches several federal agencies, nearly every office within FAA, and 
nearly every existing system and piece of infrastructure currently 
operating in the national airspace system. As a result, determining 
the status and performance of the effort as a whole is a broad, 
complex undertaking, requiring multiple reports and pieces of 
information from multiple parties. While we currently have several 
open recommendations related to improving FAA's use of EVM and its 
acquisition management system, FAA's current reporting mechanisms can 
give overseers and interested parties certain information that can 
indicate potential problems with the cost and pace of individual 
programs' implementation. However, these mechanisms are insufficient 
to report on the status of NextGen portfolios or how delays and cost 
overruns in one acquisition can impact implementation of other 
programs or capabilities. 

FAA's portfolio approach to implementation is designed to help the 
agency assess and convey the implementation status of interrelated 
capabilities and operational improvements. However, because 
implementation of solution sets requires action across various lines 
of business with separate budgets within FAA, it is important to 
ensure that processes are in place that will strengthen oversight and 
create a greater likelihood that required activities are completed on 
time. While the NextGen Management Board is ultimately responsible for 
resolving disputes, there are no written policies and procedures to 
guide its resolution of disputes between the parties to project-level 
agreements. A lack of clear dispute resolution procedures raises 
questions about how quickly and effectively any such disputes will be 
resolved. 

Finally, while several of FAA's efforts to develop, agree on, and 
implement a suite of performance metrics are relatively recent or 
still in progress, action is needed to provide stakeholders with a 
clear vision of what each effort is expected to deliver. Without a 
timeline and action plan that stakeholders have agreed on, it remains 
to be seen if these actions will enable FAA to provide stakeholders, 
interested parties, Congress, and the American people with a clear 
picture of where implementation stands at any given time, and whether 
the technologies, capabilities, and operational improvements that are 
being implemented are resulting in positive outcomes and improved 
performance for operators and passengers. 

Recommendations for Executive Action: 

To ensure that FAA can effectively manage NextGen solution sets, we 
recommend that the Secretary of Transportation direct the FAA 
Administrator to develop written policies and procedures for dispute 
resolution across different FAA lines of business and outline the 
appropriate roles of the solution set managers, program managers, and 
the NextGen Management Board in managing these portfolios of 
improvements. 

To ensure that the outcomes and performance expected from NextGen 
improvements are understood and can be monitored, we recommend that 
the Secretary of Transportation direct the FAA Administrator to 
develop a timeline and action plan to work with industry and federal 
partner agencies to develop an agreed-upon list of outcome-based 
performance metrics and goals for NextGen broadly and for specific 
NextGen portfolios, programs, and capabilities. The Administrator 
should then share this list with the appropriate congressional 
oversight committees. Furthermore, the Administrator should establish 
a clear timeline to align NextGen performance metrics with FAA's 
agencywide goals and performance plans. 

Agency Comments: 

The Department of Transportation provided comments on a draft of this 
report via e-mail. In those comments, the department agreed to 
consider the report's recommendations. The department also provided 
technical comments, which we have incorporated in this report as 
appropriate. 

We are sending copies of this report to interested congressional 
committees, the Secretary of Transportation, the Administrator of the 
Federal Aviation Administration, and other parties. In addition, the 
report will be available at no charge on the GAO Web site at 
[hyperlink, http://www.gao.gov]. 

If you or your staff have any questions about this report, please 
contact me at (202) 512-2834 or dillinghamg@gao.gov. Contact points 
for our Offices of Congressional Relations and Public Affairs may be 
found on the last page of this report. GAO staff who made major 
contributions to this report are listed in appendix III. 

Signed by: 

Gerald Dillingham, Ph.D. 
Director, Physical Infrastructure Issues: 

List of Committees: 

The Honorable Jerry Costello: 
Chairman: 
The Honorable Thomas Petri: 
Ranking Member: 
Subcommittee on Aviation: 
Committee on Transportation and Infrastructure: 
House of Representatives: 

The Honorable John D. Rockefeller IV: 
Chairman: 
The Honorable Kay Bailey Hutchison: 
Ranking Member: 
Committee on Commerce, Science, and Transportation: 
United States Senate: 

The Honorable Bart Gordon: 
Chairman: 
The Honorable Ralph Hall: 
Ranking Member: 
Committee on Science and Technology: 
House of Representatives: 

[End of section] 

Appendix I: Objectives, Scope, and Methodology: 

In response to a congressional request, we examined the Federal 
Aviation Administration's (FAA) ability to monitor the implementation 
of the Next Generation Air Transportation System (NextGen) portfolio 
of air traffic control systems and programs and whether they will 
deliver the desired benefit to the national airspace system. 
Specifically, we reviewed (1) FAA's metrics for tracking the status of 
NextGen acquisition programs and the implementation of NextGen 
capabilities, the reliability of these metrics and the data underlying 
them, and any limitations or gaps in FAA's efforts to track the status 
of NextGen implementation; and (2) how FAA currently measures the 
performance of NextGen programs and capabilities, FAA's progress in 
developing a full suite of metrics to measure the outcomes and 
performance of NextGen capabilities once implemented, and any 
limitations or gaps in FAA's approach to developing these metrics. 

To determine FAA's metrics for tracking the status of NextGen 
acquisition programs and acquisition programs critical to NextGen 
implementation (i.e., ERAM), and the implementation of NextGen 
capabilities, we analyzed the process for individual program managers 
to report their metrics to internal and external stakeholders and 
gained an understanding of the database that houses the information. 
We reviewed program and process reviews from FAA's acquisition offices 
to identify key areas of oversight focus and key findings that have 
been reached in such reviews regarding acquisition procedures and 
policies. To assess the reliability of acquisition program information, 
we drew on past work in which we undertook detailed reviews of the 
status of FAA acquisition programs, and we obtained updated information 
as necessary from FAA by reviewing documents and interviewing agency 
acquisition data were sufficiently reliable for the purposes of our 
report. To determine any limitations in FAA's effort to track the 
status of NextGen implementation, we obtained and analyzed recent 
metric reports for various FAA acquisitions to determine what 
information is readily available for FAA management and stakeholders 
outside FAA to monitor NextGen programs. We interviewed FAA officials, 
including acquisition, finance, and program managers. We then reviewed 
this information in light of our past recommendations and findings to 
determine the extent to which a program's implementation status can be 
discerned from the available information, how well that information 
allows reviewers to understand whether any issues may result in delays 
to a program, and whether those delays will affect the implementation 
of other programs or operational improvements. We also interviewed 
ERAM program office officials and representatives of the National Air 
Traffic Controllers Association (NATCA) in Washington, D.C.; Salt Lake 
City; and Seattle to obtain information related to discrepancies in 
program reports associated with ERAM. We did not conduct an individual 
or in-depth review of the effectiveness of the specific programs 
selected for performance reporting. We also did not identify a 
comprehensive list of programs that were excluded from acquisition 
performance reporting. These tasks were beyond the scope and intent of 
this study. 

To gain an understanding of how FAA currently measures the performance 
of NextGen programs and capabilities, we reviewed documents that 
outline FAA's solution set organization and management approach, and 
interviewed officials involved in coordinating and managing solution 
sets. We also reviewed our past reports on FAA's acquisition metrics 
and the status of prior recommendations. To determine FAA's progress 
in developing metrics to measure the outcomes and performance of 
NextGen capabilities once implemented, we first reviewed how FAA 
currently reports on its performance, both internally and externally, 
and how information on the performance of specific NextGen 
capabilities is incorporated into those metrics. We reviewed FAA's 
performance and accountability reports and discussed internal 
performance reporting methods with relevant FAA officials. 
Specifically, we reviewed FAA's Flight Plan, Performance and 
Accountability Report, 2009 and 2010 NextGen Implementation Plans, 
Enterprise Architecture, and reports to the Office of Management and 
Budget (OMB) (known as Exhibit 300 reports).[Footnote 36] To 
understand FAA's approach and progress toward developing a suite of 
NextGen metrics, we interviewed FAA officials with responsibilities 
for NextGen planning and implementation, particularly officials within 
the Air Traffic Organization (ATO) and the Joint Planning and 
Development Office (JPDO) responsible for modeling NextGen benefits 
and developing NextGen performance metrics. To evaluate the metrics 
that FAA is considering for potential gaps and limitations, we 
compared proposed metrics with key attributes of successful 
performance metrics that we identified in past GAO work.[Footnote 37] 
Metrics should cover key program activities and represent program and 
agency goals and priorities to help identify those activities that 
contribute to the goals and priorities. To the greatest extent 
possible, metrics should be objective-that is, reasonably free of bias 
or manipulation that would distort an accurate assessment of 
performance-and clearly defined such that they can be understood by 
stakeholders both internally and externally. When appropriate, metrics 
should be measurable and quantifiable, including having annual 
targets, to facilitate future assessments of whether goals or 
objectives were achieved. We also interviewed several key stakeholders 
for NextGen, including representatives from airlines, equipment 
manufacturers, federal partner agencies, and NATCA to get their views 
on the metrics they deem most appropriate to measure the performance 
of NextGen. 

We conducted this performance audit from June 2009 through July 2010 
in accordance with generally accepted government auditing standards. 
Those standards require that we plan and perform the work to obtain 
sufficient, appropriate evidence to provide a reasonable basis for our 
findings and conclusions based on our audit objectives. We believe 
that the evidence obtained provides a reasonable basis for our 
findings and conclusions based on our audit objectives. 

[End of section] 

Appendix II: Further Information on the ERAM Program: 

The En Route Automation Modernization system (ERAM) replaces the 
existing en route air traffic control automation system at FAA's Air 
Route Traffic Control Centers (ARTCC). ERAM will replace the hardware 
and software in the en route Host Computer System and its backup 
system, the Direct Access Radar Channel, as well as associated 
interfaces, communications, and support infrastructure at 20 en route 
centers across the country. This effort is critical because ERAM is 
expected to upgrade hardware and software for facilities that control 
high-altitude air traffic. ERAM will modernize the en route 
infrastructure to provide a supportable, open, standards-based system 
that will be the basis for future capabilities and enhancements. ERAM 
will provide existing functionality and the new capabilities needed to 
support NextGen. 

New Information Indicates That Software Issues Will Delay ERAM's 
Deployment: 

FAA's ability to keep ERAM on schedule remains uncertain because 
problems with software at key test sites could delay work at other 
sites and on other systems. Recently, FAA halted testing and is 
revising its implementation schedule to reflect the effects of these 
software problems. Two key sites, the Salt Lake and Seattle en route 
centers, achieved initial operating capability (IOC)[Footnote 38] in 
June 2009 and September 2009, respectively, but failed to reach an 
operational readiness decision (ORD)[Footnote 39] by December 2009, as 
called for in FAA's master schedule. As these sites conducted ERAM 
night when air traffic volume was low, FAA and controllers found both 
critical and non-critical software issues that prompted the center, at 
times, to revert to the Host system. Specifically, instructions to a 
controller to hand off control of an aircraft in one sector to a 
controller in an adjacent sector failed, and flight data were lost or 
reassigned to another flight. Although some progress has been made to 
correct these problems, some of these issues remain. FAA continued 
working with its contractor, Lockheed Martin, to correct many software 
issues, but further testing on live air traffic continues to produce 
critical safety errors. As a result, in March 2010, FAA decided, with 
the support of the air traffic controllers' union, to halt all ERAM 
testing on live traffic and to revise the deployment schedule. Such 
revisions will affect numerous sites across the country (see table 6 
for the original schedule). 

Table 6: ERAM Status for Achieving Key Program Milestones: 

Site: Salt Lake; 
Government acceptance (GA): 5/3/2008; 
Initial operational capability (IOC): 6/18/2009; 
Operation readiness decision (ORD): 10/30/2009. 

Site: Seattle; 
Government acceptance (GA): 5/8/2008; 
Initial operational capability (IOC): 9/21/2009; 
Operation readiness decision (ORD): 10/30/2009. 

Site: Minneapolis; 
Government acceptance (GA): 7/24/2008; 
Initial operational capability (IOC): 11/6/2009; 
Operation readiness decision (ORD): 12/6/2009. 

Site: Denver; 
Government acceptance (GA): 5/20/2008; 
Initial operational capability (IOC): 10/25/2009; 
Operation readiness decision (ORD): 11/30/2009. 

Site: Albuquerque; 
Government acceptance (GA): 6/19/2008; 
Initial operational capability (IOC): 2/11/2010; 
Operation readiness decision (ORD): 3/13/2010. 

Site: Dallas-Ft. Worth; 
Government acceptance (GA): 2/17/2009; 
Initial operational capability (IOC): 2/14/2010; 
Operation readiness decision (ORD): 3/16/2010. 

Site: Chicago; 
Government acceptance (GA): 4/15/2009; 
Initial operational capability (IOC): 2/27/2010; 
Operation readiness decision (ORD): 3/29/2010. 

Site: Houston; 
Government acceptance (GA): 4/23/2009; 
Initial operational capability (IOC): 3/3/2010; 
Operation readiness decision (ORD): 4/2/2010. 

Site: Oakland; 
Government acceptance (GA): 1/8/2009; 
Initial operational capability (IOC): 3/13/2010; 
Operation readiness decision (ORD): 4/12/2010. 

Site: Cleveland; 
Government acceptance (GA): 11/6/2008; 
Initial operational capability (IOC): 3/20/2010; 
Operation readiness decision (ORD): 4/19/2010. 

Site: Kansas City; 
Government acceptance (GA): 6/19/2008; 
Initial operational capability (IOC): 4/9/2010; 
Operation readiness decision (ORD): 5/9/2010. 

Site: Indianapolis; 
Government acceptance (GA): 10/28/2008; 
Initial operational capability (IOC): 4/25/2010; 
Operation readiness decision (ORD): 5/25/2010. 

Site: New York; 
Government acceptance (GA): 5/18/2009; 
Initial operational capability (IOC): 5/7/2010; 
Operation readiness decision (ORD): 6/6/2010. 

Site: Memphis; 
Government acceptance (GA): 9/18/2008; 
Initial operational capability (IOC): 5/21/2010; 
Operation readiness decision (ORD): 6/20/2010. 

Site: Los Angeles; 
Government acceptance (GA): 2/19/2009; 
Initial operational capability (IOC): 6/4/2010; 
Operation readiness decision (ORD): 7/9/2010. 

Site: Washington, D.C.; 
Government acceptance (GA): 1/28/2009; 
Initial operational capability (IOC): 6/19/2010; 
Operation readiness decision (ORD): 7/19/2010. 

Site: Boston; 
Government acceptance (GA): 3/19/2009; 
Initial operational capability (IOC): 7/9/2010; 
Operation readiness decision (ORD): 8/8/2010. 

Site: Atlanta; 
Government acceptance (GA): 8/10/2009; 
Initial operational capability (IOC): 7/31/2010; 
Operation readiness decision (ORD): 8/30/2010. 

Site: Jacksonville; 
Government acceptance (GA): 10/23/2008; 
Initial operational capability (IOC): 8/18/2010; 
Operation readiness decision (ORD): 9/17/2010. 

Site: Miami; 
Government acceptance (GA): 5/20/2009; 
Initial operational capability (IOC): 9/24/2010; 
Operation readiness decision (ORD): 10/24/2010. 

Source: FAA. 

[End of table] 

Testing at Key Sites Yielded Software Issues: 

In testing and evaluating ERAM at two key sites, the Seattle and Salt 
Lake en route centers, FAA has encountered both anticipated and 
unanticipated software issues.[Footnote 40] Before this testing began 
last year at these sites, FAA formally accepted the system from the 
contractor--a contractual milestone known as government acceptance 
(GA)--which indicates, in this case, that the equipment performed to 
specification at all 20 sites.[Footnote 41] After GA, FAA designated 
the Salt Lake and Seattle sites as key sites for initial live testing 
on traffic in order to reach the next milestone, IOC, which FAA 
defines as the separation of two aircraft by the ERAM system for as 
little as 1 minute. (Figure 5 shows the en route centers' progress in 
reaching GA and IOC.) However, during the testing process, controllers 
discovered a variety of software issues when ERAM was turned on. When 
FAA encountered a software issue that required a change, it assigned a 
code to the issue to indicate its severity and tracked the 
contractor's efforts to correct the issue in a database. During the 
life of the program, approximately 15,000 software issues have been 
identified. According to FAA officials, problems with the software are 
expected and common during the testing phase, especially for a program 
as large as ERAM (1.3 million lines of code), and time to correct them 
is built into the schedule. 
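
The severity-coding and tracking approach described above can be 
pictured with a minimal sketch (in Python); the record layout, severity 
codes, and entries are hypothetical and do not reflect FAA's actual 
database. 

from dataclasses import dataclass

# Hypothetical sketch of severity-coded issue tracking. The schema,
# severity codes, and records are invented, not FAA's actual database.

@dataclass
class SoftwareIssue:
    issue_id: int
    severity: str      # hypothetical codes: "critical" or "non-critical"
    description: str
    resolved: bool = False

issues = [
    SoftwareIssue(1, "critical", "handoff to adjacent sector fails"),
    SoftwareIssue(2, "critical", "flight data reassigned to another flight"),
    SoftwareIssue(3, "non-critical", "display formatting error",
                  resolved=True),
]

open_critical = [i for i in issues
                 if i.severity == "critical" and not i.resolved]
print(f"Open critical issues blocking live testing: {len(open_critical)}")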

Figure 5: Current Phase of ERAM Testing at FAA's En Route Centers as 
of May 2010: 

[Refer to PDF for image: illustrated U.S. map] 

Initial operating capability (3 Air Route Traffic Control Centers): 
Minneapolis; 
Salt Lake; 
Seattle. 

Government acceptance (20 Air Route Traffic Control Centers): 
Albuquerque; 
Atlanta; 
Boston; 
Chicago; 
Cleveland; 
Denver; 
Fort Worth; 
Houston; 
Indianapolis; 
Jacksonville; 
Kansas City; 
Los Angeles; 
Memphis; 
Miami; 
Minneapolis; 
New York; 
Oakland; 
Salt Lake; 
Seattle; 
Washington. 

Sources: FAA and Map Resources (map). 

[End of figure] 

While some testing at FAA's Technical Center preceded testing at the 
two key sites, the Tech Center could not fully test the system because 
each of the 20 en route centers has unique airspace or operational 
issues that controllers have resolved over the years. Specifically, 
the Tech Center ERAM test room has only 12 ERAM positions, compared 
with the 40 or more controller stations in an en route center. As a 
result, the Tech Center could test only limited scenarios. According to 
FAA officials, the testing accomplished what was expected, but the 
Tech Center environment was not robust enough to capture all issues. 
The more extensive testing that has since been conducted in the field, 
with multiple operational facilities and systems, has identified many 
issues-both expected and, as field personnel have become involved, 
unexpected-and these unexpected issues have slowed the scheduled work. 
NATCA officials stated that the original ERAM schedule was aggressive, 
did not account for such issues, and is thus the principal factor 
driving the delay. 

FAA Halted ERAM Testing in Response to Mounting Software Issues It 
Must Address at the Key Sites: 

FAA has halted ERAM testing because so many software issues have to be 
addressed at the key sites. FAA anticipated the potential for software 
issues and initially scheduled approximately 6 to 9 months between IOC 
and ORD to fix critical software issues. From the beginning, FAA has 
tracked more than 15,000 issues; currently, however, there are 
approximately 1,400 unresolved software issues, ranging from not 
tracking targets correctly to unwarranted safety alerts for altitude 
risk. Of the 1,400 issues, about 200 are critical and will need to be 
resolved before testing on live traffic can resume, according to FAA 
and NATCA. 

The union was not initially involved in ERAM development but its 
recent inclusion may resolve some issues. NATCA officials stated that 
many of the problems arising with ERAM could have been corrected 
earlier if NATCA had been involved. ERAM was designed during a period when 
controllers did not participate in efforts to design and test new 
systems. Because active users of the system from different locations 
could not provide insight early on, issues that could have been 
addressed early in the design phase were not addressed. To ensure that 
controllers will be involved as efforts go forward, unlike during the 
design of ERAM requirements, FAA and NATCA recently entered into a 
memorandum of understanding (MOU)[Footnote 42] that is designed to 
bring in controllers for testing and evaluation of ERAM to alleviate 
some of the same types of problems that arose earlier because they 
were not involved. Under this agreement, NATCA will have ERAM 
technical, evaluation, and training representatives, as well as a team 
of 16 controllers (including 12 from en route and 4 from terminal 
facilities) who will be detailed to test and validate software fixes 
with contractor engineers at the FAA Tech Center. 

FAA Indicates Revised Schedule May Not Delay Other Programs, but 
Software Changes Will Slightly Increase Costs: 

FAA acknowledges that ERAM is unlikely to be operational at all 20 
sites as originally planned due to the unexpectedly large number of 
software issues. According to FAA, which has not released a revised schedule, 
it is working to fix the 200 or so critical software issues identified 
before ERAM testing on live traffic will resume at the key sites. FAA 
expects this testing to resume by early fall 2010 with the remaining 
sites reaching IOC soon thereafter. 

ERAM is a key platform for NextGen programs and keeping it on schedule 
is critical to maintaining the schedules for many NextGen programs, 
most notably ADS-B. FAA officials stated that the revised schedule for 
ERAM is not likely to delay the deployment of ADS-B, which is the 
first scheduled NextGen program to come online over the next couple of 
years. However, if additional delays occur and push the completion of 
testing beyond April 2011, ADS-B's deployment may be delayed. 
Specifically, Houston is the first en route center slated to receive 
ERAM version 3 in April 2011, which will support ADS-B demonstrations 
in the Gulf of Mexico, and reaching IOC on schedule is a critical step 
to ensure that ADS-B's deployment schedule is not delayed. ERAM 
officials stated that they are coordinating with the ADS-B office and 
that the office has been notified of the potential schedule slippage. 

Because FAA has assumed ownership of the system from the contractor, 
it is responsible for additional costs associated with any software 
changes. Because of the large number of unanticipated software issues, 
FAA will have to pay the contractor more than the $2.1 billion 
originally budgeted to complete the program. FAA is working to revise 
its test schedule, to address the backlog of outstanding software 
issues, and to provide a revised cost and schedule estimate. 

[End of section] 

Appendix III: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Gerald L. Dillingham, Ph.D. (202) 512-2834 or dillinghamg@gao.gov: 

Staff Acknowledgments: 

In addition to the contact named above, individuals making key 
contributions to this report include Andrew Von Ah (Assistant 
Director), Kevin Egan, Elizabeth Eisenstadt, Brandon Haller, Rich 
Hung, Bert Japikse, Dominic Nadarski, and Josh Ormond. 

[End of section] 

Footnotes: 

[1] NextGen was designed as an interagency effort in order to leverage 
various agencies' expertise and funding to advance NextGen while 
avoiding duplication. In addition to FAA, federal partner agencies 
include the Departments of Commerce (particularly its National Oceanic 
and Atmospheric Administration), Defense, Homeland Security, and 
Transportation; the National Aeronautics and Space Administration; and 
the White House Office of Science and Technology Policy. 

[2] GAO, Next Generation Air Transportation System: Challenges with 
Partner Agency and FAA Coordination Continue, and Efforts to Integrate 
Near, Mid-, and Long-term Activities Are Ongoing, [hyperlink, 
http://www.gao.gov/products/GAO-10-649T] (Washington, D.C.: Apr. 21, 
2010). 

[3] Vision 100--Century of Aviation Reauthorization Act (Pub. L. No. 
108-176, 117 Stat. 2490 (2003)). 

[4] FAA requested that RTCA--a private, not-for-profit corporation 
that develops consensus-based recommendations on communications, 
navigation, surveillance, and air traffic management system issues- 
create a NextGen Midterm Implementation Task Force (the Task Force), 
composed of industry stakeholders, to reach consensus within the 
aviation community on the operational improvements that can be 
implemented between now and 2018. The Task Force provided 
recommendations to FAA in September 2009 and FAA responded to all of 
these recommendations in its 2010 NextGen Implementation Plan. FAA is 
continuing to work with industry through RTCA to address the 
recommendations as implementation continues. 

[5] The solution set organization is located in the NextGen 
Integration and Implementation Office and provides a portfolio 
framework to manage the successful implementation of both immediate 
improvements and the large-scale integration of NextGen capabilities. 

[6] FAA's Flight Plan is a 5-year strategic plan that outlines agency 
goals and metrics. FAA's NextGen Implementation Plan is an annual 
workplan that defines the midterm operational capabilities the agency 
plans to deliver between now and 2018. FAA's Enterprise Architecture 
provides the structure to relate organizational mission, vision, and 
goals to business processes and the technical or information 
technology infrastructure required to execute them. An Exhibit 300-
also called a Capital Asset Plan and Business Case-is used to justify 
resource requests for major investments and is intended to enable an 
agency to demonstrate to its own management, as well as to OMB, that a 
major project is well planned. 

[7] GAO, Tax Administration: IRS Needs to Further Refine Its Tax 
Filing Season Performance Measures, [hyperlink, 
http://www.gao.gov/products/GAO-03-143] (Washington, D.C.: Nov. 22, 
2002). 

[8] Earned value management compares the actual work performed at 
certain stages of a job to its actual costs--rather than comparing 
budgeted and actual costs, the traditional management approach to 
assessing progress. By measuring the value of the work that has been 
completed at certain stages in a job, earned value management can 
alert program managers, contractors, and administrators to potential 
cost growth and schedule delays before they occur and to problems that 
need correcting before they worsen. 
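
For illustration, a minimal worked example (in Python) of the standard 
earned value variances follows; the dollar figures are hypothetical. 

# Minimal worked example of standard earned value management
# quantities: cost variance (CV = EV - AC) and schedule variance
# (SV = EV - PV). All dollar figures are hypothetical.

planned_value = 10_000_000  # budgeted cost of work scheduled to date
earned_value = 8_500_000    # budgeted cost of work actually performed
actual_cost = 9_200_000     # actual cost of the work performed

cost_variance = earned_value - actual_cost        # negative: over cost
schedule_variance = earned_value - planned_value  # negative: behind

print(f"Cost variance: ${cost_variance:,}")          # $-700,000
print(f"Schedule variance: ${schedule_variance:,}")  # $-1,500,000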

[9] Performance metrics ensure that project managers are accountable 
in meeting expected performance goals and that projects are aligned 
with an agency's strategic goals. 

[10] Within FAA, the Joint Resources Council is an executive body 
consisting of associate and assistant administrators, acquisition 
executives, the Chief Financial Officer, the Chief Information 
Officer, and legal counsel. The council makes agency-level decisions, 
including those that determine whether an acquisition meets a mission 
need and should proceed. The council also approves changes to a 
program's baseline, budget submissions, and the national airspace 
system's architecture baseline. 

[11] ATO is responsible for operating, maintaining, and modernizing 
the nation's current air traffic control system. 

[12] The 2010 NextGen Implementation Plan also identifies how FAA 
plans to respond to recommendations from the NextGen Midterm 
Implementation Task Force (the Task Force). FAA requested that RTCA, 
(see footnote 4) create the Task Force to reach consensus within the 
aviation community on the operational improvements that can be 
implemented between now and 2018. 

[13] The NextGen Management Board is chaired by the FAA Deputy 
Administrator and consists of ATO's Chief Operating Officer and direct 
reports (including all Senior Vice Presidents and Vice Presidents, as 
well as the Director of the NextGen Integration and Implementation 
Office), the Associate Administrator for Aviation Safety, the Deputy 
Associate Administrator for Airports, the Assistant Administrator for 
Regions and Center Operations, the Assistant Administrator for 
Financial Services/Chief Financial Officer, the Assistant Associate 
Administrator for Policy, Planning and Environment, the Director of 
JPDO, Department of Defense Liaison, and a representative from MITRE. 
Also on the board are members of other key stakeholder groups, 
including representatives from the air traffic controller and aviation 
safety specialist unions. According to FAA, the Board is currently 
undergoing restructuring. 

[14] Vision 100--Century of Aviation Reauthorization Act (Pub. L. No. 
108-176, 117 Stat. 2490 (2003)). 

[15] 2010 NextGen Implementation Plan, March 2010, and GAO, Next 
Generation Air Transportation System: Status of Systems Acquisition 
and the Transition to the Next Generation Air Transportation System, 
[hyperlink, http://www.gao.gov/products/GAO-08-1078] (Washington, 
D.C.: Sept. 11, 2008). 

[16] [hyperlink, http://www.gao.gov/products/GAO-08-1078] and 
[hyperlink, http://www.gao.gov/products/GAO-10-649T]. 

[17] Organized in 1935 and once called the Radio Technical Commission 
for Aeronautics, the organization is today known simply by its acronym. RTCA 
is a private, not-for-profit corporation that develops consensus-based 
performance standards for air traffic control systems. RTCA serves as 
a federal advisory committee, and its recommendations are the basis 
for a number of FAA's policy, program, and regulatory decisions. 

[18] GAO, Air Traffic Control: FAA Uses Earned Value Techniques to 
Help Manage Information Technology Acquisitions, but Needs to Clarify 
Policy and Strengthen Oversight, [hyperlink, 
http://www.gao.gov/products/GAO-08-756] (Washington, D.C.: July 18, 
2008). 

[19] GAO, Information Technology: Agencies Need to Improve the 
Implementation and Use of Earned Value Techniques to Help Manage Major 
System Acquisitions, [hyperlink, http://www.gao.gov/products/GAO-10-2] 
(Washington, D.C.: Oct. 8, 2009). 

[20] GAO, Air Traffic Control: FAA Reports Progress in System 
Acquisitions, but Changes in Performance Measurement Could Improve 
Usefulness of Information, [hyperlink, 
http://www.gao.gov/products/GAO-08-42] (Washington, D.C.: Dec. 18, 
2007). 

[21] FAA's Enterprise Architecture for the national airspace system 
shows the interdependencies and capabilities that may be affected by 
various programs, but this document cannot indicate specific 
scheduling milestones that might be affected. 

[22] GAO, Federal Aviation Administration: Human Capital System 
Incorporates Many Leading Practices, but Improving Employees' 
Satisfaction with Their Workplace Remains a Challenge, [hyperlink, 
http://www.gao.gov/products/GAO-10-89] (Washington, D.C.: Oct. 28, 
2009). 

[23] [hyperlink, http://www.gao.gov/products/GAO-03-143]. 

[24] Not all attributes are equal, and failure to have a particular 
attribute does not necessarily indicate that there is a weakness in 
that area or that the measure is not useful; rather, it may indicate 
an opportunity for further refinement. 

[25] MITRE is a not-for-profit organization chartered to work in the 
public interest. It manages four Federally Funded Research and 
Development Centers, including one for FAA. MITRE has its own 
independent research and development program that explores new 
technologies and new uses of technologies to solve problems in the 
near term and in the future. 

[26] GAO, National Airspace System: Setting On-Time Performance 
Targets at Congested Airports Could Help Focus FAA's Actions, 
[hyperlink, http://www.gao.gov/products/GAO-10-542] (Washington, D.C.: 
May 26, 2010). 

[27] S. 1451, 111th Cong. § 317. 

[28] For example, these procedures decrease flight miles, which reduce 
an aircraft's fuel burn and carbon emissions. 

[29] OMB posts some of this information to its "IT Dashboard." 
See GAO, Information Technology: OMB's Dashboard Has Increased 
Transparency, but Data Accuracy Improvements Needed, [hyperlink, 
http://www.gao.gov/products/GAO-10-701] (Washington, D.C.: July 17, 
2010). 

[30] GAO, The Results Act: An Evaluator's Guide to Assessing Agency 
Annual Performance Plans, [hyperlink, 
http://www.gao.gov/products/GAO/GGD-10.1.20] (Washington, D.C.: Apr. 
1998). 

[31] An Exhibit 300, also called the Capital Asset Plan and Business Case, is 
a document that agencies submit to OMB to justify resource requests 
for major information technology investments. 

[32] The Flight Plan outlines the agency's four goals (Increase 
Safety, Increase Capacity, Organizational Excellence, and 
International Leadership) along with numerous performance metrics, and 
the Performance and Accountability Report shows the results. 

[33] Commercial aviation does not include general aviation. 

[34] [hyperlink, http://www.gao.gov/products/GAO/GGD-10.1.20]. 

[35] Performance-based navigation includes such things as Area 
Navigation (RNAV), which enables aircraft to fly on any path within 
coverage of ground-or space-based navigation aids, permitting more 
access and flexibility for point-to-point operations; and Required 
Navigation Performance (RNP), which, like RNAV, enables aircraft to 
fly on any path within coverage of ground-or space-based navigation 
aids, but also includes an onboard performance monitoring capability. 
RNP also enables closer en route spacing without intervention by air 
traffic control and permits more precise and consistent arrivals and 
departures. 

[36] FAA's Flight Plan is a 5-year strategic plan that outlines agency 
goals and metrics. FAA's NextGen Implementation Plan is an annual 
workplan that defines the midterm operational capabilities the agency 
plans to deliver between now and 2018. FAA's Enterprise Architecture 
is the structure to relate organizational mission, vision, and goals 
to business processes and the technical or IT infrastructure required 
to execute them. An exhibit 300-also called a Capital Asset Plan and 
Business Case-is used to justify resource requests for major 
investments and is intended to enable an agency to demonstrate to its 
own management, as well as to OMB, that a major project is well 
planned. 

[37] [hyperlink, http://www.gao.gov/products/GAO/GGD-10.1.20]. 

[38] IOC is the declaration by site personnel that the system is ready 
for conditional operational use in the national airspace system. 

[39] ORD signifies the end of conditional use, at which time 
switchover to the new product is complete. 

[40] The Minneapolis Center became a third key site in 2010. 

[41] Prior to field testing at the key centers, testing was done at 
FAA's Technical Center. 

[42] The MOU was signed in December 2009. 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO’s actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO’s Web site, 
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: