This is the accessible text file for GAO report number GAO-15-60 entitled 'Geostationary Weather Satellites: Launch Date Nears, but Remaining Schedule Risks Need to be Addressed' which was released on January 15, 2015. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. United States Government Accountability Office: GAO: Report to the Committee on Science, Space, and Technology, House of Representatives: December 2014: Geostationary Weather Satellites: Launch Date Nears, but Remaining Schedule Risks Need to be Addressed: GAO-15-60: GAO Highlights: Highlights of GAO-15-60, a report to the Committee on Science, Space, and Technology, House of Representatives. Why GAO Did This Study: NOAA, with the aid of the National Aeronautics and Space Administration (NASA), is procuring the next generation of geostationary weather satellites. The GOES-R series is to replace the current series of satellites, which will likely begin to reach the end of their useful lives in 2015. This new series is considered critical to the United States’ ability to maintain the continuity of satellite data required for weather forecasting through 2036. GAO was asked to evaluate GOES-R. GAO’s objectives were to (1) assess progress on program schedule, cost, and functionality; (2) assess efforts to identify and address issues discovered during integration and testing; and (3) evaluate the likelihood of a gap in satellite coverage and actions to prevent or mitigate such a gap. To do so, GAO analyzed program and contractor data, earned value data information, and defect reports, compared both defect management policies and contingency plans to best practices by leading organizations, and interviewed officials at NOAA and NASA. What GAO Found: The National Oceanic and Atmospheric Administration (NOAA)’s Geostationary Operational Environmental Satellite-R (GOES-R) program has made major progress in developing its first satellite, including completing testing of satellite instruments. However, the program continues to face challenges in the areas of schedule, cost, and functionality. Specifically, the program has continued to experience delays in major milestones and cost overruns on key components. Also, in order to meet the planned launch date, the program has deferred some planned functionality until after launch, and program officials acknowledge that they may defer more. 
Figure: Delays in Key GOES-R Program Milestones: [Refer to PDF for image: horizontal bar graph] Program milestone: Mission operations review; Date planned (as of April 2012): January 2013; Schedule variance: July 2014; Date completed or planned (as of March 2013): January 2013; Date completed or planned (as of September 2014): July 2014. Program milestone: End-to-end test #1; Date planned (as of April 2012): February 2014; Schedule variance: December 2014; Date completed or planned (as of March 2013): November 2013; Date completed or planned (as of September 2014): December 2014. Program milestone: End-to-end test #2; Date planned (as of April 2012): May 2014; Schedule variance: January 2015; Date completed or planned (as of March 2013): August 2014; Date completed or planned (as of September 2014): January 2015. Program milestone: End-to-end test #3; Date planned (as of April 2012): August 2014; Schedule variance: April 2015; Date completed or planned (as of March 2013): December 2014; Date completed or planned (as of September 2014): April 2015. Program milestone: End-to-end test #4; Date planned (as of April 2012): December 2014; Schedule variance: August 2015; Date completed or planned (as of March 2013): March 2015; Date completed or planned (as of September 2014): August 2015. Program milestone: End-to-end test #5; Date planned (as of April 2012): July 2015; Schedule variance: September 2015; Date completed or planned (as of March 2013): July 2015; Date completed or planned (as of September 2014): September 2015. Program milestone: Anticipated launch; Date planned (as of April 2012): October 2015; Schedule variance: March 2016; Date completed or planned (as of March 2013): October 2015; Date completed or planned (as of September 2014): March 2016. Source: GAO analysis of NOAA data. GAO-15-60. [End of figure] NOAA and its contractors have implemented a defect management process as part of their overall testing approach that allows them to identify, assess, track, resolve, and report on defects. However, shortfalls remain in how defects are analyzed and reported. For example, contractors manage, track, and report defects differently due to a lack of guidance from NOAA. Without consistency among contractors, it is difficult for management to effectively prioritize and oversee defect handling. In addition, more than 800 defects in key program components remained unresolved. Until the program makes progress in addressing these defects, it may not have a complete picture of remaining issues and faces an increased risk of further delays to the GOES-R launch date. As the GOES-R program approaches its expected launch date of March 2016, it faces a potential gap of more than a year during which an on- orbit backup satellite would not be available. This means that if an operational satellite experiences a problem, there could be a gap in GOES coverage. NOAA has improved its plan to mitigate gaps in satellite coverage. However, the revised plan does not include steps for mitigating a delayed launch, or details on potential impacts and minimum performance levels should a gap occur. Until these shortfalls are addressed, NOAA management cannot fully assess all gap mitigation strategies, which in turn could hinder the ability of meteorologists to observe and report on severe weather conditions. What GAO Recommends: GAO is recommending that NOAA address shortfalls in its defect management approach, reduce the number of open high-priority defects, and add information to its satellite contingency plan. 
NOAA concurred with GAO’s recommendations and identified steps it plans to take to implement them. View [hyperlink, http://www.gao.gov/products/GAO-15-60]. For more information, contact Dave Powner at (202) 512-9286 or pownerd@gao.gov. [End of section] Contents: Letter: Background: NOAA is Making Progress on the GOES-R Satellite, but Schedule, Cost, and Functionality Challenges Remain: NOAA Has Implemented a Defect Management Process, but Shortfalls Exist and Selected Critical Defects Remain Unresolved: Facing a Gap in Backup Satellite Coverage, the GOES-R Program Has Improved Contingency Plans Though Shortfalls Remain: Conclusions: Recommendations for Executive Action: Agency Comments and Our Evaluation: Appendix I: Objectives, Scope, and Methodology: Appendix II: Defect Trends for Key GOES-R Components: Appendix III: Comments from the Department of Commerce: Appendix IV: GAO Contact and Staff Acknowledgments: Tables: Table 1: Summary of the Procurement History of the Geostationary Operational Environmental Satellites: Table 2: Key Changes to the Geostationary Operational Environmental Satellite-R Program over Time: Table 3: Instruments on the Geostationary Operational Environmental Satellite-R Program: Table 4: Key Components of the Geostationary Operational Environmental Satellite-R Program Ground Project: Table 5: Major Development Reviews for the Geostationary Operational Environmental Satellite-R Program: Table 6: Delays in Milestones for the Geostationary Operational Environmental Satellite-R Program: Table 7: Best Practices in Managing Defects: Table 8: Assessment of Geostationary Operational Environmental Satellite-R Program Practices in Managing Defects: Table 9: Summary of Defects for Selected Geostationary Operational Environmental Satellite-R Program Components: Table 10: Assessment of Satellite Contingency Plans for the Geostationary Operational Environmental Satellite-R Program over Time: Figures: Figure 1: Approximate Geographic Coverage of the Geostationary Operational Environmental Satellites: Figure 2: Generic Data Relay Pattern for the Geostationary Operational Environmental Satellites: Figure 3: Organizational Structure and Staffing of the Geostationary Operational Environmental Satellite-R Series Program: Figure 4: NASA's Life Cycle for Flight Systems: Figure 5: Changes in Key Test Dates for the Geostationary Operational Environmental Satellite-R Program over Time: Figure 6: Increases in Cost Variance for the Geostationary Operational Environmental Satellite-R Program's Ground System: Figure 7: Increases in Cost Variance for the Geostationary Operational Environmental Satellite-R Program's Advanced Baseline Imager Instrument: Figure 8: Increases in Cost Variance for the Geostationary Operational Environmental Satellite-R Program's Geostationary Lightning Mapper Instrument: Figure 9: Potential Gap in Geostationary Operational Environmental Satellite Coverage, as of April 2014: Figure 10: Number of Open and Closed Defects for the Geostationary Operational Environmental Satellite-R Program's Ground System, as of August 2014: Figure 11: Number of Open and Closed Hardware Defects for the Geostationary Operational Environmental Satellite-R Program's Advanced Baseline Imager Instrument for Each Month from February 2013 to August 2014: Figure 12: Number of Open and Closed Software Defects for the Geostationary Operational Environmental Satellite-R Program's Advanced Baseline Imager Instrument from 2012 to 2014: Abbreviations: ABI: Advanced Baseline Imager: CDR: critical 
design review: EVM: earned value management: FOR: flight operations review: GLM: Geostationary Lightning Mapper: GOES: Geostationary Operational Environmental Satellite: GOES-R: Geostationary Operational Environmental Satellite - R series: MDR: mission definition review: MOR: mission operations review: NASA: National Aeronautics and Space Administration: NOAA: National Oceanic and Atmospheric Administration: ORR: operational readiness review: PDR: preliminary design review: SDR: system definition review: SIR: system integration review: [End of section] United States Government Accountability Office: GAO: 441 G St. N.W. Washington, DC 20548: December 16, 2014: The Honorable Lamar S. Smith: Chairman: The Honorable Eddie Bernice Johnson: Ranking Member: Committee on Science, Space, and Technology: House of Representatives: Geostationary environmental satellites play a critical role in our nation's weather forecasting. These satellites--which are managed by the Department of Commerce's National Oceanic and Atmospheric Administration (NOAA)--provide information on atmospheric, oceanic, climatic, and solar conditions that help meteorologists observe and predict regional and local weather events. They also provide a means of identifying the large-scale evolution of severe storms, such as forecasting a hurricane's path and intensity. NOAA, through collaboration with the National Aeronautics and Space Administration (NASA), is procuring the next generation of geostationary weather satellites, called the Geostationary Operational Environmental Satellite-R (GOES-R) series. The GOES-R series consists of four satellites and is to replace the current series of geostationary environmental satellites as they reach the end of their useful lives. This new series is expected to provide the first major improvement in the technology of GOES instruments since 1994 and, as such, is considered critical to the United States' ability to maintain the continuity of data required for weather forecasting through the year 2036. NOAA is facing a potential gap where it may not have a backup satellite in orbit between now and the launch of the first GOES- R satellite. Because of the criticality of satellite data to weather forecasting, the likelihood of data gaps in NOAA's satellite programs, and the potential impact of a gap on the health and safety of the U.S. population and economy, we added mitigating gaps in weather satellite data to our High Risk List in 2013.[Footnote 1] This report responds to your request that we review NOAA's GOES-R program. Specifically, our objectives were to (1) assess progress on the GOES-R program with respect to planned schedule, cost, and functionality; (2) assess efforts to identify and address issues discovered during integration and testing; and (3) evaluate the likelihood of a gap in satellite coverage and analyze the adequacy of contingency actions in place to prevent or mitigate such a gap. To assess NOAA's progress in developing GOES-R with respect to cost, schedule, and functionality, we analyzed monthly program status briefings to identify current status, recent development challenges, and both expected and potential changes in functionality. We compared current and past status briefings to determine delays over time to key milestones. We also analyzed cost reserve and earned value data information to understand the program's current cost posture. 
To assess NOAA's efforts to identify and address issues discovered during integration and testing, we analyzed defect management policies and practices against criteria from industry best practices. We also compared defect trend data and metrics over time, and interviewed contractor officials responsible for testing and defect management. To ensure the reliability of cost and defect data, we compared data across monthly reports and other agency data sources, compared formulas from the GAO Cost Guide[Footnote 2] to the program's earned value management approach, and sought corroboration from agency and contractor officials. To evaluate the likelihood of a gap in satellite coverage and analyze contingency actions, we analyzed program documentation to determine the likely length of a gap, and followed up on our previous effort to compare the GOES-R contingency plan to best practices in contingency planning identified by leading organizations.[Footnote 3] We also interviewed NOAA and GOES-R program officials regarding program status, defect management, and contingency planning. We conducted this performance audit from January 2014 to December 2014 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. See appendix I for a complete description of our objectives, scope, and methodology. Background: Since the 1970s, geostationary satellites have been used by the United States to provide meteorological data for weather observation, research, and forecasting. NOAA's National Environmental Satellite, Data, and Information Service is responsible for managing the civilian operational geostationary satellite system, called GOES. Geostationary satellites can maintain a constant view of the earth from a high orbit of about 22,300 miles in space. NOAA operates GOES as a two-satellite system that is primarily focused on the United States (see figure 1). These satellites provide timely environmental data about the earth's atmosphere, surface, cloud cover, and the space environment to meteorologists and their audiences. They also observe the development of hazardous weather, such as hurricanes and severe thunderstorms, and track their movement and intensity to reduce or avoid major losses of property and life. The ability of the satellites to provide broad, continuously updated coverage of atmospheric conditions over land and oceans is important to NOAA's weather forecasting operations. Figure 1: Approximate Geographic Coverage of the Geostationary Operational Environmental Satellites: [Refer to PDF for image: illustrated world map] Depicts approximate coverage of GOES-West and GOES-East, which overlap. Source: NOAA (data), MapArt (map). GAO-15-60. [End of figure] To provide continuous satellite coverage, NOAA acquires several satellites at a time as part of a series and launches new satellites every few years (see table 1). NOAA's policy is to have two operational satellites and one backup satellite in orbit at all times. Table 1: Summary of the Procurement History of the Geostationary Operational Environmental Satellites: Series name: Original GOES[C]; Procurement duration[A]: 1970-1987; Satellites[B]: 1, 2, 3, 4, 5, 6, 7. 
Series name: GOES I-M; Procurement duration[A]: 1985-2001; Satellites[B]: 8, 9, 10, 11, 12. Series name: GOES-N; Procurement duration[A]: 1998-2010; Satellites[B]: 13, 14, 15, Q[D]. Series name: GOES-R; Procurement duration[A]: 2008-2024; Satellites[B]: R, S, T, U. Source: GAO analysis of NOAA data. GAO-15-60. [A] Duration includes time from contract award to final satellite launch. [B] Satellites in a series are identified by letters of the alphabet when they are on the ground (before launch) and by numbers once they are in orbit. [C] The procurement of these satellites consisted of four separate contracts for (1) two early prototype satellites and GOES-1, (2) GOES- 2 and -3, (3) GOES-4 through -6, and (4) GOES-G (failed on launch) and GOES-7. [D] NOAA decided not to exercise the option for this satellite. [End of table] Three viable GOES satellites--GOES-13, GOES-14, and GOES-15--are currently in orbit.[Footnote 4] Both GOES-13 and GOES-15 are operational satellites, with GOES-13 covering the eastern United States (GOES-East in Figure 1) and GOES-15 covering the western United States (GOES-West). GOES-14 is currently in an on-orbit storage mode and is available as a backup for the other two satellites should they experience any degradation in service. The GOES-R series is the next generation of satellites that NOAA is planning. Each of the operational geostationary satellites continuously transmits raw environmental data to NOAA ground stations. The data are processed at these ground stations and transmitted back to the satellite for broadcast to primary weather services and the global research community in the United States and abroad. Raw and processed data are also distributed to users via ground stations through other communication channels, such as dedicated private communication lines and the Internet. Figure 2 depicts a generic data relay pattern from a geostationary satellite to the ground stations and commercial terminals. Figure 2: Generic Data Relay Pattern for the Geostationary Operational Environmental Satellites: [Refer to PDF for image: illustration] Communications link: GOES satellite: Raw environmental data sent to ground station; Processed environmental data sent back to GOES; Processed environmental data broadcast to users. Source: GAO analysis of NOAA data. GAO-15-60. [End of figure] Overview of the GOES-R Program: NOAA established the GOES-R program to develop and launch the next series of geostationary satellites and to ensure the continuity of geostationary satellite observations. The GOES-R satellite series is designed to improve upon the technology of the prior satellite series in terms of system and instrument improvements. NOAA expects that the GOES-R series will significantly increase the clarity and precision of the observed environmental data. In addition, the data generated by the satellites are to be both developed and transmitted more quickly. Since its inception, the GOES-R program has undergone several changes in cost and scope. As originally envisioned, GOES-R was to encompass four satellites hosting a variety of advanced technology instruments and providing 81 environmental products. The first two satellites in the series (called GOES-R and GOES-S) were expected to launch in September 2012 and April 2014. However, in September 2006, NOAA decided to reduce the scope and technical complexity of the GOES-R program because of expectations that total costs, which were originally estimated to be $6.2 billion, could reach $11.4 billion. 
Specifically, NOAA reduced the minimum number of satellites from four to two, canceled plans for developing an advanced instrument (which reduced the number of planned satellite products from 81 to 68), and divided another instrument into two separate acquisitions. The agency estimated that the revised program would cost $7 billion and kept the planned launch dates unchanged. Subsequently, NOAA made several other important decisions about the cost and scope of the GOES-R program. In 2007, NOAA established a new program cost estimate of $7.67 billion and moved the launch dates for the first two satellites to December 2014 and April 2016. Further, to mitigate the risk that costs would rise, program officials decided to remove selected program requirements from the baseline program and treat them as contract options that could be exercised if funds allowed. These requirements included the number of products to be distributed, the time to deliver the remaining products (product latency), and how often these products would be updated with new satellite data (refresh rate). For example, program officials eliminated the requirement to develop and distribute 34 of the 68 envisioned products, including low cloud and fog, sulfur dioxide detection, and cloud liquid water. Program officials included the restoration of the requirements for the products, latency times, and refresh rates as options in the ground system contract that could be acquired at a later time. Program officials later reduced the number of products that could be restored as a contract option (called option 2) from 34 to 31 because they determined that two products were no longer feasible and two others could be combined into a single product. In late 2009, NOAA changed the launch dates for the first two satellites to October 2015 and February 2017. More recently, NOAA restored two satellites to the program's baseline, making GOES-R a four-satellite program once again. In February 2011, as part of its fiscal year 2012 budget request, NOAA requested funding to begin development for two additional satellites in the GOES-R series--GOES-T and GOES-U. The program estimated that the development for all four satellites in the GOES-R series--GOES-R, GOES-S, GOES-T, and GOES-U-- would cost $10.86 billion through 2036, an increase of $3.19 billion over its 2007 life cycle cost estimate of $7.67 billion for the two- satellite program. In August 2013, the program announced that it would delay the launch of the first two satellites in the program, due in part to the effects of sequestration. Specifically, the launch of the GOES-R satellite was delayed from October 2015 to the quarter ending March 2016, and the expected GOES-S satellite launch date was moved from February 2017 to the quarter ending June 2017. See table 2 for an overview of key changes to the GOES-R program. Table 2: Key Changes to the Geostationary Operational Environmental Satellite-R Program over Time: Number of satellites; August 2006 (baseline program): 4; September 2006: 2; November 2007: 2; February 2011: 4; August 2013: 4; [Empty]. 
Instruments; August 2006 (baseline program): * Advanced Baseline Imager; * Geostationary Lightning Mapper; * Magnetometer; * Space Environmental In-Situ Suite; * Solar Imaging Suite (which included the Solar Ultraviolet Imager, and Extreme Ultraviolet/X-Ray Irradiance Sensor); * Hyperspectral Environmental Suite; September 2006: * Advanced Baseline Imager; * Geostationary Lightning Mapper; * Magnetometer; * Space Environmental In-Situ Suite; * Solar Ultraviolet Imager; * Extreme Ultraviolet/X-Ray Irradiance Sensor; November 2007: No change; February 2011: No change; August 2013: No change. Number of satellite products; August 2006 (baseline program): 81; September 2006: 68; November 2007: 34 baseline; 34 optional; February 2011: 34 baseline; 31 optional; August 2013: 34 baseline; 31 optional. Life cycle cost estimate (in then-year dollars); August 2006 (baseline program): $6.2 billion - $11.4 billion (through 2034); September 2006: $7 billion (through 2028); November 2007: $7.67 billion (through 2028); February 2011: $10.86 billion (through 2036)[A]; August 2013: $10.86 billion (through 2036). Estimated launch dates for GOES-R and GOES-S; August 2006 (baseline program): GOES-R: September 2012; GOES-S: April 2014. September 2006: GOES-R: September 2012; GOES-S: April 2014; November 2007: GOES-R: December 2014; GOES-S: April 2016; February 2011: GOES-R: October 2015 GOES-S: February 2017; August 2013: GOES-R: by March 2016 GOES-S: by June 2017[B]. Source: GAO analysis of NOAA data. GAO-15-60. [A] Based on NOAA's fiscal year 2012 budget estimate, $7.64 billion of this cost estimate was for the first two satellites in the series, GOES-R and GOES-S. The cost for the remaining two satellites--GOES-T and GOES-U--was estimated at $3.22 billion. [B] Program documentation shows that the launch commitment dates were changed to the first quarter of 2016 and the second quarter of 2017, respectively. The launch dates in this chart reflect the latest month in which launch can occur and still meet the launch commitment dates. [End of table] Program Structure: While NOAA is responsible for GOES-R program funding and overall mission success, it implemented an integrated program management structure with NASA for the GOES-R program since it relies on NASA's acquisition experience and technical expertise. The NOAA-NASA Program Management Council is the oversight body for the GOES-R program, and is co-chaired by the NOAA Deputy Undersecretary for Operations and the NASA Associate Administrator. NOAA also located the program office at NASA's Goddard Space Flight Center. The GOES-R program is divided into flight and ground projects that have separate areas of responsibility and oversee different sets of contracts. The flight project, which is led by NASA, includes instruments, spacecraft, launch services, satellite integration, and on-orbit satellite initialization. The ground project, which is led by NOAA, is made up of three main components: the core ground system, an infrastructure of antennas, and a product access subsystem. In turn, the core ground system comprises four functional modules supporting operations, product generation, product distribution, and configuration control. Figure 3 depicts the integrated program management structure and the organization of the flight and ground projects within that structure, while table 3 summarizes the GOES-R instruments and their planned capabilities and table 4 describes key components of the ground project. 
Figure 3: Organizational Structure and Staffing of the Geostationary Operational Environmental Satellite-R Series Program: [Refer to PDF for image: Organizational chart] Top level: Commerce. Second level, reporting to Commerce: National Oceanic and Atmospheric Administration (NOAA); * Communicates with: NOAA/NASA Program Management Council. Third level, reporting to NOAA: National Environmental Satellite, Data, and Information Service (NESDIS); * Communicates with: Goddard Space Flight Center Management Council (with NOAA representation), which communicates with: NASA; * Communicates with: NESDIS/Science Mission Directorate Program Management Council, which communicates with: NOAA/NASA Program Management Council. Fourth level, reporting to NESDIS: * GOES-R Program: System Program Director: NOAA; Deputy System Program Director: NASA; Assistant System Program Director: NOAA; - Communicates with: Program Scientist: NOAA; - Communicates with: Goddard Space Flight Center Management Council (with NOAA representation); * Program Scientist: NOAA. Fifth level, reporting to: GOES-R Program; * Ground Project Project Manager: NOAA; * Flight Project Project Manager: NASA. Source: NOAA. GAO-15-60. [End of figure] Table 3: Instruments on the Geostationary Operational Environmental Satellite-R Program: Instrument: Advanced Baseline Imager; Description: Expected to provide variable area imagery and radiometric information of the earth's surface, atmosphere, and cloud cover. Key features include; * monitoring and tracking severe weather; * providing images of clouds to support forecasts; and; * providing higher resolution, faster coverage, and broader coverage simultaneously. Instrument: Geostationary Lightning Mapper; Description: Expected to continuously monitor total lightning (in- cloud and cloud-to-ground) activity over the United States and adjacent oceans and to provide a more complete dataset than previously possible. Key features include; * detecting lightning activity as an indicator of severe storms and convective weather hazard impacts to aviation; and; * providing a new capability to GOES for long-term mapping of total lightning that only previously existed on NASA low-earth-orbiting research satellites. Instrument: Magnetometer; Description: Expected to provide information on the general level of geomagnetic activity, monitor current systems in space, and permit detection of magnetopause crossings, sudden storm commencements, and substorms. Instrument: Space Environmental In-Situ Suite; Description: Expected to provide information on space weather to aid in the prediction of particle precipitation, which causes disturbance and disruption of radio communications and navigation systems. Key features include; * measuring magnetic fields and charged particles; * providing improved heavy ion detection, adding low-energy electrons and protons; and; * enabling early warnings for satellite and power grid operation, telecom services, astronauts, and airlines. Instrument: Solar Ultraviolet Imager; Description: Expected to provide coverage of the entire dynamic range of solar X-ray features, from coronal holes to X-class flares, and will provide quantitative estimates of the physical conditions in the Sun's atmosphere. Key features include; * providing information used for geomagnetic storm forecasts, and power grid performance; and; * providing observations of solar energetic particle events related to flares. 
Instrument: Extreme Ultraviolet/X-Ray Irradiance Sensor; Description: Expected to detect solar soft X-ray irradiance and solar extreme ultraviolet spectral irradiance. Key features include; * monitoring solar flares that can disrupt communications and degrade navigational accuracy, affecting satellites, astronauts, high latitude airline passengers; and; * monitoring solar variations that directly affect satellite drag/tracking and ionospheric changes, which impact communications and navigation operations. Source: GAO analysis of NOAA data. GAO-15-60. [End of table] Table 4: Key Components of the Geostationary Operational Environmental Satellite-R Program Ground Project: Component: Core Ground System; Description: Expected to (1) provide command of operational functions of the spacecraft and instruments, (2) receive and process information from the instruments and spacecraft, (3) distribute satellite data products to users, and (4) provide configuration control and a common infrastructure and set of services for the satellite and instruments. Component: Antennas; Description: Expected to provide six new antenna stations and modify four existing antennas to receive GOES-R data. The antenna contract is also expected to include the construction of related infrastructure, software development for control systems, and maintenance. Component: Product Distribution and Access System; Description: Expected to provide ingestion of data and distribution for GOES-R products and data to authorized users. When completed, this system will be integrated into the core ground system. Source: GAO analysis of NOAA data. GAO-15-60. [End of table] Prior GAO Reports Made Recommendations to Address Program Weaknesses: In recent years, we issued a series of reports aimed at addressing weaknesses in the GOES-R program.[Footnote 5] Key areas of focus included (1) cost, (2) technical challenges and changes in requirements, and (3) contingency plans. * Addressing cost risks: In June 2012, we reported that the GOES-R program might not be able to ensure that it had adequate resources to cover unexpected problems in remaining development.[Footnote 6] We recommended the program strengthen its process for planning and reporting on reserves. More recently, in September 2013, we reported on weaknesses in the process for reporting reserves to management and recommended the agency take action to brief senior executives on a regular basis regarding the status of reserves.[Footnote 7] The agency agreed with our recommendations and took steps to address them by identifying needed reserve levels and providing a more detailed breakdown of reserve percentage calculations. However, NOAA is not yet identifying the reserves associated with each satellite in the series. * Technical issues and changes in requirements: We previously reported on issues related to GOES technical challenges and requirements. In 2012, we reported that key instruments were experiencing technical challenges and required additional redesign efforts. For example, emissions for the Geostationary Lightning Mapper instrument were outside the specified range.[Footnote 8] Also, the ground project was experiencing ongoing technical problems--for example, the definition of ground system software requirements and integration of flight instruments. As a result, revisions were made to the Core Ground System's baseline development plan and schedule. 
More recently, in September 2013, we reported that NOAA made changes to several of the GOES-R requirements--including decreasing the accuracy requirement for the hurricane intensity product and decreasing the timeliness of the lightning detection product--and that end users were concerned by many of these changes.[Footnote 9] We recommended that the program improve communications with users on changes in GOES-R requirements by assessing impacts, seeking input from users, and disseminating information on changes. NOAA agreed with this recommendation and took steps to explore further avenues of user communication such as customer forums and interagency working groups. * Contingency planning: In February 2013, due to the importance of environmental satellite data and the potential for a gap in this data, we added mitigating weather satellite gaps to our biennial High-Risk list.[Footnote 10] In that report, we noted that NOAA had established a contingency plan for a potential gap in the GOES program, but it needed to demonstrate its progress in coordinating with the user community to determine their most critical requirements, conducting training and simulations for contingency operations scenarios, evaluating the status of viable foreign satellites, and working with the user community to account for differences in product coverage under contingency operations scenarios. We also stated that NOAA should update its contingency plan to provide more details on its contingency scenarios, associated time frames, and any preventative actions it is taking to minimize the possibility of a gap. More recently, in September 2013, we reported that, while NOAA had established contingency plans for the loss of the GOES satellites, these plans still did not address user concerns over potential reductions in capability, and did not identify alternative solutions and timelines for preventing a delay in the GOES-R launch date.[Footnote 11] We recommended the agency revise the satellite and ground system contingency plans to address weaknesses, including providing more information on the potential impact of a satellite failure and coordinating with key external stakeholders on contingency strategies. The agency agreed with these recommendations and took steps to address them by identifying and refining program contingency plans. NOAA is Making Progress on the GOES-R Satellite, but Schedule, Cost, and Functionality Challenges Remain: NASA and NOAA are following NASA's standard space system life cycle on the GOES-R program. This life cycle includes distinct phases, including concept and technology development; preliminary design and technology completion; final design and fabrication; system assembly, integration and testing, and launch; and operations and sustainment. Key program reviews are to occur throughout each of the phases, including preliminary design review, critical design review, and system integration review. NOAA and NASA jointly conduct key reviews on the flight and ground segments individually as well as for the program as a whole, and then make decisions on whether to proceed to the next phase. Figure 4 provides an overview of the life cycle phases, key program reviews, and associated decision milestones. In addition, the key reviews are described in table 5. Figure 4: NASA's Life Cycle for Flight Systems: [Refer to PDF for image: life cycle illustration] Management decision reviews: Formulation: Pre-phase A: Concept Studies; KDP A. 
Phase A: Concept and Technology Development; KDP B; Technical review: SDR/MDR. Phase B: Preliminary Design and Technology Completion; KDP C: (confirmation review); Technical review: PDR; Program Start. Phase C: Final Design and Fabrication; Technical reviews: CDR, MOR, SIR; KDP D. Phase D: System Assembly, Integration and Test, Launch; Technical reviews: FOR, ORR; KDP E. Phase E: Operations and Sustainment; KDP F. Phase F: Closeout. Implementation. KDP = key decision point. Technical reviews: SDR/MDR = system definition review/mission definition review; PDR = preliminary design review; CDR = critical design review; MOR = mission operations review; SIR = system integration review; FOR = flight operations review; ORR = operational readiness review. Source: GAO analysis of NASA data. GAO-15-60. Note: According to a NASA official, the MOR and FOR are considered lower-level reviews and are not mandated by NASA's primary procedural requirements. They are, however, key mission reviews required by NASA's Goddard Space Flight Center. [End of figure] Table 5: Major Development Reviews for the Geostationary Operational Environmental Satellite-R Program: Review: System Definition Review; Description: Performed on the flight and ground segments individually, and then on the program as a whole, this review is to examine the proposed system architecture/design and demonstrate that a system that fulfills the mission objectives can be built within existing constraints. Review: Preliminary Design Review; Description: Performed on the flight and ground segments individually, and then on the program as a whole, this review is to demonstrate that the preliminary design meets all system requirements with acceptable risk and within the cost and schedule constraints and to establish the basis for proceeding with detailed design. Review: Critical Design Review; Description: Performed on the flight and ground segments individually, and then on the program as a whole, this review is to evaluate the completed detailed design of the element and subsystem products in sufficient detail to provide approval for a production stage. Review: Mission Operations Review; Description: Performed programwide, this review is to establish the adequacy of plans and schedules for ground systems and flight operations preparation, and to justify readiness to proceed with implementation of the remaining required activities. It is typically held subsequent to completion of detail design and fabrication activity, but prior to initiation of major integration activities of flight or ground-system elements. Review: System Integration Review; Description: Performed programwide, this review is to evaluate the readiness of the project to start system assembly, test, and launch operations. The objectives of the review include ensuring that planning is adequate for all remaining system activities and that available cost and schedule resources support completion of all necessary remaining activities with adequate margin. Review: Flight Operations Review; Description: This review is to present the results of mission operations activities and show that the program has verified compliance with all requirements and demonstrated the ability to execute all phases and modes of mission operations, data processing, and analysis. 
Review: Operational Readiness Review; Description: This review is to examine characteristics and procedures used in the system's operation and ensures that all system and support hardware, software, personnel, and procedures are ready for operations and that user documentation accurately reflects the deployed state of the system. It is typically held near the completion of pre-launch testing between the flight segment and the ground system. Source: GAO analysis of NOAA data. GAO-15-60. [End of table] The GOES-R program has completed important steps in developing its first satellite. Specifically, the program completed its critical design review in November 2012, its Mission Operations Review in June 2014, and its System Integration Review in July 2014. Based on the results of the System Integration Review, in September 2014, NOAA and NASA decided to move the program to the next phase, the system assembly, integration and test, and launch and checkout phase. To prepare for the recent reviews and milestones, the program completed numerous important steps on both the flight and ground projects. Key accomplishments include: * completing testing on individual components including the six flight instruments, the spacecraft core and system modules, and ground system components; * releasing key ground system software components on enterprise infrastructure and mission management in December 2013 and April 2014, respectively, and completing the dry run of another ground system release that includes all planned products for the GOES-R satellite; * replacing and successfully demonstrating a new engineering analysis tool, which will perform trending and offline analysis; * completing installation and testing of antenna dishes at NOAA's satellite operations facility, continuing installation and testing at NOAA's primary satellite communications site, and beginning training in use of the antenna system; * completing two key readiness reviews on the product distribution and access system; and: * completing connectivity tests throughout system hardware components. Moving forward, the next major program milestones are the flight operations review and the operational readiness review. In preparation for these milestones, the program plans to conduct a series of five end-to-end tests.[Footnote 12] These tests are expected to validate compatibility between the space and ground segments before the launch of the first satellite. Schedule Delays Continue and Could Affect the Planned Launch Date: While the program is making progress, it has experienced recent and continuing schedule delays. In 2013, we reported that technical issues on both the flight and ground projects had the potential to cause further delays to the program schedule, and that delays in interim key milestones could result in delays to the launch readiness date. [Footnote 13] We also reported that the program had delayed many of these key milestones and tests, including the mission operations review and the five end-to-end tests. In the months since that report, these and all other major milestones have been further delayed by 5 to 8 months. For example, the program delayed its mission operations review an additional 5 months to June 2014. This most recent delay occurred when the program found that detailed plans for future operations of the system were inadequate and behind schedule. The GOES-R program cited multiple reasons for these recent delays. 
They include challenges in delivering software builds for the flight project and delays in completing communication testing for the spacecraft. In addition to these delays, as previously noted, in late 2013 the agency moved the launch commitment date of the first GOES satellite from October 2015 to the first quarter of 2016. In September 2014, a NOAA management council also confirmed the agency's commitment to launch the satellite during that quarter. Table 6 and figure 5 show changes in the planned or completed dates of selected key program milestones over time. Table 6: Delays in Milestones for the Geostationary Operational Environmental Satellite-R Program: Program milestone: Mission operations review; Date planned (as of April 2012): January 2013; Date completed or planned (as of March 2013): January 2014; Delay through March 2013: 12 months[A]; Date completed or planned (as of November 2014): June 2014; Additional delay (from April 2013 to November 2014): 5 months; Total delay from April 2012 to November 2014: 17 months. Program milestone: End-to-end test #1; Date planned (as of April 2012): February 2014; Date completed or planned (as of March 2013): May 2014; Delay through March 2013: 3 months; Date completed or planned (as of November 2014): December 2014; Additional delay (from April 2013 to November 2014): 7 months; Total delay from April 2012 to November 2014: 10 months. Program milestone: End-to-end test #2; Date planned (as of April 2012): May 2014; Date completed or planned (as of March 2013): August 2014; Delay through March 2013: 3 months; Date completed or planned (as of November 2014): March 2015; Additional delay (from April 2013 to November 2014): 7 months; Total delay from April 2012 to November 2014: 10 months. Program milestone: End-to-end test #3; Date planned (as of April 2012): August 2014; Date completed or planned (as of March 2013): December 2014; Delay through March 2013: 4 months; Date completed or planned (as of November 2014): May 2015; Additional delay (from April 2013 to November 2014): 5 months; Total delay from April 2012 to November 2014: 9 months. Program milestone: Flight operations review; Date planned (as of April 2012): September 2014; Date completed or planned (as of March 2013): January 2015; Delay through March 2013: 4 months; Date completed or planned (as of November 2014): July 2015; Additional delay (from April 2013 to November 2014): 6 months; Total delay from April 2012 to November 2014: 10 months. Program milestone: End-to-end test #4; Date planned (as of April 2012): December 2014; Date completed or planned (as of March 2013): March 2015; Delay through March 2013: 3 months; Date completed or planned (as of November 2014): September 2015; Additional delay (from April 2013 to November 2014): 6 months; Total delay from April 2012 to November 2014: 9 months. Program milestone: Operational readiness review; Date planned (as of April 2012): April 2015; Date completed or planned (as of March 2013): April 2015; Delay through March 2013: No change; Date completed or planned (as of November 2014): December 2015; Additional delay (from April 2013 to November 2014): 8 months; Total delay from April 2012 to November 2014: 8 months. 
Program milestone: End-to-end test #5; Date planned (as of April 2012): July 2015; Date completed or planned (as of March 2013): July 2015; Delay through March 2013: No change; Date completed or planned (as of November 2014): January 2016; Additional delay (from April 2013 to November 2014): 6 months; Total delay from April 2012 to November 2014: 6 months. Program milestone: Anticipated launch; Date planned (as of April 2012): October 2015; Date completed or planned (as of March 2013): October 2015; Delay through March 2013: No change; Date completed or planned (as of November 2014): March 2016; Additional delay (from April 2013 to November 2014): 5 months; Total delay from April 2012 to November 2014: 5 months. Source: GAO analysis of NOAA data. GAO-15-60. [A] Program officials stated that they had erroneously scheduled the mission operations review too soon, and moved the date by 9 months to better reflect when the review was needed. Therefore, only 3 of the 12 months were attributable to a delay. [End of table] Figure 5: Changes in Key Test Dates for the Geostationary Operational Environmental Satellite-R Program over Time: [Refer to PDF for image: horizontal bar graph] Program milestone: Mission operations review; Date planned (as of April 2012): January 2013; Schedule variance: July 2014; Date completed or planned (as of March 2013): January 2013; Date completed or planned (as of September 2014): July 2014; 17 month delay. Program milestone: End-to-end test #1; Date planned (as of April 2012): February 2014; Schedule variance: December 2014; Date completed or planned (as of March 2013): November 2013; Date completed or planned (as of September 2014): December 2014; 10 month delay. Program milestone: End-to-end test #2; Date planned (as of April 2012): May 2014; Schedule variance: January 2015; Date completed or planned (as of March 2013): August 2014; Date completed or planned (as of September 2014): January 2015; 10 month delay. Program milestone: End-to-end test #3; Date planned (as of April 2012): August 2014; Schedule variance: April 2015; Date completed or planned (as of March 2013): December 2014; Date completed or planned (as of September 2014): April 2015; 9 month delay. Program milestone: Flight operations review; Date planned (as of April 2012): August 2014; Schedule variance: July 2015; Date completed or planned (as of March 2013): January 2015; Date completed or planned (as of September 2014): July 2015; 10 month delay. Program milestone: End-to-end test #4; Date planned (as of April 2012): December 2014; Schedule variance: August 2015; Date completed or planned (as of March 2013): March 2015; Date completed or planned (as of September 2014): August 2015; 9 month delay. Program milestone: Operational readiness review; Date planned (as of April 2012): April 2015; Schedule variance: December 2015; Date completed or planned (as of March 2013): April 2015; Date completed or planned (as of September 2014): December 2015; 8 month delay. Program milestone: End-to-end test #5; Date planned (as of April 2012): July 2015; Schedule variance: September 2015; Date completed or planned (as of March 2013): July 2015; Date completed or planned (as of September 2014): September 2015; 6 month delay. Program milestone: Anticipated launch; Date planned (as of April 2012): October 2015; Schedule variance: March 2016; Date completed or planned (as of March 2013): October 2015; Date completed or planned (as of September 2014): March 2016; 5 month delay. Source: GAO analysis of NOAA data. 
GAO-15-60. [End of figure] The program has experienced recent issues in developing specific components that increase the potential for further delays. For example, in mid-2014, the program discovered an issue in newly designed electronics systems to be used on the spacecraft system module. According to the program, this issue could cause a schedule delay because it was discovered late in the integration and testing phase. As we have previously reported, problems experienced during the integration and test phase often lead to cost and schedule growth.[Footnote 14] Also, one of the main ground antenna sites experienced power supply failures on 10 separate occasions, due to issues in the site's air conditioning units. According to program officials, this issue delayed the completion of testing. In addition, the program's actions to mitigate schedule delays introduce some risks, and could therefore increase the amount of the delay. For example, the program attempted to mitigate delays in developing the detailed plans for ground-based data operations by performing system development while concurrently working on the detailed plans. According to program documentation, this could cause rework and increase cost and schedule slips. In addition, the program has responded to prior delays by eliminating selected repetitive tests and moving to a 24-hour-a-day, 7-day-a-week spacecraft integration testing schedule. These actions increase the risk that it will take longer to identify issues and that there will be no extra time available to expand testing into off hours, if needed. We have previously reported that overlapping planning and development activities and compressing test schedules increase the risk of further delays because there would be little time to resolve any issues that arise.[Footnote 15] Program officials acknowledged the possibility of further schedule slips. The program is now reaching a point where additional delays in starting the end-to-end tests could begin to adversely affect its schedule. Recently, a NOAA review board reported that moving all of the end-to-end and data operations tests further back into 2015 would require too much testing to occur at once. In addition, any further delays could affect the committed launch date of the first GOES satellite. As of August 2014, program officials could not rule out the possibility of further delays in the planned launch date. Program Life Cycle Cost Estimate Remains Relatively Steady, but Key Components are Costing More than Expected: NOAA's estimated life cycle cost for the GOES-R program has held relatively steady. The current life cycle cost estimate for the GOES-R program is $10.83 billion, slightly less than the $10.86 billion life cycle cost estimate from August 2013. The $30 million change in the life cycle cost estimate from last year is the net result of moving selected management functions outside the program and addressing program commitments affected by funding reductions associated with sequestration in late 2013. 
However, program data show that individual components are costing more than expected.[Footnote 16] Earned Value Management Data Show Key Components are Costing More than Expected: Federal agencies and private industry organizations often implement earned value management (EVM) as a tool for ensuring that work completed is on track with expected costs and schedules.[Footnote 17] EVM is a project management tool that, among other things, produces early warning signs of impending schedule slippages and cost overruns. Key EVM metrics include cost and schedule variances, which measure the value of work accomplished in a given period and compare it with the planned value of work scheduled for that period and with the actual cost of work accomplished. For example, an increase in cost variance means that the program spent more than expected to produce the work. An analysis of EVM data for three key components--the GOES ground system, the ABI instrument, and the GLM instrument--shows that each experienced a growing cost variance. Specifically, over the twelve-month period ending July 2014, the cost variance for the ground system increased to 8.4 percent of total cumulative budgeted cost, and for GLM, the cost variance increased to 5.1 percent. For a third key component, the ABI instrument, cost variance increased slightly from 2.4 to 2.6 percent. Figures 6, 7, and 8 show monthly cost variance data for the three key components for the year ending July 2014. Figure 6: Increases in Cost Variance for the Geostationary Operational Environmental Satellite-R Program's Ground System: [Refer to PDF for image: multiple line graph] Date: August 2013; Ground System Budget: $546,524; Ground System Budget Variance: $566,283; +3.6%. Date: September 2013; Ground System Budget: $558,749; Ground System Budget Variance: $577,803; +3.4%. Date: October 2013; Ground System Budget: $568,257; Ground System Budget Variance: $588,901; +3.6%. Date: November 2013; Ground System Budget: $577,931; Ground System Budget Variance: $599,717; +3.5%. Date: December 2013; Ground System Budget: $586,531; Ground System Budget Variance: $609,988; +4.0%. Date: January 2014; Ground System Budget: $597,368; Ground System Budget Variance: $622,462; +4.2%. Date: February 2014; Ground System Budget: $606,900; Ground System Budget Variance: $636,367; +4.9%. Date: March 2014; Ground System Budget: $615,805; Ground System Budget Variance: $653,067; +6.1%. Date: April 2014; Ground System Budget: $623,242; Ground System Budget Variance: $663,757; +6.5%. Date: May 2014; Ground System Budget: $631,622; Ground System Budget Variance: $676,080; +7.0%. Date: June 2014; Ground System Budget: $638,225; Ground System Budget Variance: $688,246; +7.8%. Date: July 2014; Ground System Budget: $642,661; Ground System Budget Variance: $696,835; +8.4%. Source: GAO analysis of NOAA data. GAO-15-60. [End of figure] Figure 7: Increases in Cost Variance for the Geostationary Operational Environmental Satellite-R Program's Advanced Baseline Imager Instrument: [Refer to PDF for image: multiple line graph] Date: August 2013; Advanced Baseline Imager budget: $627,320; Advanced Baseline Imager budget Variance: $647,370; +2.4%. Date: September 2013; Advanced Baseline Imager budget: $631,470; Advanced Baseline Imager budget Variance: $652,757; +2.5%. Date: October 2013; Advanced Baseline Imager budget: $635,645; Advanced Baseline Imager budget Variance: $657,796; +2.6%. 
Date: November 2013; Advanced Baseline Imager budget: $640,259; Advanced Baseline Imager budget Variance: $662,233; +2.7%. Date: December 2013; Advanced Baseline Imager budget: $644,253; Advanced Baseline Imager budget Variance: $666,505; +2.8%. Date: January 2014; Advanced Baseline Imager budget: $647,845; Advanced Baseline Imager budget Variance: $670,103; +2.8%. Date: February 2014; Advanced Baseline Imager budget: $652,041; Advanced Baseline Imager budget Variance: $674,160; +2.7%. Date: March 2014; Advanced Baseline Imager budget: $657,775; Advanced Baseline Imager budget Variance: $675,395; +2.7%. Date: April 2014; Advanced Baseline Imager budget: $662,317; Advanced Baseline Imager budget Variance: $679,987; +2.7%. Date: May 2014; Advanced Baseline Imager budget: $666,460; Advanced Baseline Imager budget Variance: $683,312; +2.5%. Date: June 2014; Advanced Baseline Imager budget: $670,054; Advanced Baseline Imager budget Variance: $687,494; +2.6%. Date: July 2014; Advanced Baseline Imager budget: $673,391; Advanced Baseline Imager budget Variance: $690,903; +2.6. Source: GAO analysis of NOAA data. GAO-15-60. [End of figure] Figure 8: Increases in Cost Variance for the Geostationary Operational Environmental Satellite-R Program's Geostationary Lightning Mapper Instrument: [Refer to PDF for image: multiple line graph] Date: August 2013; Geostationary Lightning Mapper budget: $196,886; Geostationary Lightning Mapper budget Variance: $197,185; +0.2%. Date: September 2013; Geostationary Lightning Mapper budget: $200,356; Geostationary Lightning Mapper budget Variance: $201,262; +0.5%. Date: October 2013; Geostationary Lightning Mapper budget: $201,977; Geostationary Lightning Mapper budget Variance: $204,394; +1.2%. Date: November 2013; Geostationary Lightning Mapper budget: $204,886; Geostationary Lightning Mapper budget Variance: $208,292; +1.7%. Date: December 2013; Geostationary Lightning Mapper budget: $207,097; Geostationary Lightning Mapper budget Variance: $211,277; +2.0%. Date: January 2014; Geostationary Lightning Mapper budget: $209,086; Geostationary Lightning Mapper budget Variance: $214,304; +2.5%. Date: February 2014; Geostationary Lightning Mapper budget: $212,277; Geostationary Lightning Mapper budget Variance: $217,122; +2.3%. Date: March 2014; Geostationary Lightning Mapper budget: $216,470; Geostationary Lightning Mapper budget Variance: $221,620; +2.4%. Date: April 2014; Geostationary Lightning Mapper budget: $218,539; Geostationary Lightning Mapper budget Variance: $224,966; +2.9%. Date: May 2014; Geostationary Lightning Mapper budget: $220,539; Geostationary Lightning Mapper budget Variance: $229,311; +4.0%. Date: June 2014; Geostationary Lightning Mapper budget: $223,114; Geostationary Lightning Mapper budget Variance: $234,190; +5.0%. Date: July 2014; Geostationary Lightning Mapper budget: $225,816; Geostationary Lightning Mapper budget Variance: $237,380; +5.1. Source: GAO analysis of NOAA data. GAO-15-60. [End of figure] Program officials stated that cost variances during this period for the ground system were primarily due to increased labor costs needed to complete the contract deliverables on schedule. Because the program is moving into integration and testing--the phase of development in which cost growth is most likely[Footnote 18]--cost variances will likely continue to increase. 
If these variances continue to increase, the program will need to use more of its contingency reserves to cover the costs, and the overall cost of the GOES-R program could increase. Inconsistencies in EVM Data Make Oversight More Difficult: Best practices in cost estimation and monitoring call for agencies to confirm that data are reliable and valid, and that the calculations for each cost element are correct and the results make sense.[Footnote 19] In reviewing EVM data for the GLM and ABI instruments, we found inconsistencies in the contractors' monthly and cumulative reports that made it more difficult for NOAA to effectively oversee the contractors' performance. Specifically, we found inconsistencies between cumulative and monthly budget totals in contractor performance reports that ranged from hundreds of thousands to millions of dollars. For instance, the cumulative amount of budget allocated to complete work for GLM between February 2013 and March 2013 increased by just under $2 million, while the stated monthly change was $3.8 million. Also, the cumulative amount of budgeted work accomplished for ABI between May 2013 and June 2013 increased by $3.2 million, while the stated monthly change was $5.4 million. Month-to-month discrepancies such as these occurred in each of 6 months for the GLM instrument and each of 9 months for the ABI instrument in the period between August 2013 and July 2014. Program officials stated that these issues were addressed in later contractor reports, and that program analysts communicate with contractors and program management regularly to resolve any discrepancies they find. However, more recent monthly reports continue to show discrepancies. If the instruments' cost data are unreliable, it is difficult for managers and program officials to make financial projections and assess reserve needs and usage. The GOES-R Program is Considering Reducing or Deferring Functionality on the Ground System: The GOES-R program is considering eliminating or deferring planned functionality on its ground system because of issues experienced during development. Specifically, deferring selected ground system functions would provide schedule relief in the case of further delays. Program and contracting officials recently revised the composition of the software releases to be delivered on the ground system. In doing so, the GOES-R program identified "off-ramps," or decision points, at which the program could remove a specific function or defer it from pre-launch to post-launch if it is not ready in time for testing. As of September 2014, officials identified 50 potential decision points for deferring functionality. To date, the program has decided to implement five of the deferrals, including one to remove the ability to play back information from alternate ABI data sets outside the GOES ground system, and another to remove a low-level navigation capability. In addition, program officials decided against deferring 30 of the functions, leaving 16 deferrals that could be implemented in the future. The off-ramps still under consideration include a reduction in the amount of verification and validation activities that will be conducted. NOAA Has Implemented a Defect Management Process, but Shortfalls Exist and Selected Critical Defects Remain Unresolved: A key element of a successful test phase is appropriately identifying and handling any defects or anomalies that are discovered during testing.
Key aspects of a sound defect management process include defect management planning, defect identification and classification, defect analysis, defect resolution, defect tracking, and defect trending and reporting. Leading industry and government organizations consider defect management and resolution to be among the primary goals of testing.[Footnote 20] These organizations have identified best practices for managing defects and anomalies that arise during system testing. Table 7 outlines the best practices of a sound defect management process. Table 7: Best Practices in Managing Defects: Category: Defect management planning; Best practice: Establish procedures to maintain reasonable assurance that significant defects (and anomalies) will be discovered before production system deployment. Best practice: Ensure defect management is a part of standard testing procedures. Best practice: Establish procedures that link test exit criteria to resolution of defects. Best practice: Establish realistic schedules to discover and resolve defects during testing. Best practice: Establish defect metrics (number and severity of defects, number of defects closed, duration, etc.). Best practice: Clearly define roles and responsibilities for testing staff and key stakeholders. Category: Defect identification and classification; Best practice: Classify each defect/anomaly using a series of descriptive attributes (type, severity, date of detection, etc.). Best practice: Assign a priority or severity category to each defect/anomaly. Category: Defect analysis; Best practice: Assess the impact of the defect on system operation and performance requirements, and ensure that defects are traceable back to original requirements. Best practice: Analyze defects and determine their source and root cause. Best practice: Evaluate the impact of defects, and the steps required to mitigate them, on resources (time, cost). Category: Defect resolution; Best practice: Decide how to address and resolve defects (make a fix, establish a workaround, require a waiver, etc.). Best practice: Ensure that defect resolution complies with testing exit criteria. Best practice: Perform quality assurance to confirm the implementation of corrective actions, defect repairs, and preventive actions. Category: Defect tracking; Best practice: Track defect dispositions, rationale, and outcome of resolution activities. Best practice: Track defect-related metrics, such as number of defects open, closed, and duration (both in total and by severity). Best practice: Monitor the risks of open high-priority defects throughout the project. Category: Defect trending and reporting; Best practice: Track and analyze defect trends over time, such as defect density and severity. Best practice: Report key defects to management and stakeholders. Best practice: Management should analyze information on reported problems, and should ensure the impact of any changes is determined, controlled, and monitored. Source: GAO analysis of best practices identified by leading government and industry organizations. GAO-15-60. [End of table] The GOES-R program has sound defect management policies in place and it, along with its contractors, is actively performing defect management activities. Specifically, the program has fully satisfied 13 of the 20 best practices, and partially satisfied the remaining 7 practices. For example, the program has defect management procedures in place as part of its overall testing program.
Defect management is incorporated in the program's mission assurance, configuration management, and verification/validation functions. In addition, for the three key components we reviewed--ABI, GLM, and the ground system--defects are entered into automated systems, from which they are analyzed and resolved. The program also tracks defect totals and metrics and regularly reports them to NOAA management weekly and monthly. However, there are several areas in which defect management policies and practices are inconsistent, including in performing and recording information pertinent to individual defects, and in reporting and tracking defect information. Table 8 provides an assessment of how the GOES-R program and key contractors performed on each of the best practices, and is followed by a more detailed discussion of shortfalls. Table 8: Assessment of Geostationary Operational Environmental Satellite-R Program Practices in Managing Defects: Category: Defect management; Best practice: Establish procedures to maintain reasonable assurance that significant defects (and anomalies) will be discovered before production system deployment; Assessment: Fully satisfied; Description: NOAA has key procedures in place that provide assurance that defects will be discovered before system deployment. These include mission assurance, configuration management, and verification and validation procedures. In addition, individual program components are required to establish procedures for reporting nonconformance, discrepancies, defects, and anomalies. Best practice: Ensure defect management is a part of standard testing procedures; Assessment: Fully satisfied; Description: Defect management is a part of NOAA's overall testing approach. For example, mission assurance and engineering management plans explain that testing results will be reviewed for both accuracy and compliance with mission objectives, and that processes and systems will be in place to report and resolve defects and anomalies. Further, NOAA established processes for reporting defects identified during testing. Best practice: Establish procedures that link test exit criteria to resolution of defects; Assessment: Fully satisfied; Description: NOAA program procedures call for ensuring the traceability of requirements and mission objectives throughout the testing phase, and assign staff to perform oversight of testing. Program documentation traces exit criteria and results to mission-level objectives to ensure changes are tracked and controlled throughout the project life cycle. Best practice: Establish realistic schedules to discover and resolve defects during testing; Assessment: Partially satisfied; Description: NOAA established a schedule for the GOES-R program's integration and testing phase, which includes time for defect discovery and resolution. However, the schedule was not realistic. The program has delayed major milestones several times due to issues discovered during testing and has compressed its overall testing period. Best practice: Establish defect metrics (number and severity of defects, number of defects closed, duration, etc.); Assessment: Partially satisfied; Description: NOAA established metrics for tracking defects on its software components, including number and severity. However, NOAA did not establish requirements for metrics in tracking defects in its hardware components. Moreover, NOAA did not establish a consistent approach for classifying the priority and severity of defects.
As a result, metrics and methodologies for classifying defects are not consistent between contractors or subcontractors. Best practice: Clearly define roles and responsibilities for testing staff and key stakeholders; Assessment: Fully satisfied; Description: NOAA and contractor procedures establish and describe the key roles and responsibilities of individuals and groups that are involved in integration and testing phase activities. Category: Defect identification and classification; Best practice: Classify each defect/anomaly using a series of descriptive attributes (type, severity, date of detection, etc.); Assessment: Fully satisfied; Description: All three components we reviewed performed this practice. For ABI, GLM, and the ground system, contractors fully classified each defect with a range of descriptive attributes such as a defect's status, description, and the activity during which it occurred. Category: Defect identification and classification; Best practice: Assign a priority or severity category to each defect/anomaly; Assessment: Partially satisfied; Description: For the ground system, contractors assigned a priority category to each defect or anomaly. For ABI and GLM, individual defect reports provide qualitative information on priority and severity, but there is no standard rating scale conversion for these attributes. Category: Defect analysis; Best practice: Assess the impact of the defect on system operation and performance requirements, and ensure that defects are traceable back to original requirements; Assessment: Fully satisfied; Description: For all three components we reviewed, contractors discussed potential impact in individual defect reports and traced defects back to original requirements in cases where defects affected those requirements. Category: Defect analysis; Best practice: Analyze defects and determine their source and root cause; Assessment: Fully satisfied; Description: For all three components, contractors analyzed defects and determined their root cause. Category: Defect analysis; Best practice: Evaluate the impact of defects, and the steps required to mitigate them, on resources (time, cost); Assessment: Partially satisfied; Description: For the ground system, contractors evaluated the impact of defects and defect mitigation steps on resources. On the ABI and GLM instruments, contractors did not evaluate the impact of defects on resources. Moreover, NOAA did not provide evidence that it assessed the impact of key defects on resources. Category: Defect resolution; Best practice: Decide how to address and resolve defects (make a fix, establish a workaround, require a waiver, etc.); Assessment: Fully satisfied; Description: All three program components have a disposition or resolution plan in place in order to close each defect. Category: Defect resolution; Best practice: Ensure that defect resolution complies with testing exit criteria; Assessment: Fully satisfied; Description: For all three program components, there are records of exit testing compliance for individual defects. Also, relevant stakeholders must sign off before a successful test result can be achieved. Category: Defect resolution; Best practice: Perform quality assurance to confirm the implementation of corrective actions, defect repairs, and preventive actions; Assessment: Fully satisfied; Description: For all three program components, contractors provided evidence of quality assurance steps taken to address defects.
Category: Defect tracking; Best practice: Track defect dispositions, rationale, and outcome of resolution activities; Assessment: Fully satisfied; Description: All three components tracked dispositions, corrective actions, and outcomes of defects. This was done either in defect reports or at the level of defect review boards. Contractors also maintain additional analysis reports and tracking systems. Category: Defect tracking; Best practice: Track defect-related metrics, such as number of defects open, closed, and duration (both in total and by severity); Assessment: Fully satisfied; Description: For the three components we reviewed, contractors tracked defect totals and metrics, such as number of defects open and closed. They also tracked information on the hardware or software severity rating for each defect. Category: Defect tracking; Best practice: Monitor the risks of open high-priority defects throughout the project; Assessment: Fully satisfied; Description: For the three components we reviewed, contractors tracked and reviewed all open high-priority defects. Category: Defect trending and reporting; Best practice: Track and analyze defect trends over time, such as defect density and severity; Assessment: Partially satisfied; Description: NOAA's prime contractors on the flight and ground projects track, analyze, and provide trend data (including defect density) for both hardware and software defects. They also record information about defect root cause, life cycle, and corrective actions. In addition, the contractors for flight instruments track defect metrics and statistics. However, because NOAA has not specified consistent metrics or defined how to assess the severity of defects, the contractor data do not lend themselves to programwide analysis. Category: Defect trending and reporting; Best practice: Report key defects to management and stakeholders; Assessment: Partially satisfied; Description: Both flight and ground project contractors report defect trend information and metrics to the GOES-R program and an independent NOAA mission assurance group monthly, and the mission assurance group reports on defects both weekly and monthly to NOAA management. However, the information reported to NOAA is not comprehensive or comparable because each of the contractors maintains its own disparate reporting standards and metrics. For example, on the ground system, the contractor tracks and reports defect totals split out monthly by type and severity; on GLM, only software defects are reported monthly, with totals not split out by severity; and on ABI, high- and low-priority defects are tracked and reported in different ways and for different lengths of time. Category: Defect trending and reporting; Best practice: Management should analyze information on reported problems, and should ensure the impact of any changes is determined, controlled, and monitored; Assessment: Partially satisfied; Description: NOAA management provides oversight of key defects and the resolution of those defects during monthly program management meetings. However, managers were not briefed on all longstanding high-risk defects. Specifically, none of the 14 high-risk defects we assessed that had remained unresolved for more than 12 months were included in monthly briefings. Source: GAO analysis of NOAA data. GAO-15-60.
[End of table] Among the shortfalls seen in table 8 are a number of cross-cutting themes: * Variation among contractors in managing and reporting metrics: The GOES-R program affords its contractors and subcontractors wide latitude in making decisions on how to manage, track, and report defects, which results in variation among program components. For example, the program established a minimum set of metrics that must be reported and recorded for each software defect, but has not done so for hardware defects; a notional sketch of what a consistent, program-wide defect record could include follows this list. * No clear definition of defects: In its guidance, NOAA did not fully define the terminology of defect management. As a result, the program and contractors use terms such as defect, anomaly, nonconformance, incident reports, and trouble reports without explaining clearly how they are related to each other. Without understanding these relationships, variations from expected performance are likely to be treated differently throughout the program. For example, some issues that occur after formal integration and testing are complete are not considered defects, which means that they are not all reported in the statistics provided to program management. In another case, an issue with instrument data algorithms was uncovered during user testing and required rework; however, it was not counted as a defect. Program officials stated that this was a documentation issue and noted that they do not track documentation issues as defects. However, the issue affected more than documentation. It was addressed by reworking the software algorithms; thus, it should have been tracked as a software defect. * No clear definition of priority/severity: In its contract requirements and other guidance documents, the GOES-R program did not provide specific guidance to its contractors on how to prioritize defects or establish their severity. At most, documents list a severity classification as one of many defect attributes that contractors should review as necessary. A program official explained that, for hardware defects, there is no programwide policy for defect metrics, including priority or severity. As a result, the information tracked, trended, and reported to management on defect totals also varied by contractor, and thus often by instrument or component. * Unrealistic testing schedule: Effective defect management requires a realistic schedule because it takes time to fully identify, analyze, prioritize, resolve, and track defects. However, in May 2014, an internal NOAA report stated that the current program testing schedule is "unrealistically aggressive." The GOES-R program has chosen to compress its remaining testing schedule, which increases the risk that there will not be sufficient time to address defects before moving to the next stage of testing. Effects of this type can already be seen; the program reported that resolution of some longstanding open defects has been delayed because all available resources were needed to resolve newly identified defects. * Limited trend analysis: While the program and NOAA's mission assurance team have analyzed contractor-provided trends in the volume and severity of defects over time for the ground system and spacecraft, they do not routinely analyze trend data for all components. For example, on the ABI and GLM instruments, the program and mission assurance team do not analyze trend data monthly for hardware defects. Instead, these groups only assess individual defect reports for this subset of defects.
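To illustrate the kind of consistency called for above, the following sketch shows one way a program-wide defect record and a small set of summary metrics could be structured. It is a minimal, notional example: the field names, the four-level severity scale, and the sample records are assumptions made for illustration only and do not reflect NOAA's or any contractor's actual tracking systems.

from collections import Counter
from dataclasses import dataclass
from datetime import date
from enum import Enum
from typing import List, Optional

class Severity(Enum):
    # Hypothetical four-level scale; no programwide scale has actually been defined.
    CRITICAL = 1
    MODERATE = 2
    MINOR = 3
    DOCUMENTATION = 4

@dataclass
class DefectRecord:
    # Minimum attributes the best practices call for: identity, classification,
    # severity, and the dates needed to compute duration metrics.
    defect_id: str
    component: str            # e.g., "ground system", "ABI", "GLM"
    kind: str                 # "hardware" or "software"
    severity: Severity
    opened: date
    closed: Optional[date] = None   # None means the defect is still open

    @property
    def is_open(self) -> bool:
        return self.closed is None

def summarize(defects: List[DefectRecord]) -> dict:
    """Roll up totals, open/closed counts, and open counts by severity."""
    open_by_severity = Counter(d.severity.name for d in defects if d.is_open)
    return {
        "total": len(defects),
        "open": sum(d.is_open for d in defects),
        "closed": sum(not d.is_open for d in defects),
        "open_by_severity": dict(open_by_severity),
    }

if __name__ == "__main__":
    # Illustrative sample records only.
    sample = [
        DefectRecord("GS-0001", "ground system", "software", Severity.CRITICAL,
                     date(2014, 3, 5)),
        DefectRecord("GS-0002", "ground system", "software",
                     Severity.DOCUMENTATION, date(2014, 1, 10),
                     closed=date(2014, 6, 2)),
        DefectRecord("GLM-0017", "GLM", "hardware", Severity.MODERATE,
                     date(2014, 5, 20)),
    ]
    print(summarize(sample))

A consistent record of this kind, applied to both hardware and software defects across all contractors, would also make the programwide trend analysis discussed below easier to produce.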
It is important for the program to assess defect trends over time in order to provide management with a more complete picture of testing status. Assessing trends in defect handling can result in better allocation of resources to the components of greatest need and more attention to the effectiveness of resolving defects. Although the GOES-R program has defect management policies in place and its contractors are actively tracking, analyzing, and reporting on defects, the discrepancies between contractors' data could cause issues in completing the remainder of the program's integration and testing period. For example, without consistent defect metrics, it is more difficult for managers to obtain a complete picture of the status of open, closed, and high-priority defects across the program. Program officials stated that the program did not contractually specify any best practices, but that it assumed that contractors with high-level professional certifications would employ all practices necessary for the development effort. Unless the program can find a way to unite these disparate approaches to provide consistency in the methods by which defects are identified, prioritized, captured, and tracked, it will be more difficult for management to analyze and understand trends in opening and closing high-priority defects or to make decisions on how best to resolve the defects. Moreover, until the program addresses shortfalls in its defect management processes, it may not have a complete picture of remaining defects and runs the risk of not having sufficient time to resolve them. Data Show a High Number of Unresolved Defects for Selected Key Components: While effectively and efficiently addressing and resolving defects is an important part of a sound system testing approach, the GOES-R program has not efficiently closed defects on selected components. As noted earlier, the program does not obtain or maintain defect trend data for the program as a whole; however, data on individual components and portions of components show that a large number of defects remain open, including several high-priority defects. Specifically, data for the GOES ground system show that 500 defects remained open as of September 2014, including 36 high-priority defects. Defect data for the spacecraft show that it is taking an increasing amount of time to close hardware-related defects, but that the program is making progress in closing software defects. Specifically, as of April 2014, 42 software and 332 hardware defects were unresolved. Defect totals for the GOES instruments declined as the instruments approached the deadline for completion so that they could be integrated with the spacecraft. Specifically, for the GLM instrument, the total number of both hardware and software defects declined. Hardware defects declined from 117 in May 2014 to 13 in September 2014, and there were no remaining software defects by January 2014. For the ABI instrument, the number of newly opened hardware and software defects each month has declined over time, with only one unresolved defect remaining in July 2014. Table 9 depicts summary information on defects for selected components at different points in time. In addition, appendix II provides more information on defect trends for these components.
Table 9: Summary of Defects for Selected Geostationary Operational Environmental Satellite-R Program Components: Component: Ground Segment, as of September 2014; Total defects: 5,686; Resolved defects: 5,186; Unresolved defects: 500; Unresolved high-priority defects: 36. Component: Spacecraft (software), as of April 2014; Total defects: 3,420; Resolved defects: 3,378; Unresolved defects: 42; Unresolved high-priority defects: Not identified. Component: Spacecraft (hardware), as of April 2014; Total defects: Not identified; Resolved defects: Not identified; Unresolved defects: 332; Unresolved high-priority defects: Not identified. Component: Geostationary Lightning Mapper (software), as of January 2014; Total defects: 54; Resolved defects: 54; Unresolved defects: 0; Unresolved high-priority defects: 0. Component: Geostationary Lightning Mapper (hardware), as of September 2014; Total defects: 152; Resolved defects: 139; Unresolved defects: 13; Unresolved high-priority defects: Not identified. Component: Advanced Baseline Imager, as of July 2014; Total defects: 333; Resolved defects: 332; Unresolved defects: 1; Unresolved high-priority defects: Not identified. Source: GAO analysis of NOAA data. GAO-15-60. [End of table] Program officials noted that in some cases, because of time constraints, lower-priority defects that have been determined not to have a major effect on performance are not closed until later in the testing process. Also, program officials have stated that they are having difficulty closing defect-related incident reports due to insufficient manpower. Until the program reduces the number of longstanding unresolved defects, it faces an increased risk of further delays to the GOES-R launch date should an open defect affect future performance. Facing a Gap in Backup Satellite Coverage, the GOES-R Program Has Improved Contingency Plans Though Shortfalls Remain: GOES satellite data are considered a mission-essential function because of their criticality to weather observations and forecasts. These forecasts--such as those for severe storms, hurricanes, and tornadoes--can have a substantial impact on our nation's people, infrastructure, and economy. Because of the importance of GOES satellite data, NOAA's policy is to have two operational satellites and one backup satellite in orbit at all times. This policy proved useful in December 2008 and again in September 2012, when the agency experienced problems with one of its operational satellites, but was able to move its backup satellite into place until the problems had been resolved. However, NOAA is facing a period of up to 17 months when it will not have a backup satellite in orbit. Specifically, in April 2015, NOAA expects to retire one of its operational satellites (GOES-13) and to move its backup satellite (GOES-14) into operation. Thus, the agency will have only two operational satellites in orbit--and no backup satellite--until GOES-R is launched and completes an estimated 6-month post-launch test period. If GOES-R is launched in March 2016, the earliest it could be available for operational use would be September 2016. Figure 9 shows the potential gap in backup coverage, based on the launch and decommission dates of GOES satellites. Figure 9: Potential Gap in Geostationary Operational Environmental Satellite Coverage, as of April 2014: [Refer to PDF for image: timeline] Satellite: GOES-13; Available as backup: 2009-2010 Q1; Operational period: 2010 Q1-2015 Q1; Projected gap in backup coverage: 2015 Q1-2016 Q4.
Satellite: GOES-14; Launch date[A]: 2009 Q3; Post launch test period: 2009 Q3-2010 Q1; Available as backup: 2010 Q1-2015 Q1; Operational period: 2015 Q1-2020 Q1; Projected gap in backup coverage: 2015 Q1-2016 Q4. Satellite: GOES-15; Launch date[A]: 2010 Q2; Post launch test period: 2010 Q2-2010 Q4; Available as backup: 2010 Q4-2012 Q1; Operational period: 2012 Q1-2017 Q1; Projected gap in backup coverage: 2015 Q1-2016 Q4. Satellite: GOES-R; Launch date[A]: 2016 Q2; Post launch test period: 2016 Q2-2016 Q4; Available as backup: 2015 Q4-2017 Q1; Operational period: 2017 Q1-2023 and beyond; Projected gap in backup coverage: 2015 Q1-2016 Q4. Satellite: GOES-S; Launch date[A]: 2017 Q3; Post launch test period: 2016 Q3-2018 Q1; Available as backup: 2018 Q1-2020 Q2; Operational period: 2020 Q2-2023 and beyond; Projected gap in backup coverage: 2015 Q1-2016 Q4. Source: GAO analysis of NOAA data. GAO-15-60. [A] The GOES-R and GOES-S launch dates reflect the end of the quarters listed in NOAA's latest launch estimates. [End of figure] During the time in which no backup satellite would be available, there is a greater risk that NOAA would need to rely on older satellites that are beyond their expected operational lives and may not be fully functional, request foreign satellite coverage, or operate with only a single operational satellite. Agency officials stated that the risk of a gap may be reduced because NOAA satellites have historically remained in operation longer than their expected lives. While many satellites outlive their expected lifespans, the current GOES satellites are operating with reduced functionality, and one has experienced two major outages. Without a full complement of operational GOES satellites, the nation's ability to maintain the continuity of data required for effective weather forecasting could be compromised. This, in turn, could put the public, property, and the economy at risk. Any delay to the GOES-R launch date would extend the time without a backup to more than 17 months. As discussed earlier in this report, further delays to the committed launch date of the first GOES-R satellite are possible due to continued technical issues encountered during testing and integration. GOES-R Satellite Contingency Plan Shows Improvement, but Continues to Lack Details in Key Areas: Government and industry best practices call for the development of contingency plans to maintain an organization's essential functions--such as GOES satellite data--in the case of an adverse event.[Footnote 21] In September 2013, we reported on weaknesses in the contingency plans for NOAA's satellites.[Footnote 22] At that time, we compared NOAA's plans to 17 best practices associated with three main areas: identifying failure scenarios and impacts, developing contingency plans, and validating and implementing contingency plans. We reported that while NOAA identified failure scenarios, recovery priorities, and minimum levels of acceptable performance, the satellite contingency plan contained areas that fell short of best practices, such as working with the user community to account for potential reductions in capability under contingency operations. Furthermore, the agency did not identify alternative solutions or timelines for preventing a GOES-R launch delay. In February 2014, NOAA released a new satellite contingency plan in response to these recommendations. This plan improved in comparison to many, but not all, of the best practices.
Specifically, the plan improved in 6 areas and stayed the same in 4 areas. Table 10 compares our assessment of the current satellite contingency plan with our September 2013 analysis for all best practices that were not fully met. Table 10: Assessment of Satellite Contingency Plans for the Geostationary Operational Environmental Satellite-R Program over Time: Category: Identifying failure scenarios and impacts; Key element: Define likely failure scenarios; September 2013 assessment: Fully implemented; September 2014 assessment: Not applicable. Key element: Conduct impact analyses showing impact of failure scenarios on business processes and user requirements; September 2013 assessment: Not implemented; September 2014 assessment: Partially implemented; Description: The plan now requires the NOAA Satellite Operations Command Center to notify users regarding impacts and outages. However, it does not specify if or how impact analyses should be conducted. Key element: Define minimum acceptable level of outputs and recovery time objectives, and establish resumption priorities; September 2013 assessment: Fully implemented; September 2014 assessment: Not applicable. Category: Developing contingency plans; Key element: Define roles and responsibilities for implementing contingency plans; September 2013 assessment: Partially implemented; September 2014 assessment: Fully implemented; Description: For each contingency scenario in the plan, roles and responsibilities are prominently listed. Detailed information on responsibilities is provided for each action in the plan. Key element: Identify alternative solutions to address failure scenarios; September 2013 assessment: Partially implemented; September 2014 assessment: Partially implemented; Description: Solutions for each failure scenario are listed, including activation of an on-orbit spare or use of a foreign satellite, and shortening the on-orbit test period of a recently launched satellite. However, the plan does not address alternative solutions for preventing a launch delay. Key element: Select contingency strategies from among alternatives based on costs, benefits, and impacts; September 2013 assessment: Partially implemented; September 2014 assessment: Partially implemented; Description: The satellite plan describes specific alternative actions that would be implemented depending on the circumstances of a scenario, including conducting impact analyses and reporting procedures. However, the plan does not address strategies for preventing a delay in launch. Key element: Develop "zero-day" procedures; September 2013 assessment: Partially implemented; September 2014 assessment: Fully implemented; Description: Zero-day events are defined, as are procedures for action if a zero-day event occurs. A subordinate policy document called out in the satellite contingency plan describes in more detail the actions that will be taken immediately should an event occur, and states that "at no point should immediate fix actions… be delayed." Key element: Define actions needed to implement contingency strategies; September 2013 assessment: Partially implemented; September 2014 assessment: Fully implemented; Description: The satellite plan describes detailed steps and actions for each major contingency strategy, including which groups should perform each action.
Key element: Define and document triggers and timelines for enacting the actions needed to implement contingency plans; September 2013 assessment: Partially implemented; September 2014 assessment: Partially implemented; Description: The satellite plan is specific in giving timelines and triggers for actions. For example, the plan calls for user notification procedures and a transition to full-disk imagery to begin after one hour of data loss, and other steps within 24 and 48 hours. However, the plan does not discuss triggers or timelines related to preventing a delay in satellite launch. Key element: Ensure that steps reflect priorities for resumption of products and recovery objectives; September 2013 assessment: Partially implemented; September 2014 assessment: Fully implemented; Description: The satellite plan describes NOAA priorities and those of its component office, the National Environmental Satellite Data and Information Service. The plan then describes how the actions in the remainder of the plan are related to those priorities, in particular "restor(ing) the health of the satellite, and/or the flow of data." The plan also shows objectives for recovery of service. Key element: Designated officials review and approve contingency plan; September 2013 assessment: Fully implemented; September 2014 assessment: Not applicable. Category: Validating and implementing contingency plans; Key element: Identify steps for testing contingency plans and conducting training exercises; September 2013 assessment: Fully implemented; September 2014 assessment: Not applicable. Key element: Prepare for and execute tests; September 2013 assessment: Fully implemented; September 2014 assessment: Not applicable. Key element: Execute applicable actions for implementation of contingency strategies; September 2013 assessment: Fully implemented; September 2014 assessment: Not applicable. Key element: Validate test results for consistency against minimum performance levels; September 2013 assessment: Partially implemented; September 2014 assessment: Partially implemented; Description: The satellite plan describes a system of regular test maneuvers performed each year. The plan also describes how these actions would be used in the event of an anomaly in order to meet operational priorities. However, there is no indication that any of the metrics included in the plan are considered minimum performance levels in the event of an anomaly. Key element: Communicate and coordinate with stakeholders to ensure that contingency strategies remain optimal for reducing potential impacts; September 2013 assessment: Partially implemented; September 2014 assessment: Fully implemented; Description: The satellite plan details methods for communicating and coordinating with stakeholders, including actions and responsibilities for notifying users in the event of an anomaly that would affect performance. A subordinate policy document called out in the satellite plan provides greater detail on expected user communications, stating that the Office of Satellite Product Operations should facilitate a complete and accurate data exchange among all parties on the potential effects of an anomaly. This document also states that, typically within 2 hours, the office should inform stakeholders of unplanned interruptions to mission-critical ground system IT services, product outages, product delays, or reductions in service quality.
Key element: Update and maintain contingency plans as warranted; September 2013 assessment: Fully implemented; September 2014 assessment: Not applicable. Source: GAO analysis of NOAA documentation. GAO-15-60. [End of table] Program officials stated that it is not feasible to include strategies for preventing delays in the launch of the first GOES-R satellite in the contingency plan because such strategies are not static. They explained that options for preventing a delay vary greatly over time based on issues that occur during the satellite's development. They further stated that the program's monthly presentations to NOAA management provide a summary of current threats to the launch date and strategies to mitigate those threats. For example, NOAA is considering whether to remove selected ground system functions or defer them until after launch, and the program provides monthly updates on this issue. While actively managing the program to avoid a delay is critical, it is also important that NOAA management and the GOES-R program consider and document feasible alternatives for avoiding or limiting such a launch delay. This will allow stakeholders throughout NOAA to be aware of, respond to, and plan for the potential implementation of each alternative, not only the small number of alternatives the program is actively considering in any given month. Until NOAA addresses the remaining shortfalls in its GOES-R gap mitigation plan, the agency cannot be assured that it is exploring all alternatives or that users are able to effectively prepare to receive GOES information in the event of a failure. Conclusions: After spending 10 years and just over $5 billion, the GOES-R program is nearing the launch of its first satellite. However, it continues to face challenges in maintaining its schedule and controlling its costs. The program continues to experience delays in remaining major milestones, which could result in further delays to the launch date. Costs are increasing faster than expected for key program components,[Footnote 23] and contractor data are often inconsistent from month to month. Until the agency ensures its contractor cost data are consistent, it will be more difficult for managers and program officials to make financial projections and assess reserve needs and usage. As the GOES-R program progresses through its testing and integration phase, it is essential that the program be able to appropriately handle defects that arise during testing. While NOAA and its contractors have implemented a defect management process that is successful in many areas, there are shortfalls in how the program defines defects, monitors trends, and reports on defects and defect metrics. In particular, NOAA has not established a standard set of metrics for all defects, including the dates on which defects were identified and resolved, or an indication of a defect's severity. Also, NOAA did not clearly define the type of issue that constitutes a defect. Furthermore, multiple defects remain open on some program components, including several that have remained open for more than six months. Until the program addresses these shortfalls and reduces the number of open defects, it may not have a complete picture of remaining issues and faces an increased risk of further delays to the GOES-R launch date. NOAA could experience a gap in satellite data coverage if GOES-R is delayed further and one of the two remaining operational satellites experiences a problem.
NOAA has made improvements to its satellite contingency plan, but the plan still does not sufficiently address mitigation options for a launch delay, potential impacts, or minimum performance levels. Until such information is available, it will be difficult to integrate mitigation efforts, or to coordinate with users in the event of a failure. Recommendations for Executive Action: To address risks in the GOES-R program development and to help ensure that the satellite is launched on time, we are making the following four recommendations to the Secretary of Commerce. Specifically, we recommend that the Secretary of Commerce direct the NOAA Administrator to: * investigate and address inconsistencies totaling hundreds of thousands of dollars in monthly earned value data reporting for the GLM and ABI instruments; * address shortfalls in defect management identified in this report, including the lack of clear guidance on defect definitions, what defect metrics should be collected and reported, and how to establish a defect's priority or severity; and: * reduce the number of unresolved defects on the GOES ground system and spacecraft. In addition, because NOAA has not fully implemented our prior recommendation to improve its satellite gap mitigation plan, we recommend that the Secretary of Commerce direct the NOAA Administrator to: * add information to the GOES satellite contingency plan on steps planned or underway to mitigate potential launch delays, the potential impact of failure scenarios in the plan, and the minimum performance levels expected under such scenarios. Agency Comments and Our Evaluation: We sought comments on a draft of our report from the Department of Commerce and NASA. We received written comments from the Deputy Secretary of Commerce transmitting NOAA's comments. NOAA concurred with all four of our recommendations and identified steps that it plans to take to implement them. It also provided technical comments, which we have incorporated into our report, as appropriate. NOAA's comments are reprinted in appendix III. On November 14, 2014, an audit liaison for NASA provided an e-mail stating that the agency would provide any input it might have to NOAA for inclusion in NOAA's comments. As agreed with your offices, unless you publicly announce the contents of this report earlier, we plan no further distribution until 30 days from the report date. At that time, we will send copies to interested congressional committees, the Secretary of Commerce, the Administrator of NASA, the Director of the Office of Management and Budget, and other interested parties. The report also will be available at no charge on the GAO website at [hyperlink, http://www.gao.gov]. If you or your staff have any questions on the matters discussed in this report, please contact me at (202) 512-9286 or at pownerd@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made major contributions to this report are listed in appendix IV. Signed by: David A. 
Powner: Director, Information Technology Management Issues: [End of section] Appendix I: Objectives, Scope, and Methodology: Our objectives were to (1) assess progress on the GOES-R program with respect to planned schedule, cost, and functionality; (2) assess efforts to identify and address issues discovered during integration and testing; and (3) evaluate the likelihood of a gap in satellite coverage and analyze the adequacy of contingency actions in place to prevent or mitigate such a gap. To assess NOAA's progress on the GOES-R program with respect to planned schedule, cost, and functionality, and to identify risks that could lead to further schedule delays, we analyzed data from monthly program management meetings. We evaluated progress made in completing key program components and major program reviews, and compared planned and actual completion dates for key program milestones over the last two to three years to determine the degree to which these dates have changed. To ensure that the program's schedule data were consistent and reliable, we compared milestone data over several months and contacted agency officials to corroborate events that occurred over the course of our engagement. We analyzed earned value management (EVM) data to compare levels of cost variance over time for key program components, and calculated earned value management metrics using program cost performance reports. To ensure that the program's cost data were reliable, we compared formulas from the GAO Cost Guide [Footnote 24] to the program's EVM approach. We also compared EVM data across a series of monthly program cost performance reports. In doing so, we found inconsistencies in the monthly EVM data for selected components, and reported on those inconsistencies in this report. However, the data were sufficient for our purpose of assessing overruns because these inconsistencies were small in comparison to cost variances. We assessed recently enacted and potential changes in functionality for key program components. We also interviewed program officials regarding changes in schedule milestones, cost performance and reserve funding, and functionality. To assess efforts to identify and address issues discovered during integration and testing, we identified best practices in defect management from leading industry and government organizations. [Footnote 25] We compared NOAA policy documents and defect management artifacts by the GOES program, its contractors, and an independent NOAA mission assurance group to the best practices. We selected three components, which are critical to the program's mission requirements, and evaluated recent test results and defects.[Footnote 26] Specifically, we identified two recent tests for each component and analyzed artifacts associated with defects identified during those tests. We compared the defects against each best practice to determine if NOAA and its contractors fully implemented, partially implemented, or did not implement the best practices. The agency and contractor had to meet all aspects of the best practice to achieve a fully implemented score, some aspects to achieve a partially implemented score, and no aspects to achieve a not implemented score. To identify the program's recent performance in closing and managing defects, we analyzed basic defect trend information--such as number of defects opened and closed, and defect severity--for each of several program components. We developed charts to demonstrate defect trends. 
To ensure the data were reliable, we compared defect data from individual defect reports to agency trend charts, and sought corroboration from agency and contractor officials. We also interviewed agency and contractor officials to discuss their defect management processes and practices, and to confirm information we found while analyzing individual defect reports and trend charts. To evaluate the likelihood of a gap in satellite coverage, we reviewed monthly program management presentations and other review board documentation. To analyze the adequacy of contingency actions in place to prevent or mitigate such a gap, we compared NOAA's latest satellite contingency plan to best practice criteria from industry and government.[Footnote 27] We focused specifically on the ten areas we identified as weak in our prior report.[Footnote 28] For each of the ten areas, we rated NOAA's contingency plan as having partially or fully implemented the best practice criteria. In addition, we interviewed agency officials regarding the likely duration of a gap in on-orbit backup coverage of GOES satellites, the potential for a gap in operational coverage, and efforts to improve the GOES contingency plan. We conducted this performance audit from January 2014 to December 2014, in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. [End of section] Appendix II: Defect Trends for Key GOES-R Components: As noted earlier, the GOES-R program does not obtain or maintain defect trend data for the program as a whole; however, data on individual components and portions of components show that a large number of defects remain open, including several high-priority defects. The following sections provide more details on recent data trends for the GOES-R ground system and spacecraft as well as the ABI and GLM instruments. It is important to note that the sections do not provide consistent information among components because the program does not require consistent metrics and the contractors document different data. Ground System Has Many Unresolved Defects: As of the end of September 2014, the ground system had 500 open defects, an increase from 342 open defects at the end of September 2013. Also, as of the end of September 2014, 96 defects that were identified prior to January 2014 remained open, and 36 high-priority defects (those rated as critical or moderate) remained open. Program and contractor officials provided rationale and insight into the mitigating circumstances surrounding these defects. Program officials stated that the number of open defects during the integration and test phase is what would be expected for a ground system of this magnitude. Contractor officials reported that none of the 96 longstanding defects were in either of the two highest severity categories. They also noted that most of the open high-priority defects had been opened during testing events conducted over the previous 3 months, which is a reasonable length of time to close a defect. Furthermore, according to contractor officials, many of the longstanding open defects were in categories, such as documentation-related defects, that are considered less severe.
Officials also stated that defects in these categories are often kept open for specific reasons, such as to gain cost efficiencies or to wait for an already-planned test event rather than creating a new test event. However, as of September 2013, 193 of the 342 defects were not related to documentation and this number rose to 416 of the 500 open defects in September 2014. Figure 10 shows the increase in open defects in the period from September 2013 to August 2014 for the GOES-R ground system. Figure 10: Number of Open and Closed Defects for the Geostationary Operational Environmental Satellite-R Program's Ground System, as of August 2014: [Refer to PDF for image: vertical bar graph] Month of defect: September 2013; Defects newly opened: 114; Defects closed that month: 245; Defects remaining open: 342. Month of defect: October 2013; Defects newly opened: 149; Defects closed that month: 113; Defects remaining open: 378. Month of defect: November 2013; Defects newly opened: 143; Defects closed that month: 118; Defects remaining open: 403. Month of defect: December 2013; Defects newly opened: 150; Defects closed that month: 88; Defects remaining open: 465. Month of defect: January 2014; Defects newly opened: 120; Defects closed that month: 93; Defects remaining open: 492. Month of defect: February 2014; Defects newly opened: 159; Defects closed that month: 134; Defects remaining open: 517. Month of defect: March 2014; Defects newly opened: 194; Defects closed that month: 113; Defects remaining open: 598. Month of defect: April 2014; Defects newly opened: 191; Defects closed that month: 247; Defects remaining open: 542. Month of defect: May 2014; Defects newly opened: 149; Defects closed that month: 193; Defects remaining open: 498. Month of defect: June 2014; Defects newly opened: 136; Defects closed that month: 142; Defects remaining open: 492. Month of defect: July 2014; Defects newly opened: 130; Defects closed that month: 110; Defects remaining open: 512. Month of defect: August 2014; Defects newly opened: 192; Defects closed that month: 223; Defects remaining open: 481. Source: GAO analysis of NOAA data. GAO-15-60. [End of figure] Spacecraft Trends Show Increasing Time to Address Hardware Defects While the Number of Software Defects Declined: As allowed by NOAA's guidance, contractors on the spacecraft track defects differently than the ground system contractors do. Thus, it is not possible to compare opened and closed defects over time as depicted above for the ground system. Instead, spacecraft contractors track hardware and software defects independently. They also track when new defects are identified and how long they stay open. For spacecraft hardware, the defect data show that the average time it took to resolve defects increased over time. Specifically, the average age for hardware defects increased from 99 days in May 2013 to a high of 167 days in April 2014, at which point 58 percent of all open defects had been open for at least 90 days. Program officials stated that they began continuous 24-hour a day, 7-day a week testing in October 2014, which means that they should be able to make progress in closing the remaining defects. While the backlog of hardware-related defects on the spacecraft remains high, NOAA has been effective in recent months in greatly reducing the number of software-related defects on the spacecraft. The total number of open software-related defects increased and remained high through February 2014, but then declined significantly in March and April 2014. 
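The aging metrics cited above can be derived directly from defect open and close dates. The following is a minimal, illustrative sketch of such a calculation; the record layout and sample values are assumptions made for illustration and do not reflect the spacecraft contractor's actual tracking system.

from datetime import date
from typing import List, Optional, Tuple

# Hypothetical record layout: (defect identifier, date opened, date closed or None).
DefectRow = Tuple[str, date, Optional[date]]

def aging_metrics(rows: List[DefectRow], as_of: date) -> dict:
    """Average age in days of defects still open as of a given date, and the
    share of those defects that have been open for at least 90 days."""
    open_ages = [(as_of - opened).days
                 for _, opened, closed in rows
                 if closed is None or closed > as_of]
    if not open_ages:
        return {"open": 0, "average_age_days": 0.0,
                "share_open_90_days_or_more": 0.0}
    return {
        "open": len(open_ages),
        "average_age_days": sum(open_ages) / len(open_ages),
        "share_open_90_days_or_more": sum(a >= 90 for a in open_ages) / len(open_ages),
    }

if __name__ == "__main__":
    # Illustrative sample records only.
    sample: List[DefectRow] = [
        ("SC-0101", date(2013, 11, 1), None),              # still open, about 180 days old
        ("SC-0102", date(2014, 2, 15), None),              # still open, under 90 days old
        ("SC-0103", date(2013, 9, 1), date(2014, 1, 20)),  # closed before the as-of date
    ]
    print(aging_metrics(sample, as_of=date(2014, 4, 30)))

Tracked monthly, metrics of this kind show whether the backlog of older defects is growing or shrinking over time.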
Defect Trends Show Decline in Defects for GLM Instrument:
Defect data for the GLM instrument showed a decline in the total number of unresolved defects for both hardware and software components. While there were 117 unresolved hardware defects in May 2014, only 13 hardware defects remained unresolved as of September 2014. Trend data also showed a decline in the number of unresolved software defects. Specifically, the total number of software-related defects remaining open never rose above 7 over the period from March 2013 to January 2014, and for half of that period there was no more than one open defect.
ABI Hardware and Software Defects Were Closed in Recent Months:
Metrics for the ABI instrument show that all hardware defects and all but one software defect have been closed, likely because the first unit of the instrument was completed. The number of new hardware defects identified each month declined to near zero over the period from February 2013 to April 2014. Fewer than ten ABI software defects were open at any point from September 2013 onward, and only 11 defects were newly opened during that time. Figure 11 shows the number of opened and closed ABI hardware defects by month, and figure 12 shows opened and closed defects for ABI software.
Figure 11: Number of Open and Closed Hardware Defects for the Geostationary Operational Environmental Satellite-R Program's Advanced Baseline Imager Instrument for Each Month from February 2013 to August 2014:
[Refer to PDF for image: vertical bar graph]
Month of defect: February 2013; Defects newly opened: 8; Defects closed that month: 3; Defects remaining open: 5.
Month of defect: March 2013; Defects newly opened: 6; Defects closed that month: 3; Defects remaining open: 8.
Month of defect: April 2013; Defects newly opened: 4; Defects closed that month: 1; Defects remaining open: 11.
Month of defect: May 2013; Defects newly opened: 9; Defects closed that month: 5; Defects remaining open: 15.
Month of defect: June 2013; Defects newly opened: 11; Defects closed that month: 1; Defects remaining open: 25.
Month of defect: July 2013; Defects newly opened: 1; Defects closed that month: 11; Defects remaining open: 15.
Month of defect: August 2013; Defects newly opened: 9; Defects closed that month: 12; Defects remaining open: 12.
Month of defect: September 2013; Defects newly opened: 1; Defects closed that month: 11; Defects remaining open: 2.
Month of defect: October 2013; Defects newly opened: 9; Defects closed that month: 7; Defects remaining open: 4.
Month of defect: November 2013; Defects newly opened: 7; Defects closed that month: 6; Defects remaining open: 5.
Month of defect: December 2013; Defects newly opened: 1; Defects closed that month: 1; Defects remaining open: 5.
Month of defect: January 2014; Defects newly opened: 6; Defects closed that month: 10; Defects remaining open: 1.
Month of defect: February 2014; Defects newly opened: 1; Defects closed that month: 1; Defects remaining open: 1.
Month of defect: March 2014; Defects newly opened: 0; Defects closed that month: 0; Defects remaining open: 1.
Month of defect: April 2014; Defects newly opened: 0; Defects closed that month: 0; Defects remaining open: 1.
Month of defect: May 2014; Defects newly opened: 0; Defects closed that month: 1; Defects remaining open: 0.
Month of defect: June 2014; Defects newly opened: 0; Defects closed that month: 0; Defects remaining open: 0.
Month of defect: July 2014; Defects newly opened: 0; Defects closed that month: 0; Defects remaining open: 0.
Month of defect: August 2014; Defects newly opened: 0; Defects closed that month: 0; Defects remaining open: 0.
Source: GAO analysis of NOAA data. GAO-15-60.
[End of figure]
Figure 12: Number of Open and Closed Software Defects for the Geostationary Operational Environmental Satellite-R Program's Advanced Baseline Imager Instrument from 2012 to 2014:
[Refer to PDF for image: vertical bar graph]
Month of defect: February 2013; Defects newly opened: 1; Defects closed that month: 0; Defects remaining open: 2.
Month of defect: March 2013; Defects newly opened: 0; Defects closed that month: 1; Defects remaining open: 1.
Month of defect: April 2013; Defects newly opened: 29; Defects closed that month: 11; Defects remaining open: 10.
Month of defect: May 2013; Defects newly opened: 18; Defects closed that month: 0; Defects remaining open: 28.
Month of defect: June 2013; Defects newly opened: 23; Defects closed that month: 43; Defects remaining open: 8.
Month of defect: July 2013; Defects newly opened: 3; Defects closed that month: 0; Defects remaining open: 11.
Month of defect: August 2013; Defects newly opened: 0; Defects closed that month: 0; Defects remaining open: 11.
Month of defect: September 2013; Defects newly opened: 0; Defects closed that month: 4; Defects remaining open: 7.
Month of defect: October 2013; Defects newly opened: 0; Defects closed that month: 0; Defects remaining open: 7.
Month of defect: November 2013; Defects newly opened: 0; Defects closed that month: 0; Defects remaining open: 7.
Month of defect: December 2013; Defects newly opened: 7; Defects closed that month: 8; Defects remaining open: 5.
Month of defect: January 2014; Defects newly opened: 3; Defects closed that month: 7; Defects remaining open: 1.
Month of defect: February 2014; Defects newly opened: 1; Defects closed that month: 1; Defects remaining open: 1.
Source: GAO analysis of NOAA data. GAO-15-60.
[End of figure]
[End of section]
Appendix III: Comments from the Department of Commerce:
The Deputy Secretary of Commerce:
Washington, D.C. 20230:
November 20, 2014:
Mr. David A. Powner:
Director, Information Technology Management Issues:
U.S. Government Accountability Office:
441 G Street NW:
Washington, DC 20548:
Dear Mr. Powner:
Thank you for the opportunity to review and comment on the U.S. Government Accountability Office's draft report entitled Geostationary Weather Satellites: Launch Date Nears, but Remaining Schedule Risks Need to be Addressed (GAO-15-60). On behalf of the Department of Commerce, I have enclosed the National Oceanic and Atmospheric Administration's programmatic comments to the draft report. If you have any questions, please contact me or Margaret Cummiskey, Assistant Secretary for Legislative and Intergovernmental Affairs, at (202) 482-3663.
Sincerely,
Signed by:
Bruce H. Andrews:
Deputy Secretary of Commerce:
Enclosure:
Department of Commerce:
National Oceanic and Atmospheric Administration:
Comments to the Draft GAO Report Entitled: Geostationary Weather Satellites: Launch Date Nears, but Remaining Schedule Risks Need to be Addressed (GAO-15-60, December 2014):
General Comments:
The Department of Commerce's National Oceanic and Atmospheric Administration (NOAA) appreciates the opportunity to review and comment on the U.S. Government Accountability Office (GAO) draft report on the Geostationary Operational Environmental Satellite-R (GOES-R) series. NOAA has reviewed the report and agrees with all four GAO recommendations, and the response to each recommendation is provided below.
NOAA recommends the following factual and technical changes to the report to ensure that the information presented is complete and up-to-date.
NOAA Response to GAO Recommendations:
Recommendation 1: "Investigate and address inconsistencies totaling hundreds of thousands of dollars in monthly earned value data reporting for the GLM and ABI instruments."
NOAA Response: NOAA agrees with this recommendation. The GOES-R program will investigate these apparent inconsistencies and provide an appropriate corrective action plan.
Recommendation 2: "Address shortfalls in defect management identified in this report, including the lack of clear guidance on defect definitions, what defect metrics should be collected and reported, and how to establish a defect's priority or severity."
NOAA Response: NOAA agrees with this recommendation. The GOES-R program will review the identified shortfalls and address and mitigate risks accordingly.
Recommendation 3: "Reduce the number of unresolved defects on the GOES Ground System and Spacecraft."
NOAA Response: NOAA agrees with this recommendation. The GOES-R Series Program is actively working to reduce the number of unresolved defects and believes the current defect management process adequately supports this need. GOES-R is currently in its Integration and Test period, which is the phase of the program that is designed to flush out remaining defects. All defects are reviewed on a bi-weekly basis. This review, which contains a constraints list, enforces timely closure of defects in order to facilitate advancing through the Integration and Test phase.
Recommendation 4: "Add information to the GOES satellite contingency plan on steps planned or underway to mitigate potential launch delays, including information on the potential impact of failure scenarios in the plan and the minimum performance levels expected even under such scenarios."
NOAA Response: NOAA agrees with this recommendation. NOAA will add information to the GOES satellite contingency plan on steps planned or underway to mitigate potential launch delays, including information on the potential impact of failure scenarios in the plan and the minimum performance levels expected even under such scenarios.
[End of section]
Appendix IV: GAO Contact and Staff Acknowledgments:
GAO Contact:
David A. Powner, (202) 512-9286 or pownerd@gao.gov:
Staff Acknowledgments:
In addition to the contact named above, individuals making contributions to this report included Colleen Phillips (assistant director), Alexander Anderegg, Christopher Businsky, Shaun Byrnes, James MacAulay, and Karl Seifert.
[End of section]
Footnotes:
[1] GAO, High-Risk Series: An Update, [hyperlink, http://www.gao.gov/products/GAO-13-283] (Washington, D.C.: February 2013).
[2] GAO, GAO Cost Estimating and Assessment Guide: Best Practices for Developing and Managing Capital Program Costs, [hyperlink, http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: March 2009).
[3] See GAO, Year 2000 Computing Crisis: Business Continuity and Contingency Planning, [hyperlink, http://www.gao.gov/products/GAO/AIMD-10.1.19] (Washington, D.C.: August 1998); National Institute of Standards and Technology, Contingency Planning Guide for Federal Information Systems, NIST 800-34 (Gaithersburg, Md.: May 2010); Software Engineering Institute, CMMI® for Acquisition, Version 1.3 (Pittsburgh, Pa.: November 2010).
[4] There are older GOES satellites in orbit, but they have been decommissioned.
[5] GAO, Environmental Satellites: Focused Attention Needed to Improve Mitigation Strategies for Satellite Coverage Gaps, [hyperlink, http://www.gao.gov/products/GAO-13-865T] (Washington, D.C.: Sept. 19, 2013); Geostationary Weather Satellites: Progress Made, but Weaknesses in Scheduling, Contingency Planning, and Communicating with Users Need to be Addressed, [hyperlink, http://www.gao.gov/products/GAO-13-597] (Washington, D.C.: Sept. 19, 2013); Environmental Satellites: Focused Attention Needed to Mitigate Program Risks, [hyperlink, http://www.gao.gov/products/GAO-12-841T] (Washington, D.C.: June 27, 2012); Geostationary Weather Satellites: Design Progress Made, but Schedule Uncertainty Needs to be Addressed, [hyperlink, http://www.gao.gov/products/GAO-12-576] (Washington, D.C.: June 26, 2012); Geostationary Operational Environmental Satellites: Improvements Needed in Continuity Planning and Involvement of Key Users, [hyperlink, http://www.gao.gov/products/GAO-10-799] (Washington, D.C.: Sept. 1, 2010); Geostationary Operational Environmental Satellites: Acquisition Has Increased Costs, Reduced Capabilities, and Delayed Schedules, [hyperlink, http://www.gao.gov/products/GAO-09-596T] (Washington, D.C.: Apr. 23, 2009); Geostationary Operational Environmental Satellites: Acquisition Is Under Way, but Improvements Needed in Management and Oversight, [hyperlink, http://www.gao.gov/products/GAO-09-323] (Washington, D.C.: Apr. 2, 2009); Geostationary Operational Environmental Satellites: Further Actions Needed to Effectively Manage Risks, [hyperlink, http://www.gao.gov/products/GAO-08-183T] (Washington, D.C.: Oct. 23, 2007); Geostationary Operational Environmental Satellites: Progress Has Been Made, but Improvements Are Needed to Effectively Manage Risks, [hyperlink, http://www.gao.gov/products/GAO-08-18] (Washington, D.C.: Oct. 23, 2007).
[6] [hyperlink, http://www.gao.gov/products/GAO-12-576].
[7] [hyperlink, http://www.gao.gov/products/GAO-13-597].
[8] [hyperlink, http://www.gao.gov/products/GAO-12-576].
[9] [hyperlink, http://www.gao.gov/products/GAO-13-597].
[10] GAO, 2013 High-Risk Series: An Update, [hyperlink, http://www.gao.gov/products/GAO-13-359T] (Washington, D.C.: Feb. 14, 2013).
[11] [hyperlink, http://www.gao.gov/products/GAO-13-597].
[12] The program split the first of the five end-to-end tests into two increments, the first of which was completed in August 2014 and the last of which is expected to be completed in December 2014.
[13] [hyperlink, http://www.gao.gov/products/GAO-13-597].
[14] See, for example, GAO, NASA: Assessments of Selected Large-Scale Projects, [hyperlink, http://www.gao.gov/products/GAO-11-239SP] (Washington, D.C.: Mar. 3, 2011) and NASA: Assessments of Selected Large-Scale Projects, [hyperlink, http://www.gao.gov/products/GAO-12-207SP] (Washington, D.C.: Mar. 1, 2012).
[15] GAO, Office of Personnel Management: Improvements Needed to Ensure Successful Retirement System Modernization, [hyperlink, http://www.gao.gov/products/GAO-08-345] (Washington, D.C.: Jan. 31, 2008) and 2000 Census: New Data Capture System Progress and Risks, [hyperlink, http://www.gao.gov/products/GAO/AIMD-00-61] (Washington, D.C.: Feb. 4, 2000).
[16] These components are costing more than their expected contract value; however, program officials expect that the total cost will remain within the program's contingency reserves.
[17] [hyperlink, http://www.gao.gov/products/GAO-09-3SP].
[18] See, for example, [hyperlink, http://www.gao.gov/products/GAO-11-239SP] and [hyperlink, http://www.gao.gov/products/GAO-12-207SP].
[19] [hyperlink, http://www.gao.gov/products/GAO-09-3SP].
[20] Institute of Electrical and Electronics Engineers, Standard for System and Software Verification and Validation, IEEE Std 1012 (New York, N.Y.: May 25, 2012); Project Management Institute: A Guide to the Project Management Body of Knowledge (PMBOK® Guide), Fourth Edition, Project Management Institute, Inc. (PMI), 2008. Copyright and all rights reserved; Institute of Electrical and Electronics Engineers, Software and systems engineering--Software testing, ISO/IEC/IEEE Std 29119 (New York, N.Y.: Sept. 1, 2013); Institute of Electrical and Electronics Engineers, Standard for Information Technology--Software life cycle processes--Implementation considerations, IEEE/EIA Std 12207.2-1997 (New York, N.Y.: April 1998); Institute of Electrical and Electronics Engineers, IEEE Standard for Software and System Test Documentation, IEEE Std 829-2008 (New York, N.Y.: Jul. 10, 2008); Institute of Electrical and Electronics Engineers, IEEE Standard Classification for Software Anomalies, IEEE Std 1044-2009 (New York, N.Y.: Jan. 7, 2010); Software Engineering Institute, CMMI® for Acquisition, Version 1.3 (Pittsburgh, Pa.: November 2010); Software Engineering Institute, CMMI® for Development, Version 1.3 (Pittsburgh, Pa.: November 2010); GAO, Year 2000 Computing Crisis: A Testing Guide, [hyperlink, http://www.gao.gov/products/GAO/AIMD-10.1.21] (Washington, D.C.: November 1998); GAO Cost Estimating and Assessment Guide: Best Practices for Developing and Managing Capital Program Costs, [hyperlink, http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: March 2009); and GAO Schedule Assessment Guide: Best Practices for Project Schedules (Exposure Draft), [hyperlink, http://www.gao.gov/products/GAO-12-120G] (Washington, D.C.: May 2012).
[21] See GAO, Year 2000 Computing Crisis: Business Continuity and Contingency Planning, [hyperlink, http://www.gao.gov/products/GAO/AIMD-10.1.19] (Washington, D.C.: August 1998); National Institute of Standards and Technology, Contingency Planning Guide for Federal Information Systems, NIST 800-34 (Gaithersburg, Md.: May 2010); Software Engineering Institute, CMMI® for Acquisition, Version 1.3 (Pittsburgh, Pa.: November 2010).
[22] [hyperlink, http://www.gao.gov/products/GAO-13-597].
[23] While components are costing more than their expected contract value, program officials expect that the total cost will remain within the program's contingency reserves.
[24] [hyperlink, http://www.gao.gov/products/GAO-09-3SP].
[25] Institute of Electrical and Electronics Engineers, Standard for System and Software Verification and Validation, IEEE Std 1012 (New York, N.Y.: May 25, 2012); Project Management Institute: A Guide to the Project Management Body of Knowledge (PMBOK® Guide), Fourth Edition, Project Management Institute, Inc. (PMI), 2008. Copyright and all rights reserved; Institute of Electrical and Electronics Engineers, Software and systems engineering--Software testing, ISO/IEC/IEEE Std 29119 (New York, N.Y.: Sept. 1, 2013); Institute of Electrical and Electronics Engineers, Standard for Information Technology--Software life cycle processes--Implementation considerations, IEEE/EIA Std 12207.2-1997 (New York, N.Y.: April 1998); Institute of Electrical and Electronics Engineers, IEEE Standard for Software and System Test Documentation, IEEE Std 829-2008 (New York, N.Y.: Jul. 10, 2008);
Institute of Electrical and Electronics Engineers, IEEE Standard Classification for Software Anomalies, IEEE Std 1044-2009 (New York, N.Y.: Jan. 7, 2010); Software Engineering Institute, CMMI® for Acquisition, Version 1.3 (Pittsburgh, Pa.: November 2010); Software Engineering Institute, CMMI® for Development, Version 1.3 (Pittsburgh, Pa.: November 2010); GAO, Year 2000 Computing Crisis: A Testing Guide, [hyperlink, http://www.gao.gov/products/GAO/AIMD-10.1.21] (Washington, D.C.: November 1998); GAO Cost Estimating and Assessment Guide: Best Practices for Developing and Managing Capital Program Costs, [hyperlink, http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: March 2009); and GAO Schedule Assessment Guide: Best Practices for Project Schedules (Exposure Draft), [hyperlink, http://www.gao.gov/products/GAO-12-120G] (Washington, D.C.: May 2012).
[26] The three components include the ABI instrument, the GLM instrument, and the ground system.
[27] See GAO, Year 2000 Computing Crisis: Business Continuity and Contingency Planning, [hyperlink, http://www.gao.gov/products/GAO/AIMD-10.1.19] (Washington, D.C.: August 1998); National Institute of Standards and Technology, Contingency Planning Guide for Federal Information Systems, NIST 800-34 (Gaithersburg, Md.: May 2010); Software Engineering Institute, CMMI® for Acquisition, Version 1.3 (Pittsburgh, Pa.: November 2010).
[28] [hyperlink, http://www.gao.gov/products/GAO-13-597].
[End of section]
GAO's Mission:
The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's website [hyperlink, http://www.gao.gov]. Each weekday afternoon, GAO posts on its website newly released reports, testimony, and correspondence. To have GAO e-mail you a list of newly posted products, go to [hyperlink, http://www.gao.gov] and select "E-mail Updates."
Order by Phone:
The price of each GAO publication reflects GAO's actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO's website, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information.
Connect with GAO:
Connect with GAO on Facebook, Flickr, Twitter, and YouTube. Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts. Visit GAO on the web at [hyperlink, http://www.gao.gov].
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm];
E-mail: fraudnet@gao.gov;
Automated answering system: (800) 424-5454 or (202) 512-7470.
Congressional Relations:
Katherine Siggerud, Managing Director, siggerudk@gao.gov, (202) 512-4400:
U.S. Government Accountability Office, 441 G Street NW, Room 7125, Washington, DC 20548.
Public Affairs:
Chuck Young, Managing Director, youngc1@gao.gov, (202) 512-4800:
U.S. Government Accountability Office, 441 G Street NW, Room 7149, Washington, DC 20548.
[End of document]