This is the accessible text file for GAO report number GAO-11-239SP entitled 'NASA: Assessments of Selected Large-Scale Projects' which was released on March 3, 2011. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. United States Government Accountability Office: GAO: February 2011: Report to Congressional Committees: NASA: Assessment of Selected Large-Scale Projects: GAO-11-239SP: GAO Highlights: Highlights of GAO-11-239SP, a report to congressional committees. Why GAO Did This Study: GAO’s work has shown that the National Aeronautics and Space Administration’s (NASA) large-scale projects, while producing ground- breaking research and advancing our understanding of the universe, tend to cost more and take longer to develop than planned, and are often approved without evidence of a sound business case. Although space development programs are complex and difficult by nature, GAO has found that inherent risks are exacerbated by poor management and oversight practices. GAO has designated NASA’s acquisition management as a high risk area since 1990. This report provides a snapshot of how well NASA is planning and executing its acquisition of selected large-scale projects. It also provides observations about the performance of NASA’s major projects and project management, outlines steps NASA is taking to improve its acquisitions, identifies challenges that contribute to cost and schedule growth, and assesses 21 NASA projects, each with an estimated life-cycle cost of over $250 million. No recommendations are provided in this report; however, GAO has reported extensively and made recommendations on NASA acquisition management in the past. We will also be making recommendations on enhancing transparency and accountability in a separate letter to NASA. What GAO Found: GAO assessed 21 NASA projects with a combined life-cycle cost that exceeds $68 billion. Of those 21 projects, 16 had entered the implementation phase where cost and schedule baselines were established. Development costs for the 16 projects had an average growth of $94 million-—or 14.6 percent-—and schedules grew by an average of 8 months. The total increase in development costs for these projects was $1.5 billion. GAO found that 5 of the 16 projects were responsible for the overwhelming majority of this increase. 
The issue of cost growth is more significant than the 14.6 percent average would indicate because it does not capture the cost growth that occurred before several projects reported baselines in response to a statutory requirement in 2005. Additionally, the 13 projects that GAO has reviewed over the past 3 years that established baselines prior to 2009 experienced an average development cost growth of almost 55 percent, with a total increase in development costs of almost $2.5 billion from their original confirmation baselines. This does not reflect considerable cost and schedule growth that will likely be experienced by NASA’s largest science program—-the James Webb Space Telescope (JWST). Based on the findings of the independent panel that recently reviewed the JWST project and information we obtained from projects officials, it is likely that JWST will report significant cost and schedule growth, estimated to be $1.4 billion or more and up to 15 months, respectively. Many of the projects GAO reviewed for this report experienced challenges in the areas of technology, design, funding, launch vehicles, development partner performance, parts, and contractor management. Reducing the kinds of challenges this assessment identifies in acquisition programs hinges on developing a sound business case for a project. The development and execution of a knowledge-based business case for these projects can provide early recognition of challenges, allow managers to take corrective action, and place needed and justifiable projects in a better position to succeed. The inherent complexity of space development programs should not preclude NASA from achieving what it promises when requesting and receiving funds. In response to GAO’s designation of NASA’s acquisition management as a high risk area, NASA has developed a corrective action plan to improve the effectiveness of acquisition project management. The plan identifies five areas for improvement, each of which contains targets and goals to measure improvement. As part of this initiative, the agency is continuing its implementation of a new cost estimation tool, the Joint Cost and Schedule Confidence Level, to help project officials with management, cost and schedule estimating, and maintenance of adequate levels of reserves. View [hyperlink, http://www.gao.gov/products/GAO-11-239SP] or key components. For more information, contact Cristina Chaplain at (202) 512-4841 or chaplainc@gao.gov. 
[End of section] Contents: Foreword: Letter: Background: Observations on NASA’s Portfolio of Major Projects: Observations from Our Assessment of Knowledge Attained by Key Junctures in the Acquisition Process: Observations on Other Challenges That Can Affect Project Outcomes: Observations about NASA’s Continued Efforts to Improve Its Acquisition Management: Project Assessments: Aquarius: Ares I Crew Launch Vehicle: Global Precipitation Measurement (GPM) Mission: Glory: Gravity Recovery and Interior Laboratory (GRAIL): Ice, Cloud, and Land Elevation Satellite-2 (ICESat-2): James Webb Space Telescope (JWST): Juno: Landsat Data Continuity Mission (LDCM): Lunar Atmosphere and Dust Environment Explorer (LADEE): Magnetospheric Multiscale (MMS): Mars Atmosphere and Volatile EvolutioN (MAVEN): Mars Science Laboratory (MSL): NPOESS Preparatory Project (NPP): Orbiting Carbon Observatory 2 (OCO-2): Orion Crew Exploration Vehicle: Radiation Belt Storm Probes (RBSP): Soil Moisture Active and Passive (SMAP): Solar Probe Plus (SPP): Stratospheric Observatory for Infrared Astronomy (SOFIA): Tracking and Data Relay Satellite (TDRS) Replenishment: Agency Comments and Our Evaluation: Appendixes: Appendix I: Comments from the National Aeronautics and Space Administration: Appendix II: Objectives, Scope, and Methodology: Appendix III: Technology Readiness Levels: Appendix IV: GAO Contact and Staff Acknowledgments: Tables: Table 1: Selected Major NASA Projects Reviewed in GAO Annual Assessments: Table 2: Cost and Schedule Growth of Selected NASA Projects Currently in the Implementation Phase: Table 3: Cost Growth from Confirmation for Selected Major NASA Projects That Established Baselines Prior to Fiscal Year 2009: Table 4: ARRA Funding for Reviewed NASA Projects: Table 5: Schedule Growth for Selected NASA Projects with and without Development Partners Baselined before 2009: Figures: Figure 1: NASA’s Life Cycle for Flight Systems: Figure 2: Summary of Projects Assessed by Phase of the NASA Project Life Cycle: Figure 3: Percentage of Major NASA Projects That Moved into Implementation with Immature Technologies at the Preliminary Design Review: Figure 4: Percentage of Engineering Drawings Releasable at CDR for Selected NASA Projects: Figure 5: Comparison of Design Drawing Increase for Projects with CDR prior to and since Fiscal Year 2009: Figure 6: Notional Allocation of Reserves under the 70 Percent Confidence Level Funding Requirements: Figure 7: Illustration of Projects 2-Page Summary: Abbreviations: AFB: Air Force Base: AFS: Air Force Station: APS: Aerosol Polarimetry Sensor: ARRA: American Recovery and Reinvestment Act of 2009: ASI: Agenzia Spaziale Italiana (Italian Space Agency): C&DH: Command and Data Handling: CDDS: Cavity Door Drive System: CDR: critical design review: CMIC: Command and Data Handling Unit Module Interface Card: CONAE: Comision Nacional de Actividades Espaciales (Space Agency of Argentina): CrIS: Cross-track Infrared Sounder: CSA: Canadian Space Agency: DCI: data collection instrument: DM-2: Development Motor 2: DPR: dual-frequency precipitation radar: DT&E: Development Test & Evaluation: ESA: European Space Agency: ETU: engineering test unit: GIDEP: Government Industry Data Exchange Program: GLAST: Gamma-ray Large Area Space Telescope: GMI: GPM microwave imager: GPM: Global Precipitation Measurement (mission): GRACE: Gravity Recovery and Climate Experiment: GRAIL: Gravity Recovery and Interior Laboratory: HEPS: High Efficiency Power Supply: HOPE: Helium-Oxygen-Proton-Electron: ICESat-2: 
Ice, Cloud, and Land Elevation Satellite-2: IPO: Integrated Program Office: ISS: International Space Station: JAXA: Japan Aerospace Exploration Agency: JCL: Joint Cost and Schedule Confidence Levels: JPL: Jet Propulsion Laboratory: JWST: James Webb Space Telescope: KDP: key decision point: LCROSS: Lunar Crater Observation and Sensing Satellite: LDCM: Landsat Data Continuity Mission: LDEX: Lunar Dust Experiment: LIO: Low Inclination Observatory: LLCD: Lunar Laser Com Demo: LRO: Lunar Reconnaissance Orbiter: MagEIS: Magnetic Electron Ion Spectrometer: MAVEN: Mars Atmosphere and Volatile EvolutioN: MEP: Mars Exploration Program: MMRTG: Multi Mission Radioisotope Thermoelectric Generator: MMS: Magnetospheric Multiscale: MRO: Mars Reconnaissance Orbiter: MSL: Mars Science Laboratory: MSR: Monthly Status Review: NAR: nonadvocate review: NASA: National Aeronautics and Space Administration: NID: NASA Interim Directive: NLS: NASA Launch Services: NMS: Neutral Mass Spectrometer: NPR: NASA Procedural Requirements: NPOESS: National Polar-Orbiting Operational Environmental Satellite System: NPP: NPOESS Preparatory Project: OCFO: Office of the Chief Financial Officer (NASA): OCO: Orbiting Carbon Observatory: OLI: Operational Land Imager: OT&E: Operational Test & Evaluation: PA-1: Pad Abort-1: PDR: preliminary design review: RBSP: Radiation Belt Storm Probes: RWA: reaction wheel assembly: SAM: Sample Analysis at Mars: SBC: single board computer: SDO: Solar Dynamics Observatory: SDP: Spin Plane Double Probe: SID: Strategic Investments Division (NASA): SMAP: Soil Moisture Active and Passive (mission): SOFIA: Stratospheric Observatory for Infrared Astronomy: TAT: Test Assessment Team: TIM: total irradiance monitor: TIRS: Thermal Infrared Sensor: TLGA: Toroidal Low Gain Antenna: TRL: technology readiness level: UVS: Ultraviolet Spectrometer: USGS: U.S. Geological Survey: VIIRS: Visible Infrared Imaging Radiometer Suite: WISE: Wide-field Infrared Survey Explorer: [End of section] United States Government Accountability Office: Washington, DC 20548: March 3, 2011: We are pleased to present GAO’s third annual assessment of selected large-scale National Aeronautics and Space Administration (NASA) projects. This report provides a snapshot of NASA’s planning and execution of major acquisitions—a topic that is on GAO’s high risk list. This past year has been one of turmoil for NASA. The proposed cancellation of the Constellation program—the agency’s largest program—has left NASA’s human space flight program in a state of flux. Its future work in this area depends on how budget issues and direction are resolved between the Congress and the Administration. While NASA continued to work toward the program of record for Constellation, its focus has now turned to prioritizing work that can be transitioned to the new path for human space flight set out in the NASA Authorization Act of 2010 while continuing to comply with the requirements of its fiscal year 2010 appropriations. Additionally, funding constraints due to the delayed retirement of the shuttle fleet, the plan to utilize the International Space Station at least 4 years longer than anticipated, and expected overruns in major projects, such as the James Webb Space Telescope and the Mars Science Laboratory, will affect NASA’s plans for funding new projects for years to come. This environment, coupled with a constrained budgetary outlook, heightens the importance of efficient and effective project management to maximize results. 
Furthermore, NASA needs to be equipped with the knowledge to make hard choices among competing priorities within the agency. We recently issued an update to our high risk report where we highlighted efforts NASA continues to make to improve its management of major projects. For example, the agency has continued to implement initiatives aimed at strengthening its cost and schedule estimating processes. These initiatives, as well as other efforts, are intended to provide key decisionmakers with increased knowledge to make informed decisions before a project starts and to maintain disciplined management and oversight once it begins. Increased discipline and oversight, however, will require that senior NASA leaders have the will to terminate or reshape projects that do not measure up, hold appropriate parties accountable for poor outcomes, and recognize and reward good management and good decisions. NASA continues to take positive steps, but it will still be some time before the impact of its efforts can be measured. The NASA portfolio of major projects ranges from robotic probes designed to explore the Martian surface, to satellites equipped with advanced sensors to study the earth, to telescopes intended to explore the universe. Some of these missions have literally changed the way we view our planet and the universe. For example, the Kepler mission recently identified the first Earth-size planet candidates in a habitable zone where liquid water could exist on the planet’s surface. In many cases, NASA’s projects are expected to incorporate new and sophisticated technologies that must operate in harsh, distant environments. Although space development programs are complex and difficult by nature, our work consistently finds that inherent risks of NASA’s complex development projects are heightened by the induced risks of less than adequate management and oversight practices. In this year’s report, our work continues to show that NASA’s major projects are frequently approved without evidence of a sound business case that ensures a match between requirements and reasonably expected resources. As a result, the projects cost more and take longer to develop than planned. We found that NASA frequently exceeded its acquisition cost and schedule estimates, even when those estimates were relatively new. In the last 3 years, 12 out of the 13 projects that have been in development for several years significantly exceeded their cost and/or schedule baseline estimates. In today’s fiscal environment, it is clear that this condition cannot be sustained. We believe that this report can provide insights that will help NASA place programs in a better position to succeed, and help the agency maximize its investments. Our work has shown that curbing the induced challenges that can lead to cost and schedule growth hinges on developing a sound business case that includes firm requirements, mature technologies, a knowledge-based acquisition strategy, realistic cost estimates, and sufficient funding. Consistent adoption of such practices can improve results and may help ease the budgetary pressures NASA is likely to continue to face over time. Signed by: Gene L. Dodaro: Comptroller General of the United States: [End of section] United States Government Accountability Office: Washington, DC 20548: March 3, 2011: Congressional Committees: This is GAO's third annual assessment of National Aeronautics and Space Administration's (NASA) large-scale projects. 
This report provides a snapshot of how well NASA is planning and executing its major acquisitions--an area that has been on GAO's high risk list since 1990. Over the past year, NASA has again shown that its projects produce ground-breaking research and advance our understanding of the universe. For example, the Kepler spacecraft has discovered the first confirmed planetary system with more than one planet transiting the same star. Unfortunately, over the past year, NASA has also experienced much turmoil and cost increases in several of its major projects. For example, the proposed cancellation of the Constellation Program--on which NASA has spent over $11 billion since 2006--caused uncertainty in NASA's human spaceflight program. More recently, an independent panel concluded that the James Webb Space Telescope project will require additional funding of $1.4 billion or more and a launch delay of 15 months. In the past 2 years, we reported that 11 out of 17 NASA projects experienced significant cost and/or schedule growth from baselines established only 2 or 3 years earlier.[Footnote 1] Such issues continue to impact NASA's ability to carry out its ground-breaking work in an efficient and effective manner. NASA has taken steps over recent years to help improve its acquisition management through several initiatives aimed at improving cost estimating and management oversight. While the overall outcomes of these efforts will take time to become apparent, NASA officials indicate that they continue to be committed to the initiatives with the goal of improving performance. The Congress has expressed concern about NASA's performance and has identified the need to standardize the reporting of cost, schedule, and content for NASA research and development projects. In 2005, the Congress required NASA to report cost and schedule baselines--benchmarks against which changes can be measured--for all NASA programs and projects with estimated life-cycle costs of at least $250 million that have been approved to proceed to the development stage, known as implementation, in which components begin to take physical form.[Footnote 2] It also required that NASA report to Congress when development cost is likely to exceed the baseline estimate by 15 percent or more, or when a milestone is likely to be delayed beyond the baseline estimate by 6 months or more.[Footnote 3] In response, NASA began to establish cost and schedule baselines in 2006 and has been using them as the basis for annual project performance reports to the Congress provided in its budget submission each year. The explanatory statement of the House Committee on Appropriations accompanying the Omnibus Appropriations Act, 2009, directed GAO to prepare project status reports on selected large-scale NASA programs, projects, or activities.[Footnote 4] This report responds to that mandate. 
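The statutory reporting rule described above can be expressed as a simple check. The following is a minimal illustrative sketch only--the function name, field names, and example figures are ours, not NASA's or GAO's--showing the two triggers: development cost growth of 15 percent or more over the baseline estimate, or a milestone delay of 6 months or more beyond the baseline estimate.

def exceeds_reporting_threshold(baseline_cost, current_cost, milestone_slip_months):
    """Return True if either statutory trigger for reporting to Congress is met:
    development cost growth of 15 percent or more over the baseline estimate,
    or a milestone delay of 6 months or more beyond the baseline estimate."""
    cost_growth_percent = (current_cost - baseline_cost) / baseline_cost * 100
    return cost_growth_percent >= 15 or milestone_slip_months >= 6

# Hypothetical example: a project whose development cost estimate grows from
# $500 million to $580 million (16 percent) with a 4-month launch slip would
# trigger reporting on the cost criterion alone.
print(exceeds_reporting_threshold(500.0, 580.0, 4))  # True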
Specifically, we assess (1) performance of NASA's major projects and the agency's management of those projects during development, (2) knowledge attained by key junctures in the acquisition process, (3) other challenges that can affect project execution, (4) NASA's continued efforts to improve its acquisitions, and (5) 21 NASA projects, each with an estimated life-cycle cost over $250 million.[Footnote 5] In doing so, the report expands on the importance of providing decision-makers with an independent, knowledge-based assessment of individual systems that identifies potential risks and allows them to take actions to put projects that are early in the development cycle in a better position to succeed. Our approach included an examination of the current phase of a project's development and how each project was advancing.[Footnote 6] NASA provided updated cost and schedule data as of November 2010 for 16 of the 21 projects. We reviewed and compared that data to previously established cost and schedule statutory baselines. We assessed each project's cost and schedule and characterized growth in either as significant if it exceeded the baselines that trigger reporting to the Congress under the law.[Footnote 7] In addition, NASA provided cost and schedule information from previously reported projects that we used for historical analysis. We assessed technology maturity and design stability using GAO's established criteria for knowledge-based acquisitions and other GAO work on system acquisitions.[Footnote 8] Additionally, we identified other challenges that can affect project outcomes--funding, launch vehicles, development partner performance, parts, and contractor management--as a result of our analysis based on interviews with project officials and information provided by the projects. This list of challenges is not exhaustive, and we believe these challenges will evolve, as they have from previous years, as we continue this work into the future. We took appropriate steps to address data reliability. The individual project offices were given an opportunity to provide comments and technical clarifications on our assessments prior to their inclusion in the final product; these comments were incorporated as appropriate. Appendix II contains detailed information on our objectives, scope, and methodology. We conducted this performance audit from March 2010 to February 2011 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. We are not making recommendations in this report. Background: A Sound Business Case Underpins Successful Acquisition Outcomes: The development and execution of a knowledge-based business case for NASA’s projects can provide early recognition of challenges, allow managers to take corrective action, and place needed and justifiable projects in a better position to succeed. 
Our studies of best practice organizations show the risks inherent in NASA’s work can be mitigated by developing a solid, executable business case before committing resources to a new product development.[Footnote 9] In its simplest form, this is evidence that (1) the customer’s needs are valid and can best be met with the chosen concept and that (2) the chosen concept can be developed and produced within existing resources-—that is, proven technologies, design knowledge, adequate funding, adequate time, and adequate workforce to deliver the product when needed. A program should not be approved to go forward into product development unless a sound business case can be made. If the business case measures up, the organization commits to the development of the product, including making the financial investment. Our best practice work has shown that developing business cases based on matching requirements to resources before program start leads to more predictable program outcomes-—that is, programs are more likely to be successfully completed within cost and schedule estimates and deliver anticipated system performance.[Footnote 10] At the heart of a business case is a knowledge-based approach to product development that is a best practice among leading commercial firms. Those firms have created an environment and adopted practices that put their program managers in a good position to succeed in meeting expectations. A knowledge-based approach requires that managers demonstrate high levels of knowledge as the program proceeds from technology development to system development and, finally, production. In essence, knowledge supplants risk over time. This building of knowledge can be described over the course of a program as follows: * When a project begins development, the customer’s needs should match the developer’s available resources—mature technologies, time, and funding. An indication of this match is the demonstrated maturity of the technologies required to meet customer needs—referred to as critical technologies. If the project is relying on heritage—or pre-existing— technology, that technology must be in appropriate form, fit, and function to address the customer’s needs within available resources. The project will normally enter development after completing the preliminary design review, at which time a business case should be in hand. * Then, about midway through the product’s development, its design should be stable and demonstrate it is capable of meeting performance requirements. The critical design review takes place at that point in time because it generally signifies when the program is ready to start building production-representative prototypes. If design stability is not achieved, but a product development continues, costly re-designs to address changes to project requirements and unforeseen challenges can occur. By the critical design review, the design should be stable and capable of meeting performance requirements. * Finally, by the time of the production decision, the product must be shown to be producible within cost, schedule, and quality targets and have demonstrated its reliability, and the design must demonstrate that it performs as needed through realistic system-level testing. Lack of testing increases the possibility that project managers will not have information that could help avoid costly system failures in late stages of development or during system operations. 
Our best practices work has identified numerous other actions that can be taken to increase the likelihood that a program can be successfully executed once that business case is established. These include ensuring cost estimates are complete, accurate, and updated regularly, and holding suppliers accountable through such activities as regular supplier audits and performance evaluations of quality and delivery. Moreover, we have recommended using metrics and controls throughout the life cycle to gauge when the requisite level of knowledge has been attained and when to direct decision makers to consider criteria before advancing a program to the next level and making additional investments. NASA Life Cycle for Flight Systems: NASA’s life cycle for flight systems is defined by two phases—formulation[Footnote 11] and implementation[Footnote 12]—and several key decision points. See figure 1. These phases are then further divided into incremental pieces: Phase A through Phase F. Figure 1: NASA’s Life Cycle for Flight Systems: [Refer to PDF for image: life cycle illustration] Formulation: Pre-phase A: Concept Studies: KDP A: Phase A: Concept and Technology Development: SDR: Pre-NAR: KDP B: Phase B: Preliminary Design and Technology Completion: PDR: NAR: KDP C: Program start: Phase C: Final Design and Fabrication: CDR: KDP D: Phase D: System Assembly, Integration and Test, Launch: KDP E: Phase E: Operations and Sustainment: KDP F: Phase F: Closeout: Implementation: Management decision reviews: Pre-NAR = preliminary nonadvocate review; NAR = nonadvocate review; KDP = key decision point. Technical reviews: SDR = system definition review; PDR = preliminary design review; CDR = critical design review. Source: NASA data and GAO analysis. [End of figure] Project formulation consists of Phases A and B, during which time the projects develop and define the project requirements and cost/schedule basis and design for implementation, including developing an acquisition strategy. Toward the end of the formulation phase, leading up to the preliminary design review (PDR)[Footnote 13] and nonadvocate review (NAR),[Footnote 14] the project team completes its preliminary design and technology development. NASA Interim Directive NM 7120-81 for NASA Procedural Requirements 7120.5D, NASA Space Flight Program and Project Management Requirements, specifies that during formulation the project should complete development of mission-critical or enabling technology. As needed, projects are required to demonstrate evidence of technology maturity (i.e., component and/or breadboard validation in the relevant environment) and document the information in a technology readiness assessment report. The project must also develop, document, and maintain a project management baseline[Footnote 15] that includes the integrated master schedule and baseline life-cycle cost estimate. The formulation phase is intended to culminate in a confirmation review, at which time cost and schedule baselines are confirmed and project progress henceforth is measured against these baselines. After a project is confirmed, it begins implementation, consisting of phases C, D, E, and F. During phase C, the project performs final design and fabrication as well as testing of components. In phase D, the project performs system assembly, integration, test, and launch activities. Phases E and F consist of operations and sustainment and project closeout. 
A second design review, the critical design review (CDR),[Footnote 16] is held in the implementation phase during the latter half of phase C. The purpose of the CDR is to demonstrate that the maturity of the design is appropriate to support proceeding with full-scale fabrication, assembly, integration, and test. After CDR and the system integration review,[Footnote 17] the project must be approved before continuing into the next phase. NASA Projects Reviewed in GAO Annual Assessments: The portfolio of projects we reviewed has evolved and grown in each of the last 3 years. Once a project launches, we will no longer include a 2-page summary in our annual report. However, we do maintain and continually assess historical cost, schedule, and performance information collected. Table 1: Selected Major NASA Projects Reviewed in GAO Annual Assessments: Projects in Formulation: 2009: Ares I; GPM; JWST; LDCM; Orion; 2010: Ares I; GPM; LDCM; Orion; 2011: Ares I; ICESat-2; Orion; SMAP; SPP. Projects in Implementation: 2009: Aquarius; Dawn[A]; GLAST[A]; Glory; Herschel; Kepler; LRO; MSL; NPP; OCO[B]; SDO; SOFIA; WISE; 2010: Aquarius; Glory; GRAIL; Herschel[A]; Juno; JWST; Kepler[A]; LRO[A]; MMS; MSL; NPP; RBSP; SDO[A]; SOFIA; WISE[A]; 2011: Aquarius; Glory; GPM; GRAIL; Juno; JWST; LADEE; LDCM; MAVEN; MSL; MMS; NPP; OCO-2; RBSP; SOFIA; TDRS Replenishment. Source: GAO analysis of NASA data. [A] NASA projects that have launched. [B] NASA project that launched but failed to reach orbit. [End of table] Observations on NASA's Portfolio of Major Projects: We assessed 21 large-scale NASA projects in this review. We based the majority of our cost and schedule analysis on the 16 projects that are currently in the implementation phase of the project life-cycle. We also analyzed historical data from projects that were a part of our previous reviews. We found that 5 of the 16 projects currently in implementation experienced significant cost and/or schedule growth from their statutory baselines.[Footnote 18] The remaining 11 projects set statutory baselines in fiscal year 2009 or later and have reported little or no deviation from their cost and schedule baselines. Three of these 11 projects that had been in formulation for most of our review were confirmed late in 2010 and their baselines, according to NASA officials, were to be reported for the first time in NASA's fiscal year 2012 budget submission. The remaining five projects were in the formulation phase, where cost and schedule baselines have yet to be established.[Footnote 19] See figure 2 for a summary of these projects. Figure 2: Summary of Projects Assessed by Phase of the NASA Project Life Cycle: [Refer to PDF for image: illustration] Total projects reviewed: 21; Projects in formulation: 5; Projects in implementation: 16; Projects with significant cost and/or schedule growth: 5; Projects that entered implementation in FY 2009/10: 8; Projects entering implementation in FY 2011: 3. Source: GAO analysis of NASA project data. [End of figure] The 16 projects currently in implementation had an average development cost growth of $89.1 million--or 13.8 percent--and average schedule growth of 8 months from their statutory baselines. The total increase in development costs for the 16 projects in implementation was over $1.4 billion. The five projects with baselines set before fiscal year 2009 were responsible for the overwhelming majority of this increase. 
All 5 projects have exceeded cost and schedule thresholds set by the Congress since their statutory baselines. Two projects--Glory and MSL--were re-baselined, but to gain a more accurate picture of cost and schedule growth, we used their original statutory baselines for our analysis. See table 2. Table 2: Cost and Schedule Growth from Statutory Baseline of Selected NASA Projects in the Implementation Phase (dollars in millions): Project: NPP; Baseline (FY): 2007; Development cost growth: $154.2; Percentage cost growth: 26.0% [shaded]; Launch delay (months): 42 [shaded]. Project: SOFIA; Baseline (FY): 2007; Development cost growth: $177.9; Percentage cost growth: 19.3% [shaded]; Launch delay (months): 12 [shaded]. Project: Aquarius; Baseline (FY): 2008; Development cost growth: $34.6; Percentage cost growth: 18.0% [shaded]; Launch delay (months): 23 [shaded]. Project: Glory[A]; Baseline (FY): 2008; Development cost growth: $170.4; Percentage cost growth: 100.9% [shaded]; Launch delay (months): 26 [shaded]. Project: MSL[B]; Baseline (FY): 2008; Development cost growth: $751.3; Percentage cost growth: 77.6% [shaded]; Launch delay (months): 26 [shaded]. Project: GRAIL; Baseline (FY): 2009; Development cost growth: $0.0; Percentage cost growth: 0.0%; Launch delay (months): 0. Project: Juno; Baseline (FY): 2009; Development cost growth: $0.1; Percentage cost growth: 0.0%; Launch delay (months): 0. Project: JWST; Baseline (FY): 2009; Development cost growth: $129.8; Percentage cost growth: 5.0%; Launch delay (months): 0. Project: RBSP; Baseline (FY): 2009; Development cost growth: $0.1; Percentage cost growth: 0.0%; Launch delay (months): 0. Project: GPM; Baseline (FY): 2010; Development cost growth: $3.0; Percentage cost growth: 0.5%; Launch delay (months): 0. Project: LDCM; Baseline (FY): 2010; Development cost growth: $4.2; Percentage cost growth: 0.7%; Launch delay (months): 0. Project: MMS; Baseline (FY): 2010; Development cost growth: $0.0; Percentage cost growth: 0.0%; Launch delay (months): 0. Project: TDRS Replenishment; Baseline (FY): 2010; Development cost growth: $0.0; Percentage cost growth: 0.0%; Launch delay (months): 0. Project: LADEE; Baseline (FY): 2011; Development cost growth: $0.0; Percentage cost growth: 0.0%; Launch delay (months): 0. Project: MAVEN; Baseline (FY): 2011; Development cost growth: $0.0; Percentage cost growth: 0.0%; Launch delay (months): 0. Project: OCO-2; Baseline (FY): 2011; Development cost growth: $0.0; Percentage cost growth: 0.0%; Launch delay (months): 0. Project: Average; Development cost growth: $89.1; Percentage cost growth: 13.8%; Launch delay (months): 8. Project: Total Development Cost; Development cost growth: $1,425.6. Source: GAO analysis of NASA data. [A] Glory established a new statutory baseline in FY 2009 after being reauthorized by Congress: [B] MSL established a new statutory baseline in FY 2010 after being reauthorized by Congress: Note: Shading indicates projects that exceeded cost and schedule thresholds. [End of table] This table does not reflect considerable cost and schedule growth that will likely be experienced by NASA's largest science program--the James Webb Space Telescope. Based on the findings of the independent panel that recently reviewed the JWST project and information we obtained from projects officials, it is likely that JWST will report significant cost and schedule growth, estimated to be $1.4 billion or more and up to 15 months, respectively. 
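As an illustration of how the summary rows at the bottom of table 2 follow from the individual project figures, the short sketch below (data transcribed from the table; the variable names are ours and the snippet is illustrative, not a GAO tool) sums the 16 projects' development cost growth and divides by 16 to reproduce the roughly $1,425.6 million total and $89.1 million average. The 13.8 percent average cost growth cannot be reproduced from the table alone, because the underlying baseline values are not shown.

# Development cost growth from table 2, dollars in millions.
growth = {
    "NPP": 154.2, "SOFIA": 177.9, "Aquarius": 34.6, "Glory": 170.4, "MSL": 751.3,
    "GRAIL": 0.0, "Juno": 0.1, "JWST": 129.8, "RBSP": 0.1, "GPM": 3.0,
    "LDCM": 4.2, "MMS": 0.0, "TDRS Replenishment": 0.0, "LADEE": 0.0,
    "MAVEN": 0.0, "OCO-2": 0.0,
}
total = sum(growth.values())       # about 1,425.6
average = total / len(growth)      # about 89.1
print(round(total, 1), round(average, 1))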
Table 2 also includes information from 11 projects that were all confirmed in the last two years and have not reported significant cost or schedule growth. Many of these projects are entering, or have recently entered, the test and integration phase where cost and schedule growth is typically realized. Specifically, seven projects plan to have their system integration review in fiscal year 2011 or 2012. Importantly, many of these projects have experienced similar challenges as the older projects that have reported cost and/or schedule growth, such as issues with maturing technology and not meeting design criteria. As previously stated, the Glory and MSL projects both sought reauthorization from Congress because of development cost growth in excess of 30 percent despite having statutory baselines reestablished in 2008.[Footnote 20] Congress reauthorized the Glory project and new statutory cost and schedule baselines were established in fiscal year 2009,[Footnote 21] after the project experienced a 53 percent cost growth and 6-month launch delay from its original statutory baseline estimates in fiscal year 2008. Although Glory's development costs have increased by almost 31 percent from the new baseline established in 2009, Glory is scheduled to launch in February 2011 before a second reauthorization would need to be sought. Similarly, MSL was reauthorized by the Congress and NASA established new statutory cost and schedule baselines early in fiscal year 2010 after reporting a 68 percent growth in cost and a 26 month schedule delay from its original statutory baselines established in fiscal year 2008. The issue of cost growth is more significant than the 13.8 percent average identified in table 2 would indicate because it does not capture the cost growth that occurred before the five projects exhibiting the most considerable growth established baselines in response to the statutory requirement in 2005. Additionally, when considering all 13 projects included in our reviews for the past three years that were confirmed prior to fiscal year 2009,[Footnote 22] we found that NASA's major projects have experienced an average development cost growth of over 51 percent, with the total increase in development costs of over $2.3 billion from their original confirmation baselines. In addition, 9 of these projects experienced significant cost growth in excess of 15 percent, the point at which NASA is required to notify the Congress if a project has exceeded the threshold for reporting. See table 3. Table 3: Cost Growth from Confirmation for Selected Major NASA Projects that Established Baselines Prior to Fiscal Year 2009 (dollars in millions). Project: Aquarius; Development Cost: Baseline: $193.0; Development Cost: Current: $227.3; Development Cost: Difference: $34.3; Development Cost: Change: 17.8%. Project: Dawn; Development Cost: Baseline: $198.0; Development Cost: Current: $266.4; Development Cost: Difference: $68.4; Development Cost: Change: 34.5%. Project: GLAST; Development Cost: Baseline: $384.0; Development Cost: Current: $418.8; Development Cost: Difference: $34.8; Development Cost: Change: 9.1%. Project: Glory; Development Cost: Baseline: $159.0; Development Cost: Current: $337.6; Development Cost: Difference: $178.6; Development Cost: Change: 112.3%. Project: Herschel; Development Cost: Baseline: $95.0; Development Cost: Current: $126.7; Development Cost: Difference: $31.7; Development Cost: Change: 33.4%. 
Project: Kepler; Development Cost: Baseline: $313.0; Development Cost: Current: $388.7; Development Cost: Difference: $75.7; Development Cost: Change: 24.2%. Project: LRO; Development Cost: Baseline: $421.0; Development Cost: Current: $451.3; Development Cost: Difference: $30.3; Development Cost: Change: 7.2%. Project: MSL; Development Cost: Baseline: $969.0; Development Cost: Current: $1,802.2; Development Cost: Difference: $833.0; Development Cost: Change: 86.0%. Project: NPP; Development Cost: Baseline: $513.0; Development Cost: Current: $780.1; Development Cost: Difference: $267.1; Development Cost: Change: 52.1%. Project: OCO; Development Cost: Baseline: $187.0; Development Cost: Current: $230.2; Development Cost: Difference: $43.2; Development Cost: Change: 23.1%. Project: SDO; Development Cost: Baseline: $597.0; Development Cost: Current: $667.0; Development Cost: Difference: $70.0; Development Cost: Change: 11.7%. Project: SOFIA; Development Cost: Baseline: $306.0; Development Cost: Current: $1,128.4; Development Cost: Difference: $822.4; Development Cost: Change: 268.8%. Project: WISE; Development Cost: Baseline: $192.0; Development Cost: Current: $191.8; Development Cost: Difference: -$0.2; Development Cost: Change: -0.1%. Project: Average; Development Cost: Difference: $191.5; Development Cost: Change: 54.99%. Total Development Cost: Development Cost: Baseline: $4,527.0; Development Cost: Current: $7,016.3; Development Cost: Difference: $2,489.3. Source: GAO analysis of NASA data. [End of table] If changes NASA continues to implement to improve its acquisition management have their intended impact, we would expect to see improvements over time in the overall performance of the portfolio of projects in maintaining cost and schedule baselines established at their confirmation reviews. Observations from Our Assessment of Knowledge Attained by Key Junctures in the Acquisition Process: Many of NASA's projects are one-time articles, meaning that there is little opportunity to apply knowledge gained to the production of second, third, or future increments of spacecraft. While space development programs are complex and difficult by nature and most are one-time efforts, NASA is still responsible for achieving what it promises when requesting and receiving funds. We have previously reported that NASA would benefit from a more disciplined, knowledge-based approach to its acquisitions. For the projects reviewed this year, we continue to identify projects that have not met best practice standards for technology maturity and design stability and have experienced challenges in development. These challenges were assessed based on knowledge that, according to acquisition best practices, should be attained at key junctures in the project life-cycle to lessen the risks to the project. Technology Challenges: [Side bar: Projects experiencing technology challenges: * Ares I; * Glory; * GPM; * GRAIL; * Juno; * JWST; * LADEE; * LDCM; * MMS; * MSL; * NPP; * Orion; * SOFIA. End side bar] During the course of our review, we found that 13 projects had experienced technology issues, such as a lack of technology maturity for both critical and heritage technologies. 
Specifically, of the 18 projects that had completed the preliminary design review--the point in time where best practices say requisite technology maturity should be reached to lessen risk--11 projects reported moving forward with immature technologies.[Footnote 23] Two other projects--MMS and NPP-- reported issues with immature technologies for instruments that were being developed by partners. Our best practices work has shown that a technology readiness level (TRL) of 6--demonstrating a technology as a fully integrated prototype in a relevant environment--is the level of maturity needed to minimize risks for space systems entering product development. For NASA, projects enter development following the project's preliminary design review and confirmation review.[Footnote 24] NASA's acquisition policy states that by the preliminary design review a TRL of 6 is desirable prior to integrating a new technology on a project.[Footnote 25] Technology maturity is a fundamental element of a sound business case, and its absence is a marker for subsequent problems, especially as the project begins more detailed design efforts.[Footnote 26] Similarly, our work has shown that the use of heritage technology-- proven components that are being modified to meet new requirements-- can also cause problems when the items are not sufficiently matured to meet form, fit, and function standards of the project that will be using it by the preliminary design review.[Footnote 27] NASA frequently employs heritage technologies that have to be modified from their original form, fit, and function. NASA's Systems Engineering Handbook states that particular attention must be given to heritage systems because they are often used in architectures and environments different from those in which they were designed to operate. Further, the Handbook states that modification of heritage systems is a frequently overlooked area in technology development and that there is a tendency by project management to overestimate the maturity and applicability of heritage technology to a new project. Our work has shown, and NASA's own guidance concurs, that this is an area that is frequently underestimated when developing project cost estimates. Although NASA distinguishes critical technologies from heritage technologies, our best practices work has found critical technologies to be those that are required for the project to successfully meet customer requirements, regardless of whether or not they are based on existing or heritage technology. Therefore, whether technologies are labeled as "critical" or "heritage," if they are important to the development of the spacecraft or instrument--enabling it to move forward in the development process--they should be matured by the preliminary design review. NASA is making progress with regard to adhering to best practices standards for technology maturity at the preliminary design review as the number of projects not meeting this criteria has decreased in recent years. Nearly two thirds of the projects in our current review, however, do not meet this standard. See figure 3 for an analysis of projects that we reviewed in the past three years that held their preliminary design review and the percent of those projects that moved into implementation with immature technologies. 
Figure 3: Percentage of Major NASA Projects with Immature Technologies at the Preliminary Design Review: [Refer to PDF for image: stacked vertical bar graph] Year: 2009; Projects meeting technology maturity criteria: 17%; Projects not meeting technology maturity criteria: 83%. Year: 2010; Projects meeting technology maturity criteria: 29%; Projects not meeting technology maturity criteria: 71%. Year: 2011; Projects meeting technology maturity criteria: 38%; Projects not meeting technology maturity criteria: 63%. Source: GAO analysis of data provided by NASA. Note: Totals may not add to 100% due to rounding. [End of figure] Proceeding into implementation with immature technologies increases a project's risk of cost and schedule overruns. For instance, the MSL project was given approval to move into the implementation phase despite reporting that seven of its critical technologies were not mature at the time of its preliminary design review. At the critical design review a year later, three of the seven critical technologies had been replaced by backup technologies, with two of the seven still assessed as immature, including one of the replacement technologies. Challenges in development contributed to the MSL project's 26-month schedule delay and $750 million increase in total lifecycle costs. In another example, one of Glory's main instruments--the Aerosol Polarimetry Sensor--was assessed as an immature critical technology at the project's preliminary design review, yet the project was approved to proceed into implementation. Since then, the project has experienced numerous issues with development of that instrument, resulting in a delay of over a year in its delivery and a cost increase to the project of over $100 million. Other projects in formulation are allocating extra time and funding in order to mature critical technologies by their preliminary design review. By investing in technology development early on in the project, the project may safeguard against some cost and schedule growth once it is in the implementation phase. For example, two projects in the formulation phase--ICESat-2 and Solar Probe Plus--have both allocated increased time and funding for development of their multi-beam laser and sunshield technologies, respectively, which should help to lessen risk to the projects moving forward. Finally, when analyzing the number of reported critical technology development efforts by the projects in our review, we found that four of the 21 projects reported no development of new critical technologies, while another eight projects reported development of only one critical technology. When we presented this data to senior NASA officials, they told us that it appears the projects did not accurately identify the number of critical technologies they plan to develop and suggested that the projects were only including technologies at the system level. We plan to continue to work with NASA to ensure projects are accurately identifying their critical technologies, both for our purposes and to assist NASA decision makers in assessing the readiness of projects to move forward in their development lifecycles. Design Challenges: [Side bar: Projects experiencing design challenges: * Aquarius; * Glory; * GPM; * Juno; * JWST; * MAVEN; * MMS; * MSL; * NPP; * SOFIA. 
End of side bar] Ten of the 12 projects we reviewed that held their critical design review[Footnote 28]--the point in time where best practices say requisite design maturity should be reached to lessen risk--did not meet the best practices criterion of having 90 percent of engineering drawings releasable. See figure 4. Figure 4: Percentage of Engineering Drawings Releasable at CDR for Selected NASA Projects: [Refer to PDF for image: vertical bar graph] Projects that completed CDR: Aquarius; Engineering drawings releasable at CDR: 16%; Best practices criteria: 90%. Projects that completed CDR: Glory; Engineering drawings releasable at CDR: 64%; Best practices criteria: 90%. Projects that completed CDR: GPM; Engineering drawings releasable at CDR: 50%; Best practices criteria: 90%. Projects that completed CDR: GRAIL; Engineering drawings releasable at CDR: 82%; Best practices criteria: 90%. Projects that completed CDR: Juno; Engineering drawings releasable at CDR: 39%; Best practices criteria: 90%. Projects that completed CDR: JWST; Engineering drawings releasable at CDR: 84%; Best practices criteria: 90%. Projects that completed CDR: LDCM; Engineering drawings releasable at CDR: 85%; Best practices criteria: 90%. Projects that completed CDR: MSL; Engineering drawings releasable at CDR: 0%; Best practices criteria: 90%. Projects that completed CDR: NPP; Engineering drawings releasable at CDR: 65%; Best practices criteria: 90%. Projects that completed CDR: OCO-2; Engineering drawings releasable at CDR: 95%; Best practices criteria: 90%. Projects that completed CDR: RBSP; Engineering drawings releasable at CDR: 68%; Best practices criteria: 90%. Projects that completed CDR: TDRS; Engineering drawings releasable at CDR: 95%; Best practices criteria: 90%. Source: GAO analysis of data provided by NASA. [End of figure] We have previously reported that NASA's acquisition policy does not specify a metric by which a project's design stability is measured at the critical design review.[Footnote 29] Guidance in NASA's Systems Engineering Handbook, however, mirrors the best practices metric that at least 90 percent of engineering drawings should be releasable by the critical design review. Discussions with project officials showed the metric was used inconsistently to gauge design stability. For example, Goddard Space Flight Center requires greater than 80 percent of drawings released at the critical design review, yet several project officials reported that the "rule of thumb" for NASA projects is between 70 and 90 percent. As shown in figure 4 above, 7 of the 12 projects reported releasable engineering drawings of less than 70 percent, lower than even the "rule of thumb" used by several project managers. The 12 projects averaged only 62 percent of their engineering drawings releasable at their critical design reviews, an increase from the less than 40 percent we reported last year. While the average has improved, it is still well below the best practices metric. Further, nearly all of the projects we reviewed over the last three years held their critical design review without 90 percent of engineering drawings being releasable--failing to meet NASA Systems Engineering Handbook guidance and our best practices criteria for design stability. Achieving design stability allows projects to "freeze" the design and minimize changes in the future. An unstable design, on the other hand, can result in costly re-engineering and re-work efforts, design changes, and schedule slippage. 
The majority of the 12 projects that held their critical design review had increases, in two cases of well over 100 percent, in the number of engineering drawings released after their critical design reviews, when, according to NASA's Systems Engineering Policy, a project's design is to be stable enough to support full-scale fabrication, assembly, integration, and test.[Footnote 30] This is particularly evident in projects in our review that held their critical design reviews prior to fiscal year 2009--that is, projects that have more of a history against which to track variances. As shown in figure 5 below, these four projects, on average, had a 107 percent increase in expected engineering drawings after the critical design review after having only 36 percent of drawings releasable at that review. The remaining eight projects have only recently held their critical design review in fiscal year 2009 or later and have not reported a large increase in expected drawings. Figure 5: Comparison of Design Drawing Increase for Projects with CDR prior to and since Fiscal Year 2009: [Refer to PDF for image: vertical bar graph] Projects with CDR prior to FY 2009: Average drawings released at CDR: 36.25%; Average increase in expected drawings after CDR: 107%. Projects with CDR in FY 2009 or later: Average drawings released at CDR: 74.75%; Average increase in expected drawings after CDR: 8.25%. Source: GAO analysis of data provided by NASA. [End of figure] Some of the projects we reviewed in the past three years pointed to other activities that occurred prior to the critical design review as evidence of design stability. In addition to releasable engineering drawings, NASA often relies on subject matter experts in the design review process and other methods to assess design stability. For example, the Standing Review Board[Footnote 31] provides an expert assessment of the technical and programmatic approach, risk posture, and progress against the project baseline at key decision points to be assured that the project has a stable design. Furthermore, some projects reported using engineering models and engineering test units to assess design stability. For example, an MMS project official reported that the number of complete engineering test units is as important, if not more so, than design drawings. By using engineering models that are as flight-ready as possible, MMS project officials reported that they can see where problems are and better identify risks. In addition, a GPM project official said that the lack of releasable drawings at the critical design review did not have a serious impact in terms of design stability, as testing was almost complete on the engineering test units and flight units were already designed and ready to begin manufacturing. The Juno project released only 39 percent of its engineering drawings at its critical design review, and project officials reported that they used engineering models for all instruments, rather than released engineering drawings, to demonstrate design maturity at CDR. The Juno project, however, experienced a 46 percent increase in the expected number of engineering drawings after its CDR, indicating that the design was not stable. As mentioned above, NASA does not use a common measure to assess design stability before allowing programs to move from the design phase to the test and integration phases of the development process. Our studies and others have found that significant cost growth occurs in these phases and, in some instances, have tied these problems to issues related to design. 
Moreover, a recent study by the National Research Council[Footnote 32] found that the critical design review milestone for many NASA missions may be held prematurely--driven by schedule rather than by design maturity. Regardless of how stability is measured, common quantitative measures employed at the critical design review, such as the percentage of engineering drawings that are in a releasable state, can provide evidence that the design is stable and provide assurance that it is mature and will meet performance requirements. These measures can also be an indication to decision makers that the requisite knowledge has been attained to allow the project to proceed in its development lifecycle and better enable them to assess the performance of individual projects against the overall portfolio of projects. Observations on Other Challenges That Can Affect Project Outcomes: In addition to collecting and analyzing data on the attainment of knowledge at key junctures, we collected and assessed data on five additional areas that can present challenges to obtaining positive project outcomes: funding, launch vehicles, development partner performance, parts, and contractor management. Contractor management did not present as big a challenge to the projects covered by this review as it did in previous reports, but it continues to warrant monitoring by the projects and other decision makers as a common area that challenges project execution. The degree to which each area challenged project execution varied, and, in most instances, we did not designate any specific challenge as a primary factor for cost and/or schedule growth. Funding Challenges: [Side bar: Projects experiencing funding challenges: * Aquarius; * Ares I; * Glory; * GPM; * JWST; * Orion; * SOFIA. Projects that received ARRA funding: * Aquarius; * Ares I; * Glory; * GPM; * ICESat-2; * JWST; * LDCM; * OCO-2; * Orion; * SMAP. End of side bar] Matching funding to requirements is critical to the success of complex acquisitions, yet funding is often insufficient in government acquisitions, as agencies tend to start more projects than they can afford and often have to cut budgets after programs begin in order to address cost increases in highly problematic efforts. Several studies have highlighted this issue at NASA, and NASA's administrator recently stressed the need to ensure projects are affordable before they are started. This year, we identified 3 projects that faced significant cost and schedule problems because their original funding did not align with program plans. These include Ares I, Orion, and JWST, which represent NASA's largest investments. In addition, we identified 10 projects that received unanticipated funding from the American Recovery and Reinvestment Act of 2009.[Footnote 33] This event was an anomaly, and it carried with it restrictions and requirements that narrowed the scope of projects it could be applied to and required additional administrative work, which initially dissuaded some projects and contractors from accepting the funds. Nevertheless, the stimulus funding enabled NASA to mitigate the impact of cost increases being experienced in its largest projects and also to address problems being experienced in other projects. In several cases, NASA took advantage of the funding to build additional knowledge about technology or design before key milestones. According to NASA officials and independent reviews, the projected budgets for JWST, Ares I, and Orion were inadequate to perform work in certain fiscal years. 
In November 2010, an independent review panel concluded that the JWST budget baseline accepted at the confirmation review did not reflect the most probable cost with adequate reserves in each year of project execution. This resulted in a project that was not executable within the budgeted resources. According to the review, the project was able to stay within its yearly budget allocation by deferring planned work in the budget year to future years. This approach was an ineffective control measure because costs were postponed and funded from a subsequent year's allocation at a cost that was typically two to three times higher due to the impact of the deferrals on other work. Further, the panel estimated that the project will need an additional $1.4 billion or more for an earliest launch date of September 2015--$500 million of which will be needed in fiscal years 2011 and 2012. Also, as we have reported previously, NASA initiated the Constellation program relying on the accumulation of a large rolling budget reserve in fiscal years 2006 and 2007 to fund program activities in fiscal years 2008 through 2010.[Footnote 34] This poorly phased funding plan diminished both the Ares I and Orion projects' ability to deal with technical problems and funding shortfalls in 2010, and, in part, led the President to propose cancellation of the program in the fiscal year 2011 budget submission. An independent review commissioned by the Administration also found that the Ares I and Orion programs did not have budget profiles that matched the work that needed to be done. With regard to the American Recovery and Reinvestment Act of 2009 (ARRA), 10 projects used these additional funds to offset existing funding issues, such as covering the cost of delays or averting "stop work" orders to contractors, or to lessen risk by initiating or further enhancing technology development efforts and long lead procurements that otherwise would not have been funded at that time. The Science Mission Directorate conducted extensive analysis on how best to utilize the funding because, as officials told us, these additional funds would not necessarily alleviate all technology development or other schedule delays, and in some cases the funds would have no impact. See table 4 below for the NASA projects in our review receiving this funding and how these funds were used. Table 4: ARRA Funding for Reviewed NASA Projects: Project: Ares I; ARRA funds: $102.4 million; Use of funds: To manufacture and assemble engine components for development testing, completion of a test stand, and preparation for test operations. Project: Aquarius; ARRA funds: $8.6 million; Use of funds: To maintain the current workforce through the planned launch. Project: Glory; ARRA funds: $16.0 million; Use of funds: To maintain the current workforce through the planned launch. Project: GPM; ARRA funds: $32.0 million; Use of funds: To accelerate construction of the GPM Microwave Imager (GMI) instrument to ensure the core spacecraft is successfully launched at the earliest possible opportunity. Project: ICESat-2; ARRA funds: $20.4 million; Use of funds: To mature the micro-pulse laser designs. Project: JWST; ARRA funds: $75.0 million; Use of funds: To maintain workforce levels and achieve the earliest possible launch date. Project: LDCM; ARRA funds: $63.4 million; Use of funds: To initiate development of the thermal infrared sensor (TIRS) and support other LDCM development.
Project: OCO-2; ARRA funds: $18.0 million; Use of funds: To acquire long lead components for the spacecraft and facilitate instrument development in order to accelerate and enable the earliest possible OCO-2 launch. Project: Orion; ARRA funds: $165.9 million; Use of funds: To avoid workforce reductions and mitigate technical challenges with its launch abort system, landing parachutes, solar arrays, heatshield, and propulsion systems. Project: SMAP; ARRA funds: $64.0 million; Use of funds: To procure long lead components and conduct component level preliminary design reviews in order to accelerate the launch date. Source: GAO presentation of data provided by NASA. [End of table] Launch Vehicle Challenges: [Side bar: Projects experiencing launch vehicle challenges: * Glory; * GRAIL; * ICESat-2; * LADEE; * MAVEN; * NPP; * SMAP; * SPP. End of side bar] Eight of the 21 projects in our review have experienced challenges with launch vehicles. The primary concern is the retirement of the Delta II medium launch vehicle. Over the past decade, NASA has launched about 60 percent of its science missions on the Delta II. NASA plans to continue to use the Delta II as a launch vehicle for three remaining science missions--Aquarius, the Gravity Recovery and Interior Laboratory, and the National Polar-orbiting Operational Environmental Satellite System Preparatory Project--the last of which is currently scheduled to launch in October 2011. These projects have identified risks associated with the last flights, such as the availability of workforce and spare parts, which they, along with NASA's Launch Services Program, have taken steps to mitigate. Our recent work on NASA's transition plans for future medium launch vehicles indicates that emerging NASA science missions will face increased risks until new vehicles are certified.[Footnote 35] NASA science missions requiring a medium class launch vehicle that are approaching their preliminary design review face uncertainties in committing to as-yet uncertified and unproven launch vehicles that will eventually replace the Delta II. Several missions, including the SMAP and ICESat-2 missions, are approaching the point in the development lifecycle where it is optimal to finalize a decision on a launch vehicle. NASA plans to fill the gap left by the retirement of the Delta II by eventually certifying the Falcon 9 and Taurus II vehicles[Footnote 36] for use by NASA science missions in the relative cost and performance range of the Delta II. This approach, however, is not without risk as these vehicles are largely unproven. In a recent report, we recommended that NASA perform detailed cost estimates to determine the likely costs of certification of these new vehicles and provide adequate budgeting for the risks associated with this approach.[Footnote 37] NASA concurred with this recommendation and agreed to provide cost estimates for certification and the resolution of technical issues during certification of the Falcon 9. Other launch challenges beyond the Delta II transition affected projects in our review this year. For example, the Taurus XL, which failed during the launch of OCO, was scheduled to return to flight in late 2010 for the Glory mission. NASA and the Taurus XL launch vehicle contractor were operating under constrained timelines to complete Taurus XL return to flight activities; however, the Glory project experienced technical challenges that led the project to delay the launch from November 2010 to February 2011, providing enough time to address return to flight activities.
A malfunction in the ground support equipment associated with the Taurus XL launch vehicle has subsequently delayed the launch of the Glory project until March 2011. Development Partner Challenges: [Side bar: Projects experiencing development partner challenges: * Aquarius; * GPM; * Juno; * LDCM; * MMS; * NPP. End of side bar] Six projects reported challenges with international or domestic development partners not meeting project commitments within planned resources. Project officials reported several reasons why development partners were unable to fulfill their obligations, including a lack of experience in producing spacecraft and the lack of adequate funding. For example, delays in the development of the spacecraft bus by Argentina’s National Committee of Space Activities were identified as the reason for the Aquarius project’s 15 percent development cost increase and 18-month schedule slip that NASA reported to the Congress in February 2010. Since that time, the project has determined that the launch will be delayed by at least another 5 months for a total delay of 23 months. Project officials said that while Argentina’s National Committee of Space Activities is technically competent, it lacks experience in managing spacecraft production projects. Aquarius project officials estimate the cost impact of these delays to be approximately $35 million. In addition, projects also experienced challenges related to development partners’ providing adequate funding for their contributions. For example, the GPM project identified a project risk that its international development partner, the Japan Aerospace Exploration Agency (JAXA), may be unable to fund needed launch support services as originally planned. In the past 3 years, we reviewed 13 projects that established their baselines prior to fiscal year 2009. As shown in table 5, the average schedule delay from their baselines is 17.6 months for the projects with foreign or domestic development partners, but 10.6 months for projects that had no development partner. Table 5: Schedule Growth for Selected NASA Projects with and without Development Partners Baselined before 2009: Projects with Partners: Dawn; Baseline (FY): 2007; Launch Delay (months): 0. Projects with Partners: GLAST; Baseline (FY): 2007; Launch Delay (months): 9. Projects with Partners: Herschel; Baseline (FY): 2007; Launch Delay (months): 21. Projects with Partners: LRO; Baseline (FY): 2008; Launch Delay (months): 8. Projects with Partners: NPP; Baseline (FY): 2007; Launch Delay (months): 42. Projects with Partners: SOFIA; Baseline (FY): 2007; Launch Delay (months): 12. Projects with Partners: Aquarius; Baseline (FY): 2008; Launch Delay (months): 23. Projects with Partners: MSL; Baseline (FY): 2008; Launch Delay (months): 26. Projects with Partners: Average; Launch Delay (months): 17.6. Projects without Partners: Kepler; Baseline (FY): 2007; Launch Delay (months): 9. Projects without Partners: SDO; Baseline (FY): 2007; Launch Delay (months): 18. Projects without Partners: Glory; Baseline (FY): 2008; Launch Delay (months): 20. Projects without Partners: OCO; Baseline (FY): 2008; Launch Delay (months): 5. Projects without Partners: WISE; Baseline (FY): 2008; Launch Delay (months): 1. Projects without Partners: Average; Launch Delay (months): 10.6. Source: GAO analysis of NASA data.
[End of table] Although the cost and schedule growth for some of the projects that have development partners can be attributed to other challenges, for example technology or design issues, there are instances where the performance of the development partners was the primary factor in cost and schedule growth. For example, the Aquarius, NPP, and Herschel projects all experienced significant delays as a direct result of issues related to their development partners. Parts Challenges: [Side bar: Projects experiencing parts challenges: * Glory; * Juno; * LADEE; * LDCM; * MSL; * OCO-2; * RBSP; * TDRS Replenishment. End of side bar] While most of the projects in our assessment reported challenges related to parts quality or availability, 8 projects this year experienced an impact to their costs or had to make alterations to their schedules as a result of these challenges. According to NASA officials, parts problems are not uncommon for projects, and NASA's testing process is designed to identify part failures at the component, subsystem, and system level before they lead to mission failure. For example, a parts quality problem discovered during the testing and integration of the Glory project resulted in an additional $61 million in cost and delayed the project by 17 months. The project had to replace the printed wiring board of the spacecraft's single board computer due to reliability problems with the original board. In addition, the project recently discovered excessive wear of the Slip Ring Assembly in the solar arrays, resulting in an additional 3-month launch delay. The MSL project also experienced a part failure associated with the transition joints in the propulsion system, which caused the joints to overheat and fail. Project officials reported this issue was discovered after the project finished building its propulsion system, causing the project to rebuild the system and adopt a new joint design. The transition to the new design delayed rover testing from 2009 to early 2010. NASA centers work together and communicate potential systemic issues. For example, parts personnel at Goddard Space Flight Center maintain a center-level parts database, which links to the agency-wide Government Industry Data Exchange Program alert system.[Footnote 38] GAO has an ongoing assessment of parts quality across the government space sector and will be reporting on actions being taken by NASA and other agencies to prevent and mitigate such problems. Contractor Management Challenges: [Side bar: Projects experiencing contractor management challenges: * Glory; * Juno; * JWST; * Orion; * RBSP; * SOFIA. End of side bar] Five projects in implementation and one project in formulation reported experiencing contractor challenges, including contractors not completing work on time, not identifying risks for the project, and inadequate oversight. In past reports, contractor management challenges were reported for a greater number of projects and with a greater impact. Although the impact of this challenge on the projects we reviewed this year has diminished, contractors spend about 85 percent of NASA's annual budget, and their performance is therefore critical to the success of many NASA missions. As a result, we continue to identify this area as a common project challenge that can contribute to cost and schedule growth.
In one case, RBSP project officials are expecting the delivery of the Magnetic Electron Ion Spectrometer instrument to be delayed due to the time a vendor is taking to provide needed flight hardware for the instrument. Consequently, the project has re-planned the schedule to accommodate the late delivery and integration of the instrument. This re-plan maintains the launch readiness date by reordering the observatory integration and test flow and changing selected subsystem and instrument delivery dates. In another example, an independent review panel found that the JWST project did not have staff resident at the prime contractor facility to help avoid surprises, especially since the contract represented approximately half of the JWST project’s budget. The panel said that placing resident staff at a contractor's facility is a normal practice and is done for other projects at Goddard Space Flight Center. Further, while project officials told us that the project’s prime contractor and one of the subcontractors came forward after confirmation with large cost increases that they had not previously identified as risks, the panel found that these risks had been identified and that the project had asked the prime contractor to submit them in a formal proposal before they could be recognized as risks. GAO has ongoing work to review NASA’s contractor surveillance and oversight practices and will issue a report later in 2011. Observations about NASA's Continued Efforts to Improve Its Acquisition Management: In response to GAO’s designation of NASA’s acquisition management as a high risk area,[Footnote 39] NASA developed a corrective action plan to improve the effectiveness of its program/project management.[Footnote 40] The plan identifies five areas for improvement--program/project management, cost reporting process, cost estimating and analysis, standard business processes, and management of financial management systems--each of which contains targets and goals to measure improvement. As part of this initiative, the agency is continuing its implementation of a new cost estimating tool, the Joint Cost and Schedule Confidence Level, to help project officials with management, cost and schedule estimating, and maintenance of adequate levels of reserves. In addition to the corrective action plan, NASA is in the process of implementing Earned Value Management within certain programs and specific in-house efforts to help projects monitor the scheduled work performed by their contractors and employees; however, this management tool has not yet been institutionalized within the NASA Centers. These two efforts, in addition to other improvements NASA is making to address acquisition management, are positive steps toward addressing NASA’s issues with meeting cost and schedule baselines. It is, however, too early to assess their impact on NASA’s performance. Additionally, NASA’s progress could be hindered by the continued lack of a consistent measure for ensuring design stability as well as by limited transparency with regard to costs for projects in the early, critical phases of development, both of which are key to ensuring that internal and external decision makers are well informed. We recently raised both issues as potential impediments to success in congressional testimony and plan to recommend improvements in a separate report.
[Footnote 41] Joint Cost and Schedule Confidence Levels Being Implemented: NASA's Joint Cost and Schedule Confidence Level (JCL), adopted in January 2009, is a point-in-time estimate that includes, among other things, all cost and schedule elements, incorporates and quantifies known risks, assesses the impacts of cost and schedule to date, and addresses available annual resources. The primary goals of the JCL are to help project officials with management, cost and schedule estimating, and maintenance of adequate levels of reserves; to provide assurance to stakeholders that NASA will meet cost and schedule targets; and to provide transparency on the effects of funding changes on the probability of meeting cost and schedule commitments. NASA requires that a JCL be conducted prior to the confirmation review. NASA policy also requires that projects be baselined and budgeted at the 70 percent confidence level and funded at a level equivalent to at least the 50 percent confidence level for the project.[Footnote 42] According to NASA officials, this would include reserves held at the directorate and project level. The total amount of reserves held at the project level varies based on where the project is in its lifecycle. The reserves represent the amount of estimated costs that are not allocated to the specific project sub-elements. See figure 6 for a visual depiction of this funding allocation. Figure 6: Notional Allocation of Reserves under the 70 Percent Confidence Level Funding Requirements: [Refer to PDF for image: line graph] The graph depicts the amount NASA budgets for project reserves and mission directorate or program reserves. Source: GAO analysis of NASA policy. Note: The amount of project reserves varies as the project moves through its lifecycle. [End of figure] NASA's Associate Administrator for the Science Mission Directorate indicated that adoption of the new JCL process will reduce NASA's portfolio because the cost estimating will be more accurate at the 70 percent confidence level, reflecting higher costs from the outset to avoid higher cost overruns in the future, and as a result NASA will have fewer dollars available to start new projects. Five out of the 21 projects[Footnote 43] in our review have recently completed the JCL process, and several others are in the process of conducting a JCL analysis. NASA is still in the process of refining the tools used to create the JCL based on feedback from the projects. As NASA evolves its cost estimation processes and as we continue to conduct our reviews of the projects that have gone through the JCL process, we will be better able to assess the impact this initiative has on the projects' ability to meet cost and schedule commitments and to address potential cost and schedule drivers. Implementation of Earned Value Management at NASA Centers in Progress: Earned value management (EVM) is a program management tool that integrates the technical, cost, and schedule parameters of a contract and uses those parameters to measure cost and schedule variances. During our review, we found that implementation of earned value management is occurring within 11 projects, and earned value data are reported by projects on a monthly basis to upper-level project management. While earned value management is being used by these projects, it has not yet been used consistently as a tool for managing cost and schedule.
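The variance calculations at the core of this tool are standard earned value relationships. The following minimal sketch, written in Python for illustration purposes only and using hypothetical monthly figures rather than data from any NASA project or contractor, shows how cost and schedule variances and the related performance indices are derived from planned value, earned value, and actual cost:

def evm_variances(planned_value, earned_value, actual_cost):
    """Return cost variance, schedule variance, and the cost and schedule performance indices."""
    cost_variance = earned_value - actual_cost        # negative indicates a cost overrun
    schedule_variance = earned_value - planned_value  # negative indicates work behind schedule
    cpi = earned_value / actual_cost                  # cost performance index
    spi = earned_value / planned_value                # schedule performance index
    return cost_variance, schedule_variance, cpi, spi

# Hypothetical figures for one reporting month, in millions of then-year dollars.
cv, sv, cpi, spi = evm_variances(planned_value=50.0, earned_value=45.0, actual_cost=52.0)
print(f"Cost variance: {cv:.1f}; schedule variance: {sv:.1f}; CPI: {cpi:.2f}; SPI: {spi:.2f}")

[End of illustrative example]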
According to a briefing from the NASA Advisory Council's Audit, Finance, and Analysis Committee, NASA's goal is to develop and deploy an agency-wide EVM capability that is compliant with generally accepted standards.[Footnote 44] At this time, only the Jet Propulsion Laboratory, a Federally Funded Research and Development Center and not a NASA Center, has a compliant system. If implemented appropriately, EVM provides objective reports of project status, produces early warning signs of impending schedule delays and cost overruns, and can identify specific development efforts contributing to those overruns. For example, MSL's June 2010 earned value management report identified the avionics and actuators as the primary drivers of the project's cost overruns. In particular, the data showed that ongoing unplanned technical issues with three of the heritage avionics technologies would likely result in a cost overrun of $11.5 million. More consistent use of this management tool could help address the project challenges identified earlier in this report that threaten cost and schedule during project development. The EVM data we received from NASA were not provided in a timely manner and were incomplete. As a result, we were unable to perform a detailed analysis by project to provide our own determination of whether the information provided by the contractors is accurate and can be relied on by the projects and management as a tool to assess progress. We plan to conduct a more thorough analysis of EVM data in ongoing work and in future iterations of this work. Transparency and Accountability Not Sufficient to Provide Proper Oversight: These initiatives aimed at improving cost estimating and management oversight are positive steps. However, we recently testified that NASA does not yet provide enough transparency during project development to help Congress identify risks and inefficiencies and ensure earlier accountability.[Footnote 45] Currently, NASA does not share cost and schedule information for projects in the early, critical phases of development and only makes this information public after the projects have been formally approved to enter implementation. Projects establish preliminary cost baselines in the formulation phase; these estimates, however, are for planning purposes only, enabling NASA decision makers to better manage the overall portfolio of projects. NASA does not report deviations from these preliminary baselines to the Congress. In addition, NASA does not report information on what has been spent to date on the projects in formulation, as it does in its annual budget submission for projects in implementation. To add some perspective to this timing, neither the Ares I nor the Orion project has reached this point, despite the two projects having spent over $9 billion combined; and JWST just reached this point in 2008, despite having spent nearly $2 billion before then. Despite the absence of established external cost and schedule baselines to measure the progress of the project, cost growth and schedule delays can and do occur during the formulation phase. NASA's internal analysis of past projects indicates that there is an average of 14 percent growth in the development cost estimates during the formulation phase.
While there is a need to allow projects a period of time for discovery and to pursue different concepts--particularly for highly complex efforts such as JWST--inadequate transparency into their progress for what sometimes amounts to five or more years can preclude effective oversight and accountability and make it even more difficult to stop projects that are not on track to meet the agency's goals with available resources. Additional insight into costs could better enable Congress to make informed decisions when approving the projects through the annual appropriations process. In addition, a recently released report from the Independent Comprehensive Review Panel[Footnote 46] concerning problems affecting the JWST program concluded that significant changes are still needed in NASA's oversight and accountability functions to ensure that programs base their decisions on sound knowledge. The panel noted that NASA's governance policy is inconsistent with accountability for project execution. In particular, the panel found that a lack of clear lines of authority and accountability contributed to a lack of executive leadership in resolving the broken JWST life-cycle cost baseline. Additionally, the study found that JWST's flawed budget should have been discovered as part of Goddard Space Flight Center's execution responsibility, but the agency's governance policy on the role of the center in this regard is ambiguous and is not interpreted uniformly within NASA. As a result, the report noted that ongoing, regular independent assessment and oversight processes at the agency are missing. Project Assessments: The two-page assessments of the projects we reviewed provide a profile of each project and describe the challenges we identified this year, as well as challenges that we have identified in the past. On the first page, the project profile presents a general description of the mission objectives for each of the projects; a picture of the spacecraft or aircraft; a schedule timeline identifying key dates for the project; a table identifying programmatic and launch information; a table showing the current statutory baseline year cost and schedule estimates and the November 2010 cost and schedule data; a table showing the challenges relevant to the project; and a project summary narrative. To maintain information on challenges the projects experience over their lifetime, we continued to identify project challenges that were reported in prior reports. On the second page of the assessment, we provide an analysis of the project challenges and the extent to which each project faces cost, schedule, or performance risk because of these challenges. In addition, NASA project offices were provided an opportunity to review drafts of the assessments prior to their inclusion in the final product, and the projects provided both technical corrections and more general comments. We integrated the technical corrections as appropriate and characterized the general comments below the project update. See figure 7 below for an illustration of the layout of each two-page assessment. Figure 7: Illustration of Project Two-Page Summary: [Refer to PDF for image: illustration] A. General description of mission’s science objectives. B. Illustration of spacecraft or aircraft. C. Schedule timeline identifying key dates for the project including when the project began formulation, major design reviews, confirmation to begin the implementation phase, and scheduled launch readiness. D.
Project Essentials Programmatic information including the responsible NASA center, international or domestic partners, major contractors, and launch information. E. Project Performance Cost and schedule baseline estimates and the latest estimate updates as of February 2011. F. Project Challenges Summary listing the challenges facing the project based on a successful acquisition business case. G. Project Summary Brief narrative describing current status of the project with regard to the challenges identified. H. Project Update Analysis of project challenges and the extent to which each project faces cost, schedule, or performance risk because of these challenges. I. Project Office comments General comments provided by the cognizant project office. Source: GAO analysis. [End of figure] [End of section] Project data: Common Name: Aquarius: Aquarius is a satellite mission developed by NASA and the Space Agency of Argentina (Comisión Nacional de Actividades Espaciales, CONAE) to investigate the links between the global water cycle, ocean circulation, and the climate. It will measure global sea surface salinity. The Aquarius science goals are to observe and model the processes that relate salinity variations to climatic changes in the global cycling of water and to understand how these variations influence the general ocean circulation. By measuring salinity globally for 3 years, Aquarius will provide a new view of the ocean’s role in climate. [Refer to PDF for image: artist depiction] Source: Aquarius Project. Formulation: Formulation start: 12/03; Preliminary design review: 6/05. Implementation: Project Confirmation: 9/05; Critical design review: 9/06; GAO review: 12/10; Launch readiness date: 6/11. Project essentials: NASA Center Lead: Jet Propulsion Laboratory (JPL)[A]; International Partner: Argentina's National Committee of Space Activities (CONAE); Major Contractors: In-house development; Projected Launch Date: June 2011; Launch Location: Vandenberg AFB, California; Launch Vehicle: Delta II; Mission Duration: 3 years for Aquarius mission; 5 years for SAC-D (CONAE) mission. [A] JPL is a federally funded research and development center. Table: Project Performance (then year dollars in millions): Total Project Cost: Baseline Est. (FY 2008): $241.8; Latest (Feb. 2011): $279.0; Change: 15.4%. Formulation Cost: Baseline Est. (FY 2008): $35.5; Latest (Feb. 2011): $35.6; Change: 0.3%. Development Cost: Baseline Est. (FY 2008): $192.7; Latest (Feb. 2011): $227.3; Change: 18.0%. Operations Cost: Baseline Est. (FY 2008): $13.6; Latest (Feb. 2011): $16.1; Change: 18.4%. Launch Schedule: Baseline Est. (FY 2008): 7/2009; Latest (Feb. 2011): 6/2011; Change: 23 months. [End of table] Recent/Continuing Project Challenges: * Development Partner Issues; * Funding Issues. Previously Reported Challenges: * Design Stability. Project Summary: The launch of Aquarius has been delayed from the July 2009 baseline to June 2011 because of delays in CONAE’s spacecraft development and problems with the propulsion system thrusters. The launch delay, which added costs to the project, prompted NASA to report to the Congress in February 2010 that the Aquarius project exceeded its development cost and schedule baselines by 15 percent and more than 6 months, respectively. NASA completed its development of the Aquarius instrument, which is currently being integrated with the Argentine- developed spacecraft. Project officials estimated the cost of the past schedule slips to be about $35.5 million. 
Project Update: NASA reported to Congress in the agency’s fiscal year 2011 budget estimates that the Aquarius mission’s development costs had grown by 15 percent from its 2008 baseline. Additionally, the project’s current June 2011 launch date represents a 23-month schedule slip. These cost and schedule overruns are due to delays by the international partner. Development Partner Issues: According to project officials and budget documents, delays in the development of the spacecraft bus by CONAE were responsible for the 15 percent development cost increase and 18-month schedule slip that NASA reported to Congress in February 2010. Since that time, the project has determined that the launch will be delayed by another 5 months to June 2011, for a total delay of 23 months. To facilitate the work of its partners, the Jet Propulsion Laboratory (JPL) project team said that it appointed a chief mission engineer to support upcoming tests and reviews; however, JPL officials stated that they have not had full access to INVAP, CONAE’s prime contractor, due to contractual agreements between INVAP and CONAE. Additionally, CONAE was responsible for flying the instrument to Vandenberg Air Force Base for launch but could not find a viable commercial aircraft. Project officials said that they are working with the U.S. Air Force to secure a no-cost flight for the integrated satellite, but may have to pay for the flight at a cost of approximately $1 million. Funding Issues: Since no funds are being exchanged between the U.S. and Argentina for this project, NASA bears the costs it incurs associated with any schedule delays. Project officials told us that all of the project’s contingency reserves have been eroded due to past schedule delays with the spacecraft bus as well as current schedule delays associated with the SAC-D instruments being provided by CONAE. These schedule slips increased NASA’s costs by an estimated $35.5 million in the past. Project officials stated that the primary cost driver associated with the launch delay is staffing costs, estimated to be approximately $4.9 million. Further, the project received $8.6 million under the American Recovery and Reinvestment Act of 2009 that was used to maintain the current Aquarius workforce through launch. Other Issues to be Monitored: During thermal vacuum testing on the spacecraft bus, INVAP discovered a problem with the spacecraft’s propulsion system thrusters that has contributed to delaying the launch until June 2011. After an analysis of the Dual Thruster Module, the Aquarius/SAC-D team determined that the problem was likely due to one or more procedural issues in the test process at the manufacturer or its vendor. Refurbishment of all of the Dual Thruster Module flight units is complete, and the flight units were re-integrated with the observatory. INVAP planned to complete integration and testing by November 2010. Project Office Comments: The Aquarius project provided technical comments to a draft of this assessment, which were incorporated as appropriate. The project officials also commented that NASA and CONAE will continue to work together to meet the earliest possible launch date. [End of Aquarius data] Ares I Crew Launch Vehicle: Common Name: Ares I: NASA’s Ares I Crew Launch Vehicle was designed to carry the Orion Crew Exploration Vehicle into low Earth orbit for missions to the International Space Station and the Moon as part of the Constellation Program.
The mission of the Ares I project was to deliver a safe, reliable, and affordable launch system with a 24.5-metric ton lift capability. [Refer to PDF for image: illustration] Source: Ares Projects Office. Formulation: Formulation start: 9/05; Preliminary design review: 9/08; GAO review: 12/10. Project Confirmation: Implementation: Critical design review: 9/11; Launch readiness date: 3/15. Project Essentials: NASA Center Lead: Marshall Space Flight Center; Partners: None; Major Contractors: Alliant Techsystems, Pratt and Whitney Rocketdyne, Boeing; Projected Launch Date: March 2015; Launch Location: Kennedy Space Center, Florida; Launch Vehicle: Ares I; Mission Duration: N/A. Table: Project Performance (then year dollars in millions): Latest (Feb. 2011): Preliminary Estimate of Project Life Cycle Cost[A]: $17,000 to $20,000. [A] This estimate is preliminary, as the project is in formulation and there is still uncertainty in the value as design options are explored. NASA uses these estimates for planning purposes. This estimate is for the Ares I vehicle only. Launch Schedule: 3/2015. [End of table] Recent/Continuing Project Challenges: * Funding Issues; * Technology Issues. Project Summary: The President’s fiscal year 2011 budget proposed cancellation of the Ares I project, leading to uncertainty, both financial and programmatic, within the project. Given constrained resources, the project prioritized work and did not accomplish some of the work originally planned for 2010; however, it successfully tested Development Motor 2 to gain data on project elements. In early fall 2010, Congress passed the NASA Authorization Act of 2010, directing NASA to develop a space launch system and crew vehicle and to utilize existing Ares I contracts and capabilities to the extent practicable. Project Update: The President proposed cancellation of the Constellation Program, including the Ares I project, in the fiscal year 2011 budget request. This proposal led to much debate within Congress and uncertainty, both financial and programmatic, within the project. As a result, the project prioritized work for the year and did not complete some of the work originally planned for 2010. In early fall 2010, Congress passed the NASA Authorization Act of 2010, which directed NASA to develop a space launch system and crew vehicle for missions to near-Earth orbit and regions of space beyond low-Earth orbit no later than December 2016. In developing this vehicle, Congress directed the agency to extend or modify existing vehicle development and associated contracts to the extent practicable. Funding Issues: The Ares I project received over $102 million under the American Recovery and Reinvestment Act of 2009 (ARRA) that was used to manufacture and assemble engine components for development testing, completion of a test stand, and preparation for test operations. However, project officials explained that due to a series of budgetary constraints for the first 4 months of fiscal year 2010 that roughly offset the amount gained from the ARRA funding, the project could not perform all of its originally planned work. While initially parts of the project were able to maintain momentum, termination liability issues identified in June 2010 caused the three project prime contractors to stop certain portions of the work on their respective contracts. At this time, the project redirected its funding to activities that would potentially benefit NASA’s goals and objectives beyond the current fiscal year.
For example, in August 2010, the project successfully tested Development Motor 2 (DM-2). The DM-2 test was conducted to gain data on project elements, including the redesigned rocket nozzle, new insulation, and the motor casing’s liner. According to project officials, the project was flexible in its planning while it maintained the program of record during fiscal year 2010. Technology Issues: The Ares I project has been working to mitigate several challenges related to the development of heritage technology. However, given the funding uncertainty that has surrounded Ares I, the project has been unable to implement the mitigation strategies. For example, last year, NASA identified thrust oscillation as a technical issue. Thrust oscillation, which causes shaking during launch and ascent, occurs in some form in every solid rocket engine. Computer modeling indicated that there was a possibility that the magnitude and frequency of thrust oscillation within the first stage would be outside the limits of the Ares I design and could cause excessive vibration in the Orion capsule and threaten crew safety. According to project officials, the project plans to mitigate the risk by adding damper and isolation techniques at the interface between the launch vehicle and the Service Module. However, this risk cannot be closed until funding is obtained to implement the mitigation strategy. Furthermore, vibroacoustics--the pressure of the acoustic waves produced by the firing of the Ares I first stage and the rocket’s acceleration through the atmosphere--continues to be a concern to the project. Vibroacoustics may cause unacceptable structural vibrations throughout Ares I and Orion and force NASA to qualify components to higher vibration tolerance thresholds than originally expected. According to the project, the global mitigation strategy for the excessive vibration has been on hold due to budget constraints. The project is unable to finalize the design without knowing the final configuration of the crew exploration vehicle. Finally, last year we reported that analysis of the Ares I flight path also indicated that, under some conditions, the Ares I vehicle could hit the launch tower during liftoff and the vehicle would need to be steered away from the launch tower or not launched during high winds. NASA officials told us they have developed a plan to mitigate this risk. Project Office Comments: The Ares I project office provided technical comments on a draft of this assessment, which were incorporated as appropriate. The project office also commented that it has utilized resources to make progress on the Constellation Program while focusing on goals that yield benefits to future human spaceflight endeavors. [End of Ares I data] Global Precipitation Measurement (GPM) Mission: Common Name: GPM: [Refer to PDF for image: artist depiction] Source: GPM Project Office. The Global Precipitation Measurement (GPM) mission, a joint NASA and Japan Aerospace Exploration Agency (JAXA) project, seeks to improve the scientific understanding of the global water cycle and the accuracy of precipitation forecasts. The GPM mission is composed of a core spacecraft carrying two main instruments: a Dual-frequency Precipitation Radar (DPR) and a GPM Microwave Imager (GMI). GPM builds on the work of the Tropical Rainfall Measuring Mission and will provide an opportunity to calibrate measurements of global precipitation. Formulation: Formulation start: 7/02; Preliminary design review: 11/08.
Implementation: Project Confirmation: 12/09; Critical design review: 12/09; GAO review: 12/10; Launch core spacecraft: 7/13. Project Essentials: NASA Center: Goddard Space Flight Center; International Partner: Japan Aerospace Exploration Agency (JAXA); Major Contractors: Ball Aerospace; Projected Launch Date: July 21, 2013; Launch Location: Tanegashima Island, Japan; Launch Vehicle: JAXA supplied; Mission Duration: 3 years. Table: Project Performance (then year dollars in millions): Total Project Cost: Baseline Est. (FY 2009): $975.9; Latest (Feb. 2011): $928.9; Change: -4.8%. Formulation Cost: Baseline Est. (FY 2009): $349.2; Latest (Feb. 2011): $349.2; Change: 0.0%. Development Cost: Baseline Est. (FY 2009): $555.2; Latest (Feb. 2011): $514.8; Change: -7.3%. Operations Cost: Baseline Est. (FY 2009): $71.6; Latest (Feb. 2011): $64.9; Change: -9.4%. Launch Schedule: Baseline Est. (FY 2009): 7/2013; Latest (Feb. 2011): 7/2013; Change: 0 months. [End of table] Project Summary: Prior to establishing the project’s baseline cost and schedule estimate, NASA descoped the planned second spacecraft of the GPM mission. The project’s international partner, JAXA, is providing the launch vehicle for the core spacecraft. However, GPM project officials were tracking potential funding issues with JAXA. GPM received $32 million under the American Recovery and Reinvestment Act of 2009, which was used to maintain the current schedule, expedite some work on the GMI-1, and begin work on a second GMI. Project Update: Funding Issues: Prior to establishing the project’s baseline cost estimate, NASA removed the second spacecraft of the GPM mission, the Low Inclination Observatory (LIO), due to lack of funding. The LIO was primarily intended to fly a second GPM Microwave Imager (GMI-2), which would gather additional science data to further support the GPM mission. Project officials reported that NASA is currently pursuing an international development partner willing to fund the launch vehicle and spacecraft needed for the second GMI instrument. However, despite de-scoping the LIO launch vehicle and spacecraft, the project continues to invest in the development of the GMI-2 instrument. A GPM project official reported that GMI-2 will be put into storage in 2013 if the LIO mission is not going to launch soon after that. Although the science requirements for GPM could still be met without flying the GMI-2 instrument, project officials reported that without the instrument the available science data from the mission would not be as robust. GPM received $32 million under the American Recovery and Reinvestment Act of 2009. According to project officials, this enabled GPM to maintain schedule in fiscal year 2009, move some of the GMI work planned for fiscal year 2011 into fiscal year 2010, and start the GMI-2 development on schedule in October 2009. Development Partner Issues: GPM project officials were tracking potential funding issues with JAXA, which is providing the launch vehicle for the first GPM spacecraft, as a risk to the cost and schedule of the project. In addition, the GPM project is tracking the availability of the JAXA-supplied Dual-frequency Precipitation Radar (DPR) instrument. The project reports that delays in the DPR instrument's development have compressed the schedule available for integration and testing.
Design Issues: The project has currently released 96 percent of its engineering drawings, but only 53 percent were released by the mission critical design review (CDR) held in December 2009. A project official said that the lack of released drawings at the critical design review did not have a serious impact in terms of design stability because testing was almost complete on engineering test units and flight units were already designed and ready to begin manufacturing. Project officials delayed the CDR of the fully demiseable aluminum propulsion tank from August 2010 to October 2010 due to difficulties with parts assembly. The GPM spacecraft was designed to be demiseable--that is, it will burn up during re-entry into the Earth’s atmosphere to limit orbital debris. However, in December 2008, an updated re-entry structural analysis of GPM at Johnson Space Center indicated that the spacecraft would not be demiseable as originally predicted by the GPM project office and Johnson Space Center. The project had initially delayed the start of the implementation phase and establishment of GPM cost and schedule baselines by 8 months in order to reconcile the project budget with available funding and to resolve the demisability issue. Project Office Comments: The GPM project office provided technical comments to a draft of this assessment, which were incorporated as appropriate. Project officials also commented that overall the GPM project is making progress. [End of GPM data] Glory: Common Name: Glory: [Refer to PDF for image: artist depiction] Source: Glory Project Office. The Glory project is a low-Earth orbit satellite that will contribute to the U.S. Climate Change Science Program. The satellite has two principal science objectives: (1) collect data on the properties of aerosols and black carbon in the Earth’s atmosphere and climate systems and (2) collect data on solar irradiance. The satellite has two main instruments--the Aerosol Polarimetry Sensor (APS) and the Total Irradiance Monitor (TIM)--as well as two cloud cameras. The TIM will allow NASA to have uninterrupted solar irradiance data by bridging the gap between NASA’s Solar Radiation and Climate Experiment and the National Polar-orbiting Operational Environmental Satellite System (NPOESS). Formulation: Formulation start: 9/05; Preliminary design review: 9/05. Implementation: Project Confirmation: 12/05; Critical design review: 7/06; GAO review: 12/10; Launch readiness date: 2/11. Table: Project Performance (then year dollars in millions): Total Project Cost: Baseline Est. (FY 2009): $347.9; Latest (Feb. 2011): $424.1; Change: 21.9%. Formulation Cost: Baseline Est. (FY 2009): $70.5; Latest (Feb. 2011): $70.8; Change: 0.4%. Development Cost: Baseline Est. (FY 2009): $259.1; Latest (Feb. 2011): $337.6; Change: 30.3%. Operations Cost: Baseline Est. (FY 2009): $18.3; Latest (Feb. 2011): $15.8; Change: -13.7%. Launch Schedule: Baseline Est. (FY 2009): 6/2009; Latest (Feb. 2011): 3/2011; Change: 21 months. Recent/Continuing Project Challenges: * Launch Issues; * Funding Issues; * Parts Issues. Previously Reported Challenges: * Technology Maturity; * Complexity of Heritage Technology; * Design Stability; * Contractor Performance. Project Summary: Significant cost increases and schedule delays have persisted on Glory despite the project being reauthorized by Congress and re-baselined in 2009. Development costs have increased by about 30 percent since 2009.
Recent cost increases and schedule delays are residual effects of switching to an alternate single board computer provider, the late delivery of the APS instrument, and, more recently, part quality issues found in the solar array drive assembly. Glory will launch on the Taurus XL launch vehicle, which is returning to flight after the vehicle failed during a 2009 launch. Project Update: Parts Issues: The Glory project has experienced significant schedule delays due to reliability problems with key parts found during testing. For example, in June 2010, the project discovered excessive wear of and debris in the Slip Ring Assembly, a part contained in the solar array drive assembly, which rendered one of the array wings unacceptable for flight. The corrected solar array drive assembly was integrated with the spacecraft in November 2010. The other solar array drive assembly was inspected, found to have no signs of wear or debris, and sent back to the contractor for integration with the spacecraft. This issue has resulted in an additional 3-month launch delay. Prior to the solar array issue, the project switched from using a single board computer (SBC) to an alternate SBC produced by another company. According to the project manager, continued reliability issues with the initial SBC, including cracks in the printed wiring boards, required the project to seek another vendor for the SBC because the part failed during testing. While the new SBC has now been integrated with the spacecraft and is performing well, project officials estimate the total cost impact of this switch in technology to be approximately $60.9 million. Launch Issues: The Glory project has been tracking the return to flight activities of the Taurus XL launch vehicle as a risk to achieving its launch readiness date in February 2011. The vehicle failed during the launch of the Orbiting Carbon Observatory (OCO) in February 2009. The Mishap Investigation Board (MIB) that investigated the launch failure subsequently released findings and suggested corrective actions. Specifically, the MIB found that a payload fairing--a clamshell-shaped cover that encloses and protects a payload during early flight--failed to separate during ascent. NASA’s Launch Services Program has developed a corrective action plan and, according to a Launch Services Program official, the Taurus XL corrective actions were on track to meet the launch vehicle readiness review for Glory in September 2010. The return to flight activities for the Taurus XL are ongoing while the project performs test and integration of instruments after the more than 1-year-late delivery of the APS and a parts failure in the single board computer. A malfunction in the ground support equipment associated with the Taurus XL launch vehicle has subsequently delayed the launch of the Glory project until March 2011. Funding Issues: The Glory project’s development costs have increased by almost 31 percent and its launch has been delayed by 21 months since the project was reauthorized by Congress and re-baselined in 2009 after a 53 percent development cost increase. Cost increases and schedule delays are a residual result of switching to an alternate single board computer provider due to reliability issues, the late delivery of the APS instrument, and, more recently, parts failure in the solar array drive assembly. Since Glory’s original fiscal year 2008 baseline, the project’s development costs have grown by 113 percent and its launch has been delayed over 2 years.
The Glory project also received $16 million under the American Recovery and Reinvestment Act of 2009 (ARRA), which was used to maintain the current workforce through the planned launch. Project Office Comments: The Glory project office provided technical comments to a draft of this assessment, which were incorporated as appropriate. Project officials also commented that the project continues to monitor the Taurus XL return to flight activities. [End of Glory data] Gravity Recovery and Interior Laboratory (GRAIL): Common Name: GRAIL: [Refer to PDF for image: artist depiction] Source: Courtesy of NASA/JPL-Caltech. The GRAIL mission will seek to determine the structure of the lunar interior from crust to core, advance our understanding of the thermal evolution of the Moon, and extend our knowledge gained from the Moon to other terrestrial-type planets. GRAIL will achieve its science objectives by placing twin spacecraft in a low altitude and nearly circular polar orbit. The two spacecraft will perform high-precision measurements between them. Analysis of changes in the spacecraft-to-spacecraft data caused by gravitational differences will provide direct and precise measurements of lunar gravity. GRAIL will ultimately provide a global, high-accuracy, high-resolution gravity map of the Moon. Formulation: Formulation start: 12/07; Preliminary design review: 11/08. Implementation: Project Confirmation: 1/09; Critical design review: 11/09; GAO review: 12/10; Launch readiness date: 9/11. Table: Project Performance (then year dollars in millions): Total Project Cost: Baseline Est. (FY 2009): $496.2; Latest (Feb. 2011): $496.2; Change: 0.0%. Formulation Cost: Baseline Est. (FY 2009): $50.5; Latest (Feb. 2011): $50.5; Change: 0.0%. Development Cost: Baseline Est. (FY 2009): $427.0; Latest (Feb. 2011): $427.0; Change: 0.0%. Operations Cost: Baseline Est. (FY 2009): $18.7; Latest (Feb. 2011): $18.7; Change: 0.0%. Launch Schedule: Baseline Est. (FY 2009): 9/2011; Latest (Feb. 2011): 9/2011; Change: 0 months. [End of table] Recent/Continuing Project Challenges: * Technology Issues; * Launch Issues. Project Summary: During formulation, it was determined that the reaction wheel assembly did not meet mission requirements. The project office undertook a new development effort for the reaction wheel, but because of a mechanical design flaw found in testing, it will not be delivered on schedule. In addition, the schedule for testing and integration for avionics has been impacted by late delivery of parts and hardware problems. Project officials continue to be concerned about the availability of Delta II Heavy launch personnel and resources for the mission. Project Update: Technology Issues: GRAIL project officials said they included no new technology in designing the GRAIL orbiters to keep the mission simple, cost effective, and as close to the Gravity Recovery and Climate Experiment (GRACE) mission as possible. Therefore, the GRAIL project instruments are similar to those used in the GRACE mission. All heritage technologies for the project, except for the reaction wheel assembly, were deemed mature at the preliminary design review. Project officials told us that during formulation they reviewed the reaction wheel assembly and determined that it did not meet the standards for this mission, which caused the project to undertake a new development effort. The electronics of the newly developed reaction wheel are combined into the mechanical assembly, and the project decreased the diameter of the mechanical assembly.
However, the reaction wheel assembly flight units are not on track for on-time delivery because of a mechanical design flaw found in testing. The project determined that there was a problem with the bearing material, and modifications had to be made to allow for proper load bearing and stability. The project has determined the root cause of the problem and developed a design update to correct the problem. Project officials said that the schedule contains enough margin to accommodate the late delivery of the reaction wheel assembly without affecting the launch schedule. Launch Issues: Last year, we reported that GRAIL project officials were concerned about the availability of trained personnel to process the launch since GRAIL would have been the last NASA project to launch on the Delta II launch vehicle. Since that time, the NPOESS Preparatory Project (NPP) has delayed its launch date and, therefore, GRAIL is no longer the last NASA project scheduled to launch on the Delta II launch vehicle. Project officials told us they continue to be concerned about the availability of Delta II launch personnel and continue to monitor that availability as a risk to the project. NASA's Launch Services Program is monitoring changes in Delta II launch services personnel and processes and the post-production support proposals for all major subcontractors. Other Issues to be Monitored: Project officials told us the delivery of the avionics flight boxes has been delayed due to late delivery of parts, which will impact the system-level environmental tests for these units, which are on the critical path. However, the project mitigated this risk by using engineering test units of the avionics boxes since the flight unit deliveries were delayed past the beginning of test and integration in July 2010. Project officials told us that the project can conduct system-level testing using engineering test units if the avionics boxes are further delayed since the electronics boards are the same in both units and can be swapped out prior to the system-level environmental testing. The project expects that the two flight units will be delivered by early 2011. The project has modified its schedule to accommodate the delay in the delivery of the flight avionics and reported it has sufficient schedule margin to meet the launch date. Project Office Comments: The GRAIL project office commented that the project has completed all the major milestones on schedule and is currently on track to meet its launch readiness date. [End of GRAIL data] Ice, Cloud, and Land Elevation Satellite-2 (ICESat-2): Common Name: ICESat-2: [Refer to PDF for image: artist depiction] Source: ICESat-2 Project Office. Formulation: Formulation start: 12/09; GAO review: 12/10; Preliminary design review: 11/11. Implementation: Critical design review: 11/12; Launch readiness date: 10/15. Table: Project Performance (then year dollars in millions): Latest (Feb. 2011): Preliminary Estimate of Project Life Cycle Cost[A]: not available. Launch Schedule: 10/2015. [A] The project has not yet reached the point in the acquisition life cycle where a preliminary life cycle cost estimate would normally be developed. [End of table] Recent Project Challenges: * Launch Issues; * Funding Issues. Project Summary: ICESat-2 was approved to begin formulation in December 2009. The project's internal cost estimates exceeded the cost cap, which led the project to evaluate potential cost reduction activities and re-scoping options.
These activities delayed the Mission Definition Review originally planned for August 2010 until January 2011. The project used $20.4 million in American Recovery and Reinvestment Act of 2009 funds to work with four major laser vendors to mature the micro-pulse laser designs. However, the acquisition and testing for the laser subsystem is behind schedule. Project Update: Launch Issues: ICESat-2 is tracking a risk due to the lack of medium class launch vehicle availability. The project is concerned that a delay in identifying a launch vehicle for the mission will lead to cost and schedule impact. The only certified vehicle currently available to NASA missions in the ICESat-2 launch time frame is the Atlas V, an intermediate launch vehicle. The only medium class launch vehicle currently available under NASA’s contract for launch services is the Falcon 9; however, it has not yet been certified. If ICESat-2 selects the Falcon 9, the mission launch date would be tied to a successful certification of the launch vehicle. The Atlas V comes at a higher cost than what NASA has traditionally paid for a medium capability launch vehicle. Officials told us that the project is currently allocating $100 million for the launch vehicle. The project planned to develop a procurement package to initiate procurement of a launch vehicle in early fiscal year 2011. Funding Issues: NASA provided cost parameters for the ICESat-2 mission; however, the project’s internal life cycle cost estimates exceeded the cost cap by $100 million. Project officials are currently evaluating how they can reduce the project’s life-cycle cost estimates through various re-scoping options, such as partnering with another ongoing mission or reducing the mission life. Due to these activities, the project’s Mission Definition Review, originally scheduled for August 2010, was not scheduled to occur until January 2011 at the earliest. In addition, the project used $20.4 million in American Recovery and Reinvestment Act of 2009 (ARRA) funding for the micro- pulse laser development contracts to retire project risk earlier. However, the acquisition and testing of these laser subsystems is behind schedule due to delays associated with the ARRA reporting by the agency. Also, according to project officials, the project received $28 million in fiscal year 2010 funding from the President’s global climate initiative, but it was unable to use all of the additional funds within the fiscal year and is unsure whether it will receive funding from this initiative in fiscal year 2011. Other Issues to be Monitored: The project entered the formulation phase in December 2009. During the mission concept review process, the project responded to changing science requirements, particularly the need to accurately measure slope through micro-pulse laser technology. The Advanced Topographic Laser Altimeter System is the single instrument on the ICESat-2 mission. The project identified two critical technologies, the micro-pulse lasers and the Laser Reference System (LRS). The project expects that both technologies will be mature at the preliminary design review scheduled for November 2011. The micropulse lasers being developed for ICESat-2 use a low energy pulse at a high frequency, a change from the high power lasers used on the original ICESat mission. The project is working with four major laser vendors to mature the micro-pulse laser technology and designs. 
Despite delays in awarding the contracts, the vendors are working toward the original milestone delivery dates to reduce schedule risk. The LRS is designed to provide absolute laser pointing knowledge in order to pinpoint the ice footprint location 6 meters on the ground. Project Office Comments: The ICESat-2 project provided technical comments to a draft of this assessment, which were incorporated as appropriate. Project officials also commented that ICESat-2 is currently in formulation and activities are on-going to confirm a mission that fits within the cost cap. NASA does not formally commit to a project’s schedule and cost until Key Decision Point (KDP)-C, which ICESat-2 has not yet reached. [End of ICESat-2 data] James Webb Space Telescope (JWST): Common Name: JWST: [Refer to PDF for image: artist depiction] Source: Northrop Grumman Aerospace Systems. Formulation: Formulation start: 3/99; Preliminary design review: 3/08. Implementation: Project Confirmation: 7/08; Critical design review: 3/10; GAO review: 12/10; Launch readiness date: 6/14. Table: Project Performance (then year dollars in millions): Total Project Cost: Baseline Est. (FY 2009): $4963.6; Latest (Feb. 2011): $5095.4; Change: 2.7%. Formulation Cost: Baseline Est. (FY 2009): $1800.1; Latest (Feb. 2011): $1800.2; Change: 0.0%. Development Cost: Baseline Est. (FY 2009): $2581.1; Latest (Feb. 2011): $2710.9; Change: 5.0%. Operations Cost: Baseline Est. (FY 2009): $582.4; Latest (Feb. 2011): $584.5; Change: 0.4%. Launch Schedule: Baseline Est. (FY 2009): 6/2014; Latest (Feb. 2011): 6/2014; Change: 0 months. [End of table] Recent/Continuing Project Challenges: * Funding Issues; * Contractor Issues; * Design Issues. Previously Reported Challenges: * Complexity of Heritage Technology. Project Summary: NASA is taking steps to address deficiencies identified by two independent reviews this year. One independent review panel found that the earliest possible launch date for JWST is September 2015, a 15- month delay from the baseline estimate. To meet this date, the panel estimated the project would need an additional $500 million over the next 2 fiscal years and a total life-cycle cost of approximately $6.5 billion. A separate review team reported that JWST’s test plans exceeded the money and time available. As a result of these reviews, the program office at NASA headquarters will now report directly to the NASA Associate Administrator. Project Update: Funding Issues: According to an October 2010 Independent Comprehensive Review Panel (ICRP) report, JWST’s baseline did not reflect the most probable cost and resulted in a project that was not executable with the given budget. The ICRP found that the budget was understated because it did not include known threats and provided insufficient reserves, particularly in the year of confirmation and the year following. The panel also reported problems with overall project management and a lack of effective oversight by Division managers who concurred with the project’s practice of deferring work to later years without assessing the future impact. To address existing funding concerns, JWST received $75 million under the American Recovery and Reinvestment Act of 2009. Despite these additional funds, the ICRP found that the earliest launch date possible is September 2015—-15 months after the baseline schedule. Further, the ICRP reported that JWST’s life-cycle cost would likely increase by $1.4 billion or more, $500 million of which would be required in the next 2 fiscal years. 
In response to the panel’s recommendations, NASA made several organizational changes, including establishing a new program office at headquarters that reports directly to the NASA Associate Administrator and managing the project’s budget separately from Astrophysics. Contractor Issues: At confirmation, the project believed it had sufficient insight into contractor performance to predict future trends and used Earned Value Management data to predict cost overruns at the contractor. Project officials told us that shortly after confirmation the prime contractor and a subcontractor came forward with previously unidentified risks to project cost, leaving the project with insufficient reserves. The ICRP found that the project had identified these cost risks, but failed to account for them in project reserves because they had not yet been formally documented by the contractor. The project intends to take over testing and integration responsibilities for the OTE/ISIM instruments from the contractor. Despite these challenges, the project is approaching the end of the 5-year polishing phase for the OTE primary mirror segments and started the fourth round of cryo testing on the primary mirrors in May 2010. Design Issues: The project has identified challenges in analytically demonstrating that the design of the ISIM composite structure had the necessary strength and performance capability. The ISIM structure and the bonds used to attach instruments must be designed to withstand very low temperatures for an indefinite period. The project needed to develop and verify new analytical techniques for testing which required additional time and money. At mission critical design review, the project planned for two thermal and optical performance tests of the ISIM. The project continues to track ISIM’s thermal testing as a major risk. Other Issues to be Monitored: The scale, complexity, and cryogenic nature of JWST prohibit a traditional “Test as you Fly” end- to-end testing program; therefore, the project is more dependent on analysis and subcomponent testing. After the mission critical design review, NASA chartered a Test Assessment Team (TAT) to evaluate the project’s test plans. The TAT report found that some of the test plans exceeded the money and time available and made recommendations to prioritize verification tasks, help the project gain efficiencies, particularly in the thermal testing, and reduce costs and shorten the schedule. The project has formally concurred with most of the TAT recommendations. The project also addressed residual concerns from the mission preliminary design review over the sunshield testing at the instrument CDR in January 2010 and is pending closure as the project works on details of the test plan. Project Office Comments: The JWST project office provided technical comments to a draft of this assessment, which were incorporated as appropriate. The project officials also commented that the project and its international partners have made good technical progress and retired some of the highest technical risks. In addition, NASA is executing a reorganization of the project and developing a new independent cost estimate to address management and budget challenges highlighted in the recent ICRP report. [End of JWST data] Juno: Common Name: Juno: [Refer to PDF for image: artist depiction] Source: NASA/JPL. Formulation: Formulation start: 7/05; Preliminary design review: 5/08. 
Implementation: Project Confirmation: 8/08; Critical design review: 4/09; GAO review: 12/10; Launch readiness date: 8/11. Table: Project Performance (then year dollars in millions): Total Project Cost: Baseline Est. (FY 2009): $1107.0; Latest (Feb. 2011): $1107.0; Change: 0.0%. Formulation Cost: Baseline Est. (FY 2009): $186.3; Latest (Feb. 2011): $186.3; Change: 0.0%. Development Cost: Baseline Est. (FY 2009): $742.3; Latest (Feb. 2011): $742.3; Change: 0.0%. Operations Cost: Baseline Est. (FY 2009): $178.4; Latest (Feb. 2011): $178.4; Change: 0.0%. Launch Schedule: Baseline Est. (FY 2009): 8/2011; Latest (Feb. 2011): 8/2011; Change: 0 months. [End of table] Recent/Continuing Project Challenges: * Technology Issues; * Design Issues; * Parts Issues; * Contractor Issues. Previously Reported Challenges: * Development Partner Issues. Project Summary: Juno continues to address issues with heritage technology. The Command and Data Handling Unit, a required component of the spacecraft, remains on the critical path due to late workforce ramp-up by the contractor and start of the flight design effort and could cause a delay in the scheduled launch. Furthermore, modifications have been made to the Command and Data Handling Unit’s Module Interface Card board to address Mars Reconnaissance Orbiter in flight issues. Finally, poor materials quality caused the failure of certain components of the spacecraft’s solar arrays during testing and led to a change in supplier. Project Update: Technology Issues: After the preliminary design review, the project reassessed the Toroidal Low Gain Antenna (TLGA) as being immature when it was determined that the materials being used in the highly charged particle environment could store an electrical charge, which would in turn interfere with some lower-level science requirements from two of the instruments on the spacecraft. The project has since coated the surface of the TLGA with germanium to provide a discharge path to the grounded metal structure that resolved the interference issue. Design Issues: The Juno project had released only 39 percent of the engineering drawings at the critical design review (CDR). Project officials, however, said they used engineering models for all instruments to demonstrate design maturity at CDR. For some spacecraft components, the Juno project did not build or test engineering models because they were of heritage designs. For example, some spacecraft components being utilized are very similar to the ones used on the Mars Reconnaissance Orbiter (MRO); therefore, the project accepted some of the spacecraft card designs based on qualification testing. In addition, subsystem and component-level reviews were held prior to the mission CDR, and project officials told us the results of these lower- level reviews provided evidence that the design was stable. However, modifications have been made to the Command and Data Handling Unit’s Module Interface Card (CMIC) board to respond to two series of reset/sideswap events found during the MRO design review as well as MRO in-flight software issues. The root cause of the problems in the MRO CMIC board has not been determined, but Juno has made a total of 12 design changes to mitigate the problems in Juno’s CMIC design. Parts Issues/Contractor Issues: The molybdenum tabs, parts attached to the solar cells used to conduct power from the cells to the solar array power harness, failed during testing. 
The project established a failure review board that found the failures were caused by poor materials quality. The project subsequently switched the material supplier for this part. The failure review board also investigated solar array disbonding issues and found that they were caused by contractor workmanship errors in the surface preparation of the solar array panels. The contractor adjusted its procedures and re-fabricated the panels. Other Issues to be Monitored: Juno project officials said that they began integration and testing in April 2010. The project is experiencing delays in the delivery of the Command and Data Handling (C&DH) module as a result of late workforce ramp-up and a late start of the flight design effort. The C&DH module remains on the critical path and could cause a delay to Juno’s launch. Assembly and testing has begun with a test unit version of the C&DH module while design issues are addressed on the flight unit. Furthermore, to address schedule concerns on the Italian Space Agency’s (ASI) development of the Ka-band translator after the 2009 earthquake in Italy, the project requested and ASI agreed to upgrade the engineering model to be a flyable engineering model. This flyable engineering model has already been fully tested, delivered to the Juno project, and installed on the flight system. Although the project expected to fly the engineering model, work continued on the original flight model. The original flight model was delivered and integrated on the spacecraft in September 2010. Project Office Comments: The Juno project office provided technical comments to a draft of this assessment, which were incorporated as appropriate. Project officials also commented that the project has successfully resolved several technical issues and has accommodated any delays via technical and schedule resiliency and that the project team continues to make good progress toward its projected launch date of August 5, 2011. [End of Juno data] Landsat Data Continuity Mission (LDCM): Common Name: LDCM: [Refer to PDF for image: artist depiction] Source: Orbital. The Landsat Data Continuity Mission (LDCM), a partnership between NASA and the U.S. Geological Survey, seeks to extend the ability to detect and quantitatively characterize changes on the global land surface at a scale where natural and man-made causes of change can be detected and differentiated. It is the successor mission to Landsat 7. The Landsat data series, begun in 1972, is the longest continuous record of changes in the Earth’s surface as seen from space. Landsat data is a resource for people who work in agriculture, geology, forestry, regional planning, education, mapping, and global change research. Formulation: Formulation start: 10/03; Preliminary design review: 7/09. Implementation: Project Confirmation: 12/09; Critical design review: 5/10; GAO review: 12/10; Launch readiness date: 6/13. Table: Project Performance (then year dollars in millions): Total Project Cost: Baseline Est. (FY 2010): $941.7; Latest (Feb. 2011): $941.6; Change: 0.0%. Formulation Cost: Baseline Est. (FY 2010): $341.5; Latest (Feb. 2011): $341.4; Change: 0.0%. Development Cost: Baseline Est. (FY 2010): $583.4; Latest (Feb. 2011): $587.6; Change: 0.7%. Operations Cost: Baseline Est. (FY 2010): $16.8; Latest (Feb. 2011): $12.5; Change: -25.6%. Launch Schedule: Baseline Est. (FY 2010): 6/2013; Latest (Feb. 2011): 6/2013; Change: 0 months. [End of table] Recent Project Challenges: * Funding Issues; * Parts Issues. 
Previously Reported Challenges: * Technology Maturity; * Development Partner Performance. Project Summary: In December 2009, NASA established a baseline launch readiness date for the LDCM project of June 2013. However, internally the project continues to plan for a December 2012 launch in order to avoid or minimize a gap in Landsat data. When the project established the baseline, the Thermal Infrared Sensor (TIRS) instrument was officially added to the scope of the mission, increasing the mission cost by approximately $160 million. The project is tracking parts issues for all of its major components: the TIRS and Operational Land Imager instruments and the spacecraft. The cost and schedule impacts of some of these issues are uncertain. Project Update: Funding Issues: Last year, the project reported an estimated life-cycle cost range of $730-800 million but established a baseline life-cycle cost estimate of $941.7 million due to the addition of the Thermal Infrared Sensor (TIRS) instrument in December 2009, at an estimated additional cost of $160 million. The TIRS instrument was officially added to the scope of LDCM due to demand from the science community. With that addition, LDCM’s instrument payload consists of two instruments: the Operational Land Imager (OLI), a multi-spectral imaging sensor to detect and characterize land changes, and the TIRS, a sensor that has a wide range of uses, including water resource management and wildfire risk assessment. LDCM received $63.4 million in American Recovery and Reinvestment Act (ARRA) funding and used the money to procure items for the components of the TIRS instrument, the spacecraft, and the OLI instrument. At confirmation in December 2009, the project and the Standing Review Board presented Joint Cost and Schedule Confidence Level (JCL) results based on mutually agreeable risks and uncertainty factors. The JCL estimates developed for the project resulted in a 50-percent confidence level launch date of December 2012 and a 70-percent confidence date of June 2013. The project continues to plan internally for a December 2012 launch date in order to avoid a potential data gap and has $91 million budgeted for risk mitigation in order to meet the earlier date. LDCM is working with its ground system partner, the United States Geological Survey (USGS), to determine the likelihood of a data availability gap and steps to mitigate the risk of a gap. Additionally, to address funding shortfalls at USGS and reduce the risk to mission success, NASA and USGS amended the final implementation agreement for LDCM to increase NASA’s role in the ground system development and shift some of the funding responsibilities to USGS in later years, which decreased the LDCM estimate for operations by approximately 25 percent. Parts Issues: The project is tracking risks associated with the TIRS and OLI instruments and the spacecraft. The project discovered that the main electronics boards on the main electronics box of the TIRS instrument were not meeting thermal stability requirements. While TIRS is a new, in-house development effort and is on the project’s critical path, many of the subsystems and components were used in earlier flight projects. The issues with the main electronics box cost $3.8 million, but the problem had no net impact on the project’s schedule. The OLI instrument experienced problems with the black chrome plating and dark mirror coating.
According to project officials, the black chrome plating did not withstand testing and lost adhesion, due to poor plating processes at the vendor. As a result, the vendor rebuilt the Solar Calibration Assembly. These issues currently have no overall impact on the project’s schedule, and the cost impacts have been negotiated. On the spacecraft, the project identified contamination of the Reaction Wheel Assembly (RWA) lubricant and scheduled to have new bearings installed by the vendor. Project officials said that they have identified windows during integration and test where a new unit can be inserted. Although the problem caused a six month schedule slip for the RWA, the project expects no impact on the overall schedule because the delay was largely absorbed by the integration and testing workarounds and subsystem schedule slack. Last year, we reported that the project had released 83 percent of its design drawings as of September 2009. In April 2010, the project had released 93 percent of its drawings and held a successful mission critical design review (CDR) in May 2010, but the project is tracking risks on each of the major components. Currently, the project reports that 97 percent of the total design drawings have been released. Project Office Comments: The LDCM project provided technical comments to a draft of this assessment, which were incorporated as appropriate. Project officials also commented that the mission has set a commitment for a launch readiness date of June 2013, but the project is aggressively working to launch in December 2012 in order to minimize the chance of a data gap should Landsat 5 or Landsat 7 cease operations. [End of LDCM data] Lunar Atmosphere and Dust Environment Explorer (LADEE): Common Name: LADEE: [Refer to PDF for image: artist depiction] Source: LADEE Project Office. The Lunar Atmosphere and Dust Environment Explorer (LADEE) mission objective is to determine the global density, composition, and time variability of the lunar atmosphere. LADEE’s measurements will determine the size, charge, and spatial distribution of electrostatically transported dust grains. Additionally, LADEE will carry an optical laser communications demonstrator that will test high- bandwidth communication from lunar orbit. Formulation: Formulation start: 2/09; Preliminary design review: 7/10. Implementation: Project Confirmation: 8/10; GAO review: 12/10; Critical design review: 8/11; Launch readiness date: 11/13. Table: Project Performance (then year dollars in millions): Total Project Cost[A]: Baseline Est. (FY 2010): $262.9; Latest (Feb. 2011): $262.9; Change: 0.0%. Formulation Cost: Baseline Est. (FY 2010): $79.5; Latest (Feb. 2011): $79.5; Change: 0.0%. Development Cost: Baseline Est. (FY 2010): $168.2; Latest (Feb. 2011): $168.2; Change: 0.0%. Operations Cost: Baseline Est. (FY 2010): $15.2; Latest (Feb. 2011): $15.2; Change: 0.0%. Launch Schedule: Baseline Est. (FY 2010): 11/2013; Latest (Feb. 2011): 11/2013; Change: 0 months. [A] This estimate does not include the LLCD instrument which is being funded by the Space Operations Mission Directorate at a cost of approximately $65 million. [End of table] Recent Project Challenges: * Technology Issues; * Parts Issues; * Launch Issues. Project Summary: The LADEE project was confirmed on August 23, 2010, to proceed into implementation. LADEE will be flying three heritage instruments, as well as the Lunar Laser Com Demo, which is being developed by the Space Operations Mission Directorate at a cost of approximately $65 million. 
NASA will launch the project on the Minotaur V. A bid protest delayed the issuance of a delivery order for the launch vehicle and postponed development of a Soft-Ride system that will protect instrumentation during launch. Project Update: Technology Issues: LADEE utilizes three instruments that have been designed for other missions but require modifications to their form, fit, and function. None of the three instruments were considered mature at the preliminary design review in July 2010. NASA flew the Lunar Dust Experiment (LDEX) on various configurations on the HEOS 2, Galileo, Ulysses, and Cassini projects. The Neutral Mass Spectrometer (NMS) is a subset of the Sample Analysis at Mars instrument being developed for the Mars Science Laboratory. The Ultraviolet Spectrometer (UVS) is based on the design of the UVS instrument flown on the Lunar Crater Observation and Sensing Satellite (LCROSS). The project will also fly the Lunar Laser Com Demo (LLCD) as a ride along technology demonstration on LADEE. The LLCD is being developed by the Space Operations Mission Directorate at a cost of approximately $65 million, which is not included in the LADEE cost estimates. Parts Issues: The UVS has run into problems with the source vendor and parts quality and, therefore, is not identical to the LCROSS version of the instrument. Project officials determined that the printed wiring board for the UVS was being developed in a facility with no quality systems or workmanship standards in place. The project decided to keep the printed wiring board design, but had another vendor produce the boards at a NASA-approved facility. Implementation of this change cost the project approximately $1.1 million. Launch Issues: LADEE will be launched on a Minotaur V, which was procured under the Air Force’s indefinite delivery indefinite quantity contract with a commercial launch vehicle provider. A bid protest regarding the selection of the Minotaur V, however, delayed the issuance of the delivery order for the vehicle and the project’s preliminary design review by 3 months and the critical design review by 5 months. Furthermore, the project will need to equip the launch vehicle with a Soft-Ride system in order to protect the project’s instrumentation from excessive vibration during launch. While there is no new development effort behind the Soft-Ride, the system must be tuned to the particular load environment and spacecraft design, which will be delayed until the launch vehicle delivery order is issued. Other Issues to be Monitored: The LADEE project has not reached a design review where we could assess design stability. As of September 2010, the project expected to release 58 percent of its design drawings by the preliminary design review and 83 percent by the critical design review. Because of its focus on being a low cost mission, LADEE’s only critical technology is the RF antenna on the spacecraft, which, according to the project office, is proceeding on schedule. Project Office Comments: The LADEE project office provided technical comments to a draft of this assessment, which were incorporated as appropriate. LADEE project officials also commented that the bid protest on the launch vehicle has been resolved and that the Minotaur will be procured under an Air Force contract with a commercial launch service provider. [End of LADEE data] Magnetospheric Multiscale (MMS): Common Name: MMS: [Refer to PDF for image: Computer Model] Source: MMS Project Office. 
The Magnetospheric Multiscale (MMS) mission is made up of four identically instrumented spacecraft. The mission will use the Earth's magnetosphere as a laboratory to study the microphysics of magnetic reconnection, energetic particle acceleration, and turbulence. Magnetic reconnection is the primary process by which energy is transferred from the solar wind to Earth’s magnetosphere and is the physical process determining the size of a space weather storm. The spacecraft will fly in a pyramid formation, adjustable over a range of 10 to 400 kilometers, enabling them to capture the three-dimensional structure of the reconnection sites they encounter. The data from MMS will be used as a basis for predictive models of space weather in support of exploration. Formulation: Formulation start: 5/02; Preliminary design review: 5/09. Implementation: Project Confirmation: 6/09; Critical design review: 8/10; GAO review: 12/10; Launch readiness date: 3/15. Table: Project Performance (then year dollars in millions): Total Project Cost: Baseline Est. (FY 2010): $1082.7; Latest (Feb. 2011): $1082.7; Change: 0.0%. Formulation Cost: Baseline Est. (FY 2010): $173.0; Latest (Feb. 2011): $173.0; Change: 0.0%. Development Cost: Baseline Est. (FY 2010): $857.4; Latest (Feb. 2011): $857.4; Change: 0.0%. Operations Cost: Baseline Est. (FY 2010): $52.3; Latest (Feb. 2011): $52.3; Change: 0.0%. Launch Schedule: Baseline Est. (FY 2010): 3/2015; Latest (Feb. 2011): 3/2015; Change: 0 months. [End of table] Recent Project Challenges: * Development Partner Issues; * Design Issues; * Technology Issues. Project Summary: The MMS project used $6 million in cost reserves to move development work for the Spin Plane Double Probe instrument from Sweden to the University of New Hampshire because Sweden was not providing adequate levels of funding for project development. The movement of development work has resulted in a delay of approximately 6 months in the completion of the design for the instrument. However, project officials do not believe the delay will impact the mission’s March 2015 launch readiness date. Project Update: Development Partner Issues: The MMS project used approximately $6 million in reserve funds to move work from Sweden to the University of New Hampshire because Sweden was not making satisfactory progress on the production of the Spin Plane Double Probe (SDP) instrument due to inadequate levels of funding. After considering three potential candidates, the MMS project selected the University of New Hampshire in 2010 to assume production of the SDP deployment mechanism, the most complex element of the SDP instrument. Sweden will continue to provide SDP flight hardware as well as mission science support. As a result of these changes, the completion of the design for the SDP is behind schedule by approximately 6 months, but MMS officials believe this change poses no threat to the mission’s launch readiness date in March 2015. Design Issues: In August 2010, the project completed its mission critical design review (CDR). At that time, the project had released 77 percent of its engineering design drawings. Last year, project officials told us that having 70 to 80 percent of design drawings completed by CDR is normal, but they had not established any goals for the project. MMS officials stated that the number of complete engineering test units is as important as, if not more important than, the number of design drawings. According to project officials, MMS uses high-fidelity instrument models as a risk reduction effort.
By using engineering models that are as flight- ready as possible, project officials reported that they can see where problems are and better identify risks. Additionally, they stated that proceeding with the manufacture of flight hardware without having built flightlike engineering units to test the design, will almost always lead to schedule overruns to solve design issues. Technology Issues: Following mission CDR in August 2010, the MMS project has yet to fully address the form, fit, and function of the payload separation system, a key heritage technology. All four MMS satellites will launch stacked on a single Atlas V launch vehicle. When the top spacecraft deploys, springs will push off the first satellite and trigger a command for each subsequent satellite to deploy. The technology required for the separation system is not new; however, the project is working closely with the contractor to ensure that all four satellites separate in a consistent manner which supports the need for them to fly in a pyramid formation. Other Issues to be Monitored: MMS was authorized to enter formulation, the phase that precedes implementation, in 2002 with an initial cost estimate of $369 million. The project was authorized to enter implementation in June 2009 with a baseline life-cycle cost estimate of over $1 billion. The project manager said the initial cost estimate was for a smaller instrument suite than what is currently planned for the mission and added that one cost driver for the project since the initial cost estimate was the requirement for magnetic and electrostatic cleanliness. The initial cost estimate also did not account for the higher cost of the Atlas V, which is a larger launch vehicle than the Delta II initially considered by the project. Project Office Comments: The MMS project office provided technical comments to a draft of this assessment, which were incorporated as appropriate. Project officials also commented that MMS continues to make technical progress. In 2010, the MMS project completed the detailed design of the instruments and spacecraft. [End of MMS data] Mars Atmosphere and Volatile EvolutioN (MAVEN): Common Name: MAVEN: [Refer to PDF for image: artist depiction] Source: NASA GSFC MAVEN Project Office. The Mars Atmosphere and Volatile EvolutioN (MAVEN) mission, a robotic orbiter mission, will provide a comprehensive picture of the Mars upper atmosphere, ionosphere, solar energetic drivers, and atmospheric losses. MAVEN will deliver comprehensive answers to long-standing questions regarding the loss of Mars’ atmosphere, climate history, liquid water, and habitability. MAVEN will provide the first direct measurements ever taken to address key scientific questions about Mars’ evolution. Formulation: Formulation start: 9/08; Preliminary design review: 7/10. Implementation: Project Confirmation: 10/10; GAO review: 12/10; Critical design review: 7/11; Launch readiness date: 11/13. Table: Project Performance (then year dollars in millions): Total Project Cost: Baseline Est. (FY 2011): $671.2; Latest (Feb. 2011): $671.2; Change: 0.0%. Formulation Cost: Baseline Est. (FY 2011): $63.8; Latest (Feb. 2011): $63.8; Change: 0.0%. Development Cost: Baseline Est. (FY 2011): $567.2; Latest (Feb. 2011): $567.2; Change: 0.0%. Operations Cost: Baseline Est. (FY 2011): $40.1; Latest (Feb. 2011): $40.1; Change: 0.0%. Launch Schedule: Baseline Est. (FY 2011): 11/2013; Latest (Feb. 2011): 11/2013; Change: 0 months. [End of table] Recent Project Challenges: * Design Issues; * Launch Issues. 
Project Summary: MAVEN was selected under the Mars Scout Program, a NASA initiative to send a series of small, low-cost robotic missions to Mars. The project was competitively selected from innovative proposals by the scientific community. The project is relying on heritage technologies, but project officials acknowledged that these technologies required modifications to their form, fit, and function to operate as necessary for MAVEN’s requirements. The project is being designed to the Atlas V launch vehicle, which is significantly more expensive than it was under the previous launch services contract. Project Update: Design Issues: At the preliminary design review, the project manager decided not to authorize the respin of the High Efficiency Power Supply (HEPS), MAVEN’s power supply system, because of a high probability of failure, which would violate the mission assurance requirements. The project met with the contractor to discuss HEPS design, fabrication, assembly, test history, and qualification in order to resolve this issue. The MAVEN project has not reached a design review where we could assess design stability. At the mission preliminary design review in July 2010, the project estimated that it would have 85 percent of its engineering drawings released at the critical design review. Launch Issues: According to project officials, the project was given approval to initiate selection of a launch vehicle in September 2010 after the new NASA Launch Services (NLS) contract was awarded. Project officials told us the project had been designing to two vehicles prior to the new contract being awarded. However, the only available vehicle that currently meets the needs of the MAVEN project is the intermediate-class Atlas V, which will be significantly more expensive than it was under the previous NLS contract. In October 2010, NASA announced that the Atlas V had been selected as the launch vehicle for MAVEN at a total cost of $187 million. Science Mission Directorate officials told us that they incorporated this increased cost into the project’s baseline during the confirmation review. Other Issues to be Monitored: In order to control project costs, the project plans to minimize development of new technology by designing the MAVEN spacecraft and instruments based on available heritage hardware. The MAVEN project identified seven heritage technologies, all of which are required to meet the mission’s science requirements. Prior to the preliminary design review, the project deemed all heritage technologies to be mature, but project officials acknowledged that these maturity assessments do not take into account the modifications of form, fit, and function needed to operate in the Martian environment. For example, while MAVEN’s magnetometer design is similar to those flown on prior NASA projects, a minor change to the electronics of the magnetometer is necessary to extend its dynamic range. The project is also concerned that measurements from the magnetometer may become corrupted due to the amount of electronic interference, or noise, on the spacecraft. To alleviate this concern, project officials decided to reconfigure the solar cells on the panel to minimize the magnetic field at the location of the instrument. As a result of this reconfiguration and additional analysis, project officials reported the risk has been mitigated.
Furthermore, project officials told us they are evaluating ways to ensure that the spacecraft and instruments will continue to operate and collect data during major solar flares. Project Office Comments: The MAVEN project office provided technical comments to a draft of this assessment, which were incorporated as appropriate. Project officials also commented that the project entered into implementation in October 2010 and is on track for critical design review scheduled for July 2011. [End of MAVEN data] Mars Science Laboratory (MSL): Common Name: MSL: [Refer to PDF for image: photograph] Source: NASA/JPL-Caltech. The Mars Science Laboratory (MSL) is part of the Mars Exploration Program (MEP). The MEP seeks to understand whether Mars was, is, or can be a habitable world. To answer this question, the MSL project will investigate how geologic, climatic, and other processes have worked to shape Mars and its environment over time, as well as how they interact today. The MSL will continue this systematic exploration by placing a mobile science laboratory on the Mars surface to assess a local site as a potential habitat for life, past or present. The MSL is considered one of NASA’s flagship projects and will be the most advanced rover yet sent to explore the surface of Mars. Formulation: Formulation start: 11/03; Preliminary design review: 6/06. Implementation: Project Confirmation: 8/06; Critical design review: 6/07; GAO review: 12/10; Launch readiness date: 11/11. Table: Project Performance (then year dollars in millions): Total Project Cost: Baseline Est. (FY 2010): $2394.2; Latest (Feb. 2011): $2476.3; Change: 3.4%. Formulation Cost: Baseline Est. (FY 2010): $515.5; Latest (Feb. 2011): $515.5; Change: 0.0%. Development Cost: Baseline Est. (FY 2010): $1719.9; Latest (Feb. 2011): $1802.0; Change: 4.8%. Operations Cost: Baseline Est. (FY 2010): $158.8; Latest (Feb. 2011): $158.8; Change: 0.0%. Launch Schedule: Baseline Est. (FY 2010): 11/2011; Latest (Feb. 2011): 11/2011; Change: 0 months. [End of table] Recent/Continuing Project Challenges: * Design Issues; * Parts Issues. Challenges Previously Reported: * Technology Maturity; * Complexity of Heritage Technology. Project Summary: Congress reauthorized the MSL and it was subsequently re-baselined in January 2010 because the project had exceeded its 2008 cost baseline by more than 30 percent. In 2009, MSL’s cost had grown more than $834 million and its scheduled launch had been delayed 26 months from its original 2008 baseline due to work needed to overcome technical challenges with the actuators and avionics. This increase includes more than an 86 percent increase in development costs. Project Update: Congress reauthorized the MSL in the Consolidated Appropriations Act of 2010 and NASA subsequently rebaselined the project in January 2010 after it had exceeded its 2008 development cost baseline by more than 30 percent. Since the original project baseline in 2008, the life- cycle cost for the project has increased by more than $834 million— including more than an 86 percent increase in development costs—-and the launch has been delayed until November 2011 since launch windows for Mars mission are optimally aligned every 26 months. These cost and schedule overruns were driven by problems with the actuators and avionics. Specifically, the project experienced problems with the actuators that allow the vehicle to move and execute the sample operations performed by the lab. The project has since redesigned the actuators and retired this risk. 
The project indicated that project reserves may be inadequate to meet the scheduled work for 2011. Design Issues: The MSL project design was not stable at the Critical Design Review (CDR). Several design changes were required after CDR to address various issues. For example, project officials told us the avionics hardware was a new design and had been delivered in an immature state. They had hoped to have all issues with the avionics hardware resolved by November 2009; however, project officials said the design of the hardware is still not complete and the project has delayed the software development, which includes about 12 deliverables. The avionics computer element is currently the leading risk to the MSL schedule, and its functionality is critical to the mission’s success. Furthermore, the Sample Analysis at Mars Wide Range Pump has had a series of development problems, and although the project has worked through about 10 engineering models, it continues to struggle to pass the life test. The project built and tested two different pump designs in parallel that met the science requirements and conducted an accelerated life test on them. The project plans to make a decision between the two designs at the conclusion of the life test and pump qualification testing, currently scheduled for fall 2010. The project is also monitoring performance degradation of the Multi Mission Radioisotope Thermoelectric Generator (MMRTG) because the thermocouples that convert the heat generated by the plutonium into electricity are degrading at a faster rate than predicted, or about 10 percent. According to the project manager, the MMRTG can still meet its objectives with a 10 percent decay rate, but if this rate increases, the project cannot meet its requirements and will be forced to cut the nominal number of samples collected or the distance the rover is to travel during the primary mission. Parts Issues: The project experienced a parts failure associated with the transition joints in the propulsion system, which caused the joints to fail under load. Project officials reported this issue was realized after the project finished building its propulsion system, causing the project to rebuild the system and adopt a new joint design. The transition to the new design required a rework and retest of the descent and cruise stages. According to project officials, the project also encountered parts issues on the avionics package, including a shorting out of the pins on the avionics processor and a packaging issue that caused a disconnect between the analog components and the configuration of the board. Project Office Comments: The MSL project office provided technical comments to a draft of this assessment, which were incorporated as appropriate. The project believes that the GAO assessment largely reflects the history of the project and that most of the issues identified have been resolved. [End of MSL data] NPOESS Preparatory Project (NPP): Common Name: NPP: [Refer to PDF for image: photograph] Source: Ball Aerospace. The National Polar-orbiting Operational Environmental Satellite System (NPOESS) Preparatory Project (NPP) is a joint mission with the National Oceanic and Atmospheric Administration (NOAA) and the U.S. Air Force. The satellite will measure ozone, atmospheric and sea surface temperatures, land and ocean biological productivity, Earth radiation, and cloud and aerosol properties. The NPP mission has two objectives.
First, NPP will provide a continuation of global weather observations following the Earth Observing System missions Terra and Aqua. Second, NPP will function as an operational satellite and will provide data until the first NPOESS satellite launches. Formulation: Formulation start: 11/98; Preliminary design review: 1/03; Critical design review: 8/03. Implementation: Project Confirmation: 11/03; GAO review: 12/10; Launch readiness date: 10/11. Table: Project Performance (then year dollars in millions): Total Project Cost: Baseline Est. (FY 2007): $672.8; Latest (Feb. 2011): $864.3; Change: 28.5%. Formulation Cost: Baseline Est. (FY 2007): $47.3; Latest (Feb. 2011): $47.1; Change: 0.5%. Development Cost[A]: Baseline Est. (FY 2007): $593.0; Latest (Feb. 2011): $780.1; Change: 31.6%. Operations Cost: Baseline Est. (FY 2007): $32.5; Latest (Feb. 2011): $37.1; Change: 14.0%. Launch Schedule: Baseline Est. (FY 2007): 4/2008; Latest (Feb. 2011): 10/2011; Change: 42 months. [End of table] Recent/Continuing Project Challenges: * Development Partner Issues; * Launch Issues. Previously Reported Challenges: * Technology Maturity; * Complexity of Heritage Technology; * Design Stability. Project Summary: NPP has experienced over $183 million in development cost growth and a 42-month launch delay, and officials told us that there is more work remaining than the schedule allows. The last of the partner-provided instruments was delivered for integration on the satellite in June 2010, although a number of risks remain. Project officials said that many problems were uncovered late in the development process, leading NASA to revise NPP mission success criteria. In February 2010, the White House announced a restructuring of the NPOESS program, which could affect the launch schedule. Project Update: NPP project officials have attributed cost and schedule overruns to development partner challenges and a lack of central authority between the three NPOESS agencies. Further, DOD, with agreement from its partner agencies, restructured the NPOESS program in 2006, but the program continued to experience cost and schedule growth. Since NPP was baselined in fiscal year 2007, the project’s development cost has increased by 26 percent in the fiscal year 2011 budget request, and its schedule has increased by 42 months. Development Partner Issues: Management and developmental partner challenges have continued to result in cost overruns and schedule delays in the Visible Infrared Imaging Radiometer Suite (VIIRS) and Crosstrack Infrared Sounder (CrIS) instruments. The project office attributes almost all of the cost and schedule changes to the late delivery of these partner-provided instruments. The CrIS was the last instrument to arrive for NPP and was delivered to the spacecraft contractor in June 2010. Issues with the CrIS instrument moved the launch date from January 2011 to October 2011. Furthermore, because NPOESS is now not scheduled to launch until 2014, NPP will still be a demonstration satellite as originally intended but will have to function as an operational satellite, providing interim data until NPOESS launches. In February 2010, the White House announced plans to restructure the NPOESS program, into the Joint Polar Satellite System (JPSS), to address cost overruns and schedule delays. As a result of the restructure, NOAA and DOD will undertake separate satellite system acquisitions. 
The NPOESS program continues to develop the instruments and ground systems supporting NPP, but, according to project officials, the management of the instruments’ contracts is being transferred from the NPOESS Integrated Program Office (IPO), which is a joint U.S. Air Force and NOAA program office, to DOD’s Space and Missile Systems Center. The NPP project is taking steps to facilitate cooperation and to gain more authority over the technical elements than it had at the beginning of NPP, but it believes the restructuring will cause further launch delays due to fiscal constraints stemming from a lack of funds to cover termination liability for NPOESS contracts. Although all critical technologies are mature, NPP continues to report an inability to reduce risks to an acceptable level on three instruments provided by its development partners: the VIIRS, the CrIS, and the Ozone Mapper Profiler Suite. Project officials told us they lack confidence in the processes used by the IPO and are unsure how these instruments will function on orbit. Further, they believe there is more work remaining than the schedule allows for an October 2011 launch. For example, the NPP project is currently tracking the VIIRS system’s door deployment testing as a schedule risk. Because of the uncertainty of the instruments’ functionality, NASA is updating the NPP Mission Success Criteria based on these risk assessments in order to lower expectations and define minimum mission success criteria. Launch Issues: Since this will be one of the last missions to be launched on a Delta II, NASA is tracking the availability of trained personnel to launch NPP as a risk. While NASA rates the impact of a launch slip on NPP and the other three remaining missions scheduled for the Delta II as high, the agency currently considers this a low-probability risk because there are sufficient existing processes and mitigation efforts in place. Project Office Comments: The NPP project provided technical comments to a draft of this assessment, which were incorporated as appropriate. Project officials also commented that the project is working with the newly formed JPSS Program to finalize an integrated NPP schedule to launch. They added that NPP will continue to be a demonstration satellite for NPOESS/JPSS. However, with the NPOESS/JPSS-1 satellite’s launch delay to 2014, agencies will use the NPP data operationally. [End of NPP data] Orbiting Carbon Observatory 2 (OCO-2): Common Name: OCO-2: [Refer to PDF for image: artist depiction] Source: Jet Propulsion Laboratory. NASA’s Orbiting Carbon Observatory 2 (OCO-2) is based on the original OCO mission that failed to reach orbit in 2009 and is designed to enable more reliable predictions of climate change. It will make precise, time-dependent global measurements of atmospheric carbon dioxide. These measurements will be combined with data from a ground-based network to provide scientists with the information needed to better understand the processes that regulate atmospheric carbon dioxide and its role in the carbon cycle. NASA hopes enhanced understanding of the carbon cycle will improve predictions of future atmospheric carbon dioxide increases and the potential impact on the climate. Formulation: Formulation start: 3/10; Critical design review: 8/10. Implementation: Project Confirmation: 9/10; GAO review: 12/10; Launch readiness date: 2/13. Table: Project Performance (then year dollars in millions): Total Project Cost: Baseline Est. (FY 2009): $349.9; Latest (Feb. 2011): $349.9; Change: 0.0%. Formulation Cost: Baseline Est. (FY 2009): $60.9; Latest (Feb. 2011): $60.9; Change: 0.0%. Development Cost: Baseline Est. (FY 2009): $249.0; Latest (Feb. 2011): $249.0; Change: 0.0%. Operations Cost: Baseline Est. (FY 2009): $40.0; Latest (Feb. 2011): $40.0; Change: 0.0%. Launch Schedule: Baseline Est. (FY 2009): 2/2013; Latest (Feb. 2011): 2/2013; Change: 0 months. [End of table] Recent Project Challenges: * Parts Issues; * Funding Issues. Project Summary: OCO-2 entered a tailored formulation phase in March 2010. The project management’s goal is to minimize changes from the OCO mission. The project office worked with NASA to develop preliminary cost estimates, which are higher than the 2008 estimate of $273.1 million for OCO, due in part to the project obtaining a full set of spares for OCO-2. NASA has selected the Taurus XL launch vehicle for OCO-2, the same vehicle used for the OCO mission. The project received $18 million under the American Recovery and Reinvestment Act of 2009 that was used to enable the earliest possible launch. Project Update: Parts Issues: The project is making every effort to duplicate the original OCO design using identical hardware, drawings, documents, procedures, and software wherever possible and practical in order to produce OCO-2 with minimum cost, schedule, and performance risk. However, project officials stated that there were no engineering models for many of the OCO components and the original components were lost on OCO, making the rebuild difficult, particularly due to obsolescence of parts. The OCO-2 project will procure a full set of spares to help avoid problems with parts obsolescence during the development and testing of flight hardware. OCO-2 encountered difficulties with two particular components due to a lack of spares and parts obsolescence. The cryocooler used on OCO was a spare that the project received at no cost; however, the same cryocoolers were not available for OCO-2. Additionally, the flight computer from OCO is now obsolete. OCO-2 is redesigning and updating the original flight computer design in order to avoid converting all of the technology to an entirely new flight computer. Project officials said they held a successful critical design review (CDR) for the redesigned flight computer based on an engineering development unit, and they expected the new design to be fully validated by the end of 2010. Funding Issues: The OCO-2 project office helped NASA develop a life-cycle cost estimate based on the original life-cycle costs of OCO. In December 2008, OCO’s life-cycle cost estimate was $273.1 million, compared to OCO-2’s baseline estimate of $349.9 million. Project officials attributed the higher life-cycle cost estimate for OCO-2 to development of a new cryocooler, inflation, procurement of a full set of spares, and an increase in the cost of the launch vehicle. For example, NASA did not have acquisition costs for the cryocooler for the original OCO mission. OCO-2 is acquiring two new cryocoolers through an interagency transfer with the National Oceanic and Atmospheric Administration (NOAA), but will have to contract for two new units to provide to NOAA for its future use. The project also used $18 million under the American Recovery and Reinvestment Act of 2009 to acquire long lead items for the spacecraft, instrument development, and project management to enable the earliest possible launch of an OCO recovery mission.
Other Issues to be Monitored: OCO-2 entered a tailored formulation phase in March 2010 to expedite entering implementation because the mission has already been designed and built once. According to project officials, the tailored formulation reduces the number of reviews; therefore, OCO-2’s first major review was the mission CDR, which was held in August 2010 and preceded project confirmation. At CDR, the project had released 95 percent of its engineering drawings for the instrument and spacecraft. In June 2010, NASA selected Orbital Sciences Corporation to launch OCO-2 aboard a Taurus XL, the same vehicle used for OCO in 2009. Orbital and NASA ran concurrent mishap investigations following the OCO launch failure, and Orbital has addressed the findings of each report. The Glory mission, the first to launch on the Taurus XL since the 2009 launch failure, is scheduled to launch in March 2011. OCO-2 is the next mission in line for the Taurus XL. OCO-2 includes a single instrument, the three-channel grating spectrometer, based on heritage technology from the OCO mission. Although the project reports that the spectrometer’s technology maturity is high, the project will make minor changes to components, and some obsolete parts will need to be replaced. Project Office Comments: The OCO-2 project provided technical comments to a draft of this assessment, which were incorporated as appropriate. The project officials also commented that OCO-2 is intended to duplicate, as much as possible, the OCO mission that was lost due to the Taurus XL failure. As such, OCO-2 was granted a waiver from the normal NASA project formulation process. OCO-2 is baselining a launch in February 2013. [End of OCO-2 data] Orion Crew Exploration Vehicle: Common Name: Orion: [Refer to PDF for image: artist depiction] Source: Lockheed Martin Space Systems. NASA’s Orion Crew Exploration Vehicle was designed to carry crew and cargo to the International Space Station (ISS) and to the Moon as part of the Constellation Program. The 5-meter diameter Orion capsule was designed to be launched by the Ares I Crew Launch Vehicle and to carry four astronauts to the ISS and the Moon after linking up with an Earth departure stage. The capsule will return to Earth and descend on parachutes to the surface. Orion has three main elements: the crew module (capsule), the service module/spacecraft adapter, and the launch abort system. Formulation: Formulation start: 7/06; Preliminary design review: 8/09; GAO review: 12/10. Implementation: Critical design review: 2/11; Launch readiness date: 3/15. Table: Project Performance (then year dollars in millions): Latest (Feb. 2011): Preliminary Estimate of Project Life Cycle Cost[A]: $20,000 to $29,000. Launch Schedule: 3/2015. [A] This estimate is preliminary, as the project is in formulation and there is still uncertainty in the value as design options are explored. NASA uses these estimates for planning purposes. This estimate is for the Orion vehicle only. [End of table] Recent/Continuing Project Challenges: * Funding Stability; * Technology Issues. Previously Reported Challenges: * Contractor Performance. Project Summary: The President’s fiscal year 2011 budget proposed cancellation of the Orion project, leading to uncertainty, both financial and programmatic, within the project. Given constrained resources, the project prioritized work and did not accomplish some of the work originally planned for 2010.
The project did, however, successfully complete a test of the launch abort system and continue to make progress on mitigating other technical challenges. In early fall 2010, Congress passed the NASA Authorization Act of 2010, directing NASA to utilize existing Orion contracts and capabilities to the extent practicable. Project Update: The President proposed cancellation of the Constellation Program, including the Orion project, in his fiscal year 2011 budget request. This proposal led to much debate within Congress and uncertainty, both financial and programmatic, within the project. As a result, the project prioritized work for the year and did not complete some of the work originally planned for 2010. In early fall 2010, Congress passed the NASA Authorization Act of 2010, which directed NASA to continue development of a multipurpose crew vehicle capable of reaching near-Earth and beyond near-Earth orbit no later than December 2016. In developing this vehicle, Congress directed the agency to continue to advance development of the human safety features, designs, and systems in the Orion project and to utilize existing contracts and capabilities to the extent practicable. Funding Issues: Funding shortfalls and uncertainty have impacted workforce availability, shifted the Orion schedule and testing strategy, and deferred procurement of new items. For example, during fiscal year 2010, NASA and Lockheed Martin had arranged an agreement under which Lockheed Martin would have performed $200 million worth of work during that fiscal year that NASA would pay for during later phases of the Orion project. However, according to project officials, NASA decided not to execute the agreement because NASA lacked sufficient budget authority to obligate funds to pay for the work. This left the project, and the entire Constellation Program, without the $200 million worth of work that they had expected and with limited resources for completing the remaining work for fiscal year 2010; therefore, the project prioritized development activities and tests. The Orion project received nearly $166 million of funding under the American Recovery and Reinvestment Act of 2009 that, according to project officials, halted layoffs at Lockheed Martin and helped the project overcome technical challenges. The value of the development contracts for Orion has increased by $2.5 billion since 2006. Technology Issues: The Orion project identified one critical heritage technology for the spacecraft: the thermal protection system, or heatshield, that is required for the spacecraft to survive reentry from Earth orbit. According to project officials, the new material for the heatshield has been tested against the material used in the Apollo program and performs as well as or better than the heritage material. However, given the current funding constraints and uncertainty surrounding the Orion project, the Orion project office prioritized development activities, and while the heatshield development and testing are continuing on plan, the determination of the manufacturing processes has been deferred. In addition, development of the launch abort system, which would pull the Orion capsule away from the Ares I launch vehicle in the case of a catastrophic problem during launch, remains a high-risk area even though it was not identified as a critical technology. In May 2010, the project tested the launch abort system in Orion’s Pad Abort 1 (PA-1) test. 
According to project officials, PA-1 was an important developmental milestone for the launch abort system, but certain issues found during the test will require design modifications to the system, and those modifications will not be tested until funding is available. The project has also developed a new controller for the launch abort system and plans to test it in the ascent abort test in 2012. However, due to funding instability, it is unknown whether and when this test will take place. Project Office Comments: The Orion project office provided technical comments on a draft of this assessment, which were incorporated as appropriate. Project officials also commented that the project has continued its work on the Constellation program. Reductions in planned work content were made to ensure availability of funds required to complete work already under contract. These reductions have made it difficult for NASA to achieve some of its goals and outcomes planned for fiscal year 2010. NASA remains poised to leverage Constellation assets to contribute to future exploration beyond low-Earth orbit. [End of Orion data] Radiation Belt Storm Probes (RBSP): Common Name: RBSP: [Refer to PDF for image: photograph] Source: © 2010 The Johns Hopkins University/Applied Physics Laboratory. All Rights Reserved. The Radiation Belt Storm Probes (RBSP) mission will explore the Sun’s influence on the Earth and near-Earth space by studying the planet’s radiation belts at various scales of space and time. This insight into the physical dynamics of the Earth’s radiation belts will provide scientists with data to predict changes in this little-understood region of space. Understanding the radiation belt environment has practical applications in the areas of spacecraft system design, mission planning, spacecraft operations, and astronaut safety. The two spacecraft will measure the particles, magnetic and electric fields, and waves that fill geospace and provide new knowledge on the dynamics and extremes of the radiation belts. Formulation: Formulation start: 9/06; Preliminary design review: 10/08. Implementation: Project Confirmation: 12/08; Critical design review: 12/09; GAO review: 12/10; Launch readiness date: 5/12. Table: Project Performance (then year dollars in millions): Total Project Cost: Baseline Est. (FY 2009): $685.8; Latest (Feb. 2011): $685.9; Change: 0.0%. Formulation Cost: Baseline Est. (FY 2009): $88.2; Latest (Feb. 2011): $88.2; Change: 0.0%. Development Cost: Baseline Est. (FY 2009): $533.9; Latest (Feb. 2011): $534.0; Change: 0.0%. Operations Cost: Baseline Est. (FY 2009): $63.7; Latest (Feb. 2011): $63.7; Change: 0.0%. Launch Schedule: Baseline Est. (FY 2009): 5/2012; Latest (Feb. 2011): 5/2012; Change: 0 months. [End of table] Recent Project Challenges: * Parts Issues; * Contractor Issues. Project Summary: RBSP project officials reported a parts failure and contractor issues that may result in the delayed delivery and integration of two key science instruments. Project officials expect delays in the delivery of the Helium-Oxygen-Proton-Electron instrument due to a parts functionality failure and in the delivery of necessary flight hardware for the MagEIS instrument, which may impact its integration with the spacecraft. RBSP’s systems integration review was held in October 2010. Project Update: Parts Issues: RBSP project officials expect delays in the delivery and integration of the Helium-Oxygen-Proton-Electron (HOPE) instrument. 
Delivery of HOPE may be delayed due to a parts functionality failure within the high-voltage optocoupler. Currently, the project considers this parts issue a risk to mission cost and schedule. However, the project manager reported that there are sufficient schedule reserves and expressed confidence that the issues can be resolved without schedule growth. Project officials said that other NASA missions had issues with the same part. The manufacturer is working to develop a revised optocoupler to meet multiple mission needs. As part of its ongoing monitoring of parts quality and qualification standards, NASA provided instructions prohibiting the use of certain connectors, which caused the project to review the types of connectors used in the observatory and replace them as applicable. The project has successfully qualified a connector to replace the NASA-prohibited connectors. The new connector has been successfully installed on flight model boards across the project. RBSP project officials classify the likelihood of an in-flight failure, had the prohibited connectors been used, as very small; however, the possible consequences, including loss of the spacecraft or an instrument, would have been significant. Contractor Issues: Delivery of the Magnetic Electron Ion Spectrometer (MagEIS) instrument is expected to be delayed due to the time a vendor is taking to provide needed flight hardware for the instrument. A project official reported that the vendor was contacted and encouraged to prioritize its commitment to the RBSP contract. However, officials reported that the project underwent a schedule replan to accommodate the late delivery and integration of MagEIS. This replan maintains the launch readiness date by re-ordering the observatory integration and test flow and changing selected subsystem and instrument delivery dates. Other Issues to be Monitored: Project officials indicated that one of the primary challenges for RBSP is developing a spacecraft capable of withstanding the high levels of radiation that it will encounter during the mission. RBSP includes many design elements, such as aluminum shielding around all major subsystems, and is undergoing extensive testing and qualification to ensure sufficient “radiation hardening.” The project manager reported that spacecraft electronics-related parts radiation testing is nearly complete, with no problems reported. Only 69 percent of the engineering design drawings, instead of the planned 87 percent, were released by the December 2009 critical design review (CDR) for RBSP. In April 2010, the project had released 93 percent of its drawings. Project officials said that RBSP was the first project at the Johns Hopkins University/Applied Physics Laboratory to use a new tracking package for reviewing and approving design drawings and therefore experienced some delays in releasing drawings at CDR. Project officials reported that there have been only minimal design changes since the CDR and that no significant design changes are expected in the future. Project Office Comments: The RBSP project office provided technical comments to a draft of this assessment, which were incorporated as appropriate. Project officials also commented that the System Integration Review was conducted on October 12-14, 2010, with the Standing Review Board recommending that the project be allowed to proceed into observatory integration and test. 
[End of RBSP data] Soil Moisture Active and Passive (SMAP): Common Name: SMAP: [Refer to PDF for image: artist depiction] Source: Jet Propulsion Laboratory. NASA’s Soil Moisture Active and Passive (SMAP) is one of four first-tier missions recommended by the National Research Council’s 2007 Earth Science Decadal Survey. SMAP leverages previous Earth Science missions and is based on the soil moisture and freeze/thaw mission concept developed by an earlier mission known as Hydros. The SMAP mission will provide new information on global soil moisture and its freeze/thaw state, enabling new advances in hydrospheric science and applications. The measurements will improve understanding of regional and global water cycles and improve weather, flood, and drought forecasts, as well as predictions of agricultural productivity and climate change. Formulation: Formulation start: 9/08; GAO review: 12/10; Preliminary design review: 3/11. Implementation: Project Confirmation: 6/11; Critical design review: 3/12; Launch readiness date: 11/14. Table: Project Performance (then year dollars in millions): Latest (Feb. 2011): Preliminary Estimate of Project Life Cycle Cost[A]: $780 to $900. Launch Schedule: 11/2014. [A] This estimate is preliminary, as the project is in formulation and there is still uncertainty in the value as design options are explored. NASA uses these estimates for planning purposes. [End of table] Recent Project Challenges: * Funding Issues; * Launch Issues. Project Summary: SMAP received $64 million in American Recovery and Reinvestment Act of 2009 funds, as well as funding from the President’s global climate initiative, which the project used to address key mission and implementation risks during formulation and to accelerate the launch readiness date from May 2015 to November 2014. The project is currently being designed to multiple launch vehicle specifications and is tracking the timing of the launch vehicle selection as a top risk. Project Update: Funding Issues: SMAP entered formulation in September 2008, and the Jet Propulsion Laboratory (JPL) was selected as the lead implementation center in January 2009. NASA officials stated that SMAP was budgeted $30 million in funding from the President’s global climate initiative and $64 million in funding from the American Recovery and Reinvestment Act of 2009, which the project used to accelerate the launch date from May 2015 to November 2014. Launch Issues: Late launch vehicle selection is one of the top risks the project is monitoring. SMAP is currently being designed to fit the specifications of three launch vehicles, including exploring a partnership for a DOD-provided launch service on the Minotaur IV. While designing to accommodate multiple launch vehicles is possible, a project official said that it limits design capabilities and can raise costs to the program as a result. Project officials stated that no certified medium-capability vehicle is currently available. The Falcon 9, which is available under the current Launch Services contract, has yet to be certified, and, if it is selected, the mission launch date will be tied to successful certification of the launch vehicle. NASA is preparing a solicitation to acquire launch services and, if commercial vehicles are not reasonably available, it may request approval by the Secretary of Defense and submit a certification to Congress for authorization to partner with DOD to use the Minotaur IV. 
The current timeline for launch vehicle selection may result in a decision after the project’s preliminary design review (PDR). Other Issues to be Monitored: Project officials stated that an early focus on risk management enabled SMAP to mitigate several top mission and implementation risks related to the aggressive schedule and the scientific outputs of the mission. For example, the project developed an end-to-end science measurement simulation to increase the data volume requirements. The project expects to mitigate several other development risks by the mission PDR in March 2011. For example, the project reported it has three heritage technologies--the radar, the radiometer, and the reflector boom assembly--all of which it will adapt for this application. None of these technologies, however, is currently mature. The project is tracking the radiometer as a project risk since it requires additional spectral filtering for radio frequency interference (RFI) mitigation. The project has identified this spectral filtering as a critical technology. Due to its extensive heritage, the project is accepting the potential cost growth risk and is addressing the technical risks through a verification and validation (V&V) program that includes a comprehensive set of assembly- and system-level analyses. There is a cost risk, however, associated with the V&V program if the project determines that additional tests and analyses are required. SMAP leverages other Earth Science projects, namely the Aquarius project, which is in the implementation phase, and the Hydros project, which was discontinued in 2005 due to lack of available funding. Although SMAP has no funding partners, the National Oceanic and Atmospheric Administration, the U.S. Department of Agriculture, and DOD are all actively engaged with SMAP to develop an applications plan for the data. Project Office Comments: The SMAP project provided technical comments to a draft of this assessment, which were incorporated as appropriate. The project officials also commented that the target launch readiness date of November 2014 is a planning date at this point and can change as funding, scope, and schedule are brought into mutual alignment. NASA will not formally commit to a launch readiness date until Project Confirmation, Key Decision Point C, currently scheduled for summer 2011. [End of SMAP data] Solar Probe Plus (SPP): Common Name: SPP: [Refer to PDF for image: artist depiction] Source: © 2010 Johns Hopkins University/Applied Physics Laboratory. Solar Probe Plus (SPP) will explore the Sun’s outer atmosphere, or corona, as it extends into space. The spacecraft will orbit the Sun 24 times, and its instruments will observe the generation and flow of solar wind from very close range. Observing the corona, where solar energetic particles are energized, has the potential to further science by shedding light on two central issues of heliophysics: the origin and evolution of the solar wind, and why the Sun’s outer atmosphere is so much hotter than its visible surface. In order to achieve its mission, parts of the spacecraft must be able to withstand temperatures exceeding 2,500 degrees Fahrenheit, as well as endure blasts of extreme radiation. Formulation: Formulation start: 11/09; GAO review: 12/10; Preliminary design review: 1/14. Implementation: Critical design review: 11/15; Launch readiness date: 8/18. Table: Project Performance (then year dollars in millions): Latest (Feb. 2011): Preliminary Estimate of Project Life Cycle Cost[A]: not available. Launch Schedule: 8/2018. 
[A] The project has not yet reached the point in the acquisition life cycle where a preliminary life cycle cost estimate would normally be developed. Recent Project Challenges: * Launch Issues. Project Summary: SPP is early in formulation and therefore is unable to provide official cost and schedule data at this time. As currently planned, the probe will fly in closer proximity to the Sun than any other spacecraft. Chief risks to the project in terms of cost and schedule include development of a sunshield capable of protecting the instruments from the harsh near-Sun environment, development of a cooling system for the retractable solar array panels, and achieving the total launch energy to get the spacecraft to its long-range destination. Project Update: Launch Issues: SPP project officials reported that one of the mission’s key challenges is achieving the total launch energy necessary to launch the spacecraft toward its long-range destination. The mission will most likely require the use of an upper stage with solid rocket propellant to provide sufficient launch energy to set the spacecraft on a trajectory to achieve solar exploration. Project officials reported that they are working to understand the performance of the standard stage and possible enhancements to upper stage performance should they be needed. These enhancements could include the possible use of a higher-energy propellant and a composite case for mass efficiency. The project commissioned a trade study that seeks to identify the optimal combination of launch vehicle and propellant upper stage to use for the launch. Project officials anticipate that the study will be completed by the Mission Design Review, currently scheduled for May 2011. Other Issues to be Monitored: A key challenge of the SPP mission will be the development of critical technologies allowing science instruments to function within the harsh near-Sun environment. Although the project is still in the concept and technology development phase, project officials reported that the Thermal Protection System (TPS)--a carbon-foam-filled sun shield that will measure over 8 feet in diameter--would sit atop the spacecraft, shielding instruments from the direct heat and radiation of the Sun. Project officials reported that they have already completed production of a 30-inch square prototype TPS shield, but at this time the technology is not fully mature. A full prototype of this technology is expected to be matured and built during Phase B. A second area of mission technology development concerns the production of two sets of solar arrays--essentially solar power generators--that will retract and extend as the spacecraft moves toward or away from the Sun. A solar array cooling system will be used to ensure the solar panels stay at required temperatures. Project officials reported that the cooling system will need the capacity to dissipate up to 5,000 watts of thermal energy during the spacecraft’s closest approach to the Sun. In order to mitigate mission risk, a backup pump for the cooling system is planned to be integrated in case the first pump fails. However, while the key technologies, including the TPS, will be tested in representative environments, it will be impossible to replicate the extreme conditions the fully assembled probe will be exposed to during its closest approach to the Sun, requiring simulators for the TPS and solar arrays during systems testing. 
Thus, the functionality of the entire spacecraft in the near-Sun environment cannot be fully verified through testing prior to launch. An Announcement of Opportunity was issued in December 2009, and project officials reported that thirteen science proposals were considered by a panel of NASA and other scientists. In 2010, the project selected five science investigations, which, when awarded, will have a combined value of approximately $165 million for preliminary analysis, design, development, and testing. Project Office Comments: The SPP project office provided technical comments to a draft of this assessment, which were incorporated as appropriate. Project officials also commented that SPP is making progress through formulation. [End of SPP data] Stratospheric Observatory for Infrared Astronomy (SOFIA): Common Name: SOFIA: [Refer to PDF for image: illustration] Source: SOFIA First Light Image Composite. SOFIA is a joint project between NASA and the German Space Agency to install a 2.5-meter telescope in a specially modified Boeing 747SP aircraft. This airborne observatory is designed to provide routine access to the visual, infrared, far-infrared, and sub-millimeter parts of the spectrum. Its mission objectives include studying many different kinds of astronomical objects and phenomena, including star birth and death; the formation of new solar systems; planets, comets, and asteroids in our solar system; and black holes at the center of galaxies. Interchangeable instruments for the observatory are being developed to allow a range of scientific measurements to be taken by SOFIA. Formulation: Formulation start: 10/91. Implementation: Project Confirmation: 11/95; Critical design review: 8/00; GAO review: 12/10; Initial operational capability: 12/10; Full operational capability: 12/14. Table: Project Performance (then year dollars in millions): Total Project Cost: Baseline Est. (FY 2007): $2,954.5; Latest (Feb. 2011): $3,002.9; Change: 1.6%. Formulation Cost: Baseline Est. (FY 2007): $35.0; Latest (Feb. 2011): $35.0; Change: 0.0%. Development Cost: Baseline Est. (FY 2007): $919.5; Latest (Feb. 2011): $1,128.4; Change: 22.7%. Operations Cost: Baseline Est. (FY 2007): $2,000.0; Latest (Feb. 2011): $1,839.5; Change: -8.0%. Launch Schedule: Baseline Est. (FY 2007): 12/2013; Latest (Feb. 2011): 12/2014; Change: 12 months. [End of table] Recent/Continuing Project Challenges: * Technology Issues; * Design Issues; * Contractor Issues. Previously Reported Challenges: * Funding Issues. Project Summary: Since our last review, SOFIA has experienced a delay in the delivery of hardware from vendors and development issues surrounding the Cavity Door Drive System. While this resulted in a 7-month slip in the initiation of science flights to December 2010, the program achieved a significant milestone with the completion of the first light flight on May 25, 2010. In 2009 and 2010, NASA reported to the Congress that SOFIA exceeded both its cost and schedule baselines. Project Update: As required by law, NASA reported to the Congress in 2009 and 2010 that SOFIA exceeded its 2007 development cost baseline by more than 15 percent and its schedule baseline by more than 6 months. SOFIA’s development costs have increased by more than 268 percent, or over $1.1 billion, since its 1995 estimate. These cost increases are partly due to challenges with modification of the aircraft to be used for SOFIA and, more recently, development of the Cavity Door Drive System (CDDS). 
This year, project officials told us SOFIA’s development costs increased due to higher flight hangar costs. Some data for the project were not provided by NASA because, according to project officials, the project documentation did not transfer in its entirety from Ames Research Center to Dryden Flight Research Center. Technology Issues: We could not assess the technology maturity of the overall project because NASA did not provide information on heritage technologies related to the aircraft modification. Data provided for development of the instruments that will fly on SOFIA generally indicate a high level of technology maturity. Many of these technologies have already been used on ground-based telescopes. Project officials told us that of the eight first-generation science instruments, one instrument was flown on the first light flight in May 2010, one instrument has been installed and tested on the ground, one instrument is awaiting installation, and four instruments will be installed by 2013. Design Issues: We were unable to determine the design stability of the instruments since the drawings were still preliminary at the critical design review. Last year, project officials reported that design work on SOFIA was 97 percent complete and that all designs would be complete by 2011. However, due to problems with the CDDS vendor and longer-than-anticipated door testing, initial science flights have been delayed one year. Because modifications to several subsystems will be ongoing during the early science missions, project officials told us designs will not be finalized until 2014, when the project is scheduled to begin operations. A date for the preliminary design review was not provided by NASA. Contractor Issues: Since our last review, the SOFIA project has experienced at least a 6-month slip in the scheduled commencement of initial science flights due to late delivery of hardware and software in the CDDS and rework of vendor-supplied hardware. The project found problems with software quality assurance, which later indicated that there were also problems with hardware quality assurance and required a rebuild of the CDDS components. NASA consequently reduced the contractor’s management role for both development and operations of SOFIA and utilized government personnel to perform these functions in-house and to complete the CDDS. The project successfully completed the first open door flight test on December 18, 2009, and experienced no anomalies. To date, the project has conducted three open door landings, two of which were unplanned and caused by nuisance faults. The project manager stated that, in the open door testing process, there was a high probability of a halt in the door system and that the project was prepared for this occurrence. He stated that there is no backup door opening system, but that the project did have a default reset for door issues in flight. The project continues to troubleshoot development of the CDDS and is utilizing an independent consultant to investigate the system and recommend future upgrades. In August 2010, the project completed its second segment of flight tests with its telescope door open to prepare the observatory for early science missions. Project Office Comments: The SOFIA project office provided technical comments to a draft of this assessment, which were incorporated as appropriate. Project officials also commented that the SOFIA project has made progress toward the initiation of science observations. 
[End of SOFIA data] Tracking and Data Relay Satellite (TDRS) Replenishment: Common Name: TDRS: [Refer to PDF for image: artist depiction] Source: © Boeing. The Tracking and Data Relay Satellite (TDRS) System consists of in-orbit communication satellites stationed at geosynchronous altitude coupled with two ground stations located in New Mexico and Guam. The satellite network and ground stations provide mission services for near-Earth user satellites and orbiting vehicles. TDRS K and L are the 11th and 12th satellites, respectively, to be built for the TDRS system and will contribute to the existing network by providing high-bandwidth digital voice, video, and mission payload data, as well as health and safety data relay services to Earth-orbiting spacecraft, such as the International Space Station. Formulation: Formulation start: 2/07; Preliminary design review: 3/09. Implementation: Project Confirmation: 7/09; Critical design review: 2/10; GAO review: 12/10; Launch readiness date TDRS K: 12/12; Launch readiness date TDRS L: 12/13. Table: Project Performance (then year dollars in millions): Total Project Cost: Baseline Est. (FY 2010): $451.3; Latest (Feb. 2011): $434.1; Change: -3.8%. Formulation Cost: Baseline Est. (FY 2010): $241.9; Latest (Feb. 2011): $241.9; Change: 0.0%. Development Cost: Baseline Est. (FY 2010): $209.4; Latest (Feb. 2011): $192.2; Change: -8.2%. Operations Cost: Baseline Est. (FY 2010): $0.0; Latest (Feb. 2011): $0.0; Change: 0.0%. Launch Schedule K: Baseline Est. (FY 2010): 12/2012; Latest (Feb. 2011): 12/2012; Change: 0 months. Launch Schedule L: Baseline Est. (FY 2010): 12/2013; Latest (Feb. 2011): 12/2013; Change: 0 months. [End of table] Recent Project Challenges: * Parts Issues. Project Summary: The TDRS project identified an issue with contamination of the lubricants in the reaction wheel assemblies. The cost impact of this issue is borne by the prime contractor. In June 2010, the project awarded a contract to enhance the existing ground-system architecture to ensure the TDRS system continues providing space-to-ground telecommunications. However, even with the successful launch of TDRS K and L, NASA is only able to guarantee continuity of service of the TDRS system through fiscal year 2016. Project Update: Parts Issues: In 2010, TDRS project officials discovered that the lubricant in the reaction wheel assemblies was contaminated by silicone. The project initially reported that it might take up to 18 months for the original supplier to provide replacements and that no other appropriate reaction wheels were in production by alternative vendors. However, project officials expected that replacement reaction wheels would be made available in November and December 2010, which equates to an approximately 2-month delay to the scheduled wheel delivery dates. Other Issues to be Monitored: In June 2010, a cost-plus-award-fee contract was awarded to modernize the ground-based communication systems needed for TDRS K and L. In order to maximize the capabilities of TDRS K, necessary enhancements to the ground system must be prioritized within the 2 years prior to launch in 2012. TDRS K and L are being designed with high-bandwidth communication capabilities, including the transmission of images, video, voice, and other digital data from Earth-orbiting spacecraft to the ground. The ground-based beamforming architecture at the White Sands Complex in New Mexico is currently being modified to provide beamformers compatible with TDRS K and L for the ground station. 
Project officials reported that the switch to ground-based beamforming was required to provide compatibility with network demand services developed in the late 1990s. Project officials recognize challenges with updating ground segment equipment, describing some current instruments as early-1990s vintage and facing obsolescence issues. The TDRS System is considered by NASA to be a basic agency capability and a national resource. The Space Shuttle and many near-Earth spacecraft are totally dependent upon the satellite system for communication, and therefore, NASA considers the TDRS Replenishment project critical in terms of achieving its launch schedule. However, even with the successful launch of TDRS K and L, continuity of service for the TDRS system can only be ensured for NASA and other government agency users through approximately fiscal year 2016 at current support levels. The primary reason is the aging fleet of satellites. The first TDRS satellite, now decommissioned, has been in Earth orbit since 1983. According to a project official, the current fixed-price development contract for TDRS K and L includes an option to produce two additional TDRS satellites--designated M and N--and the addition of these two satellites could extend TDRS system service continuity. However, in order to exercise the options for TDRS M and N, NASA would need a financial commitment of $1.2 billion from partnership organizations. Project officials reported that a decision on exercising the option for TDRS M needs to be made no later than November 30, 2011, and for TDRS N no later than November 30, 2012. Project Office Comments: The TDRS project office provided technical comments to a draft of this assessment, which were incorporated as appropriate. Project officials commented that they agreed with the assessment as written. [End of TDRS data] Agency Comments and Our Evaluation: We provided a draft of this report to NASA for review and comment. In its written response, NASA agreed with our findings and stated that it will continue to identify and address the challenges that may lead to cost and schedule growth in its projects. NASA agreed that GAO’s cost and schedule growth figures reflect what the agency has experienced since baselines were established in response to the 2005 statutory reporting requirement. NASA also stated that the average cost growth remains below the 15 percent threshold that requires congressional notification. While this is correct, it should be noted that the notification requirements apply to individual projects, not the portfolio as a whole. In addition, NASA acknowledges that the current estimates for the James Webb Space Telescope do not represent the cost and schedule required to complete the project, and that the agency is undertaking a comprehensive replanning activity to establish the best budget phasing and schedule to minimize risk and life-cycle cost within the overall constraints of its budget. We encourage NASA to provide a revised budget and schedule for JWST that is based on a sound, knowledge-based business case to allow the project to succeed. NASA noted that its projects are high-risk, one-of-a-kind development efforts that do not lend themselves to all the practices of a “business case” approach that we outlined, since essential attributes of NASA’s project development differ from those of a production entity. We agree and do not assess NASA’s projects for production maturity. 
We do, however, assess NASA projects at critical points in the product development process to ensure that these projects are proceeding with system development on the basis of a sound business case. At these key junctures, we have found that NASA could benefit from a more disciplined approach to its acquisitions whereby decisions are based upon high levels of knowledge. As we reported, inherent risks are being heightened due to projects moving forward with immature technologies, unstable designs, and other challenges, leading to cost and schedule increases that make it hard for the agency to manage its portfolio and make informed investment decisions. GAO looks forward to working with NASA as it develops metrics to better measure design stability and continues to refine the information it uses to understand a project’s status and make informed decisions. NASA stated that the drawing release metric we use to assess design stability was developed prior to the use of computerized drawings and does not take into account improvements due to the use of this technology. We acknowledge this point, but our analysis of NASA projects shows that those projects that have met or come close to meeting the best practices drawing release metric have fared better with regard to cost and schedule than those projects that did not come close to meeting the metric. Furthermore, in no way does GAO contend that the drawing release metric is the only way to assess design stability. Until NASA has taken steps to identify a consistent and proven metric by which to measure projects with a portfolio perspective, however, we will continue to use this metric to assess stability. NASA has indicated that it will develop such metrics and provide them to GAO in March 2011. We are encouraged by this progress and look forward to receiving the information. NASA expressed concern that technical corrections it provided to our 2-page summaries were not fully accepted. We incorporated the technical comments where supporting documentation that meets our standards of evidence was provided. We did not incorporate the comments where this information was not provided; where the change was less a technical correction and more a difference of opinion between GAO and NASA based on facts; or where space limitations required a briefer description of an issue than NASA requested. As this work will be continuing in future years, we will continue to capture the progress made by all the projects in our review. Finally, we make every effort to provide the latest information possible in our report. We will continue to work with NASA to ensure that updated information is provided to GAO in a timely manner so that it can be included in our analysis. NASA’s written comments are reprinted in appendix I. NASA also provided technical comments, which we addressed throughout the report as appropriate and where sufficient evidence was provided to support significant changes. We will send copies of the report to NASA’s Administrator and interested congressional committees. We will also make copies available to others upon request. In addition, the report will be available at no charge on GAO’s Web site at [hyperlink, http://www.gao.gov]. Should you or your staff have any questions on matters discussed in this report, please contact me at (202) 512-4841 or chaplainc@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. 
GAO staff who made major contributions to this report are listed in appendix IV. Signed by: Cristina Chaplain: Director: Acquisition and Sourcing Management: [End of section] List of Congressional Committees: The Honorable Barbara A. Mikulski: Chairwoman: The Honorable Kay Bailey Hutchison: Ranking Member: Subcommittee on Commerce, Justice, Science, and Related Agencies: Committee on Appropriations: United States Senate: The Honorable Bill Nelson: Chairman: The Honorable John Boozman: Ranking Member: Subcommittee on Science and Space: Committee on Commerce, Science, and Transportation: United States Senate: The Honorable Frank R. Wolf: Chairman: The Honorable Chaka Fattah: Ranking Member: Subcommittee on Commerce, Justice, Science, and Related Agencies: Committee on Appropriations: House of Representatives: The Honorable Stephen Palazzo: Chairman: The Honorable Gabrielle Giffords: Ranking Member: Subcommittee on Space and Aeronautics: Committee on Science, Space, and Technology: House of Representatives: [End of section] Appendix I: Comments from the National Aeronautics and Space Administration: National Aeronautics and Space Administration: Office of the Administrator: Washington, DC 20546-0001: February 23, 2011: Ms. Christina Chaplain: Director: Acquisition and Sourcing Management: United States Government Accountability Office: Washington, DC 20548: Dear Ms. Chaplain: The National Aeronautics and Space Administration (NASA) appreciates the opportunity to comment on the Government Accountability Office (GAO) draft report entitled "Assessments of Selected Large-Scale Projects" (GAO-11-239SP). NASA values the continued open and constructive communications between NASA and the GAO team on this effort. NASA remains dedicated to continuous improvement of its acquisition management processes and performance and will continue to work with the GAO to identify and address the challenges that may lead to cost and schedule growth of our projects. We are pleased that GAO has again recognized NASA's ongoing efforts to mitigate acquisition management risk and lay a stronger foundation for reducing project cost and schedule growth. As was highlighted, NASA instituted a Joint Cost and Schedule Confidence Level (JCL) policy in 2009 to increase the likelihood of project success at the specified funding level. As expected in 2010, execution of the JCL process prior to confirmation of several projects, including the Lunar Atmosphere and Dust Environment Explorer, the Mars Atmosphere and Volatile Evolution Mission, and the Orbiting Carbon Observatory 2, increased insight by project managers, the Standing Review Board, and NASA management surfacing uncertainties and contingencies with the integrated cost and schedule plan. NASA will continue to assess the impact of utilizing JCLs on project cost and schedule growth as these projects complete their Systems Integration Reviews in the next two years. Furthermore, with the completion and launch in 2011 of three missions baselined under the earlier cost confidence level policy, NASA will have a better measure of the impact of our acquisition management improvement efforts over the last five years. In its draft report, GAO states that NASA's project development costs for the 16 projects in implementation in this review have increased by an average of 14.6 percent from their baseline cost estimates and experienced an average delay of eight months, an improvement of three months since the previous report. 
NASA agrees with the cost and schedule growth figures that are quoted and notes that the average cost growth remains below the 15 percent threshold which requires special notification to Congress. Furthermore, fewer projects have exceeded this threshold since NASA's new cost estimating policies were put into place. These figures are reflective of what has been experienced since baselines were established in response to the 2005 statutory reporting requirement. GAO notes that the calculation of NASA's average development cost and schedule growth does not include anticipated growth on the James Webb Space Telescope (JWST) project. The cost and schedule quoted in the draft GAO report are the results of a rough estimate by the Independent Comprehensive Review Panel (ICRP) which recently assessed JWST and do not represent NASA's estimate of the cost and schedule required to complete the project. In response to ICRP's findings and recommendations, NASA is currently undertaking a comprehensive replanning activity to establish the best budget phasing and schedule to minimize the risk and life-cycle cost of JWST within the overall constraints of NASA's budget. The revised budget and schedule will be completed after the release of the President's 2012 Budget. Decisions resulting from this replanning activity, with any required Agency offsets, will be reflected in the President's 2013 Budget. While NASA practices many elements of GAO's stated "business case" approach, some essential aspects of NASA's project development differ from those of a production entity, which is the basis for the GAO approach. NASA's projects are generally high-risk, one-of-a-kind developments and, therefore, do not have a production phase. The draft GAO report acknowledges the unique nature of NASA projects but applies a best practices approach that requires an incremental development process. NASA's work pushes the boundary of our achievements and often requires leaps, not steps, to accomplish the mission. NASA aims to continue to innovate in an affordable and sustainable way and will continue to work with GAO to determine which elements of the approach are valuable for informing improvements and which may need to be modified to account best for the complexity that surrounds our challenging missions. An area where NASA and GAO can work together to adapt the assessment approach is in the determination of design stability. Although the best practice recommends 90 percent drawing release by Critical Design Review (CDR), the drawing release metric is a standard developed prior to the use of computerized drawings and, hence, does not take into account improvements due to the use of this technology. NASA continues to develop metrics to measure design stability as well as other knowledge required to understand a project's progress and maturity. The draft GAO report notes challenges with launch vehicles, specifically, NASA's transition plans for future medium-class launch vehicles and cites a previous recommendation that NASA perform detailed cost estimates for certification of new vehicles and adequately budget for the associated risks. NASA is in the process of estimating costs to certify the Falcon 9 and will budget accordingly. Taurus II is not currently included in the NASA Launch Services contract; however, NASA will follow the same process in the event that it is added. 
In addition, NASA is working to resolve the inherent conflict between the desire to minimize technical risk by identifying the launch vehicle as early as possible (before Preliminary Design Review) and the desire to minimize programmatic risk by not committing to purchase a launch vehicle prior to mission confirmation (at Key Decision Point-C, Post-Preliminary Design Review). NASA is concerned that technical corrections to the Project Two-Page Summaries that were provided in late 2010 were not fully accepted by the GAO in the draft report. Many of these comments have been resubmitted for GAO's consideration. NASA will work with the GAO team to better understand why specific corrections were not accepted and to better explain our issues if necessary. In addition, NASA understands that the GAO's work was completed in the fall of 2010 and is concerned that the assessment, therefore, does not recognize the significant progress the Agency has made since then. NASA will continue to follow through with our new policies and management attention on cost and schedule growth in the coming year. We are committed to continuous improvement in order to explore and utilize space in an affordable way for the benefit of the Nation. To this end, we look forward to continuing to work with the GAO to measure and improve our performance. Thank you for the opportunity to comment on this draft report. If you have any questions or require additional information, please contact Katie Gallagher at (202) 358-2185. Sincerely, Signed by: Lori B. Garver: Deputy Administrator: [End of section] Appendix II: Objectives, Scope, and Methodology: Our objectives were to report on the status and challenges faced by NASA systems with life-cycle costs of $250 million or more and to discuss broader trends faced by the agency in its management of system acquisitions. In conducting our work, we evaluated performance and identified challenges for each of 21 major projects. We summarized our assessments of each individual project in two components--a project profile and a detailed discussion of project challenges. We did not validate the cost and schedule data provided by NASA. However, we took appropriate steps to address data reliability. Specifically, we confirmed the accuracy of NASA-generated data with multiple sources within NASA and, in some cases, with external sources. Additionally, we corroborated data provided to us with published documentation. We determined that the data provided by NASA project offices were sufficiently reliable for our engagement purposes. We developed a standardized data collection instrument (DCI) that was completed by each project office. Through the DCI, we gathered basic information about projects as well as current and projected development activities for those projects. The cost and schedule data estimates that NASA provided were the most recent updates as of November 2010; performance data that NASA provided were also the most recent updates as of September 2010. At the time we collected the data, 8 of the 21 projects were in the formulation phase. Three of these 8 projects--MAVEN, LADEE, and OCO-2--were confirmed and entered the implementation phase late in 2010. To further understand performance issues, we talked with officials from most project offices and NASA's Office of the Chief Financial Officer (OCFO) Strategic Investments Division (SID). We also collected cost and schedule data for projects in operations that we had reviewed in prior reports for historical purposes. 
These projects were DAWN, GLAST, Herschel, Kepler, LRO, OCO, SDO, and WISE. The information collected from each project office, Mission Directorate, and OCFO/SID was summarized in a 2-page report format providing a project overview; key cost, contract, and schedule data; and a discussion of the challenges associated with deviations from relevant best practice indicators. The aggregate measures and averages calculated were analyzed for meaningful relationships, such as the relationship between cost growth and schedule slippage and the knowledge maturity attained both at critical milestones and through the various stages of the project life cycle. Cost growth averages used in this report are weighted averages and should not be used as a point of comparison to previous reports where weighted averages were not used. We identified cost and/or schedule growth as significant where a project exceeded its statutory cost or schedule baseline by the thresholds that trigger reporting to the Congress. To supplement our analysis, we relied on GAO's work over the past years examining acquisition issues across multiple agencies. These reports cover such issues as contracting, program management, acquisition policy, and cost estimating. GAO also has an extensive body of work related to challenges NASA has faced with specific system acquisitions, financial management, and cost estimating. This work provided the context and basis for large parts of the general observations we made about the projects we reviewed. Additionally, the discussions with the individual NASA projects helped us identify further challenges faced by the projects. Together, the past work and additional discussions contributed to our development of a short list of challenges discussed for each project. The challenges we identified and discussed do not represent an exhaustive or exclusive list. They are subject to change and evolution as GAO continues this annual assessment in future years. The challenges, indicated as "issues," are based on our definitions, not those of NASA. Our work was performed primarily at NASA headquarters in Washington, D.C. In addition, we visited NASA's Marshall Space Flight Center in Huntsville, Alabama, and Goddard Space Flight Center in Greenbelt, Maryland, to discuss individual projects. We also met with representatives from NASA's Jet Propulsion Laboratory in Pasadena, California, and a contractor involved with several projects, Orbital Sciences Corporation. In addition, we interviewed officials at Johnson Space Center in Houston, Texas; Ames Research Center at Moffett Field in California; and Dryden Flight Research Center at Edwards Air Force Base in California. Data Limitations: NASA provided specific cost and schedule estimates for only 16 of the 21 projects in our review. NASA provided internal preliminary estimated total (life-cycle) cost ranges and associated schedules for three of the projects that had not yet entered implementation, from key decision point B (KDP-B), solely for informational purposes.[Footnote 47] We did not receive cost estimates or ranges for two projects--Ice, Cloud, and Land Elevation Satellite-2 and Solar Probe Plus--since these projects had not yet reached their KDP-B, the point in the acquisition life cycle where a preliminary life cycle cost estimate would normally be developed. We did receive preliminary scheduled launch dates for these two projects. 
NASA formally establishes cost and schedule baselines, committing itself to cost and schedule targets for a project with a specific and aligned set of planned mission objectives, at key decision point C (KDP-C), which follows a non-advocate review (NAR) and preliminary design review (PDR). KDP-C reflects the life-cycle point where NASA approves a project to leave the formulation phase and enter the implementation phase. NASA explained that preliminary estimates are generated for internal planning and fiscal year budgeting purposes at KDP-B, which occurs mid-stream in the formulation phase, and hence are not considered a formal commitment by the agency on cost and schedule for the mission deliverables. NASA officials contend that because of changes that occur to a project's scope and technologies between KDP-B and KDP-C, estimates of project cost and schedule can change significantly heading toward KDP-C. We requested earned value management data for the 21 projects and received data on 11 of them. However, this information was received late in our review and, as a result, we were unable to conduct a detailed analysis of the earned value data. We also requested independent cost estimates and Joint Cost and Schedule Confidence Levels (JCL) for the projects that completed them. We received independent cost estimates for 12 of the projects in our review and for 6 projects that have launched since our last review. In most cases, we received independent cost estimates conducted at the center level by the projects, along with estimates by the Aerospace Corporation and/or by NASA's Independent Program Assessment Office. We received JCL analyses from three of the five projects that have completed their JCLs. However, this information was incomplete and was received late in our review and, as a result, we were unable to conduct a thorough analysis of the data. Project Profile Information on Each Individual 2-Page Assessment: This section of the 2-page assessment outlines the essentials of the project, its cost and schedule performance, and its summary. Project essentials reflect pertinent information about each project, including, where applicable, the major contractors and partners involved in the project. These organizations have primary responsibility over a major segment of the project or, in some cases, the entire project. Project performance is depicted according to cost and schedule changes in the various stages of the project life cycle. To assess the cost and schedule changes of each project, we obtained data directly from NASA OCFO/SID and from NASA's Integrated Budget and Performance documents. For systems in implementation, we compared the latest available information with the statutory cost and schedule baseline estimates for each project. All cost information is presented in nominal "then year" dollars for consistency with budget data.[Footnote 48] Baseline costs are adjusted to reflect the cost accounting structure in NASA's fiscal year 2009 budget estimates. For the fiscal year 2009 budget request, NASA changed its accounting practices from full-cost accounting to reporting only direct costs at the project level. The schedule assessment is based on acquisition cycle time, which is defined as the number of months between the project start, or formulation start, and the projected or actual launch date.[Footnote 49] Formulation start generally refers to the initiation of a project; NASA refers to project start as key decision point A, or the beginning of the formulation phase. 
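The cost and schedule comparisons described above reduce to simple arithmetic. The following sketch is illustrative only and is not a GAO or NASA tool; the function names and the dates in the second example are hypothetical, while the cost figures are drawn from the SOFIA development cost data presented earlier in this report.

    from datetime import date

    def cost_change_percent(baseline_millions, latest_millions):
        # Percent change of the latest estimate relative to the baseline estimate.
        return (latest_millions - baseline_millions) / baseline_millions * 100.0

    def cycle_time_months(formulation_start, launch):
        # Acquisition cycle time: months between formulation start and the
        # projected or actual launch date.
        return (launch.year - formulation_start.year) * 12 + (launch.month - formulation_start.month)

    # SOFIA development cost: $919.5 million baseline versus $1,128.4 million latest.
    print(round(cost_change_percent(919.5, 1128.4), 1))           # prints 22.7
    # Hypothetical project: formulation start March 2010, launch February 2013.
    print(cycle_time_months(date(2010, 3, 1), date(2013, 2, 1)))  # prints 35

The same percent change calculation, applied to a project's development cost baseline, underlies the 15 percent threshold that triggers reporting to the Congress.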
The preliminary design review typically occurs during the end of the formulation phase, followed by a confirmation review process, referred to as key decision point C, which allows the project to move into the implementation phase. The critical design review is held during the final design period of implementation and demonstrates that the maturity of the design is appropriate to support proceeding with full-scale fabrication, assembly, integration, and test. Launch readiness is determined through a launch readiness review that verifies that the launch system and spacecraft/payloads are ready for launch. The implementation phase includes the operations of the mission and concludes with project disposal. We assessed the extent to which NASA projects exceeded their statutory cost and schedule baselines. To do this, we compared the project statutory baseline cost and schedule estimates with the current cost and schedule data reported by the project office in November 2010. Project Challenges Discussion on Each Individual 2-Page Assessment: To assess the project challenges for each project, we submitted a data collection instrument to each project office. In the data collection instrument, we requested information on the maturity of critical and heritage technologies, the number of releasable design drawings at project milestones, and project contractors and partnerships. We also held interviews with representatives from each of the projects to discuss the information on the data collection instrument. These discussions led to the identification of further challenges faced by NASA projects. The eight challenges we identified were largely apparent in the projects that had entered the implementation phase; however, there were instances where these challenges were identified in projects in the formulation phase. We then reviewed pertinent project documentation, such as the project plan, schedule, risk assessments, and major project reviews, to corroborate any testimonial evidence we received in the interviews. To assess issues with technology, we asked project officials to provide the technology readiness levels (TRL) of each of the project's critical technologies at various stages of project development. Originally developed by NASA, TRLs are measured on a scale of one to nine, beginning with paper studies of a technology's feasibility and culminating with a technology fully integrated into a completed product. (See appendix IV for the definitions of technology readiness levels.) In most cases, we did not validate the project offices' selection of critical technologies or the determination of the demonstrated level of maturity. However, we sought to clarify the technology readiness levels in those cases where the information provided raised concerns, such as where a critical technology was reported as immature late in the project development cycle. Additionally, we asked project officials to explain the environments in which technologies were tested. Our best practices work has shown that a technology readiness level of 6--demonstrating a technology as a fully integrated prototype in a relevant environment--is the level of maturity needed to minimize risks for space systems entering product development. In our assessment, the technologies that have reached technology readiness level 6 are referred to as fully mature because of the difficulty of achieving technology readiness level 7, which is demonstrating maturity in an operational environment--space. 
Projects with critical technologies that did not achieve maturity by the preliminary design review were assessed as having a technology issues project challenge. We did not assess technology maturity for those projects that had not yet reached the preliminary design review at the time of this assessment.[Footnote 50] We also asked project officials to assess the TRL of each of the project's heritage technologies at various stages of project development. We also interviewed project officials about the use of heritage technologies in their projects. We asked them what heritage technologies were being used, what effort was needed to modify the form, fit, and function of the technology for use in the new system, whether the project encountered any problems in modifying the technology, and whether the project considered the heritage technology a risk to the project. Heritage technologies were not considered critical technologies by several of the projects we reviewed. Based on our interviews, review of data from the data collection instruments, and previous GAO work on space systems, we determined whether these technology issues were a challenge for a particular project. To assess issues with design, we asked project officials to provide the percentage of engineering drawings completed or projected for completion by the preliminary and critical design reviews and as of our current assessment.[Footnote 51] In most cases, we did not verify or validate the percentage of engineering drawings provided by the project office. However, we collected the project offices' rationale for cases where it appeared that only a small number of drawings were completed by the time of the design reviews or where the project office reported significant growth in the number of drawings released after CDR. In accordance with GAO's best practices, projects were assessed as having achieved design stability if they had at least 90 percent of projected drawings releasable by the critical design review. Projects that had not met this metric were determined to have a design stability project challenge; this threshold and the technology maturity threshold described above are illustrated in the sketch below. Though some projects used other methods to assess design stability, such as computer and engineering models and analyses, we did not assess the effectiveness of these other methods. We did not assess design stability for those projects that had not yet reached the critical design review at the time of this assessment. To assess issues with funding, we interviewed officials from NASA's OCFO/SID and NASA project officials, and also relied upon past interviews with project contractors about the stability of funding throughout the project life cycle. In addition, NASA received an appropriation from the American Recovery and Reinvestment Act of 2009 (ARRA). NASA provided a record of the projects in our review that received ARRA funds, and the amount of ARRA funds each project received was reported in the cost tab of the data collection instrument. We also asked project and Mission Directorate officials to discuss how these funds were used. Funding issues were considered a challenge if officials indicated that project funding had been interrupted or delayed, resulting in an impact to the cost, schedule, or performance of the project; if the project received ARRA funding; or if project officials indicated that the project's budget did not have sufficient funding in certain years based on the work expected to be accomplished. We corroborated the funding changes and reasons with budget documents when available.
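The technology maturity and design stability criteria described above are threshold tests that can be expressed compactly. The sketch below is illustrative only; the critical technology names, TRL values, and drawing counts are hypothetical assumptions rather than data from any project we assessed, and the drawing denominator includes later drawing growth, consistent with the approach described in footnote 51.

Illustrative Python sketch:

# Hypothetical inputs; not data from any project assessed in this report.
critical_technology_trls_at_pdr = {"detector array": 6, "cryocooler": 5, "sunshield": 6}
drawings_released_by_cdr = 820
total_drawings_projected = 1000   # includes growth in drawings after CDR (see footnote 51)

# Technology criterion: every critical technology should reach TRL 6 by the
# preliminary design review to be considered fully mature.
immature = [name for name, trl in critical_technology_trls_at_pdr.items() if trl < 6]
technology_challenge = bool(immature)

# Design criterion: at least 90 percent of projected drawings releasable by the
# critical design review indicates design stability.
release_pct = drawings_released_by_cdr / total_drawings_projected * 100
design_challenge = release_pct < 90

print(f"Technology issues challenge: {technology_challenge} (immature technologies: {immature})")
print(f"Drawings releasable by CDR: {release_pct:.0f} percent; design stability challenge: {design_challenge}")

In this hypothetical case, one critical technology remains below TRL 6 at the preliminary design review and only 82 percent of projected drawings are releasable by the critical design review, so the project would be assessed as having both a technology issues challenge and a design stability challenge.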
To assess issues with launch, we interviewed NASA Launch Services and project officials. We also interviewed contractor representatives from Orbital Sciences Corporation to discuss the launch failure of the OCO-1 mission in 2009 and the return-to-flight process for the Taurus XL for the Glory and OCO-2 missions. Launch issues were considered a challenge if, after establishing a firm launch date, a project had difficulty rescheduling its launch date because it was not ready; if the project could be affected by another project slipping its launch; or if there were launch vehicle fleet issues. In addition, we assessed the status of launch vehicle selection for projects in formulation and considered it a challenge if the proposed launch vehicle selection date falls after the preliminary design review due to the availability of certified medium class launch vehicles. To assess issues with contractor management, we interviewed project officials about their interaction and experience with contractors. We also interviewed contractor representatives from Orbital Sciences Corporation. We were informed about contractor performance problems pertaining to their workforce, the supplier base, and technical and corporate experience. We assessed a project as having this challenge if these contractor issues caused the project to experience a cost overrun, schedule delay, or decrease in mission capability. For projects that did not have a major contractor, we considered this challenge inapplicable to the project. To assess issues with development partners, we interviewed NASA project officials about their interaction with international or domestic partners during project development. Development partner issues were considered a challenge for the project if project officials indicated that domestic or foreign partners were experiencing problems with project development that impacted the cost, schedule, or performance of the project for NASA. These challenges were specific to the partner organization or caused by a contractor to that partner organization. For projects that did not have an international or domestic development partner, we considered this challenge not applicable to the project. To assess issues with parts quality, we submitted a data collection instrument in conjunction with other ongoing GAO work to all of the projects in the implementation phase that were scheduled to operate in a space environment. In addition, we asked project officials to identify project components that encountered parts quality or availability problems during development. Additionally, we asked project officials to explain the environments in which the parts quality issues were discovered and any implications for the project's cost and schedule. We considered parts issues a challenge if there were actual or potential cost and/or schedule impacts to the project as a result of parts quality or availability, or if the project had to take special steps to address parts issues. The individual project offices were given an opportunity to comment on and provide technical clarifications to the 2-page assessments prior to their inclusion in the final product. We incorporated these comments as appropriate and where sufficient supporting documentation was provided. We conducted this performance audit from March 2010 to February 2011 in accordance with generally accepted government auditing standards.
Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

[End of section]

Appendix IV: Technology Readiness Levels:

Technology readiness level: 1. Basic principles observed and reported; Description: Lowest level of technology readiness. Scientific research begins to be translated into applied research and development. Examples might include paper studies of a technology's basic properties; Hardware: None (paper studies and analysis); Demonstration environment: None.

Technology readiness level: 2. Technology concept and/or application formulated; Description: Invention begins. Once basic principles are observed, practical applications can be invented. The application is speculative and there is no proof or detailed analysis to support the assumption. Examples are still limited to paper studies; Hardware: None (paper studies and analysis); Demonstration environment: None.

Technology readiness level: 3. Analytical and experimental critical function and/or characteristic proof of concept; Description: Active research and development is initiated. This includes analytical studies and laboratory studies to physically validate analytical predictions of separate elements of the technology. Examples include components that are not yet integrated or representative; Hardware: Analytical studies and demonstration of nonscale individual components (pieces of subsystem); Demonstration environment: Lab.

Technology readiness level: 4. Component and/or breadboard validation in laboratory environment; Description: Basic technological components are integrated to establish that the pieces will work together. This is relatively "low fidelity" compared to the eventual system. Examples include integration of "ad hoc" hardware in a laboratory; Hardware: Low fidelity breadboard. Integration of nonscale components to show pieces will work together. Not fully functional or form or fit but representative of technically feasible approach suitable for flight articles; Demonstration environment: Lab.

Technology readiness level: 5. Component and/or breadboard validation in relevant environment; Description: Fidelity of breadboard technology increases significantly. The basic technological components are integrated with reasonably realistic supporting elements so that the technology can be tested in a simulated environment. Examples include "high fidelity" laboratory integration of components; Hardware: High fidelity breadboard. Functionally equivalent but not necessarily form and/or fit (size, weight, materials, etc.). Should be approaching appropriate scale. May include integration of several components with reasonably realistic support elements/subsystems to demonstrate functionality; Demonstration environment: Lab demonstrating functionality but not form and fit. May include flight demonstrating breadboard in surrogate aircraft. Technology ready for detailed design studies.

Technology readiness level: 6. System/subsystem model or prototype demonstration in a relevant environment; Description: Representative model or prototype system, which is well beyond the breadboard tested for TRL 5, is tested in a relevant environment. Represents a major step up in a technology's demonstrated readiness. Examples include testing a prototype in a high fidelity laboratory environment or in a simulated realistic environment; Hardware: Prototype. Should be very close to form, fit and function. Probably includes the integration of many new components and realistic supporting elements/subsystems if needed to demonstrate full functionality of the subsystem; Demonstration environment: High-fidelity lab demonstration or limited/restricted flight demonstration for a relevant environment. Integration of technology is well defined.

Technology readiness level: 7. System prototype demonstration in a realistic environment; Description: Prototype near or at planned operational system. Represents a major step up from TRL 6, requiring the demonstration of an actual system prototype in a realistic environment, such as in an aircraft, vehicle or space. Examples include testing the prototype in a test bed aircraft; Hardware: Prototype. Should be form, fit and function integrated with other key supporting elements/subsystems to demonstrate full functionality of subsystem; Demonstration environment: Flight demonstration in representative realistic environment such as flying test bed or demonstrator aircraft. Technology is well substantiated with test data.

Technology readiness level: 8. Actual system completed and "flight qualified" through test and demonstration; Description: Technology has been proven to work in its final form and under expected conditions. In almost all cases, this TRL represents the end of true system development. Examples include developmental test and evaluation of the system in its intended weapon system to determine if it meets design specifications; Hardware: Flight qualified hardware; Demonstration environment: Developmental Test and Evaluation (DT&E) in the actual system application.

Technology readiness level: 9. Actual system "flight proven" through successful mission operations; Description: Actual application of the technology in its final form and under mission conditions, such as those encountered in operational test and evaluation. In almost all cases, this is the end of the last "bug fixing" aspects of true system development. Examples include using the system under operational mission conditions; Hardware: Actual system in final form; Demonstration environment: Operational Test and Evaluation (OT&E) in operational mission conditions.

Source: GAO analysis of NASA data.

[End of table]

Appendix V: GAO Contact and Staff Acknowledgments:

GAO Contact: Cristina Chaplain (202) 512-4841 or chaplainc@gao.gov:

Acknowledgments: In addition to the contact named above, Shelby S. Oakley, Assistant Director; Jessica M. Berkholtz; Richard A. Cederholm; Justin D. Dunleavy; Laura Greifner; Kristine R. Hassinger; Caryn E. Kuebler; Jesse Lamarre-Vincent; Kenneth E. Patton; and Roxanna T. Sun made key contributions to this report.

[End of section]

Footnotes:

[1] GAO, NASA: Assessments of Selected Large-Scale Projects, [hyperlink, http://www.gao.gov/products/GAO-09-306SP] (Washington, D.C.: Mar. 2, 2009) and GAO, NASA: Assessments of Selected Large-Scale Projects, [hyperlink, http://www.gao.gov/products/GAO-10-227SP] (Washington, D.C.: Feb. 1, 2010). [2] National Aeronautics and Space Administration Authorization Act of 2005, Pub. L. No. 109-155, § 103; 42 U.S.C. § 16613(b). [3] 42 U.S.C. § 16613(d). [4] See Explanatory Statement accompanying the Omnibus Appropriations Act, 2009, Pub. L. No. 111-8, div. B, tit. III.
[5] Each assessment is presented in a two-page summary that analyzes the project's cost and schedule status and the project challenges we identified, with the objective of identifying risks that, if mitigated, could put NASA in a better position to succeed. [6] Each project we reviewed was in either the formulation phase or the implementation phase of the project life cycle. In the formulation phase, the project defines requirements--what the project is being designed to do--matures technology, establishes a schedule, estimates costs, and produces a plan for implementation. In the implementation phase, the project carries out these plans, performing final design and fabrication as well as testing components and system assembly, integrating these components and testing how they work together, and launching the project. This phase also includes the period from project launch through mission completion. [7] NASA is required to report to Congress if the development cost of a program is likely to exceed the baseline estimate by 15 percent or more, or if a milestone is likely to be delayed by 6 months or more. 42 U.S.C. § 16613(d). [8] GAO, Best Practices: Using a Knowledge-Based Approach to Improve Weapon Acquisition, [hyperlink, http://www.gao.gov/products/GAO-04-386SP] (Washington, D.C.: Jan. 2004). [9] GAO, Defense Acquisitions: Key Decisions to Be Made on Future Combat System, [hyperlink, http://www.gao.gov/products/GAO-07-376] (Washington, D.C.: Mar. 15, 2007); Defense Acquisitions: Improved Business Case Key for Future Combat System's Success, [hyperlink, http://www.gao.gov/products/GAO-06-564T] (Washington, D.C.: Apr. 4, 2006); NASA: Implementing a Knowledge-Based Acquisition Framework Could Lead to Better Investment Decisions and Project Outcomes, [hyperlink, http://www.gao.gov/products/GAO-06-218] (Washington, D.C.: Dec. 21, 2005); NASA's Space Vision: Business Case for Prometheus 1 Needed to Ensure Requirements Match Available Resources, [hyperlink, http://www.gao.gov/products/GAO-05-242] (Washington, D.C.: Feb. 28, 2005). [10] [hyperlink, http://www.gao.gov/products/GAO-05-242]. [11] NASA defines formulation as the identification of how the program or project supports the agency's strategic needs, goals, and objectives; the assessment of feasibility, technology and concepts; risk assessment, team building, development of operations concepts and acquisition strategies; establishment of high-level requirements and success criteria; the preparation of plans, budgets, and schedules essential to the success of a program or project; and the establishment of control systems to ensure performance to those plans and alignment with current agency strategies. NASA Interim Directive (NID) NM 7120-81 for NASA Procedural Requirements (NPR) 7120.5D, paragraph 1.2.1(a) (Sept. 22, 2009) (hereinafter cited as NID for NPR 7120.5D (Sept. 22, 2009)). [12] The implementation phase is defined as the execution of approved plans for the development and operation of the program/project, and the use of control systems to ensure performance to approved plans and continued alignment with the Agency's strategic needs, goals, and objectives. NID for NPR 7120.5D, paragraph 1.2.1(c) (Sept. 22, 2009). [13] According to NID for NPR 7120.5D, Table 2-7 (Sept. 22, 2009), the PDR demonstrates that the preliminary design meets all system requirements with acceptable risk and within the cost and schedule constraints and establishes the basis for proceeding with detailed design.
It shows that the correct design option has been selected, interfaces have been identified, and verification methods have been described. Full baseline cost and schedules, as well as risk assessments, management systems, and metrics are presented. [14] According to NID for NPR 7120.5D, Appendix A (Sept. 22, 2009), a NAR is comprised of the analysis of a proposed program or project by a (non-advocate) team composed of management, technical, and resources experts (personnel) from outside the advocacy chain of the proposed program or project. It provides agency management with an independent assessment of the readiness of the program/project to proceed into implementation. [15] The management baseline is the integrated set of requirements, cost, schedule, technical content, and associated joint confidence level that forms the foundation for program or project execution and reporting done as part of NASA's performance assessment and governance process. NID for NPR 7120.5D, paragraph 2.1.8.2 and Appendix A (Sept. 22, 2009). [16] According to NID for NPR 7120.5D, Table 2-7 (Sept. 22, 2009), the CDR demonstrates that the maturity of the design is appropriate to support proceeding with full scale fabrication, assembly, integration, and test, and that the technical effort is on track to complete the flight and ground system development and mission operations in order to meet mission performance requirements within the identified cost and schedule constraints. Progress against management plans, budget, and schedule, as well as risk assessments are presented. [17] The system integration review evaluates the readiness of the project to start flight system assembly, test, and launch operations. This review takes place after the CDR and just prior to the beginning of phase D, where test and integration activities occur. NID for NPR 7120.5D, Table 2-7 and paragraph 4.6.1 (Sept. 22, 2009). [18] For purposes of our analysis, cost or schedule growth is significant if it exceeds the thresholds that trigger reporting to Congress under the law. The thresholds are development cost growth of 15 percent or more from the baseline cost estimate or a milestone delay of 6 months or more beyond the baseline schedule estimate. 42 U.S.C. § 16613(d). [19] NASA did not provide formal cost and schedule baselines for the projects in formulation, citing that the estimates are preliminary. Baselines are established when the project transitions to implementation. [20] If the development cost of a program will exceed the baseline estimate by more than 30 percent, then NASA is required to seek reauthorization from Congress in order to continue the program. If the program is reauthorized, NASA is required to establish new cost and schedule baselines. 42 U.S.C. § 16613(e). [21] 42 U.S.C. § 16613(e). [22] These 13 projects include 5 projects reviewed this year and 8 projects from our previous reports in this series. For many of these projects, the confirmation baseline was set prior to the requirement for the statutory baseline.
They are of analytical interest because (1) they are or were in the implementation phase, and (2) measuring cost growth from a project's confirmation baseline, not its statutory baseline, allows for a more consistent comparison of project cost growth among NASA's portfolio of projects. [23] The Ares and Orion projects have completed their preliminary design reviews, but have not yet held confirmation reviews. [24] The "product development" stage in GAO's knowledge-based approach is equivalent to "implementation" in NASA's life cycle. [25] NASA Procedural Requirements (NPR) 7123.1A, NASA Systems Engineering Processes and Requirements, Appendix G, paragraph G.19(b) (Mar. 26, 2007). [26] Appendix IV provides a description of the metrics used to assess technology maturity. [27] Projects will modify the form, fit, and function of a heritage technology to adapt to the new environment. For example, the size or the weight of the component may change or the technology may function differently than its use in a previous mission. [28] We were unable to determine design stability for the SOFIA project because some data were not provided to us for review by NASA; according to project officials, the project documentation did not transfer in its entirety from Ames Research Center to Dryden Flight Research Center. In addition, we were unable to determine design stability for the MMS project because it did not provide us with detailed drawing count data. [29] [hyperlink, http://www.gao.gov/products/GAO-06-218] and GAO, NASA: Issues Implementing the NASA Authorization Act of 2010, [hyperlink, http://www.gao.gov/products/GAO-11-216T] (Washington, D.C.: Dec. 1, 2010). [30] NPR 7123.1A, Appendix G, paragraph G.8 (Mar. 26, 2007). [31] For KDP/milestone reviews, external independent reviewers known as Standing Review Board (SRB) members evaluate the program/project and, in the end, report their findings to the decision authority. For a program or project to prepare for the SRB, the technical team must conduct its own internal peer review process. This process typically includes both informal and formal peer reviews at the subsystem and system level. NASA Systems Engineering Handbook, paragraph 6.7.2.1 (Dec. 2007). [32] The National Academies, National Research Council, Controlling Cost Growth of NASA Earth and Space Science Missions (Washington, D.C.: 2010). [33] Pub. L. No. 111-5. [34] GAO, NASA: Constellation Program Cost and Schedule Will Remain Uncertain Until a Sound Business Case is Established, [hyperlink, http://www.gao.gov/products/GAO-09-844] (Washington, D.C.: Aug. 26, 2009). [35] GAO, NASA: Medium Launch Transition Strategy Leverages Ongoing Investments but Is Not Without Risk, [hyperlink, http://www.gao.gov/products/GAO-11-107] (Washington, D.C.: Nov. 22, 2010). [36] NASA provides funding to SpaceX and Orbital to help offset International Space Station-related development costs of the Falcon 9 and the Taurus II, respectively. The Falcon 9 and Taurus II are intended to be medium class launch vehicles. [37] [hyperlink, http://www.gao.gov/products/GAO-11-107]. [38] Government Industry Data Exchange Program, or GIDEP, is a partnership between government agencies and industry to share scientific and technical information through an online, web-enabled database. GIDEP alerts report a problem with parts, components, materials, specifications, software, manufacturing processes, or test equipment that can cause a functional failure.
[39] GAO, High-Risk Series: An Update, [hyperlink, http://www.gao.gov/products/GAO-07-310] (Washington, D.C.: Jan. 2007). [40] National Aeronautics and Space Administration, Plan for Improvement in the GAO High-Risk Area of Contract Management, Oct. 31, 2007. [41] GAO, Additional Cost Transparency and Design Criteria Needed for National Aeronautics and Space Administration (NASA) Projects, [hyperlink, http://www.gao.gov/products/GAO-11-346R] (Washington, D.C.: Mar. 3, 2011). [42] NASA Policy Directive 1000.5A, Policy for NASA Acquisitions, paragraphs 1(h)(1)(a) and 1(h)(2) (Jan. 15, 2009). [43] Seven of the 21 projects were not required to complete the JCL process at the time of our review. [44] American National Standards Institute/Electronic Industries Alliance Standard, Earned Value Management Systems, ANSI/EIA-748-B-2007, approved July 9, 2007. [45] [hyperlink, http://www.gao.gov/products/GAO-11-216T]. [46] Jet Propulsion Laboratory: James Webb Space Telescope (JWST) Independent Comprehensive Review Panel (ICRP): Final Report, JPL D-67250 (Pasadena, Calif.: Oct. 29, 2010). [47] These missions include Ares I, Soil Moisture Active and Passive, and Orion. [48] Because of changes in NASA's accounting structure, its historical cost data are relatively inconsistent. As such, we used "then-year" dollars to report data consistent with the data NASA reported to us. [49] Some projects reported that their spacecraft would be ready for launch sooner than the date that the launch authority could provide actual launch services. In these cases, we used the actual launch date for our analysis rather than the date that the project reported readiness. [50] According to NASA officials, projects that were in formulation at the time of the agency's 2007 revision of its project management policy are required to comply with that policy. Projects that had already entered implementation at the time of the revision were directed to implement those requirements that would not adversely affect the project's cost and schedule baselines. [51] In our calculation of the percentage of the total number of drawings projected for release, we used the number of drawings released at the critical design review as a fraction of the total number of drawings projected, including any growth in drawings that occurred. Thus, the denominator in the calculation may have been larger than what was projected at the critical design review. We believe that this more accurately reflected the design stability of the project. [End of section] GAO's Mission: The Government Accountability Office, the audit, evaluation and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability. Obtaining Copies of GAO Reports and Testimony: The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each weekday, GAO posts newly released reports, testimony, and correspondence on its Web site. To have GAO e-mail you a list of newly posted products every afternoon, go to [hyperlink, http://www.gao.gov] and select "E-mail Updates."
Order by Phone: The price of each GAO publication reflects GAO’s actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO’s Web site, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information. To Report Fraud, Waste, and Abuse in Federal Programs: Contact: Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: E-mail: fraudnet@gao.gov: Automated answering system: (800) 424-5454 or (202) 512-7470: Congressional Relations: Ralph Dawn, Managing Director, dawnr@gao.gov: (202) 512-4400: U.S. Government Accountability Office: 441 G Street NW, Room 7125: Washington, D.C. 20548: Public Affairs: Chuck Young, Managing Director, youngc1@gao.gov: (202) 512-4800: U.S. Government Accountability Office: 441 G Street NW, Room 7149: Washington, D.C. 20548: