This is the accessible text file for GAO report number GAO-11-143 entitled 'Nuclear Waste: DOE Needs a Comprehensive Strategy and Guidance on Computer Models that Support Environmental Cleanup Decisions' which was released on March 11, 2011. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. United States Government Accountability Office: GAO: Report to Congressional Requesters: February 2011: Nuclear Waste: DOE Needs a Comprehensive Strategy and Guidance on Computer Models that Support Environmental Cleanup Decisions: GAO-11-143: GAO Highlights: Highlights of GAO-11-143, a report to congressional requesters. Why GAO Did This Study: The Department of Energy’s (DOE) Office of Environmental Management (EM) is responsible for one of the world’s largest cleanup programs: treatment and disposal of radioactive and hazardous waste created as a by-product of nuclear weapons production and energy research at sites across the country, such as EM’s Hanford Site in Washington State and the Savannah River Site (SRS) in South Carolina. Computer models—which represent physical and biogeochemical processes as mathematical formulas—are one tool EM uses in the cleanups. GAO was asked to (1) describe how EM uses computer models in cleanup decisions; (2) evaluate how EM ensures the quality of its computer models; and (3) assess EM’s overall strategy for managing its computer models. GAO analyzed the use of selected models in decisions at Hanford and SRS, reviewed numerous quality assurance documents, and interviewed DOE officials as well as contractors and regulators. What GAO Found: EM uses computer models to support key cleanup decisions. Because the results of these decisions can cost billions of dollars to implement and take decades to complete, it is crucial that the models are of the highest quality. Computer models provide critical information to EM’s cleanup decision-making process, specifically to: * Analyze the potential effectiveness of cleanup alternatives. For example, computer models at SRS simulate the movement of contaminants through soil and groundwater and provide information used to predict the effectiveness of various cleanup strategies in reducing radioactive and hazardous material contamination. * Assess the likely performance of selected cleanup activities. 
After a particular cleanup strategy is selected, EM uses computer modeling to demonstrate that the selected strategy will be designed, constructed, and operated in a manner that protects workers, the public, and the environment. * Assist in planning and budgeting cleanups. EM also uses computer models to support lifecycle planning, scheduling, and budgeting for its cleanup activities. For example, a Hanford computer model simulates the retrieval and treatment of radioactive waste held in underground tanks and provides information used to project costs and schedules. EM uses general departmental policies and industry standards for ensuring quality, but they are not specific to computer models used in cleanup decisions. EM has not regularly performed periodic quality assurance assessments, as required by DOE policy, to oversee contractors' development and use of cleanup models and the models' associated software. In its review of eight cleanup decisions at Hanford and SRS that used computer modeling as a critical source of information, GAO found that EM conducted required assessments of the quality of computer models in only three cases. In addition, citing flaws in a model EM uses to analyze soil and groundwater contamination, regulators from Washington state have told EM that they will no longer accept the use of this model for chemical exposure analysis at Hanford. EM does not have an overall strategy for managing its computer models. EM has recently begun some efforts to promote consistency in the use of models. For example, it is developing a set of state-of-the-art computer models to support soil and groundwater cleanup decisions across its sites. However, these efforts are still in early stages and are not part of a comprehensive, coordinated effort. Furthermore, although other federal agencies and DOE offices have recognized the importance of comprehensive guidance on the appropriate procedures for managing computer models, EM does not have such overarching guidance. As a result, EM may miss opportunities to improve the quality of computer models, reduce duplication between DOE sites, and share lessons learned across the nuclear weapons complex. What GAO Recommends: GAO recommends that DOE (1) clarify specific quality assurance requirements for computer models used in environmental cleanup decision making; (2) ensure that the models are assessed for compliance with these requirements; and (3) develop a comprehensive strategy and guidance for managing its models. DOE agreed with our recommendations. View [hyperlink, http://www.gao.gov/products/GAO-11-143] or key components. For more information, contact Gene Aloise at (202) 512-3841 or aloisee@gao.gov.
[End of section] Contents: Letter: Background: Computer Models Provide Critical Information to EM's Environmental Cleanup Decision-Making Process: EM Has General Quality Policies for Its Computer Models, but It Has Not Regularly Assessed Contractors' Implementation of Quality Assurance Procedures: EM Does Not Have an Overall Strategy and Guidance for Managing Its Cleanup Models: Conclusions: Recommendations for Executive Action: Agency Comments and Our Evaluation: Appendix I: Scope and Methodology: Appendix II: Functions of Key Models Used in Cleanup Decisions GAO Reviewed at EM's Hanford and Savannah River Sites: Appendix III: Comments from the Department of Energy: Appendix IV: GAO Contact and Staff Acknowledgments: Related GAO Products: Abbreviations: CERCLA: Comprehensive Environmental Response, Compensation, and Liability Act of 1980, as amended: DOE: Department of Energy: EM: Office of Environmental Management: EPA: Environmental Protection Agency: NEPA: National Environmental Policy Act, as amended: SRS: Savannah River Site: [End of section] United States Government Accountability Office: Washington, DC 20548: February 10, 2011: The Honorable Fred Upton: Chairman: The Honorable Joe Barton: Chairman Emeritus: Committee on Energy and Commerce: House of Representatives: The Department of Energy's (DOE) Office of Environmental Management (EM) is responsible for one of the world's largest environmental cleanup programs, the treatment and disposal of radioactive and hazardous waste created as a by-product of producing nuclear weapons and energy research. The largest component of the cleanup mission is the treatment and disposal of millions of gallons of highly radioactive waste stored in aging and leak-prone underground tanks. In addition, radioactive and hazardous contamination has migrated through the soil into the groundwater, posing a significant threat to human health and the environment. EM spends about $6 billion annually to clean up its sites. As of February 2010, DOE estimated that the overall cost to complete the entire cleanup mission will be between $275 billion and $329 billion. Two DOE sites--the Hanford Site in southeastern Washington state and the Savannah River Site (SRS) in South Carolina--account for more than one-half of these annual costs and about 60 percent of the total projected cost of the overall cleanup of nuclear waste at DOE sites. As with nearly all of DOE's missions, the majority of the work at these two sites is performed by private firms under contract with DOE. One tool EM uses to help decide how to clean up this radioactive and hazardous waste is computer simulation modeling--hereafter referred to as computer models--where the behavior of physical and biogeochemical processes are described through the use of mathematical formulas. For example, computer models may be used to simulate a process such as the transport of contamination through the soil and groundwater or to predict how long it will take to empty waste tanks in a certain sequence. The results from these models often contribute to the basis for cleanup decisions that can cost hundreds of millions of dollars to implement. The set of processes used to ensure the quality of computer software and models--known as "quality assurance"--has been a concern in the past. In 2000 and again in 2002, the Defense Nuclear Facilities Safety Board raised concerns that DOE did not have adequate controls to ensure the reliability of software used in nuclear facilities. 
The Board noted that many systems used to maintain safety in nuclear or hazardous facilities, such as ventilation system controls, rely on the smooth operation of software to prevent accidents. Another concern regarding software and modeling was raised at Hanford in 2006, when a DOE headquarters review team found that the absence of quality assurance oversight activities and the lack of formal data validation and verification led to data inaccuracies in modeling used to support the development of an environmental impact statement. These problems prompted DOE to undertake a new modeling effort, delaying the environmental impact statement. In response to your request, this report (1) describes how EM uses computer models in cleanup decisions; (2) evaluates how EM ensures the quality of its computer models; and, (3) assesses EM's overall strategy for managing its computer models. To address these objectives, we gathered and reviewed information on the types of cleanup decisions DOE has made at Hanford and SRS. For each site, we selected examples of three types of decisions that were representative of major decisions DOE has made at these sites between 2002 and 2010-- (1) decisions made under environmental statutes, including the Comprehensive Environmental Response, Compensation, and Liability Act of 1980, as amended (CERCLA)[Footnote 1]--which addresses specific environmental remediation solutions for a cleanup site--and the National Environmental Policy Act (NEPA), as amended[Footnote 2]-- under which DOE evaluates the impacts to human health and the environment of proposed cleanup strategies and possible alternatives; (2) performance assessments under DOE orders governing radioactive waste management; and (3) budgeting and planning decisions for liquid tank waste treatment and disposal. We then selected, based on input from EM officials, the main models that were used to support these decisions at the two sites. We obtained and reviewed documentation on the computer models used and decisions made, and interviewed officials from DOE headquarters to determine how the models were used in these decisions. We analyzed this information to determine how the results of computer models were used in making cleanup decisions and the importance of the results. We also obtained and reviewed documentation showing the standards the models were required to meet, as well as DOE, contractor, and other quality assurance assessments indicating whether these standards were met. We also interviewed officials from the Environmental Protection Agency (EPA), the National Research Council, the Defense Nuclear Facilities Safety Board, and other organizations about existing standards for the use and implementation of computer modeling software and modeling coordination strategies. We visited both Hanford and SRS and spoke with EM officials and contractor staff at both locations to better understand the use of models in planning and cleanup decisions, as well as EM oversight of the models. We focused our review on model standards and the use of models in decisions, not on the quality of the models themselves or of their output. We conducted this performance audit from October 2009 to February 2011, in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. Background: Since the 1940s, DOE and its predecessors have operated a nationwide complex of facilities used to research, design, and manufacture nuclear weapons and related technologies. The environmental legacy of nuclear weapons production at dozens of these sites across the United States includes contaminated buildings, soil, water resources, and large volumes of radioactive and hazardous wastes that require treatment, storage, and disposal. The two sites that account for the majority of the costs of the cleanup effort--Hanford and SRS--were established in the 1940s and 1950s, respectively, to produce plutonium and other nuclear materials needed to manufacture nuclear weapons. EM manages cleanup projects at these and other sites that involve multiple activities to treat and dispose of a wide variety of radioactive and hazardous wastes. Under federal and state laws, EM must clean up radioactive and hazardous substances in accordance with specified standards and regulatory requirements. EM carries out its cleanup activities under the requirements of federal environmental laws that include, among others, CERCLA and NEPA.[Footnote 3] CERCLA requires EM to evaluate the nature and extent of contamination at the sites and determine what cleanup remedies, if any, are necessary to protect human health and the environment into the future. Under NEPA, EM must prepare an environmental impact statement that assesses the environmental effects of a proposed agency action, all reasonable alternatives, and the no-action alternative. Under both the CERCLA and NEPA processes, EM analyzes proposed remedial action alternatives according to established criteria, invites and considers public comment, and prepares a Record of Decision that documents the selected agency action. If the cleanup method selected under CERCLA or NEPA will result in disposal of waste at an on-site disposal facility, EM is then required, under DOE's radioactive waste management order--DOE Order 435.1--to ensure that waste management activities at each disposal facility are designed, constructed, and operated in a manner that protects workers, the public, and the environment.[Footnote 4] EM does this by completing a "performance assessment" of the selected cleanup method.[Footnote 5] To guide the implementation of selected cleanup methods, EM and its contractors may prepare a "system plan" that provides the basis for scheduling cleanup operations and preparing budget requests. For example, both Hanford and SRS have prepared system plans for treating and disposing of liquid radioactive waste stored in aging and leak-prone underground tanks. EM officials at DOE headquarters and field offices oversee cleanup activities at the sites, but the work itself is carried out primarily by private firms contracting with DOE. EM applies different approaches to managing cleanup activities, depending on the type and extent of contamination and state and/or federal requirements with which it needs to comply. In addition, DOE has agreements with state and federal regulators, known as Federal Facility Agreements, to clean up the Hanford and SRS sites.[Footnote 6] The agreements lay out legally binding milestones for completing major steps in the waste treatment and cleanup process.
EPA officials, as well as officials with environmental agencies in the states where EM sites are located, enforce applicable federal and state environmental laws and oversee and advise EM on its cleanup efforts. One tool EM uses in support of cleanup decision analyses is computer modeling. Although the computer models used across EM sites vary, they have certain common characteristics. In general, computer models are based on mathematical formulas that are intended to reflect physical, biogeochemical, mechanical, or thermal processes in simplified ways. For example, a computer model can simulate the movement of contamination through the soil and groundwater or simulate the transfer of high-level radioactive waste from underground storage tanks to facilities where the waste will be treated. Appendix II details the key computer models used in the cleanup decisions we reviewed at Hanford and SRS. Computer Models Provide Critical Information to EM's Environmental Cleanup Decision-Making Process: EM uses computer models to provide critical information for its decision-making process. First, computer models provide information that EM uses to analyze the effectiveness of alternative actions to clean up radioactive waste. Second, once a cleanup strategy has been selected, computer modeling provides information that EM needs to assess the performance of the selected cleanup strategy in reducing risks to human health and the environment. Third, EM uses computer models to simulate operations in the cleanup process, providing the basis for planning cleanup efforts and for making annual budget requests. EM Uses Computer Models to Analyze the Potential Effectiveness of Cleanup Alternatives: EM's decision making for its cleanup efforts is based on meeting federal and state requirements; input from state, local, and regional stakeholders; and other considerations, including the costs of cleanup actions. Computer models provide critical information that EM needs to assess compliance with regulatory requirements when seeking to identify and select alternatives for cleaning up radioactive and hazardous wastes, as well as contaminated soil and groundwater at its sites. EM's cleanup decisions are guided by several federal and state environmental laws, including CERCLA and NEPA, which both set forth processes related to cleanup decisions. In the case of CERCLA, EM determines the nature and extent of the contamination, assesses various cleanup alternatives, and selects the best alternatives according to evaluation criteria that include, among other things, protection of human health and the environment, ease of implementing the alternative, state and community acceptance, and cost. To accomplish these steps, EM uses computer modeling to, among other things, simulate the movement of contaminants through soil and groundwater over many years assuming no cleanup action is taken. Projected contamination levels, migration pathways, and contamination travel timelines are provided by simulations and are evaluated to determine whether regulatory standards will likely be exceeded in the future. If action is needed, then modeling simulations may be conducted for a number of different cleanup alternatives. For example, EM used modeling to assess contamination and the potential effectiveness of various cleanup strategies at SRS's C-Area Burning/Rubble Pit. 
The pit was used during the 1960s to dispose of organic solvents, waste oils, paper, plastics, and rubble, and SRS periodically burned its contents to reduce their volume. Eventually, SRS used the pit for the disposal of inert rubble, finally covering it with two feet of soil in the early 1980s. However, the disposal of these materials and periodic burning resulted in hazardous substance contamination of the surrounding soil and groundwater. Between 1999 and 2004, EM implemented several actions to clean up the majority of the area's contamination. Following these actions, EM used computer models to simulate the movement of the remaining contamination through the soil and groundwater over the next 1,000 years. Information provided by this modeling helped EM to identify the remaining risks to human health and the environment and to identify actions to clean up the remaining contamination. Using this information, in conjunction with other criteria such as additional site data, input from federal and state regulators and the public, and the availability of an appropriate cleanup technology, EM selected a final cleanup remedy. This remedy, which is ongoing and combines several different cleanup technologies, was estimated in 2008 to cost, in present-worth dollars, about $1.9 million over a 70-year period. In implementing CERCLA, DOE focuses on discrete facilities or areas within a site that are being remediated, making limited assessments of cumulative impacts. By contrast, under NEPA, EM generally prepares environmental impact statements that assess the environmental impacts--including cumulative impacts--of a proposed cleanup action, all reasonable alternatives, and taking no action. For example, the environmental impact statement for closing underground liquid radioactive waste tanks at Hanford--which, as of November 2010, was still in draft form--includes an analysis of the potential environmental impact of various options for treating and disposing of about 55 million gallons of mixed radioactive and hazardous waste and closing 149 underground radioactive waste tanks. The draft environmental impact statement includes an analysis of 11 tank waste treatment and closure alternatives,[Footnote 7] including a no-action alternative. These alternatives range in cost from about $3 billion to nearly $252 billion, excluding the costs associated with the final disposal of the treated waste. In the draft environmental impact statement, EM used computer models to simulate the movement of contamination through soil and groundwater over a period of 10,000 years for each of the cleanup alternatives. As with CERCLA modeling, the results of the computer models were used to estimate the remaining risks to human health and the environment following the completion of each cleanup alternative, and these risks were then compared with requirements. The results of these models will be used, along with other information such as input from regulators and the public and the costs of each alternative, when EM selects the alternative it will eventually implement. EM Uses Computer Models to Assess the Performance of Selected Cleanup Activities: After a particular cleanup alternative is selected, EM also uses computer modeling to demonstrate that the cleanup activity will result in reduced future contamination levels that meet regulatory requirements.
If the cleanup method selected under CERCLA or NEPA will result in disposal of waste at an on-site disposal facility, EM is then required, under DOE's radioactive waste management order--DOE Order 435.1--to ensure that waste management activities at each disposal facility are designed, constructed, and operated in a manner that protects workers, the public, and the environment. To meet the requirements of the order, EM completes a "performance assessment"[Footnote 8] of the selected cleanup method. Under the order, this performance assessment is to document that the disposal facility is designed, constructed, and operated in a manner that protects workers, the public, and the environment. The performance assessment also is to project the release of contamination into the soil and groundwater from a site after cleanup and must include calculations of potential chemical doses to members of the public in the future. For example, in March 2010, SRS issued a performance assessment of a cleanup and closure strategy for a group of 20 underground liquid radioactive waste tanks, known as the F-Tank Farm.[Footnote 9] The performance assessment evaluated closing the underground waste tanks and filling them with a cement-like substance called grout--the alternative selected following completion of SRS's 2002 environmental impact statement. Computer modeling was used extensively to prepare this performance assessment. Specifically, computer modeling was performed using two different types of models. The first computer model was used to perform human health and environmental risk calculations and to calculate radiation doses that could be compared to the maximum level allowed by federal and state requirements. The second model was used to analyze sensitivities and uncertainties in the results of the first model. EM Uses Computer Modeling to Help Plan and Budget Cleanups: EM also uses computer models for lifecycle planning, scheduling, and budgeting for its cleanup activities. Computer models provide important information that EM and its contractors use to develop system plans that outline the schedules for cleanup activities at EM sites. Outputs from computer models and databases are used to create tables, charts, and schedules that are published in the system plans and inform annual budget requests for cleanup activities. For example, at Hanford, a computer model known as the Hanford Tank Waste Operations Simulator is designed to track the retrieval and treatment of over 55 million gallons of radioactive waste held in underground storage tanks. According to the most recent Hanford tank waste system plan, which was issued in November 2010, the model projects the chemical and radiological characteristics of batches of waste that are to be sent to a $12.2 billion waste treatment plant that is being built at Hanford to treat this waste. The model also provides scheduling information the contractor uses to project near- and long-term costs and schedules. Similarly, SRS uses a computer model known as SpaceMan Plus™ to support the site's liquid waste system plan, which was issued in January 2010.[Footnote 10] For example, project work schedules for SRS's tank waste program are guided by this model. The model also simulates how the tank farms integrate with waste processing facilities and tracks the movement of waste throughout the liquid waste system. 
Output from the model was used to provide tables and schedules found in the appendixes of SRS's system plan that details the specific cleanup activities that are to be accomplished. These tables and schedules are used as part of the basis for determining the costs of completing those activities. This information, in turn, allows DOE and its contractors to generate annual budget requests. EM Has General Quality Policies for Its Computer Models, but It Has Not Regularly Assessed Contractors' Implementation of Quality Assurance Procedures: Although EM uses general departmental quality assurance policies and standards that apply to computer models and relies on contractors to implement specific procedures that reflect these policies and standards, these policies and standards do not specifically provide guidance on ensuring the quality of the computer models used in cleanup decisions. Moreover, EM officials have not regularly performed periodic quality assurance assessments, as required by DOE policy, to oversee contractors' development and use of cleanup models and the models' associated software. In addition, DOE and others have identified quality assurance problems. For example, the state of Washington has cited flaws in a model EM uses to analyze soil and groundwater contamination and has told EM that it will no longer accept the use of this model for chemical exposure analysis at Hanford. Although EM Has General Quality Assurance Standards, Its Oversight Is Not Sufficient to Ensure the Quality of Cleanup Models: DOE addresses quality through various departmental policies and industry standards;[Footnote 11] however, these policies and standards do not specifically provide guidance on ensuring the quality of the computer models used in cleanup decisions. Specifically, DOE's primary quality assurance policy--DOE Order 414.1C[Footnote 12]--provides general requirements EM and its contractors must meet to ensure all work at the cleanup sites is carried out correctly and effectively, including the development and use of computer models. These requirements include developing a quality assurance program, training staff how to check the quality of their work, and providing for independent assessments of quality. A manual accompanying this order describes acceptable, nonmandatory methods for specifically ensuring quality of "safety software." Safety software is described in the manual as software used to design, manage, or support nuclear facilities. However, the manual is less clear on how to assure quality in computer models. Furthermore, it does not clearly address the use of computer software not considered safety software, such as the software used in computer models that support DOE's cleanup decisions. DOE's quality assurance order also requires contractors to select and comply with an appropriate set of industry standards for all work, including computer modeling. One common set of standards was developed by the American Society of Mechanical Engineers and provides the requirements necessary to ensure safety in nuclear facilities, including the development and validation of computer models and software that is used to design and operate such facilities.[Footnote 13] Initially, the American Society of Mechanical Engineers standards were not mandatory for computer models and software used for cleanup decisions, many of which are considered nonsafety software. These standards were but one of many standards that contractors could choose to use.
However, as of November 2008, EM made the American Society of Mechanical Engineers standards mandatory for all cleanup activities, including modeling. EM's contractors are to implement DOE's quality assurance requirements using specific policies and procedures they develop. The specifics of implementation vary from contractor to contractor. In the case of computer software quality, a contractor is to include procedures for testing and validating the software, ensuring changes to software are properly documented, and correcting any errors. EM allows its contractors to take a "graded approach" to quality procedures for computer software, which means the contractor may adjust the rigor of the quality procedures to match the importance of the software to overall operations. According to documents we reviewed, computer software that controls systems in a nuclear facility, for example, would require more rigorous quality procedures than an administrative payroll system, as any failure in the software controlling a nuclear facility could result in potentially hazardous consequences to workers, the public, and/or the environment. EM is to oversee its contractors' implementation of quality standards for computer models by performing periodic quality assurance assessments, according to DOE's quality assurance order. These quality assurance assessments are intended to ensure that computer models meet DOE and accepted industry quality standards. In our review of eight cleanup decisions at Hanford and SRS, we found EM had conducted only three quality assurance assessments that addressed quality standards for the models used in those decisions. For example, for three of the four decisions we reviewed at SRS, DOE officials at SRS could not provide quality assurance assessments that specifically addressed whether the models used in those decision processes met DOE's quality assurance requirements.[Footnote 14] DOE officials at SRS provided three general quality assurance assessments, but these quality assurance assessments did not specifically look at the cleanup models. In contrast, the models for a March 2010 performance assessment selecting a cleanup strategy to close underground liquid waste tanks at SRS did receive a quality assurance assessment by a DOE headquarters group established to review performance assessment decisions.[Footnote 15] In particular, as part of the review, among other things, the DOE group conducted a quality assurance assessment that evaluated the quality of the computer models used in the performance assessment and the degree to which the models complied with DOE requirements and industry standards. A DOE quality assurance official at SRS noted that the site relies primarily on its contractors to perform quality assurance assessments of computer models and their associated software. Similarly, in our review of four cleanup decisions at Hanford, we found that EM had performed assessments that addressed quality standards for the models used in those decisions in only two cases. In fact, one quality assurance assessment was only undertaken after a contractor discovered data quality errors in 2005 in a computer model used to support a prior environmental impact statement at Hanford. According to a DOE quality assurance manager at Hanford, his office conducts quality assurance assessments primarily on those computer models and the associated software for which the failure would result in significant safety consequences to workers, the public, and/or the environment. 
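The kind of testing and validation called for by these quality assurance procedures can be illustrated with a simple software verification check, in which a numerical routine is compared against a known analytical solution before its output is relied on. The following minimal sketch, written in Python, is a hypothetical illustration only; the routine, parameter values, and tolerance are assumptions made for the example and are not drawn from any DOE or contractor code or procedure.

# Hypothetical verification test: compare a numerical decay routine
# against the exact closed-form solution before trusting its output.
import math

def decay_numerical(c0, half_life, years, dt=0.01):
    """Step dC/dt = -lambda * C forward in time with explicit Euler."""
    lam = math.log(2.0) / half_life
    c = c0
    t = 0.0
    while t < years:
        c -= lam * c * dt
        t += dt
    return c

def decay_exact(c0, half_life, years):
    """Closed-form solution C(t) = C0 * exp(-lambda * t)."""
    lam = math.log(2.0) / half_life
    return c0 * math.exp(-lam * years)

# Verification case (assumed values): a 30-year half-life projected over 100 years.
numerical = decay_numerical(1.0, half_life=30.0, years=100.0)
exact = decay_exact(1.0, half_life=30.0, years=100.0)
relative_error = abs(numerical - exact) / exact

# A quality assurance test would fail the routine if it drifted beyond an accepted tolerance.
assert relative_error < 1e-3, f"verification failed: relative error {relative_error:.2e}"
print(f"numerical={numerical:.6f} exact={exact:.6f} relative error={relative_error:.2e}")

In practice, such verification cases would be documented, assigned tolerances appropriate to the decisions being supported, and rerun whenever the software changes--evidence of the kind that the periodic assessments described above are intended to confirm.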
Some Reviews Have Revealed Quality Assurance Problems: DOE and others have raised concerns that EM does not have complete assurance of the quality of the models. For example: * Citing a number of flaws in a model DOE uses to analyze soil and groundwater contamination at Hanford, the Washington state Department of Ecology told DOE in February 2010 that it would no longer accept the use of this model for chemical exposure analysis at Hanford. For example, Ecology cited previous concerns that the model was not robust enough to capture complexities of the movement of contamination through the subsurface soil. We found that DOE had conducted no specific quality assurance reviews on the model and its associated software. * EM headquarters officials conducted two technical reviews in 2009 of planning models used for tank waste operations at Hanford and SRS.[Footnote 16] The review of the Hanford planning model found that the model had only a limited ability to predict the composition of the contaminated waste as it is prepared for the treatment processes. The review team cautioned that this limitation raised a significant risk that, when actual waste treatment operations started at the site, the waste may not meet the acceptance requirements for processing by Hanford's treatment facility. In addition, the review of SRS's planning model found that, although the data the model provided on tank waste operations were reasonable, the model did not have the ability to optimize operating scenarios, which hampered the site's long-term planning abilities. * A March 2010 independent review commissioned by a Hanford citizens' group raised concerns about a model used in the preparation of a draft environmental impact statement of alternatives for closing Hanford's waste tanks.[Footnote 17] These concerns, based on a review of the draft statement, included insufficient documentation of the quality assurance processes followed for the model and inadequate quantification of modeling uncertainties. The review concluded that the environmental impact statement was insufficiently precise to be used to make a cleanup decision. Where DOE has conducted quality assurance assessments, it has found that contractors did not always implement quality requirements consistently. Furthermore, in their own internal reviews, contractors have noted problems with the implementation of quality assurance requirements. Problems noted in DOE's and contractors' quality assurance assessments include: * Inadequate documentation. A 2007 software quality review conducted by DOE at Hanford found implementation problems, including inadequate documentation and improper training for personnel in quality procedures. At SRS, two general software quality assurance reviews performed by DOE in 2004 found that while contractors generally met quality requirements, documentation was sometimes lacking or improperly prepared. A similar 2007 DOE review at SRS found a good software quality program overall, but listed a number of deficiencies, including inadequate software plans and procedures. * Not following correct procedures. A 2007 DOE review of a Hanford contractor's software quality assurance program found, among other things, that not all contractor personnel fully understood software quality requirements. The report stated that, although software quality assurance training had been provided, personnel did not follow procedures in managing, maintaining, and overseeing software quality.
For example, the report cited an example of a spreadsheet in which data input cells were not properly locked, in violation of procedures. In addition, the report noted that software documentation was not periodically updated, as required, because staff did not fully understand the procedures. * Incorrect quality assurance grading. In some cases, contractors did not always correctly determine the level of rigor needed to ensure the quality of computer models and their associated software. For example, a 2007 internal contractor review at Hanford found that 23 of 138 software codes registered in a central repository were incorrectly designated as nonsafety software, when in fact they should have been considered safety software. As a result, the quality assurance procedures appropriate for a given level of risk may not have always been applied. EM Does Not Have an Overall Strategy and Guidance for Managing Its Cleanup Models: Although EM has recently begun some efforts to promote consistency in the use of models across its various sites, these efforts are still in early stages and, to date, some have had limited involvement of modeling officials at the sites and federal, state, and local stakeholders who are affected by decisions made using the output of computer models. In addition, these efforts are not part of a comprehensive, coordinated effort to improve the management of computer models across EM. In the absence of such a strategy, EM also does not have overarching guidance promoting consistency in modeling management, development, and use across EM's sites. EM Has Some Initiatives to Improve Management of Its Cleanup Models, but They Are Not Part of a Comprehensive, Coordinated Strategy: EM has begun some efforts to improve the use of computer models across its various sites. For example, EM, in fiscal year 2010, began developing a set of state-of-the-art computer models to support soil and groundwater cleanup across the nuclear weapons complex. According to EM officials and documentation they provided, this initiative, called the Advanced Simulation Capability for Environmental Management, will allow EM to provide more sophisticated analysis of soil and groundwater contamination for cleanup decisions. Although the initiative's director told us that the goal is to encourage all sites to use these models for all of their soil and groundwater analysis, he noted that there are no plans to make using these models mandatory. Moreover, SRS has created a forum for improving consistency in groundwater computer modeling performed at the site. According to the charter document, the forum, called the Groundwater Modeling Consistency Team, was formed in 2006 following the discovery of inconsistencies in the data used in groundwater computer modeling conducted at Hanford in support of the preparation of an environmental impact statement under NEPA. The group, which is made up of DOE and contractor officials, reviews software codes, model inputs, and model assumptions to promote sitewide consistency in the management of computer models. Although these efforts may help improve EM's use of computer models, they are largely still in early stages. In addition, according to EM officials, some of these efforts have, to date, had limited involvement of modeling officials at EM's sites and of federal, state, and local stakeholders who are affected by decisions made using the output of computer models. 
Furthermore, they are not part of a comprehensive, coordinated effort to improve the consistency of computer models and reduce duplication across EM's various sites. For example, we found that different models are used to perform similar functions not only between EM sites, but also within sites. At SRS, one contractor uses a set of models to perform soil and groundwater analyses when evaluating the potential effectiveness of cleanup alternatives under CERCLA and NEPA, while another contractor uses a different set of models to perform similar analyses for performance assessments under DOE's radioactive waste management order. Each contractor has its own set of procedures for developing and using each computer model. Officials from both contractors told us that they use different models because state and federal regulators have only approved the use of certain models for specific types of cleanup decisions. Issues with consistency and duplication of effort in the use of computer models have also been noted by others. For example, a February 2010 DOE review noted that five major DOE sites use 28 different models to analyze groundwater and subsurface contamination when preparing performance assessments under DOE's radioactive waste management order. DOE officials told us that past modeling practices have resulted in conflicting assumptions and data sets, as well as different approaches to uncertainty analyses. In addition, a September 2009 DOE technical review of the Hanford tank waste modeling system raised concerns that two models at Hanford that share data use different assumptions that could lead to inconsistencies between the two. As a result, the Hanford waste treatment system plan, which is based on the output of one of these models, may not reflect the most current information. In contrast, other federal agencies and DOE offices have taken steps to improve consistency and reduce duplication as part of a comprehensive, coordinated strategy to manage the use of computer models. For example, EPA organized a Center for Regulatory Environmental Modeling in 2000 as part of a centralized effort to bring consistency to model development, evaluation, and usage across the agency. The Center brings together senior managers, modelers, and scientists from across the agency to address modeling issues. Among its tasks are to help the agency (1) establish and implement criteria so that model-based decisions satisfy regulatory requirements; (2) implement best management practices to use models consistently and appropriately; (3) facilitate information exchange among model developers and users so models can be continuously improved; and (4) prepare for the next generation of environmental models. According to a DOE official, EM does not have a central coordination point similar to EPA's. Within DOE, the Office of Nuclear Energy recently established an initiative--the Nuclear Energy Modeling and Simulation Energy Innovation Hub--that provides a centralized forum for nuclear energy modelers. According to the director of the Office of Nuclear Energy's Office of Advanced Modeling and Simulation, the hub will provide a more centrally coordinated effort to bring together modeling and simulation expertise to address issues associated with the next generation of nuclear reactors. 
Similar comprehensive, coordinated efforts are lacking within EM and, as a result, EM may be losing opportunities to improve the quality of its models, reduce duplication, keep abreast of emerging computer modeling and cleanup technologies, and share lessons learned across EM's sites. Other Federal Agencies and DOE Offices Have Recognized the Need for Comprehensive Modeling Guidance: The need for specific guidance to ensure the careful management of computer models used in decision making is not new. As early as 1976, we reported on the government's use of computer models and found that the lack of guidance contributed to ineffective and inefficient use of computer models.[Footnote 18] We noted that guidance should define the problem to be solved, specify the assumptions and limitations of the model, and provide methods to test whether the model reasonably describes the physical system it is modeling. More recently, a 2007 National Research Council study of modeling at EPA laid out guidelines to improve environmental regulatory computer modeling.[Footnote 19] The study noted that adoption of a comprehensive strategy for evaluating and refining EPA's models could help the agency add credibility to decisions based on modeling results. It also noted several key principles to follow for model development, evaluation, and selection. Moreover, the study recommended that peer review be considered an important tool for improving model quality. According to the study, a peer review should entail not only an evaluation of the model and its output, but also a review of the model's origin and its history. The study also made recommendations on quantifying and communicating uncertainty in model results to better convey a model's limitations to stakeholders affected by decisions made using the results of computer models. EPA has taken action to develop specific guidance, issuing a guide in 2009 addressing the management, development, and use of computer modeling used in making environmental regulatory decisions.[Footnote 20] In this guidance, EPA developed a set of recommended best practices to help modelers effectively use computer models. The guidance defines the role of computer models in the public policy process, discusses appropriate ways of dealing with uncertainty, establishes criteria for peer review, and addresses quality assurance procedures for computer modeling. Even within DOE, another office outside of EM has recognized the need for specific guidance for managing computer models. Specifically, DOE's Office of Civilian Radioactive Waste Management included several requirements for computer models in its quality assurance requirements.[Footnote 21] These requirements included clearly defining the model's objective, documenting alternative models that could be used and the rationales for not using them, and discussing a model's limitations and uncertainties. In addition, the office specified in its requirements that, among other things, a computer model receive a technical review through a peer review or publication in a professional journal. Although the importance of comprehensive guidelines for managing computer models is well established, according to its officials, EM does not have such overarching guidance. As previously discussed, EM does have a manual accompanying its quality assurance order that describes acceptable methods for specifically ensuring the quality of safety software. However, the manual does not generally address models used in cleanup decisions.
EM also has guidance addressing the management of computer models used in conducting performance assessments under its radioactive waste management order. Specifically, a DOE headquarters group that is charged with reviewing decisions made under this order--the Low-Level Waste Disposal Facility Federal Review Group--has developed a manual that contains guidance on, for example, ensuring that input data to computer models are described and are traceable to sources derived from, among other things, field data from the site and referenced literature that is applicable to the site. However, this guidance does not apply to computer models used to analyze the potential effectiveness of cleanup alternatives under CERCLA or NEPA or to computer models used for planning, scheduling, and budgeting purposes. As a result, computer models developed at various DOE sites do not have consistent criteria to define the role of the model in the decision-making process, consistent ways of dealing with uncertainties and a model's limitations, and mechanisms to ensure computer model quality, such as quality assurance assessments and peer review. Conclusions: EM's computer models provide critical information that is needed to make significant decisions about how to clean up the radioactive and hazardous legacy waste across the country. However, EM's oversight of the quality of these models and its management of the development, evaluation, and use of the models has not always been commensurate with the models' importance. Because the decisions EM makes must protect human health and the environment for thousands of years into the future, it is critical that the models on which EM bases its decisions are of the highest quality possible. In addition, because these cleanup efforts will take decades and cost billions of dollars, it is also important that models used for planning, scheduling, and budgeting purposes provide the most accurate data possible for EM and Congress to make informed decisions on cleanup activities. EM's failure to fully oversee its contractors' implementation of quality assurance procedures has led to a reduced level of confidence that the models reasonably represent the conditions they are meant to simulate. In several cases, we found necessary quality assurance reviews were not conducted. In others, reviews found that quality assurance procedures were inadequately implemented. Because existing quality assurance requirements that are applied to EM's computer models have not been adequately implemented and, in some cases, are insufficiently understood by its contractors, EM and its contractors do not have an effective mechanism to provide the public and other EM stakeholders with assurance of a model's quality. To its credit, EM is beginning to undertake efforts to improve the consistency of models across the nuclear weapons complex. However, some of these efforts are still in their infancy, and it remains to be seen whether any improvements in EM's management of its models will result. We recognize that every site has its unique conditions and challenges and that a one-size-fits-all approach to modeling would not be appropriate. Nevertheless, there is room for additional consistency in model development and implementation, as well as a mechanism for sharing lessons learned among DOE's various sites. For a number of years, other federal agencies and offices within DOE have recognized the importance of a comprehensive guidance for managing computer models. 
Without a comprehensive strategy and modeling guidance, EM may miss opportunities to improve the quality of computer models, promote consistency, reduce duplication across DOE sites, and share lessons learned. Recommendations for Executive Action: To help EM increase confidence in the quality of information provided to the public and its stakeholders resulting from the use of computer modeling, we recommend the Secretary of Energy take the following three actions: * Clarify specific quality assurance requirements for computer models used in environmental cleanup decisions, including models used to analyze the potential effectiveness of cleanup alternatives, assess the performance of selected cleanup activities, and assist in planning and budgeting cleanup activities. * Ensure that the models are assessed for compliance with these requirements. * Develop a comprehensive strategy and guidance for the management of computer models to promote consistency, reduce duplication, and ensure sharing of lessons learned. Agency Comments and Our Evaluation: We provided a draft of this report to DOE for its review and comment. In its written comments, DOE agreed with our recommendations and stated that modeling is an important component of management analysis and decision making for the department. DOE noted that it is committed to continuous improvement in model development and application and commented that our recommendations will strengthen its modeling efforts. DOE stated in its comments that it disagreed with the draft report's assertion that its directives and standards fall short for the development and management of computer models. DOE commented that its quality assurance directives apply directly to the development, coding, and validation of safety and nonsafety computer models used in cleanup decisions and that EM has interpreted and applied these directives and accompanying standards to develop its quality program. We agree with DOE, and our draft report noted, that DOE addresses quality through various departmental policies and industry standards. However, these directives do not provide specific guidance to EM on assuring the quality of the cleanup models themselves, guidance that other agencies and offices within DOE have developed. In particular, DOE's primary quality assurance policy--DOE Order 414.1C--addresses general standards that EM and its contractors must meet to ensure all work at its sites is carried out effectively, but is vague on the specific steps that must be followed to ensure the quality of models used in cleanup decisions. In addition, as our draft report noted, a manual accompanying this order describes acceptable, nonmandatory methods for specifically ensuring the quality of safety software. However, the manual is less clear on the use of computer software not considered safety software, such as the software used in computer models that support DOE's cleanup decisions. Our recommendation that DOE clarify the specific quality assurance requirements for computer models used in environmental cleanup decisions is intended to address these problems. DOE's comments also provided additional information on the department's oversight of computer models, initiatives it is undertaking to improve its modeling efforts, and the specific steps it plans to take to address our recommendations. DOE also provided technical comments that we incorporated in the report as appropriate. DOE's written comments are presented in appendix III.
As agreed with your office, unless you publicly announce the contents of this report earlier, we plan no further distribution until 30 days from the report date. At that time, we will send copies of this report to the appropriate congressional committees; the Secretary of Energy; the Director, Office of Management and Budget; and other interested parties. In addition, the report will be available at no charge on the GAO Web site at [hyperlink, http://www.gao.gov]. If you or your staffs have any questions regarding this report, please contact me at (202) 512-3841 or aloisee@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. Key contributors to this report are listed in appendix IV. Signed by: Gene Aloise: Director, Natural Resources and Environment: [End of section] Appendix I: Scope and Methodology: To determine how the Department of Energy's (DOE) Office of Environmental Management (EM) uses computer modeling in cleanup decisions, we focused on cleanup decisions EM has made at its Hanford Site in Washington state and Savannah River Site (SRS) in South Carolina because together these two sites account for more than one-half of EM's annual cleanup spending and approximately 60 percent of the total estimated cost of $275 billion to $329 billion to clean up the entire nuclear weapons complex. We focused our review on decisions made in two major areas that represent the largest and most significant elements of the cleanup program at these two sites. The first is cleanup of radioactive and hazardous waste stored in underground tanks, which DOE has determined poses the most significant environmental safety and health threat in the cleanup program. DOE estimates cleaning up tank waste at the sites will cost between $87 billion and $117 billion, making it the largest cost element of EM's cleanup program. Second, both sites have significant contamination of soil and groundwater, which DOE estimates will cost more than $12 billion to remediate. For each site, we selected three types of decisions that were representative of major decisions made at these sites between 2002 and 2010--(1) decisions made under environmental statutes, including the Comprehensive Environmental Response, Compensation, and Liability Act of 1980, as amended (CERCLA)--which addresses specific environmental remediation solutions for a cleanup site--and the National Environmental Policy Act, as amended (NEPA)--under which DOE evaluates the impacts to human health and the environment of proposed cleanup strategies and possible alternatives; (2) performance assessments under DOE orders governing radioactive waste management; and (3) cleanup budgeting and planning decisions. We reviewed publicly available information from regulators and interviewed DOE officials and contractor staff to identify the most recent decisions for each of the three types of decisions selected for review at each site. We reviewed these decisions to identify the most recent decision that included the use of computer modeling. We then selected, based on input from EM officials, the main models used to support these decisions at the two sites. We visited both Hanford and SRS and spoke with both EM officials and contractor staff there to better understand the use of models in planning and cleanup decisions and DOE's oversight of the models. We obtained demonstrations of these models, as well as information on how they were used in decision making.
We obtained and reviewed the decision documents, as well as modeling studies, notes of meetings held between DOE and its regulators to develop the models, and other documentation showing how the models were used in decisions. We interviewed officials from DOE headquarters and the two sites, as well as contractor staff, to determine how the models work and how they were used in these decisions. We analyzed this information to determine how the results of computer models were used in making cleanup decisions, the importance of modeling in the selection of a cleanup strategy, and other factors that contributed to the selection of a cleanup strategy. To evaluate how EM determines the quality of the computer models used in cleanup decision making, we obtained and reviewed documentation showing the standards the models were required to meet. We gathered documentation on DOE standards, as well as policies and procedures from contractors overseeing the models. We discussed computer model and software standards with EM officials from EM's sites, contractors at the sites, and headquarters officials. We also interviewed officials from the Defense Nuclear Facilities Safety Board, the National Research Council, the Environmental Protection Agency, and the Washington State Department of Ecology about existing standards for the use and implementation of computer modeling and its associated software. We analyzed EM policies and contractor procedures to determine what quality assurance standards exist to address the quality of computer models. We also requested from EM and its contractors all assessments that were conducted on computer models used in the decisions we were reviewing, indicating whether quality standards were met. The assessments we reviewed were largely conducted by the contractors, regulators, or external sources, such as consultants. These reviews ranged from contractor-performed assessments of the implementation of quality standards for software, to federal and state regulator comments on the modeling output used to develop alternatives in a regulatory package, to an outside consultant-performed review of the appropriateness of modeling for selecting a preferred alternative from an environmental impact statement prepared under NEPA. We analyzed these assessments to understand the level of oversight EM provided to assure model and software quality, as well as the extent to which contractors were implementing quality procedures. To address EM's overall strategy for managing computer models that are used in cleanup decisions, we interviewed DOE officials from headquarters and from each site. We also interviewed officials from the Environmental Protection Agency, National Research Council, DOE's Office of Nuclear Energy, and DOE's Office of Civilian Radioactive Waste Management about the implementation of computer modeling guidance and modeling coordination strategies. We reviewed modeling guidance from these organizations, as well as from the Office of Management and Budget. We focused our review on model quality assurance standards and the use of models in decision making, not on the quality of the models themselves or of their output. We conducted this performance audit from October 2009 to February 2011, in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives.
We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. [End of section] Appendix II: Functions of Key Models Used in Cleanup Decisions GAO Reviewed at EM's Hanford and Savannah River Sites: Type of cleanup decision: Record of Decision under Comprehensive Environmental Response, Compensation, and Liability Act of 1980, as amended (CERCLA); Models used at Hanford: RESRAD; Description of how Hanford uses the model in the cleanup decision: Uses one-dimensional, simplified model of contaminant transport from the contaminated zone, through the vadose zone, to the aquifer; Models used at the Savannah River Site: MODFLOW; Description of how the Savannah River Site uses the model in the cleanup decision: Approximates groundwater flow in a three-dimensional grid. Used to estimate groundwater concentrations for contaminants over time. Type of cleanup decision: Record of Decision under Comprehensive Environmental Response, Compensation, and Liability Act of 1980, as amended (CERCLA); Models used at Hanford: STOMP; Description of how Hanford uses the model in the cleanup decision: Used with RESRAD in performing contaminant transport-to-groundwater evaluations; Models used at the Savannah River Site: SEASOIL; Description of how the Savannah River Site uses the model in the cleanup decision: Simulates vertical transport of contaminants from source, through the vadose zone, to the water table aquifer. Type of cleanup decision: Environmental Impact Statement/Record of Decision under National Environmental Policy Act (NEPA); Models used at Hanford: MODFLOW; Description of how Hanford uses the model in the cleanup decision: Simulates the groundwater flow field in three dimensions--two horizontal and one vertical--and contaminant transport from points of contact with groundwater at various times to various locations; Models used at the Savannah River Site: MEPAS; Description of how the Savannah River Site uses the model in the cleanup decision: Simulates fluid flow and contaminant transport in a three-dimensional grid in the vadose zone and the saturated zone. Transport results used to calculate groundwater concentrations for multiple contaminants over time. Type of cleanup decision: Environmental Impact Statement/Record of Decision under National Environmental Policy Act (NEPA); Models used at Hanford: STOMP; Description of how Hanford uses the model in the cleanup decision: Simulates three-dimensional, nonlinear water and contaminant transport through the vadose zone over time; Description of how the Savannah River Site uses the model in the cleanup decision: Simulates fluid flow and contaminant transport in a three-dimensional grid in the vadose zone and the saturated zone. Transport results used to calculate groundwater concentrations for multiple contaminants over time. Type of cleanup decision: Environmental Impact Statement/Record of Decision under National Environmental Policy Act (NEPA); Models used at Hanford: HTWOS; Description of how Hanford uses the model in the cleanup decision: Provided assumptions that were used in the Hanford Environmental Impact Statement as the basis for the number and location of waste receiver facilities; Description of how the Savannah River Site uses the model in the cleanup decision: Simulates fluid flow and contaminant transport in a three-dimensional grid in the vadose zone and the saturated zone. Transport results used to calculate groundwater concentrations for multiple contaminants over time. Type of cleanup decision: Performance Assessment under DOE's Radioactive Waste Management Order--DOE Order 435.1; Models used at Hanford: DMT; Description of how Hanford uses the model in the cleanup decision: A graphical interface model that uses STOMP modeling output to graphically display risk results. Used when calculating groundwater concentrations of selected contaminants, predicting risk, and comparing to regulatory criteria; Models used at the Savannah River Site: PORFLOW; Description of how the Savannah River Site uses the model in the cleanup decision: Used to calculate radiological doses and perform radiological and human health and ecological risk evaluation. Type of cleanup decision: Performance Assessment under DOE's Radioactive Waste Management Order--DOE Order 435.1; Models used at Hanford: STOMP; Description of how Hanford uses the model in the cleanup decision: Modeled flow and transport of contaminants through the vadose zone and groundwater; Provided inventory estimates at tank closures for tank residue, as well as the concentration of radionuclides and hazardous chemicals in tank retrieval solutions; Models used at the Savannah River Site: GoldSim; Description of how the Savannah River Site uses the model in the cleanup decision: Used with PORFLOW to assist in developing uncertainty and sensitivity analysis. Also used to calculate radiological doses using either concentration results from PORFLOW or GoldSim. Type of cleanup decision: Planning and budgeting; Models used at Hanford: HTWOS; Description of how Hanford uses the model in the cleanup decision: Simulates the movement of contaminated waste stored in underground tanks as it is retrieved, prepared for treatment, and processed through Hanford's under-construction waste treatment plant; Models used at the Savannah River Site: SpaceMan Plus™; Description of how the Savannah River Site uses the model in the cleanup decision: Simulates the operation of the process in the liquid tank waste system, from waste retrieval to waste processing, through the site's waste processing facilities. Source: GAO analysis of information from DOE. [End of table]
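The soil and groundwater models listed in the table differ in dimensionality and level of detail, but most rest on the same underlying physics: advection and dispersion of dissolved contaminants moving through the vadose zone and the aquifer. The short sketch below, written in Python for illustration only, shows a minimal one-dimensional finite-difference version of that calculation; it is not drawn from RESRAD, STOMP, MODFLOW, or any other code named above, and all parameter values are hypothetical.

# Illustrative one-dimensional advection-dispersion calculation.
# Generic sketch of the kind of contaminant-transport arithmetic the
# models in the table perform in far greater detail; not any named code.
def simulate_plume(length_m=100.0, nx=101, velocity=0.05, dispersion=0.5,
                   dt=0.5, steps=2000):
    """Explicit finite-difference solution of
    dC/dt = D * d2C/dx2 - v * dC/dx, with upwind advection.
    velocity in m/day, dispersion in m^2/day, dt in days (hypothetical)."""
    dx = length_m / (nx - 1)
    conc = [0.0] * nx
    conc[0] = 1.0                      # constant source at the upgradient boundary
    for _ in range(steps):
        new = conc[:]
        for i in range(1, nx - 1):
            diffusion = dispersion * (conc[i + 1] - 2.0 * conc[i] + conc[i - 1]) / dx ** 2
            advection = velocity * (conc[i] - conc[i - 1]) / dx
            new[i] = conc[i] + dt * (diffusion - advection)
        new[0] = 1.0                   # hold the source concentration fixed
        new[-1] = new[-2]              # simple outflow boundary
        conc = new
    return conc

profile = simulate_plume()
print("Relative concentration 50 m downgradient: %.3f" % profile[50])

Production codes such as those in the table solve the same mass balance in two or three dimensions, over heterogeneous geology, and for many contaminants at once, which is why the quality assurance applied to them matters to the cleanup decisions they support.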
[End of section] Appendix III: Comments from the Department of Energy: Department of Energy: Washington, DC 20585: January 19, 2011: Mr. Gene Aloise: Director, Natural Resources and Environment: U.S. Government Accountability Office: Washington, DC 20548: Dear Mr. Aloise: Thank you for the opportunity to review and comment on the draft report on the Department of Energy's (DOE) Office of Environmental Management (EM) modeling program, "Nuclear Waste: DOE Needs a Comprehensive Strategy and Guidance on Computer Models that Support Environmental Cleanup Decisions." Modeling is an important component of management analysis and decision making for the Department's highly complex and varied cleanup activities. As such, EM is committed to a process of continuous improvement in both model development and application by both Federal and contractor employees. The draft U.S. Government Accountability Office (GAO) report recognizes EM's adherence to the Department's directives and accompanying industry standards for the development of its corporate quality program. However, we disagree with the report's assertion that these directives and standards fall short for the development and management of computer models.
DOE quality assurance directives (DOE O 414.1C, 10 CFR 830, and ASME NQA-1) apply directly to development, coding, and validation of safety and non-safety computer models used in cleanup decisions. EM has interpreted and applied these directives and accompanying standards to develop its corporate quality program, which includes Software Quality Assurance and therefore computer model development. We agree with GAO that clarification of specific quality assurance requirements for computer models Department-wide may be needed. Also, EM recognizes that there may be instances where oversight of contractor development and use of computer models can fall short of quality program expectations. However, EM continuously makes use of its oversight, corrective action, and lessons learned management systems to correct these deficiencies. The identification of specific implementation issues within a quality program does not necessarily indicate the program is not effective and/or functional. EM shared a description of its oversight, corrective action, and lessons learned management systems prior to issuance of GAO's draft report. Regarding the selection of models used in cleanup decision making, DOE must adhere to compliance agreements and has a long history of collaborating with its environmental regulators at the Federal and state level, other government agencies, tribal nations, and local government and stakeholders in environmental cleanup decision making. This includes determining and applying appropriate models and considering other criteria as mandated by regulation and state-of-the-art practices in selecting cleanup remedies. These processes have been compliant with applicable laws and regulations, and the results have been noteworthy—the Comprehensive Environmental Response, Compensation, and Liability Act-mandated Five-Year Reviews and DOE Order 435.1 Radioactive Waste Management-required Annual Summaries conclude that implemented remedies and performance of low-level radioactive waste disposal facilities at DOE EM sites are protective of human health and the environment. As noted in your report, EM has begun several initiatives to improve our modeling efforts. Notable among these is the Advanced Simulation Capability for Environmental Management effort. As discussed with the GAO, this effort aims at developing improved simulation capabilities for consistent applications to EM cleanup work across the complex. The team developing this software consists of the world's foremost nuclear waste scientists and high performance computing modelers, and includes participants from eight U.S. National Laboratories. A second initiative that EM has completed is the purchase and distribution of 150 copies of three (3) software-related national consensus standards from the American Nuclear Society. DOE hosted a Workshop ("Waste Processing Models: Material Properties Standards and Software V&V Training Workshop," November 30-December 1, 2010) where the standards were distributed and modelers/software developers received training on their application. With respect to Information Technology (IT) and software management, the Principal Deputy Assistant Secretary for EM issued a memorandum to EM Headquarters and Field leaders on August 13, 2009, subject, "Information Technology Investments in the Office of Environmental Management."
This memorandum established that all requests or requirements for IT investments must be submitted to the Director, Office of Corporate Information Technology, for review and approval prior to initiating, planning, or implementation activities. While this direction was not specific to computer modeling technologies, they were clearly included in the scope given that they are IT investments. Further, this policy document was shared with GAO prior to the issuance of GAO's draft report, and it is not acknowledged in the draft. Additionally, EM Headquarters developed and implemented an IT Governance process in 2010 to enable EM to more effectively manage IT with the goals of reducing duplication between EM sites to achieve maximum cost efficiency, promoting consistency, and sharing lessons learned across the nuclear weapons complex to enable our cleanup mission. We believe that the recommendations made in the draft GAO report will strengthen our modeling efforts. Provided in Enclosure 1 is our response to the GAO recommendations, including our proposed path forward. We are also submitting specific comments in Enclosure 2 that provide clarification to technical and factual information for your consideration in preparing your final report. If you have any questions, please contact Ms. Yvette T. Collazo, Director, Office of Technology Innovation and Development, at (202) 586-5280. Sincerely, Signed by: Ines R. Triay: Assistant Secretary for Environmental Management: 2 Enclosures: [End of letter] Enclosure 1: U.S. Department of Energy: Office of Environmental Management: Response to GAO Recommendations for Executive Action: GAO-11-143 - "Nuclear Waste: DOE Needs a Comprehensive Strategy and Guidance on Computer Models that Support Environmental Cleanup Decisions" Recommendations for Executive Action (Page 23 of Draft GAO-11-143): To help the Office of Environmental Management (EM) increase confidence in the quality of information provided to the public and its stakeholders resulting from the use of computer modeling, we recommend the Secretary of Energy take the following three actions: * Clarify specific quality assurance requirements for computer models used to analyze the potential effectiveness of cleanup alternatives, assess the performance of selected cleanup activities, and assist in planning and budgeting cleanups. * Ensure that the models are assessed for compliance with these requirements. * Develop a comprehensive strategy and guidance for the management of computer models to promote consistency, reduce duplication, and ensure sharing of lessons learned. Recommendation 1: Clarify specific quality assurance requirements for computer models used to analyze the potential effectiveness of cleanup alternatives, assess the performance of selected cleanup activities, and assist in planning and budgeting cleanups. Concur. Resolution of this issue will require EM to coordinate with additional program offices within the Department because EM is not charged with writing software quality assurance (SQA) policy. The two technical areas that are of concern regarding model quality assurance are: (1) Information Technology (IT) project management; and (2) SQA for both nuclear and non-nuclear facilities and applications. The technical leads for each activity within DOE are described below. 1.
Software and IT Systems Project Management: Within EM, the Office of Corporate Information Technology has the primary responsibility to ensure that EM IT is acquired and that information resources are managed in a manner consistent with statutory, regulatory, and Departmental requirements and priorities for IT systems. The EM Office of Corporate Information Technology works jointly with the DOE Chief Information Officer (CIO) in the facilitation of DOE Orders and OMB requirements for IT, so that information resources can be utilized effectively and efficiently. EM is committed to working with the DOE CIO to make sure that IT project management requirements are clear, communicated to EM field sites, and implemented in an appropriate manner by EM Headquarters and field personnel. The DOE CIO has developed an Information Technology Project Guide (Guide 413.3-14) which defines guidelines for project implementation. In addition, the EM Office of Corporate Information Technology has created a separate EM IT Projects Guide that is specific to EM which incorporates all the requirements from the DOE CIO IT Project Guide. 2. Software Quality Assurance: The responsible DOE Program Office is the Office of Health, Safety and Security (HSS), Office of Quality Assurance Policy and Assistance. From the HSS website [hyperlink, http://www.hss.doe.gov/nuclearsafety/hs23.html], the mission of the Office of Quality Assurance Policy and Assistance is as follows: "The Office of Quality Assurance Policy and Assistance establishes and maintains the quality assurance (QA) policies, requirements and guidance for the Department and serves as DOE's corporate resource to ensure that products and services meet or exceed the Department's quality objectives. The Office provides assistance to Departmental elements and contractors in the interpretation and implementation of DOE quality assurance requirements and in the resolution of QA-related issues." The EM Office of Standards and Quality Assurance will work with DOE HSS to make sure software QA policy/guidance is clear. Where policy/guidance is unclear, or does not exist, EM is committed to working with DOE HSS to clarify QA requirements and communicate this guidance to EM Headquarters and field sites. Recommendation 2: Ensure that the models are assessed for compliance with these requirements. Concur. EM is committed to ensuring that models developed in the field and at EM Headquarters comply with the directives of DOE, and relevant national consensus standards. The EM Office of Standards and Quality Assurance and Office of Corporate Information Technology will work closely with DOE HSS and DOE CIO to assure that EM is compliant with Departmental directives. In addition, EM will review and, where needed, develop additional SQA oversight criteria to ensure computer models that have been or are to be developed within EM comply with Departmental directives and are implemented appropriately at all DOE EM facilities. Recommendation 3: Develop a comprehensive strategy and guidance for the management of computer models to promote consistency, reduce duplication, and ensure sharing of lessons learned. Concur. The EM Office of Corporate Information Technology will follow up to ensure that every computer modeling IT investment is documented through the OMB-required Capital Planning and Investment Control process across the EM complex.
We plan to conduct a survey to ensure that all the modeling tools in use in EM are included, including those referenced in this report, by July 31, 2011. We will also reissue the August 13, 2009, memorandum on Information Technology Investments in EM by February 28, 2011. EM will also process the current slate of computer models through our IT Governance process with the goal of streamlining, where appropriate, by December 30, 2011. [End of section] Appendix IV: GAO Contact and Staff Acknowledgments: GAO Contact: Gene Aloise, (202) 512-3841 or aloisee@gao.gov: Staff Acknowledgments: In addition to the individual named above, Ryan T. Coles, Assistant Director; Ivelisse Aviles; Mark Braza; Dan Feehan; Nancy Kintner-Meyer; Jonathan Kucskar; Mehrzad Nadji; Kathryn Pedalino; Thomas C. Perry; and Benjamin Shouse made key contributions to this report. [End of section] Related GAO Products: Nuclear Waste: Actions Needed to Address Persistent Concerns with Efforts to Close Underground Radioactive Waste Tanks at DOE's Savannah River Site. [hyperlink, http://www.gao.gov/products/GAO-10-816]. Washington, D.C.: September 14, 2010. Recovery Act: Most DOE Cleanup Projects Appear to Be Meeting Cost and Schedule Targets, but Assessing Impact of Spending Remains a Challenge. [hyperlink, http://www.gao.gov/products/GAO-10-784]. Washington, D.C.: July 29, 2010. Department of Energy: Actions Needed to Develop High-Quality Cost Estimates for Construction and Environmental Cleanup Projects. [hyperlink, http://www.gao.gov/products/GAO-10-199]. Washington, D.C.: January 14, 2010. Nuclear Waste: Uncertainties and Questions about Costs and Risks Persist with DOE's Tank Waste Cleanup Strategy at Hanford. [hyperlink, http://www.gao.gov/products/GAO-09-913]. Washington, D.C.: September 30, 2009. Department of Energy: Contract and Project Management Concerns at the National Nuclear Security Administration and Office of Environmental Management. [hyperlink, http://www.gao.gov/products/GAO-09-406T]. Washington, D.C.: March 4, 2009. Nuclear Waste: DOE Lacks Critical Information Needed to Assess Its Tank Management Strategy at Hanford. [hyperlink, http://www.gao.gov/products/GAO-08-793]. Washington, D.C.: June 30, 2008. Hanford Waste Treatment Plant: Department of Energy Needs to Strengthen Controls over Contractor Payments and Project Assets. [hyperlink, http://www.gao.gov/products/GAO-07-888]. Washington, D.C.: July 20, 2007. Nuclear Waste: DOE Should Reassess Whether the Bulk Vitrification Demonstration Project at Its Hanford Site Is Still Needed to Treat Radioactive Waste. [hyperlink, http://www.gao.gov/products/GAO-07-762]. Washington, D.C.: June 12, 2007. Hanford Waste Treatment Plant: Contractor and DOE Management Problems Have Led to Higher Costs, Construction Delays, and Safety Concerns. [hyperlink, http://www.gao.gov/products/GAO-06-602T]. Washington, D.C.: April 6, 2006. Nuclear Waste: Absence of Key Management Reforms on Hanford's Cleanup Project Adds to Challenges of Achieving Cost and Schedule Goals. [hyperlink, http://www.gao.gov/products/GAO-04-611]. Washington, D.C.: June 9, 2004. Nuclear Waste: Challenges to Achieving Potential Savings in DOE's High-Level Waste Cleanup Program. [hyperlink, http://www.gao.gov/products/GAO-03-593]. Washington, D.C.: June 17, 2003. Nuclear Waste: Department of Energy's Hanford Tank Waste Project--Schedule, Cost, and Management Issues. [hyperlink, http://www.gao.gov/products/GAO-RCED-99-13]. Washington, D.C.: October 8, 1998. [End of section] Footnotes: [1] 42 U.S.C.
§ 9601 et seq. [2] 42 U.S.C. § 4321 et seq. [3] EM cleanup activities are also subject to the requirements of the act commonly known as the Resource Conservation and Recovery Act (42 U.S.C. § 6901 et seq.). Decisions made under this act were not assessed in this report. [4] DOE, Radioactive Waste Management, DOE O 435.1 (Washington, D.C., July 9, 1999). [5] To meet the requirements of DOE O 435.1, DOE completes performance assessments and composite analyses. Performance assessments are required for specific waste management decisions, while composite analyses are performed to evaluate the cumulative impacts of waste management and cleanup actions at a DOE site. Both serve to provide a reasonable expectation that human health and environmental protection performance objectives will be met. [6] Among the cleanup activities Hanford and SRS must address are the treatment and disposal of millions of gallons of highly radioactive waste stored in aging and leak-prone underground tanks and removal, immobilization, or monitoring of radioactive and hazardous contamination that has migrated through the soil into the groundwater, posing a threat to human health and the environment. Other activities include tearing down buildings and removing and disposing of contaminated soil. [7] DOE, Draft Tank Closure and Waste Management Environmental Impact Statement for the Hanford Site, DOE/EIS-0391 (Washington, D.C., October 2009). The draft environmental impact statement is scoped to evaluate the Fast Flux Test Facility, Waste Management, and Tank Closure, and includes analysis of several alternatives for tank closure that include, for example, emptying and removing the tanks from the ground; or emptying the tanks, leaving the tanks in the ground, and filling them with grout or other material. [8] DOE, Radioactive Waste Management, DOE O 435.1 (Washington, D.C., July 9, 1999). [9] Savannah River Remediation, LLC, "Performance Assessment for the F- Tank Farm at the Savannah River Site," prepared for DOE under Contract No. DE-AC09-09SR22505, SRS-REG-2007-00002 (Aiken, S.C., Mar. 31, 2010). A tank farm is a group of tanks buried side by side in the ground. In addition to the tanks themselves, tank farms also contain equipment such as lines and pumps for transferring waste between tanks, equipment for monitoring heat and chemical reactions inside the tanks, instruments to measure temperature and tank waste levels, and other support facilities. Although SRS's F-Tank Farm originally contained 22 underground liquid radioactive waste tanks, 2 of these tanks have already been closed. [10] Savannah River Remediation, LLC, "Liquid Waste System Plan, Revision 15," prepared for DOE under Contract No. DE-AC09-09SR22505 (Aiken, S.C., Jan. 11, 2010). [11] DOE EM Headquarters imposes quality assurance through its Corporate Quality Assurance Program which is, according to DOE, based on law, DOE directives, national consensus standards, and EM quality management expectations. The program allows for a graded approach to quality assurance, specifying additional requirements for software that relates to nuclear safety. [12] DOE, Quality Assurance, DOE Order 414.1C (Washington, D.C., June 17, 2005). DOE Order 414.1 was first approved in November 1998. Although some of the modeling we reviewed was performed as far back as the early 2000s, DOE Order 414.1 was first approved in 1998 and applied to that modeling. In addition to DOE Order 414.1C, EM's quality assurance program is derived from 10 C.F.R. 
§ 830 and EM quality management expectations. DOE refers to its system of quality assurance policies and orders as "directives." DOE generally imposes its quality directives on contractors by inclusion in contracts. [13] American Society of Mechanical Engineers, "Quality Assurance Requirements for Nuclear Facility Applications," NQA-1-2000, (New York, N.Y., May 2001). [14] The types of assessments that DOE provided ranged from EPA and state regulator comments on draft environmental impact statements, to internal quality assessment reviews conducted by contractors, to general quality assurance reviews that DOE conducted of individual contractors. [15] Savannah River Remediation, LLC, "Performance Assessment for the F-Tank Farm at the Savannah River Site," prepared for DOE under Contract No. DE-AC09-09SR22505, SRS-REG-2007-00002 (Aiken, S.C., Mar. 31, 2010). [16] DOE, External Technical Review for Evaluation of System Level Modeling and Simulation Tools in Support of SRS Liquid Waste Process (June 2009) and DOE, External Technical Review for Evaluation of System Level Modeling and Simulation Tools in Support of Hanford Site Liquid Waste Process (September 2009). [17] KD Auclair & Associates, LLC, Independent Review of the Draft Tank Closure and Waste Management Environmental Impact Statement (Benton City, Wash., March 2010). [18] GAO, Ways to Improve Management of Federally Funded Computerized Models, [hyperlink, http://www.gao.gov/products/LCD-75-111] (Washington, D.C.: Aug. 25, 1976). [19] National Research Council, Models in Environmental Regulatory Decision Making, (Washington, D.C., 2007). [20] EPA, Guidance on the Development, Evaluation, and Application of Environmental Models, EPA/100/K-09/003 (Washington, D.C., March 2009). [21] DOE's Office of Civilian Radioactive Waste Management was terminated on September 30, 2010. [End of section] GAO's Mission: The Government Accountability Office, the audit, evaluation and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability. Obtaining Copies of GAO Reports and Testimony: The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each weekday, GAO posts newly released reports, testimony, and correspondence on its Web site. To have GAO e-mail you a list of newly posted products every afternoon, go to [hyperlink, http://www.gao.gov] and select "E-mail Updates." Order by Phone: The price of each GAO publication reflects GAO’s actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO’s Web site, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information. 
To Report Fraud, Waste, and Abuse in Federal Programs: Contact: Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: E-mail: fraudnet@gao.gov: Automated answering system: (800) 424-5454 or (202) 512-7470: Congressional Relations: Ralph Dawn, Managing Director, dawnr@gao.gov: (202) 512-4400: U.S. Government Accountability Office: 441 G Street NW, Room 7125: Washington, D.C. 20548: Public Affairs: Chuck Young, Managing Director, youngc1@gao.gov: (202) 512-4800: U.S. Government Accountability Office: 441 G Street NW, Room 7149: Washington, D.C. 20548: