This is the accessible text file for GAO report number GAO-13-256 entitled 'Combating Nuclear Smuggling: Lessons Learned from Cancelled Radiation Portal Monitor Program Could Help Future Acquisitions,' which was released on June 11, 2013. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer-term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. United States Government Accountability Office: GAO: Report to Congressional Requesters: May 2013: Combating Nuclear Smuggling: Lessons Learned from Cancelled Radiation Portal Monitor Program Could Help Future Acquisitions: GAO-13-256: GAO Highlights: Highlights of GAO-13-256, a report to congressional requesters. Why GAO Did This Study: Preventing terrorists from smuggling radiological or nuclear material into the United States to carry out an attack is a national priority. DHS's DNDO develops and deploys radiation detection equipment to assist other federal agencies, such as CBP, in intercepting illicit radiological or nuclear materials that could be used to make a radiological dispersal device (dirty bomb) or a crude nuclear bomb. CBP uses RPMs at nearly all land border crossings and seaports to detect radiation in trucks and cargo. DHS recently canceled the acquisition of ASPs, which was originally envisioned as costing from $2 billion to $3 billion. GAO was asked to provide updated information on the ASP program. This report examines, among other things, (1) the results of ASP testing conducted in 2009 and 2010 that led to DHS's decision to cancel the ASP program and (2) the benefits of lessons learned reviews and how DHS captures any lessons learned when programs are canceled. GAO reviewed testing and acquisition documents and interviewed key agency officials, as well as seven experts the National Academies identified for their knowledge of leading practices in large-scale engineering and acquisition programs. What GAO Found: The advanced spectroscopic portal monitor (ASP)—a next-generation radiation portal monitor (RPM) for screening trucks and cargo containers—did not pass field validation tests conducted in 2009 and 2010. The Department of Homeland Security's (DHS) Domestic Nuclear Detection Office (DNDO) intended to replace many currently deployed RPMs and handheld radiation detectors used by U.S. Customs and Border Protection (CBP) with ASPs. However, in the tests, ASP did not meet key requirements to detect radiation and identify its source.
For example, ASP triggered too many false alarms from benign, naturally occurring radioactive material in common items such as kitty litter and granite, and it sometimes would not turn on or continue operating long enough to complete a day of testing. In addition, GAO’s review identified analytical weaknesses related to the testing and program cancellation, including inconsistencies in DNDO’s analysis of the settings used for testing the ASP. The final field validation test was conducted in November 2010, and the Secretary of Homeland Security notified Congress of her decision to cancel the program in October 2011. Conducting lessons learned reviews when programs are canceled benefits organizations by identifying things that worked well and did not work well in order to improve future acquisitions programs, according to experts GAO consulted. However, DHS does not have processes in place to ensure such reviews are conducted or that the results are disseminated. Experts identified by the National Academies told GAO that lessons learned reviews help identify reasons why programs were canceled. The experts also said lessons learned reviews should be required and conducted promptly, and the results should be disseminated. At the direction of DHS management, DNDO reviewed the ASP program and submitted and disseminated a lessons learned report in November 2012. (See timeline below.) This report cited 32 lessons learned including having program officials work closely with end users to ensure equipment meets operational requirements. DHS guidance calls for lessons learned reviews immediately after programs are canceled and states that the lessons learned are to be shared throughout the department, but this guidance is not a requirement. Before DHS’s directive, there was confusion about whether a lessons learned review was needed for the ASP program, and DNDO officials did not intend to conduct such a review. Moreover, DHS officials were unable to provide examples of previous lessons learned reports from other canceled programs. DHS officials also said they have no process for disseminating such reports but are planning one. Figure: Events from Final Test of ASP to Lessons Learned Report: [Refer to PDF for image: timeline] November 2010: Third and final field validation test concludes. October 2011: Secretary of Homeland Security notifies Congress of her decision to cancel ASP program. July 2012: DHS’s Under Secretary for Management directs lessons learned review. November 2012: Lessons learned report. Source: GAO analysis of ASP program documents. [End of figure] What GAO Recommends: DHS should require lessons learned reviews and develop processes to ensure such reviews are done in a timely manner and the results disseminated throughout the department. DHS agreed with all of GAO’s recommendations and has planned and taken some actions to address them. View [hyperlink, http://www.gao.gov/products/GAO-13-256]. For more information, contact David C. Trimble at (202) 512-3841 or trimbled@gao.gov or Dr. Timothy M. Persons at (202) 512-6412 or personst@gao.gov. 
[End of section] Contents: Letter: Background: Test Results Show That ASP Did Not Meet Criteria to Pass Field Validation Testing, Leading DHS to Cancel the Program: Lessons Learned Reviews Improve Future Acquisition Efforts, but DHS Does Not Have Processes Ensuring Such Reviews: DHS Is Part of International Tests of Next-Generation RPMs and Is Working with States to Gather ASP Data: Conclusions: Recommendations for Executive Action: Agency Comments and Our Evaluation: Appendix I: Experts Identified by the National Academies Who Described Leading Practices for Reviewing Cancelled Acquisition Programs: Appendix II: DNDO-CBP ASP Lessons Learned Report: Appendix III: Comments from the Department of Homeland Security: Appendix IV: GAO Contacts and Staff Acknowledgments: Related GAO Products: Tables: Table 1: Criteria for Demonstrating a Significant Increase in Operational Effectiveness: Table 2: Summary of Results for Field Validation Testing: Figures: Figure 1: Trucks Passing through Radiation Portal Monitors: Figure 2: CBP Officer Screening a Truck with a Handheld Detector That Can Identify Sources of Radiation: Figure 3: Timeline of DHS Studies and Tests of Next-Generation RPMs, Including ASP: Abbreviations: ANSI: American National Standards Institute: ASP: advanced spectroscopic portal monitor: CBP: Customs and Border Protection: DHS: Department of Homeland Security: DNDO: Domestic Nuclear Detection Office: DOE: Department of Energy: EU: European Union: IEC: International Electrotechnical Commission: ITRAP: Illicit Trafficking Radiation Assessment Program: ITRAP+10: ITRAP revisited after about 10 years: ORNL: Oak Ridge National Laboratory: PARM: Office of Program Accountability and Risk Management: PNNL: Pacific Northwest National Laboratory: [End of section] United States Government Accountability Office: Washington, DC 20548: May 13, 2013: The Honorable Dan Maffei Ranking Member Subcommittee on Oversight Committee on Science, Space, and Technology House of Representatives: The Honorable Donna F. Edwards Ranking Member Subcommittee on Space Committee on Science, Space, and Technology House of Representatives: Preventing terrorists from smuggling radiological or nuclear material into the United States to carry out an attack is a national priority. To implement federal policy to protect the nation against radiological or nuclear attacks, among other purposes, the Department of Homeland Security's (DHS) Domestic Nuclear Detection Office (DNDO) was established in 2005.[Footnote 1] DNDO is tasked with coordinating development of a global nuclear detection architecture and is responsible for implementing the domestic portion of the architecture.[Footnote 2] It also develops, acquires, and deploys radiation detection equipment to support the efforts of other federal agencies. For example, DHS's U.S. Customs and Border Protection (CBP) uses radiation detection equipment at U.S. ports of entry to screen cargo containers and trucks for illicit radiological materials, including material that could be used in radiological dispersal devices (dirty bombs) and special nuclear material that could be used to make a nuclear weapon.[Footnote 3] Potential pathways for illicit radiological materials to enter the United States include being smuggled in trucks at border crossings or in cargo on ships. According to DHS officials, nearly all trucks at land border crossings and containerized cargo at seaports are screened for radiological materials.
To screen cargo at these locations, CBP uses radiation portal monitors (RPM)--large stationary radiation detectors through which trucks and cargo containers pass. (See figure 1.) Figure 1: Trucks Passing through Radiation Portal Monitors: [Refer to PDF for image: photograph] Source: GAO. Note: Dashed frame contains one radiation portal monitor. [End of figure] According to DNDO, the current generation of RPMs has limitations in that it can detect radiation but cannot identify its source. Because of this inability, the RPMs' radiation alarms can be triggered by benign, naturally occurring radioactive material present in common items such as kitty litter and granite. Cargo that has triggered an alarm during the initial ("primary") screening is sent for additional inspection (i.e., "secondary" screening), first by another RPM to confirm the alarm and then by a CBP officer using a handheld radiation detector--about the size of a shoe box--that can identify the source of the radiation (see figure 2). Figure 2: CBP Officer Screening a Truck with a Handheld Detector That Can Identify Sources of Radiation: [Refer to PDF for image: photograph] Source: GAO. [End of figure] One way to reduce the rate of alarms that are triggered by benign materials--and thereby reduce the number of unnecessary secondary screenings--would be to use next-generation RPMs that can both detect and identify radiation sources. In 2005, DNDO began working with CBP on a program to develop and test a type of next-generation RPM called the advanced spectroscopic portal monitor (ASP), which was designed to both detect radiation and identify the source as benign, suspect, or a threat. The initial concept of the program was to develop, procure, and deploy enough ASPs to replace many of CBP's currently deployed RPMs and handheld detectors at a cost of $2 billion to $3 billion, according to DNDO.[Footnote 4] Throughout the ASP program, DNDO and CBP faced many challenges, on which we have reported and testified to Congress several times.[Footnote 5] Our last in-depth review of the ASP program was in May 2009, when we reported that the ASP had mixed performance during testing to determine whether it could successfully be used for primary and secondary screening.[Footnote 6] Challenges in the ASP program were also identified by the National Research Council of the National Academies.[Footnote 7] In 2009, the National Research Council recommended in an interim report that DNDO use a better approach to testing, evaluation, cost-benefit assessment, and deployment of ASPs.[Footnote 8] In 2010, the National Research Council found in its final report that, among other things, DNDO's 2008 testing had shortcomings that impaired the ability to draw conclusions about ASP's likely performance, and DNDO's draft cost-benefit analysis needed substantial improvement to support decision making.[Footnote 9] In July 2011, after several years of testing, the DNDO Director testified that the Secretary of Homeland Security had directed DNDO and CBP to "end the ASP program as originally conceived" and to instead use existing ASPs to help CBP officers gain operational familiarity with ASP and to gather data to support a future acquisition program. In October 2011, the Secretary of Homeland Security wrote to key congressional committees informing them of her decision to cancel the ASP program.[Footnote 10] In July 2012, the Under Secretary for Management stated that, in accordance with the Secretary's letter, the program was considered canceled.
Once canceled, programs such as ASP are subject to provisions of DHS's acquisition guidance regarding review of canceled programs and dissemination of the resulting lessons learned. In this context, you asked us to provide updated information on the ASP program. Our objectives were to determine: (1) the results of ASP testing conducted in 2009 and 2010 that led to DHS's decision to cancel the ASP program; (2) the benefits of lessons learned reviews and how DHS captures any lessons learned when programs are canceled; and (3) what additional testing, if any, of next-generation RPMs, including ASP, DHS is conducting or planning. To determine the results of the ASP testing conducted in 2009 and 2010 that led to the decision to cancel the ASP program, we reviewed ASP testing documents and briefing materials from DNDO, CBP, the Department of Energy's (DOE) Pacific Northwest National Laboratory, and Johns Hopkins University's Applied Physics Laboratory and interviewed officials from DNDO and CBP. To determine the benefits of lessons learned reviews and how DHS captures any lessons learned when programs are canceled, we analyzed acquisition policy documents from DHS and interviewed officials from DNDO, DHS's Office of Program Accountability and Risk Management (PARM), and DHS's Enterprise Business Management Office. In addition, we interviewed seven experts identified by the National Academies to highlight leading practices for reviewing canceled acquisition programs; these experts were identified on the basis of their knowledge of leading practices in large-scale engineering and acquisition programs. (See appendix I for a list of the experts we interviewed.) To determine what additional testing of next-generation RPMs, including ASP, DHS is conducting or planning, we reviewed documents from DHS, DNDO, the National Institute of Standards and Technology, the American National Standards Institute, and the International Electrotechnical Commission, and we witnessed some of the testing. We also interviewed officials from DNDO and DOE's Oak Ridge National Laboratory. We conducted this performance audit from October 2011 to May 2013 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. Background: DHS began testing next-generation RPMs with a preliminary small-scale study performed in July 2004 by DOE's Pacific Northwest National Laboratory (PNNL). (See figure 3.) Specifically, PNNL performed small-scale, side-by-side comparisons of next-generation RPMs and the RPMs used by CBP. This study found that in some situations, the performance of the next-generation RPMs was better than that of CBP's RPMs, and that in other situations, the performance of the two was equal. In a follow-up small-scale study conducted for DHS in July 2005, PNNL found that the next-generation RPMs performed no better than CBP's currently deployed RPMs but did produce fewer alarms for benign materials. Figure 3: Timeline of DHS Studies and Tests of Next-Generation RPMs, Including ASP: [Refer to PDF for image: timeline] July 2004: Preliminary small-scale study. July 2005: Follow-up small-scale study. October 2005: Test on next-generation RPMs; results used to help evaluate ASP proposals.
July 2006: ASP contracts awarded. February-March 2007: First ASP tests. Final ASP testing campaign 2008-2010: July-August 2008: Performance testing at DOE’s Nevada National Security Site (primary and secondary screening). January-February 2009: First field validation test (primary and secondary screening). July-August 2009: Second field validation test (primary and secondary screening). October-November 2010: Third field validation test (secondary screening). Source: GAO analysis of ASP program documents. [End of figure] In October 2005, DNDO conducted a number of tests on next-generation RPMs that DHS had commissioned from different manufacturers and used the test results to help evaluate proposals for the ASP program. In July 2006, DNDO awarded contracts for development and production of ASPs. Among other things, according to DNDO officials, the design specifications for ASP stated that it should operate on trucks traveling at 5 miles per hour during primary screening and 2 miles per hour during secondary screening. DNDO began testing these ASPs at DOE's Nevada National Security Site in February 2007.[Footnote 11] As we have previously reported, we identified a number of weaknesses in the methods used for these tests that impaired the significance and validity of the test results.[Footnote 12] Concerned about the performance and cost of ASP, Congress required the Secretary of Homeland Security to certify that ASP would provide a "significant increase in operational effectiveness" before DNDO obligated funds for full-scale procurement of ASPs.[Footnote 13] In response, DNDO, CBP, and DHS jointly issued criteria in July 2008 for determining whether ASP provided a significant increase in operational effectiveness compared with existing equipment. The criteria generally compare ASP with the RPMs used by CBP under CBP's standard operating procedure. Specifically, as shown in table 1, there were four criteria for primary screening and two criteria for secondary screening.[Footnote 14] In addition to these overarching criteria, each phase of testing also had criteria that needed to be met to complete that phase. Table 1: Criteria for Demonstrating a Significant Increase in Operational Effectiveness: Primary screening criteria: 1. When special nuclear material is present in cargo without naturally occurring radioactive material, ASP's probability of detecting the material must be equal to or greater than that of the RPMs used by CBP. 2. When special nuclear material is present in cargo along with naturally occurring radioactive material, the ASP must increase the probability of detecting and identifying the material compared with CBP's currently deployed RPMs and handheld detectors. 3. When medical or industrial radiological sources are present in cargo, ASP's probability of detecting the sources must be equal to or greater than that of CBP's currently deployed RPMs. 4. When the only radiological source present in the cargo is naturally occurring radioactive material, the ASP must refer at least 80 percent fewer trucks for secondary screening than CBP's currently deployed RPMs. Secondary screening criteria: 1. When compared with CBP's currently deployed handheld detectors, ASP must reduce, by at least a factor of 2, the probability that special nuclear material is misidentified as naturally occurring radioactive material, a medical or industrial radiological source, an unknown radiological source, or no radiological source at all. 2. 
When compared with CBP's currently deployed handheld detectors, ASP must reduce the average time required to correctly release trucks from secondary screening. Source: GAO summary of DHS documentation. [End of table] The final ASP testing campaign began in July 2008 and was concluded in November 2010. DNDO started the campaign with performance testing at DOE's Nevada National Security Site. The performance tests determined the probability of detection and identification of radiation sources, including special nuclear material that could be used to make a nuclear weapon. CBP then began field validation testing in 2009. Field validation tests are the first step in determining the suitability of fully operational equipment in the actual environment where the equipment is intended to be used. The ASP field validation tests were performed at land border crossings in Detroit, Michigan, and Laredo, Texas, and at seaports in Long Beach, California, and in New York City, New York. CBP made three attempts to complete this testing--in January and February 2009, July and August 2009, and October and November 2010. This testing, which we reviewed for this report, evaluated the compatibility of ASP with CBP's interdiction system on the basis of 10 additional criteria CBP had specified for completion of this phase of testing.[Footnote 15] Among other things, these criteria included whether, under normal field operating conditions, ASP improved detection to the required degree compared with the currently deployed CBP equipment. To pass the field validation testing phase, ASP had to meet all 10 criteria. In addition, DNDO's ASP program manager specified that ASP could not have any major unresolved issues that might render it ineffective or unsuitable for use by CBP.[Footnote 16] Test Results Show That ASP Did Not Meet Criteria to Pass Field Validation Testing, Leading DHS to Cancel the Program: Because of unsatisfactory test results, ASP did not pass field validation testing, which led DHS to cancel the program. Specifically, in CBP's first and second attempts to pass field validation testing in 2009, ASP did not meet 1 of the 4 criteria for demonstrating a significant increase in operational effectiveness during primary screening--that is, it did not refer at least 80 percent fewer trucks for secondary screening than CBP's currently deployed RPMs.[Footnote 17] When CBP began the first field validation test in late January 2009, CBP found ASP produced a higher number of referrals to secondary screening than CBP's currently deployed RPMs. CBP suspended the test about 2 weeks later in mid-February; CBP officials reported that the problem with referrals required significant corrective actions before testing could resume. DNDO, working with Johns Hopkins University's Applied Physics Laboratory, determined that the problem could be addressed by revising ASP's software to raise the thresholds for triggering alarms. CBP conducted a second field validation test of ASP in July and August 2009 using the revised ASP software. CBP and DNDO documents indicate that, with the revised software, ASP was able to reduce referrals to secondary screening by about 70 percent but not by the 80 percent required to demonstrate a significant increase in operational effectiveness. Significantly, according to these documents, the majority of the ASP referrals to secondary screening were for alarms that falsely indicated the presence of special nuclear material that could be used to make a nuclear weapon. 
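The referral-reduction threshold at the center of these results is simple arithmetic, and a brief illustration may help make it concrete. The following sketch, written in Python for illustration only, shows how a percentage reduction in referrals to secondary screening would be computed and compared against the 80 percent threshold in primary screening criterion 4; the referral counts and the function name are hypothetical assumptions and are not drawn from DHS or GAO test data.

# Illustrative sketch only: hypothetical counts, not actual ASP test data.
# Primary screening criterion 4: when only naturally occurring radioactive
# material is present, ASP must refer at least 80 percent fewer trucks to
# secondary screening than CBP's currently deployed RPMs.

def referral_reduction(rpm_referrals, asp_referrals):
    """Return the fractional reduction in referrals to secondary screening."""
    return (rpm_referrals - asp_referrals) / rpm_referrals

# Hypothetical example: if the currently deployed RPMs referred 1,000 trucks
# and ASP referred 300 trucks over the same traffic, the reduction is
# 70 percent, which falls short of the 80 percent threshold.
reduction = referral_reduction(rpm_referrals=1000, asp_referrals=300)
meets_criterion = reduction >= 0.80
print(f"Reduction: {reduction:.0%}; meets 80 percent criterion: {meets_criterion}")

Computed this way, the roughly 70 percent reduction ASP achieved with the revised software fell short of the 80 percent required, which is the shortfall described above.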
CBP officials told us that false alarms indicating the presence of special nuclear material were very disruptive in a port environment because they caused CBP officers to take enhanced security precautions--specifically, CBP officers had to conduct a thorough inspection to ensure no special nuclear material was present before permitting the cargo to enter the country. Moreover, repeated false alarms for such nuclear materials could cause CBP officers to be skeptical about future alarms. In the course of the second field validation test, ASP also experienced a "critical failure," which caused it to shut down. Importantly, during this critical failure, ASP did not emit a signal to alert the CBP officer that it was no longer screening cargo. Had this failure occurred outside the controlled testing environment--in which CBP was using its RPMs in tandem with ASP--the CBP officer would have permitted cargo to enter the country unscreened for radiological material. Testing is performed so that issues like this can be discovered and fixed prior to full-scale production and deployment. Following the second field validation test, DNDO's Acting Director stated that the ASP program would be scaled back to secondary screening only.[Footnote 18] According to the Acting Director, DNDO's decision was based on ASP's performance and cost. Specifically, the Acting Director stated that DHS decided not to pursue ASP for primary screening because ASP did not meet DHS's criteria for a significant increase in operational effectiveness. However, the Acting Director stated that results to date had shown that ASP met secondary screening criteria by wide margins. The Acting Director also stated that the cost to replace current RPMs with ASP for secondary screening was relatively low--approximately $350 million--compared with the original total program cost of $2 billion to $3 billion to use ASP for primary and secondary screening. This is because the original program called for many more ASPs to be deployed in CBP primary screening than in secondary screening. In November 2010, ASP did not pass the third and final field validation test for use in secondary screening because it did not meet 6 of the 10 criteria needed to complete the field validation testing phase, according to our review of CBP test documents. According to these documents, 2 of the 6 criteria that ASP did not meet involved issues that would significantly affect operations unless they could be mitigated or resolved. ASP did not meet these criteria in part because it sometimes would not turn on at the beginning of the day or continue operating long enough to complete a full day of testing. In addition, when ASP was operating, it produced an excessive number of alarms that incorrectly indicated that naturally occurring radioactive material in the cargo might be masking special nuclear material. In some cases, testing of the ASP had to be suspended because the high number of these alarms was backing up traffic for secondary screening, delaying the flow of commerce. According to the CBP documents, the remaining 4 of the 6 criteria that ASP did not meet involved issues that could affect or delay port operations, although not as significantly. For example, ASP did not meet one of these criteria because of difficulty in integrating ASP into a CBP information network. In addition to the 6 criteria that the ASP did not meet, DNDO test documents identified several major unresolved issues that would have needed to be resolved for successful completion of field validation testing.
In explaining these issues, DNDO and CBP officials told us that, for example, at certain times of the day, direct sunlight impaired the ability of the ASP to operate properly. See table 2 for a summary of the results for the three field validation tests. Table 2: Summary of Results for Field Validation Testing: Type of test: First field validation test: primary and secondary screening; Date of testing: January and February 2009; Summary of results: ASP had more referrals from primary screening to secondary screening than CBP's currently deployed RPMs. The requirement for a significant increase in operational effectiveness was to have 80 percent fewer referrals to secondary screening. Type of test: Second field validation test: primary and secondary screening; Date of testing: July and August 2009; Summary of results: ASP had 70 percent fewer referrals to secondary screening than CBP's currently deployed RPMs instead of the 80 percent fewer required for a significant increase in operational effectiveness. Also, the majority of referrals were for alarms falsely indicating the presence of special nuclear material, and such alarms were very disruptive in the port environment. Type of test: Third field validation test: secondary screening only; Date of testing: October and November 2010; Summary of results: ASP did not meet 6 of 10 criteria required to pass the field validation testing phase. For example, ASP produced excessive alarms incorrectly indicating that naturally occurring radioactive material might be masking special nuclear material. At times, testing had to be suspended because these alarms were backing up traffic. Source: GAO analysis of ASP program documents. [End of table] In addition to the issues identified during testing, our review identified analytical weaknesses before and after the third field validation test. For example, DNDO was inconsistent in its analysis of how to properly set up the ASP for secondary screening. Specifically, ASPs have separate settings (modes) for primary and secondary screening, and before the third field validation test, DNDO performed an analysis that indicated ASP would perform best during secondary screening if the machines were set in primary screening mode. However, after the third field validation test, DNDO officials analyzed the test results and concluded that ASP would have performed better for secondary screening had the machines been set in secondary screening mode. Furthermore, the analytical basis for a key decision regarding the ASP program is unclear. Specifically, DNDO could not provide us with any analysis supporting a principal factor that led to the decision to cancel the program. According to DNDO officials, for ASP to work in secondary screening, truck speeds would need to average 2 miles per hour, as indicated in the design specifications. On the basis of these specifications, the ASP Governance Board concluded in February 2011 that ASP was not operationally suitable for secondary screening and should not be deployed because, according to CBP officials, it was not possible to control truck speed to an average of 2 miles per hour.[Footnote 19] However, when we asked DNDO officials for the data or analysis supporting this conclusion, these officials could not provide such support. Moreover, during congressional testimony in July 2012, the Acting Director of DNDO said that it is possible to control truck speed to 2 miles per hour at state-operated truck weigh stations.
Accordingly, it is unclear why such speeds cannot be achieved at CBP land border crossings and seaports but can be achieved at truck weigh stations. Lessons Learned Reviews Improve Future Acquisition Efforts, but DHS Does Not Have Processes Ensuring Such Reviews: Conducting lessons learned reviews when programs are canceled benefits organizations by identifying things that worked well and did not work well in order to improve future acquisitions programs, according to experts we consulted; however, DHS does not have processes in place to ensure such reviews are conducted and reports documenting the results of the review are disseminated. To determine the key benefits of conducting lessons learned reviews, we consulted seven experts in large-scale engineering and acquisitions programs who were identified at our request by the National Academies. (See appendix I for a list of the experts we interviewed.) When we asked these experts for their views on lessons learned reviews and reports for canceled acquisition programs, they generally agreed on the following observations:[Footnote 20] * Lessons learned reviews help to determine the reasons why programs were canceled, including problems with the technology, management, or setting requirements. By identifying problematic practices and causes for program failure, lessons learned reports allow an organization to improve future efforts. * Lessons learned reports should be prepared promptly because otherwise, knowledgeable personnel may not be available to contribute to the reports, important details may not be recalled accurately, and dissemination of lessons learned will be delayed. * Lessons learned reports should not be optional for an organization. A requirement for lessons learned reports should be institutionalized--lessons learned reports should be required for programs administered by the organization rather than being an ad hoc requirement. However, an appropriate acquisitions executive should have the authority to tailor or waive the lessons learned report based on factors such as the size or cost of a program, its perceived importance and effects, or the perceived causes for failure. * Lessons learned reports must be submitted to an appropriate acquisitions executive to be useful. The acquisitions executive should disseminate the lessons learned reports to other appropriate acquisition officials. Lessons learned reports should be disseminated widely; however, they should be edited to preserve the lessons learned while minimizing potential damage to the reputations of people and organizations that were involved with the program. DHS interim acquisition guidance, issued in 2010, also recognizes the benefits of timely lessons learned reviews for canceled programs, although as guidance it does not constitute a requirement.[Footnote 21] Similar to the views of the experts identified by the National Academies, the DHS acquisition guidance states that (1) the objective of the reviews is to share lessons learned throughout the department to increase the probability of success for future acquisition programs and (2) the reviews are to take place immediately after programs are canceled. In addition, an official from DHS's Office of Program Accountability and Risk Management (PARM)--DHS's policy office for acquisitions--told us that it is important to conduct these reviews immediately so that key people remain available and do not forget the factors that contributed to programs being canceled.
Under DHS acquisition guidance, upon completion of the lessons learned review, a report documenting the lessons learned is to be submitted to PARM. DHS acquisition guidance on lessons learned provides general direction on what is to happen when programs are canceled, but it does not specify processes to follow. PARM officials stated that there were no documented processes in place for (1) conducting timely lessons learned reviews, (2) preparing lessons learned reports, or (3) disseminating lessons learned reports throughout the department. Under the federal standards of internal control, agencies are to clearly document internal controls (i.e., in management directives, administrative policies, or operating manuals), and this documentation should be readily available for examination.[Footnote 22] PARM officials told us they generally rely on DHS component agencies, such as DNDO, to self-monitor many aspects of their programs, including conducting lessons learned reviews, preparing lessons learned reports, and submitting lessons learned reports. In the case of the ASP program, a lessons learned review was conducted, and a lessons learned report was submitted to PARM and disseminated within CBP and DNDO in November 2012 at the explicit direction of DHS's Under Secretary for Management.[Footnote 23] The lessons learned report states that the program was canceled by the Secretary in October 2011. However, before the Under Secretary's directive, there was confusion about whether such a review should be conducted for canceled programs, as well as uncertainty about whether the program had been canceled. DNDO officials told us in March 2012 that, even though they considered the program to have been canceled by the Secretary in October 2011, they did not intend to conduct a lessons learned review because such reviews were not needed for canceled programs. Furthermore, there was uncertainty between DNDO and PARM about whether and when the program had been canceled. In this regard, while DNDO relied on the Secretary's October 2011 letter, PARM officials told us that the program was not canceled and would not be canceled until DNDO and CBP completed certain additional actions, such as gathering data to support a future acquisition program. These actions were started but not completed; however, the uncertainty over the program's status was eventually resolved in July 2012 with a memorandum from the Under Secretary for Management, which stated that, in accordance with the Secretary's October 2011 letter, the program was considered canceled.[Footnote 24] When we discussed this with DNDO and DHS officials, they acknowledged that the DHS acquisition guidance is confusing about when programs are considered canceled and when to begin conducting a lessons learned review. DHS officials stated that they are in the process of revising the guidance to provide greater clarity on these issues. To determine whether the confusion and uncertainty that affected the ASP program were unique to that program, we asked PARM officials to provide us with examples of previous lessons learned reports from any canceled programs across DHS, including several programs canceled in 2011 that we had previously reported upon.[Footnote 25] The PARM officials told us that they did not have any lessons learned reports for these programs or any others in the department. Furthermore, there was confusion within the department about the need to disseminate lessons learned reports.
Before the Under Secretary's memorandum of July 2012 directing DNDO and CBP to prepare and disseminate an ASP lessons learned report, PARM officials had told us there was no requirement to disseminate lessons learned outside individual component agencies. Subsequent to the memorandum, PARM officials told us that they are developing plans to disseminate lessons learned reports through department-wide Centers of Excellence.[Footnote 26] The lessons that can be learned from such reviews and reports can be extensive and informative. In the case of the ASP program, DNDO and CBP's lessons learned report identified 32 lessons learned in the following six categories: * acquisition strategy; * documentation of planning; * structuring and staffing of the program office; * scheduling; * testing; and: * coordination between stakeholders and end users regarding technical and operational requirements, as well as costs. (See appendix II for a copy of the lessons learned report.) For example, concerning acquisition strategy, the lessons learned report states the government should consider a "commercial first" approach--meaning consider whether commercially available equipment can meet programmatic needs before developing new technologies. Such an approach can reduce costs and risks to the program. Another lesson learned from the ASP program is that the acquisition officials and the end users must work closely together to ensure the needed capability meets operational requirements. An additional lesson learned was to manage the schedule of the acquisition program by events--not the calendar. In other words, the program should not proceed to the next major phase until all the criteria are met for completing the current phase. Furthermore, the report stated that acquisition programs should be flexible enough to respond to major issues that may be discovered after program initiation. Specifically, management should take care not to believe in the program so strongly that early signs of trouble are ignored, or cannot be detected and acted upon in a timely manner. At this point, however, PARM has no assurance that lessons learned reviews have been or will be conducted for other canceled programs, as called for in DHS acquisition guidance. Until DHS puts processes in place to ensure component agencies conduct lessons learned reviews and prepare lessons learned reports, and PARM implements its plans for disseminating the reports, DHS risks limiting the probability of success for the department's future acquisition programs. Success in such acquisition programs is important given the scale of DHS's planned investments, which DHS has estimated to be many billions of dollars during the next few years.[Footnote 27] DHS Is Part of International Tests of Next-Generation RPMs and Is Working with States to Gather ASP Data: DHS is participating in international tests of next-generation RPMs, including commercially available RPMs that have characteristics similar to ASP, and is working with states to gather data using ASPs at five truck weigh stations. For the international tests, DNDO is working with its European Union (EU) counterparts in the Illicit Trafficking Radiation Assessment Program (ITRAP); this program is currently known as ITRAP+10 because it is being revisited about 10 years after testing was originally conducted. According to the test planning documents, the final report is scheduled for issuance in August 2013. 
The documents further indicate that DNDO has the following objectives for the ITRAP+10 tests: * Provide scientific and technical data on radiological and nuclear detection equipment to policy makers. * Identify the best technologies based on results from repeatable testing. * Promote harmonization of national and international standards and guidelines. * Improve international exchange of information. * Provide manufacturers with feedback on how well their equipment performed against standards. * Promote new research and development efforts. The commercially available next-generation RPMs that DNDO and the EU are testing come from a variety of vendors and are being tested along with several other categories of radiation detection equipment. This testing is being performed to national and international standards for radiation detection set by the American National Standards Institute (ANSI)[Footnote 28] and the International Electrotechnical Commission (IEC).[Footnote 29] DNDO officials told us these tests are not part of any planned acquisition. According to the test plan, Oak Ridge National Laboratory (ORNL) will perform ITRAP+10 tests for DNDO on these next-generation RPMs. The primary planning documents for the testing of the next-generation RPMs were submitted by the National Institute of Standards and Technology and were approved by DNDO and the European Commission's Joint Research Centre.[Footnote 30] These documents indicate that, in addition to testing to the standards, ORNL will test to determine the outer limits of what the next-generation commercial RPMs can detect. In addition to participating in the ITRAP+10 tests, DNDO is working with state agencies to gather data using five existing ASPs installed at state-operated truck weigh stations in five states. According to a DNDO official, data collected will include information on the trucks and cargo passing through the ASPs, as well as any radiation detected by ASP. According to a DNDO official, data collected by the five ASPs will be sent to DNDO's Joint Analysis Center Collaborative Information System, which informs DNDO about nuclear detection and coordinates responses to nuclear detection alarms. This official told us the data will be analyzed to determine the types of cargo transported and any trends that may be apparent. Conclusions: To its credit, DHS has canceled the ASP program, and DNDO conducted a lessons learned review of the canceled program and prepared a lessons learned report. However, DNDO did not prepare and disseminate the lessons learned report until 2 years after the final ASP testing and about a year after the Secretary of Homeland Security notified Congress of her decision to cancel the program. Moreover, the report was not prepared until there was a specific directive from the DHS Under Secretary for Management. For canceled programs, DHS's acquisition guidance calls for a lessons learned review immediately after cancellation of a program, preparation of a lessons learned report, and dissemination of such a report. However, the acquisition guidance does not constitute an institutional requirement, which the experts we consulted said it should be. Moreover, DHS does not have documented processes for component agencies to follow in (1) conducting timely lessons learned reviews, (2) preparing lessons learned reports, or (3) disseminating lessons learned reports throughout the department, instead relying on such agencies to self-monitor.
The amount of time between the final field validation testing of the ASP, the program's cancellation, and the conducting of a lessons learned review and dissemination of a lessons learned report suggests that timely identification of lessons learned was not a DHS priority. While our review focused on the ASP program, the inability of PARM to identify any lessons learned reports from canceled DHS programs suggests that the problem is broader than this one program. PARM is developing plans for disseminating future lessons learned reports department-wide. However, until DHS makes lessons learned reviews an institutional requirement, puts processes in place to ensure component agencies conduct such reviews and prepare lessons learned reports, and implements plans for disseminating the reports, DHS may be missing opportunities to improve its chances of success for billions of dollars in future acquisitions. Recommendations for Executive Action: To increase the probability of success for future acquisition programs, we recommend that the Secretary of Homeland Security take the following four actions for canceled acquisition programs: Make lessons learned reviews an institutional requirement, such as through an agency directive or order or other appropriate means. Put documented processes in place to ensure that component agencies: * conduct timely lessons learned reviews, and: * prepare and submit lessons learned reports. Complete and implement plans to disseminate lessons learned reports throughout the department. Agency Comments and Our Evaluation: We provided a draft of this report to DHS for review and comment. In its written response, reproduced in appendix III, DHS agreed with all four of our recommendations. DHS asked us to consider two of the recommendations resolved and closed on the basis of existing guidance and processes that were discussed in the body of this report. DHS outlined steps that it intends to take to address the other two recommendations. We are encouraged by some of the actions planned. However, we do not consider two of the recommendations resolved and closed because, as discussed below, we believe that DHS should take steps to address these recommendations. In DHS's response to our recommendation to make lessons learned reviews an institutional requirement, DHS commented that it has institutionalized this requirement through existing departmental acquisitions guidance that was discussed in the body of this report.[Footnote 31] However, as discussed in this report and confirmed by PARM officials during the course of our review, this guidance does not constitute a requirement. DHS's comments also refer to additional guidance called the Department of Homeland Security (DHS) Capital Planning and Investment Control (CPIC) Guide. However, officials from both PARM and the DHS Office of the Chief Information Officer told us that CPIC applies only to information technology programs,[Footnote 32] not to other programs such as ASP. In our view, the existing guidance referenced in DHS's comments contributed to the confusion and uncertainty we discussed in the body of this report, including confusion about whether lessons learned were required for canceled programs. Accordingly, we continue to believe that DHS needs to take action to address the problem. 
Similarly, in commenting on our recommendation to put processes in place to ensure that component agencies conduct timely lessons learned reviews, DHS stated that our recommendation is addressed by its existing acquisition oversight process, as evidenced by the completion of a lessons learned report for ASP. However, according to DNDO and CBP's lessons learned report, the Secretary of DHS canceled the program in October 2011,[Footnote 33] yet the lessons learned report for this program was not issued until more than a year later. Furthermore, as noted in this report, the ASP lessons learned report is the only example PARM officials could identify of such a report having been completed. In our view, relying on this existing process seems unlikely to ensure that component agencies conduct timely lessons learned reviews. With regard to our final two recommendations--to put documented processes in place to ensure that component agencies prepare and submit lessons learned reports, and to complete and implement plans to disseminate lessons learned reports throughout the department--DHS stated that it is taking several actions. Specifically, DHS is revising and clarifying its acquisitions guidance with respect to lessons learned; it expects to complete these revisions by June 2014. In its comments, DHS stated that it has developed an online forum and library for DHS acquisition professionals that includes an area dedicated to lessons learned and best practices. Furthermore, DHS has designated a lead Center of Excellence and established a time frame of September 2013 for collecting and posting information on lessons learned for canceled programs. In our view, as DHS completes more lessons learned reports, an online forum and library could become a useful means of disseminating these reports throughout the department. DHS also provided technical comments that we incorporated in the report, as appropriate. As agreed with your offices, unless you publicly announce the contents of this report earlier, we plan no further distribution until 30 days from the report date. At that time, we will send copies to the Secretary of Homeland Security, the Director of DNDO, the Commissioner of CBP, the Executive Director of PARM, the appropriate congressional committees, and other interested parties. In addition, the report will be available at no charge on the GAO website at [hyperlink, http://www.gao.gov]. If you or your staff members have any questions about this report, please contact David C. Trimble at (202) 512-3841 or trimbled@gao.gov or Dr. Timothy M. Persons at (202) 512-6412 or personst@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made key contributions to this report are listed in appendix IV. Signed by: David C. Trimble Director, Natural Resources and Environment: Signed by: Timothy M. Persons, Ph.D., Chief Scientist: Director, Center for Science, Technology, and Engineering: [End of section] Appendix I: Experts Identified by the National Academies Who Described Leading Practices for Reviewing Cancelled Acquisition Programs: At our request, the National Academies identified experts who could highlight leading practices for reviewing canceled acquisition programs. These experts were identified on the basis of their knowledge of leading practices in large-scale engineering and acquisition programs.
The National Academies provided us a list of 10 experts, 7 of whom were available to participate in this effort. Six of the 7 experts who were available were members of the National Academy of Engineering. We conducted a group telephone conference with 6 of the experts. We drafted conclusions from the conference and circulated them to all 7 of the experts for their review and comment. With one exception, the experts reached consensus on all of the observations. One expert felt that lessons learned reports were so important that acquisitions executives should have less authority to tailor or waive the requirements for such reports. Hon. Philip E. Coyle, consultant, former Assistant Secretary of Defense and Director of Operational Test and Evaluation: Dr. Joseph M. DeSimone; Chancellor's Eminent Professor of Chemistry at the University of North Carolina; Member of the National Academy of Engineering; Member of the National Academy of Science: Dr. Donald Fraser; Draper Laboratory, retired; former Principal Under Secretary of Defense for Acquisition; Member of the National Academy of Engineering: Dr. Mary L. Good; Dean Emeritus and Special Assistant to the Chancellor, Center for Innovation and Commercialization at the University of Arkansas at Little Rock; former Under Secretary for Technology in the Department of Commerce; Member of the National Academy of Engineering: Don R. Kozlowski; Senior Vice President of the Boeing Company, retired; Member of the National Academy of Engineering: Dr. Lawrence Papay; Chief Executive Officer of PQR, LLC; former Sector Vice President of SAIC; Member of the National Academy of Engineering: A. Thomas Young; Executive Vice President of the Lockheed Martin Corporation, retired; former President and Chief Operating Officer of the Martin Marietta Corporation; Member of the National Academy of Engineering: [End of section] Appendix II: DNDO-CBP ASP Lessons Learned Report: The following DNDO-CBP ASP lessons learned report was redacted on December 4, 2012, from the November 5, 2012, For Official Use Only version by DNDO. U.S. Department of Homeland Security: Domestic Nuclear Detection Office (DNDO): Homeland Security: Advanced Spectroscopic Portal (ASP) Program: Post Implementation Review/Lessons Learned: Redacted: December 04, 2012: Document Number 600-ASP-119650v1.11: Advanced Spectroscopic Portal (ASP) Program: Post Implementation Review/lessons Learned: Submitted by: Paul Burrowes: Deputy Assistant Director: Product Acquisition and Deployment Directorate: Domestic Nuclear Detection Office: Date: Endorsed by: Lafonda Sutton-Burke: Director, Non-Intrusive Inspection: Cargo and Conveyance Security: U.S. Customs and Border Protection: Date: Endorsed by: Ira Reese: Executive Director: Laboratories & Scientific Services: U.S. Customs and Border Protection: Date: Endorsed by: Mark Borkowski: Assistant Commissioner: Office of Technology Innovation and Acquisition: Component Acquisition Executive: U.S. Customs and Border Protection: Date: Endorsed by: Stephen Karoly: Assistant Director: Product Acquisition and Deployment Directorate: Acting Component Acquisition Executive: Domestic Nuclear Detection Office: Date: Note: The original unredacted version was signed and dated by all of the officials above. Table Of Contents: Executive Summary: Section 1: Introduction: 1.1. Purpose: 1.2. The Review Process Overview: Section 2: PIR Areas Of Assessment: 2.1. Impact to Stakeholder (CBP): 2.2. Ability for the system to Deliver Results (Quantitative and Qualitative): 2.3. 
Ability for the Program to Meet Baseline Goals: Section 3: Lessons Learned: 3.1. Acquisition Strategy: 3.2. Planning Documentation: 3.3. Program Management Office (PMO) Formulation and Staffing: 3.4. Schedule: 3.5. Testing: 3.6. End-User and Stakeholder Requirements: Section 4: Conclusions/recommendations: 4.1. Conclusions: 4.2. Recommendations: Executive Summary: This Post Implementation Review (PIR) and Lessons Learned document complies with the Acquisition Decision Memorandum (ADM) issued by the DHS Under Secretary for Management on 16 July 2012. These requirements comprise the first two outputs of the "Evaluate" phase of an investment within DHS Capital Planning and Investment Control (CPIC). The Secretary of Homeland Security terminated the ASP Program on 3 October 2011, prior to implementation. Normally, a program would enter the CPIC "Evaluate" phase after it is deployed and operational. However, according to the CPIC Guide, "programs not approved for continuation by the appropriate IT decision authority must also execute the Evaluate Phase." A PIR normally covers 1) the program's impact to stakeholders, 2) its ability to deliver results, and 3) its ability to meet baseline goals, and is usually followed by the lessons learned from the program. Since ASP was terminated before implementation, only the lessons learned section is applicable. The DNDO Technical Review Board conducted the PIR and documented the lessons learned. This version of the document has been redacted for use by the Government Accounting Office (GAO) in support of their Advanced Spectroscopic Portal (ASP) engagement (GAO #361322); all For Official Use Only (FOUO) information in the original document dated November 05, 2012 has been removed. Section 1: Introduction: 1.1. Purpose: The purpose of a Post Implementation Review (PIR) is to assess the impact of the investment's deployment on customers, the mission and program, and technical and/or mission capabilities. Normally executed after a new system has been implemented, it helps determine whether an investment has achieved expected benefits, such as lowered costs, reduced cycle time, increased quality, or increased speed of service delivery. A PIR normally occurs after a system has been in operation for about six months, or as in the case of the ASP system, following investment termination. The purpose for applying lessons learned is to improve the Acquisition Management and Capital Planning and Investment Control (CPIC) processes. Identifying ways to incorporate them will increase the probability of a successful outcome in an investment process (a fundamental goal for the Department). The utility of lessons learned, and thus the CPIC requirement for documenting them, applies to programs that are terminated before implementation, such as ASP, as well as successful programs. 1.2. Background: The ASP Program began in 2004 with two Broad Agency Announcements released by DHS Science and Technology. The program was initiated to help solve a problem encountered during radiological scanning of inbound cargo at U.S. land borders and sea ports: a high alarm rate primarily due to naturally occurring radioactive material (NORM). In order to mitigate the negative impact on the flow of commerce, ASP was developed to provide near-real-time identification of the radiological material present and thereby reduce the amount of manpower and time required to clear alarming cargo. DNDO was formed in April 2005, and the ASP Program immediately transferred to DNDO. 
The program advanced through developmental testing at the manufacturer's location, a data collection event at the New York Container Terminal, and two rounds of performance testing at the Nevada National Security Site. From 4Q FY 2007 through 1Q FY 2010, the system was subjected to five rounds of Field Validation (FV). The system did not enter Operational Testing, and the Secretary canceled the program in 1Q FY 2012. The 16 July 2012 Acquisition Decision Memorandum (ADM) reiterated that the program had been canceled and listed a single action item: DNDO, CBP, and the ASP Program are to provide "lessons learned/Post Implementation Review from this acquisition" with a due date of 120 days from the date of the memo. 1.3. The Review Process Overview: Normally, a PIR focuses on three primary areas based on deployment of the system under review. The areas of interest include 1) the impact to stakeholders and customers, 2) the ability to deliver results, and 3) the ability to meet baseline goals. These three areas could not be assessed for the ASP system because it was not deployed due to program termination. A detailed discussion regarding the ASP system PIR assessment follows in Section 2 of this document. The PIR assessment summary for the ASP system follows: 1. Impact to Stakeholder (U.S. Customs and Border Protection (CBP)): Not assessed due to program termination prior to operational testing and deployment. 2. Ability for the ASP system to deliver results: Not assessed due to program termination prior to operational testing and deployment. 3. Ability for the ASP Program to meet baseline goals: Not assessed due to program termination prior to operational testing and deployment. The lessons learned were captured from various sources over the life of the program, including feedback from the end-user. The PIR and review of Lessons Learned were conducted by the DNDO Technical Review Board, with input from the Program Manager and the End-User. Section 2: PIR Areas Of Assessment: The PIR usually occurs either after a system has been in operation for about six months or following investment termination. The three primary areas of focus for a PIR are, as discussed in Section 1, 1) the impact to stakeholders and customers, 2) the ability to deliver results, and 3) the ability to meet baseline goals. The following sections provide a detailed discussion regarding the ASP system PIR assessment. 2.1. Impact to Stakeholder (CBP): The impact the investment has on stakeholders and customers will typically be measured by the Program Manager (PM) through end-user surveys (formal or informal), interviews, and feedback studies. Since the ASP system was not deployed, the program did not reach the stage where operational end-users could be surveyed to gauge the impact. For the same reason, interviews and feedback studies were not conducted. Therefore, the impact was not assessed. 2.2. Ability for the ASP system to Deliver Results (Quantitative and Qualitative): The investment's impact to mission and program would normally be carefully assessed to determine whether it delivered the expected results. This information should be compared to the investment's original performance goals. The impact to mission and program is unknown, since the system was never deployed. Therefore, the ability to deliver results was not assessed. 2.3.
Ability for the ASP Program to Meet Baseline Goals: Early in a program's life, baselines are established for goals involving cost, return on investment, schedule, and the analysis of architecture and risk. Since the ASP program was terminated before system deployment, the ability to meet these goals was not assessed. Section 3: Lessons Learned: Over the life of the program, lessons learned were captured from various sources and venues. These include routine observations within the Program Management Office (PMO), an Acquisition Review Board (ARB) Meeting, interactions with the Government Accountability Office (GAO) and the National Academy of Sciences (NAS), and discussions with the end-user (CBP). Lessons learned are arranged in groups corresponding to broad topics, including Acquisition Strategy, Planning Documentation, PMO Formulation and Staffing, Schedule, Testing, and End-User and Stakeholder Requirements. The following sub-sections provide the lessons learned by the ASP Program stakeholders and are arranged by the topic areas listed above. 3.1. Acquisition Strategy: 3.1.1. Clear down-select criteria should be established with key stakeholders. 3.1.2. Development contracts should be written clearly and succinctly. Also, they should not tie risky development to very expensive acquisition options in the same contract. 3.1.3. DHS acquisition programs will benefit from the rigor imposed by the DHS Acquisition Management Directive (AD) 102-01. DHS programs that started before AD 102-01 was implemented were at a great disadvantage. In addition, best business practices and other processes that supplement AD 102-01, e.g., DNDO's Solution Development Process (SDP), should be used where appropriate. 3.1.4. The PMO should take care not to over-specify the development effort. The government should develop and provide to industry capability-based requirements rather than performance specifications and trust industry to find the best solution to meet the desired capability. This approach may provide capability to the end-user faster and/or at lower cost. 3.1.5. Government acquisition strategy should consider a "Commercial First" approach in an effort to reduce life-cycle cost and risk to the program. 3.1.6. Requests for Information (RFI) to industry should be used as a market research tool. 3.1.7. An assessment of technology readiness level (TRL) should be performed and care taken to ensure the correct level of technology maturity (typically TRL 6) is realized before proceeding with an acquisition program, i.e., Acquisition Decision Event 2B (ADE 2B). To increase the probability of program success, detailed program planning is required based on the TRL assessment, and the program development schedule must be developed in consideration of the technology maturity. Failure to heed this lesson can result in multiple design changes, premature attempts to field-test a solution, development cost and schedule increases, and unintended negative consequences for performance. 3.1.8. A thorough threat assessment should be completed prior to or as part of determining mission need and developing operational requirements to ensure the proper capability is delivered to the end-user. 3.1.9. The acquisition and end-user communities should ensure that, as part of their analysis during the DHS Acquisition Review Process (ARP) "Need" phase, they consider whether an existing program could be modified or how a proposed new program might affect the life cycle of existing programs. 3.1.10.
Programs should conduct an assessment of the benefits and drawbacks of acquiring the vendor's intellectual property/software design rights. The potential benefit is better government control of system configuration over its lifecycle. A major drawback is that the vendor will probably charge a premium for giving up its rights, resulting in a significant cost increase to the program. There are many other pros and cons to this strategy. With that said, pre-program initiation activities should include a thorough analysis to determine whether or not to proceed with this acquisition strategy. 3.1.11. The acquisition and end-user communities must work closely together throughout the program's acquisition lifecycle to ensure the needed capability meets operational requirements in an effective and suitable manner. For example, the AA or AoA plan should be developed collaboratively with all stakeholders; findings should be agreed upon before proceeding to program initiation. Additionally, once an acquisition program is initiated, stakeholders should work together collaboratively on all aspects of the program, e.g., the end-user should be part of the source selection team for the service and/or product being procured. 3.2. Planning Documentation: 3.2.1. Key documentation (e.g., Mission Needs Statement (MNS), Operational Requirements Document (ORD), Concept of Operations (CONOPS), Acquisition Plan, Functional Requirements Document, etc.) should be reviewed at major milestones or every two years from the date of approval up until the DHS ARP "Obtain" phase. 3.2.2. Development of integrated logistics documents (training, operation, and maintenance) should start early in the program life cycle with the goal of supporting systems that are easy to use, last longer, and require less support, thereby reducing costs and increasing return on investment. 3.3. Program Management Office (PMO) Formulation and Staffing: 3.3.1. Acquisition programs should be flexible enough to respond appropriately to major issues that may be discovered after program initiation. Management should take care not to believe in the program so strongly that early signs of trouble are ignored, go undetected, or are not acted upon in a timely manner. AD 102-01, Appendix B, Systems Engineering Life Cycle (SELC), discusses one of many ways in which oversight is provided to a program to help keep it on track. In addition to the Program Manager (PM), Systems Engineering governance requires the participation of a Lead Technical Authority (LTA) and a Lead Business (or Operational) Authority (LBA). At the completion of each SELC review, the combined concurrence of these three stakeholders (PM, LTA, LBA) is documented in a SELC Review Completion Letter, along with the resultant actions taken during the review by the other Component and Department participants, as the formal record of the SELC review. 3.3.2. A formal charter should be developed and signed at program initiation; it provides the basis for a well-organized program. The program charter defines the ground rules among stakeholders for management of the program, particularly for multi-component programs. Some of the basic elements that are normally included in a charter are: Purpose; Scope; Program/System Description; Organization; Roles, Responsibilities, and Authority (including those of the PM, LTA, and LBA); Coordination/Communications; Funding Authority; Arbitration of Disputes; Documentation; Public Affairs; Component Manning; and Review Procedures.
In addition, the charter will establish the ground rules needed to develop an environment of trust among all stakeholders. 3.3.3. Ensure the PMO is properly staffed to run a major acquisition program, including, but not limited to, the proper mix of acquisition, logistics, contracting, finance, and operational personnel. This includes the correct level of certification for personnel assigned to the program. 3.4. Schedule: 3.4.1. The Integrated Master Schedule (IMS) for the program should be event driven, not calendar driven. The program should not be allowed to proceed to the next major milestone until ALL exit criteria are met for the present milestone. 3.5. Testing: 3.5.1. The use of modeling and simulation (M&S) should be considered for risk mitigation and cost reduction. However, models must be carefully evaluated and used within their limitations. In addition, opportunities to develop and/or improve M&S tools should be considered when developing the test and evaluation strategy for the program, e.g., model-test-model. Lastly, the pedigree of M&S tools should be established prior to their use; this is normally done by Verification, Validation, and Accreditation (VV&A). In situations where VV&A cannot be completed, due to prohibitive cost and schedule impacts to the program, the risk associated with using M&S tools must be properly identified and understood by all stakeholders. 3.5.2. The Operational Test Authority (OTA) should be included in the development of the test strategy. 3.5.3. Regarding software development and testing, ensure careful consideration is given to when software changes will no longer be allowed (software is locked down) and to the process for conducting regression testing in the case of software fixes. The software development strategy must be well thought out and developed in collaboration with the software developer, e.g., consider the process for updating software for operating system changes. 3.5.4. Test plans and test procedures must be developed in coordination with all stakeholders and approved at the right level before the start of testing. Any changes or additions to plans or procedures during testing should be done under strict configuration control. 3.5.5. Resources required to support field testing should be understood well in advance and appropriately budgeted. 3.6. End-User and Stakeholder Requirements: 3.6.1. During the DHS ARP "Need" phase, the end-user should coordinate with the acquisition community to ensure mission need and operational requirements are well written and understood; failure to do so may lead to the development of a system that is not suitable and/or effective. With that said, the acquisition community must be sensitive to end-user inputs to the program and ensure it is meeting the stated need and associated requirements. 3.6.2. During the DHS ARP "Analyze/Select" phase, it is very important that the PMO capture all end-user costs in the draft Life-Cycle Cost Estimate (LCCE) for the program. All stakeholders must weigh in prior to program initiation to ensure the correct investment is being made. Particular attention early in the program's lifecycle should be given to end-user operations and maintenance (O&M) costs to ensure realistic estimates are included in the LCCE. In addition, the PMO may consider including operational availability (AO) requirements in the Request for Proposal (RFP). Failure to include realistic logistics implications can undermine program sustainability. 3.6.3. Effective communication should be established between the acquisition and end-user communities.
This is critical to ensure that the push and pull between stakeholders does not result in unnecessary tension and poor communications. 3.6.4. The PMO should solicit from the end-user and other acquisition programs any lessons learned on similar programs and ensure those lessons are not relearned on any of the programs in the portfolio. 3.6.5. When appropriate, technical expertise should be sought from logical sources and leveraged early and to the maximum extent possible. For example, on rad/nuc programs, collaboration with the Department of Energy's (DOE) National Nuclear Security Administration (NNSA) should be considered. 3.6.6. To achieve effective requirements development, timely interactions should take place between the acquisition community, the end-user community, and national laboratories that support both communities. This is extremely important when requirements and/or program scope change. 3.6.7. The PMO must remain vigilant to requirements changes and ensure they are properly vetted and accounted for throughout the program life cycle via a configuration-managed requirements change process. All stakeholders should work to minimize the number of requirements changes. 3.6.8. For the end-user to plan out-year budgets effectively, the acquisition and end-user communities should coordinate inputs to the Office of Management and Budget (OMB) Exhibit 300. The coordination process needs to be mature such that a split budget request will not result in a shortfall. 3.6.9. Since O&M costs can be upwards of 70% of the overall lifecycle cost of a program, the acquisition community must work together with the end-user to conduct O&M assessments during the various stages of the Acquisition Lifecycle Framework, beginning with the "Analyze/Select" phase of the program. These O&M assessments will assist the PM in identifying whether and when O&M costs outweigh the benefits gained from acquiring and/or developing a specific capability. 3.6.10. Program Offices should work with the end-user to identify and control program risks and issues. Introducing a new technology to address an existing capability gap is sometimes a high risk; program offices should work with the end-user to ensure the proposed solution will be effective and suitable in meeting the end-user's needs. Additionally, consideration should be given to improving existing technologies rather than developing a new one; this may provide a solution to the end-user faster and at lower cost. Section 4: Conclusions/Recommendations: 4.1. Conclusions: As DNDO moves forward with plans to address the remaining gaps in interdicting rad/nuc material at our ports and border crossings, lessons learned from the ASP Program will be taken into account. Many have already been incorporated into best business practices. For example, by following the now-implemented AD 102-01, DNDO successfully completed ADE-3 for the Human Portable Radiation Detection System (HPRDS) Program and acquired devices for deployment by operational components. In addition, best business practices and other processes that supplement AD 102-01, e.g., DNDO's Solution Development Process (SDP), are being used to ensure the lessons learned in this document are not relearned on any of the programs in the portfolio. 4.2. Recommendations: DNDO and CBP recommend that the Under Secretary approve this document to satisfy the requirement from the Acquisition Decision Memorandum of 16 July 2012. [End of section] Appendix III: Comments from the Department of Homeland Security: U.S.
Department of Homeland Security: Washington, DC 20528: April 29, 2013: David C. Trimble: Director, Natural Resources and Environment: U.S. Government Accountability Office: 441 G Street, NW: Washington, DC 20548: Timothy M. Persons, PhD: Chief Scientist: Director, Center for Science, Technology, and Engineering: U.S. Government Accountability Office: 441 G Street, NW: Washington, DC 20548: Re: Draft Report GAO-13-256, "Combating Nuclear Smuggling: Lessons Learned from Cancelled Radiation Portal Monitor Program Could Help Future Acquisitions" Dear Mr. Trimble and Dr. Persons: Thank you for the opportunity to review and comment on this draft report. The U.S. Department of Homeland Security (DHS) appreciates the U.S. Government Accountability Office's (GAO's) work in planning and conducting its review and issuing this report. The Advanced Spectroscopic Portal (ASP) Program was established in 2004 to improve radiation and nuclear detection capabilities at our seaports and land border crossings and to address known limitations of the existing systems. In 2010, the Department's field validation of the ASP system showed the design specification for ASP—jointly developed by the Domestic Nuclear Detection Office (DNDO) and U.S. Customs and Border Protection (CBP) in 2007—did not adequately reflect the current operational needs in the field; specifically, for truck speeds in excess of 2 miles per hour in secondary inspection. In addition, commercially available portal radiation detection systems had come on the market since the ASP Program first began. On the basis of these developments, the Department concluded that the best course of action was not to proceed with full deployment in either primary or secondary inspections, and so notified Congress in October 2011. DNDO and CBP began reviewing lessons learned soon thereafter and issued the final Lessons Learned Report in November 2012. The draft report contained four recommendations with which the Department concurs. Specifically, GAO recommended that the Secretary of Homeland Security: Recommendation 1: Make lessons learned reviews an institutional requirement, such as through an agency directive or order or other appropriate means. Response: Concur. The requirement for lessons learned is institutionalized through the Department's Acquisition Management Directive (MD) 102-01-001 (DHS Instruction Manual 102-01-001 Acquisition Management Instruction/Guidebook Appendix B, Systems Engineering Life Cycle [SELC], Version 2.0, dated 9/21/2010). The requirement states that a Lessons Learned Report will be developed during the Operations and Maintenance stage of the SELC. The content for the Lessons Learned Report comes from Post-Implementation Review findings, from various reviews held, or from conducting required activities during the project life cycle. The DHS Capital Planning and Investment Control (CPIC) Guide further states that the process for incorporating lessons learned includes: (1) identifying lessons learned, (2) providing recommendations (based on lessons learned), (3) agreeing on the appropriate process improvements, and (4) applying the process improvements in the next iteration of the Select Phase. This process will yield specific actions designed to improve the Department's project success rates, while non-value-added steps are revised or removed. The CPIC Guide also states that the Lessons Learned Report should be made available to the DHS Integrated Project Review Team for use throughout DHS.
We request that GAO consider this recommendation resolved and closed. Recommendation 2: Put documented processes in place to ensure that component agencies conduct timely lessons learned reviews. Response: Concur. Further institutionalization is provided by the established process of documenting required actions to the program through a signed Acquisition Decision Memorandum (ADM) from the DHS Chief Acquisition Officer (CAO) to the Program and Component Acquisition Executive (CAE). The ADM identifies specific actions related to completion of the lessons learned, the due date for completing the lessons learned, and the intended recipient of the lessons learned. Copies of the ADM for the ASP program cancellation and lessons learned review were previously provided to GAO. When the Department officially canceled the ASP program in July 2012, DNDO was required to follow up with lessons learned within 120 days. The lessons learned were identified, documented, and provided to the Office of Program Accountability and Risk Management (PARM) in November 2012. We request that GAO consider this recommendation resolved and closed. Recommendation 3: Put documented processes in place to ensure that component agencies prepare and submit lessons learned reports. Response: Concur. The Department's Instruction Manual 102-01-001, Appendix B, Systems Engineering Life Cycle, includes the requirement to conduct lessons learned. As part of the MD 102-01-001 policy update cycle, the SELC is being revised to provide further clarification of the lessons learned requirement. This update is expected to occur before June 30, 2014. In addition, the Program Management Center of Excellence (PM COE) provides a process and mechanism for collecting and distributing lessons learned. To reach a wider audience, the PM COE has completed an online forum and library for all the Department's acquisition professionals. This site includes an area dedicated to lessons learned and best practices. Estimated Completion Date (ECD): June 30, 2014. Recommendation 4: Complete and implement plans to disseminate lessons learned reports throughout the department. Response: Concur. Lessons learned will increase the probability of success for future acquisitions. Dissemination of the lessons learned is planned through PARM's online portals. Phase one of a SharePoint site for acquisition and program management staff has been completed and includes an area dedicated to lessons learned and best practices. The PM COE will be the lead for announcing the lessons learned for canceled programs and posting the lessons for access through the PM COE portal. The collection of existing lessons learned through the Component Leads and the posting of those lessons are expected to be completed by September 30, 2013. Additionally, new lessons learned documentation will be announced at the PM COE stakeholder working groups and CAE Forums, as well as through CAE Council sessions with the CAO. ECD: September 30, 2013. Again, thank you for the opportunity to review and comment on this draft report. Technical comments were previously provided under separate cover. Please feel free to contact me if you have any questions. We look forward to working with you in the future. Sincerely, Signed by: Jim H. Crumpacker: Director: Departmental GAO-OIG Liaison Office: [End of section] Appendix IV: GAO Contacts and Staff Acknowledgments: GAO Contacts: David C. Trimble, (202) 512-3841 or trimbled@gao.gov: Timothy M.
Persons, Ph.D., (202) 512-6412 or personst@gao.gov: Staff Acknowledgments: In addition to the individuals named above, Ned Woodward, Assistant Director; Gene Aloise; Cheryl R. Arvidson; Antoinette C. Capaccio; Frederick K. Childers; R. Scott Fletcher; Cindy K. Gilbert; David C. Maurer; Cynthia Norris; Katrina E. Pekar-Carpenter; Kiki Theodoropoulos; and Nathan A. Tranquilli made key contributions to this report. [End of section] Related GAO Products: Combating Nuclear Smuggling: DHS Has Developed Plans for Its Global Nuclear Detection Architecture, but Challenges Remain in Deploying Equipment. [hyperlink, http://www.gao.gov/products/GAO-12-941T]. Washington, D.C.: July 26, 2012. Combating Nuclear Smuggling: DHS Has Made Some Progress but Not Yet Completed a Strategic Plan for Its Global Nuclear Detection Efforts or Closed Identified Gaps. [hyperlink, http://www.gao.gov/products/GAO-10-883T]. Washington, D.C.: June 30, 2010. Combating Nuclear Smuggling: Recent Testing Raises Issues About the Potential Effectiveness of Advanced Radiation Detection Portal Monitors. [hyperlink, http://www.gao.gov/products/GAO-10-252T]. Washington, D.C.: November 17, 2009. Combating Nuclear Smuggling: Lessons Learned from DHS Testing of Advanced Radiation Detection Portal Monitors. [hyperlink, http://www.gao.gov/products/GAO-09-804T]. Washington, D.C.: June 25, 2009. Combating Nuclear Smuggling: DHS Improved Testing of Advanced Radiation Detection Portal Monitors, but Preliminary Results Show Limits of the New Technology. [hyperlink, http://www.gao.gov/products/GAO-09-655]. Washington, D.C.: May 21, 2009. Nuclear Detection: Domestic Nuclear Detection Office Should Improve Planning to Better Address Gaps and Vulnerabilities. [hyperlink, http://www.gao.gov/products/GAO-09-257]. Washington, D.C.: January 29, 2009. Department of Homeland Security: Billions Invested in Major Programs Lack Appropriate Oversight. [hyperlink, http://www.gao.gov/products/GAO-09-29]. Washington, D.C.: November 18, 2008. Combating Nuclear Smuggling: DHS's Phase 3 Test Report on Advanced Portal Monitors Does Not Fully Disclose the Limitations of the Test Results. [hyperlink, http://www.gao.gov/products/GAO-08-979]. Washington, D.C.: September 30, 2008. Combating Nuclear Smuggling: DHS Needs to Consider the Full Costs and Complete All Tests Prior to Making a Decision on Whether to Purchase Advanced Portal Monitors. [hyperlink, http://www.gao.gov/products/GAO-08-1178T]. Washington, D.C.: September 25, 2008. Combating Nuclear Smuggling: DHS's Program to Procure and Deploy Advanced Radiation Detection Portal Monitors Is Likely to Exceed the Department's Previous Cost Estimates. [hyperlink, http://www.gao.gov/products/GAO-08-1108R]. Washington, D.C.: September 22, 2008. Combating Nuclear Smuggling: DHS's Decision to Procure and Deploy the Next Generation of Radiation Detection Equipment Is Not Supported by Its Cost-Benefit Analysis. [hyperlink, http://www.gao.gov/products/GAO-07-581T]. Washington, D.C.: March 14, 2007. Combating Nuclear Smuggling: DNDO Has Not Yet Collected Most of the National Laboratories' Test Results on Radiation Portal Monitors in Support of DNDO's Testing and Development Program. [hyperlink, http://www.gao.gov/products/GAO-07-347R]. Washington, D.C.: March 9, 2007. Combating Nuclear Smuggling: DHS's Cost-Benefit Analysis to Support the Purchase of New Radiation Detection Portal Monitors Was Not Based on Available Performance Data and Did Not Fully Evaluate All the Monitors' Costs and Benefits.
[hyperlink, http://www.gao.gov/products/GAO-07-133R]. Washington, D.C.: October 17, 2006. Combating Nuclear Smuggling: Additional Actions Needed to Ensure Adequate Testing of Next Generation Radiation Detection Equipment. [hyperlink, http://www.gao.gov/products/GAO-07-1247T]. Washington, D.C.: September 18, 2007. Combating Nuclear Smuggling: DHS Has Made Progress Deploying Radiation Detection Equipment at U.S. Ports-of-Entry, but Concerns Remain. [hyperlink, http://www.gao.gov/products/GAO-06-389]. Washington, D.C.: March 22, 2006. [End of section] Footnotes: [1] National Security Presidential Directive 43/Homeland Security Presidential Directive 14, Domestic Nuclear Detection, April 15, 2005. DNDO was established in statute by the Security and Accountability for Every Port Act of 2006 (SAFE Port Act), Pub. L. No. 109-347, § 501, 120 Stat. 1884, 1932 (codified as amended at 6 U.S.C. § 591). [2] The global nuclear detection architecture is an integrated system of radiation detection equipment and interdiction activities to combat nuclear smuggling in foreign countries, at the U.S. border, and inside the United States. [3] Special nuclear material includes plutonium; uranium enriched in the isotope 233 or in the isotope 235; and any material artificially enriched by any of the foregoing. 42 U.S.C. § 2014 (2006). [4] DNDO, letter from the Acting Director of DNDO to the Chairman of the Senate Committee on Homeland Security and Governmental Affairs, February 24, 2010. [5] For a listing of past reports and testimonies, see Related Products at the end of this report. [6] GAO, Combating Nuclear Smuggling: DHS Improved Testing of Advanced Radiation Detection Portal Monitors, but Preliminary Results Show Limits of the New Technology, [hyperlink, http://www.gao.gov/products/GAO-09-655] (Washington, D.C.: May 21, 2009). [7] The National Academies comprise four organizations: the National Academy of Engineering, National Academy of Sciences, Institute of Medicine, and National Research Council. [8] National Research Council, Evaluating Testing, Costs, and Benefits of Advanced Spectroscopic Portals for Screening Cargo at Ports of Entry - Interim Report (Washington, D.C.: 2009). [9] National Research Council, Evaluating Testing, Costs, and Benefits of Advanced Spectroscopic Portals - Final Report (Washington, D.C.: 2010). [10] DHS, letter from the Secretary of Homeland Security to the Chairman and Ranking Members of the House Committee on Homeland Security, the Senate Committee on Homeland Security and Governmental Affairs, and the House and Senate Appropriations Subcommittees on Homeland Security, October 3, 2011. [11] The Nevada National Security Site was formerly known as the Nevada Test Site. [12] GAO, Combating Nuclear Smuggling: Additional Actions Needed to Ensure Adequate Testing of Next Generation Radiation Detection Equipment, [hyperlink, http://www.gao.gov/products/GAO-07-1247T] (Washington, D.C.: Sept. 18, 2007); and Combating Nuclear Smuggling: DHS's Phase 3 Test Report on Advanced Portal Monitors Does Not Fully Disclose the Limitations of the Test Results, [hyperlink, http://www.gao.gov/products/GAO-08-979] (Washington, D.C.: Sept. 30, 2008). [13] Consolidated Appropriations Act, 2008, Pub. L. No. 110-161, 121 Stat. 1844, 2069 (2007); Consolidated Security, Disaster Assistance, and Continuing Appropriations Act, 2009, Pub. L. No. 110-329, 122 Stat. 3574, 3679 (2008); Department of Homeland Security Appropriations Act, 2010, Pub. L. No. 111-83, 123 Stat. 2142, 2167 (2009).
The requirement was extended by continuing resolutions in fiscal year 2011. [14] We reviewed these criteria in 2009 and found that they would require a marginal improvement in the ability to detect threats and a large reduction in alarms for benign materials. See [hyperlink, http://www.gao.gov/products/GAO-09-655] (Washington, D.C.: May 21, 2009). [15] These criteria were contained in CBP's field validation test plan. [16] Department of Homeland Security, DNDO, Issue Management and Adjudication Summary (ASP 2010 field validation exit briefing presented by Paul Burrowes, ASP Program Manager, at the meeting of the ASP Governance Board, Washington, D.C.: January 2011). [17] This criterion applies when only naturally occurring radioactive material is present, which was the case for the first and second field validation tests. [18] DNDO, letter from the Acting Director of DNDO to the Chairman of the Senate Committee on Homeland Security and Governmental Affairs, February 24, 2010. [19] The ASP Governance Board consisted of officials from DHS Office of Policy, DHS Science and Technology Directorate, the Office of the Deputy Secretary, DHS Office of Program Accountability and Risk Management (PARM), CBP, and DNDO. It was constituted in November 2009 to examine the ASP program. [20] With one exception, the experts reached consensus on all of the observations. One expert felt that lessons learned reports were so important that acquisitions executives should have less authority to tailor or waive the requirements for such reports. [21] DHS Systems Engineering Life Cycle Version 2.0 (INTERIM), September 2010, Appendix B of DHS Instruction Manual 102-01-001 Acquisition Management Instruction/Guidebook, October 2011. [22] GAO, Standards for Internal Control in the Federal Government, [hyperlink, http://www.gao.gov/products/GAO/AIMD-00-21.3.1] (Washington, D.C.: November 1999). [23] DHS, Acquisition Decision Memorandum from the DHS Under Secretary for Management to the Acting Director of DNDO and the Acting Commissioner of CBP, July 16, 2012. [24] This statement was contained in the same memorandum directing that a lessons learned report be prepared. [25] See GAO, Homeland Security: DHS Requires More Disciplined Investment Management to Help Meet Mission Needs, [hyperlink, http://www.gao.gov/products/GAO-12-833] (Washington, D.C.: Sept. 18, 2012). The canceled programs that we identified were (1) the Secure Border Initiative Network, (2) the Electronic Records Management System, (3) the Transformation and Systems Consolidation, (4) the Grants Management Integrated Environment, and (5) the Online Tracking Information System. [26] DHS's eight Centers of Excellence for Acquisition and Program Management work with acquisition managers, program managers, and subject matter experts across DHS to strengthen acquisition and program management. [27] Department of Homeland Security, Future Years Homeland Security Program (FYHSP), Fiscal Years 2012-2016, Vol. 2, "Program Resources, Milestones, Performance Measures, and Capital Investments" (Washington, D.C.: May 2011). [28] ANSI is a not-for-profit organization that promotes and facilitates voluntary consensus standards that directly impact U.S. businesses in nearly every sector. [29] IEC is a not-for-profit organization for the preparation and publication of international standards for all electrical, electronic, and related technologies, known collectively as "electrotechnology."
[30] The Joint Research Centre is the European Commission's in-house science service center. [31] DHS Systems Engineering Life Cycle Version 2.0 (INTERIM), September 2010, Appendix B of DHS Instruction Manual 102-01-001 Acquisition Management Instruction/Guidebook, October 2011. [32] Department of Homeland Security (DHS), Office of the Chief Information Officer, Capital Planning and Investment Control (CPIC) Guide, Version 7.1 (August 2010). [hyperlink, https://learn.dau.mil/CourseWare/809481_1/resources/cpic_guide.pdf] (accessed May 3, 2013). [33] Department of Homeland Security, Domestic Nuclear Detection Office (DNDO), Advanced Spectroscopic Portal (ASP) Program Post Implementation Review/Lessons Learned (Redacted), Document number 600-ASP-119650v1.11 (Dec. 4, 2012). See the Executive Summary, which is included as appendix II of this report. [End of section] GAO's Mission: The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability. Obtaining Copies of GAO Reports and Testimony: The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's website [hyperlink, http://www.gao.gov]. Each weekday afternoon, GAO posts on its website newly released reports, testimony, and correspondence. To have GAO e-mail you a list of newly posted products, go to [hyperlink, http://www.gao.gov] and select "E-mail Updates." Order by Phone: The price of each GAO publication reflects GAO's actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO's website, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information. Connect with GAO: Connect with GAO on facebook, flickr, twitter, and YouTube. Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts. Visit GAO on the web at [hyperlink, http://www.gao.gov]. To Report Fraud, Waste, and Abuse in Federal Programs: Contact: Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]; E-mail: fraudnet@gao.gov; Automated answering system: (800) 424-5454 or (202) 512-7470. Congressional Relations: Katherine Siggerud, Managing Director, siggerudk@gao.gov: (202) 512-4400: U.S. Government Accountability Office: 441 G Street NW, Room 7125: Washington, DC 20548. Public Affairs: Chuck Young, Managing Director, youngc1@gao.gov: (202) 512-4800: U.S. Government Accountability Office: 441 G Street NW, Room 7149: Washington, DC 20548. [End of document]