This is the accessible text file for GAO report number GAO-10-774 entitled 'Defense Acquisitions: DOD Needs to Develop Performance Criteria to Gauge Impact of Reform Act Changes and Address Workforce Issues' which was released on July 29, 2010. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. Report to the Committee on Armed Services, U.S. Senate: United States Government Accountability Office: GAO: July 2010: Defense Acquisitions: DOD Needs to Develop Performance Criteria to Gauge Impact of Reform Act Changes and Address Workforce Issues: GAO-10-774: GAO Highlights: Highlights of GAO-10-774, a report to the Committee on Armed Services, U.S. Senate. Why GAO Did This Study: In May 2009, Congress passed the Weapon Systems Acquisition Reform Act of 2009 (Reform Act). 
The Reform Act contains a number of systems engineering and developmental testing requirements that are aimed at helping weapon programs establish a solid foundation from the start of development. GAO was asked to examine (1) DOD’s progress in implementing the systems engineering and developmental testing requirements, (2) views on the alignment of the offices of the Directors of Systems Engineering and Developmental Test and Evaluation, and (3) challenges in strengthening systems engineering and developmental testing activities. In conducting this work, GAO analyzed implementation status documentation and obtained opinions from current and former DOD systems engineering and testing officials on the placement of the two offices as well as improvement challenges. What GAO Found: DOD has implemented or is implementing the Reform Act requirements related to systems engineering and developmental testing. Several foundational steps have been completed. For example, new offices have been established, directors have been appointed for both offices, and the directors have issued a joint report that assesses their respective workforce capabilities and 42 major defense acquisition programs. Many other requirements that have been implemented will require sustained efforts by the directors’ offices, such as approving systems engineering and developmental testing plans, as well as reviewing these efforts on specific weapon programs. DOD is studying the option of allowing the Director, Developmental Test and Evaluation, to serve concurrently as the Director of the Test Resource Management Center. The directors have not yet developed joint guidance for assessing and tracking acquisition program performance of systems engineering and developmental testing activities. 
It is unclear whether the guidance will include specific performance criteria that address long-standing problems and program risks, such as those related to concurrency of development and production activities and adequacy of program resources. Current and former systems engineering and developmental testing officials offered varying opinions on whether the new directors’ offices should have been placed under the Director of Defense Research and Engineering organization—an organization that focuses primarily on developing and transitioning technologies to acquisition programs. The Director of Defense Research and Engineering believes aligning the offices under his organization helps address congressional and DOD desires to increase emphasis on and strengthen activities prior to the start of a new acquisition program. Most of the officials GAO spoke with believe the two offices should report directly to the Under Secretary for Acquisition, Technology and Logistics or otherwise be more closely aligned with acquisition programs because most of their activities are related to weapon programs. They also believe cultural barriers and staffing issues may limit the effectiveness of the two offices under the current organizational structure. Currently, DOD is not reporting to Congress on how successfully the directors are effecting program changes, making it difficult to determine if the current placement of the offices makes sense or if the Reform Act is having an impact. The military services face a number of challenges as they try to strengthen systems engineering and developmental testing activities on acquisition programs. Although the services believe they have enough staff to perform both of these activities, they have not been able to clearly identify the number of staff that are actually involved. 
The Director of Developmental Test and Evaluation does not believe the military services have enough testing personnel and is concerned that DOD does not have the capacity to train the large influx of contractors that are expected to be converted to government employees. What GAO Recommends: GAO recommends that DOD develop performance criteria to assess program risk; track the extent to which directors’ recommendations are implemented; address identified workforce and training needs; and report to Congress on the status of these efforts. DOD concurred with the recommendations. View [hyperlink, http://www.gao.gov/products/GAO-10-774] or key components. For more information, contact Michael J. Sullivan at (202) 512-4841 or sullivanm@gao.gov. [End of section] Contents: Letter: Background: DOD Has Made Progress in Implementing Reform Act Requirements, but Has Not Developed Performance Criteria to Track Success: Experts Offer Varying Opinions on the Placement of the Systems Engineering and Developmental Test and Evaluation Offices: Military Services Face Workforce and Resource Challenges as They Strive to Strengthen Their Systems Engineering and Developmental Testing Efforts: Conclusions: Recommendations for Executive Action: Agency Comments and Our Evaluation: Appendix I: Comments from the Department of Defense: Tables: Table 1: Implementation Status of Key Reform Act Provisions Related to Systems Engineering and Developmental Testing: Table 2: Military Service Systems Planning, Research Development, and Engineering and Developmental Testing Personnel: Figures: Figure 1: Major Changes in Organizational Placement of Systems Engineering and Developmental Testing Activities within the Office of the Secretary of Defense: Figure 2: Options for Placement of Directors' Offices for Systems Engineering and Developmental Test and Evaluation: Abbreviations: DOD: Department of Defense: AT&L: Acquisition, Technology and Logistics: [End of section] United States Government 
Accountability Office: Washington, DC 20548: July 29, 2010: The Honorable Carl Levin: Chairman: The Honorable John McCain: Ranking Member: Committee on Armed Services: United States Senate: For years, GAO has reported on significant cost overruns on the Department of Defense's (DOD) major weapon system acquisition programs. Even though DOD has incorporated previous legislative provisions into its acquisition policies, such as requiring weapon programs to use mature technologies from the start of development, programs are still experiencing cost and schedule problems. The Senate Armed Services Committee reported that since the beginning of 2006, nearly half of DOD's largest acquisition programs have exceeded Nunn-McCurdy[Footnote 1] cost-growth standards established by Congress. DOD is now faced with making tough decisions about the viability of some of its weapon system programs. In 2009, for example, the Secretary of Defense proposed canceling or significantly curtailing weapon programs with a projected cost of at least $126 billion. Cost and schedule overruns can be attributed to a number of factors that occur early in an acquisition, including poorly analyzed requirements, design instability, and inadequate systems engineering and testing. In May 2009, Congress passed the Weapon Systems Acquisition Reform Act of 2009 (Reform Act),[Footnote 2] aimed at improving DOD's organization and procedures for the acquisition of major weapon systems. This legislation places more emphasis on activities that should occur early in weapon systems development, including those related to systems engineering[Footnote 3] and developmental testing, in order to help establish a solid program foundation from the start of development.
The Senate Armed Services Committee asked us to examine (1) DOD's progress in implementing systems engineering and developmental testing requirements called for in the Reform Act, (2) views on the alignment of the offices of the Director of Systems Engineering and the Director of Developmental Test and Evaluation within the Office of the Secretary of Defense, and (3) challenges in strengthening systems engineering and developmental testing activities. In conducting our work, we interviewed officials and collected documents from the offices of the Director of Systems Engineering and the Director of Developmental Test and Evaluation in order to learn the status of their efforts to implement the Reform Act legislation and challenges they are addressing. We also interviewed officials from various offices within the Office of the Under Secretary of Defense for Acquisition, Technology and Logistics (AT&L); the office of the Director, Operational Test and Evaluation; each of the military services; the Defense Science Board; as well as former DOD systems engineering and developmental testing executives to obtain their opinions on the alignment of the two offices within the Office of the Secretary of Defense and potential challenges. We conducted this performance audit from December 2009 to July 2010 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. Background: Systems engineering and test and evaluation are critical parts of the weapon system acquisition process, and how well these activities are conducted early in the acquisition cycle can greatly affect program outcomes.
Systems engineering translates customer needs into specific product requirements for which requisite technological, software, engineering, and production capabilities can be identified through requirements analysis, design, and testing. Early systems engineering provides the knowledge that weapon system requirements are achievable with available resources such as technologies, time, people, and money. It allows a product developer to identify and resolve performance and resource gaps before product development begins by reducing requirements, deferring them to the future, or increasing the estimated cost for the weapon system's development. Systems engineering plays a fundamental role in the establishment of the business case for a weapon acquisition program by providing information to DOD officials to make tradeoffs between requirements and resources. Systems engineering is then applied throughout the acquisition process to manage the engineering and technical risk in designing, developing, and producing a weapon system. The systems engineering processes should be applied prior to the start of a new weapon acquisition program and then continuously throughout the life cycle. Test and evaluation provides information about the capabilities of a weapon system and can assist in managing program risk. There are generally two broad categories of testing: developmental and operational. Developmental testing is used to verify the status of technical progress, substantiate achievement of contract technical performance, and certify readiness for initial operational testing. Early developmental testing reduces program risks by evaluating performance at progressively higher component and subsystem levels, thus allowing program officials to identify problems early in the acquisition process. Developmental testing officials in the Office of the Secretary of Defense and the military services provide guidance and assistance to program managers on how to develop sound test plans.
The amount of developmental testing actually conducted, however, is controlled by the program manager and the testing requirements explicitly specified in the development contract. In contrast, operational testing determines if a weapon system provides operationally useful capability to the warfighter. It involves field testing a weapon system, under realistic conditions, to determine the effectiveness and suitability[Footnote 4] of the weapon for use in combat by military users, and the evaluation of the results of such tests. DOD's Director of Operational Test and Evaluation conducts independent assessments of programs and reports the results to the Secretary of Defense and Congress. In 2008, the Defense Science Board reported that operational testing over the previous 10 years showed that there had been a dramatic increase in the number of weapon systems that did not meet their suitability requirements. The board found that failure rates were caused by several factors, notably the lack of a disciplined systems engineering process early in development and a robust reliability growth program. The board also found that weaknesses in developmental testing, acquisition workforce reductions and retirements, limited government oversight, increased complexity of emerging weapon systems, and increased reliance on commercial standards (in lieu of military specifications and standards) all contributed to these failure rates. For example, over the last 15 years, all service acquisition and test organizations experienced significant personnel cuts, including the loss of a large number of the most experienced technical and management personnel, among them subject matter experts, without an adequate replacement pipeline. The services now rely heavily on contractors to help support these activities.
Over the past two decades, the prominence of the developmental testing and systems engineering communities within the Office of the Secretary of Defense has continuously evolved, as the following examples illustrate. * In 1992, a systems engineering directorate did not exist and the developmental test function was part of the Office of the Director of Test and Evaluation, which reported directly to the Under Secretary of Defense for Acquisition. At that time, the director had direct access to the Under Secretary on an array of issues related to test policy, test assets, and the workforce. * In 1994, the Development Test, Systems Engineering and Evaluation office was formed. This organization effectively expanded the responsibilities of the former testing organization to formally include systems engineering. The organization had two deputy directors: the Deputy Director, Development Test and Evaluation, and the Deputy Director, Systems Engineering. This organization was dissolved in 1999. * From 1999 to 2006, systems engineering and developmental testing responsibilities were aligned under a variety of offices. The responsibility for managing test ranges and resources, for example, was transferred to the Director of Operational Test and Evaluation. This function was later moved to the Test Resource Management Center, which reports directly to AT&L, where it remains today. In 2004, a Director of Systems Engineering was re-established and then in 2006 this became the System and Software Engineering Directorate. Developmental testing activities were part of this directorate's responsibilities. As a result, systems engineering and developmental testing issues were reported indirectly to AT&L through the Deputy Under Secretary for Acquisition and Technology. Congress passed the Weapon Systems Acquisition Reform Act of 2009 (Reform Act)--the latest in a series of congressional actions taken to strengthen the defense acquisition system. 
The Reform Act establishes a Director of Systems Engineering and a Director of Developmental Test and Evaluation within the Office of the Secretary of Defense and defines the responsibilities of both offices. The Reform Act requires the services to develop, implement, and report on their plans for ensuring that systems engineering and developmental testing functions are adequately staffed to meet the Reform Act requirements. In addition, it requires the directors to report to Congress on March 31 of each year on military service and major defense acquisition program systems engineering and developmental testing activities from the previous year. For example, the report is to include a discussion of the extent to which major defense acquisition programs are fulfilling the objectives of their systems engineering and developmental test and evaluation master plans, as well as provide an assessment of the department's organization and capabilities to perform these activities. Figure 1 shows some of the major reorganizations over the past two decades, including the most recent change where DOD decided to place the two new directors' offices under the Director of Defense Research and Engineering. Figure 1: Major Changes in Organizational Placement of Systems Engineering and Developmental Testing Activities within the Office of the Secretary of Defense: [Refer to PDF for image: 3 organizational charts] 1992[A]: Top level: Under Secretary of Defense for Acquisition: Second level, reporting to Under Secretary of Defense for Acquisition: * Director, Test & Evaluation[B] (Developmental Test & Evaluation Activities); * Director, Defense Research & Engineering. 2006: Top level: Under Secretary of Defense for Acquisition, Technology & Logistics: - Test Resource Management Center. 
Second level, reporting to Under Secretary of Defense for Acquisition, Technology & Logistics: * Director, Defense Research & Engineering; * Deputy Under Secretary for Acquisition & Technology: - Director, Systems & Software Engineering (Systems Engineering Activities); -- Deputy Director, Developmental Test & Evaluation (Developmental Test & Evaluation Activities). 2009: Top level: Under Secretary of Defense for Acquisition, Technology & Logistics: - Test Resource Management Center. Second level, reporting to Under Secretary of Defense for Acquisition, Technology & Logistics: * Assistant Secretary of Defense (Acquisition); * Director, Defense Research & Engineering: - Director, Systems Engineering (Systems Engineering Activities); - Director, Developmental Test & Evaluation (Developmental Test & Evaluation Activities). Source: GAO presentation of Defense Science Board and DOD information. [A] There was no systems engineering office within the Office of the Secretary of Defense in 1992. DOD established a combined developmental testing and systems engineering office in 1994. [B] Director, Test and Evaluation, had oversight responsibilities for developmental and live-fire testing, weapon system assessments, and test facilities and resources. [End of figure] DOD Has Made Progress in Implementing Reform Act Requirements, but Has Not Developed Performance Criteria to Track Success: DOD has made progress in implementing the systems engineering and developmental test and evaluation provisions of the Reform Act, but has not yet developed performance criteria that would help assess the effectiveness of the changes. Some requirements, such as the establishment of the two new offices, have been fully implemented. The implementation of other requirements, such as the review and approval of systems engineering and developmental test and evaluation plans, has begun but requires sustained efforts. The department has not fully implemented other requirements. 
For example, DOD has begun development of joint guidance that will identify measurable performance criteria to be included in the systems engineering and developmental testing plans. DOD initially decided that one discretionary provision of the act--naming the Director of Developmental Test and Evaluation also as the Director of the Test Resource Management Center--would not be implemented. However, the Director of Defense Research and Engineering is currently examining the implications of this organizational change. It will be several years before the full impact of the Reform Act provisions is known. The offices of the Directors of Systems Engineering and Developmental Test and Evaluation were officially established by the Under Secretary of Defense for AT&L in June 2009 to be his principal advisors on systems engineering and developmental testing matters. The directors took office 3 months and 9 months later, respectively, and are working on obtaining the funding, workforce, and office space needed to accomplish their responsibilities. The directors have also completed evaluations of the military services' organizations and capabilities for conducting systems engineering and developmental testing, and identified areas for improvement.[Footnote 5] These evaluations were based on reports provided by the services that were also required by the Reform Act.[Footnote 6] As shown in table 1, many of the requirements that have been implemented will require ongoing efforts. Table 1: Implementation Status of Key Reform Act Provisions Related to Systems Engineering and Developmental Testing: Reform Act provision: Establish office, appoint director; Systems engineering: Completed; ongoing efforts to obtain needed staff, budget, and office space; Developmental testing: Completed; ongoing efforts to obtain needed staff, budget, and office space.
Reform Act provision: Act as principal advisor to AT&L and subject to the supervision of AT&L; Systems engineering: Ongoing efforts; reports indirectly to AT&L through the Director, Defense Research and Engineering, on major defense acquisition programs; Developmental testing: Ongoing efforts; reports indirectly to AT&L through the Director, Defense Research and Engineering, on major defense acquisition programs. Reform Act provision: Directors should coordinate closely to fully integrate developmental testing and systems engineering activities in DOD; Systems engineering: Ongoing effort; Developmental testing: Ongoing effort. Reform Act provision: Develop policies and guidance; Systems engineering: Ongoing effort. In fiscal year 2009, published new policy that expands reliability, availability, and maintainability guidance for acquisition programs and updated the Defense Acquisition Guidebook chapter on systems engineering. Also, updating systems engineering plan guidance (to be released in 2010) and the Guide for Integrating Systems Engineering into DOD Acquisition Contracts (to be released in fiscal year 2011); Developmental testing: Ongoing effort; in fiscal year 2009, published guidance on incorporating test and evaluation requirements into acquisition contracts. Updated required content in test and evaluation strategy and master plan documents to include reliability factors. Reform Act provision: Review, approve acquisition planning documents; Systems engineering: Ongoing effort; in fiscal year 2009 reviewed 22 and approved 16 systems engineering plans; Developmental testing: Ongoing effort; in fiscal year 2009 reviewed and approved 25 developmental test and evaluation plans. Reform Act provision: Monitor, review activities of major acquisition programs; Systems engineering: Ongoing effort; in fiscal year 2009, reviewed systems engineering activities on 35 programs. 
In 2009, participated in 20 technical reviews; Developmental testing: Ongoing effort; in fiscal year 2009 reviewed developmental testing activities on 17 programs. Reform Act provision: Provide advocacy, oversight, and guidance for respective DOD acquisition workforce career fields; Systems engineering: Ongoing effort; acts as the principal leader in DOD for over 45,000 people in two engineering career fields. Assessment of systems engineering competencies is under way; Developmental testing: Ongoing effort; acts as DOD's principal leader for over 7,000 people in the test and evaluation acquisition career field. In fiscal years 2009 and 2010 updated education and training requirements and validated certification requirements. Reform Act provision: Review military services organizations and capabilities; identify needed changes or improvements; Systems engineering: Ongoing effort; completed evaluation of service reports and identified weaknesses in staffing levels and expertise; Developmental testing: Ongoing effort; completed evaluation of service reports and identified weaknesses in staffing levels and expertise. Reform Act provision: Director of Developmental Test and Evaluation may serve as Director of the Test Resource Management Center; Systems engineering: Not applicable; Developmental testing: Discretionary provision not exercised initially; however, decision is being reexamined. Reform Act provision: Prepare joint annual report to Congress; Systems engineering: Ongoing effort. First report issued on March 31, 2010. Future reports are required by March 31 each year. Reform Act provision: Issue joint guidance on: * the development and tracking of performance criteria; * use of developmental test and evaluation to measure achievement of performance objectives; * a system to store and track achievement of performance criteria and objectives; Systems engineering: Not yet completed; efforts under way to develop criteria. Source: GAO presentation of DOD data.
[End of table] The directors have the responsibility for reviewing and approving systems engineering and developmental test and evaluation plans as well as the ongoing responsibility to monitor the systems engineering and developmental test and evaluation activities of major defense acquisition programs. During fiscal year 2009, the Director of Systems Engineering reviewed 22 systems engineering plans and approved 16, while the Director of Developmental Test and Evaluation reviewed and approved 25 developmental test and evaluation plans within the test and evaluation master plans. Both offices are monitoring and reviewing activities on a number of major acquisition programs, including the Virginia Class Submarine, the Stryker Family of Vehicles, and the C-130 Avionics Modernization Program. Once their offices are fully staffed, the directors plan to increase efforts in reviewing and approving applicable planning documents and monitoring the activities of about 200 major defense acquisition and information system programs. Evaluations of 42 weapon systems[Footnote 7] were included in the directors' first annual joint report to Congress. The individual systems engineering program assessments were consistent in that they typically included information on 10 areas, including requirements, critical technologies, technical risks, reliability, integration, and manufacturing. In some cases, the assessments also included an overall evaluation of whether the program was low, medium, or high risk; the reasons why; and a general discussion of recommendations or efforts the director has made to help program officials reduce any identified risk. Examples include the following. * In an operational test readiness assessment of the EA-18G aircraft, the Director of Systems Engineering found multiple moderate-level risks related to software, communications, and mission planning and made recommendations to reduce the risks.
The program acted on the risks and recommendations identified in the assessment and delayed the start of initial operational testing by 6 weeks to implement the fixes. It has completed initial operational testing and was found to be effective and suitable by Navy testers. The Director of Operational Test and Evaluation rated the system effective but not suitable, and stated that follow-on testing has been scheduled to verify correction of noted deficiencies. The program received approval to enter full rate production and is rated as a low risk in the joint annual report. * The systems engineering assessment rated the Global Hawk program as high risk pending the determination of actual system capability; it also stated that there is a high probability that the system will fail operational testing. The assessment cited numerous issues, including questions regarding the system's ability to meet mission reliability requirements, poor system availability, and the impact of simultaneous weapon system block builds (concurrency). Despite the director's concerns and efforts to help the program office develop a reliability growth plan for Global Hawk, no program funding has been allocated to support reliability improvements. * The Expeditionary Fighting Vehicle assessment did not include an overall evaluation of risk. The assessment noted that the program was on track to meet the reliability key performance parameter of 43.5 hours mean time between operational mission failure. Problems related to meeting this and other reliability requirements were a primary reason why the program was restructured in 2007.
However, the assessment did not address the high degree of concurrency between development and production, which will result in a commitment to fund 96 low-rate initial procurement vehicles prior to demonstrating that the vehicle can meet the reliability threshold value at initial operational test and evaluation, currently scheduled for completion by September 2016.[Footnote 8] Developmental testing assessments covered fewer programs and were not as structured as those provided by the systems engineering office in that no standard categories of information were included in each assessment. Part of the reason is that the office of the Director of Developmental Test and Evaluation was still developing the necessary expertise to review and provide formal assessments of programs. For the programs that were reviewed, the assessments included a status of developmental testing activities on programs and in some cases an assessment of whether the program was low, medium, or high risk. For example, the Director of Developmental Test and Evaluation supported an assessment of operational test readiness for the C-5 Reliability Enhancement and Reengining Program. The assessment stated that due to incomplete testing and technical issues found in developmental testing, there is a high risk of failure in operational testing. The assessment recommended that the program resolve these issues before beginning operational testing. The Reform Act also requires that the Director of Systems Engineering develop policies and guidance on, among other things, the use of systems engineering principles and best practices and the Director of Developmental Test and Evaluation develop policies and guidance on, among other things, the conduct of developmental testing within DOD.
[Footnote 9] The directors have issued some additional policies to date, such as expanded guidance on addressing reliability and availability on weapon programs and on incorporating test requirements in acquisition contracts. The directors plan to update current guidance and issue additional guidance in the future. According to DOD officials, there are over 25 existing documents that provide policy and guidance for systems engineering and developmental testing. The directors also have an ongoing responsibility to advocate for and support their respective DOD acquisition workforce career fields, and have begun examining the training and education needs of these workforces. Two provisions, one of which is discretionary, have not been completed. The Reform Act requires that the directors, in coordination with the newly established office of the Director for Program Assessments and Root Cause Analysis, issue joint guidance on the development of detailed, measurable performance criteria that major acquisition programs should include in their systems engineering and testing plans. The performance criteria would be used to track and measure the achievement of specific performance objectives for these programs, giving decision makers a clearer understanding of each program's performance and progress. The offices have begun efforts to develop these policies and guidance, but specific completion dates have not been identified. At this time, it is unclear whether the guidance will include specific performance criteria that should be consistently tracked on programs and any risks associated with these programs, such as ones related to technology maturity, design stability, manufacturing readiness, concurrency of development and production activities, prototyping, and adequacy of program resources. Finally, the Reform Act gives DOD the option of permitting the Director of Developmental Test and Evaluation to serve as the Director of the Test Resource Management Center.
DOD initially decided not to exercise this option. However, the Director of Defense Research and Engineering recently stated that his organization is examining the possibility of consolidating the offices. The director stated that it makes sense to combine the two offices because it would merge test oversight and test resource responsibilities under one organization, but the ultimate decision will be based on whether there are any legal obstacles to combining the two offices. While most of the Reform Act's requirements focus on activities within the Office of the Secretary of Defense, the military services are ultimately responsible for ensuring that their weapon systems start off with strong foundations. To that end, in November 2009, the services, in reports to the Directors of Systems Engineering and Developmental Test and Evaluation, identified plans for ensuring that appropriate resources are available for conducting systems engineering and developmental testing activities. The individual reports also highlighted management initiatives undertaken to strengthen early weapon acquisition activities. For example, the Army is establishing a center at Aberdeen Proving Ground that will focus on improving reliability growth guidance, standards, methods, and training for Army acquisition programs. The Navy has developed criteria, including major milestone reviews and other gate reviews, to assess the "health" of testing and evaluation at various points in the acquisition process. The Air Force has undertaken an initiative to strengthen requirements setting, systems engineering, and developmental testing activities prior to the start of a new acquisition program. Air Force officials believe this particular initiative will meet the development planning requirements of the Reform Act. 
Experts Offer Varying Opinions on the Placement of the Systems Engineering and Developmental Test and Evaluation Offices: Experts provided different viewpoints on the proper placement of the new systems engineering and developmental test and evaluation offices, with some expressing concern that as currently placed, the offices will wield little more power or influence than they had prior to the passage of the Reform Act. According to the Director of Defense Research and Engineering, the Under Secretary of Defense for AT&L placed the new offices under his organization because the department wanted to put additional emphasis on systems engineering and developmental testing prior to the start of a weapons acquisition program. The director believes this is already occurring and that both offices will continue to have a strong relationship with acquisition programs even though they do not report directly to an organization with significant involvement with major defense acquisition programs. However, many current and former DOD systems engineering and developmental testing officials we spoke with believe the offices should be closely linked to weapon acquisition programs because most of their activities are related to those programs. Similarly, the Defense Science Board recommended that a developmental testing office be established and report directly to an organization that has significant involvement with major defense acquisition programs. In addition, officials we spoke with believe several other significant challenges, including those related to staffing and the culture of the Defense Research and Engineering organization, are already negatively affecting the offices' effectiveness. DOD has not established any performance criteria that would help gauge the success of the new directors' offices, making it difficult to determine if the offices are properly aligned within the department or if the Reform Act is having an impact on program outcomes. 
DOD Aligned New Systems Engineering and Developmental Test and Evaluation Offices with the Research and Engineering Organization: After the passage of the Reform Act, DOD considered several options on where to place the new offices of the Director of Systems Engineering and Director of Developmental Test and Evaluation. According to an official who helped evaluate potential alternatives, DOD could have aligned the offices under AT&L in several different ways (see figure 2). For example, the offices could have reported directly to the Under Secretary of AT&L or indirectly to the Under Secretary of AT&L either through the Assistant Secretary of Defense (Acquisition)[Footnote 10] or the Director of Defense Research and Engineering. DOD decided to place the offices under the Director of Defense Research and Engineering, an organization that previously primarily focused on science and technology issues. Figure 2: Options for Placement of Directors' Offices for Systems Engineering and Developmental Test and Evaluation: [Refer to PDF for image: organization chart] Option 1: Top level: Under Secretary of Defense for Acquisition, Technology & Logistics (USD AT&L). Second level: * Director, Developmental Test & Evaluation (Direct report to USD AT&L); * Director, Systems Engineering (Direct report to USD AT&L); * Assistant Secretary of Defense (Acquisition); * Director, Defense Research & Engineering. Option 2: Top level: Under Secretary of Defense for Acquisition, Technology & Logistics (USD AT&L). Second level: * Assistant Secretary of Defense (Acquisition): - Director, Developmental Test & Evaluation (Indirect report to USD AT&L); - Director, Systems Engineering (Indirect report to USD AT&L); * Director, Defense Research & Engineering. Option 3: Top level: Under Secretary of Defense for Acquisition, Technology & Logistics (USD AT&L). 
Second level: * Assistant Secretary of Defense (Acquisition); * Director, Defense Research & Engineering: - Director, Developmental Test & Evaluation (Indirect report to USD AT&L); - Director, Systems Engineering (Indirect report to USD AT&L). Source: DOD data; GAO analysis and presentation. [End of figure] Director of Defense Research and Engineering Believes Offices Are Properly Aligned: The Director of Defense Research and Engineering is aware of the challenges of placing the offices under an organization whose primary mission is to develop and transition technologies to acquisition programs, but believes that the current placement makes sense given congressional and DOD desires to place more emphasis on activities prior to the start of a new acquisition program. He stated that the addition of systems engineering and developmental testing not only stretches the role and mission of his organization, but also strengthens the organization's role in acquisitions because it helps give the organization's research staff another point of view in thinking about future technologies and systems. He plans for the offices to perform both assessment and advisory activities, including: * providing risk assessments of acquisition programs for the Defense Acquisition Board, * continuing to help programs succeed by providing technical insight and assisting the programs in the development of the systems engineering plan and the test and evaluation master plan, and: * educating and assisting researchers to think through new concepts or technologies using systems engineering to inform fielding and transition strategies. According to the Director of Defense Research and Engineering, the offices are already performing some of these functions. For example, the new directors have provided technical input to the Defense Acquisition Board on various weapons programs. 
The director stated the systems engineering organization is reviewing manufacturing processes and contractor manufacturing readiness for weapons programs such as the Joint Strike Fighter. In addition, a developmental testing official stated that the office is assisting the Director of Defense Research and Engineering's Research Directorate in conducting technology readiness assessments and helping programs identify the trade spaces for testing requirements while reviewing the test and evaluation master plan. The director believes the value of having the offices perform both assessment and advisory activities is that they can look across the acquisition organization and identify programs that are succeeding from a cost, schedule, and performance perspective and identify common threads or trends that enable a program to succeed. Conversely, they could identify common factors that make programs fail. The Director of Defense Research and Engineering identified three challenges that he is trying to address in order for systems engineering and developmental testing to have a more positive influence on weapon system outcomes. First, the director would like to improve the technical depth of the systems engineering and developmental testing offices. Both functions have atrophied over the years and need to be revitalized. This will require the offices to find highly qualified people to fill the positions, which will not be easy. Second, the director wants to improve the way the Defense Research and Engineering organization engages with other DOD organizations that are involved in weapon system acquisition. The director noted that there are a lot of players and processes involved in weapon acquisition and that the systems engineering office can play a large role in facilitating greater interaction. Third, the director would like the Defense Research and Engineering organization to find better ways to shape, engage with, contract with, and get information from the defense industrial base. 
In addition to the three challenges, it will also be difficult to determine whether the two new offices are having a positive impact on weapon system outcomes. The Directors of Systems Engineering and Developmental Test and Evaluation are not reporting the number of recommendations implemented by program managers or the impact the recommendations have had on weapon programs, which would allow senior leaders to gauge the success of the two offices. This type of information could help the Under Secretary of AT&L determine if the offices need to be placed under a different organization, if the offices need to place more emphasis on advisory or assessment activities, and if the Reform Act is having an impact on program outcomes. Most Experts Believe Offices Would Be Better Aligned under an Acquisition Organization: The vast majority of current and former DOD systems engineering and test officials we spoke with were opposed to the placement of the offices under the Director of Defense Research and Engineering. Their chief concern is that the mission of the Director of Defense Research and Engineering organization is primarily focused on developing new technologies and transitioning those technologies to acquisition programs. While they recognize that the systems engineering and developmental testing offices need to be involved in activities prior to the official start of a new weapons program, they believe the offices' expertise should be focused on helping DOD acquisition programs establish doable requirements given the current state of technologies, not on the technologies themselves. Therefore, they believe the offices would be more appropriately placed under the newly established offices of the Principal Deputy Under Secretary of Defense for AT&L or the Assistant Secretary of Defense for Acquisition, whose missions are more closely aligned with acquisition programs. 
Some officials we spoke with believe that a cultural change involving the focus and emphasis of the office of the Director of Defense Research and Engineering will have to take place in order for that organization to fully support its role in overseeing acquisition programs and improving the prominence of the two new offices within the department. However, these same officials believe that this cultural change is not likely to occur and that the Director of Defense Research and Engineering will continue to focus primarily on developing and transitioning new technologies to weapon programs. Therefore, the offices may not get sufficient support and resources or have the clout within DOD to effect change. One former systems engineering official pointed out that the historic association of systems engineering with the Director of Defense Research and Engineering does not bode well for the systems engineering office. Based upon his experience, the Director of Defense Research and Engineering's focus and priorities resulted in a fundamental change in philosophy for the systems engineering mission, the virtual elimination of a comprehensive focus on program oversight or independent identification of technical risk, and a reduction in systems engineering resources. In short, he found that the Director of Defense Research and Engineering consistently focused on science and technology, in accordance with the organization's charter, with systems engineering being an afterthought. Likewise, current and former developmental testing officials are concerned about the Director of Defense Research and Engineering's support for developmental testing activities. They identified several staffing issues that they believe are key indicators of a lack of support. 
* First, they pointed out that it took almost 9 months from the time the Director of Developmental Test and Evaluation office was established before a new director was in place, compared to 3 months to place the Director of Systems Engineering. If developmental testing were a priority, officials believe that the Director of Defense Research and Engineering should have filled the position earlier. * Second, test officials believe the Director of Developmental Test and Evaluation office needs to have about the same number of staff as the offices of the Director of Systems Engineering and the Director of Operational Test and Evaluation. According to officials, DOD currently plans to have about 70 people involved with developmental testing activities, 180 people for systems engineering, and 250 for operational testing. However, testing officials believe the offices should be roughly the same size given that developmental testing will cover the same number of programs as systems engineering and operational testing and that roughly 80 percent of all testing activities are related to developmental tests, with the remaining 20 percent being for operational tests. * Third, even though the Director of Developmental Test and Evaluation expects the office to grow to about 70 people by the end of fiscal year 2011, currently there are only 30 people on board. The director believes there are a sufficient number of qualified people seeking positions and therefore the office could be ramped up more quickly. * Finally, the Director of Developmental Test and Evaluation stated that his office has only one senior-level executive currently on staff who reports to him and that there are no plans to hire more for the 70-person organization. The director believes it is crucial that the organization have more senior-level officials because of the clout they carry in the department. 
The director believes that the lack of an adequate number of senior executives in the office weakens its ability to work effectively with or influence decisions made by other DOD organizations. Further, officials from other testing organizations, as well as the systems engineering office, indicated they have two or more senior executive-level employees. A May 2008 Defense Science Board report, which was focused on how DOD could rebuild its developmental testing activities, recommended that developmental testing be an independent office that reports directly to the Deputy Under Secretary of Defense (Acquisition and Technology). At that time, according to the report, there was no office within the Office of the Secretary of Defense with comprehensive developmental testing oversight responsibility, authority, or staff to coordinate with operational testing. In addition, the existing residual organizations lacked the clout to provide development test guidance and developmental testing was not considered to be a key element in AT&L system acquisition oversight. According to the study director, placing the developmental testing office under the Director of Defense Research and Engineering does not adequately position the new office to perform the oversight of acquisition programs. Military Services Face Workforce and Resource Challenges as They Strive to Strengthen Their Systems Engineering and Developmental Testing Efforts: The military services, the Directors of Systems Engineering and Developmental Test and Evaluation, and we have identified a number of workforce and resource challenges that the military services will need to address to strengthen their systems engineering and developmental testing activities. For example, it is unclear whether the services have enough people to perform both systems engineering and developmental testing activities. 
Even though the services reported to the directors that they have enough people, they do not have accurate information on the number of people performing these activities. The Director of Developmental Test and Evaluation disagreed with the services' assertions, but did not know how many additional people were needed. Service officials have also expressed concern about the department's ability to train individuals who do not meet requisite certification requirements on a timely basis[Footnote 11] and about their ability to obtain additional resources to improve test facilities. The military services were required by the Reform Act to report on their plans to ensure that they have an adequate number of trained systems engineering and developmental testing personnel and to identify additional authorities or resources needed to attract, develop, train, and reward their staff. In November 2009, the military services submitted their reports on their findings to the respective directors within the Office of the Secretary of Defense. In general, the services concluded that even with some recruiting and retention challenges, they have an adequate number of personnel to conduct both systems engineering and developmental testing activities (see table 2 below). According to service officials, this determination was based on the fact that no program offices identified a need for additional staffing to complete these activities. The reports also stated the services generally have sufficient authorities to attract and retain their workforce. In DOD's first annual joint report to Congress, the Director of Developmental Test and Evaluation did not agree with the military services' assertion that they have enough staff to perform the full range of developmental testing activities. The director does not know how many more personnel are needed, but indicated that the office plans to work with the services to identify additional workforce needs. 
The Director of Systems Engineering agreed with the services' reports that they have adequate staffing to support systems engineering activities required by current policy. According to the director, this was based on the 35,000 current personnel identified in the Systems Planning, Research Development, and Engineering workforce--a generic workforce category that includes systems engineering activities--as well as the services' plans to hire over 2,500 additional personnel into this same workforce category over the next several years.

Table 2: Military Service Systems Planning, Research Development, and Engineering and Developmental Testing Personnel:

             Systems Planning, Research            Developmental
             Development, and Engineering[A]       testing[B]
Service      Civilian   Military   Total           Civilian   Military   Total
Air Force    5,004      1,871      6,875           1,354      1,276      2,630
Army         10,107     107        10,214          2,131      11         2,142
Navy         17,885     201        18,086          2,381      450        2,831
Total        32,996     2,179      35,175          5,866      1,737      7,603

Source: GAO presentation of DOD data. Note: Developmental testing data for all three services are as of September 2009. The Air Force, Army, and Navy systems engineering data are as of June 2009, September 2009, and November 2009, respectively. [A] The military services identified their systems engineering personnel as those coded as Program Systems Engineers and Systems Engineers (a general classification for other types of engineers) in the Systems Planning, Research Development, and Engineering workforce classification. 
[B] Some personnel conducting work in developmental testing may not be included because their work is primarily conducted in another area. [End of table] Although not clearly articulated in the services' reports, military service officials acknowledged that the personnel data in their reports may not be entirely accurate. For example, officials believe the systems engineering numbers identified in table 2 overstate the number of people actually performing systems engineering activities because that particular career field classification is a generic category that includes all types of engineers. The developmental test workforce shown in the table does not completely reflect the number of people who actually perform developmental testing activities because the information provided by the military services identifies only personnel in the test and evaluation career field. Service officials told us that there are many other people performing these activities who are identified in other career fields. The Director of Developmental Test and Evaluation believes these other people may not be properly certified and that, in the case of contractors, they do not possess certifications that are equivalent to the certification requirements for government personnel. The director plans to request another report from the services in fiscal year 2010. This report will address overall workforce data, including current staffing assigned to early test and evaluation activities; training and certification concerns related to in-sourcing staff; rapid acquisition resource plans; and infrastructure needs for emerging technologies. The Director of Systems Engineering does not intend to request another report from the services. Nevertheless, each of the military services plans to increase its systems engineering workforce over the next several years. 
The exact number of personnel is uncertain because the services' hiring projections relate to a general engineering personnel classification, not a specific systems engineering career field. The directors also identified challenges they believe the services will face in strengthening systems engineering and developmental testing activities. The Director of Systems Engineering pointed out that the services need to put greater emphasis on development planning activities, as called for by the Reform Act. The services are currently conducting these activities to some extent, but the director believes a more robust and consistent approach is needed. The Director of Developmental Test and Evaluation highlighted two other challenges facing the military services. First, the director would like to increase the number of government employees performing test and evaluation activities. The services experienced significant personnel cuts in these areas in the mid-1990s and have had to rely on contractors to perform the work. DOD's joint report to Congress noted that the Air Force in particular relies heavily on prime contractor evaluations and that this approach could lead to test results that are inaccurate, misleading, or not qualified, resulting, in turn, in premature fielding decisions, since prime contractors would not be giving impartial evaluations of results. The director believes there are a number of inherently governmental test and evaluation functions that produce a more impartial evaluation of results and that a desired end state would be one with an appropriate amount of both government and contractor testing. Second, the director is concerned that DOD does not have the capacity to train and certify an estimated 800 individuals expected to be converted from contractor to government employees within the required time frame. 
While most of the contractors are expected to have some level of training and experience performing test activities, they probably will not meet certifications required of government employees because they have not had the same access to DOD training. In addition to those challenges recognized by the directors, we have identified other challenges we believe the services may face in implementing more robust systems engineering and developmental testing, including the following. * According to the military services, they plan to meet hiring targets primarily through the conversion of contractors who are already performing those activities, but do not have plans in place to ensure that they have the right mixture of staff and expertise both now and in the future. DOD officials acknowledge that they do not know the demographics of the contractor workforce. However, they believe many contractors are often retired military with prior systems engineering experience. Therefore, while they may be able to meet short-term needs, there could be a challenge in meeting long-term workforce needs. * Army test officials indicated that they have experienced a significant increase in their developmental testing workload since the terrorist attacks of September 2001, with no corresponding increase in staffing. As a result, personnel at their test ranges are working longer hours and extra shifts, which testing officials are concerned may affect their retention rates. Army officials also indicated that test ranges are deteriorating more quickly than expected and they may not have the appropriate funding to upgrade and repair the facilities and instrumentation. Test personnel are often operating in obsolete and outdated facilities that cannot meet test requirements, resulting in safety issues, potential damage to equipment, and degraded quality of life. 
* DOD's increased emphasis on fielding rapid acquisition systems may require the services to tailor their approach to systems engineering. According to an Air Force official, efforts that normally take months to complete for a more traditional acquisition program have to be completed in a matter of weeks for rapid acquisition programs. Conclusions: DOD efforts to implement Reform Act requirements are progressing, but it will take some time before the results of these efforts can be evaluated. Current and former systems engineering and developmental testing officials offer compelling insights concerning the placement of the new directors' offices under the Office of the Director of Defense Research and Engineering, but it is still too soon to judge how effective the offices will be at influencing outcomes on acquisition programs. The current placement of the offices may present several challenges that could hinder their ability to effectively oversee weapon system acquisition programs and ensure that risks are identified, discussed, and addressed prior to the start of a new program or the start of operational testing. Foremost among these potential challenges is the ability of the Director of Defense Research and Engineering to change the focus of the organization to effectively assimilate the roles and missions of the two new offices and then ensure that the offices are properly staffed and have the appropriate number of senior leaders. The mission of the office of the Director of Defense Research and Engineering has been to develop technology for weapon programs; its focus has not been to manage the technical aspects of weapon system acquisition programs. Ultimately, the real proof of whether an organization outside of the major defense acquisition program arena can influence acquisition program decisions and outcomes should be based on results. 
The directors' offices have started to assess and report on the systems engineering and developmental testing activities on some of the major defense acquisition programs. They have also made recommendations and worked with program officials to help reduce risks on programs such as the EA-18G, Global Hawk, and the C-5 Reliability Enhancement and Reengining programs. However, guidance on the development and tracking of performance criteria that would provide an indication of how much risk is associated with a particular weapon system--such as those related to technology maturity, design stability, manufacturing readiness, concurrency of development and production activities, prototyping, and adequacy of program resources--has yet to be developed. Further, the directors are not reporting to Congress on the extent to which programs are implementing recommendations and the impact recommendations are having on weapon programs, which would provide some insight as to the impact the two offices are having on acquisition programs. Although not required by the Reform Act, this type of information could be useful for Congress to gauge the effectiveness of the directors' offices. The military services, which face increasing demands to develop and field more reliable weapon systems in shorter time frames, may need additional resources and training to ensure that adequate developmental testing and systems engineering activities are taking place. However, DOD's first joint annual report to Congress, which was supposed to assess the department's organization and capabilities for performing systems engineering and developmental testing activities, did not clearly identify the workforce performing these activities, future workforce needs, or specific hiring plans. In addition, DOD's strategy to provide the necessary training within the required time period to the large number of staff it plans to hire is unclear. Therefore, workforce and training gaps are unknown. 
Recommendations for Executive Action: In order to determine the effectiveness of the newly established offices, we recommend that the Secretary of Defense direct the Directors of Systems Engineering and Developmental Test and Evaluation to take the following five actions: * Ensure development and implementation of performance criteria for systems engineering plans and developmental test and evaluation master plans, such as those related to technology maturity, design stability, manufacturing readiness, concurrency of development and production activities, prototyping, and the adequacy of program resources. * Track the extent to which program offices are adopting systems engineering and developmental testing recommendations. * Work with the services to determine the appropriate number of government personnel needed to perform the scope of systems engineering and developmental testing activities. * Develop plans for addressing the training needs of the new hires and contractors who are expected to be converted to government personnel. * Report to Congress on the status of these efforts in future joint annual reports required by the Reform Act. Agency Comments and Our Evaluation: DOD provided us with written comments on a draft of this report. DOD concurred with each of the recommendations, as revised in response to agency comments. DOD's comments appear in appendix I. Based upon a discussion with DOD officials during the agency comment period, we revised the first recommendation. Specifically, instead of recommending that the Directors of Systems Engineering and Developmental Test and Evaluation develop a comprehensive set of performance criteria that would help assess program risk, as stated in the draft report, we now recommend that the directors ensure the development and implementation of performance criteria for systems engineering plans and developmental test and evaluation master plans. 
The wording change clarifies the nature and scope of the performance criteria covered by our recommendation and is consistent with Reform Act language requiring the directors to develop guidance on detailed, measurable performance criteria that major acquisition programs should include in their systems engineering and developmental testing plans. According to DOD officials, the military services are then responsible for developing the specific criteria to be used on their respective programs. DOD also provided technical comments, which we incorporated as appropriate. 

We are sending copies of this report to the Secretary of Defense, the Director of the Office of Management and Budget, and interested congressional committees. We will also make copies available at no charge on the GAO Web site at [hyperlink, http://www.gao.gov]. If you have any questions about this report or need additional information, please contact me at (202) 512-4841 or sullivanm@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. Key contributors to this report were Bruce Thomas, Assistant Director; Cheryl Andrew; Rae Ann Sapp; Megan Hill; and Kristine Hassinger. 

Signed by: 

Michael J. Sullivan: 
Director: 
Acquisition and Sourcing Management: 

[End of section] 

Appendix I: Comments from the Department of Defense: 

Department of Defense: 
Director Of Defense Research And Engineering: 
3030 Defense Pentagon: 
Washington, DC 20301-3030: 

July 22, 2010: 

Mr. Michael Sullivan: 
Director, Acquisition and Sourcing Management: 
U.S. Government Accountability Office: 
441 G Street, N.W. 
Washington, DC 20548: 

Dear Mr. Sullivan: 

This is the Department of Defense (DOD) response to the GAO draft report 10-774, "Defense Acquisitions: DoD Needs to Develop Performance Criteria to Gauge Impact of Reform Act Changes and Address Workforce Issues," dated June 18, 2010 (GAO Code 120880). 
Detailed comments on the report recommendations are enclosed. Detailed comments on factual information within the body of the report have been forwarded to the GAO action officer separately. The Department appreciates the opportunity to respond to your draft report and looks forward to working with you as we continue to ensure a strong and capable Defense acquisition capability. 

Sincerely, 

Signed by: 

Zachary J. Lemnios: 

Enclosure: 
As stated: 

[End of letter] 

GAO Draft Report - Dated June 18, 2010: 
GAO Code 120880/GAO-10-774: 

"Defense Acquisitions: DoD Needs to Develop Performance Criteria to Gauge Impact of Reform Act Changes and Address Workforce Issues" 

Department Of Defense Comments To The Recommendations: 

Recommendation 1: The GAO recommends that the Secretary of Defense direct the Directors of Systems Engineering and Developmental Test and Evaluation to ensure development and implementation of performance criteria for systems engineering plans and developmental test and evaluation master plans, such as those related to technology maturity, design stability, manufacturing readiness, concurrency of development and production activities, prototyping, and the adequacy of program resources. 

DOD Response: Concur. The Director, Systems Engineering and the Director, Developmental Test and Evaluation, consistent with PL 111-23, will issue guidance on the development of performance criteria for systems engineering plans and developmental test and evaluation master plans and will ensure that these criteria are in fact developed and implemented. 

Recommendation 2: The GAO recommends that the Secretary of Defense direct the Directors of Systems Engineering and Developmental Test and Evaluation to track the extent to which program offices are adopting systems engineering and developmental testing recommendations. 

DOD Response: Concur. 
The Director, Systems Engineering and the Director, Developmental Test and Evaluation will track the extent to which program offices adopt systems engineering and developmental test and evaluation recommendations. However, these will not be reported as standalone metrics. The Director, Systems Engineering and the Director, Developmental Test and Evaluation will continue to work with the Services and programs to understand both their risk management approach and overall program performance. 

Recommendation 3: The GAO recommends that the Secretary of Defense direct the Directors of Systems Engineering and Developmental Test and Evaluation to work with the Services to determine the appropriate number of Government personnel needed to perform the scope of systems engineering and developmental testing activities. 

DOD Response: Concur. The Director, Systems Engineering and the Director, Developmental Test and Evaluation will work with the Services to determine the appropriate number of Government personnel needed to perform the scope of systems engineering and developmental testing activities. 

Recommendation 4: The GAO recommends that the Secretary of Defense direct the Directors of Systems Engineering and Developmental Test and Evaluation to develop plans for addressing the training needs of the new hires and contractors that are expected to be converted to Government personnel. 

DOD Response: Concur. The Director, Systems Engineering and the Director, Developmental Test and Evaluation, consistent with their Functional Leader roles, will develop plans for addressing the training needs of their respective career fields. The Directors will work collaboratively with the Services and Components, who are ultimately responsible for training and quality of their personnel. 
Recommendation 5: The GAO recommends that the Secretary of Defense direct the Directors of Systems Engineering and Developmental Test and Evaluation to report to the Congress on the status of these efforts in future joint annual reports required by the Reform Act. 

DOD Response: Concur. The Director, Systems Engineering and the Director, Developmental Test & Evaluation will report the status of these efforts in the joint annual report required by the Reform Act. 

[End of section] 

Footnotes: 

[1] 10 U.S.C. § 2433 establishes the requirement for unit cost reports. If certain cost thresholds are exceeded (known as unit cost or Nunn-McCurdy breaches), DOD is required to report to Congress and, in certain circumstances, certify the program to Congress. 

[2] Pub. L. No. 111-23. 

[3] Systems engineering efforts also include development planning, which the department considers to be engineering activities prior to the start of weapon system development. 

[4] Effectiveness refers to the ability of the system to perform its mission. Suitability refers to the ability to place and sustain the system in the field. Suitability measures include reliability, availability, and logistics supportability. 

[5] The Reform Act requires that the Director of Developmental Test and Evaluation and the Director of Systems Engineering issue a joint annual report not later than March 31 each year, beginning in 2010, to the congressional defense committees addressing activities undertaken to meet various requirements of the Reform Act. Pub. L. No. 111-23, § 102(a) (codified at 10 U.S.C. § 139d). 
[6] The Reform Act requires that the service acquisition executive of each military department and each defense agency with responsibility for a major defense acquisition program submit a report to the Director of Developmental Test and Evaluation and the Director of Systems Engineering on the status of the development and implementation of their plans for ensuring that developmental testing and systems engineering functions are adequately staffed. Pub. L. No. 111-23, § 102(b). 

[7] Depending on a weapon system's activity for the year, an assessment may include a summary of developmental testing activity only, systems engineering activity only, or both. 

[8] GAO, Expeditionary Fighting Vehicle (EFV) Program Faces Cost, Schedule and Performance Risks, [hyperlink, http://www.gao.gov/products/GAO-10-758R] (Washington, D.C.: July 2, 2010). 

[9] Pub. L. No. 111-23, § 102(a) (codified at 10 U.S.C. § 139d). 

[10] The National Defense Authorization Act for Fiscal Year 2010 realigned the organizational structure of the Office of the Secretary of Defense. As a result of the realignment, the position of Deputy Under Secretary of Defense for Acquisition & Technology was replaced by the position of Principal Deputy Under Secretary of Defense for Acquisition, Technology & Logistics. Also, additional Assistant Secretaries were added, including the position of Assistant Secretary of Defense for Acquisition. Pub. L. No. 111-84, § 906(a), (b)(2) (codified at 10 U.S.C. §§ 137a, 138). 

[11] Each acquisition, technology, and logistics (AT&L) position, meaning a position designated as an acquisition position in accordance with the Defense Acquisition Workforce Improvement Act, has certification-level requirements. 
When an individual is placed in an AT&L position, the determination that the individual has satisfied the appropriate certification and assignment-specific training requirements, or a plan for the individual to meet the requirements within 24 months of placement or another established period, shall be documented. If an individual does not meet position requirements within established time frames, a waiver must be obtained according to applicable procedures for the individual to remain in the position. Department of Defense Instruction 5000.66, Operation of the Defense Acquisition, Technology, and Logistics Workforce Education, Training, and Career Development Program, paragraphs E2.1.3.3 and E2.4.1.2 (Dec. 21, 2005). 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each weekday, GAO posts newly released reports, testimony, and correspondence on its Web site. To have GAO e-mail you a list of newly posted products every afternoon, go to [hyperlink, http://www.gao.gov] and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO's actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. 
Pricing and ordering information is posted on GAO's Web site, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: