This is the accessible text file for GAO report number GAO-13-529T entitled 'Science, Technology, Engineering, and Mathematics Education: Governmentwide Strategy Needed to Better Manage Overlapping Programs' which was released on April 10, 2013.

This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version.

We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

United States Government Accountability Office:
GAO:

Testimony:
Before the Subcommittee on Early Childhood, Elementary, and Secondary Education, Committee on Education and the Workforce, House of Representatives:

For Release on Delivery:
Expected at 10 a.m. EDT:
Wednesday, April 10, 2013:

Science, Technology, Engineering, and Mathematics Education: Governmentwide Strategy Needed to Better Manage Overlapping Programs:

Statement of George A. Scott, Director:
Education, Workforce, and Income Security Issues:

GAO-13-529T:

GAO Highlights:

Highlights of GAO-13-529T, a testimony before the Subcommittee on Early Childhood, Elementary, and Secondary Education, Committee on Education and the Workforce, House of Representatives.

Why GAO Did This Study:

STEM education programs help to enhance the nation's global competitiveness. Many federal agencies have been involved in administering these programs. Concerns have been raised about the overall effectiveness and efficiency of STEM education programs.

This testimony discusses (1) the number of federal agencies and programs that provided funding for STEM education programs in fiscal year 2010; (2) the extent to which STEM education programs overlap; and (3) the extent to which STEM education programs measured effectiveness and were aligned to a governmentwide strategy. This testimony is based on several previously published GAO reports and includes updates on actions taken in response to these reports.

What GAO Found:

In fiscal year 2010, 13 federal agencies invested over $3 billion in 209 programs designed to increase knowledge of science, technology, engineering, and mathematics (STEM) fields and attainment of STEM degrees. The number of programs within agencies ranged from 3 to 46, with the Department of Health and Human Services, Department of Energy, and the National Science Foundation administering more than half of the 209 programs. Almost a third of all programs had obligations of $1 million or less, while some had obligations of over $100 million.
Beyond programs specifically focused on STEM education, agencies funded other broad efforts that contributed to enhancing STEM education. Eighty-three percent of the programs GAO identified overlapped to some degree with at least 1 other program in that they offered similar services to similar target groups in similar STEM fields to achieve similar objectives. Many programs have a broad scope—serving multiple target groups with multiple services. However, even when programs overlap, the services they provide and the populations they serve may differ in meaningful ways and would therefore not necessarily be duplicative. Nonetheless, the programs are similar enough that they need to be well coordinated and guided by a robust strategic plan.

Figure: Overlapping Federal STEM Education Programs:

[Refer to PDF for image: illustration]

209 STEM education programs:

Programs that have at least one similar target population: 100%; 209 programs.

Programs that have at least one similar target population and also provide at least one similar service: 99%; 207 programs; 2 programs do not overlap.

Programs that have at least one similar target population and also provide at least one similar service and also at least one similar STEM field of focus: 83%; 173 programs; 34 programs do not overlap.

Programs that have at least one similar target population and also provide at least one similar service and also at least one similar STEM field of focus and also have at least one similar program objective: 83%; 173 programs.

Source: GAO.

[End of figure]

Agencies’ limited use of performance measures and evaluations may hamper their ability to assess the effectiveness of their individual programs as well as the overall STEM education effort. Specifically, program officials varied in their ability to provide reliable output measures—for example, the number of students, teachers, or institutions directly served by their program. Further, most agencies did not use outcome measures in a way that is clearly reflected in their performance planning documents. In addition, a majority of programs had not conducted comprehensive evaluations to assess effectiveness between GAO's prior review in 2005 and its survey in 2011, and the evaluations GAO reviewed did not always align with program objectives. Finally, GAO found that completed STEM education evaluation results had not always been disseminated in a fashion that facilitated knowledge sharing between practitioners and researchers. In naming STEM education as a crosscutting goal, the administration is taking the first step towards better governmentwide coordinated planning; however, it will be important to finalize a governmentwide strategic plan so agencies can better align their performance plans and reports to new governmentwide goals.

What GAO Recommends:

GAO previously recommended that the Office of Science and Technology Policy (OSTP) direct the National Science and Technology Council (NSTC) to work with agencies to better align their activities with a governmentwide strategy, develop a plan for sustained monitoring of coordination, identify programs for consolidation or potential elimination, and assist agencies in determining how to better evaluate their programs. Since GAO's report, OSTP released a progress report that identified some programs for elimination, and the Office of Management and Budget (OMB) named STEM education one of its interim cross-cutting priority goals.

View [hyperlink, http://www.gao.gov/products/GAO-13-529T].
For more information, contact George A. Scott at (202) 512-7215 or scottg@gao.gov.

[End of section]

Chairman Rokita, Ranking Member McCarthy, and Members of the Subcommittee:

I am pleased to be here today to discuss the findings from our prior work on fragmentation, overlap, and potential duplication in federally funded science, technology, engineering, and mathematics (STEM) education programs.[Footnote 1] These programs can serve an important role both by helping to prepare students for careers in STEM fields and by enhancing the nation's global competitiveness. In addition to the federal effort, state and local governments, universities and colleges, and the private sector have also developed programs that provide opportunities for students to pursue STEM education and occupations. However, research shows that despite this investment, the United States lacks a strong pipeline of future workers in STEM fields and that U.S. students continue to lag behind students in other highly technological nations in mathematics and science achievement. Over the decades, Congress and the executive branch have continued to create new STEM education programs, even though, as we previously reported, there has been a general lack of assessment of how well STEM programs are working.[Footnote 2] Governmentwide strategic planning could help better align federal efforts in an efficient and effective manner.

My testimony is based on several previously published GAO reports on STEM education, including our January 2012 report entitled Science, Technology, Engineering, and Mathematics Education: Strategic Planning Needed to Better Manage Overlapping Programs across Multiple Agencies, and also addresses actions taken by the administration to better plan and coordinate federal STEM education programs since our 2012 report was issued.[Footnote 3] My testimony focuses on (1) the number of federal agencies and programs that provided funding for STEM education programs in fiscal year 2010; (2) the extent to which these STEM programs overlap; and (3) the extent to which federal STEM education programs have measured their effectiveness and were aligned to a governmentwide strategy.

To conduct this work, we reviewed relevant federal laws and regulations as well as previous GAO work on overlap, duplication, and fragmentation. We interviewed officials from the Office of Management and Budget (OMB) and OSTP, and officials from other federal agencies that administer STEM education programs. We reviewed relevant literature and past reports that catalog and assess the federal investment in STEM education. To gather information on federal STEM education programs and to assess the level of fragmentation, overlap, and potential duplication, we reviewed past GAO work assessing fragmentation, overlap, and duplication among other groups of federal programs, and we surveyed over 200 programs across 13 agencies that met our definition of a STEM education program, asking questions about program objectives, target populations, services provided, interagency coordination, outcome measures and evaluations, and funding. Our survey was administered between May 2011 and August 2011 to federal agency program officials and achieved a 100 percent response rate. To gather information on program effectiveness, we reviewed evaluations provided by program officials as well as agencies' annual performance plans and reports.
The work upon which this testimony is based was performed in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Background:

Since 2005, there have been several efforts to inventory federal STEM education programs, as well as reports calling for better coordination and evaluation of these programs. In 2005, for example, GAO identified a multitude of agencies that administer such programs.[Footnote 4] The primary missions of these agencies vary, but most often they are to promote and enhance an area related to a STEM field or to enhance general education. In addition, the National Science and Technology Council (NSTC) was established in 1993 and is the principal means for the administration to coordinate science and technology with the federal government's larger research and development effort.

The America COMPETES Reauthorization Act of 2010 sought to address coordination and oversight issues, including those associated with the coordination and potential duplication of federal STEM education efforts.[Footnote 5] Specifically, the law required the Director of the Office of Science and Technology Policy (OSTP) to establish a committee under the NSTC to inventory, review, and coordinate federal STEM education programs. The law also directed this NSTC committee to develop a 5-year governmentwide STEM education strategic plan, which must specify and prioritize annual and long-term objectives for STEM education. Moreover, the Director of OSTP is required to send a report to Congress annually on this strategic plan, which must include, among other things, an evaluation of the levels of duplication and fragmentation of STEM programs and activities.

In our January 2012 report on STEM education, we defined a federally funded STEM education program as a program funded in fiscal year 2010 by congressional appropriation or allocation that included one or more of the following as a primary objective:

* attract or prepare students to pursue classes or coursework in STEM areas through formal or informal education activities,
* attract students to pursue degrees (2-year, 4-year, graduate, or doctoral degrees) in STEM fields through formal or informal education activities,
* provide training opportunities for undergraduate or graduate students in STEM fields,
* attract graduates to pursue careers in STEM fields,
* improve teacher (preservice or in-service) education in STEM areas,
* improve or expand the capacity of K-12 schools or postsecondary institutions to promote or foster education in STEM fields, or
* conduct research to enhance the quality of STEM education programs provided to students.

In addition, a program was defined as an organized set of activities supported by a congressional appropriation or allocation. Further, we defined a program as a single program even when its funds were allocated to other programs as well. We asked agency officials to provide a list of programs that received funds in fiscal year 2010.[Footnote 6]

In our January 2012 report, we examined the extent to which federal STEM education programs were fragmented, overlapping, and duplicative.[Footnote 7]
Using the framework we established in previous work on fragmentation, overlap, and duplication, we defined key terms as follows:

* Fragmentation occurs when more than one federal agency (or more than one organization within an agency) is involved in the same broad area of national need.
* Overlap occurs when multiple programs offer similar services to similar target groups in similar STEM fields to achieve similar objectives.
* Duplication occurs when multiple programs offer the same services to the same target beneficiaries in the same STEM fields.

Thirteen Federal Agencies Administered over 200 Programs with Over $3 Billion in Obligated Funds:

As we reported in 2012, 13 agencies administered 209 STEM education programs in fiscal year 2010. Agencies reported that they developed the majority (130) of these programs through their general statutory authority and that Congress specifically directed agencies to create 59 of these programs.[Footnote 8] The number of programs each agency administered ranged from 3 to 46, with three agencies--the Department of Health and Human Services,[Footnote 9] the Department of Energy, and the National Science Foundation (NSF)--administering more than half of all programs--112 of 209. (See figure 1.) Agencies obligated over $3 billion to STEM education programs in fiscal year 2010, ranging from $15,000 to hundreds of millions of dollars per program. NSF and the Department of Education programs accounted for over half of this funding. Almost a third of the programs had obligations of $1 million or less, with five programs having obligations of more than $100 million each.

Figure 1: Number and Funding Obligations of STEM Education Programs in 2010, as Reported by Agencies:

[Refer to PDF for image: plotted point graph]

Federal agency: NASA; FY 2010 obligations: $183.0 million; Number of programs: 10.
Federal agency: NSF; FY 2010 obligations: $1.059 billion; Number of programs: 37.
Federal agency: NRC; FY 2010 obligations: $22.5 million; Number of programs: 3.
Federal agency: Agriculture; FY 2010 obligations: $41.9 million; Number of programs: 11.
Federal agency: Commerce; FY 2010 obligations: $85.2 million; Number of programs: 19.
Federal agency: DoD; FY 2010 obligations: $156.5 million; Number of programs: 19.
Federal agency: Education; FY 2010 obligations: $691.2 million; Number of programs: 12.
Federal agency: Energy; FY 2010 obligations: $72.2 million; Number of programs: 29.
Federal agency: HHS; FY 2010 obligations: $595.1 million; Number of programs: 46.
Federal agency: DHS; FY 2010 obligations: $8.9 million; Number of programs: 5.
Federal agency: Interior; FY 2010 obligations: $26.7 million; Number of programs: 5.
Federal agency: DOT; FY 2010 obligations: $94.3 million; Number of programs: 6.
Federal agency: EPA; FY 2010 obligations: $18.3 million; Number of programs: 10.

Source: GAO analysis of survey responses.

[End of figure]

Beyond the 209 programs identified in our review, federal agencies carried out other activities that contribute to the overall federal STEM education effort. Having multiple agencies, with varying expertise, involved in delivering STEM education has both advantages and disadvantages. On the one hand, it could allow agencies to tailor programs to suit their specific missions and needs to attract new employees to their workforce. On the other hand, it could make it challenging to develop a coherent federal approach to educating STEM students and creating a workforce with STEM skills.
Further, it could make it difficult to identify gaps and allocate resources across the federal government.

Most STEM Programs Overlapped to Some Degree:

As we reported in 2012, and as figure 2 illustrates, in fiscal year 2010, 83 percent of STEM education programs overlapped to some degree with another program in that they offered at least one similar service to at least one similar target group in at least one similar STEM field to achieve at least one similar objective. These programs ranged from being narrowly focused on a specific group or field of study to offering a range of services to students and teachers across STEM fields. This complicated patchwork of overlapping programs has largely resulted from federal efforts to create and expand programs across many agencies to improve STEM education and increase the number of students going into STEM fields. Program officials reported that approximately one-third of STEM education programs funded in fiscal year 2010 were first funded between 2005 and 2010. We believe the creation of new programs during that time frame may have contributed to overlap and, ultimately, to inefficiencies in how STEM programs across the federal government are focused and delivered.

Overlap among STEM education programs is not new. In 2007, the Academic Competitiveness Council (ACC) identified extensive overlap among STEM education programs, and, in 2009, we identified overlap among teacher quality programs, which include several programs focused on STEM education.[Footnote 10] Overlapping programs can lead to individuals and institutions being eligible for similar services in similar STEM fields offered through multiple programs and, without information sharing, could lead to the same service being provided to the same individual or institution.

Figure 2: Overlapping Federal STEM Education Programs:

[Refer to PDF for image: illustration]

209 STEM education programs:

Programs that have at least one similar target population: 100%; 209 programs.

Programs that have at least one similar target population and also provide at least one similar service: 99%; 207 programs; 2 programs do not overlap.

Programs that have at least one similar target population and also provide at least one similar service and also at least one similar STEM field of focus: 83%; 173 programs; 34 programs do not overlap.

Programs that have at least one similar target population and also provide at least one similar service and also at least one similar STEM field of focus and also have at least one similar program objective: 83%; 173 programs.

Source: GAO analysis of survey responses.

[End of figure]

Similar Target Groups:

Our analysis found that many programs provided services to similar target groups, such as K-12 students, postsecondary students, K-12 teachers, and college faculty and staff. The vast majority of programs (170) served postsecondary students. Ninety-five programs served college faculty and staff, 75 programs served K-12 students, and 70 programs served K-12 teachers. In addition, many programs served multiple target groups. In fact, 177 programs primarily served two or more target groups.

Similar Services Provided:

We also found many STEM programs providing similar services.

* To support students, 167 different programs provided research opportunities, internships, mentorships, or career guidance. In addition, 144 programs provided short-term experiential learning opportunities and 127 long-term experiential learning opportunities. Short-term experiential learning activities included field trips, guest speakers, workshops, and summer camps; long-term experiential learning activities last a semester or longer. Furthermore, 137 programs provided outreach and recognition to generate student interest, 124 provided classroom instruction, and 75 provided student scholarships or fellowships.
* To support teachers, 115 programs provided curriculum development, 83 programs provided teacher in-service, professional development, or retention activities, and 52 programs provided preservice or recruitment activities.
* To support STEM research, 68 programs reported conducting research to enhance the quality of STEM education.
* To support institutions, 65 programs provided institutional support for management and administrative activities, and 46 programs provided support for expanding the facilities, classrooms, and other physical infrastructure of institutions.

Similar STEM Fields of Focus:

In addition to serving multiple target groups, our analysis found that most programs also provided services in multiple STEM fields. Twenty-three programs targeted one specific STEM field, while 121 programs targeted four or more specific STEM fields. In addition, 26 programs indicated not focusing on any specific STEM field, and instead provided services eligible for use in any STEM field. Five different STEM fields had over 100 programs that provided services. Biological sciences and technology were the most commonly selected STEM fields of focus. Agricultural sciences, which was the least commonly selected, still had 27 programs that provided services specifically to that STEM field.

While our 2011 survey data also show that many programs overlapped, it is important to compare programs' target groups and STEM fields of focus to get a better picture of the potential target beneficiaries that could be served within a given STEM discipline. For example, both the National Oceanic and Atmospheric Administration's National Environmental Satellite, Data, and Information Service (NESDIS) Education program and the Department of Energy's Graduate Automotive Technology Education program provided scholarships or fellowships to postsecondary students, but NESDIS focused on students in earth sciences programs, and the other on engineering; therefore, the target beneficiaries served by these two similar programs are quite different. Nevertheless, we found that 76 programs served postsecondary students in physics. As table 1 illustrates, many programs offered services to similar target groups in similar STEM fields of focus.

Table 1: STEM Fields of Focus and Target Groups of Federal STEM Education Programs:

Target groups: K-12 students; Agricultural sciences: 8; Biology: 40; Chemistry: 36; Computer science: 30; Earth sciences: 38; Engineering: 32; Mathematics: 33; Physics: 31; Social sciences: 19; Technology: 43.
Target groups: Postsecondary students; Agricultural sciences: 22; Biology: 99; Chemistry: 85; Computer science: 84; Earth sciences: 64; Engineering: 89; Mathematics: 79; Physics: 76; Social sciences: 62; Technology: 87.
Target groups: K-12 teachers; Agricultural sciences: 5; Biology: 36; Chemistry: 33; Computer science: 25; Earth sciences: 39; Engineering: 26; Mathematics: 28; Physics: 29; Social sciences: 17; Technology: 38.
Target groups: College faculty and staff; Agricultural sciences: 17; Biology: 49; Chemistry: 42; Computer science: 43; Earth sciences: 35; Engineering: 47; Mathematics: 37; Physics: 36; Social sciences: 30; Technology: 50.

Source: GAO analysis of survey results.

Note: Many STEM education programs serve multiple target groups with multiple STEM fields of focus. The totals in table 1 will not sum to 209, the number of programs in our January 2012 review. Earth sciences include atmospheric and ocean sciences; social sciences include psychology, sociology, anthropology, cognitive science, economics, and behavioral sciences.

[End of table]

Similar Objectives:

We also found that many STEM education programs had similar objectives. In response to our 2011 survey, the vast majority (87 percent) of STEM education program officials indicated that attracting and preparing students throughout their academic careers in STEM areas was a primary objective. In addition to attracting and preparing students, officials also indicated the following primary program objectives:

* improving teacher education in STEM areas (teacher development)--26 percent,
* improving or expanding the capacity of K-12 schools or postsecondary institutions to promote or foster education in STEM fields (institution capacity building)--24 percent, and
* conducting research to enhance the quality of STEM education provided to students (STEM education research)--18 percent.

Many programs also reported having multiple primary objectives. While 107 programs focused solely on student education, 82 others indicated having multiple primary objectives, and 9 programs reported having 4 or more primary objectives. Few programs reported focusing solely on teacher development, institution capacity building, or STEM education research. Most of these objectives were part of a larger program that also focused on attracting and preparing students in STEM education.

Overlapping Programs Are Not Necessarily Duplicative:

However, even when programs overlapped, we found that the services they provided and the populations they served may differ in meaningful ways and would therefore not necessarily be duplicative. There may be important differences in programs' specific field(s) of focus and stated goals. For example, both Commerce's National Estuarine Research Reserve System Education Program and the Nuclear Regulatory Commission's Integrated University Program provided scholarships or fellowships to doctoral students in the field of physics; however, the Commerce program focuses on increasing environmental literacy related to estuaries and coastal watersheds, while the Nuclear Regulatory Commission program focuses on supporting education in nuclear science, engineering, and related fields with the goal of developing a workforce capable of designing, constructing, operating, and regulating nuclear facilities and capable of handling nuclear materials safely. In addition, programs may be primarily intended to serve different specific populations within a given target group. For example, of the 34 programs that we surveyed in 2011 that provided services to K-12 students in the field of technology, 10 were primarily intended to serve specific underrepresented, minority, or disadvantaged groups, and 2 were limited geographically to individual cities or universities.
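To make the distinction between overlap and duplication concrete, the following is a minimal illustrative sketch of how the four-dimension overlap test (similar target groups, services, STEM fields, and objectives) and the stricter sameness test for duplication could be applied to survey-style records. It is not drawn from GAO's survey instrument or analysis; the Program structure, field names, and example values are hypothetical assumptions for illustration only (Python):

# Illustrative sketch only: hypothetical data structures and values,
# not GAO's survey instrument or analysis code.
from dataclasses import dataclass

@dataclass(frozen=True)
class Program:
    name: str
    target_groups: frozenset   # e.g., {"postsecondary students"}
    services: frozenset        # e.g., {"scholarships or fellowships"}
    stem_fields: frozenset     # e.g., {"earth sciences"}
    objectives: frozenset      # e.g., {"attract and prepare students"}

def overlaps(a: Program, b: Program) -> bool:
    # Overlap: at least one similar target group, service, STEM field,
    # and objective (the four-way test behind the 83 percent figure).
    return all([a.target_groups & b.target_groups,
                a.services & b.services,
                a.stem_fields & b.stem_fields,
                a.objectives & b.objectives])

def potentially_duplicative(a: Program, b: Program) -> bool:
    # Duplication: the same services to the same target beneficiaries
    # in the same STEM fields.
    return (a.target_groups == b.target_groups and
            a.services == b.services and
            a.stem_fields == b.stem_fields)

# Hypothetical records loosely patterned on two programs discussed above.
nesdis = Program("NESDIS Education",
                 frozenset({"postsecondary students"}),
                 frozenset({"scholarships or fellowships"}),
                 frozenset({"earth sciences"}),
                 frozenset({"attract and prepare students"}))
gate = Program("Graduate Automotive Technology Education",
               frozenset({"postsecondary students"}),
               frozenset({"scholarships or fellowships"}),
               frozenset({"engineering"}),
               frozenset({"attract and prepare students"}))

print(overlaps(nesdis, gate))                 # False: no shared STEM field
print(potentially_duplicative(nesdis, gate))  # False: beneficiaries differ by field

[End of illustrative sketch]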
Furthermore, individuals may receive assistance from different programs at different points in their academic careers, and these programs may provide services that complement or build upon each other, supporting a common goal rather than working at cross purposes.

Limited Use of Performance Measures and Evaluations May Hamper Ability to Assess Effectiveness:

In 2012, we reported that in addition to the fragmented and overlapping nature of federal STEM education programs, agencies' limited use of performance measures and evaluations may hamper their ability to assess the effectiveness of their individual programs as well as the overall STEM education effort. Understanding program performance and effectiveness is key in determining where to strategically invest limited federal funds to achieve the greatest impact in developing a pipeline of future workers in STEM fields.

Program officials varied in their ability to provide reliable output measures--for example, the number of students, teachers, or institutions directly served by their program. In some cases, the program's agency did not maintain databases or contracts that would track the number of students served by the program. In other cases, programs may not have been able to provide information on the number of institutions they served because they provided grants to secondary recipients. In 2012, we reported that the inconsistent collection of output measures across programs makes it challenging to aggregate the number of students, teachers, and institutions served and to assess the effectiveness of the overall federal effort.

In addition, most agencies did not use outcome measures in a way that is clearly reflected in their performance plans and reports--publicly available documents they use for performance planning. These documents typically lay out agency performance goals that establish the level of performance to be achieved by program activities during a given fiscal year, the measures developed to track progress, and what progress has been made toward meeting those performance goals. The lack of performance outcome measures may hinder decision makers' ability to assess how agencies' STEM education efforts contribute to agencywide performance goals and the overall federal STEM effort. For our 2012 report, we reviewed fiscal year 2010 annual performance plans and reports of the 13 agencies with STEM programs and found that most agencies did not connect STEM education activities to agency goals or measure and report on the progress of those activities.[Footnote 11]

In addition, a majority of programs had not conducted comprehensive evaluations between 2005 and 2011 to assess effectiveness, and the evaluations we reviewed did not always align with program objectives. Programs need to be appropriately evaluated to determine what is working and how improvements can be made. However, the majority of programs we surveyed (66 percent) had not conducted an evaluation of their entire program since 2005.[Footnote 12] (See figure 3.) Some programs that did not complete an evaluation reported that their grantees had completed one; however, in those cases, few programs used these grantee evaluations to inform a more comprehensive evaluation of the entire program that they or an external evaluator completed.

Figure 3: Percentage of STEM Education Programs, by Status of Evaluations Completed between 2005 and 2011:

[Refer to PDF for image: pie-chart]

In progress: 5%;
Completed: 29%;
None completed: 66%.
Source: GAO analysis of survey responses.

Note: Status of agencies' evaluation efforts was provided between May and August of 2011.

[End of figure]

Furthermore, in order to influence program practice, evaluation results must be disseminated widely. We found that although nearly all of the STEM education programs that reported completing an evaluation reported using different mechanisms to disseminate results, they did not always share results in a way that facilitated knowledge sharing between practitioners and researchers. In addition, NSTC identified other issues with sharing information about STEM education program results and suggested several actions that agencies could take to improve dissemination, such as engaging practitioners to collaborate with researchers in setting research agendas.[Footnote 13] According to NSTC officials, most agencies do not share or disseminate evaluations in a way that could be useful for coordination. We concluded that without an understanding of what is working in some programs, it will be difficult for the federal government to develop a clear strategy for how to spend limited federal funds.

Recognizing the need for improved collaboration across the federal government, Congress passed the GPRA Modernization Act of 2010[Footnote 14] and the America COMPETES Reauthorization Act of 2010, which afford agencies the opportunity to better utilize performance measures for both governmentwide and agency-specific STEM education efforts. For example, the GPRA Modernization Act establishes a new framework aimed at taking a more crosscutting and integrated approach to focusing on results and improving government performance. Among other things, the Act requires OMB to coordinate with agencies to establish outcome-oriented federal government priority goals--otherwise referred to as crosscutting goals--covering a limited number of policy areas. STEM education was named as a crosscutting priority goal in the President's 2013 budget submission. In naming STEM education as a crosscutting goal, the administration is taking the first step towards creating better governmentwide coordinated planning. We previously reported that effective implementation of the GPRA Modernization Act could play an important role in clarifying desired outcomes, addressing program performance that spans multiple organizations, and facilitating future actions to reduce unnecessary duplication, overlap, and fragmentation.[Footnote 15]

Implementation of Our Recommendations Would Better Align Federal Efforts:

In our January 2012 report, we made four recommendations to the Director of OSTP to direct the NSTC to:

1. Work with agencies, through its strategic-planning process, to identify programs that might be candidates for consolidation or elimination, which could be identified through an analysis that includes information on program overlap and program effectiveness. As part of this effort, OSTP should work with agency officials to identify and report any changes in statutory authority necessary to execute each specific program consolidation identified by NSTC's strategic plan.

2. Develop guidance to help agencies determine the types of evaluations that may be feasible and appropriate for different types of STEM education programs and develop a mechanism for sharing this information across agencies. This step could include guidance and sharing of information that outlines practices for evaluating similar types of programs.
3. Develop guidance for how agencies can better incorporate each agency's STEM education efforts and the goals from NSTC's 5-year STEM education strategic plan into each agency's own performance plans and reports.

4. Develop a framework for how agencies will be monitored to ensure that they are collecting and reporting on NSTC strategic plan goals. This framework should include alternatives for a sustained focus on monitoring coordination of STEM programs if the NSTC Committee on STEM terminates in 2015 as called for in its charter.

OSTP agreed with our conclusions and, as figure 4 shows, NSTC has made some progress in addressing recommendations from our January 2012 report.

Figure 4: Status of Recommendations from GAO-12-108 Report (Science, Technology, Engineering, and Mathematics Education: Strategic Planning Needed to Better Manage Overlapping Programs across Multiple Agencies):

[Refer to PDF for image: illustrated table]

GAO’s recommendation: Identify programs for consolidation and elimination; NSTC should work with agency officials to identify programs for potential consolidation or elimination;
Status of recommendation: Fully addressed.

GAO’s recommendation: Provide guidance on program evaluation; NSTC should develop guidance to help agencies evaluate different types of STEM education programs and share the results across agencies;
Status of recommendation: Not addressed.

GAO’s recommendation: Align agency efforts; NSTC should develop guidance on how agencies can better incorporate NSTC’s STEM education strategic plan and each agency’s STEM education efforts into their plans and reports;
Status of recommendation: Partially addressed.

GAO’s recommendation: Develop monitoring framework; NSTC should develop a framework for monitoring agencies to ensure that they are collecting and reporting on NSTC strategic plan goals;
Status of recommendation: Not addressed.

Source: GAO.

[End of figure]

Subsequently, OSTP has stated that NSTC's 5-Year Federal STEM Education Strategic Plan, originally scheduled to be released in spring 2012, would address our recommendations; however, the release of NSTC's Strategic Plan has been delayed. In February 2012, NSTC published Coordinating Federal Science, Technology, Engineering, and Mathematics (STEM) Education Investments: Progress Report, which identified a number of programs that could be eliminated in fiscal year 2013. By identifying programs for consolidation, elimination, and other actions, the administration could increase the efficient use of scarce government resources to achieve the greatest impact in developing a pipeline of future workers in STEM fields.

Although NSTC said it planned to create a small working group to develop guidance on the appropriateness of different types of evaluations for different types of STEM education programs, OSTP has not released the findings of this working group. Agency and program officials would benefit from guidance and information sharing within and across agencies about what is working and how to best evaluate programs. This could help improve individual program performance and also inform agency and governmentwide decisions about which programs should continue to be funded. We continue to believe that without an understanding of what is working in some programs, it will be difficult to develop a clear strategy for how to spend limited federal funds.
In addition, STEM education was named as an interim crosscutting priority goal in the President's 2013 budget submission; however, it will be important for NSTC to finalize its strategic plan, which should include guidance for how agencies can better align their performance plans and reports to new governmentwide goals. Although OSTP agreed to develop milestones and metrics to track the implementation of NSTC strategic goals by each agency, it has not taken action to develop a framework for how agencies will be monitored to ensure that they are collecting and reporting on NSTC strategic plan goals. A framework for monitoring agency progress towards NSTC's strategic plan is necessary to improve transparency and strengthen accountability of NSTC's strategic planning and coordination efforts.

In conclusion, if NSTC's 5-year strategic plan is not developed in a way that aligns agencies' efforts to achieve governmentwide goals, enhances the federal government's ability to assess what works, and concentrates resources on those programs that advance the strategy, the federal government may spend limited funds in an inefficient and ineffective manner that does not best help to improve the nation's global competitiveness.

Chairman Rokita, Ranking Member McCarthy, and Members of the Subcommittee, this concludes my prepared statement. I would be pleased to answer any questions that you may have at this time.

Contacts and Staff Acknowledgments:

For further information about this testimony, please contact George A. Scott at (202) 512-7215 or scottg@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this testimony. Other key contributors to this testimony include: Bill Keller, Assistant Director; Susan Baxter; James Bennett; Karen Brown; David Chrisinger; Melinda Cordero; Elizabeth Curda; Karen Febey; Jill Lacey; Ben Licht; Dan Meyer; Amy Radovich; James Rebbe; Nyree Ryder Tee; Martin Scire; Ryan Siegel; and Walter Vance.

[End of section]

Footnotes:

[1] GAO, Science, Technology, Engineering, and Mathematics Education: Strategic Planning Needed to Better Manage Overlapping Programs across Multiple Agencies, [hyperlink, http://www.gao.gov/products/GAO-12-108] (Washington, D.C.: January 20, 2012).

[2] [hyperlink, http://www.gao.gov/products/GAO-12-108].

[3] Other relevant GAO reports include: (1) Managing for Results: GAO's Work Related to the Interim Crosscutting Priority Goals under the GPRA Modernization Act, [hyperlink, http://www.gao.gov/products/GAO-12-620R] (Washington, D.C.: May 31, 2012); (2) 2012 Annual Report: Opportunities to Reduce Duplication, Overlap and Fragmentation, Achieve Savings, and Enhance Revenue, [hyperlink, http://www.gao.gov/products/GAO-12-342SP] (Washington, D.C.: February 28, 2012); (3) Science, Technology, Engineering, and Mathematics Education: Survey of Federal Programs, an E-supplement to [hyperlink, http://www.gao.gov/products/GAO-12-108], [hyperlink, http://www.gao.gov/products/GAO-12-110SP] (Washington, D.C.: January 20, 2012); and (4) Government Efficiency and Effectiveness: Opportunities to Reduce Fragmentation, Overlap, Duplication and Achieve Other Financial Benefits, [hyperlink, http://www.gao.gov/products/GAO-13-496T] (Washington, D.C.: April 9, 2013).

[4] GAO, Higher Education: Federal Science, Technology, Engineering, and Mathematics Programs and Related Trends, [hyperlink, http://www.gao.gov/products/GAO-06-114] (Washington, D.C.: October 12, 2005).
[5] Pub. L. No. 111-358, § 101, 124 Stat. 3982, 3984.

[6] Although we surveyed 29 earmarks that were funded in fiscal year 2010, we did not include earmark data in our analysis because, according to our survey, 25 of these were not funded in fiscal year 2011.

[7] [hyperlink, http://www.gao.gov/products/GAO-12-108].

[8] Nine program officials indicated that they did not know whether the program was created under their agencies' general statutory authority or through congressional direction, 10 indicated that the program was created by a means other than congressional direction or general statutory authority, and 1 program official did not answer the question.

[9] Forty-four of the 46 Department of Health and Human Services programs were in the National Institutes of Health.

[10] GAO, Teacher Quality: Sustained Coordination among Key Federal Education Programs Could Enhance State Efforts to Improve Teacher Quality, [hyperlink, http://www.gao.gov/products/GAO-09-593] (Washington, D.C.: July 6, 2009).

[11] In addition to reporting on STEM education programs through their performance plans and performance reports, there may be other ways to report on these efforts. However, our analysis was limited to these two documents.

[12] We define "evaluation" as an individual systematic study conducted periodically or on an ad hoc basis to assess how well a program is working, typically relative to its program objectives.

[13] The National Science and Technology Council, Committee on Science, Subcommittee on Education and Workforce, Review and Appraisal of the Federal Investment in STEM Education Research (October 2006).

[14] Pub. L. No. 111-352, 124 Stat. 3866.

[15] [hyperlink, http://www.gao.gov/products/GAO-12-620R].

[End of section]

GAO’s Mission:

The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO’s commitment to good government is reflected in its core values of accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO’s website [hyperlink, http://www.gao.gov]. Each weekday afternoon, GAO posts on its website newly released reports, testimony, and correspondence. To have GAO e-mail you a list of newly posted products, go to [hyperlink, http://www.gao.gov] and select “E-mail Updates.”

Order by Phone:

The price of each GAO publication reflects GAO’s actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO’s website, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information.

Connect with GAO:

Connect with GAO on facebook, flickr, twitter, and YouTube. Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts.
Visit GAO on the web at [hyperlink, http://www.gao.gov].

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact:
Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm];
E-mail: fraudnet@gao.gov;
Automated answering system: (800) 424-5454 or (202) 512-7470.

Congressional Relations:

Katherine Siggerud, Managing Director, siggerudk@gao.gov:
(202) 512-4400:
U.S. Government Accountability Office:
441 G Street NW, Room 7125:
Washington, DC 20548.

Public Affairs:

Chuck Young, Managing Director, youngc1@gao.gov:
(202) 512-4800:
U.S. Government Accountability Office:
441 G Street NW, Room 7149:
Washington, DC 20548.

[End of document]