This is the accessible text file for GAO report number GAO-11-283 entitled 'Intercity Passenger Rail: Recording Clearer Reasons for Awards Decisions Would Improve Otherwise Good Grantmaking Practices' which was released on April 11, 2011. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. United States Government Accountability Office: GAO: Report to the Chairman, Committee on Transportation and Infrastructure, House of Representatives: Intercity Passenger Rail: Recording Clearer Reasons for Awards Decisions Would Improve Otherwise Good Grantmaking Practices: GAO-11-283: GAO Highlights: Highlights of GAO-11-283, a report to the Chairman, Committee on Transportation and Infrastructure, House of Representatives. 
Why GAO Did This Study: The American Recovery and Reinvestment Act of 2009 (Recovery Act) appropriated $8 billion for high and conventional speed passenger rail. The Federal Railroad Administration (FRA), within the Department of Transportation (the department), was responsible for soliciting applications, evaluating them to determine program eligibility and technical merits, and selecting awards, which were announced in January 2010. This report examines the extent to which FRA (1) applied its established criteria to select projects, (2) followed recommended practices for awarding discretionary grants, and (3) communicated outcomes to the public, compared with selected other Recovery Act competitive grant programs. To address these topics, GAO reviewed federal legislation, FRA documents, and guidance for other competitive grant programs using Recovery Act funds. GAO also analyzed data resulting from the evaluation and selection process and interviewed a cross-section of FRA officials and applicants. What GAO Found: FRA applied its established criteria during the eligibility and technical reviews, but GAO could not verify whether it applied its final selection criteria because the documented rationales for selecting projects were typically vague. Specifically, FRA used worksheets and guidebooks that included the criteria outlined in the funding announcement to aid in assessing the eligibility and technical merit of applications. FRA also recorded general reasons for selecting applications and publicly posted broad descriptions of the selected projects. However, the documented reasons for these selection decisions were typically vague or restated the criteria listed in the funding announcement. In addition, there were only general reasons given for the applications not selected or for adjusting applicants' requested funding amounts. 
FRA subsequently provided GAO with more detailed reasons for several of its selection decisions, but this information was not included in the department's record of its decisions. Documentation on the rationales for selection decisions is a key part of ensuring accountability and is recommended by the department as well as other federal agencies. Without a detailed record of selection decisions, FRA leaves itself vulnerable to criticism over the integrity of those decisions—an important consideration, given that passenger rail investments have a very public profile. FRA also substantially followed recommended practices when awarding grants, including communicating key information to applicants prior to the competition, planning for the competition, using a merit review panel with certain characteristics, assessing whether applicants were likely to be able to account for grant funds, notifying applicants of awards decisions, and documenting the rationale for awards decisions (albeit generally). For example, FRA issued a funding announcement that communicated key pieces of information, such as eligibility, technical review, and selection criteria. FRA officials also conducted extensive outreach to potential applicants, including participating in biweekly conference calls, providing several public presentations on the program, and conducting one-on-one site visits with potential applicants. According to FRA, officials used lessons from a number of other grant programs when developing its approach to reviewing and selecting projects. FRA publicly communicated outcome information similar to other Recovery Act competitive grant programs we examined, including projects selected, how much money they were to receive, and a general description of projects and their intended benefits. 
Only one of the programs GAO examined communicated more outcome information on technical scores and comments; however, this program used a much different approach to select awards than FRA used to select intercity passenger rail awards. According to officials, FRA did not disclose outcome information from the technical reviews because officials were concerned that releasing reviewers' names and associated scores could discourage them from participating in future grant application reviews. What GAO Recommends: GAO recommends that FRA create additional records to document the substantive reasons behind award decisions to better ensure accountability for its use of federal funds. In commenting on a draft of this report, the department agreed to consider our recommendation. The department also provided technical comments, which were incorporated as appropriate. View [hyperlink, http://www.gao.gov/products/GAO-11-283] or key components. For more information, contact Susan Fleming at (202) 512-2834 or flemings@gao.gov. 
[End of section] Contents: Letter: Background: FRA Applied Its Established Criteria to Determine Eligibility and Assess Technical Merit, but Selection Rationales Were Typically Too Vague to Assess: FRA Substantially Met Recommended Practices for Awarding Discretionary Grants: FRA Publicly Communicated at Least as Much Outcome Information as Other Competitively Awarded Recovery Act Grant Programs: Conclusions: Recommendation for Executive Action: Agency Comments: Appendix I: Extent to Which Recovery Act Projects Align with Statutory and Other Goals: Appendix II: Scope and Methodology: Appendix III: Difference Between the Amounts Requested and Estimated Awards by State: Appendix IV: Additional Results from Our Statistical Analysis of Award Decisions: Tables: Table 1: Technical Review Criteria: Table 2: Selection Criteria: Table 3: Recommended Practices FRA Followed: Table 4: Recovery Act Applications Supporting High Speed Rail Categories by Future Corridor Speed 5 Years After Project Completion: Table 5: Amounts Awarded, Obligated, and Spent on First Round Track 1 and 2 Recovery Act Projects, as of December 31, 2010: Table 6: FRA Plan for Obligating and Spending Recovery Act Funds and Amounts Obligated and Spent, as of December 31, 2010: Table 7: Guidance and Reports Used To Identify Recommended Government Practices: Table 8: Recovery Act Discretionary Grant Programs Reviewed: Table 9: Difference between the Amounts Requested and Estimated Awards by State, as of January 2010: Table 10: Difference between the Amounts Requested and Estimated Awards by State, as of December 2010: Table 11: Applications Selected by Technical Review Score, and Odds and Odds Ratios Derived from Them: Table 12: Applications Selected by Track, and Odds and Odds Ratios Derived from Them: Table 13: Applications Selected by Requested Amount, and Odds and Odds Ratios Derived from Them: Table 14: Applications Selected by State and State Group, and Odds and Odds Ratios Derived from Them: Table 15: 
Odds Ratios from Bivariate and Multivariate Logistic Regression Models by Technical Review Score, Track, Amount Requested, and State and State Group: Figures: Figure 1: Approach that FRA Used to Assess Applications and Select Recovery Act Recipients: Figure 2: Differences between Proposed and Requested Award Amounts, in Percents: Figure 3: Number of Selected Applications by Technical Review Score: Figure 4: HSIPR Reported Outcomes Compared to Other Competitively Awarded Recovery Act Programs: Abbreviations: department: Department of Transportation: FRA: Federal Railroad Administration: HSIPR: high speed intercity passenger rail: PRIIA: Passenger Rail Investment and Improvement Act of 2008: Recovery Act: American Recovery and Reinvestment Act of 2009: [End of section] United States Government Accountability Office: Washington, DC 20548: March 10, 2011: The Honorable John Mica: Chairman: Committee on Transportation and Infrastructure: House of Representatives: Dear Mr. Chairman: A recent influx of federal funds has breathed new life into the prospect of developing an expanded national passenger rail network in the United States. Specifically, the American Recovery and Reinvestment Act of 2009 (Recovery Act) appropriated $8 billion—significantly more than Congress provided in recent years—to develop high speed and intercity passenger rail service.[Footnote 1] Interest in these funds was high, and in January 2010 the Federal Railroad Administration (FRA)—an agency within the Department of Transportation (the department)—selected 62 applications in 23 states and the District of Columbia to receive the money.[Footnote 2] The vast majority (almost 90 percent) of the $8 billion awarded went to develop new or substantially improved passenger rail corridor projects, which, in several cases, expect to deliver high speed rail service reaching speeds of more than 150 miles per hour. 
The remaining funding generally went to projects focusing on upgrades and improvements to existing rail service (typically up to 79 miles per hour). With the Recovery Act funding, FRA recognized that it needed to transform itself from essentially a rail safety organization to one that can make and oversee multibillion dollar investment choices. [Footnote 3] This report assesses how FRA made the first of those choices and ensured that national investment goals are being met. [Footnote 4] It focuses on the extent to which FRA (1) applied its established criteria to select projects; (2) followed recommended practices for awarding discretionary grants; and (3) communicated outcomes to the public, compared with selected other Recovery Act competitive grant programs. These topics are the main focus of the report. In addition, we are reporting on the extent to which selected projects align with legislative and federal goals. (See appendix I.) Our overall approach to addressing these topics was to (1) review publicly available information, such as federal legislation, plans, and other guidance, about the high speed intercity passenger rail program's evaluation, selection, and communication approach, and compare it to practices used by other competitive grant programs; (2) review documents that FRA used in reviewing applications and selecting awardees to determine the extent to which FRA applied its established criteria; (3) analyze FRA data on technical review scores to determine the statistical relationship between some of FRA's published criteria and the selection decisions; and (4) interview a cross-section of officials from 12 of the 40 states and the District of Columbia that submitted either a preapplication or an application for Recovery Act funding (selected to reflect a range of application outcomes, award amounts, number of applications, and geographic location), a random sample of 18 of the 44 department reviewers that included at least one person from 
each applicant review panel, and other FRA officials who oversaw the evaluation and selected awards.[Footnote 5] We focused our review on projects selected by FRA in January 2010 and funded through the Recovery Act, which included applications submitted for ready-to-go projects (called "track 1a"), the completion of environmental and preliminary engineering requirements necessary to prepare projects for future funding (called "track 1b"), and projects to develop new high speed rail or intercity passenger services or substantially upgrade existing corridor service (called "track 2"). [Footnote 6] We assessed the reliability of FRA's scoring data by conducting a series of data tests, reviewing documents and reports about FRA's data systems, and speaking with officials familiar with the data. We determined that these data are sufficiently reliable for our reporting purposes. (Additional information on our scope and methodology is contained in appendix II.) We conducted this performance audit from April 2010 to March 2011 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. Background: FRA is the primary federal agency responsible for overseeing safety in the railroad industry, as well as for distributing federal funds for intercity passenger rail service. FRA also administers federal operating and capital grants to the National Railroad Passenger Corporation (known as Amtrak), which have averaged between $1 billion and $1.3 billion per year since fiscal year 2003. 
FRA also approves Railroad Rehabilitation and Improvement Financing loans and Rail Line Relocation and Improvement Capital grants, and is the granting agency for the $120 million in fiscal year 2008 and fiscal year 2009 capital funds for intercity passenger rail projects. Recent legislation has vastly increased the federal role in and federal funds for developing intercity passenger rail service. The Passenger Rail Investment and Improvement Act of 2008 (PRIIA), enacted in October 2008, authorized more than $3.7 billion for three different federal programs for high speed rail, intercity passenger rail, and congestion reduction grants.[Footnote 7] PRIIA also called for FRA to create a preliminary national rail plan within 1 year after passage of the act as well as a long-range national rail plan that promotes an integrated and efficient national rail system. FRA released a preliminary national rail plan in October 2009 and a subsequent progress report in September 2010.[Footnote 8] The Recovery Act, enacted in February 2009, appropriated $8 billion for the three PRIIA-established intercity passenger rail programs. [Footnote 9] Unlike PRIIA, which authorized an 80 percent federal share for a project's capital costs, the Recovery Act provided up to 100 percent federal funding available for obligation through fiscal year 2012 and expenditure through fiscal year 2017.[Footnote 10] The Recovery Act required that the department develop a strategic plan to use these funds. In April 2009, FRA released its strategic plan for developing high speed rail in America and distributing federal funds. [Footnote 11] PRIIA and the Recovery Act created new responsibilities for FRA to plan, award, and oversee the use of new federal funds for intercity passenger rail. 
In response, FRA launched the high speed intercity passenger rail (HSIPR) program in June 2009 by issuing a funding announcement and interim guidance, which outlined the requirements and procedures for obtaining federal funds.[Footnote 12] FRA further outlined the vision and goals of the program through a number of outreach events and meetings, including seven regional workshops and more than 25 one-on-one site visits and conference calls with potential state applicants.[Footnote 13] States expressed a great deal of enthusiasm for the new program, requesting $102 billion across 278 preapplications, which FRA used to gauge initial interest and anticipate its staffing needs to manage the program. States, including the District of Columbia, ultimately submitted 229 applications for $57.8 billion in Recovery Act funds.[Footnote 14] FRA asked applicants to submit Recovery Act project applications under three tracks: 1a, 1b, and 2. Track 1 was intended to primarily address economic recovery goals, and could either focus on ready-to-go projects (track 1a) or the completion of environmental and preliminary engineering requirements necessary to prepare projects for future funding (track 1b). Track 2 focused on much larger, long-term projects to develop new high speed rail services or substantially upgrade existing corridor service. While track 1 and track 2 applications were submitted and reviewed at different times, FRA used a similar approach to assess them, and applied the same criteria during three independent steps: eligibility determination, technical review, and selection. [Footnote 15] (See figure 1.) Figure 1: Approach that FRA Used to Assess Applications and Select Recovery Act Recipients: [Refer to PDF for image: time line] June 2009: FRA launches the HSIPR program. FRA issues interim guidance and a funding announcement with the requirements and procedures for obtaining federal funds. July 2009: Preapplications deadline. 
FRA received 278 preapplications expressing initial interest from 40 states and the District of Columbia totaling $102 billion. August 2009: Track 1: Applications due to FRA. FRA received 184 applications requesting $6.9 billion. August-September 2009: Track 1: Eligibility determination. FRA determined 156 applications requesting $3.5 billion were eligible. September 2009: Track 1: Technical review. Twelve panels, each composed of three individuals, assessed eligible applications on six pre-established technical review criteria. Track 1: Preliminary selection. Senior department and FRA officials considered the results of the technical review as well as four additional pre-established selection criteria. October 2009: Track 2: Applications due to FRA. FRA received 45 applications requesting $50.9 billion. October-November 2009: Track 2: Eligibility determination. FRA determined 23 applications requesting $20.6 billion were eligible. November-December 2009: Track 2: Technical review. One panel composed of eight individuals assessed eligible applications on six pre-established technical review criteria. December 2009: Track 2: Preliminary selection. Senior department and FRA officials considered the results of the technical review as well as four additional pre-established selection criteria. January 2010: Track 1 and 2: Final decisions. Senior department and FRA officials examined projects from all tracks and recommended the selection of specific projects. The Secretary of Transportation concurred with these recommendations. Track 1 and 2: Award announcement. FRA announced the selection of 62 applications from 23 states and the District of Columbia for the $8 billion in Recovery Act funds. * Track 1: 48 applications were selected for $0.9 billion. * Track 2: 14 applications were selected for $7.0 billion. Source: GAO presentation of FRA data. 
[End of figure] The eligibility determination was conducted by panels of officials experienced in environmental requirements or passenger and commuter rail. These officials used worksheets to aid in assessing application completeness and determining whether applicants and proposed projects were eligible to receive funds. Eligibility panels also made preliminary determinations as to whether applicants had substantially completed environmental requirements and whether the projects they submitted were ready to begin. For the most part, applications deemed not yet ready or ineligible were not forwarded for technical review; because the track 1 eligibility and technical review periods overlapped, there were two Recovery Act applications that received technical review scores and were later deemed not yet ready or ineligible and removed from award consideration. The technical review was conducted by panels of officials with experience in several fields, such as passenger and commuter rail, grants management, and environmental requirements. The technical review differed slightly for track 1 applications, which were reviewed by 12 panels composed of three reviewers, and track 2 applications, which were reviewed by a single panel of eight reviewers. For both tracks, reviewers used guidebooks to assess applications against six technical review criteria: (1) transportation benefits, (2) economic recovery benefits, (3) other public benefits (e.g., environmental quality and energy efficiency), (4) project management approach, (5) sustainability of benefits, and (6) timeliness of project completion. (See table 1.) The guidebooks provided detailed descriptions of what was included within each of these criteria, as well as step-by-step instructions on reviewing applications that included a suggested scoring method using a scale from one (lowest) to five (highest). 
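As an illustration of the kind of calculation this one-to-five scoring feeds into, the sketch below averages each criterion's scores across a panel of reviewers and then weights them into a single final score, in the spirit of the weighted-average formula FRA officials applied. The criterion weights and reviewer scores here are hypothetical stand-ins, not FRA's actual figures, which were set by the priorities in the funding announcement.

```python
# Hypothetical weights for the six technical review criteria (sum to 1.0).
# These are illustrative assumptions, not the weights FRA actually used.
WEIGHTS = {
    "transportation_benefits": 0.25,
    "economic_recovery_benefits": 0.20,
    "other_public_benefits": 0.10,
    "project_management_approach": 0.15,
    "sustainability_of_benefits": 0.15,
    "timeliness_of_completion": 0.15,
}

def weighted_score(panel_scores):
    """Average each criterion's 1-5 reviewer scores, then apply the weights.

    panel_scores maps criterion name -> list of individual reviewer scores.
    Returns one weighted final score for the application.
    """
    final = 0.0
    for criterion, weight in WEIGHTS.items():
        scores = panel_scores[criterion]
        final += weight * (sum(scores) / len(scores))
    return final

# Example: a hypothetical application scored by a three-reviewer panel.
example = {
    "transportation_benefits": [4, 5, 4],
    "economic_recovery_benefits": [3, 4, 4],
    "other_public_benefits": [3, 3, 4],
    "project_management_approach": [5, 4, 4],
    "sustainability_of_benefits": [4, 4, 3],
    "timeliness_of_completion": [5, 5, 4],
}
print(round(weighted_score(example), 2))  # prints 4.05
```

A further standardization step, such as converting each panel's final scores to z-scores, could then adjust for panels that scored systematically high or low, which is the kind of correction FRA applied to track 1 scores.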
For example, the track 2 guidebook suggested that applications that included more than one major weakness, were nonresponsive, or failed to address a particular criterion be given a technical review score of one for that criterion. Applications that technical panelists determined were responsive, and included major and minor strengths and no major or very few minor weaknesses in a particular criterion, were to be given a technical review score of five for that criterion. After completing an individual evaluation of each application, reviewers convened within their panel to discuss their overall thoughts on the application and technical review scores for each criterion, which they could revise based on input from other panelists. To arrive at a final score for each application, FRA officials used a formula that averaged individual scores and weighted the scores based on established priorities identified in the funding announcement.[Footnote 16] In addition, program officials standardized track 1 application scores to correct for potential inconsistencies across review panels.[Footnote 17] Table 1: Technical Review Criteria: Criteria: Transportation benefits; Types of factors to be considered by reviewers when assessing the application: * Supports the development of high speed passenger rail and generates improvements to intercity passenger rail. * Reduces congestion across other modes of transportation. * Encourages integration across other modes of transportation, such as connections at airports, bus terminals, and subway stations. * Promotes equipment standardization, signaling, communication, and power. * Provides for cost sharing across benefiting rail users, including freight and commuter railroads, host railroads, and state and local government. * Improves the overall safety of the transportation system. 
Criteria: Economic recovery benefits; Types of factors to be considered by reviewers when assessing the application: * Promotes business opportunities, including the short- and long-term creation and preservation of jobs. * Increases efficiency by promoting technological advances. * Avoids reduction in the essential services provided by states. Criteria: Other public benefits; Types of factors to be considered by reviewers when assessing the application: * Contributes to environmental quality and energy efficiency and reduces dependence on foreign oil. * Promotes livable communities, including integration with high- density, livable developments (e.g., central business districts with access to public transportation, pedestrian, and bicycle networks). Criteria: Project management approach; Types of factors to be considered by reviewers when assessing the application: * Assesses that the applicant has the financial, legal, and technical capacity to implement the project. * Considers the applicant's experience in administering similar grants. * Examines the soundness of cost methodologies, assumptions, and estimates. * Examines whether the application is complete and includes comprehensive supporting documentation, such as a schedule for project implementation, a project management plan, agreements with key partners, an explanation of progress towards completing environmental requirements, and any completed engineering work. Criteria: Sustainability of benefits; Types of factors to be considered by reviewers when assessing the application: * Considers the quality of the financial and operating service plans. * Examines the reasonableness of revenue and operating maintenance cost forecasts, and estimates for user and nonuser benefits. * Assesses that funds are available to support operating costs. * Considers agreements with key partners, such as the proposed operator and railroads that own pieces of infrastructure necessary to achieve benefits. 
Criteria: Timeliness of project completion; Types of factors to be considered by reviewers when assessing the application: * Assesses whether the project is ready-to-go, will be completed on time, and will deliver the proposed benefits. Source: GAO summary of FRA's funding announcement. [End of table] After the technical review, senior department and FRA officials—Deputy Secretary, Under Secretary for Policy, FRA Administrator, and FRA Deputy Administrator, among others—selected projects to recommend to the Secretary of Transportation. They considered the technical review scores along with four additional pre-established selection criteria identified in the funding announcement: (1) region and location, (2) innovation, (3) partnerships, and (4) track type and funding round. (See table 2.) HSIPR program officials gave five briefings to senior officials on the results of the technical review and possible factors to consider in making award decisions, such as potential project cost, service speed, shared benefits, and readiness. Program officials also provided additional information, including funding scenarios, fact sheets on individual applications, and corridor maps upon request. According to FRA, senior officials considered this information when making their recommendations, but did not numerically score or rank applications. Table 2: Selection Criteria: Criteria: Region and location; Factors to be considered by senior department and FRA officials when assessing the application: * Ensures projects are distributed across the country, in both small and large population centers. * Ensures integration and augmentation of projects across the nationwide transportation network. * Provides assistance to economically distressed regions. Criteria: Innovation; Factors to be considered by senior department and FRA officials when assessing the application: * Pursues new technology where the public return on investment is favorable. 
* Promotes domestic manufacturing, supply, and industrial development. * Develops passenger rail engineering, operating, planning, and management capacity. Criteria: Partnerships; Factors to be considered by senior department and FRA officials when assessing the application: * Emphasizes organized partnerships with joint planning and prioritization of investments when projects span multiple states. * Encourages creative approaches to workforce diversity and use of disadvantaged and minority businesses. Criteria: Track type and funding round; Factors to be considered by senior department and FRA officials when assessing the application: * Preserves funds for track 2 projects, as well as future funding rounds, if possible. Source: GAO summary of FRA's funding announcement. [End of table] On January 27, 2010, the FRA Administrator recommended 62 applications for funding and the Secretary of Transportation concurred with these recommendations. On January 28, 2010, the department announced the selections. [Footnote 18] The selections were spread across several types of intercity passenger rail, including projects for emerging high speed rail (operating at speeds up to 90 miles per hour), regional corridors (operating at speeds between 90 and 124 miles per hour), and core express corridors (operating at speeds between 125 and 250 miles per hour or more). For example, the department selected one project to receive $35 million to rehabilitate track and provide service from Portland to Brunswick, Maine, at speeds up to 70 miles per hour. Another project was to receive more than $50 million to construct 11 miles of dedicated passenger rail track near Rochester, New York, which will allow for service speeds up to 110 miles per hour. 
A third project was selected to receive almost $2.3 billion to initiate the first part of California's high speed rail system, which will allow for more than 200 miles per hour service between Los Angeles, San Francisco, and the Central Valley, and eventually, San Diego. These selections were consistent with the criteria in PRIIA, the Recovery Act, and FRA's strategic plan, which included broad goals that gave FRA discretion in developing a national passenger rail system. (Additional information on the legislative and program goals, and how the selected projects fit into them, is contained in appendix I.) FRA Applied Its Established Criteria to Determine Eligibility and Assess Technical Merit, but Selection Rationales Were Typically Too Vague to Assess: FRA applied its established criteria to determine eligibility and assess applications' technical merit. However, its rationales for selecting projects were typically too general to determine how it applied the additional selection criteria. When asked for more information on certain applications, FRA provided specific reasons for its selection decisions, but, in our opinion, creating a detailed, comprehensive record alongside the final selections is preferable. Officials reported that they used the technical review scores as a starting point from which to apply each of the four selection criteria, which is partially supported by our analysis of FRA data. For example, we found that applications receiving a higher technical review score were about seven to eight times more likely to be selected for an award compared to those receiving a lower technical review score. FRA Applied Its Established Criteria to Determine Eligibility and Assess Technical Merit: We found that FRA applied eligibility criteria established in its funding announcement when determining whether applications were eligible. 
Specifically, eligibility criteria listed in the funding announcement aligned with criteria outlined in the worksheets used by the panelists to verify that applications were eligible. Panelists were given separate worksheets to conduct the track 1 and track 2 eligibility reviews, and each of these worksheets included eligibility criteria listed in the funding announcement. For instance, as outlined in the funding announcement, the track 1 worksheet required eligibility panelists to indicate if the application was submitted on time, by an eligible applicant, and with all of the required supporting documents. Similarly, the track 2 worksheet included questions regarding applicant eligibility, qualifications, and construction grant prerequisites, which aligned with the eligibility criteria listed in the funding announcement. FRA also applied the established technical review criteria communicated in its funding announcement by including these criteria in the guidebooks provided to technical panelists to assess the technical merits of each application. Specifically, the guidebooks FRA provided to panelists for reviewing track 1a, 1b, and 2 applications were divided into six sections that aligned with each of the six technical review criteria listed in the funding announcement. Moreover, the criteria within these sections of the guidebook often matched the criteria in the funding announcement very closely and, in some cases, word-for-word. 
For example, the funding announcement stated that an applicant's experience administering similar projects would be considered under the technical review criterion of project management approach, which was included word-for-word in the project management approach section of the guidebooks.[Footnote 19] We spoke to at least one representative from each technical review panel; these representatives confirmed that panelists used the criteria listed in the guidebooks and did not use other criteria during their evaluation of individual applications. Officials Reported Using Technical Review Panel Results and Selection Criteria to Make Awards Recommendations; Decision Rationales Provided Little Insight into Selections: Senior department and FRA officials recommended to the Secretary of Transportation applications to receive awards and the proposed amounts of the awards. When deciding which applications to recommend for awards, senior FRA officials told us that they used the results of the technical review panels and the four selection criteria. These four criteria were described in FRA's June 2009 funding announcement as: (1) region and location (e.g., ensuring geographic balance, integration into the nationwide transportation network, and assistance to economically distressed regions), (2) innovation (e.g., pursuing new technology with a favorable public return, promoting domestic manufacturing, and developing human capital capacity for sustainable rail development), (3) partnerships (e.g., multi-state planning and investment and workforce diversity), and (4) tracks and round timing (e.g., longer-term track 2 corridor development balanced with ready-to-go track 1 investments). For example, officials stated that they used the innovation criterion to select applications with higher proposed speeds of service.
In particular, senior officials reported using this criterion to reinforce the selection of the California and Florida intercity passenger rail projects, which were the only eligible projects with the potential for service above 150 miles per hour. In another example, officials reported that they applied the partnership criterion by assessing applicants' track record with implementing large transportation projects as well as demonstrated relationships with key stakeholders, such as private railroads. Senior FRA officials stated they developed their funding amount recommendations based on their professional judgment and national high speed intercity passenger rail program goals. Officials told us they accounted for the risks related to the total cost of the project during selection discussions and weighed them against the overall policy goals of developing a national high speed passenger rail network. Officials also stated that they used their professional judgment about rail systems to recommend the award amounts for each application, paying particular attention to the amounts distributed to the large, track 2 projects, and that they are continuing to assess the effect of changes to the requested funding amounts during the scope of work negotiations with awardees. According to FRA, its rationales for selecting applications are recorded in a recommendation from the FRA Administrator to the Secretary of Transportation and in a memorandum from the Secretary to the Administrator concurring on the recommendations and specifying potential funding amounts. The rationales stated in these memorandums were typically vague, such as "aligns well with FRA's published evaluation criteria" and "will result in significant transportation benefits [and] preserve and create jobs." 
These rationales most often restated the criteria listed in the funding announcement generally (e.g., result in significant transportation benefits) rather than providing insight into why the department viewed projects as meritorious. In addition, the memorandums did not provide any information on why other applications were not recommended for selection, which prevented us from assessing how the department viewed the merits of successful applications over unsuccessful ones. For example, we found several instances in which, without documentation, it was difficult to determine why some projects were selected and others were not. Specifically, FRA decided not to select six track 1a applications from New York that received higher technical review panel scores, and selected a lower scoring track 1a application from the same applicant. FRA officials subsequently told us that the lower scoring application was selected for a number of reasons, including improving the reliability of the passenger trains on the rail line, ensuring that the project will become part of the infrastructure of any significant improvements to passenger rail service west of Albany, and improving the fluidity of both passenger and freight rail operations on this heavily used rail route. Similarly, FRA selected a lower scoring track 1a application from Illinois, but not a relatively high scoring one. FRA officials subsequently told us that they selected this application because it is an essential part of a long-standing program of projects to improve the fluidity of rail traffic in the highly congested Chicago area. FRA officials also told us that the scope of the relatively high scoring track 1a application was included in Illinois' selected track 2 application. This level of information, which provides some insight into the merits of projects, was not included in the department's record of its decisions.
In addition to the memorandums, FRA posted descriptions on its Web site of the selected projects, their expected benefits, and prospective award amounts. However, these descriptions are not particularly useful in understanding why these projects were selected because the cited benefits--such as reducing travel times, increasing travel speed and ridership, providing attractive transportation alternatives, and creating jobs--were supposed to be integral to all projects. For example, FRA's Web site describes one project as increasing on-time performance and ultimately allowing speeds of up to 110 miles per hour on its segment, but does not give any indication why this project was meritorious. Other descriptions were similar. FRA also sent letters to individual applicants regarding its decisions and, if an application was not selected, a brief explanation of why it was not selected. For example, a number of these letters explained that applications were not selected because they did not meet a prerequisite, had application materials that did not provide sufficient support for the proposed activities, or did not submit all application materials necessary to adequately evaluate the project. However, these letters did not provide further details on how the proposed projects did not meet the prerequisite, how the application materials were insufficient, or which application materials were not received. Other decision letters provided applicants with similarly broad explanations. FRA officials also told us that they called all applicants, as well as their state secretaries of transportation and state governors, to inform them of FRA's decisions. Several of the states that we contacted reported that the primary purpose of these calls was for FRA to provide feedback on their individual projects and, when requested, give explanations for why projects were not selected.
While applicants stated that this information will be helpful during future application rounds, there is no required written record of these conversations and, therefore, the calls do not provide others with insight into why selection decisions were made. Documentation of agency activities is a key part of accountability for decisions.[Footnote 20] The department has a financial assistance guidance manual to assist agencies with administering awards competitions, which FRA officials told us they used to develop the competition framework.[Footnote 21] The manual recommends that all discretionary project selections, such as the intercity passenger rail awards, include an explanation of how the projects were selected based on the established funding priorities, but does not lay out expectations for the level of explanation. In particular, the manual recommends that officials document decisions if projects with the highest priority are not funded. While the department documented its decisions, as recommended by its financial assistance guidance, the absence of an insightful internal record of the reasons behind award recommendations, and the final selections where they differ, can give rise to challenges to the integrity of the decisions made. While FRA was able to provide us with specific reasons on a case-by-case basis for why projects were selected, almost a year after these decisions were made, we believe creating a sufficiently detailed record has increased relevance in high-stakes, high-profile decisions, such as the intercity passenger rail awards competition, in which there are vocal critics and ardent supporters of the program. Similar arguments apply to creating an internal record for amounts recommended for awards.
FRA officials understood that the available Recovery Act funds were not sufficient to fully fund a number of the projects and sought to fund projects or portions of projects that could provide transportation benefits if no additional federal funds were available. For these decisions, FRA proposed awarding 10 states (including the District of Columbia) all (100 percent) of the funds they applied for, 8 states nearly all (91-99 percent) of the funds they applied for, 5 states some (47-86 percent) of the funds they applied for, and one state slightly more (104 percent) than it applied for.[Footnote 22] (See figure 2. See also app. DI for dollar amounts associated with figure 2.) The applicant notification letters did not offer an explanation for why FRA proposed award amounts that differed from requests, and applicants we spoke with did not report that FRA had provided such information to them. Given that infrastructure projects are prone to cost growth, developing a record that explains why the recommended award amounts are appropriate for the proposed project lends integrity to the final decisions. Figure 2: Differences between Proposed and Requested Award Amounts, in Percents: Percent of requested funds proposed for award: 100 or more; Number of states: 9. Percent of requested funds proposed for award: 91-99; Number of states: 10. Percent of requested funds proposed for award: 47-86; Number of states: 5. Source: GAO analysis of FRA data. [End of figure] The current economic climate has also increased the importance of providing an internal rationale for large differences between requested funds and proposed award amounts. Many states have faced large budget deficits in 2010 that will require them to make difficult budget decisions about the future use of state funds, particularly where the Recovery Act awards will not provide all the funding expected to be needed to complete a project.
For example, as of June 2010, Florida had made $3 billion in budget cuts to close its budget deficit. For its high speed rail award, Florida is slated to receive less than half of what it said is needed to complete the proposed Tampa to Orlando High Speed Rail Express project. An official from the Florida Department of Transportation is hopeful that Florida will receive additional federal grants, but is unsure where the remaining funds will otherwise come from.[Footnote 23] Additionally, Washington state applied for 16 separate projects totaling $976 million and was selected to receive a composite award of $590 million. Washington state officials acknowledged that the award amount will not fund all 16 of the projects, and have since reduced the scope of the application to the 11 projects that could be completed with the awarded amount while still providing the maximum benefit to the corridor.[Footnote 24] FRA officials stated that they awarded amounts that differed from those requested in applications because they recognized that many of the projects were based on preliminary work that was not well refined, and that states differed in their ability to accurately estimate costs.[Footnote 25] In contrast, North Carolina received 4 percent more funding than originally requested. According to FRA, the additional funding was allocated to North Carolina for possibly adding train frequencies to a Recovery Act project. While we recognize that FRA may have developed these proposed award amounts for good reasons, without a written record of the department's rationale for these adjustments, after-the-fact reconstructions of funding amount decisions invite outside criticism of the decisions. Applications with Higher Technical Review Scores Were Typically Chosen Over Those with Lower Scores: One of your interests was in how the results of technical review panels aligned with final award decisions.
As discussed earlier, while FRA considered the technical review panels to be an important part of its decision making, they were not the sole basis for selecting projects. This was detailed in FRA's funding announcement, which described how applications were first to be assessed against six technical review criteria and then final recommendations would be made using the technical review results and four selection criteria. While the technical review panel evaluations alone were not meant to designate final selections, we found that, of the 179 eligible Recovery Act applications, 92 percent (57 of 62) of those senior management recommended for funding were higher scoring applications; that is, they received review panel scores of 3 or higher out of 5 possible points. (See figure 3.) Within these recommended applications, most received a technical review score of 3 or 4, and three of the five applications that received a technical review score of 5 were recommended for selection. One of the two applications that scored a 5 and was not selected for funding was included in a selected track 2 application. Figure 3: Number of Selected Applications by Technical Review Score: [Refer to PDF for image: stacked vertical bar graph] Score: 1; Applications Selected: 0; Applications Not Selected: 3. Score: 2; Applications Selected: 5; Applications Not Selected: 31. Score: 3; Applications Selected: 35; Applications Not Selected: 48. Score: 4; Applications Selected: 19; Applications Not Selected: 33. Score: 5; Applications Selected: 3; Applications Not Selected: 2. Source: GAO analysis of FRA data.
[End of figure] In a few cases, though, senior officials recommended applications that received a lower technical review score (i.e., a score of 2) because, according to FRA, they believed these projects included freight and commuter rail service partners that were willing to make cost contributions in line with their potential benefit share, were strategically important to other selected applications, or helped to achieve regional balance. These considerations were included in the four selection criteria senior department and FRA officials said they used to evaluate applications. For example, one of the two applications from Rhode Island requested $1.2 million to complete preliminary engineering and environmental reviews and received a lower overall technical review score, in part because technical reviewers did not believe the applicant had sufficiently quantified the transportation and economic recovery benefits. This application was later recommended for selection. According to FRA, senior officials recommended applications receiving lower technical review scores, such as this Rhode Island application, in part to achieve greater regional balance. Additionally, FRA indicated this particular application was one of the few proposed for the Northeast Corridor, which further supported the region/location selection criterion. In another instance, senior officials selected a track 2 application from California that requested $194 million for preliminary engineering and environmental requirements for a large corridor project, even though the application received a lower technical review panel score. According to FRA, senior officials recommended some applications receiving lower technical review scores due to the projects' strategic importance to other selected applications.
Officials stated that they recommended the track 2 California application because the completion of preliminary engineering and environmental requirements was necessary to move forward on several other large California projects also recommended for an award. Officials also told us that some applications receiving a higher technical review score (i.e., 3, 4, or 5) were not selected in order to ensure regional balance, especially when an applicant had already been selected for other large awards. For example, a track 1a application from North Carolina received a higher technical review panel score due, in part, to the anticipated transportation benefits of increased ridership and on-time performance, and the applicant's estimates that the project would create more than 400 new jobs. Most of the projects that North Carolina applied for under this application were also included as part of a larger, intercity passenger rail application that was later recommended for selection, and the state was awarded an estimated total of $545 million for high and conventional speed rail projects. Department and senior FRA officials reported that more highly evaluated applications were not selected if the proposed project was already included in larger selected projects, to avoid duplicative selections. Another example was FRA's decision not to select a higher scoring track 1a application from Florida that requested $270 million to acquire 61 miles of right-of-way. This application was scored highly due in part to its immediate benefits and substantial contribution of state funds, but, as with North Carolina, Florida had already been awarded $1.25 billion for a separate large, track 2 corridor project. In addition, almost 90 percent of the applications that scored a 4 and were not selected were submitted by applicants that had either already received a large award or had submitted a relatively high number of applications.
To provide further insight into the attributes that were consistent with being selected for Recovery Act awards, we examined technical review scores and application data using a statistical model and found that two of the four variables we included in our model, technical review scores and the number of applications submitted per state, were significantly related to the likelihood of an application being selected for an award.[Footnote 26] Applications with higher scores (i.e., scores of 3, 4, or 5) were about seven to eight times more likely to be selected than those with scores of 1 or 2. For example, an application receiving a technical review score of 5, the highest possible score, was more than nine times more likely to be selected for an award than an application receiving a technical review score of 1 or 2. This analysis supports statements from senior department and FRA officials indicating that the technical review scores were largely the basis for their selection deliberations. Additionally, we found that states submitting fewer applications (i.e., between one and three) were more than three times more likely to have their applications selected than states submitting higher numbers of applications (i.e., between four and nine). This result suggests that selection officials attempted to spread the awards across different applicants, which is consistent with FRA's reported efforts to attain geographic distribution. However, the results differed somewhat for the four states that submitted 10 or more applications. In this case, two of the states had a lower likelihood of being selected for an award than states submitting fewer than 10 applications, while one state had a higher likelihood of being selected. One additional state had about the same likelihood of being selected as states submitting between four and nine applications.
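Readers who wish to cross-check the direction of this relationship can collapse the figure 3 counts into a raw, unadjusted odds ratio. The short Python sketch below is purely illustrative and is not the statistical model used for this analysis; because the model also controlled for additional variables (such as the number of applications submitted per state), its seven-to-eight-fold estimate differs from the unadjusted figure this calculation yields.

```python
# Illustrative only: compute an unadjusted odds ratio of selection
# for higher scoring applications (scores 3-5) versus lower scoring
# ones (scores 1-2), using the counts reported in figure 3.
counts = {  # score: (selected, not selected)
    1: (0, 3),
    2: (5, 31),
    3: (35, 48),
    4: (19, 33),
    5: (3, 2),
}

high_sel = sum(counts[s][0] for s in (3, 4, 5))  # 57 selected
high_not = sum(counts[s][1] for s in (3, 4, 5))  # 83 not selected
low_sel = sum(counts[s][0] for s in (1, 2))      # 5 selected
low_not = sum(counts[s][1] for s in (1, 2))      # 34 not selected

# Odds ratio = (odds of selection given a high score) /
#              (odds of selection given a low score)
raw_odds_ratio = (high_sel / high_not) / (low_sel / low_not)
print(round(raw_odds_ratio, 1))  # about 4.7, unadjusted
```

Because this raw ratio omits the model's covariates, it should be read only as a directional check on the finding, not as a replication of the reported seven-to-eight-fold estimate.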
When asked about these differences across states, FRA officials said that the number of applications submitted did not affect their selection decisions. FRA Substantially Met Recommended Practices for Awarding Discretionary Grants: We identified six recommended practices used across the federal government to ensure a fair and objective evaluation and selection of discretionary grant awards. These practices are based on policies and guidance used by the Office of Management and Budget and other federal agencies--including the department--and our work.[Footnote 27] FRA substantially followed these practices, including communicating key information to applicants, planning for the competition, using a technical merit review panel with desirable characteristics, assessing applicants' ability to account for funds, and notifying applicants of awards decisions. (See table 3.) In our opinion, FRA partially met one recommended practice: documenting the rationale for funding decisions. As discussed previously, we believe it would have been beneficial to provide more detail about the rationales for these decisions. According to FRA officials, the methods they used to evaluate and select applications were based on best practices collected from several other federal government agencies, which we believe likely helped them meet a number of the recommended practices we identified. Table 3: Recommended Practices FRA Followed: Practice: Communicate with potential applicants prior to the competition; Attributes of practice: Provide information prior to making award decisions on available funding, key dates, competition rules (i.e., eligibility, technical review, and selection criteria), funding priorities, types of projects to be funded, outreach efforts to new applicants and preapplication assistance; FRA followed? Yes.
Practice: Plan for administering the technical review; Attributes of practice: Develop a plan for the technical review that describes the number of panels and reviewers and includes methods for assigning applications to review panels, identifying reviewers, recording the results of the technical review, resolving scoring variances across panels, and overseeing the panel to ensure a consistent review; FRA followed? Yes. Practice: Develop a technical review panel with certain characteristics; Attributes of practice: Use a technical review panel consisting of reviewers who hold relevant expertise, do not have conflicts of interest, apply the appropriate criteria, and are trained; FRA followed? Yes. Practice: Assess applicants' capabilities to account for funds; Attributes of practice: Assess applicants' abilities to account for funds by determining if applicants meet eligibility requirements, checking previous grant history, assessing financial management systems, and analyzing project budgets; FRA followed? Yes. Practice: Notify applicants of awards decisions; Attributes of practice: Notify unsuccessful and successful applicants of selection decisions in writing and provide feedback on applications; FRA followed? Yes. Practice: Document rationale for awards decisions; Attributes of practice: Document the rationale for awards decisions, including the reasons individual projects were selected or not selected and how changes made to requested funding amounts may affect applicants' ability to achieve project goals; FRA followed? Partially. Source: GAO analysis of federal agency guidance and the HSIPR evaluation and selection approach. [End of table] * Communicate with potential applicants prior to the competition. FRA issued a funding announcement that included information on the $8 billion in available funding, key dates, the competition rules, the funding priorities and relative importance for each one, and the types of projects FRA would consider for federal grants. 
Applicants we spoke with praised FRA's communication and stated that FRA officials did a good job providing information and answering questions during the period leading up to the preapplication and application deadlines. For example, officials from several states indicated that FRA officials participated in biweekly conference calls, which were helpful in understanding the technical aspects of how to apply. Applicants also indicated that the outreach events, particularly the site visits, helped them refine their applications and ensure projects met program requirements. * Plan for administering the technical review. FRA developed two plans for determining technical merit: (1) the track 1 technical review used 12 panels, each composed of three reviewers, and (2) the track 2 technical review used one panel of eight reviewers. Track 1 applications were randomly assigned across the panels, while the track 2 panel reviewed all of the eligible applications. FRA recruited volunteers from within FRA and from several other agencies within the department to participate in the technical reviews. FRA officials also provided reviewers with guidebooks to document their application assessments and instructed them to input the results, including scores and comments, into a centralized database. FRA standardized final track 1 application scores to account for any unintentional differences in the way panels assessed and scored applications, but did not need to standardize track 2 scores because the review was conducted by a single panel. Finally, according to officials, FRA oversaw the review by examining technical review scores and comments, and conducting daily meetings with representatives from each panel to ensure panelists were consistently applying the criteria. * Develop a technical review panel with certain characteristics.
FRA compiled technical review panels that included staff with backgrounds in several relevant fields, such as grants management, passenger and commuter rail, and environmental requirements, and made other knowledgeable staff available if panelists had questions. FRA officials stated that panelists were also required to sign or submit a previously completed conflict of interest form to attest to their independence. In addition, panelists were given guidebooks to assess applications that included the technical review criteria and were told by FRA program officials to apply only these criteria during their efforts. FRA also trained panelists during a 1-day orientation session. * Assess applicants' capabilities to account for funds. FRA required applicants to provide information on their ability to account for funds. Specifically, applicants were asked to describe their experience, if any, managing rail investment projects. If applicants reported that they did not have experience on projects similar to the one they were proposing, FRA instead asked applicants to include a plan for building the capacity to manage the project. The application also required applicants to provide information on their financial management capability, including previous audit results, and the applicants' ability to manage potential cost overruns and financial shortfalls. In addition, FRA required applicants to submit supplemental materials such as a detailed capital cost budget, which provided a breakdown of the activities included in each application and their anticipated cost. FRA assessed these pieces of information through an eligibility panel, to ensure the application was complete, and a technical review panel, to evaluate the applicants' overall ability to manage the project. * Notify applicants of awards decisions. FRA officials provided each applicant with a letter indicating which applications were selected and a general reason why individual applications were not selected.
While FRA did not include estimated award amounts in these notification letters, this information was made publicly available on the department's Web site and distributed through a press release. In addition, most of the applicants we spoke with indicated that FRA provided informal feedback on applications via telephone calls shortly after the awards were announced. For example, an official from one applicant stated that FRA provided information on ways to improve applications that were not selected, which the applicant used when applying for funds in future rounds. * Document rationale for awards decisions. According to guidance from the department, the Department of Commerce, and the Department of Education, as well as our work, agencies should document their rationale for award decisions. As stated previously, FRA documented how it applied the technical criteria for selected projects, and provided applicants with a general explanation for selecting or rejecting individual projects. However, as discussed in a previous section, in our view FRA typically did not clearly document specific reasons for selecting individual projects, reasons for not selecting other projects, or how changes made to requested funding amounts might affect applicants' ability to achieve project goals. According to FRA, officials used lessons from a number of other government programs when developing the method for evaluating and selecting projects. For example, one of the officials responsible for developing the funding announcement, technical review guidebooks, and the format of the technical review panels stated that he relied on his experience working with large transit grants to create a review that was quantifiable but still allowed for subjective professional judgment.
In addition, this official noted that FRA examined the methods used by other agencies, such as the Department of Health and Human Services, the Department of Justice, and the Federal Transit Administration, to develop and implement a list of best practices for awarding discretionary grants. FRA Publicly Communicated at Least as Much Outcome Information as Other Competitively Awarded Recovery Act Grant Programs: FRA publicly communicated outcome information, such as a list of awards and the award amounts, at a level similar to or greater than most other Recovery Act competitive grant programs that we examined. Specifically, FRA communicated information on award decisions to the public, but did not communicate the results of the technical review that had contributed to these decisions. Only one of the programs that we examined--the Department of Education's State Innovation grants (known as Race to the Top[Footnote 28])--publicly communicated the results of its technical review, which included technical scores and comments; however, this program used a much different approach for selecting awardees than the HSIPR program. Members of Congress and the President have emphasized the need for accountability, efficiency, and transparency in the expenditure of Recovery Act funds and have made these principles central to the act. However, the act did not define the attributes of transparency or the extent to which an agency's actions should be transparent.[Footnote 29] We also did not find any non-Recovery Act requirement or guidance instructing federal programs to publicly disclose the reasons for their selection decisions. To assess the extent to which FRA publicly communicated outcome information, we compared the HSIPR program to 21 other Recovery Act competitive grant programs, including Race to the Top. (See figure 4.)
We selected 20 of these programs randomly from a list of almost 200 competitively awarded grant programs that distributed Recovery Act funds.[Footnote 30] We included the 21st program, Race to the Top, because it was of interest to you. Figure 4: HSIPR Reported Outcomes Compared to Other Competitively Awarded Recovery Act Programs: [Refer to PDF for image: illustrated table] Outcomes communicated: Technical review (scores and comments); Degree communicated: HSIPR: Information was not communicated; Degree communicated: Other Recovery Act programs: Information was not communicated; Degree communicated: Race to the Top: Information was communicated. Outcomes communicated: Selection decisions (awards and award amounts); Degree communicated: HSIPR: Information was communicated; Degree communicated: Other Recovery Act programs: Information was communicated for some programs, but not all; Degree communicated: Race to the Top: Information was communicated. Source: GAO analysis of publicly available data on discretionary Recovery Act program grants. [End of figure] FRA publicly communicated at least as much outcome information as all but one of the Recovery Act competitive grant programs we reviewed. Specifically, FRA publicly communicated through its Web site the selection decisions, including the amount of funds requested, general benefits from the project, and the potential award amounts for the 62 Recovery Act applications that it selected. It did not communicate the results of the technical review. Of the 21 other competitively awarded Recovery Act programs we examined, 13 communicated selection information similar to FRA's, including awards and award amounts, but not the results of the technical review. For example, the Department of Health and Human Services' National Institutes of Health published a list of 21,581 award winners for nearly $9 billion, but, similar to the HSIPR program, did not report the results of the technical review.
Eight other programs conveyed less information than FRA and did not publicly communicate the results of the technical review or the awards and award amounts. Race to the Top was the only program we examined that publicly provided the results of its technical review. These results, which were posted on the Department of Education's Web site, included scores and comments from reviewers for each applicant, but were not connected to individual reviewers by name. According to its Web site, the Department of Education decided to release this level of detailed information because the $4 billion Race to the Top program was larger than any other discretionary program the Department of Education had previously administered, and officials sought to ensure the highest level of integrity and transparency. Unlike the HSIPR program, however, Race to the Top used these scores as the sole basis for selecting awards and only chose applicants receiving the highest scores. As described in the previous section, the technical review scores were an important component for making HSIPR selection decisions, but did not include consideration of additional pre-established selection criteria designed to ensure long-term success and sustainability of the program. As such, publishing them without additional decision-making information on the specific reasons for selecting and not selecting individual applications could lead to erroneous conclusions about FRA's decisions. According to FRA officials, the results of the technical review were not communicated because department officials were concerned that associating technical review scores and comments with a specific reviewer could discourage reviewers from participating in future department competitive grant evaluations. Furthermore, in their view, this might also prevent reviewers in future funding rounds from providing candid evaluations.
However, as the Race to the Top program demonstrated, it would be possible for FRA to present overall technical panel review assessments or their individual comments without linking individuals' names to comments, if it chooses to do so. FRA officials stated that the anonymous disclosure of technical scores and comments would still prevent FRA and department leadership and staff from frankly expressing their individual judgments, as they might still be concerned about how these opinions would reflect on FRA and the HSIPR program if they were made public. Conclusions: The $8 billion appropriated by the Recovery Act for the High Speed Intercity Passenger Rail program represents a large investment in the development of a national passenger rail network. FRA established a fair and objective approach for distributing these funds and substantially followed recommended discretionary grant award practices used throughout the government. The exception is what we view as incomplete documentation of why some applications were chosen and not others, and how FRA decided to distribute the funds at the time those decisions were made. This incomplete documentation is notable given the robust documentation of the other steps used to determine eligibility and assess technical merit. We believe that establishing a record that provides insight into why decisions were made, including the amounts to be provided, rather than merely restating general technical review and selection criteria, would enhance the credibility of FRA's awards decisions to the extent that this record confirms that selected projects aligned with established criteria and goals. By not establishing this record, FRA invites skepticism about the overall fairness of its decisions, even if they are sound, and hinders meaningful disclosure of how it made its decisions, if it chooses to do so.
Recommendation for Executive Action: To help ensure accountability over federal funds, we recommend that the Secretary of Transportation direct the Administrator of the Federal Railroad Administration to create additional records that document the rationales for award decisions in future HSIPR funding rounds, including substantive reasons (1) why individual projects are selected or not selected and (2) for changes made to requested funding amounts. Agency Comments: We provided a draft of this report to the Department of Transportation for its review and comment. The department told us that it carefully constructed the grant processes for the HSIPR program based on extensive review and consideration of best practices both within and outside the agency, with the intent of providing a comprehensive and transparent process. The department indicated that its overall intent was to select the best projects that offered the greatest available and achievable benefit to the nation. The department told us that it would carefully consider our recommendation to determine if there are means to further enhance the transparency of its grant selection process with additional documentation, without creating a process that is unduly burdensome to administer. The department also offered technical comments, which we incorporated as appropriate. As agreed with your office, unless you publicly announce the contents of this report earlier, we plan no further distribution until 30 days from the report date. We are sending copies of this report to congressional subcommittees with responsibilities for surface transportation issues; the Director, Office of Management and Budget; the Secretary of Transportation; and the Administrator of the Federal Railroad Administration. In addition, this report will be available at no charge on GAO's Web site at [hyperlink, http://www.gao.gov].
If you or your staff have any questions regarding this report, please contact me at (202) 512-2834 or flemings@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made key contributions to this report are Owen Bruce, Matthew Cook, Colin Fallon, Michele Fejfar, Maria Gaona, Grant Mallie, James Ratzenberger, Douglas Sloane, Matthew Voit, and Crystal Wesco. Sincerely yours, Signed by: Susan A. Fleming: Director, Physical Infrastructure Issues: [End of section] Appendix I: Extent to Which Recovery Act Projects Align with Statutory and Other Goals: We examined the extent to which American Recovery and Reinvestment Act of 2009 (Recovery Act) projects selected by the Federal Railroad Administration (FRA) align with legislative goals and the administration's goals to develop high speed and conventional rail networks. Congress Provided Broad Goals with a Priority for High Speed Systems: Congress provided its most recent expectations for high and conventional speed rail in the Passenger Rail Investment and Improvement Act of 2008 (PRIIA) and the Recovery Act. In this regard, PRIIA speaks generally about supporting improvements to high and conventional speed rail and does not set out any expectations for relative attention to high and conventional speed passenger rail improvements. The Recovery Act appropriated $8 billion for both forms of rail service broadly. However, it required that FRA give priority to projects that support the development of intercity high speed service. Further, the act required that FRA develop a strategic plan that describes how FRA will use Recovery Act funding to improve and deploy high speed systems. FRA had wide latitude to achieve goals laid out in its strategic plan.
FRA's Vision Describes Broad Goals for High Speed Rail, but Provides Limited Detail on How Goals Will Be Achieved: FRA has outlined its vision for developing intercity passenger rail service in its strategic plan, as required by the Recovery Act, and in both its preliminary national rail plan issued in 2009 and the plan's update nearly a year later. FRA's vision documents—the strategic plan and its updated national rail plan—described broad goals in areas such as transportation, safety, and economic competitiveness, and established categories for the types of high speed rail projects it intends to support. For example, the strategic plan notes that the high speed rail program aims to generate construction and operating jobs, while providing a steady market for various industries producing rail, control systems, locomotives, and passenger cars. In addition, the plan notes that investments in high speed rail can result in competitive trip times and that rail transport can foster higher-density development compared to other modes of transportation. Similarly, the updated national rail plan sets a goal of connecting communities through high speed rail while, among other things, reducing congestion, boosting economic growth, and promoting economic sustainability. However, these vision documents provide limited details on goals for the high speed rail program. For example, while the strategic plan emphasizes investments that will yield tangible benefits to rail performance and improve connections between different modes of transportation, it does not describe how and when FRA intends to realize these benefits.
In addition, as we reported last June, the preliminary rail plan did not offer specific recommendations for future action and was designed to serve as a springboard for further discussion with states and freight railroads.[Footnote 31] While the update to this plan included improving rail performance as a goal and provided some measurements for high speed rail performance, such as competitive trip times, it did not provide any specific targets for these metrics, or any time line showing when FRA hopes to attain these improvements. FRA's Application Selection Is Consistent with the Recovery Act's Priority for High Speed Rail Service: Consistent with the Recovery Act's direction to give priority to high speed rail service, about half (45 percent) of the applications selected were for core express corridors (high speed service of 125-250 miles per hour or more) or regional corridors (higher-speed service of 90-124 miles per hour) using categories of service similar to those FRA established in its vision documents.[Footnote 32] (See table 4.) FRA did not establish specific targets for the number of each type of project it intended to support.[Footnote 33] Table 4: Recovery Act Applications Supporting High Speed Rail Categories by Future Corridor Speed 5 Years After Project Completion: Category: Core express corridors; Top speed: 125-250 miles per hour or more; Number of projects meeting targeted corridor top speed: 5. Category: Regional corridors; Top speed: 90-124 miles per hour; Number of projects meeting targeted corridor top speed: 23[A]. Category: Emerging high speed rail; Top speed: Up to 90 miles per hour; Number of projects meeting targeted corridor top speed: 21. Source: GAO categorization of applications based on FRA information and applicant data. Note: We categorized these applications solely by future project speed 5 years after completion using the speeds reported at the time the application was submitted.
Thirteen applications did not provide information on anticipated top speed. These applications anticipate a variety of improvements, such as station rehabilitations, the reconfiguration of rolling stock, and track and grade crossing upgrades. FRA subsequently classified two of these projects as supporting core express corridors, six as supporting regional corridors, two as supporting emerging routes, and the remaining three projects as contributing to more than one category. We relied on data submitted by applicants to FRA to assign applications to categories. We have reported that applicants for major infrastructure projects, such as high speed rail projects, often overstate benefits, such as speed of service. See [hyperlink, http://www.gao.gov/products/GAO-09-317]. Some projects may attain higher speeds than those reported in the applications following negotiations between FRA and the various states. In addition, because corridors may include multiple projects, the top speed of a corridor may exceed those for some of its component projects. [A] Five of these applications had an estimated future speed of 110 miles per hour, but did not specify whether this speed would be achieved within the 5 years following project completion. [End of table] Selected Applications Reflect Short-Term Economic Recovery and Long-Term Infrastructure Investment Goals: In addition to providing priority for high speed projects, the applications that FRA selected were consistent with near-term economic recovery goals and its long-term development goals. While most selected projects are short-term in nature and are intended to support economic recovery goals established by the Recovery Act, most funding was provided to several long-term, high speed corridor projects. Specifically, we found that 48 of the 62 applications selected were track 1 applications, which are smaller projects designed to be completed within 2 years. (See table 5.)
These projects represent about 11 percent of the funding provided for high speed rail and intercity passenger rail through the Recovery Act. The remaining 89 percent of funding was provided for 14 track 2 applications, which are primarily long-term, corridor projects. The funding allocation aligns with FRA's focus on long-term investments that will support development of a high speed passenger rail network as described in the funding announcement. Table 5: Amounts Awarded, Obligated, and Spent on First Round Track 1 and 2 Recovery Act Projects, as of December 31, 2010: Project track: Track 1; Number of applications: 48; Amount awarded: $887 million; Amount obligated: $71 million; Amount spent: $0. Project track: Track 2; Number of applications: 14; Amount awarded: $7.025 billion; Amount obligated: $4.192 billion; Amount spent: $50 million. Project track: Total; Number of applications: 62; Amount awarded: $7.912 billion; Amount obligated: $4.263 billion; Amount spent: $50 million. Source: GAO analysis of FRA data. Note: This table does not include $15 million obligated or $10 million spent by FRA for contracts associated with the award and oversight of these grants. [End of table] While FRA announced in January 2010 awards of nearly $8 billion in grants for the program, many of these projects have only recently begun. As of December 31, 2010, FRA had obligated $4.2 billion, or about 54 percent of the funding awarded in January, and about $50 million had been spent for projects selected under track 2. In May 2009, FRA issued a plan for spending Recovery Act funds, which it updated in July 2010. FRA missed the May 2009 plan's targets for obligations and spending through 2010 because it had planned to announce awards—and begin obligating funds—in the autumn of 2009. However, FRA did not make those announcements until January 2010 and did not begin to obligate funds until May 2010. FRA then revised its estimates in July 2010.
FRA surpassed the calendar year 2010 goals for obligating and spending funds set out in the July 2010 plan. (See table 6.) During calendar year 2010, FRA obligated about 11 times as much as anticipated in the July 2010 plan, while awardees had spent about 7 times as much as planned over the same time period. The Recovery Act authorized obligation of funds through September 30, 2012, and FRA intends to obligate all funds by this date.[Footnote 34] Table 6: FRA Plan for Obligating and Spending Recovery Act Funds and Amounts Obligated and Spent, as of December 31, 2010: Amount obligated: May 2009 plan: $6.002 billion; July 2010 plan: $400 million; Actual: $4.263 billion. Amount spent: May 2009 plan: $1.760 billion; July 2010 plan: $7 million; Actual: $51 million. Source: GAO analysis of FRA data. Note: This table does not include $15 million obligated or $10 million spent by FRA for contracts associated with the award and oversight of these grants. [End of table] Passenger rail investments are often long-term efforts that must be carried out in partnership between the state and others, notably private railroads. For example, in order to begin design and construction on many of these projects, grant recipients must negotiate and secure agreements with private freight railroads to use their tracks for passenger rail trains. However, officials from these railroads are concerned that sharing tracks would create safety risks and liability concerns, prevent freight expansion, and cause rail congestion. Some of the states have experienced delays in finalizing these agreements with the railroads and, accordingly, have not completed agreements with FRA to obligate awarded funding.
[End of section] Appendix II: Scope and Methodology: Criteria Used to Select Projects: To determine the extent to which FRA applied its established criteria to select projects, we identified the criteria that it planned to use from its June 23, 2009, funding announcement outlining its evaluation and selection approach. We then compared these criteria to the worksheets and guidebooks that FRA used to determine eligibility and assess technical merit. Finally, we interviewed FRA officials who participated in evaluating and selecting projects to obtain information on whether and how they applied the established criteria. Specifically, we randomly selected 1 technical reviewer from each of the 12 track 1, 3, and 4 panels (12 out of 36 reviewers), and 6 of the 8 reviewers from the track 2 panel. In addition, we interviewed senior FRA officials to further understand how senior Department of Transportation (the department) and FRA officials applied the selection criteria, selected projects, and determined the amount of funding provided for each project. We also asked FRA officials to provide reasons for why several lower scored applications were selected, while other higher scored applications were not. We conducted semi-structured interviews with officials from 10 of the 40 states and the District of Columbia that submitted a preapplication or an application for track 1 and track 2 funding about how FRA communicated its approach to reviewing applications and award results.[Footnote 35] We selected these states on the basis of four characteristics: (1) the extent to which applicants progressed through the preapplication, application, evaluation, and selection stages; (2) geographic regions; (3) the number of applications submitted; and (4) the amount of funding. We also contacted officials in two additional states (Ohio and Washington) to understand the effect of FRA's funding decisions on the scope of these states' proposed rail programs.
[Footnote 36] Our efforts were limited to applications requesting funding under track 1 and track 2 of the High Speed Intercity Passenger Rail Program (HSIPR) in August 2009 and October 2009 and awarded Recovery Act funding for projects in January 2010. We did not review FRA's rationale for its decision in December 2010 to redistribute $1.195 billion from two projects in Ohio and Wisconsin to on-going high speed rail projects in 13 states. We also assessed whether FRA's approach to calculating reviewers' individual scores and compiling them for an overall panel score reflected the criteria and weights for each criterion as published in the funding announcement, as well as the overall reliability of the data used to make these calculations. To do this, we reviewed documentation about the system used to collect the information and spoke with officials knowledgeable about the data. We found some inaccuracies in how FRA calculated the technical review scores. Specifically, we found that some standardized scores were incorrect due to the inclusion of three duplicate records and three applications deemed not yet ready or ineligible. In addition, we noted that FRA incorrectly weighted some technical evaluation scores for applications submitted under track 1b. However, we determined that these errors would not materially affect our findings, and for the purposes of examining the effect of the scores on application selection, we found the data to be sufficiently reliable. FRA officials said that they would correct their calculations for future rounds of rail funding. Further, we performed tests to determine the variables (e.g., technical review scores and number of applications submitted) that had a statistically significant relationship with being selected for an award. Our approach is described in appendix IV.
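The report indicates that FRA standardized individual reviewers' scores and weighted criterion scores according to the funding announcement, but does not reproduce FRA's actual formula. As a rough illustration only, assuming a simple z-score standardization and a weighted average across a panel (the function names, weights, and scores below are invented for the example and are not FRA's), such a calculation might look like:

```python
from statistics import mean, pstdev

def standardize(scores):
    """Convert one reviewer's raw scores across applications to z-scores,
    so that harsh and lenient graders are placed on a common scale."""
    mu, sigma = mean(scores), pstdev(scores)
    return [(s - mu) / sigma if sigma else 0.0 for s in scores]

def panel_score(reviewer_scores, weights):
    """Weight each reviewer's criterion scores by the published criterion
    weights, then average the weighted totals across the panel.
    reviewer_scores: one list of criterion scores per reviewer;
    weights: one weight per criterion (summing to 1)."""
    weighted_totals = [sum(w * s for w, s in zip(weights, scores))
                       for scores in reviewer_scores]
    return mean(weighted_totals)

# Hypothetical two-reviewer panel scoring one application on three criteria.
score = panel_score([[5, 4, 3], [4, 4, 5]], [0.5, 0.3, 0.2])
```

Standardizing before combining matters because, without it, an application's fortunes would depend partly on which reviewers it happened to draw, which is the kind of distortion the duplicate records and weighting errors noted above could introduce.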
Following Recommended Practices for Discretionary Grant Awards: To determine the extent to which FRA used recommended practices for awarding discretionary grants, we examined Office of Management and Budget guidance, guidance from several federal agencies, and our reports on this issue. (See table 7.) We identified key grant practices recommended across executive branch agencies and compared them to practices analyzed in our prior work.[Footnote 37] Specifically, we identified six recommended practices relating to (1) communicating with potential applicants prior to the competition, (2) planning for administering the review of applications, (3) developing a technical review panel with certain characteristics, (4) assessing applicants' abilities to manage grant funds, (5) notifying applicants of decisions, and (6) documenting reasons for award decisions. We compared these practices to information from the 2009 funding announcement, guidance to application reviewers, and statements made by FRA officials regarding their implementation of their grant award program. For this effort, one analyst carried out the comparison and a second analyst verified the comparison results. Where differences existed, the two analysts discussed them and reached agreement. We also discussed the extent of FRA's use of several of these practices with officials from our sample of 10 states. Table 7: Guidance and Reports Used To Identify Recommended Government Practices: Federal agency: Source: Department of Commerce; Guidance or report: Grants and Cooperative Agreements Manual (June 2007). Source: Department of Energy; Guidance or report: Merit Review Guide for Financial Assistance (August 2007). Source: Department of Labor; Guidance or report: U.S. Department of Labor, Veterans' Employment and Training Service Guide to Competitive and Discretionary Grants (April 2003). Source: Department of Transportation; Guidance or report: Financial Assistance Guidance Manual (March 2009).
Source: Office of Management and Budget; Guidance or report: Office of Federal Financial Management Policy Directive on Financial Assistance Program Announcements, 68 FR 37370 (June 23, 2003). Source: GAO; Guidance or report: Runaway and Homeless Youth Grants: Improvements Needed in the Grant Award Process, [hyperlink, http://www.gao.gov/products/GAO-10-335] (Washington, D.C.: May 10, 2010); Guidance or report: Discretionary Grants: Further Tightening of Education's Procedures for Making Awards Could Improve Transparency and Accountability, [hyperlink, http://www.gao.gov/products/GAO-06-268] (Washington, D.C.: Feb. 21, 2006); Guidance or report: Grants Management: Despite Efforts to Improve Weed and Seed Program Management, Challenges Remain, [hyperlink, http://www.gao.gov/products/GAO-04-245] (Washington, D.C.: Mar. 24, 2004); Guidance or report: Education Discretionary Grants: Awards Process Could Benefit From Additional Improvements, [hyperlink, http://www.gao.gov/products/GAO/HEHS-00-55] (Washington, D.C.: Mar. 30, 2000); Guidance or report: Standards for Internal Control in the Federal Government, [hyperlink, http://www.gao.gov/products/GAO/AIMD-00-21.3.1] (Washington, D.C.: November 1999). Governmentwide: Source: Grant Accountability Project[A]; Guidance or report: Grant Accountability Project, Guide to Opportunities for Improving Grant Accountability (October 2005). Source: GAO. [End of table] [A] This project was initiated by the Domestic Working Group, which consists of 19 federal, state, and local audit organizations and is chaired by the Comptroller General of the United States. The purpose of the group is to identify current and emerging challenges of mutual interest and explore opportunities for greater collaboration within the intergovernmental audit community.
Communication of Selection Results: To determine the extent to which FRA publicly communicated information about the results of its award competition, we compared the information it communicated to the public about its awards to the types of information communicated by a random sample of 20 other competitively awarded Recovery Act programs. (See table 8.) We selected the sample from 193 Recovery Act programs identified in the Catalog of Federal Domestic Assistance as competitive grant programs using Recovery Act funds.[Footnote 38] In addition, we compared the information communicated about FRA's awards to the information communicated by the Innovation Grants program (Race to the Top)—a discretionary grant program run by the Department of Education. We included the Race to the Top program because you expressed interest in it. Table 8: Recovery Act Discretionary Grant Programs Reviewed: Program: Broadband Technology Opportunities Program; Responsible federal agency: Department of Commerce. Program: Central Valley Project Improvement Act, Title XXXIV; Responsible federal agency: Department of the Interior. Program: Emergency Medical Services for Children (Recovery Act); Responsible federal agency: Department of Health and Human Services. Program: Emergency Watershed Protection Program; Responsible federal agency: Department of Agriculture. Program: Fish and Wildlife Coordination Act; Responsible federal agency: Department of the Interior. Program: Geologic Sequestration Training and Research Grant Program; Responsible federal agency: Department of Energy. Program: Grants to Health Center Programs (Recovery Act); Responsible federal agency: Department of Health and Human Services. Program: Head Start (Recovery Act); Responsible federal agency: Department of Health and Human Services. Program: National Geospatial Program: Building The National Map; Responsible federal agency: Department of the Interior.
Program: National Railroad Passenger Corporation Grants; Responsible federal agency: Department of Transportation. Program: Office of Science Financial Assistance Program; Responsible federal agency: Department of Energy. Program: Pregnancy Assistance Fund Program; Responsible federal agency: Department of Health and Human Services. Program: Preventing Healthcare-Associated Infections (Recovery Act); Responsible federal agency: Department of Health and Human Services. Program: Prevention and Wellness—Leveraging National Organizations (Recovery Act); Responsible federal agency: Department of Health and Human Services. Program: Recovery Act Grants for Training in Primary Care Medicine and Dentistry Training and Enhancement; Responsible federal agency: Department of Health and Human Services. Program: Recovery Act Transitional Housing; Responsible federal agency: Department of Justice. Program: Science Grants for Basic Research, Educational Outreach, or Training Opportunities (Recovery Act); Responsible federal agency: National Aeronautics and Space Administration. Program: Senior Community Service Employment Program; Responsible federal agency: Department of Labor. Program: State Fiscal Stabilization Fund Race-to-the-Top Incentive Grants (Recovery Act); Responsible federal agency: Department of Education. Program: State Grants to Promote Health Information Technology (Recovery Act); Responsible federal agency: Department of Health and Human Services. Program: Trans-National Institutes of Health Research Support (Recovery Act); Responsible federal agency: Department of Health and Human Services. Source: GAO. [End of table] We first reviewed materials on FRA's Web site and other public releases, such as press releases and outreach presentations, to determine what FRA publicly communicated. We then discussed these results with FRA officials to confirm our results.
For each of the 21 other Recovery Act programs, we reviewed three public information sources: (1) the program's Catalog of Federal Domestic Assistance award announcement, (2) Internet search results, and (3) Grants.gov, which provides information on more than 1,000 grant programs.[Footnote 38] For each program, we searched these sources for information about final award results (project description, why the project was selected, and award amount) and for information that demonstrated how applications fared at different stages of the process (eligibility determination and internal reviews, such as technical review panels). We defined the results of any technical review as either scores or comments, and when at least one of these elements was listed in at least one of the three sources of information, we concluded that technical review information was publicly communicated about the program. In carrying out this assessment, one analyst carried out the work and a second analyst independently performed the same tasks. The two analysts then compared their results and resolved any differences. The results of our comparison to a sample of other Recovery Act programs are not generalizable across all Recovery Act programs. Alignment with Statutory and Other Goals: To determine the extent to which HSIPR applications align with statutory and other goals, we reviewed federal laws, including the Recovery Act and PRIIA. We analyzed FRA's Federal Register notice describing its approach for selecting applications, its strategic vision for high speed rail, and its preliminary national rail plan and its subsequent update to gather information on any goals the agency has established for high speed rail networks and conventional service and the types of projects it seeks to support. We did not assess whether the applications selected by FRA will achieve the stated benefits or costs.
We reviewed information submitted by applicants, namely the type of project proposed, the funding requested and awarded, and the estimated future speed of the projects. We used these data to sort projects into three categories developed by FRA: core express corridors, regional corridors, and emerging high speed rail routes. FRA's definitions of top speeds for these categories overlap; we modified them slightly to provide discrete endpoints. Of the 62 applications selected by FRA, 13 did not provide data on anticipated top speed after project completion. These 13 applications include a variety of improvements, including station rehabilitations, the reconfiguration of rolling stock, and upgrades to existing tracks and grade crossings, for which one would not expect top speed information. We used these data as background on selected applications and did not assess them for reliability. We also reviewed FRA's Recovery Act plans and compared FRA's goals for obligating and spending awarded funds to its actual rates of obligating and spending from January 2010 through December 2010. After reviewing a Department of Transportation Inspector General audit report on its financial management system and speaking with department officials familiar with the system, we determined that these data were sufficiently reliable. [End of section] Appendix III: Difference Between the Amounts Requested and Estimated Awards by State: In January 2010, FRA proposed to provide 18 of the 24 states, including the District of Columbia, selected for awards all or nearly all (91 percent or more) of the money that they requested. (See table 9.) The agency proposed to provide one state, North Carolina, with slightly more than it requested and the remaining five states with amounts varying from 47 percent to 86 percent of the amounts requested.
Table 9: Difference between the Amounts Requested and Estimated Awards by State, as of January 2010: State: North Carolina; Amount requested: $523.8 million; FRA proposed amount: $545.0 million; Difference (percent): $21.2 million (104%). State: District of Columbia; Amount requested: $2.9 million; FRA proposed amount: $2.9 million; Difference (percent): $0.0 (100%). State: Maryland; Amount requested: $69.4 million; FRA proposed amount: $69.4 million; Difference (percent): $0.0 (100%). State: Michigan; Amount requested: $40.3 million; FRA proposed amount: $40.3 million; Difference (percent): $0.0 (100%). State: New Jersey; Amount requested: $38.5 million; FRA proposed amount: $38.5 million; Difference (percent): $0.0 (100%). State: Pennsylvania; Amount requested: $25.7 million; FRA proposed amount: $25.7 million; Difference (percent): $0.0 (100%). State: Rhode Island; Amount requested: $1.2 million; FRA proposed amount: $1.2 million; Difference (percent): $0.0 (100%). State: Texas; Amount requested: $3.8 million; FRA proposed amount: $3.8 million; Difference (percent): $0.0 (100%). State: Virginia; Amount requested: $74.8 million; FRA proposed amount: $74.8 million; Difference (percent): $0.0 (100%). State: Wisconsin; Amount requested: $831.7 million; FRA proposed amount: $822.0 million; Difference (percent): -$9.7 million (99%). State: Indiana; Amount requested: $71.4 million; FRA proposed amount: $71 million; Difference (percent): -$0.4 million (99%). State: Iowa; Amount requested: $17.3 million; FRA proposed amount: $17 million; Difference (percent): -$0.3 million (98%). State: Illinois; Amount requested: $1,275.3 million; FRA proposed amount: $1,233.0 million; Difference (percent): -$42.3 million (97%). State: Connecticut; Amount requested: $41.1 million; FRA proposed amount: $40.0 million; Difference (percent): -$1.1 million (97%).
State: Massachusetts; Amount requested: $72.9 million; FRA proposed amount: $70.0 million; Difference (percent): -$2.9 million (96%). State: New York; Amount requested: $157.4 million; FRA proposed amount: $150.0 million; Difference (percent): -$7.4 million (95%). State: Vermont; Amount requested: $52.7 million; FRA proposed amount: $50.0 million; Difference (percent): -$2.7 million (95%). State: Missouri; Amount requested: $33.3 million; FRA proposed amount: $31.0 million; Difference (percent): -$2.3 million (93%). State: Maine; Amount requested: $38.4 million; FRA proposed amount: $35.0 million; Difference (percent): -$3.4 million (91%). State: Oregon; Amount requested: $9.4 million; FRA proposed amount: $8.0 million; Difference (percent): -$1.4 million (86%). State: Ohio; Amount requested: $563.8 million; FRA proposed amount: $400.0 million; Difference (percent): -$163.8 million (71%). State: Washington; Amount requested: $976.4 million; FRA proposed amount: $590.0 million; Difference (percent): -$386.4 million (60%). State: California[A]; Amount requested: $4,766.0 million; FRA proposed amount: $2,343.0 million; Difference (percent): -$2,423.0 million (49%). State: Florida; Amount requested: $2,654.0 million; FRA proposed amount: $1,250.0 million; Difference (percent): -$1,404.0 million (47%). Source: GAO analysis of FRA data. [A] Two different entities submitted applications for projects in California. The California Department of Transportation submitted track 1a and 1b applications, and the California High Speed Rail Authority, a public agency established by California to develop high speed rail, submitted track 2 applications.
[End of table] In December 2010, nearly a year after making these proposals, FRA announced that $1.195 billion in Recovery Act funds for high speed rail (representing most of the $810 million for Wisconsin's Milwaukee-Madison corridor and $385 million for Ohio's Cincinnati-Columbus-Cleveland "3C" route, originally designated for these states in January 2010) would be redirected to high speed rail projects already underway in 13 other states.[Footnote 39] In making these changes, FRA noted that Wisconsin had suspended work under its existing high speed rail agreement and that the incoming governors of Wisconsin and Ohio had both indicated that they would not move forward with using high speed rail money received under the Recovery Act. The adjusted amounts resulted in FRA proposing to provide all or nearly all of the original request amounts (91 percent or more) for one additional state (Oregon). (See table 10.) While most of the funding was redistributed to three states (California, Florida, and Washington), the total funding awarded to these three states was less than 80 percent of their original requests. Table 10: Difference between the Amounts Requested and Estimated Awards by State, as of December 2010: State: North Carolina; Applicant requested amount: $523.8 million; FRA adjusted amount: $546.5 million; Difference (percent): $22.7 million (104%). State: District of Columbia; Applicant requested amount: $2.9 million; FRA adjusted amount: $2.9 million; Difference (percent): $0.0 (100%). State: Illinois; Applicant requested amount: $1,275.3 million; FRA adjusted amount: $1,275.3 million; Difference (percent): $0.0 (100%). State: Indiana; Applicant requested amount: $71.4 million; FRA adjusted amount: $71.4 million; Difference (percent): $0.0 (100%). State: Iowa; Applicant requested amount: $17.3 million; FRA adjusted amount: $17.3 million; Difference (percent): $0.0 (100%).
State: Maryland; Applicant requested amount: $69.4 million; FRA adjusted amount: $69.4 million; Difference (percent): $0.0 (100%). State: Michigan; Applicant requested amount: $40.3 million; FRA adjusted amount: $40.3 million; Difference (percent): $0.0 (100%). State: New Jersey; Applicant requested amount: $38.5 million; FRA adjusted amount: $38.5 million; Difference (percent): $0.0 (100%). State: Oregon; Applicant requested amount: $9.4 million; FRA adjusted amount: $9.4 million; Difference (percent): $0.0 (100%). State: Pennsylvania; Applicant requested amount: $25.7 million; FRA adjusted amount: $25.7 million; Difference (percent): $0.0 (100%). State: Rhode Island; Applicant requested amount: $1.2 million; FRA adjusted amount: $1.2 million; Difference (percent): $0.0 (100%). State: Texas; Applicant requested amount: $3.8 million; FRA adjusted amount: $3.8 million; Difference (percent): $0.0 (100%). State: Vermont; Applicant requested amount: $52.7 million; FRA adjusted amount: $52.7 million; Difference (percent): $0.0 (100%). State: Virginia; Applicant requested amount: $74.8 million; FRA adjusted amount: $74.8 million; Difference (percent): $0.0 (100%). State: Maine; Applicant requested amount: $38.4 million; FRA adjusted amount: $38.3 million; Difference (percent): -$0.1 million (100%). State: Massachusetts; Applicant requested amount: $72.9 million; FRA adjusted amount: $72.8 million; Difference (percent): -$0.1 million (100%). State: Missouri; Applicant requested amount: $33.3 million; FRA adjusted amount: $33.2 million; Difference (percent): -$0.1 million (100%). State: New York; Applicant requested amount: $157.4 million; FRA adjusted amount: $157.3 million; Difference (percent): -$0.1 million (100%). State: Connecticut; Applicant requested amount: $41.1 million; FRA adjusted amount: $40.0 million; Difference (percent): -$1.1 million (97%).
State: Washington; Applicant requested amount: $976.4 million; FRA adjusted amount: $751.5 million; Difference (percent): -$224.9 million (77%). State: California[A]; Applicant requested amount: $4,766.0 million; FRA adjusted amount: $2,967.0 million; Difference (percent): -$1,799.0 million (62%). State: Florida; Applicant requested amount: $2,654.0 million; FRA adjusted amount: $1,592.3 million; Difference (percent): -$1,061.7 million (60%). State: Wisconsin; Applicant requested amount: $831.7 million; FRA adjusted amount: $44.0 million; Difference (percent): -$787.7 million (5%). State: Ohio; Applicant requested amount: $563.8 million; FRA adjusted amount: $15.0 million; Difference (percent): -$548.8 million (3%). Source: GAO analysis of FRA data. Note: Due to rounding, the percentage difference between applicant requested and FRA adjusted amounts may equal 100 percent even when there is a small dollar difference. [A] Two different entities submitted applications for projects in California. The California Department of Transportation submitted track 1a and 1b applications, and the California High Speed Rail Authority, a public agency established by California to develop high speed rail, submitted track 2 applications. [End of table] [End of section] Appendix IV: Additional Results from Our Statistical Analysis of Award Decisions: This appendix contains information related to our statistical analyses of FRA and application data to examine possible relationships between several variables and FRA's selection decisions. Overview of the Data: We obtained the data for our analysis from the Application Review Module of GrantSolutions, the database FRA used to store application information and technical review scores. Our analysis examined the 206 of the 259 submitted applications that FRA deemed eligible and ready to receive federal funds.
Eligible applications included those requesting Recovery Act funds, tracks 1 and 2, as well as those requesting annual appropriations, tracks 3 and 4. We included track 3 and 4 applications in our analysis because FRA reviewed, weighted, and calculated the results for tracks 1, 3, and 4 applications as a group rather than by distinct tracks.[Footnote 40] To assess the reliability of the data in the Application Review Module, we reviewed database user manuals, spoke with officials knowledgeable about the data, and conducted a series of data tests. We found some inaccuracies in how FRA calculated the technical review scores. Specifically, we found that some final technical review scores were incorrect due to the inclusion of three duplicate technical review scores and three applications later determined to be not yet ready or ineligible.[Footnote 41] In addition, FRA had mistakenly applied incorrect weights to the track 1b application technical review scores, which resulted in 15 final scores that were one point higher than they should have been and another 5 final scores that were one point lower than they should have been. We determined that these errors would not materially affect our findings and, for the purposes of examining the effect of scores on application selection, found these data sufficiently reliable. Methodology: To determine the extent to which specific variables were related to the department's selection of applications, we considered a set of bivariate tables and conducted a series of bivariate and multivariate regression analyses.[Footnote 42] From the tables and regression analyses, we sought to determine how the department's decision to select an application for an award was affected by four variables: (1) the technical review scores, (2) application track, (3) the requested funding amount, and (4) the number of applications submitted by state or groups of states.
Our analyses provide us with estimates, called odds ratios, which indicate the differences in the odds of applications being selected for an award across certain categories of the different variables we examined.[Footnote 43] An odds ratio of 1.0 would indicate that applications in different categories were equally likely to be selected for an award. An odds ratio of less than 1.0 implies that applications in the category to which the odds ratio applies were less likely to be selected relative to those in the category they are being compared to (known as the "reference" category). For example, if applications receiving a technical review score of 3 had an odds ratio of 0.5, it would indicate that they were half as likely to be selected for an award as applications that received a score of 1 or 2 (the reference category). Conversely, an odds ratio greater than 1.0 suggests that applications with that characteristic were more likely to be selected. For example, if applications receiving a technical review score of 5 had an odds ratio of 3.0, we would conclude that applications receiving that score were three times more likely to be selected relative to the reference category. The primary reason for preferring odds ratios to describe the relationships across variables is that the significance of the differences between specific odds ratios can be easily tested and the ratios can be re-estimated after accounting for other variables. Technical Review Score Affected the Selection of Applications: We first examined the effect of technical review scores on the likelihood of being selected for an award and found that the odds of being selected for an award were, in general, greater for applications receiving higher technical review scores than for applications receiving lower ones.
For the purposes of our analyses, we combined scores of 1 and 2, as only five applications received a score of 1 and there was no evidence that they differed significantly from applications assigned scores of 2 in terms of being selected for funding. A smaller percentage of the applications that received scores of 1 or 2 were selected for funding (16 percent selected) than applications that received a score of 3 (46 percent selected) or 4 (42 percent selected), and applications that received a score of 5 were selected at the highest rate (69 percent selected). In addition, the odds ratios of 4.30, 3.67, and 11.57 indicate that applications receiving a higher technical review score (3, 4, or 5, respectively) were at least three times more likely to be selected for an award than those receiving a lower technical review score (i.e., 1 or 2). (See table 11.) Table 11: Applications Selected by Technical Review Score, and Odds and Odds Ratios Derived from Them: Technical review score: 1 or 2; Selected: No: 36; Selected: Yes: 7; Percent selected: 16%; Odds on selected: 0.19; Odds ratios: reference. Technical review score: 3; Selected: No: 49; Selected: Yes: 41; Percent selected: 46%; Odds on selected: 0.84; Odds ratios: 4.30. Technical review score: 4; Selected: No: 35; Selected: Yes: 25; Percent selected: 42%; Odds on selected: 0.71; Odds ratios: 3.67. Technical review score: 5; Selected: No: 4; Selected: Yes: 9; Percent selected: 69%; Odds on selected: 2.25; Odds ratios: 11.57. Source: GAO analysis of FRA data. Note: The differences across score categories are significant given the low probability associated with the likelihood ratio chi-square statistic calculated to test the independence of scores and selection (L2 = 17.14 with 3 df, P < 0.01). This table includes tracks 1a, 1b, 2, 3, and 4 applications and, therefore, does not match the data provided in the body of this report.
[End of table] We followed several steps to calculate the odds ratios of 4.30, 3.67, and 11.57. First, we derived the odds that applications with certain technical review scores would be selected for an award. For example, to determine the selection odds for applications receiving a score of 1 or 2, we divided the number of applications receiving a score of 1 or 2 that were selected by the number of applications receiving those scores that were not selected. Seven were selected for awards, whereas 36 were not; the resulting odds (7/36) equal 0.19. This means that 19 applications receiving a score of 1 or 2 would be selected for an award for every 100 that were not. By comparison, the odds on being selected for applications receiving a technical review score of 3 were 41/49, or 0.84, which indicates that 84 applications receiving a score of 3 would be selected for an award for every 100 that were not. The odds ratio comparing these two odds is 0.84/0.19, which equals 4.30 when calculated from the unrounded odds. This odds ratio suggests that the odds of being selected for an award are more than four times greater for applications receiving a technical review score of 3 than for applications receiving a score of 1 or 2. Application Track Affected the Likelihood of Applications Being Selected for Award: We also found sizable differences in the likelihood of applications submitted under different tracks being selected for an award. While only between one-quarter and one-third of the applications in tracks 1a and 1b were selected, 61 percent of track 2 applications were selected, as were nearly three-fourths of the applications in tracks 3 and 4. These differences are also apparent from the odds and odds ratios in table 12.
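The step-by-step odds arithmetic described above for table 11, together with the likelihood-ratio chi-square statistic reported in the table note, can be reproduced with a short script. This is an illustrative sketch using the counts from table 11, not the software used in the underlying analysis:

```python
import math

# Selection counts by technical review score, from table 11: (not selected, selected)
counts = {"1 or 2": (36, 7), "3": (49, 41), "4": (35, 25), "5": (4, 9)}

# Odds of selection within each score category: selected / not selected
odds = {score: yes / no for score, (no, yes) in counts.items()}

# Odds ratios relative to the reference category (scores of 1 or 2)
ratios = {score: odds[score] / odds["1 or 2"] for score in ("3", "4", "5")}
print({s: round(r, 2) for s, r in ratios.items()})  # {'3': 4.3, '4': 3.67, '5': 11.57}

# Likelihood-ratio chi-square (G-squared) testing independence of score and selection:
# G2 = 2 * sum(observed * ln(observed / expected)) over all eight cells
total_no = sum(no for no, _ in counts.values())     # 124
total_yes = sum(yes for _, yes in counts.values())  # 82
n = total_no + total_yes                            # 206
g2 = 0.0
for no, yes in counts.values():
    row = no + yes
    g2 += 2 * no * math.log(no / (row * total_no / n))
    g2 += 2 * yes * math.log(yes / (row * total_yes / n))
print(round(g2, 2))  # 17.14, matching the table note (L2 = 17.14 with 3 df)
```

Because the reference odds (7/36) appear in the denominator of every ratio, the printed values reproduce the unrounded odds ratios of 4.30, 3.67, and 11.57 reported in table 11.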
The odds on being selected for funding were slightly lower (by a factor of 0.77) for track 1b applications than for track 1a, but they were more than three times greater for track 2 applications than for track 1a applications, and more than five times greater for track 3 and track 4 applications than for track 1a applications. (See table 12.) Table 12: Applications Selected by Track, and Odds and Odds Ratios Derived from Them: Track: 1a; Selected: No: 68; Selected: Yes: 33; Percent selected: 33%; Odds on selected: 0.49; Odds ratios: reference. Track: 1b; Selected: No: 40; Selected: Yes: 15; Percent selected: 27%; Odds on selected: 0.38; Odds ratios: 0.77. Track: 2; Selected: No: 9; Selected: Yes: 14; Percent selected: 61%; Odds on selected: 1.56; Odds ratios: 3.21. Track: 3 or 4; Selected: No: 7; Selected: Yes: 20; Percent selected: 74%; Odds on selected: 2.86; Odds ratios: 5.89. Source: GAO analysis of FRA data. Note: The differences across tracks are statistically significant given the likelihood ratio chi-square statistic calculated to test the independence of tracks and selection (L2 = 23.17 with 3 df, P < 0.01). [End of table] Amount Requested Had an Effect on Likelihood of Selection: There were also sizable differences in the likelihood of applications being selected based on the amount requested. Slightly more than half of the applications that requested less than $1 million were funded, as were exactly half of the applications that requested $50 million or more. At the same time, roughly 40 percent of applications requesting between $1 million and $10 million were funded, and less than one-fourth of the applications requesting $10 million to $50 million were funded. The odds and odds ratios indicate that applications requesting the lowest amounts were the most likely to be selected and that applications requesting the highest amounts were almost as likely as those requesting the lowest amounts to be selected.
Applications requesting more than $1 million but less than $50 million were somewhat less likely to be selected than applications requesting less than $1 million or more than $50 million. (See table 13.) Table 13: Applications Selected by Requested Amount, and Odds and Odds Ratios Derived from Them: Amount requested: $100,000-999,999; Selected: No: 18; Selected: Yes: 21; Percent selected: 54%; Odds on selected: 1.17; Odds ratios: reference. Amount requested: $1,000,000-9,999,999; Selected: No: 44; Selected: Yes: 29; Percent selected: 40%; Odds on selected: 0.66; Odds ratios: 0.56. Amount requested: $10,000,000-49,999,999; Selected: No: 43; Selected: Yes: 13; Percent selected: 23%; Odds on selected: 0.30; Odds ratios: 0.26. Amount requested: $50,000,000 or more; Selected: No: 19; Selected: Yes: 19; Percent selected: 50%; Odds on selected: 1.00; Odds ratios: 0.86. Source: GAO analysis of FRA data. Note: The differences across requested amount categories are statistically significant given the likelihood ratio chi-square statistic calculated to test the independence of amount requested and selection (L2 = 11.66 with 3 df, P < 0.01). [End of table] Number of Applications per State Had Sizable Effect on the Likelihood of Application Selection: We examined differences in the likelihood of applications being selected by state and, ultimately, by the number of applications submitted in many of the states. We first considered the number of applications selected for each of the 34 states that submitted eligible applications. Of these 34, 4 states submitted more than 10 applications, 22 states submitted 3 or fewer applications (including 9 that submitted a single application), and the remaining 8 states submitted between 4 and 9 applications.
We found a statistically significant relationship indicating there was a much greater tendency for applications to be selected when they came from states in which a maximum of three applications were submitted.[Footnote 44] Given the low number of applications submitted by many of the states, however, we could not control for all of the differences between states in a multivariate analysis in which the effects of the other variables are estimated simultaneously. Therefore, we combined the states with smaller numbers of applications into two groups: one group contained states that submitted one to three applications and the other group contained states submitting four to nine applications. These groupings did not result in the loss of any significant information with respect to differences in the likelihood of applications being selected across states.[Footnote 45] The results of these state groupings indicate that the percentage of applications selected for funding from states with one to three applications (70 percent selected) was considerably higher than the percentage of funded applications from states with four to nine applications (40 percent selected). In addition, the states that submitted more than nine applications showed considerable differences in the percent of applications selected. California (37 percent selected) and Missouri (75 percent selected) had relatively high percentages of applications selected, while New York (18 percent selected) and Washington state (5 percent selected) had relatively low percentages of applications selected. As in the previous tables, the odds and odds ratios give us the same sense of the association that the percentages reveal; the odds on being selected for funding were more than three times higher for applications from states submitting one to three applications than for applications from California or from states submitting four to nine applications (2.33/0.68, which equals 3.43).
In addition, New York and Washington state were much less likely to have an application selected than California and Missouri. (See table 14.) Table 14: Applications Selected by State and State Group, and Odds and Odds Ratios Derived from Them: State and state group: States with 1 to 3 applications; Selected for funding: No: 12; Selected for funding: Yes: 28; Percent selected: 70%; Odds on selected: 2.33; Odds ratios: 3.94. State and state group: States with 4 to 9 applications; Selected for funding: No: 31; Selected for funding: Yes: 21; Percent selected: 40%; Odds on selected: 0.68; Odds ratios: 1.14. State and state group: Missouri; Selected for funding: No: 3; Selected for funding: Yes: 9; Percent selected: 75%; Odds on selected: 3.00; Odds ratios: 5.06. State and state group: Washington; Selected for funding: No: 20; Selected for funding: Yes: 1; Percent selected: 5%; Odds on selected: 0.05; Odds ratios: 0.08. State and state group: New York; Selected for funding: No: 31; Selected for funding: Yes: 7; Percent selected: 18%; Odds on selected: 0.23; Odds ratios: 0.38. State and state group: California; Selected for funding: No: 27; Selected for funding: Yes: 16; Percent selected: 37%; Odds on selected: 0.59; Odds ratios: reference. Source: GAO analysis of FRA data. Note: The differences across states and groups of states are statistically significant given the likelihood ratio chi-square statistic calculated to test the independence of states and state groups and selection (L2 = 43.32 with 5 df, P < 0.01). [End of table] Regression Analyses Show that Technical Review Score and Number of Applications Submitted Significantly Affected the Likelihood of Selection: We also examined the data using bivariate and multivariate logistic regression models to estimate the effects of these different variables on the likelihood of applications being selected for funding.
The bivariate models examine the effects of the technical review score, track, amount requested, and state or group of states from which the application arose, one at a time. The odds ratios from these models are the same as those produced from the observed frequencies in the different two-way tables described above. From these models, however, we obtain specific tests of the significance of the differences between each variable category to determine more generally whether there are differences between any of the categories. In the bivariate regression models, we find that many, but not all, of the odds ratios describing these differences are significant. (See table 15.) Table 15: Odds Ratios from Bivariate and Multivariate Logistic Regression Models by Technical Review Score, Track, Amount Requested, and State and State Group: Variable category: Technical review score: 1 or 2; Bivariate logistic regressions: reference; Multivariate logistic regressions: reference. Variable category: Technical review score: 3; Bivariate logistic regressions: 4.30[A]; Multivariate logistic regressions: 6.88[A]. Variable category: Technical review score: 4; Bivariate logistic regressions: 3.67[A]; Multivariate logistic regressions: 7.53[A]. Variable category: Technical review score: 5; Bivariate logistic regressions: 11.57[A]; Multivariate logistic regressions: 9.36[A]. Variable category: Track: 1a; Bivariate logistic regressions: reference; Multivariate logistic regressions: reference. Variable category: Track: 1b; Bivariate logistic regressions: 0.77; Multivariate logistic regressions: 0.60. Variable category: Track: 2; Bivariate logistic regressions: 3.21[A]; Multivariate logistic regressions: 2.34. Variable category: Track: 3 or 4; Bivariate logistic regressions: 5.89[A]; Multivariate logistic regressions: 1.93. Variable category: Amount requested: $100,000-999,999; Bivariate logistic regressions: reference; Multivariate logistic regressions: reference.
Variable category: Amount requested: $1,000,000-9,999,999; Bivariate logistic regressions: 0.56; Multivariate logistic regressions: 1.35. Variable category: Amount requested: $10,000,000-49,999,999; Bivariate logistic regressions: 0.26[A]; Multivariate logistic regressions: 0.37. Variable category: Amount requested: $50,000,000 or more; Bivariate logistic regressions: 0.86; Multivariate logistic regressions: 0.79. Variable category: Applications submitted by state and state group: States with 1 to 3 applications; Bivariate logistic regressions: 3.94[A]; Multivariate logistic regressions: 2.94. Variable category: Applications submitted by state and state group: States with 4 to 9 applications; Bivariate logistic regressions: 1.14; Multivariate logistic regressions: 0.88. Variable category: Applications submitted by state and state group: Missouri; Bivariate logistic regressions: 5.06[A]; Multivariate logistic regressions: 9.12[A]. Variable category: Applications submitted by state and state group: Washington; Bivariate logistic regressions: 0.08[A]; Multivariate logistic regressions: 0.07[A]. Variable category: Applications submitted by state and state group: New York; Bivariate logistic regressions: 0.38; Multivariate logistic regressions: 0.37. Variable category: Applications submitted by state and state group: California; Bivariate logistic regressions: reference; Multivariate logistic regressions: reference. Source: GAO analysis of FRA data. [A] Ratios which are significant at P < 0.05. [End of table] The multivariate model estimates the net effects of these different variables on the likelihood of applications being selected for funding, or the effects of each variable when the effects of other variables are considered simultaneously, rather than one at a time. 
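As noted above, the odds ratios from the bivariate logistic regressions equal those computed directly from the observed frequencies. The sketch below illustrates this equivalence for the track variable by fitting a logistic regression to the counts in table 12 with a hand-rolled Newton-Raphson routine; it is a self-contained illustration, not the statistical software used in the underlying analysis:

```python
import math

# Observed counts by application track, from table 12: (not selected, selected)
counts = {"1a": (68, 33), "1b": (40, 15), "2": (9, 14), "3 or 4": (7, 20)}

# One row per application: an intercept plus dummy variables for tracks
# 1b, 2, and 3/4 (track 1a serves as the reference category).
dummies = {"1a": (0, 0, 0), "1b": (1, 0, 0), "2": (0, 1, 0), "3 or 4": (0, 0, 1)}
X, y = [], []
for track, (no, yes) in counts.items():
    for outcome, count in ((0, no), (1, yes)):
        X.extend([(1,) + dummies[track]] * count)
        y.extend([outcome] * count)

def solve(a, b):
    """Solve the linear system a x = b by Gauss-Jordan elimination."""
    size = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(size):
        piv = max(range(col, size), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(size):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [v - f * w for v, w in zip(m[r], m[col])]
    return [m[i][size] / m[i][i] for i in range(size)]

# Newton-Raphson iterations for the logistic-regression maximum likelihood fit
beta = [0.0, 0.0, 0.0, 0.0]
for _ in range(25):
    p = [1 / (1 + math.exp(-sum(b * x for b, x in zip(beta, row)))) for row in X]
    grad = [sum(row[j] * (yi - pi) for row, yi, pi in zip(X, y, p)) for j in range(4)]
    hess = [[sum(row[j] * row[k] * pi * (1 - pi) for row, pi in zip(X, p))
             for k in range(4)] for j in range(4)]
    beta = [b + s for b, s in zip(beta, solve(hess, grad))]
    if max(abs(g) for g in grad) < 1e-9:
        break

# Exponentiated coefficients are the odds ratios for tracks 1b, 2, and 3/4
print([round(math.exp(b), 2) for b in beta[1:]])  # [0.77, 3.21, 5.89]
```

Because the model contains a single categorical predictor, the fitted odds ratios match the observed-frequency odds ratios from table 12 (0.77, 3.21, and 5.89), and the exponentiated intercept recovers the track 1a reference odds of 0.49.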
Our results indicate that the differences between tracks and amount requested categories are rendered insignificant when technical review scores and state and state group are taken into account, while the effect of technical review score and the differences between state and state group remain sizable and in most cases significant. Specifically, we found that, when we accounted for all four variables, applications receiving a technical review score of 3, 4, or 5 were about seven to eight times more likely to be selected for funding than applications which scored 1 or 2. In addition, applications from states that submitted one to three applications, and applications from Missouri, were three and nine times as likely, respectively, to be selected as those from California, while those from Washington state were less than one-tenth as likely as those from California to be selected. The remaining variable categories were not significant in the multivariate model and, therefore, do not provide a statistical explanation for why applications were more or less likely to be selected for an award. [End of section] Footnotes: [1] Pub. L. No. 111-5, 123 Stat. 115 (Feb. 17, 2009). [2] On December 9, 2010, the department redirected $1.195 billion in intercity passenger rail funds originally designated for Ohio and Wisconsin to 13 other states, which were selected for Recovery Act awards in January 2010. At the time of this announcement, our audit work was substantially complete and, therefore, we did not assess FRA's approach to making these funding decisions. [3] GAO, High Speed Rail: Learning From Start-ups, Prospects for Increased Industry Investment, and Federal Oversight Plans, [hyperlink, http://www.gao.gov/products/GAO-10-625] (Washington, D.C.: June 17, 2010). [4] On October 28, 2010, the department announced 54 additional awards totaling $2.4 billion. 
These awards will be funded through the department's annual appropriations for fiscal years 2009 and 2010, which remain available until expended. The department also requested $1 billion for intercity passenger rail in fiscal year 2011. [5] We did not assess whether the selected applications will achieve the benefits and costs stated in the applications submitted to FRA. We have reported that applicants for major infrastructure projects, such as high speed rail projects, often overstate benefits, such as the number of likely riders. See GAO, High Speed Passenger Rail: Future Development Will Depend on Addressing Financial and Other Challenges and Establishing a Clear Federal Role, [hyperlink, http://www.gao.gov/products/GAO-09-317] (Washington, D.C.: Mar. 19, 2009). [6] At this time FRA also used the same approach to assess applications for planning grants using up to $9.54 million in fiscal year 2008 and 2009 funds (called "track 3") and for final design and construction projects using at least $82.3 million of fiscal year 2008 and 2009 funds (called "track 4"). Applications for tracks 3 and 4 were assessed by the same technical review panels and at the same time as the track 1a and 1b applications. Unless otherwise noted, this report deals with only track 1a, 1b, and 2 applications. [7] Pub. L. No. 110-432, 122 Stat. 4848 (Oct. 16, 2008). [8] FRA, Preliminary National Rail Plan (Washington, D.C., October 2009) and FRA, National Rail Plan-Moving Forward: A Progress Report (Washington, D.C., September 2010). [9] By comparison, the fiscal years 2008 and 2009 appropriations for the department included $30 million and $90 million, respectively, for intercity passenger rail grants to states. [10] An obligation is a commitment that creates a legal liability of the government for the payment of goods or services ordered or received. [11] Department of Transportation, Vision for High-Speed Rail in America (Washington, D.C., April 2009). [12] 74 Fed. Reg.
29900 (June 23, 2009). [13] Only states, groups of states, interstate compacts, public agencies, and Amtrak were eligible to apply for funding. Amtrak did not independently submit any applications, but was included in a number of other applications as the anticipated service operator. [14] The number of applications and amount of requested funds submitted by applicants include several duplicate projects. For example, Washington state submitted three track 2 applications in which 11 of the same projects were contained in each application. In tallying the number of applications and amounts requested, we did not double count duplicate applications. [15] Some of the criteria for the eligibility, technical, and selection reviews were derived from requirements in PRIIA. For example, PRIIA directed FRA to select projects that encourage intermodal connectivity, which is covered under the technical review criterion of transportation benefits. In addition, according to an FRA official, the technical review criteria were based on the Recovery Act and the department's general goals for transportation projects identified in the funding announcement (e.g., developing livable communities and encouraging environmental benefits). [16] Final application scores were derived from individual panelists' scores for each technical review criterion, which were weighted based on the track under which they were submitted. For example, track 1 application scores were weighted to emphasize the reviewers' technical review scores for transportation benefits, economic recovery benefits, and project management approach, and place less weight on the scores for other public benefits. In contrast, track 2 application scores were weighted to emphasize the scores for transportation benefits and other public benefits.
[17] Standardized scores, called z-scores, were applied only to tracks 1a, 1b, 3, and 4 technical review scores as an internal control to ensure interrater reliability across the 12 review panels. Standardization was not required for track 2 projects because the technical review was conducted by a single panel. [18] For a list of applications and the extent to which they made it through eligibility determination and selection, see [hyperlink, http://www.dot.gov/recovery/docs/hsiprapplist.pdf]. [19] There were three instances in which the criteria in the funding announcement did not completely align with the guidebooks. For example, the funding announcement includes a technical review criterion that projects create an integrated intercity passenger rail network, including allowance for and support of future network expansion. The guidebooks discuss the connection of the proposed project to other intercity rail services, but do not indicate that panelists should consider future network expansion. We viewed these instances as minor and, therefore, concluded that the funding announcement and technical review guidebooks generally align. [20] GAO, Standards for Internal Control in the Federal Government, [hyperlink, http://www.gao.gov/products/GAO/AIMD-00-21.3.1] (Washington, D.C.: November 1999). [21] U.S. Department of Transportation, Financial Assistance Guidance Manual (Washington, D.C., March 2009). [22] These amounts include track 1 and 2 awards, and may change as FRA finalizes awards. [23] On October 28, 2010, FRA announced that Florida was selected for an additional $808 million from the department's annual appropriations for the Tampa to Orlando high speed rail project. In addition, on December 9, 2010, FRA announced that it had redistributed an additional $342.3 million to Florida originally designated for Ohio and Wisconsin.
On February 16, 2011, Florida announced that it would turn down the $2.4 billion in funding awarded for the Tampa to Orlando high speed rail project. As of late February, FRA had not announced how this funding would be used.

[24] On October 28, 2010, FRA announced that Washington state was selected for an additional $31 million from the department's annual appropriations. In addition, on December 9, 2010, FRA announced that it had redistributed to Washington state an additional $161.5 million originally designated for Ohio and Wisconsin. We did not ask Washington state officials how these additional funds would affect their ability to complete the 16 proposed projects.

[25] Officials also noted that they were adjusting award amounts and the scope of the projects as they negotiated cooperative agreements with each state, which will serve as documentation of the final award decisions.

[26] The statistical tests we ran were bivariate and multivariate logistic regression models, which included the following factors: (1) technical review scores, (2) number of applications submitted per state, (3) application track, and (4) the amount of project funding requested. The third and fourth factors were not statistically significant and, therefore, do not provide a statistical explanation for why projects were more or less likely to be selected for an award. For more information on our methodology and additional analyses, see appendix IV.

[27] The federal agency guidance we examined came from the Departments of Commerce, Education, Labor, and Transportation. For more information on our methodology for developing recommended practices for this review, see appendix II.

[28] Race to the Top, which is part of the Recovery Act's State Fiscal Stabilization Fund, is a competitive grant program that provides funds to states to encourage educational reform that will result in improved academic performance.
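The bivariate models described in note 26 come down to comparing the odds of selection across categories of a single factor. The sketch below illustrates that comparison with made-up counts; the actual application data are not reproduced here.

```python
# Sketch of a bivariate odds-of-selection comparison (note 26).
# All counts are hypothetical, not the actual FRA application data.

def odds(selected, not_selected):
    """Odds of selection for one category of applications."""
    return selected / not_selected

# Hypothetical counts: applications grouped by technical review score.
high_score = {"selected": 30, "not_selected": 20}
low_score = {"selected": 10, "not_selected": 40}

odds_high = odds(high_score["selected"], high_score["not_selected"])
odds_low = odds(low_score["selected"], low_score["not_selected"])

# The odds ratio summarizes how much the selection odds differ across
# the two score categories; a multivariate logistic model estimates
# the same kind of ratio while statistically controlling for the
# other factors (state, track, funding requested).
odds_ratio = odds_high / odds_low
print(odds_high, odds_low, odds_ratio)  # 1.5 0.25 6.0
```

Here an odds ratio of 6.0 would mean higher-scoring applications had six times the odds of selection, ignoring the other variables; the multivariate models in note 42 repeat the comparison with those variables held constant.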
[29] The Recovery Act contains a number of provisions related to transparency, notably the requirement that recipients of these funds report quarterly on a number of items, such as the purpose and expected outcomes of their awards and the jobs created. These reports are available on the administration's Web site at [hyperlink, http://www.recovery.gov]. See GAO, Recovery Act: Increasing the Public's Understanding of What Funds Are Being Spent on and What Outcomes Are Expected, [hyperlink, http://www.gao.gov/products/GAO-10-581] (Washington, D.C.: May 27, 2010).

[30] For a list of programs we examined and more information on our methodology for selecting these programs, see appendix II.

[31] [hyperlink, http://www.gao.gov/products/GAO-10-625].

[32] FRA identified provisional amounts of funding for applicants, subject to negotiation. Further, FRA did not specify funding amounts for each project in its notification letters to applicants, providing states the flexibility to allocate funding across FRA-selected applications. As a result, we were unable to determine the amount of funding awarded for each category of high speed rail applications.

[33] Given that FRA was provided discretion to determine the number of high speed rail applications to select during its funding competition, we did not evaluate whether it selected an appropriate number of projects for funding.

[34] In contrast, several other capital grants and investment programs have relied on existing program structures, such as the department's Highway Infrastructure Investment and Transit Capital Assistance programs, and only authorized agencies to obligate funds through September 30, 2010. The Highway Infrastructure Investment program, administered by the Federal Highway Administration, provides funding to states for restoration, repair, and construction of highways, among other things.
The Transit Capital Assistance program, administered by the Federal Transit Administration, provides grants for facility renovation or construction, vehicle replacements, preventive maintenance, and other related activities.

[35] The states were Arizona, California, Florida, Iowa, Kansas, Louisiana, New York, Oregon, Pennsylvania, and South Carolina.

[36] We selected Ohio and Washington because FRA proposed to provide these states with $164 million and $386 million less than requested, respectively.

[37] To identify these practices, we reviewed prior work on discretionary grants to compile an initial list of grants manuals from a number of federal agencies. We then verified and added to this list through a separate search and review of government agency Web sites. In addition, we consulted with GAO staff who have expertise on federal discretionary grant practices. These practices reflect our review of a judgmental sample of discretionary grant guidance and may not include all recommended practices used by federal agencies.

[38] We typically reviewed information on each program's agency Web site as part of our review of these three sources.

[39] Ohio is slated to receive $15 million for preliminary engineering and environmental analysis work conducted on its track 2 project. According to FRA, $30 million remains obligated to Wisconsin for costs incurred on its track 2 project, and the state is scheduled to receive $14 million for the two previously selected track 1a projects.

[40] The technical review scores were averaged and weighted based on the priorities listed in the funding announcement.

[41] The inclusion of these additional data points slightly affected the final technical review scores for other applications because FRA used the average score across applications when it applied a standardization procedure, called a z-score, to correct for potential differences in the ways applications were scored across the track 1, 3, and 4 panels.
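The z-score standardization mentioned in notes 17 and 41 can be sketched as follows; the panel scores are invented for illustration. The idea is that each panel's raw scores are rescaled to mean 0 and standard deviation 1, so a strict panel and a lenient panel become directly comparable.

```python
# Sketch of z-score standardization across review panels (notes 17, 41).
# Scores are hypothetical; the point is that standardization removes
# differences in how harshly or generously each panel scored.
import statistics

def z_scores(scores):
    """Standardize one panel's raw scores to mean 0, stdev 1."""
    mean = statistics.mean(scores)
    stdev = statistics.pstdev(scores)  # population standard deviation
    return [(s - mean) / stdev for s in scores]

lenient_panel = [4.0, 4.5, 5.0]  # high raw scores
strict_panel = [2.0, 2.5, 3.0]   # low raw scores, same relative ranking

# After standardization the two panels' scores line up exactly,
# even though their raw scales differed.
print(z_scores(lenient_panel))
print(z_scores(strict_panel))
```

This also illustrates the side effect described in note 41: because the z-score depends on each panel's mean, adding new data points to a panel shifts every standardized score on that panel slightly.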
[42] The bivariate models estimate differences in the odds of selection across groups or categories of applications when the other variables are ignored. The multivariate models estimate differences in the odds of selection across categories of each variable when the other variables are taken into account, or controlled statistically.

[43] For example, a category within the technical review score variable might include applications receiving a score of 3, as in table 11.

[44] We have omitted this expanded table to save space. The differences across the 34 states that submitted an application are statistically significant given the likelihood ratio chi-square statistic calculated to test the independence of the state from which the application was submitted and the likelihood of selection (L2 = 67.40 with 33 df, P < .01).

[45] When we grouped the states submitting lower numbers of applications, two-thirds of the variability from the individual state analysis was retained, and the variability lost as a result of the grouping was statistically insignificant.

[End of section]

GAO's Mission:

The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each weekday, GAO posts newly released reports, testimony, and correspondence on its Web site.
To have GAO e-mail you a list of newly posted products every afternoon, go to [hyperlink, http://www.gao.gov] and select "E-mail Updates."

Order by Phone:

The price of each GAO publication reflects GAO's actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO's Web site, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information.

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact:
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470

Congressional Relations:

Ralph Dawn, Managing Director, dawnr@gao.gov, (202) 512-4400
U.S. Government Accountability Office
441 G Street NW, Room 7125
Washington, D.C. 20548

Public Affairs:

Chuck Young, Managing Director, youngc1@gao.gov, (202) 512-4800
U.S. Government Accountability Office
441 G Street NW, Room 7149
Washington, D.C. 20548