This is the accessible text file for GAO report number GAO-10-579 entitled 'Information Technology: Management Improvements Are Essential to VA's Second Effort to Replace Its Outpatient Scheduling System' which was released on June 28, 2010. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. Report to the Ranking Member, Committee on Veterans' Affairs, U.S. Senate: United States Government Accountability Office: GAO: May 2010: Information Technology: Management Improvements Are Essential to VA's Second Effort to Replace Its Outpatient Scheduling System: GAO-10-579: GAO Highlights: Highlights of GAO-10-579, a report to the Ranking Member, Committee on Veterans’ Affairs, U.S. Senate. Why GAO Did This Study: The Department of Veterans Affairs (VA) provides medical care, disability compensation, and vocational rehabilitation to veterans. The Veterans Health Administration (VHA)—a component of VA—provides care to over 5 million patients in more than 1,500 facilities. VHA relies on an outpatient scheduling system that is over 25 years old. In 2000, VHA began the Scheduling Replacement Project to modernize this system as part of a larger departmentwide modernization effort called HealtheVet. However, in February 2009, VA terminated a key contract supporting the project. GAO was asked to (1) determine the status of the Scheduling Replacement Project, (2) determine the effectiveness of VA’s management and oversight of the project, and (3) assess the impact of the project on VA’s overall implementation of its HealtheVet initiative. To do so, GAO reviewed project documentation and interviewed VA and contractor officials. What GAO Found: After spending an estimated $127 million over 9 years on its outpatient scheduling system project, VA has not implemented any of the planned system’s capabilities and is essentially starting over. Of the total amount, $62 million was expended for, among other things, project planning, management support, a development environment, and equipment. In addition, the department paid an estimated $65 million to the contractor selected to develop the replacement scheduling application. However, the application software had a large number of defects that VA and the contractor could not resolve. As a result, the department terminated the contract, determined that the system could not be deployed, and officially ended the Scheduling Replacement Project on September 30, 2009. 
VA began a new initiative that it refers to as HealtheVet Scheduling on October 1, 2009. As of April 2010, the department’s efforts on this new initiative had largely consisted of evaluating whether to buy or custom build a new scheduling application. VA’s efforts to successfully complete the Scheduling Replacement Project were hindered by weaknesses in several key project management disciplines and a lack of effective oversight that, if not addressed, could undermine the department’s second effort to replace its scheduling system: * VA did not adequately plan its acquisition of the scheduling application and did not obtain the benefits of competition. * VA did not ensure requirements were complete and sufficiently detailed to guide development of the scheduling system. * VA performed system tests concurrently, increasing the risk that the system would not perform as intended, and did not always follow its own guidance, leading to software passing through the testing process with unaddressed critical defects. * VA’s project progress and status reports were not reliable, and included data that provided inconsistent views of project performance. * VA did not effectively identify, mitigate, and communicate project risks due to, among other things, staff members’ reluctance to raise issues to the department’s leadership. * VA’s various oversight boards had responsibility for overseeing the Scheduling Replacement Project; however, they did not take corrective actions despite the department becoming aware of significant issues. The impact of the scheduling project on the HealtheVet initiative cannot yet be determined because VA has not developed a comprehensive plan for HealtheVet that, among other things, documents the dependencies among the projects that comprise the initiative. VA officials stated that the department plans to document the interdependencies, project milestones, and deliverables in an integrated master schedule as part of a project management plan that is expected to be completed by June 2010. In the absence of such a plan, the impact of the scheduling project’s failure on the HealtheVet program is uncertain. What GAO Recommends: GAO is recommending that the Secretary of Veterans Affairs direct the Chief Information Officer to take six actions to improve key processes, including acquisition management, system testing, and progress reporting, which are essential to the department’s second outpatient scheduling system effort. In written comments on a draft of this report, VA generally concurred with GAO’s recommendations and described actions to address them. View [hyperlink, http://www.gao.gov/products/GAO-10-579] or key components. For more information, contact Valerie Melvin at (202) 512- 6304 or melvinv@gao.gov. 
[End of section] Contents: Letter: Background: VA Ended the Outpatient Scheduling System Project without Delivering Expected Capabilities and Has Begun a New Initiative: Scheduling Replacement Project Was Hindered by Weaknesses in Key Management Capabilities: Impact of Scheduling Replacement Project Failure on HealtheVet Program is Uncertain: Conclusions: Recommendations for Executive Action: Agency Comments and Our Evaluation: Appendix I: Objectives, Scope, and Methodology: Appendix II: Comments from the Department of Veterans Affairs: Appendix III: GAO Staff Contact and Acknowledgments: Figure: Figure 1: Timeline of Scheduling Replacement Project Events: Abbreviations: ANSI: American National Standards Institute: CASU: Cooperative Administrative Support Units: CIO: Chief Information Officer: COTS: commercial off-the-shelf: CPI: cost performance index: EVM: earned value management: EIA: Electronic Industries Alliance: FAR: Federal Acquisition Regulation: GSA: General Services Administration: IT: information technology: OED: Office of Enterprise Development: OI&T: Office of Information & Technology: OMB: Office of Management and Budget: PMAS: Program Management Accountability System: SPI: schedule performance index: SwRI: Southwest Research Institute: VA: Department of Veterans Affairs: VHA: Veterans Health Administration: VistA: Veterans Health Information Systems and Technology Architecture: VISN: Veterans Integrated Service Network: [End of section] United States Government Accountability Office: Washington, DC 20548: May 27, 2010: The Honorable Richard Burr: Ranking Member: Committee on Veterans' Affairs: United States Senate: Dear Senator Burr: The Department of Veterans Affairs (VA) is responsible for providing a variety of services to veterans, including medical care, disability compensation, and vocational rehabilitation. The Veterans Health Administration (VHA)--a component of VA--manages one of the largest health care systems in the United States, providing health care to more than 5 million patients in more than 1,500 facilities. To carry out its daily operations in providing health care to patients, VHA relies on the Veterans Health Information Systems and Technology Architecture (VistA), an information system comprised of multiple applications that include health provider applications; management and financial applications; registration, enrollment, and eligibility applications; health data applications; and information and education applications. As part of VistA, VHA operates an electronic outpatient scheduling system that automates all aspects of the outpatient appointment process, including the scheduling of patients and the generation of managerial reports. However, this system is over 25 years old, is inefficient in coordinating care between different sites, and has contributed to increasing wait times for appointments as the number of VA patients has grown in recent years. In 2000, VHA began an initiative to modernize the system--the Scheduling Replacement Project--with the goal of creating an outpatient scheduling application that would improve veterans' access to health care. The Scheduling Replacement Project was to result in the first system to be deployed as part of a larger, departmentwide initiative to modernize the department's health information system, known as HealtheVet. 
[Footnote 1] However, after 9 years of attempting to produce a new outpatient scheduling system, VA terminated a key contract supporting the Scheduling Replacement Project in February 2009 and ended the project in September 2009. According to the program manager, the department then began a new project to develop a scheduling system in October 2009. At your request, we conducted a review of VA's efforts toward replacing its scheduling system. Specifically, our objectives were to (1) determine the status of the scheduling project, (2) determine the effectiveness of VA's management and oversight of the project, and (3) assess the impact of the project on the department's overall implementation of its health information system modernization initiative--HealtheVet. To accomplish our objectives, we reviewed relevant project documentation and interviewed appropriate VA and contractor officials. Specifically, to determine the status of the project, we reviewed the project management plan and project status reports. To determine the effectiveness of VA's management and oversight of the project, we compared VA's plans and activities in key areas of management controls to best practices, as well as the department's own policies and guidance. To assess the impact of the scheduling project on VA's overall implementation of its health information system modernization initiative, we reviewed documentation such as briefings from HealtheVet planning meetings and interviewed officials about the status of the HealtheVet initiative. We conducted this performance audit from May 2009 through May 2010 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. See appendix I for a more complete description of our objectives, scope, and methodology. Background: As part of VA's mission, VHA is to serve the needs of America's veterans and their families (spouses and children) by providing primary care, specialized care, and related medical and social support services. VHA provides health services through more than 1,500 sites of care, including 153 hospitals, 995 outpatient clinics, 135 community living centers, and 232 Vet Centers. It employs more than 15,000 physicians and serves more than 5 million patients at these sites of care each year. To carry out its daily operations in providing health care to veterans and their families, VHA relies on an outpatient appointment scheduling system that is part of the department's current electronic health information system, known as VistA. However, according to the department, the current scheduling system has a number of limitations that impede its effectiveness, including: * Appointment activity resides at multiple medical centers, making it difficult to retrieve all of a patient's health care history. * Clinicians must maintain multiple calendars to account for the various services they provide. * Appointments and ancillary services are not linked, resulting in the inability to associate medical data with appointments. * Access to multiple sites is required to make appointments, resulting in inefficient coordination of care between facilities. Accordingly, in 2000, VHA initiated a project to replace the existing scheduling system. 
In doing so, it envisioned that the new scheduling system would provide benefits for the department, including: * a single enterprise database that would allow all appointments to be viewed, regardless of the point of care; * calendars that would include sequential appointment settings; * long-term appointment lists that would track and remind staff of future appointments; and: * ancillary service links that would allow for automated updates to appointment cancellations. VA originally planned to deploy the new outpatient scheduling system to an initial site by December 2004 and nationally by June 2006. In August 2002, the department had estimated that the total cost to develop and deploy the new system across all VHA facilities would be about $59 million. History of VA's Scheduling System Initiative: VHA began the scheduling replacement initiative in October 2000, at which time it began to identify business requirements for the new system. It also issued a request for proposals, seeking interested Veterans Integrated Service Networks (VISN)[Footnote 2] to partner with its Office of Information to conduct a business process reengineering effort and replace the VistA scheduling system with a commercial off-the-shelf (COTS) application. In January 2001, VHA selected VISNs 16 and 17, representing Texas and the south-central United States, respectively, to perform these tasks. Additionally, the Muskogee, Oklahoma medical center, part of VISN 16, was the planned location for the initial deployment of the new scheduling system. The VISNs used a pre-existing Cooperative Administrative Support Units (CASU)[Footnote 3] contract to obtain the services of the Southwest Research Institute (SwRI) to support the project.[Footnote 4] The statement of work included tasks to develop business information flow models and information system technical documents, and to select a COTS product to integrate into the system. However, according to project officials, in April 2002, VHA's Chief Information Officer (CIO) determined that using a COTS solution would result in excessive costs and make the department dependent on a vendor for a core business function. Thus, the CIO directed the VISNs to redirect their efforts and funding to develop a scheduling application instead of purchasing a COTS application. VA issued a new statement of work for SwRI to design, build, and test the scheduling application. The department planned to deliver the new outpatient scheduling system first to the location in Muskogee, referred to as the alpha deployment, by December 2004. Once successfully tested and deployed at this location, the system was to be deployed within VISN 16 and 17 for testing, then nationally to all VHA facilities. In 2004, issues integrating the application with HealtheVet components and funding reductions led to a delay in the alpha deployment date, pushing it back to October 2006. In an effort to meet the new date, VA decided in April 2005 to descope the alpha version of the scheduling application by removing certain planned capabilities. Simultaneously, the department and SwRI began treating a separate version that was to retain all planned capabilities as a distinct development effort, referred to as the beta version. Nevertheless, delays in correcting defects, conducting tests, and changing the code in response to infrastructure modifications, resulted in six more extensions of the target alpha deployment date (over 2 ½ years beyond the October 2006 planned date). 
Further, in an attempt to expedite the project, in September 2008, the Principal Deputy Under Secretary for Health directed the project team to focus its efforts on a national deployment of the new scheduling system by the end of 2009, rather than on the single-site alpha deployment. However, in January 2009, the project team determined that the product that had been developed for alpha deployment would not be suitable for national deployment by the end of 2009; thus, in February 2009, the department terminated its contract for the replacement scheduling application. VA subsequently ended the entire Scheduling Replacement Project in September 2009. Figure 1 depicts a timeline of key project events from its initiation through its termination. Figure 1: Timeline of Scheduling Replacement Project Events: [Refer to PDF for image: timeline] Request for Proposals to VISNs: October 2000. Project awarded to VISNs 16 and 17: January 2001. Project redirected from COTS to development: April 2002. Slipped alpha deployment: December 2004. Project descoped alpha and beta version split: April 2005. Original planned national deployment: June 2006. Slipped alpha deployment: October 2006. Slipped alpha deployment: June 2007. Slipped alpha deployment: September 2007. Slipped alpha deployment: March 2008. Slipped alpha deployment: June 2008. Project redirected to implement national deployment of scheduling application by the end of 2009: August 2008. Slipped alpha deployment: September 2008. Key contract for scheduling application terminated: February 2009. Slipped alpha deployment: May 2009. Project termination: October 2009. Source: GAO analysis of VA data. [End of figure] Governance of the Scheduling Replacement Project: Several organizations within VA were responsible for governance of the Scheduling Replacement Project: * In July 2000, VHA established a project management office to coordinate all efforts and monitor project activities to ensure success of the Scheduling Replacement Project. The project management office was to ensure achievement of milestones, evaluate project success, and report to VHA senior level executives. * In June 2001, VHA established the Scheduling Replacement Board of Directors to guide the overall direction of the project. According to its charter, the board was to review project activities on a quarterly basis, provide key decisions at major project milestones, confirm the achievement of project milestones, and evaluate project success. * In February 2003, VA established an Enterprise Information Board as its executive decision-making body for information technology (IT) capital planning and investment control. The board was to provide oversight in the selection, management, and control of IT investments such as the scheduling system. In February 2007, the Secretary of Veterans Affairs approved a centralized IT management structure for the department.[Footnote 5] As part of this realignment, staff from the project management office with responsibility for the Scheduling Replacement Project were transferred from VHA to the Office of Enterprise Development (OED) within VA's Office of Information and Technology (OI&T). Also in 2007, VA issued a governance plan[Footnote 6] to enable the department to better align its IT strategy to its business strategy, manage investments, and reconcile disputes regarding IT. 
The governance structure established by the plan included three governance boards for IT projects, such as the Scheduling Replacement Project: * The Budgeting and Near-Term Issues Board is to identify, review, recommend, and advocate projects and programs across the department. [Footnote 7] The board's responsibilities include monitoring projects' achievement of results. * The Programming and Long-Term Issues Board is to oversee portfolio development and evaluate program execution by conducting milestone reviews and program management reviews of IT investments.[Footnote 8] * The Information Technology Leadership Board is responsible for adjudicating all unresolved resource issues forwarded by the Budgeting and Near-Term Issues Board and forwarding recommendations to the department's Strategic Management Council.[Footnote 9] Prior Reviews of HealtheVet and the Scheduling Replacement Project: We and VA's Office of Inspector General have both issued reports concerning the HealtheVet initiative and the Scheduling Replacement Project. Specifically, in a June 2008 report, we raised concerns about VA's HealtheVet initiative.[Footnote 10] We noted that the eight major software development projects comprising the initiative (which included the Scheduling Replacement Project) were in various stages of development, and that none had yet been completed. We noted that while VA had established interim dates for completing the component projects, it had not developed a detailed schedule or approach for completing the overall HealtheVet initiative. Further, the department had not yet implemented a complete governance structure; several key leadership positions within the development organization had not been filled or were filled with acting personnel; and the departmental governance boards had not scheduled critical reviews of HealtheVet projects. We concluded that, without all elements of governance and oversight in place, the risk to the success of the HealtheVet initiative and, therefore, its component initiatives (such as the Scheduling Replacement Project) was increased. Accordingly, we recommended that VA develop a comprehensive project management plan and schedule, as well as a governance structure, to guide the development and integration of the many projects under this complex initiative. Subsequent to our 2008 report, VA reported that it had begun to formulate a project management plan, an integrated schedule of projects, and a governance plan for the HealtheVet initiative. Further, in reporting on the development of the replacement scheduling application in August 2009, the Office of Inspector General noted, among other things, that VA did not have staff with the necessary expertise to execute large-scale IT projects. The report also noted that there was minimal oversight of the contracting processes on the project and that the department had made no attempt to find a contracting officer with experience for this multi-year, complex project. The Inspector General suggested that VA develop effective oversight processes, develop in-house staff with the expertise to manage and execute complex integrated IT programs, and expand the number of contracting officers with experience on large projects. In response, the department consolidated IT procurements under the Office of Acquisition, Logistics, and Construction and established the Technology Acquisition Center to administer future OI&T contracts. 
VA Ended the Outpatient Scheduling System Project without Delivering Expected Capabilities and Has Begun a New Initiative: After spending an estimated $127 million over 9 years (from fiscal years 2001 through 2009) on its outpatient scheduling system project, VA has not yet implemented any of the system's expected capabilities. According to the department, of the total amount, $62 million was expended for, among other things, project planning, management support, a development environment, and equipment. In addition, the department paid an estimated $65 million to SwRI to develop the replacement scheduling application. However, VA and SwRI were not able to resolve a large number of system defects, and the department terminated the contract in February 2009. Subsequently, the department determined that the application was not viable (i.e., did not meet its needs), and officially ended the Scheduling Replacement Project on September 30, 2009. The department began a new initiative on October 1, 2009, which it refers to as HealtheVet Scheduling.[Footnote 11] However, as of early April 2010, it had completed only limited tasks for the new initiative. Specifically, the department's efforts consisted of analyzing alternatives and briefing VA's CIO on the analysis. Officials told us that they had not yet developed a project plan or schedule for the initiative, but intended to do so after determining whether to build or buy the new application. Scheduling Replacement Project Was Hindered by Weaknesses in Key Management Capabilities: The success of large IT projects is dependent on agencies' possessing management capabilities to effectively conduct acquisitions, manage system requirements, perform system tests, measure and report project performance, and manage project risks. In addition, effective institutionalized oversight is necessary to ensure that projects are, in fact, demonstrating these management capabilities and achieving expected results. However, the Scheduling Replacement Project had weaknesses in these areas that, if not addressed, could derail the department's current attempt to deliver a new scheduling system. Contracting for the Scheduling System Was Inconsistent with Fundamental Acquisition Management Principles: The Federal Acquisition Regulation (FAR) requires preparation of acquisition plans,[Footnote 12] and our prior work evaluating major system acquisitions has found that planning is an essential activity to reduce program risk.[Footnote 13] According to the FAR, an acquisition plan must address, among other things, how competition will be sought, promoted, and sustained throughout the course of the acquisition, or cite the authority and justification for why full and open competition cannot be obtained.[Footnote 14] Competition can help save taxpayer money, improve contractor performance, and promote accountability for results. Agencies are generally required to obtain full and open competition, except in certain specified situations such as modifications within the scope of the existing contract.[Footnote 15] Orders placed against a federal supply schedule are considered to be issued using full and open competition if the applicable procedures are followed.[Footnote 16] We have also found that having a capable acquisition workforce is a necessary element of properly conducting acquisitions that will meet agency needs. VA did not develop an acquisition plan until May 2005, about 4 years after the department first contracted for a new scheduling system. 
Thus, formative decisions with implications for the scheduling project's success, such as what the contractor was to do, the type of contract to be used, and whether and how competition would be promoted (or, if not, why), were made in an ad hoc fashion (i.e., not subject to a deliberative planning process). Further, VA did not promote competition in contracting for its scheduling system. Specifically, rather than performing activities that are intended to promote competition (e.g., announcing the requirement, issuing a solicitation, and evaluating proposals), VA issued task orders against an existing CASU contract that the department had in place for acquiring services such as printing, computer maintenance, and data entry. Later, when the department changed its strategy to acquire a custom-built scheduling application instead of pursuing COTS integration--a fundamental change to the development approach and contract scope--the department again did not seek to obtain the benefits of competition. Instead, the project team directed the change through a letter to the existing contractor and a substantially revised statement of work. In August 2004, VA determined that it would no longer support the CASU agreement, and in response, the project team sought to use a General Services Administration (GSA) schedule contract to retain the services of its existing contractor. However, VA did not follow required ordering procedures when it transitioned to the GSA schedule contract. Specifically, VA did not solicit price quotes from at least three schedule vendors, as required by the FAR.[Footnote 17] Instead, at the direction of the program office, the department provided a statement of work only to the incumbent contractor, which responded with a proposal and price quote. As a result, VA increased the risk that it was not selecting a contractor that would provide the best approach. Further, VA did not assess whether the purchase of commercial services under this schedule was the most suitable means for developing a custom-built scheduling application. These weaknesses in VA's acquisition management for the scheduling system project reflect the inexperience of the department's personnel in administering major IT contracts. In this regard, VA's Inspector General identified the lack of VA personnel who are adequately trained and experienced to plan, award, and administer IT contracts as a major management challenge for the department and specifically cited the scheduling system acquisition as an example. Also, VA's contracting officer told us that the contracting office did not have prior experience in the award or administration of contracts for IT system development. According to the HealtheVet Scheduling program manager, going forward, the scheduling system project team plans to use VA's Technology Acquisition Center within the Office of Acquisition, Logistics, and Construction to administer future contracts. Established in March 2009 in an effort to improve the department's IT acquisition management, the center comprises experienced acquisition staff members who are to provide exclusive contracting support to the Office of Information and Technology. According to the Executive Director, the Technology Acquisition Center includes technical specialists who can offer assistance with refining statements of work and contractual requirements. Also, representatives from the Office of General Counsel are colocated with the center to facilitate reviews for compliance with applicable federal laws and regulations.
Although VA has taken positive actions to improve its IT acquisition management, these actions do not ensure that the department will not repeat the pattern of failing to seek and promote competition and other weaknesses that it demonstrated in contracting for the scheduling system. Until the department ensures that it has adequately planned for the future acquisition of a scheduling system, including whether and how it will provide for competition or otherwise comply with federal contracting requirements, it cannot ensure that it will be effective in acquiring a system that meets user needs at a reasonable cost and within a reasonable time frame. VA Did Not Ensure Requirements Were Complete and Sufficiently Detailed to Guide Development of the Scheduling System: According to recognized guidance, using disciplined processes for defining and managing requirements can help reduce the risks of developing a system that does not meet user needs, cannot be adequately tested, and does not perform or function as intended. [Footnote 18] Requirements should serve as the basis for a shared understanding of the system to be developed. Among other things, effective practices for defining requirements include analyzing requirements to ensure that they are complete, verifiable, and sufficiently detailed to guide system development. In addition, maintaining bidirectional traceability from high-level operational requirements through detailed low-level requirements to test cases is an example of a disciplined requirements management practice. Further, in previous work, we have found that requirements development processes should be well-defined and documented so that they can be understood and properly implemented by those responsible for doing so. [Footnote 19] VA did not adequately analyze requirements to ensure they were complete, verifiable, and sufficiently detailed to guide system development. For example, in November 2007, VA determined that performance requirements were missing and that some requirements were not testable. Further, according to project officials, some requirements were vague and open to interpretation. For example, although the requirement to sort appointment requests to be processed was included, it required clarification on how those appointments should be sorted. Also, requirements for processing information from systems on which the scheduling application depended were missing. For example, in June 2008, several requirements for processing updates to a patient's eligibility had to be added. The incomplete and insufficiently detailed requirements resulted in a system that did not function as intended. In addition, VA did not ensure that requirements were fully traceable. As early as October 2006, an internal review of the scheduling project's requirements management noted that the requirements did not trace to business rules or to test cases. Yet, almost 2 years later, in August 2008, VA documentation continued to reflect this problem-- stating that not every lower-level requirement traced back to one or more of the higher-level functional requirements and down to test cases. By not ensuring requirements traceability, the department increased the risk that the system could not be adequately tested and would not function as intended. According to scheduling project officials, requirements were incomplete, in part, because they depended on information from other related systems that had not yet been fully defined. 
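To illustrate the bidirectional traceability practice discussed above, the following sketch checks whether each lower-level requirement traces up to a higher-level requirement and down to at least one test case. The requirement and test case identifiers are hypothetical and are not drawn from VA's requirements repository; the sketch only demonstrates the kind of check the practice calls for.

high_level = {"HL-1": "Schedule an outpatient appointment",
              "HL-2": "Process updates to patient eligibility"}

# Each lower-level requirement should trace up to a higher-level requirement
# and down to at least one test case.
low_level = {
    "LL-10": {"parent": "HL-1", "tests": ["TC-101", "TC-102"]},
    "LL-11": {"parent": "HL-1", "tests": []},          # no downward trace
    "LL-12": {"parent": None,   "tests": ["TC-201"]},  # no upward trace
}

def traceability_gaps(high_level, low_level):
    """Report requirements that break bidirectional traceability."""
    no_parent = [r for r, d in low_level.items() if d["parent"] not in high_level]
    no_tests = [r for r, d in low_level.items() if not d["tests"]]
    uncovered = [h for h in high_level
                 if not any(d["parent"] == h for d in low_level.values())]
    return no_parent, no_tests, uncovered

no_parent, no_tests, uncovered = traceability_gaps(high_level, low_level)
print("Lower-level requirements with no higher-level parent:", no_parent)
print("Lower-level requirements with no test case:", no_tests)
print("Higher-level requirements with no lower-level coverage:", uncovered)

A report of this kind, produced each time requirements change, is one way a project team can detect the traceability gaps that VA's internal reviews identified.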
In addition, VA did not develop a requirements management plan for the Scheduling Replacement Project until October 2008. Our analysis of this plan found it to be generally consistent with leading practices. However, the project team's use of the requirements management plan was precluded by the department's decision to end the project. According to the Scheduling program manager, the project team expects to further develop the requirements management plan, dependent upon the department's yet-to-be-selected alternative for proceeding with the current effort, HealtheVet Scheduling. Nevertheless, the department has not yet demonstrated its capability to execute effective requirements management practices. Without well-defined and managed requirements, VA and its contractor lacked a common understanding of the system to be developed and increased the risk that the system would not perform as intended. Going forward, effective requirements development and management will be essential to ensuring that this risky situation, which could endanger the success of VA's new scheduling system project, is not repeated. VA's Approach to Performing System Tests Increased Risk that the System Would Not Perform as Intended: Best practices in system testing indicate that testing activities should be performed incrementally, so that problems and defects [Footnote 20] with software versions can be discovered and corrected early, when fixes generally require less time and fewer resources. VA's guidance on conducting tests during IT development projects is consistent with these practices and specifies four test stages and associated criteria that are to be fulfilled in order to progress through the stages.[Footnote 21] For example, defects categorized as critical, major, and average severity that are identified in testing stage one (performed within the development team) are to be resolved before testing in stage two (performed by the testing services organization) is begun.[Footnote 22] Nonetheless, VA took a high-risk approach to testing the scheduling system by performing tests concurrently rather than incrementally. Based on information provided by project officials, the department began stage two testing on all 12 versions of the scheduling application before stage one testing had been completed. On average, stage two testing began 78 days before stage one testing of the same version had been completed. In two of these cases, stage two testing started before stage one testing had begun. Compounding the risk inherent in this concurrent approach to testing, the first alpha version to undergo stage two testing had 370 defects that were of critical, major, or average severity even though the department's criteria for starting stage two testing specified that all such defects are to be resolved before starting stage two testing. While stage two testing was ongoing, VA made efforts to reduce the number of defects by issuing additional task orders for defect repair to its contractor and by hiring an additional contractor whose role was to assist in defect resolution. However, almost 2 years after beginning stage two testing, 87 defects that should have been resolved before stage two testing began had not been fixed. Scheduling project officials told us that they ignored their own testing guidance and performed concurrent testing at the direction of Office of Enterprise Development senior management in an effort to prevent project timelines from slipping. 
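To make the stage-two entry criterion described above concrete, the brief sketch below checks whether any unresolved defect of critical, major, or average severity should block the start of stage two testing. The defect records are hypothetical (only the severity categories follow the report's wording), and the sketch does not represent VA's actual test management tooling.

BLOCKING_SEVERITIES = {"critical", "major", "average"}

def ready_for_stage_two(defects):
    """Return whether stage two testing may begin and which defects block it."""
    blocking = [d for d in defects
                if d["severity"] in BLOCKING_SEVERITIES and not d["resolved"]]
    return len(blocking) == 0, blocking

stage_one_defects = [
    {"id": 1, "severity": "critical", "resolved": False},
    {"id": 2, "severity": "minor",    "resolved": False},
    {"id": 3, "severity": "average",  "resolved": True},
]

ok, blocking = ready_for_stage_two(stage_one_defects)
print("Enter stage two testing:", ok)                       # False
print("Blocking defect IDs:", [d["id"] for d in blocking])  # [1]

Under such a criterion, the first alpha version, which entered stage two testing with 370 unresolved defects of these severities, would not have been eligible to proceed.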
In addition, project officials told us they made a conscious decision to conduct concurrent testing in an effort to promote early identification of software defects. However, because the department did not follow its guidance for system testing and, instead, performed concurrent testing, it increased the risk that the scheduling project would not perform as intended and would require additional time and resources to be delivered. If VA is to be successful in its new initiative to provide an outpatient scheduling system, it is critical that the department adhere to its own testing guidance for ensuring the resolution of problems in a timely and cost-effective manner. Not doing so lessens the usefulness of results from its testing activities and increases the risk of additional system development failures. VA's Progress Reporting Based on Earned Value Management Data Was Unreliable: Office of Management and Budget (OMB) and VA policies require major projects to use earned value management (EVM) to measure and report progress.[Footnote 23] EVM is a tool for measuring project progress by comparing the value of work accomplished with the amount of work expected to be accomplished. Such a comparison permits actual performance to be evaluated, based on variances from the cost and schedule baselines.[Footnote 24] Identification and reporting of variances and analysis of their causes help program managers determine the need for corrective actions. In addition, the cost performance index (CPI) and schedule performance index (SPI) are indicators of whether work is being performed more or less efficiently than planned. [Footnote 25] Like the variances, reporting of CPI and SPI can provide early warning of potential problems that need correcting to avoid adverse results. For a complete view of program status and an indication of where problems exist, performance data should be reported for both current (generally the most recent month) and cumulative periods. In addition, federal policy requires that systems used to collect and process EVM data be compliant with the industry standard developed by the American National Standards Institute (ANSI) and Electronic Industries Alliance (EIA), ANSI/EIA Standard 748. [Footnote 26] Such compliance is necessary to demonstrate the capability to provide reliable cost and schedule information for earned value reporting. Although VA submitted monthly reports to the department's CIO based on earned value data for the scheduling project, the reliability of the data on which the reports were based was questionable and the reports included data that provided inconsistent views of project performance. Specifically regarding data reliability, department officials did not ensure that the EVM reporting systems for the scheduling project had been certified for compliance with ANSI/EIA Standard 748.[Footnote 27] According to the former program manager, the department did not seek to determine whether its development contractor's system was compliant because SwRI entered cost and schedule data directly into the department's EVM system. Although department officials asserted that this EVM system was compliant with ANSI/EIA Standard 748, the department could not provide documentation of such compliance. Because VA had not demonstrated compliance with the standard, it could not ensure that the data resulting from its EVM system and used for progress reporting were reliable. 
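To illustrate the earned value measures described at the beginning of this discussion, the sketch below computes the cost and schedule variances and the CPI and SPI from planned value, earned value, and actual cost. The dollar figures are hypothetical and are not drawn from the scheduling project's EVM data.

# Hypothetical figures, in millions of dollars.
planned_value = 10.0  # budgeted cost of work scheduled to date
earned_value = 8.0    # budgeted cost of work actually performed to date
actual_cost = 9.0     # actual cost of work performed to date

cost_variance = earned_value - actual_cost        # negative means over cost
schedule_variance = earned_value - planned_value  # negative means behind schedule
cpi = earned_value / actual_cost      # below 1.0: less cost-efficient than planned
spi = earned_value / planned_value    # below 1.0: work is behind schedule

print(f"Cost variance: {cost_variance:+.1f}M  Schedule variance: {schedule_variance:+.1f}M")
print(f"CPI: {cpi:.2f}  SPI: {spi:.2f}")

Indexes below 1.0, as in this example, signal that work is costing more and progressing more slowly than planned, which is the early warning such reporting is intended to provide to managers.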
Regarding EVM reporting, in January 2006, the scheduling project management office began providing monthly reports to the department's CIO that were based on EVM data. However, in addition to being based on data from EVM systems that had not been assessed for compliance with the applicable standard, the progress reports also included contradictory information about project performance. Specifically, the reports featured stoplight (i.e., green for "in control," yellow for "caution," or red for "out of control") indicators, based on the cumulative CPI and SPI.[Footnote 28] These indicators frequently provided a view of project performance that was inconsistent with the reports' narrative comments. For example, the September 2006 report identified cost and schedule performance as green, even though supporting narrative comments stated that the project schedule was to be extended by 9 months due to a delay in performing testing and the need for additional time to repair system defects. The June 2007 report also identified project cost and schedule performance as green, despite the report noting that the project budget was being increased by $3 million so that the development contract could be extended to accommodate schedule delays. Further, the December 2007 report identified cost and schedule performance as green, while at the same time stating that the development contract was to be extended again and that a cost variance would be reported in the near future. This pattern of inconsistent progress reporting continued until October 2008, when the report for that month and all others through August 2009 showed cost and schedule performance as red, which was consistent with the actual state of the project. In discussing this matter, the former program manager stated that the Scheduling Replacement Project complied with the department's EVM policies, but noted that the department performed EVM for the scheduling project only to fulfill the OMB requirement and that the data were not used as the basis for decision making because doing so was not a part of the department's culture.[Footnote 29] Because VA's scheduling project was not managed in accordance with EVM methods that could provide a widely recognized means of reliably determining and reporting cost and schedule performance, the department was not positioned to detect performance shortfalls and initiate timely corrective actions that might have prevented the project's failure. Having EVM reporting that provides a reliable measure of progress will be essential as the department moves forward with its new scheduling project. Major Risks of the Scheduling Project Were Not Identified and Reported: Managing project risks means proactively identifying circumstances that increase the probability of failure to meet commitments and taking steps to prevent them from occurring. Federal guidance and best practices advocate risk management.[Footnote 30] To be effective, risk management activities should include identifying and prioritizing risks as to their probability of occurrence and impact, documenting them in an inventory, and developing and implementing appropriate risk mitigation strategies. By performing these activities, potential problems can be avoided before they become actual cost, schedule, and performance shortfalls. VA established a process for managing the scheduling project's risks that was consistent with relevant best practices. 
Specifically, project officials developed a risk management plan for managing risks to the scheduling project. The plan defined five phases of the risk management process--risk identification, risk analysis, risk response planning, risk monitoring and control, and risk review. The plan also defined risk-related roles and responsibilities for the scheduling project staff and tools to be used to capture identified risks, track their status, and communicate them. In addition, project officials captured identified risks to the scheduling project in an automated tracking tool. Examples of risks identified in the tool included the risk that hardware at sites where the system was to be deployed was incompatible with the new application and another related to SwRI's failure to meet deliverable dates. However, while the department had established a process for managing risks to the scheduling project, it did not have a comprehensive list of risks because it did not take key project risks into account. As previously discussed, we identified problems in VA's approach to managing the project in four major areas--acquisition management, requirements management, system testing, and earned value management. Nevertheless, VA did not identify as risks its weaknesses in the following three project management practices: (1) using a noncompetitive acquisition approach, (2) conducting concurrent testing and initiation of stage two testing with significant defects, and (3) reporting unreliable project cost and schedule performance information. Any one of these risks alone had the potential to adversely impact the outcome of the project. The three of them together dramatically increased the likelihood that the project would not succeed. Since these project management weaknesses were not identified as risks, VA was unable to estimate the significance of their occurrence and decide what steps should be taken to best manage them. Senior project officials indicated that staff members were often reluctant to raise risks or issues to leadership in the Office of Enterprise Development due to the emphasis on keeping the project on schedule. Further, the scheduling program manager recognized that the project management office was inadequately staffed to implement a disciplined risk management process and stated that, in September 2008, a full-time risk manager was added to the staff. As VA continues with its latest scheduling effort, it will be critical that the department identify a comprehensive list of risks so that threats to the project can be detected and mitigated in a timely manner. VA Did Not Conduct Oversight of the Scheduling Replacement Project for 2 Years after Major Problems Occurred: GAO and OMB guidance call for the use of institutional management processes to control and oversee IT investments.[Footnote 31] Critical to these processes are activities to track progress of IT projects, such as milestone reviews that include mechanisms to identify underperforming projects, so that timely steps can be taken to address deficiencies. These reviews should track project performance and progress toward predefined cost and schedule goals, as well as monitor project benefits and exposure to risks. Moreover, these activities should be conducted by a department-level investment review board (or comparable entity) composed of senior executives from the IT office and business units with appropriate authority to address issues when projects are not meeting cost, schedule, and performance goals. 
VA's Enterprise Information Board was established in February 2003 to provide oversight of IT projects through in-process reviews when projects experience problems or variances outside of tolerance levels. Similarly, the Programming and Long-Term Issues Board, established in June 2007 as a result of the IT realignment, is responsible for performing milestone reviews and program management reviews of projects. However, between June 2006 and May 2008, the department did not provide oversight of the Scheduling Replacement Project, even though the department had become aware of significant issues indicating that the project was having difficulty meeting its schedule and performance goals.[Footnote 32] Specifically, in June 2006, the project team found that a delivery of software from SwRI included over 350 defects, leading the office to delay the system deployment by 9 months, from October 2006 to July 2007, to mitigate the defects. A May 2007 report from an independent contractor stated that VA's project management team did not have a clear understanding of the status of the project in terms of progress being made on those defects. Further, a July 2007 review by the Software Engineering Institute found that a test environment had not been developed and that the schedule for testing did not include sufficient time to identify and correct all infrastructure issues. Based on the results of these reviews, the project management office recommended the project be stopped and reevaluated before moving forward. Despite indications of problems with the project, neither the Enterprise Information Board nor the Programming and Long-Term Issues Board conducted reviews between June 2006 and May 2008 that could have identified corrective actions for the Scheduling Replacement Project. In June 2008, the Director of the Office of Enterprise Development requested an operational test readiness review of the replacement scheduling application by the Programming and Long-Term Issues Board to determine if the application was ready for deployment. That review identified issues, including significant critical defects in the application and a lapse in a contract to resolve defects. According to the chairman of the Programming and Long-Term Issues Board, it did not conduct reviews of the scheduling project prior to June 2008 because it was focused on developing the department's IT budget strategy. In June 2009, VA's Assistant Secretary for Information and Technology, who serves as the department's CIO, began establishing a new process for planning and managing its IT projects--the Program Management Accountability System (PMAS). According to the CIO, this process is intended to promote near-term visibility into troubled programs, allowing the department to take corrective actions earlier and avoid long-term project failures. PMAS is expected to improve oversight of IT projects through strict adherence to project milestones and imposing strong corrective measures if a project misses multiple milestones. According to the CIO, under PMAS, projects will be expected to deliver smaller, more frequent releases of new functionality to customers. In addition, specific program resources and documentation are to be in place before development begins, and approved processes are to be used during the system development life cycle. This approach is intended to ensure that customers, project members, and vendors working on a project are aligned, accountable, and have access to the resources necessary to succeed before work begins. 
For a program to be approved for investment under PMAS, the program must have, among other things, an established customer sponsor, a qualified incremental program plan, requirements for three delivery milestones, and documented success criteria. According to the HealtheVet Scheduling program manager, the department expects to develop plans for the new scheduling initiative, required under PMAS, once a strategy for the initiative is selected. However, the department has not yet demonstrated that it can sustain the wholesale change in management of IT projects that PMAS represents or that this new approach will be sufficiently robust to prevent or correct weaknesses such as those that contributed to the Scheduling Replacement Project's failure. Until the department has fully established and effectively implemented the project management controls that are expected to be a component of PMAS, it remains to be seen whether this new approach will be effective in providing oversight to ensure the success of the department's new scheduling effort. Impact of Scheduling Replacement Project Failure on HealtheVet Program is Uncertain: While the Scheduling Replacement Project was one of many components of VA's HealtheVet initiative, the impact of the project's termination on the initiative is currently unclear. The impact is unclear because the relationships (i.e., interdependencies) among the various projects under HealtheVet have not been determined. As described in VA's budget submission for 2011, HealtheVet is the most critical IT development program for medical care, and is expected to enhance and supplement the legacy VistA system using highly integrated health care applications, such as the capability to schedule outpatient appointments. However, the department's efforts have not yet resulted in a finalized plan that outlines what needs to be done and when. As of March 2010, the department had not completed its comprehensive plan and integrated schedule to guide the development and integration of the many projects that make up this departmentwide initiative. According to officials in VA's Office of Information and Technology, the department plans to document the interdependencies, project milestones, and deliverables in an integrated master schedule as part of a project management plan that is expected to be completed by June 2010. In the absence of an overall comprehensive plan for HealtheVet that incorporates critical areas of system development and considers all dependencies and subtasks and that can be used as a means of determining progress, it is difficult to determine how scheduling and other applications will be integrated into this larger HealtheVet system. Likewise, without such a plan, the impact of the terminated scheduling project on the completion of the HealtheVet initiative cannot be determined. Conclusions: After almost a decade of effort, VA has not accomplished what it set out to achieve in replacing its patient scheduling system. A broad range of managerial weaknesses plagued the project from beginning to end and increased the project's risk of failure. Specifically, because the department did not develop and execute an acquisition plan, its acquisition activities were ad hoc and it did not seek to obtain the benefits of competition. Additionally, in defining and managing system requirements, the department did not perform critically important activities such as ensuring that the requirements were complete and sufficiently detailed. 
Further, the department's decision to concurrently conduct tests contributed to an increased risk that the application would not perform as intended, and its earned value management data did not serve as a reliable indicator of project performance. Moreover, even though the department had a plan and process for managing project risks, it did not identify the key risks discussed earlier in this report or take steps to mitigate them. Finally, although the department was aware of major issues with the project through several external reviews, the lack of effective institutional oversight allowed the project to continue unchecked and, ultimately, to fail. Given this situation, the department is starting over and is in the process of analyzing alternative strategies, which will be the basis for a project plan that is to be developed. At the same time, the department is instituting a new approach that is intended to manage and control IT system projects and avoid project failures such as the one that occurred with the Scheduling Replacement Project. Finally, while the scheduling system project was to result in the first component of VA's larger HealtheVet initiative to modernize the department's health information system, the specific impact of the project's failure on this initiative is unclear because HealtheVet plans have not been completed. Until the department effectively implements measures that prevent the types of management weaknesses that plagued its earlier efforts, it risks incurring similar weaknesses in its latest scheduling replacement effort, which could again prevent VA from delivering this important capability for serving the health care needs of veterans and their families. Recommendations for Executive Action: To enhance VA's effort to successfully fulfill its forthcoming plans for the outpatient scheduling system replacement project and the HealtheVet program, we recommend that the Secretary of Veterans Affairs direct the CIO to make certain the following six actions are taken: * Ensure acquisition plans document how competition will be sought, promoted, and sustained or identify the basis of authority for not using full and open competition. * Ensure implementation of a requirements management plan that reflects leading practices for requirements development and management. Specifically, implementation of the plan should include analyzing requirements to ensure they are complete, verifiable, and sufficiently detailed to guide development, and maintaining requirements traceability from high-level operational requirements through detailed low-level requirements to test cases. * Adhere to the department's guidance for system testing, including (1) performing testing incrementally and (2) resolving defects of average and above severity prior to proceeding to subsequent stages of testing. * Ensure effective implementation of EVM by making certain that the: (1) EVM reporting systems for the scheduling project are certified for compliance with ANSI/EIA Standard 748 and data resulting from the systems are reliable; (2) project status reports based on EVM data are reliable in their portrayal of the project's cumulative and current cost and schedule performance; and (3) officials responsible for managing and overseeing the project use earned value data as an input to their decision-making processes. * Identify risks related to the scheduling project moving forward and prepare plans and strategies to mitigate them.
* Ensure that the policies and procedures VA is establishing to provide meaningful program oversight are effectively executed and that they include (1) robust collection methods for information on project costs, benefits, schedule, risk assessments, performance metrics, and system functionality to support executive decision making; (2) the establishment of reporting mechanisms to provide this information in a timely manner to department IT oversight control boards; and (3) defined criteria and documented policies on actions the department will take when development deficiencies for a project are identified. Agency Comments and Our Evaluation: The VA Chief of Staff provided written comments on a draft of this report. In its comments, the department generally agreed with our conclusions, concurred with five of our six recommendations, and described actions to address them. For example, the department stated that it will work closely with contracting officers to ensure future acquisition plans clearly identify an acquisition strategy that promotes full and open competition. In addition, the department stated that its new IT project management approach, PMAS, will provide near-term visibility into troubled programs, allowing the Principal Deputy Assistant Secretary for Information and Technology to provide help earlier and avoid long-term project failures. The department concurred in principle with one of our recommendations: that it ensure effective implementation of EVM. In this regard, the department noted that PMAS requires monthly analysis and reporting of project performance, in addition to VA's project status reporting to OMB and the public. However, the department did not describe its actions to ensure the reliability of project performance data and reports, nor did it explain how it would ensure the use of reliable performance data in managing and overseeing the project under PMAS. Unless the department fully addresses this recommendation, VA may not be positioned to reliably detect performance shortfalls and initiate timely corrective actions that could prevent future project failure. The department also provided technical comments, which we have incorporated in the report as appropriate. The department's written comments are reproduced in appendix II. As agreed with your office, unless you publicly announce its contents earlier, we plan no further distribution of this report until 30 days from the date of this letter. At that time, we will send copies of the report to interested congressional committees, the Secretary of Veterans Affairs, and other interested parties. In addition, the report will be available at no charge on the GAO Web site at [hyperlink, http://www.gao.gov]. If you or your staff have questions about this report, please contact me at (202) 512-6304 or melvinv@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. Key contributors to this report are listed in appendix III. Sincerely, Signed by: Valerie C. Melvin: Director, Information Management and Human Capital Issues: [End of section] Appendix I: Objectives, Scope, and Methodology: The objectives of our study were to (1) determine the status of the Scheduling Replacement Project, (2) determine the effectiveness of the Department of Veterans Affairs' (VA) management and oversight of the project, and (3) assess the impact of the project on VA's overall implementation of its health information system modernization initiative--HealtheVet. 
To determine the status of the Scheduling Replacement Project, we reviewed status briefings on VA's assessment of alternatives for its new scheduling initiative, as well as the department's fiscal year 2011 budget submission. We supplemented these reviews with interviews with the scheduling program manager, the Director of the Office of Enterprise Development, and the Veterans Health Administration Enterprise Systems Manager for the project. To determine the effectiveness of the department's management and oversight of the project, we evaluated its acquisition management, system requirements management, system test management, use of earned value management, management and mitigation of risks, and project oversight and governance processes. To evaluate VA's approach to contracting for the scheduling system, we reviewed and analyzed program documentation, including the Scheduling Replacement Project acquisition plans, contract task orders, statements of work, sole source justifications, and a contracting white paper, to determine the extent to which the agency's practices were consistent with relevant planning and competition requirements in the Federal Acquisition Regulation. Regarding system requirements management, we compared project requirements management practices described in system requirements documents, such as the software requirements specification and project status briefings, to recognized requirements management guidance, such as that included in the Software Engineering Institute's Capability Maturity Model Integration.[Footnote 33] We also assessed the scheduling project requirements management plan and examined the degree to which it was consistent with these leading requirements management practices. To determine the effectiveness of VA's test management, we reviewed the department's guidance for performing system tests and compared project testing activities to this guidance and associated best practices. Specifically, we reviewed documentation of test results to determine the dates testing occurred and the number and severity of defects identified. To evaluate VA's use of earned value management (EVM) to assess and report project performance, we reviewed Office of Management and Budget Memorandum M-05-23, as well as VA standard operating procedures related to EVM, to identify requirements for effective execution of this discipline in assessing project performance. We compared the Scheduling Replacement Project's approach to EVM with recognized practices as described in GAO's Cost Estimating and Assessment Guide, including those in ANSI/EIA Standard 748, issued by the American National Standards Institute (ANSI) and the Electronic Industries Alliance (EIA).[Footnote 34] We reviewed scheduling project reports on earned value performance that were provided to management to determine the level to which these reports provided complete and meaningful cost and schedule performance trends to department management. 
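The following minimal Python sketch, prepared for this text only, illustrates the earned value measures this review examined: the cost and schedule variances described in footnote 24, the cost performance index (CPI) and schedule performance index (SPI) described in footnote 25, and the green, yellow, and red tolerance bands VA reported using for those indexes, as described in footnote 28. The function names and sample dollar figures are assumptions for illustration; they do not represent VA's EVM reporting systems or GAO's analysis tools.

# Hypothetical illustration only; not VA's or GAO's actual software.
# Computes the EVM measures described in footnotes 24, 25, and 28.

def variances(earned_value, actual_cost, planned_value):
    # Cost variance compares the value of completed work with its actual cost;
    # schedule variance compares it with the value of work planned to date.
    # Positive values are favorable; negative values are unfavorable (footnote 24).
    return earned_value - actual_cost, earned_value - planned_value

def performance_indexes(earned_value, actual_cost, planned_value):
    # CPI is the ratio of earned value to actual cost; SPI is the ratio of
    # earned value to planned value (footnote 25).
    return earned_value / actual_cost, earned_value / planned_value

def stoplight(index):
    # Tolerance bands VA reported using in its management reports (footnote 28):
    # green 0.96 through 1.04, yellow 0.90 through 0.95 or 1.05 through 1.10,
    # red below 0.90 or above 1.10.
    if 0.96 <= index <= 1.04:
        return "green"
    if 0.90 <= index <= 1.10:
        return "yellow"
    return "red"

# Illustrative figures only: $4.0 million of work earned against $5.0 million
# actually spent and $4.5 million of work planned to date.
ev, ac, pv = 4.0e6, 5.0e6, 4.5e6
cv, sv = variances(ev, ac, pv)
cpi, spi = performance_indexes(ev, ac, pv)
print(f"Cost variance: ${cv:,.0f}; schedule variance: ${sv:,.0f}")
print(f"CPI: {cpi:.2f} ({stoplight(cpi)}); SPI: {spi:.2f} ({stoplight(spi)})")

In this hypothetical example, a CPI of 0.80 means each dollar spent earned only 80 cents of planned work; both indexes would fall in the red band and signal a project that is out of control. 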
To determine the effectiveness of the management and mitigation of scheduling project risks, we consulted industry guidance on risk mitigation and management, including the Software Engineering Institute's Capability Maturity Model Integration.[Footnote 35] In addition, we reviewed the scheduling project's risk management plan and process, including the Scheduling Replacement Project Risk Management Plan, and determined the level to which the department's plans and processes met industry best practices and were executed to identify risks. Further, we examined the department's risk inventory to determine whether project risks we found during our review had been identified and considered by VA. To assess the effectiveness of scheduling project oversight and governance, we reviewed GAO guidance on effective project oversight, including our Information Technology Investment Management Framework;[Footnote 36] analyzed documentation from department oversight entities that existed over the course of the project, including the Enterprise Information Board and the Programming and Long-Term Issues Board; and determined the extent to which these bodies performed effective oversight of the project. In addition to the actions just described, we supplemented our analysis by interviewing cognizant VA and contractor officials, including the VA Chief Information Officer, current and former program managers, project team members, representatives from the Veterans Health Administration, the department's contracting officer for the project, and the Director of the Office of Enterprise Development. To assess the impact of the scheduling project on VA's overall implementation of its health information system modernization initiative, we reviewed documentation such as briefings from HealtheVet planning meetings and interviewed cognizant officials, including the Medical Care Program Executive Officer in the Office of Information and Technology and the Director of Health Information Systems in the Office of Enterprise Development, about the status of the HealtheVet initiative. We conducted this performance audit at VA headquarters in Washington, D.C., from May 2009 through May 2010, in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. [End of section] Appendix II: Comments from the Department of Veterans Affairs: Department of Veterans Affairs: Office of the Secretary: May 11, 2010: Ms. Valerie C. Melvin: Director, Information Management and Human Capital Issues: U.S. Government Accountability Office: 441 G Street, NW: Washington, DC 20548: Dear Ms. Melvin: The Department of Veterans Affairs (VA) has reviewed the Government Accountability Office's (GAO) draft report, Information Technology: Management Improvements Are Essential to VA's Second Effort to Replace Its Outpatient Scheduling System (GAO-10-579) and generally agrees with GAO's conclusions and concurs with five and concurs in principle with one of GAO's recommendations to the Department. The enclosure addresses GAO's recommendations and provides technical comments to the draft report. VA appreciates the opportunity to comment on your draft report. Sincerely, Signed by: John R. 
Gingrich: Chief of Staff: Enclosure: [End of letter] Enclosure: Department of Veterans Affairs (VA) Comments to Government Accountability Office (GAO) Draft Report: Information Technology: Management Improvements Are Essential to VA's Second Effort to Replace Its Outpatient Scheduling System (GAO-10-579): GAO recommendation: To enhance VA's efforts to successfully fulfill its forthcoming plans for the outpatient scheduling system replacement project and the HealtheVet program, we recommend that the Secretary of Veterans Affairs direct the CIO to make certain the following six actions are taken: Recommendation 1: Ensure acquisition plans document how competition will be sought, promoted, and sustained or identify the basis of authority for not using full and open competition. Response: Concur. VA will work closely with contracting officers to ensure future acquisition plans clearly identify an acquisition strategy that promotes, and gives preference to, full and open competition. Specifically, the Technical Acquisition Center (TAC) will be central to all acquisition efforts. Recommendation 2: Ensure implementation of a requirements management plan that reflects leading practices for requirements development and management. Specifically, implementation of the plan should include analyzing requirements to ensure they are complete, verifiable, and sufficiently detailed to guide development, and maintaining requirements traceability from high-level operational requirements through detailed low-level requirements to test cases. Response: Concur. Involving the business and technical communities, VA initiated a complete review of all business requirements and is working towards ensuring the requirements are complete, verifiable, and sufficiently detailed to guide development. All requirements (functional and technical) are closely managed and documented using documented, repeatable processes located in Office of Enterprise Development (OED) ProPath. OED ProPath is a Web-based, comprehensive map mandating standardized, end-to-end, repeatable processes and project documentation for all IT development projects. Recommendation 3: Adhere to the department's guidance for system testing including (1) performing testing incrementally and (2) resolving defects of average and above severity prior to proceeding to subsequent stages of testing. Response: Concur. All testing is closely managed and documented using documented, repeatable processes located in OED ProPath. Recommendation 4: Ensure effective implementation of EVM by making certain that the (1) EVM reporting systems for the scheduling project are certified for compliance with ANSI/EIA Standard 748 and data resulting from the systems are reliable; (2) project status reports based on EVM data are reliable in their portrayal of the project's cumulative and current cost and schedule performance; and (3) officials responsible for managing and overseeing the project use earned value data as an input to their decision-making processes. Response: Concur in principle. Each project in the Office of Information and Technology is transitioning to an incremental delivery model. VA's Program Management Accountability System (PMAS) incorporates industry best practices relating to software development. As such, VA intends to adopt the industry best practices for management, metrics, and reporting that are inherent to these new, risk-reducing software development methods. 
PMAS-related processes in VA's implementation of ProPath require project management analysis of performance as well as examination of both performance and performance variances in monthly reports to Executive Management. VA is already engaged in fully transparent reporting of project status to the Office of Management and Budget and the public. VA is also looking for additional approaches to enhance transparency to the public and strengthen stewardship of public funds. Recommendation 5: Identify risks related to the scheduling project moving forward and prepare plans and strategies to mitigate them. Response: Concur. VA developed a comprehensive Analysis of Alternatives (AoA) to examine the benefits and risks associated with several alternatives for meeting the patient scheduling needs of the business community. The AoA closely examined acquisition, development, deployment, and sustainment risks and identified mitigation strategies for consideration as part of the VA decision process. Additionally, the HealtheVet Scheduling Program Office employs a full-time Risk Manager and has implemented a formal, active, and thorough Risk Management Program. The Risk Management Program includes regular program reviews to evaluate and manage risks. Recommendation 6: Ensure that the policies and procedures VA is establishing to provide meaningful program oversight are effectively executed and that they include (1) robust collection methods for information on project costs, benefits, schedule, risk assessments, performance metrics, and system functionality to support executive decision making; (2) the establishment of reporting mechanisms to provide this information in a timely manner to department IT oversight control boards; and (3) defined criteria and documented policies on actions the department will take when development deficiencies for a project are identified. Response: Concur. VA instituted PMAS as a rigorous management approach to addressing performance shortcomings. PMAS delivers smaller, more frequent releases of new functionality to customers, ensuring that customers, project managers, and vendors working on a project are aligned and accountable and have access to necessary resources before work begins. PMAS mandates that specific program resources and documentation be in place before development begins and mandates that approved processes be used during the system development life cycle (SDLC). PMAS provides near-term visibility into troubled programs, allowing the Principal Deputy Assistant Secretary for Information and Technology to provide help earlier and avoid long-term project failures. Frequent software deliveries allow customers to provide earlier feedback on system functionality, eliminate "big bang" program/project failures, and increase the probability of successfully developing and deploying VA IT systems. To address shortfalls in project documentation and process controls, VA identified and documented key SDLC processes in OED ProPath. OED ProPath is a Web-based, comprehensive map mandating standardized, end-to-end, repeatable processes and project documentation for all IT development projects. [End of section] Appendix III: GAO Staff Contact and Acknowledgments: GAO Contact: Valerie C. Melvin, (202) 512-6304 or melvinv@gao.gov: Staff Acknowledgments: In addition to the contact named above, key contributions to this report were made by Mark T. Bird, Assistant Director; Carol Cha; Shaun Byrnes; Neil Doherty; Rebecca Eyler; Michele Mackin; Lee McCracken; Constantine J. 
Papanastasiou; Michael W. Redfern; J. Michael Resser; Sylvia Shanks; Kelly Shaw; Eric Trout; Adam Vodraska; and Merry Woo. [End of section] Footnotes: [1] See GAO, Veterans Affairs: Health Information System Modernization Far from Complete; Improved Project Planning and Oversight Needed, [hyperlink, http://www.gao.gov/products/GAO-08-805] (Washington, D.C.: June 30, 2008). In this report, we highlighted VA's progress toward modernizing its medical information system, developing plans for completing the project, and providing oversight of the project. However, we noted that VA lacked a comprehensive project management plan to guide remaining work and a complete governance structure for HealtheVet. [2] In 1995, VA shifted management authority from its headquarters to new regional management structures--VISNs. VA created 22 VISNs, each led by a director and a small staff of medical, budget, and administrative officials. The VISNs have been configured around historic referral patterns to VA's tertiary care medical centers. These networks have substantial operational autonomy and now perform the basic decision-making and budgetary duties of the VA health care system. The network office in each VISN oversees the operations of the medical centers in its area and allocates funds to each of them. [3] CASUs are a network of federal entrepreneurial organizations that provide a full range of "best value" administrative support services to federal agencies throughout the United States and overseas on a cost-reimbursable basis. [4] SwRI is an independent, nonprofit applied research and development organization. [5] This centralization was to provide greater authority and accountability over the department's resources by centralizing IT management under the department-level Chief Information Officer and standardizing operations and systems development across the department using new management processes based on industry best practices. [6] Department of Veterans Affairs, VA IT Governance Plan, version 8.3, March 12, 2007. [7] This board became operational in May 2007 and was originally named the Business Needs and Investment Board. This board is chaired by the VA Principal Deputy Assistant Secretary for Information and Technology. [8] This board became operational in June 2007 and was originally named the Planning, Architecture, Technology, and Services Board. This board is chaired by the VA Deputy Assistant Secretary for Information and Technology Enterprise Strategy, Policy, Plans, and Programs. [9] This board became operational in June 2007 and is chaired by the VA Assistant Secretary for Information and Technology. [10] [hyperlink, http://www.gao.gov/products/GAO-08-805]. [11] VA's fiscal year 2010 budget estimate for HealtheVet Scheduling is $10 million. [12] See FAR, subpart 7.1. See also FAR 34.004. [13] See FAR 39.102 and GAO, Defense Acquisitions: Managing Risk to Achieve Better Outcomes, GAO-10-374T (Washington, D.C.: Jan. 20, 2010). [14] See FAR 7.105 b(2). [15] See FAR, part 6. [16] See FAR 8.404. [17] See FAR 8.405-2, et seq., as added by Federal Acquisition Circular 2001-24, FAR Case 1999-603; Item V, 69 Fed. Reg. 34231 (June 18, 2004). [18] See FAR 39.102 and Carnegie Mellon Software Engineering Institute, Capability Maturity Model® Integration for Development, Version 1.2 (Pittsburgh, Pa., August 2006), and Software Acquisition Capability Maturity Model (SA-CMM) version 1.03, CMU/SEI-2002-TR-010 (Pittsburgh, Pa., March 2002). 
[19] GAO, Secure Border Initiative: DHS Needs to Address Significant Risks in Delivering Key Technology Investment, [hyperlink, http://www.gao.gov/products/GAO-08-1086] (Washington, D.C.: Sept. 22, 2008). [20] Defects are system problems that require a resolution and can be due to a failure to meet the system specifications. [21] According to VA testing documentation, these stages are (1) testing within the VA development team, (2) testing services, (3) field testing, and (4) final review and acceptance testing. [22] VA's Defect Control Process identifies four severity levels: (1) critical, serious errors that cause system crashes, loss of data, or loss of overall system functionality without a workaround; (2) major, serious errors that impair major system function for which no workaround is available or for which only workarounds of more than three user steps are available; (3) average, errors in daily operations, serious errors with workarounds with less than three user steps; and (4) minor, cosmetic, and documentation errors such as misspellings, field alignments, and missing fly-over text. [23] OMB issued policy guidance (M-05-23) to agency CIOs on improving technology projects that includes requirements for reporting performance to OMB using EVM (August 2005). VA, VA EVM System, VA Directive 6061, (February 2006). VA, Standard Operating Procedures (SOP) for EVM Reporting and Analysis, EVM System SOP 7, (February 2007), VA, Primavera SOP, SOP 015: EVM, (February 2005). See also FAR, Subpart 34.2. [24] Cost variances compare the value of the completed work (i.e., the earned value) with the actual cost of the work performed. Schedule variances are also measured in dollars, but they compare the earned value of the completed work with the value of the work that was expected to be completed. Positive variances indicate that activities cost less or are completed ahead of schedule. Negative variances indicate activities cost more or are falling behind schedule. [25] CPI is the ratio of earned value to actual costs; SPI is the ratio of earned value to planned value. [26] See OMB, Capital Programming Guide, II.2.4, Establishing an Earned Value Management System. Reflected in FAR, subpart 34.2. [27] Typically, an independent organization conducts the compliance review of an EVM system. Upon successful completion of the review, system acceptance should be documented, showing how each of the 32 ANSI/EIA Standard 748 guidelines has been satisfied. [28] According to relevant best practices, indexes that are in control are indicated by the color green. If the project goes into the caution area, it is indicated by yellow and the project manager needs to get involved to prevent the project from entering the out of control area, which is designated by the color red. A project that stays in the red area is considered to be out of control. In the reports to management, VA set the tolerances for CPI and SPI to be green if the index is 0.96 through 1.04, yellow if it is 0.90 through 0.95 or 1.05 through 1.10, and red if it is less than 0.90 or greater than 1.10. [29] See GAO, Information Technology: Agencies Need to Improve the Implementation and Use of Earned Value Techniques to Help Manage Major System Acquisitions, [hyperlink, http://www.gao.gov/products/GAO-10-2] (Washington, D.C.: Oct. 8, 2009). 
In this report, we assessed VA's earned value management approach for its VistA-Foundations Modernization program, which addressed the need to transition the department's electronic medical record system to a new architecture. We found that the program's EVM reports did not offer adequate detail to provide insight into data reliability issues and earned value data was not used for decision making. [30] OMB Circular A-130 (Nov. 30, 2000) and Carnegie Mellon Software Engineering Institute, Capability Maturity Model Integration for Development, Version 1.2 (Pittsburgh, Pa., August 2006). [31] GAO, Information Technology Investment Management: A Framework for Assessing and Improving Process Maturity, [hyperlink, http://www.gao.gov/products/GAO-04-394G] (Washington, D.C.: March 2004) and Office of Management and Budget, Capital Programming Guide: Supplement to Circular A-11, Part 7, Planning, Budgeting, and Acquisition of Capital Assets (Washington, D.C., June 2006). [32] Oversight of the project was the responsibility of the Scheduling Replacement Board of Directors from 2000-2004 and occurred through annual reviews of the project. [33] Carnegie Mellon Software Engineering Institute, Capability Maturity Model Integration for Development, Version 1.2 (Pittsburgh, Pa., August 2006). [34] GAO, GAO Cost Estimating and Assessment Guide: Best Practices for Developing and Managing Capital Program Costs, [hyperlink, http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: March 2009). [35] Carnegie Mellon Software Engineering Institute, Capability Maturity Model Integration for Development, Version 1.2 (Pittsburgh, Pa., August 2006). [36] GAO, Information Technology Investment Management: A Framework for Assessing and Improving Process Maturity, [hyperlink, http://www.gao.gov/products/GAO-04-394G], version 1.1 (Washington, D.C.: March 2004). [End of section] GAO's Mission: The Government Accountability Office, the audit, evaluation and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability. Obtaining Copies of GAO Reports and Testimony: The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each weekday, GAO posts newly released reports, testimony, and correspondence on its Web site. To have GAO e-mail you a list of newly posted products every afternoon, go to [hyperlink, http://www.gao.gov] and select "E-mail Updates." Order by Phone: The price of each GAO publication reflects GAO’s actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO’s Web site, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information. 
To Report Fraud, Waste, and Abuse in Federal Programs: Contact: Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: E-mail: fraudnet@gao.gov: Automated answering system: (800) 424-5454 or (202) 512-7470: Congressional Relations: Ralph Dawn, Managing Director, dawnr@gao.gov: (202) 512-4400: U.S. Government Accountability Office: 441 G Street NW, Room 7125: Washington, D.C. 20548: Public Affairs: Chuck Young, Managing Director, youngc1@gao.gov: (202) 512-4800: U.S. Government Accountability Office: 441 G Street NW, Room 7149: Washington, D.C. 20548: