This is the accessible text file for GAO report number GAO-12-720R entitled 'Schedule Best Practices Provide Opportunity to Enhance Missile Defense Agency Accountability and Program Execution' which was released on July 19, 2012.

This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

GAO-12-720R:

United States Government Accountability Office:
Washington, DC 20548:

July 19, 2012:

Lt. Gen. Patrick J. O'Reilly:
Director:
Missile Defense Agency:
5700 18th Street:
Fort Belvoir, VA 22060:

Subject: Schedule Best Practices Provide Opportunity to Enhance Missile Defense Agency Accountability and Program Execution:

Dear Lt. Gen. O'Reilly:

During the course of our annual assessment of the Missile Defense Agency's (MDA) ongoing cost, schedule, testing, and performance progress for the Ballistic Missile Defense System (BMDS),[Footnote 1] we performed a detailed analysis comparing the schedules for five MDA programs--the Standard Missile-3 (SM-3) Block IIA, Aegis Ashore, Ground-based Midcourse Defense (GMD), Precision Tracking Space System (PTSS), and the Targets and Countermeasures Extended Medium-Range Ballistic Missile (eMRBM) target--to best practices GAO has identified for schedule development.[Footnote 2] We did not report on this detailed analysis, which has implications for the transparency and accountability of MDA programs, in the April 2012 review; instead, we are providing in this report (1) the results of how the MDA program schedules compare to the nine best practices and (2) a summary of how these MDA program results compare to the results of analysis GAO has conducted of program schedules in other agencies. During our review, management officials for the MDA programs we reviewed expressed a willingness to learn from best practices for scheduling and identified steps they were already taking to address deficiencies we identified.

We selected the five MDA programs based on recent congressional interest, program budget and trends, program phase, value to other programs or GAO work, and program variety. Four of the five programs we reviewed had schedules owned and maintained by contractors rather than by the government. Of the five programs we selected, only Aegis Ashore, which does not have a prime contractor, maintained a government schedule.
In addition, we were unable to conduct the analysis on GMD's prime contract due to technical difficulties with the software used to maintain the program schedule and therefore selected a different GMD contract that includes an upcoming flight test and post-test analysis for this program. Finally, PTSS was in the early stages of development at the time of our review, which affected the program's ability to develop its schedule.

In performing our analyses, we determined the extent to which each schedule was prepared in accordance with best practices that GAO previously has identified as fundamental to having a reliable schedule. GAO's Schedule Assessment Guide includes 10 such best practices; however, at the time of our review, the 10th best practice--maintaining a baseline schedule--was not fully developed. As a result, we included only the first 9 best practices in our analysis. We then characterized the extent to which each of the nine scheduling best practices was met; that is, we rated each practice as not met, minimally met, partially met, substantially met, or fully met.[Footnote 3] We shared the criteria against which we evaluated the programs' schedule estimates and our preliminary findings with program management officials. We then discussed our preliminary assessment results with the officials and lead schedulers for the programs. When warranted, we updated our analyses based on the agency response and additional documentation provided to us. Finally, we met with MDA to discuss our final results with the officials and lead schedulers for the programs.

Our analysis of these five programs cannot be used to make general statements about the schedule practices of MDA programs as a whole. Further, many of the schedules we reviewed were partial schedules; for example, for GMD, we reviewed a contractor schedule that included only the activities related to an upcoming flight test and post-test analysis and not the entire GMD program due to technical software issues. The SM-3 Block IIA program developed its schedule in connection with the period of performance of its existing contract, which means that its schedule was developed for several months at a time instead of for the life of the program. As a result, our findings on the specific program schedules are limited to the particular schedule we reviewed, which may not include all activities the program will carry out over its life. It should be noted, however, that regardless of the length of the schedule we reviewed--and with the exception of the first best practice, capturing all activities--each program had an equal opportunity to meet best practices because our analysis focused on the content of the schedule we reviewed rather than on the overall schedule for each program. Consequently, the main value of our analysis is to highlight areas where MDA could improve on its existing scheduling practices in part and in whole.

We conducted this performance audit from August 2011 to July 2012 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.
Results in Brief:

Based on our analysis, none of the five MDA schedules we reviewed fully met all nine of the schedule best practices, including the practice of capturing all activities. The schedules were inconsistent in meeting best practices, and some had major deficiencies. These results are significant because a reliable schedule is one key factor that indicates a program is likely to achieve its planned outcomes. Our analysis suggests that the estimated time frames and costs of these programs are not reliable or that the programs are missing information that could make them more efficient. The MDA schedule results are similar to those of other agencies that GAO has analyzed. We are recommending actions that would better ensure compliance with schedule best practices for the five programs reviewed as well as for MDA programs over the long term. The Department of Defense (DOD) concurred with our recommendations.

Background:

We have reported for years on a range of knowledge-based acquisition practices that provide a systematic and disciplined method to deliver promised capabilities within estimated costs and schedules.[Footnote 4] We also have consistently reported that MDA programs have had troubled acquisition histories at least in part due to not following these practices. In April 2012, we reported that many MDA acquisition programs have a concurrent schedule in which there is overlap between technology development and product development, or product development and the production of a system.[Footnote 5] Such a strategy forces decision makers to make key decisions without adequate information about the weapon's demonstrated operational effectiveness, reliability, logistic supportability, and readiness for production. For example, GMD's concurrent acquisition approach allowed it to rapidly field a limited defense capability, but it resulted in performance shortfalls, unexpected cost increases, schedule delays, test problems, and expensive retrofit programs. We also have reported that MDA program schedules are optimistic and frequently change. Because a knowledge-based approach has been found to be important to successfully executing a program, the problems faced by MDA programs can be attributed, at least in part, to the programs not following an approach in which knowledge about program capabilities precedes key program commitments.[Footnote 6]

In addition to our prior work on knowledge-based acquisition practices, we published our Cost Estimating and Assessment Guide, which provides tools for promoting effective program management across the federal government.[Footnote 7] In this guide, we identified best practices associated with effective schedule estimating. These best practices have been further refined and explained in our May 2012 Schedule Assessment Guide.[Footnote 8] Given that a schedule reflects the day-to-day effort necessary to carry out a program, it is an effective tool for program oversight. We previously have reported that the success of a large-scale acquisition program depends in part on having a reliable schedule that defines, among other things, when work activities and milestone events will occur, how long they will take, and how they are related to one another. As such, the schedule not only provides a road map for systematic program execution but also provides the means by which to gauge progress, identify and address potential problems, and promote accountability. Without a reliable schedule, it is likely that established program milestones will slip. (The illustrative sketch below shows, in simplified form, the kind of activity information a schedule records.)
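To make these concepts concrete, the following minimal sketch--written in Python, with invented activity names, durations, and links that are not drawn from any MDA schedule--shows the kind of information a schedule records for each activity and how incomplete network logic can be flagged:

# A minimal, hypothetical record of schedule activities. Durations are
# in working days; links are finish-to-start. Illustrative only.
activities = {
    "award contract":  {"duration": 0,  "preds": [],                 "succs": ["build hardware"]},
    "build hardware":  {"duration": 60, "preds": ["award contract"], "succs": ["flight test"]},
    "write test plan": {"duration": 30, "preds": [],                 "succs": []},
    "flight test":     {"duration": 5,  "preds": ["build hardware"], "succs": []},
}
FINISH = "flight test"  # the one activity legitimately lacking a successor

# Flag incomplete logic: an activity with no predecessor has an
# unsupported start date, and one with no successor cannot push its
# delays downstream (a "dangling" activity).
for name, act in activities.items():
    if not act["preds"] and act["duration"] > 0:
        print(f"'{name}' has no predecessor; its start date is unsupported")
    if not act["succs"] and name != FINISH:
        print(f"'{name}' has no successor; slips here ripple nowhere")

[End of code example]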
Table 1 lists each of the 10 schedule best practices as well as a brief description of each best practice. Our analysis includes only the first 9 of these best practices.

Table 1: Summary of Schedule Best Practices:

Best practice: 1. Capturing all activities;
Description: The schedule is an integrated master schedule that reflects all activities, including both government and contractor activities.

Best practice: 2. Sequencing all activities;
Description: Activities are logically sequenced in the order they are to be carried out. The schedule identifies activities that must finish before others start, or predecessor activities, and activities that cannot begin until others have finished, or successor activities.

Best practice: 3. Assigning resources to all activities;
Description: The schedule realistically reflects the resources needed to do the work, whether resources will be available when needed, and if there are any constraints in funds or time.

Best practice: 4. Establishing the duration of all activities;
Description: The schedule realistically reflects how long it will take to execute each activity; the duration should be as short as possible and have specific start and end dates that are estimated under normal, not optimal, conditions.

Best practice: 5. Verifying that the schedule is traceable horizontally and vertically;
Description: The schedule should link associated products, outcomes, and activities, which demonstrates that the schedule can be traced horizontally. The schedule also should be traceable vertically, meaning that lower-level schedules are clearly traced to higher-level schedule events.

Best practice: 6. Confirming that the critical path is valid;
Description: The schedule should identify the critical path--the path of longest duration through the sequenced list of activities--and confirm that it is valid.

Best practice: 7. Ensuring reasonable total float;
Description: The schedule should identify a reasonable time, or total float, that a predecessor activity can slip before the delay affects successor activities.

Best practice: 8. Conducting a schedule risk analysis;
Description: The program should use data about project schedule risks and opportunities to predict the amount of risk associated with meeting the planned completion date and to identify high-priority risks and opportunities.

Best practice: 9. Updating schedule using logic and progress;
Description: The schedule should be updated using a documented and consistently applied process.

Best practice: 10. Maintaining a baseline schedule;
Description: A baseline schedule is the basis for managing the project scope, the time period for accomplishing it, and the required resources.

Source: GAO.

[End of table]

These best practices call for a program schedule to cover an entire program--that is, it should have an integrated master schedule (IMS) that includes the integrated breakdown of the work both the government and its contractors will perform over the program's expected life.[Footnote 9] Best practices also call for the schedule to expressly identify and define the relationships and dependencies of the work activities and any constraints affecting their start and completion. A reliable schedule shows when major events are expected as well as the completion dates for all activities leading up to them. This inclusion helps determine whether the program's parameters are realistic and its goals can be achieved. (The sketch following this paragraph illustrates how the critical path and total float described in best practices 6 and 7 are calculated from a schedule's durations and logic.)
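As an illustration of best practices 6 and 7, the minimal sketch below--again in Python, using an invented four-activity network rather than data from any program we reviewed--performs the forward and backward passes that scheduling software uses to identify the critical path and compute total float:

# Hypothetical activity network: durations in working days,
# finish-to-start predecessor logic. Illustrative only.
activities = {
    "A": {"duration": 10, "preds": []},
    "B": {"duration": 20, "preds": ["A"]},
    "C": {"duration": 5,  "preds": ["A"]},
    "D": {"duration": 15, "preds": ["B", "C"]},
}

# Forward pass: earliest start and finish for each activity
# (insertion order here is already topological).
early = {}
for name, act in activities.items():
    es = max((early[p][1] for p in act["preds"]), default=0)
    early[name] = (es, es + act["duration"])
project_finish = max(ef for _, ef in early.values())

# Backward pass: latest start and finish that avoid delaying the project.
late = {}
for name in reversed(list(activities)):
    succs = [s for s, a in activities.items() if name in a["preds"]]
    lf = min((late[s][0] for s in succs), default=project_finish)
    late[name] = (lf - activities[name]["duration"], lf)

# Total float is the slip an activity can absorb before delaying the
# finish date; zero-float activities form the critical path.
for name in activities:
    total_float = late[name][0] - early[name][0]
    status = "on critical path" if total_float == 0 else f"total float = {total_float} days"
    print(f"{name}: earliest finish day {early[name][1]}, {status}")

[End of code example]

In this example the critical path is A-B-D (45 working days), while activity C carries 15 days of total float. As the detailed results in enclosure II suggest, a large share of activities with very high float usually signals missing or broken logic rather than genuine schedule flexibility.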
Further, a schedule's reliability determines the credibility of the program's forecasted dates for decision making. Since a well-defined schedule helps identify the human capital and fiscal resources needed to execute a program, it is an important contribution to a reliable cost estimate. In addition, the schedule should be properly updated to identify when schedule variances will affect future work. The scheduling best practices are interrelated, so that deficiencies in one best practice will cause deficiencies in other best practices. For example, if the schedule does not capture all activities, then there will be uncertainty about whether activities are sequenced in the correct order and whether the schedule properly reflects the resources needed to accomplish the work.

MDA's Schedules Are Not Aligned with Best Practices and Are at Risk of Delays and Inefficiencies:

Based on our analysis of how five programs have constructed and maintained their schedules, there were mixed results in meeting the nine best practices for each of the program schedules we reviewed. Table 2 summarizes the results of our review of the five programs.

Table 2: Best Practices Assessment of MDA Program Schedules:

Best practice: 1. Capturing all activities;
SM-3 Block IIA: Minimally; Aegis Ashore: Partially; GMD: Minimally; PTSS: Partially; eMRBM: Partially.

Best practice: 2. Sequencing all activities;
SM-3 Block IIA: Partially; Aegis Ashore: Minimally; GMD: Partially; PTSS: Minimally; eMRBM: Substantially.

Best practice: 3. Assigning resources to all activities;
SM-3 Block IIA: Substantially; Aegis Ashore: Did not meet; GMD: Substantially; PTSS: Partially; eMRBM: Substantially.

Best practice: 4. Establishing the duration of all activities;
SM-3 Block IIA: Partially; Aegis Ashore: Substantially; GMD: Fully; PTSS: Substantially; eMRBM: Substantially.

Best practice: 5. Verifying the schedule is traceable horizontally and vertically;
SM-3 Block IIA: Substantially; Aegis Ashore: Partially; GMD: Substantially; PTSS: Partially; eMRBM: Substantially.

Best practice: 6. Confirming the critical path is valid;
SM-3 Block IIA: Partially; Aegis Ashore: Minimally; GMD: Fully; PTSS: Minimally; eMRBM: Substantially.

Best practice: 7. Ensuring reasonable total float;
SM-3 Block IIA: Substantially; Aegis Ashore: Minimally; GMD: Substantially; PTSS: Minimally; eMRBM: Partially.

Best practice: 8. Conducting a schedule risk analysis;
SM-3 Block IIA: Minimally; Aegis Ashore: Minimally; GMD: Partially; PTSS: Did not meet; eMRBM: Partially.

Best practice: 9. Updating the schedule using actual progress and logic;
SM-3 Block IIA: Fully; Aegis Ashore: Partially; GMD: Fully; PTSS: Did not meet; eMRBM: Substantially.

Source: GAO analysis of MDA data.

Note: "Fully met" means the program provided evidence that completely satisfies the best practices criterion. "Substantially" means the program provided evidence that satisfies a large portion of the criterion. "Partially" means the program provided evidence that satisfies about half of the criterion. "Minimally" means the program provided evidence that satisfies a small portion of the criterion. "Not met" means the program provided no evidence that satisfies any portion of the criterion.

[End of table]

Overall, none of the five programs had an integrated master schedule for the entire length of acquisition as called for by the first best practice, meaning the programs are at risk for unreliable completion estimates and delays.
An IMS should reflect all activities to be performed by the government and the contractor. If a project schedule does not fully and accurately reflect the project, it will not serve as an appropriate basis for analysis and may result in unreliable completion dates, time extension requests, and delays. Further, the failure to fully meet the first best practice by capturing all activities in the schedule raises uncertainties about how well the schedule meets other schedule best practices.

Provided below are descriptions of each of the programs we reviewed, including important details of the schedule we analyzed, a summary of the results of the analysis for each program, and a discussion of information provided by the programs in response to our analysis. For all programs, we analyzed the most recent schedule as of September 1, 2011. The full results of this analysis are available in enclosure II.

SM-3 Block IIA:

* This program is developing the third SM-3 variant, for use with the sea-based and future land-based Aegis Ballistic Missile Defense (BMD).[Footnote 10] It began in 2006 as a joint development with Japan, and it was added to the European Phased Adaptive Approach (PAA) when that approach for the missile defense of Europe was announced in 2009. As part of European PAA Phase III, the SM-3 Block IIA is planned to be fielded with Aegis Weapons System version 5.1 by the 2018 time frame. We assessed a portion of the SM-3 Block IIA contractor's schedule that spans from April 2011 to November 2011. Program management officials told us that they had not developed a complete program schedule as of September 2011 because they did not yet have a contract that spans the life of the program. Currently, the program's schedule is limited to the period of performance of the current contract.

* The program fully met one best practice--updating the schedule--substantially met three best practices, partially met three, and minimally met two. Based on these results, the program may not have a feasible schedule, sufficiently understand the amount of risk associated with meeting the planned completion date, or have the insight necessary to properly allocate resources to tasks and understand how those tasks affect later work.

* In response to our analysis, SM-3 Block IIA program management officials stated they plan to develop an integrated master schedule for the remainder of the program when its completion contract is finalized.

Aegis Ashore:

* This program is a future land-based variant of the ship-based Aegis BMD. It is expected to track and intercept ballistic missiles in their midcourse phase of flight using SM-3 interceptor variants as they become available. Key components include a vertical launch system and a reconstitutable enclosure that houses the SPY-1 radar and command and control system. DOD plans to deploy the first Aegis Ashore with SM-3 Block IB in the 2015 time frame as part of the European PAA. The schedule we assessed for the Aegis Ashore program began in October 2009 and ends in February 2016, meaning it contains the program events related to the deployment of the first Aegis Ashore. It does not include activities needed to develop the second Aegis Ashore that is aligned with the 2018 European PAA time frame. According to program management officials, Aegis Ashore is not like other MDA acquisition efforts because it does not have a prime contractor leading the development of the program and managing the schedule.
Instead of developing a new product, the program largely is integrating existing components from multiple ongoing government contracts into Aegis Ashore installations. In addition, the officials said the program was unable to include some information in its schedule due to contract competition considerations.

* Aegis Ashore did not meet one best practice--assigning resources to all activities--substantially met one best practice, partially met three, and minimally met four. These results suggest that, because the program does not assign resources to all activities, the program's ability to produce a high-quality cost estimate is limited. Also, based on this analysis, the program may have limited schedule flexibility, reducing its ability to allocate resources from non-critical activities to activities that will affect the project finish date if they are delayed.

* In commenting on the outcome of our analysis, Aegis Ashore program management officials provided information showing that they had worked to improve scheduling practices in many areas, including reviewing the sequencing of activities in their schedule, dividing activities with a long duration into multiple tasks, and taking actions to improve the reliability and traceability of the schedule. Program management officials stated they do not have the personnel necessary to assign resources to all activities.

GMD:

* This ground-based missile defense system is designed to destroy intermediate and intercontinental ballistic missiles during the midcourse phase of their flight. Its mission is to protect the U.S. homeland against ballistic missile attacks from North Korea and the Middle East. GMD has two ground-based interceptor variants--the Capability Enhancement I and the Capability Enhancement II. MDA has emplaced its total planned inventory of 30 interceptors at two missile field sites--Fort Greely, Alaska, and Vandenberg, California. We assessed the portion of the GMD schedule from January 2009 to June 2013 that contains only the program events leading up to an upcoming flight test and the post-test analysis. This schedule is not the prime GMD contract schedule, contains mostly contractor work, and does not reflect the entire GMD schedule.

* For the limited schedule we were able to assess, GMD fully met three best practices, substantially met three, partially met two, and minimally met one. Despite this portion of the schedule fully meeting three best practices, the program is still at risk of not sufficiently understanding the amount of risk associated with meeting the planned completion date and may miss opportunities to promote efficiency.

* In response to our analysis, GMD program management officials stated they plan to develop an integrated master schedule for the remainder of the program.

PTSS:

* This program is being developed as an operational component of the BMDS designed to support intercept of regional medium- and intermediate-range ballistic missile threats to forces and allies and long-range threats to the United States. PTSS is expected to track large missile raid sizes after booster burn-out, which could enable earlier intercepts. The program schedule as of September 2011 that we used in our assessment spanned from September 2010 to March 2017, was in its early phases as the program had only recently received funding, and is maintained by the contractor.

* PTSS substantially met one best practice, partially met three, minimally met three, and did not meet two best practices.
Based on this analysis, the program is at risk of not knowing the true performance of the program, not sufficiently understanding the amount of risk associated with meeting the planned completion date, and being unable to determine which tasks will have detrimental effects on the project finish date if they are delayed, and it may not be able to demonstrate that the overall schedule is rational or planned in a logical sequence.

* In response to our analysis, PTSS program management officials stated they plan to have an IMS that includes contractor and government activities when they award a contract for the program.

eMRBM:

* MDA develops and manufactures highly complex targets for short, medium, intermediate, and eventually intercontinental ranges used in BMDS flight tests to present realistic threat scenarios. The targets are designed to encompass the full spectrum of threat missile ranges and capabilities. As part of target development, MDA has developed the extended medium-range ballistic missile (eMRBM) target. The development of this target started in 2003, was put on hold in 2008, and was restarted in mid-2010. We assessed the contractor's schedule that started in March 2010 and has a projected finish in November 2014. The schedule we reviewed was from September 2011, which means the development of the target had been restarted little more than a year earlier. The only formal schedule for this program is the contractor's schedule.

* This program substantially met six best practices and partially met three. Based on these results, the program does not have the full ability to allocate resources from non-critical activities to activities that will affect the project finish date if they are delayed and may be at risk of not sufficiently understanding the amount of risk associated with meeting the planned completion date. It also should be noted that, based on the September 2011 schedule we reviewed, the program will deliver the target vehicles a year later than previously planned, in part due to delayed hardware delivery for the targets.[Footnote 11]

* In commenting on the results of our analysis, eMRBM program management officials stated they adjusted their schedule risk analysis processes and plan to include government activities in the schedule. Further, the program has reduced the number of activities with large float values in the schedule.

Results for Assessed MDA Program Schedules Are Similar to Results for Other Government Programs:

Overall, the five MDA programs' schedules proved to have issues similar to those of 44 other programs we have assessed since 2009 using the schedule best practices assessment methodology. We have assessed programs in the Departments of Homeland Security (DHS), Defense, Energy (DOE), State, and Veterans Affairs (VA).[Footnote 12] Like programs in these agencies, most of the five MDA programs had mixed results in terms of meeting the nine best practices. Our review of agency program schedules suggests that the problems we found with MDA program schedules are typical in that most government program offices we have assessed do not include all activities in an integrated master schedule.[Footnote 13] When schedules do not account for all activities, one cannot be certain whether activities are scheduled in the correct order, resources are properly allocated, missing activities would appear on the critical path, or a schedule risk analysis (SRA) accounts for all risk. (The sketch below illustrates, in simplified form, how an SRA translates duration uncertainty into confidence levels for a completion date.)
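The following minimal sketch--in Python, using invented three-point duration estimates for a hypothetical serial chain of activities, not data from any program we reviewed--shows how an SRA repeatedly simulates uncertain durations to produce the kind of confidence-level finish estimates cited in enclosure II:

import random

# Hypothetical (minimum, most likely, maximum) duration estimates in
# working days for a simple serial chain of activities. Illustrative only.
estimates = {
    "design":    (20, 30, 50),
    "fabricate": (40, 60, 100),
    "integrate": (15, 25, 45),
    "test":      (10, 15, 30),
}

def simulate_once():
    # Draw each duration from a triangular distribution and sum along the
    # chain; a real SRA would rerun the full network logic each iteration.
    return sum(random.triangular(low, high, likely)
               for low, likely, high in estimates.values())

finishes = sorted(simulate_once() for _ in range(10_000))

# Read off the completion duration at standard confidence levels.
for pct in (5, 50, 80, 95):
    days = finishes[int(len(finishes) * pct / 100)]
    print(f"{pct:>2} percent confidence: finish within {days:.0f} days")

[End of code example]

A credible SRA normally shows a wide spread between the low and high confidence levels; risk ranges as narrow as the 5-day spread reported for SM-3 Block IIA or the 1-month spread reported for Aegis Ashore (see enclosure II) suggest that the underlying schedule or risk data were too limited.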
Our reviews of other agency schedules have found that government program offices do not consistently review and update the schedule. As a result, the schedule does not reflect true status and cannot be used to determine variances from the plan.

Conclusions:

Fundamental to accountability and oversight is being able to establish a sound plan and to track actual performance against that plan. We have previously concluded that it is imperative that a program's schedule be sustained in a way that provides reliable reporting of progress from which accountability can be maintained. However, there remain critical gaps in the quality of the underlying acquisition schedules that are needed to establish baselines and to meaningfully measure progress against those baselines. Several MDA program management officials have indicated that efforts are under way to address some of these gaps, which is a step forward. However, further actions to fully address these critical gaps are essential because establishing sound and reliable schedules is fundamental to creating both schedule and cost baselines that are realistic. Decision makers in DOD and Congress rely on realistic baselines to ensure transparency, accountability, and oversight of the BMDS.

Recommendations for Executive Action:

To improve the transparency and accountability of the BMDS over the near and long term, we recommend the Director of MDA take the following two actions:

* for the near term, direct the SM-3 Block IIA, Aegis Ashore, GMD, PTSS, and eMRBM programs to improve their compliance with the schedule best practices as outlined in GAO's Schedule Assessment Guide; and

* for the long term, develop a plan--including direction to program offices to develop and maintain integrated master schedules that reflect both government and contractor activities--to ensure that best practices are applied to those schedules as outlined in GAO's Schedule Assessment Guide.

Agency Comments and Our Evaluation:

We provided a copy of the draft report to DOD and MDA for comment. In its written comments, reproduced in enclosure I, DOD agreed with our overall findings and concurred with our recommendations. DOD indicated that MDA program offices will improve their compliance with schedule best practices as outlined in our Schedule Assessment Guide. In addition, MDA will develop a plan to adopt and tailor the schedule best practices for use by MDA program offices. DOD also provided technical comments that we have incorporated throughout the report as appropriate.

We are sending a copy of this report to appropriate congressional committees and the Secretary of Defense. In addition, the report is available at no charge on the GAO website at [hyperlink, http://www.gao.gov].

If you or your staff have any questions about this report, please contact me at (202) 512-4841 or chaplainc@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. Key contributors to this report were: LaTonya Miller, Assistant Director; Letisha J. Antone; David Best; Tisha Derricote; Jennifer Echard; Ann Rivlin; Luis E. Rodriguez; John H. Pendleton; Kenneth E. Patton; Karen Richey; Robert Swierczek; and Alyssa Weir.
Sincerely yours,

Signed by:

Cristina Chaplain:
Director:
Acquisition and Sourcing Management:

Enclosures--2:

[End of section]

Enclosure I: Comments from the Department of Defense:

Office of the Under Secretary of Defense:
Acquisition, Technology and Logistics:
3000 Defense Pentagon:
Washington, DC 20301-3000:

July 5, 2012:

Ms. Cristina Chaplain:
Director, Acquisition and Sourcing Management:
U.S. Government Accountability Office:
441 G Street, N.W.:
Washington, DC 20548:

Dear Ms. Chaplain:

This is the Department of Defense (DoD) response to the GAO Draft Report, GAO-12-720R, "Schedule Best Practices Provide Opportunity to Enhance Missile Defense Agency Accountability and Program Execution," dated June 7, 2012 (GAO Code 121065). Detailed comments on the report recommendations are enclosed. The DoD concurs with both of the draft report's recommendations. I submitted separately a list of technical and factual errors for your consideration.

We appreciate the opportunity to comment on the draft report. My point of contact for this effort is Lt Col Peter Jackson, 703-695-7328, Peter.Jackson@osd.mil.

Sincerely,

Signed by:

David G. Ahern:
Deputy Assistant Secretary of Defense:
Strategic and Tactical Systems:

Enclosure: As stated.

[End of letter]

GAO Draft Report Dated June 7, 2012:

GAO-12-720R (GAO Code 121065):

"Schedule Best Practices Provide Opportunity to Enhance Missile Defense Agency Accountability and Program Execution"

Department of Defense Comments to the GAO Recommendations:

Recommendation 1: To improve the transparency and needed accountability over the Ballistic Missile Defense System (BMDS) over the near term, the GAO recommends that the Director, Missile Defense Agency (MDA) direct the Standard Missile-3 (SM-3) Block IIA, Aegis Ashore, Ground-based Midcourse Defense (GMD), Precision Tracking Space System (PTSS), and Extended Medium-Range Ballistic Missile (eMRBM) programs to improve their compliance with the schedule best practices as outlined in GAO's Schedule Assessment Guide.

DoD Response: Concur. MDA Program Offices will improve their compliance with the schedule best practices as outlined in the GAO draft Schedule Assessment Guide dated May 2012.

Recommendation 2: To improve the transparency and needed accountability over the BMDS over the long term, the GAO recommends that the Director, MDA, develop a plan, including direction to program offices to develop and maintain integrated master schedules that reflect both government and contractor activities, to ensure that best practices are applied to those schedules as outlined in GAO's Schedule Assessment Guide.

DoD Response: Concur. MDA will develop a plan to adopt and tailor the best practices and concepts in the draft GAO Schedule Assessment Guide, dated May 2012, for use by MDA Program Offices to develop and maintain integrated master schedules that reflect both government and contractor activities.

[End of section]

Enclosure II: MDA Program Schedule Results:

This enclosure details the results of our schedule analysis for five selected MDA programs.

SM-3 Block IIA Schedule Results:

Table 3 below contains the comparison of the SM-3 Block IIA program schedule as of September 2011 to schedule best practices. We reviewed a contractor schedule that started April 4, 2011, and finished November 22, 2011.
The program reported that the current schedule was incomplete because the program did not have a contract that covered the entire life of the program and that the schedule would be replaced by a full integrated master schedule in mid-2012 when contract negotiations for the completion of the program were final.

Table 3: SM-3 Block IIA Schedule Compared to Best Practices:

1. Captured all activities?
Result: Minimally met;
Analysis: The program had no comprehensively networked government integrated master schedule (IMS) for managing the entire program. The schedule we reviewed covered only the 8 months from April through November 2011. The only schedule for the program, owned and maintained by the contractor, did not clearly distinguish government from contractor activities. Program management officials said they planned to fix this in the forthcoming baselined schedule and, for the remainder of the program, to develop their own IMS to include detailed effort through 2017 project completion. This updated schedule is to be developed after the contract is awarded for the remainder of the project.

2. Sequenced all activities?
Result: Partially met;
Analysis: While no activities were missing predecessor or successor logic, 11 remaining activities (2 percent) had dangling successors. This means that while all these activities had successor logic links, their finish dates failed to affect the start dates of their successor activities. Program management officials acknowledged that the dangling logic stemmed predominantly from artificial constraints used to force activities to finish at the same time as the missile contract period of performance. The schedule was highly concurrent: activities had a relatively high number of converging predecessors. For example, 3 activities each had more than 60 predecessors. Program management officials explained that some concurrency stemmed from the nature of the work flow while some stemmed from the failure to complete all subsystem preliminary design reviews. Several constraints and a few lags indicated that the schedule logic was incomplete. In particular, 27 percent of the remaining activities had start-no-earlier-than constraints and 2 percent were affected by lags spanning 1 to 35 days.

3. Assigned resources to all activities?
Result: Substantially met;
Analysis: The schedule was resource-loaded. Of the 652 remaining activities, 90 percent had identified resources. Program management officials said that the contractor estimated and managed resources to meet all schedule obligations by conducting monthly engineering design reviews, including a resource review to determine the program's status on resources. According to program management officials, no resource issues affected the schedule.

4. Established the duration of all activities?
Result: Partially met;
Analysis: Fifty percent of the remaining activities had durations greater than the 44 days recommended by best practices, with many longer than 100 days. Program management officials explained that many of the activities with long durations were most likely level-of-effort tasks. The schedule's baseline durations varied considerably from actual durations. Program management officials said that the inconsistent durations most likely resulted from time delays in status updates from their Japanese counterparts.

5. Verified that the schedule was traceable horizontally and vertically?
Result: Substantially met;
Analysis: The schedule demonstrated horizontal traceability.
For instance, when we added substantial time to an activity, we saw comparable delays in the final milestone. Program management officials said that major handoffs and deliverables with subcontractors were negotiated monthly during program reviews, where they were able to see every task that was in danger of being late or overrunning. They said it was hard to verify vertical traceability between the schedule and management briefing charts because the current schedule was a small segment of the overall effort. Attempting to crosswalk key program milestones to the schedule, we could find only some of them. Program management officials agreed with our findings, stating that when the full IMS is available in July 2012, it will be possible to find all the milestones in the schedule.

6. Confirmed that the critical path was valid?
Result: Partially met;
Analysis: We calculated two critical paths of three activities each that drove the end milestone date. Program management officials said that to create the critical path, the contractor used an end constraint on the key deliverable milestone. As a result of this constraint, 397 activities had zero or negative float, convoluting the calculation of the critical path. Though the temporary use of hard constraints is a valuable tool for assessing whether available resources can achieve the planned activity date, using hard constraints to fix activity dates at certain points in time immediately affects the critical path calculations and reduces the credibility of the later schedule dates.

7. Ensured reasonable total float?
Result: Substantially met;
Analysis: Total float was reasonable given the planned finish date of November 22, 2011. Only a few remaining activities had more than 44 days of total float. The maximum total float in the schedule was 78 days, which made sense given that only 4 months remained to the planned finish date.

8. Conducted a schedule risk analysis?
Result: Minimally met;
Analysis: The contractor, but not the program office, had conducted an SRA. Program management officials said it was performed on the current 5-month effort and that another would be conducted on the larger effort once it had been baselined. The SRA was poor for many reasons, including an arbitrary contract end date milestone and a partial schedule with limited risk data. As a result, the risk range the SRA generated was too narrow: at 5 percent probability, the end date was calculated as October 6, 2011, while at 95 percent probability the end date was October 11, 2011--a risk spread of only 5 days. Program management officials agreed that the SRA results were used to demonstrate what could be done to assess program risk but that, since the schedule covered only a small portion of the overall effort, the analysis had no strategic value for the program.

9. Updated the schedule using actual progress and logic?
Result: Fully met;
Analysis: The schedule had a valid and current status date, reported as July 30, 2011, and no activities with start or finish dates in the past were missing actual start or finish dates. We found no activities with actual start or finish dates in the future and no activities performed out of sequence. Program management officials said that schedule progress was recorded monthly and that the contractor provided the program office with a metrics package that included reasons for delays as well as logic and activity changes.
Reviewing briefing charts on the SM-3 Block IIA schedule, we found evidence that the program office actively monitored tasks that had slipped from the baseline as well as tracked float and activity durations. Officials said schedule updates and changes were the responsibility of two schedulers certified and trained in critical path method scheduling.

Source: GAO analysis of MDA SM-3 Block IIA schedule data.

[End of table]

Aegis Ashore Schedule Results:

Table 4 below contains the comparison of the Aegis Ashore program schedule as of September 2011 to schedule best practices. Program management officials said the government schedule, which begins in 2009, was detailed only through the end of 2015, when the first Aegis Ashore site will be complete in Romania, and does not include the effort required to complete the second Aegis Ashore site in Poland. As a result, the schedule we reviewed addressed only a portion of the program and did not cover later phases. Program management officials stated that Aegis Ashore is unlike other MDA acquisition efforts in that there is not a prime contractor leading the program's development. Instead, the program is modifying and integrating existing components from multiple ongoing government contracts into Aegis Ashore installations. It relies on development activities run by other Aegis BMD programs--such as those programs' development of vertical launching systems for use by the SM-3 Block IIA and SM-3 Block IIB--as well as development activities run by the program itself, such as the creation of a deckhouse for use on land.

Table 4: Aegis Ashore Schedule Compared to Best Practices:

1. Captured all activities?
Result: Partially met;
Analysis: The IMS covered only a portion of the planned effort negotiated through 2015. Additional phases planned for after 2015 were not included; developing an IMS to cover all European PAA phases was planned. Program management officials said work for the Poland site, part of Phase III, was not in the schedule because they were recompeting that contract and procurement officials did not want to disclose this information to prospective contractors. The work breakdown structure (WBS) elements--the effort that needs to be accomplished--captured in the government schedule could not be matched to the contractor's schedule, and because the government and the contractor did not share the same scheduling software, government officials received updates from the contractor by undocumented processes. Further, many activities had redundant names, making it difficult to identify them. Acknowledging the duplication of names, program management officials said they would make them more distinct.

2. Sequenced all activities?
Result: Minimally met;
Analysis: Program management officials, acknowledging broken and incomplete logic, said they were reviewing and correcting this problem. Of the remaining activities, 30 percent were missing predecessor logic and 15 percent successor logic; 2 percent had dangling logic, meaning they were missing successor links that would affect future start dates. Program management officials said they were reviewing and correcting the dangling activities. Four remaining summary activities with logic links were corrected. A few activities were highly concurrent. Forty-eight percent of the remaining activities had constraints, the majority start-no-earlier-than constraints. Several activities with must-start-on and must-finish-on hard constraints overwrote schedule network logic.
Program management officials said all tasks were being reviewed to reduce the number of constraints but that some task constraints were dictated by contract award and delivery date deadlines. Twenty-three percent of the remaining activities had lags of 1 to 300 days; 1 percent had leads (or negative lags). Program management officials said they were revising the logic to eliminate leads and reduce lags to less than 5 percent of incomplete tasks.

3. Assigned resources to all activities?
Result: Not met;
Analysis: The schedule was not resource-loaded. We found only one resource, which has since been removed. Program management officials agreed that the IMS did not contain resources but asserted that the contractor's detailed schedule was resource-loaded. They said that resources assigned to IMS activities were documented outside the schedule. Program management officials said they had no resource issues and believed the current plan was feasible given resource availability but might have an issue with the future availability of test engineers; that is, they needed to coordinate with the Navy test community to ensure that test engineers would be available when needed. As for the sufficiency of resources, officials said that the spend plan and cost estimate were compared in cost analyses to ensure that they matched the schedule.

4. Established the duration of all activities?
Result: Substantially met;
Analysis: Durations were generally within the 44-day target, although some were longer. Of the remaining activities, 72 percent had durations of 44 days or less, which was in line with best practices. Program management officials said that all activities longer than 44 days were being reviewed and broken into multiple tasks where possible. Some activities were longer than 44 days because they were expected to take that long by their nature. For example, expecting fabrication lines to take less than 44 days would be unreasonable since this effort takes more than 100 days to complete. Activity durations had been estimated and documented from experience with similar projects.

5. Verified that the schedule was traceable horizontally and vertically?
Result: Partially met;
Analysis: The schedule was not horizontally traceable because of issues regarding sequencing, including activities' missing logic and activities with hard constraints. When we tested the schedule for horizontal traceability by adding hundreds or thousands of days to several activities, we found that the schedule's planned finish date was not delayed in comparable ways. For example, we added 500 days to an activity, which moved the finish date by only 4 months because the task was missing a successor link. When we added 1,000 days to another activity, the activity started 4 years sooner because a must-finish-on constraint did not allow the activity's finish date to move into the future. The result is that we have little confidence in the calculated dates or critical path. The schedule was mostly vertically integrated: low-level activities could be traced to high-level summary activities, and major milestones could be mapped between the schedule and high-level briefing charts. However, vertical traceability was hampered somewhat by four summary links.

6. Confirmed that the critical path was valid?
Result: Minimally met;
Analysis: Program management officials said that the overly constrained IMS we reviewed did not have a valid critical path. They said that they were revising the IMS to remove constraints to the maximum extent possible.
We found a longest path to a flight test that proved the safety and performance of the Aegis Ashore system, but the path stopped at an activity that had a must-finish-on constraint.

7. Ensured reasonable total float?
Result: Minimally met;
Analysis: Our analysis confirmed program management officials' statement that, because of missing logic, lags, and leads, float in the schedule was not realistic. We found unrealistically high float (greater than 100 days) for many activities in the schedule: 394 activities (18 percent) had float greater than 1,000 days, and 1,642 activities (75 percent) had float greater than 100 days. We also found 70 activities (3 percent) with negative float values, indicating constraints on activities that were behind schedule. Program management officials said all tasks were being reviewed and logic relationships were being established. They said they planned to monitor total float, both negative and positive.

8. Conducted a schedule risk analysis?
Result: Minimally met;
Analysis: An SRA performed in June 2011 did not yield a confidence level that the program could use because the risk inputs were not validated and the schedule did not include all activities. Program management officials shared the results nonetheless, identifying different confidence levels for the deckhouse fabrication. The confidence levels ranged from 5 percent, with a finish date of August 23, 2012, to 95 percent, with a finish date of September 25, 2012. We found these results too optimistic given that the entire risk range spanned only 1 month. Moreover, the finish date at the 85 percent confidence level, September 18, 2012, failed to account for the potential for major design or fabrication errors. Finally, the SRA excluded data from the program's risk register, and the schedule had no risk mitigation activities.

9. Updated the schedule using actual progress and logic?
Result: Partially met;
Analysis: The current August 31, 2011, status date was valid, but problems with some of the activity dates indicated that the schedule had not been fully updated. For example, 331 activities (15 percent) had start dates in the past with no actual start dates entered, and 305 activities (14 percent) had finish dates in the past with no actual finish dates. We found 7 activities with actual start dates recorded in the future and 8 activities with actual finish dates after the status date. The 31 out-of-sequence activities (1 percent of the remaining activities) that were not addressed during the August 31, 2011, update have since been reviewed and corrected. Program management officials also said that assessing the IMS for trends was hampered by logic, lead, lag, and constraint issues. Once these issues were resolved, schedule reporting and trending were to be monitored.

Source: GAO analysis of MDA Aegis Ashore schedule data.

[End of table]

GMD Schedule Results:

Table 5 contains the comparison of the GMD program schedule as of September 2011 to schedule best practices. The schedule spans January 2009 to June 2013 and is a partial schedule that contains the activities for a flight test and post-test analysis. It does not contain activities funded under the GMD prime contract. Program management officials stated they plan to develop a schedule that extends to the planned 2032 program completion.

Table 5: GMD Schedule Compared to Best Practices:

1. Captured all activities?
Result: Minimally met;
Analysis: No IMS included both government and contractor efforts for the entire length of the program. A contractor-managed schedule that addressed the contractor's and subcontractor's work and included 71 completed government tasks accounted for only a small part of the overall planned effort through February 2013, even though the overall program was planned to be completed in 2032. Program management officials said that they planned to update the schedule to include all the effort. The schedule reflected a product-oriented WBS, with a document showing how the schedule activities mapped to the WBS. However, we found that 2 percent of the activities had the same names.

2. Sequenced all activities?
Result: Partially met;
Analysis: Most of the logic was complete; only a few activities had dangling starts and finishes. Lags were few and small. However, 23 percent of the remaining activities had start-no-earlier-than constraints that prevented work from starting as soon as possible; the majority of these constraints actively delayed activities from starting earlier. Program management officials said they were not aware of any constraints from funding, weather, equipment, or material availability. The program acknowledged that the contractor was using an excessive number of constraints. In addition, we found a few activities that had a high degree of concurrency. For example, the contract completion activity had 341 predecessors, while 3 other activities had 50 or more predecessors.

3. Assigned resources to all activities?
Result: Substantially met;
Analysis: Our analysis found deficiencies in assigning resources to all activities; however, program management officials provided evidence that they used an alternative method that largely ensures activities are supported by resources. We found that 32 percent of the activities (254 of 789) had resources assigned to them while 68 percent did not. Program management officials stated that resources were sufficient in each work period and that no potential difficulties in obtaining the necessary resources had been identified. Management officials told us that schedule activities were resourced outside the schedule and that many of the remaining activities that had resources were being performed by a subcontractor whose schedule was not resource-loaded. However, the program conducts reviews to ensure that resources are discussed and agreed to and has a system for ensuring that subcontractor efforts are supported by resources.

4. Established the duration of all activities?
Result: Fully met;
Analysis: Most activities had durations shorter than 44 days, which was in line with best practices. Only 9 activities had durations of 44 to 120 days (about 5.5 months); however, some of the longest had been removed from the schedule while the agency investigated its December 2010 flight test failure. Activity durations were based on the scope required, knowledge of the work to be done, and the contractor's experience. They also were supported by detailed estimates using historical data.

5. Verified that the schedule was traceable horizontally and vertically?
Result: Substantially met;
Analysis: Because of the number of start-no-earlier-than constraints and some incomplete logic from a few dangling activities, horizontal traceability was somewhat hampered. Program management officials explained that key handoff dates were monitored in many management review processes at the team and program levels and were tracked with four major code fields in the schedule.
We confirmed that the schedule was vertically traceable by finding that 4 different activity dates in the schedule were consistent with the dates presented in a management briefing chart.

6. Confirmed that the critical path was valid?
Result: Fully met;
Analysis: Seventeen well-organized paths led to a milestone scheduled to be completed on December 23, 2011. While this milestone might later be incorporated into a more inclusive IMS, it was now the collection point for several paths, many seemingly well-ordered and driven by logic and duration rather than constraints. The milestone had many predecessors. Activities on these paths had total float of zero days. These paths started at an unconstrained activity, and the work proceeded from actual dates directly to the end without being interrupted by a "target" or constraint date.

7. Ensured reasonable total float?
Result: Substantially met;
Analysis: Two hundred fifty-three activities (30 percent of the 846 remaining) had somewhat to very high total float of 45 to more than 300 working days. We could not tell whether the activities' high total float values derived from missing activity logic or incorrect logic. We also found two radar testing milestones with negative float ranging from -193 days to -217 days, indicating that this work was constrained and behind schedule. Float was monitored in weekly joint management reviews, and program officials reviewed schedule analysis reports that discussed total float, including its direction from one week to the next.

8. Conducted a schedule risk analysis?
Result: Partially met;
Analysis: The program conducted an SRA but did not determine the confidence levels associated with the schedule's various projected end dates. It was not clear that the probability of test failures had been included in the risk analyses, even though there had been some important recent failures. Program management officials stated that risks were applied to various hardware and software efforts using percentage increases and decreases on the estimated activity durations. The prime contractor had a risk management plan describing how the program would identify and manage risks such as test anomalies, but the probability and effect of the risks were not identified. In addition, the schedule contained no risk mitigation activities.

9. Updated the schedule using actual progress and logic?
Result: Fully met;
Analysis: The schedule included a valid and current status date recorded as September 1, 2011. The prime contractor updated the schedule weekly and recorded the status dates of its activities, including all subcontractors. An archive of each IMS had been delivered over the past 2 years. We found one activity with an actual start date in the future relative to the status date and five activities with actual finish dates in the future that were only 1 day after the status date. Program management officials stated that the prime contractor's schedulers had proper training, experience, and professional certifications. The contractor's scheduling team consists of 9 schedulers with over 123 years of combined experience.

Source: GAO analysis of MDA GMD schedule data.

[End of table]

PTSS Schedule Results:

Table 6 contains the comparison of the PTSS program schedule as of September 2011 to schedule best practices. The contractor schedule we reviewed started on February 1, 2010, and was projected to finish on March 10, 2017. The program was in the early stage of development, and the schedule we assessed was very immature.
PTSS Schedule Results: Table 6 contains the comparison of the PTSS program schedule as of September 2011 to schedule best practices. The contractor schedule we reviewed started on February 1, 2010, and was projected to finish on March 10, 2017. The program was in the early stages of development, and the schedule we assessed was very immature. Program officials explained that continuing resolutions and budget cuts had resulted in their having received only a small portion of the program's overall funding and that they had not had time to fully develop the schedule to meet best practices at the time of our assessment.

Table 6: PTSS Schedule Compared to Best Practices:

1. Captured all activities? Result: Partially met; Analysis: The schedule was owned and maintained by the contractor and contained only contractor and subcontractor efforts. Program management officials said that government activities were to be added before the PTSS Technology Development Decision. Since the schedule WBS did not map to the program WBS, it was difficult to tell whether the schedule contained all appropriate activities. However, key milestones matched the milestones in other program documentation. Some activity names were insufficiently descriptive to differentiate one task from another; for example, multiple activities were simply named "PDR." Custom text fields in the schedule were mostly blank. Program management officials explained that the contractor used a consistent and standardized field coding convention for managing all its major project schedules, but during the early phases of project formulation it was not uncommon for these fields to be unpopulated because the team was concentrating on developing the schedule logic.

2. Sequenced all activities? Result: Minimally met; Analysis: The schedule relied heavily on constraints and lags, and the logic was incomplete. While it might have been too early in the project to expect a mature schedule, the following problems should have been addressed: more than half of the 2,332 remaining activities were missing predecessor or successor activities; 50 dangling activities (2 percent of the remaining activities) had either no predecessor on their start date or no successor from their finish date; 6 summary activities had logic links; 750 activities (32 percent of the remaining activities) had date constraints; a must-finish-on constraint on the proposed launch date forced the launch to occur on a specific date, overriding network logic; and 106 tasks (6 percent of the remaining activities) had lags. Program management officials explained that many activities were missing logic links or had constraints because the schedule was in the early stages of development and was missing sections of work that had not been detail planned at the time of our review. They also said that constraints were used to pin interdependencies between subsystems and to actively manage negative float. Program management officials explained that since the schedule was not resource loaded, they used lags to account for when resources would be available.

3. Assigned resources to all activities? Result: Partially met; Analysis: Most remaining activities--2,283, or 98 percent--had no resource assignments in the schedule. However, the contractor managed resources through an in-house Resource Management Information System that tracked activities in all four of its laboratories. This system, a web-based resource planning and management reporting system, was designed to give users one source of data for managing tasks and resources; it captured actual costs and resource plans for all departments plus supplemental laboratory information. The program manager said that resources were determined from historical information and the engineering buildup estimating technique. The program performed an analysis to ensure that sufficient resources were available when needed.

4. Established the duration of all activities? Result: Substantially met; Analysis: Activity durations were generally within the 44-day target, with 67 percent of the durations 44 working days or less. Of the remaining activities, 19 percent had durations of 1 day. Program management officials explained that some activities were placeholders because the program had not yet received information from vendors and that other activities had long durations because they involved deliveries that were outside the program's control. Program management officials consulted with experts when estimating durations.

5. Verified that the schedule was traceable horizontally and vertically? Result: Partially met; Analysis: Horizontal traceability was hampered by missing and dangling logic as well as reliance on too many date constraints. The schedule was vertically traceable because lower-level tasks and milestones rolled up into higher-level summary tasks, and dates in the schedule matched dates in program status briefings.

6. Confirmed that the critical path was valid? Result: Minimally met; Analysis: The program's ready-for-launch milestone, scheduled for August 2016, had no clear critical path. Only 7 activities had zero total float, and the earliest critical path activity was not set to begin until 2016; the absence of any critical path activities before 2016 indicated that the path was broken. Program management officials, agreeing with this finding, said that they had not run the critical path analysis because the schedule was not mature enough. Because of the lack of a valid critical path, we attempted to find the path that drove the ready-for-launch date in August 2016. Working backward from this activity, we were able to trace activities all the way back to a critical design review activity, but this path was not continuous either: it was stopped abruptly on October 17, 2013, by a start-no-earlier-than constraint on the critical design review.

7. Ensured reasonable total float? Result: Minimally met; Analysis: We found unreasonable amounts of total float throughout the schedule. For example, 49 percent of the 2,332 remaining activities had float values greater than 1,000 days--that is, almost half of the remaining activities could slip almost 3 years or more without affecting the end date of the program, which was not reasonable. Program management officials agreed that several activities had excessive float values, caused by the incomplete and incorrect logic discussed under best practice 2. Until the schedule matured to reflect valid logic links, reasonable total float values would not be possible.

8. Conducted a schedule risk analysis? Result: Not met; Analysis: No SRA had been conducted because the schedule's immaturity meant that it could not produce realistic results with statistical techniques. However, program management officials said they had evaluated risks to the schedule.

9. Updated the schedule using actual progress and logic? Result: Not met; Analysis: The schedule had no valid or current status date, indicating that it had not been updated. Using October 10, 2011, the date the schedule was delivered to us, as the status date, we found that 42 percent of the 2,332 remaining activities had start dates in the past with no actual start dates recorded and 40 percent of the activities had finish dates in the past with no actual finish dates recorded. Program management officials acknowledged that the schedule we reviewed had not been updated but said they planned to keep the schedule updated once it was mature.

Source: GAO analysis of MDA PTSS schedule data. [End of table]
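Several of the PTSS findings above--missing predecessors or successors, date constraints, and activities with planned dates in the past but no recorded actual dates--are mechanical tests that can be run directly against an IMS export. The following minimal sketch in Python shows the form such checks take; the record layout and field names are hypothetical stand-ins for whatever fields the scheduling tool actually exports.

from datetime import date

# Hypothetical activity records as they might appear in an IMS export.
activities = [
    {"id": 1, "preds": [], "succs": [2], "constraint": None,
     "start": date(2011, 8, 1), "actual_start": None},
    {"id": 2, "preds": [1], "succs": [], "constraint": "start no earlier than",
     "start": date(2011, 11, 1), "actual_start": None},
]
status_date = date(2011, 10, 10)  # here, the date the schedule was delivered

# Best practice 2: activities missing predecessor or successor logic.
# (A real check would exempt the schedule's start and finish milestones,
# which legitimately lack a predecessor or a successor.)
missing_logic = [a["id"] for a in activities if not a["preds"] or not a["succs"]]

# Best practice 2: activities carrying date constraints.
constrained = [a["id"] for a in activities if a["constraint"] is not None]

# Best practice 9: planned starts in the past with no actual start recorded.
stale = [a["id"] for a in activities
         if a["start"] < status_date and a["actual_start"] is None]

n = len(activities)
print("missing predecessor or successor:", len(missing_logic), "of", n)
print("date constrained:", len(constrained), "of", n)
print("start date in the past, never statused:", len(stale), "of", n)

Expressed as percentages of remaining activities, counts like these yield the figures reported in table 6, such as the 42 percent of activities with start dates in the past but no actual start dates recorded.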
eMRBM Schedule Results: Table 7 contains the comparison of the eMRBM program schedule as of September 2011 to schedule best practices. The contractor schedule we reviewed started on March 17, 2010, and had a projected finish date of November 19, 2014. Development of the eMRBM began in 2003 but was suspended in 2008; development started again in mid-2010.

Table 7: eMRBM Schedule Compared to Best Practices:

1. Captured all activities? Result: Partially met; Analysis: The schedule contained no detailed government effort. Instead, it covered the entire program through fiscal year 2014 and represented contractor and subcontractor effort. The contractor owned and maintained the only formal program schedule and therefore included only government-furnished equipment milestones in its schedule. The contractor's schedule included links to 13 subschedules maintained independently so that program staff could work on their schedules without locking down the entire IMS. The detailed schedule was closely linked to the WBS, using a column in the schedule that denoted the associated WBS element. The majority of the activity names were unique, but a few activities (5 percent) had repetitive names that could complicate communication among the various teams working on the schedule.

2. Sequenced all activities? Result: Substantially met; Analysis: The IMS had no missing predecessors or successors, but we found some broken logic in the schedule: 2 percent of the remaining activities did not have a predecessor driving their start dates, and one activity did not have a successor from its finish. Program management officials stated that the contractor had greatly emphasized resolving any missing logic issues. We found a high degree of concurrency, with some activities having 116 to 436 predecessors. Approximately 19 percent of the remaining activities were constrained with inadequate justification. Program management officials said the contractor used constraints to impose deadlines, monitor deliverables, or model promise dates for material receipts.

3. Assigned resources to all activities? Result: Substantially met; Analysis: The schedule was partially resource loaded, with 57 percent of the activities having assigned resources. Activities that had no resources included subcontractor efforts and government-furnished equipment. Program management officials said that while labor resources in the schedule corresponded to the program budget, another tool captured material, equipment, overhead allocations, and fringe costs. The contractor based resource estimates on experience with similar programs. Resource leveling was not performed in the schedule, but the contractor stated that resource issues were handled case by case. Program management officials reported that there were no resource allocation issues and that none were foreseen.

4. Established the duration of all activities? Result: Substantially met; Analysis: The majority of durations (71 percent) were within the 44-day target for this best practice, although some durations were significantly longer--more than 400 days. According to program management officials, the program had been working to reduce the number of activities with durations longer than 40 working days; they said that durations of more than 40 working days were reserved for subcontract component lead times, which were included only for reference and not loaded with labor hours. Durations were estimated by the people responsible for doing the work and were based on historical data obtained from the contractor's enterprise system. As evidence of detailed bases for estimates, program management officials provided a detailed description of the estimating methodology, actual historical hours, charge numbers covering the periods of the historical data, and the source of the historical data.

5. Verified that the schedule was traceable horizontally and vertically? Result: Substantially met; Analysis: The schedule was traceable horizontally because the majority of the logic was in place. To test horizontal traceability, we extended an activity from 40 to 900 days, which moved the project completion date by the same amount of time. However, using the same test on another activity, we found that because the activity had no successor link from its finish (that is, its logic was dangling), the lengthened duration had no effect on successive activities in the network. Handoffs between teams were negotiated in weekly schedule merge and coordination meetings monitored by MDA staff as part of their standard analysis of the IMS. The schedule included a custom text field called "Responsible" that made it easy to track who was in charge of each deliverable. We found a slight issue with vertical integration within the primary schedule, where two summary tasks had end dates later than the project completion milestone. Program officials explained that these two activities were considered summary tasks and so were not linked with any logic; they said these two tasks were in the IMS for display only and accounted for effort outside the current planning period.

6. Confirmed that the critical path was valid? Result: Substantially met; Analysis: The program identified several critical paths that were continuous from the status date to important program milestones. Program management officials said the critical path was updated, reviewed, and analyzed each week; during the analysis, all conflicts were addressed and resources were updated as required. All critical paths analyzed contained a small number of lags, which were used on a limited basis to delay a dependent successor. However, our findings for best practice 2 showed that the dangling logic made it possible that the critical paths would not update correctly when durations changed.

7. Ensured reasonable total float? Result: Partially met; Analysis: Several activities in the schedule had excessive total float values: 56 percent of the remaining activities had float values greater than or equal to 100 days, meaning these activities could slip more than 4 working months without affecting successor activities. For example, one activity had 726 days of float, which meant that it could slip 3 years without affecting the end-of-contract date. Some activities had high float values because they represented targets that were on contract but would not be ready for several years. A significant number of activities had negative float values as large as -215 days. Program management officials said they were working to drive down the negative float, which was caused by delays in engineering drawing releases and issues associated with procurement items.

8. Conducted a schedule risk analysis? Result: Partially met; Analysis: An SRA used risks from the program's risk register to develop optimistic and pessimistic assessments. The risk inputs, validated by the risk owners, were based on historical data from similar missile programs. Mitigation plans for high and medium risks were tracked in the risk management process and were included in the IMS in accordance with best practices. However, the results from the SRA indicated that there was a zero percent probability of meeting the delivery date of the first two vehicles; in fact, the SRA results indicated that this delivery was likely to take place at least 1 year after its planned date. Program officials definitized the contract in October 2011 and said they planned to perform a new SRA to reflect the status of the schedule after that date.

9. Updated the schedule using actual progress and logic? Result: Substantially met; Analysis: The schedule recorded a valid and current status date of September 18, 2011. A weekly process updated activities, addressed constraints, and resolved issues. Program management officials said schedule changes were made in real time, directly into the schedule, during its schedule merge and coordination meetings. The officials said the contractor followed processes set up to maintain any changes to the baseline or IMS. A scheduler with 20 years of experience performed schedule updates. MDA and the contractor interacted daily at all levels, and significant issues were raised to senior program management staff immediately. Four activities had start dates in the past that were missing actual start dates, and 6 activities had finish dates in the past with no actual finish dates. A small number of activities (8) had actual start dates in the future, and 114 activities had actual finish dates in the future relative to the status date of September 18, 2011. Program management officials said the schedule file had no baseline information because, when the schedule was submitted to GAO, the contract was not yet definitized. The officials said there was no formal record of activities completed out of sequence.

Source: GAO analysis of MDA eMRBM schedule data. [End of table]

[End of section]
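The confidence figure cited under best practice 8 for the eMRBM schedule--a zero percent probability of meeting the delivery date of the first two vehicles--is the kind of output a Monte Carlo schedule risk analysis produces: each activity is given optimistic, most likely, and pessimistic durations, many candidate schedules are drawn at random, and the share of draws that finish by the planned date is the confidence level. The following minimal sketch in Python uses hypothetical three-point estimates for a simple serial chain of three activities; a real SRA simulates over the full network logic rather than a simple sum.

import random

# Hypothetical (optimistic, most likely, pessimistic) durations in working
# days for three activities performed one after another.
estimates = [(20, 30, 55), (40, 45, 70), (10, 15, 35)]
planned_finish = 80  # hypothetical planned working days to delivery
trials = 10_000

hits = 0
samples = []
for _ in range(trials):
    # random.triangular takes (low, high, mode).
    finish = sum(random.triangular(lo, hi, ml) for lo, ml, hi in estimates)
    samples.append(finish)
    hits += finish <= planned_finish

samples.sort()
print("probability of finishing by day", planned_finish, "=", hits / trials)
print("80 percent confidence finish: day", round(samples[int(0.8 * trials)]))

Because the pessimistic tails pull each activity's mean duration well above its most likely value, the simulated finishes cluster far beyond day 80, mirroring how an SRA can report essentially zero probability of meeting a planned date and indicate how far beyond that date delivery is likely to slip.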
Related GAO Products: Defense Acquisitions: Assessments of Selected Weapon Programs. [hyperlink, http://www.gao.gov/products/GAO-12-400SP]. Washington, D.C.: March 29, 2012. Immigration Benefits: Consistent Adherence to DHS's Acquisition Policy Could Help Improve Transformation Program Outcomes. [hyperlink, http://www.gao.gov/products/GAO-12-66]. Washington, D.C.: November 22, 2011. Coast Guard: Action Needed As Approved Deepwater Program Remains Unachievable. [hyperlink, http://www.gao.gov/products/GAO-11-743]. Washington, D.C.: July 28, 2011. Federal Protective Service: Progress Made but Improved Schedule and Cost Estimate Needed to Complete Transition. [hyperlink, http://www.gao.gov/products/GAO-11-554]. Washington, D.C.: July 15, 2011. Aviation Security: TSA Has Enhanced Its Explosives Detection Requirements for Checked Baggage, but Additional Screening Actions Are Needed. [hyperlink, http://www.gao.gov/products/GAO-11-740]. Washington, D.C.: July 11, 2011. Defense Acquisitions: Assessments of Selected Weapon Programs. [hyperlink, http://www.gao.gov/products/GAO-11-233SP]. Washington, D.C.: March 29, 2011. Information Technology: Better Informed Decision Making Needed on Navy's Next Generation Enterprise Network Acquisition. [hyperlink, http://www.gao.gov/products/GAO-11-150]. Washington, D.C.: March 11, 2011. Secure Border Initiative: DHS Needs to Strengthen Management and Oversight of Its Prime Contractor. [hyperlink, http://www.gao.gov/products/GAO-11-6]. Washington, D.C.: October 18, 2010. DOD Business Transformation: Improved Management Oversight of Business System Modernization Efforts Needed. [hyperlink, http://www.gao.gov/products/GAO-11-53]. Washington, D.C.: October 7, 2010. Global Positioning System: Challenges in Sustaining and Upgrading Capabilities Persist. [hyperlink, http://www.gao.gov/products/GAO-10-636]. Washington, D.C.: September 15, 2010. Nuclear Waste: Actions Needed to Address Persistent Concerns with Efforts to Close Underground Radioactive Waste Tanks at DOE's Savannah River Site. [hyperlink, http://www.gao.gov/products/GAO-10-816]. Washington, D.C.: September 14, 2010. Defense Acquisitions: Observations on Weapon Program Performance and Acquisition Reforms. [hyperlink, http://www.gao.gov/products/GAO-10-706T]. Washington, D.C.: May 19, 2010. Defense Acquisitions: Strong Leadership Is Key to Planning and Executing Stable Weapon Programs. [hyperlink, http://www.gao.gov/products/GAO-10-522]. Washington, D.C.: May 6, 2010. Secure Border Initiative: DHS Needs to Reconsider Its Proposed Investment in Key Technology Program. [hyperlink, http://www.gao.gov/products/GAO-10-340]. Washington, D.C.: May 5, 2010. Best Practices: DOD Can Achieve Better Outcomes by Standardizing the Way Manufacturing Risks Are Managed. [hyperlink, http://www.gao.gov/products/GAO-10-439]. Washington, D.C.: April 22, 2010. GAO Review of the Department of Homeland Security's Certification of the Secure Flight Program--Cost and Schedule Estimates. [hyperlink, http://www.gao.gov/products/GAO-10-535R]. Washington, D.C.: April 5, 2010. Nuclear Nonproliferation: DOE Needs to Address Uncertainties with and Strengthen Independent Safety Oversight of Its Plutonium Disposition Program. [hyperlink, http://www.gao.gov/products/GAO-10-378]. Washington, D.C.: March 26, 2010. VA Construction: VA Is Working to Improve Initial Project Cost Estimates, but Should Analyze Cost and Schedule Risks. [hyperlink, http://www.gao.gov/products/GAO-10-189]. Washington, D.C.: December 14, 2009. Homeland Security: Key US-VISIT Components at Varying Stages of Completion, but Integrated and Reliable Schedule Needed. [hyperlink, http://www.gao.gov/products/GAO-10-13]. Washington, D.C.: November 19, 2009. Transportation Worker Identification Credential: Progress Made in Enrolling Workers and Activating Credentials but Evaluation Plan Needed to Help Inform the Implementation of Card Readers. [hyperlink, http://www.gao.gov/products/GAO-10-43]. Washington, D.C.: November 18, 2009. Nuclear Weapons: National Nuclear Security Administration Needs to Better Manage Risks Associated with Modernization of Its Kansas City Plant. [hyperlink, http://www.gao.gov/products/GAO-10-115]. Washington, D.C.: October 23, 2009. Military Readiness: DOD Needs to Strengthen Management and Oversight of the Defense Readiness Reporting System. [hyperlink, http://www.gao.gov/products/GAO-09-518]. Washington, D.C.: September 25, 2009. United Nations: Renovation Still Scheduled for Completion in 2013, but Risks to Its Schedule and Cost Remain. [hyperlink, http://www.gao.gov/products/GAO-09-870R]. Washington, D.C.: July 30, 2009. Aviation Security: TSA Has Completed Key Activities Associated with Implementing Secure Flight, but Additional Actions Are Needed to Mitigate Risks. [hyperlink, http://www.gao.gov/products/GAO-09-292]. Washington, D.C.: May 13, 2009.
Defense Acquisitions: Charting a Course for Lasting Reform. [hyperlink, http://www.gao.gov/products/GAO-09-663T]. Washington, D.C.: April 30, 2009. Defense Acquisitions: Measuring the Value of DOD's Weapon Programs Requires Starting with Realistic Baselines. [hyperlink, http://www.gao.gov/products/GAO-09-543T]. Washington, D.C.: April 1, 2009. Joint Strike Fighter: Accelerating Procurement before Completing Development Increases the Government's Financial Risk. [hyperlink, http://www.gao.gov/products/GAO-09-303]. Washington, D.C.: March 12, 2009. Department of Energy: Contract and Project Management Concerns at the National Nuclear Security Administration and Office of Environmental Management. [hyperlink, http://www.gao.gov/products/GAO-09-406T]. Washington, D.C.: March 4, 2009. Defense Acquisitions: DOD Must Balance Its Needs with Available Resources and Follow an Incremental Approach to Acquiring Weapon Systems. [hyperlink, http://www.gao.gov/products/GAO-09-431T]. Washington, D.C.: March 3, 2009. Defense Acquisitions: Perspectives on Potential Changes to Department of Defense Acquisition Management Framework. [hyperlink, http://www.gao.gov/products/GAO-09-295R]. Washington, D.C.: February 27, 2009. Defense Acquisitions: A Knowledge-Based Funding Approach Could Improve Major Weapon System Program Outcomes. [hyperlink, http://www.gao.gov/products/GAO-08-619]. Washington, D.C.: July 2, 2008. Best Practices: Increased Focus on Requirements and Oversight Needed to Improve DOD's Acquisition Environment and Weapon System Quality. [hyperlink, http://www.gao.gov/products/GAO-08-294]. Washington, D.C.: February 1, 2008. Best Practices: An Integrated Portfolio Management Approach to Weapon System Investments Could Improve DOD's Acquisition Outcomes. [hyperlink, http://www.gao.gov/products/GAO-07-388]. Washington, D.C.: March 30, 2007. Defense Acquisitions: Major Weapon Systems Continue to Experience Cost and Schedule Problems under DOD's Revised Policy. [hyperlink, http://www.gao.gov/products/GAO-06-368]. Washington, D.C.: April 14, 2006. Best Practices: Capturing Design and Manufacturing Knowledge Early Improves Acquisition Outcomes. [hyperlink, http://www.gao.gov/products/GAO-02-701]. Washington, D.C.: July 15, 2002. [End of section] Footnotes: [1] GAO, Missile Defense: Opportunity Exists to Strengthen Acquisitions by Reducing Concurrency, [hyperlink, http://www.gao.gov/products/GAO-12-486] (Washington, D.C.: April 20, 2012). [2] GAO, GAO Cost Estimating and Assessment Guide: Best Practices for Developing and Managing Capital Program Costs, [hyperlink, http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: March 2009); and GAO Schedule Assessment Guide Exposure Draft, [hyperlink, http://www.gao.gov/products/GAO-12-120G] (Washington, D.C.: May 2012). [3] "Not met" means the program provided no evidence that satisfies any of the best practices criterion. "Minimally" means the program provided evidence that satisfies a small portion of the criterion. "Partially" means the program provided evidence that satisfies about half of the criterion. "Substantially" means the program provided evidence that satisfies a large portion of the criterion. "Fully met" means the program provided evidence that completely satisfies the best practices criterion. [4] GAO products on these knowledge-based acquisition practices are listed at the end of this report in related GAO products. [5] [hyperlink, http://www.gao.gov/products/GAO-12-486]. 
[6] Information provided by MDA states that several of the programs are designed around specific deployment dates and that these dates limit the ability of the programs to follow the knowledge-based acquisition approach. [7] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. [8] [hyperlink, http://www.gao.gov/products/GAO-12-120G]. [9] See, for example, [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. [10] Aegis BMD is a sea-based missile defense system being developed in incremental, capability-based blocks to defend against ballistic missiles of all ranges. Key components include the shipboard SPY-1 radar, SM-3 missiles, and command and control systems. It also is used as a forward-deployed sensor for surveillance and tracking of ballistic missiles. The SM-3 missile has multiple versions in development or production. The first two variants are referred to as the SM-3 Block IA and SM-3 Block IB. [11] Program management officials provided information that the schedule has been changed since our analysis was conducted and the targets are now on track to be delivered for a summer 2013 test. [12] We have assessed 19 programs from DHS, 15 from DOD, 5 from DOE, 2 from State, and 3 from VA. [13] GAO reports that include the schedule assessment of government programs in other agencies are listed at the end of this report. [End of section] GAO’s Mission: The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO’s commitment to good government is reflected in its core values of accountability, integrity, and reliability. Obtaining Copies of GAO Reports and Testimony: The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO’s website [hyperlink, http://www.gao.gov]. Each weekday afternoon, GAO posts on its website newly released reports, testimony, and correspondence. To have GAO e-mail you a list of newly posted products, go to [hyperlink, http://www.gao.gov] and select “E-mail Updates.” Order by Phone: The price of each GAO publication reflects GAO’s actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO’s website, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information. Connect with GAO: Connect with GAO on Facebook, Flickr, Twitter, and YouTube. Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts. Visit GAO on the web at [hyperlink, http://www.gao.gov]. To Report Fraud, Waste, and Abuse in Federal Programs: Contact: Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]; E-mail: fraudnet@gao.gov; Automated answering system: (800) 424-5454 or (202) 512-7470. Congressional Relations: Katherine Siggerud, Managing Director, siggerudk@gao.gov, (202) 512-4400, U.S. Government Accountability Office, 441 G Street NW, Room 7125, Washington, DC 20548.
Public Affairs: Chuck Young, Managing Director, youngc1@gao.gov, (202) 512-4800 U.S. Government Accountability Office, 441 G Street NW, Room 7149 Washington, DC 20548.