This is the accessible text file for GAO report number GAO-11-53 
entitled 'DOD Business Transformation: Improved Management Oversight 
of Business System Modernization Efforts Needed' which was released on 
October 8, 2010. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as 
part of a longer term project to improve GAO products' accessibility. 
Every attempt has been made to maintain the structural and data 
integrity of the original printed product. Accessibility features, 
such as text descriptions of tables, consecutively numbered footnotes 
placed at the end of the file, and the text of agency comment letters, 
are provided but may not exactly duplicate the presentation or format 
of the printed version. The portable document format (PDF) file is an 
exact electronic replica of the printed version. We welcome your 
feedback. Please E-mail your comments regarding the contents or 
accessibility features of this document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 


Report to Congressional Requesters: 

United States Government Accountability Office:
GAO: 

October 2010: 

DOD Business Transformation: 

Improved Management Oversight of Business System Modernization Efforts 
Needed: 

GAO-11-53: 

GAO Highlights: 

Highlights of GAO-11-53, a report to congressional requesters. 

Why GAO Did This Study: 

The Department of Defense (DOD) invests billions of dollars annually 
to modernize its business systems, which have been on GAO’s high-risk 
list since 1995. DOD is in the process of implementing nine enterprise 
resource planning (ERP) efforts, which perform business-related tasks 
such as general ledger accounting and supply chain management. These 
efforts are essential to transforming DOD’s business operations. GAO 
was asked to (1) provide the status of the ERPs as of December 31, 
2009; (2) determine whether selected ERPs followed schedule and cost 
best practices; and (3) determine if DOD has defined the performance 
measures to assess whether the ERPs will meet their intended business 
capabilities. To accomplish these objectives, GAO reviewed data on the 
status of each ERP from the program management offices and 
interviewed the DOD and military departments’ chief management 
officers. 

What GAO Found: 

Based upon the data provided by DOD, six of the nine ERPs have 
experienced schedule delays ranging from 2 to 12 years and five have 
incurred cost increases ranging from $530 million to $2.4 billion. DOD 
has stated that the ERPs will replace over 500 legacy systems that 
cost hundreds of millions of dollars to operate annually. However, 
delays in implementing the ERPs require DOD to fund the legacy systems 
longer than anticipated, thereby reducing the funds available for 
other DOD priorities. In 2007, 2008, and 2009, GAO made 19 
recommendations to improve the management of DOD’s ERP efforts. While 
DOD agreed with the recommendations, 14 have not yet been fully 
implemented. 

GAO analyzed four of the nine ERPs to determine whether scheduling and 
cost estimating best practices were being followed. Regarding 
scheduling practices, GAO found that none of the programs had 
developed a fully integrated master schedule that could serve as an 
effective tool for managing the program. A reliable schedule is crucial 
to estimating the overall schedule and cost of a program. Without a 
reliable schedule, DOD is unable to predict, with any degree of 
confidence, if the estimated completion dates are realistic. Regarding 
the cost estimates, GAO found that although the four ERPs’ cost 
estimates generally met the criteria for three of the four best 
practices—well-documented, accurate, and comprehensive—three ERPs did 
not fully meet the credibility criteria because potential limitations 
were not discussed. More specifically, the three ERPs lacked a 
sensitivity analysis or a risk and uncertainty analysis as stipulated 
in GAO, Office of Management and Budget, and DOD guidance, thus 
diminishing the credibility of the estimates. 

While the ERPs are critical to transforming DOD’s business operations, 
DOD lacks a comprehensive set of performance measures to assess these 
systems and their contribution to transforming business operations. 
Management needs to define what constitutes a successful 
implementation in terms that can be used to assess whether the system 
is (1) being used as expected and (2) providing the intended benefits. 
Accordingly, the actual measures used to accomplish these objectives 
will differ depending on the system. For example, measures for a 
logistical system may focus on reducing inventory levels, while those 
for a financial system may focus on reducing prompt payment penalties. 
Without performance measures to evaluate how well the ERPs are 
accomplishing their intended goals, DOD decision makers do not have 
all the information they need to determine whether DOD investments are 
accomplishing their desired goals, and program managers do not have 
the information they need to ensure that their individual program is 
helping DOD to achieve business transformation and thereby improve 
upon its primary mission of supporting the warfighter. 

What GAO Recommends: 

In addition to reiterating its existing recommendations, GAO is making 
eight recommendations to the Secretary of Defense aimed at improving 
schedule and cost practices and the development of performance 
measures to evaluate whether the ERPs’ intended goals are being 
accomplished. DOD concurred with our recommendations and plans to take 
action to implement them. 

View [hyperlink, http://www.gao.gov/products/GAO-11-53] or key 
components. For more information, contact Asif A. Khan at (202) 512-
9095 or khana@gao.gov. 

[End of section] 

Contents: 

Letter: 

Background: 

Status of DOD's ERP Implementation Efforts: 

DOD Did Not Follow Key Best Practices for Estimating ERP Schedules and 
Cost, Resulting in Unreliable Estimates: 

ERP Success in Transforming Business Operations Has Not Been Defined 
or Measured: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendix I: Objective, Scope, and Methodology: 

Appendix II: Comments from the Department of Defense: 

Appendix III: Status of DOD's Actions on Previous GAO Recommendations 
Related to Business Systems Modernization: 

Appendix IV: Assessments of Four DOD ERP Programs' Integrated Master 
Schedules: 

Appendix V: Assessments of Four DOD ERP Program Cost Estimates: 

Appendix VI: GAO Contacts and Staff Acknowledgments: 

Tables: 

Table 1: Reported Full Deployment Schedule Slippage for Each ERP as of 
December 31, 2009: 

Table 2: Reported Original and Current Life-Cycle Cost Estimate for 
Each ERP as of December 31, 2009: 

Table 3: Defense Agencies' Scheduled Implementation of DAI: 

Table 4: Extent to Which Program Schedules Met Best Practices: 

Table 5: Extent Cost Estimates Met Best Practices: 

Table 6: Status of DOD's Actions to Address GAO Recommendations in GAO-
07-860: 

Table 7: Status of DOD's Actions to Address GAO Recommendations in GAO-
08-822: 

Table 8: Status of DOD's Actions to Address GAO Recommendations in GAO-
08-866: 

Table 9: Status of DOD's Actions to Address GAO Recommendations in GAO-
08-896: 

Table 10: Status of DOD's Actions to Address GAO Recommendations in 
GAO-09-841: 

Table 11: Analysis of the Air Force's DEAMS Program Schedule: 

Table 12: Analysis of the Air Force's ECSS Solutions Development 
Project Schedule: 

Table 13: Analysis of the Air Force's ECSS Reports, Interfaces, 
Conversions, and Extensions (RICE) Program Schedule: 

Table 14: Analysis of the Army's GFEBS Program Schedule: 

Table 15: Analysis of the Army's GCSS-Army Program Schedule: 

Table 16: The 12 Steps of High-Quality Cost Estimating, Mapped to the 
Steps of a High-Quality Cost Estimate: 

Table 17: Analysis of the Air Force's DEAMS Cost Estimate: 

Table 18: Analysis of the Air Force's ECSS Cost Estimate: 

Table 19: Analysis of the Army's GFEBS Cost Estimate: 

Table 20: Analysis of the Army's GCSS-Army Cost Estimate: 

Figure: 

Figure 1: DOD's Fiscal Year 2011 Business Systems Budget Request by 
DOD Components (Dollars in Thousands): 

Abbreviations: 

ATEC: Army Test and Evaluation Command: 

BPR: business process reengineering: 

BSM: Business System Modernization: 

BTA: Business Transformation Agency: 

CARD: Cost Analysis Requirements Document: 

COTS: commercial off-the-shelf: 

DAI: Defense Agencies Initiative: 

DBSMC: Defense Business Systems Management Committee: 

DCMO: deputy chief management officer: 

DEAMS: Defense Enterprise Accounting and Management System: 

DID: Data Item Description: 

DIMHRS: Defense Integrated Military Human Resources System: 

DLA: Defense Logistics Agency: 

DOD: Department of Defense: 

EBS: Enterprise Business System: 

ECSS: Expeditionary Combat Support System: 

ERAM: Enterprise Risk Assessment Methodology: 

ERP: enterprise resource planning: 

EVM: earned value management: 

FFP: firm-fixed price: 

GCSS-Army: Global Combat Support System-Army: 

GCSS-MC: Global Combat Support System-Marine Corps: 

GFEBS: General Fund Enterprise Business System: 

IMS: integrated master schedule: 

IOT&E: initial operational test and evaluation: 

IPPS-A: Integrated Personnel and Pay System-Army: 

IRB: investment review board: 

IT: information technology: 

IUID: item-unique identification: 

IV&V: independent verification and validation: 

LMP: Logistics Modernization Program: 

MAIS: major automated information system: 

MDA: Milestone Decision Authority: 

MDAP: major defense acquisition program: 

MSO: Must Start On: 

NAVAIR: Naval Air Systems Command: 

Navy ERP: Navy Enterprise Resource Planning: 

NTC: National Training Center: 

OMB: Office of Management and Budget: 

PCA: pre-certification authority: 

PMO: program management office: 

RICE: reports, interfaces, conversions, and extensions: 

SFIS: Standard Financial Information Structure: 

TAV: total asset visibility: 

[End of section] 

United States Government Accountability Office:
Washington, DC 20548: 

October 7, 2010: 

Congressional Requesters: 

The Department of Defense's (DOD) business systems[Footnote 1] 
modernization program has been on our high-risk list[Footnote 2] since 
1995 because of the size, complexity, and significance of the related 
efforts. DOD's business systems modernization entails investments in 
and the implementation of comprehensive, integrated business systems 
for managing an organization's resources, commonly referred to as 
enterprise resource planning (ERP)[Footnote 3] systems, and the 
elimination of hundreds of legacy systems. DOD officials have said 
that successful implementation of ERPs is key to resolving the long-
standing weaknesses in the department's business operations in areas 
such as business transformation, financial management, and supply 
chain management,[Footnote 4] and improving the department's 
capability to provide DOD management and the Congress with accurate 
and reliable information on the results of its operations. 

DOD has identified 10 ERPs,[Footnote 5] 1 of which has been fully 
implemented, as essential to its efforts to transform its business 
operations. According to DOD, as of December 2009, it had invested 
approximately $5.8 billion to develop and implement these ERPs and 
will invest additional billions before the remaining 9 ERPs are fully 
implemented. Our prior reviews of several ERPs have found that the 
department has not effectively employed acquisition management 
controls or delivered the promised capabilities on time and within 
budget.[Footnote 6] 

This report provides information to support your continuing oversight 
of DOD's progress in modernizing its business systems to address long- 
standing weaknesses and ultimately to transform its business 
operations. As agreed with your office, our objectives were to (1) 
provide the status, as of December 31, 2009, of the nine ERPs DOD 
identified as essential to transforming its business operations, (2) 
assess the scheduling and cost estimating practices of selected ERPs 
to determine the extent to which the program management offices (PMO) 
were applying best practices, and (3) ascertain whether DOD and the 
military departments have defined the performance measures to 
determine whether the systems will meet their intended business 
capabilities. 

To address the first objective, we reviewed status information 
obtained from each PMO, such as the reported amount of funds expended 
on the implementation of the nine ERPs, the estimated number of legacy 
systems to be replaced by each ERP, and the reported annual cost of 
maintaining these legacy systems. We also reviewed past GAO reports 
[Footnote 7] that were specific to the department's efforts to 
implement the nine ERPs to identify prior recommendations and assess 
DOD's progress in addressing the 19 recommendations discussed in these 
reports. 

For the purposes of this report, we did not include information on the 
Defense Logistics Agency (DLA) Business System Modernization (BSM)/ 
Enterprise Business System (EBS). According to DLA, the BSM effort was 
fully implemented in July 2007 and transformed how the agency 
conducts its operations in five core business processes: order 
fulfillment, demand and supply planning, procurement, 
technical/quality assurance, and financial management. Subsequently, 
in September 2007, the name of the program was changed to the EBS, 
which is a continuation of the ERP's capabilities to support internal 
agency operations. 

To address the second objective, we assessed the scheduling and cost 
estimating practices for four of the nine ERPs[Footnote 8] to 
determine the extent to which the PMOs were applying best practices 
for scheduling and cost estimating. For the four ERPs, we obtained and 
analyzed the most current schedule and cost estimate for each program 
and compared them against the criteria set forth in GAO's cost guide. 
[Footnote 9] In using the guide, we determined the extent to which the 
schedule was prepared in accordance with the best practices[Footnote 
10] that are fundamental to having a reliable schedule. In assessing 
each program's cost estimates, we used the GAO cost guide to evaluate 
the PMOs' estimating methodologies, assumptions, and results to 
determine whether the cost estimates were comprehensive, accurate, 
well-documented, and credible. We did not conduct detailed schedule 
and cost assessments for the remaining five programs because (1) the 
implementation strategy has not been fully defined for two of the 
ERPs, (2) one of the ERPs is near full deployment, and (3) we have 
previously reported[Footnote 11] on two ERPs' schedule and cost 
estimating practices. 

To address the third objective, we reviewed the extent to which DOD 
and the military departments included performance measures in their 
congressional reports on business transformation. In addition, we met 
with the military departments' deputy chief management officers (DCMO) 
to obtain an understanding of how they define success in terms of 
deploying their respective ERPs. We also met with the DOD DCMO and the 
Director of the Business Transformation Agency (BTA) to obtain an 
understanding of their respective roles and responsibilities in the 
oversight of DOD's ERP implementation efforts. Additional details on 
our scope and methodology are presented in appendix I. 

We conducted this performance audit from June 2009 through October 
2010 in accordance with generally accepted government auditing 
standards. Those standards require that we plan and perform the audit 
to obtain sufficient, appropriate evidence to provide a reasonable 
basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for 
our findings and conclusions based on our audit objectives. We 
requested comments on a draft of this report from the Secretary of 
Defense or his designee. We received written comments from the Deputy 
Chief Management Officer, which are reprinted in appendix II. 

Background: 

DOD is one of the largest and most complex organizations in the world. 
In fiscal year 2009, DOD reported that its operations consisted of 
$1.8 trillion in assets, $2.2 trillion in liabilities, approximately 
3.2 million military and civilian personnel--including active and 
reserve components--and disbursements of over $947 billion.[Footnote 
12] Execution of these operations spans a wide range of defense 
organizations, including the military departments and their respective 
major commands and functional activities, large defense agencies and 
field activities, and various combatant and joint operational commands 
that are responsible for military operations for specific geographic 
regions or theaters of operation. To execute military operations, the 
department performs interrelated and interdependent business 
functions, including financial management, logistics management, 
health care management, and procurement. To support its business 
functions, DOD has reported that it relies on about 2,080 business 
systems,[Footnote 13] including accounting, acquisition, logistics, 
and personnel systems. 

Funding of DOD's Business Systems: 

To fund its existing business systems environment, DOD requested 
nearly $17.4 billion for fiscal year 2011 to operate, maintain, and 
modernize its reported 2,080 business systems (see fig. 1). Of this 
amount, about $12.2 billion is for operations and maintenance and the 
remaining $5.2 billion is for planned or ongoing DOD business systems 
development modernization efforts. 

Figure 1: DOD's Fiscal Year 2011 Business Systems Budget Request by 
DOD Components (Dollars in Thousands): 

[Refer to PDF for image: illustrated table] 

Component: Army; 
Current services: $3,031,957;
Development/modernization: $1,768,150; 
Total: $4,800,107; 
Percent: 27.7%. 

Component: Air Force; 
Current services: $2,323,175; 
Development/modernization: $1,666,103; 
Total: $3,989,278; 
Percent: 23.0%. 

Component: Navy; 
Current services: $2,310,296; 
Development/modernization: $536,492; 
Total: $2,846,788; 
Percent: 16.4%. 

Component: TRICARE Management Activity; 
Current services: $1,403,434; 
Development/modernization: $403,857; 
Total: $1,807,291; 
Percent: 10.4%. 

Component: Defense Logistics Agency; 
Current services: $763,438; 
Development/modernization: $160,478; 
Total: $923,916; 
Percent: 5.3%. 

Component: Defense Information Systems Agency; 
Current services: $689,584; 
Development/modernization: $25,190; 
Total: $714,774; 
Percent: 4.1%. 

Component: Defense Finance and Accounting Service; 
Current services: $397,239; 
Development/modernization: $29,812; 
Total: $427,051; 
Percent: 2.5%. 

Component: Defense Human Resources Activity; 
Current services: $253,215; 
Development/modernization: $67,950; 
Total: $321,165; 
Percent: 1.9%. 

Component: Transportation Command; 
Current services: $186,370; 
Development/modernization: $101,973; 
Total: $288,343; 
Percent: 1.7%. 

Component: Business Transformation Agency; 
Current services: $53,332; 
Development/modernization: $145,190; 
Total: $198,522; 
Percent: 1.2%. 

Component: Washington Headquarters Service; 
Current services: $153,579; 
Development/modernization: $27,119; 
Total: $180,698; 
Percent: 1.0%. 

Component: Missile Defense Agency; 
Current services: $0; 
Development/modernization: $152,208; 
Total: $152,208; 
Percent: 0.8%. 

Component: Defense Commissary Agency; 
Current services: $117,861; 
Development/modernization: $3,616; 
Total: $121,477; 
Percent: 0.7%. 

Component: Defense Contract Management Agency; 
Current services: $103,391; 
Development/modernization: $13,933; 
Total: $117,324; 
Percent: 0.7%. 

Component: Department of Defense Dependents Education; 
Current services: $94,590; 
Development/modernization: $0; 
Total: $94,590; 
Percent: 0.6%. 

Component: Office of the Secretary of Defense; 
Current services: $29,755; 
Development/modernization: $49,098; 
Total: $78,853; 
Percent: 0.5%. 

Component: Joint Chiefs of Staff; 
Current services: $60,151; 
Development/modernization: $10,963; 
Total: $71,114; 
Percent: 0.4%. 

Component: Other DOD components; 
Current services: $180,479; 
Development/modernization: $23,104; 
Total: $203,583; 
Percent: 1.2%. 

Component: Total; 
Current services: $12,151,846; 
Development/modernization: $5,185,236; 
Total: $17,337,082; 
Percent: 100%. 

Source: GAO based upon fiscal year 2011 budget request data provided 
by DOD. These data have not been validated. 

[End of figure] 

The Office of Management and Budget (OMB) requires that funds 
requested for information technology (IT) projects be classified as 
either "steady state" (or "current services" in DOD) or as 
"development/modernization." Current services represents funds for 
operating and maintaining systems at current levels (i.e., without 
major enhancements). The development modernization budget category 
represents funds for developing new IT systems or making major 
enhancements to existing systems. Some systems have both current 
services and development modernization funding. While current services 
are to be used for operating the system at various locations, 
development modernization funds are to be used for activities such as 
developing and expanding system functionality at existing locations 
and deploying the system to new locations. Generally, current services 
are financed through Operation and Maintenance appropriations, whereas 
development modernization funding can come from several or a 
combination of several appropriations, such as Research, Development, 
Test, and Evaluation; Procurement; or the Defense Working Capital Fund. 

DOD's Acquisition System Framework: 

ERPs are developed within the defense acquisition system framework, 
which is intended to translate mission needs and requirements into 
stable, affordable, and well-managed acquisition programs.[Footnote 
14] The defense acquisition system framework was updated in December 
2008 and consists of five program life-cycle phases and three related 
milestone decision points, which are described below. 

* Materiel solution analysis (previously concept refinement). The 
purpose of this phase is to refine the initial system solution 
(concept) and create a strategy for acquiring the solution. A decision 
is made at the end of this phase (Milestone A) regarding whether to 
move to the next phase. 

- Milestone A authorizes the acquisition program and grants permission 
to begin planning and developing the system technology. 

* Technology development. The purpose of this phase is to determine the 
appropriate set of technologies to be integrated into the investment 
solution by iteratively assessing the viability of the various 
technologies while simultaneously refining user requirements. Once the 
technology has been demonstrated, a decision is made (Milestone B) 
whether to move to the next phase. 

- Milestone B authorizes product development of the program based on 
well-defined technology and a reasonable system design plan. 

* Engineering and manufacturing development (previously system 
development and demonstration). The purpose of this phase is to 
develop a system and demonstrate through developer testing that the 
system can function in its target environment. A decision is made at 
the end of this phase (Milestone C) whether to move to the next phase. 

- Milestone C authorizes entry of the system into the production and 
deployment phase or into limited deployment in support of operational 
testing. 

* Production and deployment. The purpose of this phase is to achieve 
an operational capability that satisfies the mission needs, as 
verified through independent operational test and evaluation, and to 
implement the system at all applicable locations. 

* Operations and support. The purpose of this phase is to 
operationally sustain the system in the most cost-effective manner 
over its life cycle. 

Overview of DOD Business Systems Investment Review Process: 

In 2005, DOD adopted a "tiered accountability" approach to improve 
control and accountability over the billions of dollars it invests 
annually in DOD business systems. Under this approach, executive 
leadership for the direction, oversight, and execution of DOD 
investments is the responsibility of several entities within DOD and 
its components. As indicated below, the investment control process 
begins at the component level and works its way up through a hierarchy 
of review and approval authorities, depending on the size and 
significance of the investment.[Footnote 15] 

* The Defense Business Systems Management Committee (DBSMC) serves as 
the highest-ranking governance body for business systems modernization 
activities and approves funding requests for investments costing more 
than $1 million within the department. 

* Investment review boards (IRB)[Footnote 16] are responsible for the 
review, approval, and oversight of the planning, design, acquisition, 
deployment, operation, maintenance, and modernization of defense 
business systems. The IRBs are also responsible for recommending 
business systems to the DBSMC for certification[Footnote 17]--which 
equates to recommending funding--for all business system investments 
costing more than $1 million. 

* The Milestone Decision Authority (MDA) is the senior DOD official 
who has overall authority to approve entry of an acquisition program 
into the next phase of the acquisition process and is accountable for 
cost, schedule, and performance reporting, including congressional 
reporting. 

* DOD Component Acquisition Executive is responsible for providing a 
written memorandum to the MDA through the cognizant IRB that (1) 
states that the program complies with applicable DOD statutory and 
regulatory requirements, (2) describes any conditions or issues 
applicable to the requested acquisition decision, and (3) recommends 
approval of the acquisition decision request. 

* A DOD component pre-certification authority (PCA) acts as the 
component's principal point of contact with the IRBs. The PCA is 
responsible for identifying the component's systems that require IRB 
certifications and for preparing, reviewing, approving, and validating 
investment documentation as required. The PCA also submits to the 
appropriate IRB the component's precertification memorandum that 
asserts the status and validity of the business system's investment 
information during the certification and annual review processes. 

The MDA, the IRBs, the DBSMC, or a combination of these can place 
conditions or issues needing resolution on individual programs 
during the defense business system's funding certification and 
acquisition decision review processes. These conditions are generally 
noted in a memorandum. Further, DOD's business investment management 
system includes two types of reviews for business systems: 
certification and annual reviews. Certification reviews apply to 
modernization projects with total costs over $1 million. These reviews 
focus on program alignment with the business enterprise architecture 
and must be completed before components obligate funds for programs. 
As noted above, the IRBs recommend certification to the DBSMC, which 
approves the expenditure of funds. The annual reviews apply to all 
business programs and are undertaken to determine whether the system 
development effort is meeting its milestones and addressing its 
certification conditions. 

Additionally, the Duncan Hunter National Defense Authorization Act for 
Fiscal Year 2009 directs that the executive-level oversight of DOD-
wide business systems modernization and overall business 
transformation--including defining and measuring success in enterprise 
resource planning--is the responsibility of a military department-
level chief management officer and the DCMO.[Footnote 18] 

DOD's ERP Efforts: 

The department stated that the following nine ERPs are critical to 
transforming its business operations and addressing some 
of its long-standing weaknesses. A brief description of each ERP is 
presented below. 

* The General Fund Enterprise Business System (GFEBS) is intended to 
support the Army's standardized financial management and accounting 
practices for the Army's general fund,[Footnote 19] with the exception 
of that related to the Army Corps of Engineers, which will continue to 
use its existing financial system, the Corps of Engineers Financial 
Management System.[Footnote 20] GFEBS will allow the Army to share 
financial, asset, and accounting data across the active Army, the Army 
National Guard, and the Army Reserve. The Army estimates that when 
fully implemented, GFEBS will be used to control and account for about 
$140 billion in spending. 

* The Global Combat Support System-Army (GCSS-Army) is expected to 
integrate multiple logistics functions by replacing numerous legacy 
systems and interfaces. The system will provide tactical units with a 
common authoritative source for financial and related non-financial 
data, such as information related to maintenance and transportation of 
equipment. The system is also intended to provide asset visibility for 
accountable items. GCSS-Army will manage over $49 billion in annual 
spending by the active Army, National Guard, and the Army Reserve. 

* The Logistics Modernization Program (LMP) is intended to provide 
order fulfillment, demand and supply planning, procurement, asset 
management, material maintenance, and financial management 
capabilities for the Army's working capital fund. The Army has 
estimated that LMP will be populated with 6 million Army-managed 
inventory items valued at about $40 billion when it is fully 
implemented. 

* The Navy Enterprise Resource Planning System (Navy ERP) is intended 
to standardize the acquisition, financial, program management, 
maintenance, plant and wholesale supply, and workforce management 
capabilities at six Navy commands.[Footnote 21] Once it is fully 
deployed, the Navy estimates that the system will control and account 
for approximately $71 billion, or 50 percent, of the Navy's estimated 
appropriated funds--after excluding the appropriated funds for the 
Marine Corps and military personnel and pay. 

* The Global Combat Support System-Marine Corps (GCSS-MC) is intended 
to provide the deployed warfighter enhanced capabilities in the areas 
of warehousing, distribution, logistical planning, depot maintenance, 
and improved asset visibility. According to the PMO, once the system 
is fully implemented, it will control and account for approximately 
$1.2 billion of inventory. 

* The Defense Enterprise Accounting and Management System (DEAMS) is 
intended to provide the Air Force the entire spectrum of financial 
management capabilities, including collections, commitments and 
obligations, cost accounting, general ledger, funds control, receipts 
and acceptance, accounts payable and disbursement, billing, and 
financial reporting for the general fund. According to Air Force 
officials, when DEAMS is fully operational, it is expected to maintain 
control and accountability for about $160 billion. 

* The Expeditionary Combat Support System (ECSS) is intended to 
provide the Air Force a single, integrated logistics system--including 
transportation, supply, maintenance and repair, engineering and 
acquisition--for both the Air Force's general and working capital 
funds. Additionally, ECSS is intended to provide the financial 
management and accounting functions for the Air Force's working 
capital fund operations. When fully implemented, ECSS is expected to 
control and account for about $36 billion of inventory. 

* The Service Specific Integrated Personnel and Pay Systems are 
intended to provide the military departments an integrated personnel 
and pay system.[Footnote 22] 

* Defense Agencies Initiative (DAI) is intended to modernize the 
defense agencies' financial management processes by streamlining 
financial management capabilities and transforming the budget, 
finance, and accounting operations. When DAI is fully implemented, it 
is expected to have the capability to control and account for all 
appropriated, working capital and revolving funds at the defense 
agencies implementing the system. 

Status of DOD's ERP Implementation Efforts: 

Based upon the information provided by the PMOs, six of the ERPs have 
experienced schedule slippages (see table 1), as measured by comparing 
the date that each program was originally scheduled to achieve full 
deployment[Footnote 23] to the estimated full deployment date as of 
December 2009. For the remaining three ERPs, the full deployment date 
has either remained unchanged or has not been established. The GFEBS 
PMO noted that the acquisition program baseline approved in November 
2008 established a full deployment date in fiscal year 2011 and that 
date remains unchanged. Additionally, according to the GCSS-Army PMO, a 
full deployment date has not been established for this effort. The PMO 
noted that a full deployment date will not be established for the 
program until a full deployment decision has been approved by the 
department. A specific timeframe has not been established for when the 
decision will be made. Further, in the case of DAI, the original full 
deployment date was scheduled for fiscal year 2012, but the PMO is in 
the process of reevaluating the date and a new date has not yet been 
established. 

Table 1: Reported Full Deployment Schedule Slippage for Each ERP as of 
December 31, 2009: 

Army: 

Component/system name: GFEBS; 
Originally scheduled fiscal year for full deployment: 2011; 
Actual or latest estimated fiscal year for full deployment: 2011; 
Schedule slippage: None. 

Component/system name: GCSS-Army; 
Originally scheduled fiscal year for full deployment: [A]; 
Actual or latest estimated fiscal year for full deployment: [A]; 
Schedule slippage: Not applicable. 

Component/system name: LMP; 
Originally scheduled fiscal year for full deployment: 2005; 
Actual or latest estimated fiscal year for full deployment: 2011; 
Schedule slippage: 6 years. 

Navy: 

Component/system name: Navy ERP; 
Originally scheduled fiscal year for full deployment: 2011; 
Actual or latest estimated fiscal year for full deployment: 2013; 
Schedule slippage: 2 years. 

Component/system name: GCSS-MC; 
Originally scheduled fiscal year for full deployment: 2010; 
Actual or latest estimated fiscal year for full deployment: 2013; 
Schedule slippage: 3 years[B]. 

Air Force: 

Component/system name: DEAMS; 
Originally scheduled fiscal year for full deployment: 2014; 
Actual or latest estimated fiscal year for full deployment: 2017; 
Schedule slippage: 3 years. 

Component/system name: ECSS; 
Originally scheduled fiscal year for full deployment: 2012; 
Actual or latest estimated fiscal year for full deployment: 2016; 
Schedule slippage: 4 years. 

DOD: 

Component/system name: Service Specific Integrated Personnel and Pay 
Systems; 
Originally scheduled fiscal year for full deployment: 2006; 
Actual or latest estimated fiscal year for full deployment: Army--2014; 
Navy--2017; 
Air Force--2018; 
Schedule slippage: 12 years[C]. 

Component/system name: DAI; 
Originally scheduled fiscal year for full deployment: 2012; 
Actual or latest estimated fiscal year for full deployment: [D]; 
Schedule slippage: Not applicable. 

Source: DOD program management offices. 

[A] The PMO has not yet determined the full deployment date. 

[B] The PMO stated that the estimated full deployment date is only for 
phase 1. The full deployment date for the entire program has not yet 
been determined. 

[C] Originally, this ERP was referred to as the Defense Integrated 
Military Human Resources System (DIMHRS) and was intended to provide a 
joint, integrated, standardized personnel/pay system for all military 
personnel departmentwide. The original full deployment date represents 
the estimated date for DIMHRS. Each military service is now 
responsible for developing its own integrated personnel and pay system. 

[D] As of December 2009, the DAI PMO had not determined the revised 
full deployment date. 

[End of table] 

Besides schedule slippages, five of the ERP efforts have reported a 
cost increase and one program--GFEBS--reported a cost decrease of $17 
million (see table 2). The reported life-cycle[Footnote 24] cost 
estimate for GCSS-MC only represents the estimated cost for 
phase[Footnote 25] 1 of the program. The cost of the remaining phases 
has not yet been determined; therefore, a total life-cycle cost 
estimate for the entire program is not available. Additionally, 
a current life-cycle cost estimate has not been determined for the 
Service Specific Integrated Personnel and Pay Systems and DAI. 

Table 2: Reported Original and Current Life-Cycle Cost Estimate for 
Each ERP as of December 31, 2009: 

Dollars in millions. 

Army: 

Component/system name: GFEBS; 
Original life-cycle cost estimate: $1,354; 
Current life-cycle cost estimate: $1,337; 
Reported cost increase: $(17). 

Component/system name: GCSS-Army; 
Original life-cycle cost estimate: $3,900; 
Current life-cycle cost estimate: $3,900; 
Reported cost increase: 0. 

Component/system name: LMP; 
Original life-cycle cost estimate: $2,630; 
Current life-cycle cost estimate: $2,630[A]; 
Reported cost increase: 0. 

Navy: 

Component/system name: Navy ERP; 
Original life-cycle cost estimate: $1,870; 
Current life-cycle cost estimate: $2,400; 
Reported cost increase: $530. 

Component/system name: GCSS-MC; 
Original life-cycle cost estimate: $126; 
Current life-cycle cost estimate: $934; 
Reported cost increase: $808[B]. 

Air Force: 

Component/system name: DEAMS; 
Original life-cycle cost estimate: $1,100; 
Current life-cycle cost estimate: $2,048; 
Reported cost increase: $948. 

Component/system name: ECSS; 
Original life-cycle cost estimate: $3,000; 
Current life-cycle cost estimate: $5,200; 
Reported cost increase: $2,200[C]. 

DOD: 

Component/system name: Service Specific Integrated Personnel and Pay 
Systems; 
Original life-cycle cost estimate: $577[D]; 
Current life-cycle cost estimate: Army[D]; Navy-$1,300; Air Force-
$1,700; 
Reported cost increase: At least $2,423. 

Component/system name: DAI; 
Original life-cycle cost estimate: $209; 
Current life-cycle cost estimate: [E]; 
Reported cost increase: Not applicable. 

Source: DOD Program Management Offices. 

[A] At the time LMP was designated as a major automated information 
system (MAIS) program in December 2007, it was required to comply with 
the DOD guidance for MAIS programs. This guidance requires, among 
other things, that a MAIS program have a completed and approved 
acquisition program baseline--the baseline description of the program, 
including the life-cycle cost estimate--prior to Milestone B approval. 
The $2.6 billion is the only life-cycle cost estimate that has been 
developed for the program. 

[B] The current life-cycle cost estimate for GCSS-MC is for phase 1. 
The remaining two phases will have separate baselines. 

[C] Originally, ECSS was to be implemented in three phases, but now, 
it will be implemented in four phases. 

[D] The original life-cycle cost estimate represents the estimate for 
DIMHRS. While the Navy and Air Force have estimated their respective 
life-cycle costs, the Army is in the process of completing its life-
cycle cost estimate. The reported increase of at least $2,423 million 
therefore reflects only the Navy and Air Force estimates ($1,300 
million plus $1,700 million, less the original $577 million estimate). 

[E] As of December 2009, the life-cycle cost estimate for DAI had not 
been finalized. According to the PMO, the life-cycle cost estimate is 
expected to be approved at Milestone B in fiscal year 2011. 

[End of table] 

According to the PMOs, while there have been schedule slippages and 
cost increases for several of the nine ERP efforts, the functionality 
that was envisioned and planned when each program was initiated 
remains the same today. While the original intent of each program 
remains the same, the anticipated savings that were to accrue to the 
department may not be fully realized. Delays in implementing the ERPs 
result in DOD having to fund the operation and maintenance of the 
legacy systems longer than anticipated, thereby reducing funds that 
could be used for other DOD priorities. 

Furthermore, we have previously reported on the department's efforts 
to implement some of the ERPs and made 19 recommendations to improve 
DOD's management and oversight of these efforts. As of October 2010, 
the department has taken sufficient action to implement 5 of the 
recommendations. Appendix III provides details on the specific 
recommendations and the department's efforts to address them. The 
following information describes in more detail the status of each ERP. 

General Fund Enterprise Business System: 

Figure: DOD Program Data for GFEBS, as of December 31, 2009: 

[Refer to PDF for image: text box] 

Date of initiation: October 2004. 

Program owner: Assistant Secretary of the Army for Financial 
Management and Comptroller. 

Reported life-cycle cost estimate: $1,336.7 million: 
* Development and Modernization: $642.4 million; 
* Operations and Maintenance: $694.3 million. 

Reported amount expended: $416.8 million. 

Reported legacy systems to be replaced: 87. 

Reported annual cost of maintaining legacy systems: $57.8 million. 

Number of system interfaces: 56. 

Date of last certification of funding: September 2, 2009, by the DBSMC. 

Number of system users: 79,000. 

Number of locations: 200. 

Source: DOD’s GFEBS Program Management Office. These data have not 
been validated. 

[End of figure] 

Program Status: 

According to the GFEBS PMO, the system will be implemented in four 
phases. Phases 1 and 2 were completed in October 2008 and provided 
full functionality to 250 users at the Management Command, Fort 
Jackson, South Carolina. The implementation of phase 2 set the stage 
for GFEBS to be deployed to the rest of the Army. The PMO currently 
estimates that phases 3 and 4 will be deployed Army-wide with full 
functionality by December 2011. PMO officials told us that the 
establishment of the December 2011 milestone resulted from conditions 
placed on the GFEBS program at Milestone B, directing the Army to 
develop an integrated strategy for the implementation of GFEBS and 
GCSS-Army--meaning that both systems were to be implemented using a 
standard configuration and set of common master data.[Footnote 26] The 
PMO also stated that the original life-cycle cost estimate of 
approximately $1.3 billion covering fiscal years 2005 through 2022 
remained unchanged as of December 31, 2009. 

On May 30, 2009, GFEBS was authorized by the MDA to proceed with a 
limited deployment to initial operational test and evaluation (IOT&E) 
[Footnote 27] sites. In January 2010 and again in March 2010, the 
GFEBS program was authorized to continue its deployment to a limited 
number of sites. According to the MDA, this limited deployment process 
allows the program to gain additional operational experience with the 
GFEBS application and conduct additional user testing. MDA approval is 
required for deployment to additional sites and full deployment. 
Before GFEBS is granted approval for full deployment of phase 4, 
the PMO must address several conditions[Footnote 28] that were placed 
on the program by the MDA. According to the PMO, all of the conditions 
were addressed in December 2009 and presented to the IRB for approval. 
However, the decision on the deployment of the system to additional 
locations is pending and scheduled to occur during fiscal year 2010. 

In December 2009, the U.S. Army Test and Evaluation Command (ATEC) 
reported on concerns with GFEBS's data accuracy, reliability, and 
timeliness.[Footnote 29] More specifically, the report noted that Army 
installations were "certifying year-end data with caveats and notes 
related to inaccurate, incomplete, and missing data." Furthermore, the 
report noted that "because of incomplete or not implemented business 
processes, users at times executed their mission using the 
'workarounds' of the legacy systems that the GFEBS is intended to 
replace or subsume." The report recommended that the deployment of 
GFEBS be limited until the problems are resolved and the corrective 
actions have been validated by ATEC. According to the PMO, in 
conjunction with ATEC, a plan of action and milestones has been 
developed to address the issues. The PMO noted that GFEBS is 
undergoing an additional operational test and evaluation limited user 
test; and at the conclusion of the testing, a determination will be 
made whether the ATEC issues have been addressed. 

Global Combat Support System-Army: 

Figure: DOD Program Data for GCSS-Army, as of December 31, 2009: 

[Refer to PDF for image: text box] 

Date of initiation: December 2003[A]. 

Program owner: Army Deputy Chief of Staff for Logistics. 

Reported life-cycle cost estimate: $3.9 billion: 
* Development and Modernization: $1.8 billion; 
* Operations and Maintenance: $2.1 billion. 

Reported amount expended: $581 million. 

Reported legacy systems to be replaced: 7. 

Reported annual cost of maintaining legacy systems: $63 million. 

Number of system interfaces: 106. 

Date of last certification of funding: September 2, 2009, by the DBSMC. 

Number of system users: 169,880. 

Number of locations: 379. 

Source: DOD's GCSS-Army Program Management Office. These data have not 
been validated. 

[A] Prior to the initiation of the current ERP effort, the Army had 
been developing custom software since May 1997. 

[End of figure] 

Program Status: 

GCSS-Army is being implemented in three phases. Phases 1 and 2 are 
proof-of-concept demonstrations that have been ongoing since December 
2007 at the National Training Center (NTC) at Fort Irwin, California; 
testing and evaluation are scheduled to be completed in January 2012. 
The GCSS-Army team is conducting critical activities, 
such as data cleansing and training users at the NTC site. Phase 3 is 
intended to provide full functionality and is scheduled to begin 
implementation in October 2013, but a full deployment date has not yet 
been determined. According to the PMO, the exact locations for the 
implementation of phase 3 have not been determined because the 
deployment schedule by specific location has not yet been finalized. 

In July 2008, the MDA, in approving Milestone B, directed GCSS-Army to 
develop and implement a strategy to better facilitate interactions 
with GFEBS and LMP. Under the federated strategy, GCSS-Army will use 
GFEBS's financial template to allow the Army to integrate data on 
logistics, financial, maintenance, property, and accountability of 
assets. This strategy is intended to standardize transactional input 
and business processes across the Army ERPs to enable common cost 
management activities; provide accurate, reliable, and real-time data; 
and tie budgets to execution. According to the PMO, this change in 
implementation strategy resulted in: 

* the Cost Analysis Improvement Group's direction that an additional 
year of support be added to the cost estimate because of the 
additional time needed to deploy the system; and 

* a revised strategy that resulted in an increase in the number of 
required reports, interfaces, conversions, and extensions that need to 
be developed or tested for GCSS-Army's integration with GFEBS. 

Logistics Modernization Program: 

Figure: DOD Program Data for LMP, as of December 31, 2009: 

[Refer to PDF for image: text box] 

Date of initiation: December 1999. 

Program owner: Army Materiel Command. 

Reported life-cycle cost estimate: $2.630 billion: 
* Development and Modernization: $637 million; 
* Operations and Maintenance: $1.993 billion. 

Reported amount expended: $1.1 billion. 

Reported legacy systems to be replaced: 2. 

Reported annual cost of maintaining legacy systems: $25 million. 

Number of system interfaces: 27. 

Date of last certification of funding: September 2, 2009, by the DBSMC. 

Number of system users: 21,000. 

Number of locations: 104. 

Source: DOD's LMP Program Management Office. These data have not been 
validated. 

[End of figure] 

Program Status: 

LMP was deployed at the Army Communications-Electronics Command and 
Tobyhanna Army Depot in July 2003. In May 2009, the second deployment 
of LMP became operational at the Army Aviation and Missile Command and 
Corpus Christi and Letterkenny Army Depots. The final deployment of 
LMP is scheduled to occur in October 2010 at the Army Sustainment 
Command, the Joint Munitions and Lethality Command, the Tank-
automotive and Armaments Command, and the Anniston and Red River Army 
Depots. 

LMP has experienced schedule slippages primarily because requirements 
management[Footnote 30] and system testing were ineffective, as we 
reported in May 2004[Footnote 31] and June 2005.[Footnote 32] For 
example, at the Tobyhanna Army Depot deployment in fiscal year 2003, 
customers were not being properly billed for work performed, which 
affected the accurate recording of revenue, and account balances could 
not be reconciled when transferred from the legacy systems to LMP. As 
a result, the full deployment date of the system has slipped by 6 
years. 

Furthermore, in April 2010,[Footnote 33] we reported that the Army's 
management processes for ensuring data reliability that were 
established prior to the second deployment of LMP were not effective. 
Specifically, the Army was unable to ensure that the data used by LMP 
were of sufficient quality to enable the depots to perform their day- 
to-day missions after LMP became operational. As a result of these 
data quality issues, depot personnel had to develop and use manual 
work-around processes until they could correct the data in LMP, which 
prevented the Army from achieving the expected benefits from LMP. Data 
quality issues occurred despite improvements made by the Army to 
address similar issues experienced during the first deployment of LMP 
because the Army's testing strategy did not provide reasonable 
assurance that the data being used by LMP were accurate and reliable. 
We made recommendations to help improve the third deployment of LMP. 
We are following up on the Army's efforts to implement our 
recommendations and will report on those actions separately. 

The PMO further noted that the original life-cycle cost estimate of 
approximately $2.6 billion[Footnote 34] covering fiscal years 2000 
through 2021 remained unchanged as of December 2009. PMO officials 
told us that there were no issues or conditions that had been placed 
upon LMP by the MDA, IRBs, or the DBSMC that needed to be resolved as 
of December 2009. 

Navy Enterprise Resource Planning System: 

Figure: DOD Program Data Provided for Navy ERP, as of December 31, 
2009: 

[Refer to PDF for image: text box] 

Date of initiation: July 2003. 

Program owner: Assistant Secretary of the Navy, Research, Development, 
and Acquisition. 

Reported life-cycle cost estimate: $2.4 billion: 
* Development and Modernization: $1.0 billion; 
* Operations and Maintenance: $1.4 billion. 

Reported amount expended: $691.3 million. 

Reported legacy systems to be replaced: 98. 

Reported annual cost of maintaining legacy systems: $102 million. 

Number of system interfaces: 51. 

Date of last certification of funding: September 2, 2009, by the DBSMC. 

Number of system users: 66,000. 

Number of locations: 53. 

Source: DOD's Navy ERP Program Management Office. These data have not 
been validated. 

[End of figure] 

Program Status: 

Navy ERP is to be implemented in two phases. As part of phase 1, the 
financial and acquisition functionalities of Navy ERP were deployed to 
the Naval Air Systems Command, Naval Supply Systems Command, and the 
Space and Naval Warfare Systems Command. Those functionalities are 
scheduled for deployment for the general fund at the Naval Sea Systems 
Command in October 2010, the Navy Working Capital Fund in October 
2011, and the Office of Naval Research and Strategic Systems Planning 
in October 2012. Phase 2 is currently in progress with the deployment 
of the wholesale and retail supply functionalities to the Navy. 
According to the PMO, Navy ERP is currently being used by 38,000 users 
and is executing approximately $37 billion of the Navy's total 
obligational authority. Further, the PMO noted that in fiscal year 
2010, 19 legacy systems have already been retired. 

The Navy ERP implementation has experienced slippages of 2 years. 
Originally, the Navy ERP was to achieve full deployment in fiscal year 
2011, but now full deployment is planned for fiscal year 2013. 
According to program documentation, these slippages occurred, in part, 
because of problems experienced in data conversion and adopting new 
business procedures associated with implementing the ERP. The delay 
occurred at the Naval Air Systems Command and affected the deployment 
schedule for the other locations. In addition to slippages in 
schedule, there have also been increases in the life-cycle cost 
estimate. The 2003 original life-cycle cost estimate for the Navy ERP 
was about $1.87 billion. This estimate was later revised in August 
2004, December 2006, and again in September 2007 to $2.4 billion. 
According to the September 2007 acquisition program baseline, the 
estimated $2.4 billion is for acquisition, operations, and support for 
fiscal years 2004 through 2023. Moreover, in September 2008,[Footnote 
35] we reported that not effectively implementing key IT management 
controls, such as earned value management, has contributed to the more 
than 2-year schedule delay and almost $600 million cost overrun on the 
program since it began, and will likely contribute to future delays 
and overruns if not corrected. 

The IRB has identified two issues or conditions that the Navy ERP PMO 
has to address: (1) provide a description of how the Navy plans to use 
the item-unique identification (IUID)[Footnote 36] and (2) provide an 
updated checklist to BTA showing compliance with the Standard 
Financial Information Structure (SFIS).[Footnote 37] The PMO stated 
that it presented a plan to the Navy Comptroller in February 2010 
describing how it will use IUID and provided BTA the SFIS checklist in 
April 2010. 

Global Combat Support System-Marine Corps: 

Figure: DOD Program Data for GCSS-MC, as of December 31, 2009: 

[Refer to PDF for image: text box] 

Date of initiation: September 2003. 

Program owner: Assistant Secretary of the Navy, Research, Development, 
and Acquisition. 

Reported life-cycle cost estimate: $934 million: 
* Development and Modernization: $489 million; 
* Operations and Maintenance: $445 million. 

Reported amount expended: $245 million. 

Reported legacy systems to be replaced: 4. 

Reported annual cost of maintaining legacy systems: $4.5 million. 

Number of system interfaces: 42. 

Date of last certification of funding: June 1, 2010 by the DBSMC. 

Number of system users: 33,000. 

Number of locations: 6. 

Source: DOD's GCSS-MC Program Management Office. These data have not 
been validated. 

[End of figure] 

Program Status: 

GCSS-MC was authorized to "Go Live" for field user evaluation in March 
2010 and Milestone C was granted in May 2010. GCSS-MC is to be 
implemented in three phases. Phase 1 is intended to provide a wide 
range of asset management capabilities such as planning inventory 
requirements to support current and future demands; requesting and 
tracking the status of products (e.g., supplies and personnel) and 
services (e.g., maintenance and engineering); allocating resources 
(e.g., inventory, warehouse capacity, and personnel) to support unit 
demands for specific products; and scheduling maintenance resources 
(e.g., manpower, equipment, and supplies) for specific assets, such as 
vehicles. Phases 2 and 3 are intended to provide additional 
functionality, such as transportation and wholesale inventory management. 

To date, there have been program slippages and cost increases. The PMO 
told us that full deployment for phase 1 was originally scheduled to 
be achieved in November 2009. However, the current estimated full 
deployment date for phase 1 is January 2013.[Footnote 38] GCSS-MC 
program officials informed us that the schedule slippage for phase 1 
occurred incrementally over time during the design, build, and test 
phases of the program. The slippages occurred because of issues 
associated with system interfaces and the conversion of data from the 
legacy systems. Moreover, in July 2008,[Footnote 39] we reported that 
not effectively implementing key IT management controls, such as 
economically justifying investment in the system, has in part 
contributed to a 3-year schedule slippage and about $193 million cost 
overrun on the first phase of the program and will likely contribute 
to future delays and overruns if not corrected. 

These schedule slippages caused the program to exceed the MAIS 
critical-breach criteria for time-certain development, which is the 
failure to achieve initial operating capability within 5 years of 
Milestone A approval. PMO officials also told us that initially, GCSS- 
MC had an estimated cost of approximately $126 million over a 7-year 
life cycle.[Footnote 40] This cost estimate was later revised in 2005 
to approximately $249 million over a 13-year life cycle.[Footnote 41] 
Currently, the PMO estimates the total life-cycle cost estimate for 
phase 1 to be approximately $934 million. The total life-cycle cost 
estimate for the additional phases has not been determined. According 
to the PMO, phase 2 is in the preliminary planning stage and all 
additional phases will have separate acquisition program baselines. 
[Footnote 42] As a result, a total life-cycle cost estimate for the 
entire system may not be available for several years. 

The IRB directed that the GCSS-MC PMO (1) provide a component-wide 
plan that addresses how GCSS-MC will include the capability to use 
IUID and (2) brief the Navy DCMO on the extent to which business 
process reengineering (BPR) has been performed to address the 
statutory requirement regarding BPR in Section 1072 of the Fiscal Year 
2010 National Defense Authorization Act. In this regard, the act 
directs the Chief Management Officer to determine whether appropriate 
business process reengineering efforts have been 
undertaken to ensure that (1) the business process to be supported by 
the business system will be as streamlined and efficient as 
practicable and (2) the need to tailor the ERP to meet unique 
requirements or incorporate unique interfaces has been eliminated. The 
PMO stated that it provided the BPR information to the Navy DCMO and 
the DCMO indicated that the PMO had addressed the requirements 
contained in the act. 

Defense Enterprise Accounting and Management System: 

Figure: DOD Program Data for DEAMS, as of December 31, 2009: 

[Refer to PDF for image: text box] 

Date of initiation: August 2003. 

Program owner: Assistant Secretary of the Air Force for Financial 
Management and Comptroller. 

Reported life-cycle cost estimate: $2.048 billion: 
* Development and Modernization: $1.030 billion; 
* Operations and Maintenance: $1.018 billion. 

Reported amount expended: $139.1 million. 

Reported legacy systems to be replaced: 10. 

Reported annual cost of maintaining legacy systems: $55.9 million. 

Number of system interfaces: 100. 

Date of last certification of funding: December 14, 2009, by the DBSMC. 

Number of system users: 30,000. 

Number of locations: 179. 

Source: DOD's DEAMS Program Management Office. These data have not 
been validated. 

[End of figure] 

Program Status: 

DEAMS will be deployed in three phases. Phase 1 deployed limited 
functionality--recording commitments--to about 650 system users at 
Scott Air Force Base in July 2007. According to the PMO, as part of 
phase 1, additional functionality was deployed to an additional 870 
users in May 2010. Further, the PMO noted that DEAMS is currently 
scheduled to achieve initial operating capability for phase 2 for the 
U.S. Transportation Command and most of the Air Force's major commands 
in fiscal year 2014. According to the PMO, the final phase of DEAMS 
will be deployed to the remaining Air Force major commands by fiscal 
year 2017, thereby providing the full spectrum of general fund 
capabilities to the entire Air Force. 

The Air Force expects DEAMS to reach full deployment in fiscal year 
2017--which is a 3-year slippage from the full deployment date 
reported at program initiation. According to the PMO, DEAMS has 
experienced a 3-year schedule slippage because of software code 
defects, integration test delays, and the need to accommodate 
schedule risk. DEAMS program management officials acknowledged that 
the standardization of computer desktops across the Air Force 
contributed to schedule slippages. Our August 2008 report discussed 
this specific problem.[Footnote 43] 

In addition to schedule slippages, DEAMS also had an increase in its 
life-cycle cost estimate. In August 2008, we reported that the Air 
Force's life-cycle cost estimate for DEAMS was about $1.1 billion 
through fiscal year 2021.[Footnote 44] According to the PMO, as of 
December 2009, the life-cycle cost estimate for DEAMS is 
approximately $2 billion through fiscal year 2027. The PMO stated that 
the increase in the life-cycle cost estimate can be attributed to the 
change in the implementation strategy from two phases to three and to 
program development and testing issues. 

The IRB directed the DEAMS PMO to (1) create an IUID compliance plan 
indicating when the system will include the capability to use IUID, 
(2) identify the date that one of the legacy systems will be subsumed, 
(3) provide a plan on how DEAMS will meet Environmental Liabilities 
Recognition Valuation and Reporting requirements, and (4) comply with 
Section 1072 of the National Defense Authorization Act for Fiscal Year 
2010 related to business process reengineering. According to the PMO, 
the Air Force has addressed these issues. 

Expeditionary Combat Support System: 

Figure: DOD Program Data for ECSS, as of December 31, 2009: 

[Refer to PDF for image: text box] 

Date of Initiation: January 2004. 

Program owner: Deputy Chief of Staff for Logistics, Installations, and 
Mission Support, Headquarters, U.S. Air Force. 

Reported life-cycle cost estimate: $5.2 billion: 
* Development and Modernization: $3.4 billion; 
* Operations and Maintenance: $1.8 billion. 

Reported amount expended: $518.9 million. 

Reported legacy systems to be replaced: 240. 

Reported annual cost of maintaining legacy systems: $325 million. 

Number of system interfaces: 157 (phase 1) and 673 (phases 2, 3, and 
4). 

Date of last certification of funding: September 2, 2009, by the DBSMC. 

Number of system users: 250,000. 

Number of locations: 186. 

Source: DOD's ECSS Program Management Office. These data have not been 
validated. 

[End of figure] 

Program Status: 

ECSS will be deployed in four phases. The Air Force anticipates that 
phase 1 will begin deployment in June 2012, with phase 2 scheduled for 
deployment in April 2014, phase 3 in January 2015, and phase 4 in 
November 2015. According to the PMO, each phase will provide 
additional functionality to the system users. Phase 1 will focus on 
base materiel and equipment management, phase 2 will concentrate on 
global materiel and equipment management and enterprise planning, 
phase 3 will involve depot maintenance repair and overhaul, and phase 
4 will involve flight line maintenance and ammunition management. The 
PMO estimated that full deployment will be achieved in July 2016--a 
slippage of at least 4 years. According to the PMO, the slippage can 
be attributed to (1) two contract award protests--both denied by GAO--
and (2) the change in the implementation strategy, which had 
originally called for the system to be implemented in three phases. 
Also, in our August 2008 report,[Footnote 45] we noted that the life-
cycle cost estimate was approximately $3 billion for the entire ECSS 
program when it was scheduled for three phases. According to the ECSS 
PMO, the current life-cycle cost estimate is approximately $5.2 
billion. Funding has not yet been approved for phases 2 through 4. The 
PMO noted that ECSS will seek approval at each phase's critical 
milestone in order to go forward to the next phase. 

The Air Force DCMO told us that Air Force leadership (including the 
Secretary of the Air Force, Air Force Chief of Staff, and Senior 
Acquisition Executive) reviewed the program to determine whether it 
should be restructured or canceled. The leadership was specifically 
concerned about the size, scope, and pace of the program. The program 
was restructured, and in June 2009, the decision was made to pursue 
only the revised phase 1 pending a demonstration of the program's 
ability to deliver to the revised schedule. The DCMO told us that, in 
June 2010, the Air Force will decide (1) whether to implement phase 1 
and (2) whether to budget for the other phases. According to 
the PMO, it anticipates the Air Force fully funding phase 1 and the 
long-lead requirements for phase 2 in the fiscal year 2012 program 
objective memorandum.[Footnote 46] 

Because of changes in the implementation strategy, in September 2009, 
the DOD MDA approved a revised Milestone A for ECSS. The revised 
milestone provides for additional funding, and it grants the Air Force 
authority to continue with ECSS technology development and prepare for 
Milestone B for phase 1. In preparing for Milestone B, the Air Force 
was directed to: 

* present quarterly reports regarding the progress of the program, 
including internal and external challenges and risks, to the IRBs for 
weapons system, materiel, service, and financial management; 

* complete an enterprise risk assessment methodology review of the 
program 120 days prior to Milestone B; and: 

* provide a cost analysis requirement document to the Air Force Cost 
Analysis Agency to support the development of an independent cost 
estimate. 

According to the PMO, each of these actions was completed by May 20, 
2010. 

Service Specific Integrated Personnel and Pay Systems[Footnote 47] 

Figure: DOD Program Data for Service Specific Integrated Personnel and 
Pay Systems, as of December 31, 2009: 

[Refer to PDF for image: text box] 

Date of Initiation: February 1998. 

Program owner: Army--Army’s Program Executive Office, Enterprise 
Information Systems; Navy--Chief of Naval Operations; Air Force--Air 
Force Program Executive Office and Service Acquisition Executive. 

Reported life-cycle cost estimate: Army--Has not yet been determined; 
Navy--$1.3 billion; Air Force--$1.7 billion. 

Reported amount expended: $841.1 million. 

Reported legacy systems to be replaced: Army--65; Navy--7; Air Force--
Has not yet been determined. 

Reported annual cost of maintaining legacy systems: Army--$39 million; 
Navy--$69 million; Air Force--Has not yet been determined. 

Number of system interfaces: Has not yet been determined. 

Date of last certification of funding: Not applicable for the military 
services as of December 2009. 

Number of system users: Has not yet been determined. 

Number of locations: Has not yet been determined. 

Sources: The Army, Navy, and Air Force program management offices. 
These data have not been validated. 

[End of figure] 

Program Status: 

In a January 2009 memorandum, the Deputy Secretary of Defense changed 
the department's strategy for implementing an integrated personnel and 
pay system. The memorandum directed the BTA to develop the pay module 
and provide it to the military departments. Each military department 
would be responsible for implementing an integrated personnel and pay 
system for its respective service. In revising the department's 
strategy, a subsequent memorandum issued September 2009 by the Under 
Secretary of Defense (Acquisition, Technology and Logistics) noted 
that the capabilities needed by DOD to develop integrated personnel 
and pay systems are best met through the military departments because 
of several risks, including governance, technical complexities, and 
past failed attempts to develop DIMHRS as a one-size-fits-all solution. 
The memorandum further noted that military departments were to use, to 
the maximum extent practical, the DIMHRS requirements related to the 
pay module developed by BTA. Highlighted below is the status of each 
of the military department's efforts to implement an integrated 
personnel and pay system. 

Integrated Personnel and Pay System-Army (IPPS-A): 

Army PMO officials told us that, in accordance with the September 
2009 memorandum, the Army intends to use the BTA-developed pay module, 
develop the personnel module, and implement an integrated system. Once 
IPPS-A is developed, it will be implemented in several phases. The 
first deployment is planned for the Army National Guard, followed by 
the Army Reserves, and then the active Army. The PMO stated that the 
personnel and pay portion will be deployed to all Army components by 
August 2014. The Army anticipates that full deployment will occur late 
in fiscal year 2014. The PMO informed us that the Army is in the 
process of developing the life-cycle cost estimate. 

Navy Future Pay and Personnel Solution: 

According to PMO officials, the Navy is in the process of evaluating 
the extent to which the BTA-developed pay module can be used to meet 
its needs for an integrated system. Navy anticipates that this 
evaluation will be completed by the second quarter of fiscal year 
2011. PMO officials told us that if the pay module can be used, the 
system will be implemented in two phases. Phase 1 will consolidate the 
existing legacy personnel systems and establish a single personnel 
record. Phase 2 will be the implementation of the pay module. Navy 
would begin deployment in fiscal year 2014 for phase 1 and fiscal year 
2015 for phase 2, with full deployment being achieved in fiscal year 
2017. PMO officials told us that the Navy estimates the life-cycle 
cost of its integrated personnel and pay system at about $1.3 
billion. The PMO further stated that if an alternative 
to using the BTA-developed pay module is selected, the implementation 
dates and estimated cost may change. In the September 2009 memorandum, 
it was noted that the Marine Corps will continue to use the Marine 
Corps Total Force System because it is already an integrated personnel 
and pay system. 

Air Force Integrated Personnel and Pay System: 

At the time of our review, the Air Force was evaluating the BTA-developed 
pay module to assess whether it could be used. According to the PMO, 
the system will be implemented in three phases, provided the existing 
BTA-developed pay module can be used. Phase 1 will consist of 
transferring data from the legacy systems to the new integrated 
personnel and pay system and will include implementation of leave and 
benefits functionality for all personnel. Phase 2 will provide an 
integrated personnel and pay solution for active Air Force officers, 
and phase 3 will deploy the system to the remaining Air Force 
personnel, including Guard and Reserve personnel. The number and 
content of the phases may change as the Air Force refines its 
acquisition and deployment strategies. 
According to the PMO, it is anticipated that full deployment will be 
achieved in April 2018. The Air Force PMO currently estimates the 
life-cycle cost to be about $1.7 billion, covering fiscal years 2010 
through 2027. The PMO told us that as the Air Force better 
defines its implementation strategy, the implementation dates and life-
cycle cost estimate could change. The PMO also said that the Air Force 
is in the process of ascertaining how many legacy systems can be 
eliminated through its implementation of an integrated personnel and 
pay system. 

Defense Agencies Initiative: 

Figure: DOD Program Data for DAI, as of December 31, 2009: 

[Refer to PDF for image: text box] 

Date of Initiation: January 2007. 

Program owner: The Business Transformation Agency was the first entity 
to implement DAI. Each defense agency will be responsible for the 
management and oversight of its respective implementation. 

Reported life-cycle cost estimate: Has not yet been determined; 
* Development and Modernization: Has not yet been determined; 
* Operations and Maintenance: Has not yet been determined. 

Reported amount expended: $40.2 million. 

Reported legacy systems to be replaced: 17. 

Reported annual cost of maintaining legacy systems: $35 million. 

Number of system interfaces: 24. 

Date of last certification of funding: September 30, 2009, by the DBSMC. 

Number of system users: 15,000 (estimated). 

Number of locations: 11 (estimated). 

Source: DOD's DAI Program Management Office. These data have not been 
validated. 

[End of figure] 

Program Status: 

DAI became operational at BTA in October 2008 and at the Defense 
Technical Information Center in October 2009. Table 3 lists the 
defense agencies that are scheduled to implement DAI in fiscal years 
2011 through 2013. 

Table 3: Defense Agencies' Scheduled Implementation of DAI: 

Defense agency: Uniformed Services University of the Health Sciences; 
Scheduled Implementation of DAI: Fiscal year 2011. 

Defense agency: Missile Defense Agency; 
Scheduled Implementation of DAI: Fiscal year 2011. 

Defense agency: Defense Threat Reduction Agency; 
Scheduled Implementation of DAI: Fiscal year 2012. 

Defense agency: Defense Information Systems Agency; 
Scheduled Implementation of DAI: Fiscal year 2012. 

Defense agency: Defense Technology Security Administration; 
Scheduled Implementation of DAI: Fiscal year 2012. 

Defense agency: Chemical Biological Defense Program; 
Scheduled Implementation of DAI: Fiscal year 2012. 

Defense agency: TRICARE Management Agency--Headquarters; 
Scheduled Implementation of DAI: Fiscal year 2012. 

Defense agency: Defense Media Activity; 
Scheduled Implementation of DAI: Fiscal year 2012. 

Defense agency: Defense Information Systems Agency--General Fund; 
Scheduled Implementation of DAI: Fiscal year 2013. 

Defense agency: Defense Acquisition University; 
Scheduled Implementation of DAI: Fiscal year 2013. 

Defense agency: Defense POW/Missing Personnel Office; 
Scheduled Implementation of DAI: Fiscal year 2013. 

Defense agency: Defense Advanced Research Projects Agency; 
Scheduled Implementation of DAI: Fiscal year 2013. 

Defense agency: Defense Security Service; 
Scheduled Implementation of DAI: Fiscal year 2013. 

Defense agency: Office of Economic Adjustment; 
Scheduled Implementation of DAI: Fiscal year 2013. 

Defense agency: Center for Countermeasures; 
Scheduled Implementation of DAI: Fiscal year 2013. 

Defense agency: National Defense University; 
Scheduled Implementation of DAI: Fiscal year 2013. 

Source: Business Transformation Agency. 

[End of table] 

There has been some slippage in the implementation schedule. However, 
at the time of our review, a revised full deployment date for all of 
the agencies scheduled to use DAI had not been established. According 
to the department's fiscal year 2011 IT budget request, additional 
defense agencies have expressed an interest in using DAI. However, the 
Financial Management IRB and the DBSMC must grant approval to any 
entity that wants to use DAI. DOD's budget request notes that the 
total cost of the program is affected by the number of agencies 
participating. The budget request further notes that a more accurate 
implementation-plus-sustainment cost can be determined once all of the 
signed memorandums of intent from agencies wanting to use DAI have 
been received. 

DOD Did Not Follow Key Best Practices for Estimating ERP Schedules and 
Cost, Resulting in Unreliable Estimates: 

Our analysis of the schedules and cost estimates for four ERP 
programs--DEAMS, ECSS, GFEBS, and GCSS-Army--found that none of the 
programs are fully following best practices for developing reliable 
schedules and cost estimates. More specifically, none of the programs 
had developed a fully integrated master schedule (IMS) that reflects 
all activities, including both government and contractor activities. 
In addition, none of the programs established a valid critical path or 
conducted a schedule risk analysis.[Footnote 48] We have previously 
reported that the schedules for GCSS-MC and Navy ERP were developed 
using some of these best practices, but several key practices that are 
fundamental to a schedule providing a sufficiently reliable basis for 
estimating costs, measuring progress, and forecasting slippages were 
not fully employed.[Footnote 49] We recommended that 
each program follow best practices to update its respective schedule. 
DOD generally agreed with the recommendations. Additional details on 
the status of the recommendations are discussed in appendix III. The 
success of any program depends on having a reliable schedule that 
identifies the program's work activities, how long they will take, 
and how the activities are related to one another. As such, the 
schedule not only provides a road map for systematic execution of a 
program, but also provides the means by which to gauge progress, 
identify and address potential problems, and promote accountability. 

Our analysis of the four programs' cost estimates found that ECSS, 
GFEBS, and GCSS-Army did not include a sensitivity analysis and that 
the GFEBS estimate also lacked a risk and uncertainty analysis. GAO, 
OMB, and DOD guidance[Footnote 50] stipulate that risk 
and uncertainty analysis should be performed to determine the level of 
risk associated with the dollar estimate. Furthermore, a sensitivity 
analysis would assist decision makers in determining how changes to 
assumptions or key cost drivers (such as labor or equipment) could 
affect the cost estimate. We have previously reported that the cost 
estimates for Navy ERP and GCSS-MC are comprehensive and well- 
documented, but only partially accurate and credible.[Footnote 51] We 
recommended that each program update its respective cost estimate 
following best practices. The department generally agreed with the 
recommendations. Additional details on the status of the 
recommendations are discussed in appendix III. For DOD management to 
make good decisions, the program estimate must reflect the degree of 
uncertainty so that a level of confidence can be given about the 
estimate. A reliable cost estimate provides the basis for informed 
investment decision making, realistic budget formulation and program 
resourcing, meaningful progress measurement, proactive course 
correction, and accountability for results. 

Program Schedules Not Developed in Accordance with Key Scheduling 
Practices: 

Our cost guide best practices and related federal guidance call for a 
program schedule to be programwide, meaning that it should include an 
integrated breakdown of the work to be performed by both the 
government and its contractors over the expected life of the program. 
[Footnote 52] Our guidance identifies nine scheduling best practices 
that are integral to a reliable and effective master schedule: (1) 
capturing all activities, (2) sequencing all activities, (3) assigning 
resources to all activities, (4) establishing the duration of all 
activities, (5) integrating schedule activities horizontally and 
vertically, (6) establishing the critical path for all activities, (7) 
identifying float between activities, (8) conducting a schedule risk 
analysis, and (9) updating the schedule using logic and durations to 
determine the dates. 

The scheduling best practices are interrelated so that deficiencies in 
one best practice will cause deficiencies in other best practices. For 
example, if the schedule does not capture all activities, then there 
will be uncertainty about whether activities are sequenced in the 
correct order and whether the schedule properly reflects the resources 
needed to accomplish the work. The schedule should use logic and 
durations in order to reflect realistic start and completion dates for 
program activities. Maintaining the integrity of the schedule logic is 
not only necessary to reflect true status, but is also required before 
conducting follow-on schedule risk analyses. If the schedule is not 
properly updated, positive and negative float values will not be 
accurate. Positive float indicates the amount of time the schedule can 
fluctuate before affecting the end date. Negative float indicates 
critical path effort that may require management action such as 
overtime, second or third shifts, or resequencing of work. Moreover, 
if activities are not properly sequenced with logical links, it is not 
certain whether the critical path--which represents the chain of 
dependent activities with the longest total duration--is valid. Table 
4 summarizes the results of our review of the four programs. 

Table 4: Extent to Which Program Schedules Met Best Practices: 

Best practice: 1. Capturing all activities; 
Extent best practice met: DEAMS: Partially; 
Extent best practice met: ECSS[A]: Substantially; 
Extent best practice met: GFEBS: Substantially; 
Extent best practice met: GCSS-Army: Partially. 

Best practice: 2. Sequencing all activities; 
Extent best practice met: DEAMS: Minimally; 
Extent best practice met: ECSS[A]: Partially; 
Extent best practice met: GFEBS: Partially; 
Extent best practice met: GCSS-Army: Partially. 

Best practice: 3. Assigning resources to all activities; 
Extent best practice met: DEAMS: Fully met; 
Extent best practice met: ECSS[A]: Minimally; 
Extent best practice met: GFEBS: Not met; 
Extent best practice met: GCSS-Army: Substantially. 

Best practice: 4. Establishing the duration of all activities; 
Extent best practice met: DEAMS: Substantially; 
Extent best practice met: ECSS[A]: Substantially; 
Extent best practice met: GFEBS: Fully met; 
Extent best practice met: GCSS-Army: Fully met. 

Best practice: 5. Integrating schedule activities horizontally and 
vertically; 
Extent best practice met: DEAMS: Minimally; 
Extent best practice met: ECSS[A]: Partially; 
Extent best practice met: GFEBS: Minimally; 
Extent best practice met: GCSS-Army: Partially. 

Best practice: 6. Establishing the critical path for all activities; 
Extent best practice met: DEAMS: Minimally; 
Extent best practice met: ECSS[A]: Partially; 
Extent best practice met: GFEBS: Partially; 
Extent best practice met: GCSS-Army: Partially. 

Best practice: 7. Identifying reasonable float between activities; 
Extent best practice met: DEAMS: Minimally; 
Extent best practice met: ECSS[A]: Partially; 
Extent best practice met: GFEBS: Minimally; 
Extent best practice met: GCSS-Army: Substantially. 

Best practice: 8. Conducting a schedule risk analysis; 
Extent best practice met: DEAMS: Minimally; 
Extent best practice met: ECSS[A]: Not met; 
Extent best practice met: GFEBS: Not met; 
Extent best practice met: GCSS-Army: Minimally. 

Best practice: 9. Updating schedule using logic and durations to 
determine dates; 
Extent best practice met: DEAMS: Minimally; 
Extent best practice met: ECSS[A]: Partially; 
Extent best practice met: GFEBS: Partially; 
Extent best practice met: GCSS-Army: Substantially. 

Sources: GAO analysis based on data provided by the PMOs. 

Note: "Not met" means the program provided no evidence that satisfies 
any of the criterion. "Minimally" means the program provided evidence 
that satisfies a small portion of the criterion. "Partially" means the 
program provided evidence that satisfies about half of the criterion. 
"Substantially" means the program provided evidence that satisfies a 
large portion of the criterion. "Fully met" means the program provided 
evidence that completely satisfies the criterion. 

[A] In reviewing ECSS we analyzed two project schedules: (1) solutions 
development and (2) reports, interfaces, conversions, and extensions 
(RICE). We analyzed two schedules because the ECSS IMS is made up of 
46 individual project schedules. The ratings for the two schedules 
were identical across all nine practices. 

[End of table] 

Highlighted below are examples of the specific weaknesses we found in 
each of the nine best practices.[Footnote 53] Appendix IV contains a 
detailed discussion of the extent to which the four ERPs we analyzed 
met the nine best practice criteria. 

* Capturing all activities. A schedule should reflect all activities 
as defined in the program's work breakdown structure to include 
activities to be performed by the government and the contractor. Our 
analysis found that the ERP program schedules differed in the extent 
to which they capture all activities, as well as in the integration of 
government and contractor activities. The DEAMS PMO does not have a 
single schedule that integrates government and contractor activities. 
While the PMO maintains internal schedules that reflect government-
only activities, these activities are not linked to contractor 
activities. In addition, many contractor activities within the DEAMS 
schedule are not mapped to the work breakdown structure, hampering 
management's ability to ensure all effort is included in the schedule. 
While the GCSS-Army schedule identifies contractor activities, it 
contains only key government milestones for the program. Other 
government activities, such as testing events and milestones beyond 
December 2010, are not captured in the schedule. The ECSS program 
schedule contains detailed activities associated with government 
effort and contractor effort. However, the government activities are 
not fully linked to contractor activities, so that updates to 
government activities do not have a direct impact on scheduled 
contractor activities. While the GFEBS's schedule captures government 
and contractor activities, dependencies between key milestones in 
deployment, software release, and maintenance are not linked, thereby 
precluding a comprehensive view of the entire program. Without a 
schedule that adequately captures all key activities and fully 
integrates government and contractor activities, the program's 
completion date cannot be reliably estimated. 

* Sequencing all activities. The schedule should be planned so that it 
can meet program critical dates. To meet this objective, activities 
need to be logically sequenced in the order that they are to be 
carried out and no artificial date constraints should be included in 
the schedule. In particular, activities that must finish prior to the 
start of follow-on activities (i.e., predecessor activities), as well 
as activities that cannot begin until other activities are completed 
(i.e., successor activities), should be identified. None of the 
schedules we assessed fully met the criteria for sequencing all 
activities. For example, over 60 percent of the remaining activities 
in the DEAMS schedule are missing logic links to predecessor or 
successor activities. The DEAMS schedule also has date 
constraints[Footnote 54] that keep the schedule from responding 
correctly to changes. The ECSS schedule has 78 instances of unusual 
logic that cause activities to finish at the same time that their 
predecessor activities start.[Footnote 55] The GCSS-Army schedule has 
constraints on 1,503 of the remaining activities that keep the 
schedule from responding to changes. Moreover, the GFEBS schedule has 
date constraints and linked summary activities that interfere with the 
critical path.[Footnote 56] Missing or incorrect logic reduces the 
credibility of the calculated dates in the schedule because the 
schedule will not reflect the effects of slipping activities on the 
critical path, scheduled resources, or scheduled start dates of future 
activities. 

* Assigning resources to all activities. The schedule should 
realistically reflect what resources (i.e., labor, material, and 
overhead) are needed to do the work, whether all required resources 
will be available when needed, and whether any funding or time 
constraints exist. Because of the fixed price contractual arrangements 
with ERP contractors, resources are not reflected in the periodic 
updates of the schedules submitted to the PMOs. While the GCSS-Army 
IMS does not include resources, scheduled activities can be traced to 
control account plans, which lay out resources by month and labor 
category. On the other hand, the DEAMS PMO provided evidence that 
resources were assigned to activities in the schedule. In the case of 
ECSS, PMO officials stated the contractor assigned resources to 
scheduled activities, but we were not able to verify whether resources 
were assigned. The GFEBS contractor's schedules had resources assigned 
to activities in earlier releases of the system, but according to the 
PMO, resources are no longer assigned to activities. Without resource 
information, DOD management has insufficient insight into current or 
projected over-allocation of contractor resources, thus increasing the 
risk of slippage in the estimated completion date. 

* Establishing the duration of all activities. The schedule should 
reflect how long each activity will take to execute and activity 
durations should be as short as possible with specific start and end 
dates. The four programs properly reflected how long each activity 
should take to execute. In addition, activities were generally shorter 
than 44 working days--or 2 working months--which represents best 
practices for activity durations. 

* Integrating schedule activities horizontally and vertically. The 
schedule should be integrated horizontally and vertically. Horizontal 
integration means that the schedule links the products and outcomes 
associated with already-sequenced activities. Horizontal integration 
also demonstrates that the overall schedule is rational, is planned in 
a logical sequence that reflects interdependencies between work and 
planning packages, and provides a way to evaluate current status. When 
schedules are vertically integrated, lower-level schedules are clearly 
traced to upper-tiered milestones, allowing for total schedule 
integration and enabling different teams to work to the same schedule 
expectations. The program schedules we assessed partially met the 
criteria for horizontal and vertical integration. In general, as 
discussed earlier, issues with missing or convoluted logic and 
artificially constrained dates prevent the program schedules from 
being horizontally integrated. Schedules that are not horizontally 
integrated may not depict relationships between different program 
elements and product handoffs. While ECSS and GCSS-Army program 
schedules are vertically integrated, the inability to clearly trace 
lower-level schedules to upper-tiered milestones prevents the DEAMS 
and GFEBS program schedules from being fully vertically integrated. 

* Establishing the critical path. The establishment of a critical 
path--the longest duration path through the sequenced list of 
activities--is necessary for examining the effects of any activity 
slipping along this path. The calculation of a critical path is 
directly related to the logical sequencing of events. Missing or 
convoluted logic and artificially constrained dates prevent the 
calculation of a valid critical path, and can mark activities as 
critical that are not truly critical. The program schedules either 
partially or minimally met the criteria for establishing a critical 
path. While the ECSS PMO has insight into detailed contractor 
activities, officials acknowledged that it is difficult to establish a 
critical path using the program's current schedule. Instead, the 
program tracks high-level milestones in a separate schedule and 
officials stated that they are in the process of revamping the 
program's work breakdown structure in order to establish a clearer 
relationship between work products and hence a more accurate critical 
path. While the GFEBS PMO stated that it receives weekly updates from 
the contractor and manages to a critical path, our analysis of GFEBS 
concluded that the critical path was not reliable because of 
artificial date constraints and unrealistic float.[Footnote 57] 
Conversely, GCSS-Army and DEAMS officials stated that even with 
insight into detailed contractor activities, a critical path is not 
feasible because it would be too complex. However, as program 
complexity increases, so must the schedule's sophistication. Further, 
our analysis of the DEAMS schedule found that it would be impossible 
to develop a critical path because 60 percent of the remaining 
activities are missing logic links. Likewise, our analysis found that 
a critical path within the GCSS-Army schedule is not possible because 
of artificial date constraints placed on key activities. While the 
level of complexity in an ERP is daunting, a critical path through at 
least a higher-level version of the detailed schedule would assist 
management in identifying which slipped tasks will have detrimental 
effects on the completion date. By managing to predefined, 
constrained dates instead of a critical path, management does not have 
a clear picture of the tasks that must be performed to achieve the 
target completion date. (A simplified illustration of how a critical 
path and float are calculated from schedule logic follows this list.) 

* Identifying reasonable float between activities. The schedule should 
identify float--the time that a predecessor activity can slip before 
the delay affects successor activities--so that schedule flexibility 
can be determined. As a general rule, activities along the critical 
path typically have the least amount of float. The DEAMS, ECSS, and 
GFEBS schedules did not fully meet the criteria for identifying 
reasonable float. The missing or convoluted logic and artificially constrained 
dates identified above prevent the proper calculation of float, which 
in turn affects the identification of a valid critical path. Without 
proper insight into float, management cannot determine the flexibility 
of tasks and therefore cannot properly reallocate resources from tasks 
that can safely slip to tasks that cannot slip without adversely 
affecting the estimated program completion date. 

* Conducting a schedule risk analysis. A schedule risk analysis uses 
statistical techniques to predict a level of confidence in meeting a 
completion date. The purpose of the analysis is to develop a 
probability distribution of possible completion dates that reflect the 
project and its quantified risks. This analysis can help management to 
understand the most important risks and to focus on mitigating these 
risks. We found that none of the PMOs have explicitly linked program 
risks to their schedule in the form of a schedule risk analysis. The 
ECSS PMO stated that it actively monitors schedule risk, but it has 
not performed a schedule risk analysis. The GFEBS PMO stated that 
while schedule risks have been discussed in team meetings, it has not 
performed a formal schedule risk analysis. However, the GFEBS PMO 
stated that it is open to improving in the area of schedule risk 
analysis. The DEAMS PMO stated that while it has tied risks to 
activities the program considers to be on the critical path, a formal 
schedule risk analysis was not performed because the schedule provided 
by the contractor lacks a sufficient level of detail to do such an 
analysis. The GCSS-Army contractor recently conducted a high-level 
schedule risk analysis on two major milestones. In addition, GCSS-Army 
PMO officials acknowledged the importance of a detailed schedule risk 
analysis and stated that the PMO intends to include this requirement 
in the contract within the next few months. A schedule risk analysis 
is important because it allows high-priority risks to be identified 
and mitigated and the level of confidence in meeting projected 
completion dates to be predicted. Without a schedule risk analysis, 
the PMO cannot reliably determine the level of confidence in meeting 
the completion date. However, if the schedule risk analysis is to be 
credible, the program must have a quality schedule that reflects 
reliable logic and clearly identifies the critical path--conditions 
that none of the ERP schedules met. (A minimal illustration of such an 
analysis follows this list.) 

* Updating the schedule using logic and durations to determine dates. 
The schedule should use logic and durations in order to reflect 
realistic start and completion dates. The schedule should be 
continually monitored to determine when forecasted completion dates 
differ from the planned dates, which can be used to determine whether 
schedule variances will affect future work. There are differences in 
the four programs' ability to update the schedule using logic and 
durations to determine schedule dates. For example, the GCSS-Army 
schedule substantially met the criteria for updating the schedule, but 
the GFEBS schedule has over 100 instances of activities that should 
have occurred, yet have no actual start dates or finish dates. In 
addition, the ECSS schedules had a status date of January 1, 2010--a 
federal holiday--while schedules provided to the DEAMS program office 
by the contractor did not have a status date. A status date denotes 
the date of the latest update to the schedule and therefore defines 
the point in time at which completed work and remaining work are 
calculated. Without a valid status date, management is not able to 
determine what work is completed and what work is remaining. An 
invalid or missing status date is also an indication that management 
is not using the schedule to effectively oversee and monitor the 
effort. Furthermore, maintaining the integrity of the schedule logic 
is not only necessary to reflect true status, but is also required 
before conducting a schedule risk analysis. 
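
To illustrate the mechanics behind practices 2, 6, and 7, the 
following is a minimal sketch, written in Python, of the critical-path 
method applied to a small hypothetical activity network. The activity 
names, durations, and logic links are invented for illustration and do 
not represent any ERP program's schedule. 

# Minimal critical-path method (CPM) sketch for a hypothetical
# five-activity network. Durations are in working days; the
# predecessor links supply the schedule logic.
durations = {"A": 10, "B": 15, "C": 5, "D": 20, "E": 8}
predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"],
                "E": ["D"]}
order = ["A", "B", "C", "D", "E"]  # topological order

# Forward pass: earliest start and finish for each activity.
early_start, early_finish = {}, {}
for act in order:
    early_start[act] = max(
        (early_finish[p] for p in predecessors[act]), default=0)
    early_finish[act] = early_start[act] + durations[act]
project_finish = max(early_finish.values())

# Backward pass: latest start and finish that do not delay the
# project end date.
successors = {a: [b for b in order if a in predecessors[b]]
              for a in order}
late_start, late_finish = {}, {}
for act in reversed(order):
    late_finish[act] = min(
        (late_start[s] for s in successors[act]),
        default=project_finish)
    late_start[act] = late_finish[act] - durations[act]

# Total float is the slip an activity can absorb without delaying
# the project; zero-float activities form the critical path.
for act in order:
    total_float = late_start[act] - early_start[act]
    tag = "CRITICAL" if total_float == 0 else f"float = {total_float}"
    print(f"{act}: start day {early_start[act]:>2}, {tag}")

In this example the zero-float chain A-B-D-E is the critical path, and 
activity C can slip 10 working days before it delays the 53-day 
completion date. A date constraint or missing logic link anywhere in 
the network would distort these calculations, which is why the 
weaknesses described above undermine the programs' critical paths and 
float values. 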
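
Practice 8 can be illustrated in the same way. The following minimal 
sketch, continuing the hypothetical network above, performs a simple 
Monte Carlo schedule risk analysis: each activity's duration is drawn 
from a three-point (triangular) distribution, and the simulated 
completion dates yield a level of confidence for any target date. All 
three-point estimates are invented for illustration. 

# Minimal Monte Carlo schedule risk analysis over the same
# hypothetical network; all three-point estimates are invented.
import random

three_point = {  # (optimistic, most likely, pessimistic) days
    "A": (8, 10, 16), "B": (12, 15, 26), "C": (4, 5, 9),
    "D": (16, 20, 35), "E": (6, 8, 14)}
predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"],
                "E": ["D"]}
order = ["A", "B", "C", "D", "E"]

def simulated_finish():
    finish = {}
    for act in order:
        start = max((finish[p] for p in predecessors[act]), default=0)
        low, mode, high = three_point[act]
        finish[act] = start + random.triangular(low, high, mode)
    return max(finish.values())

random.seed(1)  # fixed seed so the sketch is reproducible
trials = sorted(simulated_finish() for _ in range(10_000))
for pct in (50, 80):
    idx = int(pct / 100 * len(trials)) - 1
    print(f"P{pct} completion: {trials[idx]:.1f} working days")

The resulting P50 and P80 values are the completion dates met in 50 
percent and 80 percent of the simulated trials, which is the kind of 
confidence statement the programs could not make without such an 
analysis. 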

Each of the PMOs acknowledged the importance of many of the scheduling 
best practices, but stated that its ability to meet the prescribed 
best practices is limited because of the complexity of the ERP 
development process and the use of firm-fixed-price contracts. Under 
the terms of these contracts, the contractors are not required to 
provide detailed and timely scheduling data, which are essential for 
preparing an accurate and reliable schedule for the implementation of 
the system. While some of the 
necessary information is not being provided by the contractor, this 
does not relieve the PMOs of the responsibility for developing an IMS 
that fully meets prescribed best practices. Without the development of 
an IMS that meets scheduling best practices, the PMOs and the 
department are not positioned to adequately monitor and oversee the 
progress of the billions of dollars being invested in the 
modernization of DOD's business systems. Lacking a credible IMS, 
management is unable to predict, with any degree of confidence, 
whether the estimated completion date is realistic. An integrated 
schedule is key in managing program performance and is necessary for 
determining what work remains and the expected cost to complete it. A 
schedule delay can also lead to an increase in the cost of the 
project because, for example, labor, supervision, and facilities cost 
more, and price escalation compounds, if the program takes longer. A 
schedule and cost risk 
assessment recognizes the interrelationship between schedule and cost 
and captures the risk that schedule durations and cost estimates may 
vary. But without a fully integrated master schedule, the full extent 
of schedule uncertainty is not known, and therefore cannot be 
incorporated into the cost uncertainty analysis as schedule risk. 

Subsequent to the completion of our fieldwork, the ECSS and GFEBS PMOs 
provided updated schedules for assessment, but we were unable to 
perform a detailed evaluation of the updated schedules. Program 
officials for ECSS and GFEBS indicated that the updated schedules 
addressed some areas in which their previous schedules were deficient 
according to GAO's assessment of the nine scheduling best practices. 
In response to limitations that we identified and shared with the 
GFEBS PMO, the program office made several formal changes to its 
existing schedule. The ECSS PMO provided us with an updated IMS that 
contains details on future activities beyond the scheduled activities 
we originally assessed. Although we did not assess the new schedule, 
according to the ECSS PMO, the updated schedule is an improvement over 
past versions of the ECSS schedule and addresses many of the 
deficiencies GAO identified in the earlier version. 

Although Cost Estimates Meet Most Best Practices, the Lack of 
Sensitivity and Uncertainty Analyses Results in Estimates That May Not 
Be Credible: 

We have identified[Footnote 58] four characteristics of a reliable 
cost estimate: (1) well-documented, (2) comprehensive, (3) accurate, 
and (4) credible. The four characteristics encompass 12 best practices 
for effective program cost estimates that are identified in appendix 
V. The results of our review of the DEAMS, ECSS, GFEBS, and GCSS-Army 
cost estimates are summarized in table 5. 

Table 5: Extent Cost Estimates Met Best Practices: 

Best practice: Well-documented; 
DEAMS: Substantially; 
ECSS: Substantially; 
GFEBS: Fully met; 
GCSS-Army: Substantially. 

Best practice: Comprehensive; 
DEAMS: Fully met; 
ECSS: Fully met; 
GFEBS: Fully met; 
GCSS-Army: Substantially. 

Best practice: Accurate; 
DEAMS: Fully met; 
ECSS: Substantially; 
GFEBS: Substantially; 
GCSS-Army: Partially. 

Best practice: Credible; 
DEAMS: Fully met; 
ECSS: Partially; 
GFEBS: Minimally; 
GCSS-Army: Partially. 

Sources: GAO analysis based on information provided by the PMOs. 

Note: "Not met" means the program provided no evidence that satisfies 
any of the criterion. "Minimally" means the program provided evidence 
that satisfies a small portion of the criterion. "Partially" means the 
program provided evidence that satisfies about half of the criterion. 
"Substantially" means the program provided evidence that satisfies a 
large portion of the criterion. "Fully met" means the program provided 
evidence that completely satisfies the criterion. 

[End of table] 

Highlighted below are examples of the specific strengths and 
weaknesses we found in each of the four best practices. Appendix V 
contains a detailed discussion of the extent to which the four ERPs we 
analyzed met the four best practices criteria. 

* Well-documented. The cost estimates should be supported by detailed 
documentation that describes the purpose of the estimate, the program 
background and system description, the scope of the estimate, the 
ground rules and assumptions, all data sources, estimating methodology 
and rationale, and the results of the risk analysis. Moreover, this 
information should be captured in such a way that the data used to 
derive the estimate can be traced back to, and verified against, their 
sources. The cost estimates for DEAMS, ECSS, GFEBS, and GCSS-Army are 
well-documented. The cost estimates have clearly defined purposes and 
are supported by documented descriptions of key program or system 
characteristics (e.g., relationships with other systems, performance 
parameters). Additionally, they capture in writing such things as the 
source data used and their significance, the calculations performed 
and their results, and the rationale for choosing a particular 
estimating method or reference. This information is captured in such a 
way that the data used to derive the estimate can be traced back to, 
and verified against, the sources. The final cost estimates are 
reviewed and accepted by management on the basis of confidence in the 
estimating process and the estimate produced by the process. 

* Comprehensive. The cost estimates should include costs of the 
program over its full life cycle, provide a level of detail 
appropriate to ensure that cost elements are neither omitted nor 
double-counted, and document all cost-influencing ground rules and 
assumptions. We found that cost estimates for DEAMS, ECSS, GFEBS, and 
GCSS-Army are comprehensive. The cost estimates include both 
government and contractor costs over the program's life cycle, from 
the inception of the program through design, development, deployment, 
and operation and maintenance to retirement. They also provide an 
appropriate level of detail to ensure that cost elements are neither 
omitted nor duplicated and include documentation of all cost-
influencing ground rules and assumptions. 

* Accurate. The cost estimates should be based on an assessment of 
most likely costs (adjusted for inflation), documented assumptions, 
and historical cost estimates and actual experiences on other 
comparable programs. Estimates should be cross-checked against an 
independent cost estimate for accuracy, double counting, and 
omissions. In addition, the estimates should be updated to reflect any 
changes. Our analysis also found the cost estimates for DEAMS, ECSS, 
and GFEBS to be accurate. The cost estimates provide for results that 
are unbiased and are not overly conservative or optimistic. In 
addition, the cost estimates are updated regularly to reflect material 
changes in the program, and steps are taken to minimize mathematical 
mistakes and their significance. Among other things, the cost 
estimates are grounded in the historical record of cost estimating 
and actual experiences on comparable programs. Our analysis found the cost 
estimate for GCSS-Army to be partially accurate because we could not 
verify how actual incurred costs were used to update the cost estimate. 

* Credible. The cost estimates should discuss any limitations of the 
analysis because of uncertainty or biases surrounding data or 
assumptions. Risk and uncertainty analysis should be performed to 
determine the level of risk associated with the estimate. Further, the 
estimate's results should be cross-checked against an independent cost 
estimate.[Footnote 59] While we found that the ERP programs were 
generally following the cost estimating best practices, our analysis 
also found that the cost estimates for ECSS, GFEBS, and GCSS-Army are 
not fully credible. Contrary to OMB and DOD guidance, the ECSS and 
GCSS-Army estimates did not include a sensitivity analysis, and the 
GFEBS estimate included neither a sensitivity analysis nor a cost risk 
and uncertainty analysis. In the case of GFEBS, our July 2007 
report[Footnote 60] 
noted that a sensitivity analysis had not been developed in 
calculating the life-cycle cost estimate. Cost estimates should 
discuss any limitations of the analysis because of uncertainty or 
biases surrounding data and assumptions. Major assumptions should be 
varied, and other outcomes recomputed to determine how sensitive they 
are to changes in the assumptions. Having a range of costs around a 
point estimate is more useful to decision makers because it conveys 
the level of confidence in achieving the most likely cost and also 
informs them of cost, schedule, and technical risks. In addition, as 
discussed earlier, because each of the four programs we assessed did 
not meet best practices for schedule estimating, none of the cost 
estimates could be considered credible because they did not assess the 
cost effects of schedule slippage. While individual phases of a multi-
phased project may be completed on time, the project as a whole can be 
delayed, and phases that are not part of an IMS may not be completed 
efficiently, which could result in future cost overruns. (A minimal 
illustration of these two analyses follows.) 
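
The two analyses can be illustrated with a minimal sketch, written in 
Python, of a hypothetical life-cycle cost model. Every figure and 
cost driver below is invented for illustration and does not represent 
any program's actual estimate. 

# Minimal sensitivity and uncertainty analysis sketch for a
# hypothetical life-cycle cost model (results in $ millions).
import random

def life_cycle_cost(labor_rate, hours, licenses, sustainment):
    # development labor plus licenses plus 15 years of operations
    # and support; all inputs are invented
    return (labor_rate * hours) / 1e6 + licenses + 15 * sustainment

base = {"labor_rate": 120.0, "hours": 2_000_000,
        "licenses": 150.0, "sustainment": 40.0}
print(f"Point estimate: ${life_cycle_cost(**base):,.0f}M")

# Sensitivity analysis: vary one key driver at a time by +/-20
# percent and recompute to show which assumptions matter most.
for driver in base:
    for factor in (0.8, 1.2):
        varied = dict(base, **{driver: base[driver] * factor})
        print(f"  {driver} x {factor}: "
              f"${life_cycle_cost(**varied):,.0f}M")

# Risk and uncertainty analysis: draw every driver from a range
# and report a confidence interval rather than a single number.
random.seed(1)
draws = sorted(
    life_cycle_cost(**{k: v * random.uniform(0.9, 1.3)
                       for k, v in base.items()})
    for _ in range(10_000))
print(f"80 percent interval: ${draws[999]:,.0f}M "
      f"to ${draws[8999]:,.0f}M")

A decision maker reading this output sees not only the point estimate 
but also the range within which the cost is likely to fall, which is 
precisely the information the ECSS, GFEBS, and GCSS-Army estimates did 
not convey. 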

A reliable cost estimate is a key variable in calculating return on 
investment, and it provides the basis for informed investment decision 
making, realistic budget formulation and program resourcing, 
meaningful progress measurement, proactive course correction, and 
accountability for results. According to OMB,[Footnote 61] programs 
must maintain current and well-documented cost estimates, and these 
estimates must encompass the full life cycle of the program. OMB 
states that generating reliable cost estimates is a critical function 
necessary to support OMB's capital programming process. Without 
reliable estimates, programs are at increased risk of experiencing 
cost increases, missed deadlines, and performance shortfalls. 

ERP Success in Transforming Business Operations Has Not Been Defined 
or Measured: 

DOD has not yet defined success for ERP implementation in the context 
of business operations and in a way that is measurable. Accepted 
practices in system development include testing the system in terms of 
the organization's mission and operations--whether the system performs 
as envisioned at expected levels of cost and risk when implemented 
within the organization's business operations. The Clinger-Cohen Act 
of 1996 recognizes the importance of performance measurement in 
requiring agencies to (1) establish goals for improving the efficiency 
and effectiveness of agency operations and (2) ensure that performance 
measurements determine how well the information technology supports 
programs of the executive agency.[Footnote 62] 

DOD also has recognized the importance of performance measures, which 
the department directs should be (1) written in terms of desired 
outcomes, (2) quantifiable, (3) able to measure the degree to which 
the desired outcome is achieved, (4) independent of the particular 
automated system tested and not focused on system-performance 
criteria, and (5) designed to include benefits to the DOD component 
and the enterprise. In regard to the ERPs, measures determining 
whether the system is being used as expected and is providing the 
desired benefits from a business perspective will vary depending on 
the specific type of business functions the system is performing. For 
example, in a logistical system, the new system and its processes may 
be expected to help accomplish such items as (1) reducing inventory 
levels; (2) increasing the inventory turnover rate, which shows that 
the items actually needed are the items being procured; and (3) 
increasing the accuracy of the projected completion dates for repair 
projects, which allows for better equipment utilization. On the other 
hand, a financial system may measure benefits in such areas as (1) 
reducing prompt payment penalties; (2) improving the financial 
statement preparation process by having the system automatically 
generate the statements, which reduces the potential for manual error; 
and (3) improving management oversight of an entity's operations and 
providing the detailed data necessary to evaluate abnormalities that 
may be detected. Developing and using specific performance measures 
to evaluate a system effort should help management understand whether 
the expected benefits are being realized. (A simple illustration of 
computing such measures follows.) 
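
As a simple illustration, the sketch below, written in Python, 
computes two business-outcome measures of the kind described above; 
the figures are invented and are not drawn from any DOD system. 

# Two illustrative outcome-oriented performance measures computed
# from hypothetical data.

# Logistics example: inventory turnover rate, i.e., how many times
# inventory is consumed and replenished per year.
annual_issues = 480.0     # $M of inventory issued during the year
average_on_hand = 160.0   # $M of inventory on hand, on average
turnover = annual_issues / average_on_hand
print(f"Inventory turnover rate: {turnover:.1f} turns per year")

# Financial example: prompt payment penalties as a share of
# invoices paid, which should fall as payments are streamlined.
invoices_paid = 250_000
penalized = 1_750
print(f"Prompt payment penalty rate: "
      f"{penalized / invoices_paid:.2%} of invoices")

Tracking such measures before and after deployment would show whether 
an ERP is delivering the intended business benefits rather than merely 
functioning as software. 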

While the definition of success and performance measures for DOD's 
ERPs will differ between organizational levels, components, and 
subcomponents, our previous work has shown that performance measures 
should be aligned toward a shared direction. In this regard, all 
members of the organization need to understand the ultimate result to 
be achieved and all parties should work toward the same goal and 
desired results. This alignment should extend throughout the 
organization and cover the activities that an entity is expected to 
perform to support the intent of the program.[Footnote 63] DOD has not 
taken actions to align the definitions and related performance 
measures used by its components to measure progress and determine 
success. 

The DCMOs told us that they had not yet developed a DOD-wide 
definition of success or related performance measures for ERPs. While 
acknowledging the importance of these practices, the officials told us 
that they are still in the early stages of implementing processes for 
managing and overseeing their business systems modernization efforts, 
in accordance with the National Defense Authorization Act for Fiscal Year 2010. 
Successful implementation of the ERPs is critical to transforming 
business operations. Without defining ERP success in terms of support 
for mission and business operations and establishing the related 
performance measures, the military services and the department cannot 
ensure that the performance of deployed ERPs has been realistically 
and accurately measured. 

Our April 2010 report,[Footnote 64] which focused on the second 
deployment of LMP at the Corpus Christi and Letterkenny Army Depots, 
illustrates the importance of establishing performance measures. Based 
on our observations at these depots, we found that the Army's 
measures for assessing LMP implementation at 
the two deployment sites did not accurately reflect whether the 
locations were able to perform their day-to-day operations using LMP 
as envisioned. Rather, the measures used by the Army assessed the 
success at the two locations from a system-software perspective. While 
this is important, performance measures from a business perspective 
were not considered to determine whether the depots were able to use 
LMP to perform their mission to repair items. Without performance 
measures to evaluate how well these systems are accomplishing their
desired goals, DOD decision makers, including program managers, do
not have all the information they need to evaluate their investments to
determine whether the individual programs are helping DOD achieve 
business transformation and thereby improve upon its primary mission 
of supporting the warfighter. 

Conclusions: 

Modernizing the department's business systems is a critical part of 
transforming DOD's business operations, addressing some of its high- 
risk areas, and providing more accurate and reliable financial 
information to the Congress on the results of DOD's operations. 
However, DOD continues to experience difficulties that hinder its 
ability to implement these efforts on time and within budget. The 
department has not followed best practices in developing a reliable
IMS for several of these modernization efforts. As a result, it lacks
assurance that these ERPs will be completed by their projected dates.
Furthermore, while DOD generally followed best practices in developing
the cost estimates for these efforts, none of the programs except
DEAMS has prepared a sensitivity analysis. The lack of a sensitivity
analysis increases the chances that decisions will be made without a
clear understanding of the possible impact on each program's estimated
costs and benefits. In addition, because none of the four programs we
assessed met best practices for schedule estimating, none of the cost
estimates could be considered credible: they did not assess the cost
effects of schedule slippage. It is critical to correct the underlying
issues to
help ensure that the billions of dollars spent annually are being used 
in the most efficient and effective manner. While modernizing its 
business systems is not a risk-free endeavor, additional funds spent 
because of schedule slippages are funds that could have been available 
for other departmental priorities. Furthermore, the longer it takes to 
implement these critical business systems, the longer the department 
will continue to use its existing duplicative, stovepiped systems 
environment and further erode the estimated savings that were to 
accrue to DOD as a result of modernizing its business systems. 

Additionally, the department has not defined the measures to ascertain 
if the systems are providing the desired functionality to achieve 
DOD's business transformation goals. DOD has stated that the 
successful implementation of the ERPs is critical to transforming its 
business operations and addressing some of its high-risk areas. 
However, we found that the department has not yet developed 
performance measures to ascertain whether the systems, once 
implemented, are providing the intended functionality. If the systems 
cannot be used for their intended purpose, transformation will be 
difficult if not impossible to achieve and the billions of dollars 
being invested in these systems may not generate the benefits and 
efficiencies as intended. Further, we reaffirm our prior 
recommendations related to the actions needed to improve the 
department's management and oversight of the ERPs. 

Recommendations for Executive Action: 

To strengthen DOD's management oversight and accountability over 
business system investments and help provide for the successful 
implementation of the ERPs, we recommend that the Secretary of Defense 
take the following eight actions: 

* Direct the Secretary of the Army to ensure that the Chief Management 
Officer of the Army directs the PMO for the GFEBS to develop an IMS 
that fully incorporates best practices. The schedule should: 

- sequence all activities, 

- assign resources to all activities, 

- integrate schedule activities horizontally and vertically, 

- establish the critical path for all activities, 

- identify float between activities, 

- conduct a schedule risk analysis, and: 

- update the schedule using logic and durations to determine dates.

* Direct the Secretary of the Army to ensure that the Chief Management 
Officer of the Army directs the PMO for GCSS-Army to develop an IMS
that fully incorporates best practices. The schedule should: 

- capture all activities, 

- sequence all activities, 

- integrate schedule activities horizontally and vertically, 

- establish the critical path for all activities, and: 

- conduct a schedule risk analysis. 

* Direct the Secretary of the Air Force to ensure that the Chief
Management Officer of the Air Force directs the PMO for DEAMS to 
develop an IMS that fully incorporates best practices. The schedule 
should: 

- capture all activities, 

- sequence all activities, 

- integrate schedule activities horizontally and vertically, 

- establish the critical path for all activities, 

- identify float between activities, 

- conduct a schedule risk analysis, and: 

- update the schedule using logic and durations to determine dates.

* Direct the Secretary of the Air Force to ensure that the Chief 
Management Officer of the Air Force directs the PMO for ECSS to 
develop an IMS that fully incorporates best practices. The schedule 
should: 

- sequence all activities, 

- assign resources to all activities, 

- integrate schedule activities horizontally and vertically, 

- establish the critical path for all activities, 

- identify float between activities, 

- conduct a schedule risk analysis, and: 

- update schedule using logic and durations to determine dates. 

* Direct the Secretary of the Army to ensure that the Chief Management 
Officer of the Army directs the PMO for GFEBS to update the cost 
estimates by preparing sensitivity and risk and uncertainty analyses 
using best practices. 

* Direct the Secretary of the Army to ensure that the Chief Management 
Officer of the Army directs the PMO for GCSS-Army to update the cost 
estimates by using actual cost and preparing a sensitivity analysis 
using best practices. 

* Direct the Secretary of the Air Force to ensure that the Chief 
Management Officer of the Air Force directs the PMO for ECSS to update 
the cost estimates by preparing a sensitivity analysis using best 
practices. 

* Direct the department's Chief Management Officer and the chief 
management officers of the military departments to establish 
performance measures based on quantitative data that will enable the 
department to assess whether each respective military service's ERP 
efforts are providing the intended business capabilities to the system 
users. 

Agency Comments and Our Evaluation: 

DOD provided written comments on a draft of this report. In its 
comments, DOD concurred with the eight recommendations and cited 
actions planned to address them. For example, the department 
recognized the importance of an integrated master schedule as a key 
program management tool fundamental to having a reliable program 
schedule. The department stated that the appropriate military
department Chief Management Officers will direct program managers to
implement the recommendations and will oversee their implementation.
Further, DOD stated that guidance will be issued
requiring DOD business systems investments to include performance 
measures that can be used to assess the expected benefits of the 
investments. Additionally, the department noted that the performance 
measures will be incorporated into DOD's Business Enterprise 
Architecture. 

We are sending copies of this report to the Secretary of Defense; the 
Secretary of the Army; the Secretary of the Navy; the Secretary of the 
Air Force; the Deputy Secretary of Defense; the Under Secretary of 
Defense (Comptroller); the Chief Management Officers of the Army, the
Navy, and the Air Force; the program management office for each 
business system that was included in the audit; and other interested 
congressional committees and members. This report also is available at 
no charge on the GAO Web site at [hyperlink, http://www.gao.gov]. 

Please contact Asif A. Khan at (202) 512-9095 or khana@gao.gov or 
Nabajyoti Barkakati at (202) 512-4499 or barkakatin@gao.gov if you or 
your staff have questions on matters discussed in this report. Contact 
points for our Offices of Congressional Relations and Public Affairs 
may be found on the last page of this report. Key contributors to this 
report are listed in appendix VI. 

Signed by: 

Asif A. Khan: 
Director: 
Financial Management and Assurance: 

Signed by: 

Nabajyoti Barkakati: 
Chief Technologist: 
Applied Research and Methods Center for Science, Technology, and 
Engineering: 

List of Requesters: 

The Honorable Evan Bayh: 
Chairman: 
The Honorable Richard Burr: 
Ranking Member: 
Subcommittee on Readiness and Management Support: 
Committee on Armed Services: 
United States Senate: 

The Honorable Thomas R. Carper: 
Chairman: 
The Honorable John McCain: 
Ranking Member: 
Subcommittee on Federal Financial Management, Government Information, 
Federal Services, and International Security: 
Committee on Homeland Security and Governmental Affairs: 
United States Senate: 

The Honorable George Voinovich: 
Ranking Member: 
Subcommittee on Oversight of Government Management, the Federal
Workforce, and the District of Columbia:
Committee on Homeland Security and Governmental Affairs: 
United States Senate: 

The Honorable Tom Coburn: 
Ranking Member: 
Permanent Subcommittee on Investigations: 
Committee on Homeland Security and Governmental Affairs: 
United States Senate: 

[End of section] 

Appendix I: Objective, Scope, and Methodology: 

Our objectives were to (1) provide the status, as of December 31, 2009,
of the nine enterprise resource planning (ERP) systems that the 
Department of Defense (DOD) identified as essential to transforming 
its business operations; (2) assess the scheduling and cost estimating 
practices of selected ERPs to determine the extent to which the 
program management offices (PMO) were applying best practices; and (3) 
ascertain whether DOD and the military departments have defined the 
performance measures to determine whether the systems will meet their 
intended business capabilities. 

To address the first objective, we obtained and reviewed information 
provided by the PMO responsible for the nine ERP efforts.[Footnote 65] 
More specifically, we obtained data related to the following for each 
program: (1) when the program was initiated, (2) the program's 
accountable official, (3) the purpose of the program, (4) the cost of 
the program, (5) the implementation schedule, (6) the number of legacy 
systems intended to be replaced, (7) the cost of the legacy systems, 
(8) the date the program was last certified, and (9) the conditions 
placed on the program by the various review boards. For the purposes 
of this report, we did not include information on the Defense 
Logistics Agency Business System Modernization/Enterprise Business 
System. According to DOD, the Business System Modernization effort was
fully implemented in July 2007 and transformed how the agency conducts 
its operations in five core business processes: order fulfillment, 
demand and supply planning, procurement, technical/quality assurance, 
and financial management. Subsequently, in September 2007, the name of 
the program was changed to the Enterprise Business System, which is a 
continuation of the ERP's capabilities to support internal agency 
operations. 

To corroborate the information obtained from the PMOs, we also
reviewed various DOD documents, such as the Enterprise Transition
Plans issued in September 2008 and December 2009; the Defense Business
Systems Management Committee meeting minutes and briefings; the
Selected Capital Investment Reports, which are prepared in support of
the funding requests for the ERPs; the Congressional Report on Defense
Business Operations for fiscal years 2009 and 2010; and the Major
Automated Information System Reports for fiscal years 2008 and 2009.
In instances where we identified discrepancies, we followed up with 
the PMOs to obtain an explanation. Most of the financial information 
in this report was obtained through interviews with or responses to 
GAO questions from knowledgeable PMO officials for the nine ERP 
systems. As part of the first objective, we also reviewed past GAO 
reports[Footnote 66] that were specific to the department's efforts to 
implement the nine ERPs to identify prior recommendations and assess 
DOD's progress in addressing the 19 recommendations discussed in these 
reports. 

To assess the scheduling and cost estimating practices of selected 
ERPs, we selected the General Fund Enterprise Business System (GFEBS), 
the Global Combat Support System-Army (GCSS-Army), the Defense 
Enterprise Accounting and Management Systems (DEAMS), and the 
Expeditionary Combat Support System (ECSS). The other programs were
excluded because (1) the Logistics Modernization Program is expected
to be fully deployed soon; (2) it is too soon to assess the
department's integrated personnel and pay efforts because of the
recent change in the Defense Integrated Military Human Resources
System implementation strategy; (3) the Defense Agencies Initiative
has yet to develop its implementation schedule for the various defense
agencies; and (4) we reported[Footnote 67] on concerns with the Marine
Corps and Navy schedule and cost estimating practices in July 2008 and
September 2008, respectively. In performing our analysis for the four
ERPs, we reviewed the schedules and cost estimates available at the 
time of our review and evaluated them using the criteria set forth in 
GAO's cost guide.[Footnote 68] In using the guide, we determined the 
extent to which each schedule was prepared in accordance with the best 
practices[Footnote 69] that are fundamental to having a reliable 
schedule. In assessing each program's cost estimates, we used the GAO 
cost guide to evaluate the PMOs' estimating methodologies, 
assumptions, and results to determine whether the cost estimates were 
comprehensive, accurate, well-documented, and credible. We discussed 
the results of our assessments with the PMOs, lead schedulers, and 
cost estimators. 

To address the third objective, we obtained and reviewed the 2009 and 
2010 reports[Footnote 70] on business transformation submitted to 
congressional defense committees by each military service to determine 
the extent to which these reports included performance measures. In 
addition, we met with the military departments' deputy chief 
management officers to obtain an understanding of how they define 
success in terms of their respective ERPs. We also met with the 
personnel within the department's DCMO office and the Director, 
Business Transformation Agency, to obtain an understanding of their 
respective roles and responsibilities for the implementation of the 
ERPs within the department. 

We conducted this performance audit from June 2009 through October 
2010 in accordance with generally accepted government auditing 
standards. Those standards require that we plan and perform the audit 
to obtain sufficient, appropriate evidence to provide a reasonable 
basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for 
our findings and conclusions based on our audit objectives. 

[End of section] 

Appendix II: Comments from the Department of Defense: 

Deputy Chief Management Officer: 
9010 Defense Pentagon: 
Washington, DC 20301-9010: 

September 23, 2010: 

Mr. Asif A. Khan: 
Director, Financial Management and Assurance:
U.S. Government Accountability Office: 
441 G Street, NW, Washington, DC 20548:

Dear Mr. Khan: 

The Department of Defense (DoD) response to the U.S. Government 
Accountability Office (GAO) draft report 10-951, "DOD Business 
Transformation: Improved Management Oversight of Business System 
Modernization Needed," dated August 25, 2010 (GAO Code 197086), is 
contained in this letter. The Department concurs with all GAO 
recommendations contained in the draft report. [The report number is 
now GAO-11-53] 

Your audit highlighted the need to incorporate best practices into 
development of an integrated master schedule (IMS). The Department 
recognizes IMS as a key program management tool fundamental to program 
schedule reliability. The appropriate Military Department Chief 
Management Officers will direct Program Managers to implement
these recommendations and have oversight over implementation. 
Additionally, the Department will issue guidance requiring DoD 
business systems investments to include performance measures which can 
be used to assess expected business benefits. For strategic alignment, 
these measures will also be incorporated into the Business Enterprise 
Architecture. 

Sincerely, 

Signed by: 

Elizabeth A. McGrath: 

[End of section] 

Appendix III: Status of DOD's Actions on Previous GAO Recommendations 
Related to Business Systems Modernization: 

Tables 6 through 10 provide information on the status of DOD's actions 
to address our recommendations in previous reports. 

Table 6: Status of DOD's Actions to Address GAO Recommendations in GAO-
07-860: 

GAO recommendation: 1. The Secretary of Defense should direct the 
Secretary of the Army and the Director, Business Transformation Agency 
(BTA) to jointly develop a concept of operations that (1) clearly 
defines the ERP vision for accomplishing total asset visibility (TAV) 
within the Army; (2) addresses how its business systems and processes, 
individually and collectively, will provide the desired functionality 
to achieve total asset visibility; and (3) determines the desired 
functionality among the selected systems; 
DOD action taken to address the recommendation: The Army's March 2010 
report to Congress[A] stated that the Army lacks a concept of 
operations that describes at a high level how the GFEBS, GCSS-Army, 
and LMP systems relate to each other and how information flows between 
and through the systems. Furthermore, the Army found that 
representatives from the three systems were not able to articulate (1) 
what specific data would be exchanged between the three systems and 
(2) which system would be considered the official system of record for 
master data that needed to be consistent between the three systems. 
The Army did not provide a timeframe for completing the concept of 
operations; 
Status of GAO recommendation: Open. 

GAO recommendation: 2. The Secretary of Defense should direct the 
Secretary of the Army and the Director, BTA to jointly develop 
policies, procedures, and processes to support the oversight and 
management of selected groupings of business systems that are intended 
to provide a specific capability or functionality, such as TAV from a 
portfolio perspective, utilizing indicators such as costs, schedule, 
performance, and risks; 
DOD action taken to address the recommendation: In June 2010, the 
Under Secretary of the Army established the Business Systems 
Information Technologies Executive Steering Group. The purpose of the 
group is to advise the Army Chief Management Officer on Army-wide 
requirements for the synchronization, integration, prioritization, and 
resourcing of Army business systems. The Army's efforts to establish 
an enterprisewide focus on systems investments should improve the 
Army's ability to oversee the billions of dollars it is investing in 
its business systems. The group meets the intent of the recommendation; 
Status of GAO recommendation: Closed. 

GAO recommendation: 3. The Secretary of Defense should direct the 
Secretary of the Army and the Director, BTA to jointly establish an 
independent verification and validation (IV&V) function for GFEBS, 
GCSS-Army, and LMP. Additionally, direct that all IV&V reports for 
each system be provided to Army management, the appropriate investment 
review board (IRB), and BTA; 
DOD action taken to address the recommendation: In August 2009, the 
Army awarded a contract to carry out the IV&V function for these 
systems. Under the contract, the contractor is to provide reports on 
each of the systems to the Program Executive Office Enterprise 
Information Systems, which reports to the Army's Deputy Chief 
Management Officer (DCMO). The Army's action to establish an IV&V 
function under the direction of the Army's DCMO, if fully and 
effectively implemented, should enable the Army to improve its 
management and oversight of its business systems investments. Given 
the responsibility of the Chief Management Officer for overseeing and 
monitoring the implementation of Army's business systems, the Army's 
action meets the intent of the recommendation; 
Status of GAO recommendation: Closed. 

GAO recommendation: 4. The Secretary of Defense should direct the 
Secretary of the Army and the Director, BTA to jointly require that 
any future GFEBS economic analysis identify costs and benefits in 
accordance with the criteria specified by DOD and Office of Management 
and Budget (OMB) guidance, to include a sensitivity analysis; 
DOD action taken to address the recommendation: While the Army has
developed an updated economic analysis, it was not prepared in
accordance with DOD and OMB guidance.[B] For example, the economic
analysis did not include a sensitivity analysis or a cost uncertainty
analysis. Cost estimates should discuss any limitations of the
analysis because of uncertainty or biases surrounding data and
assumptions. Major assumptions should be varied, and other outcomes
recomputed, to determine how sensitive the outcomes are to changes in
the assumptions (a simplified sensitivity analysis is sketched after
this table). Having a range of costs around a point estimate--the best
guess at the cost estimate, given the underlying data--is more useful
to decision makers because it conveys the level of confidence in
achieving the most likely cost and also informs management on cost,
schedule, and risks;
Status of GAO recommendation: Open. 

GAO recommendation: 5. The Secretary of Defense should direct the 
Secretary of the Army and the Director, BTA to jointly direct that LMP 
utilize system testers that are independent of the LMP system 
developers to help ensure that the system is providing the users of 
the system the intended capabilities; 
DOD action taken to address the recommendation: The Army has stated 
that LMP system testers are now independent of the system developers. 
We are in the process of evaluating the Army's actions as part of our 
ongoing work on the third deployment of LMP; 
Status of GAO recommendation: Open. 

Source: GAO analysis of data provided by DOD. 

[A] U.S. Army, Report to Congress on Business Transformation (Mar. 1, 
2010). 

[B] Department of Defense Instruction 7041.3 and OMB Circular No. A-94. 

[End of table] 
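
As discussed in table 6, a sensitivity analysis varies major
assumptions one at a time and recomputes the outcomes. The following
minimal sketch in Python shows the mechanics; the cost model and every
value in it are illustrative assumptions, not figures from any Army
estimate:

def total_cost(labor_hours, labor_rate, licenses, sustainment):
    # Hypothetical life-cycle cost model.
    return labor_hours * labor_rate + licenses + sustainment

baseline = dict(labor_hours=2e6, labor_rate=95.0,
                licenses=150e6, sustainment=400e6)
point_estimate = total_cost(**baseline)

# Vary one major assumption (labor hours) and recompute the outcome.
for factor in (-0.20, -0.10, 0.10, 0.20):
    excursion = dict(baseline,
                     labor_hours=baseline["labor_hours"] * (1 + factor))
    delta = total_cost(**excursion) - point_estimate
    print(f"labor hours {factor:+.0%}: cost changes by "
          f"${delta / 1e6:,.0f}M")

Repeating the excursion for each major assumption shows which drivers
dominate the estimate and therefore deserve the closest management
attention.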

Table 7: Status of DOD's Actions to Address GAO Recommendations in GAO-
08-822: 

GAO recommendation: 1. The Secretary of Defense direct the Secretary 
of the Navy to ensure that investment in the next acquisition phase of 
the program's first increment is conditional upon fully disclosing to 
program oversight and approval entities the steps under way or planned 
to address each of the risks discussed in the report, including the 
risk of not being architecturally compliant and being duplicative of 
related programs, not producing expected mission benefits commensurate 
with reliably estimated costs, not effectively implementing earned 
value management (EVM), not mitigating known program risks, and not
knowing whether the system is becoming more or less mature and stable. 
We further recommend that investment in all future Global Combat 
Support System-Marine Corps (GCSS-MC) increments be limited if the 
management control weaknesses that are the source of these risks, and 
which are discussed in the report, have not been fully addressed; 
DOD action taken to address the recommendation: An Enterprise Risk 
Assessment Methodology (ERAM)-based review was conducted on GCSS-MC, 
and the results were presented at a May 2009 IRB meeting. According to 
DOD, the assessment included a review of the program's risk management 
database and policies. The ERAM process identified seven risk areas, 
some of which relate to risks discussed in our report. DOD reported
that the governance-related risks identified in our report require
longer-term actions, but that the program had demonstrated compliance
with the business enterprise architecture and that the IRB reviewed
and certified that compliance in October 2009. DOD also reported that
the program implemented a new risk
management process in March 2009 and developed metrics related to 
system maturity and stability, such as metrics to track defects during 
developmental test and evaluation, and is tracking change requests and 
generating monthly trend analyses of each. In addition, DOD reported 
that the program is working closely with the Milestone Decision 
Authority, via the IRB, to correct management control weaknesses. As 
of October 2010, DOD had yet to provide the supporting documentation 
for the above actions taken by the department; 
Status of GAO recommendation: Open. 

GAO recommendation: 2. The Secretary of Defense direct the appropriate 
organization within DOD to collaborate with relevant organizations to 
standardize the cost element structure for the department's ERP 
programs and to use this standard structure to maintain cost data for 
its ERP programs, including GCSS-MC, and to use this cost data in 
developing future cost estimates; 
DOD action taken to address the recommendation: In April 2010, DOD 
reported that planning is underway within the BTA and the Office of 
Acquisition Resources and Analysis for development of a common set of 
high-level work elements, such as testing, design, and training, to 
augment detailed work breakdown structures developed by program 
managers for their respective ERP programs. DOD also stated that it 
plans to use the common set of high-level work elements, along with a 
common set of cost elements--buckets of cost types such as program 
management, technical labor, hardware, and software--to capture 
historical costs across ERP programs. DOD also stated that it still 
plans to track and maintain ERP cost data through the Business 
Capability Lifecycle Integrated Management Information Environment, 
and use the data to develop future cost estimates and an economic 
analysis. As of October 2010, DOD did not provide timeframes for 
completion of these actions; 
Status of GAO recommendation: Open. 

GAO recommendation: 3. The Secretary of Defense direct the Secretary 
of the Navy, through the appropriate chain of command, to ensure that 
the program's current economic analysis is adjusted to reflect the 
risks associated with it not reflecting cost data for comparable ERP 
programs, and otherwise not having been derived according to other key 
cost estimating practices, and that future updates to the GCSS-MC 
economic analysis similarly do so; 
DOD action taken to address the recommendation: In April 2010, DOD 
reported that the GCSS-MC program developed its Cost Analysis 
Requirements Document and Economic Analysis Development Plan in 
partnership with the Office of the Secretary of Defense (Cost Analysis 
and Program Evaluation) to ensure that the GCSS-MC economic analysis 
addresses DOD-wide assumptions and risks. DOD stated that the 
independent cost estimate prepared by the Naval Center for Cost 
Analysis and approved in January 2010 was risk adjusted and included 
cross-checks from similar ERP systems and models. As of October 2010, 
DOD had yet to provide the supporting documentation for the above 
actions; 
Status of GAO recommendation: Open. 

GAO recommendation: 4. To enhance GCSS-MC's use of EVM, we recommend 
that the Secretary of Defense direct the Secretary of the Navy, 
through the appropriate chain of command, to ensure that the program 
office (1) monitors the actual start and completion dates of work 
activities performed so that the impact of deviations on downstream 
scheduled work can be proactively addressed; (2) allocates resources, 
such as labor hours and material, to all key activities on the 
schedule; (3) integrates key activities and supporting tasks and 
subtasks; (4) identifies and allocates the amount of float time needed 
for key activities to account for potential problems that might occur 
along or near the schedule's critical path; (5) performs a schedule 
risk analysis to determine the level of confidence in meeting the 
program's activities and completion date; (6) allocates schedule 
reserve for high-risk activities on the critical path; and (7) 
discloses the inherent risks and limitations associated with any 
future use of the program's EVM reports until the schedule has been 
risk-adjusted; 
DOD action taken to address the recommendation: In April 2010, DOD
reported that the schedule was rebaselined and is now used to monitor
and report actual versus planned start and completion dates of work
activities; allocate resources to activities; integrate key activities
and supporting tasks and subtasks; identify and allocate the amount of
float time needed for key activities; and allocate schedule reserve
for high-risk activities on the critical path. DOD also reported that
the program conducted a schedule risk analysis, which resulted in more
detailed task definitions, the ability to provide detailed weekly
status reports to the program manager, and more effective analysis,
monitoring, and risk assessment of the program's scheduled activities
and completion dates (the basic EVM measures underlying this
recommendation are sketched after this table). As of October 2010, DOD
had yet to provide the supporting documentation for the above actions;
Status of GAO recommendation: Open. 

GAO recommendation: 5. The Secretary of Defense direct the Secretary 
of the Navy, through the appropriate chain of command, to ensure that 
the program office (1) adds each of the risks discussed in this report 
to its active inventory of risks, (2) tracks and evaluates the 
implementation of mitigation plans for all risks, (3) discloses to 
appropriate program oversight and approval authorities whether 
mitigation plans have been fully executed and have produced the 
intended outcome(s), and (4) only closes a risk if its mitigation plan 
has been fully executed and produced the intended outcome(s); 
DOD action taken to address the recommendation: In April 2010, DOD 
reported that the program office took a number of actions to 
strengthen risk management. First, it included all risks reported by 
GAO as well as risks identified through DOD's ERAM in its risk 
database. Second, program risks are now reviewed, tracked, and managed
on a continuous basis, and the program office conducts weekly risk 
meetings to track and evaluate mitigation plan implementation for all 
risks. Third, monthly risk boards are convened to discuss risks and 
mitigation plans with GCSS-MC senior leadership, and risks are not 
closed without risk board approval. Further, the program office meets 
monthly with the Program Executive Office for Enterprise Information 
Systems and quarterly with the Assistant Secretary of the Navy 
(Research Development and Acquisition) to discuss program risks. Also, 
the program office reports risks to other program oversight bodies, 
such as the Weapons Systems Lifecycle Management and Materiel Supply 
and Services Management IRB, and the Defense Business Systems 
Management Committee. Fourth, the program office revised its Risk 
Management Plan in March 2009 to reflect these new processes and
policies. As of October 2010, DOD had yet to provide the supporting 
documentation for the above actions; 
Status of GAO recommendation: Open. 

GAO recommendation: 6. The Secretary of Defense direct the Secretary 
of the Navy, through the appropriate chain of command, to ensure that 
the program office (1) collects the data needed to develop trends in 
unresolved system defects and change requests according to their 
priority and severity and (2) discloses these trends to appropriate 
program oversight and approval authorities; 
DOD action taken to address the recommendation: In April 2010, DOD 
reported that during system developmental test and evaluation, 
completed in October 2009, the program office developed metrics to 
track defects and correction of defects throughout the test period, 
and that the metrics were made available to the BTA and the Cost 
Analysis and Program Evaluation Office. DOD reported that the program 
office is also (1) collecting defect data over time across severity 
levels and using diagrams to show trends and (2) managing change 
requests according to its configuration management plan and generating 
trend analysis reports to track them. As of October 2010, DOD had yet 
to provide the supporting documentation for the above actions; 
Status of GAO recommendation: Open. 

Source: GAO analysis of data provided by DOD. 

[End of table] 
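
The EVM reports discussed in table 7 rest on a small set of standard
measures that compare planned value, earned value, and actual cost.
The sketch below, written in Python with hypothetical dollar values,
computes the basic variances and indices; it illustrates the standard
EVM formulas, not the GCSS-MC program's actual EVM system:

def evm_metrics(pv, ev, ac):
    # pv: planned value (budgeted cost of work scheduled)
    # ev: earned value (budgeted cost of work performed)
    # ac: actual cost of work performed
    return {
        "cost variance ($M)": (ev - ac) / 1e6,      # negative: over cost
        "schedule variance ($M)": (ev - pv) / 1e6,  # negative: behind
        "CPI": round(ev / ac, 2),  # cost index; below 1.0 is over cost
        "SPI": round(ev / pv, 2),  # schedule index; below 1.0 is behind
    }

# Hypothetical status: $120M of work planned, $100M earned, $130M spent.
print(evm_metrics(pv=120e6, ev=100e6, ac=130e6))
# {'cost variance ($M)': -30.0, 'schedule variance ($M)': -20.0,
#  'CPI': 0.77, 'SPI': 0.83}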

Table 8: Status of DOD's Actions to Address GAO Recommendations in GAO-
08-866: 

GAO recommendation: 1. The Secretary of Defense direct the Secretary 
of the Air Force to direct Air Force program management officials for 
ECSS and DEAMS to ensure that risk management activities at all levels 
of the program are identified and communicated to program management 
to facilitate oversight and monitoring. Key risks described at the 
appropriate level of detail should include and not be limited to risks 
associated with interfaces, data conversion, change management, and 
contractor oversight; 
DOD action taken to address the recommendation: In July 2009, the 
DEAMS Program Charter noted that risk management activities at all 
levels of the program would be identified and communicated to the 
program manager to facilitate oversight and monitoring. The charter 
noted that program risk will include, but not be limited to, 
interfaces, data conversion, change management, and contractor 
oversight. The charter also notes that the risk management process 
will include risk identified by various reviews, including GAO audits. 
As of July 2010, ECSS was still in the process of revising its risk 
management plan to address our recommendation; 
Status of GAO recommendation: Open. 

GAO recommendation: 2. The Secretary of Defense direct the Secretary 
of the Air Force to direct the Air Force program management offices to 
test ECSS and DEAMS on relevant computer desktop configurations prior 
to deployment at a given location; 
DOD action taken to address the recommendation: The intent of the 
recommendation was to reduce program risk and ensure that when DEAMS 
and ECSS were deployed to a given location they would operate as 
intended. According to the DEAMS PMO, the PMO has performed
appropriate testing prior to the system becoming operational and, if
necessary, changes are made prior to the implementation of the system.
The Defense Finance and Accounting Service is also participating in 
the testing, thereby helping to ensure that the accounting information 
will process correctly. As of July 2010, ECSS had not yet become 
operational at a given location; 
Status of GAO recommendation: Open. 

Source: GAO analysis of data provided by DOD. 

[End of table] 

Table 9: Status of DOD's Actions to Address GAO Recommendations in GAO-
08-896: 

GAO recommendation: 1. The Secretary of Defense direct the Secretary 
of the Navy, through the appropriate chain of command, to ensure that 
future Navy ERP estimates include uncertainty analyses of estimated 
benefits, reflect the risks associated with not having cost data for 
comparable ERP programs, and are otherwise derived in full accordance
with the other key estimating and economic analysis practices
discussed in this report;
DOD action taken to address the recommendation: In July 2010, DOD 
reported that uncertainty analysis will be applied to the Navy ERP's 
benefit estimate in support of the next milestone review, Full 
Deployment Decision Review, planned for the first quarter of fiscal 
year 2011. The benefit estimation model is being updated to include 
variations among key cost drivers, such as labor category efficiency 
and legacy system sustainment difficulty factors, through the use of 
Monte Carlo simulation (a simplified Monte Carlo uncertainty sketch
follows this table). In addition, DOD reported that the Navy ERP
program is working with the Space and Naval Warfare Systems Command
and the Naval Center for Cost Analysis as they conduct an independent
assessment of the program's life-cycle cost estimate. According to
DOD, the assessment will include a review of the risk/uncertainty
approach and methodologies used to develop the cost estimate;
Status of GAO recommendation: Open. 

GAO recommendation: 2. The Secretary of Defense direct the Secretary 
of the Navy, through the appropriate chain of command, to ensure that 
(1) an integrated baseline review on the last two releases of the 
first increment is conducted, (2) compliance against the 32 accepted 
industry earned value management (EVM) practices is verified, and (3) 
a plan to have an independent organization perform surveillance of the 
program's EVM system is developed and implemented; 
DOD action taken to address the recommendation: In July 2010, DOD 
reported that the Navy ERP program office conducted an integrated 
baseline review of its second release, which resulted in 
recommendations to mature and implement EVM processes. Because the 
third release is no longer a part of Navy ERP's program of record, 
this recommendation is not applicable to this release. In addition, 
DOD reported that the Navy Center for Earned Value Management planned 
to conduct surveillance of the Navy ERP's EVM system in September 
2010, and that it would review compliance against the 32 accepted 
industry EVM practices; 
Status of GAO recommendation: Open. 

GAO recommendation: 3. The Secretary of Defense direct the Secretary 
of the Navy, through the appropriate chain of command, to ensure that 
the schedule (1) includes the logical sequencing of all activities, 
(2) reflects whether all required resources will be available when 
needed, (3) defines a critical path that integrates all three 
releases, (4) allocates reserve for the high-risk activities on the 
entire program's critical path, and (5) incorporates the results of a 
schedule risk analysis for all three releases and recalculates program 
cost and schedule variances to more accurately determine a most likely 
cost and schedule overrun; 
DOD action taken to address the recommendation: As of July 2010, Navy 
ERP continues to make progress in addressing this recommendation. For 
example, it is using metrics to track and logically link activities 
and account for resources and their availability, and it plans to 
conduct a schedule risk assessment in September 2010 so that reserves 
can be established for high-risk activities. Further, in July 2010, 
DOD reported that it was not feasible to define a critical path 
integrating all three releases because (1) key functionality 
deliverables for the first release were completed prior to the second 
release's development and (2) the third release was removed from Navy 
ERP's program of record. However, the March 2010 metrics report shows 
that not all activities are logically sequenced, which can affect the 
calculation of the critical path and finish date. Further, because the 
schedules are not integrated and personnel are assigned to activities 
across multiple releases, if deployment activities in one schedule 
were to be delayed, the other schedule that requires the same 
resources would likely also be delayed; 
Status of GAO recommendation: Open. 

GAO recommendation: 4. The Secretary of Defense direct the Secretary 
of the Navy, through the appropriate chain of command, to ensure that 
(1) the plans for mitigating the risks associated with converting data 
from legacy systems to Navy ERP and positioning the commands for 
adopting the new business processes embedded in the Navy ERP are re-
evaluated in light of the recent experience with the Naval Air Systems 
Command (NAVAIR) and adjusted accordingly, (2) the status and results 
of these and other mitigation plans' implementation are periodically 
reported to program oversight and approval authorities, (3) these 
authorities ensure that those entities responsible for implementing 
these strategies are held accountable for doing so, and (4) each of 
the risks discussed in this report are included in the program's 
inventory of active risks and managed accordingly; 
DOD action taken to address the recommendation: The department has 
taken actions to address the intent of this recommendation. First, the 
Navy ERP program office reevaluated its plans for mitigating risks 
associated with data conversion and adopting new business processes. 
Second, the program manager and System Command officials report 
monthly to the Navy ERP Senior Integration Board (NESIB) on 
performance, and periodically brief oversight and approval authorities 
on the implementation of risk mitigation plans. Third, the NESIB 
requires actionable reporting on performance by the program manager 
and System Command officials, and the program manager is to report to 
the Milestone Decision Authority on implementation of risk mitigation 
strategies. Fourth, the program's risk inventory has been updated to 
include risks related to adopting new business processes and data 
conversion; 
Status of GAO recommendation: Closed. 

Source: GAO analysis of data provided by DOD. 

[End of table] 
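
The Monte Carlo approach cited in table 9 can be illustrated with a
minimal simulation: draw the key drivers from assumed probability
distributions many times and summarize the spread of outcomes. In the
Python sketch below, every distribution and parameter is a
hypothetical assumption, not a Navy ERP figure:

import random

N = 10_000
outcomes = []
for _ in range(N):
    # Key drivers sampled from assumed triangular distributions
    # (low, high, most likely).
    labor_efficiency = random.triangular(0.85, 1.15, 1.0)
    legacy_sustainment = random.triangular(200e6, 500e6, 300e6)
    gross_benefit = 1.2e9 * labor_efficiency
    outcomes.append(gross_benefit - legacy_sustainment)

outcomes.sort()
print(f"mean net benefit: ${sum(outcomes) / N / 1e6:,.0f}M")
print(f"80% interval: ${outcomes[int(0.1 * N)] / 1e6:,.0f}M "
      f"to ${outcomes[int(0.9 * N)] / 1e6:,.0f}M")

The resulting range, rather than a single point estimate, conveys the
level of confidence decision makers can place in the estimated
benefits.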

Table 10: Status of DOD's Actions to Address GAO Recommendations in 
GAO-09-841: 

GAO recommendation: 1. The Secretary of Defense direct the Secretary 
of the Navy, through the appropriate chain of command, to (1) revise 
the Navy ERP procedures for controlling system changes to explicitly 
require that a proposed change's life-cycle cost impact be estimated 
and considered in making change request decisions and (2) capture the 
cost and schedule impact of each proposed change in the Navy ERP 
automated control tracking tool; 
DOD action taken to address the recommendation: The Navy ERP program 
updated its Enterprise Change Request Process and Procedures to 
explicitly require that a change's life-cycle cost impact be estimated 
as part of the change control process. In addition, the change control 
tracking tool now captures cost and schedule impact information. As a
result, management of the Navy ERP's change control process has been
strengthened, and approval authorities should be provided the key
information needed to fully inform their decisions on whether to
approve a change, thus decreasing the risk of unwarranted cost
increases and schedule delays;
Status of GAO recommendation: Closed. 

GAO recommendation: 2. The Secretary of Defense direct the Secretary 
of the Navy, through the appropriate chain of command, to (1) stop 
performance of the IV&V function under the existing contract and (2) 
engage the services of an IV&V agent that is independent of all Navy 
ERP management, development, testing, and deployment activities that 
it may review; 
DOD action taken to address the recommendation: According to DOD, the 
Navy ERP program office terminated the IV&V functions under the 
existing contract on September 30, 2009, and awarded a new IV&V 
contract in September 2010; 
Status of GAO recommendation: Closed. 

Source: GAO analysis of data provided by DOD. 

[End of table] 

[End of section] 

Appendix IV: Assessments of Four DOD ERP Programs' Integrated Master 
Schedules: 

This appendix provides the results of our analysis of the extent to 
which the processes and methodologies used to develop and maintain the 
four ERP integrated master schedules meet the nine best practices 
associated with effective schedule estimating.[Footnote 71] Tables 11, 
12, 13, 14, and 15 provide the detailed results of our analyses of the 
program schedules for DEAMS, ECSS, GFEBS, and GCSS-Army compared to 
the nine best practices. 
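
To make the critical path and float concepts assessed in these tables
concrete, the following Python sketch performs the standard forward
and backward passes over a toy five-activity network (best practices
6 and 7); the activities and durations are hypothetical:

activities = {  # name: (duration in days, predecessors)
    "design":  (20, []),
    "build":   (45, ["design"]),
    "convert": (30, ["design"]),
    "test":    (25, ["build", "convert"]),
    "deploy":  (10, ["test"]),
}

# Forward pass (dict insertion order is already topological here).
early_start, early_finish = {}, {}
for name, (dur, preds) in activities.items():
    early_start[name] = max((early_finish[p] for p in preds), default=0)
    early_finish[name] = early_start[name] + dur
project_finish = max(early_finish.values())

# Backward pass to derive late dates and total float.
late_start, late_finish, total_float = {}, {}, {}
for name in reversed(list(activities)):
    succs = [s for s, (_, preds) in activities.items() if name in preds]
    late_finish[name] = min((late_start[s] for s in succs),
                            default=project_finish)
    late_start[name] = late_finish[name] - activities[name][0]
    total_float[name] = late_start[name] - early_start[name]

print([n for n in activities if total_float[n] == 0])  # critical path
print(total_float)  # "convert" can slip 15 days harmlessly

Activities with zero total float form the critical path; a slip
anywhere on that path delays the project finish date, while the float
on other activities shows how far they can slip without consequence.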

"Not met" means the program provided no evidence that satisfies any of 
the criterion. "Minimally" means the program provided evidence that 
satisfies a small portion of the criterion. "Partially" means the 
program provided evidence that satisfies about half of the criterion. 
"Substantially" means the program provided evidence that satisfies a 
large portion of the criterion. "Fully met" means the program provided 
evidence that completely satisfies the criterion. 
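
Similarly, the schedule risk analysis called for by best practice 8
can be sketched as a simple statistical simulation: sample each
activity's duration from a three-point estimate and observe the
distribution of finish dates. The network and all values below are
hypothetical:

import random

three_point = {  # activity: (optimistic, most likely, pessimistic) days
    "design": (15, 20, 35),
    "build":  (40, 45, 70),
    "test":   (20, 25, 45),
    "deploy": (8, 10, 20),
}

def simulated_finish():
    # Toy serial network: total duration is the sum of sampled
    # activity durations.
    return sum(random.triangular(low, high, mode)
               for low, mode, high in three_point.values())

N = 10_000
finishes = sorted(simulated_finish() for _ in range(N))
deterministic = sum(mode for _, mode, _ in three_point.values())  # 100
confidence = sum(f <= deterministic for f in finishes) / N
print(f"probability of finishing by day {deterministic}: "
      f"{confidence:.0%}")
print(f"80 percent confidence finish: day {finishes[int(0.8 * N)]:.0f}")

Because duration estimates are typically skewed toward overruns, the
chance of meeting the deterministic date is usually well below 50
percent; quantifying that confidence level is precisely what this
best practice asks of a program office.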

Table 11: Analysis of the Air Force's DEAMS Program Schedule: 

Best practice: 1. Capturing all activities; 
Explanation: The schedule should reflect all activities as defined in 
the project's work breakdown structure, which defines in detail the 
work necessary to accomplish a project's objectives, including 
activities to be performed by both the owner and contractors; 
Criterion met: Partially; 
GAO analysis: Our analysis found that the DEAMS program schedule is 
not fully integrated. While the DEAMS PMO maintains internal schedules 
that reflect government-only activities, these government schedules 
have no links to activities within the contractor schedule. We found 
that activities in the contractor schedule are mapped to contract line 
item numbers and assigned to integrated product teams, but many 
activities are missing contractor work breakdown structure mappings. 
PMO officials told us that because of the firm-fixed price (FFP) 
nature of the current contract, the prime contractor is not obligated 
to provide detailed insight into the contractor schedule. Instead, the 
PMO uses the contractor schedule as a starting point to develop more 
detailed internal tools, such as lower-level schedule information 
maintained in spreadsheets. But without government activities fully
integrated with contractor activities, there is no assurance either
that the schedule adequately captures all key activities necessary
for the program's completion or that the PMO can reliably estimate the
finish date for the program.

Best practice: 2. Sequencing all activities; 
Explanation: The schedule should be planned so that critical project 
dates can be met. To meet this objective, activities need to be 
logically sequenced--that is, listed in the order in which they are to 
be carried out. In particular, activities that must be completed 
before other activities can begin (predecessor activities), as well as 
activities that cannot begin until other activities are completed 
(successor activities), should be identified. This helps ensure that 
interdependencies among activities that collectively lead to the 
accomplishment of events or milestones can be established and used as 
a basis for guiding work and measuring progress; 
Criterion met: Minimally; 
GAO analysis: Our analysis of the DEAMS contractor schedule shows that 
131 of the 273 remaining activities, or 48 percent, have missing 
predecessor or successor logic. Missing predecessors or successors 
reduce the credibility of the calculated dates. If an activity that 
has no logical successor slips, the schedule will not reflect the 
effect on the critical path, float, or scheduled start dates of 
downstream activities. In addition, we found that 42 remaining
activities, or 15 percent, have "dangling" logic--that is, their start
or finish dates are not tied to logic. Of these 42 activities, 37 are
missing the logic that would determine their start dates. Because
their start dates are not determined by logic, these activities would
have to start earlier in order to finish on time if they ran longer
than their planned durations. The other 5 activities with dangling
logic are missing successors from their finish dates. In other words,
these activities
could continue indefinitely and not affect the start or finish dates 
of future activities. We found six remaining activities with start-to-
finish links. Start-to-finish links are rarely, if ever, used because 
they have the odd effect of causing a successor to finish before its 
predecessor.[A] We also found 18 links to or from summary tasks. 
Summary tasks should not have dependencies because they take their 
start date, finish date, and duration from lower-level activities. 
In addition, we found 50 remaining activities (18 percent) with Start 
No Earlier Than constraints. These are considered "soft" date 
constraints in that they allow the activity to slip into the future 
based on what happens to their predecessor activities. While 
activities may be soft constrained, for example, to represent receipt 
of delivery of equipment, in general constraining an activity's start 
date prevents managers from accomplishing work as soon as possible and 
consumes flexibility in the project. Of the remaining activities, 47 
activities are linked to their successor activities with lags, 
including lags that are greater than 100 days. Lags represent the 
passing of time between activities but are often misused to put 
activities on a specific date or to insert a buffer for risk. Lags 
should be justified because they cannot vary with risk or uncertainty. 
PMO officials noted that the contractor schedule follows a Data Item 
Description (DID) that details the preparation of the schedule. 
However, the schedule does not meet the requirements set forth in the 
DID. For example, the DID states that the schedule "shall be an 
integrated, logical network-based schedule" and that a key element of 
the schedule is the "relationship/dependency" of an activity. Without 
logically sequencing activities, the schedule cannot be used as a 
reliable basis for guiding work and measuring progress. 

Best practice: 3. Assigning resources to all activities; 
Explanation: The schedule should reflect what resources (e.g., labor, 
materials, and overhead) are needed to do the work, whether all 
required resources will be available when needed, and whether any 
funding or time constraints exist; 
Criterion met: Fully met; 
GAO analysis: Because of the current FFP contractual arrangement, the 
government does not have insight into the contractor's efforts to 
assign resources to activities. However, contractor officials provided 
evidence that resources have been assigned to activities within their 
schedule. In addition, PMO officials assign and monitor individual 
government resources to lower-level activities that are updated in 
internal tools outside the delivered contractor schedule. 

Best practice: 4. Establishing the duration of all activities; 
Explanation: The schedule should realistically reflect how long each 
activity will take to execute. In determining the duration of each 
activity, the same rationale, historical data, and assumptions used 
for cost estimating should be used. Durations should be as short as 
possible and have specific start and end dates. The schedule should be 
continually monitored to determine when forecasted completion dates 
differ from planned dates; this information can be used to determine 
whether schedule variances will affect subsequent work; 
Criterion met: Substantially; 
GAO analysis: The majority of remaining activities in the contractor 
schedule meet best practices for durations. However, there are 50
activities (18 percent) with planned durations longer than 44 days,
exceeding the best practice for activity duration.[B] There are 7 (3
level-of-effort activities with durations greater than 1,200 days. 
These level-of-effort activities drive the end date of the project and 
hence adversely affect the calculation of the critical path--the 
longest duration path through the sequenced list of activities. Level-
of-effort activities, such as systems engineering and program 
management, should not define the critical path because they are 
nondiscrete support activities that do not produce a definite end 
product. 

Best practice: 5. Integrating schedule activities horizontally and 
vertically; 
Explanation: The schedule should be horizontally integrated, meaning 
that it should link products and outcomes associated with other 
sequenced activities. These links are commonly referred to as 
"handoffs" and serve to verify that activities are arranged in the 
right order to achieve aggregated products or outcomes. The schedule 
should also be vertically integrated, meaning that the dates for 
starting and completing activities in the integrated master schedule 
should be aligned with the dates for supporting tasks and subtasks. 
Such mapping or alignment among levels enables different groups to 
work to the same master schedule; 
Criterion met: Minimally; 
GAO analysis: Vertical integration--that is, the ability to 
consistently trace work breakdown structure elements between detailed, 
intermediate, and master schedules--is demonstrated to some extent
because of the DEAMS PMO's efforts to enhance its insight into
contractor effort despite the FFP contract environment. PMO officials
stated that
while the contractor is under no obligation to provide detailed 
activities in the contractor schedule, the government has broken down 
areas such as object development and testing into detailed activities 
with internal tools that allow for weekly monitoring and status 
checking. However, we could not fully establish the link between the 
internal updating of activities by the government in lower-level 
spreadsheets and the high-level schedule delivered by the contractor. 
Issues with missing dependencies, activities with dangling logic, 
overuse of lags, and critical level-of-effort activities prevent the 
contractor schedule from fully complying with the requirement of 
horizontal integration--that is, the overall ability of the schedule 
to depict relationships between different program elements and product 
handoffs. PMO officials stated that rather than using the high-level 
contractor schedule, government and contractor subject matter experts 
meet each week to discuss progress on ongoing activities using other 
internal management tools. If activities are delayed or accelerated, 
the experts discuss potential impacts to downstream activities and 
provide management with weekly to daily information on these impacts. 
But while subject matter experts may understand the impacts of delayed 
activities, senior decision makers may not be aware of near-critical 
activities that have the potential to significantly delay the project, 
nor do they have the proper insight into available float--the amount 
of time an activity can slip before it delays the finish date of the 
project--that can be used to mitigate the risk of critical or near- 
critical activities. 

Best practice: 6. Establishing the critical path for all activities; 
Explanation: Scheduling software should be used to identify the 
critical path, which represents the chain of dependent activities with 
the longest total duration. Establishing a project's critical path is 
necessary to examine the effects of any activity slipping along this 
path. Potential problems along or near the critical path should also 
be identified and reflected in scheduling the duration of high-risk 
activities; 
Criterion met: Minimally; 
GAO analysis: Our analysis could not determine a valid critical path 
within the DEAMS contractor schedule, particularly because over 60 
percent of remaining activities have missing or incomplete logic, and 
because level-of-effort activities (over 1,200 days long) define the 
start and finish dates of the project. Level-of-effort activities, 
such as systems engineering and program management, should not define 
the critical path because they are nondiscrete support activities that 
do not produce a definite end product. PMO officials acknowledged that 
a critical path cannot be calculated within the schedule and stated 
that the contractor schedule is used only as a starting point for more 
detailed internal tracking tools such as spreadsheets. Detail is not 
available within the contractor schedule because of the current FFP 
contract with the contractor. PMO officials also stated that 
establishing a traditional critical path is not possible in a complex 
ERP environment because there is no one clear path through development 
or testing. Rather than use the high-level contractor schedule,
government and contractor subject matter experts meet on a weekly to 
daily basis to discuss progress on ongoing activities using other 
internal management tools. If activities are delayed or accelerated, 
the experts discuss potential impacts to downstream activities and 
provide management with weekly to daily information on these impacts. 
But senior decision makers may not be aware of near-critical 
activities nor have the proper insight into available float that can 
be used to mitigate the risks associated with these activities. In 
addition, PMO officials noted that the contractor schedule follows a 
DID that details the preparation of the schedule. However, the 
contractor schedule does not meet the requirements set forth in the 
DID. The DID states that a critical path is a key element of the 
detailed schedule; that "the critical path and near-critical paths are 
calculated by the scheduling software"; and that "the critical path 
shall be easily identified." 

Best practice: 7. Identifying reasonable float; 
Explanation: The schedule should identify the float--the amount of 
time by which a predecessor activity can slip before the delay affects 
successor activities--so that a schedule's flexibility can be 
determined. As a general rule, activities along the critical path have 
the least float. Total float is the total amount of time by which an 
activity can be delayed without delaying the project's completion, if 
everything else goes according to plan; 
Criterion met: Minimally; 
GAO analysis: Our analysis found that float calculations within the 
DEAMS contractor schedule are not reliable because of the improper 
linking of summary tasks. In addition, because the schedule is missing 
dependencies, float estimates will be miscalculated because float is 
directly related to the logical sequencing of events. PMO officials 
told us that internal activity tracking and monitoring tools used in 
lieu of the detailed contractor activities do not allow insight into 
float calculations. PMO officials noted that the contractor schedule 
follows a DID that details the preparation of the schedule. However, 
the contractor schedule does not meet the requirements set forth in 
the DID. The DID states total float is a key element of the detailed 
schedule to be delivered monthly. Without float estimates, management 
may be unable to allocate resources from non-critical activities to 
activities that cannot slip without affecting the project finish date. 

Best practice: 8. Conducting a schedule risk analysis; 
Explanation: A schedule risk analysis should be performed using 
statistical techniques to predict the level of confidence in meeting a 
project's completion date. This analysis focuses not only on critical 
path activities but also on activities near the critical path, since 
they can affect the project's status; 
Criterion met: Minimally; 
GAO analysis: The program office has not performed a schedule risk 
analysis because the schedule is not used as a primary 
tool for monitoring the status of the program. However, program 
officials stated that they have tied risks to what subject matter 
experts consider to be critical path activities. They stated that they 
proactively monitor risk on a weekly basis by assigning a probability 
to the risk, examining the potential impact of the risk on activities 
if it is realized, and developing mitigation plans to be executed if 
the risk is realized. However, the risk assessments cannot be used to 
calculate the overall probability of finishing the project on time. 
Since any task can become critical if it is delayed long enough, 
complete schedule logic and a comprehensive risk assessment are 
essential tools for decision makers. A schedule risk analysis can be 
used to determine a level of confidence in meeting the completion date 
or whether proper reserves have been incorporated into the schedule. A 
schedule risk analysis will calculate schedule reserve, which can be 
set aside for those activities identified as high risk. Without this 
reserve, the program faces the risk of delays to the scheduled 
completion date if any delays were to occur on critical path 
activities. In addition, PMO officials noted that the contractor 
schedule follows a DID that details the preparation of the schedule. 
However, the contractor schedule does not meet the requirements set 
forth in the DID. The DID states that a key element of the detailed 
schedule is a schedule risk analysis that "predicts the probability of 
project completion by contractual dates" using three-point estimates 
of remaining activity durations. (An illustrative simulation of this 
kind of analysis follows this table.) 

Best practice: 9. Updating the schedule using logic and durations to 
determine the dates; 
Explanation: The schedule should be continuously updated using logic 
and durations to determine realistic start and completion dates for 
program activities. The schedule should be analyzed continuously for 
variances to determine when forecasted completion dates differ from 
planned dates. This analysis is especially important for those 
variations that impact activities identified as being in a project's 
critical path and can impact a scheduled completion date; 
Criterion met: Minimally; 
GAO analysis: Our analysis shows the contractor schedule does not have 
a status date (or data date), nor did the program office expect one. A 
status date denotes the date of the latest update to the schedule and 
thus defines the point in time at which completed work and remaining 
work are calculated. Officials stated that the status date is 
reflected by the month in the schedule file name, but because no day 
is given, there is no indication of whether the date reflects the 
beginning or end of the calendar month or the beginning or end of the 
contractor accounting period. Regardless of the exact date, we found 
31 activities that had actual starts in future months relative to the 
month in the file name. That is, according to the schedule, these 
activities had actually started in the future. For example, the 
schedule file name is November 2009, yet we found actual start dates 
for activities in December 2009, February 2010, and April 2010. PMO 
officials noted that the contractor schedule follows the DID that 
details the preparation of the schedule. However, the contractor 
schedule does not meet the requirements set forth in the DID. The DID 
states that "actual start and actual finish dates, as recorded, shall 
not be later than the status date." PMO officials stated that rather 
than use the high-level contractor schedule, which does not give the 
required activity detail, government and contractor subject matter 
experts meet on a weekly to daily basis to discuss progress on ongoing 
activities using other internal management tools. For example, the 
Testing Integrated Product Team meets daily to review tasks that have 
been performed that day. If deadline criteria are not met, senior 
decision makers are alerted to potential impacts to the schedule. 
However, the schedule should use logic and durations in order to 
reflect realistic start and completion dates for program activities. 
The schedule should also be continually monitored to determine when 
forecasted completion dates differ from the planned dates, which can 
be used to determine whether schedule variances will affect downstream 
work. Maintaining the integrity of the schedule logic is not only 
necessary to reflect true status, but is also required before 
conducting a schedule risk analysis. 

Source: GAO analysis based on data provided by the DEAMS PMO. 

[A] Activities need to have certain predecessor-successor 
relationships so that the schedule gives correct results when 
activities are updated or when durations change. Two logic 
requirements have to be met: (1) a finish-to-start or start-to-start 
predecessor, so that if the activity takes longer than scheduled it 
does not simply start earlier automatically, and (2) a finish-to-start 
or finish-to-finish successor that will be "pushed" into the future if 
the activity takes longer or finishes later. 

[B] The Naval Air Systems Command recommends keeping individual task 
durations to less than two calendar months (44 working days). The 
shorter the duration of the tasks in the schedule, the more often the 
Control Account Managers are compelled to update completed work, which 
more accurately reflects the actual status of the tasks. When task 
durations are too long, management insight into the actual status of 
the activity is reduced. 

[End of table] 
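The following minimal sketch illustrates the kind of statistical 
schedule risk analysis the DID calls for. It is not DEAMS tooling or 
data: the activity network, the three-point duration estimates, and 
the contractual deadline are all hypothetical. The sketch simply shows 
how three-point estimates and simulation yield a confidence level for 
a completion date. 

  import random

  # Hypothetical activity network. Each activity carries optimistic,
  # most-likely, and pessimistic remaining durations (working days)
  # and a list of finish-to-start predecessors. Not DEAMS data.
  activities = {
      "design":  {"est": (20, 30, 55),  "preds": []},
      "build":   {"est": (40, 60, 100), "preds": ["design"]},
      "convert": {"est": (15, 25, 60),  "preds": ["design"]},
      "test":    {"est": (20, 35, 80),  "preds": ["build", "convert"]},
  }
  order = ["design", "build", "convert", "test"]  # topological order

  def one_trial():
      # Draw each duration from a triangular distribution, then run a
      # forward pass to compute the project finish for this trial.
      finish = {}
      for name in order:
          optimistic, likely, pessimistic = activities[name]["est"]
          duration = random.triangular(optimistic, pessimistic, likely)
          start = max((finish[p] for p in activities[name]["preds"]),
                      default=0.0)
          finish[name] = start + duration
      return max(finish.values())

  trials = sorted(one_trial() for _ in range(10000))
  deadline = 150  # hypothetical contractual finish, in working days
  confidence = sum(t <= deadline for t in trials) / len(trials)
  print(f"P(finish within {deadline} days) = {confidence:.0%}")
  print(f"80th-percentile finish: {trials[int(0.8 * len(trials))]:.0f} days")

Run over the full schedule network, a simulation of this kind also 
shows how much schedule reserve would be needed to reach a given 
confidence level--the reserve that, as noted above, can be set aside 
for activities identified as high risk. 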

Table 12: Analysis of the Air Force's ECSS Solutions Development 
Project Schedule: 

Best practice: 1. Capturing all activities; 
Explanation: The schedule should reflect all activities as defined in 
the project's work breakdown structure, which defines in detail the 
work necessary to accomplish a project's objectives, including 
activities to be performed by both the owner and contractors; 
Criterion met: Substantially; 
GAO analysis: While the PMO does have detailed schedules of government 
effort--a commendable best practice--these are not fully integrated 
into an integrated master schedule (IMS) with the contractor 
schedules. Our analysis found that the ECSS Solutions Development 
schedule contains 215 detail activities associated with government 
effort, representing dependencies between contractor and government 
activities. However, the government activities are not completely 
linked to government schedules maintained and updated by the 
government PMO. Our analysis found that activities in the Solutions 
Development workstream schedule are mapped to contractor work 
breakdown structure elements and can be traced to completion criteria 
and descriptions of associated work products. 

Best practice: 2. Sequencing all activities; 
Explanation: The schedule should be planned so that critical project 
dates can be met. To meet this objective, activities need to be 
logically sequenced--that is, listed in the order in which they are to 
be carried out. In particular, activities that must be completed 
before other activities can begin (predecessor activities), as well as 
activities that cannot begin until other activities are completed 
(successor activities), should be identified. This helps ensure that 
interdependencies among activities that collectively lead to the 
accomplishment of events or milestones can be established and used as 
a basis for guiding work and measuring progress; 
Criterion met: Partially; 
GAO analysis: Our analysis shows that 31 of the 1,901 remaining 
activities, or 2 percent, have missing predecessor or successor logic. 
This is a relatively low number for such a highly integrated schedule, 
but any number of missing predecessors or successors can reduce the 
credibility of calculated dates. If an activity that has no logical 
successor slips, the schedule will not reflect the effect on the 
critical path, float, or scheduled start dates of future activities. 
However, of those remaining activities that have logical predecessor 
and successor links, 259 activities (14 percent) have "dangling 
logic." Of these 259 activities with dangling logic, 229 activities 
are missing logic that would determine their start dates. Because 
their start dates are not determined by logic, these activities would 
have to start earlier in order to finish on time if they ran longer 
than their planned durations. The other 30 activities with dangling 
logic are missing successor links from their finish dates. In other words, 
these activities could continue indefinitely and not affect the start 
or finish dates of downstream activities. The schedule includes four 
Must Finish On constraints. A Must Finish On constraint is considered 
a "hard" date constraint because it prevents the activity from 
finishing earlier or later than its planned date. This renders the 
schedule rigid and prevents the schedule from being dynamic. A Must 
Finish On constraint is artificial and makes the scheduled activity 
appear to be on track to finish on time when it may not be. There are 
also 17 Start No Earlier Than constraints within the schedule. These 
are considered "soft" constraints in that they allow the activity to 
slip into the future based on what happens to their predecessor 
activities. Activities may be soft constrained, for example, to 
represent receipt of delivery of equipment. However, in general 
constraining an activity's start date prevents managers from 
accomplishing work as soon as possible and consumes flexibility in the 
project. We found 78 start-to-finish links within the schedule. Start-
to-finish links are rarely, if ever, used in scheduling practice 
because they have the odd effect of causing a successor activity to 
finish when its predecessor starts. Moreover, we found that each of 
the 78 start-to-finish links has a 3- to 7-day lag. That is, the 
schedule logic dictates that these successors must finish a set number 
of days before their predecessors begin. Start-to-finish logic is at 
best confusing and at worst incorrect; activities should be rearranged 
to find true predecessors and successors and linked with 
straightforward logic. Of the 1,901 remaining activities, 467 
activities are linked to their successor activities with lags. Lags 
represent the passing of time between activities but are often misused 
to put activities on a specific date or to insert a buffer for risk. 
Lags should be justified because they cannot have risk or uncertainty. 
Without logically sequencing activities, the schedule cannot be used 
as a reliable basis for guiding work and measuring progress. (A brief 
sketch of these logic checks follows this table.) 

Best practice: 3. Assigning resources to all activities; 
Explanation: The schedule should reflect what resources (e.g., labor, 
materials, and overhead) are needed to do the work, whether all 
required resources will be available when needed, and whether any 
funding or time constraints exist; 
Criterion met: Minimally; 
GAO analysis: The ECSS PMO stated that the government is aware that 
the contractor assigns resources to activities, but the government has 
no detailed insight into the resources because of the current FFP 
contractual arrangement. However, the program office was not able to 
provide evidence that would confirm that the schedule is resource 
loaded. Resource information would assist the program office in 
forecasting the likelihood of activities being completed based on 
their projected end dates. If the current schedule does not allow for 
insight into current or projected over-allocation of resources, then 
the risk of the program slipping is significantly increased. 

Best practice: 4. Establishing the duration of all activities; 
Explanation: The schedule should realistically reflect how long each 
activity will take to execute. In determining the duration of each 
activity, the same rationale, historical data, and assumptions used 
for cost estimating should be used. Durations should be as short as 
possible and have specific start and end dates. The schedule should be 
continually monitored to determine when forecasted completion dates 
differ from planned dates; 
this information can be used to determine whether schedule variances 
will affect subsequent work; 
Criterion met: Substantially; 
GAO analysis: Eighty-eight percent of remaining activities meet best 
practices for durations, being less than 44 days (or two working 
months). Seventy activities (4 percent) have longer than 100-day 
durations; the PMO has identified the majority of these as level-of-
effort support activities. Twenty-five of these level-of-effort 
activities span the start and end dates of the project and appear in 
the schedule as critical activities. Level-of-effort activities, such 
as systems engineering and program management, cannot define the 
critical path because they are nondiscrete support activities that do 
not produce a definite end product. 

Best practice: 5. Integrating schedule activities horizontally and 
vertically; 
Explanation: The schedule should be horizontally integrated, meaning 
that it should link products and outcomes associated with other 
sequenced activities. These links are commonly referred to as 
"handoffs" and serve to verify that activities are arranged in the 
right order to achieve aggregated products or outcomes. The schedule 
should also be vertically integrated, meaning that the dates for 
starting and completing activities in the integrated master schedule 
should be aligned with the dates for supporting tasks and subtasks. 
Such mapping or alignment among levels enables different groups to 
work to the same master schedule; 
Criterion met: Partially; 
GAO analysis: We found that vertical integration--that is, the ability 
to consistently trace work breakdown structure elements between 
detailed, intermediate, and master schedules--is demonstrated because 
the overall ECSS schedule is made up of individual project schedules 
like the Solutions Development schedule. However, issues with reliance 
on hard date constraints, the overuse of lags, critical level-of-
effort tasks, and instances of convoluted logic such as start-to-
finish links keep this detailed schedule from fully complying with the 
requirement of horizontal integration--that is, the overall ability of 
the schedule to depict relationships between different program 
elements and product handoffs. Horizontal integration demonstrates 
that the overall schedule is rational, planned in a logical sequence, 
accounts for interdependencies between work and planning packages, and 
provides a way to evaluate current status. 

Best practice: 6. Establishing the critical path for all activities; 
Explanation: Scheduling software should be used to identify the 
critical path, which represents the chain of dependent activities with 
the longest total duration. Establishing a project's critical path is 
necessary to examine the effects of any activity slipping along this 
path. Potential problems along or near the critical path should also 
be identified and reflected in scheduling the duration of high-risk 
activities; 
Criterion met: Partially; 
GAO analysis: Our analysis could not determine a valid critical path--
the longest duration path through the sequenced list of activities--
because level-of-effort activities define the start and finish dates 
of the detail planning portion of the project. Level-of-effort 
activities should not drive the critical path because they only serve 
to support detail work activities. The PMO acknowledged that a 
critical path would be difficult to calculate within the schedule 
because the project schedules are team-oriented rather than product-
oriented, which causes complex linking relationships. While a true 
critical path does not exist throughout all 46 project schedules, 
program management reviews a high-level, manually constructed 
"Critical Events" schedule that tracks the status of major program 
milestones. These major program milestones are linked to lower-level 
schedules, and their status is updated daily and reviewed each week by 
program management. However, it is important that the lower-level 
schedules include complete logic that addresses the relationships 
between predecessor and successor activities, because any activity can 
become critical under some circumstances. Without clear insight into a 
critical path at the project level, management will not be able to 
monitor critical or near-critical detail activities that may have a 
detrimental impact on downstream activities if delayed. 

Best practice: 7. Identifying reasonable float; 
Explanation: The schedule should identify the float--the amount of 
time by which a predecessor activity can slip before the delay affects 
successor activities--so that a schedule's flexibility can be 
determined. As a general rule, activities along the critical path have 
the least float. Total float is the total amount of time by which an 
activity can be delayed without delaying the project's completion, if 
everything else goes according to plan; 
Criterion met: Partially; 
GAO analysis: Most remaining tasks appear to have reasonable total 
float, but there are 587 activities (31 percent) with over 50 days (2 
working months) of total float. In other words, according to the 
schedule, 587 remaining activities (31 percent) could be delayed by 
more than 2 working months and not delay the final activity in the 
Solutions 
Development schedule. Activities with large float values may indicate 
a lack of completeness in the schedule logic. The PMO stated that 
total float is monitored by management in higher-level milestone 
schedules, not lower-level project schedules. Incorrect float 
estimates will result in an invalid critical path and an inability 
to allocate resources from non-critical activities to 
activities that cannot slip without affecting the project finish date. 

Best practice: 8. Conducting a schedule risk analysis; 
Explanation: A schedule risk analysis should be performed using 
statistical techniques to predict the level of confidence in meeting a 
project's completion date. This analysis focuses not only on critical 
path activities but also on activities near the critical path, since 
they can affect the project's status; 
Criterion met: Not met; 
GAO analysis: PMO officials stated that while the program reviews the 
schedule on a weekly basis and assesses risks to the program, it has 
not performed a schedule risk analysis. Best practices suggest that a 
schedule risk analysis can be used to determine a level of confidence 
in meeting the completion date or whether proper reserves have been 
incorporated into the schedule. Such an analysis will calculate 
schedule reserve, which can be set aside for those activities 
identified as high risk. Without this reserve, the program faces the 
risk of delays to the scheduled completion date if any delays were to 
occur in critical path activities. 

Best practice: 9. Updating the schedule using logic and durations to 
determine the dates; 
Explanation: The schedule should be continuously updated using logic 
and durations to determine realistic start and completion dates for 
program activities. The schedule should be analyzed continuously for 
variances to determine when forecasted completion dates differ from 
planned dates. This analysis is especially important for those 
variations that impact activities identified as being in a project's 
critical path and can impact a scheduled completion date; 
Criterion met: Partially; 
GAO analysis: The status date for the version of the schedule we 
analyzed is January 1, 2010, a federal holiday. A status date denotes 
the date of the latest update to the schedule and thus defines the 
point in time at which completed work and remaining work are 
calculated. The PMO could not confirm that this date was correct. 
Assuming the status date is correct, we found several date anomalies 
within the schedule, suggesting that management may need to review how 
and when the schedule is updated. We found 29 activities (2 percent) 
that should have started but have no actual start date; 
24 activities (1 percent) that should have finished but have no actual 
finish date; and 9 milestone activities with actual finish dates in 
the future. In addition, we found 24 instances (1 percent) of out-of- 
sequence logic--that is, actual progress being recorded on activities 
that, according to schedule logic, should not have begun yet. This is 
a common occurrence in scheduling, as actual events often override 
planned logic. However, some of these successor activities are planned 
to begin 2 to 3 months in the future, suggesting that the schedule 
logic should be updated to reflect changes. 

Source: GAO analysis based on data provided by the ECSS PMO. 

Note: The ECSS program schedule consists of a master schedule with 46 
embedded project schedules representing individual product teams, or 
workstreams. The 46 schedules include 2 high-level schedules, one 
dedicated to key date milestones and another to critical events. Two 
project schedules were chosen based on their importance to the program 
and the high amount of activity currently associated with the product 
team. 

[End of table] 
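The missing-logic and dangling-logic counts reported for best practice 
2 above are mechanical tests that can be run against any schedule 
export. The following is a minimal sketch using a hypothetical four-
task list rather than ECSS data; link types follow standard scheduling 
notation (FS = finish-to-start, SS = start-to-start, FF = finish-to-
finish, SF = start-to-finish). 

  # Hypothetical tasks; each predecessor link is (task id, link type).
  tasks = {
      1: {"name": "Design interface",   "preds": []},
      2: {"name": "Build interface",    "preds": [(1, "FS")]},
      3: {"name": "Document interface", "preds": [(2, "SS")]},
      4: {"name": "Test interface",     "preds": [(2, "FS"), (3, "FF")]},
  }

  # Invert the predecessor links so each task also knows its successors.
  succs = {tid: [] for tid in tasks}
  for tid, task in tasks.items():
      for pred_id, link_type in task["preds"]:
          succs[pred_id].append((tid, link_type))

  for tid, task in tasks.items():
      pred_types = {lt for _, lt in task["preds"]}
      succ_types = {lt for _, lt in succs[tid]}
      # Missing logic: no predecessors or no successors at all. (A real
      # check would exempt the project start and finish milestones.)
      if not task["preds"]:
          print(f"task {tid}: no predecessors")
      elif pred_types.isdisjoint({"FS", "SS"}):
          # Dangling start: nothing drives the start date, so the task
          # would have to start earlier to finish on time if it ran long.
          print(f"task {tid}: dangling start (no FS or SS predecessor)")
      if not succs[tid]:
          print(f"task {tid}: no successors")
      elif succ_types.isdisjoint({"FS", "FF"}):
          # Dangling finish: a slip of this task's finish pushes nothing
          # downstream, so the task could continue indefinitely.
          print(f"task {tid}: dangling finish (no FS or FF successor)")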

Table 13: Analysis of the Air Force's ECSS Reports, Interfaces, 
Conversions, and Extensions (RICE) Program Schedule: 

Best practice: 1. Capturing all activities; 
Explanation: The schedule should reflect all activities as defined in 
the project's work breakdown structure, which defines in detail the 
work necessary to accomplish a project's objectives, including 
activities to be performed by both the owner and contractors; 
Criterion met: Substantially; 
GAO analysis: While the PMO does have detailed schedules of government 
effort--a commendable best practice--these are not fully integrated 
into an integrated master schedule (IMS) with the contractor 
schedules. Our analysis found that the ECSS RICE schedule contains 
"touch points," or links between government and contractor activities, 
representing dependencies between contractor and government 
activities. However, the government activities are not completely 
linked to government schedules maintained and updated by the 
government PMO. Our analysis found that activities in the RICE 
workstream schedule are mapped to contractor work breakdown structure 
elements and can be traced to completion criteria and descriptions of 
associated work products. 

Best practice: 2. Sequencing all activities; 
Explanation: The schedule should be planned so that critical project 
dates can be met. To meet this objective, activities need to be 
logically sequenced--that is, listed in the order in which they are to 
be carried out. In particular, activities that must be completed 
before other activities can begin (predecessor activities), as well as 
activities that cannot begin until other activities are completed 
(successor activities), should be identified. This helps ensure that 
interdependencies among activities that collectively lead to the 
accomplishment of events or milestones can be established and used as 
a basis for guiding work and measuring progress; 
Criterion met: Partially; 
GAO analysis: Our analysis shows that 472 of the 4,433 remaining 
activities, or 11 percent, have missing logic. Missing predecessors or 
successors are usually a signal of broken logic and reduce the 
credibility of the calculated dates. If an activity that has no 
logical successor slips, the schedule will not reflect the effect on 
the critical path, float, or scheduled start dates of future 
activities. In addition, we found that 820 remaining activities, or 
19 percent, have "dangling" logic. Of these 820 activities with dangling 
logic, 241 activities are missing logic that would determine their 
start dates. Because their start dates are not determined by logic, 
these activities would have to start earlier in order to finish on 
time if they ran longer than their planned durations. The other 579 
activities with dangling logic are missing successor links from their 
finish dates. In other words, these activities could continue 
indefinitely and 
not affect the start or finish dates of future activities. We found 
277 Start No Earlier Than constraints (6 percent) within the schedule. 
These are considered "soft" date constraints in that they allow the 
activity to slip into the future based on what happens to their 
predecessor activities. Activities may be soft constrained, for 
example, to represent receipt of delivery of equipment. However, in 
general constraining an activity's start date prevents managers from 
accomplishing work as soon as possible and consumes flexibility in the 
project. Of the remaining activities, 91 activities (2 percent) are 
linked to their successor activities with lags, including a lag 
greater than 100 days. Lags represent the passing of time between 
activities but are often misused to put activities on a specific date 
or to insert a buffer for risk. Lags should be justified because they 
cannot have risk or uncertainty. Without logically sequencing 
activities, the schedule cannot be used as a reliable basis for 
guiding work and measuring progress. 

Best practice: 3. Assigning resources to all activities; 
Explanation: The schedule should reflect what resources (e.g., labor, 
materials, and overhead) are needed to do the work, whether all 
required resources will be available when needed, and whether any 
funding or time constraints exist; 
Criterion met: Minimally; 
GAO analysis: The ECSS PMO stated that the government is aware that 
the contractor assigns resources to activities, but the government has 
no detailed insight into the resources because of the current FFP 
contractual arrangement. However, the program office was not able to 
provide evidence that would confirm the schedule is resource loaded. 
Resource information would assist the program office in forecasting 
the likelihood of activities being completed based on their projected 
end dates. If the current schedule does not allow for insight into 
current or projected over-allocation of resources, then the risk of 
the program slipping is significantly increased. 

Best practice: 4. Establishing the duration of all activities; 
Explanation: The schedule should realistically reflect how long each 
activity will take to execute. In determining the duration of each 
activity, the same rationale, historical data, and assumptions used 
for cost estimating should be used. Durations should be as short as 
possible and have specific start and end dates. The schedule should be 
continually monitored to determine when forecasted completion dates 
differ from planned dates; this information can be used to determine 
whether schedule variances will affect subsequent work; 
Criterion met: Substantially; 
GAO analysis: Ninety-seven percent of the remaining activities meet 
best practices for durations, being less than 44 days (or two working 
months). Sixty activities (1 percent) have longer than 100-day 
durations, which the PMO has identified as level-of-effort support 
activities. Forty-two of these level-of-effort activities span the 
start and end dates of the project and appear in the schedule as 
critical activities. Level-of-effort activities, such as systems 
engineering and program management, cannot define the critical path 
because they are nondiscrete support activities that do not produce a 
definite end product. 

Best practice: 5. Integrating schedule activities horizontally and 
vertically; 
Explanation: The schedule should be horizontally integrated, meaning 
that it should link products and outcomes associated with other 
sequenced activities. These links are commonly referred to as 
"handoffs" and serve to verify that activities are arranged in the 
right order to achieve aggregated products or outcomes. The schedule 
should also be vertically integrated, meaning that the dates for 
starting and completing activities in the integrated master schedule 
should be aligned with the dates for supporting tasks and subtasks. 
Such mapping or alignment among levels enables different groups to 
work to the same master schedule; 
Criterion met: Partially; 
GAO analysis: We found that vertical integration--that is, the ability 
to consistently trace work breakdown structure elements between 
detailed, intermediate, and master schedules--is demonstrated because 
the overall ECSS schedule is made up of individual project schedules 
like the RICE schedule. However, issues with missing dependencies, 
activities with dangling logic, overuse of lags, and critical level-of-
effort activities keep this detailed schedule from being fully 
compliant with the requirement of horizontal integration--that is, the 
overall ability of the schedule to depict relationships between 
different program elements and product handoffs. Horizontal 
integration demonstrates that the overall schedule is rational, 
planned in a logical sequence, accounts for interdependencies between 
work and planning packages, and provides a way to evaluate current 
status. 

Best practice: 6. Establishing the critical path for all activities; 
Explanation: Scheduling software should be used to identify the 
critical path, which represents the chain of dependent activities with 
the longest total duration. Establishing a project's critical path is 
necessary to examine the effects of any activity slipping along this 
path. Potential problems along or near the critical path should also 
be identified and reflected in scheduling the duration of high-risk 
activities; 
Criterion met: Partially; 
GAO analysis: Our analysis could not determine a valid critical path--
the longest duration path through the sequenced list of activities--
because nearly 30 percent of remaining activities have missing or 
incomplete logic, and because level-of-effort tasks (209 days long) 
define the start and finish dates of the project. Level-of-effort 
activities should not drive the critical path because they only serve 
to support detail work activities. The government PMO acknowledged 
that a critical path would be difficult to calculate within the 
schedule because the project schedules are team-oriented rather than 
product-oriented, which causes complex linking relationships. While a 
true critical path does not exist throughout all 46 project schedules, 
program management reviews a high-level, manually constructed Critical 
Events schedule that tracks the status of major program milestones. 
These major program milestones are linked to lower-level schedules, 
and their status is updated daily and reviewed each week by program 
management. However, it is important that the lower-level schedules 
include complete logic that addresses the relationships between 
predecessor and successor activities, because any activity can become 
critical under some circumstances. Without clear insight into a 
critical path at the project level, management will not be able to 
monitor critical or near-critical detail activities that may have a 
detrimental impact on downstream activities if delayed. 

Best practice: 7. Identifying reasonable float; 
Explanation: The schedule should identify the float--the amount of 
time by which a predecessor activity can slip before the delay affects 
successor activities--so that a schedule's flexibility can be 
determined. As a general rule, activities along the critical path have 
the least float. Total float is the total amount of time by which an 
activity can be delayed without delaying the project's completion, if 
everything else goes according to plan; 
Criterion met: Partially; 
GAO analysis: We found that the schedule did not have a reasonable 
amount of float because 78 percent of remaining activities have zero 
days of total float. In other words, according to the schedule, 3,448 
remaining activities cannot slip one day without delaying the finish 
date of the project by one day. The program lead scheduler stated that 
total float is monitored by management at the higher critical events 
schedules, not lower-level project schedules. However, incorrect 
float estimates in lower-level schedules will result in an invalid 
critical path and an inability to allocate resources from non-critical 
activities to activities that cannot slip without affecting the 
project finish date. (A brief sketch of how float and the critical 
path are calculated follows this table.) 

Best practice: 8. Conducting a schedule risk analysis; 
Explanation: A schedule risk analysis should be performed using 
statistical techniques to predict the level of confidence in meeting a 
project's completion date. This analysis focuses not only on critical 
path activities but also on activities near the critical path, since 
they can affect the project's status; 
Criterion met: Not met; 
GAO analysis: PMO officials stated that while the program reviews the 
schedule on a weekly basis and assesses risks to the program, it has 
not performed a schedule risk analysis. Best practices suggest that a 
schedule risk analysis can be used to determine a level of confidence 
in meeting the completion date or to determine whether proper reserves 
have been incorporated into the schedule. Such an analysis will 
calculate schedule reserve, which can be set aside for those 
activities identified as high risk. Without this reserve, the program 
faces the risk of delays to the scheduled completion date if any 
delays were to occur on critical path activities. 

Best practice: 9. Updating the schedule using logic and durations to 
determine the dates; 
Explanation: The schedule should be continuously updated using logic 
and durations to determine realistic start and completion dates for 
program activities. The schedule should be analyzed continuously for 
variances to determine when forecasted completion dates differ from 
planned dates. This analysis is especially important for those 
variations that impact activities identified as being in a project's 
critical path and can impact a scheduled completion date; 
Criterion met: Partially; 
GAO analysis: The status date for the version of the schedule we 
analyzed is January 1, 2010, a federal holiday. A status date denotes 
the date of the latest update to the schedule and thus defines the 
point in time at which completed work and remaining work are 
calculated. The PMO could not confirm that this date was correct. 
Assuming the status date is correct, we found several date anomalies 
within the schedule, suggesting that management may need to review how 
and when the schedule is updated. For example, we found 14 activities 
(less than 1 percent) that should have started but have no actual 
start date; 17 activities (less than 1 percent) that should have 
finished but have no actual finish date; and 155 activities (3 
percent) that occurred in the past according to the schedule but are 
missing both actual start dates and actual finish dates. In addition, 
we found 22 instances (less than 1 percent) of out-of-sequence logic-- 
that is, actual progress being recorded on activities that, according 
to schedule logic, should not have begun yet. This is a common 
occurrence in scheduling, as actual events often override planned 
logic. However, schedule logic should be updated to reflect changes as 
much as possible. 

Source: GAO analysis based on data provided by the ECSS PMO. 

[End of table] 
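Total float and the critical path both fall out of the forward and 
backward passes that scheduling software performs over the network 
logic, which is why the missing and incomplete logic described above 
invalidates them together. The following is a minimal sketch with 
hypothetical durations and finish-to-start links only; it is not RICE 
data. 

  # Hypothetical network; durations in working days.
  durations = {"A": 10, "B": 20, "C": 15, "D": 5}
  preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
  order = ["A", "B", "C", "D"]  # topological order
  succs = {t: [s for s in order if t in preds[s]] for t in order}

  # Forward pass: earliest start (ES) and earliest finish (EF).
  es, ef = {}, {}
  for t in order:
      es[t] = max((ef[p] for p in preds[t]), default=0)
      ef[t] = es[t] + durations[t]

  # Backward pass: latest finish (LF) and latest start (LS) that still
  # preserve the project finish date.
  project_finish = max(ef.values())
  lf, ls = {}, {}
  for t in reversed(order):
      lf[t] = min((ls[s] for s in succs[t]), default=project_finish)
      ls[t] = lf[t] - durations[t]

  for t in order:
      total_float = ls[t] - es[t]  # days the task can slip harmlessly
      flag = "  (critical)" if total_float == 0 else ""
      print(f"{t}: ES={es[t]} EF={ef[t]} total float={total_float}{flag}")

Here A, B, and D carry zero float and form the critical path, while C 
can slip 5 days. Deleting C's successor link would leave C with 
artificially large float, and a level-of-effort task spanning the 
whole project would itself report zero float and masquerade as 
critical--the two distortions noted in the analyses above. 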

Table 14: Analysis of the Army's GFEBS Program Schedule: 

Best practice: 1. Capturing all activities; 
Explanation: The schedule should reflect all activities as defined in 
the project's work breakdown structure, which defines in detail the 
work necessary to accomplish a project's objectives, including 
activities to be performed by both the owner and contractors; 
Initial result: Substantially; 
Final result: Substantially; 
GAO analysis: Initial Analysis: Our analysis found that while the Wave 
4 deployment schedule captures both contractor and government 
activities, the program schedule is not fully integrated because 
individual deployment schedules for software releases are not related 
to activities within other program schedules. PMO officials stated 
that the while release and maintenance activities are integrated 
together in one schedule, and each deployment wave has its own 
schedule, the schedules are not linked to each other because the 
activities within each schedule are not related. However, a fully 
integrated master schedule would link government and contractor 
development, deployment, and subsequent maintenance activities. 
Activities in the program schedule are mapped to the program's 
integrated master plan, and deliverables in the Wave 4 schedule are 
mapped to the program's Quality Assurance Surveillance Plan through 
unique identification numbers. A large portion of the Wave 4 
deployment schedule is made up of receiver milestones; that is, 
products the program needs to receive from external field sites before 
certain activities can be conducted. In addition to including 
government and contractor activities, the schedule also includes tasks 
representing work being performed by external organizations; 
Updated analysis: No change to initial assessment. 

Best practice: 2. Sequencing all activities; 
Explanation: The schedule should be planned so that critical project 
dates can be met. To meet this objective, activities need to be 
logically sequenced--that is, listed in the order in which they are to 
be carried out. In particular, activities that must be completed 
before other activities can begin (predecessor activities), as well as 
activities that cannot begin until other activities are completed 
(successor activities), should be identified. This helps ensure that 
interdependencies among activities that collectively lead to the 
accomplishment of events or milestones can be established and used as 
a basis for guiding work and measuring progress; 
Initial result: Minimally; 
Final result: Partially; 
GAO analysis: Initial analysis: Our analysis found 18 activities of 
2,150 remaining (less than 1 percent) within the schedule that have no 
successor links, and three activities (less than 1 percent) that have 
neither successor nor predecessor links. Activities without successor 
links do not affect any other future activity. That is, they can 
continue until the end of the project without affecting the finish 
date of the project. The schedule includes 24 (1 percent) Must Start 
On (MSO) constraints. An MSO constraint is considered a "hard" date 
constraint because it prevents the activity from starting earlier or 
later than its planned date. This renders the schedule rigid and 
prevents the schedule from being dynamic. An MSO constraint is 
artificial and makes the scheduled activity appear to be on track to 
finish on time when it may not be. PMO schedulers told us that of the 
24 MSO-constrained tasks, 15 (less than 1 percent of all remaining) 
are associated with executive briefings that are now out of scope and 
should be removed from the schedule. Of the remaining 9 MSO-
constrained tasks, 8 are used to force successor activities to start 
on exactly the first days of calendar months. While these constraints 
may make scheduling activities simpler, they have an adverse effect on 
the project's critical path. An activity with an MSO constraint 
automatically becomes critical within scheduling software regardless 
of whether it actually should be critical. A final MSO constraint is 
attached to the "Go Live" milestone, which prevents the project finish 
milestone from shifting because of completed or remaining effort on 
predecessor activities. PMO officials acknowledged that the MSO 
constraint should not have been applied to the finish milestone and 
stated that it would be removed in the next update of the schedule. 
Our analysis also found that 50 summary tasks (12 percent of remaining 
summary tasks) have predecessor links. PMO schedulers told us that 
these summary links are used in lieu of linking predecessors to the 
numerous lower-level tasks. Because many of the lower-level tasks 
begin on the same date, this makes updating the schedule simpler: an 
updated start date for the summary task will force that same date on 
all the unlinked lower-level tasks. While this indeed makes updating 
easier, this technique is not considered a best practice. First, 
summary tasks do not represent work and are simply used as grouping 
elements. As such, they should take their start and finish dates from 
lower-level activities; they should not dictate the start or finish of 
lower-level activities. Second, linking summary tasks obfuscates the 
logic of the schedule. That is, tracing logic through summary links 
does not impart to management the sequence in which lower-level 
activities should be carried out. Our analysis found that 358 
activities (17 percent) are scheduled to occur on a Sunday. This is a 
consequence of a summary task linked to a milestone constrained to 
start on the first day of a calendar month; because that day fell on 
a Sunday, the summary link in turn causes a multitude of lower-level 
activities to also begin on a Sunday. PMO schedulers acknowledged that 
this was an error and the activities would be shifted to begin on a 
work day. There are 67 remaining activities (3 percent) that are 
linked to their successor activities with lags, and 38 (2 percent) that are 
linked with negative lags (or "leads"). Lags represent the passing of 
time between activities but are often misused to put activities on a 
specific date or to insert a buffer for risk. Lags should be justified 
because they cannot have risk or uncertainty. Without logically 
sequenced activities, the schedule cannot be used as a reliable basis 
for guiding work and measuring progress; 
Updated analysis: There are still 18 tasks within the schedule without 
successors (or less than 1 percent of remaining activities in the 
updated schedule). While these activities have finish dates in 
December 2009 and February 2010, they do not have actual finish dates 
and we therefore cannot determine if these activities are completed or 
have the potential to cause future activities to slip. The updated 
schedule now contains 7 MSO constraints (less than 1 percent): the 15 
constraints associated with executive meetings have been removed; 
the MSO constraint on the "Go Live" milestone has been removed; 
2 MSO constraints marking the beginning of months have occurred; 
and 1 new constraint has been added to mark the beginning of the month 
following deployment. The 358 activities unintentionally scheduled to 
begin on a Sunday have been altered by a 1-day lag to begin on a 
proper workday. However, lags should not be used in lieu of logic to 
force activities to start on a specified date. Additionally, the 
updated schedule corrects minor missing predecessor logic issues. 

Best practice: 3. Assigning resources to all activities; 
Explanation: The schedule should reflect what resources (e.g., labor, 
materials, and overhead) are needed to do the work, whether all 
required resources will be available when needed, and whether any 
funding or time constraints exist; 
Initial result: Not met; 
Final result: Not met; 
GAO analysis: Initial analysis: GFEBS officials stated that because of 
the current FFP contractual arrangement, the government does not have 
insight into the contractor's efforts to assign resources to 
activities. They stated that while they are aware that activities in 
previous schedule releases were assigned resources by the contractor, 
the current schedule is not resource loaded. Resource information 
would assist the program office in forecasting the likelihood of 
activities being completed based on their projected end dates. If the 
current schedule does not allow for insight into current or projected 
over-allocation of resources, then the risk of the program slipping is 
significantly increased; 
Updated analysis: No change to initial assessment. 

Best practice: 4. Establishing the duration of all activities; 
Explanation: The schedule should realistically reflect how long each 
activity will take to execute. In determining the duration of each 
activity, the same rationale, historical data, and assumptions used 
for cost estimating should be used. Durations should be as short as 
possible and have specific start and end dates. The schedule should be 
continually monitored to determine when forecasted completion dates 
differ from planned dates; 
this information can be used to determine whether schedule variances 
will affect subsequent work; 
Initial result: Fully met; 
Final result: Fully met; 
GAO analysis: Initial analysis: Seventy-two percent of remaining 
activities meet best practices for duration, being less than 44 days 
(or 2 working months). Activities with excessive durations (more than 
100 days) represent effort being performed by organizations outside of 
the program office. Representing effort in the schedule that is 
performed by outside organizations is considered a best practice 
because it keeps management informed of ongoing work that might easily 
be forgotten until the deliverable is due, and of the impact on future 
activities if the deliverable is behind schedule; 
Updated analysis: No change to initial assessment. 

Best practice: 5. Integrating schedule activities horizontally and 
vertically; 
Explanation: The schedule should be horizontally integrated, meaning 
that it should link products and outcomes associated with other 
sequenced activities. These links are commonly referred to as 
"handoffs" and serve to verify that activities are arranged in the 
right order to achieve aggregated products or outcomes. The schedule 
should also be vertically integrated, meaning that the dates for 
starting and completing activities in the integrated master schedule 
should be aligned with the dates for supporting tasks and subtasks. 
Such mapping or alignment among levels enables different groups to 
work to the same master schedule; 
Initial result: Minimally; 
Final result: Minimally; 
GAO analysis: Initial analysis: The GFEBS program schedule includes 
detailed information on release, deployment, and maintenance 
government and contractor activities. However, our analysis of the 
schedule concludes that vertical integration--that is, the ability to 
consistently trace work breakdown structure elements between detailed, 
intermediate, and master schedules--is not fully demonstrated because 
none of the activities within the deployment schedules are related to 
activities associated with release, maintenance, or other wave 
schedules. PMO officials stated that while the release and maintenance 
activities are integrated together in one schedule, and each 
deployment wave has its own schedule, the schedules are not linked to 
each other because the activities within each schedule are not 
related. However, it is unlikely that deployment activities are 
unrelated to release or maintenance activities. Without vertically 
integrating the schedules, lower-level schedules cannot be clearly 
traced to upper-tiered milestones. Issues with reliance on hard date 
constraints, lags, and instances of convoluted logic such as linked 
summary tasks, keep the schedule from fully complying with the 
requirement of horizontal integration--that is, the overall ability of 
the schedule to clearly depict relationships between different program 
elements and product handoffs. Horizontal integration demonstrates 
that the overall schedule is rational, planned in a logical sequence, 
accounts for interdependencies between work and planning packages, and 
provides a way to evaluate current status; 
Updated analysis: No change to initial assessment. 

Best practice: 6. Establishing the critical path for all activities; 
Explanation: Scheduling software should be used to identify the 
critical path, which represents the chain of dependent activities with 
the longest total duration. Establishing a project's critical path is 
necessary to examine the effects of any activity slipping along this 
path. Potential problems along or near the critical path should also 
be identified and reflected in scheduling the duration of high-risk 
activities; 
Initial result: Minimally; 
Final result: Partially; 
GAO analysis: Initial analysis: Our analysis could not determine a 
valid critical path--the longest duration path through the sequenced 
list of activities--because the "Go Live" finish milestone is 
constrained with an MSO constraint. An MSO constraint is considered a 
"hard" date constraint because it prevents the activity from starting 
earlier or later than its planned date. This renders the schedule 
rigid and prevents the schedule from being dynamic. An MSO constraint 
is artificial and makes the scheduled activity appear to be on track 
to finish on time when it may not be. When the constraint is removed, 
the "Go Live" milestone slips two months from its constrained date of 
January 3, 2011, to March 7, 2011. In addition, our analysis found that 
without the MSO constraint, the nearest driving activity to the "Go 
Live" milestone (that is, the activity determining the date of the "Go 
Live" activity) is in October 2010. In other words, according to the 
schedule, no activity starting in November or December 2010 is 
critical to determining the "Go Live" date. Without clear insight into 
a critical path at the project level, management will not be able to 
monitor critical or near-critical detail activities that may have a 
detrimental impact on downstream activities if delayed; 
Updated analysis: The updated schedule has altered the predecessors on 
the "Go Live" finish milestone and the MSO constraint has been 
removed. The "Go Live" date is now scheduled to occur in February 
2011. However, the critical path, as measured by the path with the 
lowest available float, shows only five activities from the "Go Live" 
date in February 2011 to the "Site Visit Activities Complete" 
milestone completed in March 2010. Because so few activities are on 
the current critical path, no activities scheduled within 1 or 2 
months of deployment are currently driving the project finish date. In 
addition, the earliest critical activity on the path appears to be a 
functional survey scheduled for April 1, 2010, that has yet to 
actually start. 

Best practice: 7. Identifying reasonable float; 
Explanation: The schedule should identify the float--the amount of 
time by which a predecessor activity can slip before the delay affects 
successor activities--so that a schedule's flexibility can be 
determined. As a general rule, activities along the critical path have 
the least float. Total float is the total amount of time by which an 
activity can be delayed without delaying the project's completion, if 
everything else goes according to plan; 
Initial result: Minimally; 
Final result: Minimally; 
GAO analysis: Initial analysis: We found that the Wave 4 Deployment 
schedule displays unrealistic total float values. For example, 1,273 
activities (59 percent) within the schedule are showing negative 
float. That is, these activities are 1 to 242 days behind schedule. 
Other tasks display an unrealistic amount of positive float: 49 tasks 
(2 percent) are showing 100 to more than 300 days of total float. In 
other words, according to the schedule, 49 remaining activities could 
be delayed by more than 4 working months and not delay the final 
activity in the Wave 4 schedule. As a general rule, activities along 
the critical path have the least amount of float. Activities with 
large float values may indicate some lack of completeness in the 
schedule logic. Incorrect float estimates will result in an invalid 
critical path and an inability to allocate resources from noncritical 
activities to activities that cannot slip without affecting the 
project finish date; 
Updated analysis: The updated schedule continues to reflect 
unrealistic float. For example, 172 remaining activities (8 percent) 
have from 90 to 252 days of negative float, while 25 remaining 
activities (1 percent) have 104 to 310 days of float. 

Best practice: 8. Conducting a schedule risk analysis; 
Explanation: A schedule risk analysis should be performed using 
statistical techniques to predict the level of confidence in meeting a 
project's completion date. This analysis focuses not only on critical 
path activities but also on activities near the critical path, since 
they can affect the project's status; 
Initial result: Not met; 
Final result: Not met; 
GAO analysis: Initial analysis: The PMO has not performed a schedule 
risk analysis. GFEBS officials stated that while schedule risks have 
been discussed in team meetings, the PMO has not performed a formal 
schedule risk analysis. However, officials stated that they are open 
to improving in the area of schedule risk analysis. Best practices 
suggest that a schedule risk analysis can be used to determine a level 
of confidence in meeting the completion date or to determine whether 
proper reserves have been incorporated into the schedule. Such an 
analysis will calculate schedule reserve, which can be set aside for 
those activities identified as high-risk. Without this reserve, the 
program faces the risk of delays to the scheduled completion date if 
any delays were to occur on critical path activities; 
Updated analysis: No change to initial assessment. 

Best practice: 9. Updating the schedule using logic and durations to 
determine the dates; 
Explanation: The schedule should be continuously updated using logic 
and durations to determine realistic start and completion dates for 
program activities. The schedule should be analyzed continuously for 
variances to determine when forecasted completion dates differ from 
planned dates. This analysis is especially important for those 
variations that impact activities identified as being in a project's 
critical path and can impact a scheduled completion date; 
Initial result: Minimally; 
Final result: Partially; 
GAO analysis: Initial analysis: The status date for the version of the 
Wave 4 schedule we analyzed is May 3, 2010. A status date denotes the 
date of the latest update to the schedule and thus defines the point 
in time at which completed work and remaining work are calculated. As 
of this date, we found a relatively large number of date anomalies 
within the schedule, suggesting that management may need to review how 
and when the schedule is updated. For example, we found 247 activities 
(11 percent) that should have started but have no actual start date 
and 200 activities (9 percent) that should have finished but have no 
actual finish date. Moreover, we found 7 activities (less than 1 
percent) that have actual finish dates in the future. Schedule logic 
should be updated to reflect actual progress so that management is 
aware of the latest plan and the impacts to the project if activity 
planned dates are not met; 
Updated analysis: The updated schedule no longer includes activities 
that have actual finish dates beyond the status date. However, the 
schedule contains 44 activities (2 percent) that should have started 
but have no actual start date; 22 activities (1 percent) that should 
have finished but have no actual finish date; and 109 activities (5 
percent) that should have started and finished but have neither an 
actual start nor actual finish date. 

Source: GAO analysis based on data provided by the GFEBS PMO. 

Note: The initial analysis reflects our assessment of the schedule 
originally submitted by the GFEBS PMO for our review. In response to 
limitations that we identified and shared with the GFEBS PMO, the 
program office made several formal changes to its existing 
schedule. The updated analysis reflects our review of the revised 
schedule. 

[End of table] 
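
The total float values discussed in best practices 6 and 7 come from 
the forward and backward passes of the critical path method. As a 
minimal illustration--this is not the GFEBS PMO's scheduling tool or 
GAO's analysis software, and the activity names, durations, and 
dependencies below are invented--the following Python sketch computes 
total float for a small activity network: 

activities = {
    # name: (duration in days, [predecessors]); hypothetical data
    "survey": (10, []),
    "build": (20, ["survey"]),
    "test": (15, ["survey"]),
    "deploy": (5, ["build", "test"]),
}

# Forward pass: earliest start (ES) and earliest finish (EF).
# This simple loop assumes predecessors are listed before their
# successors; a real tool would sort the network topologically.
early = {}
for name, (dur, preds) in activities.items():
    es = max((early[p][1] for p in preds), default=0)
    early[name] = (es, es + dur)

project_finish = max(ef for _, ef in early.values())

# Backward pass: latest start (LS) and latest finish (LF).
late = {}
for name in reversed(list(activities)):
    dur, _ = activities[name]
    succs = [s for s, (_, p) in activities.items() if name in p]
    lf = min((late[s][0] for s in succs), default=project_finish)
    late[name] = (lf - dur, lf)

# Total float = LS - ES. Zero float marks the critical path.
for name in activities:
    print(name, "total float =", late[name][0] - early[name][0])

In this sketch, activities with zero total float form the critical 
path. Negative float of the kind found in the Wave 4 schedule can 
arise only when a deadline or date constraint forces an activity's 
late dates earlier than its early dates, which is why widespread 
negative float signals a schedule that is behind its own constraints. 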

Table 15: Analysis of the Army's GCSS-Army Program Schedule: 

Best practice: 1. Capturing all activities; 
Explanation: The schedule should reflect all activities as defined in 
the project's work breakdown structure, which defines in detail the 
work necessary to accomplish a project's objectives, including 
activities to be performed by both the owner and contractors; 
Criterion met: Partially; 
GAO analysis: We found that the GCSS-Army program schedule is not 
fully integrated. While the program schedule contains detailed 
contractor activities, it only contains some major government 
milestones. Other government activities, such as testing events and 
future milestones beyond December 2010, are displayed in isolated, 
high-level illustrated documents rather than in dynamic scheduling 
documents. We also found that contractor activities within the program 
schedule are assigned contractor work package numbers and can be 
traced to individual control account plans and contractor work 
breakdown structure elements. Activities are also assigned integrated 
product teams and individual control account managers. However, 
without fully integrating government activities with contractor 
activities, DOD cannot guarantee either that the schedule has 
adequately captured all key activities necessary for the program's 
completion or that it can reliably estimate the finish date for the 
program. 

Best practice: 2. Sequencing all activities; 
Explanation: The schedule should be planned so that critical project 
dates can be met. To meet this objective, activities need to be 
logically sequenced--that is, listed in the order in which they are to 
be carried out. In particular, activities that must be completed 
before other activities can begin (predecessor activities), as well as 
activities that cannot begin until other activities are completed 
(successor activities), should be identified. This helps ensure that 
interdependencies among activities that collectively lead to the 
accomplishment of events or milestones can be established and used as 
a basis for guiding work and measuring progress; 
Criterion met: Partially; 
GAO analysis: We found that only 2 of 2,255 activities (less than 1 
percent) are missing dependencies and 19 activities (less than 1 
percent) have "dangling" logic--that is, activities whose start or 
finish dates are missing logic. These activities with dangling logic 
have no successor from their finish date, meaning they can carry on 
indefinitely without affecting the start date of any other activity. 
While dependencies within the schedule are generally sound, 60 percent 
of the activities (1,360) have Start No Earlier Than constraints. 
Start No Earlier Than constraints are considered "soft" date 
constraints because they allow an activity to slip into the future if 
its predecessor activity is delayed, but the activity cannot begin 
earlier than its constraint date. Program officials stated that Start 
No Earlier Than constraints are used to manually allocate resources 
and to coordinate data tests, which rely on coordination with outside 
partners. Officials further stated that individual control account 
managers monitor these constraints. However, we found that 87 percent 
of the constraints were actively affecting the start date of their 
activities. That is, without the constraint, the activity may be able 
to start sooner. If these activities cannot start earlier, then their 
dates and dependencies should be updated to reflect reality. 
Constraining over half of all activities to start on or after specific 
dates defeats the purpose of a dynamic scheduling tool and greatly 
reduces the ability of the program to take advantage of possible time 
savings. We also found 143 Finish No Earlier Than constraints (6 
percent). These are also considered "soft" date constraints because 
they prevent activities from finishing earlier than their constraint 
date. Program officials stated that these were erroneously created in 
the schedule during an internal file conversion process and would be 
removed in the next version of the program schedule. Without logically 
sequenced activities, the schedule cannot be used as a reliable basis 
for guiding work and measuring progress. 

Best practice: 3. Assigning resources to all activities; 
Explanation: The schedule should reflect what resources (e.g., labor, 
materials, and overhead) are needed to do the work, whether all 
required resources will be available when needed, and whether any 
funding or time constraints exist; 
Criterion met: Substantially; 
GAO analysis: While the integrated master schedule is not resource 
loaded, scheduled activities can be traced to control account plans 
which have resources laid out by month by labor category. Budgets are 
assigned at the control account level and resources are accounted for 
in monthly updates to the program's earned value management system. 

Best practice: 4. Establishing the duration of all activities; 
Explanation: The schedule should realistically reflect how long each 
activity will take to execute. In determining the duration of each 
activity, the same rationale, historical data, and assumptions used 
for cost estimating should be used. Durations should be as short as 
possible and have specific start and end dates. The schedule should be 
continually monitored to determine when forecasted completion dates 
differ from planned dates; this information can be used to determine 
whether schedule variances will affect subsequent work; 
Criterion met: Fully met; 
GAO analysis: Ninety-eight percent of remaining activities meet the 
best practice for activity duration, with durations of less than 44 
days. Only two remaining activities have durations that exceed the 
best practice, extending beyond 80 days. 

Best practice: 5. Integrating schedule activities horizontally and 
vertically; 
Explanation: The schedule should be horizontally integrated, meaning 
that it should link products and outcomes associated with other 
sequenced activities. These links are commonly referred to as 
"handoffs" and serve to verify that activities are arranged in the 
right order to achieve aggregated products or outcomes. The schedule 
should also be vertically integrated, meaning that the dates for 
starting and completing activities in the integrated master schedule 
should be aligned with the dates for supporting tasks and subtasks. 
Such mapping or alignment among levels enables different groups to 
work to the same master schedule; 
Criterion met: Partially; 
GAO analysis: The schedule is vertically integrated, with low-level 
tasks and milestones being traceable to higher-level summary tasks. 
While the schedule has a relatively small number of missing 
dependencies and activities with dangling logic, the use of date 
constraints on more than 60 percent of remaining activities prevents 
the schedule from being completely horizontally integrated. That is, 
the date constraints limit the overall ability of the schedule to 
depict dynamic relationships between different program elements and 
product handoffs. Horizontal integration demonstrates that the overall 
schedule is rational, planned in a logical sequence, accounts for 
interdependencies between work and planning packages, and provides a 
way to evaluate current status. 

Best practice: 6. Establishing the critical path for all activities; 
Explanation: Scheduling software should be used to identify the 
critical path, which represents the chain of dependent activities with 
the longest total duration. Establishing a project's critical path is 
necessary to examine the effects of any activity slipping along this 
path. Potential problems along or near the critical path should also 
be identified and reflected in scheduling the duration of high-risk 
activities; 
Criterion met: Partially; 
GAO analysis: We found that a reliable and realistic critical path 
could not be determined within the program schedule, and program 
officials agreed with our assessment. Program officials stated that 
the schedule is constructed to increase the visibility of each 
software object's development, and as a consequence of this amount of 
detail, a critical path cannot be shown. The schedule displays the 
detailed development life cycle for hundreds of objects; depending on 
the order in which objects are completed, the dependencies between 
objects, and the constant reallocation of resources, a traditional 
critical path may be too volatile to be useful. However, officials 
stated that a higher-level summary type schedule, which would display 
a valid critical path, would not allow management the proper insight 
into the risks underlying the development of each object. In lieu of a 
traditional critical path, program management monitors object 
development weekly and program officials stated that they are fully 
aware of which activities are behind or ahead of schedule. It is 
commendable that the schedule includes the necessary amount of 
complexity and detail to track lower-level, high-risk development 
activities. However, our analysis found that a critical path could not 
be derived because of artificial date constraints rather than complex 
object development detail. Program officials stated that three major 
milestones are being tracked: Critical Design Review, DTOE 1.1, and 
Build/Design Phase Completion. We found that critical paths do not 
exist for any of these milestones because of artificial date 
constraints on activities unrelated to detailed object development. As 
a result, we cannot determine a critical path to any major milestone 
based on actual effort related to object development. In this respect, 
the schedule cannot reliably forecast completion dates for Critical 
Design Review, DTOE, Build/Design Completion, or, as a consequence, 
Milestone C. 

Best practice: 7. Identifying reasonable float; 
Explanation: The schedule should identify the float--the amount of 
time by which a predecessor activity can slip before the delay affects 
successor activities--so that a schedule's flexibility can be 
determined. As a general rule, activities along the critical path have 
the least float. Total float is the total amount of time by which an 
activity can be delayed without delaying the project's completion, if 
everything else goes according to plan; 
Criterion met: Substantially; 
GAO analysis: The majority of remaining tasks in the GCSS-Army 
contractor schedule appear to have reasonable total float values, 
varying from 0 to 30 days. Program office officials stated that they 
believe the schedule portrays accurate float. However, our analysis 
found 338 remaining activities (15 percent) with over 100 days of 
total float. In other words, according to the schedule, 338 remaining 
activities could be delayed by 4 months and not delay the final 
project date. Activities with large float values may indicate a lack 
of completeness in the schedule logic. 

Best practice: 8. Conducting a schedule risk analysis; 
Explanation: A schedule risk analysis should be performed using 
statistical techniques to predict the level of confidence in meeting a 
project's completion date. This analysis focuses not only on critical 
path activities but also on activities near the critical path, since 
they can affect the project's status; 
Criterion met: Minimally; 
GAO analysis: Program office officials stated that a schedule risk 
analysis is not routinely performed, and that there is currently no 
requirement for the contractor to do so. While no detailed risk 
analysis has been performed on the schedule, the contractor recently 
conducted a high-level Monte Carlo risk analysis on two major 
milestones for an integrated master schedule management review 
meeting. This high-level risk analysis shows the probability of 
completing the key milestones on time and identifies mitigating 
actions to prevent delays. Program officials stated they are 
interested in periodic risk analysis and intend to include a schedule 
risk analysis requirement in the contract within the next few months. 
A schedule risk analysis can be used to determine a level of 
confidence in meeting the completion date or whether proper reserves 
have been incorporated into the schedule. Such an analysis will 
calculate schedule reserve, which can be set aside for those 
activities identified as high risk. Without this reserve, the 
scheduled completion date is at risk if delays occur on critical path 
activities. For illustration, a simplified Monte Carlo schedule risk 
analysis is sketched after this table. 

Best practice: 9. Updating the schedule using logic and durations to 
determine the dates; 
Explanation: The schedule should be continuously updated using logic 
and durations to determine realistic start and completion dates for 
program activities. The schedule should be analyzed continuously for 
variances to determine when forecasted completion dates differ from 
planned dates. This analysis is especially important for those 
variations that impact activities identified as being in a project's 
critical path and can impact a scheduled completion date; 
Criterion met: Substantially; 
GAO analysis: We found no instances of illogical dates, such as actual 
start or actual finish dates in the future. We found 112 instances (5 
percent) of out-of-sequence logic; that is, actual progress recorded 
on activities that, according to schedule logic, should not have 
started yet. This is a common occurrence in scheduling, as actual 
events often override planned logic. However, a large number of out-of-
sequence activities may indicate that the schedule is not being 
thoroughly updated to reflect reality on a periodic basis. 

Source: GAO analysis based on data provided by the GCSS-Army PMO. 

[End of table] 
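
As noted under best practice 8, the contractor ran a high-level Monte 
Carlo analysis on two major milestones. The following minimal Python 
sketch illustrates that technique for a simple chain of sequential 
activities; it is not the contractor's model, and the three-point 
(low, most likely, high) duration ranges and the planned duration are 
invented: 

import random

# Hypothetical three-point duration estimates, in days.
activities = [
    ("design", (20, 30, 50)),   # (low, likely, high)
    ("build", (40, 60, 90)),
    ("test", (15, 25, 45)),
]

TRIALS = 10000
totals = []
for _ in range(TRIALS):
    # random.triangular takes (low, high, mode).
    totals.append(sum(random.triangular(low, high, likely)
                      for _, (low, likely, high) in activities))
totals.sort()

planned = 120  # hypothetical planned duration in days
confidence = sum(t <= planned for t in totals) / TRIALS
print("Probability of finishing within", planned, "days:",
      round(100 * confidence), "percent")

# The 80th-percentile outcome shows the duration--and hence the
# schedule reserve--needed for 80 percent confidence.
print("Duration at 80 percent confidence:",
      round(totals[int(0.8 * TRIALS)]), "days")

The spread of simulated outcomes, rather than a single deterministic 
date, is what conveys a level of confidence in the schedule and 
indicates how much schedule reserve would be needed. 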

[End of section] 

Appendix V: Assessments of Four DOD ERP Program Cost Estimates: 

This appendix provides the results of our analysis of the extent to 
which the processes and methodologies used to develop and maintain the 
four ERP cost estimates meet the characteristics of high-quality cost 
estimates.[Footnote 72] The four characteristics of high-quality 
estimates are explained and mapped to the 12 steps of such estimates 
in table 16. 

Table 16: The Four Characteristics of a High-Quality Cost Estimate, 
Mapped to the 12 Steps of High-Quality Cost Estimating: 

Characteristic: Well-documented; 
Explanation: The documentation should address the purpose of the 
estimate, the program background and system description, its schedule, 
the scope of the estimate (in terms of time and what is and is not 
included), the ground rules and assumptions, all data sources, 
estimating methodology and rationale, the results of the risk 
analysis, and a conclusion about whether the cost estimate is 
reasonable. Therefore, a good cost estimate--while taking the form of 
a single number--is supported by detailed documentation that describes 
how it was derived and how the expected funding will be spent in order 
to achieve a given objective. For example, the documentation should 
capture in writing such things as the source data used and their 
significance, the calculations performed and their results, and the 
rationale for choosing a particular estimating method or reference. 
Moreover, this information should be captured in such a way that the 
data used to derive the estimate can be traced back to and verified 
against their sources, allowing for the estimate to be easily 
replicated and updated. Finally, the cost estimate should be reviewed 
and accepted by management to ensure that there is a high level of 
confidence in the estimating process and the estimate itself; 
Step: 
Step 1: Define the estimate's purpose, scope, and schedule; 
Step 3: Define the program characteristics; 
Step 5: Identify ground rules and assumptions; 
Step 6: Obtain the data; 
Step 10: Document the estimate; 
Step 11: Present the estimate to management for approval. 

Characteristic: Comprehensive; 
Explanation: The cost estimates should include both government and 
contractor costs of the program over its full life cycle, from 
inception of the program through design, development, deployment, and 
operation and maintenance to retirement of the program. They should 
also completely define the program, reflect the current schedule, and 
be technically reasonable. Comprehensive cost estimates should provide 
a level of detail appropriate to ensure that cost elements are neither 
omitted nor double counted, and they should document all cost-
influencing ground rules and assumptions. Establishing a product-
oriented work breakdown structure (WBS) is a best practice because it 
allows a program to track cost and schedule by defined deliverables, 
such as a hardware or software component; 
Step: 
Step 2: Develop the estimating plan; 
Step 4: Determine the estimating structure; 
Step 5: Identify ground rules and assumptions[A]. 

Characteristic: Accurate; 
Explanation: The cost estimates should provide for results that are 
unbiased, and they should not be overly conservative or optimistic. 
Estimates are accurate when they are based on an assessment of most 
likely costs, adjusted properly for inflation, and contain few, if 
any, minor mistakes. In addition, the estimates should be updated 
regularly to reflect material changes in the program, such as when 
schedules or other assumptions change, and actual costs so that the 
estimate is always reflecting current status. Among other things, the 
estimate should be grounded in documented assumptions and a historical 
record of cost estimating and actual experiences on other comparable 
programs; 
Step: 
Step 7: Develop the point estimate[B]; 
Step 12: Update the estimate to reflect actual costs and changes. 

Characteristic: Credible; 
Explanation: The cost estimates should discuss any limitations of the 
analysis because of uncertainty or biases surrounding data or 
assumptions. Major assumptions should be varied and other outcomes 
recomputed to determine how sensitive results are to changes in the 
assumptions. Risk and uncertainty analysis should be performed to 
determine the level of risk associated with the estimate. Further, the 
estimate's results should be cross-checked, and an independent cost 
estimate conducted by a group outside the acquiring organization 
should be developed to determine whether other estimating methods 
produce similar results. For management to make good decisions, the 
program estimate must reflect the degree of uncertainty, so that a 
level of confidence can be given about the estimate. Having a range of 
costs around a point estimate is more useful to decision makers 
because it conveys the level of confidence in achieving the most 
likely cost and also informs them about cost, schedule, and technical 
risks. For illustration, a simplified sensitivity analysis is sketched 
after this table; 
Step: 
Step 7: Compare the point estimate to an independent cost estimate[C]; 
Step 8: Conduct sensitivity analysis; 
Step 9: Conduct risk and uncertainty analysis. 

Source: GAO-09-3SP. 

[A] This step applies to two of the characteristics--well-documented 
and comprehensive. 

[B] A point estimate is a single cost estimate number representing the 
most likely cost. 

[C] This step applies to two of the characteristics--credible and 
accurate. 

[End of table] 
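
To make the sensitivity analysis called for under the credible 
characteristic concrete, the following minimal Python sketch varies a 
single assumption--a hypothetical number of system users--and 
recomputes a hypothetical life-cycle cost. The cost model, parameters, 
and figures are invented for illustration and are not drawn from any 
of the four programs' estimates: 

def life_cycle_cost(users, cost_per_license=2500.0, fixed_cost=150e6):
    # Hypothetical model: fixed development cost plus per-user licensing.
    return fixed_cost + users * cost_per_license

baseline_users = 30000
baseline = life_cycle_cost(baseline_users)

# Vary the assumption by -25 to +25 percent and report the cost swing.
for change in (-0.25, -0.10, 0.0, 0.10, 0.25):
    users = int(baseline_users * (1 + change))
    cost = life_cycle_cost(users)
    delta = (cost - baseline) / baseline
    print("users:", users, " cost: $%.1fM" % (cost / 1e6),
          " change vs. baseline: %+.1f%%" % (100 * delta))

Repeating this for each major assumption identifies the estimate's 
cost drivers; the DEAMS PMO's analysis of the total number of users, 
described in table 17, applied this technique to a real parameter. 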

Tables 17, 18, 19, and 20 provide the detailed results of our analysis 
of the program cost estimates for DEAMS, ECSS, GFEBS, and GCSS-Army. 
"Not met" means the program provided no evidence that satisfies the 
criterion. "Minimally" means the program provided evidence that 
satisfies a small portion of the criterion. "Partially" means the 
program provided evidence that satisfies about half of the criterion. 
"Substantially" means the program provided evidence that satisfies a 
large portion of the criterion. "Fully met" means the program provided 
evidence that completely satisfies the criterion. 

Table 17: Analysis of the Air Force's DEAMS Cost Estimate: 

Four characteristics of high-quality cost estimates: Well-documented; 
Criterion met: Substantially; 
Key examples of rationale for assessment: The purpose, scope, and 
schedule of the estimate were clearly defined. Further, the estimate 
identified all the ground rules and assumptions as well as the 
estimating methodology. The PMO presented evidence of receiving 
approval of the estimate through briefings to management. The sources 
of data the estimate was based on were also documented. However, we 
found inconsistencies when comparing the program requirements found in 
the Cost Analysis Requirements Document (CARD) with the requirements 
contained in the cost estimate. For example, commercial-off-the-shelf 
(COTS) software licenses requirements outlined in the CARD do not 
match the assumptions used in the cost estimate. 

Four characteristics of high-quality cost estimates: Comprehensive; 
Criterion met: Fully met; 
Key examples of rationale for assessment: The program provided 
supporting documentation that showed the ground rules and assumptions. 
The estimate is based on a cost element structure as stated in the 
Department of Defense Automated Information System Economic Analysis 
Guide. The program also provided an estimating plan that included the 
cost estimating schedule. 

Four characteristics of high-quality cost estimates: Accurate; 
Criterion met: Fully met; 
Key examples of rationale for assessment: The DEAMS cost model details 
the calculations and inflation indexes underlying the estimated costs. 
Calculations within the model can be traced back to supporting 
documentation. In addition, the cost model is updated annually to 
incorporate actual costs expended in prior fiscal years. For 
illustration, a simplified inflation-adjustment calculation is 
sketched after this table. 

Four characteristics of high-quality cost estimates: Credible; 
Criterion met: Fully met; 
Key examples of rationale for assessment: An independent cost estimate 
was developed by the Air Force Cost Analysis Agency. The PMO and Air 
Force Cost Analysis Agency also conducted analyses to identify the 
cost elements with the greatest degree of uncertainty and to determine 
the cost drivers for the program, and they performed analyses to 
determine the impact of changing major ground rules and assumptions. 
For 
example, during the reconciliation process, there was debate as to the 
best estimate for the total number of DEAMS users. The PMO performed a 
sensitivity analysis on this parameter to illustrate the total life 
cycle cost impact of changing this assumption. The PMO submitted 
several supporting documents that detail the risk and uncertainty 
analysis performed on the cost estimates. In addition to the risk and 
uncertainty analysis, the PMO implemented a risk management process at 
the inception of the program that is planned to continue throughout 
the program's life. 

Source: GAO analysis based on data provided by the DEAMS PMO. 

[End of table] 
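
The inflation indexes cited under the accurate characteristic can be 
illustrated with a short sketch. The following Python fragment 
converts hypothetical then-year costs into base-year dollars using 
invented annual inflation rates; it is not DOD's cost model, and the 
rates are not the department's published indexes: 

BASE_YEAR = 2010
# Hypothetical annual inflation rates for years after the base year.
RATES = {2011: 0.021, 2012: 0.020, 2013: 0.019}

def index(year):
    # Cumulative inflation index from the base year to the given year.
    factor = 1.0
    for y in range(BASE_YEAR + 1, year + 1):
        factor *= 1.0 + RATES[y]
    return factor

then_year_costs = {2011: 120e6, 2012: 140e6, 2013: 90e6}  # hypothetical
for year, cost in sorted(then_year_costs.items()):
    print(year, "in base-year dollars: $%.1fM" % (cost / index(year) / 1e6))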

Table 18: Analysis of the Air Force's ECSS Cost Estimate: 

Four characteristics of high-quality cost estimates: Well-documented; 
Criterion met: Substantially; 
Key examples of rationale for assessment: The purpose, scope, and 
schedule of the estimate were clearly defined. The PMO presented 
evidence of receiving approval of the estimate through briefings to 
management. The data sources were also documented. The PMO also 
provided ample descriptions of the methodology used to derive the 
estimates. However, our analysis found inconsistencies between 
requirements found in the CARD and assumptions used to calculate the 
estimate. For example, personnel requirements and the number of 
reports, interfaces, conversions, and extensions were different 
between the two documents. 

Four characteristics of high-quality cost estimates: Comprehensive; 
Criterion met: Fully met; 
Key examples of rationale for assessment: The program provided 
supporting documentation that showed the ground rules and assumptions. 
The estimate is prepared in accordance with the Office of the 
Secretary of Defense ERP work breakdown structure as stated in draft 
DOD guidance.[A] The program also provided an estimating plan that 
included the cost estimating schedule. 

Four characteristics of high-quality cost estimates: Accurate; 
Criterion met: Substantially; 
Key examples of rationale for assessment: The ECSS cost model details 
the calculations and inflation indexes underlying the estimated costs. 
Calculations within the model can be traced back to supporting 
documentation. However, our analysis found minor inconsistencies when 
cross-checking costs that were presented to management and the 
underlying calculations within the model. For example, estimates for 
data migration, data cleansing, and help desk within the cost model do 
not match the cost estimates presented to management. ECSS PMO 
officials stated they cannot compare actual costs to the cost estimate 
because they do not yet have an approved baseline. However, these 
officials stated the program has a Baseline Change Board that holds 
monthly Resource Board meetings during which officials review the 
program baseline to assess the potential impacts of proposed changes 
to all aspects of the program's life cycle. 

Four characteristics of high-quality cost estimates: Credible; 
Criterion met: Partially; 
Key examples of rationale for assessment: An independent cost estimate 
was created by the Air Force Cost Analysis Agency. ECSS PMO officials 
stated that the cost estimate was adjusted based on sensitivity 
analyses. However, the cost estimate model does not include evidence 
of a sensitivity analysis. Because the Air Force did not conduct a 
sensitivity analysis to identify the effects of uncertainties 
associated with different assumptions, there is an increased risk that 
decisions will be made without a clear understanding of the possible 
impact on cost and benefit estimates. The ECSS PMO performed a cost 
risk and uncertainty analysis. This analysis shows that the service 
cost position is at the 60 percent confidence level--meaning there is 
a 40 percent chance of a cost overrun. For illustration, a simplified 
cost risk and uncertainty analysis is sketched after this table. In 
addition to the risk and uncertainty analysis, the PMO has implemented 
a risk management process to identify and mitigate schedule, cost, and 
performance risks. 

Source: GAO analysis based on data provided by the ECSS PMO. 

[A] MIL-HDBK-881. 

[End of table] 
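
The 60 percent confidence level cited above is the kind of result a 
cost risk and uncertainty analysis produces. As a minimal sketch--not 
the ECSS model; the cost elements and triangular distributions are 
invented--the following Python fragment samples uncertain cost 
elements and reads off the confidence level associated with a given 
cost position: 

import random

def sample_total_cost():
    # Hypothetical cost elements with (low, high, mode) ranges.
    software = random.triangular(400e6, 900e6, 550e6)
    data_migration = random.triangular(50e6, 200e6, 80e6)
    help_desk = random.triangular(30e6, 90e6, 45e6)
    return software + data_migration + help_desk

TRIALS = 20000
totals = sorted(sample_total_cost() for _ in range(TRIALS))

cost_position = 750e6  # hypothetical service cost position
confidence = sum(t <= cost_position for t in totals) / TRIALS
print("Confidence at $%.0fM: %.0f percent"
      % (cost_position / 1e6, 100 * confidence))

A cost position at the 60th percentile of the simulated totals 
corresponds to a 60 percent confidence level--that is, a 40 percent 
chance that actual costs will exceed the position. 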

Table 19: Analysis of the Army's GFEBS Cost Estimate: 

Four characteristics of high-quality cost estimates: Well-documented; 
Criterion met: Fully met; 
Key examples of rationale for assessment: The purpose, scope, and 
schedule of the estimate were clearly defined. Further, the 
documentation identified all the ground rules and assumptions as well 
as the estimating methodology. The PMO presented evidence of receiving 
approval of the estimate through briefings to management. The sources 
of data the estimate was based on were also documented. 

Four characteristics of high-quality cost estimates: Comprehensive; 
Criterion met: Fully met; 
Key examples of rationale for assessment: The program provided 
supporting documentation that showed the ground rules and assumptions 
underlying the cost estimate. The estimate is based on a cost 
estimating structure as dictated by the Department of the Army 
Economic Analysis Manual. The program also provided an estimating plan 
that included the cost estimating schedule. 

Four characteristics of high-quality cost estimates: Accurate; 
Criterion met: Substantially; 
Key examples of rationale for assessment: The GFEBS cost estimate 
details the calculations and inflation indexes underlying the 
estimated costs. Calculations within the model can be traced back to 
supporting documentation. In addition, evidence was provided that 
shows how estimated costs were derived based on actual costs incurred 
to date. For example, the estimated cost for program management is 
based on actual historical program management costs. However, because 
a cost uncertainty analysis has not been performed, DOD cannot 
guarantee that the estimate represents the most likely costs to be 
incurred. 

Four characteristics of high-quality cost estimates: Credible; 
Criterion met: Minimally; 
Key examples of rationale for assessment: An independent cost estimate 
was created by the Office of the Deputy Assistant Secretary of the 
Army for Cost and Economics. However, the estimate does not include 
either a sensitivity or risk and uncertainty analysis. The GFEBS PMO 
stated that it has adequately accounted for risks in the cost estimate 
based on the maturity of the program and the reconciliation process 
between the PMO estimate and the independent cost estimate. However, 
because the Army did not conduct a sensitivity analysis to identify 
the effect of uncertainties associated with different assumptions, 
there is an increased risk that decisions will be made without a clear 
understanding of the possible impact on cost and benefit estimates. 

Source: GAO analysis based on data provided by the GFEBS PMO. 

[End of table] 

Table 20: Analysis of the Army's GCSS-Army Cost Estimate: 

Four characteristics of high-quality cost estimates: Well-documented; 
Criterion met: Substantially; 
Key examples of rationale for assessment: The purpose, scope, and 
schedule of the cost estimate were clearly defined. The program has a 
current technical baseline document, and the PMO presented evidence of 
receiving approval of the estimate through briefings to management. 
However, the Economic Analysis documentation describing the cost 
estimate presents costs at a high level but does not provide details 
on lower-level cost elements. 

Four characteristics of high-quality cost estimates: Comprehensive; 
Criterion met: Substantially; 
Key examples of rationale for assessment: The GCSS-Army PMO uses a 
"hybrid" work breakdown structure for the program based on its 
collaboration with the Office of the Secretary of Defense Cost and 
Resource Center. This hybrid work breakdown structure, while not 
entirely product-oriented, standardizes the vocabulary for cost 
elements for automated information systems. Because there is currently 
no standardized work breakdown structure in use by DOD that 
corresponds to the implementation of an ERP system, the PMO worked 
closely with the Office of the Secretary of Defense Cost and Resource 
Center to develop a mutually acceptable work breakdown structure that 
meets best practices. In addition, the program provided supporting 
documentation that showed the ground rules and assumptions used to 
generate the cost estimate. However, our analysis shows that not all 
ground rules and assumptions were used to develop the cost risk and 
uncertainty analysis. For example, there are several assumptions 
associated with the number of software licenses, yet the risk and 
uncertainty analysis does not reflect any risk associated with these 
assumptions. 

Four characteristics of high-quality cost estimates: Accurate; 
Criterion met: Partially; 
Key examples of rationale for assessment: The cost estimate model 
shows the methodology and calculations used to prepare the estimate. 
However, because the PMO did not provide supporting documentation that 
details the use of actual costs to derive cost estimates, we are 
unable to verify the quality of the cost estimates. Programs should be 
monitored continuously for their cost-effectiveness by comparing 
planned and actual performance against the approved program baseline. 
The estimate should be updated with actual costs so that it is always 
relevant and current. This results in a higher-quality cost estimate 
and provides an opportunity to incorporate lessons learned. 

Four characteristics of high-quality cost estimates: Credible; 
Criterion met: Partially; 
Key examples of rationale for assessment: An independent cost estimate 
was created by the Army Cost Review Board Working Group. However, the 
cost estimate does not include a sensitivity analysis. Because the 
Army did not conduct a sensitivity analysis to identify the effects of 
uncertainties associated with different assumptions, there is an 
increased risk that decisions will be made without a clear 
understanding of the possible impact on cost and benefit estimates. 
The supporting documentation shows risk-adjusted costs, which were 
generated by applying probability distributions to cost elements 
within the cost model. However, the probability distributions applied 
throughout the model to account for risks are generalized and do not 
make a distinction in how specific risks may affect specific cost 
elements differently. While the GCSS-Army PMO has a risk process to 
identify, analyze, plan, track, control, and communicate risks, our 
analysis found that the PMO did not adequately link risks to the cost 
estimate. For example, data cleansing and data migration are noted as 
high risks within the risk register, but they are not accounted for in 
the risk and uncertainty analysis. Without a realistic risk and 
uncertainty analysis, the PMO can neither quantify the level of 
confidence in achieving a program within a certain funding level, nor 
determine a defensible amount of contingency reserve to quickly 
mitigate risk. 

Source: GAO analysis based on data provided by the GCSS-Army PMO. 

Note: The focus of our GCSS-Army cost assessment is the "ratified" 
GCSS-Army Cost Position dated November 2006 because the ratified Army 
Cost Position represents a more detailed approach to the program's 
cost estimating process compared to the current "federated" approach 
estimate for Milestone B. The current federated estimate, which 
reflects the federated ERP integration strategy for GCSS-Army and 
General Fund Enterprise Business System (GFEBS), was developed 
within 40 days as mandated by the Department of the Army. The PMO 
plans to implement a more detailed cost estimating process for its 
federated Army Cost Position in preparation for Milestone C in 
February 2011. 

[End of table] 

[End of section] 

Appendix VI: GAO Contacts and Staff Acknowledgments: 

GAO Contacts: 

Asif A. Khan, (202) 512-9095 or khana@gao.gov: 

Nabajyoti Barkakati, (202) 512-4499 or barkakatin@gao.gov: 

Staff Acknowledgments: 

In addition to the contacts named above, the following individuals 
made key contributions to this report: J. Christopher Martin, Senior-
Level Technologist; Darby Smith, Assistant Director; Evelyn Logue, 
Assistant Director; Karen Richey, Assistant Director; F. Abe Dymond, 
Assistant General Counsel; Beatrice Alff; Tyler Benson; Michael Bird; 
Jennifer Echard; Maxine Hattery; Jason Kelly; Jason Kirwan; Crystal 
Lazcano; Jason Lee; Len Ogborn; and Vanessa Virtudazo. 

[End of section] 

Footnotes: 

[1] DOD's business systems are information systems, including 
financial and nonfinancial systems that support DOD business 
operations, such as civilian personnel, finance, health, logistics, 
military personnel, procurement, and transportation. 

[2] GAO, High-Risk Series: An Update, [hyperlink, 
http://www.gao.gov/products/GAO-09-271] (Washington, D.C.: January 
2009). 

[3] An ERP solution is an automated system using commercial off-the- 
shelf (COTS) software consisting of multiple, integrated functional 
modules that perform a variety of business-related tasks such as 
general ledger accounting, payroll, and supply chain management. 

[4] These areas were designated as high risk in 2005, 1995, and 1990, 
respectively. 

[5] The 10 ERPs are as follows: Army--General Fund Enterprise Business 
System (GFEBS), Global Combat Support System-Army (GCSS-Army), and 
Logistics Modernization Program (LMP); Navy--Navy Enterprise Resource 
Planning (Navy ERP) and Global Combat Support System-Marine Corps 
(GCSS-MC); Air Force--Defense Enterprise Accounting and Management 
System (DEAMS) and Expeditionary Combat Support System (ECSS); 
Defense--Service Specific Integrated Personnel and Pay Systems and 
Defense Agencies Initiative (DAI); and Defense Logistics Agency--
Business System Modernization (BSM). According to DOD, BSM was fully 
implemented in July 2007. 

[6] GAO, Defense Logistics: Actions Needed to Improve Implementation 
of the Army Logistics Modernization Program, [hyperlink, 
http://www.gao.gov/products/GAO-10-461] (Washington, D.C.: Apr. 30, 
2010); DOD Business Systems Modernization: Navy Implementing a Number 
of Key Management Controls on Enterprise Resource Planning System, but 
Improvements Still Needed, [hyperlink, 
http://www.gao.gov/products/GAO-09-841] (Washington, D.C.: Sept. 15, 
2009); DOD Business Systems Modernization: Important Management 
Controls Being Implemented on Major Navy Program, but Improvements 
Needed in Key Areas, [hyperlink, 
http://www.gao.gov/products/GAO-08-896] (Washington, D.C.: Sept. 8, 
2008); DOD Business Transformation: Air Force's Current Approach 
Increases Risk That Asset Visibility Goals and Transformation 
Priorities Will Not Be Achieved, [hyperlink, 
http://www.gao.gov/products/GAO-08-866] (Washington, D.C.: Aug. 8, 
2008); DOD Business Systems Modernization: Key Marine Corps System 
Acquisition Needs to Be Better Justified, Defined, and Managed, 
[hyperlink, http://www.gao.gov/products/GAO-08-822] (Washington, D.C.: 
July 28, 2008); and DOD Business Transformation: Lack of an Integrated 
Strategy Puts the Army's Asset Visibility System Investments at Risk, 
[hyperlink, http://www.gao.gov/products/GAO-07-860] (Washington, D.C.: 
July 27, 2007). 

[7] [hyperlink, http://www.gao.gov/products/GAO-10-461], [hyperlink, 
http://www.gao.gov/products/GAO-09-841], [hyperlink, 
http://www.gao.gov/products/GAO-08-896], [hyperlink, 
http://www.gao.gov/products/GAO-08-866], [hyperlink, 
http://www.gao.gov/products/GAO-08-822], and [hyperlink, 
http://www.gao.gov/products/GAO-07-860]. 

[8] We reviewed the Army's GFEBS and GCSS-Army and the Air Force's 
DEAMS and ECSS. 

[9] GAO, GAO Cost Estimating and Assessment Guide: Best Practices for 
Developing and Managing Capital Program Costs, [hyperlink, 
http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: March 2009). 

[10] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. 

[11] [hyperlink, http://www.gao.gov/products/GAO-08-822] and 
[hyperlink, http://www.gao.gov/products/GAO-08-896]. 

[12] The reported amounts are not audited. In November 2009, the DOD 
Inspector General reported that because of long-standing internal 
control weaknesses, DOD's annual financial statements, which included 
these reported amounts, were not accurate and reliable. 

[13] DOD excludes from its business systems those designated as 
national security systems under Section 2222 (j) of Title 10, United 
States Code. National security systems are intelligence systems, 
cryptologic activities related to national security, military command 
and control systems, and equipment that is an integral part of a 
weapon or weapons system or is critical to the direct fulfillment of 
military or intelligence missions. 

[14] DOD Directive 5000.01, The Defense Acquisition System (Nov. 20, 
2007). 

[15] There are five tiers of business systems. Tier 1 systems include 
all large, expensive system programs classified as a major automated 
information system (MAIS) or a major defense acquisition program 
(MDAP) and subject to the most extensive statutory and regulatory 
reporting requirements. Tier 2 systems include those with 
modernization efforts of $10 million or greater but that are not 
designated as MAIS or MDAP or programs that have been designated as 
investment review board programs of interest because of their effect 
on DOD transformation objectives. Tier 3 systems include those with 
modernization efforts that have anticipated costs greater than $1 
million but less than $10 million. Tier 4 includes systems with 
development/modernization cost of $1 million or less. Tier 5 includes 
systems in operation and maintenance or sustainment. 

[16] The five IRBs are (1) financial management established by the 
Under Secretary of Defense (Comptroller); (2) weapon systems life-
cycle management and materiel supply and services management 
established by the Under Secretary of Defense (Acquisition, Technology 
and Logistics); (3) real property and installations life-cycle 
management established by the Under Secretary of Defense (Acquisition, 
Technology and Logistics); (4) human resources management established 
by the Under Secretary of Defense for Personnel and Readiness; and (5) 
Department of Defense Chief Information Officer established by the 
Assistant Secretary of Defense (Networks and Information 
Integration)/DOD Chief Information Officer. 

[17] Ronald W. Reagan National Defense Authorization Act for Fiscal 
Year 2005, Pub. L. No. 108-375, § 332, 118 Stat. 1811, 1851-1856 (Oct. 
28, 2004), codified in part at 10 U.S.C. § 2222, directs that DOD may 
not obligate appropriated funds for a defense business system 
modernization with a total cost of more than $1 million unless the 
approval authority--that is, the appropriate IRB--certifies that the 
business system modernization either (1) complies with the 
department's business enterprise architecture, (2) is necessary to 
achieve a critical national security capability or address a critical 
requirement in an area such as safety or security, or (3) is necessary 
to prevent a significant adverse effect on an essential project in 
consideration of alternative solutions. This certification must also 
be approved by the DBSMC. Also, as of October 28, 2009, the fiscal 
year 2010 National Defense Authorization Act, Pub. L. No. 111-84, 
§1072, 123 Stat. 2190, 2470 (Oct. 28, 2009), amended this requirement. 
This amendment requires the chief management officer of the military 
services, or for defense agencies, the DOD DCMO, to assess whether (1) 
the business process that the system supports will be as streamlined 
and efficient as possible and (2) the need to tailor commercial-off-
the-shelf systems to meet unique requirements or incorporate unique 
interfaces has been eliminated or reduced to the maximum extent 
practicable. This assessment is required both as a precondition of the 
approval of any new business system modernization with a cost over $1 
million, and as a review of any previously approved business system 
modernization with a cost over $100 million. 

[18] Pub. L. No. 110-417, div. A, title IX; §908, 122 Stat. 4356, 4569 
(Oct. 14, 2008). 

[19] The general fund can be defined as the fund into which receipts 
are deposited, except those from specific sources required by law to 
be deposited into other designated funds and from which appropriations 
are made by Congress to carry on the general and ordinary operations 
of the government. 

[20] According to the GFEBS PMO, once the system is fully operational 
the Army will assess the feasibility of GFEBS becoming the system of 
record for the Corps of Engineers. 

[21] The six Navy commands are the Naval Air Systems Command, the 
Naval Supply Systems Command, the Space and Naval Warfare Systems 
Command, the Naval Sea Systems Command, the Strategic Systems Program, 
and the Office of Naval Research and Strategic Systems Planning. 

[22] The military services integrated personnel and pay system is a 
replacement for the Defense Integrated Military Human Resources System 
that was intended to provide a joint, integrated, standardized 
personnel and pay system for all military personnel. 

[23] Full deployment means, with respect to a major automated 
information system program, the fielding of an increment of the 
program in accordance with the terms of a full deployment decision--
the final decision made by the MDA authorizing an increment of the 
program to deploy software for operational use. Pub. L. No. 111-84, 
div. A, §841, 123 Stat. 2190, 2418 (Oct. 28, 2009), the National 
Defense Authorization Act for Fiscal Year 2010, directed that the 
terminology be changed from full operational capability to full 
deployment. 

[24] A life-cycle cost estimate provides an accounting of all 
resources and associated cost elements required to develop, produce, 
deploy, and sustain a particular program. The life-cycle cost estimate 
encompasses all past, present, and future costs for every aspect of 
the program, regardless of funding source. 

[25] ERPs are developed in accordance with various models using 
terminology that varies among defense organizations and in some cases 
even within a given military service. For example, the Army's GFEBS 
refers to a scheduled segment as a "release," and within a release, 
there are "waves." The Air Force's DEAMS program refers to scheduled 
segments as "increments" and within increments, there are "spirals." 
For the purposes of this report, we refer generally to scheduled 
segments of implementation as "phases." 

[26] Master data are the persistent, non-transactional data that 
define a business entity for which there is, or should be, an agreed 
upon view across the organization. This key business information may 
include data about customers, products, employees, materials, and 
suppliers. Master data are often used by several functional groups and 
stored in different data systems across an organization and may or may 
not be referenced centrally; therefore, the possibility exists for 
duplicate master data and/or inaccurate master data. 

[27] IOT&E are conducted on production or production-representative 
articles to determine whether systems are operationally effective and 
suitable. 

[28] Conditions or issues needing resolution may be placed upon the 
ERPs by the MDA, the IRB, or the DBSMC during the business system's 
funding certification and acquisition decision review milestone 
process. These conditions are generally noted in a memorandum. 

[29] U.S. Army Test and Evaluation Command, Operational Test Agency 
Evaluation Report for the General Fund Enterprise Business System 
(Alexandria, Va.: Dec. 16, 2009). 

[30] According to the Software Engineering Institute, requirements 
management is a process that establishes a common understanding 
between the customer and the software project manager regarding the 
customer's business needs that will be addressed by a project. A 
critical part of this process is to ensure that the requirement 
development portion of the effort documents, at a sufficient level of 
detail, the problems that need to be solved and the objectives that 
need to be achieved. 

[31] GAO, DOD Business Systems Modernization: Billions Continued to be 
Invested with Inadequate Management Oversight and Accountability, 
[hyperlink, http://www.gao.gov/products/GAO-04-615] (Washington, D.C.: 
May 27, 2004). 

[32] GAO, Army Depot Maintenance: Ineffective Oversight of Depot 
Maintenance Operations and System Implementation Efforts, [hyperlink, 
http://www.gao.gov/products/GAO-05-441] (Washington, D.C.: June 30, 
2005). 

[33] [hyperlink, http://www.gao.gov/products/GAO-10-461]. 

[34] At the time LMP was designated as a MAIS program in December 
2007, it was required to comply with the DOD guidance for MAIS 
programs. This guidance requires, among other things, that a MAIS 
program have a completed and approved acquisition program baseline--
the baseline description of the program, including the life-cycle cost 
estimate--prior to Milestone B approval. The $2.6 billion is the only 
life-cycle cost estimate that has been developed for the program. 

[35] [hyperlink, http://www.gao.gov/products/GAO-08-896]. 

[36] According to DOD, the purpose of the IUID is to facilitate asset 
accountability and tracking, including the identification and 
aggregation of related costs to derive the full cost of a contract 
deliverable. 

[37] BTA defines SFIS as a comprehensive "common business language" 
that supports information and data requirements for budgeting, 
financial accounting, cost/performance management, and external 
reporting across the DOD enterprise. 

[38] January 2013 is the estimated full deployment date in the 
proposed acquisition program baseline submitted for MDA approval in 
February 2010. 

[39] [hyperlink, http://www.gao.gov/products/GAO-08-822]. 

[40] According to the May 10, 2004, analysis of alternatives, this 
estimate was a "rough order of magnitude" for research and 
development, procurement and operations and support from fiscal years 
2004 through 2011. 

[41] According to the July 15, 2005, economic analysis, program costs 
are estimated from fiscal years 2005 through 2018, in base year 2005 
dollars, and exclude $9.6 million associated with supporting and 
maintaining legacy systems during GCSS-MC development and $11.9 
million in fiscal year 2004 sunk costs. 

[42] The acquisition program baseline is an important document for 
program management and should reflect the approved program being 
executed. In this regard, the acquisition program baseline formally 
documents the program's estimated cost, schedule, and performance 
goals. 

[43] [hyperlink, http://www.gao.gov/products/GAO-08-866]. 

[44] [hyperlink, http://www.gao.gov/products/GAO-08-866]. 

[45] [hyperlink, http://www.gao.gov/products/GAO-08-866]. 

[46] The program objective memorandum details planned resource 
allocation 6 years in the future. 

[47] Each military department refers to its respective personnel and 
pay system by a different name--the Integrated Personnel and Pay 
System--Army, the Navy Future Pay and Personnel Solution, and the Air 
Force Integrated Personnel and Pay System. For purposes of this 
report, we are collectively referring to these efforts as the Service 
Specific Integrated Personnel and Pay Systems--a name used by DOD. 

[48] A critical path is the longest duration path through a sequenced 
list of activities within a schedule. A schedule risk analysis uses 
statistical techniques to predict a level of confidence in meeting a 
completion date. 

[49] [hyperlink, http://www.gao.gov/products/GAO-08-822] and 
[hyperlink, http://www.gao.gov/products/GAO-08-896]. 

[50] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]; OMB Revised 
Circular No. A-94, Guidelines and Discount Rates for Benefit-Cost 
Analysis of Federal Programs (Oct. 29, 1992); and DOD Instruction 
7041.3, Economic Analysis of Decisionmaking (Nov. 7, 1995). 

[51] [hyperlink, http://www.gao.gov/products/GAO-08-822] and 
[hyperlink, http://www.gao.gov/products/GAO-08-896]. 

[52] See, for example, [hyperlink, 
http://www.gao.gov/products/GAO-09-3SP]; and OMB Capital Programming 
Guide V 2.0, Supplement to Office of Management and Budget Circular A-
11, Part 7: Planning, Budgeting, and Acquisition of Capital Assets 
(Washington, D.C.: June 2006). 

[53] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. 

[54] A constraint predefines the start, finish, or both dates of an 
activity. The schedule should use logic and durations in order to 
reflect realistic start and completion dates for activities. 

[55] These unusual links are known as start-to-finish links, and they 
are rarely, if ever, used in scheduling. Schedules should contain a 
predominance of finish-to-start logical relationships so that one can 
know which activities must finish before others begin. 

[56] Summary activities summarize the effort of multiple lower-level 
tasks. 

[57] Float is the amount of time by which a predecessor activity can 
slip before the delay affects successor activities. 

[58] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. 

[59] An independent cost estimate is another estimate based on the 
same technical information that is used to validate and cross-check 
the baseline estimate, but is prepared by a person or organization 
that has no stake in the approval of the project. 

[60] [hyperlink, http://www.gao.gov/products/GAO-07-860]. 

[61] OMB Circular No. A-11, Preparation, Submission, and Execution of 
the Budget (June 2006); OMB Circular No. A-130, Revised, Management of 
Federal Information Resources (Nov. 28, 2000); and Office of 
Management and Budget, Capital Programming Guide: Supplement to 
Circular A-11, Part 7, Preparation, Submission, and Execution of the 
Budget (June 2000). 

[62] Pub. L. No. 104-106, div. E, title LI, § 5123, 110 Stat. 679, 683-
84 (Feb. 10, 1996), codified, as amended, at 40 U.S.C. § 11313. 

[63] GAO, Tax Administration: IRS Needs to Further Refine Its Tax 
Filing Season Performance Measures, [hyperlink, 
http://www.gao.gov/products/GAO-03-143] (Washington, D.C.: Nov. 22, 
2002). 

[64] [hyperlink, http://www.gao.gov/products/GAO-10-461]. 

[65] This engagement focused on nine ERP efforts that DOD considers 
critical to transforming its business operations and resolving some of 
the department's high-risk areas such as business transformation, 
business system modernization, financial management, and supply chain 
management. 

[66] [hyperlink, http://www.gao.gov/products/GAO-10-461], [hyperlink, 
http://www.gao.gov/products/GAO-09-841], [hyperlink, 
http://www.gao.gov/products/GAO-08-896], [hyperlink, 
http://www.gao.gov/products/GAO-08-866], [hyperlink, 
http://www.gao.gov/products/GAO-08-822], and [hyperlink, 
http://www.gao.gov/products/GAO-07-860]. 

[67] [hyperlink, http://www.gao.gov/products/GAO-08-822] and 
[hyperlink, http://www.gao.gov/products/GAO-08-896]. 

[68] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. 

[69] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. 

[70] United States Army, 2009 United States Army Report to Congress 
(Arlington, Va.), and 2010 Army Report to Congress on Business 
Transformation (Arlington, Va.: Mar. 1, 2010); Department of the Navy, 
Congressional Report, NDAA 2009, Section 908, Business Transformation 
Initiatives for the Military Departments (Arlington, Va.), and 
Department of the Navy Fiscal Year 2010 Business Transformation Report 
Update (Arlington, Va.); and United States Air Force, Initial Report 
on Implementation of NDAA 2009, Business Transformation Initiatives 
for the Military Departments (Sec 908) (Arlington, Va.: July 2009), 
and March 2010 Follow-up Report on Implementation of NDAA 2009, 
Business Transformation Initiatives for the Military Departments (Sec 
908) (Arlington, Va.: March 2010). 

[71] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. 

[72] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO’s actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO’s Web site, 
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: