This is the accessible text file for GAO report number GAO-02-9 
entitled 'Information Technology: Inconsistent Software Acquisition 
Processes at the Defense Logistics Agency Increase Project Risks' 
which was released on January 10, 2002. 

This text file was formatted by the U.S. General Accounting Office 
(GAO) to be accessible to users with visual impairments, as part of a 
longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the 
printed version. The portable document format (PDF) file is an exact 
electronic replica of the printed version. We welcome your feedback. 
Please E-mail your comments regarding the contents or accessibility 
features of this document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

United States General Accounting Office: 
GAO: 

Report to Congressional Committees: 

January 2002: 

Information Technology: 

Inconsistent Software Acquisition Processes at the Defense Logistics 
Agency Increase Project Risks: 

GAO-02-9: 

Contents: 

Letter: 

Results in Brief: 

Background: 

DLA Lacks the Capability to Acquire Software Effectively: 

DLA Lacks Effective Software Process Improvement: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments: 

Appendix I: Objectives, Scope, and Methodology: 

Appendix II: Results of Software Acquisition Capability Maturity Model 
Evaluation for Business Systems Modernization: 

Appendix III: Results of Software Acquisition Capability Maturity 
Model Evaluation for Fuels Automated System: 

Appendix IV: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Acknowledgments: 

Tables: 

Table 1: Six SA-CMM Level-2 and One Level-3 Key Process Areas: 

Table 2: Key Process Area Strengths and Weaknesses for BSM: 

Table 3: Key Process Area Strengths and Weaknesses for FAS: 

Table 4: Software Acquisition Planning Findings for BSM: 

Table 5: Solicitation Findings for BSM: 

Table 6: Requirements Development and Management Findings for BSM: 

Table 7: Project Management Findings for BSM: 

Table 8: Contract Tracking and Oversight Findings for BSM: 

Table 9: Evaluation Findings for BSM: 

Table 10: Acquisition Risk Management Findings for BSM: 

Table 11: Software Acquisition Planning Findings for FAS: 

Table 12: Requirements Development and Management Findings for FAS: 

Table 13: Project Management Findings for FAS: 

Table 14: Contract Tracking and Oversight Findings for FAS: 

Table 15: Evaluation Findings for FAS: 

Table 16: Acquisition Risk Management Findings for FAS: 

Figure 1: SA-CMM Levels and Descriptions: 

Abbreviations: 

BSM: Business Systems Modernization: 

CIO: Chief Information Officer: 

CMM®: Capability Maturity Model: 

CMU: Carnegie Mellon University: 

DLA: Defense Logistics Agency: 

DOD: Department of Defense: 

FAS: Fuels Automated System: 

IT: Information Technology: 

SA-CMM®: Software Acquisition Capability Maturity Model: 

SEI: Software Engineering Institute: 

[End of section] 

United States General Accounting Office: 
Washington, DC 20548: 

January 10, 2002: 

The Honorable Carl Levin: 
Chairman: 
The Honorable John Warner: 
Ranking Minority Member: 
Committee on Armed Services: 
United States Senate: 

The Honorable Bob Stump: 
Chairman: 
The Honorable Ike Skelton: 
Ranking Minority Member: 
Committee on Armed Services: 
House of Representatives: 

The Defense Logistics Agency (DLA) plays a critical role in supporting 
America's military forces worldwide. To fulfill this role, DLA employs 
about 28,000 civilian and military workers, located at about 500 sites 
in all 50 states and in 28 countries. It also manages about 4 million 
supply items and processes about 30 million annual supply distribution 
actions. In fiscal year 2000, DLA reported that these operations 
resulted in sales to the military services of about $13 billion. DLA 
relies on software-intensive systems to support this work. An 
important determinant of the quality of software-intensive systems, 
and thus DLA's mission performance, is the quality of the processes 
used to acquire these systems. 

This report is one in a series of products to satisfy our mandate 
under the fiscal year 2001 Defense Authorization Act.[Footnote 1] The 
act directed that we review DLA's efficiency and effectiveness in 
meeting requirements, its application of best business practices, and 
opportunities for improving its operations. As agreed with your 
offices, the objectives of this review of DLA's information technology 
(IT) management were to determine (1) whether DLA has the effective 
software acquisition processes that are necessary to modernize and 
maintain systems and (2) what actions DLA has planned or in place to 
improve these processes. 

Carnegie Mellon University's Software Engineering Institute (SEI), 
recognized for its expertise in software processes, has developed 
models and methods that define and determine organizations' software 
process maturity. Together, these models and methods provide (1) a 
logical framework for baselining an organization's current process 
capabilities (i.e., determining what practices are effectively 
implemented [strengths], not effectively implemented [weaknesses], or 
contain mixed or inconclusive evidence [observations]) and (2) a 
structured plan for incremental process improvement. These models and 
methods are generally recognized as best business practices. 

Using SEI's Software Acquisition Capability Maturity Model (SA-CMM®) 
[Footnote 2] and SEI's software capability evaluation method, our 
staff (trained at SEI) evaluated DLA's software acquisition maturity 
in six of seven key process areas that are necessary to attain a 
"repeatable" level of process maturity.[Footnote 3] The repeatable 
level of process maturity is level 2 on SEI's five-level scale. An 
organization at the repeatable level of process maturity has the
necessary process discipline in place to repeat earlier successes on 
similar projects. Organizations that do not satisfy the requirements 
for the repeatable level are by default judged to be at level 1, the 
"initial" level of maturity. This means that their processes are 
immature, ad hoc, and sometimes even chaotic, with few of the 
processes defined and success dependent mainly on the heroic efforts 
of individuals. We also evaluated DLA on one level-3, or "defined" 
level, process—acquisition risk management. We included acquisition 
risk management because many software experts consider it to be one of 
the most important process areas. 

Our evaluation included DLA's only ongoing software/system 
acquisitions: the Business Systems Modernization (BSM) and the Fuels 
Automated System (FAS). Details on our objectives, scope, and 
methodology are contained in appendix I. The Department of Defense 
(DOD) provided us with comments on a draft of this report, which are 
discussed in the "Agency Comments" section. 

Results in Brief: 

DLA does not have mature software acquisition processes across the 
agency, as evidenced by the wide disparity in the rigor and discipline 
of processes between the two systems we evaluated. Whereas BSM fully 
satisfied requirements for most of the key process areas evaluated, 
FAS did not fully satisfy all the criteria for any key process area. 
More specifically, BSM satisfied all requirements for three level-2 
key process areas—software acquisition planning, project management, 
and contract tracking and oversight—and for the one level-3 key 
process area that we evaluated—acquisition risk management. Further, 
BSM satisfied all but a few practices in the other level-2 key 
process areas—solicitation, requirements development and 
management, and evaluation. On the other hand, FAS did not fully 
satisfy all requirements for any of the level-2 key process areas, and 
also did not satisfy the one level-3 key process area we evaluated. 
According to DLA officials, the variance between BSM and FAS software 
acquisition maturity can be attributed in part to differences in the 
level of resources that each project committed to acquisition process 
controls. This means that DLA does not have effective corporate 
processes for consistently acquiring software (the most costly and 
complex component of systems), which can lead to the acquisition of 
systems that do not meet the information needs of management and 
staff, do not provide support for necessary programs and operations, 
and cost more and take longer than expected to complete. 

Moreover, DLA does not have a software process improvement program in 
place to effectively strengthen its corporate software acquisition 
processes. Earlier this year, we reported that DLA does not have a 
software process improvement program, having eliminated the program in 
1998.[Footnote 4] We also reported that DLA's Chief Information 
Officer (CIO) stated that the program would be reestablished. However, 
DLA still does not have written plans and milestones for doing so 
because the improvement program has not been an agency priority. 
Without a software process improvement program, it is unlikely that 
DLA can effectively improve its institutional software acquisition 
capabilities, which in turn means that DLA's software projects will be 
at risk of not delivering promised capabilities on time and within 
budget. 

To reduce DLA's software acquisition project risks, we are 
recommending actions aimed at (1) correcting BSM and FAS process 
weaknesses and (2) establishing a framework for long-term institutional 
software process improvement. 

DOD provided what it termed "official oral comments" on a draft of 
this report. In its comments, DOD stated that it generally concurred 
with the report and that it concurred with the recommendations. 

Background: 

DLA is the Department of Defense's logistics manager for all DOD 
consumable items[Footnote 5] and some department repair items. 
[Footnote 6] Its primary business function is to provide supply 
support in order to sustain military operations and readiness. In 
addition to this primary function, which DLA refers to as either 
"materiel management" or "supply-chain management," DLA performs five 
other major business functions: distributing materiel ordered from its 
inventory; purchasing fuels for DOD and the U.S. government; storing 
strategic materiel;[Footnote 7] marketing surplus DOD materiel for 
reuse and disposal; and providing numerous information services, such 
as item cataloging,[Footnote 8] for DOD, the United States, and 
selected foreign governments. DLA consists of a central command 
authority supported by a number of field commands that manage the 
agency's six business functions. 

Until about 1997, DLA generally developed its systems in-house. Since 
then, the agency has begun to acquire systems, relying on contractors 
for system development and managing the acquisition of these systems. 
Currently, DLA is in the process of acquiring two systems: Business 
Systems Modernization (BSM) and Fuels Automated System (FAS). 

* BSM is intended to modernize DLA's materiel management business 
function, changing the agency from being solely a provider and manager 
of physical inventory to being a manager of supply chains. In this 
role, DLA would link customers with appropriate suppliers and track 
physical and financial business practices. Through BSM, DLA plans to replace 
two large legacy systems, as well as several supporting programs, that 
are more than 30 years old and are not integrated. BSM is based on 
commercially available software products. DLA plans to acquire and 
deploy its BSM system solution through a series of four system 
releases/increments. First, it plans to demonstrate successful 
application of its new concept of doing business for selected 
commodities—namely, earth-moving equipment, medical/pharmaceutical 
supplies, and F/A-18 engine components—at the three Defense Supply 
Centers. If this first release is successfully demonstrated, DLA plans 
to expand the system solution to other commodities in three additional 
increments. DLA plans to invest approximately $658 million to acquire 
and implement BSM from fiscal years 2000 through 2005. 

* FAS is intended to help the Defense Energy Support Center manage 
about $5 billion in contracts with petroleum suppliers each year. FAS 
is to be a multifunctional system that provides for, among other 
things, point-of-sale data collection, inventory control, finance and 
accounting, procurement, and facilities management. FAS, which relies 
on a commercially available software package, is being fielded 
incrementally. Increment 1 is the base-level operational module that 
is currently being deployed to base-level sites worldwide. The second 
increment is the enterprise-level system, which is to be deployed to 
its direct delivery commodity business unit. DLA plans to invest $293 
million in FAS from fiscal years 1995 through 2002. 

SEI's SA-CMM is used to measure an organization's capability to manage 
the acquisition of software. SEI's expertise in, and model and methods 
for, determining software process assessment are recognized and 
accepted throughout the software industry. The model defines five 
levels of software acquisition maturity. Each level of maturity 
(except level 1) indicates process capability in relation to key 
process areas. For a maturity level to be achieved, all key process 
areas related to that level must be implemented effectively. 

The second level of process maturity, level 2 (referred to as the 
repeatable level), demonstrates that basic management processes are 
established to track performance, cost, and schedule, and the 
necessary discipline is in place to repeat earlier successes on 
similar projects. Organizations that do not effectively implement all 
key process areas for the repeatable level are, by default, at level 
1, the initial level of maturity. Level-1 processes can be described 
as immature, ad hoc, and sometimes chaotic; success in software 
acquisition for these organizations depends on the ability and 
commitment of the staff involved. Figure 1 further explains the five-
level software acquisition model. 

Figure 1: SA-CMM Levels and Descriptions: 

[Refer to PDF for image: illustration] 

Level 1: Initial: 
The software acquisition process is characterized as ad hoc, and 
occasionally even chaotic. Few processes are defined and success 
depends on individual effort. 

Disciplined process: 

Level 2: Repeatable: 
Basic software acquisition project management processes are 
established to plan all aspects of the acquisition, manage software 
requirements, track project team and contractor team performance, 
manage the project's cost and schedule baselines, evaluate the 
products and services, and successfully transition the software to its 
support organization. The necessary process discipline is in place to 
repeat earlier successes on projects in similar domains. 

Standard consistent process: 

Level 3: Defined: 
The acquisition organization's software acquisition process is 
documented and standardized. All projects use an approved, tailored 
version of the organization's standard software acquisition process 
for acquiring their software products and services. 

Predictable process: 

Level 4: Quantitative: 
Detailed measures of the software acquisition processes, products, and 
services are collected. The software processes, products, and services 
are quantitatively and qualitatively understood and controlled. 

Continuously improving process: 

Level 5: Optimizing: 
Continuous process improvement is empowered by quantitative feedback 
from the process and from piloting innovative ideas and technologies. 

Source: Software Engineering Institute (SEI). 

[End of figure] 

We evaluated DLA against six of the seven level-2 (repeatable) key 
process areas in the SA-CMM. We did not evaluate DLA on the seventh 
key process area—transition to support—because the contractors who 
are implementing BSM and FAS will support these systems when they are 
operational, rendering transition to support irrelevant for these 
acquisitions. We evaluated DLA against one level-3 (defined) key 
process area—acquisition risk management—because many software 
acquisition experts consider it to be one of the most important key 
process areas. These key process areas are described in table 1. 

Table 1: Six SA-CMM Level-2 and One Level-3 Key Process Areas: 

SA-CMM level 2: 

Key process area: Software acquisition planning; 
Description: Ensuring that reasonable planning for the software 
acquisition is conducted and that all elements of the project are 
included; 
Examples of SA-CMM required practices[A]: Includes (1) having a 
written software acquisition policy, (2) having adequate resources for 
software acquisition planning, (3) developing and documenting the 
software acquisition strategy and plan, (4) having management review 
software acquisition planning activities, and (5) making and using 
measurements to determine the status of software acquisition planning 
activities. 

Key process area: Solicitation; 
Description: Ensuring that award is made to the contractor most 
capable of satisfying the specified requirements; 
Examples of SA-CMM required practices[A]: Includes (1) designating 
responsibility for the software portion of the solicitation, (2) 
preparing cost and schedule estimates for the software products and 
services being acquired, (3) having a written policy for the conduct 
of the software portion of the solicitation, and (4) having an 
independent review of cost and schedule estimates for the software 
products and services being acquired. 

Key process area: Requirements development and management; 
Description: Establishing a common and unambiguous definition of 
software acquisition requirements that is understood by the 
acquisition team, system users, and contractor(s). This key process 
area involves two subprocesses: (1) developing a baseline set of 
software-related contractual requirements and (2) managing these 
requirements and changes to these requirements for the duration of the 
acquisition; 
Examples of SA-CMM required practices[A]: Includes (1) having a 
written policy for managing the software-related contractual 
requirements, (2) having a group that is responsible for performing 
requirements development and management activities, (3) ensuring that 
the team performs its activities in accordance with its documented 
requirements development and management plans, (4) appraising system 
requirements change requests for their impact on the software being 
acquired, (5) appraising changes to the software-related contractual 
requirements for their impact on performance and contract schedule and 
cost, and (6) measuring and reporting on the status of requirements 
development and management activities to management. 
		
Key process area: Project management; 
Description: Managing the activities of the project office and 
supporting contractor(s) to ensure a timely, efficient, and effective 
software acquisition; 
Examples of SA-CMM required practices[A]: Includes (1) designating 
responsibility for project management, (2) having a written policy for 
the management of the software project, (3) having adequate resources 
for the duration of the software acquisition project, (4) documenting 
the roles, responsibilities, and authority for the project functions, 
(5) tracking the risks associated with cost, schedule, and resources, 
and (6) using a corrective action system for identifying, recording, 
tracking, and correcting problems. 

Key process area: Contract tracking and oversight; 
Description: Ensuring that the software activities under contract are 
being performed in accordance with contract requirements and that 
products and services will satisfy contract requirements; 
Examples of SA-CMM required practices[A]: Includes (1) designating 
responsibility for contract tracking and oversight, (2) including 
contract specialists in the project team, (3) ensuring that 
individuals performing contract tracking and oversight activities have 
experience or receive training, (4) having a documented plan for 
contract tracking and oversight, and (5) comparing the actual cost and 
schedule of the contractor's software engineering effort to planned 
schedules and budgets. 

Key process area: Evaluation; 
Description: Determining that the acquired software products and 
services satisfy contract requirements before acceptance; 
Examples of SA-CMM required practices[A]: Includes (1) designating 
responsibility for planning, managing, and performing evaluation 
activities, (2) ensuring that adequate resources are provided for 
evaluation activities, (3) documenting evaluation plans and conducting 
evaluation activities in accordance with the plan, (4) developing and 
managing evaluation requirements in conjunction with developing 
software technical requirements, and (5) measuring and reporting on 
the status of evaluation activities to management. 
	
SA-CMM level 3: 

Key process area: Acquisition risk management; 
Description: Identifying risks as early as possible and adjusting the 
acquisition to mitigate those risks; 
Examples of SA-CMM required practices[A]: Includes (1) having a 
written policy for managing software acquisition risk, (2) designating 
responsibility for software acquisition risk activities, (3) providing 
adequate resources for software acquisition risk management 
activities, (4) developing a software acquisition risk management 
plan, and (5) measuring and reporting on the status of acquisition 
risk management activities to management. 

[A] We included only examples of the SA-CMM key practices. 

Source: GAO, based on SEI data. 

[End of table] 

As established by the model, each key process area contains five 
common features—commitment to perform, ability to perform, activities 
to be performed, measurement and analysis of activities, and 
verification of activities' implementation. These five features 
collectively provide a framework for the implementation and 
institutionalization of the key process areas. The common feature 
definitions are as follows: 

* Commitment to perform: This feature describes the actions that the 
organization takes to establish the process and ensure that it can 
endure. Key practices typically involve establishing organizational 
policies and sponsorship. 

* Ability to perform: This feature describes the preconditions that 
must exist in the project or organization to implement the software 
acquisition process competently. Key practices typically include 
assigning responsibility and providing training. 

* Activities to be performed: This feature describes the roles and 
procedures necessary to implement a key process area. Key practices 
typically involve establishing plans and procedures, performing the 
work, tracking it, and taking appropriate management actions. 

* Measurement and analysis of activities: This feature describes the 
steps necessary to measure progress and analyze the measurements. Key 
practices typically involve defining the measurements to be taken and 
the analyses to be conducted to determine the status and effectiveness 
of the activities performed. 

* Verification of activities' implementation: This feature describes 
the steps the organization must take to ensure that project activities 
are performed in accordance with established processes. Key practices 
typically involve regular reviews by management. 

Each common feature consists of a number of key practices—specific 
actions such as developing an organizational policy for software 
acquisition, developing various plans for software acquisition 
activities, and tracking a contractor's progress. When an organization 
is evaluated against the SA-CMM, comparisons of actual performance 
against a key practice can result in one of four possible outcomes or 
ratings: 

* Strength: The key practice involved was effectively implemented. 

* Weakness: The key practice was not effectively implemented or was 
not implemented. 

* Observation: The key practice was evaluated, but cannot be 
characterized as a strength because (1) the project team did not 
provide sufficient evidence to support a strength rating or (2) the 
key practice was only partially performed. 

* Not rated: The key practice is not relevant to the project. 

To achieve the repeatable level, DLA would have to demonstrate that 
the key practices related to this level were implemented effectively 
in the software acquisition projects being evaluated, and thus the 
project successes can be repeated in future projects. 
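
The arithmetic behind these ratings and the percentage profiles 
reported later (see tables 2 and 3) can be expressed compactly. The 
sketch below is our Python illustration only, not a GAO or SEI tool; 
it uses simple rounding, whereas the report's own figures are 
occasionally adjusted so that a row sums to 100 percent.

from collections import Counter

RATINGS = ("strength", "weakness", "observation", "not rated")

def profile(practice_ratings):
    # Percentage of a key process area's practices at each rating.
    counts = Counter(practice_ratings)
    total = len(practice_ratings)
    return {r: round(100 * counts[r] / total) for r in RATINGS}

def fully_satisfied(practice_ratings):
    # Our reading of "fully satisfied": every relevant practice was
    # rated a strength (weaknesses and observations both disqualify).
    return all(r in ("strength", "not rated") for r in practice_ratings)

# Example: FAS software acquisition planning (table 3) had 12
# strengths, 2 weaknesses, and 1 observation across 15 key practices.
fas_planning = ["strength"] * 12 + ["weakness"] * 2 + ["observation"]
print(profile(fas_planning))          # strength 80, weakness 13, observation 7
print(fully_satisfied(fas_planning))  # False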

DLA Lacks the Capability to Acquire Software Effectively: 

DLA is not at level 2 (the repeatable level of maturity) when compared 
with the SA-CMM—meaning that DLA does not possess an agencywide or 
corporate ability to effectively acquire software-intensive systems. 
Whereas DLA's BSM project fully or substantially satisfied SEI's SA-CMM 
requirements for the key process areas for level 2, as well as 
requirements for one level-3 (defined level) key process area, its FAS 
project did not satisfy all the criteria for any of these key process 
areas. A discussion of how each system compared with the SA-CMM is 
summarized below. 

BSM Satisfied or Substantially Satisfied All Key Process Areas: 

BSM completely satisfied requirements for three of the level-2 key 
process areas, as well as for the one level-3 key process area, and 
substantially satisfied requirements for the remaining three level-2 
key process areas that we evaluated.[Footnote 9] (See table 2 for the 
percentage of strengths and weaknesses for each area evaluated.) 
According to BSM officials, satisfying the criteria for the key 
process areas is attributable to the following factors: allocating 
adequate resources; following good program management practices, as 
defined in DOD Directive 5000; and working closely with relevant 
oversight groups. To address those few weaknesses that we identified, 
project officials told us that they have initiated corrective action. 

Table 2: Key Process Area Strengths and Weaknesses for BSM: 

Key process area: Software acquisition planning; 
Strengths: 100%; 
Weaknesses: 0%; 
Observations: 0%. 

Key process area: Solicitation; 
Strengths: 94%; 
Weaknesses: 6%; 
Observations: 0%. 

Key process area: Requirements development and management; 
Strengths: 79%; 
Weaknesses: 21%; 
Observations: 0%. 

Key process area: Project management; 
Strengths: 100%; 
Weaknesses: 0%; 
Observations: 0%. 

Key process area: Contract tracking and oversight; 
Strengths: 100%; 
Weaknesses: 0%; 
Observations: 0%. 

Key process area: Evaluation; 
Strengths: 93%; 
Weaknesses: 0%; 
Observations: 7%. 

Key process area: Acquisition risk management; 
Strengths: 100%; 
Weaknesses: 0%; 
Observations: 0%. 

Source: GAO calculations, based on data and interviews with Business 
Systems Modernization officials. 

[End of table] 

BSM satisfied all key practices in: 

* software acquisition planning, such as (1) having a written software 
acquisition policy, (2) having adequate resources for software 
acquisition planning activities, (3) developing and documenting the 
software acquisition strategy and plan, and (4) making and using 
measurements to determine the status of software acquisition planning 
activities. 

* project management, including (1) designating responsibility for 
project management, (2) having a written policy for the management of 
the software project, (3) having adequate resources for the duration 
of the software acquisition project, and (4) tracking the risks 
associated with cost, schedule, resources, and the technical aspects 
of the project. 

* contract tracking and oversight, including (1) designating 
responsibility for contract tracking and oversight, (2) including 
contract specialists in the project team, and (3) having a documented 
plan for contract tracking and oversight. 

* acquisition risk management, such as (1) having a risk management 
plan, (2) having a written policy for the management of software 
acquisition risk, and (3) measuring and reporting on the status of 
acquisition risk management activities to management. 

BSM also satisfied all but one key practice in solicitation. Strengths 
included (1) designating responsibility for the software portion of 
the solicitation, (2) preparing cost and schedule estimates for the 
software products and services being acquired, and (3) having an 
independent review of cost and schedule estimates for the software 
products and services being acquired. BSM's one weakness in this key 
process area was in not having a written policy for the software 
portion of the solicitation. This is significant because, according to 
the SEI, an institutional policy provides for establishing an enduring 
process. 

BSM also satisfied all but three key practices in requirements 
development and management. Strengths included (1) having a written 
policy for managing the software-related contractual requirements, (2) 
having a group that is responsible for performing requirements 
development and management activities, and (3) measuring and reporting 
to management on the status of requirements development and management 
activities. One of the three weaknesses was the lack of a documented 
requirements development and management plan. Such a plan provides a 
roadmap for completing important requirements development and 
management activities. Without it, projects risk either not performing 
important tasks or not performing them effectively. The other two 
weaknesses involved the project office's appraisal of system 
requirements changes. Specifically, BSM did not appraise (1) requests 
to change system requirements for their impact on the software being 
acquired or (2) all changes to the requirements for impact on 
performance and contract schedule and cost. These activities are 
critical to making informed, risk-based decisions about whether to 
approve requirements changes. 

Last, BSM satisfied all but one key practice in evaluation, and we do 
not view that practice as significant. Strengths included (1) 
designating responsibility for planning, managing, and performing 
evaluation activities, (2) 
documenting evaluation plans and conducting evaluation activities in 
accordance with the plan, and (3) developing and managing evaluation 
requirements in conjunction with developing software technical 
requirements. 

By generally satisfying these key process areas for its BSM project, 
DLA has increased the chances that the software acquired on this 
project will meet stated requirements and will be delivered on time 
and within budget. 

See appendix II for more detailed information on key process areas and 
our findings on BSM. 

FAS Did Not Satisfy Any of the Key Process Areas: 

Because of the number and severity of its key practice weaknesses, FAS 
did not fully satisfy all the criteria for any of the five level-2 SA-
CMM key process areas or for the one level-3 key process area that we 
evaluated.[Footnote 10] (See table 3 for the percentage of strengths 
and weaknesses for each area evaluated.) According to FAS officials, 
these weaknesses are attributable to a lack of adequate resources for 
the process areas. However, these officials stated that they are 
currently in the process of reorganizing and addressing resource 
shortages. 

Table 3: Key Process Area Strengths and Weaknesses for FAS: 

Key process area: Software acquisition planning; 
Strengths: 80%; 
Weaknesses: 13%; 
Observations: 7%; 
Not rated: 0%. 

Key process area: Requirements development and management; 
Strengths: 43%; 
Weaknesses: 43%; 
Observations: 14%; 
Not rated: 0%. 

Key process area: Project management; 
Strengths: 63%; 
Weaknesses: 37%; 
Observations: 0%; 
Not rated: 0%. 

Key process area: Contract tracking and oversight; 
Strengths: 65%; 
Weaknesses: 29%; 
Observations: 6%; 
Not rated: 0%. 

Key process area: Evaluation; 
Strengths: 60%; 
Weaknesses: 13%; 
Observations: 13%; 
Not rated: 14%. 

Key process area: Acquisition risk management; 
Strengths: 20%; 
Weaknesses: 73%; 
Observations: 7%; 
Not rated: 0%. 

Source: GAO calculations, based on data and interviews with Fuels 
Automated System officials. 

[End of table] 

In the software-acquisition-planning key process area, FAS had 12 
strengths, 2 weaknesses, and 1 observation. Strengths included, among 
other things, (1) having a written software acquisition policy, (2) 
developing and documenting the software acquisition strategy and plan, 
and (3) having management review software-acquisition-planning 
activities. Weaknesses included (1) not having adequate resources for 
software-acquisition-planning activities and (2) not measuring the 
status of the software-acquisition-planning activities and resultant 
products. The weaknesses are significant because they could prevent 
management from developing effective plans, from being aware of 
problems in meeting planned commitments, or from taking necessary 
corrective actions expeditiously. 

In the requirements development and management key process area, FAS 
had six strengths, six weaknesses, and two observations. Examples of 
strengths included (1) having a written policy for managing the 
software-related contractual requirements and (2) having a group that 
is responsible for performing requirements development and management 
activities. However, we found weaknesses in important key practices 
that jeopardize effective control of the requirements baseline and can 
result in software products that do not meet cost, schedule, or 
performance objectives. Specific examples of weaknesses included (1) 
not having a documented requirements development and management plan, 
(2) not appraising requests to change system requirements for their 
impact on the software being acquired, (3) not appraising changes to 
the software-related contractual requirements for their impact on 
performance and contract schedule and cost, and (4) not measuring and 
reporting to management on the status of requirements development and 
management activities. 

In the project management key process area, FAS had 10 strengths and 6 
weaknesses. Strengths included, among other things, (1) designating 
responsibility for project management, (2) having a written policy for 
the management of the software project, and (3) using a corrective 
action system for identifying, recording, tracking, and correcting 
problems. Examples of weaknesses included (1) not having adequate 
resources for the duration of the software acquisition project, (2) 
not documenting the roles, responsibilities, and authority for the 
project functions, and (3) not tracking the risks associated with 
cost, schedule, and resources. These weaknesses are significant 
because they could jeopardize the project's ability to ensure that 
important project management and contractor activities are defined, 
understood, and completed. 

FAS had 11 strengths, 5 weaknesses, and 1 observation in the contract 
tracking and oversight key process area. Strengths included, among 
other things, (1) designating responsibility for contract tracking and 
oversight, (2) including contract specialists on the project team, and 
(3) ensuring that individuals performing contract tracking and 
oversight activities had experience or received training. Examples of 
weaknesses included (1) not having a documented plan for contract 
tracking and oversight and (2) not comparing the actual cost and 
schedule of the contractor's software engineering effort with planned 
schedules and budgets. Because of these weaknesses, FAS contractor 
tracking and oversight activities are undisciplined and unstructured, 
thereby increasing the chances of FAS software acquisitions being 
late, costing more than expected, and not performing as intended. 

In the evaluation key process area, FAS had nine strengths, two 
weaknesses, two observations, and two areas that were not rated. 
Strengths included, among other things, (1) designating responsibility 
for planning, managing, and performing evaluation activities, (2) 
documenting evaluation plans and conducting evaluation activities in 
accordance with the plan, and (3) developing and managing evaluation 
requirements in conjunction with developing software technical 
requirements. Weaknesses were (1) not ensuring that adequate resources 
were provided for evaluation activities and (2) not measuring and 
reporting on the status of evaluation activities to management. These 
weaknesses are significant because they preclude DLA decisionmakers 
from knowing whether contractor-developed software is meeting defined 
requirements. 

FAS performed poorly in the one level-3 key process area that we 
evaluated—acquisition risk management—with 3 strengths, 11 
weaknesses, and 1 observation. Examples of strengths included (1) 
having a written policy for the management of software acquisition 
risk and (2) designating responsibility for software acquisition risk 
activities. Weaknesses included, among others, (1) not having adequate 
resources for performing risk management activities, (2) not having a 
software risk management plan, and (3) not measuring and reporting on 
the status of acquisition risk management activities to management. 
Because of these weaknesses, the project office does not have adequate 
assurance that it will promptly identify risks and effectively 
mitigate them before they become problems. 

By not satisfying any of these key process areas for its FAS project, 
DLA is unnecessarily increasing the risk that the software acquired on 
this project will not meet stated requirements and will not be 
delivered on time and within budget. 

Appendix III provides more details on the key process areas and our 
findings on FAS. 

DLA Lacks Effective Software Process Improvement: 

The quality of the processes involved in developing, acquiring, and 
engineering software and systems has a significant effect on the 
quality of the resulting products. Accordingly, process improvement 
programs can increase product quality and decrease product costs. 
Public and private organizations have reported significant returns on 
investment through such process improvement programs. In particular, 
SEI has published reports of benefits realized through process 
improvement programs. For example, SEI reported in 1995[Footnote 11] 
that a major defense contractor had implemented a process improvement 
program in 1988 and by 1995 had reduced its re-work costs from about 
40 percent of project cost to about 10 percent, increased staff 
productivity by about 170 percent, and reduced defects by about 75 
percent. According to a 1999 SEI report,[Footnote 12] a software 
development contractor reduced its average deviation from estimated 
schedule time from 112 percent to 5 percent between 1988 and 1996. 
During the same period, SEI reported that this contractor reduced its 
average deviation from estimated cost from 87 percent to minus 4 
percent. 

DLA does not currently have a software process improvement program, 
and recent efforts to establish one have not made much progress. We 
recently reported on DOD's software process improvement efforts, 
including those within DLA. Specifically, we reported that before 
1998, DLA had a software process improvement program;[Footnote 13] 
however, DLA eliminated it during a reorganization in 1998. In 
response to our report, DLA's Chief Information Officer said that the 
software process improvement program was to be reestablished during 
fiscal year 2001 and that DLA's goal would be for its system 
developers and acquirers to reach a level 2 on the CMM by fiscal year 
2002. 

To date, DLA has established an integrated product team for software 
process improvement that is tasked to study DLA's software processes 
and, based on this study, to make recommendations on areas in which 
DLA needs to improve. DLA has also dropped its goal of achieving level 
2 by 2002, and it does not intend to specify a CMM level for its 
contractors. The software process improvement team has produced 
several draft papers and a draft policy, but it does not have a plan 
or milestones for achieving software process improvement. According to 
an agency official associated with DLA's process improvement effort, 
funding to develop and implement a software process improvement 
program has not been approved because of other agency IT funding 
priorities, such as BSM. 

Conclusions: 

DLA does not have the institutional management capabilities necessary 
for effectively acquiring quality software repeatedly on one project 
after another. This lack of agencywide consistency in software 
acquisition management controls means that software project success at 
DLA currently depends more on the individuals assigned to a given 
project than on the rules governing how any assigned individuals will 
function. That has proven to be a risky way to manage software-
intensive acquisitions. 

To DLA's benefit, it currently has a model software acquisition 
project (BSM) that, albeit not perfect, is a solid example from which 
to leverage lessons learned and replicate effective software 
acquisition practices across the agency. To do so effectively, 
however, DLA will need to devote adequate resources to correcting the 
weaknesses in the software acquisition processes discussed in this 
report. It will also have to commit the resources needed to implement 
a formal software process improvement program. 

Recommendations for Executive Action: 

To reduce the software acquisition risks associated with its two 
ongoing acquisition projects, we recommend that the Secretary of 
Defense direct the Director of DLA to immediately correct each BSM and 
FAS software-acquisition-practice weakness identified in this report.
	
To ensure that DLA has in place the necessary process controls to 
acquire quality software consistently on future acquisition projects, 
we recommend that the Secretary also direct the DLA Director to: 

* issue a policy requiring that (1) DLA software-intensive acquisition 
projects satisfy all applicable SEI SA-CMM level-2 key process areas 
and the level-3 risk management key process area and (2) DLA software 
contractors have comparable software process maturity levels; and 

* direct the Chief Information Officer (CIO) to establish and sustain 
a software process improvement program, including (1) developing and 
implementing a software process improvement plan that specifies 
measurable goals and milestones, (2) providing adequate resources to 
the program, and (3) reporting to the Director every 6 months on 
progress against plans. 

Agency Comments: 

DOD provided what it termed "official oral comments" from the Deputy 
Under Secretary for Logistics and Materiel Readiness on a draft of 
this report. In its comments, DOD stated that it generally concurred 
with the report and concurred with the recommendations. In particular, 
DOD stated that it will issue policy directives requiring the Director 
of DLA to (1) correct identified software acquisition practice 
weaknesses, except in circumstances in which corrections to past 
events make doing so impractical; (2) implement a plan in all software-
intensive projects to satisfy all applicable SEI SA-CMM level-2 and 
level-3 key process areas, and require all DLA software contractors to 
have comparable software process maturity levels; and (3) establish 
and sustain a software process improvement program that includes a 
plan specifying measurable goals and milestones, provides adequate 
resources, and reports to the Director of DLA every 6 months on 
progress against the plan. 

We are sending copies of this report to the Chairmen and Ranking 
Minority Members of the Senate Appropriations Subcommittee on Defense; 
the Subcommittee on Readiness and Management Support, Senate Committee 
on Armed Services; the House Appropriations Subcommittee on Defense; 
and the Subcommittee on Readiness, House Committee on Armed Services. 
We are also sending copies to the Director, Office of Management and 
Budget; the Under Secretary of Defense for Acquisition and Technology; 
the Deputy Under Secretary of Defense for Logistics and Materiel 
Readiness; and the Director, Defense Logistics Agency. Copies will be 
made available to others upon request. 

If you have any questions regarding this report, please contact me at 
(202) 512-3439 or by e-mail at hiter@gao.gov. An additional GAO 
contact and staff acknowledgments are listed in appendix IV. 

Signed by: 

Randolph C. Hite: 
Director, Information Technology Systems Issues: 

[End of section] 

Appendix I: Objectives, Scope, and Methodology: 

Our objectives were to determine (1) whether the Defense Logistics 
Agency (DLA) has the effective software acquisition processes 
necessary to modernize and maintain systems and (2) what actions DLA 
has planned or in place to improve these processes. 

To determine whether DLA has effective software acquisition processes, 
we applied the Software Engineering Institute's (SEI) Software 
Acquisition Capability Maturity Model using our SEI-trained analysts. 
We focused on the key process areas necessary to obtain a repeatable 
level of maturity, the second level of SEI's five-level model. We also 
evaluated against one level-3 key process area—acquisition risk 
management—because of its importance. We met with project managers 
and project team members to determine whether and to what extent they 
implemented each key practice, and to obtain relevant documentation. 
In accordance with the SEI model, for each key process area we 
reviewed,[Footnote 14] we evaluated DLA's institutional policies and 
practices and compared project-specific guidance and practices against 
the required key practices. 

More specifically, for each key practice we reviewed, we compared 
project-specific documentation and practices against the criteria in 
the software acquisition model. If the project met the criteria for 
the key practice reviewed, we rated it as a strength. If the project 
did not meet the criteria for the key practice reviewed, we rated it 
as a weakness. If the evidence was mixed or inconclusive and did not 
support a rating of either a strength or a weakness, we treated it as 
an observation. If the key practice was not relevant to the project, 
we did not rate it. 

We evaluated DLA's only two software acquisition projects underway at 
the time of our review: the Business Systems Modernization (BSM) and 
the Fuels Automated System (FAS). 

To determine what actions DLA has planned or in place to improve its 
software processes, we identified the group within DLA that is tasked 
with performing this function. We interviewed agency officials who are 
involved in software process improvement, collected data, and analyzed 
draft policies and draft working papers describing planned work. 

We performed our work from May through October 2001, in accordance 
with generally accepted government auditing standards. 

[End of section] 

Appendix II: Results of Software Acquisition Capability Maturity Model 
Evaluation for Business Systems Modernization: 

Table 4: Software Acquisition Planning Findings for BSM: 

Common feature: Commitment 1; 
Key practice: The acquisition organization has a written policy for 
planning the software acquisition; 
Finding: The acquisition organization, which is DLA, has a written 
policy—The Defense Acquisition System (DODD 5000)—for planning the 
software acquisition; 
Rating: Strength[A]. 

Common feature: Commitment 2; 
Key practice: Responsibility for software acquisition planning 
activities is designated; 
Finding: Responsibility for software acquisition planning activities 
is assigned to the BSM program manager; 
Rating: Strength. 

Common feature: Ability 1; 
Key practice: A group that is responsible for planning the software 
acquisition exists; 
Finding: The BSM program office is responsible for planning the 
software acquisition; 
Rating: Strength. 

Common feature: Ability 2; 
Key practice: The acquisition organization provides experienced 
software acquisition management personnel to support project software 
acquisition planning; 
Finding: DLA provides experienced software acquisition management 
personnel to support program software acquisition planning; 
Rating: Strength. 

Common feature: Ability 3; 
Key practice: Adequate resources are provided for software acquisition 
planning activities; 
Finding: According to BSM program officials, adequate resources are 
provided for software acquisition planning activities; 
Rating: Strength. 

Common feature: Activity 1; 
Key practice: Software acquisition planning personnel are involved in 
system acquisition planning; 
Finding: Software acquisition planning personnel are involved in 
system acquisition planning; 
Rating: Strength. 

Common feature: Activity 2; 
Key practice: The project's software acquisition planning is 
accomplished in conjunction with system acquisition planning; 
Finding: The BSM program's software acquisition planning is 
accomplished in conjunction with system acquisition planning; 
Rating: Strength. 

Common feature: Activity 3; 
Key practice: The software acquisition strategy for the project is 
developed and documented; 
Finding: The software acquisition strategy for the program is 
developed and documented in the Acquisition Strategy Plan; 
Rating: Strength. 

Common feature: Activity 4; 
Key practice: Software acquisition planning addresses the elements of 
the software acquisition process; 
Finding: Software acquisition planning addresses the elements of the 
software acquisition process, such as program management, requirements 
development and management, contract tracking and oversight, and 
evaluation; 
Rating: Strength. 

Common feature: Activity 5; 
Key practice: The project's software acquisition planning is 
documented and the planning documentation is maintained over the life 
of the project; 
Finding: The BSM program's software acquisition planning is documented 
and the planning documentation is maintained over the life of the 
program; 
Rating: Strength. 

Common feature: Activity 6; 
Key practice: Life-cycle support of the software is included in 
software acquisition planning documentation; 
Finding: Life-cycle support of the software, such as identifying 
adequate facilities and resources for the software support 
organization, is included in software acquisition planning 
documentation; 
Rating: Strength. 

Common feature: Activity 7; 
Key practice: Life-cycle cost and schedule estimates for the software 
products and services being acquired are prepared and independently 
reviewed; 
Finding: Life-cycle cost and schedule estimates for the software 
products and services being acquired are prepared by the BSM program 
office and independently reviewed by the Naval Center for Cost 
Analysis; 
Rating: Strength. 

Common feature: Measurement 1; 
Key practice: Measurements are made and used to determine the status 
of the software acquisition planning activities and resultant products; 
Finding: Measurements, such as metrics that track software acquisition 
planning activities and compare them to baselines, are made and used 
to determine the status of the software acquisition planning 
activities and resultant products; 
Rating: Strength. 

Common feature: Verification 1; 
Key practice: Software acquisition planning activities are reviewed by 
acquisition organization management on a periodic basis; 
Finding: Software acquisition planning activities are reviewed by the 
DLA Executive Board on a quarterly basis; 
Rating: Strength. 

Common feature: Verification 2; 
Key practice: Software acquisition planning activities are reviewed by 
the project manager on both a periodic and event-driven basis; 
Finding: Software acquisition planning activities are reviewed by the 
program manager on both a weekly and event-driven basis; 
Rating: Strength. 

[A] Strength = Key practice effectively implemented. 

Source: Key practice data from SEI; findings and ratings from GAO. 

[End of table] 

Table 5: Solicitation Findings for BSM: 

Common feature: Commitment 1; 
Key practice: The acquisition organization has a written policy for 
the conduct of the software portion of the solicitation; 
Finding: The BSM program officials stated that The Defense Acquisition 
System (DODD 5000) is the written policy for the conduct of the 
software portion of the solicitation; however, this directive does not 
address the conduct of the software portion of the solicitation; 
Rating: Weakness[A]. 

Common feature: Commitment 2; 
Key practice: Responsibility for the software portion of the 
solicitation is designated; 
Finding: Responsibility for the software portion of the solicitation 
is assigned to the Contracting Officer; 
Rating: Strength. 

Common feature: Commitment 3; 
Key practice: A selection official has been designated to be 
responsible for the selection process and the decision; 
Finding: The Contracting Officer has been designated the selection 
official responsible for the selection process and the decision; 
Rating: Strength. 

Common feature: Ability 1; 
Key practice: A group that is responsible for coordinating
and conducting solicitation activities exists; 
Finding: The BSM Acquisition Integrated Product Team is responsible 
for coordinating and conducting solicitation activities; 
Rating: Strength. 

Common feature: Ability 2; 
Key practice: Adequate resources are provided for solicitation 
activities; 
Finding: According to BSM program officials, adequate resources are 
provided for solicitation activities. Individuals performing 
solicitation activities have experience and receive training; 
Rating: Strength. 

Common feature: Ability 3; 
Key practice: Individuals performing solicitation activities have 
experience or receive training; 
Finding: Individuals performing solicitation activities have 
experience or receive training; 
Rating: Strength. 

Common feature: Ability 4; 
Key practice: The groups supporting the solicitation (e.g., end user, 
systems engineering, software support organization, and application 
domain experts) receive orientation on the solicitation's objectives 
and procedures; 
Finding: The groups supporting the solicitation (e.g., end user, 
systems engineering, software support organization, and application 
domain experts) receive orientation on the solicitation's objectives 
and procedures; 
Rating: Strength. 

Common feature: Activity 1; 
Key practice: The project team performs its activities in accordance 
with its documented solicitation plans; 
Finding: The BSM program office performs its activities in accordance 
with its documented solicitation plans; 
Rating: Strength. 

Common feature: Activity 2; 
Key practice: Solicitation activities are conducted in a manner 
compliant with relevant laws, policies, and guidance; 
Finding: Solicitation activities are conducted in a manner compliant 
with relevant laws, policies, and guidance; 
Rating: Strength. 

Common feature: Activity 3; 
Key practice: The software and evaluation requirements are 
incorporated into the solicitation package and resulting contract; 
Finding: The software and evaluation requirements are incorporated 
into the solicitation package and resulting contract; 
Rating: Strength. 

Common feature: Activity 4; 
Key practice: Proposals are evaluated in accordance with documented 
solicitation plans; 
Finding: Proposals are evaluated in accordance with documented 
solicitation plans; 
Rating: Strength. 

Common feature: Activity 5; 
Key practice: Cost and schedule estimates for the software products 
and services being sought are prepared; 
Finding: Cost and schedule estimates for the software products and 
services being sought are prepared; 
Rating: Strength. 

Common feature: Activity 6; 
Key practice: Software cost and schedule estimates are independently 
reviewed for comprehensiveness and realism; 
Finding: Software cost and schedule estimates are independently 
reviewed by the Naval Center for Cost Analysis for comprehensiveness 
and realism; 
Rating: Strength. 

Common feature: Activity 7; 
Key practice: The selection official uses proposal evaluation results 
to support his or her decision to select an offeror; 
Finding: The selection official uses proposal evaluation results to 
support his decision to select an offeror; 
Rating: Strength. 

Common feature: Activity 8; 
Key practice: The project team takes action to ensure the mutual 
understanding of software requirements and plans with the selected 
offeror(s) prior to contract signing; 
Finding: The BSM program office takes actions, such as meetings, e-
mails, and question and answer sessions, to ensure the mutual 
understanding of software requirements and plans with the selected 
offeror(s) prior to contract signing; 
Rating: Strength. 

Common feature: Measurement 1; 
Key practice: Measurements are made and used to determine the status 
of the solicitation activities and resultant products; 
Finding: Measurements, such as metrics that track solicitation 
activities and compare them to baselines, are made and used to 
determine the status of the solicitation activities and resultant 
products; 
Rating: Strength. 
				
Common feature: Verification 1; 
Key practice: Solicitation activities are reviewed by the acquisition 
organization management on a periodic basis; 
Finding: Solicitation activities are reviewed by the DLA Executive 
Board on a quarterly basis; 
Rating: Strength. 

Common feature: Verification 2; 
Key practice: Solicitation activities are reviewed by the project 
manager and designated selection official on both a periodic and 
event-driven basis; 
Finding: Solicitation activities are reviewed by the program manager 
or designated selection official on both a weekly and event-driven 
basis; 
Rating: Strength. 

[A] Weakness = Key practice not effectively implemented or not 
implemented. 

Source: Key practice data from SEI; findings and ratings from GAO. 

[End of table] 

Table 5: Requirements Development and Management Findings for BSM: 

Common feature: Commitment 1; 
Key practice: The acquisition organization has a written policy
for establishing and managing the software-related contractual 
requirements; 
Finding: The acquisition organization, which is DLA, has a written 
policy, the Defense Acquisition System (DODD 5000), for establishing 
and managing the software-related contractual requirements; 
Rating: Strength. 

Common feature: Commitment 2; 
Key practice: Responsibility for requirements development and 
management is designated; 
Finding: Responsibility for requirements development and management is 
assigned to the BSM Core Integrated Product Team; 
Rating: Strength. 

Common feature: Ability 1; 
Key practice: A group that is responsible for performing requirements 
development and management activities exists; 
Finding: The BSM Requirements Development and Management Integrated 
Product Team is responsible for performing requirements development 
and management activities; 
Rating: Strength. 

Common feature: Ability 2; 
Key practice: Adequate resources are provided for requirements 
development and management activities; 
Finding: According to BSM program officials, adequate resources are 
provided for requirements development and management activities; 
Rating: Strength. 

Common feature: Ability 3; 
Key practice: Individuals performing requirements development and 
management activities have experience or receive training; 
Finding: Individuals performing requirements development and 
management activities have experience and receive training; 
Rating: Strength. 

Common feature: Activity 1; 
Key practice: The project team performs its activities in accordance 
with its documented requirements development and management plans; 
Finding: The BSM program does not have documented requirements 
development and management plans; 
Rating: Weakness. 

Common feature: Activity 2;
Key practice: The project team develops, baselines, and maintains 
software-related contractual requirements and places them under change 
control early in the project, but not later than release of the 
solicitation package; 
Finding: The BSM program office team developed, baselined, and 
maintained software-related contractual requirements and placed them 
under change control at the same time the solicitation package was 
released; 
Rating: Strength. 

Common feature: Activity 3;
Key practice: The project team appraises system requirements change 
requests for their impact on the software being acquired; 
Finding: The BSM program office does not appraise system requirements 
change requests for their impact on the software being acquired; 
Rating: Weakness. 

Common feature: Activity 4;
Key practice: The project team appraises all changes to the software-
related contractual requirements for their impact on performance, 
architecture, supportability, system resource utilization, and 
contract schedule and cost; 
Finding: The BSM program office does not appraise all changes to the 
software-related contractual requirements for their impact on 
performance, architecture, supportability, system resource 
utilization, and contract schedule and cost; 
Rating: Weakness. 

Common feature: Activity 5; 
Key practice: Bi-directional traceability between the contractual 
requirements and the contractor's team software work products and 
services is maintained throughout the effort; 
Finding: The BSM program office has a traceability matrix 
that it uses to trace between the contractual requirements and the 
contractor's team software work products and services. The matrix is
maintained throughout the effort; 
Rating: Strength. 

Common feature: Activity 6; 
Key practice: The end user and other affected groups are involved in 
the development of all software-related contractual requirements and 
any subsequent change activity; 
Finding: The end user and other affected groups, such as the program 
management group, change management group, and technical management 
group, are involved in the development of all software-related 
contractual requirements and any subsequent change activity; 
Rating: Strength. 

Common feature: Measurement 1; 
Key practice: Measurements are made and used to determine the status 
of the requirements development and management activities and 
resultant products; 
Finding: Measurements, such as metrics that track 
requirements development and management activities and compare them to 
baselines, are made and used to determine the status of the 
requirements development and management activities and resultant 
products; 
Rating: Strength. 

Common feature: Verification 1; 
Key practice: Requirements development and management activities are 
reviewed by acquisition organization management (and the contractor) 
on a periodic basis; 
Finding: Requirements development and management activities are 
reviewed by the DLA Executive Board on a quarterly basis and by the 
contractor on a weekly basis; 
Rating: Strength. 

Common feature: Verification 2; 
Key practice: Requirements development and management activities are 
reviewed by the project manager on both a periodic and event-driven 
basis; 
Finding: Requirements development and management activities are 
reviewed by the program manager on both a weekly and event-driven 
basis; 
Rating: Strength. 

Source: Key practice data from SEI; findings and ratings from GAO. 

[End of table] 

Table 6: Project Management Findings for BSM: 

Common feature: Commitment 1; 
Key practice: The acquisition organization has a written policy for 
execution of the software project; 
Finding: The acquisition organization, which is DLA, has a written 
policy, the Defense Acquisition System (DODD 5000), for execution of 
the software program; 
Rating: Strength. 

Common feature: Commitment 2; 
Key practice: Responsibility for project management is designated; 
Finding: Responsibility for program management is assigned to the BSM 
program manager; 
Rating: Strength. 

Common feature: Ability 1; 
Key practice: A team that is responsible for performing the project's 
software acquisition management activities exists; 
Finding: The BSM Program Management Office is responsible for 
performing the program's software acquisition management activities; 
Rating: Strength. 

Common feature: Ability 2; 
Key practice: Adequate resources for the project team are provided for 
the duration of the software acquisition project; 
Finding: According to BSM program officials, adequate resources for 
the program team are provided for the duration of the software 
acquisition program; 
Rating: Strength. 

Common feature: Ability 3; 
Key practice: When project trade-offs are necessary, the project 
manager is permitted to alter the performance, cost, or schedule 
software acquisition baseline; 
Finding: When program trade-offs are necessary, the program manager is 
permitted to alter the performance, cost, or schedule software 
acquisition baseline; 
Rating: Strength. 

Common feature: Ability 4; 
Key practice: The project team has experience or receives training in 
project software acquisition management activities; 
Finding: The BSM program office has experience and receives training 
in program software acquisition management activities; 
Rating: Strength. 

Common feature: Activity 1; 
Key practice: The project team performs its activities in accordance 
with its documented software acquisition management plans; 
Finding: The BSM program office performs its activities in accordance 
with its Acquisition Strategy Plan; 
Rating: Strength. 

Common feature: Activity 2; 
Key practice: The roles, responsibilities, and authority for the 
project functions are documented, maintained, and communicated to 
affected groups; 
Finding: The roles, responsibilities, and authority for the program 
functions are documented in the Acquisition Strategy Plan and are 
maintained and communicated to affected groups; 
Rating: Strength. 

Common feature: Activity 3; 
Key practice: The project team's commitments, and changes to 
commitments, are communicated to affected groups; 
Finding: The BSM program office's commitments, and changes to 
commitments, are communicated to affected groups during weekly status 
meetings; 
Rating: Strength. 

Common feature: Activity 4; 
Key practice: The project team tracks the risks associated with cost, 
schedule, resources, and the technical aspects of the project; 
Finding: The BSM program office tracks the risks associated with cost, 
schedule, resources, and the technical aspects of the program; 
Rating: Strength. 

Common feature: Activity 5; 
Key practice: The project team tracks project issues, status, 
execution, funding, and expenditures against project plans and takes 
action; 
Finding: The BSM program office tracks program issues, status, 
execution, funding, and expenditures against program plans and takes 
action; 
Rating: Strength. 

Common feature: Activity 6; 
Key practice: The project team implements a corrective action system 
for the identification, recording, tracking, and correction of 
problems discovered during the software acquisition; 
Finding: The BSM program office implemented a corrective action system 
for the identification, recording, tracking, and correction of 
problems discovered during the software acquisition; 
Rating: Strength. 

Common feature: Activity 7; 
Key practice: The project team keeps its plans current during the life 
of the project as replanning occurs, issues are resolved, requirements 
are changed, and new risks are discovered; 
Finding: The BSM program office keeps its plans current during the 
life of the program as replanning occurs, issues are resolved, 
requirements are changed, and new risks are discovered; 
Rating: Strength. 

Common feature: Measurement 1; 
Key practice: Measurements are made and used to determine the status 
of the project management activities and resultant products; 
Finding: Measurements, such as metrics that track program management 
activities and compare them to baselines, are made and used to 
determine the status of the program management activities and 
resultant products; 
Rating: Strength. 

Common feature: Verification 1; 
Key practice: Project management activities are reviewed by 
acquisition organization management on a periodic basis; 
Finding: Program management activities are reviewed by the DLA 
Executive Board on a quarterly basis; 
Rating: Strength. 

Common feature: Verification 2; 
Key practice: Project management activities are reviewed by the 
project manager on both a periodic and event-driven basis; 
Finding: Program management activities are reviewed by the program 
manager on both a weekly and event-driven basis; 
Rating: Strength. 

Source: Key practice data from SEI; findings and ratings from GAO. 

[End of table] 

Table 7: Contract Tracking and Oversight Findings for BSM: 

Common feature: Commitment 1; 
Key practice: The acquisition organization has a written policy for
the contract tracking and oversight of the contracted software effort; 
Finding: The acquisition organization, which is DLA, has a written 
policy, the Defense Acquisition System (DODD 5000), for the contract 
tracking and oversight of the contracted software effort; 
Rating: Strength. 

Common feature: Commitment 2; 
Key practice: Responsibility for contract tracking and oversight 
activities is designated; 
Finding: Responsibility for contract tracking and oversight activities 
is assigned to the Contract Management Office; 
Rating: Strength. 

Common feature: Commitment 3; 
Key practice: The project team includes contracting specialists in the 
execution of the contract; 
Finding: The BSM program office includes contracting specialists in 
the execution of the contract; 
Rating: Strength. 

Common feature: Ability 1; 
Key practice: A group that is responsible for managing contract 
tracking and oversight activities exists; 
Finding: The BSM Acquisition and Contract Management Integrated 
Product Team is responsible for managing contract tracking and 
oversight activities; 
Rating: Strength. 

Common feature: Ability 2; 
Key practice: Adequate resources are provided for contract tracking 
and oversight activities; 
Finding: According to BSM program officials, adequate resources are 
provided for contract tracking and oversight activities; 
Rating: Strength. 

Common feature: Ability 3; 
Key practice: Individuals performing contract tracking and oversight 
activities have experience or receive training; 
Finding: Individuals performing contract tracking and oversight 
activities have experience and receive training; 
Rating: Strength. 

Common feature: Activity 1; 
Key practice: The project team performs its activities in accordance 
with its documented contract tracking and oversight plans; 
Finding: The BSM program office performs its activities in 
accordance with its documented contract tracking and oversight plans; 
Rating: Strength. 

Common feature: Activity 2; 
Key practice: The project team reviews required contractor software 
planning documents which, when satisfactory, are used to oversee the 
contractor team's software engineering effort; 
Finding: The BSM program office reviews required contractor software 
planning documents such as the program management plan, software risk 
management plan, and subcontract management plan which, when 
satisfactory, it uses to oversee the contractor team's software
engineering effort; 
Rating: Strength. 

Common feature: Activity 3; 
Key practice: The project team conducts periodic reviews and 
interchanges with the contractor team; 
Finding: The BSM program office conducts daily reviews and 
interchanges with the contractor team; 
Rating: Strength. 

Common feature: Activity 4; 
Key practice: The actual cost and schedule of the contractor's 
software engineering effort are compared to planned schedules and 
budgets and issues are identified; 
Finding: The actual cost and schedule of the contractor's 
software engineering effort are compared to planned schedules and 
budgets and issues are identified; 
Rating: Strength. 

Common feature: Activity 5; 
Key practice: The size, critical computer resources, and technical 
activities associated with the contractor team's work products are 
tracked and issues identified; 
Finding: The size, critical computer resources, and technical 
activities associated with the contractor team's work products are 
tracked and issues identified; 
Rating: Strength. 

Common feature: Activity 6; 
Key practice: The project team reviews and tracks the development of 
the software engineering environment required to provide life cycle 
support for the acquired software and issues are identified; 
Finding: The BSM program office reviews and tracks the 
development of the software engineering environment required to 
provide life cycle support for the acquired software and issues
are identified; 
Rating: Strength. 

Common feature: Activity 7; 
Key practice: Any issues found by the project team during contract 
tracking and oversight are recorded in the appropriate corrective 
action system, action taken, and tracked to closure; 
Finding: Any issues found by the BSM program office during 
contract tracking and oversight are recorded in the appropriate 
corrective action system, action taken, and tracked to closure; 
Rating: Strength. 

Common feature: Activity 8; 
Key practice: The project team ensures that changes to the software-
related contractual requirements are coordinated with all affected 
groups and individuals, such as the contracting official, contractor, 
and end user; 
Finding: The BSM program office ensures that changes to the 
software-related contractual requirements are coordinated with all 
affected groups and individuals, such as the contracting official, 
contractor, and end user; 
Rating: Strength. 

Common feature: Measurement 1; 
Key practice: Measurements are made and used to determine the status 
of the contract tracking and oversight activities and resultant 
products; 
Finding: Measurements, such as metrics that track contract tracking 
and oversight activities and compare them to baselines, are made and 
used to determine the status of the contract tracking and oversight 
activities and resultant products; 
Rating: Strength. 

Common feature: Verification 1; 
Key practice: Contract tracking and oversight activities are reviewed 
by acquisition organization management on a periodic basis; 
Finding: Contract tracking and oversight activities are reviewed by 
the DLA Executive Board on a quarterly basis; 
Rating: Strength. 

Common feature: Verification 2; 
Key practice: Contract tracking and oversight activities are reviewed 
by the project manager on both a periodic and event-driven basis; 
Finding: Contract tracking and oversight activities are reviewed by 
the program manager on both a weekly and event-driven basis; 
Rating: Strength. 
		
Source: Key practice data from SEI; findings and ratings from GAO. 

[End of table] 

Table 8: Evaluation Findings for BSM: 

Common feature: Commitment 1; 
Key practice: The acquisition organization has a written policy for 
managing the evaluation of the acquired software products and services; 
Finding: The acquisition organization, which is DLA, has a written 
policy, the Defense Acquisition System (DODD 5000), for managing the 
evaluation of the acquired software products and services; 
Rating: Strength. 

Common feature: Commitment 2; 
Key practice: Responsibility for evaluation activities is designated; 
Finding: Responsibility for evaluation activities is assigned to the 
BSM Program Management Office; 
Rating: Strength. 

Common feature: Ability 1; 
Key practice: A group that is responsible for planning, managing, and 
performing evaluation activities for the project exists; 
Finding: The BSM Test and Evaluation Integrated Product Team is 
responsible for planning, managing, and performing evaluation activities
for the program; 
Rating: Strength. 

Common feature: Ability 2; 
Key practice: Adequate resources are provided for evaluation 
activities; 
Finding: According to BSM program officials, adequate resources are 
provided for evaluation activities; 
Rating: Strength. 

Common feature: Ability 3; 
Key practice: Individuals performing evaluation activities have 
experience or receive training; 
Finding: Individuals performing evaluation activities have experience 
and receive training; 
Rating: Strength. 

Common feature: Ability 4; 
Key practice: Members of the project team and groups supporting the 
software acquisition receive orientation on the objectives of the 
evaluation approach; 
Finding: Members of the BSM program office stated that they received 
orientation on the objectives of the evaluation approach; however, they
could not provide documentation to support this; 
Rating: Observation[A]. 

Common feature: Activity 1; 
Key practice: The project team performs its activities in accordance 
with its documented evaluation plans; 
Finding: The BSM program office performs its activities, such as 
assessing technical risk, reviewing the integration approach, and 
ensuring that resources are sufficient, in accordance with its 
documented evaluation plans; 
Rating: Strength. 

Common feature: Activity 2; 
Key practice: The project's evaluation requirements are developed in 
conjunction with the development of the system or software technical 
requirements; 
Finding: The BSM program's evaluation requirements are developed in 
conjunction with the development of the system technical requirements; 
Rating: Strength. 

Common feature: Activity 3; 
Key practice: The project's evaluation activities are planned to 
minimize duplication and take advantage of all evaluation results, 
where appropriate; 
Finding: The BSM program's evaluation activities, as stated in the 
Test and Evaluation Master Plan, are planned to minimize duplication 
and take advantage of all evaluation results, where appropriate; 
Rating: Strength. 

Common feature: Activity 4; 
Key practice: The project team appraises the contractor team's 
performance over the total period of the contract for compliance with 
requirements; 
Finding: The BSM program team appraises the contractor team's 
performance over the total period of the contract for compliance with
requirements; 
Rating: Strength. 

Common feature: Activity 5; 
Key practice: Planned evaluations are performed on the evolving 
software products and services prior to acceptance for operational use; 
Finding: Planned evaluations are performed on the evolving software 
products and services prior to acceptance for operational use; 
Rating: Strength. 

Common feature: Activity 6; 
Key practice: Results of the evaluations are analyzed and compared to 
the contract's requirements to establish an objective basis to support 
the decision to accept the products and services or to take further 
action; 
Finding: Results of the evaluations are analyzed and compared to the 
contract's requirements to establish an objective basis to support the
decision to accept the products and services or to take further action; 
Rating: Strength. 

Common feature: Measurement 1; 
Key practice: Measurements are made and used to determine the status 
of the evaluation activities and resultant products; 
Finding: Measurements, such as metrics that track evaluation 
activities and compare them to baselines, are made and used to 
determine the status of the evaluation activities and resultant 
products; 
Rating: Strength. 
					
Common feature: Verification 1; 
Key practice: Evaluation activities are reviewed by acquisition 
organization management on a periodic basis; 
Finding: Evaluation activities are reviewed by the DLA 
Executive Board on a quarterly basis; 
Rating: Strength. 

Common feature: Verification 2; 
Key practice: Evaluation activities are reviewed by the project 
manager on both a periodic and event-driven basis; 
Finding: Evaluation activities are reviewed by the program manager on 
both a weekly and event-driven basis; 
Rating: Strength. 

[A] Observation = Key practice evaluated, but the practice cannot be 
rated as either a strength or a weakness because (1) documentation was 
not provided or (2) the practice was only partially implemented. 

Source: Key practice data from SEI; findings and ratings from GAO. 

[End of table] 

Table 9: Acquisition Risk Management Findings for BSM: 

Common feature: Commitment 1; 
Key practice: The acquisition organization has a written policy for 
the management of software acquisition risk; 
Finding: The acquisition organization, which is DLA, has a written 
policy, the Defense Acquisition System (DODD 5000), for the 
management of software acquisition risk; 
Rating: Strength. 

Common feature: Commitment 2; 
Key practice: Responsibility for software acquisition risk management 
activities is designated; 
Finding: Responsibility for software acquisition risk management 
activities is assigned to the Risk Management Office; 
Rating: Strength. 

Common feature: Ability 1; 
Key practice: A group that is responsible for coordinating software 
acquisition risk management activities exists; 
Finding: BSM's Risk and Issue Management Integrated Product Team is 
responsible for coordinating software acquisition risk management 
activities; 
Rating: Strength. 

Common feature: Ability 2; 
Key practice: Adequate resources are provided for software acquisition 
risk management activities; 
Finding: According to BSM program officials, adequate resources are 
provided for software acquisition risk management activities; 
Rating: Strength. 

Common feature: Ability 3; 
Key practice: Individuals performing software acquisition risk 
management activities have experience or receive required training; 
Finding: Individuals performing software acquisition risk management 
activities have experience and receive required training; 
Rating: Strength. 

Common feature: Activity 1; 
Key practice: Software acquisition risk management activities are 
integrated into software acquisition planning; 
Finding: Software acquisition risk management activities are 
integrated into software acquisition planning; 
Rating: Strength. 

Common feature: Activity 2; 
Key practice: The Software Acquisition Risk Management Plan is 
developed in accordance with the project's defined software 
acquisition process; 
Finding: The Acquisition Risk Management Plan is developed in 
accordance with the program's defined software acquisition process; 
Rating: Strength. 

Common feature: Activity 3; 
Key practice: The project team performs its software acquisition risk 
management activities in accordance with its documented plans; 
Finding: The BSM program office performs its software acquisition risk 
management activities in accordance with its documented Acquisition 
Risk Management Plan; 
Rating: Strength. 

Common feature: Activity 4; 
Key practice: The project team encourages and rewards project-wide 
participation in the identification and mitigation of risks; 
Finding: The BSM program office encourages and rewards program-wide 
participation in the identification and mitigation of risks. For 
example, staff who identify risks are publicly commended during weekly 
status meetings; 
Rating: Strength. 

Common feature: Activity 5; 
Key practice: Risk management is conducted as an integral part of the 
solicitation, project performance management, and contract performance 
management processes; 
Finding: Risk management is conducted as an integral part of the 
solicitation, program performance management, and contract performance 
management processes; 
Rating: Strength. 

Common feature: Activity 6; 
Key practice: Software acquisition risks are analyzed, tracked, and 
controlled until mitigated; 
Finding: Software acquisition risks are analyzed, tracked, and 
controlled until mitigated; 
Rating: Strength. 

Common feature: Activity 7; 
Key practice: Project reviews include the status of identified risks; 
Finding: Program reviews include the status of identified risks; 
Rating: Strength. 

Common feature: Measurement 1; 
Key practice: Measurements are made and used to determine the status 
of the acquisition risk management activities and resultant products; 
Finding: Measurements, such as metrics that track identified risks 
from discovery to mitigation to closure, are made and used to 
determine the status of the acquisition risk management activities and 
resultant products; 
Rating: Strength. 

Common feature: Verification 1; 
Key practice: Acquisition risk management activities are reviewed by 
acquisition organization management on a periodic basis; 
Finding: Acquisition risk management activities are reviewed by the 
DLA Executive Board on a quarterly basis; 
Rating: Strength. 

Common feature: Verification 2; 
Key practice: Acquisition risk management activities are reviewed by 
the project manager on both a periodic and event-driven basis; 
Finding: Acquisition risk management activities are reviewed by the 
program manager on both a weekly and event-driven basis; 
Rating: Strength. 
				
Source: Key practice data from SEI; findings and ratings from GAO. 

[End of table] 

[End of section] 

Appendix III: Results of Software Acquisition Capability Maturity 
Model Evaluation for Fuels Automated System: 

Table 10: Software Acquisition Planning Findings for FAS: 

Common feature: Commitment 1; 
Key practice: The acquisition organization has a written policy for 
planning the software acquisition; 
Finding: The acquisition organization, which is DLA, has a written 
policy, the Defense Acquisition System (DODD 5000), for planning the 
software acquisition; 
Rating: Strength[A]. 
		
Common feature: Commitment 2; 
Key practice: Responsibility for software acquisition planning
activities is designated; 
Finding: Responsibility for software acquisition planning activities 
is assigned to the FAS program manager; 
Rating: Strength. 

Common feature: Ability 1; 
Key practice: A group that is responsible for planning the software 
acquisition exists; 
Finding: The FAS program office is responsible for planning the 
software acquisition; 
Rating: Strength. 

Common feature: Ability 2; 
Key practice: The acquisition organization provides experienced 
software acquisition management personnel to support project software 
acquisition planning; 
Finding: DLA provides experienced software acquisition management 
personnel to support program software acquisition planning; 
Rating: Strength. 

Common feature: Ability 3; 
Key practice: Adequate resources are provided for software acquisition 
planning activities; 
Finding: According to FAS program officials, adequate resources are 
not provided for software acquisition planning activities; 
Rating: Weakness[B]. 

Common feature: Activity 1; 
Key practice: Software acquisition planning personnel are involved in 
system acquisition planning; 
Finding: Software acquisition planning personnel are involved in 
system acquisition planning; 
Rating: Strength. 

Common feature: Activity 2; 
Key practice: The project's software acquisition planning is 
accomplished in conjunction with system acquisition planning; 
Finding: The program's software acquisition planning is accomplished 
in conjunction with system acquisition planning; 
Rating: Strength. 

Common feature: Activity 3; 
Key practice: The software acquisition strategy for the project is 
developed and documented; 
Finding: The software acquisition strategy for the program is 
developed and documented in the Acquisition Strategy Plan; 
Rating: Strength. 

Common feature: Activity 4; 
Key practice: Software acquisition planning addresses the elements of 
the software acquisition process; 
Finding: Software acquisition planning addresses the elements of the 
software acquisition process, such as program management, requirements 
development and management, contract tracking and oversight, and 
evaluation; 
Rating: Strength. 

Common feature: Activity 5; 
Key practice: The project's software acquisition planning is 
documented and the planning documentation is maintained over the life 
of the project; 
Finding: The program's software acquisition planning is documented; 
however, there is no evidence that the planning documentation is
maintained over the life of the program; 
Rating: Observation[C]. 

Common feature: Activity 6; 
Key practice: Life-cycle support of the software is included in 
software acquisition planning documentation; 
Finding: Life-cycle support of the software, such as identifying 
adequate facilities and resources for the software support 
organization, is included in software acquisition planning 
documentation; 
Rating: Strength. 

Common feature: Activity 7; 
Key practice: Life-cycle cost and schedule estimates for the
software products and services being acquired are prepared and 
independently reviewed; 
Finding: Life-cycle cost and schedule estimates for the software 
products and services being acquired are prepared and independently
reviewed; 
Rating: Strength. 

Common feature: Measurement 1; 	
Key practice: Measurements are made and used to determine the status 
of the software acquisition planning activities and resultant products; 
Finding: Measurements are not made and used to determine the status of 
the software acquisition planning activities and resultant products; 
Rating: Weakness. 

Common feature: Verification 1; 
Key practice: Software acquisition planning activities are reviewed by 
acquisition organization management on a periodic basis; 
Finding: Software acquisition planning activities are reviewed by the 
DLA Executive Board on a quarterly basis; 
Rating: Strength. 

Common feature: Verification 2; 
Key practice: Software acquisition planning activities are reviewed by 
the project manager on both a periodic and event-driven basis; 
Finding: Software acquisition planning activities are reviewed by the 
program manager on a daily basis; 
Rating: Strength. 
					
[A] Strength = Key practice effectively implemented. 

[B] Weakness = Key practice not effectively implemented or not 
implemented. 

[C] Observation = Key practice evaluated, but the practice cannot be 
rated as either a strength or a weakness because (1) documentation was 
not provided or (2) the practice was only partially implemented. 

Source: Key practice data from SEI; findings and ratings from GAO. 

[End of table] 

Table 11: Requirements Development and Management Findings for FAS: 

Common feature: Commitment 1; 
Key practice: The acquisition organization has a written policy for 
establishing and managing the software-related contractual 
requirements; 
Finding: The acquisition organization, which is DLA, has a written 
policy, the Defense Acquisition System (DODD 5000), for establishing 
and managing the software-related contractual requirements; 
Rating: Strength. 

Common feature: Commitment 2; 
Key practice: Responsibility for requirements development and 
management is designated; 
Finding: Responsibility for requirements development and management is 
assigned to the FAS program manager; 
Rating: Strength. 

Common feature: Ability 1; 
Key practice: A group that is responsible for performing requirements 
development and management activities exists; 
Finding: The Product Assurance Group is responsible for performing 
requirements development and management activities; 
Rating: Strength. 

Common feature: Ability 2; 
Key practice: Adequate resources are provided for requirements 
development and management activities; 
Finding: According to FAS program officials, adequate resources are 
not provided for requirements development and management activities; 
Rating: Weakness. 

Common feature: Ability 3; 
Key practice: Individuals performing requirements development and 
management activities have experience or receive training; 
Finding: FAS program officials said that individuals performing 
requirements development and management activities have experience and
receive training. However, they could not provide documents to support 
this; 
Rating: Observation. 

Common feature: Activity 1; 
Key practice: The project team performs its activities in accordance 
with its documented requirements development and management plans; 
Finding: The FAS program does not have documented requirements 
development and management plans; 
Rating: Weakness. 

Common feature: Activity 2; 
Key practice: The project team develops, baselines, and maintains 
software-related contractual requirements and places them under change
control early in the project, but not later than release of the 
solicitation package; 
Finding: The FAS program office did not develop, baseline, and 
maintain software-related contractual requirements and place them 
under change control before the contract was awarded; 
Rating: Weakness. 

Common feature: Activity 3; 
Key practice: The project team appraises system requirements change 
requests for their impact on the software being acquired; 
Finding: The FAS program office does not appraise system requirements 
change requests for their impact on the software being acquired; 
Rating: Weakness. 

Common feature: Activity 4; 
Key practice: The project team appraises all changes to the software-
related contractual requirements for their impact on performance, 
architecture, supportability, system resource utilization, and 
contract schedule and cost; 
Finding: The FAS program office does not appraise changes to the 
software-related contractual requirements for their impact on 
performance, architecture, supportability, system resource 
utilization, and contract schedule and cost; 
Rating: Weakness. 

Common feature: Activity 5; 
Key practice: Bi-directional traceability between the contractual 
requirements and the contractor’s team software work products and 
services is maintained throughout the effort; 
Finding: The FAS program office has a traceability matrix that it uses 
to trace between the contractual requirements and the contractor’s 
team software work products and services. The matrix is maintained 
throughout the effort; 
Rating: Strength. 

Common feature: Activity 6; 
Key practice: The end user and other affected groups are involved in 
the development of all software-related contractual requirements and any
subsequent change activity; 
Finding: The end user and other affected groups are involved in the 
development of all software-related contractual requirements; however, 
the team could not provide evidence of how affected groups were 
involved in changes to software requirements; 
Rating: Observation. 

Common feature: Measurement 1; 
Key practice: Measurements are made and used to determine the status 
of the requirements development and management activities and 
resultant products; 
Finding: Measurements are not made and used to determine the status of 
the requirements development and management activities and resultant 
products; 
Rating: Weakness. 

Common feature: Verification 1; 
Key practice: Requirements development and management activities are 
reviewed by acquisition organization management (and the contractor)
on a periodic basis; 
Finding: Requirements development and management activities are 
reviewed by the DLA Executive Board on a quarterly basis; 
Rating: Strength. 

Common feature: Verification 2; 
Key practice: Requirements development and management activities are 
reviewed by the project manager on both a periodic and event-driven 
basis; 
Finding: Requirements development and management activities are 
reviewed by the program manager on a daily basis; 
Rating: Strength. 

Source: Key practice data from SEI; findings and ratings from GAO. 

[End of table] 

Table 12: Project Management Findings for FAS: 

Common feature: Commitment 1; 
Key practice: The acquisition organization has a written policy for 
execution of the software project; 
Finding: The acquisition organization, which is DLA, has a written 
policy, the Defense Acquisition System (DODD 5000), for execution of the 
software program; 
Rating: Strength. 

Common feature: Commitment 2; 
Key practice: Responsibility for project management is designated; 
Finding: Responsibility for program management is assigned to the FAS 
program manager; 
Rating: Strength. 

Common feature: Ability 1; 
Key practice: A team that is responsible for performing the project's 
software acquisition management activities exists; 
Finding: The FAS program office is responsible for performing the 
program's software acquisition management activities; 
Rating: Strength. 

Common feature: Ability 2; 
Key practice: Adequate resources for the project team are provided for 
the duration of the software acquisition project; 
Finding: According to FAS program officials, adequate resources for 
the program team are not provided for the duration of the software
acquisition program; 
Rating: Weakness. 

Common feature: Ability 3; 
Key practice: When project trade-offs are necessary, the project 
manager is permitted to alter the performance, cost, or schedule 
software acquisition baseline; 
Finding: When program trade-offs are necessary, the program manager is 
permitted to alter the performance, cost, or schedule software 
acquisition baseline; 
Rating: Strength. 

Common feature: Ability 4; 
Key practice: The project team has experience or receives training in 
project software acquisition management activities; 
Finding: The FAS program office receives training in program software 
acquisition management activities; 
Rating: Strength. 

Common feature: Activity 1; 
Key practice: The project team performs its activities in accordance 
with its documented software acquisition management plans; 
Finding: The FAS program office performs its activities in accordance 
with its documented Acquisition Strategy Plan; 
Rating: Strength. 

Common feature: Activity 2; 
Key practice: The roles, responsibilities, and authority for the 
project functions are documented, maintained, and communicated to 
affected groups; 
Finding: The roles, responsibilities, and authority for the program 
functions are not documented, maintained, and communicated to affected
groups; 
Rating: Weakness. 

Common feature: Activity 3; 
Key practice: The project team's commitments, and changes to 
commitments, are communicated to affected groups; 
Finding: The FAS program office's commitments, and changes to 
commitments, are communicated to affected groups during weekly status
meetings; 
Rating: Strength. 

Common feature: Activity 4; 
Key practice: The project team tracks the risks associated with cost, 
schedule, resources, and the technical aspects of the project; 
Finding: The FAS program office does not track the risks associated 
with cost, schedule, resources, and the technical aspects of the 
program; 
Rating: Weakness. 

Common feature: Activity 5; 
Key practice: The project team tracks project issues, status, 
execution, funding, and expenditures against project plans and takes 
action; 
Finding: The FAS program office does not track program issues, status, 
execution, funding, and expenditures against program plans and take 
action; 
Rating: Weakness. 

Common feature: Activity 6; 
Key practice: The project team implements a corrective action system 
for the identification, recording, tracking, and correction of 
problems discovered during the software acquisition; 
Finding: The FAS program office implemented a corrective action system 
for the identification, recording, tracking, and correction of problems
discovered during the software acquisition; 
Rating: Strength. 

Common feature: Activity 7; 
Key practice: The project team keeps its plans current during the life 
of the project as replanning occurs, issues are resolved, requirements 
are changed, and new risks are discovered; 
Finding: The FAS program office has not kept its plans current during 
the life of the program; 
Rating: Weakness. 

Common feature: Measurement 1; 
Key practice: Measurements are made and used to determine the status 
of the project management activities and resultant products; 
Finding: Measurements are not made and used to determine the status of 
the program management activities and resultant products; 
Rating: Weakness. 

Common feature: Verification 1; 
Key practice: Project management activities are reviewed by 
acquisition organization management on a periodic basis; 
Finding: Program management activities are reviewed by the DLA 
Executive Board on a quarterly basis; 
Rating: Strength. 

Common feature: Verification 2; 
Key practice: Project management activities are reviewed by the 
project manager on both a periodic and event-driven basis; 
Finding: Program management activities are reviewed by the program 
manager on a daily basis; 
Rating: Strength. 

Source: Key practice data from SEI; findings and ratings from GAO. 

[End of table] 

Table 13: Contract Tracking and Oversight Findings for FAS: 

Common feature: Commitment 1; 
Key practice: The acquisition organization has a written policy for 
the contract tracking and oversight of the contracted software effort; 
Finding: The acquisition organization, which is DLA, has a written 
policy, the Defense Acquisition System (DODD 5000), for the contract 
tracking and oversight of the contracted software effort; 
Rating: Strength. 

Common feature: Commitment 2; 
Key practice: Responsibility for contract tracking and oversight 
activities is designated; 
Finding: Responsibility for contract tracking and oversight activities 
is assigned to the contracting officer's technical representative; 
Rating: Strength. 

Common feature: Commitment 3; 
Key practice: The project team includes contracting specialists in the 
execution of the contract; 
Finding: The FAS program office includes contracting specialists in 
the execution of the contract; 
Rating: Strength. 

Common feature: Ability 1; 
Key practice: A group that is responsible for managing contract 
tracking and oversight activities exists; 
Finding: The FAS program office is responsible for managing contract 
tracking and oversight activities; 
Rating: Strength. 

Common feature: Ability 2; 
Key practice: Adequate resources are provided for contract tracking 
and oversight activities; 
Finding: According to FAS program officials, adequate resources are 
not provided for contract tracking and oversight activities; 
Rating: Weakness. 

Common feature: Ability 3; 
Key practice: Individuals performing contract tracking and oversight 
activities have experience or receive training; 
Finding: Individuals performing contract tracking and oversight 
activities have experience; 
Rating: Strength. 

Common feature: Activity 1; 
Key practice: The project team performs its activities in accordance 
with its documented contract tracking and oversight plans; 
Finding: The FAS program office does not have a contract tracking and 
oversight plan; 
Rating: Weakness. 

Common feature: Activity 2; 
Key practice: The project team reviews required contractor software 
planning documents which, when satisfactory, are used to oversee the 
contractor team's software engineering effort; 
Finding: Although FAS program officials indicate that they review many 
of the program's planning documents, they could not provide evidence 
that these reviews take place; 
Rating: Observation. 

Common feature: Activity 3; 
Key practice: The project team conducts periodic reviews and 
interchanges with the contractor team; 
Finding: The FAS program team conducts periodic reviews and interchanges 
with the contractor team; 
Rating: Strength. 

Common feature: Activity 4; 
Key practice: The actual cost and schedule of the contractor's 
software engineering effort are compared to planned schedules and 
budgets and issues are identified; 
Finding: The actual cost and schedule of the contractor's software 
engineering effort are not compared to planned schedules and budgets 
and issues are not identified; 
Rating: Weakness. 

Common feature: Activity 5; 
Key practice: The size, critical computer resources, and technical 
activities associated with the contractor team's work products are 
tracked, and issues identified; 
Finding: The size, critical computer resources, and technical 
activities associated with the contractor team's work products are 
tracked, and issues identified; 
Rating: Strength. 

Common feature: Activity 6; 
Key practice: The project team reviews and tracks the development of 
the software engineering environment required to provide life-cycle 
support for the acquired software and issues are identified; 
Finding: The FAS program office reviews and tracks the development of 
the software engineering environment required to provide life-cycle 
support for the acquired software and issues are identified; 
Rating: Strength. 

Common feature: Activity 7; 
Key practice: Any issues found by the project team during contract 
tracking and oversight are recorded in the appropriate corrective 
action system, action taken, and tracked to closure; 
Finding: Issues found by the FAS program office during contract tracking and 
oversight are recorded in the appropriate corrective action system,
action taken, and tracked to closure; 
Rating: Strength. 

Common feature: Activity 8; 
Key practice: The project team ensures that changes to the software-
related contractual requirements are coordinated with all affected 
groups and individuals, such as the contracting official, contractor, 
and end user; 
Finding: The program team does not ensure that changes to the software-
related contractual requirements are coordinated with all affected
groups and individuals, such as the contracting official, contractor, 
and end user; 
Rating: Weakness. 

Common feature: Measurement 1; 
Key practice: Measurements are made and used to determine the status 
of the contract tracking and oversight activities and resultant 
products; 
Finding: Measurements are not made and used to determine the status of 
the contract tracking and oversight activities and resultant products; 
Rating: Weakness. 

Common feature: Verification 1; 
Key practice: Contract tracking and oversight activities are reviewed 
by acquisition organization management on a periodic basis; 
Finding: Contract tracking and oversight activities are 
reviewed by the DLA Executive Board on a quarterly basis; 
Rating: Strength. 

Common feature: Verification 2; 
Key practice: Contract tracking and oversight activities are reviewed 
by the project manager on both a periodic and event-driven basis; 
Finding: Contract tracking and oversight activities are 
reviewed by the program manager on a daily basis; 
Rating: Strength. 
		
Source: Key practice data from SEI; findings and ratings from GAO. 

[End of table] 

Table 14: Evaluation Findings for FAS: 

Common feature: Commitment 1; 
Key practice: The acquisition organization has a written policy for 
managing the evaluation of the acquired software products and services; 
Finding: The acquisition organization, which is DLA, has a written 
policy, the Defense Acquisition System (DODD 5000), for managing the 
evaluation of the acquired software products and services; 
Rating: Strength. 

Common feature: Commitment 2; 
Key practice: Responsibility for evaluation activities is designated; 
Finding: Responsibility for evaluation activities is assigned to the 
FAS Product Assurance Office; 
Rating: Strength. 

Common feature: Ability 1; 
Key practice: A group that is responsible for planning, managing, and 
performing evaluation activities for the project exists; 
Finding: The FAS Working Level Test and Evaluation Integrated Product 
Team is responsible for planning, managing, and performing evaluation 
activities for the program; 
Rating: Strength. 

Common feature: Ability 2; 
Key practice: Adequate resources are provided for evaluation 
activities; 
Finding: According to FAS program officials, adequate resources are 
not provided for evaluation activities; 
Rating: Weakness. 

Common feature: Ability 3; 
Key practice: Individuals performing evaluation activities have
experience or receive training; 
Finding: Although FAS program officials said individuals performing 
evaluation activities have experience or receive training, they could 
not provide documents to support this; 
Rating: Observation. 

Common feature: Ability 4; 
Key practice: Members of the project team and groups supporting the 
software acquisition receive orientation on the objectives of the 
evaluation approach; 
Finding: Members of the program team and groups supporting the 
software acquisition received orientation on the objectives of the 
evaluation approach; 
Rating: Strength. 

Common feature: Activity 1; 
Key practice: The project team performs its activities in accordance 
with its documented evaluation plans; 
Finding: The FAS program office performs its activities in accordance 
with its Testing and Evaluation Master Plan; 
Rating: Strength. 

Common feature: Activity 2; 
Key practice: The project’s evaluation requirements are developed in 
conjunction with the development of the system or software technical 
requirements; 
Finding: The FAS program’s evaluation requirements were developed in 
conjunction with the development of the system technical requirements; 
Rating: Strength. 

Common feature: Activity 3; 
Key practice: The project’s evaluation activities are planned to
minimize duplication and take advantage of all evaluation results, 
where appropriate; 
Finding: The FAS program’s evaluation activities, as stated in the 
Testing and Evaluation Master Plan, are planned to minimize 
duplication and take advantage of all evaluation results, where 
appropriate; 
Rating: Strength. 

Common feature: Activity 4; 
Key practice: The project team appraises the contractor team’s
performance over the total period of the contract for compliance with 
requirements; 
Finding: FAS program officials said that they appraise the contractor 
team’s performance over the total period of the contract for compliance
with requirements. However, they could not provide evidence to support 
this; 
Rating: Observation. 

Common feature: Activity 5; 
Key practice: Planned evaluations are performed on the evolving 
software products and services prior to acceptance for operational use; 
Finding: The FAS program office plans to perform evaluations prior to 
operational use; 
Rating: Not rated. 

Common feature: Activity 6; 
Key practice: Results of the evaluations are analyzed and compared 
with the contract’s requirements to establish an objective basis to 
support the decision to accept the products and services or to take 
further action; 
Finding: The FAS program office has done some evaluations and will 
finish in August 2001. At that time, the results of the evaluations 
will be analyzed and compared with the contract’s requirements to 
establish an objective basis to support the decision to accept the 
products and services or to take further action; 
Rating: Not rated. 

Common feature: Measurement 1; 
Key practice: Measurements are made and used to determine the status 
of the evaluation activities and resultant products; 
Finding: Measurements are not made and used to determine the status of 
the evaluation activities and resultant products; 
Rating: Weakness. 

Common feature: Verification 1; 
Key practice: Evaluation activities are reviewed by acquisition 
organization management on a periodic basis; 
Finding: Evaluation activities are reviewed by the DLA 
Executive Board on a quarterly basis; 
Rating: Strength. 

Common feature: Verification 2; 
Key practice: Evaluation activities are reviewed by the project 
manager on both a periodic and event-driven basis; 
Finding: Evaluation activities are reviewed by the program manager on 
a daily basis; 
Rating: Strength. 

Source: Key practice data from SEI; findings and ratings from GAO. 

[End of table] 

Table 15: Acquisition Risk Management Findings for FAS: 

Common feature: Commitment 1; 
Key practice: The acquisition organization has a written policy
for the management of software acquisition risk; 
Finding: The acquisition organization, which is DLA, has a written 
policy, the Defense Acquisition System (DODD 5000), for the management 
of software acquisition risk; 
Rating: Strength. 

Common feature: Commitment 2; 
Key practice: Responsibility for software acquisition risk
management activities is designated; 
Finding: Responsibility for software acquisition risk management 
activities is assigned to the FAS program office; 
Rating: Strength. 

Common feature: Ability 1; 
Key practice: A group that is responsible for coordinating software 
acquisition risk management activities exists; 
Finding: The Risk Review Board is responsible for coordinating 
software acquisition risk management activities; 
Rating: Strength. 

Common feature: Ability 2; 
Key practice: Adequate resources are provided for software acquisition 
risk management activities; 
Finding: According to FAS program officials, adequate resources are 
not provided for software acquisition risk management activities; 
Rating: Weakness. 

Common feature: Ability 3; 
Key practice: Individuals performing software acquisition risk 
management activities have experience or receive required training; 
Finding: The FAS program office stated that individuals performing 
acquisition risk management activities have experience; however, they 
could not provide us with evidence; 
Rating: Observation. 

Common feature: Activity 1; 
Key practice: Software acquisition risk management activities are 
integrated into software acquisition planning; 
Finding: Software acquisition risk management activities are not 
integrated into software acquisition planning; 
Rating: Weakness. 

Common feature: Activity 2; 
Key practice: The Software Acquisition Risk Management Plan is 
developed in accordance with the project's defined software 
acquisition process; 
Finding: The Software Acquisition Risk Management Plan was not 
developed in accordance with the program's defined software 
acquisition process; 
Rating: Weakness. 

Common feature: Activity 3; 
Key practice: The project team performs its software acquisition risk 
management activities in accordance with its documented plans; 
Finding: The FAS program office does not perform software acquisition 
risk management activities; 
Rating: Weakness. 

Common feature: Activity 4; 
Key practice: The project team encourages and rewards project-wide 
participation in the identification and mitigation of risks; 
Finding: The FAS program office does not encourage and reward program-
wide participation in the identification and mitigation of risks; 
Rating: Weakness. 

Common feature: Activity 5; 
Key practice: Risk management is conducted as an integral part of the 
solicitation, project performance management, and contract performance 
management processes; 
Finding: Risk management is not conducted as an integral part of the 
solicitation, program performance management, and contract performance 
management processes; 
Rating: Weakness. 

Common feature: Activity 6; 
Key practice: Software acquisition risks are analyzed, tracked, and 
controlled until mitigated; 
Finding: Software acquisition risks are not analyzed, tracked, and 
controlled until mitigated (an illustrative sketch of such a tracking 
mechanism follows this table); 
Rating: Weakness. 

Common feature: Activity 7; 
Key practice: Project reviews include the status of identified risks; 
Finding: Meeting minutes of program reviews do not include the status 
of identified risks; 
Rating: Weakness. 

Common feature: Measurement 1; 
Key practice: Measurements are made and used to determine the status 
of the acquisition risk management activities and resultant products; 
Finding: Measurements are not made and used to determine the status of 
the acquisition risk management activities and resultant products; 
Rating: Weakness. 

Common feature: Verification 1; 
Key practice: Acquisition risk management activities are reviewed by 
acquisition organization management on a periodic basis; 
Finding: Acquisition risk management activities are not reviewed by 
acquisition organization management; 
Rating: Weakness. 

Common feature: Verification 2; 
Key practice: Acquisition risk management activities are reviewed by 
the project manager on both a periodic and event-driven basis; 
Finding: Acquisition risk management activities are not reviewed by 
the program manager; 
Rating: Weakness. 

Source: Key practice data from SEI; findings and ratings from GAO. 

[End of table] 
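
Several of the weaknesses above concern risks that are not analyzed, 
tracked, and controlled until mitigated, and whose status is neither 
measured nor reviewed. The following Python sketch is purely 
illustrative; it is not taken from DLA or FAS documentation, and all 
names in it (Risk, exposure, status_measurements) are hypothetical. It 
shows a minimal risk register in which each risk carries an analyzed 
exposure, accumulates a dated tracking history, remains open until 
explicitly mitigated, and feeds simple status measurements that a 
program manager could review on a periodic basis. 

# Illustrative sketch only; all names and structure are hypothetical.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Risk:
    risk_id: str
    description: str
    probability: float   # estimated likelihood, 0.0 to 1.0
    impact: int          # relative consequence, 1 (low) to 5 (high)
    mitigated: bool = False
    history: list = field(default_factory=list)  # dated status notes

    @property
    def exposure(self):
        # Analysis: a common measure weighting likelihood by consequence.
        return self.probability * self.impact

    def track(self, note):
        # Tracking: record a dated status note against the risk.
        self.history.append((date.today(), note))

    def close(self, note):
        # Control: a risk is closed only when explicitly mitigated.
        self.track(note)
        self.mitigated = True

def status_measurements(register):
    """Measurements management might review on a periodic basis."""
    open_risks = [r for r in register if not r.mitigated]
    return {
        "total": len(register),
        "open": len(open_risks),
        "highest_open_exposure":
            max((r.exposure for r in open_risks), default=0.0),
    }

register = [Risk("R-01", "Contractor staffing shortfall", 0.4, 4)]
register[0].track("Mitigation plan drafted")
print(status_measurements(register))

[End of illustrative sketch] 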

[End of section] 

Appendix IV: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Carl Urie (202) 512-6231. 

Acknowledgments: 

In addition to the individual named above, key contributors to this 
report were Suzanne Burns, Yvette Banks, Niti Bery, Sophia Harrison, 
Madhav Panwar, and Teresa Tucker. 

[End of section] 

Footnotes: 

[1] Floyd D. Spence National Defense Authorization Act for Fiscal Year 
2001, P.L. 106-398 app., section 917. 

[2] Capability Maturity Model is the service mark of Carnegie Mellon 
University, and CMM is registered with the U.S. Patent and Trademark 
Office. GAO used the Software Acquisition Capability Maturity Model, 
Version 1.2 (CMU/SEI-99-TR-002, April 1999), the latest version of the 
model. 

[3] The six key process areas that we evaluated are software 
acquisition planning, solicitation, requirements development and 
management, project management, contract tracking and oversight, and 
evaluation. We did not evaluate DLA against the seventh key process 
area, transition to support, because the contractors who are 
implementing the systems we evaluated will also support the systems 
when they are operational, rendering transition to support irrelevant 
for these acquisitions. 

[4] DOD Information Technology: Software and Systems Process 
Improvement Programs Vary in Use of Best Practices (GAO-01-116, March 
30, 2001). 

[5] Consumable items include such commodities as subsistence (food), 
fuels, medical supplies, clothing, and construction equipment. 

[6] These repair items are spare and repair parts that support about 
1,400 DOD weapons systems. Each of the military services also manages 
its own service-unique repair items. 

[7] "Strategic materiel" is defined as any item needed to sustain the 
United States in the event of a national emergency. 

[8] DLA defines "item cataloging" to include all activities that 
describe the technical characteristics and data for an individual item 
of supply. 

[9] We did not evaluate BSM against the transition-to-support key 
process area because the contractor who is implementing BSM will also 
support this system when it is operational, rendering transition to 
support irrelevant. 

[10] We did not evaluate FAS on solicitation because it was a sole-
source purchase, or on transition to support because the contractor 
who is implementing FAS will also support this system when it is 
operational, rendering transition to support irrelevant. 

[11] Technical report CMU/SEI-95-TR-017, November 1995. 

[12] Technical Report CMU/SEI-99-TR-027, November 1999. 

[13] GAO-01-116, March 30, 2001. 

[14] We evaluated BSM in six of the seven level-2 key process areas—
software acquisition planning, solicitation, requirements development 
and management, project management, contract tracking and oversight, 
and evaluation. We evaluated FAS in five of the seven level-2 key 
process areas, as listed above, except for solicitation. We did not 
evaluate FAS on solicitation because it is a sole-source procurement. 
We did not evaluate BSM or FAS on the seventh key process area, 
transition to support, because the contractors who are implementing 
these systems will also support the systems when they are operational, 
rendering transition to support irrelevant. We also evaluated BSM and 
FAS on one level-3 key process area—acquisition risk management. 

[End of section] 

GAO’s Mission: 

The General Accounting Office, the investigative arm of Congress, 
exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO’s commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO’s Web site [hyperlink, 
http://www.gao.gov] contains abstracts and full-text files of current 
reports and testimony and an expanding archive of older products. The 
Web site features a search engine to help you locate documents using 
key words and phrases. You can print these documents in their entirety, 
including charts and other graphics. 

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as “Today’s Reports,” on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
[hyperlink, http://www.gao.gov] and select “Subscribe to daily E-mail 
alert for newly released products” under the GAO Reports heading. 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. General Accounting Office: 
441 G Street NW, Room LM: 
Washington, D.C. 20548: 

To order by Phone: 
Voice: (202) 512-6000: 
TDD: (202) 512-2537: 
Fax: (202) 512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs Contact:
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Public Affairs: 

Jeff Nelligan, managing director, NelliganJ@gao.gov: 
(202) 512-4800: 
U.S. General Accounting Office: 
441 G Street NW, Room 7149:
Washington, D.C. 20548: