This is the accessible text file for GAO report number GAO-11-84 
entitled 'Defense Management: DOD Has a Rigorous Process to Select 
Corrosion Prevention Projects, but Would Benefit from Clearer Guidance 
and Validation of Returns on Investment' which was released on 
December 8, 2010. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as 
part of a longer term project to improve GAO products' accessibility. 
Every attempt has been made to maintain the structural and data 
integrity of the original printed product. Accessibility features, 
such as text descriptions of tables, consecutively numbered footnotes 
placed at the end of the file, and the text of agency comment letters, 
are provided but may not exactly duplicate the presentation or format 
of the printed version. The portable document format (PDF) file is an 
exact electronic replica of the printed version. We welcome your 
feedback. Please E-mail your comments regarding the contents or 
accessibility features of this document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

United States Government Accountability Office: 
GAO: 

Report to the Subcommittee on Defense, Committee on Appropriations, 
U.S. Senate: 

December 2010: 

Defense Management: 

DOD Has a Rigorous Process to Select Corrosion Prevention Projects, 
but Would Benefit from Clearer Guidance and Validation of Returns on 
Investment: 

GAO-11-84: 

GAO Highlights: 

Highlights of GAO-11-84, a report to the Subcommittee on Defense, 
Committee on Appropriations, U.S. Senate. 

Why GAO Did This Study: 

Corrosion costs DOD over $23 billion annually, affects both equipment 
and facilities, and threatens personnel safety. DOD has taken steps to 
improve its corrosion prevention and control (CPC) efforts. These 
efforts include reorganizing the DOD-wide Corrosion Office and 
instituting Corrosion Executive positions in each of the military 
departments. In response to the Senate Appropriations Committee Report 
accompanying the fiscal year 2010 DOD appropriations bill, GAO 
evaluated to what extent (1) the Corrosion Executives are involved in 
preparing CPC project proposals for submission, (2) the Corrosion 
Office has created a process to review and select projects for 
funding, and (3) the military departments have validated the return on 
investment (ROI) for funded projects. GAO also reviewed the process 
the Corrosion Office uses to determine the CPC activities that it will 
fund. To carry out this study, GAO observed project selection panel 
meetings, interviewed corrosion officials, and reviewed documents and 
project proposals. 

What GAO Found: 

The acceptance of the military departments' CPC proposals varied with 
the types of projects proposed and the nature of the review that the 
military departments' Corrosion Executives required before the 
proposals were submitted to the Corrosion Office for funding 
consideration. DOD 
guidance provides that Corrosion Executives coordinate CPC actions, 
including submitting corrosion project opportunities. Prior to 
submitting the proposals for a preliminary evaluation by the Corrosion 
Office’s project selection panel, Army and Navy Corrosion Executives 
and staffs reviewed proposal summaries and provided feedback to the 
authors. The Air Force did not perform a review that included pre-
submission feedback. Later, during a preliminary evaluation, the 
Corrosion Office’s project selection panel determined that a much 
higher percentage of Army and Navy proposals were acceptable than 
those submitted by the Air Force. A selection panel member told us 
that because the Air Force did not perform a pre-submission review of 
proposals, deficiencies in those proposals were not corrected prior to 
the panel’s evaluation. 

DOD has criteria and a rigorous multistep procedure for evaluating 
proposals, but some military department stakeholders indicated that 
this information is not communicated clearly. Previously, GAO noted 
involving stakeholders helps agencies target resources to the highest 
priorities. Criteria used for the project selection panel to evaluate 
proposed projects are not clearly identified in DOD’s Corrosion 
Prevention and Mitigation Strategic Plan, and some project managers 
said that they were unfamiliar with how projects were evaluated. While 
the Corrosion Office already takes actions, such as providing in-depth 
feedback to proposals’ authors and assembling corrosion experts to 
participate on the selection panel, unclear communications on some 
issues could adversely affect authors’ abilities to prepare effective 
project proposals. 

The military departments are late in validating ROIs for some 
completed projects. The Strategic Plan suggests that follow-on reviews 
with validated ROIs are required for completed projects within 3 years 
after full project implementation. Project managers have completed 
these reviews for 10 of the 28 implemented projects funded in fiscal 
year 2005, with 8 of the 10 completed reviews performed by one Army 
command. Corrosion Executives told GAO that because CPC funding is 
awarded only for the 2-year project implementation period, they 
typically do not have funds remaining for validating ROIs after 
projects are completed. If the ROI validations of completed projects 
are not performed, the Corrosion Office will not have needed data to 
adjust project selection criteria in order to invest limited CPC funds 
in the types of projects with the greatest potential benefits. 

The Corrosion Office created Product Teams to implement DOD-wide CPC 
activities in seven areas. Using volunteers and a budget averaging 
around $4.5 million per year, the Teams propose activities, such as 
determining the costs of corrosion and DOD-wide specifications for CPC 
products, which are then selected for funding by the Director of the 
Corrosion Office. The Corrosion Executives are becoming more involved 
in Team activities. 

What GAO Recommends: 

GAO is making recommendations to: 1) improve the oversight of 
proposals submitted for funding consideration, 2) communicate more 
clearly the criteria used to select which projects will be funded, and 
3) fund and complete ROI validations. 

In written comments on this report, DOD disagreed with the first two 
recommendations, citing alternatives or differing views, and agreed 
with the third. GAO believes the recommendations remain valid. 

View [hyperlink, http://www.gao.gov/products/GAO-11-84] or key 
components. For more information, contact Jack Edwards at (202) 512-
8246 or edwardsj@gao.gov. 

[End of section] 

Contents: 

Letter: 

Background: 

Acceptance of Project Proposal Submissions to the Corrosion Office 
Often Varies by the Nature of Corrosion Executives' Oversight and 
Review and Type of Project Proposed: 

The Corrosion Office Has a Rigorous Process to Evaluate CPC Proposals 
for Funding, but Selection Criteria Are Not Clearly Communicated: 

The Military Departments Have Not Determined the Benefits of About 
Two-Thirds of the Completed Corrosion Projects: 

Product Teams Propose and Implement DOD-wide CPC Activities, and the 
Staffing Process for the Teams Is Evolving: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendix I: Scope and Methodology: 

Appendix II: Information on Selected Corrosion Prevention and Control 
Projects: 

Appendix III: Comments from the Department of Defense: 

Appendix IV: GAO Contact and Staff Acknowledgments: 

Related GAO Products: 

Tables: 

Table 1: Results of Preliminary Evaluation of Fiscal Year 2011 CPC 
Project Proposals: 

Table 2: Funding of the Product Teams for Fiscal Years 2005 through 
2010: 

Figures: 

Figure 1: Percentage of Accepted CPC Projects Receiving Corrosion 
Office Funding (Fiscal Years 2005 through 2010): 

Figure 2: Estimated Average ROI for Funded CPC Projects (Fiscal Years 
2005 through 2010): 

Abbreviations: 

Corrosion Executive: Corrosion Control and Prevention Executive: 

Corrosion Office: Office of Corrosion Policy and Oversight: 

CPC: corrosion prevention and control: 

DOD: Department of Defense: 

Product Teams: Working Integrated Product Teams: 

ROI: return on investment: 

[End of section] 

United States Government Accountability Office: 
Washington, DC 20548: 

December 8, 2010: 

The Honorable Daniel Inouye: 
Chairman: 
The Honorable Thad Cochran: 
Ranking Member: 
Subcommittee on Defense: 
Committee on Appropriations: 
United States Senate: 

In 2010, the Department of Defense (DOD) estimated that corrosion 
costs the department over $23 billion annually. Moreover, the Defense 
Science Board Task Force estimated in a 2004 report that 30 percent of 
corrosion costs could be avoided through proper investment in 
prevention and mitigation of corrosion during design, manufacture, and 
sustainment.[Footnote 1] Corrosion negatively affects all military 
assets, including both equipment and infrastructure, and is defined as 
the unintended destruction or deterioration of a material due to its 
interaction with the environment.[Footnote 2] Corrosion also affects 
military readiness, taking critical systems out of action and creating 
safety hazards. For example, an October 2009 study estimated that 
corrosion is responsible for up to 16 percent of the unavailability of 
the equipment reviewed in the study.[Footnote 3] Also, our April 2007 
report noted that the Army attributed over 50 aircraft accidents and 
12 fatalities to corrosion since 1985.[Footnote 4] According to DOD, 
increased prevention and control efforts are needed to adequately 
address the wide-ranging and expensive effects of corrosion on 
equipment and infrastructure. 

Congress has enacted several legislative requirements to address the 
high cost of corrosion's negative effects on military equipment and 
infrastructure. To fulfill these requirements, DOD created the Office 
of Corrosion Policy and Oversight (Corrosion Office) in 2003. The 
Corrosion Office is responsible for the prevention and mitigation of 
corrosion of military equipment and infrastructure.[Footnote 5] The 
National Defense Authorization Act for Fiscal Year 2008, which amended 
10 U.S.C. § 2228, specified organizational changes to the Corrosion 
Office and added new reporting requirements.[Footnote 6] These changes 
included assigning the former duties of the DOD-wide Corrosion 
Executive to the newly established position of Director of the 
Corrosion Office and mandating that the incumbent report directly to 
the Under Secretary of Defense for Acquisition, Technology and 
Logistics. Additionally, the Act required DOD to annually report on 
corrosion funding to Congress. The Duncan Hunter National Defense 
Authorization Act for Fiscal Year 2009 required each military 
department to designate a Corrosion Control and Prevention Executive 
(Corrosion Executive) to be the senior official in the department with 
responsibility for coordinating corrosion prevention and control (CPC) 
program activities, and also required each Corrosion Executive to 
submit an annual report of recommendations regarding CPC actions and 
funding levels to the Secretary of Defense.[Footnote 7] 

We conducted this work in response to the Senate Appropriations 
Committee Report accompanying the fiscal year 2010 DOD appropriations 
bill.[Footnote 8] In the Report the Committee directed us to review 
selected CPC projects and activities, identify the methodology and 
processes the military services use to forward candidate projects for 
funding consideration, and determine why the military services' entire 
estimated requirements are not reflected in the overall DOD funding 
requirement.[Footnote 9] In April 2010, we provided observations on 
the process that DOD and the military departments use to estimate 
funding requirements for CPC projects and activities, and the reasons 
why DOD's funding requirement did not reflect the estimated 
requirements identified by the military departments.[Footnote 10] This 
report discusses our evaluation of the extent to which: 

* the Corrosion Executives are involved in preparing CPC project 
proposals for submission, 

* the Corrosion Office has created a process to review and select 
projects for funding, and: 

* the military departments have validated the return on investment 
(ROI) for funded projects. 

We also discuss the process used by the Corrosion Office to determine 
the CPC activities that it will fund. 

In performing our work we used data on projects that the military 
departments submitted to the Corrosion Office for funding 
consideration in fiscal years 2005 through 2010. We assessed the 
reliability of the data by interviewing staff knowledgeable about the 
data and the system that produces them and by testing for missing 
data, outliers, or obvious errors. We determined the data were 
sufficiently reliable for the purposes of determining how the military 
departments decide which projects to submit to the Corrosion Office 
for funding consideration and how the Corrosion Office decides which 
projects to approve for funding. To enhance our understanding of the 
review and decision-making processes, we selected and reviewed a 
nonprobability sample of 24 project proposals and related information 
that the military departments submitted in fiscal years 2006, 2008, or 
2010. To select this sample, we used the following four considerations: 

* the year the project was submitted to the Corrosion Office, 

* whether the project was accepted or not accepted by the Corrosion 
Office, 

* the Corrosion Office's and military department's combined project 
cost, and: 

* the estimated ROI of the project. 

As part of these project reviews, we interviewed six officials who 
were the principal authors and points of contact for 11 of the 
projects in our sample. We additionally met with each Corrosion 
Executive to discuss the steps they and their staffs took to oversee 
CPC efforts for their respective military department. We met with 
officials at the Corrosion Office to discuss the CPC project selection 
process and also observed two meetings of the CPC project selection 
panel as part of the fiscal year 2011 project selection process. We 
observed meetings where the panel provided feedback to military 
department representatives regarding the panel's observations on the 
project proposals submitted for fiscal year 2011 funding 
consideration. To determine how the military departments validate the 
ROIs for funded projects, we met with the Corrosion Executives and 
their staffs, as well as the principal points of contact for 11 of the 
projects we reviewed. We also obtained the final reports for CPC 
projects funded in fiscal year 2005 from the Corrosion Office and 
reviewed these reports to obtain data on estimated and validated ROIs 
for these projects.[Footnote 11] We met with representatives from 
three of the seven CPC Working Integrated Product Teams to understand 
how CPC activities are formulated, funded, and implemented. Further 
details on our scope and methodology are included in appendix I. 

We conducted this performance audit from April 2010 through December 
2010, in accordance with generally accepted government auditing 
standards. Those standards require that we plan and perform the audit 
to obtain sufficient, appropriate evidence to provide a reasonable 
basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for 
our findings and conclusions based on our audit objectives. 

Background: 

Corrosion, if left unchecked, can degrade the readiness and safety of 
equipment and has been estimated to cost DOD billions of dollars 
annually.[Footnote 12] Using fiscal year 2006 data, DOD noted that it 
spends approximately $80 billion each year to maintain its ships, 
aircraft, strategic missiles, and ground combat and tactical vehicles. 
Corrosion-related costs of equipment maintenance were estimated to 
total $19.4 billion each year, or 24 percent of the total cost of 
maintenance. In addition, DOD spends approximately $10 billion to 
maintain about 577,000 buildings and structures at more than 5,300 
sites worldwide. Approximately $1.9 billion, or 11.7 percent, of these 
maintenance costs were estimated to be related to corrosion. 

The Director of the Corrosion Office is responsible for the prevention 
and mitigation of corrosion of DOD equipment and infrastructure. The 
Director's duties include developing and recommending policy guidance 
on the prevention and mitigation of corrosion to be issued by the 
Secretary of Defense, reviewing the CPC programs and funding levels 
proposed by the Secretary of each military department during the 
annual internal DOD budget review process, and submitting 
recommendations to the Secretary of Defense regarding those programs 
and proposed funding levels. In practice, this review includes the 
process of selecting projects proposed by the military departments for 
funding. In addition, the Director leads the CPC Integrated Product 
Team, which is composed of representatives from the military 
departments to accomplish the goals and objectives of the Corrosion 
Office, and includes the seven Working Integrated Product Teams 
(Product Teams) that implement CPC activities. These seven Product 
Teams are: policy and requirements; metrics, impact, and sustainment; 
specifications, standards, and product qualification; training and 
certification; communications and outreach; science and technology; 
and facilities. Until fiscal year 2011, the Corrosion Office consisted 
of the Director and contractor support. The Director told us that 4 
full-time staff were expected to be hired in early fiscal year 2011. 

The Corrosion Office funds projects and activities aimed at preventing 
and mitigating corrosion. Projects are specific CPC efforts with the 
objective of developing and testing new technologies. To receive 
Corrosion Office funding, the military departments submit project 
proposals that are evaluated by a panel of experts assembled by the 
Director of the Corrosion Office. The Corrosion Office currently funds 
up to $500,000 per project, and the military departments pledge 
complementary funding for each project they propose.[Footnote 13] The 
level of military department funding and the estimated ROI are two of 
the criteria used to evaluate the project proposals. (See app. II for 
examples of CPC projects.) Activities encompass efforts, such as 
training and cost studies, to enhance and institutionalize CPC efforts 
within DOD. These activities are coordinated through the seven Product 
Teams discussed above. Product Team representatives told us that 
funding for these activities is centrally coordinated through the 
Corrosion Office in consultation with the Product Teams. 

According to the Corrosion Office, constrained budgets and competing 
requirements to support worldwide military operations have precluded 
the full funding of CPC projects that have met the requirements for 
funding. In April 2010, we reported on the funding available to the 
Corrosion Office for projects and activities.[Footnote 14] For fiscal 
years 2005 through 2010, the Corrosion Office accepted 271 CPC 
projects with funding requests totaling $206 million, but DOD provided 
$129 million, or 63 percent of the funding required for the Corrosion 
Office to fund all 271 projects. As a result, the Corrosion Office 
funded 169 CPC projects over this 6-year period. As shown in figure 
1, the historical funding rates for CPC projects have fluctuated 
during fiscal years 2005 through 2010. During the same 6-year period, 
the Corrosion Office also funded a total of $26 million 
in corrosion-related activities such as training, outreach, and costs 
of corrosion studies. 

Figure 1: Percentage of Accepted CPC Projects Receiving Corrosion 
Office Funding (Fiscal Years 2005 through 2010): 

[Refer to PDF for image: line graph] 

Fiscal year: 2005; 
Projects funded: 73%. 

Fiscal year: 2006; 
Projects funded: 59%. 

Fiscal year: 2007; 
Projects funded: 54%. 

Fiscal year: 2008; 
Projects funded: 68%. 

Fiscal year: 2009; 
Projects funded: 72%. 

Fiscal year: 2010; 
Projects funded: 53%. 

Source: GAO analysis of DOD data. 

[End of figure] 

In April 2010, we reported that the CPC requirements for fiscal year 
2011 totaled $47 million, but the fiscal year 2011 budget identified 
$12 million for CPC, leaving an unfunded requirement of about $35 
million.[Footnote 15] Additionally, we reported that the funding level 
identified in the fiscal year 2011 budget request could result in a 
potential cost avoidance of $418 million. Similarly, multiplying the 
average estimated ROI by the amount of the unfunded requirements shows 
that DOD may be missing an opportunity for additional cost avoidance 
totaling $1.4 billion by not funding all of its estimated CPC 
requirements. Both calculations are highly contingent on the accuracy 
of the estimated ROIs that have not been validated by the military 
departments. (See the Related GAO Products section at the end of this 
report for a full listing of our reports on DOD's CPC program.) 
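
To illustrate the arithmetic behind these figures, the following 
minimal sketch (written in Python for illustration only) multiplies a 
funding amount by an average estimated ROI to obtain a potential cost 
avoidance. The ROI values shown are simply the ratios implied by the 
report's dollar figures and are not validated DOD data. 

  # Minimal sketch of the cost-avoidance arithmetic described above. 
  # Dollar amounts come from this report; the average estimated ROI 
  # values are only the ratios implied by those amounts. 

  def potential_cost_avoidance(funding_dollars, average_estimated_roi): 
      # Potential cost avoidance = funding invested x average estimated ROI. 
      return funding_dollars * average_estimated_roi 

  funded_fy2011 = 12_000_000    # CPC funding in the fiscal year 2011 budget 
  unfunded_fy2011 = 35_000_000  # approximate unfunded CPC requirement 

  # Implied average estimated ROIs (illustrative only). 
  roi_funded = 418_000_000 / funded_fy2011        # roughly 35:1 
  roi_unfunded = 1_400_000_000 / unfunded_fy2011  # roughly 40:1 

  print(potential_cost_avoidance(funded_fy2011, roi_funded))      # about $418 million 
  print(potential_cost_avoidance(unfunded_fy2011, roi_unfunded))  # about $1.4 billion 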

Acceptance of Project Proposal Submissions to the Corrosion Office 
Often Varies by the Nature of Corrosion Executives' Oversight and 
Review and Type of Project Proposed: 

The acceptance of the military departments' CPC project proposals 
varied with the nature of the review--if any--that the Corrosion 
Executives required before proposals were submitted to the Corrosion 
Office for funding consideration. The military departments have 
established Corrosion Executives to oversee CPC efforts, but their 
level of oversight varies. The Duncan Hunter National Defense 
Authorization Act for Fiscal Year 2009 requires the Corrosion 
Executive of each military department to serve as the principal point 
of contact between the military department and the Director of the 
Corrosion Office.[Footnote 16] It also requires each Corrosion 
Executive to submit an annual report to the Secretary of Defense 
containing recommendations pertaining to the military department's CPC 
program, including corrosion-related funding levels necessary to carry 
out all the Corrosion Executive's duties. In addition, DOD Instruction 
5000.67, Prevention and Mitigation of Corrosion on DOD Military 
Equipment and Infrastructure, which was updated in February 2010, 
reflects certain legislative requirements and provides Corrosion 
Executives with responsibility for certain CPC activities in their 
military department. It requires the Corrosion Executives to submit 
CPC project proposals to the Corrosion Office with coordination 
through the proper military department chain of command, as well as to 
develop and support an effective CPC program in their military 
department, evaluate the CPC program's effectiveness, serve as the 
principal point of contact with the Corrosion Office, and establish a 
process to review and evaluate the adequacy of CPC planning. 

We have reported that a key factor in helping achieve an 
organization's mission and program results and minimize operational 
problems is to implement appropriate internal control.[Footnote 17] 
Effective internal control also helps in managing change to cope with 
shifting environments and evolving demands and priorities. Control 
activities, such as the policies, procedures, techniques, and 
mechanisms that enforce management's directives, are an integral part 
of an entity's planning, implementing, reviewing, and accountability 
for stewardship of government resources and achieving effective 
results. For an entity to run and control its operations, it must also 
have relevant, reliable, and timely communications relating to 
internal as well as external events. 

During the annual process of identifying and submitting CPC project 
proposals for funding consideration, each Corrosion Executive 
exercises a different level of review prior to submission of the 
proposals to the Corrosion Office. For example, the Army and Navy 
Corrosion Executives organized and directed a review of their 
department's project proposals prior to submitting them to the 
Corrosion Office for fiscal year 2011 CPC funding, but the Air Force 
Corrosion Executive's preliminary oversight was more limited. 

The Army Corrosion Executive requested the various Army commands to 
submit abbreviated project proposals 5 weeks prior to the application 
deadline set by the Corrosion Office. Individuals nominated by the 
Army commands then reviewed these abbreviated proposals by using 
criteria the Army adapted from the project selection evaluation charts 
included in DOD's Corrosion Prevention and Mitigation Strategic Plan. 
The Corrosion Executive's office provided the results from this 
internal peer review to the authors of the proposed projects, so that 
comments obtained from the review could be incorporated into the 
project proposals before the Corrosion Executive submitted the 
projects to the Corrosion Office. Army staff told us that some authors 
withdrew their project proposals following this review, based on the 
feedback they received. 

The Navy Corrosion Executive directed a similar review process, 
requiring that a one-page synopsis of each project proposal be 
prepared and submitted to him 7 weeks prior to the Corrosion Office 
deadline. The Corrosion Executive assembled a panel with members from 
each of the Navy's system commands to review the synopses. 
Specifically, each command's synopses were reviewed and scored by 
individuals from the other system commands, based on the synopses' 
alignment with the Navy's priorities and their estimated ROIs. The Navy 
Corrosion Executive then ranked the synopses based on the aggregate 
scores received from each reviewer. A Navy project manager told us 
that receiving a low ranking did not preclude project proposals from 
being submitted to the Corrosion Office, because the Navy Corrosion 
Executive did not discourage the managers of these projects from 
submitting the full proposal to the Corrosion Office for funding 
consideration. 

We found that the Air Force Corrosion Executive did not direct a 
similar level of review and feedback for project proposals before they 
were submitted to the Corrosion Office for fiscal year 2011 funding. 
The Air Force Corrosion Executive requested that the Air Force major 
commands submit project proposals to his office prior to submitting 
project proposals to the Corrosion Office. However, the Air Force 
Corrosion Executive did not establish a process to review the 
proposals and provide preliminary feedback for revising them before 
submission to the Corrosion Office. The Air Force Corrosion Executive 
told us that he did not conduct a review of the proposals because, due 
to the historically low rate of Air Force CPC projects accepted for 
funding, he thought it was appropriate to submit all of the Air Force 
proposals to the Corrosion Office. He also said that since the 
Corrosion Office is more familiar with the criteria used to judge the 
proposals, he did not want to reject any project proposals. 

According to a member of the Corrosion Office's project selection 
panel, the additional steps taken by Army and Navy Corrosion 
Executives to ensure that their military department's proposals met 
the panel's criteria were contributing factors for a higher acceptance 
rate for Army and Navy proposals. The project selection panel found 
during the preliminary evaluation step of the proposal selection 
process that 66 percent of the Army project proposals and 61 percent 
of the Navy project proposals submitted for fiscal year 2011 funding 
were acceptable in their current form, while 11 percent of the Air 
Force projects were considered acceptable (see table 1). 

Table 1: Results of Preliminary Evaluation of Fiscal Year 2011 CPC 
Project Proposals: 

Department of the Army: 
Number of proposals submitted: 32; 
Number of proposals judged acceptable: 21; 
Percentage of proposals judged acceptable: 66%. 

Department of the Army: Facilities; 
Number of proposals submitted: 21; 
Number of proposals judged acceptable: 14; 
Percentage of proposals judged acceptable: 67%. 

Department of the Army: Weapons; 
Number of proposals submitted: 11; 
Number of proposals judged acceptable: 7; 
Percentage of proposals judged acceptable: 64%. 

Department of the Navy: 
Number of proposals submitted: 31; 
Number of proposals judged acceptable: 19; 
Percentage of proposals judged acceptable: 61%. 

Department of the Navy: Facilities; 
Number of proposals submitted: 10; 
Number of proposals judged acceptable: 7; 
Percentage of proposals judged acceptable: 70%. 

Department of the Navy: Weapons - ships; 
Number of proposals submitted: 6; 
Number of proposals judged acceptable: 1; 
Percentage of proposals judged acceptable: 17%. 

Department of the Navy: Weapons - air; 
Number of proposals submitted: 6; 
Number of proposals judged acceptable: 2; 
Percentage of proposals judged acceptable: 33%. 

Department of the Navy: Weapons - Marine Corps; 
Number of proposals submitted: 9; 
Number of proposals judged acceptable: 9; 
Percentage of proposals judged acceptable: 100%. 

Department of the Air Force: 
Number of proposals submitted: 18; 
Number of proposals judged acceptable: 2; 
Percentage of proposals judged acceptable: 11%. 

Department of the Air Force: Facilities; 
Number of proposals submitted: 9; 
Number of proposals judged acceptable: 1; 
Percentage of proposals judged acceptable: 11%. 

Department of the Air Force: Weapons; 
Number of proposals submitted: 9; 
Number of proposals judged acceptable: 1; 
Percentage of proposals judged acceptable: 11%. 

Total: 
Number of proposals submitted: 81; 
Number of proposals judged acceptable: 42; 
Percentage of proposals judged acceptable: 52%. 

Source: GAO analysis of OSD data. 

[End of table] 

The panel member also told us that the Army and Navy fiscal year 2011 
proposals were more complete and more effectively addressed the 
selection criteria than those submitted by the Air Force. For example, 
most of the Air Force project proposals lacked required information 
needed for the project selection panel to judge the merits of the 
proposal. The panel's feedback to the authors of the Air Force project 
proposals highlighted areas where the provided information was 
insufficient or incomplete, such as: 

* the project managers did not follow the project proposal template in 
the DOD Corrosion Prevention and Mitigation Strategic Plan, which 
includes topics to be addressed in project proposals; 

* the contents of the project proposals did not explain the technology 
demonstration aspects of the project; or: 

* the project proposals did not include information on matching funds 
that would be provided by the Air Force. 

The project selection panel also concluded that most of the Air 
Force's fiscal year 2011 project proposals were requests for 
replacement funds, rather than the technology demonstrations that the 
Corrosion Office's CPC program is intended to support. Selection 
panel members questioned whether the Air Force Corrosion Executive 
had reviewed the proposals, because these deficiencies were not 
identified and corrected before the project proposals were submitted 
to the Corrosion Office for funding consideration. 

The Corrosion Office Has a Rigorous Process to Evaluate CPC Proposals 
for Funding, but Selection Criteria Are Not Clearly Communicated: 

For fiscal year 2011, the Corrosion Office used a rigorous multistep 
process to review and select CPC project proposals that were 
acceptable for funding; however, some military department personnel 
involved in the process did not clearly understand the criteria used 
to select projects for funding. A project selection panel reviewed 
submitted project proposals from each military department at two 
different times. For the preliminary review, the panel used a set of 
criteria that is different from those used for final project selection 
later in the process. For the final review, the panel used criteria 
that are found in the DOD Corrosion Prevention and Mitigation 
Strategic Plan but not explicitly identified as the specific criteria 
used to evaluate CPC projects. Corrosion Executives and several 
authors of the project proposals told us they were not clear on what 
the criteria were or when they were used. 

The Corrosion Office Used a Rigorous Multistep Process to Select 
Projects for Funding: 

For the fiscal year 2011 project review and selection, we observed 
that the Corrosion Office used a rigorous multistep process to 
determine if proposed projects were acceptable for funding. 

* Step 1: In mid-June 2010, the military departments submitted 81 CPC 
project proposals to the Corrosion Office, as shown in table 1 above. 
At this point, Corrosion Office support staff assembled the project 
plans into binders for review by the project selection panel convened 
by the Director of the Corrosion Office. The fiscal year 2011 panel 
had five members: the Director, Corrosion Office (chair); Associate 
Director, Materials and Structures, Office of the Director, Defense 
Research & Engineering (vice-chair); and an official from each of the 
following organizations within the Office of the Under Secretary of 
Defense (Acquisition, Technology and Logistics): Defense Acquisition 
University; Installations and Environment; and Logistics and Materiel 
Readiness, Maintenance Policy and Programs.[Footnote 18] 

* Step 2: In mid-July 2010, 2 weeks after project information was 
provided to the panel, the panel members assembled for their 
preliminary evaluation of the proposals. This preliminary evaluation, 
which we observed, was conducted at a meeting immediately prior to the 
annual DOD Corrosion Forum and resulted in projects being designated 
as either a "go" (meaning that the projects are deemed acceptable in 
their current form) or a "no go" (meaning that the projects require 
additional information or changes in scope to be acceptable to the 
panel). We observed that the panel used criteria for this preliminary 
evaluation that are not made available to the submitters of project 
proposals and are different from those used for final project 
selection later in the process.[Footnote 19] 

* Step 3: Following the preliminary evaluation and during the 
Corrosion Forum, the panel held individual feedback sessions with 
project managers from the military commands, such as Naval Air Systems 
Command, Army Aviation and Missile Command, and Air Force Civil 
Engineer Support Agency, so feedback could be done in person. The 
panel provided feedback on each project, regardless of whether it was 
designated as a "go" or "no go." A panel member told us that the panel 
provided feedback on all projects so that project managers could 
address--if they choose to do so--any perceived weaknesses in their 
"go" projects and improve their ranking in the final evaluation, as 
well as revise the "no go" project submissions. Following the 
feedback, the project managers had three options: prepare and submit 
information addressing the feedback provided by the panel, re-submit 
project proposals in their original form, or remove projects from 
consideration for that year's funding process. Project managers told 
us that they sometimes decide to remove their "no-go" projects from 
consideration and that the military departments may implement such 
projects using other funding. A project selection panel member told us 
that if a project manager decided to modify a project proposal to 
address the panel's feedback, this modified proposal was due to the 
Corrosion Office no later than 2 weeks after the feedback session. 
Upon receipt of any revised proposals, the panel conducted another 
review of all proposals (original and resubmitted), which involved 
each panel member independently scoring the projects on judgmental 
criteria and providing written comments.[Footnote 20] 

* Step 4: In mid-August 2010, Corrosion Office support staff used an 
analytical tool to rank the projects based on the average of the 
scores recorded by each panel member for eight criteria: the five 
judgmental criteria above and three quantitative criteria--ROI, 
Corrosion Office funding as a percentage of total project cost, and 
the project performance, or implementation, period (see the 
illustrative sketch following step 5 below). 
* Step 5: Following the ranking of projects using the analytical tool, 
the selection panel reconvened for a final evaluation of the projects. 
The panel arranged the ranked list that resulted from the analytical 
tool described above into four categories: best, acceptable-
prioritized for funding, acceptable-not prioritized, and not 
acceptable. According to the staff, the "best" projects would likely 
all be funded, and the "acceptable-prioritized for funding" projects 
would be funded in priority order until Corrosion Office funding was 
exhausted. 
Corrosion Office support staff informed the panel that, based on 
historical funding levels, they anticipated having $7 million in 
available funding for CPC projects in fiscal year 2011. The panel 
identified 30 of the 53 accepted projects that it anticipated would be 
funded following completion of DOD's fiscal year 2011 budget process. 
These 30 projects included the 20 projects categorized as "best" and 
10 projects in the "acceptable-prioritized for funding" category. We 
observed that the panel then reviewed the projects that were within 
the anticipated funding level to ensure a balance between the number 
of facilities and weapons projects identified for funding. In the 
meeting we observed, no adjustments to the final ranking were 
necessary to ensure this balance. 
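
The following minimal sketch, added for illustration, shows the kind 
of score averaging and ranking described in step 4. The project 
names, panel members, and scores are hypothetical, and the sketch 
assumes a simple unweighted average; the Corrosion Office's actual 
analytical tool and scoring scales are not described in detail in 
this report. 

  # Illustrative ranking of proposals by the average of panel members' 
  # scores across eight criteria, as described in step 4. All names and 
  # scores are hypothetical; the actual analytical tool may differ. 

  from statistics import mean 

  # scores[project][panel member] = one score per criterion (8 criteria). 
  scores = { 
      "Project A": {"member 1": [8, 7, 9, 6, 8, 9, 7, 8], 
                    "member 2": [7, 8, 8, 7, 9, 8, 6, 7]}, 
      "Project B": {"member 1": [5, 6, 4, 6, 5, 7, 5, 6], 
                    "member 2": [6, 5, 5, 6, 6, 6, 5, 5]}, 
  } 

  def average_score(member_scores): 
      # Average each member's criterion scores, then average across members. 
      return mean(mean(per_criterion) for per_criterion in member_scores.values()) 

  ranked = sorted(scores, key=lambda p: average_score(scores[p]), reverse=True) 
  for project in ranked: 
      print(project, round(average_score(scores[project]), 2)) 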

Criteria Used for Project Selection Are Not Clearly Communicated: 

Corrosion Office officials told us that projects are evaluated based 
on the eight criteria that they believed were clearly listed in the 
DOD Corrosion Prevention and Mitigation Strategic Plan (and discussed 
above), yet some project managers told us they were unaware of these 
criteria. We have previously reported that a key business practice for 
performance management is the early and direct involvement of 
stakeholders.[Footnote 21] We have also reported that leading results- 
oriented organizations believe strategic planning is not a static or 
occasional event but rather a dynamic and inclusive process.[Footnote 
22] For example, we noted that stakeholder involvement is important to 
help agencies ensure that their efforts and resources are targeted at 
the highest priorities. 

We found that some military department stakeholders--including the 
Corrosion Executives and project managers who submit project 
proposals--had limited familiarity with the criteria to evaluate 
projects for CPC funding. As described above, the selection panel used 
a different set of criteria to make the preliminary "go/no-go" 
decision than the set used for the final evaluation and decision. 
Corrosion Office officials told us that they believed these criteria 
were clearly listed in the DOD Corrosion Prevention and Mitigation 
Strategic Plan, but we found that only some of the criteria used to 
evaluate CPC project proposals were clearly found in the Strategic 
Plan. Further, the criteria identified by the Corrosion Office 
officials were grouped in the Strategic Plan with other criteria not 
used for the project selection process. Two of the six project 
managers with whom we met told us that they were unfamiliar with the 
criteria used to assess CPC projects. The other four project managers 
said that they became familiar with the criteria by attending the DOD 
Corrosion Forums, discussing projects with the panel during previous 
years' feedback sessions, or learning about the criteria from other 
project managers--not by reading the DOD Corrosion Prevention and 
Mitigation Strategic Plan. Some project managers told us that project 
managers who are new to the process of applying for CPC funding would 
have difficulty understanding the criteria sufficiently to prepare a 
successful project proposal. Also, the Corrosion Executives told us 
that they were unfamiliar with the criteria used by the project 
selection panel to prioritize projects for funding. For example, the 
Air Force Corrosion Executive told us that he did not review CPC 
projects prior to submitting them to the Corrosion Office for funding 
consideration because he was not sufficiently familiar with the 
criteria used by the Corrosion Office to select projects. 

During our observations of the project selection panel process, we 
identified several conditions that show communication between the 
Corrosion Office and the military department stakeholders is not as 
clear as it could be. 

* Criteria used for project selection are not clearly identified in 
the Corrosion Prevention and Mitigation Strategic Plan. The Strategic 
Plan includes an attachment with seven project assessment charts that 
the Strategic Plan states are "not to be filled out and submitted" 
with the project proposal and "will not be used to score projects, 
although they may be used as a guide" for the preliminary and final 
project evaluations. However, we observed the project selection panel 
using one of the topics described in the assessment charts (ROI) to 
make project acceptance decisions. 

* Further, it appeared that certain criteria were more important for 
project acceptance than others, even though this difference in 
importance was not identified in the Strategic Plan. For example, 
during the project selection meetings we observed, the proposed 
projects' estimated ROI appeared to be a very important criterion in 
the panel's decision-making process. Also, we observed that the ratio 
of funding requested from the Corrosion Office to that provided by the 
military department was often cited by the project selection panel as 
a reason for scoring a project higher or lower, even though the 
Strategic Plan does not explicitly mention this criterion.[Footnote 23] 

* The panel also assessed some projects using criteria that were not 
listed in the Corrosion Prevention and Mitigation Strategic Plan. 
Specifically, the extent to which past projects had used similar 
technology and the extent to which a proposed project's location 
previously experienced difficulties with project implementation both 
factored in part into the selection panel's decisions about whether to 
accept projects for funding, even though these criteria are not listed 
in the Strategic Plan.[Footnote 24] 

* The project selection process did not incorporate the priorities of 
the military departments, even though the Navy provided this 
information to the panel for the fiscal year 2011 selection process. 
Corrosion Executives and project managers told us they believed that 
it was appropriate for the project selection panel to consider the 
priorities of the military departments, as each department was 
required to provide matching funds for proposed projects. However, a 
selection panel member and Corrosion Office officials told us that 
they disagreed with this view, and added that the CPC program was 
intended as a technology demonstration program with the goal of 
awarding funds to the most competitive projects, regardless of 
department priorities. 

The military department stakeholders' limited knowledge and 
understanding of the selection criteria could be a challenge for the 
Corrosion Office in accomplishing the stated purpose of the Strategic 
Plan to articulate policies, strategies, objectives, and plans that 
will ensure an effective, standardized, affordable DOD-wide approach 
to prevent, detect, and treat corrosion and its effects on military 
equipment and infrastructure. This situation makes it difficult for 
stakeholders to craft effective project proposals because they are 
unsure about the criteria that the project selection panel uses to 
make decisions on which projects to accept for funding. 

The Military Departments Have Not Determined the Benefits of About 
Two-Thirds of the Completed Corrosion Projects: 

The military departments have completed a third of their required ROI 
validations for projects funded in fiscal year 2005, but completion of 
the remaining projects' validations for that year is behind schedule. 
Guidance in the DOD Corrosion Prevention and Mitigation Strategic Plan 
describes the steps to be taken to initially estimate the ROIs for CPC 
projects submitted for funding by the Corrosion Office. These 
estimation steps include (1) calculating the project costs--such as up-
front investment costs and operating and support costs, (2) 
calculating the benefits that are expected to result from the project--
such as reduction of costs like maintenance hours and inventory costs, 
and (3) calculating the net present value of the annual costs and 
benefits over the projected service life of the proposed technology. 
[Footnote 25] 
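
A minimal sketch of this kind of ROI estimate appears below, added 
for illustration. It assumes the ROI is expressed as the ratio of the 
net present value of the annual benefits to the net present value of 
the costs over the technology's projected service life; the Strategic 
Plan's exact formula, discount rate, and cost categories are not 
reproduced here, and all amounts shown are hypothetical. 

  # Illustrative ROI estimate: net present value (NPV) of the annual 
  # benefits divided by the NPV of the costs over the projected service 
  # life. The discount rate, cash flows, and service life are hypothetical. 

  def npv(cash_flows, discount_rate): 
      # Net present value of a list of annual amounts (year 0 first). 
      return sum(amount / (1 + discount_rate) ** year 
                 for year, amount in enumerate(cash_flows)) 

  service_life = 10                 # years 
  discount_rate = 0.03 
  costs = [500_000] + [20_000] * service_life   # up-front investment plus annual support 
  benefits = [0] + [150_000] * service_life     # annual maintenance and inventory savings 

  estimated_roi = npv(benefits, discount_rate) / npv(costs, discount_rate) 
  print(f"Estimated ROI is about {estimated_roi:.1f}:1") 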

The DOD Corrosion Prevention and Mitigation Strategic Plan notes that 
follow-on reviews of completed projects are required and that the 
reviews are to focus on validating the project's ROI. Corrosion Office 
officials told us that because the CPC projects are generally funded 
for 2 years of implementation and ROI validations are required within 
3 years of completing the project's implementation, reviews for 
projects funded in fiscal year 2005 are due by the end of fiscal year 
2010.[Footnote 26] The ROI validations consist of: 

* reviewing assumptions used earlier in computing the estimated ROI; 

* updating the costs and benefits associated with the new technology 
resulting from the project; 

* recalculating the ROI based on validated data; and: 

* providing an assessment of the difference, if any, between the 
estimated ROI and the validated ROI. 

The military departments have completed these reviews, including the 
ROI validations, for 10 (36 percent) of the 28 implemented projects 
funded in fiscal year 2005. For these 10 projects, the average ROI 
ratio was validated as 12:1, slightly higher than the average 
estimated ROI of 11:1 for these projects when they were originally 
proposed. While the agreement between the average estimated and 
validated ROIs is encouraging, the small number of projects--overall 
and by type of project--does not allow these findings to be 
generalized. 

Nine of these ten CPC projects with validated ROIs were focused on 
corrosion in facilities, and facilities projects accepted by the 
Corrosion Office for funding have historically had lower estimated 
ROIs than CPC equipment projects.[Footnote 27] Specifically, for CPC 
projects funded in fiscal year 2005, the facilities projects had an 
estimated average ROI of 13:1, while the equipment projects had an 
estimated average ROI of 67:1. Figure 2 shows the estimated average 
ROIs for projects funded in fiscal years 2005 through 2010. 

Figure 2: Estimated Average ROI for Funded CPC Projects (Fiscal Years 
2005 through 2010): 

[Refer to PDF for image: vertical bar graph] 

Fiscal year: 2005; 
Facilities: 12.7:1; 
Weapons: 67.1:1. 

Fiscal year: 2006; 
Facilities: 14.3:1; 
Weapons: 80.8:1. 

Fiscal year: 2007; 
Facilities: 12.7:1; 
Weapons: 120.7:1. 

Fiscal year: 2008; 
Facilities: 24.4:1; 
Weapons: 78.2:1. 

Fiscal year: 2009; 
Facilities: 15.5:1; 
Weapons: 134.1:1. 

Fiscal year: 2010; 
Facilities: 14.3:1; 
Weapons: 56.6:1. 

Source: GAO analysis of DOD Corrosion Office data. 

[End of figure] 

Both Corrosion Office and military department officials conceded that 
they are behind schedule on completing ROI validations for fiscal year 
2005 projects. Army and Navy corrosion officials told us that, because 
CPC funding is awarded for a 2-year project implementation period, 
they typically do not have sufficient funds remaining for validating 
the ROI after projects are implemented. However, the Army group that 
conducts CPC projects for facilities has completed 8 of its 9 required 
ROI validations for projects funded in fiscal year 2005. According to 
an Army official, this group has historically been allocated $5 
million annually for CPC activities. The Director of the Corrosion 
Office told us that the office is aware of the military departments' 
difficulties in completing the validations and is considering 
budgeting DOD-wide CPC 
funds for ROI validation. If this action is taken, funding would go to 
the Product Team responsible for CPC metrics for the team to allocate 
to ensure completion of the validations. 

Because the military departments have not completed the required 
validations of ROI estimates, DOD and the military departments are 
unable to fully demonstrate the costs and benefits of the CPC 
projects. One project selection panel member told us that the lack of 
completed ROI validations makes it more difficult for the panel to 
make decisions about how to change project selection criteria to 
invest limited funds in the types of projects with the greatest 
benefits. Moreover, continuing to rely on limited evaluative data 
prevents DOD from making better-informed decisions about the amount 
of funding for the Corrosion Office's CPC program, as well as about 
where best to invest CPC funds. 

Product Teams Propose and Implement DOD-wide CPC Activities, and the 
Staffing Process for the Teams Is Evolving: 

The Corrosion Office has created seven Product Teams to propose and 
implement DOD-wide CPC activities in seven areas, as discussed 
earlier. Using volunteers from the military departments, the Product 
Teams propose activities, such as determining the costs of corrosion, 
which are then selected for funding. In the past, product team members 
served on an informal voluntary basis with little involvement from the 
military departments. However, now that each department has a 
Corrosion Executive, the process for selecting the Product Teams' 
members is changing. 

Product Teams Implement CPC Activities: 

According to a Product Team member, the Product Teams convene during 
the DOD Corrosion Forums held twice each year and coordinate 
activities by email and through the Corrosion Office Web site during 
the rest of the year. For example, at the July 2010 DOD Corrosion 
Forum that we observed, the Product Teams presented their activities 
to the attendees, discussed their progress on the activities, and 
prepared a set of goals for actions to be completed before the next 
Corrosion Forum. The Product Teams' action plans are included in the 
DOD Corrosion Prevention and Mitigation Strategic Plan and are updated 
annually. The Product Teams are staffed by representatives from the 
military departments, and Corrosion Office staff and the Product Team 
representatives told us that an informal process is used to fund the 
CPC activities implemented by the Product Teams. Specifically, each 
year the Director of the Corrosion Office asks the Product Team chairs 
to provide details on the funding required for the activities planned 
for the next year. The Director then requests the funds through the 
annual budget request submitted to the DOD Comptroller.[Footnote 28] 
Product Team representatives told us that they were satisfied with the 
level of funding provided for CPC activities. Table 2 lists the 
funding for each Product Team for fiscal years 2005 through 2010. 

Table 2: Funding of the Product Teams for Fiscal Years 2005 through 
2010: 

Product team: Policy and requirements; 
6-year total: $10.0 million; 
Proportion of funding: 39%. 

Product team: Metrics, impact, and sustainment; 
6-year total: $5.8 million; 
Proportion of funding: 23%. 

Product team: Specifications, standards, and product qualification; 
6-year total: $3.0 million; 
Proportion of funding: 12%. 

Product team: Training and certification; 
6-year total: $3.2 million; 
Proportion of funding: 12%. 

Product team: Communications and outreach; 
6-year total: $2.9 million; 
Proportion of funding: 11%. 

Product team: Science and technology; 
6-year total: $0.8 million; 
Proportion of funding: 3%. 

Product team: Facilities[A]; 
6-year total: $0.0 million; 
Proportion of funding: 0%. 

Product team: Total; 
6-year total: $25.7 million; 
Proportion of funding: 100%. 

Source: GAO analysis of DOD data. 

[A] Corrosion Office staff told us that the Facilities Product Team is 
not funded directly, but rather through other Product Teams, since 
their activities fall within each of the other six Product Team areas. 
Members of the Facilities Product Team also serve on the other six 
Teams, where their funding needs are addressed. 

Note: The figures in Table 2 reflect the fiscal year funding plans, 
which Corrosion Office officials told us may not be the exact final 
funding figures and, in a few cases, may not include all of the final 
funding. 

[End of table] 

The tasks completed by the Product Teams vary according to their area 
of specialization. The following descriptions of two Product Teams' 
tasks and impact illustrate this specialization and the kinds of 
information these teams generate. 

The Metrics, Impact, and Sustainment Product Team has focused on 
determining the baseline costs of corrosion for DOD. This task 
involves establishing a methodology to measure the costs associated 
with corrosion throughout DOD and applying the methodology to selected 
components of the military departments (such as Army aviation and 
missiles, and Navy ships). These efforts resulted in a series of 
reports that estimated the cost of corrosion for various classes of 
equipment and facilities across the military departments. A project 
manager with whom we met told us that these cost studies helped him 
and his colleagues to identify areas in which to focus their CPC 
efforts. He told us that the Army Aviation and Missile Command 
established a corrosion team to focus on cost drivers, following the 
issuance of a cost study that estimated Army aviation and missile 
assets had corrosion costs of $1.6 billion per year. This Product Team 
plans to update the cost of corrosion for each military department 
component on a 3-year cycle and to use this information to track the 
impact of CPC efforts over time. This Product Team also has ongoing 
efforts to measure the impact of corrosion on readiness. A preliminary 
report, published in October 2009, concluded that corrosion-related 
factors can cause asset unavailability of up to 16 percent, with the 
greatest impact occurring on aviation assets. One Product Team 
representative told us that (1) their studies on corrosion costs were 
completed prior to the Corrosion Executives' being established at the 
military departments and (2) the Product Team plans to consult with 
the Corrosion Executives to incorporate their input into future 
updates to the cost studies. He told us that he expected this would 
have a positive impact at the military departments. 

In addition, the Specifications, Standards, and Product Qualification 
Product Team has developed a Web-based tool to help suppliers match 
their products with existing specifications and standards used by DOD. 
A Product Team representative told us that this activity is expected 
to result in improved technologies and products available to the DOD 
maintenance community for use in preventing corrosion. Additionally, 
the Product Team representative told us that product specifications 
are required to be updated every 2-5 years and that these updates cost 
DOD up to $20,000 each. He told us that there are over 800 corrosion- 
related product specifications, such as information on what types of 
treatments, primers, and paints are to be applied to a particular 
material in a given situation. Because of the large number of 
specifications involved and the cost of revising each of them, this 
Product Team has focused its efforts on assembling a list of 38 "high- 
risk" specifications that are given priority for funding. 

Staffing of the Product Teams Is Evolving to Incorporate the Corrosion 
Executives and Their Inputs: 

The Corrosion Executives of the military departments are responsible 
for supporting the Product Teams, which are part of the CPC Integrated 
Product Team, and the Product Team staffing process is evolving to 
recognize their emerging roles and responsibilities. Since February 
2010, the Corrosion Executives have been required by DOD Instruction 
to support the Product Team process by designating trained or 
qualified representatives.[Footnote 29] According to the DOD Corrosion 
Prevention and Mitigation Strategic Plan, the Director of the 
Corrosion Office manages and coordinates the CPC Integrated Product 
Team, which includes the Product Teams. The Strategic Plan does not 
reflect this new requirement for the Corrosion Executives to designate 
representatives to the Product Teams. 

The Corrosion Executives and two of the Product Teams' chairs told us 
that the process of staffing the Product Teams is changing. According 
to the Navy Corrosion Executive, participation on a Product Team has 
historically been based on individual interest and whether a volunteer 
had time available to dedicate to the team. However, 
recently, when a Navy representative who was serving as the chair of a 
Product Team asked to be replaced, the Navy Corrosion Executive 
nominated another individual from the Navy to serve on the Product 
Team. The Corrosion Executive communicated the nomination to the 
Director of the Corrosion Office and the Corrosion Executives of the 
Army and Air Force, and there were no objections to the change. The 
Navy Corrosion Executive told us that this example is typical of the 
informal process currently used to staff the Product Teams. He added 
that the Corrosion Executives have met with the Director of the 
Corrosion Office to discuss establishing a Corrosion Board of 
Directors, which could establish regular meetings between the 
Corrosion Executives and the Director of the Corrosion Office to 
discuss policy issues, including a more formal process of staffing the 
Product Teams. 

While the Corrosion Office has, in the past, relied on the Product 
Team members to represent the position of the military departments on 
corrosion-related issues, the Corrosion Executives told us they felt 
that it was now more appropriate for such discussions to occur between 
the Director of the Corrosion Office and the Corrosion Executives 
directly. However, the Air Force has recently designated particular 
Product Team representatives from its military department as 
authorized to speak for the department in communications with the 
Corrosion Office. The Air Force Corrosion Executive told us that this 
designation was intended to prevent any miscommunication between 
Product Team representatives and the Corrosion Office. 

Product Team members with whom we spoke had mixed reactions to the 
involvement of the Corrosion Executives in the Product Teams. One 
member told us that he felt it was appropriate for the Product Teams 
to be staffed by volunteers and was concerned that an increased role 
by the Corrosion Executives in designating members to the Product 
Teams would reduce the commitment of the members to the Product Teams. 
In contrast, another Product Team member told us that he thought it 
was good for the Corrosion Executives to be more involved, because it is 
important to ensure that the Corrosion Executives have buy-in to the 
Product Team activities. 

Conclusions: 

Corrosion significantly impacts DOD in terms of cost, readiness, and 
safety. The Corrosion Office has made substantial progress toward 
establishing a coordinated DOD-wide approach to controlling and 
mitigating corrosion, including: 

* creating a process to select and fund projects intended to develop 
and use new CPC technologies, 

* quantifying the costs of corrosion, and: 

* working more closely with the military departments. 

Also, each military department has recently designated a legislatively 
mandated Corrosion Executive to manage and coordinate its corrosion 
efforts and give increased visibility to this important area of 
equipment and infrastructure sustainment. However, some continuing 
uncertainty about how the Corrosion Executives should fulfill their 
responsibilities may be limiting the positive impact that these 
positions could have on CPC efforts. For example, the nature and 
extent of reviews of CPC proposals before they are submitted to the 
Corrosion Office were cited as a possible cause for differences in the 
rates at which the military departments' proposed projects are 
selected for supplemental funding from the Corrosion Office. 
Similarly, unclear communication of the criteria used to select 
projects for funding may have negative effects, including significant 
revisions to project proposals and fewer projects being accepted. If 
these concerns are not addressed, DOD and the military departments may 
not achieve the program's maximum benefits in limiting the effects of 
corrosion on the assets that they manage. An additional area of 
concern is the 
limited follow-through on the requirement to validate the ROIs that 
were originally estimated for the funded projects. While the few 
validations completed thus far document positive results, the small 
and non-representative group of findings prevents (1) generalization 
about the impact of other funded projects and (2) efforts to identify 
and focus future funding toward types of projects that have been shown 
to have the best likelihood for high payoffs. Also, more complete 
information on ROIs could provide DOD with an empirical basis for 
determining how, if at all, the Corrosion Office's funding and 
activities should be modified. 

Recommendations for Executive Action: 

To ensure that the Department of Defense is taking full advantage of 
the cost savings that can be achieved by implementing CPC projects, we 
recommend that the Secretary of Defense direct the Under Secretary of 
Defense (Acquisition, Technology and Logistics) to take the following 
three actions: 

* Update applicable guidance, such as DOD Instruction 5000.67, 
Prevention and Mitigation of Corrosion on DOD Military Equipment and 
Infrastructure or the DOD Corrosion Prevention and Mitigation 
Strategic Plan to further define the responsibilities of the military 
departments' Corrosion Executives, to include more specific oversight 
and review of the project proposals before and during the project 
selection process. 

* Modify the DOD Corrosion Prevention and Mitigation Strategic Plan to 
clearly specify and communicate the criteria used by the panel in 
evaluating CPC projects for funding consideration. This action should 
include listing and describing each criterion used by the panel in the 
preliminary and final project evaluation decisions and discussing how 
the criteria are to be used by the panel to decide on project 
acceptability. 

* Develop and implement a plan to ensure that return on investment 
validations are completed as scheduled. This plan should be completed 
in coordination with the military department Corrosion Executives and 
include information on the time frame and source of funding required 
to complete the validations. 

Agency Comments and Our Evaluation: 

In written comments on a draft of this report, DOD agreed with one of 
our recommendations and did not agree with the other two 
recommendations. DOD's letter also provided some technical comments 
that we have incorporated as appropriate. For example, DOD's comments 
noted some new information that the department had not shared with us 
previously. Therefore, we revised our report to reflect the fact that 
DOD now estimates that approximately $1.9 billion, or 11.7 percent, of 
facilities' maintenance costs are related to corrosion. We have also 
revised our report to reflect additional information the department 
provided on how the Product Teams are staffed. DOD's comments are 
included in their entirety in appendix III. 

DOD did not agree with our recommendation that the Secretary of 
Defense direct the Under Secretary of Defense (Acquisition, Technology 
and Logistics) to update applicable guidance, such as DOD Instruction 
5000.67 or the DOD Corrosion Prevention and Mitigation Strategic Plan, 
to further define the responsibilities of the military departments' 
Corrosion Executives, to include more specific oversight and review of 
the project proposals before and during the project selection process. 
In its comments, DOD stated that DOD-level policy documents are high- 
level documents that delineate responsibilities to carry out the 
policy. Specific implementing guidance is provided through separate 
documentation. DOD also stated that the Corrosion Office will be 
updating the DOD Corrosion Prevention and Control Planning Guidebook 
and beginning the process of converting it into a DOD manual in the 
next year. In addition, DOD's response noted that the "best practice" 
of the military department Corrosion Executives conducting their own 
internal reviews before and during the project selection process will 
be included in that update. Our recommendation to "update applicable 
guidance" did not prescribe where the updated guidance should be made. 
Instead, our recommendation only offered examples of documents that 
might be modified. We believe that updating the Guidebook and 
converting that to a DOD Manual would provide the needed direction to 
the military department Corrosion Executives and would meet the intent 
of our recommendation. 

DOD also did not agree with our recommendation that the Secretary of 
Defense direct the Under Secretary of Defense (Acquisition, Technology 
and Logistics) to modify the DOD Corrosion Prevention and Mitigation 
Strategic Plan to clearly specify and communicate the criteria used by 
the panel in evaluating CPC projects for funding consideration, as 
well as listing and describing each criterion used by the panel in the 
preliminary and final project evaluation decisions. In its response, 
DOD stated that it disagreed with the implications that the Strategic 
Plan is deficient in clearly specifying the criteria and that added 
discussion is needed in the Strategic Plan regarding how the criteria 
are used by the panel. DOD commented that the criteria used by the 
panel and the steps in the process are completely transparent to the 
[project proposal] authors, and the details have been verbally 
communicated to stakeholders and are available on line and by e-mail 
in Appendix D of the Strategic Plan. However, DOD also stated: (1) 
"While not always defined as 'criteria,' all factors considered in the 
evaluation are articulated in Appendix D" and (2) "While not expressly 
defined as 'criteria,' these indices are clearly criteria from which 
anyone submitting a project plan can determine what is likely to 
improve the chances of a higher DEA [the model used in the panel 
process] ranking." 

In developing our findings, we analyzed the Strategic Plan to 
understand the process and criteria used to evaluate CPC projects for 
funding; observed the panel proceedings for both the preliminary and 
final project reviews; discussed the panel process with panel members 
and military department Corrosion Executives; and discussed their 
understanding of the process and the criteria used for project 
evaluation with Corrosion Executives and project authors. The views of 
the panel members, Corrosion Executives, and project authors, as well 
as our observations, formed our findings and conclusions and led to 
our recommendations. Despite the efforts of the Corrosion Office to 
communicate with its constituency through briefings, emails, and other 
methods as delineated in DOD's comments, some of those involved in the 
process reported to us that they did not clearly understand what the 
criteria were and when they were used in the process. Moreover, DOD's 
comments quoted above acknowledge that criteria are not always clearly 
defined in Appendix D of the Strategic Plan. We believe our findings 
are sound and that our recommendation to clearly identify and 
communicate the criteria is still appropriate. Continued use of 
unclear criteria could result in wasted personnel time associated with 
preparing and revising proposals. 

DOD agreed with our recommendation that the Secretary of Defense 
direct the Under Secretary of Defense (Acquisition, Technology and 
Logistics) to develop and implement a plan to ensure that return on 
investment validations are completed as scheduled. DOD stated that 
plans are underway to address this requirement. 

DOD also commented that some of our statements are inaccurate. First, 
DOD claims that statements in the draft report regarding the 
use of different criteria for the preliminary and final project 
evaluation are not true. However, in our discussions with the panel 
members and project authors, as well as our observations of the panel 
process, it was clear that some criteria were used in one evaluation 
and not in the other. Second, DOD stated that the evaluation team 
[panel] is not an "ad hoc working group" and the panel members are 
selected based on experience, expertise, and judgment. In response to 
DOD's comments, we modified our characterization of the panel. 
Finally, DOD commented that a statement in the draft report that the 
process did not consider military department priorities is not 
accurate. However, as we state in the report, both Corrosion Office 
staff and a panel member told us that it was not the intent of the CPC 
program to fund military department priorities, but to award funds to 
the most competitive projects. Also, DOD's comments state that "the 
panel does not initially rank projects using the military department 
priorities" and assert that those priorities have been used by the 
panel in the final ranking if a military department has two or more 
projects that are considered to be comparatively equal. However, this 
is a relatively limited circumstance and, in the view of some 
stakeholders, does not adequately acknowledge the priorities of the 
military departments. 

We are sending copies of this report to the appropriate congressional 
committees. We are also sending copies to the Secretary of Defense; 
the Deputy Secretary of Defense; the Under Secretary of Defense 
(Comptroller); the Under Secretary of Defense (Acquisition, Technology 
and Logistics); the Secretaries of the Army, Navy, and Air Force; and 
the Commandant of the Marine Corps. This report will also be available 
at no charge on our Web site at [hyperlink, http://www.gao.gov]. 
Should you or your staff have any questions concerning this report, 
please contact me at (202) 512-8246 or edwardsj@gao.gov. Contact 
points for our Offices of Congressional Relations and Public Affairs 
may be found on the last page of this report. Key contributors are 
listed in appendix IV. 

Signed by: 

Jack E. Edwards: 
Director, Defense Capabilities and Management: 

[End of section] 

Appendix I: Scope and Methodology: 

For the overall context of our analysis, we reviewed relevant laws; 
Department of Defense (DOD) and military department-specific guidance; 
the DOD Corrosion Prevention and Mitigation Strategic Plan; and 
reports issued by LMI and the Defense Science Board. 

To address our objectives, we met with the Director of the Office of 
the Secretary of Defense's Corrosion Policy and Oversight Office 
(Corrosion Office), members of the Corrosion Prevention and Control 
(CPC) project selection panel assembled by the Director of the 
Corrosion Office, DOD contractors who assist the Director of the 
Corrosion Office in managing the CPC program, each military 
department's Corrosion Executive and their staffs, representatives of 
three of the seven Working Integrated Product Teams (Product Teams) 
that coordinate CPC activities, and the six project managers who 
authored the proposals for 11 of the CPC projects included in our 
sample. 

We obtained data from the Corrosion Office for projects that the 
military departments had submitted for funding consideration for 
fiscal years 2005 through 2010. Projects submitted for fiscal year 
2011 funding were not in that population because the Corrosion Office 
had not completed the funding of these projects at the time of our 
review. We assessed the reliability of the data by (1) interviewing 
staff knowledgeable about the data and the system that produces them; 
(2) testing for missing data, outliers, or obvious errors using 
comparisons to data obtained during prior GAO reviews; and (3) 
conducting logic tests. We determined that the data were sufficiently 
reliable for the purposes of our review, which were to determine how 
the military departments decide which projects to submit to the 
Corrosion Office for funding consideration, and how a panel of experts 
and the Corrosion Office decide which projects to approve for funding. 
To identify corrosion projects for a more detailed review, we selected 
a nonprobability sample of projects from each of fiscal years 2006, 
2008, and 2010 using the following criteria: 

* year the project was submitted to the Corrosion Office, 

* whether the Corrosion Office did or did not accept the project, 

* the Corrosion Office's and military department's combined project 
cost, and: 

* the estimated return on investment of the project. 

Applying the above criteria, we selected a sample of 24 projects for 
further review. 

To determine the extent to which the Corrosion Executives are involved in 
preparing CPC project proposals for submission to the Corrosion Office 
for funding consideration, we met with each of the Corrosion 
Executives and their staffs and reviewed the military departments' 
corrosion reports, to identify whether there was a process at each 
department to review CPC projects. We 
interviewed six officials who were the principal authors and points of 
contact for 11 of the projects in our sample. We also reviewed 
legislation and military department documents, as well as guidance on 
internal controls, to identify relevant responsibilities and practices 
that could be used as criteria. 

To determine the extent to which the Corrosion Office has created a process to 
review and select projects for funding, we interviewed the Corrosion 
Office staff who manage the process of requesting and receiving 
project proposals from the military departments. We also interviewed 
some members of the project selection panel that decided which 
projects to accept for funding to obtain their observations on the 
evaluation and selection process. For projects in our sample, we 
reviewed records of the project selection panel's decisions whether to 
accept the projects for funding. We observed the project selection 
panel's preliminary and final project evaluation meetings for fiscal 
year 2011 projects to determine the current process for evaluating 
projects. Additionally, we reviewed the project proposal template 
included in DOD's Corrosion Prevention and Mitigation Strategic Plan. 

To determine the extent to which the military departments have validated the 
return on investment (ROI) of funded projects, we obtained the 10 
project review reports that had been completed for fiscal year 2005 
projects. We reviewed these reports for data on the validated ROI, the 
comparison between the validated data and the original estimate, and 
information on the reasons--if applicable--why the ROI had changed. 

To determine how the Corrosion Office determines which CPC activities 
to fund, we interviewed the chairs of three of the seven Product Teams 
who manage the CPC activities. We also reviewed materials (e.g., cost 
studies) that the Product Teams produced, obtained information on the 
funding for the Product Teams, and attended sessions at the DOD 
Corrosion Forum where Product Team representatives described their 
ongoing and planned activities. 

We conducted this performance audit from April 2010 through December 
2010 in accordance with generally accepted government auditing 
standards. Those standards require that we plan and perform the audit 
to obtain sufficient, appropriate evidence to provide a reasonable 
basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for 
our findings and conclusions based on our audit objectives. 

[End of section] 

Appendix II: Information on Selected Corrosion Prevention and Control 
Projects: 

Project name and year of funding request: Dehumidification of PATRIOT 
Missile Systems; fiscal year 2008; 
Project description: The Army Aviation and Missile Command implemented 
this project which had estimated costs of $95,000 divided equally 
between the Army and the Corrosion Office and an estimated ROI of 
47:1. The project involved using advanced commercial off-the-shelf 
forced-air dehumidification technology to dehumidify the air intake 
for the PATRIOT missile system radar set. The intent of this effort 
was to reduce the $46.4 million in annual corrosion costs identified 
in DOD's May 2007 report on the cost of corrosion for Army aviation 
and missile equipment; 
Final status: The Corrosion Office accepted the project and provided 
$48,000. Army Aviation and Missile Command staff told us that the 
project is still being implemented and that some units have deployed 
to the field. 

Project name and year of funding request: Laser powder deposition 
repair of knife edge seals on Navy and Army jet engines; fiscal year 
2010; 
Project description: The Army Aviation and Missile Command and the Naval 
Air Systems Command submitted a joint project proposal to demonstrate 
new technology using a laser powder deposition technique to repair 
knife edge seals that are components within the T700 engine. Almost 
all of the used (overhauled) seals wear enough to require repair or 
replacement. This new technology can reduce repair time and 
replacement of the seals. The T700 engine is used by the Air Force, 
Army, and the Navy. The military departments did not identify their 
funding contribution but requested $30,000 from the Corrosion Office. 
This Army-led project has an estimated ROI of 7:1; 
Final status: The Corrosion Office accepted this project and provided 
$30,000. Army Aviation and Missile Command staff told us that delays 
in obtaining Army funding have slowed the implementation of this 
project. 

Project name and year of funding request: Avdec sealants for 
conductive gaskets and floorboard; fiscal year 2006; 
Project description: The Naval Air Systems Command submitted this 
project proposal for a total cost of $2.7 million, of which 68 percent 
was requested from the Corrosion Office. The project has an estimated 
ROI of 14:1. Due to the high rate of corrosion-related replacement of 
antennas on the Navy's F/A-18 Hornets and the cost of $2.5 million per 
year to replace the antennas, the project proposed developing a new 
generation of sealants to avoid corrosion on aircraft antennas and 
floorboards; 
Final status: The project was accepted but not funded by the Corrosion 
Office. Naval Air Systems Command staff told us that the project was 
funded by other sources and is in the early stages of implementation. 

Project name and year of funding request: Advanced aluminum-anodizing 
system; fiscal year 2006; 
Project description: The Navy and Army jointly submitted this project 
proposal with the Naval Air Systems Command as the lead organization. 
The project had a total cost of $470,000, with 74 percent requested 
from the Corrosion Office. The project's estimated ROI was 2:1. This 
project would use Metallast technology to help provide more precise 
control of coating consistency, durability, and corrosion protection 
to improve the process of anodizing complex parts. Implementation 
would include installing new computer controlled anodizing systems at 
two Naval aviation depots, and also assessing the feasibility of a 
follow-on implementation at an Army depot; 
Final status: The project was accepted, but not funded by the 
Corrosion Office. Naval Air Systems Command staff told us that the 
project was funded by other sources, and has been completed. 

Project name and year of funding request: Sputtered aluminum process 
for high-strength steel components; fiscal year 2006; 
Project description: Naval Air Systems Command submitted this project 
proposal for a total cost of $550,000, with 82 percent requested from 
the Corrosion Office. Its estimated ROI was 1:1. The project proposed 
implementing a Plug and Coat sputtered aluminum system on an existing 
IVD aluminum system at the naval depot in Jacksonville and validating 
its potential use in other naval aviation depots. The Plug 
and Coat system is a proven technical solution to access cavities and 
other internal surfaces of high-strength steel components and coat 
them with aluminum to protect against corrosion. The proposal said 
that the current process (1) consumes excessive man-hours to process 
parts and (2) leads to additional corrosion of components; 
Final status: The project was not accepted by the Corrosion Office. 
Naval Air Systems Command staff told us that the project was not 
pursued further. 

Project name and year of funding request: High-efficiency paint spray 
gun systems; fiscal year 2010; 
Project description: The Air Force Research Laboratory submitted this 
project proposal for a total cost of $560,000, with 54 percent 
requested from the Corrosion Office. Its estimated ROI was 605:1. The 
project plan proposed evaluating and testing several new paint spray 
gun systems using various types of existing coatings. Ease of use, 
economics, and the quality and uniformity of the finish coating would 
be compared for the various systems; 
Final status: The project was accepted but not funded by the Corrosion 
Office. According to laboratory officials, the project was not 
resubmitted because Air Force priorities changed and they did not 
believe it would rank above the funding line. 

Project name and year of funding request: Mildew growth/bio-corrosion 
prevention using an antimicrobial coating on material surfaces; fiscal 
year 2006; 
Project description: The U.S. Army Natick Soldier Center submitted 
this project proposal for a total cost of $627,000, with an estimated 
ROI of 842:1. The project plan proposed demonstrating new processes 
for an alternative to the copper 8 coating system now in use for 
protection against material bio-degradation. The proposed alternative 
was an environmentally friendly coating system for fabric protection 
for use on tents, truck covers, helmets, parachutes, and other 
materials; 
Final status: This project was accepted by the Corrosion Office but 
not initially funded. According to a center official, the project was 
eventually funded by the Corrosion Office. The project is complete and 
a final project report was recently sent to the Corrosion Office, but 
no ROI validation was conducted as part of the final report. 

Project name and year of funding request: Remote monitoring of 
degradation of steel and reinforced thermoplastic composite bridges; 
fiscal year 2008; 
Project description: The U.S. Army Corps of Engineers, Engineer 
Research Development Center, submitted this project proposal for a 
total cost of $1.6 million split evenly between the Army Corps of 
Engineers and the Corrosion Office, and estimated an ROI of 6:1. The 
initial project plan scope focused on testing remote monitoring of 
Army non-metallic bridges to help identify corrosion or degradation 
where ordinary nondestructive testing methods cannot identify actively 
growing defects. After the Interstate 35W Bridge collapse in 
Minneapolis, Minnesota, in which corrosion and fatigue cracking were 
likely contributors, the Corrosion Office requested that the Army 
expand the scope of this project to include both non-metallic and 
metallic bridges. Because of the expanded scope, the Corrosion Office 
waived the $500,000 funding limit for this project. Engineers 
stated that part of the project was to monitor the I-20 Bridge near 
Vicksburg, Mississippi. Expansion of the scope included coordinating 
with the Department of Transportation, Federal Highways 
Administration, and the Illinois and Indiana Departments of 
Transportation; 
Final status: Prior to the refocusing of the project, engineers told 
us that it was accepted with some additional clarification required. 
Engineers were in the process of resubmitting the project proposal 
when the Corrosion Office requested the wider scope. This project was 
accepted and funded. The project is three-fourths complete. 

Project name and year of funding request: Alkali-activated zinc 
grouted anode cathodic protection system for concrete reinforcing 
steel; fiscal year 2008; 
Project description: The Naval Facilities Engineering Service Center, 
Pacific submitted this project proposal for a total cost of $1.2 
million, with $80,000 requested from the Corrosion Office. Its 
estimated ROI was 5:1. The project was to demonstrate the 
effectiveness of a discrete galvanic anode cathodic protection system 
as a means of mitigating corrosion and increasing the service life 
during the repair of the reinforced concrete Kilo Wharf at the Naval 
Base Guam; 
Final status: This project was accepted and funded. The project is 
still being implemented. Engineers told us that the project ran into 
some complications. For example, contractor estimates at the 
originally planned sites were much higher than the government 
estimates, so the facilities command had to find different sites for 
project implementation. 

Project name and year of funding request: Alternative 
backfill/galvanic anode cathodic protection for fuel storage tank 
bottoms; fiscal year 2010; 
Project description: The Naval Facilities Engineering Service Center, 
Pacific submitted this project proposal for a total cost of $450,000, 
with 56 percent requested from the Corrosion Office. The estimated ROI 
was 2:1. The project was to test results of a technical paper 
reporting that an improved backfill and/or galvanic anode system may 
provide better cathodic protection than current impressed systems; 
Final status: A center official noted that the Navy removed this project from 
funding consideration because (1) it could not find any matching funds 
and (2) there was no site selected to demonstrate the technology. 

Project name and year of funding request: High-rate paint stripper; 
fiscal year 2008; 
Project description: The Naval Air Systems Command submitted this 
project proposal for a total cost of $940,000, with 29 percent 
requested from the Corrosion Office. The project's estimated ROI was 
2:1. The project was to evaluate alternative paint removal technology 
that could be used (1) where spot paint removal is necessary for non-
destructive inspections and (2) at intermediate and depot-level 
facilities where larger scale removal of coating is required for 
inspections and repairs; 
Final status: This project was not accepted and not funded by the 
Corrosion Office. A command official noted that funding was obtained 
from other sources to complete this project. 

Source: GAO analysis of DOD documents and interviews with CPC project 
managers. 

Note: This appendix provides short summaries of the status of the 11 
projects that we discussed with the projects' program managers. 

[End of table] 

[End of section] 

Appendix III: Comments from the Department of Defense: 

Office of the Under Secretary of Defense: 
Acquisition, Technology and Logistics: 
3000 Defense Pentagon: 
Washington, DC 20301-3000: 

November 19, 2010: 

Mr. Jack E. Edwards: 	
Director, Defense Capabilities and Management: 
U.S. Government Accountability Office: 
441 G Street, N.W. 
Washington, DC 20548: 

Dear Mr. Edwards: 

This is the Department of Defense (DOD) response to the GAO Draft 
Report, GAO-11-84, "Defense Management: DOD Has a Rigorous Process to 
Select Corrosion Prevention Projects, but Would Benefit from Clearer 
Guidance and Validation of Returns on Investment," dated October 20, 
2010 (GAO Code 351447). Detailed comments on the report 
recommendations are enclosed. 

Sincerely, 

Signed by: 

Daniel J. Dunmire: 
Director: 
DoD Corrosion Policy and Oversight: 

Enclosure: As stated: 

[End of letter] 

GAO Draft Report Dated November 2010: 
GAO-11-84 (GAO CODE 351447): 

"Defense Management: DOD Has A Rigorous Process To Select Corrosion 
Prevention Projects, But Would Benefit From Clearer Guidance And 
Validation Of Returns On Investment" 

Department Of Defense Comments To The GAO Recommendations: 

Recommendation 1: The GAO recommends that the Secretary of Defense 
direct the Under Secretary of Defense (Acquisition, Technology and 
Logistics) to update applicable guidance, such as DOD Instruction 
5000.67, Prevention and Mitigation of Corrosion on DOD Military 
Equipment and Infrastructure or the DOD Corrosion Prevention and 
Mitigation Strategic Plan to further define the responsibilities of 
the military departments' Corrosion Executives, to include more 
specific oversight and review of the project proposals before and 
during the project selection process. (See page 27/GAO Draft Report.) 

DoD Response: Non-concur. DoDI 5000.67 currently states in section 
E3.b.5. that "The [Military Department Corrosion Control and 
Prevention Executive] CCPE shall, with coordination through the proper 
Military Department chain of command: (a) Provide to the Director, 
CPO, information on...2. Corrosion project opportunities to support 
the Director's responsibilities." Later in section E3.b.5. the CCPE is 
further given the responsibility to "(e) Support the CPC IPT process 
by...2. Submitting candidate military equipment and infrastructure 
corrosion prevention and mitigation projects during the annual project 
data call." 

DoD level policy documents delineate responsibilities to carry out the 
policy. They are high level documents. Specific implementing guidance 
such as "best practices" on how to accomplish those responsibilities 
are provided through separate documentation such as DoD Guidebooks or 
Manuals. A more appropriate document to communicate to the CCPEs the 
benefit of conducting their own review of project proposals before and 
during the selection process is either a Guidebook or Manual. The 
Corrosion Policy and Oversight office will be updating the DoD 
Corrosion Prevention and Control Planning Guidebook and begin the 
process of converting it into a DoD Manual in the next year. The "best 
practice" of the CCPEs conducting their own internal reviews before 
and during the project selection process will be included in that 
update. It is inappropriate to include such specific implementing 
guidance in a DoD Instruction. 

In addition, two of the military departments are already performing 
effective reviews of project proposals before submitting to DOD. For 
example, as stated in the report, DON has already established an 
internal process for oversight and evaluation of DON project 
proposals. This process was developed in accordance with Enclosure 2, 
Section 3, Paragraph 2, Subsection A of Department of Defense (DOD) 
Instruction 5000.67 and is reviewed annually to ensure competitive DON 
project proposal submissions. 

In summary, DoDI 5000.67 specifies what the military departments shall 
do. It does not nor should it specify how the military departments 
should perform these requirements. As observed in this report, two of 
the military departments established and executed an effective process 
for reviewing and submitting project plans without directions from DOD 
on how to do so. 

Recommendation 2: The GAO recommends that the Secretary of Defense 
direct the Under Secretary of Defense (Acquisition, Technology and 
Logistics) to modify the DOD Corrosion Prevention and Mitigation 
Strategic Plan to clearly specify and communicate the criteria used by 
the panel in evaluating CPC projects for funding consideration. This 
action should include listing and describing each criterion used by 
the panel in the preliminary and final project evaluation decisions 
and discussing how the criteria are to be used by the panel to decide 
on project acceptability. (See page 27/GAO Draft Report.) 

DoD Response: Non-concur. The Strategic Plan is updated each year to 
reflect any changes that affect actions to implement requirements and 
activities specified in the Strategic Plan. We will continue to do so 
with respect to any changes to project submission and selection, 
including criteria for selection. However, we disagree with the 
implication that the Strategic Plan is deficient in clearly specifying 
the criteria. And we disagree with the suggestion that added 
discussion is needed in the Strategic Plan regarding how the criteria 
are used by the panel. We find that a number of findings articulated 
in this report are erroneous and may have contributed to this 
conclusion and recommendation. 

On page 2, this report states that Military department stakeholders 
indicated that the procedure for evaluating proposals is not 
communicated clearly. It also states that criteria used for the 
project selection panel's evaluation of proposed projects are not 
clearly identified in the Strategic Plan. The criteria used by the 
panel and the steps in the process are completely transparent to the 
authors, and the details have been verbally communicated every year, 
prior to project submittals, to stakeholders through a briefing and 
Q&A by the evaluation panel chairman and the Corrosion Policy and 
Oversight office staff. The staff has also been available to answer 
any questions at any time, and has done so for people who are 
genuinely interested in receiving added information and support. The 
information on evaluation factors and criteria is available on line in 
the Strategic Plan, therefore project managers have access to the 
strategic plan where all criteria plus added guidelines are 
articulated. Furthermore, Appendix D of the Strategic Plan, which is 
titled Project Plan Instructions, has been extracted and distributed
separately to the services via email. That appendix specifies the 
content of the project plan, and includes a series of paragraphs with 
specific declarative statements that begin with "describe," "define," 
or other phrases that clearly indicate the requirements of the project 
plan. There should be no doubt that the information required in the 
plan will be used to evaluate the need, impact, quality, methodology, 
investment and cost avoidance of the technology proposed, all of which 
represent implicit evaluation factors. On pages 14 and 17, this report 
states that only some of the criteria used to evaluate proposals were 
clearly found in the Strategic Plan and that some criteria used by 
corrosion officials were grouped with other criteria not used in the 
selection process. And on page 16, this report states that one of the 
seven project assessment charts (ROI) not used to score projects 
though may be used as a guide was used to make project decisions. We 
do not agree that only some criteria are available in the Strategic 
Plan. While not always defined as "criteria," all factors considered 
in the evaluation are articulated in Appendix D. The added project 
plan instruction information contained in seven additional evaluation 
charts, that were not used specifically as evaluation criteria, were 
provided to project submitters as guidance for improving the quality 
of the content in the narrative portions of the project plan. The fact 
that ROI is addressed in one of the charts does not single out that 
chart as a separate criterion used to evaluate the projects — ROI is 
clearly addressed as a criterion elsewhere in Appendix D. The indices 
found in Attachment 2 of Appendix D are used in the DEA model to 
evaluate the relative affordability of each project, where each index 
represents the value of a key affordability variable. While not 
expressly defined as "criteria," these indices are clearly criteria 
from which anyone submitting a project plan can determine what is 
likely to improve the chances of a higher DEA ranking. The implication 
that service representatives at any level are not provided sufficient 
information to effectively compete is not consistent with the 
documentary and objective interactive evidence. 

On page 12, this report states that the panel used a different set of 
criteria for the preliminary review than for the final selection 
process; and that for the final review, the panel used criteria found 
in the Strategic Plan but not explicitly identified as the specific 
criteria used to evaluate projects. It also states that the project 
selection panel is an ad hoc group according to a panel member. On 
page 13, this report states that the panel used criteria for go no-go 
not made available to the submitters of the project proposals. 
Likewise, on page 14, this report states that the panel used a 
different set of criteria to make preliminary go no-go decision. None 
of these statements are true. Criteria are the same for both 
preliminary and final reviews and all criteria are available to 
stakeholders and project submitters as described in the previous 
paragraph. The preliminary review is designed to reveal fatal flaws in 
a project, many of which can be corrected. The final review includes 
closer scrutiny of a project proposal. This two-step process is 
designed for efficiency. The evaluation team is not an ad hoc working 
group. Members were selected based on their functional positions 
within DOD and knowledge and insight in major acquisition, technology, 
or logistics areas, and are permanently assigned until they
transfer to other positions or retire. Their combined experience, 
expertise and judgment are the key to effective, high quality 
evaluation results. 

Page 13 of this report states that the panel reviewed projects that 
were within the anticipated funding level to ensure a balance between 
the number of facilities and weapons projects identified for funding. 
Equal numbers of facilities and weapon projects are never an 
objective — that is evident throughout the history of project 
selection. We look at the project selection to see if there is 
reasonable balance between equipment and facilities. We do not 
equalize them. The simple fact is that the distribution has 
historically been approximately equal year to year, so no real 
discussion about it being out of balance has ever been needed. We have 
never established a balancing policy, because it has not been 
necessary. If things got out of balance, then we would deal with it 
through internal discussions and dialog with the corrosion executives 
and other stakeholders. 

Page 15 reports that corrosion executives said they were unfamiliar 
with the criteria used by project selection panel. The corrosion 
executives were briefed specifically on this process more than once. 
This included a full afternoon of briefings during the DOD Corrosion 
Conference in August 2009, and the subject is always covered in the 
Spring Corrosion Forum. Page 17 states that the process did not 
consider military department priorities. That is not an accurate 
statement. True, the process is designed to select the best projects 
regardless of military department or area of application — the 
objective is to choose those projects that can best leverage 
significant reductions in the cost and impact of corrosion on 
warfighting systems and infrastructure. While the panel does not 
initially rank projects using the military department priorities, the 
priorities have been used to establish final rankings when two or more 
projects submitted by military department are considered to be 
comparatively equal by the evaluators. Page 16 of the report states 
that the ratio of Corrosion Office funding to department funding is 
not cited as a reason for scoring project in Strategic Plan. This is 
true, but is not considered necessary to emphasize this ratio in 
Appendix D since the need for complementary funding is well understood 
by the military departments, and the use of the ratio is a variable in 
the DEA model, which has been explained in detail to the entire 
corrosion community. The data used for this ratio is a required input 
to the project plan, and is presented in a table that reflects the 
distribution of funding from the DOD and the military department. 

Other errors in the report are also noted. Page 5 of this report 
states that approximately $1.5 billion, or 15 percent, of these 
maintenance costs were estimated to be related to corrosion. But 
according to the most current data available, a July 2010 report from 
LMI titled "The Annual Cost of Corrosion for the Department of Defense 
Facilities and Infrastructure: 2007 — 2008 Update," the correct figure 
is $1.9 billion, or 11.7 percent (FY2008). Page 9 of this report 
states that The Corrosion Executive assembled a panel with members 
from each of the Navy's system commands to review the synopses. But 
the DON project evaluation process did not require assembly of a 
special panel. Rather, the CCPE utilized the existing membership of 
the DON Corrosion Cross-Functional Team (CFT). Pages 21 and 22 of this 
report states that the Product Teams are staffed by representatives 
from the military departments, and according to a Product Team member 
the chair of each Product Team is rotated among the three departments 
annually. The fact is that only the chair of the Facilities Working-
Level Integrated Product Team (WIPT) is rotated annually. And page 26 
of this report states that some continuing uncertainty about how the 
Corrosion Executives should fulfill their responsibilities may be 
limiting the positive impact that these positions could have on CPC 
efforts. But the DON CCPE feels that there is no uncertainty or 
ambiguity in the CCPE responsibilities outlined in DODI 5000.67. 

Recommendation 3: The GAO recommends that the Secretary of Defense 
direct the Under Secretary of Defense (Acquisition, Technology and 
Logistics) to develop and implement a plan to ensure that return on 
investment validations are completed as scheduled. This plan should be 
completed in coordination with the military department Corrosion 
Executives and include information on the time frame and source of 
funding required to complete the validations. (See page 27/GAO Draft 
Report.) 

DoD Response: Concur. Plans are already underway to address this 
requirement within the Corrosion Policy and Oversight Directorate and 
with the military department corrosion executives. 

[End of section] 

Appendix IV: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Jack Edwards, (202) 512-8246 or edwardsj@gao.gov: 

Staff Acknowledgments: 

In addition to the contact name above, the following staff members 
made key contributions to this report: Ann Borseth, Assistant 
Director; Janine Cantin; Foster Kerrison; Charles Perdue; Terry 
Richardson; Michael Shaughnessy; and Erik Wilkins-McKee. 

[End of section] 

Related GAO Products: 

Defense Management: Observations on Department of Defense and Military 
Service Fiscal Year 2011 Requirements for Corrosion Prevention and 
Control. [hyperlink, http://www.gao.gov/products/GAO-10-608R]. 
Washington, D.C.: April 15, 2010. 

Defense Management: Observations on the Department of Defense's Fiscal 
Year 2011 Budget Request for Corrosion Prevention and Control. 
[hyperlink, http://www.gao.gov/products/GAO-10-607R]. Washington, 
D.C.: April 15, 2010. 

Defense Management: Observations on DOD's Fiscal Year 2010 Budget 
Request for Corrosion Prevention and Control. [hyperlink, 
http://www.gao.gov/products/GAO-09-732R]. Washington, D.C.: June 1, 
2009. 

Defense Management: Observations on DOD's Analysis of Options for 
Improving Corrosion Prevention and Control through Earlier Planning in 
the Requirements and Acquisition Processes. [hyperlink, 
http://www.gao.gov/products/GAO-09-694R]. Washington, D.C.: May 29, 
2009. 

Defense Management: Observations on DOD's FY 2009 Budget Request for 
Corrosion Prevention and Control. [hyperlink, 
http://www.gao.gov/products/GAO-08-663R]. Washington, D.C.: April 15, 
2008. 

Defense Management: High-Level Leadership Commitment and Actions Are 
Needed to Address Corrosion Issues. [hyperlink, 
http://www.gao.gov/products/GAO-07-618]. Washington, D.C.: April 30, 
2007. 

Defense Management: Additional Measures to Reduce Corrosion of 
Prepositioned Military Assets Could Achieve Cost Savings. [hyperlink, 
http://www.gao.gov/products/GAO-06-709]. Washington, D.C.: June 14, 
2006. 

Defense Management: Opportunities Exist to Improve Implementation of 
DOD's Long-Term Corrosion Strategy. [hyperlink, 
http://www.gao.gov/products/GAO-04-640]. Washington, D.C.: June 23, 
2004. 

Defense Management: Opportunities to Reduce Corrosion Costs and 
Increase Readiness. [hyperlink, 
http://www.gao.gov/products/GAO-03-753]. Washington, D.C.: July 7, 
2003. 

Defense Infrastructure: Changes in Funding Priorities and Strategic 
Planning Needed to Improve the Condition of Military Facilities. 
[hyperlink, http://www.gao.gov/products/GAO-03-274]. Washington, D.C.: 
February 19, 2003. 

[End of section] 

Footnotes: 

[1] Department of Defense, Under Secretary of Defense (Acquisition, 
Technology and Logistics), Defense Science Board Report on Corrosion 
Control (Washington, D.C.: 2004). 

[2] Corrosion includes such varied forms as rusting; pitting; galvanic 
reaction; calcium or other mineral buildup; degradation due to 
ultraviolet light exposure; and mold, mildew, or other organic decay. 

[3] LMI, The Impact of Corrosion on the Availability of DOD Weapon 
Systems and Infrastructure (McLean, Virginia: 2009). 

[4] GAO, Defense Management: High-Level Leadership Commitment and 
Actions Are Needed to Address Corrosion Issues, [hyperlink, 
http://www.gao.gov/products/GAO-07-618] (Washington, D.C.: Apr. 30, 
2007). 

[5] The Bob Stump National Defense Authorization Act for Fiscal Year 
2003 required the Secretary of Defense to designate an officer, 
employee, board, or committee as the individual or office with this 
responsibility. See Pub. L. No. 107-314, § 1067 (2002) (codified at 10 
U.S.C. § 2228). The National Defense Authorization Act for Fiscal Year 
2008 amended this requirement by designating the Director of Corrosion 
Policy and Oversight as the official with these responsibilities. See 
Pub. L. No. 110-181, § 371 (2008) (amending § 2228). 

[6] Pub. L. No. 110-181, § 371 (2008) (amending 10 U.S.C. § 2228). 

[7] Duncan Hunter National Defense Authorization Act for Fiscal Year 
2009, Pub. L. No. 110-417, § 903 (2008). 

[8] S. Rep. No. 111-74, at 155-156 (2009). 

[9] Although the Report language refers to the military services, it 
is the Military Department Corrosion Control and Prevention Executives 
who, with coordination through the proper military department chain of 
command, provide information on corrosion project opportunities to the 
Director of the Corrosion Office. Our focus in this report is 
therefore on the military departments. 

[10] GAO, Defense Management: Observations on Department of Defense 
and Military Service Fiscal Year 2011 Requirements for Corrosion 
Prevention and Control, [hyperlink, 
http://www.gao.gov/products/GAO-10-608R] (Washington, D.C.: Apr. 15, 
2010). 

[11] DOD's Corrosion Prevention and Mitigation Strategic Plan suggests 
that follow-on reviews with validated ROIs are required for completed 
projects within the 3 years after full project implementation. 
Projects from fiscal year 2005 are the first projects to meet this 
requirement. 

[12] Department of Defense, Under Secretary of Defense (Acquisition, 
Technology and Logistics), DOD Annual Cost of Corrosion (Washington, 
D.C.: 2009). 

[13] According to the Corrosion Office, the $500,000 per project 
funding limit was introduced for the fiscal year 2006 project 
selection process to enable more projects to be funded. 

[14] GAO, Defense Management: Observations on the Department of 
Defense's Fiscal Year 2011 Budget Request for Corrosion Prevention and 
Control, [hyperlink, http://www.gao.gov/products/GAO-10-607R] 
(Washington, D.C.: Apr. 15, 2010); and GAO-10-608R. 

[15] [hyperlink, http://www.gao.gov/products/GAO-10-608R]. 

[16] Pub. L. No. 110-417, § 903 (2008). 

[17] GAO, Internal Control: Standards for Internal Control in the 
Federal Government, [hyperlink, 
http://www.gao.gov/products/GAO/AIMD-00-21.3.1] (Washington, D.C.: 
November 1999). 

[18] The panel member from Logistics and Materiel Readiness, 
Maintenance Policy and Programs did not participate in the project 
selection meetings we observed. 

[19] The criteria used for the preliminary evaluation include whether 
the proposed project requires greater than $500,000 of Corrosion 
Office funds to complete, uses similar technology to a previously 
approved project, or is anticipated to take more than 2 years to 
complete. The preliminary evaluation did not consider the joint 
applicability of the project, but this was a criterion in the final 
project evaluation. 

[20] The judgmental criteria are: joint applicability, readiness 
impact, safety impact, logistics benefits, and anticipated 
contribution of the project to reducing the cost of corrosion. 
Corrosion Office officials told us that they believe the criteria to 
be clearly identified in the DOD Corrosion Prevention and Mitigation 
Strategic Plan. 

[21] GAO, Results-Oriented Cultures: Implementation Steps to Assist 
Mergers and Organizational Transformations, [hyperlink, 
http://www.gao.gov/products/GAO-03-669] (Washington, D.C.: July 2, 
2003). 

[22] GAO, Executive Guide: Effectively Implementing the Government 
Performance and Results Act, [hyperlink, 
http://www.gao.gov/products/GAO/GGD-96-118] (Washington, D.C.: June 
1996). 

[23] The strategic plan does not mention that the ratio of Corrosion 
Office funding requested to military department matching funds will be 
used to evaluate projects. Instead, the plan includes a concept called 
"management support," and proposals where "management actively supports" 
the project are categorized as "low risk." Although active management 
support includes resources such as funding, other resources are also 
listed. 

[24] Project managers told us that project proposals were rejected due 
to previously funded technologies being proposed for new projects. 
They added that this severely limited their ability to develop 
corrosion prevention technologies. The DOD Corrosion Prevention and 
Mitigation Strategic Plan categorizes projects that use "mature 
technology" as "low risk" while projects with "undemonstrated 
technology" are categorized as "high risk." 

[25] The Strategic Plan includes a template spreadsheet for project 
managers to use to calculate the net present value of the projects. 
This template accounts for the time value of money by discounting the 
future benefits expected by the project in terms of their net present 
value, and computes the ratio of these benefits to the present value 
of the costs. The discount rate used by the template is 7 percent, 
recommended by Office of Management and Budget, Circular No. A-94: 
Guidelines and Discount Rates for Benefit-Cost Analysis of Federal 
Programs (Washington, D.C.: 1992), for use in analyzing benefits and 
costs of public investments. 
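
For illustration only, the following minimal Python sketch shows the 
type of calculation described above; the yearly cost and benefit 
figures are hypothetical and are not drawn from the DOD template or 
from any DOD project: 

# Illustrative sketch of a discounted benefit-to-cost (ROI) calculation.
# All dollar figures are hypothetical; the 7 percent rate is the OMB
# Circular A-94 rate cited above.
DISCOUNT_RATE = 0.07

costs_by_year = [250_000, 250_000]                    # hypothetical implementation costs, years 1-2
benefits_by_year = [0, 0, 400_000, 400_000, 400_000]  # hypothetical savings beginning in year 3

def present_value(cash_flows, rate):
    # Discount each year's cash flow back to the start of the project.
    return sum(amount / (1 + rate) ** year
               for year, amount in enumerate(cash_flows, start=1))

pv_benefits = present_value(benefits_by_year, DISCOUNT_RATE)
pv_costs = present_value(costs_by_year, DISCOUNT_RATE)
print(f"Discounted benefit-to-cost ratio: {pv_benefits / pv_costs:.1f}:1")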

[26] The DOD Corrosion Prevention and Mitigation Strategic Plan states 
that data "should be" updated after 2 or 3 years of actually using the 
technology (following the 2-year implementation period). Corrosion 
Office officials told us that they expected the validations to be 
completed within 5 years of initial project funding. 

[27] One fiscal year 2005 weapons project has completed ROI 
validation. This Marine Corps project's ROI increased from an 
estimated 15:1 to a validated 17:1. 

[28] The funding process for CPC activities is described in 
[hyperlink, http://www.gao.gov/products/GAO-10-608R] and [hyperlink, 
http://www.gao.gov/products/GAO-10-607R]. 

[29] Department of Defense Instruction 5000.67, Prevention and 
Mitigation of Corrosion on DOD Military Equipment and Infrastructure 
(Feb. 1, 2010). 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO’s actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO’s Web site, 
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: