This is the accessible text file for GAO report number GAO-12-22 
entitled 'Arizona Border Surveillance Technology: More Information on 
Plans and Costs Is Needed before Proceeding' which was released on 
November 4, 2011. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as 
part of a longer term project to improve GAO products' accessibility. 
Every attempt has been made to maintain the structural and data 
integrity of the original printed product. Accessibility features, 
such as text descriptions of tables, consecutively numbered footnotes 
placed at the end of the file, and the text of agency comment letters, 
are provided but may not exactly duplicate the presentation or format 
of the printed version. The portable document format (PDF) file is an 
exact electronic replica of the printed version. We welcome your 
feedback. Please E-mail your comments regarding the contents or 
accessibility features of this document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

United States Government Accountability Office: 
GAO: 

Report to Congressional Committees: 

November 2011: 

Arizona Border Surveillance Technology: 

More Information on Plans and Costs Is Needed before Proceeding: 

GAO-12-22: 

GAO Highlights: 

Highlights of GAO-12-22, a report to congressional committees. 

Why GAO Did This Study: 

In recent years, nearly half of all annual apprehensions of illegal 
aliens along the entire Southwest border with Mexico have occurred 
along the Arizona border. Keeping illegal flows of people and drugs 
under control remains a top priority for the Department of Homeland 
Security’s (DHS) U.S. Customs and Border Protection (CBP). In 2005, 
the Secure Border Initiative Network (SBInet) was conceived as a 
surveillance technology to create a “virtual fence” along the border. 
After spending nearly $1 billion, DHS deployed SBInet systems along 53 
miles of Arizona’s border that represent the highest risk for illegal 
entry. In January 2011, in response to concerns regarding SBInet’s 
performance, cost, and schedule, DHS canceled future procurements. CBP 
developed the Arizona Border Surveillance Technology Plan (Plan) for 
the remainder of the Arizona border. CBP has requested $242 million for this Plan for fiscal year 2012. GAO was asked to assess the extent to
which CBP (1) has the information needed to support and implement the 
Plan and (2) estimated life-cycle costs for future investments in 
accordance with best practices. GAO analyzed Plan documents and cost 
estimates, compared those estimates with best practices, and 
interviewed CBP officials. 

What GAO Found: 

CBP does not have the information needed to fully support and 
implement its Arizona Border Surveillance Technology Plan in 
accordance with DHS and Office of Management and Budget (OMB) 
guidance. In developing the Plan, CBP conducted an analysis of 
alternatives and outreach to potential vendors. However, CBP has not 
documented the analysis justifying the specific types, quantities, and 
deployment locations of border surveillance technologies proposed in 
the Plan. Best practices for developing and managing costs indicate 
that a business case analysis should be rigorous enough that 
independent parties can review it and clearly understand why a 
particular alternative was chosen to support mission requirements. 
Without documentation of the analysis, there is no way to verify the 
process CBP followed, identify how the underlying analyses were used, 
assess the validity of the decisions made, or justify the funding 
requested for the Plan. CBP officials also have not yet defined the 
mission benefits expected from implementing the new Plan. GAO has 
previously reported that a solid business case providing an 
understanding of the potential return of large investments can be 
helpful to decision makers for determining whether continued 
investment is warranted after deployment. Defining the expected 
benefit could help improve CBP’s ability to assess the effectiveness 
of the Plan as it is implemented. CBP does not intend to assess and 
address operational issues regarding the effectiveness and suitability 
of SBInet, steps that could provide CBP with information to help make 
decisions regarding alternatives for implementing the Plan. OMB 
guidance suggests that a post-implementation review occur when a 
system has been in operation for 6 months or immediately following 
investment termination. Such a review could help CBP make the most 
effective use of existing SBInet systems that, in connection with the 
Plan, could build a comprehensive and integrated approach for 
surveillance technology along the entire Arizona border. 

CBP’s 10-year life-cycle cost estimate for the Plan of $1.5 billion is 
based on a rough order of magnitude analysis, and agency officials 
were unable to determine a level of confidence in their estimate as 
best practices suggest. Specifically, GAO's review concluded that the estimate substantially met best practices for being comprehensive and accurate, but it did not
sufficiently meet other characteristics of a high-quality cost 
estimate, such as credibility, because it did not identify a level of 
confidence or quantify the impact of risks. GAO and OMB guidance 
emphasize that reliable cost estimates are important for program 
approval and continued receipt of annual funding. In addition, because 
CBP was unable to determine a level of confidence in its estimate, it 
will be difficult for CBP to determine what levels of contingency 
funding may be needed to cover risks associated with implementing new 
technologies along the remaining Arizona border. Thus, it will be 
difficult for CBP to provide reasonable assurance that its cost 
estimate is reliable and that its budget request for fiscal year 2012 
and beyond is realistic and sufficient. A robust cost estimate—one 
that includes a level of confidence and quantifies the impact of risk—
would help ensure that CBP’s future technology deployments have 
sufficient funding levels related to the relative risks. 

What GAO Recommends: 

GAO recommends that CBP document the analysis justifying the 
technologies proposed in the Plan, determine its mission benefits, 
conduct a post-implementation review of SBInet, and determine a more 
robust life-cycle cost estimate for the Plan. DHS concurred with the 
recommendations. 

View [hyperlink, http://www.gao.gov/products/GAO-12-22]. For more information, contact Richard Stana at (202) 512-8777 or StanaR@gao.gov. 

[End of section] 

Contents: 

Letter: 

Background: 

CBP Does Not Have the Information Needed to Fully Support and 
Implement Its Plan: 

CBP's Cost Estimate Reflects Some but Not All Key Cost-Estimating Best 
Practices: 

Conclusions: 

Recommendations: 

Agency Comments and Our Evaluation: 

Appendix I: Objectives, Scope, and Methodology: 

Appendix II: Photographs of Technologies Contained in the Arizona 
Border Surveillance Technology Plan: 

Appendix III: Aspects of High-Quality Cost Estimates: 

Appendix IV: Comments from the Department of Homeland Security: 

Appendix V: GAO Contact and Staff Acknowledgments: 

Related GAO Products: 

Tables: 

Table 1: U.S. Border Patrol Apprehensions for the Southwest Border and 
Arizona: 

Table 2: U.S. Border Patrol Marijuana Seizures for the Southwest 
Border and Arizona (in pounds): 

Table 3: Results for AOA of Four Technology Alternatives for Arizona: 

Table 4: Extent to which CBP's Arizona Border Surveillance Technology 
Plan Cost Estimate Meets Best Practices: 

Table 5: The 12 Steps of High-Quality Cost Estimating Mapped to the 
Characteristics of a High-Quality Cost Estimate: 

Figures: 

Figure 1: Mobile Surveillance System (MSS): 

Figure 2: Mobile Video Surveillance System (MVSS): 

Figure 3: Integrated Fixed Tower Concept (SBInet Tower): 

Figure 4: Air Support (Unmanned Aerial System): 

Figure 5: Long Range Handheld Thermal Imaging System (RECON III): 

Figure 6: Agent Portable Surveillance System (APSS): 

Figure 7: Remote Video Surveillance System (RVSS): 

Abbreviations: 

AAR: after-action review: 

AOA: analysis of alternatives: 

APSS: Agent Portable Surveillance System: 

ATEC: Army Test and Evaluation Command: 

CBP: U.S. Customs and Border Protection: 

COP: Common Operating Picture: 

DHS: Department of Homeland Security: 

HSI: Homeland Security Studies and Analysis Institute: 

IFT: integrated fixed tower: 

LCCE: life-cycle cost estimate: 

MSS: Mobile Surveillance System: 

MVSS: Mobile Video Surveillance System: 

OMB: Office of Management and Budget: 

OTIA: Office of Technology Innovation and Acquisition: 

Plan: Arizona Border Surveillance Technology Plan: 

ROM: rough order of magnitude: 

RVSS: Remote Video Surveillance System: 

SBI: Secure Border Initiative: 

SBInet: Secure Border Initiative Network: 

[End of section] 

United States Government Accountability Office: 
Washington, DC 20548: 

November 4, 2011: 

The Honorable Peter T. King: 
Chairman: 
The Honorable Bennie G. Thompson: 
Ranking Member: 
Committee on Homeland Security: 
House of Representatives: 

The Honorable Candice S. Miller: 
Chairman: 
The Honorable Henry Cuellar: 
Ranking Member: 
Subcommittee on Border and Maritime Security: 
Committee on Homeland Security: 
House of Representatives: 

Securing the Arizona portion of the approximately 2,000 miles of 
southwest border that the United States shares with Mexico--while 
keeping illegal flows of people and drugs under control--is a top 
priority for the Department of Homeland Security's (DHS) U.S. Customs 
and Border Protection (CBP). In recent years, nearly half of all 
annual apprehensions of illegal aliens along the entire southwest 
border with Mexico have occurred along the Arizona border, but that 
number has been steadily decreasing. DHS's Office of Immigration 
Statistics reported in June 2011 that the number of apprehensions of people entering the country illegally in 2010 reflects the fifth consecutive year-to-year decrease and is now at its lowest level since the early 1970s. As reflected in table 1, that trend holds for both the southwest border as a whole and Arizona in particular. On the other hand, Arizona remains the highest-risk area for illegal trafficking in marijuana, not only because of the upward trend in the number of pounds of marijuana seized by the Border Patrol but also because nearly half of all marijuana seizures along the southwest border are made in Arizona alone, as reflected in table 2. 

Table 1: U.S. Border Patrol Apprehensions for the Southwest Border and 
Arizona: 

Border Location: Southwest; 
FY 2005: 1,171,396; 
FY 2006: 1,071,972; 
FY 2007: 858,638; 
FY 2008: 705,005; 
FY 2009: 540,865; 
FY 2010: 447,731. 

Border Location: Arizona; 
FY 2005: 577,517; 
FY 2006: 510,623; 
FY 2007: 416,231; 
FY 2008: 326,059; 
FY 2009: 248,624; 
FY 2010: 219,318. 

Source: CBP. 

Note: In the first half of fiscal year 2011 (Oct. 1, 2010, to April 1, 2011), Arizona's apprehensions were 69,722; if that rate continued through the end of fiscal year 2011, total apprehensions for fiscal year 2011 would be lower than in fiscal year 2010. 

[End of table] 

Table 2: U.S. Border Patrol Marijuana Seizures for the Southwest 
Border and Arizona (in pounds): 

Border Location: Southwest; 
FY2005: 1,194,427; 
FY2006: 1,362,376; 
FY2007: 1,852,525; 
FY2008: 1,632,169; 
FY2009: 2,550,187; 
FY2010: 2,417,170. 

Border Location: Arizona; 
FY2005: 525,145; 
FY2006: 662,650; 
FY2007: 946,718; 
FY2008: 846,260; 
FY2009: 1,256,397; 
FY2010: 1,070,647. 

Source: CBP. 

Note: In the first half of fiscal year 2011 (Oct. 1, 2010, to April 1, 2011), Arizona marijuana seizures were 566,699 pounds; if that rate continued through the end of fiscal year 2011, total seizures for fiscal year 2011 would be higher than in fiscal year 2010. 

[End of table] 
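
The projections in the notes to tables 1 and 2 follow from a simple doubling of the half-year figures, assuming the rates hold constant for the full fiscal year; the comparison below is an illustrative check, not part of CBP's reporting.

\[ 2 \times 69{,}722 = 139{,}444 < 219{,}318 \quad \text{(projected fiscal year 2011 vs. fiscal year 2010 apprehensions)} \]

\[ 2 \times 566{,}699 = 1{,}133{,}398 > 1{,}070{,}647 \quad \text{(projected fiscal year 2011 vs. fiscal year 2010 seizures, in pounds)} \]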

CBP began development of the Secure Border Initiative Network (SBInet) 
in 2005 as a combination of surveillance technologies that relied 
primarily on radar and camera towers to create a "virtual fence" along 
the southwest border in order to enhance CBP's capability to detect, 
identify, classify, track, and respond to illegal breaches at and 
between land ports of entry. After 5 years and a cost of nearly $1 
billion, SBInet systems are now deployed along the 53 miles of 
Arizona's 387-mile border with Mexico that represent the highest risk 
area for illegal entry attempts. 

In January 2011, in response to internal and external assessments that 
identified concerns regarding the performance, cost, and schedule for 
implementing the systems, the Secretary of Homeland Security announced 
the cancellation of further procurements of SBInet systems.[Footnote 
1] However, CBP plans to continue to operate the existing SBInet 
systems and received $26.4 million in fiscal year 2011 funding for 
operations and maintenance of the systems. CBP estimates that 
continued operation and support of the SBInet systems will cost $10 
million in fiscal year 2012 and that these costs will continue for the 
foreseeable future. 

CBP has taken steps to develop and implement a new Arizona Border 
Surveillance Technology Plan (the Plan) for the remainder of the 
Arizona border. This Plan is the first step in a multiyear, 
multibillion-dollar effort to secure the southwest border. The Plan is intended to identify, acquire, and deploy additional surveillance technologies, in types and quantities suited to the varying terrain along the Arizona border, to enhance situational awareness of illegal intrusions. In addition to the $185 million CBP already allocated in 
fiscal year 2011, CBP has requested $242 million to fund the new Plan 
for fiscal year 2012 and estimates that the total costs of acquiring 
and maintaining all of the proposed new systems for the Arizona border 
over their expected 10-year life-cycle will be about $1.5 billion. 

Because of the high cost and challenges faced by CBP's development of 
SBInet and the importance of the revised plan, you asked us to review 
CBP's plans for developing and implementing a new approach for using 
surveillance technology along the remainder of the southwest border in 
Arizona. As agreed, our objectives were to determine (1) the extent to which CBP has supported and implemented its Arizona Border Surveillance Technology Plan in accordance with DHS and Office of Management and Budget (OMB) guidance, and (2) the extent to which CBP's estimated life-cycle costs for the Arizona Border Surveillance Technology Plan reflect best practices. 

To address our first objective, we reviewed key program-planning 
documents CBP relied on to support its new approach to identifying, 
acquiring, and deploying surveillance technology applicable to 
specific types of terrain along the Arizona border and compared them 
with requirements in DHS acquisition guidance, including Acquisition Management Directive 102-01, and OMB Circular A-11. We also 
interviewed CBP officials responsible for assessing the need for and 
documenting the cost and operational effectiveness and suitability of 
proposed systems to support its Arizona Border Surveillance Technology 
Plan (Plan) and for identifying appropriate metrics to assess progress 
in border security. We also assessed documents and evaluations of the 
SBInet system developed and deployed in Arizona's Tucson sector from 2005 
through 2010 and CBP's plans for SBInet's operation and maintenance 
over its life-cycle. In doing so, we reviewed key program 
documentation that describes the operational benefits of SBInet and 
the Army Test and Evaluation Command's (ATEC) reports and briefing to 
CBP on operational test findings. We interviewed Army leadership 
involved in the design and implementation of the operational test and 
evaluation of test results in order to determine the reliability of 
the information we used to support our finding. We determined that the 
test results were sufficiently reliable for the purposes of this 
report. We also interviewed officials from CBP's Office of Technology 
Innovation and Acquisition (OTIA) on how they intended to use the 
operational test findings and recommendations to inform the continuing 
operation of existing SBInet technology.[Footnote 2] 

To address our second objective, we reviewed cost and budget documents 
CBP relied on to support cost estimates for technology alternatives 
contained in the "analysis of alternatives" (AOA) for Arizona and in 
the President's budget request for fiscal year 2012.[Footnote 3] We 
also interviewed OTIA program officials and contractors responsible 
for estimating the cost of future investments in surveillance 
technology, specifically the life-cycle approach, requirements 
development and management, test management, and risk management. We 
then compared this information to relevant federal cost-estimating 
guidance, derived from leading government and industry practices. 
[Footnote 4] To assess the reliability of the cost data for the rough 
order of magnitude estimate for implementation of the Plan, which 
assumed a 10-year life-cycle for the acquisition, we relied on data 
for fiscal year 2010 and beyond to support the findings in the report. 
To assess the reliability of the data that we used to support the 
findings in this report, we reviewed relevant program documentation to 
substantiate evidence obtained through interviews with knowledgeable 
agency officials, where available, regarding the integrity of the 
data. We determined that these data are sufficiently reliable for the 
purposes of this report. 

We conducted this performance audit from March 2011 through October 
2011 in accordance with generally accepted government auditing 
standards. Those standards require that we plan and perform the audit 
to obtain sufficient, appropriate evidence to provide a reasonable 
basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for 
our findings and conclusions based on our audit objectives. Further 
details of our objectives, scope, and methodology are in appendix I. 

Background: 

In November 2005, DHS announced the launch of the Secure Border 
Initiative (SBI), a multiyear, multibillion-dollar program aimed at 
securing U.S. borders and reducing illegal immigration. CBP is the 
lead agency within DHS responsible for the development and deployment 
of SBI technology (e.g., cameras, sensors, radars, and tactical 
communications) and tactical infrastructure (e.g., pedestrian and 
vehicle fences, roads, and lighting). In July 2010, CBP announced the 
formation of OTIA, which was created to ensure all of CBP's technology 
efforts are properly focused on the mission and are well integrated, 
and to strengthen CBP's expertise and effectiveness in program 
management and acquisition. OTIA assumed the responsibilities of the 
former SBI program office that previously managed SBInet. 

SBInet was intended to cover the entire southwest border with an 
integrated set of fixed sensor towers. These towers were to transmit radar and camera information to a centralized location, where the information was to be integrated to create a Common Operating Picture (COP) at workstations staffed at all times by Border Patrol agents. SBInet's initial deployment, known as Block 1, covers 53 miles of the Arizona border, where it continues to be used by the Border Patrol. Since its inception, SBInet experienced continued and repeated technical problems, cost overruns, and schedule delays, which raised serious questions about SBInet's ability to meet the Border Patrol's needs for 
surveillance technology along the border. We have issued 26 reports 
and testimonies identifying operational and program management 
weaknesses that contributed to SBInet's performance shortfalls, 
including cost overruns and schedule slippages. For example, in 
September 2008 and May 2010, we reported on deficiencies in CBP's 
timely preparation and completion of key acquisition documents 
essential to setting operational requirements, identifying and 
mitigating risks, and establishing the cost, schedule, and performance 
of the project and the technology to be delivered.[Footnote 5] We also 
reported that key acquisition documents, such as a risk management 
plan, were not prepared and approved for SBInet prior to the start of 
the acquisition process, an omission that precluded a fully informed design for a system that would meet CBP's needs within the expected time frame. In May 2010, we made a number of recommendations to enhance CBP's acquisition of SBInet systems. DHS agreed with 10 of our recommendations, partially agreed with two, and detailed actions planned to address each. These recommendations included one to respond to a departmentwide reassessment of the program. 

In January 2010, the Secretary of Homeland Security ordered a 
departmentwide reassessment of the SBInet program to consider options 
that may more efficiently, effectively, and economically meet the 
nation's border security needs. The assessment focused on two key 
questions: 

* whether the SBInet program was viable and could be made to work 
effectively and fulfill the intent of the program and: 

* whether SBInet was cost-effective. 

After receiving the results of the assessment, in January 2011, the 
Secretary of Homeland Security announced that the department had 
concluded that SBInet systems were not appropriate for the entire 
southwest border and did not meet current standards for viability and 
cost-effectiveness. While the department would continue to use those 
elements of SBInet that were useful, the Secretary announced that the 
department was canceling further deployments of SBInet systems using 
the current contract. 

In its place, DHS is implementing a new approach for acquiring and 
deploying border security technology called Alternative (Southwest) 
Border Technology. As the approach's first step, CBP's Plan is to 
deploy a mix of technologies to complete coverage of the Arizona 
border including integrated fixed-tower (IFT) systems, Remote Video 
Surveillance Systems (RVSS),[Footnote 6] Mobile Surveillance Systems 
(MSS),[Footnote 7] hand-held equipment, and unattended ground sensors. 
CBP plans to deploy five IFT systems each comprising about 10 radar-
and-camera-equipped towers and integrate their signals into a system 
command center. According to CBP officials, though similar, the IFT 
systems' equipment will be simpler when compared with the equipment 
for the tower systems deployed under SBInet.[Footnote 8] Thus, CBP's 
plans include the currently deployed SBInet system for 53 miles in 
Arizona along with the new Plan to acquire and deploy additional 
towers, mobile surveillance equipment, unattended ground sensors, and 
hand-held devices to secure the rest of the Arizona border. CBP 
estimates that the total life-cycle cost of the new Plan will be about 
$1.5 billion for Arizona. In fiscal year 2011, CBP allocated $185 
million to procure border surveillance technologies contained in the 
Plan except for the new IFT systems. The agency has requested $242 
million in fiscal year 2012 appropriations to procure the first three 
IFT systems also included in the Plan. 

CBP Does Not Have the Information Needed to Fully Support and 
Implement Its Plan: 

CBP does not have the information needed to fully support and 
implement its Plan in accordance with DHS and OMB guidance. To develop 
this Plan, CBP conducted an analysis of alternatives (AOA) and 
outreach to potential vendors, and took other steps to test the 
viability of the current system. However, CBP has not: 

* documented the analysis justifying the specific types, quantities, 
and deployment locations of border surveillance technologies proposed 
in the Plan; 

* defined the mission benefits or developed performance metrics to 
assess its implementation of the Plan; or: 

* developed a plan to assess and address operational issues with the 
continuing use of SBInet systems along the highest risk section of the 
border that could affect the new Plan's implementation across the 
remainder of Arizona. 

For these reasons, CBP's newly proposed approach is at an increased 
risk of not accomplishing its goal in support of Arizona border 
security. 

CBP, in Developing a Business Case for Its New Approach, Conducted an 
Analysis of Alternatives: 

At the Secretary of Homeland Security's direction, CBP has adopted a 
new approach for developing a technology plan for surveillance at the 
border that includes development of a business case to justify the way 
forward. CBP officials told us their business case consists of the 
Arizona Border Surveillance Technology Plan and a phased independent 
analysis of alternatives (AOA).[Footnote 9] 

According to CBP officials, the development of the Arizona Border 
Surveillance Technology Plan consisted of a two-step process. First, 
the Homeland Security Studies and Analysis Institute (HSI) was 
enlisted to conduct a multipart AOA beginning with Arizona.[Footnote 
10] Second, using the AOA, the Border Patrol conducted an operational 
assessment of border surveillance technologies to identify the 
appropriate mix of technologies required to gain situational awareness 
and manage the Arizona border area. HSI's AOA considered four 
technology alternatives: (1) agent-centric hand-held devices, (2) 
integrated fixed-tower systems, (3) mobile surveillance equipment, and 
(4) unmanned aerial vehicles. These technology alternatives were 
analyzed in four representative geographic areas of Arizona. The AOA 
for Arizona found that integrated fixed-tower systems, like the other 
technology alternatives, represent the most effective choice only in 
certain circumstances and that there is no one technology alternative 
that is appropriate for the entire Arizona border. A summary of the 
conclusions reached for each of the four alternatives examined is 
presented in table 3. 

Table 3: Results for AOA of Four Technology Alternatives for Arizona: 

Technology alternative: Integrated fixed towers (IFTs); 
Conclusions: IFTs had significant information technology infrastructure costs, and their cost-effectiveness depended on the area to be covered; cost-effectiveness could be significant over moderately sized areas of largely open or rolling terrain. 

Technology alternative: Mobile surveillance equipment; 
Conclusions: Somewhat lower in cost and providing slightly less 
coverage than integrated fixed towers. The AOA also noted that mobile 
surveillance equipment had significant personnel costs but that the 
costs were generally well defined. 

Technology alternative: Hand-held devices; 
Conclusions: The AOA concluded that agent-centric hand-held devices were the lowest in cost but provided the smallest increase in coverage. 

Technology alternative: Unmanned aerial vehicles; 
Conclusions: This alternative had significant infrastructure costs 
with the highest cost risk, but could provide significantly more 
coverage in areas with rugged terrain. 

Source: GAO analysis based on the SBInet analysis of alternatives. 

[End of table] 

Unattended ground sensors were not included in the analysis because 
they were considered part of the existing baseline of technology and 
would co-exist with all of the alternatives in the AOA. In the AOA, 
HSI noted that its analysis did not, among other things, identify the 
optimal combination of specific equipment and systems, measure the 
contribution of situational awareness to achieving control of the 
border, or quantify the number of apprehensions that may result from 
the deployment of any technology solution. 

Upon completion of the AOA, in July 2010, the Secretary of Homeland 
Security directed the AOA study team to seek independent validation of 
its work. In response, HSI assembled an independent review team 
composed of senior subject matter experts with expertise in border 
security, operational testing, acquisition, performance measurement, 
and the management and execution of AOAs to evaluate the AOA for 
Arizona. In its final report, issued in March 2011, the review team concluded that the AOA for Arizona appeared to have successfully answered the questions asked and drew appropriate conclusions and insights that should be useful to DHS and CBP.[Footnote 11] CBP officials said they planned to conduct additional analyses of alternatives to incorporate additional technologies and Border Patrol sectors.[Footnote 12] 

Following the completion of the AOA, the Border Patrol conducted its 
operational assessment, which included a comparison of alternative 
border surveillance technologies and an analysis of operational 
judgments to consider both effectiveness and cost. According to CBP 
officials, they started with the results of the AOA for Arizona, 
noting that the AOA considered the technologies in terms of the trade-offs between capability and cost--but did not document the quantities 
of each technology needed, the appropriate mix of the technologies, or 
how a proposed mix of technologies would be applied to specific border 
areas. CBP officials stated that a team of Border Patrol agents 
familiar with the Arizona terrain determined the appropriate quantity 
and mix of technologies by considering the terrain in each area under 
consideration and which mix of technologies appeared to work for that 
area and terrain. These officials also stated that they used an iterative process involving dialogue between trained engineers and Border Patrol agents, based on the team's understanding of topography and technology, that started with the lowest-cost mix of technologies and assessed whether the situational awareness provided by that mix sufficiently met the threat. As a result, according to CBP 
officials, if the least expensive technology, such as hand-held 
portable equipment, met the threat, then that technology would be 
chosen. If the threat was not addressed by the hand-held technologies, 
then officials said the team considered the next higher cost 
technology. The officials added that the IFT systems were the most 
expensive. 
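
The iterative, lowest-cost-first selection process described above can be summarized in a brief sketch. The Python fragment below is purely illustrative; the technology names, relative costs, and awareness scores are hypothetical assumptions, and the "meets the threat" check stands in for the qualitative judgment made by Border Patrol agents and engineers rather than any CBP algorithm or data.

def meets_threat(tech, required_awareness):
    # Stand-in for the team's qualitative judgment about whether a
    # technology mix provides sufficient situational awareness.
    return tech["awareness"] >= required_awareness

def select_technology(candidates, required_awareness):
    # Consider options from least to most expensive; stop at the first
    # one judged sufficient for the area's terrain and threat.
    for tech in sorted(candidates, key=lambda t: t["cost"]):
        if meets_threat(tech, required_awareness):
            return tech
    # If nothing less expensive suffices, the costliest option remains.
    return max(candidates, key=lambda t: t["cost"])

candidates = [
    {"name": "hand-held devices", "cost": 1, "awareness": 0.3},
    {"name": "mobile surveillance equipment", "cost": 2, "awareness": 0.6},
    {"name": "integrated fixed towers", "cost": 3, "awareness": 0.9},
]
print(select_technology(candidates, required_awareness=0.5)["name"])

For an area where the hypothetical hand-held option falls short, the sketch returns the mobile surveillance option, mirroring the officials' description of stepping up to the next higher cost technology only when a less expensive mix does not meet the threat.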

CBP Did Not Document How It Derived the Specific Types and Quantities 
of Technologies Contained in Its Arizona Border Surveillance 
Technology Plan: 

CBP has taken a number of steps to develop the Plan; however, program 
officials developed and proposed the new Plan without documenting the 
analysis justifying the specific types, quantities, and deployment 
locations of border surveillance technologies CBP proposed. While the 
AOA process itself was well documented, the Border Patrol's 
operational assessment, a key analytical component leading to the 
Plan, was not transparent because of the lack of documentation. 

The Plan includes quantities of various technologies, prioritized and 
planned for implementation on a yearly basis. Specifically, based on 
the Plan, CBP allocated $185 million to purchase border surveillance 
technologies including mobile and hand-held equipment as well as RVSS 
for fiscal year 2011, and has requested $242 million for fiscal year 
2012 to acquire and deploy three IFT systems in Arizona, with two 
others to be deployed by 2015, depending on funding availability. 

Without documentation of the analysis justifying the specific types, 
quantities, and deployment locations of border surveillance 
technologies proposed in the Plan, an independent party cannot verify 
the process followed, identify how the AOA was used, determine whether 
CBP's use of the AOA considered the limitations identified by HSI, 
assess the validity of the decisions made, or justify the funding 
requested. Given that the number of apprehensions of illegal border 
crossers is at the lowest level in 40 years, if threats in the 
southwest border environment continue to change and the Plan otherwise 
requires updating or revision, it will be difficult for CBP officials 
to reassess the rationale for and determine what, if any, changes are 
needed in the types, quantities, and deployment locations of border 
surveillance technologies called for in the Plan. 

Internal control standards for the federal government 
call for agencies to promptly record and clearly document transactions 
and significant events to maintain their relevance and value to 
management in controlling operations and making decisions and to 
ensure that agency objectives are met. The standards also call for 
documentation to be readily available for examination.[Footnote 13] 
These standards apply to CBP's development of the quantities and types 
of technology and their suitability to terrain to support the Plan; 
the expenditure of fiscal year 2011 funds on mobile, RVSS, and hand-
held equipment; as well as the planned acquisition of IFT systems 
requested in the President's fiscal year 2012 budget request.[Footnote 
14] 

A senior CBP official responsible for the program's acquisitions told 
us that he believed the AOA and the process used to develop and 
support the plan justified acquisition decisions called for in the 
Arizona Border Surveillance Technology Plan. According to CBP 
officials, the agency is in the process of drafting the acquisition-
planning documents required for the DHS Acquisition Review Board to 
review and make a decision on acquiring the IFT systems.[Footnote 15] 
These acquisition-planning documents are required by DHS guidance for 
planning acquisitions, setting operational requirements, and 
establishing acquisition baselines to help ensure delivery of the 
required performance at acceptable levels of cost, schedule, and risk. 
[Footnote 16] CBP officials said that they expect the Acquisition 
Review Board to meet in November 2011 to consider the IFT acquisition. 
The Acquisition Review Board is to consider these documents prior to 
approving the program for acquisition and the issuance of a request 
for proposal for the new IFT systems. 

Nonetheless, in the absence of documentation that describes how CBP 
integrated the operational assessments and technology deployment 
analyses and used the results of the AOA to develop the types and 
quantities of technology and their suitability to the terrain from the 
various alternatives, it is unclear whether and how the analyses 
conducted to develop the Plan demonstrated the cost and operational 
effectiveness of the selected mix of technology versus other less 
costly solutions, or whether the analyses determined the most 
appropriate technology for the terrain. As a result, CBP cannot 
demonstrate the validity of the Arizona Border Surveillance Technology 
Plan and the acquisition approach and lacks reasonable assurance that 
the acquisition-planning documents will fully support future 
deployments of border surveillance technology in Arizona. 

CBP Officials Have Not Yet Defined Expected Mission Benefits nor 
Quantified Metrics to Assess Progress in Implementing the Plan: 

Agency officials have not yet defined the mission benefits expected or 
quantified metrics to assess the contribution of the selected 
approaches in achieving their goal of situational awareness and 
detection of border activity using surveillance technology. Without 
defining the expected benefit or quantifying metrics, it will be 
difficult for CBP to assess the effectiveness of the Plan as it is 
implemented. Assessing the effectiveness of the program in Arizona 
will be essential as CBP works to develop a more comprehensive plan 
for the entire southwest border. 

Our findings are particularly relevant considering similar 
deficiencies in SBInet systems. In May 2010, we reported that in the 
case of the deployment of SBInet systems along the first 53 miles of 
the Arizona border, CBP did not define or measure the expected mission 
benefits of the system.[Footnote 17] For example, while program 
officials reported that system benefits are documented in the SBInet 
Mission Need Statement dated October 2006, this document did not 
include either quantifiable or qualitative benefits. Rather, it 
provided general statements such as "the lack of a program such as 
SBInet increases the risks of terrorist threats and other illegal 
activities." Moreover, we concluded that DHS had not demonstrated that 
its proposed SBInet solution was a cost-effective course of action, 
and thus whether the considerable time and money invested to acquire 
and deploy it was a prudent use of limited resources. As a result, we 
recommended that DHS reconsider its proposed SBInet solution. 
In doing so, it should explore ways to both limit its near-term 
investment in an initial set of operational capabilities and develop 
and share with congressional decision makers reliable projections of 
the relative costs and benefits of longer-term alternatives. These 
longer-term alternatives would help meet the mission goals and 
outcomes that SBInet was intended to advance. DHS should also share 
with congressional decision makers the reasons why cost-benefit 
information was not available and the uncertainty and risks associated 
with not having it. DHS concurred with reconsidering its proposed 
SBInet solution and the Secretary canceled the program in January 2011. 

The Secretary of Homeland Security reported in January 2011 that the 
new Plan is expected to provide situational awareness for the entire 
Arizona border by 2014, but CBP officials have not yet defined the 
expected benefits or developed measurable and quantifiable performance 
metrics that would show progress toward achieving that goal.[Footnote 
18] The Clinger-Cohen Act of 1996 and OMB guidance emphasize the need 
to ensure that information technology investments, such as IFT 
systems, actually produce tangible, observable improvements in mission 
performance.[Footnote 19] We have previously reported that a solid 
business case providing an understanding of the potential return of 
large investments can be helpful to decision makers for determining 
whether continued investment is warranted.[Footnote 20] Additionally, 
under the Government Performance and Results Act, as amended, agencies need to establish activities to monitor performance measures and indicators.[Footnote 21] 

The supporting documents CBP used to justify its allocation of fiscal year 2011 funds and its budget request for fiscal year 2012 did not include any performance goals related to the expected outcome 
of the investment.[Footnote 22] CBP officials reported that the 
decision documents that informed their fiscal year 2012 budget request 
for $242 million (the AOA, the Plan, and the Department's fiscal year 
2012-2016 Resource Allocation Decision) did not contain any measurable 
and quantifiable performance metrics by which progress toward 
achieving performance goals could be determined. They said that the 
AOA contained four measures of effectiveness associated with the 
alternatives they assessed; however, these measures do not quantify 
the mission benefits associated with implementation of the Plan. 
Without measurable and quantifiable performance goals relating to 
expected outcomes, particularly for alternatives selected for CBP's 
Plan, it will be difficult for decision makers to assess the costs and 
benefits provided by acquisition and deployment of these systems and, 
more broadly, to measure program performance and progress in achieving 
national homeland security goals for securing the southwest border. 

We have previously reported on key attributes of successful 
performance measures that should be included in program performance 
metrics.[Footnote 23] In circumstances where complete information is 
not available to measure performance outcomes, agencies may need to 
use intermediate goals and measures to show progress or contribution 
to intended results. For example, the Border Patrol may currently lack the capability to detect all illegal entries of people, drugs, and weapons along the southwest border. However, it may choose to establish performance measures that track progress in using technology to increase the probability of detection. Once CBP achieves 
an optimal level in terms of the probability of detection, or 
situational awareness, it may then transition to measures for reducing 
the flow of illegal activity and interdiction. In September 2011, CBP 
officials reported that they are developing new measures to determine 
whether and how technology investments impact border security. They 
acknowledged that since large investments have been made in border security, it is critical to assess the impacts these investments have had on improving border security as well as to project the additional impact future investments will have on the agency's ability to manage the borders. However, CBP officials have not yet determined the key 
attributes of these new measures. Measures and key attributes are 
generally defined as part of the business case in order to explain how 
they contribute to the mission's benefits.[Footnote 24] Without a 
meaningful understanding and disclosure of the mission benefits of the 
Plan and related metrics to assess progress, it will be difficult for 
CBP to justify and make informed decisions about its investment as 
well as measure the extent to which implementation of the Plan will 
actually deliver mission value commensurate with costs, similar to the 
challenges faced by SBInet. 

CBP Does Not Have a Plan to Assess and Address Operational Issues for 
Continuing Use of SBInet Technology for Surveillance: 

The new Arizona Border Surveillance Technology Plan does not include 
the 53 miles covered by previously deployed SBInet systems that have 
historically been at the highest risk for illegal crossing. CBP made 
its decision to continue using SBInet Block 1 systems in the Tucson 
sector before the results of operational testing were available, and 
CBP does not have a plan to assess and address operational issues with 
SBInet technology in use in this area. Effective use of existing 
SBInet systems is essential for a comprehensive and integrated 
approach for surveillance technology along the entire Arizona border. 

The Secretary of Homeland Security's January 2011 announcement stated 
that in DHS's assessment, the issue of viability was evaluated within 
the context of the SBInet Block 1 deployments in the Tucson and Ajo 
Border Patrol Stations' areas of responsibility--referred to as Tucson-1 and Ajo-1. It stated that testing and evaluation of the system was 
under way at those sites and that it was too early to quantify the 
effectiveness of the technology. However, the announcement stated that, based on qualitative assessments from the Border Patrol, which had begun using the systems,[Footnote 25] SBInet systems enhanced the Border Patrol's ability to detect, identify, track, deter, and respond to threats along some parts of the border. The announcement further stated that 
SBInet contributed in part to increasing the likelihood of the 
apprehension of illegal entrants. 

Test Results Revealed System Operational Challenges, Although Test 
Participants Who Provided Feedback Had Favorable Opinions of the 
System: 

Since the Secretary's announcement, CBP has received the U.S. Army's 
Test and Evaluation Command (ATEC) operational test results for the 
SBInet system at Tucson-1 that revealed challenges regarding the 
effectiveness and suitability of the technology for border 
surveillance.[Footnote 26] In its March 2011 report on operational 
testing conducted from October 2010 to November 2010, ATEC said that 
SBInet was "effective with limitations" because (1) the ability of the 
system to correctly detect, identify, and classify items of interest 
was below initial system acceptance benchmarks and was (2) further 
degraded by terrain and weather conditions, and (3) the radar system 
generated a high number of extraneous radar returns or "hits" that 
overwhelmed operators. ATEC found that the system was "not 
operationally suitable" because the reliability of the system was low. 
[Footnote 27] 

Specifically, ATEC officials found that the rugged, restrictive 
terrain and weather conditions prevalent where SBInet is deployed 
affected the performance of the system's radar, which impacted success 
in detecting, identifying, and classifying the items of interest. ATEC 
officials referred to this situation as a "terrain/technology 
mismatch." ATEC also reported that the radar's difficulties with 
terrain and weather resulted in a high number of extraneous radar 
hits' being generated by the system, hits that presented a difficult-
to-manage workload for operators for which SBInet's technical 
filtering techniques could not compensate fully. Moreover, ATEC also 
noted that the system required operators to cull through thousands of 
extraneous radar hits (among a total average of 26,000 hits per day). 
This generated an unreasonable expectation given the lack of 
standardized procedures in how to manage the extraneous radar hits and 
lack of training in how to use the system tools to filter them out. 

In response to ATEC's findings, CBP said that problems with using SBInet to detect, identify, and classify items of interest are less significant now than when operators began using the system because, through their continued experience with the system, operators better understand what causes extraneous radar hits and are better able to deal with them. Similarly, CBP stated that ATEC's reliability findings have been mitigated because many of the system failures were caused by routine system reboots, which are being addressed by enhancements to SBInet currently in progress. 

Notwithstanding the findings of the ATEC testers, Border Patrol SBInet 
operators and field agents who participated in this testing and 
completed questionnaires during and at the end of testing responded favorably regarding a number of aspects of the system, reporting, for example, that the system significantly enhanced both agent safety and overall situational awareness during day-to-day operations for tracking and apprehending illegal border crossers.[Footnote 28] 
Further, in our March 2011 work reviewing the status of SBInet, all 
the Border Patrol officials we spoke with told us the system provided 
them with capability they did not have previously and was considerably 
better than the technology that was available to them prior to 
SBInet's deployment.[Footnote 29] Nonetheless, based on the factors 
mentioned above, ATEC concluded that because of the limitations of the SBInet radar, the system does not significantly reduce the need for field agents' traditional role in the operating environment. ATEC also concluded that, despite the high ratings from test participants who completed questionnaires, the actual performance of SBInet in terms of interdiction was only slightly different than if the system had not been present in the areas where it is deployed. 

A Post-Implementation Review of SBInet Could Help CBP Determine How 
Best to Proceed in Its Operation and Inform Future Deployment 
Decisions: 

According to DHS guidance, project managers are required to conduct a 
Post-Implementation Review to evaluate the impact of an investment's 
deployment on customers, the mission and program, and technical and/or 
mission capabilities.[Footnote 30] Similarly, OMB's Capital 
Programming Guide, a supplement to OMB Circular A-11, identifies a 
Post-Implementation Review as a tool to evaluate an investment's 
efficiency and effectiveness to determine how well an investment 
achieved the planned functionality and anticipated benefits.[Footnote 
31] Moreover, as the next step in the evaluation phase for any major 
information technology investment, like SBInet, DHS policy requires 
that an operational analysis be undertaken to measure the performance 
and cost of the asset against the established baseline.[Footnote 32] 
According to the guidance, operational analyses measure how close the 
investment is to achieving the project's expected cost, schedule, and 
performance goals. When performance is found deficient, the project 
manager must identify and schedule suitable corrective actions. 

DHS guidance further states that the Post-Implementation Review should 
occur when a system has been in operation for at least 6 months or 
immediately following investment termination, and the Operational 
Analysis should be performed annually for information technology 
investments in the steady-state or operations and maintenance phase 
like SBInet. Such reviews would be prudent and provide a baseline for 
CBP to decide whether to continue the system without adjustment, to 
modify the system to improve performance--to the extent that addressing the operational issues identified by the Army's operational testing is cost beneficial--or, if necessary, to consider 
alternatives to the implemented system. The reviews could also provide 
CBP with an opportunity to more quantitatively determine and document 
the SBInet system's ability to satisfy the agency's operational 
requirements, given that CBP plans to continue to operate the SBInet 
system along the highest risk 53 miles of the Arizona border and will 
be faced with funding operation and maintenance costs over the 
remaining 10-year life of the system. (For example, CBP has requested 
$10 million for fiscal year 2012 to support the continuing operation 
of SBInet systems.) 

A Post-Implementation Review and Operational Analysis could also help inform CBP's decisions about whether future deployments of the similar ground-based radar technologies that are to make up the IFT systems, the next step in its plan to deploy border surveillance technology in Arizona, are necessary in areas where SBInet systems are currently being used. 

CBP program officials initially told us they did not intend to develop 
an action plan to address the deficiencies and recommendations identified by ATEC. 
They said that the Secretary of Homeland Security's decision to cancel 
further procurements of SBInet systems was a basis for their decision 
not to commit resources to resolve technical, logistical, and 
operational issues identified during the Army's operational testing of 
the system. However, in response to our inquiries related to the 
applicability of this guidance, CBP told us in August 2011 that the 
Border Patrol was considering, but had not yet developed, a plan for 
reviewing and addressing the results of the ATEC tests for SBInet. 

CBP officials said they had not developed a plan to address SBInet 
operational test outcomes or conducted a post-implementation review 
because of the Secretary's cancellation of the program. They said they 
were confident that the technology was now available to acquire and 
deploy a non-developmental system as part of the new Arizona Border 
Surveillance Technology Plan.[Footnote 33] However, CBP plans to 
continue using SBInet for surveillance along the highest risk corridor 
in Arizona. The impact of the use of SBInet systems could affect the 
deployment and use of other surveillance technologies along the 
Arizona border. For example, if SBInet systems are particularly 
effective, illegal border-crossing traffic may decrease in the area 
where the systems are in use. Conversely, if SBInet is less effective, 
illegal border crossings may increase in the area surveilled. Thus, 
conducting an assessment of SBInet operational test results and the 
potential cost-effective resolution of the issues identified could 
better position CBP in determining analyses of alternative 
technologies for future systems' deployments in the areas of the 
Arizona Border covered by SBInet. 

CBP's Cost Estimate Reflects Some but Not All Key Cost-Estimating Best 
Practices: 

CBP officials have taken steps to develop a cost estimate for the 
Arizona Border Surveillance Technology Plan consistent with some best 
practices. However, the officials did not determine a level of confidence for their rough order of magnitude (ROM) estimate, which is inconsistent with best practices. 

CBP's Cost Estimate Is Substantially Comprehensive and Accurate but 
Partially Documented and Minimally Credible: 

Our analysis of CBP's 10-year life-cycle cost estimate (LCCE) for the 
Arizona Border Surveillance Technology Plan (the Plan) found that CBP 
did not fully follow best practices for developing a reliable LCCE, 
which is at the core of successfully managing a project within cost 
and affordability guidelines. CBP's estimate for the Plan is $1.5 
billion.[Footnote 34] The estimate includes approximately $750 million 
in acquisition costs and approximately $800 million for operations and 
maintenance costs to procure and deploy a range of border surveillance 
technology across Arizona. 

Our guide and OMB guidance emphasize that reliable cost estimates are 
important for program approval and continued receipt of annual 
funding.[Footnote 35] DHS policy similarly provides that life-cycle 
cost estimates are essential to an effective budget process and form 
the basis for annual budget decisions. Reliable LCCEs reflect four 
characteristics. They are (1) well-documented, (2) comprehensive, (3) 
accurate, and (4) credible. These four characteristics encompass 12 
best practices for reliable program life-cycle cost estimates. (See appendix III, which describes the 12 steps of high-quality cost estimating.) The results of our analysis of CBP's cost estimate against 
these four best practice characteristics are summarized in table 4. 

Table 4: Extent to which CBP's Arizona Border Surveillance Technology 
Plan Cost Estimate Meets Best Practices: 

Best Practice: Well-documented; 
Best practice description: The cost estimates should be supported by 
detailed documentation that describes the purpose of the estimate, the 
program background and system description, the scope of the estimate, 
the ground rules and assumptions, all data sources, estimating 
methodology and rationale, and the results of the risk analysis. 
Moreover, this information should be captured in such a way that the 
data used to derive the estimate can be traced back to, and verified 
against, their sources; 
Results of GAO analysis: Partially Met. 

Best Practice: Comprehensive; 
Best practice description: The cost estimates should include costs of 
the program over its full life-cycle, provide a level of detail 
appropriate to ensure that cost elements are neither omitted nor 
double-counted, and document all cost-influencing ground rules and 
assumptions; 
Results of GAO analysis: Substantially Met. 

Best Practice: Accurate; 
Best practice description: The cost estimate should be based on an 
assessment of most likely costs (adjusted for inflation), documented 
assumptions, and historical cost estimates and actual experiences on 
other comparable programs. Estimates should be cross-checked against 
an independent cost estimate for accuracy, double counting, and 
omissions. In addition, the estimate should be updated to reflect any 
changes; 
Results of GAO analysis: Substantially Met. 

Best Practice: Credible; 
Best practice description: The cost estimates should discuss any limitations of the analysis because of uncertainty or biases surrounding data or assumptions. Risk and uncertainty analysis should be performed to determine the level of risk associated with the estimate (an illustrative sketch of such an analysis follows this table). Further, the estimate's results should be cross-checked against an independent estimate[A]; 
Results of GAO analysis: Minimally Met. 

Source: GAO analysis based on information provided by CBP. 

Note: "Not met" means CBP provided no evidence that satisfies any of 
the criterion. 

"Minimally met" means CBP provided evidence that satisfies a small 
portion of the criterion. 

"Partially met" means CBP provided evidence that satisfies about half 
of the criterion. 

"Substantially" means CBP provided evidence that satisfies a large 
portion of the criterion. 

"Fully met" means CBP provided evidence that completely satisfies the 
criterion. 

[A] An independent cost estimate is another estimate based on the same 
technical information that is used to validate and cross-check the 
baseline estimate, but is prepared by a person or organization that 
has no stake in the approval of the project. 

[End of table] 

CBP's life-cycle cost estimate for the Plan substantially met best 
practices in terms of being both comprehensive and accurate. For 
example, in terms of comprehensiveness, the estimate included 
technical data that was documented at a sufficient level of detail. 
This included specific technology requirements anticipated to provide 
situational awareness for each of the focus areas along the Arizona 
border, such as the number of integrated fixed-tower systems, mobile 
surveillance systems, or other technologies. However, detailed 
technical data related to shared IT infrastructure was missing, and 
risk information on the technologies, assumptions, and estimating was 
not provided. As a result, our analysis concluded that CBP's cost 
estimate substantially, but not fully, reflected best practices for 
comprehensiveness. In terms of accuracy, the cost estimate was 
continually updated and refined as more information became known; this 
helps to provide decision makers with accurate and current 
information. Specifically, there were 10 changes documented that 
clearly showed what updates were made to the cost estimate. These 
changes included new technology quantities, learning-curve 
adjustments, and incurred cost adjustments. However, the estimate also 
relied on historical data from earlier SBInet deployment, and the 
accuracy and the reliability of that data were questionable because 
some data were still pending. As a result, CBP's estimate 
substantially, but not fully, met criteria for accuracy. 

Moreover, the Plan's estimate partially met best practices in terms of 
being well-documented and minimally met best practices for being 
credible. Cost estimates are well-documented when they can be easily 
repeated and can be traced to original sources. The documentation 
should explicitly identify the primary methods, calculations, 
assumptions, and sources of the data used to generate each cost 
element. However, according to our review of data provided to us by 
CBP, while many data sources were discussed, the actual data used to 
determine the estimate were not always shown. Therefore, it is not 
possible for an unfamiliar analyst to recreate the estimate with the 
provided documentation. As a result of insufficient documentation, the 
validity and reliability of CBP's life-cycle cost estimate for the 
Arizona Border Surveillance Technology Plan cannot be verified. For 
that reason, we assessed CBP's cost estimate as partially meeting 
criteria for being well-documented. 

In terms of credibility, we found that CBP officials did not conduct a 
sensitivity analysis or a cost-risk and uncertainty analysis to 
determine a level of confidence in the $1.5-billion life-cycle cost 
estimate for Arizona. Therefore, CBP's estimate provides an incomplete 
basis for management decisions because without a level of confidence, 
it will be difficult for decision makers to identify a range of 
possible costs, higher and lower, corresponding to the associated 
risks involved with the acquisition and deployment of technology 
across Arizona. A sensitivity analysis of all cost estimates examines 
the effects of changing one assumption or cost driver at a time while 
holding all other variables constant. Since uncertainty cannot be 
avoided, it is necessary to identify the cost elements that represent 
the most risk and, if possible, cost estimators should quantify the 
risk.[Footnote 37] 
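
To make the mechanics of such an analysis concrete, the following 
illustrative sketch varies each assumption of a purely notional cost 
model, one at a time, by plus or minus 20 percent while holding the 
others constant. The cost elements, baseline values, and swing ranges 
are hypothetical and are not drawn from CBP's estimate. 

# One-at-a-time sensitivity analysis on a notional cost model.
# All values are illustrative; they are not CBP's cost elements.

def total_cost(unit_cost, quantity, integration_factor, annual_om, years):
    """Notional life-cycle cost: acquisition plus operations and maintenance."""
    acquisition = unit_cost * quantity * integration_factor
    return acquisition + annual_om * years

baseline = dict(unit_cost=10.0, quantity=50, integration_factor=1.3,
                annual_om=80.0, years=10)  # dollars in millions, notional
base_total = total_cost(**baseline)

# Vary each driver by +/- 20 percent while holding all others constant.
for driver in baseline:
    for swing in (0.8, 1.2):
        excursion = dict(baseline, **{driver: baseline[driver] * swing})
        delta = total_cost(**excursion) - base_total
        print(f"{driver:>20} x{swing:.1f}: total changes by {delta:+9.1f} ($M)")

The output ranks which notional drivers move the total the most, which 
is the information a sensitivity analysis provides to cost estimators. 

[End of code example] 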

In addition to sensitivity analysis, which looks at the effects of 
changing one parameter or cost driver at a time, a cost risk and 
uncertainty analysis should be performed to capture the cumulative 
effect of multiple variables changing, such as schedules slipping, or 
proposed solutions' not meeting user needs, allowing for a known range 
of potential costs.[Footnote 38] Because CBP officials did not perform 
a cost-risk and uncertainty analysis, the estimate for the Plan is 
likely to be unrealistic because it does not assess the variability in 
the cost estimate from such effects as schedules slipping, missions 
changing, and proposed solutions' not meeting users' needs. Without 
this type of analysis, for example, it will be difficult for CBP 
decision makers to determine a defensible level of contingency 
reserves necessary to cover increased costs resulting from 
uncertainties associated with the Arizona plan. 
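
The sketch below shows, with purely hypothetical triangular 
distributions, how a simple Monte Carlo risk and uncertainty analysis 
produces a range of potential costs, the confidence level associated 
with a point estimate, and a quantified contingency amount. The dollar 
values and distributions are notional and do not represent CBP's 
figures. 

# Notional Monte Carlo cost risk and uncertainty analysis.
# Distributions and dollar values are illustrative, not CBP's.
import random

random.seed(1)

def simulate_once():
    # All drivers vary at once, capturing their cumulative effect.
    acquisition = random.triangular(1100, 1600, 1250)   # low, high, mode ($M)
    schedule_slip = random.triangular(0.0, 0.25, 0.05)  # fractional cost growth
    operations = random.triangular(700, 1000, 800)      # O&M over life cycle ($M)
    return acquisition * (1 + schedule_slip) + operations

trials = sorted(simulate_once() for _ in range(10000))
point_estimate = 1250 + 800  # notional deterministic point estimate ($M)

p50 = trials[len(trials) // 2]
p80 = trials[int(len(trials) * 0.8)]
confidence = sum(t <= point_estimate for t in trials) / len(trials)

print(f"Point estimate ${point_estimate:,}M sits near the "
      f"{confidence:.0%} confidence level")
print(f"50th percentile: ${p50:,.0f}M   80th percentile: ${p80:,.0f}M")
print(f"Contingency to fund to 80% confidence: ${p80 - point_estimate:,.0f}M")

In this notional example, the gap between the point estimate and a 
chosen confidence level is the defensible contingency reserve that such 
an analysis allows decision makers to set. 

[End of code example] 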

Another way to reinforce the credibility of the cost estimate would be 
for CBP to commission an independent cost estimate and then reconcile 
any differences between the two.[Footnote 39] This process is 
considered one of the best and most reliable estimate validation 
methods.[Footnote 40] However, because CBP officials did not compare 
their estimate with an independent estimate, agency decision makers 
may lack insight regarding the plan's range of potential costs because 
independent cost estimates frequently use different methods and are 
less burdened with organizational biases. Despite these deficiencies, 
we assessed CBP's cost estimate as minimally meeting best practices 
for credibility rather than not meeting them because CBP did identify 
some cost drivers that could be used as a basis for conducting a 
sensitivity analysis. 

Responding to the results of our cost analysis, CBP officials reported 
that their approach was to develop and report a rough order of 
magnitude (ROM) cost estimate for the portfolio of technology projects 
contained in the Arizona Border Surveillance Technology Plan. Because 
CBP officials considered the $1.5-billion estimate an initial ROM 
estimate, they reported that it lacked some elements of the technology 
costs and complete supporting documentation, and was not subjected to 
an independent or corroborating cost-estimating effort. 

Based on a Rough Order of Magnitude Analysis, CBP's Budget Request for 
IFT Systems May Not Be Realistic or Sufficient: 

CBP officials reported that while they believed the $1.5 billion cost 
estimate for completing technology deployment across the Arizona 
border was reasonable, they cautioned that they considered it to be a 
ROM estimate rather than an LCCE. 
According to cost-estimating best practices, a ROM cost estimate is 
developed when a quick estimate is needed and few details are 
available. It is usually based on historical ratio information and is 
typically developed to support what-if analyses; depending on 
available data, it can be developed for a particular phase or portion 
of an estimate or for the entire cost estimate. It is helpful for examining 
differences in high-level alternatives to see which are the most 
feasible. However, according to cost-estimating best practices, 
because a ROM is developed from limited data and in a short time, a 
ROM analysis should never be considered a budget-quality cost 
estimate. Nevertheless, CBP used the ROM estimate to support its $242-
million budget request for fiscal year 2012 because it lacked the time 
needed to develop a more robust estimate. 
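
As a purely illustrative example of what building a ROM estimate from 
historical ratio information can look like, the sketch below scales a 
notional cost-per-mile ratio from an earlier deployment to a new 
mileage and applies a rough adjustment factor. None of the figures are 
CBP's actual inputs; the arithmetic is shown only to make the 
technique, and its limitations, concrete. 

# Notional rough order of magnitude (ROM) estimate built from a
# historical ratio; all figures are illustrative, not CBP's inputs.

historical_cost = 1000.0   # $ millions spent on an earlier deployment (notional)
historical_miles = 50      # miles covered by that deployment (notional)
cost_per_mile = historical_cost / historical_miles

planned_miles = 300        # remaining miles to be covered (notional)
adjustment = 0.3           # rough factor for a cheaper technology mix (notional)

rom_estimate = cost_per_mile * planned_miles * adjustment
print(f"ROM estimate: about ${rom_estimate:,.0f} million "
      f"({cost_per_mile:.1f} $M/mile x {planned_miles} miles x {adjustment})")

Because every input in such a calculation is a broad ratio or judgment 
factor, the result is useful for comparing alternatives but, as the 
best practices note, is not a budget-quality estimate. 

[End of code example] 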

CBP officials said the request reflected relevant operational 
information from authoritative CBP sources as well as comprehensive 
program technical descriptions for both the acquisition and 
sustainment life-cycle phases. Officials plan to use the fiscal year 
2012 appropriations to purchase IFT systems technology for future 
deployments in Arizona. The three initial deployments are planned for 
the Nogales, Douglas, and Casa Grande station areas of operation 
followed by two additional deployments planned in Sonoita and Wellton 
station areas. According to OTIA and Border Patrol officials, 
depending on the availability of funding, the deployments of the IFT 
system component of the Plan are expected to begin around March 2013 
and be completed by the end of 2015 (or possibly early 2016), with 
other sector deployments sequentially following the Arizona sector. 
CBP estimated that the entire IFT system acquisition for Arizona would 
cost about $570 million, including funding for design and development, 
equipment procurement, production and deployment, systems engineering 
and program management, and a national operations center. 

Nonetheless, there is significant uncertainty regarding the cost of 
IFT systems stemming from assumptions made as part of the cost-
estimating process. For example, when developing the ROM estimate, CBP 
officials expected that IFT systems would be able to access existing 
commercial communication networks in target deployment areas. CBP 
officials said that this assumption is no longer valid in all cases 
and additional communication relay equipment will likely be necessary. 
While CBP officials believe they have adequate risk contingency funds 
to address this issue, because they did not undertake a risk and 
uncertainty analysis to quantify the impact of these kinds of risks on 
the cost estimate, it will be difficult for them to determine 
whether the contingency funds will be sufficient to cover this or 
other risks. 

The findings of our analysis are particularly relevant considering 
that similar deficiencies were identified with the life-cycle cost 
estimate for the SBInet Block 1 deployment. In May 2010, we reported 
that the life-cycle cost estimate for the Block 1 deployment was not 
credible because risk and uncertainty were not adequately assessed. 
[Footnote 41] For example, the risks associated with software 
development were not examined, even though such risks were known to 
exist. In fact, the only risks considered were those associated with 
uncertainty in labor rates and hardware costs, and instead of being 
based on historical quantitative analyses, these risks were expressed 
by assigning them arbitrary positive or negative percentages. In 
addition, the estimate did not specify contingency reserve amounts to 
mitigate known risks, and an independent cost estimate was not used to 
verify the estimate. Our program assessments have too often revealed 
that not integrating cost estimation, system development oversight, 
and risk management--three key disciplines, interrelated and essential 
to effective acquisition management--has resulted in programs' costing 
more than planned and delivering less than promised.[Footnote 42] 

In discussing this issue, CBP officials said they attempted to 
establish as much fidelity as possible with the Arizona technology 
cost estimate and associated budget requests. However, the officials 
reported that they knew that several of their planning and estimating 
assumptions were broad and that they lacked some desired details. For 
those reasons, the officials continue to call their Arizona technology 
cost estimates ROM estimates. CBP officials stated that they used the 
best information available to establish budget quality estimates and 
plan to provide updated, comprehensive, and thoroughly documented cost 
estimates in fall 2011 related to the Plan. 

CBP officials said they consider the Arizona Border Surveillance 
Technology Plan to be a grouping of multiple projects that will 
proceed as independent acquisitions rather than a unified capital 
asset acquisition. As such, CBP officials reported that they are 
preparing LCCEs for the individual acquisition projects in the Plan, 
initially for the IFT systems and the Remote Video Surveillance 
Systems (RVSS) with other projects to follow. CBP officials reported 
that OTIA will request baseline approval for the projects in the Plan 
later this year from the appropriate department or CBP acquisition 
oversight board. They said that this process will further examine 
respective cost and schedule estimates, technical performance and 
program risks, as well as contracting and related management concerns. 
Prior to the major acquisition reviews, CBP officials said that OTIA 
is developing detailed program management plans and supporting 
documentation for each of the Arizona technology projects within the 
portfolio. CBP officials do not expect to release a cost estimate for 
technology acquisition and deployment beyond Arizona until February 
2012. However, without a complete LCCE that contains all cost 
estimating best practices for the Arizona Plan, CBP could experience 
the same kind of problems as the ones it encountered in the 
acquisition of SBInet. 

Conclusions: 

CBP has not yet demonstrated the effectiveness and suitability of its 
new approach for deploying surveillance technology in Arizona. By 
taking steps to document how, where, and why it plans to deploy 
specific combinations of technology prior to its acquisition and 
deployment, CBP could be better positioned to minimize performance 
risks associated with the new approach. Given that apprehensions along 
the southwest border are at their lowest levels since the 1970s, and 
in light of the difficulties CBP has faced in its efforts to procure 
and deploy surveillance technology, documenting the underlying 
analysis used to justify the technology types, quantities, and 
suitability to terrain contained in the Arizona Border Surveillance 
Technology Plan could help CBP make its decisions more transparent. 
Further, better defining the mission benefits to be gained from 
planned procurements and quantifying performance metrics to assess the 
effectiveness of technologies selected for Arizona would help justify 
program funding and assist CBP in measuring its progress toward 
securing the southwest border. Given that CBP plans to spend $1.5 
billion for technologies to enhance surveillance across the remainder 
of the Arizona border, conducting a post-implementation review and 
operational assessment of the SBInet systems that includes a review of 
operational test results, and then weighing costs and benefits of 
taking action on the results could give CBP the opportunity to 
maximize the effectiveness of the system it has already deployed in 
the highest risk area in Arizona. It could also help CBP in making 
decisions for future technology deployments along the southwest border 
and provide a sound basis for assessing and deploying alternative 
technologies. 

Fully documenting the data used in the cost model could help ensure 
that the validity and reliability of CBP's life-cycle cost 
estimate for the Arizona Border Surveillance Technology Plan can be 
verified. Because CBP officials did not conduct a sensitivity analysis 
and a cost-risk and uncertainty analysis to determine a level of 
confidence in the $1.5-billion life-cycle cost estimate for the Plan, 
it will be difficult for decision makers to determine what levels of 
contingency funding may be needed to cover risks associated with 
implementing new technologies along the remaining Arizona border. 
Until CBP officials accurately quantify the impacts of the risks, the 
budget requests for fiscal year 2012 and beyond may not be realistic 
and sufficient to achieve program aims. Because CBP officials do not 
expect to release a cost estimate for technology acquisition and 
deployment beyond Arizona until February 2012, the total investment 
required to deploy technology across the southwest border will remain 
unclear until that time. Verification of the new life-
cycle cost estimate with an independent cost estimate and 
reconciliation of any differences could further help ensure the 
credibility of the cost estimate. 

Recommendations: 

To increase the likelihood of successful implementation of the Arizona 
Border Surveillance Technology Plan and maximize the effectiveness of 
technology already deployed, we recommend that the Commissioner of CBP 
take the following three steps in planning the agency's new technology 
approach: 

* ensure that the underlying analyses of the Plan are documented in 
accordance with DHS guidance and internal controls standards; 

* determine the mission benefits to be derived from implementation of 
the plan and develop and apply key attributes for metrics to assess 
program implementation; and 

* conduct a post-implementation review and operational assessment of 
SBInet, including consideration of the ATEC test results, and assess 
the costs and benefits of addressing the issues identified to help 
ensure the security of the 53 miles already covered by SBInet and 
enhance security on the Arizona border. 

To increase the reliability of CBP's cost estimate for the Arizona 
Border Surveillance Technology Plan, we recommend that the 
Commissioner of CBP update the agency's cost estimate for the Plan 
using best 
practices, so that the estimate is comprehensive, accurate, well-
documented, and credible. Specifically, the OTIA program office should 
(1) fully document data used in the cost model; (2) conduct a 
sensitivity analysis and risk and uncertainty analysis to determine a 
level of confidence in the estimate so that contingency funding can be 
established relative to quantified risk; and (3) independently verify 
the new life-cycle cost estimate with an independent cost estimate and 
reconcile any differences. 

Agency Comments and Our Evaluation: 

We requested comments on a draft of this report from DHS and DOD. DHS 
provided written comments, which are reprinted in appendix IV. In 
commenting on the draft report, DHS concurred with our recommendations 
and identified steps officials planned to take to implement them, 
along with estimated dates for their completion. DHS also stated that 
there were several issues raised in the report that could not be 
addressed at present. In an email received on October 14, 2011, the 
DOD liaison indicated that DOD had no comments on the report. 

Regarding the first recommendation that CBP ensure that the underlying 
analyses of the Plan are documented in accordance with DHS guidance 
and internal controls standards, DHS concurred. DHS stated that CBP 
plans to work with the DHS Internal Control Program Management Office 
to ensure Plan documentation is in accordance with DHS guidance and 
internal controls and anticipates completing this action by May 31, 
2012. Such actions should address the intent of the recommendation. 

Regarding the second recommendation that CBP determine the mission 
benefits to be derived from implementation of the Plan and develop and 
apply key attributes for metrics to assess the program's 
implementation, DHS concurred and stated that CBP plans to develop a 
set of measures by April 30, 2012, that will assess the effectiveness 
and mission benefits of future technology investments. Such action 
should address the intent of the recommendation. 

With regard to the third recommendation that CBP conduct a post- 
implementation review and operational assessment of SBInet, DHS 
concurred and stated that CBP's Office of Border Patrol (OBP) is 
working with Johns Hopkins University Applied Physics Laboratory on a 
Block I after-action review (AAR), which will address the operational 
test and evaluation results and offer recommendations on tactics, 
techniques, and procedures. DHS also said that OTIA and the Border 
Patrol will conduct a post-implementation review and operational 
assessment required in light of the OBP AAR, consistent with 
departmental policy and procedures for recurring reporting of fielded 
systems. DHS stated that CBP plans to complete these actions by June 
30, 2012. Such actions should address the intent of the recommendation. 

Regarding the three recommendations related to CBP's life-cycle cost 
estimate--that CBP fully document data used in the cost model; conduct 
a sensitivity analysis and risk and uncertainty analysis to determine 
a level of confidence in the estimate so that contingency funding can 
be established relative to quantified risk; and independently verify 
the new life-cycle cost estimate with an independent cost estimate and 
reconcile any differences--DHS concurred. DHS stated that OTIA is 
preparing individual RVSS and IFT project cost estimates consistent 
with the GAO's guidelines and is fully documenting all assumptions, 
data structures and sources, methods and calculations, as well as 
risks and sensitivities for the two largest elements of the Plan that 
will enable CBP to refine contingency funding as needed. Officials 
plan to submit the appropriate project documentation, including the 
projects' Cost Estimating Baseline Document and the updated life-cycle 
cost estimate, to the department for independent review and 
verification of the respective projects' methodology and data sources. 
The department commented that it plans to determine the need for an 
independent cost estimate at a later time but will complete these 
actions by April 30, 2012. While these actions are positive steps, 
they do not fully address the recommendation that DHS implement best 
practices for cost estimates for the entire Plan. Instead, DHS's 
response indicates that it plans to implement these best practices for 
the two largest projects within the Plan. To fully understand the 
impacts of integrating these separate projects, DHS should update the 
life-cycle cost estimate for the entire Plan. 

DHS also noted three issues in the draft that it did not feel it could 
address at present. First, regarding the need to 
document analytical steps taken to develop the Plan, the department 
stated that DHS relies on Border Patrol field agents' expert judgment 
to select the types and quantities of technologies best suited for 
their respective geographic areas of responsibility. According to 
DHS, in all cases, technology selections were verified for consistency 
with the major findings of the AoA. In some cases, however, the Border 
Patrol determined that operational priorities justified a technology 
mix that was not necessarily the lowest cost--for example, the Border 
Patrol said a higher cost integrated fixed tower (IFT) solution would 
be operationally superior to deploying lower cost mobile systems. 
According to DHS, CBP is not planning further analyses or additional 
documentation given that it considers its analyses to be 
sufficiently documented in the final Plan. 

We recognize the value of Border Patrol agents' expert judgment in 
selecting the types and quantities of technologies best suited for 
their respective geographic areas of responsibility. Nonetheless, 
internal control standards call for documentation to support decision 
making to be available for examination. In the Plan, CBP officials 
documented the results of their analyses in terms of their planned 
deployments of technologies but did not include documentation of the 
supporting operational assessment done by the Border Patrol justifying 
the specific types, quantities, and deployment locations of border 
surveillance technologies, a key analytical component leading to the 
Plan. Documentation of the underlying analyses, not just the results, 
would enable the analyses supporting the Plan to be independently 
assessed. As noted in the report, it is unclear whether and how the 
analyses conducted to develop the Plan demonstrated the cost and 
operational effectiveness of the selected mix of technology, including 
whether the most appropriate technology for the terrain was selected. 
CBP cannot demonstrate the validity of the Arizona Border Surveillance 
Technology Plan and its acquisition approach in the absence of 
documentation that describes how CBP developed the operational 
assessments and technology deployment analyses and used the results of 
the AoA to develop the types and quantities of technologies and their 
suitability to the terrain from the various alternatives. Further, in 
light of the significant difficulties faced by CBP in its prior 
efforts to develop and implement the nearly $1 billion SBInet system, 
which after 5 years of program efforts provided unquantified 
improvements in border surveillance along 53 miles of the Arizona 
border, we 
remain concerned that CBP lacks reasonable assurance that its Plan 
will fully support its future deployments of border surveillance 
technology in Arizona. 

The second issue DHS raised concerned the report's observations about 
limitations of SBInet systems currently fielded in Arizona and the 
need for CBP to address operational test results. DHS did concur with 
the recommendation that CBP conduct a post-implementation review and 
operational assessment of SBInet. However, DHS said that, because of 
the Border Patrol's ongoing mitigation efforts and a planned system 
enhancement to address these limitations, it is unable to address 
this issue at this time. DHS added that it plans to continue to use 
the system to maintain enhanced situational awareness while gaining 
additional experience with the system until the planned system 
enhancement can be implemented in 2012 to address operational concerns. 

The third issue concerned the report's observations about limitations 
of the Plan's cost estimates and the potential sufficiency of 
contingency funds to accommodate unforeseen cost growth. DHS said that 
CBP program officials "are mindful" of this concern, were conservative 
in their budget requests, and believe this issue has been largely 
addressed by their prior efforts to accommodate reasonable cost 
contingencies. However, DHS added that, in response to the related 
recommendation, it is preparing updated life-cycle cost estimates, 
consistent with the GAO's best practice guidelines, for two projects 
in the Plan that account for 90 percent of the estimate. But to fully 
address this recommendation, DHS will need to implement best practices 
for the entire Plan, not just for the two largest projects, so that 
the impacts of integrating the separate projects can be fully 
understood. DHS and DOD provided technical comments, which we 
incorporated as appropriate. 

We are sending copies of this report to the Secretary of Homeland 
Security, the Commissioner of the U.S. Customs and Border Protection, 
and interested congressional committees. In addition, the report will 
be available at no charge on GAO's Web site at [hyperlink, 
http://www.gao.gov]. 

If you or your staff have questions regarding this report, please 
contact me at (202) 512-8777 or at StanaR@gao.gov. Contact points for 
our Offices of Congressional Relations and Public Affairs may be found 
on the last page of this report. Key contributors to this report are 
listed in appendix V. 

Signed by: 

Richard Stana: 
Director, Homeland Security and Justice Issues: 

Appendix I: Objectives, Scope, and Methodology: 

Our objectives were to determine the extent to which (1) U.S. Customs 
and Border Protection (CBP) has the information needed to fully 
support and implement its Arizona Border Surveillance Technology Plan 
in accordance with Department of Homeland Security (DHS) and Office of 
Management and Budget (OMB) guidance, and (2) CBP’s life-cycle cost 
estimate for the Arizona Border Surveillance Technology Plan reflects 
best practices. 

To answer our first objective, we reviewed key program-planning 
documents CBP relied on to support its new approach to identifying, 
acquiring, and deploying surveillance technology applicable to 
specific types of terrain along the Arizona border. We also 
interviewed CBP officials responsible for assessing the need for and 
documenting the cost and operational effectiveness and suitability of 
proposed systems to support its Arizona Border Surveillance Technology 
Plan and for identifying appropriate metrics to assess progress in 
border security. Specifically, we reviewed the announcement of the 
Secretary of Homeland Security and her vision of CBP’s new approach to 
identifying, acquiring, and deploying surveillance technology to the 
Arizona border in support of Border Patrol’s mission, principal goal, 
and objective. We also reviewed CBP’s analysis of alternatives (AOA) 
for Arizona, the Arizona Border Surveillance Technology Plan informed 
by the AOA, the final report of the independent peer review team on 
the AOA, CBP’s request for information on integrated fixed-tower 
technology, its Industry Day announcement and answers to industry 
questions, and CBP’s comparison of the similarities and differences 
between integrated fixed towers and SBInet technology. 

In relation to operational test results, we reviewed the independent 
evaluation of SBInet and discussed with officials the extent to which 
CBP is using these findings to inform future investments as well as 
the continuing operation of SBInet. We largely focused on the elements 
of SBInet known as Block 1, developed and deployed in Arizona’s Tucson 
sector between 2005 and 2010, and on CBP’s plans for its operation and 
maintenance over its life-cycle. In doing so, we 
reviewed program documentation, including the Army Test and Evaluation 
Command’s reports and briefing to CBP, and interviewed the key 
officials involved in the design and implementation of the operational 
test and evaluation of test results in order to determine the 
reliability of the information we used to support our finding. 
[Footnote 43] We compared CBP’s program management plans and 
activities with requirements in DHS acquisition regulations, including 
Acquisition Regulation 102-01, and OMB Circular A-11. We also 
interviewed CBP officials from its Office of Technology Innovation and 
Acquisition (OTIA) on how they intended to use the operational test 
findings and recommendations to inform the continuing operation of 
existing SBInet technology. Specifically, we reviewed the Army’s 
operational test plans, the initial and final test and evaluation 
reports, and their “Quick Look” briefing to OTIA officials. We also 
interviewed CBP and Army officials about the results of those tests 
and discussed the soundness of the test design process, its sampling 
methodology, and its implementation in order to determine whether we 
could rely on test results data. We found the test results to be 
sufficiently reliable for the purposes of this report. We also 
observed the SBInet systems in operation in the Tucson sector, and 
discussed the systems’ performance with Border Patrol Agents in the 
Tucson and Ajo station SBInet command centers. We reviewed our body of 
work on SBInet since 2005 as a basis for assessing CBP’s proposed 
approach for developing and implementing its new Arizona Border 
Surveillance Technology Plan. 

To answer our second objective, we reviewed cost and budget documents 
CBP relied on to support cost estimates for technology alternatives 
contained in the AOA for Arizona and in the President’s budget request 
for fiscal year 2012. We also interviewed program officials and 
contractors responsible for estimating the cost of future investments 
in surveillance technology, specifically the life-cycle approach, 
requirements development and management, test management, and risk 
management. We then compared this information to relevant federal 
guidance derived from leading industry practices.[Footnote 44] To 
assess the reliability of the cost data for the rough order-of-
magnitude estimate for implementation of the Plan, which assumed a 10-
year life-cycle for the acquisition, we relied on data for fiscal year 
2010 and beyond to support the findings in the report. We also 
reviewed relevant program documentation to substantiate evidence 
obtained through interviews with knowledgeable agency officials, where 
available, regarding the integrity of the data. We determined that the 
data used in this report are sufficiently reliable for the purposes of 
this report.[Footnote 45] We compared CBP cost estimating practices 
and budget documents to our Cost Estimating and Assessment Guide, 
which contains best practices compiled from cost-estimating 
organizations throughout the federal government and industry.[Footnote 
46] 

We conducted this performance audit from March 2011 through October
2011 in accordance with generally accepted government auditing 
standards. Those standards require that we plan and perform the audit 
to obtain sufficient, appropriate evidence to provide a reasonable 
basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for 
our findings and conclusions based on our audit objectives. 

[End of section] 

Appendix II: Photographs of Technologies Contained in the Arizona 
Border Surveillance Technology Plan: 

Figure 1: Mobile Surveillance System (MSS): 

[Refer to PDF for image: photograph] 

An MSS consists of camera and radar systems mounted on a truck, with 
images being transmitted to and monitored on a computer screen in the 
truck’s passenger compartment. 

Source: GAO. 

[End of figure] 

Figure 2: Mobile Video Surveillance System (MVSS): 

[Refer to PDF for image: photograph] 

An MVSS is a truck-mounted, long-range infrared imaging device. 

[End of figure] 

Figure 3: Integrated Fixed Tower Concept (SBInet Tower): 

[Refer to PDF for image: photograph] 

An Integrated Fixed Tower “system” consists of various components and 
program support activities. The components include fixed towers, 
sensors (cameras and radar), a data communications network, facilities 
upgrades, information displays, and an information management system. 
Program support activities include those performed to design, acquire, 
deploy, and test the system; and manage government and contractor 
efforts. 

Source: GAO. 

[End of figure] 

Figure 4: Air Support (Unmanned Aerial System): 

[Refer to PDF for image: photograph] 

The mission of the UAS is to provide sensor information to law 
enforcement, emergency management, and intelligence planners to 
prevent terrorism, secure the borders from the illicit flow of people 
and contraband, and respond to disasters. 

Source: GAO. 

[End of figure] 

Figure 5: Long Range Handheld Thermal Imaging System (RECON III): 

[Refer to PDF for image: photograph] 

Source: GAO. 

[End of figure] 

Figure 6: Agent Portable Surveillance System (APSS): 

[Refer to PDF for image: photograph] 

Source: GAO. 

[End of figure] 

Figure 7: Remote Video Surveillance System (RVSS): 

[Refer to PDF for image: photograph] 

Source: GAO. 

[End of figure] 

[End of section] 

Appendix III: Aspects of High-Quality Cost Estimates: 

Characteristic: Well-documented; 
Explanation: The documentation should address the purpose of the 
estimate, the program background and system description, its schedule, 
the scope of the estimate (in terms of time and what is and is not 
included), the ground rules and assumptions, all data sources, 
estimating methodology and rationale, the results of the risk 
analysis, and a conclusion about whether the cost estimate is 
reasonable. Therefore, a good cost estimate—while taking the form of a 
single number—is supported by detailed documentation that describes 
how it was derived and how the expected funding will be spent in order 
to achieve a given objective. For example, the documentation should 
capture in writing such things as the source data used and their 
significance, the calculations performed and their results, and the 
rationale for choosing a particular estimating method or reference. 
Moreover, this information should be captured in such a way that the 
data used to derive the estimate can be traced back to and verified 
against their sources, allowing for the estimate to be easily 
replicated and updated. Finally, the cost estimate should be reviewed 
and accepted by management to ensure that there is a high level of 
confidence in the estimating process and the estimate itself.
Step: 
Step 1: Define the estimate’s purpose, scope, and schedule; 
Step 3: Define the program characteristics; 
Step 5: Identify ground rules and assumptions; 
Step 6: Obtain the data; 
Step 10: Document the estimate; 
Step 11: Present the estimate to management for approval. 

Characteristic: Comprehensive; 
Explanation: The cost estimates should include both government and 
contractor costs of the program over its full life-cycle, from 
inception of the program through design, development, deployment, and 
operation and maintenance to retirement of the program. They should 
also completely define the program, reflect the current schedule, and 
be technically reasonable. Comprehensive cost estimates should provide 
a level of detail appropriate to ensure that cost elements are neither 
omitted nor double counted, and they should document all cost-
influencing ground rules and assumptions. Establishing a product-
oriented work breakdown structure is a best practice because it allows 
a program to track cost and schedule by defined deliverables, such as 
a hardware or software component.
Step: 
Step 2: Develop the estimating plan; 
Step 4: Determine the estimating structure; 
Step 5: Identify ground rules and assumptions[A]. 

Characteristic: Accurate; 
Explanation: The cost estimates should provide for results that are 
unbiased, and they should not be overly conservative or optimistic. 
Estimates are accurate when they are based on an assessment of most 
likely costs, adjusted properly for inflation, and contain few, if 
any, minor mistakes. In addition, the estimates should be updated 
regularly to reflect material changes in the program, such as when 
schedules or other assumptions change, and actual costs so that the 
estimate is always reflecting current status. Among other things, the 
estimate should be grounded in documented assumptions and a historical 
record of cost estimating and actual experiences on other comparable 
programs.
Step: 
Step 7: Develop the point estimate[B]; 
Step 12: Update the estimate to reflect actual costs and changes. 

Characteristic: Credible; 
Explanation: The cost estimates should discuss any limitations of the 
analysis because of uncertainty or biases surrounding data or 
assumptions. Major assumptions should be varied, and other outcomes 
recomputed to determine how sensitive they are to changes in the 
assumptions. Risk and uncertainty analysis should be performed to 
determine the level of risk associated with the estimate. Further, the 
estimate’s results should be crosschecked, and an independent cost 
estimate conducted by a group outside the acquiring organization 
should be developed to determine whether other estimating methods 
produce similar results. For management to make good decisions, the 
program estimate must reflect the degree of uncertainty, so that a 
level of confidence can be given about the estimate. Having a range of 
costs around a point estimate is more useful to decision makers 
because it conveys the level of confidence in achieving the most 
likely cost and also informs them on cost, schedule, and technical 
risks.
Step: 
Step 7: Compare the point estimate to an independent cost estimate[C]; 
Step 8: Conduct sensitivity analysis; 
Step 9: Conduct risk and uncertainty analysis. 

Source: GAO-09-3SP. 

[A] This step applies to two of the characteristics—well-documented 
and comprehensive. 

[B] A point estimate is a single cost estimate number representing the 
most likely cost. 

[C] This step applies to two of the characteristics—credible and 
accurate. 

[End of table] 

[End of section] 

Appendix IV: Comments from the Department of Homeland Security: 

U.S. Department of Homeland Security: 
Washington, DC 20528: 

October 20, 2011: 

Richard Stana: 
Director, Homeland Security and Justice: 
441 G Street, NW: 
U.S. Government Accountability Office: 
Washington, DC 20548: 

Re: Draft Report GAO-12-22, "Arizona Border Surveillance Technology:
More Information on Plans and Costs Is Needed Before Proceeding" 

Dear Mr. Stana: 

Thank you for the opportunity to review and comment on this draft 
report. The U.S. Department of Homeland Security (DHS) appreciates the 
U.S. Government Accountability Office's (GAO's) work in planning and 
issuing this report. 

In 2010, Secretary Napolitano directed an independent, quantitative 
assessment of the SBInet program, which combined the input of U.S. 
Border Patrol agents on the front lines with the Department's leading 
science and technology experts. This assessment made clear that
SBInet could not meet its original objective of providing a one-size-
fits-all border security technology solution. As a result, earlier 
this year, U.S. Customs and Border Protection (CBP) began the process 
of redirecting SBInet resources to other, proven technologies—-
tailored to each border region--to better meet the operational needs 
of the Border Patrol. This new border security technology plan — which 
is already well underway—-is providing faster deployment of 
technology, better coverage, and a more effective balance between cost 
and capability. 

The Department is pleased to note the report highlights several 
important aspects of our efforts to develop and implement the Arizona 
Border Surveillance Technology Plan (Plan) and address the progress 
being made. The draft acknowledges the robust and deliberative process 
we have used to hone CBP's surveillance technology investment plan for 
the Arizona border region. The draft reiterates the threats and 
challenges our Nation faces at the border and the Department's need 
for additional surveillance to establish situational awareness for 
enhancing border security efforts. The draft acknowledges the 
Department's deliberative efforts to evaluate and assess multiple 
sensor technology investment options as part of a formal Analysis of 
Alternatives (AoA). The draft affirms the expertise of our Border 
Patrol and their detailed "operational assessments" regarding the 
selection of specific technologies by frontline agents responsible for 
securing the targeted areas in Arizona. In addition, the draft found 
our cost estimating to be comprehensive and accurate, reflecting rigor 
throughout the process as well as support for the final decisions 
identified in the Plan. 

The draft report also detailed several ongoing CBP activities to 
implement the Plan. As of today, CBP has awarded contracts for 
commercially available mobile surveillance capabilities, hand-held 
thermal imaging binoculars, and agent-portable tripod-mounted 
radar/camera systems--these systems are now deployed and being used by 
agents on the border or are undergoing government acceptance testing 
prior to field delivery. Additional contracts for mobile video camera 
systems and unattended ground sensors are expected this fall. Of 
particular interest is that the final unit price for each of these 
procurement contracts is approximately 10 percent to 20 percent less 
than what CBP estimated; this fact validates our cost estimating and 
budgeting methods. 

In addition to its recommendations, the GAO draft encourages DHS to 
consider additional measures that may enhance the success of the Plan. 
Although DHS agrees with these recommendations, there were some points 
in the draft that we do not feel, at present, we can address. 

* GAO asserts analytical steps were inadequately documented, thereby 
denying decision-makers and overseers insights into aspects of the 
Plan--As CBP has previously briefed the GAO, the Department relies on 
field agents' expert judgment to select the types and quantities of 
technologies best suited for their respective geographic areas of 
responsibilities. This judgment is based on the years of experience 
the Border Patrol has with operating many of the candidate 
technologies, as well as their intimate familiarity with the threats, 
tactics, and terrain of the target geographical areas. In all cases, 
the Border Patrol's selections were verified for consistency with the 
major findings of the Department's Analysis of Alternatives, including 
considerations of geographic location, cost, and operational 
objectives. In some cases, however, the Border Patrol determined that 
operational priorities justified a technology mix that was not 
necessarily the lowest cost, e.g., a higher cost Integrated Fixed 
Tower (IFT) solution would be operationally superior to deploying 
lower cost mobile systems onto an active Marine Corps bombing range in 
Arizona that would have to be relocated during frequent range 
exercises. These decisions are documented as part of the final Plan 
and were shared with the GAO during the review. Given the thorough 
nature of this analysis, CBP is not planning further analyses or 
additional documentation at this time. 

* GAO asserts previously fielded SBInet Block 1 systems (TUS-1 and AJO-
1) are likely to require costly system modifications to be effective 
and suitable for current border security efforts--The findings of the 
TUS-1 system Operational Test & Evaluation event showed that the 
system is operationally effective, but limited by operational and 
support concerns. CBP noted these findings in its publicly released 
Plan in January 2011. Many of the concerns identified in the testing 
event were anticipated prior to testing, and the Border Patrol 
operators are effectively using the system in ways that mitigate most 
of the concerns. Additionally, the program office and prime contractor 
have been preparing a system enhancement (due summer of 2012) to 
address several of the identified concerns. Until then, the Border 
Patrol advocates continued use of the Block I system "as-is" to 
maintain the enhanced situational awareness, command and control 
capabilities offered by the system, while gaining additional 
experience and learning prior to implementing significant changes to 
the Block 1 deployments. 

* GAO asserts the Plan's cost estimates, reported to be substantially 
comprehensive and accurate, still require detailed documentation and 
analysis, and may understate contingency budgets to accommodate 
unforeseen needs--The draft suggests CBP may not have sufficient 
contingency budget available to accommodate unforeseen cost growth. 
CBP program officials are mindful of this concern, and accordingly 
developed conservative estimates and integrated a contingency margin 
in our overall budget requests. Based on recent contract award prices 
being less than our estimates, and based on generally accepted 
budgeting practices, CBP is confident that we are adequately resourced 
to accommodate reasonable cost contingencies. Additionally, we are 
preparing updated life cycle cost estimates for the Remote Video 
Surveillance System (RVSS) and the IFT projects, consistent with the 
GAO best practice guidelines, to report the quantifiable risk and 
sensitivity measures for more than 90 percent of the total Plan's cost 
estimate. 

The draft report contained six recommendations directed at DHS, with 
which DHS concurs. Specifically, GAO recommended that the Commissioner 
of CBP: 

Recommendation 1: Ensure that the underlying analyses of the Plan are 
documented in accordance with DHS guidance and internal controls 
standards. 

Response: Concur. CBP has documentation for all of the underlying 
analyses and resulting conclusions of the SBInet Analysis of 
Alternatives, the Arizona "operational assessments," and the completed 
final Plan. CBP has provided the GAO significant documentation, 
interviews, and analytical products covering all facets of the Plan's 
development, including the specified types, quantities, and deployment 
locations of technologies derived from the expert judgment of seasoned 
Border Patrol Agents. The DHS Internal Control Program Management 
Office will work with CBP to ensure Plan documentation is in 
accordance with DHS guidance and internal controls. Estimated 
Completion Date (ECD): May 31, 2012. 

Recommendation 2: Determine the mission benefits to be derived from 
implementation of the plan and develop and apply key attributes for 
metrics to assess program implementation. 

Response: Concur. The Department and CBP are developing a set of 
measures that will assess the effectiveness and mission benefits of 
future technology investments. ECD: April 30, 2012. 

Recommendation 3: Conduct a post-implementation review and operational 
assessment of SBInet, including consideration of the ATEC test 
results, and assess the costs and benefits of addressing the issues 
identified to help ensure the security of the 53 miles already covered 
by SBInet and enhance security on the Arizona border. 

Response: Concur. CBP's Office of Border Patrol (OBP) is working with 
Johns Hopkins Applied Physics Laboratory on a Block 1 After Action 
Review (AAR), which will address the Operational Test & Evaluation 
results and offer recommendations on Tactics, Techniques, and 
Procedures. OTIA and the Border Patrol will conduct a post 
implementation review required in light of the OBP AAR, consistent 
with Departmental policy and procedures for recurring reporting of 
fielded systems. Because of the planned software/hardware upgrades and 
enhancements being developed, OTIA and OBP will coordinate the post-
implementation review and operational analysis next summer following 
the upgrade installations. These updates were already identified by 
OBP to improve the system performance and effectiveness. Additionally, 
OTIA and the Border Patrol continue to collect system performance 
data that highlights the performance of the system. ECD: June 30, 
2012. 

Recommendation 4: Fully document data used in the cost model. 

Response: Concur. OTIA, as well as the cost estimating team involved 
in formulating the updated Plan, applied the GAO Cost Estimating 
Guidelines to the extent practical. All of the cost estimating 
underlying the Arizona AoA is fully documented as part of the Homeland
Security Institute's final report to the government. This 
documentation, coupled with the documented inputs from the Border 
Patrol's operational assessments (specific quantities, locations, 
performance values, and other significant system specifications), 
provide a comprehensive record of the cost estimating methodology and 
data sources employed for the Plan. OTIA is preparing individual RVSS 
and IFT project cost estimates consistent with the GAO guidelines, 
fully documenting all assumptions, data structures and sources, 
methods and calculations, as well as risks and sensitivities for the 
two largest elements of the Plan. ECD: April 30, 2012. 

Recommendation 5: Conduct a sensitivity analysis and risk and 
uncertainty analysis to determine a level of confidence in the 
estimate so that contingency funding can be established relative to 
quantified risk. 

Response: Concur. CBP is updating cost estimates for the RVSS and IFT 
projects, the two largest elements of the Plan. The updated estimates 
will include sensitivity and quantified risk analyses and will enable 
CBP to refine contingency funding as needed. ECD: April 30, 2012. 

Recommendation 6: Independently verify the new life-cycle cost 
estimate with an independent cost estimate and reconcile any 
differences. 

Response: Concur. CBP is updating cost estimates for the RVSS and IFT 
projects, the two largest elements of the Plan. CBP will submit the 
appropriate project documentation, including the projects' Cost 
Estimating Baseline Document and the updated Life Cycle Cost
Estimate, to the Department for independent review and verification of 
the respective projects' methodology and data sources. The Department 
will determine the need for an independent cost estimate at a later 
time. ECD: April 30, 2012. 

Again, thank you for the opportunity to review and comment on this 
draft report. General, technical, and sensitivity comments have been 
provided under separate cover. We look forward to working with you on 
future Homeland Security issues. 

Sincerely, 

Signed by: 

Jim H. Crumpacker: 
Director: 
Departmental GAO-OIG Liaison Office: 

[End of section] 

Appendix V: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Richard Stana, at (202) 512-8777 or RichardS@gao.gov. 

Staff Acknowledgments: 

Chris Keisling, Assistant Director, and Ron Salo, Analyst-in-Charge, 
managed this assignment. David Alexander, Seto Bagdoyan, Charles
Bausell, Justin Dunleavy, Mike Harmond, Richard Hung, Karen Richey, 
and Sean Seales made important contributions to this report. Frances
Cook provided legal assistance, and Tina Cheng provided graphics 
assistance. Katherine Davis contributed to report preparation. 

[End of section] 

Related GAO Products: 

Homeland Security: DHS Could Strengthen Acquisitions and Development 
of New Technologies. [hyperlink, 
http://www.gao.gov/products/GAO-11-829T] (Washington, D.C.: July 15, 
2011). 

Border Security: DHS Progress and Challenges in Securing the U.S. 
Southwest and Northern Borders. [hyperlink, 
http://www.gao.gov/products/GAO-11-508T] (Washington, D.C.: March 30, 
2011). 

Border Security: Preliminary Observations on the Status of Key 
Southwest Border Technology Programs. [hyperlink, 
http://www.gao.gov/products/GAO-11-448T] (Washington, D.C.: March 15, 
2011). 

Secure Border Initiative: DHS Needs to Strengthen Management and 
Oversight of Its Prime Contractor. [hyperlink, 
http://www.gao.gov/products/GAO-11-6] (Washington, D.C.: October 18, 
2010). 

U.S. Customs and Border Protection’s Border Security Fencing, 
Infrastructure and Technology Fiscal Year 2010 Expenditure Plan, 
[hyperlink, http://www.gao.gov/products/GAO-10-877R] (Washington, 
D.C.: July 30, 2010). 

Department of Homeland Security: Assessments of Selected Complex 
Acquisitions, [hyperlink, http://www.gao.gov/products/GAO-10-588SP] 
(Washington, D.C.: June 30, 2010). 

Secure Border Initiative: DHS Needs to Reconsider Its Proposed 
Investment in Key Technology Program. [hyperlink, 
http://www.gao.gov/products/GAO-10-340] (Washington, D.C.: May 5, 
2010). 

Secure Border Initiative: DHS Has Faced Challenges Deploying 
Technology and Fencing Along the Southwest Border, [hyperlink, 
http://www.gao.gov/products/GAO-10-651T] (Washington, D.C.: May 4, 
2010). 

Secure Border Initiative: Testing and Problem Resolution Challenges 
Put Delivery of Technology Program at Risk. [hyperlink, 
http://www.gao.gov/products/GAO-10-511T] (Washington, D.C.: March 18, 
2010). 

Secure Border Initiative: DHS Needs to Address Testing and Performance 
Limitations That Place Key Technology Program at Risk. [hyperlink, 
http://www.gao.gov/products/GAO-10-158] (Washington, D.C.: January 29, 
2010). 

Secure Border Initiative: Technology Deployment Delays Persist and the 
Impact of Border Fencing Has Not Been Assessed. [hyperlink, 
http://www.gao.gov/products/GAO-09-1013T] (Washington, D.C.: September 
17, 2009). 

Secure Border Initiative: Technology Deployment Delays Persist and the 
Impact of Border Fencing Has Not Been Assessed. [hyperlink, 
http://www.gao.gov/products/GAO-09-896] (Washington, D.C.: September 
9, 2009). 

Customs and Border Protection’s Secure Border Initiative Fiscal Year 
2009 Expenditure Plan. GAO-09-274R (Washington, D.C.: April 30, 2009). 

Secure Border Initiative Fence Construction Costs. [hyperlink, 
http://www.gao.gov/products/GAO-09-244R] (Washington, D.C.: January 
29, 2009). 

Northern Border Security: DHS’s Report Could Better Inform Congress by 
Identifying Actions, Resources, and Time Frames Needed to Address 
Vulnerabilities. [hyperlink, http://www.gao.gov/products/GAO-09-93] 
(Washington, D.C.: November 25, 2008). 

Department of Homeland Security: Billions Invested in Major Programs 
Lack Appropriate Oversight. [hyperlink, 
http://www.gao.gov/products/GAO-09-29] (Washington, D.C.: November 18, 
2008). 

Secure Border Initiative: DHS Needs to Address Significant Risks in 
Delivering Key Technology Investment. [hyperlink, 
http://www.gao.gov/products/GAO-08-1086] (Washington, D.C.: September 
22, 2008). 

Secure Border Initiative: DHS Needs to Address Significant Risks in 
Delivering Key Technology Investment. [hyperlink, 
http://www.gao.gov/products/GAO-08-1148T] (Washington, D.C.: September 
10, 2008). 

Secure Border Initiative: Observations on Deployment Challenges. 
[hyperlink, http://www.gao.gov/products/GAO-08-1141T] (Washington, 
D.C.: September 10, 2008). 

Secure Border Initiative Fiscal Year 2008 Expenditure Plan Shows 
Improvement, but Deficiencies Limit Congressional Oversight and DHS 
Accountability. [hyperlink, http://www.gao.gov/products/GAO-08-739R] 
(Washington, D.C.: June 26, 2008). 

Department of Homeland Security: Better Planning and Oversight Needed 
to Improve Complex Service Acquisition Outcomes. [hyperlink, 
http://www.gao.gov/products/GAO-08-765T] (Washington, D.C.: May 8, 
2008). 

Department of Homeland Security: Better Planning and Assessment Needed 
to Improve Outcomes for Complex Service Acquisitions. [hyperlink, 
http://www.gao.gov/products/GAO-08-263] (Washington, D.C.: April 22, 
2008). 

Secure Border Initiative: Observations on the Importance of Applying 
Lessons Learned to Future Projects. [hyperlink, 
http://www.gao.gov/products/GAO-08-508T] (Washington, D.C.: February 
27, 2008). 

Secure Border Initiative: Observations on Selected Aspects of SBInet 
Program Implementation. [hyperlink, 
http://www.gao.gov/products/GAO-08-131T] (Washington, D.C.: October 
24, 2007). 

Secure Border Initiative: SBInet Planning and Management Improvements 
Needed to Control Risk. [hyperlink, 
http://www.gao.gov/products/GAO-07-504T] (Washington, D.C.: February 
27, 2007). 

Secure Border Initiative: SBInet Expenditure Plan Needs to Better 
Support Oversight and Accountability. [hyperlink, 
http://www.gao.gov/products/GAO-07-309] (Washington, D.C.: February 
15, 2007). 

[End of section] 

Footnotes: 

[1] GAO reported concerns about SBInet in a number of products. For 
example, in May 2010, we reported our concerns regarding DHS’s 
management of the program, see GAO, Secure Border Initiative: DHS 
Needs to Reconsider Its Proposed Investment in Key Technology Program, 
[hyperlink, http://www.gao.gov/products/GAO-10-340] (Washington, D.C.: 
May 5, 2010); and in September 2008, we reported that SBInet was at 
risk because of a number of acquisition management weaknesses, and we 
made recommendations to address them that DHS largely agreed with and 
committed to addressing, see GAO, Secure Border Initiative: DHS Needs 
to Address Significant Risks in Delivering Key Technology Investment, 
[hyperlink, http://www.gao.gov/products/GAO-08-1086] (Washington, 
D.C.: Sept. 22, 2008). 

[2] CBP’s Office of Technology Innovation and Acquisitions (OTIA) was 
created to ensure all of CBP’s technology efforts are properly focused 
on the mission and are well integrated, and to strengthen CBP’s 
expertise and effectiveness in program management and acquisition. 

[3] The “analysis of alternatives” (AOA) analyzed the cost 
effectiveness of technology alternatives to SBInet for Arizona and was 
intended to inform Border Patrol’s development of the Arizona Border 
Surveillance Technology Plan. 

[4] Office of Management and Budget, Circular No. A-11, Preparation, 
Submission, and Execution of the Budget (Washington, D.C.: August 
2011) (hereinafter referred to as OMB Circular A-11); Circular No. A-
130 Revised, Management of Federal Information Resources (Washington, 
D.C.: Nov. 28, 2000); and Office of Management and Budget, Capital 
Programming Guide: Supplement to Circular A-11, Part 7, Preparation, 
Submission, and Execution of the Budget (Washington, D.C.: June 2006); 
and GAO, Cost Estimating and Assessment Guide: Best Practices for 
Developing and Managing Capital Program Costs, [hyperlink, 
http://www.gao.gov/products/GAO-09-3SP], (Washington, D.C.: March 
2009). 

[5] GAO, Department of Homeland Security: Billions Invested in Major 
Programs Lack Appropriate Oversight, [hyperlink, 
http://www.gao.gov/products/GAO-09-29] (Washington, D.C.: Nov. 18, 
2008); GAO, Department of Homeland Security: Assessments of Selected 
Complex Acquisitions, [hyperlink, 
http://www.gao.gov/products/GAO-10-588SP] (Washington, D.C.: June 30, 
2010); and GAO, Secure Border Initiative: DHS Needs to Reconsider Its 
Proposed Investment in Key Technology Program, [hyperlink, 
http://www.gao.gov/products/GAO-10-340] (Washington, D.C.: May 5, 
2010). 

[6] An RVSS is a system of towers with cameras that transmit 
information to video monitors at a Border Patrol facility. 

[7] An MSS consists of camera and radar systems mounted on a truck, 
with images transmitted to and monitored on a computer screen in the 
truck’s passenger compartment. For a picture of an MSS, see 
appendix II. 

[8] CBP officials said that the five IFT systems collectively would 
consist of 52 fixed towers, about 39 fewer towers than would have been 
the case under the original SBInet deployment that was canceled. 

[9] Although different organizations use different names for these 
decision packages, such as business cases or project requests, the 
packages generally include documents and analyses to support a 
proposed investment. 

[10] The Homeland Security Studies and Analysis Institute is a 
federally funded research and development center that provides 
independent analysis of homeland security issues. 

[11] SBInet AOA Report of the Independent Review Team Final Report 
(March 28, 2011). The independent review team was composed of staff 
members from two consulting groups and the Air Force Materiel Command, 
Office of Aerospace Studies. 

[12] The department’s new technology deployment plan for the entire 
southwest border is called the Alternative (Southwest) Border 
Technology plan. This plan, of which the Arizona Border Surveillance 
Technology Plan is a part, is still being developed. 

[13] GAO, Internal Control: Standards for Internal Control in the 
Federal Government, [hyperlink, 
http://www.gao.gov/products/GAO/AIMD-00-21-3.1] (Washington, D.C.: 
November 1999); and Office of Management and Budget, Circular A-123, 
Management’s Responsibility for Internal Control (Washington, D.C.: 
Dec. 21, 2004). 

[14] OMB Circular A-11 requires budget submissions to have undergone 
the scrutiny of cost and performance risk analyses. 

[15] The Acquisition Review Board (ARB) is a cross-component 
organization within the department that determines whether a proposed 
acquisition has met the requirements of key phases in the acquisition 
life-cycle framework and is able to proceed to the next phase and 
eventual full production and deployment. 

[16] Acquisition planning documents include (1) a Mission Need 
Statement to provide a description of the strategic need for an 
investment; (2) an Operational Requirements Document to provide a 
bridge between the functional requirements of the mission needs 
statement and the detailed technical requirements that form the basis 
of the performance specifications; and (3) an Acquisition Program 
Baseline to identify operational requirements for addressing the 
program’s critical cost, schedule, and performance parameters. 

[17] GAO, Secure Border Initiative: DHS Needs to Reconsider Its 
Proposed Investment in Key Technology Program, [hyperlink, 
http://www.gao.gov/products/GAO-10-340] (Washington, D.C.: May 5, 
2010). 

[18] According to OMB Circular A-11, Part 6, Section 200, performance 
measurement should include program accomplishments in terms of outputs 
(quantity of products or services provided) and outcomes (results of 
providing outputs in terms of effectively meeting intended agency 
mission objectives), as well as indicators, statistics, or metrics 
used to gauge program performance. 

[19] Clinger-Cohen Act of 1996, 40 U.S.C. §§ 11101-11703, and Office 
of Management and Budget, Circular No. A-130 Revised, Management of 
Federal Information Resources (Washington, D.C.: Nov. 28, 2000). 

[20] A sound business case includes well-defined requirements, 
preliminary design, realistic cost estimates, and mature technology, 
according to GAO, NASA’s Space Vision: Business Case for Prometheus 1 
Needed to Ensure Requirements Match Available Resources, [hyperlink, 
http://www.gao.gov/products/GAO-05-242] (Washington, D.C.: Feb. 28, 
2005). 

[21] Pub. L. No. 103-62, 107 Stat. 285 (1993), amended by Government 
Performance and Results Act (GPRA) Modernization Act of 2010, Pub. L. 
No. 111-352, 124 Stat. 3866. 

[22] For the fiscal year 2012 budget submission, CBP did not provide 
an A-11 Exhibit 300 in support of the emerging (at that time) Arizona 
Technology Plan. The OMB’s Circular, A-11, Exhibit 300 cycle had 
already commenced and was nearing completion at the time the final 
Arizona Plan-—including specifically the Integrated Fixed Tower plan 
and associated cost estimates-—was submitted for approval. In lieu of 
the Exhibit 300 documentation, information was provided to, and 
discussed with, OMB during the final fiscal year 2012 Border Security, 
Fencing, Infrastructure and Technology budget submission. As part of 
this year’s fiscal year 2013 budget Exhibit 300 cycle, CBP is 
preparing and submitting Exhibit 300s for the appropriate Arizona 
Technology capital projects. These will not be available for release 
until OMB concludes the cycle later this fall. 

[23] GAO, Employment and Training Programs: Opportunities Exist for 
Improving Efficiency, [hyperlink, 
http://www.gao.gov/products/GAO-11-506T] (Washington, D.C.: Apr. 7, 
2011). 

[24] Office of Management and Budget, OMB Circular No. A-11, Part 7, 
Section 300, Planning, Budgeting, Acquisition, and Management of 
Capital Assets (Washington, D.C.: Executive Office of the President, 
July 2010). 

[25] The Border Patrol began using SBInet at Tucson-1 in February 2010 
and at Ajo-1 in August 2010. 

[26] ATEC is the operational test agency for the SBInet Block 1 
deployment at Tucson-1. In this capacity, ATEC provides an independent 
evaluation of the system’s operational effectiveness and suitability. 
ATEC issued its final evaluation report in March 2011: U.S. Army Test 
and Evaluation Command, Operational Test Agency Evaluation Report for 
the Secure Border Initiative Network (SBInet) Block 1.0 (Aberdeen 
Proving Ground, Md.: Mar. 29, 2011). 

[27] Detection is a visual determination by the COP operator that 
items of interest are present in the field of view of SBInet cameras. 
Identification is a visual determination by the COP operator that the 
detected item of interest is a person, vehicle, or animal. 
Classification is a determination by the COP operator that the 
identified item of interest can be assigned a designation of migrants, 
traffickers, or other. 

[28] Border Patrol SBInet operators completed 61 questionnaires and 
field agents completed 103 questionnaires about their opinions of the 
system at the end of shifts over the course of the test period. Some 
operators and field agents may have completed multiple questionnaires, 
if they had worked on more than one shift during testing. Seventeen of 
the operators and 10 of the field agents who participated in 
operational testing completed questionnaires about their opinions of 
the system at the end of the test period. CBP and Border Patrol 
officials and operators with whom we spoke also had favorable opinions 
of the system. 

[29] [hyperlink, http://www.gao.gov/products/GAO-11-448T]. 

[30] Chief Information Officer, Department of Homeland Security, 
Capital Planning and Investment Control (CPIC) Guide, Version 4.0 
(Washington, D.C.: May 2007). 

[31] Office of Management and Budget, Capital Programming Guide V 2.0 
Supplement to Office of Management and Budget, Circular A–11, Part 7: 
Planning, Budgeting, and Acquisition of Capital Assets (Washington, 
D.C.: Executive Office of the President, June 2006). 

[32] Chief Information Officer, Department of Homeland Security, 
Capital Planning and Investment Control (CPIC) Guide, Version 4.0 
(Washington, D.C.: May 2007). 

[33] The Arizona Border Surveillance Technology Plan limits 
acquisition to proven, fully-integrated, non-developmental systems 
suitable for operations in remote, isolated areas along the border. 

[34] $1.54 billion in then-year dollars. Then-year dollars reflect 
the cost at the time of the procurement. 

[35] Office of Management and Budget, Capital Programming Guide V 2.0 
Supplement to Office of Management and Budget, Circular A–11, Part 7: 
Preparation, Submission, and Execution of the Budget (Washington, 
D.C.: Executive Office of the President, June 2006) and [hyperlink, 
http://www.gao.gov/products/GAO-09-3SP]. 

[36] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. 

[37] A sensitivity analysis also requires estimating the high and low 
uncertainty ranges for significant cost driver input factors. To 
determine what the key cost drivers are, a cost estimator needs to 
determine the percentage of total cost that each cost element 
represents. The major contributing variables within the highest 
percentage cost elements are the key cost drivers that should be 
varied in a sensitivity analysis. 
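
The screening and variation steps described above can be illustrated 
with a short example. The following is a minimal sketch in Python; 
the cost elements, dollar values, percentage threshold, and 
uncertainty ranges are hypothetical and are not drawn from CBP’s 
estimates. 

# Illustrative sketch only: the cost elements, dollar values,
# percentage threshold, and uncertainty ranges are hypothetical.

cost_elements = {  # hypothetical life-cycle cost elements, $ millions
    "towers_and_sensors": 400.0,
    "command_and_control": 150.0,
    "operations_and_maintenance": 500.0,
    "program_management": 50.0,
}

total = sum(cost_elements.values())

# Step 1: determine the percentage of total cost that each cost
# element represents.
shares = {name: value / total for name, value in cost_elements.items()}

# Step 2: treat the highest-percentage elements as the key cost
# drivers (here, any element contributing at least 25 percent).
key_drivers = [name for name, share in shares.items() if share >= 0.25]

# Step 3: vary each key driver between a low and a high uncertainty
# range (here, -20 percent and +30 percent) and recompute the total.
for name in key_drivers:
    for label, factor in (("low", 0.80), ("high", 1.30)):
        varied = dict(cost_elements)
        varied[name] = cost_elements[name] * factor
        print(f"{name} ({label}): total = {sum(varied.values()):.0f}")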

[38] High-quality cost estimates usually fall within a range of 
possible costs, with a point estimate being located between extremes. 
Having a range of costs around a point estimate is more useful to 
decision makers because it conveys the level of confidence in 
achieving the most likely cost and also informs them about cost, 
schedule, and technical risks. Without a cost risk and uncertainty 
analysis, management cannot determine a defensible level of 
contingency reserves, and the estimate is unrealistic because it does 
not assess the variability in the cost estimate. 
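
The role of a cost risk and uncertainty analysis described above can 
be illustrated with a short example. The following is a minimal 
sketch in Python of a simple Monte Carlo simulation that places a 
point estimate within a range of possible costs; the cost elements, 
triangular distributions, and 80 percent confidence level are 
hypothetical and are not drawn from CBP’s estimates. 

# Illustrative sketch only: the cost elements, distributions, and
# confidence level are hypothetical.
import random

random.seed(1)  # reproducible draws for the illustration

# Hypothetical cost elements as (low, most likely, high), $ millions.
elements = {
    "towers_and_sensors": (320.0, 400.0, 560.0),
    "command_and_control": (120.0, 150.0, 210.0),
    "operations_and_maintenance": (400.0, 500.0, 750.0),
}

# The point estimate is the sum of the most likely values.
point_estimate = sum(mode for _, mode, _ in elements.values())

# Draw each element from a triangular distribution and sum, many
# times, to build a distribution of possible total costs.
totals = sorted(
    sum(random.triangular(low, high, mode)
        for low, mode, high in elements.values())
    for _ in range(10_000)
)

def percentile(values, p):
    return values[int(p * (len(values) - 1))]

# Confidence associated with the point estimate, and a contingency
# reserve sized to reach an 80 percent confidence level.
confidence = sum(t <= point_estimate for t in totals) / len(totals)
reserve = percentile(totals, 0.80) - point_estimate
print(f"Point estimate: {point_estimate:.0f}")
print(f"Confidence of meeting the point estimate: {confidence:.0%}")
print(f"Contingency reserve for 80% confidence: {reserve:.0f}")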

[39] An independent estimate provides an independent view of expected 
program costs that tests the program office’s estimate for 
reasonableness. It is usually developed from the same technical 
baseline description the program office used, but it is typically 
performed by an organization higher in the decision-making process 
than the office performing the baseline estimate. Without an 
independent cost estimate, decision makers will lack insight into a 
program’s potential costs because independent cost estimates 
frequently use different methods and are less burdened with 
organizational bias. 

[40] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. 

[41] GAO, Secure Border Initiative: DHS Needs to Reconsider Its 
Proposed Investment in Key Technology Program, [hyperlink, 
http://www.gao.gov/products/GAO-10-340] (Washington, D.C.: May 5, 
2010). 

[42] GAO, Homeland Security: DHS Could Strengthen Acquisitions and 
Development of New Technologies, [hyperlink, 
http://www.gao.gov/products/GAO-11-829T] (Washington, D.C.: July 15, 
2011). 

[43] The operational test results were based on testing activities 
conducted by the Army on SBInet Block 1 during October and November 
2010. 

[44] GAO, Secure Border Initiative: DHS Needs to Reconsider Its 
Proposed Investment in Key Technology Program, [hyperlink, 
http://www.gao.gov/products/GAO-10-340] (Washington, D.C.: May, 5, 
2010); and Secure Border Initiative: DHS Needs to Address Significant 
Risks in Delivering Key Technology Investment, [hyperlink, 
http://www.gao.gov/products/GAO-08-1086] (Washington, D.C.: Sept. 22, 
2008). 

[45] The CBP cost data we relied on was developed by CBP during July 
and August 2010. 

[46] GAO, Cost Estimating and Assessment Guide: Best Practices for 
Developing and Managing Capital Program Costs, [hyperlink, 
http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: March 2009). 

[End of section] 

GAO’s Mission: 

The Government Accountability Office, the audit, evaluation, and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the 
performance and accountability of the federal government for the 
American people. GAO examines the use of public funds; evaluates 
federal programs and policies; and provides analyses, recommendations, 
and other assistance to help Congress make informed oversight, policy, 
and funding decisions. GAO’s commitment to good government is 
reflected in its core values of accountability, integrity, and 
reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO’s website [hyperlink, http://www.gao.gov]. Each 
weekday afternoon, GAO posts on its website newly released reports, 
testimony, and correspondence. To have GAO e-mail you a list of newly 
posted products, go to [hyperlink, http://www.gao.gov] and select “E-
mail Updates.” 

Order by Phone: 

The price of each GAO publication reflects GAO’s actual cost of 
production and distribution and depends on the number of pages in the 
publication and whether the publication is printed in color or black 
and white. Pricing and ordering information is posted on GAO’s 
website, [hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or 
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card, 
MasterCard, Visa, check, or money order. Call for additional 
information. 

Connect with GAO: 

Connect with GAO on Facebook, Flickr, Twitter, and YouTube.
Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts.
Visit GAO on the web at www.gao.gov. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 
Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]; 
E-mail: fraudnet@gao.gov; 
Automated answering system: (800) 424-5454 or (202) 512-7470. 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov, (202) 512-4400
U.S. Government Accountability Office, 441 G Street NW, Room 7125
Washington, DC 20548. 

Public Affairs: 
Chuck Young, Managing Director, youngc1@gao.gov, (202) 512-4800
U.S. Government Accountability Office, 441 G Street NW, Room 7149 
Washington, DC 20548.