This is the accessible text file for GAO report number GAO-10-860 
entitled 'Homeland Security: US-VISIT Pilot Evaluations Offer Limited 
Understanding of Air Exit Options' which was released on August 10, 
2010. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as 
part of a longer term project to improve GAO products' accessibility. 
Every attempt has been made to maintain the structural and data 
integrity of the original printed product. Accessibility features, 
such as text descriptions of tables, consecutively numbered footnotes 
placed at the end of the file, and the text of agency comment letters, 
are provided but may not exactly duplicate the presentation or format 
of the printed version. The portable document format (PDF) file is an 
exact electronic replica of the printed version. We welcome your 
feedback. Please E-mail your comments regarding the contents or 
accessibility features of this document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to Congressional Committees: 

United States Government Accountability Office: 
GAO: 

August 2010: 

Homeland Security: 

US-VISIT Pilot Evaluations Offer Limited Understanding of Air Exit 
Options: 

GAO-10-860: 

GAO Highlights: 

Highlights of GAO-10-860, a report to congressional committees. 

Why GAO Did This Study: 

The Department of Homeland Security’s (DHS) U.S. Visitor and Immigrant 
Status Indicator Technology (US-VISIT) program is to control and 
monitor the entry and exit of foreign visitors by storing and 
processing biometric and biographic information. The entry capability 
has operated since 2006; an exit capability is not yet implemented. In 
September 2008, the Consolidated Security, Disaster Assistance, and 
Continuing Appropriations Act, 2009, directed DHS to pilot air exit 
scenarios with the U.S. Customs and Border Protection (CBP) and 
airlines, and to provide a report to congressional committees. DHS 
conducted CBP and Transportation Security Administration (TSA) pilots 
and issued its evaluation report in October 2009. 

Pursuant to the act, GAO reviewed the evaluation report to determine 
the extent to which (1) the report addressed statutory conditions and 
legislative directions; (2) the report aligned with the scope and 
approach in the pilot evaluation plan; (3) the pilots were conducted 
in accordance with the evaluation plan; and (4) the evaluation plan 
satisfied relevant guidance. To do so, GAO compared the report to 
statutory conditions, the evaluation plan, and relevant guidance. 

What GAO Found: 

The evaluation report partially addressed statutory conditions and 
legislative directions and expectations. Specifically, the report 
addressed the statutory condition for CBP to collect biometric 
information on exiting foreign nationals and four legislative 
directions and expectations for conducting the pilots. However, DHS 
was unable to address the statutory condition for an airline scenario 
because no airline was willing to participate. Also, the report did 
not meet a legislative expectation for gathering information on the 
security of information collected from visitors subject to US-VISIT. 
DHS officials told us that DHS did not view the expectation in the 
House report as a requirement. Moreover, they said that security 
requirements were tested prior to the pilots and there were no 
reported security incidents. However, DHS did not supply documentation 
that demonstrated the operational verification of pilot security 
requirements. 

The evaluation report generally aligned with the scope and approach in 
the evaluation plan. Specifically, the objectives and operational 
conditions described in the evaluation report were generally 
consistent with the evaluation plan. However, the report did not fully 
align with the evaluation plan because certain metrics, observations, 
and costs (e.g., percentage of system downtime or inoperability, costs 
for requirements analysis) were not reported as planned. Also, the 
reported scope and approach of the pilots included limitations not 
defined in the plan, such as suspending exit screening at departure 
gates to avoid flight delays. Such divergence was due, in part, to a 
desire to minimize the pilots' impact on the airports, airlines, and 
travelers. 

The pilots were not conducted in accordance with the evaluation plan, 
in that they did not meet the plan’s stated purpose of operationally 
evaluating the air exit requirements. More specifically, about 30 
percent of the requirements were not operationally tested, either as 
part of the pilots or as part of another exit project. Rather, they 
were tested, for example, prior to commencement of pilot operations or 
as part of another exit project that has yet to complete operational 
testing. DHS officials considered such testing of requirements to be 
sufficient. 

The evaluation plan did not satisfy relevant guidance, such as 
defining standards for gauging the pilots’ performance, defining a 
comprehensive methodology for selecting airports and flights, and 
planning data analysis to ensure that the results of the evaluation 
support air exit decision making. The evaluation plan diverged from 
such guidelines, in part, because DHS viewed reporting on how the 
pilot results would be used to be outside the scope of its report. 

Collectively, the above limitations in scope, approach, and reporting 
restrict the pilots’ ability to inform a decision for a long-term air 
exit solution and point to the need for DHS to leverage compensating 
sources of information on air exit’s operational impacts in making air 
exit solution decisions. 

What GAO Recommends: 

GAO recommends that the Secretary of Homeland Security identify 
additional sources of information beyond the pilots to inform a 
strategic air exit solution decision. DHS agreed with the 
recommendation. 

View [hyperlink, http://www.gao.gov/products/GAO-10-860] or key 
components. For more information, contact Randolph C. Hite, (202) 512-
3439, hiter@gao.gov. 

[End of section] 

Contents: 

Letter: 

Results in Brief: 

Conclusions: 

Recommendation for Executive Action: 

Agency Comments and Our Evaluation: 

Appendix I: Briefing to Staff of Congressional Committees: 

Appendix II: Comments from the Department of Homeland Security: 

Appendix III: GAO Contact and Staff Acknowledgments: 

Abbreviations: 

ADIS: Arrival and Departure Information System: 

CBP: U.S. Customs and Border Protection: 

DHS: Department of Homeland Security: 

IDENT: Automated Biometric Identification System: 

POE: port of entry: 

RFID: radio frequency identification: 

TSA: Transportation Security Administration: 

US-VISIT: U.S. Visitor and Immigrant Status Indicator Technology: 

[End of section] 

United States Government Accountability Office:
Washington, DC 20548: 

August 10, 2010: 

The Honorable Frank R. Lautenberg: 
Interim Chairman: 
The Honorable George Voinovich: 
Ranking Member: 
Subcommittee on Homeland Security: 
Committee on Appropriations: 
United States Senate: 

The Honorable David E. Price: 
Chairman: 
The Honorable Harold Rogers: 
Ranking Member: 
Subcommittee on Homeland Security: 
Committee on Appropriations: 
House of Representatives: 

Congress and the executive branch have long sought to improve the 
integrity and security of U.S. borders through better ways to record 
and track the arrival and departure of foreign travelers through U.S. 
air, sea, and land ports of entry (POE). Pursuant to a series of 
statutory mandates, the Department of Homeland Security (DHS), in 
coordination with the Department of State, established the U.S. 
Visitor and Immigrant Status Indicator Technology (US-VISIT) program 
to use biometric and biographic information to control and monitor the 
pre-entry, entry, status, and exit of certain foreign visitors and 
immigrants. This program is intended to enhance the security of U.S. 
citizens and visitors, facilitate legitimate travel and trade, ensure 
the integrity of the U.S. immigration system, and protect the privacy 
of visitors to the United States. 

Since 2006, DHS has been operating a US-VISIT entry capability at 
about 300 air, sea, and land POEs, and has conducted evaluations and 
proof-of-concept experiments to further define a US-VISIT exit 
capability. In April 2008, DHS announced its intention to implement 
biometric exit verification at air and sea POEs in a Notice of 
Proposed Rule Making.[Footnote 1] Under this notice, commercial air 
and sea carriers would be responsible for developing and deploying the 
capability to collect the biometrics from departing travelers and 
transmit them to DHS. DHS received comments on the notice and has yet 
to publish a final rule. Subsequent to the rule making notice, on 
September 30, 2008, the Consolidated Security, Disaster Assistance, 
and Continuing Appropriations Act, 2009, was enacted, which directed 
DHS to test two scenarios for an air exit solution.[Footnote 2] The 
legislative history also provided accompanying direction to DHS in 
carrying out the pilot tests of the air exit solution. The act also 
required DHS to submit a report on the pilot tests and required that 
we review this report. 

The act prohibited DHS from obligating any US-VISIT funds provided in 
the act for the implementation of an air exit solution until the 
department provided a report to the Senate and House Committees on 
Appropriations on pilot tests for the solution that addressed the two 
scenarios: U.S. Customs and Border Protection (CBP) collects biometric 
exit data at airport departure gates; and airlines collect and 
transmit such data. 

The explanatory statement[Footnote 3] that accompanied the act, and 
the House Report[Footnote 4] incorporated by reference into the 
explanatory statement, provided further legislative direction for the 
conduct of the pilots. DHS issued its Air Exit Pilots Evaluation Plan 
in May 2009 and operated two air exit pilots from May 2009 until July 
2009. DHS submitted its Air Exit Pilots Evaluation Report to the House 
and Senate Appropriations Subcommittees on Homeland Security in 
October 2009. According to the US-VISIT Acting Deputy Program Director 
and agency documentation, the pilot results are one of several sources 
of information that are to be used to inform its decision about a long-
term air exit capability. 

Pursuant to the act's requirement that we review DHS's US-VISIT pilot 
evaluation report, we determined the extent to which (1) the 
evaluation report addresses statutory conditions and legislative 
directions; (2) the evaluation report aligns with the scope and 
approach in the evaluation plan; (3) the pilots were conducted in 
accordance with the evaluation plan; and (4) the evaluation plan 
satisfies relevant guidance. To accomplish our objectives, we compared 
(1) the evaluation report to applicable statutory conditions and 
legislative directions specified in the DHS fiscal year 2009 
appropriations act and the accompanying explanatory statement and 
House report; (2) the evaluation's reported objectives, scope, 
approach, and limitations with those found in the pilots' plan 
(including its evaluation framework[Footnote 5]); (3) planned 
evaluations and tests with pilot execution documentation, including 
business and system requirement test results; and (4) the evaluation 
plan to relevant guidance for evaluation planning that we had 
previously identified during reviews of federal pilot projects. 
[Footnote 6] 

On June 10, 2010, we briefed your staffs on the results of our review. 
This report summarizes and transmits the presentation slides we used 
to brief the staff, which included a recommendation to the Secretary 
of Homeland Security. The full briefing materials, including details 
on our scope and methodology, are reprinted as appendix I.[Footnote 7] 

We conducted this performance audit at US-VISIT program offices in 
Arlington, Virginia, from November 2009 to August 2010 in accordance 
with generally accepted government auditing standards. Those standards 
require that we plan and perform the audit to obtain sufficient, 
appropriate evidence to provide a reasonable basis for our findings 
and conclusions based on our audit objectives. We believe that the 
evidence obtained provides a reasonable basis for our findings and 
conclusions based on our audit objectives. 

Results in Brief: 

The two US-VISIT Air Exit Pilots that DHS planned, executed, and 
reported to the House and Senate Appropriations Committees were 
limited in the information that they contributed toward the 
department's understanding of an air exit solution's operational 
impacts. Specifically, 

* The evaluation report addressed one statutory requirement for a CBP 
scenario to collect biometric information on exiting foreign 
nationals, and four of the legislative directions and expectations for 
conducting the pilots. However, DHS was unable to address the 
statutory requirement for an airline scenario because no airline was 
willing to participate. Also, the report did not meet a legislative 
expectation for gathering information on the security of information 
collected from visitors subject to US-VISIT during the pilots. DHS 
officials told us that DHS did not view the expectation of the House 
report as a requirement. Moreover, they said security requirements 
were tested prior to the pilots and there were no reported security 
incidents. However, DHS did not supply documentation that demonstrated 
the operational verification of pilot security requirements.[Footnote 
8] 

* The objectives and operational conditions described in the 
evaluation report were generally consistent with the evaluation plan. 
However, the report did not fully align with the evaluation plan 
because certain metrics, observations, and costs (e.g., percentage of 
system downtime or inoperability, costs for requirements analysis) 
were not reported as planned. Also, the reported scope and approach of 
the pilots included limitations not defined in the plan, such as 
suspending exit screening at departure gates to avoid flight delays. 
Such divergence was due, in part, to a desire to minimize the pilots' 
impact on airports, airlines, and travelers. 

* The pilots were not conducted in accordance with the evaluation 
plan's stated purpose of operationally evaluating the air exit 
requirements. More specifically, about 30 percent of the requirements 
were not operationally tested, either as part of the pilots or as part 
of another exit project. Rather, they were tested, for example, prior 
to commencement of pilot operations or as part of another exit project 
that has yet to complete operational testing. DHS officials considered 
such testing of requirements to be sufficient. 

* The evaluation plan did not implement relevant pilot project 
guidance, such as defining standards for gauging the pilots' 
performance, defining a comprehensive methodology for selecting 
airports and flights, and planning data analysis to ensure that the 
results of the evaluation support air exit decision making. The 
pilots' evaluation plan diverged from such guidelines, in part, 
because DHS viewed reporting on how the pilot results would be used to 
be outside the scope of its report. 

Collectively, these limitations in the pilots' scope, approach, and 
reporting restrict the pilots' ability to inform a decision for a long-
term air exit solution and highlight the need for compensating sources 
of information on air exit's operational impacts. 

Conclusions: 

DHS has long been challenged in its ability to deliver the exit 
portion of US-VISIT and thereby have a biometrically based capability 
for knowing the status of foreign nationals who have entered the 
country. To help address these challenges, Congress directed DHS to 
conduct two pilot tests so that the department might gain a better 
understanding of the operational impact of implementing different exit 
solutions at air ports of entry. However, the degree to which the 
results of these pilots can inform DHS's future decisions was limited 
because the department was unable to test one scenario and did not 
meet a congressional expectation. Further, it was limited in the 
extent to which it followed defined pilot plans and reported all 
expected results in the evaluation report. Moreover, the scope and 
approach defined in the plans that governed the pilots' execution were 
also limited by conditions disclosed in the plan and the report, as 
well as by the extent and timing of requirements testing. DHS 
officials attributed key limitations to schedule constraints and 
decisions to intentionally limit the pilots' scope and impacts on 
travelers, air carriers, and airports. However, the collective result 
is that the pilots cannot alone adequately inform future DHS decisions 
on an exit solution for air ports of entry. If these limitations in 
the pilots are not otherwise compensated for with other information 
sources on the operational impacts of implementing an air exit 
solution, such as comments on the Notice of Proposed Rule Making, then 
the department will continue to be challenged in its ability to 
deliver US-VISIT exit capabilities in airports. 

Recommendation for Executive Action: 

To the extent that the limitations in the Air Exit Pilots are not 
addressed through other information sources, we recommend that the 
Secretary of Homeland Security direct the Under Secretary for National 
Protection and Programs to have the US-VISIT Program Director identify 
additional sources for the operational impacts of air exit not 
addressed in the pilots' evaluation and to incorporate these sources 
into its air exit decision making and planning. 

Agency Comments and Our Evaluation: 

In written comments on a draft of this report, signed by the Director, 
Departmental GAO/OIG Liaison Office and reprinted in appendix II, DHS 
concurred with our recommendation and clarified its statements 
regarding a congressional report expectation. DHS also provided 
technical comments and suggested corrections, which we have 
incorporated into the report as appropriate. 

We are sending copies of this report to the Secretary of Homeland 
Security, appropriate congressional committees, and other interested 
parties. In addition, the report is available at no charge on the GAO 
Web site at [hyperlink, http://www.gao.gov]. 

Should you or your staffs have questions on matters discussed in this 
report, please contact me at (202) 512-3439 or hiter@gao.gov. Contact 
points for our Offices of Congressional Relations and Public Affairs 
may be found on the last page of this report. GAO staff who made 
major contributions to this report are listed in appendix III. 

Signed by: 

Randolph C. Hite: 
Director, Information Technology Architecture and System Issues: 

[End of section] 

Appendix I: Briefing to Staff of Congressional Committees: 

Homeland Security: US-VISIT Pilot Evaluations Offer Limited 
Understanding of Air Exit Options: 

Briefing for staff members of the Subcommittees on Homeland Security: 

Senate and House Committees on Appropriations: 

June 10, 2010[A]: 

[A] Slides 31, 64, 67, and 69 of this briefing were amended after the 
date it was provided to the committees to make a technical correction 
reflecting updated information. 

Briefing Overview: 
Introduction; 
Objectives; 
Results in Brief; 
Background; 
Results; 
* Objective 1; 
* Objective 2; 
* Objective 3; 
* Objective 4; 
Conclusions; 
Recommendation for Executive Action; 
Agency Comments and Our Evaluation. 
Attachment 1: Objectives, Scope, and Methodology; 
Attachment 2: Detailed US-VISIT Processes and Systems; 
Attachment 3: Detailed Description of Air Exit Pilots; 
Attachment 4: Limitations in Pilot Data Collection. 

Introduction: 

Congress and the executive branch have long sought to improve the 
integrity and security of U.S. borders through better ways to record 
and track the arrival and departure of foreign travelers through U.S. 
air, sea, and land ports of entry (POE). 

Pursuant to a series of statutory mandates, the Department of Homeland 
Security (DHS), in coordination with the Department of State, 
established the U.S. Visitor and Immigrant Status Indicator Technology 
(US-VISIT) program to use biometric and biographic information to 
control and monitor the pre-entry, entry, status, and exit of certain 
foreign visitors and immigrants. This program is intended to: 

* enhance the security of U.S. citizens and visitors, 

* facilitate legitimate travel and trade, 

* ensure the integrity of the U.S. immigration system, and, 

* protect the privacy of visitors to the United States. 

Since 2006, DHS has been operating a US-VISIT entry capability at 
about 300 air, sea, and land POEs, and has conducted evaluations and 
proof-of-concept experiments to further define a US-VISIT exit 
capability. 

In April 2008, DHS announced its intention to implement biometric exit 
verification at air and sea POEs in a Notice of Proposed Rule Making. 
[Footnote 9] Under this notice, commercial air and sea carriers would 
be responsible for developing and deploying the capability to collect 
the biometrics from departing travelers and transmit them to DHS. DHS 
received comments on the notice and has yet to publish a final rule. 

Subsequent to the rule making notice, on September 30, 2008, the 
Consolidated Security, Disaster Assistance, and Continuing 
Appropriations Act, 2009, was enacted, which directed DHS to test two 
scenarios for an air exit solution.[Footnote 10] The legislative 
history also provided accompanying direction to DHS in carrying out 
the pilot tests of the air exit solution. The act also required DHS to 
submit a report on the pilot tests and required that we review this 
report. 

The act prohibited DHS from obligating any US-VISIT funds provided in 
the act for the implementation of an air exit solution until the 
department provided a report to the Senate and House Committees on 
Appropriations on pilot tests for the solution that addressed the two 
scenarios: 

* U.S. Customs and Border Protection (CBP) collects biometric exit 
data at airport departure gates; and; 

* airlines collect and transmit such data. 

The explanatory statement[Footnote 11] that accompanied the act, and 
the House Report[Footnote 12] incorporated by reference into the 
explanatory statement, provided further legislative direction for the 
conduct of the pilots. 

DHS issued its Air Exit Pilots Evaluation Plan in May 2009 and 
operated two air exit pilots from May 28, 2009, until July 2, 2009. 
DHS submitted its Air Exit Pilots Evaluation Report to the House and 
Senate Appropriations Subcommittees on Homeland Security on October 
26, 2009. According to the US-VISIT Acting Deputy Program Director and 
agency documentation, the pilot results are one of several sources of 
information that are to be used to inform its decision about a long-
term air exit capability. 

[End of section] 

Objectives: 

As agreed, our objectives were to determine the extent to which (1) 
the evaluation report addresses the statutory condition and 
legislative directions; (2) the evaluation report aligns with the 
scope and approach in the evaluation plan; (3) the pilots were 
conducted in accordance with the evaluation plan; and (4) the 
evaluation plan satisfies relevant guidance. 

To accomplish our objectives, we compared (1) the evaluation report to 
applicable statutory conditions and legislative directions specified 
in the DHS fiscal year 2009 appropriations act and the accompanying 
explanatory statement and House report; (2) the evaluation's reported 
objectives, scope, approach, and limitations with those found in the 
pilots' plan (including its evaluation framework[Footnote 13]); (3) 
planned evaluations and tests with pilot execution documentation, 
including business and system requirement test results; and (4) the 
evaluation plan to relevant guidance for evaluation planning that we 
had previously identified during reviews of federal pilot 
projects.[Footnote 14] Details of our scope and methodology are 
described in attachment 1. 

We conducted this performance audit at US-VISIT program offices in 
Arlington, Virginia, from November 2009 to June 2010 in accordance 
with generally accepted government auditing standards. Those standards 
require that we plan and perform the audit to obtain sufficient, 
appropriate evidence to provide a reasonable basis for our findings 
and conclusions based on our audit objectives. We believe that the 
evidence obtained provides a reasonable basis for our findings and 
conclusions based on our audit objectives. 

[End of section] 

Results in Brief: 

The two US-VISIT Air Exit Pilots that DHS planned, executed, and 
reported to the House and Senate Appropriations Committees were 
limited in the information that they contributed toward the 
department's understanding of an air exit solution's operational 
impacts. Specifically, 

* The pilots addressed one statutory requirement for a CBP scenario to 
collect information on exiting foreign nationals, and four of the 
legislative directions and expectations for conducting the pilots. 
However, DHS was unable to address the statutory requirement for an 
airline scenario because no airline was willing to participate. Also, 
the pilots did not meet a legislative expectation for gathering 
information on the security of information collected from visitors 
subject to US-VISIT during the pilots. 

* The objectives and operational conditions described in the 
evaluation report were generally consistent with the evaluation plan. 
However, the report did not fully align with the evaluation plan 
because certain metrics, observations, and costs (e.g., percentage of 
system downtime or inoperability, costs for requirements analysis) 
were not reported as planned. Also, the reported scope and approach of 
the pilots included limitations not defined in the plan, such as 
suspending exit screening at departure gates to avoid flight delays. 
Such divergence was due, in part, to a desire to minimize the pilots' 
impact on airports, airlines, and travelers. 

* The pilots were not conducted in accordance with the evaluation 
plan's stated purpose of operationally evaluating the air exit 
requirements. More specifically, about 30 percent of the requirements 
were not operationally tested, either as part of the pilots or as part 
of another exit project. Rather, they were tested, for example, prior 
to commencement of pilot operations or as part of another exit project 
that has yet to complete operational testing. DHS officials considered 
such testing of requirements to be sufficient. 

* The evaluation plan did not implement relevant pilot project 
guidance, such as defining standards for gauging the Air Exit Pilots' 
performance, defining a comprehensive methodology for selecting 
airports and flights, and planning data analysis to ensure that the 
results of the evaluation support air exit decision making. The Air 
Exit Pilots' evaluation plan diverged from such guidelines, in part, 
because DHS viewed the use of pilot results to be outside the scope of 
its report. 

Collectively, these limitations curtail the pilots' ability to inform 
a decision for a long-term air exit solution and point to the need for 
compensating sources of information on air exit's operational impacts. 

Accordingly, we are making a recommendation to the Secretary of 
Homeland Security aimed at identifying and leveraging other sources of 
information, such as comments from the Notice of Proposed Rule Making, 
to better inform a strategic air exit solution decision. 

In oral comments on a draft of this briefing, DHS officials agreed 
with our recommendation, but did not agree with our point that the 
evaluation report omitted a number of planned evaluation metrics and 
observations. In this regard, the officials cited information in the 
report and provided oral explanations for some, but not all, of these 
omissions to counter our position that the metrics and observations, 
as defined in the evaluation plan, were missing from the report. While 
we acknowledge that most of the citations and explanations provided 
information related to the missing metric or observation, in no 
instance was this information sufficient to satisfy the planned 
metric or observation. To clarify the basis for our finding about 
these results, we have added an example to the briefing that describes 
how the citations and explanations that were provided by DHS officials 
fall short of actually reporting results as planned. 

DHS officials also provided a range of other comments, including 
providing additional information about the testing of the air exit 
requirements that were applicable to the pilots and emphasizing that 
the scope of the pilots was intentionally limited in order to respond 
to the timeframes specified in legislative direction. We have 
incorporated these comments into the briefing, as appropriate. 

[End of section] 

Background: US-VISIT Purpose and Goals: 
		
The purpose of US-VISIT is to provide biometric (e.g., fingerprint) 
identification—through the collection, maintenance, and sharing of 
biometric and selected biographic data—to authorized DHS and other 
federal agencies. In this regard, US-VISIT supports a series of 
homeland security-related mission processes that cover hundreds of 
millions of foreign national travelers who enter and leave the United 
States at about 300 air, sea, and land POEs.[Footnote 15] An overview 
of these five processes is depicted in figure 1; the processes are 
described in attachment 2. 

Figure 1: Mission Processes Supported by US-VISIT: 

[Refer to PDF for image: illustration] 

Analysis; 
Pre-entry; 
Entry; 
Status; 
Exit. 

Sources: GAO analysis of US-VISIT data, Nova Development Corp. 
(images). 

[End of figure] 

The US-VISIT program's goals[Footnote 16] are to (1) enhance the 
security of U.S. citizens and visitors, (2) facilitate legitimate 
travel and trade, (3) ensure the integrity of the U.S. immigration 
system, and (4) protect the privacy of visitors. The program is to 
achieve these goals by: 

* collecting, maintaining, and sharing information on certain foreign 
nationals who enter and exit the United States; 

* identifying foreign nationals who (1) have overstayed or violated 
the terms of their visit; (2) can receive, extend, or adjust their 
immigration status; or (3) should be apprehended or detained by law 
enforcement officials; 

* detecting fraudulent travel documents, verifying visitor identity, 
and determining visitor admissibility through the use of biometrics 
(digital fingerprints and a digital photograph); and; 

* facilitating information sharing and coordination within the 
immigration and border management community. 

Background: Prior DHS Efforts to Evaluate Exit Solutions: 
		
Since 2004, DHS has evaluated options for recording the exit of 
travelers in the air, sea, and land environments by means of several 
initiatives. 

* January 2004 to May 2007. DHS operated biometric exit pilots at 14 
U.S. air and sea POEs to evaluate three technology solutions: self-
service kiosk, mobile device, and a combination of the two. The pilots 
established the technical feasibility of a biometric exit solution at 
air and sea POEs and identified issues that limited the operational 
effectiveness of the solution (e.g., low traveler compliance rates). 

* August 2005 to November 2006. DHS operated land entry/exit proof-of-
concept demonstrations at five ports of entry to examine the 
feasibility of using passive radio frequency identification (RFID) 
technology[Footnote 17] for recording travelers' entry and exit via 
RFID tags embedded in the Form I-94 and to provide CBP officers in 
pedestrian lanes with biographic, biometric, and watch list data. The 
demonstrations showed that RFID technology was too immature to meet 
the requirements of a land exit solution. 

Background: Overview of Air Exit Pilots: 	
		
The Air Exit Pilots are one component of a larger US-VISIT project 
known as Comprehensive Exit, which is to, in part, plan, develop, and 
deploy an air and sea exit capability.[Footnote 18] 

According to DHS, the purpose of the Air Exit Pilots was to evaluate 
the impact on airport exit operations of identifying, verifying, and 
collecting information from passengers who were subject to US-VISIT 
and leaving the United States. 

To accomplish this, the pilots were to: 

* evaluate identity verification and exit-recording capabilities when 
used with existing POE operations and infrastructure; 

* biometrically and biographically verify the identity of in-scope 
travelers departing the United States at the pilot locations; and 

* record the exit of, and update the IDENT and Arrival and Departure 
Information System (ADIS) records for, each subject traveler. 

DHS conducted two pilots from May 2009 until July 2009: 

* a CBP pilot at Detroit Metropolitan Wayne County Airport; and 

* a Transportation Security Administration (TSA) pilot at Hartsfield-
Jackson Atlanta International Airport. 

The pilots utilized two types of portable biometric collection 
devices, as described in table 1. For a detailed description of the 
pilots, see attachment 3. 

Table 1: Devices Used by Air Exit Pilots: 

Type of Device: Mobile; 
Description: Hand-held device that scanned information on travel 
documents and collected biometrics one fingerprint at a time; 
Pilot Location(s): Detroit, Atlanta[A]. 

Type of Device: Portable; 
Description: Small suitcase that contained a laptop computer, document 
scanning device, and a biometric scanner that collected a four-print
slap; 
Pilot Location(s): Detroit. 

Source: DHS. 

[A] According to a TSA operations official, only the mobile device was 
used in Atlanta because of the limited space available within the 
checkpoint area. 

[End of table] 

According to US-VISIT officials and the Air Exit Pilots documents, 
pilot results were to be one of several sources of information to 
inform rule making and decisions for a long-term air and sea exit 
capability. In this regard, the US-VISIT director also stated that the 
scope of the pilots was intentionally limited in order to respond to 
the timeframes specified in legislative direction. 

Background: GAO Reports on Prior US-VISIT Exit Efforts: 
		
Over the past several years, we have identified a range of broad 
management challenges and issues associated with DHS's prior efforts 
to develop and deploy an air exit solution. 

* In August 2007,[Footnote 19] we reported that US-VISIT had not 
developed a complete schedule for biometric exit implementation. 

* In February 2008,[Footnote 20] we reported that the Comprehensive 
Exit project had not been adequately defined, citing a lack of 
analytical basis for high-level project milestones. 

* In September 2008,[Footnote 21] we reported that DHS was unlikely to 
meet its timeline for implementing an air exit system with biometric 
indicators, such as fingerprints, by July 1, 2009, due to several 
unresolved issues, such as opposition to the department's published 
plan by the airline industry. 

* In December 2008,[Footnote 22] we reported that DHS still had not 
developed a schedule for the full implementation of a comprehensive 
exit solution. 

* Most recently, in November 2009,[Footnote 23] we reported that DHS 
had not developed a master schedule for Comprehensive Exit that was 
integrated or derived in accordance with relevant guidance. 

In each of these reports, we made recommendations to ensure that US- 
VISIT exit was planned, designed, developed, and implemented in an 
effective and efficient manner. DHS generally agreed with our 
recommendations. 

[End of section] 

Objective 1 - Results: 

Evaluation Report Satisfied Most, but Not All, Statutory Conditions 
and Legislative Directions and Expectations: 

The act required the department to provide a report to the Committees 
on Appropriations that addressed a test of two scenarios, in which: 
(1) CBP collects biometric exit data at airport departure gates; and 
(2) airlines collect and transmit such data. To DHS's credit, its 
evaluation report addresses the results of the first scenario. 
However, the report does not provide results for the second scenario. 
As the report states, and the US-VISIT Program Director and airline 
officials confirmed, no airlines agreed to participate in the pilots, 
thus precluding DHS from testing the second scenario. 

In lieu of this second scenario, DHS pilot tested a third scenario in 
which TSA collected biometric exit data at a security checkpoint. 
According to the pilots' evaluation report, this scenario was added 
because it had already been examined as an exit alternative in the 
Notice of Proposed Rule Making and because TSA was part of the 
traveler departure process. 

Notwithstanding the addition of this third scenario, because DHS was 
unable to test a scenario where airlines collect and transmit traveler 
biometric data, the department's understanding of the impact of this 
previously proposed air exit solution is limited. 

Satisfaction of Legislative Directions and Expectations: 
		
The explanatory statement and House report that accompanied the act 
provided six legislative directions and expectations for the conduct 
of the pilots. In summary, the evaluation report met four and 
partially met one of these directions and expectations, and did not 
meet the remaining one (see table 2). 

Table 2: Air Exit Pilots' Satisfaction of Legislative Directions and 
Expectations: 

Legislative Directions and Expectations: The pilots shall be completed 
not later than January 31, 2009; 
Not met. 

Legislative Directions and Expectations: The pilots should be 
conducted over a time period of not less than 30 days; 
Met. 

Legislative Directions and Expectations: The pilots are expected to 
gather: workload information; 
Met. 

Legislative Directions and Expectations: The pilots are expected to 
gather: cost data; 
Met. 

Legislative Directions and Expectations: The pilots are expected to 
gather: information on the impact on passenger processing time; 
Met. 

Legislative Directions and Expectations: The pilots are expected to 
gather: data related to the quality and security of traveler 
information collected; 
Partially met. 

Source: GAO analysis of DHS data. 

Notes: "Met" means that DHS fully satisfied the direction or 
expectation. "Partially met" means that DHS satisfied some, but not 
all, aspects of the direction or expectation. "Not met" means that DHS 
did not satisfy any aspect of the direction or expectation. Our 
assessment of the data gathered by the pilots was based on whether the 
evaluation report presented metrics or observations related to each 
information category. 

[End of table] 

More specifically, the pilots operated for a period of 36 days (longer 
than the legislatively directed minimum duration) and, while in 
operation, collected most of the expected types of data. The evaluation 
report presented: 

* workload information, such as average wait times for the pilots and 
total field collector and non-field support staff hours needed to 
operate the pilots; 

* cost data, such as pre-deployment costs, operational support costs, 
and CBP and TSA labor and expenses; 

* information on the impact on passenger processing time, such as 
comparisons of pilot to baseline wait and processing times for both 
CBP and TSA; and 

* data on the quality of traveler information collected, such as 
fingerprint quality scores. 

However, the pilots were completed on July 2, 2009, about 5 months 
after the deadline. The Air Exit Pilots project manager told us that 
the January 31, 2009, deadline, which allowed US-VISIT 4 months to 
complete the pilots,[Footnote 24] was not enough time for the pilots 
to be executed in accordance with the US-VISIT life cycle methodology. 

Further, while the House report expected the pilots to gather data 
related to the security of traveler information collected, the 
evaluation report only stated that all pilot-specific security 
requirements were fully met and did not present any data on the 
security of the information collected during pilot operations. 
According to the US-VISIT director, DHS was not required to fulfill 
the expectations of the House report. The US-VISIT Privacy Officer 
told us that security requirements were tested prior to the pilots and 
that there were no reported security incidents. However, we have yet 
to receive any documentation demonstrating the operational 
verification of security requirements. As a result, DHS's 
understanding of the effectiveness and impact of operational security 
controls on air exit processing is limited. 

[End of section] 

Objective 2 - Results: 

Pilot Evaluation Report Was Aligned with Key Aspects of the Evaluation 
Plan, but Important Differences Highlight Pilot Limitations: 

The evaluation plan defined the pilots' scope, approach, objectives, 
and conditions and defined an evaluation framework that included 
quantitative metrics, qualitative observations, and cost elements for 
which results were expected to be gathered during the pilots. 

To DHS's credit, the pilots' three objectives[Footnote 25] were 
consistently described in the plan and report. Further, the 
operational conditions (e.g., airport locations, passenger screening 
locations, biometric collection devices, and duration of the pilots) 
described in the plan and report were generally consistent. In 
addition, the majority of metrics, observations, and cost elements 
that the plan defined for data collection and reporting were addressed 
in the report. Specifically, 79 percent of the metrics, 79 percent of 
the observations, and 71 percent of the cost elements defined in the 
plan were represented in the evaluation report. 
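The coverage figures above reduce to simple ratio arithmetic. As an illustrative sketch only (the briefing does not state the underlying planned and reported counts, so the pairs below are hypothetical round numbers chosen to reproduce the reported percentages):

```python
# Hypothetical sketch: share of planned evaluation elements represented
# in the report. The counts are NOT from the GAO briefing; they are
# illustrative pairs consistent with the reported 79, 79, and 71 percent.
def coverage_pct(reported, planned):
    """Percent of planned elements represented in the report, rounded."""
    return round(100 * reported / planned)

assumed = {
    "metrics": (19, 24),        # hypothetical counts -> 79 percent
    "observations": (19, 24),   # hypothetical counts -> 79 percent
    "cost elements": (17, 24),  # hypothetical counts -> 71 percent
}
for name, (reported, planned) in assumed.items():
    print(name, coverage_pct(reported, planned))
```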

Evaluation Report Aligned with Some Aspects of Plans: 
		
Nevertheless, the planned metrics, observations, and cost elements 
that were omitted from the report were significant in that each 
offered potential insights into the operational impact of the air exit 
solution options. Examples of these missing evaluation results include: 

* percentage of system downtime or inoperability, 

* time needed to address device problems, 

* cost for requirements analysis, 

* cost for development of IDENT and ADIS reports, 

* time needed to instruct travelers, 

* effectiveness of airport signs, 

* depth and clarity of collector training sessions, and 

* percentage of collectors trained during the pilot. 

The report did not explain why these cost elements, metrics, and 
observations were not captured and reported as planned, other than to 
state that planned metrics were revised due to operational constraints 
or unavailable data. DHS officials attributed certain differences 
between planned and reported metrics and observations to errors in the 
evaluation framework. They also explained that certain cost elements 
were not reported as planned because they were too small to identify 
or applied to items that were not procured as planned. 

For some, but not all, of the missing metrics and observations, DHS 
officials provided citations in the report and oral explanations that 
they viewed as addressing the omissions. However, in each case the 
information provided did not satisfy the planned metric or 
observation. For example, regarding the metric "percentage of 
collectors trained during the pilots," a DHS official referred us to 
the reported results for a different metric in the plan entitled 
"percentage of collectors trained prior to the pilots" and stated 
that, because this latter metric was reported and because they knew 
that 100 percent of officers were trained prior to operating any pilot 
devices, the omitted metric could be derived. 

However, the oral information needed to derive the metric was not 
verifiable, and the derivation could not be made from the information 
in the report alone. Moreover, the oral 
explanation conflicted with a statement in the report that CBP and TSA 
each had an officer who performed biometric processing without 
completing a formal training class. 

These omissions limit the ability of the reported results of the 
pilots to fully inform DHS's understanding of the operational impacts 
and costs of implementing an air exit solution. 

Reported Limitations Were Not Specified in Evaluation Plans: 
		
In addition to specifying aspects of the pilots' scope and approach, 
the evaluation plan also identified a variety of associated 
limitations that were expected to affect the execution and results of 
the pilots, and these limitations were reiterated in the evaluation 
report. For example, both documents disclosed that the pilots were not 
intended to fully assess existing systems or biometric devices. 

However, the report also identified scope and approach limitations 
that were not specified in the plan. For example, TSA did not: 

* collect identification from all in-scope travelers ages 14 to 
18,[Footnote 26] 

* collect flight information from in-scope travelers; or 

* perform biometric collection during the main security checkpoint's 
peak period. 

Additionally, TSA and CBP suspended exit processing to address 
situations that could have negatively impacted travelers or flights. 
See attachment 4 for greater detail on these and other reported 
limitations. 

While the report appropriately disclosed these additional limitations, 
it did not address their operational impacts. Moreover, these 
limitations show that the pilots were even more constrained than 
planned. For example, the report 
did not: 

* describe the operational impacts or costs to TSA operations 
associated with a recognized need for automated collection of flight 
information from in-scope travelers at TSA security checkpoints; 

* discuss the implications of the project's decisions to abort 
biometric data collection when potential airline and passenger delays 
became apparent or its conclusion that the pilots had "no conclusive 
impact on flight delays, delay durations, boarding times or number of 
passengers who missed flights;" 

* explain how the implied deficiencies in IDENT and ADIS matching and 
overstay identification capabilities affected reported matching and 
overstay pilot results; and 

* explain how the reported percentages of biometrically processed 
travelers or total flow times would be affected if TSA and CBP had 
selected other screening periods, including periods during peak 
operations. 

According to the evaluation report, some of the additional limitations 
were the result of DHS's desire to minimize the impact of the pilots 
on airlines, airports, and travelers. The Air Exit Pilots project 
manager stated that the impact of these decisions on the evaluation 
results was not addressed in the report because the pilots were to 
only document discovered limitations, not to extrapolate data based on 
them. 

Collectively, the limitations cited in the plan and report restrict 
the pilots' ability to fully inform DHS's understanding of the 
operational impact of implementing an air exit solution. 

[End of section] 

Objective 3 - Results: 

Pilots Were Not Conducted in Accordance With Key Aspect of the 
Evaluation Plan: 

A key aspect of the pilots' scope, as defined in the evaluation plan, 
was that 7 metrics[Footnote 27] to be analyzed during the pilots were 
linked to 99 air exit business requirements[Footnote 28] (i.e., 
operational requirements). Of the 99 requirements, the project office 
designated 84 as being applicable to the pilots. 

DHS tested 54 of the 84 requirements in the pilots' operational 
settings as part of operational readiness testing (41 requirements) or 
in conjunction with the deployment of a US-VISIT exit-related 
reporting capability (13 requirements). However, 25 requirements that 
were applicable to the pilots were not tested in the operational 
setting associated with the pilots, as provided for in the plan (6 
security requirements and 19 requirements assigned to another exit 
project). For the remaining 5 requirements, testing either was not 
performed (1 requirement) or was reported as successful but has yet to 
be documented by DHS (4 requirements). 

Pilots Deviated From Key Aspect of Evaluation Plan: 
		
As a result, the impact of 26 requirements on pilot operations and 
pilot results was neither evaluated nor reported. The testing status 
of these 84 in-scope requirements is summarized in table 3 and 
described in greater detail following the table. 

Table 3: In-scope Requirements Testing Status: 

Designated test environment: Tested with pilots; 
Requirements testing status: 
Operationally tested: 41; 
Not operationally tested: 6; 
Not tested: 1; 
No documentation: 4. 

Designated test environment: Tested with other projects; 
Requirements testing status: 
Operationally tested: 13; 
Not operationally tested: 19; 
Not tested: 0; 
No documentation: 0. 

Designated test environment: Total requirements; 
Requirements testing status: 
Operationally tested: 54; 
Not operationally tested: 25; 
Not tested: 1; 
No documentation: 4. 

Source: GAO analysis of DHS data. 

[End of table] 

Six security requirements were not part of the pilots' final 
operational readiness test. Rather, these six were tested several 
weeks prior to the final operational readiness test as part of 
security testing, which was not performed in the pilots' operational 
environment. As a result, the legislative expectation to gather 
information on the security of traveler data, as discussed earlier, 
was not met. 

For 32 requirements, testing was conducted in conjunction with other 
exit projects related to exit record processing and reporting. 
According to air exit project officials, since the capabilities 
associated with these requirements were delivered by projects other 
than the pilots, they relied on the testing results from those 
projects as verification of the requirement. However, while one of 
these two projects was operationally tested, we have previously 
reported[Footnote 29] that the processing capability associated with 
the other project has yet to be deployed and will not be completely 
tested until data from US-VISIT's long-term air/sea exit solution are 
available. Further, program officials also previously told us that 
this processing capability was not used by the pilots because the 
required technology infrastructure was not in place at the pilot 
locations. As a result, 19 of the 32 pilot-related requirements that 
were tested as part of other projects were not operationally tested. 

Four additional requirements were reportedly tested, but we have yet 
to receive verifiable test results to confirm this. Further, the 
project office has acknowledged that one additional requirement was 
not part of any phase of the pilot testing process.[Footnote 30] 
		
Collectively, this means that about 30 percent of in-scope 
requirements were not operationally tested. The Air Exit Pilots 
project manager told us that given that the focus of the pilots' 
operational evaluation was the impact of air exit on agency operations 
and traveler processing, the testing that was performed to demonstrate 
satisfaction of the 7 metrics and applicable requirements was 
considered sufficient. Nevertheless, the evaluation report states that 
100 percent of the operational requirements that were relevant to the 
pilots were met. It does not disclose the number of requirements that 
were not tested in the pilots' operational setting, and it does not 
cite the associated limitations of not doing so. 
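The "about 30 percent" figure follows directly from table 3; a minimal sketch of that tally (the counts are taken from the table, and only the arithmetic is ours):

```python
# Tallying the testing status of the 84 in-scope requirements from
# table 3. Each status maps to (tested with pilots, tested with other
# projects); the "not operationally tested" share is about 30 percent.
status_counts = {
    "operationally tested": (41, 13),
    "not operationally tested": (6, 19),
    "not tested": (1, 0),
    "no documentation": (4, 0),
}
totals = {k: sum(v) for k, v in status_counts.items()}
in_scope = sum(totals.values())                       # 84 requirements
pct = round(100 * totals["not operationally tested"] / in_scope)
print(in_scope, totals["not operationally tested"], pct)  # 84 25 30
```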

In light of these requirements that were not operationally tested, the 
extent to which the pilots provide a full understanding of DHS's air 
exit operational impact is diminished. 

[End of section] 

Objective 4 - Results: 

Evaluation Plan Did Not Reflect Key Aspects of Relevant Guidance: 

As we have previously reported,[Footnote 31] a key to effectively 
conducting pilot projects is having a well-defined evaluation plan. 
Among other things, such a plan should: 

* define performance standards, 

* describe a comprehensive methodology for conducting the pilot, and, 

* specify required data analysis. 

The Air Exit Pilots' evaluation plan, which was intended to direct the 
evaluation of all aspects of the pilots, did not satisfy these key 
aspects of relevant guidance. Supporting project documents also did 
not fully address these key aspects. Specifically, they did not define 
standards against which pilot performance could be assessed; describe 
the basis for selecting airports and flights; or specify the analysis 
needed to determine pilot effectiveness and inform decision making. 
The air exit project manager stated that, in general, DHS used the air 
exit Notice of Proposed Rule Making,[Footnote 32] congressional 
direction, and US-VISIT's project life cycle methodology for guidance 
in planning the evaluation. 
		
Performance standards. Although the air exit requirements discussed 
earlier included performance requirements (such as the requirement to 
transmit traveler data within 24 hours), these requirements were not 
specified in the evaluation plan as standards against which to gauge 
pilot performance. Moreover, certain planned metrics that could have 
provided performance standards by measuring baseline operational data 
(e.g., pre-pilot average boarding time) were not identified as bases 
for determining whether the pilots met operational needs. The 
evaluation report did cite one performance standard that was not met 
(TSA's service goal to check documents within 10 seconds), but this 
standard was not defined as an air exit pilots requirement. 

DHS officials said that they did not include performance standards in 
their evaluation planning because they expected to use the pilot 
results to set new performance standards. However, they acknowledged 
that the report should have discussed how well the pilot met existing 
performance requirements, such as the 24-hour data transmission 
requirement. 

By not clearly defining performance standards in the pilots' 
evaluation plan, the pilots were limited in the extent to which they 
could definitively determine the operational impacts and results of 
each air exit scenario. 

Comprehensive methodology. The evaluation plan and supporting 
documents did not explain key aspects of the methodology for 
conducting the pilots. Specifically, the methodology for selecting 
pilot airports and flights from their respective populations was not 
adequately defined. 

According to a DHS official, TSA and CBP were each allowed to select a 
pilot airport from the 12 airports listed in the Notice of Proposed 
Rule Making.[Footnote 33] To select the specific airport, CBP and TSA 
considered such factors as airport size and flight destinations (a mix 
of international and domestic flights). However, the initial 
constraint of 12 airports was not documented in the evaluation plan, 
and neither agency fully documented the selection factors or criteria 
to be used in making their eventual airport choice, as described below. 

* A TSA official told us that TSA sought to pilot an airport from the 
Notice of Proposed Rule Making list with a medium-size checkpoint (5-
10 lanes), a strong mix of domestic and international air carriers 
(90:10 ratio), and at least one U.S. and one international carrier. 
However, TSA then used the number of airport security checkpoints as a 
basis for changing its airport selection from Chicago to Atlanta. 
[Footnote 34] Moreover, this basis for selection is contradicted by 
information in the evaluation report, which stated that Atlanta had 
three checkpoints, and that 32 percent of travelers originating in 
Atlanta went through alternate checkpoints and thus were not processed 
by the pilot. 

* CBP documented two factors as the basis for airport selection, namely 
carrier diversity and inbound-outbound scheduling flexibility. 
However, the agency did not document how it would apply these factors 
for each airport or how airports would be selected or eliminated based 
on these factors. 

In addition, CBP did not fully document the selection factors or 
criteria to be used in choosing flights for air exit screening. 

* CBP documentation specified the air carriers that would be subject 
to the pilot, but did not explain why these carriers were chosen. 
Regarding flight selection, the evaluation plan stated an assumption 
that CBP would use a "risk selection factor" derived from a flight's 
number of visa waiver participants and other criteria. However, the 
plan did not specify the other criteria or how these criteria would 
contribute to making a flight selection decision. For example, while 
the evaluation report stated that CBP chose 91 flights based on their 
volume of travelers subject to US-VISIT, neither the evaluation plan 
nor CBP's documentation specified the volume threshold that was used 
to trigger flight selection, and neither specified whether other 
criteria, such as destination,[Footnote 35] were relevant to selection. 
Also, while the evaluation report stated that the 91 flights 
represented 14.6 percent of all the international flights departing 
from the Detroit airport during pilot hours, neither the plan nor 
related pilot documents explained why this sample size was sufficient 
for understanding air exit's operational impacts on flights. 
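The reported sample share implies a rough size for the underlying flight population; a back-of-envelope sketch (our arithmetic, not a figure stated in the report):

```python
# If 91 selected flights were 14.6 percent of the international flights
# departing Detroit during pilot hours, the implied population is
# roughly 91 / 0.146, i.e. about 620 flights. Illustrative only.
selected = 91
share = 0.146
implied_population = round(selected / share)
print(implied_population)  # 623
```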

Without a comprehensive methodology that includes explicit criteria 
for selecting the pilots' airports and flights, DHS lacks sufficient 
assurance that the scope of its pilots provided a meaningful 
understanding of air exit operational impacts. 

Data analysis. The evaluation plan did not specify how data would be 
analyzed to determine pilot effectiveness or how the results would 
inform decision making. 

Although the evaluation plan stated that both the plan and the 
evaluation report would describe how pilot results would inform air 
exit decision making, neither addressed how to analyze the collected 
data to produce inputs for such decision making. The evaluation report 
concluded that the pilot data provided insight into traveler impacts, 
biometric capture procedures, traveler compliance, and staffing needs, 
and would support further economic analysis for an air exit solution 
decision, but did not identify the framework elements or analysis of 
pilot results needed to support the economic analysis. The report also 
stated that the results of the pilots would be combined with a review 
of public comments on the proposed air exit rule to inform the 
solution decision, but did not identify which pilot results were 
needed for this effort, or the analysis required to compensate for the 
known limitations of the pilots. Figure 2, from the pilots' evaluation 
report, illustrates DHS's view of the pilots in relation to follow-on 
air exit activities. 

Figure 2: DHS's Air Exit Next Steps Roadmap: 

[Refer to PDF for image: illustration] 

2008 Regulatory Impact Analysis (RIA): 

Notice of Proposed Rulemaking (NPRM): 

Next steps: 

D-102: Economic Analysis: 

Final Rule: 
(1) Confirm NPRM; or: 
(2) Revise NPRM; or: 
(3) Other options. 

Implementation Approach Strategy: 
* Technology; 
* Full Process (e.g., outreach, training, facilities). 

Source: DHS. 

[End of figure] 

A DHS official told us that specifying how the pilots' results would 
be used was beyond the scope of the pilots and declined to describe 
the relationship of the pilot evaluation to air exit decision making. 

By not specifying the data analysis required to clearly link the 
evaluation results to their intended use, DHS was limited in its 
ability to measure the pilots' effectiveness, and thus ensure that 
they provided the requisite basis for informing a final air exit 
solution decision. 

[End of section] 

Conclusions: 

DHS has long been challenged in its ability to deliver the exit 
portion of US-VISIT and thereby establish a biometrically based 
capability for knowing the status of foreign nationals who have entered the 
country. To help address these challenges, Congress directed DHS to 
conduct two pilot tests so that the department might gain a better 
understanding of the operational impact of implementing different exit 
solutions at air ports of entry. However, the degree to which the 
results of these pilots can inform DHS's future decisions was limited 
because the department was unable to test one scenario and did not 
meet a key congressional expectation. Further, it was limited in the 
extent to which it followed defined pilot plans and reported all 
expected results in the evaluation report. Moreover, the scope and 
approach defined in the plans that governed the pilots' execution were 
also limited by conditions disclosed in the plan and the report, as 
well as by the extent and timing of requirements testing. DHS 
officials attributed key limitations to schedule constraints and 
decisions to intentionally limit the pilots' scope and impacts on 
travelers, air carriers, and airports. However, the collective result 
is that the pilots cannot alone adequately inform future DHS decisions 
on an exit solution for air ports of entry. If these limitations in 
the pilots are not otherwise compensated for with other information 
sources on the operational impacts of implementing an air exit solution, 
such as comments on the Notice of Proposed Rule Making, then the 
department will continue to be challenged in its ability to deliver US-
VISIT exit capabilities in airports. 

[End of section] 

Recommendation for Executive Action: 

To the extent that the limitations in the Air Exit Pilots are not 
addressed through other information sources, we recommend that the 
Secretary of Homeland Security direct the Under Secretary for National 
Protection and Programs to have the US-VISIT Program Director identify 
additional sources for the operational impacts of air exit not 
addressed in the pilots' evaluation and to incorporate these sources 
into its air exit decision making and planning. 

[End of section] 

Agency Comments and Our Evaluation: 

We provided a draft of this briefing to DHS officials, including the 
US-VISIT director and the Air Exit Pilots project manager, for review 
and comment. In their oral comments, the officials agreed with our 
recommendation. However, they took issue with our finding that the 
reported pilot results omitted a number of planned metrics and 
observations. Specifically, they cited information in the report and 
provided explanations that they said addressed the metrics and 
observations in question. We reviewed each of these citations and 
explanations and acknowledge that while most of this information is 
related to the omitted metrics or observations, it did not supply the 
missing evaluation results as defined in the plan. To clarify the 
basis for our finding about these results, we have added an example to 
the briefing that describes how the information that DHS provided 
falls short of actually reporting all results as planned. 

DHS officials also provided a range of other comments, including additional information about the testing of the air exit requirements applicable to the pilots and emphasis that the scope of the pilots was intentionally limited in order to be responsive to the time frames specified in legislative direction. We have incorporated these comments into the briefing, as appropriate. 

[End of section] 

Attachment 1: Objectives, Scope, and Methodology: 
		
Our objectives were to determine the extent to which (1) the 
evaluation report addresses the statutory condition and legislative 
directions, (2) the evaluation report aligns with the scope and 
approach in the evaluation plan, (3) the pilots were conducted in 
accordance with the evaluation plan, and (4) the evaluation plan 
satisfies relevant guidance. 

We focused our review on the Air Exit Pilots Evaluation Plan, 
published by the United States Visitor and Immigrant Status Indicator 
Technology (US-VISIT) program on May 14, 2009, and the US-VISIT Air 
Exit Pilots Evaluation Report, submitted to Congress on October 26, 
2009. We supplemented these documents with other pilot project plans 
and records provided by the Department of Homeland Security (DHS), 
including documentation and interviews we obtained from our prior 
review of the US-VISIT Comprehensive Exit project.[Footnote 36] 
	
To accomplish the first objective, we compared the information 
provided in the evaluation report with the applicable statutory 
condition in the Department of Homeland Security Appropriations Act, 
2009,[Footnote 37] and the legislative directions and expectations 
specified in the explanatory statement[Footnote 38] that accompanied 
the act and the House report[Footnote 39] incorporated by reference 
into the explanatory statement, and determined the extent to which the 
report addressed all aspects of each applicable condition, direction, 
or expectation. We then characterized each condition, direction, and 
expectation as satisfied, partially satisfied, or not satisfied 
[Footnote 40] and interviewed DHS officials about their reasons for 
not fully satisfying the condition, direction, or expectation. 

To accomplish the second objective, we compared the pilots' reported 
objectives, scope and limitations, and the evaluation approach 
described in the pilots' evaluation framework[Footnote 41] with the 
equivalent components defined in the evaluation plan to identify any 
differences. We also compared reported pilot data with the evaluation 
framework components specified in the plan to determine whether all 
planned results were presented. We then interviewed program officials to determine the reasons for any variations, and categorized the differences as either reporting omissions or limitations not specified in the plan. With the assistance of US-VISIT officials and contractors, we also performed a walk-through of the files used to compile and aggregate pilot results in order to understand how the reported results were derived from the raw data collected for the pilots and to confirm that the reported results corresponded to the aggregate data.[Footnote 42] 

To accomplish our third objective, we identified aspects of pilot 
execution that were not otherwise reviewed for the second objective. 
Based on this determination, we identified the Air Exit Pilots' 
requirements testing and execution as the focus of this objective. We 
compared the pilots' evaluation report and supporting project test 
reports with supporting project execution plans—such as pilot business 
and system requirements—to determine the extent to which business and 
system requirements were incorporated into the pilot as planned. We 
also determined whether testing of those requirements was performed as 
planned, including the time frames for testing and the extent to which 
testing was successfully completed.[Footnote 43] Based on this 
analysis, we categorized the discrepancies we identified according to 
whether they related to the pre-operational testing or operational 
verification of pilot system capabilities. We then interviewed DHS 
officials to understand the projects' approach to pilot requirements 
verification, and to clarify and correct the discrepancies, where 
appropriate. 

To accomplish our fourth objective, we identified key evaluation plan 
components based on our previous reviews of federal pilot projects. 
[Footnote 44] We then analyzed the contents of the Air Exit Pilots' 
evaluation plan with respect to the key components in order to 
determine the extent to which the plan addressed the components. For 
components not fully addressed in the plan, we reviewed the evaluation 
report and the pilots' project documents—such as the project's 
management plan and tailoring plan—to determine the extent to which 
these components were addressed outside the plan. We also interviewed 
DHS officials about the guidance they used to develop the pilots' evaluation plan and the reasons for the weaknesses we identified. 

	
For each of our objectives, we assessed the reliability of the data we 
analyzed by reviewing existing documentation related to the data 
sources and interviewing knowledgeable agency officials about the data 
that we used. We found the data sufficiently reliable for the purposes 
of this review. 

We conducted this performance audit at the US-VISIT program offices in 
Arlington, Virginia, from November 2009 to June 2010 in accordance 
with generally accepted government auditing standards. Those standards 
require that we plan and perform the audit to obtain sufficient, 
appropriate evidence to provide a reasonable basis for our findings 
and conclusions based on our audit objectives. We believe that the 
evidence obtained provides a reasonable basis for our findings and 
conclusions based on our audit objectives. 

[End of section] 

Attachment 2: Detailed US-VISIT Processes and Systems: 
		
The United States Visitor and Immigrant Status Indicator Technology 
(US-VISIT) program provides biometric (e.g., fingerprint) 
identification—through the collection, maintenance, and sharing of 
biometric and selected biographic data—to, among others, authorized Department of Homeland Security (DHS) components and other federal agencies, 
as U.S. Customs and Border Protection, U.S. Citizenship and 
Immigration Services, U.S. Coast Guard, Department of Defense, 
Department of State, Department of Justice, Transportation Security 
Administration, and the intelligence community. In fulfilling its 
mission, US-VISIT supports a series of homeland security-related 
mission processes that cover hundreds of millions of foreign national 
travelers who enter and leave the United States. An overview of these 
five processes is depicted in figure 3 and described following the 
figure. 

Figure 3: Mission Processes Supported by US-VISIT: 

[Refer to PDF for image: illustration] 

Analysis; 
Pre-entry; 
Entry; 
Status; 
Exit. 

Sources: GAO analysis of US-VISIT data, Nova Development Corp. 
(images). 

[End of figure] 

* Pre-entry: the process of evaluating a traveler's eligibility for 
required travel documents, enrolling travelers in automated inspection 
programs, and prescreening travelers entering the United States. 

* Entry: the process of determining a traveler's admissibility into 
the United States at air, sea, or land ports of entry. 

* Status management: the process of managing and monitoring the 
changes and extensions of the visits of lawfully admitted nonimmigrant 
foreign nationals to ensure that they adhere to the terms of their 
admission and that they notify appropriate government entities when 
they do not. 

* Exit: the process of collecting information on travelers departing 
the United States. 

* Analysis: the process of continuously screening individuals enrolled in US-VISIT against watch lists for appropriate reporting and action, and of matching information on arrival, departure, and change or adjustment of status to identify individuals who have overstayed the terms of their admission.[Footnote 45] 

To support these processes, data must be exchanged among a variety of 
systems owned by several agencies. Two key US-VISIT systems are: 

* The Automated Biometric Identification System (IDENT), which 
collects and stores biometric data about foreign visitors, including 
information from the Federal Bureau of Investigation, U.S. Immigration 
and Customs Enforcement information on deported felons and sexual 
offender registrants, and DHS information on previous criminal 
histories and previous IDENT enrollments. 

* The Arrival and Departure Information System, which stores 
noncitizen traveler arrival and departure biographic data received 
from air and sea carrier manifests. It matches entry, immigration 
status updates, and departure data to provide immigration status, 
including whether the individual has overstayed his or her authorized 
period of stay. This system contributes information used to support 
the analysis mission process described above. 

[End of section] 

Attachment 3: Detailed Description of Air Exit Pilots: 
		
As we have previously reported,[Footnote 46] the Air Exit Pilots are 
one component of a larger United States Visitor and Immigrant Status 
Indicator Technology (US-VISIT) project known as Comprehensive Exit, 
which is to plan, develop, and deploy an air, sea, and land exit 
capability.[Footnote 47] 

The purpose of the Air Exit Pilots was to evaluate the impact on 
airport exit operations of identifying, verifying, and collecting 
information from passengers who were subject to US-VISIT and leaving 
the United States. To accomplish this, the pilots were to: 

* evaluate identity verification and exit-recording capabilities when 
used with existing port operations and infrastructure; 

* biometrically and biographically verify the identity of travelers 
subject to US-VISIT departing the United States at the pilot 
locations; and 

* record the exit of, and update the Automated Biometric 
Identification System (IDENT) and Arrival and Departure Information 
System (ADIS) records of, each subject traveler. 

DHS conducted two pilot scenarios from May to July 2009: 

* a U.S. Customs and Border Protection (CBP) pilot at Detroit 
Metropolitan Wayne County Airport; and 

* a Transportation Security Administration (TSA) pilot at Hartsfield-
Jackson Atlanta International Airport. 

The Air Exit Pilots used two types of portable biometric collection 
devices: 

* a hand-held device ("mobile device") that scanned information on 
travel documents and collected biometrics one fingerprint at a time; and 

* a small suitcase ("portable device") that contained a laptop computer, a document scanning device, and a biometric scanner that collected a four-print slap. 

The CBP pilot in Detroit used both devices. According to a TSA 
operations official, only the mobile device was used in Atlanta 
because of the limited space available within the checkpoint area.
		
The pilot process consisted of four phases: 

1. Identification. For the CBP pilot, CBP officers prescreened passengers after they provided their boarding passes to airline employees, identifying passengers who were subject to US-VISIT and directing them to a CBP processing station in the jetway. For the TSA pilot, a TSA Travel Document Checker prescreened every passenger entering the checkpoint to identify subject passengers, who were then escorted to a processing station manned by Transportation Security Officers equipped with mobile devices. 

2. Collection. Both CBP and TSA officers scanned a machine-readable 
travel document presented by a passenger to collect biographic data. 
If the document did not scan correctly, the officers were instructed 
to enter the biographic data manually into the device. The officers 
then used the mobile or portable device to collect index and middle fingerprints or a four-print image, respectively. 

3. Processing. Once the device indicated that the collected prints were of sufficient quality, the CBP and TSA officers directed the passenger to continue on to the departing aircraft or through the normal checkpoint security screening.
		
4. Transmission. US-VISIT staff uploaded the information from the 
devices to a dedicated workstation and transmitted the data to IDENT 
via a secure network connection. Once transmitted, the data were 
matched to existing records. 

Figure 4 depicts the relationships of the equipment and systems used 
in phases 2, 3, and 4 of the pilot process. 

Figure 4: Illustration of Air Exit Pilots Biometric Data Collection 
and Transmission Process: 

[Refer to PDF for image: illustration] 

Mobile and portable devices: 
Detroit air exit pilot (CBP officer): 
US-VISIT staff; Dedicated workstation. 

Mobile device: 
Atlanta air exit pilot (Transportation Security Officer); 
US-VISIT staff; Dedicated workstation. 

Secure computer connection: 
IDENT. 

Source: GAO analysis of agency data. 

[End of figure] 

CBP Pilot Operations. As reported by DHS, CBP pilot operations were 
conducted at departure gates of selected international flights and 
usually occurred in the jetway between the air carrier boarding pass 
collector and the aircraft itself. The CBP pilot also tested several 
biometric collection configurations in the terminal itself, directly 
outside the jetway. CBP pilot operations generally consisted of four 
steps. 

* CBP officers, who were designated as "sorters," inspected travel 
documents. 

* Sorters directed travelers not subject to US-VISIT to bypass the 
biometric collection area and to board the aircraft. 

* If travelers were identified as subject to US-VISIT, the sorters directed them to one of four or five CBP officers designated as "collectors." 

* Collectors gathered biographic and biometric information and then 
directed the travelers to board the aircraft. 

In cases where less physical space was available, CBP used a different 
configuration where the sorters were located just inside the doorway 
of the boarding area. The CBP data collectors then positioned 
themselves behind the sorters along the far wall of the boarding area. 
Once travelers were processed, they were directed to the jetway. 

TSA Pilot Operations. TSA pilot operations were conducted at the 
Atlanta airport's main security checkpoint. TSA operations generally 
consisted of five steps. 

* TSA Travel Document Checkers reviewed travel documents and 
interviewed travelers about their final destination. 

* The Travel Document Checkers directed travelers not subject to US-
VISIT to proceed to security screening. 

* If travelers were identified as subject to US-VISIT, the Travel 
Document Checkers called for a Control Transportation Security Officer 
escort. 

* The Control Transportation Security Officers then escorted these 
travelers to one of three biometric collection areas where Biometric 
Collection Transportation Security Officers collected biographic and 
biometric information. 

* Once traveler biometric and biographic data collection was complete, 
travelers were directed to the metal detector queues where they 
completed the security screening process. 

DHS reported several constraints on TSA's traveler processing for the 
pilot. 

* Air exit pilot operations were not conducted at the other two 
security checkpoints in the Atlanta airport. 

* TSA did not process travelers who flew into the Atlanta airport and 
then departed on an international flight without leaving the airport's 
sterile area. 

* The TSA pilot did not process all travelers ages 14 to 18. Although 
travelers in this age group may be subject to US-VISIT, TSA policy 
does not require travelers under the age of 18 to present photo 
identification.[Footnote 48] 

In its Air Exit Pilots Evaluation Report, DHS presented a variety of 
information that characterized the CBP and TSA pilots. Elements of that information that help convey the scope and context of the pilots are presented in table 4. 

Table 4: DHS-reported Information on the Air Exit Pilots: 

Pilot Characteristic: Airport; 
CBP: Detroit Metropolitan Wayne County Airport; 
TSA: Hartsfield-Jackson Atlanta International Airport. 

Pilot Characteristic: Physical location; 
CBP: Departure gates for selected flights at McNamara and North 
Terminals; 
TSA: Main TSA security checkpoint. 
		
Pilot Characteristic: Operational time frame; 
CBP: 05/28/2009 — 07/02/2009; 
TSA: 05/28/2009 — 07/02/2009. 

Pilot Characteristic: Technologies used; 
CBP: Mobile and portable collection devices; 
TSA: Mobile collection device. 
			
Pilot Characteristic: Flights inspected; 
CBP: 2-4 international flights per day (excluding flights to Canada 
and Mexico, pre-cleared and chartered flights); 
TSA: Unreported, as TSA did not record individuals' flight departure 
information. 

Pilot Characteristic: Number of affected flights; 
CBP: 91; 
TSA: Unreported, as TSA did not record individuals' flight departure 
information. 

Pilot Characteristic: Number of passengers checked for biometric collection eligibility; 
CBP: 27,111; 
TSA: 476,168. 

Pilot Characteristic: Number of passengers processed by pilots; 
CBP: 9,448; 
TSA: 20,296. 

Pilot Characteristic: Number of passengers that refused to provide 
biometric data; 
CBP: 0; 
TSA: 1. 

Pilot Characteristic: Average impact on boarding flow time (CBP) or 
security check flow time (TSA), per passenger; 
CBP: None; 
TSA: 2 min. 8 sec. for travelers subject to US-VISIT; 17 sec. for 
travelers not subject to US-VISIT. 

Pilot Characteristic: Biographic and biometric data collection times; 
CBP: 49 sec. for mobile device; 30 sec. for portable device; 
TSA: 68 sec. for mobile device (portable device not used). 

Pilot Characteristic: Labor hours over 35-day operations; 
CBP: 1,292 hours; 
TSA: 6,423 hours. 

Pilot Characteristic: Labor costs and expenses over 35-day operations; 
CBP: $77,501; 
TSA: $393,410. 

Pilot Characteristic: Number of watchlist hits[A]; 
CBP: 44; 
TSA: 131. 

Pilot Characteristic: Number of suspected overstays[B]; 
CBP: 60; 
TSA: 90. 

Source: DHS. 

[A] DHS reported that CBP reviewed each of these watchlist hits and immediately demoted 145 of them. Further review by CBP concluded that none of the 175 hits would have resulted in prevention of departure. 

[B] DHS reported that these system-generated results are overstated because of system limitations in instances where ADIS records did not reflect up-to-date traveler status after recent changes or extensions of status. Further, the report noted that the biometric and biographic data reconciliation between IDENT and ADIS needed to be enhanced to improve record matching. 

[End of table] 

The evaluation plan specified a variety of limitations that were expected to affect the execution of the pilots. For example, the pilots were not intended to fully assess existing systems or biometric devices, would minimize interference with air carrier boarding processes, and would rely on subject matter experts for cost data not available during the pilots. These limitations were generally reiterated in the evaluation report. However, the report also identified other limitations not called out in the plan, as identified in table 5. 

Table 5: Selected Limitations in Pilot Data Collection: 

Limitation Area: Data from travelers subject to United States Visitor 
and Immigrant Status Indicator Technology (US-VISIT); 
Evaluation Report Examples: Transportation Security Administration 
(TSA) did not collect identification from all travelers subject to US-
VISIT ages 14 to 18;[A] 
Impact on Pilot Results: Metrics and observations did not reflect all 
travelers subject to US-VISIT. 

Limitation Area: Data from travelers subject to United States Visitor 
and Immigrant Status Indicator Technology (US-VISIT); 
Evaluation Report Examples: TSA did not collect flight information 
from travelers subject to US-VISIT; 
Impact on Pilot Results: Required data on travelers subject to 
US-VISIT was not collected. Report stated that automation would be 
required to collect flight information, but the operational impact of 
the automation was not described. 

Limitation Area: Data from travelers subject to United States Visitor 
and Immigrant Status Indicator Technology (US-VISIT); 
Evaluation Report Examples: U.S. Customs and Border Protection (CBP) 
did not regularly collect identification from airline crew members 
subject to US-VISIT who
boarded their plane prior to the start of CBP exit processing at the 
departure gates; 
Impact on Pilot Results: Metrics and observations did not reflect all 
airline crew members subject to US-VISIT. 

Limitation Area: Impact on travelers and flights; 
Evaluation Report Examples: TSA suspended exit processing to address 
queues at other TSA posts unrelated to the pilot. CBP suspended 
processing and data collection from boarding passengers and crew when 
such collection would have delayed flight departures; 
Impact on Pilot Results: Report stated that the pilots were designed 
to avoid impact on air carrier and airport operations and had no 
conclusive impact on flight delays, delay durations, boarding times, 
or number of passengers who missed flights. 

Limitation Area: Facility and infrastructure needs; 
Evaluation Report Examples: No data were reported on the costs of 
airport electricity, device storage, or network circuits for the 
pilots; 
Impact on Pilot Results: Upgrade and recurring costs of exit 
processing for airports and Department of Homeland Security (DHS) 
telecommunications were not described. 

Limitation Area: Acquisition and development; 
Evaluation Report Examples: The costs for design, development, and 
testing were not individually measured, but were derived by evenly 
dividing a single reported value three ways; 
Impact on Pilot Results: Accurate costs for product design, 
development, and test planning and execution activities were not 
reported. 

Limitation Area: Acquisition and development; 
Evaluation Report Examples: Arrival and Departure Information System 
and Automated Biometric Identification System need to be enhanced to 
match biometric and biographic data; 
Impact on Pilot Results: Shortcomings of biometric and biographic 
matching were not described. Needed improvements and their operational 
impacts were not described. 

Limitation Area: Acquisition and development; 
Evaluation Report Examples: Overstay results did not reflect recent 
changes to or extensions of traveler status; 
Impact on Pilot Results: Report stated that more analysis would be 
required to confirm the system-generated overstay results, but did not 
describe the analysis or its results. 

Limitation Area: Acquisition and development; 
Evaluation Report Examples: Mobile device did not report finger scan 
quality score to collectors; 
Impact on Pilot Results: Pilot could not assess how operational 
conditions affected fingerprint quality. 

Source: GAO analysis of DHS data. 

[A] The evaluation report stated that visibility into the US-VISIT 
traveler status of travelers ages 14 to 18 was limited because of 
conflicting TSA and US-VISIT policies. According to a TSA official, 
data were collected on these travelers only when they were with in-
scope adults. 

[End of table] 

[End of Briefing Slides] 

Appendix II: Comments from the Department of Homeland Security: 

U.S. Department of Homeland Security: 
Washington, DC 20528: 

July 27, 2010: 

Mr. Randolph C. Hite: 
Director, Information Technology Architecture and System Issues: 
441 G Street, NW: 
U.S. Government Accountability Office: 
Washington, DC 20548: 

Dear Mr. Hite: 

Re: Draft Report GAO-10-860, Homeland Security: US-VISIT Pilot 
Evaluations Offer Limited Understanding of Air Exit Options 
(Engagement 310688): 

The Department of Homeland Security (DHS/Department) appreciates the 
opportunity to review and comment on the U.S. Government 
Accountability Office's (GAO) draft report referenced above. GAO made 
one recommendation regarding the evaluation report of the air exit 
pilots that the United States Visitor and Immigrant Status Indicator 
Technology (US-VISIT) Program submitted to Congress in October 2009. 

That recommendation reads as follows: 

To the extent that the limitations in the Air Exit Pilots are not 
addressed through other information sources, we recommend that the 
Secretary of Homeland Security direct the Under Secretary for National 
Protection and Programs to have the US-VISIT Program Director identify 
additional sources for the operational impacts of air exit not 
addressed in the pilots' evaluation and to incorporate these sources 
into its air exit decision making and planning. 

DHS readily concurs with GAO's recommendation. The pilots that US-
VISIT conducted from May to July 2009 were never intended to be the 
sole source of information for the Department to consider in making a 
decision on a final air exit solution, but rather are only one source 
of information that DHS has taken into account. 

While we separately are providing a comment matrix that addresses 
several technical points and suggested corrections that the Department 
wishes to share with GAO, one minor clarification is worth 
highlighting here. The report states that "DHS officials told us that 
DHS did not view the expectation in the House report as a 
requirement." DHS certainly takes seriously, and endeavors to comply 
with, the guidance provided by Congress in committee reports; however, 
the point of the Department's statements on this matter was merely 
that congressional reports do not, technically speaking, have the 
force of law. See, e.g., Hein v. Freedom From Religion Foundation, 
Inc., 551 U.S. 587, 608 n.7 (2007) ("Indicia in committee reports and 
other legislative history as to how the funds should or are expected 
to be spent do not establish any legal requirements on the agency"). 

Thank you for the opportunity to comment on this Draft Report and we 
look forward to working with you on future homeland security issues. 

Sincerely, 

Signed by: 

Jerald E. Levine: 
Director: 
Departmental GAO/OIG Liaison Office: 

[End of section] 

Appendix III: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Randolph C. Hite, (202) 512-3439 or hiter@gao.gov: 

Staff Acknowledgments: 

In addition to the contact name above, individuals making 
contributions to this report included Paula Moore (Assistant 
Director), Neil Doherty, Rebecca Eyler, Claudia Fletcher, Dave 
Hinchman, and Daniel Swartz. 

[End of section] 

Footnotes: 

[1] 73 Fed. Reg. 22065 (Apr. 24, 2008). 

[2] Pub. L. No. 110-329, 122 Stat. 3574, 3668-70 (Sept. 30, 2008). 

[3] See Explanatory Statement, 154 Cong. Rec. H9427, H9802 (daily ed. 
Sept. 24, 2008) and the Consolidated Security, Disaster Assistance, 
and Continuing Appropriations Act, 2009, Pub. L. No. 110-329, Div. D, 
Department of Homeland Security Appropriations Act, 2009 (Sept. 30, 
2008). Section 4 of Pub. L. No. 110-329 provides that the explanatory 
statement shall have the same effect with respect to the allocation of 
funds and the implementation of the act as if it were a joint 
explanatory statement of a committee of conference. 

[4] H.R. Rep. No. 110-862, at 103 (2008). 

[5] The Air Exit Pilots' evaluation framework consisted of metrics, 
observations, and cost elements; associated data sources; and other 
data collection specifications. 

[6] GAO, Tax Administration: IRS Needs to Strengthen Its Approach for 
Evaluating the SRFMI Data-Sharing Pilot Program, [hyperlink, 
http://www.gao.gov/products/GAO-09-45] (Washington, D.C.: Nov. 7, 
2008) and Transportation Worker Identification Credential: Progress 
Made in Enrolling Workers and Activating Credentials but Evaluation 
Plan Needed to Help Inform the Implementation of Card Readers, 
[hyperlink, http://www.gao.gov/products/GAO-10-43] (Washington, D.C.: 
Nov. 18, 2009). 

[7] The briefing in appendix I contains a minor change from the 
version provided to the committees on June 10, 2010, to recognize that 
TSA collected data from certain passengers ages 14 to 18. 

[8] This summary clarifies our findings for the first objective by 
using "evaluation report" and "report" in place of "pilots," the term 
used in the corresponding paragraph of the briefing in appendix I. 

[9] 73 Fed. Reg. 22065 (Apr. 24, 2008). 

[10] Pub. L. No. 110-329, 122 Stat. 3574, 3668-70 (Sept. 30, 2008). 

[11] See Explanatory Statement, 154 Cong. Rec. H9427, H9802 (daily ed. 
Sept. 24, 2008) and the Consolidated Security, Disaster Assistance, 
and Continuing Appropriations Act, 2009, Pub. L. No. 110-329, Div. D, 
Department of Homeland Security Appropriations Act, 2009 (Sept. 30, 
2008). Section 4 of Pub. L. No. 110-329 provides that the Explanatory 
Statement shall have the same effect with respect to the allocation of 
funds and the implementation of the act as if it were a joint 
explanatory statement of a committee of conference. 

[12] H.R. Rep. No. 110-862, at 103 (2008). 

[13] The Air Exit Pilots' evaluation framework consisted of metrics, 
observations, and cost elements; associated data sources; and other 
data collection specifications. 

[14] GAO, Tax Administration: IRS Needs to Strengthen Its Approach for 
Evaluating the SRFMI Data-Sharing Pilot Program, [hyperlink, 
http://www.gao.gov/products/GAO-09-45] (Washington, D.C.: Nov. 7, 
2008) and Transportation Worker Identification Credential: Progress 
Made in Enrolling Workers and Activating Credentials but Evaluation 
Plan Needed to Help Inform the Implementation of Card Readers, 
[hyperlink, http://www.gao.gov/products/GAO-10-43] (Washington, D.C.: 
Nov. 18, 2009). 

[15] US-VISIT currently applies to a certain group of foreign 
nationals—nonimmigrants from countries whose residents are required to 
obtain nonimmigrant visas before entering the United States and 
residents of certain countries who are exempt from U.S. visa 
requirements when they apply for admission to the United States for up 
to 90 days for tourism or business purposes under the Visa Waiver 
Program. US-VISIT also applies to (1) lawful permanent residents; (2) 
Mexican nonimmigrants traveling with a Border Crossing Card, who wish 
to remain in the United States longer than 30 days, or who declare 
that they intend to travel more than 25 miles into the country from 
the border; and (3) Canadians traveling to the United States for 
certain specialized reasons. See 8 C.F.R. § 235.1(f). 

[16] US-VISIT program documentation now refers to these as 
"principles." 

[17] Radio frequency technology relies on proximity cards and card 
readers. Radio frequency devices read the information contained on the 
card when the card is passed near the device. The information can 
contain personal identification of the cardholder. 

[18] Other Comprehensive Exit projects include modification of IDENT 
to collect, validate, and store biometric and biographic data for 
travelers exiting the United States; enhancement of IDENT's reporting 
capabilities to support the analysis and evaluation of the Air Exit 
Pilot results; and recording the departure of certain temporary 
agricultural and nonagricultural workers at two Arizona land POEs. 

[19] Homeland Security: U.S. Visitor and Immigrant Status Program's 
Long-standing Lack of Strategic Direction and Management Controls 
Needs to Be Addressed, [hyperlink, 
http://www.gao.gov/products/GAO-07-1065] (Washington, D.C.: Aug. 31, 
2007). 

[20] GAO, Homeland Security: Strategic Solution for US-VISIT Program 
Needs to Be Better Defined, Justified, and Coordinated, [hyperlink, 
http://www.gao.gov/products/GAO-08-361] (Washington, D.C.: Feb. 29, 
2008). 

[21] GAO, Visa Waiver Program: Actions Are Needed to Improve 
Management of the Expansion Process, and to Assess and Mitigate 
Program Risks, [hyperlink, http://www.gao.gov/products/GAO-08-967] 
(Washington, D.C.: Sept. 15, 2008). 

[22] GAO, Homeland Security: U.S. Visitor and Immigrant Status 
Indicator Technology Program Planning and Execution Improvements 
Needed, [hyperlink, http://www.gao.gov/products/GAO-09-96] 
(Washington, D.C.: Dec. 12, 2008). 

[23] GAO, Homeland Security: Key US-VISIT Components at Varying Stages 
of Completion, but Integrated and Reliable Schedule Needed, 
[hyperlink, http://www.gao.gov/products/GAO-10-13] (Washington, D.C.: 
Nov. 19, 2009). 

[24] This date was included in the explanatory statement that 
accompanied the Department of Homeland Security Appropriations Act, 
2009, which was enacted on September 30, 2008. 

[25] The pilots were to (1) evaluate identity verification and exit-
recording capabilities when used with existing POE operations and 
infrastructure; (2) biometrically and biographically verify the 
identity of in-scope travelers departing the United States at the 
pilot locations; and (3) record the exit of, and update the IDENT and 
ADIS records for, each subject traveler. 

[26] The evaluation report stated that visibility into the US-VISIT 
traveler status of travelers ages 14 to 18 was limited because of 
conflicting TSA and US-VISIT policies. According to a TSA official, 
data were collected on these travelers only when they were with in-
scope adults. 

[27] These metrics corresponded to the business requirement categories 
of data capture, transmission, data linkage, search and match, 
reporting, interoperability, and non-technical. 

[28] One of the business requirements applied only to the CBP pilot—
that the air exit solution shall operate with existing CBP policies, 
processes, and systems. 

[29] [hyperlink, http://www.gao.gov/products/GAO-10-13]. 

[30] This operational requirement is to be able to generate a report 
on attempts of unauthorized access or requests of US-VISIT systems or 
data. According to the program office, this requirement was not tested 
because the pilot was scoped to only allow certain individuals to log 
into the system. In our view, this does not alleviate the need to test 
whether persons other than those allowed attempted to access the 
systems or data. 

[31] [hyperlink, http://www.gao.gov/products/GAO-09-45] and 
[hyperlink, http://www.gao.gov/products/GAO-10-43]. 

[32] 73 Fed. Reg. 22065 (Apr. 24, 2008). 

[33] These airports were: (1) Baltimore-Washington Thurgood Marshall 
International; (2) Chicago O'Hare International; (3) Denver 
International; (4) Dallas Fort Worth International; (5) San Juan Luis 
Munoz Marin International; (6) Detroit Metropolitan Wayne County 
(McNamara Terminal); (7) Newark Liberty International; (8) San 
Francisco International; (9) Hartsfield-Jackson Atlanta International; 
(10) Philadelphia International; (11) Fort Lauderdale/Hollywood 
International; and (12) Seattle-Tacoma International. 

[34] TSA told us that Atlanta was selected because it had a single 
security checkpoint for international travelers, resulting in a 100 
percent probability of capturing exit data from travelers subject to 
US-VISIT who originated in Atlanta. In contrast, Chicago had two 
checkpoints, thus providing a 50 percent probability of processing the 
exiting international travelers who originated there. 

[35] Flights to English-speaking countries, or countries with positive 
relations with the United States, are examples of characteristics that 
might influence how quickly travelers move through the data collection 
process. CBP documentation did state that flights to Canada would be 
excluded because they primarily consisted of travelers not subject to 
US-VISIT processing. 

[36] GAO, Homeland Security: Key US-VISIT Components at Varying Stages 
of Completion, but Integrated and Reliable Schedule Needed, 
[hyperlink, http://www.gao.gov/products/GAO-10-13] (Washington, D.C.: 
Nov. 19, 2009). 

[37] Consolidated Security, Disaster Assistance, and Continuing 
Appropriations Act, 2009, Pub. L. No. 110-329, 122 Stat. 3574, 3668-70 
(Sept. 30, 2008). 

[38] See Explanatory Statement, 154 Cong. Rec. H9427, H9802 (daily ed. 
Sept. 24, 2008) and the Consolidated Security, Disaster Assistance, 
and Continuing Appropriations Act, 2009, Pub. L. No. 110-329, Div. D, 
Department of Homeland Security Appropriations Act, 2009 (Sept. 30, 
2008). Section 4 of Pub. L. No. 110-329 provides that the Explanatory 
Statement shall have the same effect with respect to the allocation of 
funds and the implementation of the act as if it were a joint 
explanatory statement of a committee of conference. 

[39] H.R. Rep. No. 110-862, at 103 (2008). 

[40] "Satisfied" means that the report met all aspects of the 
direction or expectation. "Partially satisfied" means that the report 
met some, but not all, aspects of the direction or expectation. "Not 
satisfied" means that the report did not satisfy any aspects of the 
direction or expectation. 

[41] The air exit pilots evaluation framework consisted of metrics, 
observations, and cost elements; associated data sources; and other 
data collection and analysis specifications. 

[42] We did not verify that all raw data and analyses supported the 
aggregate data that we reviewed. 

[43] We did not verify whether the planned tests or test results were 
sufficient to demonstrate satisfaction of pilot requirements. 

[44] GAO, Tax Administration: IRS Needs to Strengthen Its Approach for 
Evaluating the SRFMI Data-Sharing Pilot Program, [hyperlink, 
http://www.gao.gov/products/GAO-09-45] (Washington, D.C.: Nov. 7, 
2008) and Transportation Worker Identification Credential: Progress 
Made in Enrolling Workers and Activating Credentials but Evaluation 
Plan Needed to Help Inform the Implementation of Card Readers, 
[hyperlink, http://www.gao.gov/products/GAO-10-43] (Washington, D.C.: 
Nov. 18, 2009). 

[45] Travelers who remain in the country beyond their authorized 
period of stay are referred to as "overstays." 

[46] GAO, Homeland Security: Key US-VISIT Components at Varying Stages 
of Completion, but Integrated and Reliable Schedule Needed, 
[hyperlink, http://www.gao.gov/products/GAO-10-13] (Washington, D.C.: 
Nov. 19, 2009). 

[47] Other Comprehensive Exit projects include the modification of 
IDENT to collect, validate, and store biometric and biographic data 
for travelers exiting the United States; enhancement of IDENT's 
reporting capabilities to support the analysis and evaluation of the 
Air Exit Pilot results; and recording the departure of certain 
temporary agricultural and nonagricultural workers at two Arizona land 
POEs. 

[48] The evaluation report stated that visibility into the US-VISIT 
traveler status of travelers ages 14 to 18 was limited because of 
conflicting TSA and US-VISIT policies. According to a TSA official, 
data were collected on these travelers only when they were with in-
scope adults. 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO's actual cost of 
production and distribution and depends on the number of pages in the 
publication and whether the publication is printed in color or black 
and white. Pricing and ordering information is posted on GAO's Web 
site, 
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: