This is the accessible text file for GAO report number GAO-08-936 
entitled '2010 Census: Census Bureau's Decision to Continue with 
Handheld Computers for Address Canvassing Makes Planning and Testing 
Critical' which was released on September 2, 2008.

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to the Subcommittee on Information Policy, Census, and National 
Archives, Committee on Oversight and Government Reform, House of 
Representatives: 

United States Government Accountability Office: 
GAO: 

July 2008: 

2010 Census: 

Census Bureau's Decision to Continue with Handheld Computers for 
Address Canvassing Makes Planning and Testing Critical: 

GAO-08-936: 

GAO Highlights: 

Highlights of GAO-08-936, a report to the Subcommittee on Information 
Policy, Census, and National Archives, Committee on Oversight and 
Government Reform, House of Representatives. 

Why GAO Did This Study: 

The U.S. Census Bureau (Bureau) had planned to rely heavily on 
automation in conducting the 2010 Census, including using handheld 
computers (HHC) to verify addresses. Citing concerns about escalating 
costs, in March 2008 the Secretary of Commerce announced a redesign of 
the key automation effort. GAO was asked to (1) analyze Bureau and 
contractor data showing how HHCs operated and their impact on 
operations, and (2) examine implications the redesign may have on plans 
for address canvassing in the 2010 Census. 

GAO reviewed Bureau and contractor data, evaluations, and other 
documents on HHC performance and staff productivity; interviewed Bureau 
and contractor officials; and visited the two dress rehearsal sites to 
observe and document the use of the HHCs in the field. 

What GAO Found: 

Census and contractor data highlight problems field staff (listers) 
experienced using HHCs during the address canvassing dress rehearsal 
operation in 2007. Help desk logs, for example, revealed that listers 
most frequently reported issues with transmission, the device freezing, 
mapspotting (collecting mapping coordinates), and difficulties working 
with large blocks. When problems were identified, the contractor 
downloaded corrected software to the HHCs. Nonetheless, help desk 
resources were inadequate. The Bureau acknowledged that issues with the 
use of technology affected field staff productivity. After address 
canvassing, the Bureau established a review board and worked with its 
contractor to create task teams to analyze and address Field Data 
Collection Automation (FDCA) performance issues. 

Although the Bureau recognized that technology issues affected 
operations, and the contractor produced data on average transmission 
times, the Bureau and its contractor did not fully assess the magnitude 
of key measures of HHC performance. GAO previously recommended the 
Bureau establish specific quantifiable measures in such areas as 
productivity and performance. Also, the FDCA contract calls for the 
contractor to provide near real-time monitoring of performance metrics 
through a “dashboard” application. This application was not used during 
the census dress rehearsal. The Bureau has developed a preliminary list 
of metrics to be included in the dashboard such as daily measures on 
average transmission duration and number of failed transmissions, but 
has few benchmarks for expected performance. For example, the Bureau 
has not developed an acceptable level of performance on total number of 
failed transmissions or average connection speed. 

Technology issues and the Bureau’s efforts to redesign FDCA have 
significant implications for address canvassing. Among these are 
ensuring that FDCA solutions for technical issues identified in the 
dress rehearsal are tested, the help desk adequately supports field 
staff, and a solution for conducting address canvassing in large blocks 
is tested. In June 2008, the Bureau developed a testing plan that 
includes a limited operational field test, but the plan does not 
specify the basis for determining the readiness of the FDCA solution 
for address canvassing and when and how this determination will occur. 

Figure: Photograph of Contractor-Built Handheld Computer: 

[See PDF for image] 

Source: U.S. Census Bureau, Public Information Office (PIO). 

[End of figure] 

What GAO Recommends: 

GAO recommends the Secretary of Commerce direct the Bureau to specify 
the basis for determining the readiness of the FDCA solution for 
address canvassing and when and how this determination will occur, and 
to include the “dashboard” of performance metrics in its operational 
field test. 

In commenting on a draft of this report, Commerce had no substantive 
disagreements with GAO’s conclusions and recommendations and cited 
actions it is taking to address the challenges GAO identified. 

To view the full product, including the scope and methodology, click on 
[hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-08-936]. For more 
information, contact Mathew J. Scirè at (202) 512-6806 or 
sciremj@gao.gov or David A. Powner at (202) 512-9286 or 
pownerd@gao.gov. 

[End of section] 

Contents: 

Letter: 

Results in Brief: 

Background: 

Field Operations Were Affected by Problems Encountered Using New 
Technology, and the Bureau Did Not Sufficiently Specify What It 
Expected of Technology: 

The Redesign of the Decennial Census Carries with It Significant 
Implications for 2010 Address Canvassing: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendix I: Scope and Methodology: 

Appendix II: Comments from the Department of Commerce: 

Appendix III: GAO Contacts and Staff Acknowledgments: 

Table: 

Table 1: Dress Rehearsal Productivity Data by Area--Target and 
Reported: 

Figures: 

Figure 1: San Joaquin County Selected for Dress Rehearsal in 
California: 

Figure 2: Nine Counties Selected for Dress Rehearsal in North Carolina: 

Figure 3: Handheld Computer: 

Abbreviations: 

ALMI: Automated Listing and Mapping Instrument: 

Bureau: U.S. Census Bureau: 

Commerce: Department of Commerce: 

DAAL: Demographic Area Address Listing: 

FDCA: Field Data Collection Automation: 

GPS: Global Positioning System: 

HHC: Handheld computer: 

MAF: Master Address File: 

OCS: Operations Control System: 

TIGER®: Topologically Integrated Geographic Encoding and Referencing: 

[End of section] 

United States Government Accountability Office:
Washington, DC 20548: 

July 31, 2008: 

The Honorable Wm. Lacy Clay: 
Chairman: 
The Honorable Michael R. Turner: 
Ranking Member: 
Subcommittee on Information Policy, Census, and National Archives: 
Committee on Oversight and Government Reform: 
House of Representatives: 

In March 2008, we designated the 2010 Decennial Census as a high-risk 
area, citing a number of long-standing and emerging challenges. 
[Footnote 1] These include weaknesses in managing information 
technology, operational planning, and cost estimating, as well as 
uncertainty over dress rehearsal plans and the ultimate cost of the 
census. Because the census is fundamental for many government 
decisions, threats to a timely and reliable census can affect the 
public's confidence in government. 

From May to June 2007, the U.S. Census Bureau (Bureau) conducted the 
address canvassing operation of the 2008 Dress Rehearsal. This 
operation was the Bureau's final opportunity to test, under census-like 
conditions, handheld computers (HHC) developed by the contractor that 
will be deployed during the 2010 Census Address Canvassing operation-- 
scheduled to take place in the spring of 2009. In previous decennial 
censuses, the Bureau relied on a paper-based operation. According to 
the Bureau, the HHCs were to be a keystone to the reengineered census 
because they were to be used in developing an accurate address list for 
the Bureau and in obtaining information from households that fail to 
return Census forms. The Bureau believed that the HHCs would reduce the 
amount of paper used, process data in real time, and improve the 
quality of the data. However, at a March 2008 hearing, the Department 
of Commerce (Commerce) and the Bureau stated that the Field Data 
Collection Automation (FDCA) program, under which the HHCs are being 
developed, was likely to incur significant cost overruns and announced 
a redesigning effort to get the 2010 Decennial Census back on track. 

The Secretary of Commerce outlined several alternatives for redesigning 
this central technology investment, and on April 3, 2008, he decided to 
continue with the HHCs for address canvassing. During redesign 
deliberations, Bureau officials pointed out that it was too late in the 
decennial cycle to consider dropping the use of the HHCs for address 
canvassing in 2009. They concluded that, with hard deadlines fast 
approaching, there was not enough time to revert to a paper-based 
address canvassing operation. The decision to use the HHCs in the 2010 
Address Canvassing operation makes it critical that any problems 
identified with the HHCs in the dress rehearsal are resolved quickly 
and that the Bureau understand the implications of proceeding with this 
technology. 

Continued oversight of 2010 Census preparation is critical as the 
Bureau is redesigning operations late in the decennial cycle and 
relying on new technology to modernize its address listing and mapping 
activities. To respond to your interest in performance of the HHCs 
during 2008 Dress Rehearsal Address Canvassing, we examined whether the 
HHCs worked in collecting and transmitting address and mapping data. As 
part of this subcommittee's ongoing oversight of the 2010 Census, we 
testified in April 2008 on our preliminary observation of weaknesses 
with HHC performance and the potential implications for the 2010 
Census.[Footnote 2] We also raised the importance of performance 
measures and planning, recommending that the Bureau establish specific 
quantifiable measures in such areas as productivity and performance. At 
the subcommittee's request, we (1) analyzed Bureau and contractor data 
showing how HHCs operated and their implications for operations, and (2) 
examined implications the redesign may have on plans for address 
canvassing in the 2010 Census. 

In responding to these objectives, we reviewed Bureau planning 
documents, data on HHC performance and staff productivity, evaluation 
reports, and staff observations of address canvassing operations. We 
reviewed contract documents, help desk logs, contractor data on 
transmissions, and contractor evaluations of HHC performance. We also 
interviewed Bureau and contractor officials to determine the 
functionality of the HHCs during dress rehearsal address canvassing. 
Finally, we visited the two dress rehearsal sites in California and 
North Carolina to attend address canvassing lister training and to 
observe and document the use of the HHCs in the field during the dress 
rehearsal in the summer of 2007. Appendix I provides more detail on our 
scope and methodology. We conducted this performance audit from April 
2007 to July 2008 in accordance with generally accepted government 
auditing standards. Those standards require that we plan and perform 
the audit to obtain sufficient, appropriate evidence to provide a 
reasonable basis for our findings and conclusions based on our audit 
objectives. We believe that the evidence obtained provides a reasonable 
basis for our findings and conclusions based on our audit objectives. 

Results in Brief: 

The Bureau reported, in its 2008 Census Dress Rehearsal Address 
Canvassing Assessment Report,[Footnote 3] being able to use the HHC to 
collect address information for 98.7 percent of housing units visited 
and map information for 97.4 percent of the housing units visited. The 
Bureau also reported meeting planned time frames but saw performance 
problems that affected productivity. For example, Census and contractor 
data highlighted problems field staff (listers) experienced using HHCs 
during the address canvassing operation. The help desk logs, for 
example, revealed that listers most frequently reported issues with 
transmission, the device freezing, mapspotting (collecting mapping 
coordinates), and working with large blocks (geographic areas with 
large numbers of housing units, more often found in urban areas). One 
factor that may have contributed to these performance problems was a 
compressed schedule that did not allow for thorough testing before the 
dress rehearsal. Given the tighter time frames going forward, testing 
and quickly remedying issues identified in these tests becomes even 
more important. The Bureau also reported that 5,429 records were lost 
and not recorded in the mapping and address database because multiple 
HHCs had the same identification number assigned to them. As a result, 
when an HHC transmitted information, it overwrote any data previously 
recorded for HHCs with the same identification number. According to 
Bureau officials, this problem was identified and corrected during the 
address canvassing dress rehearsal. 

The Bureau acknowledged that issues with the use of technology affected 
staff productivity in its assessment of the address canvassing dress 
rehearsal operation. Data show staff productivity exceeded expectations 
in rural areas but did not meet Bureau expectations in urban/suburban 
areas, which represent a greater share of housing units across the 
nation. For example, the reported productivity for urban/suburban areas 
was more than 10 percent lower than the target, and this difference will 
have implications for the costs of the address canvassing operation. We 
previously testified that the Bureau had not sufficiently measured the 
performance of the HHCs during the dress rehearsal, nor fully specified 
how it will measure performance during the 2010 Census.[Footnote 4] The 
Bureau received data from the contractor on average transmission times, 
but the Bureau has not used these data to analyze the full range of 
transmission times, nor how transmissions may have changed throughout 
the entire operation. Without this information, the magnitude of the 
handheld computers' performance issues throughout the dress rehearsal 
was not clear. The Bureau has few benchmarks (the level of performance 
it is expected to attain) to help evaluate the performance of HHCs 
throughout the address canvassing operation. For example, the Bureau 
has not developed an acceptable level of performance for measures on 
total number of failed transmissions or average connection speed. The 
contract supporting the Bureau's field data collection calls for the 
contractor to provide near real-time reporting and monitoring of 
performance metrics and a "control panel/dashboard" application to 
visually report metrics from any Internet-enabled personal computer. 
Such real-time reporting may be helpful to the contractor and the 
Bureau to monitor ongoing address canvassing operations in 2009, but 
was not used during the dress rehearsal. The Bureau has developed a 
preliminary list of dashboard metrics, which include such daily 
measures as average transmission duration, and expects to use the 
dashboard for address canvassing in 2009. 

The Secretary of Commerce's decision to redesign the 2010 Decennial 
Census carries with it significant implications for address canvassing. 
Among these are ensuring that (1) the FDCA solution for address 
canvassing works, (2) the solution for collecting data in large blocks 
in parallel with other areas is tested and ready for use, and (3) the 
help desk adequately supports field staff. We previously testified that 
the Bureau needs to specify its plans for addressing these challenges. 
In his April 9, 2008, congressional testimony, the Bureau's Director 
outlined next steps that included developing an integrated schedule for 
address canvassing and testing. On May 22, 2008, the Bureau issued this 
integrated schedule, which identifies activities that need to be 
accomplished for the decennial census. In addition, the Bureau 
established milestones for completing tasks. However, the milestones 
for preparing for address canvassing in 2008 are very tight and in one 
case overlap the deployment of address canvassing. On June 6, 2008, the 
Bureau produced an address canvassing testing plan, including a field 
operations test. However, the plan does not specify the use of the 
dashboard in the field test. The address canvassing testing plan is a 
high-level plan that describes a partial redo of the dress rehearsal to 
validate certain functionality. While it represents a reasonable 
approach, it does not specify the basis for determining the readiness 
of the FDCA solution for address canvassing or when and how this 
determination will occur--when the Bureau would say that the 
contractor's solution meets its operational needs. 

To ensure that the Bureau addresses key challenges facing its 
implementation of the address canvassing operation for the 2010 Census, 
we recommend the Secretary of Commerce direct the Bureau to (1) specify 
the basis for determining the readiness of the FDCA solution for 
address canvassing and when and how this determination will occur--when 
the Bureau would say that the contractor's solution meets its 
operational needs; (2) specify how data collection in large blocks will 
be conducted in parallel with the address canvassing operation, and how 
this dual-track will be tested in order to ensure it will function as 
planned; (3) specify the benchmarks for measures used to evaluate the 
HHC performance during address canvassing; and (4) use the dashboard to 
monitor performance of the HHCs in the operational field test of 
address canvassing. 

On July 25, 2008, the Secretary of Commerce provided written comments 
on a draft of this report. Commerce had no substantive disagreements 
with our conclusions and recommendations and provided several technical 
corrections. We accepted the Department's revised language for one 
recommendation and incorporated technical comments elsewhere. The 
comments are reprinted in their entirety in appendix II. 

Background: 

In preparation for the 2010 Census, the address canvassing operation 
was tested as part of the 2008 Dress Rehearsal. From May 7 to June 25, 
2007, the Bureau conducted its address canvassing operation for its 
2008 Dress Rehearsal in selected localities in California (see fig. 1) 
and North Carolina (see fig. 2). The 2008 Census Dress Rehearsal took 
place in San Joaquin County, California, and nine counties in the 
Fayetteville, North Carolina, area. According to the Bureau, the dress 
rehearsal sites provided a comprehensive environment for demonstrating 
and refining planned 2010 Census operations and activities, such as the 
use of HHCs equipped with Global Positioning System (GPS). 

Figure 1: San Joaquin County Selected for Dress Rehearsal in 
California: 

[See PDF for image] 

This figure is a map of San Joaquin County, California. 

Source: U.S. Census Bureau. 

[End of figure] 

Figure 2: Nine Counties Selected for Dress Rehearsal in North Carolina: 

[See PDF for image] 

This figure is a map of the Fayetteville, North Carolina, site, 
including the following counties: 
Chatham; 
Cumberland; 
Harnett; 
Hoke; 
Lee; 
Montgomery; 
Moore; 
Richmond, and; 
Scotland. 

Source: U.S. Census Bureau. 

[End of figure] 

Prior to Census Day, Bureau listers perform the address canvassing 
operation, during which they verify the addresses of all housing units. 
Address canvassing is a field operation to help build a complete and 
accurate address list. The Bureau's Master Address File (MAF) is 
intended to be a complete and current list of all addresses and 
locations where people live or potentially live. The Topologically 
Integrated Geographic Encoding and Referencing (TIGER®) database is a 
mapping system that identifies all visible geographic features, such as 
type and location of streets, housing units, rivers, and railroads. 
Consequently, MAF/TIGER® provides a complete and accurate address list 
(the cornerstone of a successful census) because it identifies all 
living quarters that are to receive a census questionnaire and serves 
as the control mechanism for following up with households that do not 
respond. If the address list is inaccurate, people can be missed, 
counted more than once, or included in the wrong location(s). 

Generally, during address canvassing, census listers go door to door 
verifying and correcting addresses for all households and street 
features contained on decennial maps. The address listers add to the 
2010 Census address list any additional addresses they find and make 
other needed corrections to the 2010 Census address list and maps using 
GPS-equipped HHCs. Listers are instructed to compare what they discover 
on the ground to what is displayed on their HHC. 

As part of the 2004 and 2006 Census Tests, the Bureau produced a 
prototype of the HHC that would allow the Bureau to automate 
operations; eliminate the need to print millions of paper 
questionnaires, address registers, and maps used by temporary listers 
to conduct address canvassing[Footnote 5] and non-response follow-up; 
and allow listers to electronically submit their time and expense 
information. The HHCs for these tests were off-the-shelf 
computers purchased and programmed by the Bureau. While the Bureau was 
largely testing the feasibility of using HHCs for collecting data, it 
encountered a number of technical problems. The following are some of 
the problems we observed during the 2004[Footnote 6] and 2006[Footnote 
7] tests: 

* slowness and frequent lock-up, 

* problems with slow or unsuccessful transmissions, and: 

* difficulty in linking a mapspot to addresses for multi-unit 
structures. 

For the 2008 Dress Rehearsal and the 2010 Census, the Bureau awarded 
the development of the hardware and software for an HHC to a contractor. 
In March 2006, the Bureau awarded a 5-year, $595,667,000 contract to 
support the FDCA project. The FDCA project includes the development of 
HHCs, and Bureau officials stated that the HHCs would ultimately 
increase the efficiency and reduce costs for the 2010 Census. According 
to the Director of the Census Bureau, the FDCA program was designed to 
supply the information technology infrastructure, support services, 
hardware, and software to support a network for almost 500 local 
offices and for HHCs that will be used across the country. He also 
indicated that FDCA can be thought of as being made up of three 
fundamental components: (1) automated data collection using handheld 
devices to conduct address canvassing, and to collect data during the 
non-response follow-up of those households that do not return the 
census form; (2) the Operations Control System (OCS) that tracks and 
manages decennial census workflow in the field; and (3) census 
operations infrastructure, which provides office automation and support 
for regional and local census offices. 

The 2008 Dress Rehearsal Address Canvassing operation marked the first 
time the contractor-built HHCs and the operations control system were 
used in the field. In 2006, we reported that not using the contractor- 
built HHCs until 2008 Dress Rehearsal Address Canvassing would leave 
little time to develop, test, and incorporate refinements to the HHCs 
in preparation for the 2010 Census. We also reported that because the 
Bureau-developed HHC had performance problems, the introduction of a 
new HHC added another level of risk to the success of the 2010 Census. 
[Footnote 8] 

For the 2008 Dress Rehearsal, the FDCA contractor developed the 
hardware and software used in census offices and on the HHCs. See 
figure 3 for more details. The HHC included several applications that 
varied depending on the role of the user: software enabling listers to 
complete their time and expense electronically; text messaging software 
enabling listers to communicate via text message; software enabling 
staff to review all work assigned to them and enabling crew leaders to 
make assignments; software enabling staff to perform address 
canvassing; and an instrument enabling quality control listers to 
perform quality assurance tasks. 

Figure 3: Handheld Computer: 

[See PDF for image] 

This figure contains an image of a handheld computer, as well as the 
following information: 

The HHCs performed several functions during dress rehearsal address 
canvassing, including: 
* receiving maps and address files from MAF/TIGER®; 
* verifying addresses; 
* collecting Global Positioning System (GPS) mapspots for addresses; 
* transmitting information; 
* supporting quality control, and; 
* recording time and expense data. 

Source: Harris Corporation. 

[End of figure] 

Field Operations Were Affected by Problems Encountered Using New 
Technology, and the Bureau Did Not Sufficiently Specify What It 
Expected of Technology: 

The dress rehearsal address canvassing started May 7, 2007, and ended 
June 25, 2007, as planned. The Bureau reported in its 2008 Census Dress 
Rehearsal Address Canvassing Assessment Report being able to use the 
HHC to collect address information for 98.7 percent of housing units 
visited and map information for 97.4 percent of the housing units 
visited. There were 630,334 records extracted from the Bureau's address 
and mapping database and sent to the Bureau's address canvassing 
operation, and 574,606 valid records remained following the operation.[Footnote 
9] Mapspots (mapping coordinates) were collected for each structure 
that the Bureau defined as a Housing Unit, Other Living Quarters, or 
Uninhabitable. Each single-family structure received its own mapspot, 
while multi-unit structures shared a single mapspot for all the living 
quarters within that structure.[Footnote 10] According to the Bureau's 
2008 Dress Rehearsal Address Canvassing Assessment Report, the address 
canvassing operation successfully collected GPS mapspot coordinates in 
the appropriate block for approximately 92 percent of valid structures; 
most of the remaining 8 percent of cases had a manual coordinate that 
was used as the mapspot. It is not clear whether this represents 
acceptable performance because the Bureau did not set thresholds as to 
what it expected during the address canvassing dress rehearsal. 

Listers Encountered Problems Using HHCs to Update Addresses and Collect 
Mapspots during the Dress Rehearsal Address Canvassing Operation: 

Listers experienced multiple problems using the HHCs. For example, we 
observed and the listers told us that they experienced slow and 
inconsistent data transmissions from the HHCs to the central data 
processing center. The listers reported the device was slow to process 
addresses that were a part of a large assignment area. Bureau staff 
reported similar problems with the HHCs in observation reports, help 
desk calls, and debriefing reports. In addition, our analysis of Bureau 
documentation revealed problems with the HHCs consistent with those we 
observed in the field: 

* Bureau observation reports revealed that listers most frequently had 
problems with slow processing of addresses, large assignment areas, and 
transmission. 

* The help desk call log revealed that listers most frequently reported 
issues with transmission, the device freezing, mapspotting, and large 
assignment areas. 

* The Bureau's debriefing reports illustrated the impact of the HHCs' 
problems on address canvassing. For example, one participant commented 
that the listers struggled to find solutions to problems and wasted 
time in replacing the devices. 

Collectively, the observation reports, help desk calls, debriefing 
reports, and Motion and Time Study raised serious questions about the 
performance of the HHCs during the address canvassing operation. The 
Bureau's 2008 Dress Rehearsal Address Canvassing Assessment Report 
cited several problems with HHCs. For example, the Bureau observed the 
following problems: 

* substantial software delays for assignment areas with over 700 
housing units; 

* substantial software delays when linking mapspots at multi-unit 
structures; 

* unacceptable help desk response times and insufficient answers, which 
"severely" affected productivity in the field, and: 

* inconsistencies with the operations control system that made 
management of the operation less efficient and effective. 

The assessment reported 5,429 address records with completed field work 
were overwritten during the course of the dress rehearsal address 
canvassing operation, eliminating the information that had been entered 
in the field. The Bureau reported that this occurred due to an 
administrative error that assigned several HHCs the same identification 
number. Upon discovering the HHC mistake, the FDCA contractor took 
steps during the dress rehearsal address canvassing operation to ensure 
that all of the HHC devices deployed for the operation had unique 
identification numbers. Left uncorrected, this error could have had a 
greater effect on the accuracy of the Bureau's master address list 
during the dress rehearsal. 
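
To illustrate the failure mode, a minimal sketch follows (in Python, 
with hypothetical names and structures; this report does not describe 
the actual FDCA database schema). It shows how storing uploaded work 
keyed solely on a device identification number lets a later 
transmission silently overwrite an earlier one when two devices share 
an ID, and how a guaranteed-unique key avoids the collision: 

```python
# Sketch of the overwrite failure the Bureau described: uploads keyed
# only by device ID, so two HHCs sharing an ID clobber each other's
# completed records. All names here are hypothetical.

def store_upload(db: dict, device_id: str, records: list) -> None:
    db[device_id] = records  # last writer wins when IDs collide

db: dict = {}
store_upload(db, "HHC-0042", ["101 Main St", "103 Main St"])
store_upload(db, "HHC-0042", ["7 Oak Ave"])  # second device, same ID
print(db)  # {'HHC-0042': ['7 Oak Ave']} -- the first upload is gone

# The contractor's actual fix was operational (ensure every deployed
# HHC had a unique ID); a defensive schema fix would key each upload
# on a value that can never collide:
import uuid

def store_upload_safe(db: dict, device_id: str, records: list) -> None:
    db[(device_id, uuid.uuid4().hex)] = records  # unique per upload
```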

The HHCs are used in a mobile computing environment where they upload 
and download data from the data processing centers using a commercial 
mobile broadband network. The data processing centers housed 
telecommunications equipment and the central databases, which were used 
to communicate with the HHCs and manage the address canvassing 
operation. The HHCs download data, such as address files, from the data 
processing centers, and upload data, such as completed work and time 
and expense forms, to the data processing centers. The communications 
protocols used by the HHCs were similar to those used on cellular 
phones to browse Web pages on the Internet or to access electronic 
mail. For HHCs that were out of the coverage area of the commercial 
mobile broadband network or otherwise unable to connect to the network, 
a dial-up capability was available to transfer data to the data 
processing centers. FDCA contract officials attributed HHC transmission 
performance problems to this mobile computing environment, 
specifically: 

* telecommunication and database problems that prevented the HHC from 
communicating with the data center; 

* extraneous data being transmitted (such as column and row headings), 
and: 

* an unnecessary step in the data transmission process. 
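
The transfer arrangement described above (the commercial mobile 
broadband network first, with a dial-up fallback for HHCs out of 
coverage) can be sketched as follows. The transport functions are 
hypothetical stand-ins, not the contractor's software: 

```python
# Illustrative sketch of broadband-first transmission with a dial-up
# fallback, mirroring the arrangement the report describes.
import random
import time

def send_over_broadband(payload: bytes) -> bool:
    # Stand-in for the mobile broadband link; fails out of coverage.
    return random.random() > 0.3

def send_over_dialup(payload: bytes) -> bool:
    # Stand-in for the dial-up path: slower, but available.
    return True

def transmit(payload: bytes, retries: int = 3) -> str:
    for _ in range(retries):
        if send_over_broadband(payload):
            return "broadband"
        time.sleep(0.1)  # brief backoff between attempts
    # Persistent failure or no coverage: use the dial-up path.
    return "dialup" if send_over_dialup(payload) else "failed"

print(transmit(b"completed assignment area data"))
```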

When problems with the HHC were identified during address canvassing, 
the contractor downloaded corrected software on five occasions over 
the 7-week dress rehearsal address canvassing operation. After address 
canvassing, the Bureau established 
a review board and worked with its contractor to create task teams to 
address FDCA performance issues such as (1) transmission problems 
relating to the mobile computing environment, (2) the amount of data 
transmitted for large assignment areas, and (3) options for improving 
HHC performance. One factor that may have contributed to these 
performance problems was a compressed schedule that did not allow for 
thorough testing before the dress rehearsal. Given the tighter time 
frames going forward, testing and quickly remedying issues identified 
in these tests becomes even more important. 

The Bureau Achieved Productivity Expectations for Rural Areas but Not 
Urban/suburban Areas: 

Productivity results were mixed when Census listers used the HHC for 
address canvassing activities. A comparison of planned versus reported 
productivity reveals lister productivity exceeded the Bureau's target 
by almost two housing units per hour in rural areas, but missed the 
target by almost two housing units per hour in urban/suburban areas. 
Further, the reported productivity for urban/suburban areas was more 
than 10 percent lower than the target, and this difference will have 
cost implications for the address canvassing operation. Table 1 shows 
planned and reported productivity data for urban/suburban and rural 
areas. 

Table 1: Dress Rehearsal Productivity Data by Area--Target and 
Reported: 

Area: Urban/suburban areas; 
Housing units per hour: Target: 15.0; 
Housing units per hour: Reported: 13.4. 

Area: Rural areas; 
Housing units per hour: Target: 8.0; 
Housing units per hour: Reported: 9.8. 

Source: U.S. Census Bureau. 

[End of table] 

While productivity results were mixed, the lower than expected 
productivity in urban/suburban areas represents a larger problem as 
urban/suburban areas contain more housing units--and therefore a larger 
workload. According to the Bureau's dress rehearsal address canvassing 
assessment report, HHC problems appear to have negatively affected 
listers' productivity. The Bureau's assessment report concluded that 
"productivity of listers decreased because of the software problems." 
However, the extent of the impact is difficult to measure, as are other 
factors that may have affected productivity. 

The effect of decreases in productivity can mean greater costs. The 
Bureau, in earlier cost estimates, assumed a productivity rate of 25.6 
housing units per hour, exceeding both the expected and reported rates 
for the dress rehearsal. We previously reported that substituting the 
actual address canvassing productivity for the previously assumed 25.6 
units per hour resulted in a $270 million increase in the existing life-
cycle cost estimate.[Footnote 11] The Bureau has made some adjustments 
to its cost estimates to reflect its experience with the address 
canvassing dress rehearsal, but could do more to update its cost 
assumptions. We recommended the Bureau do so in our prior report. 
[Footnote 12] 
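
The cost sensitivity is simple arithmetic: required field hours, and 
thus labor cost, scale inversely with the number of housing units 
listed per hour. The sketch below works through the relationship; only 
the 25.6 assumed rate and the 13.4 reported urban/suburban rate come 
from this report, while the workload and hourly cost are hypothetical 
round numbers: 

```python
# Back-of-the-envelope illustration of productivity's cost effect.
workload_units = 1_000_000  # hypothetical housing units to canvass
hourly_cost = 15.0          # hypothetical loaded cost per lister hour

for rate in (25.6, 13.4):   # assumed rate vs. reported urban/suburban
    hours = workload_units / rate
    print(f"{rate:>4.1f} units/hr -> {hours:>9,.0f} lister hours, "
          f"${hours * hourly_cost:,.0f}")
# At 13.4 units/hr the same workload costs nearly twice as much as
# at the originally assumed 25.6 units/hr.
```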

The Bureau Collected Some Data on HHC Performance Issues, but Did Not 
Develop Benchmarks: 

The Bureau took some steps to collect data, but did not fully evaluate 
the performance of the HHCs. For instance, the contractor provided the 
Bureau with data such as average transmission times collected from 
transmission logs on the HHC, as required in the contract. But the 
Bureau has not used these data to analyze the full range of 
transmission times, nor how these times may have changed throughout the entire 
operation. Without this information, the magnitude of the handheld 
computers' performance issues throughout dress rehearsal was not clear. 
Also, the Bureau had few benchmarks (the level of performance it is 
expected to attain) to help evaluate the performance of HHCs throughout 
the operation. For example, the Bureau has not developed an acceptable 
level of performance for total number of failed transmissions or 
average connection speed. Additionally, the contractor and the Bureau 
did not use the dashboard specified in the contract for dress rehearsal 
activities. Since the dress rehearsal, the Bureau has specified certain 
performance requirements that should be reported on a daily, weekly, 
or monthly basis, as well as on an exception basis. 

In assessing an "in-house built" model of the HHC, we recommended in 
2005 that the Bureau establish specific quantifiable measures in such 
areas as productivity that would allow it to determine whether the HHCs 
were operating at a level sufficient to help the Bureau achieve cost 
savings and productivity increases.[Footnote 13] Further, our work in 
the area of managing for results has found that federal agencies can 
use performance information, such as that described above, to make 
various types of management decisions to improve programs and results. 
For example, performance information can be used to identify problems 
in existing programs, identify the causes of problems, develop 
corrective actions, plan, identify priorities, and make resource 
allocation decisions. Managers can also use performance information to 
identify more effective approaches to program implementation.[Footnote 
14] 

The Bureau had planned to collect certain information on operational 
aspects of HHC use, but did not specify how it would measure HHC 
performance. Specifically, sections of the FDCA contract require the 
HHCs to have a transmission log with what was transmitted, the date, 
time, user, destination, content/data type, and outcome status. In the 
weeks leading up to the January 16, 2008, requirements delivery, Bureau 
officials drafted a document titled "FDCA Performance Reporting 
Requirements," which included an array of indicators such as average 
HHC transmission duration, total number of successful HHC 
transmissions, total number of failed HHC transmissions, and average 
HHC connection speed. Such measures may be helpful to the Bureau in 
evaluating its address canvassing operations. While these measures 
provide certain useful information, they only cover a few dimensions of 
performance. For example, to better understand transmission time 
performance, it is important to include analyses that provide 
information on the range of transmission times. 
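
The following sketch shows the kind of distributional analysis 
described above, computing not just the average transmission duration 
but its percentiles. The log fields mirror those the contract requires 
the HHC transmission log to capture; the record layout and summary 
function are illustrative, not the actual FDCA reporting code: 

```python
# Summarizing transmission logs: average duration plus percentiles,
# which expose the full range of transmission times.
from dataclasses import dataclass
from statistics import mean, quantiles

@dataclass
class TransmissionLogEntry:
    timestamp: str        # date and time of the transmission
    user: str
    destination: str
    content_type: str
    outcome: str          # "success" or "failed"
    duration_sec: float

def summarize(log: list) -> dict:
    durations = [e.duration_sec for e in log if e.outcome == "success"]
    pct = quantiles(durations, n=100)  # 99 percentile cut points
    return {
        "successful": len(durations),
        "failed": sum(1 for e in log if e.outcome == "failed"),
        "avg_duration_sec": mean(durations),
        "p50_sec": pct[49],
        "p90_sec": pct[89],
        "p99_sec": pct[98],
    }
```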

The original FDCA contract also requires that the contractor provide 
near real-time reporting and monitoring of performance metrics on a 
"control panel/dashboard" application to visually report those metrics 
from any Internet-enabled PC. Such real-time reporting would help the 
Bureau and contractor identify problems during the operation, giving 
them the opportunity to quickly make corrections. However, the "control 
panel/dashboard" application was not used during the dress rehearsal. 
The Bureau explained that it needed to use the dress rehearsal to 
identify what data or analysis would be most useful to include on the 
dashboard it expects to use for address canvassing in 2009. In January 
and February 2008, the Bureau began to make progress in identifying the 
metrics that will be used in the dashboard. According to Bureau 
officials, the dashboard will include a subset of measures from the 
"FDCA Performance Reporting Requirements" such as average HHC 
transmission time and total number of successful and failed HHC 
transmissions, which would be reported on a daily basis. Between April 
28, 2008, and May 1, 2008, the Bureau and its contractor outlined the 
proposed reporting requirements for the dashboard. The Bureau indicated 
that the dashboard will be tested during the systems testing phase, 
which is currently scheduled for November and December 2008. Bureau 
officials did not specify whether the dashboard will be used in the 
operational field test 
of address canvassing, which is the last chance for the Bureau to 
exercise the software applications under Census-like conditions. 

The dress rehearsal address canvassing study assessment plan outlines 
the data the Bureau planned to use in evaluating the use of the HHC, 
but these data do not allow the Bureau to completely evaluate the 
magnitude of performance problems. The plan calls for using data such 
as the number of HHCs shipped to local census offices, the number of 
defective HHCs, the number of HHCs broken during the dress rehearsal 
address canvassing operation, the number checked in at the end of the 
operation, whether deployment affected the ability of staff to complete 
assignments, software/hardware problems reported through the help desk, 
the amount of time listers lost due to hardware or software 
malfunctions, and problems with transmissions. The plan also called for 
the collection of functional performance data on the HHCs, such as the 
ability to collect mapspots. 

Despite reporting on the data outlined in the study plan, the Bureau's 
evaluation does not appear to cover all relevant circumstances 
associated with the use of the HHC. For example, the Bureau does not 
measure when listers attempt transmissions but the mobile computing 
environment does not recognize the attempt. Additionally, the Bureau's 
evaluation does not provide conclusive information about the total 
amount of downtime listers experienced when using the HHC. For example, 
in the Bureau's final 2008 Census Dress Rehearsal Address Canvassing 
Assessment Report, the Bureau cites its Motion and Time Study as 
reporting observed lister time lost due to hardware or software 
malfunctions as 2.5 percent in the Fayetteville and 1.8 percent in the 
San Joaquin County dress rehearsal locations. The report also notes 
that the basis for these figures includes neither the downtime 
between the onset of an HHC error and the last/successful resolution 
attempt nor the amount of time a lister spent unable 
to work due to an HHC error. These times were excluded because they 
were not within the scope of the Motion and Time Study of address 
canvassing tasks. However, evaluating the full effect of HHC problems 
should entail accounting for the amount of time listers spend resolving 
HHC errors or are not engaged in address canvassing tasks due to HHC 
errors. 

The Redesign of the Decennial Census Carries with It Significant 
Implications for 2010 Address Canvassing: 

Because of the performance problems observed with HHCs during the 2008 
Dress Rehearsal, and the Bureau's subsequent redesign decision to use 
the HHCs for the actual address canvassing operation, HHC use will have 
significant implications for the 2010 Address Canvassing operation. 
[Footnote 15] In his April 9, 2008, congressional testimony, the 
Bureau's Director outlined next steps that included developing an 
integrated schedule for address canvassing and testing. On May 22, 
2008, the Bureau issued this integrated schedule, which identifies 
activities that need to be accomplished for the decennial census and 
milestones for completing tasks. However, the milestones for preparing 
for address canvassing are very tight and in one case overlap the onset 
of address canvassing. Specifically, the schedule indicates that the 
testing and integrating of HHCs will begin in December 2008 and be 
completed in late March 2009; however, the deployment of the HHCs for 
address canvassing will actually start in February 2009, before the 
completion of testing and integration. It is uncertain whether the 
testing and integration milestones will permit modification to 
technology or operations prior to the onset of operations. Separately, 
the Bureau on June 6, 2008, produced a testing plan for the address 
canvassing operation. This testing plan includes a limited operational 
field test of address canvassing; however, the plan does not specify 
that the dashboard described earlier will be used in this test. The 
address canvassing testing plan is a high-level plan that describes a 
partial redo of the dress rehearsal to validate certain functionality 
and represents a reasonable approach. However, it does not specify the 
basis for readiness of the FDCA solution for address canvassing and 
when and how this determination will occur--when the Bureau would say 
that the contractor's solution meets its operational needs. 

Field staff reported problems with HHCs when working in large 
assignment areas during address canvassing. According to Bureau 
officials, the devices could not accommodate more than 720 addresses-- 
3 percent of dress rehearsal assignment areas were larger than that. 
The amount of data transmitted and used slowed down the HHCs 
significantly. In a June 2008 congressional briefing, Bureau officials 
indicated that, once other HHC technology issues are resolved, the number 
of addresses the HHCs can accommodate may increase or decrease from the 
current 720. Identification of these problems prompted the contractor to 
create a task team to examine the issues, and this team recommended 
improving the end-to-end performance of the mobile solution by 
controlling the size of assignment area data delivered to the HHC for 
address canvassing. One specific recommendation was limiting the size 
of assignment areas to 200 total addresses. However, the redesign 
effort took another approach and decided that the Bureau will use 
laptops and software used in other demographic surveys to collect 
information in large blocks (assignment areas comprise one or more 
blocks). Specifically, the collection of information in large blocks 
(those with over 700 housing units) will be accomplished using existing 
systems and software known as the Demographic Area Address Listing 
(DAAL)[Footnote 16] and the Automated Listing and Mapping Instrument 
(ALMI).[Footnote 17] Prior to the start of the address canvassing 
operation, blocks known to have more than 700 housing units would be 
removed from the scope of the FDCA solution. These blocks will be 
flagged in the data delivered to the contractor and will not be 
included for the address canvassing operation. Because this plan 
creates dual-track operations, Bureau officials stated that differences 
exist in the content of the extracts and that they are currently 
working to identify the differences and determine how to handle those 
differences. Additionally, they said that plans for the testing of the 
large block solution are expected to occur throughout various phases of 
the testing for address canvassing and will include performance 
testing, interface testing, and field testing. 
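
The flagging step described above amounts to a simple partition of the 
workload before extracts are delivered. A minimal sketch follows, with 
a hypothetical block structure; only the 700-housing-unit threshold 
comes from the report: 

```python
# Routing large blocks to the laptop-based DAAL/ALMI track while the
# rest remain in the FDCA/HHC extract. Block structure is hypothetical.
LARGE_BLOCK_THRESHOLD = 700  # housing units, per the redesign decision

def split_workload(blocks: list) -> tuple:
    fdca_track, daal_track = [], []
    for block in blocks:
        if block["housing_units"] > LARGE_BLOCK_THRESHOLD:
            block["flag"] = "large_block"  # excluded from FDCA scope
            daal_track.append(block)
        else:
            fdca_track.append(block)
    return fdca_track, daal_track

blocks = [{"id": "B1", "housing_units": 250},
          {"id": "B2", "housing_units": 1200}]
hhc_blocks, laptop_blocks = split_workload(blocks)
print([b["id"] for b in hhc_blocks], [b["id"] for b in laptop_blocks])
```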

The costs for a help desk that can support listers during address 
canvassing were underestimated during planning and have increased 
greatly. Originally, the costs for the help desk were estimated to be 
approximately $36 million, but current estimates have the cost of the 
help desk rising as high as $217 million. The increased costs are meant 
to increase the efficiency and responsiveness of the help desk so that 
listers do not experience the kind of delays in getting help that they 
did during the address canvassing dress rehearsal. For example, the 
Bureau's final assessment of dress rehearsal address canvassing 
indicated that unacceptable help desk response times and insufficient 
answers severely affected productivity in the field. Field staff told 
us that help desk resources were unavailable on the weekends and that 
they had difficulty getting help. The increased costs cited above are 
due in part to improvements to the help desk, such as expanded 
availability and increased staffing. 

Lower than expected productivity has cost implications. In fact, the 
Bureau is beginning to recognize part of this expected cost increase. 
Specifically, the Bureau expects to update assumptions for the number 
of hours listers may work in a given week. The model assumes 27.5 hours 
per week, but the Bureau now expects listers to average 18. This will make it 
necessary to hire more listers and, therefore, procure more HHCs. The 
Bureau adjusted its assumptions based on its experience in the dress 
rehearsal. Our related report recommends updating assumptions and cost 
estimates.[Footnote 18] 
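
The staffing arithmetic is direct: for a fixed field workload, the 
number of listers (and HHCs to procure) scales inversely with the 
hours each lister works per week. A worked sketch, using a 
hypothetical total workload alongside the report's 27.5- and 18-hour 
figures: 

```python
# The 27.5- and 18-hour figures come from the report; the total weekly
# field workload is a hypothetical round number.
total_weekly_field_hours = 55_000

for hours_per_lister in (27.5, 18.0):
    listers_needed = total_weekly_field_hours / hours_per_lister
    print(f"{hours_per_lister:>4.1f} hrs/lister/week -> "
          f"{listers_needed:,.0f} listers (and HHCs)")
# Dropping from 27.5 to 18 hours per week raises the required number
# of listers, and devices, by roughly half.
```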

Conclusions: 

The dress rehearsal represents a critical stage in preparing for the 
2010 Census. This is the time when Congress and others should have the 
information they need to know how well the design for 2010 is likely to 
work, what risks remain, and how those risks will be mitigated. We have 
highlighted some of the risks facing the Bureau in preparing for its 
first major field operation of the 2010 Census--address canvassing. 
Going forward, it will be important for the Bureau to specify how it 
will ensure that this operation will be successfully carried out. If 
the solutions do not work in resolving HHC technology issues, the Bureau 
will not achieve productivity targets, and decennial costs will 
continue to rise. Without specifying the basis and time frame for 
determination of readiness of the FDCA address canvassing solution, the 
Bureau will not have the needed assurance that the HHCs will meet its 
operational needs. Such testing is especially critical for changes to 
operations that were not part of the address canvassing dress 
rehearsal. For example, because data collection in large blocks will be 
conducted in parallel with the address canvassing operation, and the 
Bureau is currently working to identify the differences in the content 
of the resulting extracts, it is important that this dual-track be 
tested to ensure it will function as planned. Furthermore, without 
benchmarks defining successful performance of the technology, the 
Bureau and stakeholders will be less able to reliably assess how well 
the technology worked during address canvassing. Although the Bureau 
field tested the HHCs in its dress rehearsal last year, it did not then 
have in place a dashboard for monitoring field operations. The Bureau's 
proposal for a limited field operations test this fall provides the 
last opportunity to use such a dashboard in census-like conditions. To 
be most effective, test results, assessments, and new plans need to be 
completed in a timely fashion, and they must be shared with those with 
oversight authority as soon as they are completed. 

Recommendations for Executive Action: 

To ensure that the Bureau addresses key challenges facing its 
implementation of the address canvassing operation for the 2010 Census, 
we recommend that the Secretary of Commerce direct the Bureau to take 
the following four actions: 

* Specify the basis for determining the readiness of the FDCA solution 
for address canvassing and when and how this determination will occur-
-when the Bureau would say that the contractor's solution meets its 
operational needs. 

* Specify how data collection in large blocks will be conducted in 
parallel with the address canvassing operation, and how this dual-track 
will be tested in order to ensure it will function as planned. 

* Specify the benchmarks for measures used to evaluate the HHC 
performance during address canvassing. 

* Use the dashboard to monitor performance of the HHCs in the 
operational field test of address canvassing. 

Agency Comments and Our Evaluation: 

The Secretary of Commerce provided written comments on a draft of this 
report on July 25, 2008. The comments are reprinted in appendix II. 
Commerce had no substantive disagreements with our conclusions and 
recommendations and cited actions it is taking to address challenges 
GAO identified. Commerce offered revised language for one 
recommendation, which we have accepted. Commerce also provided 
technical corrections, which we incorporated. 

Specifically, we revised our recommendation that the Bureau "Specify 
the basis for acceptance of the FDCA solution for address canvassing 
and when that acceptance will occur--when the Bureau would say it meets 
its operational needs and accepts it from the contractor" to "Specify 
the basis for determining the readiness of the FDCA solution for 
address canvassing and when and how this determination will occur--when 
the Bureau would say that the contractor's solution meets its 
operational needs." Also, after further discussion with Bureau 
officials, we provided more specific measures of address and map 
information successfully collected. We revised our discussion of the 
2004 and 2006 census tests to make clear that the HHC prototype was 
only used for non-response follow-up in the 2004 test. Finally, we 
revised our language on their decision to contract the development of 
HHC hardware and software to address the Bureau's concerns about how we 
characterized the timing of its decision. 

As agreed with your offices, unless you publicly announce the contents 
of this report earlier, we plan no further distribution until 30 days 
from the report date. At that time, we will send copies of this report 
to other interested congressional committees, the Secretary of 
Commerce, and the Director of the U.S. Census Bureau. Copies will be 
made available to others upon request. This report will also be 
available at no charge on GAO's Web site at [hyperlink, 
http://www.gao.gov]. 

If you have any questions on matters discussed in this report, please 
contact Mathew J. Scirè at (202) 512-6806 or sciremj@gao.gov, or David 
A. Powner at (202) 512-9286 or pownerd@gao.gov. Contact points for our 
Offices of Congressional Relations and Public Affairs may be found on 
the last page of this report. GAO staff who made major contributions to 
this report are listed in appendix III. 

Signed by: 

Mathew J. Scirè: 
Director, Strategic Issues: 

Signed by: 

David A. Powner: 
Director, Information Technology Management Issues: 

[End of section] 

Appendix I: Scope and Methodology: 

Our objectives for this report were to analyze U.S. Census Bureau 
(Bureau) and contractor data showing how handheld computers (HHC) 
operated and their implications for operations, and examine implications 
the redesign may have on plans for address canvassing in the 2010 
Census. To determine how well the HHC worked in collecting and 
transmitting address and mapping data, and what data the Bureau and 
contractor used in assessing HHC performance during address canvassing, 
we examined Bureau documents, observed HHCs in use, and interviewed 
Bureau and contractor officials. For example, we reviewed Census Bureau 
memos that outline the data on HHC performance the Bureau planned to 
collect. We reviewed the Field Data Collection Automation (FDCA) 
contract, focusing specifically on what performance specifications and 
requirements were included in the contract. We observed HHC use during 
dress rehearsal address canvassing, and interviewed Bureau officials 
and contractor officials about HHC use and performance during the dress 
rehearsal of address canvassing. Specifically, we observed five 
different listers over the course of 2 days in the Fayetteville, North 
Carolina, dress rehearsal site and six different listers over 3 days in 
the San Joaquin County, California, dress rehearsal site. We also 
analyzed data on HHC use including data on HHC functionality/usability, 
HHC log data, the Bureau's Motion and Time Study, the Bureau's 2008 
Dress Rehearsal assessments, observational and debriefing reports, a 
log of help desk tickets, and lessons-learned documents. Additionally, 
we interviewed knowledgeable Bureau and contractor officials. We did 
not independently verify the accuracy and completeness of the data 
either input into or produced by the operation of the HHCs. 

To better understand how HHC performance affected worker productivity, 
we attended the dress rehearsal address canvassing training for 
listers, interviewed Bureau officials about HHC performance, and 
examined data provided in the Bureau's Motion and Time Study and other 
sources related to predicted and reported productivity. In addition, we 
identified and analyzed the factors that contribute to HHC performance 
on aspects of address canvassing productivity. We examined the Bureau's 
Motion and Time Study results, conducted checks for internal 
consistency within the reported results, and met with Bureau officials 
to obtain additional information about the methodology used. The 
results reported in the study are estimates based on a non-random 
sample of field staff observed over the course of the address 
canvassing operation. Within the context of developing estimates for 
the time it takes address listers to perform address canvassing tasks 
and successfully resolve certain HHC problems, we determined that these 
data were sufficiently reliable for the purposes of our analysis. 
However, the study's methodology did not encompass a full accounting of 
the time field staff spent on the job, nor did the report explain how 
some results attributed to the Motion and Time Study were derived. 

We also compared the Bureau's expected productivity rates to 
productivity rates reported to us by the Bureau in response to our 
request for actual productivity data from the 2008 Dress Rehearsal 
Address Canvassing operation. After analyzing the Bureau's 
productivity data, we requested information about how the productivity 
data figures were calculated in order to assess their reliability. In 
reviewing documentation on the methodology and data, we identified 
issues that raise concerns about the data's reliability. The Bureau 
acknowledged that data for all 
address field staff were not included in its analysis. Even though the 
productivity figures reported to us and presented in this report are 
generally in line with the range of productivity figures shown in the 
Bureau's Motion and Time Study, the missing data, along with the 
Bureau's lack of response to some of our questions about calculations 
of productivity figures, limit the reliability of these data. We 
determined that these data are adequate for the purposes of this report 
in that they provide a rough estimate of field worker productivity, but 
they are not sufficiently reliable to be characterized as a definitive 
representation 
of the actual productivity experienced in the 2008 Dress Rehearsal 
Address Canvassing operation. 

To ascertain the implications the redesign may have on plans for 
address canvassing in the 2010 Census, we observed meetings with 
officials of the Bureau, Commerce, Office of Management and Budget, and 
the contractor who were working on the FDCA redesign at Bureau 
headquarters. We also met with the Director of the Census Bureau and 
analyzed key Department of Commerce, Bureau, and contractor documents 
including the 2010 Census Risk Reduction Task Force Report and a 
program update provided by the contractor (as well as new and clarified 
requirements). The Bureau was in the process of revising some of its 
plans for conducting address canvassing and had not finalized those 
plans prior to the completion of this audit. 

We conducted this performance audit from April 2007 to July 2008 in 
accordance with generally accepted government auditing standards. Those 
standards require that we plan and perform the audit to obtain 
sufficient, appropriate evidence to provide a reasonable basis for our 
findings and conclusions based on our audit objectives. We believe that 
the evidence obtained provides a reasonable basis for our findings and 
conclusions based on our audit objectives. 

[End of section] 

Appendix II: Comments from the Department of Commerce: 

The Secretary of Commerce: 
Washington, D.C. 20230: 

July 25, 2008: 

Mr. Mathew J. Scirè: 
Director: 
Strategic Issues: 
United States Government Accountability Office: 
Washington, DC 20548: 

Dear Mr. Scirè: 

The U.S. Department of Commerce appreciates the opportunity to comment 
on the United States Government Accountability Office's draft report 
entitled 2010 Census: Census Bureau's Decision to Continue with 
Handheld Computers for Address Canvassing Makes Planning and Testing 
Critical (GAO-08-936). I enclose the Department's comments on this 
report. 

Sincerely, 

Signed by: 

Carlos M. Gutierrez: 

Enclosure: 

U.S. Department of Commerce: 
Comments on the United States Government Accountability Office Draft 
Report Entitled 2010 Census: Census Bureau's Decision to Continue with 
Handheld Computers for Address Canvassing Makes Planning and Testing 
Critical (GAO-08-936), July 2008: 

The U.S. Department of Commerce and the U.S. Census Bureau continue to 
work closely with all stakeholders to ensure the 2010 Census is 
successful. As highlighted in this study conducted by the United States 
Government Accountability Office (GAO), we are especially focused on 
ensuring a successful address canvassing operation in April 2009 using 
handheld computers (HHCs). We have no substantive disagreements with 
the conclusions and recommendations made by GAO in this report. In 
fact, we have already taken a number of important steps to address the 
challenges identified. 

Many of the problems with the HHCs surfaced during the Address 
Canvassing Dress Rehearsal. These challenges were further clarified and 
evaluated by our Integrated Program Team, the Risk Reduction Task 
Force, an Expert Panel established by the Secretary of Commerce, and 
other internal and external reviews. 

The first major step taken was to implement the Task Force's 
recommendation-supported by the members of the Expert Panel and by the 
Secretary-to move to a paper-based Nonresponse Follow-Up operation 
while retaining the use of the HHCs in address canvassing. This 
decision reduced the risks associated with systems development work 
while allowing us to leverage Global Positioning System (GPS) 
technologies by using HHCs. This approach will improve the accuracy of 
our address list, which is fundamental to an accurate census. 

On June 8, 2008, we completed and began implementing a comprehensive 
testing plan for the Address Canvassing operation. The current plan 
relies primarily on the Field Data Collection Automation (FDCA) 
contract to supply a substantial portion of software for the operation. 
From our experience in the Dress Rehearsal, we have learned that a 
comprehensive, logical test plan is critical as we prepare for the 
production field activity in April 2009. The Census Bureau's test plan 
is organized into five comprehensive categories: FDCA testing, Large 
Block testing, Geography Division's testing, Interface testing 
(primarily FDCA interfaces with the Census Bureau), and an Operational 
Field Test. This plan lays out a logical flow of test activities, 
involves Census Bureau stakeholders throughout, details the dates and 
purposes of each test, and ends with a confirmation test that puts all 
the pieces together in an environment that replicates actual census 
conditions. 

As outlined in the Address Canvassing testing plan, we have developed 
and continue to develop specific criteria to evaluate the effectiveness 
of ongoing tests and determine progress in developing and deploying 
HHCs. Part of this process entails working closely with the Mitre 
Corporation and others to refine and improve performance metrics that 
will be captured in a dashboard. 

We are also closely monitoring the contractor's efforts by embedding 
Census Bureau staff at key sites to observe firsthand all their testing 
and requirements verification processes. During the last two weeks of 
January 2009, we will participate in a thorough Operational Readiness 
Review with the contractor, which will include verifying specific 
readiness criteria that will be documented in advance. 

Specific Comments on the Draft Report: 

Page 3 - Results In Brief: Please clarify the statement, "During dress 
rehearsal address canvassing operations using HHC technology, the 
Bureau reported being able to use the HHC to collect address and map 
information for over 90 percent of housing units visited." 

Census Bureau Comment: We have been unable to determine how GAO derived 
this figure, so we would request clarification on how it was 
calculated. 

Page 8, second full paragraph: This paragraph implies that a prototype 
of the HHC was used for address canvassing during both the 2004 and 
2006 Census Tests. 

Census Bureau Comment: While the Bureau did have a prototype of the 
handheld computers (HHC) during the 2004 Census Test, it was used only 
for the Nonresponse Follow-Up operations. It was not until the 2006 
Census Test that we began using the HHC prototype to collect addresses 
and automate the maps. Also, please add "address canvassing registers" 
to the list of items that no longer required printing. 

Page 8, third full paragraph: The statement reads, "After unsuccessful 
attempts to develop its own HHC, the Bureau decided to award the 
development of the hardware and software for a HHC to be used in the 
2008 Census Dress Rehearsal and the 2010 Census to a contractor." 

Census Bureau Comment: Please note that the development of the FDCA 
contract was conceived-and the request for proposals offered-before the 
2006 Address Canvassing was completed. The decision to contract was not 
based on the Bureau's assessment of that software. 

Pages 5 and 21 - Recommendations for Executive Action: Recommendation 1 
reads, "Specify the basis for acceptance of the FDCA solution for 
address canvassing and when that acceptance will occur-when the Bureau 
would say it meets its operational needs and accepts it from the 
contractor." 

Census Bureau Comment: Given the cost-plus-award-fee contract type, the 
Bureau views "acceptance" as a contractual term. We would not 
officially accept that the contractor's solution worked until after we 
successfully completed Address Canvassing and received the data. As 
noted, we continue to work closely with the contractor to verify 
specific readiness criteria that will be documented in advance. We 
request that you modify the statement to read, "Specify the basis for 
determining the readiness of the FDCA solution for address canvassing 
and when and how this determination will occur-when the Bureau would 
say that the contractor's solution meets its operational needs." 

[End of section] 

Appendix III: GAO Contacts and Staff Acknowledgments: 

GAO Contacts: 

Mathew J. Scirè, (202) 512-6806 or sciremj@gao.gov: 

David A. Powner, (202) 512-9286 or pownerd@gao.gov: 

Acknowledgments: 

In addition to the contact names above, Assistant Director Signora May, 
Stephen Ander, Thomas Beall, Jeffrey DeMarco, Richard Hung, Barbara 
Lancaster, Andrea Levine, Amanda Miller, Niti Tandon, Lisa Pearson, 
Cynthia Scott, Timothy Wexler, and Katherine Wulff made key 
contributions to this report. 

[End of section] 

Footnotes: 

[1] GAO, Information Technology: Significant Problems of Critical 
Automation Program Contribute to Risks Facing 2010 Census, [hyperlink, 
http://www.gao.gov/cgi-bin/getrpt?GAO-08-550T] (Washington, D.C.: Mar. 
5, 2008). 

[2] GAO, Census 2010: Census at Critical Juncture for Implementing Risk 
Reduction Strategies, [hyperlink, 
http://www.gao.gov/cgi-bin/getrpt?GAO-08-659T] (Washington, D.C.: Apr. 
9, 2008). 

[3] K. Dixon, M. Blevins, et al., 2008 Census Dress Rehearsal Address 
Canvassing Assessment Report, SSD 2008 Census Dress Rehearsal Memoranda 
Series, No. 55, U.S. Census Bureau (Apr. 16, 2008). 

[4] [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-08-659T]. 

[5] The prototype of the HHC used during the 2004 Census Test was used 
only for the Nonresponse Follow-Up operations. It was not until the 
2006 Census Test that the Bureau began using the HHC prototype to 
collect addresses and automate the maps. 

[6] GAO, 2010 Census: Basic Design Has Potential, but Remaining 
Challenges Need Prompt Resolution, [hyperlink, 
http://www.gao.gov/cgi-bin/getrpt?GAO-05-9] (Washington, D.C.: Jan. 12, 
2005). 

[7] GAO, 2010 Census: Census Bureau Needs to Take Prompt Actions to 
Resolve Long-standing and Emerging Address and Mapping Challenges, 
[hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-06-272] (Washington, 
D.C.: June 15, 2006). 

[8] [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-06-272]. 

[9] During the dress rehearsal address canvassing, 378,742 records were 
verified; listers added 49,406 records to the Bureau's database, 
removed 102,631 records, and corrected 138,094 records; 8,283 records 
required no action. 

[10] When the lister was collecting the mapspot, symbols were displayed 
to indicate the status of the GPS signal. When a GPS signal was not 
available (and all attempts made by the lister to obtain a signal were 
unsuccessful), the lister would manually spot the structure (by tapping 
the HHC screen with the stylus) without the benefit of GPS coordinate 
collection. When a GPS signal was available, the lister's action of 
tapping the screen collected both a manual and a GPS mapspot. 

[11] GAO, 2010 Census: Census Bureau Should Take Action to Improve the 
Credibility and Accuracy of Its Cost Estimate for the Decennial Census, 
[hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-08-554] (Washington, 
D.C.: June 16, 2008). 

[12] [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-08-554]. 

[13] [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-9]. 

[14] GAO, Managing for Results: Enhancing Agency Use of Performance 
Information for Management Decision Making, [hyperlink, 
http://www.gao.gov/cgi-bin/getrpt?GAO-05-927] (Washington, D.C.: Sept. 
9, 2005). 

[15] [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-08-659T]. 

[16] DAAL is a post-Census 2000 program that coordinates various 
operations related to the review and automated update of the geographic 
content of the TIGER® database and the addresses in the MAF; the 
results of the reviews and updates are recorded using laptop computers. 

[17] ALMI is a post-Census 2000 system of files and software used by 
the Bureau to enable regional office field staff to update the address 
information in the MAF and the street, address location, and related 
information in the TIGER® database for an area. The field staff use 
laptop computers to view address and map information derived from the 
TIGER® database and the MAF, and to record updates and corrections to 
those files. 

[18] [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-08-554]. 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. Government Accountability Office: 
441 G Street NW, Room LM: 
Washington, D.C. 20548: 

To order by Phone: 
Voice: (202) 512-6000: 
TDD: (202) 512-2537: 
Fax: (202) 512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: