This is the accessible text file for GAO report number GAO-09-262 
entitled 'Information Technology: Census Bureau Testing of 2010 
Decennial Systems Can Be Strengthened' which was released on March 5, 
2009.

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer-term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please e-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to Congressional Requesters: 

United States Government Accountability Office: 
GAO: 

March 2009: 

Information Technology: 

Census Bureau Testing of 2010 Decennial Systems Can Be Strengthened: 

GAO-09-262: 

GAO Highlights: 

Highlights of GAO-09-262, a report to congressional requesters. 

Why GAO Did This Study: 

The Decennial Census is mandated by the U.S. Constitution and provides 
vital data that are used, among other things, to apportion 
congressional seats and redraw district boundaries. In March 2008, GAO 
designated the 2010 
Decennial Census a high-risk area, citing a number of long-standing and 
emerging challenges, including weaknesses in the Census Bureau’s 
(Bureau) management of its information technology (IT) systems and 
operations. In conducting the 2010 census, the Bureau is relying on 
both the acquisition of new IT systems and the enhancement of existing 
systems. Thoroughly testing these systems before their actual use is 
critical to the success of the census. GAO was asked to determine the 
status of and plans for testing key decennial systems. To do this, GAO 
analyzed testing documentation, interviewed Bureau officials and 
contractors, and compared the Bureau’s efforts with recognized best 
practices. 

What GAO Found: 

Although the Bureau has made progress in testing key decennial systems, 
critical testing activities remain to be performed before systems will 
be ready to support the 2010 census. Bureau program offices have 
completed some testing of individual systems, but significant work 
still remains to be done, and many plans have not yet been developed 
(see table below). In its testing of system integration, the Bureau has 
not completed critical activities; it also lacks a master list of 
interfaces between systems; has not set priorities for the testing of 
interfaces based on criticality; and has not developed testing plans 
and schedules. Although the Bureau had originally planned what it 
refers to as a Dress Rehearsal, starting in 2006, to serve as a 
comprehensive end-to-end test of key operations and systems, 
significant problems were identified during testing. As a result, 
several key operations were removed from the Dress Rehearsal and did 
not undergo end-to-end testing. The Bureau has neither developed 
testing plans for these key operations nor determined when such plans 
will be completed. 

Weaknesses in the Bureau’s testing progress and plans can be 
attributed, in part, to a lack of sufficient executive-level oversight 
and guidance. Bureau management does provide oversight of system 
testing activities, but this oversight is not sufficient. 
For example, Bureau reports do not provide comprehensive status 
information on progress in testing key systems and interfaces, and 
assessments of the overall status of testing for key operations are not 
based on quantitative metrics. Specifically, key operations that do not 
yet have plans developed are marked as making acceptable progress based 
solely on management judgment. Further, although the Bureau has issued 
general testing guidance, it is neither mandatory nor specific enough 
to ensure consistency in conducting system testing. Without adequate 
oversight and more comprehensive guidance, the Bureau cannot ensure 
that it is thoroughly testing its systems and properly prioritizing 
testing activities before the 2010 Decennial Census, posing the risk 
that these systems may not perform as planned. 

Table: Status and Plans of 2010 System Testing: 

System: Headquarters processing; 
Testing status: In progress; 
Testing plan completed: Partial; 
Testing schedule completed: Partial. 

System: Master address and geographic information; 
Testing status: In progress; 
Testing plan completed: Partial; 
Testing schedule completed: Partial. 

System: Decennial response integration; 
Testing status: In progress; 
Testing plan completed: Partial; 
Testing schedule completed: Partial. 

System: Field data collection automation; 
Testing status: In progress; 
Testing plan completed: Partial; 
Testing schedule completed: Partial. 

System: Paper-based operations; 
Testing status: In progress; 
Testing plan completed: No; 
Testing schedule completed: Partial. 

System: Data access and dissemination; 
Testing status: In progress; 
Testing plan completed: Partial; 
Testing schedule completed: Partial. 

Source: GAO analysis of Bureau data. 

[End of table] 

What GAO Recommends: 

GAO is recommending that the Secretary of Commerce direct the Bureau to 
complete key system testing activities, develop and maintain plans for 
integration testing, and improve the oversight of and guidance for 
systems testing. In comments on a draft of this report, the department 
agreed with GAO’s recommendations. 

To view the full product, including the scope and methodology, click on 
[hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-09-262]. For more 
information, contact David Powner at (202) 512-9286 or pownerd@gao.gov. 

[End of section] 

Contents: 

Letter: 

Background: 

Bureau Is Making Progress in Conducting Key Decennial System Testing, 
but Lacks Plans and Schedules to Guide Remaining Efforts: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendix I: Scope and Methodology: 

Appendix II: Comments from the Department of Commerce: 

Appendix III: GAO Contact and Staff Acknowledgments: 

Tables: 

Table 1: Key Systems Supporting 2010 Census: 

Table 2: Status of System Testing and Plans: 

Table 3: Status of System Testing and Plans for Headquarters Processing 
Systems (Dress Rehearsal and 2010 Testing): 

Table 4: Status of System Testing and Plans for MAF/TIGER (2010 
Testing): 

Table 5: Status of System Testing and Plans for DRIS (2010 Testing): 

Table 6: Status of System Testing and Plans for FDCA (2010 Testing): 

Table 7: Status of System Testing and Plans for PBO Release 0 (2010 
Testing): 

Table 8: Status of System Testing and Plans for DADS II Components 
(2010 Testing): 

Table 9: Integration Testing Status: 

Figures: 

Figure 1: Summary of Key Decennial Activities: 

Figure 2: Timeline of Key Decennial Activities: 

Figure 3: Dates Key Decennial Systems Are to Be Operational: 

Figure 4: Inventory of Testing Activities as of May 2008: 

Abbreviations: 

Bureau: Census Bureau: 

DADS II: Data Access and Dissemination System II: 

DRIS: Decennial Response Integration System: 

FDCA: Field Data Collection Automation: 

IEEE: Institute of Electrical and Electronics Engineers: 

IT: information technology: 

MAF/TIGER: Master Address File/Topologically Integrated Geographic 
Encoding and Referencing system: 

PBO: Paper-Based Operations: 

RDS: Replacement Dissemination System: 

RPS: Response Processing System: 

RTS: Replacement Tabulation System: 

UC&M: Universe Control and Management: 

[End of section] 

United States Government Accountability Office:
Washington, DC 20548: 

March 5, 2009: 

Congressional Requesters: 

The Census Bureau (Bureau) is relying on both the acquisition of new 
systems and the enhancement of existing legacy systems for conducting 
operations for the 2010 Decennial Census. As you know, the census is 
mandated by the U.S. Constitution and provides data that are vital to 
the nation. These data are used, for example, to apportion the seats 
of the U.S. House of Representatives; realign the boundaries of the 
legislative districts of each state; allocate 
billions of dollars in federal financial assistance; and provide a 
social, demographic, and economic profile of the nation's people to 
guide policy decisions at each level of government. The Bureau is 
required to take a population count as of April 1, 2010 (Census Day), 
and the Secretary of Commerce is required to report to the President on 
the tabulation of total population by state within nine months of that 
date.[Footnote 1] 

Carrying out the census is the responsibility of the Department of 
Commerce's Census Bureau, which is relying on automation and technology 
to improve the coverage, accuracy, and efficiency of the 2010 census. 
Because the accuracy of the 2010 census depends, in part, on the proper 
functioning of these systems, both individually and when integrated, 
thorough testing of these systems before their actual use is critical 
to the success of the census. 

In March 2008, we designated the 2010 Decennial Census as a high-risk 
area, citing a number of long-standing and emerging challenges, 
[Footnote 2] including weaknesses in the Bureau's management of its 
information technology (IT) systems and operations. The 2010 Decennial 
Census remained one of our high-risk areas in our high-risk update 
issued in January 2009.[Footnote 3] Given the importance of 
comprehensive testing prior to the 2010 census, you asked us to 
determine the status of and plans for the testing of key decennial 
systems. 

To address this objective, we analyzed documentation related to system, 
integration, and end-to-end testing,[Footnote 4] including plans, 
schedules, and results, and interviewed Bureau officials and 
contractors. We then compared the Bureau's practices with those 
identified in our testing guide and other best practices.[Footnote 5] 
We conducted this performance audit from June 2008 to February 2009, in 
accordance with generally accepted government auditing standards. Those 
standards require that we plan and perform the audit to obtain 
sufficient, appropriate evidence to provide a reasonable basis for our 
findings and conclusions based on our audit objective. We believe that 
the evidence obtained provides a reasonable basis for our findings and 
conclusions based on our audit objective. Appendix I contains further 
details about our scope and methodology. 

Background: 

The Bureau's mission is to provide comprehensive data about the 
nation's people and economy. Its core activities include conducting 
decennial, economic, and government censuses; conducting demographic 
and economic surveys; managing international demographic and 
socioeconomic databases; providing technical advisory services to 
foreign governments; and performing other activities such as producing 
official population estimates and projections. 

Conducting the decennial census is a major undertaking that includes 
the following key activities: 

* Establishing where to count. This includes identifying and correcting 
addresses for all known living quarters in the United States (address 
canvassing) and validating addresses identified as potential group 
quarters, such as college residence halls and group homes (group 
quarters validation). 

* Collecting and integrating respondent information. This includes 
delivering questionnaires to housing units by mail and other methods, 
[Footnote 6] processing the returned questionnaires, and following up 
with nonrespondents through personal interviews (nonresponse follow-
up). It also includes enumerating residents of group quarters (group 
quarters enumeration) and occupied transitional living quarters 
(enumeration of transitory locations), such as recreational vehicle 
parks, campgrounds, and hotels. It also includes a final check of 
housing unit status (field verification) where Bureau workers verify 
potential duplicate housing units identified during response 
processing. 

* Providing census results. This includes processes to tabulate and 
summarize census data and disseminate the results to the public. 

Figure 1 illustrates key decennial activities. 

Figure 1: Summary of Key Decennial Activities: 

[Refer to PDF for image: illustration] 

Establish where to count: 

* Lists of addresses for living quarters in the United States are 
compiled from multiple sources, such as the Postal Service and state 
and local governments. 

* The addresses and other information are verified by enumerators using 
handheld computers (address canvassing). 

* Group quarters, such as college residence halls and group homes, are 
validated using paper (group quarters validation). 

* The data are used to establish a universe of addresses from which to 
count. 

Collect and integrate respondent information: 

* Questionnaires are delivered to housing units by mail and other 
methods, such as enumerators (update/leave). 

* Enumerators contact nonrespondents through personal interviews to 
complete paper questionnaires. 

* Enumerators also collect information from other living quarters, such 
as group quarters (group quarters enumeration) and occupied transitory 
locations, such as hotels (enumeration of transitory locations), using 
paper. 

* Returned questionnaires are processed. 

* A final check of housing unit status, known as field verification, is 
conducted by census workers to verify potential duplicate housing units 
(field verification). 

Provide census results: 

* Census data are tabulated, summarized, and disseminated to the 
public. 

* Apportionment data used to redistribute seats in the House of 
Representatives are delivered to the President, and redistricting data 
are delivered to state legislatures. 

Source: GAO analysis of Bureau data. 

[End of figure] 

The 2010 census enumerates the number and location of people on Census 
Day, which is April 1, 2010. However, census operations begin long 
before Census Day and continue afterward. For example, address 
canvassing for the 2010 census will begin in April 2009, while 
tabulated census data must be distributed to the President by December 
31, 2010, and to state legislatures by March 31, 2011. Figure 2 
presents a timeline of key decennial operations. 

Figure 2: Timeline of Key Decennial Activities: 

[Refer to PDF for image: illustration] 

Opening of 12 regional census centers: June 2008. 

Opening of 151 census offices: February 2009. 

Start of address canvassing operations: April 2009. 

Start of group quarters validation: September 2009. 

Opening of 344 census offices (495 total): December 2009. 

Initial questionnaires mailed: March 2010. 

Census Day: April 1, 2010. 

Begin nonresponse follow-up operations: May 2010. 

Deliver apportionment counts to the President: January 2011. 

Deliver redistricting data to state legislatures: April 2011. 

Source: GAO summary of Bureau data. 

[End of figure] 

Role of IT in the Decennial Census: 

Automation and IT are to play a critical role in the success of the 
2010 census by supporting data collection, analysis, and dissemination. 
Several systems will be used in the 2010 census. For example, 
enumeration "universes," which serve as the basis for enumeration 
operations and response data collection, are organized by the Universe 
Control and Management (UC&M) system, and response data are received 
and edited to help eliminate duplicate responses using the Response 
Processing System (RPS). Both UC&M and RPS are legacy systems that are 
collectively called the Headquarters Processing Systems. 

Geographic information and support to aid the Bureau in establishing 
where to count the U.S. population are provided by the Master Address 
File/ 
Topologically Integrated Geographic Encoding and Referencing (MAF/ 
TIGER) system. The Decennial Response Integration System (DRIS) is to 
provide a system for collecting and integrating census responses from 
all sources, including forms and telephone interviews. The Field Data 
Collection Automation (FDCA) program includes the development of 
handheld computers for the address canvassing operation and the 
systems, equipment, and infrastructure that field staff will use to 
collect the data. Paper-Based Operations (PBO) was established in 
August 2008, primarily to handle some of the operations that were 
originally part of FDCA. PBO includes IT systems and infrastructure 
needed to support the use of paper forms for operations such as group 
quarters enumeration activities, nonresponse follow-up activities, 
enumeration at transitory locations activities, and field verification 
activities. These activities were originally to be conducted using IT 
systems and infrastructure developed by the FDCA program. Finally, the 
Data Access and Dissemination System II (DADS II) is to replace legacy 
systems for tabulating and publicly disseminating data. 

Table 1 describes the key systems supporting the 2010 census, as well 
as the offices responsible for their development. 

Table 1: Key Systems Supporting 2010 Census: 

System name: Headquarters Processing--Universe Control and Management 
(UC&M); 
Responsible entity: Decennial System and Processing Office; 
Description: Organizes address files into enumeration "universes," 
which serve as the basis for enumeration operations and response data 
collection. UC&M data contain, among other things, a list of addresses 
to which census respondent forms must be delivered. This functionality 
is critical to ensuring that census respondent forms are delivered to 
the correct addresses. 

System name: Headquarters Processing--Response Processing System (RPS); 
Responsible entity: Decennial System and Processing Office; 
Description: Receives response data and edits the data to help 
eliminate duplicate responses by, for example, identifying people who 
have been enumerated more than once. After response data are finalized, 
they are provided to DADS II for tabulation and dissemination. 

System name: Master Address File/Topologically Integrated Geographic 
Encoding and Referencing (MAF/TIGER) system; 
Responsible entity: Geography Division; 
Description: Provides geographic information and support to aid the 
Bureau in establishing where to count the U.S. population for the 2010 
census. The Bureau's address list--MAF--is associated with the TIGER 
system, which is a geographic information system containing street maps 
and other geographic features. MAF/TIGER was recently updated under the 
MAF/TIGER Accuracy Improvement Project in April 2008, which provided 
corrected coordinates on a county-by-county basis for all current 
features in the TIGER database. 

System name: Decennial Response Integration System (DRIS); 
Responsible entity: DRIS Program Management Office and Lockheed Martin 
(lead contractor); 
Description: Collects and integrates census responses from all sources, 
including forms and telephone interviews. DRIS is to improve accuracy 
and timeliness by standardizing the response data and providing the 
data to other Bureau systems for analysis and processing. 

System name: Field Data Collection Automation (FDCA); 
Responsible entity: FDCA Program Management Office and Harris (lead 
contractor); 
Description: Provides automation support for field data collection 
operations. It includes the development of handheld computers for the 
address canvassing operation and the systems, equipment, and 
infrastructure that field staff will use to collect data. It is to 
establish office automation for the 12 regional census centers, the 
Puerto Rico area office, and approximately 494 temporary local census 
offices. FDCA handheld computers were originally to be used for 
nonresponse follow-up, but due to problems in testing, nonresponse 
follow-up was switched to paper-based operations. 

System name: Paper-Based Operations (PBO); 
Responsible entity: Decennial System and Processing Office; 
Description: Established in August 2008, primarily to handle operations 
that were originally part of FDCA. PBO includes IT systems and 
infrastructure needed to support the use of paper forms for operations 
such as group quarters enumeration, nonresponse follow-up, enumeration 
at transitory locations, and field verification. 

System name: Data Access and Dissemination System II (DADS II); 
Responsible entity: DADS II Program Management Office and IBM (lead 
contractor); 
Description: Replaces legacy systems for tabulating and publicly 
disseminating data. DADS II includes the Replacement Tabulation System 
and the Replacement Dissemination System, which are expected to 
maximize the efficiency, timeliness, and accuracy of tabulation and 
dissemination products and services; minimize the cost of tabulation 
and dissemination; and increase user satisfaction with related 
services. The Replacement Tabulation System is responsible for 
tabulating 2010 census data. The Replacement Dissemination System is 
responsible for distributing and disseminating census results. 

Source: GAO analysis of Bureau data. 

[End of table] 

Figure 3 shows the time frames during which each of the systems for 
the 2010 census is to be operational, according to the Census Bureau. 

Figure 3: Dates Key Decennial Systems Are to Be Operational: 

[Refer to PDF for image: illustration] 

UC&M: June 2009 - October 2010; 
RPS: January 2010 - April 2011; 
MAF/TIGER: August 2007 - December 2011; 
DRIS: August 2009 - October 2010; 
FDCA: January 2009 - January 2011; 
PBO: January 2010 - October 2010; 
DADS II: January 2011 and beyond. 

Source: GAO analysis of Bureau data. 

[End of figure] 

We Have Previously Reported on Weaknesses in Management of Census IT 
Systems: 

We have reported long-standing weaknesses in the Bureau's management of 
its IT systems. For example, in October 2007, we reported on the status 
and plans of key 2010 census IT acquisitions and whether the Bureau was 
adequately managing associated risks.[Footnote 7] We identified 
critical weaknesses in the Bureau's risk management practices, 
including those associated with risk identification, mitigation, and 
executive-level oversight. Further, we reported that operational 
testing planned during the census Dress Rehearsal would take place 
without the full complement of systems and functionality originally 
planned, that the Bureau had not finalized its plans for testing all 
the systems, and that it was unclear whether the plans would include 
testing to address all interrelated systems and functionality. We 
recommended that the Bureau 
develop a comprehensive plan to conduct an end-to-end test of its 
systems under census-like conditions. 

In March 2008, we designated the 2010 census as a high-risk area, 
citing several long-standing and emerging challenges.[Footnote 8] These 
challenges included, among other things, weaknesses in risk management 
and system testing, elimination of several operations from the 2008 
Dress Rehearsal, and questions surrounding the performance of handheld 
computers developed for the 2010 census. We have also testified 
[Footnote 9] on significant risks facing the 2010 census. For example, 
in March 2008, we testified that the FDCA program was experiencing 
significant problems, including schedule delays and cost increases 
resulting from changes to system requirements, which required 
additional work and staffing. Shortly thereafter, in April 2008, we 
testified on the Bureau's efforts to implement risk reduction 
strategies, including the decision to drop the use of handheld 
computers during the nonresponse follow-up operation and revert to a 
paper-based operation. Further, in June 2008, we testified that the 
Bureau had taken important steps to plan for a paper-based nonresponse 
follow-up operation, but several aspects remained uncertain. We 
concluded that it was critical to test capabilities for supporting the 
nonresponse follow-up operation. 

In July 2008, we reported that continued planning and testing of the 
handheld computers would be critical to the address canvassing 
operation.[Footnote 10] Specifically, the Bureau had developed a 
testing plan that included a limited operational field test, but the 
plan did not specify the basis for determining whether the FDCA 
solution was ready for address canvassing and when and how this 
determination would occur. 

In response to our findings and recommendations, the Bureau has taken 
several steps to improve its management of the 2010 Decennial Census. 
For example, the Bureau has sought external assessments of its 
activities from independent research organizations, implemented a new 
management structure and management processes, brought in experienced 
personnel in key positions, and established improved reporting 
processes and metrics. 

Comprehensive Testing Improves Chances of a Successful Decennial 
Census: 

As stated in our testing guide and the Institute of Electrical and 
Electronics Engineers (IEEE) standards,[Footnote 11] complete and 
thorough testing is essential for providing reasonable assurance that 
new or modified IT systems will perform as intended. To be effective, 
testing should be planned and conducted in a structured and disciplined 
fashion that includes processes to control each incremental level of 
testing, including testing of individual systems, the integration of 
those systems, and testing to address all interrelated systems and 
functionality in an operational environment, as described below (a 
brief illustrative sketch follows the list): 

* System testing: verifies that the complete system (i.e., the full 
complement of application software running on the target hardware and 
systems software infrastructure) meets specified requirements. It 
allows for the identification and correction of potential problems 
within an individual system, prior to integration with other systems. 

* Integration testing: verifies that systems, when combined, work 
together as intended. Effective integration testing ensures that 
external interfaces work correctly and that the integrated systems meet 
specified requirements. 

* End-to-end testing: verifies that a defined set of interrelated 
systems, which collectively support an organization's core business 
area or function, interoperate as intended in an operational 
environment. The interrelated systems include not only those owned and 
managed by the organization, but also the external systems with which 
they interface. 
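
To make the three levels above concrete, the following is a minimal, 
hypothetical sketch in Python using the standard library's unittest 
module. The function, class, and field names (dedupe, SystemTest, 
IntegrationTest, respondent_id) are illustrative assumptions only; 
they do not represent actual Bureau systems or tests. 

import unittest

def dedupe(responses):
    """Toy stand-in for a response-processing function under test."""
    seen, unique = set(), []
    for response in responses:
        if response["respondent_id"] not in seen:
            seen.add(response["respondent_id"])
            unique.append(response)
    return unique

class SystemTest(unittest.TestCase):
    # System testing: one system verified against its own requirements.
    def test_duplicate_responses_are_removed(self):
        data = [{"respondent_id": 1}, {"respondent_id": 1},
                {"respondent_id": 2}]
        self.assertEqual(len(dedupe(data)), 2)

class IntegrationTest(unittest.TestCase):
    # Integration testing: two systems exchanging data over an
    # interface; the output of one must be valid input to the other.
    def test_collector_output_feeds_processor(self):
        collected = [{"respondent_id": 3}]   # produced by system A
        processed = dedupe(collected)        # consumed by system B
        self.assertEqual(processed, collected)

# End-to-end testing (not shown) would exercise the full chain of
# interrelated systems, including external ones, with operational data
# in an operational environment.

if __name__ == "__main__":
    unittest.main()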

To be effective, this testing should be planned and scheduled in a 
structured and disciplined fashion. Such planning and scheduling 
provide the basis for identifying key tasks and requirements and 
better ensure that a system meets its specified requirements and 
functions as intended in an operational environment. 

Dress Rehearsal Includes Testing of Certain Systems and Operations: 

In preparation for the 2010 census, the Bureau planned what it refers 
to as the Dress Rehearsal. The Dress Rehearsal is managed by the 
Bureau's Decennial Management Division, in collaboration with other 
Bureau divisions (including the program offices, shown in table 1, 
which are responsible for developing and testing each of the systems). 
The Dress Rehearsal includes system and integration testing,[Footnote 
12] as well as end-to-end testing of key operations in a census-like 
environment. During the Dress Rehearsal period, running from February 
2006 through June 2009, the Bureau is developing and testing systems 
and operations, and it held a mock Census Day on May 1, 2008. The Dress 
Rehearsal activities, which are still under way, are a subset of the 
activities planned for the actual 2010 census and include testing of 
both IT and non-IT related functions, such as opening offices and 
hiring staff. 

Dress Rehearsal Testing of Key Systems and Activities Identified 
Problems with Technologies: 

The Dress Rehearsal tested several activities involving key systems. 
For example, the Bureau tested key systems with address canvassing and 
group quarters validation operations, including FDCA handheld computers 
and the MAF/TIGER system. In addition, the Bureau used the UC&M system 
and MAF/TIGER to provide an initial list of housing unit addresses for 
the Dress Rehearsal test sites. Questionnaires were mailed to these 
housing units in April 2008. Subsequently, a mock Census Day was held 
on May 1, 2008--1 month later than originally planned. The mock Census 
Day was delayed, in part, to focus greater attention on testing the 
technology being used. 

The Dress Rehearsal identified significant technical problems during 
the address canvassing operations. For example, the Bureau had 
originally planned to use handheld computers, developed under the FDCA 
program, for operations such as address canvassing and nonresponse 
follow-up. From May 2007 to June 2007, the Bureau tested the 
handhelds under census-like conditions for the first time during the 
Dress Rehearsal address canvassing operation. Bureau officials observed 
a number of performance problems with the handheld computers, such as 
slow and inconsistent data transmissions.[Footnote 13] In addition, 
help desk logs revealed that users had frequently reported problems, 
such as the devices freezing up or users having difficulties collecting 
mapping coordinates and working with large blocks (geographic areas 
with large numbers of housing units, more often found in urban areas). 

The Bureau also found system problems during testing of the group 
quarters validation operation, in which field staff validate addresses 
as group quarters and collect information required for their later 
enumeration. As part of this operation, the Bureau tested the 
operations control system--designed to manage field operations that 
rely on paper, as well as those that rely on the handheld computers-- 
and the system was found to be unreliable. As a result, the workload 
for these operations had to be supplemented with additional paper-based 
efforts by local census office staff, instead of being performed 
electronically, as intended. 

Results of Dress Rehearsal Testing Led to Decisions to Remove Testing 
of Certain Operations and to Revert to Some Paper-Based Processes for 
Key Operations: 

As a result of the problems observed with the handheld computers and 
operations control system, cost overruns and schedule slippage in the 
FDCA program, and other issues, the Bureau removed the planned testing 
of key operations from the Dress Rehearsal as follows: 

* update/leave (that is, after enumerators update addresses, they leave 
questionnaires at housing units; this occurs mainly in rural areas 
lacking street names, house numbers, or both), 

* nonresponse follow-up, 

* enumeration of transitory locations, 

* group quarters enumeration, and 

* field verification. 

Furthermore, in April 2008, the Secretary of Commerce announced a 
redesign of the 2010 Decennial Census, including the FDCA program. 
Specifically, the Bureau would no longer use handheld computers for 
nonresponse follow-up (its largest field operation), but would conduct 
paper-based nonresponse follow-up, as in previous censuses. It would, 
however, continue to use the handheld computers for the address 
canvassing operations. In May 2008, the Bureau issued a plan that 
detailed key components of the paper-based operation and described 
processes for managing it and other operations. It later established 
the PBO office to manage designing, developing, and testing paper-based 
operations, as well as to prepare related training materials. 

Additional Testing Is Planned to Supplement the Dress Rehearsal: 

In addition to the planned Dress Rehearsal testing, the Bureau is 
planning supplementary testing to prepare for the 2010 Decennial 
Census. This testing includes system, integration, and end-to-end 
testing of changes resulting from the Dress Rehearsal, operations or 
features that were not tested during the Dress Rehearsal, and 
additional features or enhancements that are to be added after the 
Dress Rehearsal. 

Bureau Is Making Progress in Conducting Key Decennial System Testing, 
but Lacks Plans and Schedules to Guide Remaining Efforts: 

The Bureau has made progress in conducting system, integration, and 
end-to-end testing for the 2010 census, but significant testing 
remains to be done, and many plans for the remaining testing 
activities have not been developed. The weaknesses in 
the Bureau's IT testing can be attributed, in part, to a lack of 
sufficient executive-level oversight and guidance on testing. Without 
comprehensive oversight and guidance, the Bureau cannot ensure that it 
is thoroughly testing its systems before the 2010 Decennial Census. 

Bureau Has Performed Many System Testing Activities, but Much Remains 
to Be Done: 

Through the Dress Rehearsal and other testing activities, the Bureau 
has completed key system tests, but significant testing has yet to be 
performed, and planning for this is not complete. For example, the 
Headquarters Processing systems (UC&M and RPS) are still completing 
system testing related to the Dress Rehearsal, and the program office 
is planning for further testing. For DRIS, on the other hand, system 
testing related to the Dress Rehearsal is complete, and additional 2010 
system testing is under way. Table 2 summarizes the status and plans 
for system testing. 

Table 2: Status of System Testing and Plans (Dress Rehearsal and 2010 
Testing): 

System: Headquarters Processing--UC&M and RPS; 
Dress Rehearsal system testing: In progress; 
2010 system testing: Testing status: In progress; 
2010 system testing: Testing plan completed: Partial; 
2010 system testing: Testing schedule completed: Partial. 

System: MAF/TIGER; 
Dress Rehearsal system testing: Completed; 
2010 system testing: Testing status: In progress; 
2010 system testing: Testing plan completed: Partial; 
2010 system testing: Testing schedule completed: Partial. 

System: DRIS; 
Dress Rehearsal system testing: Completed; 
2010 system testing: Testing status: In progress; 
2010 system testing: Testing plan completed: Partial[A]; 
2010 system testing: Testing schedule completed: Partial[A]. 

System: FDCA; 
Dress Rehearsal system testing: Partially completed[B]; 
2010 system testing: Testing status: In progress; 
2010 system testing: Testing plan completed: Partial; 
2010 system testing: Testing schedule completed: Partial. 

System: PBO; 
Dress Rehearsal system testing: N/A[C]; 
2010 system testing: Testing status: In progress; 
2010 system testing: Testing plan completed: No; 
2010 system testing: Testing schedule completed: Partial. 

System: DADS; 
Dress Rehearsal system testing: DADS[D] in progress; 
2010 system testing: Testing status: DADS II in progress; 
2010 system testing: Testing plan completed: Partial; 
2010 system testing: Testing schedule completed: Partial. 

Source: GAO analysis of Bureau data. 

[A] Program officials stated that DRIS's test plan and schedule were 
completed but will be modified to reflect changes resulting from the 
switch to paper-based operations. 

[B] System testing related to operations removed from the Dress 
Rehearsal was not completed. These operations were later moved to PBO. 

[C] The office to support PBO was created in August 2008. 

[D] DADS is being used for Dress Rehearsal system testing, but the 
replacement system, DADS II, is being developed and tested for 2010 
operations. 

[End of table] 

Dress Rehearsal System Testing for Headquarters Processing Systems Is 
Partially Completed, but Additional Test Plans for 2010 System Testing 
Have Not Yet Been Developed: 

For both Headquarters Processing Systems (UC&M and RPS), system testing 
for the Dress Rehearsal has been partially completed, as shown in table 
3. 

* For UC&M, Dress Rehearsal system testing is divided into three 
phases, as shown in table 3. These phases include a total of 19 
products (used to 
control and track Dress Rehearsal enumeration activities). The 
completed phases (1 and 2) included the development and testing of 14 
products. Program officials had planned to complete testing of the 
remaining 5 products for UC&M by October 2008, but as of December 2008, 
the program had not yet completed this testing. 

* For RPS, Dress Rehearsal system testing is being done by component; 
eight components perform functions for key activities, such as data 
integration, in which response data are integrated before processing. 
According to program officials, development and testing of four 
components are complete, and the remaining four components are planned 
to be completed by March 2009. 

Table 3: Status of System Testing and Plans for Headquarters Processing 
Systems (Dress Rehearsal and 2010 Testing): 

System: UC&M: Phase 1; 
Dress Rehearsal system testing: Dates: 7/07-9/07; 
Dress Rehearsal system testing: Testing status: Completed; 
2010 system testing: Dates: 6/09-8/09; 
2010 system testing: Testing status: Not started; 
2010 system testing: Testing plan completed: Partial; 
2010 system testing: Testing schedule completed: Partial[A]. 

System: UC&M: Phase 2; 
Dress Rehearsal system testing: Dates: 12/07-5/08; 
Dress Rehearsal system testing: Testing status: Completed; 
2010 system testing: Dates: 10/09-5/10; 
2010 system testing: Testing status: Not started; 
2010 system testing: Testing plan completed: Partial; 
2010 system testing: Testing schedule completed: Partial[A]. 

System: UC&M: Phase 3; 
Dress Rehearsal system testing: Dates: 7/08-; 
Dress Rehearsal system testing: Testing status: In progress[B]; 
2010 system testing: Dates: 6/10-8/10; 
2010 system testing: Testing status: Not started; 
2010 system testing: Testing plan completed: Partial; 
2010 system testing: Testing schedule completed: Partial[A]. 

System: RPS: Components 1-4; 
Dress Rehearsal system testing: Dates: 1/08-9/08; 
Dress Rehearsal system testing: Testing status: Completed; 
2010 system testing: Dates: 12/09-9/10; 
2010 system testing: Testing status: Not started[C]; 
2010 system testing: Testing plan completed: Partial; 
2010 system testing: Testing schedule completed: Partial[A]. 

System: RPS: Components 5-8; 
Dress Rehearsal system testing: Dates: 10/08-; 
Dress Rehearsal system testing: Testing status: In progress[B]; 
2010 system testing: Dates: 9/10-12/10; 
2010 system testing: Testing status: Not started; 
2010 system testing: Testing plan completed: Partial; 
2010 system testing: Testing schedule completed: Partial[A]. 

Source: GAO analysis of Bureau data. 

[A] High-level schedules have been defined; detailed schedules are not 
complete. 

[B] Completion has been delayed. 

[C] For 2010 operations, only two components will be included; two were 
combined and one was omitted. 

[End of table] 

In addition to ongoing Dress Rehearsal system testing, the program 
office intends to perform system testing for 2010 census operations, 
but plans for this testing have not yet been developed. According to 
program officials, they have not developed testing plans and schedules 
for additional testing for the 2010 census because Bureau management 
has not yet finalized the requirements for 2010 operations. Finalizing 
these requirements may involve both changes to existing requirements 
and new requirements. Program officials stated that they do not 
anticipate substantial changes in UC&M and RPS system requirements for 
2010 census operations and plan to have them finalized by May 2009. In 
commenting on a draft of this report, the Bureau provided an initial 
test plan and schedule, but did not provide the finalized baselined 
requirements for the Headquarters Processing Systems for 2010 
operations. 

Until the baseline requirements are established, it is unclear whether 
the amount of additional testing necessary for 2010 census operations 
will be significant. According to industry best practices, defined 
requirements are important because they provide a baseline for 
development and testing activities and are used to establish test 
plans, which define schedules and activities, roles and 
responsibilities, resources, and system testing priorities. The 
absence of finalized 
requirements increases the risk that there may not be sufficient time 
and resources to adequately test the systems, which are critical to 
ensuring that address files are accurately organized into enumeration 
universes and that duplicate responses are eliminated. 
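
As an illustration of the relationship the best practices describe, 
the following is a minimal, hypothetical Python sketch of a test plan 
entry anchored to a baselined requirement. All type and field names 
are assumptions chosen to mirror the elements named above (schedules, 
roles and responsibilities, resources, and priorities); they are not 
Bureau artifacts. 

from dataclasses import dataclass
from datetime import date

@dataclass
class Requirement:
    req_id: str
    text: str
    baselined: bool  # only baselined requirements can anchor a plan

@dataclass
class TestPlanEntry:
    requirement: Requirement
    start: date       # scheduled start of testing
    end: date         # scheduled end of testing
    owner: str        # role/responsibility for the test
    resources: list   # environments, data sets, staff
    priority: int     # 1 = most critical

def plannable(requirements):
    """A test plan can be built only over the baselined subset."""
    return [r for r in requirements if r.baselined]

# Example: an unbaselined requirement cannot yet be planned or tested.
reqs = [Requirement("HQ-001", "Organize addresses into universes", True),
        Requirement("HQ-002", "Eliminate duplicate responses", False)]
print([r.req_id for r in plannable(reqs)])  # -> ['HQ-001']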

MAF/TIGER Program Has Partially Completed Testing, but Test Plans and 
Schedules Are Incomplete and Ability to Track Progress Is Unclear: 

System testing has been partially completed for MAF/TIGER products 
(that is, extracts from the MAF/TIGER system) required for the 2010 
census. For MAF/TIGER, testing activities are defined by products 
needed for key activities, such as address canvassing. During Dress 
Rehearsal system testing, the program office completed testing for a 
subset of MAF/TIGER products for address canvassing, group quarters 
validation, and other activities. 

Additional system testing is planned for the 2010 census. According to 
program officials, as of December 2008, the Bureau had defined 
requirements and completed testing for 6 of approximately 60 products 
needed for 2010 operations (these 6 products are related to address 
canvassing). The program office has also developed detailed test plans 
and schedules through April 2009, but these do not cover all of the 
remaining products needed to support the 2010 census. Table 4 is a 
summary of the status of MAF/TIGER 2010 testing and plans. 

Table 4: Status of System Testing[A] and Plans for MAF/TIGER (2010 
Testing): 

Census activities: Address canvassing (including testing of MAF/TIGER 
updates); 
Dates: 5/08-5/09; 
Testing status: In progress; 
Testing plan completed: Partial; 
Testing schedule completed: Partial. 

Census activities: Group quarters validation; 
Dates: 12/08-9/09; 
Testing status: In progress; 
Testing plan completed: Partial; 
Testing schedule completed: Partial. 

Census activities: Enumeration universe (including group quarters 
enumeration, enumeration of transitory locations); 
Dates: 12/08-10/10; 
Testing status: In progress; 
Testing plan completed: Partial; 
Testing schedule completed: Partial. 

Census activities: Nonresponse follow-up; 
Dates: 3/09-6/10; 
Testing status: Not started; 
Testing plan completed: Partial; 
Testing schedule completed: Partial. 

Census activities: Field verification; 
Dates: 10/08-10/10; 
Testing status: In progress; 
Testing plan completed: Partial; 
Testing schedule completed: Partial. 

Census activities: Post-census products; 
Dates: 3/10-11/10; 
Testing status: Not started; 
Testing plan completed: Partial; 
Testing schedule completed: Partial. 

Source: GAO analysis of Bureau data. 

[A] System tests of a subset of MAF/TIGER products were also performed 
during the Dress Rehearsal. 

[End of table] 

According to program officials, the detailed test plans for the 
remaining products will be developed after the requirements for each 
are finalized. As mentioned, establishing defined requirements is 
important because these provide a baseline for development and testing 
activities and define the basic functions of a product. The officials 
stated that they were estimating the number of products needed, but 
would only know the exact number when the requirements for the 2010 
census operations are determined. The officials added that, by January 
2009, they plan to have a detailed schedule and a list of the products 
needed through December 2009. 

Without knowing the total number of products, the related 
requirements, and when the products are needed for operations, the 
Bureau risks both not sufficiently defining what each product needs to 
do and not being able to effectively measure the progress of MAF/TIGER 
testing activities. This, in turn, increases the risk that there may 
not be sufficient time and resources to adequately test the system and 
that the system may not perform as intended. 

DRIS Testing Is Under Way; Plans Have Been Established but Will Be 
Revised: 

System testing has been partially completed for DRIS components, 
including paper, workflow control and management, and telephony. 
[Footnote 14] Portions of the functionality in each DRIS component are 
being developed and tested across five increments for 2010 operations. 
As of November 2008, the program had planned and completed testing of 
increments 1, 2, and 3. Testing of increment 4 is currently ongoing. 
Table 5 is a summary of the status of DRIS testing for 2010 operations. 
(In addition, system testing of a subset of DRIS functionality, 
including the integration of certain response data, took place during 
the Dress Rehearsal.) 

Table 5: Status of System Testing[A] and Plans for DRIS (2010 Testing): 

Phase: Increment 1; 
Dates: 12/07-3/08; 
Testing status: Completed; 
Testing plan completed[B]: N/A; 
Testing schedule completed[B]: N/A. 

Phase: Increment 2; 
Dates: 3/08-7/08; 
Testing status: Completed; 
Testing plan completed[B]: N/A; 
Testing schedule completed[B]: N/A. 

Phase: Increment 3; 
Dates: 6/08-11/08; 
Testing status: Completed; 
Testing plan completed[B]: N/A; 
Testing schedule completed[B]: N/A. 

Phase: Increment 4; 
Dates: 10/08-5/09; 
Testing status: In progress; 
Testing plan completed[B]: Partial; 
Testing schedule completed[B]: Partial. 

Phase: Increment 5; 
Dates: 3/09-7/09; 
Testing status: Not started; 
Testing plan completed[B]: Partial; 
Testing schedule completed[B]: Partial. 

Source: GAO analysis of Bureau data. 

[A] System tests of a subset of DRIS functionality were also performed 
during the Dress Rehearsal. 

[B] Program officials stated that DRIS's test plan and schedule were 
completed, but will be modified to reflect changes resulting from the 
switch to paper-based operations. 

[End of table] 

The DRIS program has developed a detailed testing plan and schedule, 
including the remaining testing for increment 5. For example, detailed 
testing plans have been developed for all 558 functional requirements 
for DRIS. According to program officials, most of the 558 functional 
requirements will be fully tested during increments 4 and 5. As of 
November 2008, 22 of the 558 requirements had been tested. 
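
Figures such as these lend themselves to the kind of simple 
quantitative progress metric this report discusses. The following 
one-function Python sketch is illustrative only; the 22 and 558 
figures come from the DRIS status above, and the function itself is an 
assumption, not a Bureau tool. 

def coverage(tested, total):
    """Requirements-tested progress as a count and a percentage."""
    return f"{tested}/{total} requirements tested ({tested / total:.1%})"

print(coverage(22, 558))  # -> 22/558 requirements tested (3.9%)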

Although plans and schedules were completed, the change from handheld 
computers to paper processes for nonresponse follow-up has caused 
changes to DRIS processing requirements. For example, DRIS program 
officials stated that they now need to process an additional 40 million 
paper forms generated as a result of this switch. Although DRIS program 
officials stated that they are prepared to adjust their test schedule 
and plan to accommodate this change, they cannot do so until the 
requirements have been finalized for the switch to paper processes. 
(This responsibility is primarily that of the PBO program office.) 
Furthermore, because of the switch to paper, DRIS may not be able to 
conduct a test using the operational systems and live data. This 
increases the risk that the Bureau could experience problems with these 
systems and the processing of paper forms during the 2010 census. The 
DRIS program office is addressing this risk by developing alternative 
strategies for testing and providing additional resources as 
contingencies for activities that may not be fully tested before 2010 
operations begin. 

FDCA System Testing Is Proceeding on an Aggressive Schedule, but Key 
Test Plans Have Not Been Developed: 

FDCA testing has been partially completed, but much more work remains. 
System testing for FDCA took place during the Dress Rehearsal, but 
problems encountered during this testing led to the removal of key 
operations from the Dress Rehearsal[Footnote 15] and the April 2008 
redesign, as described earlier. Going forward, FDCA development and 
testing for 2010 operations are being organized based on key census 
activities. For example, FDCA testing for the address canvassing and 
group quarters validation operations was completed in December 2008. 
The FDCA contractor is currently developing and testing a backup system 
(known as the continuity of operations system) for address canvassing 
and group quarters validation, and is testing another system 
(known as map printing) to provide printed maps for paper-based field 
operations, such as nonresponse follow-up. Table 6 summarizes the FDCA 
test status. 

Table 6: Status of System Testing and Plans for FDCA (2010 Testing): 

Census activity: Address canvassing/group quarters validation; 
Dates: 5/08-12/08; 
Status: Partially completed; 
Testing plan completed: N/A; 
Testing schedule completed: N/A. 

Census activity: Map printing; 
Dates: 5/08-6/09; 
Status: In progress; 
Testing plan completed: Partial; 
Testing schedule completed: Partial. 

Census activity: Continuity of operations; 
Dates: 9/08-5/09; 
Status: In progress; 
Testing plan completed: Partial; 
Testing schedule completed: Partial. 

Source: GAO analysis of Bureau data. 

[End of table] 

Although system testing for address canvassing and group quarters 
validation was recently completed, program officials have not 
demonstrated that all current system requirements have been fully 
tested. As part of a contract revision required by the April 2008 
redesign, the FDCA program established a revised baseline for system 
requirements on November 20, 2008. According to program officials, this 
revision included both modifications to previous requirements and 
removal of requirements that were part of activities transferred to 
PBO. As of December 2008, program officials stated that detailed 
testing plans for many of the requirements exist, but need to be 
revised to address the newly baselined requirements. 

Furthermore, as of December 2008, the FDCA program had not finalized 
detailed testing plans and schedules for the continuity of operations 
and map printing systems. According to program officials, they had not 
yet developed detailed plans and schedules for testing these systems' 
requirements because their focus had been on testing the system for 
address canvassing and group quarters validation. Officials added that 
they plan to begin testing the requirements for the continuity of 
operations system in January 2009, and for the map printing system in 
February 2009. However, without having established testing plans and 
schedules for these systems, it is unclear what amount of testing will 
be needed and whether sufficient time has been allocated in the 
schedule to fully test these systems before they are needed for 
operations. 

Paper-Based Operations Are in Early Development, but Detailed 2010 
Testing Plans and Schedules Have Not Yet Been Developed: 

Testing has only recently started for PBO because it is still in the 
preliminary phase of program planning and initial system development as 
the Bureau shifts responsibility for certain operations from the FDCA 
program office to PBO. Because this office has only recently been 
created, it is currently hiring staff, developing a schedule for 
several iterations of development, establishing a means to store and 
trace requirements, developing testing plans, and establishing a 
configuration management process. 

According to program officials, development will occur in five 
releases, numbered 0 through 4. The first release, Release 0, is 
planned to contain functionality for the nonresponse follow-up and 
group quarters enumeration activities. Table 7 provides the current 
status of PBO Release 0 test activities. 

Table 7: Status of System Testing and Plans for PBO Release 0 (2010 
Testing): 

Phase: Complete initial system testing (Alpha testing); 
Dates: 10/08-3/09; 
Status: Started; 
Testing plan completed: No; 
Testing schedule completed: Partial. 

Phase: Additional testing (Beta testing); 
Dates: 3/09-4/09; 
Status: Not started; 
Testing plan completed: No; 
Testing schedule completed: Partial. 

Phase: Mock field testing activities for nonresponse follow-up and 
group quarters enumeration; 
Dates: 4/09-5/09; 
Status: Not started; 
Testing plan completed: No; 
Testing schedule completed: Partial. 

Source: GAO analysis of Bureau data. 

[End of table] 

However, the Bureau has not yet determined when detailed testing 
plans and schedules for PBO systems will be developed. Officials stated 
that a more detailed schedule for Release 0 and development schedules 
for the remaining releases are under development, and that they plan to 
have the majority of the schedules developed by the end of January 
2009. In commenting on a draft of this report, the Bureau provided a 
partial schedule for PBO test activities. 

Furthermore, officials stated they had not yet fully defined which 
requirements PBO would be accountable for and which of these 
requirements would be addressed in each iteration of development. The 
officials did state that the requirements will be based on those 
requirements transferred from FDCA as part of the reorganization. 
Bureau officials stated they had not yet completed these activities 
because responsibility for the requirements was only formally 
transferred as of October 2008. The program office expects to have its 
first iteration of requirements traceable to test cases by March 2009. 
However, officials did not know what percentage of program requirements 
will be included in this first iteration. 
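
Requirements-to-test-case traceability of the kind PBO expects to 
produce can be checked mechanically. The following minimal Python 
sketch is an assumption offered for illustration; the requirement and 
test case identifiers are hypothetical, not actual PBO artifacts. 

# Hypothetical traceability data: test case -> requirements it covers.
requirements = {"PBO-001", "PBO-002", "PBO-003"}
trace = {
    "TC-10": {"PBO-001"},
    "TC-11": {"PBO-001", "PBO-003"},
}

covered = set().union(*trace.values())
untraced = sorted(requirements - covered)
print(untraced)  # -> ['PBO-002'] (requirement with no test case yet)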

Although progress has been made in establishing the PBO program office, 
numerous critical system development activities need to be planned and 
executed in a limited amount of time. Because of the compressed 
schedule and the large amount of planning that remains, PBO risks not 
having its systems developed and tested in time for the 2010 Decennial 
Census. Testing is critical to ensuring that the paper forms used to 
enumerate residents of households that do not mail back their 
questionnaires, residents of group quarters, and residents of 
transitory locations are processed accurately. 

DADS II Is in Early Stages of Development, and Testing Plans and 
Schedules Are Being Developed: 

The DADS system (which had been used in the 2000 census) is currently 
being tested during the Dress Rehearsal, which is scheduled to be 
completed in March 2009. However, the Bureau intends to replace DADS 
with DADS II, which is currently being developed and tested for 2010 
operations. DADS II is still in the early part of its life cycle, and 
the program office has only recently started system testing activities. 
The two main DADS II components, the Replacement Tabulation System 
(RTS) and Replacement Dissemination System (RDS), are being developed 
and tested across a series of three iterations. As of December 2008, 
the program had begun iterations 1 and 2 for RTS, and iteration 1 for 
RDS. Table 8 summarizes the RTS and RDS testing status. 

Table 8: Status of System Testing and Plans for DADS II Components 
(2010 Testing): 

Phase: RTS: Iteration 1; 
Dates: 7/08-4/09; 
Status: In progress; 
Testing plan completed: Partial; 
Testing schedule completed: Partial[A]. 

Phase: RTS: Iteration 2; 
Dates: 9/08-10/09; 
Status: In progress; 
Testing plan completed: Partial; 
Testing schedule completed: Partial[A]. 

Phase: RTS: Iteration 3; 
Dates: 3/09-4/10; 
Status: Not started; 
Testing plan completed: Partial; 
Testing schedule completed: Partial[A]. 

Phase: RTS: Deployment; 
Dates: 2/10-7/10; 
Status: Not started; 
Testing plan completed: Partial; 
Testing schedule completed: Partial[A]. 

Phase: RDS: Iteration 1; 
Dates: 7/08-3/09; 
Status: In progress; 
Testing plan completed: Partial; 
Testing schedule completed: Partial[A]. 

Phase: RDS: Iteration 2; 
Dates: 1/09-3/10; 
Status: Not started; 
Testing plan completed: Partial; 
Testing schedule completed: Partial[A]. 

Phase: RDS: Iteration 3; 
Dates: 5/09-8/10; 
Status: Not started; 
Testing plan completed: Partial; 
Testing schedule completed: Partial[A]. 

Phase: RDS: Deployment; 
Dates: 7/10-2/11; 
Status: Not started; 
Testing plan completed: Partial; 
Testing schedule completed: Partial[A]. 

Source: GAO analysis of Bureau data. 

[A] High-level plans have been defined; detailed plans are not 
complete. 

[End of table] 

The DADS II program office has developed a high-level test plan for RTS 
and RDS system testing, but has not yet defined detailed testing plans 
and a schedule for testing system requirements. System requirements for 
the new system have been baselined, with 202 requirements for RTS and 
318 requirements for RDS. According to program officials, the program 
office is planning to develop detailed testing plans for the system 
requirements for both RTS and RDS. 

Bureau Has Conducted Limited Integration Testing, but Has Not Developed 
2010 Test Plans and Schedules for Integration Testing: 

Effective integration testing ensures that external interfaces work 
correctly and that the integrated systems meet specified requirements. 
This testing should be planned and scheduled in a disciplined fashion 
according to defined priorities. 
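
As a minimal illustration of interface testing (our sketch, not a 
Bureau artifact; the record fields are hypothetical), one common 
technique is to verify that the data a sending system produces 
conforms to the interface specification the receiving system expects: 

# Illustrative sketch (not Bureau code): check that a record crossing
# an interface carries exactly the agreed field names and types.
INTERFACE_SPEC = {"case_id": str, "status_code": int, "checked_in": bool}

def conforms(record: dict) -> bool:
    """Return True if the record matches the interface specification."""
    return (record.keys() == INTERFACE_SPEC.keys()
            and all(isinstance(record[f], t)
                    for f, t in INTERFACE_SPEC.items()))

# A sending system's output as it would appear at the boundary.
outgoing = {"case_id": "C-1001", "status_code": 3, "checked_in": True}
assert conforms(outgoing), "interface contract violated"
print("record conforms to the interface specification")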

For the 2010 census, each program office is responsible for and has 
made progress in defining system interfaces and conducting integration 
testing, which includes testing of these interfaces. However, 
significant work remains before comprehensive integration testing can 
be completed by the time the systems are needed for 2010 operations. 
For example, DRIS has conducted integration testing 
with some systems, such as FDCA, UC&M, and RPS, and is scheduled to 
complete integration testing by February 2010. The FDCA program office 
has also tested interfaces related to the address canvassing operation 
scheduled to begin in April 2009. However, for many other systems, such 
as PBO, interfaces have not been fully defined, and other interfaces 
have been defined but have not been tested. Table 9 provides the status 
of integration testing among key systems. 

Table 9: Integration Testing Status: 

System: UC&M and RPS; 
Testing status: In progress; 
Description: Conducted testing during the Dress Rehearsal, but until 
requirements are baselined for the 2010 census, it is unclear whether 
changes have occurred that would require retesting. According to 
program officials, all UC&M and RPS interfaces will be fully tested by 
an independent testing group within the Bureau. 

System: MAF/TIGER; 
Testing status: In progress; 
Description: Conducted testing with systems, such as FDCA, during the 
Dress Rehearsal and in preparation for the 2010 address canvassing 
operation. However, interfaces with other systems, such as RPS and 
UC&M, are still in development and have not been retested following the 
Dress Rehearsal. 

System: DRIS; 
Testing status: In progress; 
Description: Conducted testing with systems, such as FDCA, for the 
group quarters validation operation. However, interfaces with other 
systems, such as PBO, are still in development and have not been 
tested. According to program officials, interface testing may not be 
completed until February 2010 due to the limited availability of other 
systems for testing. 

System: FDCA; 
Testing status: In progress; 
Description: Conducted testing with systems, such as DRIS and 
MAF/TIGER, for the address canvassing and group quarters validation 
operations. However, interface testing with systems for other 
operations, such as map printing, has not been completed. 

System: PBO; 
Testing status: Not started; 
Description: Evaluating interfaces but has not yet fully defined or 
developed them. 

System: DADS II; 
Testing status: Not started; 
Description: Defining and developing its interfaces for the RDS and RTS 
systems. According to program documentation, interfaces for RTS are 
planned to be fully tested by July 2009, and interfaces for RDS are 
planned to be fully tested by July 2010. 

Source: GAO analysis of Bureau data. 

[End of table] 

In addition, the Bureau has not established a master list of interfaces 
between key systems, or plans and schedules for integration testing of 
these interfaces. A master list of system interfaces is an important 
tool for ensuring that all interfaces are tested appropriately and that 
the priorities for testing are set correctly. Although the Bureau had 
established a list of interfaces in 2007, according to Bureau 
officials, it was not updated because of resource limitations at the 
time and other management priorities. As of October 2008, the Bureau 
had begun efforts to update this list, but it has not provided a date 
when this list will be completed. 

Without a completed master list, the Bureau cannot develop 
comprehensive plans and schedules for conducting systems integration 
testing that indicate how the testing of these interfaces will be 
prioritized. This is important because a prioritized master list of 
system interfaces, combined with comprehensive plans and schedules to 
test the interfaces, would allow for tracking the progress of this 
testing. With the limited amount of time remaining before systems are 
needed for 2010 operations, the lack of comprehensive plans and 
schedules increases the risk that the Bureau may not be able to 
adequately test system interfaces, and that interfaced systems may not 
work together as intended. 
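
A master interface list need not be elaborate to support such 
prioritization. As a hypothetical sketch (ours, not a Bureau artifact; 
the systems, ratings, and dates are illustrative), each entry can 
carry a criticality rating and need date so that the testing sequence 
falls out of a simple sort: 

# Illustrative sketch (not Bureau code): a master interface list
# prioritized by criticality (1 = highest) and need date.
from datetime import date

# (sending system, receiving system, criticality, need date, tested?)
interfaces = [
    ("FDCA", "DRIS", 1, date(2009, 4, 1), True),
    ("PBO", "DRIS", 1, date(2010, 2, 1), False),
    ("MAF/TIGER", "RPS", 2, date(2009, 9, 1), False),
]

# Most critical, soonest-needed interfaces come first.
for src, dst, crit, need, tested in sorted(interfaces,
                                           key=lambda i: (i[2], i[3])):
    flag = "tested" if tested else "NOT tested"
    print(f"{src} -> {dst}: criticality {crit}, needed {need}, {flag}")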

Bureau Has Conducted Limited End-to-End Testing as Part of the Dress 
Rehearsal, but Has Not Developed Testing Plans for Critical Operations: 

The Dress Rehearsal was originally conceived to provide a comprehensive 
end-to-end test of key 2010 census operations; however, as mentioned 
earlier, because of the problems encountered with the handheld devices, 
among other things, testing was curtailed. As a result, although 
several critical operations underwent end-to-end testing in the Dress 
Rehearsal, others did not. According to the Associate Director for the 
2010 Census, the Bureau tested approximately 23 of 44 key operations 
during the Dress Rehearsal. Examples of key operations that underwent 
end-to-end testing during the Dress Rehearsal are address canvassing 
and group quarters validation. An example of a key operation that was 
not tested is the largest field operation--nonresponse follow-up. 

Although the Bureau recently conducted additional testing of the 
handhelds, this test was not a robust end-to-end test. In December 
2008, after additional development and improvements to the handheld 
computers, the Bureau conducted a limited field test for address 
canvassing, intended to assess software functionality in an operational 
environment. We observed this test and determined that users were 
generally satisfied with the performance of the handhelds. According to 
Bureau officials, the performance of the handheld computers has 
substantially improved from previous tests. However, the test was not 
designed to test all the functionality of the handhelds in a robust end-
to-end test--rather, it included only a limited subset of functionality 
to be used during the 2009 address canvassing operations. Further, the 
field test did not validate that the FDCA system fully met specified 
requirements for the address canvassing operation. Bureau officials 
stated that additional testing of the FDCA system, such as performance 
testing, mitigated the limitations of this field test. 

Nonetheless, the limited robustness of the field test poses risks for 
2010 operations. Specifically, without testing all the FDCA 
system's requirements in a robust operational environment, it is 
unclear whether the system can perform as intended when the address 
canvassing operation begins in April 2009. 

Furthermore, as of December 2008, the Bureau had established neither 
testing plans nor schedules for end-to-end testing of the key 
operations that were removed from the Dress Rehearsal, nor had it 
determined when these plans would be completed. As previously mentioned, 
these operations include: 

* update/leave, 

* nonresponse follow-up, 

* enumeration of transitory locations, 

* group quarters enumeration, and 

* field verification. 

Although the Bureau has established a high-level strategy for testing 
these operations, which provides details about the operations to be 
tested, Bureau officials stated that they have not developed testing 
plans and schedules because they are giving priority to tests for 
operations that are needed in early 2009. In addition, key systems 
needed to test these operations are not ready to be tested because they 
are either still in development or have not completed system testing. 
Until system and integration testing activities are complete, the 
Bureau cannot effectively plan and schedule end-to-end testing 
activities. Without sufficient end-to-end testing, operational problems 
can go undiscovered, and the opportunity to improve these operations 
will be lost. 
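
This sequencing dependency can be made explicit in a schedule. As a 
hypothetical sketch (ours, not a Bureau artifact; the test names are 
illustrative), an end-to-end test becomes schedulable only once every 
system and interface test it depends on is complete: 

# Illustrative sketch (not Bureau code): gate end-to-end tests on the
# completion of their prerequisite system and interface tests.
completed = {"FDCA system test", "DRIS system test"}

prerequisites = {
    "nonresponse follow-up end-to-end test": {
        "FDCA system test", "DRIS system test",
        "PBO system test", "PBO-DRIS interface test",
    },
}

for e2e_test, prereqs in prerequisites.items():
    missing = sorted(prereqs - completed)
    if missing:
        print(f"{e2e_test}: blocked, waiting on {missing}")
    else:
        print(f"{e2e_test}: ready to schedule")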

The decreasing time available for completing end-to-end testing 
increases the risk that testing of key operations will not take place 
before the required deadline. Bureau officials have acknowledged this 
risk in briefings to the Office of Management and Budget. The Bureau is 
in the process of identifying risks associated with incomplete testing 
and developing mitigation plans, which it had planned to have completed 
by November 2008. However, as of January 2009, the Bureau had not 
completed these mitigation plans. According to the Bureau, the plans 
are still being reviewed by senior management. Without plans to 
mitigate the risks associated with limited end-to-end testing, the 
Bureau may not be able to respond effectively if systems do not perform 
as intended. 

Bureau Lacks Sufficient Executive-Level Oversight and Guidance for 
Testing: 

As stated in our testing guide and IEEE standards, oversight of testing 
activities includes both planning and ongoing monitoring of testing 
activities. Ongoing monitoring entails collecting and assessing status 
and progress reports to determine, for example, whether specific test 
activities are on schedule. Using this information, management can 
effectively determine whether corrective action is needed and, if so, 
what action should be taken. In addition, comprehensive guidance should 
describe each level of testing (for example, system, integration, or 
end-to-end), criteria for each level, and the type of test products 
expected. The guidance should also address test preparation and 
oversight activities. 
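
As a minimal sketch of what such monitoring involves (our example, not 
a Bureau product; the activities, dates, and figures are 
hypothetical), status reports can be rolled up to flag test activities 
that have slipped past their planned completion dates: 

# Illustrative sketch (not Bureau code): roll up status reports and
# flag test activities that are past due, so management can decide
# whether corrective action is needed.
from datetime import date

TODAY = date(2009, 1, 15)

# (activity, planned completion date, percent complete)
status_reports = [
    ("RTS iteration 1 system test", date(2009, 4, 30), 60),
    ("DRIS-FDCA interface retest", date(2009, 1, 10), 80),
    ("Map printing interface test", date(2009, 3, 1), 0),
]

for activity, due, pct in status_reports:
    if pct < 100 and due < TODAY:
        print(f"CORRECTIVE ACTION: {activity} past due ({pct}% complete)")
    else:
        print(f"on track: {activity} ({pct}% complete, due {due})")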

Although the 2010 Decennial Census is managed by the Decennial 
Management Division, the oversight and management of key census IT 
systems are performed on a decentralized basis. DRIS, FDCA, and DADS II 
each have a separate program office within the Decennial Automation 
Contracts Management Office; Headquarters Processing and PBO are 
managed within the Decennial Systems and Processing Office; and MAF/ 
TIGER is managed within the Geography Division. Each program conducts 
its own program management reviews, develops its own plans, and tracks 
its own metrics related to testing. These offices and divisions 
collectively report to 
the Associate Director for the 2010 Census. According to the Bureau, 
the associate director chairs biweekly meetings where the officials 
responsible for these systems meet to review the status of key systems 
development and testing efforts. 

In addition, in response to prior recommendations, the Bureau took 
initial steps to enhance its programwide oversight; however, these 
steps have not been sufficient to establish adequate executive-level 
oversight. In June 2008, the Bureau established an inventory of all 
testing activities specific to all key decennial operations. This 
inventory showed that, as of May 2008, 18 percent of about 1,049 system 
testing activities had not been planned. (See figure 4.) In addition, 
approximately 67 percent of about 836 operational testing activities 
had not been planned. 

Figure 4: Inventory of Testing Activities as of May 2008: 

[Refer to PDF for image: pie-charts] 

System testing: 
Testing complete: 1%; 
Testing planned or partially complete: 73%; 
No testing planned: 18%; 
Testing status unknown: 7%. 

Operational testing: 
Testing complete: 6%; 
Testing planned or partially complete: 25%; 
No testing planned: 67%; 
Testing status unknown: 3%. 

Source: GAO analysis of Bureau data. 

[End of figure] 

Although officials from the Decennial Systems and Processing Office 
described the inventory effort as a means of improving executive-level 
oversight, the inventory has not been updated since May 2008, and 
officials have no plans for further updates. Instead, officials stated 
that they plan to track testing progress as part of the Bureau's 
detailed master schedule of census activities. However, this schedule 
does not provide comprehensive status information on testing. 

In another effort to improve executive-level oversight, the Decennial 
Management Division began producing (as of July 2008) a weekly 
executive alert report and has established (as of October 2008) monthly 
dashboard and reporting indicators. However, these products do not 
provide comprehensive status information on the testing progress of key 
systems and interfaces. For example, the executive alert report does 
not include the progress of testing activities, and although the 
dashboard provides a high-level, qualitative assessment of testing for 
key operations and selected systems, it does not provide information on 
the testing progress of all key systems and interfaces. 

Further, the assessment of testing progress has not been based on 
quantitative and specific metrics. For example, the status of testing 
key operations removed from the Dress Rehearsal was marked as 
acceptable, or "green," although the Bureau does not yet have plans for 
testing these activities. Bureau officials stated that they marked 
these activities as acceptable because, based on past experience, they 
felt comfortable that a plan would be developed in time to adequately 
test these operations. The lack of quantitative and specific metrics to 
track progress limits the Bureau's ability to accurately assess the 
status and progress of testing activities. In commenting on a draft of 
this report, the Bureau provided selected examples in which it had 
begun to use more detailed metrics to track the progress of end-to-end 
testing activities. 
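
The distinction matters in practice: a qualitative rating can mask the 
absence of a plan, whereas a rating derived from quantitative metrics 
cannot. A hypothetical sketch (ours, not Bureau code; the threshold 
and counts are illustrative): 

# Illustrative sketch (not Bureau code): derive a status rating from
# counts of planned, executed, and passed test activities rather than
# from judgment alone.
def rate(planned: int, executed: int, passed: int, total: int) -> str:
    if planned < total:
        return "red"      # incomplete planning can never rate "green"
    if executed == 0:
        return "yellow"   # planned, but no execution evidence yet
    return "green" if passed / executed >= 0.95 else "yellow"

# An operation with no test plans rates red, not green:
print(rate(planned=0, executed=0, passed=0, total=40))     # red
print(rate(planned=40, executed=30, passed=29, total=40))  # green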

Finally, although the Bureau announced in August 2008 that it was 
planning to hire a senior manager who would have primary responsibility 
for monitoring testing across all decennial systems and programs, the 
position had not been filled as of January 2009. Instead, agency 
officials stated that the role is being filled by another manager from 
the Decennial Statistical Studies Division, who has numerous other 
responsibilities. 

The Bureau also has weaknesses in its testing guidance; it has not 
established comprehensive guidance for system testing. According to the 
Associate Director for the 2010 Census, the Bureau did establish a 
policy strongly encouraging offices responsible for decennial systems 
to use best practices in software development and testing, as specified 
in level 2 of Carnegie Mellon's Capability Maturity Model® Integration. 
[Footnote 16] However, beyond this general guidance, there is no 
additional guidance on key testing activities such as criteria for each 
level or the type of test products expected. Standardized policies and 
procedures help to ensure comprehensive processes across an 
organization and allow for effective executive-level oversight. The 
lack of guidance has led to an ad hoc and, at times, less than 
desirable approach to testing. 

Conclusions: 

While the Bureau's program offices have made progress in testing key 
decennial systems, much work remains to ensure that systems operate as 
intended for conducting an accurate and timely 2010 census. Several 
program offices have yet to prepare and execute system test plans and 
schedules and ensure that system requirements are fully tested. In 
addition, the Bureau has not developed a master list of interfaces, 
which is necessary to prioritize testing and to develop comprehensive 
integration test plans and schedules. Additionally, end-to-end testing 
plans for key operations have not been finalized or executed based on 
established priorities to help ensure that systems will support census 
operations. 

Weaknesses in the Bureau's IT testing can be attributed, in part, to a 
lack of sufficient executive-level oversight and guidance. More 
detailed metrics and status reports would help the Bureau to better 
monitor testing progress and identify and address problems. Giving 
accountability for testing to a senior-level official would also 
provide the focus and attention needed to complete critical testing. 
Also, completing risk mitigation plans will help ensure that actions 
are in place to address potential problems with systems. Given the 
rapidly approaching deadlines of the 2010 census, completing these 
important tests and establishing stronger executive-level oversight and 
guidance are critical to ensuring that systems perform as intended when 
they are needed. 

Recommendations for Executive Action: 

To ensure that testing activities for key systems for the 2010 census 
are completed, we are making 10 recommendations. We recommend that the 
Secretary of Commerce require the Director of the Census Bureau to 
expeditiously implement the following recommendations: 

* For the Headquarters UC&M and RPS, finalize requirements for 2010 
census operations and complete testing plans and schedules for 2010 
operations that trace to baselined system requirements. 

* For MAF/TIGER, establish the number of products required, define 
related requirements, and establish a testing plan and schedule for 
2010 operations. 

* For FDCA, establish testing plans for the continuity of operations 
and map printing systems that trace to baselined system requirements. 

* For PBO, develop baseline requirements and complete testing plans and 
schedules for 2010 operations. 

* Establish a master list of system interfaces; prioritize the list, 
based on system criticality and need date; define all interfaces; and 
develop integration testing plans and schedules for tracking the 
progress of testing these interfaces. 

* Establish a date for completing testing plans for the operations 
removed from the Dress Rehearsal, and prioritize testing activities 
for these operations. 

* Finalize risk mitigation plans detailing actions to address system 
problems that are identified during testing. 

* Establish specific testing metrics and detailed status reports to 
monitor testing progress and better determine whether corrective action 
is needed for all key testing activities. 

* Designate a senior manager with primary responsibility for monitoring 
testing and overseeing testing across the Bureau. 

* In addition, after the 2010 census, we recommend that the Bureau 
establish comprehensive system and integration testing guidance for 
future testing efforts. 

Agency Comments and Our Evaluation: 

The Associate Under Secretary for Management of the Department of 
Commerce provided written comments on a draft of this report. The 
department's letter and general comments are reprinted in appendix II. 

In the comments, the department and Bureau stated they had no 
significant disagreements with our recommendations. However, the 
department and Bureau added that since the FDCA replan last year, their 
testing strategy has been to focus on those things they have not done 
before, and to demonstrate to their own satisfaction that new software 
and systems will work in production. The department added that it has 
successfully conducted Census operations before, and was focusing "on 
testing the new things for 2010--not things that have worked before." 

While we acknowledge that the Bureau has conducted key census 
operations before, the systems and infrastructure in place to conduct 
these operations have changed substantially since the 2000 census. For 
example, while the Bureau has conducted paper-based nonresponse 
follow-up during previous censuses, it will be using newly developed 
systems, which have not yet been fully tested in a census-like 
environment, to integrate responses and manage the nonresponse 
follow-up workload. In addition, new procedures, such as one to remove 
from the nonresponse follow-up operation questionnaires that were 
mailed in late, have not been tested with these systems. Any significant 
change to an existing IT system introduces the risk that the system may 
not work as intended; therefore, testing all systems after changes have 
been made to ensure the systems work as intended is critical to the 
success of the 2010 census. 

In addition, the department and Bureau provided technical comments, 
such as noting draft plans that had been developed after the conclusion 
of our work, which we have incorporated where appropriate. 

We are sending copies of this report to the Secretary of Commerce, the 
Director of the U.S. Census Bureau, and other appropriate congressional 
committees. The report also is available at no charge on the GAO Web 
site at [hyperlink, http://www.gao.gov]. If you have any questions 
about this report, please contact David Powner at (202) 512-9286 or 
pownerd@gao.gov. GAO staff who made contributions to this report are 
listed in appendix III. Contact points for our Offices of Congressional 
Relations and Public Affairs may be found on the last page of this 
report. 

Signed by: 

David A. Powner: 
Director, Information Technology Management Issues: 

List of Congressional Requesters: 

The Honorable Tom Carper:
Chairman:
Subcommittee on Federal Financial Management, Government Information, 
Federal Services, and International Security: 
Committee on Homeland Security and Governmental Affairs:
United States Senate: 

The Honorable Edolphus Towns:
Chairman:
The Honorable Darrell Issa:
Ranking Member:
Committee on Oversight and Government Reform:
House of Representatives: 

The Honorable Wm. Lacy Clay:
Chairman:
The Honorable Patrick T. McHenry:
Ranking Member:
Subcommittee on Information Policy, Census, and National Archives:
Committee on Oversight and Government Reform:
House of Representatives: 

The Honorable Michael R. Turner: 
House of Representatives: 

[End of section] 

Appendix I: Scope and Methodology: 

To determine the status of and plans for testing key decennial systems, 
we analyzed documentation related to system, integration, and end-to- 
end testing.[Footnote 17] For system testing, we analyzed documentation 
related to each key decennial system, including system test plans, 
schedules, requirements, results, and other test-related documents. We 
then compared the Bureau's practices with those identified in our 
testing guide and Institute of Electrical and Electronics Engineers 
(IEEE) standards[Footnote 18] to determine the extent to which the 
Bureau had incorporated best practices in testing. We also interviewed 
program officials and contractors of key decennial systems to obtain 
information on the current status of and plans for testing activities. 

For integration testing, we analyzed interface control documents, 
interface testing plans, and schedules. We also analyzed documentation 
of the Census Bureau's (Bureau) oversight of integration testing 
activities, including efforts to issue integration testing guidance and 
monitor the progress of integration testing activities. We interviewed 
program officials at each key decennial system program office to obtain 
information on the current status of and plans for integration testing 
and interviewed program officials at the Decennial Systems and 
Processing 
Office to obtain information on the executive-level oversight of 
integration testing activities. We compared the Bureau's practices with 
those identified in our testing guide and IEEE guidance. 

For end-to-end testing, we analyzed documentation related to the 
testing of key census operations during the Bureau's Dress Rehearsal, 
additional testing conducted for the address canvassing operation, and 
efforts to establish testing plans and schedules for operations removed 
from the Dress Rehearsal. We also observed the Bureau's operational 
field test, held in December 2008 in Fayetteville, North Carolina. We 
interviewed program officials at the Decennial Systems and Processing 
Office to obtain information on the current status and plans for end- 
to-end testing activities. We compared the Bureau's practices with 
those identified in our testing guide and IEEE guidance. 

We also analyzed documentation of the Bureau's overall oversight of 
testing, including its executive alert reports and monthly dashboard 
reports. In addition, we assessed system testing guidance and 
interviewed the Associate Director for the Decennial Census to obtain 
information on the overall oversight of testing activities. 

We conducted this performance audit from June 2008 to February 2009 in 
the Washington, D.C., and Fayetteville, North Carolina, areas, in 
accordance with generally accepted government auditing standards. Those 
standards require that we plan and perform the audit to obtain 
sufficient, appropriate evidence to provide a reasonable basis for our 
findings and conclusions based on our audit objective. We believe that 
the evidence obtained provides a reasonable basis for our findings and 
conclusions based on our audit objective. 

[End of section] 

Appendix II: Comments from the Department of Commerce: 

United States Department Of Commerce: 
Economics and Statistics Administration: 
Washington D.C. 20230: 

February 25, 2009: 

Mr. David A. Powner: 
Director: 
IT Management Issues: 
United States Government Accountability Office: 
Washington, DC 20548: 

Dear Mr. Powner: 

The U.S. Department of Commerce appreciates the opportunity to comment 
on the United States Government Accountability Office's draft report 
entitled Information Technology: Census Bureau Testing of 2010 
Decennial Systems Can Be Strengthened (GAO-09-262). I enclose the 
Department's comments on this report. 

Sincerely, 

Signed by: 

James K. White: 
Associate Under Secretary for Management: 

Enclosure: 

Census Bureau Comments on Draft Government Accountability Office (GAO) 
Report: "Census Bureau Testing of 2010 Decennial Systems Can Be 
Strengthened" (GAO-09-262): 

The Census Bureau appreciates this opportunity to review this draft GAO 
report and to provide its comments. 

Overall, we have no significant disagreements with the specific testing 
recommendations at the end of this report, but we do have the following 
comments concerning the findings. 

General Comment: 

Generally, and in particular on page 14, this report describes various 
criteria that GAO believes should constitute effective testing, 
including end-to-end testing of all systems, operations, and 
interfaces. Our testing strategy is--and has been since the re-plan 
last year--to focus on those things we have not done before, and to 
demonstrate to our own satisfaction that the new software and systems 
will work in production. We will have only one opportunity to use these 
new things, and they must work the first and only time they will be 
deployed. 

As part of our strategy, one aspect of the re-plan decision was for 
Census Bureau staff to take responsibility for several major systems we 
originally had included in the Field Data Collection Automation (FDCA) 
contractor's scope of work--e.g., the Operational Control System for 
field operations, including integrating our payroll and personnel 
system for hundreds of thousands of temporary workers. We made this 
decision to reduce the risk of system or operational failure, because 
the Census Bureau has successfully done these things before, and we 
believe we can do so again. While we will test these systems with our 
internal stakeholders, we are putting much more focus on testing the 
new things for 2010, not on testing things that have worked before. 
While risks remain, we are managing those risks through a process that 
includes risk assessment, development of mitigation strategies, and (as 
needed) development of contingency plans. 

[End of section] 

Appendix III: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

David A. Powner, (202) 512-9286 or pownerd@gao.gov: 

Staff Acknowledgments: 

In addition to the contact name above, individuals making contributions 
to this report included Cynthia Scott (Assistant Director), Sherrie 
Bacon, Barbara Collier, Neil Doherty, Vijay D'Souza, Nancy Glover, Lee 
McCracken, Jonathan Ticehurst, Melissa Schermerhorn, and Karl Seifert. 

[End of section] 

Footnotes: 

[1] 13 U.S.C. 141(a) and (b). 

[2] GAO, Information Technology: Significant Problems of Critical 
Automation Program Contribute to Risks Facing 2010 Census, [hyperlink, 
http://www.gao.gov/products/GAO-08-550T] (Washington, D.C.: Mar. 5, 
2008). 

[3] GAO, High-Risk Series: An Update, [hyperlink, 
http://www.gao.gov/products/GAO-09-271] (Washington, D.C.: Jan. 22, 
2009). 

[4] System testing verifies that a system meets specified requirements. 
Integration testing verifies that systems, when combined, work as 
intended. End-to-end testing verifies that a set of systems work as 
intended in an operational environment. 

[5] See, for example, GAO, Year 2000 Computing Crisis: A Testing Guide, 
[hyperlink, http://www.gao.gov/products/GAO/AIMD-10.1.21] (Washington, 
D.C.: Nov. 1, 1998); and IEEE Std 12207-2008, Systems and Software 
Engineering-Software Life Cycle Processes (Piscataway, N.J.: 2008). 

[6] For example, in the "update/leave" operation, after enumerators 
update addresses, they leave questionnaires at housing units; this 
occurs mainly in rural areas that lack street names, house numbers, or 
both. 

[7] GAO, Information Technology: Census Bureau Needs to Improve Its 
Risk Management of Decennial Systems, [hyperlink, 
http://www.gao.gov/products/GAO-08-79] (Washington, D.C.: Oct. 5, 
2007). 

[8] [hyperlink, http://www.gao.gov/products/GAO-08-550T]; GAO, Census 
2010: Census at Critical Juncture for Implementing Risk Reduction 
Strategies, [hyperlink, http://www.gao.gov/products/GAO-08-659T] 
(Washington, D.C.: Apr. 9, 2008). 

[9] [hyperlink, http://www.gao.gov/products/GAO-08-550T]; GAO, 2010 
Census: Census at Critical Juncture for Implementing Risk Reduction 
Strategies, [hyperlink, http://www.gao.gov/products/GAO-08-685T] 
(Washington, D.C.: Apr. 15, 2008) and GAO, 2010 Census: Plans for 
Decennial Census Operations and Technology Have Progressed, But Much 
Uncertainty Remains, [hyperlink, 
http://www.gao.gov/products/GAO-08-886T] (Washington, D.C.: June 11, 
2008). 

[10] GAO, 2010 Census: Census Bureau's Decision to Continue with 
Handheld Computers for Address Canvassing Makes Planning and Testing 
Critical, [hyperlink, http://www.gao.gov/products/GAO-08-936] 
(Washington, D.C.: July 31, 2008). 

[11] [hyperlink, http://www.gao.gov/products/GAO/AIMD-10.1.21] and IEEE 
Std 12207-2008. 

[12] Individual program offices manage individual system testing for 
the Dress Rehearsal, and integration testing is managed by the pairs of 
program offices whose interfaces are being tested. 

[13] For more information on performance of the handheld computers, see 
[hyperlink, http://www.gao.gov/products/GAO-08-936]. 

[14] DRIS functionality includes the following: (1) the paper segment 
processes census paper forms; (2) the workflow control and management 
segment provides the databases, workflow, and interfaces to capture 
response data, store these data, and transfer data between segments and 
external entities; and (3) the telephony segment provides 
infrastructure and application for performing coverage follow-up 
operations, telephone questionnaire assistance, and interactive voice 
response operations. 

[15] These include update/leave, nonresponse follow-up, enumeration of 
transitory locations, group quarters enumeration, and field 
verification, as mentioned earlier. 

[16] Capability Maturity Model® Integration is intended to provide 
guidance for improving an organization's processes, and gives the 
ability to manage the development, acquisition, and maintenance of 
products and services. The model uses capability levels to assess 
process maturity. 

[17] System testing verifies that a system meets specified 
requirements. Integration testing verifies that systems, when combined, 
work as intended. End-to-end testing verifies that a set of systems 
work as intended in an operational environment. 

[18] [hyperlink, http://www.gao.gov/products/GAO/AIMD-10.1.21] and IEEE 
Std 12207-2008. 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO’s actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO’s Web site, 
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: