This is the accessible text file for GAO report number GAO-02-956R 
entitled 'The American Community Survey: Accuracy and Timeliness 
Issues' which was released on September 30, 2002. 

This text file was formatted by the U.S. General Accounting Office 
(GAO) to be accessible to users with visual impairments, as part of a 
longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

United States General Accounting Office: 
Washington, DC 20548: 

September 30, 2002: 

The Honorable Dave Weldon, M.D. 
Chairman: 
Subcommittee on Civil Service, Census, and Agency Organization: 
Committee on Government Reform: 
House of Representatives: 

The Honorable Dan Miller: 
Vice-Chairman: 
Subcommittee on Civil Service, Census, and Agency Organization: 
Committee on Government Reform: 
House of Representatives: 

Subject: The American Community Survey: Accuracy and Timeliness Issues: 

In response to your March 11, 2002, request, we have reviewed several 
major issues associated with the proposed full implementation of the 
American Community Survey (ACS) by the Bureau of the Census for 2003. 
If the ACS is approved, this mandatory mail survey would cost from $120 
to $150 million a year, and would require responses from a sample of 
about 3 million households (250,000 each month) to some 60 to 70 
questions. The ACS would provide annual data for areas with a 
population of 65,000 or more and multiyear averages for smaller 
geographic areas. In addition, the ACS would replace the decennial 
census long form for 2010 and subsequent decennial censuses. 

Based on your request and subsequent discussions with your staffs, we 
agreed to report on the following questions: 

* How would the quality of the annual ACS data and multiyear averages, 
which would be available beginning with annual data for 2003, compare 
with that of the 2010 Decennial Census long-form data, and would these 
ACS data adequately replace long-form data in meeting the needs of 
federal agencies? 

* Are the questions to be asked in the ACS beginning with 2003 
justified by statutory requirements of federal agencies, and is the 
planned use of ACS data to select samples for additional surveys 
consistent with the confidentiality provisions of Title 13 of the 
United States Code? 

* Are ACS questions duplicative of or similar to those in other federal 
surveys, and can the burden on the respondents be reduced? 

* If the ACS was conducted as a voluntary survey, how would the costs 
be affected? 

* How did the Bureau encourage participation in the ACS test program 
through (1) training for follow-up interviewers of nonrespondents and 
(2) outreach and promotion efforts? 

We conducted our audit work at Bureau headquarters in Suitland, 
Maryland, and Washington, D.C., from March through August 2002, in 
accordance with generally accepted government auditing standards. 

Results in Brief: 

On the basis of sampling errors and related measures of reliability, 
the Census Bureau has decided that ACS data will be published annually 
for geographic areas with a population of over 65,000; as 3-year 
averages for geographic areas with a population of 20,000 to 65,000; 
and as 5-year averages for geographic areas with a population of less 
than 20,000. According to the Bureau, the annual ACS data and 3-year 
averages would be significantly less accurate than data for 2010 from 
the decennial census long form; 5-year averages, which would be 
available at the detailed long-form level of geographic detail, would 
be about as accurate as the long-form data. If the Bureau’s 2003 budget 
is approved, annual ACS data for 2003 would be available beginning in 
2004; the first 5-year average data, for 2003-07, would be available 
beginning in 2008. ACS data would be significantly more timely than the
once-every-10-year data from the long form. Accuracy and timeliness are 
both important components of survey quality. Because there is no one 
formula to determine the relative importance of the components, it is 
not possible to determine an overall measure of survey quality to 
compare the ACS and long-form data. 

Federal agencies that extensively use the 2000 Decennial Census long-
form data for program implementation would use ACS data in the future 
if the long form was eliminated. To make the transition from the 2000 
Decennial Census long-form data to ACS data, which would begin with the 
release of the annual ACS data for 2003, these agencies would need key 
information from the Bureau’s evaluation of differences between the 
data collected from the 2000 long form and that collected in the ACS 
tests. However, this evaluation will not provide the agencies with the 
following key information: data from the 2000-02 ACS special 
supplements and the 2003 ACS with the same treatment of group quarters 
and seasonal residences as the 2000 Census; techniques to improve 
consistency between the data items from the 2000 long form and the 2003 
and subsequent ACS estimates; measures of stability of annual ACS data
and ACS multiyear averages; a framework for reconciling annual and 
multiyear data for the same geographic level of detail; and procedures 
for revising previously published ACS data to incorporate decennial 
census population counts. 

The questions to be asked in the 2003 ACS reflect justifications—specific 
statutes, regulations, and court cases—provided to the Bureau by federal 
agencies. To identify these justifications, the Bureau worked with the 
agencies using a process similar to that used to prepare the justifications 
for the questions on the 2000 Decennial Census long form. To support the 
request for approval of the ACS by the Office of Management and Budget 
(OMB), the Bureau submitted a list of justifications it selected from 
those provided by the agencies. These justifications were selected from 
among those classified by the agencies as either mandatory—decennial 
census data specified by statute—or required—decennial census data used 
historically to support a statute or for court-imposed requirements. 
Because agencies have not yet formally approved the complete list 
provided to the Bureau, we limited our review of the justifications and 
their classifications to the list selected by the Bureau. The 
justifications classified as mandatory met the Census Bureau’s 
criteria. However, justifications classified as required could not be 
verified because the agencies were not asked to provide sufficient 
information about either their historical use of decennial census data 
or planned use of the ACS. 

The Bureau’s plan to use responses to ACS questions to develop samples 
for additional surveys is not prohibited by the disclosure provisions 
in 13 U.S.C. § 9, as long as the Bureau conducts the surveys. 
Information from the Census 2000 Supplementary Survey has already been 
used to develop the sample for the National Epidemiological Survey on 
Alcohol and Related Conditions, sponsored by the National Institutes of 
Health. The OMB, in approving the ACS questionnaire, instructed the 
Bureau not to use the ACS universe for additional surveys without
agreement by OMB. 

Some ACS questions duplicate or are similar to questions on two 
existing federal surveys. In the request for OMB approval for the 2003 
ACS questionnaire, the Bureau said that there was some duplication, but 
that there was no other single federal survey that collected all the 
ACS data. OMB concurred with this position, and it appears to be a 
valid interpretation. However, there is no indication that the agencies 
sponsoring the existing surveys with questions that duplicate or are 
similar to ACS questions have considered eliminating questions on their 
surveys. Identical questions could be eliminated from the existing 
surveys because the ACS data would be more accurate, available at 
greater geographic detail, and more timely. Similar questions could be 
eliminated if the greater ACS accuracy, detail, and timeliness offset 
the advantage of asking additional and more relevant questions on these 
surveys. 

The Bureau determined, and GAO has agreed in a recently issued legal 
opinion, that it has the statutory authority to conduct the ACS as a 
mandatory survey, like the decennial census long form the ACS would 
replace. Based on this authority and on federal agency studies showing that a 
mandatory mail survey would most likely result in a higher response 
rate than a voluntary one, the Bureau plans to conduct the ACS as a
mandatory survey. If the ACS was conducted as a voluntary survey, the 
Bureau would need to make up for the lower mail response with more 
interviews to maintain the proposed level of accuracy of the ACS. 
Because obtaining responses by interview is more costly than obtaining 
responses by mail, conducting the ACS as a voluntary survey would be 
more expensive. The Bureau has prepared a rough estimate of the
added cost under the assumption that the mail response rate to a 
voluntary ACS would be 6 percentage points less than the rate for a 
mandatory ACS. Using this assumption, the Bureau estimates that a 
voluntary ACS would cost as much as $20 to $35 million a year more. 
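
The arithmetic behind the Bureau's rough estimate can be sketched as 
follows. A lower mail response converts mail cases into costlier 
interview cases; the per-case figure derived below is implied by the 
report's numbers and is illustrative only, not a published Bureau 
estimate.

```python
# Back-of-envelope sketch of the voluntary-ACS added-cost estimate.
# The implied per-case cost is illustrative arithmetic, not a Bureau figure.
SAMPLE = 3_000_000   # annual ACS sample, housing units
MAIL_DROP = 0.06     # assumed 6-percentage-point drop in mail response

extra_cases = SAMPLE * MAIL_DROP  # households shifted from mail to interview

added_low, added_high = 20e6, 35e6  # Bureau's rough added-cost range, dollars
print(f"extra follow-up cases: {extra_cases:,.0f}")
print(f"implied marginal cost per case: "
      f"${added_low / extra_cases:.0f} to ${added_high / extra_cases:.0f}")
```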

The Bureau used a number of strategies to encourage participation in 
the ACS test program, which started in 1996. Two of the key strategies 
were (1) the training of interviewers, whose job it was to collect data 
from households that did not return the mail questionnaires, and (2) 
outreach and promotion efforts. According to the Bureau, the tests have 
consistently achieved high overall response rates and Bureau officials 
have been pleased with the results. Telephone and in-person interviewers
were provided scripted replies, designed to overcome the objections of
nonrespondents, that highlighted themes such as the importance of ACS 
data to the community and the legal requirement to participate in the 
ACS. For the 1996 test, the refusal rates for telephone interviews were 
about 14 percent and for in-person interviews about 4 percent. 
Moreover, for the tests conducted from 1996 to 2002, the Bureau 
reported that it had received about 250 letters expressing concern 
about the ACS. In a review of 82 of these letters, just 4 complained 
about the conduct of an interviewer; in the other letters, the major 
concern appeared to be privacy. For the outreach and promotion 
strategy, when the ACS test began in 1996, the Bureau relied on press 
releases and free media coverage for publicity. Since 1997, outreach and
promotion efforts have increased to include local workshops and town 
hall meetings, as well as contacts with representatives of print and 
broadcast media, professional journals, and umbrella organizations. 

We are recommending that the Bureau provide federal agencies with key 
additional information to better ensure the success of the transition 
from the use of the 2000 Decennial Census long-form data to the use of 
ACS data. We are also recommending that the Bureau and users of data 
from existing surveys determine whether duplicative or similar 
questions on these surveys can be eliminated because the same or 
similar data from the ACS will be more accurate and timely. 

Background: 

A decennial census usually consists of two major mandatory mail 
surveys. To provide the basic population counts, which are required for 
congressional apportionment and redistricting, a short form is mailed 
to all housing units. [Footnote 1] A long form is mailed to a sample of housing 
units to provide detailed information for many federal programs, 
including such topics as population and housing characteristics, 
incomes, education, transportation, and disabilities at the Census 
tract level. [Footnote 2] 

The President’s budget for fiscal year 2003 included a request for 
about $120 million to fully fund the ACS, beginning with 2003, and to 
eliminate the long form. According to the Bureau, the ACS, which would 
be an annual survey of a sample of 3 million housing units, was 
developed primarily to (1) provide long-form data items, at detailed 
geographic levels, that would be more timely than the long form and more
accurate than annual data from existing surveys such as the Current 
Population and American Housing Surveys and (2) improve the accuracy of 
the decennial census population counts. [Footnote 3] Bureau officials 
noted that the size of the ACS sample was determined in part by the 
Bureau’s projected funding level for a conventional decennial census in 
2010. 

If approved, beginning with the 2010 Census, the ACS would replace the 
long form, which, as GAO reported in 1998, “...is a cost-effective 
method of providing baseline and trend data for use by federal agencies 
and various other census stakeholders, compared to the alternative of 
multiple data collections by other federal agencies for their own 
purposes.” [Footnote 4] Thus, because the ACS would replace the 
decennial census long form, it would be important for the ACS to 
continue to serve federal agencies in the same role as the long form. 

Because of its sample size, the proposed ACS would eliminate the 
availability of complete long-form detail—data items and geographic 
levels—for any single year. The Bureau has determined that based on the 
size of the ACS sample, it would be able to publish reliable annual ACS 
data only for states and for cities, counties, and metropolitan areas 
with a population of more than 65,000. [Footnote 5] Compared with the 
size of the sample used for the 2000 long form, which most likely would 
also be the size used for 2010, the standard error for annual ACS data 
would be about three times larger. [Footnote 6] For smaller areas, the 
Bureau determined that it would publish data only using 3-year or 5-
year averages, depending on the population size. The 3-year averages 
would be published for areas with a population of between 20,000 and 
65,000; 5-year averages would be published for all geographic levels 
down to the tract level. For these 5-year averages, the data would have 
standard errors about 1.33 times as large as comparable long-form 
standard errors, but the Bureau expects that this error will be offset 
by lower item nonresponse because of the use of experienced 
interviewers for follow-up. [Footnote 7] 
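
As a rough check on these factors, sampling error under simple random 
sampling scales with the inverse square root of the sample size. The 
sketch below applies that rule to the sample sizes cited in this report 
(about 20 million long-form housing units versus 3 million ACS units a 
year); the Bureau's factors of about 3 and 1.33 are somewhat larger 
than these idealized ratios, presumably reflecting design effects and 
weighting not captured by the simple formula.

```python
import math

LONG_FORM = 20_000_000   # 2000 long form mailed to about 20 million units
ACS_ANNUAL = 3_000_000   # ACS samples about 3 million units per year

def se_ratio(n: float, n_ref: float) -> float:
    """Standard-error ratio under idealized 1/sqrt(n) scaling."""
    return math.sqrt(n_ref / n)

print(f"annual ACS vs. long form: {se_ratio(ACS_ANNUAL, LONG_FORM):.2f}x")      # 2.58x
print(f"5-year ACS vs. long form: {se_ratio(5 * ACS_ANNUAL, LONG_FORM):.2f}x")  # 1.15x
```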

If the funding request is approved, annual ACS data, for 2003, would be 
released beginning in 2004; data for areas with a population between 
20,000 and 65,000 would first be released in 2006 as 3-year averages; 
and data for areas with a population of less than 20,000 would first be 
released in 2008 as 5-year averages. 
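
The release schedule above follows a simple pattern: data for a 
collection window ending in year Y would first be released in year 
Y + 1, with the window length set by area population. A minimal sketch 
of that schedule follows; note that the report states the 65,000 
boundary slightly differently in different places, so the handling of 
areas at exactly that threshold is an assumption.

```python
START_YEAR = 2003  # first year of full ACS data collection

def first_release(population: int) -> tuple[int, int]:
    """Return (averaging window in years, first release year) for an area.
    Treatment of a population of exactly 65,000 is an assumption."""
    if population > 65_000:
        window = 1   # annual data
    elif population >= 20_000:
        window = 3   # 3-year averages
    else:
        window = 5   # 5-year averages
    # Data for a window ending in year Y are released in Y + 1.
    return window, START_YEAR + window

assert first_release(100_000) == (1, 2004)
assert first_release(40_000) == (3, 2006)
assert first_release(10_000) == (5, 2008)
```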

Federal agencies using population counts to update fund allocations or 
for other program purposes will not be affected by the ACS. Population 
counts for 2000 and 2010 will come from the decennial census short 
form, and annual population estimates will continue to come from the 
Bureau’s intercensal population estimates program. [Footnote 8] Some of 
these agencies have already used these data to update fund allocations 
for programs requiring population counts. 

Federal agencies dependent on detailed long-form data will incorporate 
the 2000 Decennial Census long-form data into their programs before 
they start to use ACS data. These agencies will use the 2000 long-form 
data either to replace the corresponding data from the 1990 Decennial 
Census or, for some programs, to replace other source data for more 
recent years. Some agencies have updated data from the 1990 Decennial 
Census using annual data from household surveys. For example, the 
Current Population Survey (CPS) and American Housing Survey (AHS)
ask many of the same questions as does the long form, but because of 
the sample size of these surveys, they provide only national-level and 
some state-level data. The CPS data on national and state levels of 
poverty and unemployment are also used extensively for federal 
programs. For poverty and unemployment data for smaller areas, the CPS 
data are supplemented by estimates from model-based programs of the 
Census Bureau and the Bureau of Labor Statistics (BLS). [Footnote 9] In 
addition, data on income and employment at the state and county level 
are prepared as part of the U.S. regional economic accounts program of 
the Bureau of Economic Analysis. [Footnote 10] Data from these accounts 
are used to allocate over $125 billion in federal funds annually. 
[Footnote 11] 

Regardless of how these agencies updated the 1990 Census data, if the 
ACS proposal is approved, federal agencies would be required to move 
from using 2000 Decennial Census data to using (1) annual ACS data 
beginning with 2004, (2) 3-year averages beginning with 2006, or (3) 5-
year ACS averages beginning with 2008. [Footnote 12] Thus, federal 
agencies using long-form data would begin making decisions about the 
extent to which they would use the new ACS data in 2004. 

To test the quality of the ACS data and to assist ACS data users, the 
Bureau conducted special national supplementary surveys that provided 
data for the ACS questions for geographic areas with a population of 
250,000 or more. Using the annual data from the 2000 supplementary 
survey and 1999-2001 averages for the test sites, the Bureau has 
started an ACS development program that would evaluate and analyze 
differences between ACS data and the corresponding data from the 2000 
Decennial Census long form. [Footnote 13] Two of these reports have 
been released and the remaining reports are scheduled to be completed 
in 2003. In the second of these reports, the Bureau evaluated the 
quality of the ACS data using the four dimensions identified in OMB's 
guidelines on survey errors: accuracy, timeliness, relevance, and 
accessibility. [Footnote 14] However, this report only 
provides limited information on the quality of the two surveys and does 
not provide an overall measure of quality of either the long form or 
the ACS. The Bureau also has announced that it plans to use the data
from the supplementary surveys for 2001 and 2002 to evaluate the 
stability of the annual estimates, but the scope and completion dates 
for this evaluation have not been set. 

The Bureau proposes to conduct the ACS as a mandatory mail survey 
because it would cost less than conducting the ACS as a voluntary 
survey, based on studies by federal agencies that showed response rates 
to mandatory mail surveys are higher. The Bureau also wanted the ACS to 
be mandatory because the Census long form was mandatory and both the 
long form and the ACS collect data that have a use mandated by statute. 
In response to a congressional request, GAO issued a legal opinion on
April 4, 2002, that concluded that the Bureau has the authority to 
conduct the ACS as a mandatory survey. [Footnote 15] 

The process for determining the questions to be asked in the 2003 ACS 
started with the questions asked on the 2000 long form. To determine 
the 2000 questions, federal agencies, as they had for recent decennial 
censuses, provided the Bureau with a list of statutory programs that 
support specific questions; using criteria developed by the Bureau, 
these agencies classified each program into one of three categories—
mandatory, required, or programmatic. A program was to be classified as 
mandatory if the supporting statute explicitly calls for the use of 
decennial census data. A program was to be classified as required 
either if the supporting statute required the use of data and the 
decennial census was the historical source of that data or if the data 
were needed for requirements imposed by the U.S. federal courts. A 
program was classified as programmatic if it did not meet the mandatory 
or required criteria, but was needed for such purposes as program 
planning, implementation, evaluation, or for the operation of another 
statistical program. The Bureau included as questions on the 2000 long 
form those justified by either a mandatory or required program and, in 
addition, included a few questions that the Census Bureau needed for 
survey operation purposes. The Bureau submitted the questions to 
Congress for review 2 years before the forms were to be mailed, in 
accordance with 13 U.S.C. § 141(f); Congress did not disapprove them. 
The Bureau then submitted the questions to OMB, in accordance with 
provisions of the Paperwork Reduction Act. OMB reviewed and cleared the 
questions. 

A similar process was used for the 2003 ACS questions with two 
exceptions: (1) a final list from the federal agencies is not yet 
available and (2) the questions were not submitted to Congress until 
July 11, 2002. For the 2003 ACS, the Bureau sent federal agencies the 
questions on the 2000 long form and asked them to do the following: 
provide a list of programs to support specific questions; classify each
program using criteria developed by the Bureau into one of the three 
categories used for 2000—mandatory, required, or programmatic; and 
describe the frequency and level of geographic detail needed for each 
program. [Footnote 16] Although no additional written information or 
guidance was given to agencies to help them classify the programs into 
the correct category for the ACS, Census officials spoke with agency 
officials about this matter and reported that the categories were 
discussed at meetings of the OMB-sponsored and -directed Interagency 
Committee on the American Community Survey. In addition, agencies were 
not asked to provide any information on how they were planning to use 
2000 long-form data or on planning to transition from the use of long-
form data to the use of ACS data. 

Although the process of compiling the lists of programs for the 2003 
ACS started in early 2001, the agencies were not able to complete a 
final list in time for the Bureau to submit the ACS questionnaire for 
OMB approval. Therefore, in late 2001, the Bureau decided that for OMB 
approval, it would select a short list of programs with mandatory or 
required justifications already identified by the agencies. The Bureau
selected programs so that each of the proposed 2003 ACS questions was 
supported by at least two statutes. In April 2002, as required by 
provisions of the Paperwork Reduction Act, the Bureau submitted the 
2003 ACS to OMB for approval and justified the questions with the 
programs on the short list. [Footnote 17] In order to ensure that the 
latest lists provided by the agencies were complete, on June 13, the 
Department of Commerce formally requested that each agency review its 
lists. The approved lists were not available at the time of this 
review. On June 28, 2002, OMB cleared the ACS questionnaire with the 
condition that the Bureau must submit to OMB, in advance, any plans to 
use the ACS to select samples for other surveys. [Footnote 18] This 
advance submittal would continue until OMB agreed on an approach for 
the Bureau to evaluate such plans. 

The Bureau has been conducting tests of the ACS at the county level, 
starting with four test sites in 1996 and increasing to 31 sites by 
1999. Based on these tests, the Bureau has determined that it has 
successfully demonstrated the feasibility of conducting the ACS. 
[Footnote 19] 

Annual ACS Data Less Accurate but More Timely than Long Form; Federal
Agencies Need Additional Information for Transition to ACS: 

Our framework for evaluating the quality of the data from the two surveys 
was based on four OMB guidelines for measuring survey errors--accuracy, 
timeliness, relevance, and accessibility. [Footnote 20] Because of the 
larger long-form sample, data for 2010 from a decennial census long 
form would be significantly more accurate than the data for 2010 from 
the proposed ACS. Based on the size of the 2000 long-form sample, the
2010 census long form would be mailed to about 20 million housing 
units; in contrast, the 2003 and later years’ ACS questionnaires would 
be mailed to 3 million units a year. On the one hand, ACS data for 2010 
not only would be less accurate than the 2010 long-form data, but would 
also be limited to areas with a population of more than 65,000. On the 
other hand, ACS data would be timelier. Data for areas with a 
population of more than 65,000 would be available annually, and the 5-
year averages would be available for the same geographic levels as the 
long form. Because similar questions are used on both surveys, data 
from the long form and the ACS would have the same level of relevance. 
Based on past Bureau practices, there would be no significant 
differences in accessibility to the data. 

We did not attempt to combine our evaluations of the four guidelines 
into an overall measure of quality for each survey. First, complete 
information was not available to evaluate all of the components of each 
of these guidelines. Second, as noted in the OMB guidelines, even with 
complete information, it is difficult to combine the results of the 
evaluations of these components into an overall measure of quality. 

Currently, the Census Bureau’s plans to evaluate ACS data provide only 
a limited amount of the information needed by federal agencies to 
transition from their use of the 2000 long form to the ACS. For 
example, the plans do not provide for (1) 2003 data conceptually 
consistent with the 2000 long-form data, (2) information to adjust 2003 
and 2004 data to account for statistical differences with the 2000 long 
form, (3) information to integrate annual data and multiyear averages, 
and (4) the Bureau’s proposals to incorporate, first, the population 
counts from the 2010 Decennial Census into the 2010 ACS and, second, 
the resulting revisions to the intercensal population estimates into 
previously published ACS data. [Footnote 21] 

Evaluation of ACS and Long-Form Data Quality Currently Incomplete: 

We evaluated the data quality of the two surveys, using the four OMB 
guidelines for measuring survey quality. According to the information 
available to us, we found that the accuracy of the ACS, based on sample 
size, would be less than that of the decennial census long form. 
Sufficient information on nonsampling errors is not yet available to
compare the two surveys for this measure of accuracy. Nonresponse 
errors, based on the incomplete information, were somewhat smaller for 
the ACS. Measurement errors, based on more complete information, 
appeared to be larger in the ACS. However, the timeliness of the ACS 
data would be superior. 

Accuracy: 

Our findings on accuracy were based on information from the Bureau on 
both sampling and nonsampling errors, which includes nonresponse, 
measurement, and coverage errors. According to the Bureau, the ACS 
sampling error will be larger than the error for the long form, but the 
impact of this larger sampling error may be reduced through the use of 
more experienced interviewers than those used for the decennial census. 
However, we found no indication that the experience of the interviewer 
would make a significant impact, especially if the ACS mail response 
rate was high and the number of follow-up interviews low. 

As reported in the OMB guidelines for survey errors, nonsampling error 
is frequently the source of the most significant errors in surveys. But 
we were not able to determine whether that was the case for either the 
long form or the ACS. Nevertheless, for both surveys, we found 
indications based on incomplete data of two types of nonsampling error, 
nonresponse and measurement error. The impact of nonresponse error 
appears to have been greater for the long form; the impact of 
measurement error appears to be greater for the ACS. 

Item nonresponse occurs when a respondent does not complete an item on 
the survey form or provides an unusable response. [Footnote 22] For 
both the Census 2000 Supplementary Survey and the 2000 Decennial Census 
long form, information on item nonresponse was based on published 
information on imputations for selected states and on national-level 
data on imputations for a small group of items provided to GAO by the 
Bureau. For the 12 states for which the Bureau has released item 
nonresponse data, imputation rates were typically about the same for 
all the states. For individual items, imputation rates were slightly 
higher for the long form. Of the 35 items for which we had imputations 
at the national level for both surveys, we found that for total income, 
the imputation exceeded 20 percent of the total value for both. For
items such as period of active-duty military service, time of departure 
for work, weeks worked, the year housing was built, and the value of 
owner-occupied housing units, we found that the value imputed was 
between 10 and 20 percent for both surveys. For the rest of the items 
for which we had data for both surveys, the imputations accounted for 
slightly more of the long-form total than of the corresponding items on 
the supplementary survey. 

We found indications of measurement error, one of the major sources of 
nonsampling error, based on our examination of long-form and ACS data. 
Measurement error is usually calculated as the difference between the 
survey value and the true value. As is usually the case, true values 
are not available. For this review, we assumed that because of the much 
larger sample size of the 2000 long form, the value from the 2000 long 
form is closer to the true value than the corresponding Census 2000 
Supplementary Survey value. [Footnote 23] To examine the differences 
between long-form and ACS data, we compared published national-level 
and state-level data for a set of items selected from among the major 
topics on the form. [Footnote 24] 

These comparisons showed large national differences for key items that 
did not appear to be accounted for by coverage differences between the 
two surveys. [Footnote 25] For example, at the national level, the 
largest differences were for these items: (1) for the number of housing 
units lacking complete plumbing facilities, with the long-form estimate 
27 percent higher than the estimate from the supplementary survey, and
(2) for the number of unpaid family workers, with the long-form 
estimate 59 percent lower. Other items with national-level differences 
of at least 10 percent included self-employed workers, housing units 
lacking complete kitchen facilities, and housing units with no 
telephone service. We also found a great degree of variation in the 
state differences between the long form and the supplementary survey. 
For the following items, a significant proportion of the states had 
long-form estimates 10 percent or more higher than the supplementary 
survey estimates, while a significant proportion had estimates 10 
percent or more lower: 
workers commuting by public transportation; households with income of 
$200,000 or more; housing units lacking complete plumbing facilities;
number of renter-occupied units with gross monthly rent of $1,000 to 
$1,499; and some of the measures of the number of individuals and 
related children below the poverty threshold. 
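The percent-difference comparisons described above can be sketched in a few 
lines of code. The figures below are hypothetical illustrative counts, not 
the published long-form or supplementary survey values; only the direction 
and size of the resulting differences mirror the report's examples.

```python
def percent_difference(long_form: float, supplement: float) -> float:
    """Long-form estimate relative to the supplementary survey
    estimate, in percent (positive means the long form is higher)."""
    return (long_form - supplement) / supplement * 100.0

# Hypothetical national estimates (illustrative only; not the
# published Census 2000 long-form or supplementary survey figures).
items = {
    "units lacking complete plumbing": (670_000, 527_000),
    "unpaid family workers": (150_000, 365_000),
}

for item, (long_form, supplement) in items.items():
    diff = percent_difference(long_form, supplement)
    direction = "higher" if diff > 0 else "lower"
    print(f"{item}: long form {abs(diff):.0f} percent {direction}")
```

With these illustrative inputs, the first item shows the long-form estimate 
about 27 percent higher and the second about 59 percent lower, the pattern 
reported in the text.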

To gauge the accuracy of the 2000 ACS data, we also looked at 
differences between the 2000 ACS and the Census Bureau’s CPS. We 
reviewed these data using a dimension of quality that combines 
accuracy with relevance and timeliness. Based on sampling errors 
alone, we found that the 2003 ACS would be more accurate. However, 
based on technical reports, we found that both the Census Bureau and 
the BLS view the existing surveys as providing more accurate and more
relevant information. [Footnote 26] For example, neither the Bureau nor 
BLS uses long-form data in the statistical measures of income, poverty, 
and labor at the national and state levels. [Footnote 27] The reason 
given by these agencies for not using the long-form data for these
items is that (1) the CPS has more detailed questions that more closely 
relate to the underlying concepts and (2) the surveys are conducted by 
experienced interviewers. OMB, in Statistical Policy Directive No. 14, 
has designated the CPS as the official source of statistical measures 
of poverty. The Department of Health and Human Services (HHS) has 
designated the CPS as the source of poverty measures for its programs. 
[Footnote 28] 

To follow up this information about poverty and unemployment rates, we 
compared the total unemployment rate and two poverty rates--for 
individuals and for related children under 18--in the long form, the 
Census 2000 Supplementary Survey, and the CPS. [Footnote 29] We found 
that at the national and state levels, there were small differences for
the unemployment rate and for the poverty rate for all individuals. In 
contrast, comparisons of these rates for the CPS with these two surveys 
showed larger differences. The national unemployment rate, according to 
the CPS, was 4.0 percent, compared with 5.8 percent for the long form 
and 5.4 percent for the supplementary survey. The national rate for 
individuals in poverty for the CPS was 11.3 percent, compared with 12.4 
percent for the long form and 12.5 percent for the supplementary 
survey. The pattern for the national poverty rate for related children 
under 18 was different because the gap between the ACS and long-form 
rates was larger. The CPS rate was 15.6 percent, compared with 16.1 
percent for the long form and 17.0 percent for the supplementary 
survey. Comparisons of the distributions of state differences between 
the long form and the supplementary survey also showed small 
differences for the unemployment rate and for the poverty rates for 
individuals and for related children under 18. Comparing the 
distribution of the state differences between the CPS and either of the 
other two surveys only showed significant differences for the poverty 
rate for related children under 18. Compared with the long form, the 
CPS rate for 12 states was 2.5 or more percentage points lower and for 
10 states was 2.5 or more percentage points higher. Compared with the 
supplementary survey, the CPS rate for 16 states was 2.5 or more 
percentage points lower and for 5 states, 2.5 or more percentage points 
higher. 

We asked Census Bureau and BLS officials about future plans for the use 
of the ACS. According to Census Bureau officials, they had been doing 
research into the use of ACS data to improve their model-based 
estimates, but did not have any definitive plans. [Footnote 30] 
According to BLS officials, they had recently let a research contract 
to help them determine whether ACS data could be used to improve their 
small-area estimates. Because of the widespread use of CPS poverty and 
unemployment data in federal programs, assistance by these two 
statistical agencies would help the program agencies in deciding 
whether to replace the CPS data with ACS data. 

We anticipate that the Bureau’s evaluation studies, to be completed in 
2003, will provide explanations for the measurement errors. For 
example, we expect that the evaluations will separate out measurement 
errors by quantifying the impact of excluding from the supplementary 
surveys people living in group quarters and of treating differently 
people with seasonal residences. In our review of differences between 
the long-form and supplementary survey data, it did not appear that 
these errors would explain the large differences noted above. 
Nevertheless, these coverage differences will contribute significantly 
to differences in certain states and for certain data items. 

Timeliness: 

The timeliness of the ACS data for all geographic levels would be a 
major improvement over the long form, especially for annually published 
data for geographic areas with a population of 65,000 or more. However, 
use of these annual data for geographic areas with populations at the 
lower end of this range may be limited. The Bureau has reported that 
the accuracy of the annual data for these areas would be roughly 
comparable with the accuracy of the state estimates from the CPS. We 
found that in describing the accuracy of the CPS income and poverty 
data, the Bureau has reported that annual state data should not be 
used, but that 2-year averages should be used to calculate changes at 
the individual state level and 3-year averages should be used for 
calculating relative rankings for states. [Footnote 31] Because the
ACS has a larger sample than the CPS, these limitations should not 
apply to annual ACS data for states and other large areas, but they may 
apply to the annual ACS data for smaller areas. Thus, federal agencies 
planning to use annual data for these areas will need information on 
when to use multiyear averages instead of the annual data. 
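The Bureau's guidance for CPS state estimates—2-year averages for measuring 
change and 3-year averages for relative rankings—amounts to simple averaging 
of consecutive annual estimates. A minimal sketch, using hypothetical state 
poverty rates rather than actual CPS data:

```python
def multiyear_average(annual, years):
    """Average of the most recent `years` annual estimates."""
    return sum(annual[-years:]) / years

# Hypothetical annual poverty rates (percent) for one state, oldest first.
rates = [12.8, 11.9, 12.5]

two_year = multiyear_average(rates, 2)    # for measuring state-level change
three_year = multiyear_average(rates, 3)  # for relative rankings of states
print(f"2-year average: {two_year:.2f} percent, "
      f"3-year average: {three_year:.2f} percent")
```

The same averaging would apply to annual ACS data for smaller areas if the 
Bureau were to recommend multiyear averages there as well.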

Relevance: 

Because of the similarity of the long-form and ACS questions, their 
levels of relevance—the extent to which a survey provides conceptually 
meaningful and useful measures—are similar. However, for two important 
measures used in federal programs—poverty and unemployment rates—the 
ACS and long-form data are not as relevant as the measures from 
existing surveys, according to the agencies that conduct those surveys. 
Our findings on these measures were discussed under “accuracy.” 

Information to Meet Federal Agencies’ Transition Needs Missing: 

Federal agencies would need assistance from the Bureau in the 
transition process, as recognized in the 2001 National Academy of 
Sciences report on choosing the formula allocations, which concluded: 

The American Community Survey (ACS), which is intended to replace the 
decennial census long form, would be a major new data source that could 
be used in estimating inputs if the survey were implemented as planned. 
With data from census 2000 becoming available in stages and the ACS 
pending, an immediate and high priority should be given to developing 
recommendations on how to make a smooth transition to these and other 
data sources and how to evaluate the impact on allocations of 
introducing new data sources. [Footnote 32] 

The Census Bureau has recognized its responsibility to provide such 
assistance through various outreach efforts and its ACS development 
program. [Footnote 33] The Bureau has stated: “Users need to understand 
the differences in order to properly use the C2SS and ACS data in their 
own applications and to be able to distinguish real changes over time 
from changes in estimates because of differences in methods.” [Footnote 
34] The current plans for the testing program call for an analysis of 
differences between the 2000 Census long-form data and the data from 
the Census 2000 Supplementary Survey, to be completed in 2003. 
[Footnote 35] 

From the perspective of the federal agencies, however, we found that 
the ACS development program is missing material, described below. This 
material would address differences between the two surveys related to 
sampling, measurement, and nonresponse errors, discussed earlier in 
this section. The analysis of these differences will provide 
information critical to the agencies’ transition to the ACS because 
these differences are likely to significantly change the allocation of 
funds and program eligibility, and agencies will need to fully 
understand the sources of such changes. 

In an earlier report on the comparability of the 2000 Census long-form 
data and the Census 2000 Supplementary Survey, the Bureau noted the 
following about its evaluation program: “The purpose of those 
evaluations is to help the user understand how the estimates will 
differ, but not to adjust the C2SS in any way to mirror the long form.” 
[Footnote 36] Thus, the Bureau has excluded from the current testing 
program a plan to adjust the data from supplementary surveys for 2000-
2002 and the 2003 ACS to account for coverage differences—for group 
quarters and seasonal residences—between the ACS and the 2000 Decennial 
Census long-form data at the national and state levels. [Footnote 37] 
In the supplementary surveys and the 2003 ACS, people living in group
quarters were excluded; in the 2000 Decennial Census, people living in 
group quarters accounted for about 2.8 percent of the population. 
[Footnote 38] In addition, the Bureau decided to change the treatment 
of people who had seasonal residences because the treatment in the 
decennial census reflected where people lived on only 1 day of the 
year, even though they might spend most of the time living somewhere 
else. This difference does not affect the national-level data. The 
adjusted series for 2000 would help explain some of the large 
differences between the 2000 ACS and long-form data; the adjusted 
series for 2003 would allow the agencies to consider using the adjusted 
2003 ACS data to update the 2000 estimates instead of waiting until 
2004, when ACS would begin to cover people living in group quarters. 
Thus, it would only be necessary to adjust ACS data beginning with 2004 
for the difference in the treatment of seasonal residences. 

We also found that the ACS development program does not include plans 
to provide information on two elements of accuracy of the annual ACS 
estimates--their use as measures of yearly changes for state and county 
data and relative rankings between states and counties. This 
information would assist federal agencies in deciding (1) how 
frequently they should update their fund allocations or eligibility 
criteria and (2) whether they should use averages or the annual data. 
As previously noted, the accuracy of the ACS annual data would be 
roughly comparable with that of state data from the CPS, and the 
Bureau has recommended using 3-year averages when calculating relative 
rankings of state CPS income and poverty data. 

In addition, we found that the ACS development program did not cover 
information about different ways to integrate the annual data for 
states and large counties and the 3- and 5-year averages for smaller 
counties. For example, in 2008, the Bureau would publish annual data 
for 2007 for states and counties with a population of more than 65,000; 
3-year averages for 2005-07 for counties with populations of 20,000 or 
more; and 5-year averages for 2003-07 for all counties. Federal 
agencies that need state data can choose to use either the annual data, 
multiyear averages of the annual data, or 3-year or 5-year ACS 
averages. Federal agencies that also need county data will face several 
options: They can choose to use the most recent annual data for large
counties and adjust the averages of the smaller counties to agree with 
annual data. Alternatively, they can choose to use various combinations 
of multiyear averages. We also found that some agencies use existing 
household survey data instead of decennial census data. These agencies 
would have the option of deciding when or whether to switch to the ACS. We 
found that the Bureau’s ACS development program did not include a 
report analyzing differences for corresponding data items in annual 
changes and in annual levels between the ACS and the existing surveys. 
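One way an agency might assemble a single set of county estimates from the 
Bureau's planned products—annual data for counties over 65,000, 3-year 
averages for counties of 20,000 or more, and 5-year averages for all 
counties—can be sketched as follows. The selection rule and the figures are 
our own illustration, not a Bureau recommendation.

```python
# Hypothetical county populations and the estimates (percent) that would
# be available for each in a year like 2008: annual data, a 3-year
# average, and a 5-year average. None marks a product not published
# for a county of that size.
counties = {
    "County A": {"pop": 900_000, "annual": 14.1, "avg3": 13.8, "avg5": 13.5},
    "County B": {"pop": 45_000,  "annual": None, "avg3": 10.2, "avg5": 10.0},
    "County C": {"pop": 8_000,   "annual": None, "avg3": None, "avg5": 16.7},
}

def best_estimate(county: dict) -> float:
    """Pick the most current product published for a county this size:
    annual data over 65,000; 3-year averages at 20,000 or more;
    5-year averages otherwise."""
    if county["pop"] > 65_000:
        return county["annual"]
    if county["pop"] >= 20_000:
        return county["avg3"]
    return county["avg5"]

for name, county in counties.items():
    print(f"{name}: {best_estimate(county)} percent")
```

As the text notes, an agency could instead adjust the multiyear averages for 
smaller counties to agree with the annual data, or use multiyear averages 
throughout; the choice affects comparability across counties.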

Finally, we looked ahead to 2011, when the Bureau would need to 
incorporate (1) the 2010 Decennial Census population counts into the 
2010 annual ACS data and (2) the revised 2003-09 population estimates 
into the previous multiyear averages. We found no plans on benchmarking 
ACS data to the 2010 Decennial Census, although these plans could 
affect agencies’ decisions on use of the ACS. 

Federal Agencies Justify ACS Questions, but Uncertainty Remains on 
Extent of ACS Data Use: 

Federal agencies provided the Bureau with a list of justifications to 
support ACS questions and classified each program into one of three 
categories—mandatory, required, or programmatic. This list was not 
complete when the Bureau submitted the request to OMB for approval of 
the ACS questionnaire. Consequently, from among those programs 
classified by the agencies as mandatory—decennial census data specified 
by statute—or required—decennial census data historically used to 
support a statute or court-imposed requirements—the Bureau selected a 
short list of justifications for submission to OMB. The Bureau provided 
OMB both the short list and the latest draft of the complete list, and 
OMB cleared the ACS questionnaire based on this information. Without a 
complete list approved by the agencies and information on how the 
agencies planned to use 2000 Decennial Census and ACS data, we reviewed 
only the justifications on the Census-approved short list sent to
OMB. [Footnote 39] 

The 20 questions justified by mandatory programs reflect the provisions 
of seven statutes: One statute justifies 13 questions for providing 
information to the Equal Employment Opportunities Commission (EEOC) to 
enforce the Federal Affirmative Action Plan. Another statute justifies 
6 questions for providing information to the Department of Justice 
(DOJ) to enforce the Voting Rights Act. A Department of Commerce (DOC) 
statute justifies 1 question for providing information for legislative
redistricting. The other statutes relate to programs of the Department 
of Agriculture (USDA), DOC, and HHS. Based on our review of the 
statutes underlying these programs, we found that the statutes require 
the use of decennial census data. 

As previously noted, we were unable to verify most of the 48 required 
classifications because the agencies were not asked to report on how 
they planned to use the newly available 2000 Decennial Census data. 
Information on how these data were actually used was not available when 
the agencies submitted their justification list because the 2000 Census 
long-form data were not yet available. In addition, we were not able to 
review agency plans for the ACS because the Bureau did not ask agencies 
to report their planned use of ACS data in their programs. Information 
about when these data would be introduced, whether annual data or 
multiyear averages would be used, and whether 2000 Decennial Census and 
ACS data would be integrated would also have been useful in guiding the 
Bureau in the ACS development program. We were told by Bureau 
officials that this information was not requested because the Bureau
followed the justification process used for the 2000 Decennial Census 
long form. We were also told that the three questions included for 
survey-operation purposes were necessary. 

In addition to providing federal agencies with direct use of ACS data 
for program needs, the Bureau has announced that it would conduct 
special surveys for them, based on ACS responses. In the past, the 
Bureau has used this practice for responses to other surveys, such as 
the decennial censuses. We agree with the Bureau that this practice is 
not prohibited by the disclosure provisions in 13 U.S.C. § 9(a)(1), 
which provide that Census data may not be used “for any purpose other 
than the statistical purposes for which it is supplied.” We agree with 
the Bureau’s opinion that “statistical purposes” includes the use of 
information collected in one Bureau survey to conduct another Title 13 
statistical survey. The Bureau itself would conduct all additional 
surveys; responses would not be provided to any other federal agency. In
the cover letters mailed with the ACS questionnaires, the Bureau had 
notified respondents in the ACS testing programs of this plan. We were 
unable to determine whether respondents understood the possible impact 
of the plan; we also did not find any mention of this notification in 
the information provided to the staff conducting the ACS testing. 

OMB, in approving the ACS questionnaire for 2003, has required the 
Bureau to meet certain conditions before using the ACS sample to select 
samples for other surveys. OMB stated, “The Census Bureau is not 
permitted to use the ACS for follow-up studies until an approach has 
been agreed to with OMB.” [Footnote 40] 

Duplicate or Similar Questions in ACS and Other Federal Surveys: 

Duplicate or similar questions in federal surveys may cause an 
unnecessary burden on respondents. The Paperwork Reduction Act requires 
agencies to minimize the reporting burden for respondents and the cost 
to the government by prohibiting unnecessary duplication of questions 
in information collection. In its statement submitted to OMB for 
approval of the ACS questionnaire under this act, the Bureau reported: 
“The content of the American Community Survey reflects topics that the
Census Bureau is mandated or required to collect. A number of questions 
in the American Community Survey appear in other demographic surveys, 
but the comprehensive set of questions does not duplicate any other 
single information collection.” [Footnote 41] It should also be noted 
that although many ACS questions are similar to or the same as 
questions on other federal surveys, these other surveys do not provide
data for small geographic areas that the Bureau plans to provide from 
the ACS. 

The Bureau’s statement on duplication does not address the possible 
elimination of questions on other surveys that would become duplicative 
because the data would be collected on the ACS. We did identify other 
existing federal surveys that ask some of the same questions as the 
ACS, or similar ones. It appears, however, that continuing to include 
these questions on those surveys is justified because the questions are 
primarily about population characteristics—such as age, sex, race, and, 
sometimes, income. For 
example, these questions are needed to provide context for the major 
focus of each of the following surveys: the Survey of Consumer 
Expenditures (Department of Labor), the National Health and Nutrition
Examination and National Health Interview Surveys (HHS), the Survey of 
Crime Victimization (DOJ), and the Survey on Nutrition (USDA). The 
questions on these surveys focus on consumer spending, smoking or 
eating habits, or crime, topics that would not be covered by the ACS. 

Other than the questions on population characteristics that are on many 
surveys, questions on three voluntary household interview surveys 
appear to have the most overlap with ACS questions. The surveys are the 
Bureau’s annual supplement to the CPS, the Bureau’s Survey of Income 
and Program Participation (SIPP), and the AHS, which the Bureau 
conducts for HUD. All three of these surveys have questions that 
overlap with ACS questions on the labor force, incomes, and other 
topics, such as country of birth. For the AHS, there also is a 
substantial overlap for questions on housing characteristics. 

It should be noted that in some cases, overlap does not mean that the 
identical questions were asked. In addition, even when virtually 
identical questions were asked, one survey might include additional 
questions to obtain the most relevant response. For example, to 
determine whether a person is unemployed, the CPS asks more questions 
than the ACS; to determine whether a property is used as a business or 
medical office, the AHS asks about the number of rooms used for 
business, the number of rooms used for both business and personal use, 
and whether there is a medical or dental office on the property. In the 
ACS, the respondent is only asked, “Is there a business (such as a 
store or barber shop) or a medical office on this property?” 

According to the Census Bureau, income and labor force data should 
continue to be collected in the CPS and SIPP because of the unique 
characteristics of the data from these surveys. [Footnote 42] The CPS 
income data have been determined by OMB (Statistical Policy Directive 
No. 14) to be the official statistical source to calculate the poverty
threshold and related estimates for the nation and for the states. SIPP 
collects more detailed information on incomes and on characteristics 
related to poverty; it is designed as a longitudinal survey, which 
allows users to study household behavior over time. In addition, CPS 
and SIPP periodically include supplements covering special topics. The 
CPS has covered topics such as workers who hold multiple jobs, 
intermittent workers, and health insurance. The SIPP has covered topics 
such as wealth, day care, and disability. The ACS estimates of income 
and poverty would be more accurate than those from the CPS or SIPP 
because they would have a smaller sampling error, but the use of 
trained interviewers for the CPS and SIPP may reduce nonresponse error 
sufficiently to offset the lower ACS sampling error. Although trained 
interviewers may 
reduce nonresponse error, there is also empirical research that shows 
that both CPS and SIPP income data differ significantly from 
independent benchmark estimates. [Footnote 43] Now that 2000 long-form 
income data are available, updating this research would enable the 
agencies to reexamine the relative accuracy of the various estimates. 

The AHS is a biennial household interview survey, sponsored by HUD and 
conducted by the Bureau. The survey costs about $17 million a year and 
has many questions on income and housing characteristics that are more 
detailed, but similar to ACS questions. The ACS is based on a much 
larger sample and provides far more geographic detail annually than the 
AHS. Our review of ACS and AHS questions showed a substantial overlap 
for questions on place of birth and citizenship, education, labor force 
characteristics, transportation to work, income, and housing 
characteristics. Of the 66 questions on the 2003 ACS, 25 are in the 
section on housing characteristics; all but one of these questions are 
the same as or similar to questions on the AHS. In addition, when we 
reviewed the most recent list of program justifications for the ACS, 
provided by HUD to the Bureau, we noted an overlap between HUD’s 
current use of the AHS and the decennial census and its planned use of 
the ACS. According to information provided to OMB to support approval 
of the AHS, HUD reported: 

The major program uses of the AHS are to develop and evaluate the Fair 
Market Rents (FMR's) for the Section 8 Existing Housing Program and the 
Housing Voucher Program, and the Annual Adjustment Factors (AAFs) used 
to grant rent increases for units under contract for both the Section 8 
New Construction and Existing Housing Programs. 

The preliminary list of ACS uses by HUD, provided to the Bureau, also 
showed several of these same programs. 

Conducting the ACS as a Voluntary Survey Would Most Likely Result in
Higher Costs: 

The Bureau’s decision to conduct the ACS as a mandatory survey is 
supported by studies of two surveys—one of households and one of 
businesses—that showed that response rates to mandatory mail surveys 
are higher than those to voluntary mail surveys. The study on the 
household survey, conducted by the Bureau as an experiment, using the 
1990 Decennial Census short form, showed the response to the mandatory 
survey was about 9 percentage points higher than the response to the
voluntary survey. [Footnote 44] The study on the business survey, also 
conducted by the Bureau, showed the response to the mandatory survey 
was more than 20 percentage points higher. [Footnote 45] We reviewed a 
study of another Bureau mail survey and a BLS study of mail surveys of 
businesses and found the same pattern of reporting. We also analyzed
unpublished BLS data on the response rates to a monthly business 
survey, where the reporting in some states was mandatory. These data 
showed a higher response rate with mandatory surveys, but the gap was 
smaller—12 percentage points for March to May of 2001 and 6 percentage 
points for the same months in 2002. However, we also found that 
interpreting differences in response rates between surveys is 
difficult, as noted in the literature on response rates. [Footnote 46] 
Some of the factors that can distort the comparisons include 
differences in survey methods, survey length, population surveyed, 
quality of nonresponse follow-up interviewers, and extent and nature of
follow-up methods. 

We also found that response rates to private surveys tend to be lower 
than for federal government surveys. Among the privately conducted 
national household interview surveys, two are sponsored by HHS. For the 
Health and Retirement Survey, conducted by the Institute for Social 
Research of the University of Michigan, the response rate is about 82 
percent. [Footnote 47] For the Medical Expenditures Panel Survey 
Household Component, conducted by Westat, Inc., and the National 
Opinion Research Center of the University of Chicago, the response rate 
for the 1996 survey was 83 percent. [Footnote 48] For telephone 
surveys, an industrywide survey of private marketing and opinion 
research firms reported the highest average response rate among
different types of telephone surveys, 52.5 percent for customer 
satisfaction surveys. [Footnote 49] In contrast, the combined response 
rate for the four ACS test sites in 1996 was 98.2 percent and for the 
Census 2000 Supplementary Survey, 95.4 percent. 

Information provided by the Bureau indicated that costs of a voluntary 
ACS would be greater because of the larger number of follow-up 
interviews that would be needed due to the lower response rate. 
However, it is not clear whether, with sufficient funding, the Bureau 
would be able to achieve the same overall response rate for a voluntary 
mail or interview survey as for a comparable mandatory mail survey. 
This question cannot be answered from the existing evidence because 
there has been no testing of response rates for a voluntary mail survey 
of households of the size and scope of the ACS. For the ACS, such a 
study would be needed not only to determine the overall response rate, 
but also the extent of item nonresponse. 

As to costs, we asked the Bureau to estimate the additional costs of 
conducting the ACS as a voluntary survey, assuming a lower mail 
response rate and comparable quality results. The Bureau provided an 
estimate of an additional $20 to $35 million per year, assuming that 
the mail response rate was 6 percent lower. 

Interviewer Training, as Well as Outreach and Promotion Efforts, 
Encouraged Participation in the ACS Test Program: 

As with all its surveys, one of the Bureau’s principal objectives in 
conducting the ACS test program was to achieve a high response rate so 
as to collect complete and accurate data. The training the Bureau 
provided to interviewers who collected data from nonrespondents, in 
concert with other strategies—such as a respondent-friendly 
questionnaire, multiple mailings, as well as outreach and 
promotion—encouraged participation, that is, a high response rate, in 
the ACS test program. [Footnote 50] 

Follow-up Interviewers Trained to Encourage Participation in the ACS: 

The Bureau has consistently achieved high overall response rates in the 
ACS tests. For example, the Bureau reported that the first ACS test in 
1996 had a mail response rate of 60.9 percent at the four test sites 
(Rockland County, N.Y.; Brevard County, Fla.; Fulton County, Pa.; and 
Multnomah County and the city of Portland, Ore.). But the final 
response rate—once the Bureau completed its follow-up efforts with 
people who did not respond to the initial mail survey—was 98.2 percent. 
[Footnote 51] The Bureau’s ACS program staff was pleased with the 
results. 

As the ACS test program expanded to 31 sites between 1997 and 1999, the 
Bureau continued to achieve similar mail and final response rates. The 
Bureau’s staff of follow-up interviewers helped achieve these high 
rates because they were trained in a variety of techniques to encourage 
participation by households that did not respond to an initial mail 
survey. 

During the first month of the 3-month ACS data collection cycle, the 
Bureau made a concerted effort to obtain responses by mail because this 
is the least costly method of obtaining survey data. To encourage 
participation, the Bureau used a respondent-friendly questionnaire and 
a four-part mailing strategy: over the course of the month, the Bureau 
sent each household (1) a pre-notification letter that described the ACS
and informed recipients they would soon receive the questionnaire; (2) 
an initial ACS questionnaire and information about the survey; (3) a 
postcard reminding recipients to complete the questionnaire and 
thanking them if they had already done so; and (4) about 3 weeks after 
the initial ACS questionnaire, a replacement questionnaire that was 
mailed to housing units that had not yet returned their questionnaires. 
The Bureau reported that in 1996, the replacement questionnaire added 
about 10 percentage points to the initial response rate at each test 
site. 

During the second month, Bureau staff attempted to collect data via the 
telephone, using a procedure called Computer-Assisted Telephone 
Interviewing, from households that did not mail back their 
questionnaires. A month later, in a final procedure called Computer-
Assisted Personal Interviewing, Bureau field representatives were to 
visit a one-in-three sample of the remaining nonrespondents. Overall, 
the telephone interviewers and field representatives appeared to be 
effective in their tasks. In 1996, refusal rates were about 14 percent 
for the telephone interviews and 4 percent for the in-person 
interviews. 

Because the telephone interviewers and field representatives play an 
important data collection role and represent the Bureau to the general 
public, proper training is critical. The Bureau provided both telephone 
interviewers and field representatives with similar training, 
consisting of lectures, scripted mock interviews, and discussions. Our 
review of the materials used for the follow-up indicates that most of
the training was devoted to correct use of the computers and other 
mechanics of conducting the interview. Dealing with reluctant 
respondents appeared to make up a small portion of the training. 

According to the training manual, telephone interviewers, after 
verifying the household, were to begin the survey by telling 
respondents: “I am required by law to tell you that this survey is 
authorized by Title 13, section 182, of the United States Code…. This 
survey is mandatory and your cooperation is very important. All the
information you provide is completely confidential.” [Footnote 52] 

If respondents were reluctant to participate in the telephone 
interview, the interviewers had available scripted answers to common 
questions about the survey. These answers were aimed at addressing 
respondent concerns and keeping them engaged. One or more of the 
following themes typically ran through the suggested replies: federal 
law requires participation; data from the ACS benefits the respondent’s 
community and the nation; federal law protects the privacy of 
responses; and responding now can help save taxpayers’ money. For 
example, if a respondent said, “I think this is a waste of taxes!” the 
interviewer was instructed to explain: “There are many reasons why it’s 
definitely NOT a waste of tax dollars. Businesses, government agencies, 
and the general public rely on up-to-date statistics, like the 
information we are collecting in this survey, to make informed 
decisions. Calling people by phone to collect this information is the 
least expensive way to do it, if we can’t get a response by mail.” The 
suggested replies appeared to be courteous, informative, firm, and 
nonthreatening. 

In addition, although the ACS was a mandatory survey, the training 
materials cautioned interviewers: “It is rarely necessary to mention 
this law because most people understand the importance of Census Bureau 
survey data and are willing to cooperate. The Bureau places a high 
value on the public’s cooperation and we are counting on you to 
maintain this cherished relationship.” 

Households that refused to participate in the telephone interview and 
households for which the Bureau was unable to obtain a valid telephone 
number were added to the universe of cases eligible for personal 
interviews by the field representatives. Because personal visits are 
the most expensive data collection method, the Bureau used a one-in-
three sample of the remaining nonresponding households. Such households 
are sometimes the most difficult cases for the Bureau to resolve because
a number of them have already refused two mailed questionnaires and the 
telephone follow-up. 

The field representatives were trained in a variety of interviewing 
skills, such as using probe questions to (1) obtain responses from 
respondents who might not answer some of the questionnaire and (2) 
eliminate bias from interview responses. In addition, to help improve 
response rates, field representatives were told how to make a good 
impression on respondents, demonstrate a strong knowledge of the survey,
introduce themselves with confidence and a smile, dress appropriately, 
and be prepared to allay respondents’ concerns. Further, the classroom 
training included a video in which several experienced field 
representatives provided tips on dealing with difficult refusals and 
people who were hard to track down. 

This training was followed by, among other topics, a discussion of 
how field representatives could convert a potential refusal into a 
completed interview. The training manual reminded field representatives 
that the ACS is mandatory, and respondents who are living at addresses 
selected for the survey are legally required to complete the 
questionnaire. The manual also noted that (1) the introductory letter
and the materials mailed subsequently to the household indicate that 
the ACS is mandatory and (2) the field representatives should have a 
copy of the letter available to give to any reluctant respondents. 

The training manual acknowledges that even though respondents have been 
notified that participation is mandatory, some people may still be 
reluctant to participate. The manual then instructs interviewers about 
the importance of (1) making a proper introduction and good first 
impression and (2) listening to and addressing any objections to 
participation, such as the length of the survey or the personal nature 
of the questions. Interviewers were provided with standard responses to 
frequently asked questions that were similar to those responses 
provided to the telephone interviewers. 

If, after following these procedures, the respondent still refuses to 
participate, interviewers were trained to “remain calm and 
professional, and leave the site.” Interviewers were to report the 
refusal to their supervisors who, in turn, were to attempt to contact 
the address either by mail or telephone. 

When the Bureau conducted personal interviews in 1996, the field 
representatives were new to the endeavor. This initially resulted in 
mistakes, such as interviewing neighbors and other nonhousehold 
members. However, the Bureau retrained the interviewers and found that 
the number of such mistakes declined. Moreover, the follow-up efforts 
elicited little in the way of public complaint to the Bureau. Indeed, 
although the Bureau invited the public to comment on the conduct of the 
ACS, no comments were received from three of the test sites, according 
to the 
Bureau. 

The exception was the Brevard County test site where, according to our 
review of Bureau documents and interviews with Bureau officials, about 
30 people wrote letters to Congress in 1996 with concerns or 
complaints about the ACS. The letters generally focused on the personal 
nature of the questions or the legal requirement to participate in the 
survey, not about the interviewers themselves. However, there was one 
reported incident in which a field representative did not follow Bureau
procedures and was overly aggressive in collecting information from 
respondents. The Bureau reportedly reprimanded that individual. 

The Bureau reports that between 1996 and 2002, it received about 250 
letters expressing concerns about the ACS. Our review of 82 letters, or 
about one-half of those available to GAO, suggests that privacy was a 
frequent concern; just 4 of the letters we reviewed mentioned that a 
Bureau interviewer was rude or intimidating. 

In 1996, the ACS nonresponse follow-up operation collected data from 
about 13,800 households, with few problems. This record suggests that 
the training the Bureau provided its telephone interviewers and field 
representatives was aligned with the objective of securing a high 
response rate. For subsequent tests of the ACS, the Bureau relied more 
heavily on a staff of permanent interviewers. The Bureau believed that 
the training and experience of such interviewers resulted in higher
response rates and better quality data. The Bureau’s future plans call 
for a similar approach. 

Outreach and Promotion Efforts Have Gradually Expanded: 

According to Bureau officials, when it launched the ACS test in 1996, 
the Bureau had no outreach staff on board. Instead, the Bureau used a 
press release and free media to publicize the survey to respondents. 
Following the initial test, the Bureau developed outreach and promotion 
efforts that appeared to be geared, in large part, toward government 
officials and data users. An employee responsible for outreach first
joined the ACS program in late 1996 and worked with local people in the 
Multnomah County, Oregon, test site on how the data could best be used. 
The Bureau conducted additional workshops at test sites in 1997, 
following the release of the 1996 data. Those invited to attend 
included congressional staff, local elected officials, planners,
and other local government agencies. 

As the ACS program expanded to 31 test sites, the Bureau increased the 
number and type of outreach activities to include more data workshops; 
town hall meetings; contacts with representatives of national and local 
print and broadcast media; professional journals; and umbrella 
organizations, such as the National League of Cities. For example, in 
late June 2002, the Bureau held the third in a series of ACS meetings 
in Seattle, Washington. According to the Bureau, among the 80 attendees
were representatives of congressional offices, public and private 
organizations, academia, and the media. An outreach staff of six 
employees continues to work with many of the organizations that are 
represented in the Bureau’s racial, ethnic, and decennial census 
advisory committees. 

If the Bureau’s plans for full implementation of the ACS are approved, 
it expects to continue working with organizations that it partnered 
with for the 2000 Decennial Census. As we noted in our earlier report, 
[Footnote 53] the Bureau relied on these partnerships to help improve 
participation in the census and mobilize support for key census 
operations. The Bureau recognized that local people and organizations 
(1) know the characteristics of their communities better than the 
Bureau does and (2) know how best to communicate with those 
communities. 

By comparison, the promotion and outreach efforts for the decennial 
census were far more ambitious, but that is to be expected, given the 
national scope and universal coverage of the census. These efforts 
included an 
advertising campaign, developed by a private sector advertising agency, 
and a nationwide effort to enlist support in taking the census through 
partnering with corporations, community groups, and other 
organizations. In all, for the 2000 Decennial Census, the Bureau spent 
about $374 million on marketing, communication, and partnerships, or 
about $3.19 per household. According to the Bureau, the mail return 
rate was about 74 percent. 
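The per-household figure is consistent with the reported total. A rough check, sketched below in Python using only the rounded amounts given above (an approximation, not a Bureau calculation), recovers the approximate number of households covered.

```python
# Back-of-the-envelope check of the per-household marketing figure,
# using the rounded amounts reported above (an approximation only).
total_spending = 374e6  # dollars spent on marketing, communication, partnerships
per_household = 3.19    # reported dollars per household

households = total_spending / per_household
print(round(households / 1e6))  # roughly 117 million households
```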

Conclusions: 

If the ACS is approved, federal agencies will be able to start using 
annual ACS data as early as 2004. Primarily because the annual ACS data 
will be less accurate than the 2000 decennial census long-form data, 
these agencies will need key information about the ACS data to help 
ensure a successful transition from long-form data to ACS data. In 
addition, the availability 
of ACS data will create opportunities to eliminate questions on 
existing surveys and reduce the reporting burden of these surveys. 

Recommendations for Executive Action: 

In order to facilitate the transition by federal agencies from the use 
of 2000 Decennial Census data to the ACS, we recommend that the 
Secretary of Commerce direct the Director, Bureau of the Census, to 
revise 
and expand the quality-testing and evaluation component of the ACS 
development program. In particular, the following actions should be 
taken: 

* Establish a process to make sure that the ACS development program 
produces key information needed by federal agencies that will have to 
use ACS data when the long form is eliminated. 

* Develop estimates, for states and large local government areas, of 
social, economic, and housing characteristics from the 2000-02 ACS 
special surveys and the 2003 and 2004 ACS to provide agencies with ACS 
estimates that are conceptually consistent with the 2000 Census. 

* Expand the planned evaluation of differences between data from the 
Census 2000 Supplementary Survey and the 2000 Decennial Census long 
form, so as to identify techniques for agencies to use to improve 
consistency between the 2000 Census data and the 2003 and subsequent 
ACS data. 

* Analyze and report on differences between year-to-year changes for 
2001 and 2002, using the data—from ACS special supplements and the CPS 
at the national and state levels—for key economic and housing 
characteristics, such as the unemployment and poverty rates, to 
determine the stability of the annual ACS data. 

* Extend the scope of the ACS development program to include plans to 
benchmark ACS estimates, beginning with 2005, to the 2010 Census 
population counts and the revised 2005-09 population estimates to 
ensure comparability between the ACS and 2010 Census data. 

To more completely address the possibility of reducing the reporting 
burden in existing surveys, we recommend that the Secretary of Commerce 
direct the Director, Bureau of the Census, to review, for possible 
elimination, proposed ACS questions now asked on two surveys conducted 
by the Bureau—the annual demographic supplement of the Current 
Population Survey and the American Housing Survey. 

Questions that are not identical should be eliminated if, in the 
absence of other reasons, the accuracy, timeliness, and geographic 
detail of the ACS data outweigh the greater relevance of the data from 
the existing survey. 

Scope and Methodology: 

We used a combination of approaches and methods to examine the Census 
Bureau’s implementation of the ACS. These included statistical 
analyses; meetings with key Bureau headquarters officials; and reviews 
of relevant documentation, including congressional testimony and 
Federal Register comments on the ACS. Information on all aspects of the 
ACS, the decennial census, the supplementary surveys, and other Bureau 
surveys is available at the Bureau’s Web site [hyperlink, 
http://www.census.gov]. 

To obtain data on the ACS and the 2000 census and to examine how the 
quality of the ACS data, beginning with 2003, would compare with that 
of the 2010 Decennial Census long-form data, we spoke to Bureau 
officials about the technical aspects of the ACS. We reviewed materials 
prepared by the Bureau on the quality, coverage, and underlying 
definitions of the ACS and the relationship of the ACS to other Bureau
programs. We also conducted an analysis of differences, for a 
representative set of data items at both the national and state levels, 
between Census 2000 Supplementary Survey and 2000 long-form data. 

To assess the extent to which ACS data would meet the needs of federal 
agencies, we spoke to officials at BLS and the Census Bureau concerning 
the use of ACS data in their programs. We reviewed previous GAO reports 
on formula allocation and eligibility determination. [Footnote 54] We 
also reviewed directives and guidelines prepared by OMB on the 
measurement of poverty, and spoke to OMB staff on the potential impact
of the ACS on those guidelines. In addition, we reviewed recent 
studies, prepared by the National Academy of Sciences, on federal fund 
allocation, small-area data modeling, and statistical agency practices. 

To determine whether the questions to be asked in the ACS are justified 
by statutory requirements, we reviewed the statutes for mandatory 
programs that agencies used to support the questions. To determine 
whether the planned use of ACS data to select samples for additional 
surveys is consistent with the confidentiality provisions of Title 13, 
we reviewed the pertinent statutory provisions. We reviewed the cover 
letter for the ACS that notified respondents of this use. 

To determine if ACS questions are duplicative or similar to those in 
other federal surveys and if the burden on the respondents could be 
reduced, we reviewed the questions on other federal agency household 
surveys for duplication with the ACS questions. For the CPS and AHS, we 
reviewed a line-by-line comparison prepared for GAO by the Bureau. 

To explore whether the costs of conducting the ACS would be affected if 
it was conducted as a voluntary survey, we reviewed published studies 
of differences in response rates for the same surveys when conducted on 
a mandatory versus a voluntary basis. [Footnote 55] We also obtained 
similar unpublished data from BLS for state-conducted surveys for which 
some states had made responses mandatory. [Footnote 56] 

To determine how the Bureau encouraged participation in the ACS test 
program through training for follow-up interviewers of nonrespondents, 
as well as outreach and promotion efforts, we interviewed Bureau 
officials and reviewed documentation, including training manuals, 
videos, and letters of complaint about the ACS test program. 

We requested comments on a draft of this report from the Secretary of 
Commerce. On September 25, 2002, the Secretary forwarded the Bureau’s 
written comments on the draft (see enclosure). 

Agency Comments and Our Evaluation: 

In written comments on a draft of this report, the Secretary of 
Commerce provided the Bureau of the Census’s comments. Those comments 
are included in the enclosure. Overall, the Bureau agreed with the 
thrust of our recommendations. However, it expressed a number of 
concerns about some of the detailed findings. The principal concerns 
raised by the Bureau and our response are presented below. The Bureau 
also provided technical comments that have been incorporated where
appropriate. 

First, the Bureau expressed concerns about our approach to comparing 
the quality of data from the proposed ACS and the 2000 Decennial Census 
long form, stating that (1) we did not adequately take into account the 
tradeoffs between accuracy and timeliness and (2) we did not take into 
account certain information on response rates. We followed OMB 
guidelines on measuring survey quality in our analysis, and included in 
our analysis information on the impact of nonsampling error, using 
measurement error and item imputation rates for the detailed questions. 
We made standard assumptions about the impact of sampling error on the 
two sets of data. In addition, we recognized the limitations of these 
measures, including those noted by the Bureau in its comments, and 
summarized our findings with the following cautionary statement: 
“Because there is no one formula to determine the relative importance 
of the components, it is not possible to determine an overall measure of
survey quality to compare the ACS and long-form data.” 

Second, the Bureau expressed concern about our focus on single-year ACS 
data and our analysis of measurement errors in the ACS. Any analysis of 
measurement errors in the ACS necessarily must focus on single-year 
data since those are the only ACS data that exist. Moreover, our 
methodology for determining relative measurement error is fully 
consistent with two previously stated Bureau positions. In the statement
to OMB justifying the need for the Census 2000 Supplementary Survey, 
the Bureau reported that the primary need for the 2000 ACS data “…is to 
determine how well ACS data compare with long-form data from Census 
2000.” In addition, the Bureau provided users with the following 
statement on their own Web site: “The Census 2000 Supplementary Survey 
[ACS] data provided an early look at the detailed characteristics of 
the U.S. population for 2000. However, as the official census sample 
data become available, they should be used instead of the Census 2000
Supplementary Survey to describe the population in 2000 and to look at 
changes from 1990 to 2000.” This statement clearly implies that the 
Bureau agrees that the ACS data are less accurate. 

Third, the Bureau stated that we should have addressed the use of 
income and poverty data, in the official OMB measures, based on the 
Current Population Survey (CPS) and not based on the corresponding long-
form data. This statement is incorrect. We addressed this issue in our 
discussion, comparing the differences between the CPS, census long-
form, and ACS data. In the report, we compared two poverty measures and 
found that at the national level, the long-form data were closer to the 
CPS data than the ACS data. 

Finally, the Bureau disagreed with our description of the list of 
federal agency justifications, provided to OMB in April 2002, as 
incomplete, stating that it was “complete” when it was submitted. This 
statement is inconsistent with (1) the fact that the list provided to 
OMB was annotated as a “draft” and (2) our later discussions with 
Bureau officials in which they confirmed that not all agencies have yet 
submitted a final list of justifications for ACS questions. 

As agreed with your office, unless you publicly announce its contents 
earlier, we plan no further distribution of this report until 30 days 
from its issue date. At that time, we will send copies to other 
interested congressional committees, the Secretary of Commerce, the 
Director of the Bureau of the Census, the Secretary of Housing and
Urban Development, and the Administrator of the Office of Information 
and Regulatory Affairs of the Office of Management and Budget. Copies 
will be made available to others on request. In addition, the report 
will be available at no charge at the GAO Web site at [hyperlink, 
http://www.gao.gov]. Tanya Cruz, Robert Goldenkoff, Andrea Levine, 
Christopher Miller, Patrick Mullen, and Theodore Saks made major 
contributions to this report. If you have questions about this report, 
you may contact me on (202) 512-9750. 

Signed by: 

Robert P. Parker: 
Chief Statistician: 

Enclosure: 

[End of section] 

Enclosure: 

The Secretary Of Commerce: 
Washington, D.C. 20230: 

September 25, 2002: 

Mr. Robert Parker: 
Chief Statistician: 
U.S. General Accounting Office: 
Washington, DC 20548: 

Dear Mr. Parker: 

The U.S. Department of Commerce appreciates the opportunity to comment 
on the General Accounting Office's draft document entitled The American 
Community Survey: Accuracy and Timeliness Issues. The Department of 
Commerce's comments on this report are enclosed. 

Sincerely, 

Signed by: 

Donald L. Evans: 

Enclosure: 

U.S. Department of Commerce: 
U.S. Census Bureau: 

Comments on the General Accounting Office Draft Report: 
The American Community Survey: Accuracy and Timeliness Issues: 

The U.S. Census Bureau appreciates the opportunity to comment on the 
draft General Accounting Office (GAO) Report, The American Community 
Survey: Accuracy and Timeliness Issues. 

Finding One: Annual ACS Data Less Accurate but More Timely than Long 
Form; Federal Agencies Need Additional Information for Transition to 
ACS: 

The Census Bureau concurs that the ACS data are more timely than the 
long form, but disagrees with the suggestion that the ACS data are less 
accurate. GAO's conclusion about accuracy is incomplete, because it 
focuses narrowly on sample size and minimizes other aspects of 
accuracy, most importantly timeliness. Over any given decade, the long 
form data products will age, providing less and less accurate 
representations of current circumstances. The ACS, in contrast, will 
provide an ongoing profile of the Nation's people and economy. 

GAO focused on single-year ACS data and its sampling error when it 
concluded that the annual ACS data are less accurate than the census 
long form data. This focus ignores both five-year average ACS estimates 
and nonsampling error. The Census Bureau designed the ACS so that five 
years of aggregated data would replace the long form. It is true that 
the decennial census long form's 20 million housing unit sample size 
will result in less sampling error than the ACS one-year 3 million and 
five-year 15 million housing unit sample sizes. A more precise finding 
would be that the annual ACS estimates will contain substantially more 
sampling error than the long form estimates, but that the five-year ACS 
estimates will contain only slightly more sampling error. 
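The magnitude of the comparison in this paragraph follows from the usual rule that sampling error scales inversely with the square root of sample size. A minimal sketch of that rule, assuming a simple random sample and ignoring design effects such as clustering (so the figures are illustrative only):

```python
import math

# Relative sampling error versus the long form, assuming error
# proportional to 1 / sqrt(sample size) -- a simplification that
# ignores clustering and other design effects.
long_form = 20e6  # housing units in the decennial long-form sample
acs_1yr = 3e6     # one-year ACS sample
acs_5yr = 15e6    # five-year aggregated ACS sample

def error_ratio(n, baseline=long_form):
    """Sampling error of a sample of size n relative to the baseline."""
    return math.sqrt(baseline / n)

print(round(error_ratio(acs_1yr), 2))  # about 2.58 times the long form
print(round(error_ratio(acs_5yr), 2))  # about 1.15 times the long form
```

Under this simplification, the one-year ACS estimates carry roughly two and a half times the long form's sampling error, while the five-year estimates carry only about 15 percent more, consistent with the Bureau's "substantially more" versus "only slightly more" distinction.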

GAO chooses to focus almost entirely on sampling error when examining 
the comparative accuracy and quality of the ACS and long form 
estimates. The OMB guidelines on data quality, however, make clear that 
the quality of a survey should be judged from an analysis of user needs 
and the totality of quality characteristics, not a narrow examination 
of sampling error. [Footnote 57] GAO correctly notes that data quality 
should be assessed by examining accuracy, timeliness, relevance, and 
accessibility. GAO's focus on sampling error is in some degree 
understandable, as sampling error is much more easily measured than 
nonsampling error or the other three elements of quality. However, the 
choice is misleading as Census Bureau research supports the conclusion 
that sampling error will be greater in the ACS than in the long form 
but suggests that nonsampling error will be less. 

While GAO correctly notes that the ACS will produce one-, three-, and 
five-year estimates, it chose to compare only the one-year estimates to 
the census long form sample estimates. As noted above, the Census 
Bureau 
designed the ACS so that five years of aggregated data from the ACS 
would replace the long form sample estimates. Pending funding, the five-
year ACS estimates will be available each year starting in 2008 and can 
be substituted for the single-year, point-in-time long form estimates 
without an overall loss in data quality. Assuming demographics continue 
to change over the decade, for any given area, the five-year estimates 
released in 2008 will be more accurate than the decennial long form 
estimates. This is because they will more closely reflect the area's 
current conditions than the long form estimates from the 2000 decennial 
census. 

The ACS estimates' slightly larger sampling error should be compensated 
for by their expected lower nonsampling error. First, GAO minimizes 
available data demonstrating consistently higher item response rates in 
the ACS than in the 2000 decennial long form. Second, although high 
unit response is another key indicator and critical component of survey 
quality, GAO chooses not to acknowledge the high unit response rates 
for the ACS. Third, the draft report ignores available data that the 
ACS provided very good coverage for historically undercounted 
populations, another critical indicator of survey accuracy and quality. 

GAO's narrow focus on sampling error led to another key 
misunderstanding regarding measurement error. The report's conclusion 
that greater measurement error exists in the ACS than the long form 
sample is not substantiated. This conclusion incorrectly assumes that, 
because the decennial census long form sample is larger, the long form 
estimates contain less measurement error. The draft report uses 
differences in the ACS and long form estimates to conclude that the ACS 
has more measurement error and is therefore less accurate. However, the 
assumption that the long form estimates are the benchmark does not 
acknowledge error associated with the long form or the many factors 
that could have led to the observed differences. Long form estimates of 
certain indicators, such as income, may not be the "gold standard" 
implied by their use as a benchmark in the GAO report. For example, the 
official measurements of income and poverty defined by the Office of 
Management and Budget (OMB) are from the Current Population Survey 
(CPS) annual demographic supplement. 

Finding Three: Federal Agencies Justify ACS Questions, but Uncertainty 
Remains on Extent of ACS Data Use: 

The Census Bureau concurs with GAO's finding that federal agencies have 
justified the ACS questions and believes that the ACS data will be used 
by federal agencies. The Census Bureau, through the auspices of OMB, 
sought input from federal agencies regarding the legally required/ 
authorized uses of the ACS data by these agencies. The focus was on the 
agencies' intended use, because their actual use cannot be determined 
definitively until after the survey is taken and the data are 
available. The list of justifications was complete at the time the 
Census Bureau submitted it to OMB. 

Finding Four: Duplicate or Similar Questions in ACS and Other Federal 
Surveys: 

The Census Bureau acknowledges that certain questions on the ACS are 
similar to those on other surveys. However, important reasons 
necessitate some overlap. Large national surveys such as the CPS, the 
Survey of Income and Program Participation (SIPP), and the AHS collect 
complex and specific information, focusing in depth on key topics, thus 
requiring large national samples. The ACS, in contrast, is the only 
survey that would provide information at the smallest geographic levels 
on a wide variety of topics. The government needs both the complex 
concepts measured on the national surveys and the indicators measured 
on the ACS. 

Finding Five: Conducting the ACS as a Voluntary Survey Would Most 
Likely Result in Higher Costs: 

The Census Bureau concurs with GAO that field testing is required to 
calibrate an estimate of how much more it would cost to take the ACS as 
a voluntary survey and is developing plans to conduct such a test as 
early as 2003. The Census Bureau also concurs that converting the 
survey to a voluntary one would result in lower mail response rates, 
meaning more cases would have to be resolved by more expensive personal 
visits. 

Census Bureau analysis supports a preliminary estimate that the ACS 
would cost between $20 million and $35 million more per year if it were 
taken as a voluntary survey. Any estimate of increased cost, however, 
is extremely assumption dependent. The Census Bureau's lower-range 
estimate is based on 1993 work evaluating short-form response; the 
upper bound is also based on this research, but it takes into account 
the general decline in response rates noted in survey research over the 
past decade. Both the upper and lower bound estimates assume that the 
Census Bureau will act to maintain acceptable survey quality (that is, 
hold the standard errors of the survey estimates constant). Without 
field testing, however, the appropriate response assumptions cannot be 
determined. 

Finally, aspects of data quality other than the effect on response 
rates are not addressed in the Census Bureau's preliminary cost 
estimates. Any field testing of the ACS as a voluntary survey should 
also evaluate how a switch to voluntary reporting would affect the 
quality of the ACS data. 

Recommendations: 

GAO's first recommendation is that the Census Bureau revise and expand 
its quality testing and evaluation program to facilitate the transition 
of federal agencies to using the ACS data in 2004 and beyond. Subject 
to appropriate funding levels, the Census Bureau concurs with this 
recommendation and intends to develop a formal transition plan this 
year. The Census Bureau cannot determine at this time whether this 
transition plan will accept each and every one of GAO's sub-
recommendations for additional research, but the plan will 
comprehensively address the needs of the federal user community. The 
transition plan will analyze and prioritize a number of transition 
issues, not just those specified by GAO. 

GAO's second recommendation is that the Census Bureau review the Annual 
Demographic Supplement (ADS) to the CPS and the AHS to determine if any 
questions can be eliminated from either of these two surveys due to 
their duplication of the ACS questions. The Census Bureau is always 
looking for opportunities to streamline, clarify, and reduce respondent 
burden, and will bring this recommendation to the attention of the 
Office of Statistical Policy at the Office of Management and Budget and 
the sponsoring agencies. It may be that full ACS implementation will 
allow elimination of some duplication. GAO should note that substantial 
testing will be required before changes can be made in surveys that 
provide key national social indicators, and that survey methodology has 
shown that even minor changes in surveys can have major unintended 
consequences. 

[End of enclosure] 

[End of section] 

Footnotes: 

[1] Article I of the United States Constitution requires an enumeration 
of the population, every 10 years, for purposes of apportionment. See 
U.S. Constitution art. I, sec. 2, cl. 3. To implement this 
constitutional requirement, Congress enacted 13 U.S.C. § 141, which 
requires a decennial census of population. 

[2] For the 2000 Decennial Census, the long form was mailed to 19 
million housing units, or 1 out of every 6 units. The Census tract is 
the smallest level of geographic entity for which long-form data are 
available. Census tracts are statistical entities within a county and 
are defined by local data users. Generally, tracts have a population 
between 2,500 and 8,000 people. 

[3] For a discussion of the Census Bureau’s cost estimates for the 2010 
Decennial Census, see U.S. Bureau of the Census, “Potential Life Cycle 
Savings for the 2010 Census” (Washington, D.C.: June 2001). 

[4] See U.S. General Accounting Office, Decennial Census: Overview of 
Historical Census Issues, GAO/GGD 98-103 (Washington, D.C.: May 1998). 

[5] Population-size criteria reflect population in 2000. 

[6] The Bureau reported that standard errors of these annual data would 
correspond to a 12 percent coefficient of variation for a 10 percent 
estimate, which implies a 90 percent confidence interval of 10.0 ± 2.0. 
For more details, see Charles Alexander, “American Community Survey 
Data for Economic Analysis,” paper presented to the Census Advisory 
Committee of the American Economic Association (Suitland, Md.: October 
2001). 
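The figures in footnote 6 can be reproduced with a standard normal approximation. A sketch, assuming a critical value of z = 1.645 for a 90 percent confidence interval (the approximation is ours, not the Bureau's):

```python
# Verifying footnote 6: a 12 percent coefficient of variation on a
# 10 percent estimate, under a normal approximation (an assumption).
estimate = 10.0  # estimate, in percent
cv = 0.12        # coefficient of variation
z90 = 1.645      # normal critical value for 90 percent confidence

standard_error = cv * estimate     # 1.2 percentage points
half_width = z90 * standard_error  # about 2.0 percentage points

print(round(standard_error, 1))  # 1.2
print(round(half_width, 1))      # 2.0
```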

[7] Alexander. 

[8] This program, also known as the Intercensal Demographic Estimates 
and the Population Estimates Programs, is mandated by 13 U.S.C. § 181. 
In this program, administrative record data on births, deaths, 
immigration, and emigration are used to produce annual population 
estimates—by state, age, sex, race, and Hispanic origin—that are then 
used to implement federal programs. For a description of this program, 
see “Population Estimates: Concepts” at the Census Bureau’s Web site 
[hyperlink, http://www.census.gov]. Some decennial census data users 
have recommended that the ACS estimates should be used to improve the 
intercensal estimates. For example, see Linda Gage, Department of 
Finance, California, statement prepared for the Subcommittee on the
Census, House Committee on Government Reform, 107th Cong. 1st sess., 
2001, 107-9. Census Bureau plans for such improvements are discussed in 
Charles Alexander and Signe Wetrogan, “Integrating the American 
Community Survey and the Intercensal Demographic Estimates Program” 
(paper presented at a meeting of the American Statistical Association, 
Indianapolis, Ind.: August 14, 2000). 

[9] For the Census Bureau’s income and poverty estimates program, see 
“Small Area Income and Poverty Estimates” at the Bureau’s Web site 
[hyperlink, http://www.census.gov]; for the BLS labor force estimates 
program, see “Local Area Unemployment Statistics” at the BLS Web site 
[hyperlink, http://www.bls.gov]. (Although the Census Bureau conducts 
the CPS, it is largely funded by BLS, the agency responsible for 
preparing the official estimates of unemployment and related labor 
force characteristics.) 

[10] For a description of the program, see “Regional Economic Accounts” 
at the Bureau of Economic Analysis Web site [hyperlink, 
http://www.bea.gov]. 

[11] See U.S. Office of Management and Budget, Budget of the United 
States Government: Appendix (Washington, D.C.: 2002) 215. 

[12] The results of a special national survey conducted for 2000, 
officially titled the “Census 2000 Supplementary Survey” and called
the “C2SS” by the Bureau, have been published. However, the Bureau has 
recommended that these data should not be used when the 2000 long-form 
data become available. The supplementary survey for 2000, as well as 
similar ones for 2001 and 2002, was conducted by the Bureau using the 
ACS questionnaire and survey methodology to provide testing of the ACS. 
For purposes of this GAO report, “ACS” refers to both the ACS surveys 
conducted at test sites throughout the country and to these 
supplementary annual surveys. For additional information, see “What Are 
Supplementary Surveys” and “What is the American Community Survey” at 
the Census Bureau’s Web site. 

[13] The ACS development program refers to testing, research, and 
development activities the Bureau plans to conduct until the ACS is 
implemented in 2003. 

[14] See U.S. Bureau of the Census, Meeting 21st Century Demographic 
Data Needs—Implementing the American Community Survey: May 2002, Report 
2: Demonstrating Survey Quality (Washington, D.C.: May 2002). For a 
discussion of the guidelines, see U.S. Office of Management and Budget, 
Statistical Policy Working Paper 31, Measuring and Reporting Sources of 
Errors in Surveys (Washington, D.C.: July 2001). 

[15] See Legal Opinion B-289852 (April 4, 2002) at GAO’s Web site 
[hyperlink, http://www.gao.gov]. 

[16] The definitions of the categories used for the 2003 ACS were 
essentially the same as those used for the 2000 long form, except
that the definitions were modified to add the ACS when the 2000 
Decennial Census was referenced in the criteria. 

[17] The Paperwork Reduction Act (44 U.S.C. § 3507) also required that 
a notice of the request be published in the Federal Register; it 
appeared on May 1, 2002 (67 Federal Register 21629-30). 

[18] See Office of Management and Budget, Notice of Action 0607-0810 
(June 28, 2002). 

[19] U.S. Bureau of the Census, Meeting 21st Century Demographic Data 
Needs—Implementing the American Community Survey: July 2001, Report 1: 
Demonstrating Operational Feasibility (Washington, D.C.: July 2001). 

[20] To evaluate the quality of the ACS program, we have primarily used 
guidelines for measuring survey errors in U.S. Office of Management and 
Budget, Statistical Policy Working Paper 31, Measuring and Reporting 
Sources of Errors in Surveys (Washington, D.C.: July 2001). These 
guidelines are similar to guidelines published by Statistics Canada, 
the International Monetary Fund, and in OMB’s newly issued “Guidelines 
for Ensuring and Maximizing the Quality, Objectivity, Utility, and
Integrity of Information Disseminated by Federal Agencies.” 

[21] For a discussion of the impact on federal programs resulting from 
the replacement of the 2000 intercensal population estimates with the 
2000 Census population counts, see U.S. General Accounting Office, 
Formula Grants: 2000 Census Will Redistribute Federal Funding among 
States, GAO-02-1062 (Washington, D.C.: forthcoming). 

[22] The second major type of nonresponse error, unit nonresponse, 
which is the complete failure to obtain data from a respondent, was 
very small for both the 2000 long form and the supplementary survey. 

[23] Because the sample size of the supplementary surveys is about one-
fourth that of the proposed 2003 ACS, these differences may overstate 
those between the 2003 ACS data and comparable long-form data. 

[24] Although the Bureau did compare 2000 Census and 2000 ACS results 
in one of its evaluation reports, the comparisons were limited to 
short-form items. See U.S. Census Bureau, Meeting 21st Century 
Demographic Data Needs—Implementing the American Community Survey: May 
2002, Report 2: Demonstrating Survey Quality (Washington, D.C.: May 
2002). 

[25] These differences, discussed later in the report, are the 
exclusion of people living in group quarters and the different treatment
of people with seasonal residences. 

[26] For information on income and poverty data, see “Guidance on 
Survey Differences in Income and Poverty Estimates” (March 19, 2002) at 
the Census Bureau’s Web site. For information on labor force data, see 
Charles Alexander, Sharon Brown, and Hugh Knox, “American Community 
Survey Data for Economic Analysis” (paper presented at a meeting of the 
Federal Economics Statistics Advisory Committee, Washington, D.C., 
December 14, 2001). 

[27] The Census Bureau and BLS, however, use detailed geographic 
information from the long form in constructing model-based estimates of 
income, poverty, and unemployment for small geographic areas. 

[28] See “Annual Update of the HHS Poverty Guidelines,” 67 Federal 
Register 6931-33 (February 14, 2002). 

[29] Comparisons with the AHS were not possible because it is a 
biennial survey and no data at the national level were available for
2000. 

[30] For a discussion of potential ACS use in these models, see 
National Academy of Sciences, Small Area Income and Poverty Estimates: 
A Workshop (Washington, D.C., 2000) 123. 

[31] See U.S. Census Bureau, Money Income in the United States: 2000 
(Washington, D.C.: September 2001). 

[32] National Academy of Sciences, Choosing the Right Formula: Initial 
Report (Washington, D.C.: 2001). 

[33] This type of assistance is required by OMB’s data quality 
guidelines and is recommended in National Academy of Sciences, 
Principles and Practices for a Federal Statistical Agency (Washington, 
D.C.: 2001). 

[34] See “Preliminary Assessment of the Comparability of Census 2000 
Long Form Estimates with Census 2000 Supplementary Survey Estimates,” 
6, at the Census Bureau’s Web site. 

[35] The program was included in the Census Bureau’s “American 
Community Survey Alert, June 2002,” which appears at the Bureau’s Web 
site. 

[36] “Preliminary Assessment of the Comparability of Census 2000 Long 
Form Estimates with Census 2000 Supplementary Survey Estimates,” 6. 

[37] For more information, see “Preliminary Assessment of the 
Comparability of Census 2000 Long Form Estimates with Census 2000 
Supplementary Survey Estimates,” 3. 

[38] People living in group quarters (e.g., nursing homes, correctional 
institutions, college dormitories, and military quarters) were excluded 
from the 2000 supplementary survey data in an effort to reduce the 
reporting burden on the operators of these facilities. They were also 
excluded from the 2001-02 supplementary surveys and the proposed 2003 
ACS; they will be covered in the ACS beginning with 2004. 

[39] For the 2000 Decennial Census long form, the lists provided to the 
Bureau by the federal agencies were not formally approved by the 
agencies. On June 13, 2002, the General Counsel of the Department of 
Commerce sent a letter to the General Counsels of the agencies that 
submitted information for the lists, requesting formal approval. A 
final list, based on the responses to the request, which were due July 
13, 2002, was not available at the time this report was prepared. 

[40] OMB, Notice of Action 0607-0810 (June 28, 2002). 

[41] See Census Bureau, supporting statement, para. A4, provided to OMB 
by the Bureau as part of the “Paperwork Reduction Act Submission for 
the 2003 ACS.” 

[42] For information on SIPP, including comparisons with other surveys, 
see SIPP Users’ Guide at the Census Bureau’s Web site. 

[43] See Marc I. Roemer, “Assessing the Quality of the March Current 
Population Survey and the Survey of Income and Program Participation 
Income Estimates, 1990-1996” (June 16, 2000) at the Census Bureau’s Web 
site. 

[44] See D. A. Dillman and others, “Effects of Benefits Appeals, 
Mandatory Appeals, and Variations in Statement of Confidentiality 
on Completion Rates for Census Questionnaires,” Public Opinion 
Quarterly 60 (1996): 376-89. 

[45] D. R. Tulp Jr. and others, “Nonresponse Under Mandatory vs. 
Voluntary Reporting in the 1989 Survey of Pollution Abatement Costs and 
Expenditures (PACE)” (U.S. Census Bureau, Suitland, Md., photocopy). 

[46] See U.S. Office of Management and Budget, Statistical Policy 
Working Paper 31, Measuring and Reporting Sources of Errors in Surveys 
(Washington, D.C.: July 2001) and B. K. Atrostic and others, 
“Nonresponse in U.S. Government Household Surveys: Consistent Measures, 
Recent Trends, and New Insights,” Journal of Official Statistics, 17:2 
(2001): 209-26. 

[47] For a description of this survey, see “Health and Retirement 
Study” at the Web site of the Institute for Social Research 
[hyperlink, http://www.isr.umich.edu]. 

[48] For a description of this survey, see “Estimation Procedures in 
the 1996 Medical Expenditures Panel Survey Household Component” at the 
Web site of the Agency for Health Care Policy and Research [hyperlink, 
http://www.meps.ahcpr.gov]. 

[49] Jane M. Shepard and Steve Everett, “Cooperation Tracking Survey: 
April 2002 Update” at the Council for Marketing and Opinion Research 
Web site [hyperlink, http://www.cmor.org]. 

[50] This discussion does not cover interview, outreach, and promotion 
efforts associated with the 2000-02 Census Supplementary Survey 
program, conducted with the ACS questionnaire and survey methodology 
and used to test the quality of these data. 

[51] The 60.9 percent response rate roughly reflects the percentage of 
mail surveys returned before the start of follow-up interviewing. After 
the processing was completed, 78.5 percent of the responses were based 
on mailed report forms, 11.5 percent on telephone interviews, and 10.5 
percent on personal interviews. Information on item nonresponse rates 
is not available. For additional information, see Susan Love and Greg 
Diffendal, “The American Community Survey Monthly Response Rates, by 
Mode” (paper presented at the American Community Survey Symposium, 
Bureau of the Census, Washington, D.C.: March 1998). 

[52] The Privacy Act of 1974, 5 U.S.C. § 552a, requires all federal 
agencies that collect information to advise respondents under what
authority the information is being collected, how the information will 
be used, whether participation is required, and the consequences of not 
responding. 

[53] U.S. General Accounting Office, 2000 Census: Review of Partnership 
Program Highlights Best Practices for Future Operations, GAO-01-579 
(Washington, D.C.: August 2001). 

[54] See U.S. General Accounting Office, Formula Grants: Effects of 
Adjusted Population Counts on Federal Funding to States, GAO/HEHS-99-69; 
Means-Tested Programs: Determining Financial Eligibility is Cumbersome 
and Can be Simplified, GAO-02-58; and Title I Funding: Poor Children 
Benefit Though Funding Per Poor Child Differs, GAO-02-242 (Washington, 
D.C.: February 1999, November 2001, and January 2002). 

[55] In addition to the studies used by the Census Bureau, see John 
Gawalt, “Research and Development in Industry: 1990, NSF 94-304” 
(Washington, D.C.: 1994). 

[56] See Bureau of Labor Statistics, “A Brief Study of Findings from 
the CES Enrollment Research” (unpublished: November 1996). 

[57] U.S. Office of Management and Budget, Statistical Policy Working 
Paper 31, Measuring and Reporting Sources of Error in Surveys 
(Washington, D.C.: July 2001). 

[End of section] 

GAO's Mission: 

The General Accounting Office, the investigative arm of Congress, 
exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO’s commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO’s Web site [hyperlink, 
http://www.gao.gov] contains abstracts and full-text files of current 
reports and testimony and an expanding archive of older products. The 
Web site features a search engine to help you locate documents using 
key words and phrases. You can print these documents in their entirety, 
including charts and other graphics. 

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as “Today’s Reports,” on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
[hyperlink, http://www.gao.gov] and select “Subscribe to daily E-mail 
alert for newly released products” under the GAO Reports heading. 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. General Accounting Office: 
441 G Street NW, Room LM: 
Washington, D.C. 20548: 

To order by Phone: Voice: (202) 512-6000: 
TDD: (202) 512-2537: 
Fax: (202) 512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Public Affairs: 

Jeff Nelligan, managing director, NelliganJ@gao.gov, (202) 512-4800: 
U.S. General Accounting Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: