This is the accessible text file for GAO report number GAO-07-62 
entitled 'Federal Information Collection: A Reexamination of the 
Portfolio of Major Federal Household Surveys Is Needed' which was 
released on December 15, 2006. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to Congressional Requesters: 

United States Government Accountability Office: 

GAO: 

November 2006: 

Federal Information Collection: 

A Reexamination of the Portfolio of Major Federal Household Surveys Is 
Needed: 

GAO-07-62: 

GAO Highlights: 

Highlights of GAO-07-62, a report to congressional requesters 

Why GAO Did This Study: 

Federal statistical information is used to make appropriate decisions 
about budgets, employment, and investments. GAO was asked to (1) 
describe selected characteristics of federally funded statistical or 
research surveys, (2) describe agencies’ and OMB’s roles in identifying 
and preventing unnecessary duplication, (3) examine selected surveys to 
assess whether unnecessary duplication exists in areas with similar 
subject matter, and (4) describe selected agencies’ efforts to improve 
the efficiency and relevance of surveys. GAO reviewed agency documents 
and interviewed officials. Using this information and prior GAO work, 
GAO identified surveys with potential unnecessary duplication. 

What GAO Found: 

At the time of GAO’s review, the Office of Management and Budget (OMB) 
had approved 584 ongoing federal statistical or research surveys, of 
which 40 percent were administered to individuals and households. Under 
the Paperwork Reduction Act, agencies are to certify to OMB that each 
information collection does not unnecessarily duplicate existing 
information, and OMB is responsible for reviewing the content of 
agencies’ submissions. OMB provides guidance that agencies can use to 
comply with the approval process and avoid unnecessary duplication, 
which OMB defines as information similar to or corresponding to 
information that could serve the agency’s purpose and is already 
accessible to the agency. 

Based on this definition, the seven surveys GAO reviewed could be 
considered to contain necessary duplication. GAO identified three 
subject areas (people without health insurance, people with 
disabilities, and housing) covered in multiple major surveys that could 
potentially involve unnecessary duplication. Although they have 
similarities, these surveys originated at various times over several 
decades and differ in their purposes, methodologies, definitions, and 
measurement techniques. These differences can produce widely varying 
estimates on similar subjects. For example, the estimate for people who 
were uninsured for a full year from one survey is over 50 percent 
higher than another survey’s estimate for the same year. While agencies 
have undertaken efforts to standardize definitions and explain some of 
the differences among estimates, these issues continue to present 
challenges. In some cases, agencies have reexamined their existing 
surveys to reprioritize, redesign, combine, and eliminate some of them. 
Agencies have also used administrative data in conjunction with their 
surveys to enhance the quality of information and limit respondent 
burden. These actions have been limited in scope, however. In addition, 
two major changes to the portfolio of major federal household surveys 
are underway. The American Community Survey, which is intended to 
replace the long-form decennial census starting in 2010, is considered 
the cornerstone of the government's efforts to provide data on 
population and housing characteristics and will be used to distribute 
billions of dollars in federal funding. Officials are also redesigning 
the Survey of Income and Program Participation, which is used in 
estimating future costs of certain government benefit programs. 

In light of these upcoming changes, OMB recognizes that the federal 
government can build upon agencies’ practices of reexamining individual 
surveys. To ensure that surveys initiated under conditions, priorities, 
and approaches that existed decades ago are able to cost-effectively 
meet current and emerging information needs, there is a need to 
undertake a comprehensive reexamination of the long-standing portfolio 
of major federal household surveys. The Interagency Council on 
Statistical Policy (ICSP), which is chaired by OMB and made up of the 
heads of the major statistical agencies, is responsible for 
coordinating statistical work and has the leadership authority to 
undertake this effort. 

What GAO Recommends: 

Upcoming changes provide an opportunity to go beyond individual agency 
efforts and examine the portfolio of major federal household surveys. 
Therefore, GAO recommends that the Director of OMB work with the ICSP 
to plan for a comprehensive reexamination to redesign or reprioritize 
the major federal household surveys. OMB and the Department of Housing 
and Urban Development agreed with GAO’s recommendation. The Department 
of Health and Human Services stated that a reexamination was not 
warranted without evidence of unnecessary duplication, but GAO's 
recommendation is based on other factors, including the upcoming 
changes. 

[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-62]. 

To view the full product, including the scope and methodology, click on 
the link above. For more information, contact Bernice Steinhardt at 
(202) 512-6543 or steinhardtb@gao.gov. 

[End of Section] 

Contents: 

Letter: 

Results in Brief: 

Background: 

More Than 500 Statistical or Research Surveys Have Been Approved: 

Agencies and OMB Have Procedures Intended to Identify and Prevent 
Unnecessary Duplication: 

Duplicative Content in Selected Surveys Exists, but Survey Purposes and 
Scope Differ: 

Agencies Have Undertaken Efforts to Improve the Efficiency and 
Relevance of Surveys: 

Conclusions: 

Recommendation for Executive Action: 

Agency Comments: 

Appendix I: Scope and Methodology: 

Appendix II: Comments from the Department of Housing and Urban 
Development: 

Appendix III: Comments from the Department of Health & Human Services: 

Appendix IV: GAO Contact and Staff Acknowledgments: 

Tables: 

Table 1: Selected Surveys That Cover Similar Content in Three Subject 
Areas: 

Table 2: Characteristics of Selected Research and Statistical Surveys: 

Table 3: Uninsured Estimates from Selected Surveys: 

Table 4: Estimated Population of Persons with Disabilities, by Data 
Source and Different Categories of Disability: 

Figures: 

Figure 1: Primary Purpose of OMB-approved Information Collections: 

Figure 2: Respondents to OMB-approved Statistical and Research Surveys: 

Figure 3: Burden Hour Ranges of the 584 Research and General Purpose 
Statistics Surveys: 

United States Government Accountability Office: 
Washington, DC 20548: 

November 15, 2006: 

The Honorable Tom Davis: 
Chairman, Committee on Government Reform: 
House of Representatives: 

The Honorable Michael R. Turner: 
Chairman, Subcommittee on Federalism and the Census: 
Committee on Government Reform: 
House of Representatives: 

Governments, businesses, and citizens depend on relevant and timely 
federal statistical information to make appropriate 
decisions about budgets, employment, investments, and many other 
essential topics. Given the importance of federally funded surveys to 
the quality of statistical information, and the ever-increasing demand 
for more and better information within limited resources, it is 
essential to maximize their utility. To this end, officials 
implementing federally funded surveys must avoid unnecessary 
duplication with existing information sources, as mandated by the 
Paperwork Reduction Act of 1980 (PRA), as amended, and work to ensure 
efficiency in areas where subject matter is similar.[Footnote 1] As 
highlighted in our 21st Century Challenges report, the federal 
government must address and adapt to a range of major trends and 
challenges in the nation and the world--including, among other things, 
a long-term structural fiscal imbalance and a transformation to a 
knowledge-based economy.[Footnote 2] Statistical programs are likely to 
continue to face constrained resources in the future, and the changing 
information needs of our society and economy raise important questions 
regarding the portfolio of major federal household surveys--a portfolio 
that has been developing for more than six decades in response to 
conditions and information needs that have changed over time. 

In light of the importance of minimizing unnecessary duplication 
between statistical and research surveys, at your request this report 
(1) identifies the number and selected characteristics of Office of 
Management and Budget (OMB)-approved federally funded statistical or 
research surveys, (2) describes agencies' and OMB's roles in 
identifying and preventing unnecessary duplication, (3) examines 
selected surveys to assess whether unnecessary duplication exists in 
areas with similar subject matter, and (4) describes selected efforts 
agencies have used to improve the efficiency and relevance of surveys. 
OMB defines the term unnecessary duplication as information similar to 
or corresponding to information that could serve the agency's purpose 
and is already accessible to the agency. Therefore, as agreed, our 
review focused on several surveys that we identified as having the 
potential to be unnecessarily duplicative because they contain 
similar information. 

To address the first objective, identifying the number and 
characteristics of OMB-approved federally funded surveys, we reviewed 
the information collections that OMB approved under the PRA. We used 
information from the database of OMB-approved federally funded 
information collections.[Footnote 3] In 2005 we conducted a reliability 
assessment of the database of OMB-approved information collections and 
concluded that the data were accurate and complete for the purposes of 
that report.[Footnote 4] Because that assessment was recent, we did 
not repeat it. As OMB's approval can be in 
effect for a maximum of 3 years, and may be for a shorter period, our 
review reflects a snapshot in time of all those collections that OMB 
had approved for use as of August 7, 2006. We focused on two categories 
of information collections: general purpose statistics, which are 
surveys whose results are to be used for statistical compilations of 
general public interest, and research surveys.[Footnote 5] 

For the second objective, describing agencies' and OMB's roles in 
identifying and preventing unnecessary duplication, we reviewed the PRA 
requirements for both agencies and OMB. We interviewed clearance 
officers from the Departments of Commerce, Labor, and Health and Human 
Services to learn about their processes for submitting the information 
collection packages to OMB. These agencies were the top three agencies 
in terms of funding for statistical activities in fiscal year 2006. We 
also interviewed OMB officials regarding their role in approving 
information collections. 

For the third objective, examining selected surveys to assess whether 
unnecessary duplication exists in areas with similar subject matter, we 
reviewed our reports and literature and interviewed agency officials to 
identify areas of similar content covered in multiple surveys. We 
subsequently identified three subject areas with potentially 
unnecessary duplication based on similar content in the surveys: (1) 
people without health insurance, (2) people with disabilities, and (3) 
housing. Once we had identified these three subject areas, we analyzed 
information from literature and interviews we conducted to identify the 
current federally funded surveys that were cited as the major surveys 
on people without health insurance (Current Population Survey (CPS), 
National Health Interview Survey (NHIS), Medical Expenditure Panel 
Survey (MEPS), and Survey of Income and Program Participation (SIPP)) 
and disability (NHIS, National Health and Nutrition Examination Survey 
(NHANES), MEPS, SIPP, and the American Community Survey (ACS)) as shown 
in table 1. For the third area, housing, we relied on our earlier 
report that identified the potential unnecessary duplication between 
the ACS and American Housing Survey (AHS).[Footnote 6] One of the 
surveys we included, the Census Bureau's SIPP, will be reengineered. 
However, the content of the redesigned SIPP has not been determined, 
and as a result, it may continue to include questions on disability and 
people without health insurance, so we have included information 
relative to this long-standing survey in this report. 

Table 1: Selected Surveys That Cover Similar Content in Three Subject 
Areas: 

Survey: American Community Survey (ACS); 
Purpose: Will replace the decennial census long-form, and monitors 
changes in communities; 
People without health insurance: [Empty]; 
Disability: X; 
Housing: X. 

Survey: American Housing Survey (AHS); 
Purpose: Collects data on the nation's housing, including income, 
neighborhood quality, costs, equipment and fuels, and movement; 
People without health insurance: [Empty]; 
Disability: [Empty]; 
Housing: X. 

Survey: Current Population Survey (CPS) and the Annual Social and 
Economic Supplement (ASEC); 
Purpose: Obtains information on labor force characteristics for the 
U.S. population. (The ASEC in addition covers income, noncash benefits, 
and migration); 
People without health insurance: X; 
Disability: [Empty]; 
Housing: [Empty]. 

Survey: Medical Expenditure Panel Survey (MEPS); 
Purpose: Provides extensive information on health care use and costs; 
People without health insurance: X; 
Disability: X; 
Housing: [Empty]. 

Survey: National Health and Nutrition Examination Survey (NHANES); 
Purpose: Assesses the health and nutritional status of adults and 
children; 
People without health insurance: [Empty]; 
Disability: X; 
Housing: [Empty]. 

Survey: National Health Interview Survey (NHIS); 
Purpose: Monitors health of U.S. population on a variety of health 
topics; 
People without health insurance: X; 
Disability: X; 
Housing: [Empty]. 

Survey: Survey of Income and Program Participation (SIPP); 
Purpose: Collects source and amount of income, labor force information, 
program participation and eligibility data, and general demographic 
characteristics to measure the effectiveness of existing federal, 
state, and local programs and to estimate future costs and coverage for 
government programs; 
People without health insurance: X; 
Disability: X; 
Housing: [Empty]. 

Source: GAO analysis of selected surveys. 

[End of table] 

To learn more about the potentially duplicative content between these 
surveys, we reviewed relevant literature and agency documents. We also 
interviewed officials from OMB, Census Bureau at the Department of 
Commerce (DOC), the Bureau of Labor Statistics (BLS) at the Department 
of Labor (DOL), the National Center for Health Statistics (NCHS) and 
the Agency for Healthcare Research and Quality (AHRQ) at the Department 
of Health and Human Services (HHS), and the Division of Housing and 
Demographic Analysis at the Department of Housing and Urban Development 
(HUD). We also interviewed experts from organizations that focus on 
federal statistics, such as the Council of Professional Associations 
on Federal Statistics and the Committee on National Statistics of the 
National Academies. 

For the fourth objective, to describe selected agency efforts to 
improve the efficiency and relevance of surveys, we analyzed 
information from agency and OMB interviews, expert interviews as 
discussed above, and literature. We conducted our work in accordance 
with generally accepted government auditing standards from April 2005 
through June 2006. Appendix I provides a more complete description of 
our scope and methodology. 

Results in Brief: 

At the time of our review, OMB had approved 584 new and ongoing federal 
statistical or research surveys[Footnote 7] of which 40 percent were 
administered to individuals and households. About 35 percent of the 
approved statistical and research surveys each required 1,000 or fewer 
annual estimated burden hours (i.e., the amount of time for an average 
respondent to complete a survey, multiplied by the total number of 
respondents). 

Under the PRA, agencies are responsible for certifying to OMB that each 
information collection does not unnecessarily duplicate existing 
information. OMB defines unnecessary duplication as information 
similar to or corresponding to information that could serve the 
agency's purpose and is already accessible to the agency. In prior 
work, we found that some of these certifications were made without 
complete supporting information.[Footnote 8] When approving a survey, 
OMB is required to review the content of the agency's submission to 
ensure that each information collection is not unnecessarily 
duplicative. OMB also provides guidance that agencies can use to comply 
with the approval process, including guidance on when it is acceptable 
to duplicate questions in other surveys. An agency may consult with OMB 
before it submits an information collection for approval, and officials 
told us that early consultation can help identify and prevent 
unnecessary duplication. 

Based on OMB's definition of unnecessary duplication, the surveys we 
reviewed could be considered to contain necessary duplication. The 
seven surveys we reviewed have duplicative content and in some cases 
ask the same or similar questions in three subject areas: (1) people 
without health insurance (CPS, NHIS, MEPS, and SIPP), (2) people with 
disabilities (NHIS, NHANES, MEPS, SIPP, and ACS), and (3) housing (AHS 
and ACS). However, the agencies and OMB judged that this was not 
unnecessary duplication given the differences among the surveys. The 
surveys originated at various times over several decades, and some 
differ in their purposes and methodologies (such as the sampling 
methodologies) as well as in their definitions and measurement 
techniques (such as the time frames used). In some instances, the 
ability to link this information with other questions in the same 
survey can yield richer data that allow for a fuller description or 
understanding of specific topics. However, the resulting estimates of 
similar characteristics can be very different, which can be confusing. 
For example, the 2004 CPS estimate for people who were uninsured for a 
full year is over 50 percent higher than the NHIS estimate of the 
number of uninsured for that year. Interagency groups have undertaken 
efforts to explain or reconcile inconsistencies among surveys that 
address the same subject area, such as explaining the differences 
between estimates of the number of uninsured persons. 

In some cases, agencies have taken steps to enhance the relevance and 
efficiency of their surveys. For example, the Census Bureau undertook a 
review of its portfolio of manufacturing surveys and decided to 
eliminate several in order to undertake new surveys on the industrial 
sectors that were of growing importance to the economy. Agencies have 
also used administrative data in conjunction with their surveys, which 
has enhanced the quality of the information and limited respondent 
burden. 

At the same time, two major changes to the portfolio of major federal 
household surveys are upcoming. The ACS, which is intended to 
replace the long-form decennial census in 2010, is considered to be the 
cornerstone of the government's efforts to provide data on population 
and housing characteristics and will be used to distribute billions of 
dollars in federal funding. Efforts are also underway to redesign the 
SIPP, which is used in estimating future costs of certain government 
benefit programs. In light of these upcoming changes, OMB recognizes 
that the federal government should build upon agencies' practice of 
reexamining individual surveys. Providing greater coherence among 
surveys, particularly in definitions and time frames, could help reduce 
costs to the federal government and associated burden hours. The 
Interagency Council on Statistical Policy, which is chaired by OMB and 
made up of the heads of the major statistical agencies, is responsible 
for coordinating statistical work and has the leadership authority to 
undertake a comprehensive reexamination of the portfolio of major 
federal household surveys. 

The rollout of the ACS and the reengineering of the SIPP provide an 
opportunity to go beyond these individual efforts to examine the 
effectiveness and efficiency of the portfolio of major household 
surveys, which has developed over six decades. Therefore, we are 
recommending that the Director of OMB work with the Interagency Council 
on Statistical Policy to plan for a comprehensive reexamination to 
identify opportunities for redesigning or reprioritizing the portfolio 
of major federal household surveys. Such a reexamination would identify 
opportunities to ensure that major federal household surveys initiated 
under conditions, priorities, and approaches that existed decades ago 
are able to cost-effectively meet current and emerging information 
needs. 

OMB and HUD agreed with our recommendation, but OMB officials expressed 
concerns about the range of participants and the universe of surveys 
that might be involved in such a reexamination. In response, we revised 
the recommendation to clarify that OMB should work with the ICSP and 
focused the recommendation on seven surveys that are considered to be 
major federal household surveys. HHS stated that a reexamination was 
not warranted without evidence of unnecessary duplication, but our 
recommendation is based on other factors, including a need to provide 
greater coherence among the surveys and to take advantage of changes in 
the statistical system to reprioritize information needs and possibly 
help reduce costs to the federal government and associated burden 
hours. HHS also provided additional information that we incorporated as 
appropriate in the report. In addition, we obtained written comments 
from the DOC and informal electronic comments from the DOL, which we 
incorporated as appropriate in the report. 

Background: 

The purpose of the PRA is to (1) minimize the federal paperwork burden 
for individuals, small businesses, state and local governments, and 
other persons; (2) minimize the cost to the federal government of 
collecting, maintaining, using, and disseminating information; and (3) 
maximize the usefulness of information collected by the federal 
government. The PRA also aims to provide for timely and equitable 
dissemination of federal information; improve the quality and use of 
information to increase government accountability at a minimized cost; 
and manage information technology to improve performance and reduce 
burden, while improving the responsibility and accountability of OMB 
and the federal agencies to Congress and the public. 

To achieve these purposes, the PRA prohibits federal agencies from 
conducting or sponsoring an information collection unless they have 
prior approval from OMB. The PRA requires that information collections 
be approved by OMB when facts or opinions are solicited from 10 or more 
people. Under the law, OMB is required to determine that an agency 
information collection is necessary for the proper performance of the 
functions of the agency, including whether the information will have 
practical utility. 

The PRA requires every agency to establish a process for its chief 
information officer (CIO) to review program offices' proposed 
information collections, such as certifying that each proposed 
collection complies with the PRA, including ensuring that it is not 
unnecessarily duplicative. The agency is to provide two public notice 
periods--an initial 60-day notice period and a 30-day notice period 
after the information collection is submitted to OMB for 
approval.[Footnote 9] Agencies are responsible for consulting with 
members of the public and other affected agencies to solicit comments 
on, among other things, ways to minimize the burden on respondents, 
including through the use of automated collection techniques or other 
forms of information technology. According to an OMB official, this 
could include asking for comments on a proposal to use administrative 
data instead of survey data. 

Following satisfaction of these requirements, an agency is to submit 
its proposed information collection for OMB review, whether for a new 
information collection or re-approval of an existing one. Before 
submitting a proposed information collection for approval, an agency 
may invest substantial resources in preparing to conduct it, 
including, among other things, designing the collection, testing it, 
and consulting with users. For example, over the last 8 years, BLS has 
led an 
interagency effort designed to develop a measure of the employment rate 
of adults with disabilities pursuant to Executive Order 13078 signed by 
President Clinton in 1998. This effort has entailed planning, 
developing, and testing disability questions to add to the CPS. OMB is 
responsible for determining whether each information collection is 
necessary for the proper performance of the agency's functions. 
According to the Statistical Programs of the United States Government: 
Fiscal Year 2006, an estimated $5.4 billion in fiscal year 2006 was 
requested for statistical activities.[Footnote 10] 

The PRA also requires the establishment of the Interagency Council on 
Statistical Policy (ICSP). According to the Statistical Programs of the 
United States Government: Fiscal Year 2006, the ICSP is a vehicle for 
coordinating statistical work, particularly when activities and issues 
cut across agencies; for exchanging information about agency programs 
and activities; and for providing advice and counsel to OMB on 
statistical matters. 

The PRA also requires OMB to report annually on the paperwork burden 
imposed on the public by the federal government and on efforts to 
reduce this burden; OMB does so in Managing Information Collection: 
Information Collection Budget of the United States Government. For 
example, the 2006 Information Collection Budget reported on agency 
initiatives to reduce paperwork, such as HHS's assessment of its 
information collections with a large number of burden hours, which 
resulted in reducing the department's overall burden hours by over 36 
million in fiscal year 2005. 

OMB produces the annual Statistical Programs of the United States 
Government report to fulfill its responsibility under the PRA to 
prepare an annual report on statistical program funding. This document 
outlines the effects of congressional actions and the funding for 
statistics proposed in the President's current fiscal year budget, and 
highlights proposed program changes for federal statistical activities. 
It also describes a number of long-range planning initiatives to 
improve federal statistical programs, including making better use of 
existing data collections while protecting the confidentiality of 
statistical information. 

More Than 500 Statistical or Research Surveys Have Been Approved: 

At the time of our review, OMB had approved 584 new and ongoing 
statistical and research surveys as recorded in the database of OMB- 
approved information collections. OMB uses the database for tracking 
purposes, as it provides the only centralized information available on 
the characteristics of the surveys that OMB has approved. The database 
contains information on some, but not all, of the characteristics of 
the information collections. The information that agencies provide in 
the packages they submit to OMB for approval includes additional data, 
such as the estimated cost. 

Statistical and research surveys represent about 7 percent of the total 
universe of 8,463 OMB-approved information collections, the majority of 
which, as shown in figure 1, are for regulatory or compliance and 
application for benefits purposes. Although there are certain surveys 
funded through grants and contracts that are not approved by OMB under 
the PRA, OMB stated that there is no comprehensive list of these 
surveys.[Footnote 11] 

Figure 1: Primary Purpose of OMB-approved Information Collections: 

[See PDF for image] 

Source: GAO analysis of the database of OMB-approved federally funded 
information collections as of August 7, 2006. 

[End of figure] 

Forty percent of OMB-approved statistical and research surveys were 
administered to individuals and households, as shown in figure 2. 

Figure 2: Respondents to OMB-approved Statistical and Research Surveys: 

[See PDF for image] 

Source: GAO analysis of the database of OMB-approved federally funded 
information collections as of August 7, 2006. 

[End of figure] 

Annual estimated burden hours are defined as the amount of time for the 
average respondent to fill out a survey times the number of 
respondents.[Footnote 12] Figure 3 shows the range of burden hours for 
general purpose statistics and research information collections; about 
35 percent of the surveys each accounted for 1,000 or fewer 
total burden hours. 

Figure 3: Burden Hour Ranges of the 584 Research and General Purpose 
Statistics Surveys: 

[See PDF for image] 

Source: GAO analysis of the database of OMB-approved federally funded 
information collections as of August 7, 2006. 

[End of figure] 
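The burden-hour arithmetic described above (average completion time 
multiplied by the number of respondents) can be sketched as follows. 
This is an illustration only; the two surveys and all figures below 
are hypothetical and do not correspond to any actual OMB-approved 
collection.

```python
# Hypothetical illustration of the annual burden-hour calculation
# described above: average completion time for a response, multiplied
# by the number of annual respondents. All figures are invented.

def annual_burden_hours(minutes_per_response, annual_respondents):
    """Return estimated annual burden hours for an information
    collection, given average minutes per response and the number of
    responses per year."""
    return minutes_per_response * annual_respondents / 60

# A short survey: 30 minutes per response, 1,500 respondents a year,
# which falls in the "1,000 or fewer" burden-hour range.
small = annual_burden_hours(30, 1500)    # 750.0 hours

# A larger survey: 40 minutes per response, 60,000 respondents a year.
large = annual_burden_hours(40, 60000)   # 40,000.0 hours

print(small, large)
```

As the sketch shows, a modest per-respondent time can still yield a 
large total burden when the respondent pool is large, which is why 
burden estimates scale with sample size.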

According to an OMB official, an electronic system, the Regulatory 
Information Service Center Office of Information and Regulatory Affairs 
Consolidated Information System, has automated the agency submission 
and OMB review process. This new system, which was implemented in July 
of 2006, is intended to allow OMB and agency officials to search 
information collection titles and abstracts for major survey topics and 
key words. 

Table 2 provides information from agency officials and documents for 
the selected surveys that we reviewed in more depth. For these seven 
surveys, the sample sizes ranged from 5,000 individuals for the NHANES 
to 55,000 housing units for the AHS. The NHANES has a much smaller 
sample size and greater cost (as compared to the other surveys with 
similar burden hours) because it includes both an interview and a 
physical examination in a mobile exam center. The physical examination 
can include body measurements and tests and procedures, such as a blood 
sample and dental screening, to assess various aspects of respondents' 
health. Other differences among the surveys we reviewed included their 
specific purposes (e.g., to obtain health information or demographics 
data); the time period considered (some of the surveys provide data as 
of a certain point in time while others are longitudinal and follow the 
same respondents over a period of time); and the frequency with which 
the surveys were conducted. 

In addition, many of these surveys have been in existence for decades. 
Of the seven surveys we reviewed, five are defined by the Statistical 
Programs of the United States Government Fiscal Year 2006 as major 
household surveys (ACS, AHS, CPS, NHIS, and SIPP), and in addition 
MEPS's household sample is a subset of NHIS's sample. The ACS, unlike 
the other surveys, is mandatory and will replace the decennial census 
long-form. In addition to the surveys that we reviewed, two other 
surveys, the Consumer Expenditure Surveys and the National Crime 
Victimization Survey, are also defined by the Statistical Programs of 
the United States Government of 2006 as major household surveys. 

Table 2: Characteristics of Selected Research and Statistical Surveys: 

Survey: ACS; 
Sponsoring agency: Census Bureau, DOC; 
Purpose: Will replace the decennial Census long-form, and monitors 
changes in communities; 
Sample size: 3,122,900 households[A]; 
Produces state-level estimates: X; 
Survey frequency: Monthly; 
Longitudinal: [Empty]; 
2006 Cost (dollars in millions): $169; 
Date originated: Fully implemented January 2005; 
Annual burden hours: 1,917,410. 

Survey: AHS; 
Sponsoring agency: HUD; 
Purpose: Collect data on the nation's housing, including income, 
neighborhood quality, costs, equipment and fuels, and movement; 
Sample size: 55,000 (average) housing units for national component and 
about 4,100 housing units for each of the 47 metropolitan areas; 
Produces state-level estimates: [Empty]; 
Survey frequency: Odd years for national sample; Every 6 years for 47 
metropolitan areas; 
Longitudinal: X; 
2006 Cost (dollars in millions): $16; 
Date originated: 1973; 
Annual burden hours: 30,517. 

Survey: CPS and the Annual Social and Economic Supplement (ASEC); 
Sponsoring agency: CPS: BLS, DOL and Census Bureau, DOC; ASEC: Census 
Bureau, DOC; 
Purpose: Obtain information on labor force characteristics for the U.S. 
population (The ASEC is the primary source of detailed information on 
income and work experience in the United States.); 
Sample size: 60,000 households monthly for CPS; 76,000 annually for 
ASEC; 
Produces state-level estimates: X; 
Survey frequency: Monthly for CPS; February, March and April for ASEC; 
Longitudinal: [Empty]; 
2006 Cost (dollars in millions): $62.7 for CPS; $2 for ASEC; 
Date originated: 1948 for the CPS; 
Annual burden hours: 34,980. 

Survey: MEPS; 
Sponsoring agency: AHRQ, HHS; 
Purpose: Provides extensive information on health care use and costs; 
Sample size: 12,860 households[B]; 
Produces state-level estimates: [Empty]; 
Survey frequency: Annual; 
Longitudinal: X; 
2006 Cost (dollars in millions): $55.3; 
Date originated: 1977; 
Annual burden hours: 203,414. 

Survey: NHANES; 
Sponsoring agency: NCHS, HHS; 
Purpose: Assesses the health and nutritional status of adults and 
children; 
Sample size: 5,000 individuals; 
Produces state-level estimates: [Empty]; 
Survey frequency: Continuous; 
Longitudinal: [Empty]; 
2006 Cost (dollars in millions): $40.4; 
Date originated: 1960; 
Annual burden hours: 62,974. 

Survey: NHIS; 
Sponsoring agency: NCHS, HHS; 
Purpose: Monitors health of U.S. population on a broad range of health 
topics; 
Sample size: 35,000 households; 
Produces state-level estimates: For larger states[C]; 
Survey frequency: Continuous; 
Longitudinal: [Empty]; 
2006 Cost (dollars in millions): $26; 
Date originated: 1957; 
Annual burden hours: 39,837. 

Survey: SIPP; 
Sponsoring agency: Census Bureau, DOC; 
Purpose: Collects source and amount of income, labor force information, 
program participation and eligibility data, and general demographic 
characteristics to measure the effectiveness of federal, state, and 
local programs, to estimate future costs, and coverage for government 
programs; 
Sample size: 26,000 households; 
Produces state-level estimates: [Empty]; 
Survey frequency: Continuing with monthly interviews; 
Longitudinal: X; 
2006 Cost (dollars in millions): $46.2; 
Date originated: 1983; 
Annual burden hours: 148,028. 

Source: GAO analysis. 

Note: Cost data are rounded to the nearest tenth of a million. 

[A] Although the ACS' annual sample size is 3,122,900, starting in 
2006, data will be available annually for all areas with populations of 
65,000 or more. For smaller areas, it will take 3 to 5 years to 
accumulate a large enough sample to produce annual data. For example, 
areas of 20,000 to 65,000 can receive data averaged over 3 years. For 
rural areas, small urban neighborhoods, or population groups of less 
than 20,000, it will take 5 years to accumulate a sample size 
comparable to the decennial census. These averages will be updated 
every succeeding year. 

[B] In addition to the MEPS survey to households, MEPS also includes 
surveys to public and private employers to collect data on the number 
and types of private health insurance offered, benefits associated with 
those plans, premiums, contributions by employers and employees, 
eligibility requirements, and employer characteristics. 

[C] According to an HHS official, depending on the year and the 
population being estimated, NHIS can produce state-level estimates for 
most states, with the exception of approximately 8 to 10 smaller states. 
For example, using the 2004 NHIS data to estimate the number of people 
who do not have health insurance by state, HHS produced state-level 
data for all states except the District of Columbia, Delaware, Iowa, North 
Dakota, New Hampshire, Rhode Island, South Dakota, and Wyoming. 

[End of table] 

Agencies and OMB Have Procedures Intended to Identify and Prevent 
Unnecessary Duplication: 

Agencies and OMB have procedures intended to identify and prevent 
unnecessary duplication in information collections. Agencies are 
responsible for certifying that an information collection is not 
unnecessarily duplicative of existing information as part of complying 
with OMB's approval process for information collections. OMB has 
developed guidance that agencies can use in complying with the approval 
process. Once an agency submits a proposed information collection to 
OMB, OMB is required to review the agency's paperwork, which includes 
the agency's formal certification that the proposed information 
collection is not unnecessarily duplicative. 

Agencies are Responsible for Identifying and Preventing Unnecessary 
Duplication: 

Under the PRA, agencies are responsible for certifying that a proposed 
information collection does not unnecessarily duplicate an available 
information source. According to OMB's draft Implementing Guidance for 
OMB Review of Agency Information Collection, the term unnecessary 
duplication is defined as information similar to or corresponding to 
information that could serve the agency's purpose and need and is 
already accessible to the agency. OMB guidance states the following: 

"For example, unnecessary duplication exists if the need for the 
proposed collection can be served by information already collected for 
another purpose - such as administrative records, other federal 
agencies and programs, or other public and private sources. If specific 
information is needed for identification, classification, or 
categorization of respondents; or analysis in conjunction with other 
data elements provided by the respondent, and is not otherwise 
available in the detail necessary to satisfy the purpose and need for 
which the collection is undertaken; and if the information is 
considered essential to the purpose and need of the collection, and/or 
to the collection methodology or analysis of results, then the 
information is generally deemed to be necessary, and therefore not 
duplicative within the meaning of the PRA and OMB regulation." 
[Footnote 13] 

When an agency is ready to submit a proposed information collection to 
OMB, the agency's CIO is responsible for certifying that the 
information collection satisfies the PRA standards, including a 
certification that the information collection is not unnecessarily 
duplicative of existing information sources.[Footnote 14] We have 
previously reported that agency CIOs across the government generally 
reviewed information collections and certified that they met the 
standards in the act. However, our analysis of 12 case studies at the 
Internal Revenue Service (IRS), the Department of Veterans Affairs, 
HUD, and DOL showed that the CIOs certified collections even though 
support was often missing or incomplete. For example, seven of the 
cases had no information and two included only partial information on 
whether the information collection avoided unnecessary duplication. 
Further, although the PRA requires that agencies publish public notices 
in the Federal Register and otherwise consult with the public, agencies 
governmentwide generally limited consultation to the publication of the 
notices, which generated little public comment. Without appropriate 
support and public consultation, agencies have reduced assurance that 
collections satisfy the standards in the act. We recommended that the 
Director of OMB alter OMB's current guidance to clarify the kinds of 
support that it asks agency CIOs to provide for certifications and to 
direct agencies to consult with potential respondents beyond the 
publication of Federal Register notices.[Footnote 15] OMB has not 
implemented these recommendations. 

OMB Is Responsible for Reviewing Agencies' Efforts to Identify and 
Prevent Unnecessary Duplication: 

OMB has three different guidance publications that agencies can consult 
in the process of developing information collection submissions, 
according to OMB officials. The three guidance publications address 
unnecessary duplication to varying degrees. The draft, Implementing 
Guidance for OMB Review of Agency Information Collection, provides, 
among other things, instructions to agencies about how to identify 
unnecessary duplication of proposed information collections with 
existing available information sources. 

OMB's Questions and Answers When Designing Surveys for Information 
Collections discusses when it is acceptable to duplicate questions used 
in other surveys. The publication also encourages agencies to consult 
with OMB when they are proposing new surveys, major revisions, or large-
scale experiments or tests, before an information collection is 
submitted. For example, when BLS was developing its disability 
questions for the CPS, BLS officials stated that they consulted OMB on 
numerous occasions. OMB officials also said that when they are involved 
early in the process, it is easier to modify an agency's plan for an 
information collection. 

OMB officials told us that an agency consultation with OMB before an 
information collection is developed can provide opportunities to 
identify and prevent unnecessary duplication. For example, according to 
an OMB official, while OMB was working with the Federal Emergency 
Management Agency (FEMA) to meet the need for information on the impact 
of Hurricane Katrina, OMB identified a survey partially funded by the 
National Institute of Mental Health (NIMH) that was in the final stages 
of design and would be conducted by Harvard University--the Hurricane 
Katrina Advisory Group Initiative. OMB learned that this survey, which 
was funded through a grant (and was not subject to review and approval 
under the PRA), planned to collect data on many of the topics that FEMA 
was interested in. OMB facilitated collaboration between FEMA and HHS 
and ultimately, FEMA was able to avoid launching a new survey by 
enhancing the Harvard study. 

OMB's draft of the Proposed Standards and Guidelines for Statistical 
Surveys, which focuses on statistical surveys and their design and 
methodology, did not require that agencies assess potential duplication 
with other available sources of information as part of survey planning. 
We suggested that OMB require that when agencies are initiating new 
surveys or major revisions of existing surveys they include in their 
written plans the steps they take to ensure that a survey is not 
unnecessarily duplicative of available information sources. OMB has 
incorporated this suggestion. 

Under the PRA, OMB is responsible for reviewing proposed information 
collections to determine whether a proposed information collection 
meets the PRA criteria, which include a requirement that it not 
unnecessarily duplicate available information. According to an OMB 
official responsible for reviewing information collections, OMB's 
review process consists of several steps. She said that once an agency 
has submitted the proposed information collection package to OMB, the 
package is sent to the appropriate OMB official for review. This 
official told us that when clarification is needed or questions exist, 
OMB communicates with the agency either through telephone conferences 
or via e-mail. After approval, OMB is required to assign a 
number to each approved information collection, which the agencies are 
then to include on their information collection (e.g., survey) forms. 

In addition to its responsibilities for reviewing proposed information 
collections, OMB also contributes to or leads a wide range of 
interagency efforts that address federal statistics. For example, OMB 
chairs the ICSP. The ICSP is a vehicle for coordinating statistical 
work, exchanging information about agency programs and activities, and 
providing advice and counsel to OMB on statistical matters. The council 
consists of the heads of the principal statistical agencies,[Footnote 
16] plus the heads of the statistical units in the Environmental 
Protection Agency, IRS, National Science Foundation, and Social 
Security Administration (SSA). According to an OMB official, the ICSP 
can expand its membership for working groups to address specific 
topics. For example, the ICSP established an employment-related health 
benefits subcommittee and included non-ICSP agencies, such as HHS's 
AHRQ (which co-chaired the subcommittee). The ICSP member agencies 
exchange experiences and solutions with respect to numerous topics of 
mutual interest and concern. For example, in the past year, the council 
discussed topics such as: 

* the revision of core standards for statistical surveys; 

* opportunities for interagency collaboration on information technology 
development and investment; and 

* sample redesign for the major household surveys with the advent of 
the ACS. 

Duplicative Content in Selected Surveys Exists, but Survey Purposes and 
Scope Differ: 

On the basis of OMB's definition of unnecessary duplication, the 
surveys we reviewed could be considered to contain necessary 
duplication. To examine selected surveys to assess the extent of 
unnecessary duplication in areas with similar subject matter, we looked 
at surveys that addressed three areas: (1) people without health 
insurance (CPS, NHIS, MEPS, and SIPP), (2) people with disabilities 
(NHIS, NHANES, MEPS, SIPP, and ACS), and (3) the housing questions on 
the AHS and ACS. We found that the selected surveys had duplicative 
content and asked similar questions in some cases. However, the 
agencies and OMB judged that this was not unnecessary duplication given 
the differences among the surveys. In some instances, the duplication 
among these surveys yielded richer data, allowing fuller descriptions 
of specific topics and providing additional perspectives on a topic, 
such as by focusing on the different sources and effects of 
disabilities. The seven surveys we reviewed originated at different 
times and differ in many aspects, including the samples drawn, the time 
periods measured, the types of information collected, and level of 
detail requested. These factors can affect costs and burden hours 
associated with the surveys. In addition, the differences can create 
confusion in some cases because they produce differing estimates and 
use different definitions. 

Surveys That Measure People without Health Insurance Produce Differing 
Estimates: 

Although the CPS, NHIS, MEPS, and SIPP all measure people who do not 
have health insurance, the surveys originated at different times and 
differ in several ways, including the combinations of information 
collected that relate to health insurance, questions used to determine 
health insurance status, and time frames. Health insurance status is 
not the primary purpose of any of these surveys, but rather one of the 
subject areas in each survey. In addition, because each survey has a 
different purpose, each survey produces a different combination of 
information related to people's health insurance. 

* The CPS originated in 1948 and provides data on the population's 
employment status. Estimates from the CPS include employment, 
unemployment, earnings, hours of work, and other indicators. 
Supplements also provide information on a variety of subjects, 
including information about employer-provided benefits like health 
insurance. CPS also provides information on health insurance coverage 
rates for sociodemographic subgroups of the population. The time frame 
within which data are released varies; for example, CPS employment 
estimates are released 2 to 3 weeks after collection, while supplement 
estimates are released 2 to 9 months after collection. 

* The NHIS originated in 1957 and collects information on reasons for 
lack of health insurance, type of coverage, and health care 
utilization. The NHIS also collects data on illnesses, injuries, 
activity limitations, chronic conditions, health behaviors, and other 
health topics, which can be linked to health insurance status. HHS 
stated that although health insurance data are covered on other 
surveys, NHIS's data on health insurance is key to conducting analysis 
of the impact of health insurance coverage on access to care, which is 
generally not collected on other surveys. 

* The MEPS originated in 1977 and provides data on health insurance 
dynamics, including changes in coverage and periods without coverage. 
The MEPS augments the NHIS by selecting a sample of NHIS respondents 
and collecting additional information on the respondents. The MEPS also 
links data on health services spending and health insurance status to 
other demographic characteristics of survey respondents. The MEPS data 
can also be used to analyze the relationship between insurance status 
and a variety of individual and household characteristics, including 
use of and expenditures for health care services. 

* The SIPP originated in 1983 to provide data on income, labor 
force, and government program participation. The information collected 
in the SIPP, such as the utilization of health care services, child 
well-being, and disability, can be linked to health insurance status. 
The SIPP also measures the duration of periods without health 
insurance. 

Because the surveys use different methods to determine health insurance 
status, they can elicit different kinds of responses and consequently 
differing estimates within the same population. To determine if a 
person is uninsured, surveys use one of two methods: they ask 
respondents directly if they lack insurance coverage or they classify 
individuals as uninsured if they do not affirmatively indicate that 
they have coverage. The CPS and the NHIS directly ask respondents 
whether they lack insurance coverage. While the difference between 
these approaches may seem subtle, using a verification question prompts 
some people who did not indicate any insurance coverage to rethink 
their status and indicate coverage that they had previously forgotten 
to mention. 

The surveys also differ both in the time period respondents are asked 
to recall and in the time periods measured when respondents did not 
have health insurance. Hence, the surveys produce estimates that do not 
rely upon standardized time or recall periods and as a result are not 
directly comparable. The ASEC to the CPS is conducted in February, 
March, and April and asks questions about the prior calendar year, a 
recall period that can be as long as 16 months in the April interview. 
The other three surveys, in contrast, asked about coverage at the time 
of the interview. Because a respondent's ability to recall information 
generally degrades over time, most survey methodologists believe that 
the longer the recall period, the less accurate the answers will be to 
questions about the past, such as exactly when health insurance 
coverage started or stopped, or when it changed because of job changes. 
Another difference is the time period used to frame the question. The 
CPS asked whether the respondent was uninsured for an entire year, 
while the NHIS, MEPS, and SIPP asked whether the individual was 
uninsured at the time of the interview, for the entire prior year, or 
at any time during the year. 

Table 3 illustrates the differing estimates obtained using data from 
the four selected surveys. While these differences can be explained, 
the wide differences in the estimates are of concern and have created 
some confusion. For example, the 2004 CPS estimate for people who were 
uninsured for a full year is over 50 percent higher than the NHIS 
estimate for that year. HHS has sponsored several interagency meetings 
on health insurance data, which involved various agencies within HHS 
and the Census Bureau. The meetings focused on improving estimates of 
health insurance coverage and included, among other things, examining 
how income data are used, exploring potential collaboration between HHS 
and the Census Bureau on whether the CPS undercounts Medicaid 
recipients, examining health insurance coverage rates, and discussing a 
potential project to provide administrative data for use in the CPS. As 
a result, HHS created a Web site with reports and data on relevant 
surveys, and HHS's Office of the Assistant Secretary for Planning and 
Evaluation (ASPE) produced the report Understanding Estimates of the 
Uninsured: Putting the Differences in Context with input from the 
Census Bureau in an effort to explain the differing estimates.[Footnote 
17] 

Table 3: Uninsured Estimates from Selected Surveys: 

Survey: CPS; 
Most recent year: 2004; 
Uninsured for full year: 45.8 million; 
Point in time estimate: N/A; 
Ever uninsured during the year: N/A. 

Survey: NHIS; 
Most recent year: 2004; 
Uninsured for full year: 29.2 million; 
Point in time estimate: 42.1 million; 
Ever uninsured during the year: 51.6 million. 

Survey: MEPS; 
Most recent year: 2003; 
Uninsured for full year: 33.7 million; 
Point in time estimate: 48.1 million; 
Ever uninsured during the year: 62.9 million. 

Survey: SIPP; 
Most recent year: 2001; 
Uninsured for full year: 18.9 million; 
Point in time estimate: 38.7 million; 
Ever uninsured during the year: 66.5 million. 

Source: GAO extract of ASPE issue brief Understanding Estimates of the 
Uninsured: Putting the Differences in Context (September 2005). 

[End of table] 
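As an arithmetic check on the statement above that the 2004 CPS 
full-year estimate is over 50 percent higher than the NHIS estimate, 
the figures in table 3 can be compared directly (an illustrative 
sketch, not part of the original analysis): 

```python
# Full-year uninsured estimates from table 3, in millions of people.
cps_2004 = 45.8   # CPS, uninsured for full year
nhis_2004 = 29.2  # NHIS, uninsured for full year

percent_higher = (cps_2004 - nhis_2004) / nhis_2004 * 100
print(round(percent_higher, 1))  # 56.8 -- i.e., over 50 percent higher
```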

Surveys that Measure Disability Status Differ in Definitions, Purposes, 
and Methodologies Used: 

Similarly, although the NHIS, NHANES, MEPS, SIPP, and ACS all estimate 
the percentage of the population with disabilities, the surveys define 
disability differently and have different purposes and methodologies. 
In addition to these five surveys, which measure aspects of disability, 
BLS is also currently developing questions to measure the employment 
levels of the disabled population. HHS also stated that disability is 
included on multiple surveys so that disability status can be analyzed 
in conjunction with other information that an agency needs. For 
example, disability information is used by health departments to 
describe the health of the population, by departments of transportation 
to assess access to transportation systems, and by departments of 
education to assess the educational attainment of people with 
disabilities. The 
lack of consistent definitions is not unique to surveys; there are over 
20 different federal agencies that administer almost 200 different 
disability programs for purposes of entitlement to public support 
programs, medical care, and government services. 

Although each of the surveys asks about people's impairments or 
functionality in order to gauge a respondent's disability status, there 
are some differences in how disability is characterized. For example, 
the NHIS asks respondents if they are limited in their ability to 
perform age-dependent life and other activities. The NHIS also asks 
about the respondent needing assistance with performing activities of 
daily living and instrumental activities of daily living.[Footnote 18] 
The NHANES measures the prevalence of physical and functional 
disability for a wide range of activities in children and adults. 
Extensive interview information on self-reported physical abilities and 
limitations is collected to assess the capacity of the individual to do 
various activities without the use of aids, and the level of difficulty 
in performing the task. The MEPS provides information on days of work 
or school missed due to disability. The SIPP queries whether the 
respondent has limitations of sensory, physical, or mental functioning 
and limitations on activities due to health conditions or impairments. 
The ACS asks about vision or hearing impairment, difficulty with 
physical and cognitive tasks, and difficulty with self-care and 
independent living. 

Because surveys produce different types of information on disability, 
they can provide additional perspectives on the sources and effects of 
disabilities, but they can also cause confusion because of the 
differences in the way disability is being measured. The NHIS contains 
a broad set of data on disability-related topics, including the 
limitation of functional activities, mental health questions used to 
measure psychological distress, limitations in sensory ability, and 
limitations in work ability. Moreover, the NHIS provides data, for 
those persons who indicated a limitation performing a functional 
activity, about the source or condition of their functional limitation. 
The NHANES links medical examination information to disability. The 
MEPS measures how much individuals spend on medical care for a person 
with disabilities and can illustrate changes in health status and 
health care expenses. The SIPP provides information on the use of 
assistive devices, such as wheelchairs and canes. Finally, the ACS 
provides information on many social and economic characteristics, such 
as school enrollment for people with disabilities as well as the 
poverty and employment status of people with different types of 
disabilities. 

However, the estimates of disability in the population that these 
surveys produce can vary widely. A Cornell University study, which used 
survey questions to define and then compare different disability 
measures, compared disability estimates among the NHIS, SIPP, and ACS. 
A number of categories of disability were very similar, such as the 
nondisabled population, while others, such as the disabled population 
or people with sensory disabilities, had widely varying estimates, as 
shown in table 4.[Footnote 19] For example, the SIPP 2002 estimate of 
people with sensory disabilities for ages 18-24 was more than six times 
the NHIS estimate for that year and age group. 
In commenting on this report, the DOC and HHS acknowledged that 
comparing the NHIS and SIPP with respect to sensory disabilities is 
problematic. HHS officials noted that the confusion caused by these 
different estimates derives mostly from the lack of a single definition 
of disability, which leads to data collections that use different 
questions and combinations of information to define disability status. 

Table 4: Estimated Population of Persons with Disabilities, by Data 
Source and Different Categories of Disability: 

Ages 18-24; 
Surveys: NHIS (2002); 
No disability: 25,225,000; 
Disability: 2,126,000; 
Work limitation: 927,000; 
Instrumental activities of daily living: 228,000; 
Activities of daily living: 147,000; 
Mental: 786,000; 
Physical: 859,000; 
Sensory: 78,000. 

Ages 18-24; 
Surveys: SIPP (2002); 
No disability: 24,820,000; 
Disability: 2,426,337; 
Work limitation: 1,209,000; 
Instrumental activities of daily living: 366,000; 
Activities of daily living: 146,000; 
Mental: 1,076,000; 
Physical: 982,000; 
Sensory: 533,000. 

Ages 18-24; 
Surveys: ACS (2003); 
No disability: 24,194,401; 
Disability: 1,667,355; 
Work limitation: 714,229; 
Instrumental activities of daily living: 399,423; 
Activities of daily living: 187,904;
Mental: 953,448; 
Physical: 535,666; 
Sensory: 356,820. 

Ages 25-61; 
Surveys: NHIS (2002); 
No disability: 115,934,000; 
Disability: 23,192,000; 
Work limitation: 13,725,000; 
Instrumental activities of daily living: 3,169,000; 
Activities of daily living: 1,350,000; 
Mental: 4,627,000; 
Physical: 14,545,000; 
Sensory: 2,730,000. 

Ages 25-61; 
Surveys: SIPP (2002); 
No disability: 115,900,000; 
Disability: 26,620,000; 
Work limitation: 14,420,000; 
Instrumental activities of daily living: 4,931,000; 
Activities of daily living: 3,362,000; 
Mental: 4,394,000; 
Physical: 18,790,000; 
Sensory: 6,490,000. 

Ages 25-61; 
Surveys: ACS (2003); 
No disability: 126,649,510; 
Disability: 17,146,845; 
Work limitation: 9,854,223; 
Instrumental activities of daily living: 4,227,427; 
Activities of daily living: 2,925,715; 
Mental: 5,745,569; 
Physical: 10,819,521; 
Sensory: 3,944,388. 

Source: GAO extract of Cornell University's Employment and Disability 
Institute report A Guide to Disability Statistics from the National 
Health Interview Survey (2005). 

Note: Instrumental activities of daily living (IADL) include a broader 
set of participation restrictions than the "go-outside-home" definition 
in the ACS. IADL also include participation restrictions that affect 
the ability to manage money and keep track of bills, prepare meals, and 
do work around the house. 

[End of table] 
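Similarly, the "more than six times" comparison of sensory disability 
estimates can be verified from the table 4 figures (an illustrative 
check, not part of the Cornell analysis): 

```python
# Sensory disability estimates for ages 18-24, from table 4.
sipp_2002_sensory = 533000  # SIPP (2002)
nhis_2002_sensory = 78000   # NHIS (2002)

ratio = sipp_2002_sensory / nhis_2002_sensory
print(round(ratio, 2))  # 6.83 -- more than six times the NHIS estimate
```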

Because the concept of disability varies, with no clear consensus on 
terminology or definition, and there are differing estimates, several 
federal and international groups are examining how the associated 
measures of disability could be improved. HHS's Disability Workgroup, 
which includes officials from HHS and the Department of Education, 
examines how disability is measured and used across surveys. The task 
of another federal group, the Subcommittee on Disability Statistics of 
the Interagency Committee on Disability Research, is to define and 
standardize the disability definition. The Washington Group on 
Disability Statistics (WGDS), an international workgroup sponsored by 
the United Nations in which OMB and NCHS participate, is working to 
facilitate the comparison of data on disability internationally. The 
WGDS aims to guide the development of a short set or sets of disability 
measures that are suitable for use in censuses, sample-based national 
surveys, or other statistical formats, for the primary purpose of 
informing policy on equalization of opportunities. The WGDS is also 
working to develop one or more extended sets of survey items to measure 
disability, or guidelines for their design, to be used as components of 
population surveys or as supplements to specialty surveys. HHS added 
that the interest in standardizing the measurement of disability status 
is also driven by the desire to add a standard question set to a range 
of studies so that the status of persons with disabilities can be 
described across studies. 

The AHS and ACS Ask Some Similar Questions on Housing, but Their 
Purposes and Scope Differ: 

In 2002, we reported that the AHS and ACS both covered the subject of 
housing.[Footnote 20] Of the 66 questions on the 2003 ACS, 25 were in 
the section on housing characteristics, and all but one of these 
questions were the same as or similar to the questions on the AHS. For 
example, both the AHS and the ACS ask how many bedrooms a housing unit 
has. However, the two surveys differ in purposes and scope. 

The purpose of the AHS is to collect detailed housing information on 
the size, composition, and state of housing in the United States, and 
to track changes in the housing stock over time, according to a HUD 
official. To that end, the AHS includes about 1,000 variables, 
according to a HUD official, such as the size of housing unit, housing 
costs, different building types, plumbing and electrical issues, 
housing and neighborhood quality, mortgage financing, and household 
characteristics. The AHS produces estimates at the national level, at 
the metropolitan level for certain areas, and for homogeneous zones 
containing fewer than 100,000 households. The AHS is conducted 
every 2 years nationally and every 6 years in major metropolitan areas, 
except for six areas, which are surveyed every 4 years. 

In contrast, the housing data in the ACS are much less extensive. The 
ACS is designed to replace the Census 2010 long-form and covers a 
wide range of subjects, such as income, commute 
time to work, and home values. The ACS provides national and county 
data and, in the future, will provide data down to the Census tract 
level, according to a Census Bureau official. The ACS is designed to 
provide communities with information on how they are changing, with 
housing being one of the main topic areas along with a broad range of 
household demographic and economic characteristics. 

The AHS and ACS also have different historical and trend data and data 
collection methods. The AHS returns to the same housing units year 
after year to gather data; therefore, it produces data on trends that 
illustrate the flow of households through the housing stock, according 
to a HUD official, while the ACS samples new households every month. 
Historical data are also available from the AHS from the 1970s onward, 
according to a HUD official. 

Analysts can use AHS data to monitor the interaction among housing 
needs, demand, and supply, as well as changes in housing conditions and 
costs. In addition, analysts can also use AHS data to support the 
development of housing policies and the design of housing programs 
appropriate for different groups. HUD uses the AHS data, for example, 
to analyze changes affecting housing conditions of particular 
subgroups, such as the elderly. The AHS also plays an important role in 
HUD's monitoring of the lending activities of the government-sponsored 
enterprises, Fannie Mae and Freddie Mac, in meeting their numeric goals 
for mortgage purchases serving minorities, low-income households, and 
underserved areas. AHS's characteristic of returning to the same 
housing units year after year provides the basis for HUD's Components 
of Inventory Change (CINCH) and Rental Dynamics analyses. The CINCH 
reports examine changes in housing stock over time by comparing the 
status and characteristics of housing units in successive surveys. The 
Rental Dynamics program, which is a specialized form of CINCH, looks at 
rental housing stock changes, with an emphasis on changes in 
affordability. Another use of AHS data has been for calculating certain 
fair market rents (FMR), which HUD uses to determine the amount of 
rental assistance subsidies for major metropolitan areas between the 
decennial censuses. However, HUD plans to begin using ACS data for 
fiscal year 2006 FMRs. As we previously reported, this could improve 
the accuracy of FMRs because the ACS provides more recent data that 
more closely match the boundaries of HUD's FMR areas than the AHS 
does.[Footnote 21] 

In our 2002 report, which was published before the ACS was fully 
implemented, we also identified substantial overlap for questions on 
place of birth and citizenship, education, labor force characteristics, 
transportation to work, income, and, in particular, housing 
characteristics. We recommended that the Census Bureau review proposed 
ACS questions for possible elimination that were asked on the AHS to 
more completely address the possibility of reducing the reporting 
burden in existing surveys.[Footnote 22] The Census Bureau responded 
that it is always looking for opportunities to streamline, clarify, 
and reduce respondent burden, but that substantial testing would be 
required before changes could be made in surveys that provide key 
national social indicators. 

The Advent of the ACS and the Proposed Reengineering of the SIPP Are 
Changes to the Portfolio of Major Household Surveys: 

In addition to efforts underway to try to reconcile inconsistencies 
among surveys that address the same subject areas, a number of major 
changes have occurred or are planned to occur that will affect the 
overall portfolio of major household surveys. As previously discussed, 
the ACS was fully implemented in 2005 and provides considerable 
information that is also provided in many other major household 
surveys. The ACS is the cornerstone of the government's effort to keep 
pace with the nation's changing population and ever-increasing demands 
for timely and relevant data about population and housing 
characteristics. The new survey will provide current demographic, 
socioeconomic, and housing information about America's communities 
every year, information that until now was only available once a 
decade. Starting in 2010, the ACS will replace the long-form census. As 
with the long-form, information from the ACS will be used to administer 
federal and state programs and distribute more than $200 billion a 
year. Detailed data from national household surveys can be combined 
with data from the ACS to create reliable estimates for small 
geographic areas using area estimation models. 
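The small-area approach mentioned above can be illustrated with a 
simple composite ("shrinkage") estimator, one common form of area 
estimation model in which a noisy direct survey estimate is blended 
with a model-based estimate. This is an illustrative sketch only; the 
function and all figures are invented and are not survey data.

```python
# Hypothetical sketch of a composite small-area estimator: combine a
# direct survey estimate with a model-based (synthetic) estimate,
# weighting each inversely to its variance. Illustrative only.

def composite_estimate(direct, var_direct, synthetic, var_synthetic):
    """Blend two estimates; the noisier one receives less weight."""
    w = var_synthetic / (var_direct + var_synthetic)
    return w * direct + (1.0 - w) * synthetic

# For a small county the direct estimate is noisy (large variance),
# so the blended figure leans toward the model-based estimate.
blended = composite_estimate(direct=0.18, var_direct=0.004,
                             synthetic=0.12, var_synthetic=0.001)
print(round(blended, 3))  # prints 0.132
```

In this invented example, the direct estimate carries only 20 percent 
of the weight because its variance is four times that of the 
model-based estimate.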

Partly in response to potential reductions in funding for fiscal year 
2007, the Census Bureau is planning to reengineer the SIPP with the 
intent of ultimately providing better information at lower cost. SIPP 
has been used to estimate future costs of certain government programs. 
For example, HUD used SIPP's longitudinal capacity to follow families 
over time to determine that households with high-rent burdens in one 
year move in and out of high-rent burden status over subsequent years. 
Therefore, although the overall size of the population with worst-case 
housing needs is fairly stable, the households that make up this 
population change with considerable frequency--an issue that HUD told 
us is potentially important in the design of housing assistance 
programs. 

Although the SIPP has had problems with sample attrition and releasing 
data in a timely manner, which the reengineering is intended to 
ameliorate, there has been disagreement about this proposal among some 
users of SIPP data. Census Bureau officials said they are meeting with 
internal and external stakeholders and are considering using 
administrative records. Census Bureau officials told us that they 
could develop a higher quality survey at lower cost, with a final 
survey to be implemented in 2009. They also said that they may 
consider using the 
ACS or CPS sampling frame. 

Agencies Have Undertaken Efforts to Improve the Efficiency and 
Relevance of Surveys: 

In addition to the seven surveys discussed previously, we also 
identified examples of how, over the years, agencies have undertaken 
efforts to enhance their surveys' relevance and efficiency through 
steps such as using administrative data in conjunction with survey 
data, reexamining and combining or eliminating surveys, and redesigning 
existing surveys. 

Agencies Have Used Administrative Data in Conjunction with Surveys: 

The Census Bureau and BLS have used data collected for the 
administration of various government programs in conjunction with 
survey data. They have used these administrative data to target 
specific populations to survey and to obtain information without 
burdening survey respondents. 

The Census Bureau uses administrative data in combination with survey 
data to produce its Economic Census business statistics, which, every 5 
years, profile the U.S. economy from the national to the local level. 
The Economic Census relies on the centralized Business Register, which 
is compiled from administrative records from IRS, SSA, and BLS, along 
with lists of multi-establishment businesses that the Census Bureau 
maintains. The Business Register contains basic economic information 
for over 8 million employer businesses and over 21 million self- 
employed businesses. The Economic Census uses the Business Register as 
the sampling frame to identify sets of businesses with specific 
characteristics, such as size, location, and industry sector. 

BLS also uses a combination of administrative and survey data to 
produce its quarterly series of statistics on gross job gains and 
losses. BLS uses administrative data provided by state workforce 
agencies that compile and forward quarterly state unemployment 
insurance (UI) records to BLS. These state agencies also submit 
employment and wage data to BLS. The data states provide to BLS include 
establishments subject to state UI laws and federal agencies subject to 
the Unemployment Compensation for Federal Employees program, covering 
approximately 98 percent of U.S. jobs. These administrative data enable 
BLS to obtain information on many businesses without having to impose a 
burden on respondents. BLS augments the administrative data with two 
BLS-funded surveys conducted by the states. The Annual Refiling Survey 
updates businesses' industry codes and contact information, and the 
Multiple Worksite Report survey provides information on multiple work 
sites for a single business, data that are not provided by the UI 
records, enabling BLS to report on business statistics by geographic 
location. Combining the data from these surveys with administrative 
data helps BLS increase accuracy, update information, and include 
additional details on establishment openings and closings. 

However, because of restrictions on information sharing, BLS is not 
able to access most of the information that the Census Bureau uses for 
its business statistics because much of this information is commingled 
with IRS data. The Confidential Information Protection and Statistical 
Efficiency Act of 2002 (CIPSEA, 44 U.S.C. § 3501 note) authorized 
identifiable business records to be shared among the Bureau of 
Economic Analysis (BEA), BLS, and the Census Bureau for statistical 
purposes. 
CIPSEA, however, did not change the provisions of the Internal Revenue 
Code that preclude these agencies from sharing tax return information 
for statistical purposes. OMB officials stated that there is continued 
interest in examining appropriate CIPSEA companion legislation on 
granting greater access for the Census Bureau, BLS, and BEA to IRS 
data. 

Reexamination Has Led to Modification or Elimination of Surveys: 

Several agencies have reexamined some of their surveys, which has led 
to their elimination or modification. The Census Bureau, for example, 
reviewed its portfolio of Current Industrial Reports (CIR) program 
surveys of manufacturing establishments, which resulted in the 
elimination and modification of some surveys. Census Bureau officials 
said they decided to undertake this reexamination in response to 
requests for additional data that could not be addressed within 
existing budgets without eliminating current surveys. They were also 
concerned that the character of manufacturing, including many of the 
industries surveyed by the CIR program, had changed since the last 
reexamination of the CIR programs, which had been over 10 years 
earlier. Working with key data users, Census Bureau officials 
developed criteria and used them to rank 54 CIR program surveys. The 
criteria included 11 elements, such as whether the survey 
results were important to federal agencies or other users, and the 
extent to which the subject matter represented a growing economic 
activity in the United States. The recommendations the Census Bureau 
developed from this review were then published in the Federal 
Register and, after considering public comments, the Census Bureau 
eliminated 11 
surveys, including ones on knit fabric production and industrial 
gases.[Footnote 23] The Census Bureau also redesigned 7 surveys, 
scaling back the information required to some extent and updating 
specific product lists. As a result of this reexamination, the Census 
Bureau was able to add a new survey on "analytical and biomedical 
instrumentation," and it is considering whether another new CIR program 
survey is needed to keep pace with manufacturing industry developments. 
Census Bureau officials told us that they plan on periodically 
reexamining the CIR surveys in the future. 

HHS has also reexamined surveys to identify improvements, in part by 
integrating a Department of Agriculture (USDA) survey that covered 
similar content into HHS's NHANES. For about three decades, HHS and 
USDA conducted surveys that each contained questions on food intake and 
health status (NHANES and the Continuing Survey of Food Intakes by 
Individuals, respectively). HHS officials stated that HHS and USDA 
officials considered how the two surveys could be merged for several 
years before taking action. According to HHS officials, several factors 
led to the merger of the two surveys, including USDA funding 
constraints, the direct involvement of senior-level leadership on both 
sides to work through the issues, and HHS officials' realization that 
the merger would enable them to add an extra day of information 
gathering to the NHANES. Integrating the two surveys into the NHANES 
made it more comprehensive by adding a follow-up health assessment. 
According to HHS officials, adding this component to the original in- 
person assessment allows agency officials to better link dietary and 
nutrition information with health status. 

Another mechanism HHS has established is a Data Council, which, in 
addition to other activities, assesses proposed information 
collections. The Data Council oversees the entire department's data 
collections to ensure that the department relies, where possible, on 
existing core statistical systems for new data collections rather than 
on the creation of new systems. The Data Council implements this 
strategy through communicating and sharing plans, conducting annual 
reviews of proposed data collections, and reviewing major survey 
modifications and any new survey proposals. According to HHS officials, 
in several instances, proposals for new surveys and statistical systems 
have been redirected and coordinated with current systems. For example, 
HHS officials stated that when the Centers for Disease Control and 
Prevention (CDC) proposed a new survey on youth tobacco use, the Data 
Council directed it to the Substance Abuse and Mental Health Services 
Administration's National Survey of Drug Use and Health. The Data 
Council stated that by adding questions on brand names, CDC was able to 
avoid creating a new survey to measure youths' tobacco use. 

OMB recognizes that the federal government should build upon agencies' 
practice of reexamining individual surveys to conduct a comprehensive 
reexamination of the portfolio of major federal household surveys, in 
light of the advent of the ACS. OMB officials acknowledged that this 
effort would be difficult and complex and would take time. According to 
OMB, integrating or redesigning the portfolio of major household 
surveys could be enhanced if, in the future, there is some flexibility 
to modify the ACS design and methods.[Footnote 24] For example, an OMB 
official stated that using supplements or flexible modules periodically 
within the ACS might enable agencies to integrate or modify portions of 
other major household surveys. OMB officials indicated that such an 
effort would likely not happen until after the 2010 decennial census, a 
critical stage for ACS when ACS data can be compared to 2010 Census 
data. OMB officials said, and their long-range plans indicate, that 
they expect improved integration of the portfolio of related major 
household surveys with the advent of the ACS. For example, the 
Statistical Programs of the United States 
Government: Fiscal Year 2006 describes plans for redesigning the 
samples for demographic surveys, scheduled for initial implementation 
after 2010, when the ACS may become the primary data source. 

Conclusions: 

In light of continuing budgetary constraints, as well as major changes 
planned and underway within the U.S. statistical system, the portfolio 
of major federal household surveys could benefit from a holistic 
reexamination. Many of the surveys have been in place for several 
decades, and their content and design may not have kept pace with 
changing information needs. The duplication in content in some surveys, 
while considered necessary, may be a reflection of incremental attempts 
over time to address information gaps as needs changed. OMB and the 
statistical agencies have attempted to address some of the more 
troublesome aspects of this duplication by providing explanations of 
the differences in health insurance estimates and with efforts to 
develop more consistent definitions of disability. These efforts, 
however, while helpful, address symptoms of the duplication without 
tackling the larger issues of need and purpose. In many cases, the 
government is still trying to do business in ways that are based on 
conditions, priorities, and approaches that existed decades ago and are 
not well suited to addressing today's challenges. Thus, while the 
duplicative content of the surveys can be explained, there may be 
opportunities to modify long-standing household surveys, both to take 
advantage of changes in the statistical system and to meet new 
information needs in the face of ever-growing constraints on 
budgetary resources. 

Some agencies have begun to take steps to reevaluate their surveys in 
response to budget constraints and changing information needs. Agencies 
have reexamined their surveys and used administrative data in 
conjunction with survey data to enhance their data collection efforts. 
These actions, however, focused on individual agency and user 
perspectives. By building upon these approaches and taking a more 
comprehensive focus, a governmentwide reexamination could help reduce 
costs in an environment of constrained resources and help prioritize 
information needs in light of current and emerging demands. 

Given the upcoming changes in the statistical system, OMB should lead 
the development of a new vision of how the major federal household 
surveys can best fit together. OMB officials told us they are beginning 
to think about a broader effort to better integrate the portfolio of 
major household surveys once the ACS has been successfully implemented. 
Providing greater coherence among the surveys, particularly in 
definitions and time frames, could help reduce costs to the federal 
government and associated burden hours. The Interagency Council on 
Statistical Policy (ICSP) could be used to bring together relevant 
federal agencies, including those that are not currently part of the 
ICSP. The ICSP has the leadership authority and, in light of the 
comprehensive scope of a reexamination initiative, could draw on 
leaders from the agencies that collect or are major users of federal 
household survey data. While OMB officials have stated that the ACS 
may not demonstrate its success until after 2010, the complexity and 
time needed to reexamine the portfolio of major federal household 
surveys mean that it is important to start planning for that 
reexamination. 

Recommendation for Executive Action: 

To deal with the longer-term considerations crucial to making 
federally funded surveys more effective and efficient, GAO recommends 
that the 
Director of OMB work with the Interagency Council on Statistical Policy 
to plan for a comprehensive reexamination to identify opportunities for 
redesigning or reprioritizing the portfolio of major federal household 
surveys. 

Agency Comments: 

We requested comments on a draft of this report from the Director of 
OMB and the Secretaries of Commerce, HHS, HUD, and Labor or their 
designees. We obtained oral and technical comments on a draft of this 
report from the Chief Statistician of the United States and her staff 
at OMB, as well as written comments from the Acting Deputy Under 
Secretary for Economic Affairs at Commerce; the Assistant Secretary for 
Legislation at HHS; and the Assistant Secretary for Policy Development 
and Research at HUD; and technical comments from the Acting 
Commissioner of BLS at Labor, which we incorporated in the report as 
appropriate. In commenting on a draft of the report, OMB officials 
stated that the draft report presented an interesting study that 
addresses an issue worth looking at. OMB officials generally agreed 
with our recommendation, although they expressed concerns about the 
range of participants that might be involved in such a reexamination. 
We revised the recommendation to provide clarification that OMB should 
work with the Interagency Council on Statistical Policy rather than 
with all relevant stakeholders and decision makers. OMB officials also 
expressed concerns about moving from examining selected surveys in 
three subject areas to the conclusion that the entire portfolio of 
household surveys should be reexamined. In response we clarified that 
we were recommending a comprehensive reexamination of the seven surveys 
that comprise the portfolio of major federal household surveys, most of 
which were included in our review. OMB officials also provided 
clarification on how we characterized their statements on reexamining 
the portfolio of major household surveys, which we incorporated into 
the report. 

Each of the four departments provided technical clarifications that we 
incorporated into the report, as appropriate. In addition, HHS and HUD 
officials offered written comments on our findings and recommendation, 
which are reprinted in appendix II. HHS stated that a reexamination was 
not warranted without evidence of unnecessary duplication and also 
highlighted a number of examples of agency efforts to try to clarify 
varying estimates. However, we did not rely on evidence of 
duplication, 
but rather based our recommendation on other factors, including a need 
to provide greater coherence among the surveys and to take advantage of 
changes in the statistical system to reprioritize information needs and 
possibly help reduce costs to the federal government and associated 
burden hours. Further, in light of the major upcoming changes involving 
the ACS and SIPP, and in conjunction with constrained resources and 
changing information needs, we believe that the major household surveys 
should be considered from a broader perspective, not simply in terms of 
unnecessary duplication. 

HHS also provided a number of general comments. We incorporated 
additional information to reflect HHS's comments on the different uses 
of disability information, a standard set of disability questions, 
NHIS's coverage of access to care, and the fact that MEPS's sample is 
a subset of the NHIS sample. HHS's comments on differences in 
estimates 
and the lack of a single definition of disability were already 
addressed in the report. HHS also stated that NCHS works through 
various mechanisms to ensure that surveys are efficient. We support 
efforts to enhance efficiency and believe that our recommendation 
builds upon such efforts. 

HUD officials were very supportive of our recommendation, stating that 
such a reexamination is especially important as the ACS approaches full-
scale data availability. In response to HUD's comments suggesting 
adding more information on SIPP and AHS, we expanded the report's 
discussion of the longitudinal dimension of SIPP and AHS. 

As agreed with your office, unless you publicly announce the contents 
of the report earlier, we plan no further distribution of it until 30 
days from the date of the report. We will then send copies of this 
report to the appropriate congressional committees and to the Director 
of OMB, and the Secretaries of Commerce, HHS, HUD, and Labor, as well 
as to other appropriate officials in these agencies. We will also make 
copies available to others upon request. In addition, the report will 
be available at no charge on the GAO Web site at [Hyperlink, 
http://www.gao.gov]. 

If you or your staff have any questions regarding this report, please 
contact me at (202) 512-6543 or steinhardtb@gao.gov. Contact points for 
our Offices of Congressional Relations and Public Affairs may be found 
on the last page of this report. GAO staff who made major contributions 
to this report are listed in appendix II. 

Signed by: 

Bernice Steinhardt: 
Director, Strategic Issues: 

[End of section] 

Appendix I: Scope and Methodology: 

To answer our first objective of identifying the number and 
characteristics of Office of Management and Budget (OMB)-approved 
federally funded statistical and research surveys, we obtained the 
database of information collections that had been approved by OMB as of 
August 7, 2006. The information in the database is obtained from Form 
83-I, which is part of an agency's submission for OMB approval of an 
information collection. Because an approval is in effect for up to 3 
years, the database reflects all collections approved for use as of 
that date and is thus a snapshot in time. 

Although OMB Form 83-I requires agencies to identify various types of 
information about an information collection, including whether the 
information collection will involve statistical methods, the form does 
not require agencies to identify which information collections 
involve surveys; consequently, the database of OMB-approved 
information collections does not identify which information 
collections are surveys. Furthermore, the definition of information 
collections 
contained in the Paperwork Reduction Act (PRA) of 1980 is written in 
general terms and contains very few limits in scope or coverage. On the 
form, agencies can select from seven categories when designating the 
purpose of an information collection, which are (1) application for 
benefits, (2) program evaluation, (3) general purpose statistics, (4) 
audit, (5) program planning or management, (6) research, and (7) 
regulatory or compliance. When completing the form, agencies are asked 
to mark all categories that apply, denoting the primary purpose with a 
"P" and all others that apply with an "X." Since OMB does not further 
define these categories, the agency submitting the request determines 
which categories best describe the purpose(s) of the proposed 
collection. The choices made may reflect differing understandings of 
these purposes from agency to agency or among individuals in the same 
agency. 

The list of surveys contained in this report was derived from the 
database of OMB-approved information collections and therefore 
contains all information collections that an agency designated as 
either "general purpose statistics" or "research" in the primary 
purpose category; we used these designations as a proxy for the 
universe of surveys. The 
directions to agencies completing the forms call for agencies to mark 
"general purpose statistics" when the data are collected chiefly for 
use by the public or for general government use without primary 
reference to the policy or program operations of the agency collecting 
the data. Agencies are directed to mark "research" when the purpose is 
to further the course of research, rather than for a specific program 
purpose. We did not determine how accurately or reliably agencies 
designated the purpose(s) of their information collections. It is also 
possible that the database may contain other federally funded surveys 
that the agency did not identify under the primary purpose we used to 
"identify" surveys, and these would not be included in our list of 
surveys. 
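In effect, the proxy described above amounts to filtering the 
inventory of approved collections by primary-purpose designation. The 
sketch below illustrates that selection logic; the field names and 
records are hypothetical, since the actual layout of the OMB database 
is not described here.

```python
# Hypothetical sketch of the survey-identification proxy: keep
# collections whose primary purpose ("P") is "general purpose
# statistics" or "research". Field names and records are invented.

SURVEY_PROXY_PURPOSES = {"general purpose statistics", "research"}

def likely_surveys(collections):
    """Return collections whose primary-purpose code matches the proxy."""
    return [c for c in collections
            if c["purposes"].get("P") in SURVEY_PROXY_PURPOSES]

inventory = [
    {"title": "Household health survey",
     "purposes": {"P": "general purpose statistics"}},
    {"title": "Benefit application form",
     "purposes": {"P": "application for benefits", "X": "audit"}},
    {"title": "Field experiment questionnaire",
     "purposes": {"P": "research"}},
]
print([c["title"] for c in likely_surveys(inventory)])
```

As the report notes, such a filter depends entirely on how accurately 
agencies designated their collections' purposes, so it can both miss 
surveys and include non-surveys.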

We took several steps to ensure that the database of OMB-approved 
information collections correctly recorded agency-submitted data and 
contained records of all Forms 83-I submitted to OMB. Our report, 
entitled Paperwork Reduction Act: New Approach May Be Needed to Reduce 
Burden on Public, GAO-05-424 (Washington, D.C.: May 20, 2005), examined 
the reliability of the database of OMB-approved information collections 
and concluded that the data were accurate and complete for the purposes 
of that report. Because this assessment was recent, we decided that we 
would not repeat this assessment. We did, however, compare a sample of 
the surveys from the Inventory of Approved Information Collection on 
OMB's Web site to our copy of the database of OMB-approved collections. 
We found that all of the surveys in the Inventory of Approved 
Information Collection were contained in the database. 

Not all information collections require OMB approval under the PRA. 
OMB's draft Implementing Guidance for OMB Review of Agency Information 
Collection explains that in general, collections of information 
conducted by recipients of federal grants do not require OMB approval 
unless the collection meets one or both of the following two 
conditions: (1) the grant recipient is collecting information at the 
specific request of the sponsoring agency or (2) the terms and 
conditions of the grant require that the sponsoring agency specifically 
approve the information collection or collection procedures. As also 
stated in the OMB draft, information collections that are federally 
funded by contracts do not require OMB approval unless the information 
collection meets one or both of the following two conditions: (1) the 
agency reviews and comments upon the text of the privately developed 
survey to the extent that it exercises control over and tacitly 
approves it or (2) there is the appearance of sponsorship, 
for example, public endorsement by an agency, the use of an agency seal 
in the survey, or statements in the instructions of the survey 
indicating that the survey is being conducted to meet the needs of a 
federal agency. Although there are additional surveys funded through 
grants and contracts that are not approved by OMB under the PRA, OMB 
stated that there is no comprehensive list. In addition, the draft 
guidance states that the PRA does not apply to current employees of the 
federal government, military personnel, military reservists, and 
members of the National Guard with respect to all inquiries within the 
scope of their employment and for purposes of obtaining information 
about their duty status. 

For the second objective describing current agency and OMB roles in 
identifying and preventing unnecessary duplication, we took several 
different steps. We reviewed the PRA requirements for agencies and OMB. 
We also interviewed agency clearance officers at the Departments of 
Commerce, Health and Human Services, and Labor about their processes 
for submitting information collection packages to OMB. These are the 
top three agencies in terms of funding for statistical activities in 
fiscal year 2006. We also interviewed OMB officials about 
their role in approving proposed information collections. 

For the third objective, we identified surveys with duplicative content 
by reviewing our reports and the relevant literature and by 
interviewing agency officials. We flagged potential duplication when 
several surveys contained questions on the same subject; this 
determination was based strictly on similar content in the surveys, 
specifically concerning people without health insurance and those with 
disabilities. We also looked at the 
duplication in the subject area of housing between the American 
Community Survey and American Housing Survey, which had been identified 
by our previous work. We also examined environmental surveys but 
determined that their content did not duplicate that of the major 
surveys in our review. Once we had identified the three subject areas, we used 
literature and interviews to identify the current federally funded 
surveys that were cited as the major surveys in each theme; we did not 
restrict our selection to any particular type of survey. To learn more about 
the duplicative content between surveys related to these three themes, 
we reviewed relevant literature and agency documents. We also 
interviewed officials from OMB, and the Departments of Commerce, Labor, 
Health and Human Services, and Housing and Urban Development. In 
addition, we interviewed experts from organizations that focus on 
federal statistics, such as the Council of Professional Associations 
on Statistics and the Committee on National Statistics of the National 
Academies. 

Although we have included the Census Bureau's Survey of Income and 
Program Participation (SIPP) as part of our assessment of potential 
duplication, the fiscal year 2007 President's budget proposed to cut 
Census Bureau funding by $9.2 million, to which the Census Bureau 
responded by stating that it would reengineer the SIPP. Therefore, the 
fate of the SIPP is uncertain, and reengineering has not been 
completed. 

For the fourth objective, we also interviewed OMB officials, agency 
officials, and organizations that focus on federal statistics. Through 
the combination of agency and OMB interviews, expert interviews, and 
research, we identified selected agency efforts to improve the 
efficiency and relevance of surveys. 

[End of section] 

Appendix II: Comments from the Department of Housing and Urban 
Development: 

U.S. Department Of Housing And Urban Development: 
Washington, DC 20410-6000: 
Assistant Secretary For Policy Development And Research: 

August 31, 2006 

Ms. Bernice Steinhardt: 
Director, Strategic Issues: 
United States Government Accountability Office: 
441 G Street, NW: 
Washington, DC 20548: 

Dear Ms. Steinhardt: 

On behalf of Secretary Jackson, thank you for your letter of August 3, 
2006, requesting comments before the report is issued. HUD is pleased 
to provide comments on your draft report, "Federal Information 
Collection: A Reexamination of the Portfolio of Major Household Surveys 
is Needed," (GAO-06-941). HUD is very supportive of the Government 
Accountability Office (GAO) recommendation that Office of Management 
and Budget (OMB) coordinate a government-wide comprehensive 
reexamination of federally funded surveys. This is especially important 
as the American Community Survey (ACS) approaches full-scale data 
availability. We also agree with GAO's support of the notion that there 
is "necessary duplication," and that such duplication creates richer 
and more meaningful data sources for multivariate data analysis of the 
complex interaction of housing, family, economic, and neighborhood 
characteristics. 

HUD would like to see an expansion of the discussion of the 
longitudinal dimension of some surveys. HUD has made use of the 
longitudinal characteristics of both the American Housing Survey (AHS) 
and Survey of Income Program Participation (SIPP). The longitudinal 
characteristic is important to HUD and other analysts since it involves 
following the same housing unit (as done in the AHS) or family of 
individuals over time (as done in SIPP). We use the AHS to study the 
dynamics of the housing stock and the SIPP to understand the income 
dynamics that drive variation in the need for housing assistance over 
time. 

The longitudinal feature of the AHS is the basis for our Components of 
Inventory Change (CINCH) and Rental Dynamics analyses. The CINCH 
reports examine changes in the housing stock over time by comparing the 
status and characteristics of housing units in successive surveys. The 
Rental Dynamics program is a specialized form of CINCH, concentrating 
on changes to the rental housing stock, with an emphasis on changes in 
affordability. 

HUD has used SIPP in recent analysis to show that households 
experiencing high rent burdens in a given year move in and out of high 
rent burden status over subsequent years with considerable frequency. 
That is, the size of the population with worst-case need is fairly 
stable, but its membership is not. This is potentially an important 
issue in the design of housing assistance programs, and cannot be 
studied in the absence of a long and deep survey panel. 

If you have any questions concerning our comments, please contact Ron 
Sepanik of my staff at 202-708-1060, extension 5887; or at Ronald J. 
Sepanik@hud.gov. 

Sincerely, 

Signed by: 

Darlene F. Williams: 
Assistant Secretary for Policy Development and Research: 

[End of section] 

Appendix III: Comments from the Department of Health & Human Services: 

Office of the Assistant Secretary for Legislation: 
DEPARTMENT OF HEALTH & HUMAN SERVICES: 

SEP 06 2006: 

Ms. Susan Ragland: 
Assistant Director: 
Strategic Issues Team: 
U.S. Government Accountability Office: 
Washington, DC 20548: 

Dear Ms. Ragland: 

Enclosed are the Department's comments on the U.S. Government 
Accountability Office's (GAO) draft report entitled, "Federal 
Information Collections: A Reexamination of the Portfolio of Major 
Household Surveys is Needed" (GAO-06-941), before its publication. 
These comments represent the tentative position of the Department and 
are subject to reevaluation when the final version of this report is 
received. 

The Department provided several technical comments directly to your 
staff. 

The Department appreciates the opportunity to comment on this draft 
report before its publication. 

Sincerely, 

Signed by: 

Vincent J. Ventimiglia, Jr. 
Assistant Secretary for Legislation: 

Comments Of Department Of Health And Human Services On Federal 
Information Collections: A Reexamination Of The Portfolio Of Major 
Household Surveys Is Needed GAO-06-941: 

General Comments: 

The report investigated potential unnecessary duplication, which is 
defined as "information similar to or corresponding to information that 
could serve the agency's purpose and is already accessible to the 
agency." Only three subject areas were studied: people without health 
insurance, persons with disabilities, and housing. The following 
comments focus on the first two topics. 

The draft discusses the fact that different data collections produce 
different estimates of characteristics. It is important to emphasize, 
however, that differences in estimates do not mean that there is 
unnecessary duplication according to the definition employed by GAO. 
The report should clarify how the existing variation, and the progress 
that is being made to address it, relates to determining if unnecessary 
duplication exists. If the concern is only that the estimates differ, 
considerable work has been done with regard to both health insurance 
and disability data to address this issue with the objective of 
understanding the differences so as to improve estimates. There are 
challenges in how data from multiple sources are reported and 
interpreted, and CDC's National Center for Health Statistics (NCHS) is 
working with our partners in HHS and Census to address these issues. 

The draft does not address whether the information needed by the agency 
is available from other sources. While there may be similarities or 
even overlap in data collection items, these items are often included 
in surveys so that they can be analyzed in conjunction with other 
information that is needed by the agency. Disability status is an 
excellent example. Health departments use health surveys to obtain 
information on functioning and disability to describe the health of the 
population and to relate disease states to their functional sequelae. 
Departments of transportation are interested in whether persons with 
disabilities access the transportation system and departments of 
education are interested in the educational attainment of persons with 
disabilities. Detailed information on health, transportation and 
education is not found on a single survey. In fact, the reason that 
there is such interest in standardizing the measurement of disability 
status (as noted by the reference to the work of the Washington Group 
and the American Community Survey (ACS) workgroup) is the desire to add 
a standard question set to a range of studies so that the status of 
persons with disabilities can be described across studies. Similar 
examples exist for insurance status. The draft on pages 6 and 19 refers 
to the fact that duplication can yield richer data that provides for a 
fuller description or understanding of the topic but this fact is not 
given the importance it deserves. 

The draft does not mention the negative effects of not collecting 
information because information on that one estimate is available from 
another source. For example, not collecting data on health insurance as 
part of the National Health Interview Survey (NHIS) would eliminate the 
ability to conduct important multivariate analyses on such topics as 
the impact of insurance coverage on access to care, as access to care 
is generally not collected on other surveys. 

The draft on page 22 specifically addresses disability data and the 
fact that different surveys collect different measures and produce 
different estimates. Again, this is to be expected. The confusion 
caused by these different estimates derives in large part from the lack 
of a single definition of disability which leads to data collections 
that use different questions and that also combine information 
differently in defining this population. While the label disability is 
used, different concepts are being measured. The data collections 
mentioned are simply reflecting the complexity of this concept and the 
need for obtaining data on various aspects of a health condition to 
contribute to an understanding of it. 

The draft on page 31 cites only one example of HHS efforts to reduce 
survey duplication, but others could be cited as well. It should be 
noted in particular that the NHIS and the Medical Expenditure Panel 
Survey (MEPS) - both of which are discussed in the draft - have been 
closely linked since 1995. A subset of the broad-based NHIS sample is 
selected to form the sample of the more focused MEPS. 

NCHS works through multiple coordinating and clearance mechanisms to 
ensure that our data systems are efficient and are not duplicative of 
other existing surveys, and considers it extremely important to 
continue to work with HHS, OMB, and other agencies to ensure that 
surveys are efficient, yet produce the full depth, range, and 
analytical comparability of information needed. 

HHS Recommendations: 

The draft recommends "...that the Director of OMB bring together all 
relevant stakeholders and decision makers...that would identify 
opportunities for redesigning and reprioritizing the portfolio of 
household surveys." 

CDC's NCHS has on multiple occasions worked closely with OMB on cross- 
agency efforts to streamline and integrate statistical efforts, and on 
a regular basis works with OMB to ensure that proposed health surveys 
do not duplicate other efforts. NCHS is committed to continuing such 
involvement and to working closely with OMB in the discharge of its 
responsibilities under the Paperwork Reduction Act. 

The recommendation for a sweeping government-wide review of surveys and 
priorities seems to be a disproportionate response to the limited 
evidence of overlap and duplication presented in the draft. The draft 
presents examples of agency efforts that address similar topics, 
several of which are already the subject of ongoing interagency 
coordination efforts (i.e., HHS-led efforts to rationalize health 
insurance estimates and aid analytic interpretation of estimates from 
multiple surveys). It would be more appropriate for GAO to make more 
targeted recommendations in areas where they found unambiguous evidence 
of duplication without purpose, or to identify shortcomings in the 
process by which agencies address these issues. Federal agencies are 
already mindful of the impact the American Community Survey will have 
on ongoing household surveys, and are prepared to address the changes 
in the statistical system that will be necessary. 

[End of section] 

Appendix IV: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Bernice Steinhardt, (202) 512-6543 or steinhardtb@gao.gov: 

Staff Acknowledgments: 

In addition to the contact named above, key contributors to this report 
were Susan Ragland, Assistant Director; Maya Chakko; Kisha Clark; Ellen 
Grady; Elizabeth M. Hosler; Andrea Levine; Jean McSween; Elizabeth 
Powell; and Greg Wilmoth. 

(450414): 

FOOTNOTES 

[1] The PRA was enacted in 1980 and has been amended several times. 44 
U.S.C. §§ 3501-3521. 

[2] GAO, 21st Century Challenges: Reexamining the Base of the Federal 
Government, GAO-05-325SP (Washington, D.C.: February 2005). 

[3] The database of OMB-approved federally funded information 
collections is administered by the General Services Administration, 
which works closely with OMB. 

[4] GAO, Paperwork Reduction Act: New Approach May Be Needed to Reduce 
Government Burden on Public, GAO-05-424 (Washington, D.C.: May 20, 
2005). 

[5] OMB Form 83-I provides seven categories for agencies' use in 
designating the purpose for the proposed information collection: 
application for benefits, program evaluation, general purpose 
statistics, audit, program planning or management, research, and 
regulatory or compliance. 

[6] GAO, The American Community Survey: Accuracy and Timeliness Issues, 
GAO-02-956R (Washington, D.C.: Sept. 30, 2002). 

[7] The data are current as of August 7, 2006. OMB's approvals may be 
in effect for up to 3 years and include new and ongoing information 
collections. 

[8] GAO, Paperwork Reduction Act: New Approach May Be Needed to Reduce 
Government Burden on Public, GAO-05-424 (Washington, D.C.: May 20, 
2005). 

[9] We have suggested that Congress eliminate the 60-day Federal 
Register notice from the agency clearance process, since these notices 
elicit few comments. GAO, Paperwork Reduction Act: New Approach May Be 
Needed to Reduce Government Burden on Public, GAO-05-424 (Washington, 
D.C.: May 20, 2005). 

[10] According to the Statistical Programs of the United States 
Government: Fiscal Year 2006, approximately 40 percent of the funding 
for statistical programs provides resources for 10 agencies that have 
statistical activities as their principal mission. The remaining 
funding is spread among almost 70 other agencies that carry out 
statistical activities in conjunction with other program missions, such 
as providing services or enforcing regulations. 

[11] As referenced in OMB's draft guidance on agency information 
collections, surveys conducted by recipients of federal funding 
generally do not require OMB approval. However, there are circumstances 
where the survey may require OMB approval. See appendix I for 
explanation. 

[12] We have reported that it is important to recognize that burden- 
hour estimates have limitations. Estimating the amount of time it will 
take for an individual to collect and provide information or how many 
individuals an information collection will affect is not a simple 
matter. Therefore, the degree to which agency burden-hour estimates 
reflect real burden is unclear. Nevertheless, these are the best 
indicators of paperwork burden available, and we believe they can be 
useful as long as their limitations are kept in mind. GAO, The 
Paperwork Reduction Act: Burden Increases and Violations Persist, GAO-02-
598T (Washington, D.C.: Apr. 11, 2002). 

[13] Office of Management and Budget, The Paperwork Reduction Act of 
1995: Implementing Guidance for OMB Review of Agency Information 
Collection, draft (Aug. 16, 1999). 

[14] There are 10 information collection standards required by the PRA. 
The packages agencies submit to OMB typically include a copy of the 
survey instrument and a Paperwork Reduction Act Submission (Standard 
Form 83-I). The 83-I requires agencies to answer questions, and provide 
supporting documentation, about why the collection is necessary, 
whether it is new or an extension of a currently approved survey, 
whether it is voluntary or mandatory, and the estimated burden hours. 

[15] GAO-05-424. 

[16] The principal statistical agencies are the Bureau of the Census, 
Bureau of Economic Analysis, Bureau of Justice Statistics, Bureau of 
Labor Statistics, Bureau of Transportation Statistics, Economic 
Research Service, Energy Information Administration, National 
Agricultural Statistics Service, National Center for Education 
Statistics, and National Center for Health Statistics. 

[17] Department of Health and Human Services, ASPE Issue Brief: 
Understanding Estimates of the Uninsured: Putting the Differences in 
Context (September 2005). 

[18] Activities of daily living include getting around inside the home, 
getting in or out of bed or a chair, bathing, dressing, eating, and 
toileting. Instrumental activities of daily living include going 
outside the home, keeping track of money and bills, preparing meals, 
doing light housework, taking prescription medicines in the right 
amount at the right time, and using the telephone. 

[19] Benjamin H. Harris, Gerry Hendershot, and David C. Stapleton, A 
Guide to Disability Statistics From the National Health Interview 
Survey (New York: Cornell University Employment and Disability 
Institute, October 2005). 

[20] GAO-02-956R. 

[21] GAO, Rental Housing: HUD Can Improve Its Process for Estimating 
Fair Market Rents, GAO-05-342 (Washington, D.C.: Mar. 31, 2005). 

[22] GAO-02-956R. 

[23] Knit fabric is fabric made on a knitting machine, and industrial 
gases are manufactured industrial organic and inorganic gases in 
compressed, liquid, or solid forms. 

[24] At least 2 years before the decennial census is implemented, 
census-proposed questions must be submitted to the committees of 
Congress having legislative jurisdiction over the Census. 13 U.S.C. § 
141(f). 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site (www.gao.gov). Each weekday, GAO posts 
newly released reports, testimony, and correspondence on its Web site. 
To have GAO e-mail you a list of newly posted products every afternoon, 
go to www.gao.gov and select "Subscribe to Updates." 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. Government Accountability Office 441 G Street NW, Room LM 
Washington, D.C. 20548: 

To order by Phone: Voice: (202) 512-6000 TDD: (202) 512-2537 Fax: (202) 
512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: www.gao.gov/fraudnet/fraudnet.htm E-mail: fraudnet@gao.gov 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Gloria Jarmon, Managing Director, JarmonG@gao.gov (202) 512-4400 U.S. 
Government Accountability Office, 441 G Street NW, Room 7125 
Washington, D.C. 20548: 

Public Affairs: 

Paul Anderson, Managing Director, AndersonP1@gao.gov (202) 512-4800 
U.S. Government Accountability Office, 441 G Street NW, Room 7149 
Washington, D.C. 20548: