This is the accessible text file for GAO report number GAO-11-396 entitled 'Key Indicator Systems: Experiences of Other National and Subnational Systems Offer Insights for the United States' which was released on April 1, 2011. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. 
United States Government Accountability Office: GAO: Report to Congressional Addressees: March 2011: Key Indicator Systems: Experiences of Other National and Subnational Systems Offer Insights for the United States: GAO-11-396: [Note: The report was reissued on April 12, 2011 with the following clarification to the Highlights page (in the first paragraph under Why GAO Did This Study): "The Patient Protection and Affordable Care Act of 2010 (PPACA) authorized a congressionally appointed commission and the National Academy of Sciences (NAS) to oversee the development of a key national indicator system for the U.S." This change makes the text consistent with that used on page 9 of the report]. GAO Highlights: Highlights of GAO-11-396, a report to congressional addressees. Why GAO Did This Study: The U.S. has many indicators on a variety of topics such as the economy and health, but has no official vehicle for integrating and disseminating this information to better inform the nation about complex challenges. Diverse jurisdictions across the U.S. and internationally are integrating and disseminating this information through comprehensive key indicator systems. The Patient Protection and Affordable Care Act of 2010 (PPACA) authorized the National Academy of Sciences (NAS) to oversee the development of a key national indicator system for the U.S. PPACA also directed GAO to study (1) how indicator systems are being used; (2) how indicator systems are designed and developed; (3) some factors necessary to sustain a system; and (4) potential implications for the development and use of a U.S. system. This study builds on a 2004 GAO report on key indicator systems. GAO also obtained information on 20 comprehensive indicator systems from diverse U.S. and international areas; reviewed seven of those systems in greater depth; and interviewed system experts, representatives, and stakeholders. 
GAO verified the accuracy of the information about indicator systems with system representatives, the NAS, the Office of Management and Budget, and selected federal agencies and made technical changes as appropriate. GAO does not make recommendations in this report. What GAO Found: Key indicator systems integrate reliable statistical information on a jurisdiction’s economic, social, and environmental conditions. Figure: Possible Topics for a Comprehensive Key Indicator System: [Refer to PDF for image: illustrated table; 3 photographs] The Economy: * Consumers and employment; * Transportation and infrastructure; * Finance and money; * Business and markets; * Government; * The world economy. Society: * Health and housing; * Communities and citizenship; * Education and innovation; * Security and safety; * Crime and justice; * Children, families, and aging; * Democracy and governance; * Arts and culture. The Environment: * The Earth (ecosystems) * Land; * Water; * Air; * Natural resources. Crosscutting indicator categories: * Quality of life; * Sustainability; * Poverty; * Diversity; * Opportunity; * Mobility; * Equity. Sources: GAO (information); PhotoDisc and BrandXPictures (photos). [End of figure] The NAS and others who will oversee the development of a U.S. key indicator system can draw insights from the experiences GAO observed at the local, state, regional, and national levels in the U.S. and other countries. GAO found that the indicator systems reviewed were used for one or more overarching purposes, including increasing transparency and public awareness; fostering civic engagement and collaboration; and monitoring progress, aiding decision making, and promoting accountability. 
GAO also identified several key elements in developing and designing indicator systems, such as: (1) consulting experts and stakeholders about the purpose and design of the system, (2) using relevant indicators based on reliable data, and (3) providing disaggregated and comparative data where feasible. In addition, GAO found that sustaining indicator systems can present a constant challenge, depending on stable and diversified funding and the continued interest of key stakeholders. Thus, a participatory process for developing and revising the system is important. Data produced by the federal statistical community and other sources could serve as the beginning foundation for a U.S. system. The federal government can also benefit from a system by using information on trends in societal conditions to inform strategic planning and decision making. Although a fully operational set of measures will take time to develop, require broad involvement of American society, and involve substantial resource commitments, the benefits can include: (1) more informed policy choices, (2) a better educated citizenry, and (3) greater civic engagement. View [hyperlink, http://www.gao.gov/products/GAO-11-396] or key components. For more information, contact Bernice Steinhardt at (202) 512-6543 or steinhardtb@gao.gov. [End of section] Contents: Letter: Background: Key Indicator Systems Are Used for Multiple Purposes: Key Elements Factored into the Development and Design of Comprehensive Indicator Systems: Sustaining Support for Indicator Systems Is a Constant Challenge: Potential Implications for How a Key National Indicator System Could Be Developed and Used in the U.S. 
Appendixes: Appendix I: Indicator System Definitions: Appendix II: Comprehensive Key Indicator System Case Study Profiles: Measures of Australia's Progress: MONET Indicator System, Switzerland: United Kingdom's Government Sustainable Development Indicators: Community Indicators Victoria, Australia: Virginia Performs: King County AIMs High, Washington: Boston Indicators Project: Appendix III: Objectives, Scope, and Methodology: Appendix IV: Pub. L. No. 111-148, "Patient Protection and Affordable Care Act," Title V, Section 5605; 124 Stat. 680: Appendix V: Full Text for Figure 3 Presentation of Key Indicators from the MONET System: Appendix VI: GAO Contact and Staff Acknowledgments: Tables: Table 1: Descriptions of the Seven Case Study Comprehensive Key Indicator Systems: Table 2: Descriptions of the 13 Additional Comprehensive Key Indicator Systems Reviewed: Table 3: Comprehensive Key Indicator Systems Selected for GAO's Study: Figures: Figure 1: Possible Topics for a Comprehensive Key Indicator System: Figure 2: Illustration of MAP Access to Commentary, MAP Data, and Source Data: Figure 3: Interactive Presentation of Key Indicators from the MONET System: Figure 4: Presentation of Virginia Comparative Educational Attainment Data: Figure 5: Presentation of Renewable Energy Production and Consumption Trend Data: Figure 6: Example of Community Indicators Victoria Report Creation Interface and Report: Figure 7: Example of Online Indicator Mapping Tool Featuring Information on the Greater Boston Metropolitan Area: Figure 8: Example of MAP Use of Color Coding to Show Indicator Trends: Figure 9: Example of Information Provided by MAP, Competitiveness and Openness Supplementary Dimension: Figure 10: Example of Information Provided by MONET on the Official Development Assistance to Poor Countries Indicator: Figure 11: Example of MONET System Use of Color Coding to Depict Indicator Progress for the Global Dimension: Figure 12: Summary of the 11 Key Challenges of the 
Sustainable Development Strategy Using the Color Coding of the Indicators: Figure 13: Example of Pocket Guide Information Provided on the Indicator for Water Resource Use: Figure 14: Summary of Changes in All UK Government Sustainable Development Indicators from the Pocket Guide: Figure 15: Table Presentation of Indicator from Community Indicators Victoria, by Region within Victoria: Figure 16: Map Presentation of Indicator from Community Indicators Victoria: Figure 17: A High-level Schematic of the Virginia Performs Architecture: Figure 18: Example of High School Graduation Indicator Page from Virginia Performs: Figure 19: Virginia Performs Indicators Scorecard at a Glance: Figure 20: List of Community Indicators and Performance Measures in the "Health" Category of King County AIMs High: Figure 21: Example of Indicator Page from AIMs High Web Site: Figure 22: Boston Indicators Project Web Site Indicator Page Example: Abbreviations: ABS: Australian Bureau of Statistics: BIP: Boston Indicators Project: Defra: Department for Environment, Food and Rural Affairs: GDP: Gross Domestic Product: GPRA: Government Performance and Results Act: JCCI: Jacksonville Community Council, Inc. MAP: Measures of Australia's Progress: MONET: Monitoring der Nachhaltigen Entwicklung or Monitoring Sustainable Development: NAS: National Academy of Sciences: OECD: Organization for Economic Cooperation and Development: OMB: Office of Management and Budget: PPACA: Patient Protection and Affordable Care Act of 2010: SUSA: State of the USA: UK: United Kingdom: VCIP: Victorian Community Indicators Project: [End of section] United States Government Accountability Office: Washington, D.C. 
20548: March 31, 2011: Congressional Addressees: The creation of a key national indicator system to help Americans better assess the nation's progress is formally under way, with passage of legislation authorizing a national system.[Footnote 1] A key national indicator system aims to aggregate into a system essential statistical measures of economic, social, and environmental issues to provide reliable information on a country's condition, offering a shared frame of reference that enables collective accountability.[Footnote 2] Key indicator systems are numerous in communities, cities, counties, and regions across the country, but the United States, unlike some other countries, has had no widely shared factual frame of reference for assessing national position and progress across a range of critical challenges. The 21st century ushered in a period of profound transition for societies and governments around the world, marked by growing global interdependence, rapid advances in science and technology, and environmental sustainability and quality of life issues, among others. All of these trends have changed public expectations of government, and in the U.S., carry a number of significant implications. Among other things, the government's ability to attain societal goals will increasingly depend on strengthened mechanisms for collaboration with other governments and the not-for-profit and private sectors in dealing with a number of major challenges. A key national indicator system can help support these collaborations, providing a framework for related strategic planning efforts and linking shared purposes. It can also enhance transparency, accountability, and efficiency as it helps the public and its leaders better assess national position and progress. 
We have previously reported that a key national indicator system has the potential to build sophisticated information resources that can help to identify a country's significant challenges and opportunities, inform choices regarding the allocation of scarce public resources, assess whether solutions are working, and make comparisons within the country and to other countries.[Footnote 3] Indicators are measures that describe conditions over time. This is important for monitoring progress toward societal aims, such as improving education, enhancing security, or protecting the environment, which require reliable, unbiased, and useful indicators that are readily accessible to a wide variety of audiences. In many ways such information about the nation and the world is more available today than ever before, but too often it is in formats and locations that may make it difficult to locate and use effectively to provide an integrated picture of a jurisdiction's position and progress. Looking at the parts of a society or individual topics is no substitute for viewing the whole. Along these lines, there are numerous examples of comprehensive key indicator systems that bring together a select set of indicators that provide information conveniently in one place on a broad range of areas, such as economic development and employment, air and water quality, and public health and education. We were asked by the Chairman of the Senate Homeland Security Subcommittee on Federal Financial Management, Government Information, Federal Services, and International Security to update our work on indicator systems to learn more about how key indicator systems are being used, experiences of others in developing the systems, and what some of the implications might be for a U.S. key national indicator system. Subsequently, the Patient Protection and Affordable Care Act, which included a provision directing that the National Academy of Sciences[Footnote 4] establish a U.S. 
key national indicator system, required that we report on work conducted by public agencies, private organizations, or foreign countries with respect to best practices for a key national indicator system.[Footnote 5] In response to the Senate request and the mandate, this report addresses (1) how indicator systems are being used by government entities, nongovernment stakeholders, and citizens; (2) how indicator systems are developed and designed; (3) some of the factors necessary to sustain indicator systems; and (4) potential implications for how a U.S. key national indicator system could be developed and used. This report builds on the findings from our November 2004 report on key indicator systems.[Footnote 6] In addition, based on recommendations from experts and our review of the literature, we selected a group of 20 comprehensive indicator systems from different jurisdictional levels and diverse geographic locations. We also conducted in-depth case studies--including interviews with officials or managers and stakeholders--of 7 of these 20 systems. The criteria for selection as a case study system included (1) comprehensiveness--a mixture of economic, social, and environmental indicators; (2) longevity--in existence for at least 5 years and currently in operation; (3) outcome-oriented--with measures of progress over time or toward goals or outcomes; and (4) involvement of a government entity as a partner or as a user of information from the system. We interviewed representatives from each of the selected indicator systems, as well as a range of experts in the indicator field and representatives from the National Academy of Sciences. Table 1 provides a description of the 7 case study indicator systems we examined, and table 2 has a description of the 13 additional indicator systems included in our review. Further information on the case study systems is provided in appendix II. 
To analyze potential implications for a key national indicator system for the U.S., we drew upon our fieldwork, expert interviews, and professional judgment. Table 1: Descriptions of the Seven Case Study Comprehensive Key Indicator Systems: Name of system: Boston Indicators Project (MA); Level: Local; Description: Consists of 70 goals with indicators organized into 10 "sectors"--civic vitality, cultural life and the arts, economy, education, environment and energy, health, housing, public safety, technology, and transportation; Identified purposes: To raise public awareness, aid decision making, foster civic engagement, and monitor progress toward defined outcomes; Managing/host organization(s): The Boston Foundation, a community foundation, in partnership with the City of Boston and the Metropolitan Area Planning Council; Date first reported: First report released in 2000; Web site: [hyperlink, http://www.bostonindicators.org]. Name of system: King County AIMs High (WA); Level: County; Description: Consists of over 60 "community indicators" organized into 8 categories--natural resources; built environment; housing and homelessness; economic vitality; health; law, safety, and justice; accountability and transparency; equity and social justice; Identified purposes: To raise public awareness and aid decision making; Managing/host organization(s): Government of King County, Washington; Date first reported: First report released in 2006; Web site: [hyperlink, http://www.kingcounty.gov/aimshigh/]. 
Name of system: Community Indicators Victoria, Australia; Level: State; Description: Consists of approximately 80 indicators organized into 5 "domains"--social, economic, environmental, democratic, and cultural; Identified purposes: To raise public awareness, aid decision making, and foster civic engagement; Managing/host organization(s): The McCaughey Centre, School of Population Health at the University of Melbourne; Date first reported: Web site released in 2007; Web site: [hyperlink, http://www.communityindicators.net.au]. Name of system: Virginia Performs (VA); Level: State; Description: Consists of 49 indicators organized into 7 categories-- economy, education, health and family, public safety, natural resources, transportation, government and citizens. Also includes state agency objectives and performance measures that align with 7 long-term state goals; Identified purposes: To raise public awareness, aid decision making, and monitor progress toward defined outcomes; Managing/host organization(s): The Council on Virginia's Future, a state advisory board chaired by the governor; Date first reported: Web site released in 2007; Web site: [hyperlink, http://vaperforms.virginia.gov/]. Name of system: Measures of Australia's Progress; Level: National; Description: Consists of 22 "dimensions of progress" (17 headline and 5 supplementary) organized into 3 broad "domains"--society, the economy, and the environment. Each domain addresses several dimensions, such as health within the social domain, national income within the economic domain, and biodiversity within the environmental domain. Each dimension has a range of indicators and contextual information; Identified purposes: To raise public awareness and aid decision making; Managing/host organization(s): Australian Bureau of Statistics; Date first reported: First report released in 2002; Web site: [hyperlink, http://www.abs.gov.au/AUSSTATS/abs@.nsf/mf/1370.0]. 
Name of system: MONET Indicator System, Switzerland; Level: National; Description: Consists of 80 indicators organized under 12 themes. A headline set of 17 key indicators is arranged under 4 broad questions--"How well do we live?" "How well are resources distributed?" "What are we leaving behind for our children?" and "How efficiently are we using our natural resources?" Identified purposes: To raise public awareness and aid decision making; Managing/host organization(s): Swiss Federal Statistical Office in cooperation with others, including the Federal Office for Spatial Development; Date first reported: First report released in 2003; Web site: [hyperlink, http://www.bfs.admin.ch/bfs/portal/en/index/themen/21.html]. Name of system: United Kingdom Government Sustainable Development Indicators; Level: National; Description: Consists of 68 indicators organized under 4 themes-- "Sustainable consumption and production," "Climate change and energy," "Protecting natural resources and enhancing the environment," and "Creating sustainable communities." 20 of these indicators are identified as "key" indicators; Identified purposes: To raise public awareness and aid decision making; Managing/host organization(s): UK Department for Environment Food and Rural Affairs (Defra); Date first reported: First report released in 1996; Web site: [hyperlink, http://sd.defra.gov/uk/progress/national/annual-review]. Source: GAO analysis of information from the case study comprehensive key indicator systems. [End of table] Table 2: Descriptions of the 13 Additional Comprehensive Key Indicator Systems Reviewed: Name of system: Albuquerque Progress Report (NM); Level: Local; Description: Consists of 8 goal areas ranging from "Human and Family Development" to "Environmental Protection and Enhancement," to "Economic Vitality"--that are further subdivided into 62 Desired Community Conditions. 
Individual indicators are used to assess progress toward those desired conditions; Identified purposes: To raise public awareness, aid decision making, and monitor progress toward defined outcomes; Managing/host organization(s): The Indicators Progress Commission (IPC), which has responsibility for developing and tracking the indicators, and the City of Albuquerque; Date first reported: First report released by City of Albuquerque in 1996. The IPC released subsequent editions beginning in 2000; Web site: [hyperlink, http://www.cabq.gov/progress/]. Name of system: Cercle Indicateurs, Switzerland; Level: Local/state; Description: Consists of 37 indicators organized into environmental, economic, and society "dimensions." Provides comparative information for cities and cantons in Switzerland; Identified purposes: To aid decision making; Managing/host organization(s): Swiss Federal Office for Spatial Development and the Swiss Federal Statistical Office; Date first reported: First report released in 2005; Web site: [hyperlink, http://www.bfs.admin.ch/bfs/portal/en/index/themen/21/02/autres.html]. Name of system: Jacksonville Community Council, Inc. Quality of Life Progress Report (FL); Level: Local; Description: Consists of 115 indicators, with a subset of 22 identified as "key," organized into 9 categories--education, economy, natural environment, social wellbeing and harmony, arts and culture, health, government, transportation, and public safety. The categories are aligned with 9 "goal statements." Identified purposes: To raise public awareness, aid decision making, foster civic engagement, and monitor progress toward defined outcomes; Managing/host organization(s): Jacksonville Community Council, Inc., a non-profit civic organization; Date first reported: First report released in 1985; Web site: [hyperlink, http://www.jcci.org/jcciwebsite/pages/indicators.html]. 
Name of system: Truckee Meadows Tomorrow Quality of Life Indicators (NV); Level: Local; Description: Consists of 33 indicators divided into 10 categories-- arts and cultural vitality, civic engagement, economic well-being, education and lifelong learning, enrichment, health and wellness, innovation, land use and infrastructure, natural environment, and public well-being; Identified purposes: To raise public awareness, aid decision making, foster civic engagement, and monitor progress toward defined outcomes; Managing/host organization(s): Truckee Meadows Tomorrow, a community- based nonprofit organization; Date first reported: First report released in 1994; Web site: [hyperlink, http://www.truckeemeadowstomorrow.org/]. Name of system: Orange County Community Indicators (CA); Level: County; Description: Consists of over 45 indicators organized into 7 categories--economic and business climate, technology and business innovation, education, community health and prosperity, public safety, environment, and civic engagement; Identified purposes: To raise public awareness and aid decision making; Managing/host organization(s): Government of Orange County, CA, in partnership with the Orange County Business Council and the Children and Families Commission of Orange County; Date first reported: First report released in 2000; Web site: [hyperlink, http://www.ocgov.com/ocgov/Info%20OC/Facts%20&%20Figures/Community%20Indicators]. 
Name of system: Santa Cruz County Community Assessment Project (CA); Level: County; Description: Consists of over 100 indicators organized into 6 categories--economy, education, health, public safety, social environment, and natural environment; Identified purposes: To raise public awareness, aid decision making, foster civic engagement, and monitor progress toward defined outcomes; Managing/host organization(s): Consortium of public and private health, education, human service, and civic organizations convened by the United Way of Santa Cruz County; Date first reported: First report released in 1995; Web site: [hyperlink, http://www.santacruzcountycap.org/]. Name of system: Long Island Index (NY); Level: Regional; Description: Consists of indicators organized into 10 categories-- economy, population, housing, transportation, safety net, health, education, environment, open space, and governance; Identified purposes: To raise public awareness, foster civic engagement, and monitor progress toward defined outcomes; Managing/host organization(s): The Rauch Foundation, a Long Island- based foundation; Date first reported: First report released in 2004; Web site: [hyperlink, http://www.longislandindex.org]. Name of system: Silicon Valley Index (CA); Level: Regional; Description: Consists of indicators organized into 15 categories ranging from "Employment" and "Innovation" to "Quality of Health" to "Environment." These 15 categories are grouped also into 4 broader categories--people, economy, society, and place; Identified purposes: To raise public awareness and aid decision making; Managing/host organization(s): Joint Venture: Silicon Valley Network, a public-private organization, and the Silicon Valley Community Foundation; Date first reported: First report released in 1995; Web site: [hyperlink, http://www.jointventure.org/]. 
Name of system: Arizona Indicators (AZ); Level: State; Description: Consists of indicators divided into 11 "content areas"-- economy, public finance, education, innovation, sustainability, culture, health, human services, criminal justice, transportation, and demographics; Identified purposes: To raise public awareness and aid decision making; Managing/host organization(s): Morrison Institute for Public Policy at Arizona State University; Date first reported: Web site released in 2007; Web site: [hyperlink, http://arizonaindicators.org/]. Name of system: Measures of Growth in Focus (ME); Level: State; Description: Consists of 25 indicators divided into 10 indicator categories. These 10 categories are also grouped into 3 broader categories--economic, community, and environment; Identified purposes: To aid decision making and monitor progress toward defined outcomes; Managing/host organization(s): Maine Development Foundation, a nonprofit corporation with a mandate to promote Maine's economy; Date first reported: First report released in 1996; Web site: [hyperlink, http://www.mdf.org/publications.php]. Name of system: Oregon Benchmarks (OR); Level: State; Description: Consists of 91 "benchmarks," and 158 "benchmark indicators" organized into 7 categories--economy, education, civic engagement, social support, public safety, community development, and environment; Identified purposes: To raise public awareness, aid decision making, and monitor progress toward defined outcomes; Managing/host organization(s): Oregon Progress Board, an independent state board. Funding was discontinued in 2009, but the Secretary of State continues to keep the data current; Date first reported: First report released in 1991; Web site: [hyperlink, http://benchmarks.oregon.gov]. 
Name of system: South Australia's Strategic Plan, Australia; Level: State; Description: Consists of 98 targets organized according to 6 "objectives"--"Growing prosperity," "Improving wellbeing," "Attaining sustainability," "Fostering creativity and innovation," "Building communities," and "Expanding opportunity." Each target has associated indicators used to track progress; Identified purposes: To raise public awareness, aid decision making, foster civic engagement, and monitor progress toward defined outcomes; Managing/host organization(s): Government of the state of South Australia, with an independent Audit Committee to provide oversight and report on progress; Date first reported: First progress report released in 2006; Web site: [hyperlink, http://www.stateplan.sa.gov.au/]. Name of system: Tasmania Together, Australia; Level: State; Description: Consists of 12 goals, ranging from "Increased work opportunities for all Tasmanians" to "Active, healthy Tasmanians with access to health care," and 151 "benchmarks," or indicators, that measure progress toward the goals; Identified purposes: To raise public awareness, aid decision making, foster civic engagement, and monitor progress toward defined outcomes; Managing/host organization(s): Tasmania Together Progress Board, an independent statutory authority reporting directly to the Tasmanian Parliament; Date first reported: First progress report released in 2002; Web site: [hyperlink, http://www.tasmaniatogether.tas.gov.au/]. Source: GAO analysis of information from the comprehensive key indicator systems. [End of table] We conducted our work from February 2010 to March 2011 in accordance with all sections of GAO's Quality Assurance Framework that are relevant to our objectives. The framework requires that we plan and perform the engagement to obtain sufficient and appropriate evidence to meet our stated objectives and to discuss any limitations in our work. 
We believe that the information and data obtained, and the analysis conducted, provide a reasonable basis for our findings and conclusions. More detailed information on our scope and methodology appears in appendix III. Background: The Need for a U.S. Key National Indicator System Has Gained Recognition: In February 2003, we convened a forum, in cooperation with the National Academy of Sciences, centered on the creation of a national system of indicators for the United States. More than 60 leaders from around the country gathered to discuss whether a key national indicator system could help create a more informed and accountable democracy. Subsequent to the forum, we reported on the state of the practice in key indicator systems already under way at all levels of U.S. society and options for Congress to consider in creating a key national indicator system for the U.S.[Footnote 7] In November 2006, we recommended that the 110th Congress' oversight agenda include, among other things, highlighting the need for a U.S. key national indicator system through public hearings and examining the possible role of a public-private partnership to further develop and operate a system of key national indicators.[Footnote 8] By the end of 2008, a legislative proposal for a key national indicator system had been created with bipartisan sponsorship. The President signed it into law in March 2010 as part of the Patient Protection and Affordable Care Act, with the provision that members of a federally appointed commission oversee implementation of the new key national indicator system. 
By December 2010, congressional leaders in the Senate and the House of Representatives had selected members of a bipartisan Commission on Key National Indicators.[Footnote 9] Specific responsibilities of the commission include conducting oversight of the system and issuing annual reports; managing a contract with the National Academy of Sciences for system implementation; facilitating support of the system, including federal funding and access to federal data sources; and making recommendations on system improvements as well as issues and measures to be considered.[Footnote 10] The National Academy of Sciences has been working in partnership with a nonprofit institute, the State of the USA (SUSA), to develop a plan for the construction and management of a key national indicator system; select issues to be represented by the indicators and the measures and data to be used for those indicators; design and maintain a public Web site;[Footnote 11] and develop a quality assurance framework to ensure rigor in the presentation of information and the selection of measures and data sources. According to a National Academy of Sciences representative, this plan is based on experience gained through research, development, and piloting activities conducted by SUSA over the past 5 years. A total of $70 million in public financial support is authorized for the system over 9 years to complement contributions by the private sector, which to date total approximately $13 million. In our 2004 report, we suggested that with such a public-private partnership, Congress would have greater flexibility in designing a unique organization and selecting from a range of possible features, with the opportunity to leverage federal resources with private ones--money, expertise, and technologies. [Footnote 12] However, to date Congress has not appropriated funding for the system. U.S. Federal Statistical System Includes Indicators in a Variety of Topical Areas: The U.S. 
federal statistical system includes indicators on many specific topics and consists of numerous agencies and programs. Each was established separately in response to different needs, and there are over 70 agencies conducting statistical activities. Ten principal federal statistical agencies collect, analyze, and produce statistics as their primary mission, and the Interagency Council on Statistical Policy--under the leadership of the Office of Management and Budget (OMB)--enhances coordination and collaboration among federal agencies that collect and disseminate indicators. More broadly, the United States has national-level indicator systems in a variety of topical areas, most of which are supported by the federal statistical system. For example, America's Children: Key National Indicators of Well-Being provides a comprehensive set of 40 indicators measuring critical aspects of children's lives. This indicator system is managed by the Federal Interagency Forum on Child and Family Statistics, which consists of 22 federal agencies that deal with children's issues. Some private research organizations and policy institutes in the United States also produce national-level reports on social, cultural, and environmental indicators in various subject areas. For example, the Annie E. Casey Foundation, a private charitable organization, produces the annual KIDS COUNT Data Book and the KIDS COUNT Data Center, which present national, state, and local-level indicators on the status of America's children.[Footnote 13] The indicators required to inform our nation have evolved in response to needs for new or different types of information, new challenges, and shifting issues and priorities. The call for economic indicators grew out of the nation's experiences during the Great Depression. Social upheavals after World War II and the Great Society in the 1960s helped spark a desire for social and cultural information. 
Concerns about society's impact on the environment pointed to a need for more information on environmental conditions. Substantial information assets now exist in these topical areas, providing a foundation consisting of thousands of indicators. Comprehensive key indicator systems, however, attempt to address questions that topical indicator systems, which focus on a specific issue such as the economy or health, cannot answer for wide and diverse audiences. Indicators included in such systems are a core set of statistical measures that have been selected from a much larger range of possibilities. Figure 1 illustrates the three issue areas commonly found in comprehensive indicator systems and provides examples of potential indicator categories. Figure 1: Possible Topics for a Comprehensive Key Indicator System: [Refer to PDF for image: illustrated table; 3 photographs] The Economy: * Consumers and employment; * Transportation and infrastructure; * Finance and money; * Business and markets; * Government; * The world economy. Society: * Health and housing; * Communities and citizenship; * Education and innovation; * Security and safety; * Crime and justice; * Children, families, and aging; * Democracy and governance; * Arts and culture. The Environment: * The Earth (ecosystems); * Land; * Water; * Air; * Natural resources. Crosscutting indicator categories: * Quality of life; * Sustainability; * Poverty; * Diversity; * Opportunity; * Mobility; * Equity. Sources: GAO (information); PhotoDisc and BrandXPictures (photos). [End of figure] Selecting the key aspects or activities of a society that are most important to measure is a challenge for indicator systems. Diverse perspectives and value judgments significantly affect indicator choices and definitions, which are inherently subjective. 
While opinions can and do differ over what constitutes a nation's position and progress, those involved with indicator systems have nonetheless found sufficient common ground to agree that sustained efforts to collect, organize, and disseminate information in more comprehensive, balanced, and understandable ways provide critical information that all can use in discussing options and making choices to address societal challenges. In addition, international organizations, such as the Organization for Economic Cooperation and Development (OECD) and the International Organization of Supreme Audit Institutions, have begun actively promoting the development and application of key national indicator systems. At the national level, the movement toward comprehensive indicator systems is in part based on long-standing concerns about the adequacy of current measures of national performance, in particular those solely based on Gross Domestic Product (GDP). A key concern is that GDP has become a singular measure of national performance yet does not reflect other dimensions of national well-being, such as improvements or harm to social structures and the environment, sustainability of growth, nonmarket household activities such as unpaid child care, and quality of life issues such as the availability of leisure time. In response to these concerns, French President Nicolas Sarkozy commissioned a report to "explore a broader conceptualization of social progress."[Footnote 14] The report pointed out some of the limitations and the consequences of relying on GDP, highlighting, for example, subjective measures, such as those providing insight into how people perceive their own well-being. The report emphasized that issues such as quality of governance, social contact, and health status are important indicators in themselves, independent of their effect on income. 
The move to consider alternative or additional measures of progress and well-being beyond economic indicators has also been endorsed by the OECD, which has sponsored three World Forums on measuring social progress.[Footnote 15]: Key Indicator Systems Are Used for Multiple Purposes: Key Indicator Systems Can Increase Transparency and Public Awareness: We have previously reported that the effective use of key indicator systems can improve transparency and enhance accountability by giving decision makers and the public easy access to information. If the systems are viewed as credible and relevant, they can provide the capacity for many to work from and make choices based upon a single source of reliable statistical information. They can also enhance efficiency by eliminating the need for individuals or institutions to expend time and resources looking for or compiling and integrating information from disparate sources. Indicator systems can also promote public awareness of issues through indicator reports and Web sites and by making information on the condition of a jurisdiction, and the factors influencing changes in those conditions, more accessible to the community.[Footnote 16] It is important to note, however, that indicators communicate societal conditions, and that while they may provide some insights into the causes of those conditions, this does not necessarily lead to a consensus on the cause or what action, if any, should be taken. Many key indicator systems, such as the King County AIMs High system in the state of Washington, are created to increase the transparency and accessibility of information for their jurisdictions. The AIMs High system, administered by the county government, includes a public report that presents information on key indicators describing the condition of the county across a range of areas, from the quality of its natural resources to the health of its citizens to the vitality of its economy. 
For example, the AIMs High Web site has an indicator for the number of businesses in the county, information on factors that influence business development, and the role county government plays in supporting business development. According to a county legislator, King County government needs to be transparent and accountable to its citizens, and AIMs High has helped with these goals. Key indicator systems not only bring together diverse sources of information, they provide analysis and context for that information, which helps to raise public awareness of conditions in their nation, region, city, or community. For example, Measuring Australia's Progress (MAP), a key national indicator system developed by the Australian Bureau of Statistics, is designed to provide statistical information about the condition of the nation to the public. In addition, MAP releases include extensive interpretive information that provides analysis and context for its indicators.[Footnote 17] The dimension on "work," for example, has data and analysis on unemployment and underemployment, including discussions of subpopulations, such as younger and older workers, single parents, individuals with disabilities and caregivers, and indigenous people. There are also comparisons with other countries, a glossary of related terms, and a hyperlinked list of related Australian Bureau of Statistics publications. Similar interpretive material is provided for other indicators in MAP. Additionally, for those interested in more detail and information on data sources, the MAP Web site offers access to additional sources of data or to more in-depth statistical information. The site, for example, provides links to more extensive data both through downloads of data used in MAP and to the Australian Bureau of Statistics Web pages for supporting data streams. Figure 2 illustrates a MAP Web site user's access to commentary, MAP data, and source data when looking at the work dimension. 
In addition, the Australian Bureau of Statistics works with the Australian media to help ensure that releases of MAP are reported in the national press, which helps bring MAP to the attention of people throughout Australia. Figure 2: Illustration of MAP Access to Commentary, MAP Data, and Source Data: [Refer to PDF for image: web page] Source: Australian Bureau of Statistics. Note: Measures of Australia's Progress, cat. no. 1370.0, Canberra, 2010. Web page can be accessed at [hyperlink, http://www.abs.gov.au/about/progress] (viewed on Mar. 7, 2011). [End of figure] Other systems also present indicators using a narrative approach that "tells a story" and that is designed to make indicators more accessible to general audiences by providing important background and contextual information. For example, for each of the indicators available through its Web site, the Boston Indicators Project explains why the indicator is important and, to place the data in a broader context, how groups or geographic areas within Boston compare to one another or, where feasible, how Boston compares to peer cities throughout the United States. To provide additional contextual information, the system highlights key trends and challenges, recent developments, accomplishments, and innovations for each of the 10 sectors that are tracked, such as the economy and education. In addition, the Boston Indicators Project issues a narrative report every 2 years based on themes developed in civic convenings, the analysis of long-term trends, and progress on measurable goals. Indicator systems can also highlight the links between different policy areas. As an example, the Swiss MONET (Monitoring der Nachhaltigen Entwicklung or Monitoring Sustainable Development) system is based on three qualitative objectives of sustainable development-- economic efficiency, social solidarity, and environmental responsibility. 
Out of a total set of 80 indicators, 17 "headline" or key indicators, each representing a group of indicators, were selected to highlight major trends and salient features.[Footnote 18] The set of 17 indicators is grouped according to four questions that are derived from the MONET indicators framework: * Meeting needs--how well do we live? * Fairness--how well are resources distributed? * Preservation of resources--what are we leaving behind for our children? * Decoupling--how efficiently are we using our natural resources? Figure 3 depicts how the 17 key indicators from the MONET system relate to the three qualitative objectives and are grouped according to the four questions. This highlights how indicators and themes link together. For example, the orange theme shows connections among resource use, energy, economy, and transportation and how they relate to the different objectives underlying MONET. According to MONET officials, such indicator data helped raise awareness of the concerns about overdevelopment and the impacts of land use on transportation, energy use, and the preservation of natural areas.[Footnote 19] Figure 3: Interactive Presentation of Key Indicators from the MONET System: [Refer to PDF for image: illustration] Interactivity instructions: Click on each square to see related indicators and explanations. For the print version of the MONET figure, please see appendix V. Objectives: Social solidarity; Environmental responsibility; Economic efficiency. Questions Related to Objectives: Meeting needs – how well do we live? Fairness – how well are resources distributed? Preservation of resources – what are we leaving behind for our children? Decoupling – how efficiently are we using our natural resources? Source: Adapted from graphics of MONET system, Swiss Confederation. 
Note: Federal Statistical Office, Federal Office for Spatial Development, Agency for Development and Cooperation, and Federal Office for the Environment, Sustainable Development--A Brief Guide 2010 (2011). Web page can be accessed at [hyperlink, http://www.bfs.admin.ch/bfs/portal/de/index/themen/21/02/dashboard/02.html]. [End of figure] In addition, key indicator systems can present indicator information and analysis with products oriented toward different audiences. Many indicator systems produce simplified "scorecards," "pocket guides," reports, and Web-based presentations that provide succinct summaries of the indicators in a way that makes them accessible to a broad audience. These products aim to bring together indicators from different areas in a coherent way, allowing users to quickly determine how a jurisdiction is progressing. Some indicator systems also find it useful to produce specialized products for particular audiences that a system is designed to serve. The developers of Virginia Performs make indicator information available by state legislative district, summarizing this information in a brief "Community Snapshot" document personalized for each member of the state legislature. According to one Virginia legislator, these products are particularly useful as they consolidate key pieces of information on the conditions and trends in each legislative district. Key Indicator Systems Can Foster Civic Engagement and Collaboration: In addition to providing information and raising public awareness, indicator systems are sometimes used to link the system's broad goals and indicators to guide specific actions. An indicator system can serve as a vehicle for encouraging civic engagement both through the system's development process and through action once the indicator system is in place. Comprehensive key indicator systems can also help address community or national challenges by facilitating collaboration of various parties inside and outside of government. 
The focused attention that an indicator system or corresponding report can bring to certain conditions may bring increased pressure to bear on diverse parties in the public and private sectors. Accordingly, these kinds of efforts help break down traditional boundaries between various actors and organizations and encourage them to work together in ways that can provide solutions to long-term challenges. Incorporating public input in the development and use of indicator systems was particularly common among the local systems we examined. For example, one of the stated purposes of the Truckee Meadows Tomorrow indicator system in Nevada is to foster civic involvement around issues affecting the region, such as protecting the region's natural resources and environment, increasing parental involvement in education, and encouraging voter participation. The developers of the Truckee Meadows Tomorrow system used a citizen- and stakeholder-driven process to identify goals and priorities for the region, and the indicators, which provide information on the status of each of these goals, were used to encourage civic engagement and inform collaborative efforts. Managers of Truckee Meadows Tomorrow have also used "Quality of Life Compacts" to encourage civic engagement and improve community outcomes. Quality of Life Compacts are formal, voluntary agreements between Truckee Meadows Tomorrow and one or more organizations, individuals, businesses, or local government entities that work together to improve performance on targeted indicators. One completed compact, between the Washoe County School District and the Washoe Education Association, a teachers' union, was designed to improve parental involvement in schools through actions such as increasing the number of parent volunteer hours, raising parent participation in parent-teacher conferences, and making better use of parent volunteers by teachers through individual action plans at each school in the district. 
Following these efforts, the system's "communitywide involvement in education" indicator, which measures parental involvement through both a survey and parent-teacher conference attendance, showed improvement. Some indicator system managers have convened groups that work on collective strategies to address areas of common interest. In addition to providing data on the condition of a community, the systems facilitate conversations between members of a community from a variety of sectors about ways to address problems. For example, the Boston Indicators Project periodically brings together leaders from the public, private, and nonprofit sectors to discuss key issues and surface themes for its next report. One such effort includes providing staff support to the John LaWare Leadership Forum, quarterly forums that bring together civic, business, and community leaders from throughout Boston to reflect on and discuss identified challenges and potential solutions. For example, the first forums, held in 2005, focused on the weakening of corporate and civic leadership in Boston. By bringing together business and civic leaders with academic experts to focus on key issues and data identified by the Boston Indicators Project, participants were able to explore areas in which Boston and the region could sustain and expand their competitiveness in a global economy while addressing local challenges in education and housing. According to several stakeholders of the system, this effort to foster a shared understanding of key challenges and opportunities has been critical in facilitating connections between actors from different sectors. As another example, at the national level the Healthy People indicator system initiative, a federal effort led by the Department of Health and Human Services, has increasingly engaged stakeholders at subnational levels to assist in progress toward the system's health goals and objectives. 
The Healthy People Consortium--an alliance that now consists of more than 400 national organizations and 250 state and local agencies--was created to forge a coalition dedicated to taking action to achieve the Healthy People objectives, such as reducing obesity. It facilitates broad participation in the process of developing the national prevention agenda and engages local chapters and their members in the provision of community and neighborhood leadership. Some indicator systems have also been used to raise awareness about specific problems and the need for collaborative efforts to address them. The Commonwealth of Virginia provides an example of how indicators encouraged collaboration across sectors to address the issue of infant mortality. In 2007, with a rate of 7.8 deaths per 1,000 live births, Virginia had the 12th highest rate in the nation. Data also showed wide disparities between the northern part of the Commonwealth and its southern and southwestern parts. The Virginia governor set a goal of achieving a statewide infant mortality rate of less than 7 per 1,000 live births. According to a Virginia official, Virginia Performs, by listing the infant mortality rate as one of its key indicators, helped serve as a catalyst, raising the profile of the issue and helping people identify drivers of outcomes. The increased visibility of, and attention to, the infant mortality indicator in Virginia served as a means for focusing collaborative efforts. In 2008, the Commissioner of Health formed a Working Group on Infant Mortality that brought together leaders from the health care industry, community and faith organizations, the business community, insurers, educators, and associations to find ways to promote the health of pregnant women and women with young children. Furthermore, after closely analyzing information on infant deaths in Virginia, it was found that 10 areas within the Commonwealth accounted for 52 percent of all infant deaths. 
To help address the issue in these areas, the Virginia Department of Health created an initiative that focused resources on those 10 areas and engaged community partners, such as grocery store chains, in developing strategies, plans, and actions for reducing the number of infant deaths. By the end of 2008, the infant mortality rate in almost every region was down, and the statewide rate had fallen to 6.7 deaths per 1,000 live births. Similarly, in the city of Jacksonville, Florida, the inclusion of the infant mortality rate as an indicator in the Jacksonville Community Council, Inc. (JCCI) report also helped raise awareness about the scope of infant mortality in Jacksonville and led to collaborative action to address the problem. An Infant Mortality Advocacy Task Force was created, and a JCCI report on infant mortality found that numerous factors faced by women throughout their life cycle, not just those directly related to health care, influence their predisposition to poor birth outcomes. This information resulted in a number of different approaches being developed so that Jacksonville could address the problem in a multifaceted way. For example, local hospitals implemented "baby friendly" designations, vendors at the farmers' markets began to accept food stamps to increase the availability of nutritious alternatives, and Rotary Clubs promoted safe sleep practices. A local foundation launched a social marketing campaign to help educate the community about infant mortality. According to the manager of the JCCI indicator system, efforts like these helped contribute to a 27 percent decline in the infant mortality rate in Jacksonville between 2005 and 2009. Key Indicator Systems Can Be Used to Monitor Progress, Establish Accountability for Results, and Aid Decision Making: Indicator systems and their reports have been used to highlight instances when progress is not being made and to encourage interested parties and stakeholders to take action. 
In addition, by ensuring that relevant, reliable information is made more accessible and usable by many different members of our society, indicator systems help establish accountability and increase the probability that pressing problems are understood and that decisions are well informed. System managers and experts we interviewed expressed a range of perspectives on the importance of articulating goals as part of an indicator system and how specifically an indicator system should define goals or targets to be achieved. Some said that the existence of specifically defined goals or targets can make indicators more meaningful and relevant as accountability tools, help people better understand where a jurisdiction is relative to its goals, and help generate coordinated action to address shared challenges. For example, Maine's Measures of Growth in Focus includes a "research and development expenditures" indicator that tracks progress toward a target that total research and development spending in Maine will increase to 3 percent of the state's GDP by 2015. Other systems used indicators to track progress toward broader goals. The JCCI indicator system, for example, uses a combination of key and supporting indicators to track progress toward nine high-level quality of life goals, such as "achieving educational excellence," "growing a vibrant economy," and "preserving the natural environment." This information is used to identify priority areas where action is needed as well as those areas where improvements have been made. Others stated that some systems, because of the sensitive political environments in which they operate, seek to avoid the political issues that are inherently part of selecting and articulating goals. Instead, these systems may use benchmarks or comparisons to show how a jurisdiction differs from its peers or use trend data to show the movement in an indicator and provide a focus on generating positive movement in that area. 
For example, the UK Government Sustainable Development Indicator System includes an indicator of productivity, which is used to track output per worker over time, relative to a 1991 baseline, and relative to other countries. Some indicator systems exist as one element of a broader plan and are developed to support the monitoring of that plan. For example, the state of South Australia has developed an indicator system to support its strategic plan. The strategic plan includes 98 specific targets, each with an associated indicator; the targets represent outcomes the government hopes to achieve over time. Performance of state agency executives is evaluated on the progress made toward the targets for which their agency has responsibility, and government policies and new proposals are also evaluated according to their ability to produce positive movement toward the targets. By using a series of targets that stem from high-level statewide goals, and indicators to track whether progress is being made, the strategic plan is being used to redirect resources and guide government decision making. As an example, South Australia has used math and science outcomes from its annual Indicators Progress Report to inform the allocation of resources. In recent years the progress report has shown a slight decline in the percentage of students meeting the target for math and science achievement that the government has set as its objective. On the basis of this information, the government has laid out strategies to increase the recruitment, retention, and retraining of math and science teachers, and the government's most recent budget also includes $8.7 million over 4 years to provide schools with more teachers who have specialist qualifications in math and science. There are several mechanisms by which the Tasmanian government has linked its actions to the Tasmania Together system. 
The Tasmanian government has identified a subset of 40 of Tasmania Together's 152 benchmarks as priority benchmarks and assigned responsibility for improving them to state agencies. The state agencies have been required to develop action plans in addition to reporting annually to parliament on relevant benchmarks. This process is being reviewed with a focus on a smaller, more discrete number of priority benchmarks. Most state agencies have also incorporated Tasmania Together benchmarks into their planning processes. Further, the government encourages agencies to link their budget requests to the Tasmania Together benchmarks. Tasmania Together publishes a detailed biennial progress report to parliament in addition to a snapshot of progress every year that is designed to be a quick and accurate assessment of what is progressing and what might need more attention in terms of achieving the targets. Indicator systems are often tied to information used by governments to manage programs and make decisions. For example, Ballarat, a city in the state of Victoria, Australia, used the framework and the data generated by Community Indicators Victoria, a state indicator system, to support its community plans and a legislative initiative on alcohol control that was generated by the community plan's findings on public safety and alcohol consumption. In addition, the government of Orange County, California, used information from its annual indicators report to develop plans and take action to address asthma and immunization rates, as well as homelessness. Specifically, data from a recent report showed that county immunization rates were lower than in peer regions, asthma rates were higher, and homelessness among children was growing. 
The county used this data in an effort to improve outcomes by developing a plan to provide joint asthma clinics with a local university and hospital, a campaign to immunize children in Orange County by the age of 2, and a 10-year plan to address homelessness among the young. As another example, the Boston Foundation recently completed a data-driven strategic review of its grant making program, beginning with an overview of trends and conditions presented by the Boston Indicators Project. Guided by its mission statement and the documentation of community conditions, the foundation developed nine strategies to achieve its goals and then examined data in each issue area to get a better sense of trends and issues. Relevant statistical measures for each strategy are generated internally on a quarterly basis to track progress toward the goals in each strategy area. According to a Boston Foundation official, the quarterly reports are having an impact by focusing decision makers on investments that have the greatest potential to influence positive movement toward the achievement of the foundation's objectives. For example, the foundation is now looking at data from neighborhoods to determine where to invest resources more effectively to address low birth weights in certain areas of Boston. The official said that the focus on data and results has also changed the nature of the conversation between the Boston Foundation and its grant recipients to one that clearly lays out the foundation's expectations for each recipient. 
Key Elements Factored into the Development and Design of Comprehensive Indicator Systems: Consulting Experts and Stakeholders about Purpose and Design Can Result in a More Relevant and Useful System: Involving technical and subject matter experts in the development process can help developers determine which select group of indicators and statistical measures is most appropriate given the purpose and structure of the system and the data needs of intended audiences. For example, a representative from the National Academy of Sciences noted that framing issues and choosing indicators should be based on the best available research from around the nation and the world, particularly given how challenging it is to focus on a limited number of measures. To take advantage of this expertise, some systems, including the three key national indicator systems we reviewed, have used a developmental approach that relies on input from a group of key stakeholders and technical and subject matter experts to inform the selection of the indicators used to measure a jurisdiction's condition and progress. Some system managers we interviewed said that experts play an important role in the development of indicator systems by providing technical and subject matter knowledge that can be used to identify (1) the factors that are most critical in determining how a jurisdiction is doing, (2) the most appropriate indicators to measure a jurisdiction's condition and progress, and (3) sources of available data for the indicators. A system manager cautioned, however, that without opportunities for meaningful stakeholder input during the development process, the indicators may bear little relation to the priorities and concerns of intended audiences, which can undermine the relevancy and legitimacy of a system. Therefore, this approach to development has also been combined with other mechanisms for collecting feedback from a broader range of stakeholders and citizens. 
The developers of MAP, for example, convened a group of experts from universities, national scientific organizations, and nongovernmental organizations to help guide the initial development of the system. Subsequently, developers also reached out to a wider range of interested public, nonprofit, and private sector stakeholders from across Australia and collected feedback through a series of targeted seminars held throughout the country that were also open to the public. Because the selection of indicators is not a value-neutral activity, and different audiences may prefer different indicators, involving a diverse collection of stakeholders in the development process can allow developers to collect input on the priorities, concerns, and preferences of a range of potential audiences. Several system managers and experts we interviewed mentioned that before selecting specific indicators it is important to identify the system's intended audiences--whether it be the general public, government officials, or specific sets of stakeholders such as business and civic leaders--and consider how representatives from those audiences can be involved in the system's development. Involving these representatives in decisions about the system's purposes and design can also help build and sustain the credibility and legitimacy of the system; help ensure that the selected goals and indicators align with the priorities of intended audiences; create a sense of ownership from involved stakeholders; and increase the likelihood that intended audiences will see the indicator system as a relevant and useful tool to inform their decision making. For example, while Virginia Performs was under development, stakeholders from state government and the Council on Virginia's Future expressed their desire for a system that would allow them to see trends in the condition of Virginia, compare Virginia to peer states and national leaders, and compare regions within Virginia. 
Because developers were aware of these needs, they were able to design the system to collect and present the disaggregated, comparative, and trend data necessary to accommodate them. Involving a wide range of stakeholders, including the general public, in the development process was a common characteristic of systems we reviewed that were designed to monitor progress toward achieving goals and to increase civic engagement. Truckee Meadows Tomorrow, for example, used extensive public participation to select the indicators that would make up the system. Its developers began by bringing together a diverse group of representatives from local organizations--including representatives of groups that may have been underrepresented in past community-based efforts. These representatives formed nine committees that each developed a list of about a dozen potential indicators that could be used to track community progress in different areas. Truckee Meadows Tomorrow board members and staff then made over 100 presentations to civic organizations, community groups, and businesses to present the draft indicators and used these opportunities to ask audience members to prioritize the indicators. Over 2,000 citizens participated in this phase of the project, using "play money" to vote on what mattered the most to their quality of life. The next phase involved several surveys of the community asking respondents to rate the top 100 indicators on a scale of 1 to 5. This effort yielded input from another 1,000 residents and was followed by a random phone survey of 600 residents. All of this input was used to inform the selection of the final set of 66 indicators. 
As another example, the development of Tasmania Together began with the Premier of Tasmania asking a representative group of 24 community leaders from around the state to consult with their communities, identify common priorities, and collect input from citizens on what Tasmania should seek to achieve by 2020. This group of leaders collected the views of Tasmanians via public forums and meetings, Internet submissions, and letters. This effort led to the selection of 24 goals to structure the indicator system, and a "consultation" document was released to collect additional public feedback. As an example of how stakeholder and public outreach efforts such as these can be combined with the work of experts, after completing the public outreach process, the Tasmania Together system's developers worked with more than 100 industry, community, and public-sector specialists to select the indicators and data sources. Extensive public outreach can present logistical challenges. As we have previously reported, when indicator systems involve a diverse group of stakeholders, it is important to build sufficient time into the process of selecting indicators to allow stakeholders to address differences and reach consensus. For example, in Tasmania the initial public consultation process was expected to take only 3 months but ended up taking 18 months. As another example, the public engagement process used to inform the selection of the original indicators for Truckee Meadows Tomorrow took approximately 1 year to complete. Relevant Indicators Based on Reliable Data Help Ensure the Credibility of a System: According to system managers we interviewed, ensuring the credibility of a system requires relevant indicators supported with reliable, accurate, and up-to-date statistical information. The selection of specific data sources was described as a process that should be guided by professional standards for quality. 
While the data used to support indicators should be reliable and of high quality, the indicators must also be relevant to the key issues that the system's stakeholders and audiences care about. Many of the comprehensive key indicator systems we reviewed highlighted the importance of selecting indicators that share these characteristics. Several systems made these characteristics explicit criteria to help ensure that the indicators would meet certain standards for reliability and relevancy. For example, Community Indicators Victoria established criteria that required an indicator to be, among other things: * relevant and valuable to the community; * endorsed by experts on the topic; * populated with regular and reliable data sources; and: * unambiguous and able to resonate with the general population. Using a set of selection criteria that all stakeholders agree to in advance can also help ensure that the indicator selection process works effectively from the outset, and applying the criteria can make it easier to rank candidate indicators or to decide not to use some of them. While many indicator systems rely, for the most part, on data-producing organizations to ensure valid, quality data, there are systems that have their own processes to help ensure the quality and appropriateness of their indicators and data. For example, before selecting individual indicators, the Albuquerque Indicators Progress Commission considered several questions, including whether: * the source is unbiased and reliable; * there are policy agendas connected to the indicator; * the data are gathered consistently; and: * the measurement methodology is sound. 
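The kind of criteria-driven selection described above can be illustrated in code. The sketch below is not drawn from any system in this report; the criteria names, the 1-to-5 rating scale, and the "floor" rule that disqualifies an indicator failing any single criterion are all assumptions for the example.

```python
# Illustrative sketch of ranking candidate indicators against a set of
# agreed-upon selection criteria. Criteria, ratings, and the disqualifying
# floor are hypothetical, not taken from any system described in the report.

CRITERIA = ["relevant", "expert_endorsed", "reliable_data", "unambiguous"]
FLOOR = 2  # a candidate rated below this on any criterion is dropped

def score_indicator(ratings):
    """Average the 1-5 ratings across all criteria; return 0.0 if the
    candidate falls below the agreed floor on any single criterion."""
    if any(ratings[c] < FLOOR for c in CRITERIA):
        return 0.0
    return sum(ratings[c] for c in CRITERIA) / len(CRITERIA)

# Hypothetical candidates with panel ratings for each criterion.
candidates = {
    "infant mortality rate": {"relevant": 5, "expert_endorsed": 5,
                              "reliable_data": 4, "unambiguous": 4},
    "community pride index": {"relevant": 4, "expert_endorsed": 2,
                              "reliable_data": 1, "unambiguous": 2},
}

# Rank candidates from highest to lowest score.
ranked = sorted(candidates,
                key=lambda name: score_indicator(candidates[name]),
                reverse=True)
```

A shared, pre-agreed scoring rule of this kind makes it easier to defend decisions to drop weak candidates, since the disqualification follows mechanically from criteria all stakeholders endorsed in advance.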
Providing Disaggregated and Comparable Data Available over Time Can Increase the Usefulness of an Indicator System: In addition to ensuring that measures included in an indicator system are unbiased and reliable, system managers and experts we interviewed emphasized that an indicator system can be enhanced by having data that are disaggregated by geographic area and demographic group, comparable across jurisdictions, and available over time. Interactive Web sites and mapping technologies are also improving the ease of presenting and analyzing large amounts of data. Disaggregated Data: Indicators supported with data disaggregated by race, gender, geography, and socioeconomic status are useful for audiences because they can show variations among areas and groups. Aggregated measures providing a high-level view of a jurisdiction are useful in some contexts, but may have limited value for decision makers as they can mask disparities among geographic areas and demographic groups. Disaggregated data are valuable because they allow users to see these disparities, which can help decision makers identify issues needing attention and target strategies to address the disparities. The extent to which the systems we reviewed provided disaggregated data varied, but virtually all of the systems provided some data disaggregated by geography or demographic characteristics. There are some, such as Virginia Performs, that make it a central part of their presentation. For example, according to users of Virginia Performs, the usefulness of the system is strengthened by the fact that it provides disaggregated data for eight regions within Virginia. A Virginia state legislator said that state policymakers can use this information to understand the disparities that exist among regions in Virginia and to inform legislative initiatives to address them. 
For example, data have shown that, over time, educational attainment levels are higher in northern Virginia than in other parts of the Commonwealth. Because of increased awareness of these disparities, Virginia has begun to invest in expanding higher education opportunities in traditionally underserved areas. In figure 4, disaggregated educational attainment data from Virginia Performs show how regions within Virginia compare to one another. Figure 4: Presentation of Virginia Comparative Educational Attainment Data: [Refer to PDF for image: web site page] Source: Council on Virginia’s Future. Note: Web page from Virginia Performs can be accessed at [hyperlink, http://vaperforms.virginia.gov/indicators/education/edAttainment] (viewed on Mar. 7, 2011). [End of figure] The demand may be high for data from progressively smaller geographic areas, but capturing reliable data for these areas can be a challenge. As data are disaggregated, their quality and reliability may come into question. Furthermore, data may not be collected at the desired geographic level or according to the racial, gender, or demographic variables of interest. For example, according to officials from Arizona Indicators, an ongoing challenge is data availability, particularly finding uniform data at the subcounty level. In some cases, the system's developers have been hindered from providing information at the community level because they have found that the county is the smallest unit for which they have been able to procure reliable and uniform data. A system official, for example, said that they would like to be able to provide data disaggregated by zip code for certain indicators, such as the incidence of diabetes, but the information is not available at that level. 
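The value of disaggregation can be shown with a tiny numerical sketch: a single statewide figure hides the regional spread that grouped data reveal. The region names and percentages below are invented for illustration and are not Virginia Performs data.

```python
# Hypothetical example of why disaggregated data matter: the statewide
# average masks a large gap between the highest and lowest regions.
# All figures are invented for illustration.

records = [  # (region, percent of adults with a bachelor's degree or higher)
    ("Northern", 46.0),
    ("Central", 32.0),
    ("Southwest", 18.0),
    ("Southside", 16.0),
]

# The aggregate view: one number for the whole state.
statewide_avg = sum(pct for _, pct in records) / len(records)

# The disaggregated view: the same data grouped by region, exposing
# the disparity the average conceals.
by_region = {region: pct for region, pct in records}
gap = max(by_region.values()) - min(by_region.values())
```

Here the statewide average (28.0 percent) gives no hint of the 30-point spread between the strongest and weakest regions, which is exactly the disparity that region-level presentation makes visible to policymakers.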
Comparative Data: Virtually all of the systems we reviewed also made some comparative data available, while some, such as the Albuquerque Progress Report and the Swiss Cercle Indicateurs, made comparative data a central part of their presentation. Data that are comparable and consistent across jurisdictions can provide a frame of reference for assessing the condition and progress of a jurisdiction relative to its peers, and, by identifying jurisdictions that may serve as a model for others, encourage benchmarking and action to generate improvements. For example, in 2004 the Community Assessment Project of Santa Cruz County reported that the county ranked 51st out of California's 66 counties for the percentage of overweight children younger than 5 years, and 57th for children aged 5 to less than 20 years. The availability of this comparative information and increased awareness about this problem helped spur the creation in 2004 of the Go for Health! Collaborative, which works to increase healthy eating and regular physical activity for children and youth in Santa Cruz County. Go for Health! has over 150 member organizations working to achieve 24 outcomes. In addition, it has placed fruit stands on school campuses, worked with public works departments to add bike lanes, and worked with grocery stores to replace candy with fruit at check-out aisles. A lack of consistency in data definitions or units of measurement from one jurisdiction to another, however, reduces the usability and comparability of the data. For this reason, it is important for system developers interested in using comparative data to ensure that the methodologies, definitions, and units of measurement are consistent across jurisdictions. Trend Data: Stability and continuity in the indicators and data can also help audiences detect changes in indicators and understand the historical context surrounding an issue. 
According to system managers we interviewed, trend data are important because, when they are available over a sufficient period of time, they can provide a clearer picture of the progress of a jurisdiction. Trend analysis can be used to determine if changes in indicators represent an isolated movement or a true trend, or if a policy or programmatic initiative could be having an intended or unintended impact. Furthermore, by indicating when a persistent problem exists, trend data can be used to focus civic leaders and government officials on issues most deserving of attention. Lastly, insights into the correlations between indicators can provide perspective on how issues are connected, reinforcing that societal issues should not be looked at in isolation. Virtually all indicator systems we reviewed made trend data available. Trend data are particularly useful for jurisdictions using their indicator systems to monitor progress toward defined goals or outcomes. For example, the South Australian Strategic Plan includes a target that renewable energy should comprise 20 percent of the state's electricity production and consumption by 2014. As shown in figure 5, the trend data show that since 2000-01 there has been significant growth in both renewable energy production and consumption and "positive movement" toward the achievement of the target. Figure 5: Presentation of Renewable Energy Production and Consumption Trend Data: [Refer to PDF for image: web site page] Source: South Australia’s Strategic Plan Audit Committee. Note: South Australia's Strategic Plan Progress Report 2010. Report can be downloaded at [hyperlink, http://www.stateplan.sa.gov.au/system/pdf/SASP%202010%20Progress%20report.pdf] (viewed on Mar. 7, 2011). 
[End of figure] Internet and Mapping Technologies: Over the past several years, improvements in Internet and electronic mapping technologies have played a large role in the increased sophistication with which indicators and statistical information can be presented. In the past, indicators were generally presented in printed reports released on a periodic basis. Today, by contrast, indicators are increasingly being presented using interactive Web sites that can be updated frequently and that allow users to sort and analyze data by geographic area, subject, or indicator, and to create customized reports. For example, as shown in figure 6, Community Indicators Victoria allows users to create customized "Wellbeing Reports," with comparative charts for the geographic areas and indicators most relevant to them. On the left side of the figure is an illustration of the interface used to select relevant local government areas or regions and indicators to create a Wellbeing Report. On the right side of the figure is an example of a Wellbeing Report that a user has created for five local government areas within the Northern and Western Metro region of Victoria. This report's information allows the user to see how the proportion of people reporting their health as excellent or very good varies by area, as well as how these proportions compare with those for the region and for the state of Victoria. The dotted reference line, which represents the highest score for the indicator registered for any local government area in the state, also shows how these proportions compare with this benchmark. Figure 6: Example of Community Indicators Victoria Report Creation Interface and Report: [Refer to PDF for image: web site page] Source: McCaughey Centre, School of Population Health, University of Melbourne. Note: Web page from Community Indicators Victoria can be accessed at [hyperlink, http://www.communityindicators.net.au/node/add/report] (viewed on Mar. 7, 2011). 
[End of figure] Geographic Information Systems and mapping technologies have also made it possible to map indicators down to focused geographic areas, such as the community level, when data are available at that level. Improved mapping and data visualization software is also simplifying analysis by allowing large amounts of data to be presented using a variety of visual formats, including scatter plots, bar or pie charts, and line graphs, and by allowing users to create maps that show disparities that exist across multiple jurisdictions. Several of the indicator systems we reviewed now offer mapping tools on their Web sites. For example, the developers of the Boston Indicators Project have worked with staff from the region's planning agency to create the MetroBoston DataCommon, an online mapping tool that provides data on the Boston region. The DataCommon allows users to analyze multiple data sets and create customized maps of the region and its municipalities. In figure 7, a user of the MetroBoston DataCommon has created a map comparing the percentages of students receiving reduced-price or free school lunches in municipalities throughout the Boston metropolitan region. The various colors represent different percentage levels of students receiving reduced-price or free lunches and allow users to visualize the variations that exist across the region. In this case, areas with the lightest color have between 0 and 6 percent of students receiving free or reduced-price lunches, while areas with the darkest color have 49 percent or more of students receiving free or reduced-price lunches. Figure 7: Example of Online Indicator Mapping Tool Featuring Information on the Greater Boston Metropolitan Area: [Refer to PDF for image: web site page] Source: The Boston Foundation and Metropolitan Area Planning Council. Note: MetroBoston DataCommon mapping tool can be accessed at [hyperlink, http://www.metrobostondatacommon.org] (viewed on Mar. 7, 2011). 
[End of figure] The Boston Indicators Project and Metropolitan Area Planning Council have also partnered with computer scientists at the University of Massachusetts-Lowell and representatives of other indicator systems around the nation to form the Open Indicators Consortium, which is developing an open source mapping, analysis, and data visualization tool. Finding New Ways to Collect and Use Data Can Help Fill Gaps: Filling in gaps in data can be challenging for comprehensive key indicator systems that rely almost exclusively on data from public sources, which may not provide data in areas of interest or at sufficiently disaggregated levels. According to a representative from the National Academy of Sciences, it will be essential that a U.S. key national indicator system rely not only on government data but on university-based, commercial, and nonprofit data sources in areas where the government cannot provide data. In some instances, however, the data for indicators necessary to measure key issues may not be available from any source and would need to be developed. Some system managers we interviewed stated that they have tried not to let data availability affect the selection of indicators. If an issue has been identified as important, they believe it should be included in the indicator set and efforts made to find supporting data. This can be done by finding new ways to collect and use existing data, or by collecting new data. For example, following the 2005 revision of the UK Government Sustainable Development indicators, eight indicators without supporting data were added to the system. By 2009, the system's managers were able to find data for seven of those indicators. A specific example of their efforts to address one of these gaps involves the development of an indicator for "environmental equality." This indicator, which first appeared in the 2007 report, is designed to evaluate the relationship between environmental conditions and poverty. 
By combining data from the English Indices of Multiple Deprivation (measures for local areas released by the UK Department of Communities and Local Government) with information on eight environmental conditions for communities in England, the system's managers developed a measure of the percentage of the population living in areas with, in relative terms, the "least favorable" environmental conditions. Analysis of this indicator has shown that a higher proportion of people in the most deprived areas of England may live in areas with multiple environmental conditions that are the least favorable, compared with populations living in less deprived areas. A UK official said this was the first time this relationship had been quantified, and efforts are now ongoing to determine how this information can be used to inform policy development. Other indicator systems addressed data gaps through the collection of original survey data. For example, to collect comparable data on the perceived well-being of citizens in each of the 79 local government areas in the state of Victoria, in 2007, Community Indicators Victoria commissioned a telephone survey of approximately 24,000 Victorians, ensuring that they received at least 300 responses from each local government area. Similar surveys have been used in Albuquerque, New Mexico; Jacksonville, Florida; Santa Cruz County, California; Truckee Meadows, Nevada; and Long Island, New York, to collect information on the concerns, opinions, desires, and needs of a demographically representative sample of citizens and to determine if citizen perceptions align with the empirical evidence about conditions in each jurisdiction. Indicator systems may also use existing data collected by agencies responsible for administering nonstatistical programs and services. 
Using administrative data has a number of advantages, including no additional costs for data collection or burdens on survey respondents, and recent advancements in technology have permitted statistical agencies to overcome many of the limitations of processing large data sets. For example, as a measure of "economic innovation," the Silicon Valley Index uses information on patent registrations from the United States Patent and Trademark Office to calculate the percentage of all patents registered in California and the United States that are registered to residents of Silicon Valley. Systems that use administrative data, however, should be aware of issues related to the level of quality control over the data, problems associated with missing records or the timeliness of the data, privacy concerns, and the cost that comes with cleaning administrative data to make them useful. The managers of the Annie E. Casey Foundation's KIDS COUNT system have worked with a nationwide network of partners to more effectively compile and leverage existing data. According to a KIDS COUNT official, following the initial development of the system in the early 1990s, there was a desire to go beyond collecting and reporting aggregated state-level data. The managers of KIDS COUNT formed partnerships with child advocacy organizations and research institutions in all 50 states to collect and report county- and local-level data, which are made available through the KIDS COUNT Data Center. While there are variations in county-level data available from one state to another that in some cases make it impossible to compare counties from different states, this effort has made it possible to compare counties within a state. This network of state partners, which receives some financial assistance from the national KIDS COUNT office, plays a critical role in the KIDS COUNT effort to provide information on the condition of children across the country. 
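Computationally, an administrative-data indicator like the Silicon Valley Index's patent share is a simple ratio over record counts. The sketch below uses invented counts purely for illustration; real figures would come from USPTO registration records.

```python
# Hedged sketch of a patent-share indicator built from administrative
# counts, in the spirit of the Silicon Valley Index. All counts here are
# invented; actual values would be drawn from USPTO records.

def regional_share(region_count, total_count):
    """Percentage of total registrations attributable to the region."""
    if total_count <= 0:
        raise ValueError("total_count must be positive")
    return 100.0 * region_count / total_count

# Hypothetical yearly patent registration counts.
sv_patents = 7_500      # registered to Silicon Valley residents
ca_patents = 22_000     # registered statewide in California
us_patents = 90_000     # registered nationwide

share_of_state = regional_share(sv_patents, ca_patents)
share_of_nation = regional_share(sv_patents, us_patents)
```

Because the underlying counts are produced as a byproduct of patent administration, the indicator adds no data-collection cost or respondent burden, though the caveats noted above (record quality, timeliness, cleaning costs) still apply.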
Periodic Reevaluation and Revision of the Indicators Maintains Relevance: While it is important to have stable indicators and measures, occasional changes, including dropping, modifying, and adding indicators, are needed to ensure the system remains relevant. According to system managers we interviewed, system developers should allow flexibility for revisions and modifications based on feedback from users, changes in the interests and values of audiences over time, advances in research, and improvements in data. For this reason, it is important that there are periodic reviews and mechanisms for collecting feedback from users. While some systems collect ongoing feedback, the approach used by many of the indicator systems we reviewed is a formal review of the indicators, often as part of a periodic effort to update their systems. For example, the Jacksonville indicator system instituted a formal process to review the system's indicators and draft products annually. Before the release of its indicators report, the organization convenes a balanced group of 20 to 25 community leaders, data experts, and interested citizens. It provides participants with the draft report, facilitates a review of the draft to ensure that the content is clear, accurate, and fair, and collects feedback on the design and usability of each report. Furthermore, every year, following the public release of its report, JCCI surveys key stakeholders and interested citizens to collect feedback on what they liked about the report, as well as suggestions for how it could be improved. According to a JCCI official, every year approximately 5 percent of the indicators are altered, removed, or added to reflect the availability of better indicators or data, or changes in the perceptions of issues within the community. 
Sustaining Support for Indicator Systems Is a Constant Challenge: Stable and Diversified Funding Helps Ensure Continuity of Indicator Systems: We have previously reported that securing adequate and stable funding to run the indicator system at the outset, when costs are higher, as well as later when costs sometimes level off, is crucial to a system's long-term sustainability.[Footnote 20] One way to ensure the stability of the system is to diversify the number and types of funding sources. A lack of diversified funding sources makes indicator systems more vulnerable due to their dependence on one source for most or all of their funding. Systems that rely on multiple funding sources, such as governments, foundations, and corporations, can make up for reductions from one source by turning to others for additional funding or possibly by reaching out to new funding sources. A project manager from Community Indicators Victoria, a state indicators project hosted by the McCaughey Centre at the University of Melbourne, noted that finding stable funding for the system is challenging. When creating the system in 2004, the developers recognized that the advantage of being at arm's length from the state government was that Community Indicators Victoria would be viewed as independent. However, as a nongovernmental entity, maintaining funding has been more precarious. The developers of the system rejected the idea of charging for data, reasoning that the data should be for the public good. Now, the project manager said the charitable foundation funding the system would like to take a step back as the main funding source. She noted that she has had to develop a consultancy service to generate revenue to help support the project. The Arizona Indicators system, created in 2007, is an example of an indicator system with several sources of support. The system began in the Office of the President at Arizona State University with strong backing from the Arizona Community Foundation. 
These two entities each have made significant multiyear commitments and continue to provide the vast majority of funding for the project. In addition, Valley of the Sun United Way has provided varying levels of support over time. Recently, for example, they have underwritten the addition of content that tracks changes in the state budget and explores the human impact of funding cuts. Early on, the Arizona Department of Commerce provided funding for the development of select economic indicators. They have not been able to continue their financial commitment, but they feature Arizona Indicators prominently on their Web site and drive considerable traffic to the system's Web site. All of these partners contribute time and expertise to the indicator system by, for example, attending planning meetings, reviewing content, helping with outreach, and connecting the system with colleagues in their professional networks. Indicator Systems Depend on the Continued Interest of Sponsors, Advisors, and Champions: Maintaining stable and diversified funding depends in part on the continued interest of sponsors, advisors, and champions. Experts and managers from our selected systems told us that the challenge of maintaining the interest of stakeholders is constant, even among indicator systems that already have strong levels of financial and political support and large user bases. Some systems that are able to garner the funding and political support needed to start an effort experienced difficulties in maintaining that support. 
Buy-in from users across the public, private, and nonprofit sectors, however, can increase the likelihood that an indicator system will be funded, and we have previously reported that mechanisms for helping to maintain support from system stakeholders include showing that the system's managers are achieving the indicator system's stated aims; using scarce resources effectively; remaining independent from political processes; and emphasizing opportunities for improvement.[Footnote 21] Indicator researchers have noted that managing the expectations of stakeholders is also an important part of sustaining an indicator system. If expectations for the system are unrealistic, the actual achievements of the project may be undermined, which in addition to engendering a sense of disappointment, risks the continued support of sponsors.[Footnote 22] An official previously associated with the Oregon Benchmarks indicator system, a statewide system which is currently not funded, noted that for an indicator system to have an impact, it is important to have a critical mass of influential actors who understand and support the system. In the Oregon legislature, the institution of term limits compounded difficulties already created by member turnover: over time, fewer members understood the purpose of the benchmarks or wanted to use them to inform their decision making. In hindsight, the official said that more could have been done to maintain buy-in and interest from the system's stakeholders. For example, in her view, holding annual events could have brought everyone--politicians, policy advocates, business community representatives, and interested citizens--together to discuss the importance of the benchmarks system. She believed these events also would have served as an opportunity to encourage the development of a continuing dialogue among stakeholders and to mine the knowledge of citizens. 
The importance of cultivating and maintaining champions of the indicator system was mentioned by a number of managers and officials we interviewed. For example, according to the head of the Australian Bureau of Statistics, a risk for Australia's national indicator system would be not having various sectors, such as the business community and the media, supportive of the MAP initiative. In his opinion, having this support gives politicians comfort and confidence in the system. An official formerly with the Oregon Benchmarks system mentioned the importance of cultivating bipartisan champions to support the system. He noted that he was able to navigate in a challenging environment by aligning with the influential legislators who saw the value of the indicator system and were willing to support it. Similarly, a King County AIMs High manager said that there must be buy-in from high-level county leadership to help ensure a strong, clear mandate for the system, which then makes it easier for developers to persuade others of the system's worth. Indicator Systems Insulated from Political Pressure Can Protect the Systems from Perceptions of Bias: Managing the tension between the scientific, political, and cultural dimensions of indicator work involves acknowledging the value-laden nature of indicator development. Given this tension, we have previously reported that if an indicator system is to have staying power, it is important to insulate the system, as much as possible, from political pressures and other sources of potential bias.[Footnote 23] An indicator system and its managers must be seen as credible, with a participatory process for developing and revising the system over time. When this is not the case and indicator systems are perceived as biased toward a particular ideological or partisan perspective, the indicators are less likely to have credibility and may lose support from a broad group of users. 
Without the credibility that comes from a strong degree of independence and support from a diverse set of stakeholders, some users may lose trust in the accuracy and objectivity of the system and the information it provides. The Oregon Benchmarks system experience suggests that support for an indicator system can be lost if it is perceived as being the creation of a particular political party, a political leader, or a single branch of government. When the Oregon benchmarks were first created, the governor and majorities in both chambers of the state legislature were from the same political party. Support for the indicator system from the legislature decreased after the opposing political party gained the majority in the legislature because the system was perceived as being driven by the executive branch and the governor's political party. The system, as mentioned previously, is currently not funded. In contrast, the Council on Virginia's Future--a group that has the involvement of the governor, lieutenant governor, cabinet members, high ranking members of the General Assembly from both parties, and influential citizens--is designed to serve as the overall champion of Virginia Performs. The developers of Virginia Performs have also partnered with experts from the Weldon Cooper Center, a well-respected and nonpartisan public affairs research institution at the University of Virginia. According to members of the council and their staff, Virginia Performs' relationship with the Weldon Cooper Center has been important in insulating the system from political concerns and questions about its quality, as the developers of the system have been able to take advantage of the expertise and technical capacity of the Center's researchers. Developers of indicator systems have also established independent bodies to provide objective, nonpartisan oversight and ensure that their systems are not in a position to be politicized. 
For example, the government of South Australia established an audit committee to oversee the development of indicators used to track progress toward goals outlined in the South Australian Strategic Plan. The committee is an independent body that ensures that the indicators are sufficiently rigorous, meet criteria for selection, and are periodically reevaluated; it also provides suggestions for improvements. According to South Australian officials, its existence and the assessments it provides increase the credibility of the indicators by ensuring that they receive independent verification and validation. The Patient Protection and Affordable Care Act, which authorized the establishment of a key national indicator system for the United States, also provided for oversight of the system through the creation of the bipartisan Commission on Key National Indicators. Continually Raising the Public's Level of Awareness of a System Can Help Preserve Its Relevance: Reaching diverse audiences, including the print and electronic media, can be achieved through multifaceted marketing and communications strategies. These strategies spread the word about the existence and features of the system; disseminate information on what the indicator trends are showing; help to encourage a broader base of individuals and organizations to use the system; and provide training and assistance to users. Developers of the indicator system need to establish strong relations with the media and listen to their reporting needs. As an example of this, The Arizona Republic, a daily newspaper published in Phoenix, frequently promotes Arizona Indicators by publishing an "Arizona Indicators Snapshot" in the Viewpoints section of its Sunday edition and covers Arizona Indicators events and policy briefs. The "product" of the system also needs to be attractive and easily accessible to the media and the public. 
For example, according to staff from the Australian Bureau of Statistics, MAP can be viewed as successful if it is picked up by the media and is used as a tool for debate in schools and within other sectors, such as the business sector. According to an official who helped to develop MAP, the importance of a communication or media strategy is one of the lessons they learned and one that is essential for sustaining an indicator system. Another official familiar with key national indicator systems observed that the successful ones devote at least as much effort to communications and promotion of the indicators as they do to their development. When asked what the Boston Foundation and Boston Indicators Project have done to become more influential, a senior manager for the Boston Foundation said that they created a communications strategy around their indicators. She noted that she was aware of the Boston Indicators Project when she joined the Boston Foundation and knew it was a good asset that provided a base of knowledge about conditions in Boston. She sensed, however, that although the indicator system provided a wealth of knowledge about the city, it was largely unknown to the key people who should know about it, such as the media. Consequently, the foundation has worked with the Boston Globe, a daily Boston newspaper, to ensure coverage of the indicator system data and research as part of its public relations strategies and ensures that the project's biennial indicator reports are released at a major civic venue. The continued publication and presentation of the data through foundation newsletters, additional research, public briefings and forums, and other formats help to keep the public engaged. The manager stated that the foundation is now perceived as a neutral, consistent information provider. Potential Implications for How a Key National Indicator System Could Be Developed and Used in the U.S. 
Experts and Stakeholders Could Help Clarify the Purposes and Select the Content of a U.S. Key National Indicator System: Managers of national key indicator systems we reviewed emphasized the importance of involving technical and subject matter experts in the development of a key national indicator system and in the selection of the indicators that will make up the system. The role that the National Academy of Sciences has been given in the development and implementation of a key national indicator system could help ensure that the selection of indicators is informed by the best available research from around the world and input from the nation's most knowledgeable sources. Several system managers and officials we interviewed emphasized that those charged with developing a system of key national indicators should also work with a range of stakeholders from the intended audiences to consider the purposes the system will be designed to fulfill. For example, developers and stakeholders should consider whether the system will be designed to provide a high-level overview of the condition of the country, to give users a more detailed perspective on the differences that exist among states or regions across the country, to monitor progress toward defined outcomes, to stimulate citizen engagement, or for other purposes. Involving stakeholders early in the development process can give potential users an opportunity to share their priorities and preferences on the content of an indicator system and the purposes it should be designed to fulfill. The purposes of some indicator systems do evolve, but these initial decisions about the purposes, audiences, and content of a system could have an impact on the approach used to develop the system, the indicators that are selected, and the information-sharing tools and products the system makes available. 
Clearly articulating the purposes of the system will also help ensure that there is a common understanding of what the system will be designed to achieve. When selecting the indicators and data sources that will make up a national indicator system, several system managers and officials suggested that one potential approach could seek to combine input from key public and private sector stakeholders, subject matter specialists, and technical experts with a mechanism for collecting input from a wider range of potential users, including interested stakeholders and citizens. For example, one official suggested that a first step could involve experts working with the developers of a national system to create a proposal identifying the categories used to structure the system, the individual indicators used to measure the country's condition and progress, and the most appropriate data sources available to support these indicators. The Institute of Medicine,[Footnote 24] which is affiliated with the National Academy of Sciences, has already used a similar process to select 20 key health measures for the nation.[Footnote 25] For this effort, the Institute of Medicine convened a committee of experts to select from a myriad of available health indicators a manageable set of 20 indicators considered crucial for understanding the state of the nation's health. This process is an example of how the developers of a national key indicator system might take advantage of the National Academy of Sciences' ability to bring together experts from various fields to gather information, perspective, and input. Because there are value judgments involved in the selection of indicators, however, officials we interviewed emphasized that developers will need to solicit input on any proposal from a wider range of stakeholders and interested citizens. 
For example, according to representatives of the National Academy of Sciences, ensuring that a national indicator system is relevant and is seen as legitimate will require that developers have feedback mechanisms to collect input from interested stakeholders in all sectors and at all levels of society, including the public. Developers of a key national indicator system might use a number of approaches to collect this input, including the following: * Advisory committees, which are used by statistical agencies to draw on the expertise of academics and research communities and to collect recommendations on statistical methodology and other technical matters related to the collection, tabulation, and analysis of statistics. * Outreach to state, regional, and local indicator partners through organizations like the Community Indicators Consortium and the National Neighborhood Indicators Partnership. The Community Indicators Consortium is a network of individuals and organizations engaged in indicator efforts at the local, regional, state, national, and international levels that is used to facilitate the exchange of knowledge about the effective development and use of indicators. The National Neighborhood Indicators Partnership is a collaborative effort by the Urban Institute and local partners from 35 metropolitan areas around the country to further the development and use of neighborhood information systems in local policy making and community building. * Community forums, town hall meetings, surveys, or focus groups. * More technologically advanced tools like online surveys and online voting, online town hall meetings, formal requests for comment collected through the indicator system's Web site, and social media. A system of key national indicators, as outlined by the legislation, will be designed to serve as a resource for the entire nation rather than just the federal government. 
However, it will be important to consider the information needs of members of Congress and other federal officials. As we have seen from other systems we reviewed, collecting input on the information needs of legislators and government officials can provide developers with the insights they need to create content and products sensitive to the interests of that audience. At the same time, it is important to ensure that the selection of indicators and data sources is independent of government control. Attempting to closely tie societal indicators to government decision making, for instance by using indicator information to determine resource allocations, can present challenges. For example, according to an Oregon state official we interviewed, interest by the governor in using the Oregon Benchmarks indicators to guide resource allocation decisions led to a situation where agency and program value began to be judged on whether an agency or program had a representative indicator as part of the system. Because agency officials and issue advocates had the ability to influence the selection of indicators, their pressure led to a proliferation of measures, and the total number of indicators temporarily became unwieldy. The situation was rectified after a few years when the Oregon Progress Board placed an upper limit of 100 on the total number of benchmarks. These types of political pressures may also lead to demands to select indicators that portray the government in a positive light, which may introduce political bias and undermine the system's credibility and legitimacy. Some indicator systems created with significant involvement from government officials, such as those in Virginia, South Australia, Tasmania, and Albuquerque, New Mexico, have attempted to address this tension by using independent advisory and oversight boards. 
These bodies have a responsibility for ensuring that indicators are selected based on their quality and appropriateness, while allowing public officials to play an important, but not dominant, role in the development of systems and the selection of indicators. A U.S. Key National Indicator System Could Leverage Existing Data Sources and Technologies: Data: To help ensure the quality and reliability of the data in their systems, many indicator systems in the United States use existing data produced by federal statistical organizations, such as the United States Census Bureau, which have quality assurance processes in place to ensure accuracy. Under guidelines established by OMB, federal agencies are required to ensure and maximize the quality, objectivity, utility, and integrity of statistical information that is disseminated to the public, and are, among other things, required to adopt specific information quality standards and develop a process for reviewing the quality of information before it is released. Leveraging high-quality data that are already being collected by these organizations can help minimize the burden on indicator system developers. While there are costs associated with identifying data sources and acquiring the relevant data, relying on existing data to the extent possible can help reduce these costs. Similarly, data already produced by the federal statistical community, and other university-related, commercial, and nonprofit data sources, could serve as the beginning foundation for a key national indicator system for the United States. Using data being produced by federal statistical agencies could help ensure the quality of the system's data and reduce the possibility of duplicative data collection efforts at the national level. A key national indicator system could also aid the federal statistical community in its mission of making federal statistics more visible and accessible to a broad audience of potential users. 
It will be important, however, for the developers of a national system to ensure that there is appropriate attribution so that users are aware of the ultimate source of the information. Because of the importance of these data sources, and the importance of using them appropriately, involving representatives of federal statistical agencies and other data providers in the development of a key national indicator system could help establish a tradition of ongoing cooperation between the developers and data providers and enhance the developers' access to the expertise of the data community. These lines of communication would also allow the developers of a national system to engage data providers in a conversation about the processes that are used to verify the quality of each data set, the sources of the data, any limitations or concerns about the quality of the data that might exist, and the feasibility of and costs associated with addressing data gaps. The purposes of the key national indicator system will also dictate the degree to which it needs comparable or disaggregated data. For instance, if the system is designed to allow audiences to see how the United States compares to other countries, developers of a national system will need to consider how a national system for the U.S. might align with existing indicator and statistical systems in other countries. After the committee of experts from the Institute of Medicine had selected the list of 20 key health indicators, for example, they took this list of indicators, compared each to a list of health indicators maintained by other countries that are members of the OECD, and found that 9 of 20 were comparable. 
Furthermore, if the system is designed to provide data disaggregated by state, region, or county, or by demographic characteristics, this will require the identification of existing sources that provide reliable data for progressively smaller levels of geography or for different demographic or socioeconomic groups. For example, one of the criteria that the committee of experts from the Institute of Medicine used to select its indicators was the need for the indicator to be supported by data that could be viewed by population subgroups or geographic region. The committee also explicitly recommended that a key national indicator system should include the ability to explore disparities by socioeconomic status, race and ethnicity, and geographic region for each indicator selected for its various issue areas. According to a National Academy of Sciences representative, the level to which disaggregated data will be made available will depend on the quality of the data available at different levels. For example, when high-quality data are available for multiple levels, a key national indicator system could make these data available to facilitate state, national, and international comparisons, as well as demographic comparisons based on gender, race, age, poverty level, or other demographic characteristics. Making these data available through one source could allow users at all levels to see the differences that exist from one jurisdiction or group to another. Users of a national indicator system may also be interested in data that are specific to the areas in which they live, or to other domains of particular interest to them, such as their age group or national origin group. Viewing the information in this way may make the indicators more meaningful and relevant to the personal experience of the user. It may also help states, regions, or counties see how they fit into the national picture. 
For example, the UK's Government Sustainable Development indicators include 46 indicators collected for each of the nine regions of England which, according to one regional official we spoke with, make it possible for regions to track their progress relative to their peers. Technology: As discussed above, according to several system managers we interviewed, the ongoing evolution of the technologies available to present, analyze, and share statistical information has led to a shift in the way that indicators are being presented and disseminated. Furthermore, in an effort to expand access to new data visualization technologies and stimulate innovation and collaboration across indicator systems, new tools, such as the Open Indicators Consortium's WEAVE[Footnote 26] platform, are being developed. The developers of a key national indicator system should also consider the importance of openness and transparency in the development of its Web interface and supporting technologies. Pursuing an open approach could help ensure that there is a collaborative process used to collect input and ideas for the technical development of a national system and leverage the expertise of the widest possible range of technical experts and potential users, including other information providers and end users. According to a representative from the National Academy of Sciences, SUSA, the nonprofit institute working in partnership with the National Academy of Sciences, has committed to this type of approach through the release of its draft enterprise architecture and beta Web site. Specifically, in 2010, SUSA released a beta Web site with the goals of testing advanced technical capabilities, refining content features across eight pilot issue areas, and leveraging intellectual capital by exposing SUSA design principles to broad technical scrutiny. 
Since the beta site was released, close to 1,000 individuals have gained access to the site and give regular feedback on features and functions that they like and things they would like to see improved. According to the National Academy of Sciences representative, this feedback will always be crucial in making design adjustments for the rollout and evolution of an official key national indicator Web site. A U.S. Key National Indicator System Could Be Used to Inform Federal Government Strategic Planning and Decision Making: In addition to their usefulness to society as a whole, some governments have looked to indicator systems to inform their own planning efforts and decision making. For example, the President's annual budget includes approximately 60 "social indicators" that measure long-term trends in the economic, social, and environmental condition of the United States.[Footnote 27] However, a system of key national indicators could go beyond this by providing decision makers with easy access to a broader set of economic, social, and environmental indicators, disaggregated data, and additional contextual information that could serve as a valuable tool for framing and informing budgetary and policy decisions. A system of key national indicators may also be useful to federal officials as a tool to support strategic planning and monitoring by OMB and federal agencies. The Government Performance and Results Act (GPRA), which was recently amended by the GPRA Modernization Act of 2010,[Footnote 28] now requires that every 4 years OMB develop a limited number of long-term, outcome-oriented priority goals for the federal government covering policy areas that cut across agencies. 
The act also requires OMB to develop a federal government performance plan, which, among other things, is to detail for each federal government priority goal (1) performance goals, measures, and targets; (2) the agencies, programs, and activities involved; and (3) an official responsible for coordinating efforts. Together these requirements are to function effectively as a governmentwide strategic plan.[Footnote 29] The act also requires that agencies select agency priority goals, which will also have performance targets and milestones. The process of identifying these goals and measures has already begun within the executive branch where OMB and agency officials have identified priority goals. According to OMB, some agencies are tracking "contextual indicators" alongside performance measures and targets. Contextual indicators are intended to be relevant quantitative measures that provide a broader perspective on the conditions that may influence an agency's ability to achieve its performance goals as well as provide context for understanding agency progress toward the priority goals. Examples could include data about the outcomes an agency is trying to influence over the long term or with only limited control, warning signals, unwanted side-effects, and external factors that affect outcomes, including both causal factors the government can try to influence and those over which it can have very little effect. For example, the Department of the Treasury has a Priority Goal to "Repair and Reform the Financial System." Treasury has identified the Chicago Federal Reserve National Activity Index as a key contextual indicator, over which it may have some but very indirect influence, but which provides an indication of overall economic activity and inflationary pressure. 
According to OMB, agencies do not need to provide targets for contextual indicators, as their direct ability to influence these indicators is limited, or they do not intend to directly affect these indicators. OMB characterized these contextual indicators as analogous to the indicators of societal condition often found in key indicator systems. A system of key national indicators might contribute to the federal government's ongoing strategic planning and performance monitoring efforts in three ways. First, federal officials could look to measures included in a system of key national indicators to inform the selection of contextual indicators used by the federal government. These indicators could help provide federal officials with a broader perspective on changes in societal conditions and how these changes might affect their ability to achieve performance goals. Second, a system of key national indicators could be used to inform the selection of future priority goals, as well as governmentwide and agency strategic planning efforts. By providing information on economic, social, and environmental conditions and trends across the United States, a key indicator system for the U.S. may help highlight areas in need of improvement and provide federal officials with insights into the environment in which agencies are operating. Third, a system of key national indicators could also support efforts to address duplicative and overlapping programs and initiatives, a governmentwide issue on which we recently reported.[Footnote 30] For example, to influence positive movement in certain indicators, federal officials could look at all the programs that contribute to improving outcomes, examine how each contributes, and use this information to streamline and align the programs to create a more effective and efficient approach. U.S. Key National Indicator System Could Be Refined over Time: Experts and managers emphasized that developers of a U.S. 
key indicator system will need to ensure that the system remains relevant to users and continues to fill their information needs. Like the national, state, regional, and local indicator systems we reviewed, the developers of a national indicator system may want to consider periodically reevaluating the indicators and data sources by systematically collecting feedback from users to refine and improve the system and address changing needs. A number of options exist for collecting this feedback. For example, one expert suggested that input from a national user advisory group made up of local, state, and national partners could identify improvements and provide insights into how the system indicators are being used. Other approaches could include periodic meetings of subject matter specialists to discuss scientific research into factors driving changes in indicators, or of technical experts to discuss improvements in the quality, availability, and presentation of data. These formal, periodic reviews could also be supplemented by ongoing feedback from users collected through the system's Web site and other venues, such as conferences. The National Academy of Sciences and others who will oversee the development of a U.S. key national indicator system can draw insights from the experiences we observed at the local, state, regional, and national levels in the U.S. and other countries. Since building a key national indicator system requires an investment of significant time and resources, such costs can only be justified if there is a reasonable expectation of meaningful benefits over time. Such information must be useful to the public, professionals, and leaders at all levels of our society. 
Although a fully operational set of credible measures of our progress and prospects will take time to develop, require broad involvement of American society, and involve substantial resource commitments, the benefits can include more informed policy choices, a better educated citizenry, and greater civic engagement. We are sending copies of this report to the appropriate congressional committees and other interested parties. The report will also be available at no charge on the GAO Web site at [hyperlink, http://www.gao.gov]. If you have any questions concerning this report, please contact Bernice Steinhardt at (202) 512-6543 or steinhardtb@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of the report. Key contributors to this report are listed in appendix VI. Signed by: Bernice Steinhardt: Director, Strategic Issues: List of Addressees: The Honorable Thomas Harkin: Chairman: The Honorable Michael Enzi: Ranking Member: Committee on Health, Education, Labor, and Pensions: United States Senate: The Honorable Frederick Upton: Chairman: The Honorable Henry Waxman: Ranking Member: Committee on Energy and Commerce: House of Representatives: The Honorable Thomas Carper: Chairman: Subcommittee on Federal Financial Management, Government Information, Federal Services, and International Security: Committee on Homeland Security and Governmental Affairs: United States Senate: [End of section] Appendix I: Indicator System Definitions: An indicator is a quantitative measure that describes an economic, social, or environmental condition over time. The unemployment rate, infant mortality rates, and air quality indexes are a few examples of indicators. Indicators are measures that are focused on changes in conditions. Some indicators may be direct in that they measure what they say they do. For example, the unemployment rate is a direct indicator. Other indicators may be indirect or "proxy" indicators. 
For example, the number of patents granted may be used as a proxy for measuring the degree of inventiveness. An indicator system is a systematic effort to assemble and disseminate, through various products and services, a group of indicators that satisfy the needs of intended audiences and together tell a story about the condition and progress of a jurisdiction or jurisdictions. A jurisdiction, such as Australia, is distinguished from a governmental entity, such as the federal government of Australia. Indicator systems aggregate statistical measures of many things into a single system, including attributes of people, animals and plant life, institutions, industries, and the physical environment. It is useful to distinguish between two types of indicator systems. The first are topical indicator systems, which consist of indicators pertaining to a related set of issues, such as health, the environment, education, or transportation. For example, a topical system in health might have related indicators like the prevalence of certain diseases, such as cancer or heart disease; the number of citizens with access to health insurance; and the number of doctors or hospitals available for use by citizens in a particular jurisdiction. Topical indicator systems are a major source of information for the media, professionals, researchers, citizens, and policymakers. By contrast, comprehensive key indicator systems aggregate the most essential economic, social, and environmental indicators into a single system. These systems can make it easier to see a more complete, general picture of the condition and progress of a particular jurisdiction and can facilitate analysis of how changes in one domain can affect other domains. These systems are "comprehensive" in that they provide information across the three primary domains: economic, social, and environmental. 
Indicators included in these systems can be defined as "key," as they are a core set of measures that a group of citizens or stakeholders has selected from a much larger range of possibilities. While there is no "right" number of key indicators, the systems are not intended to be exhaustive. Because they represent a select set, they cannot provide a full description of the condition and progress of a jurisdiction but rather focus on providing a generally accurate picture of the whole. Topical indicator systems form the essential underpinning for aggregating information into comprehensive key indicator systems, as comprehensive key indicator systems are built selectively by members of a jurisdiction from the foundation of existing topical indicators. [End of section] Appendix II: Comprehensive Key Indicator System Case Study Profiles: We conducted seven in-depth case studies of comprehensive key indicator systems over the course of the review. We reviewed three national systems, two state systems, one county system, and one local system. The indicator systems profiled here are: 1. Measures of Australia's Progress (National): 2. MONET Indicator System, Switzerland (National): 3. United Kingdom's Government Sustainable Development Indicators (National): 4. Community Indicators Victoria, Australia (State): 5. Virginia Performs (State): 6. King County AIMs High, Washington (County): 7. 
Boston Indicators Project, Massachusetts (Local): Measures of Australia's Progress: Overview: Measures of Australia's Progress (MAP), a key national indicator system developed by the Australian Bureau of Statistics (ABS), is designed to provide statistical information about the condition of the nation to the public.[Footnote 31] According to the ABS, it originally developed MAP to satisfy growing public interest in the relationships between economic, social, and environmental aspects of life and to supplement Gross Domestic Product, which was viewed as a limited indicator of Australia's overall condition. Development: In 1995, the Australian Senate undertook a study of national well-being and recommended that the ABS create a system of well-being indicators for Australia. To foster a dialogue in Australia on progress and well-being, in 1997, the ABS sponsored a conference and invited top thinkers from throughout Australia to participate. Development of the first MAP, originally Measuring Australia's Progress, was led at the most senior levels of the ABS. There was a steering committee of senior bureau officials as well as a small staff with expertise in many areas at the bureau. In addition, the MAP steering committee and staff were advised by an external reference group of nine experts. The first version of MAP, released in 2002, was timed to coincide with a major statistical conference, and there was coordination with the national press to help publicize it. Subsequent releases were published in 2004, 2006, and 2010. The 2010 revision followed a development process similar to that used in 2002, relying on bureau staff and on an external reference group, and coordinating release with the media. The headline indicators on the MAP Web site are updated annually. Design: The 2010 release of MAP is organized into three domains-- society, economy, and environment--with more than 80 headline and supplemental indicators. 
It includes a dashboard for 17 headline dimensions, such as work and housing, using a "traffic light" color coding system to show trends. MAP uses gray shading to indicate there is insufficient data to evaluate the trend (see figure 8). There are also five supplementary dimensions--culture and leisure; communication; transport; inflation; and competitiveness and openness. Figure 8: Example of MAP Use of Color Coding to Show Indicator Trends: [Refer to PDF for image: web site page] Source: Australian Bureau of Statistics. Note: Measures of Australia's Progress--Summary Information, Canberra, 2010. Publication can be downloaded at [hyperlink, http://www.abs.gov.au/about/progress]. [End of figure] The updated MAP Web site provides data and definitions of subpopulations of interest, comparisons with other countries, a glossary of related terms, and a hyperlinked list of related ABS publications (see figure 9). In the 2010 release, disaggregated data are available through the MAP Web site. The MAP system provides analysis and interpretation of indicator trends, but it does not establish explicit goals or benchmarks to be achieved. To help reach the public, the ABS also revised how MAP was presented for the 2010 release by creating a 20-page guide, which summarized the key dimensions in plain language and provided a graphic for each key indicator. Figure 9: Example of Information Provided by MAP, Competitiveness and Openness Supplementary Dimension: [Refer to PDF for image: web site page] Source: Australian Bureau of Statistics. Note: Measures of Australia's Progress, cat. no. 1370.0, Canberra, 2010. Web page can be accessed at [hyperlink, http://www.abs.gov.au/about/progress] (viewed on Mar. 7, 2011). [End of figure] Key Products: * Web site, Measures of Australia's Progress, [hyperlink, http://www.abs.gov.au/about/progress]. * Pocket Guide, Measures of Australia's Progress: Is Life in Australia Getting Better? 2010, available on the Web site under MAP Downloads. 
MONET Indicator System, Switzerland: Overview: MONET (Monitoring der Nachhaltigen Entwicklung or Monitoring Sustainable Development) is a system of indicators designed to provide the general public and policymakers with information about the current situation and trends in the social, economic, and environmental qualitative objectives of sustainable development in Switzerland. Sustainable development was defined by the United Nations in 1987 as "development that meets the needs of the present without compromising the ability of future generations to meet their own needs." The MONET indicators are the monitoring element of Switzerland's national sustainable development strategy, and the system is carried out jointly by the Federal Statistical Office, the Federal Office for the Environment, and the Swiss Federal Office for Spatial Development. Development: The sustainable development strategy and the MONET indicators were first developed from 1997 through 2002 within the Swiss government. To select a pilot set of indicators, a small core team of federal employees drew on the expertise of 80 government officials from 20 agencies and organizations. The data in the indicators are updated annually. A system revision project, which took place from September 2007 to June 2009, was aimed at reducing the size of the system, increasing its relevance, filling in gaps, and optimizing international comparability. Design: The current MONET set, released in 2009, includes 80 indicators. For each indicator, MONET provides quantitative information on trends, commentary, definitions, and links to additional statistical information, among other things (see figure 10). In addition, each indicator is assessed using a "traffic light" color coding system that shows the trend of each indicator and an arrow that shows the direction of movement. As an example, the global dimension includes 12 indicators with trends and arrows (see figure 11). 
The MONET indicators used to monitor the sustainable development strategy are presented in a "dashboard of sustainable development." This dashboard makes an aggregated assessment of the 11 key challenges of the strategy, using the traffic light color coding of the indicators (see figure 12). Figure 10: Example of Information Provided by MONET on the Official Development Assistance to Poor Countries Indicator: [Refer to PDF for image: web site page] Source: Swiss Federal Statistical Office, Swiss Agency for Spatial Development. Note: Web page from MONET can be accessed at [hyperlink, http://www.bfs.admin.ch/bfs/portal/en/index/themen/21/02/ind9.indicator.70702.90602.html] (viewed on Mar. 16, 2011). [End of figure] Figure 11: Example of MONET System Use of Color Coding to Depict Indicator Progress for the Global Dimension: [Refer to PDF for image: web site page] Source: Swiss Federal Statistical Office, Swiss Agency for Spatial Development. Note: Web page from MONET can be accessed at [hyperlink, http://www.bfs.admin.ch/bfs/portal/en/index/themen/21/02/ind9.approach.903.html] (viewed on Mar. 16, 2011). [End of figure] Figure 12: Summary of the 11 Key Challenges of the Sustainable Development Strategy Using the Color Coding of the Indicators: [Refer to PDF for image: web site page] Source: Swiss Federal Statistical Office, Swiss Agency for Spatial Development. Note: Web page can be accessed at [hyperlink, http://www.bfs.admin.ch/bfs/portal/fr/index/themen/21/02/dashboard/02.html] (viewed on Mar. 16, 2011). [End of figure] Among the 80 indicators, the MONET system has designated 17 indicators as "key" and grouped them into four questions: * Meeting needs--how well do we live? * Fairness--how well are resources distributed? * Preservation of resources--what are we leaving behind for our children? * Decoupling--how efficiently are we using our natural resources? MONET does not provide disaggregated data by cantons (states) and cities within Switzerland. 
Instead, indicators for cantons and cities are provided in a different indicator system called Cercle Indicateurs, with a link to the system provided on the MONET Web site. The first MONET indicators report and the MONET Web site were both released in 2003. The full MONET indicator set, supplementary information for each indicator, graphical presentations, Cercle Indicateurs, and additional information about MONET are available on a Web site in German and French. An abbreviated Web site with the 17 key indicators, global dimension indicators, and some interpretive information is also available in English and Italian. A biennial "pocket guide" print product with the 17 key indicators is available in all four languages. Key Products: * Web site, Indicators, and Postulates, the MONET Indicator System, [hyperlink, http://www.bfs.admin.ch/bfs/portal/en/index/themen/21/02/01.html]. * Pocket Guide, Sustainable Development - A Brief Guide 2010, 17 Key Indicators to Measure Progress, available for download at [hyperlink, http://www.monet.admin.ch]. United Kingdom's Government Sustainable Development Indicators: Overview: The United Kingdom's (UK) Government Sustainable Development system of indicators is one element of its sustainable development strategy.[Footnote 32] The purpose of the indicators is to provide an overview of progress on four themes: * sustainable consumption and production; * climate change and energy; * protecting natural resources and the environment; and: * creating sustainable communities. Development: The first set of UK Government Sustainable Development indicators was published in 1996. The original sustainable development strategy was produced by the government, with some involvement of nongovernmental organizations through a seminar and ongoing meetings between interested organizations and government departments. In addition, the public was provided with an opportunity to comment on the draft strategy. 
An interdepartmental government working group compiled the indicators along with informal input from other governmental and nongovernmental organizations, such as the UK Government Panel on Sustainable Development; the Sustainable Development Round Table; groups representing local governments in Britain; and statistical, academic, and research organizations. Both the strategy and the indicators have gone through several revisions since they were first published; they were last revised in 2005. In 2007, commitments were met to include measures of well-being in the set, including measures of life satisfaction and satisfaction with aspects of life. In 2011, the UK government mandated that a new set of indicators be developed by the Department for Environment, Food and Rural Affairs (Defra) and has directed that the indicators should be a useful tool for policy evaluation and decision making. Design: The UK Government Sustainable Development indicator system has 68 indicators and 126 measures, with a subset of 20 indicators identified as "framework" indicators. The trend for each indicator is depicted graphically and evaluated using a "traffic light" color coding system (see figure 13), and the change for the indicators under each theme is summarized in a pie chart (see figure 14). Figure 13: Example of Pocket Guide Information Provided on the Indicator for Water Resource Use: [Refer to PDF for image: web site page] Source: UK Defra. Note: Measuring Progress Sustainable Development Indicators 2010. Publication can be downloaded at [hyperlink, http://sd.defra.gov.uk/progress/national/annual-review/]. [End of figure] Figure 14: Summary of Changes in All UK Government Sustainable Development Indicators from the Pocket Guide: [Refer to PDF for image: web site page] All indicators[A]: Changes in measures since 1990[B]: Showing deterioration: 13; Showing improvement: 50; Showing little or no change: 11; Insufficient data: 25. 
Changes in measures since 2003[B]: Showing deterioration: 10; Showing improvement: 57; Showing little or no change: 24; Insufficient data: 8. [A] Based on 99 of 126 measures, comprising 68 indicators. [B] Or nearest year for which data are available. Compared with the position in 2003, 57 measures show improvement (representing over half of those for which it is possible to make an assessment), and 24 show little or no change. A wide range of measures show improvement including renewable electricity, emissions of air pollutants and manufacturing emissions, fossil fuels used for electricity generation, waste and land recycling, agricultural emissions and land stewardship, crime, fear of crime, mortality rates, road accidents, rough sleepers and homeless households. Fossil fuels used for electricity generation has improved since 2003. Source: UK Defra. Note: Measuring Progress Sustainable Development Indicators 2010. Publication can be downloaded at [hyperlink, http://sd.defra.gov.uk/progress/national/annual-review/]. [End of figure] In addition to national indicators, the system introduced regional indicators in 1999 for nine regions in England. Currently, regional data exist for 46 of the 68 national indicators. The indicators were first published in a traditional print product in 1996. Revised indicators were published again in hard copy in 1999, and since 2001, they have been published annually. The indicators are also available on the Defra Web site, which is updated regularly as new data become available. In addition to the definitions and descriptive information available in a pocket guide, the Defra Web site includes tables of national, international, and regional data. The hard copy indicator publications were historically distributed to members of parliament and staff and were available to other government officials as well. The publications were particularly popular with schools and colleges, with tens of thousands of copies distributed annually. 
However, with the 2010 edition, the annual publication is available only online. Key Product: * Web site, Sustainable Development in Government, Reviewing Progress, [hyperlink, http://sd.defra.gov.uk/progress/national/annual-review/]. Community Indicators Victoria, Australia: Overview: Community Indicators Victoria aims to support the development and use of local community well-being indicators in Victoria to improve citizen engagement, community planning, and policy making. The system provides a framework for community well-being indicators for local communities and the state of Victoria. Development: Inspired by other efforts to establish comprehensive indicator systems, such as "Measures of Australia's Progress" and "Tasmania Together," Community Indicators Victoria is an outgrowth of a process called the Victorian Community Indicators Project (VCIP). VCIP was initiated in January 2005 to determine a framework for local governments in Victoria to make better use of indicators. The concept that community indicator development needs to be linked to community engagement processes was central to the VCIP design. VCIP conducted extensive consultation with local and state government officials and academics and conducted a literature review to develop a framework of indicators for measuring the well-being of Victorians. While much of the desired information was available through preexisting data streams, there was a need to centralize all of the available information and fill in information gaps. A survey was conducted in 2007 to provide previously unavailable information identified by VCIP participants as potentially valuable to local governments in Victoria. Design: Community Indicators Victoria consists of a framework of five domains: * healthy, safe and inclusive communities; * dynamic, resilient local economies; * sustainable built and natural environments; * culturally rich and vibrant communities; and: * democratic and engaged communities. 
Each domain contains multiple indicators with a total of approximately 80 indicators. The indicators include a broad range of measures designed to identify and communicate economic, social, environmental, democratic, and cultural trends and outcomes for each community in Victoria, Australia. Data for each indicator are available in the aggregate for the state of Victoria, but can also be disaggregated to the level of the 79 local government areas and regions in Victoria, including Metro Melbourne, the major city in Victoria, and Country Victoria, rural areas in Victoria. This allows for comparisons of indicator data among communities within Victoria. Data sources include state and local administrative data, data from the Australian Bureau of Statistics, data from existing Victorian surveys, and Community Indicators Victoria's survey. Community Indicators Victoria presents data and reports on the well- being of Victorians using an integrated set of community well-being indicators through a public Web portal. The Web portal has dynamic reporting capabilities, which allow users to generate custom reports, both in table (see figure 15) and map format (see figure 16). Figure 15: Table Presentation of Indicator from Community Indicators Victoria, by Region within Victoria: [Refer to PDF for image: web site page] Source: McCaughey Centre, School of Population Health, University of Melbourne. Note: Reports from Community Indicators Victoria can be generated at [hyperlink, http://www.communityindicators.net.au/] (viewed on Mar. 16, 2011). [End of figure] Figure 16: Map Presentation of Indicator from Community Indicators Victoria: [Refer to PDF for image: web site page] Source: McCaughey Centre, School of Population Health, University of Melbourne. Note: Community Indicators Victoria mapping tool can be accessed at [hyperlink, http://www.communityindicators.net.au/] (viewed on Mar. 16, 2011). 
[End of figure] User-requested reports are stored in an online database that is also available for public review. In addition, the staff of Community Indicators Victoria is available to provide customized assistance for a fee. Key Product: * Web site, Community Indicators Victoria, [hyperlink, http://www.communityindicators.net.au/]. Virginia Performs: Overview: Virginia Performs is a "performance leadership and accountability system" for the Commonwealth of Virginia that is overseen by the Council on Virginia's Future. It includes a system of "societal indicators" that is designed to provide citizens and policymakers with a high-level assessment of Virginia's condition and progress, to assess the state's progress toward seven high-level goals for Virginia, and to serve as a catalyst for better strategic thinking and performance-driven decision making by maintaining a focus on achieving priority outcomes. Development: To develop a vision and long-term goals for Virginia's future, in 2003 the Virginia General Assembly established the Council on Virginia's Future, an advisory board to the governor and General Assembly that is chaired by the governor and made up of the lieutenant governor, senior members of the General Assembly, citizen and business leaders, and cabinet members. Virginia Performs is one of the Council's signature initiatives, and the Council on Virginia's Future members and staff have overseen the development and design of the system since 2004. Following its creation, the Council on Virginia's Future worked to establish a vision and goals for Virginia, eventually settling on seven long-term goals for the Commonwealth, six of which are outwardly focused and address quality of life issues, with a seventh focused on the efficiency and effectiveness of state government operations. 
After these long-term goals had been selected, workgroups made up of legislators, subject matter experts, and other stakeholders were created to establish priorities and propose indicators in each of the seven goal areas. These efforts were supplemented by additional work by Council staff to finalize the list of societal indicators and data sources. The Virginia Performs Web site, which serves as a portal to societal indicators at the state and regional levels, was released publicly in early 2007. Design: Virginia Performs is made up of three distinct but interconnected tiers. Figure 17 presents a high-level schematic of the "architecture" of this system. 1. The first tier is made up of 49 societal indicators arranged according to the seven goal areas--economy; education; health and family; public safety; natural resources; transportation; and government and citizens--that answer the question, "How is Virginia doing?" These indicators are designed to provide an overview of how Virginia is doing with respect to several broad issues, such as water quality or educational attainment. An example of a societal indicator in the education area is Virginia's high school graduation rate. These indicators are measured over time and, where possible, by region within Virginia and in comparison to other states. 2. The second tier is made up of approximately 200 key objectives and measures, which are agency performance measures selected by agencies and the governor and tracked to determine whether state government is making progress on its highest priorities. For example, a key measure for the Virginia Department of Education is the percentage of high school students who exit high school with a diploma. The Council has developed tables that show how these key agency measures align with each societal indicator. 3. The third tier is made up of other agency performance measures. 
State agencies establish objectives and measures for programs and services as part of their strategic planning process, and agencies now regularly report their performance on those measures. For example, an agency performance measure for the Department of Education is the percentage of youth with disabilities graduating from high school with an Advanced or Standard diploma. Figure 17: A High-level Schematic of the Virginia Performs Architecture: [Refer to PDF for image: web site page] Source: Council on Virginia’s Future. Note: Graphic from presentation to GAO Staff, Sept. 30, 2010. Presentation can be accessed at [hyperlink, http://www.future.virginia.gov/docs/RecentPresentations/GAO_2010_09_30.pdf] (viewed on Mar. 7, 2011). [End of figure] On the Virginia Performs Web site, each societal indicator has its own page that includes a description of the indicator and its importance, a description of how Virginia is doing, information on major factors influencing the indicator, and perspective on state government's role in moving the indicator. Figure 18 provides an example of the high school graduation indicator page. Figure 18: Example of High School Graduation Indicator Page from Virginia Performs: [Refer to PDF for image: web site page] Source: Council on Virginia’s Future. Note: Web page from Virginia Performs can be accessed at [hyperlink, http://vaperforms.virginia.gov/indicators/education/hsGraduation.php] (viewed on Mar. 7, 2011). [End of figure] To provide a quick snapshot and summary of the state's performance, the Council also created a one-page document that summarizes the trend for each of the societal indicators included in Virginia Performs. This "Scorecard at a Glance" is shown in figure 19. Figure 19: Virginia Performs Indicators Scorecard at a Glance: [Refer to PDF for image: web site page] Source: Council on Virginia’s Future. 
Note: Scorecard from Virginia Performs can be accessed at [hyperlink, http://vaperforms.virginia.gov/Scorecard/ScorecardatGlance.php] (viewed Mar. 7, 2011). [End of figure] Key Products: * Virginia Performs Web Site - [hyperlink, http://vaperforms.virginia.gov/]. * The Virginia Report - [hyperlink, http://vaperforms.virginia.gov/extras/newsResources.php#reports]. * Virginia Performs Scorecard at a Glance - [hyperlink, http://vaperforms.virginia.gov/Scorecard/ScorecardatGlance.php]. King County AIMs High, Washington: Overview: King County AIMs High is a key indicator system managed by the government of King County, Washington. It is designed to provide citizens and policymakers with insights into the social, economic, and environmental condition of King County, as well as information on what King County government is doing to improve those conditions, in an effort to improve public discussion, management decision making, and accountability. Development: The first AIMs High report was released in 2006 as a companion to the County Executive's budget proposal, but did not contain information on community indicators. Instead, the report focused on the performance of individual county departments. Following the 2006 AIMs High report, the County Executive sought to enhance the report by including information on community conditions. To structure this new report, staff from the County Executive's office worked with agency staff to select the themes, categories, and associated community indicators to be included in the report. The community indicators were selected from two existing indicator reports, King County Benchmarks and Communities Count. King County Benchmarks was designed originally to track growth management issues and report on indicators that focus on land use, economic conditions, affordable housing, transportation, and environmental policy. Communities Count tracks indicators with a focus on social and health conditions across the county. 
This revised approach was used for the AIMs High report and Web site released in 2007, and for subsequent annual updates through June 2010. The AIMs High system has continued to evolve. In July 2010 the King County Council approved a new countywide strategic plan that consists of eight high-level countywide goals. Each goal consists of several high-level objectives, such as "Keep people safe in their homes." Community indicators, such as the percentage of resident survey respondents who feel safe in their neighborhood during the day and at night, will be used to gauge progress toward these objectives. The plan also includes specific strategies for achieving each objective, such as "Maintain a proactive law enforcement presence." "Strategic measures," such as the response time of the sheriff's department, will be used to assess how well the strategies have been implemented. The legislation authorizing the creation of the strategic plan also requires the continued release of an annual public performance report, with information on both community indicators and government performance measures, consistent in principle with the current structure of the AIMs High report. The intention going forward is to have the AIMs High report align with the structure and objectives outlined in the strategic plan. Design: The current AIMs High report consists of eight categories-- natural resources; built environment; housing and homelessness; economic vitality; health; law, safety, and justice; accountability and transparency; and equity and social justice--that, together, are designed to capture the breadth of conditions addressed by county services. Within each category AIMs High provides two levels of information. The first level is comprised of "Community Indicators," which are higher- level measures that track the state of the environment or the condition of the community. 
Indicators are generally influenced by a number of factors and jurisdictions, and individual organizations have less ability to control the conditions being measured. The second level is comprised of "Performance Measures," which, by contrast, are quantifiable measures of the amount, quality, efficiency, or effectiveness of products or services produced by a specific program or agency. The broader perspective provided by the community indicators is intended to provide citizens and county officials with an understanding of whether or not county programs are making a difference at the community level. Figure 20 provides a visual illustration of the relationship between the community indicators and performance measures in the "Health" category of AIMs High. Figure 20: List of Community Indicators and Performance Measures in the "Health" Category of King County AIMs High: [Refer to PDF for image: web site page] Source: King County, Washington, Office of the Executive. Note: Web page from King County AIMs High can be accessed at [hyperlink, http://your.kingcounty.gov/aimshigh/health.asp] (viewed on Mar. 7, 2011). [End of figure] The AIMs High Web site is comprised of individual pages for each subcategory of community indicators. For example, Figure 21 provides an example of the "Health promotion" subcategory page. Each subcategory page consists of information on how King County is doing (including graphical depictions of the indicators), the factors that influence the indicators, and the role that King County government plays in improving conditions in the County. On the left side of the page are links to pages with information on each performance measure. Figure 21: Example of Indicator Page from AIMs High Web Site: [Refer to PDF for image: web site page] Source: King County, Washington, Office of the Executive. Note: Web page from King County AIMs High can be accessed at [hyperlink, http://your.kingcounty.gov/aimshigh/health.asp] (viewed on Mar. 7, 2011). 
[End of figure] Key Product: * King County AIMs High Web Site [hyperlink, http://your.kingcounty.gov/aimshigh/index.asp]. Boston Indicators Project: Overview: The Boston Indicators Project (BIP) is a local key indicator system managed by the Boston Foundation in partnership with the City of Boston and the Metropolitan Area Planning Council that was designed to: * "democratize data" by increasing access to data and research on local conditions; * engage the public, community-based organizations, media, the business community, and government; * help leaders from different sectors find ways to collaborate; and: * monitor progress toward shared civic goals for Boston. Development: The effort to develop BIP began in 1997. Since then, the system has evolved through an open, participatory approach to development that has involved a wide range of engaged residents, public officials, academics, and leaders from the private and nonprofit sectors. At the beginning, the effort involved a small number of individuals from various community organizations and city departments, but over time the group grew to include more than 300 participants who worked to develop a broad framework for the project, including a vision, goals, and indicator categories. The next step in the process involved the identification of the indicators themselves, and included 150 individuals working in both large and small group settings for a period of about 6 months. As the effort evolved, participants formed a steering group and various subcommittees, and developed criteria to select indicators and identify data sources. By early 1998, participants had narrowed an initial "wish list" of 1,500 measures to about 150 proposed indicators, and they began the process of collecting data. After a draft report was released in 1999 to more than 1,000 people and feedback was collected on the draft for a year, the first BIP indicators report was released in the fall of 2000. 
BIP subsequently released reports in 2003, 2005, 2007, and 2009. Each of these biennial reports had a distinct theme, and they have been used to measure progress toward a long-term vision for Boston. To inform the development of its reports BIP has hosted a series of "convenings," which have been used to capture a range of perspectives from experts, community-based practitioners, public officials, private sector representatives, and interested citizens. The number of participants for each convening varies, but each has used the same structured agenda to elicit views on key long-term trends, major developments and accomplishments, and key challenges in different topic areas. This input has been used to frame and prioritize the findings of the next BIP report. BIP has also hosted a series of What's Next? Seminars to engage younger participants and emerging leaders. The process of convening stakeholders around the development of its biennial reports, as well as holding briefings following the release of reports, has helped keep core constituencies engaged and informed over time. Design: BIP is divided into 10 "sectors"--civic vitality; cultural life and the arts; economy; education; environment and energy; health; housing; public safety; technology; and transportation. On the BIP Web site, each sector has its own page, summarizing key trends, accomplishments, developments, challenges, and innovations. Each sector is subdivided according to the goals for that sector, each of which is supported by at least one indicator. For example, the economy sector includes a goal to attain "Economic Strength and Resilience;" progress toward this goal is measured by several indicators, including employment by industry sector, the unemployment rate in Boston, and hotel and office occupancy rates. Each specific indicator is also given its own page, which provides a brief summary of why an indicator is important and how Boston is doing. 
Figure 22 provides an example of an indicator page from the BIP Web site. Figure 22: Boston Indicators Project Web Site Indicator Page Example: [Refer to PDF for image: web site page] Source: The Boston Foundation. Note: Web page from the Boston Indicators Project can be accessed at [hyperlink, http://bostonindicators.org/Indicators2008/Economy/Indicators.aspx?id=11020] (viewed on Mar. 7, 2011). [End of figure] To allow users to explore certain crosscutting subjects, the BIP Web site also offers a "Sector Crosscut" filter for six different subjects--Boston neighborhoods, children and youth, competitive edge, fiscal health, race/ethnicity, and sustainable development. The BIP Web site contains a page for each of these crosscutting subjects that includes a description of the subject and a list of links to relevant indicators from across sectors. For example, the children and youth crosscut is made up of a list of 29 indicators from eight different sectors, while the competitive edge crosscut is made up of 24 indicators from nine different sectors. In addition to the full list of regional goals available through the BIP Web site, BIP also worked with hundreds of stakeholders and experts to develop a "Civic Agenda" for Boston. This civic agenda is organized into four major areas--an open and effective civic culture, world class human capital, 21st century infrastructure, and 21st century jobs and economic strategies--each of which has a high-level goal and associated measurable milestones, relevant statistical information, and information on the strategies that are being used by different actors to drive progress toward achieving the milestones. Key Products: * Boston Indicators Project Web Site--[hyperlink, http://www.bostonindicators.org]. * Biennial Boston Indicators Project Reports--available through [hyperlink, http://www.bostonindicators.org]. * Boston Civic Agenda--available through [hyperlink, http://www.bostonindicators.org]. 
* MetroBoston DataCommon--[hyperlink, http://www.metrobostondatacommon.org/]. [End of section] Appendix III: Objectives, Scope, and Methodology: The objectives of our review were to address (1) how indicator systems are being used by government entities, nongovernment stakeholders, and citizens; (2) how indicator systems are developed and designed; (3) some of the factors necessary to sustain indicator systems; and (4) potential implications for how a U.S. key national indicator system could be developed and used. This report builds on the findings from our November 10, 2004, report on key national indicators, Informing our Nation: Improving How to Understand and Assess the USA's Position and Progress, GAO-05-1. We conducted a literature review of indicator systems, focusing on developments since 2004. We determined the status of indicator systems we previously identified and researched additional national, state, regional, and local systems, reviewing primary and secondary documents related to the comprehensive key indicator systems. We interviewed experts, current and former government officials, and noted practitioners from the indicator community to get a sense of the main issues related to the development and use of indicator systems, lessons learned, and possible challenges and effects of a key national indicator system. Based on recommendations from experts and our review of the literature, we selected a group of 20 comprehensive indicator systems from different jurisdictional levels and diverse geographic locations, as shown in table 3. We conducted interviews with representatives from these systems. We selected 7 of the 20 systems to serve as case studies. These in-depth case studies included interviews with officials or managers and stakeholders. 
To select the case study systems, we looked for national, state, and local indicator systems that met four criteria: (1) comprehensiveness--a mixture of economic, social, and environmental indicators, (2) longevity--in existence for at least 5 years and currently in operation, (3) outcome-orientation--with measures of progress over time or toward goals or outcomes, and (4) involvement of a government entity as a partner or as a user of information from the system. Table 3: Comprehensive Key Indicator Systems Selected for GAO's Study: Local/county/regional level: Name of System: King County AIMs High, Washington (case study). Name of System: Albuquerque, NM Progress Report, New Mexico. Name of System: Boston Indicators Project, Massachusetts (case study). Name of System: Cercle Indicateurs, Switzerland (local/state level)[A]. Name of System: Jacksonville Community Council Inc. Quality of Life Progress Report, Florida. Name of System: Long Island Index, New York. Name of System: Orange County Community Indicators, California. Name of System: Santa Cruz County Community Assessment Project, California. Name of System: Silicon Valley Index, California. Name of System: Truckee Meadows Tomorrow Quality of Life Indicators, Nevada. State level: Name of System: Arizona Indicators, Arizona. Name of System: Community Indicators Victoria, Australia (case study). Name of System: Measures of Growth in Focus, Maine. Name of System: Oregon Benchmarks, Oregon. Name of System: South Australia's Strategic Plan, Australia. Name of System: Tasmania Together, Australia. Name of System: Virginia Performs, Virginia (case study). National level: Name of System: Measures of Australia's Progress, Australia (case study). Name of System: MONET Indicator System, Switzerland (case study). Name of System: United Kingdom Government Sustainable Development Indicators, United Kingdom (case study). Source: GAO. 
[A] Cercle Indicateurs provides comparative information for Swiss cities and cantons. Cantons in Switzerland are roughly the equivalent of U.S. states. [End of table] The three national case study systems--the United Kingdom Government Sustainable Development Indicators, the MONET Indicator System maintained by Switzerland, and Measures of Australia's Progress--were chosen to reflect similarities to the United States in systems of government, demographic and cultural diversity, geography, and economy. We also selected three domestic subnational systems--Virginia Performs, King County AIMs High, and the Boston Indicators Project--as case study systems. These systems were chosen to represent different types of jurisdictions--state, county, and local; different types of governing authorities--governmental and nongovernmental; and different regions of the country. We conducted a case study of one statewide nongovernmental system in Australia, Community Indicators Victoria. For each of these case studies, in addition to a review of documents, we also conducted site visits to meet with officials and selected stakeholders involved in the systems. To better understand how the United States government might use a key national indicator system, we met with representatives from the National Academy of Sciences and a number of federal statistical agencies. We also interviewed officials from two federal government topical national indicator systems--Healthy People, maintained by the Department of Health and Human Services, and the Report on the Environment, maintained by the Environmental Protection Agency. In addition, we interviewed an official from KIDS COUNT, a national topical indicator system hosted by the Annie E. Casey Foundation, a private charitable organization. The KIDS COUNT system presents national, state, and local-level indicators on the status of America's children. 
To analyze insights for a key national indicator system for the U.S., we drew upon our professional judgment, fieldwork, and interviews with scholars, practitioners, and government officials. Given the case study approach, this report's findings rely heavily on practices in certain situations and contexts. There may be limitations on the extent to which the insights of key stakeholders on the development and design of indicator systems and the factors necessary to sustain indicator systems could be used in a U.S. national context. We did not perform a cost-benefit analysis of the systems reviewed. Nor did we evaluate the federal statistical system and its related agencies. Most of the graphics presented in this report from the indicator systems we studied serve only to illustrate the types of information and the variety of ways it is presented in the reports or on the Web sites of these systems. The examples are not intended to highlight or frame discussions of the substantive issues conveyed by them. We verified the accuracy of the information about the indicator systems with system representatives, the National Academy of Sciences, the Office of Management and Budget, and selected federal agencies. We incorporated their comments, where appropriate, throughout the draft. We conducted our work from February 2010 to March 2011 in accordance with all sections of GAO's Quality Assurance Framework that are relevant to our objectives. The framework requires that we plan and perform the engagement to obtain sufficient and appropriate evidence to meet our stated objectives and to discuss any limitations in our work. We believe that the information and data obtained, and the analysis conducted, provide a reasonable basis for any findings and conclusions in this product. [End of section] Appendix IV: Pub. L. No. 111-148, "Patient Protection and Affordable Care Act," Title V, Section 5605; 124 Stat. 680: March 23, 2010: SEC. 5605. 
Key National Indicator System: (a) Definitions: In this section: (1) Academy: The term "Academy" means the National Academy of Sciences. (2) Commission: The term "Commission" means the Commission on Key National Indicators established under subsection (b). (3) Institute: The term "Institute" means a Key National Indicators Institute as designated under subsection (c)(3). (b) Commission On Key National Indicators: (1) Establishment: There is established a "Commission on Key National Indicators". (2) Membership: (A) Number And Appointment: The Commission shall be composed of 8 members, to be appointed equally by the majority and minority leaders of the Senate and the Speaker and minority leader of the House of Representatives. (B) Prohibited Appointments: Members of the Commission shall not include Members of Congress or other elected Federal, State, or local government officials. (C) Qualifications: In making appointments under subparagraph (A), the majority and minority leaders of the Senate and the Speaker and minority leader of the House of Representatives shall appoint individuals who have shown a dedication to improving civic dialogue and decision-making through the wide use of scientific evidence and factual information. (D) Period Of Appointment: Each member of the Commission shall be appointed for a 2-year term, except that 1 initial appointment shall be for 3 years. Any vacancies shall not affect the power and duties of the Commission but shall be filled in the same manner as the original appointment and shall last only for the remainder of that term. (E) Date: Members of the Commission shall be appointed by not later than 30 days after the date of enactment of this Act. (F) Initial Organizing Period: Not later than 60 days after the date of enactment of this Act, the Commission shall develop and implement a schedule for completion of the review and reports required under subsection (d). 
(G) Co-Chairpersons: The Commission shall select 2 Co-Chairpersons from among its members. (c) Duties Of The Commission: (1) In General.--The Commission shall: (A) conduct comprehensive oversight of a newly established key national indicators system consistent with the purpose described in this subsection; (B) make recommendations on how to improve the key national indicators system; (C) coordinate with Federal Government users and information providers to assure access to relevant and quality data; and: (D) enter into contracts with the Academy. (2) Reports: (A) Annual Report To Congress: Not later than 1 year after the selection of the 2 Co-Chairpersons of the Commission, and each subsequent year thereafter, the Commission shall prepare and submit to the appropriate Committees of Congress and the President a report that contains a detailed statement of the recommendations, findings, and conclusions of the Commission on the activities of the Academy and a designated Institute related to the establishment of a Key National Indicator System. (B) Annual Report To The Academy: (i) In General: Not later than 6 months after the selection of the 2 Co-Chairpersons of the Commission, and each subsequent year thereafter, the Commission shall prepare and submit to the Academy and a designated Institute a report making recommendations concerning potential issue areas and key indicators to be included in the Key National Indicators. (ii) Limitation: The Commission shall not have the authority to direct the Academy or, if established, the Institute, to adopt, modify, or delete any key indicators. 
(3) Contract With The National Academy Of Sciences: (A) In General: As soon as practicable after the selection of the 2 Co-Chairpersons of the Commission, the Co-Chairpersons shall enter into an arrangement with the National Academy of Sciences under which the Academy shall: (i) review available public and private sector research on the selection of a set of key national indicators; (ii) determine how best to establish a key national indicator system for the United States, by either creating its own institutional capability or designating an independent private nonprofit organization as an Institute to implement a key national indicator system; (iii) if the Academy designates an independent Institute under clause (ii), provide scientific and technical advice to the Institute and create an appropriate governance mechanism that balances Academy involvement and the independence of the Institute; and: (iv) provide an annual report to the Commission addressing scientific and technical issues related to the key national indicator system and, if established, the Institute, and governance of the Institute's budget and operations. (B) Participation: In executing the arrangement under subparagraph (A), the National Academy of Sciences shall convene a multi-sector, multidisciplinary process to define major scientific and technical issues associated with developing, maintaining, and evolving a Key National Indicator System and, if an Institute is established, to provide it with scientific and technical advice. (C) Establishment Of A Key National Indicator System.--: (i) In General: In executing the arrangement under subparagraph (A), the National Academy of Sciences shall enable the establishment of a key national indicator system by--: (I) creating its own institutional capability; or: (II) partnering with an independent private nonprofit organization as an Institute to implement a key national indicator system. 
(ii) Institute: If the Academy designates an Institute under clause (i)(II), such Institute shall be a non-profit entity (as defined for purposes of section 501(c)(3) of the Internal Revenue Code of 1986) with an educational mission, a governance structure that emphasizes independence, and characteristics that make such entity appropriate for establishing a key national indicator system. (iii) Responsibilities: Either the Academy or the Institute designated under clause (i)(II) shall be responsible for the following: (I) Identifying and selecting issue areas to be represented by the key national indicators. (II) Identifying and selecting the measures used for key national indicators within the issue areas under subclause (I). (III) Identifying and selecting data to populate the key national indicators described under subclause (II). (IV) Designing, publishing, and maintaining a public website that contains a freely accessible database allowing public access to the key national indicators. (V) Developing a quality assurance framework to ensure rigorous and independent processes and the selection of quality data. (VI) Developing a budget for the construction and management of a sustainable, adaptable, and evolving key national indicator system that reflects all Commission funding of Academy and, if an Institute is established, Institute activities. (VII) Reporting annually to the Commission regarding its selection of issue areas, key indicators, data, and progress toward establishing a web-accessible database. (VIII) Responding directly to the Commission in response to any Commission recommendations and to the Academy regarding any inquiries by the Academy. (iv) Governance: Upon the establishment of a key national indicator system, the Academy shall create an appropriate governance mechanism that incorporates advisory and control functions. 
If an Institute is designated under clause (i)(II), the governance mechanism shall balance appropriate Academy involvement and the independence of the Institute. (v) Modification And Changes: The Academy shall retain the sole discretion, at any time, to alter its approach to the establishment of a key national indicator system or, if an Institute is designated under clause (i)(II), to alter any aspect of its relationship with the Institute or to designate a different non-profit entity to serve as the Institute. (vi) Construction: Nothing in this section shall be construed to limit the ability of the Academy or the Institute designated under clause (i)(II) to receive private funding for activities related to the establishment of a key national indicator system. (D) Annual Report: As part of the arrangement under subparagraph (A), the National Academy of Sciences shall, not later than 270 days after the date of enactment of this Act, and annually thereafter, submit to the Co-Chairpersons of the Commission a report that contains the findings and recommendations of the Academy. (d) Government Accountability Office Study And Report: (1) GAO Study: The Comptroller General of the United States shall conduct a study of previous work conducted by all public agencies, private organizations, or foreign countries with respect to best practices for a key national indicator system. The study shall be submitted to the appropriate authorizing committees of Congress. (2) GAO Financial Audit: If an Institute is established under this section, the Comptroller General shall conduct an annual audit of the financial statements of the Institute, in accordance with generally accepted government auditing standards and submit a report on such audit to the Commission and the appropriate authorizing committees of Congress. 
(3) GAO Programmatic Review: The Comptroller General of the United States shall conduct programmatic assessments of the Institute established under this section as determined necessary by the Comptroller General and report the findings to the Commission and to the appropriate authorizing committees of Congress. (e) Authorization Of Appropriations: (1) In General: There are authorized to be appropriated to carry out the purposes of this section, $10,000,000 for fiscal year 2010, and $7,500,000 for each of fiscal years 2011 through 2018. (2) Availability: Amounts appropriated under paragraph (1) shall remain available until expended. [End of section] Appendix V: Full Text for Figure 3 Presentation of Key Indicators from the MONET System: [Refer to PDF for image: illustration] [A] Meeting needs - How well do we live? Being healthy, feeling safe, and having enough income to live are all needs that, when met, contribute to the well-being of the population. Enabling all individuals to live in dignity and enjoy a good quality of life is a central goal of sustainable development. [B] Preservation of resources - What are we leaving behind for our children? Sustainable development also means meeting the needs of the present without compromising the ability of future generations to meet their own needs. The quality of life of future generations depends, in large part, on the state of environmental, economic, and social resources we leave them in Switzerland and worldwide. [C] Fairness - How well are resources distributed? The concept of sustainable development is based on a demand for fairness. In this context, all individuals should have fair access to important resources such as education, income, health, and clean air. Inequality and poverty must be tackled at the national and international level. [D] Decoupling - How efficiently are we using our natural resources? 
From a sustainable development perspective, it is necessary that we seek to satisfy our needs within the limits of what the environment can withstand. Promoting economic and social development without damaging the environment means adopting more rational and efficient modes of production and consumption. Overlapping Objectives: Social solidarity: Poverty[C]; Physical safety[A]; Health[A]; Teenage reading skills[B]. Economic efficiency: Investment[B]; Innovation and technology[B]; Public debt[B]. Environmental responsibility: Built-up areas[B]; Biodiversity[B]. Social solidarity and Economic efficiency: Unemployment[A]; Income[A]; Equality[C]. Environmental responsibility and Economic efficiency: Freight transport[D]; Material consumption[D]; Energy consumption[D]. Social solidarity, Environmental responsibility, and Economic efficiency: Official development assistance[C]; Passenger transport[D]. Source: Adapted from graphics of MONET system, Swiss Confederation. Note: Federal Statistical Office, Federal Office for Spatial Development, Agency for Development and Cooperation, and Federal Office for the Environment, Sustainable Development--A Brief Guide 2010 (2011). Web page can be accessed at [hyperlink, http://www.bfs.admin.ch/bfs/portal/de/index/themen/21/02/dashboard/02.html]. [End of figure] [End of section] Appendix VI: GAO Contact and Staff Acknowledgments: GAO Contact: Bernice Steinhardt (202) 512-6543 or steinhardtb@gao.gov: Staff Acknowledgments: In addition to the contact named above, Elizabeth Curda, Assistant Director, and Janice Latimer and Judith Kordahl, Analysts-in-Charge, supervised the development of this report. Adam Miles and Diana Zinkl made significant contributions to all aspects of this report. Gregory Wilmoth assisted with the design and methodology. Sabrina Streagle provided legal counsel. William Trancucci verified the information in the report. 
[End of section] Footnotes: [1] The Patient Protection and Affordable Care Act of 2010, Pub. L. No. 111-148, §5605, established a Commission on Key National Indicators that will enter into an arrangement with the National Academy of Sciences to establish a U.S. key national indicator system. [2] Other definitions regarding indicator systems are in appendix I. [3] GAO, Informing our Nation: Improving How to Understand and Assess the USA's Position and Progress, [hyperlink, http://www.gao.gov/products/GAO-05-1] (Washington, D.C.: Nov. 10, 2004). [4] The National Academy of Sciences is a congressionally chartered, nongovernmental, tax-exempt institution that includes two other honorary academies, the National Academy of Engineering and the Institute of Medicine, as well as its operating arm, the National Research Council, [hyperlink, http://www.nas.edu]. [5] In addition, if an institute is established under this section, we are to conduct an annual financial statement audit and programmatic assessments as necessary. [6] [hyperlink, http://www.gao.gov/products/GAO-05-1]. [7] [hyperlink, http://www.gao.gov/products/GAO-05-1]. [8] GAO, Suggested Areas for Oversight for the 110th Congress, [hyperlink, http://www.gao.gov/products/GAO-07-235R] (Washington, D.C.: Nov. 17, 2006). [9] The seven commission appointees are: Nicholas N. Eberstadt, Ph.D.; Stephen Heintz; Wade F. Horn, Ph.D.; Ikram U. Khan, M.D.; Dean Ornish, M.D.; Tomas J. Philipson, Ph.D.; and Marta Tienda, Ph.D. One additional person was appointed but subsequently resigned. Two commission appointments each were made by the majority and minority leaders of the Senate and the speaker and the minority leader of the House. [10] Appendix IV includes section 5605 of the Patient Protection and Affordable Care Act, which details the provisions for implementation of a key national indicator system. [11] See [hyperlink, http://stateoftheusa.org]. [12] [hyperlink, http://www.gao.gov/products/GAO-05-1], pp. 168-169. 
[13] See [hyperlink, http://www.childstats.gov/americaschildren/index.asp] for more information on the America's Children indicators. See [hyperlink, http://www.KIDSCOUNT.org] for more information on KIDS COUNT indicators. [14] Joseph Stiglitz, Amartya Sen, and Jean-Paul Fitoussi, Report by the Commission on the Measurement of Economic Performance and Social Progress (2009), available online at [hyperlink, http://www.stiglitz-sen-fitoussi.fr]. [15] The OECD sponsored the first World Forum on measuring social progress in November 2004 in Palermo, Italy. The second one was held in June 2007 in Istanbul, Turkey, and the third was held in October 2009 in Busan, Korea. [16] [hyperlink, http://www.gao.gov/products/GAO-05-1], p. 87. [17] The Australian Bureau of Statistics releases an updated MAP product periodically. [18] Although the MONET Web site shows that there are 16 indicators, according to Swiss federal officials, a 17th indicator, poverty, is considered part of the MONET key indicator set, although adequate data are not available to populate that indicator. [19] Both the Swiss MONET and UK Government Sustainable Development indicator systems are part of national sustainable development strategies. Sustainable development was defined by a United Nations document in 1987 as "development that meets the needs of the present without compromising the ability of future generations to meet their own needs." [20] [hyperlink, http://www.gao.gov/products/GAO-05-1], p. 157. [21] [hyperlink, http://www.gao.gov/products/GAO-05-1], pp. 17-18. [22] K. Scrivens and B. Iasiello, What Makes a Successful Set of Indicators, Organization for Economic Cooperation and Development, Global Project on Measuring the Progress of Societies, 2008. [23] [hyperlink, http://www.gao.gov/products/GAO-05-1], p. 18. 
[24] The Institute of Medicine is an independent organization affiliated with the National Academy of Sciences that provides unbiased and authoritative advice to decision makers and the public on health-related matters. [25] For more information on the 20 key health measures, see [hyperlink, http://www.stateoftheusa.org/content/from-hundreds-of-health-indica.php]. [26] WEAVE, or Web-based Analysis and Visualization Environment, is a data visualization tool being developed by the Open Indicators Consortium. WEAVE, by integrating maps, charts, and tables on one Web site, is designed to allow users to visualize and analyze economic, social, and environmental indicators at the neighborhood, municipal, county, and regional levels. [27] For the current list of indicators, see the "Analytical Perspectives" section of the fiscal year 2012 federal budget. [28] Pub. L. No. 111-352, 124 Stat. 3866 (Jan. 4, 2011). [29] S. Rep. No. 111-372, at 8 (2010). [30] GAO, Opportunities to Reduce Potential Duplication in Government Programs, Save Tax Dollars, and Enhance Revenue, [hyperlink, http://www.gao.gov/products/GAO-11-318SP] (Washington, D.C.: Mar. 1, 2011). [31] The ABS is an independent statutory authority within the Australian government and is headed by an Australian Statistician who serves a 7-year term. The Australian Statistician is not a member of parliament, and although the ABS is attached to the Treasury Portfolio, the Australian Statistician has independent control of the operations of the ABS. [32] The first UK sustainable development strategy was issued in 1994. Sustainable development can be defined as development that meets the needs of the present without compromising the ability of future generations to meet their own needs. 
[End of section] GAO's Mission: The Government Accountability Office, the audit, evaluation and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability. Obtaining Copies of GAO Reports and Testimony: The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each weekday, GAO posts newly released reports, testimony, and correspondence on its Web site. To have GAO e-mail you a list of newly posted products every afternoon, go to [hyperlink, http://www.gao.gov] and select "E-mail Updates." Order by Phone: The price of each GAO publication reflects GAO's actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO's Web site, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information. To Report Fraud, Waste, and Abuse in Federal Programs: Contact: Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: E-mail: fraudnet@gao.gov: Automated answering system: (800) 424-5454 or (202) 512-7470: Congressional Relations: Ralph Dawn, Managing Director, dawnr@gao.gov: (202) 512-4400: U.S. Government Accountability Office: 441 G Street NW, Room 7125: Washington, D.C. 
20548: Public Affairs: Chuck Young, Managing Director, youngc1@gao.gov: (202) 512-4800: U.S. Government Accountability Office: 441 G Street NW, Room 7149: Washington, D.C. 20548: