This is the accessible text file for GAO report number GAO-12-42 entitled 'Chemical Assessments: Challenges Remain with EPA's Integrated Risk Information System Program' which was released on January 9, 2012. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. United States Government Accountability Office: GAO: Report to the Ranking Member, Subcommittee on Energy and Environment, Committee on Science, Space, and Technology, House of Representatives: December 2011: Chemical Assessments: Challenges Remain with EPA's Integrated Risk Information System Program: GAO-12-42: GAO Highlights: Highlights of GAO-12-42, a report to the Ranking Member, Subcommittee on Energy and Environment, Committee on Science, Space, and Technology, House of Representatives. Why GAO Did This Study: The Environmental Protection Agency’s (EPA) Integrated Risk Information System (IRIS) Program supports EPA’s mission to protect human health and the environment by providing the agency’s scientific position on the potential human health effects from exposure to various chemicals in the environment. The IRIS database contains quantitative toxicity assessments of more than 550 chemicals and provides fundamental scientific components of human health risk assessments. In response to a March 2008 GAO report on the IRIS program, EPA revised its IRIS assessment process in May 2009. GAO was asked to evaluate (1) EPA’s progress in completing IRIS assessments under the May 2009 process and (2) the challenges, if any, that EPA faces in implementing the IRIS program. To do this work, GAO reviewed and analyzed EPA productivity data, among other things, and interviewed EPA officials. What GAO Found: EPA’s May 2009 revisions to the IRIS process have restored EPA’s control of the process, increased its transparency, and established a new 23-month time frame for its less challenging assessments. Notably, EPA has addressed concerns GAO raised in its March 2008 report and now makes the determination of when to move an assessment to external peer review and issuance—-decisions that were made by the Office of Management and Budget (OMB) under the prior IRIS process. In addition, EPA has increased the transparency of the IRIS process by making comments provided by other federal agencies during the interagency science consultation and discussion steps of the IRIS process available to the public. 
Progress in other areas, however, has been limited. EPA’s initial gains in productivity under the revised process have not been sustained. After completing 16 assessments within the first year and a half of implementing the revised process, EPA completed 4 assessments in fiscal year 2011. Further, the increase in productivity does not appear to be entirely attributable to the revised IRIS assessment process and instead came largely from (1) clearing the backlog of IRIS assessments that had undergone work under the previous IRIS process and (2) issuing assessments that were less challenging to complete. EPA has taken longer than the established time frames for completing steps in the revised process for most of its less challenging assessments. However, EPA has not analyzed its established time frames to assess the feasibility of the time frame for each step or the overall 23-month process. The agency’s progress has also been limited in completing assessments that it classifies as exceptionally complex and reducing its ongoing assessments workload. Beyond the 55 ongoing IRIS assessments, the backlog of demand for additional IRIS assessments is unclear. With existing resources devoted to addressing its current workload of ongoing assessments, EPA has not been in a position to routinely start new assessments. EPA faces both long-standing and new challenges in implementing the IRIS program. First, EPA has not fully addressed recurring issues concerning the clarity and transparency of its development and presentation of draft IRIS assessments. For example, as part of its independent scientific review of EPA’s draft IRIS assessment of formaldehyde, the National Academies provided suggestions for improving EPA’s development and presentation of draft IRIS assessments in general, including that EPA use a standardized approach to evaluate and describe study strengths and weaknesses and the weight of evidence. EPA announced that it planned to respond to the National Academies’ suggestions by implementing changes to the way it develops draft IRIS assessments. Given that many of the issues raised by the National Academies have been long-standing, it is unclear whether any entity with scientific and technical credibility, such as an EPA advisory committee, will have a role in conducting an independent review of EPA’s planned response to the suggestions. In addition, EPA has not addressed other long-standing issues regarding the availability and accuracy of current information to users of IRIS information, such as EPA program offices, on the status of IRIS assessments, including when an assessment will be started, which assessments are ongoing, and when an assessment is projected to be completed. What GAO Recommends: GAO recommends, among other things, that EPA assess the feasibility of the established time frames for each step in the IRIS assessment process and make changes if necessary, submit for independent review to an entity with scientific and technical credibility a plan for how EPA will implement the National Academies’ suggestions, and ensure that current and accurate information on chemicals that EPA plans to assess through IRIS is available to IRIS users. EPA agreed with GAO’s recommendations and noted specific actions it will take to implement them. View [hyperlink, http://www.gao.gov/products/GAO-12-42]. For more information, contact David C. Trimble at (202) 512-3841 or trimbled@gao.gov. 
[End of section] Contents: Letter: Background: EPA's Progress in Completing Assessments under Its Revised Process Has Been Limited: EPA Faces Long-standing and New Challenges in Implementing the IRIS Program: Conclusions: Recommendations for Executive Action: Agency Comments and Our Evaluation: Appendix I: Scope and Methodology: Appendix II: Status of Chemicals in the IRIS Assessment Development Process, as of September 30, 2011: Appendix III: Information on Chemicals of Key Concern: Appendix IV: Projected Completion Dates for IRIS Assessments Currently in the Assessment Development Process, as of September 30, 2011: Appendix V: Comments from the Environmental Protection Agency: Appendix VI: GAO Contact and Staff Acknowledgments: Tables: Table 1: EPA's May 2009 IRIS Assessment Process Steps and Established Time Frames: Table 2: Completed IRIS Assessments from May 2009 through September 2011: Table 3: Ongoing IRIS Assessment Workload Balance: Table 4: May 2009 IRIS Process Step Time Frames and Actual Completion Times for Each Step: Figures: Figure 1: National Academies' Risk Assessment Model Used by EPA: Figure 2: Number of Completed IRIS Assessments, Fiscal Years 2002-2011: Abbreviations: ATSDR: Agency for Toxic Substances and Disease Registry: BBP: butyl benzyl phthalate: BOSC: Board of Scientific Counselors: CEQ: Council on Environmental Quality: CERCLA: Comprehensive Environmental Response, Compensation, and Liability Act of 1980: DBP: dibutyl phthalate: DEHA: di(2-ethylhexyl) adipate: DEHP: di(2-ethylhexyl) phthalate: DIBP: diisobutyl phthalate: DINP: diisononyl phthalate: DIPE: diisopropyl ether: DOD: Department of Defense: DPP: dipentyl phthalate: EGBE: ethylene glycol monobutyl ether: EPA: Environmental Protection Agency: ETBE: ethyl tertiary butyl ether: FDA: Food and Drug Administration: HHS: Department of Health and Human Services: IRIS: Integrated Risk Information System: IRISTrack: IRIS Substance Assessment Tracking System: MTBE: methyl tert-butyl ether: NASA: National Aeronautics and Space Administration: NCEA: National Center for Environmental Assessment: NCEH: National Center for Environmental Health: NIOSH: National Institute for Occupational Safety and Health: NTP: National Toxicology Program: OIRA: Office of Information and Regulatory Affairs: OMB: Office of Management and Budget: ORD: Office of Research and Development: PAH: polycyclic aromatic hydrocarbon: PART: Program Assessment Rating Tool: PBPK: physiologically based pharmacokinetic: PCBs: polychlorinated biphenyls: PFOA: perfluorooctanoic acid-ammonium salt: PFOS: perfluorooctane sulfonate-potassium salt: RDX: hexahydro-1,3,5-trinitro-1,3,5-triazine: RfC: inhalation reference concentration: RfD: oral reference dose: TAEE: tertiary amyl ethyl ether: TAME: tertiary amyl methyl ether: TCDD: 2,3,7,8-tetrachlorodibenzo-p-dioxin: TCE: trichloroethylene: TCP: trichloropropane: THF: tetrahydrofuran: TSCA: Toxic Substances Control Act: [End of section] United States Government Accountability Office: Washington, DC 20548: December 9, 2011: The Honorable Brad Miller: Ranking Member: Subcommittee on Energy and Environment: Committee on Science, Space, and Technology: House of Representatives: Dear Mr. Miller: The Environmental Protection Agency's (EPA) Integrated Risk Information System (IRIS) Program supports EPA's mission to protect human health and the environment by providing the agency's scientific position on the potential human health effects that may result from exposure to various chemicals in the environment.
IRIS was created in 1985 to help EPA develop consensus opinions within the agency about the health effects from chronic exposure to chemicals, and its importance has increased over time. The IRIS database contains quantitative toxicity assessments of the health effects of more than 550 chemicals and provides fundamental scientific components-- qualitative hazard identification and quantitative dose-response assessment--of human health risk assessments.[Footnote 1] EPA's IRIS Program develops new IRIS assessments and, as needed, updates existing IRIS values contained in the IRIS database. These IRIS assessments, in turn, provide scientific input for risk management decisions, such as whether EPA should establish air or water quality standards to protect the public from exposure to toxic chemicals or set cleanup standards for hazardous waste sites. Consequently, IRIS assessments are a critical component of EPA's capacity to support scientifically sound decisions, policies, and regulations. State and local environmental programs and some international regulatory bodies also rely on IRIS for managing their environmental protection programs. In 2008, we reported that the IRIS database was at serious risk of becoming obsolete because the agency had not been able to keep its existing assessments current, decrease its ongoing assessments workload to a manageable level, or complete assessments of the most important chemicals of concern.[Footnote 2] In addition, we reported that as of December 2007, most of the ongoing assessments being conducted at that time had been in process for more than 5 years and that some assessments of key chemicals--chemicals that are likely to cause cancer or other significant health effects--had been in process even longer. For example, the formaldehyde and dioxin assessments had been ongoing for 11 and 17 years, respectively.[Footnote 3] We also reported that new Office of Management and Budget (OMB)-required reviews of IRIS assessments by OMB and other federal agencies--called interagency reviews--were conducted in a manner that limited the transparency and credibility of the assessments and hindered EPA's ability to manage the IRIS assessment process. Because of these issues, we recommended that EPA revise its IRIS assessment process to develop the timely chemical risk information the agency needs to effectively conduct its mission and to better ensure the development of transparent, credible chemical assessments. EPA issued a revised process in April 2008 that we concluded, in testimony before Congress, would further exacerbate the timeliness and credibility concerns we had identified.[Footnote 4] Because the agency had not developed sufficient chemical assessment information to limit public exposure to many chemicals that may pose substantial health risks--and, in particular, because of EPA's lack of responsiveness to our March 2008 recommendations--in January 2009, we added EPA's processes for assessing and controlling toxic chemicals to our list of areas at high risk for waste, fraud, abuse, and mismanagement or in need of broad-based transformation.[Footnote 5] In response to our 2008 report and subsequent high-risk designation, EPA revised its IRIS assessment process in May 2009 to, among other things, restore EPA's control of the process and increase its transparency. 
Our biennial review of high-risk areas in 2011 concluded that the EPA Administrator needs to continue to demonstrate a strong commitment to and support of the IRIS Program to ensure that EPA's 2009 reforms are implemented effectively and that the program can routinely provide timely, transparent, and credible assessments.[Footnote 6] In this context, you asked us to review EPA's IRIS assessment process. Our objectives were to evaluate (1) EPA's progress in completing IRIS assessments under the May 2009 process and (2) the challenges, if any, that EPA faces in implementing the IRIS program. In conducting our work, we analyzed EPA's May 2009 assessment process; data from fiscal year 1999 through September 30, 2011, on IRIS productivity, such as the number of IRIS assessments initiated and completed; the status of IRIS assessments that are currently in progress; and EPA's goals for completing assessments. To assess the reliability of the data, we conducted interviews and e-mail exchanges with EPA officials about the data system, the method of data input, and internal data controls and documentation, among other things. We found the data to be sufficiently reliable for the purposes of our report. We also interviewed officials from EPA's Office of Research and Development's (ORD) National Center for Environmental Assessment (NCEA), which manages the IRIS Program. In addition, we interviewed officials from other federal agencies involved in the IRIS process-- including OMB and the Department of Defense (DOD)--and groups that have knowledge of the IRIS Program. We did not evaluate the scientific content or quality of IRIS assessments, but we reviewed the suggestions to EPA in peer review reports on overall improvements to the development of IRIS assessments and information on other issues affecting the IRIS Program. A more detailed description of our scope and methodology is presented in appendix I. We conducted this performance audit from July 2010 to December 2011 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. Background: This section discusses EPA's risk assessment and risk management practices and the May 2009 IRIS process. Risk Assessment and Risk Management: EPA's IRIS Program is an important source of information on health effects that may result from exposure to chemicals in the environment. As figure 1 shows, the toxicity assessments in the IRIS database fulfill the first two critical steps of the risk assessment process-- providing qualitative hazard identification and dose-response assessment (see definitions below).[Footnote 7] IRIS information can then be used with the results of exposure assessments (typically conducted by EPA's program or regional offices) to provide an overall characterization of the public health risks for a given chemical in a given situation. EPA defines a risk assessment, in the context of human health, as the evaluation of scientific information on the hazardous properties of environmental agents (hazard characterization), the dose-response relationship (dose-response assessment), and the extent of human exposure to those agents (exposure assessment). 
In final form, a risk assessment is a statement regarding the probability that populations or individuals so exposed will be harmed and to what degree (risk characterization). The development of risk assessments is directly dependent on the development of toxicity assessments such as those developed by the IRIS Program. Figure 1: National Academies' Risk Assessment Model Used by EPA: [Refer to PDF for image: illustration] IRIS toxicity assessment: 1. Hazard identification; 2. Dose-response assessment; Plus: Exposure assessment; Combine to form: Risk characterization. Source: GAO presentation of the risk assessment component of the National Academies' risk assessment and risk management model used by EPA. Note: The National Academies comprise four organizations: the National Academy of Sciences, the National Academy of Engineering, the Institute of Medicine, and the National Research Council. [End of figure] A typical IRIS toxicity assessment is based on two sequential analyses: qualitative hazard identification and quantitative dose- response assessment. Among other things, a hazard identification identifies health hazards that may be caused by a given chemical at environmentally relevant concentrations; this identification describes the potential noncancer and cancer health effects of exposure to a chemical that research studies have suggested or determined. For cancer effects, EPA describes the carcinogenic potential of a chemical in a narrative which includes one of five weight-of-the-scientific- evidence descriptors, ranging from "carcinogenic to humans" to "not likely to be carcinogenic to humans." The second analysis is the dose- response assessment, which characterizes the quantitative relationship between the exposure to a chemical and the resultant health effects; this assessment describes the magnitude of hazard for potential noncancer effects and increased cancer risk resulting from specific exposure levels to a chemical or substance. The quantitative dose- response analysis relies upon credible research data, primarily from either animal (toxicity) or human (epidemiology) studies. The noncancer dose-response assessments may include: * an oral reference dose (RfD)--an estimate of the daily oral exposure to a chemical that is likely to be without an appreciable risk of deleterious effects during a person's lifetime--expressed in terms of milligrams per kilogram per day and: * an inhalation reference concentration (RfC)--an estimate of the daily inhalation exposure to a chemical that is likely to be without an appreciable risk of deleterious effects during a person's lifetime-- expressed in terms of milligrams per cubic meter. The focus of IRIS toxicity assessments has been on the potential health effects of long-term (chronic) exposure to chemicals. According to OMB, EPA is the only federal agency that develops qualitative and quantitative assessments of both cancer and noncancer risks of exposure to chemicals, and EPA does so largely under the IRIS Program. [Footnote 8] The risk characterization information, which is derived from toxicity and exposure assessments--exposure assessments identify the extent to which exposure actually occurs--can be used to make risk management decisions designed to protect public health. 
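To illustrate how these quantitative pieces fit together, the following minimal sketch shows, in simplified form, how a chronic noncancer toxicity value such as an oral reference dose can be derived by dividing a study-based point of departure by uncertainty factors, and how an estimated exposure can then be compared with that value as a screening-level hazard quotient. The sketch is written in Python; all numbers, names, and helper functions are hypothetical and are not drawn from any particular IRIS assessment.

# Illustrative sketch only: derive a chronic oral reference dose (RfD) from a
# hypothetical animal-study point of departure and compare an estimated human
# exposure against it. All values are invented for illustration.

def derive_rfd(point_of_departure, uncertainty_factors):
    """Return (RfD in mg/kg-day, total uncertainty factor applied)."""
    total_uf = 1
    for uf in uncertainty_factors.values():
        total_uf *= uf
    return point_of_departure / total_uf, total_uf

def hazard_quotient(exposure, rfd):
    """Screening-level hazard quotient: estimated exposure divided by the RfD."""
    return exposure / rfd

# Hypothetical no-observed-adverse-effect level (mg/kg-day) and the uncertainty
# factors applied to it (animal-to-human extrapolation and human variability).
noael = 10.0
ufs = {"animal_to_human": 10, "human_variability": 10}

rfd, total_uf = derive_rfd(noael, ufs)
hq = hazard_quotient(0.02, rfd)  # hypothetical chronic exposure of 0.02 mg/kg-day

print(total_uf)  # 100
print(rfd)       # 0.1 mg/kg-day
print(hq)        # 0.2, below 1, so below a level of concern in this example

[End of example]

In practice, the choice of study, point of departure, and uncertainty factors requires the kind of evidence evaluation and documentation discussed later in this report; the IRIS toxicity value supplies only one side of such comparisons, which in turn inform risk management decisions.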
For example, IRIS assessments support scientifically sound decisions, policies, and regulations under such key statutes as the Clean Air Act, the Safe Drinking Water Act, and the Clean Water Act, as well as for setting Superfund cleanup standards of hazardous waste sites.[Footnote 9] Risk management, as opposed to risk assessment, involves integrating the risk characterization information with other information--such as economic information on the costs and benefits of mitigating the risk, technological information on the feasibility of managing the risk, and the concerns of various stakeholders--to decide when actions to protect public health are warranted. More specifically, an initial risk management decision would be to determine whether the health risks identified in a chemical risk assessment warrant regulatory or other actions. As a result, the development of IRIS assessments is of key interest to stakeholders, such as other federal agencies and their contractors, chemical companies, and others who could be affected if regulatory actions were taken. That is, stakeholders could face increased cleanup costs and other legal liabilities if EPA issued an IRIS assessment for a chemical that resulted in a risk management decision to regulate the chemical to protect the public. The May 2009 IRIS Process: EPA's process for developing IRIS assessments--established in May 2009--consists of seven steps. In announcing its revised process in May 2009, EPA noted that the new process would ensure that the majority of assessments would be completed within 2 years (23 months)-- a significantly shorter time than the estimated completion time frame of about 6 to 8 years under the previous process. We note that the seven steps are preceded by a literature search and data call-in, which is not included as part of the process or its time frames. Results of the literature search are posted on the IRIS website and announced in the Federal Register, along with a request for information--the data call-in--about any pertinent studies not listed. According to EPA officials, the literature search and data call-in are not part of the process because the agency does not dedicate full-time staff to them. EPA officials told us that after the literature search, they place IRIS assessments in one of three categories--standard, moderately complex, or exceptionally complex[Footnote 10]--on the basis of such factors as the number of available scientific studies on the chemical, the number of potential health effects identified in these studies, the staff resources required to complete the assessment, and the level of stakeholder interest. However, this process, as written, does not distinguish among different types of assessments with varying complexity. Table 1 outlines the steps in the IRIS assessment process, along with the planned time frames established by EPA. Table 1: EPA's May 2009 IRIS Assessment Process Steps and Established Time Frames: Step: 1: IRIS draft assessment completed; Action: Assessment drafted; Time frame: 345 days. Step: 2: Internal agency review; Action: Draft assessment reviewed by EPA program and regional offices; Time frame: 60 days. Step: 3: EPA-managed interagency science consultation; Action: Interagency science consultation on draft assessment coordinated by EPA (White House offices[A] and other federal agencies[B]); Time frame: 45 days. 
Step: 4: External peer review; Action: Draft assessment posted on IRIS website for independent peer review[C] and peer review meeting announced in Federal Register; concurrently, public comment period and public listening session announced in Federal Register[D]; Time frame: 105 days. Step: 5: Draft assessment revised; Action: Draft assessment revised in response to peer review and public comments and document prepared responding to comments; Time frame: 60 days. Step: 6a: Internal agency review; Step: 6b: EPA-led interagency science discussion; Action: Draft assessment reviewed by EPA program and regional offices concurrent with interagency science consultation on draft assessment coordinated by EPA (White House offices and other federal agencies); Time frame: 45 days. Step: 7: Final IRIS assessment; Action: Final assessment, including summary, toxicological review, and response to comments, posted on IRIS website[E]; Time frame: 30 days. Total: Time frame: 690 days (23 months). Source: GAO analysis of EPA process. [A] The White House offices participating include OMB's Office of Information and Regulatory Affairs (OIRA) and the Council on Environmental Quality (CEQ). [B] Other federal agencies participating include DOD, the Department of Health and Human Services (HHS), and the National Aeronautics and Space Administration (NASA). [C] EPA decides the type of independent peer review an IRIS assessment will undergo. The peer reviews are conducted by (1) a peer review panel assembled by an EPA contractor, (2) EPA's Science Advisory Board, or (3) the National Academies. [D] The public listening session is announced concurrently with the peer review meeting in the Federal Register and provides an opportunity for interested parties to present scientific and technical comments on draft IRIS assessments. [E] [hyperlink, http://www.epa.gov/IRIS]. [End of table] All IRIS assessments undergo external peer review, but exceptionally complex assessments are generally peer reviewed by EPA's Science Advisory Board panels and in some cases by National Academies panels. [Footnote 11] These peer reviews typically require more planning and take longer than the reviews for less complicated assessments. [Footnote 12] Peer reviews for all other assessments are typically conducted by expert panels that are independently assembled by an EPA contractor. All panels, including Science Advisory Board and National Academies panels, are composed of individuals with expertise in various scientific and technical disciplines who retain their primary involvement in academia, industry, state government, and environmental organizations. As we reported in 2008, an overarching factor that can affect EPA's ability to complete IRIS assessments in a timely manner is the compounding effects of delays.[Footnote 13] Once a delay in the assessment process occurs--for example, suspending work on an assessment to wait for additional studies--work that has been completed can become outdated, necessitating rework of some or all of the steps in the assessment process. Even a single delay can have far-reaching, time-consuming consequences, in some cases requiring that the assessment process essentially start over.[Footnote 14] EPA's Progress in Completing Assessments under Its Revised Process Has Been Limited: EPA's May 2009 IRIS assessment process addresses some of the problems we identified in our March 2008 report. However, progress in other areas has been limited.
EPA's initial gains in productivity under the revised process have not been sustained. EPA has not significantly reduced its workload of ongoing assessments, which would enable the agency to routinely start new assessments and keep existing assessments current. EPA has not met established time frames for IRIS assessment process steps. Improvements to the IRIS Process: EPA has addressed concerns we raised in our March 2008 report regarding the transparency of the IRIS process. Since May 2009, all federal agency and White House office comments from both the interagency science consultation and discussion (steps 3 and 6b of the IRIS process) are available to the public on EPA's IRIS website. In addition, EPA has made publicly available documents that show EPA's responses to selected "major" interagency comments for all draft IRIS assessments that have completed an interagency review step since June 2011. As we have previously reported, we believe that interagency coordination can enhance the quality of EPA's IRIS assessments. Previously, OMB considered its comments and changes, and those of other federal agencies, to be "deliberative"--that is, they were not part of the public record. We believe the input from other federal agencies is now obtained in a manner that better ensures that EPA's scientific analysis is given appropriate weight. As a result, stakeholders, including EPA regional and program offices, the public, and industry, can now see which other federal agencies comment and the nature of their comments, making IRIS assessments more transparent. Transparency is especially important because agencies providing input, such as DOD and NASA, may have a vested interest in the outcome of the assessment should it lead to regulatory or other actions. For example, these agencies may be affected by the potential for increased environmental cleanup costs and other legal liability if EPA issued an IRIS assessment that resulted in a decision to regulate a chemical to protect the public. Officials we spoke with from other federal agencies--including DOD, NASA, and the Department of Health and Human Services (HHS)--all agreed that making their comments publicly available was a good practice. In addition, EPA now manages the interagency science consultation and discussion (steps 3 and 6b of the IRIS process, formerly OMB-managed interagency reviews). As we recommended in 2008, the process now includes time limits for all parties, including OMB and other federal agencies, to provide comments to EPA on draft assessments. Prior to May 2009, OMB managed these steps, and EPA was not allowed to proceed with assessments until OMB notified EPA that it had sufficiently responded to comments from OMB and other federal agencies. EPA has also streamlined its IRIS process, as we recommended in our 2008 report, by consolidating some process steps and eliminating others that had provided opportunities for other federal agencies to suspend IRIS assessments to conduct additional research.[Footnote 15] Initial Gains in Productivity under the Revised Process Have Not Been Sustained: Shortly after it implemented its revised IRIS assessment process in May 2009, EPA experienced a surge of productivity in terms of the number of IRIS assessments it issued. 
Specifically, from May 2009 through September 30, 2011, EPA completed 20 IRIS assessments--more than doubling the total productivity it achieved during fiscal years 2007 and 2008.[Footnote 16] However, 16 of these were completed in the first year and a half of implementing the revised process, and productivity fell sharply during fiscal year 2011, with EPA issuing 4 IRIS assessments (see fig. 2). Figure 2: Number of Completed IRIS Assessments, Fiscal Years 2002-2011: [Refer to PDF for image: line graph] Fiscal year: 2002: 3. Fiscal year: 2003: 11. Fiscal year: 2004: 4. Fiscal year: 2005: 4. Fiscal year: 2006: 2. Fiscal year: 2007: 2. Fiscal year: 2008: 5. Fiscal year: 2009: 7. Fiscal year: 2010: 10. Fiscal year: 2011: 4. Source: GAO presentation of EPA data. [End of figure] In completing 4 IRIS assessments in fiscal year 2011, EPA fell significantly short of its original plan to complete 20 assessments--a goal that it had revised to 9 as of August 2011. In addition, EPA is unlikely to meet its fiscal year 2012 goal of completing 40 assessments.[Footnote 17] As of September 30, 2011, 12 of the 40 assessments that EPA plans to complete in fiscal year 2012 are still being drafted (step 1 of the IRIS process). See appendix II for the status of chemicals in the IRIS assessment process as of September 30, 2011. On the basis of the planned time frames EPA established under its revised process, once these 12 IRIS assessments are drafted, EPA will require at least 345 days, or 11½ months, to complete the remaining IRIS process steps and issue the assessments--making it unlikely these will be completed in 2012. The increased productivity occurring after May 2009 does not appear to be entirely attributable to the revised IRIS assessment process. According to our analysis of EPA data, the agency's ability to complete more assessments was not due to a fundamental gain in how quickly assessments are completed, but rather to EPA's ability to clear up the backlog of assessments that had undergone work under the previous IRIS process and had been delayed for multiple reasons. Most of the assessments completed from May 2009 through September 2011 had been in process 5 years or longer and thus had already passed through some key process steps prior to the implementation of the revised process. In addition, most of these completed IRIS assessments were for standard and moderately complex assessments--that is, they were less challenging to complete than those for more complex chemicals. Specifically, 17 of 20 assessments issued from May 2009 through September 30, 2011, were in process for 5 years or longer, and 2 of the 20 were for exceptionally complex assessments (see table 2). For example, 1 exceptionally complex assessment that EPA did complete was for trichloroethylene (TCE). For information on TCE, as well as on some other key chemicals for which EPA has not completed IRIS assessments, see appendix III. Table 2: Completed IRIS Assessments from May 2009 through September 2011: IRIS assessment: Trichloroacetic acid; Level of complexity[A]: Moderately complex; Assessment completion date: Sept. 30, 2011; Length of time to complete assessment: 7 years, 9 months. IRIS assessment: Trichloroethylene (TCE); Level of complexity[A]: Exceptionally complex; Assessment completion date: Sept. 28, 2011; Length of time to complete assessment: 13 years, 9 months. IRIS assessment: Hexachloroethane; Level of complexity[A]: Standard; Assessment completion date: Sept. 23, 2011; Length of time to complete assessment: 6 years.
IRIS assessment: Urea; Level of complexity[A]: Standard; Assessment completion date: July 13, 2011; Length of time to complete assessment: 3 years, 6 months. IRIS assessment: Chloroprene; Level of complexity[A]: Moderately complex; Assessment completion date: Sept. 30, 2010; Length of time to complete assessment: 11 years, 10 months. IRIS assessment: Dichloroethylene -1,2-Cis-; Level of complexity[A]: Standard; Assessment completion date: Sept. 30, 2010; Length of time to complete assessment: 6 years, 8 months. IRIS assessment: Dichloroethylene -1,2-Trans-; Level of complexity[A]: Standard; Assessment completion date: Sept. 30, 2010; Length of time to complete assessment: 6 years, 8 months. IRIS assessment: Pentachlorophenol; Level of complexity[A]: Moderately complex; Assessment completion date: Sept. 30, 2010; Length of time to complete assessment: 12 years, 9 months. IRIS assessment: Tetrachloroethane-1,1,2,2; Level of complexity[A]: Standard; Assessment completion date: Sept. 30, 2010; Length of time to complete assessment: 5 years. IRIS assessment: Hydrogen cyanide; Level of complexity[A]: Standard; Assessment completion date: Sept. 28, 2010; Length of time to complete assessment: 7 years, 6 months. IRIS assessment: Dioxane-1,4--(oral route); Level of complexity[A]: Moderately complex; Assessment completion date: Aug. 11, 2010; Length of time to complete assessment: 6 years, 6 months. IRIS assessment: Carbon tetrachloride; Level of complexity[A]: Moderately complex; Assessment completion date: Mar. 31, 2010; Length of time to complete assessment: 10 years, 3 months. IRIS assessment: Ethylene glycol monobutyl ether (EGBE); Level of complexity[A]: Moderately complex; Assessment completion date: Mar. 31, 2010; Length of time to complete assessment: 6 years, 2 months. IRIS assessment: Acrylamide; Level of complexity[A]: Exceptionally complex; Assessment completion date: Mar. 22, 2010; Length of time to complete assessment: 9 years, 1 month. IRIS assessment: Bromobenzene; Level of complexity[A]: Standard; Assessment completion date: Sept. 30, 2009; Length of time to complete assessment: 6 years, 8 months. IRIS assessment: Thallium; Level of complexity[A]: Standard; Assessment completion date: Sept. 30, 2009; Length of time to complete assessment: 7 years, 9 months. IRIS assessment: Trichloropropane-1,2,3 (TCP); Level of complexity[A]: Moderately complex; Assessment completion date: Sept. 30, 2009; Length of time to complete assessment: 6 years, 5 months. IRIS assessment: Cerium Oxide; Level of complexity[A]: Standard; Assessment completion date: Sept. 29, 2009; Length of time to complete assessment: 4 years, 3 months. IRIS assessment: Hexanone-2; Level of complexity[A]: Standard; Assessment completion date: Sept. 25, 2009; Length of time to complete assessment: 4 years, 8 months. IRIS assessment: Chlordecone (kepone); Level of complexity[A]: Standard; Assessment completion date: Sept. 22, 2009; Length of time to complete assessment: 6 years, 8 months. IRIS assessment: Average time to complete; Level of complexity[A]: [Empty]; Assessment completion date: [Empty]; Length of time to complete assessment: 7 years, 6 months. Source: GAO analysis of EPA data. [A] EPA determines the level of complexity for IRIS assessments.
[End of table] EPA Has Not Significantly Reduced Its Ongoing Assessment Workload, Which Would Enable the Agency to Start New Assessments: As of September 30, 2011, EPA had 55 IRIS assessments ongoing and 14 on hold--down from the 88 assessments that were in various stages of development when it implemented its revised IRIS assessment process in May 2009. Since May 2009, EPA has undertaken 6 new assessments, dropped 5 assessments that it determined were no longer required, completed 20 assessments, and continued to have 14 assessments on hold (see table 3). According to EPA officials, assessments that have been put on hold will be resumed when the agency has resources available to staff them. Table 3: Ongoing IRIS Assessment Workload Balance: Beginning workload, May 2009: 88. New assessments undertaken[A]: 6. Dropped from agenda[B]: (5). Completed[C]: (20). On hold[D]: (14). Ending workload, September 30, 2011: 55. Source: GAO analysis of EPA data. [A] Since May 2009, EPA has started 3 new IRIS assessments (n-butanol, diethyl phthalate, and polychlorinated biphenyls [PCBs]-noncancer). In addition, EPA decided to split the assessments for 3 chemicals (chromium VI, 1,4-dioxane, and methanol), effectively adding 3 assessments to its list of ongoing assessments. It split the latter assessment into assessments of the cancer and noncancer effects of methanol, and the former 2 assessments into assessments of the effects of oral and inhalation exposure. [B] As indicated in the Federal Register (75 Fed. Reg. 63827, October 18, 2010), EPA stopped work on (or "dropped") the IRIS assessments of perfluorooctane sulfonate-potassium salt (PFOS) and perfluorooctanoic acid-ammonium salt (PFOA) because it is focusing on these chemicals as part of its chemicals management program under the Toxic Substances Control Act (TSCA). In the same Federal Register notice, EPA announced that it had stopped work on the IRIS assessment of weathered toxaphene because of lack of data to support an IRIS assessment, and that it had revised its approach to the IRIS assessment of asbestos, deciding to focus exclusively on a certain type of asbestos--Libby amphibole asbestos--in order to respond to the needs of the agency and the mining community of Libby, Montana. Under the revised approach, EPA dropped its plans to assess asbestos other than Libby amphibole asbestos. In August 2011, EPA decided to drop a fifth assessment, the draft assessment of mirex, because of its relatively low priority EPA-wide. [C] For a list of the 20 completed IRIS assessments from May 2009 through September 2011, see table 2. [D] The following 14 assessments remain on hold as of September 30, 2011: manganese, ethylene dichloride, tungsten, tertiary amyl methyl ether (TAME), ethylbenzene, alkylates, antimony, carbonyl sulfide, diisopropyl ether (DIPE), tertiary amyl ethyl ether (TAEE), bisphenol A, refractory ceramic fibers, isopropanol, and ethanol. [End of table] However, this tally of IRIS assessments does not reflect the true extent of EPA's workload or the backlog of demand for IRIS assessments. Beyond the 55 ongoing IRIS assessments and 14 on hold, the demand for additional IRIS assessments is unclear. With existing resources devoted to addressing its current workload of ongoing assessments, EPA has not been in a position to routinely start new assessments.
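The workload balance reported in table 3 above can be verified with simple arithmetic; the short Python sketch below merely restates the figures from the table and is included only to make the bookkeeping explicit.

# Reproduces the ongoing-assessment workload balance reported in table 3.
beginning_workload_may_2009 = 88
new_assessments = 6   # 3 newly started plus 3 created by splitting existing assessments
dropped = 5
completed = 20
on_hold = 14

ending_workload = (beginning_workload_may_2009 + new_assessments
                   - dropped - completed - on_hold)
print(ending_workload)  # 55 ongoing assessments as of September 30, 2011

[End of example]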
In late 2010, for the first time since 2007, EPA solicited nominations for new IRIS assessments from EPA program and regional offices, as well as from the public and federal agencies that participate in IRIS interagency reviews. However, as of September 30, 2011, EPA officials had not decided which chemicals to include on the IRIS agenda and thus include in their workload.[Footnote 18] Moreover, instead of nominating new chemicals for assessment in 2010, one regional office requested that the IRIS Program focus its efforts on completing assessments currently under way. In addition, in 2007, the Office of Air and Radiation--which develops national programs, policies, and regulations for controlling air pollution and radiation exposure--requested that ongoing assessments be expedited for 28 chemicals that it identified as high-priority and required to fulfill its regulatory mandates. As of September 30, 2011, 17 of the 28 assessments the office identified are ongoing, and 3 are on hold. See appendix IV for EPA's expected completion dates for IRIS assessments currently in the assessment process. In addition, other assessments in the IRIS database may need to be updated. As we reported in March 2008, EPA data from 2001 through 2003 indicated that 287 of the assessments in the IRIS database at that time may need to be updated. In October 2009, EPA announced in the Federal Register the establishment of the IRIS Update Project. The stated purpose of the project was to update IRIS toxicity values, such as oral reference doses or inhalation reference concentrations, that are more than 10 years old. However, according to EPA officials, since the project was announced, little progress has been made toward updating these assessments. We note that even if EPA were to overcome the significant productivity difficulties it has experienced in recent years and meet its goal of completing 40 assessments in fiscal year 2012, it is not clear that this level of productivity would meet the needs of EPA program offices and other users. EPA Has Not Met Established Time Frames for IRIS Assessment Process Steps: IRIS assessments have taken longer than the time frames established under the revised IRIS process. Since implementing the revised process, most IRIS assessments have exceeded the established time frames for each step of the process. EPA officials, however, told us that the time frames established for the steps in the revised IRIS assessment process apply only to standard assessments--and not to moderately or exceptionally complex assessments. While EPA officials have said that they are trying to hold moderately complex assessments to the established time frames, EPA does not have a written policy that describes the applicability of these time frames or written criteria for designating IRIS assessments as standard, moderately complex, or exceptionally complex. Consequently, it is unclear how IRIS users will know which assessments are standard, moderately complex, or exceptionally complex and what time frames will be required to complete them. According to EPA officials, NCEA management, including IRIS Program management, is tracking the time it takes for each IRIS assessment to complete the various steps in the IRIS process. However, EPA has not yet analyzed these data to determine whether the time frames established for each step or the overall 23-month process are realistic. According to EPA officials, they do not yet have the data needed to draw conclusions regarding completion time frames. 
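The kind of comparison involved is straightforward; the minimal Python sketch below encodes the established time frames from table 1 (with steps 4 and 5 and steps 6 and 7 combined, because EPA's data do not mark where one step ends and the next begins) and checks hypothetical average completion times against them. The "actual" figures in the sketch are invented for illustration and are not EPA data.

# Established time frames, in days, from EPA's May 2009 process (table 1).
# Steps 4-5 and 6-7 are combined because EPA's data do not separate them.
established = {
    "Step 1: draft assessment": 345,
    "Step 2: internal agency review": 60,
    "Step 3: interagency science consultation": 45,
    "Steps 4-5: external peer review and revision": 105 + 60,
    "Steps 6-7: final reviews and posting": 45 + 30,
}
assert sum(established.values()) == 690  # the advertised 23-month (690-day) total

# Hypothetical average completion times for a group of assessments (illustration only).
actual_averages = {
    "Step 1: draft assessment": 500,
    "Step 2: internal agency review": 100,
    "Step 3: interagency science consultation": 80,
    "Steps 4-5: external peer review and revision": 350,
    "Steps 6-7: final reviews and posting": 100,
}

for step, target in established.items():
    overrun = actual_averages[step] - target
    print(f"{step}: target {target} days, hypothetical average "
          f"{actual_averages[step]} days, overrun {overrun} days")

[End of example]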
On the basis of our analysis of EPA data, however, we determined how long each IRIS process step was taking on average compared with the time frames established for each step under the May 2009 revised process. We performed this analysis for the 55 assessments that were ongoing, as of September 30, 2011, and the 20 assessments that were completed after May 2009. Because none of the 20 IRIS assessments completed from May 2009 through September 2011 were initiated after the revised process was implemented, it was not possible to fully evaluate the extent to which EPA is adhering to the new 23-month time frame. Further, we combined our analysis of steps 4 and 5 because EPA data do not indicate when step 4 ends and when step 5 begins, and we combined steps 6 and 7 for the same reason. According to our analysis, on average, assessments of all types have taken longer than the established time frames for every step in the IRIS process (see table 4). Table 4: May 2009 IRIS Process Step Time Frames and Actual Completion Times for Each Step: IRIS process steps and established EPA time frames[B]: Step 1-IRIS draft assessment completed[C]; (345 days); Average completion time[A] (Number of assessments that completed the step): Standard assessments: 675 days; (1 assessment); Average completion time[A] (Number of assessments that completed the step): Moderately complex assessments: 352 days; (1 assessment); Average completion time[A] (Number of assessments that completed the step): Exceptionally complex assessments: 506 days; (1 assessment). IRIS process steps and established EPA time frames[B]: Step 2-Internal agency review; (60 days); Average completion time[A] (Number of assessments that completed the step): Standard assessments: 91 days; (4 assessments); Average completion time[A] (Number of assessments that completed the step): Moderately complex assessments: 141 days; (2 assessments); Average completion time[A] (Number of assessments that completed the step): Exceptionally complex assessments: 109 days; (4 assessments). IRIS process steps and established EPA time frames[B]: Step 3-EPA-led interagency science consultation; (45 days); Average completion time[A] (Number of assessments that completed the step): Standard assessments: 73 days; (8 assessments); Average completion time[A] (Number of assessments that completed the step): Moderately complex assessments: 69 days; (3 assessments); Average completion time[A] (Number of assessments that completed the step): Exceptionally complex assessments: 114 days; (8 assessments). IRIS process steps and established EPA time frames[B]: Step 4-External peer review, and Step 5-Draft assessment revised; (165 days combined[D]); Average completion time[A] (Number of assessments that completed the step): Standard assessments: 294 days; (6 assessments); Average completion time[A] (Number of assessments that completed the step): Moderately complex assessments: 442 days; (4 assessments); Average completion time[A] (Number of assessments that completed the step): Exceptionally complex assessments: 523 days; (2 assessments).
IRIS process steps and established EPA time frames[B]: Step 6a-- Internal agency review; Step 6b--EPA-led interagency science consultation and discussion, and; Step 7-Final IRIS assessment posted; (75 days combined[E]); Average completion time[A] (Number of assessments that completed the step): Standard assessments: 80 days; (11 assessments); Average completion time[A] (Number of assessments that completed the step): Moderately complex assessments: 92 days; (7 assessments); Average completion time[A] (Number of assessments that completed the step): Exceptionally complex assessments: 155 days; (2 assessments). Source: GAO analysis of EPA data. [A] We calculated average completion times, rounded to the nearest day, using EPA-provided data. In our calculations, we considered only assessments that began and completed a step under the May 2009 process. This included the 55 ongoing assessments and the 20 completed since May 2009. For example, some assessments completed multiple steps since May 2009, while others completed only one or two. [B] According to EPA officials, the time frames established for the steps in the IRIS assessment process apply to standard assessments. EPA officials told us they are trying to hold moderately complex assessments to the 23-month time frame, and the process time does not apply to exceptionally complex assessments. [C] As of September 30, 2011, 23 other assessments were still in step 1, and 21 of these had already exceeded the 345-day time frame for the step. These 21 assessments had been started prior to EPA's implementation of the May 2009 IRIS process. [D] We combined steps 4 and 5 because the EPA data do not indicate when step 4 ends and when step 5 begins. [E] We combined steps 6 and 7 because the EPA data do not indicate when step 6 ends and when step 7 begins. [End of table] Some other federal agencies that participate in interagency reviews expressed concern that in some cases time and resource constraints present challenges as they try to meet EPA's time frames for the two interagency review steps. In addition to the time limits established under the revised process, in an effort to increase productivity and complete more IRIS assessments, EPA officials said that, beginning in April 2011, the agency began to increase the number of draft assessments sent through the interagency review steps. However, officials from other federal agencies--including HHS and DOD--told us that they have advised EPA that the accelerated pace of interagency reviews in the second half of fiscal year 2011 strained their resources. In addition, the official from NASA told us that not only is the increased pace of reviews straining the agency's resources, but it has also affected the ability to provide in-depth independent technical reviews and interagency comments. EPA officials also told us that the interagency reviewer at NASA is so concerned with the pace of the interagency reviews under the revised process that NASA officials have asked OMB to form an interagency work group to discuss the reviews. EPA Faces Long-standing and New Challenges in Implementing the IRIS Program: EPA faces both long-standing and new challenges in implementing the IRIS Program. First, the National Academies has identified recurring issues with how the IRIS Program develops and presents its assessments and has suggested improvements. Second, EPA has not consistently provided reliable information on ongoing and planned IRIS assessments to IRIS users.
Third, unresolved discussions with OMB regarding EPA's responses to Data Quality Act challenges may impede EPA's ability to issue completed IRIS assessments. The National Academies Has Identified Recurring Issues with How the IRIS Program Develops and Presents Its Assessments: The National Academies and EPA's Science Advisory Board have identified several recurring issues with how EPA develops and presents IRIS assessments. For example, in April 2011, the National Academies in its independent scientific review of EPA's draft IRIS assessment of formaldehyde provided a critique of EPA's development and presentation of draft IRIS assessments. Overall, the National Academies noted some recurring methodological problems in the draft IRIS assessment of formaldehyde.[Footnote 19] In addition, in the report the National Academies also identified recurring issues concerning clarity and transparency with EPA's development and presentation of its draft IRIS assessments. The National Academies and Science Advisory Board have identified similar clarity and transparency issues in peer review reports over the past 5 years.[Footnote 20] Some of these reports stated that EPA should more clearly explain its reasons for including or excluding the scientific studies supporting draft IRIS assessments. In addition, some reports stated that EPA should more transparently present its justifications for its methodological approaches. Independent of its review of the formaldehyde assessment, the National Academies also provided a "roadmap for revision" that made suggestions for improvements to the IRIS draft development process, during which EPA selects and evaluates evidence (the literature search) and drafts an assessment (step 1). The National Academies' "roadmap for revision" suggested that EPA take the following steps, among others: * use clear, standardized methods to identify and select study evidence; * use a standardized approach to evaluate and describe study strengths and weaknesses and the weight of evidence, describe and justify the assumptions and models used, and adopt a standardized approach to characterizing uncertainty factors;[Footnote 21] and: * present methodology and findings more clearly and more concisely through better use of graphics and tables and use a template to facilitate a consistent description of the approach to study selection. The National Academies' report on the draft IRIS assessment of formaldehyde specifically noted that EPA should not delay the finalization of the assessment in order to implement any of the suggestions it made regarding the overall IRIS process. As of September 30, 2011, according to EPA officials, the agency is revising the assessment in response to the National Academies' suggestions, but the status page on EPA's website for formaldehyde lists "TBD"--to be determined--as the posting date for the final assessment. In July 2011, EPA announced that it planned to respond to the National Academies' suggestions by implementing changes to the way it develops draft IRIS assessments. 
In announcing the planned changes, EPA stated that it would take the following actions: * enhance its approach to identifying and selecting scientific study evidence; * provide more complete documentation of its approach to evaluating scientific study evidence and indicate which criteria were most influential in its evaluation of the weight of evidence; and: * concisely state the criteria used to include or exclude studies, continue to use existing IRIS guidelines to enhance the clarity and transparency of its data evaluation and presentation of findings and conclusions, eliminate the need for some report text using standardized tables, and portray toxicity values graphically. According to EPA officials, in implementing these changes, EPA will subject those assessments that are in earlier stages of development to more extensive changes than those in later stages of development. It will change the latter "as feasible" without repeating steps in the overall IRIS process. However, EPA has not provided a more detailed description of how the National Academies' suggestions will apply to each of the assessments in its current inventory of IRIS assessments. Without a more precise description of which drafts would be considered "in the earlier stages of development" or what "more extensive changes" would entail, it is too soon to provide a comprehensive assessment of EPA's approach. In addition, it is not transparent to stakeholders and other interested parties which assessments will be subject to these changes and which will not. EPA established the Board of Scientific Counselors (BOSC), an advisory committee composed of non- EPA technical experts from academia, industry, and environmental communities, to provide independent advice, information, and suggestions to the Office of Research and Development (ORD) research program--which houses the IRIS Program.[Footnote 22] Part of BOSC's mission is to evaluate and provide advice concerning the utilization of peer review within ORD to sustain and enhance the quality of science in EPA. It is unclear if BOSC will have a role in reviewing EPA's response to the National Academies' suggestions. We reviewed two IRIS assessments--one completed and one still in draft form--that reflect changes EPA has made in response to the National Academies' suggestions.[Footnote 23] First, for its assessment of urea, finalized in July 2011, EPA streamlined the report by moving sections of text from the body to an appendix, which shortened the body of the assessment from 89 to 57 pages, making it more concise. In addition, we reviewed the draft IRIS assessment of diisobutyl phthalate (DIBP), which EPA provided to us, that was undergoing agency review (step 2) and reflects some of the National Academies' suggestions regarding presentation. For example, it includes (1) descriptive and pictorial explanations of the study selection methods used; (2) tables that, among other things, give side-by-side comparisons of studies considered in determining the oral reference dose for the chemical; and (3) brief descriptions of the strengths and weaknesses of various studies considered. For these two assessments, it appears that EPA has begun to enhance the readability of its assessments by making changes that appear to be in line with the suggestions made by the National Academies. 
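To give a sense of what a standardized, side-by-side study comparison of the kind described above might capture, the minimal Python sketch below defines a simple record structure and prints a small comparison table. The fields, study names, and values are hypothetical and are not taken from the DIBP draft or any other IRIS assessment.

# Hypothetical sketch of a standardized study-comparison record of the kind the
# National Academies suggested; all entries below are invented for illustration.
from dataclasses import dataclass

@dataclass
class StudySummary:
    study: str                  # citation label
    species: str
    critical_effect: str
    point_of_departure: float   # mg/kg-day
    strengths: str
    weaknesses: str

candidates = [
    StudySummary("Study A (hypothetical)", "rat", "developmental effects", 12.0,
                 "chronic exposure; multiple dose groups", "small group sizes"),
    StudySummary("Study B (hypothetical)", "mouse", "liver effects", 30.0,
                 "large sample size", "subchronic duration only"),
]

# Print a simple side-by-side comparison of candidate studies for an oral RfD.
print(f"{'Study':<24}{'Species':<9}{'Critical effect':<24}{'POD (mg/kg-day)':>16}")
for s in candidates:
    print(f"{s.study:<24}{s.species:<9}{s.critical_effect:<24}{s.point_of_departure:>16}")

[End of example]

Standardizing the record structure in this way is one route to the consistent description of study selection and evaluation that the National Academies' roadmap calls for.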
EPA Has Not Consistently Provided Reliable Information on Planned and Ongoing IRIS Assessments to IRIS Users: EPA uses two primary mechanisms--the IRIS agenda and a website feature known as IRISTrack--to make information on the status of IRIS assessments available to EPA program and regional offices, other federal agencies, and the public.[Footnote 24] EPA has not effectively used these two mechanisms, or a third that we recommended in March 2008--that the agency provide a 2-year notice of its intent to assess specific chemicals--to consistently provide reliable information on IRIS assessments to stakeholders and other interested parties.[Footnote 25] First, EPA has not published an IRIS agenda in the Federal Register--identifying the chemicals that EPA plans to assess (both new and ongoing assessments)--since it announced its 2008 IRIS agenda in December 2007. EPA started developing an annual IRIS agenda and providing it to the public in a notice in the Federal Register in 1997.[Footnote 26] In late 2010, EPA began to solicit nominations for its fiscal year 2011 IRIS agenda from its program and regional offices, as well as from the public and federal agencies that participate in IRIS interagency reviews. However, as of September 30, 2011, EPA had not published its fiscal year 2011 agenda. In addition, some of the information provided in the Federal Register notices about the IRIS agenda has been incomplete. For example, an October 2010 Federal Register notice contained a list of chemicals currently on the IRIS agenda but did not distinguish between chemicals the agency was actively assessing and those it had designated for future assessment.[Footnote 27] We reported on similar issues in March 2008--noting that EPA had listed some suspended assessments as ongoing.[Footnote 28] Second, EPA has not kept information on the status of the individual ongoing assessments up to date in IRISTrack--an issue we also reported on in 2008.[Footnote 29] EPA's IRISTrack, a feature of its website, is intended to provide stakeholders and other interested parties with information on draft IRIS assessments--specifically, estimated start and end dates for steps in the IRIS process.[Footnote 30] For example, officials from the Office of Water indicated that their office relies heavily on IRISTrack for information about the status of IRIS assessments. In addition to not updating IRISTrack, EPA recently removed some key information it had presented there. Now, in some cases, the IRISTrack date for the beginning of draft development (step 1) understates the actual duration of an assessment--sometimes by many years. For example, IRISTrack indicates that draft development for the dioxin assessment began in the first quarter of fiscal year 2009; in fact, as we have reported, EPA has been assessing dioxin since 1991.[Footnote 31] IRISTrack also understates the duration of assessments of other chemicals of key concern, including formaldehyde, naphthalene, and TCE. Therefore, current and accurate information regarding when an assessment will be started, which assessments are currently ongoing, and when an assessment is projected to be completed is presently not publicly available. Third, EPA does not provide at least 2 years' notice of its intent to assess specific chemicals, as we recommended in our March 2008 report, to give agencies and other interested parties the opportunity to conduct research needed to fill any data gaps. 
[Footnote 32] In commenting on our report, EPA agreed to consider our recommendation, and EPA officials recently stated that they continue to agree with it, but as of September 30, 2011, the agency still had not taken steps to implement our recommendation. Unresolved Discussions between EPA and OMB Could Contribute to Delays: Discussions between EPA and OMB officials regarding Data Quality Act challenges related to specific draft IRIS assessments have been ongoing for over a year without resolution. If these unresolved discussions continue, they could contribute to delays of IRIS assessments. According to EPA officials, OMB would like to return to its role in the prior assessment process, in which it managed interagency reviews and made the final determination as to whether EPA has satisfactorily responded to comments from OMB and officials in other federal agencies. The Information Quality Act, commonly called the Data Quality Act, requires OMB to issue governmentwide guidelines to "ensure and maximize the quality, objectivity, utility, and integrity of information, including statistical information," disseminated to the public.[Footnote 33] In addition, it required agencies to issue their own guidelines, set up administrative mechanisms to allow affected parties to seek the correction of information they considered erroneous, and report periodically to OMB information about Data Quality Act challenges ("requests for correction" of agency information) and how the agencies addressed them.[Footnote 34] Under its data quality guidelines, when EPA provides opportunities for public participation by seeking comments on information, such as during a rulemaking, the agency uses the public comment process rather than EPA guidelines to address concerns about EPA's information. This is consistent with OMB's data quality guidelines, which encourage agencies to incorporate data quality procedures into their existing administrative practices rather than create new and potentially duplicative or contradictory processes. According to EPA's data quality guidelines, the public comment period serves the purposes of the guidelines, provides an opportunity for correction of information, and does not duplicate or interfere with the orderly progression of draft documents through an established process--in this case, the IRIS assessment process. That is, the external peer review and associated public comment period provide the public with the opportunity to raise questions regarding the quality of the information being used to support an IRIS assessment. According to EPA officials, federal agency responses to data quality challenges must be cleared by OMB before EPA sends responses to the parties filing challenges--although no law or guidance specifically provides for such reviews. In June and July 2010, EPA received Data Quality Act challenges regarding two draft IRIS assessments. According to EPA officials, in its draft responses to these data quality challenges, EPA declined to review the challenged data because, according to agency policy, draft IRIS documents are not subject to data quality challenges. EPA used the same approach in 2006 when responding to and declining a similar challenge regarding a draft IRIS assessment; at that time, OMB approved the EPA response. EPA sent its draft responses for the two more recent challenges to OMB for approval in September 2010 and January 2011. 
EPA's data quality guidelines set a goal of responding to Data Quality Act challenges within 90 days, but EPA officials said that they still await a decision by OMB. According to EPA officials, OMB is delaying a decision because it would like to return to its role in the prior assessment process, in which it managed interagency reviews and made the final determination as to whether EPA has satisfactorily responded to comments from OMB and officials in other federal agencies. EPA officials told us that as of September 30, 2011, the issues regarding data quality challenges had not delayed the progress of draft IRIS assessments.[Footnote 35] Meanwhile, OMB staff told us that they had sent comments to EPA on the draft responses and await EPA's reply to their comments. It appears to GAO that the discussions of these issues between EPA and OMB officials, which have been ongoing for over a year without resolution, have highlighted the agencies' differences regarding the revised IRIS process. If these differences persist, they could contribute to the compounding effects of delays in the IRIS process, discussed here and in our earlier work. For example, in August 2011, EPA received a third data quality challenge on an assessment that EPA had expected to be finalized at the end of fiscal year 2011.[Footnote 36] For reasons that remain unclear, EPA now projects that this assessment will not be finalized until fiscal year 2012. We note that the assessment had entered the interagency science discussion (step 6b) in July 2011. EPA asked interagency reviewers to submit written comments by August 26, 2011, but as of September 2011, OMB reviewers had not yet submitted comments. Conclusions: The IRIS process reforms EPA began implementing in May 2009 have restored EPA's control of the process and increased its transparency. Notably, EPA has addressed concerns we raised in our March 2008 report regarding the transparency of comments from both the interagency science consultation and discussion steps in the IRIS process. Making these comments publicly available is especially important because agencies providing input may have a vested interest in the outcome of the assessment should it lead to regulatory or other actions. As a result, stakeholders, including EPA regional and program offices, the public, and industry, can now see which other federal agencies comment and the nature of their comments, making IRIS assessments more transparent. In addition, EPA now manages the interagency science consultation and discussion steps and has streamlined the IRIS process. Progress in other areas, however, has been more limited. For example, even for its less challenging assessments, EPA took longer than its established time frames for accomplishing steps in the revised process--calling into question the feasibility and appropriateness of the established time frames for standard assessments in the IRIS assessment process. It is also unclear whether the established time frames apply to moderately complex assessments because EPA does not have a written policy that describes the applicability of the time frames, although EPA officials said they are trying to hold moderately complex assessments to the 23-month time frame. Similarly, EPA does not have written criteria for designating IRIS assessments as standard, moderately complex, or exceptionally complex. 
We note that EPA has not analyzed the actual time taken for each step to determine whether the established time frames for the overall 23-month process are realistic. Such an analysis would provide more accurate information for EPA to use in establishing time frames for these assessments. Not having established time frames for these assessments also creates uncertainty for many stakeholders with significant interest in IRIS assessments. EPA also faces both long-standing and new challenges in implementing the IRIS Program. Notably, the National Academies and Science Advisory Board have identified recurring issues with the clarity and transparency of draft IRIS assessments. Most recently, as part of its independent scientific review of EPA's draft IRIS assessment of formaldehyde, the National Academies provided a "roadmap for revision" with suggestions for improving EPA's development and presentation of draft IRIS assessments in general. The report identified recurring methodological issues with how the IRIS Program develops and presents its assessments and suggested improvements. EPA announced that it intends to address the issues raised in the National Academies' report but has not publicly indicated how these proposed changes would be applied to its current inventory of IRIS assessments. Many of the issues raised in the National Academies' report have been brought to the agency's attention previously. It is unclear whether any independent entity with scientific and technical credibility, such as EPA's Board of Scientific Counselors, will have a role in reviewing EPA's planned response to the National Academies' suggestions to ensure that EPA addresses these long-standing issues. In addition, EPA has not addressed other long-standing issues regarding the accuracy and availability of information on the status of IRIS assessments to IRIS users--including stakeholders such as EPA program and regional offices, other federal agencies, and the public. For example, since 2007, EPA has not published in the Federal Register an IRIS agenda that includes information on chemicals the agency is actively assessing or when it plans to start assessments of other listed chemicals. The agency also has not updated IRISTrack to display all current information on the status of assessments on the IRIS agenda, including estimated start dates and end dates of steps in the IRIS process. In addition, EPA has recently removed some key information presented in IRISTrack that showed the duration of IRIS assessments. Now, in some cases, the IRISTrack date for the beginning of draft development underestimates the actual duration of an assessment--sometimes by many years. Therefore, current and accurate information regarding when an assessment will be started, which assessments are currently ongoing, and when an assessment is projected to be completed is presently not publicly available. Finally, EPA does not provide at least 2 years' notice of its intent to assess specific chemicals, as we recommended in our March 2008 report; such notice would give agencies and other interested parties the opportunity to conduct research needed to fill any data gaps. 
Recommendations for Executive Action: To improve EPA's IRIS assessment process, we are making the following six recommendations: To better ensure the credibility of IRIS assessments by enhancing their timeliness and certainty, we recommend that the EPA Administrator require the Office of Research and Development to: * assess the feasibility and appropriateness of the established time frames for each step in the IRIS assessment process and determine whether different time frames should be established, based on complexity or other criteria, for different types of IRIS assessments, and: * should different time frames be necessary, establish a written policy that clearly describes the applicability of the time frames for each type of IRIS assessment and ensures that the time frames are realistic and provide greater predictability to stakeholders. To better ensure the credibility of IRIS assessments by enhancing their clarity and transparency, we recommend that the EPA Administrator require the Office of Research and Development to submit for independent review to an independent entity with scientific and technical credibility, such as EPA's Board of Scientific Counselors, a plan for how EPA will implement the National Academies' suggestions for improving IRIS assessments in the "roadmap for revision" presented in the National Academies' peer review report on the draft formaldehyde assessment. To ensure that current and accurate information on chemicals that EPA plans to assess through IRIS is available to IRIS users--including stakeholders such as EPA program and regional offices, other federal agencies, and the public--we recommend that the EPA Administrator direct the Office of Research and Development to: * annually publish the IRIS agenda in the Federal Register each fiscal year; * indicate in published IRIS agendas which chemicals EPA is actively assessing and when EPA plans to start assessments of the other listed chemicals; and: * update IRISTrack to display all current information on the status of assessments of chemicals on the IRIS agenda, including projected and actual start dates, and projected and actual dates for completion of steps in the IRIS process, and keep this information current. Agency Comments and Our Evaluation: We provided a draft of this report to the Administrator of EPA for review and comment. In written comments, which are included in appendix V, EPA agreed with the report's recommendations. EPA also provided technical comments, which we incorporated into the report as appropriate. 
Specifically, EPA agreed that it should (1) assess the feasibility and appropriateness of the established time frames for each step in the IRIS assessment process by using available program performance measures collected since the current IRIS process was established to determine whether different time frames should be established, based on complexity or other criteria, for different types of IRIS assessments, (2) determine if different time frames are necessary and, if so, establish a written policy that clearly describes the applicability of the time frames for each type of IRIS assessment and ensures that the time frames are realistic and provide greater predictability to stakeholders, (3) continue to implement the 2011 suggestions for improving IRIS assessments in the "roadmap for revision" presented in the National Academies' peer review report on the draft formaldehyde assessment and seek independent review through the Science Advisory Board to ensure that the agency is addressing the recommendations, (4) annually publish the IRIS agenda in the Federal Register each fiscal year, (5) indicate in published IRIS agendas which chemicals EPA is actively assessing and when EPA plans to start assessments of the other listed chemicals, and (6) update IRISTrack to display all current information on the status of assessments of chemicals on the IRIS agenda, including projected and actual start dates, and projected and actual dates for completion of steps in the IRIS process, and keep this information current. As agreed with your office, unless you publicly announce the contents of this report earlier, we plan no further distribution until 30 days from the report date. At that time, we will send copies to the Administrator of EPA, the appropriate congressional committees, and other interested parties. In addition, the report will be available at no charge on the GAO website at [hyperlink, http://www.gao.gov]. If you or your staff members have any questions about this report, please contact me at (202) 512-3841 or trimbled@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made major contributions to this report are listed in appendix VI. Sincerely yours, Signed by: David C. Trimble: Director, Natural Resources and Environment: [End of section] Appendix I: Scope and Methodology: This appendix details the methods we used to assess the Environmental Protection Agency's (EPA) management of its Integrated Risk Information System (IRIS). For this review, our objectives were to evaluate (1) EPA's progress in completing IRIS assessments under the May 2009 process and (2) the challenges, if any, that EPA faces in implementing the IRIS Program. To address these objectives, we reviewed relevant EPA documents, including documents outlining the April 2008 and the May 2009 versions of the IRIS assessment process; documents related to IRIS performance metrics; chemical nomination forms submitted by EPA regional and program offices, federal agencies, and others; and documents and other information on the public EPA website, including the IRIS database and IRISTrack, the assessment tracking system available at the IRIS website. In addition, we reviewed other relevant documents, including Federal Register notices announcing, among other things, IRIS agendas, as well as documents related to EPA's meetings with other federal agencies involved in interagency reviews of draft IRIS assessments. 
We did not evaluate the scientific content or quality of IRIS assessments; however, we did review the National Academies' peer review report on the draft IRIS assessment of formaldehyde to evaluate its suggestions for overall improvements to the development of IRIS assessments, as well as other peer review reports by the National Academies and EPA's Science Advisory Board to evaluate their suggestions for improvements to draft IRIS assessments. In addition, we interviewed officials from EPA's National Center for Environmental Assessment (NCEA) who manage the IRIS Program, including the Acting Center Director, the Associate Director for Health, and the IRIS Program Acting Director, to obtain their perspectives on, among other things, the May 2009 IRIS process and the effects of changes from the April 2008 IRIS process, the extent to which EPA has made progress in completing timely, credible chemical assessments, challenges EPA faces in completing assessments, and EPA's process for responding to Data Quality Act challenges. We interviewed officials from EPA's Office of Environmental Information to obtain their perspectives on EPA's process for responding to data quality challenges. We also attended two Board of Scientific Counselors (BOSC) meetings to understand the board's role in providing advice, information, and recommendations about the Office of Research and Development (ORD) research programs, including IRIS. For the first objective, we obtained and analyzed data from fiscal year 1999 through September 30, 2011, including data, spreadsheets, project plans, and other documents used in IRIS assessment planning, development, and completion. From the data we gathered, we analyzed information on IRIS productivity, including information on the number of IRIS assessments completed and initiated, the status of IRIS assessments that are currently in progress or on the IRIS agenda, and the completion dates and durations of IRIS assessment process steps completed or currently in progress for given chemical assessments. In addition, we assessed the reliability of the data we received from EPA for our first objective. Our assessment consisted of interviews and e-mail exchanges with EPA officials about the data system, the method of data input, and internal data controls and documentation, among other areas. We also corroborated the data with other sources, where possible. For example, we verified the information provided in tables of IRIS assessment start dates and completion dates of IRIS assessment process steps through interviews and e-mail exchanges with the NCEA officials responsible for maintaining these data. Through our assessment, we determined that the data were sufficiently reliable for our purposes. For the second objective, we interviewed the chair of the National Academies Committee to Review EPA's Draft IRIS Assessment of Formaldehyde to obtain his perspective on the National Academies' suggestions for improvements to the IRIS assessment process. We interviewed officials from the Office of Management and Budget's (OMB) Office of Information and Regulatory Affairs (OIRA) to obtain their perspectives on interagency review of draft IRIS assessments, OMB's process for responding to EPA with regard to Data Quality Act challenges, and OMB's process for reviewing and approving EPA guidance documents. 
In addition, we interviewed officials from the Department of Defense (DOD), the National Aeronautics and Space Administration (NASA), and the Department of Health and Human Services (HHS)--including representatives from the Centers for Disease Control and Prevention's National Center for Environmental Health (NCEH)/Agency for Toxic Substances and Disease Registry (ATSDR) and the National Institute for Occupational Safety and Health (NIOSH). We also interviewed HHS officials from the Food and Drug Administration (FDA), the National Institute of Environmental Health Sciences/National Toxicology Program, and the Office of the Secretary. We also interviewed representatives from a chemical industry group and a nonprofit research and educational organization. We conducted this performance audit from July 2010 to December 2011 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. [End of section] Appendix II: Status of Chemicals in the IRIS Assessment Development Process, as of September 30, 2011: Step 1-Draft development (24 assessments in step): Acetaldehyde; Arsenic, inorganic (noncancer); Beryllium (cancer); Butanol, t-; Cadmium; Chloroethane; Chromium VI (inhalation); Cobalt; Copper; Di(2-ethylhexyl) adipate (DEHA); Ethyl tertiary butyl ether (ETBE); Hexabromocyclododecane; Hexachlorobutadiene; Hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX); Methyl tert-butyl ether (MTBE); Naphthalene; Nickel (soluble salts); Phthalates (cumulative); Phthalates: -Dibutyl (DBP); -Di(2-ethylhexyl) (DEHP); -Diethyl (DEP); Styrene; Uranium; Vinyl acetate. Step 2-EPA internal review (8 assessments in step): Ammonia; Chloroform; Phthalates: -Butyl benzyl (BBP); -Diisobutyl (DIBP); -Diisononyl (DINP); -Dipentyl (DPP); Trimethylbenzene, 1,2,4-; Trimethylbenzene, 1,3,5-. Step 3-Interagency science consultation (2 assessments in step): Benzo[a]pyrene; Polychlorinated biphenyls (PCBs) (noncancer). Step 4-External peer review and public comment (9 assessments in step): Acrylonitrile; Biphenyl; Butanol, n-; Chromium VI (oral); Dioxane, 1,4-(inhalation); Libby amphibole asbestos (cancer); Methanol (cancer)[A]; Methanol (noncancer); Vanadium pentoxide. Step 5-EPA draft revision (7 assessments in step): Arsenic, inorganic (cancer); Dichlorobenzene, 1,2-; Dichlorobenzene, 1,3-; Dichlorobenzene, 1,4-; Formaldehyde; Polycyclic aromatic hydrocarbon (PAH) mixtures; Tetrachlorodibenzo-p-dioxin, 2,3,7,8-(dioxin). Step 6a and b-Final EPA review/Interagency science discussion (1 assessment in step): Platinum. Step 7-Completion and posting (4 assessments in step): Dichloromethane; Ethylene oxide (cancer); Tetrachloroethylene (perchloroethylene or perc); Tetrahydrofuran (THF). Source: GAO analysis of EPA data. [A] The draft cancer assessment of methanol is on hold pending the results of a National Toxicology Program (NTP) review of data from the Italian Ramazzini Institute, on which, according to EPA, the assessment "relied substantially." In 2010, an NTP report on a Ramazzini Institute animal study of methanol revealed differences of opinion between NTP scientists and the institute regarding diagnoses of certain cancers. 
[End of table] [End of section] Appendix III: Information on Chemicals of Key Concern: * Trichloroethylene (TCE). In September 2011, EPA finalized and posted an IRIS assessment of TCE, 13 years after initiating it. A degreasing agent used in industrial and manufacturing settings, TCE is a common environmental contaminant in air, soil, groundwater, and surface water. TCE has been found in the drinking water at Camp Lejeune, a large Marine Corps base in North Carolina. It has also been found at Superfund sites and many industrial and government facilities, including aircraft and spacecraft manufacturing operations. TCE has been linked to cancer, including childhood cancer, and other significant health hazards, such as birth defects. In 1995, the International Agency for Research on Cancer, part of the World Health Organization, classified the chemical as "probably carcinogenic to humans," and in 2000, the Department of Health and Human Services' National Toxicology Program concluded that TCE is "reasonably anticipated to be a human carcinogen." However, between 1989 and September 2011, the IRIS database contained no quantitative or qualitative data on TCE. Because of questions raised by peer reviewers about the IRIS cancer assessment for TCE, EPA withdrew it from the IRIS database in 1989 and did not initiate a new TCE cancer assessment until 1998. In 2001, EPA issued a draft IRIS assessment of TCE that characterized TCE as "highly likely to produce cancer in humans." The draft assessment was peer reviewed by a Science Advisory Board panel and released for public comment. In the course of these reviews, issues arose concerning, among other things, EPA's use of emerging risk assessment methods and the uncertainty associated with these new methods.[Footnote 37] To help address these issues, EPA and other agencies sponsored a National Academies peer review panel to provide guidance. The National Academies panel recommended in its 2006 report that EPA finalize the draft assessment using available data, noting that the weight of evidence of cancer and other health risks from TCE exposure had strengthened since 2001. Nonetheless, the TCE assessment was returned to draft development. It then underwent a third peer review, again through the Science Advisory Board, which issued its report in January 2011. EPA revised the draft in response to the Science Advisory Board's comments and, in September 2011, finalized and posted the TCE assessment. * Dioxin. The term "dioxin" applies to a family of chemicals that are often the byproducts of combustion and other industrial processes. Complex mixtures of dioxins enter the food chain and human diet through emissions into the air that settle on soil, plants, and water. When animals ingest plants, commercial feed, and water contaminated with dioxins, the dioxins accumulate in the animals' fatty tissue. EPA's initial assessment of dioxin, which was published in 1985, focused on the dioxin TCDD (2,3,7,8-tetrachlorodibenzo-p-dioxin), which animal studies dating to the 1970s had shown to be the most potent cancer-causing chemical studied to date. EPA began work on updating this assessment in 1991. From 1995 through 2000, the revised draft assessment underwent a full peer review, as well as three peer reviews of key segments of the draft. 
As we have reported previously, EPA officials said in 2002 that the version of the revised assessment then in progress would conclude that dioxin may adversely affect human health at lower exposure levels than had previously been thought, and that most exposure to dioxins occurs from eating such American dietary staples as meat, fish, and dairy products.[Footnote 38] EPA was moving closer to finalizing the assessment when, in 2003, a congressional appropriations committee directed the agency not to issue the assessment until it had been peer reviewed by the National Academies. The National Academies issued its peer review report in July 2006. EPA then revised the draft assessment in response to the National Academies' recommendations, releasing it for public comment in May 2010 and sending it to the Science Advisory Board for another peer review. In August 2011, the Science Advisory Board panel issued its peer review report. Having been tasked with evaluating EPA's responses to the National Academies' review and its incorporation of studies that have become available since 2006, the panel concluded that the draft IRIS assessment of dioxin was "generally clear, logical, and responsive to many but not all of the suggestions of the NAS."[Footnote 39] Among other things, the Science Advisory Board panel recommended that EPA discuss both linear and nonlinear models for cancer risks associated with dioxin exposure in its revised report. Three days after the Science Advisory Board issued its report, EPA announced that it would split the dioxin assessment into two parts, completing the noncancer portion of the assessment first and then addressing the Science Advisory Board's comments and completing the cancer portion of the assessment. EPA expects to complete the noncancer portion of the dioxin assessment by January 2012, and states that it will complete the cancer portion as expeditiously as possible thereafter. The effort to update the assessment of dioxin, which could have significant health implications for all Americans, has been ongoing for 20 years. * Formaldehyde. Formaldehyde is a gas widely used in such products as pressed wood, paper, pharmaceuticals, leather goods, and textiles. The IRIS database currently lists formaldehyde as a "probable human carcinogen"; however, the International Agency for Research on Cancer classifies it as "carcinogenic to humans." In June 2011, the Department of Health and Human Services' National Toxicology Program classified formaldehyde as "known to be a human carcinogen" in its Report on Carcinogens.[Footnote 40] The report stated that epidemiological studies "have demonstrated a causal relationship between exposure to formaldehyde and cancer in humans"--specifically, nasopharyngeal cancer, sinonasal cancer, and myeloid leukemia. The current IRIS assessment of formaldehyde dates to 1989, when the cancer portion of the assessment was issued, and 1990, when the noncancer portion was added. The last significant revision of the formaldehyde assessment dates to 1991. As we have reported previously, EPA began efforts to update the IRIS assessment of formaldehyde in 1997.[Footnote 41] In 2004, EPA received a congressional directive to await the results of a National Cancer Institute study--expected to take, at most, 18 months--before finalizing the draft assessment. That study was completed in May 2009, and in June 2010, EPA released the draft assessment, which assessed both cancer and noncancer health effects, to the National Academies for peer review. 
In May 2011, the National Academies published its peer review report. As of September 30, 2011--14 years after EPA began work to update the IRIS formaldehyde assessment--the agency had indicated no timetable for finalizing the assessment. Continued delays in the revision of the IRIS assessment of formaldehyde have the potential to affect the quality of EPA's regulatory actions. For example, in August 2011, EPA announced a proposed rule under the Clean Air Act related to certain emissions from natural gas processing plants. Because a newer IRIS assessment of formaldehyde has not been completed, the proposed rule relies on the existing IRIS value for formaldehyde, last updated in 1991. EPA had expected to complete the formaldehyde assessment by the end of fiscal year 2011, but withdrew the projected completion date from the IRIS website after the publication of the National Academies' peer review report on the draft assessment. As of April 2011, EPA expected to complete the formaldehyde assessment in the fourth quarter of fiscal year 2011. However, as of September 30, 2011, the IRIS website provided no projected completion date for the assessment. * Tetrachloroethylene (Perc). Tetrachloroethylene--also called perchloroethylene or perc--is a manufactured chemical used in, for example, dry cleaning, metal degreasing, and textile production. Perc is a widespread groundwater contaminant and the National Toxicology Program has determined that it is "reasonably anticipated to be a human carcinogen." Currently, the IRIS database contains only a noncancer assessment based on oral exposure to perc, posted in 1988; it gives no information on potential cancer effects or potential noncancer effects associated with inhalation of perc. EPA began work to update this assessment, and to include information on cancer and noncancer inhalation risk, in 1998. As we have reported previously, EPA completed its internal review of the draft perc assessment in February 2005 and the interagency review in March 2006.[Footnote 42] However, when the Assistant Administrator of EPA's Office of Research and Development requested that additional analyses be conducted, EPA was delayed in sending the draft assessment to the National Academies for peer review. In June 2008, EPA sent the draft assessment to the National Academies, which released its peer review report in February 2010. EPA is in the process of responding to the National Academies' suggestions, 13 years after the agency began work on the draft perc assessment. As a result, IRIS users, including EPA regional and program offices, continue to lack both cancer values and noncancer inhalation values to help them make decisions about how to protect the public from this widespread groundwater contaminant. EPA had expected to complete the perc assessment by the end of fiscal year 2011, but as of September 30, 2011, it had not done so. * Naphthalene. Naphthalene is used in jet fuel and in the production of such widely used commercial products as moth balls, dyes, insecticides, and plasticizers. The current IRIS assessment of naphthalene, issued in 1998, lists the chemical as a "possible human carcinogen"; since 2004, the National Toxicology Program has listed it as "reasonably anticipated to be a human carcinogen." 
As we have reported previously, EPA began updating the cancer portion of its naphthalene assessment in 2002.[Footnote 43] By 2004, EPA had drafted a chemical assessment that had completed internal peer reviews and was about to be sent to an external peer review committee. Once it returned from external review, the next step, at that time, would have been a formal review by EPA's IRIS Agency Review Committee. If approved, the assessment would have been completed and released. However, in part because of concerns raised by DOD, OMB asked to review the assessment and conducted an interagency review of the draft. In their 2004 reviews of the draft IRIS assessment, both OMB and DOD raised a number of concerns about the assessment and suggested to EPA that it be suspended until additional research could be completed to address what they considered to be significant uncertainties associated with the assessment. Although not all of the issues raised by OMB and DOD were resolved, EPA continued with its assessment by submitting the draft for external peer review, which was completed in September 2004.[Footnote 44] However, according to EPA, OMB continued to object to the draft IRIS assessment and directed EPA to convene an additional expert review panel on genotoxicity to obtain recommendations about short-term tests that OMB thought could be done quickly.[Footnote 45] According to EPA, this added 6 months to the process, and the panel, which met in April 2005, concluded that the research that OMB was proposing could not be conducted in the short term. Nonetheless, EPA officials said that the second expert panel review did not eliminate OMB's concerns regarding the assessment, a situation they described as reaching a stalemate. In September 2006, EPA decided, however, to proceed with developing the assessment. By 2006, the naphthalene assessment had been in progress for 4 years, and EPA decided that the noncancer portion of the existing IRIS assessment was outdated and needed to be revisited. Having made this decision, the agency returned both portions of the assessment, cancer and noncancer, to the drafting stage. We reported in March 2008 that EPA estimated a 2009 completion date for the naphthalene assessment.[Footnote 46] As of September 30, 2011, however, the assessment remained in the draft development stage, even though EPA program offices had identified the naphthalene assessment as a high-priority need for the air toxics and Superfund programs. As of September 30, 2011, EPA expects to complete the naphthalene assessment in the third quarter of fiscal year 2013. * Royal Demolition Explosive. This chemical, also called RDX or hexahydro-1,3,5-trinitro-1,3,5-triazine, is a highly powerful explosive used by the U.S. military in thousands of munitions. Currently classified by EPA as a "possible human carcinogen," this chemical is known to leach from soil to groundwater. RDX can cause seizures in humans and animals when large amounts are inhaled or ingested, but the effects of long-term, low-level exposure on the nervous system are unknown. 
As we reported in March 2008, as is the case with naphthalene, the IRIS assessment of RDX could require DOD to undertake a number of actions, including steps to protect its employees from the effects of this chemical and to clean up many contaminated sites.[Footnote 47] We reported at that time that EPA had started an IRIS assessment of RDX in 2000, but it had made minimal progress on the assessment because EPA had agreed to a request by DOD to wait for the results of DOD-sponsored research on this chemical. In 2007, EPA resumed work on the assessment, although some of the DOD- sponsored research was still outstanding at the time. EPA decided to suspend work on the assessment in 2009 in order to focus on assessments that were further along in the IRIS process. According to EPA's project plan for RDX, in March 2010, EPA received a letter from DOD requesting that EPA complete the assessment. In addition, in 2010, EPA's Superfund Program labeled the RDX assessment as a high priority because of the presence of the chemical at federal facilities. In June 2010, EPA renewed work on the RDX assessment, but as of September 30, 2011, it remained in the draft development stage (step 1). An EPA official told us in October 2011 that EPA plans to contact DOD officials to confirm that the draft assessment of RDX adequately captures the findings of the DOD-sponsored research. In addition, the EPA official said that the agency plans to contact officials at HHS's Agency for Toxic Substances and Disease Registry to ensure that the two agencies have coordinated research efforts on this chemical. EPA projects that it will finalize the assessment of RDX in the first quarter of fiscal year 2013. [End of section] Appendix IV: Projected Completion Dates for IRIS Assessments Currently in the Assessment Development Process, as of September 30, 2011: Fiscal year 2011 (4 assessments completed, 20 initially projected): Initial fiscal year 2011 projection: 20 completed assessments: Benzo[a]pyrene; Formaldehyde; Methanol (cancer); Polycyclic aromatic hydrocarbon (PAH) mixtures; Tetrachlorodibenzo-p-dioxin, 2,3,7,8-(dioxin). April 2011 projection: 15 completed assessments: Arsenic, inorganic (cancer); Chromium VI (oral); Dichlorobenzene, 1,2-; Dichlorobenzene, 1,3-; Dichlorobenzene, 1,4-; Mirex (dropped from IRIS agenda in August 2011). August 18, 2011 projection: 9 completed assessments: Dichloromethane; Ethylene oxide (cancer); Platinum; Tetrachloroethylene (perchloroethylene or perc); Tetrahydrofuran (THF). September 30, 2011: 4 actual completed assessments: Hexachloroethane; Trichloroacetic acid; Trichloroethylene (TCE); Urea. 
Fiscal year 2012 (40 assessments): Acetaldehyde; Acrylonitrile; Ammonia; Arsenic, inorganic (cancer); Benzo[a]pyrene; Beryllium (cancer); Biphenyl; Butanol, n-; Butanol, t-; Chloroform; Chromium VI (oral); Di(2-ethylhexyl) adipate (DEHA); Dichlorobenzene, 1,2-; Dichlorobenzene, 1,3-; Dichlorobenzene, 1,4-; Dichloromethane; Dioxane, 1,4-(inhalation); Ethylene oxide (cancer); Hexabromocyclododecane; Hexachlorobutadiene; Hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX); Methanol (noncancer); Methyl tert-butyl ether (MTBE); Phthalates (cumulative); Phthalates: -Dibutyl (DBP); -Diethyl (DEP); -Diisobutyl (DIBP); -Diisononyl (DINP); -Dipentyl (DPP); -Butyl benzyl (BBP); -Di(2-ethylhexyl) (DEHP); Platinum; Polycyclic aromatic hydrocarbon (PAH) mixtures; Tetrachlorodibenzo-p-dioxin, 2,3,7,8-(dioxin); Tetrachloroethylene (perchloroethylene or perc); Tetrahydrofuran (THF); Trimethylbenzene, 1,2,4-; Trimethylbenzene, 1,3,5-; Uranium; Vanadium pentoxide. Fiscal year 2013 (12 assessments): Arsenic, inorganic (noncancer); Cadmium; Chloroethane; Chromium VI (inhalation); Cobalt; Copper; Ethyl tertiary butyl ether (ETBE); Libby amphibole asbestos (cancer); Naphthalene; Nickel (soluble salts); Polychlorinated biphenyls (PCBs) (noncancer); Vinyl acetate. Fiscal year 2014 (1 assessment): Styrene. TBD--To be determined (2 assessments): Methanol (cancer); Formaldehyde. Source: GAO analysis of EPA data. [End of table] Appendix V: Comments from the Environmental Protection Agency: United States Environmental Protection Agency: Office Of Research And Development: Washington, D.C. 20460: November 22, 2011: Mr. David C. Trimble: Director, Natural Resources and the Environment: U.S. Government Accountability Office: Washington, DC 20548: Dear Mr. Trimble: Thank you for the opportunity to review and comment on GAO's draft report "Chemical Assessments: Challenges Remain with EPA's Integrated Risk Information System (IRIS) program." GAO has provided a thorough analysis of both EPA's progress in completing IRIS assessments under the May 2009 process, and the continuing challenges EPA faces in implementing the process. The report makes a number of observations on how EPA's IRIS program has improved over the last two years, and identifies several areas of concern. GAO's recommendations fall into three categories: A. Enhance the timeliness and certainty of timelines in the development of IRIS assessments; B. Improve the clarity and transparency of IRIS assessments per the recommendations of the National Academies of Science; C. Provide improved information to stakeholders and users on planned assessments and assessments that are under development. The Agency is committed to making improvements in each of these areas. Please see the attached table for EPA's response to each specific recommendation. Following implementation of the current IRIS process in May 2009, the productivity of the IRIS program increased; six assessments were completed in 2009, and ten were completed in 2010. The IRIS program continues to improve in completing complex and major assessments, including the recent postings of the trichloroethylene and dichloromethane (methylene chloride) assessments in 2011. The IRIS program is targeting the completion of the assessment for tetrachloroethylene (perc) in December 2011 and the noncancer portion of the dioxin assessment in January 2012. Completion of these major assessments will continue the trend of increased productivity of the IRIS Program and reduce the backlog of assessments. 
The Agency continues to emphasize improving the quality and transparency of IRIS assessments, while increasing the productivity of its IRIS program. Immediately following the release of the National Academies' Review of the EPA's Draft IRIS Assessment of Formaldehyde in April 2011, EPA began to implement the Academies' recommendations to improve the development of IRIS assessments. The Agency outlined these improvements in a July 2011 press release,[Footnote 1] factsheet[Footnote 2] and IRIS Progress Report.[Footnote 3] As a result, EPA has made IRIS assessments clearer, more concise, and more transparent. The documents include more graphical and tabular representations of data, and the assessments have increased clarity and transparency of data, methods, and decision criteria. The Agency will continue to improve discussion of the scientific rationale supporting the assessments, and toxicity values will be more transparent as IRIS assessments evaluate and describe the strengths and weaknesses of critical studies in a more uniform way. Discussion of which criteria are most influential will also be improved. In 2012, EPA will hold a public workshop on weight of the evidence analysis in which the public and stakeholders will have the opportunity to participate. A consistent theme in the GAO report is the critical need for transparency in IRIS assessments and the IRIS process. GAO notes substantial improvements that have been made to increase transparency, particularly in regard to the interagency review steps in the May 2009 process. In response to the 2008 GAO recommendations for improving the IRIS program, all written comments from federal agencies on draft assessments are made available to the public on the IRIS website. In addition, as of July 2011, EPA's responses to major interagency comments are also publicly available. The May 2009 IRIS process[Footnote 4] was developed in response to the recommendations GAO made in its 2008 report. This process is based on the commitment to produce timely and high quality assessments in a transparent manner with robust peer review and multiple opportunities for public involvement. Above all, the May 2009 process is a result of EPA's commitment to using the best available science for developing IRIS assessments. The timeframes in the IRIS process are intended to reflect the time needed to complete each step in the development and review of the majority of assessments. Most assessments are expected to be completed within 23 months. In all instances, the amount of time taken at each step will ensure the assessments are based on the best available science. Assessments using exceptionally complex science are not expected to be completed in the 23 month period. As GAO notes, these assessments generally undergo lengthier peer reviews, and development of the draft assessment and other steps in the process may take longer than the timeframes specified in the process. The Agency's goal is to assure that all IRIS assessments are of the highest quality and are completed in the allotted timeframes to inform decision-making. 
To further ensure the quality of IRIS assessments, EPA's Science Advisory Board recently announced that a new Chemical Assessment Advisory Committee is being formed to provide advice to EPA on both draft IRIS assessments and the IRIS program overall.[Footnote 5] This standing panel will provide a new mechanism for peer review with greater consistency in membership, which will better enable the panel to observe trends over time in IRIS assessments. The panel will also help to ensure that the IRIS program implements the National Academies' recommendations and provides sound scientific health assessments. The Agency agrees with GAO's statement that interagency coordination can enhance the quality of IRIS assessments. Under EPA's management, participation of agencies with public health missions in the interagency review process has increased. This is important, because scientists who are familiar with human health research and risk assessment methodology provide the most useful scientific feedback to improve IRIS assessments. The Agency acknowledges, however, that further improvement in interagency review is needed. In this vein, EPA is seeking to work with other federal agencies to ensure that future interagency reviewers have the knowledge necessary to provide constructive scientific comments on draft IRIS assessments. In reconstituting the interagency workgroup, EPA will also communicate time commitments for reviewers to ensure that the interagency review steps of the IRIS process are not an impediment to completing assessments in a timely manner. Again, thank you for the opportunity to review and comment on the draft GAO report. The Agency recognizes the importance of the IRIS program to EPA's mission and the work of many public health agencies and organizations. The Agency is strongly committed to strengthening the IRIS program and values the input and recommendations that GAO has provided. Sincerely, Signed by: Paul T. Anastas: Assistant Administrator: Attachment: [End of letter] Attachment: GAO Recommendation: Assess the feasibility and appropriateness of the established timelines for each step in the IRIS process and determine whether different time frames should be established. Should different time frames be necessary, establish a written policy that clearly describes the applicability of the time frames for each type of IRIS assessment and ensures that the time frames are realistic and provide greater reliability to stakeholders. EPA agreement? Yes. Comments and implementation steps: In instituting the May 2009 IRIS process, EPA made a commitment to develop high quality assessments in a timely manner while providing opportunities for input from the public, peer reviewers, and other federal agencies. The 23 month assessment development time frame was intended to apply to the majority of assessments, with exceptionally complex assessments requiring lengthier time frames. EPA will use the available program performance measures collected since the current IRIS process was established to evaluate the appropriateness of the current timeframes described in the IRIS process. GAO Recommendation: Submit a plan to independent review on how EPA will implement the National Academies' suggestions for improving IRIS assessments in the "roadmap for revision" presented in the National Academies' peer review report on the draft formaldehyde assessment. EPA agreement? Yes. Comments and implementation steps: The Agency is committed to implementing the 2011 National Academies' recommendations. 
As noted in the current IRIS GAO report, EPA began implementing several important aspects of the recommendations soon after the release of the formaldehyde report. The Agency will fully implement the recommendations in a phased approach and will seek independent review through the Science Advisory Board's newly announced Chemical Assessment Advisory Committee to ensure that the Agency is addressing the recommendations offered by the National Academies. GAO Recommendation: Publish the IRIS agenda in the Federal Register each fiscal year. Indicate in published IRIS agendas which chemicals EPA is actively assessing and when EPA plans to start assessments of the other listed chemicals. EPA agreement? Yes. Comments and implementation steps: The Agency will publish an annual Federal Register Notice identifying the substances that EPA is actively assessing along with projections for when EPA plans to start assessments for other listed chemicals. In 2010, EPA published a Federal Register Notice announcing the list of chemicals for which assessments were underway or expected to be underway in coming years. Future Federal Register notices will contain the additional information recommended by GAO. GAO Recommendation: Update IRIS Track to display all current information on the status of assessments of chemicals on the IRIS agenda, including actual start dates, and projected and actual dates for completion of steps in the IRIS process, and keep this information current. EPA agreement? Yes. Comments and implementation steps: The Agency will improve the timeliness and accuracy of information presented on the IRIS Track website. By November 30, 2011, EPA will conduct a quality review of the schedules presented in IRIS Track and make revisions as necessary. In addition, IRIS Track will be updated monthly thereafter to ensure that the information presented is current and as accurate as possible. The IRIS Program will also continue to maintain regular communication with EPA programs and regional offices to ensure they are apprised of all ongoing and upcoming activities. Appendix V Footnotes: [1] [hyperlink, http://yosemite.epa.gov/opa/admpress.nsf/dOcf6618525a9efb85257359003fb69d/a3fcd60838197067852578cb00666c4d!OpenDocument] [2] [hyperlink, http://www.epa.gov/IRIS/pdfs/irisprocessfactsheet2011.pdf] [3] [hyperlink, http://www.epa.gov/IRIS/pdfs/irisprogressreport2011.pdf] [4] [hyperlink, http://www.epa.gov/IRIS/process.htm] [5] [hyperlink, http://www.gpo.gov/fdsys/pkg/FR-2011-11-18/pdf/2011-29916.pdf] [End of section] Appendix VI: GAO Contact and Staff Acknowledgments: GAO Contact: David C. Trimble, (202) 512-3841 or trimbled@gao.gov: Staff Acknowledgments: In addition to the individual named above, Diane LoFaro, Assistant Director; Christine Fishkin, Assistant Director; Summer Lingard; Mark Braza; Jennifer Cheung; Nancy Crothers; Lorraine Ettaro; Robert Grace; Gary Guggolz; Richard P. Johnson; Michael Kniss; Nadia Rhazi; and Kiki Theodoropoulos made key contributions to this report. Also contributing to the report were Tim Bober, Michelle Cooper, Anthony Pordes, Benjamin Shouse, Jena Sinkfield, and Nicolas Sloss. [End of section] Footnotes: [1] As we have previously reported, the Toxic Substances Control Act (TSCA) requires EPA to demonstrate that certain health or environmental risks are likely before it can require companies to test the approximately 700 new chemicals introduced into commerce annually or take action to control unreasonable risks by placing restrictions on the tens of thousands of chemicals already in the agency's TSCA inventory. 
[2] GAO, Chemical Assessments: Low Productivity and New Interagency Review Process Limit the Usefulness and Credibility of EPA's Integrated Risk Information System, [hyperlink, http://www.gao.gov/products/GAO-08-440] (Washington, D.C.: Mar. 7, 2008). [3] Formaldehyde is a gas widely used in such products as pressed wood, paper, pharmaceuticals, leather goods, and textiles. In addition, the term "dioxin" applies to a family of chemicals that are often the byproducts of combustion and other industrial processes. Complex mixtures of dioxins enter the food chain and human diet through emissions into the air that settle on soil, plants, and water. For more information on formaldehyde and dioxin, see appendix III. [4] GAO, Toxic Chemicals: EPA's New Assessment Process Will Increase Challenges EPA Faces in Evaluating and Regulating Chemicals. [hyperlink, http://www.gao.gov/products/GAO-08-743T] (Washington, D.C.: Apr. 29, 2008). [5] GAO, High-Risk Series: An Update, [hyperlink, http://www.gao.gov/products/GAO-09-271] (Washington, D.C.: Jan. 22, 2009). This high-risk area addresses EPA's implementation of the IRIS program as well as implementation of the Toxic Substances Control Act (TSCA). [6] GAO, High-Risk Series: An Update, [hyperlink, http://www.gao.gov/products/GAO-11-278] (Washington, D.C.: Feb. 16, 2011). We also concluded that the EPA Administrator needed to continue to support the agency's TSCA initiatives. [7] EPA uses the model presented by the National Academies in Science and Decisions: Advancing Risk Assessment (Washington, D.C.: The National Academies Press, 2009). This publication is also known as the Silver Book. [8] OMB, Fiscal Year 2006 Program Assessment Rating Tool (PART) assessment of EPA's Human Health Risk Assessment Program. [9] The Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA) requires federal agencies to respond to certain releases or threatened releases of hazardous substances on lands they administer. CERCLA also created a trust fund--the Superfund--to provide for certain cleanup activities at nonfederal sites. Under CERCLA, EPA established the Superfund program to address the threats that contaminated sites pose. [10] EPA refers to a standard assessment as a tier 1 assessment, a moderately complex assessment as tier 2, and an exceptionally complex assessment as tier 3. [11] Congress established EPA's Science Advisory Board in 1978 and gave it a broad mandate to advise the agency on technical matters. The Science Advisory Board's preliminary work is done by subcommittees or panels focused on various environmental science topics. These groups are chaired by Science Advisory Board members, and their recommendations are transmitted to the Science Advisory Board for discussion and deliberation. Recommendations are forwarded to EPA only if the Science Advisory Board determines that it is appropriate. 
[12] Since May 2009, 1 assessment has been externally peer reviewed by the National Academies (formaldehyde), 3 have been externally peer reviewed by the Science Advisory Board (trichloroethylene [TCE], polycyclic aromatic hydrocarbon [PAH] mixtures, and tetrachlorodibenzo-p-dioxin, 2,3,7,8- [dioxin]), and 14 have been externally peer reviewed by expert panels that are independently assembled by an EPA contractor (dichloromethane, hexachloroethane, urea, hexavalent chromium, halogenated platinum salts, 1,4-dioxane [oral route], pentachlorophenol, ethyl tertiary butyl ether [ETBE], hydrogen cyanide, tetrachloroethane-1,1,2,2, dichloroethylene-1,2-cis-, dichloroethylene-1,2-trans-, trichloroacetic acid, and chloroprene). [13] [hyperlink, http://www.gao.gov/products/GAO-08-440]. [14] The National Academies' Silver Book observes that "delays in the process of assessing risks may increase overall exposure to risk when decisions are delayed." The Silver Book notes that "the design of a risk-assessment process should balance the pursuit of individual attributes of technical quality in the assessment and the competing attribute of timeliness of input into decision-making." Science and Decisions: Advancing Risk Assessment. [15] [hyperlink, http://www.gao.gov/products/GAO-08-440]. [16] In fiscal year 2007 EPA issued 2 IRIS assessments. In fiscal year 2008 EPA issued 5 assessments--4 of which were assessments of related chemicals assessed and peer reviewed together but finalized individually. In addition, in February 2009--3 months before the revised IRIS process was announced--EPA issued an assessment for nitrobenzene. [17] This goal includes assessments remaining from fiscal year 2011. EPA originally planned to issue 20 IRIS assessments in fiscal year 2011: 4 were issued, 1 was dropped, 2 were given "TBD"--to be determined--completion dates, and 13 were added to the fiscal year 2012 completion goals. [18] The IRIS agenda lists chemicals that are to be assessed during a given fiscal year. [19] We did not evaluate the scientific content or quality of IRIS assessments. [20] In addition to reviewing the National Academies' peer review report on the draft IRIS assessment of formaldehyde, we reviewed National Academies peer review reports on the draft assessments of tetrachloroethylene (2010) and dioxin (2006) and the Science Advisory Board peer review reports on the draft assessments of trichloroethylene (TCE) (2011), acrylamide (2008), ethylene oxide (2007), dioxin (2011), polycyclic aromatic hydrocarbon (PAH) mixtures (2011), and inorganic arsenic (2011). [21] EPA uses uncertainty factors in the derivation of IRIS values to account for uncertainty due to, among other things, variability in susceptibility to a chemical among humans, the extrapolation of animal study data to humans, and the extrapolation of a chemical's effects over a lifetime. [22] Members of the BOSC Executive Committee and subcommittees constitute a distinguished body of scientists and engineers who are recognized experts in their respective fields. [23] According to EPA, these two assessments were changed based on the National Academies' suggestions for improvement. We chose one assessment because it was completed during our review and EPA provided us the other assessment, which it said reflected changes made based on the National Academies' suggestions. These two assessments may not be representative of all assessments, but are examples of assessments that EPA has changed based on the National Academies' suggestions.
[24] EPA's IRIS assessment tracking system is formally called the IRIS Substance Assessment Tracking System. [25] [hyperlink, http://www.gao.gov/products/GAO-08-440]. [26] EPA also uses the Federal Register to announce other IRIS-related developments, such as public meetings of peer review panels and public listening sessions, at which interested parties are invited to give comments on draft IRIS assessments. [27] Federal Register (75 Fed. Reg. 63827, October 18, 2010). [28] [hyperlink, http://www.gao.gov/products/GAO-08-440]. [29] [hyperlink, http://www.gao.gov/products/GAO-08-440]. [30] IRIS Track provides estimated start and end dates by fiscal year quarter. [31] [hyperlink, http://www.gao.gov/products/GAO-08-440]. [32] [hyperlink, http://www.gao.gov/products/GAO-08-440]. [33] Consolidated Appropriations-Fiscal Year 2001, Pub. L. No. 106-554, § 515, 114 Stat. 2763A-153 to 2763A-154 (2000) (44 U.S.C. § 3516 note). [34] OMB's data quality guidelines recognize that "information quality comes at a cost," and that agencies should weigh the costs and benefits of increasing information quality. 67 Fed. Reg. 8453 (February 22, 2002). OMB has stated that its involvement in the IRIS process increases the quality of the assessments, but it has produced no cost-benefit or other analysis supporting that statement, nor has it disclosed the performance measures it uses to evaluate assessment quality. [35] One of the challenged assessments, the draft inorganic arsenic cancer assessment, is currently undergoing post-peer review EPA revisions (step 5). The other challenged assessment, the draft cancer assessment of methanol, is currently on hold while EPA reviews some of the data underlying the findings of the draft assessment. [36] On August 1, 2011, the International Platinum Group Metals Association filed a request for correction on the IRIS assessment for Halogenated Platinum Salts and Platinum Compounds. In addition, the National Academies has observed that "reaching consensus on all aspects of the scope and conduct of a risk assessment among decision-makers and stakeholders representing diverse interests will not always be feasible. In addition, it is not necessarily in the public interest to delay the risk assessment where consensus is difficult to achieve." Science and Decisions: Advancing Risk Assessment (Washington, D.C.: The National Academies Press, 2009). Moreover, EPA has noted in its data quality guidelines that "most environmental statutes obligate EPA to act to prevent adverse environmental and human health impacts. For many of the risks that we must address, data are sparse and consensus about assumptions is rare. In the context of data quality, we seek to strike a balance among fairness, accuracy, and efficient implementation. Refusing to act until data quality improves can result in substantial harm to human health, safety, and the environment." [37] Emerging new methods included cumulative assessment of TCE and its metabolites and the use of a physiologically based pharmacokinetic (PBPK) model. PBPK models are a class of dosimetry models--models that measure doses--that are useful for predicting internal doses to target organs. With the appropriate data, these models can be used to extrapolate across species and exposure scenarios and address various sources of uncertainty in risk assessments. [38] [hyperlink, http://www.gao.gov/products/GAO-08-440]. [39] U.S. Environmental Protection Agency, Office of the Administrator, Science Advisory Board.
"SAB Review of EPA's Reanalysis of Key Issues Related to Dioxin Toxicity and Response to NAS Comments (May 2010)" (EPA-SAB-011-014). Washington, D.C.: August 26, 2011. [40] U.S. Department of Health and Human Services, Report on Carcinogens, Twelfth Edition (2011). [41] [hyperlink, http://www.gao.gov/products/GAO-08-440]. [42] [hyperlink, http://www.gao.gov/products/GAO-08-440]. [43] [hyperlink, http://www.gao.gov/products/GAO-08-440]. [44] As we have previously reported, according to DOD, EPA did not specifically ask the peer reviewers to address some of the technical questions DOD had raised and wanted the peer review to address. [45] Genotoxic substances are a type of carcinogen, specifically those capable of causing genetic mutation and of contributing to the development of tumors. This includes both certain chemical compounds and certain types of radiation. [46] [hyperlink, http://www.gao.gov/products/GAO-08-440]. [47] [hyperlink, http://www.gao.gov/products/GAO-08-440]. [End of section] GAO’s Mission: The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO’s commitment to good government is reflected in its core values of accountability, integrity, and reliability. Obtaining Copies of GAO Reports and Testimony: The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO’s website [hyperlink, http://www.gao.gov]. Each weekday afternoon, GAO posts on its website newly released reports, testimony, and correspondence. To have GAO e mail you a list of newly posted products, go to [hyperlink, http://www.gao.gov] and select “E- mail Updates.” Order by Phone: The price of each GAO publication reflects GAO’s actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO’s website, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information. Connect with GAO: Connect with GAO on facebook, flickr, twitter, and YouTube. Subscribe to our RSS Feeds or E mail Updates. Listen to our Podcasts. Visit GAO on the web at [hyperlink, http://www.gao.gov]. To Report Fraud, Waste, and Abuse in Federal Programs: Contact: Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]; E-mail: fraudnet@gao.gov; Automated answering system: (800) 424-5454 or (202) 512-7470. Congressional Relations: Ralph Dawn, Managing Director, dawnr@gao.gov, (202) 512-4400 U.S. Government Accountability Office, 441 G Street NW, Room 7125 Washington, DC 20548. Public Affairs: Chuck Young, Managing Director, youngc1@gao.gov, (202) 512-4800 U.S. Government Accountability Office, 441 G Street NW, Room 7149 Washington, DC 20548.