This is the accessible text file for GAO report number GAO-05-83 entitled 'VA Patient Safety Program: A Cultural Perspective at Four Medical Facilities' which was released on December 15, 2004. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. Report to the Secretary of Veterans Affairs: United States Government Accountability Office: GAO: December 2004: VA Patient Safety Program: A Cultural Perspective at Four Medical Facilities: GAO-05-83: GAO Highlights: Highlights of GAO-05-83, a report to Secretary of Veterans Affairs: Why GAO Did This Study: The Department of Veterans Affairs (VA) introduced its Patient Safety Program in 1999 in order to discover and fix system flaws that could harm patients. The Program process relies on staff reports of close calls and adverse events. 
GAO found that achieving success requires a cultural shift from fear of punishment for reporting close calls and adverse events to mutual trust and comfort in reporting them. GAO used ethnographic techniques to study the Patient Safety Program from the perspective of direct care clinicians at four VA medical facilities. This approach recognizes that what people say, do, and believe reflects a shared culture. The focus included (1) the status of VA’s efforts to implement the Program, (2) the extent to which a culture exists that supports the Program, and (3) practices that promote patient safety. GAO combined more traditional survey methods with those from ethnography, including in-depth interviews and observation. What GAO Found: GAO found progress in staff familiarity with and participation in the VA Patient Safety Program’s key initiatives, but these achievements varied substantially in the four facilities we visited. In our study conducted from November 2002 through August 2004, three-fourths of the clinicians across the facilities were familiar with the concepts of teams investigating root causes of unintentional adverse events and close calls. One-third of the staff had participated in such teams, and most who participated in these teams found it a positive learning experience. The cultural support clinicians expressed for the Program also differed. At three of four facilities, GAO found a supportive culture, but at one facility the culture blocked participation for many clinicians. Clinicians articulated two themes that could stimulate culture change: leadership actions and open communication. For example, nurses need the confidence to disagree with physicians when they find an unsafe situation. Although VA has conducted a cultural survey, it has not set goals or explicitly measured, for example, staff familiarity and mutual trust. 
Clinicians reported management practices at one facility that had helped them adopt the Program, including (1) story-telling techniques such as leaders telling about a case in which reporting an adverse event resulted in system change, (2) management efforts to coach staff, and (3) reward systems. The Patient Safety Program Process in the figure shows how ideally (1) clinicians have cultural support for reporting adverse events and close calls, (2) teams investigate root causes, (3) systems are changed, (4) feedback and reward systems encourage reporting, and (5) patients are safer. The Patient Safety Program Process: [See PDF For image] [End of figure] What GAO Recommends: To better assess the adequacy of clinicians’ familiarity with, participation in, and cultural support for the Program, VA should (1) set goals, (2) develop tools for measuring goals by facility, and (3) develop interventions when goals have not been met. VA concurred with our recommendations and will develop an action plan. www.gao.gov/cgi-bin/getrpt?GAO-05-83. To view the full product, including the scope and methodology, click on the link above. For more information, contact Nancy Kingsbury at (202) 512-2700 or kingsburyn@gao.gov. 
[End of section] Contents: Letter: Chapter 1: VA's Patient Safety Program: Scope and Methodology: Background: Chapter 2: Progress in Clinicians' Familiarity with and Participation in the Program: Facilities Shared Safety Hazards but Not Program Familiarity and Participation: Summary: Chapter 3: Measuring Cultural Support for the Program: Varying Cultural Support: Building a Supportive Culture: Improving Assessment of Familiarity with, Participation in, and Cultural Support for the Program: Summary: Chapter 4: Promoting Patient Safety: Using Storytelling to Promote Culture Change: Deliberate Teaching, Coaching, and Role Modeling: Rewarding Close Call Reporting: Summary: Chapter 5: Conclusions and Recommendations: Measuring Clinicians' Familiarity with and Cultural Support for the Program: Recommendations for Executive Action: Agency Comments and Our Evaluation: Appendix I: Content Analysis, Statistical Tests, and Intercoder Reliability: Content Analysis: Ethnography: Data Collection: Data Analysis: Significance Testing: Intercoder Reliability: Appendix II: A Timeline of the Implementation of VA's Patient Safety Program: Appendix III: Semistructured Interview Questionnaire: Appendix IV: Comments from the Department of Veterans Affairs: Appendix V: GAO Contacts and Staff Acknowledgments: GAO Contacts: Staff Acknowledgments: Glossary: Tables: Table 1: Familiarity with and Participation in the Patient Safety Program's Initiatives at Four VA Facilities, 2003: Table 2: Number of Root Cause Analyses at Four VA Facilities, Fiscal Years 2000-2003: Table 3: Content Analysis: Achieving a Supportive Culture through Aspects of the Work Environment: Table 4: Nonparametric Multiple Comparison Results: Table 5: Intercoder Reliability Assessment Results: Figures: Figure 1: A VA Patient Safety Poster and Its Story: Figure 2: Model of the Patient Safety Program at Four VA Medical Facilities: Figure 3: Types of Adverse Event and Close Call Reporting at Four VA Facilities, June 2002: 
Figure 4: Familiarity with and Participation in the Program by Facility: Figure 5: Familiarity with VA's Program Compared with Trust and Comfort in Reporting at Four Facilities: Figure 6: Barriers to Staff Reporting Close Calls: Abbreviations: JCAHO: Joint Commission on Accreditation of Healthcare Organizations: NASA: National Aeronautics and Space Administration: NCPS: National Center for Patient Safety: PSRS: Patient Safety Reporting System: RAP: rapid assessment process: RCA: root cause analysis: VA: Department of Veterans Affairs: United States Government Accountability Office: Washington, DC 20548: December 15, 2004: The Honorable Anthony J. Principi: Secretary of Veterans Affairs: Dear Mr. Secretary: This report on the Department of Veterans Affairs Patient Safety Program examines the Program's status, the creation and implementation of a culture that supports close call and adverse event reporting, and practices that medical facility leaders have used to promote patient safety. In our study, we used ethnography, a social science method that includes qualitative and quantitative techniques developed within cultural anthropology for studying communities and organizations in natural settings. We include recommendations aimed at strengthening the Patient Safety Program by helping to build a more supportive culture and foster patient safety. We are sending copies of the report to appropriate congressional committees and others who are interested. We will also make copies available on request. If you have any questions about the report, please call me at (202) 512-2700. Sincerely yours, Signed by: Nancy Kingsbury, Managing Director: Applied Research and Methods: [End of section] Chapter 1: VA's Patient Safety Program: At the end of the 20th century, a report that the Institute of Medicine issued estimated that up to 98,000 persons died each year from accidents in U.S. hospitals. 
Before the institute published this figure, the Department of Veterans Affairs (VA) had launched a Patient Safety Program that included teams investigating the root causes of medical close calls and adverse events and confidential staff reporting systems. The Program's ultimate goal is to create a culture in which VA can discover and correct unsafe systems and processes before they harm patients. VA has indicated that it is attempting through the Patient Safety Program to introduce significant change in staff attitudes, beliefs, and behavior so that health care professionals will report events as part of their daily work. In testimony before the Congress in 2000, we suggested that the Program could be more successful if greater attention were paid to several leadership strategies the Institute of Medicine has outlined, such as making patient safety a more prominent goal and communicating the importance of patient safety to all staff.[Footnote 1] In addition, we noted that: "VA could also better ensure success if it prepared a detailed implementation plan that identifies how and when VA's various patient safety Programs will be implemented, how they are aligned to support improved patient safety, and what contribution each Program can be expected to make toward the goal of improved patient safety."[Footnote 2] One of the most challenging aspects of VA's Patient Safety Program is creating an atmosphere in which employees are willing to reveal system problems and find system solutions to them. Traditionally, hospital employees have been held responsible for adverse patient outcomes, whether they stemmed from employees' mistakes or from flaws in the health care system. For example, a nurse might be blamed for administering the wrong medicine, even when the system was at fault, as when two medicines with similar names--one deadly, the other not--were stored on the same shelf in similar bottles. 
The poster and story in figure 1 show how complicated a day in the life of a healthcare provider can be. In this instance, a VA nurse recognized a potentially dangerous flaw in the system that could have caused unintentional harm to patients. In June 2002, she reported the close call, because she saw that the environment she worked in encouraged reporting, and she was then rewarded with a gift certificate. Figure 1: A VA Patient Safety Poster and Its Story: [See PDF for image] -graphic text: PHARMACY WARNING Danger: Look Alike/Feel Alike: EVENT: Oral liquid KCL found in Acetaminophen storage bin. Both containers are same size. Both containers are the same color. Both containers have the same blue on white labeling. ACTION: Eliminate liquid KCL from stock since KCL is available in powder form. The Close Call Story behind the Poster: We visited an intensive care unit and talked to the nurse who reported a close call of two look-alike drugs that were mixed together in the same drawer. She said she reached for liquid Tylenol and found potassium chloride concentrate also in the drawer. She told us that the two drugs were very different--one could kill you and the other is a mild analgesic. The two drugs were packaged similarly in containers with pull-off lids, and since the same drug company made both medications, the labels were similar. She told us she notified her supervisor and the pharmacy. Since the medical facility had a reward system for close calls, she received a gift certificate for the cafeteria, and later it was determined that this close call was the "pick of the month." This meant that her unit received a plate of cookies. She said that she reported the close call not for the reward but because she is a professional. When one day a poster appeared in the hall alerting others to the two look-alike drugs, she wondered whether the other medical facilities were notified. 
She wondered whether she had made a difference in safety nationwide; a nurse rarely has that chance. Source: VA (poster). [End of figure] High-risk industries such as nuclear power and aerospace have found that reliable safety organizations discover and correct system flaws. In effective safety cultures, frontline workers trust one another and report close calls and adverse events without fear of blame. Healthcare, which traditionally employs a culture of blame, must place a premium on learning from staff reporting of adverse events and close calls.[Footnote 3] Experts in patient safety acknowledge that emphasis on culture is important in preventing medical adverse events and close calls and promoting patient safety.[Footnote 4] To describe the culture in VA's medical facilities and to search for a deeper understanding of patient safety from the viewpoint of VA staff, we proposed to answer the following questions in the context of four VA medical facilities: 1. What is the status of the Program's implementation at four medical facilities? 2. To what extent do the four sites we studied have a culture that supports the Program? What cultural changes can be stimulated? 3. What practices in the four facilities promoted patient safety? 
Scope and Methodology: To meet our study's challenges, we used several methods from ethnography, and in certain cases we blended them with survey methods to provide in-depth knowledge of organizational culture from the perspective of VA's frontline staff--its physicians, nurses, and others directly responsible for patient care.[Footnote 5] We intend this study to complement our earlier reports on organizational culture and changing organizations.[Footnote 6] We chose ethnography because several of its techniques and perspectives helped us study aspects of patient safety that would otherwise have remained overlooked or unobserved, such as informal mores, and because it would assist GAO in developing new evaluation methods.[Footnote 7] These techniques were ethnography's research traditions of (1) conversational interviews, enabling interviewers to explore a participant's own view of and associations with an issue of interest, (2) the researchers' observations of real processes to further understand the meaning behind patient safety from the natural environment of staff, and (3) systems thinking.[Footnote 8] Our study measures, at the facility level, the extent of familiarity with, participation in, and cultural support for the Program, and it complements a cultural survey VA conducted in 2000. VA expects to resurvey staff in the near future, using its past survey data as a baseline. VA's original, nonrandom survey contained questions regarding shame and staff willingness to report adverse events when the safety of patients was at risk during their care. 
The VA survey did not establish staff familiarity with key concepts of the Program, participation in VA safety activities, or the facilities' levels of cultural support for the Program.[Footnote 9] Conversational Survey Interviews: We recognized that a tradition of fear of being blamed for adverse events and close calls might make staff reluctant to talk about their experience of potential harm to patients. Besides breaking through an emotional barrier, we wanted to understand the private views of staff on what facilitates patient safety. To achieve the informal, open, and honest discussions we needed, we conducted private, nonthreatening, conversational interviews with a random sample of clinicians and a judgmental sample of other staff. At each site, we chose one random and one judgmental (nonrandom) sample of staff to interview in a conversational manner, using similar semistructured questionnaires (see app. III). For the first sample, we interviewed a random selection of 10 physicians and 10 nurses at each of the four facilities. While this provided us with a representative sample of clinicians (physicians and nurses) from each facility, the sample size was too small to provide a statistical basis for generalizing our survey results to an entire facility. To give us a better understanding of the culture and context of patient safety beyond the clinicians involved in direct patient care at each facility, we also interviewed more than a hundred other staff in the four study sites, including medical facility leaders, Patient Safety Managers, and hospital employees from all levels--maintenance workers, security officers, nursing assistants, technicians, and service chiefs. (Appendix I contains more technical detail about our analysis.) Reporting adverse events and close calls is a highly sensitive subject and can successfully be explored with qualitative methods that allow respondents to talk privately and freely. 
When staff did not recognize a key element of the Program, our interviewers explained it to them. (We were not giving the respondents a test they could fail.) Selecting clinicians randomly at each of four facilities, and asking some close-ended questions such as those expecting "yes" or "no" answers, allowed us to analyze and present some issues as standard survey data. This combined survey and ethnographic approach afforded us most of the advantages of standard surveys while establishing an environment in which the respondents could talk, and did talk, at length about the cultural context of patient safety in their own facilities. Clinicians responded to a standard set of questions, many open-ended, such as "To what extent do you perceive there to be trust or distrust within your unit or team?" These questions had the advantage of allowing the clinicians to discuss issues spontaneously and allowing us to discover what facilitates trust from their point of view. Thus, if clinicians thought leadership was important, we had an opportunity to see this from their viewpoint rather than starting from the premise that leadership would be important to them. An important part of our approach was content analysis, which we used to analyze answers to both the standard and open-ended questions. Content analysis summarizes qualitative information by categorizing it and then systematically sorting and comparing the categories in order to develop themes and summarize them. We determined, by intercoder reliability tests, that our content analysis results were trustworthy across different raters. (See app. I.) Observation: We added another ethnographic technique in order to more completely understand the culture within each facility. 
Since responses to surveys are sometimes difficult to understand out of context, our in-depth ethnographic observations of the patient care process gave us a more complete picture of how the elements of the Patient Safety Program interacted. They also gave us a better understanding of VA's medical facility systems. We observed staff in their daily work activities at each medical facility, which helped us understand patient safety in context. For example, we attended staff meetings where the Program was discussed and we attended RCA meetings, and we followed a nurse on her rounds. We took detailed field notes from our observations, and we analyzed and summarized our notes. We reviewed files to examine data on adverse events, close calls, and RCA reports. We read files from administrative boards, reward programs, and patient safety committee minutes. And we interviewed high-level VA officials. Systems Thinking: Finally, our ethnographic research approach was systemic. This was to help us appreciate interactions between the elements of the Program and the facilities' existing culture. Ethnography has traditionally been used to provide rich, holistic accounts of the way of life of a people or a community; in recent decades, it has also been used successfully to study groups in modern societies. A systems approach casts a wide net over the subject. In this case, we chose to study the Patient Safety Program in relation to other aspects of culture in VA's medical facilities that might affect its adoption, such as the extent to which staff have mutual trust. We also developed a model, or flow chart, to guide our study of the Program and the culture of the facilities. The model, in figure 2, helped us conceptualize the important safety activities within the Program and analyze and present our results. 
We looked not only at the Program's key elements, in the darkly shaded boxes in figure 2, but also at what surrounds them--the context of the medical facilities' culture--and whether the culture supports the adoption of the Program. Our model illustrates that our primary focus was measuring clinicians' supportive culture for reporting close calls and adverse events and their familiarity with and participation in reporting programs and RCAs. The model also depicts the interaction between clinicians' receiving feedback and being rewarded and their desire to continue reporting close calls and adverse events. It also allows us to describe how clinicians' reporting close calls and adverse events, and the subsequent investigation of the root causes of them, developed into system changes that in turn resulted in patients being safer. Figure 2: Model of the Patient Safety Program at Four VA Medical Facilities: [See PDF for image] [End of figure] We conducted the study at three medical facilities that VA had recommended as being well managed. We selected a fourth facility for geographic balance. Thus, the four facilities were in different regions of the country. Using rapid assessment techniques, we conducted fieldwork for approximately a week at each of two facilities, for 3 weeks at a third, and for 25 days at the fourth.[Footnote 10] We did our work from November 2002 to August 2004 in accordance with generally accepted government auditing standards. 
Background: The Patient Safety Goal: In 1998, in an influential editorial in the Journal of the American Medical Association, George Lundberg, the journal's editor, along with Kenneth Kizer, then VA's Under Secretary for Health, and other patient safety advocates and theorists, challenged the medical profession: "to make health care safe we need to redesign our systems to make error difficult to commit and create a culture in which the existence of risk is acknowledged and injury prevention is recognized as everyone's responsibility. A new understanding of accountability that moves beyond blaming individuals when they make mistakes must be established if progress is to be made."[Footnote 11] This vision of making patients safe through "redesign . . . to make errors difficult to commit" led to VA's National Center for Patient Safety (NCPS), established to improve patient safety throughout the largest health care system in the United States.[Footnote 12] To transform the existing culture of patient care in VA's medical facilities, VA's leaders aimed to persuade clinicians and other staff in health care settings to adopt a new practice of reporting, free of fear and with mutual trust, identifying vulnerabilities, and taking necessary actions to mitigate risks. The Under Secretary had recognized the risk to patients during care and believed that a focus on VA's existing culture could improve patient safety. 
Related research shows that if complex decision-making organizations are to change, they must modify their organizational culture.[Footnote 13] Traditionally, clinicians involved in an adverse event could be blamed or sued, but the roots of unintentional errors are now understood as originating often in the institutions and structures of medicine rather than in clinicians' incompetence or negligence.[Footnote 14] Several contextual factors influence how the Patient Safety Program is experienced at the medical facilities we visited and show the increasingly complex world of patient care. The limitations of our study meant that we could not examine these factors, but health care facilities in general, as well as VA's, are experiencing difficulty in hiring and retaining nurses, as well as potential staffing shortages. Patients admitted to VA medical facilities increasingly have multiple medical problems that require more extensive care than in the past. VA's eligibility reform allowed veterans without service-connected conditions to seek VA services, leading to a 70 percent increase in the number of enrolled veterans between 1999 and 2002. 
The Patient Safety Process: VA has provided funding of $37.4 million to NCPS for its Patient Safety Program operations and related grants and contracts for fiscal years 1999-2004.[Footnote 15] In fiscal year 1999, NCPS defined three major initiatives: (1) a more focused system for mandatory close call and adverse event reporting, including a renewed focus on close calls; (2) reviews of close calls and adverse events, including RCAs, using interdisciplinary teams at each facility to discover system flaws and recommend redesign to prevent harm to patients; and (3) staff feedback on system changes and communication about improvements to patient safety.[Footnote 16] Close Call and Adverse Event Reporting: After the NCPS program began in 1999, reporting of close calls increased dramatically as their value for patient safety improvement was widely disseminated and increasingly recognized by VA personnel. A close call is an event or situation that could have resulted in harm to a patient but did not, either by chance or by timely intervention. VA encourages reporting close calls and adverse events, since redesigning system flaws depends on staff revealing them.[Footnote 17] VA's Patient Safety Managers told us that only adverse events and not close calls were traditionally required to be reported to supervisors and then up the chain of command. Under the Program, staff also have optional routes for reporting--through Patient Safety Managers or a confidential system outside their facilities. Staff can now report close calls and adverse events directly to the facilities' Patient Safety Managers. They, in turn, evaluate the reports, based on criteria for deciding which adverse events or close calls should be investigated further. NCPS also has a confidential reporting option--the Patient Safety Reporting System (PSRS)--through a contract with the National Aeronautics and Space Administration (NASA). 
NASA has 27 years of experience with a similar program, the Federal Aviation Administration's Aviation Safety Reporting System. Under the contract with VA, NASA removes all identifying information and sends selected items of special interest to the NCPS. NASA also publishes a newsletter based on reports that have had their identifying information removed. Root Cause Analysis Teams: Working on interdisciplinary teams of usually five to seven participants, staff focus on either a single close call or adverse event or a group of similar ones to investigate their causes. Then they search for system flaws and redesign patient care so that mistakes are harder to make. Under the Program, NCPS envisioned that these teams would be a key step to improving patient safety through system change and one of its primary mechanisms for introducing clinicians to the Program.[Footnote 18] In 1999, NCPS began RCA implementation.[Footnote 19] In this on-the-job training, Patient Safety Managers guide local interdisciplinary teams in studying reports of close calls or adverse events to identify and redesign system weaknesses that threaten patients' safety. Teams are allowed 45 days to learn as much as possible from a close call or adverse event or from a group of similar close calls or adverse events (such as falls, missing persons, medication errors, and suicides), called aggregated reviews. Within the given time period, teams are to develop action plans for system improvement. Personal experience on interdisciplinary RCA teams investigating close calls and adverse events at their home facilities is the clinicians' key training experience. VA expected that the RCA experience would persuade staff that VA was changing its culture by encouraging a different approach to reporting. Feedback Mechanisms: Staff need proof that the Program is working, in the form of timely feedback on their reports. 
A feedback loop fosters and perpetuates close call and adverse event reporting.[Footnote 20] Without it, staff may feel the effort is not worth their time. NCPS built in feedback loops at several levels of the system. For example, individuals who report a close call or adverse event are supposed to get feedback from the RCA team on actions recommended as a result of their reports. Also, NCPS issues an online bimonthly newsletter that reports safety changes. In chapter 2, we measure clinicians' familiarity with and participation in the Program at the four facilities we visited. Chapter 3 is an examination of whether the culture at the four facilities supports the Patient Safety Program, and chapter 4 provides examples of management practices that promote patient safety. We asked VA to comment on our report; VA's comments are in appendix IV. Our response to VA's comments is in the conclusions in chapter 5. VA also provided some additional comments to emphasize that it believes that it has taken steps to address the issue of mutual trust. VA describes those steps in the report on page 67. [End of section] Chapter 2: Progress in Clinicians' Familiarity with and Participation in the Program: In general, we found progress in clinicians' understanding of and participation in the Patient Safety Program. Three facilities had medium or higher familiarity with and participation in the Program's core elements, and one had lower. At that facility, the staff were not following VA's policy of reporting close calls and were not being educated in the benefits of doing so. Examining the data across our total random sample, we found that some clinicians were familiar with several core concepts of the Program and were unfamiliar with others--a picture NCPS officials said did not surprise them. About three-quarters of clinicians were familiar with the concept of RCAs (newly introduced in 2000) and the concept of the close call. 
About half the clinicians recognized the new confidential reporting process--another equally important program. One-third had participated in an RCA or knew someone who had. NCPS staff told us that participation in RCAs is crucial to culture change at VA, and clinicians who were on RCA teams indicated that they experienced the beginning of a culture shift.[Footnote 21] Of the staff who had participated in RCAs, many indicated that it was a positive learning experience, but facilities varied in ensuring clinicians' broad participation. Facilities Shared Safety Hazards but Not Program Familiarity and Participation: VA has made progress in familiarizing clinicians with the Program's key concepts and involving them in its activities. But while the facilities we studied shared basic safety problems, three had made more progress than the fourth. First, all four experienced similar hazards to patient safety. Second, we report clinicians' familiarity with and participation in the Program in two ways--grouped first by facility and then across the four sites. Facilities Share a Common Safety Reporting Pattern: The four facilities shared an overall pattern in the types of adverse events they reported, reflecting their common safety challenge. To establish the Program's context, we reviewed documents at the four facilities related to close calls and adverse events reported over a one-month period (June 2002). All the facilities reported falls for this period, while two or more facilities recorded patients' violence toward staff, patients' suicides and suicide attempts, missing patients, and medication errors (see fig. 
3).[Footnote 22] Although our data reflect a limited time period, the highly overlapping types of reporting at the facilities parallel those found in the wider VA patient care system, as documented in an earlier review by the VA Medical Inspector.[Footnote 23] Figure 3: Types of Adverse Event and Close Call Reporting at Four VA Facilities, June 2002: [See PDF for image] Note: Excludes reports in pharmacies, laboratories, and other areas of VA facilities that had separate reporting systems. Facilities with suicides not reported for June 2002 may have had suicides reported in other months. [End of figure] Facilities' Differences in Participation and Familiarity with the Program: Staff at one facility had less familiarity with and participation in the Program than staff at the three others (see fig. 4).[Footnote 24] In the interviews with the random sample, we found that Facility D had lower familiarity with the Program's concepts than the other facilities and lower participation in RCAs; this pattern was buttressed by additional interviews at Facility D. For example, the quality manager who supervised Patient Safety Managers at that facility did not realize that close call reporting was mandated, and the education officer who trained staff in patient safety told us that staff were generally not acquainted with the concept of reporting close calls. Because knowing that an initiative exists is often the first step to participation, the lower familiarity with the Program at Facility D in the fifth year of implementation was a likely impediment to the adoption of the Program there. Figure 4: Familiarity with and Participation in the Program by Facility: [See PDF for image] Note: A summary code we created for each facility reflected a composite score for answers to five questions about familiarity with the Program's key elements and participation in RCAs: Do you know what a close call is? Do you know what the Patient Safety Reporting System is? Do you know what an RCA is?
Have you participated in an RCA? Do you know anyone who has participated? For each question, we let "yes" equal 2 and "no" equal 0, so that an individual who knew all five elements would achieve a composite score of 10. We then averaged the composite scores to get an average score for each facility. Rather than display these numbers, we used a scale of high, medium, and low for 10, 5, and 0 and placed the answers accordingly. (Appendix I describes our methodology; appendix III reprints our questionnaire.) [End of figure] Differences in Facilities' Adherence to Close Call Reporting Policy: The four medical facilities we studied also varied in their adherence to close call reporting policies under the Program. We found that three of the four facilities followed the policy of reporting close calls. One facility, in particular, showed a marked increase in the number of close calls in a short period of time; close call reports were rare in the 6 months before but numbered 240 in the 6 months after its leaders told staff that patient safety was an organizational priority and introduced a simple reward system for close call reporting. However, one facility we visited was not reporting close calls in the Program's fifth year. Familiarity with and Participation in the Program across Four Facilities: We examined interview responses from randomly selected clinicians across all four facilities.
We found that three-quarters of the clinicians knew the meaning of close call--that is, when a potential incident is discovered before any harm has come to a patient--but only half were aware of the option of reporting close calls and adverse events confidentially. (See table 1.) Close calls are presumed to occur more often than adverse events, and reporting them in addition to adverse events is central to the Program's goal of discovering and correcting system flaws. Staff who do not recognize the close call concept cannot bring to light system flaws that could harm patients. Further, because changing from traditional blaming behavior to reporting without fear can take time, staff familiarity with the confidential reporting option is important. However, only half the clinicians surveyed at the four facilities knew that they could report adverse events or close calls confidentially under the NASA reporting contract. Table 1: Familiarity with and Participation in the Patient Safety Program's Initiatives at Four VA Facilities, 2003: Program: Root cause analysis; Percentage of staff: 78%; Indicator: Familiar with the concept. Program: Root cause analysis; Percentage of staff: 35%; Indicator: Had participated. Program: Root cause analysis; Percentage of staff: 43%; Indicator: Knew someone who had participated. Program: Close call; Percentage of staff: 75%; Indicator: Familiar with the concept. Program: Confidential report to NASA; Percentage of staff: 51%; Indicator: Familiar with the program. Source: GAO analysis. Note: Data, rounded to the nearest whole number, are from our interviews with 81 randomly selected VA physicians and nurses. If staff initially did not know of a concept, we explained it to them. If they then recognized it, we accepted their answer as "yes." Therefore, when we state that they are familiar with it, this means they either knew the definition or recognized the term after an explanation. 
[End of table] Culture Shift through Root Cause Analysis: Clinicians who had participated in interdisciplinary RCA teams found that their participation enabled them to understand the benefits of using a systems approach rather than blaming individuals for unintentional adverse events and close calls. To understand the RCA process from close call reporting to RCA team analysis, we provide an example from fieldwork that shows how two misidentifications in a surgery ward led to a reexamination of the preoperative process in an RCA. (See "Developing Patient Safety from Examining Close Calls and an RCA.") While examining how many RCAs were conducted from 2000 to 2003 at the four facilities, we found that the most active facility we studied had performed twice as many RCAs as the least active. The RCAs have the potential to promote a cultural shift from blaming staff for unintentional close calls and adverse events to a rational search for the root causes, but clinicians at the four facilities had inconsistent opportunities to participate in the Program. Illustrating the Steps from Close Calls to RCAs: "Developing Patient Safety from Examining Close Calls and an RCA" illustrates an RCA team's initial steps by following a series of events involving two close calls of mistaken identity in surgery at one facility. Developing Patient Safety from Examining Close Calls and an RCA: The Patient Safety Manager had an unusual visit from the Chief Surgeon. He had come to report two recent instances of patients being mistakenly scheduled for surgery. The identity mix-ups had been discovered before the patients were harmed--a situation the surgeon recognized as fitting the Program's mandate to report close calls in order to identify hazards in the system. After each close call, he had filled out a form and made a report to NCPS, which had called him back within 24 hours to ask for more information and to offer some reengineering suggestions. 
At the next weekly surgery preoperation meeting, the Chief Surgeon and his staff discussed their schedule and details of coming surgeries, using a matrix timetable projected for all to see. Then he discussed the two close calls. In both cases, the correct patient had come to the surgery preparation room, but the staff had been expecting someone else. In one case, the scheduling staff had confused two similar names. In the other case, the scheduling staff had, as usual, used the last four digits of the Social Security number to help identify the patient but had had two patients with the same last four digits. In the meeting's discussion, the staff tried to understand how such mistakes could happen. The Patient Safety Manager convened an expedited RCA team of three other VA staff to get at the root cause of such identification problems. She opened the meeting by saying, "If we don't learn from this [close call], we're all fools." She announced that the RCA would be limited to two or three meetings rather than several weeks. After introductions, the staff members explained their role in scheduling and what happened in such cases. As they spoke, the staff tried to outline the scheduling process: what forms were completed, whether they were electronic or paper, how they moved from person to person, and who touched the forms. Several problems emerged. (1) Some VA patients might not always know their identity or surgical site because of illness or senility or both. Also, patients with multiple problems cannot always relay them to staff, because they may focus on one problem while the appointment scheduled is for another problem. (2) Two key VA staff may be absent at the same time and a substitute may make the error. (3) In one case, two patients' names differed only by m and n. (4) A scheduler noted that scheduling is filled with interruptions and opportunities for confusion. 
For example, it is not uncommon that scheduled patients have overlapping numerals for the last four digits of their Social Security numbers. The RCA team's next meeting was scheduled. In future meetings, the RCA team would consider various ways of preventing or minimizing similar events. [End of text box] Clinicians' Belief in RCAs as a Positive Learning Experience: Staff who had participated in RCAs told us that their experience was a valuable and convincing introduction to the Patient Safety Program. In lieu of giving clinicians formal training in the central concepts of the Program, NCPS expected to change the culture of patient care one clinician at a time through their individual experience in RCAs. NCPS intended that experience on multidisciplinary RCA teams investigating the underlying causes of reported close calls and adverse events at their home facilities would be clinicians' key educational experience and that it would persuade them that VA was taking a different approach to reporting. All facilities are expected to perform RCAs, in which local interdisciplinary teams study reports of close calls and adverse events in order to identify and redesign systems that threaten patients' safety. Staff also reported that RCA investigations created a learning environment and were an excellent way to introduce staff to redesigning systems to prevent harm to patients. Two doctors at one facility, for example, told us that the RCAs they participated in were a genuine "no blame learning experience" that they felt good about or found valuable. Two nurses at another facility reported being amazed at the change from a blaming culture to an inquiring culture as they experienced the RCA process. However, staff also told us that the RCA process took too much time or took time away from patient care.
At another facility, where trust was low and only 5 of 20 clinicians had a positive view of reporting, each of those 5 clinicians had a positive experience with RCAs under the new Program. "How Participating in RCAs Affects Clinicians' Work" presents some clinicians' own stories of their participation in RCAs. How Participating in RCAs Affects Clinicians' Work: Physician 1: I participated in an RCA through my work in the blood bank. It taught me to look at errors systematically and not rush to blame individuals. But if an employee were eventually found responsible, then the Lab would hold that person accountable. [This example reflects the decision leaders must make between personal accountability and systemic change.] Physician 2: RCAs are a good thing. It's fixing a potential disaster before it can coalesce and become a disaster. Nurse 1: I think RCAs are a good thing, because usually the problems are system problems. I think if you fix the system, you fix the problem. It seems to be that way in surgery. You try and concentrate on the things you can fix. Nurse 2: They used to have a process in psychiatry called "post mortem." That process often led to the conclusion that a suicide could not have been prevented. By contrast, in the new RCA process, we look at how the RCA can promote system changes. Nurse 3: RCA does a good job of identifying not only the actual adverse event but also the contributing factors. This is very helpful because it allows us to better understand what to do about an adverse event. Nurse 4: RCA is a good system. It's a good way to share information and avoid recurring error. Nurse 5: My general impression is that RCAs are great. They're especially important when teams look for results and action items. [End of text box] Variation in Facilities' RCA Activity: Over the 4 years of the RCA implementation, the most active facility we studied (Facility A) had performed twice as many RCAs as the least active facility (Facility D). (See table 2.) 
The number of RCAs, similar to the number of close calls and adverse events, does not reflect the actual numbers of adverse events or close calls that occurred or how safe the facility is; rather, it reflects whether organizational learning is taking place through increasing participation in a core Program activity. Similarly, NCPS staff recently reported to a facility leaders' training session that networks of their facilities varied fourfold in fiscal year 2002 with respect to the number of RCAs conducted. Facility D's director told us that NCPS had recently identified his facility as having too few RCA reviews. Table 2: Number of Root Cause Analyses at Four VA Facilities, Fiscal Years 2000-2003: Fiscal year: 2000; Facility A: 10; Facility B: 9; Facility C: 8; Facility D: 1. Fiscal year: 2001; Facility A: 20; Facility B: 14; Facility C: 11; Facility D: 9. Fiscal year: 2002; Facility A: 13; Facility B: 9; Facility C: 8; Facility D: 5. Fiscal year: 2003; Facility A: 11; Facility B: 6; Facility C: 7; Facility D: 8. Total; Facility A: 54; Facility B: 38; Facility C: 34; Facility D: 23. Source: GAO analysis. Note: Includes only individual RCAs; excludes aggregate reviews. In 2002, VA began a program of aggregate RCAs, in which the most commonly reported events, such as patient falls, were grouped and analyzed quarterly. Thus, in 2003 we see a reduction in individual RCAs across these facilities. [End of table] Inconsistent Opportunities to Participate in RCAs: One facility was more successful than the three others at providing busy physicians with the opportunity to participate in RCA teams by adopting a mandatory rotation system. RCAs have been required under the Program since 2000. About three-fourths of the respondents were familiar with the RCA concept. Seventy-five percent staff familiarity represents substantial learning, given how recently the concept was introduced. However, only about a third had participated in an RCA or knew someone who had.
At one facility, we found broad participation by physicians because management required it. NCPS envisions RCA experience as central to changing to a culture of safety, but many VA clinicians (approximately 65 percent) at the facilities we studied had yet to participate in the nonblaming process that NCPS's director told us he viewed as the most effective experience for culture change: "We don't want professional root cause analysis people doing this stuff. Then you don't change the culture." We found a wide spectrum of methods being used to recruit physicians into RCA teams. One facility had broad physician participation in RCAs as its policy; at another facility, a single unit, though not the facility as a whole, had a rotational plan that encouraged its clinicians to participate. Administrators at three of the four facilities had no facilitywide policy to ensure physician participation on the teams. At two facilities, Patient Safety Managers told us it was difficult to get physicians to participate because of their busy schedules. Understandably, most of the clinicians we surveyed had not served on RCA teams. Summary: We found progress but also variation in clinicians' familiarity with and participation in key elements of the Program. Looking facility by facility, we found that one of the four facilities had lower familiarity with and participation in the Program. Examining the clinicians across the random sample, we also found that about three-fourths were familiar with close call reporting but only half were familiar with a confidential reporting system. Focusing on RCAs, we found that about three-quarters of the sample knew the concept--that is, staff teams investigate the causes of accidents--while one-third had participated. Most of those who had participated thought that RCAs were promoting a culture shift by investigating adverse events and close calls in a no-blame atmosphere and redesigning systems so that future problems could be prevented.
[End of section] Chapter 3: Measuring Cultural Support for the Program: Cultural support for VA's Patient Safety Program varied at the four facilities we studied. While clinicians we surveyed at three facilities had a more supportive cultural foundation for the Program, significantly lower levels of mutual trust and comfort in reporting limited the adoption of core Program activities at the fourth facility. Further, our analysis indicated that the low trust and fear of punishment that characterize an unsupportive culture limit the adoption of the Program and do not necessarily diminish as clinicians become familiar with the Program's key concepts.[Footnote 25] The clinicians identified barriers to their participation in the Program. However, they fundamentally agreed on workplace conditions that can build the supportive culture and foster patients' safety. Their most frequently articulated themes for building supportive culture were (1) effective leadership; (2) good two-way communication, including feedback on reports of adverse events and close calls; (3) their professional values; and (4) workflow.[Footnote 26] Varying Cultural Support: Clinicians at three of the four facilities had medium or higher cultural support for the Program. One facility had lower support, and many clinicians indicated that they would not report adverse events because they feared punishment.[Footnote 27] This suggests that the Program will not succeed there unless cultural support is bolstered. We explored the cultural support from these four groups in two ways: (1) by graphically comparing the groups' levels of mutual trust and comfort in reporting close calls and adverse events with their levels of familiarity with the Program and (2) by graphically demonstrating the barriers clinicians see as blocking their close call and adverse event reporting, in conjunction with some elements of basic familiarity with and cultural support for the Program.
Clinicians' Trust and Comfort in Reporting Vary by Facility: In figure 5, we compare our findings on clinicians' mutual trust and their comfort in reporting close calls and adverse events at the four facilities. The levels of these components of a supportive culture appeared to vary among the clinician groups.[Footnote 28] For example, staff at Facility A had medium familiarity with the Program but had the lowest levels of comfort in reporting adverse events and close calls and mutual trust among the four facilities. Knowledge gained from safety training or RCA participation was not sufficient for staff to adopt the Program's safety practices when their comfort in reporting and mutual trust were low. Figure 5 contrasts information on the supportive culture (mutual trust and comfort in reporting) with a measure of staff familiarity with the Program from figure 4. Figure 5: Familiarity with VA's Program Compared with Trust and Comfort in Reporting at Four Facilities: [See PDF for image] Note: We reviewed all coded expressions of mutual trust and comfort in reporting for each interview in the random sample, assessing the preponderance of expressions and creating a summary high, medium, or low value for each individual. Intercoder reliability testing found coding consistency acceptable. We averaged these scores for each facility. Finally, we created a summary code for each facility, reflecting a composite score, using five questions about familiarity with the key elements and participation in RCAs.
We assigned numeric values, as customary in quantifying verbal answers. For display and comparison purposes, we decided to let the maximum individual knowledge, trust, and comfort levels be 10. Thus, in each key elements question, we let "yes" equal 2 and "no" equal 0, ensuring that an individual who knew all of the five elements would achieve a composite score of 10. Finally, we averaged composite scores to get an average score for each facility. In the trust and comfort summary judgments, we let "high" equal 10, "middle" equal 5, and "low" equal 0. Rather than display these numbers, we used a scale of high, medium, and low for 10, 5, and 0 and placed their answers accordingly. (See app. I for more on our methodology.) [End of figure] Many staff at Facility A were afraid of being punished, and they mistrusted management and other work units. One staff member explained why staff would not report adverse events: "We have a culture of back-stabbing here. They are always covering themselves." Many other staff members echoed this characterization of the atmosphere, linking the lack of cultural support to their decision not to perform the most basic of the Program's activities. Staff at that facility needed a boost in supportive culture to fully implement the Program. In contrast, Facility D, with the least familiarity with the Program, had trust and comfort levels almost as high as any of the others, indicating that if the Program were to be pursued with greater vigor there, cultural support would not be a barrier to reporting close calls and adverse events. Barriers to Reporting: In interviewing clinicians, we found that barriers remain to reporting adverse events and close calls. Even for staff familiar with the concepts, reporting required overcoming numerous remaining obstacles. These staff indicated that reporting formally would be a time-consuming diversion from patient care or, worse, "an invitation to a witch hunt."
In figure 6, we display the cumulative effect of the barriers to reporting close calls that staff told us about, in conjunction with familiarity with and cultural support for the Program. Figure 6: Barriers to Staff Reporting Close Calls: [See PDF for image] Note: We asked VA staff "Do you know what a close call is?" If they answered "No," we explained it to them; if they recognized the concept, we accepted their answer as "Yes." [End of figure] Clinicians told us about barriers to their participation in reporting, including (1) limited perceived value, (2) not knowing how to report, (3) not having enough time to report, (4) fearing traditional blame or punishment, (5) lacking trust that coworkers would not shame them, and (6) lacking knowledge of the confidential reporting option. Staff at all four sites reported such barriers in reporting both close calls and adverse events. We present some of their views in "Clinicians' Barriers to Reporting Close Calls and Adverse Events." Clinicians' Barriers to Reporting Close Calls and Adverse Events: Nurse 1: Some clinicians feel comfortable reporting adverse events and close calls. I agree with the concept. It depends on the person. Some would feel it would be used against them. I've seen nonreporting, because, before, they got written comments such as "This is not a near miss." "This is not a close call." We get shut down instead of worked with. [By "shut down," she meant that management told her it was not a close call and not to report it.] It happened to me. Management generally discourages and does not empower staff to feel comfortable reporting patient safety conditions. Instead, I reported and it was used against me. Physician 1: I can't remember if I've written a close call. That does not happen here--only very, very rarely. Maybe I wrote one early on in my career, but I'm not sure. Physician 2: I thought I had a close call once and showed it to the chief of staff and he told me that it was not a close call.
I'm unclear what the definition of a close call is. Physician 3: I know what a close call is in other settings, but not in the hospital setting. [Interviewer explains the definition.] They are not reporting on close calls in this hospital. Physician 4: Yes, I know what a close call is. I've not reported a close call, but if I were to, I would go to a nurse supervisor and tell her about it orally and have her report it. I would not use incident reports to report a close call--only actual events. Physician 5: I have not reported a close call. I'm removed from the nursing communications. Physician 6: I'm unsure if it is safe to report close calls without punishment. Nurse 2: If I saw a close call, I would go talk to the nurse who did it. Writing up a close call on someone would be cruel. I would not write up a close call or adverse event report on someone else. If something happened to the patient, I would write it up. Writing up another person would cause conflict. We need to help each other, and writing each other up is not considered helpful. [End of text box] Additional Steps to Stimulate Culture Change: The themes clinicians articulated most often for work conditions that promote a supportive culture for patient safety were (1) leadership, (2) communication, (3) professional values, and (4) workflow.[Footnote 29] Building a Supportive Culture: A few strong patterns emerged from the clinicians' responses to our open-ended interview questions about what affects trust and comfort in reporting close calls and adverse events. First, across the survey, the clinicians said their leaders' actions were most likely to increase or decrease comfort and trust. Attributes of communication were the second most common aspect of their work that they said influenced their comfort and trust.
Third, and somewhat less commonly, clinicians thought that the values and norms they had developed in their professional training, and that had been reinforced on the job, influenced their culture. Fourth, they thought that workflow could support or undercut trust generally. In their view, trust literally could be made or broken, depending on whether tasks shared between individuals or between units went smoothly and cooperation was maintained. Table 3 shows the results of our content analysis, listing the clinicians' four top themes--leadership, communication, professional values, and workflow--and how many times we found these themes in our analysis. Table 3: Content Analysis: Achieving a Supportive Culture through Aspects of the Work Environment: Aspects of work environment: four top themes: Leadership; Culture element: Comfort in reporting: 22; Culture element: Mutual trust: 25; Number of times theme appeared in our analysis: 47. Aspects of work environment: four top themes: Communication; Culture element: Comfort in reporting: 13; Culture element: Mutual trust: 25; Number of times theme appeared in our analysis: 38. Aspects of work environment: four top themes: Professional values; Culture element: Comfort in reporting: 15; Culture element: Mutual trust: 8; Number of times theme appeared in our analysis: 23. Aspects of work environment: four top themes: Workflow; Culture element: Comfort in reporting: 0; Culture element: Mutual trust: 12; Number of times theme appeared in our analysis: 12. Source: GAO analysis. [End of table] When we asked clinicians what affected a culture that supported comfort in reporting and trust among the different professions, departments, teams, and shifts they worked with, their most frequent answers were effective leadership and good two-way communication. Moreover, the clinicians told us that an unsupportive culture lacks these characteristics.
Clinicians gave us these same answers, whether we asked about comfort in reporting or mutual trust. Further, we found that the culture of blame and punishment traditionally learned in medical training hampers close call and adverse event reporting but that mutual trust is developed more by workplace conditions. Effective Leadership: Leadership's role is important in fostering a supportive cultural environment for the Program. Clinicians reported examples of leaders facilitating comfort in reporting and mutual trust that enabled them to participate in the Program. But at several facilities we also heard about distrust of the Program that resulted from leaders' actions or inaction. Clinicians told us that some VA leaders had not focused sufficiently on building the supportive culture that the Program requires. Staff reported that in order to trust, they needed information and needed to take part in decisions about their workplace and policies that affect their work. For example, clinicians told us that they wanted to be part of management's decisions or, at the very least, to be informed about management's decisions when a number of changes were being introduced, such as when medical supplies and software were purchased, clinicians were assigned temporary rotations, and performance measures were implemented. Their observations are in line with other studies that show that leaders' making decisions without consulting frontline workers can cause serious problems of trust.[Footnote 30] In "Clinicians' Perspectives on Leaders' Supporting Trust," we illustrate staff's positive attitudes toward patient safety and how leadership is instrumental in developing mutual trust and comfort. Clinicians' Perspectives on Leaders' Supporting Trust: Nurse 1: I asked my staff what the role of leaders should be so I could serve staff better.
Many answered, "communication" and "knowing what is happening at the facility is important." Physician 1: Leaders often bring up patient safety. They're "taking a lead in making staff aware of patient safety." At my facility, they hold staff meetings to review the patient safety goals of the Joint Commission on Accreditation of Healthcare Organizations (JCAHO). The chief of staff constantly brings up patient safety in meetings. The administration takes the lead, not only "talking the talk" but also "walking the talk." Nurse 2: Trust is sustained, in part, because of weekly meetings with management, where they talk about patient safety. Physician 2: It's leadership's responsibility to communicate that staff are accountable for cooperation and coordination of patient care. [End of text box] Conversely, respondents said leaders' actions can diminish clinicians' comfort and trust, as summarized in "Clinicians' Perspectives on Leaders' Undercutting Trust." Physicians and nurses at different facilities told us that trust is diminished when staff do not work in stable teams. Some of the policies that clinicians told us were obstacles to building a stable team include assigning floating or nonpermanent supervisory personnel, rotating physicians on and off the ward, and the monthly rotation of student nurses and doctors. Clinicians' Perspectives on Leaders' Undercutting Trust: Physician 1: For 20 years, there was nothing but "blame and train." In the past, an adverse event or close call was associated with a person you had to blame, and the "fix" was to train them. Nurse 1: We have a panel of nurse managers who have discouraged adverse event reports for medication errors. I vow to encourage reporting errors without blame. We still have a way to go to be honest about reporting. Nurse 2: I know of instances when staff reported adverse events, they were transferred, so that does not make staff comfortable reporting them. There is no trust of management.
Nurse 3: Decisions that affect our work are made without talking to staff or understanding our work situation. Physician 2: If you don't know what's going on, you invent it. Physician 3: The most critical change needed at this facility is in the area of leadership. Leaders are ineffective because they are not good at communication. We hear about reasons why we are blamed. This causes a feeling of distrust. Physician 4: Leadership has little grasp of patient care and, thus, policy directives have little impact. If we're given a policy to spend a maximum of 20 minutes per patient, including completing records, I do what the patient needs. Management can just yell at me. [End of text box] Communication: Staff indicated that communication in the workplace affects trust and comfort in reporting. Further, they told us that communication is challenging, since it involves coordinating tasks with and between leaders and teams and their empowerment, all of which can be problematic in the medical setting. Some VA staff told us that unequal power relationships and hierarchical decision making are often obstacles to patient safety. They also elaborated on the kinds of communication that support patient safety, including empowering staff so that they can be heard. Traditionally, a nurse's status is lower than a physician's in hospitals, and some nurses could find it difficult to speak up in disagreement with physicians. For patients to be safe, however, nurses indicated they wanted to be empowered to openly disagree with physicians and other staff when they found an unsafe situation. For example, nurses told us that they had to speak up when they disagreed with the medication or dosage doctors had ordered. They also said that they had problems when physicians telephoned nurses and gave directions orally when policy stated that physicians' orders must be written. The clinicians spoke to us about empowerment and their involvement or lack of involvement in decision making. 
"Clinicians' Perspectives on How Communication Promotes Trust" gives some examples of what they told us about communication that they believed supports patient safety. Clinicians' Perspectives on How Communication Promotes Trust: Nurse 1: We interact with doctors and nurses in clinic. If something happens, we share with one another about how we might have done it differently. This goes on daily. Nurse 2: The director of the medical facility is a good communicator. He keeps us informed. He maintains a personal newsletter. Our nurse manager is well-rounded and she listens. Nurse 3: Peers and coworkers communicating with one another supports patient safety. For instance, sometimes we have patients who have a history of violence. This information is reflected in the computer and comes up when they "chart them in," but sometimes a nurse may still not know of such a history. Therefore, in the nurses' reports, the history of violence and the need for caution is passed on. Extra information about the patients can also help nurses de-escalate confrontations between patients. Physician 1: VA's Computerized Order Entry system [a computerized method for ordering medications] promotes patient safety. Before, it was hard to read the physicians' handwriting. The Computerized Order Entry at least eliminated the legibility problem. They do not have Computerized Order Entry at the university where I also work. VA also got rid of using Latin abbreviations. Now everything has to be written out. Physician 2: Open communication promotes team buy-in and therefore better customer service. Physician 3: We have a good department because staff can communicate their complaints. Nurse 4: We do an RCA on our own close call or adverse event or those from other sources, and then we present the results to the staff. I brought a PowerPoint briefing to our staff meeting about another hospital's wrong site surgery, so we could know what had happened. 
If JCAHO published an adverse event, I put it in our staff notes and have it discussed at the next staff meeting. Nurse 5: Management is more involved with the workers. It seems that they are listening more. Physician 4: Within the unit, we have good trust. Outside the unit, the administration has more trust and more communication. We're in the loop more. In the clinic, we have good trust in nurse-to-doctor and doctor- to-doctor relationships and with leadership. Physician 5: I reported a close call recently and feared blame, but it was not that way at all. It was a learning experience for all who heard about it. I think it's wonderful that VA has created this open atmosphere. Formerly, you might be a scapegoat, have backlash, and get a poorer rating. Today, we don't feel we're going to be punished. [End of text box] In "Clinicians' Perspectives on How Faulty Communication Diminishes Trust," we give clinicians' examples of management's undermining patient safety by deciding policies without consulting them, as when nurses were not included in decision making. Such policies sometimes proved dysfunctional or were ignored. Clinicians' Perspectives on How Faulty Communication Diminishes Trust: Nurse 1: I have to double-check changes in supplies in order to safeguard patients, because Supply often sends ABC instead of XYZ. Since we're not included in decisions about product changes, we're forced to continually double-check Supply to keep patients safe. Nurse 2: We have poor communication between other units and the radiology unit. They send incontinent or violent Isolation [contagious] patients without notifying X-ray staff to be wary. [End of text box] Facility staff also wanted additional and more timely feedback on what happens to their reports of close calls, adverse events, and the results of RCAs. Some Patient Safety Managers often felt too busy to provide feedback to staff because their jobs included a number of activities, including facilitating RCAs. 
At one facility, Patient Safety Managers routinely reported system changes back to staff who made the reports, but at the other facilities, they did not have a routine way of doing this. Many staff at the four facilities told us that they did not know the recommendations of the RCA teams or the results of close call or adverse event reports. NCPS agrees that feedback to staff is necessary but inadequate, and it plans to focus on the need for feedback at facilities in the near future. NCPS's Web site publicizes selected results of RCAs and alerts and system changes that result from reporting. Some of what VA's leaders and frontline clinicians told us about the need for more feedback is presented in "Facility Staff Concerns about Limited Feedback.” Facility Staff Concerns about Limited Feedback: Nurse Manager: We do a good job of following up on close call or adverse event reports in my unit, but not as good a job following up on the recommendations from RCAs. I was able to implement the action items right away in my unit after I participated in an RCA on patients' falls, but other nurse managers didn't hear about the results from the RCA for 2 or 3 months. The RCA teams develop really good ideas, but we need follow-through to make sure everyone knows that this is what we're going to do to change the system. Delays result from organizational routing and financial constraints. Even when the recommendation is signed, sometimes there's a delay getting the information down to the nurse managers. Physician 1: There should be an annual report of actions taken as a result of reporting adverse events and close calls. For example, if three units have developed a different way of labeling medication that used to be labeled alike, then the rest of the staff should know about it. [This was a reference to medication that looks alike and confuses staff. One solution is for the pharmacy to buy the two medications from different manufacturers so that the labels will be different.] 
It makes people feel better to know the information they reported helped make things better. I'd make sure that the information on improved medical care gets reported back to the staff. Administrative Official: The distribution of RCAs has been limited to staff responsible for the action or system change, but in the future the results will be distributed more broadly. Physician 2: I haven't heard any results from the RCAs. A pamphlet on the results would be a good idea. Note: "Administrative Official" is a title we used in this report to keep identity confidential. [End of text box] Workflow and Professional Training: In addition, staff spoke to us frequently about workflow issues--how safely handing off tasks between shifts and teams required trust but could cause mistrust when the transition was not smooth or efficient. VA clinicians clarified for us that mutual trust could be either gained or lost between workers and units, depending on coordination. And they drew conclusions about the importance of the quality and nature of workflow to patient safety. Clinicians also elaborated on aspects of the values they learned in training that did not facilitate a blame- free workplace. They indicated that shifting patient care between groups was an ongoing challenge to patient safety. For analysis purposes, we found these issues in continuity of care to be part of the larger problem of workflow, because they entailed the coordination of tasks and communication within and across teams. In the views of the clinicians at the facilities we studied, if staff, teams, or units begin to feel they cannot adequately communicate their patients' needs for care because of workflow problems, then trust may be lost, in turn diminishing patient safety.[Footnote 31] At one facility, where trust and comfort were lower than at the others, clinicians told us that workflow failures diminished trust and threatened patient safety. 
In "Clinicians on Workflow Problems and Patients' Safety," some physicians and nurses talk about these problems and how they tried to find solutions to promote patients' safety. Clinicians on Workflow Problems and Patients' Safety: Nurse 1: Some units are less particular about paperwork and records than others, so when we transfer patients, their information is sometimes incomplete. Patients don't come back to my unit as quickly from one unit as from other units, and sometimes their information is not available. Physician: Personnel tends to lose things, and this makes it hard to recruit new staff. Nurse 2: We often have difficulty getting the supplies we need. For example, it's especially difficult to obtain blood on the night shift. Nurse 3: At the change of a shift, I had to discharge one patient and admit another. Since I couldn't do both at the same time, I chose to admit but not to discharge. But my relief nurse expressed unhappiness about the situation, suggesting that I had left my work for another crew to do. I spoke with the relief nurse, and the problem of mistrust was resolved when everyone understood the work context better. When people communicate across shifts this way, they have a better understanding of and appreciation for one another. Nurse 4: I go to the ward before my shift starts to make sure the patients' wounds have been properly dressed. I take dressings to homebound patients when they weren't sent home with them. I cultivate motivated individuals from the ward staff, letting them see the procedures in the Dialysis Unit, and give them responsibility for those patients when they're back on the ward and reward them. I stock snacks because feeble elderly patients are sent to Dialysis without breakfast, and then they're expected to get to breakfast after their dialysis session and pay for their own meal. I see this situation as inherently unsafe, so I supply them with free snacks. 
[End of text box] The professional values physicians and nurses learned in their formal education or on the job can also be an obstacle to the Program, because these values do not always foster a nonpunitive atmosphere. Some of the values clinicians have been trained in run counter to the Program's expectations for open reporting, as we show in "Clinicians' Professional Values and the Patient Safety Program.” Clinicians' Professional Values and the Patient Safety Program: Nurse 1: There is much trust within the nursing profession. We have to trust each other because of the critical nature of passing patients from one shift to another. Nurse 2: The only group I worry about is Clerical. Their work is frontline and high-stress, but it's entry level, so they may have never worked in a hospital before. We have to double-check their work because there's no system in the clinic to verify orders, as there is in the hospital. Nurse 3: We trust those we work with. The exception is Housekeeping. We have to continually call to complain about the cleanliness of the clinic. Nurse 4: Nurses have a value system in which we "eat our young," which undercuts comfort in reporting errors. Traditionally, older nurses taught younger ones their way of doing things, and the younger ones were punished when they failed to do things that way. Now, we must allow nurses to do things a new way without punishment. Nurse 5: I keep hearing that we're looking to learn and not blame. Nursing culture is a blaming culture, and [the Patient Safety Program] is helping to stop this. Nurse 6: The model in nursing is "a nun with a ruler.” Physician 1: The culture is changing, but it's taking a while. I'm impressed with administration here that tries to say, "How can we learn from this?" Physician 2: To promote the Program, you have to have a change to a no-blame culture. Physician 3: Clinicians have to stop blaming each other and learn from their mistakes. 
[End of text box] VA clinicians explained that nurses see themselves as the patients' first and last guard against harm during care. Nurses are expected to double-check physicians' orders, medicines, and dressings and, for example, to prevent falls or suicide attempts. Generally speaking, in their traditional role, nurses feel personally responsible for patients' welfare and are designated to fulfill that role. They hold fast to protocols as safety devices, follow rules, and double-check work orders. Some spoke favorably of a bygone era when nurses could be counted on to back up one another, while many others thought this described their current work environment. In contrast, VA staff told us that physicians have thought of themselves as taking more original and independent actions but not as part of a multidisciplinary team. Their actions, based on traditional professional values, would thus undercut mutual trust. Physicians told us that patient safety would be improved if they were better trained to work on teams. Both nurses and physicians face many obstacles to improving patients' safety in the increasingly complex and ever-changing world of medicine. VA clinicians take seriously their mission as caretakers of the nation's veterans, many of whom are older and have multiple chronic diseases, making these efforts to improve patient safety even more challenging. Many told us that they feel ethically and morally bound as frontline caretakers to keep their patients safe by reducing the number of adverse events and close calls. Improving Assessment of, Familiarity with, Participation in, and Cultural Support for the Program: Although VA conducted a cultural assessment survey in 2000 and plans to resurvey VA staff in the near future, it has not measured staff familiarity with, participation in, and cultural support for the Program. 
For example, it did not ask about staff knowledge and understanding of key concepts (close call reporting, RCAs, and VA's confidential reporting system to NASA) or RCA participation. Although the 2000 survey did describe some important attitudes about patient safety, such as shame and punishment related to reporting adverse events, it did not explicitly measure mutual trust among staff, a central theme of VA clinicians in describing what affected patient safety and a supportive culture. Finally, while NCPS staff asked each facility to administer the survey to a random sample, many facilities did not follow their directions. The VA survey may serve as a baseline measure of national trends, but it could not be used to identify facility-level improvements or interventions.[Footnote 32] Summary: We found that three of the four facilities had a supportive culture that allowed staff to trust one another and feel comfortable reporting close calls and adverse events. At the fourth site, clinicians told us their facility had an atmosphere of fear and blame that did not support the Program. Content analysis revealed the most frequent themes were effective leadership, good two-way communication, clinicians' professional values, and workflow. [End of section] Chapter 4: Promoting Patient Safety: Successful management actions at one facility had resulted in the most complete adoption of safety practices under the Program at the time of our study. These actions included (1) storytelling, a well-documented oral tradition in medicine, to show changes in norms and values; (2) teaching, coaching, and role modeling for open communication throughout the hierarchy; and (3) offering rewards for participation in close call reporting. Clinicians at that facility pointed to these practices, which facilitated patient safety and their adoption of the Program's concepts and activities. 
The three other facilities used some or few of these practices; nonetheless, clinicians there proposed them as potentially good ways to improve patient safety. While our work reflects the clinicians' views at the four facilities we studied, these findings correspond with other studies of organizations' attempts to change culture.[Footnote 33] Using Storytelling to Promote Culture Change: VA leaders at some facilities we studied showed staff they support the Program by telling stories. They used the stories to publicly demonstrate a changed and open atmosphere for learning from adverse events and close calls, for example. While leaders must still distinguish episodes that warrant professional accountability, they must fairly draw the line between system fixes and performance issues.[Footnote 34] One way to do this is by repeating stories that demonstrate that VA leaders encourage a culture that supports the Program and an atmosphere of open reporting and learning from past close calls and adverse events. Leaders supported the Program by telling staff stories that demonstrated a systems change to safeguard patients after a medical adverse event was reported.[Footnote 35] Storytelling has a long tradition in medicine as a way of teaching newcomers about a group's social norms.[Footnote 36] One leader shared with us the story he used to kick off VA's Patient Safety Program. Each time he tells the story, he confirms the importance of changing VA's culture and helps transform the organization because staff remember it. Instead of dismissing an employee who has reported not giving a patient the drug the patient was supposed to receive, the leader judged the adverse event to be a systems problem. 
In discussions with NCPS, the leader recognized that this story was an opportunity to show his staff that the facility was following the Program by taking a systems rather than a disciplinary approach and to highlight that reporting close calls and adverse events was critical in changing the patient care practice so that such problems would not recur. "Leaders' Effective Promotion of Patient Safety in Staff Meetings" contains another example of storytelling to change communication practice. Leaders' Effective Promotion of Patient Safety in Staff Meetings: [The Administrative Official met with a unit leader and about 20 physicians and residents.] Administrative Official: The Patient Safety Program includes close calls as reportable incidents. [That is, VA is accepting staff reports of close calls.] A culture change is needed at VA, brought about by sharing a vision of what is valuable to us. We also want to show that leadership endorses the Program. [He walked the meeting through an aviation example that showed that the first officer should have challenged the captain, raising parallels with failure to question authority--or to "cross-check"--at this facility. He asked the group how they challenged authority effectively. Finally, he introduced RCAs as a new type of system analysis. Physicians continued their discussion.] Physician 1: Cross-checking is more effective if it's not hostile. Physician 2: There are fewer errors in medical settings where there's a stable team, but recently VA has been trying to do things more quickly with fewer staff. Physician 3: Communication is a problem on my unit, where we have 28 contract nurses. Physician 4: Could it be bad if one unit reported a lot of close calls? Physician 5: [in a leadership position]: VA has 50 years of being punitive. The Patient Safety Managers will be looking for patterns across a large number of reports, not seeking to blame individuals. 
Physician 6: Why can't the reporting simply be open and the names of the reporters known? [Several members of the meeting talked about the fear of punishment that still existed.] Physicians 7 and 8: Are the forms discoverable? Can they be subpoenaed? Can the reports be anonymous? [In a subsequent interview, leaders told about how the Program was progressing.] Leader 1: We must change doing what you're told without questioning orders. We tell nurses that it's OK to challenge physicians in an atmosphere of mutual respect. We're establishing it as a facility goal, keeping it on the front burner and keeping it a priority. Leader 2: Since leaders began visiting staff meetings to get the word out on close call reporting, we've noticed a change--a significant reduction in the fear of reporting close calls. Not all fear is gone, but the close call program is a success. Leader 3: Leadership raised safety consciousness with the close call airplane accident lesson. If it had been handed to us as just another memo, it might have been thrown away, but when leaders are there in person to answer questions, then it raises people's awareness of patient safety. Physician 1: Leadership here went out and talked about patient safety. Their support and emphasis and bringing their level of importance to it made the Program happen. [End of text box] Deliberate Teaching, Coaching, and Role Modeling: Staff at one facility told us that VA's leadership supported the Program and the patient safety culture by teaching, coaching, and role modeling patient safety concepts to their staff in more than a hundred small meetings. VA's leaders had a three-part agenda in their initial staff meetings. First, they taught a scenario in which two pilots failed to communicate well enough to avoid a fatal crash. The first officer did not cross-check and challenge an order from his captain to descend in a wind shear, resulting in the plane's crashing and killing 37 people. 
Facility leaders depicted the strong parallels--including the communication effects of unequal power relationships and hierarchical decision making discussed earlier--between the pilots' communication to save the plane and clinicians' communications to save the patient. Second, they discussed the importance of communications in medical care, coaching lower-level staff to speak up when they saw adverse events and emphasizing the importance of two-way communication. Finally, they introduced a new close call reporting program at the facility and modeled their support for this type of reporting in introducing the new Program and its elements. "Leaders' Effective Promotion of Patient Safety in Staff Meetings" presents a portion of one such meeting and also interviews with VA staff when they discussed how the staff meetings had raised their consciousness about patient safety. "Leaders' Effective Promotion" represents more than a hundred small meetings conducted at one facility that successfully demonstrated that patient safety was a priority for the organization. When top leaders attended staff meetings, staff listened to their message. It may be no coincidence that this facility had the highest rating for comfort in reporting, according to the findings of our survey. Many staff at this facility told us that because their top leaders spoke to them about the Program, they concluded that the Program and its culture change were a priority for their leaders. Midlevel staff also acknowledged progress but admitted to some remaining fear. Participants heard their leaders say that challenging authority--here called "cross-checking"--was important for patient safety. They were asked to compare their own communication patterns with the aviation crew's communication in a similarly high-risk setting that depended on teamwork. 
The administrative official at the medical facility meeting, drawing an analogy between the aviation example and participants' work, noted that an RCA had found that an adverse event could have been prevented if authority had been challenged. His message to the meeting's participants was that VA's leadership saw cross-checking as acceptable and necessary. Rewarding Close Call Reporting: The same facility that held small meetings for staff developed a close call reward system that reinforced the idea that reporting a close call not only did not result in punishment but was actually rewarded. Staff initially feared that the close call program would create a negative atmosphere, with staff telling on one another, but this did not occur. Few close calls had been reported at this facility before the reward program began. In the first 6 months of the program, 240 close calls were reported. While we were visiting the Patient Safety Managers, many staff called them to report close calls; each staff member was given a $4 cafeteria certificate. Patient Safety Managers at this facility told us that they rewarded reporting, no matter who reported or how trivial the report. The unit with the month's best close call received a plate of cookies. The Patient Safety Manager reported that a milestone had been reached when a chief of surgery reported a close call--a first for surgery leadership. "Rewarding Close Call Reporting" paraphrases leaders and clinicians on the success of the close call program at their facility. Rewarding Close Call Reporting: Leader 1: With the close call program, the wards do not feel as secretive. VA leadership thought the new close call program might cause staff to turn on one another and begin to blame one another for reporting close calls, but this has not happened. Nurse 1: People are rewarded for reporting close calls and adverse events--and not punished. Nurse 2: I feel comfortable about reporting close calls and adverse events. 
When management first introduced the close call program, we thought everyone was going to tell on each other. If everyone starts to find out things about you, you could lose your job, because it could be on your record. You would have to ask yourself, "Is this something I would really want to tell someone about?" We thought it would be like "Big Brother Is Watching You." But that is not what it's like. I feel comfortable reporting close calls and adverse events. Administrative Official: To promote patient safety, we did a lot of reward and recognition to let staff know that what they have done [reporting close calls and adverse events] is important. [End of text box] Other facilities did not have as extensive a reward system. At one facility, the Patient Safety Manager had recently given a certificate to someone who had done a good job in describing an adverse event. However, at another facility, the quality manager who supervised Patient Safety Managers told us that she thought it improper to reward staff for reporting: She did not want to reward people for almost making a mistake. Clinicians in our interviews, however, pointed to the need to develop reward programs around patient safety. For example, one nurse said that if she were the director, she would call staff to thank them for reporting close calls and adverse events and would develop a reward system. Summary: We found that leaders used three management strategies at one facility that promoted the Program: (1) storytelling; (2) teaching, coaching, and role modeling open communication in staff meetings; and (3) offering rewards for participation in close call reporting. These strategies changed clinicians' attitudes and behavior: clinicians came to believe that the Program was an organizational priority, and they acted on this belief by reporting more close calls. An important part of the Program is encouraging close calls to surface so that safeguards can be established before patients are harmed. 
[End of section] Chapter 5: Conclusions and Recommendations: Five years into VA's Program to improve the safety of patients' care at its medical facilities, we found progress at certain facilities but continuing barriers to the Program's adoption at others. Having recognized the risks to patients that are inherent in medical care, VA seeks with its Program to identify and fix system flaws before they can harm patients. To successfully change its culture, VA acknowledges that it is necessary to change staff attitudes, beliefs, and behavior from those of fear of blame to open willingness to report close calls and adverse events. The fear is rooted in, and reinforced by, many years of professional training and experience in medical care settings. In the four facilities in which we studied the Program's progress, we were able to measure significant differences in clinicians' familiarity with and participation in the Program and the levels of cultural support for it. We conclude that progress in patient safety could be facilitated if VA's program efforts focused on facilities where familiarity with the Program's major concepts is low--concepts such as close call reporting, the NASA confidential reporting program, and RCAs--and on the facilities where participation in RCAs and levels of cultural support for the Program are low. VA may be able to use lessons learned by focusing on clinicians' perspectives to prioritize future actions to further the goal of patient safety. VA should have tools available to determine which facilities face barriers to adopting the Program and, therefore, need assistance in stimulating culture change and promoting the Program. VA is to be commended for conducting a cultural survey that showed staff attitudes toward safety at the national level. However, since it was not a random survey, it was not effective in discerning staff attitudes at the local level. 
In addition, VA has not measured staff knowledge of the Program, staff participation in RCAs, or whether facility staff have enough mutual trust to support the Program. VA may be able to adapt measures we have suggested, such as adding to its survey some of our questions that focus on these issues, so as to identify facilities for specific interventions and assess the Program's progress at the local and national levels. Measuring Clinicians' Familiarity with and Cultural Support for the Program: Clinicians' familiarity with the Program and opportunities to participate in RCAs could be measured at each facility in order to identify facilities that require specific interventions. Because low familiarity or participation can hinder the success of the Program, VA could attempt to measure and improve basic staff familiarity with the Program's core concepts and ensure opportunities to participate in RCA teams. Our study developed measures of familiarity with and participation in the Program by analyzing responses from interviews of a small random sample of clinicians, and these could be further developed into useful measures in a larger study. These measures could also be developed into goals to be achieved nationally and, more importantly, locally for each facility. According to the clinicians we interviewed, the supportive culture of individual facilities plays a critical role in clinicians' participation in the Program and warrants priority attention from VA's leadership. In one of the three facilities where staff had above-average familiarity with the Program, staff told us that fear prevented them from fully participating in the Program. From the clinicians' vantage point, their leaders need not accept given levels of mutual trust or comfort in reporting close calls and adverse events; instead, once facilities are identified as having low cultural support for the Program, that can be a starting point for change. 
In our conversational interviews, clinicians consistently pointed to specific workplace conditions that fostered their mutual trust and comfort in reporting. Notably, management can take actions to stimulate culture change by developing a work environment that reinforces patient safety. Drawing from their own experience, clinicians had views that were consistent with many studies of culture change in organizations, indicating that leaders' actions and open communication are important in the transformation sought under the Program. We were able to directly observe practices that have convinced frontline workers that the Program is a priority for VA, that it is worth their while to participate in it, and that by doing so medical facilities are safer for patients. These practices included leadership's demonstrating to staff that patient safety is an organizational priority--for example, by coaching and by communicating safety stories in face-to-face meetings with all staff--and that the organization values reporting close calls because it rewards and does not punish staff for reporting them. Recommendations for Executive Action: To better assess the adequacy of clinicians' familiarity with, participation in, and cultural support for the Program, we recommend that the Secretary of Veterans Affairs direct the Under Secretary for Health to take the following three actions: 1. set goals for increasing staff: * familiarity with the Program's major concepts (close call reporting, confidential reporting program with NASA, root cause analysis), * participation in root cause analysis teams, and: * cultural support for the Program by measuring the extent to which each facility has mutual trust and comfort in reporting close calls and adverse events; 2. develop tools for measuring goals by facility; and: 3. develop interventions when goals have not been met. Agency Comments and Our Evaluation: We provided a draft of this report to VA for its review. 
The Secretary of Veterans Affairs stated in a December 3, 2004, letter that the department concurs with GAO's recommendations and will provide an action plan to implement them. VA also commented that the report did not address the question of whether VA's work in patient safety improvement serves as a model for other healthcare organizations. GAO's study was not designed to evaluate whether VA's program was a model, compared with other programs, but was limited to how the program had been implemented in four medical facilities. VA also provided several technical comments that we incorporated as appropriate. [End of section] Appendix I: Content Analysis, Statistical Tests, and Intercoder Reliability: Content Analysis: To analyze the data we collected, we used content analysis, a technique that requires that the data be reduced, classified, and sorted. In content analysis, analysts look for, and sometimes quantify, patterns in the data. We conducted tests on clinicians' responses to our key variables and found a number of significant differences. We also conducted intercoder reliability tests--that is, we assessed the degree to which coders agreed with one another. The tests showed that the consistency among the coders was satisfactory. Ethnography: Ethnography is a social science method, embracing qualitative and quantitative techniques, developed within cultural anthropology for studying a wide variety of communities in natural settings. It allowed us to study the Program in VA's medical facilities. Ethnography is particularly suited to exploring unknown variables, such as studying what in VA's culture at the four facilities affected the Program. In our open-ended questions, we did not supply the respondents with any answer choices. We allowed them to talk at length, and therefore the interviews lasted anywhere from a half hour to an hour or more. Ethnography is also useful for giving respondents the confidence to talk about sensitive topics. 
We anticipated that clinicians would find the study of VA's medical facility culture, including staff views of close calls and adverse events, a sensitive subject. Therefore, we gave full consideration to the format and context of the interviews. Although ethnography is commonly associated with lengthy research aimed at understanding remote cultures, it can also be used to inform the design, implementation, and evaluation of public programs. Governments have used ethnography to gain a better understanding of the sociocultural life of groups whose beliefs and behavior are important to federal programs. For example, the U.S. Census Bureau used ethnographic techniques to understand impediments to participation in the census among certain urban and rural groups that have long been undercounted.[Footnote 37] Data Collection: We conducted fieldwork for approximately a week at each of two facilities, for 3 weeks at a third, and for 25 days at the fourth. Although ethnographers traditionally conduct fieldwork over a year or more, we used a more recent rapid assessment process (RAP). RAP is an intensive, team-based ethnographic inquiry using triangulation and iterative data analysis and additional data collection to quickly develop a preliminary understanding of a situation from the insider's perspective.[Footnote 38] We drew two samples, one judgmental and one random. To understand how the Program was implemented at each medical facility, we conducted approximately a hundred nonrandom interviews with facility leaders, Patient Safety Managers, and a variety of facility employees at all levels, from maintenance workers, security officers, nursing aides, and technicians to department heads. This allowed us a detailed understanding of how the Program was implemented at each facility. 
To ensure that we represented clinicians' views at all four facilities, we selected a random sample of 80, using computer-generated random numbers from an employee roster of clinicians, yielding 10 physicians and 10 nurses at each facility.[Footnote 39] While this provided us with a representative sample of clinicians (physicians and nurses) from each facility, the size of this sample was too small to provide a statistical basis for generalizing from our survey results to the entire facility or to all facilities. For both samples, we used a similar semistructured questionnaire (see app. III). It consisted of mostly open-ended questions and a few questions with yes-or-no responses. At every interview, we asked staff for their ideas, and we incorporated a number of their perspectives into this report. A hallmark of ethnography is its observation of behavior, attitudes, and values. Observation is conducted for a number of purposes. One is to allow ethnographers to place the specific issue or program they are studying in the context of the larger culture. Another, in our case, was to allow some facility staff to feel more comfortable with us as we interviewed them. Both purposes worked for us in this study. Because we had observed meetings and RCA teams at work, we could better understand respondents' answers. Respondents noted how comfortable they were in talking to us and how different our conversational interviews were from other interviews they had experienced in the past. We observed staff in their daily activities. For example, we accompanied a nurse while she administered medication using bar code technology that scans the medication and the patient's wristband. We also observed staff at numerous meetings, including RCA team meetings, patient safety conferences, patient safety training sessions, staff meetings in which patient safety was discussed, and daily leadership meetings. Our methodology included collecting data from facility records. 
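The stratified random draw described above--10 physicians and 10 nurses selected with computer-generated random numbers from each facility's employee roster--can be sketched in a few lines of Python. This is an illustration only; the roster contents and fixed seed are hypothetical stand-ins, not GAO's actual procedure:

```python
import random

def draw_clinician_sample(roster, per_group=10, seed=2002):
    """Draw a simple random sample of clinicians from one facility.

    `roster` maps profession -> list of employee identifiers
    (hypothetical stand-ins for the employee roster described above).
    """
    rng = random.Random(seed)  # a fixed seed makes the draw repeatable
    return {profession: rng.sample(ids, per_group)
            for profession, ids in roster.items()}

# Four facilities x (10 physicians + 10 nurses) = 80 clinicians overall.
facility_roster = {
    "physician": ["MD%03d" % i for i in range(45)],
    "nurse": ["RN%03d" % i for i in range(120)],
}
sample = draw_clinician_sample(facility_roster)
```

Sampling without replacement, as `random.sample` does, ensures no clinician is interviewed twice at a facility.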
We examined all close calls and adverse events reported for a 1-month period and all RCA reports conducted at each facility, and we reviewed administrative boards and rewards programs. We read minutes from patient safety committees and other committees that addressed safety issues. Data Analysis: Our data were mostly recorded, but some interviews were written, depending on respondents' permission to record. Using AnnoTape, qualitative data analysis software, we coded the interviews for both qualitative and quantitative patterns, and we used the software to capture paraphrases for our analysis. We developed a prescriptive codebook to guide the coders in identifying interviews and classifying text relevant to our variables. After several codebook drafts, we agreed on common definitions and uses for the codes. In the content analysis of our random sample data, we looked for patterns, associations, and trends. AnnoTape allowed us to mark a digital recording or transcribed text with our codes and then sort and display all the marked audio or text bites by these codes. Because all the coders operated from a common set of rules, we achieved a satisfactory intercoder reliability score. AnnoTape also allowed us to record prose summaries of the interviews, some of which paraphrased what the clinicians said; the paraphrases we present in the report reflect the range of views and perceptions of VA staff at the four medical facilities. A rough gauge of the importance of their views is discernible in the extent to which certain opinions or perceptions are repeatedly expressed or endorsed. Using the statistical package SAS, we analyzed the variables with two-choice and three-choice answers and transferred them to an SAS file for quantitative analysis. Among the quantifiable variables were five yes-or-no questions asking about respondents' familiarity with key elements of the Patient Safety Program. 
We created a new variable that reflected a composite familiarity score for the Program, using the five questions about familiarity with the key elements (the questions are listed in the note to fig. 4). We also assessed respondents' levels of comfort in reporting close calls and adverse events and mutual trust among staff at each facility, based on each whole interview. We used these two assessments, rated high, medium, or low, to characterize cultural support for the Patient Safety Program. In quantifying verbal answers for display and comparison purposes, we decided that the maximum individual familiarity, trust, and comfort levels should be 10. Thus, in each key elements question, we let "yes" equal 2 and "no" equal 0, ensuring that an individual who knew all of the five elements would achieve a composite score of 10. Finally, we averaged composite scores to get an average score for each facility. In the trust and comfort summary judgments, we let "high" equal 10, "medium" equal 5, and "low" equal 0. Rather than display these numbers, we used a scale of high, medium, and low for 10, 5, and 0 and placed the answers accordingly. Significance Testing: We were able to determine statistically significant differences in clinicians' responses by facility and, unless otherwise noted, we report only significant results. First, we conducted a nonparametric statistical test, called Kruskal-Wallis, on all possible comparisons in the subset of variables that we report in our text.[Footnote 40] Four of these variables were central to the report: comfort summary score, trust summary score, close call score, and root cause score. In the Kruskal-Wallis test, each observation is replaced with its rank relative to all observations in the four samples. Tied observations are assigned the midrank of the ranks of the tied observations. The sample rank mean is calculated for each facility by dividing its rank sum by its sample size. 
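The scoring rules and the Kruskal-Wallis rank-replacement step described above can be sketched in a few lines of Python. This is an illustration only; the answer lists in the examples are hypothetical, not actual interview data:

```python
def composite_familiarity(answers):
    """Composite familiarity score: each of the five yes-or-no key
    elements questions contributes 2 for "yes" and 0 for "no", so a
    respondent who knows all five elements scores 10."""
    return sum(2 if a == "yes" else 0 for a in answers)

# Trust and comfort summary judgments, placed on the same 0-10 scale.
SUMMARY_SCORE = {"high": 10, "medium": 5, "low": 0}

def rank_means(groups):
    """Kruskal-Wallis rank step: replace every observation with its
    rank across all groups (tied observations get the midrank), then
    return each group's rank sum divided by its sample size."""
    pooled = sorted(x for group in groups for x in group)
    positions = {}
    for i, value in enumerate(pooled, start=1):  # 1-based ranks
        positions.setdefault(value, []).append(i)
    # midrank of a value = average of the positions it occupies
    midrank = {v: sum(p) / len(p) for v, p in positions.items()}
    return [sum(midrank[x] for x in group) / len(group)
            for group in groups]
```

If the sampled populations were identical, the rank means come out roughly equal; for example, `rank_means` applied to two identical groups returns the same value for both, while clearly separated groups yield clearly different rank means.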
If the four sampled populations were actually identical, we would expect our sample rank means to be about equal--that is, we would not expect to find any large differences among the four medical facilities. The Kruskal-Wallis test allows us to determine whether at least one of the medical facilities differs significantly from at least one other facility. This test showed that--for each of the comfort, trust, and close call variables--at least one of the medical facilities differed significantly from at least one of the other medical facilities. Next, we conducted a follow-up test to determine specifically which pairs of medical facilities were significantly different from other pairs on key variables. This follow-up test is a nonparametric multiple comparison procedure called Dunn's test.[Footnote 41] Using Dunn's test meant testing for differences between six pairs of medical facilities: A vs. B, A vs. C, A vs. D, B vs. C, B vs. D, and C vs. D. Table 4 presents the results of Dunn's test, along with each facility's sample rank mean and sample size. The pairs of facilities that are statistically significantly different from one another are in the far right column. Note that for the root cause characteristic, there are no statistically significant findings from the multiple comparison testing, which conforms to the results of the earlier Kruskal-Wallis test on root cause. Table 4: Nonparametric Multiple Comparison Results: Characteristic: Comfort; Facility A: 25.5 (20); Facility B: 49.4 (20); Facility C: 43.6 (19); Facility D: 41.7 (20); Statistically significant comparison[A]: A vs. B***, A vs. C***, A vs. D**. Characteristic: Trust; Facility A: 28.8 (19); Facility B: 44.4 (21); Facility C: 46.3 (20); Facility D: 41.7 (20); Statistically significant comparison[A]: A vs. B*, A vs. C**. Characteristic: Close call[B]; Facility A: 38.5 (20); Facility B: 49.2 (20); Facility C: 42.7 (20); Facility D: 26.3 (18); Statistically significant comparison[A]: B vs. 
D***, C vs. D**. Characteristic: Root cause[C]; Facility A: 43.0 (20); Facility B: 39.4 (21); Facility C: 43.1 (20); Facility D: 36.3 (19); Statistically significant comparison[A]: None. Source: GAO analysis. Note: Numbers are sample rank means and, in parentheses, sample sizes. [A] Significance levels 0.0250, 0.0167, and 0.0083 are indicated by three, two, and one asterisks, respectively. These significance levels were determined by dividing overall significance levels 0.15, 0.10, and 0.05, respectively, by 6, the number of comparisons. [B] A sum of scores on "Do you know what close call or near miss reporting is?" and "Do you know what the Patient Safety Reporting System to NASA is?"--a related subgroup of the knowledge questions. [C] A sum of scores on "Do you know what an RCA is?" "Have you participated in an RCA?" and "Do you know anyone who has participated in an RCA?"--a related subgroup of the knowledge questions. [End of table] Intercoder Reliability: Consistency among the three coders was satisfactory. We assessed agreement among the coders for selected variables for interviews with seven clinicians--that is, we assessed the extent to which they consistently agreed that a response should be coded the same. To measure their agreement, we used Krippendorff's alpha reliability coefficient, which equals 1 when coders agree perfectly or 0 when coders agree as if chance produced the results, indicating a lack of reliability.[Footnote 42] Our Krippendorff's alpha values ranged from 0.636 to 1.000 for nine of the selected variables (see table 5). Compared with Krippendorff's guidelines--that alpha be at least 0.8 for an acceptable level of agreement and from 0.667 to 0.8 for tentative acceptance--we believe our overall results are satisfactory. Table 5: Intercoder Reliability Assessment Results: Variable: Q2 Facility location; Krippendorff's alpha: 0.878. Variable: Q5 Respondent set; Krippendorff's alpha: 1.000. 
Variable: Q8 Respondent title; Krippendorff's alpha: 1.000. Variable: Q17 Change; Krippendorff's alpha: [A]. Variable: Q18 Promotes safety; Krippendorff's alpha: [A,B]. Variable: Q19 Undercuts safety; Krippendorff's alpha: [A,B]. Variable: Q20 Close call recognition; Krippendorff's alpha: 0.796. Variable: Q21 PSRS; Krippendorff's alpha: 0.818. Variable: Q23 RCA recognition; Krippendorff's alpha: [A,B]. Variable: Q24 RCA participation; Krippendorff's alpha: 0.808. Variable: Q25 RCA knows participant; Krippendorff's alpha: 0.636. Variable: Summary comfort score; Krippendorff's alpha: 0.757. Variable: Summary trust score; Krippendorff's alpha: 0.791. Source: GAO analysis. [A] For this question, we consider Krippendorff's alpha indeterminate: (1) the coders did not disagree (there was no variation) or (2) there was one disagreement among them but otherwise no variation. [B] To calculate Krippendorff's alpha, we used a computer program in N. Kang and others, "A SAS MACRO for Calculating Intercoder Agreement in Content Analysis," Journal of Advertising 22:2 (1993): 17-28. [End of table] [End of section] Appendix II: A Timeline of the Implementation of VA's Patient Safety Program: This timeline highlights the training programs and other events NCPS completed between 1997 and 2004. Year: 1997; Event: * VA announces a special focus on patient safety; * VA drafts patient safety handbook[A]; * VA develops Patient Safety Event Registry[B]. Year: 1998; Event: * Patient Safety Awards Program begins[C]; * Expert Advisory Panel is convened to look at reporting systems. Year: 1999; Event: * Four Patient Safety Centers of Inquiry are funded; * NCPS is established and funded[D]; * VA informs Joint Commission on Accreditation of Healthcare Organizations that it will go beyond JCAHO's sentinel event reporting system to include close calls; * VA pilots RCAs at six facilities; * Institute of Medicine issues To Err Is Human. 
Year: 2000; Event: * VA and NASA sign interagency agreement on the confidential Patient Safety Reporting System; * NCPS adverse event and close call reporting system established throughout VA; * NCPS trains clinical and quality improvement staff in patient safety topics, including the RCA process; * VA establishes Patient Safety Manager (hospital level) and Officer (network level) positions. Year: 2001; Event: * RCA training continues; * Online and print newsletter Topics in Patient Safety begins publication; * RCA software is rolled out; * Facilities and networks are given the performance measure of completing RCAs in 45 days; * Healthcare Failure Mode and Effect Analysis (HFMEA), a proactive risk assessment tool, is developed by VA and rolled out through multiple videoconferences. Year: 2002; Event: * Aggregate RCA implementation is phased in over the year[E]; * New hires are trained in RCAs, and Patient Safety Officers and Managers are given refresher training; * The Veterans Health Administration's Patient Safety Improvement Handbook, 3rd rev. ed. (VHA 1050.1), is officially adopted; * Facilities are given a new performance measure, requiring them to conduct proactive risk assessment using HFMEA to review contingency plans for failure of the electronic bar code medication administration system; * The American Hospital Association (AHA) sends Program tools developed by VA to 7,000 hospitals[F]; * Rollout of confidential reporting to NASA is largely complete. Year: 2003; Event: * Facility directors receive a day of training to reinforce what they could do to improve the success of their patient safety programs; * Facilities are given a performance measure for timely installation of software patches to critical programs; * VA begins to provide training, funded by the Department of Health and Human Services, for state health departments and non-VA hospitals as the "Patient Safety Improvement Corps, an AHRQ/VA Partnership". 
Year: 2004; Event: * Facility managers, for example, Nurse Executives and Chiefs of Staff, receive a day of patient safety training; * VA plans a patient safety assessment to document the Program's progress; * Directors are given the performance measure of timely verification of radiology reports. Source: NCPS and GAO. We updated the timeline at www.patientsafety.gov and revised it with input from NCPS. [A] Revising VA's patient safety handbook was one of the first tasks NCPS took on in 1999; it was finally published as Patient Safety Improvement Handbook, 3rd rev. ed. (VHA 1050.1) and officially adopted by VA in 2002. The handbook, now part of NCPS's training material, is available at VA's Web site. [B] VA's Safety Event Registry, developed in 1997, is an internal VA program for collecting data on adverse events. VA reports certain "sentinel events" to JCAHO. [C] According to NCPS, the Patient Safety Awards Program, begun in 1998, is no longer active. [D] In the report, we consider that the Patient Safety Program began in 1999, when NCPS was established. [E] Regularly held aggregate RCAs examine close call and adverse event reports that are grouped by commonly occurring events, such as falls. [F] In 2002, AHA sent Patient Safety Program tools that VA had developed to 7,000 hospitals. The tools were videotapes about the Program and guides on how to conduct RCAs. AHA believed these tools would help non-VA hospitals develop their own Programs on patient safety. [End of table] From 1999 through 2004, NCPS conducted training in the Patient Safety Program, attended primarily by quality managers and Patient Safety Officers and Managers. Typically, the training lasted 3 days and included an introduction to the new Patient Safety Improvement Handbook and small group training in the RCA process. 
Trainees, especially Patient Safety Managers, were expected to take the Program back to their medical facilities, collect and transmit reported adverse events and close calls to NCPS, and guide clinicians in the RCA teams. We observed health fairs at several of the four facilities. Beginning in 2003, NCPS convened medical facility directors and other managers in 1-day sessions that introduced them to the systemic approach to improving patient safety, including a blame-free approach to adverse events in health care. [End of section] Appendix III: Semistructured Interview Questionnaire: Interviewer, please fill out items 1-12. 1. Interview number: 2. Code name for VAMC: 3. Pseudonym: 4. File name: 5. From sample list: 6. Interviewer: 7. Person writing up interview: 8. Date: 9. Profession (circle or bold one) Nurse Doctor: 10. Title: 11. Unit: 12. Was informed consent signed? - yes -no: Questions to Ask Respondent: Background: 13. How many years have you had a license to practice in your specialty as a: doctor (years): nurse (years): 14. How many years have you worked at VA? 15. How many years have you worked at this medical center? 16. What are the specialties of the people you work with on a regular basis? 17. Tell me a little about what you do at work. Reciprocity: 18. To what extent do you perceive there is trust or distrust: (a) Within your profession at your VAMC? (nurses if nurse, doctors if doctor, etc.): (b) Within your unit or team? (c) Between your profession and other departments? Please provide examples. What else? Patient Safety: 19. In your time at VA, what changes have you seen with regard to patient safety at this medical center? Please provide examples. What else? 20. What do you find that supports an atmosphere that promotes patient safety? Please provide examples. What else? 21. What undercuts patient safety? Please provide examples. What else? Reporting: 22. Do you know what close call or near miss reporting is? 23. 
Do you know what the Patient Safety Reporting System to NASA is? 24. One of the goals of the Patient Safety Program is to create an atmosphere in which VA staff feel comfortable reporting adverse events and close calls without punishment or blame. To what extent do you think this is happening at your medical center? Please provide examples. What else? What more could be done? Root Cause Analysis: 25. Do you know what a root cause analysis (RCA) is? Explain. 26. Have you participated in an RCA? Please provide examples. Any other? 27. Do you know anyone who has? Please provide examples. Anyone else? Wildcard: 28. If you were in charge of the medical facility and you had all the money and staff you needed, what would you do to bring about the transformation to a patient safety culture? Suggestions for Focus of Study: 29. What else should we be focusing on or asking about patient safety? [End of section] Appendix IV: Comments from the Department of Veterans Affairs: THE SECRETARY OF VETERANS AFFAIRS: WASHINGTON: December 3, 2004: Ms. Nancy Kingsbury: Managing Director: Applied Research and Methods: U.S. Government Accountability Office: 441 G Street, NW: Washington, DC 20548: Dear Ms. Kingsbury: The Department of Veterans Affairs (VA) has reviewed the Government Accountability Office's (GAO) draft report, VA PATIENT SAFETY INITIATIVE: A Cultural Perspective at Four Medical Facilities (GAO-05-83). VA concurs with GAO's recommendations and will provide an action plan to implement the recommendations in our response to GAO's final report. VA notes your report demonstrates that the Department's efforts at patient safety improvement have resulted in significant accomplishments and identifies areas that merit special attention. 
Your data, indicating 78 percent of the clinicians interviewed knew of the Root Cause Analysis process, and 75 percent understood the concept of a "close call," suggest the successful penetration of the patient safety precepts that the Veterans Health Administration (VHA) has introduced. More importantly, however, this suggests the active participation of clinicians in patient safety improvement. While VA is pleased with the positive tone of your report, GAO did not address the question of whether VA's work in patient safety improvement serves as a model for other health care organizations. This is a significant question and was primary in GAO's proposal for this study in 2002. VA believes its work does serve as a model, and GAO should identify those aspects of the program that have led to the successes noted throughout their report. Thank you for the opportunity to review this draft report. Sincerely yours, Signed by: Anthony J. Principi: Enclosure: DEPARTMENT OF VETERANS AFFAIRS (VA) COMMENTS TO GOVERNMENT ACCOUNTABILITY OFFICE (GAO) DRAFT REPORT, VA PATIENT SAFETY INITIATIVE: A Cultural Perspective at Four VA Medical Facilities (GAO-05-83): To better assess the adequacy of clinicians' familiarity with, participation in, and cultural support for the Initiative, GAO recommends that the Secretary of Veterans Affairs direct the Under Secretary for Health to take the following three actions: 1. Set goals for increasing staff: * familiarity with the initiative's major concepts (close call reporting, confidential reporting program with NASA, root cause analysis): * participation in root cause analysis teams. * cultural support for the Initiative by measuring the extent to which each facility has mutual trust and comfort in reporting close calls and incidents. 2. Develop tools for measuring goals by facility. 3. Develop interventions when goals have not been met. Concur - The Department of Veterans Affairs (VA) concurs with GAO's findings and recommendation. 
A detailed action plan is being developed and VA will provide the action plan to GAO as part of VA's response to the final report. VA believes the critique in GAO's report regarding mutual trust may lead readers to believe that VA does not understand and address this topic. VA believes that the topic is understood and addressed. For example, the patient safety survey VA undertook in the year 2000 focused on many issues relevant to mutual trust. VA medical center directors received the results of the 2000 survey for their respective facilities, as well as national data, and the directors were able to use this information in local patient safety improvement efforts. Many of these questions were used in a new patient safety survey developed by the Agency for Healthcare Research and Quality, the Department of Defense and the American Hospital Association. VA will review the updated survey that is planned for implementation in 2005 and consider adding several questions to explicitly address the topic of mutual trust. [End of section] Appendix V: GAO Contacts and Staff Acknowledgments: GAO Contacts: Nancy R. Kingsbury (202) 512-2700, kingsburyn@gao.gov; Charity Goodman (202) 512-4317, goodmanc@gao.gov: Staff Acknowledgments: Additional staff who made major contributions to this report were Barbara Chapman, Bradley Trainor, Penny Pickett, Neil Doherty, Jay Smale, George Quinn and Kristine Braaten. Donna Heivilin, recently retired from GAO, also played an important role in preparing this report. [End of section] Glossary: Center of Inquiry: A research and development arm of NCPS's Patient Safety Program. The centers concentrate on identifying and preventing avoidable, adverse events, and each has a different focus. Close Call: An event or situation that could have resulted in harm to a patient but, by chance or timely intervention, did not. It is also referred to as a "near miss." Frontline Staff: Staff directly involved with patient care. 
Adverse Event: An incident directly associated with care or services provided within the jurisdiction of a medical facility, outpatient clinic, or other Veterans Health Administration facility. Adverse events may result from acts of commission or omission. Joint Commission on Accreditation of Healthcare Organizations: JCAHO is an accrediting organization for hospitals and other health care organizations. Medical Facility: A VA hospital and its related nursing homes and outpatient clinics. National Center for Patient Safety: NCPS is the hub of VA's Patient Safety Program, where approximately 30 employees work, in Ann Arbor, Michigan. Other employees work in the Center of Inquiry in White River Junction, Vermont, and in Washington, D.C. Patient Safety Reporting System: PSRS is a confidential and voluntary reporting system in which VA staff may report close calls and adverse events to a database at the National Aeronautics and Space Administration. Root Cause Analysis Team: An interdisciplinary group that identifies the basic or contributing causes of close calls and adverse events. [End of section] FOOTNOTES [1] Certain management practices are essential in creating safety within an organization and in the success of organizational change for improving patient safety: (1) balancing the tension between production efficiency and reliability (safety), (2) creating and sustaining trust throughout the organization, (3) actively managing the process of change, (4) involving workers in making decisions pertaining to work design and work flow, and (5) using knowledge management practices to establish the organization as a "learning organization." (See Ann Page, ed., Keeping Patients Safe: Transforming the Work Environment of Nurses, Washington, D.C.: National Academies Press, 2004, pp. 3-4.) Throughout this report, we refer to the various patient safety initiatives under the National Center for Patient Safety (NCPS) as the Patient Safety Program, or the Program. 
The initiatives we studied included adverse event and close call reporting, root cause analysis (RCA), and the confidential reporting system to the National Aeronautics and Space Administration (NASA). [2] GAO, Patient Safety Programs Promising but Continued Progress Requires Culture Change, GAO/T-HEHS-00-167 (Washington, D.C.: July 27, 2000), p. 3. [3] See, for example, Annick Carnino, "Management of Safety, Safety Culture and Self Assessment," http://www.iaea.or.at/ns/nusafe/publish/papers/mng_safe.htm (Feb. 19, 2002); Columbia Accident Investigation Board, The CAIB Report, vol. 1 (Arlington, Va.: Aug. 26, 2003), http://www.caib.us/ (Sept. 9, 2004); and David Gaba, "Structural and Organizational Issues in Patient Safety: A Comparison of Health Care to Other High-Hazard Industries," California Management Review 43:1 (Fall 2000): 83-102. A review of research on influences on collaboration also found that "mutual respect, understanding, and trust" appeared more often than any other factor to be a positive influence (see Paul Mattessich and others, Collaboration: What Makes It Work, 2nd ed. (St. Paul, Minn.: Amherst H. Wilder Foundation, 2001)). [4] Highly effective safety organizations share the following characteristics: (1) acknowledgment of the high-risk, error-prone nature of the organization's activities, (2) a blame-free environment in which individuals can report close calls without punishment, (3) an expectation of collaboration across ranks to seek solutions to vulnerabilities, (4) the organization's willingness to direct resources toward addressing safety concerns, (5) communication founded on mutual trust, (6) shared perceptions of the importance of safety, and (7) confidence in the efficacy of preventive measures. (See M. D. Cooper, "Toward a Model of Safety Culture," Safety Science 36 (2000): 111-36, and Lucian L. Leape and others, "Promoting Patient Safety by Preventing Medical Error," JAMA 280:16 (Oct. 28, 1998): 1444-47.) 
[5] Ethnography is research carried out in a natural setting--such as a workplace--and using multiple types of data, both qualitative and quantitative. The approach embraces diverse elements that influence behavior. Most important, it recognizes that what people say, do, and believe reflects a shared culture--a set of beliefs and values--that can be discovered by systematic study of their behavior. Ethnography produces a picture of social groups from their members' viewpoint. (See Margaret D. LeCompte and Jean J. Schensul, Ethnographer's Toolkit, vol. 1, Designing and Conducting Ethnographic Research (Lanham, Md.: Rowman & Littlefield, 1999).) Other ethnographers see the multicultural image of organizations as drawing attention to culture's cohesive, as well as divisive, functions. In this case, culture is defined as a learned way of coping with experience. Kathleen Gregory notes, "More researchers have emphasized the homogeneity of culture and its cohesive functions." However, she also describes a multicultural model that could be divisive in function among different occupational or ethnic groups. See Kathleen Gregory, "Native-View Paradigms: Multiple Cultures and Culture Conflicts in Organizations," Administrative Science Quarterly 28 (1983): 359-76. [6] GAO, Organizational Culture: Techniques Companies Use to Perpetuate or Change Beliefs and Values, GAO/NSIAD-92-105 (Washington, D.C.: Feb. 27, 1992); Weapons Acquisition: A Rare Opportunity for Lasting Change, GAO/NSIAD-93-15 (Washington, D.C.: Dec. 1, 1992); Managing in the New Millennium: Shaping a More Efficient and Effective Government for the 21st Century, GAO/T-OCG-00-9 (Washington, D.C.: Mar. 
9, 2000); Results-Oriented Cultures: Implementation Steps to Assist Mergers and Organizational Transformations, GAO-03-669 (Washington, D.C.: July 2, 2003); and High-Performing Organizations: Metrics, Means, and Mechanisms for Achieving High Performance in the 21st Century Public Management Environment, GAO-04-343SP (Washington, D.C.: Feb. 13, 2004). [7] One of the goals of the Center for Evaluation, Methods, and Issues in GAO's Applied Research and Methods group is to find new tools for evaluation; one purpose in conducting this study was to see if ethnography was a practical tool for GAO to use in studying an organization's culture. By statute, "[t]he Comptroller General shall develop and recommend to Congress ways to evaluate a program or activity the Government carries out under existing law." See 31 U.S.C. §717(c) (2000). [8] Regarding aspect no. 1, see James P. Spradley, The Ethnographic Interview (New York: Holt, Rinehart and Winston, 1979). [9] VA's survey was a nonrandom survey sent to 6,000 clinicians; it provides a description of VA culture but not an adequate and reliable measure for generalizing at the facility level. Although NCPS asked each facility to use a random sample, NCPS staff acknowledged that in many cases this was not done. Furthermore, although the survey presented questions on cultural attitudes and beliefs, such as attitudes about punishment and shame for reporting adverse events, it did not address staff understanding of concepts such as close call reporting, root cause analyses (RCAs), confidential reporting systems, whether staff participated in RCA teams, or whether staff explicitly had mutual trust. [10] See James Beebe, Rapid Assessment Process (Lanham, Md.: Rowman & Littlefield, 2001). Before we began fieldwork, we also visited each facility and conducted numerous interviews for approximately 3 to 5 days in order to write our study protocol. [11] Leape and others, "Promoting Patient Safety by Preventing Medical Error," p. 1444. 
[12] VA's health care system plays an important role in teaching physicians and nurses. It has 193,000 full-time-equivalent employees. The 158 medical facilities are organized into 21 regional networks. [13] GAO/NSIAD-92-105. [14] David M. Gaba, "Structural and Organizational Issues in Patient Safety: A Comparison of Health Care to Other High-Hazard Industries," California Management Review 43 (2000): 83-102. [15] For fiscal year 2004, information was collected through August 4. [16] Efforts under NCPS that we did not study included prospective analysis of potential problems (such as reviewing contingency plans for failure of the electronic bar code medication administration system), safety protocols focused on surgery, and a system of technical alerts to warn clinicians of malfunctioning mechanical equipment. [17] The Patient Safety Program does not replace VA's existing accountability systems, which include VA internal review boards, compromise or settlement of monetary claims, and the referral of possible criminal cases to the Department of Justice. See 38 C.F.R. §§14.560, 14.561, 14.600 (2004). If an RCA team determines that a crime is suspected or has been committed, it initiates the review process by referring the matter to the facility director. Similarly, questions involving quality of performance are handled outside the Program. [18] All RCA material and findings are part of VA's medical quality-assurance program. Records developed under the program are confidential, privileged, and subject to limited disclosure. See 38 U.S.C. §5705 (2000). [19] Only reported adverse events and close calls that meet certain criteria of seriousness and frequency are examined in RCAs. [20] Kohn, Corrigan, and Donaldson, eds., To Err Is Human, p. 99. [21] For more on NCPS and its implementation of the Program, see the timeline in appendix II. 
[22] "Missing patients" include both patients who have a pass to leave their unit and have not returned on time and patients who leave without a pass. [23] VA Office of Medical Inspector, VA Patient Safety Event Registry: First Nineteen Months of Reported Cases Summary and Analysis (Washington, D.C.: June 1997-Dec. 1998), p. 12. [24] To measure how familiar the staff were with the Program's core concepts, we calculated the average familiarity, grouped by facility, by combining answers for the series of questions noted in figure 4. More information about our methods is in appendix I; our questionnaire is in appendix III. [25] We studied the attitudes, beliefs, and behavior of clinicians directly involved in patient care. Ethnographic studies of U.S. hospital workers other than clinicians reveal their unique perspectives. See, for example, Karen Brodkin Sacks and Dorothy Remy, eds., My Troubles Are Going to Have Trouble with Me (New Brunswick, N.J.: Rutgers University Press, 1984), and Karen Brodkin Sacks, Caring by the Hour: Women, Work, and Organizing at Duke Medical Center (Urbana, Ill.: University of Illinois Press, 1988). [26] For our purposes, workflow refers to the coordination of tasks within and across teams, and professional values refers to norms that are learned from formal and informal training and that are reinforced on the job. [27] Cultural support is a composite measure of levels of mutual trust and comfort in reporting close calls and adverse events for each of four groups of clinicians. [28] In chapter 2, we described a scale of low, medium, and high familiarity with the Program that combined the answers to the following questions: Do you know what a close call is? Do you know what the Patient Safety Reporting System is? Do you know what an RCA is? Have you participated in an RCA? Do you know anyone who has participated? [29] Using content analysis, we grouped clinicians' responses to open-ended questions in categories. 
We asked them a series of questions about trust, such as "To what extent do you perceive there is trust or distrust within your profession? Your team? And between your profession and other departments?" To measure comfort in reporting, we asked, "One of the goals of the Patient Safety Program is to create an atmosphere in which VA staff feel comfortable reporting adverse events and close calls without punishment or blame. To what extent do you think this is happening at your medical facility?" Many clinicians returned to the subject of trust and comfort in reporting adverse events and close calls spontaneously in the interviews, as when they answered questions like "What promotes patient safety?" and "What undercuts patient safety?" (More detail on our methodology is in app. I; our questions are in app. III.) [30] Page, ed., Keeping Patients Safe, pp. 3-4. [31] The supportive culture necessary for patient safety is hard to achieve in a complex medical setting. According to the Institute of Medicine, when hospital staff are not fearful of reporting and when they have mutual trust, they cooperate better and are more successful at integrating their work tasks within and across teams. However, hospitals are complex social systems of numerous professions and work groups, and the work often involves high-risk tasks, making intrateam and interteam coordination difficult (see Page, ed., Keeping Patients Safe, pp. 3-4). Charles L. Bosk notes distrust between clinicians in different specialties, such as surgeons and radiologists or anesthetists and internists (see Bosk, Forgive and Remember: Managing Medical Error (Chicago: University of Chicago Press, 1979), p. 105). [32] VA told us that, although the sample was not random, NCPS provided local results to facility directors in case the information was useful. 
[33] For example, Schein highlights practices that help leaders transmit culture to, and embed it in, the organization and help staff learn new practices: (1) how leaders react to critical incidents and organizational crises, (2) deliberate role modeling, teaching, and coaching, and (3) the criteria leaders use for allocating rewards and status. See Edgar H. Schein, Organizational Culture and Leadership (San Francisco, Calif.: Jossey-Bass, 1991). [34] VA leaders told us that performance errors involve patterns of behavior that require disciplining physicians and other staff. For example, the same nurse giving out the wrong medicine three times in a month becomes a performance issue. [35] Storytelling can be a way to implement system change. See, for example, Stephen Denning, The Springboard: How Storytelling Ignites Action in Knowledge-Era Organizations (Boston, Mass.: Butterworth-Heinemann, 2000); Ann T. Jordan, "Critical Incident Story Creation and Culture Formation in a Self-Directed Work Team," Journal of Organizational Change Management 9:5 (1996): 27-35; and GAO/NSIAD-92-105. [36] For more on storytelling as a tradition in medicine, see Bosk, Forgive and Remember, pp. 103-10. [37] GAO, Federal Programs: Ethnographic Studies Can Inform Agencies' Actions, GAO-03-455 (Washington, D.C.: March 2003). [38] See James Beebe, Rapid Assessment Process. [39] At one site, we interviewed 11 physicians, so our random sample actually consisted of 81 staff. [40] Rank sum tests such as Kruskal-Wallis are designed for situations in which the distributions of the populations that are the source of data are unknown. [41] Dunn's test is a multiple comparison procedure considered appropriate for use following a Kruskal-Wallis test. See Wayne W. Daniel, Applied Nonparametric Statistics (Boston: Houghton Mifflin, 1978), p. 212. 
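The general approach described in footnotes 40 and 41 can be sketched in code: a Kruskal-Wallis rank sum test across groups, followed by Dunn's pairwise comparisons of mean ranks. This is a minimal illustration only; the facility names and scores below are fabricated for demonstration, and the simple z approximation shown ignores the tie correction a full Dunn's procedure would apply (the report's actual data and computations are not reproduced here).

```python
# Sketch (with fabricated data) of a Kruskal-Wallis test followed by
# Dunn's pairwise comparisons of mean ranks.
from itertools import combinations
import math

from scipy import stats

# Hypothetical familiarity scores for three facilities (illustration only).
groups = {
    "A": [3, 4, 4, 5, 2, 4],
    "B": [1, 2, 2, 3, 1, 2],
    "C": [3, 3, 4, 2, 3, 4],
}

# Kruskal-Wallis makes no assumption about the shape of the population
# distributions; it works on the ranks of the pooled observations.
h_stat, p_value = stats.kruskal(*groups.values())
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.4f}")

# Dunn's test: rank the pooled data once, then compare mean ranks
# pairwise using a large-sample z approximation (no tie correction here).
pooled = [x for g in groups.values() for x in g]
ranks = stats.rankdata(pooled)  # ties receive averaged ranks
n = len(pooled)

mean_ranks, sizes, start = {}, {}, 0
for name, g in groups.items():
    mean_ranks[name] = ranks[start:start + len(g)].mean()
    sizes[name] = len(g)
    start += len(g)

for a, b in combinations(groups, 2):
    se = math.sqrt(n * (n + 1) / 12 * (1 / sizes[a] + 1 / sizes[b]))
    z = (mean_ranks[a] - mean_ranks[b]) / se
    print(f"{a} vs {b}: z = {z:.2f}")
```

In practice the pairwise z statistics would be compared against a critical value adjusted for the number of comparisons, which is the step Daniel's text (cited in footnote 41) covers.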
[42] Among the advantages of Krippendorff's technique is that it applies to any number of coders, any number of categories or scale values, any level of measurement, incomplete or missing data, and large and small sample sizes. GAO's Mission: The Government Accountability Office, the investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability. Obtaining Copies of GAO Reports and Testimony: The fastest and easiest way to obtain copies of GAO documents at no cost is through the Internet. GAO's Web site (www.gao.gov) contains abstracts and full-text files of current reports and testimony and an expanding archive of older products. The Web site features a search engine to help you locate documents using key words and phrases. You can print these documents in their entirety, including charts and other graphics. Each day, GAO issues a list of newly released reports, testimony, and correspondence. GAO posts this list, known as "Today's Reports," on its Web site daily. The list contains links to the full-text document files. To have GAO e-mail this list to you every afternoon, go to www.gao.gov and select "Subscribe to e-mail alerts" under the "Order GAO Products" heading. Order by Mail or Phone: The first copy of each printed report is free. Additional copies are $2 each. A check or money order should be made out to the Superintendent of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: U.S. Government Accountability Office 441 G Street NW, Room LM Washington, D.C. 20548: To order by Phone: Voice: (202) 512-6000: TDD: (202) 512-2537: Fax: (202) 512-6061: To Report Fraud, Waste, and Abuse in Federal Programs: Contact: Web site: www.gao.gov/fraudnet/fraudnet.htm E-mail: fraudnet@gao.gov Automated answering system: (800) 424-5454 or (202) 512-7470: Public Affairs: Jeff Nelligan, managing director, NelliganJ@gao.gov (202) 512-4800 U.S. Government Accountability Office, 441 G Street NW, Room 7149 Washington, D.C. 20548: