
Quality Initiatives: Anatomy and Pathophysiology of Errors Occurring in Clinical Radiology Practice

Published Online: https://doi.org/10.1148/rg.305105013

Abstract

The Joint Commission requires development of comprehensive error detection systems that incorporate root cause analyses for all sentinel events. To prevent medical errors from occurring, there is a need for a readily available and easy-to-implement system for detecting, classifying, and managing mistakes. The wide spectrum of interrelated contributing factors makes the classification of errors difficult. Contributors to and causes of radiologic errors can be classified under latent and active failures. Latent failures include technical and system-related failures, with a radiology-specific subgroup of communication failures that includes documentation, inaccurate or incomplete information, and communication loop failures. Active failures may be ascribed to human failures (more specifically failure of execution of a task, inadequate planning, or behavior-related failures), patient-based failures, and external failures. Classification of an error should also include the impact of the error on the patient, staff, other customers, and radiology practice. Further considerations should include nonmedical impact of the error, including legal, social, and economic effects on both the patient and the system. Rather than focusing the investigation on blaming individuals for active failures, the primary effort should be to discover latent system failures that can be remedied at a departmental level. Such an error classification system will decrease the likelihood of future errors and diminish their adverse impact.

© RSNA, 2010

LEARNING OBJECTIVES

After reading this article and taking the test, the reader will be able to:

• Describe an error classification system for analyzing errors in diagnostic radiology.

• List the latent and active failures that contribute to radiologic errors.

• Discuss the impact of an error on the patient, staff, other customers, and radiology practice.

Introduction

To err is human, yet society demands that medical professionals be faultless. For radiologists, being held to such standards is particularly challenging due to the rapidly advancing science of image acquisition, the art of digital image interpretation in an era of multiplanar availability, and our reliance on referring physicians to provide us with appropriate clinical information.

Nevertheless, the Joint Commission now requires performance evaluation processes for all physicians, as well as the development of comprehensive error detection systems that incorporate root cause analyses for all sentinel events (1) and that will permit identification of opportunities for minimizing the incidence and impact of our errors.

To prevent errors from occurring, there is a need for readily available and easy-to-implement systems for detecting, classifying, and managing mistakes that are amenable to a thorough root cause analysis. Diagnostic errors are often unrecognized or unreported and may be associated with high morbidity (2,3). The science of measuring such errors is underdeveloped (4,5), and the implementation of a peer review process in diagnostic radiology is one method of responding to this need. The wide spectrum of interrelated contributing factors makes the classification of medical errors difficult. This has led to the development of many systems, few of which are widely accepted or simple to use. Nevertheless, it is important for radiologists to have a user-friendly classification of errors that provides a conceptual framework within which contributing factors can be identified and strategies implemented to reduce or eliminate mistakes.

In this article, we describe factors that contribute to radiology errors and present a classification system, based on the expertise of psychologists as well as the Joint Commission (1), that has been valuable for managing errors in our practice. Specific topics discussed are definitions of errors and adverse events, common myths about human errors and their management, general approaches to classifying errors, an approach to classifying radiologic errors, consequences of an error, and strategies for managing and minimizing diagnostic errors.

Definitions of Errors and Adverse Events

An error is a deviation from the expected norm, regardless of whether it results in any harm. It is frequently merely a symptom of a flawed underlying process that can be remedied. An adverse event is a harmful consequence of an error. Thus, driving through a red light is an error regardless of whether an accident ensues. The vehicle may be adversely affected or even destroyed, yet the driver may escape injury entirely. In medical terms, this would be considered a near-miss event: the patient fortunately sustained no harm from the error, or timely steps were taken to prevent harm from occurring. Adverse events can be categorized along a spectrum ranging from a near-miss experience to loss of life (6).

Common Myths about Human Errors and Their Management

James Reason (7) has highlighted a number of myths pertaining to errors. A common misconception is that errors are intrinsically bad. Yet there may be a positive aspect to an error or seemingly poor outcome, such as Alexander Fleming’s discovery of penicillin in a discarded Petri dish contaminated with the mold Penicillium. Moreover, detection and analysis of errors may permit interventions that improve future performance, minimize the incidence or impact of subsequent errors, and prevent more serious errors from occurring.

Individuals who make errors are not inherently less experienced, more careless, or less well trained than those who do not. Indeed, the Institute of Medicine reported that 90% of medical errors result from systemic problems rather than individual factors (8). Errors will continue to occur unless the initial error is properly addressed and the contributing factors involving the individuals are resolved. To return to the motor vehicle analogy, the driver may have intended to speed through the red light. Consequently, error detection systems must not only be able to detect such behavior but must also provide structures for managing it and for appropriate remediation.

Another common belief is that errors occur at random. Although this may appear to be true at first glance, errors frequently reflect long-standing substandard practices (often recognized in retrospect, but not acted on or detected at the time of occurrence) coupled with latent system failures that caused the errors to occur. Similarly, another myth is that it is easier to change the behavior of an individual than the process itself, when in fact substandard practices are more amenable to change than habits and human behavior.

Although many physicians believe that errors are rare in medical practice, this belief has been dispelled by data in the Institute of Medicine report To Err Is Human (8). The misconception may be due to the lack of error detection or reporting systems or to the paucity of published and scientifically rigorous data. Many physicians are reluctant to expose their errors owing to the fear of litigation, even though studies have shown that open discussion of errors does not increase the likelihood of a lawsuit (9) and may even significantly decrease the number of claims and overall annual litigation expenses (10). Therefore, when a medical error occurs, the best way to decrease the risk of litigation is to provide full disclosure to the patient (11).

System- or process-related failures are very frequent (12) but in most cases fortunately do not cause harm. Although there may be numerous predisposing factors and near-miss events, only the accidental alignment of a series of latent failures results in an actual accident. Consequently, it is essential to establish error detection systems that identify seemingly minor or near-miss occurrences before they culminate in an adverse event.

General Approaches to Classifying Errors

Why Classify Errors?

Classifying errors permits the development and implementation of detection and analysis systems to minimize the occurrence of errors or the degree of their resulting harm. The Joint Commission requires all accredited institutions to have error detection systems, to establish proactive rather than reactive processes for managing errors, and to perform a root cause analysis for all sentinel events. These requirements provide us with an opportunity to develop a practical radiology-specific classification system that takes into account our varied modes of practice and imaging technologies.

Classification Systems

Although several error classification systems have been developed for use in medicine (6,13–15), none are customized for diagnostic radiology, to our knowledge. Most concentrate more on human error than on system processes. Renfrew et al (16) proposed a scheme that focuses on perceptual errors, classifying them as related to complacency, faulty reasoning, lack of knowledge, underreading, poor communication, false-negative causes, and complications. Even though human error in radiology is highly important, this classification does not take into account many predisposing factors, such as management issues, understaffing, ergonomics, and work volume.

Reason’s Swiss Cheese Model of Medical Error

In 1990, British psychologist James Reason (12) applied risk analysis and management principles to the study of human error. He described two basic approaches (17). The person approach assigns blame for an error to an individual.

The system approach acknowledges that humans make mistakes and that errors are to be expected; it views them as consequences of underlying system failures rather than as causes in themselves, and it focuses on identifying those failures. Numerous safeguards, defenses, and barriers must therefore be implemented to prevent an error from occurring and to reduce its impact, with emphasis placed on seeking latent failures within the system.

Highly technical, complicated systems have multiple levels of defenses (physical, electronic, personal, procedural, and administrative). Unfortunately, none of these defenses function perfectly. Holes in the defenses are considered latent failures, but no single one of these will result in an adverse event (Fig 1). A human error will usually not cause an adverse event if all safeguards are in place. Examples of such safeguards in interventional radiology include the preprocedure time-out, checking the consent form and ensuring that it matches the request, verifying the patient identification, checking for allergies, and being aware of the coagulation status. National Patient Safety Goals have been developed in part to plug latent failure holes (Fig 2). The existence of multiple holes in a system, coupled with a high likelihood of some active failure, makes a medical error virtually inevitable.

Figure 1

Figure 1 Reason’s Swiss cheese model of latent and active contributors to errors. A series of safeguards (slices of Swiss cheese) exist in a system or are introduced to minimize the chance of an error occurring. Within each layer of defense, latent factors exist (holes in the cheese) that predispose the system to errors. A single latent failure is unlikely to result in an error; however, with proper alignment of latent failures and in the presence of an active failure, an error is likely to manifest. When steps are introduced to prevent errors, attention should therefore focus not only on human failures but also on introducing safeguards to prevent latent failures from aligning.

Figure 2

Figure 2 A specific safeguard: the Universal Protocol. The Joint Commission introduced the Universal Protocol to improve compliance with National Patient Safety Goals. By requiring preprocedure verification (including confirmation of patient identity [ID]), marking of the site and side of a proposed procedure, and performance of a preprocedure time-out to verify the nature of the procedure, the Universal Protocol ensures that many latent contributors can be prevented from resulting in overt errors. In this example, when the Universal Protocol is adhered to, all factors on the left side are prevented except for the inadequate training, which is considered a latent organizational failure. For this reason, more than just a single safeguard is usually required to ensure adequate prevention of an error.

Figure 3 shows an example of successful implementation of defenses. Figure 4 illustrates an active failure: a junior resident missing a subtle finding on a computed tomographic (CT) scan. In a properly functioning system with all barriers intact, an attending radiologist will discover the problem during a formal readout. Thus, the resident’s “failure” will not result in an error manifesting itself. System factors that may have been in place to prevent such an error include a policy requiring an attending radiologist to review all cases, a culture in which residents preview all cases before readout, a protocol that optimizes depiction of subtle findings, a distraction-free and ergonomic reading environment, and a requirement that ordering physicians provide relevant clinical information to facilitate interpretation. An additional consideration is that even if both the resident and the attending radiologist miss the finding, there may be no adverse consequence to the patient.

Figure 3

Figure 3 Safeguards and defenses. A series of five safeguards and defenses have been introduced to minimize the chances of latent failures aligning to produce an error. By undertaking regular safety audits, introducing standardized work flow in procedure areas, instilling a culture of safety throughout a department, training and orienting all new hires, and ensuring that all procedures are properly supervised, the effects or complications related to latent contributors residing within a system can be minimized or even prevented.

Figure 4

Figure 4 Insufficient safeguards. If insufficient safeguards or defenses exist, an error is likely to ultimately occur. In this example, despite the demonstration that some defenses effectively minimize or prevent errors from occurring, a situation will arise when a series of latent contributors align and an error will occur. Here, poor ergonomics and ambient light, frequent telephone disturbances, working with an inexperienced resident, and other factors listed on the left side align to enable an error to occur. A lesion detected during a readout session was not mentioned in the report, possibly with important consequences for the patient’s care. Such contributors should be identified through the root cause analysis process, and steps should be put in place to prevent this error from occurring again. IV = intravenous.

Approach to Classifying Radiologic Errors

We approach errors using a system in which the patient is at the center of all errors, closely surrounded by active and latent contributors (Fig 5), all of which interact with each other directly or indirectly like pieces of a puzzle. This approach considers issues specific to contemporary diagnostic radiology, such as communication of abnormal results, technical contributors, and patient factors. It facilitates root cause analysis of adverse events by emphasizing detection and management of all underlying latent errors. The focus on potential system errors rather than human issues makes it more likely that involved radiologists will participate in the investigation and implementation of change.

Figure 5

Figure 5 The patient and practitioners. The patient experience in a radiology department is affected by a number of practitioners, all of whom are linked directly or indirectly like pieces of a puzzle. Any break in this puzzle will impact the experience and outcome. Such breaks can occur due to latent factors or to active human errors, which may be knowledge based, skill based, violations of rules and guidelines, or combinations thereof. Therefore, root cause analysis of human contributors should consider all practitioners involved in patient care, not just the individual most closely associated with occurrence of an error.

Contributors and Causes

For a medical error to manifest at the clinical level, there must be a convergence of active or human factors facilitated by a combination of predisposing (latent) failures.

Therefore, the goal of error analysis and proactive safety systems is to identify dormant latent failures before they enable an active error to occur. Rather than blaming individuals, it is important to analyze near misses and adverse events to highlight predisposing system factors. Unlike active failures, which may be difficult to foresee, latent conditions can be identified and treated before an adverse event occurs—a proactive rather than reactive approach to risk management (12,17).

Latent Failures.

Latent failures represent predisposing conditions that enable an error to occur (Table). These failures may lie dormant for a prolonged period, remaining undetected owing to absent surveillance, lack of root cause analysis processes, or the absence of a culture that stresses patient safety. Staff may be aware of such conditions but unwilling to report them; if latent failures are not properly reported, they cannot be adequately addressed by those in management positions.

Comparative Definition of Latent versus Active Failures

Latent factors can be considered technical or system related (Fig 6). Technical latent factors include those relating to equipment and engineering (construction and design flaws, ease of use, safety issues); departmental design; work flow design; hardware, software, and equipment failures; picture archiving and communication systems (PACS) and the integrity of the digital environment; materials and material management (contrast agents, devices); protocols, policies, rules, and regulations; and routine maintenance of all systems involved. Technical issues are particularly relevant in a radiology department, where they can be minimized by routine equipment maintenance and adherence to safety guidelines, which should be clear and easy to implement lest they cause potentially dangerous ambiguity.

Figure 6

Figure 6 Latent failures. Latent failures can be categorized as system related or technical in origin. Examples of latent failures that may occur in an imaging environment are listed.

System latent factors include staffing, duty hours, ergonomics, the departmental culture of safety and leadership training, supervision, and departmental governance. System errors occur due to failures of higher-level decision makers, managers, and maintenance personnel. Although remote from the error itself in both space and time, system failures generally have substantial influence.

Poor communication is at the heart of many medical errors (13). Although most commonly considered as failure to transmit critical or urgent results, important incidental findings, or unexpected results, communication errors encompass a far broader scope in daily practice. They include incomplete or inaccurate information, questionable consent and disclosure processes, inadequate documentation, and failure to perform (or ineffective performance of) the preprocedure time-out. Communication failures are not limited to verbal interchange: they may occur in written or electronic format, with incorrect or inaccurate radiology reports representing a large source of communication errors. Absent communication, or failure to communicate correctly, can be an equally egregious error. A poorly performed study due to failure to communicate instructions to a deaf patient may result in acquisition of diagnostically inadequate images. Rather than representing any fault of the patient, this should be considered a failure of appropriate communication and of the image acquisition process.

Communication errors can be conveniently categorized into three classes: errors of documentation, communication of inaccurate or incomplete information, and failures in the communication loop.

With the increasing complexity and frequency of imaging studies, it is essential that each step in the process be clearly documented and verified. Many errors occur because of inaccurate, inadequate, or insufficient documentation. The widespread deployment of electronic medical records may provide systemic checks to minimize such errors.

Inaccurate or incomplete information most frequently involves failure of the ordering physician (or scheduling office) to provide essential clinical data, such as a history of allergy to contrast material. However, inaccurate or incomplete information can also reflect failure of radiologists to adequately explain their thoughts and opinions in the official radiology report.

The interaction between referring physicians, scheduling services, technologists, and radiologists forms a radiology communication loop (Fig 7). This communication loop is highly susceptible to error and must be subjected to continuous scrutiny and documentation.

Figure 7

Figure 7 The radiology communication loop. A series of practitioners participate in the care that is provided to a patient visiting a radiology department. Each interlinked person plays an essential role in ensuring a satisfying and successful outcome. From the referring physicians’ interaction with schedulers, to the patients’ interaction with office and welcoming staff, including extenders such as valet parking attendants, to transport technologists, imaging technologists, nurses, and physicians, each link in this chain communicates with others and each is equally important to maintain the integrity of the chain. Any break may have serious consequences. As an example, should a breakdown occur during the transcription phase, results may not be received or incorrect results may be transferred. Such breaks can be prevented by insertion of safeguards, as shown in Figure 8.

Our department has developed various strategies to safeguard the radiology communication loop (Fig 8). An electronic protocol and schedule ensure timely and exact protocol details; in addition, a time-out consisting of double verification of patient identity and the type of procedure is performed before every examination. Technologists change the status of each study in the PACS after verifying patient arrival and study completion. An automatic update of the study list ensures that each case is promptly read by a radiologist. We have also developed a convenient electronic notification system that enables radiologists to convey critical findings to the referring physician. Finally, follow-up multidisciplinary conferences provide important feedback to radiologists, including quality improvement information.

Figure 8

Figure 8 The radiology imaging chain. Encircling the patient is an interlinked series of events that occur from the time a study is ordered until the results of the study are communicated to the ordering provider. At each stage of this process, a link may break due to latent or active failures. Such breaks can be prevented by inserting appropriate safeguards. In this example, the recent introduction of a series of digital safeguards is used to illustrate the extent to which conventional steps of the imaging chain can be “protected.” Such digital safeguards include provider order entry systems, radiology information systems and hospital information systems, standard tools of quality assurance, PACS, voice recognition systems, the electronic medical record, and the Internet via the World Wide Web.

Active Failures or Human Errors.—Active failures or human errors include procedural complications or mistakes as well as diagnostic misses and misinterpretations. They may arise from patient factors, may involve one or more of the practitioners illustrated in Figure 5, or may be secondary to latent contributors such as scheduling of the incorrect examination. An active failure is usually person related, and its consequences are immediately felt by the system (Table). Active failures may be ascribed to human failures (more specifically, failure of execution of a task, inadequate planning, or behavior-related failures), patient-based failures, and external failures.

Failures of execution (18), which involve procedures that are adequately planned but incorrectly performed, may be divided into slips and lapses. A slip is a problem of attention, such as omission of safety procedures, misordering, mistiming, and intrusion. Similarly, an abnormality may “slip” out of our attention when we are distracted during the readout. Safeguards against such distractions include an ergonomically designed workplace, properly positioned ambient light, prohibitions on telephone calls, and a quiet environment. Lapses are related to memory and include the omission of planned items. For example, a radiologist may properly identify an abnormality but simply forget to include it in the report. One technique for eliminating such lapses is structured reporting.

Mistakes or failures of intention can be rule-based or knowledge-based. Rule-based mistakes occur in familiar surroundings, when the radiologist misapplies a good rule owing to failure to notice contraindications. For example, it is a good rule that a young female patient with right lower quadrant abdominal pain, nausea, leukocytosis, and a dilated appendix with surrounding fat stranding most likely has acute appendicitis. However, failure to notice an enlarged ovary with a large cyst and deviation of the uterus in this patient would cause the radiologist to miss the diagnosis of ovarian torsion with secondary changes in the appendix. Another source of error is to apply a poor rule. For example, a radiologist might review only axial and reformatted images from a CT study and not the scout view, but the scout view might be the only place where an abnormality can be detected.

Knowledge-based mistakes, which reflect not knowing the correct course of action or failing to realize that one’s approach is incorrect, occur in unusual and novel situations when one fixates on a particular hunch or hypothesis. To prevent this type of error, it is important to keep options open, avoid “satisfaction of search,” and not neglect contradictory evidence or suggestions, even when they come from support personnel or subordinates.

Human errors may be deliberate and intentional and are usually associated with motivational problems (low morale, poor supervision, or a perceived lack of concern). They may be categorized as negligent behavior, recklessness, and intentional rule violations, all of which we have previously addressed (19). Methods for determining accountability and culpability have been previously illustrated (20).

External Causes.—Some mistakes and failures are inevitable and are beyond radiologist or organizational control and responsibility. Examples of such external causes include electrical failures, magnet quenches, CT scanner malfunctions, floods, and earthquakes. Defensive strategies and safeguards can diminish the impact of errors arising from external causes. For example, an alternative, readily available source of electricity (such as a power generator) should be available in case of an electrical failure. Manual resuscitation bags should be available in the radiology department to care for patients requiring ventilation. Policies defining contingency plans must be established in case essential pieces of equipment do not function properly.

Most errors have more than one contributing failure. With medical mistakes, an active human error is readily apparent, whereas latent system and technical failures may not be readily identified. The most important part of any investigation is to unearth all predisposing latent failures, for without them an active failure alone will usually not result in a medical error.

Customers: The Patient, Personnel, and Practitioners

The patient and nonradiology personnel can also contribute to medical errors. Patient factors that may contribute or predispose to an error include physical attributes (weight, body habitus, presence of hardware, language barriers), comorbidities, allergies, age, lack of prior experience with a similar procedure or study, and an inability to comprehend or follow instructions.

Consequences of an Error

Consequences of radiologic errors can affect the patient and his or her immediate circle, the imaging staff directly involved, and members of the medical team. There are also more global consequences that may affect the practice, the institution, the profession, the procedure, or even the imaging study itself.

The Patient

The impact of an error on a patient includes physical, psychological, and financial components. The initial response must be to make every effort to treat the patient and to minimize the degree of physical harm. Once the patient’s condition is stable, the severity and permanency of the physical harm should be assessed. Similarly, the psychological effect of the error on the patient should be assessed; it is often greater than the physical impact. In most situations, the patient should be offered appropriate care free of charge, including a follow-up plan to manage any resulting harm and referrals to appropriate specialists.

Staff and Other Customers

The impact on the radiologist involved in an error, especially the psychological impact, may be devastating and affect all personnel involved in patient care (technologists, nurses, physicians, and the institution). The “second victim” of a medical error is the physician and other members of the medical team involved in the adverse event (21). At times, the psychological impact may be so grave as to disable all future professional activity. When necessary, counseling should be provided to all personnel impacted emotionally by a serious event.

Global Considerations

Nonmedical impact of an error includes legal, social, and economic effects on both the patient and the system. These issues frequently contribute to the reluctance of the medical profession to disclose medical errors. However, full disclosure about an adverse event, along with the measures taken to diminish the likelihood of its recurrence, may significantly strengthen the patient-radiologist relationship, improve the patient’s well-being, and decrease the likelihood of litigation (9–11).

Strategies for Managing and Minimizing Diagnostic Errors

An important goal of error analysis is to create processes aimed at reducing or preventing the occurrence of errors and minimizing the degree of harm. The development of an effective system for detecting and appropriately managing errors is essential to substantially attenuate their consequences. At this stage, the error analysis process identifies contributing factors to enable the implementation of concrete steps to prevent such errors from occurring in the future. Active and comprehensive management of errors and adverse events requires ongoing surveillance processes (20).

Minimizing the incidence and extent of active errors requires reliable reporting mechanisms coupled with policies that define mechanisms of case review and management. Educational programs, morbidity and mortality meetings, and a comprehensive and respected root cause analysis process are also essential components of this comprehensive approach.

Conclusions

We propose an error classification system for diagnostic radiology that includes personnel, communication, cause, and impact characteristics. Rather than focusing the investigation on blaming individuals for active failures, the primary effort should be to discover latent system failures that can be remedied at a departmental level. This will result in decreasing the likelihood of future errors and diminishing their adverse impact.

Recipient of a Certificate of Merit award for an education exhibit at the 2007 RSNA Annual Meeting.

For this CME activity, the authors, editors, and reviewers have no relevant relationships to disclose.

References

  • 1 Joint Commission. Accepted: new and revised hospital elements of performance related to CMS application process. Jt Comm Perspect 2009;29(10):16–19.
  • 2 Croskerry P. Clinical cognition and diagnostic error: applications of a dual process model of reasoning. Adv Health Sci Educ Theory Pract 2009;14(suppl 1):27–35.
  • 3 Croskerry P. A universal model of diagnostic reasoning. Acad Med 2009;84(8):1022–1028.
  • 4 Newman-Toker DE, Pronovost PJ. Diagnostic errors: the next frontier for patient safety. JAMA 2009;301(10):1060–1062.
  • 5 Newman-Toker DE, Camargo CA, Hsieh YH, Pelletier AJ, Edlow JA. Disconnect between charted vestibular diagnoses and emergency department management decisions: a cross-sectional analysis from a nationally representative sample. Acad Emerg Med 2009;16(10):970–977.
  • 6 Chang A, Schyve PM, Croteau RJ, O'Leary DS, Loeb JM. The JCAHO patient safety event taxonomy: a standardized terminology and classification schema for near misses and adverse events. Int J Qual Health Care 2005;17(2):95–105.
  • 7 Reason J. Beyond the organisational accident: the need for "error wisdom" on the frontline. Qual Saf Health Care 2004;13(suppl 2):ii28–ii33.
  • 8 Kohn LT, Corrigan JM, Donaldson MS, eds. To err is human: building a safer health system. Washington, DC: National Academy Press, 2000.
  • 9 Stewart RM, Corneille MG, Johnston J, et al. Transparent and open discussion of errors does not increase malpractice risk in trauma patients. Ann Surg 2006;243(5):645–649; discussion 649–651.
  • 10 Clinton HR, Obama B. Making patient safety the centerpiece of medical liability reform. N Engl J Med 2006;354(21):2205–2208.
  • 11 Feinmann J. You can say sorry. BMJ 2009;339:b3057.
  • 12 Reason J. The contribution of latent human failures to the breakdown of complex systems. Philos Trans R Soc Lond B Biol Sci 1990;327(1241):475–484.
  • 13 Woolf SH, Kuzel AJ, Dovey SM, Phillips RL. A string of mistakes: the importance of cascade analysis in describing, counting, and preventing medical errors. Ann Fam Med 2004;2(4):317–326.
  • 14 Vincent C, Taylor-Adams S, Stanhope N. Framework for analysing risk and safety in clinical medicine. BMJ 1998;316(7138):1154–1157.
  • 15 Vincent C. Understanding and responding to adverse events. N Engl J Med 2003;348(11):1051–1056.
  • 16 Renfrew DL, Franken EA, Berbaum KS, Weigelt FH, Abu-Yousef MM. Error in radiology: classification and lessons in 182 cases presented at a problem case conference. Radiology 1992;183(1):145–150.
  • 17 Reason J. Human error: models and management. BMJ 2000;320(7237):768–770.
  • 18 Reason J. Safety in the operating theatre. II. Human error and organisational failure. Qual Saf Health Care 2005;14(1):56–60.
  • 19 Kruskal JB, Anderson S, Yam CS, Sosna J. Strategies for establishing a comprehensive quality and performance improvement program in a radiology department. RadioGraphics 2009;29(2):315–329.
  • 20 Kruskal JB, Siewert B, Anderson SW, Eisenberg RL, Sosna J. Managing an acute adverse event in a radiology department. RadioGraphics 2008;28(5):1237–1250.
  • 21 Wu AW. Medical error: the second victim—the doctor who makes the mistake needs help too. BMJ 2000;320(7237):726–727.

Article History

Received: Jan 22 2010
Revision requested: Feb 17 2010
Revision received: Feb 25 2010
Accepted: Feb 25 2010
Published online: Aug 31 2010
Published in print: Sept 2010