There once was a doctor so smart
He’d perfected the medical art
But when writing a script
He’d let the decimal point drift
And the digoxin stopp-ed the heart

Medical errors and adverse events (AEs) are common and preventable. The Canadian Adverse Events Study assessed the incidence of AEs in acute care hospitals in British Columbia, Alberta, Ontario, Quebec and Nova Scotia. AEs are defined as unintended injuries or complications, arising from health care management, that result in death, disability or prolonged hospital stay. The overall incidence of AEs was 7.5%, which would predict that of the ~2.5 million annual hospital admissions in Canada, 185,000 are associated with an AE, and 70,000 of these would potentially be preventable. Dr. David Holland, a nephrologist at Queen’s University, recently conducted a thought-provoking Morbidity and Mortality Round dealing with the topic of medical error. A patient with renal failure, apparently due to vasculitis, was treated with immunosuppression with apparent success, as judged by his improving creatinine. Despite this improvement in renal function, his course was complicated by recurrent abdominal pain. Multiple investigations of this vague abdominal pain revealed no cause: diagnostic imaging, including abdominal scans, was negative, as were endoscopies and colonoscopies. After several years the pain ultimately proved to be due to a pancreatic tumour. The question Dr. Holland raised was: could or should the diagnosis have been made earlier? Was Ockham’s razor dull (i.e., did both entities, the vasculitis and the tumour, co-exist), or did the treatment for the vasculitis accelerate tumour progression? Why were all the appropriate investigations for abdominal pain negative? An hour was spent in thoughtful review of the case and, more importantly, of how doctors make decisions. Although no error was found in this case, Dr. Holland’s thoughts on how medical errors occur prompted this blog.
The venue for our discussion of medical error, the Department of Medicine’s weekly Morbidity and Mortality Rounds (M&M), run by Dr. David Lee, Divisional Chair, Hematology, is itself worthy of comment. M&M is a weekly conference where adverse and unexplained outcomes are discussed by the Departmental faculty, with trainees in attendance and input from colleagues in pathology, surgery and radiology. The spirit is reflective: self-assessment combined with curiosity, with the goal of identifying error. M&Ms are not themselves a quality improvement intervention but a necessary first step in identifying practices and processes that require monitoring and improvement. M&Ms, like Grand Rounds, are under siege from time pressure. In an era of monitored productivity, with a focus on more and shorter patient encounters, M&M conferences are at risk of going the route of the dinosaur, or of doctors’ lounges. That would be a shame. Despite advanced imaging and sophisticated laboratory testing, there is no evidence that medicine is freer of error today than it was when Osler undoubtedly chastised some intern for “one too many leeches applied to the wound”! So, congratulations to Dr. David Lee, who coordinates this well-attended weekly session. Dr. Holland divided medical errors into those related to care delivery (such as medication errors and wrong-side surgery) versus diagnostic errors. His focus, diagnostic errors, can be further divided into system errors (e.g., a malignant biopsy result was not uploaded to the medical record in a timely manner, leading to delayed diagnosis) versus errors of cognition (the doctor concluded that 2 + 2 was 3). Approaches to medical decision-making can broadly be divided into heuristic decision-making (relying on shortcuts or memory aids) and analytical decision-making (the use of classical differential diagnosis). To understand the heuristic approach, think of the checklist used by pilot and co-pilot on takeoff or landing.
The consummate example of the analytical approach would be Sherlock Holmes, deducing a suspect’s country of origin from the provenance of a coat, as discerned from a fibre left at a crime scene. Heuristic decision-making (it walks like a duck, it quacks like a duck, it must be a duck) is rapid and, in the right context, with a favorable pre-test probability of disease, useful; analytical decision-making is a broader, more contemplative process, which takes more time but is better suited to finding less common, more complex diseases. In reality, the two techniques are often used in parallel. Even in North America, not all central chest pressure with radiation to the left arm in middle-aged, obese men is coronary ischemia. While stress tests are being ordered and angiograms considered, the nonheuristic half of the medical mind should be thinking of non-cardiac causes in the form of a short differential diagnosis. Dr. Holland summarized the following causes of medical error, with a focus on cognitive errors (in yellow):

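The role of pre-test probability mentioned above can be made concrete with Bayes’ rule: a positive test multiplies the pre-test odds by the test’s positive likelihood ratio. The short Python sketch below uses assumed, purely illustrative test characteristics for a stress test (not figures from the rounds) to show why the same positive result means very different things in high- and low-pre-test-probability patients.

```python
def post_test_probability(pre_test, sensitivity, specificity):
    """Probability of disease after a POSITIVE test result.

    Bayes' rule in odds form:
      post-test odds = pre-test odds * positive likelihood ratio,
      where LR+ = sensitivity / (1 - specificity).
    """
    lr_positive = sensitivity / (1 - specificity)
    pre_odds = pre_test / (1 - pre_test)
    post_odds = pre_odds * lr_positive
    return post_odds / (1 + post_odds)

# Illustrative (assumed) stress-test characteristics:
# 85% sensitivity, 75% specificity.
# High pre-test probability (typical angina, middle-aged patient):
high = post_test_probability(0.80, 0.85, 0.75)   # ~0.93: test confirms
# Low pre-test probability (atypical chest pain, young patient):
low = post_test_probability(0.05, 0.85, 0.75)    # ~0.15: still unlikely
print(round(high, 2), round(low, 2))
```

With a favorable pre-test probability the heuristic shortcut works; with an unfavorable one, the same positive test leaves the diagnosis improbable, which is when the analytical differential earns its keep.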
Many of the causes of cognitive error are worth considering. Curiously, several of these misleading biases seem to have nautical origins: anchoring, searching, sunk cost, etc. Dr. Holland summarized these biases as follows. Premature closure (anchoring): anchoring refers to narrowing the choice of diagnostic possibilities too soon. Related to this is the search-satisfying bias: errors that result from calling off the search once an abnormality is found, thereby missing other important problems relevant to the case. We are often prone to this search-satisfying bias because of availability bias, meaning that assessment of the current case is too heavily influenced by our own recent experience. This bias is particularly dangerous when our experience is limited or the case at hand happens to be a rare disease. We all like to be correct, so we tend to overvalue findings that support our diagnosis and may not seek out, or may ignore, discordant data, creating a potential confirmation bias. Another bias that leads to cognitive errors is recency bias, meaning we favor the data that are most recently acquired. Errors in decision-making also result from sunk-cost bias (which is not a reference to Canada’s used submarine program). Sunk-cost bias refers to the early commitment of resources to a specific diagnosis, compelling the clinician to pursue that diagnosis despite subsequent information suggesting other possibilities. Our nautical voyage of bias may include a port of call at a casino. Physicians are prone to the gambler’s fallacy, a bias by prior events that is also known as the Monte Carlo fallacy. If we see a series of four patients with small lung nodules that prove benign, we may be tempted to assume that the fifth patient with such a finding will have cancer (i.e., that the run of benign pathology cannot continue). However, even if you flip a fair coin four times and get heads each time, you still have a 50% chance of heads on the fifth toss. So how do we improve?
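The coin-toss point is easy to verify by simulation. The short Python sketch below (illustrative only) generates many runs of five fair flips, keeps only the runs whose first four flips were all heads, and checks the fifth flip: if the gambler’s fallacy were right, those fifth flips would be biased toward tails, but they remain fair.

```python
import random

random.seed(42)  # deterministic for reproducibility

# Simulate sequences of five fair coin flips and condition on the
# first four all coming up heads.
trials = 200_000
runs_of_four_heads = 0
fifth_flip_heads = 0
for _ in range(trials):
    flips = [random.random() < 0.5 for _ in range(5)]
    if all(flips[:4]):               # first four flips were heads
        runs_of_four_heads += 1
        if flips[4]:                 # ...and so was the fifth
            fifth_flip_heads += 1

ratio = fifth_flip_heads / runs_of_four_heads
print(runs_of_four_heads, round(ratio, 2))  # ratio stays close to 0.50
```

About 1 in 16 runs qualifies (0.5^4), and within those the fifth flip is still heads about half the time: prior flips carry no memory, just as four benign nodules tell us nothing about the fifth patient.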
In addition to being mindful of our biases, Dr. Holland suggests the following interventions: 
1) Individual
- Increase knowledge and experience
- Improve clinical reasoning
- Meta-cognition:
- Increased self-awareness
- Increased situational awareness
- Education about biases
- Prospective hindsight
2) System/healthcare environment
- Decision support tools
- More integrated information
- More timely information
Just a few thoughts on system errors (point #2). Many errors relate to the systems that we use to acquire and transmit information. In isolation, each event looks unfortunate; seen from altitude, they form a pattern. System improvements can often be made that reduce AEs and improve care. Richard Jewitt, Program Medical Director for Medicine at Kingston General Hospital, summarizes our approach to reducing medical errors and AEs in the sidebar. These policies (the SAFE reporting system) increase patient safety and the quality of the care we provide. Essentially, the entire team is engaged in processes that identify adverse events, from falls to medication dose errors. The team notifies responsible individuals, allowing intervention and tracking of these events with the goal of understanding their cause and improving processes to mitigate risk. The system is triggered by frontline providers and notifies both frontline and leadership teams of all adverse events, rating them on a scale of 1-4, with 4 being serious/life-threatening. In addition to more immediate local corrective action, these errors or adverse events, many of which are complex, like patient falls, are batched so that recurring themes can be identified and prevented. Our patient had two processes that were likely independent of one another: a vasculitis causing renal failure and a slowly growing pancreatic tumour. It is unknown whether immunosuppression hastened tumour growth, and even less certain how such information would have been used, since immunosuppression was required to preserve renal function. I will close with two themes common to the making of a medical error, each of which suggests solutions. 1) A series of unfortunate events: to channel Lemony Snicket (nom de plume of children’s author Daniel Handler) and paraphrase Dr. Holland, serious errors are usually the sad result of an alignment of multiple minor errors.

This is illustrated by the ‘Swiss cheese model’. Many minor ‘failures’ only lead to an AE when the stars (the holes in the diagram above) align. Conversely, a system that focuses effort on reducing these minor errors may prevent this alignment and reduce the incidence of serious AEs. 2) My aircraft (avoiding error through clear communication): many errors reflect poor handover or inadequate documentation. We expect robotic precision from our airplane pilots... perhaps we can expect something close to this from health care professionals.
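The Swiss cheese model can be made quantitative with a toy probability sketch. Assuming, purely for illustration, four independent defensive layers (ordering, pharmacy check, nursing check, bedside verification) that each fail a small fraction of the time, an error reaches the patient only when every layer fails at once, so modest improvements in each layer multiply.

```python
from math import prod

def adverse_event_probability(hole_probabilities):
    """Toy Swiss-cheese model: an error reaches the patient only if
    every defensive layer fails, assuming the failures are independent,
    so the AE probability is the product of the layer failure rates."""
    return prod(hole_probabilities)

# Hypothetical numbers: four layers, each failing 5% of the time.
baseline = adverse_event_probability([0.05] * 4)    # ~6.25e-06 per order
# Halve each layer's failure rate (5% -> 2.5%):
improved = adverse_event_probability([0.025] * 4)

print(baseline / improved)  # 2**4 = a 16-fold reduction in serious AEs
```

This is the arithmetic behind focusing on ‘minor’ errors: halving the hole size in each of four slices of cheese cuts the rate of aligned holes sixteen-fold, which is why system-level effort on small failures pays off so disproportionately.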

The ultimate example of the benefits of concise, clear communication in managing critical events occurred between pilot Sully Sullenberger (left) and his co-pilot. As they transferred control of US Airways Flight 1549 prior to its landing, sans engines, in the Hudson River, the first communication was the pilot saying to his co-pilot, “My aircraft”, to which the co-pilot responded “Your aircraft”, acknowledging transfer of responsibility. Finally, Sullenberger closed the loop, confirming, “My aircraft”... and then the fun began! (Read on.) We can all ask ourselves whether our handover communications are as clear. The quality of care provided reflects the practices of the entire team, from receptionists, IT specialists, pharmacists, physiotherapists and housekeeping staff to administrators, doctors and nurses. Prevention of errors by each of these groups prevents the series of unfortunate events that leads to harm. Medical errors are common and costly, and can be reduced through a combined approach of systems intervention and personal accountability.