Department of Medicine
School of Medicine Queen's University
 
 

News, Innovations and Discoveries Blog

Everything You Wanted to Know About Competency Based Medical Education (CBME) But Were Afraid to Ask. A guest blog by Dr. David Taylor with introduction by Stephen Archer

The Royal College of Physicians and Surgeons of Canada (RCPSC) has a new model for postgraduate medical training called Competence by Design (CBD). The program that realizes CBD is referred to as Competency Based Medical Education (CBME). This new educational regime is meant to improve the quality of residency training in Canada. CBME’s premise is that better definition of the key components of competency, and more frequent evaluation of and feedback to doctors in training on their progress towards competency, will make for a safer and better independent practitioner. CBME divides the things we doctors do into components called entrustable professional activities (EPAs). Each specialty has its own collection of EPAs, with some overlap of common EPAs (such as the ability to elicit a history or perform a physical examination). CBME is meant to ensure that a residency training program produces trainees who are as good as they can be (and presumably of a satisfactory quality to be independent physicians). The corollary is that if a trainee is not competent to perform the requisite EPAs they will not advance and will get the help they need to move ahead. In theory, once a resident has mastered the canon of EPAs required of their specialty, they can be certified to have the skill set that in aggregate makes them the caliber of doctor to whom we would trust our health. In the old days we had similar standards that residents had to meet before being certified; however, it was phrased more bluntly: “We’ll let you know when you’re done!” …but I digress. Once a resident has achieved competency at the highest level for all their EPAs, they are certified by their program director as ready to take the final step in CBME and make the transition to practice. The RCPSC hopes that this new educational program will improve the preparedness and quality of physicians who complete postgraduate programs in Canada.

Each specialty has its own RCPSC-dictated target date to convert to CBME. The strategy at Queen’s was different: we chose to implement CBME en masse, with a July 1, 2017, across-the-board go-live date. In other words, we had simultaneous implementation of CBME in all our postgraduate programs. It was a 2-year race to that deadline, but we did it! CBME was successfully launched in all programs at Queen’s University in July 2017. The launch has, by all accounts including my own, been successful: trainees have embraced the change, while medical educators have had the wind fill the sails of their academic careers and propel them forward. Full credit to our leadership, who envisioned that we should be national leaders in the early implementation and shaping of CBME, notably our Dean, Dr. Richard Reznick; our CBME lead, Dr. Damone Dagnone; and a team of dedicated Program Directors and educational leaders, including our own core Medicine Residency Program Director, Dr. David Taylor, author of the guest blog that follows.

While CBME has launched successfully, it has increased faculty workload and required a substantial commitment of internal funding to create electronic evaluation platforms, train faculty, etc. The RCPSC may have provided the mandate, but they certainly do not fund implementation costs (neither technology nor the requisite human resources). Moreover, my gut tells me that the sustain mode for this new educational ideology is not cost-neutral. Like repairing a running engine, the RCPSC mandated a major shift in educational methodology and allowed it to be built and implemented on the fly (a tricky task which the medical education mechanics at Queen’s took on with relish).

I am very fortunate to have Dr. David Taylor as my Department’s leader in coordinating the implementation of CBME. I particularly enjoy David’s openness to candid discussions about the real-world challenges that implementing CBME has raised, both for me, as Department Head, and for faculty members. We always have these conversations in a spirit that acknowledges we (the Department of Medicine) are moving ahead with CBME, but mindful that we need to be honest and thoughtful about its costs, open to evaluating its impact, and willing to debate its goals. As a Department Head, my focus cannot be restricted to postgraduate education; rather, I am accountable for success in each of our pillars, which include clinical care, quality improvement, and research, as well as undergraduate and postgraduate education and continuing professional development. With the mandate from the RCPSC to implement this unfunded program, I often discuss with David how we, as a Department, should direct funding and resources to CBME in an equitable manner that does not hurt the clinical or research missions.

Often our conversations relate to my questions around the rationale that was used to justify changing the educational model. Questions include:

  1. How will the success of CBME be evaluated?
  2. How will additional evaluation of residents be helpful unless faculty become more willing to deliver honest critiques and provide negative feedback when appropriate?
  3. Will an effective CBME program not mean that more residents will require longer training and that more (perhaps not many more, but more) will ultimately not be deemed competent to practice? If this is not the case, and all who begin training finish, then how does this differ from the current model and why do we need CBME?
  4. How many more faculty positions do I need to allow me to run the Department in light of the increased time required for CBME, with its mandate for new educational positions for each of our 10 training programs? Each of these residency programs requires a CBME lead (one per program) and new Faculty Advisor positions (1 faculty for every 3-4 residents).

As past president of the Canadian Association of Professors of Medicine (the national organization representing Canada’s Heads/Chairs of Academic Departments of Medicine) I can attest that my colleagues have these same questions (and more). Unlike us, they have not implemented CBME yet. I invited Dr. Taylor to speak to CAPM in October 2017. He gave a stellar talk and is now ably spreading the gospel of CBME across the country!

In response to my many questions, and questions from CAPM membership, he agreed to write this week’s blog. Enjoy!

_________________________

GUEST BLOG

Competency Based Medical Education (CBME): Seeking Excellence by Embracing Criticism

Dr. David Taylor, Director, Core Internal Medicine Program

As Director of the Core Internal Medicine training program, I have seen CBME consume a large share of my workload and that of the other CBME leads within the Department of Medicine. As part of my portfolio, I have had the opportunity to speak on this topic, training and educating faculty and residents on what to expect with CBME implementation. Dr. Archer and I have had discussions around the following issues and questions, and they will hopefully provide context for CBME implementation and its rationale. Before I get to my guest blog, let me address some of Dr. Archer’s questions!

What is the rationale/supporting evidence justifying a change to the current educational model to the CBME model?

Interestingly, I only rarely hear this question now. My impression is that there’s been a clear recognition that the status quo is not acceptable and that stronger accountability of training programs for the quality of their graduates is necessary (but perhaps people have simply acquiesced to the inevitability of change). This question is really two questions – why do we need a change, and why should that change be competency-based medical education (CBME). For the first of these questions, two key points are important to consider: there is a problem with physician competence; and our profession pursues excellence, not adequacy.

First, medical error is a major problem in our health care system that is at least partly (and perhaps largely) attributable to physicians1-4. The Canadian Adverse Events Study is the most relevant research on this for us. The adverse event (AE) rate they found was staggering, but consistent with AE rates in other studies. Further, and of direct concern, their data demonstrated that the majority of preventable AEs were related to diagnosis and patient management (including medications, fluids, and procedures). In light of these (and other) findings, it is hard to argue we are making the grade as a profession. By extension, it’s hard to argue that our training programs are making the grade.

Secondly, in our profession, we should never be satisfied with adequacy, but always strive for excellence. For our training programs, this implies an academic imperative that we leverage advances in medical education so that trainees have the best opportunity to reach their full potential, ensuring patients get the best physician possible. Unless you reflect on the abilities of an incoming medical student and compare them to the outgoing subspecialty fellow, it is easy to forget how powerful a health care intervention medical education is. We need to make the most of this intervention. For me and for the residents in our programs, “good enough” is not good enough.

Why CBME has become the answer is a separate and very complex question. Fundamentally, the reasons relate to the erosion of the social contract of Medicine. As society’s trust in physicians has waned over the past several decades, demands that our training systems provide quality assurance for our graduates have become almost deafening5-14. Outcome-based education has been a direct response to this call for quality assurance. Competency-based education, where outcomes are defined in terms of a collection of professional competencies, is the outcomes-based approach adopted in most medical training jurisdictions. CBME is not without its weaknesses, but has the advantage that it aligns well with established educational frameworks (CanMEDS, and ACGME Core Competencies). There is also room within a larger view of CBME to adapt its design to evolving research and understanding in learning and assessment. It is worth noting that Ontario physicians were the biggest driver of educational change in Canada. Their job action in 1986 very publicly alienated patients, other health professionals, and society as a whole.  This single event exposed a deeply damaged social contract and directly led to the development of the first CanMEDS framework and the beginning of the competency-based movement in Canada5.

How will CanMEDS be carried forward in CBME?

Knowledge of the impressive body of research that underlies CanMEDS is important to understanding its on-going relevance to medical education in Canada. CanMEDS is built on the Educating the Future Physicians of Ontario project, an enormous research initiative completed through collaboration of all of the medical schools in Ontario5,6. Through surveys and interviews of people with barriers to health care access, analysis of population health data, and interviews of health care providers and academic leaders, EFPO gathered an enormous body of data describing health needs in Ontario. They used this to develop a framework describing the physician qualities and abilities that Ontarians need. Their work, including the underlying data, was then used by the Royal College to develop the first CanMEDS competency framework. CanMEDS is an educational framework built empirically on this large body of research, with a view to our profession’s responsibility to society and patients.

Getting back to the question, EFPO’s work was intended to direct learning in our training programs, not to create a highly granular framework of subcompetencies to be used in the assessment of students and learners (e.g. CanMEDS 2015). There is recognition that CanMEDS is an excellent description of physicianship when viewed holistically; the granularity it offers is useful for guiding feedback and learning, but is not likely useful for making judgments of competence. Moving forward, CanMEDS will continue to play an important role—but I anticipate that role will be more focused on directing feedback and learning. Specific and constructive feedback is essential to help our learners put their skills, strengths, and weaknesses together into high quality professional work. This is where CanMEDS continues to offer important value and hold us accountable to society’s needs.

If the deliverable is “better doctors”, how will we know that this has been achieved?

This is, perhaps, the central question for proponents of CBME. The change we see in our learners over the course of their training is truly transformative; the student on day one of medical school is unrecognizable at the end of residency. Over the past several decades, we have become much better at capturing and measuring much of this learning. But the goal of our move to CBME is clinical practice improvement and better patient outcomes, not simply learning. Measuring these higher-level outcomes directly in our complex health care system is nearly impossible. So how do we test whether CBME has the impact we’re looking for? In my mind, this is the big challenge in introducing such an enormous and resource-intensive initiative. We as educators must deliver on this challenge. Currently, the best answer to this question is examination of residents’ clinical performance assessments, which in aggregate provide very reliable measures of learning—but this is not enough. Defining how we answer the bigger question of practice improvement and patient outcomes will require epidemiologists, clinicians, educators, and qualitative and quantitative researchers to work together. Several faculty members in the Department of Medicine are laying the groundwork for such a project now. This is a daunting task, but one I’m excited to take on.

What are the consequences for other mandates of an academic Department (such as research), when the RCPSC mandates unfunded programs and resources are redirected to this goal? How can we improve communication with the Royal College, which currently does not officially talk with Canada’s academic Department Heads (who paradoxically control the HR required to implement CBME)?

Competence by Design (CBD) is not simply a new residency curriculum; it is a change in how academic departments are expected to operate. It demands a significant increase in human and financial resource allocations to educational programs. By my estimate, CBD adds roughly 0.035 faculty FTE per resident. In our department, this amounts to a 3% increase in faculty time dedicated to CBME. In departments without the ability to recruit new faculty, this necessarily means realignment of faculty job descriptions towards education deliverables. One major goal I have in our move to CBME is to find ways for us to protect and grow all the areas our academic departments contribute to: research, clinical care, health care administration, and education (undoubtedly, the most threatened of these domains is research). This will require national leadership to help secure support for new academic positions and funding for departments. Ministries of Health, Deans of Health Sciences, academic hospital leadership, and professional organizations (such as the OMA and CMA) must all participate in solutions for this massive change management challenge. The Royal College Specialty Committees are important places to start. I would encourage everyone to lobby your regional representatives on your specialty committee to advocate for engagement with these key stakeholders. We need national leadership to make clear to these stakeholders the resources required to be successful. At our last Internal Medicine Specialty Committee meeting, we initiated a plan to formally engage the Department Chairs of Medicine in CBD discussions within the committee. This is a start, but only a start.
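The staffing estimate above can be made concrete with a small back-of-the-envelope calculation. A minimal sketch follows; the 0.035 FTE-per-resident figure is from the estimate above, but the resident and faculty counts are made-up illustrative numbers, not our department’s actual figures.

```python
# Back-of-the-envelope CBD staffing estimate.
# FTE_PER_RESIDENT comes from the estimate in the text; the resident and
# faculty counts used in the example below are hypothetical.

FTE_PER_RESIDENT = 0.035  # added faculty FTE per resident (from the text)

def added_fte(residents: int) -> float:
    """Total extra faculty FTE implied by the per-resident estimate."""
    return FTE_PER_RESIDENT * residents

def percent_of_faculty_time(residents: int, faculty: int) -> float:
    """Extra FTE expressed as a percentage of existing faculty headcount."""
    return 100 * added_fte(residents) / faculty

# Hypothetical department: 90 residents, 105 faculty members
extra = added_fte(90)                     # 3.15 extra FTE
share = percent_of_faculty_time(90, 105)  # 3.0% of faculty time
print(f"Added burden: {extra:.2f} FTE ({share:.1f}% of faculty time)")
```

With these illustrative counts, the per-resident estimate translates into roughly three full-time faculty positions, which is how a 0.035 FTE figure becomes a department-wide 3% shift in faculty time.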

CBME: Seeking Excellence by Embracing Criticism

Recently, I was talking with a colleague about the future of competency-based medical education in Canada. At the heart of our discussion was a deep-rooted concern that CBME could ultimately become synonymous with unfulfilled potential. Powerful external and political pressures driving the national adoption and implementation of CBME seem to have made the system of competency-based training the goal instead of exceptional learning, assessment, and clinical care. (At times it can feel heretical to suggest there might be more to assessment than entrustable professional activities.)

I recently read a powerful article in Medical Education, “Competency-based medical education: the discourse of infallibility”. In their work, the authors tackle a challenging and critical issue that goes to the heart of academic discourse. Without casting judgment on the value of CBME, they examine the nature of the academic discourse on CBME15. Through critical discourse analysis (CDA) of the peer-reviewed literature on CBME—the literature guiding its development and implementation—Boyd and colleagues identify patterns in which voices critical of CBME have been suppressed. Further, they capture and frame the assumptions and social constructs that have enabled this suppression to occur. In doing so, they illuminate what may be the greatest threat to our pursuit of an exceptional system of medical education—a discourse in the literature that prohibits us from acknowledging and addressing important criticisms and weaknesses.

Through their research several important findings came to light. First, the literature on CBME relies heavily on expert opinion with a relative paucity of publications presenting empiric data; there are two opinion pieces for every piece presenting data. This imbalance is true for both proponents and critics of CBME. Second, the response to criticisms of CBME has been to minimize and/or deflect the concerns. In doing so, the response to criticism often reframes important theoretical challenges into challenges of implementation or interpretation. The consequence is an academic discourse that selectively legitimizes supporting positions for CBME, while concurrently making certain topics critical of CBME “unauthorized”. Although such a discourse can propel initial adoption, it represents an existential threat to long-term success.

How do we respond to this? I actually found this article empowering—I’ve felt like the patient who knew there was something not quite right, but until now, had no evidence to tell me what it was. The first and most important step is acknowledging and accepting the current state of affairs—CBME lives in a discourse of infallibility. And we must change this discourse. We need to tackle fundamental criticisms such as the counterproductive behaviourist underpinnings of some aspects of CBME; we all agree that our work is far more than a collection of milestones. We must challenge and test current concepts and constructs of competency. And most importantly, we must remember our first principle: better health care for patients and society. This is and always will be the primary goal of medical education. CBME represents a good step in the right direction; but we only ever achieve excellence by embracing and tackling criticism and weakness.

It is far too easy to use this article to cast judgment on those leading the development and implementation of CBME. The vast majority of those contributing to this discourse on CBME are the very best of us. These are not dictators of medical education, but altruistic clinicians and educators who are among the strongest advocates of learners, patients, and society. Yet as proponents of CBME, we must all take responsibility for the evolution of the discourse revealed in Boyd’s work. And I would go a step further—we must embrace this opportunity to tackle these challenges and criticisms. Only through rigorous scholarly work and open debate can we resolve these challenges in a manner that moves medical education towards our true goal: exceptional clinicians meeting the health needs of each patient and society.

I will finish with this warning—a discourse of infallibility is almost certainly not unique to CBME or the medical education literature. Regardless of your academic focus, on reading this article we should all ask the question: is there a discourse of infallibility in my field? Is there a discourse of infallibility for QI, pharmaceutical therapies, or perhaps even mitochondrial research? Being blind to weakness will always be the greatest threat to success.

References

  1. Institute of Medicine (US) Committee on Quality of Health Care in America, Kohn LT, Corrigan JM, Donaldson MS. To Err Is Human: Building a Safer Health System. Washington (DC): National Academies Press (US); 2000.
  2. Baker GR, Norton PG, Flintoft V, et al. The Canadian Adverse Events Study: the incidence of adverse events among hospital patients in Canada. CMAJ. 2004;170(11):1678-1686.
  3. Calder LA, Forster A, Nelson M, et al. Adverse events among patients registered in high-acuity areas of the emergency department: a prospective cohort study. CJEM. 2010;12(5):421-430.
  4. Makary MA, Daniel M. Medical error-the third leading cause of death in the US. BMJ. 2016;353:i2139.
  5. Neufeld VR, Maudsley RF, Pickering RJ, et al. Educating future physicians for Ontario. Academic Medicine. 1998;73(11):1133-1148.
  6. Maudsley RF, Wilson DR, Neufeld VR, et al. Educating future physicians for Ontario: phase II. Academic Medicine. 2000;75(2):113-126.
  7. Baron RB. Can we achieve public accountability for graduate medical education outcomes? Acad Med. 2013;88(9):1199-1201. doi:10.1097/ACM.0b013e31829ed2ed.
  8. Berwick DM. Postgraduate education of physicians: professional self-regulation and external accountability. JAMA. 2015;313(18):1803-1804. doi:10.1001/jama.2015.4048.
  9. Boelen C, Woollard R. Social accountability: the extra leap to excellence for educational institutions. Med Teach. 2011;33(8):614-619. doi:10.3109/0142159X.2011.590248.
  10. Busing N, Harris K, MacLellan A-M, et al. The Future of Postgraduate Medical Education in Canada. Acad Med. 2015;90(9):1258-1263. doi:10.1097/ACM.0000000000000815.
  11. Carraccio C, Englander R, Van Melle E, et al. Advancing Competency-Based Medical Education: A Charter for Clinician-Educators. Acad Med. 2016;91(5):645-649. doi:10.1097/ACM.0000000000001048.
  12. Chen C, Petterson S, Phillips RL, Mullan F, Bazemore A, O’Donnell SD. Toward graduate medical education (GME) accountability: measuring the outcomes of GME institutions. Acad Med. 2013;88(9):1267-1280. doi:10.1097/ACM.0b013e31829a3ce9.
  13. Gibbs T, McLean M. Creating equal opportunities: the social accountability of medical education. Med Teach. 2011;33(8):620-625. doi:10.3109/0142159X.2011.558537.
  14. Irby DM, Cooke M, O’Brien BC. Calls for reform of medical education by the Carnegie Foundation for the Advancement of Teaching: 1910 and 2010. Acad Med. 2010;85(2):220-227. doi:10.1097/ACM.0b013e3181c88449.
  15. Boyd VA, Whitehead CR, Thille P, Ginsburg S, Brydges R, Kuper A. Competency-based medical education: the discourse of infallibility. Med Educ. October 2017. doi:10.1111/medu.13467.

One Response to Everything You Wanted to Know About Competency Based Medical Education (CBME) But Were Afraid to Ask. A guest blog by Dr. David Taylor with introduction by Stephen Archer

  1. David Taylor says:

    Debra Weinstein wrote a great perspective piece in the most recent New England Journal of Medicine on the importance of measuring outcomes in graduate medical education: http://www.nejm.org/doi/full/10.1056/NEJMp1711483?query=featured_home


Dr. Archer, Dept. Head