Column Editor: Sophie Soklaridis, PhD
This column aims to explore trends and developments in continuing professional development (CPD) education and research.
Dr. Deep-Mind Will See You Now: Challenging Clinician-Educators to Create New Roles for a Digital World
In this first column, I wish to challenge CPD educators and innovators to think about creating new roles. What will medicine and health professions education look like in the next 15 years? What skills will we need to keep on top of ever-evolving technology and engage our students?
Given the massive shift toward fields of virtual/augmented reality, data analytics and artificial intelligence, the roles of clinician-educators are changing. Some CPD scholars are urging clinician-educators to extend their analog skills to become curators, creators and moderators of the digital space (Minter et al., 2021). Educators are learning from students about blogs, apps and videos so they can then better direct students to resources that are evidence-based, that use modern educational principles and that are suitable for the learner’s stage of development (clinician as curator). Social distancing has forced educators to move to digital platforms, which are rapidly becoming a new form of classroom (clinician as creator). This means that clinicians are having to become better versed in the tools of digital conversations, such as chat functions, shared screens and breakout rooms (clinician as moderator). Other scholars think we will need to create entirely new jobs (thus new curricula, assessments and competencies). For example, they have identified a near-future need for emerging health professions, such as “medical virtualists” – clinicians who have expertise in artificial intelligence and predictive analytics (Thakur et al., 2021).
The deployment of effective medical treatments for COVID-19 has relied on artificial intelligence, data science and digital technology, which traditionally remain outside of health professions education. This means that creating new roles requires paradigm shifts to facilitate novel ways of thinking beyond the traditional biomedical model. But changing fundamental ways of thinking is much easier said than done.
Although health professionals typically will acknowledge that other paradigms can be useful in practice (e.g. the biopsychosocial model), they often retain their biological orientation, especially when there is uncertainty around a medical diagnosis. Other paradigms are still dismissed as having a low degree of credibility in the biomedical hierarchy of evidence. Many health professionals tend to reject or explain away interpretations of a patient’s suffering that are not rooted in biological causes. In a similar way, health professionals tend to dismiss or resist the adoption of medical artificial intelligence because they believe algorithms cannot capture subjectively derived medical knowledge: the kind of knowledge, clinical judgment and wisdom that develop over years of practice. This same clinical judgment was hotly debated when it was thought to be threatened by the ushering in of evidence-based medicine, another paradigm shift. In fact, some of the common misconceptions about evidence-based medicine also underlie distrust in using artificial intelligence in medicine: the devaluing of clinical judgment, the lack of recognition of individual patient needs, and a “cookbook” approach to treatment that ignores patients’ values.
Some interesting research is challenging our belief in the superiority of human decision making. The decisions we make may not be more transparent than those made by algorithms. Cadario and colleagues (2021) found that health professionals wrongly believe that they understand how they come to a decision better than they actually do. The notion that human decision making is transparent is rooted in the belief that people have direct access to the psychological processes they use to make a decision. Furthermore, we believe that the decision-making processes algorithms use are opaque because machines, as non-humans, lack the capacity for introspection. Also, most people do not understand how algorithms are programmed to make medical decisions. The perception of transparency in human decision making is actually an illusion. We lack access to our own associative machinery and often rely on mental shortcuts to understand our decisions. We must learn to accept that much of what happens in our heads is outside our awareness and control, and that what we do know is shaped by biases and preconceived notions that are not simple to unpack. Certainly, this makes human decision making just as much of a black box as the decisions made by algorithms.
Let’s assume that some CPD early adopters and innovators have accepted that human decision making is pretty irrational. They can now move toward creating new roles grounded in paradigms that shift beyond the traditional medical model. Design thinking can accelerate the development of prototypes and their implementation when new information keeps arriving and when rapid, widespread dissemination is necessary (Thakur et al., 2021). For example, Project Extension for Community Healthcare Outcomes (ECHO) has demonstrated its capacity to virtually connect health providers around the world and support their unique needs during COVID-19 (Sockalingam et al., 2020). A complex adaptive systems approach to collecting, vetting and deciding on urgent, evidence-based adjustments to policies and procedures has been used to rapidly develop guidelines and job aids and to make electronic documentation adjustments that improve patient care during the pandemic (Goss & Brown, 2020). Another innovation is the crisis-curriculum analysis framework, which provides a structured approach to curriculum analysis in the face of disruption such as COVID-19 (Govender & de Villiers, 2021). This multi-dimensional framework combines existing curriculum frameworks with strategies from the crisis-management literature. It describes three levels of crisis, from minimal disruption (level 1) to major disruption (level 3), such as the current pandemic, with its loss of several months of teaching and its need for emergency-remote teaching strategies. At each level of disruption there exist corresponding learning opportunities. Does this framework apply only to crisis situations? I would posit that educators find themselves in challenging situations on a daily basis. Therefore, such a framework could easily translate and transfer into many teaching and learning situations.
Critical perspectives will also be essential to ensure that medical algorithms are helpful, harmless and fair. As Sikstrom and colleagues (2022) explain, mitigating and eliminating bias in machine learning requires a sustained dialogue with a range of stakeholders. It involves working with and for equity-deserving and underrepresented communities to advance the concepts of health equity and fairness in machine learning. Algorithms are not inherently (un)helpful, (un)harmful or (un)fair. They are a mere reflection of our normative values. Incorporating sociotechnical approaches that are interdisciplinary, collaborative and equity-focused can safeguard against bias in the creation of algorithms.
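To make the idea of auditing an algorithm for fairness concrete, here is a minimal, hypothetical sketch (not from Sikstrom et al., 2022, whose argument is sociotechnical rather than purely statistical): one common first step is simply to measure whether a model recommends an intervention at different rates for different patient groups. The function name, the toy data and the "demographic parity" metric are illustrative assumptions; a single metric like this is a starting point for the stakeholder dialogue the authors describe, not a substitute for it.

```python
# Illustrative sketch: measuring one narrow notion of fairness
# (demographic parity) in a model's binary recommendations.
# All names and data here are hypothetical examples.

def demographic_parity_gap(predictions, groups):
    """Return the largest difference in positive-prediction rates
    between any two groups (0.0 means parity on this one metric)."""
    counts = {}  # group -> (positives, total)
    for pred, group in zip(predictions, groups):
        pos, total = counts.get(group, (0, 0))
        counts[group] = (pos + pred, total + 1)
    rates = {g: pos / total for g, (pos, total) in counts.items()}
    return max(rates.values()) - min(rates.values())

# Toy data: 1 = model recommends a treatment referral, 0 = it does not.
preds = [1, 1, 0, 1, 0, 0, 1, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
# Group A is referred at 3/4, group B at 1/4: a gap of 0.5.
print(demographic_parity_gap(preds, groups))  # -> 0.5
```

A gap of 0.5 would flag a large disparity worth investigating; deciding whether the disparity is harmful, and what to do about it, is precisely the equity-focused, interdisciplinary work the paragraph above calls for.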
These approaches and frameworks, coupled with virtual/augmented reality, data analytics and artificial intelligence, are the future of health professions education and practice. How can the CPD of today prepare health professionals for the technological revolution in health care? CPD educators and innovators have an incredible opportunity to lead the paradigm shift that moves beyond traditional biomedical approaches and creatively engages with technology-enhanced learning and artificial intelligence for improved health and systems outcomes.
Cadario, R., Longoni, C. & Morewedge, C.K. (2021). Understanding, explaining, and utilizing medical artificial intelligence. Nature Human Behaviour, 5, 1636–1642.
Goss, J.L. & Brown, M.-M. (2020). Nursing through a pandemic. Using a complex adaptive systems approach to transform practice. 2020 MNA Convention Abstracts. Maryland Nurse Journal, 22 (1). Retrieved from https://d3ms3kxrsap50t.cloudfront.net/uploads/publication/pdf/2146/Maryland_Nurse_11_20.pdf
Govender, L. & de Villiers, M.R. (2021). When disruption strikes the curriculum: Towards a crisis-curriculum analysis framework. Medical Teacher, 43, 694–699.
Minter, D.J., Geha, R., Manesh, R. & Dhaliwal, G. (2021). The future comes early for medical educators. Journal of General Internal Medicine, 36, 1400–1403.
Sikstrom, L., Maslej, M.M., Hui, K., Findlay, Z. & Hill, S.L. (2022). Conceptualising fairness: Three pillars for medical algorithms and health equity. BMJ Health & Care Informatics, 29 (1), e100459.
Sockalingam, S., Clarkin, C., Serhal, E., Pereira, C. & Crawford, A. (2020). Responding to health care professionals’ mental health needs during COVID-19 through the rapid implementation of Project ECHO. Journal of Continuing Education in the Health Professions. Advance online publication. doi:10.1097/CEH.0000000000000311
Thakur, A., Soklaridis, S., Crawford, A., Mulsant, B. & Sockalingam, S. (2021). Using rapid design thinking to overcome COVID-19 challenges in medical education. Academic Medicine, 96 (1), 56–61.