December 2018, Vol. 10, No. 4 AJHPE 191

Editorial

Can they be trusted?

Training and assessing healthcare professionals boils down to one simple question: ‘Can they be trusted to provide safe, effective and efficient healthcare?’ How this question is answered is of major interest to the public and healthcare funders who are, respectively, the recipients and ‘sponsors’ of the care to be provided. Unfortunately, the wealth of data reporting on the morbidity and mortality associated with medical errors attests to the failure of health professions education programmes to achieve this mandate. How can the situation be remedied?

Processes for ensuring the competence of graduating healthcare professionals have been a major focus of attention throughout the history of medicine. Learning the ‘art and craft’ of medicine 400 years ago was achieved largely through an apprenticeship model. In the early 1900s, Flexner revolutionised this ‘cottage industry’ approach to clinical training by introducing a rigorous scientific approach to medical training programmes in the USA.[1] This trend was widely adopted, and training programmes with a strong scientific foundation became the norm.
Over the next century, health professions educators wrestled with the design of curricula, which evolved from organ- and system-based approaches to problem-based learning,[2] and more recently outcome-based[3] and competency-based education.[4] All these iterations of training reflect an earnest, ongoing attempt to bridge the gap between theory and practice.[5] Competency frameworks for guiding the training of healthcare professionals marked the start of a trend to define the desired capabilities of healthcare graduates more specifically.[4] A major challenge of these frameworks is their practicability in the clinical setting.[6] While they serve as comprehensive descriptions of desired graduate abilities, there remains a gap between the competencies described in frameworks and clinicians who can be trusted to deliver safe, effective and efficient patient care.

How then can clinical training be reconceptualised to focus on the essential responsibilities and activities of clinicians and their ability to perform them independently? Strategies to address this issue have given rise to the development of workplace-based assessment (WBA)[7] and the concept of ‘entrustable professional activities’ (EPAs).[8] These endeavours improve the likelihood that graduating healthcare professionals will ‘do the job properly’.

The idea of assessing the performance of trainees in the workplace began to take shape about 30 years ago. While it has taken time to gain momentum, it is now increasingly being implemented in clinical training, especially in postgraduate programmes. The literature highlights the many challenges associated with WBA and potential strategies for dealing with current limitations.[9,10] Despite ongoing scepticism and resistance from both trainees and educators, it is unlikely that WBA will be abandoned in an era where public expectations and demands for safer clinical practice are not negotiable. What about EPAs?
These are tasks or responsibilities that supervising clinicians entrust trainees to execute, unsupervised, once they have demonstrated adequate competence.[8] EPAs need to be discrete, measurable units of work that can be observed, allowing supervising practitioners to pass judgement on the level of entrustment that trainees can be afforded. The literature warns that EPAs need to be sufficiently specific and well defined to form a recognisable unit of work, but not dissected into disarticulated lists of actions.[6] It is recommended that postgraduate specialty programmes should focus on 20 - 30 EPAs.[11] Essentially, EPAs bridge the gap between the desired abilities of graduates described in competency frameworks and the actual professional activities they undertake in clinical practice.[11] Unlike competencies, which are generic across a range of clinical specialties, e.g. the CanMEDS framework, EPAs have discipline-specific nuances and include both process and content perspectives. For example, managing a patient with severe pregnancy-induced hypertension is a content-orientated obstetric EPA, while counselling a patient about end-of-life decisions is a process-orientated EPA.

While the concept was described more than a decade ago, the actualisation of EPAs in clinical training programmes is taking time to gain ground. There are descriptions of EPA-based training programmes in a range of disciplines, including internal medicine,[12,13] family medicine[14] and psychiatry.[15] However, more work is needed before EPAs become mainstream practice in health professions education. A key feature of using EPAs to describe the development of competence in the workplace is that they are assessed using entrustment scales.
These scales allow supervising clinicians to ‘make assessments based on narrative descriptors that reflect real-world judgements, drawing attention to the trainee’s readiness for independent practice rather than his/her deficiencies’.[16] It follows logically that assessors are required to explain to trainees why unsupervised practice of an EPA is not yet possible, and what further action is needed to achieve independent practice. The scales typically include five points: ‘observation only’; ‘perform under direct supervision’; ‘perform with readily available supervision’; ‘perform unsupervised with oversight’; and ‘provide supervision to more junior colleagues’.[6,11] These scales obviate the need to translate observed behaviour into numerical scores with abstract descriptors that do not resonate with the lived experience of clinicians who supervise trainees, e.g. ‘the trainee performs at the expected level of competence’.[16] These types of descriptors do not provide a clear definition of competence and are known to produce widely varying ratings with poor consistency. Once trainees no longer require supervision, they receive a ‘statement of awarded responsibility’ (STAR).[5] Training is complete once a STAR has been awarded for each EPA in the training programme. Ultimately, this approach to assessment – the glass is half full rather than half empty – may foster a more positive attitude towards WBA.

So, how can these advances support the endeavours of health professions educators who wrestle with the challenge of declaring trainees safe to undertake independent clinical practice? First, consideration needs to be given to articulating curricula as a set of essential EPAs with a clear description of the requisite knowledge, skills and attributes needed to perform each activity.
Guidelines for undertaking this process have been published.[17,18] Second, it seems that assessment in the workplace could be reconceptualised as a process of progressive entrustment, culminating in unsupervised independent practice of predetermined EPAs. This perspective would clarify the frequently misunderstood purpose of WBA and simplify the rating processes currently used.[16] Furthermore, feedback – the Achilles heel of formative assessment – is an integral part of EPA assessment processes because clinicians are obliged to provide trainees with reasons why they cannot yet perform an activity without supervision.[19]

To make progress in our endeavours to train healthcare professionals who can be trusted to deliver safe, effective and efficient healthcare, broader uptake of new methods for describing and assessing competence in the workplace is needed. As was recently written, ‘Be an advocate for a new view of certification and licencing’.[19]

This open-access article is distributed under Creative Commons licence CC-BY-NC 4.0.

Vanessa Burch
Honorary Professor, Department of Medicine, Faculty of Health Sciences, University of Cape Town, South Africa
vcburch.65@gmail.com

1. Cooke M, Irby DM, Sullivan W, Ludmerer KM. American medical education 100 years after the Flexner report. N Engl J Med 2006;355(13):1339-1344. https://doi.org/10.1056/NEJMra055445
2. Barrows HS. Problem-based learning in medicine and beyond: A brief overview. New Direct Teach Learn 1996;1996(68):3-12. https://doi.org/10.1002/tl.37219966804
3. Harden RM. Outcome-based education: The future is today. Med Teach 2007;29(7):625-629. https://doi.org/10.1080/01421590701729930
4. Frank JR, Snell LS, Cate OT, et al. Competency-based medical education: Theory to practice. Med Teach 2010;32(8):638-645. https://doi.org/10.3109/0142159X.2010.501190
5. Ten Cate O, Scheele F. Competency-based postgraduate training: Can we bridge the gap between theory and clinical practice? Acad Med 2007;82(6):542-547. https://doi.org/10.1097/ACM.0b013e31805559c7
6. Ten Cate O. Nuts and bolts of entrustable professional activities. J Grad Med Educ 2013;5(1):157-158. https://doi.org/10.4300/JGME-D-12-00380.1
7. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach 2007;29(9):855-871. https://doi.org/10.1080/01421590701775453
8. Ten Cate O. Entrustability of professional activities and competency-based training. Med Educ 2005;39(12):1176-1177. https://doi.org/10.1111/j.1365-2929.2005.02341.x
9. Govaerts M, van der Vleuten CP. Validity in work-based assessment: Expanding our horizons. Med Educ 2013;47(12):1164-1174. https://doi.org/10.1111/medu.12289
10. Academy of Medical Royal Colleges. Improving Assessment: Further Guidance and Recommendations. London: AoMRC, 2016.
11. Ten Cate O. AM last page: What entrustable professional activities add to a competency-based curriculum. Acad Med 2014;89(4):691. https://doi.org/10.1097/ACM.0000000000000161
12. Hauer KE, Kohlwes J, Cornett P, et al. Identifying entrustable professional activities in internal medicine training. J Grad Med Educ 2013;5(1):54-59. https://doi.org/10.4300/JGME-D-12-00060.1
13. Caverzagie KJ, Cooney TG, Hemmer PA, Berkowitz L. The development of entrustable professional activities for internal medicine residency training: A report from the Education Redesign Committee of the Alliance for Academic Internal Medicine. Acad Med 2015;90(4):479-484. https://doi.org/10.1097/ACM.0000000000000564
14. Shaughnessy AF, Sparks J, Cohen-Osher M, Goodell KH, Sawin GL, Gravel Jr J. Entrustable professional activities in family medicine. J Grad Med Educ 2013;5(1):112-118. https://doi.org/10.4300/JGME-D-12-00034.1
15. Boyce P, Spratt C, Davies M, McEvoy P. Using entrustable professional activities to guide curriculum development in psychiatry training. BMC Med Educ 2011;11(1):96. https://doi.org/10.1186/1472-6920-11-96
16. Rekman J, Gofton W, Dudek N, Gofton T, Hamstra SJ. Entrustability scales: Outlining their usefulness for competency-based clinical assessment. Acad Med 2016;91(2):186-190. https://doi.org/10.1097/ACM.0000000000001045
17. Peters H, Holzhausen Y, Boscardin C, ten Cate O, Chen HC. Twelve tips for the implementation of EPAs for assessment and entrustment decisions. Med Teach 2017;39(8):802-807. https://doi.org/10.1080/0142159X.2017.1331031
18. Ten Cate O, Chen HC, Hoff RG, Peters H, Bok H, van der Schaaf M. Curriculum development for the workplace using entrustable professional activities (EPAs): AMEE Guide No. 99. Med Teach 2015;37(11):983-1002. https://doi.org/10.3109/0142159X.2015.1060308
19. Ten Cate OT. Entrustment as assessment: Recognizing the ability, the right, and the duty to act. J Grad Med Educ 2016;8(2):261-262. https://doi.org/10.4300/JGME-D-16-00097.1

Afr J Health Professions Educ 2018;10(4):191-192. DOI:10.7196/AJHPE.2018.v10i4.1173