Views of emergency medical care students on the value of simulation for achievement of clinical competence

C Vincent-Lambert, ND AET, NHD PSE, NHD FST, BTech EMC, MTech ED, PhD, HPE; C N Douglas, BTech EMC

Department of Emergency Medical Care, Faculty of Health Sciences, University of Johannesburg, Doornfontein Campus, Johannesburg, South Africa

Corresponding author: C Vincent-Lambert (clambert@uj.ac.za)

This open-access article is distributed under Creative Commons licence CC-BY-NC 4.0.

Background. Simulation is a commonly used method for clinical learning and assessment in the health sciences. However, despite technological advancements, we are unable to perfectly simulate the appearance and behaviour of real patients, including the stressors, distractions and surroundings commonly encountered in the authentic clinical environment. In South Africa, simulations are used extensively in the education and training of emergency care practitioners (ECPs).
Objective. To investigate and describe the views of ECP students regarding the value of simulation v. four other learning methods in preparing them for real-world practice.
Methods. ECP students (N=79) completed a purposefully designed questionnaire. A Likert scale was used to obtain participants’ views on how well simulation ranks compared with theoretical lectures, tutorials, and inhospital and prehospital work in preparing them for clinical practice.
Results. Participants valued simulation as an educational tool. Simulation was ranked as the best method for teaching clinical assessments and procedures and came second only to the real clinical environment for teaching clinical decision-making. Simulation was ranked third, after theoretical classes and prehospital shifts, with regard to learning to take a history and identify life-threatening conditions.
Conclusions. ECP students view simulation as a valuable learning method for managing incidents, conducting clinical assessments, performing procedures and making clinical decisions. Simulation, however, has limitations and was seen as less suited to teaching history-taking and the identification of life-threatening conditions. Further research is needed to determine the ideal blend of simulation with other pedagogies in the education of ECP students.

Afr J Health Professions Educ 2019;11(4):118-122. https://doi.org/10.7196/AJHPE.2019.v11i4.1041

This article focuses on describing the perspectives of a sample of South African (SA) emergency care practitioner (ECP) students regarding the value of simulation v. four other learning methods in preparing them for clinical practice. In the context of our study, simulation refers to the creation of learning experiences through the use of actors, manikins, training aids and related equipment to simulate an authentic patient-practitioner interaction. Incident or scene management refers to the management of resources and the application of management strategies to deal with the patient and the environment. A ‘case’ in the context of this study refers to a particular patient – real or simulated.

In SA, prehospital emergency care is provided by a number of private and public emergency medical services. Each service employs staff with different levels of education, training and associated scopes of practice. Historically, emergency care training ranged from only a few weeks (for basic ambulance attendants) to months and years for paramedics. SA advanced life-support paramedics enjoy an extensive scope of practice that allows them to independently manage the majority of patients they encounter. However, there remains a subset of critically ill or injured patients who require interventions that fall outside the paramedic scope of practice. In many other countries, such patients would be attended to by emergency service doctors. In SA, a shortage of doctors to fulfil this role prompted government to follow a different path, which saw the emergence of the ECP. ECPs are healthcare professionals who function as prehospital acute care clinicians and medical rescue specialists.[1] ECPs practise independently, predominantly within the pre- and inhospital emergency and critical care transport environments. To become an ECP, one needs to complete a 4-year (National Qualifications Framework (NQF) level 8) professional degree in emergency medical care (EMC). ECP graduates register as independent practitioners with the Health Professions Council of South Africa (HPCSA). The University of Johannesburg (UJ) is 1 of 4 institutions nationally that offer the Bachelor degree in EMC.[2] These institutions use simulation for teaching, learning and assessment.

Simulation in health science education is not new; it was first described in the 17th century in France, where rudimentary manikins were used to simulate the process of birth.[3,4] Technology has progressed considerably, with modern human patient simulators now able to closely replicate the anatomy of real patients and to perform a range of physiological actions such as blinking, breathing, bleeding, vomiting, sweating and even convulsing. Many simulators, including those used at UJ, are equipped with computer-feedback systems that allow for the recording and analysis of a number of clinical procedures and interventions. UJ, along with most national and international providers of emergency care education and training, makes extensive use of task trainers and models together with advanced life-support manikins for clinical teaching, learning and assessment. At UJ, these are housed in a purpose-built simulation laboratory that services a number of departments in the faculty.

Our application of simulation in the academic unit is based on an educational philosophy of constructivism: we see EMC students as active participants in the learning process, with lecturers taking on the role of facilitators of learning. Clinical learning usually begins with the mastery of individual clinical procedures and associated psychomotor skills linked to set tasks such as measurement of vital signs, suturing, airway interventions and intravenous cannulation. We view mastery of these individual skills as essential building blocks and a prerequisite for engaging in a simulated patient interaction. The individual procedural skills are taught and assessed by means of objective structured clinical examinations (OSCEs). During a simulation learning experience, students are expected to perform one or more OSCEs in the appropriate context, setting and sequence. Because simulation requires active student participation, it has become entrenched in the framework of our curriculum as a learning method used by all our educators involved in clinical learning.
ECP students are also exposed to a number of other diverse learning experiences, ranging from conventional theoretical lectures and tutorials to inhospital clinical work. In addition, great emphasis continues to be placed on the attendance of rostered prehospital clinical learning shifts. During these shifts, students have the opportunity to work and learn in an authentic prehospital emergency care environment under the guidance of a supervising clinician (usually a qualified ECP), and are expected to apply the knowledge and skills that were taught and practised via simulation.

Despite the longstanding use of simulation, there is limited literature describing the perspectives of ECP students on the value of simulation for learning, or the perceived link between clinical practices and procedures taught via simulation and students’ performance in the real world. We chose to explore the views of EMC students on the value of simulation for the achievement of selected core competencies required to function as an ECP in the prehospital emergency medical service environment. We feel this study delivers new insights into how ECP students rank and/or value simulation compared with other commonly encountered learning experiences.

Methods
A prospective, quantitative, descriptive design was chosen for our study.[5] We chose this design because there was no pre-existing dataset available for analysis that spoke sufficiently to our aim and objective. Data were therefore gathered by means of a self-designed, non-validated questionnaire consisting of 17 closed-ended questions. Questions 1 - 3 gathered selected demographic information describing the participants: year of study, age and gender. Questions 4 - 9 required participants to rank different learning methods (including simulation) from best to worst for the achievement of defined core competencies. The competencies were selected pragmatically by the researchers, who felt that they reflected important exit-level learning outcomes of the qualification. The final 8 questions provided statements relating to practices and procedures taught in the simulated learning environment and their application in the authentic clinical environment. A Likert response scale was used to obtain each participant’s degree of agreement or disagreement with each statement.

The questionnaire was piloted with 4 students before being used in the study. The participants in the pilot group indicated that the questions were clear; consequently, no adjustments were made. The participants from the pilot group were excluded from the study population.

Participants were ECP students from years 1 - 4 enrolled in the EMC degree programme at UJ. At the time of the study, all participants had prior exposure to simulations and had worked in both the pre- and inhospital clinical learning environments. There were ~120 students in the degree programme at the time of data gathering; of these, 81 agreed to participate. Two of the questionnaires were found to be incomplete and had to be excluded. Seventy-nine completed questionnaires were thus available for analysis.

Data were analysed descriptively by tallying the responses to each question, allowing for calculation of the percentage and frequency of selected options. Data were captured onto an Excel spreadsheet, allowing for the generation of charts and tables summarising responses.
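For readers who wish to reproduce this style of descriptive analysis, the short Python sketch below tallies Likert responses and computes the frequencies and percentages described above. It is a minimal illustration only: the original analysis was performed in Excel, and the file name, column names and response labels used here are our own assumptions, not part of the study dataset.

```python
# Minimal sketch of the descriptive tally described in the Methods.
# Assumes a CSV export with one row per participant and one column per
# statement (e.g. "statement_1" ... "statement_8") holding Likert labels.
# File and column names are illustrative, not from the original study.
import pandas as pd

LIKERT = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]

def tally(responses: pd.Series) -> pd.DataFrame:
    """Return counts (n) and rounded percentages (%) for one Likert item."""
    counts = responses.value_counts().reindex(LIKERT, fill_value=0)
    percentages = (counts / counts.sum() * 100).round().astype(int)
    return pd.DataFrame({"n": counts, "%": percentages})

if __name__ == "__main__":
    df = pd.read_csv("ecp_simulation_survey.csv")  # hypothetical export
    for item in [c for c in df.columns if c.startswith("statement_")]:
        print(f"\n{item} (N={len(df)})")
        print(tally(df[item]))
```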
Ethical approval
Ethical approval for the study was granted by the Faculty of Health Sciences Research Ethics Committee, UJ (ref. no. REC-01-104-2017). Participation in the study was voluntary, and individual students, educators and supervising practitioners remained anonymous.

Results
In presenting the results, we attempted to follow a logic and flow similar to those of the questionnaire. Table 1 shows how participants ranked simulation as a learning method against four other selected learning methods for the achievement of six identified core competencies. Tables 2 - 9 provide a summary of responses to statements made regarding simulation learning. A brief narrative accompanying each table draws attention to selected core findings/areas of interest; these are dealt with in greater depth in the discussion.

Table 1. Ranking of simulation compared with other selected learning methods for achievement of core competencies
Core competency: learning method ranked best / 2nd best / 3rd best / 4th best / worst
Incident management: Prehospital shifts / Simulation / Theoretical lectures / Inhospital shifts / Tutorials
History-taking: Theoretical lectures / Prehospital shifts / Simulation / Inhospital shifts / Tutorials
Clinical assessment: Simulation / Prehospital shifts / Inhospital shifts / Theoretical lectures / Tutorials
Identification of life-threatening emergencies: Theoretical lectures / Prehospital shifts / Simulation / Inhospital shifts / Tutorials
Performance of clinical procedures: Simulation / Prehospital shifts / Inhospital shifts / Theoretical lectures / Tutorials
Clinical decision-making: Prehospital shifts / Simulation / Inhospital shifts / Theoretical lectures / Tutorials

Table 1 shows that, overall, the participants ranked simulation highly in terms of its educational value. Simulation was ranked as the best method for learning clinical assessments and procedures and came second only to the real clinical environment for learning clinical decision-making. Simulation, however, was seen to be less effective in preparing students to take a medical history and identify life-threatening conditions.
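The article does not state how the 79 individual best-to-worst rankings were consolidated into the single ordering shown in Table 1. One common approach is to sort the learning methods by the mean rank participants assigned to them; the Python sketch below illustrates only that generic technique, using invented example counts, and should not be read as the authors' actual procedure.

```python
# Illustrative aggregation of best-to-worst rankings into a single ordering.
# The study does not state its aggregation method; mean rank is shown here
# only as one plausible approach. The example data are invented.

# Rank assigned by each participant: 1 = best ... 5 = worst.
example_rankings = {
    "Simulation":           [1, 2, 1, 3, 2],
    "Prehospital shifts":   [2, 1, 2, 1, 1],
    "Inhospital shifts":    [4, 4, 3, 4, 5],
    "Theoretical lectures": [3, 3, 4, 2, 3],
    "Tutorials":            [5, 5, 5, 5, 4],
}

def consolidate(rankings: dict[str, list[int]]) -> list[tuple[str, float]]:
    """Sort learning methods from best to worst by mean assigned rank."""
    mean_rank = {method: sum(r) / len(r) for method, r in rankings.items()}
    return sorted(mean_rank.items(), key=lambda kv: kv[1])

for position, (method, rank) in enumerate(consolidate(example_rankings), start=1):
    print(f"{position}. {method} (mean rank {rank:.2f})")
```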
Responses to statements on simulation practices
The statements, together with tables summarising the responses, are given below.

Statement 1: ‘When I practise simulations in the simulation laboratory, the main reason I do so is to improve the way in which I manage real patients.’
Participants’ responses are summarised in Table 2. The majority of participants agreed with the statement, indicating that the main reason they practise simulation is to improve the way in which they manage real patients in the real clinical environment.

Table 2. Responses to statement 1 (N=79), n (%)
Strongly disagree 1 (1); Disagree 6 (8); Neutral 9 (11); Agree 29 (37); Strongly agree 34 (43)

Statement 2: ‘The way in which I am taught to manage a patient in the simulation laboratory is the same way as I am expected to manage a similar case when I work in the hospital or in the prehospital environment.’
Participants’ responses are summarised in Table 3. Most participants agreed that the way in which they are taught to manage a patient in the simulated environment is the same as the way they are expected to manage a patient in the inhospital and prehospital environments.

Table 3. Responses to statement 2 (N=79), n (%)
Strongly disagree 1 (1); Disagree 15 (19); Neutral 15 (19); Agree 30 (38); Strongly agree 18 (23)

Statement 3: ‘The way in which I am taught to use equipment in the simulation laboratory is the same way as I am expected to use it inhospital and in the prehospital environment.’
Participants’ responses are summarised in Table 4. The majority of participants agreed that they use equipment in the same way in the real clinical environment as they do in the simulation laboratory.

Table 4. Responses to statement 3 (N=79), n (%)
Strongly disagree 1 (1); Disagree 4 (5); Neutral 10 (13); Agree 33 (42); Strongly agree 31 (39)

Statement 4: ‘The way in which I am taught to perform specific skills in the simulation laboratory is the same as the way these skills are performed in the hospital or prehospital environment.’
Participants’ responses are summarised in Table 5. There was agreement by 45/79 (57%) of participants that the way in which they perform skills is the same in the simulated environment as in the real clinical environment. Interestingly, the remaining participants either disagreed (21%) or were neutral (21%), suggesting that not all students feel that the way skills are taught in the simulation environment mirrors the way they are performed in the real world.

Table 5. Responses to statement 4 (N=79), n (%)
Strongly disagree 0 (0); Disagree 17 (21); Neutral 17 (21); Agree 26 (34); Strongly agree 19 (24)

Statement 5: ‘The way in which the simulation environment and manikins are prepared accurately represents the real clinical environment.’
Participants’ responses are summarised in Table 6. Despite the institution having invested in expensive ‘high-end’ manikins and related simulation technologies, only 33% of participants felt that the simulation environment and manikins realistically represent the real clinical environment.

Table 6. Responses to statement 5 (N=79), n (%)
Strongly disagree 5 (6); Disagree 28 (36); Neutral 20 (25); Agree 24 (30); Strongly agree 2 (3)

Statement 6: ‘It is better to first practise clinical procedures and patient management in a simulated environment before being expected to perform these in the real clinical setting.’
Participants’ responses are summarised in Table 7. The majority of participants (57%) strongly agreed that it is better to first practise clinical procedures and patient management in a simulated environment before being expected to perform these in a real clinical setting.

Table 7. Responses to statement 6 (N=79), n (%)
Strongly disagree 2 (2); Disagree 3 (4); Neutral 3 (4); Agree 26 (33); Strongly agree 45 (57)

Statement 7: ‘The amount of time spent practising in the simulation environment is sufficient to prepare me for engaging in the real clinical environment.’
Participants’ responses are summarised in Table 8. Only 38% of participants felt that the time they spent practising in the simulated environment was sufficient to prepare them for the real clinical environment. We noted that second-year participants made up the majority of those who disagreed with the statement.

Table 8. Responses to statement 7 (N=79), n (%)
Strongly disagree 8 (10); Disagree 16 (20); Neutral 25 (32); Agree 21 (27); Strongly agree 9 (11)

Statement 8: ‘When I practise simulations in the simulation laboratory, the main reason I do so is to improve my performance in a simulation assessment.’
Participants’ responses are summarised in Table 9. Only 9% of students disagreed with this statement. This outcome is linked with the responses to statement 1 (Table 2), which indicated that the main reason for practising in the simulation environment was to improve actual patient management.

Table 9. Responses to statement 8 (N=79), n (%)
Strongly disagree 0 (0); Disagree 7 (9); Neutral 8 (10); Agree 28 (35); Strongly agree 36 (46)
Discussion
The literature shows that simulation-based learning is a mode of instruction widely used by emergency care educators locally and abroad as a way of improving confidence in the performance of clinical skills in stressful situations.[6] In SA, simulations are used extensively in the education and training of ECPs. Our study explored the perspectives of a group of ECP students regarding the value of simulation v. four other learning methods in preparing them for real-world practice.

Table 1 shows that simulation was highly ranked by ECP students as a method of learning to perform clinical assessments and procedures. This finding may be linked to the fact that many emergency care interventions and procedures are invasive and infrequently performed. Consequently, emergency care educators tend to rely heavily on practising emergency procedures such as intubation, establishing a surgical airway and defibrillation on models and manikins in a simulated environment. However, simulation was also valued more highly than clinical learning shifts for patient assessment. This outcome was unexpected, as one would have thought that the best way to learn patient assessment skills would be to practise on live patients. The reasons for this finding are not clear, but may point to limitations and/or negative experiences encountered by our participants in the authentic clinical learning environment rather than to the strength of simulation as a tool to achieve this outcome. Further research to explore the possible reasons for this finding is therefore recommended.

Our participants’ view that the prehospital environment was best for learning how to manage a scene may be linked to recognised limitations of current simulation technologies. At the time of this study, our simulation facilities could not fully replicate the prehospital environment in terms of noise, on-scene hazards and distractions, such as the presence of patients’ family members and bystanders. Despite these limitations, the literature supports the idea that simulations remain an accepted way of teaching students how to deal with stressful environments in a controlled setting.[7] A potential way of making simulation a better tool for learning to manage an emergency scene may be to increase the level of fidelity when creating prehospital emergency care simulations.

Our participants saw simulation as a beneficial learning method for teaching clinical decision-making skills. Clinical decision-making is a complex process that involves gathering and interpreting data from multiple sources to make decisions on clinical interventions, treatment plans and/or immediate courses of action.[8,9] In our context, we see clinical decision-making as a critical exit-level learning outcome of the Bachelor degree qualification and a cornerstone of independent prehospital emergency care practice. This study supports the value of simulation as a tool for the learning and assessment of clinical decision-making by ECP students.

The study also explored students’ experiences of the links between what they are taught and experience in the simulation laboratory and what they encounter in the real world.
The frequencies of agreement (the sum of ‘agree’ and ‘strongly agree’ responses) with statements 2 - 4 (Tables 3 - 5) were 61%, 81% and 57%, respectively, indicating that the majority of participants agreed that the way they are taught to manage patients, use equipment and perform skills in the simulation laboratory is similar to how these are expected to be done in the prehospital or inhospital environment. Conversely, many students did not agree that our simulated environments and manikins accurately represent the real-world setting. As mentioned above, this is a well-recognised limitation of simulation-based learning.[9,10]

Conclusions
Although the results of this study show that ECP students value simulation as a learning method, they seem to agree with educators that clinical competence cannot be achieved through simulation alone. Clinical placements, prehospital caseload and work in the authentic environment remain highly valued learning experiences.[11]

Study limitations
There are certain limitations relating to the scope and design of this study. Firstly, we acknowledge that our study was purely exploratory and descriptive; we did not probe in depth the exact reasons for the views expressed by our participants. Further research needs to be conducted to explore in greater depth ECP students’ experiences of simulation as a pedagogical tool. This may further assist educators in determining the optimum blend of learning experiences and how simulation is expressed in the curriculum. Secondly, certain of our response options contained what may be considered ‘neutral’ responses/statements; should similar surveys be considered, we would advocate omitting this option. Finally, while the study delivered some interesting findings, our participants were from a single university, and the views and opinions expressed may differ between institutions and across disciplines.

Declaration. None.
Acknowledgements. We wish to thank the participants for giving their valuable time to participate in the study.
Author contributions. CVL supervised the research and wrote the article; CND gathered the raw data and wrote the report on which the article is based.
Funding. None.
Conflicts of interest. None.

1. Vincent-Lambert C, Bezuidenhout J, van Vuuren MJ. Are further education opportunities for emergency care technicians needed and do they exist? Afr J Health Professions Educ 2014;6(1):6-9. https://doi.org/10.7196/AJHPE.285
2. Department of Emergency Medical Care, University of Johannesburg. 2017. https://www.uj.ac.za/faculties/health/Emergency-Medical-Care/Pages/default.aspx (accessed 20 August 2017).
3. Mack P. Understanding simulation based learning. In: Understanding Simulation Based Learning. 1st ed. Singapore: SGH-Life Support Training Centre, 2009:1-2.
4. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003 - 2009. Med Educ 2010;44:50-63. https://doi.org/10.1111/j.1365-2923.2009.03547.x
5. Williams C. Research methods. J Bus Econ Res 2007;5(3):65-72. https://doi.org/10.19030/jber.v5i3.2532
6. Omer T. Nursing students’ perceptions of satisfaction and self-confidence with clinical simulation experience. J Educ Pract 2016;7(5):131-138.
7. Kaddoura MA. New graduate nurses’ perceptions of the effects of clinical simulation on their critical thinking, learning, and confidence. J Contin Educ Nurs 2010;41(11):506-516. https://doi.org/10.3928/00220124-20100701-02
8. Tiffen J, Corbridge SJ, Slimmer L. Enhancing clinical decision making: Development of a contiguous definition and conceptual framework. J Prof Nurs 2014;30(5):399-405. https://doi.org/10.1016/j.profnurs.2014.01.006
9. Vincent-Lambert C, Bogossian F, eds. A Guide for the Assessment of Clinical Competence Using Simulation. 1st ed. Johannesburg: Universitas 21 Health Sciences Group, 2017.
10. Perkins GD. Simulation in resuscitation training. Resuscitation 2007;73(2):171-324. https://doi.org/10.1016/j.resuscitation.2007.01.005
11. Ruessler M, Weinlich M, Muller M, Byhahn C, Marzin I, Walcher F. Simulation training improves medical emergency management. 2012. http://www.medscape.com/viewarticle/764050 (accessed 20 August 2017).

Accepted 1 July 2019.