Research

November 2015, Vol. 7, No. 2  AJHPE         155

Effect of simulated emergency skills training and assessments on the competence and confidence of medical students

I Treadwell, DCur, HED

Skills Centre, Sefako Makgatho Health Sciences University (formerly Medunsa Campus of the University of Limpopo), Pretoria, South Africa

Corresponding author: I Treadwell (ina.treadwell@gmail.com)

Background. At Medunsa, Pretoria, South Africa, the training of final-year medical students includes the management of simulations that incorporate, inter alia, the following emergency skills: cardiopulmonary resuscitation (CPR), defibrillation, airway suctioning, oropharyngeal airway placement, endotracheal intubation and bag-valve-mask ventilation. Other than CPR, all emergency training of the 2012 student group was by means of apprenticeship in clinical rotations. There was therefore no evidence of the students' competence or confidence in performing emergency skills.
Objectives. To explore the effect of simulated skills training and assessments on medical students' competence and confidence when using the skills required to manage clinical emergencies.
Method. A one-group pretest post-test quasi-experimental design was used, with a convenience sample (n=82) comprising final-year medical students from 3 of the 6 annual Family Medicine rotations. The participants' competence (knowledge and selected emergency skills as per curriculum) and confidence were assessed before training. The intervention comprised training in relevant theory, demonstrations and supervised hands-on practice. The post-training assessments were a repeat of the pretraining assessments.
Results. The improvement in participants' confidence and competence levels when performing all the emergency skills on completion of the demonstrations and hands-on practice was highly significant (p≤0.001). Participants were unanimous in their opinion that pre-assessments had enhanced their learning experience.
Conclusions. The strategy of teaching/learning and assessment of emergency skills in simulation was highly effective in enhancing the competence and confidence of medical students when managing a clinical emergency. However, students appeared to be overconfident, which could be ascribed to ignorance, and possibly indicates that feedback during training should be improved.

Afr J Health Professions Educ 2015;7(2):155-157. DOI:10.7196/AJHPE.229

In 2010, a Skills Centre came into operation at the Medunsa campus of the University of Limpopo, situated 25 km north-west of Pretoria, South Africa. The medical core curriculum skills list was revised, and skills that could be taught in simulated situations were listed for each of the 6 academic years. Since 2011, all 6th-year students have been required to manage 3 simulated clinical emergencies in small groups during the orientation period of the Family Medicine rotation. The skills incorporated in these simulations are cardiopulmonary resuscitation (CPR), airway suctioning, placement of an oropharyngeal airway (OPA), endotracheal intubation, bag-valve-mask (BVM) ventilation and defibrillation.

As the 2012 final-year students had not been exposed to the simulated emergency skills training currently scheduled for 4th-year students, their emergency skills training comprised apprenticeship in real-life clinical situations. Their formal emergency training was limited to CPR in their 4th year and endotracheal intubation and BVM ventilation in their 5th year of study. There was no evidence that these students had had opportunities to practise emergency skills during their practical rotations, or of their competence and confidence when performing these skills.

Traditional bedside teaching, based on the apprenticeship model of education, cannot be relied on to provide adequate and comprehensive clinical skills training.[1] Changes in healthcare systems (shorter hospital stays and rapid advances in diagnostic and treatment technologies) have made this teaching method less effective, resulting in a sharp decline in the standard of clinical skills acquisition among medical students. The drawbacks of an apprenticeship methodology of skills acquisition (where learning is left to chance and is unobserved by teachers) can, however, be overcome by structured and observed training in skills centres.[2]

Growing evidence validating medical simulation as an educational tool has promoted its use beyond the instruction of physicians-in-training,[3] and skills centres have become an established part of training for healthcare professionals. Clinical skills centres provide students with the opportunity to practise clinical techniques on manikins and simulators in a safe environment, without affecting the quality of patient care. This has changed the centuries-old approach of learning medical procedures by first practising on a patient to one where competency is first demonstrated on a simulator. Simulation training, especially in emergency skills, is designed so that healthcare providers can learn by practising situations they are likely to encounter. It ensures that patients are not put at unnecessary risk by exposure to novice or out-of-practice caregivers,[4] and is also conducive to conducting objective assessments.

There is considerable debate on how accurately students assess their own competence. Several studies have shown that medical students' self-perceived competence correlates poorly with objectively assessed competence.[5] Apart from inadequate self-assessment skills, biased self-evaluation in applied settings can also be ascribed to the overconfidence phenomenon: 'We don't know what we know, but we are confident we do … Not only are we wrong, but we are confident that we are right!'[6] A more serious problem that has been identified is that individuals at the lowest levels of mastery lack the metacognitive understanding of what actually constitutes mastery, leading them to greatly overestimate their own skills.[6]




Competence and confidence are terms used to express beliefs about one's ability to perform an activity. Confidence refers to self-assurance arising from an appreciation of one's own abilities,[7] while in this study competence refers to the ability to perform a clinical skill successfully or efficiently. Competence can, however, be undermined by a lack of confidence, while misguided overconfidence in professional capabilities may have serious professional and malpractice consequences.[8] Clinical experience and level of confidence have no predictive value in performance assessments using standardised simulated scenarios. As self-confidence is not a reliable indicator of skills competence, it is important to measure both confidence and competence.[9]

Final-year medical students have a sound 
theoretical knowledge of emergency procedures,[10] 
but how confident and competent are they in 
performing these procedures? 

Objective
The objective of this study was to explore the 
effect of simulated skills training and assessments 
on final-year medical students’ competence and 
confidence in performing skills required to 
manage clinical emergencies.

Method
The study was conducted at the Skills Centre at 
Medunsa. The population of MB ChB VI students 
(N=176) was divided into 6 groups that rotated, 
as per curriculum, through 6 blocks of various 
disciplines during the year. A convenience sample 
(n=82) was used, comprising all the consenting 
students from 3 of these groups during their Family 
Medicine rotation. Ethical clearance was granted by 
the Medunsa Research and Ethics Committee and 
informed consent was obtained from participants. 

A one-group pretest post-test quasi-experimental design was used to determine the effect of skills training and assessment on students' competence and confidence in performing emergency procedures. Pretraining assessments of participants' competence (knowledge and selected emergency skills as per curriculum) and confidence were administered. The intervention comprised 3 training and practice sessions of 30 minutes each: (i) adult CPR and defibrillation; (ii) adult endotracheal intubation; and (iii) resuscitation of a paediatric patient (CPR, airway suctioning, placement of an OPA, and BVM ventilation). Three groups, each comprising 9 - 10 students, rotated through the 3 stations, each manned by 2 lecturers who supervised the students and, by implication, provided them with feedback.

The post-training assessments were a repeat of the pretraining assessments. Pre- and post-training assessments were conducted on the same day to minimise the threats of maturation and history. To prevent social desirability bias, the questionnaires were administered by the researcher, and responses to questionnaires were not accessible to the lecturers.

The pre- and post-training questionnaires 
comprised a 4-point Likert scale for self-report 
of confidence levels in performing 6 skills: CPR, 
clearing the airway by suctioning, placement of an 
OPA, endotracheal intubation, BVM ventilation 
and defibrillation. A statement on the effect of skills 
assessment prior to the teaching session was added 
to the post-training questionnaire.

The multiple-choice question (MCQ) test, used before and after the training, comprised questions relevant to the range of skills. The test was compiled and verified by 4 lecturers involved in emergency care training. The Objective Structured Clinical Examination (OSCE) assessment tools were compiled and tested to assess objectively the skills performed at each of the 3 OSCE stations. A pilot study with 43 students in the first Family Medicine rotation of 2012 was conducted to determine the viability of the instruments and the timing of the activities. These results were not included in the study.

Results
The results of the MCQ test, questionnaires and 
OSCE were captured on an Excel spreadsheet. 
The test and OSCE results before and after the 
teaching sessions were compared using Fisher’s 
exact test. All statistical tests were two-sided and 
p-values ≤0.01 were considered significant. 
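The pre/post comparisons described above amount to Fisher's exact tests on 2×2 tables (e.g. confident vs not confident, before vs after training). As an illustrative sketch only, the two-sided test can be computed from the hypergeometric distribution using just the standard library; the cell counts below are reconstructed from Table 2's rounded percentages for defibrillation (~33% of 82 participants confident before training, ~96% after), so they are approximate, and the study itself presumably used a statistics package:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of every table with the same
    margins that is no more probable than the observed table.
    """
    r1, r2 = a + b, c + d              # row totals
    c1 = a + c                         # first-column total
    n = r1 + r2                        # grand total
    denom = comb(n, c1)

    def p_table(x):
        # Probability of the table whose top-left cell is x.
        return comb(r1, x) * comb(r2, c1 - x) / denom

    p_obs = p_table(a)
    lo, hi = max(0, c1 - r2), min(r1, c1)
    # Two-sided p: include every achievable table at most as likely
    # as the observed one (small tolerance for floating-point ties).
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Defibrillation confidence: ~27/82 confident pretraining vs ~79/82
# post-training (counts reconstructed from rounded percentages).
p = fisher_exact_two_sided(27, 55, 79, 3)
```

With a pre/post difference this large, the resulting p-value falls far below the study's 0.01 threshold, consistent with the '<0.001' entries in Table 2.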

The mean scores of the pre- and post-training 
tests and OSCE assessments, the differences 
(improvement) and significance thereof are 
shown in Table 1.

The responses to the 4 categories of the Likert scale were summarised by frequency counts and percentages. The pre- and post-training percentages of 'confident' outcomes (a combination of responses in category 1 (very confident) and category 2 (confident)) were compared using Fisher's exact test. The mean scores of the pre- and post-training confidence levels, the differences (improvement) and their significance are given in Table 2.

Table 1. Differences in the mean scores of pre- and post-training assessments (n=82)

Assessment        Training                    Mean score        Mean score          Difference, %   p-value
                                              pretraining, %    post-training, %
MCQ test          -                           42                64                  21              0.0001
OSCE station 1    Paediatric resuscitation    23                74                  51              0.0001
OSCE station 2    CPR and defibrillation      19                81                  62              0.0001
OSCE station 3    Endotracheal intubation     16                52                  37              0.0001

Table 2. Differences in pre- and post-training confidence levels in performing emergency skills (n=82)

Skill                     Confidence        Confidence          Difference, %   p-value
                          pretraining, %    post-training, %
Airway suctioning         66                100                 34              <0.001
Placement of OPA          33                99                  66              <0.001
BVM ventilation           81                100                 19              <0.001
Endotracheal intubation   30                94                  64              <0.001
CPR                       87                100                 13              <0.001
Defibrillation            33                96                  64              <0.001

Table 3. Value of pretraining assessment (n=82)

Value of OSCE       Strongly disagree, %   Disagree, %   Agree, %   Strongly agree, %
Created awareness   0                      0             9          91
Enhanced learning   0                      0             12         88
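The frequency-count summary of the 4-point Likert scale described above can be sketched in a few lines; the responses below are invented for illustration, since the study's raw questionnaire data are not published, and the labels of categories 3 and 4 are an assumption:

```python
from collections import Counter

# Hypothetical responses on the questionnaire's 4-point scale:
# 1 = very confident, 2 = confident, 3/4 = lower-confidence categories.
responses = [1, 2, 2, 1, 3, 4, 2, 1, 2, 3]

counts = Counter(responses)
n = len(responses)

# Frequency counts and percentages per category.
percentages = {cat: 100 * counts.get(cat, 0) / n for cat in (1, 2, 3, 4)}

# A 'confident' outcome combines category 1 (very confident)
# and category 2 (confident), as in the analysis.
confident_pct = percentages[1] + percentages[2]
```

The resulting pre- and post-training `confident_pct` values for each skill are the proportions that Fisher's exact test then compares.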

Participants were unanimous (combination of category 1 (strongly agree) 
and category 2 (agree)) in their opinion that the pretraining OSCE had made 
them aware of their learning needs and the OSCE experience had enhanced 
their learning during the teaching session (Table 3).

Discussion
The lowest mean OSCE score was for endotracheal intubation (16% 
pretraining and 52% post-training). The medical students seem to find this 
emergency skill the most problematic. The literature shows that medical 
graduates feel inadequately prepared for performing an endotracheal 
intubation and it is recommended that more emphasis be placed on training 
medical students in this skill.[11] 

The improvement of participants’ competence in performing emergency 
skills in the post-training OSCE was highly significant (p<0.001). This 
improvement corresponds to the findings in a study on residents’ improved 
competence in critical resuscitation procedures following an intensive 
simulation-based training programme.[12]

The literature reports low confidence levels and poor self-assessment of proficiency with regard to procedural skills among medical students entering clinical rotations; their confidence improved significantly after a course in procedural skills.[13] Our results likewise indicate a highly significant increase in confidence levels when performing each of the skills.

Students reported that the pretraining assessment (OSCE) improved their learning. This is similar to a report indicating that students who were evaluated prior to their training performed better in the post-training evaluation than a control group who had not been evaluated before training.[14]

A limitation of this study was that, although the students seemed alarmingly overconfident, the data were unsuitable for statistically determining the correlation between competence (scores as percentages) and confidence (4 categories). An additional limitation was that individual feedback, as implied during supervised hands-on sessions, was not monitored. The absence of a correlation between confidence and grades could be the result of a lack of appropriate and clear feedback.[4] Students' inflation of their abilities might be caused by ignorance rather than arrogance;[15] such exaggerated judgements might result from an absence of feedback or a failure to incorporate feedback into self-perception.[5] Students tended to overestimate their own abilities; high-quality feedback[15] could act as an antidote to such inaccurate self-assessment.

Conclusion
The strategy of teaching/learning and assessment of emergency skills in simulation proved highly effective in enhancing the competence and confidence of medical students in their management of a simulated clinical emergency. The improvement in students' performance and confidence levels on completion of demonstrations and hands-on practice was highly significant (p<0.001).

The students appeared to be overconfident before engaging in this teaching/learning strategy. Their confidence levels increased significantly on completion of the simulation training, but were not borne out by their proficiency scores. This confirms a finding previously reported in the literature that self-confidence is not a reliable indicator of skills competence.[10]

Recommendations
As students' confidence levels were higher than their actual competency levels in the performance of emergency skills, it is recommended that emergency skills training be expanded to include high-quality individual feedback. The effect of such individual feedback and its role in enhancing self-perception should be further researched.

 
Acknowledgements. I would like to acknowledge the help and contributions 
of staff from the Skills Centre (H Havenga, M Theron, Y Uys, B Randa, K Kgasi, 
T Zana and T van Dyk), the Department of Family Medicine (Drs K Hlabyago, 
H Mabuza, S Nyalunga, C Barua, I Govender and J Ndimande) and the students 
who participated in this study.

References
1. Remmen R, Derese A, Scherpbier A, et al. Can medical schools rely on clerkships to train students in basic clinical skills? Med Educ 1999;33(8):600-605. [http://dx.doi.org/10.1046/j.1365-2923.1999.00467.x]
2. Ahmed AM. Role of clinical skills centres in maintaining and promoting clinical teaching. Sudan J Public Health 2008;3(2):97.
3. Meguerdichian DA, Heiner JD, Younggren BN. Emergency medicine simulation: A resident's perspective. Ann Emerg Med 2012;60(1):121. [http://dx.doi.org/10.1016/j.annemergmed.2011.08.011]
4. Brookes L. Developing simulation training for medical emergencies. Medscape interview, Paul Preston. http://www.medscape.com/index/list_6121_1 (accessed 15 November 2012).
5. Lai NM, Teng CL. Self-perceived competence correlates poorly with objectively measured competence in evidence based medicine among medical students. BMC Med Educ 2011;11(1):25. [http://dx.doi.org/10.1186/1472-6920-11-25]
6. Heath L, DeHoek A, Locatelli SH. Indirect measures in evaluation: On not knowing what we don't know. Practical Assessment, Research and Evaluation 2012;17(6). http://pareonline.net/pdf/v17n6.pdf (accessed 15 November 2012).
7. Oxford Dictionaries. http://oxforddictionaries.com/definition/english/confidence (accessed 20 March 2013).
8. Elzubeir MA, Rizk DEE. Assessing confidence and competence of senior medical students in an obstetrics and gynaecology clerkship using an OSCE. Educ Health 2001;14(3):373-382.
9. Hansen M, Oosthuizen G, Windsor J, et al. Enhancement of medical interns' levels of clinical skills competence and self-confidence levels via video iPods: Pilot randomized controlled trial. J Med Internet Res 2011;13(1):e29. [http://dx.doi.org/10.2196/jmir.1596]
10. Remes V, Sinisaari I, Harjula A, Helenius I. Emergency procedure skills of graduating medical doctors. Med Teach 2003;25(2):149-154. [http://dx.doi.org/10.1080/014215903100092535]
11. Ochsmann EB, Zier U, Drexler H, Schmid K. Well prepared for work? Junior doctors' self-assessment after medical education. BMC Med Educ 2011;24(11):99. [http://dx.doi.org/10.1186/1472-6920-11-99]
12. Langhan TS, Rigby IJ, Walker IW, Howes D, Donnon T, Lord JA. Simulation-based training in critical resuscitation procedures improves residents' competence. CJEM 2009;11(6):535-539.
13. Stewart RA, Hauge LS, Stewart RD, Rosen RL, Charnot-Katsikas A, Prinz RA. A CRASH course in procedural skills improves medical students' self-assessment of proficiency, confidence, and anxiety. Am J Surg 2007;193(6):771-773. [http://dx.doi.org/10.1016/j.amjsurg.2007.01.019]
14. Li Q, Ma EL, Liu J, Fang LQ, Xia T. Pre-training evaluation and feedback improve medical students' skills in basic life support. Med Teach 2011;33(10):e549-e555. [http://dx.doi.org/10.3109/0142159X.2011.600360]
15. DeAngelis T. Why we overestimate our competence. American Psychological Association 2003;34(2). http://www.apa.org/monitor/feb03/overestimate.aspx (accessed 13 November 2012).