IJNP (Indonesian Journal of Nursing Practices) Vol 4 No 2 December 2020: 70-76

Student's Satisfaction on Online Nursing OSCE (ON-OSCE) Assessment Application

Resti Yulianti Sutrisno1, Yanuar Primanda1, Fahni Haris1
1Program Studi Keperawatan, Universitas Muhammadiyah Yogyakarta
Corresponding Author: Resti Yulianti Sutrisno
Email: restiyulianti@umy.ac.id

Article Info
Website: http://journal.umy.ac.id/index.php/ijnp
ISSN: 2548-4249 (Print), 2548-592X (Online)
DOI: 10.18196/ijnp.v4i2.10142

Abstract

Background: The Objective Structured Clinical Examination (OSCE) is one of the final assessment components for nursing students. The ON-OSCE (Online Nursing OSCE) application was developed to meet assessment needs and to overcome the challenges of manual, paper-based OSCE assessment. Manual OSCE assessment with multiple detailed checklists is time-consuming to score, and paper-based evaluation is prone to miscalculation of total scores, which can disadvantage students.

Method: This research was a non-experimental study with a descriptive design and a cross-sectional approach. The sample consisted of 480 nursing students. Satisfaction was assessed in terms of score accuracy, time to retrieve the score, and the examiner's attention. Data were analyzed using descriptive frequency and percentage distributions.

Results: The majority of students were satisfied with the time to retrieve the score (n = 365, 76%). More than half of the respondents (n = 285, 59.4%) were also satisfied with the score accuracy of ON-OSCE. Regarding the examiners' attention, 273 students (56.9%) felt that the examiners ignored the students' actions and focused on the laptop while scoring. This may be because the ON-OSCE application was new to the examiners and they were unfamiliar with operating it.

Conclusion: Most students were satisfied with the ON-OSCE application in terms of score retrieval time and score accuracy. However, they were less satisfied with the examiners' attention. The examiners should become more familiar with ON-OSCE.

Keywords: Assessment, Application, Online Nursing OSCE, ON-OSCE, OSCE, Nursing

INTRODUCTION

The Objective Structured Clinical Examination (OSCE) evaluates the clinical skills that nursing students learn in the skills laboratory. OSCE is a tool for assessing several components of clinical competence, such as assessment, physical examination, procedural skills, interpretation of laboratory results, patient problem management, communication, and attitudes (Ananthakrishnan, 1993). OSCE can facilitate students' psychomotor skills as well as their knowledge and attitudes (Baid, 2011). It is a valid and reliable method for assessing clinical competence objectively in various settings (Kurz, Mahoney, Martin-Plank, & Lidicker, 2009). By using OSCE, clinical skills can be evaluated to determine the competencies achieved by students.

OSCE provides many benefits, including developing student confidence and preparing students' skills and understanding for clinical practice (Alinier, 2003; Barry, Noonan, Bradshaw, & Tighe, 2011). OSCE requires students to actively demonstrate how they will apply the acquired knowledge to simulated "real world" situations (Franklin, 2005).
In a skills competency examination with OSCE, examiners at each station directly observe and assess each nursing student's performance in real time (Quero Munoz, O'Byrne, Pugsley, & Austin, 2005). The examiner assesses each student's performance using a checklist or rating scale (Robbins & Hoke, 2008). Each station tests specific competencies that are scored using a scoring sheet, which can take the form of a checklist or a combination of checklists and global scores. Examiners then provide a total score based on the student's overall performance (Ahuja, 2009).

OSCE assessment is carried out using various instruments; some are paper-based and some are online. Based on a study by Natarajan and Thomas (2014), paper-based OSCE assessment has several shortcomings: (1) results are not immediately available; (2) scores and feedback are delayed; (3) academic staff are under time pressure; (4) the ability to moderate and audit examination results is limited; (5) production costs are high; and (6) the assessment administration staff are burdened.

The weaknesses of the paper-based OSCE scoring system have prompted academics to innovate and make the OSCE scoring system more effective, efficient, and accurate. Alongside technological developments, OSCE scoring systems have evolved from paper-based toward electronic or online systems. Several studies describe the development of electronic OSCE scoring systems. Primanda, Sutrisno, and Haris developed an online OSCE assessment application for the school of nursing called ON-OSCE (Online Nursing Objective Structured Clinical Examination) (Primanda et al., 2019). Another study explored an electronic OSCE management system for nursing OSCEs (Meskell et al., 2015).

The online OSCE system, or ON-OSCE, has also been developed by the School of Nursing, Faculty of Medicine and Health Sciences, Universitas Muhammadiyah Yogyakarta since 2018, and the application has been used in the OSCE scoring process since the second semester of the 2018 academic year. Therefore, it is essential to evaluate students' satisfaction when they are examined using the ON-OSCE system.

METHODS

Research Design
This study was designed as a quantitative, non-experimental study with a descriptive design to determine the satisfaction of students who take the OSCE with online assessment in terms of scoring accuracy, the time (speed) of score release, and the focus/attention of the examiners on the actions performed by students during the exam. The study used a cross-sectional approach, and data were collected only once.

Population and Samples
The population of this study was nursing students in the first, second, third, and fourth years who took part in the OSCE using the online OSCE assessment, or ON-OSCE. The sampling technique was total sampling, and the sample consisted of 480 respondents. Respondents had been assessed with ON-OSCE for one semester (two blocks), two semesters (five blocks), or three semesters (eight blocks), and each block has its own OSCE scoring rubric. The inclusion criterion was nursing study program students of UMY who had participated in the OSCE using ON-OSCE assessment.

Data Collection
Data collection was carried out using a questionnaire that was distributed online.
Respondents filled out the online questionnaire using Google Form. The online questionnaire and informed consent were distributed through the class leader to be filled in by all respondents in that batch. Each respondent completed a student satisfaction questionnaire on the OSCE assessment using the ON-OSCE system. The satisfaction questionnaire consisted of three components: the speed of the assessment, the accuracy of the assessment, and the attention of the examiners during the implementation of the OSCE using ON-OSCE. Respondents were asked to provide an evaluation on a Likert scale from 0 to 10, where 0 indicates that the respondent is very dissatisfied and 10 indicates that the respondent is very satisfied. Scores of 8-10 were categorized as satisfied and scores of 0-7 as dissatisfied. Respondents were also allowed to provide descriptive comments to explain their satisfaction or dissatisfaction with the ON-OSCE assessment in more detail. Respondents had seven days to fill out the questionnaire, and every respondent who completed it had the opportunity to receive a reward. The researchers monitored the completeness of the data and the number of respondents based on the Google Form responses. After seven days of data collection, the responses were downloaded and analyzed.

Ethics
This study received ethical clearance from the Ethics Committee of the Faculty of Medicine and Health Sciences, Universitas Muhammadiyah Yogyakarta, with ethics number 051/EC-KEPK FKIK UMY/II/2019. The study also attended to ethical principles. For informed consent, participants who were willing to take part were first given explanations related to the research. For confidentiality, the respondents' identities were not recorded in the questionnaire. The principle of justice was applied: whether students chose to participate or not did not affect their study. Regarding beneficence, this study benefits students because it is part of an effort to improve the quality of student evaluation instruments.

Data Analysis
Data were analyzed descriptively. Age was analyzed numerically using the mean, minimum, maximum, and standard deviation. Gender, student year, score release speed/time, score accuracy, and examiner attention/focus were presented as frequencies and percentages. An exploratory, descriptive analysis was conducted on the satisfaction and dissatisfaction explanations given by the participants.
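To make the categorization and descriptive analysis described above concrete, the following sketch illustrates how the 0-10 responses could be grouped into the study's two categories and summarized as frequencies and percentages. It is illustrative only: the function names and the sample scores are assumptions, not the study's actual data or processing pipeline.

```python
# Illustrative sketch of the satisfaction categorization and frequency analysis.
# Sample scores and names are hypothetical, not the study's data.

def categorize(score: int) -> str:
    """Map a 0-10 Likert score to the study's two categories:
    8-10 = satisfied, 0-7 = dissatisfied."""
    return "Satisfied" if score >= 8 else "Dissatisfied"

def frequency_table(scores):
    """Return counts and percentages for each category."""
    counts = {"Satisfied": 0, "Dissatisfied": 0}
    for s in scores:
        counts[categorize(s)] += 1
    total = len(scores)
    return {cat: (n, round(100 * n / total, 1)) for cat, n in counts.items()}

# Hypothetical example responses for one component (e.g., score accuracy)
example_scores = [9, 7, 10, 8, 6, 8, 5, 9]
print(frequency_table(example_scores))
# {'Satisfied': (5, 62.5), 'Dissatisfied': (3, 37.5)}
```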
RESULTS

Table 1. Characteristics of Respondents by Age
Variable   Mean   Minimum   Maximum   SD
Age        21     17        26        1.6

The respondents' average age was 21 years; the youngest was 17 years old and the oldest was 26 years old.

Table 2. Characteristics of Respondents Based on Gender and Duration of Exposure to ON-OSCE
Variable                        Frequency   Percentage
Gender
  Male                          84          17.5
  Female                        396         82.5
  Total                         480         100.0
Duration of ON-OSCE exposure
  1 semester                    91          19.0
  2 semesters                   113         23.5
  3 semesters                   276         57.5
  Total                         480         100.0

Based on Table 2, most of the respondents were female (82.5%) and 17.5% were male. The majority of respondents had been exposed to the ON-OSCE scoring system for three semesters (276 respondents, 57.5%), followed by those exposed for two semesters and one semester.

Table 3. Description of Respondents' Satisfaction Based on the Time to Retrieve the Score (Speed), Score Accuracy, and the Examiner's Attention
Variable                                  Frequency   Percentage
Time to retrieve the score (speed)
  Satisfied                               365         76.0
  Dissatisfied                            115         24.0
  Total                                   480         100.0
Score accuracy
  Satisfied                               285         59.4
  Dissatisfied                            195         40.6
  Total                                   480         100.0
Examiner's attention
  Satisfied                               207         43.1
  Dissatisfied                            273         56.9
  Total                                   480         100.0

Based on Table 3, most respondents were satisfied with the time to retrieve the score (speed), namely 365 respondents (76%). Most respondents were also satisfied with the scoring accuracy of the ON-OSCE system, with 285 respondents (59.4%) satisfied. Regarding the examiner's attention to the actions taken by the respondent (OSCE exam participant) during the OSCE, more respondents were dissatisfied (273 respondents, 56.9%) than satisfied (207 respondents, 43.1%), although by a modest margin.

Table 4. Respondents' Satisfaction with the Time to Retrieve the Score, Score Accuracy, and Examiner's Attention, by Gender and Duration Assessed with ON-OSCE

                               Time to retrieve score      Score accuracy              Examiner's attention
Variable                       Satisfied   Dissatisfied    Satisfied   Dissatisfied    Satisfied   Dissatisfied
Gender
  Male (n)                     56          28              47          37              47          37
  % of males                   66.7        33.3            56.0        44.0            56.0        44.0
  % of all respondents         11.7        5.8             9.8         7.7             9.8         7.7
  Female (n)                   309         87              238         158             238         158
  % of females                 78.0        22.0            60.1        39.9            60.1        39.9
  % of all respondents         64.4        18.1            49.6        32.9            49.6        32.9
Duration assessed using ON-OSCE
  1 semester (n)               64          27              61          30              39          52
  % of 1-semester group        70.3        29.7            67.0        33.0            42.9        57.1
  % of all respondents         13.3        5.6             12.7        6.3             8.1         10.8
  2 semesters (n)              80          33              65          48              65          48
  % of 2-semester group        70.8        29.2            57.5        42.5            57.5        42.5
  % of all respondents         16.7        6.9             13.5        10.0            13.5        10.0
  3 semesters (n)              221         55              159         117             103         173
  % of 3-semester group        80.1        19.9            57.6        42.4            37.3        62.7
  % of all respondents         46.0        11.5            33.1        24.4            21.5        36.0

Based on Table 4, most male respondents (66.7%) and most female respondents (78%) were satisfied with the time to retrieve the score within 24 hours after the OSCE. Regarding satisfaction with the time to retrieve the score by duration of exposure to ON-OSCE, most respondents in each group were satisfied; the largest group of satisfied respondents was those who had participated for three semesters (221 respondents, 46% of the total), followed by those who had participated for two semesters and one semester.

Most male and female respondents were satisfied with the accuracy of OSCE scoring using the ON-OSCE application: 56% of male respondents and 60.1% of female respondents were satisfied. For satisfaction with scoring accuracy by duration of participation in the OSCE using the ON-OSCE system, more respondents were satisfied than dissatisfied across one, two, and three semesters of exposure; satisfaction was highest among those who had participated for one semester, followed by three semesters and two semesters.

Female respondents were more dissatisfied with the examiners' attention during the OSCE exam than male respondents. The table also shows that students who took the OSCE for two semesters were more satisfied with the examiner's attention (57.5%) than dissatisfied (42.5%), whereas more respondents who had participated in the OSCE with the ON-OSCE system for one semester or three semesters were dissatisfied with the examiners' attention.
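For readers cross-checking Table 4, the sketch below shows how the two kinds of percentage rows reported there relate to the raw counts: one is computed within each gender or duration group, the other against all 480 respondents. The counts are taken directly from the table; the function names are illustrative only.

```python
# Illustrative derivation of the two percentage rows in Table 4.
# Counts come from the table; function names are assumptions for illustration.

def within_group_pct(count: int, group_total: int) -> float:
    """Percentage within one gender or duration group (e.g., '% of males')."""
    return round(100 * count / group_total, 1)

def overall_pct(count: int, all_respondents: int = 480) -> float:
    """Percentage of all 480 respondents (e.g., '% of all respondents')."""
    return round(100 * count / all_respondents, 1)

# Male respondents satisfied with the time to retrieve the score: 56 of 84 males
print(within_group_pct(56, 84))    # 66.7
print(overall_pct(56))             # 11.7

# Three-semester respondents satisfied with the time to retrieve the score: 221 of 276
print(within_group_pct(221, 276))  # 80.1
print(overall_pct(221))            # 46.0
```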
In addition to the descriptive results from the questionnaire, this study also analyzed the respondents' descriptive evaluations and explanations of the OSCE. Several themes were obtained from these evaluations:

1. The usage of the ON-OSCE system is a good improvement:
   a. The usage of ON-OSCE is excellent at present.
   b. Examiners are becoming more objective.
   c. It is better to use ON-OSCE so that the score can be accessed quickly, and when there are errors in the score, they can be corrected together.
   d. This scoring system is already good, and I prefer the current scoring system. Hopefully, the credibility will be improved.
   e. I prefer the online OSCE assessment because the results come out faster, so I do not have to wait too long.
   f. It would be better if the examiner did not just sit down during the OSCE.
2. With the ON-OSCE system, the OSCE score can be accessed more easily and quickly:
   a. "It's nice to have it online. The results come out right away."
   b. Using the online OSCE, we can get our score faster.
3. Examiners focus on laptops instead of the students' actions:
   a. It would be better if examiners paid more attention to students during the OSCE so that the scores given are correct.
   b. Several lecturers were glued to their tablets and paid no attention to the student's process.
   c. Sometimes the examiner only focuses on the laptop.
   d. Sometimes examiners did not pay attention to students who were practicing but were busy looking at the checklist on the laptop; sometimes students had executed an action on the list, but the examiners were not aware of it or missed it.

DISCUSSIONS

Based on this study's results, OSCE scoring using the Online Nursing OSCE (ON-OSCE) system was more effective and efficient, as reflected in students' satisfaction with the score release/retrieval speed and the scoring accuracy. This is in line with Luimes and Labrecque's research, which stated that electronic-based scoring systems are more effective and efficient than paper-based assessments (Luimes & Labrecque, 2018; Snodgrass et al., 2014; Currie et al., 2017).

OSCE scoring with the online ON-OSCE system allows scores to be released quickly because the data entered by the examiner are calculated immediately, and the result, whether the student passes the exam or has to repeat it, is available right away. In addition, the grades can be accessed by the admin and announced directly to students. By contrast, when the assessment was still paper-based, the examiner needed time to sum up the detailed item scores, and the admin then had to input the scores into the system and announce them to the OSCE participants; the OSCE score could not be released in real time but had to be processed for several days. With the online OSCE scoring system, students felt very satisfied with the score release speed, based on both the quantitative and qualitative results, one respondent stating that "Scoring with ON-OSCE is very good because the score comes out straight away." This innovation shortens the working time.
Grades come out quickly, so students can find out whether they passed the OSCE or need to retake the exam. With the quick release of the OSCE score, students can prepare themselves better for the retest. The online OSCE scoring system was very satisfying in terms of score release speed because the assessors' and admin's working time was shorter. This is in line with Onwudiegwu, who stated that using an electronic scoring system saves time (Onwudiegwu, 2018), and with Treadwell, who compared paper-based OSCE with electronic methods and found that electronic methods are just as effective and more efficient (take less time) than traditional paper-based methods (Treadwell, 2006).

Another aspect that satisfied students using the ON-OSCE scoring system is that it is more accurate than the paper-based system. An online calculation system can achieve greater accuracy than a manually written one. The ON-OSCE system automatically calculates each scoring item entered by the examiner, performs the calculation, and concludes whether the student passed the exam or has to take the retest. Manual calculation by examiners carries a risk of error because examiners add up each scoring item by hand.
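As a rough illustration of the automatic checklist summation and pass/retest decision described above, consider the sketch below. It is not the actual ON-OSCE implementation: the item structure, the weights, and the 75% pass mark are assumptions made purely for illustration.

```python
# A minimal sketch of automatic checklist summation and a pass/retest decision.
# NOT the actual ON-OSCE implementation; items, weights, and the pass mark are assumed.

from dataclasses import dataclass

@dataclass
class ChecklistItem:
    description: str
    max_score: int
    awarded: int  # score entered by the examiner for this item

def total_score(items: list[ChecklistItem]) -> float:
    """Sum the awarded item scores and express them as a percentage
    of the maximum obtainable score."""
    earned = sum(i.awarded for i in items)
    maximum = sum(i.max_score for i in items)
    return round(100 * earned / maximum, 1)

def verdict(items: list[ChecklistItem], pass_mark: float = 75.0) -> str:
    """Return 'Pass' or 'Retest' by comparing the total to a pass mark
    (75% is an assumed, illustrative threshold)."""
    return "Pass" if total_score(items) >= pass_mark else "Retest"

# Hypothetical station checklist filled in by an examiner
station = [
    ChecklistItem("Hand hygiene performed", 2, 2),
    ChecklistItem("Patient identification and consent", 2, 1),
    ChecklistItem("Procedure executed in correct order", 4, 3),
    ChecklistItem("Therapeutic communication", 2, 2),
]
print(total_score(station), verdict(station))  # 80.0 Pass
```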
Respondents were also satisfied that OSCE scoring with the ON-OSCE system gives better access to score details, including the details of the examinee's mistakes noted by the examiners. This is possible because the examiners are provided a place to write comments when recording student errors or incomplete steps during the OSCE, so the feedback from the examiners becomes more complete. This is in line with Luimes and Labrecque's research, which stated that an electronic OSCE system improves the efficiency of assessment and the examiners' objectivity and can provide faster feedback to students (Luimes & Labrecque, 2018).

The assessment results of each action taken by students who take the OSCE and are examined using ON-OSCE are recorded in the system. They can be used for further analysis of educational development and evaluation of the learning process, including improving guidelines, checklists, teaching strategies, or even the curriculum. This is in line with Meskell et al., who stated that an electronic OSCE scoring system allows further analysis of student performance and the development of strategies for improving it (Meskell et al., 2015).

In this study, the aspect that still did not satisfy students was the examiners' attention while they were beginning to use ON-OSCE. Students' descriptions show that examiners were more focused on the laptops, or on the checklists displayed on them, so the examiners did not fully face the students or pay attention to their performance. This happened because the examiners were not yet familiar with the system, considering that ON-OSCE had only been used for three semesters. It is a challenge for examiners to adapt to the new system so that they can follow the assessment checklist while also paying attention to the students' actions throughout the OSCE. This is in line with Snodgrass et al., who mentioned that a lack of knowledge of the scoring system is one of the examiners' challenges in using an electronic-based scoring system (Snodgrass et al., 2014).

CONCLUSIONS

Most of the students were satisfied with the release time of the OSCE score and the accuracy of the scoring results using the ON-OSCE system. However, many students were still dissatisfied with the examiners' attention; the examiners focused more on the tablet or laptop they held during the OSCE than on the students' performance. The examiners should become more familiar with the scoring system and score students' performance objectively based on thorough observation.

ACKNOWLEDGEMENT

This study was supported by a research grant from Universitas Muhammadiyah Yogyakarta. The researchers thank the head of the School of Nursing and all students for their support and efforts in performing this project. The authors also address special thanks to the students for their cooperation.

REFERENCES

Ahuja, J. (2009). OSCE: A guide for students, part 1. Practice Nurse, 37(1), 37–39.
Alinier, G. (2003). Nursing students' and lecturers' perspectives of objective structured clinical examination incorporating simulation. Nurse Education Today, 23(6), 419–426. https://doi.org/10.1016/S0260-6917(03)00044-3
Ananthakrishnan, N. (1993). Objective Structured Clinical/Practical Examination (OSCE/OSPE). Journal of Postgraduate Medicine, 39(2), 82–84. Retrieved from http://www.jpgmonline.com/text.asp?1993/39/2/82/628
Baid, H. (2011). The objective structured clinical examination within intensive care nursing education. Nursing in Critical Care, 16(2), 99–105. https://doi.org/10.1111/j.1478-5153.2010.00396.x
Barry, M., Noonan, M., Bradshaw, C., & Tighe, S. (2011). An exploration of student midwives' experiences of the Objective Structured Clinical Examination assessment process. Nurse Education Today, 32, 690–694. https://doi.org/10.1016/j.nedt.2011.09.007
Currie, G. P., Sinha, S., Thomson, F., Cleland, J., & Denison, A. R. (2017). Tablet computers in assessing performance in a high stakes exam: Opinion matters. Journal of the Royal College of Physicians of Edinburgh, 47, 164–167. https://doi.org/10.4997/JRCPE.2017.215
Franklin, P. (2005). OSCEs as a means of assessment for the practice of nurse prescribing. Nurse Prescribing, 3, 14–23. https://doi.org/10.12968/npre.2005.3.1.17509
Kurz, J. M., Mahoney, K., Martin-Plank, L., & Lidicker, J. (2009). Objective Structured Clinical Examination and Advanced Practice Nursing Students. Journal of Professional Nursing, 25(3), 186–191. https://doi.org/10.1016/j.profnurs.2009.01.005
Luimes, J., & Labrecque, M. (2018). Implementation of Electronic Objective Structured Clinical Examination Evaluation in a Nurse Practitioner Program. Journal of Nursing Education, 57, 502–505. https://doi.org/10.3928/01484834-20180720-10
Meskell, P., Burke, E., Kropmans, T. J. B., Byrne, E., Setyonugroho, W., & Kennedy, K. M. (2015). Back to the future: An online OSCE Management Information System for nursing OSCEs. Nurse Education Today, 35(11), 1091–1096. https://doi.org/10.1016/j.nedt.2015.06.010
Natarajan, J., & Thomas, D. S. (2014). Integrative Review Literature on Objective Structured Clinical Examination and its implications in Nursing Education. IOSR Journal of Nursing and Health Science, 3(4), 23–30. https://doi.org/10.9790/1959-03412330
Onwudiegwu, U. (2018). OSCE: Design, Development, and Deployment. Journal of the West African College of Surgeons, 8, 1–22.
Primanda, Y., Sutrisno, R. Y., & Haris, F. (2019). The Development of Online OSCE Prototype for OSCE in School of Nursing: Lesson Learned. Advances in Health Sciences Research, 15(IcoSIHSN), 215–220. https://doi.org/10.2991/icosihsn-19.2019.47
Quero Munoz, L., O'Byrne, C., Pugsley, J., & Austin, Z. (2005). Reliability, validity, and generalizability of an objective structured clinical examination (OSCE) for assessment of entry-to-practice in pharmacy. Pharmacy Education, 5(1), 33–43. https://doi.org/10.1080/15602210400025347
Robbins, L. K., & Hoke, M. M. (2008). Using Objective Structured Clinical Examinations to Meet Clinical Competence Evaluation Challenges With Distance Education Students. Perspectives in Psychiatric Care, 44(2), 81–88. https://doi.org/10.1111/j.1744-6163.2008.00157.x
Snodgrass, S. J., Ashby, S. E., Anyango, L., Russell, T., & Rivett, D. A. (2014). Electronic Practical Skills Assessments in the Health Professions: A Review. The Internet Journal of Allied Health Sciences and Practice, 12(1), 1–10. Retrieved from http://ijahsp.nova.edu/articles/Vol12Num1/pdf/Snodgrass.pdf
Treadwell, I. (2006). The usability of personal digital assistants (PDAs) for assessment of practical performance. Medical Education, 40(6), 855–861. https://doi.org/10.1111/j.1365-2929.2006.02543.x