International Journal of Interactive Mobile Technologies (iJIM) – eISSN: 1865-7923 – Vol. 17, No. 13 (2023)

E-assessment during the Coronavirus Outbreak from the Perspective of Undergraduate at the University of Sharjah, UAE

https://doi.org/10.3991/ijim.v17i13.41515

Balsam Qubais Saeed1, Najah Rajeh Al Salhi2,3,4(), Salman Yousuf Guraya1, Sami Sulieman Al Qatawneh7, Mohd. Elmagzoub Eltahir2,3, Ahmed Omar Adrees5, Kubais Saeed Fahady2, Nagaletchimee Annamalai6

1 Department of Clinical Sciences, University of Sharjah, Sharjah, UAE
2 College of Humanities and Science, Ajman University, Ajman, UAE
3 Humanities and Social Sciences Research Center (HSSRC), Ajman University, Ajman, UAE
4 Deanship of Research and Graduate Studies, Ajman University, Ajman, UAE
5 College of Medicine, University of Sharjah, Sharjah, UAE
6 School of Distance Education, Universiti Sains Malaysia, Penang, Malaysia
7 College of Arts, Humanities, and Social Sciences, University of Sharjah, Sharjah, UAE

n.alsalhi@ajman.ac.ae

Abstract—The aim of the current research was to detect how undergraduates from the faculties of medicine, dentistry, pharmacy, and health sciences felt about using electronic assessment during COVID-19. A cross-sectional study was conducted at the University of Sharjah, UAE, between January and April 2021. As the study tool, a 26-item questionnaire was created using Google Forms and disseminated by the registration department via the learners' e-learning platforms. The study's data were analyzed using SPSS software. The outcomes demonstrated that students had a high level of acceptance of the computerized assessment: the overall arithmetic mean of the students' replies was 3.49, with a standard deviation of 1.33, indicating that they accepted the electronic assessment to a high degree.
Furthermore, there was a discernible difference between male and female students in acceptance of the internet-based assessment, and the variation in acceptance of the assessment by college was in favor of College of Medicine (COM) students. Students who had inadequate computer capacities, on the other hand, were more receptive to electronic testing. The study's findings may be helpful in developing academic methods, rearranging assessment alternatives, and changing the academic curriculum to address the problems and limitations of electronic assessment.

Keywords—e-assessment, COVID-19, pandemic, health sciences, undergraduate, perspective

1 Introduction

The World Health Organization (WHO) declared coronavirus disease 2019 (COVID-19), caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), a pandemic on March 11, 2020 [1]. The virus is highly contagious: it spreads from infected people to others via liquid particles expelled when they cough, sneeze, speak, or breathe, and it has rapidly spread across the world [1, 2]. Worldwide, several countries took restrictive measures to contain this pandemic disease, including social distancing, lockdowns, wearing facemasks, travel limitations, and the halting of all non-critical events, including all learning events [3]. As a result, attendance at schools and universities was disrupted; according to educational statistics, an estimated 1.5 billion pupils have been impacted by school and university closures [4–6].
In reaction to COVID-19, medical universities around the globe, including the University of Sharjah (UOS), either suspended or canceled their classroom lectures, conferences, workshops, and activities and transitioned to distance/online teaching [7, 8]. Distance online learning has made theoretical educational content easy to reach, for example through workshops attended from anywhere in the world, video-conferencing applications, and educational blogs [9]. With the abrupt switch to distance/online learning, however, educational institutions faced several obstacles, one of them being the assessment of students. Internet-based assessments were used by the majority of educational institutes during the COVID-19 pandemic as a student assessment tool [10], which is challenging for educational institutions and faculty members due to increasing student numbers. Many major concerns and difficulties were observed, such as technical problems related to internet connections and e-assessment platforms, anxiety among students about the mode of assessment, and the impracticality of such tests for assessing practical and clinical skills [11–14]. Moreover, taking remote electronic examinations (e-exams) at home has many other challenges, among them potential technical issues that affect the validity of an examination, academic dishonesty, and dishonest behaviors among students [15–17]. UOS adopted remote distance/online learning as a tool for student learning during COVID-19, and the distant e-test was taken as the main assessment tool during that period [18, 19]. There has not been much research exploring distant online testing in universities. Thus, the objective was to determine the level of acceptance of remote internet-based assessment during COVID-19 among medical faculties at UOS in the UAE.
These faculties included the College of Medicine (COM), College of Dental Medicine (COD), College of Pharmacy (COP), and College of Health Sciences (CHS). The uniqueness of this research is that it advances and activates the use of electronic testing. The study gives the researchers the chance to address the lack of evidence for the use of electronic tests in the educational process, particularly at UOS.

1.1 Research questions

The study investigates the level of acceptability of electronic evaluation during COVID-19 among students at UOS, UAE. Accordingly, it aims to address the following questions:

RQ1: In light of the COVID-19 outbreak, how much do Sharjah University students embrace computerized testing?

RQ2: Does the acceptance of electronic assessment by Sharjah University students during the COVID-19 outbreak differ by gender, college, and computer proficiency?

1.2 Significance of study

─ A better understanding of the perspective of medical and health science students might help identify the major obstacles to the most effective implementation, design, and administration of electronic tests.
─ This study may be helpful in encouraging and easing the switch to continuous e-education and electronic tests as an assessment process in the system of education during and after COVID-19.
─ Universities and other educational fields outside the UAE may benefit from the current study.

2 Methodology

2.1 Participants

The research participants consisted of 1150 undergraduates studying in the four colleges of COM, COD, COP, and CHS during the second term of 2020/2021. The participants were chosen using the simple random sampling method, a random technique in which the researcher selects a subset of a population (sample).
When employing such sampling, each person in the community has a chance of being chosen. Participants were given a total of 1150 questionnaires to complete to gather the information required to meet the research goals. Of these, 1032 were returned with all required fields filled in correctly; 118 individuals, representing all of the chosen colleges, failed to correctly complete the questionnaire. The final sample thus comprised 1032 students. Table 1 shows the demographic statistics of the undergraduate participants who responded to the questionnaire correctly.

Table 1. Demographic Data of Participants

Study variables   Variable levels   Freq (f)   (%)
Gender            Male              503        48.74%
                  Female            529        51.26%
                  Total             1032       100%
College           Medicine          350        33.91%
                  Dentistry         177        17.15%
                  Pharmacy          226        21.90%
                  Health Science    279        27.03%
                  Total             1032       100%
Computer skills   Poor              243        23.55%
                  Moderate          329        31.88%
                  Good              326        31.59%
                  Excellent         134        12.98%
                  Total             1032       100%

2.2 Study tool

Data from the sample students were gathered using a questionnaire, administered during the second term of the 2020–2021 academic year while COVID-19 was ongoing. Previous research in this field [14, 15] was evaluated throughout the development of the questionnaire. The questionnaire has two sections: the first covers basic data on the students, and the second contains the questionnaire items (n = 26) aligned with the study's goals.

Instrument validity. Eleven UAE university instructors with strong backgrounds in education were asked to act as arbitrators and to provide their opinions on the questionnaire items, including their usefulness for realizing the research goals and the quantity and thoroughness of the questions.
The criticisms and revisions proposed by the educational professionals were taken into consideration, and the appropriate deletions, modifications, and additions were made. To meet the study's goals, the questionnaire was modified and eventually comprised 26 items.

Instrument reliability. Cronbach's alpha was employed to confirm the research tool's internal consistency. The Cronbach's alpha coefficient, calculated in a pilot study comprising 50 students who were not among the research participants, was 0.873.

2.3 Measures of data analysis

A five-point Likert scale is employed in this research, as shown in Figure 1 below.

Fig. 1. Evaluation of Scale Data Using Different Scale and Score Interval Options

2.4 Data statistical analysis

The descriptive analyses (number, percentage, mean, and standard deviation), independent-samples t-test, one-way ANOVA, and Scheffe tests were all performed by the researchers using the SPSS software program.

3 Results

3.1 Study results ascribed to Question 1: How much do Sharjah University students embrace electronic testing as it relates to COVID-19?

To answer the first question, we calculated the average scores and standard deviations of each participant's answers to items 1–26, which concerned the participants' acceptance of e-assessment during COVID-19, as shown in Table 2.

Table 2. Descriptive Data for the Participants' Answers Regarding the Degree of Acceptance of E-assessment during COVID-19

No.   Paragraph                                                                                    Mean   SD     Description
Q1    I think E-assessment exams are more stressful than conventional exams.                       3.17   1.38   Moderate
Q2    I believe that I prefer E-assessment rather than traditional exams.                          4.07   1.09   High
Q3    In my opinion, the regulations on e-assessments are clear and understandable.                3.63   1.27   High
Q4    I believe that E-assessment exams are a flexible method of evaluation.                       3.47   1.15   High
Q5    I believe that students' e-assessment times are suitable.                                    3.61   1.08   High
Q6    I feel that the E-assessment increases the chances and attempts of cheating among students.  3.69   1.15   High
Q7    The E-assessment offers me a more motivating experience than adopting a traditional exam paper.  3.80   1.08   High
Q8    The E-assessment exams help in extracting and obtaining results quickly.                     3.98   1.01   High
Q9    The E-exam provides me the capability of identifying and accessing unanswered questions easily.  3.41   1.30   High
Q10   Environmentally, E-assessment is more responsive than a paper test.                          3.27   1.25   Moderate
Q11   I believe that E-assessment is a precise and reliable method of evaluation.                  3.30   1.32   Moderate
Q12   Compared to the printed conventional examination paper, E-assessment exams make me more stressed, pressured, and anxious.  3.22   1.28   Moderate
Q13   I feel that there are enough questions in the E-assessment exam.                             3.40   1.29   Moderate
Q14   The E-assessment exam contributes to raising the student's efficiency in learning.           3.30   1.35   Moderate
Q15   I think that the E-assessment system is flawless and detailed.                               3.52   1.34   High
Q16   Taking the E-assessment exams needs less time than the paper-based test.                     3.67   1.39   High
Q17   I believe E-assessments are more difficult than conventional paper exams.                    3.20   1.35   Moderate
Q18   I am extremely worried about internet interruption while I conduct E-assessment exams.       3.59   1.35   High
Q19   E-assessment exams are designed to evaluate students in every course.                        2.97   1.57   Moderate
Q20   I prefer to take E-assessment exams to evaluate my knowledge.                                3.31   1.47   Moderate
Q21   I think that the E-assessment exam usually includes a variety of questions requiring high thinking skills.  3.23   1.58   Moderate
Q22   In my opinion, I can improve my academic performance by using the E-assessment.              4.21   1.32   V. High
Q23   I think that E-assessment time is adequate to answer all questions.                          3.72   1.46   High
Q24   I think that the E-assessment exam makes me more interactive and enthusiastic during the test.  3.15   1.56   Moderate
Q25   I think that, when using the computer and internet, students don't need outside help.        3.59   1.52   High
Q26   The E-assessment exam helps give me quick feedback.                                          3.34   1.56   Moderate
Total                                                                                              3.49   1.33   High

The results provided in Table 2 indicate that students accepted electronic assessments during COVID-19, with the mean response for all items (1–26) being 3.49 (SD 1.33). This suggests that most Sharjah University students preferred electronic assessments to traditional paper examinations during COVID-19. Additionally, Table 2 shows that undergraduates' responses to Q-22, "In my opinion, I can improve my academic performance by using the E-assessment," received the highest mean score (4.21), at a very high level. With a mean score of 4.07, Q-2 ("I believe that I prefer E-assessment rather than traditional exams") came in second and performed similarly well. Q-8, "The E-assessment exams help in extracting and obtaining results quickly," placed third overall, with a mean score of 3.98.
Q-7, "The E-assessment offers me a more motivating experience than adopting a traditional exam paper," placed fourth, at a high level, with a mean value of 3.80. Q-23, "I think that E-assessment time is adequate to answer all questions," ranked fifth, with a mean of 3.72, also at a high level. Similar results were obtained for Qs 6, 16, 3, 5, 25, 18, 15, 4, and 9, with mean values of 3.69, 3.67, 3.63, 3.61, 3.59, 3.59, 3.52, 3.47, and 3.41, respectively. Q-19 ("E-assessment exams are designed to evaluate students in every course") had the lowest mean (2.97), indicating a moderate degree. A moderate degree was likewise attained for Qs 13, 26, 20, 11, 14, 10, 21, 12, 17, 1, and 24, with corresponding mean values of 3.40, 3.34, 3.31, 3.30, 3.30, 3.27, 3.23, 3.22, 3.20, 3.17, and 3.15.

3.2 Findings ascribed to Question 2: Does the level of acceptability of internet-based assessment among Sharjah University students during COVID-19 differ according to gender, college, and computer proficiency?

The significance of the differences among averages was assessed using Scheffe's post-hoc comparison test, the one-way ANOVA test, and the t-test after calculating the mean scores and SD for each item. The respondents' results are provided below in accordance with the study variables.

First: Gender variations among students. As indicated in Table 3, a t-test was performed to determine the significance of the variations among the averages of acceptance of internet-based assessment by learners at Sharjah University during COVID-19.

Table 3. Means and Standard Deviations of Students' Responses Based on Gender

Gender   N     Mean   SD      Mean Difference   t       df     Sig.     Sig. level
Male     503   3.42   0.662   0.095             2.046   1030   0.041*   Significant
Female   529   3.52   0.821
* Statistically significant at (p<0.05)

The findings shown in Table 3 demonstrate that the observed p-value (0.041) is below 0.05. Thus, the test at the 0.05 level is significant, indicating a significant difference in the level of acceptance of e-assessment by undergraduate students at Sharjah University during COVID-19 depending on the gender variable (males and females), in favor of females.

Second: College variable among students. The average acceptance of internet-based assessment by UOS undergraduate students during the spread of COVID-19 was compared across colleges using a one-way ANOVA test to detect the significance of the differences. Table 4 shows the findings of this variable's one-way ANOVA test.

Table 4. One-way ANOVA Test for College Variable Among Students

                 Sum of squares   df     Mean square   F       Sig. (tailed)   Sig. level
Among Groups     4.473            3      1.491         2.724   0.043*          Significant
Within Groups    562.647          1028   0.547
Total            567.120          1031
* Statistically significant at (p<0.05)

The results shown in Table 4 demonstrate that there are statistically significant differences in undergraduates' opinions depending on the college variable, since the p-value is 0.043, which is lower than the required statistical significance threshold (0.05). The results of the follow-up comparisons, which employed the Scheffe test to detect the cause of the discrepancies, are displayed in Table 5.
The results in Table 5 highlight that the variations in student acceptance of internet-based assessment by college were in favor of College of Medicine students.

Table 5. The Scheffe Test Results Based on the College Variable

(I) College      (J) College      Mean Difference (I-J)   Sig.
Medicine         Dentistry        0.12585                 0.065
                 Pharmacy         0.09958                 0.115
                 Health Science   0.16079*                0.007
Dentistry        Medicine         -0.12585                0.065
                 Pharmacy         -0.02627                0.723
                 Health Science   0.03494                 0.623
Pharmacy         Medicine         -0.09958                0.115
                 Dentistry        0.02627                 0.723
                 Health Science   0.06121                 0.356
Health Science   Medicine         -0.16079*               0.007
                 Dentistry        -0.03494                0.623
                 Pharmacy         -0.06121                0.356
* Statistically significant at (p<0.05)

Third: The variable of computer skills among undergraduates. The acceptance of internet-based assessments by undergraduates at UOS during the spread of COVID-19 was compared across skill levels using a one-way ANOVA test to determine the significance of the discrepancies. Table 6 displays the results of this variable's one-way ANOVA test.

Table 6. One-way ANOVA Test for Computer Skills Variable Among Undergraduates

                 Sum of squares   df     Mean square   F        Sig. (tailed)   Sig. level
Among Groups     20.382           3      6.794         10.796   0.000*          Significant
Within Groups    646.964          1028   0.629
Total            667.346          1031
* Statistically significant at (p<0.05)

Given that the p-value for the computer skills variable is 0.000, which is less than the required statistical significance level (0.05), the results shown in Table 6 clearly demonstrate that there are statistically significant variations in learners' opinions. The results of the follow-up comparisons, which employed the Scheffe test to determine the cause of the discrepancies, are displayed in Table 7 below.
The findings in Table 7 demonstrate that the disparities in acceptance of the e-assessment based on the computer skills variable were in favor of students with poor computer abilities.

Table 7. The Scheffe Test's Findings in Light of the Computer Skills Variable

(I) Computer skills   (J) Computer skills   Mean Difference (I-J)   Sig.
Poor                  Moderate              0.24748*                0.004
                      Good                  0.37524*                0.000
                      Excellent             0.29950*                0.007
Moderate              Poor                  -0.24748*               0.004
                      Good                  0.12776                 0.236
                      Excellent             0.05202                 0.938
Good                  Poor                  -0.37524*               0.000
                      Moderate              -0.12776                0.236
                      Excellent             -0.07574                0.834
Excellent             Poor                  -0.29950*               0.007
                      Moderate              -0.05202                0.938
                      Good                  0.07574                 0.834
* Statistically significant at (p<0.05)

4 Discussion

COVID-19 impacted education worldwide. Universities responded to the interruption in many ways; nonetheless, the most common, practically ubiquitous reaction was a shift to online teaching, learning, and assessment. In assessing medical students' performance, numerous assessment techniques and helpful feedback are needed to overcome the constraints of a single evaluation. During the COVID-19 period, remote electronic exams were regarded as a primary type of assessment for undergraduates in medical colleges. The majority of students view assessments as success indicators for their academic achievement, making them an essential tool in the learning process [20]. In recent years, online tests have been created and utilized extensively in higher education [21]. Web-based learning (WBL), which increases the efficacy of instructional programs, has quickly changed contemporary medical education [22]. Although electronic exams had been adopted by many colleges at UOS before the pandemic, they were mainly conducted on university campuses; the COVID-19-driven shift to distance electronic tests therefore raised considerable worries for universities, colleges, and students [23]. Our objective was therefore to evaluate the acceptance of distance electronic assessment by COM, COD, COP, and CHS students during COVID-19; moreover, we evaluated the acceptance of e-assessment based on gender, college, and computer skills to detect assessment barriers and student weaknesses. The research results indicated that medical students at UOS had higher acceptance of electronic assessment than of conventional paper exams during COVID-19. The previous adoption of online exams by COM students; the knowledge and experience of staff in modes of online/distance learning and assessment; and the availability of resources and equipment, such as electronic exam platforms, PCs, webcams, and headsets, might have contributed to this response [7]. A similar study [15] found that remote electronic tests were less favored than on-campus electronic tests among two-thirds of students. Another study [24] reported that 58.82% of respondents expressed great satisfaction with online workshops, online assessments, and virtual classrooms. In our research, a high number of students indicated that they improve their academic performance by using the e-assessment; the improved performance may be the result of using cutting-edge tools and online resources for remote learning at UOS. According to research [25] on pre-doctoral students at a dentistry school in the USA, online courses offered during COVID-19 might produce student course performance on par with or better than that of the identical in-person courses offered prior to the pandemic.
One of the advantages of online courses is that students are able to view recordings of the lectures, which helps them identify the important points mentioned during the lectures that they might have missed; a study among students at Al Ain University, UAE, showed that students perform significantly better in exams when taught online [26]. Students who study remotely benefit in a variety of ways, such as unrestricted access to educational resources like lecture recordings, networking opportunities with people from various cultural and geographic backgrounds, and scheduling convenience [27]. Furthermore, a majority of the students reported that they prefer electronic assessment rather than traditional exams. The main advantages of using e-assessment include improving student performance, raising instructors' efficiency, cutting expenses for the organization, and providing students with immediate feedback; the development of higher-order thinking is one of its educational purposes [28]. That is why a good number of students indicated that electronic assessment exams help in extracting and obtaining results quickly; using electronic assessment also saves the instructor time, because paper tests require the teacher to spend time correcting each paper [29-39]. In this study, the students reported that they believe the e-assessment time is enough to answer all questions. The University of Sharjah created Online Exam Guidelines; one of the instructions in this guide is that instructors must solve the exam ahead of time in order to estimate the time needed to solve it, to avoid errors in it, and to determine the duration of the exam: the average time for answering one multiple-choice question is 45 seconds, and the duration of the exam should be in the range of 75 to 120 minutes, based on the reasonable time required to answer the exam questions [40].
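The timing rule described above can be expressed as a small calculation. The sketch below is illustrative only: the 45-seconds-per-question rate and the 75–120-minute window come from the guideline as reported here, while the function name, integer rounding, and clamping behavior are our own assumptions.

```python
# Sketch of the exam-timing rule reported from the UOS Online Exam Guidelines:
# 45 seconds per multiple-choice question, with total duration kept within
# a 75-120 minute window. Names and rounding are illustrative assumptions.

SECONDS_PER_MCQ = 45
MIN_DURATION_MIN = 75
MAX_DURATION_MIN = 120

def exam_duration_minutes(num_questions: int) -> int:
    """Estimate an exam's duration from its question count, clamped to the
    recommended 75-120 minute window."""
    raw_minutes = num_questions * SECONDS_PER_MCQ / 60
    return int(min(max(raw_minutes, MIN_DURATION_MIN), MAX_DURATION_MIN))

print(exam_duration_minutes(80))   # 60 min raw, raised to the 75-min floor
print(exam_duration_minutes(120))  # 90 min, inside the window
print(exam_duration_minutes(200))  # 150 min raw, capped at 120
```

Under these assumptions, an 80-question paper would still be scheduled for 75 minutes, since the guideline's floor dominates the per-question estimate.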
Our findings highlighted a significant difference in the degree of acceptance of electronic assessment by students: females reported higher acceptance of e-assessment than males. This gender disparity may result from female students' greater dedication to participating in various educational activities. In the study of [24], a substantial improvement in female students' first- and third-year mean PBL marks was seen in the online sessions in comparison to the in-person meetings of the relevant year. Females are more likely than males to enroll in online courses, according to other research [41], and the authors speculated that this may be because women feel safer at home. According to our results, students with poor computer capacities were more likely to accept an electronic evaluation than those with better computer skills. By contrast, in Hamsatu et al. (2016) only 30% of the students strongly agreed that the test is technical and requires computer abilities, whereas 10% disagreed and 3.33% strongly disagreed that the exam is technical and requires computer skills. Because just one institution participated in this study, the results may not be generalizable. The study was also carried out a year after the start of the epidemic, when the majority of students had good to very good experience with distance learning and knew how to handle electronic evaluation.

5 Conclusion

Currently, electronic exams are considered an important tool in distance education. The results of this research found that most learners from the faculties of medical sciences at the University of Sharjah preferred electronic assessment over conventional paper exams during COVID-19, and College of Medicine students preferred electronic assessment more than other students.
Females reported more acceptance of electronic assessment than males, while acceptance of e-assessment based on the computer-skills variable was highest among students with poor computer skills. Our findings will be useful in developing academic methods, rearranging assessment alternatives, and changing the academic curriculum to address the problems and challenges presented by electronic examinations.

6 Limitations of the study

As with any other analysis, this research had several limitations to be acknowledged. A principal limitation of this study was that it only looked at the responses of students; responses from faculty members were not gathered. The second limitation was that the research participants were limited to 1150 students studying in the four colleges of COM, COD, COP, and CHS during the second term of 2020/2021.

7 Acknowledgments

The researchers thank the University of Sharjah for its cooperation and for providing all the facilities needed for the study.

8 References

[1] C.-C. Lai, T.-P. Shih, W.-C. Ko, H.-J. Tang, and P.-R. Hsueh, “Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) and coronavirus disease-2019 (COVID-19): The epidemic and the challenges,” International Journal of Antimicrobial Agents, vol. 55, no. 3, p. 105924, 2020. https://doi.org/10.1016/j.ijantimicag.2020.105924
[2] B. Q. Saeed, I. Elbarazi, M. Barakat, A. O. Adrees, and K. S. Fahady, “COVID-19 health awareness among the United Arab Emirates population,” PLoS ONE, vol. 16, no. 9, p. e0255408, 2021. https://doi.org/10.1371/journal.pone.0255408
[3] S. Esposito and N. Principi, “School closure during the coronavirus disease 2019 (COVID-19) pandemic: an effective intervention at the global level?,” JAMA Pediatrics, vol. 174, no. 10, pp. 921–922, 2020.
https://doi.org/10.1001/jamapediatrics.2020.1892
[4] H. Hafiz, S.-Y. Oei, D. M. Ring, and N. Shnitser, “Regulating in pandemic: evaluating economic and financial policy responses to the coronavirus crisis,” Boston College Law School Legal Studies Research Paper, no. 527, 2020. https://doi.org/10.2139/ssrn.3555980
[5] B. Q. Saeed, R. Al-Shahrabi, and O. A. Bolarinwa, “Socio-demographic correlate of knowledge and practice toward COVID-19 among people living in Mosul-Iraq: A cross-sectional study,” PLoS ONE, vol. 16, no. 3, p. e0249310, 2021. https://doi.org/10.1371/journal.pone.0249310
[6] F. J. de Oliveira Araújo, L. S. A. de Lima, P. I. M. Cidade, C. B. Nobre, and M. L. R. Neto, “Impact of Sars-Cov-2 and its reverberation in global higher education and mental health,” Psychiatry Research, vol. 288, p. 112977, 2020. https://doi.org/10.1016/j.psychres.2020.112977
[7] M. H. Taha, M. E. Abdalla, M. Wadi, and H. Khalafalla, “Curriculum delivery in Medical Education during an emergency: A guide based on the responses to the COVID-19 pandemic,” MedEdPublish, vol. 9, no. 69, p. 69, 2020. https://doi.org/10.15694/mep.2020.000069.1
[8] A. M. Sindiani et al., “Distance education during the COVID-19 outbreak: A cross-sectional study among medical students in North of Jordan,” Annals of Medicine and Surgery, vol. 59, pp. 186–194, 2020. https://doi.org/10.1016/j.amsu.2020.09.036
[9] C. A. Dykman and C. K. Davis, “Part One: The Shift Toward Online Education,” Journal of Information Systems Education, vol. 19, no. 1, 2008.
[10] J. Dermo, “e-Assessment and the student learning experience: A survey of student perceptions of e-assessment,” British Journal of Educational Technology, vol. 40, no. 2, pp. 203–214, 2009. https://doi.org/10.1111/j.1467-8535.2008.00915.x
[11] E. Birch and M. de Wolf, “A novel approach to medical school examinations during the COVID-19 pandemic,” Medical Education Online, vol. 25, no. 1, p. 1785680, 2020.
https://doi.org/10.1080/10872981.2020.1785680
[12] A. Chirumamilla, G. Sindre, and A. Nguyen-Duc, “Cheating in e-exams and paper exams: the perceptions of engineering students and teachers in Norway,” Assessment & Evaluation in Higher Education, vol. 45, no. 7, pp. 940–957, 2020. https://doi.org/10.1080/02602938.2020.1719975
[13] A. O. Mohmmed, B. A. Khidhir, A. Nazeer, and V. J. Vijayan, “Emergency remote teaching during Coronavirus pandemic: the current trend and future directive at Middle East College Oman,” Innovative Infrastructure Solutions, vol. 5, pp. 1–11, 2020. https://doi.org/10.1007/s41062-020-00326-7
[14] Organisation for Economic Co-operation and Development, Remote online exams in higher education during the COVID-19 crisis. OECD Publishing, Paris, 2020.
[15] L. Elsalem, N. Al-Azzam, A. A. Jum’ah, and N. Obeidat, “Remote E-exams during Covid-19 pandemic: A cross-sectional study of students’ preferences and academic dishonesty in faculties of medical sciences,” Annals of Medicine and Surgery, vol. 62, pp. 326–333, 2021. https://doi.org/10.1016/j.amsu.2021.01.054
[16] R. Wardoyo, I. M. A. Wirawan, and I. G. A.
Pradipta, “Oversampling approach using radius- SMOTE for imbalance electroencephalography datasets,” Emerging Science Journal, vol. 6, no. 2, pp. 382–398, 2022. https://doi.org/10.28991/ESJ-2022-06-02-013 [17] A. A. Tatarkanov, I. A. Alexandrov, L. M. Chervjakov, and T. V Karlova, “A Fuzzy Approach to the Synthesis of Cognitive Maps for Modeling Decision Making in Complex Systems,” Emerging Science Journal, vol. 6, no. 2, pp. 368–381, 2022. https://doi.org/ 10.28991/ESJ-2022-06-02-012 [18] L. Ali and N. Dmour, “The shift to online assessment due to COVID-19: An empirical study of university students, behaviour and performance, in the region of UAE,” International Journal of Information and Education Technology, vol. 11, no. 5, pp. 220–228, 2021. https://doi.org/10.18178/ijiet.2021.11.5.1515 [19] M. E. Kosov, V. V. Eremin, S. A. Pobyvaev, and T. S. O. Gaibov, “Applying the investment multiplier to identify key points of economic growth,” Emerging Science Journal, vol. 6, no. 2, pp. 273–285, 2022. https://doi.org/10.28991/ESJ-2022-06-02-05 [20] S. Kearney, “Improving engagement: the use of ‘Authentic self-and peer-assessment for learning’to enhance the student learning experience,” Assessment & Evaluation in Higher Education, vol. 38, no. 7, pp. 875–891, 2013. https://doi.org/10.1080/02602938.2012. 751963 [21] K. Khalaf, M. El-Kishawi, M. A. Moufti, and S. Al Kawas, “Introducing a comprehensive high-stake online exam to final-year dental students during the COVID-19 pandemic and evaluation of its effectiveness,” Medical Education Online, vol. 25, no. 1, p. 1826861, 2020. https://doi.org/10.1080/10872981.2020.1826861 [22] D. A. Cook, S. Garside, A. J. Levinson, D. M. Dupras, and V. M. Montori, “What do we mean by web‐based learning? A systematic review of the variability of interventions,” Medical education, vol. 44, no. 8, pp. 765–774, 2010. https://doi.org/10.1111/j.1365-2923. 2010.03723.x [23] H. M. Elmehdi and A.-M. 
Ibrahem, “Online summative assessment and its impact on students’ academic performance, perception and attitude towards online exams: University of Sharjah Study Case,” in Creative Business and Social Innovations for a Sustainable Future: Proceedings of the 1st American University in the Emirates International Research Conference—Dubai, UAE 2017, 2019, pp. 211–218. https://doi.org/10.1007/978-3-030- 01662-3_24 [24] A. Elzainy, A. El Sadik, and W. Al Abdulmonem, “Experience of e-learning and online assessment during the COVID-19 pandemic at the College of Medicine, Qassim University,” Journal of Taibah University Medical Sciences, vol. 15, no. 6, pp. 456–462, 2020. https://doi.org/10.1016/j.jtumed.2020.09.005 [25] M. Zheng, D. Bender, and C. Lyon, “Online learning during COVID-19 produced equivalent or better student course performance as compared with pre-pandemic: empirical evidence iJIM ‒ Vol. 17, No. 13, 2023 91 https://doi.org/10.1007/s41062-020-00326-7 https://doi.org/10.1007/s41062-020-00326-7 https://doi.org/10.1016/j.amsu.2021.01.054 https://doi.org/10.28991/ESJ-2022-06-02-013 https://doi.org/10.28991/ESJ-2022-06-02-012 https://doi.org/10.28991/ESJ-2022-06-02-012 https://doi.org/10.18178/ijiet.2021.11.5.1515 https://doi.org/10.28991/ESJ-2022-06-02-05 https://doi.org/10.1080/02602938.2012.751963 https://doi.org/10.1080/02602938.2012.751963 https://doi.org/10.1080/10872981.2020.1826861 https://doi.org/10.1111/j.1365-2923.2010.03723.x https://doi.org/10.1111/j.1365-2923.2010.03723.x https://doi.org/10.1007/978-3-030-01662-3_24 https://doi.org/10.1007/978-3-030-01662-3_24 https://doi.org/10.1016/j.jtumed.2020.09.005 Paper—E-assessment during the Coronavirus Outbreak from the Perspective of Undergraduate at the… from a school-wide comparative study,” BMC medical education, vol. 21, pp. 1–11, 2021. https://doi.org/10.1186/s12909-021-02909-z [26] G. G. A. El Refae, A. Kaba, and S. 
Eletter, “The impact of demographic characteristics on academic performance: face-to-face learning versus distance learning implemented to prevent the spread of COVID-19,” The International Review of Research in Open and Distributed Learning, vol. 22, no. 1, pp. 91–110, 2021. https://doi.org/10.19173/irrodl. v22i1.5031 [27] N. Croft, A. Dalton, and M. Grant, “Overcoming isolation in distance learning: Building a learning community through time and space,” Journal for Education in the Built Environment, vol. 5, no. 1, pp. 27–64, 2010. https://doi.org/10.11120/jebe.2010.05010027 [28] N. Alruwais, G. Wills, and M. Wald, “Advantages and challenges of using e-assessment,” International Journal of Information and Education Technology, vol. 8, no. 1, pp. 34–37, 2018. https://doi.org/10.18178/ijiet.2018.8.1.1008 [29] L. Gilbert, D. Whitelock, and V. Gale, “Synthesis report on assessment and feedback with technology enhancement,” 2011. [30] D. K. Abdul-Rahman Al-Malah, and B. H. Majeed, "Enhancement the Educational Technology by Using 5G Networks," International Journal of Emerging Technologies in Learning, vol. 18, no. 1, 2023. https://doi.org/10.3991/ijet.v18i01.36001 [31] R. ALairaji, "Abnormal Behavior Detection of Students in the Examination Hall From Surveillance Videos," in Advanced Computational Paradigms and Hybrid Intelligent Computing, vol. 1373: Springer Singapore, 2022, pp. 113-125. https://doi.org/10.1007/978- 981-16-4369-9_12 [32] H. T. Hazim, "Secure Chaos of 5G Wireless Communication System Based on IOT Applications," International Journal of Online & Biomedical Engineering, vol. 18, no. 12, 2022. https://doi.org/10.3991/ijoe.v18i12.33817 [33] H. Salim, "Secured Transfer and Storage Image Data for Cloud Communications," international Journal of Online and Biomedical Engineering, vol. 19, no. 06, 2023. https://doi.org/10.3991/ijoe.v19i06.37587 [34] A. Al-zubidi, R. K. Hasoun, and S. H. 
Hashim, , "Mobile Application to Detect Covid-19 pandemic by using Classification Techniques: Proposed System," International Journal of Interactive Mobile Technologies, vol. 15, no. 16, pp. 34-51, 2021. https://doi.org/10.3991/ ijim.v15i16.24195 [35] N. A. Jassim, and M. S. Farhan, "Design and Implementation of Smart City Applications Based on the Internet of Things," iJIM, vol. 15, no. 3, 2021. https://doi.org/10.3991/ijim. v15i13.22331 [36] H. Alrikabi, "The impact of teaching by using STEM approach in the Development of Creative Thinking and Mathemati-cal Achievement Among the Students of the Fourth Sci- entific Class," International Journal of Interactive Mobile Technologies (iJIM), vol. 15, no. 13, pp. 172-188, 2021. https://doi.org/10.3991/ijim.v15i13.24185 [37] H. T. Hazim, "Enhanced Data Security of Communication System using Combined Encryption and Steganography," International Journal of Interactive Mobile Technologies, vol. 15, no. 16, pp. 144-157, 2021. https://doi.org/10.3991/ijim.v15i16.24557 [38] R. A. Azeez, M. K. Abdul-Hussein, and M. S. Mahdi, "Design a system for an approved video copyright over cloud based on biometric iris and random walk generator using watermark technique," Periodicals of Engineering Natural Sciences, vol. 10, no. 1, pp. 178- 187, 2021. https://doi.org/10.21533/pen.v10i1.2577 [39] B. Mohammed, and R. Chisab, "Efficient RTS and CTS Mechanism Which Save Time and System Resources," international Journal of Interactive Mobile Technologies, vol. 14, no. 4, pp. 204-211, 2020. 
https://doi.org/10.3991/ijim.v14i04.13243 92 https://www.i-jim.org https://doi.org/10.1186/s12909-021-02909-z https://doi.org/10.19173/irrodl.v22i1.5031 https://doi.org/10.19173/irrodl.v22i1.5031 https://doi.org/10.11120/jebe.2010.05010027 https://doi.org/10.18178/ijiet.2018.8.1.1008 https://doi.org/10.3991/ijet.v18i01.36001 https://doi.org/10.1007/978-981-16-4369-9_12 https://doi.org/10.1007/978-981-16-4369-9_12 https://doi.org/10.3991/ijoe.v18i12.33817 https://doi.org/10.3991/ijoe.v19i06.37587 https://doi.org/10.3991/ijim.v15i16.24195 https://doi.org/10.3991/ijim.v15i16.24195 https://doi.org/10.3991/ijim.v15i13.22331 https://doi.org/10.3991/ijim.v15i13.22331 https://doi.org/10.3991/ijim.v15i13.24185 https://doi.org/10.3991/ijim.v15i16.24557 https://doi.org/10.21533/pen.v10i1.2577 https://doi.org/10.3991/ijim.v14i04.13243 Paper—E-assessment during the Coronavirus Outbreak from the Perspective of Undergraduate at the… [40] S. K. Raman, “Challenges in Effective Curricular Delivery, While Navigating the Uncertainties of the Pandemic Year 2020-’21–An Autoethnographic Report”. [41] D. Sonia and R. Kumar, “Students’ Perception towards Digitization of Education after Covid-19: A Survey,” International Journal of Engineering, Science, vol. 1, no. 1, 2020. 9 Authors Dr. Balsam Qubais Saeed, Clinical Microbiology at Collage of Medicine, University of Sharjah, United Arab Emirates (email: bsaeed@sharjah.ac.ae). Dr. Najeh Rajeh Ibrahim Al Salhi, Deanship of Research and Graduate Studies at Ajman University. Also, at Humanities and Social Sciences Research Center (HSSRC), Ajman University, Ajman, UAE (email: n.alsalhi@ajman.ac.ae). Prof. Salman Yousuf Guraya, Department of Medicine, University of Sharjah, United Arab Emirates (email: sguraya@sharjah.ac.ae). Dr. Mohd. Elmagzoub Eltahir, College of Humanities and Sciences, Ajman University, Ajman, UAE; Humanities and Social Sciences Research Center (HSSRC), Ajman University, Ajman, UAE (email: m.babiker@ajman.ac.ae). Dr. 
Sami Sulieman Al Qatawneh, College of Arts, Humanities, and Social Sciences, University of Sharjah, Sharjah, UAE (email: salqatawneh@sharjah.ac.ae). Ahmed Omar Adrees, is a Medical student at the University of Sharjah, Sharjah United Arab Emirates (email: aadrees@sharjah.ac.ae). Prof. Kubais Saeed Fahady, College of Humanities and Sciences, Ajman University, Ajman, UAE (email: K.fahady@ajman.ac.ae). Dr. Nagaletchimee Annamalai, School of Distance Education, Universiti Sains Malaysia, 11800, Pulau Pinang (email: naga@usm.my). Article submitted 2023-04-22. Resubmitted 2023-05-19. Final acceptance 2023-05-24. Final version published as submitted by the authors. iJIM ‒ Vol. 17, No. 13, 2023 93 mailto:bsaeed@sharjah.ac.ae mailto:n.alsalhi@ajman.ac.ae mailto:sguraya@sharjah.ac.ae mailto:m.babiker@ajman.ac.ae mailto:salqatawneh@sharjah.ac.ae mailto:aadrees@sharjah.ac.ae mailto:K.fahady@ajman.ac.ae mailto:naga@usm.my