Students' acceptance and perceptions of online assessments post-COVID-19 pandemic: A case of Community Extension students at a historically disadvantaged institution

Ntombenhle Ndlovu, Phiwayinkosi R. Gumede & Sandile Mthimkhulu
Mangosuthu University of Technology, South Africa

Perspectives in Education 2023 41(2): 180-194. DOI: https://doi.org/10.38140/pie.v41i2.6912. e-ISSN 2519-593X. Received: 1 December 2022. Accepted: 12 June 2023. Published: 30 June 2023.

Abstract

Traditionally, research has shown poor uptake and acceptance of non-traditional assessments. The COVID-19 pandemic, however, necessitated a drastic shift towards online assessments and rendered traditional assessments impractical. The acceptance of online assessments by university students enrolled at historically disadvantaged higher education institutions is currently under-researched. To address this gap, the Technology Acceptance Model (TAM) is employed as a theoretical framework to examine the acceptance of online assessments and identify barriers to their adoption. The study used a quantitative research design, and data were collected through an online questionnaire distributed via Microsoft Teams, the university's Learning Management System (LMS). Descriptive and inferential data analyses were conducted using a combination of Microsoft Excel and JASP version 0.16.1.0. A total of 83 second- and third-year students registered with the Department of Community Extension participated in the study. The results showed that 15% (n=12) of students found online assessments difficult to complete. Anxiety during the assessment was prevalent in 57% (n=47) of the students. Seventy-two percent (n=60) of the students indicated that online assessments improved academic performance, while 40% (n=33) still preferred face-to-face invigilated tests over online assessments. The results indicate that online assessments were accepted by students at historically disadvantaged institutions. However, the results emphasise the need to implement effective measures to maintain academic integrity, mitigate technical challenges, and provide training and support to reduce assessment-related anxiety among students.

Keywords: Community Extension university students, COVID-19, historically disadvantaged universities, online assessments

1. Introduction

In recent times, there has been a drastic shift towards the global adoption of technology. This trend has given rise to the rapid growth of online assessments in the higher education sector. Although students' acceptance and perceptions of online assessments have been reported in developed countries (Alsalhi et al., 2022), little is known about developing countries such as South Africa. Consequently, understanding students' acceptance and perceptions of online assessments is critical for institutions of higher learning, particularly historically disadvantaged institutions.
Given the reduced teaching resources and increased student numbers in the sector, Donnelly (2014) concedes that technology integration within higher education is a persistent requirement, in line with the global industry's fourth industrial revolution (4IR) trajectory. Kala and Chaubey (2022) assert that the diffusion and acceptance of online learning have become a common approach to education in both developed and developing countries. Therefore, understanding students' acceptance and perceptions of online assessments after the COVID-19 pandemic is critical in helping institutions of higher learning prioritise the deployment of relevant and targeted student support initiatives.

The use of online assessments in its primitive form dates back to the 1960s, when the first known attempt to use computers to assist educational assessment processes was recorded (Woolley, 1994, cited in Ćwil, 2019; Caleb & Elaine, 2022). This trend has fuelled the rapid growth of both online learning and assessment approaches within the higher education sector. While various factors can be credited with the exponential shift towards online learning, assessments play a vital role in teaching and learning: they measure students' understanding of the subject matter, determine the level of knowledge students have acquired, and are used to determine students' academic progression from one level to the next. In the recent past, online assessment has come to replace paper-based assessments in many universities (Boitshwarelo, Reedy & Billany, 2017). The 4IR is understood to be the core driver of this gradual shift from traditional face-to-face assessment methods towards more technologically driven methods such as online assessment. This evolution has been compounded and fast-tracked by the unpredictable nature of the world in which we live. The COVID-19 pandemic is one example of a catastrophe that fuelled the recent drastic shift in higher education, rendering traditional methods of teaching and learning unfeasible and impractical (Yadav, Sankhla & Yadav, 2022). As a result, the way students are taught and assessed has been altered.

Due to the COVID-19 pandemic, the higher education sector around the world was confronted with overwhelming challenges because university campuses had to be shut down. Institutions were forced to find ways to continue teaching and learning activities without the physical attendance of academic staff members and students (UNESCO, 2020; Lin, Foung & Chen, 2022). Efforts had to be refocused to ensure that academic years were saved and that no students were left behind because of the pandemic. Unavoidably, universities turned to online learning platforms to deal with this challenge. As a global response, UNESCO (2020) listed a shift to online assessments as one of the five main strategies that countries had adopted to manage high-stakes assessments during the COVID-19 crisis. Online assessment continues as an assessment approach in many universities after the COVID-19 pandemic; hence, it is important to understand its acceptability among students, given the ongoing contestation over the effectiveness of online assessments compared to traditional invigilated paper-based assessments (Ellis, Oeppen & Brennan, 2021).
According to Khan and Khan (2019), online assessments were an answer to the numerous challenges of traditional paper-based assessments, which include high paper use for printing, fears related to the security of transporting assessment scripts, the lack of automated marking, and high administrative costs (Snekalatha et al., 2021; Vazquez, Chiang & Sarmiento-Barbieri, 2021). The advantages of online assessments may, however, be counteracted by a lack of resources, poor infrastructure, and limited access to the internet at acceptable speeds. These conditions are synonymous with historically disadvantaged universities, and this may have affected the acceptance of online assessments among students enrolled at these HEIs. While research points to some of the challenges involved in the use of online assessments, it does not dismiss the value of online tests in the assessment of 21st-century learning, but provides insight into how these concerns can be addressed (Boitshwarelo et al., 2017). Generally, students' perceptions and acceptance of online assessment are positive (Alsalhi et al., 2022; Caleb & Elaine, 2022). However, limited research has been conducted post-COVID-19 and at historically disadvantaged institutions. Hence, this study closes a gap that currently exists regarding the implementation of online assessments post-COVID-19 at historically disadvantaged institutions, thereby contributing to the body of knowledge.

1.1 Theoretical framework

The Technology Acceptance Model (TAM), developed by Davis in 1986, is used as a theoretical framework to understand the acceptance of online assessments by university students at historically disadvantaged higher education institutions. TAM is an adaptation of the Theory of Reasoned Action (TRA) specifically tailored for modelling user acceptance of information systems (Davis, Bagozzi & Warshaw, 1989). TAM explains how individuals decide to accept and use a particular technology. The model postulates that Perceived Usefulness (PU) and Perceived Ease of Use (PEOU) are crucial in determining users' intention to use or adopt a technology (Davis et al., 1989). The Technology Acceptance Model is shown in Figure 1. The purpose of the TAM is to provide a basis for tracing the impact of external factors on internal beliefs, attitudes, and intentions.

Figure 1: Technology Acceptance Model

Within the context of this paper, PU refers to the extent to which students perceive online assessments as beneficial and valuable in improving their learning outcomes. If students believe that online assessments can enhance their academic performance, they are more likely to accept and adopt this mode of assessment. On the other hand, PEOU relates to the degree of ease and convenience students associate with using online assessments. If students perceive the online assessment system as user-friendly, accessible, and easy to navigate, it positively influences their acceptance and intention to use it.
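To make the model concrete, the minimal sketch below shows one common way TAM relationships are tested quantitatively: regressing behavioural intention on PU and PEOU scores. This is an illustration only, not the analysis performed in this study; the data are synthetic and all variable names are hypothetical.

```python
# Illustrative TAM sketch (not part of this study's analysis): regress
# Behavioural Intention (BI) on Perceived Usefulness (PU) and Perceived
# Ease of Use (PEOU), all on 5-point Likert scales. Data are synthetic.
import numpy as np

rng = np.random.default_rng(42)
n = 83  # same sample size as the study, purely for flavour

pu = rng.integers(1, 6, n).astype(float)    # perceived usefulness (1-5)
peou = rng.integers(1, 6, n).astype(float)  # perceived ease of use (1-5)
# TAM predicts BI rises with both PU and PEOU; simulate that with noise.
bi = 0.6 * pu + 0.3 * peou + rng.normal(0, 0.5, n)

X = np.column_stack([np.ones(n), pu, peou])    # intercept, PU, PEOU
beta, *_ = np.linalg.lstsq(X, bi, rcond=None)  # ordinary least squares
print(f"BI ~ {beta[0]:.2f} + {beta[1]:.2f}*PU + {beta[2]:.2f}*PEOU")
```

In an actual TAM study, the PU, PEOU and intention scores would come from validated multi-item Likert scales, and positive, significant coefficients on PU and PEOU would support the model's core postulate.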
TAM is used to achieve the overarching aim of this paper, which is to provide insights into the factors that affect students' acceptance of online assessments, particularly within the context of historically disadvantaged higher education institutions.

1.2 Research objectives

The objectives of this study were 1) to determine students' perceptions of online assessments, 2) to establish the acceptance of online assessments, and 3) to establish barriers to the acceptance of online assessments.

2. Methodology

2.1 Study design

The study was descriptive in nature and employed a quantitative research design. Although a quantitative approach was selected, participants were afforded an added opportunity to expand on their selected option(s). The study sought to determine the perceptions of students enrolled in the Department of Community Extension at a historically disadvantaged institution regarding the online assessments they undertook during the COVID-19 pandemic. The methodological integrity of the study was assured by enlisting a non-academic university staff member to recruit participants and administer the online questionnaire. This was done to limit any conflict of interest that could have arisen from power relations between the participants (students) and the researchers (academics/study investigators).

2.2 Study population and sampling strategy

Whole-population sampling was used to collect data. The decision to use this method was driven by several considerations, including the fear of a high dropout rate among respondents: the population being investigated was already small, and maximum participation was needed for meaningful statistical analysis. An invitation to participate in the study, an information letter and a consent letter (embedded in the online questionnaire) were published via Microsoft Teams, one of the Learning Management Systems (LMSs) used by the institution. A meeting with interested students was arranged online (MS Teams) to introduce the study. The prospective participants (students) were allowed to read the study information letter, ask questions, and voice any concerns they might have regarding the study. After factoring in the students' input, the link to the online questionnaire was uploaded on MS Teams to allow respondents to complete it.

2.3 Recruitment and data collection

Participants were recruited from a pool of 95 second- and third-year students over a period of five months (December 2021 to April 2022). Of the 95 students recruited, 83 participated in the study. These were students who had registered in 2020 as first- or second-year students. The online questionnaire was set to allow only one response from each student. To ensure anonymity, no personal information such as student numbers or respondents' names was required on the online questionnaire.

2.4 Data analysis

Descriptive and inferential data analyses were conducted using a combination of Microsoft Excel and JASP version 0.16.1.0. Microsoft Excel was used primarily for descriptive statistics of the student responses obtained during the survey. More complex correlational analyses were conducted in JASP. For example, the analysis of variance (ANOVA) was used to determine whether the observed differences in recorded responses were statistically significant.
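As an illustration of the kind of analysis described above, the sketch below runs descriptive statistics and a one-way ANOVA in Python. It is a minimal stand-in for the authors' Excel/JASP workflow, using synthetic Likert responses; only the group sizes (45 second-level and 38 third-level students) are taken from the study.

```python
# Illustrative sketch only: the authors used Microsoft Excel and JASP, not
# Python. This reproduces the same kind of analysis (descriptive statistics
# plus a one-way ANOVA) on hypothetical 5-point Likert responses.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical ratings for an item such as "Navigation through the online
# test was easy", split by study level as in Table 1.
second_years = rng.integers(1, 6, size=45)  # 45 second-level students
third_years = rng.integers(1, 6, size=38)   # 38 third-level students

for label, group in [("2nd level", second_years), ("3rd level", third_years)]:
    print(label, "median:", np.median(group),
          "SD:", round(np.std(group, ddof=1), 2))

# One-way ANOVA: is the difference in mean rating between levels significant?
f_stat, p_value = stats.f_oneway(second_years, third_years)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")  # reject H0 if p < 0.05
```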
2.5 Ethical considerations

Permission to conduct the study was obtained from the institution's Research Ethics Committee (RD1/09/21). The participants were reassured that participation was voluntary and that they could withdraw at any time. They were also advised that there would be no financial gain from participating in the study. Furthermore, participants were advised that no costs would be incurred by being part of the study, since the institution provided Wi-Fi or student data. All information collected from the participants was kept confidential.

3. Results

3.1 Characteristics of respondents

Table 1: Characteristics of the Community Extension students who participated in the current study

| Characteristic | Category | Freq. | Perc. (%) |
| --- | --- | --- | --- |
| Gender | Male | 31 | 37% |
| | Female | 52 | 63% |
| Study level | 2nd level | 45 | 54% |
| | 3rd level | 38 | 46% |
| Residence | Home | 11 | 13% |
| | Student residence | 70 | 84% |
| | Renting | 2 | 3% |
| Location during exam | Home | 7 | 8% |
| | On campus | 23 | 28% |
| | Student residence | 53 | 64% |
| Device used in online test | Mobile phone | 74 | 89% |
| | Laptop | 9 | 11% |
| Data source | Campus Wi-Fi | 13 | 16% |
| | Residence Wi-Fi | 20 | 24% |
| | CRG data | 48 | 58% |
| | Personal data | 2 | 2% |

**CRG: COVID Response Grant

Table 1 above indicates that, of the 83 students, 37% (n=31) were male and 63% (n=52) were female. There were 54% (n=45) second-level and 46% (n=38) third-level students. An overwhelming majority of 84% (n=70) of the students lived in university residences located in various parts of the city of Durban, whereas 13% (n=11) lived at home and 3% (n=2) rented accommodation. At the time the study was conducted, most students (64%, n=53) indicated that they wrote online tests in their residences. The majority (89%) of the students used mobile phones to write tests, and only a few (11%) used laptops. Most of the students (58%, n=48) used COVID Response Grant (CRG) data to access and write the online test. Only 2% (n=2) of the students wrote the test using their personal data.

Table 2 and Table 3 below report on the usability and accessibility of MS Teams as an online assessment platform for Community Extension students.

Table 2: The usability and accessibility of MS Teams as an online assessment platform for Community Extension students (the values in parentheses represent percentages)

| Variable | Median | SD | Responses (%) |
| --- | --- | --- | --- |
| Were there any technical/practical problems when undertaking the online test? | 4 | 0.47 | Yes (34); No (66) |
| How would you rate the structure of the online assessment? | 4 | 0.84 | Poor (0); Fair (4); Good (23); Very good (40); Excellent (33) |
| Navigation through the online test was easy | 4 | 0.58 | Completely disagree (1); Disagree (12); Neutral (0); Agree (69); Completely agree (18) |
| The acquisition of skills needed to complete the online test was easy | 4 | 0.51 | Completely disagree (0); Disagree (5); Neutral (0); Agree (70); Completely agree (25) |
| How would you rate the functionality of MS Teams as an online test platform? | 4 | 0.93 | Poor (1); Fair (4); Good (20); Very good (34); Excellent (41) |
| Did you experience any significant Wi-Fi issues during the test? | 1 | 0.49 | Yes (63); No (37) |
Table 3: The analysis of variance for the usability and accessibility of MS Teams to Community Extension students as an online assessment platform

| Source of variation | SS | d.f. | MS | F | P-value | F crit |
| --- | --- | --- | --- | --- | --- | --- |
| Were there technical/practical problems during the online test? | 264.55 | 1 | 264.55 | 4.31 | 0.286 | 161.45 |
| Structure of online test | 1407.13 | 3 | 469.04 | 59.2 | 0.004 | 9.28 |
| Navigation through the online test | 636.67 | 4 | 159.17 | 17.83 | 0.008 | 6.39 |
| Acquisition of skills to complete the online test was easy | 1523.26 | 3 | 507.75 | 239.86 | <0.001 | 9.28 |
| Functionality of MS Teams as an online test platform | 625.05 | 4 | 156.26 | 29.49 | 0.003 | 6.39 |
| Did you experience any significant Wi-Fi issues during the test? | 160.04 | 1 | 160.04 | 1.96 | 0.395 | 161.45 |

Table 4: Academic performance and quality as perceived by the participating Community Extension students (the values in parentheses represent percentages)

| Variable | Median | SD | Responses (%) |
| --- | --- | --- | --- |
| Would you say online tests have increased or decreased your academic performance? | 1 | 0.45 | Increased (72); Decreased (28) |
| I found the online test easy to complete | 4 | 0.70 | Completely disagree (2); Disagree (13); Neutral (0); Agree (58); Completely agree (27) |
| The online test was useful in my learning process | 4 | 0.64 | Completely disagree (2); Disagree (6); Neutral (0); Agree (64); Completely agree (28) |
| The online test covered the intended course content | 4 | 0.52 | Completely disagree (0); Disagree (6); Neutral (0); Agree (65); Completely agree (29) |
| Were you anxious before and after the online test? | 1 | 0.49 | Yes (57); No (43) |
| Did you have easy access to a suitable quiet space to sit the online test? | 1 | 0.36 | Yes (84); No (16) |
| How easy is it to cheat on an online test? | 4 | 1.45 | Very easy (12); Easy (17); Neutral (13); Difficult (19); Very difficult (39) |
| Which strategy is most effective in preventing cheating in online tests? | 3 | 1.01 | Limited time (34); NoA (10); SoQ (50); SPA (6) |

**The students' responses indicating academic performance and perceived quality of online assessment (n=83). NoA: none of the above. SPA: shuffling the possible answers. SoQ: shuffling of questions (different order for everyone).

Table 5: The analysis of variance for academic performance and quality as perceived by the participating Community Extension students

| Source of variation | SS | d.f. | MS | F | P-value | F crit |
| --- | --- | --- | --- | --- | --- | --- |
| Would you say online tests have increased or decreased your academic performance? | 496.81 | 1 | 496.81 | 54.76 | 0.086 | 161.45 |
| I found the online test easy to complete | 864.24 | 3 | 288.08 | 24.43 | 0.013 | 9.28 |
| The online test was useful in my learning process | 1193.75 | 3 | 397.92 | 47.33 | 0.005 | 9.28 |
| The online test covered the intended course content | 1356.33 | 3 | 452.11 | 16.87 | 0.022 | 9.28 |
| Were you anxious before and after the online test? | 43.91 | 1 | 43.91 | 2.47 | 0.361 | 161.45 |
| Did you have easy access to a suitable quiet space to sit the online test? | 1179.05 | 1 | 1179.05 | 3249 | 0.011 | 161.45 |
| How easy is it to cheat on an online test? | 231.67 | 4 | 57.92 | 10.23 | 0.022 | 6.39 |
| Which strategy is most effective in preventing cheating in online tests? | 352.01 | 3 | 117.34 | 7.46 | 0.066 | 9.28 |
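A note on reading Tables 3 and 5: the "F crit" column gives the 5% critical value of the F distribution. The printed values match an F distribution whose denominator degrees of freedom equal the numerator degrees of freedom in the "d.f." column; this pairing is inferred from the values themselves, as the tables do not state the denominator d.f. The short sketch below reproduces the cut-offs.

```python
# Reproducing the "F crit" column of Tables 3 and 5 from the F distribution.
# Assumption (inferred, not stated in the paper): denominator d.f. equals
# numerator d.f., the only pairing consistent with the printed values.
from scipy import stats

for dfn, dfd in [(1, 1), (3, 3), (4, 4)]:
    f_crit = stats.f.ppf(0.95, dfn, dfd)  # 95th percentile = 5% critical value
    print(f"df = ({dfn}, {dfd}): F crit = {f_crit:.2f}")
# Prints 161.45, 9.28 and 6.39; an observed F above this cut-off is
# significant at the 5% level, consistent with the reported p-values.
```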
3.2 The usability and accessibility of the online assessment platform

The results reflected in Table 2 and Table 3 above indicate that the majority of students (66%) reported no technical/practical problems when undertaking the online test, while 34% reported having experienced technical/practical problems. A total of 40% (p=0.004) of the students rated the structure of the online assessment as very good, while 69% (p=0.008) agreed that navigating through the online assessment was easy, compared to only 12% who disagreed. Most students (70%, p<0.001) agreed that the acquisition of skills needed to complete online tests was easy, 25% (p<0.001) completely agreed, while only 5% (p<0.001) disagreed. The functionality of MS Teams as an online test platform was rated as excellent by 41% (p=0.003) of the students, very good by 34% (p=0.003), and fair by only 4% (p=0.003) of the students. A higher percentage of students (63%) reported significant Wi-Fi issues during the test, compared to 37% who reported no significant Wi-Fi issues. The following section covers the impact of online assessments on academic performance and the overall quality as perceived by students.

3.3 Academic performance and quality as perceived by the students

Table 4 and Table 5 above present the results obtained from students in relation to academic performance and the overall quality of online assessments. The results show that most students (72%) stated that online assessments increased their academic performance, while 28% indicated that online assessments decreased their academic performance. Most students agreed (58%, p=0.013) that they found online assessments easy to complete, 27% (p=0.013) completely agreed, and 13% (p=0.013) disagreed. The majority of the students agreed (64%, p=0.005) that online assessments were useful in the learning process, 28% (p=0.005) completely agreed, 6% (p=0.005) disagreed, and 2% (p=0.005) completely disagreed.

In addition, a large percentage (65%, p=0.022) of students felt that online assessments covered the course content adequately, 29% (p=0.022) completely agreed and 6% (p=0.022) disagreed. Most students (57%) reported feeling anxious before and after the online test, compared to 43% who did not feel any form of anxiety before or after the online assessment. Eighty-four percent of the students were able to access a suitable, quiet space to sit the online test, while only 16% reported not having access to such a space. Most students reported that cheating on an online test would be very difficult (39%, p=0.022), while 19% (p=0.022) indicated that cheating would be difficult, 17% (p=0.022) that it would be easy, and only 12% (p=0.022) that it would be very easy. Half of the students reported the shuffling of questions as the most effective strategy to curb cheating during online assessments, followed by limiting time (34%) and shuffling the possible answers (6%).
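Since half of the students singled out question shuffling as the most effective anti-cheating measure, a brief sketch of how such shuffling can work may be useful. This is a hypothetical illustration, not how MS Teams implements it; seeding the shuffle with a hashed student identifier makes each student's question order different yet reproducible for marking.

```python
# Hypothetical sketch of the anti-cheating strategy students rated most
# effective: shuffling question order so each student sees a different
# sequence. Not the platform's actual mechanism.
import hashlib
import random

QUESTIONS = ["Q1", "Q2", "Q3", "Q4", "Q5"]  # stand-ins for real test items

def shuffled_paper(student_id: str, questions: list[str]) -> list[str]:
    """Return a per-student question order, reproducible from the student ID."""
    seed = int(hashlib.sha256(student_id.encode()).hexdigest(), 16)
    order = questions.copy()
    random.Random(seed).shuffle(order)  # deterministic for a given student
    return order

print(shuffled_paper("student-001", QUESTIONS))
print(shuffled_paper("student-002", QUESTIONS))  # different order, same items
```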
Table 6 below depicts the assessment method preferences of Community Extension students.

Table 6: The responses (A) and analysis of variance (B) for students' preferred assessment method in the Department of Community Extension (the values in parentheses represent percentages)

A

| Variable | Median | SD | Responses (%) |
| --- | --- | --- | --- |
| How likely are you to accept completing online tests in the future? | 4 | 1.16 | Most unlikely (5); Unlikely (5); Neutral (24); Likely (20); Most likely (46) |
| Having completed both online tests and face-to-face invigilated on-campus tests, which do you prefer? | 2 | 0.49 | Face-to-face (40); Online (60) |

B

| Source of variation | SS | d.f. | MS | F | P-value | F crit |
| --- | --- | --- | --- | --- | --- | --- |
| How likely are you to accept completing online tests in the future? | 571.35 | 4 | 142.84 | 51.79 | 0.001 | 6.39 |
| Having completed both online tests and face-to-face invigilated on-campus tests, which do you prefer? | 104.88 | 1 | 104.88 | 1.71 | 0.416 | 161.45 |

3.4 Student assessment method preference

Table 6 above reports on students' assessment method preferences. The results show that a large percentage of students (46%, p=0.001) were most likely to accept completing online assessments in the future; 20% were likely to do so; 24% were neutral; 5% (p=0.001) were unlikely; and 5% (p=0.001) were most unlikely to accept completing online assessments in the future. The majority of the students (60%) preferred online tests, compared to 40% who preferred face-to-face invigilated tests completed on campus.

4. Discussion

The aim of this paper was to provide insights into the factors that affect students' acceptance of online assessments, particularly in the context of historically disadvantaged higher education institutions. The Technology Acceptance Model was used as a theoretical framework to explore the relationship between students' perceptions of usefulness, ease of use, and their attitudes towards using online assessments. We evaluated Community Extension students' perceptions of online assessments and the perceived ease of use with the aim of incorporating this practice into the formative assessment methods of the diploma programme. The following three objectives guided the study: 1) to determine students' perceptions of online assessments; 2) to establish the acceptance of online assessments; and 3) to establish barriers to the acceptance of online assessments.
Figure 2 below maps the study's findings onto the TAM constructs.

Figure 2: Technology Acceptance Model applied to the study's findings (adapted from Davis, 1986)
- External variables: 1. infrastructure; 2. lack of support; 3. COVID-19
- Perceived Usefulness (PU): 1. increased academic performance (72%); 2. ease of completion (85%); 3. usefulness for learning (98%); 4. coverage of learning content (94%)
- Perceived Ease of Use (PEOU): 1. ease of navigating the online test (77%); 2. functionality of Teams (96%); 3. no technical problems (66%); 4. skills to complete the online test (95%); 5. access to quiet spaces
- Behavioural intention to use: 1. preferred online assessment (60%); 2. willing to complete future online tests (66%)
- Actual system use: acceptance of online assessment

4.1 Objective 1: To determine students' perceptions of online assessments

The current study indicated that a high percentage of students reported that cheating was difficult when completing an online assessment. This result is inconsistent with the literature, which reports a high prevalence of cheating during online assessments (Valdez & Maderal, 2021). Academic dishonesty has contributed significantly to the poor adoption and acceptance of online assessment within the education fraternity (Petrisor et al., 2016; Iskandar et al., 2021; Valdez & Maderal, 2021) and varies in severity. Similarly, Meccawy, Meccawy and Alsobhi (2021) reiterate the increasing prevalence of online cheating among students and recommend curbing or reducing this common practice by: 1) increasing awareness of ethics and integrity among students; 2) training educators on cheating methods; and 3) imposing substantial disciplinary action on students who engage in academic dishonesty.

On the contrary, Peled et al. (2019) offer an alternative view: the decision to engage in academic dishonesty is independent of whether an assessment is online or face-to-face and is instead determined by the student's personality traits and the university's assessment-related policies and procedures. This view suggests that online assessment itself is not an obstacle; rather, the issue is the academic integrity of the students who participate in online assessments. It supports the need to increase student awareness of matters relating to academic integrity and ethical behaviour during assessments of any kind (Meccawy et al., 2021). This does not, however, negate the need for HEIs to ensure that online assessments meet the learning objectives and outcomes while maintaining acceptable levels of rigour.

The rapid transition to online learning and assessments may have caused anxiety and stress among students. A greater percentage of students in the current study reported having experienced anxiety before and after online assessments.
Govender, Reddy and Bhagwan (2021) further suggest that students from disadvantaged areas may be prone to added stress, especially when subjected to remote online assessments. This is attributed to persistent connectivity issues, which may exacerbate online test anxiety and affect students' academic performance. Interestingly, although the majority of the students reported experiencing online test anxiety, 72% indicated that online assessments increased their academic performance. In contrast, Woldeab and Brothen (2019) report that test anxiety has a negative impact on students' academic performance, a finding widely supported in the literature. The results from this case study suggest that test anxiety may not affect students' academic performance. In addition, we were not able to ascertain distinctly whether online test anxiety existed before the online test or after it. Further research is required to understand the impact of pre-test and post-test anxiety in online assessment and the factors that influence anxiety at both stages.

4.2 Objective 2: To establish the acceptance of online assessments

The study found that the majority of the students preferred online assessments over face-to-face assessments. This finding echoes previous research by Iskandar et al. (2021), which shows a preference for new modes of assessment over more traditional options. However, Aguilera-Hermida (2020) asserts that face-to-face options, despite being dated, remain the preferred option for some students. This view is supported by the 40% of study participants who indicated a preference for face-to-face assessments. These divergent views show that both old and new modes enjoy substantial support for varying reasons. The loss of support for online modes was blamed on the numerous technical and resource-related challenges that students reported while completing the online assessments.

Most of the students indicated an acceptance of online assessments for future use. However, 24% were still neutral, 5% were unlikely to accept them, and 5% reported being most unlikely to accept online assessments in the future. These results emphasise that although most students accept online assessments, a substantial number of students remain reluctant to complete online assessments in the future. This is concerning, especially with educational trends touting e-learning and e-assessments as the future of education. More research needs to be conducted to establish solutions that make online assessments acceptable to every student, especially students enrolled at historically disadvantaged universities.

The impact of online assessments on academic performance was an area of interest even before the recent pandemic. The literature suggests that online assessments have a positive impact on academic performance. Butler-Henderson and Crawford (2020) attribute the improved academic performance to 1) the flexibility of online assessments; 2) their time-saving capabilities; 3) the trustworthiness of test results; and 4) economic aspects (savings on printing and paper costs).
Other scholars and educators have, however, attributed the improved results to academic dishonesty and the lack of academic rigour that online assessments are perceived to possess.

4.3 Objective 3: To establish barriers to the acceptance of online assessments

Most students reported ease in completing online assessments despite infrastructural and technical difficulties. This result is unexpected, as the sample group had never engaged in online assessments prior to the COVID-19 pandemic. In addition, the movement from face-to-face to online assessments was a drastic shift that left many HEIs unprepared, with previously disadvantaged HEIs incurring an added triple burden due to a lack of resources, inferior infrastructure, and technologically untrained educators (Ndebele & Mbodila, 2022). Other studies seeking to determine technology acceptance among students enrolled at historically disadvantaged universities have identified persistent challenges, with poor connectivity, limited access to technology and security, and an unwillingness to engage with unfamiliar technology and language being the major barriers. In contrast, the current study found that most students (66%) reported no technical problems while undertaking online assessments, even though many students (63%) reported Wi-Fi challenges. This inconsistency in reporting may be attributed to students regarding Wi-Fi challenges as a system issue rather than a technical issue. Furthermore, the latter results align with Nyahodza and Higgs' (2017) observation of an existing digital divide between historically disadvantaged universities and well-resourced universities, which was heightened by the shift to online learning and assessments during the COVID-19 pandemic.

5. Limitations of the study

This study had some potential limitations. First, the sample group was limited to Community Extension students, as this was the only group of students within the Faculty of Natural Sciences that participated in online assessments during the recent COVID-19 pandemic. Furthermore, this sample had no prior engagement with online assessments before the pandemic. This prevented the study from taking a holistic approach by involving all students at the outset. Second, the study excluded senior students, i.e. Advanced Diploma students, who were not allowed to participate in online assessments during the pandemic. This also hindered the study's ability to establish the acceptance of online assessments across the different levels of study within the Department of Community Extension. Therefore, future studies need to be conducted involving all student levels within the faculty or across faculties within the institution.

6. Conclusion

The development and implementation of online assessments are challenges for the whole educational community. This study focused on university students from historically disadvantaged institutions. The study concludes that, despite the challenges emanating from the nature of the institution, i.e. historically disadvantaged, there is a general acceptance of online assessments among students. While students reported various challenges, such as technological difficulties, resource-related challenges, academic dishonesty, and anxiety during the assessments, these did not deter students' acceptance of online assessment.
However, the findings of this study highlight the need for institutions to develop and implement strategies to curb academic dishonesty, improve infrastructure, acquire the necessary resources, and increase training before students engage in online assessments, in order to increase acceptability. Using the TAM helped the study identify critical factors influencing the acceptance of online assessment by students at a historically disadvantaged institution. These factors include infrastructure, a lack of adequate support, and the COVID-19 pandemic, which served as external variables influencing both the perceived usefulness (PU) and the perceived ease of use (PEOU) of online assessments. Through the identification of the PU and PEOU, we were able to conclude that most students preferred online assessment and were willing to complete online tests in the future. These findings indicated a behavioural intention to use online assessments; hence the conclusion that students from the Community Extension department accept online assessments as a useful modality. Therefore, the study concluded that being at a historically disadvantaged institution has no fundamental bearing on students' perceptions of online assessments. It also suggested practical solutions to improve online assessment practices. We concluded that future studies should investigate possible strategies to deal with the identified challenges so that student acceptance of online assessments can be further improved.

References

Aguilera-Hermida, A. 2020. College students' use and acceptance of emergency online learning due to COVID-19. International Journal of Educational Research Open, 1. https://doi.org/10.1016/j.ijedro.2020.100011

Alsalhi, N.R., Qusef, A.D., Al-Qatawneh, S.S. & Eltahir, M.E. 2022. Students' perspective on online assessment during the COVID-19 pandemic in higher education institutions. Information Sciences Letters, 11(1): 37-46. https://doi.org/10.1007/s11135-022-01470-1

Boitshwarelo, B., Reedy, A.K. & Billany, T. 2017. Envisioning the use of online tests in assessing twenty-first-century learning: A literature review. Research and Practice in Technology Enhanced Learning, 12(16): 2-16. https://doi.org/10.1186/s41039-017-0055-7

Butler-Henderson, K. & Crawford, J. 2020. A systematic review of online examinations: A pedagogical innovation for scalable authentication and integrity. Computers and Education, 159: 1-12. https://doi.org/10.1016/j.compedu.2020.104024

Caleb, O. & Elaine, C. 2022. Development and acceptance of online assessment in higher education: Recommendations for further research. Journal of Applied Learning & Teaching, 5(1): 10-26. https://doi.org/10.37074/jalt.2022.5.1.6

Ćwil, M. 2019. Teacher's attitudes towards electronic examination: A qualitative perspective. International Journal of Learning and Teaching, 5(1): 77-82. https://doi.org/10.18178/ijlt.5.1.77-82
Davis, F.D. 1986. A technology acceptance model for empirically testing new end-user information systems: Theory and results. Doctoral dissertation, Massachusetts Institute of Technology.

Davis, F.D., Bagozzi, R.P. & Warshaw, P.R. 1989. User acceptance of computer technology: A comparison of two theoretical models. Management Science, 35(8): 982-1003. https://doi.org/10.1287/mnsc.35.8.982

Donnelly, C. 2014. The use of case-based multiple-choice questions for assessing large group teaching: Implications on student's learning. Irish Journal of Academic Practice, 3(1): 1-15. https://doi.org/10.21427/D7CX32

Ellis, R., Oeppen, R.S. & Brennan, P.A. 2021. Virtual postgraduate exams and assessments: The challenges of online delivery and optimising performance. British Journal of Oral and Maxillofacial Surgery, 59(2): 233-237. https://doi.org/10.1016/j.bjoms.2020.12.011

Govender, N., Reddy, P. & Bhagwan, R. 2021. Academic and psychosocial challenges of health sciences students during the COVID-19 pandemic: A University of Technology perspective. Perspectives in Education, 39(3): 44-61. https://doi.org/10.18820/2519593X/pie.v39.i3.5

Iskandar, N., Ganesan, N. & Maulana, N.S.A. 2021. Students' perception towards the usage of online assessment in Universiti Putra Malaysia amidst the Covid-19 pandemic. Journal of Research in Humanities and Social Science, 9: 9-16.

Kala, D. & Chaubey, D.S. 2022. Examination of relationships among technology acceptance, student engagement and perceived learning on tourism-related MOOCs. Journal of Teaching in Travel and Tourism, 1-18. https://doi.org/10.1080/15313220.2022.2038342

Khan, S. & Khan, R.A. 2019. Online assessments: Exploring perspectives of university students. Education and Information Technologies, 24(1): 661-677. https://doi.org/10.1007/s10639-018-9797-0

Lin, L., Foung, D. & Chen, J. 2022. Assuring online assessment quality: The case of unproctored online assessment. Quality Assurance in Education, ahead of print. https://doi.org/10.1108/QAE-02-2022-0048

Meccawy, Z., Meccawy, M. & Alsobhi, A. 2021. Assessment in 'survival mode': Student and faculty perceptions of online assessment practices in HE during the COVID-19 pandemic. International Journal for Educational Integrity, 17(16): 1-24. https://doi.org/10.1007/s40979-021-00083-9

Ndebele, C. & Mbodila, M. 2022. Examining technology acceptance in learning and teaching at a historically disadvantaged university in South Africa through the technology acceptance model. Education Sciences, 12(54): 1-18. https://doi.org/10.3390/educsci12010054

Nyahodza, L. & Higgs, R. 2017. Towards bridging the digital divide in post-apartheid South Africa: A case of a historically disadvantaged university in Cape Town. South African Journal of Libraries and Information Science, 83(1): 39-48. https://doi.org/10.7553/83-1-1645

Peled, Y., Eshet, Y., Barczyk, C. & Grinautski, K. 2019. Predictors of academic dishonesty among undergraduate students in online and face-to-face courses. Computers and Education, 131: 49-59. https://doi.org/10.1016/j.compedu.2018.05.012

Petrisor, M., Marius, M., Dan, S., Emilian, C. & Dana, G. 2016. Medical students' acceptance of online assessment systems.
Acta Medica Marisiensis, 62: 30-32. https://doi.org/10.1515/amma-2015-0110

Snekalatha, S., Marzuk, S.M., Meshram, S.A., Maheswari, K.U., Sugapriya, G. & Sivasharan, K. 2021. Medical students' perception of the reliability, usefulness and feasibility of unproctored online formative assessment tests. Advances in Physiology Education, 45(1): 84-88. https://doi.org/10.1152/advan.00178.2020

UNESCO. 2020, May 28. UNESCO's support: Educational response to COVID-19. Available at https://en.unesco.org/covid19/educationresponse/support [Accessed 10 June 2022].

Valdez, M.T.C.C. & Maderal, L.D. 2021. An analysis of students' perception of online assessments and its relation to motivation towards mathematics learning. The Electronic Journal of E-Learning, 19(5): 416-431. https://doi.org/10.34190/ejel.19.5.2481

Vazquez, J.J., Chiang, E.P. & Sarmiento-Barbieri, I. 2021. Can we stay one step ahead of cheaters? A field experiment in proctoring online open book exams. Journal of Behavioral and Experimental Economics, 90: 101653. https://doi.org/10.1016/j.socec.2020.101653

Woldeab, D. & Brothen, T. 2019. 21st-century assessment: Online proctoring, test anxiety and student performance. International Journal of E-Learning and Distance Education, 34(1): 1-10. Available at https://www.ijede.ca/index.php/jde/article/view/1106 [Accessed 12 August 2022].

Yadav, A., Sankhla, M. & Yadav, K. 2022. A snapshot of medical students' perceptions about online assessment and its comparison with online teaching experience during the COVID-19 pandemic. IAR Journal of Medical Sciences, 3(1): 87-91. https://doi.org/10.47310/iarjms.2022.v03i01.019