Australasian Journal of Educational Technology, 2021, 37(2).

Evaluating student engagement and deep learning in interactive online psychology learning activities

Nicole Sugden, Robyn Brunton
Charles Sturt University, Australia

Jasmine B. MacDonald
RMIT University, Australian College of Applied Psychology, and Charles Sturt University, Australia

Michelle Yeo, Ben Hicks
Charles Sturt University, Australia

There is growing demand for online learning activities that offer flexibility for students to study anywhere, anytime, as online students fit study around work and family commitments. We designed a series of online activities and evaluated how, where and with what devices students used the activities, as well as their levels of engagement and deep learning with the activities. A mixed-methods design was used to explore students’ interactions with the online activities. This method integrated learning analytics data with responses from 63 survey respondents, nine interviews and 16 focus group participants. We found that students used a combination of mobile devices to access the online learning activities across a variety of locations during opportunistic study sessions, in order to fit study into their daily routines. The online activities were perceived positively, facilitating affective, cognitive and behavioural engagement as well as stimulating deep learning. Activities that were authentic, promoted problem-solving, applied theory to real-life scenarios and increased students’ feelings of being supported were perceived as most beneficial to learning. These findings have implications for the future design of online activities, which need to accommodate students’ need for flexibility as their study habits become more mobile.

Implications for practice or policy:
• The higher education sector needs to recognise students’ increasing need for flexible online learning activities that accommodate study around work and family commitments.
• Academics need to design online activities that are compatible with multiple devices and are offered in different formats to allow students to study in opportunistic sessions across a variety of settings.
• Lecturers need to contextualise online activities within subject content and create authentic tasks with real-life applications that make students feel supported.

Keywords: online learning, engagement, deep learning, interactive activities, psychology teaching, mixed methods

Flexibility in online learning

Online and blended learning are becoming increasingly popular. In 2018, 15.5% of Australian university students were enrolled online and 13.1% in multimodal study (a combination of on-campus and online), with increases of 3.2% and 6.6% respectively from 2017 to 2018 (Australian Government, 2019). This growing demand for online learning, along with advancements in learning technologies, has resulted in the development of online learning activities, including online tutorials, discussions, quizzes and case-based scenarios or games, that offer flexibility in how, when and where students study (Dumford & Miller, 2018). Increasingly, students are using a combination of devices such as laptops, smartphones and tablets to access these online activities anywhere, anytime (Crompton & Burke, 2018; Kaliisa et al., 2019).

As noted by Sheail (2018), this flexibility in accessing online learning activities makes online study particularly attractive for mature-age and part-time students, as they are able to study around family and work commitments. However, fitting study in around these commitments may result in less quality time engaging with learning resources. Indeed, mature-age and part-time students have some of the highest rates of attrition in Australia, with these students citing work and family commitments as the primary reason for discontinuing study (Cherastidham et al., 2018).
To date, engagement with online study has largely been overlooked, with attention focused on evaluating engagement in on-campus students (O’Shea et al., 2015). Therefore, this research investigated how students engage with online learning activities in the context of flexible online learning environments.

Student engagement and depth of learning

Engagement is defined by Ben-Eliyahu et al. (2018, p. 87) as “the intensity of productive involvement with an activity”. Engagement is argued to be multidimensional, consisting of three distinct but related components. The affective component refers to emotional investment in the task, demonstrated by enjoyment of or interest in the task as well as feelings of belonging to a learning community. Cognitive engagement involves the attentional resources invested in comprehending and mastering the task. This cognitive component is highly correlated with the behavioural component, which relates to participation or involvement in a task (Appleton et al., 2006; Ben-Eliyahu et al., 2018; Fredricks et al., 2004).

Students’ level of engagement is suggested to determine the level of processing that students enlist across different learning environments (Bevan et al., 2014). According to levels of processing theory (Craik & Lockhart, 1972), deep learning involves elaborate cognitive processing of stimuli that facilitates understanding and makes information meaningful to the learner, producing longer-lasting and stronger memory traces. This contrasts with shallow processing, which involves only superficial analysis and maintenance rehearsal of information, leaving information prone to decay over time. In educational settings, students who are psychologically or emotionally invested in long-term mastery of content are more likely to adopt effortful strategies that result in deeper levels of processing.
Alternatively, students who learn purely to meet task requirements or to receive good grades are more likely to adopt shallow learning strategies such as rote learning (Fredricks et al., 2004). As student engagement and deep learning have been found to enhance students’ learning experiences, Bevan et al. (2014) recommended that activities that promote engagement and deep learning be incorporated into courses.

Facilitating engagement and deep learning with online learning activities

Numerous online learning activities have been developed to facilitate engagement and deep learning. These include web conferencing (i.e., online classes conducted via applications such as Zoom or Adobe Connect), case-based scenarios, interactive games, mind maps and virtual demonstrations. In a review of these activities, Schindler et al. (2017) identified several common themes, including the use of technology and the promotion of collaboration, and that activities were based on a constructivist pedagogy. In a constructivist paradigm, understanding is derived when newly acquired knowledge is reflected upon and assimilated with existing knowledge (Huang et al., 2010; Schindler et al., 2017). This construction of knowledge occurs through meaningful and authentic learning experiences where students can revise content, apply theory to real-world situations or develop problem-solving skills (Chittaro & Ranon, 2007).

Web conferencing provides online face-to-face interaction, with students and teachers able to chat, share screen content and work collaboratively on problem-solving tasks (Islam, 2019). When learning activities are contextualised in web tutorials, for example, by embedding YouTube clips demonstrating real-world examples, students engage in deep learning and create meaningful links to the content (Smyth, 2011).
Web conferencing has been found to increase students’ affective (i.e., interest and sense of belonging to a learning community) and cognitive (i.e., critical reflection) engagement (Devlin & McKay, 2016; Schindler et al., 2017). The findings on behavioural engagement (i.e., participation and attendance) in these synchronous classes are mixed (Schindler et al., 2017). However, while students may not be attending live classes in a traditional sense, online students may still be engaging with the learning activities by consuming downloadable class recordings that provide them with greater flexibility to study at a time and place that suits them (Devlin & McKay, 2016).

Interactive games and case-based scenarios, where students apply learning to real-world scenarios within virtual environments, are increasingly being used in online learning (Addy et al., 2018). For example, Taillandier and Adam (2018) taught risk management to engineering students through a game in which students assumed the role of a local councillor and managed coastal flood risks. These virtual environments are designed to increase affective and cognitive engagement by creating an immersive, multimodal, sensory experience that triggers students’ emotions (Chittaro & Ranon, 2007; Huang et al., 2010). The gamification aspect of these activities, which often includes the use of rewards or competitive leaderboards, has been found to increase students’ enjoyment and cognitive involvement (Schindler et al., 2017; Tsai et al., 2015). These activities also assist with deep learning, as feedback provided during the task helps students to gauge their current level of understanding and highlights where further study is required (Tsai et al., 2015).

Mind maps are visual representations of concepts in which diagrams, arrows and colour are used to depict relationships between associated concepts.
Interactive mind maps build on these by adding videos, web links and other interactive materials into the maps. Mind mapping facilitates deep learning and engagement by inspiring creativity and helping students create strong memory associations between related concepts (Davies, 2011).

A final type of interactive learning activity is the virtual demonstration, which can be a standalone interactive learning tool or embedded into web conferencing platforms. Virtual demonstrations are particularly common in anatomy-based subjects, where it is impossible for online students to complete hands-on laboratory tasks due to location, cost or lack of access to specimens (Lewis, 2014). Although these activities have not necessarily led to higher student grades, students have indicated that they do assist with deep learning and engagement with content (Fleagle et al., 2018; Wilson et al., 2018).

The degree of engagement and deep learning experienced by students depends upon multiple contextual factors. Activities that have high levels of teacher presence and make students feel that they are part of a learning community are more likely to promote engagement (Devlin & McKay, 2016). Poor-quality learning activities that lack realism will reduce students’ ability to become immersed in learning scenarios or to contextualise their learning meaningfully. Moreover, activities that are prone to technical issues or are difficult to use will be less likely to be used or engaged with by students (Chittaro & Ranon, 2007; Devlin & McKay, 2016). Hence, when designing and evaluating online learning activities, there needs to be broad consideration of the contextual factors that influence engagement and deep learning processes.

The current study

As enrolments in online courses increase, there is growing demand for online learning activities that offer flexible options for students to study anywhere, anytime.
Students want online activities that they can complete at their own pace, with options for synchronous cohort-based participation as well as asynchronous or downloadable formats for independent study (Devlin & McKay, 2016; Schroeder et al., 2016; Shearer et al., 2020; Stone & O'Shea, 2019). Students also now require activities to be compatible with mobile technologies to enable remote and mobile access (Crompton & Burke, 2018; Kaliisa et al., 2019). While this flexibility in the design of online learning activities means that students can more easily access activities on the go, it is unclear whether fitting these activities in around work and family commitments affects the quality of engagement and depth of learning experienced when using them (Sheail, 2018). Therefore, we sought to investigate students’ levels of processing and engagement with online learning activities that offer such flexibility.

To do this, we designed a series of online learning activities for two psychology subjects: Biopsychology and Social Psychology. Using a constructivist approach, we developed multiple web conferencing tutorials, virtual demonstrations, mind maps, interactive games and case-based scenarios aimed at promoting deep learning and engagement. We piloted the activities in the two subjects, then conducted a mixed-methods evaluation using a combination of learning analytics, survey, qualitative interview and focus group (FG) methods to provide a comprehensive assessment of students’ use of and engagement with the learning activities. As deep learning approaches and affective and cognitive engagement are not easily observable, particularly in the online environment, assessing students’ perspectives through survey or qualitative methods was deemed most appropriate (Appleton et al., 2006; Henrie et al., 2015).
These qualitative methods, as well as learning analytics observations (i.e., time on task and grades), were used to measure behavioural engagement (Fredricks et al., 2004; Schindler et al., 2017). Given that engagement involves multiple, overlapping dimensions (Ben-Eliyahu et al., 2018; Kahu, 2013), and that formal measures of engagement lack predictive validity (Kahu, 2013), we used items measuring individual indicators rather than summative scales to evaluate affective, cognitive and behavioural engagement.

Our study posed the following research questions:
• How, where and with what devices are students accessing and using online learning activities within flexible learning environments?
• How does the flexibility of these online learning activities influence deep learning and affective, behavioural and cognitive engagement?

Method

Participants

Students enrolled in the Biopsychology and Social Psychology subjects in 2018 were invited by email to participate in an anonymous online survey. Upon survey completion, students had the opportunity to further participate in an online interview or FG. From a pool of 270 students, 63 completed the survey. Of these participants, six participated in a follow-up online interview, while 16 participated in FGs (nFG1 = 9; nFG2 = 7). All participants entered an incentive draw for a $50 gift card. Table 1 summarises demographic and enrolment data for the total student pool and for the survey, interview and FG samples. As can be seen in Table 1, participants were predominantly mature-aged, female, married, had completed a previous degree, were studying part-time and were employed, working on average 33 hours a week.
Table 1
Demographic characteristics of total student pool, survey participants and qualitative participants

                              Total pool(a)   Survey(b)     Interview(b)   FGs(b)
Gender (%)
  Male                        18.3            17.5          –              12.5
  Female                      81.7            82.5          100            87.5
Age
  Mean (SD)                   36.4 (10.6)     40 (9.5)      37.7 (8.71)    44.13 (8.29)
  Range                       20–69           21–69         26–50          29–59
Completed education N (%)
  High school                 –               8 (12.7)      2 (33.3)       –
  Diploma or trade            –               13 (20.6)     –              2 (12.5)
  Tertiary degree             –               42 (66.6)     4 (66.7)       11 (68.8)
Marital status N (%)
  Married or de-facto         –               46 (73.0)     4 (66.7)       11 (68.8)
  Single or divorced          –               17 (27.0)     2 (33.3)       3 (18.8)
Subject enrolment N (%)
  Biopsychology               137             48 (76.2)     4 (66.7)       7 (43.8)
  Social Psychology           133             32 (50.8)     1 (16.7)       2 (12.5)
  Both subjects               51              19 (23.7)     1 (16.7)       7 (43.8)
Study mode (%)
  Full-time                   39.8            7.9           –              –
  Part-time                   60.2            92.1          100            87.5
Employment N (%)
  Not employed                –               10 (15.9)     –              1 (6.3)
  Employed                    –               53 (84.1)     6 (100)        14 (87.5)
  Hours worked per week       –               33.2          –              –
Note. (a) learning analytics data, (b) self-reported data. Dashes indicate values not reported.

Materials

Online learning activities

We developed a series of online learning activities (described in detail in the Appendix) and delivered them via the university’s learning management system, contextualised within weekly topic materials.

Activities developed for the Biopsychology subject included:
• Interactive games: choose-your-own-adventure style scenario-based games.
• Gamification: code words embedded at the end of all interactive games were used to unlock a virtual brain dissection demonstration activity.
• Experiments with online meetings (EOMs): students participated in virtual demonstrations, experiments and discussions in online meetings.
Activities developed for the Social Psychology subject included:
• iMindmaps: interactive mind maps summarising weekly topics
• Scenario-based learning: real-life examples with integrated quizzes
• Interactive activities with forum: real-life scenarios with forum discussion activities
• Video tutorials: online tutorials with embedded discussions and quizzes.

Online survey

We customised a survey to measure how, when and where students were accessing the online learning activities, as well as their perceived levels of engagement and deep learning with the activities. The survey consisted of demographic questions, followed by items measuring study locations and the devices used to access the activities. Next, three items measured internet quality, ability to access the activities (given technical issues), and knowledge of how to use and where to find information in the learning management system, each using a scale of 1 (I experience a lot of difficulty) to 5 (I never experience difficulty).

Students’ perceived levels of engagement and deep learning were then measured. For each activity, an image and overview of the activity were presented, and students were asked to indicate whether or not they had completed the activity. If they had not completed the activity, they were asked to explain why. For each activity completed, participants rated the activity on eight indicators: five of engagement (enjoyment, interest, immersion, willingness to repeat the activity in another subject, and ease of use) and three of deep learning (memorisation, understanding and applicability to real life). Items were rated from 1 (strongly disagree) to 5 (strongly agree), with higher scores indicating greater perceived engagement and deep learning.

Qualitative interview and FG schedule

We developed a 13-point topic schedule from insights from the literature review and online survey responses.
The schedule assessed participants’ experiences using the resources, including their level of engagement, deep learning, motivation, ability to apply concepts, most and least useful resources, use of technology to access the resources and study routines, as well as comparisons between these resources and resources used in other subjects.

Procedure

We developed the online activities and piloted them in the online offerings of the Biopsychology and Social Psychology subjects at a regional Australian university during the 14-week study session running from July to October 2018. The evaluation of these activities commenced at the conclusion of this session, after final grade release, to avoid any potential bias arising from existing or future relationships between the subject coordinators (authors N. S. and R. B.) and students. These authors were also not provided with any identifiable data from any aspect of the study.

After receiving institutional ethics approval, we invited students to participate in the study by email. A link was provided in the email for students to access the online survey on the Qualtrics platform. The qualitative interviews, ranging from 24 to 65 minutes in duration (M = 55.17), were conducted either online or in person in a meeting room at another tertiary education campus. Two FGs (80 and 75 minutes in duration) were conducted online. Adobe Connect online meeting software was used for online participation, as it was familiar to participants and provided flexible options for video and chat communication. All interviews and FGs were facilitated by author J. B. MacD., who had previously taught the subjects but had no current or future teaching responsibilities within the institution. This provided participants with greater perceived freedom to critique the online learning activities. In interviews and FGs, the facilitator demonstrated the learning activities as a prompt for discussion while working through the topic schedule. Afterwards, participants were free to review and withdraw comments collected during data collection.

Data analysis

Quantitative survey data were analysed using SPSS (version 26). Audio recordings of interviews and FGs were transcribed verbatim by authors J. B. MacD. and M. Y., de-identified (pseudonyms are reported here) and quality checked. The data were analysed using thematic analysis (Braun & Clarke, 2006), with NVivo 12 used for electronic storage and organisation. Codes were visually mapped and higher-level themes were constructed (Patton, 1990). Learning analytics data (i.e., student demographics, learning management system accesses and interactions, and grades) were accessed by author B. H. The k-medoids algorithm (Arora & Varshney, 2016) was used to cluster students into groups based on similarities in grade, number of accesses, interactions (clicks), forum views and time spent in the learning management system, as a measure of behavioural engagement. Due to ethical constraints, individual grades could not be linked to any quantitative or qualitative data sets.

Results

Findings from the online survey, qualitative interviews and FGs, and learning analytics data were integrated and are presented below. First, we report on how, where and with what devices students accessed the online learning activities. Then, we examine how the flexibility of these online learning activities influenced students’ deep learning and engagement.

Access and use of online learning activities

Most interview and FG participants reported using a dedicated study space. However, further prompting revealed that students were actually much more mobile in their study than they realised.
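The k-medoids clustering described in the Data analysis section can be sketched as follows. This is a minimal illustration only, not the authors' actual pipeline: the per-student feature values are hypothetical, the implementation is a simple alternating variant rather than the Arora and Varshney (2016) formulation they cite, and a real analysis would run over the full student dataset.

```python
import numpy as np

def k_medoids(X, k, n_iter=100, seed=0):
    """Simple alternating k-medoids: assign each point to its nearest
    medoid, then move each medoid to the cluster member with the
    smallest summed distance to the other members."""
    rng = np.random.default_rng(seed)
    n = len(X)
    # Pairwise Euclidean distances between all students
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    medoids = rng.choice(n, size=k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)   # nearest medoid
        new_medoids = medoids.copy()
        for j in range(k):
            members = np.where(labels == j)[0]
            if members.size == 0:
                continue
            within = D[np.ix_(members, members)].sum(axis=1)
            new_medoids[j] = members[np.argmin(within)]
        if np.array_equal(new_medoids, medoids):    # converged
            break
        medoids = new_medoids
    labels = np.argmin(D[:, medoids], axis=1)
    return medoids, labels

# Hypothetical per-student behavioural features (illustrative values):
# [grade, LMS accesses, clicks, forum views, minutes in the LMS]
X = np.array([
    [85, 120, 900, 40, 600],
    [82, 110, 850, 35, 580],
    [55,  30, 200,  5, 120],
    [50,  25, 180,  4, 100],
], dtype=float)
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardise each feature
medoids, labels = k_medoids(X, k=2)
```

One reason to prefer k-medoids over k-means for this kind of behavioural-engagement analysis is that each cluster centre is an actual student record, so a cluster can be summarised by a real exemplar rather than an averaged, possibly non-existent profile.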
For example, one interview participant said that she only studied in her home office, but later commented that she preferred studying on the couch or in the bedroom and sometimes listened to lectures while driving or going for a walk. Additionally, Figures 1 and 2 show that when survey respondents were asked to list the locations in which they studied at home, at work and in transit, study locations varied considerably. Qualitative responses revealed an even broader range of study locations, with respondents indicating that they studied at the beach, in the park, in shopping centres, in cafés, at friends’ homes, in on-campus or public libraries, in hotels while holidaying, while working out, or while waiting to collect children.

Figure 1. Percentage of students who studied at home only, or in a combination of home, work and in transit locations

Figure 2. Study locations within the home and in transit reported by participants

As students studied in a variety of locations, it was unsurprising that they used multiple devices to access the online learning activities. Figure 3 summarises the devices that students used and reveals that it was common for students to use a combination of devices. The survey findings were corroborated by the qualitative responses, as participants reported selecting devices based on the task and study environment. For example, Caitlin (interview) explained that she preferred to work on her desktop computer for full functionality, but sometimes used more portable devices for on-the-spot study:

If I’m at home, I use the computer … if I’m struggling with something or I have a lot to read, I take [my tablet] on the train or to lunch, but I cannot take notes.
It was also common for students to use multiple devices within the same study session:

Sometimes I use the laptop and the tablet … I keep the guide on the tablet and I write [on the computer]. (Caitlin, interview)

Figure 3. Devices used by students to access learning activities

An important consideration for students when using online learning activities was the ease with which they could access and use the activities. Table 2 summarises survey participants’ ratings of internet quality and ease of accessing and using the learning management system. Overall, students reported having a good internet connection and few technical difficulties. When asked to provide reasons for not completing the activities, only three students reported that internet access or technical issues impeded access. However, some participants did express frustration with technical issues when accessing the activities:

I found with my small screen laptop … I couldn't really see the whole picture anyway, so I had to scroll up and down on my screen to get the whole view. (Natalia, FG2)

I tried [the brain dissection] on my iPad, I tried on Chrome and Firefox, but it just … I was clicking the bits, but it wouldn’t register!
Then I went through everything, I clicked every single spot that I could on the brain, and it was like, “No that’s still wrong!” so I was like, “Well I clicked everywhere!” (Violet, interview)

Although students believed they had good knowledge of the learning management system, two students reported that they did not encounter the online learning activities at all, which may indicate that navigating the system was actually an issue for these students.

Table 2
Difficulty experienced accessing the learning management system: N (%)

                             Never      Rarely     Occasionally   Some      A lot
Internet connection          11 (17.5)  31 (49.2)  13 (20.6)      7 (11.1)  1 (1.6)
Technical issues             15 (23.8)  33 (52.4)  13 (20.6)      2 (3.2)   0
Finding and/or using tools   9 (14.3)   36 (57.1)  14 (22.2)      4 (6.3)   0

Online learning activities, engagement and deep learning

Table 3 summarises survey participants’ ratings of perceived engagement and deep learning with the online learning activities. As can be seen, uptake of the activities was high, with each activity used by at least 75% of survey participants. The only exception was the brain dissection activity, which required students to collect multiple code words across the session to unlock it; only half of the survey participants completed it. Students perceived all of the activities as highly engaging and conducive to deep learning. The Biopsychology interactive games had the highest engagement ratings, whilst the brain dissection code word activity received the lowest.
Table 3
Mean (SD) survey ratings of engagement and deep learning with online learning activities

                                          Engagement                                                       Deep learning
                              Enjoy        Interest     Immersion    Repeat       Ease         Memorise     Understand   Apply
Biopsychology
  Interactive games           4.42 (0.92)  4.51 (0.79)  4.35 (0.87)  4.44 (0.92)  4.47 (0.89)  4.43 (0.83)  4.32 (0.90)  4.37 (0.85)
  (n = 40, 83%)
  Gamification                2.78 (2.17)  3.00 (2.29)  3.00 (2.29)  3.11 (2.37)  2.78 (2.17)  2.89 (2.21)  3.00 (2.29)  3.11 (2.37)
  (n = 24, 50%)
  EOMs (n = 42, 88%)          3.78 (1.55)  3.85 (1.51)  3.56 (1.48)  3.70 (1.54)  3.56 (1.51)  3.59 (1.51)  3.59 (1.60)  3.59 (1.54)
Social Psychology
  iMindmaps (n = 26, 82%)     3.89 (2.21)  3.78 (2.17)  3.78 (2.17)  3.89 (2.21)  3.89 (2.21)  3.89 (2.21)  3.89 (2.21)  –
  Scenario-based learning     3.72 (1.56)  3.78 (1.48)  3.64 (1.51)  3.75 (1.59)  3.69 (1.50)  3.72 (1.44)  3.67 (1.55)  3.69 (1.55)
  (n = 24, 75%)
  Interactive with forum      3.40 (1.56)  3.58 (1.68)  3.51 (1.56)  3.53 (1.57)  3.64 (1.58)  3.44 (1.62)  3.53 (1.54)  3.64 (1.60)
  (n = 26, 82%)
  Video tutorials             3.19 (1.76)  3.22 (1.72)  3.19 (1.76)  3.19 (1.76)  3.11 (1.68)  3.19 (1.76)  3.19 (1.76)  3.19 (1.76)
  (n = 24, 75%)
Note. EOMs = experiments with online meetings. Enjoy = I enjoyed this activity. Interest = I found this activity interesting. Immersion = I felt immersed by the experience provided in the activity. Repeat = I would like to see this type of activity used again in subjects that I am yet to complete. Ease = I found this activity easy to use. Memorise = This activity helped me memorise/revise content. Understand = This activity helped me form a deeper understanding of content on this topic. Apply = This activity helped me understand how I could apply content from this topic to real-life situations. Higher scores indicate greater engagement and deep learning (range 1–5).

These positive perceptions of the online learning activities were mirrored in the qualitative data.
In terms of affective engagement, students reported the resources to be a novel and enjoyable part of their study experience:

It was a bit fun. You know when you got to go on a holiday [in the scenario] or choose different areas of interest I liked it, it was a bit like, you know, when you read a novel you get to choose your own pathway. (Helen, FG1)

Many students said that having a variety of resources gave them increased choice in how they approached their study, which allowed them to engage with content at numerous levels of processing:

Probably comes back to learning style … having the choice is important because it reflects how the individual learns … it is these sorts of things, games or video clips, or you know, animation type things, are certainly a way to support people’s understanding. (Rita, interview)

By using the different methods, we were also able to learn in different ways which helped sink into the memory, instead of just sort of reading it from just the textbook. (Megan, FG1)

When exploring the depth of learning, many participants reported that activities that incorporated practical applications of content were the ones that they were most likely to recall in the exam, and even after the subject was completed:

[The activities] make you think about different things, it also improves the depth of the subject, so you are not just memorizing the concept but applying them. (Caitlin, interview)

Rather than just seeing everything purely in text, it made you start putting some of those things you're learning into real day situations. I found myself after playing some of those games, even when you're walking down the street or I'm looking at things, I started to think about visually what was happening biologically to me when I was seeing or hearing something.
(Zara, FG2) What you were learning you could try and apply it out in the real world, if you had the ability to … When I say ability, it’s not that I dissected a sheep brain … you obviously couldn’t go out and do that, it was very interesting and gave you a broader knowledge that you could talk about to people. (Gabrielle, interview) The addition of those online type activities, I feel like I could say that this time around my understanding is probably deeper, not just for the purpose of getting through an assignment or getting through an exam. (Rita, interview) Some participants noted that while they had adopted a shallow approach to revision-style learning activities, the resources provided instantaneous feedback on their understanding that forced them to engage in deeper levels of learning: There'd be things that I thought I understood and then I think … “Oh wait, I didn't understand that at all, I need to go back and actually have a look at it”. So that was good and because it wasn't particularly time consuming it would kind of point out gaping holes in my knowledge, which was heartbreaking but also very good at the same time. (Tegan, interview) Whether participants utilised shallow or deep learning often depended on time of day, location, and students’ energy and concentration levels. Participants reported that they were juggling study around employment and family responsibilities. As Violet explained in the interview: I have a 2-year-old girl … I never get any time, which makes it hard to study. So, when I do study, I don’t want to be trawling through hours of textbooks. (Violet, interview) Australasian Journal of Educational Technology, 2021, 37(2). 
Therefore, to meet the demands of study, work, and family, students typically scheduled long study blocks for demanding study activities (i.e., lectures, readings, experiments with online meetings and video tutorials), whereas they used more opportunistic, shallow learning sessions (e.g., during lunch breaks, commuting, or late at night) to complete online learning activities that required fewer attentional resources.

    They’re things that you can be just late at night, you know, you’ve got your iPad or something on your lap and you think I'll just do this for 10 minutes. (Zara, FG2)

    I think for me being a bit time poor, I don’t usually do the tutorial activities, but to play a game I would do that like it was not a lot of writing and you could just play it. (Nira, FG1)

Not only did students appreciate that the online learning activities provided them with flexible opportunities for shallow and deep learning, they also reported feeling more engaged by the resources because of the effort that went into creating them:

    The more different activities we can be given it makes us feel supported by our lecturers. It is a lonely world, being an online student, and if all you've got is a syllabus, a textbook and a pre-recorded meeting you almost think “No one cares, why should I?” (Zara, FG2)

    I loved the games purely because it made me feel special. It made me feel like [the teacher] put in so much effort and they cared about my learning … Because you feel that personal connection, you don’t want to let them down. (Violet, interview)

Thus, students’ increased engagement and deep learning through the online learning activities were not just a product of their content. These activities facilitated a sense of teacher presence that motivated students to invest more time and effort into completing the activities and learning in the subject. Despite the activities being broadly well received, and students fitting in opportunistic study sessions, not all resources were utilised. For some participants, the online learning activities were not prioritised because they had completed a previous degree before online resources were used in teaching and had achieved satisfactory grades without them. Interestingly, there were instances in which these students were surprised by how the activities did indeed enhance their learning:

    [As] a student early on back in the early 2000s it was lecture, then tutorial … But I did warm up to the idea of the online games, and that was something that I thought, “Ah, not so important”, initially. But! But! They were probably the most crucial in terms of things that I remember in the learning. (Rita, interview)

In the online survey, the primary reason for not engaging with the activities was a lack of time. This was particularly true for the Biopsychology experiments with online meetings, Social Psychology video tutorials and interactive activities with forum discussions, which required considerable time commitments to complete. Similarly, for the qualitative participants, the most widely discussed reason for not using the resources was a lack of time. These participants described having a relatively consistent approach where they chipped away at lectures, readings, and online learning activities on a weekly basis. However, as assessment tasks approached, participants said that they needed uninterrupted blocks to focus on the assessment task only, which meant that they did not utilise the online learning activities during those times and consequently fell behind on the activities:

    I do look at them I just don't look at them week to week just simply because of time … It seems like it’s just a cycle that keeps going. But yeah, those activities generally don't happen. (Tegan, interview)

    I think that it was down to time … if I had to cut something out it would be the tutorial and the games. Because you have to learn the content that’s assessable and [the online learning activities are] just supporting that learning. (Natalia, FG2)

As demonstrated in Natalia’s quote, despite the online learning activities containing content relevant to assessment tasks, there was a perception among students that these activities were a lesser priority than more traditional resources (e.g., lectures and textbooks). Learning analytics access and interaction data confirmed the changing patterns of study that occurred prior to assessment tasks. Approximately halfway through the session, when assessment tasks were due, the number of accesses and time spent interacting with the online learning activities reduced dramatically. Towards the final exam, accesses and time spent interacting began to increase again, but not to the levels seen at the beginning of session. To evaluate students’ behavioural engagement levels, learning analytics activity data was used to create clusters based on students’ study patterns. Students were grouped into five clusters (summarised in Table 3) using the frequency with which they accessed the learning management system overall, as well as the time spent interacting with learning resources within the system. Grades were not used to find the clusters but were included for comparison. As can be seen in Table 3, Cluster 5 accessed the site the most and for the longest time, and this engagement with the content corresponded with the highest grades. Clusters 1 and 2 had the next highest grades, although Cluster 1 had more efficient engagement than Cluster 2, as Cluster 1 accessed the site more frequently but spent less time in doing so.
Clusters 3 and 4 had the fewest accesses and spent the least time on the site; this was reflected in the grades of Clusters 3 and 4, which were lower than those of the other clusters. Hence, these indicators of behavioural engagement with the activities within the learning management system were associated with student success in the subject.

Table 3
Cluster analysis of frequency of access and time spent using the learning management system

Cluster   N    Average grade   Number of site accesses (SD)   Site clicks   Forum views   Median time per session   Total hours (SD)
1         62   75%             69.60 (27.48)                  22.30         186.33        4.45                      2.18 (1.52)
2         26   72%             48.03 (32.68)                  28.80         117.03        21.98                     2.59 (1.97)
3         40   69%             46.02 (18.92)                  15.68         47.45         3.50                      1.14 (1.23)
4         11   67%             28.09 (11.98)                  10.64         13.99         1.08                      8.57 (0.66)
5         23   83%             134.43 (49.21)                 26.87         760.80        9.20                      5.01 (1.83)
Note. Median time per session measured in minutes.

Discussion

How, where and what devices do students use to access online learning activities?

In this study, we investigated how, where and with what devices students were accessing online learning activities. Typical of Australian online students (Australian Government, 2019), our participants were predominantly female, mature-aged, studying part-time and employed 33 hours a week on average. Consistent with previous studies featuring online students, study was something that our participants were fitting in around other work and family commitments (Sheail, 2018; Stone & O'Shea, 2019). Consequently, the participants expressed a desire for flexible online activities that they could complete anywhere, anytime, on any device, to allow them to study around their competing commitments (Devlin & McKay, 2016; Shearer et al., 2020). A key finding in our study was that participants did indeed study anywhere, anytime.
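As an illustrative sketch of the clustering step reported in Table 3, the grouping of students by learning management system access counts and interaction hours could be implemented as follows. Note the assumptions: the article does not name the algorithm used, so k-means is assumed here (consistent with the Arora & Varshney reference in the reference list), the function name is hypothetical, and the data below are synthetic, not the study's data.

```python
import numpy as np

def cluster_students(X, k, iters=100):
    """Group students by [site accesses, total hours] with plain k-means.

    Features are standardised so access counts (tens to hundreds) and
    hours (single digits) contribute comparably to distances.
    """
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    # Deterministic initialisation: points evenly spaced along the first feature
    order = np.argsort(Xs[:, 0])
    centroids = Xs[order[np.linspace(0, len(Xs) - 1, k).astype(int)]]
    for _ in range(iters):
        # Assign each student to the nearest centroid
        dists = np.linalg.norm(Xs[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute centroids as cluster means; keep old centroid if a cluster empties
        new = np.array([Xs[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels

# Synthetic example: 12 students, columns = [site accesses, total hours]
X = np.array([[28, 0.6], [30, 0.7], [26, 0.8],      # low engagement
              [46, 1.1], [48, 1.2], [44, 1.0],      # moderate engagement
              [68, 2.2], [70, 2.1], [72, 2.3],      # frequent, efficient
              [130, 5.0], [134, 5.2], [138, 4.8]],  # highest engagement
             dtype=float)
labels = cluster_students(X, k=4)
```

Cluster membership could then be cross-tabulated against grades, as in Table 3; grades are kept out of the clustering itself, matching the article's note that grades were included only for comparison.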
Previous studies have reported trends of students increasingly using mobile technologies to study (Crompton & Burke, 2018; Sung et al., 2016), and our mixed-methods approach provided a greater understanding of students’ study patterns. Our participants reported studying in numerous informal settings within the home, at work, in transit, and in public places. Yet, these students appeared to lack insight into how much mobile study they engaged in, as declarations of using a dedicated study space were contradicted by follow-up responses where participants disclosed that they studied in multiple locations. Participants’ use of varied study locations was made possible by their high usage of portable devices. Previous studies have found that laptops are students’ preferred device, followed by smartphones and other handheld devices (Crompton & Burke, 2018; Kaliisa et al., 2019). We found similar patterns of device usage among our participants, but we extended these findings by demonstrating that students often used a combination of devices interchangeably depending on their study task and context. Moreover, having online learning activities compatible with multiple devices was essential for them to be able to fit study into their daily routines.

An advantage of these resources was that instantaneous feedback was provided, which meant that students could do the activities at their own pace, without needing to wait for lecturer responses. This flexibility of the online learning activities made it possible for some students to complete activities that they otherwise may not have had time to do. Most participants indicated that they engaged in opportunistic study sessions using the online activities, for example, listening to a recording whilst driving or completing a quick activity while waiting to collect children. Consequently, each of the activities was completed by 75% or more of students.
Nevertheless, participants’ primary reason for not using activities was a lack of time. For example, only 50% of students completed the Biopsychology brain dissection activity. Participants reported that they tended to fall behind towards the end of session and run out of time to collect all of the code words and complete the activity. When our participants were lacking in time or when assessments were due, online activities were considered a lesser priority than other resources. Therefore, we recommend that online activities not be scheduled every week or in conjunction with assessment task due dates, so that students have the opportunity to complete all activities without needing to compromise or prioritise.

How does the flexible nature of these online learning activities influence deep learning and engagement?

The flexibility of the online activities undoubtedly made it easier for students to study anywhere, anytime. Although students were more able to fit online activities into their routine, it is unclear whether this time spent on the activities was quality time (Sheail, 2018). For this reason, we investigated students’ levels of deep learning and engagement with the activities. Encouragingly, participants’ ratings of the activities were very positive. In terms of affective engagement, participants found the activities enjoyable and interesting. Moreover, participants reported feeling affectively engaged and supported by their lecturers as a result of the activities, which in turn may have made them more cognitively and behaviourally invested in completing activities. As recommended by Biggs and Tang (2007) and Hattie (2012), students in our study were given the opportunity to engage with content in multiple ways. This flexibility and varied delivery of content strengthened engagement as students were able to select activities that best suited their learning needs and lifestyles.
When we observed behavioural engagement via accesses and time spent in the learning management system, we found, consistent with previous studies (Appleton et al., 2006; Bevan et al., 2014; Nieuwoudt, 2020; Wong, 2013), that higher levels of engagement generally corresponded with higher grades. However, our cluster analysis also supported Perera and Richardson’s (2010) findings that some of the time students spend online may be wasted time (e.g., clicking on links to find information). In our study, Cluster 2 spent more time accessing the learning management system but received lower grades than Cluster 1, who had more site accesses and forum views in less time. This suggests that although the quantity of interactions is important, it is the quality of these interactions, where students are actively focused on interacting with contextualised activities, that may be more important for predicting learning outcomes (Bernard et al., 2009; Perera & Richardson, 2010). Our activities were not graded but still had high levels of engagement that were associated with higher grades. Grading activities could potentially have detrimental effects, as students focus purely on outcomes and engage in shallow forms of learning that result in only short-term learning (Fredricks et al., 2004). We therefore recommend that online learning activities be designed as low-stakes tasks where students can enjoy the process and become more intrinsically engaged with the content.

Although participants may have intended to use some of the online activities briefly, for shallow learning and revision, they indicated that they used the activities mostly to facilitate deeper learning. The activities improved memorisation and understanding of content by highlighting gaps in knowledge for students to reflect on their learning.
Supporting the literature that activities involving authentic scenarios applying theory are beneficial for deep learning (Chittaro & Ranon, 2007; Schindler et al., 2017), our participants found the activities helpful in constructing their knowledge. Participants gave examples of how they had been able to make connections between ideas and apply them to real-world scenarios as a result of the activities. This highlights the importance of contextualising learning activities among learning resources and providing students with authentic learning experiences.

Even though pedagogically the activities may facilitate deep learning and engagement, if students experience technical difficulties or have difficulty navigating the online environment, they are likely to become disengaged from the activity, diverting their attention to trying to fix the issue (Chittaro & Ranon, 2007; Devlin & McKay, 2016). Despite our participants reporting very few of these issues, there were some students who could not find the activities and others who expressed frustration with technical issues. As students were already time-poor and fitting activities into their daily routine, these issues may have contributed to students not persevering with the activities. Therefore, to minimise the risk of students failing to find activities or being deterred by technical issues, it is imperative that online activities are carefully designed to be as user-friendly as possible (Chittaro & Ranon, 2007).

Limitations and future directions

We used a cluster analysis to evaluate access, time and grade components of behavioural engagement. Due to ethical constraints, we were not able to map individual survey, interview or focus group responses to grades or attrition. We recommend that future studies carry out this mapping to gather more insight into how quantitative and qualitative data are reflected in learning analytics data.
Our learning activities were perceived positively by our participants. The survey had a response rate of 23%, so we cannot rule out the possibility that our sample consisted mostly of students who completed the activities or who perceived them positively. We endeavoured to minimise socially desirable responding by conducting the research after grades had been released, using an anonymous survey, and by having interviews and focus groups facilitated by a researcher independent of teaching the students. The wide variation in survey ratings of engagement and deep learning, and the presence of qualitative participants with negative opinions about the activities, provide some reassurance that our sample was reasonably representative of the entire cohort.

Implications and conclusions

With students increasingly moving to online study, particularly following COVID-19, there will be a greater demand for flexible online activities that cater for students’ need to be able to study anywhere, anytime (Crawford et al., 2020). Our research found that students use multiple devices to study across a range of locations in opportunistic study sessions in order to fit study around work and family commitments. Although the study sessions may be brief, they still account for quality time studying, as participants reported these activities facilitating deep learning and engagement. Together, these findings show that the future design of online activities needs to incorporate flexibility, portability and compatibility with multiple devices. These activities need to accommodate the evolving study habits of students by offering a choice of multiple formats (e.g., synchronous online participation, downloadable recordings and audio podcasts). From a pedagogical perspective, activities need to be designed to promote deep learning and engagement (Bevan et al., 2014).
Our findings demonstrate that this can be done by developing activities that are authentic, realistic, require students to solve problems and reflect on their learning, and are contextualised within study modules to allow students to make meaningful connections between theory and real-life applications. Moreover, for online students, who often feel isolated and secondary to on-campus students (Devlin & McKay, 2016), providing interactive, flexible activities that make them feel supported by their lecturers significantly contributes to students’ engagement and learning experience. In the context of COVID-19, the sense of support that online learning activities provide may become increasingly important for reducing students’ feelings of isolation and for maintaining student well-being.

Acknowledgements

We would like to acknowledge the contribution of Dr Cassandra Colvin for her insights into learning analytics and engagement literature, as well as the following research assistants in de-identifying transcripts and checking the accuracy of transcriptions alongside the audio recordings: Noura M. Hamad, Taryn Humphries, Samiksha Rampersad and Fazlunisa R. Sheik. Additional acknowledgements go to Thu Thuy Tran for collating data and calculating frequencies for participant responses to the open-ended questions of the online survey.

References

Addy, T. M., Dube, D., Croft, C., Nardolilli, J. O., Paynter, O. C., Hutchings, M. L., Honsberger, M. J., & Reeves, P. M. (2018). Integrating a serious game into case-based learning. Simulation & Gaming, 49(4), 378–400. https://doi.org/10.1177/1046878118779416

Appleton, J. J., Christenson, S. L., Kim, D., & Reschly, A. L. (2006). Measuring cognitive and psychological engagement: Validation of the Student Engagement Instrument. Journal of School Psychology, 44(5), 427–445.
https://doi.org/10.1016/j.jsp.2006.04.002

Arora, P., & Varshney, S. (2016). Analysis of k-means and k-medoids algorithm for big data. Procedia Computer Science, 78, 507–512. https://doi.org/10.1016/j.procs.2016.02.095

Australian Government. (2019). Selected higher education statistics – 2018 student data. https://www.education.gov.au/selected-higher-education-statistics-2018-student-data

Ben-Eliyahu, A., Moore, D., Dorph, R., & Schunn, C. D. (2018). Investigating the multidimensionality of engagement: Affective, behavioral, and cognitive engagement across science activities and contexts. Contemporary Educational Psychology, 53, 87–105. https://doi.org/10.1016/j.cedpsych.2018.01.002

Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R. M., Surkes, M. A., & Bethel, E. C. (2009). A meta-analysis of three types of interaction treatments in distance education. Review of Educational Research, 79(3), 1243–1289. https://doi.org/10.3102/0034654309333844

Bevan, S. J., Chan, C. W. L., & Tanner, J. A. (2014). Diverse assessment and active student engagement sustain deep learning: A comparative study of outcomes in two parallel introductory biochemistry courses. Biochemistry and Molecular Biology Education, 42(6), 474–479. https://doi.org/10.1002/bmb.20824

Biggs, J., & Tang, C. (2007). Teaching for quality learning at university (3rd ed.). McGraw-Hill Education & Open University Press.

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa

Cherastidham, I., Norton, A., & Mackey, W. (2018). University attrition: What helps and what hinders university completion? Grattan Institute. https://grattan.edu.au/wp-content/uploads/2018/04/University-attrition-background.pdf

Chittaro, L., & Ranon, R. (2007). Web3D technologies in learning, education and training: Motivations, issues, opportunities. Computers & Education, 49(1), 3–18.
https://doi.org/10.1016/j.compedu.2005.06.002

Craik, F. I., & Lockhart, R. S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 11(6), 671–684. https://doi.org/10.1016/S0022-5371(72)80001-X

Crawford, J., Butler-Henderson, K., Rudolph, J., Malkawi, B., Glowatz, M., Burton, R., Magni, P., & Lam, S. (2020). COVID-19: 20 countries' higher education intra-period digital pedagogy responses. Journal of Applied Learning & Teaching, 3(1), 1–20. https://doi.org/10.37074/jalt.2020.3.1.7

Crompton, H., & Burke, D. (2018). The use of mobile learning in higher education: A systematic review. Computers & Education, 123, 53–64. https://doi.org/10.1016/j.compedu.2018.04.007

Davies, M. (2011). Concept mapping, mind mapping and argument mapping: What are the differences and do they matter? Higher Education, 62(3), 279–301. https://doi.org/10.1007/s10734-010-9387-6

Devlin, M., & McKay, J. (2016). Teaching students using technology: Facilitating success for students from low socioeconomic status backgrounds in Australian universities. Australasian Journal of Educational Technology, 32(1), 92–106. https://doi.org/10.14742/ajet.2053

Dumford, A. D., & Miller, A. L. (2018). Online learning in higher education: Exploring advantages and disadvantages for engagement. Journal of Computing in Higher Education, 30(3), 452–465. https://doi.org/10.1007/s12528-018-9179-z

Fleagle, T. R., Borcherding, N. C., Harris, J., & Hoffmann, D. S. (2018). Application of flipped classroom pedagogy to the human gross anatomy laboratory: Student preferences and learning outcomes. Anatomical Sciences Education, 11(4), 385–396. https://doi.org/10.1002/ase.1755

Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74(1), 59–109. https://doi.org/10.3102/00346543074001059

Hattie, J. (2012). Visible learning for teachers. Routledge.

Henrie, C. R., Halverson, L. R., & Graham, C. R. (2015). Measuring student engagement in technology-mediated learning: A review. Computers & Education, 90, 36–53. https://doi.org/10.1016/j.compedu.2015.09.005

Huang, H.-M., Rauch, U., & Liaw, S.-S. (2010). Investigating learners’ attitudes toward virtual reality learning environments: Based on a constructivist approach. Computers & Education, 55(3), 1171–1182. https://doi.org/10.1016/j.compedu.2010.05.014

Islam, C. (2019). Using Web conferencing tools for preparing reading specialists: The impact of asynchronous and synchronous collaboration on the learning process. International Journal of Language and Linguistics, 6(3), 1–10. https://doi.org/10.30845/ijll.v6n3p1

Kahu, E. R. (2013). Framing student engagement in higher education. Studies in Higher Education, 38(5), 758–773. https://doi.org/10.1080/03075079.2011.598505

Kaliisa, R., Palmer, E., & Miller, J. (2019).
Mobile learning in higher education: A comparative analysis of developed and developing country contexts. British Journal of Educational Technology, 50(2), 546–561. https://doi.org/10.1111/bjet.12583

Lewis, D. I. (2014). The pedagogical benefits and pitfalls of virtual tools for teaching and learning laboratory practices in the biological sciences. The Higher Education Academy. https://www.advance-he.ac.uk/knowledge-hub/pedagogical-benefits-and-pitfalls-virtual-tools-teaching-and-learning-laboratory

Nieuwoudt, J. E. (2020). Investigating synchronous and asynchronous class attendance as predictors of academic success in online education. Australasian Journal of Educational Technology, 36(3), 15–25. https://doi.org/10.14742/ajet.5137

O’Shea, S., Stone, C., & Delahunty, J. (2015). “I ‘feel’ like I am at university even though I am online.” Exploring how students narrate their engagement with higher education institutions in an online learning environment. Distance Education, 36(1), 41–58. https://doi.org/10.1080/01587919.2015.1019970

Patton, M. Q. (1990). Qualitative evaluation and research methods (2nd ed.). Sage Publications, Inc.

Perera, L., & Richardson, P. (2010). Students’ use of online academic resources within a course web site and its relationship with their course performance: An exploratory study. Accounting Education: An International Journal, 19(6), 587–600. https://doi.org/10.1080/09639284.2010.529639

Schindler, L. A., Burkholder, G. J., Morad, O. A., & Marsh, C. (2017). Computer-based technology and student engagement: A critical review of the literature. International Journal of Educational Technology in Higher Education, 14(1), 1–28. https://doi.org/10.1186/s41239-017-0063-0

Schroeder, S., Baker, M., Terras, K., Mahar, P., & Chiasson, K. (2016). Students' desired and experienced levels of connectivity to an asynchronous, online, distance degree program. Online Learning, 20(3), 244–263.
https://olj.onlinelearningconsortium.org/index.php/olj/article/view/691

Sheail, P. (2018). Temporal flexibility in the digital university: Full-time, part-time, flexitime. Distance Education, 39(4), 462–479. https://doi.org/10.1080/01587919.2018.1520039

Shearer, R. L., Aldemir, T., Hitchcock, J., Resig, J., Driver, J., & Kohler, M. (2020). What students want: A vision of a future online learning experience grounded in distance education theory. American Journal of Distance Education, 34(1), 36–52. https://doi.org/10.1080/08923647.2019.1706019

Smyth, R. (2011). Enhancing learner–learner interaction using video communications in higher education: Implications from theorising about a new model. British Journal of Educational Technology, 42(1), 113–127. https://doi.org/10.1111/j.1467-8535.2009.00990.x

Stone, C., & O'Shea, S. (2019). Older, online and first: Recommendations for retention and success. Australasian Journal of Educational Technology, 35(1), 57–69. https://doi.org/10.14742/ajet.3913

Sung, Y.-T., Chang, K.-E., & Liu, T.-C. (2016). The effects of integrating mobile devices with teaching and learning on students' learning performance: A meta-analysis and research synthesis. Computers & Education, 94, 252–275. https://doi.org/10.1016/j.compedu.2015.11.008

Taillandier, F., & Adam, C. (2018). Games ready to use: A serious game for teaching natural risk management. Simulation & Gaming, 49(4), 441–470. https://doi.org/10.1177/1046878118770217

Tsai, F.-H., Tsai, C.-C., & Lin, K.-Y. (2015). The evaluation of different gaming modes and feedback types on game-based formative assessment in an online learning environment. Computers & Education, 81, 259–269. https://doi.org/10.1016/j.compedu.2014.10.013

Wilson, A. B., Miller, C. H., Klein, B. A., Taylor, M. A., Goodwin, M., Boyle, E. K., Brown, K., Hoppe, C., & Lazarus, M. (2018). A meta-analysis of anatomy laboratory pedagogies. Clinical Anatomy, 31(1), 122–133. https://doi.org/10.1002/ca.22934

Wong, L. (2013).
Student engagement with online resources and its impact on learning outcomes. Journal of Information Technology Education: Innovations in Practice, 12, 129–146. http://www.jite.org/documents/Vol12/JITEv12IIPp129-146Wong%20FT116.pdf

Corresponding author: Nicole Sugden, nisugden@csu.edu.au

Copyright: Articles published in the Australasian Journal of Educational Technology (AJET) are available under Creative Commons Attribution Non-Commercial No Derivatives Licence (CC BY-NC-ND 4.0). Authors retain copyright in their work and grant AJET right of first publication under CC BY-NC-ND 4.0.

Please cite as: Sugden, N., Brunton, R., MacDonald, J. B., Yeo, M., & Hicks, B. (2021). Evaluating student engagement and deep learning in interactive online psychology learning activities. Australasian Journal of Educational Technology, 37(2), 45–65. https://doi.org/10.14742/ajet.6632
Appendix: Descriptions of online activities designed for study

Below are summaries of each online learning activity type followed by descriptions of the individual activities within each category. These descriptions were provided to students in the online survey prior to the evaluation questions.

Note: Some of the activities described below include an example image from the online learning activities. Images from all activities could not be provided as some activities contained images of teaching staff or are subject to copyright restrictions.

Biopsychology activities

Interactive games

The interactive games were “choose your own adventure” style scenarios with questions about real-life applications of content. Activities included:

(1) Psychopharmacology: Neurotransmitter life cycle. In this scenario, you were asked to follow the life cycle of a neurotransmitter. You were given a series of questions whereby you needed to correctly identify which drug acted as an agonist in order to facilitate neurotransmission and boost your neurotransmitter supplies and to send neural signals onto neighbouring neurons.

(2) Neuroanatomy: Clinical placement: Microscope laboratory, medical imaging, and patient clinic. In this scenario, you completed a simulated placement in 3 departments of a biopsychology clinic. In the laboratory department, you viewed microscope images and were asked questions about cells you saw under the microscope. In the medical imaging department, you viewed scans of the brain and were asked questions about medical imaging methods. In the patient clinic ward, you were given case studies of patients with neurological disorders and asked to identify brain regions associated with the symptoms the individuals were presenting. In order to pass your placement at the clinic, you needed to correctly answer all questions in the 3 departments.

(3) Vision: Visual pathway.
In this scenario, you followed the path of a light wave as it moved through the primary visual pathway. You needed to correctly answer a series of questions about the areas of the eye and brain involved in visual processing in order to progress along the pathway.

(4) Sensation: Holiday sensory immersion. In this scenario, you took simulated vacations in 3 different holiday destinations (a tropical beach resort, a snow adventure holiday, and a cultural immersion experience). In each location, you were presented with sensory stimuli (e.g., sounds and images associated with the location) and asked questions about the biopsychological pathways that produce these sensations and perceptions. To avoid being sent home from your vacation, you needed to answer all questions correctly at each location.

[Example image from activity]

(5) Movement: Golf lesson. In this scenario, you took a simulated golf lesson with "Dave" the golf instructor. In the lesson, you needed to correctly identify the neural pathways involved in making the right movement to swing the golf club correctly and hit the ball.

(6) Memory: Lost memories. In this scenario, you played the role of a person who had anterograde amnesia and had lost their memory. You needed to go back in time to a series of events in your life and correctly identify which memories had been lost due to your amnesia so that you could retrieve them.

(7) Emotion: Haunted house. In this scenario, you needed to navigate your way through a haunted house (or a non-scary version featuring cartoon images). As you worked your way through the house, you needed to correctly answer questions about the brain pathways underlying emotion (fear, stress, and anxiety) in order to enter the next room and escape the haunted house.

(8) Motivation: The Hunger Games.
In this scenario, you needed to work your way through the human digestive system and differentiate between the hunger and satiety signals produced by the brain and body in order to be rewarded with food to satiate your hunger drives.

Gamification

At the end of each scenario above, students collected a code word. All code words (12 in total) from the 8 activities were needed to unlock a virtual brain dissection activity.

Virtual sheep brain dissection. In this activity, you used the code words collected from the scenario games to unlock the brain dissection activity. You then watched videos of a sheep brain dissection and completed a series of activities relating to the dissection (e.g., labelling, and identifying structures and functions).

EOM

Students watched video demonstrations of experiments while recording measurements and observations, which were then discussed with the teacher and peers in synchronous online meetings. Activities included:

(1) Osmosis and diffusion experiments. In these activities, you watched a video of diffusion and osmosis taking place in the laboratory. You then recorded what you observed and discussed the findings in an online meeting.

(2) Lie detector demonstration. In this activity, you watched a video of a person taking a lie detector test and completed a quiz indicating when you thought the person was lying (based on their galvanic skin response shown on screen). You were then asked to discuss the reliability of lie detectors and the role of physiological arousal in emotion in an online tutorial.

(3) Eyewitness memory task. In this activity, you watched a video about mnemonic strategies.
In an online meeting afterwards, you were asked eyewitness memory questions about what was happening in the background of the video (e.g., identifying the person in a line-up of people), with discussion of the reliability of memory.

Social Psychology activities

Mind maps

Mind maps summarised the key concepts for each topic and contained interactive images (i.e., clicking an image opened a pop-up window with a YouTube video or web link). There were 11 mind maps (one per topic).

Scenario-based learning

Students read vignettes or watched videos and completed concept check quizzes:

(1) Fact or Fiction Quiz. This interactive quiz asked 18 questions designed to show that the common criticism that social psychology findings are obvious or just common sense is not always warranted. The quiz provided feedback, with the ability to check your answers and see the correct response.

(2) Conformity Quiz. This activity first presented a short PowToon video of Shaz and Ben talking about conformity. You then completed a short interactive quiz on the video and conformity.

(3) Attribution Quiz. In this interactive tutorial, Kelley's attribution theory was applied to a real-life scenario. You were presented with a scenario and worked through the theory in the interactive tutorial.

(4) Altruism Quiz. This tutorial first presented a short PowToon video in which Jimmy and Kimmy discussed the difference between altruism and egotism. You were then presented with 5 scenarios and asked to determine whether each represented altruism or egotism. Responses were recorded via a link to Mentimeter.

(5) Obedience Quiz. This tutorial applied compliance strategies to Jane and Norm's real-world scenario (Jane trying to convince Norm to give up smoking).

(6) Conflict and Cooperation Quiz. This interactive tutorial examined social dilemmas using real-world examples, video clips, and interactive games.
Interactive activity with forum discussion

Controversial topics were presented. Students reflected on their attitudes and biases by completing rating tasks and participating in forum discussions.

(1) Correspondence inference: Child safety judgement task. Students were presented with one of two real-life scenarios (Sandy A or Sandy B) but read only one. After reading their story, students rated how much danger Sandy's child was in based on the situation. The results were presented via Mentimeter, and the differences in responses were discussed on the forum.

(2) Self-presentation and impression management. This activity provided a real-world application of self-presentation and impression management using a YouTube video from The Office. After completing a short interactive quiz, students discussed the concepts in relation to real-world examples (e.g., meeting one's in-laws for the first time) on the forum.

(3) Stereotypes, prejudice, and discrimination. This activity presented two different stories, but students read only one. Students were then asked to make judgements about the credibility of the woman's claims. Results were recorded on Mentimeter, and students were asked to discuss them on the forum.

(4) The bystander effect. This activity provided background on the bystander effect by detailing Kitty Genovese's story. Next, real-life examples of the effect were shown via a YouTube clip, with the option to discuss them on the forum. Finally, two quizzes were provided to help enhance understanding of the related concepts.

(5) Persuasion: Trump speech. This activity provided a mixture of information and YouTube videos that demonstrated persuasion techniques using real-world, and in some cases challenging, content and examples.

Video tutorials

Recorded tutorials in which the teacher discussed concepts and students completed quizzes and psychological assessments and watched videos demonstrating content.
(1) The human zoo: What is social psychology? This tutorial looked at the Human Zoo experiment, taking you through a series of short clips from that experiment, discussing various concepts, and providing short quizzes throughout.

(2) Aggression and antisocial behaviour. In this Panopto tutorial, you were asked to complete some aggression scales and participate in discussion within the Panopto application.

(3) Group processes: Survivor. This tutorial used the reality TV show Survivor to examine group processes and some of the features and characteristics of groups. During the video, you completed some interactive quizzes.