Journal of Effective Teaching in Higher Education, vol. 4, no. 3
https://doi.org/10.36021/jethe.v4i3.247

Videos in Online Courses: Viewing Patterns and Student Performance

Eric Litton, Coker University, elitton@coker.edu

Abstract. Many instructors use videos to support their teaching in online courses, conveying course content that would normally be taught in a traditional setting. Prior studies have shown some connection between utilizing online videos and student performance but do not always support their findings statistically or consider the nuances of the online videos, such as whether the videos are required and how long they are. This article uses various quantitative analysis techniques, including course-level multivariate regressions and a fixed effects model by week, to investigate the relationship between grades, video length, and student video-viewing patterns. It is one of the first studies to analyze student video-viewing patterns by tracking exactly how many minutes of each video the students watch. The findings indicate that videos should stay within a certain length to encourage student engagement with the videos and course assignments. Also, watching online videos is only positively related to grades when students are not required to watch, a result that is consistent across course-level and student-level models. Student viewing patterns also differ between courses that require watching videos and those that do not. The article concludes by discussing the relevance of these results and how instructors can best utilize online videos in their courses.

Keywords: requiring videos; video length; online courses; quantitative analysis; EdPuzzle

Online or distance learning is becoming increasingly popular, and even necessary, in the United States. Almost 7 million students in the United States were enrolled in online education courses, representing 35.3% of all higher education students (NCES, 2019). Then, with the COVID-19 pandemic, virtually all colleges and universities transitioned to online instruction in Spring 2020. Unfortunately, there was little time to build out online courses or for educators to transfer their in-person lesson plans to an online environment (Means et al., 2020). In Fall 2020, 81.7% of all U.S. higher education institutions planned to offer their instruction completely online, primarily online, in hybrid format, or with online options; by Spring 2021, 60.9% of institutions were primarily or fully online or in a hybrid/hyflex model (Davidson College, n.d.).

Unfortunately, this change resulted in a dramatic shift in student satisfaction. According to a survey of 1,008 undergraduate students by Digital Promise and Langer Research Associates, only 19% of students reported being very satisfied with their courses after moving to online learning, compared to 51% before (Means et al., 2020). This reflected a decrease in student satisfaction with multiple aspects of online courses, including the quality of course content, the quality of instruction, and overall learning. Students' dissatisfaction was mainly driven by their inability to collaborate with other students on coursework and by instructors being unable to hold students' interest in the course content. Survey respondents highlighted difficulty staying motivated in the course and missing the presence of the instructor. Additionally, students appear to get lower grades in online courses.
When compared with face-to-face courses, taking an online course lowers student grades by 0.3 points on a 4-point scale (Xu & Jaggars, 2013). Furthermore, students in online courses have lower completion rates, and online courses increase the performance gap for disadvantaged students and students from other socioeconomic groups that tend to underperform in traditional courses (Protopsaltis & Baum, 2019). The COVID-19 pandemic further lowered student grades in online courses; 85.4% of college students noted that the pandemic negatively affected their performance (OneClass, 2020). Despite students' reported dissatisfaction with and lower grades in these courses, and the disproportionate effects on some populations, online education is the fastest-growing segment of higher education (Protopsaltis & Baum, 2019). Online courses are likely to remain pervasive in higher education, as they allow schools to reach more students while being flexible to those students' needs and interests. Consequently, it is in educators' interests to find ways to keep students motivated while delivering high-quality instruction.

Recorded videos are one of the main means that educators have used to deliver content in online courses. Since the appearance of COVID-19, 55% of courses have used pre-recorded videos as part of their lesson plans (Means et al., 2020). Although live video sessions have been more popular than recorded videos since the onset of COVID-19, live video is not as sustainable. Live sessions were common because teachers already had class times scheduled and a general lesson plan for the semester; live video sessions allowed them to maintain that schedule. Online courses, however, are more commonly asynchronous, meaning students and the instructor are not expected to attend sessions together from separate locations. Recorded videos similarly allow educators to teach content visually, but they are scalable, since online teachers can utilize the same recorded videos for multiple courses or semesters.

This article examines the effectiveness of recorded videos in online teaching by comparing video-viewing patterns with grades using multiple quantitative techniques. Video-viewing patterns are the percentage of total minutes that a student watches of a video or a group of videos. First, the article analyzes video-viewing patterns based on video length. In addition, the article compares video-viewing habits in courses that included watching these videos as a requirement of the course grade versus courses that did not include video viewing as part of student grades. This required-versus-optional setup is analogous to including (or not including) attendance or participation as part of a grade in standard in-seat college courses. It is one of the first studies to compare required and optional online videos. The findings suggest that the effectiveness of watching recorded videos, as measured by student grades, depends on whether the videos are required. Watching optional (non-graded) videos has a significant effect on grades, while watching graded videos does not. Further, the results are consistent across a course-level analysis and a within-student design. The article uses these results to describe implications for educators teaching online courses, which can depend on the teacher's preference for engaging students in online learning.
Literature Review

There is a general belief that shorter online videos are better because students are more likely to fully watch and comprehend them. An analysis of massive open online courses (MOOCs) shows that students fully watch videos of six minutes or less and that total viewing time per video decreases as the length of the video increases (Guo, 2013). That sentiment has been challenged, however, as that study was done in free online courses where students have less incentive to watch the videos and to perform well. Follow-up studies show that students are willing to watch much longer videos, even videos of more than 50 minutes, by breaking them up into shorter sessions. On average, students watched 17 to 20 minutes of video per session, with median minutes watched of 13 to 16 (Lagerstrom et al., 2015). This is consistent with earlier observations of traditional in-seat courses that found students' attention spans tend to be best for the first 15 to 20 minutes of class (Middendorf & Kalish, 1996). Additionally, including interactive questions that students can respond to while watching the videos increases mean and median viewing time as well as the percentage of students who completely watch the videos (Geri et al., 2017). So, longer videos may be productive if the students stay engaged or can view them in multiple sessions.

Connections have been made between watching content-related videos of various lengths and student performance in online courses. For very short videos, under 5 minutes, there are mixed results. A large study of undergraduate biology students showed that including very short videos increased test scores by 2% across all students and that lower-GPA students benefitted the most from the videos, increasing their grades by 6.2% (Dupuis et al., 2013). On the other hand, undergraduate math students' grades did not vary based on how many very short videos they watched; however, withdrawal and fail rates did decrease when the courses included these short videos (Hsin & Cigas, 2013). Also, online nursing students did not change their engagement in online courses after watching very short summary videos, nor did the students believe the videos helped their learning (Luo & Kalman, 2018). Aragon and Wickramasinghe (2016) used longer videos, 10–15 minutes, to support students in undergraduate statistics courses. They found that all students watched at least some of the videos and that watching videos had a positive effect on grades: on average, watching one video increased a student's grade by 4.07 points.

Many studies, however, do not indicate the length of their videos. Studies using undergraduate and graduate students offer preliminary evidence that students watch videos in online courses more than once and that those students believe the videos help them understand the content better (Pan et al., 2012; Rose, 2009); on the other hand, no statistical analysis was done to make connections between video-viewing patterns and student surveys. Other studies have made varied statistical connections. Evans (2014) first found that undergraduate students felt better prepared for exams, but test scores were about the same for students in online courses with lecture videos as for students in online courses with just PowerPoint slides. In a follow-up study, though, students in online courses with videos did significantly better on three out of four exams (Evans & Cordova, 2015). Kuo et al. (2015)
were able to measure video-watching patterns at different time periods during an undergraduate psychology course. They found that accessing videos was positively correlated with higher final grades. Further, students who watched videos early, prior to the assigned due date, and again later to review earned better grades than students who only watched videos once before the assigned completion date. Students who sporadically or infrequently watched videos received the worst grades. The authors noted, however, that they only measured video hits on their website, not the amount of time spent on each video.

There are some consistent limitations across studies in addition to the length of videos not always being explicit. As Kuo et al. (2015) mentioned, most studies do not measure the amount of time spent on each video. Instead, studies utilize student-reported views, number of website hits, or a comparison of courses with videos to those without. Moreover, studies that include multiple video views are only able to use student-reported views or website hits. No studies were found that look at how videos of varying lengths relate to students' performance and video-viewing practices, nor do any studies look at students' viewing patterns and compare their performance between courses in which the videos are required or optional.

Research Questions and Hypotheses

This study adds to the prior research by measuring student viewing patterns more precisely, tracking each student's viewing patterns down to the minute for each video. Consequently, the study adds to the research on video viewing and student performance by answering two main research questions related to students' detailed viewing patterns.

Research Question 1: How does video length relate to students' video-viewing patterns, and do video-viewing patterns change when videos are part of their grades?

Students appear to watch longer videos if they are engaged (Geri et al., 2017). On the other hand, students prefer to break up longer videos into shorter, more manageable sessions (Lagerstrom et al., 2015). Other studies used videos of varying length but did not analyze the specific amount of time students spent on each video. As a result, it is expected that students will watch videos of moderate length, though viewing patterns will start to decrease once video length reaches 13 to 16 minutes.

Student viewing patterns appear to increase if students have an incentive to watch the videos (Guo, 2013; Lagerstrom et al., 2015). Consequently, it is expected that students will watch more video-minutes when doing so is part of their course grade. Additionally, the variability in the amount of time students spend on videos is expected to decrease for courses in which the videos are graded, since students are expected to watch up until they receive full credit (i.e., watching 100% of each video).

Research Question 2: How do video-viewing patterns affect student grades?

Studies comparing control courses (no videos) to courses with videos mostly indicate a positive connection between watching videos and student performance (Aragon & Wickramasinghe, 2016; Dupuis et al., 2013; Evans & Cordova, 2015; Kuo et al., 2015) or students' perception of their performance (Pan et al., 2012; Rose, 2009); however, these results are not universal. Some studies found no connection between watching videos and student performance (Evans, 2014; Luo & Kalman, 2018).
Based on prior studies that found a connection between online videos and student performance, it appears that the videos are helpful because they are another way to teach or reinforce course content. Because of this, it is expected that students' course grades will increase as they watch more of the total course videos and that students' assignment grades will increase as they watch more videos related to the specific assignment. In contrast, this connection is expected to be mitigated or to disappear in courses that include the videos as part of the grade, since video-viewing patterns are expected to be less variable there.

Data Collection

Data was gathered from seven master's-level business courses, five negotiation courses and two ethics courses, at a regional university in the southeastern United States. Each course was offered completely online for six weeks. During each course, the instructor created content-relevant videos to help students better understand course topics and assignments. Videos focused on topics that would normally be covered in person during a live lecture, such as summarizing key points of case studies and preparing for assignments. In four courses (three negotiation courses and one ethics course), watching the videos was not part of students' grades; these are hereafter referred to as "non-graded courses." In three courses (two negotiation and one ethics), the videos represented a small portion of the overall grade; these are hereafter referred to as "graded courses." In these courses, a student received full credit if they watched 100% of the video.

The number of minutes watched per video by each student was tracked using EdPuzzle, an online platform for hosting educational videos. EdPuzzle tracks student viewing of videos, including how much of the video each student has watched and how many times the student has viewed each section of the video (each video, regardless of length, is divided into 10 sections). For all courses, students were not informed prior to the course that their viewing patterns would be tracked. Data was extracted from EdPuzzle into Excel, then converted to Stata data files for analysis. All data was anonymized before conducting the analysis, and the data collection method was approved through the institution's IRB process.

Measuring video-viewing patterns in this manner, down to the minute for each student on each video, is a substantial improvement over past studies. Many studies did not collect any video-viewing patterns and simply compared courses with videos to those without videos (Dupuis et al., 2013; Evans, 2014; Evans & Cordova, 2015; Hsin & Cigas, 2013; Pan et al., 2012). Other studies collected data on video viewing based on students' self-reporting (Aragon & Wickramasinghe, 2016; Luo & Kalman, 2018; Rose, 2009) or by counting hits on the video's website (Kuo et al., 2015). No other studies that were found look at how much time each student spent on each video.

There were 78 videos across all courses, or almost two per week per course. Two videos were removed from the analysis because some students had problems with the videos in EdPuzzle and viewing data was not properly tracked; a total of 76 videos are used in the analysis. There is a total of 1,065 minutes of video, an average of 14.01 minutes per video (SD = 6.42). The videos range from 4 minutes to 31 minutes. Since the total minutes of video per course varies, the data is normalized by converting minutes watched to percentages of total minutes, which is used as the variable for students' video-viewing patterns.
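As a rough illustration of how this viewing measure can be constructed, the following Stata sketch aggregates per-student, per-video viewing data into the percentage variables used below. The file name and variable names (minutes_watched, video_length, and so on) are hypothetical stand-ins, not EdPuzzle's actual export schema.

    * Minimal sketch of the viewing-percentage calculation; the CSV and all
    * variable names are hypothetical, not EdPuzzle's real export format
    import delimited "edpuzzle_export.csv", clear

    * One row per student-video pair: minutes watched (rewatched sections
    * count again, so values can exceed the video's length) and total length
    gen pct_watched = 100 * minutes_watched / video_length

    * Course-level measure: percentage of total video minutes watched
    bysort student_id course_id: egen total_watched = total(minutes_watched)
    bysort student_id course_id: egen total_length = total(video_length)
    gen pct_minutes = 100 * total_watched / total_length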
There were 132 total students in the seven courses. Data from two students who did not finish a course was removed, resulting in data from 130 students: 84 students in non-graded courses (62 negotiation, 22 ethics) and 46 students in graded courses (32 negotiation, 14 ethics). Slightly over half of the students (52.3%) are female. The number of students and the number of courses analyzed in this study are, with one exception (Dupuis et al., 2013), greater than in all of the previously cited studies that looked at online video-viewing habits and student performance.

Analysis and Results

First, video-viewing patterns are analyzed. Then student performance is analyzed, first by looking at the course-level data and then, in more detail, by analyzing potential patterns between video viewing and grades by week.

Video-Viewing Patterns

When pooling the 130 students and the videos across courses, there are 1,390 total observations of online videos: 870 in non-graded courses and 520 in graded courses. Figure 1 shows how much students are watching videos of different lengths. When comparing video-viewing patterns between non-graded and graded courses, the percentage of video minutes watched appears to diverge once videos become longer than 15 minutes. After this point, students in graded courses tend to view longer videos only once, and students in non-graded courses are less likely to watch the videos.

Figure 1
Average percentage of video minutes watched by video length
[Line graph: percentage of total video minutes watched by video length (4 to 30 minutes), plotted separately for non-graded and graded courses.]

Pooled ordinary least squares (OLS) regressions are conducted for each course type to analyze whether the relationship visualized in Figure 1 is statistically significant, with control variables for gender and course topic (Table 1). Pooled OLS is used to capture the differences across all videos rather than the global differences in length that would result from a standard linear OLS. For non-graded courses, the Wooldridge test shows the presence of autocorrelation (F(1, 83) = 20.24) and the Breusch–Pagan/Cook–Weisberg test shows the presence of heteroscedasticity (chi2(1) = 26.61). For graded courses, there is no indication of autocorrelation in the data (F(1, 45) = .73), but there is heteroscedasticity (chi2(1) = 5.24). As a result, robust standard errors are utilized in the Table 1 regressions (Mehmetoglu & Jakobsen, 2017). When looking at the overall regressions (across videos of all lengths), video length is negatively related to the percentage of video minutes watched for both non-graded courses, R2 = .07, F(3, 83) = 20.48, p < .001, and graded courses, R2 = .02, F(3, 45) = 2.36, p = .021.
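In Stata, the diagnostic-and-estimation sequence just described might look roughly like the following. The variable names continue the hypothetical ones from the earlier sketch, and xtserial is the user-written implementation of the Wooldridge test (installed with ssc install xtserial).

    * Declare the panel structure (student by video index, hypothetical)
    xtset student_id video_num

    * Wooldridge test for autocorrelation (user-written: ssc install xtserial)
    xtserial pct_watched length

    * Breusch-Pagan/Cook-Weisberg heteroscedasticity test after a plain fit
    regress pct_watched length female ethics if graded == 0
    estat hettest

    * Pooled OLS with robust standard errors, run per course type and
    * length segment (non-graded videos of 15 minutes or less shown here)
    regress pct_watched length female ethics if graded == 0 & length <= 15, vce(robust)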
Table 1
Pooled OLS regressions for percentage of video minutes watched (dependent variable) for courses with non-graded videos and graded videos

                     Overall              Length ≤ 15 min      Length > 15 min
Variable         Beta         SE      Beta         SE      Beta         SE
Non-Graded
  Length         -2.672***    .362    4.526**      1.378   -2.489***    .664
  Female         9.978        8.480   8.354        11.231  12.848       7.834
  Ethics         23.595*      10.762  13.624       14.426  29.123**     9.203
  Constant       123.871***   10.008  50.444**     17.350  109.397***   16.091
Graded
  Length         -.650*       .272    1.082        .553    -.852        .748
  Female         -5.390       5.289   -5.936       6.636   -3.895       7.109
  Ethics         1.875        6.660   7.721        8.269   -7.411       8.852
  Constant       124.967***   5.859   106.894***   8.492   126.476***   18.146

Note: ***p < .001; **p < .01; *p < .05.

A different story unfolds, however, when separating the data between longer videos (greater than 15 minutes) and shorter videos (15 minutes or less). For both segments in graded courses, the length of the videos is not significant, although there is a substantial drop in the percentage of video minutes students are watching, as shown in the difference in the constant values between the two models. For non-graded courses, there is a shift in video-viewing patterns. When videos are 15 minutes or less, students tend to watch more of a non-graded video as the length of the video increases, R2 = .03, F(3, 83) = 4.52, p = .001. On the other hand, for videos longer than 15 minutes, students tend to watch less of a non-graded video as it becomes longer, R2 = .08, F(3, 83) = 9.57, p < .001.

Course Grades and Video Viewing

The comparison of grades and video-viewing habits by course type is included in Table 2. As would be expected, students in graded courses watched a greater percentage of total video minutes, t(128) = -4.16, p = .0001. They also fully viewed more videos, that is, watched 100% of the video minutes or more, t(128) = -8.28, p < .0001. Students in graded courses received higher overall grades, but the difference is only moderately significant, t(128) = -1.71, p < .10.

Table 2
Comparison of grades and video-watching patterns between courses with non-graded videos and graded videos

                                     Non-Graded       Graded           T-test Diff.
                                     M       SD       M        SD
Grade                                83.90   7.14     86.15    7.27    p = .0900
% of Total Video Minutes Watched     87.40   39.39    112.77   16.75   p = .0001
% of Total Videos Fully Watched      64.77   26.63    97.91    6.79    p < .0001

Note: Grades are out of 100.

A linear regression was done for both non-graded courses and graded courses with the course grade as the dependent variable and independent variables for percentage of video minutes watched and percentage of videos fully watched, along with control variables for gender and course topic. Table 3 shows the detailed results. Even though the variables for percentage of video minutes watched and percentage of videos fully watched are strongly correlated (rho = .8145), the regression VIF values are low (2.21 and 1.34, respectively). Additionally, the coefficients and significance of the remaining predictors are unchanged when removing the variable for percentage of videos fully watched from the regressions. As a result, both variables are included in the regression models (Mehmetoglu & Jakobsen, 2017).
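The group comparisons in Table 2 and the collinearity check just described could be reproduced with commands along these lines, again with hypothetical variable names; estat vif follows a standard regress fit.

    * T-tests comparing non-graded and graded courses (as in Table 2)
    ttest grade, by(graded)
    ttest pct_minutes, by(graded)

    * Correlation between the two viewing measures, then VIFs
    pwcorr pct_minutes pct_full, sig
    regress grade pct_minutes pct_full female ethics if graded == 0
    estat vif

    * Re-fit without the second measure to confirm coefficients are stable
    regress grade pct_minutes female ethics if graded == 0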
Table 3
Linear regressions for course grades (dependent variable) for non-graded courses and graded courses

                                      95% CI
Variable      Beta     SE       LL       UL       p
Non-Graded
  % Minutes   .0925    .0339    .025     .1599    .008
  % Videos    -.0141   .0482    -.110    .0820    .771
  Female      -2.156   1.441    -5.025   .7124    .139
  Ethics      -2.128   1.729    -5.570   1.314    .222
  Constant    78.393   1.968    74.476   82.310   <.001
Graded
  % Minutes   -.029    .075     -.181    .123     .701
  % Videos    .286     .186     -.090    .661     .132
  Female      -1.249   1.991    -5.271   2.772    .534
  Ethics      7.039    2.176    2.645    11.433   .002
  Constant    60.001   15.030   29.646   90.355   <.001

In non-graded courses, the percentage of total video minutes watched predicts the final grade, R2 = .20, F(4, 79) = 5.00, p = .008, while the percentage of total videos that were watched fully does not predict the final grade, p = .771. For each additional percentage point of video minutes watched, on average, a student's grade increases by .093% (±.067%). In other words, to increase a final grade by 5%, a student watches an additional 54.07% of total video minutes. In graded courses, the percentage of total video minutes watched does not predict the final grade, R2 = .25, F(4, 41) = 3.37, p = .701, and the percentage of total videos that were watched fully also does not predict the final grade, p = .132.

Weekly Grades and Video Viewing

Next, video-viewing habits and their effect on grades are analyzed by week using longitudinal data in which the videos and grades are split into the weeks that they were covered in each course. A comparison of video-viewing habits between graded courses and non-graded courses is shown in Figure 2. The figure illustrates that students in non-graded courses watched more of the videos as the course progressed, whereas students in graded courses watched about the same amount of the video minutes each week. Notably, graded courses are consistently above 100%, which indicates that students are watching the graded videos fully plus about 10–30% extra.

Figure 2
Average percentage of total video minutes watched per week, with 95% confidence intervals
[Line graph: percentage of total video minutes watched (y-axis) by week 1 through 6 (x-axis), plotted separately for non-graded and graded courses.]

Students' weekly viewing habits are reinforced by a simple time-series regression with the week number as the independent variable and the percentage of total video minutes watched as the dependent variable. For non-graded courses, the percentage of minutes watched increased significantly each week, beta = 7.169, R2 = .03, F(1, 502) = 16.18, p < .001. On the other hand, for graded courses, the percentage of minutes watched did not significantly change each week, beta = 1.141, R2 = .00, F(1, 274) = .88, p = .350.

Moreover, students watched fewer videos earlier in non-graded courses. The average values are significantly different for Week 1 (t(128) = -4.75, p < .0001), Week 2 (t(128) = -3.60, p = .0004), and Week 3 (t(128) = -3.08, p = .0025). The average values converge and are not statistically different in weeks 4 through 6.
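A sketch of these weekly analyses, assuming the data have been reshaped to one observation per student per week (variable names remain hypothetical):

    * Weekly trend in viewing, estimated separately by course type
    regress pct_minutes week if graded == 0
    regress pct_minutes week if graded == 1

    * Week-by-week t-tests of viewing between the two course types
    forvalues w = 1/6 {
        ttest pct_minutes if week == `w', by(graded)
    }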
Table 4 compares the weekly assignment grades for non-graded courses and graded courses (not including the points for watching the videos). For the first four weeks, grade averages were higher in graded courses, and the difference is statistically significant in weeks 2, 3, and 4. Similar to the video-viewing habits, grades converge in the final week of the courses.

Table 4
Comparison of weekly grades between courses with non-graded videos and graded videos

         Non-Graded       Graded           T-Test Diff.
Week     M       SD       M        SD
1        88.33   15.40    90.23    8.41    p = .4404
2        82.56   13.57    87.55    10.24   p = .0313
3        87.00   11.70    91.95    12.39   p = .0256
4        86.04   11.30    90.20    9.23    p = .0349
5        89.76   7.87     88.75    6.90    p = .4695
6        80.71   10.63    79.55    13.59   p = .5916

Course grades tend to be better earlier in the course, as each master's-level course begins with topics that are more familiar to the students, then progresses to newer, more difficult topics, and concludes with a challenging final project. Because of this, using weekly raw grade percentages will likely not truly reflect the relationship between each student's viewing patterns and grades over time. As such, mean-difference grades are used, calculated by first averaging the grades for each course by week and then subtracting the weekly course average from each student's grade. Table 5 reflects models using this mean-difference grade as the dependent variable and time-series viewing patterns as the independent variables. A fixed effects model is used for each because all analyses are interested in an individual student's viewing habits and grades (i.e., variation within students), there is no group grading in these courses (which would create variation between students), and the results from some of the Hausman tests of the variables indicate covariation between the error term and the explanatory variables (Mehmetoglu & Jakobsen, 2017). For each set of variables, the random effects model results in similar coefficients with the same statistical significance.

Table 5
Panel data using fixed effects models to predict grades (compared to average) based on percentage of video watched and percentage watched in excess of 100%

                                                       95% CI
Course       Variable     Within R2  Beta     SE      LL       UL       p
Non-Graded   % Minutes    .022       .0034    .0011   .0012    .0057    .002
             Constant                -.3209   .1218   -.5602   -.0815   .009
Non-Graded   Above 100%   .048       .0047    .0019   .0009    .0085    .015
             Constant                -.0468   .1396   -.3232   .2296    .738
Graded       % Minutes    .016       .0040    .0021   -.0001   .0082    .057
             Constant                -.4656   .2517   -.9616   .0304    .066
Graded       Above 100%   .020       .0039    .0028   -.0016   .0094    .164
             Constant                -.0318   .1261   -.2823   .2186    .801

Note: Control variables for gender and class topic are omitted because they are constant within students.

The fixed effects model for non-graded courses indicates that the percentage of total video minutes a student watches each week has a positive impact on the student's grade for that week, R2 = .02, F(1, 419) = 9.29, p = .002. Similarly, if a student watches more than 100% of the video minutes, the amount of the video they rewatch has a positive impact on their grade for that week, R2 = .05, F(1, 121) = 6.09, p = .015. For graded courses, there is some indication that the percentage of total video minutes watched has a positive impact on the student's grade for that week, R2 = .02, F(1, 229) = 3.65, p = .057, but the amount watched above 100% is not statistically significant, R2 = .02, F(1, 95) = 1.97, p = .164.
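The mean-difference grade construction and the fixed effects estimation reported in Table 5 could be sketched as follows; hausman compares the fixed and random effects fits, as described above, and names remain hypothetical.

    * Mean-difference grade: each student's weekly grade relative to the
    * weekly average for their course
    bysort course_id week: egen week_mean = mean(grade)
    gen grade_diff = grade - week_mean

    * Within-student (fixed effects) model of weekly grades on viewing,
    * with a Hausman test against the random effects specification
    xtset student_id week
    xtreg grade_diff pct_minutes if graded == 0, fe
    estimates store fe
    xtreg grade_diff pct_minutes if graded == 0, re
    estimates store re
    hausman fe re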
Discussion and Implications

The results of this study provide detailed yet nuanced answers to both research questions. The variation in the videos used across the courses in this study allowed a comparison of video length and video-viewing patterns, which answers research question 1. The results support other studies on video length, specifically that students are less likely to watch videos that are longer than 15 minutes. Interestingly, this study showed that students were more likely to watch non-graded videos that were slightly longer, as long as the videos stayed under that 15-minute threshold. Once videos in non-graded courses exceeded 15 minutes, the percentage of minutes watched by students dropped significantly. Graded videos under 15 minutes had consistent viewership, with students tending to rewatch small portions of the videos. With longer graded videos (longer than 15 minutes), students were essentially watching the video once and not rewatching sections.

When comparing the two course types over time, students in non-graded courses watch more of the online videos as the course progresses, while students in graded courses watch about the same amount each week. Interestingly, both the video-viewing habits and student grades are virtually the same at the end of each course type (in Week 6). This initial difference could be a result of students' overconfidence and/or procrastination. In non-graded courses, students may feel that they sufficiently learned the content using other resources, such as the textbook; alternately, they may have waited too long to work on the assignments, and watching videos would use too much of the little time they have remaining. In graded courses, on the other hand, a small incentive was enough for students to watch the videos throughout the course.

Lastly, turning to research question 2, the results show a positive connection between instructor-made online videos and student grades for courses in which the videos are optional (that is, not required as part of the students' grade). These results were found to be robust, as they were true both at the overall course level and at the student level when comparing each student's performance and video-viewing patterns across weeks.

For non-graded courses, even though the connection between video viewing and grades is statistically significant, video viewing should not be thought of as a silver bullet for improving student performance. First, the coefficients for the course-level and student-level models are low, indicating that a student would have to spend a lot of time watching videos to change their letter grade. The results suggest that for a student to increase their grade by 10%, they would need to watch (on average) more than 100% of the video minutes; in other words, watch all the videos again and then some. Of course, students on the margins may just need to review key concepts in the videos to bump up their grade. Second, other model outputs, such as the R2 values, indicate that there are other factors also contributing to student performance.

For graded courses, none of the models (course-level or student-level) showed a significant relationship between video-viewing patterns and grades. This does not necessarily mean that requiring students to watch videos in an online course is not worthwhile. Students in graded courses did earn higher grades than students in non-graded courses, but the difference was only statistically significant for certain weeks and not statistically significant for overall course grades. This grade difference may be more pronounced in other course types (e.g., undergraduate courses or courses in other subjects).

Implications for Practice

These findings have numerous implications for higher education instructors.
The implications below highlight how instructors can apply these results with respect to the following pedagogical choices:

1. Selecting the optimal length for online videos
2. Getting students to watch more videos related to their course, especially earlier in the course, by including video viewing as part of the grade
3. Using videos for more difficult online courses
4. Increasing video viewing without making it part of the course grade

Whether requiring videos or not, instructors should keep their videos to less than 15 minutes. Students are more likely to watch these videos more than once if they are graded. If videos are optional in the course, students are more likely to watch these shorter videos and even watch them more than once. As a result, only using videos under 15 minutes, required or not, makes it more likely that students will engage with the videos and, consequently, better understand the content.

If instructors are having trouble getting students to watch videos, they should require them as part of the grade. Even having videos be a small portion of the grade (in this study, video grades totaled 4.5% to 6% of each course grade) is enough to incentivize students to fully watch videos. The difference between how much of each video students are watching in graded versus non-graded courses is especially apparent early in the courses. It appears that when videos are not part of the grade, students do not realize the value of the videos until after they begin receiving assignment grades. Then, they begin watching more and more of the videos as the course progresses. Instructors can help fill this gap by requiring videos right up front or by explaining to students that watching these videos has a positive impact on their grade.

Some courses are considered difficult at an institution or have new topics that students may not have seen yet. Requiring videos in these courses could keep students from unnecessarily falling behind early in the online course. Graded videos can also help students start the course strong by taking advantage of the online videos as a resource to better learn the difficult content.

Requiring videos can be troublesome for some instructors. They may not have a way to check or verify that a student has watched a video. Instructors can use online resources, such as EdPuzzle, or they can utilize other ways to engage students while watching the video. For instance, professors can embed questions into the videos they make (e.g., asking students to pause and consider the question). The embedded questions can help students think about the early steps of a project or help students slow down when considering difficult concepts or practice problems. Another way instructors can increase engagement with videos is by including worksheets that students can complete while they watch the videos and then having the students turn in the worksheets once they are done with the video (or series of videos). Studies have found that engaging students like this makes them more likely to watch longer videos and can increase their course grades through spending more time on online assignments (Aragon & Wickramasinghe, 2016; Lagerstrom et al., 2015).

If an instructor chooses to make videos optional for their online course, then they should reinforce the benefits of the videos to the students.
Instructors can do this by informing students of the connection between watching content-related videos and grades. They can also encourage students to rewatch videos before a test or before starting a big assignment by posting an announcement on the online platform with links to the relevant videos.

Limitations

This study utilized master's-level business courses that are relatively short, only six weeks. Most other studies use undergraduate courses that take place over 8- or 16-week terms. While finding the connection between video viewing and grades at a different course level, discipline, and duration helps support that connection, some results could be different for undergraduate students or students in other disciplines.

The lack of a statistical relationship between grades and video viewing in graded courses might be surprising to some. This could be due to the lack of variability in the explanatory variable (percentage of minutes watched); its range and standard deviation were both smaller in graded courses than in non-graded courses. Additionally, grades were higher in courses that required videos, but not by a significant margin. Both relationships may become statistically significant in other course settings.

Conclusions

Online learning is becoming more prevalent in higher education, and content-related videos are a common way for teachers to communicate lessons and ideas that they would normally discuss in a traditional, in-seat class. This article is one of the first studies to utilize detailed student viewing-pattern data to analyze the connection between watching online videos and student performance. The results illustrate that watching instructor-made videos positively impacts student grades, but only when the videos are not required as part of the course grade. These results held at both the course level and the within-student level; that is, individual students in non-graded courses improved their grades when watching more video minutes in one week as compared to other weeks. Nevertheless, requiring videos could be helpful in courses where students are not utilizing the videos to better understand course content. Lastly, longer videos (more than 15 minutes) can be counterproductive, as students are less likely to watch them, which could negatively impact their grades. Instructors can take advantage of these results in a variety of ways to improve their online courses.

Conflicts of Interest

The author declares that there is no conflict of interest regarding the publication of this article.

References

Aragon, R., & Wickramasinghe, I. P. (2016). What has an impact on grades? Instructor-made videos, communication, and timing in an online statistics course. Journal of Humanistic Mathematics, 6(2), 84–95. https://doi.org/10.5642/jhummath.201602.07

Davidson College. (n.d.). The college crisis initiative dashboard. Retrieved July 26, 2021, from https://collegecrisis.shinyapps.io/dashboard/

Dupuis, J., Coutu, J., & Laneuville, O. (2013). Application of linear mixed-effect models for the analysis of exam scores: Online video associated with higher scores for undergraduate students with lower grades. Computers & Education, 66, 64–73. https://doi.org/10.1016/j.compedu.2013.02.011

EdPuzzle Help Center. (n.d.). EdPuzzle. Retrieved July 31, 2021, from https://support.edpuzzle.com/hc/en-us/sections/360001671011-Getting-Started

Evans, H. (2014). An experimental investigation of videotaped lectures in online courses.
TechTrends: Linking Research & Practice to Improve Learning, 58(3), 63–70. https://doi.org/10.1007/s11528-014-0753-6

Evans, H. K., & Cordova, V. (2015). Lecture videos in online courses: A follow-up. Journal of Political Science Education, 11(4), 472–482. https://doi.org/10.1080/15512169.2015.1069198

Geri, N., Winer, A., & Zaks, B. (2017). Challenging the six-minute myth of online video lectures: Can interactivity expand the attention span of learners? Online Journal of Applied Knowledge Management, 5(1), 101–111. http://dx.doi.org/10.36965/OJAKM.2017.5(1)101-111

Guo, P. (2013, November 13). Optimal video length for student engagement. EdX Blog. https://blog.edx.org/optimal-video-length-student-engagement/?track=blog

Hsin, W.-J., & Cigas, J. (2013). Short videos improve student learning in online education. Journal of Computing Sciences in Colleges, 28(5), 253–259.

Kuo, Y.-Y., Luo, J., & Brielmaier, J. (2015). Investigating students' use of lecture videos in online courses: A case study for understanding learning behaviors via data mining. In F. W. B. Li, R. Klamma, M. Laanpere, J. Zhang, B. F. Manjón, & R. W. H. Lau (Eds.), Advances in Web-Based Learning—ICWL 2015 (pp. 231–237). Springer International Publishing. https://doi.org/10.1007/978-3-319-25515-6_21

Lagerstrom, L., Johanes, P., & Ponsukcharoen, U. (2015). The myth of the six-minute rule: Student engagement with online videos. 2015 ASEE Annual Conference & Exposition, 26.1558.1–26.1558.17. https://peer.asee.org/24895

Luo, S., & Kalman, M. (2018). Using summary videos in online classes for nursing students: A mixed methods study. Nurse Education Today, 71, 211–219. https://doi.org/10.1016/j.nedt.2018.09.032

Means, B., Neisler, J., & Langer Research Associates. (2020). Suddenly online: A national survey of undergraduates during the COVID-19 pandemic. Digital Promise. https://doi.org/10.51388/20.500.12265/98

Mehmetoglu, M., & Jakobsen, T. G. (2017). Applied statistics using Stata: A guide for the social sciences (1st ed.). SAGE Publications.

Middendorf, J., & Kalish, A. (1996). The "change-up" in lectures. The National Teaching & Learning Forum, 5(2). https://docstull.files.wordpress.com/2013/11/the-generalist-integration.pdf

NCES. (n.d.). Digest of education statistics, 2019. National Center for Education Statistics. Retrieved July 26, 2021, from https://nces.ed.gov/programs/digest/d19/tables/dt19_311.15.asp

OneClass. (n.d.). Did college students perform worse during COVID-19? Retrieved July 26, 2021, from https://oneclass.com/blog/featured/184334-did-college-students-perform-worse-during-covid-193F.en.html

Pan, G., Sen, S., Starrett, D. A., Bonk, C. J., Rodgers, M. L., Tikoo, M., & Powell, D. V. (2012). Instructor-made videos as a learner scaffolding tool. Journal of Online Learning and Teaching, 8(4). https://jolt.merlot.org/vol8no4/pan_1212.htm

Protopsaltis, S., & Baum, S. (2019). Does online education live up to its promise? A look at the evidence and implications for federal policy. George Mason University. https://jesperbalslev.dk/wp-content/uploads/2020/09/OnlineEd.pdf

Rose, K. K. (2009). Student perceptions of the use of instructor-made videos in online and face-to-face classes. Journal of Online Learning and Teaching, 5(3). https://jolt.merlot.org/vol5no3/rose_0909.htm

Xu, D., & Jaggars, S. S. (2013). The impact of online learning on students' course outcomes: Evidence from a large community and technical college system. Economics of Education Review, 37, 46–57. https://doi.org/10.1016/j.econedurev.2013.08.001