Australasian Journal of Educational Technology, 2018, 34(4).

Factors influencing peer learning and performance in MOOC asynchronous online discussion forum

Thomas K. F. Chiu, Timothy K. F. Hew
The University of Hong Kong

Most studies on traditional asynchronous online discussion suggest that facilitating dialogue, that is, commenting in forum activities, results in better learning and performance. However, due to open entry and diverse learner backgrounds, learner behaviour in massive open online courses (MOOCs) may be different. Viewing forum messages, which involves fewer mental and physical actions as well as less cognitive processing, rather than posting forum messages, might better match the learner's study purpose. In this study, we investigated the effects of three common types of online MOOC discussion forum activities (viewing, voting and commenting) on student peer learning (peer reviews) and performance (quiz scores). We used stepwise regression models to analyse two data samples of a humanities and arts MOOC provided by a private university in the United States to explore factors influencing peer learning and performance. The results indicate that peer learning and performance were primarily predicted by viewing, and to a lesser extent by commenting. Three plausible explanations for the findings are the learner's study purpose, weaker instructor-learner ties, and voluntary forum participation. We suggest ways to encourage viewing messages in forums and present limitations and directions for further research.

Introduction

The advent of massive open online courses (MOOCs) has revolutionised online education (Billsberry, 2013; Lucas, 2013) and has stolen the limelight in academia in the last several years (Haggard, 2013). MOOCs, usually run by educational technology service providers such as EdX and Coursera in partnership with reputable universities, are free and open to all, and allow all kinds of individuals to enrol.
Anyone can enrol in the courses out of interest; for example, a mathematics professor can enrol in and study a history MOOC. The courses welcome novice as well as advanced learners (Yang, Wen, Howley, Kraut, & Rose, 2015). In a MOOC, the participants may include school pupils, university students, or research scientists (DeBoer, Stump, Seaton, & Breslow, 2013; Gillani & Eynon, 2014), indicating a large diversity in many aspects, such as motivation, learning styles and goals. Open entry and diverse learner backgrounds lead to an unsupervised learning environment with distant relationships between peers and instructors (Garrison, Anderson, & Archer, 2010; Moon, Potdar, & Martin, 2014). As a result, MOOC learners are very likely to engage in learning activities in a different way compared to those studying in online courses in formal, conventional programs. In conventional learning environments, learners usually know one another or share the same interests and academic goals. They are also supervised or given detailed guidance by their teachers. In contrast, a MOOC generally carries no fees, and no predefined expectations for participation (McAuley, Stewart, Siemens, & Cormier, 2010). Learners sign up for MOOCs voluntarily (Alraimi, Zo, & Ciganek, 2015), and do not know most of their peers (DeBoer et al., 2013; Gillani & Eynon, 2014). They have the freedom to participate in the learning activities. But perhaps the biggest difference between a MOOC and an ordinary course is the size of the enrolment. The average enrolment in MOOCs is about 43,000 students (Ferenstein, 2014), who are very diverse in terms of nationality and educational level (Breslow et al., 2013). In this paper, we use the term closed to describe the conventional, formal learning environment, and open to describe the MOOC learning environment.
For example, closed asynchronous discussion refers to forum-based discussion in conventional learning environments such as university, credit-bearing, face-to-face and online courses, while open asynchronous discussion refers to the forums in MOOCs. Hitherto, most research on asynchronous discussion forums has been conducted in closed environments (e.g., Dennen, 2005; Hew, Cheung, & Ng, 2010; Masters & Oberprieler, 2004; Vonderwell, 2003); little research has studied the impacts of forums in open environments, such as MOOCs. In particular, relatively few studies have examined how forum activities with different cognitive processing, especially commenting, viewing and voting, may influence student performance and participation in large-scale open learning environments. An investigation of open, asynchronous discussion activities in MOOCs can help further our understanding of student participation in online environments. Therefore, the main goal of this study was to investigate the effect of viewing, voting, and commenting (activities ranging from lower to higher cognitive processing) on students' peer learning and performance in a MOOC. The rest of the paper is organised as follows: we first present a summary of previous studies on MOOCs and asynchronous online discussion activities and their cognitive processing, followed by the purpose and methodology of the study. Then we describe the results of our analyses, followed by a discussion of the results and the conclusion of the study. Literature review Research in MOOCs Studies on MOOCs have concentrated on two major research themes or strands. One of these strands is student completion and dropout in MOOCs. For example, Alraimi et al. (2015) suggested that openness and reputation are stronger factors affecting a learner's decision to continue learning in MOOCs when compared to enjoyment and usefulness.
Completion rate in a MOOC is higher when its instructor is from a more prestigious university (Alraimi et al., 2015), or when the educational materials are more visible and accessible, that is, uploaded to the learning management system at the beginning of the course (Alraimi et al., 2015; Iiyoshi & Kumar, 2008). MOOC learners prefer to easily retrieve resources from the platform for self-learning purposes. Pedagogy also plays an important role in helping the learners to persist in the course. Many studies suggest that learning activities and instructional approach are important in MOOCs (Chen & Chen, 2015; Onah, Sinclair, & Boyatt, 2014; Rai & Chunrao, 2016). Chen and Chen (2015) suggested that offline face-to-face study groups are a key influential factor of student participation and motivation. Onah et al. (2014) suggested that MOOC instructors use different presentations – videos, audios, slides, pictures and documents – for resources and activities to sustain learner interest. Rai and Chunrao (2016) suggested MOOC learners who receive sufficient support and timely feedback are more likely to continue until the end of the courses. Nawrot and Doucet (2014) found that most students dropped out due to conflicting life responsibilities and interruptions during the course, such as being left behind because of work or travel. The second major research strand in MOOCs concerns student learning and performance. Major MOOC student learning activities include watching video lectures, participating in discussion forums, performing peer reviews, and completing tests and assignments. In general, MOOC learners watch video lectures, and then discuss a topic with peers and their instructor or facilitators. After that, learners are required to finish tests, assignments, or peer reviews. These activities can be generally classified into two categories: receiving information (e.g., video lectures) and interacting with peers and instructors (e.g., discussion forums).
Coffrin, Corrin, de Barba, and Kennedy (2014) found that many more students viewed the videos than worked on the assignments. Although MOOCs rely mainly on discussion forums for interaction among students (Coetzee, Fox, Hearst, & Hartmann, 2014), only about 5%–12% of participants write posts in the forums (Bárcena, Read, Martín-Monje, & Castrillo, 2014; Cisel, 2014; Kizilcec, Papadopoulos, & Sritanyaratana, 2014). More than 75% of participants read discussion threads only once (Cisel, 2014). One of the issues we need to address is that MOOCs lack a physical environment that allows real-time interaction between learners and instructors (Ezen-Can, Boyer, Kellogg, & Booth, 2015; Moon et al., 2014; Wong, Pursel, Divinsky, & Jansen, 2015), and hence online discussion forums in MOOCs play an important role in trying to bridge this gap (Moon et al., 2014). Researchers have also begun to examine the effects of forum participation on student learning. Coetzee et al. (2014), for example, found a positive correlation between the number of responses to forum questions and students' final grades and course retention. Students who posted at least one message achieved more (i.e., received advanced course certificates). The number of viewed threads was also associated with achievement; students who received course certificates read the forums more than others (Cisel, 2014). These indicate that online forum activities can influence student performance in MOOCs. Asynchronous online discussion forum activities and cognitive processing Many studies have suggested that proper use of asynchronous online discussion in conventional formal learning environments engages learners and improves performance (Brewer & Klein, 2006; Chen & Chiu, 2008; Cheung & Hew, 2004; Hew et al., 2010; Khine, Yeap, & Chin Lok, 2003; Oliver & Shaw, 2003).
Learners in these studies had similar backgrounds; for example, all learners were pre-service teachers (Cheung & Hew, 2004; Hew et al., 2010; Khine et al., 2003; Vonderwell, 2003) or had similar academic backgrounds: business (Brewer & Klein, 2006), mathematics (Chen & Chiu, 2008) and health science (Masters & Oberprieler, 2004; Oliver & Shaw, 2003). Therefore, literature on online forum use has focused on investigating how to engage student participation in closed asynchronous online discussions (e.g., Dennen, 2005; Hew et al., 2010; Masters & Oberprieler, 2004; Vonderwell, 2003). Online forums provide an environment that facilitates dialogue between learners and instructors (Dennen, 2005; Hew et al., 2010). Studies have focused on recommending instructional strategies to facilitate commenting and responding. For example, Hew et al. (2010) reviewed 50 empirical studies on asynchronous online discussions. Participants in the reviewed studies were from the same institution, shared the same educational level or had a similar educational background. They further suggested that one of the major factors influencing student participation in discussions is the response time to learners’ questions. Without receiving timely responses, learners feel frustrated and discouraged (Cheung & Hew, 2004). Dennen (2005) has suggested that presenting detailed guidance and deadlines positively affects the discourse in an online course. An, Shin, and Lim (2009) and Dennen (2005) reported the relationship between learners and their instructors as another important factor affecting learner motivation to comment in online discussions. The group sizes in these studies were small, which enabled the instructors to be more involved in the online forums. Involvement of the instructor is a major factor influencing learner contribution (An et al., 2009; Dennen, 2005; Cheung & Hew, 2004; Wang, 2008; Mazzolini & Maddison, 2007). 
More constructive and timely feedback can encourage learners to participate more in the discussion forum. When learners feel less involvement from their instructors, they are unlikely to post or respond in the discussions (Dennen, 2005). In the above studies, instructional strategies involved only one online forum activity – commenting on messages. In other words, commenting on messages is the important factor affecting participation in closed asynchronous discussions. In contrast, the learning environment and the nature of learners in MOOCs are different. Commenting on messages may not be the most influential factor in such an open environment. MOOC learners are voluntary (Alraimi et al., 2015) and self-regulated (Kop & Fournier, 2011; Yang et al., 2015), and do not know most of their peers (DeBoer et al., 2013; Gillani & Eynon, 2014). They have the freedom to participate in the tasks of their choice at their own pace even if the completion of tasks is required for acquiring certificates (Onah et al., 2014). They may prefer to read messages rather than respond because of their busy schedules, while learners in a conventional formal programme follow the assessment criteria or methods and focus on meeting instructor and course requirements. MOOC learners select tasks they perceive as useful. This selective participation and learner-preferred learning process further indicate the challenges MOOC instructors face in contrast to conventional online courses. This process also implies that cognitive processing should be considered when designing forum activities for learners in open learning environments (Garrison et al., 2010). Cognitive processing of activities affects student learning and engagement (Chiu & Churchill, 2015a, 2016; Chiu & Mok, 2017; Garrison et al., 2010), which plays an important role in online educational practice, particularly in open online learning (Garrison et al., 2010; Swan, Garrison, & Richardson, 2009).
This processing is associated with the learning goals of learners. In other words, activities with different cognitive processing will affect the use of the MOOC online forum for learning. In sum, it is necessary to investigate how forum activities with different cognitive processing may influence student performance in MOOCs. Method Purpose of the present study Our current study is similarly concerned with the effects of forum participation on student performance and learning, but it explores the problem from a different angle. Most recent MOOC studies did not take cognitive processing into account when examining forum activities. However, cognitive processing is very important for learners in open environments (Garrison et al., 2010). Therefore, this study aimed to explore learner behaviour in forums from a different perspective: cognitive processing. Specifically, we investigated the effects of three discussion forum activities – commenting, viewing, and voting – on student peer review and academic performance in MOOCs. Commenting on other learners' messages typically requires deeper thought processing since a learner has to explicitly express her or his thoughts in words. On the other hand, viewing and voting, which involve reading, may require less cognitive processing. Specifically, we examined two types of samples: all forum participants (i.e., learners who commented, viewed, or voted at least once in the forums), and completing participants (i.e., learners who completed the course). We employed these two samples to examine how the three forum activities might affect total quiz scores (academic performance) and the number of peer review submissions (peer learning). Peer review requires learners to submit feedback to other learners' assignments.
We used the number of peer review submissions as a proxy for peer learning because reading and writing the feedback benefits not only the student getting the feedback, but also the student giving it (van den Berg, Admiraal, & Pilot, 2006). Students whose work is reviewed benefit from getting external perspectives on ways to improve their work, thus stimulating their critical thinking (Paré & Joordens, 2008). Students performing the review also benefit because they may get ideas for improving their own work (Paré & Joordens, 2008). We hypothesised as follows: • (H1) Commenting will have a significant, positive effect on peer reviews. • (H2) Commenting will have a significant, positive effect on academic performance. • (H3) Viewing and voting will have significant, positive effects on peer reviews. • (H4) Viewing and voting will have significant, positive effects on academic performance. Dataset and participants The MOOC in this study was offered by a well-known private American university. The course was conducted in Coursera, which is a for-profit company founded in 2012 by two Stanford University computer scientists who partnered with leading universities (Clarke, 2013). This non-credit-bearing course provided the chronology of 20th- and 21st-century American poetry and a way of understanding general cultural transitions from modernism to postmodernism. Learners did not need any prior knowledge of poetry or poetics. The total duration of the course was 10 weeks. Each week participants were required to spend 5 to 10 hours to read some poems, view approximately 2 hours of video, participate in the discussion forum (asynchronous), take quizzes and conduct peer reviews. The video lecture was a means of information transfer; the asynchronous discussion further facilitated deeper understanding; the quizzes assessed learner understanding; and the peer reviews showed learner participatory level in peer learning. 
The instructor explicitly expressed that the discussion forums were very important by posting a statement in the introduction of the course outline. The forums were the only environment in which learners could interact with the instructor and other learners by asking questions and reading other learners' responses. Table 1 summarises the features of the MOOC.

Table 1
Summary of the MOOC features

Course: Poetry (teaching staff: 1 professor, 10 teaching assistants)
Length: 10 weeks
Est. workload: 5–10 hours per week
Video lectures: Conversational panel discussions with English and Spanish subtitles (some videos include other languages such as Dutch); playback can be slowed down or sped up by the user; 7–12 lecture videos per week, excluding miscellaneous videos and webcasts; video length 9–29 min (median = 15 min, average = 15 min)
Other resources: Forums; instructors' office hours; live webcast sessions with guest poets; supplementary syllabus "ModPoPlus" offering new poems and new close readings
Assessment: Two weekly lesson quizzes (untimed); four writing assignments (peer review); hard deadlines for quizzes and writing assignments
Remarks: The effective score was the highest score of all allowed attempts on weekly quizzes made before the hard deadline; for writing assignments, participants graded as many essays as assigned by the instructor

In the course, a certificate of accomplishment was awarded if a participant wrote four essays and submitted them on time; wrote at least four peer reviews during each of the four peer review periods; took all the quizzes and received a score higher than 0 on each one of them; and participated in online discussions by posting a comment at least once each week to any of the poem-specific forums. The forums that were set up by the instructor allowed the learners to view, vote and comment.
Reading messages is considered viewing; liking and disliking a message are considered voting; and responding to a message is considered commenting. Commenting can be seen as responding to peer opinions. Comments were accessible only to the professor and learners who enrolled in the course. The anonymous post function was not enabled; that is, all the messages posted had user and message identifications. Procedure We first obtained ethical approval from our university and obtained consent from the instructor. Then we retrieved a database of the MOOC from the Coursera content management system and imported the course data and its activity into a database management system, MySQL Workbench 6.2 CE. The MOOC database was anonymised without any student identifiers. We read a document called Data export procedures published by Coursera to understand all the tables in the database. We exported the essential data for this study and imported them into SPSS Statistics version 22. Stepwise multiple regression models are used in the exploratory stages of model building to identify useful predictors. In this method, the predictor variables are automatically entered into the models one at a time based upon statistical criteria. Not all the predictor variables may end up in the model. Unlike stepwise regression, hierarchical models are used to examine the contributions of predictors above and beyond previously entered predictors, that is, incremental validity. The order of the predictor variables is based on theory. These two models answer different questions. Since there is no well-recognised theory to understand the contributions of three MOOC discussion forum activities, this study adopted an exploratory approach and used stepwise regression to analyse the data.
In the analysis, the number of peer reviews and the total scores in quizzes were the dependent variables; and the numbers of comments, views, and votes were the independent variables. The database contained records belonging to other groups, such as the instructor and testers. When we first retrieved the learner records, they showed that many learners had never used the MOOC forums. To increase the validity of the analyses, we removed irrelevant data by excluding the data of learners who had never participated in the MOOC forum. Results In this study, total views is the total number of times that any learner had viewed threads (topics); total threads is the total number of topics in the forums; total votes is the total number of votes cast in the forums; and total comments is the total number of comments across all posts. The asynchronous online discussion was assessed by the numbers of messages commented on or replied to in a thread, messages viewed (Cheng, Paré, Collimore, & Joordens, 2011; Harasim, 1993) and messages voted on. Peer learning and performance were measured by the number of submitted peer reviews (Cheng et al., 2011) and the total scores in quizzes. We conducted stepwise linear regression analyses on peer reviews and total quiz scores as dependent variables. In the course, the total number of registrations was 37,156. In the forums, total threads were 9202 and total comments were 47,984; and the total views and votes were 510,853 and 7069 respectively. The mean of the total quiz score was 0.73 (SD = 3.89); the mean of submitted peer reviews was 0.10 (SD = 0.59). In this study, we employed two sample sets to conduct the analyses. The first sample set included all participants (n = 1563) who participated in forum activities. The second sample set, a subset of the first, included participants who completed the course (n = 1185).
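The data-cleaning step of excluding learners with no forum activity amounts to a simple filter on per-learner activity counts. The sketch below uses hypothetical column names (the real Coursera export schema differs) and toy data to illustrate the step:

```python
import pandas as pd

# Hypothetical per-event activity records; column names are illustrative only.
activity = pd.DataFrame({
    "user_id":  [1, 1, 2, 3, 4, 4],
    "views":    [12, 3, 0, 0, 5, 1],
    "votes":    [1, 0, 0, 0, 0, 2],
    "comments": [2, 0, 0, 0, 1, 0],
})

# Aggregate activity per learner, then keep only learners with at least
# one forum action (view, vote, or comment), mirroring the exclusion step.
per_learner = activity.groupby("user_id").sum()
participants = per_learner[per_learner[["views", "votes", "comments"]].sum(axis=1) > 0]
```

In this toy data, users 2 and 3 have no forum activity and are dropped, leaving only genuine forum participants for the regression analyses.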
We used these two sample sets to investigate how the three main MOOC forum activities affect peer reviews and quiz scores. Since asynchronous online discussion activities in such an open learning environment are under-studied, stepwise multiple linear regression was used to analyse the data (Menard, 2002). To guard against type I errors, we used the two sample data sets to cross-validate the results (Fox, 1991).

All forum participants

The number of participants was 1563; no learners enrolled in the last week of the course. In the forums, the total comments were 29,946; and the total views and votes were 333,276 and 5,350 respectively. The mean of the total quiz scores was 16.09 (SD = 10.01); the mean of submitted peer reviews was 2.36 (SD = 1.66). Correlations among all the variables were significant, all p < 0.001 (Table 2). Positive correlations were found between each pair of dependent and independent variables – dependent variables: total scores and peer reviews; independent variables: commenting, viewing and voting. Multicollinearity was checked and was well within accepted parameters (tolerance > 0.1 and VIF < 10; see Tables 3 and 5). Stepwise multiple regression analyses were conducted to determine the degree to which the independent variables were able to predict the dependent variables of peer reviews and total quiz scores.

Table 2
Correlation matrix for the first sample

Dependent variable: quiz score
                  2. Viewing   3. Voting   4. Commenting
1. Score          0.101***     0.075***    0.099***
2. Viewing                     0.780***    0.579***
3. Voting                                  0.496***

Dependent variable: peer review
                  2. Viewing   3. Voting   4. Commenting
1. Peer review    0.107***     0.078***    0.102***
2. Viewing                     0.780***    0.579***
3. Voting                                  0.496***

Note. *** p < 0.001

Table 3
Stepwise regression results of the first sample

Dependent variable: quiz score
Model  Predictor    b      Beta   t      p        Tolerance  VIF
1      Viewing      0.002  0.101  3.998  < 0.001  1.00       1.00
2      Viewing      0.002  0.065  2.106  0.04     0.67       1.51
       Commenting   0.010  0.062  2.002  < 0.05   0.67       1.51

Dependent variable: peer review
Model  Predictor    b      Beta   t      p        Tolerance  VIF
1      Viewing      0.000  0.107  4.256  < 0.001  1.00       1.00
2      Viewing      0.000  0.072  2.336  0.02     0.67       1.51
       Commenting   0.002  0.061  1.967  < 0.05   0.67       1.51

Table 3 shows the results for the dependent variable peer reviews. At step 1 of the analysis, the independent variable viewing was entered into the regression model and was determined as significant with F(1,1561) = 18.11, p < 0.001. This model accounted for approximately 10.7% of the variance of peer reviews, R2 = 0.107. At step 2 of the analysis, the independent variable commenting was entered into the model and was significantly related to peer reviews with F(2,1560) = 11.01, p < 0.001. This model accounted for approximately 13.3% of the variance of peer reviews, R2 = 0.133. Hence, by looking at the regression model, the dependent variable peer reviews was primarily predicted by viewing, and to a lesser extent by commenting. Adding voting to the model was not significant. Moreover, Table 3 also presents the stepwise regression results for the dependent variable total quiz score. At step 1 of the analysis, viewing was entered into the regression model and found significant with F(1,1561) = 15.98, p < 0.001. This model accounted for approximately 10.1% of the variance of total score, R2 = 0.101. At step 2 of the analysis, commenting was entered into the model and was significantly related to total score with F(2,1560) = 10.01, p < 0.001. This model accounted for approximately 12.7% of the variance of total score, R2 = 0.127. Hence, total score was primarily predicted by viewing, and to a lesser extent by commenting. Voting was not significant.

Completing participants

Out of the 1563 learners who participated in the previous analysis, 1185 completed the course.
The total comments were 21,493; and the total views and votes were 247,844 and 3,883 respectively. The mean of the total scores in quizzes was 19.41 (SD = 8.27); the mean of submitted peer reviews was 3.07 (SD = 1.20). As with the previous analysis, the correlations among all the variables and the multicollinearity checks supported the analyses (see Tables 4 and 5).

Table 4
Correlation matrix for the second sample

Dependent variable: quiz score
                  2. Viewing   3. Voting   4. Commenting
1. Score          0.150***     0.130***    0.132***
2. Viewing                     0.757***    0.464***
3. Voting                                  0.435***

Dependent variable: peer review
                  2. Viewing   3. Voting   4. Commenting
1. Peer review    0.177***     0.149***    0.141***
2. Viewing                     0.757***    0.464***
3. Voting                                  0.435***

Note. *** p < 0.001

Table 5
Stepwise regression results of the second sample

Dependent variable: quiz score
Model  Predictor    b      Beta   t      p        Tolerance  VIF
1      Viewing      0.003  0.150  5.203  < 0.001  1.00       1.00
2      Viewing      0.002  0.113  3.476  0.001    0.79       1.27
       Commenting   0.017  0.080  2.467  0.014    0.79       1.27

Dependent variable: peer review
Model  Predictor    b      Beta   t      p        Tolerance  VIF
1      Viewing      0.001  0.177  6.182  < 0.001  1.00       1.00
2      Viewing      0.000  0.120  4.412  < 0.001  0.79       1.27
       Commenting   0.002  0.075  2.318  0.021    0.79       1.27

Table 5 presents the stepwise regression results for the dependent variable peer reviews. At step 1 of the analysis, viewing was entered into the regression model and was found significant with F(1,1183) = 38.21, p < 0.001. This model accounted for approximately 17.7% of the variance of peer reviews, R2 = 0.177. At step 2 of the analysis, the independent variable commenting was entered into the model and was significantly related to peer reviews with F(2,1182) = 21.86, p < 0.001. This model accounted for approximately 21.7% of the variance of peer reviews, R2 = 0.217. Voting was not significant.
Hence, by looking at the regression model, the dependent variable peer reviews was primarily predicted by viewing, and to a lesser extent by commenting. Moreover, Table 5 also presents the stepwise regression results for the dependent variable total score. At step 1 of the analysis, viewing was entered into the regression model and found significant with F(1,1183) = 27.08, p < 0.001. This model accounted for approximately 15.0% of the variance of total score, R2 = 0.15. At step 2 of the analysis, commenting was entered into the model and was significantly related to total score with F(2,1182) = 16.64, p < 0.001. This model accounted for approximately 19.3% of the variance of total score, R2 = 0.193. Hence, total score was primarily predicted by viewing, and to a lesser extent by commenting. Voting was not significant. Overall, our results suggest that viewing messages is the main factor affecting total quiz score and the number of peer reviews, followed by commenting on others' messages (supporting H1 and H2); voting was not significant, so H3 and H4 were only partially supported.

Discussion

MOOCs provide massive numbers of learners with an open learning environment. In such a scenario, learners come from diverse backgrounds with different learning goals, and have no obligation to complete the course. The major factors influencing learners' continuance of learning in MOOCs (Alraimi et al., 2015; Chen & Chen, 2015; Imlawi, Gregg, & Karimi, 2015) are different from those in traditional online courses. Hence, factors affecting the participatory level of MOOC forum activities could also be different. As asynchronous online discussion is one of the major interactive learning activities among instructors and learners, this study aimed to investigate the effect of common types of MOOC forum activities – commenting, viewing and voting – on learners' peer reviews and academic performance.
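As a side note on the step comparisons reported above, the significance of an R² increase between nested models can be checked with the standard incremental F test. The sketch below applies that textbook formula to the second-sample quiz-score values (R² = 0.150 at step 1, R² = 0.193 at step 2, n = 1185); note that the F statistics quoted in the results are overall model tests, not this incremental statistic, so the value below is illustrative only:

```python
def f_change(r2_reduced: float, r2_full: float, n: int, k_full: int, k_added: int) -> float:
    """Incremental F statistic for the R^2 increase when k_added predictors
    are added, with n observations and k_full predictors in the larger model."""
    numerator = (r2_full - r2_reduced) / k_added
    denominator = (1.0 - r2_full) / (n - k_full - 1)
    return numerator / denominator

# Second-sample quiz-score models: step 1 (viewing only), step 2 (viewing + commenting)
f_step2 = f_change(r2_reduced=0.150, r2_full=0.193, n=1185, k_full=2, k_added=1)
```

A large incremental F here is consistent with the paper's report that commenting added significant explanatory power over viewing alone.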
In this study, the results afford two implications, three explanations and five practical suggestions. The stepwise regression analysis conducted with all forum participants indicates that viewing threads or comments is the main predictor of peer reviews and quiz scores, although the impact of viewing was somewhat suppressed by commenting when both were introduced into the regression model. These findings suggest that viewing is more influential than commenting or voting for conducting peer reviews and obtaining higher quiz scores. The full model (all factors) also indicates that among the forum activities, viewing and commenting, but not voting, are significantly related to peer reviews and total quiz scores. A second stepwise regression analysis was conducted to investigate whether the pattern of association found in the first regression model with all participants could be replicated with those learners who participated in the forums and completed the course. The second analysis indicates a similar pattern of results, differing only in the strength of the relationships: viewing was still significant in predicting peer reviews and quiz scores, but with a stronger effect. In MOOC forums, commenting involves heavier cognitive processing and more behavioural actions than viewing. Learners who comment in forums are required to read and understand the messages before typing their responses, and they probably revisit their responses for further actions. In contrast, learners who viewed and voted on messages performed fewer actions. The degree of contribution to the forum discussion also differs between these two types of learners: the former is more like a commenter, and the latter a viewer.
Therefore, the results of this study carry two implications: that the cognitive processing demanded by activities affects MOOC learners’ learning, in line with the studies of Chiu and Churchill (2015a, 2015b) and Chiu and Mok (2017); and that learning activities should cater for both viewers and commenters in MOOCs. Three plausible explanations for these findings are the learners’ study purpose, the learners’ weaker obligation, and voluntary forum participation. Behavioural intention to learn with MOOCs is different from that with formal courses (Alraimi et al., 2015). MOOC learners seek opportunities for self-learning and self-improvement (Yousef, Chatti, Wosnitza, & Schroeder, 2015). They continue to learn when they perceive a good reputation of the instructor or institution, or when they find the educational resources useful and readily retrievable (Alraimi et al., 2015). They go online to retrieve the educational resources that they want for their own learning (Alraimi et al., 2015). Therefore, they want to obtain information as quickly and conveniently as possible. In the forums, they would rather read, which demands less cognitive processing, than respond, which takes longer. The second explanation is that MOOC learners are less obligated to complete the course. In a formal course, learners know their classmates or instructor to some degree through different contacts. For example, learners take more than one course with their peers; they work on a project together and they attend the same face-to-face classes. They build up their relationships through learning together in the same course. In contrast, MOOC learners hardly know their peers, and may only know their peers’ usernames, resulting in weaker peer effects (Xie, Ke, & Sharma, 2008). Less timely and appropriate feedback from peers also lowers learners’ intention to submit messages to the forum for discussion, that is, posting and commenting.
Similarly, the relationship between instructor and learners is another factor that keeps learners from participating in asynchronous discussion in MOOCs (Hew et al., 2010; Masters & Oberprieler, 2004; Vonderwell, 2003). Due to the massive number of learners, instructors are less likely to give timely support or feedback to all the learners in a MOOC. This insufficient student support results in distant relationships (Imlawi et al., 2015), which is in line with the studies of Hew et al. (2010), Masters and Oberprieler (2004), and Vonderwell (2003) on closed discussion. With these distant relationships, MOOCs, unlike formal courses, are usually perceived as casual learning. The learners may have different personal reasons for enrolling, and enrolment requires minimal effort: apart from filling in a registration form, there is no tuition fee to pay or enrolment permission to obtain. Reasons for enrolling could include exploring interest in a topic and accessing additional resources for formal studies (Onah et al., 2014; Yang et al., 2015). Therefore, MOOC learners might not fully engage in learning. In traditional courses, learners invest more in learning when they want to get a better grade and/or have their work recognised by peers or instructors. Therefore, in a MOOC learning environment, some learners prefer to participate in the learning as viewers rather than as commenters. Another explanation is that learning with MOOC forums is not perceived as compulsory. The learners are not under any instructor supervision or monitoring during learning, and are not reminded personally to complete the tasks when they fail to complete the forum activities as scheduled. Learners are not strictly required to participate in the discussion topics proposed by instructors or to respond to the messages posted by peers. They participate in the forum based on their own interests rather than the course requirements.
On the other hand, in traditional online courses, forums can be used as part of the assessment, as recommended and encouraged by many studies (Dennen, 2005; Mazzolini & Maddison, 2007). However, it is important to note that these explanations do not diminish the influential role of commenting in predicting learner peer reviews and quiz scores. The findings suggest that commenting is still one of the significant factors influencing participation and performance.

Conclusion and limitations

In most MOOCs, video lectures and discussion forums are common instructional strategies used to engage student learning (Hew & Cheung, 2014). Although the use of forums in closed learning environments is well studied, the use of forums in MOOCs remains under-examined (Hew & Cheung, 2014). The major finding of this study suggests that, besides commenting, viewing is one of the major factors influencing learning in MOOCs; however, only a few studies mention how to encourage viewing. Accordingly, we provide five practical suggestions to facilitate viewing. First, due to the massive number of threads in forums, we should use short, attractive titles for threads or signal keywords within threads (Chiu & Churchill, 2015a, 2015b; signalling principle in Mayer, 2009). Second, as concise messages promote responding and reading, we should encourage short and simple comments from learners by restricting the number of words in a response. Third, we should highlight important messages in the comments or summarise the comments on a daily basis for newsfeeds; this can help busy MOOC learners receive summaries without going through all the messages. Fourth, an image can often convey a message more effectively than many words, and some learners learn better with images, so MOOC instructors should use images to present the main ideas of threads. Fifth, we should direct learners to popular forum messages, that is, the messages learners read or responded to most.
This suggestion encourages peer learning by reading other learners’ messages. There are a number of limitations in this study, three of which are noted here. Firstly, although this study found that participation and performance are primarily predicted by viewing, and to a lesser extent by commenting, more studies are needed to validate the finding. The results of the present study could also be extended by additional studies on other MOOCs in other subject domains (Chiu & Churchill, 2016). Secondly, this study did not consider another major learning activity, watching video lectures, or outcomes measured at different phases. Future research should use a longitudinal design that includes learning time. Thirdly, learners’ backgrounds, such as educational level, age and study purposes, were not considered; these factors could also influence participation in forum discussions. Accordingly, future research on MOOC forum activities should include more factors, such as the cognitive processing of activities and interactions among learners, and should compare humanity and art learners with science and engineering learners.

References

Alraimi, K. M., Zo, H., & Ciganek, A. P. (2015). Understanding the MOOCs continuance: The role of openness and reputation. Computers & Education, 80, 28–38. doi:10.1016/j.compedu.2014.08.006
An, H., Shin, S., & Lim, K. (2009). The effects of different instructor facilitation approaches on students’ interactions during asynchronous online discussions. Computers & Education, 53(3), 749–760. doi:10.1016/j.compedu.2009.04.015
Bárcena, E., Read, T., Martín-Monje, E., & Castrillo, M. D. (2014). Analysing student participation in foreign language MOOCs: A case study. In U. Cress & C. D. Kloos (Eds.), Proceedings of EMOOCs 2014: European MOOCs Stakeholders Summit (pp. 11–17). Lausanne: Open Education Europa.
Billsberry, J. (2013).
MOOCs: Fad or revolution. Journal of Management Education, 37(6), 739–746. doi:10.1177/1052562913509226
Breslow, L., Pritchard, D. E., DeBoer, J., Stump, G. S., Ho, A. D., & Seaton, D. T. (2013). Studying learning in the worldwide classroom: Research into edX’s first MOOC. Research & Practice in Assessment, 8, 13–25. Retrieved from http://www.rpajournal.com/dev/wp-content/uploads/2013/05/SF2.pdf
Brewer, S., & Klein, J. D. (2006). Type of positive interdependence and affiliation motive in an asynchronous, collaborative learning environment. Educational Technology Research and Development, 54(4), 331–354. doi:10.1007/s11423-006-9603-3
Chen, G., & Chiu, M. M. (2008). Online discussion processes: Effects of earlier messages’ evaluations, knowledge content, social cues and personal information on later messages. Computers & Education, 50(3), 678–692. doi:10.1016/j.compedu.2006.07.007
Chen, Y. H., & Chen, P. J. (2015). MOOC study group: Facilitation strategies, influential factors, and student perceived gains. Computers & Education, 86, 55–70. doi:10.1016/j.compedu.2015.03.008
Cheng, C. K., Paré, D. E., Collimore, L. M., & Joordens, S. (2011). Assessing the effectiveness of a voluntary online discussion forum on improving students’ course performance. Computers & Education, 56(1), 253–261. doi:10.1016/j.compedu.2010.07.024
Cheung, W. S., & Hew, K. F. (2004). Evaluating the extent of ill-structured problem solving process among pre-service teachers in an asynchronous online discussion and reflection log learning environment. Journal of Educational Computing Research, 30(3), 197–227. doi:10.2190/9jtn-10t3-wtxh-p6hn
Chiu, T. K. F., & Churchill, D. (2015a). Design of learning objects for concept learning: Effects of multimedia learning principles and an instructional approach. Interactive Learning Environments, 24(6), 1355–1370. doi:10.1080/10494820.2015.1006237
Chiu, T. K. F., & Churchill, D. (2015b).
Exploring the characteristics of an optimal design of digital materials for concept learning in mathematics: Multimedia learning and variation theory. Computers & Education, 82, 280–291. doi:10.1016/j.compedu.2014.12.001
Chiu, T. K. F., & Churchill, D. (2016). Adoption of mobile devices in teaching: Changes in teacher beliefs, attitudes and anxiety. Interactive Learning Environments, 24(2), 317–327. doi:10.1080/10494820.2015.1113709
Chiu, T. K. F., & Mok, I. A. C. (2017). Learner expertise and mathematics different order thinking skills in multimedia learning. Computers & Education, 107, 147–164. doi:10.1016/j.compedu.2017.01.008
Cisel, M. (2014). Analyzing completion rates in the first French xMOOC. In U. Cress & C. Delgado Kloos (Eds.), Proceedings of the European MOOC Stakeholder Summit (pp. 26–32). Retrieved from https://www.emoocs2014.eu/sites/default/files/Proceedings-Moocs-Summit-2014.pdf
Clarke, T. (2013). The advance of the MOOCs (massive open online courses): The impending globalisation of business education? Education + Training, 55(4), 403–413. doi:10.1108/00400911311326036
Coetzee, D., Fox, A., Hearst, M. A., & Hartmann, B. (2014). Should your MOOC forum use a reputation system? In Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing (pp. 1176–1187). New York, NY: ACM. doi:10.1145/2531602.2531657
Coffrin, C., Corrin, L., de Barba, P., & Kennedy, G. (2014). Visualizing patterns of student engagement and performance in MOOCs. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (pp. 83–92). New York, NY: ACM. doi:10.1145/2567574.2567586
DeBoer, J., Stump, G., Seaton, D. T., & Breslow, L. (2013, June). Diversity in MOOC students’ backgrounds and behaviors in relationship to performance in 6.002x. Paper presented at the Sixth International Conference of MIT’s Learning International Networks Consortium, Cambridge, MA.
Retrieved from https://tll.mit.edu/sites/default/files/library/LINC%20%2713.pdf
Dennen, V. P. (2005). From message posting to learning dialogues: Factors affecting learner participation in asynchronous discussion. Distance Education, 26(1), 127–148. doi:10.1080/01587910500081376
Ezen-Can, A., Boyer, K. E., Kellogg, S., & Booth, S. (2015). Unsupervised modeling for understanding MOOC discussion forums: A learning analytics approach. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (pp. 146–150). New York, NY: ACM.
Ferenstein, G. (2014, March 3). Study: Massive online courses enroll an average of 43,000 students, 10% completion. TechCrunch. Retrieved from http://techcrunch.com/2014/03/03/study-massive-online-courses-enroll-an-average-of-43000-students-10-completion/
Fox, J. (1991). Regression diagnostics. Thousand Oaks, CA: Sage. doi:10.4135/9781412985604
Garrison, D. R., Anderson, T., & Archer, W. (2010). The first decade of the community of inquiry framework: A retrospective.
The Internet and Higher Education, 13(1), 5–9. doi:10.1016/j.iheduc.2009.10.003
Gillani, N., & Eynon, R. (2014). Communication patterns in massively open online courses. The Internet and Higher Education, 23, 18–26. doi:10.1016/j.iheduc.2014.05.004
Haggard, S. (2013). The maturing of the MOOC: Literature review of massive open online courses and other forms of online distance learning (A report for the UK Department for Business, Innovation and Skills). London: Department for Business, Innovation and Skills. Retrieved from http://www.obhe.ac.uk/documents/view_details?id=933
Harasim, L. (1993). Collaborating in cyberspace: Using computer conferences as a group learning environment. Interactive Learning Environments, 3(2), 119–130. doi:10.1080/1049482930030202
Hew, K. F., & Cheung, W. S. (2014). Students’ and instructors’ use of massive open online courses (MOOCs): Motivations and challenges. Educational Research Review, 12, 45–58. doi:10.1016/j.edurev.2014.05.001
Hew, K. F., Cheung, W. S., & Ng, C. S. L. (2010). Student contribution in asynchronous online discussion: A review of the research and empirical exploration. Instructional Science, 38(6), 571–606. doi:10.1007/s11251-008-9087-0
Iiyoshi, T., & Kumar, M. (2008). Opening up education: The collective advancement of education through open technology, open content, and open knowledge. Cambridge, MA: MIT Press.
Imlawi, J., Gregg, D., & Karimi, J. (2015). Student engagement in course-based social networks: The impact of instructor credibility and use of communication. Computers & Education, 88, 84–96. doi:10.1016/j.compedu.2015.04.015
Khine, M. S., Yeap, L. L., & Chin Lok, A. T. (2003). The quality of message ideas, thinking and interaction in an asynchronous CMC environment. Educational Media International, 40(1–2), 115–126. doi:10.1080/0952398032000092161
Kizilcec, R. F., Papadopoulos, K., & Sritanyaratana, L. (2014).
Showing face in video instruction: Effects on information retention, visual attention, and affect. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 2095–2102). New York, NY: ACM. doi:10.1145/2556288.2557207
Kop, R., & Fournier, H. (2011). New dimensions to self-directed learning in an open networked learning environment. International Journal of Self-Directed Learning, 7(2), 2–20.
Lucas, H. C. (2013, October 7). Can the current model of higher education survive MOOCs and online learning? EDUCAUSE Review, 48(5), 54–56.
Masters, K., & Oberprieler, G. (2004). Encouraging equitable online participation through curriculum articulation. Computers & Education, 42(4), 319–332. doi:10.1016/j.compedu.2003.09.001
Mayer, R. E. (2009). Multimedia learning (2nd ed.). New York, NY: Cambridge University Press.
Mazzolini, M., & Maddison, S. (2007). When to jump in: The role of the instructor in online discussion forums. Computers & Education, 49(2), 193–213. doi:10.1016/j.compedu.2005.06.011
McAuley, A., Stewart, B., Siemens, G., & Cormier, D. (2010). The MOOC model for digital practice. Retrieved from http://www.elearnspace.org/Articles/MOOC_Final.pdf
Menard, S. (2002). Applied logistic regression analysis (Vol. 106). Thousand Oaks, CA: Sage. doi:10.4135/9781412983433
Moon, S., Potdar, S., & Martin, L. (2014). Identifying student leaders from MOOC discussion forums through language influence. In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (pp. 15–20). Stroudsburg, PA: Association for Computational Linguistics.
doi:10.3115/v1/w14-4103
Nawrot, I., & Doucet, A. (2014). Building engagement for MOOC students: Introducing support for time management on online learning platforms. In A. Broder, K. Shim, & T. Suel (Eds.), Proceedings of the Companion Publication of the 23rd International Conference on World Wide Web Companion (pp. 1077–1082). New York, NY: ACM. doi:10.1145/2567948.2580054
Oliver, M., & Shaw, G. P. (2003). Asynchronous discussion in support of medical education. Journal of Asynchronous Learning Networks, 7(1), 56–67. Retrieved from http://sloanconsortium.org/sites/default/files/v7n1_oliver_1.pdf
Onah, D. F., Sinclair, J., & Boyatt, R. (2014). Dropout rates of massive open online courses: Behavioural patterns. In L. Gómez Chova, A. López Martínez, & I. Candel Torres (Eds.), Proceedings of the 6th International Conference on Education and New Learning Technologies (pp. 5825–5834). Valencia: IATED Academy.
Paré, D. E., & Joordens, S. (2008). Peering into large lectures: Examining peer and expert mark agreement using peerScholar, an online peer assessment tool. Journal of Computer Assisted Learning, 24(6), 526–540. doi:10.1111/j.1365-2729.2008.00290.x
Rai, L., & Chunrao, D. (2016). Influencing factors of success and failure in MOOC and general analysis of learner behavior. International Journal of Information and Education Technology, 6(4), 262–268. doi:10.7763/IJIET.2016.V6.697
Swan, K., Garrison, D. R., & Richardson, J. (2009). A constructivist approach to online learning: The Community of Inquiry framework. In Information technology and constructivism in higher education: Progressive learning frameworks (pp. 43–57). Hershey, PA: IGI Global. doi:10.4018/978-1-60566-654-9.ch004
Van den Berg, I., Admiraal, W., & Pilot, A. (2006). Design principles and outcomes of peer assessment in higher education. Studies in Higher Education, 31(3), 341–356. doi:10.1080/03075070600680836
Vonderwell, S. (2003). An examination of asynchronous communication experiences and perspectives of students in an online course: A case study. The Internet and Higher Education, 6(1), 77–90. doi:10.1016/S1096-7516(02)00164-1
Wang, Q. (2008). Student-facilitators’ roles in moderating online discussions. British Journal of Educational Technology, 39(5), 859–874. doi:10.1111/j.1467-8535.2007.00781.x
Wong, J. S., Pursel, B., Divinsky, A., & Jansen, B. J. (2015). An analysis of MOOC discussion forum interactions from the most active users. In N. Agarwal, K. Xu, & N. Osgood (Eds.), Proceedings of the Social Computing, Behavioral-Cultural Modeling, and Prediction Conference (pp. 452–457). Washington, DC: Springer. doi:10.1007/978-3-319-16268-3_58
Xie, Y., Ke, F., & Sharma, P. (2008). The effect of peer feedback for blogging on college students’ reflective learning processes. The Internet and Higher Education, 11(1), 18–25.
doi:10.1016/j.iheduc.2007.11.001
Yang, D., Wen, M., Howley, I., Kraut, R., & Rose, C. (2015). Exploring the effect of confusion in discussion forums of massive open online courses. In Proceedings of the Second (2015) ACM Conference on Learning @ Scale (pp. 121–130). New York, NY: ACM. doi:10.1145/2724660.2724677
Yousef, A. M. F., Chatti, M. A., Wosnitza, M., & Schroeder, U. (2015). A cluster analysis of MOOC stakeholder perspectives. RUSC. Universities and Knowledge Society Journal, 12(1), 74–90. doi:10.7238/rusc.v12i1.2253

Corresponding author: Thomas K. F. Chiu, tchiu@hku.hk or thomas.kf.chiu@gmail.com

Australasian Journal of Educational Technology © 2018.

Please cite as: Chiu, T. K. F., & Hew, T. K. F. (2018). Factors influencing peer learning and performance in MOOC asynchronous online discussion forums. Australasian Journal of Educational Technology, 34(4), 16–28. https://doi.org/10.14742/ajet.3240