Australasian Journal of Educational Technology, 2017, 33(5).   

 


Use of a post-asynchronous online discussion assessment to 
enhance student critical thinking 

 
Chris Klisc, Tanya McGill, Valerie Hobbs 
Murdoch University, Australia 
 

Asynchronous online discussion (AOD) is used in many tertiary education courses, and 
assessing it has been shown to enhance critical thinking outcomes. There has, however, 
been debate on what should be assessed and how the assessment should be implemented. 
The most common form of assessment involves grading the individual discussion 
contributions, but it has been suggested that employing a culminating task based on the 
AOD may be effective. This preliminary study compared the effect on student critical 
thinking of two approaches to AOD assessment: using a post-AOD assessment, and 
assessing the discussion contributions themselves. The results, though tentative, showed 
that while both assessment approaches resulted in significant improvements in student 
critical thinking, there was no difference in the impact on critical thinking skills between 
using the post-AOD assessment and assessing the discussion contributions. This result 
suggests that the form of assessment used in an AOD may be less important than the fact 
that assessment is included. Interviews with students also provided some insight into ways 
in which they perceived the discussion environment had contributed to their critical 
thinking skills. The findings of this study pave the way for further research in this important 
area. 

 
Introduction 
 
Asynchronous online discussion (AOD) continues to be an integral component of many tertiary education 
courses, due in no small part to the dramatic growth in distance education and blended courses in recent 
years (Beebe, Vonderwell, & Boboc, 2010; Zydney, deNoyelles, & Kyeong-Ju Seo, 2012). Within both 
these learning environments, AOD has become an important pedagogical communication component, particularly in distance education courses, where it provides an online learning community for students and instructors who are unable to meet face-to-face (Ertmer, Sadaf, & Ertmer, 2011; Gao, Zhang, & Franklin, 2013). Additionally, because AOD is asynchronous and text-based, the dialogue created in this
environment has the potential to help students develop complex cognitive skills, especially those 
associated with critical thinking (Hara, Bonk, & Angeli, 2000; Wu & Hiltz, 2004). 
 
Using assessment with AOD has been shown to enhance critical thinking outcomes (Klisc, McGill, & 
Hobbs, 2009; Richardson & Ice, 2010), but there is debate on what should be assessed and how 
assessment should be implemented (Arend, 2009). The most common form of assessment has been that of 
grading the discussion contributions (Dennen, 2008), but it is unclear whether this form of assessment is 
essential for successful learning outcomes (Richardson & Ice, 2010; Schindler & Burkholder, 2014; Wee 
& Abrizah, 2011). This paper describes a study comparing the effect on student critical thinking of two 
approaches to AOD assessment: using a post-AOD assessment (a piece of work submitted after the 
completion of the AOD that builds on the AOD contributions), and assessing the discussion contributions 
themselves. 
 
Literature review 
 
The development of student critical thinking skills is an important aim of many tertiary institutions, with the ability to reason, think analytically, and justify conclusions regarded as essential skills for graduates
(Carrington, Chen, Davies, Kaur, & Neville, 2011; Davies, 2011). Many have acknowledged the 
importance of these skills by incorporating critical thinking skills into desired outcomes for their 
graduates (Mummery & Morton-Allen, 2009; Prasad, 2009). However, concern has been expressed about 
the inadequate levels of critical thinking skills possessed by graduates upon leaving university, and the 
fact that only minimal improvement in these thinking skills is achieved by students while undertaking 
their undergraduate degrees (Carrington et al., 2011). 
 
  




Discussion as a pedagogical strategy for developing critical thinking is best understood from a 
constructivist perspective, where it is argued that knowledge is created by individuals interacting and 
exchanging information (Rourke & Anderson, 2002). In the act of discussion, articulation and reflection 
exercise the critical thinking skills of interpretation, analysis, synthesis, and evaluation, supporting 
knowledge construction (Gilbert & Dabbagh, 2005). At the same time, research shows there is a strong 
relationship between the acts of critical thinking and writing (Applebee, 1984; Cohen & Spencer, 1993). 
Applebee (1984) states that “it is widely accepted that good writing and careful thinking go hand in hand” 
(p. 577) and that one is not possible without the other. 
 
The environment of AOD provides students with opportunities to engage with one another in ways that 
can potentially promote critical thinking by combining the act of thinking with discussion via writing 
(MacKnight, 2000; Prasad, 2009). The delayed nature of communication in an AOD allows time for 
reflection, while its text-based nature enables the combination of thinking and writing necessary for the 
facilitation of critical thinking (Greenlaw & DeLoach, 2003; Hara et al., 2000). In an AOD students need 
to create their discussion in a written form and read their peers’ postings, exposing them to diverse 
viewpoints and requiring them to evaluate material from multiple perspectives. They also need to make 
judgements about the material presented and, finally, synthesise and draw inferences before coming to 
their own conclusions (Birch & Volkov, 2007; Schellens & Valcke, 2006; Wu & Hiltz, 2004). 
 
Much research has been done on critical thinking in AOD, as highlighted in recent literature reviews (Gao et al., 2013; Loncar, Barrett, & Liu, 2014). Research has investigated ways to enhance
critical thinking outcomes by focusing on what instructors can implement, including the introduction of 
protocols in AOD (Hew & Cheung, 2011; Zydney et al., 2012), use of supporting materials (Alexander, 
Commander, Greenberg, & Ward, 2010; Bai, 2009), AOD design (Darabi, Arrastia, Nelson, Cornille, & 
Liang, 2011; Kalelioglu & Gulbahar, 2014; Richardson & Ice, 2010), message labelling (Schellens, Van Keer, De Wever, & Valcke, 2009; Topcu, 2010), and the use of student facilitation (Hew & Cheung,
2008; Xie, Yu, & Bradshaw, 2014). Other studies have focused on how critical thinking can be evaluated 
within the AOD using some form of content analysis of discussion transcripts (Beckmann & Weber, 
2016; Darabi et al., 2011). 
 
Critical thinking outcomes in AOD have been shown to be significantly enhanced by the use of 
assessment (Klisc et al., 2009). Summative assessment has been claimed both to motivate students to 
participate in the online discussion and to take extra care in creating their discussion contributions (Hara 
et al., 2000; Palmer & Holt, 2009). Using assessment in an AOD encourages more involvement in the 
discussion, which in turn fosters critical thinking in students. 
 
However, despite the recognised importance of assessment in an AOD, it is unclear what form the 
assessment should take. More research is required to investigate optimum ways to assess critical thinking 
for grading purposes in the AOD environment (Wee & Abrizah, 2011). Grading discussion contributions 
is the most common form of assessment (Dennen, 2008) but it is debatable whether assessment of the 
postings themselves is essential for successful learning outcomes (Vonderwell, Liang, & Alderman, 
2007). While there is strong support for the idea that the act of participating in an AOD aids in the 
development of critical thinking, there is debate about where evidence of such critical thinking may be 
found (Arend, 2009; Dennen, 2008; Richardson & Ice, 2010). For example, Garrison, Anderson, and 
Archer (2001) identified the activities happening in an online discussion as representing the process of 
learning rather than any outcome of critical thinking. 
 
Research question and hypotheses 
 
It has been suggested that rather than directly assess individual AOD contributions, a more effective 
assessment strategy may be to employ a culminating task based on the AOD (Dennen, 2008; Greenlaw & 
DeLoach, 2003). Dennen (2008) claims that getting students to produce a reflection paper about their 
AOD experience “serves as a product documenting what the learner has perceived as his or her own 
process of learning through the act of discussion” (p. 212). Similarly, Arend (2009) and Richardson and 
Ice (2010) assert that students need time to absorb, reflect, and synthesise the material before evidence of 
critical thinking can occur. This extra step calling for evaluation and judgement skills may potentially 
stimulate student thinking to a further degree than does the act of participating in a discussion alone 
(Hazari, 2004; Vonderwell et al., 2007). 




Akyol and Garrison (2011) explored how the contributions were used by graduate education students to 
complete a post-AOD course redesign project assessment. They reported that students believed the final 
stage of critical thinking, represented by creating and presenting solutions, could not be attained in the 
AOD, but was instead possible in the course redesign project. This belief was supported by transcript
analysis of the AOD postings which indeed showed little evidence of the final stage of critical thinking 
having occurred in the AOD. This seems to suggest that the extra task of a post-AOD activity helped to 
fully engage their critical thinking. 
 
The study described in this paper explores the effect of two different forms of assessment in an AOD and 
seeks to answer the following research question: 
 

• How do different forms of assessment used in an AOD affect student critical thinking skills? 
 
The first assessment approach considered was an AOD contribution assessment, reported to be the most commonly used form of AOD assessment (Dennen, 2008). Given that the assessment of AOD contributions has been shown to improve students' critical analysis and reflection (Klisc et al., 2009), it was hypothesised that:
 

H1: Students will perceive an improvement in their critical thinking skills after participating in an 
online discussion with an AOD contribution assessment. 

 
The second approach considered was that of a post-AOD assessment, which in this study was defined as a graded learning activity in essay format that incorporates both material researched by the student and material discussed online to address a topic. While there has been little empirical evaluation of the
usefulness of this approach, it has been suggested that the opportunity for student reflection and analysis 
of the AOD contributions provided by a post-AOD assessment should stimulate students’ synthesis and 
evaluation skills (Clark, 2001; Greenlaw & DeLoach, 2003; Richardson & Ice, 2010). It was therefore 
hypothesised that: 
 

H2: Students will perceive an improvement in their critical thinking skills after participating in an 
online discussion with a post-AOD assessment. 

 
Based on the arguments in the literature for the benefits of a post-AOD assessment compared with an 
AOD contribution assessment (Akyol & Garrison, 2011; Clark, 2001; Greenlaw & DeLoach, 2003; 
Richardson & Ice, 2010), it was further hypothesised that: 
 

H3: Critical thinking skills will be more evident in students who complete a post-AOD assessment 
than in students who complete an AOD contribution assessment. 

 
Method 
 
The study was conducted in an introductory information technology course at an Australian university, 
over one teaching semester. In order to answer the research question, a mixed method research approach 
was employed, using an explanatory 2-phase design (Creswell & Plano Clark, 2007). The first phase used 
a quasi-experiment to evaluate the effect of the different forms of assessment. The second phase used 
interviews to collect qualitative data to substantiate and elaborate on the quantitative data. 
 
Student critical thinking was measured in two ways. Perceptions of critical thinking were collected both 
prior to the AOD, and after the AOD and associated assessments were completed, and an objective 
measurement of critical thinking was obtained after the AOD and the associated assessments were 
completed. While it would have been preferable to use an objective test to measure student critical thinking skills both before and after the treatments, this was not possible because the study was conducted as part of a real educational offering and the necessary interval between administrations could not be obtained.
 
Instruments 
 
Table 1 shows the constructs used in addressing the research question and associated hypotheses, together 
with the instruments used to collect the information needed to measure these constructs. 




Table 1 
Hypotheses with constructs and associated data sources 
H1: Students will perceive an improvement in their critical thinking skills after participating in an online discussion with an AOD contribution assessment.
    Construct: perceived critical thinking skills
    Data source: pre-AOD and post-AOD questionnaires

H2: Students will perceive an improvement in their critical thinking skills after participating in an online discussion with a post-AOD assessment.
    Construct: perceived critical thinking skills
    Data source: pre-AOD and post-AOD questionnaires

H3: Critical thinking skills will be more evident in students who complete a post-AOD assessment than in students who complete an AOD contribution assessment.
    Constructs: perceived critical thinking skills and objective critical thinking skills
    Data sources: pre-AOD and post-AOD questionnaires and critical thinking skills test
 
Pre-AOD and post-AOD questionnaires 
The construct perceived critical thinking skills is defined in this study as self-reported level of critical 
thinking, and includes skills used for analysis, interpretation, evaluation, synthesis and inductive and 
deductive reasoning. Perceived critical thinking skills was measured in both the pre-AOD and post-AOD 
questionnaires.
 
The 17 items used to measure perceived critical thinking skills (see Table 2) were developed based on Mummery and Morton-Allen's (2009) instrument, which they based on the California Critical Thinking Disposition Inventory (Facione & Facione, 1992). The items were measured on a 5-point Likert scale from 1 (strongly disagree) to 5 (strongly agree), and the responses were summed to obtain a score for perceived critical thinking skills, with a possible maximum score of 85; a high score indicates a high perceived level of critical thinking.
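As a simple illustration of this scoring scheme (the individual ratings below are hypothetical; only the 17-item count, the 1-5 response range, and the summed total come from the description above):

```python
# One hypothetical respondent's ratings for the 17 items, each on a
# 5-point Likert scale from 1 (strongly disagree) to 5 (strongly agree)
responses = [4, 5, 3, 4, 4, 5, 3, 4, 4, 5, 3, 4, 4, 3, 4, 5, 4]
assert len(responses) == 17 and all(1 <= r <= 5 for r in responses)

# The perceived critical thinking score is the sum of the item ratings,
# giving a possible range of 17 (all 1s) to 85 (all 5s)
score = sum(responses)
```

A respondent answering "strongly agree" to every item would score the maximum of 85.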
 
Table 2 
Items used to measure perceived critical thinking skills

Items 
I use reasons and evidence to try and gain the best possible understanding of a given situation. 
I am tolerant of the opinions and ideas of others, especially when they are different from my own 
opinions and ideas. 
I carefully consider the possible outcomes or consequences of situations, choices, proposals or plans and take this into account when making decisions.
I solve problems in an orderly, organised way. 
I am confident in my reasoning and judgment to solve problems and reach my goals. 
I am curious and eager to learn/understand new things, even when I’m not sure how or why this 
learning might be useful. 
I do not see problems and situations as black or white, right or wrong. 
I recognise that there are often a number of ways to solve a problem or reach a goal.
I understand the need to stand firm in my judgment when there is reason to do so, and to change my 
mind when reasons and evidence indicate that I am mistaken. 
I understand the idea that we sometimes need to make a decision or judgment even in the absence of 
complete knowledge or when there is no clear right or wrong answer. 
I am able to work out how true or false the inferences or conclusions are that someone draws from a 
particular set of information or data. 
I am able to work out what hidden assumptions have been made in a given statement. 
I am able to weigh evidence and decide whether generalisations or conclusions based on given data 
are warranted / justifiable.
I am able to distinguish between strong, relevant arguments and arguments that are weak or irrelevant
to a particular question or issue. 
I am able to critically evaluate academic writing (e.g., journal articles, books). 
I am aware of what is needed to construct good arguments. 
I am aware of the need to monitor, evaluate and adjust my own thinking processes.

 
The post-AOD questionnaire also included three additional items. All participants were asked to indicate 
their level of agreement with the statement “I feel that the online discussion contributed towards 
developing my critical thinking skills” on a 5-point Likert scale, with an option to describe how it had 
contributed. All participants were also invited to comment on anything else that they felt was relevant to 
the study. Those participants who had completed the post-AOD assessment were also asked if they felt 
the assessment had contributed towards developing their critical thinking skills, with the item “I feel that 
the process of researching and writing the essay contributed towards developing my critical thinking 
skills”. 
 
Critical thinking skills test 
The second measure of critical thinking skills was an objective one: objective critical thinking skills. The test used was the California Critical Thinking Skills Test (CCTST) (Facione, Facione, & Winterhalter, 2010). This test consists of 34 multiple-choice items assessing analysis, evaluation, inference, and deductive and inductive reasoning. The overall critical thinking skills totals were used to
compare the differences between the groups completing the different types of assessment. 
 
Interviews 
Interview questions were designed to elaborate on results obtained from the questionnaire used in the 
intervention phase. They focused on participant awareness of critical thinking and AOD-related issues, including online discussions, essay writing, and critical thinking.
 
Procedure 
 
This study involved first-year undergraduate students enrolled in an introductory information technology 
course on multimedia and the Internet. 
 
Intervention phase 
In the intervention phase, participants were randomly allocated to two groups, completing either an AOD 
contribution assessment or a post-AOD assessment (an essay based on the discussion topic). Table 3 
shows how the study activities and data collection were arranged within the semester. 
 
Table 3 
Semester week timeline for the intervention phase 
            Weeks 1-2       Weeks 4-5    Week 7             Weeks 8-9       Week 9
Group 1     Pre-AOD         Discussion   AOD contribution   Post-AOD        Critical thinking
            questionnaire                assessment         questionnaire   skills test
Group 2     Pre-AOD         Discussion   Post-AOD           Post-AOD        Critical thinking
            questionnaire                assessment         questionnaire   skills test
 
Participants completed the pre-AOD questionnaire either in class or online. Subsequently, all students, 
including the participants of the study, were introduced to the concepts of critical thinking, the discussion 
topic, and associated assessment information. In preparation for the online discussion, the participants 
were first randomly allocated to one of six discussion forums, followed by the random allocation of the
remaining students, who were not part of the study. This resulted in each forum having seven members. 
The discussion was held over 14 days in weeks 4 and 5, a duration consistent with that used in other 
studies (Klisc, 2015; Richardson & Ice, 2010). 
 
Those students participating in the post-AOD essay assessment were strongly encouraged to use 
discussed points in their essays, and were required to cite at least three postings from their forum in their 
essays. The post-AOD assessment was submitted at the end of week 6, and the marking of all assessments was done in week 7. After the online discussion assessments were completed, the post-AOD questionnaire
was made available to participants to complete in week 8. This was followed in week 9 by the CCTST. 
 
Interview phase 
In the second phase, semi-structured individual interviews were conducted with a subset of the 
participants after the teaching period was completed. The interviews employed a flexible format, using a 
standard set of questions but allowing participants to volunteer information and pursue spontaneous 
tangents during the conversation. 
 
Data analysis 
 
The data was analysed using both quantitative and qualitative methods. Cronbach’s alpha was used as an 
indicator of reliability for perceived critical thinking skills, and was calculated for both administrations of the questionnaire, producing coefficients of .747 and .854. Due to the small size of the samples, testing for
normal distribution was conducted both visually, using histograms and stem-and-leaf plots, and 
objectively, using the Shapiro-Wilk test. The samples were found to be suitable for parametric testing, with paired-samples t tests used to address hypotheses 1 and 2, and a one-tailed independent-samples t test used to address hypothesis 3. For all of these tests, the significance level was set at .05.
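To make these analysis steps concrete, the reliability and t-test calculations described above can be sketched in pure Python. All of the scores below are invented for illustration; only the formulas (Cronbach's alpha, the paired-samples t statistic, and the pooled-variance independent-samples t statistic) correspond to the techniques named in the text.

```python
import math
from statistics import mean, stdev, pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of items, each a list of one response per
    respondent: alpha = k/(k-1) * (1 - sum(item variances)/var(totals))."""
    k = len(items)
    totals = [sum(vals) for vals in zip(*items)]  # per-respondent scale totals
    return k / (k - 1) * (1 - sum(pvariance(i) for i in items) / pvariance(totals))

def paired_t(pre, post):
    """Paired-samples t statistic (hypotheses 1 and 2); df = n - 1."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n)), n - 1

def independent_t(g1, g2):
    """Independent-samples t statistic with pooled variance (hypothesis 3)."""
    n1, n2 = len(g1), len(g2)
    # Pooled variance combines the two group variances weighted by their df
    sp2 = ((n1 - 1) * stdev(g1) ** 2 + (n2 - 1) * stdev(g2) ** 2) / (n1 + n2 - 2)
    return (mean(g1) - mean(g2)) / math.sqrt(sp2 * (1 / n1 + 1 / n2)), n1 + n2 - 2

# Hypothetical pre/post perceived critical thinking totals for one group
pre = [64, 66, 70, 61, 68, 65]
post = [67, 68, 71, 63, 70, 66]
t_paired, df_paired = paired_t(pre, post)

# Hypothetical totals for two independently assessed groups
t_ind, df_ind = independent_t([68, 70, 66, 69], [65, 66, 64, 67])
```

The resulting t statistics would then be compared against the critical value for the relevant degrees of freedom at the .05 level (one-tailed), as was done in this study.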
 
As the sample size for the intervention phase (21 participants) was at the lower end of the requirements of the analysis techniques used (Hair, Black, Babin, Anderson, & Tatham, 2006), the qualitative data collection and analysis were particularly important for understanding and further exploring the issues
suggested by the quantitative analysis. Qualitative data, collected from the interviews and the post-AOD 
questionnaire, were analysed using a categorising coding strategy (Maxwell, 2005). The interview 
questions were used as organisational categories, and themes were identified within the comments. Tables 
5, 6, and 8 present the results of the thematic analysis. 
 
Results 
 
Participants 
 
Twenty-one students participated in the study. The 11 participants who completed the AOD contribution assessment were 2 (18.2%) females and 9 (81.8%) males, with ages ranging from 17 to 41 years and an average of 21.5 years. Of the 10 participants who completed the post-AOD assessment, 4 (40.0%)
were female and 6 (60.0%) were male. Their ages ranged from 17 to 32 years, with an average of 20.0 
years. There was no significant difference in initial levels of perceived critical thinking between the two 
groups (t(19) = 1.216, p = .239). Seven of the 21 participants were subsequently interviewed. Three of 
them (1 female and 2 males) had completed the AOD contribution assessment, while the other 4 (2 females and 2 males) had completed the post-AOD essay assessment.
 
Improvements in critical thinking skills with an AOD contribution assessment 
 
Higher levels of perceived critical thinking skills after the AOD may suggest an improvement in critical 
thinking skills for those participants who completed an AOD contribution assessment. The results of the 
paired-samples t test, shown in Table 4, indicate that there was a small but significant increase in perceived critical thinking skills, from prior to the AOD (M = 66.55, SD = 5.39) to after the AOD (M = 68.82, SD = 6.35), t(10) = -2.12, p = .030 (one-tailed), for the participants who had their AOD contributions assessed. Hypothesis 1 was therefore supported.
 
Table 4 
Comparison of perceived critical thinking skills before and after an AOD contribution assessment 

                                              N     M      SD    p
Pre-AOD perceived critical thinking skills    11    66.55  5.39   .030*
Post-AOD perceived critical thinking skills   11    68.82  6.35
Note. * p < .05
 
This perception, that an AOD contribution assessment facilitated critical thinking, was reflected in the participants' responses in the post-AOD questionnaire. When asked whether the AOD contributed
towards developing their critical thinking skills, nine of the participants who had completed an AOD contribution assessment agreed or strongly agreed that it had. The remaining two participants in this
group stated that they were unsure whether the AOD contributed to their critical thinking. Unfortunately, 
neither supplied any elaborating information, nor did they volunteer to be interviewed. 
 
In the analysis of the qualitative data about how participants believed the AOD had developed their critical thinking, two themes emerged most consistently: exposure to different perspectives (Table 5) and argument development (Table 6).
 
Table 5 
Examples of comments associated with the theme of exposure to different perspectives in AOD 
contribution assessment 

Comments 
I have learned I should look at the problems from a different angle. 
I detected others’ different standpoints and that not everyone thinks the same depending on their 
personal and cultural background. 
Gave me a wider view of what more people thought. 
Gave insight on what others thought and allowed me to discuss different viewpoints.  

 
Table 6 
Examples of comments associated with the theme of argument development in AOD contribution 
assessment 

Comments 
I am applying my argument skills and putting them into practice in the discussion. 
I think that it was important for me in developing my own arguments. 
Give people more time to rethink, reorganise his/her own thinking. 
It provoked me to respond to and analyse a problem and then repeat that process against other 
answers. 
By analysing someone’s opinion or a source they have quoted we inadvertently make judgements 
about the argument and start to question the meaning. 
Look at sources and evaluate. Discuss and debate with others. 

 
Of the 9 participants who acknowledged the contribution of the discussion towards their critical thinking, 
three were interviewed. The two themes of exposure to different perspectives and argument development 
were reinforced by these participants, as shown in the following comments: 
 

Gave me a wider view of what more people thought and reinforced my original view. 
 
The discussion forced me to look at it [the topic] again… makes you go through it again… 
see what you did right and what you did wrong, re-evaluate it and see where you slipped 
up. 

 
The responses given in the post-AOD questionnaire and the interviews showed that participants felt that 
the AOD did help to develop their critical thinking skills by allowing access to different perspectives 
which helped in developing their arguments. 
 
Improvements in critical thinking skills with a post-AOD assessment 
 
Higher levels of perceived critical thinking skills after the AOD may suggest an improvement in critical 
thinking skills for those participants who had a post-AOD assessment. The results of the paired-sample t 
test shown in Table 7 indicate that there was a significant increase in perceived critical thinking skills, 
from prior to the AOD (M = 63.70, SD = 5.31) to after the AOD (M = 66.10, SD = 5.71), t(9) = -2.68, p = 
.013 (one-tailed), for the participants who completed a post-AOD essay assessment. Hypothesis 2 was therefore supported.
 
Table 7 
Comparison of perceived critical thinking skills before and after a post-AOD assessment 
                                              N     M      SD    p
Pre-AOD perceived critical thinking skills    10    63.70  5.31   .013*
Post-AOD perceived critical thinking skills   10    66.10  5.71
Note. * p < .05

 
Again, the belief that the post-AOD assessment facilitated critical thinking was reflected in responses in 
the post-AOD questionnaire. Nine of the 10 participants agreed or strongly agreed that the AOD had 
helped to develop their critical thinking, and one disagreed but unfortunately did not give a reason. When asked
about the contribution of the post-AOD essay in developing their critical thinking skills, 7 of the 10 
participants agreed or strongly agreed that it had. Two of the remaining 3 participants felt that the post-
AOD essay assessment did not help. 
 
In the analysis of the qualitative data from the post-AOD questionnaire about how participants believed 
the AOD and the post-AOD assessment had developed their critical thinking, the theme that emerged 
most consistently was that of exposure to different perspectives. Table 8 displays a sample of the 
participants’ responses. The emergence of this theme was similar to the findings from participants who 
only had their AOD contributions assessed; however, the theme of argument development was evident in 
one participant’s comment only: 

 
[I]t [the AOD] encourages critical thinking by forcing us to think in a way which helps us 
build our argument or attempt to disapprove an argument that we deem to be wrong. 

 
Table 8
Examples of comments associated with the theme of exposure to different perspectives in post-AOD assessment

Comments
Ideas and points of view which I didn’t get myself. 
Showed that other colleagues have different views of points and showed me a new way to view things. 
Allowed me to consider others opinions and compare them with my own. 

The online discussion assisted in developing my critical thinking skills as it allowed me to see points of 
views from different sources and opinions based not completely on my own. 
Made me interact with others and take into account the way they think. 

 
When asked to elaborate on how the post-AOD essay assessment helped in developing their critical 
thinking skills, the most frequently mentioned theme related to the skills involved in completing an essay. 
As the following comments indicate, participants believed the processes of researching, structuring and 
writing an essay stimulated their thought and helped to clarify thinking: 
 

[P]urely because the process of reading and writing essays as well as researching for them 
develops your thoughts on your own work and others. 
 
Researching and writing the essay allowed me to question my own opinions based on the 
topic and provide a much more critical analysis. 

 
Four of the 7 participants who acknowledged both the contribution of the AOD and the post-AOD essay assessment towards their critical thinking were interviewed. One commented that she favoured the
AOD over the essay for its impact on her critical thinking: 
 

[A] different way of thinking [be]cause people have different opinions and views on a 
certain thing and by having these discussions you think “oh, I didn’t think of it in that kind 
of way” and I found that the discussion really helped in that way. 




 
However, the 2 male participants believed that the post-AOD essay assessment contributed more to their 
critical thinking than an AOD: 
 

You are going out and finding info and different ways of putting the info in your essay, and 
in the process developing skills of investigation and essay writing. 
 
You start researching and finding out … then you start thinking about it [the topic] and you 
start developing ideas to write in an essay. You spend a good few hours writing an essay 
and you constantly think about it and you start to develop ideas to write in that essay. 
 

The qualitative data from those participants who had completed the post-AOD assessment, together with 
their elaborating comments in the interviews, showed that they believed an AOD and a post-AOD essay 
assessment both contributed to developing their critical thinking skills. The AOD helped primarily by
providing access to different perspectives, while the essay contributed through the skills involved in
researching and writing it. At the same time, however, participants acknowledged that the addition of an
AOD helped to produce a better quality essay than one produced solely through their own efforts. These
comments support the finding that participants perceived an improvement in their critical thinking skills
after completing the post-AOD essay assessment.
 
Comparing levels of critical thinking skills between students with an AOD
contribution assessment and those with a post-AOD assessment
 
To compare the impact of the two approaches on critical thinking, two measures were used: objective
critical thinking skills and post-AOD perceived critical thinking skills. Both measures were compared 
between those who had the AOD contribution assessment and those who had the post-AOD essay 
assessment. Higher levels of both objective critical thinking skills and post-AOD perceived critical 
thinking skills, for those participants who completed the post-AOD assessment, would suggest that this 
form of assessment contributes to greater levels of improvement in critical thinking than having an AOD 
contribution assessment. The results of the independent samples t tests used to test hypothesis 3 are 
shown in Table 9. 
 
Table 9 
Comparison of objective critical thinking skills and post-AOD perceived critical thinking skills for the 
two forms of assessment 

Measure                               Assessment                    N    M      SD    p (one-tailed)
Objective critical thinking skills    AOD contribution assessment   11   16.36  1.47  .320
                                      Post-AOD assessment           10   15.50  1.01
Post-AOD perceived critical           AOD contribution assessment   11   68.82  6.35  .160
thinking skills                       Post-AOD assessment           10   66.10  5.71
 
There was no significant difference in objective critical thinking skills between those having an AOD 
contribution assessment (M = 16.36, SD = 1.47) and those completing a post-AOD essay assessment (M = 
15.50, SD = 1.01; t(19) = .47, p = .320, one-tailed). There was also no significant difference in post-AOD 
perceived critical thinking skills between those having an AOD contribution assessment (M = 68.82, SD = 
6.35) and those having a post-AOD essay assessment (M = 66.10, SD = 5.71; t(19) = 1.03, p = .160, one-
tailed). Therefore hypothesis 3 was not supported. 
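As an illustrative sketch only, the comparison above can be recomputed directly from the summary statistics in Table 9 using SciPy's `ttest_ind_from_stats` (a standard pooled-variance Student's t test). Note this is not the authors' analysis script, and values recomputed from the rounded means and standard deviations may not reproduce the published t and p exactly.

```python
# Sketch: independent-samples t test for objective critical thinking skills,
# recomputed from the Table 9 summary statistics. Assumes equal variances
# (Student's t, df = 11 + 10 - 2 = 19); results derived from rounded
# summary values may differ from the published t(19) and p.
from scipy.stats import ttest_ind_from_stats

t_stat, p_two_tailed = ttest_ind_from_stats(
    mean1=16.36, std1=1.47, nobs1=11,  # AOD contribution assessment group
    mean2=15.50, std2=1.01, nobs2=10,  # post-AOD essay assessment group
)

# Hypothesis 3 is directional, so convert the default two-tailed p-value
# to a one-tailed p-value in the hypothesised direction.
p_one_tailed = p_two_tailed / 2 if t_stat > 0 else 1 - p_two_tailed / 2
print(f"t = {t_stat:.2f}, one-tailed p = {p_one_tailed:.3f}")
```

The same call with the perceived critical thinking statistics (68.82/6.35 vs. 66.10/5.71) reproduces the second comparison.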
 
Discussion 
 
The findings of this research suggest that the form of assessment used in an AOD may be less important 
than the fact that assessment of some kind is included, given the positive outcomes that have been found
to be associated with assessing AOD (Dennen, 2008; Klisc et al., 2009; Palmer & Holt, 2009). Significant 
improvements in students’ levels of perceived critical thinking occurred for students who had their AOD 



contributions assessed, as well as for those who had a post-AOD assessment. This result, linking critical 
thinking outcomes in an AOD with assessment, is consistent with other studies that have examined 
student perceptions of their learning in an AOD that included assessment (Akyol & Garrison, 2011; 
Arend, 2009; Birch & Volkov, 2007). 
 
The results also suggest several ways in which an AOD can contribute to student development of critical 
thinking. Exposure to multiple perspectives in an AOD was most frequently mentioned by students 
regardless of the assessment approach. This sharing and exchanging of ideas in an AOD has been cited in 
previous research as a major benefit of AOD (Birch & Volkov, 2007; Wu & Hiltz, 2004). 
 
The role of AOD in assisting with argument development was also identified; again, a finding consistent 
with previous research. Both Hamann, Pollock, and Wilson (2012) and Meyer (2007) reported that 
students found AOD helpful for formulating and evaluating their ideas, with the time delay of an AOD 
allowing for reflection on the discussion postings prior to responding. Surprisingly though, in the current 
study, it was mostly those students who completed the AOD contribution assessment who mentioned the 
role of AOD in argument development. Perhaps the requirement of incorporating AOD contributions into 
the post-AOD assessment can alter the way in which the AOD is viewed. It is possible that students 
become so preoccupied with finding material for completing the post-AOD assessment that the 
opportunity for reflective thought and active engagement, which would help with clarifying their 
arguments, eludes them. This preoccupation with fulfilling grading requirements is not uncommon (Peters 
& Hewitt, 2010). 
 
Though the results showed significant improvements in the levels of perceived critical thinking after 
completing an AOD with assessment, it was found that there was no significant difference in the levels of 
critical thinking skills, perceived or objectively measured, between the students completing the different 
forms of assessment. So, despite suggestions that a post-AOD assessment may provide the opportunity 
for student reflection and analysis of the AOD contributions (Clark, 2001; Greenlaw & DeLoach, 2003;
Richardson & Ice, 2010), the evidence from the current study seems to suggest that the opportunity was 
not capitalised on. 
 
The results of this study raise the possibility that merely having a post-AOD assessment may not be 
enough to guarantee that students will take advantage of the opportunity to exercise their thinking skills. 
Akyol and Garrison (2011) reported that postgraduate students in their study believed that the AOD alone 
was not sufficient to develop high order thinking, and that the subsequent assessment was needed to 
demonstrate “the synthesis, evaluation and summary of everything that went on in the class [online 
discussion]” (p. 243). However, these students emphasised that it was the design of the course that 
directed them to use their synthesising and evaluating skills in completing the post-AOD assessment. The 
findings from the current research therefore suggest that when using a post-AOD assessment, especially 
with undergraduate students, it is important that instructors emphasise and encourage students to direct 
their attention towards using critical thinking skills associated with synthesis and evaluation. 
 
One possible explanation for the lack of difference in critical thinking levels between those completing 
the different assessments may be found in how students in this study viewed the AOD. As already 
discussed, students who had the AOD contribution assessment believed the AOD contributed to their 
critical thinking in two ways: exposure to multiple perspectives and helping with argument development. 
However, most of the students completing the post-AOD assessment reported the exposure to multiple 
perspectives and did not mention argument development. It is possible that the students doing the post-
AOD assessment saw the AOD mainly as a source of information, whereas the students having their 
contributions assessed viewed the AOD not only as a source of information, but also as a place to 
evaluate and develop their arguments. No reported research to date has examined the impact that different 
assessments have on how students view the AOD, and as this has important implications for teaching, it 
should be further investigated. If having a post-AOD assessment alters how students view an AOD, 
instructors using this approach may need to explicitly raise student awareness of the potential benefit of 
an AOD as a place to share, develop and evaluate their arguments, and not merely as a place to gather
material for a post-AOD assessment. 
 
Despite the fact that completing a post-AOD assessment did not lead to higher levels of critical thinking 
compared to students having their AOD contributions assessed, there is nevertheless some merit in the 



use of a post-AOD assessment, which may be a practical approach from an instructor’s perspective.
Research indicates that reading and grading AOD postings is very time-consuming (Andresen, 2009;
Beebe et al., 2010; Dennen, 2008) and hence difficult to use for assessment
purposes (Klisc, 2015). Therefore, using a post-AOD assessment may be a useful alternative, and ways in 
which to do so need investigation. 
 
The study reported here had some limitations. Firstly, the number of participants was relatively small. 
The lack of significant difference in critical thinking skills between the two groups could be due to a lack
of statistical power arising from the small sample size, and a larger sample may be necessary to detect small
improvements in critical thinking. This limitation should be addressed in future research. 
 
Secondly, the small number of participants interviewed may limit the ability to draw conclusions from the 
interview findings; however, the purpose of collecting qualitative information via the interviews was to 
provide insight into and elaboration of the quantitative findings, not to seek trends that could be widely
applied. Nevertheless, the current study provides a useful starting point for further research on the efficacy
of post-AOD assessment for undergraduates, as previous studies have focused on postgraduate students 
(Akyol & Garrison, 2011), or on how a post-AOD submission influences the quality of the online 
discussion (Geer, 2003; MacKinnon, 2004). 
 
While the use of an objective test to measure student critical thinking skills before and after the treatments 
would have been preferable, it was not possible to accommodate the appropriate interval between 
administrations during the semester. Future research should incorporate pre- and post-treatment objective
measures of critical thinking.
 
Conclusion 
 
The results of this study, though tentative, have shown how different assessment approaches used in
conjunction with an AOD may enhance critical thinking outcomes. The findings confirm earlier
work on the use of assessment to facilitate student critical thinking that claims that without some form of 
assessment students will not participate in an AOD, and that assessment is necessary to motivate students 
to take extra care in creating their discussion contributions (Dennen, 2008; Klisc et al., 2009; Vonderwell 
et al., 2007). 
 
The results also suggest that having a post-AOD assessment may be just as effective in improving levels 
of student critical thinking as the more commonly used assessment of AOD contributions. Given the 
concern expressed about the time-consuming nature of grading AOD contributions (Andresen, 2009; 
Beebe et al., 2010; Dennen, 2008), the use of a post-AOD assessment seems a sensible and practical 
assessment approach from an educator’s perspective. In the current study, an essay was used as the post-
AOD assessment; however, such an assessment may take various forms: Akyol and Garrison (2011)
reported that the use of a redesign project was highly successful in facilitating high order thinking.
Regardless of its form, a post-AOD assessment ought to be a culminating task that draws on the
discussion and requires students to evaluate and synthesise the material presented in the AOD
(Richardson & Ice, 2010). 
 
Although online discussion can assist students to develop their critical thinking skills, the results of this 
study suggest that having a post-AOD assessment is not enough to guarantee students will use the 
accompanying AOD to extend critical thinking skills. This reinforces the idea that undergraduate students 
need guidance and instruction to achieve learning outcomes (Alexander et al., 2010; Bai, 2009; Gilbert & 
Dabbagh, 2005). To enhance argument development in an AOD, explicit instruction in logical reasoning
and deduction, in testing premises, and in questioning other participants may be required so that students
can use the discussion material to create quality post-AOD work.
 
References 
 
Akyol, Z., & Garrison, D. R. (2011). Understanding cognitive presence in an online and blended
community of inquiry: Assessing outcomes and processes for deep approaches to learning. British
Journal of Educational Technology, 42(2), 233-250. https://doi.org/10.1111/j.1467-8535.2009.01029.x


Alexander, M. E., Commander, N., Greenberg, D., & Ward, T. (2010). Using the four-questions 
technique to enhance critical thinking in online discussions. MERLOT Journal of Online Learning and 
Teaching, 6(2), 409-415. Retrieved from http://jolt.merlot.org/vol6no2/alexander_0610.htm 

Andresen, M. A. (2009). Asynchronous discussion forums: Success factors, outcomes, assessments, and 
limitations. Educational Technology & Society, 12(1), 249-257. Retrieved from
http://www.ifets.info/issues.php?id=42  

Applebee, A. N. (1984). Writing and reasoning. Review of Educational Research, 54(4), 557-596. 
https://doi.org/10.3102/00346543054004577 

Arend, B. (2009). Encouraging critical thinking in online threaded discussions. The Journal of Educators 
Online, 6(1), 1-23. https://doi.org/10.9743/jeo.2009.1.1 

Bai, H. (2009). Facilitating students’ critical thinking in online discussion: An instructor’s experience. 
Journal of Interactive Online Learning, 8(2), 156-164. http://www.ncolr.org/issues/jiol/v8/n2 

Beckmann, J., & Weber, P. (2016). Cognitive presence in virtual collaborative learning: Assessing and 
improving critical thinking in online discussion forums. Interactive Technology and Smart Education, 
13(1), 52-70. https://doi.org/10.1108/itse-12-2015-0034 

Beebe, R., Vonderwell, S., & Boboc, M. (2010). Emerging patterns in transferring assessment practices 
from f2f to online environments. The Electronic Journal of E-Learning, 8(1), 1-12. Retrieved from 
http://www.ejel.org/volume8/issue1 

Birch, D., & Volkov, M. (2007). Assessment of online reflections: Engaging English second language
(ESL) students. Australasian Journal of Educational Technology, 23(3), 291-306.
https://doi.org/10.14742/ajet.1254 

Carrington, M., Chen, R., Davies, M., Kaur, J., & Neville, B. (2011). The effectiveness of a single 
intervention of computer-aided argument mapping in a marketing and a financial accounting subject. 
Higher Education Research & Development, 30(3), 387-403. 
https://doi.org/10.1080/07294360.2011.559197 

Clark, J. (2001). Stimulating collaboration and discussion in online learning environments. The Internet 
and Higher Education, 4(2), 119-124. https://doi.org/10.1016/S1096-7516(01)00054-9 

Cohen, A. J., & Spencer, J. (1993). Using writing across the curriculum in economics: Is taking the 
plunge worth it? Journal of Economic Education, 24(3), 219-230. 
https://doi.org/10.1080/00220485.1993.10844794 

Creswell, J. W., & Plano Clark, V. L. (2007). Designing and conducting mixed methods research. 
Thousand Oaks, CA: Sage Publications. 

Darabi, A., Arrastia, M., Nelson, D., Cornille, T., & Liang, X. (2011). Cognitive presence in 
asynchronous online learning: A comparison of four discussion strategies. Journal of Computer 
Assisted Learning, 27(3), 216-227. https://doi.org/10.1111/j.1365-2729.2010.00392.x 

Davies, W. M. (2011). Introduction to the special issue on critical thinking in higher education. Higher 
Education Research & Development, 30(3), 255-260. https://doi.org/10.1080/07294360.2011.562145 

Dennen, V. P. (2008). Looking for evidence of learning: Assessment and analysis methods for online 
discourse. Computers in Human Behavior, 24(2), 205-219. https://doi.org/10.1016/j.chb.2007.01.010 

Ertmer, P. A., Sadaf, A., & Ertmer, D. J. (2011). Student-content interactions in online courses: The role 
of question prompts in facilitating higher-level engagement with course content. Journal of 
Computing in Higher Education, 23(2-3), 157-186. https://doi.org/10.1007/s12528-011-9047-6 

Facione, P. A., & Facione, N. C. (1992). The California Critical Thinking Disposition Inventory (CCTDI):
Test administration manual. Millbrae, CA: California Academic Press.

Facione, P. A., Facione, N. C., & Winterhalter, K. (2010). The California Critical Thinking Skills Test
manual. Millbrae, CA: California Academic Press.

Gao, F., Zhang, T., & Franklin, T. (2013). Designing asynchronous online discussion environments: 
Recent progress and possible future directions. British Journal of Educational Technology, 44(3), 
469-483. https://doi.org/10.1111/j.1467-8535.2012.01330.x 

Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer 
conferencing in distance education. American Journal of Distance Education, 15(1), 7-23. 
https://doi.org/10.1080/08923640109527071 

Geer, R. (2003). Initial communicating styles and their impact on further interactions in computer
conferences. In G. Crisp, D. Thiele, I. Scholten, S. Barker, & J. Baron (Eds.), Interact, integrate,
impact: Proceedings of the 20th Annual Conference of the Australasian Society for Computers in
Learning in Tertiary Education (pp. 194-202). Adelaide.

  


Gilbert, P. K., & Dabbagh, N. (2005). How to structure online discussions for meaningful discourse: A 
case study. British Journal of Educational Technology, 36(1), 5-18. https://doi.org/10.1111/j.1467-
8535.2005.00434.x 

Greenlaw, S. A., & DeLoach, S. B. (2003). Teaching critical thinking with electronic discussion. Journal 
of Economic Education, 34(1), 36-52. https://doi.org/10.1080/00220480309595199 

Hair, J. F., Black, W. C., Babin, B. J., Anderson, R. E., & Tatham, R. L. (2006). Multivariate data 
analysis. Upper Saddle River, NJ: Prentice Hall.

Hamann, K., Pollock, P. H., & Wilson, B. M. (2012). Assessing student perceptions of the benefits of 
discussions in small-group, large-class, and online learning contexts. College Teaching, 60(2), 65-75. 
https://doi.org/10.1080/87567555.2011.633407 

Hara, N., Bonk, C. J., & Angeli, C. (2000). Content analysis of online discussion in an applied 
educational psychology course. Instructional Science, 28(2), 115-152. 
https://doi.org/10.1023/A:1003764722829 

Hazari, S. (2004). Strategy for assessment of online course discussions. Journal of Information Systems 
Education, 15(4), 349-355. Retrieved from http://jise.org/Volume15/15-4/Contents-15-4.html 

Hew, K. F., & Cheung, W. S. (2008). Attracting student participation in asynchronous online discussions: 
A case study of peer facilitation. Computers & Education, 51(3), 1111-1124. 
https://doi.org/10.1016/j.compedu.2007.11.002 

Hew, K. F., & Cheung, W. S. (2011). Higher-level knowledge construction in asynchronous online 
discussions: An analysis of group size, duration of online discussion, and student facilitation 
techniques. Instructional Science, 39(3), 303-319. https://doi.org/10.1007/s11251-010-9129-2 

Kalelioglu, F., & Gulbahar, Y. (2014). The effect of instructional techniques on critical thinking and 
critical thinking dispositions in online discussion. Educational Technology & Society, 17(1), 248-258. 
Retrieved from http://www.ifets.info/issues.php?id=62 

Klisc, C. (2015). Enhancing student learning outcomes in asynchronous online discussion. (Doctoral 
dissertation). Murdoch University, Perth Western Australia. Retrieved from 
http://researchrepository.murdoch.edu.au/26222/ 

Klisc, C., McGill, T., & Hobbs, V. (2009). The effect of assessment on the outcomes of asynchronous 
online discussion as perceived by instructors. Australasian Journal of Educational Technology, 25(5), 
666-682. https://doi.org/10.14742/ajet.1114 

Loncar, M., Barrett, N. E., & Liu, G.-Z. (2014). Towards the refinement of forum and asynchronous 
online discussion in educational contexts worldwide: Trends and investigative approaches within a 
dominant research paradigm. Computers & Education, 73(4), 93-110. 
https://doi.org/10.1016/j.compedu.2013.12.007 

MacKinnon, G. R. (2004). Computer-mediated communication and science teacher training: Two 
constructivist examples. Journal of Technology and Teacher Education, 12(1), 101-114. Retrieved 
from http://learntechlib.org/p/14637 

MacKnight, C. B. (2000). Teaching critical thinking through online discussions. EDUCAUSE Quarterly, 
23(4), 38-41. Retrieved from http://er.educause.edu/articles/2000/12/educause-quarterly-magazine-
volume-23-number-4-2000 

Maxwell, J. A. (2005). Qualitative research design: An interactive approach. Thousand Oaks, CA: 
SAGE Publications. 

Meyer, K. A. (2007). Student perceptions of face-to-face and online discussions: The advantage goes to ... 
Journal of Asynchronous Learning Networks, 11(4), 53-69. Retrieved from
https://onlinelearningconsortium.org/read/journal-issues/ 

Mummery, J., & Morton-Allen, E. (2009). The development of critical thinkers: Do our efforts coincide 
with students’ beliefs? The student experience: Proceedings of the 32nd Higher Education Research 
and Development Society of Australasia Annual Conference (pp. 306-313). Darwin. 

Palmer, S., & Holt, D. (2009). Examining student satisfaction with wholly online learning. Journal of 
Computer Assisted Learning, 25(2), 101-113. https://doi.org/10.1111/j.1365-2729.2008.00294.x 

Peters, V. L., & Hewitt, J. (2010). An investigation of student practices in asynchronous computer 
conferencing courses. Computers & Education, 54(4), 951-961. 
https://doi.org/10.1016/j.compedu.2009.09.030 

Prasad, D. (2009). Empirical study of teaching presence and critical thinking in asynchronous discussion 
forums. International Journal of Instructional Technology and Distance Learning, 6(11), 3-26. 
http://itdl.org/Journal/Nov_09/index.htm 

  


Richardson, J. C., & Ice, P. (2010). Investigating students' level of critical thinking across instructional 
strategies in online discussions. The Internet and Higher Education, 13(1), 52-59. 
https://doi.org/10.1016/j.iheduc.2009.10.009 

Rourke, L., & Anderson, T. (2002). Using peer teams to lead online discussions. Journal of Interactive 
Media in Education, 52(1), 5-18. https://doi.org/10.5334/2002-1 

Schellens, T., & Valcke, M. (2006). Fostering knowledge construction in university students through 
asynchronous discussion groups. Computers & Education, 46(4), 349-370. 
https://doi.org/10.1016/j.compedu.2004.07.010

Schellens, T., Van Keer, H., De Wever, B., & Valcke, M. (2009). Tagging thinking types in asynchronous
discussion groups: Effects on critical thinking. Interactive Learning Environments, 17(1), 77-94.
https://doi.org/10.1080/10494820701651757

Schindler, L. A., & Burkholder, G. J. (2014). Instructional design and facilitation approaches that 
promote critical thinking in asynchronous online discussions: A review of the literature. Higher 
Learning Research Communications, 4(4), 11-29. https://doi.org/10.18870/hlrc.v4i4.222 

Topcu, A. (2010). Relationship of metacognitive monitoring with interaction in an asynchronous online 
discussion forum. Behaviour & Information Technology, 29(4), 395-402. 
https://doi.org/10.1080/01449291003692649 

Vonderwell, S., Liang, X., & Alderman, K. (2007). Asynchronous discussions and assessment in online 
learning. Journal of Research on Technology in Education, 39(3), 309-328. 
https://doi.org/10.1080/15391523.2007.10782485 

Wee, M., & Abrizah, A. (2011). An analysis of an assessment model for participation in online forums. 
Computer Science and Information Systems, 8(1), 121-140. https://doi.org/10.2298/csis100113036c 

Wu, D., & Hiltz, S. R. (2004). Predicting learning from asynchronous online discussions. Journal of 
Asynchronous Learning Networks, 8(2), 139-152. Retrieved from 
https://onlinelearningconsortium.org/read/journal-issues/ 

Xie, K., Yu, C., & Bradshaw, A. C. (2014). Impacts of role assignment and participation in asynchronous 
discussions in college-level online classes. The Internet and Higher Education, 20, 10-19.
https://doi.org/10.1016/j.iheduc.2013.09.003 

Zydney, J. M., deNoyelles, A., & Kyeong-Ju Seo, K. (2012). Creating a community of inquiry in online 
environments: An exploratory study on the effect of a protocol on interactions within asynchronous 
discussions. Computers & Education, 58(1), 77-87. https://doi.org/10.1016/j.compedu.2011.07.009 

 

 
Corresponding author: Chris Klisc, klisc@westnet.com.au  

Australasian Journal of Educational Technology © 2017. 

Please cite as: Klisc, C., McGill, T., & Hobbs, V. (2017). Use of a post-asynchronous online discussion 
assessment to enhance student critical thinking. Australasian Journal of Educational Technology, 
33(5), 63-76. https://doi.org/10.14742/ajet.3030 

 

