Australasian Journal of Educational Technology, 2016, 32(5).  


 

Higher education teachers’ experiences with learning analytics 
in relation to student retention 
 
Deborah West 
Charles Darwin University 
 
Henk Huijser 
Xi’an Jiaotong Liverpool University  
Batchelor Institute of Indigenous Tertiary Education 
 
David Heath 
Charles Darwin University 
 
Alf Lizzio 
Griffith University 
 
Danny Toohey 
Murdoch University  
 
Carol Miles 
University of Newcastle 
 
Bill Searle 
Charles Darwin University 
 
Jurg Bronnimann 
Batchelor Institute of Indigenous Tertiary Education 
 
 

This paper presents findings from a study of Australian and New Zealand academics (n = 
276) who teach tertiary education students. The study aimed to explore participants’ early 
experiences of learning analytics in a higher education milieu where data analytics is gaining 
increasing prominence. Broadly speaking, participants were asked about: (1) their teaching 
context; (2) their current student retention activities; (3) their involvement in, and aspirations 
for, learning analytics use; and (4) their relationship with their institution around learning 
analytics. The sampled teaching staff broadly indicated a high level of interest but a limited 
level of substantive involvement in learning analytics projects and capacity building 
activities. Overall, the intention is to present a critical set of voices that assist in identifying 
and understanding key issues and draw connections to the broader work being done in the 
field. 

 
 
Introduction 
 
This paper reports on one component of an Australian Government Office for Learning and Teaching 
(OLT) funded project entitled Learning analytics: Assisting universities with student retention. Carried out 
over the period from April 2014 to November 2015, this mixed-method study investigated the factors that 
impact on the implementation of learning analytics for student retention purposes. 
 
At the commencement of the project in July and August 2014, a survey of higher education institutions (n 
= 24) found that typically institutions were focused on exploring, planning, and piloting different tools and 
applications designed to improve their analytics capacity (West, 2015; West et al., 2016). Though analytics 
was the subject of much attention in institutions, what was less clear was the extent to which the focus of 
analytics would be on the business dimensions of human resources, marketing, performance management, 
and workload allocation, or whether the analytics focus would be more on the educational dimensions of 
learning environments, curriculum design, pedagogical intent, and student experience, for example. 
Although these two broad dimensions are not necessarily dichotomous, the initial institution level survey 
suggested that integrating human resources, finance, research, and marketing systems into some kind of 
data warehouse tended to be one of the more advanced strategic priorities within surveyed institutions at 
the time (West, 2015). 
 
The institution level survey provided some useful baseline data around institutional decision making and 
progress with learning analytics, but the ways that teaching staff were influencing the direction for learning 
analytics or participating in learning analytics pilots and projects remained unclear. The next phase of the 
project involved the deployment of an academic level survey, which aimed to further knowledge about the 
experiences of teaching staff and other academics with learning analytics, explore their aspirations, and 
elicit their views on key issues identified in the literature. Data from the academic level survey is the 
primary focus of this paper. 
 
Background 
 
Learning analytics 
 
The rise of big data, growth in online learning, and changing politics around higher education are driving 
interest in learning analytics (Ferguson, 2012). Ochoa, Suthers, Verbert, and Duval (2014, p. 5) observe 
that “learning analytics is a new, expanding field that grows at the confluence of learning technologies, 
educational research, and data science”, before indicating that learning analytics has the potential to solve 
two simple but challenging questions: 
 

1. How do we measure the important characteristics of the learning process? and 
2. How do we use those measurements to improve it? 

 
Given the breadth of the above description it is perhaps unsurprising that previous research (Corrin, 
Kennedy, & Mulder, 2013) found that understandings of learning analytics vary amongst academic staff. 
Further, the questions listed by Ochoa and colleagues do not seem too different to those that have existed 
in higher education for many years. However, Ferguson (2012) makes the point that learning analytics 
typically rests on a pair of assumptions: the utilisation of machine readable data, and a focus on big 
data systems and techniques. 
 
Student retention 
 
The academic and non-academic factors that can influence retention are complex and varied (Nelson, 
Clarke, Stoodley, & Creagh, 2014). Complicating matters are the relationships between retention, success, 
and engagement. Helpfully though, there are numerous relevant studies, including recent OLT (formerly 
the Australian Learning and Teaching Council [ALTC]) projects, on student retention (see Nelson et al., 
2014; Willcoxson et al., 2011), alongside studies on learning analytics with some connection to student 
retention. Signals at Purdue University (Arnold & Pistilli, 2012) is a noted example, though more are 
emerging (e.g. Harrison, Villano, Lynch, & Chen, 2015). 
 
Thinking more holistically, Tinto (2009) suggests that to be serious about student retention, universities 
need to recognise that the roots of student attrition lie not only in students and the situations they face, but 
also in the character of the educational settings in which students are asked to learn. If one goes back to the 
definition adopted by the Society for Learning Analytics Research (SoLAR), which articulates learning 
analytics as “the measurement, collection, analysis and reporting of data about learners and their contexts, 
for purposes of understanding and optimizing learning and the environments in which it occurs” (Siemens 
& Long, 2011, p. 34), it becomes clear that student retention (and success and engagement) has a natural 
affinity with learning analytics. 
 
Tinto (2009) has articulated four conditions of student success: expectations, support, feedback, and 
involvement (or engagement), and Nelson et al. (2014) take this idea further and add more detail in their 
Student Engagement Success and Retention Maturity Model (SESR-MM) that includes the following 
categories: 
 

1. learning – assessment, curricula, teaching practices, pedagogical styles 
2. supporting – information, services, resources, people rich advice, advocacy and peer support 
3. belonging – interaction, inclusive activities, identity development/formation opportunities 
4. integrating – academic literacies, personal literacies 
5. resourcing – staff development, evidence base, communication, learning environments 

 
Both Tinto’s four conditions and, especially, Nelson et al.’s categories are potentially measurable, which is 
where learning analytics becomes particularly relevant. 
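To make this link concrete, the minimal sketch below (in Python) pairs each SESR-MM category with examples of machine-readable indicators of the kind reported later in this paper. The pairing is purely illustrative and is an assumption of this sketch, not part of Nelson et al.'s (2014) model.

    # Illustrative pairing of SESR-MM categories with machine-readable indicators.
    # The mapping is a sketch for discussion, not Nelson et al.'s (2014) model.
    SESR_MM_EXAMPLE_INDICATORS = {
        "learning": ["task completion", "attainment of certain grades"],
        "supporting": ["referrals to learning or student support services", "library usage"],
        "belonging": ["use of communication tools (e.g. discussion forums)"],
        "integrating": ["English proficiency", "academic pathway / basis for entry"],
        "resourcing": ["staff attendance at analytics training", "time spent in the LMS"],
    }

    if __name__ == "__main__":
        for category, indicators in SESR_MM_EXAMPLE_INDICATORS.items():
            print(f"{category}: {', '.join(indicators)}")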
 
Linking teaching staff to learning analytics and retention 
 
Corrin et al. (2013) reported on findings from a focus group study featuring 29 staff associated with teaching 
and learning at one Australian institution. A variety of educational problems, situations, and potential ideas 
were raised by the participants in their study. These fell into five broad categories: 
 

1. student performance, 
2. student engagement, 
3. the learning experience, 
4. quality of teaching and the curriculum, and 
5. administrative functions associated with teaching. 

 
These few studies alone illustrate that sizeable variation exists with respect to how learning analytics might 
be applied to issues like student retention. With this in mind, the intention of this study was to both 
incorporate the concepts in these studies into the research instruments and also consider how participant 
responses to open-ended questions fit or did not fit with these typologies. 
 
Aim 
 
As learning analytics is multi-disciplinary, multi-method, and multi-level in application, and this study was 
conducted at a time when participant knowledge was difficult to predict, the research questions are 
necessarily broad in scope. They are: 
 

1. What variety exists in the online environments where teaching takes place? 
2. What involvement do teaching staff currently have in using data to respond to retention issues? 
3. In which learning analytics related activities have teaching staff been involved? 
4. In which retention applications of learning analytics are participants most interested? 
5. How are institutions supporting learning analytics use amongst teaching staff? 

 
Method 
 
Sampling procedure 
 
The survey employed a purposive, snowball sampling strategy to recruit self-selecting individuals. Given 
the sizeable pool of potential participants, the voluntary nature of the research, and the presence of other 
higher education focused projects also seeking participants, obtaining a high response rate was expected to 
be a significant challenge, and this was reflected in the data collection phase. The research team did take a 
number of steps to try to minimise sample bias, and information about participant recruitment and sample 
demographics will be presented to support evaluation of the representativeness of the sample. 
 
Participant recruitment 
Invitations were circulated via three main avenues: 
 

1. Project team networks: The project team, reference group, and evaluation team comprised 
mainly senior academics, so the decision was taken to use their networks with leaders at other 
institutions to facilitate as broad a distribution as the voluntary nature of the project would allow. 
Although the ideal scenario would have been universal distribution by institutions, in reality the 
approaches to senior institutional contacts resulted in varied forms of distribution: 

• distribution of invitations via a specific learning and teaching mailing list; 
• placement of information in a broader newsletter; 
• forwarding to department heads for discretionary distribution; 
• distribution of the survey invitation throughout the institution; and, 
• declining to distribute information about the project. 

 
Follow-up confirmed that the invitation was circulated to staff in some capacity in at least 25 
institutions. In most cases distribution was partial, and in three cases it was institution-wide. 
 
2. Professional interest groups: Information about the project was distributed through either 
meetings or the newsletters of the Higher Education Research and Development Society of 
Australasia, Universities Australia, Council of Australian Directors of Academic Development, 
Australasian Council on Open and Distance Education, and Council of Australian University 
Librarians. 

 
3. Conferences and workshops: As is fairly typical, project team members attended conferences and 
conducted workshops as the project progressed, but to avoid recruiting a disproportionate number of 
learning analytics enthusiasts, participant recruitment via this avenue was intentionally incidental 
rather than proactive. 

 
Table 1 presents data that shows how this overall approach led to response patterns indicative of wide 
distribution (e.g., one that is not stacked with many participants just from partner institutions). 
 
Table 1 
Survey completion information 

Minimum number of institutions with at least one participant: 21 
Separate days where at least one survey was commenced: 47 
First survey commenced: 2 September 2014 
Last survey commenced: 13 November 2014 

 
Demographics 
In total 401 people viewed the survey’s first question. Forty-eight people (12%) who answered no questions 
or only demographic questions were excluded. Of the remaining 353 participants, 276 (78%) answered yes 
to the question “Do you teach students?” This paper is concerned with those 276 respondents. Using this 
parameter allows issues specific to teaching staff to be identified and explored. Table 2 presents a summary 
of the sample demographics. 
 
  



Table 2 
Frequency distribution of selected demographic data (n varies due to missing data). Each category is shown as absolute frequency followed by relative frequency in parentheses. 

Location (n = 274): Australia 269 (98%); New Zealand 5 (2%) 
Primary work role (n = 276): Teaching students 185 (67%); Learning support 25 (9%); Other 24 (9%); Research 19 (7%); Management/administration 12 (4%); Academic development 7 (3%); Student support 4 (1%) 
LMS at institution (n = 276): Blackboard 175 (63%); Moodle 89 (32%); Other 12 (4%) 
Employment basis (n = 275): Full time 223 (81%); Part time 35 (13%); Casual 15 (5%); Other 2 (1%) 
Academic level (n = 276): Lecturer 115 (42%); Senior Lecturer 79 (29%); Associate Professor 28 (10%); Associate Lecturer/Tutor 24 (9%); Professor 18 (7%); Other 12 (4%) 
Length of employment in current institution (n = 251): Less than 1.5 years 18 (7%); 1.5–5 years 57 (22%); 5–10 years 77 (31%); 10–20 years 72 (29%); More than 20 years 27 (11%) 
Length of employment in higher education sector (n = 269): Less than 1.5 years 4 (1%); 1.5–5 years 35 (13%); 5–10 years 61 (23%); 10–20 years 105 (39%); More than 20 years 64 (24%) 
Enrolment modes of students taught (n = 276): Internally enrolled students only 144 (52%); A mix of internal and external students 105 (38%); Externally enrolled students only 14 (5%); Other 12 (4%) 

 
In relation to primary work role, the “other” responses (9%) did not expand much on the categories listed; 
rather, most participants who selected “other” did so to express a reluctance to identify a single role as 
“primary”, with 18 people nominating a split between teaching and research. 
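For readers interested in reproducing this style of summary, frequency distributions like those in Table 2 can be generated with a few lines of code. The sketch below assumes a hypothetical responses.csv export with one row per participant and a primary_work_role column; it is not the analysis script used in this study.

    # Minimal sketch: tabulating absolute and relative frequencies from a
    # hypothetical survey export (responses.csv). Not the study's own script.
    import csv
    from collections import Counter

    def frequency_table(path, column):
        with open(path, newline="", encoding="utf-8") as f:
            values = [row[column] for row in csv.DictReader(f) if row.get(column)]
        counts = Counter(values)
        n = len(values)
        # Each entry: (category, absolute frequency, relative frequency in %)
        return [(cat, freq, round(100 * freq / n)) for cat, freq in counts.most_common()]

    if __name__ == "__main__":
        for cat, freq, pct in frequency_table("responses.csv", "primary_work_role"):
            print(f"{cat}\t{freq}\t{pct}%")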
 
Materials and procedure 
 
The survey was a purpose-designed online questionnaire, built and hosted in Qualtrics. It was accessed via 
a link which made responses anonymous. To reduce the risk of multiple completions by individuals, the 
software allowed one survey attempt per computer/IP address. Participants could save and return to an 
incomplete survey. After 2 weeks with no access, an in-progress attempt was automatically closed and 
placed with completed surveys. Ethics approval for the study was granted by the Charles Darwin University 
Human Research Ethics Committee. 
 



Results 
 
Question 1. What variety exists in the online environments where teaching takes place? 
 
The teaching environments chosen by academics have a range of implications for what is possible with 
learning analytics. Table 3 details results when participants were asked about their use of tools or utilities 
outside the LMS for teaching. 
 
Table 3 
Frequency distribution of online teaching activities of participants. Each category is shown as absolute frequency followed by relative frequency in parentheses. 

Tools or utilities outside the LMS used for teaching (n = 272): Does not use tools or utilities to teach outside the LMS* 120 (44%); Website hosted externally 57 (21%); Website hosted by their institution 54 (20%); Others 53 (20%); Social media applications 51 (19%); Mobile apps 22 (8%) 
Teaching activities conducted outside the LMS (n = 156)**: Provision of access to learning materials 89 (63%); Assessment submission and feedback 75 (52%); Learning focused interactions between lecturers and students 59 (41%); Learning focused interactions between students 48 (34%) 

Note. * denotes mutually exclusive response; ** not asked of 120 participants not teaching outside the LMS. 

 
For the first variable in Table 3 there was space to list the applications for the “social media”, “mobile apps” 
and “other” options. Social media applications included: Facebook (28), Twitter (14), YouTube (13), 
Yammer (3), Instagram (3), Pinterest (2), WordPress (2), Blackboard (2), and 10 singularly mentioned 
applications. In relation to mobile apps, it was apparent that the distinction between social media and 
mobile apps was not necessarily mutually exclusive or clear. There were 20 different mobile apps mentioned. 
Finally, in the open “other” category there was a wide mix of responses. Coding found that these responses 
could be grouped into the following functions, with examples from participant responses: 
 

• productivity and content creation (e.g. multimedia software, Creative Cloud, iMovie); 
• communication (e.g. email, Facebook, Skype); 
• discipline dedicated learning resources and tools (many, e.g. MathLab, Skritter); 
• general content repositories housing learning materials (e.g. YouTube, Vimeo, Lynda); 
• polling and quizzing (e.g. Respondus, PollEverywhere); 
• document storage, sharing, and portfolio creation (e.g. Google Docs, Mahara); 
• virtual and simulated learning environments (e.g. Smart Sparrow); and, 
• shared content creation spaces (e.g. wikis and blogs). 

 
Question 2: What retention related data are participants accessing and using? 
 
Methods of identifying at-risk students 
One way of exploring the uptake of learning analytics was to explore the types of data that participants 
were using to determine risk. Table 4 presents the frequency distribution of selected data sources used by 
participants. Note that in the design and pilot phase it was unclear how certain participants would be about 
the data sources they consulted, so a couple of different types of “other” categories were included to reflect 
this. The responses in the table are presented as they were in the survey instrument. In terms of the “other” 
option, class attendance and colleagues were the strongest responses. 
 
  



Table 4 
Data sources considered when identifying at-risk students (n = 246). Each data source is shown as absolute frequency followed by relative frequency in parentheses. 

Students self-reporting that they are having issues that might affect their retention: 146 (59%) 
LMS: 140 (57%) 
Directly asking students if they are having any issues that might impact their retention: 123 (50%) 
Student Information System: 59 (24%) 
Advised by specialist team that has their own retention monitoring processes: 45 (18%) 
Learning Support: 38 (15%) 
Student Support: 33 (13%) 
Does not take action to identify students with retention related risks*: 24 (10%) 
Consults data from other sources: 23 (9%) 
Teaching tools or utilities outside the LMS: 15 (6%) 
Consults data from other source/s but is not sure what they are called: 9 (4%) 
Library: 5 (2%) 
Note. * denotes mutually exclusive response 
 
Figure 1 shows a frequency distribution of participants’ use of selected indicators to identify at-risk students 
in relation to retention. Notable here is a trend toward indicators that relate to actual performance rather 
than more predictive indicators often collected as part of student enrolment. As with the previous table, 
class attendance was also the most common response where participants were provided with space to put 
“others”. 
 

 
Figure 1. Indicators used to identify at-risk students in relation to retention (n = 276) 
Note. * Several participants said they were not sure but also indicated some of the variables. 
[The indicators charted in Figure 1 were: not sure*; birth country; first in family to enrol in higher education; socio-economic status; Aboriginal and Torres Strait Islander status; home language; academic pathway (basis for entry); international student status; time spent in LMS; materials or resources accessed; English proficiency; repeat student status; use of communication tools; LMS access patterns (e.g. logging in); attainment of certain grades; and task completion.] 



 
Involvement in responding to at-risk students 
Participants were asked whether they had a systematic response when students met identified risk 
thresholds. A total of 103 participants had a systematic response, of which 24 (23%) indicated it applied to 
all thresholds and 79 (77%) indicated it applied to some thresholds only. These 103 participants were asked 
about the elements that comprised that response. Overwhelmingly, the most common responses were 
manually conducted (e.g., manual emails, telephone calls, offers of consultation, manual referrals to support 
services). The primary automated methods (emails, automated referrals, or those in the “other” category) 
all had a frequency of n < 15. 
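The contrast between manual and automated responses can be illustrated with a short sketch: once a risk threshold is met, an automated response simply maps the flag to a templated contact, whereas the manual approaches described above require staff time for each student. The code below is hypothetical and does not describe any participant's system; the templates and identifiers are invented.

    # Hypothetical sketch of an automated response: students flagged for a
    # reason are queued a templated email for follow-up by support staff.
    EMAIL_TEMPLATES = {
        "low task completion": "We noticed some assessment tasks are outstanding. Can we help?",
        "no recent LMS access": "We have not seen you online recently. Support is available.",
    }

    def queue_emails(flags: dict[str, list[str]]) -> list[tuple[str, str]]:
        """Map each flagged student to the template(s) matching their risk reasons."""
        outbox = []
        for student_id, reasons in flags.items():
            for reason in reasons:
                if reason in EMAIL_TEMPLATES:
                    outbox.append((student_id, EMAIL_TEMPLATES[reason]))
        return outbox

    if __name__ == "__main__":
        flagged = {"s001": ["low task completion", "no recent LMS access"]}
        for student_id, message in queue_emails(flagged):
            print(student_id, "->", message)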
 
Question 3: In which learning analytics related activities have teaching staff been 
involved? 
 
The study also investigated participation in learning analytics activities. Results focus on the frequency of 
learning analytics discussions that teaching staff are involved in and the involvement of teaching staff in a 
more diverse selection of analytics activities. 
 
Learning analytics discussion involvement 
Figure 2 explores how often the teachers sampled discussed learning analytics with colleagues in different 
roles. For example, the series on the right-hand side represents how often the teaching staff sampled 
discussed learning analytics with institutional management. Higher bars on the left of each series indicate 
more frequent discussion. 
 

 
Figure 2. Frequency of learning analytics discussion with selected groups (n varies by variable) 
 
Learning analytics activity involvement 
Participants were also asked about whether they had been involved in a selection of learning analytics 
related activities. Table 5 presents the results. 
 
  



Table 5 
Frequency distribution of involvement in selected learning analytics activities (n = 276). Each activity is shown as absolute frequency followed by relative frequency in parentheses. 

None of the listed choices*: 108 (40%) 
Using learning analytics to help with analysis and decision making: 101 (37%) 
Reading about learning analytics for their own professional development: 100 (37%) 
Advocating for the use of learning analytics to colleagues (informal or formal): 70 (26%) 
Attending conferences/training specifically to learn about learning analytics: 56 (21%) 
Conducting formal research and/or publishing work on the topic of learning analytics: 26 (10%) 
Being part of the group that is leading learning analytics at their institution: 24 (9%) 
Delivering training on the use of learning analytics: 9 (3%) 
Note. * denotes mutually exclusive response 
 
Question 4: In which retention applications of learning analytics are participants 
interested? 
 
The survey sought to explore which broad retention related applications of learning analytics participants 
were most interested in. Participants were asked about their level of interest in nine selected applications 
with their responses displayed in Figure 3. Longer bars at the top of each series indicate higher interest 
levels.

 
Figure 3. Participant levels of interest in selected potential applications of learning analytics (n varies) 
 
In interpreting the results readers need to be mindful of two things. First, participants were able to select 
“not sure” but this is not displayed to avoid disrupting the visual flow of the chart. The “not sure” option 
accounted for between 4% and 10% of responses for each application. Additionally, due to missing data, n 
varied between 247 and 252 across the applications. 

[The applications charted in Figure 3 were: assistance with decision making about student admissions to the institution; institutional management evaluating and improving teaching practice across the institution; identification of student success with a view to providing an affirmation/reward type of response; informing design and layout of online learning sites and environments; informing potential initiatives to promote student retention (e.g. mentoring); program teams evaluating and improving their program curriculum; development of broad knowledge base about how effective learning can occur; students monitoring their own progress and identifying actions they could take; teaching staff evaluating and improving their own teaching practice; and identification of at-risk students with a view to staff responding to address the risk. Response options were “A lot of interest”, “A little interest”, and “No interest”.] 



Question 5: How are institutions supporting learning analytics use amongst teaching 
staff? 
 
Subjective perceptions of needs being met 
Participants were asked about the extent to which they felt the institution met their needs in relation to 
selected institutional provisions around learning analytics. Figure 4 shows participant responses when asked 
to rate their institution on seven indicators. Due to missing data, n varied between 230 and 232 across the 
different response options in relation to institutions meeting participants’ perceived needs. 
 

 
Figure 4. Rating of institution at meeting participant needs and expectations in selected areas (n varies) 
[The seven areas rated in Figure 4 were: provision of information about how learning analytics use will affect me; provision of information about how learning analytics is being used; opportunities to provide feedback about learning analytics implementation; provision of professional development about learning analytics; ease of visualisation and interpretation of data; relevance and comprehensiveness of data that I can access; and ease of learning analytics data access. Response options were “Good or Very Good”, “Fair”, “Poor or Very Poor”, and “Not sure”.] 
 
Professional development and training 
Participants were also asked whether they had attended or would attend training on five different topics: 
(1) introduction to learning analytics concepts and applications; (2) overview of institutional plan around 
learning analytics; (3) accessing data; (4) interpreting data; and, (5) responding to data. Results can be 
summarised into two key points. Firstly, none of the five types of training had been attended by more than 
15% of participants. Secondly, participants were interested in training. Each of the five training topics had 
somewhere between 83% and 86% of participants indicating they had attended, or (more commonly) 
would attend, training on that topic. 
 
Discussion and conclusions 
 
Prior to delving into some of the key discussion points to emerge from the data, some limitations will be 
considered. Firstly, the sample size means that the external validity of the data is quite limited, though the 
authors have taken steps to carefully describe the sampling process and demographics. Secondly, the sample 
size also impacted on statistical power, and the end result is a largely descriptive and exploratory survey. 
However, some contentions are made about the value of the study in the context of these limitations: 
 

1. The study illustrates some issues that are important even if they are not universal; 
2. The mixed-method design means that academic level survey data can be connected to other project 
data (e.g., West, 2015) in very specific ways (e.g., contradictions and tensions between 
institutional direction and teachers’ priorities can be considered using the different data sets); 
3. Whilst this project was taking place other work was occurring (e.g., a UniSA-led OLT project; 
Colvin et al., 2016) which can help expand the breadth and range of understanding; and, 

4. At this point in time, sector level research is likely to generate further questions rather than 
solutions to specific problems because to some extent the key challenges and issues are still being 
delineated. 

 
A key message, consistently reinforced, was that participants generally expressed a high level of interest in 
learning analytics, but their participation in learning analytics activities, particularly collaborative 
participation, was limited. Although 37% of participants reported using analytics to help with making 
decisions, very few participants engaged in frequent (e.g., weekly or fortnightly) discussion with colleagues, 
especially outside of other teaching staff. As learning analytics is seen as a field where collaborative use of 
different expertise (e.g., data science, pedagogy, discipline knowledge) is important (Ochoa et al., 2014), a 
lack of communication represents a barrier to progress. A similar conclusion can be drawn from the data 
about training: roughly 85% of participants reported interest in attending analytics training, but very few 
had attended. 
 
What might be stopping teaching staff from pursuing learning analytics in line with their reported interests? 
In addition to the data reported in the results section, the survey featured a number of open-ended qualitative 
questions discussed elsewhere (West, 2015); however, one of these questions bears mentioning here. When 
participants were asked what they needed to pursue their learning analytics goals there were four dominant 
responses: clarity, access, training, and time, concisely connected by one participant who suggested: “Tell 
me what data is available, give me access to it, give me the time to use it and give me guidance in using it”. 
Even extensive discussion would likely be insufficient to explore all the ways in which these needs might 
be met; however, such views do highlight two related tensions in the learning analytics space: 

 
1. distributed versus centralised locations as the source of analytics initiation and innovation; and 
2. homogeneity versus heterogeneity of analytics focus (i.e., how universal are the problems within 
an institutional context that analytics might be used to address?). 
 
Learning analytics has been a hot topic over the past couple of years, and significant discussion, particularly 
at the institutional level, has been about integrating major data systems, with a view to large projects 
applying predictive analytics and other business intelligence techniques, for example (West, 2015). Given 
the often centralised management and focus of these projects, teaching staff might be aware of their 
presence, but may not have been provided with enough information to form a coherent understanding of 
what their role might be, or how the analytic tools, techniques, and problem questions of interest to them 
might differ from those being used centrally by institutional managers, leaders, and central departments. 
There are potentially a number of reasons for this: 
 

• institutional leaders see analytics as largely about tools for institutional leaders and managers; 
• institutional leaders do see a role for teaching staff but are not yet sure what that might be; 
• institutional leaders envision a role for teaching staff, but promoting this is a future priority; 
• institutional leaders are not necessarily aware that teachers are interested in learning analytics; 
and/or, 
• institutional leaders view the individual problems or questions that teachers might address with 
analytics as distinctly heterogeneous and see analytics initiatives as best driven in distributed or 
localised contexts. 

 
Perhaps lending weight to the final suggestion is that, when participants were asked about their teaching 
activities outside the LMS, they reported utilising a wide array of tools and applications. Whilst academics 
may have originally selected these tools based on their fit to identified learning requirements, many of these 
tools have embedded analytics functionality (e.g., Iconosquare for Instagram, Google Analytics) that can 
be used out of the box or, as learning analytics researchers (e.g., Kitto, Cross, Waters, & Lupton, 2015) are 
increasingly demonstrating, customised to higher education learning settings using open source tools. 
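As a deliberately generic illustration of this point, the sketch below aggregates a hypothetical activity export (a CSV with user and action columns) from a tool used outside the LMS into per-student interaction counts. It is not the Kitto et al. (2015) toolkit, nor any specific product's API; the file name and column names are assumptions.

    # Generic sketch: counting per-student interactions from a hypothetical
    # activity export (activity_export.csv with 'user' and 'action' columns)
    # produced by a tool used outside the LMS.
    import csv
    from collections import defaultdict

    def interaction_counts(path: str) -> dict[str, dict[str, int]]:
        counts: dict[str, dict[str, int]] = defaultdict(lambda: defaultdict(int))
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                counts[row["user"]][row["action"]] += 1
        return {user: dict(actions) for user, actions in counts.items()}

    if __name__ == "__main__":
        for user, actions in interaction_counts("activity_export.csv").items():
            print(user, actions)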
 
The question of who makes decisions about what to pursue with regard to learning analytics is an 
important one, and the answer is liable to vary significantly from institution to institution. Some institutions 
have a clear preference (often supported by policy) that academic staff use the LMS, whereas others allow 
much more discretion (West, 2015). Similarly, some institutions may be focused on developing learning 
analytics reports and dashboards for use across the institution, whereas others may see the role of teaching 
staff, program coordinators and/or educational developers as working together to select and use teaching 
tools and technologies that meet their unique data needs. 
 
Ultimately, one of the overriding themes across the entire project was about the challenge of dealing with 
the variety of choices that exists in the new era of analytics. Clearly the choices about what to explore and 
adopt can be at once dizzyingly complicated and numerous, yet full of possibility. This study represents an 
initial contribution in the context of a broader community where much is being done to collaboratively 
build capacity around learning analytics and support people across all levels of the sector to better 
understand potential uses. 
 
Further information 
 
More information about the project, including presentations and resources from the National Forum (e.g. a 
framework of factors impacting on institution level learning analytics implementation and an accompanying 
set of discussion questions), is available on the project website at 
www.letstalklearninganalytics.edu.au 
 
Acknowledgements 
 
Support for this publication has been provided by the Australian Government Office for Learning and 
Teaching. The views expressed in this paper do not necessarily reflect the views of the Australian 
Government Office for Learning and Teaching. 
 
References 
 
Arnold, K., & Pistilli, M. (2012). Course signals at Purdue: Using learning analytics to increase student 
success. Proceedings of the 2nd International Conference on Learning Analytics and Knowledge 
(LAK ’12), Vancouver, British Columbia, Canada (pp. 267-270). 
http://dx.doi.org/10.1145/2330601.2330666 

Colvin, C., Rogers, T., Wade, A., Dawson, S., Gasevic, D., Buckingham Shum, S., … & Fisher, J. (2016). 
Student retention and learning analytics: A snapshot of current Australian practices and a framework 
for advancement. Canberra, ACT: Australian Government Office for Learning and Teaching. 
Retrieved from http://www.olt.gov.au/project-student-retention-and-learning-analytics-snapshot-
current-australian-practices-and-framework 

Corrin, L., Kennedy, G., & Mulder, R. (2013). Enhancing learning analytics through understanding the 
needs of teachers. In Electric Dreams. Proceedings ascilite Sydney 2013 (pp. 201-205). Retrieved 
from http://www.ascilite.org/conferences/sydney13/program/papers/Corrin.pdf 

Ferguson, R. (2012). Learning analytics: Drivers, developments and challenges. International Journal of 
Technology Enhanced Learning, 4(5/6), 304–317. http://dx.doi.org/10.1504/IJTEL.2012.051816 

Harrison, S., Villano, R., Lynch, G., & Chen, G. (2015). Likelihood analysis of student enrollment 
outcomes using learning environment variables: A case study approach. Proceedings of the 5th 
International Conference on Learning Analytics and Knowledge (LAK ’15), Poughkeepsie, NY, 141-
145. http://dx.doi.org/10.1145/2723576.2723621 

Kitto, K., Cross, S., Waters, Z., & Lupton, M. (2015). Learning analytics beyond the LMS: The 
connected learning analytics toolkit. Proceedings of the 5th International Conference on Learning 
Analytics and Knowledge (LAK ’15), Poughkeepsie, NY, 11-15. 
http://dx.doi.org/10.1145/2723576.2723627 

Nelson, K., Clarke, J., Stoodley, I., & Creagh, T. (2014). Establishing a framework for transforming 
student engagement, success and retention in higher education institutions. Final Report 2014, 
Canberra, Australia: Australian Government Office for Learning & Teaching. Retrieved from 
http://studentengagementmaturitymodel.net/wp-
content/uploads/2013/07/ID11_2056_Nelson_Report_2014-1-1.pdf 

Ochoa, X., Suthers, D., Verbert, K., & Duval, E. (2014). Analysis and reflections on the third Learning 
Analytics and Knowledge Conference (LAK 2013). Journal of Learning Analytics, 1(2), 5‐22. 
Retrieved from http://epress.lib.uts.edu.au/journals/index.php/JLA/article/view/4080 

Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE 
Review, 46(4) July/August. Retrieved from http://www.educause.edu/ero/article/penetrating-fog-
analytics-learning-and-education  

Tinto, V. (2009). Taking student retention seriously: Rethinking the first year of university. ALTC FYE 
Curriculum Design Symposium, Queensland University of Technology, Brisbane, Australia. pp. 1-10. 
Retrieved from http://www.fyecd2009.qut.edu.au/resources/SPE_VincentTinto_5Feb09.pdf 

West, D. (2015). Method and findings. Proceedings of Let’s Talk Learning Analytics. The National 
Forum for the Learning Analytics: Assisting Universities with Student Retention project. Griffith 
University, Queensland, Australia. Retrieved from 
http://www.letstalklearninganalytics.edu.au/wp-content/uploads/2015/05/Findings-Presentation-
Final.pdf 

West, D., Huijser, H., Lizzio, A., Toohey, D., Miles, C., Searle, B., & Bronnimann, J. (2016). Learning 
analytics: Assisting universities with student retention. Final Report (Part 1), Canberra, ACT: 
Australian Government Office for Learning and Teaching. Retrieved from 
http://www.olt.gov.au/resource-library?text=Learning+Analytics  

Willcoxson, L., Manning, M., Hibbins, R., Joy, S., Thomas, J., Girardi, A., … Lynch, B. (2011). The 
whole of university experience: Retention, attrition, learning and personal support interventions 
during undergraduate business studies. Surrey Hills, NSW: Australian Learning and Teaching 
Council. Retrieved from http://eprints.usq.edu.au/20138/ 

 
 
Corresponding author: Deborah West, deborah.west@cdu.edu.au 
 
Australasian Journal of Educational Technology © 2016. 
 
Please cite as: West, D., Huijser, H., Heath, D., Lizzio, A., Toohey, D., Miles, C., Searle, B., & 
Bronnimann, J. (2016). Higher education teachers’ experiences with learning analytics in relation to 
student retention. Australasian Journal of Educational Technology, 32(5), 48-60. 
http://dx.doi.org/10.14742/ajet.3435 
 
