Australasian Journal of Educational Technology, 2020, 36(6).

Editorial

Learning Analytics: Pathways to Impact

Linda Corrin
Swinburne University of Technology, Australia

Maren Scheffel
Ruhr-Universität Bochum, Germany

Dragan Gašević
Monash University, Australia

The field of learning analytics has evolved over the past decade to provide new ways to view, understand, and enhance learning activities and environments in higher education. It brings together research and practice traditions from multiple disciplines to provide an evidence base to inform student support and effective design for learning. This has resulted in a plethora of ideas and research exploring how data can be analysed and utilised not only to inform educators, but also to drive online learning systems that offer personalised learning experiences and/or feedback for students. However, a core challenge that the learning analytics community continues to face is how the impact of these innovations can be demonstrated. Where impact is positive, there is a case for continuing or increasing the use of learning analytics; however, there is also the potential for negative impact, which needs to be identified quickly and managed. As more institutions implement strategies to take advantage of learning analytics as part of core business, it is important that impact can be evaluated and addressed to ensure effectiveness and sustainability. In this editorial of the AJET special issue dedicated to the impact of learning analytics in higher education, we consider what impact can mean in the context of learning analytics, and what the field needs to do to ensure that there are clear pathways to impact that result in the development of systems, analyses, and interventions that improve the educational environment.

Keywords: Learning analytics, impact, higher education

Introduction

Impact.
It is a word so often used in higher education as the goal of everything from research, to learning and teaching, to financial investment. The Cambridge English Dictionary defines impact as “a powerful effect that something, especially something new, has on a situation or person” (Cambridge University Press, n.d.). In the context of learning analytics (LA), a field full of new possibilities, there have been high hopes among the higher education community regarding the positive impact learning analytics can deliver for students and institutions to address a range of strategic and educational needs. The Society for Learning Analytics Research (SoLAR) defines learning analytics as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” (SoLAR, n.d.). This definition makes it clear that impact should result in optimised learning and learning environments. This sounds like a fairly straightforward goal. However, over the past decade, it has been demonstrated time and time again that the reality of identifying and measuring such impact is considerably more complex.

While the use of data to inform learning and teaching is a practice that has occurred for many decades, the learning analytics field, which brings together insights and methodologies from disciplines such as education, computing, learning science, psychology, and data visualisation, emerged around 2010 (Gašević et al., 2015; Kennedy et al., 2017). At the time, the increased availability of data, combined with a need to address growing strategic issues such as student retention, provided an optimal environment for researchers and practitioners from across a range of disciplines to join together to explore new opportunities (Siemens, 2013). New tools and novel applications of existing analysis techniques started to emerge in the literature.
Some were already being implemented in institutions (e.g., Purdue’s Course Signals system, as described by Arnold and Pistilli, 2012), while others were more proof of concept (Macfadyen & Dawson, 2010) or small pilots (Santos et al., 2012). Many of the early LA systems sought to take data that was easily accessible (e.g., logins and ‘clicks’ within learning management systems) and visualise it in meaningful ways for teachers and/or students. The impact of this initial work was significant to the field of learning analytics in not only showing what was possible, but also in highlighting some of the key challenges for the implementation of learning analytics in practice.

Measuring impact

Learning analytics is often used as an approach to measure the impact of educational designs. That may be the impact of, for example, a particular learning design on student engagement, the provision of different forms of feedback on student performance, or the personalisation of learning based on levels of knowledge and behaviour. Researchers have investigated many different ways in which data from learning systems can be analysed to show this impact, using a range of statistical analyses. Historically, LA studies have relied predominantly on quantitative approaches to perform these forms of evaluation. However, over time there has been growing acknowledgement of the importance of qualitative approaches to increase the explanatory power of these outcomes. This is an idea explored in this special issue by Han and Ellis, who used self-reported data along with observational measures to investigate the impact of a blended course design on student performance. In this study they found that this combined approach led to a more in-depth understanding of student behaviour, which could be used to inform changes the teacher could make to improve the blended learning design.
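To make this style of early LA system concrete, the following is a minimal, hypothetical sketch (not a depiction of any specific tool mentioned above) of how easily accessible LMS event data might be aggregated into weekly activity counts per student, with low-activity students flagged for a teacher's attention. All names, data, and the threshold are illustrative assumptions.

```python
from collections import defaultdict
from datetime import date

# Hypothetical LMS event log: (student_id, event_date) pairs for logins/clicks.
events = [
    ("s1", date(2020, 3, 2)), ("s1", date(2020, 3, 3)), ("s1", date(2020, 3, 10)),
    ("s2", date(2020, 3, 2)),
    ("s3", date(2020, 3, 9)), ("s3", date(2020, 3, 9)), ("s3", date(2020, 3, 11)),
]

def weekly_activity(events):
    """Count events per (student, ISO week) - a crude proxy for engagement."""
    counts = defaultdict(int)
    for student, day in events:
        week = day.isocalendar()[1]  # ISO week number of the event
        counts[(student, week)] += 1
    return dict(counts)

def flag_low_engagement(counts, students, week, threshold=1):
    """Return students whose activity in the given week falls below the threshold."""
    return sorted(s for s in students if counts.get((s, week), 0) < threshold)

counts = weekly_activity(events)
print(flag_low_engagement(counts, {"s1", "s2", "s3"}, week=11))  # → ['s2']
```

A dashboard built on such counts illustrates both the promise noted above (simple, available data made visible) and the challenge: click counts alone say little about learning itself, which is precisely the “clicks to constructs” gap discussed later in this editorial.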
The goal of improving the explanatory power of data within the context of learning analytics has also seen the appropriation of new approaches which seek to quantify qualitative records of student learning behaviour and outcomes. An example of this is the adoption of quantitative ethnography (Shaffer, 2017), and more specifically epistemic network analysis (ENA), into learning analytics research. In this issue, Lim et al. utilise ENA to explore students’ perceptions of personalised feedback delivered via an LA system and how this is associated with their self-regulated learning processes. This study identifies several recommendations to improve the impact of feedback delivered to students, including aligning it to students’ self-regulated learning approaches, specifically acknowledging aspects of curriculum design, using a positive tone, and encouraging ongoing dialogue around learning strategies. The outcomes of studies like this one are useful in exploring the impact of particular LA-based feedback systems (in this case, a system called OnTask), but can also influence a broader pedagogical discussion around enhancing the provision of feedback more generally.

As researchers and practitioners continue to apply a range of analysis techniques to different sources of data for learning analytics, it has become clear that a greater understanding of how learning is operationalised and can be measured is vital to the determination of impact. To be able to measure LA impact there needs to be a clear link between the learning design context, what the student does, and how the teacher can respond. This realisation has led to calls for greater reference to learning theories in determining how measures of student activity can be used as proxies for learning (Gašević et al., 2015; Wise & Shaffer, 2015), and also for how the outputs of analyses could be translated into action – the idea of “clicks to constructs” (Knight & Buckingham Shum, 2017, p. 17).
This is a call that has been taken up by many in the field, as demonstrated in the papers in this special issue. Self-regulated learning (SRL) is one such theoretical lens often applied in learning analytics studies to understand patterns of student activity in their approaches to study (Roll & Winne, 2015). Models of SRL provide guidance as to the data and analyses that can aid in understanding how students plan, perform, and monitor their learning (e.g., Butler & Winne, 1995; Zimmerman, 1990). As noted above, Lim et al. (this issue) used SRL to understand and improve the provision of feedback to students. In another paper in this issue, Viberg, Wasson, and Kukulska-Hulme present a conceptual framework that outlines the dimensions of SRL relevant to learning analytics in the context of mobile-assisted language learning. The framework can be used to inform the development of mobile-assisted language learning apps and the associated learning analytics components, as well as support for language learning in mobile contexts.

An area of increasing interest in the learning analytics community is how students learn collaboratively and how analytics can be used to support this practice (Wise et al., 2021). From a theoretical perspective, learning analytics research has started to draw on lenses such as socially shared regulation of learning (Panadero & Järvelä, 2015) to understand how students learn together. In order to capture this shared learning practice, approaches often extend beyond the analysis of trace data from learning systems to measure students’ physical presence and movement within a collaborative learning environment. An example of this form of analysis is presented by Guo and Barmaki in this issue, who explore the use of data relating to students’ gaze when working in groups to evaluate collaboration and cooperation between students.
They employ eye-tracking technologies within an anatomy-related learning activity and analyse gaze direction and focus objects using a deep neural network framework. This work seeks to inform the development of novel assessment tools for evaluating collaborative learning based on analytics.

Addressing the potential for negative impact

Another way of looking at the impact of learning analytics is to consider any potential risk of harm to students or other stakeholders as a result of the use of LA systems. The ethical implications of LA have been highlighted by many in the literature (e.g., Drachsler & Greller, 2016; Kitto & Knight, 2019; Slade & Prinsloo, 2013) as issues that must be addressed both when building LA systems and when implementing them in practice. These implications go beyond the concept of privacy to also include data ownership, transparency, consent, anonymity, non-maleficence, beneficence, data management, security, and access (Corrin et al., 2019). These concepts highlight potential points of adverse impact, from why and how the data is collected, to where it is stored, who has access, what is done to it, and what action results. There are still many unresolved issues relating to the ethics of learning analytics, such as the duty an institution has to act on and/or use the data that is collected, and an ongoing debate about the validity of the algorithms that are used to generate analyses for learning analytics (Ferguson, 2019). Further examination to identify unforeseen or unintended consequences of learning analytics systems and techniques is important in minimising adverse outcomes for students and staff.

Studies of students’ perceptions of the ethical implications of learning analytics have shown some concern from students over the appropriate use of their data (Brooker et al., 2017; Roberts et al., 2016; Tsai et al., 2020).
This is somewhat mitigated when students can see a benefit from the resulting analytics and/or interventions (Arnold & Sclater, 2017). In this issue, West et al. provide insights from over 2,000 students across six universities in Australia on the collection and use of their data for learning analytics. While most students were generally happy for some of their data to be used, they had concerns about the inclusion of demographic, location-based, WiFi, and social media data. To mitigate this, West et al. call for greater transparency about what data is being collected and how it is being used, so that students have the opportunity to give informed consent. They also call for respect for the sometimes-blurred boundary between personal and professional use of devices and applications, so that data is used for legitimate educational purposes and not simply for tracking students’ movements and behaviours.

The ethical, inclusive, and human-centred use of educational data in teaching practice is also examined in this special issue in a study of pre-service teachers at two Australian universities by Prestigiacomo et al. In this study, the pre-service teachers were asked to take part in a human-centred, participatory workshop to explore their understanding of learning design, how they gain insights into students’ activity in the classroom, and their expectations of the role of data and technology in their teaching. The outcomes indicated that more opportunities should be given for pre-service teachers to learn about the use of data in the classroom and the relationship between learning design and learning analytics. These findings suggest that for learning analytics to have impact, ongoing professional development opportunities are needed to improve teachers’ data literacy and to address the increasing presence of data in the educational environment.
Impact of the institutional implementation of learning analytics

When implementing learning analytics systems within an institution, there are a range of factors that need to be addressed for the system to have impact (Macfadyen et al., 2014; Tsai & Gašević, 2017). This idea is explored by Clark et al. in this issue, who conducted an international study across 39 countries which identified five critical success factors required for the implementation of learning analytics in higher education. These include “strategy and policy at organisational level, information technological readiness, performance and impact evaluation, people’s skills and expertise, and data quality” (p. 89). In discussing the factor of performance and impact evaluation, Clark et al. highlight the importance of evaluating all the different aspects of the implementation for evidence of impact, especially the impact of data quality on how data from LA systems are interpreted and actioned.

In determining whether impact has been achieved, it is important to know what kind of impact was sought in the first place. How we measure impact can be institutionally dependent due to the different priorities, issues, and opportunities that may exist. Being able to specify the intended impact plays a significant role when trying to obtain initial support and funding for learning analytics systems at an institutional level. In making a case for the investment of time, money, and personnel into implementing an LA system, there is almost always a need for a “return on investment” case to be made to senior management, expressed in monetary terms. The issue here is that, just as learning is difficult to reduce to numbers (Macfadyen & Dawson, 2012), the impact of LA systems is difficult to reduce to dollars.
There may be some ways to quantify potential savings or gains made by being able to retain more students in their degree courses, but it is substantially harder to quantify improvements to the educational experience, inclusivity, and learning outcomes in a dollar figure. In the current funding climate for higher education, which in many countries has been affected substantially by the COVID-19 global pandemic, being able to show impact and value is increasingly important for institutions to justify investing in learning analytics systems and initiatives.

Concluding thoughts

Throughout this special issue the idea of impact has been considered from several different perspectives. As a broad concept there are many ways that impact can be examined; however, the ability to effectively measure impact is vital to the success of learning analytics in higher education (Dawson et al., 2017). From a research perspective, it is important that research outcomes are translated into practical implications that can inform future LA systems development and implementation. It is also important that we research and learn from our failures as well as our successes. This idea has been embraced in the LA community through initiatives such as the LACE Evidence Hub (http://evidence.laceproject.eu/) and the failathon event at the annual Learning Analytics and Knowledge (LAK) conference (Clow et al., 2016, 2017). These initiatives encourage researchers and practitioners to share experiences of what did not work in order to better understand the conditions under which LA can work. This is an approach to thinking about impact that is not often encouraged within research communities due to the pressure on researchers to focus on the publication of successful studies. Yet, there is so much that can be learned from these unsuccessful attempts. Understanding both what works and what does not is necessary in order to create impactful learning analytics.
Creating pathways to impact in the learning analytics community involves acknowledging that the use of data to inform learning and teaching practices can have many different impacts. Determining the intended impact of learning analytics initiatives, and the theoretical and learning design context for these initiatives, is necessary to be able to identify useful measures of impact. It is also important to acknowledge that there may be unintended and/or adverse impacts on stakeholders (e.g., ethical impacts). As the field of learning analytics has evolved, there has been a greater recognition of the complexity of identifying and measuring impact. At the same time, the field has had an impact on the broader educational discipline’s understanding of concepts related to learning, such as learning design (e.g., Rienties & Toetenel, 2016) and educational video design (e.g., Guo et al., 2014), by performing analyses that were previously impracticable due to the scale of data required. It is an exciting time for the learning analytics community, as the maturation of the field has allowed the discourse to move beyond descriptions of learning practices towards a more sophisticated understanding of how impact can be measured and enhanced to ensure attainment of the field’s overarching goal: the optimisation of learning and learning environments.

Acknowledgements

This special issue is the result of a wonderful group of people who have given their time and considerable expertise to enable this publication. Thank you to the authors for your interesting insights into impact and learning analytics, and to the great team of AJET reviewers for your valuable feedback on all the manuscripts submitted to this issue. A special thank you to the AJET copy editors, whose attention to detail and experience are greatly appreciated.
As a guest editor team, we would also like to thank the AJET lead editors for their encouragement and support as we navigated the processes and systems to bring together this special issue.

References

Arnold, K. E., & Pistilli, M. D. (2012). Course Signals at Purdue: Using learning analytics to increase student success. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 267–270). Association for Computing Machinery. https://doi.org/10.1145/2330601.2330666

Arnold, K. E., & Sclater, N. (2017). Student perceptions of their privacy in learning analytics applications. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 66–69). Association for Computing Machinery. https://doi.org/10.1145/3027385.3027392

Brooker, A., Corrin, L., Mirriahi, N., & Fisher, J. (2017). Defining “data” in conversations with students about the ethical use of learning analytics. In H. Partridge, K. Davis, & J. Thomas (Eds.), Me, Us, IT! Proceedings ASCILITE2017: 34th International Conference on Innovation, Practice and Research in the Use of Educational Technologies in Tertiary Education (pp. 27–31). https://2017conference.ascilite.org/wp-content/uploads/2017/11/Concise-BROOKER.pdf

Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: A theoretical synthesis. Review of Educational Research, 65(3), 245–281. https://doi.org/10.3102/00346543065003245

Cambridge University Press. (n.d.). Impact. In Cambridge Dictionary. Retrieved December 7, 2020, from https://dictionary.cambridge.org/dictionary/english/impact

Clow, D., Ferguson, R., Macfadyen, L., Prinsloo, P., & Slade, S. (2016). LAK failathon. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (pp. 509–511). Association for Computing Machinery. https://doi.org/10.1145/2883851.2883918

Clow, D., Ferguson, R., Kitto, K., Cho, Y. S., Sharkey, M., & Aguerrebere, C. (2017). Beyond failure: The 2nd LAK Failathon. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 504–505). Association for Computing Machinery. https://doi.org/10.1145/3027385.3029429

Corrin, L., Kennedy, G., French, S., Buckingham Shum, S., Kitto, K., Pardo, A., West, D., Mirriahi, N., & Colvin, C. (2019). The ethics of learning analytics in Australian higher education: A discussion paper. https://melbourne-cshe.unimelb.edu.au/research/research-projects/edutech/the-ethical-use-of-learning-analytics

Dawson, S., Jovanovic, J., Gašević, D., & Pardo, A. (2017). From prediction to impact: Evaluation of a learning analytics retention program. In Proceedings of the Seventh International Conference on Learning Analytics & Knowledge (pp. 474–478). Association for Computing Machinery. https://doi.org/10.1145/3027385.3027405

Drachsler, H., & Greller, W. (2016). Privacy and analytics: It’s a DELICATE issue. A checklist for trusted learning analytics. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (pp. 89–98). Association for Computing Machinery. https://doi.org/10.1145/2883851.2883893

Ferguson, R. (2019). Ethical challenges for learning analytics. Journal of Learning Analytics, 6(3), 25–30. https://doi.org/10.18608/jla.2019.63.5

Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64–71. https://doi.org/10.1007/s11528-014-0822-x

Guo, P. J., Kim, J., & Rubin, R. (2014). How video production affects student engagement: An empirical study of MOOC videos. In Proceedings of the First ACM Conference on Learning @ Scale (pp. 41–50). Association for Computing Machinery. https://doi.org/10.1145/2556325.2566239

Kennedy, G., Corrin, L., & de Barba, P. (2017). Analytics of what? What are the problems that big data can solve? In R. James, S. French, & P. Kelly (Eds.), Visions for the future of Australian tertiary education (pp. 67–76). University of Melbourne.

Kitto, K., & Knight, S. (2019). Practical ethics for building learning analytics. British Journal of Educational Technology, 50(6), 2855–2870. https://doi.org/10.1111/bjet.12868

Knight, S., & Buckingham Shum, S. (2017). Theory and learning analytics. In Handbook of learning analytics (pp. 17–22). https://doi.org/10.18608/hla17.001

Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an “early warning system” for educators: A proof of concept. Computers & Education, 54(2), 588–599. https://doi.org/10.1016/j.compedu.2009.09.008

Macfadyen, L. P., & Dawson, S. (2012). Numbers are not enough: Why e-learning analytics failed to inform an institutional strategic plan. Journal of Educational Technology & Society, 15(3), 149–163. https://www.jstor.org/stable/pdf/jeductechsoci.15.3.149.pdf

Macfadyen, L. P., Dawson, S., Pardo, A., & Gašević, D. (2014). Embracing big data in complex educational systems: The learning analytics imperative and the policy challenge. Research & Practice in Assessment, 9, 17–28. https://www.rpajournal.com/dev/wp-content/uploads/2014/10/A2.pdf

Panadero, E., & Järvelä, S. (2015). Socially shared regulation of learning: A review. European Psychologist, 20(3), 190–203. https://doi.org/10.1027/1016-9040/a000226

Rienties, B., & Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and performance: A cross-institutional comparison across 151 modules. Computers in Human Behavior, 60, 333–341. https://doi.org/10.1016/j.chb.2016.02.074

Roberts, L. D., Howell, J. A., Seaman, K., & Gibson, D. C. (2016). Student attitudes toward learning analytics in higher education: “The Fitbit version of the learning world”. Frontiers in Psychology, 7, 1959. https://doi.org/10.3389/fpsyg.2016.01959

Roll, I., & Winne, P. H. (2015). Understanding, evaluating, and supporting self-regulated learning using learning analytics. Journal of Learning Analytics, 2(1), 7–12. https://doi.org/10.18608/jla.2015.21.2

Santos, J. L., Govaerts, S., Verbert, K., & Duval, E. (2012). Goal-oriented visualizations of activity tracking: A case study with engineering students. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 143–152). Association for Computing Machinery. https://doi.org/10.1145/2330601.2330639

Shaffer, D. W. (2017). Quantitative ethnography. Cathcart Press.

Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist, 57(10), 1380–1400. https://doi.org/10.1177/0002764213498851

Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510–1529. https://doi.org/10.1177/0002764213479366

Society for Learning Analytics Research. (n.d.). What is learning analytics? https://www.solaresearch.org/about/what-is-learning-analytics/

Tsai, Y.-S., & Gašević, D. (2017). Learning analytics in higher education – challenges and policies: A review of eight learning analytics policies. In Proceedings of the Seventh International Conference on Learning Analytics & Knowledge (pp. 233–242). Association for Computing Machinery. https://doi.org/10.1145/3027385.3027400

Tsai, Y.-S., Whitelock-Wainwright, A., & Gašević, D. (2020). The privacy paradox and its implications for learning analytics. In Proceedings of the 10th International Conference on Learning Analytics & Knowledge (pp. 230–239). Association for Computing Machinery. https://doi.org/10.1145/3375462.3375536

Wise, A. F., & Shaffer, D. W. (2015). Why theory matters more than ever in the age of big data. Journal of Learning Analytics, 2(2), 5–13. https://doi.org/10.18608/jla.2015.22.2

Wise, A. F., Knight, S., & Buckingham Shum, S. (2021). Collaborative learning analytics. In U. Cress, C. Rosé, A. Wise, & J. Oshima (Eds.), International handbook of computer-supported collaborative learning. Springer.

Zimmerman, B. J. (1990). Self-regulated learning and academic achievement: An overview. Educational Psychologist, 25(1), 3–17. https://doi.org/10.1207/s15326985ep2501_2

Corresponding author: Linda Corrin, lcorrin@swin.edu.au

Copyright: Articles published in the Australasian Journal of Educational Technology (AJET) are available under a Creative Commons Attribution Non-Commercial No Derivatives Licence (CC BY-NC-ND 4.0). Authors retain copyright in their work and grant AJET the right of first publication under CC BY-NC-ND 4.0.

Please cite as: Corrin, L., Scheffel, M., & Gašević, D. (2020). Learning analytics: Pathways to impact. Australasian Journal of Educational Technology, 36(6), 1–6. https://doi.org/10.14742/ajet.6853