Australasian Journal of Educational Technology, 2020, 36(6).

Data in practice: A participatory approach to understanding pre-service teachers' perspectives

Rita Prestigiacomo
The University of Sydney, Australia
Jane Hunter, Simon Knight
University of Technology Sydney, Australia
Roberto Martinez Maldonado
Monash University, Australia
Lori Lockyer
University of Technology Sydney, Australia

Data about learning can support teachers in their decision-making processes as they design tasks aimed at improving student educational outcomes. However, to achieve systemic impact, a deeper understanding of teachers' perspectives on, and expectations for, data as evidence is required. It is critical to understand how teachers' actions align with emerging learning analytics technologies, including the practices of pre-service teachers who are developing their perspectives on data use in the classroom during their initial teacher education programmes. Misalignment may lead to an integration gap in which technology and data literacy align poorly with expectations of the role of data and enabling technologies. This paper describes two participatory workshops that provide examples of the value of human-centred approaches to understanding teachers' perspectives on, and expectations for, data as evidence. These workshops focused on the design practices of pre-service teachers enrolled in teacher education programmes (N = 21) at two Australian universities. The approach points to the significance of (a) pre-service teachers' intentions to track their students' dispositions to learning and their ability to learn effectively, (b) the materiality of learning analytics as an enabling technology and (c) the alignment of learning analytics with learning design, including the human-centred, ethical and inclusive use of educational data in teaching practice.
Implications for practice or policy:
• Pre-service teachers ought to be given opportunities to engage with and understand more about learning design, learning analytics and the use of data in classrooms.
• Professional experience placements for pre-service teachers should include participatory data sessions or learning design workshops.
• Teacher education academics in universities must be provided with ongoing professional development to support their work preparing pre-service teachers for data literacy, learning analytics and the increasing presence of data in schools.

Keywords: learning analytics, learning design, participatory approach, pre-service teachers, data literacy

Introduction

Teachers at all levels of education are increasingly expected to work with data in their professional practice. This expectation is common in teacher preparation programmes in universities and in professional development experiences across education systems (McKenney & Mor, 2015; Schildkamp & Kuiper, 2010, p. 16; Wolf, 2007). School inspection regimes in the United Kingdom and accountability structures within the Australian school education sector increasingly demand that student achievement be measured and analysed (Lingard et al., 2015). These data can inform institutional directions in education policy, including understanding students' performance, retention and the effectiveness of learning programmes (Colvin et al., 2016; Persico & Pozzi, 2015). Moreover, university teacher education programmes are increasingly integrating data literacy into pre-service course work (Henderson & Corry, 2020). This move suggests that efforts to familiarise pre-service teachers with how and when to use data to promote evidence-informed teaching practices continue to grow. Emerging learning analytics (LA) technologies provide a means to support the role of data in students' learning.
This paper offers a novel participatory approach to gaining insights into pre-service teachers' understandings of data and its uses within education. The study offers examples of the workshop design and the practice insights that participatory approaches provide. Pre-service teachers' voices are important for supporting the alignment of LA research and development with practice, and for informing the ways teacher education programmes might better support pre-service teachers' understandings and expectations of data and its enabling technologies in schools.

Background

LA as enabling technology in data-informed learning

Described as "the collection and (automated) analysis of data concerning learners' backgrounds, behaviours and progress", LA is touted as one of the contributing technologies for developing data-savvy practices early in teacher preparation programmes (Wilson et al., 2017, p. 991). LA's potential impact is only now being considered within primary and secondary teacher preparation, with some affordances being to "predict learner outcomes, trigger interventions or curricular adaptations, and even prescribe new pathways or strategies to improve student success" (Freeman et al., 2017, p. 44). This possibility has led to calls for the widespread adoption of LA in programmes of teacher education in universities (Gasevic et al., 2019). Yet, to effectively design learning with LA, teachers require a degree of data literacy (Gummer & Mandinach, 2015; Tsai & Gasevic, 2017; Wasson & Hansen, 2015). The concept of data literacy refers to teachers' ability to collect, analyse and interpret data in order to transform it into information and actionable instructional insights (Gummer & Mandinach, 2015). Data literacy is tied to how teachers navigate assessment, use technologies such as LA and interact with other forms of data that might, for example, be used for teacher performance or in place of professional judgement (Wolff et al., 2016).
Knowing how to better personalise learning plans and design bespoke learning activities for students is important (Mockler & Stacey, 2019). Education in how to use technology and develop innovative learning environments is critical in enacting such personalised learning (Cardno et al., 2019). For example, in a study in three primary schools in Australia (Hunter, 2017), exit tickets, used as in-class assessment tools for daily and weekly learning evaluation, were a common data collection method that enabled in-service teachers to shape their practice and design learning that personalised and met the learning needs of students. For pre-service teachers, this kind of feedback on their students' personal learning needs is invaluable, especially while they are on professional experience placements in schools. However, further research is needed to understand the alignment of these practices with existing and developing technologies, so that understandings of data literacy can connect practice with the potential of these technologies. Although LA holds potential, it also raises concerns in the minds of some teacher education scholars regarding the amplification of test-driven teaching, student privacy and the effects of profiling students (Cope & Kalantzis, 2016; Mockler, 2017). Thus, recent LA conversations have turned to understanding "what kinds of data and algorithms are being used" and "where [do the data] come from and are they inclusive" (Brown et al., 2020, p. 16). Although LA may be an enabling technology in the attempt to use data to inform learning design and practice in school education, there are critical gaps in understanding its implementation and effectiveness in many education contexts, and thus the needs of teacher education programmes in this regard.
LA and learning design

LA has the potential to be both embedded in pedagogy to support students' learning and used by pre-service and in-service teachers to evaluate and develop their practice and design for learning (Lockyer & Dawson, 2011; Lodge et al., 2018). Notions of LA and design for learning (learning design) represent complex aspects of teacher practice focused on the design of students' learning experiences. Learning design is both a process and a product (Goodyear & Dimitriadis, 2013). The process comprises a teacher's cognitive and practical activity of planning and implementing students' learning experiences. The product, in the context of school education, is commonly characterised by documented teaching programmes (or units of work), which comprise a coherent set of assessment tasks, resources and lessons that occur over a number of weeks of a school term. Understanding the processes, influences and considerations of university teachers' design practice is becoming well established and holds relevance for pre-service teacher education (Bennett et al., 2015, 2017). It has been a focus of international research for some time and is recently emerging as an important element of research in school education contexts (Lockyer, 2018). Consistently, across primary, secondary and tertiary education settings, major influences on teachers' design decisions are considerations of their students. Yet, it is not evident that the basis for these decisions goes beyond lived experience or professional judgement to incorporate students' data to enliven how and what they need to learn (Mockler, 2017). Much less is known about the influences on pre-service teachers' learning design practices and thus how we can best develop these skills through course work and professional experience before they graduate from teacher education programmes.
Indeed, it is often difficult for in-service teachers to interpret and act upon data that lacks relevance to their practice (Jørnø & Gynther, 2018; Mangaroska & Giannakos, 2019). For pre-service teachers, who are yet to work full-time in schools, this is particularly challenging. Although data literacy is an increasing focus of education research, and LA raises new potential for the use of data in practice, little is understood about how pre-service teachers at the start of their professional learning conceptualise data and its possible use in practice (Earl & Timperley, 2009). LA, as an enabling technology, requires the integration of skills in the areas of data literacy, learning design and the effective use of learning technologies (Alhadad et al., 2018; Thompson et al., 2018). To date, there is limited research on teachers' use of LA in schools (see, for example, the call for a special issue of the Journal of Learning Analytics; Society for Learning Analytics Research, 2020), and this work has not focused specifically on the data literacy of pre-service teachers. Research in LA has tended to be confined to examinations of the use of educational technology in classroom contexts (Henderson & Corry, 2020). As such, there is increasing concern that teachers' views, no matter what stage of their professional development as practitioners, are typically not accounted for in the design process of LA tools (Buckingham Shum et al., 2019).

Human-centred approach to LA

As a result, there have been calls for LA to embrace the principles of human-centred design approaches, in order to address the perspectives, needs and desires of "critical stakeholders, their relationships, and the contexts in which those systems will function" (Buckingham Shum et al., 2019, p. 2).
Given the complexity of school learning environments, we argue that, for pre-service teachers to begin to make better evidence-informed decisions, it is critical to reflect on their course work and early professional experience placements to determine which data is meaningful and relevant to enhancing their evolving practice. Discussions of what counts as evidence can be usefully framed according to (a) who will use the data, (b) how the data will be used, (c) why the data is important, (d) when the data will be collected and (e) for what purpose. This paper builds on previous LA work that adopted human-centred design approaches to investigate stakeholders' needs (Chen & Zhu, 2019; Holstein et al., 2019; Prieto et al., 2019; Wise & Jung, 2019). Most relevantly, Prestigiacomo et al. (2020) investigated what data, and why data, is critical in the professional development of pre-service teachers. They concluded that a participatory approach could be valuable for generating better insights into how pre-service teachers discuss their knowledge of data and what it means to use it in the context of professional practice. The present paper generates a deeper understanding of this issue, which supports better alignment of practice with the effective implementation of LA in schools and, more immediately, in professional development in teacher education programmes in universities.

The study

Research objective

The main research objective was to gain an understanding of the features of practice that are important to pre-service teachers and that arise when they are given opportunities to discuss learning design, the role of data and what its management might mean for the effective implementation of LA in classroom learning.

Human-centred workshop design

To address this goal, a workshop was designed using a human-centred approach that drew on focus group and participatory design practices.
These methods provided us with insights into the intersection of possible challenges, the role of LA as an enabling technology and its interaction with learning design, and pre-service teachers' understanding of data and its uses. They also involved the voice of participants in the research protocols as a critical part of their professional development. This is perhaps in contrast to quantitative methods, such as the use of questionnaires, where the method is more about generalisable researcher insight and less focused on the voice of the profession (Groothuijsen et al., 2020). These methods make an important contribution to designing research that provides insight into pre-service teachers' perspectives, supporting:
• alignment of tools and learning design supports with pre-service teacher perspectives on data, its use and its enabling technologies
• consideration of the kinds of tools and artefacts pre-service teachers expect and want to use in practice and their understanding of what data is and how it may be used.
Moreover, the approach exemplifies a method that holds value both in providing practical research insight and as a form of professional development for pre-service teachers. In the rest of this section, we provide an exemplification of the application of our participatory approach, drawing on qualitative analysis from workshops with pre-service teachers.

Context and participants

The participatory data workshops, conducted with pre-service teachers who had completed at least two professional experience placements, were 70-minute sessions held at two large metropolitan universities in Australia in 2019. Participants, recruited through their respective course coordinators, were from a mix of undergraduate and postgraduate teacher education programmes in schools of education.
In the workshops, pre-service teachers (N = 21) were prompted to reflect on what data, and why data, is critical in the learning design and delivery of classroom programmes and activities. The workshop activities were facilitated by three researchers. The unit coordinator at each site assisted the researchers and facilitated a group of participants at a table in large open-plan classrooms. The first workshop was conducted within the context of an undergraduate programme of teacher education with 11 participants: mature-age pre-service primary school teachers enrolled in units of study that aimed to explore how information and communication technology can be effectively integrated within the primary school curriculum (in Australia these schools cover years K–6, with students aged 5–12). The second workshop was conducted within the context of a postgraduate programme of teacher education with 10 participants: a final-year unit, English Teaching Methods, which fostered the integration of technology in teaching and learning in secondary schools (in Australia, these schools cover years 7–12, with students aged 13–18). Both groups of pre-service teachers participated in the workshop as part of their regular class programmes. The workshop design was structured around the following two research questions:
1. What kinds of evidence do pre-service teachers draw on when designing for learning? (RQ1)
2. How effective is that evidence, and what, if any, are the resources and evidence they would like to draw on? (RQ2)
Both workshops were audio-recorded and manually transcribed by the first author, along with notes from observations and the collection of artefacts generated during each of the sessions. The study was approved by the University of Technology Sydney human research ethics committee (approval no. ETH17-1415). All participants consented to being observed and audio-recorded and to having artefacts produced during the workshops collected for the purposes of the research.
All data is stored securely and de-identified, and comments attributed in the Results section are referred to by number only.

Design protocols used in the research design

In the first workshop, participants (N = 11) were divided into two groups: Table 1: Teachers 1–6 (data not reported due to data loss) and Table 2: Teachers 7–11. In the second workshop, participants (N = 10) were divided into two groups: Table 1: Teachers 1–6 and Table 2: Teachers 7–10. Each group in the workshops was given two A3 paper sheets labelled "learning design activity" and "learning design delivery", stickers, markers, and blocks of Post-It notes (each in a unique colour to identify participants' ideas related to the learning design and the learning delivery activity). Each workshop commenced with an explanation of the research objective, as stated previously. Its key goals were to understand:
• How is learning design (understood as the resources and evidence, i.e., data) used for the design of a unit of work, a lesson or a sequence of learning activities?
• How does the delivery of learning involve direct synchronous or semi-synchronous interaction with students in the classroom?
• What are the resources and evidence pre-service teachers use to gain insights and support for their beginning work in classrooms? For example, do they rely on qualitative and quantitative data (information) related to students? What is the visibility of that data (i.e., teachers' capability to see relevant information, such as students' behaviour and learning mood or beliefs about a specific activity in a particular environment, to support teachers' decision-making processes)?
Operationalisation of the participatory design approach used in the workshops

The objectives were reflected in three stages of design, set out in the following way:
• Stage 1 (35 minutes). Inspired by the notion of design generation, in the first 15 minutes this question was posed: "What kind of evidence and resources do you use in your learning design and teaching delivery?". Its purpose was to encourage participants to write about and openly discuss their emerging practices in making evidence visible and the resources they use in the learning design and delivery of classroom activities in their course work and practicum experiences to date. The final 20 minutes were based on the notion of "tools for dreaming" (Sanders, 2000, p. 3) and Holstein et al.'s (2019) example of imagining having superpowers. Tools for dreaming and "teacher superpowers" (Holstein et al., 2019, p. 30) can both be deemed generative design tools, which were used to elicit participants' ideas without constraints. Pre-service teachers were asked the question: "What would be three things (resources) that are hard to see in your current practice that you would like to see?"
• Stage 2 (25 minutes). Participants were given six stickers each and asked to rank the most important evidence in the learning design and delivery of lesson activities. The evidence ranking activity was a converging mechanism. It was also designed to serve as a space for reflection and for facilitating the decision-making process by promoting some form of agreement before moving to the next task. After this, for 10 minutes, each group was asked to consider, "Why do you think the identified evidence and resources are valuable? How do the things you have identified connect to learning goals?" Participants further discussed the rationale behind each idea generated.
• Stage 3 (10 minutes). In the evidence sharing phase, participants were asked, "Who do you think should or should not have access to this evidence, especially data (assessment, essays, test results)? Who do you think should or should not look at these data? Why?" These questions aimed to elicit insights into the ethical use of data.

Analysis

The first author used an inductive approach to analyse the data collected in the two workshops (Braun & Clarke, 2006). Again, this was primarily from the audio recordings of participants' responses to questions asked at each stage of the workshop, from classroom observations of the table group activities, notes taken during the workshop by the first author and from the Post-It notes. The five stages in the analysis were:
1. Manually coding the voice-recording data of each workshop together, noting frequent comments and discussion points in an Excel spreadsheet.
2. Determining significant (dominant) themes and categories.
3. Clustering the themes and categories using an affinity diagram technique (Holstein et al., 2019); this approach confirmed the identification and refinement of the most common themes and categories.
4. Adding verbal comments generated during the classroom discussions, observations and artefacts to the data set and mapping them to themes and categories. This enabled a process of triangulation whereby comments from the table group discussions were reflected in Post-It notes and in memos and summaries recorded by the first author during each workshop.
5. Collectively framing these results, using the Post-It notes to cluster specific themes and agreeing on the alignment of these with an overarching frame derived from the three foci in the literature review, as elaborated in the Findings section.
The next section of the paper presents the findings based on the data analysis.
Findings

Results were organised into three themes that respond to the two research questions. Each theme addresses the kinds of evidence pre-service teachers (N = 21) believe they use to make their learning design visible in the delivery of planned classroom activities (RQ1 – What kinds of evidence do pre-service teachers draw on when designing for learning?) and the resources that will support them to do this work with data (RQ2 – How effective is that evidence, and what, if any, are the resources and evidence they would like to draw on?). It became clear that pre-service teachers at this stage of their professional development are learning that specific kinds of evidence will make their learning design visible. This action impacts the way their classroom activities are planned and delivered, and what data is subsequently collected to report assessment of learning in classrooms. Furthermore, this group of pre-service teachers had specific ideas about who should have access to student data and the kinds of classroom evidence collected. We frame these perspectives from the data analysis in three key themes, the first two of which are reflections addressing RQ1's focus on understanding their students' learning and resources for this, while the last reflects RQ2's focus on the kind of evidence they would like to draw on:
• Tracking students' dispositions to learning and their learning needs (RQ1)
• Materiality of LA as an enabling technology and its alignment with learning design (RQ1)
• Human-centred, ethical, inclusive use of LA (RQ2).
We turn to each key theme in order, noting that the verbatim quotes are from the pre-service teachers whose voices capture the significant sentiment.
Tracking students' dispositions to learning and their learning needs

Pre-service teachers would like to make visible students' dispositions to learning, their moods and emotions (we use the term disposition to draw in collective notions of behaviour, mood and emotion) around their learning and aspects of their learning prior to their entry to the classroom, in order to personalise learning design. They saw potential in technology that would enable understanding of these features so as to design learning plans and classroom activities accordingly:

As a teacher you do not know what happens to students before they get to school. If we only knew that they are not ready to do learning, they can do other things. At times, they are restless, but you do not know why until you find out they had a tough start to the day, and this is the reason why they did not engage in the lesson. A mood tracker that may be able to read them and signal how they feel – this would be on a teacher's computer – it would be a superpower. (Participant, Workshop 1, Table 2)

Another participant mentioned that exit tickets could be used "to gather data related to students' learning gaps, their struggles, their work completion time and what they found most useful" (Participant, Workshop 2, Table 2). Exit tickets are an invaluable informal diagnostic tool used to collect information about students' progress and feedback. As such, pre-service teachers and teachers alike value this information in order to make adjustments to their teaching practices.

Materiality of LA as an enabling technology and its alignment with learning design

The pre-service teachers also reflected on the resources – tangible material artefacts and technologies – they would like to, and do, use in practice.
This theme was important, referring to opportunities for gathering evidence in learning design activities, the learning itself and the design artefacts that support the implementation of LA as an enabling technology. Participants valued school resources, technological tools, syllabus and curriculum documents and students' portfolios. They also valued formal meetings and networking. One participant pinpointed that "schools share resources" (Participant, Workshop 1, Table 1). Here, collaborative online centralised databases for resources were common requests. This asset, they believed, would support designing learning activities, but it needs to be "curated, informative and tidy" (Participant, Workshop 1, Table 1). Another continued:

It would be possible to share what works and adapt it to teachers and students' needs. Teachers would be able to build on (other) teachers' ideas and learn from the best teachers, rather than working alone or going online to look for resources. (Participant, Workshop 2, Table 2)

The importance of accessing a variety of technological tools, including Kahoot for pre-assessment and Google Classroom for creating, distributing and grading students' assignments, was particularly emphasised. OneNote and Scootle were popular for planning lessons and were often used to collaborate with other pre-service teachers. Other dominant examples were Teachers Pay Teachers and On Butterfly Wings, which offer lesson ideas from other teachers. Google Chromebooks were popular and useful for "flipped learning" activities. This comment reflects a common response:

Google Chromebooks embed a whole pedagogy for students learning outside the classroom and then coming to class to do certain activities. If all students had access to it, teachers could gauge where students are in terms of content and cater lessons according to everyone's needs by differentiating learning.
(Participant, Workshop 2, Table 2)

The focus on syllabus and curriculum documents – on what is taught in classrooms – is noteworthy (and to be expected). Participants mentioned using colleagues' lesson plans and existing programmes and units of work available through the department or stage leader in a school. These ideally had relevant links to web resources that offered general guidelines, with a clear scope and sequence indicating the direction of teaching and learning sequences. For example, one mentioned:

How the program allows you to connect what you are doing, week by week, with other teachers to make sure that everyone is moving at the same pace. The document I used was made to easily build upon it. You can also take it further, if you want, by doing more or different things. (Participant, Workshop 1, Table 1)

In this theme, pre-service teachers referred to supervising teachers and to formal and informal collegial networks for the exchange of ideas and experiences. These structures provide an avenue for reviewing students' data and reflecting on what has been done and the strategies adopted. Additionally, in terms of evidence informing design, they also valued informal exchanges (i.e., verbal consultations, discussions with colleagues, personal networks and social media) that alerted them to design decisions, common issues and experiences. For example, one participant identified that "other teachers can be the best resources although sometimes they may not have many (tangible) resources" (Participant, Workshop 1, Table 1). There was a note of appreciation towards both pre-service teachers and teachers who, as essential human resources in learning design, make the difference in lesson design activities. Additionally, cooperation was deemed critical to leverage pre-service teachers' (varied) skills and to build on each other's work.
Human-centred, ethical, inclusive use of LA

Participants' understanding of evidence revolved around access to funding and quality resources (e.g., more (learning) support staff, more teachers, smaller classes, better and more effective technology, incursions and excursions). Each of these was mentioned during the evidence sharing stage of the workshop. These kinds of resources, participants suggested, would increase students' engagement and reduce the current ratio of students to teachers in the classroom. An acute sensitivity towards social justice, welfare concerns and social support for students emerged from the data analysis. One pleaded, "if a child has been neglected, it would be great to have the required resources to support him/her" (Participant, Workshop 1, Table 1). In line with this, another pre-service teacher thought it would be valuable to know about students' home life and have some prior knowledge of their background, stating:

We all recognise that knowing students' motivations and understanding their circumstances would be very useful. It is often something that you are guessing or assuming, but it makes a difference for the student to know something about his/her homelife. Having this information would help you to know how to deal with it and how you respond to his/her request since the student could actually be struggling. (Participant, Workshop 2, Table 1)

This concern supports the idea that a focus on students' welfare and best interests is particularly strong in the early years of teaching and, while on professional experience within a programme of teacher education, is often an overt apprehension. Provision of adequate tools to interpret data, access data and use it ethically was discussed in detail. Most pre-service teachers thought it would be good practice to guarantee access to information both to themselves and to the parents of the students they teach (who they believed were often forgotten).
Indeed, one participant stated:

Parents should have access to certain data, especially when kids suffer from significant behavioural concerns. This way, both teachers and parents can deal with the same behavioural issue. (Participant, Workshop 2, Table 1)

It was suggested that sharing data would strengthen partnerships with parents, providing better opportunities to support pre-service teachers, teachers and children at home. Pre-service teachers lamented that they did not know how to interpret data, perhaps "because it's not integrated enough in the pre-service teacher education programmes or while on professional experience in schools" (Participant, Workshop 2, Table 1). Indeed, while one said, "this data thing makes me feel I want to run a thousand miles" (Participant, Workshop 2, Table 2), another shared her interest in knowing more about data:

Learning to interpret the data, doing trend analysis, figuring out what is the best kind of data to use, what the data can show me about my class. However, you cannot rely on that too much. I saw how a teacher's grading was biased by a student's previous ranking. (Participant, Workshop 2, Table 2)

Although pre-service teachers shared a strong consensus on the limited use of data in schools, for some, the lack of clarity about what data is, how to define it and how it works left them uncertain how to proceed with its effective use. Concurrently, others shared the importance of engaging with data as an opportunity to gain insights into students' performance. The importance of knowing how to interpret data to avoid misinterpretation is reflected in the following comments:

Giving data access to people who do not have the right tool may have the detrimental consequence of spreading scary campaigns. (Participant, Workshop 2, Table 2)
There exists the risk of generalisation, which means that if something happens during an English class, there is the assumption that all English teachers are the same. (Participant, Workshop 2, Table 1)

In relation to the ethical use of data, pre-service teachers were prompted about the use of Scout – business intelligence software that provides reports as evidence across the learning, teaching and leadership domains (NSW Department of Education, n.d.). Here, they showed some resistance and were concerned about what this meant in terms of their accountability for students’ learning. This concern is reflected here:

Each child learns differently, so you can't really say that X student gets it wrong because of Y teacher. Teachers and students are human, and they are allowed privacy. Perhaps external circumstances may have an impact on the day of an exam. It comes down to a question of data reliability. (Participant, Workshop 2, Table 1)

The quote refers to the detrimental impact that data decontextualisation could have on teachers and students, suggesting that context matters. Context offers the potential for turning data into something relevant and meaningful that may prompt and justify action.

Discussion

This study investigated the implementation of a workshop design to explore how a group of pre-service teachers talked about learning design and data, and what resources and evidence would support their practice as they commence their teaching careers in primary and secondary school contexts. Findings were described in three key themes: tracking students’ dispositions to learning and their learning needs; the materiality of LA as an enabling technology and its alignment with learning design; and the human-centred, ethical and inclusive use of LA. These themes start a discussion of what matters in classroom data from the perspectives of these soon-to-be members of the teaching profession.
Further research that is close to the practices and perceptions of pre-service teachers is crucial in order to design LA tools and the commensurate teacher education programmes on how to use LA in K-12 education. The discussion in this paper presents a stepping stone towards developing protocols and courses in this critical area of pre-service teacher professional learning in universities.

Tracking students’ dispositions to learning and learning needs

Pre-service teachers would like to draw on evidence to design learning that understands and takes into account the learning dispositions of students. In their deliberations, there were concerns about anticipating students' behaviour, prior to coming to class or as classroom learning commences. It is reasonable to suggest that exposing and building the capacity of pre-service teachers to understand more about LA technologies and how data is collected is timely (Luckin, 2018). This is particularly significant given the potential of emerging biometric technologies, which are being applied globally in education contexts, to “dehumanise society, foreground race and gender, eliminate obscurity, increase the authoritarian nature of schooling, cascade the logic of automation and oppress marginalised group within schools” (Andrejevic & Selwyn, 2020, pp. 117–124). The insights provided indicate a desire from pre-service teachers to gain a close appreciation of the contextual aspects of their students’ lives that might impact learning and the ways that students interact with particular learning activities. Moreover, they flag specific kinds of tools and techniques that they would like to use in practice. These insights are significant in identifying key areas in which pre-service teachers may need support in designing tasks with technologies, taking into account the potential as well as the risks of these technologies.
Materiality of LA as an enabling technology and its alignment with learning design

The pre-service teachers highlighted a number of artefacts that they use in their learning design and delivery. These moved between the analysis of students’ work itself – a common target of LA – and the ways in which they drew on colleagues’ work, shared curricula documents and professional networks. The latter are typically not a current target of LA, which suggests the need to incorporate these considerations into emerging LA technologies and to support educators in aligning these kinds of resources with the capabilities of technologies. Moreover, the pre-service teachers mentioned a number of specific tools, including Kahoot, Google Classroom, OneNote and Scootle; these technologies of practice are not typically sites of LA research, suggesting the significance of particular tools, designs and artefacts of practice in closing the technology integration gap (Knight et al., 2020).

Human-centred, ethical, inclusive use of LA

Finally, pre-service teachers discussed in detail aspects related to data interpretation, data access and ethical issues, which are critical for the successful appropriation of LA into teaching and learning practice. This emphasises the contribution of this paper to the emerging interest in creating human-centred approaches to designing LA tools imbued with teachers’ voices (Wise & Jung, 2019). Some attention has been paid to how researchers should interact with teachers to co-create LA innovations as partners instead of seeing them as users (see examples summarised by Buckingham Shum et al., 2019). However, we particularly emphasise how a participatory approach, embedded as professional development activities for pre-service teachers, can enable them to develop their own professional data practices.
Activities such as these would facilitate the identification of local data needs, teaching practices, data access restrictions and ethical implications. These requirements, considered soft barriers, do not relate directly to the computational realm of LA but can limit its adoption (Drachsler & Greller, 2012). They can also be even more critical than generating more complex data models (Siemens, 2013).

Conclusion and future work

All teachers are increasingly expected to use data as a form of evidence in their work in schools. To do this effectively, they must make use of the available resources, such as mandated curricula, software technologies and school-based learning programmes. Capability to engage in learning design using these kinds of resources varies across the profession. Increasingly, professional expectations require enhancing in-service teachers’ data literacy, involving the collection, analysis and deeper understanding of all forms of student data. Pre-service teachers’ data literacy is not well understood, and the field of LA can enhance the direction of professional development. More research is needed to understand what pre-service teachers think about the use of data in classrooms and the specific kinds of resources they might use in practice, which could guide the integration of technologies such as LA tools. Research reported in this paper focused on the design of a human-centred workshop. The results reveal the perspectives of two groups of pre-service teachers and what kinds of data are important to them at this stage of their professional development. The insights have important implications for the design of LA tools, for approaches to integrating them with practice, and for the development of professional development for teachers in situated data literacy with respect to their own perceptions of data and its potential.
The findings foreground that to enable pre-service teachers to make better-informed decisions, data should be meaningful and relevant to their evolving practice. The findings contribute to an important conversation about the need for the participation of pre-service teachers in LA tool development and to questions about pre-service teachers’ data literacy and learning design practices. The study has also commenced a broader discussion about pre-service teachers' understanding of the use of data in schools. Although this small-scale study provides insights into the particular context of this group of pre-service teachers, the findings are not generalisable. Rather, it was a study of a workshop design and implementation, with the results exemplifying the particulars gleaned from a group of participants (Stake, 1995). The workshop design valued the pre-service teachers’ voices as a key step in appreciating what future practitioners at the commencement of their careers would like to know more about in this critical stage of their professional development (Hunter, 2015). Key constraints of the study stemmed from time limitations, with the data collected over 2 days; hence its illustrative rather than definitive nature. Ideally, it would have been useful to have presented each group’s findings back to them in a further forum to add depth to the key findings and interpretations. More research is needed to investigate pre-service teachers’ perspectives on classroom data in order to address their concerns, so that data use is informed by evidence rather than by possible bias and incorrect assumptions. Future studies may seek to involve other stakeholders, such as teacher educators from university programmes, students and parents, to gain perceptions of their understanding of data and its role in learning in schools.
Taking such broader perspectives into account would complement and begin to close a gap in what we know about data in practice in pre-service teacher education and its relationship to data and LA in schools.

Acknowledgements

We would like to express our gratitude to all participants at the two universities who took part in the study and especially to the unit coordinators at both sites for giving us access to the students in their classes. We appreciate funding for the research from the STEM Education Futures Research Centre in the Faculty of Arts & Social Sciences at the University of Technology Sydney.

References

Alhadad, S. S., Thompson, K., Knight, S., Lewis, M., & Lodge, J. M. (2018). Analytics-enabled teaching as design: Reconceptualisation and call for research. In A. Pardo & K. Bartimote-Aufflick (Eds.), Proceedings of the 8th International Conference on Learning Analytics and Knowledge (pp. 427–435). Association for Computing Machinery. https://doi.org/10.1145/3170358.3170390

Andrejevic, M., & Selwyn, N. (2020). Facial recognition technology in schools: Critical questions and concerns. Learning, Media and Technology, 45(2), 115–128. https://doi.org/10.1080/17439884.2020.1686014

Bennett, S., Agostinho, S., & Lockyer, L. (2015). Technology tools to support learning design: Implications derived from an investigation of university teachers' design practices. Computers & Education, 81, 211–220. https://doi.org/10.1016/j.compedu.2014.10.016

Bennett, S., Agostinho, S., & Lockyer, L. (2017). The process of designing for learning: Understanding university teachers’ design work. Educational Technology Research and Development, 65(1), 125–145. https://doi.org/10.1007/s11423-016-9469-y

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.
https://doi.org/10.1191/1478088706qp063oa

Brown, M., McCormack, M., Reeves, J., Brook, D. C., Grajek, S., Alexander, B., Bali, M., Bulger, S., Dark, S., & Engelbert, N. (2020). 2020 EDUCAUSE Horizon report: Teaching and learning edition. EDUCAUSE. https://library.educause.edu/-/media/files/library/2020/3/2020_horizon_report_pdf.pdf

Buckingham Shum, S., Ferguson, R., & Martinez-Maldonado, R. (2019). Human-centred learning analytics. Journal of Learning Analytics, 6(2), 1–9. https://doi.org/10.18608/jla.2019.62.1

Cardno, C., Tolmie, E., & Howse, J. (2019). New spaces – new pedagogies: Implementing personalised learning in primary school innovative learning environments. Journal of Educational Leadership, Policy and Practice, 33(1), 111–124. https://doi.org/10.21307/jelpp-2017-010

Chen, B., & Zhu, H. (2019). Towards value-sensitive learning analytics design. In D. Azcona & R. Chung (Eds.), Proceedings of the 9th International Conference on Learning Analytics & Knowledge (pp. 343–352). Association for Computing Machinery. https://doi.org/10.1145/3303772.3303798

Colvin, C., Rogers, T., Wade, A., Dawson, S., Gasevic, D., Buckingham Shum, S., Nelson, K., Alexander, S., Lockyer, L., & Kennedy, G. (2016). Student retention and learning analytics: A snapshot of Australian practices and a framework for advancement (Final Report 2016). Australian Government Office for Learning and Teaching. http://hdl.handle.net/10453/117173

Cope, B., & Kalantzis, M. (2016). Big data comes to school: Implications for learning, assessment, and research. AERA Open, 2(2). https://doi.org/10.1177/2332858416641907

Drachsler, H., & Greller, W. (2012). The pulse of learning analytics: Understandings and expectations from the stakeholders. In S. Dawson & C. Haythornthwaite (Eds.), Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 120–129). Association for Computing Machinery. https://doi.org/10.1145/2330601.2330634

Earl, L. M., & Timperley, H. (2009).
Using conversations to make sense of evidence: Possibilities and pitfalls. In L. M. Earl & H. Timperley (Eds.), Professional learning conversations: Challenges in using evidence for improvement (pp. 121–126). Springer. https://doi.org/10.1007/978-1-4020-6917-8_10

Freeman, A., Adams Becker, S., & Cummins, M. (2017). NMC/CoSN Horizon report: 2017 K–12 edition. The New Media Consortium. https://www.learntechlib.org/p/182003/

Gasevic, D., Tsai, Y.-S., Dawson, S., & Pardo, A. (2019). How do we start? An approach to learning analytics adoption in higher education. The International Journal of Information and Learning Technology, 36(4), 342–353. https://doi.org/10.1108/ijilt-02-2019-0024

Goodyear, P., & Dimitriadis, Y. (2013). In medias res: Reframing design for learning. Research in Learning Technology, 21. https://doi.org/10.3402/rlt.v21i0.19909

Groothuijsen, S. E. A., Bronkhorst, L. H., Prins, G. T., & Kuiper, W. (2020). Teacher-researchers' quality concerns for practice-oriented educational research. Research Papers in Education, 35(6), 766–787. https://doi.org/10.1080/02671522.2019.1633558

Gummer, E. S., & Mandinach, E. B. (2015).
Building a conceptual framework for data literacy. Teachers College Record, 117(4), 1–22. http://www.tcrecord.org/Content.asp?ContentId=17856

Henderson, J., & Corry, M. (2020). Data literacy training and use for educational professionals. Journal of Research in Innovative Teaching & Learning. https://doi.org/10.1108/jrit-11-2019-0074

Holstein, K., McLaren, B. M., & Aleven, V. (2019). Co-designing a real-time classroom orchestration tool to support teacher-AI complementarity. Journal of Learning Analytics, 6(2), 27–52. https://doi.org/10.18608/jla.2019.62.3

Hunter, J. (2015). Technology integration and high possibility classrooms (1st ed.). Routledge. https://doi.org/10.4324/9781315769950

Hunter, J. (2017). Switching middle school teachers onto the STEM disciplines using a pedagogical framework for technology integration: The case for high possibility classrooms in Australia. In P. Resta & S. Smith (Eds.), Proceedings of the Society for Information Technology & Teacher Education International Conference (pp. 2115–2124). Association for the Advancement of Computing in Education. https://www.learntechlib.org/p/177504

Jørnø, R. L., & Gynther, K. (2018). What constitutes an “actionable insight” in learning analytics? Journal of Learning Analytics, 5(3), 198–221. https://doi.org/10.18608/jla.2018.53.13

Knight, S., Gibson, A., & Shibani, A. (2020). Implementing learning analytics for learning impact: Taking tools to task. The Internet and Higher Education, 45, 1–17. https://doi.org/10.1016/j.iheduc.2020.100729

Lingard, B., Thompson, G., & Sellar, S. (2015). National testing in schools: An Australian assessment. Routledge.

Lockyer, L. (2018). Enhancing teaching and learning through design practice. In Teaching practices that make a difference: Insights from research: Proceedings of the ACER Research Conference 2018 (pp. 71–75). Australian Council for Educational Research. https://research.acer.edu.au/research_conference/RC2018/13august/9

Lockyer, L., & Dawson, S. (2011).
Learning designs and learning analytics. In G. Siemens & P. Long (Eds.), Proceedings of the 1st International Conference on Learning Analytics and Knowledge (pp. 153–156). Association for Computing Machinery. https://doi.org/10.1145/2090116.2090140

Lodge, J. M., Horvath, J. C., & Corrin, L. (2018). Learning analytics in the classroom: Translating learning analytics research for teachers. Routledge.

Luckin, R. (2018). Machine learning and human intelligence: The future of education for the 21st century. UCL Institute of Education Press.

Mangaroska, K., & Giannakos, M. (2019). Learning analytics for learning design: A systematic literature review of analytics-driven design to enhance learning. IEEE Transactions on Learning Technologies, 12(4), 516–534. https://doi.org/10.1109/tlt.2018.2868673

McKenney, S., & Mor, Y. (2015). Supporting teachers in data‐informed educational design. British Journal of Educational Technology, 46(2), 265–279. https://doi.org/10.1111/bjet.12262

Mockler, N. (2017). Classroom ready teachers? Some reflections on teacher education in Australia in an age of compliance. Teacher Education and Practice, 30(2), 335–339.

Mockler, N., & Stacey, M. (2019, March 18). What’s good ‘evidence-based’ practice for classrooms? We asked the teachers, here’s what they said. EduResearch Matters. https://www.aare.edu.au/blog/?p=3844

NSW Department of Education. (n.d.). Scout. https://education.nsw.gov.au/about-us/educational-data/scout

Persico, D., & Pozzi, F. (2015). Informing learning design with learning analytics to improve teacher inquiry. British Journal of Educational Technology, 46(2), 230–248. https://doi.org/10.1111/bjet.12207

Prestigiacomo, R., Hadgraft, R., Lockyer, L., Knight, S., van den Hoven, E., Martinez-Maldonado, R., & Hunter, J. (2020). Learning-centred translucence: An approach to understand how teachers talk about classroom data. In A. Ahmad & I.
Jivet (Eds.), Proceedings of the 10th International Learning Analytics and Knowledge Conference (pp. 100–105). Association for Computing Machinery. https://doi.org/10.1145/3375462.3375475

Prieto, L. P., Rodriguez-Triana, M. J., Martinez-Maldonado, R., Dimitriadis, Y., & Gasevic, D. (2019). Orchestrating learning analytics (OrLA): Supporting inter-stakeholder communication about adoption of learning analytics at the classroom level. Australasian Journal of Educational Technology, 35(4), 14–33. https://doi.org/10.14742/ajet.4314

Sanders, E. B.-N. (2000). Generative tools for co-designing. In S. A. R. Scrivener, L. J. Ball, & A. Woodcock (Eds.), Collaborative design (pp. 343–352). Springer. https://doi.org/10.1007/978-1-4471-0779-8_1

Schildkamp, K., & Kuiper, W. (2010). Data-informed curriculum reform: Which data, what purposes, and promoting and hindering factors. Teaching and Teacher Education, 26(3), 482–496. https://doi.org/10.1016/j.tate.2009.06.007

Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist, 57(10), 1380–1400. https://doi.org/10.1177/0002764213498851

Society for Learning Analytics Research.
(2020, March 31). Special section on learning analytics for primary and secondary schools – call for papers. https://learning-analytics.info/index.php/JLA/announcement/view/161

Stake, R. E. (1995). The art of case study research. Sage.

Thompson, K., Alhadad, S. S. J., Buckingham Shum, S., Howard, S., Knight, S., Martinez-Maldonado, R., & Pardo, A. (2018). Connecting expert knowledge in the design of classroom learning experiences. In J. M. Lodge, J. C. Horvath, & L. Corrin (Eds.), Learning analytics in the classroom: Translating learning analytics research for teachers (pp. 111–128). Routledge.

Tsai, Y.-S., & Gasevic, D. (2017). Learning analytics in higher education – challenges and policies: A review of eight learning analytics policies. In A. Wise, P. Winne, & G. Lynch (Eds.), Proceedings of the Seventh International Learning Analytics and Knowledge Conference (pp. 233–242). Association for Computing Machinery. https://doi.org/10.1145/3027385.3027400

Wasson, B., & Hansen, C. (2015). Data literacy and use for teaching. In P. Reimann, S. Bull, & M. Kickmeier-Rust (Eds.), Measuring and visualizing learning in the information-rich classroom (pp. 72–89). Routledge.

Wilson, A., Watson, C., Thompson, T. L., Drew, V., & Doyle, S. (2017). Learning analytics: Challenges and limitations. Teaching in Higher Education, 22(8), 991–1007. https://doi.org/10.1080/13562517.2017.1332026

Wise, A. F., & Jung, Y. J. (2019). Teaching with analytics: Towards a situated model of instructional decision-making. Journal of Learning Analytics, 6(2), 53–69. https://doi.org/10.18608/jla.2019.62.4

Wolf, P. (2007). A model for facilitating curriculum development in higher education: A faculty-driven, data-informed, and educational developer–supported approach. New Directions for Teaching and Learning, 2007(112), 15–20. https://doi.org/10.1002/tl.294

Wolff, A., Gooch, D., Cavero Montaner, J. J., Rashid, U., & Kortuem, G. (2016).
Creating an understanding of data literacy for a data-driven society. The Journal of Community Informatics, 12(3), 9–26. https://doi.org/10.15353/joci.v12i3.3275

Corresponding author: Rita Prestigiacomo, rita.prestigiacomo@sydney.edu.au

Copyright: Articles published in the Australasian Journal of Educational Technology (AJET) are available under Creative Commons Attribution Non-Commercial No Derivatives Licence (CC BY-NC-ND 4.0). Authors retain copyright in their work and grant AJET right of first publication under CC BY-NC-ND 4.0.

Please cite as: Prestigiacomo, R., Hunter, J., Knight, S., Martinez Maldonado, R., & Lockyer, L. (2020). Data in practice: A participatory approach to understanding pre-service teachers’ perspectives. Australasian Journal of Educational Technology, 36(6), 107–119. https://doi.org/10.14742/ajet.6388