Australasian Journal of Educational Technology, 2018, 34(2). 117

Informing learning design through analytics: Applying network graph analysis

Dirk Ifenthaler
University of Mannheim and Curtin University

David Gibson, Eva Dobozy
Curtin University

Learning design has traditionally been thought of as an activity occurring prior to the presentation of a learning experience or a description of that activity. With the advent of near real-time data and new opportunities for representing the decisions and actions of learners in digital learning environments, learning designers can now apply dynamic learning analytics information on the fly in order to evaluate learner characteristics, examine learning designs, analyse the effectiveness of learning materials and tasks, adjust difficulty levels, and measure the impact of interventions and feedback. In a case study with 3550 users, navigation sequence and network graph analyses demonstrate a potential application of learning analytics design. Implications based on the case study show that integrating analytics data into the design of learning environments is a promising approach.

Introduction

One of the next frontiers in educational research may be a synergistic and dynamic relationship between learning design and learning analytics. These two perspectives – design and analytics – have heretofore primarily operated independently of each other, separated by time and space due to the complexity of dealing with interactional data in educational settings. However, with the advent of near real-time data and new opportunities for representing the decisions and actions of learners in digital learning environments, learning designers may now apply dynamic learning analytics information to evaluate learner characteristics, examine learning designs, analyse the effectiveness of learning materials and tasks, adjust difficulty levels, and measure the impact of interventions and feedback.
This new level of sophisticated information about learners, learning processes, and complex interactions within the learning environment has the potential to provide valuable insights for on-the-fly educational planning and curricular decision-making fully integrated into the digital learning experience. Although learning design has traditionally been thought of as an activity occurring prior to the presentation of a learning experience or a description of that activity, learning analytics, which employs educational data to provide summative or predictive insights about the effect and effectiveness of various elements and features of learning environments, can also be employed in (near) real time (Martin & Whitmer, 2016). The potential for integrating the newly available and dynamic information from ongoing analysis into learning design requires new perspectives on learning and teaching data processing and analysis as well as advanced theories, methods, and tools for supporting dynamic learning design processes (see authors such as Behrens, Mislevy, Dicerbo, & Levy, 2012; Clarke-Midura, Code, Dede, Mayrath, & Zap, 2012; Dede, 2008; Mayrath, Clarke-Midura, & Robinson, 2012; Quellmalz et al., 2012). Valid pedagogical recommendations may be suggested on the fly as learning analytics methodologies and visualisations evolve and as reliable tools become available and ready for classroom practice (Kevan & Ryan, 2016). Integrating educational data and analysis into the design or reuse of learning and teaching activities and sequences will be referred to here as learning analytics design (Ifenthaler, 2017). This paper reports on a case study demonstrating the synergetic relationship between learning design and learning analytics, with a focus on the application of navigation sequence and network graph analysis. In particular, it illustrates how analytics may support the design of learning environments, followed by a discussion of implications and a conclusion.
Learning design and analytics

Designing for formal learning and teaching includes three major questions: What shall be learned? How shall it be learned? How shall it be assessed? Although these questions seem easy to understand, they are by no means easy to answer. The design processes for formal learning and teaching are complex and context dependent. They will always change in alignment with the alteration of educational goals and the learning-dependent progression of cognitive structure (Ifenthaler & Seel, 2005). Yet, the design of formal learning and teaching is not simply a matter of answering the above-stated three questions during the construction phase. Goodyear and Dimitriadis (2013) point out that learning design also needs to be accepted as an integral, ongoing component of educational practice. Not surprisingly, the field of learning design research is maturing, but one of its key challenges is the multiplicity of conceptualisations and definitions in use at present (Dobozy, 2013; Goodyear & Dimitriadis, 2013). Moreover, it is possible to confuse instructional design with learning design. Whereas instructional design is rooted in behaviourist learning theories and seems to focus, on the one hand, on learning products, such as learning objects and machine-readable representations, and, on the other hand, on delivery systems and the advancement of the automation of designs, learning design is rooted in constructivist learning theories and seems to focus on making the design process explicit and shareable (Mor, Craft, & Maina, 2015). Hence, the approach of instructional design encompasses systematic and analytical procedures for optimising learning and performance, such as analysis, planning, development, implementation, and evaluation (Branch, 2009; Dick, 1987; Gagné, 1965, 1985; Gustafson & Tillman, 1991). Instructional design models have been especially successful for improving large instructional programs.
However, instructional design is often criticised for being too narrow and inflexible (Dijkstra & Leemkuil, 2008) and difficult to apply to small-scale educational settings (Conole, 2013). The different historical origins, theoretical underpinnings, and perceived goals of instructional design and learning design will need to be understood to ensure that the synergetic relationship between learning design and learning analytics is not compromised through conceptual confusion. In an attempt to build cohesion and support the development of a common understanding of learning design, Dobozy's (2013) list of definitions of learning design is extended to include more recent definitional constructs further exemplifying the roots of learning design (see Table 1).

Table 1
Overview of definitions of learning design

Agostinho (2006, p. 3): A learning design is a representation of teaching and learning practice documented in some notational form so that it can serve as a model or template adaptable by a teacher to suit his/her context.

Conole (2008, p. 191): The range of activities associated with creating a learning activity and crucially provides a means of describing learning activities.

Conole (2013, p. 121): A methodology for enabling teachers/designers to make more informed decisions in how they go about designing learning activities and interventions, which is pedagogically informed and makes effective use of appropriate resources and technologies. This includes the design of resources and individual learning activities right up to curriculum-level design. A key principle is to help make the design process more explicit and shareable. Learning design as an area of research and development includes both gathering empirical evidence to understand the design process, as well as the development of a range of learning design resources, tools, and activities.

Dalziel (2008, p. 8): A framework to describe a sequence of educational activities in an online environment.

Dobozy (2013, p. 68): A way of making explicit epistemological and technological integration attempts by the designer of a particular learning sequence or series of learning sequences.

Hale (2016, p. 1): Learning design is the process of designing learning experiences (planning, structuring, sequencing) through facilitated activities that are pedagogically informed, explicit, and make better use of technologies in teaching.

Koper (2006, p. 13): The description of the teaching-learning process that takes place in a unit of learning. The key principle in learning design is that it represents the learning activities and the support activities that are performed by different persons (learners, teachers) in the context of a unit of learning. These activities can refer to different learning objects that are used during the performance of the activities (e.g., books, articles, software programmes, pictures), and they can refer to services (e.g., forums, chats, wikis) that are used to collaborate and to communicate in the teaching-learning process.

Mor & Craft (2012, p. 86): Learning design is the creative and deliberate act of devising new practices, plans and activities, resources and tools aimed at achieving particular educational aims in a given context.

Papadakis (2012, p. 258): The creation of sequences of learning activities, which involve groups of learners interacting within a structured set of collaborative environments.

Goodyear and Retalis (2010) emphasise that good educational design is the missing link between the learning sciences and the learning environments needed for success in the 21st century.
Design patterns may offer a way of capturing design experience, including (a) connecting recognisable problems with tested solutions, (b) relating design problems at any scale level (e.g., micro, meso, and macro) and connecting design solutions across scale levels, (c) supplementing design with research-based evidence, (d) balancing guidance with creativity, (e) having a wide application of designs while being customisable to meet specific needs, and (f) improving design performance while also educating the designer (Goodyear & Retalis, 2010). Dalziel and colleagues (2016, p. 1) noted that:

The ultimate goal of Learning Design is to convey great teaching ideas among educators in order to improve student learning … successful sharing of good teaching ideas can lead not only to more effective teaching, but also to more efficient preparation for teaching.

Learning design aims to provide a description of optimal designs for learning and teaching with a potential for reuse and adaptation of designs; however, it does not offer real-time insights into how students engage and learn (Lockyer, Heathcote, & Dawson, 2013). Therefore, linking design for learning with learning analytics may provide actionable information for optimising learning environments in near real time. Hence, we propose that the next frontier in educational research may be a synergistic relationship between learning design and learning analytics. Learning analytics use available information from various educational sources, including learner characteristics, learner behaviour, and learner performance, as well as detailed information on the learning design (e.g., sequencing of events, task difficulty), for supporting pedagogical interventions and redesigns of learning environments (Berland, Baker, & Blikstein, 2014). Learning analytics are expected to provide the pedagogical and technological background for producing interventions at all times during the learning process.
Students benefit from learning analytics through optimised learning pathways, personalised interventions, and real-time scaffolds (Gašević, Dawson, & Siemens, 2015). Learning analytics provide facilitators with detailed analysis and monitoring at the individual student level, allowing them to identify particularly unstable factors, such as losses of motivation or attention, before they occur (Gašević, Dawson, Rogers, & Gašević, 2016). However, ethical and privacy issues have been identified as a major concern with the adoption of learning analytics (Ifenthaler & Schumacher, 2016; Slade & Prinsloo, 2013). Learning analytics should be aligned with organisational principles and values, include all stakeholders, collect, use, and analyse data transparently and free of bias, and be beneficial for all involved stakeholders (Ifenthaler & Tracey, 2016; Pardo & Siemens, 2014).

Three perspectives of learning analytics design provide summative, real-time, and predictive insights (Ifenthaler, 2017):

(1) The summative perspective of learning analytics design (e.g., at the end of a course or a learning unit) may reveal the impact of interventions for individual students and how the interventions support the student's learning progress toward a specific learning outcome. Different pedagogical models may be identified and further developed through alignment or misalignment of planned interventions and actual learning behaviour. A curricular perspective may help to increase the quality of a study program and identify gaps or redundancies in it (Lockyer et al., 2013).

(2) The real-time perspective of learning analytics design (e.g., while a course is running or during a learning activity) may help to provide learners with resources, help, or scaffolds for supporting their ongoing learning processes. The availability and use of learning materials and tasks can be monitored and adjusted to the learners' needs. Individual and group characteristics may guide the selection and adaptation of difficulty levels and required resources for better learning outcomes (Ifenthaler & Widanapathirana, 2014).

(3) The predictive perspective of learning analytics design (e.g., future learning choices, activities, units, and courses) may help to model different pathways of learning. Given specific learner characteristics and behaviour, algorithms can provide optimal sequences of learning events to better cater for individual needs and preferences of the learner. Critical events during learning may be predicted and specific interventions may be provided to avoid course failure and dropouts (Mah, 2016).

Learning analytics design is thus expected to generate valuable insights for planning and optimising pedagogical designs, including adapting and optimising the sequencing of activities on the fly (Ifenthaler, 2017). The synergetic relationship between learning design and learning analytics exemplifies the notion that formal education in the 21st century, amid ever-changing cultural and technological conditions, has become a design science (Laurillard, 2012). Adaptation and optimisation of learning and teaching may occur, for example, based on educator-selected benchmarks that help to identify alignment or misalignment towards learning outcomes. In addition, detailed insights into pedagogical processes may facilitate micro interventions whenever the learner needs them (Bannert, 2009; Ifenthaler, 2012; Ifenthaler & Schmidt, 2010; van den Boom, Paas, van Merriënboer, & van Gog, 2004).

Graph theory analysis

Analysing learner interaction during formal learning and teaching processes can provide valuable insights into patterns of learning, interests in specific content, or unexpected communicative behaviours.
In addition, as pointed out here, graph theory analysis, a form of network analysis (Brandes & Erlebach, 2005), may reveal significant issues of learning design (see, for example, the prototype epistemic network analysis in Shaffer et al., 2009). The underlying concept is that suitable learning and teaching interactions in formal settings (e.g., well defined, digital, or represented as learning design patterns) can be represented as a graph, that is, a set of vertices whose relationships are represented by a set of edges. The elements of a graph and their appertaining graphical measures are defined by the methods of graph theory (Diestel, 2000). Various measures from graph theory enable an in-depth analysis of learner interaction in learning environments. Common structural measures include (Ifenthaler, 2010):

• number of vertices, indicating the number of concepts (vertices) within a graph;
• number of edges, indicating the number of links (edges) within a graph;
• connectedness, indicating how closely the concepts and links of the graph are related to each other;
• ruggedness, indicating whether non-linked vertices of a graph exist;
• diameter, indicating how large a graph is;
• cyclic, indicating the existence of paths within a graph that return to their start vertex;
• number of cycles, indicating the number of cycles within a graph;
• average degree of vertices, indicating the average degree of all incoming and outgoing edges of all vertices within a graph.
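To make these measures concrete, a minimal sketch in plain Python (not the tooling used in the study; the function name and data layout are illustrative assumptions) computes a subset of them – vertex and edge counts, connectedness, isolated vertices (ruggedness), diameter, and average degree – from a vertex list and an undirected edge list:

```python
from collections import defaultdict, deque

def graph_measures(vertices, edges):
    """Compute a subset of the structural measures listed above for an
    undirected graph; names and output format are illustrative."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    def bfs(start):
        # Shortest path lengths (in edges) from start to all reachable vertices.
        dist = {start: 0}
        queue = deque([start])
        while queue:
            node = queue.popleft()
            for nxt in adj[node]:
                if nxt not in dist:
                    dist[nxt] = dist[node] + 1
                    queue.append(nxt)
        return dist

    isolated = [v for v in vertices if not adj[v]]  # "ruggedness"
    connected = bool(vertices) and len(bfs(vertices[0])) == len(vertices)
    # Diameter: longest shortest path; defined here only for connected graphs.
    diameter = max(max(bfs(v).values()) for v in vertices) if connected else None
    avg_degree = sum(len(adj[v]) for v in vertices) / len(vertices) if vertices else 0.0
    return {
        "vertices": len(vertices),
        "edges": len(edges),
        "connected": connected,
        "isolated": isolated,
        "diameter": diameter,
        "average_degree": avg_degree,
    }
```

For example, a triangle plus one isolated vertex yields 4 vertices, 3 edges, one non-linked vertex, no defined diameter, and an average degree of 1.5.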
Besides the measures of graph theory, which account for structural properties of learner interaction in learning environments, a learning design may also account for content-specific features, that is, semantic properties such as the words used by learners, concepts, representations, and other artefacts utilised by a learner to communicate, problem-solve, and collaborate (Ifenthaler, 2010, 2014; Pirnay-Dummer, Ifenthaler, & Spector, 2010).

Case study

This case study aims to demonstrate how the analysis of navigation patterns and network graph analysis could inform the learning design of self-guided digital learning experiences. In particular, two research questions were addressed: (1) Can navigation patterns identify individual user paths and contribute to optimised learning design? (2) Do visualisations of network graphs help to understand user patterns within a digital learning environment? Ethics approval for the case study has been obtained.

Context

The Curtin Challenge digital learning platform (http://challenge.curtin.edu.au) supports individual and team-based learning via gamified, challenge-based, open-ended, inquiry-based learning experiences that integrate automated feedback and rubric-driven assessment capabilities. The Challenge platform is an integral component of Curtin University's digital learning environment along with the Blackboard learning management system and the edX MOOCs platform. The Challenge development team at Curtin Learning and Teaching are working towards an integrated authoring system across all three digital learning environments with the view of creating reusable and extensible digital learning experiences. Curtin Challenge currently includes three sets of content modules: Leadership, Careers and English Language Challenge. Since 2015, over 2600 badges have been awarded for the completion of a challenge.
This case study includes analysis from the Careers Challenge, which has 12 modules that can normally be completed in one hour or less. The design features of each module contain approximately five activities that might include one to three different learner interactions. For example, the module "Who am I" in the Careers Challenge (see Figure 1) is a collection of five web pages (called activities) containing learning interactions, such as choosing from among options, writing a short response to a prompt, spinning a wheel to create random prompts, creating, organising and listing ideas, and matching items. Each page can contain one or several such interactions, and the learner does not have to submit the page in order for the data to be captured. Data is constantly being captured, which creates information about the timing, sequence, and completeness as well as the content of the interactions. The data record is thus highly granular, providing an opportunity to examine the dynamics of the activity as well as the contents of the artefacts created by the learner for every click on every page for every user.

Figure 1. Careers Challenge start page

The administrative dashboard documents that over 16,000 users have undertaken over 194,000 activities in over 35,000 modules and earned over 6800 badges since January 2015 (see Figure 2).

Figure 2. Administrative dashboard for Leadership and Careers Challenges

Analytics snapshot of the case study focusing on Careers Challenge

Analytics data for the presented case study includes 2,753,142 database rows. Overall, 3550 unique users registered and completed a total of 14,587 navigation events within a period of 17 months. Figure 3 provides an overview of modules started (M = 3427, SD = 2880) and completed (M = 2903, SD = 2303) for the Careers Challenge.
The average completion rate for the Careers Challenge was 87%. The most frequently started module was "Who am I?" (10,461), followed by the module "Resumes" (7996). The module "Workplace Rights and Responsibilities" showed the highest completion rate of 96%, followed by the module "Interviews" (92%).

Figure 3. Module completion of Careers Challenge

All 60 activities across the 12 modules of the Careers Challenge were included in the analysis. The average completion rate for the 60 activities was 89% (M = 580, SD = 476). The most frequently started activity was "Why is Self-awareness Important for your Career?" (3225), which is part of the "Who am I?" module. The activity "How do People see You?" within the module "Interviews" showed the highest completion rate of 99%.

Activity navigation sequence

The navigation sequence identifies individual user paths within the learning environment. It uses the timestamp information to identify at which time the user navigated from one learning activity to the next. For example, user 162 started at activity 46 ("Why is Self-Awareness Important for your Career?"; part of the module "Who am I?"), followed by activities 47 ("Career Values"; part of the module "Who am I?"), 48 ("Self-Awareness in Action"; part of the module "Who am I?"), and 26 ("What are selection criteria?"; part of the module "Selection Criteria"), which results in navigation sequence 162: 46, 47, 48, 26. For user 195, a total of 18 activities were identified: 195: 46, 47, 48, 49, 50, 36, 38, 39, 40, 21, 22, 23, 24, 31, 32, 33, 34, 35. The aggregation of navigation sequences identifies the most frequent patterns of navigation within the learning environment. Figure 4 shows the most frequent navigation sequences. Overall, 608 different navigation sequences were found. From a total of 3550 users, 591 users only interacted with learning activity 46.
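The derivation of navigation sequences from timestamped events, as described above, can be sketched in a few lines of Python; the tuple layout (user, activity, timestamp) is an illustrative assumption, not the platform's actual schema:

```python
from collections import Counter

def navigation_sequences(events):
    """Order each user's activity events by timestamp to obtain
    per-user navigation sequences (illustrative field layout)."""
    by_user = {}
    for user, activity, timestamp in sorted(events, key=lambda e: e[2]):
        by_user.setdefault(user, []).append(activity)
    return by_user

def most_frequent(sequences):
    """Aggregate identical sequences to surface common navigation patterns."""
    return Counter(tuple(seq) for seq in sequences.values()).most_common()

# The worked example from the text: user 162 visits 46, 47, 48, then 26.
events = [(162, 46, 1), (162, 47, 2), (162, 48, 3), (162, 26, 4), (195, 46, 1)]
print(navigation_sequences(events)[162])  # [46, 47, 48, 26]
```

Aggregating the resulting sequences with `most_frequent` yields the kind of frequency ranking shown in Figure 4.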
More complex patterns including multiple learning activities were found less frequently.

Figure 4. Aggregated navigation sequence

Activity network graph analysis

The network analysis identifies user paths within the learning environment and visualises them as a network graph on the fly. The dashboard visualisations help the learning designer to identify specific patterns of learners and may reveal problematic learning instances. The nodes of the network graph represent individual interactions. The edges of the network graph represent directed paths from one interaction to another. The indicators on the edges represent the frequency of users taking the path from one interaction to another and, in parentheses, the percentage of users who took the path. An aggregated network graph shows the overall navigation patterns of all users. A network graph can be created for each individual user, for selected groups of users (e.g., with specific characteristics), or for all users of the learning environment. Updates of the network graph are generated on the fly. Figure 5a shows a network graph that includes paths taken by 1000 or more users, reflecting a linear navigation structure within a single module ("Who am I?"). Figure 5b shows part of a network graph including paths taken by 100 or more users. Here, navigation across modules is visible.

Figure 5. Network graphs of two frequent navigation patterns

The aggregation of all individual network graphs provides detailed insights into the navigation patterns of all users. Figure 6 shows the aggregated network graph including paths taken by all 3550 users, showing 14,587 navigation events. The five modules are highlighted using different colours.

Figure 6. Aggregated network graph
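The edge labels described above – a count plus a percentage in parentheses – can be derived from the aggregated sequences. The following sketch (an assumption about the computation, not the platform's code) counts directed transitions and expresses each as a share of all transitions leaving the source interaction:

```python
from collections import Counter, defaultdict

def edge_frequencies(sequences):
    """For each directed edge (a, b), return how many users moved from
    interaction a to b, and that count as a percentage of all transitions
    leaving a (an illustrative definition of the edge percentage)."""
    counts = Counter()
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[(a, b)] += 1
    out_totals = defaultdict(int)
    for (a, _), n in counts.items():
        out_totals[a] += n
    return {edge: (n, 100.0 * n / out_totals[edge[0]])
            for edge, n in counts.items()}
```

For three users following 46→47→48, 46→47, and 46→49, the edge (46, 47) would be labelled 2 (66.7%).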
Discussion

This case study aimed to demonstrate how the analysis of navigation patterns and network graph analysis can inform the learning design of self-guided digital learning experiences. Regarding the first research question, the findings of the case study show that navigation sequence analysis has potential for mapping the cognitive, social, and even physical states of the learner (Gibson & Clarke-Midura, 2015; Quellmalz et al., 2012). However, such mappings can only occur if the learning designers give specific affordances and incentives to the user and the sensor system gathering data collects appropriate evidence. With these constraints in mind, the initial authored content in the Careers Challenge represents an incremental step from typical online content – where the user reads content and then answers some questions, or perhaps creates lists of ideas when prompted. With regard to research question 2, the findings of the case study showed that visualisations of network graphs help to understand user patterns within the Careers Challenge. Even with open-ended freedom, only 608 sequences were evidenced by learners out of a potential number of hundreds of millions of sequences (e.g., the number of ordered sequences of 5 interactions drawn from 50 is 50 × 49 × 48 × 47 × 46 = 254,251,200, over 254 million). Of the 608 sequences created by users, far fewer have large percentages of the population traversing the same paths. For example, 17% of the total population gave one activity a try and then left the Challenge; another 16% engaged with a sequence of only four interactions and then exited. With the extremely small subspace traversed by users, it is perhaps understandable to think that there is meaning in that pattern (e.g., why are there not more sequences evidenced, and why these particular sequences?).
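The size of this sequence space is an ordered-selection (permutation) count, which can be checked directly with Python's standard library:

```python
import math

# Ordered sequences of 5 interactions drawn from 50 distinct interactions
assert math.perm(50, 5) == 50 * 49 * 48 * 47 * 46 == 254_251_200
```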
As shown in Figure 5a, a large number of users navigated a straight course of interactions in the same order as the designers had programmed them (e.g., a person interacting with item 46 followed that by interacting with item 47 about 56% of the time). We can also see that while two interactions captured about 90% of the previous audience, one interaction captured only about 78%. As a population measure, is that drop connected to the content, to the overall flow of the set of activities, or to the difficulty of the next item? Accordingly, learning analytics design helps to raise new questions and provides new data for follow-up inquiry. In Figure 5b, a different navigation sequence is displayed, which follows what happens to those who made it to interaction 50 in the above sequence. A significant segment of the population splits between going to item 36 or item 21. Given the no-plan stance of the learning designers, allowing any module to be undertaken in any order, there is perhaps no inference that can be drawn, but it is noticeable that the items in sequence again show a steady pattern of drop-off. Would a rewrite of content (or a randomising of content sequence) and the addition of transparent incentives help flatten out the pattern of drop-off? The full story needs to be explored as a function of the freedom to choose anything at any point along with the lack of an interesting choice driven by either a compelling narrative or game goal. Accordingly, the analysis of navigation sequences and visualisations of network graphs reflect on and draw meaning from the design decisions of the authors mixed with the actual use of the space's affordances by learners.
Without a compelling narrative goal leading to completion, the decision by the learner at every point in the experience amounts to "shall I continue this now and what will I lose or gain by doing so?" The learning designers could have directed users to flow through the modules of the Careers Challenge in a particular order, or in some small subset of orders of the modules, but instead chose to leave the entire set of modules open at all times to all users. This design decision resulted in Figure 6, which shows a few preferred paths (the thicker lines) but, on the whole, a wide variety of paths.

Implications

From a practitioner's perspective, a learning design may be conceived of as a blueprint of actions which includes the purpose (goals, aims, outcomes), the content, learner needs, learning activities, instructional processes and resources, assessments, and evaluations (Branch, 2009; Ifenthaler & Gosper, 2014). Such a traditional perspective on learning design is rather static and does not include changes to the learning environment within a short timeframe. In contrast, learning analytics design provides a dynamic perspective, including design decisions on the fly. Especially for learning environments with a large number of learners, for example, MOOCs, the benefits of learning analytics design are obvious:

• using navigation sequence analysis to identify areas of dropout
• identifying alignment or misalignment of optimal learning design with actual behaviour of the learners
• providing assistance, scaffolds, or feedback to learners who are off track
• identifying learning materials and activities which need revisions to improve the overall quality of the learning environment.
In addition, predictive learning analytics design may help to better understand what effect any changes to the learning environment may have with regard to the overall outcome, expected dropouts, or learners with specific backgrounds (e.g., low vs. high achievers, learners with different prior knowledge or interests). Based on these findings, learning designers may have better evidence for design decisions and for meeting specific benchmarks (e.g., expected course completion rate). Hence, the evidence-based recommendations for design changes can be made transparent, and the required resource allocation will be evident. By implementing the principles of learning analytics design as a tool for learning designers, adaptation and optimisation of learning and teaching may occur, for example, based on educator-selected benchmarks that help to identify alignment or misalignment towards learning outcomes (Ifenthaler & Gosper, 2014; Janssen, Berlanga, Vogten, & Koper, 2008). In addition, detailed insights into pedagogical processes may facilitate micro interventions whenever the learner needs them. These micro interventions can be triggered through algorithms focusing on misalignment of learning activities. An example of a micro intervention could be an automated prompt helping the student to stop and reflect on the current learning pathway. Another example could be a personalised message from a tutor who was informed about the misalignment through the learning analytics tool. In addition, a growing repository of optimal learning designs based on analytics data could help future learning designers to avoid common mistakes when implementing learning environments. Learning designers may search for designs focusing on specific content and learner characteristics to receive best-practice examples for implementing the intended learning environment.
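As a sketch of how such an algorithmic trigger might look – every name, the misalignment criterion, and the threshold here are hypothetical illustrations, not the study's implementation – a micro intervention could fire when a learner's observed path diverges from the designed sequence too often:

```python
def needs_reflection_prompt(designed_order, observed_path, tolerance=2):
    """Hypothetical micro-intervention trigger: return True when the
    learner's path moves backwards against the designed sequence more
    than `tolerance` times (names and threshold are illustrative)."""
    position = {activity: i for i, activity in enumerate(designed_order)}
    backward_steps = sum(
        1 for a, b in zip(observed_path, observed_path[1:])
        if a in position and b in position and position[b] < position[a]
    )
    return backward_steps > tolerance
```

A dashboard could then surface an automated reflection prompt, or notify a tutor, for learners flagged by such a rule.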
To sum up, learning analytics design may provide multiple applications for improving learning and teaching on the fly. However, further research and development is needed to make learning analytics design ready for classroom practice.

Limitations and future research

This study is limited in several aspects that must be addressed in future research. First, learning designers were not confronted with the analytics results (i.e., navigation sequence, network graph analysis) on the fly. Therefore, we were not able to test how learning designers may utilise the analytics data for ad hoc design decisions. Accordingly, future research shall include experimental settings in which learning designers use analytics results for advancing learning environments on the fly (formative), after a unit is completed (summative), as well as projecting effects of possible changes to the learning environment (predictive). Second, due to limited access to student data (e.g., personal characteristics, study load, past academic performance), a more holistic perspective on navigation sequences of individual students and their relationship to personal characteristics was not possible. Hence, future studies shall link additional student data to navigation sequences and therefore provide further insights into these complex relationships. Third, the advanced application of network graph analysis was not demonstrated in this case study. Evidently, elaborated network graph analysis may provide additional insights for learning designers and could alert them to irregularities in navigation pathways throughout the learning environment. Last, the missing link between learning designs, navigation sequences, and student performance is obvious. This was due to limited consent for accessing additional student data. Accordingly, advances in research on learning analytics design require access to complete data.
Conclusion 

Only a few applications of graph theory analysis for informing learning design and assessment have been implemented to date (Clariana, 2010; Ifenthaler & Pirnay-Dummer, 2014). As demonstrated above, the network graph of the crowd utilisation of the Careers Challenge reflects the open-ended nature of the designed space and also illustrates that the crowd exhibited only a small subset of possible traversals. The level of analysis applied to the data thus far perhaps indicates more about task ordering and the lack of dependencies among tasks than about the internal structure of the tasks or what that structure reveals about each learner’s cognitive, social, emotional, or physical status changes during each task. However, the steady decline of drop-offs within a module suggests that the ordering of interactions and the quality of feedback might be redesigned, randomised, or limited by prior choices of the learner, which would bring design and analytics closer together. Since it is possible to adapt a design on the fly, the future of learning analytics design might include emergent design concepts fed by analytics. 

The integration of analytics data into the design of learning environments is a promising approach. Learning design may offer the right set of theoretical foundations for planning the optimal design and reuse of cross-platform learning and teaching sequences. Learning analytics, in turn, is able to offer detailed insights into individual and collective learning processes and evidence for validating assumptions about the effects of learning designs in various contexts. Accordingly, the synergistic relationship between learning design and learning analytics, that is, learning analytics design (Ifenthaler, 2017), opens up a bright future for the design of personalised and adaptive learning. It is up to educators-as-designers to make the links between learning design and learning analytics operational and to use learning analytics design to further advance the educational arena. 
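The kind of traversal analysis described above can be sketched in a few lines: logged navigation sequences are turned into a weighted directed graph of transitions, and the edges the crowd actually used are compared against all possible transitions between activities. This is an illustrative, standard-library-only sketch, not the analysis pipeline used in the case study; function names and the coverage measure are assumptions.

```python
# Illustrative sketch: build a weighted directed graph from navigation
# sequences and measure how much of the possible design space was traversed.
from collections import Counter

def edge_weights(sequences):
    """Count each observed transition (activity -> next activity) across users."""
    edges = Counter()
    for seq in sequences:
        edges.update(zip(seq, seq[1:]))  # consecutive pairs are directed edges
    return edges

def coverage(sequences):
    """Share of all possible directed transitions the crowd actually used."""
    edges = edge_weights(sequences)
    nodes = {node for seq in sequences for node in seq}
    possible = len(nodes) * (len(nodes) - 1)  # every ordered pair of activities
    return len(edges) / possible if possible else 0.0
```

Applied to a crowd of navigation logs, a low coverage value would reflect the finding above that learners exhibited only a small subset of possible traversals, while heavily weighted edges would mark the dominant pathways through the design.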
References 

Agostinho, S. (2006). The use of visual learning design representation to document and communicate teaching ideas. In Proceedings of ASCILITE 2006, Sydney, NSW. Retrieved from http://www.ascilite.org.au/conferences/sydney06/proceeding/pdf_papers/p173.pdf
Bannert, M. (2009). Promoting self-regulated learning through prompts. Zeitschrift für Pädagogische Psychologie, 23(2), 139–145. https://doi.org/10.1024/1010-0652.23.2.139
Behrens, J., Mislevy, R., Dicerbo, K., & Levy, R. (2012). Evidence centered design for learning and assessment in the digital world. In M. Mayrath, J. Clarke-Midura, D. Robinson, & G. Schraw (Eds.), Technology-based assessments for 21st century skills (pp. 13–54). Charlotte, NC: Information Age Publishers.
Berland, M., Baker, R. S. J. d., & Blikstein, P. (2014). Educational data mining and learning analytics: Applications to constructionist research. Technology, Knowledge and Learning, 19(1–2), 205–220. https://doi.org/10.1007/s10758-014-9223-7
Branch, R. M. (2009). Instructional design: The ADDIE approach. New York, NY: Springer.
Brandes, U., & Erlebach, T. (Eds.). (2005). Network analysis: Methodological foundations (Vol. 3418). Berlin: Springer.
Clariana, R. B. (2010). Deriving individual and group knowledge structure from network diagrams and from essays. In D. Ifenthaler, P. Pirnay-Dummer, & N. M. Seel (Eds.), Computer-based diagnostics and systematic analysis of knowledge (pp. 117–130). New York, NY: Springer.
Clarke-Midura, J., Code, J., Dede, C., Mayrath, M., & Zap, N. (2012). Thinking outside the bubble: Virtual performance assessments for measuring complex learning. In M. Mayrath, J. Clarke-Midura, D. Robinson, & G. Schraw (Eds.), Technology-based assessments for 21st century skills (pp. 125–148). Charlotte, NC: Information Age Publishers.
Conole, G. (2008). The role of mediating artefacts in learning design. In L. Lockyer, S. Bennett, S. Agostinho, & B. Harper (Eds.), Handbook of research on learning design and learning objects: Issues, applications and technologies (pp. 188–208). Hershey, PA: IGI Global.
Conole, G. (2013). Designing for learning in an open world. New York, NY: Springer.
Dalziel, J. (2008). Learning design: Sharing pedagogical know-how. In T. Iiyoshi & M. Kumar (Eds.), Opening up education: The collective advancement of education through open technology, open content and open knowledge (pp. 375–388). Cambridge, MA: MIT Press.
Dalziel, J., Conole, G., Wills, S., Walker, S., Bennett, S., Dobozy, E., … Bower, M. (2016). The Larnaca Declaration on Learning Design. Journal of Interactive Media in Education, 2016(1), Article 7, 1–24. https://doi.org/10.5334/jime.407
Dede, C. (2008). Theoretical perspectives influencing the use of information technology in teaching and learning. In J. Voogt & G. Knezek (Eds.), International handbook of information technology in primary and secondary education (pp. 43–62). New York, NY: Springer.
Dick, W. (1987). A history of instructional design and its impact on educational psychology. In J. A. Glover & R. R. Ronning (Eds.), Historical foundations of educational psychology (pp. 183–202). New York, NY: Springer.
Diestel, R. (2000). Graph theory. New York, NY: Springer.
Dijkstra, S., & Leemkuil, H. (2008). Developments in the design of instruction: From simple models to complex electronic learning environments. In D. Ifenthaler, P. Pirnay-Dummer, & J. M. Spector (Eds.), Understanding models for learning and instruction: Essays in honor of Norbert M. Seel (pp. 189–210). New York, NY: Springer.
Dobozy, E. (2013). Learning design research: Advancing pedagogies in the digital age. Educational Media International, 50(1), 63–76. https://doi.org/10.1080/09523987.2013.777181
Gagné, R. M. (1965). The conditions of learning. New York, NY: Holt, Rinehart, and Winston.
Gagné, R. M. (1985). The conditions of learning (4th ed.). New York, NY: Holt, Rinehart, and Winston.
Gašević, D., Dawson, S., Rogers, T., & Gašević, D. (2016). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success. The Internet and Higher Education, 28, 68–84. https://doi.org/10.1016/j.iheduc.2015.10.002
Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64–71. https://doi.org/10.1007/s11528-014-0822-x
Gibson, D. C., & Clarke-Midura, J. (2015). Some psychometric and design implications of game-based learning analytics. In P. Isaias, J. M. Spector, D. Ifenthaler, & D. G. Sampson (Eds.), E-learning systems, environments and approaches: Theory and implementation (pp. 247–261). New York, NY: Springer.
Goodyear, P., & Dimitriadis, Y. (2013). In medias res: Reframing design for learning. Research in Learning Technology, 21, 19909. https://doi.org/10.3402/rlt.v21i0.19909
Goodyear, P., & Retalis, S. (Eds.). (2010). Technology-enhanced learning: Design patterns and pattern languages. Rotterdam: Sense Publishers.
Gustafson, K. L., & Tillman, M. H. (1991). Introduction. In L. J. Briggs, K. L. Gustafson, & M. H. Tillman (Eds.), Instructional design: Principles and applications (2nd ed., pp. 3–8). Englewood Cliffs, NJ: Educational Technology Publishers.
Hale, F. (2016). Learning design: Details of the developing Edinburgh Learning Design roadmap (ELDeR) at the University of Edinburgh. Edinburgh: University of Edinburgh. Retrieved from http://www.ed.ac.uk/information-services/learning-technology/supporting-learning-and-teaching/learning-design
Ifenthaler, D. (2010). Scope of graphical indices in educational diagnostics. In D. Ifenthaler, P. Pirnay-Dummer, & N. M. Seel (Eds.), Computer-based diagnostics and systematic analysis of knowledge (pp. 213–234). New York, NY: Springer.
Ifenthaler, D. (2012). Determining the effectiveness of prompts for self-regulated learning in problem-solving scenarios. Journal of Educational Technology & Society, 15(1), 38–52. Retrieved from http://www.ifets.info/journals/15_1/5.pdf
Ifenthaler, D. (2014). AKOVIA: Automated Knowledge Visualization and Assessment. Technology, Knowledge and Learning, 19(1–2), 241–248. https://doi.org/10.1007/s10758-014-9224-6
Ifenthaler, D. (2017). Learning analytics design. In L. Lin & J. M. Spector (Eds.), The sciences of learning and instructional design: Constructive articulation between communities (pp. 202–211). New York, NY: Routledge.
Ifenthaler, D., & Gosper, M. (2014). Guiding the design of lessons by using the MAPLET Framework: Matching aims, processes, learner expertise and technologies. Instructional Science, 42(4), 561–578. https://doi.org/10.1007/s11251-013-9301-6
Ifenthaler, D., & Pirnay-Dummer, P. (2014). Model-based tools for knowledge assessment. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (4th ed., pp. 289–301). New York, NY: Springer.
Ifenthaler, D., & Schmidt, T. (2010). Assessing the effectiveness of prompts for self-regulated learning. In Kinshuk, D. G. Sampson, J. M. Spector, P. Isaias, D. Ifenthaler, & R. Vasiu (Eds.), Proceedings of the IADIS International Conference on Cognition and Exploratory Learning in the Digital Age (pp. 193–202). Timisoara: IADIS Press. Retrieved from http://www.iadisportal.org/digital-library/assessing-the-effictiveness-of-prompts-for-self-regulated-learning
Ifenthaler, D., & Schumacher, C. (2016). Student perceptions of privacy principles for learning analytics. Educational Technology Research and Development, 64(5), 923–938. https://doi.org/10.1007/s11423-016-9477-y
Ifenthaler, D., & Seel, N. M. (2005). The measurement of change: Learning-dependent progression of mental models. Technology, Instruction, Cognition and Learning, 2(4), 317–336. Retrieved from http://www.oldcitypublishing.com/journals/ticl-home/ticl-issue-contents/ticl-volume-2-number-4-2005/
Ifenthaler, D., & Tracey, M. W. (2016). Exploring the relationship of ethics and privacy in learning analytics and design: Implications for the field of educational technology. Educational Technology Research and Development, 64(5), 877–880. https://doi.org/10.1007/s11423-016-9480-3
Ifenthaler, D., & Widanapathirana, C. (2014). Development and validation of a learning analytics framework: Two case studies using support vector machines. Technology, Knowledge and Learning, 19(1–2), 221–240. https://doi.org/10.1007/s10758-014-9226-4
Janssen, J., Berlanga, A., Vogten, H., & Koper, R. (2008). Towards a learning path specification. International Journal of Continuing Engineering Education and Life Long Learning, 18(1), 77–97. https://doi.org/10.1504/IJCEELL.2008.016077
Kevan, J. M., & Ryan, P. R. (2016). Experience API: Flexible, decentralized and activity-centric data collection. Technology, Knowledge and Learning, 21(1), 143–149. https://doi.org/10.1007/s10758-015-9260-x
Koper, R. (2006). Current research in learning design. Journal of Educational Technology & Society, 9(1), 13–22. Retrieved from http://www.ifets.info/journals/9_1/3.pdf
Laurillard, D. (2012). Teaching as a design science: Building pedagogical patterns for learning and technology. New York, NY: Routledge.
Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning analytics with learning design. American Behavioral Scientist, 57(10), 1439–1459. https://doi.org/10.1177/0002764213479367
Mah, D.-K. (2016). Learning analytics and digital badges: Potential impact on student retention in higher education. Technology, Knowledge and Learning, 21(3), 285–305. https://doi.org/10.1007/s10758-016-9286-8
Martin, F., & Whitmer, J. C. (2016). Applying learning analytics to investigate timed release in online learning. Technology, Knowledge and Learning, 21(1), 59–74. https://doi.org/10.1007/s10758-015-9261-9
Mayrath, M., Clarke-Midura, J., & Robinson, D. (2012). Introduction to technology-based assessments for 21st century skills. In M. Mayrath, J. Clarke-Midura, D. Robinson, & G. Schraw (Eds.), Technology-based assessments for 21st century skills (pp. 1–8). Charlotte, NC: Information Age Publishers.
Mor, Y., & Craft, B. (2012). Learning design: Reflections upon the current landscape. Research in Learning Technology, 20(Supplement), 85–94. https://doi.org/10.3402/rlt.v20i0.19196
Mor, Y., Craft, B., & Maina, M. (2015). Learning design: Definitions, current issues and grand challenges. In M. Maina, B. Craft, & Y. Mor (Eds.), The art & science of learning design (pp. ix–xxvi). Rotterdam: Sense Publishers.
Papadakis, S. (2012). Enabling creative blended learning for adults through learning design. In P. Anastasiades (Ed.), Blended learning environments for adults: Evaluations and frameworks (pp. 257–273). Hershey, PA: IGI Global.
Pardo, A., & Siemens, G. (2014). Ethical and privacy principles for learning analytics. British Journal of Educational Technology, 45(3), 438–450. https://doi.org/10.1111/bjet.12152
Pirnay-Dummer, P., Ifenthaler, D., & Spector, J. M. (2010). Highly integrated model assessment technology and tools. Educational Technology Research and Development, 58(1), 3–18. https://doi.org/10.1007/s11423-009-9119-8
Quellmalz, E., Timms, M., Buckley, B., Davenport, J., Loveland, M., & Silberglitt, M. (2012). 21st century dynamic assessment. In M. Mayrath, J. Clarke-Midura, D. Robinson, & G. Schraw (Eds.), Technology-based assessments for 21st century skills (pp. 55–90). Charlotte, NC: Information Age Publishers.
Shaffer, D., Hatfield, D., Svarovsky, G., Nash, P., Nulty, A., Bagley, E., … Mislevy, R. (2009). Epistemic network analysis: A prototype for 21st-century assessment of learning. International Journal of Learning and Media, 1(2), 33–53. https://doi.org/10.1162/ijlm.2009.0013
Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510–1529. https://doi.org/10.1177/0002764213479366
van den Boom, G., Paas, F. G., van Merriënboer, J. J. G., & van Gog, T. (2004). Reflection prompts and tutor feedback in a web-based learning environment: Effects on students’ self-regulated learning competence. Computers in Human Behavior, 20(4), 551–567. https://doi.org/10.1016/j.chb.2003.10.001

Corresponding author: Dirk Ifenthaler, dirk@ifenthaler.info 

Australasian Journal of Educational Technology © 2018. 

Please cite as: Ifenthaler, D., Gibson, D., & Dobozy, E. (2018). Informing learning design through analytics: Applying network graph analysis. Australasian Journal of Educational Technology, 34(2), 117–132. https://doi.org/10.14742/ajet.3767 