Australasian Journal of Educational Technology, 2020, 36(2).

A virtual internship for developing technological pedagogical content knowledge

Diler Oner
Bogazici University

This study examines the use of a virtual internship (an epistemic game) for developing preservice teachers' technological pedagogical content knowledge (TPACK). TPACK aims to capture the essential qualities of teacher knowledge that are needed for integrating technology into teaching. Virtual internships are computer-based professional practicum simulation games in which participants assume the role of a professional, work collaboratively on authentic tasks, and engage in complex professional thinking. The online collaborative chat records of 33 preservice teachers who played the game over the course of 8 weeks were analysed using epistemic network analysis (ENA), which made it possible to examine the dynamic connections between various TPACK domains over time. The analysis showed that participants' TPACK representations gradually became more complex in terms of the number of pedagogical considerations and the strength of connections between pedagogical considerations, technology, and content. Suggestions are made for designing learning environments that aim to develop preservice teachers' TPACK.

Implications for practice or policy:
• Findings suggest that virtual internships are an effective means of developing preservice teachers' TPACK.
• Teacher educators can use this knowledge when designing learning environments to develop not only preservice teachers' TPACK but also other forms of teacher professional knowledge.
• ENA affords an innovative way to assess TPACK development.
• TPACK can be framed as an epistemic frame, which could better align with its conceptualisation.
Keywords: technological pedagogical content knowledge; virtual internships; epistemic network analysis; preservice teachers; teacher knowledge

Introduction

Teacher knowledge in general is highly complex, as it draws on several different types of knowledge (Mishra & Koehler, 2006). One of the best-known approaches to conceptualising teacher knowledge comes from Shulman (1987), who defined several types of knowledge that constitute the knowledge base for the teaching profession. Among these, one category in particular, pedagogical content knowledge (PCK), has attracted the most interest among teacher education researchers and practitioners. Shulman himself considered PCK to be of special interest because "it represents the blending of content and pedagogy into an understanding of how particular topics, problems, or issues are organized, represented, and adapted to the diverse interests and abilities of learners, and presented for instruction" (Shulman, 1987, p. 8). Mishra and Koehler (2006) extended this construct by adding a technology category to describe the type of knowledge teachers need in order to teach effectively with technology. Referred to as technological pedagogical content knowledge (now known as TPACK), this construct has been well received by researchers in the area of educational technology, although similar conceptualisations have been suggested by other researchers (Graham, 2011). In 2016, Angeli, Valanides, and Christodoulou reported that, since 2005 (the year the term TPACK first appeared in the literature), 1,475 publications had used the phrase technological pedagogical content knowledge. The interest in TPACK most likely stems from its parsimonious conceptualisation as the integration of the traditionally separated knowledge domains of content, pedagogy, and technology (Graham, 2011).
In addition to denoting a special type of teacher knowledge, TPACK also refers to a technology integration framework that provides a unifying structure and guidance for the challenging task of integrating technology into teaching (Koehler, Mishra, Akcaoglu, & Rosenberg, 2013). However, as pointed out by Harris, Mishra, and Koehler (2009), the TPACK framework does not prescribe how TPACK development should be supported. Specific suggestions are needed for activities and methods that will help develop TPACK (Archambault & Barnett, 2010). Drawing on work in the learning sciences, this study presents and evaluates a new means of supporting preservice teachers' TPACK development. Specifically, it introduces a new virtual internship (i.e., an epistemic game) that exemplifies the three essential characteristics of the learning-technology-by-design approach (Koehler & Mishra, 2005; Koehler, Mishra, & Yahya, 2007): authenticity, small group collaboration, and a design task. The study aims to evaluate the use of this virtual internship in developing preservice teachers' TPACK using epistemic network analysis (ENA), a novel data analysis method developed for assessing learning in virtual internships (Shaffer, Collier, & Ruis, 2016).

The TPACK framework

The TPACK framework presents technology as the third core domain of teacher knowledge, along with content and pedagogy. It also introduces categories of knowledge that emerge from the interactions between and among these three core domains (Mishra & Koehler, 2006). These hybrid bodies of knowledge address how content, pedagogy, and technology "interact, constrain, and afford each other" (Koehler, Mishra, Kereluik, Shin, & Graham, 2014, p. 102).
The TPACK framework, usually portrayed as a Venn diagram, comprises seven types of knowledge: content knowledge (CK), pedagogical knowledge (PK), technological knowledge (TK), pedagogical content knowledge (PCK), technological content knowledge (TCK), technological pedagogical knowledge (TPK), and technological pedagogical content knowledge (TPACK) (Figure 1). Central to the TPACK framework is the idea of connections and interactions between and among the three core categories. While a deep understanding of technology, content, and pedagogy is needed, effective teaching with technology requires the total package (hence the acronym TPACK) that emerges from the connections between them and exists in a state of dynamic equilibrium (Mishra & Koehler, 2006; Niess, 2011).

Figure 1. TPACK framework categories (reproduced by permission of the publisher, © 2012 by tpack.org)

Although researchers and teacher educators embraced the TPACK framework quickly, there have also been critical reviews of it. Graham (2011) argued that the TPACK framework was built on the concept of PCK, which already possessed considerable conceptual complexity. While several research studies have employed TPACK as their theoretical framework, few have made a substantive theoretical contribution to the construct. Furthermore, the framework divides the total package of TPACK into many components that are difficult to separate and are not always defined consistently in the literature (Archambault & Barnett, 2010; Brantley-Dias & Ertmer, 2013). This has also made it difficult to assess the seven knowledge domains accurately; assessments have typically relied on self-report measures, such as surveys or questionnaires (Koehler, Shin, & Mishra, 2012). Analyses of TPACK development have frequently taken the form of examining TPACK components separately, even though TPACK is characterised by dynamic relationships composed of the connections between the three core categories. Nonetheless, as Angeli et al.
(2016) put it, several researchers have agreed on the usefulness of the TPACK framework for providing a common language and focus for research in the area of technology integration in education. To address the issues related to TPACK, Brantley-Dias and Ertmer (2013) suggested going back to an initial definition of TPACK that highlighted the point of intersection among the three core domains of teacher knowledge (pedagogical [P], content [C], and technological [T]): "a unique knowledge base regarding how technology enables or constrains one's efforts to help learners master specific subject matter" (p. 120).

Supporting TPACK development

Research has suggested several professional development strategies to help preservice and inservice teachers develop TPACK. The main guiding principle has been to provide opportunities for participants to connect technological, pedagogical, and content knowledge, rather than introducing these in isolation (Koehler et al., 2013). However, researchers have not yet identified an ideal developmental path that will foster TPACK (Chai, Koh, & Tsai, 2013; Koehler et al., 2014). In their comprehensive literature review, Chai et al. (2013) determined that most of the studies started their interventions with PCK, TCK, TPK, or TPACK, rather than with CK, PK, or TK. Koehler et al. (2014) identified three broad paths for the best-known professional TPACK development models: (a) from PCK to TPACK, (b) from TPK to TPACK, and (c) developing PCK and TPACK simultaneously.

From PCK to TPACK

Under this category, Koehler et al. (2014) describe approaches that take PCK as the starting point and then introduce technology to develop TPACK.
They consider this a suitable choice for experienced teachers, who have accumulated a repertoire of PCK by virtue of their classroom experience and the methods courses they took during their training, although both of these may sometimes act as blocking factors when technology is introduced into PCK. A standout approach under this category is the use of learning activity types (Harris et al., 2009; Harris & Hofer, 2009). An activity type is defined as the most essential aspect of a particular learning action; it relates to what students do in a learning activity: group discussion, role playing, or reading text, for example (Harris & Hofer, 2009, p. 102). Going beyond technocentric strategies (Harris et al., 2009; Papert, 1987), the starting point for learning activity types is curricular learning goals, rather than the affordances and constraints of technologies. First, a range of possible learning activity types for a particular content area is determined. Only then is each activity type matched with compatible digital and non-digital technologies to support the learning in each activity. Learning activity types can be combined in lesson and unit plans. The power of the learning activity types approach is that it gives teachers a natural starting point in terms of PCK and a repertoire of technologies from which they can choose for TPACK development. The challenge lies in creating taxonomies for the classification and description of activity types in various content areas, as this requires considerable time and effort (Archambault, 2016).

From TPK to TPACK

This is presented as the most prevalent default pathway in preservice teacher education programs, where students take courses that focus on how technology supports general pedagogical strategies before being introduced to content-specific methods courses (Koehler et al., 2014). The technology-related courses are usually taught by instructors who do not have expertise in all subject areas.
The typical progression starts with TK and TPK in a generic technology course and then integrates PCK through methods courses and field experiences to develop TPACK. Technology mapping, presented by Angeli and Valanides (2013), is a sophisticated example under this category. In contrast to learning activity types, technology mapping starts with the affordances of information and communication technologies and then focuses on establishing connections between the affordances of these tools, specific content, and pedagogy. Angeli and Valanides developed instructional guidelines that include discussing the educational affordances of a specific technological tool for a specific group of students, demonstrating educational affordances for a specific curricular topic with a worked-out design task, explaining the structure of the worked-out design task in terms of its underlying instructional design, allowing students to practice with the affordances, and using a new design task.

Developing PCK and TPACK simultaneously

Another pathway for TPACK development is the simultaneous development of PCK and TPACK. Learning-technology-by-design, proposed by Koehler and Mishra (2005), best exemplifies this pathway, as it emphasises simultaneous connections and interactions between subject matter, technology, and pedagogy (Koehler & Mishra, 2005; Koehler et al., 2007). Learning-technology-by-design emerged from a faculty development seminar that involved both higher education faculty members and graduate students. Koehler and Mishra (2005) created small groups, each of which included a faculty member. These groups worked collaboratively on an authentic design task over the course of a semester. The task was considered authentic in that it involved developing an actual online course to be taught the following academic year.
The composition of the groups was not random; the authors wanted the groups to build on PCK by including in each group faculty members who had already developed such knowledge. They also wanted to involve graduate students in each group, most of whom could bring TK to the discussion. During the course, participants were introduced to several technological tools, examined their affordances, and employed some of them in their online courses if they fit their instructional or pedagogical goals. The authentic design task enabled the teams to investigate and establish a complex web of relationships between content, technology, and pedagogy (Koehler & Mishra, 2005).

Virtual internships: A novel method to develop TPACK

Researchers tend to agree that TPACK can best be developed through the learning-technology-by-design approach, although what makes it successful is less clear (Voogt, Fisser, Tondeur, & van Braak, 2016). Learning-technology-by-design underlies a learning environment with three essential elements: authenticity, small group collaboration, and a design task. It also seems to work well for experienced teachers, who already have familiarity with PCK on which to build. However, the nature of TPACK attainment is different for preservice and experienced teachers (Hofer & Harris, 2012; Koehler et al., 2014). Preservice teachers have relatively little familiarity with the TPACK domains, which is why one cannot identify a natural knowledge type as the starting point for their TPACK development (Koehler et al., 2014). Furthermore, authenticity derived from real teaching situations may not always be a viable option in preservice teacher education. One novel tool that is better suited to preservice teachers' TPACK development, and that also embodies authenticity, small group collaboration, and a design task, is the virtual internship.
Introduced by Shaffer (2005) and known primarily in the learning sciences community, virtual internships emerged as digital learning environments for the development of professional knowledge, skills, and competencies. More specifically, a virtual internship (or epistemic game) is a computer-based game in which participants assume the role of a professional, work collaboratively on authentic tasks, interact with mentors, and engage in complex professional thinking. Authenticity in virtual internships is provided by a simulation of professional workplaces in which players are addressed as interns in their intended profession. In virtual internships, participants work collaboratively on complex design projects in small groups in an online environment. Therefore, a virtual internship that targets TPACK development can be considered an instantiation of the learning-technology-by-design approach for preservice teachers. Synchronous mentorship, which aims to provide the guidance necessary for participants to make meaningful connections between professional knowledge elements, is another important aspect of virtual internships. Virtual internships exist for several professions, such as Land Science (for urban planning), RescuShell (for mechanical engineering), and Nephrotex (for biomedical engineering). These games employ authentic professional training practices that integrate action and reflection to develop expert-like thinking skills in a specific profession (Shaffer et al., 2009). Each game follows a similar structure: players examine relevant research and resources, work collaboratively on a design task in teams, and submit their work to supervisors as they would in a real professional work environment. The virtual internships developed so far have addressed the STEM professions and have been used mostly with high school students and university freshmen in engineering. Thus, the use of virtual internships in higher education is a new area of research.
No studies have investigated the use of virtual internships in teacher education. This paper introduces and examines the use of a new virtual internship (an epistemic game), the School of the Future, to develop preservice teachers' TPACK. School of the Future extends virtual internships to a new area (teacher education) and targets a new population (preservice teachers). The role of School of the Future in the development of preservice teachers' TPACK was evaluated using ENA, a novel data analysis method and tool that afforded the investigation of the dynamic interaction between the three core categories of the TPACK framework: technology, pedagogy, and content. The research question that guided this study is:

• How did participants' TPACK, represented as epistemic network graphs, change character over the eight sessions of School of the Future?

Setting and participants

The study was conducted in the context of an educational technology course that was part of a preservice teacher program. The participants were 33 preservice teachers (25 female) from a range of programs (math education, science education, and foreign language education). The virtual internship was introduced as a course requirement, but participation in the study was voluntary. The participants completed eight sessions (eight rooms) of the virtual internship within a three-month period. The data were collected in two runs of the School of the Future in two semesters, Fall 2017 and Spring 2018. Institutional Review Board approval was obtained before data collection.

School of the Future: A virtual internship to develop TPACK

The new virtual internship/epistemic game was designed by the author using the Virtual Internship Authoring Tool©. The designed virtual internship was implemented in professional practice simulator software: a web-based PHP application with a MySQL database.
The professional practice simulator uses a web-based client and allows users to interact through simulated email and a live chat interface. Since all components and activities are web-based, users can access the virtual internship from any location with Internet access. In the game, preservice teachers take the role of teacher interns at a fictitious school called School of the Future. School of the Future is played in eight sessions, referred to as rooms. Each session lasted 2 class hours (a total of about 100 active minutes). Participants worked in the rooms in groups of three or four, based on their major. Each group was facilitated by a graduate student, who was portrayed in the game as a senior mentor teacher at School of the Future. At the beginning of each session, participants received an e-mail from their program coordinator that detailed the tasks they were expected to complete and provided a set of resources they could use. They were able to navigate between their inbox, the resources, and the shared area and could chat with teammates and mentors simultaneously (Figure 2). Each task was built on the previous one and was designed to support the team’s completion of the final design task: developing instruction that integrates content knowledge (related to the participants’ field of study), teaching strategies, and technology to be used at the School of the Future in the upcoming semester. This final design task, called the instructional plan, required focusing on the instructional objectives for a specific topic, teaching strategies, and technologies to meet those objectives. The game was structured in such a way that participants would work on the pieces of the final project in what Koehler and Mishra (2008) refer to as a spiral-like manner, that is, with the TPACK categories being introduced iteratively. This way, participants were required to consider and reconsider their decisions with respect to how the three core categories should come together. 
Team members engaged in individual work and research, team discussions, and reflection to complete their tasks while synchronously interacting (through the chat interface) with their team members and mentors.

Figure 2. School of the Future e-mail screen

In most of the rooms, the mentor asked a set of pre-prepared questions to guide the participants' team discussions. These questions were an integral part of the School of the Future and were simply dragged and dropped into the group chat by the mentor. For instance, one of the questions mentors asked in Room 4 was: "For what instructional purposes this tool can be used? When I say instructional purposes, I mean purposes related to what you teach, that is, the content of your instruction such as certain concepts, knowledge, skills, etc. regarding your topic." This was the room in which the participants investigated several computer-based tools to report on to their teammates. Mentors asked four other similar guiding questions in Room 4. Thus, the mentor's role was simply to follow the School of the Future content (by asking pre-prepared questions in each room), provide further explanations of tasks if needed, and answer questions regarding technical difficulties.

Data sources and analysis

The core data of the study comprised the records of about 4,000 turns of chat that took place while the teams were completing the School of the Future tasks over the eight sessions in two runs of the game, one in the fall of 2017 and one in the spring of 2018. These were analysed using the ENA Web Tool (version 0.1.0) (Marquart, Hinojosa, Swiecki, & Shaffer, 2018). ENA is a data analysis method (and a web-based tool) built on a theory of meaning-making in discourse (Shaffer, 2017). With ENA, one can identify and quantify connections (co-occurrences) between network elements and represent them in dynamic models using the ENA web tool (Shaffer et al., 2016).
ENA is especially useful when "the code and count approach" (Csanadi, Eagan, Shaffer, Kollar, & Fischer, 2017, p. 217) is not sufficient to provide a conceptual understanding of the problem at hand. Applying ENA to TPACK development, one can go beyond traditional frequency-based assessments (Csanadi et al., 2017) and model how technology, pedagogy, and content knowledge are related and interconnected throughout the School of the Future implementation. It also allowed qualitative and quantitative comparisons of TPACK models at different stages of the School of the Future. To address the research question, an ENA model was created using the ENA web tool. The model was based on the codes shown in Table 1. Five sub-codes were created for pedagogy: teaching strategies, assessment, learning outcomes, learning theory, and learners. Additionally, a TPACK code was created when technology, content, and one of the pedagogy codes were all present (in ENA terminology, TPACK was a derived code).

Table 1
Definitions and examples of the codes (nodes)

T.Technology
Definition: Referring to electronic or digital tools, or genres of tools, that support users in performing tasks or generating products.
Example: "Besides, there is lack of privacy since Wikis can be publicly viewed." (Intern 8, Fall 2017, Line 1552)

C.Content
Definition: Referring to facts, concepts, theories, principles, and procedures of a particular content domain (e.g., math, science), the structure of the subject matter, explanatory frameworks that organise and connect ideas, and the epistemology of the domain, that is, the ways in which truth and validity are established.
Example: "[O]ur topic is important because students will learn mixtures at the molecular level and their daily usage in this way they can develop concepts and make o [sic] connection between other science disciplines." (Intern 27, Spring 2018, Line 3430)

P.Teaching strategies
Definition: Referring to methods, means, and techniques of teaching.
Example: "I believe that cooperative learning experiences are so beneficial for students because sometimes they can learn better from more skillful peers." (Intern 21, Fall 2017, Line 2819)

P.Assessment
Definition: Referring to the assessment of learning and assessment methods.
Example: "Rubrics are needed for all lessons to assess the tasks." (Intern 12, Spring 2018, Line 1621)

P.Learning outcomes
Definition: Referring to specific instructional outcomes, learning taxonomies (such as Bloom's), overall educational purposes and values, as well as documents where these can be found (e.g., curriculum).
Example: "The other skills play a complementary role for supporting the top three skills. For example, complex problem solving requires cognitive flexibility, creativity asks emotional intelligence and critical thinking cannot be achieved without judgment and decision making." (Intern 1, Spring 2018, Line 942)

P.Learning theory
Definition: Referring to theories of learning, their constructs, the scholars who developed them, or providing definitions for learning.
Example: "According to 'Situated learning theory' learning is a social process. Knowledge is co-constructed. Learning is an inseparable part of social practice." (Intern 6, Spring 2018, Line 2662)

P.Learners
Definition: Referring to learner characteristics such as age, grade level, and learners' background knowledge and typical learner misconceptions.
Example: "I think grade level can be intermediate cause, the topic may not fit in lower grades." (Intern 1, Spring 2018, Line 1277)

In an ENA analysis, large-scale coding is handled using the tool nCoder (The Epistemic Analytics Lab), which uses automated classifiers to code large sets of qualitative data.
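At their simplest, automated classifiers of the kind nCoder supports are pattern matchers applied line by line. The sketch below is purely illustrative: the keyword patterns are invented stand-ins (the study's actual classifiers were developed and validated against human raters), but the derived TPACK code follows the rule stated above, firing only when technology, content, and at least one pedagogy sub-code are all present.

```python
import re

# Hypothetical keyword patterns standing in for validated classifiers.
CLASSIFIERS = {
    "T.Technology": re.compile(r"\b(wiki|tablet|online|software|tool|app)s?\b", re.I),
    "C.Content": re.compile(r"\b(mixture|rational number|equation)s?\b", re.I),
    "P.Teaching strategies": re.compile(r"\b(cooperative learning|discussion|group work)\b", re.I),
}

def code_line(line):
    """Return the set of binary codes present in one chat line."""
    codes = {name for name, pattern in CLASSIFIERS.items() if pattern.search(line)}
    # TPACK is a derived code: present only when technology, content,
    # and at least one pedagogy sub-code co-occur in the same line.
    if ("T.Technology" in codes and "C.Content" in codes
            and any(c.startswith("P.") for c in codes)):
        codes.add("TPACK")
    return codes
```

A line such as "We can use an online tool to teach mixtures in a group discussion" would receive all three core codes and hence the derived TPACK code under these toy patterns.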
The codes were validated using Kappa and Shaffer's rho statistics among the researcher, an independent reviewer, and the computer, using rho, an R package that computes inter-rater reliability statistics. Shaffer's rho is a Monte Carlo rejective method of testing the generalisability of any binary inter-rater reliability statistic. It allows researchers to minimise the amount of data that needs to be hand-coded to establish the reliability and validity of the codes (Shaffer, 2017). After acceptable Kappa and Shaffer's rho values were obtained for each code (Kappa > .81; rho < .05), the autocoded data file created by nCoder was uploaded to the ENA web tool. In order to examine connections between technology, content, and pedagogy, the ENA web tool was used to identify co-occurrences of the codes (Table 1) within a predetermined relevant context of the talk, since the meaning of any utterance can be understood only in relation to what has been said before. This temporal context is referred to as a stanza, and it is an adjustable feature of the ENA tool, namely the moving stanza window size (MSWS). Shaffer (2017) reported that in prior studies 98% of the lines referred to something in the previous five lines, so in the present study the MSWS was set to five. The ENA analysis produced a weighted network of the co-occurrences of the codes, along with associated visualisations for each unit of analysis in the data. To answer the research question, network graphs were created for all rooms except Room 1, which was excluded because it was only an introduction to the School of the Future. The unit of analysis was all the utterances (lines) within a room (e.g., Room 5) defined by the specific semester (e.g., Spring 2018), team name (e.g., Math2), and user name (e.g., Intern 2).
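The effect of a moving stanza window can be sketched in a few lines. This is a much-simplified model of ENA's accumulation step (the real model also normalises per unit and applies a dimensional reduction, and handles window overlap differently); the window size of five mirrors the MSWS used in the study, while the data shape is an assumption.

```python
from collections import Counter
from itertools import combinations

def cooccurrences(coded_lines, window=5):
    """Count code co-occurrences within a moving stanza window.

    coded_lines: ordered list of sets, one set of binary codes per chat line.
    Two codes are connected when they appear together anywhere within a
    window of `window` consecutive lines (the current line plus up to
    window - 1 preceding lines), echoing the idea that an utterance takes
    its meaning from the recent lines before it.
    """
    counts = Counter()
    for i in range(len(coded_lines)):
        stanza = set().union(*coded_lines[max(0, i - window + 1): i + 1])
        for pair in combinations(sorted(stanza), 2):
            counts[pair] += 1
    return counts
```

With a window of five, a technology code and a content code three lines apart still register a connection, whereas codes six or more lines apart do not.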
In this way, one can investigate the network graphs of all participants in each room in terms of the codes given in Table 1, that is, how each user made connections between technology, content, and various aspects of pedagogy in each room of the School of the Future. Furthermore, the mean of individual students' network graphs was represented by a single point in the graph (i.e., the centroid of the ENA network), allowing comparisons between the rooms in terms of the connections made between technology, content, and pedagogy.

Results

Below, the findings are presented from three stages of the School of the Future - the beginning (Room 2), the mid-phase (Room 5), and the final stage (Room 8) - by providing the network graphs in which connections between network elements are visualised and by providing evidence of where these connections are actually found in the data.

The beginning of the School of the Future - Room 2

In Room 2, participants were expected to discuss the differences between technology use and technology integration. The centroid of the connections made between technology, content, and aspects of pedagogy is represented by the small pink square in Figure 3, and stronger connections are indicated by thicker lines. One can see that not only did participants make connections between technology and learning outcomes (Table 2), they also talked about technology in relation to a content area, although sometimes the content was not related to their own field of study. One can also notice very few cases of TPACK, where all three core categories were evident, in the graph. Therefore, at the beginning of the School of the Future, while there were connections between the core domains of the TPACK framework, the pedagogical aspects were limited; learning outcomes were the most visible pedagogical consideration.

Figure 3. The network graph in Room 2

Table 2
Excerpt 1.
Room 2, Spring 2018, Math2 (a)

Intern 18: What do you think about encouraging higher-order thinking skills.
Intern 16: For example knowledge level of curriculum outcomes necessitate lower order thinking skills where analysis and synthesis are higher order thinking skills.
Intern 18: I think, to enhance higher-order thinking skills, teachers can use programs such as programs show different representations of equations in math class.
Intern 18: Other usage area of technology can be assessment part of class. I mean, we can use online quiz sites to assess students' knowledge.
Intern 17: For example, by using tablets, students can see their previous or wrong operations on a particular task at math. So that, if the teacher notices, he or she can encourage students to look at their past operations so that they can see their faults and have an opportunity to think about it. So that, students think further for that question. They may not have that chance to do it on paper because students tend to erase the operations that they have doubted.

Note. (a) Room #, School of the Future run, team name

The mid-phase - Room 5

In the middle of the game (Room 5), team members were expected to work on an initial instructional plan, which included a description of teacher and student activities, assessment strategies, and a detailed explanation of how technological tools would be integrated into their instruction. One notices the TPACK code in the graph, and this time there are stronger connections between technology, content, and teaching strategies, and between outcomes and content (Figure 4). Therefore, at this mid-phase, participants' TPACK is characterised by more and stronger connections between the core TPACK categories (T, P, and C). In addition, their pedagogical knowledge involved teaching strategies and learning outcomes (Table 3). Participants started discussing the particular content they were planning to teach with their instructional plan.
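The room-level comparisons in this section summarise many individual networks with a single point, the centroid. A minimal sketch of that summary step, under the assumption that each participant-room network is stored as a flat vector of connection weights (one entry per code pair; the normalise-then-average logic echoes ENA's spherical normalisation, and the numbers are invented):

```python
import numpy as np

def network_centroid(network_vectors):
    """Average of unit-normalised network vectors.

    Each row is one participant's vector of connection weights for a room.
    Normalising rows before averaging keeps participants who simply talked
    more (and so accumulated larger raw counts) from dominating the mean.
    """
    v = np.asarray(network_vectors, dtype=float)
    norms = np.linalg.norm(v, axis=1, keepdims=True)
    norms[norms == 0] = 1.0  # leave all-zero networks at the origin
    return (v / norms).mean(axis=0)
```

For example, a participant with weights (3, 4) and one with an empty network average to the point (0.3, 0.4) rather than being pulled toward the raw counts.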
Figure 4. The network graph in Room 5

Table 3
Excerpt 2. Room 5, Spring 2018, Science 2

Intern 26: Okay I think we can start some questions about mixture to measure their knowledge about topic then ask them to do a little research and find mixtures photos.
Intern 26: They load their mixtures photos and explain their reasons why they choose it and with reasons.
Intern 26: Therefore all class see a lot of mixtures types and think their reasons also evaluate friends reasons. Actually this discussion may teach what mixture is and mixture types.
Intern 26: End of the discussion we can categorise the types and finish lesson a quick quiz like kahoot about the lecture.
Intern 28: Ok. I can send you as an example of kahoot quiz.

The final stage - Room 8

In the final stage of the School of the Future (Room 8), team members were expected to revise their instructional plan in light of their investigations of technological tools, pedagogical considerations, and the nature of the specific content. The network graph of Room 8 (Figure 5) can be considered the final TPACK representation. The network is highly similar to the one for Room 5, where teams had worked on their initial plan. As in Room 5, the major connections are between technology, content, and teaching strategies. In this final phase, however, there are also new connections: learning theories are connected to content and teaching strategies. Furthermore, most of the connections in Room 8 are stronger than those in Room 5, as illustrated by the difference network graph in Figure 6. In this figure, the red lines indicate connections that are stronger in Room 8; the blue ones indicate connections that are stronger in Room 5.
According to this graph, while the connections between technology and content and between content and strategies are similarly strong in both rooms (they cancel each other out in the difference graph), the TPACK code, together with the connections between learning theories and strategies and between learning theories and content, is stronger in Room 8. The only connection that is stronger in Room 5 is the one between learning outcomes and content.

Along the y-axis (SVD2), a Mann-Whitney test showed that Room 8 (Mdn = 0.24, N = 32) was statistically significantly different from Room 5 (Mdn = -0.10, N = 31) at the alpha = .05 level, U = 349.50, p = .04, r = 0.30. This shows that participants ended up with a differently weighted network graph at the end of their virtual internship experience, characterised by more strongly connected TPACK categories and by drawing on more aspects of pedagogy (Table 4).

Figure 5. The network graph in Room 8

Figure 6. The difference network graph (Room 8 - Room 5)

Table 4
Excerpt 3. Room 8, Spring 2018, Math2

Intern 17: We decided to add before class activity on natural numbers, integers, fractions, decimals, because we overlooked that they may not remember these other number sets. So it will be helpful before starting rational numbers.
Intern 17: We also added in-class activity so that they will discuss commonly confused rational numbers on Edmodo.
Intern 18: Students will have homework about that topic.
Intern 18: We will change the activity about number line. Students will conctruct their own number line.
Mentor: Why did you decide to make those changes?
Intern 16: And students will write what they learned at the end of the activity.
Intern 16: It is important to know about what students already know, because according to constructivism students build new topics on old knowledges.
Intern 17: According to the constructivist approach, we decided that students should find their way of deciding what kind of number line that they should draw.
Intern 18: Yes, because thinking about what they learned is good for correcting misconceptions and improving the missing points of students.
Intern 17: So we made this change after we learned more about learning theories.

Discussion and conclusion

The purpose of this study was to examine how the use of a new virtual internship contributed to the development of preservice teachers' TPACK. TPACK refers to the type of knowledge teachers need in order to teach effectively with technology. In the field of educational technology, as Archambault (2016) rightly asserts, TPACK represents a much-needed conceptual framework. The relationships between and among the TPACK elements are often described as a "complex interplay" (Mishra & Koehler, 2006, p. 1025) between technology, pedagogy, and content. There are connections, interactions, affordances, and constraints between and among these categories, and they exist in a state of dynamic equilibrium. Although the construct of TPACK has been very useful for conceptualising the teacher knowledge needed for effective technology integration, the complex relationship between its elements makes it difficult both to develop activities for TPACK and to assess it. This study presented and evaluated a new method for developing preservice teachers' TPACK, drawing on work in the learning sciences. The new virtual internship, the School of the Future, was implemented with 33 preservice teachers. Participants completed School of the Future tasks in eight sessions, where they worked collaboratively in teams.
Their online chat records were analysed using ENA, a novel data analysis method that allowed me to investigate the dynamic connections between technology, content, and pedagogy over the course of the game. Network graphs of each room were compared in terms of the connections between technology, content, and five sub-categories of pedagogy (teaching strategies, assessment, learning outcomes, learning theories, and learners). Preservice teachers started with a rather simple TPACK representation, making connections between content (which was not always related to their own major), technology, and a single aspect of pedagogy, namely learning outcomes. As they proceeded through the game, however, they started to bring in and connect several aspects of pedagogy to technology and content. By the middle of the game (in Room 5), their TPACK included connections between teaching strategies, content, and technology, as well as a connection between outcomes and content. In their last session (Room 8), not only were the connections between technology, content, and teaching strategies stronger, but participants were also bringing in more pedagogical aspects, connecting learning theories to content and teaching strategies in addition to the connections they had established in Room 5. The centroids of the network graphs in Rooms 8 and 5 were in statistically different places along the y-axis, illustrating the differences in the connections made in the two rooms. The participants' TPACK representations thus became more complex over time in terms of the number of pedagogical considerations and the strength of the connections between these and technology and content. This shows that virtual internships can be an effective means of supporting the development of preservice teachers' TPACK, and it suggests that teacher educators can take advantage of virtual internships for this purpose.
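The room-level comparisons described above follow a general ENA recipe: accumulate code co-occurrences per participant per room, normalise, average, and compare rooms both visually (a difference network) and statistically (a Mann-Whitney test on a projected dimension). The following Python sketch is illustrative only; the code labels, the simulated chat data, and the use of a single edge weight in place of an SVD dimension are assumptions for demonstration, not the study's actual analysis (which used the ENA web tool).

```python
# Illustrative ENA-style sketch (hypothetical data, NOT the study's):
# each utterance carries a set of TPACK codes; a participant's network
# in a room is the normalised vector of code co-occurrence counts.
from itertools import combinations

import numpy as np
from scipy.stats import mannwhitneyu

CODES = ["T", "C", "Strategies", "Assessment", "Outcomes", "Theories", "Learners"]
PAIRS = list(combinations(range(len(CODES)), 2))  # undirected network edges
IDX = {c: i for i, c in enumerate(CODES)}

def network_vector(utterances):
    """Count code co-occurrences per utterance; normalise to unit length."""
    v = np.zeros(len(PAIRS))
    for codes in utterances:
        present = sorted({IDX[c] for c in codes})
        for pair in combinations(present, 2):
            v[PAIRS.index(pair)] += 1.0
    norm = np.linalg.norm(v)
    return v / norm if norm else v

def simulate_room(base, n, rng):
    """Hypothetical participants: each keeps a random subset of `base`."""
    return np.array([
        network_vector([u for u in base * 3 if rng.random() > 0.2])
        for _ in range(n)
    ])

rng = np.random.default_rng(0)
# Room 2: technology linked mainly to learning outcomes and content.
room2 = simulate_room([["T", "Outcomes"], ["T", "C"]], n=31, rng=rng)
# Room 8: richer networks adding teaching strategies and learning theories.
room8 = simulate_room([["T", "C", "Strategies"], ["Theories", "C"],
                       ["Theories", "Strategies"]], n=32, rng=rng)

# Difference network (Room 8 - Room 2): positive edges strengthened and
# negative edges weakened over the course of the game.
diff = room8.mean(axis=0) - room2.mean(axis=0)

# Compare the two rooms along one dimension, as in the reported
# Mann-Whitney test (ENA proper projects networks onto SVD dimensions;
# here a single edge weight, T-Outcomes, stands in for that projection).
t_outcomes = PAIRS.index((IDX["T"], IDX["Outcomes"]))
u_stat, p_value = mannwhitneyu(room2[:, t_outcomes], room8[:, t_outcomes])
```

The key design point is that each participant's network is length-normalised before averaging, so rooms are compared in terms of the relative strength of connections rather than sheer talkativeness.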
The design of the School of the Future could have contributed to these findings. Introducing TPACK categories in a spiral-like manner (Koehler & Mishra, 2008), for instance, might have helped the preservice teachers gradually develop more and stronger connections between technology, content, and aspects of pedagogy. In addition, the School of the Future (and virtual internships in general) provided a learning environment in which authenticity, small-group collaboration, and a design task were the defining characteristics. These are also the three pillars of Koehler and Mishra's (2005) learning-technology-by-design approach. While researchers seem to agree that TPACK can best be developed through this approach, some have stated that what makes it successful is less clear (Voogt et al., 2016). It is reasonable to argue that authenticity, small-group collaboration, and the design task, all of which underlie the learning-technology-by-design approach, may be the reasons this approach works. However, further research is needed to examine the effectiveness of these design elements in different settings, whether computer-based or face-to-face. To do so, researchers can design different learning environments reflecting these elements and investigate their role in developing preservice teachers' professional knowledge. While small-group collaboration and the design task were implemented in the School of the Future in much the same way, authenticity was satisfied in a slightly different way. Authenticity is an important element of virtual internships, but it does not derive from a real experience, as is the case in the learning-technology-by-design approach. Rather, it is based on a simulation of a work environment. The School of the Future was built on the underlying theory and technological infrastructure of virtual internships to simulate a work environment. Participants were addressed as intern teachers at the School of the Future, and they were given a design task.
They were expected to interact with each other and with their mentors as they would in a professional work environment. During the implementation, the participants took their virtual internship experience very seriously. They did their individual research, participated in discussions, and contributed to their final group instructional plan in a professional manner. They took on board the feedback received from their mentors, who were portrayed as mentor teachers at the School of the Future. It can therefore be argued that, while the three design elements stated above are essential, researchers can create authentic situations in different ways when designing learning environments to support the development of preservice teachers' TPACK. This is important because authenticity derived from real teaching situations may not always be a viable option in preservice teacher education. Future research can investigate the notion of authenticity in different contexts, focusing on the distinction between authenticity based on real experience and authenticity based on simulation (as in virtual internships). The School of the Future also embodied well-crafted mentorship at every step. The mentors, who were present in every online session, provided guidance and support to the preservice teachers. They posed questions (part of the School of the Future design) for discussion that helped the teams focus on making the necessary connections between the TPACK domains. Koh, Chai, and Tay (2014) also found mentoring to be a critical factor in developing TPACK. They observed that teams with higher occurrences of TPACK had been facilitated by experienced educational technologists throughout their study, suggesting that mentorship is another essential element to consider when designing a learning environment to develop TPACK. In the School of the Future, however, participants never actually met their mentors in person.
Mentors simply appeared as characters in the game and mostly followed a pre-structured form of guidance. This opens up the possibility of artificial intelligence-based mentorship in preservice teacher education, which is a novel area of research. Teacher educators can use this knowledge when designing learning environments to develop not only preservice teachers' TPACK but also other forms of teacher professional knowledge. Virtual internships focusing on different teacher knowledge bases (e.g., curriculum knowledge, knowledge of educational contexts, or knowledge of educational ends, purposes, and values, along with their philosophical and historical grounds) can be created with the free virtual internship authoring tool. Moreover, preservice teachers could be asked to develop virtual internships for their own students.

Implications for future research on the TPACK construct

In the present study, the TPACK development of preservice teachers was evaluated using ENA, which was originally developed to visualise co-occurrences of frame elements in discourse (Shaffer & Ruis, 2017). ENA is built on a theory which argues that possessing knowledge, skills, identity, values, and epistemology individually is not sufficient to characterise expertise; rather, one should look at the connections and configurations among these frame elements (Shaffer, 2004, 2006, 2007, 2012). An epistemic frame, similar to TPACK, is a theoretical construct that holds together a set of constituent elements (knowledge, skills, identity, values, and epistemology); the network of relationships between these elements defines expertise in any given profession. Once acquired, the epistemic frame functions like a lens through which an individual can approach a situation from the perspective of a member of a community (Shaffer et al., 2009).
Similar to the relationships between the TPACK categories, there are interactions and connections between the elements that make up what Shaffer refers to as the epistemic frame of a profession. The TPACK framework likewise implies a view of expertise in teaching that rests on a network of relationships between the domains of teacher knowledge rather than on the mere possession of them. Researchers have argued that dividing the TPACK construct into several subdomains creates the problem of defining each subcomponent in a mutually exclusive way (Archambault & Barnett, 2010; Brantley-Dias & Ertmer, 2013). Some have even suggested that certain of TPACK's hybrid subdomains of knowledge might not exist in practice; for example, Lux (2010) was not able to identify TCK in his data collected with a TPACK survey. Although part of the problem may be the use of measures that focus on frequency-based assessments of TPACK development (either self-report or artifact-based), addressing these criticisms of the TPACK construct can benefit from a new perspective. This study employed ENA to identify the interconnected nature of technology, pedagogy, and content knowledge rather than examining preservice teachers' professional knowledge development in the separate categories conjectured to make up TPACK. In that regard, not only does ENA afford an innovative way to assess TPACK development, it also opens up new space to talk about the TPACK construct in relation to the ways researchers attempt to measure it. Future research can focus on this conceptual work (i.e., understanding TPACK as an epistemic frame), which has been largely missing in the literature (Graham, 2011). In this study, the TPACK analysis was conducted on preservice teachers' group chat records within a virtual internship environment. One limitation of studies such as this is therefore the reliance on discourse, which may or may not manifest itself in actual practice.
Harris, Grandgenett, and Hofer (2010) developed an observation rubric, the TPACK-Based Technology Integration Observation Instrument, to evaluate teachers' teaching with technology. Efforts to measure TPACK-in-discourse and TPACK-in-action could be coupled to portray more accurate representations of TPACK development.

Acknowledgments

This research was funded by Research Grant Award No. 17D02P1 from BAP, Bogazici University Scientific Research Projects Fund. It was also supported in part by the National Science Foundation (DRL-0918409, DRL-0946372, DRL-1247262, DRL-1418288, DRL-1661036, DRL-1713110, DUE-0919347, DUE-1225885, EEC-1232656, EEC-1340402, REC-0347000), the MacArthur Foundation, the Spencer Foundation, the Wisconsin Alumni Research Foundation, and the Office of the Vice Chancellor for Research and Graduate Education at the University of Wisconsin-Madison. The opinions, findings, and conclusions do not reflect the views of the funding agencies, cooperating institutions, or other individuals.

References

Angeli, C., & Valanides, N. (2013). Technology mapping: An approach for developing technological pedagogical content knowledge. Journal of Educational Computing Research, 48(2), 199–221. https://doi.org/10.2190/EC.48.2.e

Angeli, C., Valanides, N., & Christodoulou, A. (2016). Theoretical considerations of technological pedagogical content knowledge. In M. Herring, M. Koehler, & P. Mishra (Eds.), TPACK handbook V2.0: TPACK research and approaches (pp. 11–30). New York, NY: Routledge.

Archambault, L. (2016). Exploring the use of qualitative methods to examine TPACK. In M. Herring, M. Koehler, & P. Mishra (Eds.), TPACK handbook V2.0: TPACK research and approaches (pp. 65–86). New York, NY: Routledge.

Archambault, L., & Barnett, J. (2010). Revisiting technological pedagogical content knowledge: Exploring the TPACK framework.
Computers and Education, 55(4), 1656–1662. https://doi.org/10.1016/j.compedu.2010.07.009

Brantley-Dias, L., & Ertmer, P. A. (2013). Goldilocks and TPACK. Journal of Research on Technology in Education, 46(2), 103–128. https://doi.org/10.1080/15391523.2013.10782615

Chai, C-S., Koh, J. H-L., & Tsai, C-C. (2013). A review of technological pedagogical content knowledge. Educational Technology & Society, 16(2), 31–51. Retrieved from http://www.ifets.info/

Csanadi, A., Eagan, B., Shaffer, D. W., Kollar, I., & Fischer, F. (2017). Collaborative and individual scientific reasoning of pre-service teachers: New insights through epistemic network analysis (ENA). In B. K. Smith, M. Borge, E. Mercier, & K. Y. Lim (Eds.), Making a difference: Prioritizing equity and access in CSCL. Proceedings of the 12th International Conference on Computer-Supported Collaborative Learning (Vol. I, pp. 215–222). Philadelphia, PA: International Society of the Learning Sciences.

Graham, C. R. (2011). Theoretical considerations for understanding technological pedagogical content knowledge (TPACK). Computers and Education, 57(3), 1953–1960. https://doi.org/10.1016/j.compedu.2011.04.010

Harris, J. B., Grandgenett, N., & Hofer, M. (2010). Testing a TPACK-based technology integration assessment rubric. In D. Gibson & B. Dodge (Eds.), Proceedings of Society for Information Technology & Teacher Education International Conference 2010 (pp. 3833–3840). Chesapeake, VA: Association for the Advancement of Computing in Education.

Harris, J. B., & Hofer, M. (2009). Instructional planning activity types as vehicles for curriculum-based TPACK development. In C. D. Maddux (Ed.), Research highlights in technology and teacher education 2009 (pp. 99–108). Chesapeake, VA: Society for Information Technology in Teacher Education (SITE).

Harris, J. B., Mishra, P., & Koehler, M. (2009). Teachers' technological pedagogical content knowledge and learning activity types: Curriculum-based technology integration reframed.
Journal of Research on Technology in Education, 41(4), 393–416. https://doi.org/10.1080/15391523.2009.10782536

Hofer, M., & Harris, J. (2012). TPACK research with inservice teachers: Where's the TCK? In P. Resta (Ed.), Proceedings of Society for Information Technology & Teacher Education International Conference 2012 (pp. 4704–4709). Chesapeake, VA: Association for the Advancement of Computing in Education.

Koehler, M. J., & Mishra, P. (2005). What happens when teachers design educational technology? The development of technological pedagogical content knowledge. Journal of Educational Computing Research, 32(2), 131–152. https://doi.org/10.2190/0ew7-01wb-bkhl-qdyv

Koehler, M. J., & Mishra, P. (2008). Introducing TPCK. In AACTE Committee on Innovation and Technology (Ed.), The handbook of technological pedagogical content knowledge (TPCK) for educators (pp. 3–29). Mahwah, NJ: Lawrence Erlbaum Associates.

Koehler, M. J., Mishra, P., Akcaoglu, M., & Rosenberg, J. M. (2013). Technological pedagogical content knowledge for teachers and teacher educators. In N. Bharati & S. Mishra (Eds.), ICT integrated teacher education: A resource book (pp. 1–8). New Delhi: Commonwealth Educational Media Center for Asia.

Koehler, M. J., Mishra, P., Kereluik, K., Shin, T. S., & Graham, C. R. (2014). The technological pedagogical content knowledge framework. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (4th ed., pp. 101–111). New York, NY: Springer.

Koehler, M. J., Mishra, P., & Yahya, K. (2007). Tracing the development of teacher knowledge in a design seminar: Integrating content, pedagogy and technology. Computers and Education, 49(3), 740–762. https://doi.org/10.1016/j.compedu.2005.11.012

Koehler, M. J., Shin, T. S., & Mishra, P. (2012). How do we measure TPACK? Let me count the ways. In R. N. Ronau, C. R. Rakes, & M. L.
Niess (Eds.), Educational technology, teacher knowledge, and classroom impact: A research handbook on frameworks and approaches (pp. 16–31). Hershey, PA: IGI Global.

Koh, J. H. L., Chai, C. S., & Tay, L. Y. (2014). TPACK-in-action: Unpacking the contextual influences of teachers' construction of technological pedagogical content knowledge (TPACK). Computers and Education, 78, 20–29. https://doi.org/10.1016/j.compedu.2014.04.022

Lux, N. J. (2010). Assessing technological pedagogical content knowledge (Order No. 3430401). Available from ProQuest Dissertations & Theses Global. (763640461). Retrieved from https://search.proquest.com/docview/763640461?accountid=9645

Marquart, C. L., Hinojosa, C., Swiecki, Z., & Shaffer, D. W. (2018). Epistemic Network Analysis (Version 0.1.0) [Software]. Available from http://app.epistemicnetwork.org

Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108, 1017–1054. Retrieved from https://www.tcrecord.org/Content.asp?ContentId=12516

Niess, M. (2011). Investigating TPACK: Knowledge growth in teaching with technology. Journal of Educational Computing Research, 44(3), 299–317. https://doi.org/10.2190/ec.44.3.c

Papert, S. (1987). A critique of technocentrism in thinking about the school of the future. Retrieved from http://www.papert.org/articles/ACritiqueofTechnocentrism.html

Shaffer, D. W. (2004, June). Epistemic frames and islands of expertise: Learning from infusion experiences. Paper presented at the International Conference of the Learning Sciences, Santa Monica, CA.

Shaffer, D. W. (2005).
Epistemic games. Innovate, 1(6). Reprinted in Computers and Education, 46, 223–234. Retrieved from https://nsuworks.nova.edu/innovate/vol1/iss6/2

Shaffer, D. W. (2006). Epistemic frames for epistemic games. Computers and Education, 46(3), 223–234. https://doi.org/10.1016/j.compedu.2005.11.003

Shaffer, D. W. (2007). How computer games help children learn. New York, NY: Palgrave Macmillan.

Shaffer, D. W. (2012). Models of situated action: Computer games and the problem of transfer. In C. Steinkuehler, K. D. Squire, & S. A. Barab (Eds.), Games, learning, and society: Learning and meaning in the digital age (pp. 403–431). Cambridge: Cambridge University Press.

Shaffer, D. W. (2017). Quantitative ethnography. Madison, WI: Cathcart Press.

Shaffer, D. W., Collier, W., & Ruis, A. R. (2016). A tutorial on epistemic network analysis: Analyzing the structure of connections in cognitive, social, and interaction data. Journal of Learning Analytics, 3(3), 9–45. https://doi.org/10.18608/jla.2016.33.3

Shaffer, D. W., Hatfield, D., Svarovsky, G., Nash, P., Nulty, A., Bagley, E., … Mislevy, R. (2009). Epistemic network analysis: A prototype for 21st century assessment of learning. International Journal of Learning and Media, 1(2), 33–53. https://doi.org/10.1162/ijlm.2009.0013

Shaffer, D. W., & Ruis, A. R. (2017). Epistemic network analysis: A worked example of theory-based learning analytics. In C. Lang, G. Siemens, A. Wise, & D. Gašević (Eds.), Handbook of learning analytics (pp. 175–187). Alberta: Society for Learning Analytics Research.

Shulman, L. S. (1987). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review, 57(1), 1–22. https://doi.org/10.17763/haer.57.1.j463w79r56455411

Voogt, J., Fisser, P., Tondeur, J., & van Braak, J. (2016). Using theoretical perspectives in developing an understanding of TPACK. In M. Herring, M. J. Koehler, & P. Mishra (Eds.), Handbook of technological pedagogical content knowledge (TPACK) for educators (2nd ed., pp.
33–51). New York, NY: Routledge.

Corresponding author: Diler Oner, diler.oner@boun.edu.tr

Copyright: Articles published in the Australasian Journal of Educational Technology (AJET) are available under a Creative Commons Attribution Non-Commercial No Derivatives Licence (CC BY-NC-ND 4.0). Authors retain copyright in their work and grant AJET right of first publication under CC BY-NC-ND 4.0.

Please cite as: Oner, D. (2020). A virtual internship for developing technological pedagogical content knowledge. Australasian Journal of Educational Technology, 36(2), 27–42. https://doi.org/10.14742/ajet.5192