Australasian Journal of Educational Technology, 2014, 30(3). ascilite

Quality experiences of inquiry in blended contexts – university student approaches to inquiry, technologies, and conceptions of learning

Robert A. Ellis
Institute for Teaching and Learning, University of Sydney, Australia

Evaluating the quality of inquiry using technology in blended contexts at university is a complex undertaking, as many variables could account for qualitative variation in the experience. This study examines reasons for qualitative variation in the university student experience of inquiry using technologies. It considers approaches to inquiry and technologies, conceptions of learning, and academic achievement. The results identify which aspects of the experience account for relatively more successful learning and which aspects tend to be related to less successful experiences, offering a nuanced understanding of the contribution of technology to successful experiences. The results have implications for the design of activities which span class and online contexts, and for the way we help students to be successful.

Introduction

Learning through inquiry is an integral part of the student experience in higher education. When its intent is realized by students and teachers in experiences of learning, it can promote problem solving (Norman & Schmidt, 2000; Dochy, Segers, Van den Bossche, & Gijbels, 2003), critical reflection (Schön, 1990), the construction of knowledge (Biggs & Tang, 2007) and quality outcomes (Ramsden, 2002). When inquiry in a predominantly face-to-face experience is significantly supported by online learning, the experience becomes more complex. In such contexts, students are commonly expected to follow trains of thought, ideas, questions and answers back and forth between the classroom and online contexts. The complexity is more than just structural.
It is in part realized in the strategies students need to develop to use the online context effectively for their whole learning experience. Effective online strategies are not the same as effective strategies in class, and integrating the two when working towards the same learning outcome is not necessarily straightforward. Pursuing ideas back and forth between the classroom and online contexts requires particular skills and an aptitude for learning, not all of which are yet fully understood.

Experiences of learning in blended contexts, those which occur both in class and online, are often referred to as 'blended learning' (Garrison & Kanuka, 2004). The position adopted in this study is that it is not so much the students' learning that is blended; rather, their experiences of learning in class and online need to be integrated sufficiently well across the different contexts that the two work together to support the students' understanding and outcomes. Focusing on experience as a construct allows some teasing apart of key aspects of learning, to identify which parts are most responsible for quality experiences and which parts cause interference. In describing these parts, it will be argued that while an experience may be designed for students to develop their understanding across class and online contexts, not all students realize the intentions of the teacher, and some actually experience fragmentation of the two. Clarity will be sought as to why some aspects may be causing interference and a fragmentation of the experience amongst students in the same cohort.

Identifying and providing quantitative measures of parts of the learning experience in blended contexts is difficult. Firstly, it is not necessarily clear which parts of the experience need to be considered. Secondly, there is a part-whole dilemma.
The issue is how to measure the contribution of one aspect of an experience of learning, amongst many aspects, to the quality of the learning outcomes. Another difficulty is the nature of the variation in the student experience. Two students in the same class can have the same opportunities for learning and use the same technologies in a blended context, yet report and achieve significantly different academic outcomes in the course. Why do some students have better quality experiences of learning than others? What is it in their experience that can explain relatively better or poorer outcomes?

To identify which parts of the student experience may offer insight into qualitative variation, research into the quality of the approaches to learning that students adopt and the ways they conceive of learning is considered (Prosser & Trigwell, 1999; Ramsden, 2002; Biggs & Tang, 2007). Some students conceive of learning as a holistic and coherent endeavour, actively pursue lines of questioning and thought, develop independence, and reflect on their studies in ways that promote understanding. Others take the easy way out, reproducing large chunks of information with little thought or reflection and not really engaging with the meaning of the learning experience (Prosser & Trigwell, 1999). Such studies provide a defined framework in which to pursue questions of qualitative variation in learning. Few studies, however, have considered university student approaches to inquiry supported by technologies in a blended context, or how this variation may relate to ways of thinking about learning and the academic outcomes students achieve in contexts which involve an integrated experience in class and online.

Using the issues above as motivation, this study investigates the university student experience of inquiry using technologies in a first year university course.
In doing so, it provides insight into variation in the student experience of inquiry, providing some measures of how students approach inquiry. It provides insight into variation in the student experience of learning technologies, by identifying qualitative variation in student approaches to the technologies. The study then relates these experiences to different ways students report thinking about learning, and to their course marks as an outcome measure.

View of learning and prior research

A focus on quality experiences of learning at university has occupied a coherent body of research over the last four decades (Biggs & Tang, 2007; Entwistle & Ramsden, 1983; Entwistle & McCune, 2004; Entwistle, McCune, & Hounsell, 2002; Laurillard, 1993, 2002; Marton & Säljö, 1976a, 1976b; Prosser & Trigwell, 1999; Ramsden, 2002). This research has identified key aspects of the student experience which offer explanations for qualitative variation. Figure 1 presents a visual representation of the key aspects of university student experiences of learning.

Figure 1. Concepts related to the quality of learning at university (from Entwistle et al., 2002). [The figure links seven elements: students' prior experience, knowledge, conceptions and reasons for studying; students' perceptions of the teaching-learning environment; the type of teaching-learning environment provided; university teachers' pedagogical subject knowledge and conceptions of teaching; how course material is selected, organized, presented and assessed; approaches to learning and studying; and the quality of learning achieved.]

Figure 1 can be understood in two parts. The top three rectangles are key aspects of the student experience: their approaches, prior characteristics and perceptions. The bottom three rectangles are key aspects of the teaching experience: their strategies for selecting materials, their knowledge and conceptions of teaching, and the type of environment they provide.
The two experiences contribute to the quality of learning represented in the centre.

Studies into student conceptions often look for qualitative variation in the structure of the experience of learning. Many studies over the last few decades have found a distinctive pattern in the student experience, identifying coherent and fragmented conceptions to be logically related to higher and lower academic outcomes respectively. These patterns have been found in student experiences of mathematics (Crawford, Gordon, Nicholas, & Prosser, 1998), student experiences of writing biology (Ellis, 2004) and student experiences of assessment (Fletcher, Meyer, Anderson, Johnston, & Rees, 2012).

Related studies have identified that qualitative variation in experiences of learning can also be partly explained by the quality of the approach adopted by the student. Early studies found associations between how students approached reading and the quality of their outcome (Marton & Säljö, 1976a, 1976b). Students who read the texts for the intentional meaning of the author tended to have a better understanding of the argument, while students who approached reading the texts with the intent of being able to recall what the text reported did not seem to fully grasp the argument. Similar associations have been found between approaches to learning physics and the quality of outcomes (Prosser & Millar, 1989). Some students reported focusing on the categorization and memorization of lists of materials to increase the information they had. Other students, who achieved more successful outcomes, reported focusing on abstracting meaning from their studies and a personal commitment to developing their understanding in order to better explain the phenomena under scrutiny. More recent and closely related studies into approaches have identified variation in the student experience of inquiry and learning technologies.
In a study of university students' experiences of learning technologies across eight higher education institutions, qualitative variation in their approaches to online learning was identified. Broadly summarizing, some students used online learning to identify the views of other participants, prepare for lectures and offer their views on topics as a way of engaging with the content and ideas of the course. Other students tended to eschew these aspects of the online experience, focusing more on information retrieval, downloading resources or ignoring the online aspect altogether (Ellis, Barrett, Higa, & Bliuc, 2011). A second study investigated students' approaches to inquiry in a fourth year pharmacy course. Qualitative variation in the student experience of inquiry into pharmaceutical practice was found to relate significantly to differences in academic achievement. Students who adopted an approach to inquiry which was active and critically aware, and who related their learning to authentic contexts, tended to perform at a significantly higher level as measured by course marks than students who reported an approach which was only nominally involved in the process, tending to passively observe and remain detached from the experience (Ellis, Bliuc, & Goodyear, 2012). While these studies provide rich descriptions of student approaches to technologies and inquiry, it is not clear if similar associations are to be found in other courses in other disciplines.

This study adds to the previous research by looking for qualitative variation in a first year blended course in health studies of Indigenous people at an Australian metropolitan university. The design of the study focuses on student approaches to inquiry, student approaches to technology, student conceptions of learning, and their academic achievement as measured by course outcomes.
In doing so, it seeks to clarify which parts of the experience are most responsible for relatively more successful experiences of inquiry, and which parts are likely to cause a fragmentation of the experience. The research questions used to guide this study are:

• To what extent is there qualitative variation in the student experience of inquiry using technologies in a blended university course?
• What aspects of the experience are responsible for a relatively more successful experience of inquiry?
• What aspects of the experience suggest why learning in class and online was fragmented for some students?
• What are the implications of these associations?

Learning context

In the course of their studies into Indigenous health, students were expected to inquire into the history and influence of various health policies used to support the health and well-being of Aboriginal and Torres Strait Islander people, and to consider current issues that continue to impact on their health. The purpose of the course was to explore how key issues pertaining to the health of Australia's Indigenous people have become a national agenda as Australia attempts to find the appropriate approach and the right model of care to improve all aspects of the health of Aboriginal and Torres Strait Islander people. To meet the outcomes of the course, students were expected to complete an inquiry-based task involving significant online research. Students were expected to identify a currently running health program or service for Indigenous people being delivered in a remote, urban or rural setting, and provide an analysis of its purpose, policy implications and benefit to the community. This task required the students to research the background of the program, its aims and objectives, how it was being delivered and what outcomes were being achieved.
For programs and services within the metropolitan area of the University, students were encouraged to make contact and visit people involved in delivering the program or service and seek their perspectives on why the program was working well. A key aim of the project was for students to apply their knowledge from the required readings and research of the course to the analysis of the service or program. The assessment weighting for this task was 40% of the total, with the other 60% made up of an essay and group presentation.

The task involved a large amount of work on the students' part. The structure of the task was open, leaving students to make decisions about which resources to pursue, how to use the technologies effectively, and the relevance of the findings to their task. Students used presentation, discussion and inquiry tools in a proprietary learning management system (Blackboard 9), and the website resources of the organisation they were investigating, to contextualise their assignment. Web browsers (Safari and Firefox) and text-production software (Microsoft Word and Adobe PDF) were used to review resources and complete the assignment.

Methodology

Participants and learning context

Towards the end of the students' semester, the researcher attended one of their lectures and asked for volunteers to participate in this study. Of the third year undergraduates majoring in the course on Indigenous Health at an Australian research-intensive university, 95 (65 female and 30 male) responded out of a total of 172. The students' ages ranged from 19 to 31, with a mean age of 25.5 and a standard deviation of 3.21.

Instruments

The students completed three questionnaires: the Approaches to learning through inquiry questionnaire, the Approaches to learning technologies questionnaire, and the Conceptions of learning questionnaire.
The approaches questionnaires were informed by related studies, including the revised version of the Study Process Questionnaire (Biggs, Kember, & Leung, 2001) and student interviews about their experience of inquiry and using technologies for learning (Ellis & Goodyear, 2010; Ellis et al., 2012). The Approaches to learning through inquiry questionnaire is divided into two scales: deep approaches to inquiry (α=.84) and surface approaches to inquiry (α=.75). The deep approaches scale described approaches to learning through inquiry as proactive, independent activities seeking understanding, whereas the surface scale investigated approaches oriented towards following a formulaic process and exhibiting dependency on others. Similarly, the Approaches to learning technologies questionnaire is composed of two subscales: deep approaches to technologies (α=.84), which views using technologies in learning as a way to broaden and deepen one's understanding of the subject matter; and surface approaches to technologies (α=.61), which views using technologies as tools to download files and obtain information. The Conceptions of learning questionnaire was based on closely related studies (Crawford et al., 1998; Ellis & Calvo, 2004). The questionnaire is divided into two subscales: cohesive conceptions (α=.92), which view the learning experience in the course as a way to deepen understandings of broader topics, reflecting on ideas in new ways, as well as relating personal experiences to the topics in the course; and fragmented conceptions (α=.78), which conceive learning in the course as finding the right answers and remembering facts in textbooks. Table 1 presents the subscales of all the questionnaires, the reliability, and defining items for each subscale.
Table 1
Questionnaire subscales including number of items, reliability coefficients and defining items

Deep approaches to inquiry (6 items, α = .84)
    I often take the initiative when pursuing a line of questioning in research.
    I draw different sources together when I am researching to get a better understanding.

Surface approaches to inquiry (8 items, α = .75)
    When I research something, just asking a question is usually enough.
    I do not spend much time thinking about key questions when I am researching something.

Deep approaches to technology (8 items, α = .84)
    I spend time using the learning technologies in this course to develop my knowledge on key topics.
    I try to use the learning technologies in this course to achieve a more complete understanding of key concepts.

Surface approaches to technology (5 items, α = .61)
    I restrict my use of learning technologies in this course to do as little as possible.
    I only use the learning technologies in this course to fulfil course requirements.

Cohesive conceptions of learning (8 items, α = .92)
    I think learning for this subject allows me to improve my understanding of the broader topics we study.
    Learning for this subject allows for relating my personal experiences to topics in order to understand them better.

Fragmented conceptions of learning (6 items, α = .78)
    Learning for this subject is just about finding the right answer.
    Learning for this subject is only about understanding the ideas, rather than the perspective of the person saying them.

Procedure

The questionnaires were administered to the students in the second last week of a thirteen-week semester. Participation in the survey was voluntary and students consented to their questionnaire ratings and course results being used for the analyses. Students' academic achievement was measured by the assignment task in the course, as it was an assessment of their inquiry abilities using technology.
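The reliability coefficients (Cronbach's α) reported for each subscale above follow the standard formula: α = k/(k−1) × (1 − Σ item variances / variance of total scores). A minimal sketch in plain Python is below; it is an illustration of the statistic, not the software used in the study (the paper does not name its statistics package), and the function names are the author's own.

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a subscale.
    item_scores: one list of ratings per item, all of equal length
    (one rating per respondent)."""
    k = len(item_scores)
    totals = [sum(ratings) for ratings in zip(*item_scores)]  # per-respondent totals
    sum_item_var = sum(variance(item) for item in item_scores)
    return k / (k - 1) * (1 - sum_item_var / variance(totals))

# Three perfectly consistent items give alpha of 1.0 (up to floating point)
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]))
```

In this framing, the α = .61 reported for surface approaches to technology reflects item totals whose variance is only modestly larger than the summed item variances, i.e. weaker internal consistency than the other subscales.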
This task accounted for 25% of their final grade in the course. The students' marks ranged from 12 to 24, with a mean of 17.70 and a standard deviation of 2.91.

Data analysis and results

To investigate the qualitative variation in the student experience, three types of analyses were performed: correlation and factor analyses at the level of variables, and cluster analyses at the level of the students. Correlation analyses were conducted to look at relationships between pairs of variables. Principal component factor analysis examined relationships amongst groups of variables. Using the two inferential statistics together increased the integrity of the study (Prosser, Trigwell, Hazel, & Waterhouse, 2000). The cluster analysis identified subgroups of students in which the similarities in their experiences within groups, and the differences in their experiences between groups, were maximised. The combined purpose of performing these analyses was to identify which parts of the experience of inquiry using technologies were most responsible for relatively successful experiences, and which parts seemed to be causing interference or fragmentation within the experience.

Correlation analysis

A series of Pearson product-moment correlation analyses was used to examine the relationships between variables. Table 2 presents the results of the correlation analyses for subscales of the questionnaires and students' academic achievement. The interpretation of the results followed the suggestions of Cohen (1977) that values of r at .10, .30, and .50 indicate small, medium and large effects in terms of the magnitude of association between variables.
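The statistic and the interpretive thresholds just described can be sketched in a few lines of plain Python. This is an illustration of the method, not the study's actual analysis code; the function names are illustrative.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def cohen_magnitude(r):
    """Cohen's (1977) rules of thumb for the magnitude of |r|."""
    r = abs(r)
    if r >= .50:
        return "large"
    if r >= .30:
        return "medium"
    if r >= .10:
        return "small"
    return "negligible"
```

Under these thresholds, the r = .53 reported below between surface approaches to inquiry and surface approaches to technologies counts as a large effect, while r = −.24 between surface approaches to inquiry and academic achievement counts as small.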
Table 2
Correlations between variables of the experience of learning and academic achievement

Variables                                   sai      dat      sat      cc       fc       aa
deep approaches to inquiry (dai)           -.27**    .46**   -.24*     .19     -.28**   -.05
surface approaches to inquiry (sai)                 -.14      .53**   -.13      .47**   -.24*
deep approaches to technologies (dat)                        -.38**    .40**   -.15      .01
surface approaches to technologies (sat)                              -.06      .39**   -.10
cohesive conceptions (cc)                                                      -.08      .15
fragmented conceptions (fc)                                                             -.02
academic achievement (aa)                                                                ---
*p < .05, **p < .01

The deep approaches to inquiry variable had a negative association with the surface approaches to inquiry variable (r=-.27, p<.01), a positive association with the deep approaches to technologies variable (r=.46, p<.01), and negative associations with the surface approaches to technologies variable (r=-.24, p<.05) and the fragmented conceptions of learning variable (r=-.28, p<.01). The surface approaches to inquiry variable showed positive associations with the surface approaches to technologies variable (r=.53, p<.01) and the fragmented conceptions variable (r=.47, p<.01), and a negative association with the academic achievement variable (r=-.24, p<.05). The deep approaches to technologies variable had a negative association with the surface approaches to technologies variable (r=-.38, p<.01) and a positive association with the cohesive conceptions variable (r=.40, p<.01). The surface approaches to technologies variable had a positive association with the fragmented conceptions variable (r=.39, p<.01). While the other associations were not significant, their directionality was consistent with the findings in the other cases.

Principal component factor analysis

To look at the structural relationships between the variables on experience of learning through inquiry and academic achievement, a principal component factor analysis was conducted. The two-factor solution, selected on the basis of the scree plot, is shown in Table 3.
Factor 1, explaining about 35.06% of the variance, showed substantial positive loadings on three learning experience variables: surface approaches to inquiry (.85), surface approaches to technologies (.73), and fragmented conceptions (.72); and a moderate negative loading on the students' academic achievement (-.33). Factor 2, explaining around 18.25% of the variance, had substantial positive loadings on three variables of students' learning experience: deep approaches to inquiry (.68), deep approaches to technologies (.86), and cohesive conceptions (.69). The results of the factor analyses strengthen the evidence for identifying which variables in the experience seem to be contributing to a fragmentation of the experience, and which parts are supporting coherence.

Table 3
Principal component factor analysis of the students' experience of learning through inquiry variables and students' academic achievement variable

Cluster analysis

Cluster analyses offer some evidence of the associations amongst variables experienced by groups of students in the population sample. As a means of analysing the relations between students' experiences of learning through inquiry, the variables representing the constructs of approaches to inquiry (deep and surface), approaches to technologies (deep and surface), and conceptions of learning (cohesive and fragmented) were subjected to a hierarchical cluster analysis using Ward's method (Seifert, 1995). The analysis resulted in two clusters based on the increasing value of the squared Euclidean distance between clusters. Students' academic achievement scores were assigned on the basis of cluster membership. An ANOVA to compare means was then used to determine the significance of the between-groups contrasts; effect sizes (eta-squared values) were also reported.
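Ward's method, as used here, agglomerates clusters by repeatedly merging the pair whose fusion causes the smallest increase in total within-cluster sum of squares; for two clusters A and B that increase is (|A||B| / (|A|+|B|)) times the squared Euclidean distance between their centroids. The sketch below illustrates that criterion in plain Python; it is a naive O(n³) illustration under that standard definition, not the clustering routine the study actually ran, and the function names are illustrative.

```python
def centroid(points):
    """Component-wise mean of a list of points."""
    n = len(points)
    return [sum(p[d] for p in points) / n for d in range(len(points[0]))]

def ward_merge_cost(a, b):
    """Increase in total within-cluster sum of squares if clusters a and b merge."""
    ca, cb = centroid(a), centroid(b)
    d2 = sum((x - y) ** 2 for x, y in zip(ca, cb))  # squared Euclidean distance
    return len(a) * len(b) / (len(a) + len(b)) * d2

def ward_cluster(points, k):
    """Agglomerate singletons until k clusters remain, always merging the
    cheapest pair under Ward's criterion."""
    clusters = [[p] for p in points]
    while len(clusters) > k:
        pairs = [(i, j) for i in range(len(clusters)) for j in range(i + 1, len(clusters))]
        i, j = min(pairs, key=lambda ij: ward_merge_cost(clusters[ij[0]], clusters[ij[1]]))
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters
```

Run on the study's data, each point would be one student's vector of six standardized subscale scores, and stopping at k = 2 yields the two-cluster solution reported below.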
Effect size helps to interpret whether the significance of a finding is practical and meaningful within the context of a study, or whether it is attributable to an artefact of the sample size. According to Cohen (1977), an eta-squared value of .01 can be interpreted as indicating a small effect size, .06 a medium effect size, and .14 or greater a large effect. Standardized scores were used for all the variables, reducing the original scores to a mean of 0.0 and a standard deviation of 1.0 in order to allow comparison. The results are shown in Table 4.

Table 4
Summary statistics of the two-cluster solution for the variables of experience of learning and learning achievement

Variables                             Cluster 1 (N=47)   Cluster 2 (N=48)      F      p     η²
                                      Mean (z-score)     Mean (z-score)
deep approaches to inquiry                  0.46              -0.45          24.18   .00   .21
surface approaches to inquiry              -0.67               0.65          73.20   .00   .44
deep approaches to technologies             0.23              -0.23           5.25   .02   .05
surface approaches to technologies         -0.51               0.50          32.98   .00   .26
cohesive conceptions                        0.28              -0.27           7.75   .00   .08
fragmented conceptions                     -0.60               0.58          51.49   .00   .36
academic achievement                        0.12              -0.11           1.28   .26   .01

The ANOVA identified statistically significant contrasts between the two clusters on all the variables of students' learning experience. Table 4 shows Cluster 1 with high scores on the deep approaches to inquiry variable (0.46), the deep approaches to technology variable (0.23) and the cohesive conceptions variable (0.28). It also shows Cluster 2 with positive scores on the surface approaches to inquiry variable (0.65), the surface approaches to technology variable (0.50), and the fragmented conceptions variable (0.58). The contrast on the academic achievement variable was not significant but was consistent in terms of direction.
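For a one-way ANOVA of the kind reported above, eta-squared is the proportion of total variance explained by group membership: SS_between / SS_total. A minimal plain-Python sketch of the statistic and Cohen's benchmarks follows; it is illustrative only (the function names are the author's own), not the analysis code used in the study.

```python
def eta_squared(groups):
    """Effect size for a one-way ANOVA: SS_between / SS_total,
    where groups is a list of lists of observations."""
    values = [v for g in groups for v in g]
    grand = sum(values) / len(values)
    ss_total = sum((v - grand) ** 2 for v in values)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    return ss_between / ss_total

def cohen_eta_label(e2):
    """Cohen's (1977) benchmarks for eta-squared."""
    if e2 >= .14:
        return "large"
    if e2 >= .06:
        return "medium"
    if e2 >= .01:
        return "small"
    return "negligible"
```

Under these benchmarks, the η² of .44 for surface approaches to inquiry in Table 4 is a large effect, the .05 for deep approaches to technologies a small effect, and the .01 for academic achievement negligible-to-small, consistent with its non-significant F.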
Variables                             Factor 1   Factor 2
deep approaches to inquiry                         .68
surface approaches to inquiry           .85
deep approaches to technology                      .86
surface approaches to technology        .73
cohesive conceptions                               .69
fragmented conceptions                  .72
academic achievement                   -.33
Percent of the variance explained: 53.31
Varimax rotation; loadings less than .30 removed

The results showed that amongst the 95 students who participated in the study, a group of 47 students was identified in Cluster 1. This group reported a tendency to adopt deeper approaches to inquiry, used technologies as a way to further develop ideas in learning and research, and conceived of learning as a process of understanding broader topics in a more complex and engaged way. The other group of 48 students, in Cluster 2, reported an experience orientated towards reproduction, characterised by passive inquiry; they used technologies merely as a means to fulfil course requirements, and conceived of learning as memorization and finding the correct answer.

Discussion and conclusion

Finding a way into the debate about the quality of learning at university is fraught with challenges: the part/whole nature of how technology supports learning, the different ideas students hold about learning, and differences in how they approach their learning. This article posits that understanding the learning experience from the perspective of students is an effective way to uncover qualitative variation in their learning. The approach used in this study enabled a teasing apart of the variations in the ways students approached inquiry, the ways they approached technologies to support their inquiry, and their ways of thinking about learning, in relation to each other and to their academic achievement as measured by their course mark. Before commenting on the findings of this study, it is worthwhile noting some of its limitations.
While the study highlights a number of key issues about the quality of the student experience and related issues concerning approaches to inquiry, technology, and conceptions of learning, it is designed around student self-report data. It could be strengthened with observational data. The two sets of data could then be used to elaborate on links between evidence and the nature of qualitative variation in the student experience of learning (Bourdieu, 1977). Further studies with increased sample sizes and domain variation are required to strengthen the validity and reliability of the findings.

The results of this study offer some evidence as to why different aspects of an experience of learning in a blended context contribute to qualitative variation. Approaches to inquiry which are active in their question formation, independent, reflective and analytical tend to be related to deep approaches to learning technologies, which students use as a way to improve their understanding and discover new knowledge related to their studies. These types of approaches are related to conceptions of learning as personally relevant and related to broader topics within the students' studies. In the correlation and factor analyses, these approaches and conceptions were related to significantly higher levels of academic achievement as measured by the course mark. In contrast, surface approaches to inquiry, those which are passive and detached, are related to approaches to learning technologies which limit usage to course requirements and reproductive activities, such as downloading content and finding information. These approaches also related to conceptions of learning involving finding answers, where the perspective of the answer was not necessarily understood. Experiences of learning characterised by these aspects were relatively less successful, as judged by the strength of the associations with the students' academic achievement.
A key outcome of the study is that it points to the relational nature of learning and technology in university education. It is not possible to attribute the variation in the quality of the experience to one agent, for example technology, as the reason why one experience may be more successful than another. A more nuanced understanding of the relational and interdependent associations amongst key aspects of the student experience is necessary to discuss qualitative variation in a meaningful way.

Implications

This study's important contribution to the literature is an emerging description of the role of learning technologies in approaches to learning in blended contexts. The outcomes suggest a more complex structure of approaches to learning, when technologies are involved, than previously understood. By isolating variations in approaches to learning technologies, the analysis allows for a treatment of the reasons for disruption (Hedberg, 2006). For example, using a tool in an approach to learning could thwart the student's intent if the student was unsure of how the tool could assist with the process. Alternatively, the intent of the student's approach may seek an outcome at odds with the intended outcome of the teacher, or the student's concept of what learning means in the course may inhibit an effective approach to the learning technologies. In all cases, the results of this study suggest that purposeful modelling of the role of technologies in the experience is likely to improve the quality of a number of aspects which contribute to overall outcomes. In other words, the finding that a deep approach to learning technology is significantly related to a deep approach to inquiry, a cohesive conception of learning and relatively higher academic achievement provides a way into the design of student-centred pedagogy.
Beginning with a description of deep approaches to learning technology, it suggests that success in helping students to appreciate what this might mean in the context of their course is likely to have positive spin-offs for the quality of their conceptions of learning, their approaches to inquiry and their academic achievement. This finding reinforces Bennett and Maton’s (2010) doubt that students can be left with little guidance when using technologies in learning. There is a plethora of technologies being adopted by teachers and course designers in universities (Lowendahl, 2010; Bonig, 2011). Often the hype surrounding these technologies (Lowendahl, 2012) obscures efforts to understand the more nuanced ways in which they make a meaningful contribution to the quality of student experiences of learning. The results presented in this study underline the importance of teasing apart those relationships in ways that reveal qualitative variation and the reasons for successful learning. Significantly, the results presented here suggest that a successful inclusion of technology in the university student experience of learning requires as much attention to students’ conceptions of learning and understanding of the nature of effective inquiry as it does to how to use the technology to effect inquiry successfully.

Further research

The outcomes here warrant a program of studies of a similar nature into the quality of technology-mediated student experiences of learning at university, increasing the sample size underpinning the findings as well as the variation across disciplinary bodies. Of particular benefit would be to broaden the research design to more explicitly include variables which investigate variation in the environments in which the learning occurs (Lizzio, Wilson, & Simons, 2002).
These could emphasise the combined contribution of physical and virtual contexts within a single experience of learning, which is increasingly the rule rather than the exception in universities, rather than setting the two in opposition as a type of either/or scenario (Ellis & Goodyear, 2010). This could require the introduction of additional variables into the study design, such as students’ perceptions of an environment in which both the physical and virtual spaces work together to support the same learning outcomes.

Acknowledgements

We are pleased to acknowledge the assistance of Ms Han in research support and Dr Lee for the excellent learning context in which the study was conducted.

References

Bennett, S., & Maton, K. (2010). Beyond the ‘digital natives’ debate: Towards a more nuanced understanding of students’ technology experiences. Journal of Computer Assisted Learning, 26(5), 321-331.
Biggs, J., Kember, D., & Leung, D. Y. P. (2001). The revised two-factor Study Process Questionnaire: R-SPQ-2F. British Journal of Educational Psychology, 71(1), 133-149.
Biggs, J., & Tang, K. (2007). Teaching for quality learning at university (3rd ed.). Buckingham: Open University Press.
Bonig, R. (2011). Best practices for mobile device learning initiatives in higher education. Stanford, CA: Gartner Research.
Bourdieu, P. (1977). Outline of a theory of practice. Cambridge: Cambridge University Press.
Cohen, J. (1977). Statistical power analysis for the behavioural sciences. New York: Academic Press.
Crawford, K., Gordon, S., Nicholas, J., & Prosser, M. (1998). University mathematics students’ conceptions of mathematics. Studies in Higher Education, 23(1), 87-94.
Dochy, F., Segers, M., Van den Bossche, P., & Gijbels, D. (2003). Effects of problem-based learning: A meta-analysis. Learning and Instruction, 13(5), 533-568.
Ellis, R. A. (2004). Student approaches to learning science through writing. International Journal of Science Education, 26(15), 1835-1854.
Ellis, R. A., Barrett, B., Higa, C., & Bliuc, A-M. (2011). Students’ experiences of learning technologies across the Asia Pacific. Asia Pacific Education Reviewer, 20(1), 103-117.
Ellis, R. A., Bliuc, A-M., & Goodyear, P. (2012). Student experiences of engaged inquiry in pharmacy education - Digital natives or something else? Higher Education, 64(5), 609-626.
Ellis, R. A., & Calvo, R. A. (2004). Learning through discussions in blended contexts. Educational Media International, 40(1), 263-274.
Ellis, R. A., & Goodyear, P. (2010). Student experiences of e-learning in higher education: The ecology of sustainable innovation. London: RoutledgeFalmer.
Entwistle, N., & McCune, V. (2004). The conceptual bases of study strategy inventories. Educational Psychology Review, 16(4), 325-345.
Entwistle, N., McCune, V., & Hounsell, J. (2002). Approaches to study and perceptions of university teaching-learning environments: Concepts, measures and preliminary findings. Retrieved from http://www.ed.ac.uk/etl/docs/ETLreport1.pdf
Entwistle, N., & Ramsden, P. (1983). Understanding student learning. London: Croom Helm.
Fletcher, R. B., Meyer, L. H., Anderson, H., Johnston, P., & Rees, M. (2012). Faculty and students conceptions of assessment in higher education. Higher Education, 64(1), 119-133.
Garrison, D. R., & Kanuka, H. (2004). Blended learning: Uncovering its transformative potential in higher education. The Internet and Higher Education, 7(2), 95-105.
Hedberg, J. (2006). E-learning futures? Speculations for a time yet to come. Studies in Continuing Education, 28(2), 171-183.
Laurillard, D. (1993). Rethinking university teaching: A framework for the effective use of educational technology. London: Routledge.
Laurillard, D. (2002). Rethinking university teaching: A framework for the effective use of educational technology (2nd ed.). London: Routledge.
Lizzio, A., Wilson, K., & Simons, R. (2002). University students’ perceptions of the learning environment and academic outcomes: Implications for theory and practice. Studies in Higher Education, 27(1), 27-51.
Lowendahl, J-M. (2010). Training, support and sourcing of e-learning platforms in higher education, 2006 to 2010. Stanford, CA: Gartner Research.
Lowendahl, J-M. (2012). Hype cycle for 2012. Stanford, CA: Gartner Research.
Marton, F., & Säljö, R. (1976a). On qualitative differences in learning I: Outcome and process. British Journal of Educational Psychology, 46(1), 4-11.
Marton, F., & Säljö, R. (1976b). On qualitative differences in learning II: Outcome as a function of the learner’s conception of the task. British Journal of Educational Psychology, 46(2), 115-127.
Norman, G. R., & Schmidt, H. G. (2000). Effectiveness of problem-based learning curricula: Theory, practice and paper darts. Medical Education, 34(9), 721-728.
Prosser, M., & Millar, R. (1989). The ‘how’ and ‘what’ of learning physics. European Journal of Psychology of Education, 4(4), 513-528.
Prosser, M., & Trigwell, K. (1999). Understanding learning & teaching: The experience in higher education. Buckingham: Society for Research into Higher Education & Open University Press.
Prosser, M., Trigwell, K., Hazel, E., & Waterhouse, F. (2000). Students’ experiences of studying physics concepts: The effects of disintegrated perceptions and approaches. European Journal of Psychology of Education, 15(1), 61-74.
Ramsden, P. (2002). Learning to teach in higher education (2nd ed.). London: Routledge.
Schon, D. (1990). Educating the reflective practitioner. San Francisco: Jossey-Bass.
Seifert, T. (1995). Characteristics of ego- and task-oriented students: A comparison of two methodologies. British Journal of Educational Psychology, 65(1), 125-138.

Corresponding author: Robert A. Ellis, robert.ellis@sydney.edu.au

Australasian Journal of Educational Technology © 2014.
Please cite as: Ellis, R.A. (2014). Quality experiences of inquiry in blended contexts – university student approaches to inquiry, technologies, and conceptions of learning. Australasian Journal of Educational Technology, 30(3), 273-283.