Australasian Journal of Educational Technology, 2019, 35(5). 153

Creating mathematics formative assessments using LaTeX, PDF forms and computer algebra

Katherine Herbert, Dmitry Demskoi, Kerrie Cullis
Charles Sturt University

Formative assessment benefits both students and teaching academics. In particular, formative assessment in mathematics subjects enables both students and teaching academics to assess individual performance and understanding through students’ responses. Over the last decade, educational technologies and learning management systems (LMSs) have been used to support formative assessment design. In mathematics, this is problematic because of the inflexibility of LMS and educational technology tools. Automating formative assessment generation and marking to support mathematics learning is made possible by utilising specific software and technologies in new ways. This paper proposes a new method of creating mathematics formative assessments using LaTeX and PDF forms in conjunction with a computer algebra system (e.g., Maple), independent of an LMS. This method is implemented in undergraduate mathematics subjects servicing non-mathematics–focused higher education courses. The method generates individualised assessments that are automatically marked. Results show that the method provides the teaching academic with a more efficient way of designing formative mathematics assessments without compromising the effectiveness of the assessment task. This study contributes to the growing research on mathematics in higher education. The implication is an increased understanding of how existing technology, implemented in new ways, can potentially benefit both mathematics students and teaching academics.

Introduction

There is a difference between a traditional mathematics assessment and what contemporary learning management systems (LMSs) can produce.
Those generated using the tools available within an LMS are restricted to either multiple-choice or fill-in-the-blank questions. The obvious problem with designing these types of questions in an LMS is the inflexibility of the platform. The teaching academic writing assessment questions is confined to certain types of questions due to the limited mathematics notation capability of the LMS. Fill-in-the-blank type questions, in particular, are limited in terms of the correct responses students may give (e.g., 0.2 versus 1/5) since the current systems cannot identify that these are the same. The inflexibility of an LMS-generated online assessment is further compounded by the requirement for students to be online. Students in rural and remote areas may be disadvantaged by online assessments as they may experience Internet outages in the middle of completing an online assessment, which causes significant distress. This negatively impacts on students’ learning experiences in mathematics, with students associating the feelings of distress with the mathematics itself rather than the technology. This leads to less engagement with the learning processes required to master mathematics.

In addition to the inflexibility of the LMS and the requirement to have online access to assessments when completing them, another challenge facing academics in higher education is cheating. This is especially common in mathematics, where students receive the same assessment questions. They can easily sit together and copy answers from their peers when completing both offline and online assessments. While we acknowledge that this is a common challenge across disciplines, it is well documented that students who undertake mathematics as part of non-mathematics–focused courses opt not to engage in deep learning (Iannone & Simpson, 2015).
A common behaviour observed by the authors is students’ focus on getting the right answer, whether by copying or following a recipe, without understanding how they arrived at the answer. Having taught these subjects for a number of years, we have a real sense that students enrolled in mathematics service subjects simply want to get through the subject and pass it. The authors see this as one factor which leads to students not fully taking advantage of the opportunities to engage in deep learning in this discipline.

It is these specific challenges that motivated the authors to develop mathematics assessment that:

• is formative in nature;
• is individualised (i.e., a different set of assessment questions will be presented to each student as explained later in the Procedure section of this paper);
• can be marked automatically;
• is flexible in terms of both the kind of questions asked and possible answers (e.g., recognises different types of answers like 1/5 and 0.2 as correct); and
• can be delivered independent of an LMS.

This paper will look at how automated formative assessments could bring about the necessary learning experiences for students undertaking mathematics as a service subject, while overcoming practical and technological barriers. We begin by defining the problem in our specific context, namely higher education mathematics learning and teaching in Australia. The literature on automated formative assessments is explored, framing our contextualised problem. The new method for designing formative assessment for undergraduate mathematics service subjects is then described. Finally, we discuss the opportunities available to enhance students’ learning experiences, as well as teaching academics’ professional practice using our proposed method.
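The flexibility requirement above — recognising, for example, 1/5 and 0.2 as the same answer — illustrates the kind of check typical LMS fill-in-the-blank questions cannot make. As a minimal, hypothetical illustration (in Python rather than the Maple system used later in this paper), exact rational comparison avoids naive string matching; the function name here is our own invention:

```python
from fractions import Fraction

def equivalent(submitted: str, correct: str) -> bool:
    """Return True if two answer strings denote the same rational number.

    Fraction accepts both fraction strings ("1/5") and decimal strings
    ("0.2"), so the comparison is exact rather than floating-point.
    """
    try:
        return Fraction(submitted) == Fraction(correct)
    except (ValueError, ZeroDivisionError):
        return False

print(equivalent("1/5", "0.2"))   # the two forms denote the same value
print(equivalent("0.33", "1/3"))  # 0.33 is not exactly one third
```

Recognising equivalence of symbolic expressions (e.g., sin(2x) and 2 sin(x) cos(x)) needs genuine computer algebra, which is where a system such as Maple comes in.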
The problem and context

The authors acknowledge that it has always been a challenge for those who teach in the mathematics discipline to provide engaging and valuable learning experiences that equally serve the needs of both mathematics and non-mathematics–focused courses. As evidenced by recent studies of higher education mathematics subjects in the United Kingdom and Europe (Fhloinn & Carr, 2017; Frode, 2017; Iannone & Simpson, 2015), subject engagement in the mathematics discipline by students is heavily influenced by whether or not a student is undertaking a course that is mathematics-focused, for example, engineering, science or actuarial studies. In fact, a closer look revealed that students who engaged with the intricacies of the mathematical method (how to arrive at a solution or answer a question) were also those who were enrolled in a mathematics-focused course. In contrast, students who were enrolled in non-mathematics–focused courses such as the arts, social sciences, or even business courses, were less likely to engage with the subject and hence less likely to develop the deep learning required to master mathematics. It is the latter issue that this paper aims to target. While the authors of this paper have long accepted the realities of teaching the skills and knowledge of the discipline into non-mathematics–focused courses, it has not deterred them from exploring new ways to encourage student engagement and deep learning. Therefore, in the context of mathematics service subjects, the authors of this paper aimed to answer the following questions:

• How do automated formative assessments engage students in deep learning within mathematics service subjects?
• How does the proposed new method provide an efficient way to deliver assessments to students both offline and online?

To help address these two questions, it is helpful to explore the current research in this area.
Deep learning is concerned with comprehending the meaning of the subject matter, versus surface learning, which aims to reproduce or imitate skills, that is, rote learning (Akyol & Garrison, 2011). Deep learning in our context is necessary. The simple to complex skills and knowledge learned in mathematics can be transferred and applied in students’ learning in other subjects and disciplines. It is well documented that the transfer of skills and knowledge learned in mathematics subjects into other areas can be facilitated if students can be motivated to engage in deep learning in mathematics subjects, that is, be able to examine new facts and ideas critically, and tie them into existing cognitive structures making numerous links between ideas (Houghton, 2004). Studies in assessment design have shown that deep learning is more likely to occur through intentional formative assessment type tasks (Baleni, 2015; Man Sze Lau, 2016; Spector et al., 2016).

The authors were guided by Ecclestone’s (2010) formative assessment embedded pedagogy. According to Ecclestone, the benefits of formative assessments to students include topic-specific assessment and feedback, which enables them to assess gaps in their knowledge throughout the duration of the subject delivery. In turn, teaching staff can assess their own performance through students’ responses to the assessments. According to Ecclestone, this iterative feedback space for students and teaching staff provides opportunity for deep approaches to learning. While identifying the skills gap for students is one main outcome of intentionally designed formative assessments, it also provides opportunities for both teaching staff and students to find ways forward so that students can progressively build these skills. That is, there is a focus on comprehending the how of the subject, as well as understanding the subject matter.
This is important in mathematics where simple skills and concepts, learnt by students, build on each other. If grasped well, students can pull these skills and concepts together to solve more complex problems. The Discussion section of this paper will expand on this, but suffice it to say that students were seen to be more proactive in practising the simpler skills and concepts in the low-stakes formative assessments, showing curiosity for the learning process before pulling the skills and concepts together in the high-stakes assessments. By designing formative assessments that get students interested in how they learn as well as what they learn, we address the problem of deep learning outlined in this paper. Ecclestone (2010) argues the power of formative assessment lies in the encouragement of students to undertake higher-level cognitive thinking and deep learning, more than their perceived desire to do so. This is evidenced in our pilot case study. In the case of our mathematics service subjects, the intention of the low-stakes formative assessments was to provide an opportunity for students to engage with the subject content and scaffold their skills-building towards successfully completing the more complex problems in the high-stakes assessment tasks. Individualising the assessments, whereby each student received a different set of questions, encouraged students to practise the skills required on their own. We found that this provided space for students to engage in deep learning practices, where their indifference towards mathematics converted to positive and real interest in the how of their learning.

The other aspect of our study is the efficient use of the lecturer’s time. Creating assessment questions in mathematics is time-consuming.
This time is substantially increased when assessment questions need to be individualised, not only to provide students with an engaging task, but also to provide an opportunity for students to build academic integrity (Wylie & Lyon, 2015). This was evidenced in a study by Passmore, Brookshaw, and Butler in 2011. Designing effective assessment tasks in mathematics more efficiently and in less time is also a key driver in automating formative assessment tasks.

In recent years, there has been a proliferation of educational technologies aimed at supporting the implementation of formative assessment in mathematics learning and teaching in more efficient ways. Three recent studies, completed by Barana and Marchisio (2016), Fhloinn and Carr (2017), and Frode (2017), explored automated formative assessment in mathematics in higher education in Europe. These studies provide insight into the advantages of developing automated formative assessments. The advantages discussed in all three studies include fewer resources and less time required to generate and mark assessments, especially for subjects with large enrolments; more timely feedback to students; and support for randomised and individualised questions, moving away from bulk multiple-choice and fill-in-the-blank type questions. However, as evidenced in the studies conducted on engaging students in valuable and useful learning experiences using technology, specifically in mathematics, there are still challenges with these methods (Iannone & Simpson, 2015). The problem lies in educational technologies’ limitations when delivering individualised and useful assessment questions that are also efficiently generated online. Specifically, the three recent studies mentioned here discussed the challenge of delivering automated assessments online when students have limited access to the Internet.
For regional and remote students, online automated formative assessments can be difficult to access and complete. This issue around completing assessment online is very relevant to the authors’ context – a shared context with other institutions in both local and global locations. Specifically, the context is a higher education institution in Australia servicing regional and at times remote areas. Charles Sturt University is a regional university providing higher education services nationally and internationally both face-to-face and online. Students who attend Charles Sturt University and study mathematics subjects are generally enrolled in non-mathematics courses. Therefore, most mathematics or statistics subjects are studied as service subjects. It is within this context that the authors of this paper test if a new method of mathematics assessment can enhance the student learning experience.

The direction of this study and the method described in the next section are informed by the insights from the literature around formative assessments and their automation in mathematics subjects. This paper now turns its attention to how the delivery of automated formative assessments, independent of the Internet and allowing for sophisticated and efficient individualised assessment generation, was designed.

Method

The method described here is informed by established formative assessment strategies defined by Ecclestone (2010). Based on observations from marking the assessments in previous teaching sessions at Charles Sturt University, it was evident that a large number of students undertaking mathematics subjects were using the assessment to guide their study in the subject rather than studying the content in a sequential way. Discovering this behaviour highlighted the need to explore ways to encourage students to engage with the subject content prior to attempting a relatively high-weighted assessment.
It was decided to introduce a low-stakes assessment that would inform and progressively assess the students’ grasp of the concepts, to be completed two weeks prior to the due date of the high-stakes assessment. This formative assessment tested simple techniques and concepts vital to successfully completing the more complex problems in the high-stakes assessment. It was hoped that this assessment would provide the impetus for students to engage with the subject content. The automated feedback would then identify those techniques or concepts which they had not understood and which required more work, usually during lectures, tutorials or online meetings, prior to attempting the higher-weighted assessment.

Participants

The PDF form assessment was piloted with a distance cohort of 63 students for an assessment in a first-year business statistics subject. This subject is part of a business undergraduate degree that is not a mathematics-specific discipline. This cohort was chosen as the subject was offered over summer by distance and was restricted to a single cohort. The size of the cohort was also large enough to make useful observations. An assessment requiring 20 numerical responses was prepared, and assessment text files were generated for each student.

There are many computer algebra systems available today, some of which are completely free. We used a proprietary system called Maple. This is a fully featured system with a powerful programming language. Maple can work with a variety of data formats, one of which is the comma-delimited format. We used a Maple function, called ImportMatrix, to import the data. For the pilot, the authors decided to use five versions to test the new assessment process. Given that the students were studying online, the authors determined there were fewer opportunities to share tests and hence answers than for an on-campus cohort, so five would be sufficient.
This could be extended to a greater number of versions if required, so that each student received a different assessment. The authors could also randomise question banks if required, by using more than one parameter.

Procedure

The first step in generating assessment files is to import the gradebook (roll book) from an LMS (e.g., Blackboard or Interact 2) into the computer algebra package (Maple). Computer algebra enables the user to manipulate symbolic expressions and recognise equivalence of seemingly different expressions, for example, sin(2*x) = 2*sin(x)*cos(x). All questions in an assessment depend on a number (parameter). A student ID or student name can be used to assign this parameter to each student. For our pilot we used the student ID. Although the files are generated by Maple, they are actually LaTeX files, which are then converted to PDFs using pdflatex, a program from the LaTeX system. LaTeX is the de facto standard for the communication and publication of scientific documents. It is available for most operating systems free of charge. We used an implementation of the system for Windows called MiKTeX.

Once the PDFs are generated, they are uploaded to an LMS. To access the assessment, a student enters their student ID, downloads the PDF file, and saves it to their local computer or device. From a technical point of view, an assessment file is a PDF form. Below each question in the assessment, there is a field where students enter their answers. In our pilot the students had one week to work on the assessment and calculate their answers. To complete the assessment the students opened the file using PDF forms–compatible software and entered their answers. Students were advised to use Adobe Reader DC to complete the assessment as it is free and readily available for most users and devices.
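The authors’ implementation of these steps is Maple code, provided in the Appendix. Purely as a hypothetical sketch of the same pipeline — a deterministic per-student parameter derived from the student ID, an individualised LaTeX source file per student, and a threshold comparison for numerical marking — the logic might look as follows in Python. All question data, function names and the tolerance here are invented for illustration:

```python
import hashlib

def parameter_from_id(student_id: str, versions: int = 5) -> int:
    """Derive a stable version number from a student ID.

    Hashing makes the assignment deterministic: the same student
    always receives the same version of the assessment.
    """
    digest = hashlib.sha256(student_id.encode()).hexdigest()
    return int(digest, 16) % versions

def make_latex(student_id: str) -> str:
    """Generate individualised LaTeX source for one student.

    The single question is a placeholder; a real assessment would
    generate many questions whose data depend on the parameter, and
    would place a PDF form field below each question.
    """
    p = parameter_from_id(student_id)
    a, b = 3 + p, 7 + 2 * p  # question data varies with the parameter
    return (
        "\\documentclass{article}\n"
        "\\begin{document}\n"
        f"Question 1: compute ${a} \\times {b}$.\n"
        "\\end{document}\n"
    )

def mark_numerical(submitted: float, correct: float, tol: float = 1e-3) -> bool:
    """Threshold comparison for numerical answers: accept a response
    within a small tolerance of the correct value."""
    return abs(submitted - correct) < tol

# One .tex file per student; each would then be compiled, e.g.:
#   pdflatex assessment_<student_id>.tex
tex = make_latex("11111111")
```

In the authors’ actual workflow, Maple performs both the parameterisation and the marking, and the compiled PDFs contain genuine form fields from which the answers are later merged and compared.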
Once complete, they uploaded the saved file to the Electronic Assignment Submission Tracking System (EASTS) just as they would a normal assessment. EASTS is a platform used by Charles Sturt University to manage assessment submissions and returns. The completed assessment files were then downloaded by the teaching staff. The files are downloaded in a single batch and extracted into a folder. Students’ answers were then compiled into a spreadsheet using the Adobe Acrobat Pro function: Tools > Forms > More Form Options > Manage Form Data > Merge Data Files. We called this file the answer spreadsheet.

The assessments are then marked using computer algebra (Maple). The teaching staff import the completed answer spreadsheet into Maple, where the answers are compared with the correct answers. Different algorithms could be used to implement this step depending on the type of answer or response required. For example, if the answer is numerical, then one can simply subtract one number from the other and check whether the difference is smaller than some specified threshold. This marking process can be made quite flexible using the programming tools of computer algebra.

For each submitted assessment file, Maple generates a LaTeX file that contains the assessment questions, the correct answers, and the answers entered by each student. If the correct answer does not match the one entered by the student, custom feedback may be inserted. For the pilot discussed in this paper, the feedback included references to the relevant concept or technique examined in the question. Alternatively, one could refer students to relevant examples from the textbook. The final step is to compile these LaTeX files into PDFs and return them to the students. The marked assessments are uploaded to EASTS in a single batch and made available for students to view in the LMS. The whole process is depicted in Figure 1.

Figure 1. The process of creating the formative assessment

In the Appendix, we provide the Maple source code that implements the diagram (Figure 1). The purpose of the code is to illustrate the paradigm. It is deliberately minimalistic and can be extended in a variety of ways, such as using PDF’s JavaScript functionality to validate submitted answers. The source code provided here implements one type of answer, namely short-answer numerical; however, it could be extended to a variety of question types, a possibility currently being explored by the authors in another project. With some programming, one can design more sophisticated questions with mathematical symbols or algebraic answers and constructor questions. In the next section, the authors look at the results of the implemented method for a cohort of undergraduate business students.

Results

The questions this study aimed to address were:

• How do automated formative assessments engage students in deep learning within mathematics service subjects?
• How does the proposed new method provide an efficient way to deliver assessments to students both offline and online?

We look at the latter question first, focusing on the implementation of the technology. Then we move on to the former question and look at student engagement.

Implementation of the technology

The method using Maple and PDF enables great flexibility in the type of question which can be created or generated, and is not restricted to multiple-choice. It is possible to include complex formulae, tables, diagrams or images in a question. Questions requiring a numerical answer increase the chance of students arriving at their answer using a correct technique rather than just a lucky guess, which can be the case for multiple-choice questions.
Therefore, this type of question requires students to engage with the content to determine the answer and is likely to be a more reliable measure of student understanding. This method also allows for answers which are not numerical.

Students downloaded their individualised assessment to their computer and had a week to work on the questions. Students indicated that this was better than a multiple-choice test to be completed within, say, 60 minutes, as it allowed time to review topics when one is not sure how to answer a question, and to make multiple attempts before finalising their answer. For our assessment, we allowed one week, but it is easy to impose shorter time limits for completion using the release time and the due date and time of the assessment item. These options could be implemented in the overwhelming majority of LMSs.

Once marked, the assessment is uploaded to the LMS ready for students to download. Students then have both a soft copy and a hard copy (if they print it) of the assessment with the original questions, their answers and the correct answers to use for assignment preparation and for exam revision towards the end of the session. The assessment is automatically marked, saving time for the marker. While the students receive individualised assessments, this method also enables automatic marking, which is a feature of most LMSs and online learning publishing tools; however, most cannot accommodate the sophisticated mathematical expressions and formulae that our method can achieve due to using a full-featured LaTeX system.

Student engagement

There was increased student engagement using this assessment. Of the 63 students enrolled in the subject, 59 completed the automated formative assessment. In the previous sessions, students were provided with a series of weekly multiple-choice self-assessment quizzes. In the same session the year before introducing this assessment, only 18 out of 119 students enrolled online attempted these quizzes.
The authors acknowledge that this is most likely due to the weighting (albeit very small) attached to this new assessment. However, it did also encourage students to consider the questions in more depth. For the new assessment, as students reviewed their marked assessment, some posted questions on the subject discussion board, demonstrating an increased interest in exploring the steps to the solution. They requested help with understanding how to obtain the correct answer where they were unable to work it out for themselves. The lecturer then used the online meeting that week to work through some of the assessment questions raised by the students. This reflective learning behaviour, as well as engagement in deep learning practices, was not evident in previous sessions where multiple-choice self-assessment quizzes were made available. As evidenced by this change of behaviour, formative assessment, which is now more efficiently delivered using the new method, appears to motivate students’ engagement.

There was evidence of a positive correlation between the marks students received from the automated formative assessment and the marks they received in the higher-stakes assessment on the same topic (see Figure 2). This suggests that those students who did not utilise or do well in the automated formative assessments performed poorly on the higher-stakes assessment. This evidence could be shown to students in future sessions to encourage greater engagement with the lower-stakes assessment to prepare for the higher-stakes assessment.

Figure 2. Positive correlation between the marks students received from the automated formative assessment and the marks they received in the higher-stakes assessment on the same topic

Discussion

Implications of the results

Although this study was limited to a single cohort of 63 online learning students, the results are encouraging.
Despite Iannone and Simpson’s (2015) argument that students’ engagement with mathematics subject content relies heavily on whether or not they are enrolled in a mathematics-specific course, this pilot demonstrates that students’ engagement can be positively influenced by a formative assessment that encourages students’ curiosity. The subject discussion forum posts, as well as the online tutorial meetings held after each automated formative assessment was marked and returned, made it evident that students’ curiosity was triggered by this type of assessment. This demonstrated an increase in student interest in the steps of the solution rather than students focusing merely on the correct answer.

The automated formative assessment used in this pilot study was designed to encourage students to engage with the subject content prior to attempting a relatively high-weighted assessment. There is evidence that this did have the desired outcome, as 59 of the 63 students completed the automated formative assessment, exhibiting greater engagement with the subject content compared with 18 out of 119 in the same session the year before. At the end of each online tutorial, as well as at the end of the teaching session, students were asked whether the new form of assessment encouraged them to engage with the topics being assessed. The majority of students who responded agreed the new form of assessment achieved this objective.

We also asked students their opinion about the format of the new online assessment. The majority of students who answered indicated it was easy to download the assessment and upload the completed assessment for marking. They also agreed that it was made explicitly clear how to enter answers into the PDF form. Students also provided additional feedback about the format of the assessment.
They indicated that it was better than a multiple-choice test to be completed within, say, 60 minutes, as it allowed time to review a topic when one is not sure how to answer a question, and to make multiple attempts before finalising their answer. This further demonstrates greater engagement with the subject content than with a typical multiple-choice question.

Recommendations

In terms of choosing the right technology to support this method, the authors would like to highlight that in the literature the preferred technology for automating formative assessment is a specific software package, Maple T.A. (recently renamed Möbius Assessment). Maple T.A., from our experience, can provide all the required options to create effective automated formative assessments. While this may be true, it is important to note that this ready-made solution is also an expensive one. Looking towards available technologies for institutions that may not necessarily have the budget to purchase Maple T.A., the authors have created a method and solution that not only achieves similar outcomes to Maple T.A. but also provides a more accessible type of assessment for students in regional and remote areas where Internet access can be poor, intermittent or even negligible.

Our method falls into the category of assessments which have separate question delivery and answer collection. This idea has been used previously by Passmore et al. (2011). Their approach also involves computer algebra software (Maxima); however, it requires programming on the server side (using PHP). Our system, on the other hand, is more offline; that is, the web server is only required as storage for the assessment tasks. With minimal modifications, this method can be implemented even without a web server, for example, via email.
This opens up the opportunity for our method to be offered to and used by other institutions, no matter the size of their budget, in the delivery of any mathematics subject.

Conclusion

At the outset of this study, the authors aimed to address two questions:

• How do automated formative assessments engage students in deep learning within mathematics service subjects?
• How does the proposed new method provide an efficient way to deliver assessments to students both offline and online?

It was evident in this study that there was an increase in student engagement using this new method. There was a marked increase in the number of students completing the low-stakes formative assessment, together with indications of an increased interest in exploring the steps to the solution. This reflective learning behaviour, as well as engagement in deep learning practices, had not been evident in previous sessions where multiple-choice self-assessment quizzes were made available. As evidenced by this change of behaviour, formative assessment, which is now more efficiently delivered using the new method, supports students’ motivation to learn.

It was also evident in this study that the new method provides an efficient way to deliver mathematics assessments to students. Existing LMS tools do offer automatic marking; however, these tools are not flexible enough to generate answers for complex mathematical (or statistical) questions, nor to handle mathematical expressions. Through the combination of Maple and LaTeX, the authors were able to achieve automated marking while utilising the flexibility of a programming language to accommodate mathematical expressions in both questions and solutions. This makes creating sophisticated and individualised assessments possible in less time.
One of the key benefits of this method is that students without good Internet access are still able to complete the assessment, as they do not need to be online to do so. They need Internet access only twice: first to download the assessment and again to upload the completed assessment. An additional advantage of this method (students saving the PDF form to their local computer) is that students are able to review completed assessments in preparation for assignments and examinations. It is this final result that provides us with further motivation to implement this new method in more mathematics service subjects, with both face-to-face and online cohorts.

This study was brought about by a contextual challenge of higher education: student engagement with mathematics subject content. Mathematics educators can draw from this study and apply our proposed new method, using LaTeX, PDF forms and Maple, to their mathematics subjects to create more effective mathematics assessments that:
• are formative in nature;
• are individualised;
• can be marked automatically;
• are flexible in terms of both the kinds of questions asked and the possible answers; and
• can be delivered independently of an LMS.

The next step in this study is to examine other Australian higher education institutions' use of Maple T.A., to compare with the results achieved in this pilot study. In the meantime, for mathematics educators interested in exploring this new method in their own subjects, the authors provide the source code in the Appendix under a Creative Commons licence.

Acknowledgements

The authors are grateful to Associate Professor Yeslam Al-Saggaf for reading the manuscript and making numerous constructive remarks.

References

Akyol, Z., & Garrison, D. R. (2011).
Understanding cognitive presence in an online and blended community of inquiry: Assessing outcomes and processes for deep approaches to learning. British Journal of Educational Technology, 42, 233–250. https://doi.org/10.1111/j.1467-8535.2009.01029.x
Baleni, Z. (2015). Online formative assessment in higher education: Its pros and cons. The Electronic Journal of e-Learning, 13(4), 228–236. Retrieved from http://www.ejel.org/issue/download.html?idArticle=433
Barana, A., & Marchisio, M. (2016). Ten good reasons to adopt an automated formative assessment model for learning and teaching mathematics and scientific disciplines. Procedia - Social and Behavioral Sciences, 228, 608–613. https://doi.org/10.1016/j.sbspro.2016.07.093
Ecclestone, K. (2010). Transforming formative assessment in lifelong learning. Milton Keynes, UK: Open University Press.
Fhloinn, E. N., & Carr, M. (2017). Formative assessment in mathematics for engineering students. European Journal of Engineering Education, 42(4), 458–470. https://doi.org/10.1080/03043797.2017.1289500
Rønning, F. (2017). Influence of computer-aided assessment on ways of working with mathematics. Teaching Mathematics and Its Applications, 36(2), 1–14. https://doi.org/10.1093/teamat/hrx001
Houghton, W. (2004). Engineering subject centre guide: Learning and teaching theory for engineering academics. Loughborough, UK: HEA Engineering Subject Centre.
Iannone, P., & Simpson, A. (2015). Students' preferences in undergraduate mathematics assessment. Studies in Higher Education, 40(6), 1046–1067. https://doi.org/10.1080/03075079.2013.858683
Man Sze Lau, A. (2016). Formative good, summative bad? – A review of the dichotomy in assessment literature. Journal of Further and Higher Education, 40(4), 509–525. https://doi.org/10.1080/0309877X.2014.984600
Passmore, T., Brookshaw, L., & Butler, H. (2011). A flexible, extensible online testing system for mathematics. Australasian Journal of Educational Technology, 27(6), 896–906. https://doi.org/10.14742/ajet.919
Spector, J. M., Ifenthaler, D., Sampson, D., Yang, L., Mukama, E., Warusavitarana, A., Lokuge Dona, K., Eichhorn, K., Fluck, A., Huang, R., Bridges, S., Lu, J., Ren, Y., Gui, X., Deneen, C. C., San Diego, J., & Gibson, D. C. (2016). Technology enhanced formative assessment for 21st century learning. Educational Technology & Society, 19(3), 58–71. Retrieved from https://www.j-ets.net/ets/journals/19_3/ets_19_3.pdf
Wylie, E. C., & Lyon, C. J. (2015). The fidelity of formative assessment implementation: Issues of breadth and quality. Assessment in Education: Principles, Policy & Practice, 22(1), 140–160. https://doi.org/10.1080/0969594X.2014.990416

Corresponding author: Katherine Herbert, kherbert@csu.edu.au

Please cite as: Herbert, K., Demskoi, D., & Cullis, K. (2019). Creating mathematics formative assessments using LaTeX, PDF forms and computer algebra. Australasian Journal of Educational Technology, 35(5), 153–167. https://doi.org/10.14742/ajet.4539

Appendix

Maple source code

The code has been tested on a PC running Windows 7 with Maple 15. This source code is covered under a Creative Commons Attribution-ShareAlike licence (https://creativecommons.org/licenses/by-sa/2.5/au/).
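A central piece of the listing below is the Form procedure, which assembles a hyperref form field so that the compiled PDF contains a fillable answer box. The same string assembly is shown here as a Python sketch for orientation (the function name and parameters are our illustration; the authors do this in Maple).

```python
# Sketch (Python, for illustration) of the string the Maple Form procedure
# assembles: a hyperref \TextField inside a Form environment, which becomes
# a named, fillable text box when the generated LaTeX is compiled to PDF.
def text_field(name: str, width_pt: int, value: str = "", prompt: str = "") -> str:
    return (r"\begin{Form}\TextField[name=" + name
            + ",value=" + value
            + ",width=" + str(width_pt) + "pt"
            + "]{" + prompt + r"}\end{Form}")
```

In the authors' system, one such string is written into the generated .tex file for each question, so every student's PDF carries its own named answer fields, which the marking pass later reads back by name.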
############################################################################
# CONTENT of assessment1.maplet
############################################################################
#====================================================================
# Beginning of file assessment1.maplet
restart;
assignparam := proc (n) local p; p := `mod`(n,4)+1; RETURN(p) end;
#====================================================================
# Paths & settings
#====================================================================
RootFolder:="D:/Autoassignment/";  # Path to Maple file; use / or \\ as separator
MarkedFolder:=cat(RootFolder,"marked/");  # Folder where marked assessments will be saved
StudentsFile:=cat(RootFolder,"rollbook.csv");  # Full path to rollbook file
Students:=ImportMatrix(StudentsFile, source=csv);
FileName:="Assignment";  # Assessment filenames will start with this
IDs:=convert(Students[1..-1,1],list);  # Assumes the 1st column of the rollbook contains student IDs
SurNames:=convert(Students[1..-1,2],list);  # Assumes the 2nd column contains students' surnames
FirstNames:=convert(Students[1..-1,3],list);  # Assumes the 3rd column contains students' first names
Params:=map(assignparam,map(convert,map(convert,FirstNames,'bytes'),`+`));  # Assign every student a parameter used to generate the individualised assessment
marking:=false;  # true/false
TotalMark:=0; MaxMark:=1;
#====================================================================
# Some auxiliary procedures
# Return terms containing 'A'
chn:=proc(f,A) local F,g,i;
  F := expand(f);
  if type(F,`+`) then RETURN(map(chn,F,A)); fi;
  if has(F,A) then RETURN(F) else RETURN(0); fi;
end;
Printf:=proc() global TexFile,FileID;
  if TexFile<>'default' then fprintf(FileID, args[1..nargs]);
  else printf(args[1..nargs]); fi;
end;
Form:=proc(params::set) local vl,nm,wdth,prmt;
  if has(params,name) then nm:=op(2,op(1,map(chn,params,name) minus {0})); else nm:=""; fi;
  if has(params,value) then vl:=op(2,op(1,map(chn,params,value) minus {0})); else vl:=""; fi;
  if has(params,width) then wdth:=op(2,op(1,map(chn,params,width) minus {0})); else wdth:=""; fi;
  if has(params,prompt) then prmt:=op(2,op(1,map(chn,params,prompt) minus {0})); else prmt:=""; fi;
  RETURN(cat("\\begin{Form}\\TextField[name=",nm,",value=",vl,",width=",wdth,"pt","]{",prmt,"}\\end{Form}")):
end;
GetAnswer:=proc(ID,question::string) local i,j,key,qkey,nrows,ncols,ans; global Reports;
  nrows:=op(1,Reports)[1]; ncols:=op(1,Reports)[2];
  i:=1; j:=1; key:=0;
  while i<=nrows and key<>i do
    if Reports[i,3]=ID then key:=i; i:=nrows; else i:=i+1; fi;
  od;
  if key=0 then RETURN("not given or wrong format"); fi;
  while j<=ncols do
    if Reports[1,j]=question then qkey:=j; RETURN(Reports[key,qkey]); else j:=j+1; fi;
  od;
  RETURN("not given or wrong format");
end;
FormN:=proc(StudentID,qlabel::string,correct_answer,tolerance) local answer_given,prmpt; global marking,TotalMark;
  if marking then
    Printf("Correct answer: $%s$ \\qquad ",Latex(correct_answer));
    answer_given:=GetAnswer(StudentID,qlabel);
    if not type(answer_given,{float,integer,fraction}) then answer_given:="empty"; fi;
    Printf("Your answer: $%s$ \\qquad ",Latex(answer_given));
    if answer_given<>"empty" then
      if abs(evalf(answer_given-correct_answer))
Download assessment ",FileName,FileName);
fclose(FileID):
# End of file assessment1.maplet
#====================================================================

#====================================================================
# Beginning of file rollbook.csv
11599563,Smith,John
11528962,Doe,Jane
11630628,Mac,George
# End of file rollbook.csv
#====================================================================
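As a cross-check of the individualisation scheme in the listing above (assignparam reduces a number derived from the student's first name modulo 4), the same logic can be rendered in Python. This is our translation for readers unfamiliar with Maple, under the assumption that the Maple code sums the byte values of the first name before calling assignparam.

```python
# Python rendering of the Maple individualisation logic above:
# sum the byte values of the student's first name, reduce mod 4, add 1,
# giving one of four deterministic question variants per student.
def assign_param(first_name: str) -> int:
    """Map a student's first name to one of 4 question variants (1..4)."""
    return sum(first_name.encode()) % 4 + 1
```

Because the mapping is deterministic, re-running the generator always gives a student the same variant, which is what allows the later marking pass to reconstruct the correct answers.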