Computer-based laboratory simulation: evaluations of student perceptions

Norrie S. Edward
School of Mechanical and Offshore Engineering, The Robert Gordon University, Aberdeen

Providing resources to meet the needs of oil workers who miss blocks of an engineering course was the motivation for producing computer-based simulations of laboratory equipment. This paper reports on student perceptions of various aspects of the package. The factors are grouped into (i) motivation and support, and (ii) presentation and interaction. A schematic representation of the controls and instrumentation was used. Two classes, engineers and non-engineers, were the pilot groups. The engineers clearly preferred laboratories, whereas the non-engineers were just as happy with the simulation. The results of the survey suggest that while computer-based simulation may be an alternative to laboratories, even the best alternative, much is lost as a result. Practical appreciation and team-working skills are not well developed. The schematic presentation is easy to use, but gives the student little 'feel' for the operation of a real plant.

Introduction

Laboratory experimentation in engineering is an essential part of the three main components in an engineer's formation. The theoretical constructs and models are imparted in lectures and tutorials. Workshop hands-on activity allows the student to acquire an understanding of the interaction of design and manufacture, and the constraints both impose. Characteristics of plant are investigated through experiment, and this aids the learner's understanding of the limitations of models in predicting performance. The learner also gains an appreciation of the nature of errors and of the construction of plant. But while the oil industry has brought prosperity to the North-East, it has also brought unique educational demands: the working arrangements place severe restrictions on part-time student attendance.
Technicians work a block of two to four weeks offshore, followed by a similar period of leave. Different companies have different arrangements and shift-change days. A review of the literature (see Boud et al, 1984) suggests that the main aim of laboratory work should be to teach inquiry methodology and experimental design. It is often also reported that students do not enjoy the activity. In contrast, our staff at the School of Mechanical and Offshore Engineering are clear that the main aims of the work are to establish the links between theory and practice, to aid visualization, and to develop team-working skills, and our students, in common with Carter's (1980) findings, enjoy this aspect of their education. Whatever the true outcomes of laboratories, the activity is seen as essential by our staff and students alike. Alternative provision, such as directed reading to allow part-time students to stay abreast of theory, is not difficult to arrange, and most of them gain a knowledge of plant construction through daily contact in their working lives. But this is not to say that they are afforded the opportunity to investigate the characteristics of the equipment. Rather, the objective will often be to maintain a steady-state operational condition. We therefore felt that the provision of an alternative to laboratory work was essential. The evaluation reported in this and another paper (Edward, in press) set out to test the contention that, in comparison with laboratories, cognitive learning would be as high with a simulation, but that practical appreciation would suffer. The opportunity was taken to carry out what Flagg (1990) describes as 'formative evaluation' based on our students' perceptions of the package, i.e. 'the systematic collection of information for the purpose of informing decisions to design and improve the product'.
Reeves (1992) avers that the effectiveness of a package is constrained by (i) the design of the user-interface, and (ii) the motivation and expertise of the user. It is on these and related factors that this paper reports. Since students may miss either or both of the laboratory experiment and the relevant theory, it was decided that any alternative should provide both. The final package used multiple media. The theory was covered in an interactive text-based workbook. A separate laboratory procedure sheet was also provided so that students who had attended the lectures did not have to work through the book. As an aid to visualization, a short video showing the plant in operation was produced. The heart of the package was a computer-based operational simulation. This was interactive, and allowed users the same type of control as they would have on the actual plant. Similarly, the readings provided were based on actual experimental performance data from the machine, a centrifugal pump being the example examined here. Using real data allowed representative systematic error to be included. Random error was approximated by analysing repeated readings and applying an appropriate randomization factor to each of the displayed outputs. An ideal package would be 100% faithful to actual performance, would include effects such as sound, would have a photographically realistic interface, and would be controlled in a manner identical to the real plant. The first of these would have required an enormous database, but the inaccuracy introduced by limiting the database is very small. Sound, though desirable, was not felt to add much, and at the time would not have been available on most computers. Realism is more important. Does the display have to look like the equipment for the user to gain the most from the experience? Must a dial gauge be so represented, or can it be replaced by a digital output?
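The error model just described — displayed readings derived from recorded performance data, with random error approximated by a small randomization factor — can be sketched as below. The calibration values, function names and the 2% noise magnitude are illustrative assumptions, not the package's actual figures.

```python
import random

# Measured head (m) against flow-rate setting (l/s) for the pump,
# recorded from actual experimental runs -- illustrative values only,
# not the study's data. Systematic error is embedded in these readings.
CALIBRATION = {0.0: 22.0, 2.0: 20.5, 4.0: 17.8, 6.0: 13.6, 8.0: 7.9}

# Randomization factor, in principle derived from the spread of repeated
# readings; the 2% figure here is an assumption for illustration.
RANDOM_FACTOR = 0.02

def displayed_reading(flow):
    """Return the simulated gauge reading for a calibrated flow setting.

    Random error is approximated by scaling the recorded value by a
    small random perturbation, in the spirit the paper describes.
    """
    true_value = CALIBRATION[flow]
    perturbation = random.uniform(-RANDOM_FACTOR, RANDOM_FACTOR)
    return true_value * (1.0 + perturbation)

# Repeated readings at the same setting differ slightly, as on real plant.
readings = [displayed_reading(4.0) for _ in range(5)]
```

Limiting the database in this way trades a very small loss of fidelity for a package that responds instantly on modest hardware.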
Must a valve be opened by clicking and dragging a lever, or can an indexing button be used? In the interests of reducing the production time and of ensuring rapid response and portability to relatively low-performance machines, compromises had to be made. A diagrammatic interface was used (Figure 1). (In passing, it is worth noting that our gas turbine simulator uses a much more realistic interface.) These compromises were not made lightly and, as discussed below, I have separately been investigating their importance.

Figure 1: Screen representation of the pump controls and instruments

Evaluation of the centrifugal-pump simulation was conducted by running the package with a pilot group of students. Students were drawn from two classes. One class (30 students) was composed of mainstream engineers. The other (26 students) was composed of cross-disciplinary technology and business undergraduates. Means were compared using an independent t-test, with p < .05 considered to be significant. Few significant differences were detected between the two classes, and they are generally reported here as a single cohort. They were divided into two groups of 28, one of which carried out the actual laboratory experiment while the other obtained their results from the computer-based simulation (CBS). Most of the aspects of the experience reported here relate only to the simulation, and for them the relevant sample size is 28. Evaluation was by means of questionnaires administered to all students, and follow-up interviews. The questionnaires used five-point Likert scales. Space was provided for free-form comments, with suggestions being made that respondents might wish to comment on matters such as ease of use, help facilities, difficulties with operation, and so on.
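The between-class comparison used above (an independent t-test at the 5% level) can be sketched as follows. The sample ratings are invented for illustration and are not the study's data.

```python
from math import sqrt
from statistics import mean, variance

def independent_t(sample_a, sample_b):
    """Pooled-variance independent t statistic for two samples.

    statistics.variance uses the (n - 1) denominator, as the pooled
    formula requires.
    """
    na, nb = len(sample_a), len(sample_b)
    pooled = ((na - 1) * variance(sample_a)
              + (nb - 1) * variance(sample_b)) / (na + nb - 2)
    return (mean(sample_a) - mean(sample_b)) / sqrt(pooled * (1 / na + 1 / nb))

# Illustrative five-point Likert ratings for the two classes.
engineers = [2, 3, 2, 4, 3, 2, 3, 2]
non_engineers = [4, 3, 4, 5, 3, 4, 4, 3]

t = independent_t(engineers, non_engineers)
# |t| would then be compared with the critical value for
# na + nb - 2 degrees of freedom at the 5% significance level.
```

In practice a statistics library would also report the p-value directly; the sketch shows only the statistic being compared across the two class means.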
The response rate was 84% overall, with the CBS group return-rate at 89%. Ten formal audio-taped interviews were conducted, and around the same number of ad-hoc informal interviews were written up afterwards. In the formal interviews, a semi-structured questionnaire was used. Quite a range of factors was investigated, and I have grouped these factors into three main categories: the learning outcomes, the presentation and interface, and finally motivation and support. I have reported elsewhere on the learning outcomes (Edward, in press). In this paper, I report on the findings on the remaining factors. Table 1 gives the mean and range for each of the factors discussed in this paper.

FACTOR                                                    MEAN   RANGE
Ease of use of the package (general section)*             3.91   1-5
Ease of use of the package (simulator section)            3.88   1-5
Realism of the simulation                                 3.56   2-4
Attractiveness of the screen presentation                 3.41   1-5
Length of the package                                     2.71   1-4
Effectiveness of simulation as a laboratory alternative   3.18   1-5
Sufficiency of interaction provided                       2.67   1-4
Guidance on use of the package (general section)          3.18   1-5
Guidance on use of the package (simulator section)        2.76   2-4
Guidance on writing report                                2.95   1-5
Sufficiency of feedback on performance                    2.95   1-4
Motivational impact of the simulation                     3.29   2-5
Preference for independence                               3.28   1-5

(n = 28 for all factors. * indicates low-to-high transposition of results)

Table 1: Analysis of student perceptions of the computer-based simulation package

Presentation and interface

Ease of use

Students rated how easy they had found the simulation to operate, both in the general questionnaire and in the one on the CBS itself. The scales were reversed, but after transposing one set so that a low score indicates that it was found difficult to use, the two mean scores were almost identical at 3.88 and 3.91. A total of 46% of the respondents rated the package easy to use.
Students did report some difficulties, but these generally were due to faults in the program which were readily remedied. Their most compelling criticism was of the quality of on-line help. The laboratory group found the actual plant slightly more difficult to use (mean 3.68), but interviews suggested that this was welcomed. The plant is fairly complex and the students enjoyed operating it.

Screen presentation

Figure 2: Students' appraisal of the representation of the plant on screen

Because of the compromises made in the representation of the apparatus, as reported above, I was anxious to see user-reaction to the result. It was something of a relief that the mean rating was 3.41, and only 16% of students selected a rating below the mid-point. But the students were somewhat more circumspect about the realism of the simulation. No student in either class used the extremes of the scale. Here also the overall mean of 3.56 suggests that a suitable approach has been adopted. Complacency must be avoided, however, as one or two comments were either unfavourable or uninformed. In the latter category, one student welcomed the simulator's ability to repeat the results 'without error', having failed to appreciate the distinction between random and systematic error. One student who had previously used a simulation said that he preferred its approach, which featured optional controls offering different levels of realism, from alpha-numeric to a fully realistic representation of the plant.

Length

Although length was included in the assessment of the simulation, some students obviously treated it as relating to the overall experience. Even for the simulation, a definition of this may have proved difficult because users had latitude in use of the various sections, repeat tests and self-assessment questions (SAQs). Some students said they felt that even more choices should have been available.
Some wished to be free to escape from an SAQ if they felt it was inappropriate. Another said he would have liked more depth to the theory and more testing SAQs. It was pointed out that these were provided in the workbook. He said: 'I did read the workbook but I'd have preferred to have it all in one place when I was at the computer. Also I would have liked to be able to go back to the theory when I got stuck.' The laboratory group reported that on average they had spent 100 minutes on the test, while those using the CBS approach spent only 74 minutes (see Figure 3). Overall, despite the shorter times recorded as being spent on the CBS than on the laboratory, students tended to feel that the package was rather long. They rated it from too short to much too long, but the mean of 3.9 shows that generally it was found slightly long. It suggests that more options on routing would be welcomed by the students.

Figure 3: Length of time spent by students of both groups on the activity

Effectiveness of the simulation

The group were asked how effective they had found the pump simulation, and how effective they felt the general approach would be as a laboratory substitute. This was one area in which there was a significant difference between the two classes who formed the pilot group, though the small sample sizes make interpretation of the difference speculative. The mechanical engineers did not in general feel that the simulation was an effective alternative. As one put it: 'It [CBS] wouldn't give you practice of hardware and setting things up properly, running it and taking good readings.' Another said: 'It rather defeats the purpose of doing lab work.' The cross-disciplinary technology and business class were more appreciative.
One called it the 'perfect alternative', and another described it as 'semi hands-on'. The difference may reflect the engineers' identification of working with plant as 'what an engineer does'.

Interaction

The appropriateness of the degree of interaction to students' needs is related to routing. A low score indicated that students would have liked more interaction, and as the overall mean was 2.67, this appears to have been the case. Most students rated the factor between 'much too little' and the mid-point. Their comments suggest that they were referring mainly to the theory section, where a few students said they would have welcomed more depth being available. One more ambitious individual said: 'I would have liked to have tried other pump speeds so I could have done an NDG (non-dimensional group) calculation'. The conclusion is that more and better detail and interaction would be desirable.

Figure 4: Student perceptions of the adequacy of the interaction provided

Motivation and support

Guidance on use and report writing, and feedback on performance

As these topics are obviously related, they will be considered together: they gauge the level of support the students felt they had received. The simulation was run in somewhat contrived circumstances, but every effort was made to confine support to dealing with operational problems. It must also be recognized that the so-called Hawthorne Effect may have been invoked: the students were aware that they were the pilot group for a novel methodology. A rather important factor if the simulation were to be used independently is guidance on the use of the package. Again, a question on this was included in the general section of the questionnaire, and in the section specific to the simulator.
Responding to the general question, many of the simulator group felt better guidance could have been beneficial, although the overall mean ranking of 3.18 was moderately favourable. (The laboratory group felt that guidance was very good, with a mean of 3.63.) There were teething problems with getting into the package which will be readily resolved, but of more concern were the difficulties expressed in running the simulation: 'I went off in the wrong direction at the start, and didn't have time to complete it. If it was a bit clearer as to what to expect and what to do . . .' In response to the question in the simulator section, rather strangely, the rating for the combined group dropped from 3.18 to 2.76 (significant at the 5% level). Admittedly this was due to many of the students rating guidance as 'too little', but none as 'much too little'. The most frequently cited reason was shortage of on-line help, particularly during the operational simulation.

The responses on guidance on report writing are rather disappointing, since the workbook activities were intended to lead the user through the report. Laboratory groups from both classes actually rated the guidance on report writing higher than did their counterparts on the CBS. One CBS user said: 'I knew what the procedures were on the day, but wasn't sure how to apply them or what they were for'. The overall means were 3.32 for the laboratory groups, and 2.95 for the CBS groups. (What the implications are is unclear, but the topic is discussed below.) This contrasted with the responses on feedback on performance, where the respondents reported better feedback from the simulation than the laboratory. The mechanical engineers were only slightly higher, at a mean of 3.00 as against 2.92, but 89.7% of the combined CBS group rated feedback at or above the mid-point.
The non-engineering laboratory group gave a significantly lower rating of the experience, with a mean of 1.86. Only one student in this group even rated feedback at the mid-point. The technology and business simulation group mean, at 2.86, was significantly higher than that of the laboratory group, but still more than a quarter of the students rated this factor as 'poor' or 'very poor'. It seems that more can be done to enhance feedback, a point addressed below.

Motivation

Asked how motivational the experience was, the two classes responded rather differently (see Figure 5). Overall, the mean for the laboratory group was similar to that for the CBS group (3.21 as against 3.29). In the case of the mechanical engineers, there was little difference between the two mean ratings, with the laboratory rating being slightly the higher. The non-engineers, however, produced a mean of only 2.71 for the laboratory, compared with a mean of 3.25 for the simulation.

Figure 5: Student perceptions of motivation of method

To an extent at least, this may again reflect the engineers' self-identification. They were much more adamant that the missing element in the CBS was hands-on experience, which they felt was both necessary and enjoyable. Overall, however, both the ratings and the students' written and oral comments make it clear that the non-engineers did not find the laboratory experience motivational. This calls into question the nature of this type of activity for this less technical class. Are the right objectives being pursued? Is the right approach being adopted? Is enough time being given? Further, the question of how motivational students had found the experience begs the question of what motivated them. Were they motivated by the experience per se, by the opportunity to gain knowledge, or by the need to satisfy an assessment criterion?
This factor will warrant fuller investigation, but might be related to the students' motivation for taking the course.

Independence

A feature of laboratory work is that there is always a lecturer supervising who will offer advice or answer queries. The simulation is intended to be used without this support. Students were asked whether they preferred to have a lecturer present or to work independently. The results were rather surprising. Those who had done the actual laboratory declared a significantly higher preference for having a lecturer present (mean 2.5 as against 3.28). Analysis of the figures produced more surprises. Both groups of non-engineers had a mean of 3.0, the mid-point of the scale. The significant difference was due to a very marked difference between the two groups of engineers. The laboratory group much preferred the support of a lecturer (mean 2.00), whereas the CBS group declared a preference for independent working (mean 3.45). Interviews did little to explain this difference. Students generally said that the information they were given about the laboratory was sufficient, and that they rarely needed to consult the lecturer: 'We usually know what we have to do. We just get on with it and take our readings. The lecturer helps if we ask, but mostly we don't need to.' The CBS group, on the other hand, made little comment, though one student did say that he felt he had to find out how the simulation was operated by trial and error. Probably, although asked about their general preference, students responded in the context of the activity they had experienced.

Help

One question not specifically asked was the students' opinion of the adequacy of on-line help. In retrospect, this important issue should have been included: it was the most quoted topic of concern to the users.
Help provision is an important issue which is seldom given the attention it deserves; both the strategy and the presentation of assistance may greatly affect the effectiveness of a package. Students generally felt that insufficient help was available through the computer (although they agreed that most of it was available in the accompanying workbook). One student said: 'The instructions weren't clear. You sit down and you do something. You think: what am I to do now? Nothing tells me. It's trial and error, like varying the speeds and things. See what happens.' Student comments have alerted me to the crucial role which the help provision plays in the success of a package. I plan to enhance the provision in the present packages, and to devote significantly more attention to it in future designs.

Correlations

Few significant correlations were found which related to the factors considered in this paper. With regard to factors related to support, the feedback the students felt they had received proved to be significantly correlated (p < 0.05) with quite a number of other factors. It was very strongly related to the motivational influence on the student (p = .000), but negatively correlated with the time they spent on the package. This tends to suggest that feedback led to the desire to get more out of the experience, and hence to spending longer investigating the scope of the package. Respondents had been asked to rank their objectives in pursuing the degree course. The options offered were broadly in two groups: (i) extrinsic, e.g. pursuit of a qualification, improving employment prospects, and (ii) intrinsic, e.g. increasing knowledge, challenge.
Feedback also tended to be rated higher by those who had ranked the pursuit of a qualification lowest. Whether this suggests that the extrinsically motivated students merely wanted results with the minimum of effort is debatable; it could equally be argued, given an opposite correlation, that those seeking knowledge had proved less easily satisfied with the feedback given. Guidance on use, guidance on the report, and ease of use were positively correlated. Guidance on use was negatively correlated with the desire to gain knowledge as a motivational factor. Taken together with the fact that the easier students found the package to use, the more time they spent on it, it could be suggested that those seeking knowledge benefited from feeling better guided and from finding the package easier to use in seeking deeper understanding. A more jaundiced interpretation could be that, since it is not correlated with their knowledge gain, either their expectations were higher, or they found their search fruitless. Disappointingly, few correlations were detected in any of the measures on the simulation itself. The effectiveness of the simulation correlated positively with the usefulness of the workbook, which suggests that those who found the workbook most informative may then have been better prepared for the simulation. Students were asked a general question on the suitability of CBS as a laboratory alternative. This factor, as might be expected, correlated positively with their perceptions of the effectiveness of the pump simulation. Much less expected was the negative correlation between perceptions of the general suitability of simulation and of the effectiveness of guidance on this simulation: a positive correlation would have evoked no surprise.
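The screening reported in this section amounts to computing pairwise correlations between Likert-rated factors. A minimal sketch, with invented ratings for two factors (not the study's data):

```python
from math import sqrt
from statistics import mean

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative ratings: perceived feedback against motivational impact
# for eight respondents -- invented to show a strong positive association.
feedback = [3, 4, 2, 5, 3, 4, 2, 3]
motivation = [3, 4, 2, 5, 4, 4, 2, 3]

r = pearson_r(feedback, motivation)
# r close to +1 indicates the strong positive relation discussed above;
# significance would then be judged from r and the sample size.
```

A full analysis would also attach a p-value to each coefficient; strictly, ordinal Likert data might instead call for a rank correlation such as Spearman's, which applies the same formula to the ranks of the ratings.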
The only other correlation of note was between the length of the simulation and the ranking of seeking a qualification as the students' reason for study, which again suggests that the more extrinsically motivated were looking for the shortest route to satisfying the assessment criteria.

Discussion

A number of students commented on the value they placed on the summary theory section. They felt that the SAQs helped to reassure them that they understood the topic. This section, they felt, could usefully be extended. This corresponds with the findings of Coleman et al (1994) when evaluating an application of CAL with Electronics students. Motivation is promoted by keeping learners informed of their progress (see, for example, Ritchie and Garner, 1995). Motivation has also been related to prior preparation. Colgan et al (1994) found that only 31% of respondents wanted to use CBT again, and inferred that preparation must be enhanced. My finding, that those who made most effective use of the workbook which contained the preparation also performed best in the simulation, perhaps supports this. Motivation is likely to be a crucial factor in an open-learning context. K. Bateson and W. Simpson, of the University of Surrey, reported in 1995 (in a paper given at the CAL95 Conference, Learning to Succeed, at the University of Cambridge) that allowing Electronics students unrestricted freedom to explore systems behaviour was intensely motivating. Catterall and Ibbotson (1994), however, found that students' motivation declined over time, and concluded that CAL is useful for preparatory work but too time-consuming to remain motivating. I feel that a well-presented package without unnecessary time-consuming effects, and with what Phillips and Moss (1993) have described as 'an attractive interactive manner', can be stimulating and can remain motivating.
I used a multiple-media approach, and found that motivation was related to feedback and the effectiveness of the workbook. As I mentioned above, on-line help was pinpointed by a number of students as an important motivational factor. Whiting (1989) found that an easily used screen presentation reduced the need for what in his application was largely off-line help. He suggests optional presentations to cater for different learning approaches. A thought-provoking paper by Pengelly (1993) on motivation and on-line guidance is concerned with developing domain reasoning, monitoring, and reflection levels of learning. He suggests a broad model for developing support by monitoring the search paths of the users, and by analysis inferring their reasoning processes. He points out that a human teacher becomes sensitive to a student's moods, motivation, compliance, etc., and modifies his or her approach accordingly. In some way, the computer must emulate this if effective prospective help is to be provided. Screen presentation is an important issue. A declared objective of laboratories is to provide an appreciation of the plant, and presentation obviously has a bearing on this. The attractiveness of the display, ease of interpretation and use of effects are all likely to influence a user's approach to and evaluation of the product. Respondents would have welcomed more realism: 'The gauges and that sort of thing. Sound and sight has a lot to do with things. If you're looking at things, it does make a difference from seeing it on screen.' In general, however, the students found the display clear and attractive, and were at ease using it, something which they welcomed. Few if any researchers have published systematic evaluations of the importance of realism in simulation. My own work (Edward, 1996) revealed that cognitive learning was perceived to be lower and practical appreciation was poor in a diagrammatic presentation compared to a realistic one. 
Waddick (1994) found that students could learn spectrophotometry as effectively on a realistic simulation as on real equipment. Leary (1995) exploited the potential to the full by allowing the internal operation of plant to be viewed by the user. Although there are some benefits in using, as in the pump, a restricted diagrammatic presentation, I must concede that for engineers no simulation can take the place of laboratories. That said, if a simulation is required for whatever use, I now believe that it is worth making it as realistic as possible. Adumbra (1994) also advocates interaction, which he avers keeps the learner 'awake'. Our simulation is, of course, very interactive, all controls being operated by the student. Students also appreciated the interaction and feedback in the theory section, and wished this to be expanded. This corresponds with the findings of Catterall and Ibbotson (1994), although the need diminished as their students gained confidence (they wanted more control over the interaction). Catterall and Ibbotson reported at ALT-C '94 in Hull that their students sought control over routing and the number of attempts at questions. This is an aspect which I propose to address in future simulations. Whiting (1989) found that where ease of use and interaction were highly rated, the learners had less need of external support. Whether the workbook can eventually be absorbed as an interactive section of the package is debatable, but it is a useful long-term objective.

Conclusions

I am reluctant to use the word 'successful' about the results because at best the simulation was a qualified success. I feel that a sound start has been made, but that my evaluations have revealed scope for improvement. My objective was to produce an alternative for students who miss laboratories. I do not feel it provides an equivalent experience.
Although the simulation was clear and easy to use, practical appreciation was diminished. Despite completing the CBS more quickly, students still wished it to be shorter. This suggests that, in contrast to laboratories, which engineers enjoy, it was viewed as a means of satisfying an assessment criterion. My aim now is to enhance the package, and produce what Solomon (1993) describes as a simulation which 'take[s] the best of the experiential and combine[s] it with more traditional learning methods [. . .] [to] enhance but not replace traditional learning techniques'.

References

Adumbra, C. (1994), 'Enhancing student experience of computer-aided learning packages', Proceedings of the Conference on Computer Aided Learning in Engineering, Sheffield, September 1994, Sheffield: University of Sheffield.

Boud, D., Dunn, J. and Hegarty-Hazel, E. (1984), Teaching in Laboratories, Guildford: SRHE & NFER Nelson.

Carter, G., Armour, D. G., Lee, L. S. and Sharples, R. (1980), 'Assessment of undergraduate electrical engineering laboratory studies', IEE Proceedings, A.460.

Catterall, M. and Ibbotson, P. (1994), 'The development of a low technology marketing CBT', Account, 6 (1).

Coleman, J. N., Kinniment, D., Burns, F. and Butler, T. (1994), 'Teaching in a third of the time: a successful application of computer-aided learning in degree-level electronics', Proceedings of the Conference on Computer Aided Learning in Engineering, Sheffield, September 1994, Sheffield: University of Sheffield.

Colgan, N., McClean, S. and Scotney, B. (1994), 'Computer-based teaching and evaluation of introductory statistics for health science students: some lessons learned', Association for Learning Technology Journal, 2 (2), 68-74.

Edward, N., 'Evaluation of computer-based laboratory simulation', Computers and Education (in press).

Edward, N.
(1996), 'Screen presentations in laboratory simulations as perceived by students', Proceedings of the SERA 1995 Conference, Glasgow: University of Strathclyde.

Flagg, B. (1990), Formative Evaluation for Educational Technologies, Hillsdale, NJ: Lawrence Erlbaum.

Leary, J. (1995), 'Computer-simulated experiments and computer games: a method of design analysis', Association for Learning Technology Journal, 3 (1), 57-61.

Pengelly, M. (1993), 'Computer-based assessment to support the learner not the assessor', Teaching and Learning Technology Programme Workshop on Assessment of Learning in Higher Education, Workshop Papers, Sheffield, November 1993 (ISBN 1-85889-086-1).

Phillips, T. and Moss, G. (1993), 'Can CAL biology packages be used to replace the teacher?', Journal of Biological Education, 27 (3), 213-16.

Reeves, T. (1992), 'Evaluating interactive multimedia', Educational Technology, May, 47-52.

Ritchie, G. and Garner, P. (1995), 'Computer-based laboratory tutorials', IEE Colloquium: Computer-Based Learning in Electronic Education, Digest No. 1995/098, pp. 13/1-2, London: IEE.

Solomon, C. (1993), 'Simulation training builds teams through experience', Personnel Journal, June.

Waddick, J. (1994), 'Case study: the use of a HyperCard simulation to aid in the teaching of laboratory apparatus operation', ETTI, 31 (4), 295-301.

Whiting, J. (1989), 'An evaluation of some common CAL and CBT authoring styles', ETTI, 26 (3), 186-200.