Designing intelligent computer-based simulations: a pragmatic approach

Bernard M. Garrett* and David Callear**
*The School of Health Care, Oxford Brookes University
**Department of Information Systems, University of Portsmouth
email: bmgarrett@brookes.ac.uk

This paper examines the design of intelligent multimedia simulations. A case study is presented which uses an approach based in part on intelligent tutoring system design to integrate formative assessment into the learning of clinical decision-making skills for nursing students. The approach advocated uses a modular design with an integrated intelligent agent within a multimedia simulation. The application was created using an object-oriented programming language for the multimedia interface (Delphi) and a logic-based interpreted language (Prolog) to create an expert assessment system. Domain knowledge is also encoded in a Windows help file, reducing some of the complexity of the expert system. This approach offers a method for simplifying the production of an intelligent simulation system. The problems of developing intelligent tutoring systems are examined and an argument is made for a practical approach to developing intelligent multimedia simulation systems.

Introduction

There has been great interest in the potential use of multimedia computer-based learning (CBL) packages within higher education. The effectiveness of such systems, however, remains controversial, and there are suggestions that multimedia applications may hold no advantage over traditional formats (Barron and Atkins, 1994; Ellis, 1994; Laurillard, 1995; Simms, 1997; Leibowitz, 1999). One area where multimedia CBL may still prove its value is in the simulation of activities where experiential learning is expensive, undesirable or even dangerous.

Simulation

The aviation industry has successfully been using computer-based simulation to train pilots for a number of years, and the military use computer-generated simulations for battlefield tactical training (Fletcher, 1999). Computer-based simulation has also proved valuable in areas where experiential learning may not be possible due to time, cost or operational constraints, such as in the training of police to respond to bomb threats (Chung and Huda, 1999). The use of computer-based simulations as learning tools has been developed in many areas of health-care education: intensive care (Henry and Waltmire, 1992), child health (Lauri, 1992; Krawczak and Bersky, 1995), physical assessment (White, 1995), and midwifery (Woodson, 1997; Lyons, Miller and Milton, 1998).

Clinical decision-making skills are another area where the value of simulation has been recognized. During the training of nurses, clinical decision-making skills are generally acquired in the practice setting under the guidance of a qualified clinical mentor, and the practice of structured reflection is used to evaluate student performance (Benner, 1984). Considerations of public safety, however, make this method of acquiring skills in clinical decision-making fraught with difficulties. There is usually some consolidation of practical decision-making experience with theory taught during management studies in the final year of training, which is designed to help students develop their deductive reasoning skills. Paper-based simulation exercises are often used, but these are complex, unwieldy and difficult to manage for large groups of students.
The use of multimedia simulations for developing clinical decision-making skills in nursing students is undoubtedly an area where a computer-based tool would be useful. At least three systems are currently in development for teaching clinical decision-making skills to nurses (Hjelm-Karlson and Stenbeck, 1997; Oliver, 1999; Garrett, 1997). These focus on different specific areas where clinical decision-making is applied and demonstrate the value of computers in modelling deductive processes. The last of these systems has been developed using an integrated intelligent agent in its design. This approach was taken in order to create a rich, effective multimedia learning tool, capable of developing deductive reasoning skills by giving individualized formative advice to the students.

It is suggested that students learn most effectively with simulation when feedback immediately follows an event (Johannson and Wertenberger, 1996; Lyons et al., 1998; Laurillard, 1995, 1997). In this way the feedback is seen as relevant and becomes integrated into the learning experience, and it gives positive reinforcement through the immediate correction of misinterpretations. Lester, Stelling and Stone (1999) suggest that feedback should be presented during problem-solving activity in such a manner that it contributes to learning effectiveness and efficiency. The intelligent agent in the system described here was developed using the basic elements of intelligent tutoring system design, but without the rigid framework imposed by that architecture.

Intelligent Tutoring Systems

One of the beliefs behind the development of Intelligent Tutoring Systems (ITS) in the late 1970s was that computers could provide effective individualized formative assessment (Sleeman and Brown, 1982; Elsom-Cook, 1987; Wenger, 1990; Nwana, 1990, 1993). Computer systems were designed to emulate intelligent human behaviour, assessing student performance and giving guidance, as would a tutor with expert knowledge of the subject. This type of system was a radical departure from Computer-Assisted Instruction (CAI). Prior to this, CAI used programmed units of learning in computer programs to produce what were in effect programmed textbooks, or branching programs which allowed a degree of student interaction but gave no individualized feedback.

The ITS developed from research into knowledge-based systems, in which knowledge is represented by a defined knowledge set and a program that operates upon this knowledge base (Bench-Capon, 1990). The use of heuristic search techniques by the program is also a key element of the design. This allows a student to see the reasoning undertaken to reach the solution, but adds to the complexity of the system, as it must demonstrate both the result and the proof. Knowledge-based systems and expert systems (which use a knowledge base and inference engine to represent the knowledge of an expert in a particular domain) are one branch of 'artificial intelligence' research (Jackson, 1999). The underlying paradigm on which they are based is that knowledge and every aspect of learning can be precisely described in some way. Intelligent tutoring systems use an expert system for the representation of an expert's knowledge; an ITS therefore encodes the distilled knowledge of people who have years of expertise in one particular domain.
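Purely to illustrate the style of representation such knowledge-based systems use, a minimal sketch in Prolog is given below. The patient facts, the rule and the predicate names are hypothetical examples introduced here for illustration and are not drawn from any of the systems discussed.

    % Illustrative facts describing a hypothetical patient situation.
    symptom(patient1, dyspnoea).
    oxygen_saturation(patient1, low).

    % A rule: to prove the goal, the interpreter works from the goal back to the
    % recorded facts, trying to satisfy each condition in turn.
    needs_oxygen_therapy(Patient) :-
        symptom(Patient, dyspnoea),
        oxygen_saturation(Patient, low).

    % The query ?- needs_oxygen_therapy(patient1). succeeds against these facts.

Encoding a realistic domain in this way is, of course, where the difficulty lies, as the next section discusses.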
The encoding of this knowledge into expert systems has proceeded on the basis that computer-based problem-solvers should be built to operate within a specific domain, that of an expert's knowledge base (Jackson, 1999). In this way the breadth of knowledge can be minimized and the depth of specific knowledge maximized and encoded. How to encode this knowledge and represent it within an application was originally identified as one of the major problems in developing intelligent tutoring systems (Sleeman and Brown, 1982), and it remains so today. Encoding expert knowledge requires significant development time and is one of the more problematic features in moving intelligent tutoring systems from the laboratory into the practical world (Seidel and Park, 1994).

In general an ITS has four main elements (Nwana, 1993):

• an expert knowledge module (the domain knowledge);
• a student model module (assessment of the individual student's abilities and performance);
• a tutoring module (structuring the learning appropriately within the system); and
• a user interface module (between the system and the student or teacher).

An ITS differs from traditional CBL applications in several ways:

• it can provide a clear articulation of knowledge in a limited domain;
• it uses a model of student performance to drive instruction;
• knowledge and inference rules are derived from the designer, whilst the tutoring sequence and content are derived from student performance;
• it provides diagnosis of errors rather than drill and practice; and
• students can pose questions to an ITS.

One of the first examples of an expert system was MYCIN, which defined many of the standard characteristics of expert system design (Clancy, 1983). It used backward-chaining rules, which tested whether certain conditions were satisfied and then led to an action. MYCIN attempted to match the action rather than the conditions during its operation; in this way it worked from goals to facts, eliminating the need to solve every possible outcome for a given set of rules. A depth-first (rather than breadth-first) strategy was also used to search the knowledge base. Clancy used MYCIN as a basis for developing an ITS (GUIDON) and went to great lengths to disentangle the pedagogic aspects of the ITS from the expert system. He used the ability to question the MYCIN system with 'how?' and 'why?' questions as a basis for developing his ITS, so that GUIDON could demonstrate the way in which it made its decisions. The system stimulated the development of other intelligent tutoring systems and successfully addressed many of the problems in ITS design. Intelligent tutoring systems developed today, such as Anatom-Tutor, an intelligent anatomy tutoring system (Beaumont, 1994), and Atlas-Tutor, a cerebral radiography tutor (Garlatti, 1998), although more refined in their implementation, exhibit many of the basic design elements of GUIDON. The difficulties in creating these highly complex systems and proving their value, however, go some way towards explaining why they have yet to be widely adopted in mainstream education.

The prototype clinical decision simulator

In nurse education and training there is a need to develop effective decision-making skills in students based upon nursing knowledge, in much the same way that medical staff take diagnostic decisions based on medical knowledge.
An intelligent computer-based simulation was needed to facilitate the development of those skills in a safe environment, allowing the student to explore various 'what if?' decisions and obtain appropriate advice. The underlying assumptions on which the system was designed were that:

• successful clinical decision-making is based upon a hypothetico-deductive process involving inductive and deductive reasoning skills (Gordon, 1980; Putzier, Padrick, Westfall and Tanner, 1985; Padrick, Tanner, Putzier and Westfall, 1987; Ellis, 1997); and
• nurses pass through stages of development to achieve safe and competent practice, described by Benner (1984) as novice, advanced beginner, competent, proficient and expert.

To realize this, a system was developed taking some elements from ITS design but using a simplified structure to facilitate the rapid development of an intelligent agent. This agent examines the results of students' decision-making processes at each stage in their progression through a computer-based simulation and provides appropriate individualized advice.

Designing a modular system

An architecture for the system was chosen to provide an intuitive multimedia interface and to fulfil the basic requirements of an ITS whilst being relatively quick to produce. A design using three modules was developed, consisting of a multimedia simulation interface, an expert assessment system and a searchable nursing knowledge base (see Figure 1).

Figure 1: The three-module design (an expert assessment module in Prolog, containing an expert rule-based system generating formative tutorial advice and a summative assessment system generating a final tutorial advice summary; a multimedia nursing knowledge module, comprising a search engine and a help-file knowledge base; and a multimedia problem-based learning simulation module in Delphi, with which the student interacts).

The student accesses the system as a familiar Microsoft Windows application and uses a simple multimedia interface (with a Web browser-style navigation bar) to move through the application (see Figure 2). The design of this multimedia module was based upon the functional need to create a realistic simulation experience and to provide a stimulating interface that promoted interaction. Digitized video is used extensively to play out the scenarios for the student; adding interactive video increases the realism of the simulations and challenges the student to solve the problems. The videos were filmed with a digital video camera, using a local group of actors in a hospital setting to add realism, and edited into AVI format to retain quality, although this has the disadvantage of producing rather large files. This did not present a problem, as the final medium for the application was CD-ROM, giving sufficient storage capacity. Audio was also used extensively, with music and sound clips accompanying various elements of the program and giving audible cues to certain system events, such as receiving advice or entering a chosen simulation.

The student is introduced to the Clinical Decision Simulator (CDS) through the multimedia module and selects the appropriate simulation from this interface. An example of a pathway in the simulation is outlined in Figure 2. This module shows the digitized video sequences, controls all of the branching of the simulation pathways and presents questions to the student following events in the simulation.
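To give a concrete sense of the structure these branching pathways take, the sketch below shows how two nodes from the ward-management scenario might be represented as data. The actual pathways are coded within the Delphi module; the node and option names here are hypothetical illustrations only.

    % Hypothetical sketch: each question node lists the options offered to the
    % student and the event that follows each choice.
    pathway(allocate_patients,
            [option(fair_allocation,   ward_round),
             option(unfair_allocation, senior_nurse_complains)]).

    pathway(workmen_request_access,
            [option(allow_access,  patient_trips_over_tools),
             option(refuse_access, ward_round_continues)]).

    % The next event in the simulation follows from the current node and the
    % student's choice.
    next_event(Node, Choice, NextEvent) :-
        pathway(Node, Options),
        member(option(Choice, NextEvent), Options).

Representing the flow diagrams as data of this kind is part of what makes the simulation pathways quick to code and to extend.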
Branching within a simulation depends upon the decisions the student makes in answering the questions posed. The multimedia module also controls the sequencing of tutorial information, sending and retrieving data from the expert assessment module as required.

The student is placed in the role of a nurse taking charge of a group of junior staff for a morning shift on a medical ward.

1. The student is asked to allocate patients to appropriate nursing staff.
2. The student makes a selection and allocates the patients unfairly.
3. Another senior nurse complains that the allocation is unfair.
4. Advice is given stating which nurse had the unfair allocation and why this was not good practice. It also suggests the nurse examine the section on managing workload in the help file, which becomes accessible at this point. The solution is not given, however.
5. The student is given another opportunity to allocate patients to appropriate nursing staff.
6. The student makes a selection and allocates the patients fairly and safely.
7. The nurses go out on to the ward to see their patients.
8. Workmen arrive unexpectedly, asking if they can have access to carry out repairs to the ward floor.
9. The student fails to make a decision in the allocated time for the question and is prompted to make a decision.
10. The student allows the workmen in.
11. A patient falls over the workmen's tools and complains.
12. The student is asked to decide which nurses are sent on a coffee break first.
13. The student sends too many staff off the ward at once.
14. A patient develops symptoms of haemorrhage.
15. The student fails to alert the doctor.
16. A doctor arrives to undertake a procedure and is annoyed that no nurses are available.
17. The doctor is annoyed that no one had informed him of the patient with the symptoms of haemorrhage.
18. Advice is given suggesting the student has made several unsafe decisions in not noticing the seriousness of the haemorrhage symptoms, sending too many staff off the ward at one time and allowing workmen access inappropriately. The advice explains why this is unsafe practice. It also suggests the nurse examine the sections on signs and symptoms of haemorrhage, safe skill mix of staff and multi-professional teamwork in the help file (which becomes accessible at this point). The solutions to the questions are not directly given in the help file, but the students can determine the reason they have made clinical errors. The advice also notes that the student took too long to make a decision in this situation (and also notes if this has occurred in other questions they have answered).

The simulation continues until the shift has been completed, and after the final question summative advice is offered to the student. This lists all of the formative advice generated during the simulation and categorizes the student's performance in each situation as unsafe (novice level), poor practice (beginner level) or competent (the competency expected of a qualified nurse). The student may print off this advice. In this example the student has scored several unsafe and poor practice remarks, so the advice suggests they examine the specific sections of the help file involved. It also suggests that they discuss the advice generated with their tutor or mentor and then run the simulation again (if they had scored consistently competent practice they would have been congratulated and offered an alternative simulation to try).

Figure 2: Example of a simulation pathway.
The student can also access the help module directly via the multimedia interface and search for information about clinical decision-making or particular aspects of the simulations. Once the student has engaged in a simulation, however, access to the help file is not available and the module commences timing the student's responses to the questions posed. Help becomes available at each point the simulation is interrupted for the student to receive advice. Information about the student's answers and the time taken to respond is recorded by the multimedia module and is then passed to the expert assessment module, which builds up a model of the student's performance as they progress.

An anthropomorphic representation of a human tutor is used to deliver advice from the expert assessment module to the student. A simple animated talking figure with accompanying text was produced to achieve this representation (see Figure 3); the expert assessment module generates the actual feedback given by this figure. The agent was designed to take the role of a mentor and is represented as such, a style of pedagogic agent chosen to facilitate the learning process; the expert assessment module can be seen as analogous to a human tutor in this context. The anthropomorphic representation is useful to contextualize the feedback from the system and to enhance student interaction with the system (Heitala and Niemirepo, 1998; Lester et al., 1999).

The expert assessment module was developed to give appropriate tutorial advice for all the possible outcomes from the decision-making pathways. Advice is generated in this module by a backward-chaining expert system, using rules concerning the outcomes of decisions, the ways in which decisions are made during the simulation, and their relationship with defined competent nursing practice in this context. The student cannot directly access the expert assessment module, but does so via the multimedia module. The advice itself clearly identifies why it was generated and makes reference to appropriate sections in a nursing knowledge base, which the student can access and examine for further detailed information on the particular issue involved. In this way a heuristic tutorial strategy is realized. The advice is also designed to promote a hypothetico-deductive approach to solving the problems in the simulation, rather than giving the student the complete answer.

Rather than attempt to codify the complete domain knowledge for the subject area and then model the student's performance on predetermined performance templates (as in a traditional ITS design), an alternative strategy was used. The expert knowledge base was limited to the domain knowledge essential for successful navigation through the simulations, while the associated relevant clinical knowledge was simply represented in the third module using text and graphical material. This third module contains a clinical nursing knowledge base and was developed in the form of a Microsoft Windows help file. It also contains information on using the program and links to a Web site with further resources and updates. Reference to appropriate parts of this module is made by the advice from the intelligent agent, or it can be directly accessed and searched by the student (see Figure 3).
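As a rough indication of the form such advice rules can take, the expert assessment module's approach might be sketched in Prolog as follows. This is a minimal sketch only: the predicate names, the timing threshold and the advice text are hypothetical and do not reproduce the actual rule base.

    % The multimedia module would assert decision/3 facts (question, choice made,
    % seconds taken) as the student progresses; declared dynamic so the rules can
    % be consulted before any facts exist.
    :- dynamic decision/3.

    advice(skill_mix,
           'Too many staff were sent off the ward at once; see the help-file section on safe skill mix.') :-
        decision(coffee_break, send_most_staff, _).

    advice(slow_decision,
           'You took too long to reach a decision at one or more points in the shift.') :-
        decision(_, _, Seconds),
        Seconds > 60.

    % Summative advice is simply the collection of advice rules that fire against
    % the recorded decisions.
    all_advice(AdviceList) :-
        findall(Text, advice(_, Text), AdviceList).

Because the rules refer only to the recorded decisions and their outcomes, rather than to a full model of clinical knowledge, the rule base stays small; the detailed clinical knowledge remains in the help file to which the advice points.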
Figure 3: Screenshot of the student feedback screen, showing a list of formative advice on the student's assessment and management of a simulated patient (Mr Howe).

Rapid application development

The system was designed to enable a relatively quick development time. The initial stage of development consisted of identifying vignettes to use in the simulations. This was eventually narrowed down to two scenarios, one based upon managing a patient with a specific clinical condition (bronchial carcinoma), and the other upon the management of the clinical environment. These were chosen to illustrate the potential for different types of simulation to be used. Two decision-tree pathways were developed, one for each of these scenarios. These were then discussed with professional nurse educators and expert clinicians to ensure that they represented realistic professional situations, and modifications were made where necessary. Flow diagrams were then created to represent these scenarios and all the decision pathways mapped out. Simulation pathways are relatively quick to code from branching flow diagrams, and appropriate points for student feedback were integrated into these pathways. Multiple-choice questions were used following events rather than open free-text questions: although this limits the realism of the simulation, it allows simulations to be developed and coded rapidly and provides a simple way to assess student performance. Following the development of the simulation module, an expert assessment module was coded using an advice knowledge base, rules and an inference engine based upon all the possible decision-making outcomes for the scenarios.

The multimedia interface module and simulation pathways were coded using an object-oriented language (Borland Delphi). This was chosen for ease of construction using event-driven programming. It provided a rapid application development environment to test the system as it was coded and gave the flexibility to link with an expert system created in another language (LPA WinProlog). Developing an intelligent agent with an object-oriented language, whilst feasible, would be cumbersome and time-consuming, so the agent was developed using Prolog. This language is based upon the principles of logic, and the use of rules is inherent in its structure; it is therefore well suited to the task of expert system construction. LPA WinProlog was chosen as this implementation of the language provides an interface to link the expert system as a dynamic library with an object-coded Microsoft Windows application.
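To illustrate why a rule-based language suits this kind of assessment, the summative categorization described earlier might be expressed as below. Again, this is a hedged sketch: the predicates and the mapping from counts of unsafe and poor-practice decisions to levels are illustrative assumptions, not the system's actual rules.

    % Facts of these forms would be recorded as formative advice rules fire.
    :- dynamic unsafe_decision/1, poor_practice_decision/1.

    % Count the solutions of an arbitrary goal.
    count(Goal, N) :-
        findall(x, Goal, Solutions),
        length(Solutions, N).

    % Map the recorded decisions onto the levels reported to the student.
    performance(unsafe_novice) :-
        count(unsafe_decision(_), N), N > 0.
    performance(poor_practice_beginner) :-
        count(unsafe_decision(_), 0),
        count(poor_practice_decision(_), N), N > 0.
    performance(competent) :-
        count(unsafe_decision(_), 0),
        count(poor_practice_decision(_), 0).

Rules of this kind are short to write and easy to modify, which is what makes the expert assessment module straightforward to extend with new advice or additional rules.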
Finally, the nursing knowledge-base module was written for each of the simulations and coded using a Microsoft Windows help-file creation tool (HelpScribble).

Discussion

Intelligent tutoring systems, whilst promising, remain time-consuming and complex to develop. At present there has been little uptake of these systems in mainstream educational practice and they tend to be restricted to experimental work. Justifying the development investment required for such systems is also difficult (Seidel and Park, 1994). There is little doubt that they offer great potential to harness the power of computers to provide intelligent individualized assessment, but unless techniques for domain knowledge representation become more accessible to the majority of teachers it is difficult to see this situation changing.

Using the approach outlined in this paper, it is possible to construct and code a relatively simple model of student behaviour within the limited domain of the simulation. The advantage of this design is that it retains many of the important elements of an ITS without the time required for a full ITS development. The main disadvantage is that the agent does not model the complete domain knowledge for the student; however, it can guide the student to find this knowledge. The design provides personalized formative and summative advice to the student through an intelligent agent, whilst allowing reasonably rapid application development. The clinical simulations offer greater realism than paper-based exercises, and the system has the advantage over a traditional multimedia system in that it offers the student individualized advice and promotes heuristic learning rather than didactic instruction. A comparative evaluation of the system against a similar system without the intelligent agent is currently being undertaken to establish the value of this approach.

The modular design presents a more accessible method of creating intelligent simulations by limiting specific domain knowledge to a help-system knowledge base and by separating the tutorial advice and rules into a separate expert assessment module. The system itself is adaptable and easily extended: further simulations can be developed and added, and the expert assessment system can easily have advice added to its database, its rules modified or additional rules created. This is offered as a pragmatic approach to creating intelligent simulations that retains many of the advantages of an intelligent tutoring system. There are now third-party expert-system components available for several object-oriented languages, making the development of this type of intelligent system even more straightforward. The potential for future development of this approach is to provide a simulation development environment in which less technically skilled users can model and build their own simulations, in much the same way as existing multimedia design applications are used.

References

Barron, A. E. and Atkins, D. (1994), 'Audio instruction in multimedia education: is textual redundancy important?', Journal of Educational Multimedia and Hypermedia, 3 (3-4), 295-306.

Benner, P. (1984), From Novice to Expert: Excellence and Power in Clinical Nursing Practice, Menlo Park, CA: Addison Wesley.
Beaumont, I. H. (1994), 'User modelling in the interactive anatomy tutoring system ANATOM-TUTOR', User Modelling and User-Adapted Interaction, 4 (1), 21-5.

Chung, C. A. and Huda, A. (1999), 'An interactive multimedia training simulator for responding to bomb threats', Simulation, 1, 68-77.

Clancy, W. (1983), 'GUIDON', Journal of Computer Based Instruction, 10, 8-15.

Ellis, D. (1994), 'Barefoot multimedia, or, all is not what it seems Moriarty', IFIP Transactions A (Computer Science and Technology), A-59, 151-4.

Ellis, P. (1997), 'Processes used by nurses to make decisions in the clinical practice setting', Nurse Education Today, 17, 325-32.

Elsom-Cook, M. (1987), 'Intelligent computer aided instruction at the Open University', Technical Report No. 63, Computer Assisted Learning Research Group, Milton Keynes: Open University Press.

Fletcher, J. D. (1999), 'Using networked simulation to assess problem solving by tactical teams', Computers in Human Behavior, May/July, 375-402.

Garlatti, S. (1998), 'The use of a computerized brain atlas to support knowledge-based training in radiology', Artificial Intelligence in Medicine, 13 (3), 181-205.

Garrett, B. (1997), 'Integrating formative assessment into multimedia simulations for nursing', Proceedings of ACENDIO 1997: The First European Conference of the Association for Common European Nursing Diagnoses, Interventions and Outcomes, London: Royal College of Nursing.

Gordon, M. (1980), 'Diagnostic strategies in diagnostic tasks', Nursing Research, 29, 39-45.

Heitala, P. and Niemirepo, T. (1998), 'The competence of learning companion agents', The International Journal of Artificial Intelligence in Education, 9, 178-92.

Hjelm-Karlson, K. and Stenbeck, H. (1997), 'A simulation that teaches clinical decision making in nursing', Nursing Informatics 1997: International Medical Informatics Association, IOS Press, 492-5.

Henry, S. B. and Waltmire, D. (1992), 'Computerised clinical simulations: a strategy for staff development in critical care', American Journal of Critical Care Nursing, 1 (2), 99-107.

Jackson, P. (1999), An Introduction to Expert Systems, London: Addison Wesley.

Johannson, S. L. and Wertenberger, D. H. (1996), 'Using simulation to test critical thinking skills of nursing students', Nurse Education Today, 16, 323-7.

Krawczak, J. and Bersky, A. K. (1995), 'The development of automated client responses to computerized clinical simulation testing', Computers in Nursing, 13 (6), 295-300.

Lauri, S. (1992), 'Using a computer simulation program to assess the decision making process in child health care', Computers in Nursing, 10 (40), 171-7.

Laurillard, D. (1995), 'Multimedia and the changing experience of the learner', The British Journal of Educational Technology, Sept., 179-89.

Laurillard, D. (1997), 'Learning formal representations in multimedia', in F. Marton, D. Hounsell and N. F. Entwistle (eds), The Experience of Learning, 2nd edn, Edinburgh: Scottish Academic Press, 172-83.

Leibowitz, J. (1999), 'Web based multimedia case studies: ahead of our time?', Kybernetes, 1, 211-15.

Lester, J. C., Stelling, G. D. and Stone, B. A. (1999), 'Lifelike pedagogic agents for mixed-initiative problem solving in constructivist learning environments', User Modelling and User-Adapted Interaction (Netherlands), 1, 1-43.

Lyons, J. (1999), 'Reflective education for professional practice: discovering knowledge from experience', Nurse Education Today, 19 (1), 29-34.
Lyons, J., Miller, M. and Milton, J. (1998), 'Learning with technology: use of case based physical and computer simulations in professional education', Contemporary Nurse: A Journal for the Australian Nursing Profession, 7 (2), 98-102.

Nwana, H. S. (1990), 'Intelligent tutoring systems: an overview', Artificial Intelligence Review, 4, 251-77.

Nwana, H. S. (1993), Mathematical Intelligent Learning Environments, London: Intellect.

Oliver, M. (1999), 'A self paced multimedia learning environment: facilitating the transition of graduating nurses into the workplace', Australian Journal of Advanced Nursing, 16 (4), 42-4.

Padrick, K. P., Tanner, C. A., Putzier, D. J. and Westfall, U. E. (1987), 'Hypothesis evaluation: a component of diagnostic reasoning', Classification of Nursing Diagnosis: Proceedings from the Seventh Conference, St Louis: Mosby.

Putzier, D. J., Padrick, K. P., Westfall, U. E. and Tanner, C. A. (1985), 'Diagnostic reasoning in critical care nursing', Heart and Lung, 14, 430-5.

Seidel, R. J. and Park, O. C. (1994), 'An historical perspective and a model for evaluation of intelligent tutoring systems', Journal of Educational Computing Research, 1, 103-28.

Simms, R. (1997), 'Interactivity: a forgotten art?', Computers in Human Behavior, May, 157-91.

Sleeman, D. and Brown, J. S. (1982), Intelligent Tutoring Systems, London: Academic Press Inc.

Wenger, E. (1990), Artificial Intelligence and Tutoring Systems: Computational and Cognitive Approaches to the Communication of Knowledge, London: Academic Press / Morgan Kaufmann.

White, J. E. (1995), 'Using interactive video to add physical assessment data to computer-based patient simulations in nursing', Computers in Nursing, 5, 233-5.

Woodson, S. (1997), Clinical Simulations in Maternity Nursing 2.