Running Header: RESPONDING TO STUDENT FEEDBACK

Improving the Instruction of Engineering Calculus: Responding to Student Feedback

Barbara M. Moskal
Colorado School of Mines
bmoskal@mines.edu
303-273-3867

An earlier version of this paper, titled "Using Student Feedback to Improve Instruction in Engineering Calculus", appears in the Proceedings of the Frontiers in Education Conference, Kansas City, MO, 2000.

Acknowledgements

Teri Woodington supported many of the electronic resources discussed here and Barbara B. Bath designed the Engineering Calculus sequence at CSM.

Abstract

The purpose of this article is to illustrate how student feedback was used for instructional improvement in a sequence of Engineering Calculus courses. The methods employed here are appropriate for other classrooms and disciplines. This article describes the instruction that the students received and the feedback that the students provided. This feedback was used to design the next mathematics course that these students completed. After completing the next course, the students were asked to provide feedback on the changes that had been made.

Index Terms: calculus, classroom assessment, classroom research, college mathematics, course evaluations

Improving the Instruction of Engineering Calculus: Responding to Student Feedback

In an article written for pre-college teachers, I proposed a model (Moskal, 2000a) of the classroom assessment process that consists of four phases: planning, gathering, interpreting and using. This model is equally appropriate for college-level instruction and is shown in Figure 1. The vertical columns divide the model into the phases of assessment and the rectangles represent the outcome of each phase. The primary mediators of each phase are distinguished in the model by circles.
Each phase of the assessment process implies an action on the part of the instructor (i.e., the instructor plans, gathers, interprets, and uses) and each concludes with one or more outcomes.

[Figure 1. Model of the Classroom Assessment Process. The figure depicts the four phases (Planning, Gathering, Interpreting, Using), their outcomes (assessment instrument, student response, information, and feedback/classroom decision making), and the primary mediator of each phase (instructor or student).]

The planning phase includes the processes of selecting or developing assessment items and assembling these items into an instrument. The outcome of this phase is the attainment of an assessment instrument. Currently there is a large body of information available to assist college instructors in selecting appropriate assessment instruments. These instruments may be designed to assess individual performances, group performances or the effectiveness of a course (Angelo & Cross, 1993; Brookhart, 1999; Lewis, Aldridge & Swamidass, 1998; Mehta & Schlecht, 1998; Moskal, 2000b; Moskal, Knecht & Pavelich, 2001; Shaeiwitz, 1998). On-line databases are also available that can assist college instructors in finding assessment instruments that meet their classroom needs (Brissenden & Slater, 2001; Southern Illinois University, Edwardsville, 2001).

The gathering phase begins when the instructor administers the selected assessment instrument to students. The students then interpret the requests of their instructor and the tasks as they construct their responses. Unlike the previous phase, which is mediated by the instructor, the gathering phase is primarily mediated by the student. Although the professor administers the task to the students, it is the student who controls what appears in the response. The interpretation phase consists of the instructor's efforts to make sense of students' responses and results in the acquisition of information.
The interpretation phase is supported by the use of measurement tools (e.g., scoring rubrics or checklists) and often the application of statistical techniques (Angelo & Cross, 1993; Deek et al., 1999; McNeil, Bellamy & Burrows, 1999; Moskal, 2000b; Moskal & Leydens, 2000). The assessment event, which is a single pass through the assessment process, concludes with the application of the acquired information to serve particular purposes. The use phase may have several outcomes. Two commonly identified uses of classroom assessment information are to assist instructors in making appropriate instructional decisions (Angelo & Cross, 1993; Brissenden & Slater, 2001; Brookhart, 1999) and to enable them to provide accurate feedback to their students (Brissenden & Slater, 2001; Brookhart, 1999; Shaeiwitz, 1998). How the information will be used should be considered in the planning phase in order to guide the selection of appropriate assessment instruments.

Current research (Angelo & Cross, 1993; Brookhart, 1999; Moskal, Knecht & Pavelich, 2001) has emphasized the importance of completing the entire assessment process, which includes the use phase. It is during this final phase that instructional improvements take place. Using assessment information for instructional improvement is one of the most important and most frequently neglected components of classroom assessment (Angelo & Cross, 1993; Brookhart, 1999).

In many colleges, a common assessment practice designed to evaluate the effectiveness of a course is the administration of a student survey at the end of the semester. One purpose of this survey is to allow students to provide feedback that may be used to improve instruction in future courses. Since faculty often teach a different course the next semester, the information acquired at the end of a course may not be useful in the refinement of the next course.
Additionally, if the selection of the assessment instrument is completed by the institution, the given questions may not be relevant to the instructional needs of the course instructor. Poor evaluations are also often explained by the respective instructor as the result of unmotivated students, heavy teaching loads or an invalid rating system (Lucas, 1999). In addition, faculty have argued (Coburn, 1984) that students lack the technical expertise to evaluate course content or instructional style. This, they explain, may result in an overemphasis on the evaluation of the teachers' popularity rather than their teaching ability. Concerns have also been raised that an overemphasis on course evaluations results in grade inflation and a reduction in the amount of material that is covered in a given course (Wilson, 1998).

A great deal of research has examined the validity and reliability of student course evaluations. Cashin (1995) has reported that more than 1500 articles and books have been written that address the development, design and appropriateness of student evaluations. Based on these resources, he determined that well-designed course evaluations can provide valid and reliable results. Other researchers have provided support for this claim (e.g., Coburn, 1984; Peterson & Kauchak, 1982). Researchers (Brookhart, 1999; Howard & Maxwell, 1980; Scriven, 1995) have also found that higher grades do not necessarily result in higher course evaluations. In other words, many of the concerns that have been raised with respect to course evaluations appear to be unfounded. A well-designed student evaluation system can produce valid and reliable results.

The purpose of this paper is to describe a two-course sequence of engineering calculus at the Colorado School of Mines (CSM) and to illustrate how student feedback was used for instructional improvement.
The first course is Honors Engineering Calculus II, which covers vectors, vector functions, partial derivatives and multiple integrals. The students admitted to this course are first-semester freshmen who scored a 4 or 5 on the AB Advanced Placement Test (Bath, 1999). The next semester, these same students completed Honors Engineering Calculus III, which covers vector calculus, sequences and series, and an introduction to differential equations. A variety of instructional techniques were used in the first course and the students were asked to evaluate these techniques through a course evaluation. This information was used to guide the development of the next course, resulting in the completion of the assessment event. The students in the next course were asked to evaluate the impact of these changes, thus beginning a new assessment event.

Honors Engineering Calculus II

This section describes the structure of Honors Engineering Calculus II, the evaluation techniques used in that course and the results of the evaluation.

Students

Of the 35 students who completed Honors Engineering Calculus II, 7 students were female, 1 student was international and 1 student was of Asian descent. The remaining students were Caucasian.

Textbook

The textbook was Calculus: Concepts and Contexts by James Stewart (1998). According to the preface of the text, it is designed to focus upon the development of students' conceptual understanding. This is achieved through a combination of geometric, numerical and algebraic approaches and the application of technology to problem-solving situations.

Course Design

Honors Engineering Calculus II is a four-credit course. During the semester of interest, the class met for one hour on Monday and Wednesday and two hours on Friday. On Monday and Wednesday, a modified lecture format was used in which the students were encouraged to actively participate by asking questions and offering suggestions.
Physical objects were brought to class to illustrate many of the concepts (e.g., a wire was used to illustrate a space curve and a ball was used to illustrate the concepts underlying the calculation of the surface area of a sphere). On Friday, the students met for two hours in the computer lab to solve problems in teams of three or four students. Sometimes the problems required the use of the computer program Mathematica, and other times they did not. The students' ability to manipulate physical objects and their ability to use Mathematica were not evaluated on exams. These activities were designed to deepen the students' conceptual understanding as concepts were introduced. The students were required to submit the completed Mathematica assignments. Due to the rigorous structure of the course, there was very little time to answer questions on the assigned homework or to give in-class quizzes on the material. For this reason, the solutions to the homework were made available in the library and the quizzes were completed as take-home assignments.

Web-based Support

Throughout the course, electronic media were used to support the learning process. Lecture notes and solutions to quizzes were posted on the web. An electronic discussion group was maintained. Tests and solutions from prior years were made available electronically. Students had access to their instructor via e-mail. The option was also available for students to provide anonymous feedback to their instructor via e-mail at any point during the semester. Many of these resources could have been made available in a paper format; however, by using the web, the overall expense of distributing this information was reduced.

Closed Response Survey

At the end of Honors Engineering Calculus II, the students were asked to complete a survey in which they rated the extent to which each of the instructional techniques impacted their learning process.
The four-point scale ranged from "No Impact" to "Significant Impact." Students were asked not to include personal identification on the survey and to indicate the grade that they expected to receive in the course.

Short Response Survey

The short response survey is administered at the end of each semester in all departmental courses. In the current course, the short response survey was completed before the closed response survey. The questions that comprise this instrument are as follows:

1. What aspects of instruction did you find effective for promoting your learning?
2. What recommendations would you make that would improve the instruction that you received in this course?
3. If you have any additional comments, please write them in the space below.

Closed Response Survey: Across Students

Table 1 displays the activities that the students evaluated. A higher average rating suggests stronger student agreement that the given activity positively impacted their learning. Responses that indicated that a given activity was "Not applicable" were not included in this analysis. The highest rated course component was the electronic availability of solutions to prior tests via the web. The other components of the course that were rated as having a "Strong Impact" were classroom instruction, the three unit tests, the textbook, access to information concerning the course via the instructor's web page and the take-home quizzes. These were closely followed by group work, availability of course notes on the web and the use of manipulatives in class. The activities in the course that were rated as having "No Impact" or a "Slight Impact" on student learning were: the use of the computer program Mathematica; the availability of the electronic discussion group; the availability of providing electronic anonymous feedback to the instructor via the web; and the availability of solutions to take-home quizzes on the web.
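The screening and averaging described above can be sketched in a few lines of Python. This is a hypothetical illustration only: the function name and the sample responses are invented for the example and are not data from the study. It shows the two computations the tables report, a mean rating per item with "Not applicable" responses dropped, computed across all students and again within each expected-grade group.

```python
def mean_rating(ratings):
    """Average 1-4 survey ratings, dropping "Not applicable" (None) entries."""
    valid = [r for r in ratings if r is not None]
    return sum(valid) / len(valid) if valid else None

# Each response pairs a student's expected grade with that student's rating
# of one instructional technique (None marks a "Not applicable" response).
# These five responses are invented sample data.
responses = [("A", 4), ("A", 3), ("B", 3), ("B", None), ("C", 2)]

overall = mean_rating([r for _, r in responses])          # mean across students
by_grade = {g: mean_rating([r for eg, r in responses if eg == g])
            for g in ("A", "B", "C")}                     # mean within each grade group
```

With this sample data, `overall` is 3.0 and `by_grade` is `{"A": 3.5, "B": 3.0, "C": 2.0}`; the n reported with each cell in the tables is simply the count of non-excluded ratings.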
Closed Response Survey: Within Grade Categories

At the start of the survey, the students were asked to indicate the grade that they expected to receive in the course. Thirteen, sixteen and six of the students expected to receive an "A", "B", and "C", respectively. The actual assignment of grades resulted in 11, 17 and 7 students receiving an "A", "B", and "C", respectively. Since the student-predicted distribution closely approximated the actual distribution of grades, it is likely that the student-predicted grades were accurate indicators of the actual grades that they attained. The final grades in this course were high, which is not surprising given the demanding screening process to enter the course.

Table 1
Ratings of Instructional Techniques in Calculus II

Questions                                                   Mean         "A"          "B"          "C"
Availability of previous tests and solutions on the web.    3.26 (n=34)  3.46 (n=13)  3.20 (n=15)  3.00 (n=6)
Classroom instruction.                                      3.23 (n=35)  3.15 (n=13)  3.31 (n=16)  3.17 (n=6)
The three chapter tests.                                    3.20 (n=35)  3.23 (n=13)  3.25 (n=16)  3.00 (n=6)
Textbook.                                                   3.09 (n=34)  2.69 (n=13)  3.27 (n=15)  3.50 (n=6)
Access to course information via instructor's web page.     3.03 (n=34)  2.92 (n=13)  3.07 (n=15)  3.17 (n=6)
Take-home quizzes.                                          3.00 (n=35)  3.15 (n=13)  2.81 (n=16)  3.17 (n=6)
Group work.                                                 2.97 (n=35)  3.23 (n=13)  2.69 (n=16)  3.17 (n=6)
Availability of course notes on the web.                    2.94 (n=35)  2.92 (n=13)  2.94 (n=16)  3.00 (n=6)
Concrete manipulatives (physical objects).                  2.89 (n=35)  2.77 (n=13)  3.06 (n=16)  2.67 (n=6)
Homework assignments.                                       2.79 (n=33)  2.67 (n=12)  2.93 (n=15)  2.67 (n=6)
Availability of solutions to homework problems.             2.72 (n=33)  2.17 (n=12)  3.13 (n=15)  2.83 (n=6)
Access to your instructor via electronic mail.              2.26 (n=31)  2.08 (n=12)  2.31 (n=13)  2.50 (n=6)
Availability of solutions to take-home quizzes.             2.08 (n=31)  2.08 (n=12)  1.93 (n=14)  2.00 (n=5)
Ability to provide electronic anonymous feedback.           1.85 (n=29)  1.82 (n=11)  1.75 (n=12)  2.17 (n=6)
Electronic Discussion Group.                                1.79 (n=33)  1.85 (n=13)  1.73 (n=15)  1.80 (n=5)
The use of the computer program, Mathematica.               1.29 (n=35)  1.23 (n=13)  1.31 (n=16)  1.33 (n=6)

Of the students who expected to receive an "A" in the course, the highest rated component was the availability of solutions to prior tests on the web. This was followed by group work and chapter tests. Instruction and take-home quizzes were also highly rated. For the students who expected to receive a "B" in the course, instruction was rated highest and was closely followed by the textbook, the chapter tests and the availability of previous test solutions on the web. For the students who expected to receive a "C" in the course, the highest rated component was the textbook. This was followed by classroom instruction, take-home quizzes, group work, and access to information concerning the course via the instructor's web page. The only component of the course that was consistently rated in the top five across groups was instruction. Across all three groups, the lowest rated component of the course was the use of the computer program, Mathematica. Across all three groups, the availability of the solutions to take-home quizzes on the web, the electronic discussion group, the availability of providing electronic anonymous feedback to the instructor via the web, and access to the instructor via e-mail were rated in the bottom five course components.

Short Response Survey: Students' Written Comments

The students' written comments provided further insight into why a given component of the course was or was not effective for promoting learning. Thirty-four out of thirty-five students completed the short response survey. Twelve students explained that classroom instruction was greatly enhanced by the visual aids.
One student wrote, "Props (straws, balls, wire) are very effective in visualizing in 3D" and another student wrote, "The visual aids were always helpful as well as 'entertaining'." Although the students had not rated the manipulatives as highly as they had rated instruction on the closed response survey, their comments indicated that these activities had contributed to the high rating of instruction.

Another component of the course about which the students frequently commented was the availability of the notes on the web. Fifteen students commented on the effectiveness of this approach. One student stated, "The notes on the web is the biggest help." Although the students did not explain on the short response survey why this was useful, several students had stated during the semester that by printing the notes out before class they could spend class time listening rather than "frantically writing."

The students not only provided comments on what was effective, they also made suggestions as to how to improve the course. Overall, the students had highly rated the group work on Fridays. Four students had provided positive comments on the short response survey on the effectiveness of the group work for promoting their learning. However, 13 students complained either that there was a need for more in-class instruction or that the time spent in groups was too long. Their reactions indicated that although group work was helpful, it may have been overdone. For example, one student explained, "More time allotted for difficult concepts. We are sometimes pressured for time as we only have 2 hours in the classroom a week". The same student suggested, "3 hours in the classroom/1 hour in lab [group work]." Another student complained, "It is thrown at us for 2 days, then we get tested. Do we need to spend every Friday just on w/sheets? [group activities]."
Three of the students complained about the take-home quizzes, e.g., "Friday quizzes are too long [group work]. Take home quizzes are even longer." However, 8 students commented on the effectiveness of this technique for promoting their learning. One student stated, "The quizzes, I hated doing, but they really helped me learn." In other words, the majority of the student responses supported the effectiveness of this technique.

Anonymous Feedback Via E-mail

Over the semester, I received three anonymous messages; one contained a sequence of nonsense letters and the statement, "Wanna learn gibberish?" and another student wrote, "If I did my homework, I would be much better off. So I think I will do some homework this weekend." The remaining message complained extensively about the amount of work that was required in the course. Although the given student was obviously unhappy, the feedback that he or she provided was not helpful for improving the course.

Honors Engineering Calculus III

This section describes the changes that were made in the next course, Calculus III, and the results of the student evaluations of these changes.

Students

I had 32 students in Calculus III. Twelve (38%) of these students had been in my class the previous semester. Nine students were female and one student was African American. The remaining students were Caucasian.

Changes

Calculus III, which was also a four-credit course, met four times a week. In response to the students' recommendations, I reduced group work to one hour a week and allowed more in-class time for questions. I continued to place my course notes on the web and use concrete materials to illustrate the concepts that were being addressed. My course web page provided the students with links to my notes, other instructors' notes, solutions to prior tests and solutions to quizzes.
Although the students had indicated that the availability of the solutions to quizzes had only a minimal impact on their learning, maintaining this resource took very little time and it provided one form of feedback to my students on how to solve the problems. I stopped supporting the electronic discussion group and the option of providing electronic anonymous feedback. Mathematica was also eliminated.

Follow-up Surveys

An altered version of the closed response survey from the previous semester was administered at the end of Calculus III. The questions that referenced Mathematica, the electronic discussion group, and the option of providing anonymous feedback were eliminated from the survey. Additionally, the students were asked to indicate whether they had been in my class the previous semester. Four questions were added to the survey in which the students rated the changes that had been made on a four-point scale that ranged from "Very Bad Change" to "Very Good Change." The students also had the option of indicating that they had no opinion.

Closed Response Survey: Across Students

Twenty-nine students completed the closed response survey and their responses are summarized in Table 2. When a response indicated that a given activity was "Not applicable", it was eliminated from the analysis. Three of the activities that had been rated in the top five during the previous semester were rated in the top five during the current semester: chapter tests, availability of previous tests and solutions on the web, and instruction. The students rated the take-home quizzes as having the strongest impact on their learning experience. Group work, the three chapter tests, the availability of previous tests and solutions on the web, classroom instruction and access to information concerning the course via the instructor's web page were rated as having had a "Strong Impact" on learning.
The remaining activities were rated as having had at least a slight impact on student learning; none of the activities were rated as having "No Impact" on learning.

Table 2
Ratings of Instructional Techniques in Calculus III

Questions                                                   Mean         "A"          "B"          "C"
Take-home quizzes.                                          3.66 (n=29)  3.64 (n=11)  3.67 (n=15)  3.67 (n=3)
Group work.                                                 3.41 (n=29)  3.45 (n=11)  3.40 (n=15)  3.33 (n=3)
The three chapter tests.                                    3.10 (n=29)  3.27 (n=11)  3.00 (n=15)  3.00 (n=3)
Availability of previous tests and solutions on the web.    3.07 (n=28)  3.18 (n=11)  3.14 (n=14)  2.33 (n=3)
Classroom instruction.                                      3.07 (n=29)  3.45 (n=11)  2.80 (n=15)  3.00 (n=3)
Access to course information via instructor's web page.     3.00 (n=26)  3.10 (n=10)  3.00 (n=14)  2.50 (n=2)
Homework assignments.                                       2.97 (n=29)  3.09 (n=11)  2.93 (n=15)  2.67 (n=3)
Textbook.                                                   2.86 (n=29)  3.00 (n=11)  2.73 (n=15)  3.00 (n=3)
Concrete manipulatives (physical objects).                  2.65 (n=29)  2.73 (n=11)  2.60 (n=15)  2.67 (n=3)
Availability of course notes on the web.                    2.52 (n=27)  2.30 (n=10)  2.79 (n=14)  2.00 (n=3)
Availability of solutions to take-home quizzes.             2.43 (n=28)  1.90 (n=11)  2.67 (n=15)  3.50 (n=2)
Availability of solutions to homework problems.             2.18 (n=22)  2.00 (n=9)   2.17 (n=12)  4.00 (n=1)
Access to your instructor via electronic mail.              2.00 (n=21)  2.25 (n=8)   1.92 (n=12)  1.00 (n=1)

As discussed earlier, four questions had been added to this survey in which the students were asked to evaluate the changes that had been made since the previous semester. Only students who had been in my class the previous semester were included in this analysis. Nine students responded that the elimination of the computer program Mathematica was either a good change (n=2) or a very good change (n=7). Four students indicated that the elimination of the discussion group was a good change and 4 students indicated that this was a bad change.
Only 3 students responded to the question concerning the electronic anonymous feedback to the instructor and all three indicated that this was a good change. In response to the question concerning the reduction of group work, 1 student indicated that this was a "Very Bad Change", 9 students indicated that this was a "Bad Change" and 1 student indicated that this was a "Good Change".

Closed Response Survey: Within Grade Categories

As was done the previous semester, the students were asked to indicate the grade that they expected to receive in the course. Eleven, fifteen and three of the students expected to receive an "A", "B", and "C", respectively. The actual assignment of grades resulted in 8, 13, 10 and 1 students receiving an "A", "B", "C" and "D", respectively. Based on this distribution, many students over-predicted the actual grades that they would receive.

Take-home quizzes were rated as the activity that had the greatest impact on learning by both students who expected to receive an "A" and students who expected to receive a "B" in the course. In all three groups, take-home quizzes, group work and the three chapter tests were rated in the top five activities. The lowest rated activity by the students who expected to receive an "A" was the availability of solutions to take-home quizzes on the web. The lowest rated activity for students who expected to receive a "B" or "C" in the course was access to their instructor via electronic mail.

Short Response Survey: Students' Written Comments

In general, the comments that the students provided with respect to the course were favorable. This is illustrated in the following examples: "Enjoy your teacher and its easier to learn," and "I liked going to calculus this semester. The class wasn't just a regular old boring lecture." As had been the case in the previous semester, the written comments also indicated the aspects of instruction that had supported their learning process.
Nine students commented on the effectiveness of the use of manipulatives. Two of these students suggested that even more visual demonstrations be made, i.e., "More visual aids!" and "MORE TOYS!" Although many students had indicated that the reduction of group work was a bad change on the closed response survey, only 3 students commented on this component of the course on the short response survey. One student indicated, "I think that the course should be held 3 days a week with one two hour lab section, like Calc 2 honors." This was the only comment that strongly supported returning to the previous course design. The remaining comments indicated that the group work completed during the current course had been useful.

By reducing the group work, I had more time in class to devote to student questions. Six students commented that this was an important component of their learning experience, e.g., "I think it's amazing that you can spend as much time answering the homework questions as you do and still get through all the material!" and "she is always willing to answer questions." Eleven students also indicated that the notes on the web continued to be useful.

Concluding Remarks

An important component of the assessment process is using the information that is acquired for instructional improvement purposes. In this study, I had the opportunity to collect information from my students and use that information to design the next course in the sequence. The changes that I made were: 1) the reduction of group work, 2) the elimination of Mathematica, 3) the elimination of the discussion group and 4) the elimination of the option of providing electronic anonymous feedback to the instructor. Both the elimination of Mathematica and the elimination of the option of providing electronic anonymous feedback were well received by the students. Their reactions to the other two changes were mixed.
I continue to believe that Mathematica or some other three-dimensional graphing software could have a positive impact on my students' understanding of calculus concepts. In interpreting my students' negative responses to this program, I have concluded that it was my method of implementation that was ineffective. I spent very little time introducing the software and assumed that my students would be able to use this tool effectively. Based on student feedback, this is not what happened. In the future, I intend to reintroduce Mathematica into my classroom. This time, I will do so slowly and with more careful attention to my students' learning needs.

Another change that I had made in my classroom was a reduction in the amount of group work. This provided me the opportunity to increase the time that was devoted to students' questions. Although having more time for questions was well received, the reduction in group work was not. The students' responses suggest that they wanted both more time for questions and more opportunities to work in groups. Increasing both of these activities is not feasible without increasing the time in class.

During the semester that the course discussion group was available, I logged in on a regular basis and responded to the students' questions. When this activity was eliminated, I had more time for planning class and organizing the course web site. The students had highly rated both classroom instruction and the course web site. In other words, for each of the changes that were made, there were tradeoffs of which the students were unaware. In order to determine when the benefits outweighed the drawbacks, I needed to move beyond the student responses and consider how the evaluation was completed. The short response survey was administered before the closed response survey. This ordering was purposeful.
The closed response survey directs the students to the specific changes that had been made and asks the students to evaluate the impact of each change. The short response survey allows the students to select what they will discuss. If I had administered the closed response survey first, the students' responses to the short response survey may have mirrored the concerns that had been raised through the closed response survey. In other words, the closed response survey could have directed the students to consider specific issues. By administering the short response survey first, I hoped to capture the concerns that were foremost in the students' minds.

In Calculus III, only one student indicated on the short response survey that there was an inadequate amount of group work. None of the students recommended the reintroduction of the discussion group. The students needed to be directly asked about these changes before commenting on their impact. This suggests that these issues were not pressing concerns for the majority of students. Coupling this observation with the overall positive comments made on the short response survey supports the assertion that, overall, the changes had improved the course.

Another observation that can be made through this study is the value of combining information that is collected through different forms of assessment. The importance of using multiple sources of data has been given a great deal of attention in the assessment literature (Angelo & Cross, 1993; Brissenden & Slater, 2001; Brookhart, 1999). It was through the combination of the information provided through the closed response survey and the short response survey that I was able to make sense of what was and was not working within the given courses.
After changes were implemented, it was through the combination of information acquired through the two surveys, and through the examination of the process used to administer the surveys, that I was able to determine the extent to which the given changes had been effective.

References

Angelo, T. A., & Cross, K. P. (1993). Classroom Assessment Techniques: A Handbook for College Teachers (2nd edition). San Francisco, CA: Jossey-Bass.

Bath, B. (1999). Using Student Feedback to Improve Instruction in Engineering Calculus. Unpublished manuscript.

Brissenden, G. & Slater, T. (2001). Field-tested Learning Assessment Guide: For Science, Math, Engineering and Technology Instructors [On-line]. Available: http://www.wcer.wisc.edu/nise/cl1/flag/default.asp?startpage=flag1.asp.

Brookhart, S. (1999). The Art and Science of Classroom Assessment: The Missing Part of Pedagogy. ASHE-ERIC Higher Education Report (Vol. 27, No. 1). Washington, DC: The George Washington University, Graduate School of Education and Human Development.

Cashin, W. E. (1995). "Student ratings of teaching: The research revisited." IDEA Paper, (32) [ERIC Document Reproduction Service No. ED 402 338].

Coburn, L. (1984). "Student evaluation of teacher performances." ERIC/TME Update Series [On-line]. Available: http://ericae.net/edo/ED289887.htm.

Deek, F., Hiltz, S. R., Kimmel, H. & Rotter, N. (1999). "Cognitive assessment of students' problem solving and program development skills." Journal of Engineering Education, 88 (3), 317-326.

Howard, G. & Maxwell, S. (1980). "Correlation between student satisfaction and grades: A case of mistaken causation." Journal of Educational Psychology, 72 (6), 810-820.

Lewis, P., Aldridge, D. & Swamidass, P. M. (1998). "Assessing teaming skills acquisition on undergraduate project teams." Journal of Engineering Education, 87 (2), 149-155.

Lucas, A. (1999). "Reaching the unreachable: Improving the teaching of poor teachers."
The Department Chair. Bolton, MA: Anker.

McNeill, B., Bellamy, L. & Burrows, V. (1999). "A quality based assessment process for student work products." Journal of Engineering Education, 88 (4), 485-500.

Mehta, S. & Schlecht, N. W. (1998). "Computerized assessment techniques for large classes." Journal of Engineering Education, 87 (2), 167-172.

Moskal, B. (2000a). "An assessment model for the mathematics classroom." Mathematics Teaching in the Middle School, 6 (3), 192-194.

Moskal, B. (2000b). "Scoring rubrics: What, when and how?" Practical Assessment, Research & Evaluation, 7 (3) [On-line]. Available: http://ericae.net/pare/getvn.asp?v=7&n=3.

Moskal, B., Knecht, R. & Pavelich, M. (2001). "The design report rubric: Assessing the impact of program design on the learning process." Journal for the Art of Teaching: Assessment of Learning, 8 (1), 18-33.

Moskal, B. & Leydens, J. (2000). "Scoring rubric development: Validity and reliability." Practical Assessment, Research & Evaluation, 7 (10) [On-line]. Available: http://ericae.net/pare/getvn.asp?v=7&n=10.

Peterson, K. & Kauchak, D. (1982). Teacher Evaluation: Perspectives, Practices, and Promises. Salt Lake City, UT: Utah University, Center for Educational Practices. [ERIC Document Reproduction Service No. ED 233 996].

Scriven, M. (1995). "Student ratings offer useful input to teacher evaluations." Practical Assessment, Research & Evaluation, 4 (7) [On-line]. Available: http://ericae.net/pare/getvn.asp?v=4&n=7.

Shaeiwitz, J. A. (1998). "Classroom assessment." Journal of Engineering Education, 87 (2), 179-181.

Southern Illinois University, Edwardsville (2001). Classroom Assessment Techniques [On-line]. Available: http://www.siue.edu/~deder/assess/catmain.html.

Stewart, J. (1998). Calculus Concepts and Contexts. Washington, DC: Brooks/Cole Publishing Company.

Wilson, R. (1998). "New research casts doubt on value of student evaluations of professors."
The Chronicle of Higher Education, 44 (9), A12-A14.