Conducting formative evaluation online
From planning to execution

Amanda Nichols Hess and James L. Moseley

Amanda Nichols Hess is eLearning, instructional technology, and education librarian at Oakland University, email: nichols@oakland.edu, and James L. Moseley is associate professor of learning sciences (retired) at Wayne State University’s College of Education, email: jmosele@comcast.net.

© 2016 Amanda Nichols Hess and James L. Moseley

Assessment in academic libraries is of growing importance, especially in 
data-driven higher education environments. 
Demonstrating value and proving effectiveness are particularly important in instruction, and formative evaluation is one strategy 
that instructional librarians and designers 
alike can use to measure potential value 
and effectiveness ahead of time. This kind 
of evaluation is conducted while a learning 
object, educational tool, or curriculum is still 
under development or before it has been 
widely implemented. 

While formative evaluation is not a new 
idea, the increasing prevalence of online 
learning in academic libraries means that 
formative assessment’s charge to be as true as possible to the environment where learning will happen may take a different form.1 One such instance 
of formative evaluation, described here, assessed an online learning module using free and easy-to-use online tools. These tools and strategies can 
be translated to other environments where 
academic librarians are seeking to engage 
in formative assessment of instructional pro-
grams or objects. 

As noted by Martin Oliver, online teach-
ing and learning programs present a par-
ticular set of possibilities and challenges 
for evaluation and evaluators.2 However, 
Melody M. Thompson asserted that an evalu-
ation, whether for an online or in-person 
product or program, is not an end in itself; instead, it provides a vehicle for asking the right questions in the right ways.3 Asking these questions in the right ways involves not only careful wording but also choosing the right 
evaluation format. In fact, James L. Moseley 
and Nancy B. Hastings indicated that select-
ing the appropriate medium for the delivery 
of a formative evaluation is an essential 
component of an effective and complete 
evaluation design.4 For many online learning 
environments, that appropriate evaluative 
medium may also be online.

Formative evaluation of Copyright 
and You 
This formative evaluation used electronic 
data collection tools and focused on a three-
part online learning module, Copyright and 
You. Created by three librarians at Oakland 
University (OU) Libraries, the course con-
tent was originally created in response to 
a request from Art and Art History depart-
ment faculty. When it became apparent that 
it could be used more broadly, an online 
formative evaluation was designed to de-
termine how to extend Copyright and You’s 
academic reach. 

The eCourse contains lessons on basic 
copyright information, the student as a user 
of content, and the student as a creator 
of content.5 Each lesson delivers written 
instruction and concept explanation, and 
concludes with relevant practice questions. 
At the conclusion of the three-part eCourse, 
learners can take a ten-question assessment 
to test their knowledge. Badges of completion are awarded for scores of at least 8 out of 10. 

Evaluation questions
Since Copyright and You had not undergone 
previous formal evaluation, this process’s 
central concern was whether the course 
was applicable and useful to students across 
academic disciplines. From this overarching 
focus of determining academic applicability, 
two secondary questions developed:

• Is the coverage of content appropriate 
and clear, particularly for learners with no 
copyright knowledge or experience?

• Is the instructional design of the mod-
ule responsive to users, and does it help en-
hance understanding and build knowledge?

Evaluation participants
To answer these questions, subject matter 
experts and undergraduates from a range of 
academic areas were engaged in the forma-
tive evaluation process. Two librarians not 
affiliated with the course, but with consider-
able experience in copyright instruction and 
instructional design, were asked to serve as 
the external expert reviewers and share their 
perspectives on the course. 

Also, a small group of library student 
employees from across academic majors 
was sampled and asked to consider the ap-
plicability, instructional content, and design 
of Copyright and You. These two groups 
provided different kinds of formative feed-
back on the module’s content, design, clarity, 
and usefulness. Moreover, their perspectives 
offered insight into how students and instruc-
tors, the primary and secondary user groups, 
would perceive Copyright and You when 
used in courses.

Collecting data
Perhaps most importantly, all data collec-
tion for this formative evaluation happened 
online using self-designed Google Forms. 

Collecting data online most closely simulated the actual learning environment, since the course’s online design is meant to provide 
convenience and accessibility for a wide 
range of courses and students, including dis-
tance learners. By using Google Forms, both 
expert and student respondents could work, 
impediment-free, through the module’s 
content while concurrently providing their 
thoughts. Also, Google Forms’ simple and 
easy-to-use interface allowed for the creation 
of surveys that captured both qualitative and 
quantitative data through Likert-style and 
guided free-response questions.
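
How the collected responses might then be summarized is worth sketching. Below is a minimal, hypothetical Python example (not part of the original evaluation) that works from a Google Forms response export; Google Forms can route responses to a Google Sheet that is downloadable as CSV, but the column headers here are illustrative placeholders rather than the actual wording of the questionnaire.

```python
import csv
from statistics import mean

# Column headers below are hypothetical placeholders for Likert-style and
# guided free-response questions; they are not the evaluation's actual wording.
LIKERT_COLUMNS = [
    "The directions in Lesson 1 were clear.",
    "The design of Lesson 1 supported my learning.",
]
COMMENT_COLUMN = "Additional thoughts on strengths or weaknesses"

def summarize(csv_path: str) -> None:
    """Print a mean for each Likert item and list the open-ended comments."""
    with open(csv_path, newline="", encoding="utf-8") as handle:
        rows = list(csv.DictReader(handle))

    for column in LIKERT_COLUMNS:
        # Assumes each Likert response was recorded as a number from 1 to 5.
        ratings = [int(row[column]) for row in rows if row.get(column, "").strip()]
        if ratings:
            print(f"{column} mean={mean(ratings):.2f} (n={len(ratings)})")

    comments = [row[COMMENT_COLUMN] for row in rows if row.get(COMMENT_COLUMN, "").strip()]
    print(f"{len(comments)} open-ended comments to review:")
    for comment in comments:
        print(f"- {comment}")

if __name__ == "__main__":
    summarize("student_responses.csv")  # hypothetical name of the downloaded export
```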

Separate Google Forms were created for 
each respondent group, and these docu-
ments considered respondents’ different 
perspectives and points of access to the 
online learning module. The two external 
experts evaluated the learning module as a 
whole with attention toward its content and 
instructional design. One expert focused 
solely on the coverage of content in Copy-
right and You, while the other focused on 
its instructional design.6 The student partici-
pants’ questionnaire asked them to consider 
each section of Copyright and You in terms 
of its clarity of directions, design, content, 
and perceived usefulness.7 Each question-
naire also offered respondents free-response 
space to share any additional thoughts on 
the module’s strengths and weaknesses.
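
As a rough illustration of how such a questionnaire could be structured programmatically, the following Python sketch generates one block of items per lesson, pairing Likert-style ratings of clarity of directions, design, content, and perceived usefulness with a guided free-response prompt. The section titles follow the three lessons described earlier, but the question wording and criteria labels are assumptions for illustration, not the actual instrument.

```python
# Section titles follow the three lessons of Copyright and You; the question
# wording and rating criteria are assumptions for illustration only.
SECTIONS = [
    "Lesson 1: Basic copyright information",
    "Lesson 2: The student as a user of content",
    "Lesson 3: The student as a creator of content",
]
CRITERIA = ["clarity of directions", "design", "content", "perceived usefulness"]

def build_questionnaire() -> list[dict]:
    """Return one Likert item per criterion per lesson, plus a free-response prompt."""
    items = []
    for section in SECTIONS:
        for criterion in CRITERIA:
            items.append({
                "section": section,
                "type": "likert_1_to_5",
                "prompt": f"Rate the {criterion} of {section}.",
            })
        items.append({
            "section": section,
            "type": "free_response",
            "prompt": f"What strengths or weaknesses did you notice in {section}?",
        })
    return items

if __name__ == "__main__":
    for item in build_questionnaire():
        print(f"[{item['type']}] {item['prompt']}")
```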

Collecting data online through Google 
Forms surveys took advantage of several oth-
er technological affordances. For instance, 
the online questionnaire was free and easy 
to share. Also, it was convenient for both 
the evaluator and respondents. Because the survey was hosted online, it could be delivered to participants instantly, and they could then 
immediately submit their feedback. Finally, 
offering the survey through a clickable link 
delivered via email meant students and ex-
perts alike could access it when convenient.

However, survey data were not the only 
information collected from respondents. 
Because the eCourse exists within the uni-
versity’s course management system, Copy-
right and You stores enrollees’ performance 
and participation data. Student responses to 
both the module’s practice questions and 
concluding quiz were captured, along with 
their page views by time and frequency. This 
information proved helpful in better under-
standing and framing students’ thoughts on 
the online learning module.

Lessons learned
By using Google Forms and pulling student 
performance data from the course manage-
ment system, the evaluator was able to 
identify recommendations for Copyright 
and You’s development team, and support 
those recommendations with qualitative and 
quantitative data. However, considering the 
formative evaluation process illuminates 
several applicable lessons for other librarians 
interested in conducting formative program 
evaluations online.

• Gather data from multiple inputs. Collecting both student response and performance data was central to this formative evaluation. In collecting multiple 
data inputs from student participants (evalu-
ation surveys, review question performance, 
certificate attempts/performance), the evalu-
ator could frame student feedback in terms 
of performance. 

These multiple data sets allowed students’ 
comments on the online learning module to 
be considered in light of their performance 
in the module. This showed whether students’ understanding of the content and their perception of that understanding matched or were incongruous. It also helped the evaluator determine which student comments were valid, and which students had completed the evaluation without working through the course at all. 

Again, here, the technology used helped 
enable this type of data collection. The 
self-recording feature of the online learning 
module meant that performance data were 
instantly accessible to the evaluator, any-
where and at any time. This is one advantage 
of an online testing tool over a printed as-
sessment. So whenever possible, collecting more than one kind of data, rather than survey responses or performance data alone, is very valuable, especially online; a minimal sketch of how these inputs might be combined follows.
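
The sketch below assumes that survey responses and a course management system export are both available as CSV files sharing an e-mail identifier; all of the column names and file names are hypothetical.

```python
import csv

def load_by_id(path: str, id_field: str) -> dict[str, dict]:
    """Read a CSV file and index its rows by the given identifier column."""
    with open(path, newline="", encoding="utf-8") as handle:
        return {row[id_field]: row for row in csv.DictReader(handle)}

def frame_feedback(survey_csv: str, lms_csv: str) -> None:
    # "Email Address" is the column Google Forms adds when set to collect
    # e-mail addresses; "email", "quiz_score", and "page_views" are
    # hypothetical column names for a course management system export.
    surveys = load_by_id(survey_csv, "Email Address")
    performance = load_by_id(lms_csv, "email")

    for email, response in surveys.items():
        record = performance.get(email)
        if record is None or int(record.get("page_views", 0) or 0) == 0:
            # Feedback from a respondent with no recorded course activity
            # should be weighed with caution.
            print(f"{email}: no course activity recorded; review this feedback carefully")
            continue
        print(
            f"{email}: quiz score {record.get('quiz_score', 'n/a')}/10, "
            f"{record['page_views']} page views, "
            f"overall rating {response.get('Overall rating', 'n/a')}"  # hypothetical survey item
        )

if __name__ == "__main__":
    frame_feedback("student_responses.csv", "lms_performance_export.csv")  # hypothetical filenames
```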

• Seek diversity in feedback. With an 
online learning module like Copyright and 
You that can have a wide academic impact, 
diversity of opinion should be encouraged 
and sought. Diversity can mean many things. 
For instance, consulting with instructors or 
faculty members outside of the University 
Libraries and the Art and Art History De-
partment could provide future direction on 
implementation of the module in courses. 
A faculty member in the business school or the sociology department, for example, may be able to provide guidance on how the module could be made more useful for their students. 

Diversity of student respondents should 
also be encouraged, particularly when scaling a small-scale evaluation (such as this one) to a larger group. In this particular instance, 
diversity can be achieved in several ways. 
While student respondents were from a 
diverse range of academic backgrounds, 
a more concerted effort to recruit student 
participants from a variety of the university’s 
schools and colleges could offer insight on 
how the module can be shaped to be ap-
plicable to the broadest possible student 
audience. 

Diversity of experience is also impor-
tant. All student respondents worked for 
the library system, and this could have led 
to bias, or even a greater level of copyright 
knowledge. Using a broader cross-section of 
the population in a wider-scale evaluation 
would be a logical next evaluative step.
Here, too, technology can help. Recruitment of diverse populations can be done via email, announcements on a website, or even social media. These technology tools can help widen the net cast by this, and other, formative evaluations; one way to draw a deliberately balanced sample is sketched below. 
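
A balanced participant pool can be drawn with a few lines of Python, inviting the same number of students from each school or college rather than relying on a single convenience group. The roster structure, unit names, and e-mail addresses below are hypothetical assumptions.

```python
import random

def stratified_invitees(roster: dict[str, list[str]], per_unit: int) -> list[str]:
    """Draw up to per_unit students from each school or college in the roster."""
    invitees = []
    for unit, students in roster.items():
        invitees.extend(random.sample(students, min(per_unit, len(students))))
    return invitees

if __name__ == "__main__":
    # Hypothetical student e-mail addresses grouped by academic unit.
    roster = {
        "College of Arts and Sciences": ["a1@example.edu", "a2@example.edu", "a3@example.edu"],
        "School of Business": ["b1@example.edu", "b2@example.edu"],
        "School of Nursing": ["n1@example.edu", "n2@example.edu", "n3@example.edu"],
    }
    for email in stratified_invitees(roster, per_unit=2):
        print(email)
```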

• Improve response rates through greater 
supervision. This formative evaluation was 
very hands-off by design, in part to replicate 
the true nature of the learning experience. 
While this approach may provide more true-to-life feedback, increased scaffolding (and, yes, more structure) could also improve the quality of responses and response rates. An 
in-person think-aloud protocol or even a 
technologically advanced adaptation of the 
procedure using screen capture and voice re-
cording software (e.g., Camtasia) could provide 
respondents with a better understanding of 
the kinds of feedback requested and desired. 
Such scaffolding could also help the evaluator 
collect data from both students and experts on 
thoughts, processes, or difficulties not recorded in the online forms or performance data. 

Conclusion
In responding to the increased importance 
of assessment data, finding meaningful yet simple ways to conduct formative 
evaluation can enhance librarians’ practice 
and improve library services, especially in 
instruction. As more learning content becomes 
available online, both synchronously and 
asynchronously, it is important for academic 
librarians to evaluate these resources before 
they are deployed to patrons. 

Using a free survey tool like Google Forms 
with intended patron groups and subject 
matter experts is one way to collect valuable 
feedback that allows librarians to improve an 
online learning object before it is finalized (if indeed any learning object can ever really be considered finalized). Conducting, and 
learning from, formative evaluations in situ 
can help academic librarians improve their 
services, practices, and instructional offerings. 

Notes
1. Martin Tessmer, Planning and Conducting Formative Evaluations: Improving the Quality of Education and Training (London: Kogan Page, 1993).

2. Martin Oliver, “Evaluating Online Teaching and Learning,” Information Services & Use 20, no. 2/3 (2001): 83–94.

3. Melody M. Thompson, “Evaluating Online Courses and Programs,” Journal of Computing in Higher Education 15, no. 2 (2004): 63–84.

4. James L. Moseley and Nancy B. Hastings, “Is Anyone Doing Formative Evaluation?,” in The 2008 Pfeiffer Annual: Training, ed. Elaine Biech (Hoboken: Wiley, 2007), 233–40.

5. Julia Rodriguez, Katie Greer, and 
Barbara Shipman, “Copyright and You: 
Copyright Instruction for College Students 
in the Digital Age,” Journal of Academic 
Librarianship 40, no. 5 (2014): 486–91.

6. See the sample forms at bit.ly/CRLNewsformative.

7. Ibid. 
