Evidence Based Library and Information Practice 2008, 3:4
Evidence Based Library and Information Practice
Article
The Information Seeking Behavior of Undergraduate Education Majors: Does Library
Instruction Play a Role?
Jason Martin
Assistant Librarian, University of Central Florida Libraries
Curriculum Materials Center
Orlando, Florida, United States of America
E-mail: mjmartin@mail.ucf.edu
Received: 23 July 2008 Accepted: 09 October 2008
© 2008 Martin. This is an Open Access article distributed under the terms of the Creative Commons
Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use,
distribution, and reproduction in any medium, provided the original work is properly cited.
Abstract
Objective – This study investigated the information seeking behavior of
undergraduate majors to gain a better understanding of where they find their
research information (academic vs. non-academic sources) and to determine if
library instruction had any impact on the types of sources used.
Methods – The study used a convenience sample of 200 students currently
enrolled as undergraduates at the University of Central Florida’s College of
Education. A chi-square test of association was conducted to determine if the
proportion of undergraduate Education majors who use academic sources as
compared to non-academic sources varied depending on whether the students
had attended at least one library instruction session.
Results – The majority of students surveyed find their research information on the
freely available Web, even though they admit that academic sources are more
credible. At an alpha level of .05, types of sources used for research were not
statistically significantly related to whether the student attended library instruction
sessions (Pearson χ2 (2, N = 200) = 1.612, p = .447, Cramer’s V = .090).
Conclusion – These results are supported by other studies that indicate that
today’s college students are using freely available Internet sites much more than
library resources. Little to no association appears to exist between “one-shot”
library instruction sessions and the sources used by students in their research.
Serious consideration needs to be given to multiple library instruction sessions
and to for-credit library courses over one-shot classes.
Introduction
A February 2007 editorial in the Washington
Post stated that judges had cited Wikipedia
four times as often as the Encyclopedia
Britannica in their judicial opinions over the
previous year. The editorial goes on to
praise wikis, YouTube, and other “open-
source projects” as an “unstoppable
movement toward shared production of
knowledge” (Sunstein). While sites such as
Wikipedia are valuable for a myriad of
reasons, including the community creation
of knowledge, librarians, teachers, and other
information professionals must wonder at
the reasons why judges, extremely learned
men and women, would choose Wikipedia
over an esteemed source such as the
Encyclopedia Britannica for their opinions and
what, if any, evaluation techniques they
used when selecting this resource. Similar
concerns arise regarding the information
seeking behavior of students in higher
education. College students’ strong
preference for quickly and easily accessible
Web sites is an issue for librarians, college
professors, and others in higher education.
Opting for information quickly available on
the Internet hinders the development of
students’ research skills and provides them
with only a small fraction of the information
available on any given topic. Students
relying only on Internet resources will not
only be deficient in their knowledge of a
subject, but also in how to find more
information on that subject.
Information seeking can be defined as “the
interactions between people, the various
forms of data, information, knowledge, and
wisdom that fall under the rubric of
information, and the diverse contexts in
which they interact” (Todd 27). Liao, Finn,
and Lu divide information seeking into
three broad categories: initiating, searching,
and locating (9). Others have argued that
information seeking should not be seen in
such rigid and linear frames. Instead, they
suggest that the process of finding
information should be viewed as subjective
and influenced by previous experiences,
knowledge, and opinions (Weiler 51).
However one approaches the concept of
information seeking, it is clear that this is an
important skill for students to possess.
Those individuals who are deficient in
information seeking skills have difficulty in
knowing when information is needed, the
value of libraries in finding information, and
how to evaluate the sources they do find
(Gross 155). Without these skills, students
will perform poorly in the classroom,
making the professor’s job more difficult
and ultimately reflecting poorly upon the
university. The problems, however, extend
beyond the classroom. These same skills are
needed when graduates seek home or small
business loans, research options for their
retirement plan, or seek to make informed
decisions in local or national elections.
Research and evaluation skills learned in the
classroom are needed throughout life.
The information seeking behavior of
“NextGen” or “Millennial” students is a
matter of great concern for those in higher
education. The difference in credibility
between a Web and a print source document
is negligible to these college students
(Abram and Luther 34). Indeed, Long and
Shrikhande report, “Students often simply
type terms in Google and scan the results
until information on their topic is found. No
assessment of quality, reliability, or accuracy
generally occurs” (358). While some have
argued that the growth of the Internet
should be seen simply as the development
of a new research methodology, rather than
as a decline of research skills, it appears that
more and more students are forsaking the
library altogether (O’Brien and Symons 411).
Several studies lend legitimacy to these
assertions. A study conducted in the United
Kingdom in 2005 found that 45% of the
students in that study began their academic
research with Google, while another 23%
used a different commercial search engine
such as Yahoo!, Lycos, or AltaVista. Over
two-thirds of the students in this study
began their research on the Internet rather
than in the academic library (Griffiths and
Brophy 545). One reason for this may be that
students simply find the Internet easier to
use than the library. The study further
found that students had difficulty using
library resources and were willing to
sacrifice quality for ease of use (Griffiths and
Brophy 548).
Today’s college students are definitely at
ease with the Internet. A report from the
Pew Internet and American Life Project
reported that 86% of college students have
gone online, and that 20% of today’s
students began using a computer between
the ages of 5 and 8 (Jones 2). The study also
found that college students are positive
about the Internet, using it for both
academic and social/recreational needs. The
study found that 78% of the students used
the Internet for fun, and 73% of them
admitted using the Internet more than they
used the library (Jones 2-3). In fact, 80% of
the students stated they used the library less
than three hours a week. Many remarked
that finding information on the Internet was
easier than using the library (Jones 12-3).
Ease of use is an important component in
the information seeking behavior of
Millennial generation college students.
Academia is filled with jargon that only the
most experienced understand. Added to this
difficulty is the archaic and technical
language used in library catalogs and
database subject headings (Bodi 111). These
barriers make it difficult for time-pressed
students to find what they need for their
classes. While faculty may be pleased to find
little or no published information on
their research topics, students become
frustrated and opt for the Internet because it
gives them the quantity they crave (Bodi
111). The preference for the Internet over the
library is not limited to inexperienced
researchers. One study found no real
difference in library usage among freshman,
sophomore, junior, and senior college
students (Van Scoyoc and Cason 51).
Although an OCLC study did find that 7 out
of 10 students use the library’s site for at
least some of their research, 43% of those
students who do not use the library’s site for
research do so because they think they can
find better information elsewhere (OCLC 6).
Millennial college students make heavy use
of the Web in their class projects and
research. A study that examined the
bibliographies of student papers found that
the number of citations for Web sites rose
from an average of 11.3 per bibliography (or
9% of the total number of references per
bibliography) in 1996 to 14.4 per
bibliography (or 13% of the total number of
references per bibliography) in 2001 (Davis
46). Web citations in student bibliographies
peaked in 2000, averaging 22% of references
per bibliography. The subsequent decline is
directly attributable to new restrictions
placed by professors regarding the type and
frequency of Web citations students were
permitted to use (Davis 47). Davis found
that faculty were not opposed to students
using Web sites in their research, but that
they now routinely apply restrictions on
what and how many Web sources students
may use in their papers (45). Most faculty
agree that the Internet is an excellent source
of information, but they are concerned that
students are not able to properly evaluate
the sources they have found (Herring 255).
These concerns over Web sites and resource
evaluation appear well founded. An OCLC
study found that while two-thirds of
students polled felt they could determine
what sites were best to use, 58% of students
believe that sites with advertisements are
just as reliable as sites without
advertisements (OCLC 4).
A number of authors have attempted to
determine the effectiveness of library
instruction. In an oft-cited study, Lois
Pausch and Mary Popp found that few
critical assessments of library instruction
exist in the literature. Most of what has been
published are informal surveys of students
that measure the students’ satisfaction with
a particular class (Pausch and Popp). Brettle
reviewed the research on information skills
training in the health sector during the time
period 1995-2002 and found that many of
the studies were poorly designed, executed,
and reported (6). Further, Brettle found that
many of the studies reviewed relied on
subjective measures to test the efficacy of
instruction rather than on validated
instruments using objective measures. In
2006 Koufogiannakis and Wiebe undertook
a review of the literature on teaching
information literacy skills. Their findings
report a lack of overall quality in the studies
reviewed. Many of the published studies
suffered from faulty reporting and failed to
use a validated instrument. Twenty percent
of the studies performed no statistical
analysis (Koufogiannakis and Wiebe 19).
A 2004 study conducted by Beile and Boote
attempted to critically assess the
effectiveness of library instruction using
pre- and post-tests. While they found a
statistically significant difference in test
scores, the population was small (49
students) and consisted solely of graduate
education majors (6). Most other studies,
however, show that library instruction has a
minimal impact on students’ information
seeking behavior. In her systematic review
of the literature, Brettle wrote, “the results
revealed very limited evidence to show that
training does improve skills” (7). After
reviewing the literature and performing a
meta-analysis, Koufogiannakis and Wiebe
reported only that library instruction was
better than no instruction (19). Andrew
Robinson and Karen Schlegl undertook a
bibliometric analysis of student research
papers. Their study found that library
instruction had little impact on the types of
sources cited. The students’ choice of
resources was most influenced by the
instructors’ directions to the students; when
the instructors enforced penalties related to
student grades, the students cited more
scholarly sources (Robinson and Schlegl
280). In 2002 Emmons and Martin studied
the effects of library instruction on an
English writing class; they found a
statistically significant increase in scholarly
journal citations following library
instruction (554). Their overall findings
indicated that library instruction “made a
small difference in the types of materials
students chose and how they found them”
(557-8).
Aim
The aim of this study was to examine the
information seeking skills of undergraduate
education majors at the University of
Central Florida (UCF). Specifically, this
study attempted to discern the types of
sources (academic vs. non-academic)
undergraduate education majors used to
find information for their research. The
study also sought to determine whether an
association existed between library
instruction sessions and the types of sources
used. The research was funded with a $1,000
grant sponsored by the UCF Quality
Enhancement Plan.
Sample
The University of Central Florida enrolled
almost 49,000 students at the start of the Fall
2007 semester. Of those students, 3,605 were
undergraduate education majors. The study
used a convenience sample of 200 currently
enrolled undergraduate education majors.
Participants volunteered after seeing
advertisements for the survey or after a
Curriculum Materials Center (CMC)
employee asked if they would like to take a
short, online survey. An incentive of $5 was
offered to all participating students. Those
who agreed to participate were shown how
to access the survey in the CMC. Once the
survey was confirmed as complete by the
principal investigator, participants each
received $5.
Instrumentation
The survey consisted of 14 questions
(Appendix A). The survey was administered
online using Survey Monkey questionnaire
software. The
survey asked questions about four areas of
information seeking behavior:
• the research habits of students
(questions 1, 10, and 11);
• the ease of using the library’s
resources, and how important
convenience is to the student in
selecting resources (questions 2, 3,
and 9);
• where students find most of their
research information (questions 4, 5,
and 8);
• evaluating sources (questions 6 and
7).
Table 1
Student Information Seeking Behavior

If you were researching a topic for a class project like a paper or presentation,
where would you find most of your information?
    Internet            72%, n=144
    Library Resources   27.5%, n=55
    Ask Friends         .5%, n=1
    Ask Experts         0
    Other               0

If you were researching a topic for a personal reason, where would you find most
of your information?
    Internet            88%, n=176
    Library Resources   2.5%, n=5
    Ask Friends         6%, n=12
    Ask Experts         3.5%, n=7
    Other               0
Table 2
Research Resources

Which of the following sources do you use most in your research?
    Internet                           65%, n=130
    Book                               8%, n=16
    Academic Journal                   16%, n=32
    Newspaper or Popular Magazine      0
    All Sources Equally                11%, n=22

Which of the following do you think is the most credible source?
    Internet                           2%, n=4
    Book                               20%, n=40
    Academic Journal                   59%, n=118
    Newspaper or Popular Magazine      1%, n=2
    All Sources are Equally Credible   18%, n=36
Additional demographic questions asked
participants about their class standing, the
number of hours per day spent on the
Internet, and the number of library
instruction sessions attended.
Results
All 200 surveys were deemed usable, and no
one from the original sample opted out of
the research. Table 1 shows that the Internet
was the predominant choice of almost three-
fourths of the respondents for class-related
research. Nearly 9 out of 10 used the
Internet for personal research.
Even though these students realized that
library resources were more credible than
Internet sources, they still chose to use
Internet sources instead of academic library
sources for both personal and class work.
The question remains as to why these
students would make that choice. Ease of
access may be an answer, as may the
students’ high comfort level with the
Internet. Table 3 addresses these ideas.
While almost 90% of respondents felt that
the library’s resources were not hard to use,
78% were still more comfortable using the
freely available Internet instead of the
library’s resources. Dishearteningly, 52% of
the respondents based their decisions more
on convenient access than on the authority
of the resource.
Effects of Library Instruction
Another important question is about the
effect of library instruction on the students’
choice of resources. A chi-square test of
association was conducted to determine
whether the proportion of undergraduate
education majors who used academic
sources in comparison to those who used
non-academic sources varied depending on
whether the students had taken at least one
library instruction session. The null
hypothesis (H0) states that the two
proportions are equal, while the alternative
hypothesis (H1) is that the proportions are
not equal.
Table 3
Resource Selection – Library or Internet

I think the library’s resources are hard to use.
    Very Much Disagree   43.5%, n=87
    Somewhat Disagree    45%, n=90
    Somewhat Agree       11%, n=22
    Very Much Agree      .5%, n=1

I am more comfortable using the Internet than the library’s resources.
    Very Much Disagree   2.5%, n=5
    Somewhat Disagree    20%, n=40
    Somewhat Agree       54%, n=107
    Very Much Agree      24%, n=48

I would use a source because it is convenient to use even though it is not the
best source on my topic.
    Very Much Disagree   17%, n=34
    Somewhat Disagree    31%, n=62
    Somewhat Agree       44%, n=88
    Very Much Agree      8%, n=16
The independent variable, library
instruction, was assessed with question 14:
“Not counting CMC tours, how many
library instruction sessions have you
attended?” Choices ranged from zero
sessions to five or more. All the responses of
zero (n=60) were grouped into the category
“No Library Instruction.” All the responses
from one session to five or more (n=140)
were grouped into the category “Library
Instruction.” The dependent variable,
“Types of Sources Used,” was assessed with
the question “Which of the following
sources do you use most in your research?”
All the responses for “Internet Sites” were
grouped into the category “Internet.” All the
responses for “Book and Academic Journal”
were used for the category “Academic
Sources.” The responses for “I Use All These
Sources Equally” were grouped together in
the category “All Equally.” No one selected
the response “Newspapers or Popular
Magazines.” Table 4 presents the chi-square
test of association. All observations were
independent of each other, and all cells had
at least five expected frequencies, so all
assumptions for the chi-square test of
association were met.
At an alpha level of .05, the type of
information resource used for research was
not statistically significantly related to
whether the student attended library
instruction sessions (Pearson χ2 (2, N = 200)
= 1.612, p = .447, Cramer’s V = .090).
Students who had attended a library
instruction session were proportionally just
as likely to use academic and non-academic
sources as those students who had not
attended a library instruction session.
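The reported result can be reproduced directly from the observed counts in Table 4. The following is a minimal sketch, not part of the original study, written in Python and assuming SciPy is available:

```python
from scipy.stats import chi2_contingency

# Observed counts from Table 4.
# Rows: Library Instruction / No Library Instruction
# Columns: Academic, All Equally, Internet
observed = [
    [33, 13, 94],   # Library Instruction (n = 140)
    [15,  9, 36],   # No Library Instruction (n = 60)
]

chi2, p, dof, expected = chi2_contingency(observed, correction=False)

print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p:.3f}")
# The smallest expected frequency matches the value in the table footnote
# (6.6), satisfying the rule of thumb that all expected counts be at least 5.
print("minimum expected count:", expected.min())
```

Running this reproduces the reported statistics: χ2 = 1.612 on 2 degrees of freedom, p = .447.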
The measure known as ‘effect size’ evaluates
the strength of the association being tested
(Morgan, Reichert, and Harrison 15). It may
be seen as the practical significance of a test
result. In this study the Cramer’s V value of
.090 indicates a small effect size. Tables 5
and 6 illustrate the findings. Since the effect
size is small, it can be thought of as having
less practical significance to the field of
information literacy and library instruction.

Table 4
Chi-square Test of Association

                                          Types of Resources Used
Amount of Library Instruction             Academic   All Equally   Internet   Total
Library Instruction      Count                33          13           94      140
                         Expected Count       33.6        15.4         91      140
No Library Instruction   Count                15           9           36       60
                         Expected Count       14.4         6.6         39       60
Total                    Count                48          22          130      200
                         Expected Count       48          22          130      200
Effect size considered with sample size
determines power, which is the probability
that a test will reject a false null hypothesis.
A test with a small effect size might not
generate enough power to detect statistical
significance (Morgan, Reichert, and
Harrison 16).
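Both quantities can be computed from the statistics reported above. As an illustrative sketch (again in Python with SciPy, and with the caveat that using the observed χ2 as the noncentrality parameter yields an "observed power" estimate, which treats the sample effect as the true effect):

```python
import math
from scipy.stats import chi2, ncx2

chi2_stat, n, df = 1.612, 200, 2
k_min = 2  # smaller of (rows, columns) in the 2 x 3 table

# Cramer's V = sqrt(chi2 / (N * (k_min - 1)))
v = math.sqrt(chi2_stat / (n * (k_min - 1)))
print(f"Cramer's V = {v:.3f}")  # ~ .090, a small effect

# Post-hoc power: probability that a noncentral chi-square variable
# (noncentrality = observed chi2) exceeds the .05 critical value.
critical = chi2.ppf(0.95, df)
power = 1 - ncx2.cdf(critical, df, chi2_stat)
print(f"estimated power = {power:.2f}")  # well below the conventional .80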
Discussion
Although this study is limited in that
without a true random sample and larger
sample size the results cannot be
generalized to the entire population, the
results are nonetheless disappointing, if not
surprising, for those interested in library
instruction. The fact that students surveyed
here performed most of their research,
whether for class or personal reasons, on the
freely available Web is supported by the
findings of Griffiths and Brophy, Davis, and
the Pew Internet report on the use of the
library and the Internet by college students
(Jones). Moreover, 79% of students surveyed
stated that academic sources (e.g., books
and journals) are more credible than the
Internet, yet they still rely heavily on
Internet sources for their research. Griffiths
and Brophy concluded that students have
difficulty using library resources, so they
turn to the Internet with which they are
much more comfortable. This study found
that while students did not find the library
difficult to use, they were more comfortable
using the freely available Internet.
Some might argue that with so many library
resources being online, the distinction
between the Internet and library resources is
blurred, and that students may have
difficulty differentiating between the two.
This may have been the case in this study,
since definitions of “Internet resources” and
“library resources” were not provided in the
survey. However, personal experience at the
library’s reference desk and in library
instruction sessions suggests that students
do make the distinction between resources
on the library’s site and those available
freely through a search engine such as
Google.

Table 5
Chi-square Test Results

                        Value     df    Asymp. Sig. (2-sided)
Pearson Chi-Square      1.612a     2    .447
Likelihood Ratio        1.548      2    .461
N of Valid Cases        200
a. 0 cells (.0%) have expected count less than 5. The minimum expected count is 6.60.

Table 6
Symmetric Measures

                                     Value    Approx. Sig.
Nominal by Nominal   Phi             .090     .447
                     Cramer’s V      .090     .447
N of Valid Cases                     200

Additionally, while Google Scholar
further erodes the separation of academic
sources and the freely available Web,
personal experience again suggests that
undergraduate students are not using
Google Scholar. Again, this may be due to
students being unaware not only of the
differences between academic and non-
academic sources, but also the
appropriateness of using those sources.
This study found no association between
library instruction and the types of sources
used by students. This is supported by the
findings of Emmons and Martin and
Robinson and Schlegl. Furthermore, the
studies by Davis and Robinson and Schlegl
found that instructor guidelines played a
more significant role in student citations
than did library instruction. This raises a
crucial question as to how much students
are learning about research from simply
following the rules written in their class
syllabi. If students are not citing Internet
sources simply because they are told to use
more academic sources, it is possible that
they will revert to using the Internet when
they are not specifically instructed to do so,
and they would not have gained a deeper
understanding of the critical importance of
using academic sources. This is important,
since almost 90% of the students in this
study said they use the Internet as a primary
tool for personal research.
However, Beile and Boote found that the
greatest increases in post-test scores
occurred among students who had previous
library instruction (6). A 2006 bibliometric
study conducted by Wang found a
statistically significant difference in the
citations of students who had taken a for-
credit library course as compared to
citations listed by those students who had
not taken the course. Those who had taken
the course cited more scholarly sources
(Wang 85). Further, Wang reviewed the
guidelines set forth by the professors and
found that none of them specified an
academic penalty for having too many non-
academic citations (Wang 87). This suggests
that for-credit library classes or multiple
library instruction sessions may prove more
effective in changing students’ information
seeking behavior than the traditional “one-
shot” library instruction class.
These studies could have an important
impact on how academic libraries approach
library instruction. Libraries have long used
the “one-shot” library instruction session
where a professor brings his/her class to the
library for a session on how to use the
library. While this approach does have some
value as an introduction to the library for
new students, perhaps it is time for libraries
to seriously consider alternative practices.
Academic libraries might be better served to
invest their limited resources in for-credit
library classes, mandatory multiple library
instruction sessions, or in integrating
librarians into the class curriculum. These
changes in practice will not be easy. Not
only would these approaches require more
time and effort, but the devaluing or
possible eradication of one-shot library
instruction classes strikes at a core belief of
academic librarians.
While the vast majority of library instruction
at the University of Central Florida Libraries
consists of one-shot classes aimed mainly at
freshman composition students, the library
has made efforts to enhance the instruction
program. In conjunction with Course
Development and Web Services (CDWS),
the library has created online information
literacy modules that can be used by the
teaching faculty in their online or face-to-
face classes. These
modules focus on different areas of research,
and more are forthcoming. They include
content, practice, and assessment, so an
instructor can see how well students
understand the information. The UCF
Libraries also offer “embedded librarians”
as an integral part of online classes. They
answer questions, create tutorials, and work
with the instructors on creating proper
research assignments. During the seven
academic years in which this service has
been offered, UCF Librarians have been
embedded in 187 classes reaching almost
5,600 students. Although no formal
assessment of library skills has been made of
students in classes with embedded
librarians, further investigation is planned.
Conclusion
This study found no association between
library instruction and the use of traditional
academic library resources in student
research. Academic libraries are currently
investing staff and time in order to teach
information literacy, and yet the truly
important question of how to effectively
change students’ perception of research
methodology remains unanswered. Are
information literacy and its generic offshoot,
library instruction, truly effective?
the solution lies outside the library in the
types of assignments students are given and
how they are graded. Do libraries need to
rethink and redesign how they organize and
allow access to information? Or have we
truly entered a new age of research where
quantity, easy access, and keyword
searching are more important than
controlled vocabulary and peer-review?
This study makes no claim to answer these
questions (no one study alone can), but
it is important that as the library profession
moves forward it develop a research agenda
and theoretical foundation which will
eventually answer these questions.
Works Cited
Abram, Stephen, and Judy Luther. "Born
with the Chip." Library Journal 129.8 (1
May 2004): 34-7.
Beile, Penny M., and David N. Boote. "Does
the Medium Matter?: A Comparison of
a Web-Based Tutorial with Face-to-
Face Library Instruction on Education
Students' Self-Efficacy Levels and
Learning Outcomes." Research
Strategies 20.1/2 (Mar. 2004): 57-68.
Bodi, Sonia. "How Do We Bridge the Gap
Between What We Teach and What
They Do? Some Thoughts on the Place
of Questions in the Process of
Research." Journal of Academic
Librarianship 28.3 (May 2002): 109-14.
Brettle, Alison. “Information Skills Training:
A Systematic Review of the Literature.”
Health Information and Libraries
Journal 20.Supp.1 (Jun. 2003): 3-9.
Davis, Philip M. "Effect of the Web on
Undergraduate Citation Behavior:
Guiding Student Scholarship in a
Networked Age." portal: Libraries &
the Academy 3.1 (Jan. 2003): 41-51.
Emmons, Mark, and Wanda Martin.
"Engaging Conversation: Evaluating
the Contribution of Library Instruction
to the Quality of Student Research."
College & Research Libraries 63.6 (Nov.
2002): 545-60.
1 Nov. 2008.
Griffiths, Jillian R., and Peter Brophy.
"Student Searching Behavior and the
Web: Use of Academic Resources and
Google." Library Trends 53.4 (Spring
2005): 539-54.
Gross, Melissa. "The Impact of Low-Level
Skills on Information-Seeking
Behavior." Reference & User Services
Quarterly 45.2 (Winter 2005): 155-62.
Herring, Susan Davis. "Faculty Acceptance
of the World Wide Web for Student
Research." College & Research
Libraries 62.3 (2001): 251-8. 2 Nov. 2008.
Jones, Steve. The Internet Goes to College:
How Students are Living in the Future
with Today's Technology. Pew Internet
and American Life Project. 15 Sept.
2002. 2 Nov. 2008.
Koufogiannakis, Denise, and Natasha Wiebe.
“Effective Methods for Teaching
Information Literacy Skills to
Undergraduate Students: A Systematic
Review and Meta-Analysis.” Evidence
Based Library and Information Practice
1.3 (2006): 3-43. 2 Nov. 2008.
Liao, Yan, Mary Finn, and Jun Lu.
"Information-Seeking Behavior of
International Graduate Students vs.
American Graduate Students: A User
Study at Virginia Tech 2005." College &
Research Libraries 68.1 (Jan. 2007): 5-
25. 2 Nov. 2008.
Long, Casey M., and Milind M. Shrikhande.
"Improving Information-Seeking
Behavior among Business Majors."
Research Strategies 20.4 (2005): 357-69.
Morgan, Susan E., Thomas Reichert, and
Tyler R. Harrison. From Numbers to
Words: Reporting Statistical Results
for the Social Sciences. Boston, MA:
Allyn & Bacon, 2002.
O'Brien, Heather L., and Sonya Symons.
"The Information Behaviors and
Preferences of Undergraduate
Students." Research Strategies 20.4
(2005): 409-23.
OCLC. “How Academic Librarians Can
Influence Students' Web-Based
Information Choices.” OCLC White
Paper on The Information Habits of
College Students. June 2002. 2 Nov.
2008.
Pausch, Lois M., and Mary Pagliero Popp.
“Assessment of Information Literacy:
Lessons from the Higher Education
Assessment Movement.” ACRL 1997
National Conference Papers. Chicago:
Association of College and Research
Libraries, 1997. 2 Nov. 2008.
Robinson, Andrew M., and Karen Schlegl.
"Student Bibliographies Improve when
Professors Provide Enforceable
Guidelines for Citations." portal:
Libraries & the Academy 4.2 (April
2004): 275-90.
Sunstein, Cass R. "A Brave New
Wikiworld." The Washington Post
[Washington, DC] 24 Feb. 2007: A19.
Todd, Ross J. "Adolescents of the
Information Age: Patterns of
Information Seeking and Use, and
Implications for Information
Professionals." School Libraries
Worldwide 9.2 (July 2003): 27-46.
Van Scoyoc, Anna M., and Caroline Cason.
"The Electronic Academic Library:
Undergraduate Research Behavior in a
Library Without Books." portal:
Libraries & the Academy 6.1 (Jan.
2006): 47-58.
Wang, Rui. “The Lasting Impact of a
Library Credit Course.” portal:
Libraries & the Academy 6.1 (Jan.
2006): 79-92.
Weiler, Angela. "Information-Seeking
Behavior in Generation Y Students:
Motivation, Critical Thinking, and
Learning Theory." Journal of Academic
Librarianship 31.1 (Jan. 2005): 46-53.
Appendix A
1.) I enjoy researching.
    1 Very Much Disagree   2 Somewhat Disagree   3 Neither Agree Nor Disagree
    4 Somewhat Agree   5 Very Much Agree

2.) I think the library’s resources are hard to use.
    1 Very Much Disagree   2 Somewhat Disagree   3 Neither Agree Nor Disagree
    4 Somewhat Agree   5 Very Much Agree

3.) I am more comfortable using the Internet than the library’s resources.
    1 Very Much Disagree   2 Somewhat Disagree   3 Neither Agree Nor Disagree
    4 Somewhat Agree   5 Very Much Agree
4.) If you were researching a topic for a CLASS project such as a paper or presentation,
where would you find MOST of your information? (Check only one.)
    On the Internet   Using Library Resources   Asking Friends   Asking Experts   Other

5.) If you were researching a topic for a PERSONAL reason, where would you find MOST of
your information? (Check only one.)
    On the Internet   Using Library Resources   Asking Friends   Asking Experts   Other

6.) Which criteria do you use to evaluate Web sites? (Check all that apply.)
    Accuracy   Authority   Objectivity   Currency   I do not evaluate Web sites

7.) Which of the following do you think is the most credible source?
    Internet Site   Book   Academic Journal   Newspaper or Popular Magazine
    All are Equally Credible

8.) Which of the following sources do you use most in your research? (Check only one.)
    Internet Sites   Books   Academic Journals   Newspapers or Popular Magazines
    I Use all these Sources Equally
9.) I would use a source because it is convenient to use even though it is not the best
source on my topic.
    1 Very Much Disagree   2 Somewhat Disagree   3 Neither Agree Nor Disagree
    4 Somewhat Agree   5 Very Much Agree

10.) I perform a good deal of research for my classes.
    1 Very Much Disagree   2 Somewhat Disagree   3 Neither Agree Nor Disagree
    4 Somewhat Agree   5 Very Much Agree

11.) I am required to write research papers for my classes.
    1 Very Much Disagree   2 Somewhat Disagree   3 Neither Agree Nor Disagree
    4 Somewhat Agree   5 Very Much Agree
12.) What is your UCF student status?
Freshman Sophomore Junior Senior
13.) How many hours a day do you spend on the Internet on average?
0 1-3 4-6 7 or More
14.) Not counting CMC Tours, how many library instruction sessions have you attended while a
student at UCF?
0 1 2 3 4 5 or More