
Gender and Information Literacy: 
Evaluation of Gender Differences 
in a Student Survey of Information 
Sources 

Arthur Taylor and Heather A. Dalal

Arthur Taylor is Associate Professor in the Department of Information Systems and Supply Chain Man-
agement, College of Business Administration, and Heather A. Dalal is Assistant Professor in the Moore 
Library at Rider University; e-mail: ataylor@rider.edu, hdalal@rider.edu. ©2017 Arthur Taylor and Heather 
A. Dalal, Attribution-NonCommercial (http://creativecommons.org/licenses/by-nc/3.0/) CC BY-NC.

Information literacy studies have shown that college students use a va-
riety of information sources to perform research and commonly choose 
Internet sources over traditional library sources. Studies have also shown 
that students do not appear to understand the information quality issues 
concerning Internet information sources and may lack the information 
literacy skills to make good choices concerning the use of these sources. 
No studies currently provide clear guidance on how gender might influ-
ence the information literacy skills of students. Such guidance could help 
improve information literacy instruction. 

This study used a survey of college-aged students to evaluate a subset 
of student information literacy skills in relation to Internet information 
sources. Analysis of the data collected provided strong indications of 
gender differences in information literacy skills. Female respondents ap-
peared to be more discerning than males in evaluating Internet sources. 
Males appeared to be more confident in the credibility and accuracy 
of the results returned by search engines. Evaluation of other survey 
responses strengthened our finding of gender differentiation in informa-
tion literacy skills. 

College students today have come of age surrounded by a sea of information 
delivered from an array of sources available at their fingertips at any time 
of the day or night. Studies have shown that the most common source of 
information for college-aged students is the Internet. Information gathered 
in this environment is most likely found using a commercial search engine that returns 
sources of dubious quality using an unknown algorithm. The information in this en-
vironment is often fragmented, limited in depth and breadth, disorganized, biased, 
of unknown authorship, and, in some cases, not credible. It is unclear whether 
students discern these deficiencies in the information they encounter. 

doi:10.5860/crl.78.1.90




Adding to these difficulties is the self-serve nature of the Internet as an information 
source. Searches in a library environment (library databases) are conducted using 
sources that have been evaluated by information professionals. Many college students 
have had little or no information literacy instruction from a librarian and operate on 
fallacious assumptions concerning the quality of Internet information sources. A student 
using a self-serve approach to information resource gathering may use a commercial 
search engine and presume that all sources returned by that search engine are credible, 
though the search engine applied no such filter to the results.

Previous information literacy studies have examined the information search behavior 
of subjects across a limited set of information literacy skills. Not all variables affecting 
the review and selection of Internet information sources have been studied in detail. 
Specifically, it is unclear whether or not male and female students have the same level 
of information literacy skills. Some studies have suggested there may be gender-specific 
differences in media consumption and in the way each gender searches for and evalu-
ates information sources. Determining whether or not these gender-specific differences 
exist in college students is important. A clear understanding of gender-specific influ-
ences could provide meaningful guidance to librarians and educators who provide 
information literacy training for students.

The research reported here evaluated student perceptions of Internet information 
sources, specifically their evaluation of various properties or criteria of those informa-
tion sources. The study used the evaluation criteria recommended in the Association 
of College and Research Libraries Information Literacy Competency Standard Perfor-
mance Indicator 3.2a concerning “reliability, validity, accuracy, authority, timeliness, 
and point of view or bias” of sources.1 

The terms Internet and web are commonly used interchangeably in American Eng-
lish. Technically, they are different as the Internet refers to a large world-wide TCP/
IP protocol network and the web is short for World Wide Web, a set of technologies, 
protocols, and standards that operate on a TCP/IP network using web browsers and 
web servers. Common English sheds this technical distinction and simply refers to the 
“Internet” and the “web” as the virtual place where information and applications are 
accessed using a web browser. When we use the term “Internet information sources” 
and the web in this paper, we are referring to the use of commercial search engines and 
the World Wide Web to access information for research. This is distinct from library 
databases, some of which may be available on the web. 

Review of Literature 
Information scientists and librarians have long been curious about student evaluation 
of Internet information sources. Research has shown students prefer convenience, tend 
to focus their research on the Internet, and will commonly use Google to search the 
Internet for information before using other sources.2 Some college-aged searchers dem-
onstrate a high level of confidence in the Internet as an information source and report 
rarely finding incorrect information, thus perceiving no need to evaluate information 
sources.3 Students use the library and the Internet at different stages in their research 
process for coursework, typically using Wikipedia and Internet search engines first.4 

Good information literacy skills require the consistent use of appropriate evalua-
tion criteria. In a traditional library environment, the collection is curated and some 
evaluation criteria are applied in developing the collection. The Internet represents a 
very different environment where the “collection” is quite extensive but arguably of 
dubious quality. While this condition increases the need for application of evaluation 
criteria in acquisition of information sources from the Internet, constant exposure to 
this information environment may have created a certain complacency among those 
using it as an information source. Results from studies of students’ evaluation of In-
ternet information sources reveal an inconsistent use of criteria in evaluating Internet 
information sources. The Project Information Literacy study surveyed students on 
their frequency of using traditional evaluation criteria (currency, author’s credentials, 
and different viewpoints expressed) and found that most students claim to evaluate 
Internet sources.5 Other studies, however, indicate that students are familiar with com-
mon evaluative criteria but, in practice, use these criteria far less than they claim.6 Some 
studies determined that, when evaluating quality or suitability of sources, students 
use just one or two pages or skim the content to make their decision.7 Students will 
also state that identifying the author of a source is important but do not consider the 
author’s qualifications in selecting information sources.8

Lorenzen found that students had difficulty making credibility judgments on infor-
mation sources without help from their instructors.9 Tillotson reported that students 
were aware of evaluation criteria for information sources, but their knowledge of the 
details of the criteria was vague and shallow.10 In some studies, students appeared to 
know they should not trust all websites, but they proceeded to make their credibility 
judgments based on questionable criteria such as the professional look of a page, level 
of detail in content, or lack of commercial purpose.11 

Students performing research must find sources relevant to their research topic. It 
is therefore logical that research has shown that the criterion of “relevance” is more 
important than other criteria for many students.12 However, just spotting the keywords 
in the title or lead paragraphs may be the method used to determine relevance,13 further 
suggesting that students are not carefully evaluating Internet sources. Additionally, 
evaluating the “quality” of an information source is a critical information literacy skill. 
However, many college-aged researchers appear to be uninterested in evaluating the 
information quality and other characteristics of an information source.14

Evaluation of the credibility of a source requires subject area knowledge and context 
that many student researchers may lack. Research has provided some indication that 
students may not evaluate credibility using the best techniques. Studies have shown 
that many students evaluate sources based on social confirmation of credibility and use 
online reviews or forums that are written by an individual or group they perceive to be 
an enthusiast on a given topic.15 Instructors and librarians may presume students will 
ask them for help in evaluating sources, but student researchers appear to seek help 
elsewhere. One study determined that 61 percent of the respondents in their sample 
ask for help from friends in evaluating sources, 49 percent from instructors, and only 
11 percent from librarians.16 

Demographic Differences in Information Searching
Studies have provided results that indicate males and females use the Internet differ-
ently. One study found females were more likely to use the Internet for entertainment, 
while men were more likely to use it for news and business.17 Another study, however, 
found that females viewed academic websites more often, while men viewed 
entertainment websites more regularly and preferred sites with video and sound.18 
Studies have also reported that females tend to use free resources less often 
and use licensed library resources more, while male students are more likely to use 
nontraditional online sources such as Wikipedia.19 Another study reported that women 
used books and journals more than men, and men used newspapers and magazines 
more than women.20 

Some studies have also found gender differences in the perceptions of sources and 
confidence in their use. Lim and Kwon noted that males have higher perceptions of 
Wikipedia’s information quality and their own ability to evaluate Wikipedia entries.21 




Additionally, the authors examined the risk perception of subjects and reported that 
men perceive using Wikipedia as much less of a risk than their female classmates. 

Other studies report that men are more satisfied with their results, while women 
are more patient and careful but report having greater uneasiness and anxiety at the 
beginning of their information research process.22 Studies have also reported that 
females appear to take a safe approach to research, while males appear to be more 
confident in their ability to search for information on the Internet.23 

Though females may evaluate their online skills lower than males do, studies report 
no actual difference between the genders’ online abilities.24 In Burdick’s research, high 
school females spent more time investigating and formulating topics and expressed 
more emotions, while males emphasized the collection of information, focused on 
gathering and completing faster, were less likely to ask for help, and were more con-
fident in their search effort.25 

Neely’s study found women rated evaluating sources significantly more important 
than men.26 Kwon and Song studied both personality traits and gender in relationship to 
information literacy competencies and, in contrast to other studies, found that females 
reported a self-perception of higher competencies in the evaluation of information 
sources.27 Their research also found the personality trait of openness was strongly tied 
to females and to the skill of evaluating sources. 

Justification for Research
Previous information literacy and information search process research has reported 
on a broad set of search behaviors for college-aged students. Most of these studies 
have not analyzed detailed criteria used by student searchers to select sources and 
have not differentiated results based on gender. Understanding specific characteristics 
and identifying detailed search behaviors can help librarians and instructors better 
prepare students to perform research. More specifically, it can help teach students 
to understand and evaluate the quality and usefulness of information available on 
the Internet. 

It may be a revelation to some that gender differences in information literacy exist. 
Identifying gender-specific characteristics of student information searchers will 
provide further clarity on how students search for information. The specifics of these 
gender differences can provide guidance to instructors in the preparation of gender-
aware information literacy instruction. 

Methods 
The study reported here used a survey to gather responses from subjects on how 
they evaluated information sources. The survey questions were based on the ACRL 
Information Literacy Competency Standards for Higher Education published in 2000. 
Questions were based primarily on Standard 3, which addresses how an information-
literate student performs critical evaluation of information sources.28 Several survey 
questions were used to gather background information about the subject including 
their college major and gender. The remaining questions asked subjects about their 
use of information sources and their evaluation of the reliability, validity, accuracy, 
authority, timeliness, and point of view or bias of Internet information sources. Most 
questions were specifically about Internet information sources, though a few questions 
asked about the use of library databases. 

The survey questions were developed by the authors and then reviewed by library 
faculty. A pilot survey was performed using student workers in the library as subjects. 
Based on feedback from library faculty and the results of the pilot survey, the questions 
were refined to produce the final survey instrument. 




The survey was delivered in an online, web-based format, and results were stored 
anonymously in a database for later analysis (see figure 1). There were two groups of 
survey subjects: one group included students whose instructors had agreed to partici-
pate in the survey; the other group was recruited through an e-mail campaign targeted 
to students in the School of Education and the School of Business. 

The courses where the first group of participants was recruited included multiple 
sections of Composition, Management, Marketing, and Organizational Leadership; 
many of these are taken by all students to fulfill either their major or general education 
requirements. Subjects in this group entered the computer lab for information literacy 
instruction and were presented with the online survey and an informed consent form. 
This form indicated how the survey would be used and that their participation was 
voluntary and would not affect their grade. The librarian conducting the training was 
available to student participants for questions if needed. Subjects did not report any 
difficulty or confusion with the survey questions.

The second group of survey takers was recruited through an e-mail to students in 
two colleges at the institution where the survey was conducted. These subjects took the 
survey in a remote location (not in a computer lab with faculty) at their convenience. 
Most survey participants were from the first group. 

The survey contained a total of 27 questions. Subjects were directed to answer 
questions about how they searched for information for their college courses (college 
research) and how they evaluated and selected information sources. Subjects were not 
monitored while taking the survey and were allowed to proceed at their own pace. 
Subjects were not compensated for taking the survey and were informed they could 
stop at any time. Subjects could optionally enter a drawing for a $50 gift 
certificate. Most subjects completed the survey in five to ten minutes. The questions 
relevant to this study are listed in appendix A. 

FIGURE 1
Web-based Survey 




Results
A total of 386 college-aged students from an American university responded to the 
survey. Most survey takers completed the 27 questions in 5 to 10 minutes. The data 
collected were anonymous. In this survey sample, 129 respondents were male and 257 
were female. 

Since there were more female survey respondents than male survey respondents, 
there is bias in the sample toward female responses, so examining raw counts of survey 
responses would not be useful. Instead, in-group percentages (percentages of responses 
to a question within the group of respondents of one gender) were used. 

A chi-square test of independence was used to examine the association between gender 
and survey responses. The test was run on the raw survey counts, and its result is 
reported where it demonstrated a statistically significant association.
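To make the procedure concrete, here is a minimal sketch of such a test in Python. The counts below are hypothetical and are not the study's data; only the method (a chi-square test of independence run on raw counts) follows the description above:

    from scipy.stats import chi2_contingency

    # Hypothetical raw counts for a five-option, single-select question:
    # one row per response option, columns = (female, male).
    observed = [
        [6, 13],   # "Never"
        [58, 28],  # "Infrequently"
        [64, 29],  # "Sometimes"
        [49, 27],  # "Usually"
        [76, 24],  # "Almost always"
    ]

    # Chi-square test of independence between gender and response choice.
    # df = (rows - 1) * (columns - 1) = 4 for a five-option question.
    chi2, p, dof, expected = chi2_contingency(observed)
    print(f"chi-squared = {chi2:.4f}, df = {dof}, P = {p:.5f}")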

Note that for all survey questions presented in the tables that follow, the percent 
column is calculated as the number of respondents of a gender who selected the item 
divided by the total number of respondents of a gender who answered the question. 
Using this approach for single-select questions, the sum of the percentages for a ques-
tion response will be equal to or close to 100 percent, with some deviations due to 
rounding errors. Using this same approach for multiple-select questions, the sum of 
the percentages for a question response will not total 100 percent. Questions ending 
in the statement “check all that apply” indicate a multiple-select question where a 
respondent may have selected more than one item. 
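As a concrete illustration of this calculation, the following short Python sketch (again with hypothetical counts, not the study's data) computes in-group percentages for a single-select and a multiple-select question:

    # Hypothetical counts for one gender group.
    respondents = 257  # total respondents of that gender who answered the question

    # Single-select question: each respondent picks one option,
    # so the in-group percentages sum to roughly 100 (rounding aside).
    single_select = {"several times a day": 227, "about once a day": 29, "about once a week": 1}

    # Multiple-select ("check all that apply"): each option's count is divided
    # by the same respondent total, so the percentages need not sum to 100.
    multiple_select = {"Google": 215, "Yahoo!": 27, "General library databases": 188}

    for counts in (single_select, multiple_select):
        for option, n in counts.items():
            print(f"{option}: {100 * n / respondents:.2f}%")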

Analysis of Survey Responses
Most respondents of either gender listed English as their first language (91% of females 
and 88% of males). Respondents were relatively evenly distributed over years of 
matriculation: between 15 and 22 percent in each year (freshman, sophomore, junior, 
senior, and graduate). 

Most respondents were business majors (38%), with the remainder being a mix of 
liberal arts, education, and communication majors. Most subjects were female (66%), 
and most subjects indicated they had a GPA of greater than 3.0 (74%). 

Most male respondents in this sample were business majors. Most female respon-
dents were education majors, but female respondents were more likely to have other 
majors as well (see table 1). Females in the sample were more likely to have a GPA 
above 3.5, with 49 percent of female respondents reporting GPAs above 3.5 as opposed 
to 37 percent of male respondents.

TABLE 1
Question: If you have declared a major or completed work for a college undergraduate degree, what is or was the area of study for your declared undergraduate major? (If you have multiple declared majors or multiple degrees, check all that apply.)

Female Male Difference Response
2.3346 .7752 1.5594 I have not declared a major
32.2957 67.4419 –35.1462 business
19.0661 4.6512 14.4149 education
14.786 10.0775 4.7085 sciences
7.393 3.876 3.517 communication
4.2802 3.1008 1.1794 arts
1.1673 2.3256 –1.1583 music
1.9455 2.3256 –.3801 math
1.9455 .7752 1.1703 foreign language
8.1712 2.3256 5.8456 English language
2.3346 .7752 1.5594 journalism
4.2802 1.5504 2.7298 theatre
.7782 0 .7782 engineering
17.8988 11.6279 6.2709 my major is not listed

When subjects were asked how often they used the web to find information, it was 
clear that most use the web constantly as an information source, with the majority in 
both genders reporting they used the web several times a day to search for information 
(see table 2). However, males in this sample were more likely to use the web on a daily 
basis, with 94 percent of the male sample using the web several times a day versus 88 
percent of the female sample. 

TABLE 2
Question: How often do you use the web to find information?

Female Male Difference Response
88.35 93.65 –5.3 several times a day
11.24 6.35 4.89 about once a day
.40 0 .40 about once a week

In response to a question concerning the selection of pages returned by a search 
engine, the females in this sample appeared to be more discerning. They were more 
likely than male respondents to consider their ability to understand a site, their ability 
to verify the information on the site, and the credentials of the author. The females 
were also more likely to evaluate the currency of a site and the quality of the writing 
on the site (see table 3). 

TABLE 3
Question: When a search engine returns a list of pages, I select a page from the list based on the following criteria. (Check all that apply.)

Female Male Difference Response
60.7004 55.814 4.8864 the format of the site
3.1128 8.5271 –5.4143 whether or not the site has video
12.4514 13.1783 –.7269 the site has pictures
56.8093 51.938 4.8713 the site has a lot of information
77.0428 63.5659 13.4769 the site is understandable
22.5681 23.2558 –.6877 I am familiar with the author of the site
66.9261 58.1395 8.7866 I can verify the information on the site
61.8677 41.8605 20.0072 the credentials (qualifications) of the author are good
76.6537 63.5659 13.0878 the site is current
68.4825 48.062 20.4205 the quality of the writing on the site is good
22.9572 21.7054 1.2518 there are reviews and comments on the site




When asked about using information sources beyond commercial search engines, 
females appeared to be more consistent in using other sources (see table 4). Males 
were much more likely to respond that they never used resources beyond Google or 
Ask.com (10.74% of males versus 2.44% of females). Alternatively, females were more 
likely to respond that they almost always use information resources beyond Google 
or Ask.com (29.83% of females versus 19.83% of males). The results for this question 
are statistically significant (χ-squared = 12.299, df = 4, P = .01526). 

Similarly, when asked when they know they have enough sources for a paper, males 
were more likely to report that they do not concern themselves with the number of 
sources, with 13.22 percent of male respondents reporting they “don’t worry about 
the number of sources for a paper” versus only 5.13 percent of females who chose 
that response (see table 5). There was an even more striking difference in the choice 
of the response “I try to find enough quality sources to support the information in my 
paper,” with 83.76 percent of female respondents choosing that option versus 64.46 
percent of male respondents. These results are statistically significant (χ-squared = 
19.4695, df = 4, P < .001). 

TABLE 4
Question: Consider the papers that you have written in the last year. How often have you used research tools beyond Google or Ask.com to do your research?

Female Male Difference Response
2.44 10.74 –8.3 Never
22.69 23.14 –0.45 Infrequently—less than 25 percent of the time.
25.21 23.97 1.24 Sometimes—about 50 percent of the time.
19.33 22.31 –2.98 Usually—about 75 percent of the time.
29.83 19.83 10 Almost always

TABLE 5
Question: How would you know when you have enough sources for a paper?

Female Male Difference Response
5.13 13.22 –8.09 I don’t worry about the number of sources for a paper.
8.12 15.70 –7.58 5 sources is enough for any paper.
2.99 4.96 –1.97 10 sources is enough for any paper.
0 1.65 –1.65 20 sources is enough for any paper.
83.76 64.46 19.30 I try to find enough quality sources to support the information in my paper.

In response to a question about the number of search engine result pages reviewed, 
males were more likely to review fewer pages than females, and females were more 
likely to review more than 5 pages of search engine results (see table 6). 

TABLE 6
Question: Consider the papers that you have written in the last year. How many pages of search engine (Google, Yahoo!, Bing) results do you usually view?

Female Male Difference Response
15.38 16.53 –1.15 1 page
32.48 25.62 6.86 2 pages
23.50 31.40 –7.9 more than 2 pages
28.63 26.45 2.18 more than 5 pages

When asked which search engines or databases are used for research, gender dif-
ferences in responses indicated stronger information literacy skills among female 
respondents. Males were more likely to indicate the use of Google or Yahoo! search 
engines for research, with 15 percent of males indicating they use Yahoo! and 87 percent 
indicating they use Google, versus 11 percent choosing Yahoo! and 84 percent choosing 
Google for female respondents. Likewise, females were more likely to select a library 
database as a search resource (see table 7). These results approached statistical 
significance (χ-squared = 14.7654, df = 8, P = .06387). 

TABLE 7
Question: Which search engines or library databases do you use for your research? Check all that apply.

Female Male Difference Response
10.5058 14.7287 –4.2229 Yahoo!
9.3385 9.3023 .0362 Bing
83.6576 87.5969 –3.9393 Google
4.2802 4.6512 –0.371 Other search engine such as: Blekko/Lycos/AOL
6.2257 5.4264 .7993 Metasearch engine (one that cross-searches other search engines) such as Ixquick/Dogpile/Clusty/Webcrawler/Mywebsearch
73.1518 58.1395 15.0123 General library databases such as Academic Search Premier/CQ Researcher/Credo Reference/LexisNexis/JSTOR/Omnifile/Gale Virtual Reference/Ebscohost
43.1907 33.3333 9.8574 Subject-specific library databases such as: ABI-Infom, ArtStor, Business Source Premier, Communication
34.6304 19.3798 15.2506 Library OneSearch (cross-searches the library catalog and most of our databases)
28.0156 13.9535 14.0621 Library Book Catalog

Female respondents also appeared to be less certain and perhaps more discerning 
when evaluating web sources. Male respondents indicated confidence in determining 
the author of a web source, with 59.50 percent of male respondents indicating it was 
possible to determine the author of a web page versus 52.16 percent of female respon-
dents (see table 8). A similar distinction can be seen with the responses concerning 
the qualifications of the author of a web page, with 32.23 percent of male respondents 
indicating they could determine the qualifications of the author of a web page versus 
28.33 percent of female respondents (see table 9).

TABLE 8
Question: It is usually possible to determine the author of a web page.

Female Male Difference Response
52.16 59.50 –7.34 TRUE
47.84 40.50 7.34 FALSE




TABLE 9
Question: It is usually possible to determine the qualifications of the author of a web page.

Female Male Difference Response
28.33 32.23 –3.9 TRUE
66.09 59.50 6.59 FALSE
5.58 8.26 –2.68 I do not understand what is meant by qualifications in this question

A notable distinction exists between the genders’ perceptions concerning general 
trust or mistrust of search engine results. Most male respondents in this sample (71.43%) 
indicated that they believed search engine results usually contain accurate information, 
whereas 55.17 percent of female respondents indicated that was the case (see table 
10). These results are statistically significant (χ-squared = 4.5126, df = 1, P = .03365). 

TABLE 10
Question: I believe the pages listed by a search engine usually contain accurate information.

Female Male Difference Response
55.17 71.43 –16.26 TRUE
44.83 28.57 16.26 FALSE

In evaluating the objectiveness of a source, males were more likely to trust their 
search engine to return objective pages and were also more likely to indicate that 
they do not evaluate the objectiveness of a site, consistent with their other responses 
concerning their use of search engines (see table 11). Males were also more likely to 
ask a friend if a site is objective. Female respondents indicated that they were more 
likely to examine the URL of a site and to consider the content of the document to 
evaluate objectiveness. These results approached statistical significance (χ-squared = 
12.238, df = 7, P = .093). 

TABLE 11
Question: How do you decide whether or not a source retrieved from the Internet is objective and provides fair and equal treatment of all sides of a topic? (Check all that apply.)

Female Male Difference Response
1.5564 3.1008 –1.5444 I do not understand what is meant by “objective” in this question.
6.2257 10.8527 –4.627 I believe that pages returned by my search engine are objective.
67.7043 57.3643 10.34 I look at the URL of the site and, based on the domain (.com, .edu, .org, .net, etc.), use that information to help me determine whether or not the site is objective.
25.6809 24.8062 0.8747 I check with someone who may know; for example, library staff or a professor.
6.2257 10.8527 –4.627 I ask a friend if he or she thinks the site is objective.
72.7626 65.1163 7.6463 If the document provides a fair discussion of all sides of a topic or issue and acknowledges other viewpoints, then I consider it objective.
5.0584 10.8527 –5.7943 I do not evaluate the objectiveness of a site.
5.0584 6.2016 –1.1432 I do not believe it is possible to determine the objectiveness of a page returned by a search engine.

Consistent with response results on other questions, male respondents in this 
survey were more likely to trust the pages returned by their search engine, with 13 
percent of males indicating that they believed pages returned by their search engine 
were accurate versus 3 percent of females (see table 12). Females were also more likely 
to use the URL of the site to evaluate accuracy, with 66 percent of females indicating 
they used the URL to evaluate accuracy versus 54 percent of males. These results are 
statistically significant (χ-squared = 17.8213, df = 5, P = 0.003179). 

TABLE 12
Question: How do you decide whether or not information on a page is accurate and contains truthful and correct information? Check all that apply.

Female Male Difference Response
1.9455 2.3256 –.3801 I do not understand what is meant by “accurate” in this question.
2.7237 13.1783 –10.4546 I believe that pages returned by my search engine are accurate.
66.1479 54.2636 11.8843 I look at the URL of the site and, based on the domain (.com, .edu, .org, .net, etc.), use that information to help me determine whether or not the site is accurate.
36.1868 31.7829 4.4039 I check with someone with knowledge of the site or topic; for example, library staff or a professor.
31.5175 29.4574 2.0601 I believe that sites with a more current date are more accurate.
4.6693 5.4264 –.7571 I do not check the accuracy of information on a website.

When asked how they evaluate the credibility of a site, female respondents were 
once again more likely to use the URL to determine the credibility of a site, with 63 
percent of female respondents choosing that option versus 51 percent of male respon-
dents (see table 13). Females were also more likely to examine who published the 
site, with 63 percent of females indicating that they examine the site publisher versus 
53 percent of males. Males continued to indicate a strong degree of trust in the search 
engine, with 16 percent of male respondents indicating the pages returned by their 
search engine are credible versus 5 percent of female respondents. These results are 
statistically significant (χ-squared = 19.1975, df = 7, P = .007591). 

TABLE 13
Question: How would you evaluate the credibility of a site or document and determine that the information is truthful and trustworthy? Check all that apply.

Female Male Difference Response
1.5564 3.876 –2.3196 I do not understand what is meant by “credibility” in this question.
63.035 51.1628 11.8722 I look at the URL of the site and, based on the domain (.com, .edu, .org, .net, etc.), use that information to help me determine whether or not the site is credible.
56.8093 56.5891 .2202 I look at the background of the author of the page—their professional affiliations, college degrees, what else he or she has written.
62.6459 52.7132 9.9327 I check to see who published the information on the site.
5.0584 16.2791 –11.2207 I believe that pages returned by my search engine are credible.
3.8911 6.2016 –2.3105 I do not evaluate the credibility of websites.
5.8366 4.6512 1.1854 I do not believe it is possible to evaluate the credibility of pages returned by a search engine.
35.7977 38.7597 –2.962 I evaluate the information on the site against what I know about the topic.

When asked how they evaluate the quality of a page returned by a search engine, 
female subjects once again indicated the URL was significant to them, with 58 percent 
of female respondents indicating they use the URL and domain to determine whether 
or not a site is a quality site versus 42 percent of male respondents (see table 14). 
Females also indicated they were more likely to ask a professor or someone with 
knowledge of the site for guidance, with 19 percent of females indicating they would 
check with someone knowledgeable about a site versus 12 percent of males. These 
results are statistically significant (χ-squared = 43.7426, df = 1, P < .001).

TABLE 14
Question: How do you evaluate the quality of a page returned by a search engine? Check all that apply.

Female Male Difference Response
2.3346 5.4264 –3.0918 I do not understand what is meant by “quality” in this question.
11.284 10.8527 .4313 If the page includes pictures and charts, it is a quality site.
53.6965 48.8372 4.8593 If the page is free from incorrect spelling, typographical errors, and poor grammar, I consider it a quality site.
70.8171 62.7907 8.0264 If the information presented in the site is comprehensive and covers the topic in considerable depth, I consider it a quality site.
57.5875 41.8605 15.727 I look at the URL of the site and, based on the domain (.com, .edu, .org, .net, etc.), use that information to help me determine whether or not the site is a quality site.
2.3346 3.876 –1.5414 I do not evaluate the quality of pages returned by a search engine.
37.7432 43.4109 –5.6677 If the information on the page is interesting and presents a clear, well-reasoned explanation of the topic, I consider it a quality page.
3.1128 1.5504 1.5624 I do not believe it is possible to determine the quality of a page returned by a search engine.
18.677 11.6279 7.0491 I check with someone with knowledge of the site or topic; for example, library staff or a professor.




When asked how they evaluate the authority of a page, 20 percent of male respon-
dents indicated that they did not understand what was meant by authority in the 
question versus 16 percent of female respondents (see table 15). Twelve percent of 
male respondents indicated that they did not believe it was possible to determine the 
authority of a page returned by a search engine versus 3 percent of female respondents. 
Females were more likely to review the qualifications and prior work of the author of 
a page to determine the authority of a page, 47 percent of female respondents versus 
40 percent of male respondents. Females were also more likely to use the URL to de-
termine the authority of a page. These results are statistically significant (χ-squared = 
16.6788, df = 6, P = .01054).

TABLE 15
Question: How do you evaluate the authority of a page and determine the ability to comment or write about a particular topic? (Check all that apply.)

Female Male Difference Response
15.9533 20.9302 –4.9769 I do not understand what is meant by “authority” in this question.
3.1128 11.6279 –8.5151 I do not believe it is possible to determine the authority of a page returned by a search engine.
44.7471 42.6357 2.1114 I check to see if the author of the page has published other pages, articles, or books about the topic.
46.6926 39.5349 7.1577 I look for information about the author of the page—qualifications, degrees or certifications, the profession, the author’s background, other pages/documents he or she has written.
43.9689 34.1085 9.8604 I look at the URL of the site and, based on the domain (.com, .edu, .org, .net, etc.), use that information to help me determine whether or not the site has authority.
11.6732 8.5271 3.1461 I do not examine the authority of a site.
14.0078 9.3023 4.7055 I check with someone with knowledge of the site or topic; for example, library staff or a professor.

When asked about evaluating the currency of a page, males were more likely to 
respond that they did not understand what currency meant in the question (13% of 
males versus 6% of females). Females were more likely to provide a number of answers 
that indicated they were evaluating currency (see table 16). These results are statisti-
cally significant (χ-squared = 25.4271, df = 8, P = .001315).

TABLE 16
Question: How do you evaluate the currency of a page and determine how current the information on the site is? Check all that apply.

Female Male Difference Response
5.8366 13.1783 –7.3417 I do not understand what is meant by “currency” in this question.
38.1323 27.907 10.2253 I examine how frequently the site has been updated.
56.8093 42.6357 14.1736 I look for a date on the site.
1.5564 3.1008 –1.5444 I do not believe it is possible to determine if the information returned by a search engine is current.
2.7237 5.4264 –2.7027 I do not evaluate the currency of a site.
6.2257 11.6279 –5.4022 I check with someone with knowledge of the site or topic; for example, library staff or a professor.
40.4669 25.5814 14.8855 I look for a copyright date.
35.4086 31.7829 3.6257 I look at the dates of the references.
24.5136 11.6279 12.8857 I check to see if all the links are still active (broken links will lead me to believe the site is not current).

When asked how they determine the purpose or bias of a website, male respondents 
were more likely to indicate that they did not understand what was meant by the 
question: 9 percent of males versus 5 percent of females (see table 17). Female respondents 
were more likely to report evaluating characteristics of the site to determine bias, 
with 69 percent of females indicating that they examined whether the purpose of a 
site is to promote an opinion or point of view versus 57 percent of male respondents. 

TABLE 17
Question: How do you evaluate the purpose or bias of a website? Check all that apply.

Female Male Difference Response
4.6693 8.5271 –3.8578 I do not understand what is meant by “purpose” in this question.
46.6926 44.186 2.5066 I determine whether or not the author of the page or the owner of the URL is trying to sell something.
68.4825 57.3643 11.1182 I examine whether or not the purpose of the site is to promote a particular opinion or point of view.
48.6381 48.8372 –.1991 I examine whether or not the site is spam, a hoax, or a joke.
9.7276 9.3023 .4253 I do not evaluate the purpose of a site.
1.9455 4.6512 –2.7057 I do not believe it is possible to determine the purpose of a page returned by a search engine.
12.0623 12.4031 –0.3408 I check with someone with knowledge of the site or topic; for example, library staff or a professor.

Summary of Results
The responses to the survey questions from both genders reveal a mixture of what 
would be considered good IL skills and what would be considered questionable or 
problematic IL skills. In analyzing these based on variations in in-group percentages 
and chi-square tests of the raw counts, we found clear indications of gender differences 
in the information literacy skills of college students. Major findings based on the 
analysis of the data collected in this study are as follows:

• Based on statistically significant results, females in this sample appeared to be 
more discerning in evaluating a site or source and used a number of criteria 
in evaluating the source. 

• Statistically significant results indicated that females in this sample were more 
likely to use critical evaluation criteria such as their ability to understand a site, 
their ability to verify the information on the site, and the ability to evaluate the 
credentials of the author. Female respondents were also more likely to evaluate 
the currency of a site and the quality of the writing on the site. 




• Statistically significant results indicated that females in this sample were more 
likely to use sources beyond Google or Ask.com; females were also more likely 
to use library databases. 

• Statistically significant results indicated that females in this sample were more 
likely to try to find enough quality sources to support the information in their 
paper and males reported being less concerned with the number of sources 
in a paper. 

• Statistically significant results indicated that females were more likely to evalu-
ate the quality of the site using the URL or were willing to check with someone 
with knowledge about the quality of the site.

• Males in this sample appeared to be more confident that they could determine 
the author of a web page and the qualifications of the author of a web page. 

• Statistically significant results indicated that males in this sample believed the 
pages returned by their search engine were accurate and credible. 

• Statistically significant results indicated that males in this sample were more 
likely to believe that the pages listed by a search engine usually contained ac-
curate information.

• Results indicated that males in this sample were more likely to believe an 
Internet source was objective and were also more likely to ignore the 
objectiveness of the site. 

• Statistically significant results indicated that males in this sample were more 
likely to be confused about the criteria of source currency, and females were 
more likely to provide a number of answers that indicated they were evaluat-
ing source currency.

Discussion
Information literacy skill involves the selection of information sources using various 
criteria as part of an information search process. As indicated previously, analysis of 
seventeen questions in our survey concerning evaluation criteria found consistent 
evidence of gender-specific variations in student use and understanding of these criteria. Specifi-
cally, our analysis indicated that female respondents were more likely to evaluate the 
currency of a site and the quality of the writing on the site and were more likely to 
evaluate the quality of the site using the URL or were willing to check with someone 
with knowledge about the quality of the site. These results add to our impression of 
the careful female researcher, providing further evidence that the female subjects were 
concerned with finding valid, credible sources to strengthen their research projects. 
This validates and extends Neely’s finding that women consider evaluating sources 
to be more important than men do and demonstrate the self-efficacy to apply evalu-
ative criteria.29 

Additional evidence that females are more careful searchers comes from our 
finding that the females in our sample were more likely to use sources beyond 
commercial search engines and more likely to use library resources. This supports and 
extends previous research that found females were more likely than men to use books 
and journals.30

Burdick worked with high school students and made the observation that females 
were more reflective.31 The sources available on the Internet have changed significantly 
since 1995 when Burdick’s study was completed. Our results are based on a study of 
college-aged students conducted in 2013. It is intriguing that, given the differences in 
time (over 18 years of Internet information growth) and the age of the subject pools, 
this study continues to find similar gender-specific differences. It could be that fe-
males are socialized in a way that leads them to put more effort into their schoolwork 
and to craft their arguments accordingly. Some studies have provided evidence that 
parents encourage female children with schoolwork more than they do male children, 
and young males’ academic performance may be hindered by their assumption that 
adults believe girls are better students than boys are.32 

We do not believe our results necessarily indicate that males are unconcerned with 
the quality of their research. In fact, across the responses to several questions the 
majority of males reported what would be considered good IL skills. But males were 
more likely to choose responses indicating questionable IL skills, and for several 
questions these differences were statistically significant. It 
could be that male students did not feel the need to use some evaluation criteria since 
they believe pages returned by a search engine are objective, credible, and accurate. 
Males in our sample consistently expressed confidence in their information evaluation 
skills and about the quality of results from the search engines they were using. They 
also reported being less concerned with the number of sources in a paper. Results of 
other studies report similar confident behavior from male subjects.33 

That males are confident in search engine results is clear, but this could be an in-
dication that males are exhibiting satisficing behavior, choosing the easiest path to the 
selection of sources. Their stated confidence in the quality of search engine results could 
simply be reinforcement of this behavior.34 Other studies have reported similar issues 
with convenience as a factor in information searches of students of both genders.35 

Limitations
These results were based on a sample composed primarily of undergraduate students 
at an American university. This limits the generalizability of these results. A sample 
drawn from a broader mix of cultures and grade levels and including more graduate 
students might produce different results. 

Subjects took our survey before they received information literacy instruction from 
library faculty at our institution. Some subjects may have had information literacy 
instruction in a class or at another institution before taking our survey. 

We did not conduct a reliability and validity test on the questions in this survey. 
Our survey questions were used to collect demographic and background data where 
validity is not an issue (for example, gender, year in college) or for subjects to report 
evaluation and use of information sources. A validity test is used to determine how 
accurately a question measures an intended effect. Our questions were not used to test 
an effect per se; instead, they asked subjects to self-report past behaviors (use of 
information sources). While we believe our questions were clear and the data gathered 
were valid reports of the subjects’ past behavior, a validity test could have been used 
to further refine the questions. 

More males in this study were business majors, so results could be associated with 
the student’s choice of major. More females in our sample reported GPAs above 3.5 
(49% of females versus 37% of males). Our results could also be an indication that 
students with higher GPAs exhibit these search behaviors. Likewise, gender could 
be an influence on the selection of a major. 

Implication of Findings
The results presented here raise questions concerning the appropriateness of current 
information literacy instruction. Librarians often stress the use of evaluation criteria 
but, given constraints on time available and staffing, may not have the opportunity 
to explore with students how to apply the criteria. If males and females appear to be 
using a different approach to source selection, then the approach used by each must be 
considered and any deficiencies addressed. This does not imply different information 
literacy instruction for males and females (not a realistic possibility at most institutions), 
but the specific issues identified by this research for both genders could become part 
of a coeducational and gender-aware approach to information literacy instruction, 
as part of librarians’ efforts to meet the needs of all learners.

Librarians often provide individual instruction sessions, both at the reference desk 
and in consultations in their offices. Our research indicates that females are more 
concerned with evaluating sources; in line with Burdick’s suggestion, educators should be 
mindful that females who express concern with their research require more “encour-
agement to take intellectual risks” and “reassurance that they are capable.”36 Fields 
suggests college-aged women’s information literacy skills can be further developed 
by focusing on connections, collaborations, and firsthand experiences.37 The author 
describes specific areas where women feel “held back”: for example, women have the 
ability to, but are uncomfortable with, constructing arguments and expressing their 
own opinions, especially at younger ages.38 The author suggests that this is a part of a 
student’s intellectual development toward the path of having her own “voice” or own 
authority and helps her to use information to create new knowledge.39

Finding this voice is essential for the intellectual and professional development of 
women. Kay and Shipman describe women’s “acute lack of confidence” and a reluc-
tance to take risks as the underlying reason women have so rarely risen into senior 
management.40 Thus, it is not hard to believe that a female at any age would have 
less confidence and spend more time evaluating a source that she might use for an 
argumentative research project. Encouraging instructors and librarians to recognize 
gender-specific information literacy traits could give female students skills to increase 
their research self-efficacy and thus increase their likelihood of success. 

Male college students appear to express confidence in the quality of Internet infor-
mation sources, but in reality this confidence may be misplaced. Male college students 
must learn the importance of being more discerning where Internet information sources 
are concerned. This would involve information literacy instruction that stresses the 
limitations of commercial search engines and the need for careful evaluation of Internet 
information sources using a variety of criteria. Given that male students appear to be 
more confused by some evaluation criteria (currency, authority), additional instruction 
should focus on the meaning of these criteria and the process for proper evaluation 
of sources using these criteria. 

Our research confirms and extends findings on library anxiety, the anxious feeling 
surrounding information research.41 Research by Blundell and Lambert found a sig-
nificant difference in library anxiety expressed by their male and female respondents. 
Males were more comfortable using the library and females experienced more confu-
sion navigating library resources. Males believed instructions for using the computers 
were helpful, whereas female respondents perceived reference librarians as unhelp-
ful.42 If females are more discerning in the evaluation of information sources, then it 
is logical that they would be more anxious or confused about using library resources. 
Perhaps female feelings of anxiety motivate them to evaluate information resources 
more thoroughly, or perhaps males are less concerned with evaluating information 
sources because they have more confidence in all their sources. 

Directions for Future Research
Student researchers must evaluate information sources using various criteria. Previous 
information literacy instruction recognized and identified these criteria often in the 
form of a checklist. But research has suggested that, in practice, students struggle with 
the actual use of these evaluation criteria.43 Part of the difficulty has been attributed to 
the changing information environment, which includes the Internet and the increasing 
availability of mobile devices with information-gathering capabilities. In recognition 
of these difficulties and the ever-changing global information environment, the ACRL 
Information Literacy Competency Standards for Higher Education have been replaced 
by a “Framework for Information Literacy.”44 This new framework emphasizes infor-
mation literacy instruction that encourages students to recognize their own authority 
in the context of information gathering. 

The use of a framework approach does partly address the dynamics of information 
gathering in today’s culture, but it also increases the responsibility and difficulty for 
the student in evaluating information sources. Teaching students information literacy 
skills within this new framework requires a clear understanding of both the student 
(his or her characteristics) and the individual’s information search process. 

The research reported here identified gender-specific distinctions in information 
literacy skills in a sample of undergraduate and graduate students at an American 
institution of higher education. Future research should examine other population 
samples and attempt to determine if other student characteristics might influence 
students’ information literacy skills, specifically in relation to the use of Internet in-
formation sources. 

Conclusion
Our research has suggested that college-aged students of both genders have issues 
evaluating information resources. Developing good information literacy skills is becom-
ing increasingly important in a world permeated with commercialized media. Students 
must question information sources and develop skills for applying important evalu-
ation criteria such as quality, credibility, bias, and validity. These skills are important 
not only in the pursuit of their careers but in supporting their role as informed citizens 
in a participatory democracy.

Whether it is desirable or not, it is reasonable to expect that the use of the Internet 
as an information source for college students will continue and will likely increase. 
Information literacy instruction should recognize the distinct characteristics of college 
students that might affect their information literacy skills, specifically the gender-
specific characteristics that this research identified. 

 




APPENDIX A: Survey Questions
If you have declared a major or completed work for a college undergraduate degree, 
what is or was the area of study for your declared undergraduate major? If you have 
multiple declared majors or multiple degrees, check all that apply.

• I have not declared a major
• business
• education
• sciences
• communication
• arts
• music
• math
• foreign language
• English language
• journalism
• theatre
• engineering
• my major is not listed

How often do you use the web to find information?
• several times a day
• about once a day
• about once a week

When a search engine returns a list of pages, I select a page from the list based on the 
following criteria. Check all that apply.

• the format of the site
• whether or not the site has video
• the site has pictures
• the site has a lot of information
• the site is understandable
• I am familiar with the author of the site
• I can verify the information on the site
• the credentials (qualifications) of the author are good
• the site is current
• the quality of the writing on the site is good
• there are reviews and comments on the site

Consider the papers that you have written in the last year. How often have you used 
research tools beyond Google or Ask.com to do your research?

• Never
• Infrequently; less than 25 percent of the time
• Sometimes; about 50 percent of the time
• Usually; about 75 percent of the time
• Almost always

How would you know when you have enough sources for a paper?
• I don’t worry about the number of sources for a paper
• 5 sources is enough for any paper
• 10 sources is enough for any paper
• 20 sources is enough for any paper 
• I try to find enough quality sources to support the information in my paper




Consider the papers that you have written in the last year. How many pages of search 
engine (Google, Yahoo!, Bing) results do you usually view?

• 1 page
• 2 pages
• more than 2 pages
• more than 5 pages

Which search engines or library databases do you use for your research? Check all 
that apply.

• Yahoo!
• Bing
• Google
• Another search engine such as Blekko/Lycos/AOL
• Metasearch engine (one that cross-searches other search engines) such as Ixquick/Dogpile/Clusty/Webcrawler/Mywebsearch
• General library databases such as Academic Search Premier/CQ Researcher/Credo Reference/LexisNexis/JSTOR/Omnifile/Gale Virtual Reference/Ebscohost
• Subject-specific library databases such as ABI-Inform, ArtStor, Business Source Premier, Communication
• Library OneSearch (cross-searches the library catalog and most of our databases)
• Library book catalog

It is usually possible to determine the author of a web page.
• True
• False 

It is usually possible to determine the qualifications of the author of a web page.
• True 
• False

I believe the pages listed in a search engine's results usually contain accurate information.
• True
• False

How do you decide whether or not a source retrieved from the Internet is objective 
and provides fair and equal treatment of all sides of a topic? Check all that apply.

• I do not understand what is meant by objective in this question
• I believe that pages returned by my search engine are objective
• I look at the URL of the site and, based on the domain (.com, .edu, .org, .net, or others), use that information to help me determine whether or not the site is objective
• I check with someone who may know (for example, library staff or a professor)
• I ask a friend if he or she thinks the site is objective
• If the document provides a fair discussion of all sides of a topic or issue and acknowledges other viewpoints, then I consider it objective
• I do not evaluate the objectiveness of a site
• I do not believe it is possible to determine the objectiveness of a page returned by a search engine

How do you decide whether or not information on a page is accurate and contains 
truthful and correct information? Check all that apply.




• I do not understand what is meant by accurate in this question
• I believe that pages returned by my search engine are accurate
• I look at the URL of the site and, based on the domain (.com, .edu, .org, .net, or others), use that information to help me determine whether or not the site is accurate
• I check with someone with knowledge of the site or topic (for example, library staff or a professor)
• I believe that sites with a more current date are more accurate
• I do not check the accuracy of information on a website

How would you evaluate the credibility of a site or document and determine that the 
information is truthful and trustworthy? Check all that apply. 

• I do not understand what is meant by credibility in this question
• I look at the URL of the site and, based on the domain (.com, .edu, .org, .net, or others), use that information to help me determine whether or not the site is credible
• I look at the background of the author of the page: professional affiliations, college degrees, other writings
• I check to see who published the information on the site
• I believe that pages returned by my search engine are credible
• I do not evaluate the credibility of websites
• I do not believe it is possible to evaluate the credibility of pages returned by a search engine
• I evaluate the information on the site against what I know about the topic

How do you evaluate the quality of a page returned by a search engine? Check all 
that apply.

• I do not understand what is meant by quality in this question
• If the page includes pictures and charts, it is a quality site
• If the page is free from spelling and typographical errors and the author uses good grammar, I consider it a quality site
• If the information presented in the site is comprehensive and covers the topic in considerable depth, I consider it a quality site
• I look at the URL of the site and, based on the domain (.com, .edu, .org, .net, or others), use that information to help me determine whether or not the site is a quality site
• I do not evaluate the quality of pages returned by a search engine
• If the information on the page is interesting and presents a clear, well-reasoned explanation of the topic, I consider it a quality page
• I do not believe it is possible to determine the quality of a page returned by a search engine
• I check with someone with knowledge of the site or topic (for example, library staff or a professor)

How do you evaluate the authority of a page and determine the author's ability to 
comment or write about a particular topic? Check all that apply.

• I do not understand what is meant by authority in this question
• I do not believe it is possible to determine the authority of a page returned by a search engine
• I check to see if the author of the page has published other pages, articles, or books about the topic
• I look for information about the author of the page: qualifications, degrees or certifications, the profession, the author's background, other pages/documents the author has written
• I look at the URL of the site and, based on the domain (.com, .edu, .org, .net, or others), use that information to help me determine whether or not the site has authority
• I do not examine the authority of a site
• I check with someone with knowledge of the site or topic (for example, library staff or a professor)

How do you evaluate the currency of a page and determine how current the informa-
tion on the site is? Check all that apply.

• I do not understand what is meant by currency in this question
• I examine how frequently the site has been updated
• I look for a date on the site
• I do not believe it is possible to determine if the information returned by a search engine is current
• I do not evaluate the currency of a site
• I check with someone with knowledge of the site or topic (for example, library staff or a professor)
• I look for a copyright date
• I look at the dates of the references
• I check to see if all the links are still active (broken links will lead me to believe the site is not current)

How do you evaluate the purpose or bias of a website? Check all that apply.
• I do not understand what is meant by purpose in this question
• I determine whether or not the author of the page or the owner of the URL is trying to sell something
• I examine whether or not the purpose of the site is to promote a particular opinion or point of view
• I examine whether or not the site is spam, a hoax, or a joke
• I do not evaluate the purpose of a site
• I do not believe it is possible to determine the purpose of a page returned by a search engine
• I check with someone with knowledge of the site or topic (for example, library staff or a professor)

Notes

 1. Association of College and Research Libraries, "Information Literacy Competency Standards for Higher Education" (Chicago: American Library Association, 2000).

 2. J. Patrick Biddix, Chung Joo Chung, and Han Woo Park, “Convenience or Credibility? 
A Study of College Student Online Research Behaviors,” The Internet and Higher Education 14, 
no. 3 (July 2011): 175–82; Kate Lawrence, “Today’s College Students: Skimmers, Scanners and 
Efficiency-Seekers,” Information Services & Use 35, no. 1–2 (2015): 89–93; Lea Currie, Frances Devlin, 
Judith Emde, and Kathryn Graves, “Undergraduate Search Strategies and Evaluation Criteria: 
Searching for Credible Sources,” New Library World 111, no. 3/4 (2010): 113–24; Anne M. Fields, 
“Self-Efficacy and the First-Year University Student’s Authority of Knowledge: An Exploratory 
Study,” Journal of Academic Librarianship 31, no. 6 (2005): 539–45; Alison J. Head and Michael B. 
Eisenberg, “Truth Be Told: How College Students Evaluate and Use Information in the Digital 
Age, Project Information Literacy Progress Report,” 2010, available online at http://projectinfolit.
org/images/pdfs/pil_fall2010_survey_fullreport1.pdf [accessed 29 November 2016].

 3. Joy Tillotson, “Web Site Evaluation: A Survey of Undergraduates,” Online Information 
Review 26, no. 6 (Dec. 2002): 392–403.




 4. Biddix, Chung, and Park, “Convenience or Credibility?” 175–82; Lawrence, “Today’s Col-
lege Students,” 91. 

 5. Head and Eisenberg, “How College Students Evaluate,” 51.
 6. Andrew J. Flanagin and Miriam J. Metzger, "Digital Media and Youth: Unparalleled Opportunity and Unprecedented Responsibility," in Digital Media, Youth, and Credibility, eds. Miriam J. Metzger and Andrew J. Flanagin (Cambridge, Mass.: MIT Press, 2008), 5–27; Melissa Gross and Don Latham, "Experiences with and Perceptions of Information: A Phenomenographic Study of First-Year College Students," Library Quarterly: Information, Community, Policy 81, no. 2 (2011): 161–86; Miriam J. Metzger, Andrew J. Flanagin, and Lara Zwarun, "College Student Web Use, Perceptions of Information Credibility, and Verification Behavior," Computers & Education 41, no. 3 (2003): 271–90.

 7. Ian Rowlands et al., “The Google Generation: The Information Behaviour of the Researcher 
of the Future,” in Aslib Proceedings, vol. 60, no. 4 (2008): 290–310.

 8. Deborah J. Grimes and Carl H. Boening, “Worries with the Web: A Look at Student Use 
of Web Resources,” College & Research Libraries 62, no. 1 (2001): 11–22.

 9. Michael Lorenzen, “The Land of Confusion? High School Students and Their Use of the 
World Wide Web for Research,” Research Strategies 18, no. 2 (2001): 151–63.

10. Tillotson, “Web Site Evaluation,” 392–403. 
11. Grimes and Boening, "Worries with the Web," 11–22; Head and Eisenberg, "How College Students Evaluate"; Brian Hilligoss and Soo Young Rieh, "Developing a Unifying Framework of Credibility Assessment: Construct, Heuristics, and Interaction in Context," Information Processing & Management 44, no. 4 (2008): 1467–84; Lorenzen, "The Land of Confusion?" 151–63.

12. Ananda Mitra, Jennifer Willyard, Carrie A. Platt, and Michael Parsons, “Exploring Web 
Usage and Selection Criteria Among Male and Female Students,” Journal of Computer-Mediated 
Communication 10, no. 3 (Apr. 1, 2005). 

13. Currie et al., “Undergraduate Search Strategies,” 113–24; Lawrence, “Today’s College 
Students,” 92.

14. Nicholas Carr, The Shallows: What the Internet Is Doing to Our Brains (New York: W.W. Nor-
ton and Company, 2010); Grimes and Boening, “Worries with the Web,” 11–22; Lorenzen, “The 
Land of Confusion?” 151–63; Arthur Taylor, “A Study of the Information Search Behaviour of the 
Millennial Generation,” Information Research 17, no. 1 (2012); Peter Williams and Ian Rowlands, 
Information Behaviour of the Researcher of the Future: The Literature on Young People and Their Informa-
tion Behavior: Work Package II (Bristol, United Kingdom: Joint Information Systems Committee, 
2007), available online at www.webarchive.org.uk/wayback/archive/20140614113317/http://www.
jisc.ac.uk/media/documents/programmes/reppres/ggworkpackageii.pdf [accessed 29 November 
2016].

15. Miriam J. Metzger, Andrew J. Flanagin, and Ryan B. Medders, “Social and Heuristic Ap-
proaches to Credibility Evaluation Online,” Journal of Communication 60, no. 3 (Sept. 2010): 413–39.

16. Head and Eisenberg, “How College Students Evaluate,” 13, 40.
17. Metzger, Flanagin, and Zwarun, “College Student Web Use,” 271–90.
18. Mitra et al., “Exploring Web Usage,” 10.
19. Sook Lim and Nahyun Kwon, "Gender Differences in Information Behavior Concerning Wikipedia, an Unorthodox Information Source," Library & Information Science Research 32, no. 3 (July 2010): 212–20; Jela Steinerová and Jaroslav Susol, "Users' Information Behaviour: A Gender Perspective," Information Research 12, no. 3 (2007): 13.
20. Metzger, Flanagin, and Zwarun, "College Student Web Use," 271–90.
21. Lim and Kwon, "Gender Differences in Information Behavior," 212–20.
22. Deborah J. Fallows, "How Women and Men Use the Internet" (Pew Internet & American Life Project, Dec. 28, 2005), available online at www.pewinternet.org/files/2005/12/PIP_Women_and_Men_online.pdf [accessed 29 November 2016]; Steinerová and Susol, "Users' Information Behaviour," 13.

23. Tracey A. Burdick, “Success and Diversity in Information Seeking: Gender and the Informa-
tion Search Styles Model,” School Library Media Quarterly 25, no. 1 (1996): 19–26; Eszter Hargittai 
and Steven Shafer, "Differences in Actual and Perceived Online Skills: The Role of Gender," Social 
Science Quarterly 87, no. 2 (2006): 432–48; Fallows, “How Women and Men Use the Internet”; Nai 
Li and Gill Kirkup, “Gender and Cultural Differences in Internet Use: A Study of China and the 
UK,” Computers & Education 48, no. 2 (Feb. 2007): 301–17.

24. Eszter Hargittai, “Beyond Logs and Surveys: In-Depth Measures of People’s Web Use Skills,” 
Journal of the American Society for Information Science and Technology 53, no. 14 (2002): 1239–44; Har-
gittai and Shafer, “Differences in Actual and Perceived Online Skills,” 432–48; Ethelene Whitmire, 
“The Relationship between Undergraduates’ Background Characteristics and College Experiences 
and Their Academic Library Use,” College & Research Libraries 62, no. 6 (2001): 528–40.

25. Burdick, “Success and Diversity in Information Seeking,” 19–26.



26. Teresa Yvonne Neely, Aspects of Information Literacy: A Sociological and Psychological Study (doctoral dissertation, University of Pittsburgh, 2000).

27. Nahyun Kwon and Hana Song, “Personality Traits, Gender, and Information Competency 
among College Students,” Malaysian Journal of Library & Information Science 16, no. 1 (July 2011): 
87–107.

28. Association of College and Research Libraries, "Information Literacy Competency Standards for Higher Education."

29. Neely, Aspects of Information Literacy.
30. Metzger, Flanagin, and Zwarun, “College Student Web Use,” 271–90.
31. Burdick, “Success and Diversity in Information Seeking,” 19–26.
32. Daniel Voyer and Susan D. Voyer, "Gender Differences in Scholastic Achievement: A Meta-Analysis," Psychological Bulletin 140, no. 4 (2014): 1174–1204.
33. Fallows, "How Women and Men Use the Internet"; Steinerová and Susol, "Users' Information Behaviour."
34. Herbert A. Simon, "Rational Choice and the Structure of the Environment," Psychological Review 63, no. 2 (1956): 129; Herbert A. Simon, "Rationality as Process and as Product of Thought," The American Economic Review 68, no. 2 (1978): 1–16; Amanda Spink et al., "What Is Enough? Satisficing Information Needs," Journal of Documentation 63, no. 1 (2007): 74–89.

35. Lynn Silipigni Connaway, Timothy J. Dickey, and Marie L. Radford, “‘If It Is Too Inconve-
nient I’m Not Going after It’: Convenience as a Critical Factor in Information-Seeking Behaviors,” 
Library & Information Science Research 33, no. 3 (2011): 179–90; Helen Georgas, “Google vs. the 
Library (Part III): Assessing the Quality of Sources Found by Undergraduates,” portal: Libraries 
and the Academy 15, no. 1 (2015): 133–61.

36. Burdick, “Success and Diversity in Information Seeking,” 26.
37. Anne M. Fields, "Women's Epistemological Development: Implications for Undergraduate Information Literacy Instruction," Research Strategies 18, no. 3 (2001): 227–38.
38. Ibid.
39. Fields, "Self-Efficacy and the First-Year University Student's Authority of Knowledge," 545.
40. Katty Kay and Claire Shipman, "The Confidence Gap," The Atlantic Monthly 313, no. 4 (2014): 56–66.
41. Constance A. Mellon, "Library Anxiety: A Grounded Theory and Its Development," College & Research Libraries 47, no. 2 (1986): 160–65.
42. Shelly Blundell and Frank Lambert, "Information Anxiety from the Undergraduate Student Perspective: A Pilot Study of Second-Semester Freshmen," Journal of Education for Library and Information Science 55, no. 4 (2014): 261–73.

43. Marc Meola, “Chucking the Checklist: A Contextual Approach to Teaching Undergradu-
ates Web-Site Evaluation," portal: Libraries and the Academy 4, no. 3 (2004): 331–44; Arthur Taylor 
and Heather A. Dalal, “Information Literacy Standards and the World Wide Web: Results from 
a Student Survey on Evaluation of Internet Information Sources,” Information Research 19, no. 4 
(2014).

44. Association of College and Research Libraries, "Framework for Information Literacy for Higher Education" (Chicago: American Library Association, 2015), available online at www.ala.org/acrl/standards/ilframework [accessed 29 November 2016].