Evidence Based Library and Information Practice 2012, 7.2

Article

Interactions: A Study of Office Reference Statistics

Naomi Lederer
Liberal Arts Librarian
Morgan Library, Colorado State University
Fort Collins, Colorado, United States
Email: naomi.lederer@colostate.edu

Louise Mort Feldmann
Business and Economics Librarian
Morgan Library, Colorado State University
Fort Collins, Colorado, United States
Email: louise.feldmann@colostate.edu

Received: 22 Nov. 2011    Accepted: 19 May 2012

© 2012 Lederer and Feldmann. This is an Open Access article distributed under the terms of the Creative Commons-Attribution-Noncommercial-Share Alike License 2.5 Canada (http://creativecommons.org/licenses/by-nc-sa/2.5/ca/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.

Abstract

Objective – The purpose of this study was to analyze the data from a reference statistics-gathering mechanism at Colorado State University (CSU) Libraries. It aimed primarily to better understand patron behaviours, particularly in an academic library with no reference desk.

Methods – The researchers examined data from 2007 to 2010 of College Liaison Librarians' consultations with patrons. Data were analyzed by various criteria, including patron type, contact method, and time spent with the patron. The information was examined in the aggregate, meaning all librarians combined, and then specifically for the Liberal Arts and Business subject areas.

Results – The researchers found that the number of librarian reference consultations is substantial. Referrals to librarians from CSU's Morgan Library's one public service desk have declined over time. The researchers also found that graduate students are the primary patrons and email is the preferred contact method overall.

Conclusion – The researchers found that interactions with patrons in librarians' offices – either in person or virtually – remain substantial even without a traditional reference desk. The data suggest that librarians' efforts at marketing themselves to departments, colleges, and patrons have been successful. This study will be of value to reference, subject specialist, and public service librarians, and to library administrators, as they consider ways to quantify their work, not only for administrative purposes but also to follow trends and provide services and staffing accordingly.

Introduction

Reference services have traditionally been measured in some way in order to collect evidence, most commonly by a simple tick mark to indicate a transaction. In late 2006, Colorado State University (CSU) Libraries moved from a traditional reference desk model to a referral system. Staff and students working at a library information desk started to refer patrons to librarians for in-depth assistance, and the librarians wanted to collect data about their in-office reference consultations in order to capture information about this new service. CSU is a land-grant institution located in Fort Collins, Colorado, United States, with an FTE of approximately 25,000 students. The Libraries consist of a main library, Morgan Library, and a Veterinary Teaching branch.

The CSU Libraries College Liaison Librarians unit consists of 10 librarians and 2 staff members. Since 2007, these librarians have used a reference database developed in-house to record office research consultations. The database provides a place to input various data and to generate reports for librarians, the College Liaison unit, and the Libraries administration. Administrators can use the database to see specific liaison workloads and which subjects have the most inquiries, and can then use this information to rebalance assignments (e.g., subjects reconfigured or other responsibilities reassigned to compensate for a heavier load) and to justify budgets for additional librarians and other relevant resources.

At CSU, College Liaison Librarians do not staff a public service desk, but provide reference assistance in their offices via drop-ins and appointments. Additionally, some librarians offer reference services in departments or colleges for two to four hours each week. CSU Libraries has a help desk whose staff and student employees may refer in-depth questions to librarians.

The researchers were curious about how CSU library patrons seek information. Claims that reference statistics are declining may refer only to data from the traditional reference desk. Are patrons still seeking librarians for assistance? Are trends at a national level, such as a decline in reference desk statistics, occurring locally? The data from the office statistics database provided an opportunity to identify patterns and to explore how patrons are seeking reference services, and in 2011 the database statistics were analyzed to answer these questions. The subject areas of the questions were also of interest because they might reflect success in outreach, or reveal areas that might be candidates for additional promotion of services. In this study, the researchers identified overall trends and looked specifically at the subject areas of Liberal Arts and Business.

Literature Review

The broad topic of library statistics often encompasses collection holdings, staffing, and circulation data. In line with the focus of this article, only literature relating to library reference statistics was examined. Only one article was found that discusses the collection of reference statistics resulting from transactions originating from multiple sources (reference desk, email, phone, instant messaging, etc.); the majority of articles focus on public service desk statistics, and those with relevant ideas are discussed below. Few articles consider how statistics are gathered; most focus instead on the results of the statistics gathering. Furthermore, no close analyses of any particular librarians' office interactions were found.

Novotny (2002) shows how some libraries collect reference statistics on paper, including example sheets with categories that in some cases are used away from service desks. Examples include separate telephone and email reference question sheets, weekly summaries, and a question sheet with options for multiple types of contact with the patron available for each question. The summary of reference statistics covers public desks, not office numbers. Measures for Electronic Resources (E-Metrics) (2002) discusses digitally based reference (and other) transactions.
Possible statistics are provided for networked and electronic services and resources, but the emphasis is on electronic resources, not on the work that librarians might be doing somewhere other than at a reference or public service desk. Electronic reference is just one aspect of the paper, and in any case it has changed substantially since 2002.

In providing guidelines for gathering digital reference statistics, McClure, Lankes, Gross, and Choltco-Devlin (2002) point out that "libraries have seriously underrepresented their services in terms of use of digital services being provided . . . by not counting and assessing these uses and users. As more users rely on digital library services—including digital reference services—this undercount will continue to increase" (p. 8). The measures in these guidelines focus on digital reference, rather than on any kind of off-desk assistance. Nevertheless, this type of statistical gathering could be a useful starting point for a library developing a statistics-gathering database.

The majority of articles on digital reference services do not focus on the methods patrons use to contact librarians directly. Instead they mention digital reference in passing, provide a careful analysis of where and how the services are available at specific locations (Lederer, 2001; Pomerantz, Nicholson, Belanger, & Lankes, 2004; White, 2001), or focus on nonaffiliated users of the service (Kibbee, 2006). Articles on digital reference outside of North America and collaborative reference efforts are not closely enough related to the current topic to be included.

Some researchers have classified types of questions asked at reference or email reference services. Henry and Neville (2008) discuss how the Katz classification, as detailed in Introduction to Reference Work, and the Warner classification, as detailed in his article "A New Classification for Reference Statistics," measured experiences at a small academic library. Henry and Neville include references to Association of Research Libraries (ARL) statistics and comparisons of newer types of access, such as chat, email, and instant message services, used in public as well as academic libraries. Meserve, Belanger, Bowlby, and Rosenblum (2009) applied the Warner classification at their institution, evaluating it favourably and using it to support their tiered reference arrangement. Greiner (2009) responds to the article by Meserve et al. by questioning some of its conclusions and notes the decline in questions overall, attributing it partly to incorrectly interpreted questions, but also citing relationship building by librarians as a necessary component of good reference service. Meserve (2009) replies with overall agreement, while emphasizing that at his library the paraprofessionals are well versed in their role, and that reference services were in decline before paraprofessionals were put on a service desk.

The evaluation of reference service is a frequent topic in the literature. Logan (2009) provides a good overview that starts from the beginnings of reference services in 19th-century America, and points out that although the tools have certainly changed, the functions of reference have not. Reference was not often discussed in publications until the 1970s; an emphasis on assessment and evaluation of reference services followed in the 1990s, and more recently on "'learning outcomes' and 'information literacy'" (p. 230).
Logan recommends the "establish[ment of] flexible criteria for good service," which include components related to "behavioral characteristics . . . basic knowledge of resources and collections, subject knowledge, and reference skills" (p. 231). Welch (2007) describes and discusses the National Information Standards Organization's (NISO) Z39.7-2004 standard, and highlights the importance of counting email, web page, and other reference transactions. Library services have progressed beyond traditional desk transactions, and a method for tracking all reference transactions is necessary. Library administrators need to be convinced of the relevance and importance of these new methods for providing research services. As Welch (2007) writes, "including electronic reference transactions and visits to reference-generated web pages in statistical reports are ways to demonstrate . . . our continuing usefulness to our patrons" (p. 103).

The amount of effort expended for different types of questions is explored by Gerlich and Berard (2007, 2010). They outline 6 levels of effort and provide charts of questions by type for the 2003-2004 academic year (2007), and they broaden the collection of data to 15 libraries in 2010. One of their main points is that collecting statistics only at a traditional reference desk does not capture all reference transactions that are taking place – many transactions occur via email and other methods. Gerlich and Berard (2010) argue that "reference transactions are on the decline as documented by librarians and their institutions, yet reference activities taking place beyond traditional service desks are on the rise" (p. 116). Data collection techniques all too frequently do not take these additional assistance points into account, and "counting traffic numbers at the traditional reference desk is no longer sufficient as a measurement that reflects the effort, skill, and knowledge associated with this work" (p. 117). Gerlich and Berard discuss expended effort and difficulty in their larger Reference Effort Assessment Data (READ) experiment (2007).

Murgai (2006) describes one library's sampling of the number and types of questions, noting some disadvantages of sampling but showing that the results of the sampling were within acceptable ranges of accuracy (though reference questions beyond the desk are not included). Another statistic that is more difficult to collect is the often multiple and varied types of resources used by a librarian to answer a single question (Tenopir, 1998).

Thomsett-Scott and Reese (2006) examine whether there is a relationship between changes in library technology and reference desk statistics. They note the changes in the number of questions when CD-ROMs and Web-based resources were first introduced, and report that while reference statistics may be declining, the types of questions are "more intricate" or "complex" (p. 148):

    A review of the literature suggests that reference questions are taking longer to answer and are more extensive, yet the actual number of questions is declining. Reference managers may need to reconsider how reference services are measured. Statistics may be lower due to issues with the traditional recording method of "one patron, one tick" (p. 149).

In other words, in the past a patron might come to the desk asking about books on a topic, then return to ask about articles, then return to ask for help with citations (three transactions).
In the electronic world, this one patron is likely to be helped in a single transaction. Additionally, the authors point out that "traditional statistical recording systems also may not include reference questions answered beyond the reference desk" (p. 162). They also examined gate counts for 1997-2004, circulation counts for 1998-2004, and various reference counts and types from 1989-2004 at their own library. Not all types of statistics were gathered for all years, as email and chat reference started only in 1998. They conclude that "statistics should include online reference methods and possibly web page statistics as the proliferation of library-based web pages may . . . be answering many of the questions that face-to-face reference services answered in the past" (p. 163).

Some articles describe in-house databases created to collect reference statistics. Aguilar, Keating, and Swanback (2010) describe the thinking behind their library's in-house database as a "need to discover new ways to gauge the needs of our patrons and employ concrete data to make decisions" (p. 290). Statistics are gathered at multiple service points – mostly reference desks, but also offices and remote locations – and used to justify collections purchases and increased staffing of their "Ask a Librarian" service during a specific time of day. Data for ARL and other reports are easily gathered from Aguilar, Keating, and Swanback's database. A second in-house database is described by Feldmann (2009); it was created to capture the number of reference questions that were successfully referred from the new information desk after the reference desk was disbanded. The database evolved into a useful tool for gathering information on librarians' office transactions. The author cites articles that discuss referral services and various staffing models for tiered services.

Smith (2006) describes a Web-based system for collecting statistics and discusses the various reasons people have collected reference statistics, as well as the problems associated with collecting them, such as apathy and the wide variation in the parts and types of questions. The author describes how the database was developed and the types of information it collects, including screen shots and HTML coding of and for the database. The references and further reading are substantial. Todorinova, Huse, Lewis, and Torrence (2011) describe one university library's choice of a commercial product, Desk Tracker, after using a system of clickers that did not record the time of transactions. The data collected included type of patron, form of the transaction (in person, email, or phone), and type of question, and were used to assign appropriate staffing levels and to inform collection development decisions. Some output weaknesses were found in the software, but the data have been proposed as potentially useful for decision-making and improving services and operations.

Although some of the literature examines reference statistics closely, it does so in specific contexts, such as health or medical libraries or GIS systems (e.g., Parrish, 2006), with a more focused audience and set of questions. The literature still lacks a close examination of reference transactions away from the reference desk.

This study looks closely not only at how the questions were asked, but also at how long they took to answer, their subject areas (broad and more specific, depending upon the topic), the status of the questioner, and whether or not the question was referred by someone else (in the majority of cases, via a service desk).

Methods

Data were gathered from an office statistics database, a recording mechanism used to capture CSU Librarians' reference transactions, both in-office and during office hours in a department or college on campus (see Figure 1). The database was developed in late 2006, when a CSU Libraries Business Librarian, a staff member, and a member of the library's technical services department created it using the PHP scripting language and MySQL. It was originally conceived as a method to track referrals from the newly implemented information desk (Feldmann, 2009). Starting in 2007, librarians no longer staffed a reference desk or any other public service desk, and staff and students working at the information desk (now the sole public service desk) would refer in-depth questions to librarians. The database initially provided a method for capturing the number of referrals received by librarians from the information desk, in addition to providing a place to record reference transactions. Since librarians at this time placed a renewed emphasis on departmental outreach, it was also thought that the database would capture the impact of marketing their research consultation services to faculty. Over the years, librarians have changed or modified input fields to reflect needs and improve the database. Reports are easily generated as Microsoft Excel spreadsheets.

The input form contains both required and voluntary fields. Information collected in the required fields includes name, contact type (email, drop-in, phone, appointment, office hours, or other), help desk referral, time spent, number of patrons assisted, and status of patron. Voluntary information includes discipline area, course information, and comments.

Figure 1
CSU Libraries office statistics database entry form

For this study, the researchers extracted numbers from the database in the aggregate (totals from all librarians) for the years 2007-2010. Additionally, data from the Liberal Arts and Business Librarians were extracted as samples to examine subject-specific data. The Business Librarian provides assistance for six departments: Accounting, Finance, Marketing, Management, Computer Information Systems, and Economics. The primary Liberal Arts Librarian covers seven departments: English, History, Art, Communication Studies, Journalism and Technical Communication, Ethnic Studies, and Design and Merchandising, which is part of the College of Applied Human Sciences. The database allows data to be pulled directly by various fields, by date range, and by librarian. The researchers extracted data by contact type, number of patrons helped, time spent, patron status, and whether or not the question was a referral from the help desk for the years 2007-2010. This information was then examined to determine trends.
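
The in-house schema itself was not published. As a minimal sketch only, a consultation record of the kind described above might map onto a MySQL table along the following lines; the table and column names here are invented for illustration and are not the Libraries' actual ones.

    -- Hypothetical MySQL table mirroring the entry form's fields;
    -- names are illustrative, not the actual in-house schema.
    CREATE TABLE office_consultation (
        id             INT AUTO_INCREMENT PRIMARY KEY,
        librarian      VARCHAR(100) NOT NULL,   -- who recorded the transaction
        contact_type   ENUM('email','drop-in','phone','appointment',
                            'office hours','other') NOT NULL,
        desk_referral  BOOLEAN NOT NULL,        -- referred from the help desk?
        minutes_spent  INT NOT NULL,
        patrons_helped INT NOT NULL,
        patron_status  VARCHAR(50) NOT NULL,    -- e.g., graduate, faculty
        discipline     VARCHAR(100),            -- voluntary fields may be NULL
        course_info    VARCHAR(255),
        comments       TEXT,
        recorded_on    DATE NOT NULL
    );

    -- A report of the kind exported to Excel: contact-type counts
    -- for a date range, comparable to Table 2 below.
    SELECT contact_type, COUNT(*) AS transactions
    FROM office_consultation
    WHERE recorded_on BETWEEN '2007-01-01' AND '2010-12-31'
    GROUP BY contact_type
    ORDER BY transactions DESC;

Adding a filter on the librarian column to a query of this sort would produce the per-librarian and subject-specific breakdowns reported below.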

Results and Discussion

Aggregate Information

Table 1 shows that both the number of consultations and the number of librarians reporting decreased between 2007 and 2010. While the total number of office consultations decreased by year, a corresponding drop in the number of librarians reporting also occurred, so that the mean (average per librarian) increased from 127 in 2007 to 154 in 2010. Fewer librarians were employed in 2010 than in previous years due to attrition.

Table 1
2007-2010 Office Consultations by Year

Year   No.     Librarians Reporting
2007   1,517   12
2008   1,856   12
2009   1,515   11
2010   1,395   9

Table 2 shows that email was by far the most popular way for patrons to receive assistance, accounting for 50% (3,141 questions) of all transactions.

Table 2
2007-2010 Office Contact Type

Contact Type   No.     Percent
Email          3,141   50%
Drop-In        1,214   19%
Phone          748     12%
Appointment    714     11%
Other          424     7%
Office Hours   40      1%
Empty          2       0%
Total          6,283

The contact type of "Office Hours" refers to librarians providing dedicated office hours to answer questions from drop-in patrons, similar to the traditional office hours that faculty provide. These were recorded only in January and February of 2007; they were a short-term arrangement in which librarians were assigned as backups for the then-new information desk, and referrals from that desk were so rare that the concept was abandoned after a short run. "Empty" indicates that no information was entered. "Other" could mean helping someone in the library while en route to a meeting or returning to one's office, service provided at a non-library location (for instance, the Business Librarian's "Librarian to Go" reference in the College of Business), instant messaging (IM), and so on.

The status of patrons who directly contacted librarians (Table 3) shows that graduate and undergraduate students are the heaviest users, with faculty members in a solid third place.

Table 3
2007-2010 Office Patron Status

Patron Status      No.     Percent
Graduate           2,030   32%
Undergraduate      1,969   31%
Faculty            1,156   18%
Community          557     9%
Staff              348     6%
Elsewhere          80      1%
Government         54      0.9%
Empty              51      0.8%
Visiting Faculty   24      0.4%
Administrator      14      0.2%

These figures show that graduate students visit their College Liaison Librarians in greater numbers than any other category, even though they are a much smaller percentage of the University's overall student population.

The majority of consultations are relatively short (Table 4); most are between 10 minutes and 1 hour 25 minutes. Researchers who contact a librarian are more likely to have questions that require some research to answer, and talking with a student or faculty member in an office can often take longer than an interaction at a reference desk, where others waiting in line can speed up a reply. A user who makes an appointment is not going to rush off. Of course, there are questions that need only a brief answer; the 1 to 4 minute category includes any number of interactions that took no more than 10 seconds but were recorded as one minute (see Table 5). Email dominates this (and every) category, but drop-ins are also brief, as a patron may have a quick question and simply stop by without an appointment (or be referred from the information or help desks).

Table 4
2007-2010 Office Consultations – Time Spent with Patron

Time Spent    No. Patrons
15m-29m       1,667
30m-59m       1,321
10m-14m       853
1h-1h25m      773
5m-9m         694
1m-4m         465
1h30m-1h59m   210
Empty         143
2h+           118
3h+           24
4h+           9
5h+           6

Note. m = minute; h = hour

Table 5
2007-2010 Office Consultations – Short Contacts

Time        Phone   Other   Office Hours   Email   Drop-In
1 minute    29      35      11             56      29
2 minutes   28      28      4              123     31

The number of referrals to librarians from the information or help desks is much lower than expected, as seen in Table 6. Additionally, referrals have decreased over time, as shown in Table 7.

Table 6
2007-2010 Office Consultations – Referrals

Referral   No.
Yes        1,041
No         5,242

Table 7
2007-June 2011 Office Consultations – Referrals

Year               Yes   No      Percent Referred   Total No. Questions
2007               478   1,040   32%                1,518
2008               325   1,530   18%                1,855
2009               139   1,376   9%                 1,515
2010               99    1,296   8%                 1,395
2011 (Jan.-June)   32    729     4%                 761

Librarians have made a push to directly promote themselves to students and faculty in order to provide the best possible service to their constituents. For example, flyers promoting the College Liaison Librarians by name and specialty have been distributed to faculty in departments. Some librarians offer reference assistance for a few hours a week in departments or colleges on campus; this has increased the visibility of librarian services to faculty, staff, and students in those areas, and has possibly resulted in direct contacts rather than referrals via a service desk. Additionally, College Liaison Librarians promote their services directly to students in their library instruction sessions. It has been observed that faculty members who are familiar with the librarians' services are more likely to refer their students directly to their College Liaison Librarian. The actual referrals from the service desk may be an even lower percentage than those recorded here; some librarians record a "referral" when a faculty member refers a student directly to a librarian.

The decline in referrals from the information and help desks prompts many questions. Do desk staff give patrons a librarian's business card, after which the patron decides, for whatever reason, not to contact the librarian directly? Are the desk personnel unfamiliar with the College Liaison Librarians, and therefore uncomfortable referring questions to them? Are the desk staff and patron satisfied with the result of the transaction? Has the nature of questions changed? Do web pages and LibGuides play any role in filling research needs? These are all questions for further examination.
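
Extending the illustrative schema sketched in the Methods section (again assuming the invented office_consultation table, not the Libraries' actual one), a per-year referral share of the kind shown in Table 7 could be derived with a single grouped query:

    -- Share of consultations arriving as help desk referrals, by year;
    -- a derivation of the kind behind a report like Table 7.
    SELECT YEAR(recorded_on) AS yr,
           SUM(desk_referral) AS referred,
           COUNT(*) AS total,
           ROUND(100 * SUM(desk_referral) / COUNT(*)) AS pct_referred
    FROM office_consultation
    GROUP BY YEAR(recorded_on)
    ORDER BY yr;

Run over the full dataset, a query along these lines would yield the yearly referred counts, totals, and percentages of the sort reported in Table 7.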

Subject-Specific Information: Business

CSU Libraries Business office statistics consist of office research consultations, reference assistance during "office hours" held in the CSU College of Business, and assistance via instant messaging, email, phone, and referrals from the Libraries' Ask-a-Librarian email service. CSU Libraries has one Business Librarian, who serves 5,800 students, including Business majors and minors as well as onsite and distance graduate students.

CSU College Liaison Librarians differ in how they enter their office statistics, and this affects how the results can be analyzed. The Business Librarian designates all questions having to do with Business as "Business," rather than parsing them out further into such categories as finance or accounting. Questions are often multidisciplinary, and it may be difficult to categorize a reference consultation topic into only one area. For example, students in CSU's College of Business Global Social Sustainable Enterprise program often research a country's social, political, and economic climate along with business logistics.

Business reference questions have generally been increasing, with a slight dip in 2010. Total contacts were 210 in 2007, 356 in 2008, 360 in 2009, 298 in 2010, and 342 in 2011. Taken alone, it is difficult to explain the decrease in 2010, or why numbers are not continually increasing given that the number of Business students is growing. The dip could be attributable to successful instruction sessions, students using the library's LibGuides to find answers, more library-savvy Business students, and assignments requiring less or no library research. These results warrant further investigation, potentially through analysis of LibGuide and instruction statistics or by more qualitative methods, particularly if the questions continue to decline in future years.

Contact types for Business are similar to the aggregate data, with email being the primary contact type. Table 8 shows contacts, percent of total, and a comparison with the aggregate (overall) percentages.

Table 8
Business Librarian – Contact Type (January 2007-June 2011)

Contact Type   No.   Percent   Overall Percent
Email          722   51%       50%
Drop-In        220   16%       19%
Other          215   15%       11%
Phone          134   9.6%      12%
Appointment    110   7.9%      7%

Undergraduates are the primary patrons for the years examined (see Table 9). Table 10 shows a comparison of patron status by year; in 2010 the trend changed, with graduate students overtaking undergraduates as the more common contacts.

Table 9
Business Librarian – Patron Status (January 2007-June 2011)

Patron Status      No.   Percent   Overall Percent
Undergraduates     486   40%       31%
Graduates          423   35%       32%
Faculty            145   12%       18%
Community          98    8%        9%
Staff              52    4%        6%
Visiting Faculty   0     0%        0%
Elsewhere          22    2%        1%

Table 10
Business Librarian – Patron Status by Year

Patron             2007   2008   2009   2010   2011 (Jan.-June)
Undergraduate      53%    41%    40%    28%    28%
Graduate           28%    35%    33%    40%    39%
Faculty            8%     13%    11%    14%    20%
Community          9%     7%     8%     9%     6%
Staff              2%     2%     6%     6%     8%
Visiting Faculty   0%     0%     0%     0%     0%
Elsewhere          0%     1%     2%     2%     0%

Further breakdowns were explored. Graduate students contact the Business Librarian primarily by email (42%, 2007 through June 2011), and the average time spent with a graduate student was 35 minutes. Undergraduates also contact the Business Librarian primarily by email (54%, 2007 through June 2011), and the average time spent with an undergraduate patron is similar to that spent with graduate students. While the aggregate data (Table 4) show that most office consultations run between 15 and 29 minutes, the Business student data indicate that slightly more time is spent with these students than with the average of all patrons.

Subject-Specific Information: Liberal Arts

The primary Liberal Arts Librarian (responsible for 6 of the 13 departments in the college) answered 158 questions in 2007, 158 in 2008, 189 in 2009 (an increase of 31), 220 in 2010 (another increase of 31), and 251 in 2011. The trend after the second year has been a rising number of questions. Compared to the whole, the Liberal Arts Librarian's numbers have not always reflected the same trends, as seen in Table 11.

In the Liberal Arts Librarian's interactions, email, phone (by just 3%), and other contacts made up a smaller percentage of the total than they did for other librarians, with email showing a much smaller share; the Liberal Arts Librarian had a correspondingly higher percentage of drop-ins and appointments.

Table 11
Liberal Arts Librarian – Contact Type (January 2007-June 2011)

Contact Type   No.   Percent   Overall Percent
Email          309   38%       50%
Drop-In        246   31%       19%
Appointments   176   22%       11%
Phone          69    9%        12%
Other          1     0%        7%

Another difference from the whole was patron status (see Table 12). Undergraduates contacted the Liberal Arts Librarian 22 percentage points more often than the overall population, while graduate students made 11 percentage points fewer contacts. Faculty, community contacts, visiting faculty, and elsewhere were close to the overall picture. A possible explanation is that the Liberal Arts Librarian teaches fewer graduate than undergraduate courses. Moreover, 20% of the graduate student numbers (32 students) come from a non-Liberal Arts department, where she has taught the new graduate students in the library classroom every Fall.

Table 12
Liberal Arts Librarian – Patron Status (January 2007-June 2011)

Patron Status      No.   Percent   Overall Percent
Undergraduate      427   53%       31%
Graduate           166   21%       32%
Faculty            127   16%       18%
Community          55    7%        9%
Staff              17    2%        6%
Visiting Faculty   5     0.6%      0.4%
Elsewhere          4     0.4%      1%

As for the disciplines in which questions were asked, the top categories cover many subject areas (Table 13). There were 35 areas represented, with the top 13 shown in Table 13 (16 categories had 1 entry each and 6 had 2-4, together making up 5% of the total).

Table 13
Liberal Arts Librarian – Office Consults by Discipline (January 2007-June 2011)

Discipline                      No.   Percent
History                         190   24%
English Language & Literature   133   17%
Design & Merchandising          100   12%
Speech                          92    11%
Art                             75    9%
Journalism                      58    7%
General                         46    6%
Ethnic Studies                  21    3%
Bibliographic Citation          16    2%
Other                           11    1%
Education                       6     1%
Foreign Language & Literature   6     1%
Library Science                 6     1%

Note. "Bibliographic Citation" is a newer entry; earlier entries were put into the "General" category.

There were many questions in Design & Merchandising, the non-Liberal Arts subject. Subjects outside of Liberal Arts appear both because the specialist for an area was not available that day and because the Liberal Arts Librarian's second language is French. Members of the French Department are aware of her specialized knowledge from various interactions and ask her questions specific to the French language, while in practice foreign literature research questions have been asked of the Foreign Languages Librarian (who sometimes consults with the Liberal Arts Librarian about these questions).

Comparisons across years show that History questions dominate: History was in first place in all but one of the four years examined; in 2008 it was in second place and English Language & Literature had the most questions. English was in the top three in all years. Design & Merchandising was fourth in three of the years and third once (2008). Of the most frequent areas, Art had the most dramatic drop, from second in 2007 to fifth or seventh in the other three years. A possible explanation is a decrease in the number of library instruction sessions provided for Art courses during the later years, thereby decreasing the number of students who meet their Art librarian in person.

Conclusion

This study examined patterns in patrons' use of reference services in a library that no longer has a traditional reference desk. Instead, a general help desk is used, among other methods, to refer patrons to subject-specific librarians for in-depth assistance. Routinely collected data were examined to determine whether patrons continue to seek librarian assistance now that librarians are no longer present at a reference desk.

The data examined included the demographics of the primary patrons, how patrons contact librarians, and how much time librarians spend with them. These data show that from 2007 to 2010 the majority of patrons who contacted CSU College Liaison Librarians were graduate students, and their primary mode of contact was email. Further examination of the statistics shows a marked decline over time in the number of referrals that librarians received from the information and help desks. Over the same period there was no large increase in the number of office consultations, although contact numbers are fairly consistent and actually show an average increase per librarian, given the decrease in the number of librarians. Similar trends were discovered for the two subject librarians (Business and Liberal Arts) whose data were examined separately.

The database has proven useful for examining trends and planning for the future, including following the nature of questions (e.g., in-depth) or adopting something similar to the Reference Effort Assessment Data (READ) Scale. College Liaison Librarians at CSU are making efforts to promote their services on campus, and these efforts may have contributed to patrons' increased awareness of librarian reference services. An in-depth examination of the direct impact of these promotion efforts would be worthwhile, although it must be noted that relying simply on statistical data may not provide a complete picture of how and why trends are occurring.

At the same time, the tracking must not become so burdensome that it distracts from helping patrons. In some instances students arrive back-to-back, and asking them multiple data-gathering questions takes away from the time spent actually helping; moreover, remembering the details for later input into the database can be difficult when patrons arrive in waves. A reference statistic was once satisfied with a quick tick mark; while the data now collected are useful, the collection must not end up overwhelming the people recording it. Additionally, important soft data might be hard to quantify; for example, are the departments with which the librarians liaise satisfied with how their library is serving them? Some subject areas/departments use the library and librarian services more than others, and this may simply be a discipline-specific behaviour. Further research to explore these patron behaviour patterns would be worthwhile.

Data gathering is useful for both library administrators and individual librarians as a means of quantifying their work. Administrators may use this information to examine workloads and productivity, justify the need to hire new faculty, identify the need to purchase software to develop online tutorials, and identify overall trends. Librarians may use the data to show their impact, see trends, and develop relevant online guides and tutorials.
At CSU Libraries, the data revealed by the office statistics database can demonstrate which subject areas are using their College Liaison (subject) Librarians the most, and can guide the specialists as to which topical supplementary materials – such as web pages, LibGuides, tutorials, or handouts – might be created to help serve their constituencies. It is important to remember that although data are useful, interpretation and presentation matter. Quantifying librarians' work can be difficult and may not always provide a complete picture of activity.

Acknowledgements

Parts of this article were reported at the Workshop for Instruction in Library Use (WILU) conference in Regina, Saskatchewan, Canada, in June 2011.

References

Aguilar, P., Keating, K., & Swanback, S. (2010). Click it, no more tick it: Online reference statistics. The Reference Librarian, 51(4), 290-299. doi:10.1080/02763877.2010.501421

Feldmann, L. M. (2009). Information desk referrals: Implementing an office statistics database. College & Research Libraries, 70(2), 133-141. Retrieved 21 May 2012 from http://crl.acrl.org/content/70/2/133.full.pdf

Gerlich, B. K., & Berard, G. L. (2007). Introducing the READ scale: Qualitative statistics for academic reference services. Georgia Library Quarterly, 43(4), 7-13. Retrieved 21 May 2012 from http://digitalcommons.kennesaw.edu/glq/vol43/iss4/4

Gerlich, B. K., & Berard, G. L. (2010). Testing the viability of the READ scale (Reference Effort Assessment Data): Qualitative statistics for academic reference services. College & Research Libraries, 71(2), 116-137. Retrieved 21 May 2012 from http://crl.acrl.org/content/71/2/116.full.pdf

Greiner, T. (2009). Letter to the editor. Reference & User Services Quarterly, 49(2), 111. Retrieved 21 May 2012 from http://rusa.metapress.com/content/k1107l2616utk6q1/fulltext.pdf

Henry, D. B., & Neville, T. M. (2008). Testing classification systems for reference questions. Reference & User Services Quarterly, 47(4), 364-373. Retrieved 21 May 2012 from http://rusa.metapress.com/content/p8016h827210644v/fulltext.pdf

Kibbee, J. (2006). Librarians without borders? Virtual reference service to unaffiliated users. Journal of Academic Librarianship, 32(5), 467-473. Retrieved 21 May 2012 from http://dx.doi.org/10.1016/j.acalib.2006.05.003

Lederer, N. (2001). E-mail reference: Who, when, where, and what is asked. The Reference Librarian, 35(74), 55-73. doi:10.1300/J120v35n74_05

Logan, F. F. (2009). A brief history of reference assessment: No easy solutions. The Reference Librarian, 50(3), 225-233. doi:10.1080/02763870902947133

McClure, C. R., Lankes, R. D., Gross, M., & Choltco-Devlin, B. (2002). Statistics, measures, and quality standards for assessing digital reference library services: Guidelines and procedures. Retrieved 21 May 2012 from ERIC database (ED472588).

Measures for electronic resources (e-metrics): Complete set. (2002). Washington, DC: Association of Research Libraries. Retrieved 21 May 2012 from http://www.arl.org/bm~doc/e-metrics.pdf.zip

Meserve, H. C. (2009). Mr. Meserve's reply. Reference & User Services Quarterly, 49(2), 111. Retrieved 21 May 2012 from http://rusa.metapress.com/content/h071631726l1p5n8/fulltext.pdf

Meserve, H. C., Belanger, S. E., Bowlby, J., & Rosenblum, L. (2009). Developing a model for reference research statistics: Applying the "Warner model" of reference question classification to streamline research services. Reference & User Services Quarterly, 48(3), 247-258. Retrieved 21 May 2012 from http://rusa.metapress.com/content/v8h758vjlt275234/fulltext.pdf

Murgai, S. R. (2006). Reference use statistics: Statistical sampling method works. Southeastern Librarian, 54(1), 45-57. Retrieved 21 May 2012 from http://selaonline.org/SoutheasternLibrarian/SELnSpring06.pdf

Novotny, E. (2002). Reference service statistics & assessment: A SPEC kit. Washington, DC: Association of Research Libraries. Retrieved 21 May 2012 from http://www.arl.org/bm~doc/spec268web.pdf

Parrish, A. (2006). Improving GIS consultations: A case study at Yale University Library. Library Trends, 55(2), 327-339. doi:10.1353/lib.2006.0060

Pomerantz, J., Nicholson, S., Belanger, Y., & Lankes, R. D. (2004). The current state of digital reference: Validation of a general digital reference model through a survey of digital reference services. Information Processing & Management, 40(2), 347-363. doi:10.1016/S0306-4573(02)00085-7

Smith, M. M. (2006). A tool for all places: A web-based reference statistics system. Reference Services Review, 34(2), 298-315. doi:10.1108/00907320610669524

Tenopir, C. (1998). Online databases: Reference use statistics. Library Journal, 123(8), 32-33.

Thomsett-Scott, B., & Reese, P. E. (2006). Changes in library technology and reference desk statistics: Is there a relationship? Public Services Quarterly, 2(2/3), 143-165. doi:10.1300/J295v02n02_10

Todorinova, L., Huse, A., Lewis, B., & Torrence, M. (2011). Making decisions: Using electronic data collection to re-envision reference services at the USF Tampa libraries. Public Services Quarterly, 7(1/2), 34-48. doi:10.1080/15228959.2011.572780

Warner, D. G. (2001). A new classification for reference statistics. Reference & User Services Quarterly, 41(1), 51-55.

White, M. D. (2001). Diffusion of an innovation: Digital reference service in Carnegie Foundation master's (comprehensive) academic institution libraries. Journal of Academic Librarianship, 27(3), 173-187. Retrieved 21 May 2012 from http://dx.doi.org/10.1016/S0099-1333(01)00179-3