Evidence Summary

A Librarian Consultation Service Improves Decision-Making and Saves Time for Primary Care Practitioners

A Review of:
McGowan, Jessie, William Hogg, Craig Campbell, and Margo Rowan. "Just-in-Time Information Improved Decision-Making in Primary Care: A Randomized Controlled Trial." PLoS ONE 3.11 (2008): e3785. 10 Mar 2009 <http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0003785>.

Reviewed by:
Heather Ganshorn
Librarian, Health Information Network Calgary
Calgary, AB, Canada
E-mail: Heather.Ganshorn@ucalgary.ca

Received: 12 March 2009  Accepted: 27 April 2009

© 2009 Ganshorn. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Objectives – To determine whether a point-of-care librarian consultation service for primary care practitioners (PCPs) improves the quality of PCPs' decision-making; saves PCPs time; reduces the number of point-of-care questions that go unanswered due to time constraints; and is cost-effective. Overall PCP satisfaction with the service was also assessed.

Design – Randomized controlled trial.

Setting – Four Family Health Networks (FHNs) and 14 Family Health Groups (FHGs) in Ontario, Canada. These represent new models of primary care service delivery in Ontario.

Subjects – PCPs working within the selected FHNs and FHGs. The majority were physicians, but the sample also included one resident, one nurse, and four nurse practitioners.

Methods – Subjects were trained to submit their point-of-care questions electronically, using either a Web-based query form or a mobile device. They were also trained in query formulation using PICO (patient, intervention, comparison, and outcome).

Allocation was concealed by an independent company hired to manage data for the project. Participants were not randomized; rather, the questions were randomized using a random-number generator. To keep the librarians blinded, every submitted question was answered by a librarian. Answers to questions in the intervention group were relayed by a third party to the practitioner within minutes; answers to questions in the control group were not communicated to the practitioner. Blinding of the PCP subjects was not possible, as they either received or did not receive an answer.

Subjects were asked to respond to a questionnaire 24 hours after submitting their question. If the question was in the control group, subjects were asked to indicate whether they had let the question remain unanswered or had pursued an answer on their own. To assess the cognitive impact of both librarian-provided and self-sought information, respondents rated the information on a scale from high positive impact to negative impact on decision-making.

Two linear regression models were run on the data, with participant response time as the dependent variable in the first model and librarian response time as the dependent variable in the second.

Main Results – The service received a total of 1,889 questions, of which 472 (25%) were randomized to the control group and 1,417 (75%) to the intervention group.
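A note on mechanics: the unit of randomization here is the question, not the practitioner. The sketch below (in Python) shows one way such an allocation could work; the article reports only that a random-number generator was used, so the 0.75 probability threshold and the function itself are illustrative assumptions rather than the trial's actual procedure.

    import random

    def allocate_question(p_intervention: float = 0.75) -> str:
        """Illustrative only: independently assign one incoming question
        to the intervention arm with probability 0.75 (a 3:1 ratio),
        consistent with the 1,417 / 472 split the trial reports."""
        return "intervention" if random.random() < p_intervention else "control"

    # Simulate allocating the trial's 1,889 questions.
    arms = [allocate_question() for _ in range(1889)]
    print(arms.count("intervention"), arms.count("control"))  # roughly 1417 and 472

Because allocation happens per question rather than per subject, every enrolled practitioner could experience both arms of the trial over time.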
Analysis of both groups found that the types and complexity of questions were similar between the two arms, as was librarian response time. Questions were rated for complexity (the rating scale is included in the article), and most (85%) had a Level 1 complexity rating, meaning there was only one concept listed for each PICO element.

The primary outcome measure was the amount of time required to answer a question. Librarians took an average of 13.68 minutes to respond to a question, while PCPs took an average of 20.29 minutes to find answers to their own questions; however, subjects attempted to answer only 40.5% of the control-group questions themselves. A cost-effectiveness analysis was run on these times: the average per-question salary cost for a librarian to answer a question (based on 15 minutes per question) was $7.15, while the average salary cost for a PCP to spend 15 minutes searching for information ranged from $20.75 to $27.69.
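The per-question figures imply the hourly salary rates underlying the analysis. The back-calculation below is this summary's own arithmetic from the reported numbers, not a table in the article:

    # Hourly rates implied by the reported 15-minute search costs
    # (inferred here; the article states only the 15-minute figures).
    librarian_cost_15min = 7.15
    pcp_cost_15min = (20.75, 27.69)

    librarian_hourly = 4 * librarian_cost_15min          # $28.60 per hour
    pcp_hourly = tuple(4 * c for c in pcp_cost_15min)    # $83.00 to $110.76 per hour

    # Combining the rates with the reported average search times gives a
    # rough per-question cost for each group.
    librarian_per_q = librarian_hourly * 13.68 / 60       # about $6.52
    pcp_per_q = tuple(h * 20.29 / 60 for h in pcp_hourly) # about $28.07 to $37.45

On either basis, a librarian-answered question costs roughly a fifth to a quarter of what a practitioner's own search does.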
The results of the questionnaire indicated a significant positive impact of the information on clinician decision-making. Approximately 60% of the questions in the control group went unanswered, whereas all of the questions in the intervention group were answered. Of the questions answered by the information service, 63.7% of the answers were rated by participants as having a high positive impact on decision-making, versus 14.9% of the answers to control-group questions that practitioners sought out themselves. Seventeen percent of answers in the intervention group were rated as having a moderate positive impact, versus 5.9% in the control group. Only 7.8% of answers in the intervention group were rated as having no impact, versus 24.8% in the control group. A negative impact (where practitioners found too much or too little information, or information that they disagreed with or felt was harmful) was reported for 7.7% of librarian-provided answers, compared with 44.9% of practitioner-sought answers.

Satisfaction with the service was very high according to the exit survey: 86% of participants agreed that the service had a positive impact on decision-making, and 83% said that relevant answers were provided in an appropriate time frame. Most participants (72%) would consider using such a service, and 33% indicated they would be willing to pay for this type of service.

Conclusion – A point-of-care reference service, in which librarians answer primary care practitioners' questions within minutes, has a very positive impact on clinical decision-making and a high rate of client satisfaction. The service saves PCPs time, which may allow them to spend more time with patients, and by supporting good clinical decision-making it may also decrease the need for referrals and further tests. It is cost-effective, as librarians find better quality information than practitioners do, and they do it faster, at a lower hourly salary.

Commentary

This is an interesting study for several reasons. Though other studies have evaluated question-and-answer services, this one appears to be unique in examining a just-in-time service. Clinical trials of library services are still rare, so this trial is a useful example for others considering a clinical trial as their research methodology.

The randomisation of questions rather than subjects is an interesting twist on traditional clinical trial methodology. Randomising the questions, in combination with a 3:1 randomisation ratio that allocated most questions to the intervention group, ensured that frequent users of the service had a high probability of having most of their questions answered.

The article has the authors' Consolidated Standards of Reporting Trials (CONSORT) checklist attached. The CONSORT Statement is "an evidence-based, minimum set of recommendations for reporting RCTs" (CONSORT Group). This is an excellent template to use when planning and reporting on a trial: the checklist will assist with structuring the article, and may also assist in the planning of a study.

Despite its strengths, the study does have some limitations. The authors state that their sample was drawn from sites that were geographically convenient to the researchers, so the sample is not representative of all PCPs. The study was also carried out on a relatively small population, and although 95 individuals declined to participate, it is unknown why they declined or whether they differed from the participant group in any significant way.

The authors ran a pre-intervention simulation on the random-number generator. This simulation determined that a sample of 88 physicians submitting 22 questions each (five of them controls) would have 99% power to distinguish between the control and intervention groups. According to this reviewer's calculations, that would require a total of 1,936 questions, 440 of which would be controls; however, the authors indicate that the 88 physicians enrolled in the study generated only 1,889 questions. As the authors do not include their power calculations, it is difficult to tell what impact the lower number of actual questions would have on the statistical significance of the findings, nor do they explain why they chose 99% power rather than a lower figure. The authors also do not indicate who performed the statistical analysis: one of the authors, or the outside agency that managed the project's data? They state that they used two linear regression models, but these models are not stated in mathematical terms, nor are the B weights and significance levels reported.
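The sample-size arithmetic is easy to verify; the short calculation below simply multiplies out the simulation's planned numbers and compares them with the questions actually received:

    physicians = 88
    questions_each = 22   # planned questions per physician
    controls_each = 5     # planned control questions per physician

    planned_total = physicians * questions_each     # 1,936 questions
    planned_controls = physicians * controls_each   # 440 control questions

    shortfall = planned_total - 1889                # 47 questions short of plan

The shortfall is modest, but without the published power calculations its effect on the reported findings cannot be judged.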
It would have been useful to do a pre-assessment of the participants to discover their existing level of skill in literature searching and evidence based practice. This information might have provided some insight into the large satisfaction gap between practitioners' own searches and those done by librarians. It would also have been helpful to know which databases were available to the librarians and subjects in the study, and which were most commonly consulted by each group.

The authors acknowledge limitations of their study, including the lack of follow-up with subjects to determine how they used the time saved by the service, or to discover why, in some cases, subjects felt that the information retrieved (whether by the service or by their own efforts) had no impact or a negative impact on clinical decision-making. Follow-up interviews could have been used to elicit this information. The authors also caution that the low enrolment numbers mean their findings cannot necessarily be generalized.

The authors provide several informative graphs and tables with their study, yet they do not include their survey instruments: it would have been useful to see both the question-specific questionnaire that subjects received 24 hours after submitting their question and the final exit survey.

Overall, this is perhaps the best example to date of a clinical trial of library services. Librarians considering similar trials of their own services will find this article invaluable to their planning process.

Works Cited

CONSORT Group. CONSORT: Transparent Reporting of Trials. 2009. 11 Mar 2009 <http://www.consort-statement.org/>.