ISDS Annual Conference Proceedings 2012. This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial 3.0 Unported License (http://creativecommons.org/licenses/by-nc/3.0/), permitting all non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

ISDS 2012 Conference Abstracts

A Review of Evaluations of Electronic Event-based Biosurveillance Systems

Kimberly Gajewski*1, Jean-Paul Chretien2, Amy Peterson2, Julie Pavlin3 and Rohit Chitale2
1Emory University, Atlanta, GA, USA; 2Division of Integrated Biosurveillance, Silver Spring, MD, USA; 3Headquarters, Armed Forces Health Surveillance Center, Silver Spring, MD, USA

Objective
To assess evaluations of electronic event-based biosurveillance systems (EEBSs) and define priorities for EEBS evaluations.

Introduction
EEBSs that use near real-time information from the Internet are an increasingly important source of intelligence for public health organizations (1, 2). However, there has been no systematic assessment of EEBS evaluations, which could identify uncertainties about current systems and guide EEBS development to effectively exploit digital information for surveillance.

Methods
We searched PubMed and consulted EEBS experts to identify EEBSs that met the following criteria: uses publicly available Internet information sources, includes events that affect humans, and has global scope. We constructed a list of 17 key evaluation variables using guidelines for evaluating health surveillance systems, and for each EEBS identified the key variables included in its evaluations, as well as the number of EEBSs evaluated for each key variable (3, 4).

Results
We identified 10 EEBSs and 17 evaluations (Table 1). The number of evaluations per EEBS ranged from 1 (Gen-Db, GODsN) to 7 (GPHIN, HealthMap). The median number of variables assessed per EEBS was 6 (range, 3–12), with 5 (25%) evaluations assessing 7 or more variables.
Nine (53%) published evaluations contained quantitative assessments of at least one variable. The least frequently studied variable was cost. No papers examined usefulness in terms of specific public health decisions or outcomes resulting from early event detection, though 8 evaluations assessed usefulness by citing instances where the EEBS detected an outbreak earlier, or by eliciting user feedback.

Conclusions
While EEBSs have demonstrated their usefulness and accuracy for early outbreak detection, no evaluations have cited specific examples of public health decisions or outcomes resulting from the EEBS. Future evaluations should discuss these critical indicators of public health utility. They should also assess the novel aspects of EEBSs, including variables such as policy readiness, system redundancy, and input/output geography (5), and test the effects of combining EEBSs into a "super system".

Table 1. Number of published evaluations and variables for identified EEBSs

Table 2. Key variables used in evaluations of EEBSs

Keywords
evaluation; biosurveillance; event-based surveillance

References
1. Heymann DL, et al. Hot spots in a wired world: WHO surveillance of emerging and re-emerging infectious diseases. Lancet Infect Dis. 2001;1:345–53.
2. Keller M, et al. Use of unstructured event-based reports for global infectious disease surveillance. Emerg Infect Dis. 2009;15:689–95.
3. German RR, et al.; Guidelines Working Group, Centers for Disease Control and Prevention (CDC). Updated guidelines for evaluating public health surveillance systems: recommendations from the Guidelines Working Group. MMWR Recomm Rep. 2001;50(RR-13):1–35.
4. Buehler JW, et al. Framework for evaluating public health surveillance systems for early detection of outbreaks: recommendations from the CDC Working Group. MMWR Recomm Rep. 2004;53(RR-5):1–11.
5. Corley CD, et al. Assessing the continuum of event-based biosurveillance through an operational lens. Biosecur Bioterror. 2012;10:131–41.
*Kimberly Gajewski
E-mail: kimberly.gajewski@emory.edu

Online Journal of Public Health Informatics * ISSN 1947-2579 * http://ojphi.org * 5(1):e131, 2013