Evaluation of Data Exchange Process for Interoperability and Impact on Electronic Laboratory Reporting Quality to a State Public Health Agency

Online Journal of Public Health Informatics * ISSN 1947-2579 * http://ojphi.org * 10(2):e204, 2018

Sripriya Rajamani1*, Ann Kayser2, Emily Emerson2, Sarah Solarz2
1. Informatics Program, School of Nursing, University of Minnesota, Minneapolis, Minnesota
2. Minnesota Electronic Disease Surveillance System (MEDSS) Operations, Infectious Disease Epidemiology Prevention and Control Division, Minnesota Department of Health, St. Paul, Minnesota

Abstract

Background: Past and present national initiatives advocate for electronic exchange of health data and emphasize interoperability. The critical role of public health in the context of disease surveillance was recognized with recommendations for electronic laboratory reporting (ELR). Many public health agencies have seen a trend towards centralization of information technology services, which adds another layer of complexity to interoperability efforts.

Objectives: The study objective was to understand the process of data exchange and its impact on the quality of the data being transmitted, in the context of electronic laboratory reporting to public health. The study was conducted in the context of the Minnesota Electronic Disease Surveillance System (MEDSS), the public health information system supporting infectious disease surveillance in Minnesota. The data quality (DQ) framework by Strong et al. was chosen as the guiding framework for the evaluation.

Methods: The assessment of the data exchange process for electronic lab reporting and its impact used a mixed methods approach, with qualitative data obtained through expert discussions and quantitative data obtained from queries of the MEDSS system.
Interviews were conducted in an open-ended format from November 2017 through February 2018. Based on these discussions, two high-level categories of the data exchange process that could impact data quality were identified: onboarding for electronic lab reporting and internal data exchange routing. These categories comprised ten critical steps, whose impact on data quality was identified through expert input. This was followed by analysis of data in MEDSS against various criteria identified by the informatics team.

Results: All DQ metrics (Intrinsic DQ, Contextual DQ, Representational DQ, and Accessibility DQ) were impacted in the data exchange process, with varying influence on DQ dimensions. Some errors, such as improper mapping in electronic health records (EHRs) and laboratory information systems, had a cascading effect and can pass through technical filters undetected until the data are used by epidemiologists. Some DQ dimensions, such as accuracy, relevancy, value-added data and interpretability, are more dependent on users at either end of the data exchange spectrum: the relevant clinical groups and the public health program professionals. The study revealed that data quality is dynamic, and on-going oversight is a combined effort by the MEDSS informatics team and review by technical and public health program professionals.

Conclusion: With increasing electronic reporting to public health, there is a need to understand the current processes for electronic exchange and their impact on quality of data. This study focused on electronic laboratory reporting to public health and analyzed both onboarding and internal data exchange processes.
Insights gathered from this research can be applied to other current public health reporting (e.g. immunizations) and will be valuable in planning for electronic case reporting in the near future.

Keywords: public health informatics, public health surveillance, disease notification, communicable diseases, electronic laboratory reporting, electronic health records

*Correspondence: sripriya@umn.edu
DOI: 10.5210/ojphi.v10i2.9317

Copyright ©2018 the author(s)
This is an Open Access article. Authors own copyright of their articles appearing in the Online Journal of Public Health Informatics. Readers may copy articles without permission of the copyright owner(s), as long as the author and OJPHI are acknowledged in the copy and the copy is used for educational, not-for-profit purposes.

Introduction

Past [1] and present [2] national initiatives that promote electronic health records (EHRs) also advocate for the electronic exchange of data across various healthcare sectors using nationally recommended standards [3]. The critical role of public health in the context of disease surveillance is recognized by these regulations, with recommendations for electronic laboratory reporting (ELR). ELR refers to the electronic transmission of laboratory results related to reportable conditions to public health [4]. The emphasis on interoperability in recent legislation [5] and roadmaps [6] is sharpening the focus on electronic movement of data across healthcare settings. Many public health agencies have seen a trend towards centralization of information technology services, which adds another layer of complexity to interoperability efforts. Given this landscape, it is essential to understand the process of data exchange and its impact on the quality of the data being transmitted, as this is a crucial step in interoperability. In addition, this holds broad implications for future priority transactions such as electronic case reporting to public health.
Initial research around ELR compared paper-based reports to electronic transmissions and found a predominantly positive impact of ELR [7,8] on two specific metrics of data quality: timeliness and completeness. Subsequent studies assessed the role of intermediaries such as Health Information Exchanges (HIE) [9-11] in facilitating ELR and reported better completeness of data with HIE support. More recently, studies have begun to focus on provider reporting of notifiable diseases [12,13], as moving to electronic case notification [14-16] along with ELR would be great progress in supporting overall public health disease surveillance. Challenges in the adoption and use of recommended codes [17-19] and the need for an informatics-savvy workforce [20] were identified as some of the issues in the move towards ELR [21]. A recurring theme across these studies was assessing the quality of data, including exploring new avenues to measure [22-24] and improve [25] it. Timeliness and completeness were the two dimensions of data quality (DQ) most often evaluated. Metrics from DQ frameworks published in the literature can be used as guidance in identifying additional parameters for assessment.

The data quality assessment framework by Kahn et al. [26] identifies three DQ categories (conformance, completeness and plausibility), along with verification and validation as two DQ assessment contexts. The DQ framework by Strong et al. proposes a broad conceptualization of the quality of data from the perspective of data consumers. It defines high quality data as data that are fit for use, and emphasizes the context around data production and usage.
Strong's framework proposes four DQ categories (Intrinsic DQ, Contextual DQ, Representational DQ, Accessibility DQ) comprising fifteen DQ dimensions [27,28]: Intrinsic DQ (Accuracy, Objectivity, Believability, Reputation); Contextual DQ (Relevancy, Value-Added, Timeliness, Completeness, Amount of data); Representational DQ (Interpretability, Ease of understanding, Concise representation, Consistent representation); and Accessibility DQ (Accessibility, Access security). The strength of this framework is the breadth of its DQ characteristics. Data quality is a multi-dimensional concept dependent on a multitude of factors, and adoption of data standards facilitates DQ but does not guarantee it [29]. Good quality data that meet many of the DQ dimensions are critical for public health surveillance purposes. With increasing electronic data exchange and emphasis on interoperability, it is essential to understand the impact of the various facets of data exchange on the various dimensions of DQ.

The Minnesota Electronic Disease Surveillance System (MEDSS) [30] is the public health information system supporting infectious disease surveillance at the state level for Minnesota and has been operational since 2008. It holds data on reportable conditions and receives ELRs submitted to the state public health agency. MEDSS is used for case management, contact tracing and to support outbreak investigations. Its scope has expanded to include non-infectious conditions such as blood lead surveillance and birth defects. It is a person-centric surveillance system which currently holds ~1,279,986 events across infectious disease, lead, and community and family health programs. Approximately 153,880 lab tests/results were reported electronically for 2017 across six health systems and four reference labs. Many healthcare systems are currently on a waiting list either for onboarding/moving to electronic exchange or for upgrading to a better version of the reporting standard.
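For readers who want to operationalize Strong's framework, the four categories and fifteen dimensions above can be captured as a simple lookup. This is an illustrative sketch, not an artifact of the study; the `categorize` helper is hypothetical and simply tags a DQ dimension with its category, as the tables in the Results later do:

```python
# Strong et al.'s four DQ categories and fifteen dimensions as a lookup table.
DQ_FRAMEWORK = {
    "Intrinsic DQ": {"Accuracy", "Objectivity", "Believability", "Reputation"},
    "Contextual DQ": {"Relevancy", "Value-Added", "Timeliness",
                      "Completeness", "Amount of data"},
    "Representational DQ": {"Interpretability", "Ease of understanding",
                            "Concise representation", "Consistent representation"},
    "Accessibility DQ": {"Accessibility", "Access security"},
}

def categorize(dimension: str) -> str:
    """Return the DQ category a given dimension belongs to."""
    for category, dims in DQ_FRAMEWORK.items():
        if dimension in dims:
            return category
    raise ValueError(f"Unknown DQ dimension: {dimension}")

print(categorize("Completeness"))   # Contextual DQ
```

A structure like this makes it easy to roll findings up from dimension level (e.g. Completeness) to category level (Contextual DQ), which is how the impacts are reported in Tables 1 and 2.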
Nationally recommended standards for ELR [4] comprise HL7 v2.5.1 for the message format and LOINC [31] and SNOMED [32] codes for representation of lab tests and results, respectively. With increasing demands for electronic data exchange, both for incoming data to MEDSS from clinical sectors and for outgoing data to the Centers for Disease Control and Prevention (CDC), new informatics tools to support data validation and exchange were implemented. The objective of this study was to assess the data exchange process and to understand its impact on the quality of data in MEDSS. The overarching goal is to use the findings to improve informatics tools and processes and to enhance the value of MEDSS by providing good quality data to support various public health purposes, including disease surveillance.

Methods

The assessment of the data exchange process for electronic lab reporting and its impact used a mixed methods approach, with qualitative data obtained through expert discussions and quantitative data obtained from queries of the MEDSS system. Various subject matter experts (n=9) were identified, spanning the informatics team that supports MEDSS operations, public health program professionals who are users of the MEDSS system and its data, and the Information Technology (IT) team which supports the data exchange process. The focus included both onboarding (the process of shifting to electronic exchange, for either new reporting or migration/upgrade to a different standard) and on-going submissions. ELR is unique in that reporting can occur from either an EHR or a LIMS (Laboratory Information Management System), and from either a healthcare delivery organization or a reference laboratory; these variations were taken into consideration.
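The HL7 v2.5.1/LOINC/SNOMED standards described above can be made concrete with a minimal sketch of extracting the LOINC test code (OBX-3) and the SNOMED result code (OBX-5) from a pipe-delimited ELR message. The sample message is hypothetical and heavily abbreviated, and the parser is illustrative only (it assumes default delimiters and coded results), not part of the MEDSS tooling:

```python
def parse_obx(message: str):
    """Extract (LOINC test code, SNOMED result code) pairs from the OBX
    segments of a pipe-delimited HL7 v2.x ELR message."""
    pairs = []
    for segment in message.strip().split("\r"):
        fields = segment.split("|")
        if fields[0] != "OBX":
            continue
        test = fields[3].split("^")      # OBX-3: observation identifier
        result = fields[5].split("^")    # OBX-5: observation value (coded)
        pairs.append((test[0], result[0]))
    return pairs

# Hypothetical, abbreviated ELR message (not a real submission):
msg = ("MSH|^~\\&|LAB|ACME|MEDSS|MDH|20180115||ORU^R01|42|P|2.5.1\r"
       "OBX|1|CWE|11475-1^Microorganism identified^LN||"
       "27268008^Salmonella^SCT||||||F")
print(parse_obx(msg))   # [('11475-1', '27268008')]
```

Real ELR messages carry many more segments and edge cases (repeating fields, non-coded results, escape sequences), which is part of why the onboarding and validation steps described next are needed.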
Interviews were conducted in an open-ended/discussion format from November 2017 through February 2018. Based on these discussions, two high-level categories of the data exchange process that could impact data quality were identified: onboarding for electronic lab reporting and internal data exchange routing.

Figure 1 displays the ELR onboarding process and includes the testing and validation suite of tools offered in the public domain by the National Institute of Standards and Technology (NIST) [33]. The six identified key processes that influence quality of data are numbered A through F:
A - mapping of tests and results to appropriate codes
B - NIST test bed for testing of messages
C - submit test HL7 messages
D - solicit HL7 messages with test cases (e.g. specific tests, seasonal diseases)
E - technical review
F - program review

Figure 2 displays the internal data exchange routing process, which includes the PHIN Messaging System (PHIN MS) [34], CDC-provided software that serves as a transport mechanism for effective movement of messages. This part comprises four main components numbered G through J:
G - PHIN MS
H - lab code list database validation
I - Rhapsody® Integration Engine [35] rules
J - mapping in MEDSS

Figure 1: Overview of ELR Onboarding Process

The potential influence of the ten identified critical steps in the data exchange process on the quality of data was identified through expert input, using Strong's DQ framework as guidance. This was followed by analysis of data in MEDSS against criteria identified by the informatics team. Evaluation of messages not mapped to any disease program in MEDSS was identified as a priority.
Next, the completeness of the race and ethnicity fields was assessed before and after implementation of the demographic data import feature in ELR. Using influenza reporting as a scenario, the number of non-reportable tests that get submitted and added to data in MEDSS was examined. Finally, the number of incoming messages rejected due to errors was examined to quantify the need for additional technical assistance.

Figure 2: Overview of Internal Data Exchange Routing Process

Results

The process of exchanging data electronically is iterative and begins with numerous rounds of message testing and varying gradations of technical assistance based on data submitter need and capabilities. Each step in the process was deemed critical in its impact on the quality of the data which moves between the clinical sector and public health. Table 1 lists the six identified key processes for ELR onboarding, relevant sub-processes/notes and their influence, including both DQ metric and DQ dimension. All DQ metrics (Intrinsic DQ, Contextual DQ, Representational DQ, and Accessibility DQ) were impacted, with varying influence on DQ dimensions. Some errors, such as improper mapping on the EHR end, had a cascading effect and can pass through technical filters undetected until the data are used by epidemiologists. Some DQ dimensions, such as accuracy, relevancy, value-added data and interpretability, are more dependent on users at either end of the data exchange spectrum: the relevant clinical groups and the public health program professionals.

Table 1: Onboarding for Electronic Lab Reporting and Data Quality

A. Mapping of tests and results to appropriate codes
   - Completed in the clinical healthcare space (EHR system and LIMS)
   Impact: Intrinsic DQ (Accuracy, Objectivity)

B. Test messages using NIST test bed
   - Ability to map content to HL7 fields
   - Capability to submit data in R (required) fields
   Impact: Contextual DQ (Completeness)

C. Submit HL7 test messages to MEDSS
   - Capability to submit data in R (required) fields
   - Complete RE (required, but may be empty) and O (optional) fields
   Impact: Contextual DQ (Completeness, Value-added data)

D. Solicit HL7 messages with specific tests, seasonal diseases
   - Checking for message formats and codes which may not be present in current HL7 test feeds
   Impact: Contextual DQ (Completeness, Relevancy)

E. Technical review
   - HL7 format checks
   - Review of LOINC codes
   - Review of SNOMED codes
   - Review of LOINC-SNOMED pairs
   - Mapping of code pairs with appropriate disease
   Impact: Contextual DQ (Completeness), Representational DQ (Consistent representation, Interpretability)

F. Program review
   - Confirm mapping of code pairs with diseases
   - Check for positive and negative test results
   - Check for odd messages
   Impact: Intrinsic DQ (Objectivity), Contextual DQ (Completeness), Representational DQ (Interpretability)

Table 2 lists the four identified key processes related to on-going production submissions using the internal data exchange routing and their influence on data quality. Similar to the onboarding process, all DQ metrics (Intrinsic DQ, Contextual DQ, Representational DQ, and Accessibility DQ) were impacted, with varying influence on DQ dimensions. The three steps labelled H (lab code list database validation), I (Rhapsody Integration Engine rules) and J (mapping in MEDSS) were deemed critical, with a high level of need for on-going maintenance. Because laboratory tests are constantly evolving, with new lab codes (LOINC), newly detected organisms (SNOMED), and changes in the code combinations that determine disease, some processes (H, I, J) require frequent review.
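The pair-review logic in step E (and the lab code list validation it feeds) can be illustrated with a minimal sketch. The code pairs, disease names and the `validate_pair` helper are hypothetical examples, not the actual MEDSS code list:

```python
# Hypothetical approved LOINC-SNOMED pairs, each mapped to a disease.
APPROVED_PAIRS = {
    ("11475-1", "27268008"): "Salmonellosis",
}

def validate_pair(loinc: str, snomed: str):
    """Return (accepted, disease) for an incoming LOINC-SNOMED pair."""
    disease = APPROVED_PAIRS.get((loinc, snomed))
    if disease is None:
        # Unapproved or incomplete pairs cannot be assigned to a disease;
        # this is how messages end up in an "other/unknown" bucket.
        return (False, "other/unknown")
    return (True, disease)

print(validate_pair("11475-1", "27268008"))   # (True, 'Salmonellosis')
print(validate_pair("11475-1", ""))           # (False, 'other/unknown')
```

The key design point, reflected in the study's results, is that a missing half of the pair silently degrades disease assignment rather than producing an outright error, which is why the approved list needs frequent updating.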
The analysis also revealed the need for collaboration: some processes depend on coordination across the MEDSS informatics team, information technology (IT) staff and public health program professionals.

Table 2: Data Quality Impact of Internal Data Exchange Routing Process

G. PHIN MS transport
   - Secure messaging platform for transport of messages
   Impact: Accessibility DQ (Access security)

H. Lab code list database validation
   - Check to ensure that the message contains approved code pairs or rules for exemption
   - Update codes and code pairs based on new tests and results
   Impact: Contextual DQ (Completeness), Representational DQ (Consistent representation)

I. Rhapsody Integration Engine rules
   - Fixes the format of incoming messages as per rules
   - Converts messages into the MEDSS-accepted format
   Impact: Representational DQ (Consistent representation)

J. Mapping in MEDSS
   - Assignment of messages to diseases
   Impact: Contextual DQ (Relevancy), Representational DQ (Interpretability)

The results from the analysis of data in MEDSS against the various criteria identified by the informatics team are presented in Table 3. Evaluation of cases not mapped to any disease program and assigned to the "other/unknown" category yielded 952 cases. Assessment of the messages for these cases noted an absence of LOINC and/or SNOMED codes, or of the combination pair needed for disease assignment. Next, the analysis focused on submission of non-reportable respiratory diseases along with reportable conditions (influenza) due to issues with a special lab test panel, and this identified 366 cases.
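A screening rule in the spirit of step I (integration engine rules), applied to the influenza panel scenario just described, might look like the following sketch. The pathogen names and the `screen_panel` helper are illustrative; the actual rules run inside the Rhapsody engine and are not published in the paper:

```python
# Hypothetical set of reportable results for this feed.
REPORTABLE = {"influenza A", "influenza B"}

def screen_panel(results):
    """Keep only reportable pathogens from a multiplex respiratory
    panel; non-reportable results (e.g. adenovirus, coronavirus)
    should not flow into the surveillance system."""
    return [r for r in results if r in REPORTABLE]

panel = ["influenza A", "adenovirus", "coronavirus", "rhinovirus"]
print(screen_panel(panel))   # ['influenza A']
```

Without such a rule, every pathogen on the panel is forwarded, which is consistent with the 366 non-reportable cases the analysis found in MEDSS.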
This was followed by evaluating the number of incoming messages rejected due to errors; currently there is no process that tracks this. The corresponding impacts on data quality metrics from these identified issues are also presented in Table 3. An enhancement was implemented in January 2018 to import demographic data (race, ethnicity) from ELR feeds, and this evaluation is presented in Table 4. Of the 3,651 electronic lab messages received from January through February 2018, data on race were present in 2,310 messages and data on ethnicity in 1,680 messages. Comparison of this new data with race and ethnicity data already in MEDSS, obtained through case reporting and follow-up investigations, revealed 270 messages in which the race from the ELR feed differed from the one currently recorded in MEDSS.

Table 3: Identified Issues, Data Quality Impact and Correlations with Data Exchange Processes

Issue: Non-assignment of messages to diseases
   - Lack of LOINC and/or SNOMED codes; LOINC-SNOMED pair missing / not mapped
   Number of cases (time frame): 952 (currently)
   Data exchange process: testing during ELR onboarding; validation checks with lab code list database
   DQ impact: Contextual DQ (Completeness, Value-added data), Representational DQ (Consistent representation)

Issue: Submission of non-reportable diseases
   - Presence of numerous non-reportable respiratory pathogens (e.g. adenovirus, coronavirus)
   Number of cases (time frame): 366 (over 1 year)
   Data exchange process: testing during ELR onboarding; screening with Rhapsody integration engine rules
   DQ impact: Contextual DQ (Relevancy)

Issue: Missing messages due to rejections
   - Rejection of messages due to format and code issues
   Number of cases (time frame): approx. a few per day (not tracked)
   Data exchange process: validation checks with lab code list database; screening with Rhapsody integration engine rules
   DQ impact: Contextual DQ (Value-added data)

Table 4: Demographic Data from Electronic Lab Reports and Influence on Data Quality

Race data: 2,310 / 3,651 messages (63%), Jan - Feb 2018
Ethnicity data: 1,680 / 3,651 messages (46%), Jan - Feb 2018
Data quality enhancement: Contextual DQ (Completeness, Value-added data)

Discussion

Federal regulations and incentives have offered the needed momentum towards electronic reporting to public health. However, there are differences across public health measure reporting [36], with ELR lagging behind immunization reporting due to the complexity of the multitude of labs associated with reportable conditions, slow adoption of recommended codes, and the multiple entities/professionals involved in the exchange, such as clinical labs, reference labs, ordering providers, infection control practitioners and disease epidemiologists. Another key factor to consider is that ELR can be generated from EHRs or from laboratory information systems (LIS), in reference labs or in healthcare settings. This study also portrays the need for constant updates to the various validation tools to ensure errors are not propagated along the data exchange chain. This research points to the complexity of the data exchange process by illustrating the numerous stakeholders involved and the critical role each one plays in moving towards interoperability. It also points to the need for all data exchange partners to be informed of the evolution of standards, both message formats (e.g. HL7) and codes (e.g. LOINC, SNOMED). Some of these exchange mechanisms require technical assistance for the submitter (e.g.
labs, providers), the receiver (e.g. public health), or both. National projects such as Digital Bridge [37] and APHL Informatics Messaging Services (AIMS) [38] aim to assist with data exchange across jurisdictional boundaries in public health. The data exchange process could be configured so that messages are rejected if they fail any of the checks, but this would require manual intervention by public health or the data reporters to understand the quality issues behind each rejection and fix them. The study also presents various testing tools (NIST test bed) and validation engines (Rhapsody, lab code list validation database) that help to automate quality checks and monitor various DQ dimensions. Approaches from other public health reporting, such as the provider quality reports generated for immunizations [39], could be tried in the context of ELR. Likewise, open source software tools have been proposed to support data quality checks for both immunization reporting [39] and ELR [23,40]. Implementation and maintenance of these tools require both financial and technical resources. Importantly, there needs to be overarching guidance and support from national organizations such as CDC to ensure standardization and to facilitate sharing of tools/resources across jurisdictions. The study revealed that data quality is dynamic, and on-going oversight is a collaborative effort by the MEDSS informatics team and technical and public health program professionals. Overall, maintaining good data quality in the context of ELR needs a multipronged approach: automated tools, education of data exchange partners, technical assistance, regular updates of codes/tools, organizational commitment and national guidelines, along with support from informaticians/data quality analysts.
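As one example of such an automated check, the field-level completeness figures reported in Table 4 reduce to a simple computation. This is a sketch using the paper's published counts; the actual MEDSS tooling is not described at this level of detail:

```python
def completeness(n_present: int, n_total: int) -> int:
    """Percent of messages in which a field was populated."""
    return round(100 * n_present / n_total)

# Counts from Table 4 (ELR feed, Jan - Feb 2018):
print(completeness(2310, 3651))   # 63  (race)
print(completeness(1680, 3651))   # 46  (ethnicity)
```

Running a check like this per field and per submitter, on a schedule, is one way to turn a one-off evaluation into the kind of on-going data quality monitoring the study recommends.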
This research depicts the details of the processes, people and technology involved, and the need for all the parts to align to make electronic data exchange truly meaningful by providing good quality data that fit the purpose (public health surveillance, in this case). It highlights the benefits of standardizing data exchange processes, which can be applied to other public health transactions. It underscores the value of having a public health informatician be part of the electronic exchange of data between the various sectors (clinical care, labs) and public health. Finally, this study presents a compelling picture of the interoperability endeavor as a team effort and underscores the critical role an informatics team can play in facilitating the data exchange process.

Limitations

The study has some limitations, and its focus on only some of the data quality dimensions of Strong et al. is one of them. Some DQ aspects, such as accessibility, are not integrated with the exchange process and hence were excluded. The research emphasis was determined by criteria outlined by the MEDSS informatics team and was limited by the data available during the study period. Some metrics were not tracked, and certain tool enhancements had been implemented only recently by the IT support team, so their evaluation was limited. Another limitation is that currently a large volume of ELR submitters are reference labs, which are not required to collect race and ethnicity data; hence completeness of those data fields through ELR is limited.
Some DQ errors are attributable to the frequency of upgrades of the codes/validation engine, which is driven by organizational resources (finances, trained personnel) and institutional priorities, and is beyond the scope of this study.

Conclusion

With the growing demands for electronic reporting to public health, there is a need to understand the current processes supporting electronic exchange and their impact on quality of data. This study focused on electronic laboratory reporting to public health and analyzed both onboarding and internal data exchange processes. Insights gathered from this research can be applied to other current public health reporting (e.g. immunizations) and will be valuable in planning for electronic case reporting in the near future. The study has potential implications for promoting data quality along with electronic exchange to support public health surveillance.

Acknowledgements

The authors would like to thank the members of the MEDSS technical team for discussions around the data exchange processes and the various public health program professionals for their time and valuable input.

References

1. Centers for Medicare and Medicaid Services. EHR Incentive Programs. 2010. Available from: http://www.cms.gov/ehrincentiveprograms. Accessed October 27, 2017.
2. Centers for Medicare and Medicaid Services. Advancing Care Information, Quality Payment Program. 2016. Available from: https://qpp.cms.gov/measures/aci. Accessed October 27, 2017.
3. Office of the National Coordinator for Health Information Technology. 2015 Edition Health Information Technology (Health IT) Certification Criteria, 2015 Edition Base Electronic Health Record Definition, and ONC Health IT Certification Program Modifications. 2015. Available from: https://www.federalregister.gov/articles/2015/03/30/2015-06612/2015-edition-health-information-technology-health-it-certification-criteria-2015-edition-base.
4. Centers for Disease Control and Prevention. Electronic Laboratory Reporting. 2010.
Available from: https://www.cdc.gov/ehrmeaningfuluse/elr.html. Accessed February 16, 2018.
5. U.S. Department of Health & Human Services. 21st Century Cures Act. 2016. Available from: https://www.congress.gov/114/plaws/publ255/PLAW-114publ255.pdf. Accessed February 2, 2018.
6. Office of the National Coordinator for Health Information Technology. Connecting Health and Care for the Nation: A Shared Nationwide Interoperability Roadmap. 2015. Available from: https://www.healthit.gov/sites/default/files/hie-interoperability/nationwide-interoperability-roadmap-final-version-1.0.pdf.
7. Overhage JM, Grannis S, McDonald CJ. 2008. A comparison of the completeness and timeliness of automated electronic laboratory reporting and spontaneous reporting of notifiable conditions. Am J Public Health. 98(2), 344-50. https://doi.org/10.2105/AJPH.2006.092700
8. Nguyen TQ, Thorpe L, Makki HA, Mostashari F. 2007. Benefits and barriers to electronic laboratory results reporting for notifiable diseases: the New York City Department of Health and Mental Hygiene experience. Am J Public Health. 97(Suppl 1), S142-45. https://doi.org/10.2105/AJPH.2006.098996
9. Dixon BE, McGowan JJ, Grannis SJ. Electronic laboratory data quality and the value of a health information exchange to support public health reporting processes. Proceedings of AMIA Annual Symposium. 2011:322-30.
10. Dixon BE, Grannis SJ, Revere D. 2013. Measuring the impact of a health information exchange intervention on provider-based notifiable disease reporting using mixed methods: a study protocol. BMC Med Inform Decis Mak. 13, 121. https://doi.org/10.1186/1472-6947-13-121
11.
Revere D, Hills RH, Dixon BE, Gibson PJ, Grannis SJ. 2017. Notifiable condition reporting practices: implications for public health agency participation in a health information exchange. BMC Public Health. 17(1), 247. https://doi.org/10.1186/s12889-017-4156-4
12. Lai PT, Johns JE, Kirbiyik U, Dixon BE. 2015. Timeliness of Chlamydia Laboratory and Provider Reports: A Modern Perspective. Online J Public Health Inform. 7(1).
13. Dixon BE, Zhang Z, Lai PTS, Kirbiyik U, Williams J, et al. 2017. Completeness and timeliness of notifiable disease reporting: a comparison of laboratory and provider reports submitted to a large county health department. BMC Med Inform Decis Mak. 17(1), 87. https://doi.org/10.1186/s12911-017-0491-8
14. Rajeev D, Staes CJ, Evans RS, Mottice S, Rolfs R, et al. 2010. Development of an electronic public health case report using HL7 v2.5 to meet public health needs. J Am Med Inform Assoc. 17(1), 34-41. https://doi.org/10.1197/jamia.M3299
15. Rajeev D, Staes C, Evans RS, Price A, Hill M, et al. Evaluation of HL7 v2.5.1 electronic case reports transmitted from a healthcare enterprise to public health. Proceedings of AMIA Annual Symposium. 2011:1144-52.
16. Staes C, Loonsk J, Turner K, Arzt N, Zarcone-Gagne P. Advancing electronic case reporting (eCR) to enable public health disease control and emergency response: getting into the technical weeds. Proceedings of AMIA Annual Symposium. 2017:336-8.
17. Dixon BE, Siegel JA, Oemig TV, Grannis SJ. 2013. Towards Interoperability for Public Health Surveillance: Experiences from Two States. Online J Public Health Inform. 5(1). https://doi.org/10.5210/ojphi.v5i1.4395
18. Dixon BE, Vreeman DJ, Grannis SJ. 2014. The long road to semantic interoperability in support of public health: experiences from two states. J Biomed Inform. 49, 3-8. https://doi.org/10.1016/j.jbi.2014.03.011
19. Gamache RE, Dixon BE, Grannis S, Vreeman DJ.
Impact of selective mapping strategies on automated laboratory result notification to public health authorities. Proceedings of AMIA Annual Symposium. 2012:228-36.
20. Dixon BE, Gibson PJ, Grannis SJ. 2014. Estimating increased electronic laboratory reporting volumes for meaningful use: implications for the public health workforce. Online J Public Health Inform. 5(3), 225. https://doi.org/10.5210/ojphi.v5i3.4939
21. Overhage JM, Suico J, McDonald CJ. 2001. Electronic laboratory reporting: barriers, solutions and findings. J Public Health Manag Pract. 7(6), 60-66. https://doi.org/10.1097/00124784-200107060-00007
22. Dixon BE, Lai PT, Grannis SJ. Variation in information needs and quality: implications for public health surveillance and biomedical informatics. Proceedings of AMIA Annual Symposium. 2013:670-9.
23. Dixon BE, Duke J, Grannis S. 2017. Measuring and Improving the Quality of Data Used for Syndromic Surveillance. Online J Public Health Inform. 9(1). https://doi.org/10.5210/ojphi.v9i1.7623
24. Dixon BE, Rosenman M, Xia Y, Grannis SJ. 2013. A vision for the systematic monitoring and improvement of the quality of electronic health data. Stud Health Technol Inform. 192, 884-88.
25. Dixon BE, Siegel JA, Oemig TV, Grannis SJ. 2013. Electronic health information quality challenges and interventions to improve public health surveillance data and practice. Public Health Rep. 128(6), 546-53. https://doi.org/10.1177/003335491312800614
26. Kahn MG, Callahan TJ, Barnard J, Bauck AE, Brown J, et al. 2016. A Harmonized Data Quality Assessment Terminology and Framework for the Secondary Use of Electronic Health Record Data. EGEMS (Wash DC). 4(1), 1244. https://doi.org/10.13063/2327-9214.1244
27. Strong DM, Lee YW, Wang RY. 1997. Data Quality in Context. Commun ACM. 40(5), 103-10. https://doi.org/10.1145/253769.253804
28. Wang RY, Strong DM. 1996. Beyond Accuracy: What Data Quality Means to Data Consumers. Journal of Management Information Systems. 12(4), 5-33.
29. Nahm M, Hammond WE. 2013. Data standard not equal data quality. Stud Health Technol Inform. 192, 1208.
30. Minnesota Department of Health. Minnesota Electronic Disease Surveillance System (MEDSS). 2008. Available from: http://www.health.state.mn.us/divs/istm/medss/index.html. Accessed February 19, 2018.
31. Regenstrief Institute. LOINC (Logical Observation Identifiers Names and Codes). 1994. Available from: https://loinc.org/. Accessed February 24, 2018.
32. SNOMED International. SNOMED CT. 2016. Available from: https://www.snomed.org/. Accessed February 24, 2018.
33. National Institute of Standards and Technology (NIST).
NIST HL7v2 Resource Portal for Electronic Laboratory Reporting. 2018. Available from: https://hl7v2-elr-testing.nist.gov/mu-elr/. Accessed June 27, 2018.
34. Centers for Disease Control and Prevention (CDC). PHIN Messaging System (PHIN MS). 2008. Available from: https://www.cdc.gov/phin/tools/phinms/index.html. Accessed June 27, 2018.
35. Orion Health. Rhapsody Integration Engine. 1993. Available from: https://orionhealth.com/us/products/rhapsody/. Accessed July 21, 2018.
36. Office of the National Coordinator for Health Information Technology. Hospital Reporting on Meaningful Use Public Health Measures in 2014. 2015. Available from: https://www.healthit.gov/sites/default/files/databrief22_hospitalreporting.pdf. Accessed February 2, 2018.
37. Digital Bridge. 2018. Available from: http://www.digitalbridge.us/. Accessed February 2, 2018.
38. Association of Public Health Laboratories (APHL). APHL AIMS Platform. 2011. Available from: https://www.aphl.org/programs/informatics/pages/aims_platform.aspx. Accessed February 2, 2018.
39. American Immunization Registry Association (AIRA). IIS Data Quality Practices: Monitoring and Evaluating Data Submissions. 2017. Available from: http://repository.immregistries.org/files/resources/59cabe6404421/data_quality_phase_ii_9_26_17_final.pdf. Accessed July 21, 2018.
40. Observational Health Data Sciences and Informatics (OHDSI). Automated Characterization of Health Information at Large-scale Longitudinal Evidence Systems (ACHILLES). 2015. Available from: http://www.ohdsi.org/web/achilles. Accessed July 21, 2018.