Metropolitan Universities Vol. 28 No. 4 (November 2017), DOI: 10.18060/21742

What Does it Take for Practitioners to Use Data to Change Behavior in Collective Impact?

Jeff Raderstrong, JaNay Queen Nazaire

Abstract

The use of data to track and manage progress is critical to a collective impact initiative achieving results or understanding impact. Yet little research has been done to determine how collective impact practitioners can effectively use data. This article—drawing on a literature review, semi-structured interviews with experts on performance management and collective impact, and Living Cities’ experience with over 70 collective impact initiatives—outlines five steps for practitioners to grow their initiative’s capacity to use data: Agree on the Data; Find the Data; Present the Data; Discuss and Learn from the Data; Change Behavior and Share Responsibility.

Keywords: Community change; continuous improvement; cross-sector partnerships; database management

Introduction

Collective impact initiatives need to use data to inform decisions, drive direction, and manage progress (Kania & Kramer, 2011). The continuous use of data grounds the collective impact work and illuminates for partners when and how they should change their behavior to achieve results. The use of data for collective impact can be thought of as a continuous “feedback loop” that supports behavior change and influences systemic change in communities to create large-scale social change (Boyea-Robinson, 2016). Living Cities’ experience with over 70 collective impact initiatives has shown that embedding a data-driven feedback loop can be challenging. The many challenges associated with using data to change behavior all fundamentally come down to questions of the capacities, skills, and discipline required to manage a data-driven process.
To better understand what it takes to use data for collective impact, we interviewed 17 experts and practitioners on performance management for collective impact.

Background

Living Cities, a collaborative of 18 of the world’s largest foundations and financial institutions, approaches all of its field-level research based on the needs of grantee partners in 105 cities and metropolitan areas across the US. These efforts focus on a variety of content areas, from education and economic development to workforce development and health. A majority of time is spent with grantee partners identifying areas of needed support to achieve large-scale community change. Because the different collective impact initiatives often face similar challenges across geographies and content areas, Living Cities invests resources in identifying solutions to those challenges.

Living Cities was one of the first organizations to encourage the use of a collective impact framework through its funding initiatives (Kania, Hanleybrown, & Splansky Juster, 2014). Within our support of collective impact initiatives, we have explicitly focused on investing in and supporting the use of data to achieve large-scale community change (Living Cities, 2016). Among other approaches, Living Cities has used the Results Based Accountability (RBA) framework to help collective impact practitioners think through their outcomes and the data necessary to track performance over time (Friedman, 2015). RBA helps collective impact initiatives develop a framework for measuring, tracking, and managing performance against large-scale results. While the RBA framework and associated tools were helpful to collective impact leaders, it became clear there was a need for greater understanding of how to invest in the capacity of collective impact initiatives to effectively collect, use, and manage data throughout the lifecycle of an initiative.
Methodology

Exploring what capacities are required to use data for collective impact occurred in four stages. The initial stage consisted of an online literature review, using keyword searches on data and performance management for collective impact and limited to work published after the original Collective Impact article was released in 2011. The review found 19 articles and online resources, and we identified themes across them. Several focused on performance management within the context of the social sector (Park, Hironaka, Carver, & Nordstrum, 2013; Zhang & Winkler, 2015; Gillespie, 2014; Walker & Moore, 2011), and several offered performance management case studies within the context of collective impact (Edmondson, 2014; Perkins, 2014; StriveTogether, undated; Kuhlmann, 2016). However, there were few examples of research that explicitly explored what collective impact practitioners need to invest in to effectively use and manage data within the context of collective impact.

With this baseline understanding from the literature, the research expanded to a second phase, consisting of interviews with practitioners located in geographies around the country, as well as with intermediary organizations that support collective impact initiatives, such as funders or research organizations. This group had representatives in positions ranging from research assistants to executive directors of programs or organizations. Less than a quarter of the interviewees were receiving direct funding from Living Cities at the time. Living Cities conducted semi-structured interviews with 17 individuals with experience using data to achieve social outcomes and/or experience leading or supporting collective impact initiatives (Figure 1).
Some of these interviewees had experience with tracking and using data to achieve social outcomes, but not necessarily through collective impact; some had experience with collective impact but not necessarily with tracking and managing data; and some had experience with both (for a full list of first and second round interviewees, visit: https://www.livingcities.org/work/data-collective-impact/about). Several follow-up interviews or email exchanges were conducted with these interviewees to obtain further feedback on the research presented in this article as it was developed.

1. What capacities do you think are needed to use data in collective impact?
2. In particular, what capacities do you think are needed to use data for continuous improvement in collective impact?
3. What challenges have you seen for using data in collective impact?
4. What are some potential solutions to solving these challenges?
5. What are good examples of CI initiatives using data well?
6. Other resources?

Figure 1. Initial interview questions.

After completing the literature review and gathering data from the first set of interviews, we synthesized major themes and drafted a first iteration of the actions needed to use data while undertaking a collective impact initiative. The third phase of research consisted of additional semi-structured interviews with practitioners of collective impact to better understand how the elements apply to the exercise of collective impact in communities. Five semi-structured interviews were conducted with seven practitioners to obtain their feedback on the elements and to capture stories from their work illustrating the application of each element (Figure 2). These practitioners were all either currently or formerly involved in a Living Cities-supported collective impact initiative, and all worked with or for a backbone organization in some capacity.
These practitioners had indicated an interest in learning more about using data for collective impact, and had experience doing so. We framed the elements as “steps” in that interview protocol to help practitioners conceptualize how the elements would apply to their day-to-day work.

1. We’ve identified 5 basic steps for how to use data in collective impact to achieve a shared result:
   a. Agree on the Data: Agree with partners on population metrics that track to a shared result, as well as program-specific metrics to understand how programs are contributing to changes.
   b. Find the Data: If the data exist, figure out how to access them. If they don’t, figure out how to develop them.
   c. Present the Data: Take the data and make them digestible, understandable and actionable.
   d. Discuss the Data: The data will tell a story—determine what’s behind that story and, more importantly, what to do about it.
   e. Change Behavior and Share Responsibility: Make sure everyone in the collective impact initiative is actually changing behavior based on the data.
2. Do these steps resonate with your work? Is there something missing?
3. What are the roles and responsibilities you’ve seen required to complete some or all of these steps?
4. What are the main successes and challenges you’ve seen?
5. What are other challenges and opportunities you see for using data in collective impact?
6. What resources or tools have been most helpful for you to encourage your partners to share and/or use data?

Figure 2. Subsequent practitioner interview protocol.

The fourth and final phase of this research was to translate the research findings into an online blog series on LivingCities.org to further validate the elements with a larger body of collective impact practitioners. The blog series ran over the course of four months and was marketed to collective impact practitioners around the country.
Digital metrics from the blog series show that the elements presented generally resonate with the experience of collective impact practitioners. Almost 600 people subscribed to the blog series through an e-newsletter service, and the blog posts outperformed other blogs on LivingCities.org: one post was the most read on the website during the series’ time period, and three posts were among the top ten most read.

Results

From the research, we found five elements necessary for effectively using data for collective impact: Agree on the Data; Find the Data; Present the Data; Discuss and Learn from the Data; Change Behavior and Share Responsibility. Within each of these elements, the research also identified many different “capacities” that collective impact initiatives need to invest in to use data effectively. Because interviews focused mostly on performance management and measurement techniques, use of quantitative data dominated the discussions. However, the use of qualitative and “lived experience” data is critical to collective impact (Raderstrong & Boyea-Robinson, 2016). Any absence of findings on qualitative data in the discussion below is only because the best use of qualitative data for collective impact is outside the scope of this research.

The remainder of this paper explores the five elements and the capacities required to implement each. These elements were presented as linear “steps” in the blog series, but should be viewed as dynamic actions that can occur throughout the lifecycle of an initiative. Investment in one element can be seen as independent from the others, but there are intersections among the elements that require consideration of each. Focus on one area may also surface issues in another, requiring collective impact leaders to shift resources to invest in new or different capacities.
Discussion: Five Elements for Effectively Using Data for Collective Impact

Agree on the Data

Agreement on what data are needed is a foundational element of using data for collective impact. This first requires establishing a “shared result” (a term often used in Living Cities programmatic support) or “common agenda” (Kania & Kramer, 2011). This shared result is the “north star” of an initiative and should be ambitious but attainable, such as reducing unemployment by 10% in 10 years (Raderstrong & Perkins, 2015). With this shared result in place, a collective impact initiative can then build out a “data-driven feedback loop” to track progress towards achieving it (Figure 3). The feedback loop consists of five components: key drivers; 3-6 year outcomes; 6-10 year outcomes; a shared result; and strategies. Key drivers are metrics that collective impact initiatives use to measure the impact of strategies, programs, activities, and systems. Outcomes are measures that illustrate changes in the population. The outcomes are divided into 3-6 year outcomes and 6-10 year outcomes to allow the collective impact initiative to think of its work in phases: some progress can be seen early through changes in data, and some can only be seen after several years. Strategies are a coherent collection of activities, ideas and programs that drive changes towards a shared result.

Figure 3. Data-Driven Feedback Loop.

Reaching collective agreement on the metrics for a collective impact initiative’s data-driven feedback loop requires intensive focus. To secure this agreement on what data to track, a collective impact initiative should invest in three capacity areas: facilitation skills; research and analysis skills; and an approach to continuous improvement.

Facilitation Skills. A strong facilitator is required to get all partners to agree on the initiative’s shared result, and on the best measures to track that shared result.
Often collective impact initiatives will use an outside individual or contractor to facilitate an initial meeting or set of meetings to come to agreement on which metrics will make up their data-driven feedback loop. But maintaining this agreement on measures requires ongoing relationship management as the collective impact work evolves, so if collective impact initiatives rely on an outside facilitator, they should supplement that individual with their own capacity to manage ongoing relationships.

Research and Analysis Skills. To get agreement on data, collective impact initiatives need to start with an initial list of measures that have been tested and verified in specific focus areas, such as obesity rates or high school graduation rates. Using these initial measures as a starting point can help the facilitation of agreement discussed above. Many initiative partners already have a sense of what measures make sense, but some basic research skills are needed to find and analyze a reasonable list of initial measures.

The identification of appropriate measures should arise through the collective knowledge of the partners. Depending on the agreement of the members, the initial identification of measures may occur through the efforts of the backbone organization, a data committee, or a discussion of all the partners. For instance, the Seattle/King County member of The Integration Initiative, Communities of Opportunity, has created a data committee made up of staff from partner organizations with research and data analysis experience. This committee was pulled together after the initiative agreed on an initial framework for their data-driven feedback loop, but its members have been able to go deeper into this framework to revise the measures as needed.

An Approach to Continuous Improvement. The science of continuous improvement is another fundamental part of using data in collective impact (Park, Hironaka, Carver, & Nordstrum, 2013).
There are many different tools and approaches to continuous improvement, and collective impact initiatives need to pick the “right” approach for them based on local context. Living Cities uses several different approaches to continuous improvement. It uses Results Based Accountability (RBA) with our Integration Initiative sites as a way to set up the data-driven feedback loop explained above. RBA in particular helps collective impact initiatives connect the longer-term outcomes to the programmatic metrics or key drivers. Living Cities’ Prepare Learning Circle sites have used the A3 tool for their continuous improvement processes, which incorporates another continuous improvement method called “Plan, Do, Study, Act.” The A3 tool is designed to produce data to be fed into a data-driven feedback loop, so collective impact initiatives relying only on the A3 tool can miss the bigger picture of their work. Other initiatives, such as the Strive Partnership, have used Six Sigma. This approach has been popular in large corporations, particularly those focused on manufacturing and industry, but can require substantial quantitative capacity to implement successfully.

Find the Data

Finding needed data was one of the most common challenges cited in interviews. All interviewees touched on this challenge in some way or another. Not surprisingly, this challenge is well documented in the literature (StriveTogether & Data Quality Campaign, 2015; NNIP, 2014; Gillespie, 2014; Parkhurst & Preskill, 2014). There are seven different elements a collective impact initiative can invest in to strengthen its capacity to find necessary data:

1. Leadership buy-in and support;
2. A data inventory;
3. Data sharing agreements;
4. Dedicated staff;
5. Surveys;
6. A data partner; and
7. A software platform.

Leadership Buy-In and Support. The Strong Healthy Communities Initiative in Newark, NJ has navigated a data-driven culture shift.
They are accessing and sharing data with partners, but are struggling to get data they can use for forward-looking, strategic decision-making. Much of the data they have access to, particularly from public sector partners, is compliance focused. For example, they know how many students are chronically absent from school, but not the reasons why they are absent. The lack of these strategic data has become a big barrier to achieving their collective impact goals. Getting the “right data” to achieve goals requires commitment from leaders of your collective impact initiative to share data from their work with the other partners. Oftentimes a commitment to using data for improvement requires fundamental shifts in the culture of an organization (Gillespie, 2014), at staff levels from programmatic up to executive leadership.

A Data Inventory. An unfortunate reality is that even with agreement on what data are needed, some of those data may not exist. For example, the Network for Economic Opportunity, a member of The Integration Initiative, decided they wanted to track the number of working-age African American men earning family-sustaining wages. As they began to look into collecting data, they realized they couldn’t really track this metric, and instead chose multiple proxy measures, including non-employment rates, labor force participation rate, and average wage. One way to solve this is to create a data inventory: a list of all the data needed, and how to potentially get them (Queen, 2016). Lack of data shouldn’t slow down an initiative’s use of data, but should instead inform where an initiative invests resources in finding and/or creating data.

Data Sharing Agreements. Formalizing data sharing relationships can set specific expectations for partners and help ensure a collective impact initiative gets the data it needs.
For example, The Integration Initiative member from Detroit shared data across several different partners, but often would get data in the form of PDFs rather than more usable Excel spreadsheet files. Developing what are called “data sharing agreements” can help set expectations—like when to share data and in what format—early on in the life of your collective impact initiative. The National Neighborhood Indicators Partnership (NNIP) outlines four elements of successful formalized data sharing agreements: (a) general introduction; (b) data transmission and content; (c) handling and release of data and analysis; and (d) procedural/contractual issues (NNIP, 2014).

Partners need to be held accountable to the data sharing agreements they sign. The Strong Healthy Communities Initiative example outlined earlier shows how, even with data sharing relationships set, partners may not provide necessary data or exercise the discipline necessary to use them. Maintaining this type of accountability requires dedicated staff to manage data-centric relationships.

Dedicated Staff. Because the continuous collection of data requires significant staff capacity, creating staff positions to manage data collection and analysis can help manage data more effectively. However, this can be challenging, since data capacity is often under-resourced for the level of expertise needed. Dedicated staff for collective impact are usually housed within a centralized backbone organization, but do not have to be. The Neighborhood Developers (TND), the managing partner of Chelsea Thrives in Massachusetts, was able to work within funding constraints to meet data needs by engaging an existing TND staff member in the creation of data dashboard and monitoring tools. That individual has since moved on, and the position is in the process of being filled.
Melissa Walsh, Director of Chelsea Thrives and an interviewee for this paper, said her experience building out their staffing for data management “highlights the importance of strong data systems and the need to train additional program staff on those systems. Effective systems building, documentation and sharing of knowledge are crucial for transitions and ultimately for sustainability.”

Collective impact initiatives often employ a Data Manager or two devoted to the process of using data and continuous improvement. These positions are often the first hire after the initiative director (Collective Impact Forum, 2013; StriveTogether & Data Quality Campaign, 2015). Because collective impact initiatives are often underfunded (StriveTogether, 2013), this dedicated staff capacity is usually held by the initiative director for interim periods. Yet responsibility for the use of data should not fall solely on the dedicated data staff of an initiative or the backbone organization. A central function of the collective impact approach is for individual partners to take responsibility for changing their own behavior (Edmondson, 2015). While dedicated staff can help facilitate this behavior change, ultimately partners need to hold themselves accountable to the use of data as well. More on this will be discussed in the “Change Behavior and Share Responsibility” section.

Surveys. When the Network for Economic Opportunity realized that the data they needed—more information on the needs of the city’s large numbers of unemployed African American men—didn’t already exist, they decided to create a survey to generate those data. The survey not only created a new database of information not previously available to New Orleans or their partners, but also revealed some unexpected results that helped inform their overall planning (Landrieu, 2014). The NEO team illustrated that primary data collection is an option for practitioners.
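Once collected, primary data like these must be tabulated before they can inform planning. A minimal sketch of that step, with entirely hypothetical survey fields and responses (not NEO’s actual instrument):

```python
from collections import Counter

# Hypothetical survey responses; each dict is one respondent.
# Field names and values are illustrative only.
responses = [
    {"employment_status": "unemployed", "barrier": "transportation"},
    {"employment_status": "unemployed", "barrier": "criminal_record"},
    {"employment_status": "part_time",  "barrier": "transportation"},
    {"employment_status": "unemployed", "barrier": "transportation"},
]

def tally(responses, field):
    """Count how often each answer appears for a given survey field."""
    return Counter(r[field] for r in responses)

barriers = tally(responses, "barrier")
# The most frequently cited barrier can surface unexpected patterns,
# as NEO's survey did for its planners.
top_barrier, count = barriers.most_common(1)[0]
print(top_barrier, count)  # transportation 3
```

Even a tally this simple mirrors the real workflow: raw responses come in row by row, and someone must aggregate them before partners can discuss what the data mean.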
Surveys can help collective impact initiatives access both qualitative and quantitative data that provide greater context. Communities of Opportunity in Seattle/King County has used the White Center Community Development Association’s Community Survey as a data source to provide qualitative context to its work reducing health disparities in suburban areas. Results from this survey, and other community data efforts, helped successfully make the case in the state legislature for why an under-resourced neighborhood should be absorbed into the city of Seattle.

One tool to support data collection is the Spark Policy Institute’s “Right Now” Survey. Communities of Opportunity used this tool to surface emerging opportunities, concerns and partner needs, particularly for those not comfortable sharing at in-person meetings. As members got to know and trust each other, the initiative began using the survey as a way to organize observations during meetings. The initiative staff used the data from the survey to take advantage of timely opportunities and inform decisions that addressed partners’ concerns.

A Data Partner. Some collective impact initiatives have found it helpful to work with an external data partner—such as a university—to support their data collection or development. Almost all collective impact initiatives have a partner with some additional research or data capacity at least nominally involved (Parkhurst & Preskill, 2014). But Monique Baptiste-Good, the Executive Director of the Strong Healthy Communities Initiative, said in an interview that, in her experience, relying on a third-party data partner is most useful when an initiative has a large amount of data to analyze.
The data partner can interpret what the data are “saying.” If the data partner is instead developing or creating most of the data for the initiative, rather than gathering them from other sources, that partner could have too much control over what data enter the data-driven feedback loop and what data do not. As with any partner, a data partner should have a specific purpose for being engaged in the collective impact initiative and should be included in planning conversations early on (Raderstrong & Gold, 2015).

A Software Platform. Focusing on creating a software solution first can cause more trouble than it’s worth. Efforts often stall when people spend too much time looking for software, and not enough time trying to use data to improve their activities. Many collective impact initiatives equate the use of data with the need for some kind of software infrastructure, but this is oftentimes “work avoidance” (Heifetz & Laurie, 2007), a way to push off decisions about what data need to be tracked and how to track them. Chelsea Thrives is working with approximately 25 partners to track and share data for the Chelsea Hub Model, an innovative community mobilization model that is a specific strategy within Chelsea Thrives. To do this, they use a widely available tool: Microsoft Excel. Based on interviews and Living Cities’ experience, most collective impact initiatives rely on Excel or other basic software tools. If a collective impact initiative is considering a more complicated software option, it should weigh the tool’s pros and cons carefully before deciding to invest in it (Zhang & Winkler, 2015).

Present the Data

Most people cannot consume raw data, nor do they have the time or capacity to do so. To effectively use data, collective impact initiatives must make them more digestible.
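As a toy illustration of what “making data digestible” can mean in practice (all partner names and numbers are hypothetical), the row-level contents of a shared spreadsheet export can be rolled up into a short summary partners can read at a glance:

```python
import csv
import io

# Hypothetical raw export, as if a shared Excel sheet were saved as CSV.
raw = """partner,quarter,placements
Agency A,Q1,12
Agency A,Q2,18
Agency B,Q1,7
Agency B,Q2,9
"""

# Collapse row-level data into one digestible number per partner.
totals = {}
for row in csv.DictReader(io.StringIO(raw)):
    totals[row["partner"]] = totals.get(row["partner"], 0) + int(row["placements"])

for partner, total in sorted(totals.items()):
    print(f"{partner}: {total} placements")
```

The same roll-up could be done with a pivot table in Excel itself; the point is that someone must do this consolidation step before partners see the data, which is exactly the analytical capacity the next section describes.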
There are three areas that collective impact initiatives can invest in to increase presentation capacity: (a) analytical skills, (b) artistic skills, and (c) framing.

Analytical Skills. The Network for Economic Opportunity, our New Orleans partner in The Integration Initiative, is managing several projects with the goal of increasing employment rates of African-American men. They’re accessing and using data for each project, and some projects have more accessible data than others. Instead of providing raw spreadsheets of data to their partners, the Network spends time analyzing and consolidating these data into more easily digestible graphs and charts (Figure 4).

Figure 4. Example of a Graph from the Network for Economic Opportunity.

Making data presentable in this way usually requires some moderate level of analysis. A base understanding of statistical processes can help synthesize data into a more presentable form. Usually the Data Manager, discussed above, fulfills these analytical functions. The Network for Economic Opportunity has several individuals across City Hall fulfilling data management responsibilities in various ways. Some spend time securing data from partners, and some spend time “massaging” data to make them more presentable. Data should be used to encourage partners to make changes to their work to achieve better outcomes. The process of statistical analysis should not be conflated with the process of assigning meaning to the data. These are two discrete steps, and the latter is crucial to do in collaboration with partners.

Artistic Skills. Presenting data is much more of an art than a science. Communities of Opportunity, the Seattle/King County member of The Integration Initiative, has a lot of data on its work to reduce health disparities. They have found that presenting data in ways that connect two or more abstract concepts together can take partners past observing data to action.
For example, maps are a visually compelling way to connect inequities with people and places. Individuals quickly relate to the data when they see how their neighborhoods rank on several indicators (Benjamin, 2007). Turning numbers and raw data into a visualization can help partners connect the dots in compelling, actionable ways (Pettiross, 2015). Nadine Chan, the Assistant Chief for Assessment, Policy Development, and Evaluation in Seattle and King County, worked with data related to employment, using Excel charts and graphs to quantify disparities. However, when she later converted the data into a simple infographic (Figure 5) using the free resource Piktochart (https://piktochart.com/), partners spent more time looking at it and asked more questions about the data, showing that they connected with the data in new ways. There was a renewed urgency around working on strategies to increase employment rates and reduce poverty in their community. This shows not only the need for creative presentation, but also the value of presenting data repeatedly in different ways to get the information to stick with people.

Figure 5. Sample of an Infographic from Communities of Opportunity. The infographic was created by Nadine Chan of King County to present her data in a visually compelling way.

Framing. The examples of visually appealing data show how powerful data can be when presented in a way that “tells a story.” In particular, disaggregating data by a variety of factors can help tell different stories. For example, Figure 5 breaks employment rates up by region, but also by race. Disaggregating by race in addition to place gives the data a different meaning than if they were disaggregated by only one factor—it shows that in King County, employment does not vary much by region, but it does vary by race. If partners were to look at the “place” data alone, they may not think there would be much of a problem with employment disparities in their region.
But the “race” data clearly show that black residents are not able to access employment opportunities at the same rates as other residents. A combination of visual presentation and creative formatting can drastically shift a conversation around data. Organizing “Data Walks,” where graphs and charts are placed on walls, has been shown to encourage discussion and help people absorb data in new ways. The Urban Institute has done extensive research on the usefulness of Data Walks to facilitate community-based changes (Murray, Falkenburger & Saxena, 2015).

Discuss and Learn from the Data

The fourth element in our initial framework was “Discuss the Data,” to highlight the need for collective meaning-making by the collective impact partners (Gold, 2013). However, in our discussions about the initial framework and subsequent blog posts, it became clear that learning from the data was just as important as the act of discussion. To increase an initiative’s capacity to discuss data, we found investment was needed in two areas:

● Facilitation skills, and
● An understanding of assumptions about the problem being solved.

Facilitation Skills. A strong facilitator can help partnerships come to agreement about the meaning behind data, in the same way facilitation is required to secure agreement on what data to use. A facilitator should understand how to shape a conversation about data so that other members of the collective impact initiative understand what they are looking at and why it matters, as well as feel empowered to draw their own conclusions. Mark Friedman, who pioneered the Results Based Accountability framework discussed earlier, has developed seven questions to ensure performance accountability that can help collective impact initiatives guide their discussions around data (Friedman, 2015):

1. Who are our customers, clients, people we serve? (e.g., children in a child care program)
2. How can we measure if our customers/clients are better off?
(performance measures about client results, e.g., percent of children with good literacy skills).
3. How can we measure if we are delivering service well? (e.g., client-staff ratio, unit cost, turnover rate, etc.).
4. How are we doing on the most important of these measures? Where have we been; where are we headed? (baselines and the story behind the baselines).
5. Who are the partners who have a potential role to play in doing better?
6. What works, what could work to do better than baseline? (best practices, best hunches, including partners’ contributions).
7. What do we propose to do? (multi-year action plan and budget, including no-cost and low-cost items).

Our partners at the Network for Economic Opportunity in New Orleans have monthly discussions about data and progress to inform ongoing work. Due to data accessibility challenges, they haven’t been able to bring much quantitative data to these meetings. Instead, they rely on qualitative updates from partners to move their work forward while simultaneously working on accessing data. They can move their continuous improvement process forward by using anecdotal evidence and “lived experience” while they resolve some of their data accessibility challenges (Klaus, 2014; Mack, 2014).

An Understanding of Assumptions. Each partner in a discussion about collective impact will have his or her own assumptions about the work. Partners may hold assumptions (correct or not) about any number of things, from the causes of problems to how to solve those problems. Any conversation about data should involve a conversation about the root causes of problems. If there isn’t an understanding of why certain problems exist, it’s unlikely that an initiative can determine effective strategies for solving those problems.
Our Prepare Learning Circle sites have gone through a factor analysis process to determine the root causes that influence their ultimate shared result, which helped them better understand why and how strategies may fit into their data-driven feedback loop. In discussing assumptions, practitioners often find that equity, particularly racial equity, surfaces. Many of the root causes of problems are, at their core, issues of systematic and institutional racism. The entire field of collective impact is recognizing that racial equity needs to be a fundamental component of the process of applying collective impact principles (Schmitz, 2014). An equity lens should be applied to all the elements of using data for collective impact (Williams & Marxer, 2014), but it’s often in this element that equity conversations are most important.

Change Behavior and Share Responsibility

Changing behavior and sharing responsibility is the most critical element for actually achieving the central goal of collective impact: creating systems-level changes in communities (Kania & Kramer, 2011). If an initiative invests in capacity across all of these elements but cannot hold partners accountable for changing their behavior (Edmondson, 2015), it is unlikely the initiative will achieve its shared result. Ultimately, partners should own their own data and hold themselves accountable for changes (Hanleybrown, Kania, & Kramer, 2012). To help partners reach that place of shared accountability, collective impact initiatives should invest in capacity in four areas:

● Dedicated staff,
● An action plan,
● Programmatic staff buy-in and support, and
● Evaluation capacity.

Dedicated Staff. The process of holding partners accountable requires continuous follow-up and relationship management. Directors of collective impact initiatives, as well as data managers, often take on the responsibility of following up with partners to ensure they execute on their action commitments.
The Collective Impact Forum has created a set of resources that outline common responsibilities for the dedicated staff of collective impact initiatives. The three roles it outlines are “Executive Director,” “Project Coordinator,” and “Data Consultant.” All three have responsibilities that require holding partners accountable to various aspects of collective impact. Most of the responsibility falls on the Executive Director, who has responsibilities such as “providing regular reports on progress against goals and indicators” to the steering committee and “cultivating excellent working relationships...in a way that can inspire collective action without formal authority” (Collective Impact Forum, 2013).

An Action Plan. Collective impact initiatives often fall prey to one of the most frustrating parts of working in teams: people don’t do what they say they will do (Bain Pillsbury, Undated). An easy way to hold partners accountable to their action commitments is through the use of an “action plan.” There are many ways to create an action plan (Gantt charts are a popular resource), and the process for creating a plan is similar whether it is done for a project specific to an organization or for one that requires coordination with partners. The essential components of an action plan are:

● The task,
● Who is responsible for the task, and
● A timeline for completing the task.

Figure 6 provides an example of an action plan from All Hands Raised in Portland, OR.
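The three components above can be captured in a very small data structure. The following is a minimal, hypothetical sketch in Python (the class, function, and sample entries are illustrative, not part of any initiative's actual tooling) showing how dedicated staff might flag open commitments for follow-up:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical action-plan entry: the task, who is responsible
# for it, and a timeline for completing it.
@dataclass
class ActionItem:
    task: str
    responsible: str   # person or organization accountable for the task
    due: date          # timeline for completing the task
    completed: bool = False

def needs_follow_up(plan: list[ActionItem], today: date) -> list[ActionItem]:
    """Return open items past their due date, for dedicated staff to chase."""
    return [item for item in plan if not item.completed and item.due < today]

# Illustrative entries only
plan = [
    ActionItem("Complete sector reports", "WSI", date(2016, 2, 29)),
    ActionItem("Revise, finalize and print tools", "AHR/WSI", date(2016, 4, 25)),
]
overdue = needs_follow_up(plan, today=date(2016, 3, 15))
print([item.task for item in overdue])  # → ['Complete sector reports']
```

In practice most initiatives maintain this in a spreadsheet or Gantt-chart tool rather than code; the point is only that every commitment carries an owner and a deadline, so follow-up is mechanical rather than ad hoc.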
A) Compiling / Creating Useful Tools for Educators

| Action Step | Deliverable | Person(s)/Group(s) Responsible | Timeline | Resources Needed |
| Complete sector reports re: (1) types of careers in construction & manufacturing, with wages, and (2) industry needs / labor market projections | 2 final sector reports | WSI | 2/29 | [WSI] |
| Complete career pathway guides (1 for construction, 1 for manufacturing) | 2 final career guides | WSI | 2/29 | [WSI] |
| Develop 2-page career pathway visuals/infographic (1 for construction, 1 for manufacturing); planning committee review mid-March; pull from existing websites | Infographic | AHR | 4/1 to 4/20 | |
| Finalize a proposed set of content for use at event (infographics, websites, videos, student stories); planning committee review mid-March | | WSI | 2/29 to 4/22 | |
| Finalize the packet/toolkit for all attendees: externship overview (Jesse); action plan template (Nate); administrator infographic (Jesse); takeaway card and contact list (Nate); pathways graphics (Nate); posters (Reese); playing cards (Jesse); PDU certificates (Jesse) | | AHR/WSI | 4/8 to 4/22 | |
| Test messages and tools for clarity, relevance and cultural appropriateness | | Planning committee | 4/15 | N/A |
| Revise, finalize and print tools | Final printed tools | AHR/WSI | 4/21 to 4/25 | |
| Distribute awareness tools broadly through networks of superintendents and others, targeting middle & high school administrators & school board members | | AHR/WSI | 4/26 to 5/6 | |

Figure 6. Sample Action Plan from All Hands Raised.

Programmatic Staff Buy-In and Support

Without getting everyone on board, the use of data can’t be integrated at all levels of a collective impact initiative (Gillespie, 2014). There will always be one “problem” group, or one organization that doesn’t pull its weight in terms of following through on changing behavior. Our partners at Chelsea Thrives have seen how shared ownership and use of data can have a real impact on the lives of the people they serve to reduce poverty and improve mobility.
A team of their service providers meets monthly to assess the risk factors of different families. Much of their work previously felt like “triaging,” according to Chelsea Thrives Executive Director Melissa Walsh. But once they were able to come together as a group with shared data on clients, they were able to effectively prioritize which clients should receive which services. If the partners agree, based on two to eight factors, that a family has a high level of need, they coordinate amongst themselves to deliver those services.

Evaluation Capacity. Evaluation is distinct from performance management. Evaluation is the process by which an organization assesses its programs’ effects on a target population, whereas performance management is focused on the improvement of programmatic or organizational processes (Walker & Moore, 2011) and impact on customers (Friedman, 2015). Although this article has focused primarily on the elements needed to implement performance management in collective impact, evaluation of the effects of collective impact is an essential part of understanding what large-scale changes are happening in a community as a result of collective impact efforts. Entire articles have been written on evaluating collective impact to guide practitioners in understanding what impact their efforts are making on their communities, and why (Preskill, Parkhurst, & Splansky Juster, Undated). Collective impact initiatives should invest in both formal and informal evaluation methods (Preskill, Parkhurst, & Splansky Juster, Undated). If an initiative doesn’t know whether it is actually changing the lives of the people it serves, it could do more harm than good. At Living Cities, we build in time and capacity to evaluate all of our programs, especially the collective impact initiatives we support. With our Integration Initiative, we conducted an extensive evaluation that highlighted a number of recommendations to improve the program (Living Cities, 2014).
One of the major recommendations was to create a planning year to help set the stage for collective impact, something the Working Cities Challenge adopted as well.

Conclusions

Achieving large-scale change through collective impact requires the use of data to understand progress and measure impact. Yet the management of data can be one of the most challenging aspects of collective impact, or of any community change initiative. For practitioners to effectively use data for performance management, they must invest in the capacity of their initiative across five different elements: Agree on the Data; Find the Data; Present the Data; Discuss and Learn from the Data; Change Behavior and Share Responsibility.

This article presented lessons learned and examples that speak to each of these five elements. Yet many questions remain regarding how best to equip collective impact practitioners to build capacity across these five elements. To achieve the true promise of collective impact (that is, large-scale community change), funders and intermediaries like Living Cities can and should explore more deeply what it takes to use data to change behavior in collective impact. Further refinement of the research presented here, particularly focusing on the use of qualitative data for collective impact, will only strengthen the collective impact model as it evolves and is applied in more and more communities in the US and around the world.

References

Boyea-Robinson, T. (2016). Data-driven feedback loop examples. Living Cities. Retrieved from https://www.livingcities.org/resources/314-data-driven-feedback-loop-examples

Bain Pillsbury, J. (Undated). Theory of aligned contributions. Sherbrooke Consulting, Inc. Retrieved from http://nvecac.com/wp-content/uploads/2016/05/TOAC.pdf

Benjamin, D. (2007). Doing social math: Case study in framing food and fitness. FrameWorks E-Zine, No. 40. Retrieved from http://www.frameworksinstitute.org/ezine40.html

Collective Impact Forum. (2013).
Job descriptions of backbone roles. Retrieved from https://collectiveimpactforum.org/resources/job-descriptions-backbone-roles

Edmondson, J. (2014, December 3). Children will show us the way…On how to use data in education?!? Retrieved from http://www.strivetogether.org/blog/2014/12/children-will-show-us-the-wayon-how-to-use-data-in-education/

Edmondson, J. (2015). Collective impact irony. Retrieved from http://www.strivetogether.org/blog/2015/03/collective-impact-irony/

Friedman, M. (2015). Trying hard is not good enough. PARSE Publishing.

Gillespie, S. (2014). Starting small and thinking long-term. Retrieved from http://www.urban.org/research/publication/starting-small-and-thinking-long-term-qa-performance-measurement-and-evaluation-professionals

Gold, A. (2013). What barriers? Insights from solving problems through cross-sector partnerships. Retrieved from https://www.livingcities.org/resources/231-what-barriers-insights-from-solving-problems-through-cross-sector-partnerships

Hanleybrown, F., Kania, J., & Kramer, M. (2012). Channeling change: Making collective impact work. Stanford Social Innovation Review blog. Retrieved from https://ssir.org/articles/entry/channeling_change_making_collective_impact_work

Heifetz, R. & Laurie, D. L. (2001). The work of leadership. Harvard Business Review, 79(11), 131-141.

Kania, J., Hanleybrown, F., & Splansky Juster, J. (2014). Essential mindset shifts for collective impact. Stanford Social Innovation Review, 12(4), 2-5. Retrieved from https://ssir.org/articles/entry/essential_mindset_shifts_for_collective_impact

Kania, J. & Kramer, M. (2011). Collective impact. Stanford Social Innovation Review, 9(1), 36-41. Retrieved from https://ssir.org/articles/entry/collective_impact

Kuhlmann, J. (2016). Data dashboard keeps users, student outcomes at the center of the conversation. Retrieved from http://www.strivetogether.org/blog/2016/03/data-dashboard-keeps-users-student-outcomes-at-the-center-of-the-conversation/

Klaus, T.
(2014, October 27). Collective impact 3.0: Big ideas from CI summit in Toronto. Retrieved from http://tamarackcci.ca/blogs/nonprofit-doc/collective-impact-30-big-ideas-ci-summit-toronto

Landrieu, M. (2014). African-American male unemployment report. Retrieved from http://www.nolaforlife.org/images/economic-opportunity/bmu-report_final_pages2/

Living Cities. (2014). The integration initiative three year evaluation report. Retrieved from https://www.livingcities.org/resources/282-the-integration-initiative-three-year-evaluation-report

Living Cities. (2016). Data and collective impact. Retrieved from https://www.livingcities.org/work/data-collective-impact/about

Mack, K. (2014, October 15). Rethink who you call an expert. Retrieved from http://www.fsg.org/blog/rethink-who-you-call-expert

Murray, B., Falkenburger, E. & Saxena, P. (2015). Data walks: An innovative way to share data with communities. Retrieved from http://www.urban.org/research/publication/data-walks-innovative-way-share-data-communities

NNIP. (2014). Key elements of data sharing agreements. Retrieved from http://www.neighborhoodindicators.org/library/guides/key-elements-data-sharing-agreements

Park, S., Hironaka, S., Carver, P., & Nordstrum, L. (2013). Continuous improvement in education. Stanford, CA: Carnegie Foundation for the Advancement of Teaching.

Parkhurst, M. & Preskill, H. (2014). Learning in action: Evaluating collective impact. Stanford Social Innovation Review, Fall. Retrieved from https://ssir.org/articles/entry/learning_in_action_evaluating_collective_impact

Pettiross, J. (2015). Why the time-tested science of data visualization is so powerful [Blog post]. Retrieved from https://www.tableau.com/about/blog/2015/11/why-time-tested-science-data-visualization-so-powerful-45705

Perkins, J. (2014). Improving outcomes for opportunity youth. StriveTogether.
Retrieved from http://www.strivetogether.org/sites/default/files/images/improving-outcomes-opportunity-youth-data-driven-approach.pdf

Preskill, H., Parkhurst, M., & Splansky Juster, J. (Undated). Guide to evaluating collective impact. FSG. Retrieved from http://www.fsg.org/publications/guide-evaluating-collective-impact

Queen, J. (2016). Data inventory. Retrieved from https://www.livingcities.org/resources/313-data-inventory

Raderstrong, J. & Boyea-Robinson, T. (2016). The why and how of working with communities through collective impact. Community Development, 47(2), 180-193. https://doi.org/10.1080/15575330.2015.1130072

Raderstrong, J. & Gold, A. (2015). Four questions to consider when building a cross-sector partnership’s structure. Retrieved from https://www.livingcities.org/blog/980-four-questions-to-consider-when-building-a-cross-sector-partnership-s-structure

Raderstrong, J. & Perkins, J. (2015). Four components of a shared result that creates enduring change. Retrieved from https://www.livingcities.org/resources/301-four-components-of-a-shared-result-that-creates-enduring-change

Schmitz, P. (2014, December 22). The culture of collective impact. Huffington Post. Retrieved from http://www.huffingtonpost.com/paul-schmitz/the-culture-of-collective_b_6025536.html

StriveTogether. (2013). Funding for the backbone organization in collective impact efforts. Retrieved from http://www.strivetogether.org/sites/default/files/images/Funding_To_Support_The_Backbone_Final_updated.pdf

StriveTogether & Data Quality Campaign. (2015). Data drives school community collaboration.
Retrieved from http://www.strivetogether.org/sites/default/files/ST-DQC-Data-Drives-School-Community-Collaboration_web.pdf

Walker, K. & Moore, K. A. (2011). Performance management and evaluation: What’s the difference? Child Trends. Retrieved from http://www.childtrends.org/wp-content/uploads/2013/06/2011-02PerformMgmt.pdf

Williams, J. & Marxer, S. (2014). Bringing an equity lens to collective impact. Retrieved from https://collectiveimpactforum.org/sites/default/files/EquityandCollectiveImpact_UrbanStrategiesCouncil.pdf

Zhang, S. & Winkler, M. (2015). Navigating performance management software options. Retrieved from http://www.urban.org/research/publication/navigating-performance-management-software-options

Author Information

*Jeff Raderstrong is a Senior Associate at Living Cities, where he supports the organizational learning process. Previously he worked on Living Cities’ Collective Impact portfolio, a network of over 70 different collective impact initiatives around the country.

Jeff Raderstrong
Living Cities
1040 Avenue of the Americas
New York, New York 10018
Email: jraderstrong@livingcities.org
Phone: 646-442-3236

Dr. JaNay Queen Nazaire is a leader, collaborator, and bridge-builder with a fierce belief in the right of all people to live their best possible life. As an Associate Director at Living Cities, she coaches city leaders to collaborate across sectors for powerful social justice results for people on the ground. She contributes to the strategic vision of Living Cities and empowers staff to work to improve the lives of low-income people.

JaNay Queen Nazaire, Ph.D.
Living Cities
1040 Avenue of the Americas
New York, New York 10018
Email: jqueen@livingcities.org
Phone: 646-442-2915

*Corresponding author