
Quality Data and Impact Measurement

Quality data and impact measurement allow the collaborative to monitor progress towards its goals and the collective impact of its members when they work together to activate their hiring, purchasing, investing, and other institutional assets. Data reporting bolsters the anchor collaborative's story both internally—as members seek to improve their anchor strategies—and externally—as government stakeholders, business leaders, and community partners hold the collaborative and its members to account for progress and social impact in communities. These efforts build a culture of learning, leadership, adaptability, and transparency among members of the anchor collaborative, increasing their accountability and commitment to the anchor mission.

In this section, we discuss how to engage members of the anchor collaborative in the design and process of data collection and impact measurement, and we share tools and resources for anchor collaboratives to get started, including the HAN Data Companion reference guide for anchor institution members to use in the data collection process.

The value of high quality data

A successful data collection program is informed by, and aligned with, the anchor mission priorities of the collaborative. Appropriately aligned measurement can motivate anchor institutions to build internal reporting capacity, and working alongside peer institutions can spark a sense of “cooperative competition.” Anchor institutions will be motivated to comply with data requests when they value the output they receive in return. Knowing what each anchor institution hopes to accomplish from its reporting will be critical to success.

Value of high quality data from anchor mission strategies:

  • Impact Workforce data can inform hiring and training programs that center racial and economic equity and achieve an anchor institution workforce that better reflects the surrounding community.

  • Impact Purchasing data can reveal which demographic groups may be underrepresented across supply chains and identify opportunities to “localize” spending, which in turn informs how and where the collaborative can expand outreach or build the capacity of local vendors.

  • Place-based Investing data can demonstrate where the institutions are directing capital to address community priorities (e.g., affordable housing, capital for small local businesses), which in turn can build strong community partnerships, increase employee morale, and generate goodwill with elected officials, community leaders, and residents.

Further, data dashboards and benchmarking can help anchor institutions see how they perform relative to national or regional data, which can be a valuable motivator for consistency in reporting and improvement.

When done well, data collection over time can quantify the collaborative’s impact, track progress toward goals, and increase accountability among members. Our vision at HAN is widespread adoption of a data collection framework by anchor institutions that enables national, multisector benchmarking around anchor strategies in communities across the country. In realizing this vision, we can collectively raise the bar around the role of anchor institutions, quantify the positive impact in communities when anchor institutions intentionally align their economic assets toward community improvement, and ultimately accelerate adoption of anchor mission strategies as a national norm.

For a detailed description of anchor strategy metrics, see the HAN Data Companion. For a list of sample baseline metrics for anchor collaboratives, please see Section 2: Understanding Anchor Strategies and Section 3.1: Shared Imperative.

Shared Metrics for a Multisector Collaborative, Western Massachusetts Anchor Collaborative (WMAC)

WMAC includes anchor institutions from city government, higher education, and healthcare, as well as other large employers. As a collaborative, WMAC developed shared metrics, definitions, and glossaries for purchasing, hiring, and first community (defined by WMAC as frontline workers who earn below a living wage) data across municipal, healthcare, education, and private sector partners. WMAC members agreed to this shared set of purchasing and hiring metrics and definitions, and to submit data against them annually. With this data, WMAC members can set baselines, establish benchmarks, and track progress over time toward purchasing and hiring goals—as individual systems and as a collaborative. The current priority for WMAC leadership is to expand and enhance the dataset through consistent data submissions by all members and, in turn, develop individual commitments in the core WMAC workstreams.

Data collection capacity

To be successful, the organization and individuals responsible for collecting data should have the capacity to analyze, contextualize, and provide timely insight and feedback to the institutions that submitted the data. The number of metrics, the complexity of the metrics, the frequency of data collection, and any research in service of quality metric development will have a substantial impact on staffing needs. As these increase, staffing for in-house technology development, as well as analytical talent, becomes necessary.

To generate a high level of value for anchor institutions, many collaboratives have hired dedicated data staff or third-party consultants, or worked with a trusted local third party. For example, as part of the Greater University Circle Initiative in Cleveland, Ohio, the Cleveland Foundation contracted with Cleveland State University to collect aggregate data and serve as the independent evaluator of the anchor collaborative's major accomplishments and initiatives over seven years.[58]

Working with External Partners for Data Collection and Analysis, Milwaukee Anchor Collaborative (MAC)

The Milwaukee Anchor Collaborative (MAC) employs U3 Advisors as an intermediary for streamlined data collection. Having an unbiased third party collect and standardize data has been invaluable to MAC for achieving strong reporting from members. MAC uses data for both benchmarking and strategic analysis. Data collection became more meaningful for MAC when viewed strategically, providing anchor institutions with insights beyond local spending or diversity metrics. For example, understanding procurement categories and job types enables the development of focused strategies for improving on baseline metrics. Third-party analysts also remove some of the burden from anchor institutions (and the backbone) while yielding insights of high value to them. For instance, U3 can compare anchors' data against a directory of all known Milwaukee-based minority-owned businesses, identifying minority-owned suppliers that anchor institutions were already working with but had not recognized as such.

MAC says a main challenge involves alignment with internal tracking. Like other anchor collaboratives, MAC is focused on a specific population: vendors and workers who are local (defined by city and zip codes) and racially diverse (minority-owned businesses, employees of color). Members have different ways of tracking geography and racial demographics, and MAC is working with them to build the MAC metrics and definitions into their reporting to create more consistency.
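To make the supplier-matching step concrete, below is a minimal sketch of how a directory comparison like the one described above might work. It is illustrative only: the file names, column names, and normalization rules are assumptions rather than MAC's or U3's actual process, and a real reconciliation would also rely on addresses, identifiers, and fuzzy matching.

```python
import pandas as pd

def normalize_name(name: str) -> str:
    """Crude normalization so 'ACME Supply, LLC' and 'Acme Supply llc' compare equal."""
    cleaned = name.lower().replace(",", "").replace(".", "").strip()
    for suffix in (" llc", " inc", " co", " corp"):
        if cleaned.endswith(suffix):
            cleaned = cleaned[: -len(suffix)]
    return cleaned

# Hypothetical inputs: an anchor's accounts-payable vendor file and a directory
# of certified minority-owned businesses in the target geography.
vendors = pd.read_csv("anchor_vendor_spend.csv")     # columns: vendor_name, total_spend
directory = pd.read_csv("mbe_directory.csv")         # columns: business_name, certification

vendors["match_key"] = vendors["vendor_name"].map(normalize_name)
directory["match_key"] = directory["business_name"].map(normalize_name)

# Existing suppliers that appear in the directory: relationships the anchor
# already has but may not have known were with minority-owned businesses.
matched = vendors.merge(directory, on="match_key", how="inner")
print(f"{len(matched)} current vendors appear in the MBE directory")
print(
    matched[["vendor_name", "certification", "total_spend"]]
    .sort_values("total_spend", ascending=False)
    .head(10)
)
```

The point is less the code than the prerequisite: the anchor's vendor file and the community directory both have to exist in usable form before a comparison like this can tell anyone anything.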

[58] Molly Schnoke et al., Greater University Circle Initiative: Year 7 Evaluation Report (Maxine Goodman Levin School of Urban Affairs, 2018), 4, https://engagedscholarship.csuohio.edu/urban_facpub/1548/.

Getting started on data collection

Adopting anchor mission strategies often requires that anchor institutions track their hiring, purchasing, and investment activities in new ways. Many systems are not set up to measure the societal impact of their operational behavior or to answer the types of questions associated with anchor strategies. For instance, anchor institutions may not track impact hiring metrics such as the proportion of their employees who earn above a living wage, were hired from an economically disadvantaged area, or do not have a bachelor's degree. Likewise, how anchor institutions track spending across their supply chains will differ from organization to organization and from sector to sector. The question, "How much does our organization spend with local, minority- or women-owned vendors?" can prove impossible to answer with the organization's current reporting capabilities. It may come as a surprise that these questions, seemingly simple to some, are quite difficult to answer at first.
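To illustrate why, the sketch below assumes an idealized accounts-payable extract that already carries vendor location and ownership information; in practice those last two columns are usually exactly what is missing, which is the systems gap described in the next paragraph. All names, columns, and ZIP codes here are hypothetical.

```python
import pandas as pd

# Hypothetical accounts-payable extract. The last two columns rarely exist yet;
# producing them reliably is the real systems-change work.
spend = pd.DataFrame({
    "vendor":     ["Acme Supply", "Midtown Print", "National Corp", "Westside Foods"],
    "amount":     [120_000, 45_000, 890_000, 30_000],
    "vendor_zip": ["53205", "53208", "10001", "53204"],
    "ownership":  ["minority-owned", "women-owned", "none", "minority-owned"],
})

# "Local" should be defined objectively: here, an assumed set of target ZIP codes.
TARGET_ZIPS = {"53204", "53205", "53206", "53208"}

spend["is_local"] = spend["vendor_zip"].isin(TARGET_ZIPS)
spend["is_diverse"] = spend["ownership"] != "none"

total = spend["amount"].sum()
local_share = spend.loc[spend["is_local"], "amount"].sum() / total
local_diverse_share = spend.loc[spend["is_local"] & spend["is_diverse"], "amount"].sum() / total

print(f"Local spend share: {local_share:.1%}")
print(f"Local, minority- or women-owned spend share: {local_diverse_share:.1%}")
```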

Indeed, establishing systems that answer such questions is part of the systems change, mindset shift, and operational commitments associated with the anchor mission: as we shift our practices to maximize community impact, how must we adjust our existing systems to track our progress? Through its data collection efforts, the anchor collaborative can help anchor institutions better measure the societal impact of their operations in alignment with their hiring, purchasing, and investing goals.

Asking each anchor institution what is possible from a data-collection perspective is the best way to get started—as one collaborative put it, "our anchors are intrinsically motivated to start collecting data because they've realized they don't know what they don't know." The purpose of data collection in the early stages is to set up the anchor institution systems for robust reporting down the line. For many collaboratives, the first couple of years of data collection are for internal purposes only.

For a list of sample baseline metrics, please reference Section 2: Understanding Anchor Strategies and Section 3.1: Shared Imperative.

Steps for getting started on data collection:

  • Review anchor institution members’ strategic plans and connect existing performance indicators to anchor strategies. What goals have organizations set in these areas? What do anchor members have in common?

  • Meet with each member of the collaborative individually to understand how the anchor institution is already reporting data around hiring, purchasing, and (if applicable) place-based investment activities.

  • Keep track of common indicators in hiring, purchasing, and place-based investing (if applicable) across members of the collaborative, including any commonality in systems to which they report data regularly (such as HAN’s annual data collection, or a government reporting system).

  • Be thoughtful about how you engage in group data discussions, recognizing that data quality will vary across organizations in the anchor collaborative. Collective accountability is critical, and institutions may be very hesitant to share data at first. When this is the case, start by creating a safe space for sharing data internally before moving to sharing goals and impacts publicly.

  • Ask anchor institutions already collecting high-quality data to serve as champions for data collection by mentoring others.

  • Use a collaborative process to decide on an initial set of indicators (1-2) that each member of the group will begin to report to the backbone or data collection partner. Even if there are no goals set around that specific indicator (e.g., living wage) yet, getting one to two high quality data points from the group will get members comfortable with the reporting process and spark discussions about opportunities for improvement.

When getting started, know that balancing aspirational goals for a data collection system with the patience required to achieve them can be challenging. It may not be possible to collect data at the desired level of depth at the outset. In year one, anchor institutions may commit to reporting on a single, yet powerful, high-quality indicator such as the proportion of employees earning a living wage. At the same time, introducing aspirational metrics in the early stages can start a conversation and orient the group toward deeper tracking in subsequent years (e.g., dollars spent with employee-owned businesses). However, it takes patience to build a fully operational data collection system that is useful for informing anchor strategies in practice. Approach data collection with a commitment to continuous improvement, and remember that quality information-gathering happens slowly. The backbone or data collection partner for the collaborative should frequently remind anchor institutions why and how quality data contributes to the collaborative's "north star."
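Once the inputs exist, a first-year indicator like the living-wage share is straightforward to compute. The sketch below is a hypothetical illustration: the column names, threshold, and demographic field are assumptions, and a real calculation would draw on the HRIS using the collaborative's agreed definitions. It shows both the headline figure and the kind of equity breakdown (by race/ethnicity, gender, or income) that the next subsection recommends embedding in indicators.

```python
import pandas as pd

# Hypothetical HR extract; in practice this comes from the HRIS using the
# collaborative's agreed definitions of "employee" and "living wage".
employees = pd.DataFrame({
    "employee_id":    [1, 2, 3, 4, 5, 6],
    "hourly_wage":    [14.50, 22.00, 31.75, 16.25, 19.00, 12.80],
    "race_ethnicity": ["Black", "White", "Latino", "Black", "White", "Latino"],
})

LIVING_WAGE = 18.00  # assumed threshold; substitute the figure your collaborative adopts

employees["earns_living_wage"] = employees["hourly_wage"] >= LIVING_WAGE

# Headline indicator reported to the collaborative.
print(f"Share of employees earning a living wage: {employees['earns_living_wage'].mean():.1%}")

# Equity breakdown: the same indicator disaggregated by race/ethnicity.
print(employees.groupby("race_ethnicity")["earns_living_wage"].mean())
```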

Engaging anchor institutions in data collection and impact measurement

Using a collaborative process to determine a set of metrics on which everyone agrees is essential for successful data collection and impact measurement. Any community partners who are directly involved in the implementation of anchor strategies can likely add insight about what is possible from a tracking perspective based on their own experiences in the field. Without the necessary levels of member engagement throughout the metric development process, data requests may be met with reluctance, resistance, or concerns about the relative value of submitting data.

Following the key principles and practices outlined below can increase the likelihood of a timely, high-quality, and complete data set. Creating common metrics for an anchor collaborative, with members of different sizes and contexts, requires flexibility and the ability to navigate tensions that naturally emerge. To that end, anchor collaboratives can observe the following principles when setting core indicators:

  • be limited and concise so as to focus on the key activities of the anchor collaborative;

  • embed equity considerations via the breakdowns of indicators (e.g., race/ethnicity, gender, income);

  • balance flexibility (which makes data appropriate for different contexts) with strict definitions (which make data comparisons possible); and

  • celebrate achievements while driving positive future behavior by being aspirational.

Successful anchor collaboratives secure buy-in from all anchor institution partners before introducing any new data collection system. Thoughtful engagement at this stage can reduce the burden for staff and anchor institution members down the line because all parties understand the motivations for each metric and have agreed to expectations. Building the system together can also help the backbone staff or data collection partner learn the right questions to ask when making data requests.

Western Massachusetts Anchor Collaborative (WMAC)

As part of their Impact Workforce strategy in western Massachusetts, anchor institutions wanted to track the proportion of entry-level employees who moved into management positions as an indicator of upward mobility. The collaborative learned that not all anchor institutions tracked promotions from one level of the organization to the next. WMAC members agreed moving forward to track any promotion as a leading or proxy indicator of upward mobility within the organization. As a result of this consensus process, the collaborative created a stronger initial data set by selecting a metric that is commonly tracked, while pushing members to think about metrics they may want to introduce in the future.

[59] Dominique Samari and Paul Schmitz, Racial Equity Toolkit: A reflection and resource guide for collective impact backbone staff and partners (Collective Impact Forum, 2023), 16-19, https://collectiveimpactforum.org/resource/racial-equity-toolkit/.

Data collection tools and resources

Collaboratives are encouraged to design tools and resources that fit their needs. Here we describe types of resources that may prove useful in the data collection process.

Data companions

A data companion serves as a reference guide for members to use in the data collection process. Once developed, it acts as a single source of truth for metric definitions and rationale. A data companion should include: background on who is collecting data and why, instructions and how-to’s, changes to metrics from previous years (if applicable), and any other relevant information, such as instructions for submitting data and details on how data will be protected or used. In addition to metric definitions, the companion should objectively define geographies or populations where the collaborative is focused, rather than using subjective terms like “local” or “impacted populations.”
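One way to keep those definitions unambiguous, and easy to reuse in analysis scripts, is to encode each metric in the companion as a small structured record. The example below is purely illustrative; the field names and values are assumptions rather than HAN's actual schema. Note that the geography is pinned to an explicit ZIP-code list instead of the subjective term "local."

```python
# Hypothetical structured version of one metric definition from a data companion.
LOCAL_DIVERSE_SPEND = {
    "metric_id": "P-01",
    "name": "Addressable spend with local, minority- or women-owned vendors",
    "strategy": "Impact Purchasing",
    "unit": "USD",
    "reporting_period": "fiscal year",
    "numerator": (
        "Sum of payments to vendors that are (a) located in the target geography "
        "and (b) certified minority- or women-owned."
    ),
    "denominator": "Total addressable (non-payroll, non-tax) spend.",
    # Objective geography definition instead of the subjective term "local".
    "target_geography_zips": ["53204", "53205", "53206", "53208"],
    "exclusions": ["utilities", "government fees", "internal transfers"],
    "changed_since_last_cycle": False,
}
```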

Data collection instrument

A data companion should be designed for use in conjunction with the data collection instrument to ensure accuracy. We recommend organizing the information by anchor mission strategy so that subject matter experts can easily find the information they need for data submission. The companion supports uniformity in submissions and increases the probability that members of the collaborative will submit data in alignment with expectations. It should be written in a way that speaks to all stakeholders across departments, from the analyst querying the organization's SQL database to executives looking for high-level conceptual information about the data their organization will submit to the collaborative.
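In the same spirit, the backbone or data collection partner can check each submission against the companion's definitions before accepting it, which catches the most common alignment problems early. The sketch below is a hypothetical example of such a check; the metric IDs, fields, and rules are assumptions, not an actual HAN instrument.

```python
# Hypothetical check of a member's submission against the companion's definitions.
COMPANION_METRICS = {
    "P-01": {"unit": "USD", "required": True},       # local/diverse spend
    "H-01": {"unit": "percent", "required": True},   # share of employees earning a living wage
    "I-01": {"unit": "USD", "required": False},      # place-based investing, if applicable
}

def validate_submission(submission: dict) -> list[str]:
    """Return a list of problems; an empty list means the submission aligns."""
    problems = []
    for metric_id, spec in COMPANION_METRICS.items():
        if metric_id not in submission:
            if spec["required"]:
                problems.append(f"missing required metric {metric_id}")
            continue
        entry = submission[metric_id]
        if entry.get("unit") != spec["unit"]:
            problems.append(f"{metric_id}: expected unit {spec['unit']}, got {entry.get('unit')}")
        if not isinstance(entry.get("value"), (int, float)):
            problems.append(f"{metric_id}: value must be numeric")
    return problems

# One member's hypothetical annual submission.
submission = {
    "P-01": {"value": 1_250_000, "unit": "USD"},
    "H-01": {"value": 62.0, "unit": "percent"},
}
print(validate_submission(submission) or "submission aligns with the companion")
```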

Below is a sample timeline for data collection that is based on the HAN annual reporting process. While anchor collaboratives need not follow this timeline to a tee, the timeline outlines the key activities and recommended timeframe for each step in the process. HAN is a 70+ member network, and anchor collaboratives with a smaller number of data-submitting organizations may be able to expedite this process over a shorter time frame.

Table 7. Sample timeline for annual data collection

Data Collection
  • Q4: Announce the data collection timeline for the upcoming cycle. Members assemble and submit their data teams and establish data-sharing agreements as needed.
  • Q1: Open data collection. Host (and record) a single session overview of the data collection process. Share any relevant resources and make staff available to answer questions. Close data collection at the end of March.

Data Review & Refinement
  • Q1 and Q2: Throughout the submission window and into the second quarter, backbone or data collection staff review and reconcile all submitted data. This requires an initial internal review and, as appropriate, external reviews with members to adjust erroneous data.

Analysis & Reporting
  • Q3: Data is reconciled, analyzed, and summarized in a final report-out to members (this can be a dashboard, report, etc.).

Metric Development
  • Ongoing: Throughout the process, document frequently asked questions to inform continuous improvement.
  • Internally finalize any changes or revisions to metrics.

Impact Measurement at West Side United (WSU)

WSU’s mission is to eliminate the life-expectancy gap between the West Side of Chicago and downtown. WSU created a measurement framework built around the five primary drivers of the life-expectancy gap, which together account for 50-75 percent of the gap between each West Side neighborhood and downtown. In addition, WSU established a regional dashboard on its website that conveys community-area data on several fundamental metrics from the framework (e.g., unemployment, median household income). The aim is to convey vital information about neighborhood characteristics and where WSU programs are most needed. The next iteration of the dashboard will include storytelling components that highlight how WSU initiatives can improve these indicators.

WSU also publicly shares aggregated local procurement and local hiring data collected from partners that are invested in WSU’s anchor mission approaches. The primary objective behind the data collection is to illustrate the combined impact that WSU’s anchor institutions have in West Side hiring and purchasing efforts. These dashboards serve as tools for WSU collaborators to effectively report on their progress. Each system also receives an individualized report that details their local hiring and purchasing outcomes, interpretations of data trends, and WSU’s recent activities within anchor mission initiatives.

WSU initially focused on setting and collecting data for outcome goals alone. This prevented the organization from building accountability around the activities that partners were expected to complete in service of those outcomes. Anchor institutions may be more motivated to take action if there is a level of accountability through data collection and reporting of process measures.

To achieve measurement goals, WSU working groups in hiring and purchasing established preliminary data collection processes and expectations, which include collecting hiring and procurement data from partners on a regular basis. WSU staff provide anchor mission partners with data collection templates that request hiring and procurement data at the zip code level for the specified time frame. Newer members are notified of expectations for data sharing and are incorporated into the process as appropriate.

Impact investing data is collected quarterly from the community development financial institutions (CDFIs) who carry out the direct lending of WSU’s invested funds. CDFIs have varying experience and capacity to share outcome data in addition to loan data, and there is variation in the types of outcomes reported (e.g., number of jobs created, square feet of commercial real estate developed, number of affordable housing units). WSU compiles the available information across the CDFIs and shares financial and outcome impacts with the working group. WSU developed an impact framework with the support of a consultant that combines CDFI and public datasets to convey a richer portrait of the neighborhood outcomes of the investments.

 

Continuum of Progress: Quality Data and Impact Measurement 

When fully sustaining, collaboratives will use high quality data to track progress toward goals, increase accountability among members, and quantify the collaborative’s impact. However, collaboratives will need sufficient time to reach a consensus on common metrics and establish their data collection, reporting, and analysis processes. A collaborative’s progress towards quality data and impact measurement may take longer to mature than other success factors presented in this playbook, depending on the number of members submitting data and the complexity of the metrics of interest.

Building
  • Members understand the value of data collection and the collaborative has documented the rationale for it.
  • Members agree on clearly defined metrics and have established data teams within their institutions; the collaborative has established a basic timeline and framework for regular data collection and reporting.
  • The collaborative has designated a data lead organization (e.g., the backbone or a third party) to facilitate annual data collection and impact measurement.
  • Agreements are in place that signal a commitment to sharing information with the backbone and/or data collection lead.
  • Although the collaborative is not yet producing high-quality data for sharing, it may publicize its intentions to collect data and how the data will be used.
  • Member Mindset: "I understand the value of data collection and impact measurement and would like to build a reporting system that is in alignment with anchor strategies individually and collectively."
  • Backbone Mindset: "I understand the data collection capabilities of members and have deployed initial data collection efforts."

Evolving
  • Members can see their progress reflected in the data and are beginning to leverage the results to create value in their organizations and communities.
  • The collaborative has collected multiple rounds of data from an increasing share of members, showing consistent methodological improvement.
  • The data collection lead understands the nuances of data collected from each anchor institution and champions analytical capacity building for members.
  • Trust is reinforced through ongoing preservation of confidentiality, candor in discussions, and commitment to sharing information with the backbone and/or data collection lead.
  • Members aggregate and publicize high-quality baselines and initial insights that have helped inform anchor strategies individually and collectively.
  • Member Mindset: "I have submitted several rounds of quality data and have been able to use data submission to affect strategic and programmatic change at my organization."
  • Backbone Mindset: "I systematically improve the experience of submitting data, increase the quality of data, and champion the adoption of data submission as a core function of the collaborative."

Sustaining
  • Members embed data-informed anchor strategies in their organizations to leverage results that drive community-level change.
  • Wins and advances are routinely shared with internal and external stakeholders, and members demonstrate an ongoing commitment to improving and refining metrics to maximize community impact.
  • The data collection lead leverages its strong relationships with each anchor institution to support a streamlined data and measurement culture; each member has embedded reporting to the collaborative into its strategic protocols.
  • Members routinely publicize progress from baseline measures, along with how data insights have helped inform and improve anchor strategies individually and collectively.
  • Member Mindset: "I am able to track progress on anchor strategy implementation and confidently communicate the collective impact of the anchor collaborative."
  • Backbone Mindset: "I am benefiting from mature data collection systems that cyclically inform strategy and future directions, amplifying the successes and impact of the collaborative."