Written By: Jennifer Ito

Success in organizing can be measured by a host of concrete indicators: the number of residents who show up to a community meeting, the number of members who complete a leadership development training and the number of leaders who share their personal stories at a public hearing. But organizers will tell you that success can also be measured by more anecdotal evidence: the new relationships between neighbors when they realize they share the same concerns, the new perspectives they gain when they learn about the root causes of their problems and the influence over policy they have when they speak up collectively before a panel of decision-makers.

METRICS THAT MATTER

Increasingly, organizers are concerned with not only building a powerful organization for change, but also building a powerful movement for change, and the proper metrics are needed to carry this out. As part of the research team at the University of Southern California’s Program for Environmental and Regional Equity (PERE), I helped to develop an evaluative framework for social movements, outlined in our 2011 report, Transactions – Transformations – Translations: Metrics That Matter for Building, Scaling and Funding Social Movements1 (or “T3”). The T3 framework attempts to capture both the quantitative and qualitative, transactional and transformational measures of progress, as well as new strategies of movement building. It also takes into account those metrics that go beyond organizational effectiveness and drive towards building social movements.

Little did we anticipate the interest that the T3 framework would generate among funders, evaluators and movement builders alike. Most notably, it has struck a chord with a set of people interested in transformative change and looking to track their efforts qualitatively in ways that go beyond the anecdotal and story-telling. Realizing that the scale of change needed is beyond the reach of any one organization or coalition, both funders and organizations are grappling with big questions about investments and strategies that can make leaps forward – and that often means blurring the lines of immediate self-interest to look at the whole picture. Funders, evaluators and movement builders all have something to learn from each other.

In the T3 report, we cautioned that our framework was not intended to fund a cottage industry of evaluators, and that we were researchers, not implementers. Yet, every year we receive requests that could, in fact, support such an industry. We have agreed to carry out a few evaluations – namely for the Labor/Community Strategy Center, the Dream Resource Center and the National Domestic Workers Alliance – partnering with organizations that are rooted in a vision of transformational change and a strategy of movement building. This work required some blurring of lines on our part. We are not evaluators, yet we agreed to play an evaluative role with the intent of building on-the-ground experience so that we could offer insights and advice to the field about implementation. While we have in no way figured it all out, nor do we want to proclaim a “one-size-fits-all” approach, we have learned a few lessons along the way.

CO-CREATING METRICS: THREE KEY LESSONS

Our overarching takeaway from our foray into conducting evaluations based on our T3 framework is that the metrics of movement building must be co-created, not imposed. It is not about setting guideposts for programmatic or organizational performance. It is actually about stretching beyond organizations, finding complementary roles in the movement and sharing leadership around a common vision. And that process of co-creation requires a new orientation and a new relationship among movement builders, evaluators and funders. We need to rethink the way all partners communicate with each other. A new kind of open and authentic dialogue is critical — one that is more horizontal than it has been in the past. What follows are three key lessons from that co-creative process.

Lesson #1: Empower your grantees.

While we have noticed some shifts away from evaluation as a punitive process to evaluation as collective learning, there is nonetheless a power dynamic among funders, evaluators and grantees. Fears of being defunded based on the judgment of a third-party evaluator may actually stifle innovation and hinder the big leaps forward that are needed today to achieve transformative change. When funders provide the means for grantees to directly contract evaluators, they help to establish clearer lines of accountability and a better foundation for an authentic learning process. In all our evaluation projects, we were commissioned directly by the nonprofit, which collaborated on the evaluation process from start to finish. That direct relationship leads us to the next lesson.

Lesson #2: Reflect, reflect, reflect.

A movement-building approach to evaluation often requires looking beyond stated programmatic goals, objectives and outcomes. Metrics should reflect the movement-building strategy they are intended to measure – but oftentimes that strategy is unstated or not put on paper. While most organizations have well-developed muscles in proposal writing to fund programs and projects, they are less practiced in articulating movement-building theories and strategies. Again, a caution: we are not calling for doctoral-level theories of change (though many groups do develop them). Instead, we encourage organizations to create space and use evaluation data to facilitate reflection and dialogue – a process that is essential to honing theories of change, both formal and less formal.

Lesson #3: Collect the right data.

The clearer that groups are about their theories of change, the easier it is to define metrics of progress and impact. But even in the absence of a clearly stated theory, it is essential to just start somewhere, and then continually refine your documentation process. Even if the data are wrong (trust us, we have seen our share of wrong data), there are valuable lessons to learn from failure because, if heeded, these lessons will point the way to better data.

The most common methods of collecting data are through questionnaires, interviews, small group dialogue and observation. For third-party evaluators, being present during that “aha!” moment when someone makes a breakthrough helps in the process of first recognizing it as a metric, naming it and then setting up a systematic way to track and monitor progress over time. Finding appropriate ways to engage funders so that they may experience those transformational moments themselves will help the process of co-creating metrics that matter for the kinds of change needed today.

T3 CASE STUDY: THE NATIONAL DOMESTIC WORKERS ALLIANCE

Our work since releasing T3 – and the subsequent blurring of the academic and evaluative roles – has helped PERE become a better research organization. For those interested in learning more, we recommend a forthcoming report from PERE titled Transforming Lives, Transforming Movement Building: Lessons from the National Domestic Workers Alliance Strategy – Organizing – Leadership (SOL) Initiative. This report – itself the result of a co-creative process – documents and captures the immediate impacts of the SOL Initiative, an organizing and leadership development training for the affiliate members of the National Domestic Workers Alliance (NDWA). Completed last year, SOL was a two-year initiative co-facilitated by NDWA and the movement training organizations Generative Somatics and Social Justice Leadership, with PERE commissioned as an evaluation partner. The overall goal of the SOL Initiative was to provide domestic worker leaders and organizers with the transformative leadership capacities to push the scale and power of local and national organizing in ways that were grounded in vision, strategy, healthy and generative relationship building and sustainability.

From May 2011 through March 2013, approximately 60 women – both worker leaders and staff organizers – from 27 affiliate organizations across the U.S. participated in five four-day intensive retreats. PERE’s role was to help document and capture the markers of transformation among the participants and the cohort as a whole that will have lasting impacts in the future. SOL’s leadership development approach was a success because of its connection to action and application in the field. We found that when a cohort is part of the same long-term alliance and striving towards the same long-term goals, it can have real-world, direct impacts on legislative campaigns and movement building. Because SOL put skills and plans into action, participants applied the learning in ways that gave their practices both immediacy and relevance.

For research purposes, it was the perfect opportunity to explore more deeply both a set of transformative metrics and a set of movement-building metrics. But for funders and organizers alike, the report offers both a model and the metrics for transformative leadership that we hope will help usher in a new way of approaching social change and the capacities needed to lead that change.

CONCLUSION

Social movements are critical vehicles for moving the needle on issues of regional inclusion, immigrant integration and environmental justice — and that is why we at PERE not only study social movements, but also partner with them. Being effective co-conspirators for social change often requires a blurring of the traditional lines that separate community organizing from the world of research and academia, and an acknowledgement that community organizers often are further along the learning curve than academics. Moving forward, we encourage philanthropic leaders to reconsider old divides, open new channels of dialogue and learning and draw inspiration from movement builders who are at the leading edge of the fight for justice.

Jennifer Ito is project manager for USC’s Program for Environmental and Regional Equity (PERE).


1. Manuel Pastor, Jennifer Ito and Rachel Rosner, Transactions – Transformations – Translations: Metrics That Matter for Building, Scaling, and Funding Social Movements (Program for Environmental and Regional Equity, October 2011), http://dornsife.usc.edu/pere/metrics/.