What does evaluation have to do with social justice?
“A social justice-oriented evaluation examines the holistic nature of social problems. It seeks to increase understanding of the interdependency among individual, community, and society using a more judicious democratic process in generating knowledge about social problems and social interventions and using this knowledge to advance social progress” (Thomas & Madison, 2010, p. 572).
Evaluation may not immediately strike you as a part of your work that can directly support creating a more just and equitable world. For some, evaluation is seen as an innocuous, neutral part of their work. Others might see it as a hindrance to, or in direct contradiction to, the social justice aspects of their primary prevention efforts.
Examples from the field:
Joe is a preventionist who works with people with disabilities, and he reports that the evaluation procedures he has been asked to use are not appropriate for the population he works with. The written evaluation tools, especially, often result in participants feeling shamed or stupid when they cannot understand what is being asked of them or are unable to read.
Amanda does school-based prevention work and says that written pre- and post-tests hinder her ability to build rapport with students because such instruments feel like just another test or exam students have to take. Since the pre-test is the first interaction she has with students and the post-test is the last one, she feels like her time with students neither starts off nor ends on the right foot.
Jackson works primarily with marginalized communities and notes that the people with whom they work are weary of being studied or tested and distrust additional attempts to collect data about them and their lives.
It is difficult to hear stories like these and not think that such evaluation efforts may be harmful to a vision of a more just world. Are these problems inherent in evaluation as a practice or discipline? Many prominent evaluators believe that evaluators should be agents of change working to improve the lives of marginalized peoples in the communities in which they work (Mertens, 2009, p. 207). In fact, one of the principles of Empowerment Evaluation says that “Evaluation can and should be used to address social inequities in society” (Fetterman, 2015, p. 27).
Evaluation does not just measure our social justice work. Depending on how we implement an evaluation, it can either help or hinder progress toward the world we seek to create through our initiatives. We have to keep in mind the ways in which evaluation is political. Evaluations influence the way funders, organizations, and even politicians make decisions about funding and programmatic priorities. Moreover, the data we collect and the way we share them tell a story about the people with whom we work. The potential impacts of our data collection, analysis, and interpretation cannot be mere afterthoughts to our evaluative processes.
Every time we make a decision about evaluation, we have to weigh issues of justice, access, and equity. We collect data from and about real people that will have impacts on the lives of those real people. The processes of evaluation can make it easy for us to forget this.
Our approaches to evaluation can serve as an integral part of our work to build more just and equitable communities, if our approaches mirror the changes we want to create. At the same time, the types of data we collect and the ways we analyze, interpret, use, and share these data can impact the way other community partners and funders think about and understand the issue of sexual violence, injustice in our communities, and the solutions to these issues.
Through our evaluation practices, we model behaviors related to the following concerns:
For a step-by-step process on how to avoid racism, sexism, homophobia, and more in data collection and analysis, check out this presentation on Feminist Data Analysis from Heather Krause. You can also listen to our podcast on Data Equity. The Center for American Progress released a report in 2022 providing best practices for collecting data about LGBTQI+ and other sexual and gender-diverse communities that might also be of interest.
Evaluation and Culture
“Those who engage in evaluation do so from perspectives that reflect their values, their ways of viewing the world, and their culture. Culture shapes the ways in which evaluation questions are conceptualized, which in turn influence what data are collected, how the data will be collected and analyzed, and how data are interpreted” (American Evaluation Association, 2011, p. 2).
As the quote above highlights, as people who plan and conduct evaluation, our own cultural backgrounds influence our approach to evaluation. Doing social justice evaluation work that is valid and useful requires engaging in culturally responsive practices. Every aspect and stage of evaluation needs to take into account and be responsive to the culture of the people participating in it, especially the cultures of the people who will be most impacted by the evaluation and from whom data will be collected.
For example, if you’re working with groups who value and prioritize oral communication over written communication, then oral methods of data collection (e.g., storytelling, interviews, etc.) will probably be met with less resistance (and possible enthusiasm) than will paper-and-pencil measures like questionnaires.
Example: The Visioning B.E.A.R. Circle Intertribal Coalition’s (VBCIC) program Walking in Balance with All Our Relations is based on a circle process. This process involves participants sharing, one at a time, in response to quotes or information offered to the group by a facilitator, the Circle Keeper, who is also part of the circle. Evaluators hired to evaluate the project worked closely with the Circle Keeper to identify and implement evaluation processes that were not intrusive to the group but rather focused on the storytelling and sharing aspects of the circle as part of data collection. Circle Keepers were given evaluative prompts or questions to share with the group, and the participants responded during one round of the circle. Piloting these ideas in circles allowed the evaluators and Circle Keeper to continue refining the prompts and the data collection process (Ramsey-Klawsnik, Lefebvre, & Lemmon, 2016).
For an update on how The Visioning B.E.A.R Circle is advancing their evaluation practices, listen to our podcast episode: Using an Indigenous Circle Process for Evaluation
“While participatory approaches may involve a range of different stakeholders, particular attention should be paid to meaningful participation of programme participants in the evaluation process (i.e., doing evaluation ‘with’ and ‘by’ programme participants rather than ‘of’ or ‘for’ them)” (Guijt, 2014, p. 4).
To move toward more equitable power sharing among stakeholders in evaluation efforts, we have to ask ourselves what it means to do participant-centered evaluation and what it means to do evaluation that aligns with the principles of our vision and programming.
One way to increase the social justice-orientation and cultural relevance of evaluations is to make them participatory.
Participatory evaluations can move beyond mere stakeholder involvement in decision-making by including program participants and community members in conducting various phases of the evaluation as co-evaluators. This can happen at any stage or all stages:
- Data Collection
- Data Analysis
- Data Interpretation
- Evaluation Reporting
- Evaluation Use
This list is a generic set of steps that are involved in many types of evaluation, but the specific steps of your own evaluation might look different from this. For example, not all evaluations involve formal reporting, though all evaluations ideally involve some way of sharing of the data, conclusions, and planned actions.
The level and timing of participant involvement needs to be determined based on a variety of variables, keeping issues of equity at the forefront of the process.
For example, you must consider issues such as the following:
- If everyone on the evaluation team except the program participants is being compensated for their time, is that fair?
- If outside forces (e.g., funder deadlines) have you crunched for time during one or more phases of the evaluation, can those phases still be participatory? If so, how? If not, how do you still solicit input from impacted stakeholders and build in ways for later adjustments? Is there room to have a conversation with the funder about your vision for participatory evaluation and find out if there can be some leeway in your deadlines?
- Is there commitment and buy-in from key community leaders for the process of participatory evaluation? Will the variety of participatory stakeholders’ voices and input be taken seriously?
Participatory evaluation involves building your own (and your agency’s) capacity to facilitate collective processes with shared decision making. Additionally, you will need to find ways to build the capacity of stakeholders who are new to program evaluation. They need to be equipped with the knowledge and skills to fully participate in whatever aspects of the evaluation they are involved in.
Watch "Cultural Humility: People, Principles and Practices," a 30-minute documentary by San Francisco State professor Vivian Chávez that mixes poetry with music, interviews, archival footage, and images of community, nature, and dance to explain what Cultural Humility is and why we need it.
CREA in the 21st Century: The New Frontier : This blog from the Center for Culturally Responsive Evaluation and Assessment features articles about critical issues related to culturally-responsive evaluation practice.
Participatory Approaches : (PDF, 23 pages) This Methodological Brief from UNICEF provides a very accessible and detailed introduction to participatory program evaluation.
Participatory Evaluation : (Online Article) This page on BetterEvaluation gives a brief overview of participatory evaluation.
Self-Study Plan: Integrated, Creative, and Participatory Evaluation Approaches (Intermediate): (PDF, TXT) This self-study guide, which is part of the Evaluation Toolkit, focuses on integrated, creative, and participatory approaches to evaluation. It includes up to 6 hours of online training options, as well as in-person training opportunities. This intermediate-level plan will assist learners in describing the benefits of using participatory approaches to evaluation, identifying creative evaluation approaches, and developing a plan for integrating participatory options into evaluation work.
Putting Youth Participatory Evaluation into Action : (Video Presentation) This video presentation by Katie Richards-Schuster explains the process and benefits of engaging youth in evaluation work.
Evaluating Culturally-Relevant Sexual Violence Prevention Initiatives: Lessons Learned with the Visioning B.E.A.R. Circle Intertribal Coalition Inc. Violence Prevention Curriculum : (Recorded Webinar) This recorded webinar explores evaluating a culturally relevant prevention program and lessons learned.
Case Study: Culturally Relevant Evaluation of Prevention Efforts : (PDF, 14 pages) This case study examines the evaluation process of a culturally specific violence prevention curriculum.
Statement On Cultural Competence in Evaluation : (PDF, 10 pages) The American Evaluation Association developed this statement on cultural competence in evaluation to guide evaluators and also help the public understand the importance of cultural competence in evaluation practice.
Training and Capacity Building Activities
- To explore implications and responses to integrating social justice principles into evaluation, see the activity (begins on pg. 8) exploring social justice quotes.
- Check out the case study about integrating participatory principles and practices in prevention work and the accompanying activity (begins on pg. 15).
American Evaluation Association. (2011). American Evaluation Association public statement on cultural competence in evaluation. Retrieved from http://www.eval.org/d/do/154
Fetterman, D. M. (2015). Empowerment evaluation: Theories, principles, concepts, and steps. In D.M. Fetterman, S. J. Kaftarian, & A. Wandersman (Eds.), Empowerment evaluation (2nd ed.). Los Angeles, CA: Sage.
Guijt, I. (2014). Participatory approaches (Methodological brief #5). Retrieved from the United Nations Children’s Fund (UNICEF) Office of Research: http://devinfolive.info/impact_evaluation/img/downloads/Participatory_Approaches_ENG.pdf
Mertens, D. M. (2009). Transformative research and evaluation. New York, NY: Guilford Press.
Thomas, V. G., & Madison, A. (2010). Integration of social justice into the teaching of evaluation. American Journal of Evaluation, 31, 570-583. doi:10.1177/1098214010368426