0:00:00.7 Sally Laskey: Welcome to Resource on the Go, a podcast from the National Sexual Violence Resource Center on understanding, responding to, and preventing sexual abuse and assault. I'm Sally Laskey, NSVRC's evaluation coordinator. On today's episode, I sit down with my prevention team member, Mo Lewis, to talk about our thoughts on how evaluation can support social justice work. [music]

0:00:43.2 SL: Hi Mo, so glad to have you on the podcast with me.

0:00:47.2 Mo Lewis: Thanks, Sally. It's kind of fun 'cause we work together, but we don't do a lot of podcasts together.

0:00:53.2 SL: I know. And I realize that I jumped into talking with folks in the field on the podcast about evaluation without first having the general discussion: why is the NSVRC talking about evaluation on their podcast?

0:01:11.4 ML: That's a good question.

0:01:15.1 SL: So I reached out to you so that we could share some of our thoughts on how evaluation can support prevention efforts. So Mo, do you have a specific memory of your first experience with evaluation?

0:01:30.9 ML: Yeah, I used to take a lot of evaluations. I would volunteer for an HIV services organization and go through programs, and evaluation was always part of those programs. And then when I was working at an LGBTQ youth center after college, we did a lot of HIV prevention work, and those programs looked like the ones I did when I was younger; they always included an evaluation, so we could have people fill out the survey and see what they learned. And it was always really fun for me to look back at the surveys, read the comments, and get to see what people said about it. So then when I started leading programming, leading the HIV prevention programming and then leading some sexual assault prevention programming, I had to create evaluations, and that really led me down this whole rabbit hole of learning about evaluation.
0:02:28.1 SL: Yeah, and I think I jumped down that same rabbit hole that you were just talking about. I don't know if I really received the grounding in evaluation principles that I gained later, but as a prevention educator in college, I was developing educational programs with other students, and we had this one-page evaluation that I remember gathering up from locker rooms, student union building conference rooms, fraternity basements, and sometimes classroom settings. It was the '90s, and I was reading those evaluation results like report cards. I'm an adult daughter of two now-retired teachers, and report cards were a big deal in my house, and I wanted to make sure that I was doing a good job, that I was doing the right things. I'm not sure if those one-pagers actually told me the full story, and I honestly don't remember if they helped me to improve the programs I was working on when I think back several decades ago, but I really do remember how they made me feel.

0:03:49.7 SL: And in this world of prevention, we are always learning. And for me, framing evaluation as part of our overall learning process really helps to center our discussions on a shared vision for the future, which I don't think was part of my initial experience with evaluations. So I think that in doing this now, it helps to reduce some of the fear that I think we have attached to doing prevention work, but it also sparks some creativity in different ways. Only talking about evaluation as a funder mandate often disconnects us from those core values of wanting to change the world. So I'm wondering if we could talk a little bit about our core values and how, in your eyes, evaluation can be used for social justice work.

0:04:57.9 ML: Oh yeah. There's a lot there.
We think a lot, or hear a lot, in this work about how we want things to be research-based or best practice, and a lot of that comes from research, which is largely inaccessible. Even for prevention programs, it's really rare to be able to have the sort of randomized controlled trial that's considered best practice. The other thing about research is that it has been, and is still, used to extract knowledge from communities. The phrase for this is, I think, "helicopter evaluation" or "helicopter research," where someone who is not part of a community will kind of just zoom in, do some work there, do evaluation there, do research there, and then just go, "bloop!" and take out the information. And that's really not part of our core values; it's not related to anything close to justice at all. It's extractive, it's bad.

0:06:03.7 ML: So in our work, I think we have a chance to build evidence of our work's effectiveness, and we do that through evaluation. And even though evaluation is different than research, I think it's still really important to think about those core values and what we're really trying to do, because we are building that evidence base, we are showing the effectiveness of our work, and that can help. The other thing I think about is that evaluation is not separate from our work; it's not some add-on thing. When you were talking about how it can be a funder requirement, it definitely can feel like an add-on. It can feel like, "Well, maybe this isn't what we're really here to do," but it is, because things like consent, accessibility, and inclusion are core values of every prevention program that I've ever heard of, and they can really be part of our evaluation. We wanna know if those things are happening, and we can build them into our evaluation to make it so.
0:07:14.6 SL: Absolutely. Mo, you've brought up so much that we could talk about, but I wanna key in on this idea of knowledge creation and knowledge extraction, and how you just talked about our core values around consent, accessibility, and inclusion. I think this is where using a social justice-oriented evaluation approach is so critical, 'cause there are hundreds of different approaches to evaluation that all serve a different purpose, and I'm really interested in using evaluation to help tell our prevention stories in a way that centers the experiences of the communities that we work with and support. There's this quote from Veronica Thomas and Anna Madison in an article they wrote about teaching evaluation as a tool for social justice: "A social justice-oriented evaluation examines the holistic nature of social problems. It seeks to increase understanding of the interdependency among individual, community and society using a more judicious democratic process in generating knowledge about social problems and social interventions, and using this knowledge to advance social progress." I actually have this on my desk, and I read it every day. I think this approach can look different in different contexts, but to me, how it centers connection, relationships, and power sharing is what really motivates me to do this work and connects to those core values that you talked about.

0:09:27.8 ML: Yeah, I really love that. I mean, there's so much that is interwoven in our work, and I think about the social ecological model with that individual, community, and society mention in the quote, but also this idea that we are sharing power and that we're centering connection and relationships. That is a thing that is not separate from evaluation; that's part of evaluation. And if it doesn't feel like it's part of evaluation, then I think that's a place to examine.
It definitely did not used to be a part of my evaluation, for sure. It used to really be: what were my ideas of what we should evaluate, and what did the funders want? I'm gonna create a survey, and then everybody's gonna take the survey. And did I share the information back? Yeah, usually I did; I think I was pretty good at that. But did I always talk with folks about what it meant for our next steps? Not really. So yeah, if we think about the importance of connection, relationships, and power sharing, I think it sets us up in a really good place. We have our own needs as people who are running the program, but I think the community that's part of that prevention work has just as many, if not more, needs from evaluation. So they really need to be part of evaluation from the beginning. They can help us figure out: what do we wanna learn from an evaluation? What's the information that they wanna know? What are the best ways to do the evaluation?

0:11:01.5 ML: Maybe it's a survey, maybe it's not. And then they can also help us make meaning from the information that we learn, to then help us shape the direction of where we go next. And I think, if you put all of that in your evaluation process, you're gonna get to a pretty good place.

0:11:22.0 SL: Yes, I think you're right. What's been exciting to me, and I will agree with you that this was not always how I thought about evaluation. And honestly, how I think about and do evaluation changes every day; I'm always learning more. But this approach of integrating social justice into your evaluation, so that it's part of your social change efforts and all of the work happening in our field to really change things at the community level, is helping me have a stronger lens to examine systems specifically, the impact of our organizational practices, and even the partnerships that are needed to help communities thrive.
And for me, it's these evaluation questions that, when framed to understand specific racial inequities, for example, can help to uncover infrastructure needs and policy gaps. And it's also the day-to-day evaluation practices that can help center the needs of diverse community members. The work of Jara Dean-Coffey and her Equitable Evaluation Framework has really inspired me to approach evaluation work with a focus on building and supporting those collective values to end sexual violence and build equity in our communities. And I feel like every day, I am unlearning, I'm re-learning some things I forgot, [chuckle] and I'm exploring new ways to really use evaluation to support our overall mission and to help me learn from community members in a more holistic way.

0:13:36.3 ML: I really love that, and I really relate too, because my approach to evaluation has changed as well. I feel like I wanna tell you about this one time I was very ambitious with evaluation, and this is not in my notes, but it was too much. I looked at these rape myth acceptance scales, and I kind of scoured the internet for the things that are considered best practices in looking at these scales and things that I could build into an evaluation. And I made the survey, and it was so long, but I was just really thinking, "Well, gosh, if some information is good, then more information is better." I think I ended up with a survey that was maybe three or four pages long. [chuckle] And I think about it now, and it's horrible. I was working with all these really cool young people, and I was like, "Here, take this survey," and I think they were like, "What are you talking about? How does this relate to what we're doing? Where did you get this? These are weird questions."
Yeah, I just wanted all the data; I wanted to be able to justify the work that we were doing, tell that story, and help make those bigger changes, but I wasn't going about it in the way that I needed, or that the community needed. So I did end up eventually shortening the survey, [chuckle] and I incorporated other evaluation techniques, which was good.

0:15:07.3 ML: And yeah, it's funny, because the more I learn about evaluation, the more fun it is to change things up. And the thing that you and I have been doing lately is these stakeholder interviews as part of our own internal evaluation work, and I really like it. I still find that I get that kind of nerdy data fix from coding the interviews to pull out the themes, but then it also feels really good to not be asking people to take another survey, and to be able to have that one-on-one connection to talk with someone about their experience.

0:15:44.4 SL: Oh, I completely agree. Well, thank you so much for sharing that story. I think it's really helpful for us to talk through things that didn't work in evaluation for us, and how we made changes, and why. And the second part of what you just talked about: these things don't end up in grant applications, and it's not typically something that's part of a project we're working on, but evaluation is critical to the relationship building that has to be part of any effort to create change. If there isn't trust, if there isn't understanding and relationship there, then who's gonna be invested in the change, or really understand what's needed? So I do appreciate that we are looking at the importance of relationship building in our prevention work; of course, it then needs to be part of, and there needs to be space for it in, the evaluation work that's attached to and integrated into those projects.
And oh my gosh, I could tell so many stories about my eagerness to extract, as you said earlier, all this data. And I have this constant mantra.

0:17:20.4 SL: I say to myself, "Why am I asking this question? Where is it coming from? Who does it serve? Is that a question that the folks we're working with have?" And I'm really trying to look at who is being best served by the evaluation data that we're gathering. And while I love the art of survey design, I think it can be very creative and very fun, and there are lots of ways that we can do survey development in a participatory way, talking with community members about the design, getting feedback, and making sure it resonates and is really collecting the data that people want. But I agree with you about that rich data that comes from the conversations we get to have with people through some of our internal evaluation-focused stakeholder interviews. They really do help me to understand and share more meaningful information about the prevention story in our field. So, we've talked about how our evaluation approaches should be driven by the values, priorities, styles, and communication needs of community members. In episode... I think it was 19, I got to talk with long-time friend and advisor Strong Oak about how they had been able to put Indigenous values at the center of their new model for evaluating their prevention program.

0:19:10.7 SL: I learned so much from those conversations about honoring important traditions while at the same time expanding the ways that evaluation is incorporated into our social change efforts. And that's one reason why I wanted to come back to this conversation and check in with you, and remind folks that if they need some inspiration, we are gathering and organizing tools from around the field of sexual violence prevention and connected movements for our online evaluation toolkit.
Folks can access that through the NSVRC website, and I'll put the link in our show notes. I think we're hoping that this conversation between us might spark some interest in evaluation, and folks might wanna start thinking and talking more about evaluation as social justice. Actually, section three of our toolkit is focused on that, and there are also some self-study guides available in the toolkit introduction. If folks kinda wanna keep exploring that with us, we'd invite you all to take a look and then give us some of your feedback.

0:20:31.1 ML: Yeah, we would love to hear what you think. We also have e-learning courses, including a whole series about data analysis, which personally I found very helpful. I really like being able to go through a course like that at my own pace and then kind of check my knowledge as I go. And what else do we have? There's also the Human Spectrogram course, which is an e-learning course that's pretty fun. If you are getting together with people in person, this is an activity-based evaluation technique that you can use in your work, and it also raises the eternal debate of whether pineapple belongs on pizza.

0:21:12.6 SL: We have officially entered into controversy...

0:21:15.6 ML: Oh yes.

0:21:19.5 SL: [laughter] Controversial territory, Mo. So before we blow up the internet, I wanna thank you so much for chatting with me today, Mo. I love talking about and doing evaluation work with you. If you, our listeners, have a burning evaluation question or topic you would like us to explore in our 2023 episodes, outside of pizza toppings, please email us at resources@nsvrc.org.

0:21:47.2 SL: Thank you for listening to this episode of Resource on the Go. For more resources and information about preventing sexual assault, visit our website at nsvrc.org. To learn more about our evaluation work, including NSVRC's evaluation toolkit, visit the episode resources at nsvrc.org/podcasts.
You can also get in touch with us by emailing resources@nsvrc.org. [music]