Training & Capacity-Building Activities

About this guide

This guide presents detailed instructions for several activities that
• build evaluation capacity,
• assist with evaluation planning, and/or
• assist with data collection.

How to use this guide

The activities in this guide are meant to supplement the resources provided in the Evaluation Toolkit. If you are responsible for providing capacity-building technical assistance around program evaluation, these activities will help you translate concepts and transfer skills related to several aspects of program evaluation. Several of these activities could also be used as ongoing learning opportunities in organizational staff meetings or among a group of preventionists who want to form a community of practice around developing and honing program evaluation skills.

A slide deck created in Microsoft PowerPoint accompanies this set of activities so that you can easily integrate the activities into your training materials. If the accompanying slide deck has slides that correspond to an activity, the slide numbers are indicated in the activity instructions.

This is not a comprehensive set of activities but rather a set of field-tested activities that can become part of your tool belt. Additional resources for identifying training activities can be found at the end of the guide.

A Note About Liberating Structures

Several of the activities in this guide are inspired by Liberating Structures, a set of activities (or structures) based partially on the idea that people can do amazing thinking with each other even without a lot of hands-on facilitation. The structures support innovation and collective thinking. You can find out more about Liberating Structures and get instructions for all of the structures on the Liberating Structures website.

Human Spectrogram

Source: The human spectrogram is an activity from sociometry, a quantitative method for measuring social relationships.

Purpose: This activity can serve one or more of the following purposes. Depending on when you facilitate it and which questions you choose, the purpose could be to:
• get a general sense of the participants' thoughts, feelings, etc., regarding evaluation,
• model interactive, integrated data collection,
• highlight the connections between quantitative and qualitative data, or
• highlight the inadequacy of single points of data collection.

When to use: At the beginning of a training, preferably after ice-breakers and introductions

Time Required: 10 - 20 minutes

# of Participants: 5+

Materials Needed: Tape (optional), large open space

PowerPoint Slide-deck Slides: 1 - 5

Setting Up the Space: For this activity, you'll need a long section of a room – approximately long enough for all of the participants to stand next to each other in a line. You can put a piece of tape in a line on the floor from one wall to another or just point out the "in-bounds" area for the activity.

Choosing Prompts: Choose a set of dichotomous statements that relate to the purpose(s) for which you are using the activity. The ones provided in this activity (below and in the slide deck) are examples that have been vetted in trainings but are certainly not the only statements you can use. You can craft prompts that are specific to other aspects of evaluation (e.g., data collection) rather than using statements about evaluation overall. Just remember that the two statements should be extreme and should be more or less opposite viewpoints.
Sometimes you will have statements that are not exactly opposites because there are ranges of choices that don't fall between the two ends. That can still be effective, but make sure you are transparent about it and use it as an opportunity to talk about decisions related to crafting statements and questions for data collection. For example, to introduce people to the activity, you can start with a fun prompt such as "Pizza is the BEST food in the world vs. Pizza is the WORST food in the world" or "Cats are the BEST animals in the world vs. Dogs are the BEST animals in the world." (People who like a totally different animal are likely to feel left out by this prompt.)

Example 1
This set of statements is entirely feeling-oriented but will likely also bring about responses related to people's perceptions of and specific experiences with evaluation.
I love evaluation. vs. I hate evaluation.

Example 2
This set of statements will most likely elicit responses related to the value people see in evaluation, their experiences with it, and slightly less about their feelings related to it.
Evaluation is the absolute best part of my work. vs. Evaluation is the absolute worst part of my work.

Example 3
This set of statements will likely elicit responses that are focused on emotional relationships to evaluation and also on relative feelings about evaluation compared to program planning and implementation.
Evaluation is the most important part of my work. vs. Evaluation is the least important part of my work.

Example 4
This set of statements should get at information related to the participants' sense of utility with regard to evaluation and is likely also to get some conversation started related to feelings about evaluation.
Evaluation is a very important use of time. vs. Evaluation is a complete waste of my time.

Facilitation: Have the group move to the open section of the room. Explain to them that you will read two extreme statements that are more or less at opposite ends of a spectrum of opinions on something. Designate either two opposing walls, ends of the tape on the floor, or some other clear point in the room that corresponds to each extreme statement. If you have helpers in the room, they can stand at either end holding papers with the statements printed on them.

Announce the two statements and indicate which part of the physical space corresponds to each statement. Instruct the participants to stand at either end – or anywhere in between – to indicate their own feelings or thoughts about the two statements. Let them know that you won't be answering any clarifying questions about the statements and that they should place themselves somewhere based on their own interpretation of the statements.

Once everyone seems relatively settled, ask them to look around and notice the spread of responses. Explain to the group that they can change their position at any point if they hear something that changes their mind. (If someone moves, consider asking them to explain why.) Then, go to a few points along the spectrum to ask people to explain why they are standing where they are standing. It's a good idea to ask people at both extremes and the center. If you have time and people spread out enough, consider asking the people between each end and the center, too. When people explain the thoughts, feelings, perceptions, etc., that inspired them to stand where they are standing, use that opportunity to make connections to content you'll be covering, connections to comments others are making, etc.
Debriefing: After the activity is complete, share any reflections on the process or issues it highlighted about evaluation practice. For example, you might notice and want to share any of the following with the group:

• Prompt Interpretation – When you ask people to share why they are standing where they are standing, it might become clear that people are interpreting the prompts differently. This is a risk in developing questions for evaluation data collection, too.

• Data & Context – People standing in the same area of the spectrum of answers might give very different reasons for being there. Without having asked them why they responded the way they did to the prompt, you would have no way of knowing that. You would only have been able to count the number of participants in a given section of the continuum. This helps highlight the benefits of collecting both qualitative and quantitative data.

• Data Use – The activity you just conducted was a way for you to collect data about the people in the room. How will you use those data? What did you learn? What might you hope to be different if you did the activity again at the end of your time with the same participants?

You also might use this debriefing moment as an opportunity to fess up to the social-desirability bias inherent in the activity. Did people feel like they could be honest saying they hated evaluation with you in the room? With their boss in the room? What are the broader implications for data collection in evaluation? If you intend to directly address the above issues in your training, you can also wait until it's time to address them to use the activity as an example in your discussion.

Using this Activity as Part of the Training Evaluation: You can repeat this activity at or near the end of a training to see how people's thoughts, feelings, or opinions have changed. This is a quick and transparent evaluation of the workshop that can supplement your other methods of evaluating your efforts. If you want to use it this way, make sure you count how many people are standing in each part of the continuum and write down the comments they make about why they think or feel the way they do. You can use the tracking sheet on the next page to record responses or create a similar one of your own.

You can even further engage the participants in this process by collectively analyzing and interpreting the data you collect. If you leave approximately 30 minutes at the end of the training, you can compare the number of people standing at various places in the line from the beginning of the training to the end, read the comments, and get the participants' opinions on why certain changes did or did not happen. If you want to do this, consider following a format similar to the participatory data analysis outlined in this toolkit. For more on the human spectrogram, see this website.

Human Spectrogram Tracking Sheet

Date:
Name of Facilitator:
Prompt:
Pre / Post

1. Indicate the number of people standing at each part of the line by either writing numbers or drawing dots to represent where people are standing.
Strongly Disagree | Quadrant 1 | Quadrant 2 | Quadrant 3 | Quadrant 4 | Strongly Agree

2. Record comments from participants.
Quadrant 1 | Quadrant 2 | Quadrant 3 | Quadrant 4

1-2-4-ALL FOR EVALUATION CHALLENGES

Source: This is a riff on a structure from Liberating Structures (n.d.).

Purpose: This activity brings challenges out into the room so that they can be acknowledged or directly addressed.
When to use: Near the beginning of an evaluation training or the beginning of an evaluation planning process

Time: 10 - 20 minutes

# of Participants: 4+

Materials: Computer and projector or flipchart, whiteboard, or chalkboard

Slide-deck Slides: 6 - 7

Facilitation: Liberating Structures are based partially on the idea that people can do amazing thinking with each other even without a lot of hands-on facilitation. In a sense, the structure of the activity sets parameters that might otherwise need to be held by a facilitator. So, your job is primarily to explain the structure well and to hold to it with regard to both time and content. Pay attention to timekeeping duties during each round since it is your job to hold the structure and move it forward. Unlike what you might be used to doing when you facilitate activities, be a bit strict about the timeframes in this structure.

This activity is meant to replace an all-group share-out, to give more people a chance to talk, and to let the group begin to identify patterns. It seeks to prevent a few people from monopolizing a share-out opportunity and also to prevent share-outs that focus on a random assortment of items that will be recorded on a flip chart.

Give the group an overview of the process in addition to displaying the process on the slide, if possible. If you don't have access to a computer and projector, write the process on a flipchart, whiteboard, or chalkboard. The prompt (called an "invitation" in Liberating Structures) that will guide their discussion is:

What is challenging about evaluating primary prevention of sexual violence?

Feel free to modify the above as needed. The rounds are structured as follows:

Time: One minute (one person)
Activity: Participants take one silent minute on their own to reflect on the invitation/prompt above. People often struggle with the minute they are meant to spend on their own, but it's important to hold to this. Let participants know that they can use that time to think, write, draw, pace, etc. — whatever helps them start processing in response to the invitation. This helps the people who process their thoughts a bit more slowly to have a chance to think before they are invited to share with others. It also helps those folks who are quick processors to start to organize their thoughts and figure out which ones are most important to share.

Time: Two minutes (two people)
Activity: Participants form a dyad and share their thoughts with their partner. Ask participants to try to divide their time equally between the two of them. Give them a time check after one minute has passed. At that time, encourage them to switch if they have not already.

Time: Four minutes (four people)
Activity: Each dyad joins another dyad to share what they noticed, with a focus on patterns. This should be a sharing of highlights, connections, and surprises, and they should start to listen for patterns. Again, ask them to try to divide the time roughly equally among themselves, though they do not have to speak one at a time. Give them a time check when there is about one minute remaining.

Time: Six minutes (everyone)
Activity: The entire group shares patterns and highlights from the previous rounds. The all-group share should not just rehash everything that was said in the previous rounds; rather, it is an opportunity for people to share the patterns, surprises, and powerful things they heard.
Ask participants to write down the recurring challenges or patterns on sticky notes or flip chart paper. You can put those up somewhere in the room. Near the end of the day or training, you can use them as seeds for visioning or problem-solving using an activity like the two offered below.

Consider pairing this activity with one of the following:

• Troika Consulting – This structure allows individuals to get consulting around a specific issue they are having or envision having in their work. You would structure an invitation that asks them to consider challenges specific to their work around evaluation. They would then use their previous knowledge and experiences, in addition to their new knowledge and skills from the training, to help each other envision solutions.

• 25/10 Crowd Sourcing – This structure invites the entire group to think big about their next bold moves related to an issue. Through several rounds of sharing and voting, the top ideas are shared out with the entire group. It's a great closing activity to show what has been learned and what kind of excitement has been generated. And since all of the ideas get recorded on a notecard, you can collect them and use them as part of your own evaluation of the training or capacity-building effort! If you're doing this activity as part of an evaluation planning process, you can use these ideas to help move your planning process forward.

Evaluation and Social Justice: Quote Activity

Purpose: This activity provides an opportunity for participants to think about and struggle with perspectives on the intersection of evaluation and social justice. It will help them think about orientations to the practice of evaluation, including their own.

When to use: If you use this activity with a group that is relatively new to evaluation, or a group that has perhaps not thought much about how evaluation can support social justice work, use the first option below, which focuses on sparking their thinking and eliciting responses from them about the potential of evaluation as a social justice endeavor. This option is best placed at the beginning of a training or capacity-building effort, as it sets a foundation for people to be thinking about evaluation with a social justice lens. If you are working with a more seasoned group of evaluators, or even a more seasoned group of prevention workers who frequently think about social justice issues, and you want them to consider the implications of evaluation as a social justice endeavor in a more practical and action-oriented way, consider using Option 2. This activity could also be useful at the beginning of a planning process to orient an evaluation team to thinking about the social justice implications of evaluation.

Time Required: 15 minutes (Option 1) or up to 1.5 hours (Option 2)

# of Participants: 4+

Materials Needed: Quote sheets, writing materials, flip chart paper, and markers for tables

PowerPoint Slide-deck for Option 2: Slides 25 - 28 (What? So What? Now What? W3)

Option 1: Reflection & Exploration

This option is modified from the Liberating Structure 1-2-4-All (n.d.).

Time: One minute (one person)
Activity: Participants take one silent minute on their own to reflect on the quotes on the quote sheets. After everyone has had a chance to read their sheets, instruct everyone to take one minute of silent reflection to gather their thoughts about the quotes and the accompanying questions. They can take notes or not as they prefer, but everyone must remain silent for that minute.
People often struggle with the minute they are meant to spend on their own, but it's important to hold to this. Let participants know that they can use that time to think, write, draw, pace, etc. — whatever helps them start processing in response to the invitation. This helps the people who process their thoughts a bit more slowly to have a chance to think before they are invited to share with others. It also helps those folks who are quick processors to start to organize their thoughts and figure out which ones are most important to share.

Time: Two minutes (two people)
Activity: Participants form a dyad and share their thoughts with their partner. After one minute, instruct the participants to form dyads with another person. The groups of two have two minutes to share their thoughts and responses. They should try to divide the time evenly, and you should give them a cue at the halfway point.

Time: Four minutes (four people)
Activity: Each dyad joins another dyad to share what they noticed, with a focus on patterns. This should be a sharing of highlights, connections, and surprises, and they should start to listen for patterns. Again, ask them to try to divide the time roughly equally among themselves, though they do not have to speak one at a time. Give them a time check when there is about one minute remaining.

Time: Six minutes (everyone)
Activity: The entire group shares patterns and highlights from the previous rounds. The all-group share should not just rehash everything that was said in the previous rounds; rather, it is an opportunity for people to share the patterns, surprises, and powerful things they heard. Ask participants to write down the recurring challenges or patterns on sticky notes or flip chart paper. You can put those up somewhere in the room.

Option 2: Reflection & Planning

This option is an adaptation of the Liberating Structures Conversation Cafe (n.d.) and What? So What? Now What? W3 (n.d.).

Instructions: Distribute handouts with quotes and discussion questions. You will need groups of no more than 5 - 7 people. You can either give each group a different quote sheet or the same quote sheet. Consider giving each group butcher paper or flipchart paper to record drawings, notes, or anything else related to their process.

Explain to the group that the activity will take place in three rounds that will each have solo, small-group, and large-group components. Each round will also have a separate focus that will build on the previous round. Participants should not skip ahead to a later round's focus before being asked to do so, even though it will be tempting and easy to slip into the other areas without meaning to. The focus will move forward as follows:

Round 1
What? What do they notice? What responses do they have to it?

Round 1.1 (Time: One minute)
After everyone has had a chance to read their sheets, instruct everyone to take one minute of silent reflection to gather their thoughts about the quote. Specifically, they should focus on what they notice about the quote and what they notice about their reactions to it. They can take notes or not as they prefer, but everyone must remain silent for that minute.

Round 1.2 (Time: 10 - 15 minutes)
After one minute, they will begin a round of sharing in small groups of 5 - 7 people. Each person should be given an opportunity to share for approximately one minute. They can pass if they don't desire to share.
After each person has had one minute to share their own reflections, the group can begin to dialogue back and forth about what they've noticed. Again, they should try to focus on what they notice about the quote and what they notice about their own reactions to it.

Round 1.3 (Time: 10 minutes)
After 10 - 15 minutes pass, bring the entire group together for an all-group conversation for 10 minutes. Invite them to share the most profound or interesting sharing from their small-group discussions or to note themes that arose.

Round 2
So what? What meaning do they make of what they noticed in the first round? Why is it important? Why do they feel the way they do about it?
Follow the same protocol as the sub-steps of Round 1, but focus on the questions above.

Round 3
Now what? What are the implications of what they noticed? What actions are demanded by the information from the first two rounds? What needs to be done differently?
Follow the same protocol as the sub-steps of Round 1, but focus on the questions above.

Consider pairing this with TRIZ to further identify actions the group can take toward what this activity identified as possible next steps. Since this activity also presents an opportunity to make some connections back to the processes of evaluation, consider making the following debriefing points if appropriate.

Debriefing:
• What? So what? Now what? are also the three core questions of data analysis, interpretation, and use. This activity mirrors that process and shows how it can be done collectively.
• Reflexive practice is important for program evaluators who seek to be ethical practitioners and especially for those who hope that their evaluation practice will contribute to a more just and equitable society.

Evaluation and Social Justice: Quotes for Reflection

Quote 1
"In practical terms, [Culturally-Responsive Evaluation] CRE is much more than developing instrumentation that reflects local populations, especially those of color or traditionally underserved communities, but ask more fundamental questions that reverberate throughout the evaluation process. These questions include what and whose perspectives are represented in evaluation questions, instrument development, and/or communication of findings. Hence, the socially responsible stance of the culturally responsive evaluator and the evaluation contributes to thinking about social agendas that promote spaces of hope, praxis, and social action for indigenous, marginalized, dispossessed communities, as well as their contexts, histories, struggles, and ideals" (Hopson, 2009, p. 442).

Quote 2
"A social justice-oriented evaluation examines the holistic nature of social problems. It seeks to increase understanding of the interdependency among individual, community, and society using a more judicious democratic process in generating knowledge about social problems and social interventions and using this knowledge to advance social progress" (Thomas & Madison, 2010, p. 572).

Quote 3
"Those who engage in evaluation do so from perspectives that reflect their values, their ways of viewing the world, and their culture. Culture shapes the ways in which evaluation questions are conceptualized, which in turn influence what data are collected, how the data will be collected and analyzed, and how data are interpreted" (American Evaluation Association, n.d.).

TRIZ

Source: This is a Liberating Structure.
Purpose: This activity helps identify practices that inhibit a group from meeting one of its stated goals or purposes. For evaluation, it could be useful for trying to increase the social justice orientation of an evaluation, for focusing on evaluation use, or for keeping the evaluation scope in check.

When to use: Depending on the invitation you choose, this could be useful at the beginning, middle, or end of an evaluation planning process.

Time Required: 35 minutes

# of Participants: 4+

Materials Needed: Writing materials

Facilitation: TRIZ can be implemented per the instructions on the Liberating Structures website. The invitations below are possible guides for using the structure for evaluation purposes. After reading about the activity, feel free to design invitations that are more suited to your needs.

Social Justice and Participation:
• How can we ensure that this evaluation further marginalizes people who are already marginalized?
• How can we ensure that this evaluation keeps silent the voices of people whose voices are already unheard or under-heard?

Evaluation Focus:
• How can we make sure none of the data we collect is useful?
• How can we make sure none of the data we collect is used?

Evaluation Scope:
• How can we make sure we create an evaluation plan that is so unwieldy it never gets implemented as planned?
• How can we make sure we design an evaluation that ends up being way over budget?

Thinking About Data

Purpose: This activity helps participants apply their learning about different sources of data by identifying specific examples of each type that are available for them to use in their own work.

When to use: This activity works well in a training after explaining various data sources and giving a few examples.

Time Required: 15+ minutes

# of Participants: Any

Materials Needed: Copies of the chart below

Facilitation: This simple activity can be structured by giving small groups of up to four people copies of the chart and asking them to brainstorm sources of data in their own work. After 10 - 15 minutes of brainstorming, bring them back to the larger group to share their ideas. Or, if you want to facilitate a quicker version of the activity, you can give each group one data source to think about and have them take about five minutes to come up with ideas before sharing.

Thinking About Data: Worksheet

Data Options:

• Observational Data – Observational data come from directly observing behaviors of program participants or other members of a target audience or community.

• Focus Groups/Interviews – Focus groups and interviews are opportunities to get detailed descriptive data, including people's perceptions of their experiences in a program and reflections on how they have changed.

• Existing Documents – Existing documents include materials or records that are collected external to your evaluation efforts but which you might be able to access to assist in answering your evaluation questions.

• Questionnaires – Questionnaires include questions and other items to gauge people's attitudes, behavioral intent, knowledge, and other constructs.

• Creative Materials – Artistic and creative products like drawings, photos, murals, journal entries, poetry, etc. are also sources of data for evaluation.

Case Study: Participatory Evaluation

Source: The case used for this activity is based on an interview with a real preventionist.

Purpose: This activity supports critical thinking around real-world evaluation practices.
When to use: This case study activity is well-suited to long-term capacity-building efforts or to the middle or end of a longer training on evaluation, where it can help participants integrate several concepts.

Time Required: 30 minutes to one hour

# of Participants: 4+

Materials Needed: Copies of Case Study: Youth Participation Local Prevention Evaluation

Facilitation: This activity involves facilitating a group through responding to the following questions based on the case study on the next page. See the document Case Study: Youth Participation Local Prevention Evaluation (Facilitator Notes) for additional guidance on each question. To facilitate answers to these questions, consider the following ideas:
• Model the discussion using 1-2-4-All, facilitating one round of 1-2-4-All per question.
• If you're more pressed for time, you can facilitate something more like Conversation Cafe, with each of five separate groups receiving one of the questions below.

Questions
1. Describe the participatory nature of this case.
2. How might participation in evaluation processes impact the students in this example? What are the social justice implications of their involvement – either positive or negative?
3. What contextual factors do you see that might have limited the options for participation? What other factors do you imagine or assume might have limited participation options?
4. What, if any, ethical issues are raised in this case example?
5. What, if any, methodological issues are raised?

Case Study: Youth Participation Local Prevention Evaluation

As part of her agency's school-based prevention efforts, Allison coordinates a group of student leaders. When the time came to develop outcome measures for her program, she facilitated several brainstorming sessions with these student leaders in order to get a better sense of what was happening in their communities. The information from those conversations was used in conjunction with Allison's own knowledge and skills related to program evaluation to come up with a list of possible items for an evaluation tool. From there, the young people helped decide which items to keep, which to modify, and which to remove. They also pilot tested the tool before she used it. When it becomes outdated, Allison intends to take it back to the group for further refinement. Allison noted that at the end of this process, these students knew what they wanted to change, why, and how they would measure it.

She also considered having the young people involved in participatory data collection through various kinds of formal observation but decided against it because she felt it might contribute to drama in the small school she was working in and also didn't think she could control for bias among the student leaders. However, she informally has students observe what's happening among their peers and report back to her; she uses this information in shaping program direction.

When it's time for analysis and interpretation, she will sometimes sit down with participants and with colleagues to look at the data in different ways, come up with different possible interpretations, see what's connected, and so on. They try to figure out why some things changed but not others. Allison notes that it is helpful if the group is representative of the students who will be participating in programming. She noticed that when the group was less representative of the participants, some of the language and concepts in the items chosen were problematic for other students.
She advocated to expand her leadership group to include students whom the administration did not regard as typical leaders. One such student, who was failing her classes, managed to pull her grades back up after data from the evaluation process helped her see that she could make a difference.

Case Study: Youth Participation Local Prevention Evaluation (Facilitator Notes)

The following provides potential answers to the questions posed about the case to help the facilitator guide the conversation.

Questions:

1. Describe the participatory nature of this case.
The students in this case study were involved in the design of the evaluation tool by providing information to the preventionist that she could use to draft items for a pre- and post-test. The students were able to provide feedback on potential items and pilot the instrument prior to its implementation. They were also sometimes involved in helping make sense of the data. So, they had some level of participation in several stages of the evaluation, but they did not have primary control over the process.

2. How might participation in evaluation processes impact the students in this example? What are the social justice implications of their involvement – either positive or negative?
In addition to the specific example offered by Allison, the student participation might impact them in a few other ways. Their involvement in this process might be their first introduction to data collection and analysis, and they are being exposed to new skills through their involvement. This process could also build critical-thinking skills, especially during the analysis and interpretation stage, where they have to help think through the meaning and implications of the data. Involving them in the process also demonstrates that their opinions are valued and might increase their buy-in to both the program and the process of evaluation more generally. Part of what must be considered when involving other people in evaluation processes is how they are compensated for their time and involvement. They should be able to receive some benefit from their participation. In the case of young people, sometimes they are involved in processes where they are the only members who are not being compensated for their time in some way.

3. What contextual factors do you see that might have limited the options for participation? What other factors do you imagine or assume might have limited participation options?
The young leaders in this case do not necessarily have the level of skill and expertise necessary to fully engage in the development and implementation of the evaluation. It is possible that there was either not enough time or not enough resources to train them to be more engaged in the evaluation process. Although it is not addressed explicitly in the case, it might be fair to assume that the prevention worker initiated the evaluation either at the urging of leaders in her organization or as a requirement of a funder. Especially in the latter case, it is not unusual for many aspects of the evaluation to be stipulated in advance, thus limiting the amount of control the agency has and also the amount of true participation it can seek out from other stakeholders.

4. What, if any, ethical issues are raised in this case example?
This would be a good place to discuss various guiding principles from the American Evaluation Association (2004). The following excerpted principles seem especially relevant to this case.
You might also consider reviewing the principles and pulling others that you deem relevant to the case.

• Respect for People
- Seek a comprehensive understanding of the contextual elements of the evaluation. By working directly with the students, Allison seeks both to better understand the community with which she is working and to tailor her evaluation tool accordingly. Continuing to check in with the students reflects an understanding that contexts change and that those changes need to be responded to. This is also evident in the way the leadership group was expanded to accommodate a greater diversity of students.
- Foster social equity in evaluation, when feasible, so that those who give to the evaluation may benefit in return. Young people are not often given a chance to have a say in the way programs or evaluations are run, so sharing power with them at some level helps to foster equity in this evaluation; additionally, it is clear that Allison hopes that sharing the data with them will help them understand their community better.

• Responsibilities for General and Public Welfare
- Include relevant perspectives and interests of the full range of stakeholders. The case does not show whether or not Allison is engaging a full range of stakeholders, but it is clear that she is involving the group that is probably the least powerful among all of the stakeholders: the youth themselves. These are also the stakeholders who will interact most directly with the evaluation. Other stakeholders to consider involving for school-based prevention work include members of the school faculty and administration, caregivers, and funders.
- Allow stakeholders access to — and actively disseminate — evaluative information and present evaluation results in understandable forms that respect people and honor promises of confidentiality. When feasible, Allison shares information from the evaluation with the young people. Judging by the note in the case that this process helped one student understand that she can make a difference in the lives of her peers, it seems that Allison presents the data in ways that are accessible to the young people involved.

5. What, if any, methodological issues are raised?
As is pointed out in the case study, the pre- and post-questionnaire is the only formal data collection tool employed for this evaluation. Although Allison collects observational data through student report-backs, this is not done in a systematic way, so those data carry less weight than the data collected through the questionnaires. Also, considering the potential pitfalls of paper-and-pencil measures like questionnaires, this might not be the best choice for this particular group of participants, or it might need to be supplemented by additional measures — like a more systematic observational data-collection process — to more fully understand the impact of the intervention.

INTERVIEW PRACTICE

Source: This activity was inspired by Appreciative Interviews (Liberating Structures, n.d.).

Purpose: This activity highlights the natural process of qualitative data collection and analysis by engaging people in the process. It also serves as part of the evaluation of the training or capacity-building effort.
Time Required: 1 - 1.5 hours (while not recommended, this can be condensed into less than an hour)

# of Participants: 8+

Materials Needed: Writing materials

PowerPoint Slide-deck: Slides 15 - 22

Facilitation: This activity should come after the group has been introduced to some general theory and process related to qualitative data analysis. If you have a preferred process to recommend for data analysis, feel free to train the group on that. If not, there are slides in the slide deck that outline a process. In order to adequately explain the process to the group, review the procedure in Listening to Our Communities: A Guide on Data Analysis (The National Sexual Assault Coalition Resource Sharing Project & National Sexual Violence Resource Center, 2014).

The activity itself is divided into five steps, each of which constitutes a round. Your only duties for rounds one through four are to keep time and offer the prompts. You also might want to remind participants every now and again not to skip ahead to the next step. It's a natural tendency to jump ahead to meaning making or recommendations, but it's important to really understand the data first. This takes some training!

Step One: Gather Information (15 - 20 minutes)
Step 1 is the interview (i.e., data collection) phase. This step should take 15 - 20 minutes, with the time split evenly between the two partners. They will interview each other according to the following guide (or your own modification of it):
Ask your partner the following questions (and take notes!):
• What has your experience been like during this training?
• What are some aspects that have really worked for you?
• And some aspects that haven't quite worked for you?

Step Two: Combine Information (20 - 30 minutes)
This round provides an opportunity to begin to synthesize data and also to validate that each partner heard correctly (i.e., data verification). Then, they can begin to notice and discuss similarities once all of the stories have been shared.
Join with another dyad to form a group of four. Tell the rest of the group your partner's responses to the questions.
• What similarities do you notice?
• What differences really stand out to you?

Step Three: Meaning Making (15 minutes)
After they've noticed patterns, they can start to make meaning from (i.e., interpret) those patterns. This round should take about 15 minutes.
• What do you make of the similarities? What does that suggest about people's training experiences?
• What do you make of the differences? What does that suggest about people's training experiences?
• Is there anything significant about what was not said?

Step Four: Make Recommendations (time varies)
Now that they've made meaning, they can make recommendations based on the data. Based on all of the information above, if this training were to be done again:
• What should be done in the same way?
• What should be done differently?

Step Five: Meta-Evaluation (time varies)
Depending on how much time you have, Step 5 can be facilitated using 1-2-4-All, Conversation Cafe, or standard small-group discussions. Reflect on this activity:
• What was challenging about collecting, synthesizing, analyzing, or interpreting the information (i.e., data) in this activity? What was easy or seamless about it?
• How might this process be different with much more data to collect, synthesize, analyze, and interpret?
• How might you implement a similar process that involves participatory data collection, synthesis, analysis, and interpretation?
• How might this process be different if only one person was collecting, synthesizing, analyzing, and interpreting the data? How is the process impacted by whether that person is internal or external to the process?

Participatory Data Analysis: Data Placemats

Source: This activity was modified from the data placemats process supported by the Innovation Network (Pankaj, Welsh, & Ostenso, 2011). What? So What? Now What? W3 is from Liberating Structures (n.d.).

Purpose: This activity can fill one of two purposes depending on which option you choose. Option 1 is geared toward capacity building of evaluators or stakeholders: using mock data, it involves participants in learning how to use data placemats and facilitate participatory data analysis by going through the process themselves. Option 2 is geared toward actual participatory analysis and involves data from your own evaluation and real stakeholders.

Time Required: Two hours or more

# of Participants: 3+

Materials Needed: Copies of either the sample data placemat or a real placemat with data from your own evaluation

PowerPoint Slide-deck: Slides 23 - 28

Option 1: Mock Participatory Interpretation

Facilitation: Give the participants an overview of the data placemat. Explain that these data are examples of the types of data that might be collected to ascertain an intervention's impact on bystander behaviors. The data come from three sources:
• Scores from a pre- and post-test administered to students that included items to assess their opportunities to intervene in problematic behaviors, their sense of efficacy in effectively intervening, and their intention to intervene.
• Existing data from the school, which collects monthly information on the number of bullying incidents, incidents of sexual harassment, and instances of bystander intervention.
• Focus group data from students in each age group. The quotes were chosen because they were representative of high-level themes from a preliminary round of data analysis.

Based on the data available on the data placemat, facilitate What? So What? Now What? in the following way. Note that these times are approximate, and you can check in with the group to see if they need more or less time to complete a given step. Certain steps might seem more straightforward than others for different groups.

Round 1
What? What do they notice in the data? What stands out to them?

Round 1.1 (Time: One minute)
After everyone has had a chance to read their placemats, instruct everyone to take one minute of silent reflection to gather their thoughts about the data. Specifically, they should focus on what they notice in the data and what they notice about their reactions to it. They can take notes or not as they prefer, but everyone must remain silent for that minute.

Round 1.2 (Time: 10 - 15 minutes)
After one minute, they will begin a round of sharing in small groups of 5 - 7 people. Each person should be given an opportunity to share for approximately one minute. They can pass if they don't desire to share. After each person has had one minute to share their own reflections, the group can begin to dialogue back and forth about what they've noticed. Again, they should try to focus on what they notice in the data and what they notice about their own reactions to it.

Round 1.3 (Time: 10 minutes)
After 10 - 15 minutes pass, bring the entire group together for an all-group conversation for 10 minutes. Invite them to share the most profound or interesting sharing from their small-group discussions or to note themes that arose.
Capture the responses, especially the disparate or conflicting ones.

Round 2 (Time: 20 - 35 minutes)
So what? What meaning do they make of what they noticed in the first round? Why is it important? Why do they feel the way they do about it?
Follow the same protocol as the sub-steps of Round 1, but focus on the questions above.

Round 3 (Time: 20 - 30 minutes)
Now what? What are the implications of what they noticed? What actions are demanded by the information from the first two rounds? What needs to be done differently in the program, in the evaluation, etc.?
Follow the same protocol as the sub-steps of Round 1, but focus on the questions above.

Consider following this activity up with TRIZ.

Option 2: Actual Participatory Interpretation

Preparation: Prior to the meeting, you will need to do some preliminary data analysis and design a data placemat.

Facilitation: Give the participants an overview of the data placemat. Explain the sources of the data and how the preliminary analyses were conducted. Based on the data available on the data placemat, facilitate What? So What? Now What? in the following way. Note that these times are approximate, and you can check in with the group to see if they need more or less time to complete a given step. Certain steps might seem more straightforward than others for different groups.

Round 1
What? What do they notice in the data? What stands out to them?

Round 1.1 (Time: One minute)
After everyone has had a chance to read their placemats, instruct everyone to take one minute of silent reflection to gather their thoughts about the data. Specifically, they should focus on what they notice in the data and what they notice about their reactions to it. They can take notes or not as they prefer, but everyone must remain silent for that minute.

Round 1.2 (Time: 10 - 15 minutes)
After one minute, they will begin a round of sharing in small groups of 5 - 7 people. Each person should be given an opportunity to share for approximately one minute. They can pass if they don't desire to share. After each person has had one minute to share their own reflections, the group can begin to dialogue back and forth about what they've noticed. Again, they should try to focus on what they notice in the data and what they notice about their own reactions to it.

Round 1.3 (Time: 10 minutes)
After 10 - 15 minutes pass, bring the entire group together for an all-group conversation for 10 minutes. Invite them to share the most profound or interesting sharing from their small-group discussions or to note themes that arose. Capture the responses, especially the disparate or conflicting ones.

Round 2 (Time: 20 - 25 minutes)
So what? What meaning do they make of what they noticed in the first round? Why is it important? Why do they feel the way they do about it?
Follow the same protocol as the sub-steps of Round 1, but focus on the questions above.

Round 3 (Time: 20 - 25 minutes)
Now what? What are the implications of what they noticed? What actions are demanded by the information from the first two rounds? What needs to be done differently in the program, in the evaluation, etc.?
Follow the same protocol as the sub-steps of Round 1, but focus on the questions above.

Additional Resources

Building Evaluation Capacity: Activities for Teaching and Training by Hallie Preskill and Darlene Russ-Eft. This book offers many activities for building capacity around a variety of evaluation skills.
Note: If you don’t have a high level of capacity and knowledge yourself, these activities might be difficult to facilitate since there is no information given to help with facilitation of the content of the activities. They are structured in a way that assumes the facilitator has a particular breadth and depth around evaluation that does not require detailed information in the activity descriptions. Building Capacity in Evaluating Outcomes: A Teaching and Facilitating Resource for Community-Based Programs & Organizations from the University of Wisconsin-Extension. This document contains useful activities and lessons for increasing evaluation capacity building with a specific focus on nonprofit work. References American Evaluation Association. (2004). American Evaluation Association guiding principles for evaluators. Retrieved from website: http://www.eval.org/p/cm/ld/fid=51 American Evaluation Association. (2011). American Evaluation Association public statement on cultural competence in evaluation. Retrieved from http://www.eval.org/p/cm/ld/fid=92 Hopson, R. K. (2009). Reclaiming knowledge at the margins: Culturally responsive evaluation in the current evaluation moment. In K. E. Ryan, & J. B. Cousins (Eds.), The SAGE international handbook of educational evaluation. Los Angeles: SAGE. Knowledge Sharing Toolkit. (n.d.). Human spectrogram. Retrieved from http://www.kstoolkit.org/Human+Spectrogram Liberating Structures. (n.d.). 1-2-4-all. Retrieved from http://www.liberatingstructures.com/1-1-2-4-all/ Liberating Structures. (n.d.). Appreciative interviews (AI). Retrieved from http://www.liberatingstructures.com/5-appreciative-interviews-ai/ Liberating Structures. (n.d.). Conversation café. Retrieved from http://www.liberatingstructures.com/17-conversation-cafe/ Liberating Structures. (n.d.). What, so what, now what? W3. Retrieved from http://www.liberatingstructures.com/9-what-so-what-now-what-w/ The National Sexual Assault Coalition Resource Sharing Project, & National Sexual Violence Resource Center. (2014). Listening to our communities: A guide on data analysis. Retrieved from http://www.nsvrc.org/sites/default/files/publications_nsvrc_guides_listening-to-our-communities_guide-for-data-analysis.pdf Pankaj, V., Welsh, M., & Ostenso, L. (2011). Participatory analysis: Expanding stakeholder involvement in evaluation. Retrieved from Innovation Network: http://www.pointk.org/client_docs/innovation_network-participatory_analysis.pdf Thomas, V. G., & Madison, A. (2010). Integration of social justice into the teaching of evaluation. American Journal of Evaluation, 31, 570-583. doi:10.1177/1098214010368426 © 2018 National Sexual Violence Resource Center. www.nsvrc.org | prevention@nsvrc.org | (877) 739-3895