Tools, Methods, and Activities for Participatory Evaluation

Implementing a participatory evaluation process can feel daunting because of the many steps you need to navigate. The chart below highlights guiding questions for each phase of participatory evaluation, along with tools, methods, or activities that might assist you at that stage. Click on a tool, method, or activity to see more information about it and get links to additional resources.

Step: Planning/Design
Considerations: What is the problem? How should we define and conceptualize it? What is the vision for what would be different or better if the problem is addressed? What are the most pressing evaluative questions? Which questions lead to actionable answers? What would program participants prioritize? How is that different from what the evaluator or program implementers would prioritize? What questions, if any, are you required to ask?
Tools/Methods/Activities: Purpose to Practice (P2P); Rich Pictures; Nine Whys; Min Specs; Dot Voting; Poll Everywhere

Step: Data Collection
Considerations: What types of data feel credible to program participants? What types of collection methods are more or less intrusive? What skills do participants need to build to participate in data collection?
Tools/Methods/Activities: Poll Everywhere; Activity-Based Assessment; Most Significant Change

Step: Data Analysis and Interpretation
Considerations: What do the participants see in the data? What meaning or sense do the participants make of the data? What do they think caused any clear shifts or prevented shifts? What context do they think is critical for understanding the data?
Tools/Methods/Activities: Data Placemats paired with What? So What? Now What? W3

Step: Data Use
Considerations: What are the implications of the data? What needs to be changed about the intervention? What needs to stay the same? What needs to be tweaked about the evaluation tools or processes?
Tools/Methods/Activities: Ecocycle Planning

Purpose to Practice (P2P)
Purpose to Practice (Liberating Structures, n.d.) involves a group of stakeholders in collectively developing a statement of purpose for their work together, the principles that will guide the work, and the practices they will engage in to meet that purpose. Instructions for implementing the structure can be found on the Liberating Structures website.

Rich Pictures
Rich Pictures (Stevens, 2016) involves collective drawing to better understand a problem or issue. You can also have the group draw another rich picture depicting their vision for what things will be like when the problem or issue has changed in a desired way. For more information, check out this article from BetterEvaluation.

Nine Whys
Nine Whys (Liberating Structures, n.d.) helps groups of people uncover the deeper purpose, guiding principles, or intentions behind their work. In an evaluation planning process, this can help clarify the evaluation focus and uncover the reasons why people want to focus on certain evaluation questions over others. Surfacing the deeper purpose of potential areas of focus helps determine which are the most meaningful and useful. Find out more information about how to implement this here.

Min Specs
Min Specs (Liberating Structures, n.d.) is an activity that helps a group of people determine what absolutely must be done (and not done) to achieve a stated purpose. For evaluation practice, this can help a group determine the minimum specifications for answering given evaluation questions or for collectively implementing an evaluation more generally. These minimum specifications then guide all future decision-making.

Dot Voting
Dot voting (Gray, 2010) is a transparent and participatory method of decision-making or prioritization. Generally, everyone in a group is given a certain number of dots to use to vote for one or more ideas among many, and they can distribute their dots in any manner they choose. You can find more information on how to do this from Gamestorming.
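If dots are collected on paper or in a shared spreadsheet, a few lines of code can tally and rank them for the group. The sketch below is a minimal Python example; the candidate questions, the number of dots per person, and the votes themselves are hypothetical, and nothing about dot voting requires software at all.

    # Minimal sketch: tallying dot votes to rank candidate evaluation questions.
    # The questions and votes below are hypothetical examples.
    from collections import Counter

    # Suppose each participant received three dots; each entry in `votes`
    # records one dot placed on one candidate question.
    votes = [
        "Did participants' attitudes about consent change?",
        "Which activities were most engaging?",
        "Did participants' attitudes about consent change?",
        "Did skills transfer outside the program?",
        "Did participants' attitudes about consent change?",
        "Which activities were most engaging?",
    ]

    tally = Counter(votes)

    print("Priority ranking (most dots first):")
    for question, dots in tally.most_common():
        print(f"{dots} dots - {question}")

Displaying the ranked tally back to the group right away keeps the prioritization transparent, which is the point of the method.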
Poll Everywhere
Poll Everywhere is a real-time polling program that works through presentation software (such as MS PowerPoint) to collect and immediately display polling data from your audience. Participants can submit responses through text message or online using phones, tablets, or computers. Evaluation stakeholders could cast votes for their priority questions or focus areas during a meeting, and results can be displayed immediately for discussion. Such a program can be useful for many collaborative processes and also for data collection. For example, you could use it to collect data from participants to establish a baseline of their knowledge or attitudes about something; you can capture those data and immediately display them to the group to talk about what their responses mean.

Activity-Based Assessment
Activity-Based Assessment (or activity-based evaluation) integrates data collection into existing curriculum-based efforts so that learning integration can be assessed at discrete points throughout the intervention. This approach gives facilitators real-time feedback they can use to improve the intervention, and it supports participatory data collection, analysis, and interpretation. Find out more in this toolkit (Curtis & Kukké, 2014) from the Texas Association Against Sexual Assault and the Texas Council on Family Violence.

Most Significant Change
Most Significant Change (MSC) (McDonald, Stevens, Nabben, & Rogers, n.d.) is an evaluation methodology that focuses on participant storytelling. One way to implement it involves having program participants interview each other about the most significant change they experienced as part of an intervention. Another possibility is to hold story circles in which each participant is given a chance to share such a story with the rest of the group. Check out this guide (Davies & Dart, 2005) for more information.

Data Placemats
Data placemats (Pankaj & Emery, 2016) are used to display preliminary data analyses in order to facilitate discussions among stakeholders about the interpretation and use of those data. These conversations in turn influence further analysis by identifying additional ways to look at the data and hypotheses about the conclusions the data might support. You can find more information on data placemats in this Innovation Network document (Pankaj, Welsh, & Ostenso, 2011) on participatory analysis and on Katherine Haugh's blog (2015). If you are a member of the American Evaluation Association, you can also access a Coffee Break webinar (Pankaj, 2015) about data placemats. Data placemats pair effectively with a Liberating Structure called What? So What? Now What? W3, explained below. You can also find a sample data placemat for sexual violence prevention work in the Training and Capacity-Building Slide Deck, along with slides for facilitating What? So What? Now What? W3.
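If your raw data live in a spreadsheet, a short script can generate the kind of simple pre/post summary that often appears on a placemat. The Python sketch below is a minimal example using the pandas library; the file name, column names, and response scale are hypothetical stand-ins for your own data, not part of the data placemat technique itself.

    # Minimal sketch: summarizing survey responses into a placemat-style table.
    # The file name, column names, and 1-5 response scale are hypothetical.
    import pandas as pd

    # Expected columns: participant_id, timepoint ("pre" or "post"),
    # question (text of the survey item), response (1-5 agreement scale).
    responses = pd.read_csv("survey_responses.csv")

    # Average response to each question at pre and post, rounded for display.
    summary = (
        responses
        .groupby(["question", "timepoint"])["response"]
        .mean()
        .round(2)
        .unstack("timepoint")
    )

    # A simple pre-to-post change column gives stakeholders something to react to.
    summary["change"] = summary["post"] - summary["pre"]

    print(summary)

Keep interpretation off the placemat itself; the numbers (or simple charts built from them) are there to prompt the stakeholder conversation, not to settle it.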
What? So What? Now What? W3
This structure follows the model of evaluative questioning up the ladder of inference to reflect on what the data say, what meaning might be drawn from the data, and what next steps might be necessary or desirable based on that information. Full instructions for facilitating this structure can be found on the Liberating Structures (n.d.) website; refer to the supplemental Training and Capacity-Building Slide Deck for more guidance on applying the structure to participatory data analysis for sexual violence prevention.

Ecocycle Planning
Ecocycle planning is a structure that enables participants to examine the lifecycles of various activities or programs. As part of an evaluation, it could be used after data collection and analysis to pinpoint which lifecycle stage a program or project is in. From there, it may become clear what types of actions need to be taken next to keep the initiative from stagnating. Learn more about working with ecocycle planning (Liberating Structures, n.d.) here.

References

Curtis, M. J., & Kukké, S. (2014). Activity-based assessment: Integrating evaluation into prevention curricula. Retrieved from the Texas Association Against Sexual Assault: http://www.taasa.org/wp-content/uploads/2014/09/Activity-Based-Assessment-Toolkit-Final.pdf

Davies, R., & Dart, J. (2005). The ‘Most Significant Change’ (MSC) technique: A guide to its use. Retrieved from Monitoring and Evaluation NEWS: http://www.mande.co.uk/wp-content/uploads/2005/MSCGuide.pdf

Gray, D. (2010, October 15). Dot voting [Blog post]. Retrieved from Gamestorming: http://gamestorming.com/dot-voting/

Haugh, K. (2015, January 16). Data placemats: A Data Viz technique to improve stakeholder understanding of evaluation results [Blog post]. Retrieved from http://katherinehaugh.com/data-placemats-a-data-viz-technique-to-improve-stakeholder-understanding-of-evaluation-results/

Liberating Structures. (n.d.). Ecocycle planning. Retrieved from http://www.liberatingstructures.com/31-ecocycle-planning

Liberating Structures. (n.d.). Min specs. Retrieved from http://www.liberatingstructures.com/14-min-specs/

Liberating Structures. (n.d.). Nine whys. Retrieved from http://www.liberatingstructures.com/3-nine-whys/

Liberating Structures. (n.d.). Purpose-To-Practice (P2P): Design the five essential elements for a resilient and enduring initiative. Retrieved from http://www.liberatingstructures.com/33-purpose-to-practice-p2p/

Liberating Structures. (n.d.). What, so what, now what? W3. Retrieved from http://www.liberatingstructures.com/9-what-so-what-now-what-w/

McDonald, B., Stevens, K., Nabben, T., & Rogers, P. (n.d.). Most significant change [Blog post]. Retrieved from BetterEvaluation: http://www.betterevaluation.org/en/plan/approach/most_significant_change

Pankaj, V. (2015, January 22). Data placemats: A Dataviz technique to improve stakeholder understanding of evaluation results [Webinar]. Washington, DC: American Evaluation Association.

Pankaj, V., & Emery, A. K. (2016). Data placemats: A facilitative technique designed to enhance stakeholder understanding of data. In R. S. Fierro, A. Schwartz, & D. H. Smart (Eds.), Evaluation and Facilitation. New Directions for Evaluation, 149, 81–93. doi:10.1002/ev.20181

Pankaj, V., Welsh, M., & Ostenso, L. (2011). Participatory analysis: Expanding stakeholder involvement in evaluation. Retrieved from Innovation Network: http://www.pointk.org/client_docs/innovation_network-participatory_analysis.pdf

Stevens, K. (2016, May 12). Rich pictures [Blog post]. Retrieved from BetterEvaluation: http://www.betterevaluation.org/evaluation-options/richpictures

© 2018 National Sexual Violence Resource Center. www.nsvrc.org | prevention@nsvrc.org | (877) 739-3895