Interviewing participant for Physicians for Peace

In the fall of 2014, three trends emerged and converged that helped us solidify a direction for the months ahead:

  1. Our recently formed Advisory Board highlighted the importance of equity, disparities, and community needs in evaluation.
  2. We began exploring how we could recruit and retain people from more diverse backgrounds to evaluation.
  3. EvalPartners led a successful campaign to have 2015 declared the Year of Evaluation, with special emphasis on equity-focused and gender-responsive evaluation.

In response, we formed an internal working group focused on improving our responsiveness and reducing bias in our work, both in our internal practices and in our consulting projects.

Background. The working group reviewed literature, participated in webinars, and discussed what we learned. We explored the strengths and weaknesses of several methods, including equity-focused evaluation, culturally responsive evaluation, and community-based participatory research. Each of these methods attempts to address hidden bias by attending to power dynamics when developing questions, gathering data, and interpreting results. We found this guide to be particularly thorough and accessible.

Need. As we explored these methods and reflected on our practice, we recognized two issues that were not addressed thoroughly in the literature:

  1. Most of our evaluations take place in complex contexts, where culture is just one of the many factors that affect equity and participation.
  2. As consultants, our evaluation practice must serve our clients with comprehensive, valid information that they can use to improve programs and services.

Idea. The research and reflections led us to develop a new iteration of these methods that draws on the established idea of community-responsiveness. As we envision it, community-responsiveness recognizes the complexities of each community and uses methods that respect community members and allow a wide variety of community voices to be heard.

Testing the idea. We have developed preliminary ideas about the methods used in community-responsive evaluation from our experiences of what has worked in our own evaluation projects:

  • Engage community members as advisors
  • Identify and enlist community experts to contribute to and lead aspects of the evaluation
  • Use multiple methods of data collection and analysis
  • Use a multi-phased, iterative approach that allows you to layer learning from multiple community members

And we’ve had the opportunity to test the idea in a few different contexts:

  • In a presentation and discussion with Minnesota researchers
  • In a discussion with researchers at the Federal Reserve Bank of Minneapolis
  • In an internal workshop using a world-café format to explore the strengths, weaknesses, and potential of the method

What does community-responsiveness look like in practice? In an earlier study on behalf of the Minnesota Department of Human Services, we examined the long-term care provided to people who have disabilities or who are aging. We attended meetings with several existing advisory groups to get suggestions about the best ways to recruit and engage people for the study. From this, we designed an approach that gave participants a variety of ways to contribute their perspectives (online survey, discussion board, in-person focus groups, print surveys, and feedback forums at existing conferences). Each option had built-in accommodations. For example, we could offer interpreters at the focus groups, provide the surveys in multiple languages, and create all online materials in ways that were screen-reader friendly. The resulting data was a rich tapestry of perspectives from all over the state and from people with many different experiences and backgrounds.

Next steps. New ideas emerged from each of these discussions: that evaluators and researchers need more guidance to use a community-responsive approach; that we need to pay attention to articulating the credibility of the methods; and that there may be some community groups that require greater levels of attention because they have been historically marginalized. We plan to explore this further at upcoming conferences, such as the Minnesota International NGO Network IDEA Summit and the American Evaluation Association Evaluation 2015 Conference. We will continue to publish our lessons learned in the months ahead.

The following article shares some of the ways we have practiced community-responsiveness to date: Community Responsive Evaluation & Research