When I first became an environmental economist over a decade ago, the conservation community was focused on showing that nature had value for communities.
After all, once we show that wetlands are reducing flooding, we can make the case that communities must pay for the restoration of wetlands.
And once we’ve demonstrated that native pollinators are increasing crop yields…shouldn’t farmers be paying for native pollinator habitat?
As environmental economists across the country were collecting data and using rigorous methods to prove that nature had economic value, the message was consistent: Stakeholders need to pay for the benefits they are receiving.
In hindsight, you can see where this approach went wrong. Treating communities as “stakeholders” who are just sources of data and money is a very transactional way to interact with people.
This phenomenon is not unique to conservation nonprofits. It applies to any organization that conducts research and wants to collect data from communities.
It may be unintentional, but if your nonprofit uses language that suggests people are merely sources of data, and if your team skips the crucial step of building relationships based on trust, your nonprofit will fail to meet its goals.
TIP 1: Consider phone interviews instead of surveys. In some cases, more in-depth conversations with a smaller number of people are better than hundreds of survey responses with little depth.
I often see nonprofits wanting to run surveys and gather large volumes of responses. But believing that quantity is always better than quality is a trap. My colleagues and I have found that many populations are over-surveyed. For certain projects, we have started calling participants and conducting the survey over the phone. This is more labor intensive, but it has multiple benefits: we can chat about the program, the participant has a chance to ask questions, and the interviewer can ask follow-up questions to gain greater clarity. Talk about a win-win – it builds relationships and results in higher-quality data.
TIP 2: Believe people when they share their experiences.
It’s unfortunate that this recommendation still has to be on the list, and that it isn’t yet the default in all cases. When people share their experiences working with your organization – especially negative feedback, stories of discrimination, or experiences of racism – believe them. Don’t ask them to prove it.
TIP 3: People are more likely to share honest feedback if you make space for a range of emotions, rather than only positive feedback.
Program participants will be unwilling to share the full breadth of their experiences unless you create a safe space to do so. One way to do this is through interviews: participants may be more willing to share both positive and negative impressions of your nonprofit one-on-one. It is possible to get constructive feedback from focus groups, too, if they are designed to create safety. Participants need to know that responses will be kept confidential and not attributed to them personally.
TIP 4: Relationships first, data collection second.
Avoid jumping directly to data collection without connecting first. Ice breakers can feel corny, but they work – they are effective ways of getting to know each other before jumping right into data collection. And evaluation expert Claire Stoscheck recently added, “Create space for people to be their full human selves- e.g. encourage people to take bio or emotional breaks, have their pets/kids in screen if virtual, eat/drink as needed, etc… When people feel accepted and welcome for who they are as a whole person, they will feel more comfortable sharing and participating fully.”
TIP 5: Think long-term. Relationship building does not happen through one single interaction.
Avoid thinking of data collection as a one-time action. Make sure to follow up with the communities that you engage. Share the results of your research. Attend their meetings. Go to the festival led by the art center, the economic development forum, the Farm Bureau event, the master planning session for the school district, etc. When communities see your face at other meetings, they understand you also care about their goals.
TIP 6: Focus on shared learning and respect for communities as sources of knowledge; avoid filling the role of experts teaching communities.
I often hear nonprofit leaders focus on the importance of educating the public. But we must remember that learning goes both ways. When it comes to program evaluation, this means showing a willingness to be wrong. At the end of the day, a program evaluation is an opportunity to improve and do better. Defensiveness will shut down conversations. Include, listen, and apply the feedback through adaptive management.
TIP 7: Avoid biased, leading questions.
When I say avoid biased questions, that doesn’t mean you can never ask a question like, “Did our nonprofit program provide you with benefits?” It’s only a leading question if you jump in without a series of smaller questions to set the stage.
Setting the stage means grounding survey-takers in the facts of the situation before they answer. For example:
You want to see if your program helped mitigate impacts from the last flood disaster?
- Walk through a set of specific questions to assess the types and value of damage first.
You want to better understand if your program helped retain jobs?
- Get specific data on total number of part-time and full-time jobs before and after the program. Only after that, start asking about how the program influenced job retention.
Human-centric data equity specialist Meenakshi (Meena) Das has some additional, insightful tips on overcoming bias in surveys– while building better trust and transparency. (Link here).
TIP 8: Focus groups first, then survey.
For program evaluations, this sequencing matters. Focus groups help you understand the context, the language that the communities use, and they help you begin to categorize and prioritize. This allows for surveys to be much more focused, giving you high quality data to quantify the impact of your program.
TIP 9: Be specific about the parameters of the program you are assessing.
Don’t say “last year”; list the exact dates of the program. Don’t say “our food security program”; list the program name. In the food security work my team has done, we found that many participants do not know which organizations run the programs, and they can get confused when many food security programs are happening concurrently. List the exact name and details of the program you are evaluating.
TIP 10: Remember that communities are listening when you present your results at conferences and in articles.
Think about how you talk about the research in a range of contexts – e.g., at conferences and in articles. Are you only talking about a rigorous analysis in those settings? Do you refer to communities as if they were specimens? Take caution – the way you think in one context seeps into how you speak in others. Communities are listening. It doesn’t help to speak about communities with dignity in some spaces and as transactional relationships in others.