I recently received an email that made my survey nerd heart soar. And from my daughter’s school, of all places! I’m so pleased to offer it up as a shining example of how to share survey results back with your respondents.
The survey scenario: North Clackamas School District has worked with YouthTruth for years to gather insights about their schools from students and families. YouthTruth is a project of the Center for Effective Philanthropy, with seed funding and backing from several large education funders (e.g., the Bill & Melinda Gates Foundation). As their website states:
YouthTruth is based on the simple but powerful premise that when you get timely feedback from those you’re trying to serve, and really listen to that feedback to make changes, you get better.
I am often a skeptic of canned/”research-based”/”validated” surveys (based on what research? Validated by whom? Is it possible to be culturally responsive using the same questions for everyone?). But I have been continuously impressed with YouthTruth’s work, from both a researcher’s and a respondent’s perspective. Not only was the survey very straightforward and easy to take (on my phone), I was also so pleased with this follow-up email.
Here’s how the email opens, and what I like about it:

I like the simple explanation of what the survey is, and I love the transparency around WHY here. The color doesn’t hurt in getting and holding my attention. But look what comes immediately next.

They express their thanks! They walk me through what they did and are doing with the results! Before we get lost in the details of the results! So good.
Now, check out how they are sharing the results:

The prompts “We learned…,” “We were surprised…,” and “We are proud…” are so great. This is strengths-focused without being overly positive, and transparent without the need to immediately make promises about what they might do with the results.
What comes next is maybe my favorite bit of all. After a quick introduction to the survey’s core themes (which maybe take up more real estate than necessary here, but tell me what they were focused on in conducting the survey and analyzing results), we get a “glow” and a “grow” for each core value for the district. A glow, of course, is a strength they want to highlight, and a grow is an area for potential improvement. This framing feels so right for education and tells me something (that I like) about the district’s approach to learning and improvement.

I’ll save you from the rest of the detail but you get the idea.
My main critique: the glow and grow statements could be strengthened with a little data visualization or a tad more context (what does “grew 5%” mean? What were the percentages of disagree and strongly disagree previously?). But, again, it’s relatively simple and understandable.
Overall, I feel heard, I get a sense of what the collective responses to the survey were like, and I get some reassurance that the school is doing something with this information.
Do I want to know more? Yes. Are my school and district perfect? No. Is YouthTruth likely a perfect instrument? Is any instrument perfect? Also no. But I think this is an excellent example of what it looks like to treat respondents like humans we are in conversation with.
What do you like about this approach? From your perspective, how could the school “close the loop” with respondents even better?
