
Capturing authentic youth perspectives through surveys

Kim & Sheila here, writing with Dr. Alicia Lynch of Lynch Research Associates

We recently chatted about our experiences working on surveys for youth, and the special challenges and considerations in doing so.

Administering youth surveys is a common method for measuring the effectiveness of youth programming, tracking trends in behaviors, and understanding youth development. When designing surveys, it’s easy to fall into the trap of treating youth as if they’re tiny adults – make a few tweaks to the readability and vocabulary in the survey, and you’re on your way. However, generating reliable data from youth surveys requires understanding how their cognitive, language, and attention skills are still developing and acknowledging that their lived experiences and world views are often vastly different from our own.  

Our general advice for surveying youth

Much of our general survey design advice also applies when surveying youth; we just have to keep them in mind as our respondents.

  • Get clear about survey purpose and overarching research or evaluation questions. Doing this helps us to be super efficient and distill our survey questions down for simpler, shorter surveys than adults might tolerate. 
  • Focus on crafting clear questions. Ensure that youth will understand what we’re trying to ask, and limit complex or multi-part questions. 
  • Make questions meaningful. Ask youth about their thoughts and insights, and communicate to them that their opinions and experiences matter and are valuable. Let them know what we will do with their responses – how what they share with us will be useful and used. 
  • Pretest surveys with youth (not just with other adults). When surveying youth, we need to test whether the readability, vocabulary, and format (paper, online) work for the particular group of young people we are surveying. For youth, soliciting feedback in groups can be a great option when available.

Specific considerations for surveying youth

  • Youth understand and interact with the world differently than adults. Researchers surveying youth need to understand young people and the way their brains work. Youth have different worldviews and experiences and will think about constructs differently than adults. Here’s an example Sheila shared in the context of ‘net promoter’ questions:

Rebecca from Glass Frog told me a story of her nine-year-old daughter encountering this “likely to recommend” question on a survey about her Girl Scout troop. Rebecca’s daughter provided very favorable responses about her experience with the Girl Scouts. However, when asked whether she would recommend this troop to others, she responded with a 0 on the net promoter question and told her mom, “No. I really like our troop the way it is. I wouldn’t recommend it to anyone else, because I don’t want anyone else to join.” The young girl took the question quite literally and answered honestly. 

  • Young people want to feel respected and valued. Asking young people for their opinions and describing why their voice matters in the context of your questions will promote participation and increase survey response rates.
  • Surveys need to be designed for smartphones. Youth are far more likely to complete a survey on a smartphone, and that means making sure the survey works on a phone, not just technically, but that the questions and response options are easy to read on a smaller device. Testing accessibility and ease of completion by youth on a variety of devices is worthwhile.  
  • Question language and response options should make sense to young people. As with adults, this requires that we know enough about our respondents and more specifically, know what language, phrasing, and terminology will make sense to them. With youth, this just might be a little trickier because as adults, we are further removed from current slang and communication styles. 
  • Consider age and literacy level. Our advice is generally to aim for a reading level at or below that of your youngest desired respondents. We recommend a few tools that test for readability in Designing Quality Survey Questions, including the Flesch-Kincaid Grade Level test, which is built into Microsoft Word and also available through Readable, which offers other comprehensibility tests as well. 
  • Know the specific youth population being surveyed. There is a surprising lack of research around creating surveys that fit the unique developmental stages of early childhood, middle childhood, and adolescence. Many of the best practices researchers have used are derived from theoretical models of youth development rather than research on effective survey methodologies. That being said, there are common considerations you can use to guide survey development with young people. 
    • Ages 6 & under: Surveys from children 6 and under are unlikely to yield reliable data. At this age, most children don’t have the cognitive, language, and attention skills required to successfully complete a survey. Consider using other data collection methods such as classroom (or program) observations, or teacher or parent surveys. 
    • Ages 7-10: Children at this age tend to be very literal; avoid jargon and use familiar words and short sentences. Some children in this age group will struggle to understand the nuanced differences among response options within traditional, five-point Likert-type agree-disagree scales. Consider using three-point scales (e.g., “Yes/No/Maybe”). Visuals and fun fonts can help keep this age group engaged in the survey, but the goal should be to make it as short and simple as possible. 
    • Ages 11-18: Tweens and teens are able to successfully complete surveys and, in today’s world, have likely been previously exposed to surveys. There is a tremendous body of existing scales that are publicly available and can be tapped to fit unique research or evaluation needs. But beware – just because a scale has been previously administered does not mean it is valid or reliable. Look for scales that were developed by a team of experts in the field and have associated technical reporting, including measures of reliability and validity for this age group. 

It’s important to note that it is difficult, if not impossible, to survey across these age groups without substantially adapting your questions or survey strategies.
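For readers who want to check reading level programmatically rather than in Word or Readable, the Flesch-Kincaid Grade Level is just a simple formula over word, sentence, and syllable counts. Here is a minimal sketch; note that the syllable counter is a naive vowel-group heuristic we wrote for illustration (real readability tools use dictionaries and more careful rules), so treat its output as a rough estimate.

```python
import re

def count_syllables(word):
    # Naive heuristic: count runs of consecutive vowels (including y).
    # Dedicated readability tools use pronunciation dictionaries instead.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_kincaid_grade(text):
    # Flesch-Kincaid Grade Level:
    #   0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / max(1, len(sentences)))
            + 11.8 * (syllables / len(words))
            - 15.59)
```

Running a draft question through a function like this can quickly flag items that read far above your youngest respondents' grade level.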

  • Take care when planning for adult supervision. Ideally, youth should be able to complete the surveys on their own, without adult support. In cases where youth need an adult to read aloud survey questions and response options, the adult needs to take care not to lead children with any inflections in their voice or other cues. If we are asking others (e.g., program leaders or teachers) to administer our surveys with youth, we need to provide specific instructions for how to do so and potentially monitor or train the adults to ensure integrity of the data. Kim shared an example of when this went wrong: 

Years ago, we were surveying middle school youth in programs across the state and initially didn’t provide much guidance to program leads. We were more focused on making sure they could handle working with multiple versions of paper-based surveys in multiple languages (we may have overcomplicated things for everyone!) and we figured with middle schoolers, all they really needed to do was pass out the survey, give them time to complete it, and collect and send them back to us (that felt like enough to ask of them!). We quickly realized that they needed more guidance about exactly how to administer the survey, and despite sharing more instructions and talking it through one-on-one with several leads, we ended up with one batch of data where something clearly had gone awry, and in the end we had to omit it from our analysis.

  • Expect the unexpected. Sheila shared an example of this from an evaluation on Advanced Placement programs across several high schools:

We asked the students, “Who recommended you for this specific course?” I think it was a closed-ended item with choices of counselor, teacher, parent, or someone else (please write in), and we were stunned to find that many students said they recommended themselves! This was an important perception from students that we had never considered. 

Youth are also likely to provide snarky, silly, or even very rude responses, especially adolescents and teens, and especially in response to open-ended questions. When given the opportunity to list their hobbies outside of school, you’d be surprised how many youth are “fighter pilots,” “alien hunters,” and “professional sleepers.” We view silly responses as an opportunity to identify potentially invalid surveys. If you’re seeing nonsense in open-ended responses, we recommend reviewing all the data from that respondent. Are there other “red flags” that would suggest the respondent wasn’t taking the survey seriously? Or is a young person just using what space they have available to have a little fun with it?
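When screening a large batch of responses, a simple keyword pass can surface surveys worth a closer human look. The sketch below is hypothetical (the phrase list, threshold-free logic, and function name are all invented for illustration); consistent with the advice above, a match only flags a survey for manual review of the respondent's full data, never automatic discard.

```python
# Hypothetical example phrases drawn from the kinds of joke answers
# described above; a real screening list would come from your own data.
SUSPECT_PHRASES = {"fighter pilot", "alien hunter", "professional sleeper"}

def flag_for_review(responses):
    """Return IDs of respondents whose open-ended text matches a suspect phrase.

    `responses` maps a respondent ID to that respondent's open-ended answer.
    Flagging only marks a survey for human review; it never discards data.
    """
    flagged = []
    for respondent_id, text in responses.items():
        lowered = text.lower()
        if any(phrase in lowered for phrase in SUSPECT_PHRASES):
            flagged.append(respondent_id)
    return flagged
```

A reviewer would then look at each flagged respondent's other answers for corroborating red flags before deciding whether to exclude the survey.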

  • Use “unsure” or “I’m not sure” as a response option wherever possible. Young people are in a stage of constant development. Their opinions and behaviors change frequently. An “I’m not sure” option prevents youth from selecting a category when they’re honestly not sure about their response. This “non-substantive” response option will also capture youth who may not understand the question.

Surveys CAN work with youth

Elevating youth voices is essential in evaluations of youth programming and research aimed at understanding how to help young people thrive. And surveys are a fantastic tool for collecting data from large numbers of young people in a relatively painless format. The tips and ideas shared in this blog are meant to encourage you to center young people in your survey design, hopefully without experiencing some of the challenges we have! 

Do you want to talk more about youth surveys? Reach out to Kim (kim@leonardrande.com), Sheila (sbr@sheilabrobinson.com), or Alicia (alicia@lynchresearchassociates.com) and let us know your favorite tips and tricks for generating reliable youth survey data.