
Using an “I don’t know” Option in Survey Design

I don’t know. 

In 1982’s Fast Times at Ridgemont High, high school history teacher Mr. Hand asks student Jeff Spicoli, “Why are you continuously late for this class, Mr. Spicoli? Why do you shamelessly waste my time like this?” to which Spicoli responds, “I don’t know.” 

Why didn’t Jeff Spicoli know the answer to Mr. Hand’s questions? Perhaps he was taken aback by the questions, felt put on the spot, or just didn’t have time to come up with a better answer. Perhaps he knew the answer, but was embarrassed by the questions and didn’t want to respond candidly in front of his peers.

Or perhaps he truly just didn’t know.

We don’t need to be teenage surfer dudes more interested in tasty waves than high school history class to relate to this scenario. When faced with a survey question that takes some extra thought, surprises us, or makes us feel self-conscious, we too may answer “I don’t know.”

Hey, Kim here, writing once again with my dynamite survey design collaborator and co-author, Sheila Beth Robinson!

Why do survey respondents choose “I don’t know”?

There are any number of reasons survey respondents choose an “I don’t know,” an “unsure,” or a “no opinion” response when one is offered. Here are just a handful of possibilities:

  • Respondents do not understand the question stem or have difficulty comprehending or interpreting parts of it.
  • Respondents have conflicting feelings about the topic or issue and are uncertain about where they stand.
  • Respondents lack confidence in the anonymity of survey results and feel uncomfortable answering candidly, or even fear retaliation or retribution if they answer honestly.
  • Respondents understand the question, are willing to be forthcoming, but honestly do not know the answer. 
  • Respondents are tired of answering questions, don’t feel connected to the survey topic, or just don’t care whether they’re offering their best responses. 

When should we consider adding an “I don’t know” (or similar) response option to a survey question? 

When we’re fairly sure that one or more of the above scenarios could exist with any given question. 

Looking at how survey questions performed when used previously can also help determine when “I don’t know” is needed. Did a substantial number of respondents choose the “I don’t know” (or similar) option? Of course, it’s up to you to determine how many responses count as “substantial.” You can also weigh any evidence you have about why respondents chose this answer, and whether you want to change the question based on the number of those responses.
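If past responses are in machine-readable form, a quick tally can surface the questions worth a second look. Here is a minimal Python sketch, assuming responses are stored as a simple mapping of question IDs to raw answer lists; the question names, the exact “I don’t know” label, and the 10% threshold are illustrative assumptions, not recommendations.

```python
from collections import Counter

# Hypothetical past survey data: question ID -> list of raw responses.
responses = {
    "q1_program_value": ["Agree", "I don't know", "Disagree", "I don't know", "Agree"],
    "q2_attendance": ["Weekly", "Daily", "Weekly", "Monthly", "Daily"],
}

# What counts as "substantial" is your call; 10% is only a placeholder.
THRESHOLD = 0.10

for question, answers in responses.items():
    share = Counter(answers)["I don't know"] / len(answers)
    if share >= THRESHOLD:
        print(f"{question}: {share:.0%} chose the non-substantive option; review this item")
```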

Pre-testing may also help uncover which questions need “I don’t know” responses. Trying out questions without those response options on potential respondents – especially through cognitive interviews, think-aloud sessions, or focus groups – should shed some light on where they are needed. 

The problem with “I don’t know” responses

You might be thinking: what’s the harm in offering “I don’t know” as a response all the time? Why not give all respondents that option? Unfortunately, adding “I don’t know” or other non-substantive responses can lead respondents to overuse them, giving us low-quality data. “I don’t know” is an easy out for someone who doesn’t want to put in the effort of reflecting on whether they actually do know, or whether they hold an opinion one way or the other, even a weak one. Leaving off non-substantive responses subtly encourages respondents to ponder a bit more, because there is no easy out. 

When and how to avoid “I don’t know”

Of course, we don’t recommend adding “I don’t know” or any similar catch-all response option without thorough consideration. As we mentioned, tacking “I don’t know” onto a question where it’s not needed can result in messy or inadequate data. That might be the case for questions where you’re asking for an opinion and genuinely believe everyone responding will have one, or where you’ve done enough pretesting of your survey to be confident respondents will be able to select a more substantive response option.

In Designing Quality Survey Questions, we cover the topic of “don’t know” responses, and offer filter questions as one strategy to limit their use. 

“For most questions about attitudes or opinions, using a filter question can eliminate the need for a ‘don’t know’ option. The filter question asks if the respondent knows enough or is interested enough about the topic to have formed an opinion. If the respondent answers positively, the next question becomes available. If the respondent answers negatively, one or more of the following questions can be skipped, leaving the researcher with data on opinion questions only from those who have declared having enough knowledge to answer meaningfully” (p. 114). 
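To make the mechanics concrete, here is a minimal Python sketch of that filter-question pattern as it might appear in a simple scripted survey. The question wording and the helper function are hypothetical, for illustration only; in practice, most survey platforms implement this with their built-in skip or branching logic rather than code.

```python
def ask_yes_no(prompt: str) -> bool:
    """Return True if the respondent answers yes."""
    return input(f"{prompt} (y/n): ").strip().lower().startswith("y")

def run_survey() -> dict:
    answers = {}
    # Filter question: has the respondent formed an opinion at all?
    if ask_yes_no("Have you heard enough about the new policy to have an opinion?"):
        # Only respondents who pass the filter see the opinion question,
        # so it needs no "I don't know" option.
        answers["policy_opinion"] = input("Do you favor or oppose the new policy? ")
    else:
        # A negative answer skips the opinion question entirely.
        answers["policy_opinion"] = None
    return answers

if __name__ == "__main__":
    print(run_survey())
```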

You’ve heard this before… context is everything! 

Ultimately, this is yet another scenario in which the right approach depends heavily on context. Understanding your respondents as best you can* is critical: it sets both them and you up for success, and it positions your survey to yield the most meaningful and useful data possible.

*We cover using a design thinking approach to understanding survey respondents in the book as well. 
