
When data challenges our beliefs

Alternative post title: When reality interferes with our denial (this spin courtesy of a colleague’s conversation with an institutional researcher at a recent conference).

No matter what I do, I can’t seem to avoid thinking about this topic lately — the difficulty of sharing data with others (often decision makers) when that data doesn’t align with, or challenges, their beliefs.

This can look like ‘shooting the messenger’, accusations about data or method quality, crafting reasons (sometimes legitimate, but often excuses rather than interpretation) for why the data indicates one thing or another, or otherwise dismissing the data or findings. Sometimes there really is something wrong with the data, or more information is needed, but more often (if I’ve done my job and the findings have been legitimately constructed), it is confirmation bias*, or a closely related (unnamed?) cousin.

*Confirmation bias occurs when someone gathers or interprets information selectively, in support of their own beliefs.

I recently started reading The Signal and the Noise: Why So Many Predictions Fail — but Some Don’t (Nate Silver’s book about prediction) and was repeatedly reminded of this topic throughout the first section of the book. It may even have shed slightly new light on this struggle for me.

Here’s an example: a very early passage in the introduction discusses the early days of the world wide web. Silver likens that era to the early days of the printing press (and goes on to compare both to the current ‘big data’ movement):

The amount of information was increasing much more rapidly than our understanding of what to do with it, or our ability to differentiate the useful information from the mistruths. Paradoxically, the result of having so much more shared knowledge was increasing isolation along national and religious lines. The instinctual shortcut that we take when we have “too much information” is to engage with it selectively, picking out the parts we like and ignoring the remainder, making allies with those who have made the same choices and enemies of the rest (p 3).

Interesting to consider, isn’t it, that ‘rejection’ of valid data might sometimes be partly a product of simple information overload, that our ability to understand hasn’t yet caught up with the amount of information in front of us?

I’m also currently taking a course in effective listening as part of a Certificate in Conflict Resolution and Mediation (because, well, who doesn’t need those skills?). It too has shed some light on this topic for me. It turns out there are fundamental aspects of how we listen and communicate (well, or not so well) that manifest in these scenarios: basically, listening, and the absorption of information, is hard work! I plan to share more of my insights about actionable data and listening in a future post. In the meantime, here’s a nice synopsis of the effects of emotions and the use of “defensive measures” in listening (from our text, Listening: The Forgotten Skill: A Self-Teaching Guide, by Madelyn Burley-Allen):

If something is said that might cause us to change our perception, we feel threatened. Potential change in our perceptions can involve strong feelings. Often, the first feelings are frustration and confusion. To reduce these feelings, we flee mentally from what is being said by tuning out what the talker is saying or by distorting it so we won’t have to alter our perception, belief, or opinion (p 59).

Ok, so, if I have a better handle on what is going on in these situations, can I do a better job of sharing data or findings so that others are more easily able to accept, absorb, and make use of that information? I know I can start with less-detailed reports, or share information in tiers or layers rather than in one overwhelming lump… and that considering and respecting where others are ‘coming from’ as they view the data, and/or softly challenging them to consider other perspectives, might help… But I feel like that’s only the tip of the iceberg.

How do you work to promote acceptance of data? How can we prevent selective listening, confirmation bias, and the like?

Added 4/25: The above-named colleague (see the note about the alternative title) wrote on this subject today as well. See We Tell Ourselves Stories.