I have a hypothesis, and my guess is that others share it. There may even be some research out there on this…. In fact, there ought to be, maybe even must be…. [Note to self: conduct a lit review; if no such research exists, pursue it…somehow].
The hypothesis is that interacting with data, with research findings, promotes action based on those findings. By interaction I mean more than reading about findings or even watching a presentation about them. I hypothesize (and doesn’t this seem reasonable?) that reading or hearing about findings is not nearly as powerful as rolling up your sleeves and digging into the data or findings, interacting with them somehow.
How is this done? One way to enable this deeper interaction is via nifty tools that let researchers disseminate findings through online dashboards, where viewers can click through graphical representations of the data, or even reorganize the data themselves to explore it. Check out Tableau for a brilliant and very professional example of this kind of software, though I have seen simpler homegrown versions, usually web-based. More about Tableau in the future, as I hope to explore it as a possible tool for use here at Marylhurst.
Debriefing sessions in which researchers help stakeholders explore findings can also provide enough interaction with the data to move those stakeholders toward acting on it. I believe this is especially true when those sessions are truly interactive. Again, by this I mean that participants do more than hear or read about the findings. At minimum, they have engaged discussions with one another, ideally including the implications of those findings for practice. The best debriefing sessions I’ve seen or heard about involved stakeholders working together on an activity that got them out of their seats. In one such exercise, sticky notes labeled with findings and areas of practice were arranged on a wall with yarn strung between them, literally connecting the findings to potential areas of action. In other words, these sessions illustrate how stakeholders might USE the data through discussion and activity. This may not sound revolutionary (or maybe it does), but it’s certainly something we rarely make time for.
In 2010 Michael Quinn Patton spoke at the Oregon Program Evaluators Network conference, and one of my favorite takeaways was that program evaluation (or assessment) work can’t/shouldn’t/doesn’t end with the reporting phase: dissemination means much more than generating a report and handing it over. He argued that time and energy should be spent post-reporting not just on adequate dissemination but on engaging stakeholders with the findings, and that this should be budgeted and planned for right up front so there are no excuses at the end when there’s no time or money left. I see this as a key to actionable data: it may not be enough to design data collection for actionability; the evaluator or assessor must also promote and support use once that data is collected, analyzed, and even reported.
Watch for a related post in the future: Do infographics support greater interaction with data and therefore support data-driven action?