Shortly after starting work at Marylhurst I was tasked with analyzing and writing up findings from a survey of alumni conducted prior to my arrival. This is one of the most enjoyable parts of my job — actually digging into data, extracting the interesting and hopefully useful stuff, and translating that information for as wide an audience as possible. More about this alumni survey and the resulting report can be found here.
While the survey provided lots of interesting information (alumni respondents report high levels of satisfaction with their experiences at Marylhurst!), a number of issues limited the actionability (still not sure that’s a real word) of this survey’s findings. I’ve listed a few of those issues below; they begin to provide some context for why we chose to review and revise the survey.
1. At the time the report was developed, we had no way to determine whether respondents were representative of all alumni in the cohort surveyed. While the response rate wasn’t bad for an online survey (24%), the resulting self-selected sample makes it difficult (if not impossible) to confidently generalize the results. (A rough sketch of the kind of representativeness check we’d want to be able to run follows this list.)
2. We reported only at the institutional level, both because that seemed the fastest way to get results out to the community and because there were too few respondents in any given department to provide department-specific results. However, those with the power and ability to act on this data don’t operate only at the institutional level, and many departments felt the survey wasn’t useful because it wasn’t likely to be representative of their students and/or it didn’t ask questions of specific departmental interest.
3. At the time, I didn’t have access to (nor time to work with) previous years’ survey data. This means we don’t know whether the findings represent a change in alumni satisfaction over time (the survey has been conducted for about 3 or 4 years, but the data hadn’t been analyzed the way we worked with it this year).
4. Some of the questions produced information that’s hard to act on. Open-ended questions, for example, yielded qualitative responses that don’t reveal clear patterns or offer helpful feedback. Similarly, questions about the connection between alumni experiences and the Marylhurst mission drew responses that indicate overall satisfaction but don’t really tell us what could be done to further improve the school experience.
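To make the representativeness concern in point 1 concrete, here’s a minimal sketch of the kind of check we’d want to be able to run: compare respondents to the full cohort on an attribute we know for everyone (department, in this hypothetical) using a chi-square goodness-of-fit test. The department names and counts below are made up purely for illustration; a real check would pull both from institutional records.

```python
# Illustrative sketch only -- hypothetical departments and counts, not data
# from the actual survey or report.
from scipy.stats import chisquare

# Hypothetical counts of all alumni in the surveyed cohort, by department
cohort = {"Business": 400, "Education": 250, "Arts & Humanities": 200, "Nursing": 150}

# Hypothetical counts of survey respondents, by department
respondents = {"Business": 70, "Education": 80, "Arts & Humanities": 50, "Nursing": 40}

total_cohort = sum(cohort.values())
total_resp = sum(respondents.values())

# Expected respondent counts if respondents mirrored the cohort's department mix
expected = [cohort[d] / total_cohort * total_resp for d in cohort]
observed = [respondents[d] for d in cohort]

# Chi-square goodness-of-fit: a small p-value suggests respondents differ
# from the cohort on this attribute, so generalizing results is risky.
stat, p = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p:.3f}")
```

This is exactly the kind of evidence we didn’t have when deciding how far we could generalize the findings.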
Ultimately, we were left with a sort of “yay” (our alumni respondents were pretty happy), “but so what?” (what do we do with this information?) reaction to the findings.
All this said, the survey was designed by very well-intentioned and brilliant faculty members. They got a lot right, and it will be important for us to consider and honor the intent behind the survey’s original design. But it’s clear that changing needs (including departments’ need to ask their alumni specific questions to meet specialized accreditation requirements, among other things) require that the survey be reviewed and that revisions be carefully and thoughtfully considered.
Future posts (parts 2 through ?) on this survey will explore: what the review process looked like and what we learned; our revision recommendations and what it took to work through such a planful revision; how we’ll work with the next set of results based on what we’ve learned through this process; and how we’ll figure out whether we met our goals for revising the survey.