
Lessons from an attempt at action-focused survey revision, part 2

Continued from the post found here.

In an attempt to improve the usefulness of data from our alumni survey, a small task group of our Assessment Committee has set out to review the survey and consider revising it. We’re doing so through the following steps:

1. Initial discussion regarding survey purpose and goals, and a review of our needs related to accreditation

2. Review of the survey’s original intentions and history/context: We met with one of the original survey designers to learn more about how the survey came about and how it had previously functioned.

3. Gathering stakeholder feedback: We spent about a month talking with a wide variety of stakeholders across campus: faculty, deans, staff, department chairs, current students, and alums. In doing so, we shared what we know about the survey so far and our intentions (to increase the usefulness of the survey findings at multiple levels within the university), and requested feedback. We also got a better handle on which departments conduct their own surveys and/or have specialized accreditation-related needs for alumni data.

4. Discussion and synthesis of feedback: We worked together to review all our notes and takeaways from the stakeholder conversations and pulled together a synthesis of the themes and important aspects of those conversations as a whole. We also began to form some thoughts about the degree to which the survey should be revised (more, perhaps, than we originally thought).

5. Confer with two important groups: the Assessment Committee and the Provost’s Council/Deans (consisting of the Deans of each of our three schools, the Dean of Students, the University Faculty Council chair, and the Graduate Committee chair, with the Provost, of course) regarding a set of reworked survey goals and a plan for moving forward, drafted initially as a purpose statement.

6. Draft a revision, working with departments for which special questions are needed: This is our current stage.

7. Return to the Deans for another round of feedback, this time on a draft outline organized around draft research questions as well as a revised purpose statement. This conversation will help ensure that we are staying on track and gaining important buy-in for the general direction, focus, and content of the revised survey.

Coming up next?

8. Report out to the university community regarding our progress and next steps.

9. Continue with revisions and work with department representatives and/or particular stakeholders on special questions as needed.

10. Test and finalize the survey: a handful of alums, students, and staff will be asked to run through the survey to test its functionality, make sure the questions are clear, time how long the survey will take respondents, etc.

11. Launch and run the survey!

12. Report out findings at the institutional, school, and (hopefully) departmental level. There may be a few departments with too few graduates/respondents, but we are hopeful we’ll be able to give each department at least something particularly focused.

And really, this is just the beginning of this effort. As I have heard Michael Quinn Patton say, reporting out is not the end; working with those who need to use the data is vital to supporting its use. We plan to do this primarily through brown-bag information events and by facilitating conversations among university stakeholders about how they would like to, and should, use the information we receive from the survey.

More to come about the ultimate impact of all this work, which we hope will be a successful effort culminating in action on the data we receive. We will soon be working on a plan for figuring out whether we were successful: meta-assessment, if you will.
