Institute Occasional Paper 8: From Gathering to Using Assessment Results
Drawing from the Wabash Study, a multi-institutional longitudinal research and assessment project, Charlie Blaich and Kathy Wise, from the Center of Inquiry at Wabash College, share their field-tested findings and lessons learned about campus use of assessment results. The Wabash Study assists institutions in collecting, understanding, and using data. The researchers at the Center of Inquiry found the last component, using the data to improve student learning, to be the real challenge. In this Occasional Paper, Blaich and Wise describe the accountability movement, the history and purpose of the Wabash Study, and the reasons why institutions have a hard time moving from gathering data to using data, and they offer campus leaders five practical steps for putting the data they collect to use.
The Wabash Study is a longitudinal research and assessment project designed to provide participating institutions with extensive evidence about the teaching practices, student experiences, and institutional conditions that promote student growth across multiple outcomes. Despite the abundant information they receive from the study, most Wabash Study institutions have had difficulty identifying and implementing changes in response to study data. Although much of the national conversation about assessment and accountability focuses on the pros and cons of different approaches to measuring student learning and experience, we have learned from the Wabash Study that measuring student learning and experience is by far the easiest step in the assessment process. The real challenge begins once faculty, staff, administrators, and students at institutions try to use the evidence to improve student learning.
In this paper, we review faulty assumptions we made about assessment in creating the Wabash Study, including our initial thoughts about the primary obstacles to good assessment, the importance of assessment reports, and the benefit of connecting assessment with faculty habits of disciplinary inquiry. As the study progressed and we saw how institutions struggled to use the evidence they had collected, we revised the study to focus more on disseminating and using data. We have distilled the lessons learned from our experience into five practical steps that campuses should consider implementing as they develop assessment projects to increase the likelihood that the evidence they collect will benefit student learning:
1) Perform thorough audits of useful information about student learning and experience that your institution has already collected.
2) Set aside resources for faculty, students, and staff to discuss and respond to assessment evidence before it is distributed around campus.
3) Develop careful communication plans so that a wide range of campus representatives have an opportunity to engage in discussions about the data.
4) Use these conversations to identify one, or at most two, outcomes on which to focus improvement efforts.
5) Be sure to engage students in helping you make sense of and form responses to assessment evidence.
Charles Blaich is the Director of Inquiries at the Center of Inquiry at Wabash College and the Director of the Higher Education Data Sharing Consortium (HEDS). He received his Ph.D. in Psychology from the University of Connecticut in 1986. He taught at Eastern Illinois University from 1987-1991 and then at Wabash College until 2002. Blaich assumed his current position at the Center of Inquiry in 2002 and became the director of HEDS in 2011.
Kathleen Wise is the Associate Director of Inquiries at the Center of Inquiry. She received her MBA from the University of Chicago in 2001. She was a Senior Financial Analyst at Eli Lilly and Company from 2001-2003, and then became a Research Fellow at the Center of Inquiry in 2004. Wise assumed her current position at the Center of Inquiry in 2007.
"In this paper, Charlie Blaich and Kathy Wise share their candid, field-tested insights into the obstacles that institutions must address in order to go beyond collecting evidence of student learning to actually using the results effectively."
George D. Kuh