National Institute for Learning Outcomes Assessment

NILOA Guest Viewpoints

We’ve invited learning outcomes experts and thought leaders to craft a Viewpoint. We hope that these pieces will spark further conversations and actions that help advance the field. To join the conversation, click the link below the Viewpoint. You can also sign up here to receive monthly newsletters that headline these pieces along with NILOA updates, current news items, and upcoming conferences.

 

Just Assessment. Nothing More. Nothing Less.
Wayne Jacobson
Assessment Director, University of Iowa

 

It is difficult to say anything new about outcomes assessment, and yet few in the higher education community would say we’ve got it entirely figured out.  Many institutions still have questions about what data they should be collecting, how much is enough, and how to best make sense of what they manage to collect.  While many work on figuring out the means, others are not at all sure about the ends:  Why are we even doing this?  Judging from the effort it takes to keep it going, it’s safe to say that the need for assessment is still not entirely self-evident to many on our campuses. 

And that’s unfortunate, because a college education is one of those complex things for which the whole is greater than the sum of the parts.  We may be confident that we are each (individually) doing our part, but that is no guarantee that we (collectively) are doing the good that we set out to do as an institution. Assessment is one of the best options we have for determining how well our institutional systems are doing justice to the communities we are trusting them to serve.

Just Systems

Our institutions admit students who have been given reasons to expect particular things from us. Are we keeping our part of the bargain, in ways that reasonably and fairly benefit all students? To ask this question is not to assume that someone is failing to do their job or deliberately giving advantages to some students at the expense of others. But we have a lot of students, with many different backgrounds, motivations, and levels of preparation. The institutional systems we create (major programs, General Education, and student services, to name just a few) help these students navigate a path through their many options, but we have little reason to expect that all students experience college in the same way. Summary data, such as graduation rates or cumulative GPAs, will tell us little about how successfully our many systems are accomplishing the purposes they were designed for or which students benefit the most from them.

In addition to examining how well our systems serve our students, assessment provides a way to more faithfully recognize the full work of faculty.  Too often, substantive ways that faculty support and challenge students remain largely unacknowledged.  Learning outcomes assessment can convey the contributions faculty make to student thinking and development beyond what is shown by student ratings, grade distributions, credit-hour production, or other institutional measures that are commonly used to represent faculty work with students.  To assess a program is not to second-guess or re-assess the work that faculty have already done; it is to duly recognize and acknowledge the substantive contributions that faculty make to their students’ learning.

Assessment also gives departments more visible recognition for their work as stewards of the discipline.  An institutional record of a curriculum in a course catalog is relatively static; a living, breathing, adapting curriculum leads to outcomes that extend far beyond its catalog description.  Or so we hope.   Examining learning outcomes lets a department spotlight its role in preparing the next generation to advance the discipline and broaden its reach to address new questions, new communities, and new challenges. 

These concerns provide the central guiding question for assessment:  Are the systems an institution has created doing justice to the students, faculty, and departments who all hold a stake in them?

Just Assessment

In order to examine how well systems are doing justice to the communities they were designed to serve, assessment must be intentionally inclusive.  It can be a challenge for instructors to account for everything going on in a single class even when they are present in the room.  With assessment, we seek to create a vantage point from which to examine student learning across multiple courses and out-of-class experiences, over multiple semesters.  The more that assessment includes these many interacting elements of a program, the more likely it is that it fairly and adequately represents the program.

More importantly, assessment needs to be inclusive of all the people the program claims to serve.  We may have data that represent some communities of students, but not others.  We may know a lot about how well a program served those who completed it, but very little about those who left before reaching that point.  When we don’t have data to fully represent all experiences or all students, we need to be fully transparent about who and what are represented so that we don’t make the mistake of relying on limited data to speak for everything and everyone.  And we need to be systematic in seeking to learn more about the program experiences of those who are not yet represented in the data.

Examining the justice of our systems also needs to be done in ways that are trustworthy to participants, decision-makers, and other stakeholders.  Users have to be confident that findings address their questions and represent their program well enough to provide a basis for them to make further decisions about it.  If not, the focus will be on the quality of the data rather than what can be learned from it, even with its limitations, and we will rarely get to the point of using the data to examine how much good we are actually doing.  Therefore our assessments need to be user-friendly, contextually relevant, and methodologically sound.  We need to see those who use the findings as partners in identifying and collecting data that they will be willing to trust and use.

If we are serious about assessing the justice of our systems, our efforts also need to be sustainable and integrated into organizational practices.  If it’s not institutionalized in these ways, assessment will happen only when someone gets around to it or puts in extra time to push for it, which means it will be driven by external pressures or internal crises and maintained until attention fades away.  Commitment to justice requires an attention span, and can only be demonstrated by a track record of sustained effort and actions over time.

Accountable for Doing Justice

A commitment to justice is also a commitment to transparency. Assessment often gets associated with the idea of accountability and the suggestion that someone is checking up on us, but Shulman (2007) reminds us that accountability is, in essence, “being able to render an account.” In this sense, the purpose is to make sure we’re rightly telling the whole story. Frankly, in higher education there are plenty of people who are glad to step in and tell our stories for us. Shulman advises, “Our responsibility is to take control of the narrative.” We need to keep ourselves accountable – able to render an account – for showing that the systems we create justly serve our students, faculty, departments, and the communities we are all part of.

If our existing assessments are helping the institution render these accounts, we can be confident that we have the assessment we need. If not, we should seriously consider why we keep doing them. But that doesn’t mean we should stop doing assessment; it means we need to do it better, in ways that are more inclusive, trustworthy, and sustainable, so that we can know our systems are doing the good we are trusting them to do.

Work cited:

Shulman, L. S. (2007). Counting and recounting: Assessment and the quest for accountability. Change, 39(1), 28–35.

Check out our past Viewpoints:

Just Assessment. Nothing More. Nothing Less.
Wayne Jacobson

Design for a Transparent and Engaging Assessment Website
Frederick Burrack and Chris Urban

Improvement Matters
Peter Felten

Working Together to Define and Measure Learning in the Disciplines
Amanda Cook, Richard Arum, and Josipa Roksa

The Simplicity of Cycles
Mary Catharine Lennon

Helping Faculty Use Assessment Data to Provide More Equitable Learning Experiences
Mary-Ann Winkelmes

Ignorance is Not Bliss: Implementation Fidelity and Learning Improvement
Sara J. Finney and Kristen L. Smith

Student Learning Outcomes Alignment through Academic and Student Affairs Partnerships
Susan Platt and Sharlene Sayegh

The Transformation of Higher Education in America: Understanding the Changing Landscape
Michael Bassis

Learning-Oriented Assessment in Practice
David Carless

Moving Beyond Anarchy to Build a New Field
Hamish Coates

The Tools of Intentional Colleges and Universities: The DQP, ELOs, and Tuning
Paul L. Gaston, Trustees Professor, Kent State University

Addressing Assessment Fatigue by Keeping the Focus on Learning
George Kuh and Pat Hutchings, NILOA

Evidence of Student Learning: What Counts and What Matters for Improvement
Pat Hutchings, Jillian Kinzie, and George D. Kuh, NILOA

Using Evidence to Make a Difference
Stan Ikenberry and George Kuh, NILOA

Assessment - More than Numbers
Sheri Barrett

Challenges and Opportunities in Assessing the Capstone Experience in Australia
Nicolette Lee

Making Assessment Count
Maggie Bailey

Some Thoughts on Assessing Intercultural Competence
Darla K. Deardorff

Catalyst for Learning: ePortfolio-Based Outcomes Assessment
Laura M. Gambino and Bret Eynon

The Interstate Passport: A New Framework for Transfer
Peter Quigley, Patricia Shea, and Robert Turner

College Ratings: What Lessons Can We Learn from Other Sectors?
Nicholas Hillman

Guidelines to Consider in Being Strategic about Assessment
Larry A. Braskamp and Mark E. Engberg

An "Uncommon" View of the Common Core
Paul L. Gaston

Involving Undergraduates in Assessment: Documenting Student Engagement in Flipped Classrooms
Adriana Signorini & Robert Ochsner

The Surprisingly Useful Practice of Meta-Assessment
Keston H. Fulcher & Megan Rodgers Good

Student Involvement in Assessment: A 3-Way Win
Josie Welsh

Internships: Fertile Ground for Cultivating Integrative Learning
Alan W. Grose

What if the VSA Morphed into the VST?
George Kuh

Where is Culture in Higher Education Assessment and Evaluation?
Nora Gannon-Slater, Stafford Hood, and Thomas Schwandt

Embedded Assessment and Evidence-Based Curriculum Mapping: The Promise of Learning Analytics
Jane M. Souza

The DQP and the Creation of the Transformative Education Program at St. Augustine University
St. Augustine University

Why Student Learning Outcomes Assessment is Key to the Future of MOOCs
Wallace Boston & Jennifer Stephens

Measuring Success in Internationalization: What are Students Learning?
Madeleine F. Green

Demonstrating How Career Services Contribute to Student Learning
Julia Panke Makela & Gail S. Rooney

The Culture Change Imperative for Learning Assessment
Richard H. Hersh & Richard P. Keeling

Comments on the Commentaries about "Seven Red Herrings"
Roger Benjamin

Ethics and Assessment: When the Test is Life Itself
Edward L. Queen

Discussing the Data, Making Meaning of the Results
Anne Goodsell Love

Faculty Concerns About Student Learning Outcomes Assessment
Janet Fontenot

What to Consider When Selecting an Assessment Management System
R. Stephen RiCharde

AAHE Principles of Good Practice: Aging Nicely
A Letter from Pat Hutchings, Peter Ewell, and Trudy Banta

The State of Assessment of Learning Outcomes
Eduardo M. Ochoa

What is Satisfactory Performance? Measuring Students and Measuring Programs with Rubrics
Patricia DeWitt

Being Confident about Results from Rubrics
Thomas P. Judd, Charles Secolsky & Clayton Allen

What Assessment Personnel Need to Know About IRBs
Curtis R. Naser

How Assessment and Institutional Research Staff Can Help Faculty with Student Learning Outcomes Assessment
Laura Blasi

Why Assess Student Learning? What the Measuring Stick Series Revealed
Gloria F. Shenoy

Putting Myself to the Test
Ama Nyamekye

From Uniformity to Personalization: How to Get the Most Out of Assessment
Peter Stokes

Transparency Drives Learning at Rio Salado College
Vernon Smith

Navigating a Perfect Storm
Robert Connor

It is Time to Make our Academic Standards Clear
Paul E. Lingenfelter

In Search of a Standard of Quality
Michael Bassis

Avoiding a Tragedy of the Commons in Postsecondary Education
Roger Benjamin