National Institute for Learning Outcomes Assessment

NILOA Guest Viewpoints

We’ve invited learning outcomes experts and thought leaders to craft a Viewpoint. We hope that these pieces will spark further conversations and actions that help advance the field.


The Simplicity of Cycles
Mary Catharine Lennon, University of Toronto


Education is fundamentally based on the joint notions of “teach, practice, and assess.” When working with learning outcomes, the cycle repeats as “articulate, incorporate, and measure.” The basic idea is to identify what you are trying to achieve, integrate it into activities, and assess any changes. Until the final stage is complete, the initial phase only identifies goals, and the activities are guesses at how to achieve those goals. Only through evaluation do we ‘close the loop’ and learn whether we are achieving our goals.

As soldiers in the field of learning outcomes, we constantly reinforce the value of this cycle and the importance of appropriate measures to close the loop. We argue that it is critical to provide assessment feedback to improve the teaching and learning experience and to demonstrate the quality of postsecondary education.

Knowing that a cyclic approach is a best practice, why don’t we apply it to the policies we develop for learning outcomes? Policies themselves are based on the same cyclic model: “formulate, implement, and evaluate.”

Somehow, we are so focused on determining whether learning outcomes make a difference at the micro-level that we often neglect broader work examining whether there has actually been any policy impact at the macro-level. For example, what has been the impact of the regulatory agency policies (e.g., quality assurance, accreditation, accountability) that are driving many institutional activities?

To answer this simple question, members of the International Network for Quality Assurance Agencies in Higher Education (INQAAHE) and the CHEA International Quality Group (CIQG) were surveyed on their policy activities[1], and 74 agencies from around the world responded. The survey’s primary questions asked what the policy goals were, how agencies sought to achieve those goals, and how achievement of those goals was assessed. Agencies that had conducted actual research on their policies were asked follow-up questions about whether the goals had been achieved. Respondents that had not conducted research were instead asked about the perceived impact of the learning outcomes policies. The Figure below presents the results of the questions on goals, impacts, and impressions of impact.


The first thing to notice in the Figure is the wide range of goals the agencies reported. Improved student learning was identified in over 80% of the policies, followed by labour market alignment and institutional improvement. This suggests that policies stemming from quality assurance agencies carry widely differing intentions.

The second point of interest concerns comparing the stated policy goals to the research findings on impact. Those agencies that had conducted research on their policies found little impact except in the area of economic development. Note, however, that these data reflect only the percentage of agencies that indicated the goal (N = 29) and the percentage that indicated impact (N = 14). They are not necessarily the same agencies and, significantly, not all of the 74 agencies answered the first question (perhaps because the policy goals were not clearly identified at the outset).

An intriguing result of the survey is the comparison of perceived impact with measured impact. Across the board, the perception is that the policies are having little impact on their targets. Yet the research findings show that the policies are often more effective than perceived. The largest differences appear in the impact on credit transfer and on teaching and learning.

Ultimately, evidence of achievement is critical to properly understanding the impact of activities. Current perceptions of the value of learning outcomes policies are misaligned with reality, which suggests that policy decisions are likely being made on the wrong information: impressions rather than facts. More than commentary on the impact of learning outcomes policies, the findings suggest that more policy research needs to be conducted – that we need to evaluate the outcomes of our activities in order to fairly value the impact of the work.

This is not news, but it is an important reminder that in both the microcosm of the learning outcomes cycle of ‘articulate, incorporate, measure’ and the macrocosm of the ‘formulate, implement, evaluate’ policy cycle, closing the loop through evaluation is critical for success.


Lennon, M. C. (2016). In search of quality: Evaluating the impact of learning outcomes policies in higher education regulation (Doctoral dissertation).

[1] The survey was part of a broader research study conducted as a PhD dissertation (Lennon, 2016).


Check out our past Viewpoints:

Helping Faculty Use Assessment Data to Provide More Equitable Learning Experiences
Mary-Ann Winkelmes

Ignorance is Not Bliss: Implementation Fidelity and Learning Improvement
Sara J. Finney and Kristen L. Smith

Student Learning Outcomes Alignment through Academic and Student Affairs Partnerships
Susan Platt and Sharlene Sayegh

The Transformation of Higher Education in America: Understanding the Changing Landscape
Michael Bassis

Learning-Oriented Assessment in Practice
David Carless

Moving Beyond Anarchy to Build a New Field
Hamish Coates

The Tools of Intentional Colleges and Universities: The DQP, ELOs, and Tuning
Paul L. Gaston, Trustees Professor, Kent State University

Addressing Assessment Fatigue by Keeping the Focus on Learning
George Kuh and Pat Hutchings, NILOA

Evidence of Student Learning: What Counts and What Matters for Improvement
Pat Hutchings, Jillian Kinzie, and George D. Kuh, NILOA

Using Evidence to Make a Difference
Stan Ikenberry and George Kuh, NILOA

Assessment - More than Numbers
Sheri Barrett

Challenges and Opportunities in Assessing the Capstone Experience in Australia
Nicolette Lee

Making Assessment Count
Maggie Bailey

Some Thoughts on Assessing Intercultural Competence
Darla K. Deardorff

Catalyst for Learning: ePortfolio-Based Outcomes Assessment
Laura M. Gambino and Bret Eynon

The Interstate Passport: A New Framework for Transfer
Peter Quigley, Patricia Shea, and Robert Turner

College Ratings: What Lessons Can We Learn from Other Sectors?
Nicholas Hillman

Guidelines to Consider in Being Strategic about Assessment
Larry A. Braskamp and Mark E. Engberg

An "Uncommon" View of the Common Core
Paul L. Gaston

Involving Undergraduates in Assessment: Documenting Student Engagement in Flipped Classrooms
Adriana Signorini & Robert Ochsner

The Surprisingly Useful Practice of Meta-Assessment
Keston H. Fulcher & Megan Rodgers Good

Student Involvement in Assessment: A 3-Way Win
Josie Welsh

Internships: Fertile Ground for Cultivating Integrative Learning
Alan W. Grose

What if the VSA Morphed into the VST?
George Kuh

Where is Culture in Higher Education Assessment and Evaluation?
Nora Gannon-Slater, Stafford Hood, and Thomas Schwandt

Embedded Assessment and Evidence-Based Curriculum Mapping: The Promise of Learning Analytics
Jane M. Souza

The DQP and the Creation of the Transformative Education Program at St. Augustine University
St. Augustine University

Why Student Learning Outcomes Assessment is Key to the Future of MOOCs
Wallace Boston & Jennifer Stephens

Measuring Success in Internationalization: What are Students Learning?
Madeleine F. Green

Demonstrating How Career Services Contribute to Student Learning
Julia Panke Makela & Gail S. Rooney

The Culture Change Imperative for Learning Assessment
Richard H. Hersh & Richard P. Keeling

Comments on the Commentaries about "Seven Red Herrings"
Roger Benjamin

Ethics and Assessment: When the Test is Life Itself
Edward L. Queen

Discussing the Data, Making Meaning of the Results
Anne Goodsell Love

Faculty Concerns About Student Learning Outcomes Assessment
Janet Fontenot

What to Consider When Selecting an Assessment Management System
R. Stephen RiCharde

AAHE Principles of Good Practice: Aging Nicely
A Letter from Pat Hutchings, Peter Ewell, and Trudy Banta

The State of Assessment of Learning Outcomes
Eduardo M. Ochoa

What is Satisfactory Performance? Measuring Students and Measuring Programs with Rubrics
Patricia DeWitt

Being Confident about Results from Rubrics
Thomas P. Judd, Charles Secolsky & Clayton Allen

What Assessment Personnel Need to Know About IRBs
Curtis R. Naser

How Assessment and Institutional Research Staff Can Help Faculty with Student Learning Outcomes Assessment
Laura Blasi

Why Assess Student Learning? What the Measuring Stick Series Revealed
Gloria F. Shenoy

Putting Myself to the Test
Ama Nyamekye

From Uniformity to Personalization: How to Get the Most Out of Assessment
Peter Stokes

Transparency Drives Learning at Rio Salado College
Vernon Smith

Navigating a Perfect Storm
Robert Connor

It is Time to Make our Academic Standards Clear
Paul E. Lingenfelter

In Search of a Standard of Quality
Michael Bassis

Avoiding a Tragedy of the Commons in Postsecondary Education
Roger Benjamin