National Institute for Learning Outcomes Assessment

NILOA Guest Viewpoints

We’ve invited learning outcomes experts and thought leaders to craft a Viewpoint. We hope that these pieces will spark further conversations and actions that help advance the field. To join the conversation, click the link below the Viewpoint. You can also sign up here to receive monthly newsletters that headline these pieces along with NILOA updates, current news items, and upcoming conferences.


Design for a Transparent and Engaging Assessment Website
Frederick Burrack, Director, and Chris Urban, Assistant Director, Office of Assessment, Kansas State University


Institutions of higher education are increasingly concerned with developing an assessment culture focused on faculty and program ownership. As with any other culture, the norms and values of assessment emerge from interactions among members of that community. For an institutional assessment office, a clear and transparent website communicates those norms and reinforces the institution’s approach to assessment. Focusing on transparency alone, however, can result in information that no one is enticed to view. To be truly engaging, a website also needs to actively encourage the transformation of views and practices by involving users in its content. Balancing these two elements—transparency and engagement—is crucial to designing a good assessment website. The website for Kansas State University’s Office of Assessment recognizes the uniqueness of the student learning that occurs within academic programs and units and emphasizes multiple ways they can learn what students know and can do. We strive to make our website service-oriented: the message throughout should be ‘what can the Office of Assessment do for you’ rather than ‘let us tell you what must be done for us’. By designing our website to encourage constituent ownership of the assessment process, we use purposeful strategies to reinforce this institutional culture of assessment.

Transparency Components [1]

The following are important components of an assessment program and should guide decisions for the development of a website presence:

Student Learning Outcomes should be clearly stated, as these are of primary relevance to constituents using the site (e.g., academic programs, student life units, instructional resource groups). Student learning outcomes must come directly from what programs identify as essential for their students’ learning; ownership of the assessment process begins where constituents find relevance and purpose. Clarity lies in the detail provided about the knowledge, skills, competencies, and habits of mind that students are expected to acquire and demonstrate. An important consideration is the alignment of these program-level outcomes with clearly stated university-wide outcomes, and this alignment is a high priority in the communication plan for our site. Communicating how overarching learning goals align within programs and across the university creates opportunities to discover commonalities in student learning expectations across disciplines.

Programmatic Assessment Plans that convey how student learning is assessed, which data collection tools and approaches are used, and the timeline for implementation can guide programs in developing meaningful assessment processes. This is best accomplished by showcasing a variety of assessment processes, which both encourages respectful autonomy and communicates a common purpose for assessment. Commonality among program assessment plans will often be evident in their alignment to institution-wide outcomes. The process for reporting must be clearly defined, with instructional guidance provided for each step. A site can provide text descriptions, videos, templates, and examples to assist in the thorough planning, administration, and reporting of student learning assessment.

Evidence of Student Learning is shared through the results of assessment activities. This includes results from indirect (e.g., surveys) and direct (e.g., course-embedded and standardized measures) assessments of student learning. A web presence can publicly present information about each unit’s and the overall institution’s assessment of student learning. Our site provides university- and college-wide results from the National Survey of Student Engagement, Annual Senior Survey, Annual Alumni Survey, and General Education Survey. From these, we report institutional data as well as data disaggregated by college. Direct data from course-based assessments and from standardized measures are reported by university outcome and made public in aggregate form. Program-specific assessment outcome data are available through password-protected access.

Use of Student Learning Evidence for program improvement provides guidance for constituents on how to use evidence of student learning for continual improvement (e.g., curricular changes implemented, instructional strategies developed, adaptations made in assessment measures or scoring devices, programmatic enhancements initiated for improvement, decision-making processes initiated). Programs and units involved in student learning assessment often use these examples as models for problem identification, planning, goal setting, faculty development, course revision, program review, accreditation, and/or self-study. Examples of continual program improvement are an important means through which programs can learn to use their assessment processes and data most effectively.

Assessment Resources should provide relevant information or training for faculty and staff to help them understand, develop, implement, communicate, and use evidence of student learning. One option is to organize this as a toolkit that provides instructional documents and videos, examples applied at the university, and external resources to guide assessment practice at any level. Our assessment toolkit is organized into six categories:

  • ASSESSMENT BASICS: Assessment defined; Components of student learning assessment; Principles of meaningful assessment; Program assessment process; Continuous improvement; Assessment glossary; FAQs
  • ASSESSMENT PLANNING: Policies and procedures for new programs; Developing an assessment plan; Alignment and curriculum mapping; Assessment plans; High impact practices
  • STUDENT LEARNING OUTCOMES: Writing measurable outcomes; Learning outcome generators; University learning outcomes; Bloom’s Taxonomy; Action verb lists
  • MEASUREMENT: Reasons for measurement; Measurement best practices; Choosing measurement tools; Rubrics; Guides to effective measurement
  • IMPROVEMENT THROUGH ASSESSMENT: Examples of best practice and improvement from the campus; Examples from other institutions
  • ASSESSMENT LIBRARY: Resources are curated and organized so what constituents actually use is clearly visible and accessible.

Assessment Activities described on an assessment website could include information about projects and activities that help programs gauge student learning, make improvements, provide programmatic professional development, or respond to accountability interests. We host professional development opportunities on assessment basics as well as on topics that support concurrent initiatives. Our website is the primary mechanism for promoting and disseminating best practices through the Institute for Student Learning Assessment, our annual university assessment conference. Annual assessment awards recognizing outstanding achievements in student learning assessment are posted on the site. The site is also the primary means of demonstrating assessment processes and technology during professional development sessions.

Engagement Components

Crucial to achieving transparency in an assessment website is presenting content in ways that constituents will understand and use. An assessment website may have all the elements necessary for transparency, but if those elements are not presented in an engaging way, constituents may never experience them, which undermines transparency.

Create Visual Appeal. The initial view of the website should visually reflect the assessment culture at the institution and encourage engagement with the site; it should not be overly complicated. Banner images should be clear and easily understood, quickly focusing attention on faculty ownership of the assessment process. Whatever text is offered should clearly communicate the student-centered paradigm, using terminology that is faculty oriented. When working toward transparency that promotes faculty and staff ownership of the assessment process, it is essential that constituents perceive relevance and purpose from first glance all the way through deeper inquiry.

Emphasize Improvement Initiatives. Components of web-based assessment communication should reflect the core elements of the mission of the assessment office. For us, this focus encourages ownership of the learning experiences and the desire to identify the quality of learning that results. This is especially important when promoting professional development opportunities.

Use appropriate formatting for different sections/topics. Engaging users who have different expectations for different content requires different formats within a common structure. Assessment websites are sometimes little more than pages of links to static handbooks or reports that users must wade through to find the information they need. Slicing information into sections formatted in a variety of ways engages users and demonstrates consideration of their needs.

Utilize existing information. Not everything needs to be created in-house. There is a wealth of information on websites such as those of the Assessment Commons, NILOA, and the DQP. Linking to pre-existing resources saves time, and resources that faculty have developed or used carry added credibility. Rather than providing only in-house resources that reflect what “we say assessment should be”, linking to outside resources shows the variety of assessment methods available.

Embed elements within useful instructional modules. We are designing our site to be more instructional than presentational. The intent is to be interactive in the sense that constituents who use the site can learn through the inquiry process. The design focuses on current resources that address questions faculty ask about their assessment processes, as well as strategies for analyzing and discussing assessment data. We are trying to provide models for what to do with assessment rather than what to know about assessment. We are currently developing short instructional videos of faculty sharing how they implement and effectively use specific assessment processes in their programs. The Assessment Showcase and the Institute for Student Learning Assessment give programs, faculty, and staff the opportunity to learn about effective assessment practice and transfer ideas into their own contexts.

Utilize technologies for engagement. Enhanced technologies allow more interactivity with data and analytics through interactive dashboards and reports that can be manipulated by the viewer. At Kansas State University, we utilize these reports to provide triangulated views of assessment data that include indirect and direct results.  Faculty interact with the results to see how particular groups of students are performing.  Static PDF reports are still available for use beyond the web.
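
To make the idea of viewer-manipulated reports concrete, the sketch below shows the kind of disaggregation such a dashboard performs behind the scenes. It is illustrative only: the data, column names, and the disaggregate helper are invented for this example and are not drawn from Kansas State’s actual dashboards or data.

    import pandas as pd

    # Hypothetical rubric scores for one university outcome, one row per
    # assessed student artifact (invented data on a 4-point rubric scale).
    results = pd.DataFrame({
        "college": ["Arts & Sciences", "Arts & Sciences", "Engineering", "Engineering"],
        "cohort":  ["First-year", "Transfer", "First-year", "Transfer"],
        "score":   [3.2, 2.8, 3.5, 3.0],
    })

    def disaggregate(df, by):
        """Mean rubric score, split by a grouping column the viewer selects."""
        return df.groupby(by)["score"].mean()

    # "Manipulating" the report reduces to choosing the grouping column:
    print(disaggregate(results, by="college"))  # results disaggregated by college
    print(disaggregate(results, by="cohort"))   # performance of student groups

In a full dashboard, the same grouping choice would simply be wired to an on-page control, and only aggregate results would be published, consistent with the aggregate reporting described above.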

Utilize the same resources to serve multiple purposes/constituencies. Institutional assessment offices serve and communicate with multiple constituencies, from governing boards, to accreditors, to faculty and students.  When publishing resources, it is often useful to think of who the resource is for, but also whether it can be designed to serve other constituencies.   

Use appropriate structure and language. The structure and language of your website should reinforce your approach to assessment and how you view your website’s role. Prescriptive, institutionally centered assessment requires a different structure and terminology than faculty-oriented assessment.

Curate resources to support specific initiatives. An effective website is actively adapted to fit current needs. As events and initiatives are planned, consider the many ways the website can be used to support them. Often, pre-existing information can be emphasized through a direct home page link, consolidation on a new page, or placement within other resources. Curating resources, announcements, and event notifications gives users what they most likely want to engage with (or what you want them to engage with) upon accessing the page. In contrast, a long series of links to all resources in a similar format requires users to sift through information, making it more likely they will disengage.

Highlight exemplary institutional examples. Spotlighting particularly impactful assessment examples can help in three ways.  First, highlighted programs can be held up as models for other programs.  Second, programs doing one or two things well (but that still need work) can be highlighted to provide encouragement for additional development.  Third, highlighting programs can strengthen ties between the assessment office and faculty/staff representatives. 

[1] Freely adapted from the NILOA Transparency Framework

Check out our past Viewpoints:

Design for a Transparent and Engaging Assessment Website
Frederick Burrack and Chris Urban

Improvement Matters
Peter Felten

Working Together to Define and Measure Learning in the Disciplines
Amanda Cook, Richard Arum, and Josipa Roksa

The Simplicity of Cycles
Mary Catharine Lennon

Helping Faculty Use Assessment Data to Provide More Equitable Learning Experiences
Mary-Ann Winkelmes

Ignorance is Not Bliss: Implementation Fidelity and Learning Improvement
Sara J. Finney and Kristen L. Smith

Student Learning Outcomes Alignment through Academic and Student Affairs Partnerships
Susan Platt and Sharlene Sayegh

The Transformation of Higher Education in America: Understanding the Changing Landscape
Michael Bassis

Learning-Oriented Assessment in Practice
David Carless

Moving Beyond Anarchy to Build a New Field
Hamish Coates

The Tools of Intentional Colleges and Universities: The DQP, ELOs, and Tuning
Paul L. Gaston, Trustees Professor, Kent State University

Addressing Assessment Fatigue by Keeping the Focus on Learning
George Kuh and Pat Hutchings, NILOA

Evidence of Student Learning: What Counts and What Matters for Improvement
Pat Hutchings, Jillian Kinzie, and George D. Kuh, NILOA

Using Evidence to Make a Difference
Stan Ikenberry and George Kuh, NILOA

Assessment - More than Numbers
Sheri Barrett

Challenges and Opportunities in Assessing the Capstone Experience in Australia
Nicolette Lee

Making Assessment Count
Maggie Bailey

Some Thoughts on Assessing Intercultural Competence
Darla K. Deardorff

Catalyst for Learning: ePortfolio-Based Outcomes Assessment
Laura M. Gambino and Bret Eynon

The Interstate Passport: A New Framework for Transfer
Peter Quigley, Patricia Shea, and Robert Turner

College Ratings: What Lessons Can We Learn from Other Sectors?
Nicholas Hillman

Guidelines to Consider in Being Strategic about Assessment
Larry A. Braskamp and Mark E. Engberg

An "Uncommon" View of the Common Core
Paul L. Gaston

Involving Undergraduates in Assessment: Documenting Student Engagement in Flipped Classrooms
Adriana Signorini & Robert Ochsner

The Surprisingly Useful Practice of Meta-Assessment
Keston H. Fulcher & Megan Rodgers Good

Student Involvement in Assessment: A 3-Way Win
Josie Welsh

Internships: Fertile Ground for Cultivating Integrative Learning
Alan W. Grose

What if the VSA Morphed into the VST?
George Kuh

Where is Culture in Higher Education Assessment and Evaluation?
Nora Gannon-Slater, Stafford Hood, and Thomas Schwandt

Embedded Assessment and Evidence-Based Curriculum Mapping: The Promise of Learning Analytics
Jane M. Souza

The DQP and the Creation of the Transformative Education Program at St. Augustine University
St. Augustine University

Why Student Learning Outcomes Assessment is Key to the Future of MOOCs
Wallace Boston & Jennifer Stephens

Measuring Success in Internationalization: What are Students Learning?
Madeleine F. Green

Demonstrating How Career Services Contribute to Student Learning
Julia Panke Makela & Gail S. Rooney

The Culture Change Imperative for Learning Assessment
Richard H. Hersh & Richard P. Keeling

Comments on the Commentaries about "Seven Red Herrings"
Roger Benjamin

Ethics and Assessment: When the Test is Life Itself
Edward L. Queen

Discussing the Data, Making Meaning of the Results
Anne Goodsell Love

Faculty Concerns About Student Learning Outcomes Assessment
Janet Fontenot

What to Consider When Selecting an Assessment Management System
R. Stephen RiCharde

AAHE Principles of Good Practice: Aging Nicely
A Letter from Pat Hutchings, Peter Ewell, and Trudy Banta

The State of Assessment of Learning Outcomes
Eduardo M. Ochoa

What is Satisfactory Performance? Measuring Students and Measuring Programs with Rubrics
Patricia DeWitt

Being Confident about Results from Rubrics
Thomas P. Judd, Charles Secolsky & Clayton Allen

What Assessment Personnel Need to Know About IRBs
Curtis R. Naser

How Assessment and Institutional Research Staff Can Help Faculty with Student Learning Outcomes Assessment
Laura Blasi

Why Assess Student Learning? What the Measuring Stick Series Revealed
Gloria F. Shenoy

Putting Myself to the Test
Ama Nyamekye

From Uniformity to Personalization: How to Get the Most Out of Assessment
Peter Stokes

Transparency Drives Learning at Rio Salado College
Vernon Smith

Navigating a Perfect Storm
Robert Connor

It is Time to Make our Academic Standards Clear
Paul E. Lingenfelter

In Search for Standard of Quality
Michael Bassis

Avoiding a Tragedy of the Commons in Postsecondary Education
Roger Benjamin