Assessment should capture learning over time.

  • It should measure students’ overall learning and the development of specific skills and capacities, not just describe “exposure” to certain kinds of content and learning opportunities in their courses.
  • It is fine to collect data in the context of courses, but assessment should involve efforts to understand and interpret those data in relation to tangible learning goals you have named for your students.

Assessment is not the same as data collection.

  • Your assessment process should include analysis and interpretation of patterns in student learning, meaning students’ abilities, skills, and experiences in your department or program (in the major/minor, a given course sequence, or general education courses). It is an opportunity for research and conversation about what is happening in your programs.

Recording grades is not enough!

  • Routine evaluation of student work (e.g., by assigning grades on exams and papers) is of course a form of assessment. But it usually focuses on discrete performance in a given course and is rarely concerned with growth over time (although individual faculty may well be aware of that growth trajectory).
  • If you want to use course material for assessment, think of the activity of assessing as distinct from what you are already doing, with core learning objectives in the major as your priority.
  • Ideally, the program considers multiple contexts when evaluating student learning.

Use what you’ve already got, but use it intentionally.

  • Build on the strengths, defining features, and core learning priorities of your programs when developing an assessment methodology.
    • For example, if you have an especially robust capstone process or attentive mentoring, or you conduct regular evaluation of oral presentations, find a way to make these elements focal points––and occasions––for assessment. They already reflect priorities you’ve named for your students’ learning.
  • This assessment should, however, go beyond routine evaluation (assigning a grade); that is, it should be concerned with general learning and skills, and should contribute to systematic assessment of the program.

Data do not have to be quantitative!

  • Qualitative data are also welcome, and indeed often more revealing.
    • For example, comments in an online class discussion or observed student behavior may be used to identify areas of strength or areas for improvement, if analyzed systematically.
  • Rubrics and numerical scoring can speed up a process of review, but effective assessment often involves triangulating multiple forms of data, including qualitative analysis.

Analyze and interpret!

  • Reports should not simply present undigested data (e.g., a full set of survey responses or a handful of average scores on essays).
    • If you systematically collect data of this kind, great; use the report to communicate findings from a given cycle, meaning both what you make of the data and what you plan to do in response, and be sure to include questions and considerations that will inform the next round of assessment.

Use the assessment report to tell a specific story.

  • Think of it as an occasion to articulate core goals and priorities for your unit: to explain how the assessment priorities you have named and developed (perhaps over many years) reflect the particular concerns of your faculty and the experiences of your students, and to delve into teaching-and-learning issues that really matter to you.

Develop context-specific assessment goals.

  • While it is certainly appropriate to attend to general trends and patterns in student learning/growth, it is also helpful to name specific goals for a given cycle or period that correspond to specific issues or topics you wish to explore.
  • This prevents assessment from becoming so generic that it is unhelpful for program decision-making. It also invites you to think about general trends in relation to the contextual factors your unit is concerned with right now, and to investigate developments that have implications for instruction, curriculum, and resources.

Contact: Heidi Aronson Kolk