Aligning Assessments to SLOs

An assessment can serve multiple purposes. There's the traditional purpose of evaluating whether the student has learned the objectives (termed a summative assessment because it is a final judgment). Alternatively, it can give the student the opportunity to self-assess what they've learned (a form of formative assessment because the goal is to identify where improvements can be made). In either case, it's important for the assessment to align closely with the student learning objectives (SLOs). This is tied to the issue of face validity (whether the assessment appears, on its face, to measure what it is purported to measure). SLOs are written with specific verbs describing the performance expected of the student after mastery. These verbs help indicate which type of assessment is most appropriate (see tables below).

[Figure: Bloom's Taxonomy graph]

Best Practices

  • List SLOs on the module page with that week’s activities and lectures so students can see the objectives for themselves.
  • To create higher-order multiple-choice questions, you often have to write more complex stems (e.g., a reading passage or chart) that ask students to interpret, infer, predict, or conclude from the material.
  • Assignments are another great way to assess student learning. Don't feel locked into the suggestions above; brainstorm projects, case studies, presentations, and other ways that students can demonstrate they've met the objective.
  • Refer to your SLOs when designing your rubric for grading essays, discussion board posts, and other assignments to ensure alignment between the objective and what you emphasize when assigning points.

Resources

  • Examples of Question/Assessment Type Alignment to Verb
  • The seminal work on the taxonomy of educational objectives is:
    • Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: Handbook I: Cognitive domain. New York, NY: David McKay Co.
  • Jones, K. O., Harland, J., Reid, J. M., & Bartlett, R. (2009). Relationship between examination questions and Bloom’s taxonomy. In Frontiers in Education Conference (pp. 1-6).
    • This work was done from an engineering degree program’s perspective. It not only reviews Bloom’s taxonomy in relation to SLOs but also discusses how focus and perspective affect the analysis of higher-order thinking questions, rather than simply categorizing SLOs by verb (p. 3). The authors also discuss their surprise at how few higher-order thinking questions appeared in their exams, and they recognize the need to assess whether such questions are in fact included in coursework assignments or need to be added to their assessment methods.
  • Pashler, H., Bain, P., Bottge, B., Graesser, A., Koedinger, K., McDaniel, M., & Metcalfe, J. (2007). Organizing Instruction and Study to Improve Student Learning (NCER 2007-2004). Washington, DC: National Center for Education Research, Institute of Education Sciences, U.S. Department of Education. Retrieved from http://ncer.ed.gov
    • Recommendation #7 emphasizes higher-order thinking questions, including ways to prompt students to provide deep explanations. A summary of the evidence is included for each recommendation.
  • Rex Heer at Iowa State University created a wonderful interactive model showcasing Bloom’s Taxonomy with the addition of the knowledge dimension. It shows, for example, the difference between covering factual information and concepts within a category such as understand (aka comprehension). Note that there are no hard-and-fast rules: some of the examples could sit in other boxes, but what matters is the general concept of recognizing the difference between lower-order and higher-order thinking objectives across different types of knowledge.
    • The model represents outcomes from a special issue, Revising Bloom’s Taxonomy, of Theory into Practice. Of particular note in this issue is:
      • Airasian, P. W., & Miranda, H. (2002). The role of assessment in the revised taxonomy. Theory into Practice, 41(4), 249-254.