Overview

Learning analytics is a field of research that encompasses the collection and analysis of student data (primarily from learning management systems) to provide feedback that can be used to improve teaching and learning. As learning analytics tools emerge and become available to instructors, the review and analysis of student data can be incorporated into a cycle of continuous improvement.

Much research in this growing field seeks to use big data to predict student learning outcomes and to find effective interventions for improving those outcomes within individual courses and across colleges and universities. In addition, some work aims to give students access to learning analytics data, empowering self-regulated learners to take responsibility for meaningful learning gains.

Because learning analytics continues to be an active area of research, the Application section below provides suggestions for how instructors can analyze data and take action at the course level with the data currently available through the learning management system (without access to “big data”).

Continue exploring this page, or request assistance from the Center for Instructional Technology and Training.

Application

Once student data has been collected, it can be analyzed, and interventions can be implemented and reviewed. For information on how to collect quantitative and qualitative data, review Soliciting Student Feedback.

Analyzing Data

Data analysis looks different depending on whether quantitative or qualitative data is collected, but in either case analysis can uncover trends, patterns, or unique situations that may not have been obvious without soliciting feedback. The following examples offer a few options for analyzing both qualitative and quantitative data; many additional approaches can also be used.

Look for trends in the gradebook (quantitative): Course gradebooks can reveal patterns of behavior that may hint at issues that need to be addressed. Typical patterns include:

  • Frequent late submissions to certain assignments or assignment types
  • Students performing below expected mastery level on an assignment
  • A drop in the class average on a particular quiz

If a pattern emerges, it’s helpful to review course content and alignment to make sure the material was covered adequately and that the assessment aligns well with the stated student learning objectives. It’s also important to review the timing of the assessment within the semester to determine whether students had a competing priority that affected their preparation. For example, a group of students may do poorly on an assessment because they have dedicated their time to finishing a project or studying for an exam in another class.

If you’re unsure of the cause, you can employ qualitative data techniques or survey the students to investigate the pattern further before deciding whether adjustments are necessary.
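If your gradebook can be exported to a spreadsheet, some of these patterns can be surfaced with a short script rather than by eye. The following is a minimal sketch in Python, assuming a hypothetical CSV export with one row per submission and columns named student, assignment, score, points_possible, submitted_at, and due_at; real LMS exports differ, so the column names and the 80% mastery threshold are placeholders to adapt.

```python
import pandas as pd

# Hypothetical gradebook export: one row per submission.
# The column names below are assumptions, not an LMS standard.
gradebook = pd.read_csv(
    "gradebook_export.csv",
    parse_dates=["submitted_at", "due_at"],
)

gradebook["late"] = gradebook["submitted_at"] > gradebook["due_at"]
gradebook["pct"] = gradebook["score"] / gradebook["points_possible"] * 100

# Assignments with the highest share of late submissions
late_rates = gradebook.groupby("assignment")["late"].mean()
print(late_rates.sort_values(ascending=False).head())

# Assignments where the class average falls below an assumed 80% mastery level
averages = gradebook.groupby("assignment")["pct"].mean()
print(averages[averages < 80].sort_values())
```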

Review and improve quiz questions (quantitative): It’s a good idea to complete an item analysis on every new multiple-choice quiz or test to identify questions that could be rewritten to be more effective. The goal is to write valid questions that reliably discriminate between students who know the material being tested and those who do not. For more information on how to use the statistics available to you within the UF e-Learning system, sign up for Interpreting Course Analytics to Address Student Needs.
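If quiz results can be exported as a score matrix, the two standard item-analysis statistics are straightforward to compute yourself: difficulty (the proportion of students answering an item correctly) and discrimination (the correlation between an item and the rest of the test). The sketch below assumes a hypothetical CSV with one row per student and one 0/1 column per question; UF e-Learning reports comparable quiz statistics without any scripting.

```python
import pandas as pd

# Hypothetical export: one row per student, one 0/1 column per question.
responses = pd.read_csv("quiz_item_scores.csv", index_col="student")
totals = responses.sum(axis=1)

for item in responses.columns:
    difficulty = responses[item].mean()  # proportion answering correctly
    # Item-total (point-biserial) correlation, with the item excluded
    # from the total so it is not correlated with itself
    discrimination = responses[item].corr(totals - responses[item])
    print(f"{item}: difficulty={difficulty:.2f}, "
          f"discrimination={discrimination:.2f}")
```

As a common rule of thumb, items that nearly everyone gets right or wrong, or whose discrimination is near zero or negative, are the first candidates for rewriting.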

Look for trends in survey results (quantitative and qualitative): When reviewing survey results, it can be helpful to sort student responses by one or more factors collected through the survey. For example, if the survey asks students for their expected letter grade or current average, sorting by those factors may reveal differences in how ‘A’ students and ‘C’ students feel about aspects of the course. It may also be useful to compare responses from majors and non-majors. These breakdowns can help identify how the course can be improved to better serve all students.
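When survey results can be exported to a spreadsheet, this kind of breakdown is a simple grouping operation. The sketch below assumes a hypothetical export with an expected_grade column, an is_major flag, and 1–5 Likert ratings for a few course aspects; all of the column names are illustrative.

```python
import pandas as pd

# Hypothetical survey export; all column names are assumptions.
survey = pd.read_csv("midsemester_survey.csv")
likert_cols = ["pace", "workload", "clarity_of_lectures"]  # 1-5 ratings

# Average ratings broken down by expected letter grade
print(survey.groupby("expected_grade")[likert_cols].mean().round(2))

# The same breakdown for majors vs. non-majors
print(survey.groupby("is_major")[likert_cols].mean().round(2))
```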

Review discussion boards (qualitative): Review ungraded discussion boards (e.g., Stop-Start-Continue, FAQ, or Course Questions boards) to identify areas where students needed more clarification. Additionally, review graded discussion boards for quantity (frequency of posts beyond the required minimum) and quality to see how you could make your prompts more engaging.
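Judging quality ultimately requires reading the posts, but the quantity side of this review can be scripted if discussion posts can be exported. The sketch below assumes a hypothetical export with one row per post and columns named topic, author, and body, plus a required minimum of two posts per topic; both the columns and the minimum are placeholders.

```python
import pandas as pd

# Hypothetical discussion export: one row per post.
posts = pd.read_csv("discussion_posts.csv")  # columns: topic, author, body

# Posts per student per topic, beyond an assumed minimum of two
counts = posts.groupby(["topic", "author"]).size()
required = 2  # e.g., one original post plus one reply
beyond_minimum = (counts - required).clip(lower=0)
print(beyond_minimum.groupby(level="topic").mean())

# A rough engagement proxy: median word count per topic
# (no substitute for actually reading the posts)
posts["words"] = posts["body"].str.split().str.len()
print(posts.groupby("topic")["words"].median())
```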

Taking Action

When deciding how to take action, it’s important to weigh the following questions:

  • Does it make sense to intervene by contacting a subset of students who are struggling, or is it best to send an announcement to the entire class?
  • What are the benefits and consequences if you were to implement a change to the course immediately?
  • What are the benefits and consequences of waiting to implement a change until a future semester?
  • If you do make course-wide changes, how will you communicate these to students?

When taking action, remember to practice continuous improvement: keep track of the changes you make and set aside time to analyze whether each change was effective.

References and Additional Resources

Further Exploration

EDUCAUSE (2012). Analytics Resources Bibliography.

Malterud, K. (2001). Qualitative research: Standards, challenges, and guidelines. The Lancet, 358, 483–488.

Shea, P. (Ed.). (2016). Online Learning, 20(2), special issue: Learning analytics. Online Learning Consortium.

University of British Columbia (2016). Annotated Bibliography Series: Learning Analytics.

University of Maryland University College (2013). Learning Analytics in Higher Education: An Annotated Bibliography.

Wise, A. F. (2014). Designing pedagogical interventions to support student use of learning analytics. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (LAK ’14), 203–211.

Available Instructional Development