Using data to support learning at the university
The focus of learning analytics is to provide actionable data that can improve the teaching and learning environment. Learning analytics has been contextually defined for campus as “the undertaking of activities that generate actionable data from the learning intended to improve student outcomes by informing structure, content, delivery or support of the learning environment” (defined in 2018 by the Learning Analytics Roadmap Committee, led by the Vice Provost for Teaching & Learning).
Learning analytics is a process or instructional approach, not a specific tool, although it uses tools and data to answer questions. There are numerous ways learning analytics can be used, as explained in the Learning Analytics Functional Taxonomy, and a number of projects and resources on campus support the different opportunities.
Work with us
We collaborate on strategic teaching and learning data initiatives aimed at sustained data-driven practices that maintain and enhance student learning.
Projects
Learning analytics continues to gain interest and demonstrate value in higher education. A variety of approaches have been piloted on campus over the past decade. Opportunities continue to arise with the university’s migration to Badger Analytics, our participation in the Unizin Consortium and a new institutional data policy. LACE collaborates with the Data Empowered Educational Practices (DEEP) Executive Committee, the Vice Provost for Teaching & Learning and others on the following learning analytics projects. Close collaboration with stakeholders is a central component of our project approach.
Learner Engagement Analytics Dashboard (LEAD)
Planning and design for the course-level dashboard began in early 2018 with a series of faculty interviews and discussions to obtain direct feedback about instructor analytics needs. The dashboard has been available enterprise-wide since fall 2020 and includes three visualizations in Tableau:
- Heat map: Shows when students are active in the course
- Scatter plot: Maps grades and number of page views
- Activity in the course: Displays both duration and number of logins
LEAD aggregates data from Canvas, Kaltura MediaSpace, and Unizin Engage e-texts. This combined view lets instructors see behavioral data about student activity in their courses and provides insights not available from any single tool.
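To make the heat map concrete, here is a minimal sketch of the day-by-hour aggregation such a visualization colors. It uses pandas on a hypothetical Canvas-style event log; the column names and sample data are invented for illustration and do not reflect LEAD's actual schema or pipeline.

```python
import pandas as pd

# Hypothetical Canvas-style event log; LEAD's real data model may differ.
events = pd.DataFrame({
    "student_id": [101, 101, 102, 103, 103, 104],
    "timestamp": pd.to_datetime([
        "2023-09-05 09:15", "2023-09-05 21:40", "2023-09-06 09:05",
        "2023-09-06 14:30", "2023-09-07 09:10", "2023-09-07 22:05",
    ]),
})

# Count events per (day of week, hour) cell -- the grid a heat map colors
# to show when students are active in the course.
heat = (
    events.assign(day=events["timestamp"].dt.day_name(),
                  hour=events["timestamp"].dt.hour)
          .pivot_table(index="day", columns="hour",
                       values="student_id", aggfunc="count", fill_value=0)
)
print(heat)
```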
Learner Activity View for Advisors (LAVA)
Currently in the pilot stage, the Learner Activity View for Advisors (LAVA) is a learning analytics resource that displays high-level trend data about student performance and engagement to academic advisors. A redesigned LAVA was piloted during spring 2022, following engagement sessions with advisors. A larger LAVA pilot is ongoing in fall 2023.
For additional reference, please see the KB article, “The Learner Activity View for Advisors (LAVA) Overview.”
Learning analytics functional taxonomy
Assess learning behavior
Definition:
Learning analytics can collect user-generated data from learning activities and offer trends in learning engagement. Analyzing those trends can reveal students’ learning behavior and identify their learning styles. This approach measures engagement and student behavior rather than performance, giving instructors insight into how their students interact with their course materials.
For example, an instructor would be able to see when, how long, and how often a student accesses different activity types in Canvas.
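A minimal sketch of that kind of behavioral summary, assuming a hypothetical activity log; the column names (`student_id`, `activity_type`, `minutes`) are invented for illustration, not an actual Canvas export format:

```python
import pandas as pd

# Hypothetical activity log; one row per access to a course activity.
log = pd.DataFrame({
    "student_id":    [1, 1, 1, 2, 2, 3],
    "activity_type": ["quiz", "page", "discussion", "page", "quiz", "page"],
    "minutes":       [12, 5, 20, 8, 15, 3],
})

# Per student and activity type: how often it was accessed (visits)
# and how long in total (total_minutes) -- behavior, not performance.
summary = (log.groupby(["student_id", "activity_type"])["minutes"]
              .agg(visits="count", total_minutes="sum")
              .reset_index())
print(summary)
```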
Key features:
- Derive impact of behavior (for example, correlations between behavior and outcomes)
- Identify behavior profiles/approaches to work; can be used to provide guidance, support, and resources
Defining characteristics:
- Unique: Measures engagement and student behavior rather than performance
- Focus: Event-centric (the main focus of the analysis is on the interactions of a learner)
- Feedback type: Reflective (i.e. presenting historic data analysis)
- Primary stakeholder: Teacher, learner
- Guiding questions:
- Understanding students
- What do students do?
- Student support
- Efficiency
- Modeling learner success (course level)
- Learner self-assessment
Examples of assessing learning behavior approach:
- Check my activity
- JISC Case Study B: Analyzing Use of the LMS at the University of Maryland, Baltimore County.
- Prototypical example at the University of Maryland: provided students with information about how they used the course website.
- Key takeaway: students who used the website more earned a C or higher.
- This pattern was consistent over time.
- Allowed students to compare their engagement with the website against that of other students in the course.
- Using learning analytics to assess students’ behavior in open-ended programming tasks
- Stanford study that tracked students while they coded in a computer science class.
- Identified seven canonical approaches.
- Preliminary work describing approaches, not yet predictive.
- Possible utility: letting students know where the problems lie (e.g., getting stuck in the middle rather than at the end) and providing resources.
- Learning latent engagement patterns of students in online courses
- Applying classification techniques on temporal trace data for shaping student behavior models
- LAK 2017 Conference Proceedings: [pg 100] Identifying non-regulators: designing and deploying tools that detect self-regulation behaviors
- Predictive analytics at Nottingham Trent University: JISC Case Study 1
Evaluate social learning
Definition:
Learning analytics can be applied to investigate a learner’s activities on any digital social platform — such as online discussions in Canvas — to evaluate the benefits of social learning. This measures and tracks student-to-student and student-to-instructor interactions to help understand if students are benefiting from social learning in their course.
For example, an instructor might apply social network analysis to their online discussions and identify students that bridge groups as knowledge shepherds; they could also identify other students that may not be connecting with others as much as expected.
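A minimal sketch of that kind of social network analysis, using the networkx library on invented reply pairs; betweenness centrality is one common way to surface students who bridge groups, not the only one:

```python
import networkx as nx

# Hypothetical reply pairs from an online discussion (who replied to whom).
replies = [("ana", "ben"), ("ben", "ana"), ("ana", "cara"),
           ("cara", "dev"), ("dev", "cara"), ("eli", "ana")]

G = nx.Graph()
G.add_edges_from(replies)

# High betweenness centrality flags students who sit on the paths between
# otherwise separate groups; low scores can flag students connecting less
# than expected.
centrality = nx.betweenness_centrality(G)
for student, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{student}: {score:.2f}")
```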
Key features:
- Focus is on digital aspects of learning
- Derives meaning exclusively from learner/learner & learner/teacher interactions
Defining characteristics:
- Unique:
- Focuses on the digital (not face-to-face) aspects of social learning interactions.
- Interactions are clickable and trackable.
- Focuses on the student-to-student or student-to-instructor interactions in these environments.
- Allows reflection on those interactions.
- Focus: Learner-centric (the application of the analytics is specifically on an individual as a learner)
- Feedback type: Reflective (i.e. presenting historic data analysis)
- Primary stakeholder: Teacher
- Guiding questions:
- Understanding students
- What do students do?
- Measuring student learning
- Modeling learner success (course level)
- Learner self-assessment
Examples of evaluating social learning approach:
- SNAPP: Realizing the affordances of real-time SNA within networked learning environments
- (SNA = social network analysis)
- Mapped discussions into a visual web of nodes and connections.
- Could show student-to-student or student-to-instructor interactions.
- Revealed connections and threads where engagement does or does not exist.
- Discourse-centric learning analytics
- Discourse-centric analytics characterized different types of discussion contributions: e.g., comments, questions, reactions, original ideas.
- Bringing order to chaos in MOOC discussion forums with content-related thread identification
- MOOCs and social networking within MOOCs.
- Focus on distinguishing signal from noise:
- With so much interaction, what is relevant and what is not?
- Addresses information chaos/overload.
- Models to sort discussion threads by relevance to learning.
- Raises the question of how the algorithm decides what is “relevant” (and who has input into that algorithm).
Improve learning materials & tools
Definition:
Learning analytics can track a student’s usage of learning materials and tools to identify potential issues or gaps, and offer an objective evaluation of those course materials. This allows instructors to make deliberate decisions about modifying approaches. Using aggregate student data, instructors can see ways to improve the process of learning or the structure of their course.
For example, learning analytics might show that a large percentage of students in a course struggle with a newly introduced topic based on quiz answers.
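A minimal sketch of that kind of materials check, assuming a hypothetical table of per-question quiz results; the 50% threshold and the column names are invented for illustration:

```python
import pandas as pd

# Hypothetical per-question quiz results (1 = correct, 0 = incorrect).
answers = pd.DataFrame({
    "question": ["Q1", "Q2", "Q3"] * 4,
    "correct":  [1, 0, 1,  1, 0, 0,  1, 0, 1,  0, 0, 1],
})

# Questions most students miss may point to a gap in the course materials
# rather than in any individual student's effort.
success = answers.groupby("question")["correct"].mean()
flagged = success[success < 0.5]
print("Review the materials behind:", list(flagged.index))
```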
Key features:
- Uses aggregate student data to adjust instructional practices and materials
- Primary focus is on outcomes (performance) and process
Defining characteristics:
- Unique: Primarily for instructor use
- Focus: Content-centric (the primary emphasis of the application is on the curriculum, course content, or materials)
- Feedback type: Reflective (i.e. presenting historic data analysis)
- Primary stakeholder: Teacher
- Guiding questions:
- Are materials meeting intent?
- Understanding ourselves as teachers
- Efficiency
- Course design
- Effective outcome alignment
- Measuring student learning
Examples of improving learning materials & tools approach:
- Assignments.org
- Students (middle and high school math) complete problems outside of class and submit them online; the teacher receives a report on how students did and uses that information to adjust the instruction that follows, instead of giving feedback to every student individually.
- Based on past student performance, allowing modification of the future design of the learning experience.
- Concern: the value of data about individuals vs. the aggregate.
- Informing learning design with learning analytics to improve teacher inquiry (Persico & Pozzi)
- LAK 2017 Proceedings: MAP: Multimodal assessment platform for interactive communication competency
- LAK 2017 Proceedings: [pg 131] Lessons learned from a faculty-led project: using learning analytics for course design
Individualized learning
Definition:
Adaptive or individualized learning systems apply learning analytics to customize course content for each learner. Furthermore, user profiles and other sets of data can be collected and analyzed to offer greater personalized learning experiences. This approach uses continuous feedback to help individual students in their learning.
For example, if an instructor tests students on three topics and a student shows mastery of two topics but not the third, a program may be able to deliver additional material on the unmastered topic, rather than further material or practice questions on concepts the student already grasps.
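A minimal sketch of that mastery rule, with an invented threshold and invented topic scores; real adaptive systems use far richer learner models, but the branching logic is the same in spirit:

```python
# Invented mastery threshold: fraction of items a student must answer
# correctly before a topic counts as mastered.
MASTERY_THRESHOLD = 0.8

def next_material(scores: dict[str, float]) -> list[str]:
    """Return remedial modules only for topics below the mastery threshold."""
    return [f"extra practice: {topic}"
            for topic, score in scores.items()
            if score < MASTERY_THRESHOLD]

# A student who mastered two of three topics gets material on the third only.
print(next_material({"fractions": 0.90, "decimals": 0.85, "ratios": 0.55}))
```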
Key features:
- Driven by information about learners’ prior experience/characteristics (learning style preference, content knowledge etc.)
- Focus is on continuous feedback in real time
Defining characteristics:
- Unique:
- Primary focus is learner-centric nature with emphasis on individual learners.
- Instructor can be hands-off once it’s set up.
- Real-time component for the learner, who does not need to wait for the instructor to respond.
- Focus: Content-centric (the primary emphasis of the application is on the curriculum, course content, or materials)
- Feedback type: Adaptive (i.e. presenting real-time data analysis)
- Primary stakeholder: Learner
- Guiding questions:
- Understanding students
- Student support
Examples of individualized learning approach:
- LAK 2017 Proceedings: [p88] Piloting learning analytics to support differentiated learning through LearningANTS
- Math tutoring example comparing software with face-to-face tutoring; the study examined whether the system’s outputs were better than tutors’.
- Conclusion: the system outperformed traditional tutoring.
- Premise: students must approach tutors, whereas the system pushes recommendations to students.
- Using data to understand how to better design adaptive learning (Liu et al.)
- Baseline knowledge example – remediating to get to the expected baseline. Response to student performance.
- A fuzzy logic-based personalized learning system for supporting adaptive English learning (Hsieh et al.)
- Adaptive learning (Kerr)
- Development of an adaptive learning system with two sources of personalization information (Tseng et al.)
Predict student performance
Definition:
Based on existing data about learning engagement and performance, learning analytics applies statistical models and/or machine learning techniques to predict later learning performance. By doing so, students likely to be at risk can be identified for targeted support. The focus is on using data to prompt the instructor to take immediate action to intervene and help a student course-correct before it is too late.
For example, if behavior and performance in a course suggest that a student is struggling, the instructor has an opportunity to intervene. Predictive analytics can also help instructors identify students who are doing OK but may need some additional motivation to do better in the course (a C student who could be a B student).
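A minimal sketch of a predictive flag, with invented per-student features and labels; it uses scikit-learn logistic regression plus a Signals-style red/yellow/green mapping with arbitrary cutoffs, not any actual campus model:

```python
from sklearn.linear_model import LogisticRegression

# Invented training data: [logins per week, average quiz score] per student,
# labeled 1 if the student ended the course at risk, 0 otherwise.
X_train = [[1, 55], [2, 60], [6, 85], [7, 90], [3, 65], [8, 95]]
y_train = [1, 1, 0, 0, 1, 0]

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Map predicted risk probability to a traffic-light status an instructor
# could act on; the 0.4/0.7 cutoffs are arbitrary for this sketch.
for features in ([2, 58], [5, 75], [8, 92]):
    risk = model.predict_proba([features])[0][1]
    status = "red" if risk > 0.7 else "yellow" if risk > 0.4 else "green"
    print(features, f"risk={risk:.2f}", status)
```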
Key features:
- Learners can be provided with feedback to encourage study habit behaviors
- Focus is squarely on “actionable intelligence”
Defining characteristics:
- Unique:
- Common themes: LMS and demographic data are used.
- Focus on taking action
- Focus: Learner-centric (the application of the analytics is specifically on an individual as a learner)
- Feedback type: Predictive (i.e. presenting predicted future state or events)
- Primary stakeholder: Teacher
- Guiding questions:
- Understanding students
- Student support
- Modeling learner success (course level)
- Learner self-assessment
Examples of predicting student performance approach:
- Signals: applying academic analytics
- Purdue University used the LMS to mine data from student performance and sent information to students at potential risk.
- Emphasis on actionable performance data.
- Used a red/yellow/green warning system to indicate status to students.
- Comments about the ability of faculty to use the system appropriately; concerns about students being overwhelmed by the amount of data and warning signals.
- At-risk students could be detected as early as two weeks into the semester.
- Open academic analytics initiative – JISC Case Study E
- Day 0 predictions using student data generated before students begin their courses.
- Improving retention: predicting at-risk students by analysing clicking behaviour in a virtual learning environment
- Forecasting student achievement in MOOCs with natural language processing
Visualize learning activities
Definition:
This approach traces all learning activities performed by users in a digital ecosystem to produce visual reports on learner behavior as well as performance. The reports can support both students and teachers in boosting learning motivation, adjusting practices, and improving learning efficiency. This is about facilitating awareness and self-reflection in students about their learning patterns and behaviors.
For example, a learning analytics tool may help a student see how much time she is spending on certain activity types compared to her peers, and how that might relate to performance measures.
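A minimal sketch of such a peer comparison, with invented students and minutes; a dashboard would render this table as a chart rather than print it:

```python
import pandas as pd

# Hypothetical minutes spent per activity type, one row per student.
minutes = pd.DataFrame({
    "student": ["ana", "ben", "cara"],
    "videos":  [120, 45, 80],
    "quizzes": [30, 60, 50],
    "reading": [200, 90, 150],
}).set_index("student")

# One student's time on each activity type next to the class average --
# the side-by-side view that prompts self-reflection.
comparison = pd.DataFrame({"ana": minutes.loc["ana"],
                           "class average": minutes.mean()})
print(comparison.round(1))
```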
Key features:
- Making learning visible for students and instructors
- Facilitates awareness → self-reflection → sense-making → impact
Defining characteristics:
- Unique:
- Supposedly the simplest of the applications (novice/low expertise).
- Expertise requirements are nominally low but depend on LMS skill and the ability to get the data, manipulate it, and know how to use it for a given purpose.
- Awareness → self-reflection → sense-making → impact
- Focus: Event-centric (the main focus of the analysis is on the interactions of a learner)
- Feedback type: Reflective (i.e. presenting historic data analysis)
- Primary stakeholder: Teacher, learner
- Guiding questions:
- Understanding students
- What do students do?
- Student support
- Efficiency
- Measuring student learning
- Modeling learner success (course level)
- Learner self-assessment
Examples of visualizing learning activity approach:
- Learning analytics extension for better understanding the learning process in the Khan Academy Platform
- Learning analytics dashboard applications (Verbert et al.)
- LeMO: a learning analytics application focusing on user path analysis and interactive visualization
- LAK 2017 Conference Proceedings: [pg 36] Supporting classroom instruction with data visualization
- GLASS: A learning analytics visualization tool
Resources and opportunities
There are several ways to learn about learning analytics at UW–Madison. LACE collaborates with the Learning Analytics Roadmap Committee (LARC), the Vice Provost for Teaching & Learning and others to provide the following opportunities and resources.
Community of interest for learning analytics
If you are interested in obtaining information about learning analytics opportunities, training, fellowships and events on campus, please join the UW–Madison Learning Analytics MS Team or email us and request to be added to the group.
Additional resources
Materials from past learning analytics events
- Explore Course-Level Data with Tableau Visualizations – presented by Clare Huhn & Jocelyn Milner, February 21, 2019
- Course-Level Dashboard (& Analytics Beta – Canvas) – presented by Kari Jordahl, James McKay, Garrett Smith & Xizhou “Canoe” Xie, November 13, 2018
- The Importance of Meaning: Turning Big Data into Real Understanding (video) – presented by David Williamson Shaffer, October 16, 2018
- 6 Ways to Use Learning Analytics: a Functional Taxonomy – presented by Sarah Hagen & Sarah Traynor, September 18, 2018
- Active Teaching Lab Recap: Canvas Analytics – September 7, 2018
- Canvas Analytics: Instructor Perspectives on Course Analytics – presented by Amanda Margolis, Mark Millard & Catherine Arnott Smith, April 26, 2018
- Data Visualizations & Infographics (It looks cool, but what is it telling you?) – presented by Cid Freitag, March 20, 2018
- Using Pattern to Log Course Activities – presented by Miguel Garcia-Gosalvez, Heather Kirkorian, James McKay, & Kim Arnold, February 21, 2018
Support documentation for tools used on campus
Instructors:
- How do I view Canvas Course Analytics?
- How do I view a context card for a student in a course?
- How do I view analytics for a student in a course?
- How do I view the course access report for an individual user?
- Canvas student course access and anonymous course reports (3rd party tool)
- How do I view Kaltura Media Analytics?
- Kaltura – Mediaspace Media Analytics
- Kaltura – Advanced Mediaspace Analytics
Students:
LACE updates
List of articles
Microgrants awarded in support of diversity, equity, inclusion & belonging (September 27, 2023)
Microgrant recipients for Data Empowered Educational Practices (DEEP) reflect a diverse group of instructional faculty and administrators using teaching and learning data to support student success.
How can data support inclusive teaching? (June 16, 2023)
Four microgrants on learning analytics in support of diversity, equity, inclusion and belonging were awarded for the 2022–23 academic year, and the impact and ripple effects continue to grow.
Interested in using data to support diversity, equity, inclusion & belonging? (June 12, 2023)
Coming soon: microgrants to explore how data-empowered educational practices might contribute to DEIB in the UW–Madison teaching and learning experience.
More Learning Analytics Center of Excellence posts
Help & contact
Schedule a consultation
A wide range of tools is available, and our learning technology consultants are happy to help you choose the best tool to fit your needs. Instructors and instructional staff can request a consultation with a DoIT AT consultant through the DoIT Help Desk.