Learning Analytics Center of Excellence (LACE)

Using data to support learning at the university

The focus of learning analytics is to provide actionable data that can improve the teaching and learning environment. Learning analytics has been contextually defined for campus as “the undertaking of activities that generate actionable data from the learning environment, intended to improve student outcomes by informing the structure, content, delivery or support of the learning environment” (defined in 2018 by the Learning Analytics Roadmap Committee, led by the Vice Provost for Teaching & Learning).

Learning analytics is a process or instructional approach, and not a specific tool, although it uses tools and data to answer questions. There are numerous ways learning analytics can be used as explained in the Learning Analytics Functional Taxonomy, and there are a number of projects and resources on campus supporting the different opportunities.

Work with us

We collaborate on strategic teaching and learning data initiatives aimed at sustained data-driven practices that maintain and enhance student learning.

Contact LACE


Learning analytics continues to gain interest and demonstrate value in higher education. A variety of approaches have been piloted on campus over the past decade. Opportunities continue to arise with the university’s migration to Badger Analytics, our participation in the Unizin Consortium and a new institutional data policy. LACE collaborates with the Data Empowered Educational Practices (DEEP) Executive Committee, the Vice Provost for Teaching & Learning and others on the following learning analytics projects. Close collaboration with stakeholders is a central component of our project approach.

Learner Engagement Analytics Dashboard (LEAD)

Planning and design for the course-level dashboard began in early 2018 with a series of faculty interviews and discussions to obtain direct feedback about instructor analytics needs. The dashboard has been available enterprise-wide since fall 2020 and includes three visualizations in Tableau:

  • Heat map: Shows when students are active in the course
  • Scatter plot: Maps grades and number of page views
  • Activity in the course: Displays both duration and number of logins

LEAD aggregates data from Canvas, Kaltura MediaSpace, and Unizin Engage e-texts. This combined view lets instructors see behavioral data about student activity in their courses and provides insights that no single tool offers on its own.
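The LEAD visualizations themselves are built in Tableau, but the underlying idea can be sketched with hypothetical data: bin activity events into a weekday-by-hour grid for the heat map, and pair page-view counts with grades for the scatter plot. Everything below (event timestamps, page views, grades) is illustrative, not actual LEAD data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical activity events: (weekday 0-6, hour 0-23) per student click.
events = rng.integers(low=[0, 0], high=[7, 24], size=(500, 2))

# Heat map input: count of events in each weekday x hour cell.
heat = np.zeros((7, 24), dtype=int)
np.add.at(heat, (events[:, 0], events[:, 1]), 1)

# Scatter plot input: page views vs. final grade for each student.
page_views = rng.integers(20, 400, size=60)
grades = np.clip(60 + 0.08 * page_views + rng.normal(0, 8, 60), 0, 100)

# A correlation coefficient gives a quick read on the relationship
# an instructor would eyeball in the scatter plot.
r = np.corrcoef(page_views, grades)[0, 1]
```

The `heat` matrix and the `page_views`/`grades` arrays are exactly what a heat map and scatter plot would render.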

Learner Activity View for Advisors (LAVA)

Currently in the pilot stage, the Learner Activity View for Advisors (LAVA) is a learning analytics resource that displays high-level trend data about student performance and engagement to academic advisors. A redesigned LAVA was piloted during the spring of 2022, following an engagement with advisors. A larger LAVA pilot is ongoing in fall 2023.

For additional reference, please see the KB article, “The Learner Activity View for Advisors (LAVA) Overview.”

Learning analytics functional taxonomy

Access learning behavior

Learning analytics can collect user-generated data from learning activities and offer trends in learning engagement. Analyzing those trends can reveal students’ learning behavior and identify their learning styles. This approach measures engagement and student behavior rather than performance, giving instructors insight into how their students interact with their course materials.

Examples of questions this approach helps answer

  • Are my students engaging in the course and getting off to a good start?
  • Are students successfully using course materials to complete assignments and activities?
  • Are struggling students spending a lot of time/energy on course material and still not getting it, or are they not putting in the time/energy?
  • Is the assignment feedback provided on student work helpful/effective and used by students?
  • How many students access optional resources? Is it worth my time to create/add these to the course?
  • Which course materials do students access the most? The least? Are there patterns of engagement?
  • When do students access course materials? How often? Do they have enough time to complete activities?
  • Is the class workload too much? How much time are students spending on course activities?
  • Is my course inclusive, and does it provide diverse learning opportunities that all students can access?

UW–‍Madison Examples

Learning Tools Assessment for Anatomy Course
Evaluate whether interactive case studies, which are time-intensive to develop, are effective to prepare students for labs.

Evaluating Use & Effectiveness of New Course Resources
Using page views, student feedback and surveys to analyze whether new course materials are being used and having a positive impact on student learning.

Measuring Student Engagement With Online Course Materials
Optional videos have been provided to support student assignments. Kaltura analytics can identify if students are taking advantage of these materials, and help guide future curriculum design.

Page-Level Student Access Analytics for Online Resource Improvement 
Online course materials provide students with more independent study and retrieval practice opportunities. Learning analytics and student feedback can inform continuous improvement of course materials and help students be more prepared.

Other examples using centrally supported teaching and learning tools
Most of the supported tools in the Learn@UW suite of tools offer some sort of learning analytics capabilities. Search the KnowledgeBase for more information about specific tools.

Other Examples

A Systematic Approach to Quality Online Course Design and Development 
Page views in the LMS along with video views, duration and heatmap data provided student behavior data to support a change in course delivery from blended to fully online. This was conducted as part of a holistic curriculum review and transition.
Access on the web

Profiling students via clustering in a flipped clinical skills course using learning analytics 
This study investigates student profiles in a flipped classroom. The clustering algorithms examined show a two-cluster structure: ‘high-interaction’ and ‘low-interaction.’ These results can help identify low-engagement students and give them appropriate feedback. Access on the web | Access from UW–‍Madison Libraries
DOI: 10.1080/0142159X.2022.2152663

Person-centered analysis of self-regulated learner profiles in MOOCs: a cultural perspective
Research revealed four different self-regulated learner profiles: all-around SRL learners, disillusioned SRL learners, control-oriented SRL learners, and control-dominated SRL learners.   Access on the web  | Access from UW–‍Madison Libraries
DOI: 10.1007/s11423-021-09939-w

Evaluate social learning

Learning analytics can be applied to investigate a learner’s activities on any digital social platform — such as online discussions in Canvas — to evaluate the benefits of social learning. This measures and tracks student-to-student and student-to-instructor interactions to help understand if students are benefiting from social learning in their course.
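One common way to measure those interactions is to model the discussion as a network in which students are nodes and replies are edges; degree centrality then highlights the most and least connected students. The sketch below uses hypothetical reply pairs, not actual Canvas data.

```python
from collections import defaultdict

# Hypothetical discussion replies: (author, replied_to) pairs.
replies = [
    ("ana", "ben"), ("ben", "ana"), ("ana", "cho"),
    ("cho", "ana"), ("dev", "ana"), ("ben", "cho"),
    ("eli", "ben"),
]

# Build an undirected interaction graph: who has exchanged posts with whom.
neighbors = defaultdict(set)
for author, target in replies:
    neighbors[author].add(target)
    neighbors[target].add(author)

# Degree centrality: distinct partners divided by (n - 1) possible partners.
n = len(neighbors)
centrality = {s: len(peers) / (n - 1) for s, peers in neighbors.items()}

most_connected = max(centrality, key=centrality.get)   # candidate "bridge"
least_connected = min(centrality, key=centrality.get)  # candidate for outreach
```

Centrality alone does not capture post content or quality, which is why the questions below also ask about attitudes and belonging.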

Examples of questions this approach helps answer

  • Who are the students who engage in online discussions frequently and with a variety of other students? Are they knowledge shepherds that bridge groups?
  • Who are the students who engage less frequently and/or with few other students in the online discussions?
  • What can be learned about student attitudes and learning from an analysis of the content of discussion posts?
  • Do students feel included, and that they belong in the course or program? Do they feel comfortable speaking up or asking questions?

UW–‍Madison Examples

Using Learning Analytics to Improve Inclusion and Belonging in First-Year Engineering Design 
Quantitative and qualitative data were used to support inclusion and belonging for first-year engineering students in a team-based, hands-on design course. Students learn technical skills as well as collaboration, team-building and professionalism. They worked on long-term projects in their teams, and were asked to complete a team contract as well as peer and self evaluations as part of that process.

Fostering Belonging in the Computer Science Classroom 
This course improvement project used learning analytics to support belonging for diverse groups in non-diverse environments. Gender distribution in Computer Science courses is far from parity. A set of engagement techniques, including small-group discussions and muddiest-point exercises, was used to monitor and improve the sense of belonging among all students regardless of the size of their cohort. Classroom observations, student surveys and reviews of student questions were also used.

Other examples using centrally supported teaching and learning tools
Most of the supported tools in the Learn@UW suite of tools offer some sort of learning analytics capabilities. Search the KnowledgeBase for more information about specific tools.

Other Examples

Using Learning Analytics to Visualise Computer Science Teamwork
A teamwork dashboard, founded on learning analytics, learning theory and teamwork models, analyzes students’ online teamwork discussion data and visualizes team mood, role distribution and emotional climate. Educators can use the tool to monitor teams, identify struggling teams, and give students feedback about team interactions. Access on the web | Access from UW–‍Madison Libraries

DOI: 10.1145/2729094.2742613

The Curious Case of Centrality Measures: A Large-Scale Empirical Investigation
A meta-analysis of 69 cases examining the relationship between students’ network centrality in a course online forum and their grades finds that the number of direct connections is the measure most likely associated with performance. Access on the web | Access from UW–‍Madison Libraries

DOI: 10.18608/jla.2022.7415

Applying social learning analytics to message boards in online distance learning: A case study
This study examines the relationship between social network analysis parameters and student outcomes, including global course performance. The findings suggest that future research should investigate whether there are conditions under which social network parameters are reliable predictors of academic performance, but also advise against relying exclusively on social network parameters for predictive purposes. Access on the web | Access from UW–‍Madison Libraries

DOI: 10.1016/j.chb.2014.10.038

Fostering student engagement in online discussion through social learning analytics
This pilot study, conducted in an online course, devised an analytics toolkit that turns discussion forum data into information for students to reflect upon. Students used the toolkit to monitor posting behaviors and set participation goals. Tool use was linked to increased solicitation for peer responses and individual reflection. Access on the web  |  Access from UW–‍Madison Libraries

DOI: 10.1016/j.iheduc.2017.12.002

Improve learning materials & tools

Learning analytics can track a student’s usage of learning materials and tools to identify potential issues or gaps, and offer an objective evaluation of those course materials. This allows instructors to make deliberate decisions about modifying approaches. Using aggregate student data, instructors can see ways to improve the process of learning or the structure of their course.

Examples of questions this approach helps answer

  • Does the course design align with learning objectives?
  • Are course teaching practices effective for learners?
  • Do learning activities and assessments help learners achieve course outcomes?
  • What patterns can be seen in formative assessments to show students’ progress toward learning outcomes?
  • Which learning materials and activities are MOST helpful and which are LEAST helpful for a given module?
  • Are course materials, activities, assessments and teaching practices inclusive for all learners? Does the course support diversity, equity, inclusion, belonging?

UW–‍Madison Examples

Refocusing the Wisconsin Emerging Scholars Computer Sciences Program 
This project investigated why the target audience for a course and program was not enrolling. Students were asked several questions to gauge their sense of belonging and what type of support they wanted. Based on the data collected, plans are underway to change the course meeting duration, reach out to students through high-impact practice communities, and conduct focus groups to learn more about what students want.

Curating the Most Effective Course Materials by Using Learning Analytics 
Student feedback on the usefulness and quality of resources and activities can help the instructor curate and modify course materials. Patterns of access to resources can signal the most helpful resources and potentially guide interventions.

Efficacy of Module 0 for TAs (Orientation/Course Policies) 
Situating TAs as student learners in a blended course gives them an authentic example of the student experience. It also provides an efficient place for the instructor to share important resources and grading policies, supporting equitable and consistent application of those policies.

New Student Online Orientation (Master’s Program)
Students in this fully-online program need support learning about UW–‍Madison resources, as well as online course tools, expectations, and scholarly writing. The orientation will be adjusted based on students’ needs.

Measuring the Effectiveness of Accelerated Online Courses for Faculty Development
Condensing and accelerating a course requires thoughtful activity/assignment planning. Using surveys along with page views and completion rates can guide course modifications.

Review Content Interactions for Student Learning
Students are expected to complete 3 types of online activities in each module (read – view – do) to help prepare them for class. This project used multiple modalities to explore whether students accessed various content types, and whether learning occurred.

Other examples using centrally supported teaching and learning tools
Most of the supported tools in the Learn@UW suite of tools offer some sort of learning analytics capabilities. Search the KnowledgeBase for more information about specific tools.

Other Examples

Informing learning design in online education using learning analytics of student engagement
This approach uses learning analytics to understand the extent to which students’ engagement with online learning activities aligns with course design expectations, and the subsequent effect on their academic performance. The analysis of digital traces in a virtual learning environment illuminates how students actually engage with course materials and how different study patterns affect academic performance. This approach can help instructors pinpoint when, and with which study materials, students were struggling.
Access on the web | DOI: 10.4324/9781003177098-17

Individualized learning

Adaptive or individualized learning systems apply learning analytics to customize course content for each learner. User profiles and other sets of data can be collected and analyzed to offer greater personalized learning experiences. This approach uses continuous feedback to help individual students in their learning.

Examples of questions this approach helps answer

  • Do students need practice with more basic skills and concepts? Would students benefit from supplemental or more advanced course materials?
  • Can students choose their own learning path, or choose content based on their learning preferences (e.g., view a video or read an article instead)?
  • Can I provide continuous feedback to support independent learning?
  • If students have more opportunities to practice, study and review, will they do better with learning outcomes?
  • Will individualized learning provide more flexibility for diverse learners and increase equity, inclusion and belonging?

UW–‍Madison Examples

Performance Management Conversation Training 
This professional development training has participants complete a scenario-based activity that determines which learning materials they need to complete. The project leveraged Canvas functionality to create branching materials and resources for learners at various levels.

Students Choose Their Own Learning Path: TA Training in Online Instruction 
Two paths for learning were created to support teaching assistants with varying levels of subject matter expertise and teaching experience. Students could self-select into a learning path and received customized content, based on which path they selected.

Other examples using centrally supported teaching and learning tools
Most of the supported tools in the Learn@UW suite of tools offer some sort of learning analytics capabilities. Search the KnowledgeBase for more information about specific tools.

Other Examples

Advanced machine learning approaches to personalise learning: learning analytics and decision making
Takeaway: the authors present an original methodology for personalising learning by applying learning analytics in virtual learning environments, together with empirical research results. Building on this methodology, a decision-making model and method are proposed to evaluate the suitability, acceptance and use of personalised learning units.
Access on the web  |  Access from UW–‍Madison Libraries

DOI: 10.1080/0144929X.2018.1539517

U-Behavior: Using visualizations to improve teaching and learning (video) (or access their website)
Colorado State University developed a tool that leverages Canvas quiz functionality to help students change their study behavior by rewarding them for spacing out practice activities and re-taking formative assessments to promote long-term learning.  Students view visualizations about their behavior and self-reflect as part of this teaching practice.
Access on the web

Weathering the Storm: Targeted and Timely Analytics to Support Disengaged Students 
A human-centered approach can use learning analytics to identify students who are not engaged, so that an instructor or outreach support person can communicate and offer support. First-year students may benefit most from this communication strategy and are more likely to continue into their second year of university.
Access on the web

Development of Cost-Effective Adaptive Educational Systems via Crowdsourcing
Students can evaluate, create and curate learning materials and resources as part of their learning, with the support of guides, rubrics and exemplars. This project also used gamification mechanisms to engage students.
Access on the web  |  Access from UW–‍Madison Libraries

Predict student performance

Based on existing data about learning engagement and performance, learning analytics applies statistical models and/or machine learning techniques to predict later learning performance. Students likely to be at risk can then be identified for targeted support. The focus is on using data to prompt the instructor to take immediate action to intervene and help a student course-correct before it is too late.
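As a sketch of the statistical-modeling step, the snippet below fits a minimal logistic regression (plain NumPy gradient descent) that predicts a pass/fail outcome from two hypothetical engagement features: weekly logins and on-time submission rate. The data, features and 0.5 threshold are all illustrative; a production model would need rigorous validation and bias auditing, as the questions below emphasize.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical features per student: [weekly logins, on-time submission rate].
X = rng.uniform([0, 0], [20, 1], size=(200, 2))
# Hypothetical outcome: passing becomes likelier with more engagement.
y = (0.2 * X[:, 0] + 3 * X[:, 1] + rng.normal(0, 0.5, 200) > 3.0).astype(float)

# Standardize features and add an intercept column.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
Xb = np.column_stack([np.ones(len(Xs)), Xs])

# Logistic regression fit by batch gradient descent on the log-loss.
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-Xb @ w))          # predicted pass probability
    w -= 0.1 * Xb.T @ (p - y) / len(y)     # gradient step

# Flag students with a low predicted pass probability for outreach.
p = 1 / (1 + np.exp(-Xb @ w))
at_risk = np.flatnonzero(p < 0.5)
accuracy = float(((p > 0.5) == y).mean())
```

The `at_risk` indices are the list an instructor or advisor would act on; the model itself only prompts outreach, it does not decide outcomes.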

Examples of questions this approach helps answer

  • How can we use data (for example, data collected in a learning management system) to predict how students may perform in a course, a degree program or a set of courses?
  • In what ways can we leverage learning analytics data to support student learning, whether students are already struggling or just beginning to struggle?
  • What learning analytics approaches can be used to keep students from dropping out of college or their degree programs?
  • Can we use predictive analytics to support equity, diversity, inclusion and belonging? How do we mitigate biases?

UW–‍Madison Examples

Addressing Data Disparities Through Disaggregation (2023 Pharmacy microgrant poster) 
One of the first microgrant projects used a predictive learning analytics approach, through their use of regression modeling. As a first step, Pharmacy looked at demographic data, Kaltura video views, formative Canvas quiz scores and other data to predict students’ outcomes in their Pharmacotherapy I course.

Pattern: Myths & Realities of Self-Reporting Activities 
Students were asked to self-report and reflect on time spent on course activities. Canvas and Kaltura analytics and surveys were also used to explore whether students’ self-reported behavior correlated with Canvas behavior and final grades. Students who logged their study time were generally already high-performers in the course.

Other examples using centrally supported teaching and learning tools
Most of the supported tools in the Learn@UW suite of tools offer some sort of learning analytics capabilities. Search the KnowledgeBase for more information about specific tools.

Other Examples

Learning analytics to predict students’ performance: A case study of a neurodidactics-based collaborative learning platform 
Learning analytics methods achieved accuracies greater than 99% in predicting students’ final performance in a Neurodidactics course.
Access on the web  |  Access from UW–‍Madison Libraries

DOI: 10.1007/s10639-022-11128-y

Predicting Student Performance in Higher Educational Institutions Using Video Learning Analytics and Data Mining Techniques  
Learning analytics and data mining techniques allowed researchers to predict students’ performance in a course with an accuracy of 88.3%.
Access on the web  |  Access from UW–‍Madison Libraries

DOI: 10.3390/app10113894

Six Practical Recommendations Enabling Ethical Use of Predictive Learning Analytics in Distance Education
This 2023 case study reports on a four-year application of learning analytics (LA) at the Open University in the UK. It offers six practical recommendations for the ethical use of LA in higher education institutions (HEIs):

  1. End users should be actively involved in the design and implementation of LA tools.
  2. LA should be inclusive by considering diverse student needs.
  3. HEIs should act upon LA data and communicate the added value of adopting LA tools.
  4. Students should benefit from LA through a clear plan of support interventions.
  5. HEIs should test LA data for hidden bias by engaging with diverse stakeholders.
  6. Institutional LA ethics policy should be reviewed and updated regularly through practical ethics and interdisciplinary research.
Access on the web

DOI: 10.18608/jla.2023.7743

Students matter the most in learning analytics: The effects of internal and instructional conditions in predicting academic success
A significant portion of variability stems from the learners’ internal conditions. Hence, when variability in external conditions is largely controlled for (the same institution, discipline, and nominal pedagogical model), students’ internal state is the key predictor of their course performance.
Access on the web

DOI: 10.1016/j.compedu.2021.104251

Visualize learning activities


This approach traces the learning activities performed by users in a digital ecosystem to produce visual reports on learner behavior and performance. The reports can help both students and teachers boost learning motivation, adjust practices and improve learning efficiency. The aim is to facilitate students’ awareness of, and self-reflection on, their own learning patterns and behaviors.

For example, a learning analytics tool may help a student see how much time she is spending on certain activity types compared to her peers, and how that might relate to performance measures.
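A minimal version of that peer comparison can be computed directly from time-on-task data: express each student's time per activity type as a ratio to the class median. The activity types and minutes below are hypothetical.

```python
from statistics import median

# Hypothetical minutes spent per activity type, per student.
time_on_task = {
    "ana": {"video": 120, "reading": 40, "quiz": 30},
    "ben": {"video": 60,  "reading": 90, "quiz": 25},
    "cho": {"video": 30,  "reading": 20, "quiz": 60},
}

# Class median per activity type is the peer baseline.
kinds = {"video", "reading", "quiz"}
baseline = {k: median(s[k] for s in time_on_task.values()) for k in kinds}

# Ratio of a student's time to the class median (1.0 = typical peer).
def vs_peers(student):
    return {k: time_on_task[student][k] / baseline[k] for k in kinds}
```

A dashboard would render `vs_peers(...)` as a bar chart per activity type, letting the student see at a glance where her time differs from the class.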

Key features:

  • Making learning visible for students and instructors
  • Facilitates awareness → self-reflection → sensemaking → impact

Defining characteristics:

  • Unique:

    • Supposedly the simplest of the applications (novice / low expertise).
    • Expertise requirements are “low,” but depend on LMS skill and on the ability to get the data, manipulate it, and know how to use it for a given purpose.
    • Awareness → self-reflection → sensemaking → impact
  • Focus: Event-centric (the main focus of the analysis is on the interactions of a learner)
  • Feedback type: Reflective (i.e., presenting analysis of historical data)
  • Primary stakeholder: Teacher, learner
  • Guiding questions:
    • Understanding students
    • What do students do?
    • Student support
    • Efficiency
    • Measuring student learning
    • Modeling learner success (course level)
    • Learner self-assessment

Examples of visualizing learning activity approach:

  1. Learning analytics extension for better understanding the learning process in the Khan Academy Platform
  2. Learning analytics dashboard applications (Verbert et al.)
  3. LeMO: a learning analytics application focusing on user path analysis and interactive visualization
  4. LAK 2017 Conference Proceedings: [pg 36] Supporting classroom instruction with data visualization
  5. GLASS: A learning analytics visualization tool

* Learn more about the Nguyen, Gardner, and Sheridan (2017) taxonomy, including a link to the full article.

Despite our best efforts to ensure accessibility of the LACE webpage and resources, there may be some limitations with some of the content. Please contact LearningAnalytics@office365.wisc.edu if you encounter an issue.

Resources and opportunities

There are several ways to learn about learning analytics at UW–‍Madison. LACE collaborates with the Learning Analytics Roadmap Committee (LARC), the Vice Provost for Teaching & Learning and others to provide the following opportunities and resources.

Community of interest for learning analytics

If you are interested in obtaining information about learning analytics opportunities, training, fellowships and events on campus, please join the UW–‍Madison Learning Analytics MS Team or email us and request to be added to the group.

Additional resources

Materials from past learning analytics events

LACE updates

List of articles

Help & contact

Schedule a consultation

A wide range of tools is available, and our learning technology consultants are happy to help you choose the best tool for your needs. Instructors and instructional staff can request a consultation with a DoIT AT consultant through the DoIT Help Desk.

DoIT Help Desk

Other questions or feedback?

We’d like to hear from you.

Feedback form