Learning Analytics Center of Excellence (LACE)

Using data to support learning at the university

The focus of learning analytics is to provide actionable data that can improve the teaching and learning environment. Learning analytics has been contextually defined for campus as “the undertaking of activities that generate actionable data from the learning intended to improve student outcomes by informing structure, content, delivery or support of the learning environment” (defined in 2018 by the Learning Analytics Roadmap Committee).

Learning analytics is a process or instructional approach, and not a specific tool, although it uses tools and data to answer questions. There are numerous ways learning analytics can be used as explained in the Learning Analytics Functional Taxonomy, and there are a number of projects and resources on campus supporting the different opportunities.

Work with us

We collaborate on strategic teaching and learning data initiatives aimed at sustained data-driven practices that maintain and enhance student learning.

Contact LACE

Major Efforts

Learning analytics continues to gain interest and demonstrate value in higher education. A variety of approaches have been piloted on campus over the past decade. Opportunities continue to arise with the university’s migration to Badger Analytics, our participation in the Unizin Consortium and a new institutional data policy. LACE collaborates with the Data Empowered Educational Practices (DEEP) Executive Committee, the Vice Provost for Teaching & Learning and others on the following learning analytics projects. Close collaboration with stakeholders is a central component of our project approach.

LACE works closely with campus governance groups and relevant authorities for institutional data to ensure that all relevant policies, laws, and regulations, including cybersecurity measures and data protections, are adhered to. Specific to our work as practitioners of learning analytics, we hold as a core value alignment with the UW–‍Madison approved Guiding Principles for Appropriate Use of Data for Learning Analytics.

Data empowered educational practices (DEEP) microgrants

The University of Wisconsin–Madison started a microgrant program in 2021 to grow institutional capacity around data, specifically for teaching and learning. For the first two years, we focused on projects that used data to support diversity, equity, inclusion, and belonging (DEIB), which funded several important projects in this area. The current microgrants program builds on those goals by expanding to consider making learning more accessible for all students, teachers, educational professionals, and the institution as a whole.

Get details on the DEEP microgrant program

For more information, please join the UW–‍Madison Learning Analytics MS Team.

Microgrant program announcement

The 2024-2025 Microgrant theme is Click, Learn, Thrive: Exploring Online Course Engagement.

Student engagement and student-facing tools

Students are interested in seeing data and leveraging learning analytics. Student engagement efforts began in spring 2021 with focus groups and continued in spring 2022 with surveys and feedback sessions. Students are already using the available (minimal) data from campus-supported tools and are leveraging other external resources. They want to explore more tools, approaches and visualizations to support their learning. A student-facing tool will be piloted in the near future.

Learning Analytics Community of Practice (LA CoP)

The Learning Analytics Community of Practice (CoP) is designed to connect colleagues across campus in sharing learning analytics experiences.

More about the LA CoP

Participatory design process

Learning analytics is a complex organizational change process and involves work across the domains of technology, culture, process/workflow, and policy. Various stakeholders have different needs and interests. We use a participatory design process to engage with stakeholders and keep circling back to get feedback while exploring possible approaches and tools. We explore consortium and vendor tools and also create custom solutions. Whether we’re working with instructors, advisors, or students, our process is similar.

Stakeholders (shown as a Venn diagram): instructors, advisors, students and the institution, with overlapping needs.

  1. Step 1: Engage with stakeholders – what do they need?
  2. Step 2: Design prototype, get feedback.
  3. Step 3: Data governance discussions and process starts.
  4. Step 4: Develop minimum viable product, get feedback.
  5. Step 5: Accessibility and usability review.
  6. Step 6: Pilot, get feedback.
  7. Step 7: Iterate or enhance.
  8. Step 8: Support and communications resources.
  9. Step 9: Launch!
  10. Step 10: Evaluate, support and enhance.

Exploration

Learning Analytics guiding principles

Students are real and diverse individuals, and not just their data or information. These principles — beneficence, transparency, privacy and confidentiality, and minimization of adverse impacts — aim to uphold the dignity of students while ensuring learning analytics are used to improve educational outcomes, optimize the teaching and learning data environment, and support the student experience.

View Guiding Principles for Appropriate Use of Data for Learning Analytics.

Learner Activity View for Advisors (LAVA)

Currently in the pilot stage, the Learner Activity View for Advisors (LAVA) is a learning analytics resource that displays high-level trend data about student performance and engagement to academic advisors. A redesigned LAVA was piloted during the spring of 2022, following an engagement with advisors. A larger LAVA pilot is ongoing in fall 2023.

For additional reference, please see the KB article, “The Learner Activity View for Advisors (LAVA) Overview.”

Pedagogical guide to learning analytics

What are the pedagogical uses of learning analytics? The focus of learning analytics is to provide actionable information that can improve teaching and learning. By using data generated within online courses, we can make informed improvements to teaching and learning on our campus.

Pedagogical guide to learning analytics

Learning analytics functional taxonomy

Why are people using learning analytics? What are some practical examples?

Learning analytics is not a tool; rather, it is an approach that leverages data to improve teaching and learning. There are many different ways that data can be used to support students. This web content was created based on an article by Nguyen, Gardner and Sheridan (2017).

Explore the following tabs for more information and examples about how these approaches might be implemented. Examples are from UW–‍Madison and other institutions.


Access learning behavior

Description

Learning analytics can collect user-generated data from learning activities and offer trends in learning engagement. Analyzing those trends can reveal students’ learning behavior and identify their learning styles. This approach measures engagement and student behavior rather than performance, giving instructors insight into how their students interact with their course materials.

Examples of questions this approach helps answer

  • Are my students engaging in the course and getting off to a good start?
  • Are students successfully using course materials to complete assignments and activities?
  • Are struggling students spending a lot of time/energy on course material and still not getting it, or are they not putting in the time/energy?
  • Is the assignment feedback provided on student work helpful/effective and used by students?
  • How many students access optional resources? Is it worth my time to create/add these to the course?
  • Which course materials do students access the most? The least? Are there patterns of engagement?
  • When do students access course materials? How often? Do they have enough time to complete activities?
  • Is the class workload too much? How much time are students spending on course activities?
  • Is my course inclusive, and does it provide diverse learning opportunities that all students can access?

UW–‍Madison Examples

Learning Tools Assessment for Anatomy Course
Evaluate whether interactive case studies, which are time-intensive to develop, effectively prepare students for labs.

Evaluating Use & Effectiveness of New Course Resources
Using page views, student feedback and surveys to analyze whether new course materials are being used and having a positive impact on student learning.

Measuring Student Engagement With Online Course Materials
Optional videos have been provided to support student assignments. Kaltura analytics can identify if students are taking advantage of these materials, and help guide future curriculum design.

Page-Level Student Access Analytics for Online Resource Improvement 
Online course materials provide students with more independent study and retrieval practice opportunities. Learning analytics and student feedback can inform continuous improvement of course materials and help students be more prepared.

Other examples using centrally supported teaching and learning tools
Most of the supported tools in the Learn@UW suite of tools offer some sort of learning analytics capabilities. Search the KnowledgeBase for more information about specific tools.

Other Examples

A Systematic Approach to Quality Online Course Design and Development 
Page views in the LMS along with video views, duration and heatmap data provided student behavior data to support a change in course delivery from blended to fully online. This was conducted as part of a holistic curriculum review and transition.
Access on the web

Profiling students via clustering in a flipped clinical skills course using learning analytics 
This study investigates student profiles in a flipped classroom. The algorithms examined in this study reveal a two-cluster structure: ‘high-interaction’ and ‘low-interaction.’ These results can help identify low-engaged students and inform appropriate feedback.

Access on the web  | Access from UW–‍Madison Libraries
DOI: 10.1080/0142159X.2022.2152663

Person-centered analysis of self-regulated learner profiles in MOOCs: a cultural perspective
Research revealed four different self-regulated learner profiles: all-around SRL learners, disillusioned SRL learners, control-oriented SRL learners, and control-dominated SRL learners.

Access on the web  | Access from UW–‍Madison Libraries
DOI: 10.1007/s11423-021-09939-w
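As a hedged illustration of the clustering approach used in studies like those above, the two-profile split can be sketched in a few lines of Python. The interaction counts and student identifiers here are hypothetical, not campus data.

```python
# Minimal sketch: split students into "high-interaction" and "low-interaction"
# profiles with 1-D k-means (k=2) over weekly LMS interaction counts.
# All data and identifiers below are hypothetical illustrations.

def kmeans_1d_two_clusters(values, iterations=20):
    """Cluster scalar engagement counts into two groups; returns (low_center, high_center, labels)."""
    lo, hi = min(values), max(values)          # initialize centers at the extremes
    for _ in range(iterations):
        # assign each value to its nearest center (0 = low, 1 = high)
        labels = [0 if abs(v - lo) <= abs(v - hi) else 1 for v in values]
        lo_vals = [v for v, l in zip(values, labels) if l == 0]
        hi_vals = [v for v, l in zip(values, labels) if l == 1]
        if lo_vals:
            lo = sum(lo_vals) / len(lo_vals)   # recompute cluster means
        if hi_vals:
            hi = sum(hi_vals) / len(hi_vals)
    return lo, hi, labels

# Hypothetical weekly interaction counts per student
interactions = {"s1": 42, "s2": 55, "s3": 8, "s4": 61, "s5": 5, "s6": 47, "s7": 11}
names = list(interactions)
lo, hi, labels = kmeans_1d_two_clusters([interactions[n] for n in names])
low_engagement = [n for n, l in zip(names, labels) if l == 0]
print("Low-interaction profile:", sorted(low_engagement))
```

In practice, studies like the one above use richer feature sets (views, posts, quiz attempts) and established clustering libraries; this sketch only shows the core idea of grouping students by engagement so low-engaged students can be offered feedback.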

Evaluate social learning

Description

Learning analytics can be applied to investigate a learner’s activities on any digital social platform — such as online discussions in Canvas — to evaluate the benefits of social learning. This measures and tracks student-to-student and student-to-instructor interactions to help understand if students are benefiting from social learning in their course.

Examples of questions this approach helps answer

  • Who are the students who engage in online discussions frequently and with a variety of other students? Are they knowledge shepherds that bridge groups?
  • Who are the students who engage less frequently and/or with few other students in the online discussions?
  • What can be learned about student attitudes and learning from an analysis of the content of discussion posts?
  • Do students feel included, and that they belong in the course or program? Do they feel comfortable speaking up or asking questions?

UW–‍Madison Examples

Using Learning Analytics to Improve Inclusion and Belonging in First-Year Engineering Design 
Both quantitative and qualitative data were used to support inclusion and belonging for first-year engineering students in a team-based, hands-on design course. Students learn technical skills as well as collaboration, team-building and professionalism. They worked on long-term projects in their teams and were asked to complete a team contract as well as peer and self evaluations as part of that process.

Fostering Belonging in the Computer Science Classroom 
This course improvement project used learning analytics to support belonging for diverse groups in non-diverse environments. Gender distribution in Computer Science courses is far from parity. A set of engagement techniques, including small group discussions and murkiest points, was used to monitor and improve the feeling of belonging among all students regardless of the size of their cohort. Classroom observations, student surveys and reviews of student questions were also implemented.

Other examples using centrally supported teaching and learning tools
Most of the supported tools in the Learn@UW suite of tools offer some sort of learning analytics capabilities. Search the KnowledgeBase for more information about specific tools.

Other Examples

Using Learning Analytics to Visualise Computer Science Teamwork
A teamwork dashboard, founded on learning analytics, learning theory and teamwork models, analyzes students’ online teamwork discussion data and visualizes the team mood, role distribution and emotional climate. Educators can use the tool to monitor teams, identify problem teams, and give students feedback about team interactions.

Access on the web  |  Access from UW–‍Madison Libraries

DOI: 10.1145/2729094.2742613

The Curious Case of Centrality Measures: A Large-Scale Empirical Investigation
A meta-analysis of 69 cases examining the relationship between students’ network centrality in a course online forum and their grades finds that the number of direct connections (degree centrality) is the measure most consistently associated with performance.

Access on the web  |  Access from UW–‍Madison Libraries

DOI: 10.18608/jla.2022.7415

Applying social learning analytics to message boards in online distance learning: A case study
This study examines the relationship between social network analysis parameters and student outcomes, including global course performance. The findings suggest that future research should investigate whether there are conditions under which social network parameters are reliable predictors of academic performance, but also advise against relying exclusively on social network parameters for predictive purposes.

Access on the web  |  Access from UW–‍Madison Libraries

DOI: 10.1016/j.chb.2014.10.038

Fostering student engagement in online discussion through social learning analytics
This pilot study, conducted in an online course, devised an analytics toolkit that turns discussion forum data into information for students to reflect upon. Students used the toolkit to monitor posting behaviors and set participation goals. Tool use was linked to increased solicitation for peer responses and individual reflection.

Access on the web  |  Access from UW–‍Madison Libraries

DOI: 10.1016/j.iheduc.2017.12.002
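The centrality analyses in these studies rest on a simple idea: count each student’s distinct discussion partners. A minimal sketch of degree centrality over a discussion-forum reply log, using invented names and hypothetical data:

```python
# Minimal sketch: degree centrality from a discussion-forum reply log.
# Each edge is (poster, replier); all names and data are hypothetical.

replies = [
    ("ana", "ben"), ("ana", "cal"), ("ben", "cal"),
    ("ana", "dia"), ("cal", "dia"), ("ana", "ben"),
]

# Build an undirected interaction graph: who has exchanged replies with whom.
neighbors = {}
for a, b in replies:
    neighbors.setdefault(a, set()).add(b)
    neighbors.setdefault(b, set()).add(a)

# Degree = number of distinct classmates a student interacts with.
degree = {student: len(peers) for student, peers in neighbors.items()}
print(degree)
```

Students with high degree may act as the “knowledge shepherds” mentioned earlier, while low-degree students may be candidates for outreach; per the study above, such measures should not be the sole predictor of performance.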

Improve learning materials & tools

Description

Learning analytics can track a student’s usage of learning materials and tools to identify potential issues or gaps, and offer an objective evaluation of those course materials. This allows instructors to make deliberate decisions about modifying approaches. Using aggregate student data, instructors can see ways to improve the process of learning or the structure of their course.

Examples of questions this approach helps answer

  • Does the course design align with learning objectives?
  • Are course teaching practices effective for learners?
  • Do learning activities and assessments help learners achieve course outcomes?
  • What patterns can be seen in formative assessments to show students’ progress toward learning outcomes?
  • Which learning materials and activities are MOST helpful and which are LEAST helpful for a given module?
  • Are course materials, activities, assessments and teaching practices inclusive for all learners? Does the course support diversity, equity, inclusion, belonging?

UW–‍Madison Examples

Refocusing the Wisconsin Emerging Scholars Computer Sciences Program 
This project investigated why the target audience for a course and program was not enrolling. It asked students several questions to gauge their sense of belonging as well as what type of support they wanted. Based on the data collected, plans are underway to change the course meeting duration, reach out to students through high-impact practice communities, and conduct focus groups to learn more about what students want.

Curating the Most Effective Course Materials by Using Learning Analytics 
Student feedback on the usefulness and quality of resources and activities can help the instructor curate and modify course materials. Patterns of access to resources can signal the most helpful resources and potentially guide interventions.

Efficacy of Module 0 for TAs (Orientation/Course Policies) 
Situating TAs as student learners in a blended course provides them with an authentic example of the student experience. It also provides an efficient location for the instructor to share important resources and grading policies to support equitable and consistent application of policies.

New Student Online Orientation (Master’s Program)
Students in this fully-online program need support learning about UW–‍Madison resources, as well as online course tools, expectations, and scholarly writing. The orientation will be adjusted based on students’ needs.

Measuring the Effectiveness of Accelerated Online Courses for Faculty Development
Condensing and accelerating a course requires thoughtful activity/assignment planning. Using surveys along with page views and completion rates can guide course modifications.

Review Content Interactions for Student Learning
Students are expected to complete 3 types of online activities in each module (read – view – do) to help prepare them for class. This project used multiple modalities to explore whether students accessed various content types, and whether learning occurred.

Other examples using centrally supported teaching and learning tools
Most of the supported tools in the Learn@UW suite of tools offer some sort of learning analytics capabilities. Search the KnowledgeBase for more information about specific tools.

Other Examples

Informing learning design in online education using learning analytics of student engagement
This approach uses learning analytics to understand the extent to which students’ engagement with online learning activities aligns with course design expectations, and the subsequent effect on their academic performance. The analysis of digital traces in a virtual learning environment illuminates how students actually engage with course materials and how different study patterns affect their academic performance. This approach can help instructors pinpoint when students struggled and which study materials they struggled with.
Access on the web  |  DOI: 10.4324/9781003177098-17

Individualized learning

Description

Adaptive or individualized learning systems apply learning analytics to customize course content for each learner. User profiles and other sets of data can be collected and analyzed to offer greater personalized learning experiences. This approach uses continuous feedback to help individual students in their learning.
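As a minimal sketch of how an adaptive system might route learners, the following branches on a formative quiz score; the thresholds and module names are hypothetical, not from any campus tool.

```python
# Minimal sketch of adaptive content selection: route each learner to
# remedial, core, or advanced material based on a formative quiz score.
# The thresholds and module names below are hypothetical.

def next_module(quiz_score, max_score=20):
    """Pick the next content module from a formative quiz result."""
    pct = quiz_score / max_score
    if pct < 0.6:
        return "remedial-practice"   # reinforce basic skills first
    if pct < 0.85:
        return "core-content"        # continue on the standard path
    return "advanced-extension"      # offer enrichment material

for score in (10, 15, 19):
    print(score, "->", next_module(score))
```

Real adaptive systems draw on richer learner profiles and continuous feedback, as described above; this only illustrates the branching mechanic, similar in spirit to the Canvas branching used in the Performance Management Conversation Training project.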

Examples of questions this approach helps answer

  • Do students need practice with more basic skills and concepts? Would students benefit from supplemental or more advanced course materials?
  • Can students choose their own learning path, or choose content based on their learning preferences (e.g., choosing to view a video or read an article instead)?
  • Can I provide continuous feedback to support independent learning?
  • If students have more opportunities to practice, study and review, will they do better with learning outcomes?
  • Will individualized learning provide more flexibility for diverse learners and increase equity, inclusion and belonging?

UW–‍Madison examples

Performance Management Conversation Training 
This professional development training allows participants to complete a scenario-based activity to determine which learning materials they need to complete. This project leveraged Canvas functionality to create branching materials and resources for learners at various levels.

Students Choose Their Own Learning Path: TA Training in Online Instruction 
Two paths for learning were created to support teaching assistants with varying levels of subject matter expertise and teaching experience. Students could self-select into a learning path and received customized content, based on which path they selected.

Other examples using centrally supported teaching and learning tools
Most of the supported tools in the Learn@UW suite of tools offer some sort of learning analytics capabilities. Search the KnowledgeBase for more information about specific tools.

Other Examples

Advanced machine learning approaches to personalise learning: learning analytics and decision making
This paper presents an original methodology for personalising learning by applying learning analytics in virtual learning environments, together with empirical research results. Using this personalisation methodology, a decision-making model and method are proposed to evaluate the suitability, acceptance and use of personalised learning units.
Access on the web  |  Access from UW–‍Madison Libraries

DOI: 10.1080/0144929X.2018.1539517

U-Behavior: Using visualizations to improve teaching and learning (video) (or access their website)
Colorado State University developed a tool that leverages Canvas quiz functionality to help students change their study behavior by rewarding them for spacing out practice activities and re-taking formative assessments to promote long-term learning. Students view visualizations about their behavior and self-reflect as part of this teaching practice.
Access on the web

Weathering the Storm: Targeted and Timely Analytics to Support Disengaged Students 
A human-centered approach can use learning analytics to identify students who are not engaged, so that an instructor or outreach support person can communicate and offer support. First-year students may benefit from this communication strategy the most and are more likely to continue into their second year of university.
Access on the web

Development of Cost-Effective Adaptive Educational Systems via Crowdsourcing
Students can evaluate, create and curate learning materials and resources as part of their learning, with the support of guides, rubrics and exemplars. This project also used gamification mechanisms to engage students.
Access on the web

Predict student performance

Description

Based on existing data about learning engagement and performance, learning analytics applies statistical models and/or machine learning techniques to predict later learning performance. By doing so, likely at-risk students can be identified for targeted support. The focus is on using data to prompt the instructor to take immediate action to intervene and help a student course-correct before it is too late.
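A hedged sketch of this kind of statistical modeling: a small logistic regression over engagement features flags likely at-risk students. The features, training data and threshold are hypothetical illustrations, not any campus model.

```python
import math

# Minimal sketch: logistic regression over two engagement features
# (weekly logins, assignments submitted) to flag likely at-risk students.
# All training data below is a hypothetical illustration.

train = [  # (logins/week, assignments submitted, 1 = passed)
    (9, 5, 1), (7, 4, 1), (8, 5, 1), (2, 1, 0), (1, 0, 0), (3, 2, 0),
]

w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(2000):                 # stochastic gradient descent
    for x1, x2, y in train:
        p = 1 / (1 + math.exp(-(w[0] * x1 + w[1] * x2 + b)))
        err = p - y                   # gradient of log-loss w.r.t. the logit
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

def at_risk(logins, assignments, threshold=0.5):
    """Flag a student whose predicted pass probability falls below the threshold."""
    p = 1 / (1 + math.exp(-(w[0] * logins + w[1] * assignments + b)))
    return p < threshold              # low probability -> candidate for outreach

print(at_risk(2, 1), at_risk(8, 5))
```

Production models (like the Pharmacy microgrant’s regression work described below) use many more features and must be audited for bias, consistent with the guiding principles above; this sketch only shows the mechanics of turning engagement data into a risk flag.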

Examples of questions this approach helps answer

  • How can we use data (for example, collected in a learning management system or elsewhere) to predict how students may perform in a course, a degree program, or a set of courses?
  • In what ways can we leverage learning analytics data to support student learning, whether students are struggling or begin to struggle?
  • What learning analytics approaches can be used to keep students from dropping out of college or their degree programs?
  • Can we use predictive analytics to support equity, diversity, inclusion and belonging? How do we mitigate biases?

UW–‍Madison Examples

Addressing Data Disparities Through Disaggregation (2023 Pharmacy microgrant poster) 
One of the first microgrant projects used a predictive learning analytics approach through regression modeling. As a first step, Pharmacy looked at demographic data, Kaltura video views, formative Canvas quiz scores and other data to predict students’ outcomes in their Pharmacotherapy I course.

Pattern: Myths & Realities of Self-Reporting Activities 
Students were asked to self-report and reflect on time spent on course activities. Canvas and Kaltura analytics and surveys were also used to explore whether students’ self-reported behavior correlated with Canvas behavior and final grades. Students who logged their study time were generally already high-performers in the course.

Other examples using centrally supported teaching and learning tools
Most of the supported tools in the Learn@UW suite of tools offer some sort of learning analytics capabilities. Search the KnowledgeBase for more information about specific tools.

Other Examples

Learning analytics to predict students’ performance: A case study of a neurodidactics-based collaborative learning platform 
Learning analytics methods achieved accuracies greater than 99% in predicting students’ final performance in a Neurodidactics course.
Access on the web  |  Access from UW–‍Madison Libraries

DOI: 10.1007/s10639-022-11128-y

Predicting Student Performance in Higher Educational Institutions Using Video Learning Analytics and Data Mining Techniques  
Learning analytics data mining techniques allowed researchers to predict students’ performance in a course with an accuracy of 88.3%.
Access on the web  |  Access from UW–‍Madison Libraries

DOI: 10.3390/app10113894

Six Practical Recommendations Enabling Ethical Use of Predictive Learning Analytics in Distance Education
This 2023 case study reports on a 4-year application of LA at the Open University in the UK. It offers six practical recommendations for the ethical use of learning analytics in higher education institutions (HEIs): 1. End users should be actively involved in the design and implementation of LA tools; 2. LA should be inclusive by considering diverse student needs; 3. HEIs should act upon LA data and communicate the added value of adopting LA tools; 4. Students should benefit from LA through a clear plan of support interventions; 5. HEIs should test LA data for hidden bias by engaging with diverse stakeholders; and 6. Institutional LA ethics policy should be reviewed and updated regularly through practical ethics and interdisciplinary research.
Access on the web

DOI: 10.18608/jla.2023.7743

Students matter the most in learning analytics: The effects of internal and instructional conditions in predicting academic success
A significant portion of variability stems from the learners’ internal conditions. Hence, when variability in external conditions is largely controlled for (the same institution, discipline, and nominal pedagogical model), students’ internal state is the key predictor of their course performance.
Access on the web

DOI: 10.1016/j.compedu.2021.104251

Visualize learning activities

Description

This approach traces the learning activities users perform in a digital ecosystem to produce visual reports on learner behavior and performance. The reports can support both students and teachers in boosting learning motivation, adjusting practices and improving learning efficiency. This is about facilitating awareness and self-reflection in students about their learning patterns and behaviors.
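As a minimal, hedged sketch of this visualization idea, weekly page views can be rendered as a text bar chart so engagement patterns are visible at a glance; the counts below are hypothetical.

```python
# Minimal sketch: render weekly page views as a text bar chart so a
# learner or instructor can spot engagement patterns at a glance.
# The view counts below are hypothetical.

weekly_views = {"Week 1": 120, "Week 2": 95, "Week 3": 40, "Week 4": 15, "Week 5": 88}

scale = max(weekly_views.values()) / 30   # fit the longest bar into 30 characters
lines = []
for week, views in weekly_views.items():
    bar = "#" * round(views / scale)      # bar length proportional to activity
    lines.append(f"{week:7} {bar} {views}")
print("\n".join(lines))
```

Dashboard tools like those in the studies below do this with richer interactive charts; the point of the sketch is simply that raw activity logs become actionable once aggregated and visualized.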

Examples of questions this approach helps answer

  • How much time are students’ successful peers spending interacting with different course materials?
  • What time of the week/day are students engaging with course materials? Can this help inform instructors about when to send out timely messages?
  • Are students viewing important course content, or viewing other course content less, and can this help instructors structure their courses differently?

UW–‍Madison examples

Page-Level Student Access Analytics for Online Resource Improvement 
Online course materials provide students with more independent study and retrieval practice opportunities. Learning analytics and student feedback can inform continuous improvement of course materials and help students be more prepared for the end of year board exam.

Curating the Most Effective Course Materials by Using Learning Analytics 
Student feedback on the usefulness and quality of resources and activities can help the instructor curate and modify course materials. Patterns of access to resources can signal the most helpful resources and potentially guide interventions.

Other examples using centrally supported teaching and learning tools
Most of the supported tools in the Learn@UW suite of tools offer some sort of learning analytics capabilities. Search the KnowledgeBase for more information about specific tools.

Other Examples

Applying learning analytics dashboards based on process‐oriented feedback to improve students’ learning effectiveness
Researchers developed a learning analytics dashboard (LAD) based on process‐oriented feedback in iTutor to offer learners their final scores, sub‐scale reports, and corresponding suggestions on further learning content.
Access on the web  |  Access from UW–‍Madison Libraries

DOI: 10.1111/jcal.12502

Learning analytics dashboard: a tool for providing actionable insights to learners
Researchers show how data-driven prescriptive analytics can be deployed within visualizations and dashboards to provide concrete advice to learners, and thereby increase the likelihood of triggering behavioral changes.
Access on the web  |  Access from UW–‍Madison Libraries

DOI: 10.1186/s41239-021-00313-7

Associations between learning analytics dashboard exposure and motivation and self-regulated learning 
Researchers modeled how changes in student motivation and self-regulated learning (SRL) related to 1-on-1 meetings with academic advisors, during which students could view representations of their achievement in an Early Warning System (EWS) that visually presented aspects of their academic performance against course averages.
Access on the web  |  Access from UW–‍Madison Libraries

DOI: 10.1016/j.compedu.2020.104085

Despite our best efforts to ensure the accessibility of the LACE webpage and resources, there may be some limitations with some of the content linked on the LACE website. Below is a description of known limitations and potential solutions. Please contact LearningAnalytics@office365.wisc.edu if you encounter an issue.

Known limitations for LACE’s content:

  1. Case study links and PDFs: Some of the research and case studies listed within the learning analytics functional taxonomy are provided through PDFs and external links. The PDFs and external links may include images without text alternatives, may not be accessible to all screen readers, and may lack high color contrast in some areas. Because we are unable to alter some of the PDFs or content from external links, fully accessible content cannot be guaranteed. However, we are committed to working with any users who need accommodations.

Resources and opportunities

There are several ways to learn about learning analytics at UW–‍Madison. LACE collaborates with the Learning Analytics Roadmap Committee (LARC), the Vice Provost for Teaching & Learning and others to provide the following opportunities and resources.

Community of interest for learning analytics

If you are interested in obtaining information about learning analytics opportunities, training, fellowships and events on campus, please join the UW–‍Madison Learning Analytics MS Team or email us and request to be added to the group.

Additional resources

Materials from past learning analytics events

LACE updates

List of articles

Help & contact

Schedule a consultation

A wide range of tools is available, and our learning technology consultants are happy to help you choose the best tool to fit your needs. Instructors and instructional staff can request a consultation with a DoIT AT consultant through the DoIT Help Desk.

DoIT Help Desk

Other questions or feedback?

We’d like to hear from you.

Feedback form