[Fall Workshop 2021] Assessment, Analytics and Student Learning: Schedule

Time Session
8:30 – 9:00 Soft open – grab a coffee, chat with friends
9:00 – 9:10 Kick off: Land acknowledgement, introductions
9:10 – 10:15 Panel – Brenna Clarke Gray, Colin Madland, Ian Linkletter, Vivian Forssman, Brian Lamb: Student Privacy and Online Exam Proctoring
Brenna Clarke Gray is an Educational Technologies Coordinator at Thompson Rivers University, Colin Madland is Manager of Online Learning and Instructional Technology at Trinity Western University, Ian Linkletter is a Learning Technology Specialist at the University of British Columbia, Vivian Forssman is a Program Manager with the Resilience by Design Lab, among other projects, and Brian Lamb is the Director of Learning Technology and Innovation at Thompson Rivers University.
15 minute break
10:30 – 10:50 Briana Fraser, Susan Bonham: H5P: Building Independent Learning Experiences Without Costing Students’ Privacy
Looking for a tool to create opportunities for students to interact with content without sacrificing student privacy? H5P is a great option.

H5P allows users to create interactive content that can run on an LMS, WordPress, or other platforms, and that students can access without providing any personal information to a third party. In this session, we will showcase an H5P object that uses customizable feedback within a no-stakes formative learning and assessment activity. Through interaction with this H5P activity, students can be introduced to new content, test their understanding, receive feedback, integrate that feedback into the next stage of the activity, and work towards mastery of a skill – all while retaining their privacy. In addition to demonstrating this H5P object, we will also point out the H5P settings that block collection of analytics.
10 minute break
11:00 – 11:20 Craig Thompson and Annay Slabikowska: A collaborative approach to understanding enrolment patterns: The Student Flows Project
A group of analysts at UBC have been collaborating to answer common questions about student progression through courses and programs. In this session we will share lessons learned from our work, as well as tips for success for those interested in undertaking similar initiatives.
10 minute break
11:30 – 11:50 Alison Myers, Marko Prodanovic, Sunah Cho: In-house video analytics: taking advantage of available log data to create contextually-relevant learning analytics dashboards
Learning analytics can provide valuable insights to instructors about their teaching; however, analytics from vendor tools are often isolated datasets that do not give the entire picture. We explored how to combine log data about video viewing activity with LMS data about course structure to provide our instructors with analytics that include not only student video-watching behaviours (from the log data), but also contextual information about their course (from the LMS data). As the field of learning analytics has matured, we often see “scaled” solutions that aim to provide data or insights to an entire institution. However, the goal of “scale” can overshadow the design decisions made by individual instructors, creating analytics that are not relevant or detailed enough for decision making. Our collaboration between the analytics team and a learning designer allowed the focused development of a learning analytics dashboard that is contextually relevant and meaningful for our instructors, so that they can use the information to make design decisions that improve their teaching and the student experience.
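The core idea in this session – enriching raw video-view log events with course-structure data from the LMS – can be sketched in a few lines. All field names and values below are illustrative assumptions for the sake of the example, not the presenters' actual schema.

```python
# Hypothetical sketch: join video-viewing log events to LMS course structure
# so that per-module watch statistics carry course context.
from collections import defaultdict

# Raw log events from the video platform (illustrative data).
video_events = [
    {"student": "s1", "video_id": "v1", "seconds_watched": 120},
    {"student": "s2", "video_id": "v1", "seconds_watched": 45},
    {"student": "s1", "video_id": "v2", "seconds_watched": 300},
]

# Course structure exported from the LMS: which module each video sits in.
lms_modules = {
    "v1": {"module": "Week 1: Introduction"},
    "v2": {"module": "Week 2: Methods"},
}

def summarize_by_module(events, modules):
    """Join log events to LMS context and aggregate watch time per module."""
    totals = defaultdict(lambda: {"viewers": set(), "seconds": 0})
    for e in events:
        context = modules.get(e["video_id"])
        if context is None:
            continue  # video not mapped to any course module
        entry = totals[context["module"]]
        entry["viewers"].add(e["student"])
        entry["seconds"] += e["seconds_watched"]
    # Report viewer counts rather than identities.
    return {mod: {"viewers": len(v["viewers"]), "seconds": v["seconds"]}
            for mod, v in totals.items()}

summary = summarize_by_module(video_events, lms_modules)
```

The join gives each statistic a course-level label an instructor can act on, which is the kind of contextual relevance the abstract describes.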
11:50 – 12:30 Lunch
12:30 – 1:15 Andy Sellwood and Elle Ting: The Pivot to Online, On The Front Lines: Measuring the Real Impact of Alternative Assessments in Online Learning
During the 2020 pivot to emergency remote teaching, VCC’s approach was to support instructors using the tools and methods (technical and pedagogical) that we had or could readily develop, particularly around built-in Moodle functionality and open education practices. Now, with the early scramble to shift delivery online behind us, we are testing the efficacy of these measures. This presentation will discuss our evaluation of (1) what the “pivot to online” actually meant to individual instructors, students, and disciplines in terms of the development and adoption of alternative assessment, and (2) what the most appropriate alternative assessment solutions would be for VCC and other small to midsize postsecondary institutions.
15 minute break
1:30 – 1:50 Open Session
An open spot to continue discussions or catch up offline.
10 minute break
2:00 – 2:20 Stoo Sepp: Opting Out – Allowing for Student Agency as we wait for Big EdTech to catch up
In this presentation, I’ll propose a conceptual framework for custom-built analytics tools that give students agency over the use of their data. Most learning platforms don’t give students a choice about whether their data is collected, but custom-built tools can. Even where collection is a given, such tools can at least let students choose whether their data is used for learning analytics purposes. This would allow for an opt-in mechanism while we wait for large edtech firms to catch up.
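The opt-in mechanism described above amounts to a consent gate between data collection and analytics processing. A minimal sketch, with purely illustrative names (this is not the presenter's actual design):

```python
# Hypothetical consent gate: the platform may still collect events, but
# analytics processing only sees events from students who opted in.

consent = {"s1": True, "s2": False}  # per-student opt-in flags

raw_events = [
    {"student": "s1", "action": "viewed_page"},
    {"student": "s2", "action": "viewed_page"},
    {"student": "s1", "action": "submitted_quiz"},
]

def analytics_stream(events, consent_flags):
    """Yield only events from students who have opted in to analytics use."""
    for e in events:
        # Default to excluded: no recorded consent means no analytics use.
        if consent_flags.get(e["student"], False):
            yield e

opted_in = list(analytics_stream(raw_events, consent))
```

Defaulting unknown students to excluded is the design choice that makes this opt-in rather than opt-out.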
10 minute break
2:30 – 3:15 Jeff Longland: Instrumenting for Learning Events at UBC
An introduction to learning events, learning record stores and the work underway at UBC to build a data system for learning events. We’ll look at how we’ve instrumented learning tools to emit structured events, how events are ingested and stored, then explore examples of how events can be used in instructor and student dashboards.
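To make the idea of a "structured event" concrete: learning record stores commonly accept statements in the xAPI format, which describes an event as an actor, a verb, and an object. The sketch below builds such a statement; the endpoint, identifiers, and field values are illustrative assumptions, not UBC's actual instrumentation.

```python
# Hypothetical sketch of an xAPI-style learning event statement.
import json
from datetime import datetime, timezone

def make_statement(actor_email, verb, activity_id, activity_name):
    """Build an xAPI-style actor/verb/object statement (illustrative)."""
    return {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{actor_email}"},
        "verb": {
            "id": f"http://adlnet.gov/expapi/verbs/{verb}",
            "display": {"en-US": verb},
        },
        "object": {
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
        },
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

stmt = make_statement(
    "student@example.edu",
    "completed",
    "https://lms.example.edu/activities/quiz-1",
    "Week 1 Quiz",
)
payload = json.dumps(stmt)  # what an instrumented tool would send to the store
```

Because every tool emits the same actor/verb/object shape, a single record store can ingest events from many tools and feed the kinds of instructor and student dashboards the session describes.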
15 minute break
3:30 – 3:50 Gayle Palas: Data Literacy and Learner Analytics – Understanding our Relationship to Data
As institutions grapple with how to leverage the vast amounts of data that online learning and innovation are producing, educators find themselves under increasing pressure to use analytics data to support student learning. Yet interpreting learner analytics data ethically and in a pedagogically sound manner requires data literacy skills, access to tools, and knowledge that are often unavailable. In this session, we will examine a common approach to data literacy in the context of learner analytics and invite participants to critically reflect on what their data literacy needs might be and how they might be met.

This session is applicable to administrators, educators, students, and staff who are interested in expanding their understanding and application of learner analytics through increased data literacy.
4:00 – 4:15 Closing Remarks
