Measuring Student Success using Learning Analytics and Digital Assessments
Monday, November 30
Please join us for this combined session on Learning Analytics and Digital Assessment. We are happy to announce two great keynote speakers for this session: Matt Richards, Principal Product Owner at Cambridge Assessment, and Bart Rienties, Professor of Learning Analytics at the Open University UK. Invited guests will share their ideas with us, alongside a number of very interesting lightning talks from our members. You will have plenty of opportunity to listen and to contribute to discussions on several themes.
All times listed are in Central European Time
Welcome - Slides
Learning analytics has always relied upon assessment as both a measurement of student performance and a leading indicator of future success, but we are now starting to see the process of assessment itself benefitting from analytical methods. In this introduction I will review the history of this relationship and share some examples of how it is changing for the benefit of both areas.
Online Proctoring: When do you push the RED button?
Monitoring a live proctored exam requires the utmost concentration from the live proctor. Under remote, high-stakes conditions, good and fair testing remains a basic principle of exam regulation. There are many forms of irregularities, incidents and violations that can be observed as candidates navigate through their exam. Which ones do you capture and report on, given that a live proctor monitors at least six candidates at the same time? Nothing is what it seems when we assess multiple aberrant patterns. Is there a difference between what we observe and how we judge?
Implementation of a new platform for national tests and exams in Norway
By Johan Aamdal Bottheim, Project delivery manager, The Norwegian Directorate for Education and Training
This lightning talk will take you to Norway, where the government is developing digital exams and national tests for students nationwide.
From enabling new approaches to engaging learners, to supporting greater educational insight and ultimately delivering more scalable and convenient products, the transition to Digital Assessment provides significant opportunities for learning and assessment organisations.
12:30 – End of the morning session
As assessments drive student learning there is a wealth of research and practical experience providing guidelines on how to effectively design different forms of assessment. In this presentation, I will discuss how the Open University UK is using online assessments for its 170K students, and how learning analytics can provide unique insights into how students make complex assessment decisions.
Analytics in vocational assessment
Access Beyond the Assessment: Accessibility for all Aspects of Digital Testing
Accessibility has long been a challenge in the assessment industry. With the recent uptick in online learning and assessment this year, it’s crucial that we rise to meet this challenge. It’s taken longer than many of us hoped, but online assessment platforms are finally taking accessibility seriously, and many vendors are openly boasting about their accessibility compliance and features. However, accessibility for tests isn’t only applicable to the test itself. Assessment programs need to dig a little deeper and think through the entire chain of access to all of the aspects of taking a test.
In this lightning talk, we dive into providing access for the whole assessment process, including what this looks like for test-takers, agents and test creators. We also discuss the critical IMS learning standards that enable accessibility beyond the assessment for all aspects of digital testing.
Authentic Assessment, what’s next?
At Cito we develop authentic assessments to create an immersive testing experience. In this lightning talk, two examples demonstrate our newest generation of assessments, which embed regular assessment interaction types in an authentic context. Although familiar item types are used, these assessments no longer feel like an assessment. The result? Students enjoy taking them, are better able to demonstrate their ability, and feel less stressed. The problem? Current standards do not seem to provide a way to make this content interoperable. How can we adapt these standards to support this use case?
Panel discussion - Bringing together Digital Assessment and Learning Analytics - Recording
Facilitated by Thomas Hoffmann, Director of Product Strategy & Solutions at Open Assessment Technologies
15:00 – End of the afternoon session