An LTI-based prototype for a Student Progress Dashboard
At the last 1EdTech quarterly meeting, held in February 2013 at Lone Star College in Houston, we spent a lot of time on analytics. Analytics is a pretty hot topic in education these days. In fact, in higher education the hype has been off the charts for about two years now. At EDUCAUSE 2011, analytics was the savior; at EDUCAUSE 2012 the hype was more muted, but still strong.
Why? Economics. Retaining one student is worth substantial dollars. Retaining many = mucho dollars. Not to mention the national goals for graduating more students, which have a broader impact on any national economy, since the lifetime wage gap between degreed and non-degreed people is large.
One of the problems with the term analytics is that it is VERY broad. At our quarterly meeting we had a parade of companies (large & small) as well as very well-informed individuals working in the analytics field. We learned that there are at least three levels of analytics applicable to education:
- Learning analytics: Data analysis that helps students improve learning outcomes.
- Academic/program analytics: Data analysis that provides insight into what is happening in a specific program and how to plug holes or otherwise adjust.
- Institutional analytics: Data analysis that helps make decisions about how to improve at the institutional level.
There is also a fourth, even higher level, at which governments might crunch numbers to understand what is happening at a statewide or national scale. Since we don't consider ourselves at 1EdTech to be part of the government, this fourth level is not too interesting to us.
There are some great companies doing great work in analytics, among them Oracle, Desire2Learn, LoudCloud, McGraw-Hill, and Civitas Learning, all of whom presented at the 1EdTech quarterly.
And, of course, one thing we have learned previously about domain-specific adaptive tutor/homework applications, like Pearson MyLabs, is that they can make use of data collected across many institutions.
The use of analytics to crunch, and potentially correlate, data from things that might not appear to be related has appeal for many. For instance, one claim made by the CEO of Knewton at the U.S. White House Datapalooza event last year was that the Knewton product would be able to predict how well a student would do based on what they had for breakfast! That sort of data would be very interesting to Frosted Mini-Wheats, as well as to some parents.
Crunching large amounts of data from many sources and then figuring out which data is most useful/predictive is often referred to as making use of “Big Data.”
But there is also “Small Data.” Small data tends to be more localized and, perhaps, more immediately actionable (see the non-education article Why Small Data May Be Bigger than Big Data). As Mark Milliron said at Learning Impact 2011, “Students are good with collecting data on them if it can actually help them as individuals.” This makes a lot of sense to us at 1EdTech.
Now, of course, data interoperability can potentially aid analytics, because data captured using agreed-upon definitions across many tools/products should be easier to compare. Analytics is a really important focus area for 1EdTech, and it will be a key focus at this year's Learning Impact 2013 conference, May 13-16 in San Diego.
The sort of “holy grail” of data interoperability is an agreed-upon “learning/progress map” that all tools and assessments could populate. Some are working on that very issue today (see, for instance, the Dynamic Learning Maps collaborative, which is participating in 1EdTech via CETE at the University of Kansas). However, while it is relatively straightforward to agree on some types of data – for instance, assessment item results data as in QTI/APIP, or usage data on things like e-books – the state of the market is that student learning models and data are in their infancy. Therefore, many tools will produce analytics information that makes sense within the tool, but not more generally. 1EdTech wants to put in place standards that encourage that type of innovation through variability, as well as the type of standards that capture things everyone can agree on.
To enable more use of small data in education, it occurred to us that it would be very cool if it were easier for students or teachers to simply see all of the progress data in one place – even though the tools are all separate. What a major step forward it would be for a student to work in several tools and be able to see how their results compared. So, we decided to see if LTI could be used to enable a Student Progress Dashboard that is a mash-up of many dashboards from independent tools. We see such a dashboard as displaying the unique analytics capabilities of any tool – whether or not data definitions are agreed to, and whether or not the tool provider is willing to share such data. We think this very simple idea is empowering and will complement the progress we are making on defining agreed-upon data fields where we can.
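To make the mechanics concrete: each panel in such a dashboard is really just an LTI launch into one tool's progress view. In LTI 1.x a launch is an OAuth 1.0a signed form POST, so a dashboard only needs a consumer key/secret for each tool it embeds. Below is a minimal Python sketch of the signing step; the endpoint URL, key, secret, and IDs are hypothetical, and a real deployment would lean on a maintained OAuth library rather than hand-rolling the signature as we do here for illustration.

```python
import base64
import hashlib
import hmac
import time
import uuid
from urllib.parse import quote, urlencode

def sign_lti_launch(launch_url, consumer_key, consumer_secret, params):
    """Return LTI 1.x launch parameters with an OAuth 1.0a HMAC-SHA1 signature."""
    # OAuth bookkeeping fields required on every LTI 1.x launch
    oauth_params = {
        "oauth_consumer_key": consumer_key,
        "oauth_nonce": uuid.uuid4().hex,
        "oauth_signature_method": "HMAC-SHA1",
        "oauth_timestamp": str(int(time.time())),
        "oauth_version": "1.0",
    }
    all_params = {**params, **oauth_params}
    # Signature base string: method & encoded URL & sorted, encoded parameters
    encoded = sorted(
        (quote(k, safe=""), quote(v, safe="")) for k, v in all_params.items()
    )
    param_str = "&".join(f"{k}={v}" for k, v in encoded)
    base_string = "&".join(
        ["POST", quote(launch_url, safe=""), quote(param_str, safe="")]
    )
    # Launches carry no token secret, so the signing key ends with a bare "&"
    key = quote(consumer_secret, safe="") + "&"
    digest = hmac.new(key.encode(), base_string.encode(), hashlib.sha1).digest()
    all_params["oauth_signature"] = base64.b64encode(digest).decode()
    return all_params  # POST these as form fields to launch_url

# Hypothetical launch into one tool's progress view for one student
launch = sign_lti_launch(
    "https://tool.example.com/progress",  # hypothetical tool endpoint
    "dashboard-consumer",                 # hypothetical key/secret pair
    "shared-secret",
    {
        "lti_message_type": "basic-lti-launch-request",
        "lti_version": "LTI-1p0",
        "resource_link_id": "dashboard-panel-1",
        "user_id": "student-42",
        "roles": "Learner",
    },
)
print(urlencode(launch))
```

The dashboard page would simply render one such signed form per tool, each auto-submitting into its own iframe. Each tool decides for itself what progress view to return for that user, which is exactly why no shared data definitions are required for this to work.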
And now we have a very simple prototype to show one version of the concept – using tools that are not especially analytical in nature, but ones we had lying around. If you go to this screencast by Stephen Vickers you will see the very first 1EdTech-enabled Student Progress Dashboard prototype. We expect this to become a standard feature supported in LTI going forward, and we want to see lots of riffing on this in the LTI community! Let us know what you think! And, tool providers, start your engines!
Note that we may not be able to tell how well a student will perform based on what they had for breakfast, as perhaps Knewton can, but we can perhaps make a combination of tools – tools available today – more actionable for students or teachers!
1EdTech LTI-Enabled Student Progress Dashboard Prototype