How is the IMS Learning Impact Report different from the Horizon Report?

Today IMS Global announced the release of the Learning Impact Report. The Learning Impact Report provides an analysis of the winners of the IMS Learning Impact Awards (LIAs). The LIAs have been an annual global competition since 2007; however, this is only the second report – the first was a brief summary analysis released in 2010.

2013 LIA Report

Rather than rewarding the latest and greatest products, the LIAs are based on an evaluation of the use of technology “in context” at an institution, a state, or sometimes across an entire nation or continent. The context provides evidence across eight dimensions of impact, which expert judges use to produce ratings and select the winners. The judges select the winners – IMS does not. The idea behind the analysis in the Learning Impact Report is that looking across the winners, both in the current year and historically, can provide insight into the types of projects, initiatives, and R&D that are having impact or have impact potential.

The other important factor to understand is that IMS as an organization has more than 15 years of history on the leading edge of pretty much every type of educational technology, including learning management, e-portfolio, e-assessment, learning design, and so on. Thus the entries are somewhat overrepresentative of developments ahead of the general market.

The net-net of these factors is that we thought there might be some interesting information to be obtained by looking across the medal winners and the finalists (those selected to come to the Learning Impact conference to be considered for a medal). It is certainly interesting simply to attempt to ascertain the high-impact “project categories” – which we had to develop ourselves by looking across the nominations (the submitters did not invent these project categories, nor were they asked to submit in a project category).

We are hoping now to be able to release the Learning Impact report annually, largely because of the institutional leadership behind it (please contact me if interested in becoming involved in the annual report).

Even though I feel that the above explanation makes the unique focus of the Learning Impact Awards fairly obvious, I wanted to provide a brief excerpt from the Learning Impact Report that helps explain how it differs from probably the most widely read report on new technologies in the education space, namely the Horizon Report. Here is that excerpt:

In terms of comparison to other reports, there may be a temptation to compare the Learning Impact Report to the annual Horizon Report(s), of which there are K-12, HED, and regional editions. However, because the Learning Impact Report takes the approach of focusing on project types rather than attempting to identify specific technologies and their adoption timeframes (as is the nature of the Horizon Report), the two reports are quite complementary. The reader of this report and any version of the Horizon Report can draw their own conclusions by comparing and contrasting the information provided. To illustrate, the following bullets give a couple of examples of how this Learning Impact Report could potentially help clarify technologies placed in the “one year or less” time-to-adoption horizon in the 2013 Horizon Report.

  • Massive Open Online Courses: The Learning Impact analysis would see MOOCs as a type of “Blended Learning Optimization” project. As shown in Figure 3A, these types of projects have not yet achieved mainstream effectiveness in the opinion of IMS. That does not mean that no particular instance of a MOOC has been effective. What it does mean is that from IMS’s perspective, based on the cumulative evidence, widespread, high-impact adoption of projects in this category is not apparent in the near term. Thus, we would potentially modify the Horizon Report’s findings by pointing out that (a) there are many variations of the Blended Learning Optimization concept that institutions should be considering depending on their goals (some examples of which are given in this report), and (b) these are not easy projects to implement at this point in time.
  • Tablet Computing: Tablets have definitely exploded onto the education scene. From IMS’s perspective, we ask whether they are indeed being leveraged to improve Learning Impact. In the 2010 Learning Impact Report we identified the category of Mobile Learning Resources as being in its early stages. However, in the current report we have eliminated that category because virtually all other project categories now need to encompass the requirements of mobile devices in some way. IMS has also seen some very innovative and high-scoring projects that have had tablets as a primary platform; some of these now appear in the Platform Innovation category, but they may also appear in other categories depending on the project focus. However, improving Learning Impact specifically through the deployment of tablets typically requires an adjustment of teaching and learning models as well as technology being integrated in new ways. Therefore, we have primarily seen pilot projects that require substantial resources to put in place. Thus IMS would conclude that while tablets are a given, achieving substantial impact requires further development.

Finally, it should also be noted that in producing this report IMS takes advantage of a unique viewpoint of the educational technology landscape, facilitated by a flourishing collaboration among many of the world’s leading educational technology providers and institutions at IMS’s many face-to-face meetings worldwide. To enter the Learning Impact Awards competition, look here.
