
Learning Impact Blog


Community leadership for more effective use of technology in service to education

Today IMS Global announced the release of the Learning Impact Report. The Learning Impact Report provides an analysis of the winners of the IMS Learning Impact Awards (LIAs). The LIAs have been an annual global competition since 2007. However, this is only the second report; the first was a brief summary analysis released in 2010.

2013 LIA Report

As opposed to awards for the latest and greatest products, the LIAs are based on evaluating the use of technology “in context” at an institution, a state, or sometimes across an entire nation or continent. The context provides evidence across eight dimensions of impact that experts use to produce ratings, which in turn are used to select the winners. The judges select the winners – IMS does not. The idea behind the analysis provided in the Learning Impact Report is that looking across the winners in the current year, as well as historically, can provide some insight into the types of projects, initiatives, and R&D that are having impact or have impact potential.

The other important factor to understand is that IMS as an organization has over 15 years of history of being on the leading edge of pretty much every type of educational technology, including learning management, e-portfolio, e-assessment, learning design, etc. Thus the entries are somewhat overrepresentative of developments ahead of the general market.

The net-net of these factors is that we thought there might be some interesting information to be obtained by looking across the medal winners and the finalists (those selected to come to the Learning Impact conference to be considered for a medal). It is certainly very interesting to simply attempt to ascertain the high-impact “project categories” – which we had to develop ourselves by looking across the nominations (the submitters did not invent these project categories, nor were they asked to submit in a project category).

We are now hoping to be able to release the Learning Impact Report annually, largely because of the institutional leadership behind it (please contact me if you are interested in becoming involved in the annual report).

Even though I feel that the above explanation makes fairly obvious what is unique about the Learning Impact Awards' focus, I wanted to provide here a bit of an excerpt from the Learning Impact Report that helps explain how it differs from probably the most widely read report on new technologies in the education space, namely the Horizon Report. Here is that excerpt:

In terms of comparison to other reports there may be a temptation to compare the Learning Impact Report to the annual Horizon Report(s), of which there are K-12, HED and regional editions. However, because the Learning Impact Report takes the approach of focusing on project types rather than attempting to identify specific technologies and their adoption timeframes (as is the nature of the Horizon Report), the two reports are quite complementary.  The reader of this report and any version of the Horizon Report can draw their own conclusions by comparing and contrasting the information provided. To illustrate, the following bullets are a couple of examples of how this Learning Impact Report could potentially help clarify technologies placed in the “one year or less” time to adoption horizon from the 2013 Horizon Report.

  • Massively Open Online Courses: The Learning Impact analysis would see MOOCs as a type of “Blended Learning Optimization” project. As shown in Figure 3A, these types of projects have not yet achieved mainstream effectiveness in the opinion of IMS. That does not mean that there is not a particular instance of a MOOC that has been effective. What it does mean is that from IMS’s perspective, based on the cumulative evidence, the widespread, high impact adoption of projects in this category is not apparent in the near term. Thus, we would potentially modify the Horizon Report’s findings by pointing out that (a) there are many variations of the Blended Learning Optimization concept that institutions should be considering depending on their goals (some examples of which are given in this report), and (b) these are not easy projects to implement at this point in time.
  • Tablet Computing: Tablets have definitely exploded onto the education scene. From IMS’s perspective we ask if indeed they are being leveraged to improve Learning Impact? In the 2010 Learning Impact Report we identified the category of Mobile Learning Resources as being in its early stages. However, in the current report we have eliminated that category because literally all other project categories need to in some way encompass the requirements of mobile devices. IMS has also seen some very innovative and high scoring projects that have had tablets as a primary platform, some of which are now appearing in the Platform Innovation category, but may also appear in other categories depending on the project focus. However, improving Learning Impact specifically from the deployment of tablets typically requires an adjustment of teaching and learning models as well as technology being integrated in new ways. Therefore, we have primarily seen pilot projects that require substantial resources to put in place. Thus IMS would conclude that while tablets are a given, achieving substantial impact requires further development.

Finally, it should also be noted that in the production of this report IMS takes advantage of a unique viewpoint of the educational technology landscape facilitated by a flourishing collaboration among many of the world’s leading educational technology providers and institutions occurring in IMS’s many face-to-face meetings worldwide. To enter the Learning Impact Awards competition look here.


One of the great things about the annual EDUCAUSE conference is hearing the many stories about how IMS standards have enabled innovative new software applications to easily integrate into the educational enterprise. You might think that IMS knows everything about every application of IMS standards. I'd estimate that we typically know about 1/3 of what is actually occurring "out there" – just based on an off-the-cuff measure of how often we are surprised or not surprised by something we hear about. The very weird thing is that sometimes the things we don't hear about are really big adoptions of IMS.

Anyway, please let us know what you are doing so we can help get the word out!

Ray Henderson has recently posted this blog: My Investment Thesis for IN THE TELLING about a start-up he has invested in called "In the Telling." As you can "tell" by the name the product has something to do with "stories."  The more mundane name for what is being offered here is "flipped classroom" - use the out of class time to watch the lectures, use the in the class time for more meaningful interaction.

The problem is that getting students to do anything out of class is a challenge these days. So, In the Telling provides a unique approach that helps the instructor create a story with narration. In essence they are creating a documentary of sorts that is more compelling than a simple lecture.

I have not seen any of the output of In the Telling yet - but the idea is very intriguing.  As someone who has bought more than my share of "great lectures" on various media in which I never made it past the first 30 minutes . . . well, I think better ways to teach is what we need to be investing in.

But, the crowning achievement with respect to IMS comes in the following words from Ray's blog:

COMPATIBLE WITH ALL MODERN LEARNING MANAGEMENT SYSTEMS: The Company designed their solution assuming that the launch point for most learner experiences would begin within an LMS. This is, after all, the way most assignments are made. The platform is built using IMS Global’s open standard for systems integration—Learning Tools Interoperability or “LTI”—which most modern LMS platforms now natively support. Students can initiate sessions with the platform just as they might with any other assignment, and the same basic usage statistics recorded by the LMS are preserved.

IMS is very proud to be a part of enabling the rapid rise of innovation in the edtech community!  
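For readers curious what “natively supporting LTI” looks like under the hood, here is a minimal sketch (not a production implementation) of how an LTI 1.x launch is signed: the LMS form-POSTs the launch parameters to the tool, signed with OAuth 1.0a HMAC-SHA1 using a shared consumer key and secret. The URL, key, secret, and parameter values below are invented placeholders.

```python
# Minimal sketch of OAuth 1.0a (HMAC-SHA1) signing for an LTI 1.x launch POST.
# All identifiers and values here are hypothetical examples, not real endpoints.
import base64
import hashlib
import hmac
from urllib.parse import quote

def sign_lti_launch(url, params, consumer_secret):
    """Return the OAuth 1.0a HMAC-SHA1 signature for an LTI launch POST."""
    # 1. Percent-encode and sort the parameters (oauth_signature is excluded).
    encoded = sorted(
        (quote(k, safe=""), quote(str(v), safe="")) for k, v in params.items()
    )
    param_string = "&".join(f"{k}={v}" for k, v in encoded)
    # 2. Build the signature base string: METHOD & encoded URL & encoded params.
    base_string = "&".join(
        ["POST", quote(url, safe=""), quote(param_string, safe="")]
    )
    # 3. HMAC-SHA1 keyed with "<consumer_secret>&" (LTI uses no token secret).
    key = quote(consumer_secret, safe="") + "&"
    digest = hmac.new(key.encode(), base_string.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

launch_params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "assignment-42",   # hypothetical values
    "user_id": "student-001",
    "oauth_consumer_key": "example-key",
    "oauth_nonce": "abc123",
    "oauth_timestamp": "1380000000",
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_version": "1.0",
}
signature = sign_lti_launch(
    "https://tool.example.com/launch", launch_params, "example-secret"
)
print(signature)
```

The tool recomputes the same signature on receipt and, if it matches, trusts the launch parameters (who the user is, which assignment was clicked) without any prior per-institution integration work – which is exactly the "launch from the LMS" behavior described above.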

Today, in preparation for EDUCAUSE 2013 in Anaheim next week IMS has announced the Connected Learning Innovation Challenge!

The Connected Learning Innovation Challenge will feature IMS’s first ever “app challenge” and the establishment of a community of institutional and industry leaders that want to be at the forefront of encouraging a much more diverse and innovative future for educational technology – in real practice at real institutions – not as hype, but as tools that support what teachers and students want to do within the academic enterprise. Note: Kudos and salutations to Instructure Canvas for organizing the first ever LTI app challenge last May-June!

The motivation for the Connected Learning Innovation Challenge is described in a just-released EDUCAUSE Review article, A New Architecture for Learning, that I was fortunate enough to be able to collaborate on with Malcolm Brown, head of the EDUCAUSE Learning Initiative, and Jack Suess, VP of IT and CIO at University of Maryland Baltimore County. The article talks about what we as an educational community need to do to enable greater innovation in the connected age and introduces an unprecedented commitment of cooperation among some of education’s leading associations to help make it happen.

IMS Revolution Banner at EDUCAUSE 12

Last year at EDUCAUSE 2012 we introduced the IMS 10-100x Open Digital Innovation Revolution.  Is the revolution over? Just the opposite my friends – the revolution is burning like wildfire across K-20 education.  As of EDUCAUSE 2012 there were a cumulative 126 IMS conformance certifications. Going into EDUCAUSE 2013 that number is 210! Holy Toledo!  All conformance certifications are listed on IMSCERT.org.  It took roughly 3 years to achieve 126, but in the last year 84 new conformance certifications were achieved! And, the LTI catalog keeps growing - there are about 20 certified platforms now and a myriad of tools/apps.

So, how does the Connected Learning Innovation Challenge relate to the IMS Revolution? The “revolution” is like the paving of the road. As more platforms and applications are based on open standards and can work together with 10-100x less integration cost and time than before, well, then a lot more attention can be put into innovative vehicles to use the roads!  So, the Connected Learning Innovation Challenge – CLIC - is the logical evolution of the revolution –  focusing on what most people care about: great technology that can support or enhance teaching and learning.

To help understand CLIC, or to explain it to your colleagues, I’d like to provide the following talking points from my perspective (you can also visit the CLIC web pages here):

1. CLIC is about institutions working together to figure out how to enable and sustain support for a diverse set of teaching and learning applications (or non-educational apps favored by faculty and students) – integrations that can no longer take six months to happen. Thus, CLIC is a collaboration to make something happen that many institutions are currently trying to do on their own – but that makes more sense to work on collectively.

2. CLIC will accomplish #1 through a few very targeted outputs/activities:

  • Competitions to identify and financially reward innovative apps and platforms supporting connected learning
  • Open source sharing community for sharing things that submitters and/or institutions wish to share, such as tools, frameworks, apps, app gateways, etc. Open source “things” built on standards can be utilized cross platform – so, this is the first ever cross-platform open source initiative anywhere!
  • A facilitated leadership community via listservs and newsletters to keep all interested parties abreast of the happenings, organize the core advocacy/leadership and enable organic growth. There will be app evaluation activities and other community milestones. As an example of organic growth, while IMS will be conducting large-scale challenges, we will also encourage regional/institutional-level challenges in conjunction with tech fairs that institutions or others may already be conducting.

3. CLIC is NOT an IMS membership program. To lead, support or follow CLIC your organization does not need to be an IMS member. I’m sure lots of IMS member organizations will be supporting CLIC, and, of course, the IMS members made all this possible. But, think of CLIC more like the original IMS initiative organized by EDUCAUSE back in the mid-1990s. CLIC is a collaboration to make something happen without having a whole lot of formality behind it at the start other than the activities themselves. IMS has the chops to facilitate this, but we want it to go in the direction that the institutional leaders who get involved want to take it in terms of something more formal (or not).

Now, I’m going to say right now, from day one, that getting the most out of CLIC for the educational community will take leadership from institutions. Educators and their institutions are going to transform education with innovative technologies – and the CLIC community should be very productive for those wanting to help lead that charge. IMS can facilitate CLIC and put some legs underneath it – but we need institutional leadership, guidance, ideas and resources in terms of time and even financial contributions for those institutions that can. The other nice thing that IMS can bring is a way to sustain and continue the progress that CLIC makes.  IMS is a solid organization that has a track record of sustaining and evolving innovative technical work even as leadership is handed off and evolved among institutions and suppliers. If you represent an institutional interest in CLIC, I hope you will consider becoming an institutional advocate as some of your peers are - and we are very thankful indeed - we should really be able to get 100 institutional advocates for CLIC!

Finally, if you have not had a chance yet to view the short 3-minute video compilation of comments from Dr. Charles Severance of University of Michigan describing some of the motivations behind CLIC I highly encourage you to go to the CLIC landing page and view the video in the top left corner!

With IMS’s recent announcement of the upcoming e-assessment interoperability challenge we thought it would be a good time to discuss electronic assessment. Here is a Q&A with Rob Abel of IMS Global. Feel free to post additional questions and Rob will answer them (if he can)!

Q1: Is it time for electronic assessment in education?

A1: Yes, paper tests are more difficult to administer, take longer to process, are more prone to error and are not able to provide timely data to help improve instruction. Compared to a situation where paper textbooks may still have some usability advantages over digital e-books, paper assessments have no advantage at all over e-assessment.

Q2: Can e-assessment be used for summative or formative testing?

A2: Both.  E-assessment can be used for pure “high stakes test taking” scenarios as well as intermingled throughout other learning activities for formative assessment.

Q3: Is interoperability of assessment items important?

A3: Yes - very. In general, digital assessment enables new forms of collaboration. For instance, in various countries around the world there is a desire to enable school organizations to collaborate on item development – since many schools are testing on the same subjects. Standard formats for assessment items enable collaboration on and exchange of items without every organization needing to use the same software platform for item creation and/or delivery. It is becoming pretty clear, with historic collaborations such as that of the U.S. states on the Race to the Top Assessment initiative, that the era of the “single delivery platform that outputs PDF” is coming to an end. With interoperability of assessment items enabled by standards there is no reason to be locked into a single-vendor solution. Across the assessment community, replication of effort goes down, investment in proprietary solutions ends and more investment is focused on innovation.

Q4: Does IMS have standards and a community focused on assessment interoperability?

A4: Yes. IMS has two related standards that the assessment community worldwide should be making use of. The first is QTI (Question and Test Interoperability) and the second is APIP (Accessible Portable Item Protocol). QTI enables interoperability of assessment items and tests. The latest version is v2.1, which is the one that the assessment community is rallying around. A subset (profile) of an older version of QTI, v1.2, is used in Common Cartridge, which is a format for importing and exporting content into and out of learning platforms. APIP adds accessibility constructs to QTI to enable electronic delivery of a variety of accessible assessments.
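To make the item interoperability concrete, here is a minimal sketch of a single-choice item expressed in QTI v2.1 XML, parsed and scored with nothing but the Python standard library. The item content is invented, and a real QTI engine handles response-processing templates, adaptivity, and much more; this only illustrates that the item format itself is platform-neutral, so any conforming authoring or delivery system can exchange it.

```python
# Parse a minimal QTI v2.1 choice item and score a candidate response.
# The item content is a made-up example; real engines do far more.
import xml.etree.ElementTree as ET

NS = {"qti": "http://www.imsglobal.org/xsd/imsqti_v2p1"}

ITEM_XML = """
<assessmentItem xmlns="http://www.imsglobal.org/xsd/imsqti_v2p1"
                identifier="item-001" title="Capital of France"
                adaptive="false" timeDependent="false">
  <responseDeclaration identifier="RESPONSE" cardinality="single"
                       baseType="identifier">
    <correctResponse><value>ChoiceB</value></correctResponse>
  </responseDeclaration>
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" shuffle="false"
                       maxChoices="1">
      <prompt>What is the capital of France?</prompt>
      <simpleChoice identifier="ChoiceA">Lyon</simpleChoice>
      <simpleChoice identifier="ChoiceB">Paris</simpleChoice>
    </choiceInteraction>
  </itemBody>
</assessmentItem>
"""

def score_response(item_xml, candidate_choice):
    """Score a candidate's single choice against the item's declared correct response."""
    root = ET.fromstring(item_xml)
    correct = root.find(
        "qti:responseDeclaration/qti:correctResponse/qti:value", NS
    ).text
    return 1.0 if candidate_choice == correct else 0.0

print(score_response(ITEM_XML, "ChoiceB"))  # → 1.0
print(score_response(ITEM_XML, "ChoiceA"))  # → 0.0
```

Because the correct response travels with the item in a standard format, the authoring tool, the item bank, and the delivery platform can all be from different suppliers – which is the collaboration scenario described in A3 above.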

Q5: What about other types of interoperability that might enable more effective use of e-assessment?

A5: Yes. There is a very compelling need to use interoperability standards to enable assessment software platforms to “plug into” or connect with other software systems. So, this is the “assessment software product” as an LTI (Learning Tools Interoperability) tool provider, enabling the assessment platform to be seamlessly “launched” from a host system (like a learning management system). This type of “plugging in” can be useful in both formative and summative scenarios (depending on how the latter is administered). We see at least four types of assessment products beyond the state-level large-scale assessment that will benefit from this type of interoperability:

  • Standard quizzing/test authoring and delivery software that are typically used already with learning platforms
  • The increasingly popular “homework applications” or “adaptive tutoring applications,” which can also be viewed as formative assessment platforms
  • Classroom test creation and scoring systems – yes, including those using paper and pencil
  • Assessment tools used for competency-based degree programs, such as those used by Western Governors University.

Q6: What about interoperability of assessment data?

A6: This of course is also very important. QTI describes formats for item data – that is, how test takers answer questions. The latest IMS work on analytics – the IMS Caliper Learning Analytics Framework (see blog Q&A) – will leverage the QTI data formats as well as other assessment-related formats (e.g. gradebook data). Thus, assessment data can be provided “back” to a learning platform, an assessment delivery platform or an analytics store.
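As one concrete example of data flowing “back,” the LTI 1.1 Basic Outcomes service lets a tool replace a score in the host platform's gradebook by POSTing a small XML (“POX”) envelope. The sketch below builds such a payload with invented identifiers and score; a real exchange would also be OAuth-signed and sent over HTTP to the outcome service URL the LMS supplied at launch time.

```python
# Build an LTI 1.1 Basic Outcomes "replaceResult" payload (a sketch;
# the sourcedId and score are hypothetical placeholder values).
import xml.etree.ElementTree as ET

POX_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<imsx_POXEnvelopeRequest xmlns="http://www.imsglobal.org/services/ltiv1p1/xsd/imsoms_v1p0">
  <imsx_POXHeader>
    <imsx_POXRequestHeaderInfo>
      <imsx_version>V1.0</imsx_version>
      <imsx_messageIdentifier>{message_id}</imsx_messageIdentifier>
    </imsx_POXRequestHeaderInfo>
  </imsx_POXHeader>
  <imsx_POXBody>
    <replaceResultRequest>
      <resultRecord>
        <sourcedGUID><sourcedId>{sourced_id}</sourcedId></sourcedGUID>
        <result>
          <resultScore>
            <language>en</language>
            <textString>{score}</textString>
          </resultScore>
        </result>
      </resultRecord>
    </replaceResultRequest>
  </imsx_POXBody>
</imsx_POXEnvelopeRequest>"""

def replace_result_payload(sourced_id, score, message_id="msg-001"):
    """Build the XML body for a Basic Outcomes replaceResult call (score in 0..1)."""
    if not 0.0 <= score <= 1.0:
        raise ValueError("Basic Outcomes scores must be between 0.0 and 1.0")
    return POX_TEMPLATE.format(
        message_id=message_id, sourced_id=sourced_id, score=score
    )

payload = replace_result_payload("lis-result-abc123", 0.85)
ET.fromstring(payload)  # sanity check: the payload is well-formed XML
```

The `sourcedId` is an opaque handle the LMS hands to the tool at launch, so the tool never needs to know the platform's internal gradebook schema – the standard format carries the score across vendor boundaries.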

Q7: What about authentic assessment in the classroom or project-based learning?

A7: Any type of educational assessment, including e-assessment, is just a tool. It is one source of input. In our opinion assessment should be used to improve teaching and to improve learning. Thus, e-assessment plays an important role because it can provide real-time or near real-time feedback in a very transparent way – on a question-by-question basis (QTI enables such feedback), for computer adaptive testing, or simply through faster processing of an entire quiz or test. And that feedback can go to teachers, students, parents, etc. – whatever makes the most sense. And initiatives like Race to the Top Assessment are folding teacher evaluation of various “performance events” into the assessment mix. Mobile platforms and interoperable apps could obviously play a very important and innovative role in that regard, as could all types of assessment wrapped into apps or otherwise. We’ve already seen some fascinating use of QTI in the mobile setting via the Learning Impact Awards.

Q8: Why has IMS announced a Worldwide Assessment Interoperability Challenge?

A8: Use of interoperability standards such as QTI has in the past been rather flaky, in that each supplier implemented different versions and different subsets of functionality. Very few assessment product providers provided feedback to IMS to enable the issues to be resolved. As a result, interoperability was limited. Things have turned around radically in the last few years: IMS now has some 25 or so world-leading providers of assessment products actively involved in implementing QTI and/or APIP. As a result, IMS has been able to finalize these specifications and conformance certification tests in a way that will result in high levels of interoperability. The “challenge” is our way of saying to the world that we have a very strong core set of suppliers who have agreed to achieve conformance certification together over the next few months. Please come and join in for the good of your product development efforts and the good of your customers who desire interoperability that really works. The extra added “bonus” for participating is entry into the annual IMS Learning Impact Awards under special assessment product categories. Details on the “challenge” are here: http://apip.imsglobal.org/challenge.html

Q9: What if a region of the world wants to work with IMS on a regional profile of QTI or APIP?

A9: Yes, IMS is set up to facilitate that and has in fact been partnering in the Netherlands for the last two years on such an effort regarding national exams. Feel free to send us an email at assessmentchallenge@imsglobal.org.

Q10: What do you see for the future of e-assessment?

A10: We are at the very beginning of a long road ahead filled with many exciting product opportunities. As with many of the other IMS standards, like Common Cartridge and LTI, we are going to see a very dynamic, market-driven evolution of QTI and APIP. For instance, one of the other application areas we are working on at the moment is applying QTI to e-textbooks. E-assessment will permeate every aspect of digital learning materials and activities – with an emphasis on adaptive testing to help pinpoint where additional alternative materials and activities are needed. And, with the undeniable trend toward competency-based learning paths and credentialing, the need for better assessment is increasing. As with all of the IMS focus areas, the key will be for the technology of assessment to “get out of the way” and be simple and easy to use and benefit from.