Learning Impact Blog

Community leadership for more effective use of technology in service to education

Today IMS released our annual report for the calendar and fiscal year 2013.   See the press release.  See the annual report.

IMS annual report 2013 cover

Producing an annual report is a lot of work – and these days, when it seems like very few people have time to read, one might ask why we take the time and effort to do this. We first published an annual report for the year 2009 – so 2013 is the fifth edition.

IMS annual report 2009 cover

I think there were really two catalysts that got us to publish the report.

The first was that after I came into IMS as the CEO in 2006, it became obvious that not even the Board of Directors, much less all stakeholders in IMS, was getting accurate financial data and other metrics on the organization. First we corrected the situation for the Board, but then the Board also vowed that we should be providing this information to the members and the stakeholders.

The second catalyst was Jan Posten Day, who at the time was with Blackboard and is now with Pearson. As a member of one of our leadership committees in IMS, Jan was adamant that IMS should have an annual report. At the time Jan suggested this we were struggling to keep the organization afloat, and I pushed back on the idea because it seemed like we could not pull it off. But Jan’s insistence made an impression on me and the other staff – and I think it was within a year or so that we dug deep and got out the 2009 report.

As you will see in this year’s report, IMS has been growing nicely now for eight consecutive years. ims growth through 2013  

In fact, even though there has been quite a bit of churn in the member base over that time, the consistency of the net growth has been a little scary. It’s scary because we have looked long and hard and have not found a similar growth pattern in any organization like IMS. Indeed, during this same period most other organizations classified as “standards consortia” have generally been flat to declining. And, if you look at the historical patterns, standards consortia tend to grow very rapidly when first originated and then flatten or tail off.

So, IMS is an organization in uncharted territory. In my mind it is all about leadership in terms of which way it will go. IMS has provided a viable organization for those organizations, institutions and suppliers who wish to evolve an unprecedented collaboration to new heights. Or, those afraid of the disruption that IMS is enabling may slow it down. Every day I see forces on both sides of that equation and think it’s going to be very interesting indeed as we go forward.

However, I assure you that, means willing, IMS will keep publishing the report whether or not the results are as rosy as they have been. Indeed, when we began publishing the report we had no idea that the chart data would keep going up for the next five years!

But, here’s why I think the report is useful and why you should give it a look:

  1. In one relatively short document you get a full view of the work of IMS – which is not easy to see if you are focused on one or a few IMS initiatives.
  2. You can see how the organization is doing in terms of building momentum and in terms of financial strength.
  3. You can get a very good summary of the major thrusts of IMS and the key initiatives – and a concise commentary on why we are doing what we do.
  4. You can see the individuals and organizations that are leading IMS.
  5. It is a format that can be easily shared with someone to whom you might want to introduce the organization, or whom you want to update on IMS progress.

IMS architectiure

Hopefully the experience of perusing the report will give you a sense that IMS is indeed a non-profit organization worthy of your support, because IMS is changing the education and learning sectors for the better. And, if you look at the range of initiatives that IMS is undertaking, you can take pride that your support has made this progress possible. I assure you that without your support this work would not have happened – not only not in IMS, but most likely not anywhere. IMS is that unique in providing the leadership and collaboration needed for progress in the education and learning sectors.

As with most “things IMS” the annual report is a testament to leadership. Not the leadership of the IMS staff, but the leadership of the IMS members, both organizations and individuals (like Jan Day above) who are insistent that we must do better in enabling the next generation of education and learning! IMS community

Please excuse the long gap since the last post, folks. IMS is adding a lot of new members and staff supporting an unprecedented array of exciting initiatives – which has kept yours truly very busy the last few months.

We are now in the final push toward our annual Learning Impact event, May 5-8 in New Orleans, USA. While this is also a busy time, the event – and the run-up to it – is a great chance to talk about where we are and where we are going in IMS. We hope you will join the conversation! Consider this a first installment.

The (perhaps) provocative title of this post is actually a question that we are sometimes asked. After all, IMS is very much a “bottom-up” meritocracy, like many other organizations that develop interoperability standards. Most of the ideas in IMS, and certainly the best ideas, come from the individuals who are participating on behalf of their member organizations. And, IMS is a true membership organization (legally organized as such) that provides a level playing field for organizations of all sizes – a construct that we think provides a very good structure for what we do, as previously described here. So, when the members speak – we listen – and usually act.

IMS does have a strategy. IMS has an elected Board of Directors that helps formulate the strategy. But the strategy is very organic, flowing and dynamic. New ideas brought forward by the members go through a certain “due diligence” that occurs by putting the idea in front of key stakeholders – those most motivated to act – and adjusting accordingly (including sometimes putting an idea on the shelf until there is further interest). Having much experience in the venture capital world, I will tell you that it is much like the funneling of ideas/business plans that every VC firm goes through in terms of looking at the risks and opportunities involved.

So, the resulting IMS strategy is a function of bubbling up, testing (against the critical concepts of adoption and learning impact) and organizing into something as coherent as we can make it given what is actually happening in the sector and various sub-segments. And occasionally adding some key missing pieces that for whatever reason have not bubbled up – for instance, when members are not willing to share in an area where it is actually good for them to share.

For the past several years this process unfolded into an IMS strategy centered on what we have called the “Digital Learning Services” (DLS) standards, focused on (but not limited to) Common Cartridge, Learning Tools Interoperability (LTI) and Learning Information Services (LIS).

The strategic theory behind the DLS focus was that together these standards would solve a very large percentage of the integration challenges in/with the education enterprise. And, in fact, while different pieces have evolved and been adopted at differing rates, we think this thesis has largely turned out to be on target. See the accompanying charts on growth in IMS membership during this strategy and growth more recently in the conformance certifications that are the market adoption proof point. Notice the 97 certifications in 2013 – almost two a week. So far in 2014 we are averaging close to three a week. In other words, this strategy is still taking hold – and clearly it is taking hold in a big way!




However, the IMS strategy has definitely shifted beyond DLS in the last year or so. First, e-assessment, an area in which IMS has had activity for a long while via QTI (a subset of which is covered in Common Cartridge), became a hot area. The very simple idea that electronic assessments, done right, are much more affordable and scalable than paper assessments, coupled with the very obvious idea that there should be open formats to enable the e-assessment ecosystem of suppliers and states, has come of age (both in the U.S. and in other nations such as the Netherlands). Second, now that the IMS DLS standards are working – radically reducing the cost, time and complexity of seamless integration – our attention is naturally turning to what can be enabled with the standards.

While there may not be complete agreement in the IMS community (given its size and diverse nature) over what we should be enabling with the standards, here are the current thoughts – and thus, the strategy going forward:

  1. The power of LTI (first v1 and now v2) to reduce the cost and time of achieving seamless integration by 10-1000x will soon lead to 1-click integration. IMS-enabled applications will auto-negotiate which IMS services are supported – thus revolutionizing the ease with which standards-based applications can be incorporated into the teaching and learning process.
  2. #1 will enable a very diverse open ecosystem of new types of learning platforms and applications, potentially rearranging the ordering of integrations – very much an “app-to-app” model of cooperation with or without a learning management “system” in the middle.
  3. Merging LTI with the IMS work on student information (LIS) and course planning and scheduling (CPS) exchange to continue to open up the educational enterprise via easy to use standards.
  4. Establishing and growing the “educational app community” – like an open source community on steroids that builds things that work across platforms (the “things” may be open source or not, but there should be open source tools to enable this). This is a remarkable new type of community indeed – suppliers and institutions working together across platforms – kind of like the World Wide Web but focused on the education vertical.
  5. Enabling what most refer to as e-books or e-texts as a highly interoperable format across a wide variety of e-readers/mobile devices for the needs of learning and education.  See EDUPUB.
  6. Making instrumentation / measurement of learning activities easy to enable collection of analytics – big and small data. See Caliper Analytics.
  7. Including everything we’ve learned and are learning about e-assessment across #4-6, meaning that we’ve got the standards to enable innovative assessment apps, enable assessment in e-texts and enable easy instrumentation of assessment in learning platforms and apps (via Caliper and the outcomes standards developed on QTI/APIP).
  8. Utilizing the standards to create an open source reference implementation of a peer-to-peer app sharing framework that can be used to do, well, what it says – share apps with trusted partners – and encouraging the use of standards to do this: thus enabling a standards-based “app store” or “app sharing” equivalent to iTunes, etc. See CASA.

Perhaps though, most importantly, IMS is making great progress with our end-user/institutional led groups to ensure that all of these initiatives are in fact getting them where they want to go.  Our K-12 district advisory board (I3LC) continues to grow and our new HED connected learning advisory board is shepherding the app community, the app sharing architecture, analytics and competency-based learning initiatives.

Hopefully you will see the evolution of the IMS strategy in the above. The IMS community is making change happen in some very substantial ways and I invite you to partake at the May 5-8 Learning Impact event – where the breakout tracks mirror the strategy areas above and the plenary sessions undertake the broader discussion  of “why” we are doing this in terms of the emergent models of education that we wish to enable.


Today IMS Global announced the release of the Learning Impact Report. The Learning Impact Report endeavors to provide an analysis of the winners of the IMS Learning Impact Awards (LIAs). The LIAs have been an annual global competition since 2007. However, this is only the second report – the first was a brief summary analysis released in 2010.

2013 LIA Report

As opposed to awards for the latest and greatest products, the LIAs are based on evaluation of use of technology “in context” at an institution, a state, or sometimes across an entire nation or continent. The context provides evidence across eight dimensions of impact that experts use to provide ratings that are used to select the winners. The judges select the winners – IMS does not. The idea behind the analysis provided in the Learning Impact Report is that looking across the winners in the current year, as well as historically, can provide some insight into the types of projects, initiatives, and R&D that are having impact or impact potential.

The other important factor to understand is that IMS as an organization has over 15 years of history of being on the leading edge of pretty much every type of educational technology, including learning management, e-portfolio, e-assessment, learning design, etc. Thus the entries somewhat overrepresent developments ahead of the general market.

The net-net of these factors is that we thought there might be some interesting information to be obtained by looking across the medal winners and the finalists (those selected to come to the Learning Impact conference to be considered for a medal). It is certainly very interesting to simply attempt to ascertain the high-impact “project categories” – which we had to develop ourselves by looking across the nominations (as the submitters did not invent these project categories, nor were they asked to submit in a project category).

We are hoping now to be able to release the Learning Impact report annually, largely because of the institutional leadership behind it (please contact me if interested in becoming involved in the annual report).

Even though I feel that the above explanation is fairly obvious with respect to the uniqueness of what the Learning Impact Awards focus on, I wanted to provide here a bit of an excerpt from the Learning Impact report that helps explain how it is different from probably the widest read report on new technologies in the education space, namely the Horizon Report.  Here is that excerpt:

In terms of comparison to other reports there may be a temptation to compare the Learning Impact Report to the annual Horizon Report(s), of which there are K-12, HED and regional editions. However, because the Learning Impact Report takes the approach of focusing on project types rather than attempting to identify specific technologies and their adoption timeframes (as is the nature of the Horizon Report), the two reports are quite complementary.  The reader of this report and any version of the Horizon Report can draw their own conclusions by comparing and contrasting the information provided. To illustrate, the following bullets are a couple of examples of how this Learning Impact Report could potentially help clarify technologies placed in the “one year or less” time to adoption horizon from the 2013 Horizon Report.

  • Massive Open Online Courses: The Learning Impact analysis would see MOOCs as a type of “Blended Learning Optimization” project. As shown in Figure 3A, these types of projects have not yet achieved mainstream effectiveness in the opinion of IMS. That does not mean that there is not a particular instance of a MOOC that has been effective. What it does mean is that from IMS’s perspective, based on the cumulative evidence, the widespread, high impact adoption of projects in this category is not apparent in the near term. Thus, we would potentially modify the Horizon Report’s findings by pointing out that (a) there are many variations of the Blended Learning Optimization concept that institutions should be considering depending on their goals (some examples of which are given in this report), and (b) these are not easy projects to implement at this point in time.
  • Tablet Computing: Tablets have definitely exploded onto the education scene. From IMS’s perspective we ask whether they are indeed being leveraged to improve Learning Impact. In the 2010 Learning Impact Report we identified the category of Mobile Learning Resources as being in its early stages. However, in the current report we have eliminated that category because literally all other project categories need to in some way encompass the requirements of mobile devices. IMS has also seen some very innovative and high scoring projects that have had tablets as a primary platform, some of which are now appearing in the Platform Innovation category, but may also appear in other categories depending on the project focus. However, improving Learning Impact specifically from the deployment of tablets typically requires an adjustment of teaching and learning models as well as technology being integrated in new ways. Therefore, we have primarily seen pilot projects that require substantial resources to put in place. Thus IMS would conclude that while tablets are a given, achieving substantial impact requires further development.

Finally, it should also be noted that in the production of this report IMS takes advantage of a unique viewpoint of the educational technology landscape facilitated by a flourishing collaboration among many of the world’s leading educational technology providers and institutions occurring in IMS’s many face-to-face meetings worldwide. To enter the Learning Impact Awards competition look here.


One of the great things about the annual EDUCAUSE conference is hearing the many stories about how IMS standards have enabled innovative new software applications to easily integrate into the educational enterprise. You might think that IMS knows everything about every application of IMS standards. I'd estimate that we typically know about 1/3 of what is actually occurring "out there" - just based on some off the cuff measurement by how often we are surprised or not surprised by something we hear about.  The very weird thing is that sometimes the things we don't hear about are really big adoptions of IMS.

Anyway, please let us know what you are doing so we can help get the word out!

Ray Henderson has recently posted this blog: My Investment Thesis for IN THE TELLING about a start-up he has invested in called "In the Telling." As you can "tell" by the name the product has something to do with "stories."  The more mundane name for what is being offered here is "flipped classroom" - use the out of class time to watch the lectures, use the in the class time for more meaningful interaction.

The problem is that getting students to do anything out of class is a challenge these days. So, In the Telling provides a unique approach that helps the instructor create a story with narration. In essence they are creating a documentary of sorts that is more compelling than a simple lecture.

I have not seen any of the output of In the Telling yet - but the idea is very intriguing.  As someone who has bought more than my share of "great lectures" on various media in which I never made it past the first 30 minutes . . . well, I think better ways to teach is what we need to be investing in.

But, the crowning achievement with respect to IMS comes in the following words from Ray's blog:

COMPATIBLE WITH ALL MODERN LEARNING MANAGEMENT SYSTEMS: The Company designed their solution assuming that the launch point for most learner experiences would begin within an LMS. This is, after all, the way most assignments are made. The platform is built using the IMS Global’s open standard for systems integration—Learning Tool Interoperability or “LTI”—which most modern LMS platforms now natively support. Students can initiate sessions with the platform just as they might with any other assignment, and the same basic usage statistics recorded by the LMS are preserved.

IMS is very proud to be a part of enabling the rapid rise of innovation in the edtech community!  
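For readers curious what the LTI launch Ray describes actually involves under the hood, here is a minimal sketch of signing an LTI 1.x basic launch with OAuth 1.0 HMAC-SHA1, using only the Python standard library. The parameter names follow the LTI 1.x specification, but the launch URL, consumer key, and secret are hypothetical placeholders, and a real implementation would use a fresh nonce and timestamp per launch.

```python
import base64
import hashlib
import hmac
from urllib.parse import quote


def _pct(value):
    # RFC 3986 percent-encoding, as OAuth 1.0 requires
    return quote(str(value), safe="")


def sign_lti_launch(url, params, consumer_secret):
    """Compute the oauth_signature for an LTI 1.x POST launch (HMAC-SHA1)."""
    # Sort the encoded parameters and build the OAuth signature base string
    pairs = sorted((_pct(k), _pct(v)) for k, v in params.items())
    param_str = "&".join(f"{k}={v}" for k, v in pairs)
    base_string = "&".join(["POST", _pct(url), _pct(param_str)])
    signing_key = _pct(consumer_secret) + "&"  # LTI launches use no token secret
    digest = hmac.new(signing_key.encode(), base_string.encode(),
                      hashlib.sha1).digest()
    return base64.b64encode(digest).decode()


# Hypothetical launch: the URL, key, and secret are illustrative placeholders.
launch_params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "assignment-42",
    "user_id": "student-123",
    "roles": "Learner",
    "oauth_consumer_key": "example-key",
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_timestamp": "1400000000",
    "oauth_nonce": "demo-nonce",
    "oauth_version": "1.0",
}
launch_params["oauth_signature"] = sign_lti_launch(
    "https://tool.example.com/launch", launch_params, "example-secret")
```

The signed parameters would then be POSTed to the tool, which verifies the signature with the shared secret – that shared-secret handshake is what lets the LMS and the tool trust each other without any custom integration code.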

Today, in preparation for EDUCAUSE 2013 in Anaheim next week, IMS has announced the Connected Learning Innovation Challenge!

The Connected Learning Innovation Challenge will feature IMS’s first ever “app challenge” and the establishment of a community of institutional and industry leaders that want to be at the forefront of encouraging a much more diverse and innovative future for educational technology – in real practice at real institutions – not as hype, but as tools that support what teachers and students want to do within the academic enterprise. Note: Kudos and salutations to Instructure Canvas for organizing the first ever LTI app challenge last May-June!

The motivation for the Connected Learning Innovation Challenge is described in a just-released EDUCAUSE Review article, A New Architecture for Learning, that I was fortunate enough to be able to collaborate on with Malcolm Brown, head of the EDUCAUSE Learning Initiative, and Jack Suess, VP of IT and CIO at the University of Maryland, Baltimore County. The article talks about what we as an educational community need to do to enable greater innovation in the connected age and introduces an unprecedented commitment of cooperation among some of education’s leading associations to help make it happen.

1 of 3 IMS Revolution Banners at EDUCAUSE 12

IMS Revolution Banner at EDUCAUSE 12

Last year at EDUCAUSE 2012 we introduced the IMS 10-100x Open Digital Innovation Revolution. Is the revolution over? Just the opposite, my friends – the revolution is burning like wildfire across K-20 education. As of EDUCAUSE 2012 there were a cumulative 126 IMS conformance certifications. Going into EDUCAUSE 2013 that number is 210! Holy Toledo! All conformance certifications are publicly listed. It took roughly three years to achieve 126, but in the last year 84 new conformance certifications were achieved! And, the LTI catalog keeps growing – there are about 20 certified platforms now and a myriad of tools/apps.

So, how does the Connected Learning Innovation Challenge relate to the IMS Revolution? The “revolution” is like the paving of the road. As more platforms and applications are based on open standards and can work together with 10-100x less integration cost and time than before, well, then a lot more attention can be put into innovative vehicles to use the roads!  So, the Connected Learning Innovation Challenge – CLIC - is the logical evolution of the revolution –  focusing on what most people care about: great technology that can support or enhance teaching and learning.

To help understand CLIC, or to explain it to your colleagues, I’d like to provide the following talking points from my perspective (you can also visit the CLIC web pages here):

1. CLIC is about institutions working together to figure out how to enable and sustain support for a diverse set of teaching and learning applications (or non-educational apps favored by faculty and students) – support that can no longer take six months to happen. Thus, CLIC is a collaboration to make something happen that many institutions are currently trying to do on their own, but that makes more sense to work on collectively.

2. CLIC will accomplish #1 through a few very targeted outputs/activities:

  • Competitions to identify and financially reward innovative apps and platforms supporting connected learning
  • An open source community for sharing things that submitters and/or institutions wish to share, such as tools, frameworks, apps, app gateways, etc. Open source “things” built on standards can be utilized cross-platform – so, this is the first ever cross-platform open source initiative anywhere!
  • A facilitated leadership community via listservs and newsletters to keep all interested parties abreast of the happenings, organize the core advocacy/leadership and enable organic growth. There will be app evaluation activities and other community milestones. As an example of organic growth, whereas IMS will be conducting large-scale challenges, we will encourage regional/institutional-level challenges in conjunction with tech fairs that institutions or others may already be conducting.

3. CLIC is NOT an IMS membership program. To lead, support or follow CLIC your organization does not need to be an IMS member. I’m sure lots of IMS member organizations will be supporting CLIC, and, of course the IMS members made all this possible. But, think of CLIC more like the original IMS initiative organized by EDUCAUSE back in the mid-1990’s. CLIC is a collaboration to make something happen without having a whole lot of formality behind it at the start other than the activities themselves. IMS has the chops to facilitate this, but we want it to go in the direction that the institutional leaders who get involved want to take it in terms of something more formal (or not).

Now, I’m going to say right now, from day one, that getting the most out of CLIC for the educational community will take leadership from institutions. Educators and their institutions are going to transform education with innovative technologies – and the CLIC community should be very productive for those wanting to help lead that charge. IMS can facilitate CLIC and put some legs underneath it – but we need institutional leadership, guidance, ideas and resources in terms of time and even financial contributions for those institutions that can. The other nice thing that IMS can bring is a way to sustain and continue the progress that CLIC makes.  IMS is a solid organization that has a track record of sustaining and evolving innovative technical work even as leadership is handed off and evolved among institutions and suppliers. If you represent an institutional interest in CLIC, I hope you will consider becoming an institutional advocate as some of your peers are - and we are very thankful indeed - we should really be able to get 100 institutional advocates for CLIC!

Finally, if you have not had a chance yet to view the short 3-minute video compilation of comments from Dr. Charles Severance of University of Michigan describing some of the motivations behind CLIC I highly encourage you to go to the CLIC landing page and view the video in the top left corner!

With IMS’s recent announcement of the upcoming e-assessment interoperability challenge we thought it would be a good time to discuss electronic assessment. Here is a Q&A with Rob Abel of IMS Global. Feel free to post additional questions and Rob will answer them (if he can)!

Q1: Is it time for electronic assessment in education?

A1: Yes. Paper tests are more difficult to administer, take longer to process, are more prone to error and cannot provide timely data to help improve instruction. Whereas paper textbooks may still have some usability advantages over digital e-books, paper assessments have no advantage at all over e-assessment.

Q2: Can e-assessment be used for summative or formative testing?

A2: Both.  E-assessment can be used for pure “high stakes test taking” scenarios as well as intermingled throughout other learning activities for formative assessment.

Q3: Is interoperability of assessment items important?

A3: Yes - very. In general, digital assessment enables new forms of collaboration. For instance, in various countries around the world there is a desire to enable school organizations to collaborate on item development – since many schools are testing on the same subjects. Standard formats for assessment items enable collaboration on and exchange of items without every organization needing to use the same software platform for item creation and/or delivery. It is becoming pretty clear, with historic collaborations such as the U.S. states on the Race to the Top Assessment initiative, that the era of the “single delivery platform that outputs PDF” is coming to an end. With interoperability of assessment items enabled by standards there is no reason to be locked into a single-vendor solution. Across the assessment community replication of effort goes down, investment in proprietary solutions ends and more investment is focused on innovation.

Q4: Does IMS have standards and a community focused on assessment interoperability?

A4: Yes.  IMS has two related standards that the assessment community worldwide should be making use of. The first is QTI (Question and Test Interoperability) and the second is APIP (Accessible Portable Item Protocol). QTI enables interoperability of assessment items and tests. The latest version is v2.1 which is the one that the assessment community is rallying around. A subset (profile) of an older version of QTI, v1.2, is used in Common Cartridge, which is a format for importing and exporting content into/out of learning platforms. APIP adds accessibility constructs to QTI to enable electronic delivery of a variety of accessible assessments.
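To make the idea of an interoperable assessment item concrete, here is a hedged sketch that generates a single-choice item in the general shape of QTI 2.1 (an `assessmentItem` with a `responseDeclaration` and a `choiceInteraction`), using Python’s standard `xml.etree` module. This is an abbreviated illustration of the structure, not a complete, validated, conformant QTI item; the identifiers and question text are made up.

```python
import xml.etree.ElementTree as ET

QTI_NS = "http://www.imsglobal.org/xsd/imsqti_v2p1"


def build_choice_item(item_id, prompt, choices, correct_id):
    """Build a minimal QTI 2.1-style single-choice item (abbreviated sketch)."""
    item = ET.Element("assessmentItem", {
        "xmlns": QTI_NS, "identifier": item_id, "title": prompt,
        "adaptive": "false", "timeDependent": "false"})
    # Declare which response is correct
    decl = ET.SubElement(item, "responseDeclaration", {
        "identifier": "RESPONSE", "cardinality": "single",
        "baseType": "identifier"})
    correct = ET.SubElement(decl, "correctResponse")
    ET.SubElement(correct, "value").text = correct_id
    # The item body holds the interaction the test taker sees
    body = ET.SubElement(item, "itemBody")
    interaction = ET.SubElement(body, "choiceInteraction", {
        "responseIdentifier": "RESPONSE", "maxChoices": "1"})
    ET.SubElement(interaction, "prompt").text = prompt
    for cid, text in choices:
        ET.SubElement(interaction, "simpleChoice",
                      {"identifier": cid}).text = text
    return ET.tostring(item, encoding="unicode")


# Hypothetical item content for illustration only
xml = build_choice_item(
    "demo-item-1", "Which IMS standard covers assessment items?",
    [("A", "QTI"), ("B", "LTI"), ("C", "LIS")], "A")
```

Because the item is plain XML in a published format, any authoring or delivery platform that reads the standard can consume it – which is exactly the point of item interoperability.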

Q5: What about other types of interoperability that might enable more effective use of e-assessment?

A5: Yes. There is a very compelling need to use interoperability standards to enable assessment software platforms to “plug into” or connect with other software systems. So, this is the “assessment software product” as an LTI (Learning Tools Interoperability) tool provider, enabling the assessment platform to be seamlessly “launched” from a host system (like a learning management system). This type of “plugging in” can be useful in both formative and summative scenarios (depending on how the latter is administered). We see at least four types of assessment products beyond the state-level large-scale assessment that will benefit from this type of interoperability:

  • Standard quizzing/test authoring and delivery software that are typically used already with learning platforms
  • The increasingly popular “homework applications” or “adaptive tutoring applications”, which can also be viewed as formative assessment platforms
  • Classroom test creation and scoring systems – yes, including those using paper and pencil
  • Assessment tools used for competency-based degree programs, such as those used by Western Governors University.

Q6: What about interoperability of assessment data?

A6: This of course is also very important. QTI describes formats for item data – which describes how test takers answer questions. The latest IMS work on analytics – the IMS Caliper Learning Analytics Framework (see blog Q&A) - will leverage the QTI data formats as well as other assessment-related formats (e.g. gradebook data). Thus, assessment data can be provided “back” to a learning platform, an assessment delivery platform or to an analytics store.
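As a rough illustration of what sending assessment data “back” might look like, here is a sketch of an assessment-result event built as JSON, modeled loosely on the actor/action/object shape used by Caliper. The field names and identifiers here are illustrative assumptions, not the normative Caliper vocabulary, and a real emitter would follow the published Caliper profiles.

```python
import json
from datetime import datetime, timezone


def assessment_result_event(student_id, item_id, score, max_score):
    """Sketch of an assessment-result event in an actor/action/object
    shape loosely inspired by Caliper; field names are illustrative."""
    return {
        "type": "AssessmentItemEvent",
        "actor": {"id": student_id, "type": "Person"},
        "action": "Completed",
        "object": {"id": item_id, "type": "AssessmentItem"},
        "generated": {
            "type": "Result",
            "score": score,
            "maxScore": max_score,
        },
        "eventTime": datetime.now(timezone.utc).isoformat(),
    }


# Hypothetical student and item identifiers
event = assessment_result_event("student-123", "demo-item-1", 8, 10)
payload = json.dumps(event)  # what might be posted to an analytics store
```

The same payload could in principle be routed to a learning platform’s gradebook, an assessment delivery platform, or an analytics store – which is the flexibility the answer above describes.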

Q7: What about authentic assessment in the classroom or project-based learning?

A7: Any type of educational assessment, including e-assessment, is just a tool. It is one source of input. In our opinion assessment should be used to improve teaching and to improve learning. Thus, e-assessment plays an important role because it can provide real-time or near real-time feedback in a very transparent way – on a question-by-question basis (QTI enables such feedback), for computer adaptive testing, or simply through faster processing of an entire quiz or test. And that feedback can go to teachers, students, parents, etc. – whatever makes the most sense. And, initiatives like Race to the Top Assessment are folding teacher evaluation of various “performance events” into the assessment mix. Mobile platforms and interoperable apps could obviously have a very important and innovative role to play in that regard, as could all types of assessment wrapped into apps or otherwise. We’ve already seen some fascinating use of QTI in the mobile setting via the Learning Impact Awards.

Q8: Why has IMS announced a Worldwide Assessment Interoperability Challenge?

A8: Use of interoperability standards such as QTI has in the past been rather flaky, in that each supplier implemented different versions and different subsets of functionality. Very few assessment product providers provided feedback to IMS to enable the issues to be resolved. As a result, interoperability was limited. Things have turned around radically in the last few years: IMS now has some 25 or so world-leading providers of assessment products actively involved in implementing QTI and/or APIP. As a result, IMS has been able to finalize these specifications and conformance certification tests that will result in high levels of interoperability. The “challenge” is our way of saying to the world that we have a very strong core set of suppliers who have agreed to achieve conformance certification together over the next few months. Please come and join in for the good of your product development efforts and the good of your customers who desire interoperability that really works. An added “bonus” for participating is entry into the annual IMS Learning Impact Awards under special assessment product categories. Details on the “challenge” are here:

Q9: What if a region of the world wants to work with IMS on a regional profile of QTI or APIP?

A9: Yes, IMS is set up to facilitate that, and has in fact been partnering in the Netherlands for the last two years on such an effort regarding national exams. Feel free to send us an email at

Q10: What do you see for the future of e-assessment?

A10: We are at the very beginning of a long road ahead filled with many exciting product opportunities. As with many of the other IMS standards, like Common Cartridge and LTI, we are going to see a very dynamic, market-driven evolution of QTI and APIP. For instance, one of the other application areas we are working on at the moment is applying QTI to e-textbooks. E-assessment will permeate every aspect of digital learning materials and activities – with an emphasis on adaptive testing to help pinpoint where additional alternative materials and activities are needed. And, with the undeniable trend toward competency-based learning paths and credentialing, the need for better assessment is increasing. As with all of the IMS focus areas, the key will be for the technology of assessment to “get out of the way” and be simple and easy to use and benefit from.

This blog post is an interview with IMS’s Rob Abel to get to the bottom of IMS’s recent announcement of its Caliper Analytics Framework.

See related post on "small data".

Feel free to post additional questions and Rob will answer them (we hope)!

Q1: Is this project/announcement a big deal?

A1: We’ve got a lot of very impactful stuff going on in IMS these days, but enabling widespread adoption of analytics is one of the top priorities of IMS – with a mandate coming right from the IMS Board of Directors. But, perhaps more importantly, if we want to gain the full potential benefit of analytics and dashboards in education we need to make sure it is relatively easy to enable the transmission of data from any applications that can provide useful data. Interoperability standards, if done correctly, can help enable this.

Q2: There is lots of work going on in analytics, dashboards, etc.  Why is this a credible entry into the analytics space by IMS?

A2: Several reasons. IMS has had a distinctive focus on one of the potentially more fruitful but challenging data collection areas: learning applications and platforms. IMS knows this turf well and brings a large critical mass of members that cover a wide range of product categories and institutional needs. IMS also has a large installed base of applications already using its core standards, such as LTI (Learning Tools Interoperability), Common Cartridge, LIS (Learning Information Services) and QTI (Question and Test Interoperability). In other words, data is already flowing via these specifications, which provide conduits upon which more data can ride.

Q3: What is the focus of the analytics initiative versus other IMS specification work?

A3: IMS has a bunch of fast-moving task forces and leadership groups that work on applications of standards. This analytics work is coming from such a group that has been in existence about a year, but it leverages years of work by IMS on specifications for outcomes data for a variety of purposes – ranging from scores to gradebooks to assessment item data. The purpose of the analytics effort is to bring about many implementations of analytics feeds in as many products as rapidly as possible, adding a few new bits and pieces but largely leveraging existing work. In fact, the first proof-of-concept demonstrations will come very soon – at the next IMS quarterly meeting the week of November 4. There will be a Summit day there, Thursday, November 7, that will also focus on analytics – both the institutional and supplier perspectives.

Q4: What is the relationship with IMS LTI (Learning Tools Interoperability)?

A4: There’s a short answer and a longer answer. The short answer is that we expect this analytics work to finalize some work on outcomes (namely a very robust gradebook) that has been proposed for LTI & LIS for a while but not officially released yet, and that we expect the proliferation of LTI-enabled apps and learning platforms to be a natural starting point for data exchange. The longer answer has two components. The first is that we are leveraging the work of IMS members in specific LTI app/product categories to help develop the Learning Metric Profiles referenced in the Caliper whitepaper. The second is that we will be introducing a variety of LTI Services that will enable data to go to many different destinations from many sources, whether or not LTI was used as the launch mechanism for an app. So, for instance, an app could send data to an analytics store, dashboard or personal data vault whether or not it was launched by an LMS/learning platform.
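The “many sources to many destinations” idea can be sketched generically. The sinks below are hypothetical stand-ins, not part of any IMS specification; the point is simply that one learning event fans out to every registered destination regardless of how the originating app was launched:

```python
from typing import Callable, Dict, List

# Hypothetical fan-out: deliver one event to every registered
# destination (analytics store, dashboard, personal data vault, ...).
Sink = Callable[[Dict], None]

def dispatch(event: Dict, sinks: List[Sink]) -> int:
    """Send the event to each sink; return how many were notified."""
    for sink in sinks:
        sink(event)
    return len(sinks)

received: List[Dict] = []
# Two stand-in destinations that just record what they receive.
sinks: List[Sink] = [received.append, received.append]
count = dispatch({"actor": "student-42", "action": "Submitted"}, sinks)
```

In a real deployment each sink would wrap an HTTP endpoint or message queue; the routing logic stays the same.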

Q5: Why is this project developing and releasing the Sensor API?

A5: APIs can make implementation a little easier – especially if a large number of suppliers use the same APIs. IMS now releases APIs as best practices with many of its specifications. Please note that an “API” is by definition programming-language specific and a good standard is not. The standard is the underlying guts – that’s the hard part.
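For a feel of what a sensor might emit, here is a hedged sketch of a Caliper-style event serialized as JSON. The field names follow the general actor/action/object shape of Caliper's JSON-LD events, but this is illustrative only and should not be read as the normative schema (which defines required contexts, types, and profiles):

```python
import json
from datetime import datetime, timezone

# Illustrative only: a minimal Caliper-flavored learning event.
event = {
    "actor": {"id": "https://example.edu/users/554433", "type": "Person"},
    "action": "Viewed",
    "object": {"id": "https://example.edu/reading/1", "type": "Document"},
    "eventTime": datetime(2013, 11, 4, tzinfo=timezone.utc).isoformat(),
}

# A sensor would serialize the event and POST it to a collection endpoint.
payload = json.dumps(event)
```

Because every supplier emits the same shape, a consuming analytics store can parse events without per-vendor adapters – which is exactly the leverage a shared standard provides beneath any language-specific API.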

Q6: What if a product company has already developed an API for some category of data transmission – can that still be used with Caliper?

A6: Maybe. One of the cool things about the IMS specs and the development process behind them is that we can work with leading suppliers who already have services/APIs to see if we can “map them” on top of the IMS specifications. You may have developed some APIs that are now experiencing good market adoption for a specific type of service. IMS can potentially work with your organization to harmonize that with Caliper and the LTI services. Please contact us at:

Q7: What if IMS can’t get suppliers to agree on the Learning Metric Profiles?

A7: Well, we wouldn’t be doing this if we were not already seeing some excellent convergence. But, we also want to encourage, and fully expect, there to be extensions, both public and private, that IMS will capture in a registry. Thus we can have the stuff that everyone agrees on and the stuff that is new, above and beyond that, which is either publicly sharable or not. That’s how innovation in data and analytics is enabled by all of this.

Q8: What about applications that are kind of out of the learning domain, like CRM (Customer Relationship Management) systems?

A8: We see absolutely no reason why Caliper cannot add Metric Profiles for classes of systems like this and bring them into the mix. The Caliper Framework should be applicable to almost any type of system.

Q9: What if my analytics product wants to suck in every piece of data and every user interaction possible?

A9: Yes, big data. If you want to do that across more than one system you still need an agreed upon analytics feed. Caliper will cover that, even if a private solution is needed at first (see A7 above).

Q10: Will U.S. K-12 initiatives such as InBloom or Ed-Fi benefit from this work?

A10:  Yes, they certainly could! Caliper data can go to/from anywhere.  

At the 2013 Learning Impact conference I presented a keynote “Innovation, Disruption, Revolution – Oh My!”  I chose this topic because the degree of hype about “disruption” in education seems to be at an all-time high right now. BTW it's amazing how well the Gartner Hype Cycle fits the Wizard of Oz!

From Rob Abel's Learning Impact Keynote: Innovation, Disruption, Revolution Oh My!

Excitement about the role of technology in improving education is a good thing as far as I’m concerned. Education is a segment that needs disruptive innovation. To me the hype around things like MOOCs represents a longing by many for “a better educational future” – presumably involving lower costs to students and better career/life fulfillment, not to mention better global citizens needed to solve our global challenges. Let’s face it - there is a general sense that the current education system is not up to the challenges of the future. And, it’s not clear how we get from “here” to “there.”

But, as leaders in the education segment we do need to get better at understanding where we have been and where we are going: what constitutes innovation and/or disruption that is worthy of investment? Are you an investor? I would argue that any individual putting time into educational technology leadership at any level is an investor – as are institutions that are spending on innovation, and yes, venture capital investors. (Investors in this segment, including some pretty big names, are making some bad decisions right now about where they are putting their money – but this post is more about how institutions should decide to invest their resources.)

In IMS we try to ferret out “winners” by looking at criteria for something we call “Learning Impact” – which is defined by a set of judging criteria we use in our annual Learning Impact Awards (LIAs). You can think of it as evidence that the application of technology in an educational setting has had a clear impact on access, affordability and/or quality. We’re pretty proud of this program because there is simply no way to win with hype. But, in general the whole annual Learning Impact event is about understanding where the innovation really is.

Right now there are quite a few over-hyped activities in the education segment. I would include in these MOOCs, Common Core State Standards, analytics, badges and open educational resources (OER). Sorry if I tipped over one of your sacred cows there! Over-hyped does not mean that something good will not come from these developments. It just means that they are being portrayed as more significant in terms of ability to “disrupt” education than they will be capable of delivering on. As with most hyped innovations, eventually some aspects of the innovation “survive,” crossing the chasm to productive advancement of the industry. The challenge for innovators and investors is to apply critical thinking to discern what will sustain and what the real impact will be.

MOOCs are perhaps everyone’s favorite example of hype right now. It’s difficult to imagine how something could have achieved more hype in the higher education segment – and they are very clearly striking a nerve for being potentially highly disruptive. Literally every day MOOCs are in the headlines – and smart marketers are trying to jump on the MOOC bandwagon even as that bandwagon morphs continually to address the glaring deficits of the MOOC model and quite frankly, more failures than successes at this point.

Clayton Christensen, the leading expert on “disruptive innovation,” has written at least two books specifically focused on education. According to Christensen’s disruptive innovation theory markets are disrupted when new entrants figure out an innovative way to provide a “simpler” product to a wider set of buyers at a more affordable price.  Since the simpler product is actually what the broader market prefers (simpler means more usable, more effective for the desired purposes) the product is highly disruptive – better product at lower price point.  And thus, these new entrants change the market behavior substantially.  Jim Farmer does a very thorough job of digging into the theories as applied to MOOCs here.

While I, like many other Silicon Valley entrepreneurs, have found Christensen’s original formulations on disruptive innovation descriptive of what is generally seen in the high-tech world, and a useful thought framework, the problem is that these formulations have not been at all predictive for education. And that is the acid test for using theory for strategic gain: is it predictive? Writing about what happened in the past and putting a framework around it is great – but if it doesn't help predict the future it doesn’t especially help entrepreneurs or us “investors.”

The innovators in the education segment have NOT disrupted the status quo significantly so far. While Christensen predicted in 2008’s Disrupting Class great disruptions in the segment from online learning, the reality so far is that adoption of online learning has continued to grow as expected but has not been very disruptive: price points for education continue to rise ahead of inflation, and online education, while it continues to grow, is largely seen as reflecting traditional models rather than disrupting them. And, while online/blended models have certainly improved access, the percentage of the population that has achieved credentials has stayed essentially flat. I wrote a paper about the technology impact in higher education back in 2005-7 – and the situation is not significantly different today.

Christensen was recently quoted in an interview as stating that higher education institutions are going to be in real trouble 5 years from now. However, he has not made it clear why things will be different in the next 5 years versus the last 5 years since Disrupting Class was published. I do agree that “buyers” (students/parents) are in fact getting smarter about looking for lower cost options as well as attempting to understand the value of a higher education degree.  But I would argue that we are nowhere near a true disruption (rise) in the number of participants in higher education.

While it is very valuable to have any great thinker on business strategy analyzing education and providing insights from other industries that might apply (like Christensen), education leaders need to do their own critical thinking about these formulations of “disruptive innovation” in the education segment.

I’d like to provide three key factors that IMHO must be understood to make sense of disruption in education. While these factors may be relevant to understanding MOOCs, they can be applied to other hype areas as well. Hold on to your hat here, as these are things that we don’t hear much talk about in the education space – especially when you go to meetings on one of the hyped topics or even to investor conferences!

  1. Education is a complex services business in which quality is difficult to define. Disruption in the education space requires better service models that are built around improved educational program quality. Comparison of education to the disruption of the steel industry by mini-mills (a connection some have made because Christensen uses this as a classic example of disruption) is not a valid comparison. Disruption in education is not about replacing the low end of a well-defined product. It’s about redefining quality in a much more complex world of knowledge than that from which most current educational models were designed.  So, for instance, a true disruption in education would be a highly desirable/effective K-masters degree in 15-16 years versus the current 19 years (ideally one that meets the needs of underperforming populations as well as traditionally successful populations).
  2. The next phase of true progress will be to come out of the current era of massification into a new era of more real-world-relevant and personalized educational pathways.  Massification of educational experiences will not be the ticket to success in the next wave of educational models. It seems that many of the entrepreneurs and investors in the education space fail to realize that we have already been in the era of massification of education for the last 30 years or so in developed countries. The disruption in terms of content will need to be content that enables educational experiences that are up-to-date (read “relevant”), adaptive to the interests of the learner, easily adaptable by teachers and yet thorough in terms of teaching the educational foundations defined via #1.  No offense to our friends at Coursera (one of the growing number of MOOC providers), but “course era” is an especially off-the-mark name for describing the future IMHO.  It is NOT a course era now in education nor will it be in the future. It is the same era that it has been: one in which a distinctive program of high quality education will be highly valued. It's just that we need to do a better job getting such distinctive programs to the currently underserved populations at a better price point. Yes, there will be different sources of supporting digital resources that will help enable the redesign of educational delivery (potentially a role for MOOC courses). As discussed in other LI blog posts (here & here) digital resources will need to be in the form of a highly usable toolkit for faculty and learners.  But, these supporting resources have a VERY long way to go to meet the needs of learners. And, the more available (i.e. free, open, massive, etc) a course or content becomes, the faster it will become commoditized.
  3. Education is the ultimate “long tail” market with a growing proliferation of high value niche providers and boutique programs. This is only going to increase in the future. Contrary to other recent online phenomena like Facebook or Twitter, education is not a “winner take all” market. I was at an education investor conference earlier this year where a leading investment advisor made the statement that education is the “ultimate winner take all market.” I could not disagree more. Education, if done correctly, is life success enabling. The more unique and distinctive your educational experience is, the more valuable it is. The ability to produce knowledge is the key currency in the current and future global economy. There is no distinctiveness to attending “Massified University.” And, a credential from a massive provider will most likely be such a commodity that most will prefer not to waste their time on it (other than for pure fun or reference). We need many more niche-oriented institutions that provide specialized, career-enabling and life-enabling education. Note that even large public institutions, while having many students, can engage this philosophy to create a large number of differentiated programs of study.  Therefore, the “disruptive technology platforms” for education will need to enable great diversity: diversity in terms of delivery models and blending of high touch with personalized self-service.  Disruptive platforms will also need to enable seamless integration among cooperating providers of the various components of a solution – meaning close partnership among institutions as well as innovative learning tools.  Old-style Web 2.0 thinking of the single pervasive platform or the single way of analyzing data will not work for the future of education. Education is not that simple (sorry!).

Are MOOCs potentially promising innovations? Yes. They are clearly an evolution of the trend toward pervasive online/blended/more flexible educational models/flipping the classroom. Will they disrupt education?  Not on the current trajectory. They pretty much fly in the face of the three tenets above.

But, there are some potential ways in which MOOCs could be disruptive in a more limited way.

I think MOOCs have the potential to be disruptive on the low end of the education market as a new model for delivering “open university style education.”  For example, today’s MOOCs could be an appealing alternative to the content portion of massive open universities around the world, featuring star professors from highly ranked universities. And, such populations of learners could probably support an advertising/low-course-fee model of consumption. So, open university providers would get a “pay per use” version of content (versus the larger investments they must make today) that is likely better than what they are offering now. Subsidized by advertising – with net revenue equal to a small pay-per-use fee from the open university provider coupled with advertising revenues – this could equate to substantial revenues over a large number of users with similar interests (as indicated by the topic of study).

In such a model MOOCs could also double as a new type of “tutorial” reference material. Very much like Khan Academy, which is often referred to as a MOOC now even though it existed for a few years prior to the term.

Would this application of MOOCs be disruptive? Well, if you consider displacement of open universities and/or a new business model for them disruptive, then yes.  This approach could potentially disrupt the open university market worldwide and create a much larger interest in open university derivative models as a way to learn more about a particular topic and/or as preparation for entry to other universities.  Will they substitute for those universities? No. Such a disruption would not radically change the efficiency of the higher education segment at all.

Is there a strong motivation for current providers of education to engage in this model? Unclear. As previously discussed the more you commoditize an educational experience the less valuable the education is. As discussed above the next stage in education is greater diversification, not massification. Personally I would like to see existing institutions respond to the three factors above by rolling out new “institutes” with new types of degree programs that meet the evolving needs of society.

Should open universities or other new entrants that wish to compete in that space try a MOOC/advertising/low course fee supported model?  I hope so.  That would be a good fit with the “course-focused” strength of MOOCs.  It would also provide a potential revenue producing and marketing outlet for institutions and their “star teachers.” Of course there are the normal challenges of achieving accreditation from the countries/states as needed to prove value to the potential students.

So, hooray for the MOOC phenomenon in terms of making claims about needed innovations such as scale, analytics and the potential power of star teachers that will help accelerate the innovation trend toward online/blended that we are already on! And, congratulations to the leading institutions willing to make a space to try out the MOOC innovations as another set of tools that might be leveraged in the quest for improved education.

However, I would say that if you don’t know at this point why you are investing (in terms of what impact you expect will sustain at your institution, and therefore what the return on your investment is likely to be) then you are not applying the level of critical thinking you need to. Personally I would be asking MOOC providers to take the risk in terms of proving the validity of the market/business model (such as the ideas around open university displacement) – and not taking on that risk for my university. 

[Note: In a future blog post I will take a whack at what I think will be the sustainable innovations that might cross the chasm coming out of some of the hype items mentioned above: MOOCs, Common Core State Standards, analytics, badges and open educational resources (OER).]

See more of Rob Abel’s views on educational innovation throughout the Learning Impact blog as well as this feature interview with Anthony Salcito of Microsoft.  

Special Blog from AACC Annual Convention @Comm_College #AACCAnnual

Regardless of what your view on how “liberal” a program of study at a college should be, it seems to be a fair assumption that colleges should help qualify students for a good job and great career. Especially considering the high debt loads that students in the U.S. are incurring to get a college degree – they need a good job to pay for it.

What is the role of interoperability in educational data? As I have posted elsewhere, IMS is working diligently on interoperability of both big data and small data. We are aligning ourselves closely with the needs of institutions leading the charge on competency-based education credentialing. And, we are strong supporters of the U.S. Department of Education and White House “My Data Button” initiative.

Today I had the privilege of moderating a panel on “closing the gap” between college offerings and the world of work. Our distinguished panelists were:

Debra Derr, President, North Iowa Area Community College

Richard Carpenter, Chancellor of Lone Star College System, and Chair of the Texas Association of Community Colleges

Shah Ardalan, President, Lone Star Community College, University Park

North Iowa Area Community College and Lone Star College System (Houston) represented a great range in scale in this conversation, with NIACC being a small community college and LSCS being one of the largest and fastest growing in the U.S.

However, despite the range in size, the best practices were in agreement:

  1. Understand what your student and faculty expectations are with respect to use of technology and technology innovation.
  2. Partner with organizations who have knowledge and expertise to avoid having to reinvent the wheel in terms of deploying new technology.
  3. Build close ties to local industry to understand the needs of employers.
  4. Provide better resources to help students understand employment opportunities, and in general what in the world their degree is qualifying them for.
  5. Move more toward competency-based programs and student documents (evidence of competencies) that can be owned by the student to be used in their quest to match career opportunities.

Richard Carpenter challenged the audience of community college leaders to transform what colleges can do for students by enabling students to “own the student record.” This is a massive paradigm shift from the last 40 years, in which institutions have been the owners of student data – and today’s panel questioned whether that model is good enough for the future.

Obviously there is nothing wrong with institutions being the keepers of authoritative records about student achievement. The problem occurs when students and parents realize that they have paid for an education for which they have little to show except a transcript. Thus, the challenge by Chancellor Carpenter, and echoed by the other panel participants, is that institutions need to help students understand opportunities, create and organize the artifacts from their learning according to critical competencies, and ultimately enable students to “take this with them” throughout their lives.

Lone Star College has championed a new service called the Education and Career Positioning System – which has been launched as an online service for students in which they can own their data.

IMS Global has been working hand-in-hand with Lone Star on this initiative because we believe this is an absolutely critical element of the IMS Open Digital Innovation Revolution in Education, namely opening up the campus systems so that students can connect their academic accomplishments to career and academic opportunities. Obviously IMS open standards can play an important role in opening up the data and artifacts created in a myriad of educational software for export to the student record: the one owned by the student of the future.


A special featured keynote by Bob Mendenhall, President of Western Governors University at Learning Impact 2013

It really doesn’t matter who you talk to in the education field. Literally all agree that doing a better job of understanding competencies is the way that education needs to move. It’s about what you know and what you can do, rather than what course you took and what grade you got in that course.

Western Governors University (WGU) is the recognized leader in competency-based non-profit higher education in the U.S. We are very pleased that President Bob Mendenhall will be joining us at this year’s Learning Impact 2013 to tell the WGU story and participate in a panel on higher education leadership.

As we do with many of our keynotes at Learning Impact we have published a brief interview article with Bob.

In some respects this is sort of a “coming out” for WGU in that they have been replacing proprietary integrations (those very popular “open APIs” that every vendor likes to promote) with open standards-based integrations using IMS LTI. As the article mentions, WGU has quietly replaced 20 such custom integrations with LTI over the past several years, with probably 30 or so more to go!

Which brings us to the very critical link between competency-based learning and open interoperability standards.  The reason why WGU has so many different applications to integrate is that the best resources in different fields come from different providers.  You might think this circumstance is unique to WGU. It is not, even today – and it will be less so in the future. That’s because your departments and faculty want to use these sorts of resources – or will be wanting to – and, if they are doing so now, they are probably doing it WITHOUT INSTITUTIONAL INTEGRATION AND SUPPORT.  Sorry to get loud there, but frankly we are finding that many educational CIOs need to be woken up to both the challenges and opportunities (for better service) to departments and faculty. Well, in a nutshell, IMS standards are all about enabling this – just as is happening at WGU.

There is more information on how to join this open digital innovation revolution, including two special programs to aid higher ed involvement/adoption (called THESIS) and K-12 involvement/adoption (called I3LC).