Category Archives: K-12 Schools

Application to the K-12 or schools segment

Can the Education Sector Lead Learning Tech Impact (and Tech Stds)?

Many thanks to Michael Feldstein of the e-Literate Blog for the insightful post on IMS progress entitled The IMS Is More Important Than You Think It Is.

Michael and Phil Hill have been so successful with the e-Literate Blog because of their intimate understanding of the education technology sector.  The funny thing about the title of this recent blog post about IMS is that even I, as the guy sort of in charge over here at IMS, often have the same sentiment – namely that IMS may be more important than even I think it is!

To explain, I will highlight a few statements from Michael's writing and elaborate a bit, all in the spirit of a teachable moment. The key foundation here is to understand that when we reorganized IMS beginning eight years ago, we took a pretty radical approach (while trying not to appear radical) of turning a technical standards organization upside down. Rather than treating standards for educational technology as the most important thing, we took to heart that standards are only a means to an end. That end is what we termed "Learning Impact": the impact that technology can have on transforming and improving education and learning. If that seems a bit ethereal to you, it's not: the event Michael wrote his impressions from is our annual meeting, the Learning Impact Leadership Institute. This is NOT a meeting of standards geeks (even though we all have a bit, or maybe a lot, of that in us) but rather a meeting of those wishing to lead educational transformation.

Michael: “I have long argued that the development of technical interoperability standards for education are absolutely critical for enabling innovation and personalized learning environments. Note that I usually avoid those sorts of buzzwords—”innovation” and “personalized learning”—so when I use them here, I really mean them.”

Rob's elaboration: Michael gets that IMS is all about innovation, but lots of folks misunderstand what goes on in a standards organization like IMS. Some standards are about picking one of several options of a technology already developed. My favorite example is picking a gauge to standardize railroad tracks. IMS standards, however, are for technology that is new. These types of standards are all about enabling the distribution of innovative practices and technologies. Thus, some will fail, but others will enable wider innovation. Working in IMS is as much or more about defining the innovation and enabling it as it is about locking down a potential standard.

Michael:  “But arriving at those standards often feels . . . painful, frustratingly slow, and often lacking a feeling of accomplishment. It’s easy to give up on the process. Having recently returned from the IMS Learning Impact Leadership Institute, I must say that the feeling was different this time.”

Rob's elaboration: We've figured out a few things over the years that have helped improve the process of developing standards. First, we try to separate the participants into groups that emphasize different things. Some folks like to work on developing specifications. Most, however, prefer to implement. Others, especially institutional types, like to work on reviewing to understand and ensure the benefits and resulting policies. The trick is to create some separate spaces and bring them together at the appropriate times. IMS is far from perfect at orchestrating all of this, but we are constantly working at it. When it all comes together and you have the institutions and suppliers all working toward the same end, it is truly a beautiful thing. I think Michael probably sensed some of that at the meeting.

Michael: “The first indicator that things are different at the IMS these days is the health of the community.  Membership has quadrupled. Interestingly, there was also a very strong K12 contingent at the meeting this year, which is new. This trend is accelerating. According to Rob, the IMS has averaged adding about 20 new members a year for the last eight years but has added 25 so far in 2014. Implementations of IMS standards is also way up.”

Rob's elaboration: To us, IMS is an organization that enables the education sector to collaborate in the leadership of educational and learning technology. That may seem like a strange thing to say, but as I pointed out in a 2007 EDUCAUSE Review article (see Innovation, Adoption and Learning Impact: Creating the Future of IT), the education segment does not invest much in R&D compared to other segments, and without collaboration every institution (all very small businesses, even the largest) spends most of the time and effort it does invest reinventing what colleagues at other institutions are doing. This is still a lesson that we are all learning. Our approach at IMS has been to lay this out to the sector and basically say, "Hey, we can give you a platform for collaboration, but it's up to you to fund it and make it succeed." If you're not supportive it will fail; if you are, it will succeed. So far, IMS has grown from a very small standards activity to being on par with the largest and most stable standards organizations in the world, horizontal and vertical alike.

Michael: “The IMS is just knocking the cover off the ball in terms of its current and near-term prospective impact. This is not your father’s standards body. But I think the IMS is still just warming up.”

Rob's elaboration: One does get the sense that, despite very strong growth over the last eight years, IMS may be accelerating. My personal view is that there is an enormous opportunity for institutions and suppliers in the segment to shape the future right now, as digital support for learning and education is accelerating. The concept at the foundation of IMS, namely true cross-platform, plug-and-play apps, content and data in support of greater personalization and more distinctive and effective educational programs, is a game changer. And this is truly a charge that educational institutions can and should lead. After all, who should be inventing the future of education? I also expect that much of this IMS work is going to make its way into more horizontal application across other industries (not education only) and the general web.

The Real Lessons from InBloom

Not more than a couple of weeks after my blog post on the Gates Foundation's LRMI and its motivating program InBloom, it was announced that InBloom would be "winding down."

InBloom, strictly speaking, is a non-profit corporation set up by the Gates Foundation to provide open source software implementing a state-level data collection system capable of collecting and analyzing data from (what was hoped to be) numerous district-level educational applications, providing information on student progress. Before the InBloom non-profit was "spun out," the work was incubated under the auspices of the Gates Foundation via a project called first the Shared Learning Initiative (SLI) and later the Shared Learning Collaborative (SLC).

We use the term "InBloom" here to represent the entire sequence of work, from inception through the non-profit, not just the work of the non-profit corporation itself.

The stated reason for the wind-down of InBloom has focused on one potential adopter, the state of New York, where InBloom was heavily criticized over potential data privacy issues.

However, I think we need to remember that InBloom was highly touted by the venerable CCSSO (Council of Chief State School Officers), which represents virtually all the U.S. states; funded by the Gates Foundation to the tune of $100-200 million (starting from the initial Shared Learning Infrastructure/Collaborative); and endorsed by at least a half dozen states at the formation of InBloom (with the hope that many, many states would adopt), not to mention supported by major industry consortia like SIIA.

The reality is that the SLI/SLC/InBloom ended up being rejected by a much larger body of potential participants than the state of New York. Therefore, focusing on New York alone and the single issue of data privacy doesn’t fully capture the key lessons that many of us already knew and must now solidify based on the InBloom experience.  After all, education is about learning and if we can’t learn from our experiences then it speaks volumes about our education.

From the perspective of IMS there were numerous fundamental mistakes made in the planning and execution of InBloom.  Here are some of the key ones:

  1. Spending a lot of money to reinvent the wheel usually doesn't make much sense. While better use of data in education is a good thing, technically speaking there was nothing about InBloom that was a breakthrough. Many, many suppliers could have provided the InBloom capability (which was basically a data warehouse). It appears that Gates decided to pick one favored contractor out of impatience and/or an unwillingness to work with existing industry providers. In addition, pretty much every aspect of InBloom was already covered by existing education sector interoperability standards, and this was completely ignored.
  2. You can have a lot of people around the table, but if they aren't the right people the net result will probably be wrong. It certainly seems like Gates decided to focus on the states/CCSSO because of the power position they held in terms of adopting the Common Core standards. The problem is that state government is generally not where the "rubber hits the road" in terms of student learning. Yes, there are many caring administrators at the state level who want to do the right thing, but it is in the schools and the school districts where better learning and teaching needs to occur. It doesn't take much experience engaging with the K-12 sector to understand that in most states there is a huge divide between the states and the districts. To the districts, the idea of sending data to the state and expecting something good to happen is a very big stretch of the imagination. The people who are good teachers are too busy helping students learn to get involved in pie-in-the-sky projects.
  3. "Small" (read: actionable) data is much more important than big data in education. Readers of this blog have already read what IMS has learned from engaging with the education sector today, and have a general understanding of where we are, historically speaking, in the evolution of education systems. The future of education is not about big or massive. The future of education is about diversity and distinctiveness. The future is about using data to help individual students by putting it in the hands of someone who cares (the student, the parent, the mentor, the teacher).
  4. Working with the education sector is likely to produce a better result than trying to end-run it. InBloom was an attempt to end-run districts, existing suppliers, and, yes, existing standards organizations like IMS Global. To be honest, most people we have talked to perceive InBloom as a Microsoft-style (or any dominant vendor's) platform play (no offense meant to Mr. Gates, as we doubt he had anything to do with the strategy). It goes like this: we will give you the platform and then dominate the market. The problem with going it alone is that existing collaborations in the education sector that are worth their salt (we like to think IMS is one of them) provide a huge base of historical knowledge as well as an existing base of collaboration that helps guide things in the right direction while actually getting there faster and cheaper.
  5. Open source is not open standards, and to build collaboration it is open standards that work best. Lots of folks have been reiterating this for a long time now. It's really very simple. With open source you have code that includes things like APIs. But those APIs are evolved by a single controlling entity and not governed by a community; thus, the controlling entity drives them. This is why the world has numerous standards organizations of all shapes and sizes. Open standards are specifications that any party can use, that can be implemented in a variety of APIs and programming languages, and that are evolved and governed by a defined and legally bound community process.

At IMS we hate to see good money wasted, as we work very hard for the membership dues we collect. Wasting $100-200 million is pretty inexcusable in our world. In fact, it's very sad to think what good that money could have achieved.

Our sincere hope is that leaders in this sector, including those that had a hand in InBloom, do not hide behind red herrings like privacy issues in one state. Leaders in education need to learn from their mistakes and course correct. The InBloom mistakes were numerous and fundamental, and pretty visible right from the start. What about data privacy, wasn't that a core issue? We don't see it that way. The data privacy concern was just a symptom, not the root cause.

While IMS never saw InBloom as a direct competitor, the reality is that in the time it took to spend that $100-200 million on InBloom, IMS has continued to grow organically, has issued over 280 conformance certifications for interoperable learning platforms, tools, applications and resources, and is now building data collection capability via open standards (Caliper) into all of these. Hmmm . . . that kind of sounds like a shared learning collaborative of some kind, since there are over 240 collaborating organizations in IMS. Investment required to get here: less than $10 million spread among 240 members.

Interesting contrast in approaches and results.

 

Critical Milestone Met: Conformance Certification by Leading e-Assessment Product Suppliers

Today's IMS announcement of the first group of products and organizations to earn conformance certification for QTI and APIP is a very, very big step for e-assessment interoperability worldwide.

[Image: IMS certification seal]

For years, no, decades, the majority of e-assessment suppliers worldwide have been riding for free on the back of the IMS Question & Test Interoperability™ (QTI™) specification. QTI has been a labor of love and importance within IMS, led by organizations such as SURF, JISC, the University of Cambridge, ETS, BPS Bildungsportal Sachsen (BPS) and the University of Pierre & Marie Curie.

These organizations, other than ETS, are not household names in most of the U.S. education sector. But they tirelessly carried the load of developing what is really the world's only viable interoperability standard for digital tests, test items and associated results.

About two years ago the U.S. decided to invest in the Common Core learning standards for K-12 and also launched the Race to the Top Assessment program to encourage states to cooperate on designing and delivering new electronic assessments in conjunction with the Common Core. Shortly ahead of that, IMS collaborated with a group of U.S. states to define an evolution of QTI called APIP™ (Accessible Portable Item Protocol™), which consists of QTI plus some new features to address requirements for special needs students. The Netherlands also began partnering with IMS on a countrywide initiative to evolve to e-assessment, starting with one particular test (similar to the U.S. SAT or a college entrance exam).

These investments in e-assessment have led to a dramatic rise in participation in QTI and APIP within IMS. If you look at the IMS membership list today, it is arguably the who's who of leading assessment organizations, certainly in the U.S. and perhaps worldwide. The IMS APIP/QTI work over the last two years has been co-chaired by Measured Progress, ETS and Pearson, with heavy involvement from McGraw-Hill CTB, ACT, Pacific Metrics, NWEA, Data Recognition Corporation and a variety of other assessment industry heavyweights. And in the Netherlands, CITO has been leading the charge.

Life has been good. But market development and adoption of standards is always a "chicken and egg" sort of thing. As mentioned at the very beginning of this post, worldwide assessment suppliers of many stripes had been talking up QTI for a very long time. The problem was that every supplier had its own version of QTI, and therefore very little interoperability ensued. As we have discussed in other posts, this type of standardization does not deliver the actual cost and time reduction that standards need to deliver in the digital world. If conformance to a standard still requires lots of custom programming to get interoperability, well, then it isn't a very good standard.

Thus, realizing that the issue of "loose standards" was running rampant in the ed tech sector, the IMS members decided to get serious, and also to save themselves lots of time and money in redoing integrations, by implementing IMS conformance certification. As we have discussed elsewhere, IMS conformance certification is not a marketing program (although those that go through it obviously do have the right to market that fact) but more of a "UL certified" designation earned by getting through a testing program. Conformance certification is much more than a "final test to the specification." The conformance certification program is actually critical to evolving the best possible specification for the needs of the marketplace. Typically only by going through the testing can the specification be refined and improved. IMS has seen this process work over and over again with all of our specifications over the last several years.

The problem is that many vendors kind of "hope for a miracle" with specifications. They hope that, even without going through implementation and testing, a specification will magically work. I think anyone who has ever developed software, and does a little projecting of that experience onto a specification that must essentially bring together the development processes and experience of numerous software products, will realize that a good specification requires development participation and feedback from multiple vendors. The IMS conformance certification process, and the ongoing developer community and related specification evolution (we call it an APMG: Accredited Profile Management Group), is that "hub" where the development experience of multiple suppliers comes together into a great specification.

It's really a very simple concept, but it is greatly complicated by the realities of new markets and new product development, where suppliers are challenged to balance the needs of their project deliverables against the need to cooperate on standards testing and evolution.

All that background is so that you know that what IMS announced today, that five leading organizations have now completed conformance certification for APIP/QTI across a range of product types, is a huge step forward for the e-assessment community. By "community" we mean the suppliers, the states and the end-users of e-assessments. In addition to the leadership shown by the suppliers listed in this post, this milestone has required exemplary leadership from the end-user organizations that have been key partners, namely Maryland, Minnesota, WIDA, Smarter Balanced and the College voor Examens in the Netherlands.

http://www.imsglobal.org/apip/alliance.html

We are still relatively early in the adoption of high quality e-assessment worldwide. But what today’s announcement proves is that leading supplier and end-user organizations can come together to enable all the many benefits of interoperable assessments (for a more detailed discussion of these benefits see What You Need to Know About e-Assessment).

It is now time for those organizations that have either gathered around the IMS QTI/APIP table or have long been claiming that they "conform" to these standards to contribute to the community by participating in the conformance certification process.

Today’s announced winners were:

APIP:

Platinum: Educational Testing Service/Computerized Assessments and Learning TOMS v3.0.0.0 PNP system (APIP v1.0 PNP Core Compliant) and Sample Students' Instances v1.0 (APIP v1.0 PNP Content Core Compliant)

Gold: Pacific Metrics Unity v1.9 (APIP v1.0 PNP Core Compliant, APIP v1.0 Item Test Bank Import Compliant)

Silver: Computerized Assessments and Learning Test Delivery system v2.3 (APIP v1.0 Delivery Entry Compliant)

QTI:

Platinum: BPS Bildungsportal Sachsen GmbH ONYX Testsuite v5.3.1 (QTI v2.1 Authoring Compliant, QTI v2.1 Delivery Compliant, QTI v2.1 Item Test Bank Compliant)

Gold: Northwest Evaluation Association Formative Assessment Item Bank v14.1 (QTI v2.1 Item Test Bank Compliant) and NWEA SCIP v14.1 (QTI v2.1 Content Compliant)

The winners will be honored and presented with their awards during the Learning Impact Awards ceremony at the 2014 Learning Impact Leadership Institute, 5-8 May 2014 in New Orleans, Louisiana, USA: www.imsglobal.org/learningimpact2014/.

Here are some links to additional press releases regarding this important milestone:

ETS Assessment Management System Provides Standardized Platform to Manage Statewide Assessments

Pacific Metrics’ Unity Platform Earns IMS Global Learning Consortium Assessment Conformance Certification

http://www.imsglobal.org/apip/alliance.html

Hap’s Got Apps! FAQ regarding the IMS Connected Learning Innovation Community and Challenge

IMS announced today the winners of our first (of what we expect to be many) annual connected learning innovation challenge (aka app challenge, though that is a bit of a misnomer because the challenge is as much about platforms and tools as apps). And our eternal hats off to Instructure Canvas for creating the idea of an app challenge and conducting the first one ever last year in conjunction with their annual conference.

We say “Hap’s got apps” because Hap Aziz is the IMS wrangler for this emerging education and learning app community.

Here’s an FAQ about the challenge, including plans going forward.

Q: How many entries and how many winners were there?

A: There were 22 entries and 5 top apps were selected as the winners.

Q: Where can I see the entries and the winners?

A: The winners are summarized in the press release and on the App Challenge Winner web page. The winners and the other entries are also listed toward the bottom of the LTI certified product web page. You can also sign up to get the (roughly) monthly CLIC (Connected Learning Innovation Community) newsletter here, which will feature the winning and other notable apps as well as community news.

http://developers.imsglobal.org/catalog.html

Q: Who chose the winners and how were they chosen?

A: Many thanks to a panel of expert evaluators, primarily institutional leaders plus a few suppliers, who developed a rubric for the evaluation. My understanding is that there was excellent convergence on the winners.

Q: Are these “apps” like the kind of apps available on Google Play or iTunes?

A: No, these IMS app challenge apps are generally a lot better because they are powered by LTI (Learning Tools Interoperability). These are apps that can connect into over 25 different learning environments/platforms, including all of the major learning management systems. Thus, these are "cross-platform" apps, unlike Apple or Google apps, which generally work only on Apple or Google platforms respectively. In addition, the IMS app challenge apps exchange highly useful information with those 25-plus learning environments/platforms, such as user information, rosters, progress data, etc. So the IMS app challenge apps are real enterprise learning apps, not the sort of limited individual-user apps people download to their mobile devices from Google Play or iTunes.
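For the technically curious, here is a minimal sketch of the kind of information an LTI 1.x launch carries from a platform to a tool. The parameter names come from the LTI 1.x specification, but the tool URL, key and secret are hypothetical placeholders, and a real launch is normally an auto-submitted, OAuth 1.0a-signed browser form POST rather than a server-side request, so treat this strictly as an illustration of the idea.

```python
import base64
import hashlib
import hmac
import time
import uuid
from urllib.parse import quote

import requests  # any HTTP client would do; used here only for the demo POST

TOOL_LAUNCH_URL = "https://tool.example.edu/lti/launch"  # hypothetical tool endpoint
CONSUMER_KEY = "demo-key"        # hypothetical credentials issued by the tool
CONSUMER_SECRET = "demo-secret"

# Core LTI 1.x launch parameters: who the user is, what course/placement the
# launch comes from, and (optionally) where grades can be sent back.
params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "link-42",
    "user_id": "student-123",
    "roles": "Learner",
    "context_id": "course-bio-101",
    "lis_outcome_service_url": "https://lms.example.edu/grades",  # hypothetical
    "oauth_consumer_key": CONSUMER_KEY,
    "oauth_nonce": uuid.uuid4().hex,
    "oauth_timestamp": str(int(time.time())),
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_version": "1.0",
}

def pct(value):
    """RFC 3986 percent-encoding as used by OAuth 1.0a."""
    return quote(str(value), safe="~")

# OAuth 1.0a signature base string: METHOD & encoded URL & encoded sorted params.
param_str = "&".join(f"{pct(k)}={pct(v)}" for k, v in sorted(params.items()))
base_string = "&".join(["POST", pct(TOOL_LAUNCH_URL), pct(param_str)])
signing_key = f"{pct(CONSUMER_SECRET)}&"  # no token secret in a basic launch
digest = hmac.new(signing_key.encode(), base_string.encode(), hashlib.sha1).digest()
params["oauth_signature"] = base64.b64encode(digest).decode()

# The browser normally auto-submits this as a form POST; requests works for a demo.
response = requests.post(TOOL_LAUNCH_URL, data=params)
print(response.status_code)
```

The essential point is that the platform, not the user, supplies identity and course context at launch time, which is what eliminates the separate logins and manual roster entry mentioned above.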

Q: “Could” mobile apps such as those downloaded from Google Play or iTunes become IMS LTI Apps?

A: That's a bit of a complicated question because it involves the software architecture and operating system limitations involved, but the general answer is yes. The web-hosted "back end" of mobile apps, as well as the apps themselves, could potentially leverage LTI (and/or other IMS standards) to connect to learning environments/platforms. To date we have not had any great examples of this, but it is only a matter of time before it happens.

Q: Was there money or other recognition involved in the Challenge?

A: Yes, each of the top five will receive a $1000 prize and also will be recognized at the IMS Learning Impact Leadership Institute May 5-8, 2014 in New Orleans.  There will also be a plenary panel and entire track on connected learning at the event, facilitated by Hap Aziz, with many of the entrants and evaluators as participants.

Q: Where did the money come from?

A: A huge debt of gratitude is owed to the organizations that were financial supporters of the challenge and community; they made it possible: Cengage Learning, Ellucian, Follett, Indiana University – Purdue University Fort Wayne, Instructure Canvas, McGraw-Hill Education, Oracle, Pearson, University of Maryland Baltimore County, and Vital Source. The initiative requires ongoing support, and if your organization would like to sponsor in the future, please contact us at leadingchange@imsglobal.org.

Q: Were you pleased with the quality of the entrants and winners?

A: Very much so. The winners were a mix of small (including tiny) and larger organizations. The top vote getter, Hoot.me, was an extremely innovative combination of the educational enterprise with Facebook. This reflects a trend in which innovative faculty want to take advantage of existing non-educational applications, but couple them with their campus software platforms.  And, all five of the top winners were similarly highly innovative in terms of what they enable faculty, students and/or administrators to do – and that’s what this is all about – making innovation easier!

Q: Isn’t a non-connected app just as innovative?

A: Nice try, but not really. “Innovation” is not just about whether an application is novel.  It also has to be useful (in fact some definitions of the word take into account adoption/usage as a critical aspect of innovation). Apps that are easy to access and use are a lot more useful in the education space than those that aren’t. Having to enter student roster data or having separate logins or going to a different URL for an app is not at all cool. But, more importantly, these extra steps detract from the innovativeness. Faculty and students need to focus on learning and not on configuring software.

Q: Is the IMS Connected Learning Innovation Challenge going to become an annual thing?

A: Yes. We are on an annual schedule of app boot camp for developers at our August quarterly meeting, promotion at Fall EDUCAUSE, promotion at Winter EDUCAUSE ELI and announcement of annual winners during the run-up to the annual Learning Impact event in May.

Q: Is IMS going to do more to make it easier to find apps than the current LTI catalog web page?

A: Yes. The Connected Learning Innovation Community is also sponsoring the Community App Sharing/Store Architecture (CASA) project. Indeed, CASA is more than a whitepaper! It is open source software being developed by a collaborating group of IMS HED institutions, led by UCLA and the University of California system. CASA is a breakthrough: a peer-to-peer app sharing architecture that will enable institutions or suppliers to participate in a network of cross-platform educational app sharing. The very first public demonstrations of CASA will occur at the IMS Learning Impact Leadership Institute, May 5-8, 2014 in New Orleans. For more background on CASA see this post.

Q: Is the Connected Learning Innovation Community (CLIC) meant to be an open source community?

A: Yes. IMS expects that, for those institutions or suppliers that wish to share and collaborate on open source apps, tools or platforms implementing the IMS standards, CLIC will evolve into a vibrant software collaboration. We like to say that this is like "an open source community on steroids" because the software developed will run cross-platform. So, whereas current open source collaborations like Sakai and Moodle have been and will continue to be great, this is a different kind of community that adds a completely new dimension of cross-platform/cross-community work.

Q: Where is the K-12 community in this?

A: IMS expects that K-12 institutions and/or states will begin to participate; it's only a matter of time and resources. HED has taken the lead here because HED institutions are developing lots of LTI apps on their own, and HED is more used to this sort of development collaboration. But K-12 is coming.

 

Why Does IMS Global Learning Consortium Publish an Annual Report?

Today IMS released our annual report for the calendar and fiscal year 2013.   See the press release.  See the annual report.

[Image: IMS annual report 2013 cover]

Producing an annual report is a lot of work, and these days, when it seems like very few people have time to read, one might ask why we take the time and effort to do this.

We first published an annual report for the year 2009 – so 2013 is the fifth edition.

[Image: IMS annual report 2009 cover]

I think there were really two catalysts that got us to publish the report.

The first was that after I came into IMS as the CEO in 2006, it became obvious that not even the Board of Directors, much less all stakeholders in IMS, was getting accurate financial data and other metrics on the organization. First we corrected the situation for the Board, but then the Board also vowed that we should be providing this information to the members and the stakeholders.

The second catalyst was Jan Posten Day, who at the time was with Blackboard and is now with Pearson. As a member of one of our leadership committees in IMS, Jan was adamant that IMS should have an annual report. At the time Jan suggested this we were struggling to keep the organization afloat, and I pushed back on the idea because it just seemed like we could not pull it off. But Jan's insistence made an impression on me and the other staff, and I think it was within a year or so that we dug deep and got out the 2009 report.

As you will see in this year’s report, IMS has been growing nicely now for eight consecutive years.

[Chart: IMS growth through 2013]

 

In fact, even though there has been quite a bit of churn in the member base over that time, the consistency in the net growth has been a little scary. It’s scary because we have looked long and hard and have not found any other similar growth pattern in organizations similar to IMS. Indeed during this same period most other organizations classified as “standards consortia” have generally been flat to declining. And, if you look at the historical patterns for standards consortia they tend to grow very rapidly when first originated and then flatten or tail off.

So, IMS is an organization in uncharted territory. In my mind it is all about leadership in terms of which way it will go. IMS has provided a viable organization for those organizations, institutions and suppliers, who wish to evolve an unprecedented collaboration to new heights. Or, those afraid of the disruption that IMS is enabling may slow it down. Every day I see forces on both sides of that equation, and I think it's going to be very interesting indeed as we go forward.

However, I assure you that, means permitting, IMS will keep publishing the report whether or not the results are as rosy as they have been. Indeed, when we began publishing the report we had no idea that the chart data would keep going up for the next five years!

But, here’s why I think the report is useful and why you should give it a look:

  1. In one relatively short document you get a full view of the work of IMS – which is not easy to see if you are focused on one or a few IMS initiatives.
  2. You can see how the organization is doing in terms of building momentum and in terms of financial strength.
  3. You can get a very good summary of the major thrusts of IMS and the key initiatives, along with concise commentary on why we are doing what we do.
  4. You can see the individuals and organizations that are leading IMS.
  5. It is a format that can be easily shared with someone to whom you might want to introduce the organization or give an update on IMS progress.

[Image: IMS architecture]

Hopefully the experience of perusing the report will give you a sense that IMS is indeed a non-profit organization worthy of your support, because IMS is changing the education and learning sectors for the better. And if you look at the range of initiatives that IMS is undertaking, you can take pride in the fact that your support has made this progress possible. I assure you that without your support this work would not have happened, not only not in IMS, but most likely not anywhere. IMS is that unique in providing leadership and collaboration for progress in the education and learning sectors.

As with most “things IMS” the annual report is a testament to leadership. Not the leadership of the IMS staff, but the leadership of the IMS members, both organizations and individuals (like Jan Day above) who are insistent that we must do better in enabling the next generation of education and learning!

[Image: IMS community]

 

Does IMS Have a Strategy?

Please excuse the long gap since the last blog post, folks. IMS is adding a lot of new members and staff supporting an unprecedented array of exciting initiatives, which has kept yours truly very busy the last few months.

We are now in the final push toward our annual Learning Impact event, May 5-8 in New Orleans, USA. While this is also a busy time, the event and the run-up to it give us a great chance to talk about where we are and where we are going in IMS. We hope you will join the conversation! Consider this a first installment.

The (perhaps) provocative title of this post is actually a question we are sometimes asked. After all, IMS is very much a "bottom-up" meritocracy, like many other organizations that develop interoperability standards. Most of the ideas in IMS, and certainly the best ideas, come from the individuals participating on behalf of their member organizations. And IMS is a true membership organization (legally organized as such) that provides a level playing field for organizations of all sizes, a construct that we think provides a very good structure for what we do, as previously described here. So, when the members speak, we listen, and usually act.

IMS does have a strategy. IMS has an elected Board of Directors that helps formulate the strategy. But the strategy is very organic, flowing and dynamic. New ideas brought forward by the members go through a certain "due diligence" that occurs by putting the idea in front of key stakeholders, those most motivated to act, and adjusting accordingly (including sometimes shelving an idea until there is further interest). Having much experience in the venture capital world, I will tell you that it is much like the funneling of ideas and business plans that every VC firm goes through when weighing the risks and opportunities involved.

So, the resulting IMS strategy is a function of bubbling up, testing (against the critical concepts of adoption and learning impact) and organizing into something as coherent as we can make it, given what is actually happening in the sector and its various sub-segments. And occasionally we add some key missing pieces that for whatever reason have not bubbled up, for instance when members are not willing to share in an area where sharing would actually benefit them.

For the past several years this process unfolded into an IMS strategy centered on what we have called the "Digital Learning Services" (DLS) standards, focused on (but not limited to) Common Cartridge, Learning Tools Interoperability (LTI) and Learning Information Services (LIS).

The strategic theory behind the DLS focus was that together these standards would solve a very large percentage of the integration challenges in/with the education enterprise.  And, in fact, while different pieces have evolved and been adopted at differing rates, we think this thesis has largely turned out to be on target.  See the accompanying charts on growth in IMS membership during this strategy and growth more recently in the conformance certifications that are the market adoption proof point.  Notice the 97 certifications in 2013 – almost 2 a week. So far in 2014 we are averaging close to 3 a week. In other words, this strategy is still taking hold, but clearly it is taking hold in a big way!

[Chart: IMS member growth]

 

[Chart: IMS certification growth]

 

[Chart: IMS package growth]

 

However, the IMS strategy has definitely shifted beyond DLS in the last year or so. First, e-assessment, an area IMS has been active in for a long while via QTI (a subset of which is covered in Common Cartridge), became a hot area. The very simple idea that electronic assessments, if done right, are much more affordable and scalable than paper assessments, coupled with the very obvious idea that there should be open formats to enable the e-assessment ecosystem of suppliers and states, has come of age (both in the U.S. and in other nations such as the Netherlands). Second, now that the IMS DLS standards are working, radically reducing the cost, time and complexity of seamless integration, our attention is naturally turning to what can be enabled with the standards.

While there may not be complete agreement in the IMS community (given its size and diverse nature) over what we should be enabling with the standards, here are the current thoughts – and thus, the strategy going forward:

  1. The power of LTI (first v1 and now v2) to reduce the cost and time of achieving seamless integration by 10-1000x will soon lead to 1-click integration. IMS-enabled applications will auto-negotiate which IMS services are supported (a sketch of this negotiation follows this list), thus revolutionizing the ease with which standards-based applications can be incorporated into the teaching and learning process.
  2. Item #1 will enable a very diverse, open ecosystem of new types of learning platforms and applications, potentially rearranging the ordering of integrations: very much an "app to app" model of cooperation, with or without a learning management "system" in the middle.
  3. Merging LTI with the IMS work on student information (LIS) and course planning and scheduling (CPS) exchange to continue to open up the educational enterprise via easy-to-use standards.
  4. Establishing and growing the "educational app community," like an open source community on steroids that builds things that work across platforms (the "things" may be open source or not, but there should be open source tools to enable this). This is a remarkable new type of community indeed, suppliers and institutions working together across platforms, kind of like the World Wide Web but focused on the education vertical.
  5. Enabling what most refer to as e-books or e-texts as a highly interoperable format across a wide variety of e-readers/mobile devices for the needs of learning and education.  See EDUPUB.
  6. Making instrumentation / measurement of learning activities easy to enable collection of analytics – big and small data. See Caliper Analytics.
  7. Including everything we've learned and are learning about e-assessment across #4-6, meaning that we've got the standards to enable innovative assessment apps, enable assessment in e-texts and enable easy instrumentation of assessment in learning platforms and apps (via Caliper and the outcomes standards developed on QTI/APIP).
  8. Utilizing the standards to create an open source reference implementation of a peer-to-peer app sharing framework that can be used to do, well, what it says: share apps with trusted partners, encouraging the use of standards to do so, thus enabling a standards-based "app store" or "app sharing" equivalent to iTunes, etc. See CASA.
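To make item #1 a bit more concrete, here is a minimal sketch of the data-driven negotiation idea: a tool inspects a profile published by the platform to see which capabilities and services it can actually rely on before completing an integration. The profile URL, the entries requested and the JSON shape shown are simplified illustrations rather than the exact LTI 2 Tool Consumer Profile schema, so treat this as the flavor of the mechanism, not a reference implementation.

```python
import requests  # fetching the profile over HTTP; any client would do

# Hypothetical URL where a platform publishes its profile during tool registration.
PROFILE_URL = "https://lms.example.edu/lti2/tool-consumer-profile"

# Capabilities and services this particular tool would like to use if available.
WANTED_CAPABILITIES = ["basic-lti-launch-request", "User.id", "Result.sourcedId"]
WANTED_SERVICES = ["Outcomes", "Roster"]

profile = requests.get(PROFILE_URL).json()

# Simplified view of the profile: lists of offered capabilities and services.
offered_capabilities = set(profile.get("capability_offered", []))
offered_services = {s.get("name") for s in profile.get("service_offered", [])}

# The tool enables only the integrations the platform actually supports,
# instead of assuming them or requiring manual configuration.
enabled = {
    "capabilities": [c for c in WANTED_CAPABILITIES if c in offered_capabilities],
    "services": [s for s in WANTED_SERVICES if s in offered_services],
}
print(enabled)
```

The point is that nothing here requires a human to hand-configure the integration; the applications discover what they can do together.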

Perhaps most importantly, though, IMS is making great progress with our end-user/institution-led groups to ensure that all of these initiatives are in fact getting them where they want to go. Our K-12 district advisory board (I3LC) continues to grow, and our new HED connected learning advisory board is shepherding the app community, the app sharing architecture, analytics and competency-based learning initiatives.

Hopefully you will see the evolution of the IMS strategy in the above. The IMS community is making change happen in some very substantial ways, and I invite you to participate at the May 5-8 Learning Impact event, where the breakout tracks mirror the strategy areas above and the plenary sessions take up the broader discussion of "why" we are doing this in terms of the emergent models of education that we wish to enable.

[Image: IMS Learning Impact]

IMS: from 10-100x Revolution to Connected Learning Innovation Challenge!

Today, in preparation for EDUCAUSE 2013 in Anaheim next week, IMS has announced the Connected Learning Innovation Challenge!

The Connected Learning Innovation Challenge will feature IMS's first ever "app challenge" and the establishment of a community of institutional and industry leaders who want to be at the forefront of encouraging a much more diverse and innovative future for educational technology, in real practice at real institutions, not as hype, but as tools that support what teachers and students want to do within the academic enterprise. Note: kudos and salutations to Instructure Canvas for organizing the first ever LTI app challenge last May-June!

The motivation for the Connected Learning Innovation Challenge is described in a just-released EDUCAUSE Review article, A New Architecture for Learning, that I was fortunate enough to collaborate on with Malcolm Brown, head of the EDUCAUSE Learning Initiative, and Jack Suess, VP of IT and CIO at the University of Maryland Baltimore County. The article talks about what we as an educational community need to do to enable greater innovation in the connected age and introduces an unprecedented commitment of cooperation among some of education's leading associations to help make it happen.

[Image: 1 of 3 IMS Revolution banners at EDUCAUSE 2012]

Last year at EDUCAUSE 2012 we introduced the IMS 10-100x Open Digital Innovation Revolution. Is the revolution over? Just the opposite, my friends: the revolution is burning like wildfire across K-20 education. As of EDUCAUSE 2012 there were a cumulative 126 IMS conformance certifications. Going into EDUCAUSE 2013 that number is 210! Holy Toledo! All conformance certifications are listed on IMSCERT.org. It took roughly three years to achieve 126, but in the last year alone 84 new conformance certifications were achieved! And the LTI catalog keeps growing: there are about 20 certified platforms now and a myriad of tools/apps.

So, how does the Connected Learning Innovation Challenge relate to the IMS Revolution? The “revolution” is like the paving of the road. As more platforms and applications are based on open standards and can work together with 10-100x less integration cost and time than before, well, then a lot more attention can be put into innovative vehicles to use the roads!  So, the Connected Learning Innovation Challenge – CLIC – is the logical evolution of the revolution –  focusing on what most people care about: great technology that can support or enhance teaching and learning.

To help understand CLIC, or to explain it to your colleagues, I’d like to provide the following talking points from my perspective (you can also visit the CLIC web pages here):

1. CLIC is about institutions working together to figure out how to enable and sustain support for a diverse set of teaching and learning applications (or non-educational apps favored by faculty and students), support that can no longer take six months to put in place. Thus, CLIC is a collaboration to make something happen that many institutions are currently trying to do on their own but that makes more sense to work on collectively.

2. CLIC will accomplish #1 through a few very targeted outputs/activities:

  • Competitions to identify and financially reward innovative apps and platforms supporting connected learning
  • Open source sharing community for sharing things that submitters and/or institutions wish to share, such as tools, frameworks, apps, app gateways, etc. Open source "things" built on standards can be utilized cross-platform, so this is the first ever cross-platform open source initiative anywhere!
  • A facilitated leadership community via listservs and newsletters to keep all interested parties abreast of the happenings, organize the core advocacy/leadership and enable organic growth. There will be app evaluation activities and other community milestones. As an example of organic growth, whereas IMS will be conducting large-scale challenges we will encourage regional/institutional level challenges in conjunction with tech fairs institutions or others may already be conducting.

3. CLIC is NOT an IMS membership program. To lead, support or follow CLIC your organization does not need to be an IMS member. I'm sure lots of IMS member organizations will be supporting CLIC, and, of course, the IMS members made all this possible. But think of CLIC more like the original IMS initiative organized by EDUCAUSE back in the mid-1990s. CLIC is a collaboration to make something happen without a whole lot of formality behind it at the start, other than the activities themselves. IMS has the chops to facilitate this, but we want it to go in whatever direction the institutional leaders who get involved want to take it, toward something more formal or not.

I'm going to say right now, from day one, that getting the most out of CLIC for the educational community will take leadership from institutions. Educators and their institutions are going to transform education with innovative technologies, and the CLIC community should be very productive for those wanting to help lead that charge. IMS can facilitate CLIC and put some legs underneath it, but we need institutional leadership, guidance, ideas and resources, in terms of time and even financial contributions from those institutions that can provide them. The other nice thing IMS can bring is a way to sustain and continue the progress that CLIC makes. IMS is a solid organization with a track record of sustaining and evolving innovative technical work even as leadership is handed off and evolves among institutions and suppliers. If you represent an institutional interest in CLIC, I hope you will consider becoming an institutional advocate as some of your peers are (and we are very thankful indeed); we should really be able to get 100 institutional advocates for CLIC!

Finally, if you have not had a chance yet to view the short 3-minute video compilation of comments from Dr. Charles Severance of University of Michigan describing some of the motivations behind CLIC I highly encourage you to go to the CLIC landing page and view the video in the top left corner!

What You Need to Know About e-Assessment

With IMS’s recent announcement of the upcoming e-assessment interoperability challenge we thought it would be a good time to discuss electronic assessment. Here is a Q&A with Rob Abel of IMS Global. Feel free to post additional questions and Rob will answer them (if he can)!

Q1: Is it time for electronic assessment in education?

A1: Yes. Paper tests are more difficult to administer, take longer to process, are more prone to error and are not able to provide timely data to help improve instruction. Whereas paper textbooks may still have some usability advantages over digital e-books, paper assessments have no advantage at all over e-assessment.

Q2: Can e-assessment be used for summative or formative testing?

A2: Both.  E-assessment can be used for pure “high stakes test taking” scenarios as well as intermingled throughout other learning activities for formative assessment.

Q3: Is interoperability of assessment items important?

A3: Yes, very. In general, digital assessment enables new forms of collaboration. For instance, in various countries around the world there is a desire to enable school organizations to collaborate on item development, since many schools are testing on the same subjects. Standard formats for assessment items enable collaboration on, and exchange of, items without every organization needing to use the same software platform for item creation and/or delivery. It is becoming pretty clear, with historic collaborations such as the U.S. states on the Race to the Top Assessment initiative, that the era of the "single delivery platform that outputs PDF" is coming to an end. With interoperability of assessment items enabled by standards, there is no reason to be locked into a single-vendor solution. Across the assessment community, replication of effort goes down, investment in proprietary solutions ends and more investment is focused on innovation.

Q4: Does IMS have standards and a community focused on assessment interoperability?

A4: Yes. IMS has two related standards that the assessment community worldwide should be making use of. The first is QTI (Question and Test Interoperability) and the second is APIP (Accessible Portable Item Protocol). QTI enables interoperability of assessment items and tests. The latest version, v2.1, is the one that the assessment community is rallying around. A subset (profile) of an older version of QTI, v1.2, is used in Common Cartridge, which is a format for importing and exporting content into and out of learning platforms. APIP adds accessibility constructs to QTI to enable electronic delivery of a variety of accessible assessments.
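To make the idea of an interoperable item format concrete, here is an abridged sketch of a QTI 2.1-style multiple-choice item and a few lines of Python that read it. The question content is invented and the markup is simplified relative to the full QTI 2.1 schema, so treat this as the flavor of the format rather than a reference.

```python
import xml.etree.ElementTree as ET

# Abridged, illustrative QTI 2.1-style item (element names follow QTI 2.1;
# the question itself is made up for this example).
ITEM_XML = """
<assessmentItem xmlns="http://www.imsglobal.org/xsd/imsqti_v2p1"
                identifier="demo-item-1" title="Photosynthesis"
                adaptive="false" timeDependent="false">
  <responseDeclaration identifier="RESPONSE" cardinality="single" baseType="identifier">
    <correctResponse><value>B</value></correctResponse>
  </responseDeclaration>
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" shuffle="true" maxChoices="1">
      <prompt>Which gas do plants absorb during photosynthesis?</prompt>
      <simpleChoice identifier="A">Oxygen</simpleChoice>
      <simpleChoice identifier="B">Carbon dioxide</simpleChoice>
      <simpleChoice identifier="C">Nitrogen</simpleChoice>
    </choiceInteraction>
  </itemBody>
</assessmentItem>
"""

NS = {"qti": "http://www.imsglobal.org/xsd/imsqti_v2p1"}
root = ET.fromstring(ITEM_XML)

# Any delivery platform can extract the same prompt, choices and scoring key,
# which is what makes item exchange across products possible.
prompt = root.find(".//qti:choiceInteraction/qti:prompt", NS).text
choices = {c.get("identifier"): c.text
           for c in root.findall(".//qti:simpleChoice", NS)}
correct = root.find(".//qti:correctResponse/qti:value", NS).text

print(prompt)
for ident, text in choices.items():
    print(f"  {ident}. {text}")
print("Correct response:", correct)
```

Because the prompt, choices and scoring key live in a standard structure, an authoring tool from one supplier and a delivery platform from another can exchange this item without custom programming.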

Q5: What about other types of interoperability that might enable more effective use of e-assessment?

A5: Yes. There is a very compelling need to use interoperability standards to enable assessment software platforms to "plug into" or connect with other software systems. This is the "assessment software product" acting as an LTI (Learning Tools Interoperability) tool provider, enabling the assessment platform to be seamlessly "launched" from a host system (like a learning management system). This type of "plugging in" can be useful in both formative and summative scenarios (depending on how the latter is administered). We see at least four types of assessment products, beyond state-level large-scale assessment, that will benefit from this type of interoperability:

  • Standard quizzing/test authoring and delivery software that are typically used already with learning platforms
  • The increasingly popular "homework applications" or "adaptive tutoring applications," which can also be viewed as formative assessment platforms.
  • Classroom test creation and scoring systems – yes, including those using paper and pencil
  • Assessment tools used for competency-based degree programs, such as those used by Western Governors University.

Q6: What about interoperability of assessment data?

A6: This, of course, is also very important. QTI describes formats for item data, which capture how test takers answer questions. The latest IMS work on analytics, the IMS Caliper Learning Analytics Framework (see the blog Q&A), will leverage the QTI data formats as well as other assessment-related formats (e.g. gradebook data). Thus, assessment data can be provided "back" to a learning platform, an assessment delivery platform or an analytics store.
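As one already-deployed illustration of results flowing "back," here is an abridged sketch of the kind of payload an LTI 1.1 Basic Outcomes "replaceResult" call sends to a platform's gradebook. The sourcedId and score are invented, and a real request must also be OAuth signed (with a body hash) and posted to the outcome service URL the platform supplied at launch, so treat this as a sketch of the shape rather than a reference.

```python
# Abridged, illustrative LTI 1.1 Basic Outcomes payload: a tool reporting a
# normalized score of 0.85 for one launch back to the platform that launched it.
SCORE_XML = """<?xml version="1.0" encoding="UTF-8"?>
<imsx_POXEnvelopeRequest xmlns="http://www.imsglobal.org/services/ltiv1p1/xsd/imsoms_v1p0">
  <imsx_POXHeader>
    <imsx_POXRequestHeaderInfo>
      <imsx_version>V1.0</imsx_version>
      <imsx_messageIdentifier>msg-001</imsx_messageIdentifier>
    </imsx_POXRequestHeaderInfo>
  </imsx_POXHeader>
  <imsx_POXBody>
    <replaceResultRequest>
      <resultRecord>
        <sourcedGUID>
          <!-- opaque identifier the platform issued at launch time -->
          <sourcedId>lms-sourced-id-123</sourcedId>
        </sourcedGUID>
        <result>
          <resultScore>
            <language>en</language>
            <!-- score expressed as a value between 0.0 and 1.0 -->
            <textString>0.85</textString>
          </resultScore>
        </result>
      </resultRecord>
    </replaceResultRequest>
  </imsx_POXBody>
</imsx_POXEnvelopeRequest>"""

print(SCORE_XML)
```

The Caliper work described above aims to carry much richer activity and outcomes data over the same kinds of conduits, rather than just a single score per launch.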

Q7: What about authentic assessment in the classroom or project-based learning?

A7: Any type of educational assessment, including e-assessment, is just a tool. It is one source of input. In our opinion assessment should be used to improve teaching and to improve learning. Thus, e-assessment plays an important role because it can provide real-time or near real-time feedback in a very transparent way: on a question-by-question basis (QTI enables such feedback), through computer adaptive testing, or simply through faster processing of an entire quiz or test. And that feedback can go to teachers, students, parents, etc., whatever makes the most sense. Initiatives like Race to the Top Assessment are also folding teacher evaluation of various "performance events" into the assessment mix. Mobile platforms and interoperable apps could obviously have a very important and innovative role to play in that regard, as could all types of assessment wrapped into apps or otherwise. We've already seen some fascinating use of QTI in the mobile setting via the Learning Impact Awards.

Q8: Why has IMS announced a Worldwide Assessment Interoperability Challenge?

A8: Use of interoperability standards such as QTI has in the past been rather flaky, in that each supplier implemented different versions and different subsets of functionality. Very few assessment product providers provided feedback to IMS to enable the issues to be resolved. As a result, interoperability was limited. Things have turned around radically in the last few years: IMS now has some 25 or so world-leading providers of assessment products actively involved in implementing QTI and/or APIP. As a result, IMS has been able to finalize these specifications and the conformance certification tests that will result in high levels of interoperability. The "challenge" is our way of saying to the world that we have a very strong core set of suppliers who have agreed to achieve conformance certification together over the next few months. Please come and join in for the good of your product development efforts and the good of your customers who desire interoperability that really works. The added "bonus" for participating is entry into the annual IMS Learning Impact Awards under special assessment product categories. Details on the "challenge" are here: http://apip.imsglobal.org/challenge.html

Q9: What if a region of the world wants to work with IMS on a regional profile of QTI or APIP?

A9: Yes, IMS is set up to facilitate that and has in fact been partnering in the Netherlands for the last two years on such an effort regarding national exams. Feel free to send us an email at assessmentchallenge@imsglobal.org

Q10: What do you see for the future of e-assessment?

A10: We are at the very beginning of a long road ahead filled with many exciting product opportunities. As with many of the other IMS standards, like Common Cartridge and LTI, we are going to see a very dynamic, market-driven evolution of QTI and APIP. For instance, one of the other application areas we are working on at the moment is applying QTI to e-textbooks. E-assessment will permeate every aspect of digital learning materials and activities, with an emphasis on adaptive testing to help pinpoint where additional or alternative materials and activities are needed. And with the undeniable trend toward competency-based learning paths and credentialing, the need for better assessment is increasing. As with all of the IMS focus areas, the key will be for the technology of assessment to "get out of the way" and be simple and easy to use and benefit from.

 

Q&A w/ Rob Abel: IMS Analytics Interoperability Framework

This blog post is an interview with IMS’s Rob Abel to get to the bottom of IMS’s recent announcement of its Caliper Analytics Framework.

See related post on “small data”.

Feel free to post additional questions and Rob will answer them (we hope)!

Q1: Is this project/announcement a big deal?

A1: We’ve got a lot of very impactful stuff going on in IMS these days, but enabling widespread adoption of analytics is one of the top priorities of IMS – with a mandate coming right from the IMS Board of Directors. But, perhaps more importantly, if we want to gain the full potential benefit of analytics and dashboards in education we need to make sure it is relatively easy to enable the transmission of data from any applications that can provide useful data. Interoperability standards, if done correctly, can help enable this.

Q2: There is lots of work going on in analytics, dashboards, etc.  Why is this a credible entry into the analytics space by IMS?

A2: Several reasons. IMS has had a relatively unique focus on one of the potentially more fruitful but challenging data collection areas of learning applications and platforms. IMS knows this turf well and brings a large critical mass of members that cover a wide range of product categories and institutional needs. IMS also has a large installed base of applications already using its core standards, such as LTI (Learning Tools Interoperability), Common Cartridge, LIS (Learning Information Services)  and QTI (Question and Test Interoperability). In other words, data is already flowing via these specifications, which are providing conduits upon which more data can ride.

Q3: What is the focus of the analytics initiative versus other IMS specification work?

A3: IMS has a bunch of fast-moving task forces and leadership groups that work on applications of standards. This analytics work comes from such a group that has existed for about a year, but it leverages years of IMS work on specifications for outcomes data for a variety of purposes, ranging from scores to gradebooks to assessment item data. The purpose of the analytics effort is to actualize many implementations of analytics feeds in as many products as rapidly as possible, adding a few new bits and pieces but largely leveraging existing work. In fact, the first proof-of-concept demonstrations will come very soon, at the next IMS quarterly meeting the week of November 4. There will be a Summit day there, Thursday, November 7, that will also focus on analytics from both the institutional and supplier perspectives.

Q4: What is the relationship with IMS LTI (Learning Tools Interoperability)?

A4: There’s a short answer and a longer answer. The short answer is that we expect this analytics work to finalize some work on outcomes (namely a very robust gradebook) that has been proposed for LTI & LIS for a while but not officially released yet, and that we expect the proliferation of LTI-enabled apps and learning platforms to be a natural starting point for data exchange. The longer answer has two components. The first is that we are leveraging the work of IMS members in specific LTI app/product categories to help develop the Learning Metric Profiles referenced in the Caliper whitepaper. The second is that we will be introducing a variety of LTI Services that will enable data to go to many different destinations from many sources, whether or not LTI was used as the launch mechanism for an app – for instance, enabling an app to send data to an analytics store, dashboard, or personal data vault whether or not it was launched by an LMS/learning platform.

Q5: Why is this project developing and releasing the Sensor API?

A5: APIs can make implementation a little easier – especially if a large number of suppliers use the same APIs. IMS now releases APIs as best practices with many of its specifications. Please note that an “API” is generally programming-language specific and a good standard is not. The standard is the underlying guts – that’s the hard part.
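As a rough illustration of why a shared sensor-style API helps, here is a minimal Python sketch of a thin client that builds an event envelope and POSTs it as JSON. The class name, field names, and endpoint are assumptions for the sake of illustration and are not taken from the Caliper Sensor API itself.

```python
# Hypothetical sketch of a thin "sensor" client: it standardizes how events are
# built and sent so each supplier does not reinvent this plumbing. The class,
# fields, and endpoint are illustrative assumptions, not the Caliper Sensor API.
import json
import urllib.request
from datetime import datetime, timezone

class Sensor:
    def __init__(self, endpoint: str, api_key: str):
        self.endpoint = endpoint
        self.api_key = api_key

    def describe(self, actor: str, action: str, obj: str) -> dict:
        """Build a uniform event envelope."""
        return {
            "actor": actor,
            "action": action,
            "object": obj,
            "eventTime": datetime.now(timezone.utc).isoformat(),
        }

    def send(self, event: dict) -> None:
        """POST the event as JSON to the collection endpoint (no retry logic)."""
        req = urllib.request.Request(
            self.endpoint,
            data=json.dumps(event).encode("utf-8"),
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Bearer {self.api_key}",
            },
        )
        urllib.request.urlopen(req)  # endpoint below is fictional; will not resolve

sensor = Sensor("https://caliper.example.edu/collector", api_key="demo-key")
sensor.send(sensor.describe("student:12345", "viewed", "reading:chapter-2"))
```

The value of a shared client like this is exactly the point made in A5: the “guts” (what goes in the envelope and what it means) are the standard; the API is just a convenience wrapper around it.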

Q6: What if a product company has already developed an API for some category of data transmission – can that still be used with Caliper?

A6: Maybe. One of the cool things about the IMS specs and the development process behind them is that we can work with leading suppliers who already have services/APIs to see if we can “map” them onto the IMS specifications. You may have developed some APIs that are now seeing good market adoption for a specific type of service; IMS can potentially work with your organization to harmonize them with Caliper and the LTI services. Please contact us at: CaliperFramework@imsglobal.org
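To make the idea of “mapping” an existing API onto the IMS specifications a bit more concrete, here is a minimal Python sketch that translates a hypothetical vendor-specific event payload into a common actor/action/object shape. Both the vendor fields and the target fields are illustrative assumptions, not drawn from Caliper or any supplier’s API.

```python
# Hypothetical sketch: translate a supplier's existing event payload into a
# common "metric profile" shape so the same data can flow over a shared spec.
# Both the vendor fields and the target fields are illustrative assumptions.
def to_common_profile(vendor_event: dict) -> dict:
    """Map vendor-specific keys onto a shared actor/action/object envelope."""
    return {
        "actor": f"student:{vendor_event['user_id']}",
        "action": vendor_event["verb"],          # e.g. "submitted", "viewed"
        "object": f"activity:{vendor_event['activity_id']}",
        "eventTime": vendor_event["timestamp"],
        # Anything without a shared equivalent travels as an extension (see A7).
        "extensions": {"vendor_session": vendor_event.get("session_id")},
    }

print(to_common_profile({
    "user_id": "12345",
    "verb": "submitted",
    "activity_id": "quiz-7",
    "timestamp": "2013-10-01T14:05:00Z",
}))
```

A thin mapping layer like this is usually all an existing product needs in order to participate, which is why harmonization with suppliers who already have APIs is realistic.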

Q7: What if IMS can’t get suppliers to agree on the Learning Metric Profiles?

A7: Well, we wouldn’t be doing this if we were not already seeing some excellent convergence. But we also want to encourage, and fully expect, extensions, both public and private, that IMS will capture in a registry. Thus we can have the stuff that everyone agrees on as well as the stuff that is new, above and beyond that, which is either publicly sharable or not. That’s how innovation in data and analytics is enabled by all of this.

Q8: What about applications that are kind of out of the learning domain, like CRM (Customer Relationship Management) systems?

A8: We see absolutely no reason why Caliper cannot add Metric Profiles for classes of systems like this and bring them into the mix. The Caliper Framework should be applicable to almost any type of system.

Q9: What if my analytics product wants to suck in every piece of data and every user interaction possible?

A9: Yes, big data. If you want to do that across more than one system you still need an agreed upon analytics feed. Caliper will cover that, even if a private solution is needed at first (see A7 above).

Q10: Will U.S. K-12 initiatives such as InBloom or Ed-Fi benefit from this work?

A10:  Yes, they certainly could! Caliper data can go to/from anywhere.

 

What is Disruptive Innovation in Education?

At the 2013 Learning Impact conference I presented a keynote, “Innovation, Disruption, Revolution – Oh My!” I chose this topic because the degree of hype about “disruption” in education seems to be at an all-time high right now. BTW, it’s amazing how well the Gartner Hype Cycle fits the Wizard of Oz!

From Rob Abel’s Learning Impact Keynote: Innovation, Disruption, Revolution Oh My!

Excitement about the role of technology in improving education is a good thing as far as I’m concerned. Education is a segment that needs disruptive innovation. To me the hype around things like MOOCs represents a longing by many for “a better educational future” – presumably involving lower costs to students and better career/life fulfillment, not to mention better global citizens needed to solve our global challenges. Let’s face it – there is a general sense that the current education system is not up to the challenges of the future. And, it’s not clear how we get from “here” to “there.”

But, as leaders in the education segment we do need to get better at understanding where we have been and where we are going: what constitutes innovation and/or disruption that is worthy of investment? Are you an investor? I would argue that any individual putting time into educational technology leadership at any level is an investor, as are institutions spending on innovation and, yes, venture capital investors. (Investors in this segment, including some pretty big names, are making some bad decisions right now about where they are putting their money – but this post is more about how institutions should decide to invest their resources.)

In IMS we try to ferret out “winners” by looking at criteria for something we call “Learning Impact” – which is defined by a set of judging criteria we use in our annual Learning Impact Awards (LIAs). You can think of it as evidence that the application of technology in an educational setting has had a clear impact on access, affordability and/or quality. We’re pretty proud of this program because there is simply no way to win with hype. But, in general the whole annual Learning Impact event is about understanding where the innovation really is.

Right now there are quite a few over-hyped activities in the education segment. I would include among these MOOCs, the Common Core State Standards, analytics, badges, and open educational resources (OER). Sorry if I tipped over one of your sacred cows there! Over-hyped does not mean that nothing good will come from these developments. It just means that they are being portrayed as more capable of “disrupting” education than they will be able to deliver on. As with most hyped innovations, eventually some aspects of the innovation “survive,” crossing the chasm to productive advancement of the industry. The challenge for innovators and investors is to use critical thinking to work out what will be sustained and what the real impact will be.

MOOCs are perhaps everyone’s favorite example of hype right now. It’s difficult to imagine how something could have achieved more hype in the higher education segment – and they are very clearly striking a nerve for being potentially highly disruptive. Literally every day MOOCs are in the headlines – and smart marketers are trying to jump on the MOOC bandwagon even as that bandwagon continually morphs to address the glaring deficits of the MOOC model and what are, quite frankly, more failures than successes at this point.

Clayton Christensen, the leading expert on “disruptive innovation,” has written at least two books specifically focused on education. According to Christensen’s disruptive innovation theory, markets are disrupted when new entrants figure out an innovative way to provide a “simpler” product to a wider set of buyers at a more affordable price. Since the simpler product is actually what the broader market prefers (simpler means more usable, more effective for the desired purposes), the product is highly disruptive – a better product at a lower price point. And thus these new entrants change market behavior substantially. Jim Farmer does a very thorough job of digging into the theories as applied to MOOCs here.

While I, like many other Silicon Valley entrepreneurs, have found Christensen’s original formulations on disruptive innovation descriptive of what is generally seen in the high-tech world, and a useful thought framework, the problem is that these formulations have not been at all predictive for education – and that is the acid test for using theory for strategic gain: is it predictive? Writing about what happened in the past and putting a framework around it is great, but if it doesn’t help predict the future it doesn’t especially help entrepreneurs or us “investors.”

The innovators in the education segment have NOT disrupted the status quo significantly so far. While Christensen predicted in 2008’s Disrupting Class great disruptions in the segment from online learning, the reality so far is that adoption of online learning has continued to grow as expected but has not been very disruptive: price points for education continue to rise ahead of inflation, and while online education continues to grow it largely reflects traditional models rather than disrupting them. And, while online/blended models have certainly improved access, the percentage of the population that has achieved credentials has been very level. I wrote a paper on technology’s impact in higher education back in 2005-7 – and the situation is not significantly different today.

Christensen was recently quoted in an interview as saying that higher education institutions are going to be in real trouble 5 years from now. However, he has not made clear why things will be different in the next 5 years versus the 5 years since Disrupting Class was published. I do agree that “buyers” (students/parents) are in fact getting smarter about looking for lower-cost options as well as attempting to understand the value of a higher education degree. But I would argue that we are nowhere near a true disruption (a dramatic rise) in the number of participants in higher education.

While it is very valuable to have any great thinker on business strategy analyzing education and providing insights from other industries that might apply (like Christensen), education leaders need to do their own critical thinking about these formulations of “disruptive innovation” in the education segment.

I’d like to provide three key factors that, IMHO, need to be understood in order to understand disruption in education. While these factors may be relevant to understanding MOOCs, they can be applied to other hype areas as well. Hold on to your hat here, as these are things we don’t hear much talk about in the education space – especially when you go to meetings on one of the hyped topics or even to investor conferences!

  1. Education is a complex services business in which quality is difficult to define. Disruption in the education space requires better service models that are built around improved educational program quality. Comparing education to the disruption of the steel industry by mini-mills (a connection some have made because Christensen uses this as a classic example of disruption) is not valid. Disruption in education is not about replacing the low end of a well-defined product. It’s about redefining quality in a much more complex world of knowledge than the one most current educational models were designed for. So, for instance, a true disruption in education would be a highly desirable/effective K-to-masters degree in 15-16 years versus the current 19 years (ideally one that meets the needs of underperforming populations as well as traditionally successful populations).
  2. The next phase of true progress will be to come out of the current era of massification into a new era of more real-world-relevant and personalized educational pathways. Massification of educational experiences will not be the ticket to success in the next wave of educational models. It seems that many of the entrepreneurs and investors in the education space fail to realize that developed countries have already been in the era of massified education for the last 30 years or so. The disruption in terms of content will need to be content that enables educational experiences that are up-to-date (read “relevant”), adaptive to the interests of the learner, easily adaptable by teachers, and yet thorough in teaching the educational foundations defined via #1. No offense to our friends at Coursera (one of the growing number of MOOC providers), but “course era” is an especially off-the-mark name for describing the future, IMHO. It is NOT a course era in education now, nor will it be in the future. It is the same era it has always been: one in which a distinctive program of high-quality education will be highly valued. It’s just that we need to do a better job getting such distinctive programs to the currently underserved populations at a better price point. Yes, there will be different sources of supporting digital resources that will help enable the redesign of educational delivery (potentially a role for MOOC courses). As discussed in other LI blog posts (here & here), digital resources will need to be in the form of a highly usable toolkit for faculty and learners. But these supporting resources have a VERY long way to go to meet the needs of learners. And the more available (i.e., free, open, massive, etc.) a course or its content becomes, the faster it will be commoditized.
  3. Education is the ultimate “long tail” market, with a growing proliferation of high-value niche providers and boutique programs, and this is only going to increase in the future. Contrary to other recent online phenomena like Facebook or Twitter, education is not a “winner take all” market. I was at an education investor conference earlier this year where a leading investment advisor stated that education is the “ultimate winner take all market.” Education, if done correctly, is life-success enabling. The more unique and distinctive your educational experience is, the more valuable it is. The ability to produce knowledge is the key currency in the current and future global economy. There is no distinctiveness in attending “Massified University.” And a credential from a massive provider will most likely be such a commodity that most will prefer not to waste their time on it (other than for pure fun or reference). We need many more niche-oriented institutions that provide specialized, career-enabling and life-enabling education. Note that even large public institutions, while having many students, can embrace this philosophy to create a large number of differentiated programs of study. Therefore, the “disruptive technology platforms” for education will need to enable great diversity: diversity in delivery models and in blending high touch with personalized self-service. Disruptive platforms will also need to enable seamless integration among cooperating providers of the various components of a solution – meaning close partnership among institutions as well as innovative learning tools. Old-style Web 2.0 thinking of the single pervasive platform or the single way of analyzing data will not work for the future of education. Education is not that simple (sorry!).

Are MOOCs potentially promising innovations? Yes. They are clearly an evolution of the trend toward pervasive online/blended/more flexible educational models and flipping the classroom. Will they disrupt education? Not on the current trajectory. They pretty much fly in the face of the three tenets above.

But, there are some potential ways in which MOOCs could be disruptive in a more limited way.

I think MOOCs have the potential to be disruptive on the low end of the education market as a new model for delivering “open university style” education. For example, today’s MOOCs could be an appealing alternative to the content portion of massive open universities around the world, featuring star professors from highly ranked universities. And such populations of learners could probably support an advertising/low-course-fee model of consumption. So, open university providers get a “pay per use” version of content (versus the larger investments they must make today) that is likely better than what they are offering now. If subsidized by advertising (net revenue equals a small pay-per-use fee from the open university provider coupled with advertising revenues), this could equate to substantial revenue over a large number of users with similar interests (as indicated by the topic of study).

In such a model MOOCs could also double as a new type of “tutorial” reference material, very much like Khan Academy, which is often referred to as a MOOC now even though it existed for a few years prior to the term.

Would this application of MOOCs be disruptive? Well, if you consider displacement of open universities and/or a new business model for them disruptive, then yes. This approach could potentially disrupt the open university market worldwide and create much larger interest in open university derivative models as a way to learn more about a particular topic and/or as preparation for entry to other universities. Will they substitute for those universities – no. Such a disruption would not radically change the efficiency of the higher education segment at all.

Is there a strong motivation for current providers of education to engage in this model? Unclear. As previously discussed, the more you commoditize an educational experience the less valuable the education is, and the next stage in education is greater diversification, not massification. Personally I would like to see existing institutions respond to the three factors above by rolling out new “institutes” with new types of degree programs that meet the evolving needs of society.

Should open universities or other new entrants that wish to compete in that space try a MOOC/advertising/low course fee supported model?  I hope so.  That would be a good fit with the “course-focused” strength of MOOCs.  It would also provide a potential revenue producing and marketing outlet for institutions and their “star teachers.” Of course there are the normal challenges of achieving accreditation from the countries/states as needed to prove value to the potential students.

So, hooray for the MOOC phenomenon for drawing attention to needed innovations such as scale, analytics, and the potential power of star teachers – claims that will help accelerate the innovation trend toward online/blended that we are already on! And congratulations to the leading institutions willing to make space to try out the MOOC innovations as another set of tools that might be leveraged in the quest for improved education.

However, I would say that if you don’t know at this point why you are investing (in terms of what impact you expect will sustain at your institution, and therefore what the return on your investment is likely to be) then you are not applying the level of critical thinking you need to. Personally I would be asking MOOC providers to take the risk in terms of proving the validity of the market/business model (such as the ideas around open university displacement) – and not taking on that risk for my university. 

[Note: In a future blog post I will take a whack at what I think will be the sustainable innovations that might cross the chasm coming out of some of the hype items mentioned above: MOOCs, Common Core State Standards, analytics, badges and open educational resources (OER).]

See more of Rob Abel’s views on educational innovation throughout the Learning Impact blog as well as this feature interview with Anthony Salcito of Microsoft.