
Learning Impact Blog

Rob Abel, Ed.D. | January 2022

 

"It is not in the stars to hold our destiny but in ourselves." —William Shakespeare

 

IMS is Rebranding to 1EdTech in 2022

Many of you may have already heard this exciting news. This year, we are transitioning to a new brand to better capture the essence of our groundbreaking community and add even greater energy to our cause as the need continues to grow around the world.

Yes, the IMS brand is well known and greatly respected among those involved in standards development. Not to mention the IMS community is (and has been) growing at an impressive rate.

However, in 2018 the IMS Board of Directors concluded that the name, IMS Global Learning Consortium, did not reflect the primary value the organization provides: helping the educational and learning technology ecosystem converge toward the future through the leadership and collaboration of dedicated end-user and supplier organizations. The board also concluded that not only the name but the overall messaging and positioning of the organization needed a refresh.

Thus, in 2022 we are transitioning to a new brand: 1EdTech.

The new name and the brand messaging reflect both our aspirations and the value we are already providing to the worldwide education technology sector.

1EdTech signifies a united commitment to achieving an open and inclusive education technology ecosystem that serves the needs of every learner, every educational institution, and every edtech supplier. This united ecosystem is owned by all education participants and stakeholders, enabling both public and private good. Together we are accelerating educational innovation to lift every learner.

1EdTech New Site Hero Example

We are purposely communicating the brand transition gradually. IMS members have been providing feedback on the rebranding strategy since it began in earnest towards the end of 2018. But we are still listening for input as we talk with our many partners worldwide. In 1EdTech, just as in IMS, the community shapes the organization.

Over the first half of 2022, you will see incremental updates to the website that reflect messaging that is more aligned with 1EdTech, and, yes, there will be an update to the logos and other brand imagery as we go. We will be highlighting stories that help all stakeholders understand better how our members are creating the future of education by shaping an edtech ecosystem that powers learner potential. I will be creating blog posts every couple of weeks to provide some perspective on the new brand.

I hope you will be able to join us as we celebrate the new brand at Learning Impact 2022 in Nashville, Tennessee, from 13-16 June! Be on the lookout for registration opening soon.

 


 

IMS TECH TALK

Contributed by Dr. Colin Smythe, IMS Chief Architect

 

Creating IMS Specifications Using a Model-Driven Specification Approach

Background

In 1999, IMS published its first two eLearning interoperability specifications: IMS Metadata and IMS Enterprise Data Model.

In 2000, IMS Content Packaging and IMS Question & Test Interoperability specifications followed.

These specifications were developed and documented as Microsoft Word documents with tabular descriptions of the data models. For all of them, the actual interoperability was based upon the exchange of XML files. Initially, we created Document Type Definition (DTD) files, and later XML Schema Definition (XSD) files, to provide machine-enabled validation of the XML files. Eventually, the IMS specification development process was based on creating the relevant Information Model, XML Binding, and Best Practices and Implementation Guide documents, along with the accompanying DTD/XSD files. The documents were created using Microsoft Word and published as PDFs, while the DTD/XSD files were created using early versions of XML authoring tools (we are still in the 2001-2003 period). Many other standardization organizations had similar approaches and used similar toolsets.

In 2004, IMS started work on its first service-based specification (Enterprise Services). Once again, the interoperability technology was XML-based, but it now required SOAP messages, with Web Service Description Language (WSDL) files used to provide a machine-readable version of the API. Again, we created MS Word documents with lots of tables.
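To make the role of those DTD/XSD files concrete, here is a minimal sketch of the idea behind machine-enabled validation: a machine, not a person, decides whether an XML instance is structurally valid. A real pipeline would use an actual XSD processor; the element names and the tiny hand-written rule set below are invented purely for illustration.

```python
# Illustrative only: real XSD validation uses a schema processor.
# This stdlib sketch just shows the concept of machine-checked structure.
import xml.etree.ElementTree as ET

REQUIRED_CHILDREN = {"person": ["name", "role"]}  # invented mini-schema

def validate(xml_text):
    """Return the list of required child elements missing from the root."""
    root = ET.fromstring(xml_text)
    required = REQUIRED_CHILDREN.get(root.tag, [])
    present = [child.tag for child in root]
    return [tag for tag in required if tag not in present]

print(validate("<person><name>Ada</name><role>Learner</role></person>"))  # []
print(validate("<person><name>Ada</name></person>"))  # ['role']
```

The same yes/no answer, produced by a standard XSD validator against the published schema files, is what allowed implementers to check their XML exchanges automatically.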

In early 2006 it became clear that our specification development approach had limitations:

  • The information model and XML binding documents and the accompanying DTD/XSDs were being created through three separate manual processes and, no surprise, there were many discrepancies between them, many of which were trivial, but some were significant and caused a lot of confusion and implementation problems;
  • The specifications were becoming increasingly complex, so the tabular descriptions were large and difficult to understand. Also, the number and significance of the discrepancies were increasingly problematic;
  • The creation and maintenance of new versions were becoming difficult. The combination of increased complexity and maintaining consistency across the different, manually produced, and changing artifacts was resulting in unacceptably error-prone documentation;
  • The inclusion of service-based specification development and the accompanying new set of binding technologies was proving difficult to define and describe with the required precision and accuracy using MS Word documents;
  • There was increasing awareness that conformance testing and certification of products—with respect to the specification—would be essential so the corresponding conformance test systems had to be created and maintained.

Transition

In 2006, we decided to move to a Model-Driven Specification (MDS) approach. The primary objective was to create a process by which the definition and description of a specification are entered into a machine-readable format (the model) once, and from that model ALL of the artifacts are produced using auto-generation transformations. Therefore, in the worst case, any error would be consistently present in ALL of the artifacts; this has the benefit that an error identified in one artifact can be used to fix that same error in all of the artifacts. In 2005, the IMS Enterprise Services 1.0 specification and a new version of the IMS Content Packaging specification, both developed using the IMS MDS approach, were published. All of the required benefits were achieved, but we also confirmed some drawbacks:
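The single-source idea at the heart of MDS can be sketched in a few lines: one machine-readable model, several generated artifacts. The model format and the two toy generators below are invented for illustration and are far simpler than the UML-based tooling IMS actually uses, but they show why a fix to the model propagates to every artifact.

```python
# Invented, minimal stand-in for an MDS model: one source of truth.
MODEL = {
    "class": "Person",
    "fields": [
        {"name": "name", "type": "string", "required": True},
        {"name": "email", "type": "string", "required": False},
    ],
}

def to_json_schema(model):
    """Auto-generate a JSON Schema fragment from the model."""
    props = {f["name"]: {"type": f["type"]} for f in model["fields"]}
    required = [f["name"] for f in model["fields"] if f["required"]]
    return {"title": model["class"], "type": "object",
            "properties": props, "required": required}

def to_xsd(model):
    """Auto-generate an XSD fragment from the same model."""
    lines = [f'<xs:element name="{model["class"].lower()}"><xs:complexType><xs:sequence>']
    for f in model["fields"]:
        min_occurs = 1 if f["required"] else 0
        lines.append(f'  <xs:element name="{f["name"]}" type="xs:{f["type"]}" minOccurs="{min_occurs}"/>')
    lines.append("</xs:sequence></xs:complexType></xs:element>")
    return "\n".join(lines)

# Correct (or mistaken) optionality appears consistently in both artifacts.
print(to_json_schema(MODEL))
print(to_xsd(MODEL))
```

Because both generators read the same `MODEL`, the bindings can never disagree with each other about, say, whether `email` is optional; any error lives in one place and is fixed in one place.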

  • The design of the specification required a more abstract approach for the creation of the model and the actual implementation details only became evident once the binding information was autogenerated;
  • The modeling was based upon the Unified Modeling Language (UML); the learning curve for both UML and the new transformation toolkit was steep and time-consuming.

Summary

Many engineers from IMS Contributing Members did not have the time or skills to understand the details of this new approach or to use the new toolkit. This created a greater dependence on the IMS technical staff to facilitate the development of the specifications, which in turn made understanding a proposed specification, and building consensus around it, more difficult.

Over the following decade, the IMS MDS approach underwent considerable evolution and refinement. The tooling remained based on UML, using the Eclipse-hosted open-source Papyrus tool, the corresponding IMS profile of UML that enables the IMS MDS approach, and a set of proprietary transformation tools. The currently supported artifacts include HTML documents, XSD and WSDL files, OpenAPI and JSON Schema files, JSON Linked Data files, and the set of PHP files used to support conformance testing of a service provider implementation of the corresponding IMS service specification. The MDS approach also supports the definition of a Profile of an IMS base specification: a Profile enables the adaptation of a specification to fit the specific requirements of a particular community (a market sector, a geographic region, etc.).

In 2018 we undertook an internal review of the specification development activities in the context of the next generation of MDS. That review identified several key objectives for that next generation:

  • The use of combinations of specifications was becoming more common, so more effective visualization of, and interaction with, the specification documentation sets were required. This led to the Student Learning Data Model and the IMS Common Data Model;
  • Reusability of data models was becoming important. Using the same data model improves integration between specifications, but this requires the definition of the data models to enable tailoring at the point of inclusion into a specification to avoid unnecessary bloating of a specification. This is accompanied by the application of a set of frameworks that define common patterns to be used across the IMS specifications (the IMS Security Framework is the first of these frameworks);
  • Contributing Members required better integration of their materials without curtailing the types of input they supply (currently Google Docs, Markdown, MS Word, etc.). A process needed to be introduced by which Contributing Members can amend the model and then review the changed model and artifacts;
  • The codebase for the XSLT was aging, and a complete rewrite was required. The range of target artifacts was still increasing, and so the full UML-based approach needed to be revised;
  • More agile specification development requires the use of a Continuous Integration/Continuous Deployment (CI/CD) process to link the IMS GitHub repositories with the IMS Forums, Website, and PURL server so that a single change to the underlying model can result in rapid update and deployment of all of the associated artifacts;
  • The MDS approach must create a new set of common libraries that will be reused to develop the corresponding conformance test systems and reference implementations, and to support the Compatibility Check.

Looking Ahead

For the past three years, the IMS technical staff have worked on all of the above objectives. In the past 12 months, several changes in the IMS specification development process have become evident: the EDU-API, Comprehensive Learner Record 2.0, and Open Badges 3.0 specifications are prototyping many of the revised MDS approaches. In the next 12 months, the following changes will be delivered:

  • The IMS Common Data Model visualization technique will be extended to include support for the wide range of IMS service operations and the realization as REST/JSON-based endpoints;
  • All of the HTML-based documents will be rendered in the IMS profile of Respec. This includes those documents that are autogenerated from the MDS approach and those that are created manually;
  • The Versioning, Extensions, and JSON API frameworks will be completed, and as new IMS specifications and new versions are produced, support of these frameworks will be included;
  • The modeling approach will be extended to include support for the Publish/Subscribe-based service bindings;
  • The set of specification documents and artifacts will be published through the new MDS-integrated CI/CD process.

In the longer term, the focus will move to the conformance and certification, reference implementation, and Compatibility Check aspects of the IMS specification development process. Conformance and certification are an essential part of the IMS process, and we must continue to improve their quality and reduce the time taken to establish them for our specifications. This will reduce the effort and time taken to create conformance and certification for Profiles of the IMS specifications and make it even easier to create, manage, and maintain IMS ecosystems.

 


Rob Abel, Ed.D. | November 2021

 

"We're all in this together. It's so easy to see." — Walter Trout

 

Renewing (Y)Our Commitment to Shaping the EdTech Ecosystem

As we approach the end of 2021, IMS is energized from our recent K-12 Leadership Retreat held in person in Denver—the first IMS in-person event since late February 2020. We're also energized by the dramatic growth in IMS membership: we started the year with fewer than 590 member organizations and are now approaching 700. All of the IMS key initiative areas are gaining momentum across HED and K-12.

Yet, as I mentioned in my last post (Clarity), I've never been more enthusiastic and, at the same time, more concerned about where the edtech sector will go from here. A confluence of factors may distract us, not the least of which is "the great resignation." I sense that 2022-23 will be a test of leadership like no other we have seen in terms of each institution, district, state, and supplier setting a path based on the perceived lessons learned from both before and during the pandemic. Where will the leadership come from?

Well, it may come from many places. Still, I know with 100% certainty that the IMS community will once again bring extraordinary leadership in defining "the next leg up" in the growth of the edtech ecosystem. Our recent K-12 Retreat was a reminder of the power of in-person collaboration. We know we can expect big things in 2022 from our annual Digital Credentials Summit in Atlanta, 28 February through 2 March, and the Learning Impact 2022 conference in Nashville, 13-16 June.

IMS Learning Impact Conference 2021 information header

Much of what happens in the IMS community I can't predict, but here are some of the themes to engage with going into 2022:

  • Curriculum Innovation and Equity: We have seen extraordinary leadership from IMS members in accelerating the deployment of innovative products while enabling new designs for leveraging digital to achieve greater equity and accessibility. It's time to spread these great ideas from both suppliers and institutions to create our future.
  • Ensuring Real-time Grades, Scores, Data Across Applications and Platforms: It's time to take full advantage of LTI-Advantage, OneRoster, and Edu-API to enable seeing the right data in the right place to improve user experiences. We've got the foundational standards—but we need more purposeful design and implementation.
  • Designing and Connecting Credential Ecosystems: There is nothing the IMS community is more passionate about than shaping the educational ecosystem to enable better recognition of student achievement that opens opportunities for students of all types and ages. The foundation provided by the Open Badges and Comprehensive Learner Record standards, including new work to take advantage of other new and evolving standards, will carry this undeniable trend wherever the leadership of IMS higher education and corporate learning leaders takes it.
  • Enabling Embedded and Balanced Assessment: In 2022, the breakthrough QTI 3.0 will be hitting the market in a big way, but the question is how to leverage digital assessment innovation across products and between state and district level systems to enable new models for embedded assessment. QTI 3—and what it can enable—should be on every edtech product company's roadmap. I expect to see robust and collaborative leadership on new models that will prioritize the many available features.
  • Foundational Data and Analytics Architectures to Support Educational Design: In 2021, IMS began work on the application of IMS standards coupled with next-generation data architectures to support the understanding and effective usage of learning platforms, learning tools, and curriculum resources.
  • Defining and Ensuring a Trusted Edtech Ecosystem: After more than a decade of breakthrough work in enabling connectivity among teaching and learning products, the IMS community is leading the definition and deployment of TrustEd Apps across K-12 and HED. The IMS community is where the leadership is forming on what information is needed and how to get it to the range of "users" who set up and touch an institution's or state's ecosystem.

Hopefully, you will note that none of the above areas of leadership is just "nice to have." They are all calls to action for collaboration because it is unrealistic to expect one organization, or a small group of organizations, to accelerate widespread progress in any of these areas alone.

Our purpose in IMS at every one of our over 500 meetings a year, especially at our in-person events, is to design and facilitate the community collaboration that will shape the future of the edtech ecosystem. It is a future filled with greater diversity, innovation, and personalized experiences.

By working together, we achieve a better understanding of our options and better support through lasting partnerships, and thus become much more effective leaders.

And, by producing connectivity that we can all leverage, we are building the capacity for change via an unparalleled shared investment. It's a formula that has worked well and will continue to work well for those that engage and are active in our community. As we go into 2022, with all the leadership challenges that will be out there, IMS members must take this to heart and benefit from the amazing progress and resources that we produce together.

I wish you all the best for the holiday season and look forward to seeing everyone in 2022!

 


 

IMS TECH TALK

Contributed by Dr. Colin Smythe, IMS Chief Architect

 

Using Compatibility Check to Improve Interoperability: From Conformance to Characterization

In a previous blog, I discussed the importance of Conformance and Certification as part of the IMS specification process. I also explained why IMS certification is required and why it is not "compliance." Certified products published in the IMS Product Directory have successfully passed rigorous IMS conformance testing. Defining the conformance requirements and providing the associated conformance test capabilities is essential for the IMS specification development process.

While certification is critical, it is entirely product-focused and does not cover the configuration of an operationally deployed version of the product. There is a limit to the degree of interoperability that can be guaranteed through certification alone: it is possible for two products certified for the same standard not to interoperate. Fortunately, the IMS Product Directory provides sufficient detail to confirm that a product holds the correct type of certification. In the case of deployed operational systems, however, there are many different ways in which a certified system can be configured. As the next step beyond certification, IMS has created the Compatibility Check (CCx) system, which is used to provide a characterization of a deployment.

Conformance testing is undertaken on a vendor's product configured for a testing environment. Typically, service providers are configured with in-house test data sets, i.e., real data is not used. IMS conformance testing ensures that invalid data is not sent by the service provider. However, unless all of the data stored by the service provider is entered through the API undergoing conformance testing, this form of testing does not explore how the data set entered the service provider. Therefore, once the test data set has been verified as correct (an inherent part of conformance testing), there is no further check on the service provider's ability to prohibit the storage of invalid data sourced through other interfaces.

I've already explained how two certified products may not provide data interoperability, but there is one more likely reason why certified systems may not have full interoperability. The actual configuration of a product will depend upon the business models being used by the vendor, e.g., specific endpoints may only be available for some business models. The fact that a product is certified does not mean that all of the features checked under conformance are available in an operational configuration.

Therefore, actual interoperability can ONLY be determined by examining the configurations of the deployed operational systems. We call this process Characterization.

Characterization enables the coverage of a specification in an operationally deployed implementation to be determined and recorded. Characterization is also the process by which data instances can be analyzed, i.e., covering the scenarios where only the data formats, not the exchange mechanism, are defined in a specification.

  • For service providers, characterization is automated; for consumers, an engineer must complete a detailed functionality questionnaire.

  • For data import systems, a model of the import capabilities must be manually created, whereas, for data export systems, functionality coverage is created through the analysis of a set of exported data instances.

  • For REST API services, the characterization covers all endpoints, the associated query parameters, and the error handling mechanisms.

From the data perspective, characterization checks all of the data typing, the supplied range of content, and any use of the extension mechanisms. Checking the range of content enables the characterization to provide insight into the frequency of usage of data fields: for example, whether or not an optional field is always populated, the coverage of the set of enumerated tokens, and so on. This content checking is important when confirming that the data required by a consuming system is being supplied by the provider (such a requirement would be an extra constraint on the optionality defined in a specification).
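As an illustration of the content-frequency side of characterization, the sketch below computes how often each field is populated across a set of exchanged records and tallies the enumerated tokens observed. The record shape and field names are invented (loosely roster-flavored) and are not taken from the actual CCx implementation.

```python
# Hedged sketch: field-population frequency and enumerated-token coverage
# over a batch of exchanged records (invented data, invented field names).
from collections import Counter

def characterize(records, enum_field):
    total = len(records)
    field_counts = Counter()
    tokens = Counter()
    for rec in records:
        for key, value in rec.items():
            if value is not None:          # count only populated fields
                field_counts[key] += 1
        if rec.get(enum_field) is not None:
            tokens[rec[enum_field]] += 1   # tally the enumerated tokens seen
    coverage = {k: c / total for k, c in field_counts.items()}
    return coverage, dict(tokens)

records = [
    {"sourcedId": "u1", "role": "student", "grade": "09"},
    {"sourcedId": "u2", "role": "teacher", "grade": None},
    {"sourcedId": "u3", "role": "student", "grade": "10"},
]
coverage, tokens = characterize(records, "role")
print(coverage)  # "grade" is populated in only 2 of 3 records
print(tokens)    # {'student': 2, 'teacher': 1}
```

A consumer that requires `grade` on every record would spot the mismatch here even though `grade` being optional makes every individual record specification-valid.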

Compatibility Check is the software solution that provides the characterization capability. Characterization is only one of the features available through Compatibility Check. Two other features are available:

  • Verification – confirmation that the actual data being exchanged is valid with respect to the specification. Once characterization has been completed, any other data instances can be verified with respect to that characterization;

  • Compatibility Comparisons – the capability for a CCx user to explore their set of characterizations and to make detailed comparisons between matched characterizations. For example, the compatibility between a specific service provider deployment and the corresponding consumer(s) can be completed. This comparison provides a definitive, predictive statement on the interoperability matches and mismatches.
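The verification feature above can be sketched in the same spirit: once a characterization exists, further data instances are checked against it. The characterization structure and field names below are invented for illustration; the real CCx system is far richer.

```python
# Hedged sketch of verification against a completed characterization.
# Both the characterization shape and the field names are invented.
characterization = {
    "always_populated": {"sourcedId", "role"},
    "allowed_tokens": {"role": {"student", "teacher"}},
}

def verify(record, ch):
    """Return a list of mismatches between a record and a characterization."""
    errors = []
    for field in ch["always_populated"]:
        if record.get(field) is None:
            errors.append(f"missing field: {field}")
    for field, allowed in ch["allowed_tokens"].items():
        value = record.get(field)
        if value is not None and value not in allowed:
            errors.append(f"unexpected token {value!r} in {field}")
    return errors

print(verify({"sourcedId": "u9", "role": "admin"}, characterization))
# -> ["unexpected token 'admin' in role"]
```

An empty list means the instance is compatible with what was characterized, which is exactly the kind of predictive interoperability statement the comparisons feature builds on.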

It is important to stress that while all of these analytics are stored (the characterization data), NONE of the actual data being exchanged is stored. Compatibility Check has been implemented such that a human never has access to, and cannot gain access to, the data being analyzed.

At present, Compatibility Check covers the IMS OneRoster and Common Cartridge standards. It also covers the set of apps under the IMS TrustEd Apps process, i.e., apps that have been checked against the IMS TrustEd Apps Rubric and that carry the corresponding TrustEd Apps Seal. Access to CCx is available to IMS Contributing Members and Affiliate Suppliers who have OneRoster or Common Cartridge certification. CCx is also available to all Educational Institutions.

Compatibility Check is also a part of the Standards First Initiative. Standards First is a call-to-action for the edtech community. This initiative aims to ensure that we can achieve open standards-based integrations as the foundation for enabling product choice, improving cost, and enhancing data availability and student privacy. Standards First begins with the Pledge to make open standards the first and primary choice for EdTech integrations. Compatibility Check is made available to confirm interoperability through the use of open standards and the correct usage of those open standards. Over time, CCx will be extended to cover other IMS specifications, particularly Learning Tools Interoperability (LTI) Advantage.

IMS Members wishing to learn more about Standards First and Compatibility Check can contact Lisa Mattson at lmattson@imsglobal.org.

 


 

November 2021 | Interoperable K-12 Edtech for Long-Term Success

Contributed by Marcy Daniel, Chief Product Officer, PowerSchool

 

How a Standards-First Approach Leads to Better Support for Student Outcomes

As K-12 schools and districts continue to increase their reliance on software to support operations and instruction, the importance of edtech interoperability with connected products that share data, workflows, and a user experience is vital to long-term success.

While prioritizing digital enablement and student outcomes, schools and districts may inadvertently find themselves supporting multiple software solutions lacking interoperability. These could include their student information system (SIS), back-end office support, talent and recruitment software, instructional tools and resources, and even family communication portals. As a Digital Promise study highlighted, 74 percent of districts use more than 26 different education technology products, and another 17 percent of districts use more than 100. The result of the rapid proliferation of software and technology can mean managing a host of systems that generally are not built to speak to each other efficiently—or at all.

 

Number of Edtech Products K-12 Districts Use

  • 26+ products: 74% of districts
  • 100+ products: 17% of districts

Source: "The State of Data Interoperability in Public Education," Digital Promise, 2017

 

In addition to causing potential funding and talent resource issues for school and district administrators, these non-communicative software platforms can disrupt instruction while decreasing overall productivity throughout an organization. Related studies show that when teachers must maneuver between so many products, they have less time to teach: on average, teachers spend nearly half of their working time on non-teaching activities.

 

Interoperability Supports Whole Child Instruction

Without insight into each learner's K-12 journey and a strong data culture built around solutions that can speak to each other securely, educators may struggle to make informed instructional decisions. In other words, disparate software systems with data residing in silos mean we're missing the big picture when it comes to maximizing student support.

Alternatively, interoperable solutions can significantly impact seeing the full view of each child's learning path with access to readily available data. An interoperable solution that integrates with other edtech products can make user experiences more streamlined, convenient, and natural when maneuvering between programs. With a unified, cohesive workflow, it's easier to learn applications, navigate quickly, and improve productivity. It all adds up to more time for instruction and managing district or school needs and less time struggling with the technology itself.

 

A Standards-First Approach Supports Interoperability

When choosing—and working with—edtech vendors, school and district leaders should look for products and organizations that strive for a standards-first approach, both in terms of technology and instruction. This outlook helps support equitable technology across organizations without having to initiate costly integration projects on their own.

Knowing which solutions adhere to industry-leading open standards, such as those supported by IMS Global, can help K-12 schools and districts meet significant goals.

By bringing together a suite of integrated solutions from certified edtech companies, districts can:

  • Accelerate digital transformation
  • Make school operations more efficient
  • Improve student performance

More than 120 organizations, including edtech vendors and K-12 districts working with IMS, have pledged to prioritize and advocate for open standards via the Standards First pledge. Part of this initiative, and critical to the future of modernization in the education landscape, is ensuring that systems connect seamlessly and securely. Edtech vendors can ensure this by getting IMS certified. Certification means not only that their products are interoperable, but also that they have confirmed plug-and-play connectivity without requiring custom integrations—a significant resource and time saver for K-12 districts.

IMS also supports the Competencies and Academic Standards Exchange® (CASE®) standard and provides a repository of CASE-published standards through the CASE Network for anyone to use, generally at no cost. Designing software that supports the CASE standard ultimately saves districts both time and cost.

Since most state and national learning standards are published only in human-readable formats, such as PDF, it's difficult for K-12 technology directors and curriculum leaders to integrate the learning standards into their edtech tools in a usable and flexible way. However, state and national issuers who choose to publish in the CASE format make it easier for edtech vendors and their users to access and use the standards to better support student learning.
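To illustrate why machine-readable publication matters, here is a hedged sketch of consuming a CASE-style document. The field names (CFDocument, CFItems, humanCodingScheme, fullStatement) follow the CASE JSON layout, but the payload itself is invented, not fetched from a real CASE Network endpoint.

```python
# Invented CASE-style payload; field names follow the CASE JSON layout,
# but the document and statements are made up for illustration.
case_payload = {
    "CFDocument": {"title": "Example State Math Standards"},
    "CFItems": [
        {"humanCodingScheme": "M.1.1",
         "fullStatement": "Count to 100 by ones and by tens."},
        {"humanCodingScheme": "M.1.2",
         "fullStatement": "Compare two two-digit numbers."},
    ],
}

def index_by_code(payload):
    """Build a code -> statement lookup an edtech tool could align content against."""
    return {item["humanCodingScheme"]: item["fullStatement"]
            for item in payload["CFItems"]}

standards = index_by_code(case_payload)
print(standards["M.1.1"])  # Count to 100 by ones and by tens.
```

With a PDF, building that lookup means manual transcription; with CASE, it is a few lines of code, which is the time and cost saving the paragraph above describes.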

 

Security Should Be Top-of-Mind

One of the most important aspects of connecting edtech products is the security of students and data. Ensuring that data is transferring from one system to the next in a truly integrated and secure manner is critical. It requires that school and district technology directors select edtech vendors who adhere to strict standards, set and monitor role-based privileges, design data security plans thoughtfully, and update them regularly.

It's also crucial that state and district leaders collaborate with edtech software developers as a community to design products and platforms with student data protection needs at the forefront.

 

Vet Your Edtech Vendors

 → Download this infographic to assess your edtech vendors in 20 key areas that show what they are doing to keep your student, staff, and school data safe.

 

Interoperability Delivers Benefits for the Long Term 

Ultimately, having interoperable systems that adhere to technical and instructional standards can save K-12 schools and districts time and resources. And at the same time, it increases opportunities for developing and delivering individualized instruction tailored to the whole child's needs. In the long term, interoperable systems are a must when supporting better education management.

 

Marcy Daniel is PowerSchool’s Chief Product Officer. She is responsible for developing a unified strategic vision of PowerSchool’s portfolio products, managing overall roadmap development, and delivering to clients on the roadmap.

 


Rob Abel, Ed.D. | October 2021

 

"Clarity" —John Mayer

 

Some Challenges to Consider as We Continue to Grow a Healthy EdTech Ecosystem

Having completed a very successful Learning Impact 2021 virtual conference, and looking forward to our upcoming in-person meetings, including our K-12 Leadership Retreat (in Denver, 10-11 November) and our annual Digital Credentials Summit (in Atlanta, 28 February through 2 March 2022), I’ve never been more hopeful and optimistic about the IMS community cause of enabling a vibrant, innovative, open edtech ecosystem.

However, I’ve also never been more concerned about a confluence of factors I hear about from the IMS members that may slow our progress.

First, let me cover some of the many things to consider on the optimism side of the equation.

We’ve reached new heights in so many ways. Our community and the market highlighted the power of open standards-based ecosystems during the pandemic. Both the number of products going through IMS certification and individuals participating in IMS exceeded 6,000. IMS standards are used in the most widely used edtech products on the planet, positively impacting nearly all students and teachers in the USA and growing in many other parts of the world.

The IMS work agenda has never been more vibrant, relevant, and horizon-expanding (in terms of supporting/encouraging new educational models). IMS provides a new generation of tools to help edtech participants troubleshoot and converge the use of open standards in practice. Probably most importantly, everything we have achieved is based on a rock-solid technical process and financial foundation.

On the not-so-optimistic side of the equation, there are three factors that I am hearing from IMS members as potential sources of drag. First, there has been an impressive number of mergers/acquisitions/roll-ups/IPOs among edtech suppliers. While these are pretty “normal” in a growing market, they also can bring much uncertainty. Second, the “great resignation” and general post-pandemic uncertainty mean more attention on just keeping the lights on. Third, over the last several years, there has been growing confusion caused by well-meaning but not exceptionally expert organizations responding to funded initiatives—leading to less clarity and transparency when it comes to adherence to open standards. We know from many examples that when organizations are funded to advocate or implement interoperability, it rarely results in a solution that helps maintain or sustain open standards.

Image from IMS Annual Report CY2020 - young student in a classroom with backpack smiling

Again on the positive side of things, it is wonderful to see so much leadership in K-12 evolving to more personalized and equitable education. There is renewed interest and focus in higher education on enabling digital transformation to meet the many challenges going forward. It is an honor and privilege to work among so many leaders who are trying to improve education and willing to put in the time and effort to work together to lift up themselves and others. Might I suggest that we take a moment here to remember that the future we seek is dependent on achieving and maintaining a healthy edtech ecosystem based on open standards? And please remember that the impact we seek from open standards should be readily apparent: better user experiences, greater choice of products, lower cost integrations, faster integrations, richer data, actionable data, greater trust between institutions and suppliers. These benefits come at no additional cost because there is a return on investment for all stakeholders as the market opportunity for edtech increases.

I’m confident that the IMS member community will navigate the challenges mentioned above and many more in the coming months and years, just as we have from the beginning.

That said, as someone who has put decades of my life into this mission—and fully acknowledging all our progress—I believe that now is a time when we will need to collaborate even more closely to ensure clarity and transparency in achieving open standards-based interoperability in edtech.

 


 

IMS TECH TALK

Contributed by Dr. Colin Smythe, IMS Chief Architect

 

The IMS Security Framework 1.1: Security for All IMS Service-Based Specifications

Providing secure access between edtech systems and learning resources is essential. We've published many service-based specifications for edtech interoperability, including Learning Tools Interoperability/LTI Advantage, OneRoster, and Comprehensive Learner Record. Addressing secure data exchange, authentication, and authorization is an important part of our specifications. These needs are not unique to edtech, but some of the most vulnerable users in society make extensive use of edtech systems and resources.

In May 2019, IMS published the Security Framework 1.0. This framework established the set of security patterns and techniques approved for use in an IMS interoperability specification. Experience has shown that security requirements change continually, and a solution that works today may become vulnerable tomorrow. In August 2021, IMS published the Security Framework 1.1, which included new security features and some refinements reflecting the two years of experience gained from using the original version. The Security Framework places IMS in a very strong position with respect to best practice adoption and adaptation. Nevertheless, we already know that it will need a further revision in the next 18-24 months.

The IMS staff are not experts in security, but IMS member organizations, presently numbering over 675, have experts in just about every area of technology. So, we have been able to bring their expertise together to create the IMS Security Framework.

Security concerns cover all market sectors, and several other standards organizations have created solutions with very broad adoption. The approach taken by IMS when creating the Security Framework is to adopt this best practice and adapt it to meet the specific needs of edtech interoperability. We achieve secure communication by requiring the use of Transport Layer Security (TLS) as defined by the Internet Engineering Task Force (IETF) Request for Comments (RFC) 8446. Providing authentication and authorization is more complicated. Authentication is the confirmation that a user is who they claim to be, while authorization gives that user permission to access a resource. In the IMS specifications, there are three scenarios for which we must support authentication and authorization:

  • Web services-based information exchanged between systems where there is an established trust relationship (OneRoster is a typical example)

  • Web services-based information exchange between systems where there is no established trust relationship (Comprehensive Learner Record is a typical example)

  • Moving users between edtech systems that may or may not have an established trust relationship (LTI Advantage is a typical example)

Therefore, two third-party specifications have been adopted:

  • OAuth 2 defined in IETF RFCs 6749 and 6750
  • OpenID Connect, from the OpenID Foundation, which is built upon OAuth 2 to provide authentication
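For the trusted server-to-server case, the OAuth 2 Client Credentials Grant (RFC 6749, section 4.4) boils down to a single form-encoded POST to the platform's token endpoint. The sketch below, in Python using only the standard library, shows how such a request could be assembled; the endpoint URL, client credentials, and scope value are hypothetical placeholders, not values mandated by any IMS specification.

```python
# Sketch of an OAuth 2 Client Credentials Grant token request (RFC 6749,
# section 4.4), the pattern used for trusted server-to-server exchanges such
# as OneRoster. All endpoint/credential/scope values are illustrative only.
import base64
from urllib.parse import urlencode

def build_token_request(token_url, client_id, client_secret, scopes):
    """Return the (headers, body) pair for a client-credentials token POST."""
    # Client authentication uses HTTP Basic with the client id and secret.
    credentials = base64.b64encode(
        f"{client_id}:{client_secret}".encode("utf-8")
    ).decode("ascii")
    headers = {
        "Authorization": f"Basic {credentials}",
        "Content-Type": "application/x-www-form-urlencoded",
    }
    # The body requests an access token limited to the listed scopes.
    body = urlencode({
        "grant_type": "client_credentials",
        "scope": " ".join(scopes),
    })
    return headers, body

headers, body = build_token_request(
    "https://example.org/token",          # hypothetical token endpoint
    "my-client-id", "my-client-secret",   # hypothetical credentials
    ["roster.readonly"],                  # hypothetical scope name
)
print(body)
```

A real client would POST this body to the token endpoint over TLS and read the access token from the JSON response; token caching and expiry handling are omitted here.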

One of the challenges when combining specifications from several sources is creating consistent terminology. In the Security Framework, we have used the established IMS terminology as our basis and have been careful to explain how it maps to the terminology used in the third-party specifications.

In version 1.1, several functional additions have been made to meet the requirements of new IMS specifications. These additions are:

  • Support for access token refresh as part of the OAuth 2 Authorisation Code Grant workflow
  • Support for access token revocation (using [RFC7009]) as part of the OAuth 2 Client Credentials Grant workflow
  • Support for access token and/or refresh token revocation (using [RFC7009]) as part of the OAuth 2 Authorisation Code Grant workflow
  • Support for dynamic client registration to simplify the use of OAuth 2 using [RFC7591] and [RFC7592]
  • Support for dynamic client registration to simplify the use of OpenID Connect [OPENID-DCR] and [OPENID-DIS] for systems based upon LTI
  • Definition of the use of a service discovery endpoint and the Service Discovery Document (SDD), based upon the OpenAPI (JSON) file format, for obtaining a description of a service provider's service capabilities
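To illustrate one of these additions, the RFC 7009 revocation call adopted in version 1.1 is itself just a small form-encoded POST to the authorisation server's revocation endpoint. The sketch below shows how that request body could be built; the token value is a placeholder, and a real client would also authenticate itself when making the call.

```python
# Sketch of a token revocation request body as defined by RFC 7009, which
# version 1.1 of the Security Framework adopts for the OAuth 2 Client
# Credentials and Authorisation Code Grant workflows.
from urllib.parse import urlencode, parse_qs

def build_revocation_request(token, token_type_hint="access_token"):
    """Return the form-encoded body for a POST to the revocation endpoint."""
    # RFC 7009 defines only two values for the optional token_type_hint.
    if token_type_hint not in ("access_token", "refresh_token"):
        raise ValueError("token_type_hint must be access_token or refresh_token")
    return urlencode({"token": token, "token_type_hint": token_type_hint})

# Revoke an access token (the value is illustrative, not a real token).
body = build_revocation_request("2YotnFZFEjr1zCsicMWpAA")
parsed = parse_qs(body)
print(body)
```

Per RFC 7009, the server responds with HTTP 200 whether or not the token was still valid, so the client treats revocation as fire-and-forget.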

Apart from a few bug fixes and corrections, the new functionality added in version 1.1 is backwards compatible with version 1.0. Therefore, migration from version 1.0 to 1.1 is expected to occur as part of the natural revision cycle, on a per-specification basis.

IMS Security Committee

The IMS Security Committee has been given the responsibility for maintaining the Security Framework. This committee brings together some of the IMS technical staff with experts in the field of security and edtech systems from the IMS Contributing Members. It provides IMS with awareness of state-of-the-art security best practices and insight into how organizations should apply those practices. The Security Committee is also responsible for undertaking an annual security audit: a formal review of how the Security Framework is applied across the IMS specifications, as well as of the effectiveness of the Security Committee itself. The audit report contains a set of short-, medium-, and long-term recommendations, which should be completed in the twelve months following publication of the report.

The IMS architects are responsible for ensuring that the various specification working groups use the Security Framework appropriately. Before a specification is submitted to the IMS Technical Advisory Board (TAB) for a vote on Final Release, it must be submitted to the Security Committee for formal review. If anything may prove contentious, it is recommended that the specification be sent to the Security Committee earlier, as part of the Candidate Final Release process.

What would constitute being contentious? The use of the patterns defined in the Security Framework is not mandatory, but if other approaches are to be used, clear justification must be given; this includes a decision not to address security in the specification at all. In some cases, the security requirements for a specification may not be covered by the Security Framework. In that event, the Security Framework would be updated to include the new capabilities required by that specification, and these new patterns would then become available to other IMS specifications. The annual security audit provides the final mechanism for evaluating the ways in which every IMS specification makes use of the latest version of the Security Framework.

 


 

October 2021 | You Need Data to Know How Much It Helps Your District

Contributed by Monica Cougan, Manager of Strategic Relationships and Initiatives, CatchOn

 

Simplify and Strengthen Your Data Strategy to Assess EdTech Efficacy and Help Ensure Data Privacy Compliance

You won’t find too many educators who don’t know they need detailed data to make the most of their edtech investments, but that doesn’t mean everyone knows just how helpful data can be—or exactly which aspects of district performance data can help boost.

Assessing the true efficacy of edtech has long been a challenge for most districts because they often don’t have the complete picture of what tools and resources are actually being used, by whom, and for what purpose. The data that most districts have on edtech usage and efficacy is siloed, and it’s difficult to see the whole data story. CatchOn provides the tools and learning analytics that make it easy to convert that data into an accelerant to change.

Making the most of all available data can have a transformative impact because, when it comes to districts’ edtech, so much is on the line: in 2020 alone, American school districts spent more than $35 billion on hardware, software, curriculum resources, and networks; now, more than ever, districts are trying to fully understand both the return on—and impact of—those investments, while also monitoring digital assets for privacy compliance. 

But just how much can data analytics help districts? Because we at CatchOn are in the data analytics business, we decided to gather some data about how helpful district-wide student-level data can be: we prepared a study of diverse school districts, gathered data about how they were using CatchOn, and analyzed that data to prepare a report.

Read on to find out just how helpful data analytics can be for your district.

Our Study

To find out how powerful data analytics can be for educators, CatchOn partnered with Digital Promise, a national education nonprofit organization, to conduct a pilot study of seven school districts that are part of Digital Promise’s League of Innovative Schools.  The primary objective of the study was to understand the potential and power of data to support key district needs. We believed that because CatchOn granted administrators better information about how various digital products were being used by students and teachers, the solution would empower districts to improve their edtech strategies.

Our Findings

We found that access to the data analytics within the CatchOn platform can have a profound impact on districts’ edtech strategies. Here are some key findings:

  • 100% of the leaders from the Digital Promise pilot school districts report that reviewing their CatchOn data helps them identify gaps in student engagement that can indicate inequity.
  • 100% of the leaders from the Digital Promise pilot school districts say that their CatchOn data is valuable for informing their ROI analysis on technology investments.
  • 100% of the leaders from the Digital Promise pilot school districts believe that their CatchOn data is valuable for supporting their district’s online learning initiatives.

The districts we surveyed also believed that CatchOn’s data analytics would help them over the long term with three critical operational benefits. Districts reported that CatchOn:

  • Supported messaging efforts to the community about product choices.
  • Supported utilization analytics with access to regular and timely data.
  • Helped educators ensure they were effectively monitoring product usage, achieving privacy compliance, and following all their requirements.

Our Conclusion

From this initial study, the evidence suggests that CatchOn provides districts the data and analytics they need to attain better educational results, achieve high-level implementation for their investments, and monitor compliance to keep their staff and students safe. CatchOn does so in part through third-party badging that helps districts align with the privacy standards of leading education organizations such as IMS Global.

“Through the strategic and effective use of data, school leaders can make informed decisions regarding budgets, curricula, resources, staffing, and other supports for students,” concluded Dewayne McClary, Digital Promise’s director of the League of Innovative Schools. “Data disaggregation is a powerful tool that allows school leaders to be more intentional about their decision-making and address educational inequities that have plagued student achievement and opportunity for far too long.”

Based upon the Speak Up Research Project’s findings from the 2020-21 school year, 90% of district administrators say their district has successfully implemented a one-to-one device program for their students in which students can use their devices in school and at home. Further, teachers report a 20% increase in their integration of digital content within everyday instruction in the 2020-21 school year compared to the previous year.

Taken together, these findings suggest that as long as districts wish to invest in edtech, it is well worth their while to invest in effective data analytics to evaluate and hone those efforts. Doing so will help boost student performance, promote educational equity, protect student data, and save districts money.

After all, the more districts know, the more they’re empowered to secure transformational results.

 

Special Offer for IMS K-12 Member Districts

All of us at CatchOn greatly value our partnership with IMS Global! As part of our partnership, CatchOn is offering this exclusive offer to IMS K-12 member organizations:

  • A 60-day trial of CatchOn
  • Waiver of set up fees for the district-wide implementation of CatchOn on all school-owned devices upon purchase

Click here to learn more about this exclusive offer.

 

Monica Cougan is the Manager of Strategic Relationships and Initiatives at CatchOn and ENA, where she leverages more than 35 years of experience in education and technology to help schools make the most of new technology. She has been an evangelist for the adoption of technology as a transformative educational tool. Monica has extensive experience helping K-12 school districts implement programs that foster systemic change.

 


Rob Abel, Ed.D. | August 2021

 

"Get your motor runnin'" —Steppenwolf

 

What’s Under the Hood of Your EdTech?

Once upon a time, you could not only buy a muscle car, but you could work on pretty much every part of what makes it run. I think it’s likely that I’m in the last generation of humans who spent substantial time working on their cars when they were young, whether just for maintenance or fun. A key reason we were able to do this was that the designs of cars were relatively simple, with pretty much all “subcomponents” operating independently of one another. You did not have to mess with onboard microprocessors and extensive electronics to tune your engine or change your brakes.

But embedded computer tech has gradually changed the design of autos to be much more interconnected. My current car, for instance, is a hybrid. It is a performance car that automatically determines how to distribute power from gas and electricity to each wheel and adjusts the suspension in real time in response to the driver and to conditions derived from data from many sensors.

My dad was an aerospace engineer. In the 1970s, he was involved in developing prototypes of hybrid engines for cars. Fifty years later, hybrids are everywhere, and it does not take much imagination to predict a predominance of electric vehicles over the next decade or two. What’s under the hood has changed forever. Motivations for the evolution have included reducing harmful emissions and fuel costs. But the more integrated architecture enables the user experiences that sell cars.

What does this have to do with edtech?

First, we see a very similar evolution in terms of integration bringing value to the end-users. While the individual applications of an edtech design are important, more and more the user experience is how well the totality and built-in flexibility of the design respond to user needs. We're making powerful advancements like achieving digital on day one and enabling instructional innovation by making data easier to see, understand, and act upon. These interconnections are what enable faculty to teach and students to learn their way.

Second, as edtech evolves, the bar is going up for IT, academic, and product leadership when it comes to designing and ensuring the user experience across products. Once upon a time, automobiles had only a set of hardwired gauges, each representing one signal for the user to monitor. The software products we know today as the LMS, portal, or single sign-on system are just the first or second generation of what will evolve into a much more sophisticated approach to configuring the edtech experience at an institution and for each learner.

Another way to make this second point is that what is “under your edtech hood” is not just a list of what products you support (for institutions) or what specific integrations you support (for product developers). What is under your hood is the design of how users launch products, how context and user preferences are transmitted to each launched product, what progress indicators and data are generated, which micro-credentials are awarded, where the outputs go, and how the outputs are expected to be used.

Candidate NGDLE architecture in relation to product categories that already exist in higher ed. Source: Shaping the Educational Technology Innovation Ecosystem by Rob Abel, EDUCAUSE Review, July 17, 2017

 

Third, the need to reduce harmful effects on the environment and keep fuel costs low is analogous to the need to achieve scale and agility without increasing cost.

If all of this sounds a bit “futuristic,” well, some of it may be, but this is the road we’re traveling in IMS. The scaling of edtech and the cost savings achieved from open standards-based interoperability, in concert with a committed community that shares how it is making progress, are already reaping rewards. The leaders in IMS can now focus more on that user experience design. For instance, during the pandemic, IMS members—institutions and suppliers alike—have been able to leverage what is “under the hood” of their edtech ecosystem to better configure experiences for end-users.

Going forward, intentionally designed interoperability, working across an institutional product ecosystem, is what will make or break the power of digital edtech for faculty, learners, and administrators.

We have come a long way and have a long way to go. But there is much that we can do today to improve user experiences by encouraging broader and deeper adoption of the work of the IMS community. Let’s take the next step of incorporating the full capabilities of LTI Advantage, OneRoster, QTI, CASE, TrustEd Apps, and Caliper for the benefit of our learners!

To learn more, I encourage you to register for our upcoming annual Learning Impact: Connecting the Power of the EdTech Community. IMS Contributing Members get two free registrations, and Affiliate Members get one free registration!

 


 

IMS TECH TALK

Contributed by Dr. Colin Smythe, IMS Chief Architect

 

Compliance, Conformance, and Certification: Why Getting IMS Certified is Important

One of the benefits of IMS membership is having your certified products listed in the IMS Product Directory—the official list of all learning apps and tools that have passed IMS interoperability certification. A product must demonstrate support of one or more IMS specifications through conformance testing to appear in the directory. IMS awards each certification for 12 months, so every product must undergo successful recertification to maintain its listing. This 12-month cycle allows vendors to use agile development processes without requiring recertification for every product release. New major versions of a product must be certified. It is not unusual for several versions of a product to show up in the product directory. It is important to note that the product receives certification and not a deployment of the product.

The IMS Product Directory also includes products vetted for student data privacy using the IMS TrustEd Apps process. Many, but not all, vetted products have also achieved IMS standards certification. This blog focuses on the products that go through conformance testing for IMS certification.

Defining the conformance requirements and providing the associated conformance test capabilities are essential to the IMS specification development process. Each IMS specification must have a Conformance & Certification document. These documents describe the certification process and define the conformance criteria that a product must achieve for each available certification. Most specifications have more than one certification. An example of this is a service-based specification with certifications as a Service Provider and a Service Consumer (a product may be either or both). The conformance and certification aspects are addressed once the project group responsible for developing the specification publishes the Member Candidate Final documents, which are available to IMS members only. A minimum number of products must be certified before the Final Release of an IMS specification can be published; once it is, the document set becomes publicly available to everyone.

An IMS specification cannot have a Final Release until IMS members define conformance and certification and create and use the conformance test.

One objective of IMS certification is to demonstrate to the market the level of adoption by vendors committed to open solutions. When a new version 1.0 specification is first published, the conformance requirements are defined to encourage broad adoption. Over time, the level of adoption will change, and the specification itself will evolve. Therefore, the IMS specification maintenance process allows the certification requirements to be changed to fit the changing needs of a market, even when there are no changes to the functionality supported by the specification. Also, it is not unusual for the conformance testing to be continually improved. This flexible approach to certification is another reason why products must undergo annual recertification.

It is becoming more common for organizations to require IMS specifications as part of the procurement process, and it is natural for vendors to claim compliance. From an IMS perspective, however, compliance is merely a claim made by vendors who are not IMS certified. IMS members can provide their IMS product registration numbers, and users can easily confirm certification with a quick inspection of the IMS Product Directory. If support of an IMS specification is required, the tendering process should include checking the product’s IMS Registration Number. In most cases, the claim of compliance is wishful thinking based on unjustified confidence in in-house interoperability testing. Sometimes, it is a cynical misrepresentation. Buyer beware.

When products claim compliance but are not certified, there are two implications. First, the product has not been through IMS conformance testing. Using the IMS conformance test systems is essential in producing a correct implementation of a specification; in most cases, a solution goes through several iterations of conformance testing before being certified. IMS members may use the conformance test systems without restriction, meaning they are not just for certification. Second, if there is a failure of interoperability when using the IMS specifications, IMS certification requires the vendors to work together, and if appropriate with IMS, to resolve the problem. In some cases, IMS may have to improve the implementation guidance, correct the specification, or improve the conformance test systems to avoid such incompatibilities in the future. Experience has shown that products that are not certified do not implement the corresponding IMS specification correctly. Superficially, they appear to work, but there will likely be many significant errors in the implementation.

Certification requires IMS membership. Is access to certification sufficient justification for IMS membership? Undoubtedly, YES!

IMS has invested millions of dollars in developing and supporting our extensive test and conformance systems and related artifacts. Five to ten full-time software developers are working on the various IMS test and conformance systems at any one time. Even the largest organizations see significant benefits in using the IMS test and conformance systems. A further benefit is that the IMS technical team provides a wide range of support to help IMS members adopt and adapt IMS specifications. As part of our specification development process, IMS creates the following testing and conformance artifacts:

  • Service Provider and Consumer conformance test systems (used for OneRoster, LTI, CASE, etc.)

  • Reference implementations of the full specification

  • Online validation of content instances (used for Common Cartridge, QTI, etc.)

  • Reference test sets for testing data import capabilities (used for Common Cartridge, OneRoster, QTI, etc.)

It is important to stress that all of these artifacts are available, for unlimited use, to IMS members. All of these artifacts are being continually improved.

While certification is very important, it is product-focused and not deployment-specific. There is a limit to the degree of interoperability guaranteed through certification alone. It is possible for two products certified for the same specification not to interoperate. For example, OneRoster has both REST-based and CSV-based bindings, and interoperability between these is not possible. Therefore, being certified alone is insufficient; the right type of certification is required for interoperability. The IMS Product Directory provides sufficient details to ensure the right type of certification is available. In the case of deployed systems, there are many different ways in which a certified system can be configured (this may also depend on the business model used by the vendor).

As the next step beyond certification, IMS has created the Compatibility Check (CCx), which provides the Characterization of a deployment. Furthermore, CCx enables characterizations to be compared. This means that we can compare the characterizations of certified Service Providers and Consumers to show all of the interoperable and non-interoperable features (including the usage of extensions). At present, CCx supports OneRoster and Common Cartridge, but it will be extended to cover many of the IMS specifications over time. I will go into more details about characterization and CCx in a later blog but understand that the characterization of products is the way forward. It is a far better measure of interoperability than certification alone.
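As a rough illustration of the idea behind comparing characterizations, two deployments can be modeled as sets of supported features, with the overlap showing what will interoperate and the differences showing where integration will fail. The feature names and data structure below are invented for illustration only; CCx defines its own formats and covers far more detail than this sketch.

```python
# Illustrative sketch of comparing two deployment "characterizations" in the
# spirit of the IMS Compatibility Check (CCx). Feature names and the data
# structure are hypothetical; CCx itself defines its own representations.

def compare_characterizations(provider_features, consumer_features):
    """Return the shared and unmatched features of two deployments."""
    provider, consumer = set(provider_features), set(consumer_features)
    return {
        "interoperable": sorted(provider & consumer),   # both sides support
        "provider_only": sorted(provider - consumer),   # unused by consumer
        "consumer_only": sorted(consumer - provider),   # unmet consumer needs
    }

# Hypothetical OneRoster-style feature lists for two certified products.
sis = ["rest-binding", "users-endpoint", "classes-endpoint", "demographics-endpoint"]
lms = ["rest-binding", "users-endpoint", "classes-endpoint", "gradebook-extension"]

result = compare_characterizations(sis, lms)
print(result["interoperable"])  # the features both deployments share
```

The value of comparing characterizations rather than certificates is visible even in this toy example: both products could hold the same certification, yet the non-overlapping features are exactly where a deployment would break.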

 

