IMS Final Release


IMS Question & Test Interoperability Overview

Version: 2.1 Final

Date Issued: 31 August 2012
Latest version: http://www.imsglobal.org/question/

IPR and Distribution Notices

Recipients of this document are requested to submit, with their comments, notification of any relevant patent claims or other intellectual property rights of which they may be aware that might be infringed by any implementation of the specification set forth in this document, and to provide supporting documentation.

IMS takes no position regarding the validity or scope of any intellectual property or other rights that might be claimed to pertain to the implementation or use of the technology described in this document or the extent to which any license under such rights might or might not be available; neither does it represent that it has made any effort to identify any such rights. Information on IMS's procedures with respect to rights in IMS specifications can be found at the IMS Intellectual Property Rights web page: http://www.imsglobal.org/ipr/imsipr_policyFinal.pdf.

Copyright © 2005-2012 IMS Global Learning Consortium. All Rights Reserved.

Use of this specification to develop products or services is governed by the license with IMS found on the IMS website: http://www.imsglobal.org/license.html.

Permission is granted to all parties to use excerpts from this document as needed in producing requests for proposals.

The limited permissions granted above are perpetual and will not be revoked by IMS or its successors or assigns.

THIS SPECIFICATION IS BEING OFFERED WITHOUT ANY WARRANTY WHATSOEVER, AND IN PARTICULAR, ANY WARRANTY OF NONINFRINGEMENT IS EXPRESSLY DISCLAIMED. ANY USE OF THIS SPECIFICATION SHALL BE MADE ENTIRELY AT THE IMPLEMENTER'S OWN RISK, AND NEITHER THE CONSORTIUM, NOR ANY OF ITS MEMBERS OR SUBMITTERS, SHALL HAVE ANY LIABILITY WHATSOEVER TO ANY IMPLEMENTER OR THIRD PARTY FOR ANY DAMAGES OF ANY NATURE WHATSOEVER, DIRECTLY OR INDIRECTLY, ARISING FROM THE USE OF THIS SPECIFICATION.

Join the discussion and post comments on the QTI Public Forum: http://www.imsglobal.org/community/forum/categories.cfm?catid=52

 

© 2012 IMS Global Learning Consortium, Inc. All Rights Reserved.
The IMS Logo is a trademark of the IMS Global Learning Consortium, Inc. in the United States and/or other countries.
Document Name: IMS Global Question & Test Interoperability (QTI) Overview Final v2.1 Revision: 31 August 2012

 

 


Table of Contents

1. Question and Test Interoperability
1.1. History of this Specification
1.2. Scope
2. Specification Use Cases
2.1. Use Case Actors
3. Structure of this Specification
4. Conformance, Extensions, and Specification and Profile Maintenance
4.1 Extension Mechanisms
5. Recent Changes
6. References

1. Question and Test Interoperability

The IMS Question & Test Interoperability (QTI) specification describes a data model for the representation of question (assessmentItem) and test (assessmentTest) data and their corresponding results reports. The specification thus enables the exchange of item, test and results data between authoring tools, item banks, test construction tools, learning systems and assessment delivery systems. The data model is described abstractly, using [UML] to facilitate binding to a wide range of data-modelling tools and programming languages. For interchange between systems, however, a binding is provided to the industry-standard eXtensible Markup Language [XML], and use of this binding is strongly recommended. The IMS QTI specification has been designed to support both interoperability and innovation through the provision of well-defined extension points. These extension points can be used to wrap specialized or proprietary data in ways that allow it to be used alongside items that can be represented directly.
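To give a flavor of the XML binding, a minimal multiple-choice assessmentItem might look like the following sketch. The identifiers, title and prompt text are invented for illustration; the XML Binding document is the normative reference for the schema.

```xml
<assessmentItem xmlns="http://www.imsglobal.org/xsd/imsqti_v2p1"
    identifier="example-item" title="Example Item"
    adaptive="false" timeDependent="false">
  <!-- Declares the response variable the interaction will bind to -->
  <responseDeclaration identifier="RESPONSE" cardinality="single" baseType="identifier">
    <correctResponse>
      <value>choiceA</value>
    </correctResponse>
  </responseDeclaration>
  <!-- Declares the outcome variable set by response processing -->
  <outcomeDeclaration identifier="SCORE" cardinality="single" baseType="float"/>
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" shuffle="false" maxChoices="1">
      <prompt>Which planet is closest to the Sun?</prompt>
      <simpleChoice identifier="choiceA">Mercury</simpleChoice>
      <simpleChoice identifier="choiceB">Venus</simpleChoice>
    </choiceInteraction>
  </itemBody>
  <!-- Standard response processing template: score 1 if the response matches -->
  <responseProcessing
      template="http://www.imsglobal.org/question/qti_v2p1/rptemplates/match_correct"/>
</assessmentItem>
```

The declarations, body and response processing illustrate the separation, in the data model, between the variables an item manipulates and the interactions that gather candidate input.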

1.1. History of this Specification

An initial V0.5 specification was released for discussion in March 1999. In November of that year it was agreed to develop IMS Question & Test Interoperability v1.0, which was released as a public draft in February 2000 and as a final specification in May 2000. The specification was extended and updated twice, in March 2001 and January 2002. By February 2002, more than 6000 copies of the IMS QTI 1.x specifications had been downloaded from the IMS website.

Since then, a number of issues have been raised by implementers and reviewed by the QTI project team. Many of them were dealt with in an addendum, which defined version 1.2.1 of the specification and was released in March 2003. Some of the issues could not be dealt with this way, either because they required changes to the specification that would not be backward compatible or because they uncovered more fundamental issues that would require extensive clarification or significant extension of the specification to resolve.

Since the QTI specification was first conceived, the breadth of IMS specifications has grown, and work on Content Packaging, Simple Sequencing and, most recently, Learning Design created the need for a cross-specification review. This review took place during 2003, and a number of harmonization issues affecting QTI were identified. In September of that year a project charter was agreed to address both the collected issues from 1.x and the harmonization issues, and to draft QTI V2.0. To make the work manageable and to ensure that results were returned to the community at the earliest opportunity, some restrictions were placed on the scope of the recommended work. The QTI V2.0 release of the specification therefore concentrated only on the individual assessmentItem and did not update those parts of the specification that dealt with the aggregation of items into sections and tests or the reporting of results. This QTI 2.1 release completes the update from 1.x to 2.x by replacing those remaining parts of the QTI specification.

1.2. Scope

The IMS QTI work specifically relates to content providers (that is, question and test authors and publishers), developers of authoring and content management tools, assessment delivery systems and learning systems. The data model for representing question-based content is suitable for targeting users in learning, education and training across all age ranges and national contexts.


2. Specification Use Cases

QTI is designed to facilitate interoperability between a number of systems that are described here in relation to the actors that use them.

Specifically, QTI is designed to:

  • Provide a well documented content format for storing and exchanging items independent of the authoring tool used to create them.
  • Support the deployment of item banks across a wide range of learning and assessment delivery systems.
  • Provide a well documented content format for storing and exchanging tests independent of the test construction tool used to create them.
  • Support the deployment of items, item banks and tests from diverse sources in a single learning or assessment delivery system.
  • Provide systems with the ability to report test results in a consistent manner.


Figure 2.1 The Role of Assessment Tests and Assessment Items.

authoringTool

A system used by an author for creating or modifying an assessment item.

itemBank

A system for collecting and managing collections of assessment items.

testConstructionTool

A system for assembling tests from individual items.

assessmentDeliverySystem

A system for managing the delivery of assessments to candidates. The system contains a delivery engine for delivering the items to the candidates and scores the responses automatically (where applicable) or by distributing them to scorers.

learningSystem

A system that enables or directs learners in learning activities, possibly coordinated with a tutor. For the purposes of this specification a learner exposed to an assessment item as part of an interaction with a learning system (i.e., through formative assessment) is still described as a candidate as no formal distinction between formative and summative assessment is made. A learning system is also considered to contain a delivery engine though the administration and security model is likely to be very different from that employed by an assessmentDeliverySystem.

2.1. Use Case Actors

The set of roles identified in this specification has been reduced to a small set of abstract actors for simplicity. Roles in real learning and assessment systems are typically more complex but, for the purposes of this specification, it is assumed that they can be generalized by one or more of the roles defined here.

author

The author of an assessment item. In simple situations an item may have a single author; in more complex situations an item may go through a creation and quality-control process involving many people. In this specification we identify all of these people with the role of author. An author is concerned with the content of an item, which distinguishes them from the role of an itemBankManager. An author interacts with an item through an authoringTool.

itemBankManager

An actor with responsibility for managing a collection of assessment items within an itemBank.

testConstructor

The role of test constructor is to create tests (test forms) from individual items. The items are typically drawn from an item bank.

proctor

A person charged with overseeing the delivery of an assessment. Often referred to as an invigilator. For the purposes of this specification a proctor is anyone (other than the candidate) who is involved in the delivery process but who does not have a role in assessing the candidate's responses.

scorer

A person or external system responsible for assessing the candidate's responses during assessment delivery. Scorers are optional; for example, many assessment items can be scored automatically using response processing rules defined in the item itself.

tutor

Someone involved in managing, directing or supporting the learning process for a learner but who is not subject to (the same) assessment.

candidate

The person being assessed by an assessment test or assessment item.


3. Structure of this Specification

The specification is spread over a number of documents:

  • Implementation Guide: A document that takes you through the data models by example. The best starting point for readers who are new to QTI and want to get an idea of what it can do.
  • Assessment Test, Section and Item Information Model: The reference guide to the main data model for assessment tests and items. The document provides detailed information about the model and specifies the requirements of delivery engines and authoring systems.
  • Metadata and Usage Data: A document that describes a profile of the IEEE Standard for Learning Object Metadata [LOM] data model suitable for use with assessment tests and items and a separate data model for representing usage data (i.e., item statistics). This document will be of particular interest to developers and managers of item banks and other content repositories, and to those who construct assessments from item banks.
  • Results Reporting: A reference guide to the data model for result reporting. The document provides detailed information about the model and specifies the associated requirements on delivery engines.
  • Integration Guide: A document that describes the relationship between this specification and other related specifications such as IMS Content Packaging [IMS_CP], IMS Simple Sequencing [IMS_SS] and IMS Learning Design [IMS_LD].
  • XML Binding: A document describing the way the data models have been bound to [XML].
  • Migration Guide: A document aimed at people familiar with version 1.x. It takes you through the main changes that have been made to the data model and includes an alphabetical listing of version 1 elements providing detailed information about how the same information is represented in version 2.

4. Conformance, Extensions, and Specification and Profile Maintenance

Because the QTI 2.1 specification is designed to accommodate a wide range of practices, conformance is defined in, and tested against, specific profiles. Such a profile defines which parts of the specification a community intends to support (items, tests, packages, results), and which specific features within each of those areas. In some cases, profiles can also include extensions to the specification, which are outlined in further detail below.

A certification of conformance against an existing profile is granted by passing a series of tests, submitting the results to IMS Global and agreeing to become part of the support community [QTI_APIP]. IMS records all certifications of conformance to current profiles on the public IMS certification website [IMS_CERT]. At the time conformance is certified, IMS issues a unique registration number for the product name and version, for use in proposals. Buyers and users are invited to visit the certification website to find certified products and to log any issues with certified products.

Communities that are interested in defining their own profiles are encouraged to examine current profiles for appropriateness or for inspiration. Contact IMS Global for the possibilities of adopting, adapting or defining a profile of QTI to meet the requirements of your particular community.

Details about conformance related to both QTI and [APIP] are available at [ASSESS_PRIMER]. This page also lists the profiles that are currently available for conformance certification. One of these is the QTI Formative Entry profile, which captures the same assessment requirements represented in Common Cartridge v1.x (based on a profile of the QTI v1.2 specification). Interoperability with this profile was demonstrated by systems involved in the development of the v2.1 specification.

The IMS QTI/APIP Assessment APMG (Accredited Profile Management Group) is responsible for the ongoing maintenance of the QTI and APIP specifications, as well as those profiles that are available for conformance testing.

4.1 Extension Mechanisms

QTI 2.1 can be extended in two ways: with custom interactions and custom operators. Custom operators are a means of introducing more sophisticated processing of candidates' responses. This is particularly useful where the responses are likely to contain subject-specific constructs such as computer code fragments or mathematical proofs. Further information and guidance on using custom operators is available in section 3.4.1 of the Implementation Guide of this specification.
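For illustration, a custom operator is invoked from within response processing like any other expression. In this sketch the class and definition values are invented, and the operator is assumed to return a boolean:

```xml
<responseCondition>
  <responseIf>
    <!-- Hypothetical operator that evaluates a code-fragment response -->
    <customOperator class="org.example.CodeFragmentChecker"
                    definition="http://example.org/operators/codeCheck">
      <variable identifier="RESPONSE"/>
    </customOperator>
    <setOutcomeValue identifier="SCORE">
      <baseValue baseType="float">1.0</baseValue>
    </setOutcomeValue>
  </responseIf>
</responseCondition>
```

A delivery engine that does not recognize the operator can use the definition URI to locate a description of the intended behavior, or report that the item cannot be processed.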

Custom interactions are a means of using interaction types that are not covered by the interaction types of the specification itself. These custom interactions trade a degree of interoperability for a wider range of user experiences. Because of this interoperability cost, care should be taken before defining a custom interaction type: the desired behavior may well be covered by a particular configuration of an existing interaction type. More information and best practice about sharing and running custom interactions is available at [INTERACT]. Work is underway to make types of custom interactions 'runnable', and therefore interoperable and sharable; more information will be made available at [INTERACT] in due course.
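Schematically, a custom interaction wraps proprietary markup inside the item body. In this sketch the gs namespace, element and attributes are all invented; only the customInteraction wrapper and its responseIdentifier binding come from the specification:

```xml
<itemBody>
  <customInteraction responseIdentifier="RESPONSE">
    <!-- Foreign-namespace markup carrying the proprietary interaction data -->
    <gs:graphSketch xmlns:gs="http://example.org/xsd/graphSketch"
                    width="400" height="300"/>
  </customInteraction>
</itemBody>
```

Because the wrapped markup is opaque to a generic delivery engine, items using custom interactions are only portable between systems that share an understanding of the embedded vocabulary.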

Custom operators and custom interactions can be shared by registering them at [QTI_INPUT]; already registered custom extensions can be viewed at [QTI_REG]. The [QTI_INPUT] form requires the submitter to specify the title of the submitted custom extension, a short description, a schema (XSD or DTD) that defines the structure of the custom item, and the name and organization of the author of the custom extension. The name and affiliation of the submitter are automatically recorded. Optionally, the submission can include images, long descriptions, the name of a tool that implements the custom extension, and a website URL where more information about the tool and/or the custom extension is available.

Once a custom extension is submitted, the process proceeds as follows:

  • A submission confirmation email is returned to the submitter of the custom extension;
  • The custom extension is validated with respect to the registration process, i.e., technical correctness, a valid submission XSD or DTD, and naming (to ensure that name clashes do not occur). If a valid custom extension has been received, it will be listed on [QTI_REG]. Validation is undertaken by the IMS Global core staff;
  • The corresponding validation response is returned to the submitter of the custom extension;
  • If and when there is a next revision of the QTI specification, the IMS QTI/APIP Assessment APMG (Accredited Profile Management Group) will automatically take the custom extension into consideration for inclusion in the specification. In that case, the workgroup will contact the original submitter.

If a custom extension is rejected, an explanation will be given for the rejection. An amended custom extension can be resubmitted and will be treated without prejudice.


5. Recent Changes

Between the second Public Draft and this Final Release, the following changes have been incorporated into the specification:

| Description | XSD change | Information Model change |
| --- | --- | --- |
| New Response Processing templates added | | |
| New infoControl element; a simple hint without consequences | yes | yes |
| Outcome variable identifier reserved names MAXSCORE and PASSED added | | yes |
| Clarification of the meaning of the expectedLength attribute value | | yes |
| Clarification of the meaning of extendedTextInteraction's expectedLines attribute | | yes |
| Clarification of datatypes such as duration | | yes |
| Interaction attributes harmonised; they all have the same set, described similarly | yes | yes |
| New roundTo operator with roundingMode attribute | yes | yes |
| New templateConstraint element to enable interdependent variables to be generated in a template | yes | yes |
| New repeat operator; enables authors to set variables whose size is itself a variable | yes | yes |
| Index datatype now has relaxed constraints to allow integers and variables | yes | yes |
| printedVariable cardinality and datatype restrictions relaxed; new attributes added: powerForm, field, index, delimiter, mappingIndicator | yes | yes |
| APIP namespace imported into QTI feedbackBlock, rubricBlock and templateBlock | yes | yes |
| IMS_LIP replaced with a simpler IMS_LIS-derived sourcedID structure defined under the QTI namespace for the identification of learners | yes | yes |
| sum operator relaxed to allow multiple children (i.e. it becomes a container) | | yes |
| New mathOperator element; provides access to common numerical math functions | yes | yes |
| New mathConstant element; returns mathematical constants as a single float, e.g. π and e | yes | yes |
| New min and max operators | yes | yes |
| New statsOperator element; performs common statistical functions | yes | yes |
| product content restrictions relaxed; it now takes multiple cardinalities | | yes |
| round and truncate content restrictions relaxed | | yes |
| New gcd and lcm operators offer access to more math functions | yes | yes |
| Missing candidateComment fixed | yes | |
| New allowLateSubmission attribute | yes | yes |
| New caseSensitive attribute on mapEntry; allows greater flexibility without using regular expressions | yes | yes |
| Clarification of the meaning and handling of the initial state of sliderInteraction | | yes |
| New assessmentSectionRef element; enables assessment sections to be included by reference | yes | yes |
| Block-level elements are now allowed inside prompt | yes | yes |
| New powerForm attribute for printedVariable | yes | yes |
| stylesheet capability added to feedbackBlock, rubricBlock and templateBlock | yes | yes |
| matchGroup element deleted | yes | yes |
| use attribute on rubricBlock added | yes | yes |
| Datatype and cardinality of the tolerance attribute changed | yes | yes |
| positionObjectStage now allowed to appear in div | yes | yes |
| align="center" and valign="baseline" attributes enabled in HTML's <td> | yes | |
| Multiple nested feedbackBlock and templateBlock examples added to the Implementation Guide | | |
| More explanation about customOperators added to the Implementation Guide | | |
| alt attribute of HTML's <img> element allowed to contain more than 250 characters | yes | yes |
| Examples added to the Implementation Guide that are more representative, or demonstrate new numerate features, block behavior, adaptive tests or custom extensions | | |
| feedbackBlock and rubricBlock now allow both HTML block and inline elements | yes | yes |
| feedback elements no longer allowed in rubricBlock | yes | yes |
| A body element collocation table has been made available as an implementation aid | | |
| Old Mexican President feedback example in the Implementation Guide replaced | | |
| The 'adaptive', 'shuffle' and 'maxAssociation' attributes that were required now have a default value; display of these attributes in instances is optional | yes | yes |

 
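Several of the new expression elements listed above can be combined. As an illustrative sketch of mathConstant and mathOperator used inside outcome processing (the AREA and RADIUS variable identifiers are invented), the following computes the area of a circle:

```xml
<setOutcomeValue identifier="AREA">
  <!-- product may now take multiple children -->
  <product>
    <mathConstant name="pi"/>
    <!-- pow raises the first child to the power of the second -->
    <mathOperator name="pow">
      <variable identifier="RADIUS"/>
      <baseValue baseType="float">2.0</baseValue>
    </mathOperator>
  </product>
</setOutcomeValue>
```

See the Assessment Test, Section and Item Information Model for the normative list of operator names and their argument constraints.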


6. References

IMS_LIS
IMS Learning Information Services Specification, Version 2.0
http://www.imsglobal.org/LIS/index.html
INTERACT
IMS Global Assessment Custom Interactions
http://www.imsglobal.org/assessment/interactions.html
LOM
IEEE 1484.12.1-2002 Standard for Learning Object Metadata (LOM)
QTI_APIP
IMS Global QTI/APIP Alliance
http://www.imsglobal.org/apip/alliance.html
QTI_INPUT
QTI Custom Interaction Types and Custom Operators registry submission form
http://www.imsglobal.org/question/qtiinteractions/qti.cfm
QTI_REG
QTI Custom Interaction Types and Custom Operators registry
http://www.imsglobal.org/question/qtiinteractions/qtiregistry.cfm
UML
OMG Unified Modeling Language Specification, Version 1.4
Published: 2001-09
XML
Extensible Markup Language (XML), Version 1.0 (second edition)
Published: 2000-10

About This Document

Title IMS Question & Test Interoperability Overview
Editors Wilbert Kraan (JISC/CETIS), Steve Lay (Cambridge Assessment), Pierre Gorissen (SURF)
Version Final v2.1
Version Date 31 August 2012
Status Final Release Specification
Summary This document provides an overview of the QTI specification.
Revision Information 31 August 2012
Purpose This document has been approved by the IMS Technical Advisory Board and is made available for adoption and conformance.
Document Location http://www.imsglobal.org/question/qtiv2p1/imsqti_oviewv2p1.html
To register any comments or questions about this specification please visit: http://www.imsglobal.org/community/forum/categories.cfm?catid=52

List of Contributors

The following individuals contributed to the development of this document:

Name Organization
Odette Auzende Université Pierre et Marie Curie (France)
Dick Bacon JISC/CETIS (UK)
Niall Barr University of Glasgow/IMS Global (UK)
Lance Blackstone Pearson (USA)
Jeanne Ferrante ETS (USA)
Helene Giroire Université Pierre et Marie Curie (France)
Pierre Gorissen SURF (The Netherlands)
Regina Hoag ETS (USA)
Wilbert Kraan JISC/CETIS (UK)
Gopal Krishnan Pearson (USA)
Young Jin Kweon KERIS (South Korea)
Steve Lay Cambridge Assessment (UK)
Francoise LeCalvez Université Pierre et Marie Curie (France)
David McKain JISC/CETIS (UK)
Mark McKell IMS Global (USA)
Sue Milne JISC/CETIS (UK)
Jens Schwendel BPS Bildungsportal Sachsen GmbH (Germany)
Graham Smith JISC/CETIS (UK)
Colin Smythe IMS Global (UK)
Yvonne Winkelmann BPS Bildungsportal Sachsen GmbH (Germany)
Rowin Young JISC/CETIS (UK)

Revision History

Version No. Release Date Comments
Base Document 2.1 14 October 2005 The first version of the QTI v2.1 specification.
Public Draft 2.1 9 January 2006 The Public Draft v2.1 of the QTI specification.
Public Draft 2.1 (revision 2) 8 June 2006 The Public Draft v2.1 (revision 2) of the QTI specification.
Final Release v2.1 31 August 2012 The Final Release v2.1 of the QTI specification. Includes updates, error corrections, and additional details.

 

 

 

IMS Global Learning Consortium, Inc. ("IMS Global") is publishing the information contained in this IMS Question and Test Interoperability Overview ("Specification") for purposes of scientific, experimental, and scholarly collaboration only.
IMS Global makes no warranty or representation regarding the accuracy or completeness of the Specification.
This material is provided on an "As Is" and "As Available" basis.
The Specification is at all times subject to change and revision without notice.
It is your sole responsibility to evaluate the usefulness, accuracy, and completeness of the Specification as it relates to you.
IMS Global would appreciate receiving your comments and suggestions.
Please contact IMS Global through our website at http://www.imsglobal.org
Please refer to Document Name: IMS Question and Test Interoperability Overview, Revision: 31 August 2012