IMS Final Release


IMS Question & Test Interoperability Results Reporting

Version: 2.1 Final

Date Issued: 31 August 2012
Latest version: http://www.imsglobal.org/question/

IPR and Distribution Notices

Recipients of this document are requested to submit, with their comments, notification of any relevant patent claims or other intellectual property rights of which they may be aware that might be infringed by any implementation of the specification set forth in this document, and to provide supporting documentation.

IMS takes no position regarding the validity or scope of any intellectual property or other rights that might be claimed to pertain to the implementation or use of the technology described in this document or the extent to which any license under such rights might or might not be available; neither does it represent that it has made any effort to identify any such rights. Information on IMS's procedures with respect to rights in IMS specifications can be found at the IMS Intellectual Property Rights web page: http://www.imsglobal.org/ipr/imsipr_policyFinal.pdf.

Copyright © 2005-2012 IMS Global Learning Consortium. All Rights Reserved.

Use of this specification to develop products or services is governed by the license with IMS found on the IMS website: http://www.imsglobal.org/license.html.

Permission is granted to all parties to use excerpts from this document as needed in producing requests for proposals.

The limited permissions granted above are perpetual and will not be revoked by IMS or its successors or assigns.

THIS SPECIFICATION IS BEING OFFERED WITHOUT ANY WARRANTY WHATSOEVER, AND IN PARTICULAR, ANY WARRANTY OF NONINFRINGEMENT IS EXPRESSLY DISCLAIMED. ANY USE OF THIS SPECIFICATION SHALL BE MADE ENTIRELY AT THE IMPLEMENTER'S OWN RISK, AND NEITHER THE CONSORTIUM, NOR ANY OF ITS MEMBERS OR SUBMITTERS, SHALL HAVE ANY LIABILITY WHATSOEVER TO ANY IMPLEMENTER OR THIRD PARTY FOR ANY DAMAGES OF ANY NATURE WHATSOEVER, DIRECTLY OR INDIRECTLY, ARISING FROM THE USE OF THIS SPECIFICATION.

Join the discussion and post comments on the QTI Public Forum: http://www.imsglobal.org/community/forum/categories.cfm?catid=52

 

© 2012 IMS Global Learning Consortium, Inc. All Rights Reserved.
The IMS Logo is a trademark of the IMS Global Learning Consortium, Inc. in the United States and/or other countries.
Document Name: IMS Global Question & Test Interoperability (QTI) Results Reporting Final v2.1 Revision: 31 August 2012

 

 


Table of Contents

1. Introduction
2. References
3. Assessment Result

1. Introduction

This document is a reference guide to the data model for reporting the results of an assessment. It provides detailed information about the model and specifies the associated requirements for delivery engines.

2. References

[IMS_LIS] IMS Learning Information Services Specification, Version 2.0, http://www.imsglobal.org/LIS/index.html

3. Assessment Result

Class : assessmentResult

An Assessment Result is used to report the results of a candidate's interaction with a test and/or one or more items attempted. Information about the test is optional; in some systems it may be possible to interact with items that are not organized into a test at all, for example, items that are organized with learning resources and presented individually in a formative context.

Contains : context [1]

Contains : testResult [0..1]
When a test result is given, the accompanying item results must relate only to items that were selected for presentation as part of the corresponding test session. Furthermore, all items selected for presentation should be reported with a corresponding itemResult.

Contains : itemResult [*]
A summary report for a test is represented by an assessment result containing a testResult but no itemResults.

Class : context

Contains : sessionIdentifier [*]
The system that creates the result (for example, the test delivery system) should assign a session identifier that it can use to identify the session. Subsequent systems that process the result might assign their own identifier to the session, which should be added to the context if the result is modified and exported for transport again.

Attribute : sourcedId [1]
A unique identifier for the test candidate. The attribute is defined by the IMS Learning Information Services specification [IMS_LIS].

Class : sessionIdentifier

Attribute : sourceID [1]: uri
A unique identifier of the system that added this identifier to the result.

Attribute : identifier [1]: string
The system that creates the report should add a session identifier. Subsequent systems that process the results might use their own identifier for the session and should add this too if the result is exported again for further transport.
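As a sketch of how the context and session-identifier information described above might appear in a result report, consider the following fragment. All identifier values are invented for illustration:

```xml
<!-- Illustrative context fragment; all identifier values are invented -->
<context sourcedId="candidate-1234">
  <!-- Added by the system that created the result -->
  <sessionIdentifier sourceID="http://delivery.example.com/engine"
                     identifier="session-5678"/>
</context>
```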

Class : testResult

Attribute : identifier [1]: string
The identifier of the test for which this is a result.

Attribute : datestamp [1]: datetime
The date stamp of when this result was recorded.

Contains : itemVariable [*]
The values of the test outcomes and any durations that were tracked during the test. Note that durations are reported as built-in test-level response variables with the name duration. The durations of individual test parts and sections are distinguished by prefixing the variable name with the associated identifier, as described in the Assessment Test, Section and Item Information Model.
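A test result carrying a test outcome and the built-in duration variable might be serialized along the following lines. Identifiers and values are illustrative, not normative:

```xml
<!-- Illustrative testResult fragment; identifiers and values are invented -->
<testResult identifier="exam-01" datestamp="2012-08-31T09:45:00">
  <outcomeVariable identifier="TEST_TOTAL" cardinality="single" baseType="float">
    <value>17</value>
  </outcomeVariable>
  <!-- duration is a built-in test-level response variable -->
  <responseVariable identifier="duration" cardinality="single" baseType="duration">
    <candidateResponse><value>PT45M</value></candidateResponse>
  </responseVariable>
</testResult>
```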

Class : itemResult

The result of an item session is reported with an itemResult. A report may contain multiple results for the same instance of an item, representing multiple attempts, progression through an adaptive item, or even more detailed tracking. In these cases, each item result must have a different datestamp.

Attribute : identifier [1]: string
The identifier of the item for which this is a result. For item results that are reported as part of a test result this is the identifier used to refer to the item in the test (see assessmentItemRef). For item results that are reported on their own, this can be any suitable identifier for the item. Where possible, the value should match the identifier attribute on the associated assessmentItem.

Attribute : sequenceIndex [0..1]: integer
For item results that are reported as part of a test, this attribute must be used to indicate the position of the item within the specific instance of the test. The first item of the first part of the test is defined to have sequence index 1.

Attribute : datestamp [1]: datetime
The date stamp of when this result was recorded.

Attribute : sessionStatus [1]: sessionStatus
The session status is used to interpret the values of the item variables. See sessionStatus below.

Contains : itemVariable [*]
During the item session the delivery engine keeps track of the current values assigned to all item variables, including the values of the built-in variables numAttempts, duration and completionStatus. Each value is represented in the report by an instance of itemVariable.

Contains : candidateComment [0..1]
An optional comment supplied by the candidate (see allowComment).
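Putting the itemResult pieces together, a single item's result might look like the following sketch. Identifiers, values and the comment text are invented for illustration:

```xml
<!-- Illustrative itemResult fragment; all identifiers and values are invented -->
<itemResult identifier="item-03" sequenceIndex="3"
            datestamp="2012-08-31T09:12:00" sessionStatus="final">
  <responseVariable identifier="RESPONSE" cardinality="single" baseType="identifier">
    <candidateResponse><value>choiceB</value></candidateResponse>
  </responseVariable>
  <outcomeVariable identifier="SCORE" cardinality="single" baseType="float">
    <value>1</value>
  </outcomeVariable>
  <candidateComment>I found the wording of this item ambiguous.</candidateComment>
</itemResult>
```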

Class : sessionStatus

The session status is used to keep track of the status of the item variables in an item session.

initial
The value to use for sessions in the initial state, that is, before the first attempt. This value can only be used to describe sessions for which the response variable numAttempts is 0. The values of the variables are set according to the rules defined in the appropriate declarations (see responseDeclaration, outcomeDeclaration and templateDeclaration).

pendingSubmission
The value to use when the item variables represent a snapshot of the current values during an attempt (in other words, while interacting or suspended). The values of the response variables represent work in progress that the candidate has not yet submitted for response processing. The values of the outcome variables represent the values assigned during response processing at the end of the previous attempt or, in the case of the first attempt, the default values given in the variable declarations.

pendingResponseProcessing
The value to use when the item variables represent the values of the response variables after submission but before response processing has taken place. Again, the outcomes are those assigned at the end of the previous attempt as they are awaiting response processing.

final
The value to use when the item variables represent the values at the end of an attempt after response processing has taken place. In other words, after the outcome values have been updated to reflect the values of the response variables.
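To illustrate how sessionStatus and datestamp interact, the same item session might be reported twice, once as an in-progress snapshot and once after response processing. Variable contents are elided and all identifiers are invented:

```xml
<!-- Snapshot during the attempt: responses are work in progress -->
<itemResult identifier="item-03" datestamp="2012-08-31T09:10:30"
            sessionStatus="pendingSubmission">
  ...
</itemResult>
<!-- The same item after response processing; note the later datestamp -->
<itemResult identifier="item-03" datestamp="2012-08-31T09:12:00"
            sessionStatus="final">
  ...
</itemResult>
```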

Class : itemVariable

Attribute : identifier [1]: identifier
The purpose of an itemVariable is to report the value of the item variable with the given identifier.

Attribute : cardinality [1]: cardinality
The cardinality of the variable, taken from the corresponding declaration or definition.

Attribute : baseType [0..1]: baseType
The base type of the variable, taken from the corresponding declaration or definition. This value is omitted only for variables with record cardinality.

Class : responseVariable

Attribute : choiceSequence [*]: identifier {ordered}
When a response variable is bound to an interaction that supports the shuffling of choices, the sequence of choices experienced by the candidate will vary between test instances. When shuffling is in effect, the sequence of choices should be reported as a sequence of choice identifiers using this attribute.

Contains : correctResponse [0..1]
The correct response may be output as part of the report if desired. Systems are not limited to reporting correct responses declared in responseDeclarations. For example, a correct response may be set by a templateRule or may simply have been suppressed from the declaration passed to the delivery engine (e.g., for security).

Contains : candidateResponse [1]
The response given by the candidate.

Contains : value [*] {ordered}
The value(s) of the response variable. A NULL value, resulting from no response, is indicated by the absence of any value. The order of the values is significant only if the response was declared with ordered cardinality.
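A response variable bound to a shuffled choice interaction, reporting both the correct response and the candidate's response, might be sketched as follows. The choice identifiers are invented for illustration:

```xml
<!-- Illustrative responseVariable for a shuffled choice interaction -->
<responseVariable identifier="RESPONSE" cardinality="single"
                  baseType="identifier"
                  choiceSequence="choiceC choiceA choiceB">
  <correctResponse><value>choiceA</value></correctResponse>
  <candidateResponse><value>choiceB</value></candidateResponse>
</responseVariable>
```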

Class : outcomeVariable

Attribute : view [*]: view
The views (if any) declared for the outcome must be copied to the report to enable systems that render the report to hide information not relevant in a specific situation. If no values are given, the outcome's value should be considered relevant in all views.

Attribute : interpretation [0..1]: string
See interpretation.

Attribute : longInterpretation [0..1]: uri
See longInterpretation.

Attribute : normalMaximum [0..1]: float
Taken from the corresponding outcomeDeclaration.

Attribute : normalMinimum [0..1]: float
Taken from the corresponding outcomeDeclaration.

Attribute : masteryValue [0..1]: float
If a mastery value is specified in the corresponding outcomeDeclaration it may be reported alongside the value of the outcomeVariable. In some cases, the mastery value may not be an attribute of the item itself, but be determined by the context in which the item is delivered, for example, by examining the candidates in a specific cohort. The mastery value may be reported with the outcome value even when there is no corresponding value in the declaration.

Contains : value [*] {ordered}
The value(s) of the outcome variable. The order of the values is significant only if the outcome was declared with ordered cardinality.
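An outcome variable carrying several of the optional reporting attributes described above might be sketched as follows; identifiers, interpretation text and numeric values are invented:

```xml
<!-- Illustrative outcomeVariable with optional reporting attributes -->
<outcomeVariable identifier="SCORE" cardinality="single" baseType="float"
                 view="scorer testConstructor"
                 interpretation="Raw score for this item"
                 normalMinimum="0" normalMaximum="2" masteryValue="1">
  <value>1.5</value>
</outcomeVariable>
```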

Class : templateVariable

Contains : value [*] {ordered}
The value(s) of the template variable. The order of the values is significant only if the template variable was declared with ordered cardinality.

Class : candidateComment

The class used for comments from the candidate. A simple run of text.
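Putting the classes described above together, a complete result report might be serialized along the following lines. The namespace shown is the one used by the QTI v2.1 results binding; all identifiers and values are invented, and the exact serialization should always be checked against the XML binding itself:

```xml
<!-- Illustrative assessmentResult; identifiers and values are invented -->
<assessmentResult xmlns="http://www.imsglobal.org/xsd/imsqti_result_v2p1">
  <context sourcedId="candidate-1234">
    <sessionIdentifier sourceID="http://delivery.example.com/engine"
                       identifier="session-5678"/>
  </context>
  <testResult identifier="exam-01" datestamp="2012-08-31T09:45:00">
    <outcomeVariable identifier="TEST_TOTAL" cardinality="single" baseType="float">
      <value>17</value>
    </outcomeVariable>
  </testResult>
  <itemResult identifier="item-03" sequenceIndex="3"
              datestamp="2012-08-31T09:12:00" sessionStatus="final">
    <templateVariable identifier="SEED" cardinality="single" baseType="integer">
      <value>42</value>
    </templateVariable>
    <responseVariable identifier="RESPONSE" cardinality="single" baseType="identifier">
      <candidateResponse><value>choiceB</value></candidateResponse>
    </responseVariable>
    <outcomeVariable identifier="SCORE" cardinality="single" baseType="float">
      <value>1</value>
    </outcomeVariable>
  </itemResult>
</assessmentResult>
```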


About This Document

Title IMS Question & Test Interoperability Results Reporting
Editors Wilbert Kraan (JISC/CETIS), Steve Lay (Cambridge Assessment), Pierre Gorissen (SURF)
Version Final v2.1
Version Date 31 August 2012
Status Final Release Specification
Summary This document describes the data model for reporting the results of an assessment.
Revision Information 31 August 2012
Purpose This document has been approved by the IMS Technical Advisory Board and is made available for adoption and conformance.
Document Location http://www.imsglobal.org/question/qtiv2p1/imsqti_resultv2p1.html
To register any comments or questions about this specification please visit: http://www.imsglobal.org/community/forum/categories.cfm?catid=52

List of Contributors

The following individuals contributed to the development of this document:

Name Organization
Odette Auzende Université Pierre et Marie Curie (France)
Dick Bacon JISC/CETIS (UK)
Niall Barr University of Glasgow/IMS Global (UK)
Lance Blackstone Pearson (USA)
Jeanne Ferrante ETS (USA)
Helene Giroire Université Pierre et Marie Curie (France)
Pierre Gorissen SURF (The Netherlands)
Regina Hoag ETS (USA)
Wilbert Kraan JISC/CETIS (UK)
Gopal Krishnan Pearson (USA)
Young Jin Kweon KERIS (South Korea)
Steve Lay Cambridge Assessment (UK)
Francoise LeCalvez Université Pierre et Marie Curie (France)
David McKain JISC/CETIS (UK)
Mark McKell IMS Global (USA)
Sue Milne JISC/CETIS (UK)
Jens Schwendel BPS Bildungsportal Sachsen GmbH (Germany)
Graham Smith JISC/CETIS (UK)
Colin Smythe IMS Global (UK)
Yvonne Winkelmann BPS Bildungsportal Sachsen GmbH (Germany)
Rowin Young JISC/CETIS (UK)

Revision History

Version No. Release Date Comments
Base Document 2.1 14 October 2005 The first version of the QTI v2.1 specification.
Public Draft 2.1 9 January 2006 The Public Draft v2.1 of the QTI specification.
Public Draft 2.1 (revision 2) 8 June 2006 The Public Draft v2.1 (revision 2) of the QTI specification.
Final Release v2.1 31 August 2012 The Final Release v2.1 of the QTI specification. Includes updates, error corrections, and additional details.

 

 

 

IMS Global Learning Consortium, Inc. ("IMS Global") is publishing the information contained in this IMS Question and Test Interoperability Results Reporting ("Specification") for purposes of scientific, experimental, and scholarly collaboration only.
IMS Global makes no warranty or representation regarding the accuracy or completeness of the Specification.
This material is provided on an "As Is" and "As Available" basis.
The Specification is at all times subject to change and revision without notice.
It is your sole responsibility to evaluate the usefulness, accuracy, and completeness of the Specification as it relates to you.
IMS Global would appreciate receiving your comments and suggestions.
Please contact IMS Global through our website at http://www.imsglobal.org