1EdTech Question & Test Interoperability: ASI Best Practice & Implementation Guide
Final Specification Version 1.2
Copyright © 2002 1EdTech Consortium, Inc. All Rights Reserved. The 1EdTech Logo is a trademark of 1EdTech Consortium, Inc.
Document Name: 1EdTech Question & Test Interoperability: ASI Best Practice & Implementation Guide
Date: 11 February 2002
IPR and Distribution Notices
Recipients of this document are requested to submit, with their comments, notification of any relevant patent claims or other intellectual property rights of which they may be aware that might be infringed by any implementation of the specification set forth in this document, and to provide supporting documentation.
1EdTech takes no position regarding the validity or scope of any intellectual property or other rights that might be claimed to pertain to the implementation or use of the technology described in this document or the extent to which any license under such rights might or might not be available; neither does it represent that it has made any effort to identify any such rights. Information on 1EdTech's procedures with respect to rights in 1EdTech specifications can be found at the 1EdTech Intellectual Property Rights web page: http://www.imsglobal.org/ipr/imsipr_policyFinal.pdf.
Copyright © 2002 1EdTech Consortium. All Rights Reserved.
Permission is granted to all parties to use excerpts from this document as needed in producing requests for proposals.
Use of this specification to develop products or services is governed by the license with 1EdTech found on the 1EdTech website: http://www.imsglobal.org/license.html.
The limited permissions granted above are perpetual and will not be revoked by 1EdTech or its successors or assigns.
THIS SPECIFICATION IS BEING OFFERED WITHOUT ANY WARRANTY WHATSOEVER, AND IN PARTICULAR, ANY WARRANTY OF NONINFRINGEMENT IS EXPRESSLY DISCLAIMED. ANY USE OF THIS SPECIFICATION SHALL BE MADE ENTIRELY AT THE IMPLEMENTER'S OWN RISK, AND NEITHER THE CONSORTIUM, NOR ANY OF ITS MEMBERS OR SUBMITTERS, SHALL HAVE ANY LIABILITY WHATSOEVER TO ANY IMPLEMENTER OR THIRD PARTY FOR ANY DAMAGES OF ANY NATURE WHATSOEVER, DIRECTLY OR INDIRECTLY, ARISING FROM THE USE OF THIS SPECIFICATION.
Table of Contents
1. Introduction
1.1 Question & Test Interoperability Overview
1.2 Scope & Context
1.3 Structure of this Document
1.4 Nomenclature
1.5 References
2. Relationship to Other Specifications
2.1 1EdTech Specifications
2.2 Related Specifications
2.2.1 IEEE P1484
2.2.2 Advanced Distributed Learning (ADL) Initiative
2.2.3 Aviation Industry CBT Committee (AICC)
2.2.4 ISO/IEC JTC1/SC36 Learning Technology
3. Overall Data Model
3.1 Information Model
3.2 XML Schema Tree
3.2.1 Item
3.2.2 Section
3.2.3 Assessment
3.2.4 Material Element
3.2.5 Meta-data
4. Example Basic Item Types
4.1 Logical Identifier
4.1.1 Standard True/False (Text)
4.1.2 Standard Multiple Choice (Text)
4.1.3 Standard Multiple Choice (Images)
4.1.4 Standard Multiple Choice (Audio)
4.1.5 Standard Multiple Response (Text)
4.1.6 Multiple Choice with Image Hot Spot Rendering
4.1.7 Multiple Response with Multiple Image Hot Spot Rendering
4.1.8 Multiple Choice with Slider Rendering
4.1.9 Standard Order Objects (Text)
4.1.10 Standard Order Objects (Image)
4.1.11 Connect-the-Points
4.2 XY Co-ordinate
4.2.1 Standard Image Hot Spot
4.2.2 Connect-the-Points
4.3 String
4.3.1 Standard Fill-in-Blank (Text)
4.3.2 Standard Multiple Fill-in-Blank (Text)
4.3.3 Standard Short Answer
4.4 Numerical
4.4.1 Standard Fill-in-Blank (Decimal)
4.4.2 Standard Fill-in-Blank (Integer)
4.4.3 Numerical Entry with Slider
4.5 Logical Group
4.5.1 Drag-and-Drop (Images)
5. Example Composite Item Types
5.1 Multiple Choice Derivatives
5.1.1 Multiple Choice with Fill-in-Blank
5.1.2 Matrix-based Multiple Response
6. Example XML Schema
6.1 Item Examples
6.1.1 Minimum Definition
6.1.2 Full Definition
6.2 Section Examples
6.2.1 Minimum Definition
6.2.2 Full Definition
6.3 Assessment Examples
6.3.1 Minimum Definition
6.3.2 Full Definition
6.4 The XML Example Files
7. Implementation Guidance
7.1 Assessments
7.1.1 Elements and their Attributes
7.1.2 Groups of Elements
7.2 Sections
7.2.1 Elements and their Attributes
7.2.2 Groups of Elements
7.3 Items
7.3.1 Elements and their Attributes
7.3.2 Groups of Elements
7.4 Object Banks
7.5 Aggregated Scoring and Response Processing
7.6 1EdTech Harmonization
7.6.1 1EdTech Meta-data
7.6.2 1EdTech Content Packaging
7.6.3 1EdTech Learner Information Package
7.6.4 1EdTech Accessibility
7.6.5 1EdTech Sequencing
7.7 Naming Conventions
7.7.1 Identities and Labels
7.7.2 Proprietary Extensions
7.8 Scoping Rules
7.8.1 Identities and Labels
8. Proprietary Extensions
9. V1.x/V2.0 Issues & Compatibility
9.1 Function Requirements
9.2 Constraints
9.3 Compatibility Map
10. Conformance
10.1 Valid Data Issues
10.2 Conformance Summary
10.3 Interoperability Statement
10.4 Completing a Conformance Summary
10.5 An Example Conformance Statement
Appendix A - QTI XSDs, DTDs & XDRs
A1 - Overview
A2 - Features of the Different XSDs/DTDs/XDRs
A3 - Recommended Usage of the XSDs/DTDs
A4 - Full Directory Structure
Appendix B - Glossary of Terms
B1 - Q&TI Elements and Attributes
Appendix C - Examples Information
C1 - Proposed Naming Convention
About This Document
List of Contributors
Revision History
Index
1. Introduction
1.1 Question & Test Interoperability Overview
The 1EdTech Question & Test Interoperability (QTI) specification describes a basic structure for the representation of question (item) and test (assessment) data and their corresponding results reports. The specification therefore enables the exchange of item, assessment and results data between Learning Management Systems, as well as between content authors, content libraries and collections. The 1EdTech QTI specification is defined in XML to promote the widest possible adoption. XML is a powerful, flexible, industry-standard markup language used to encode data models for Internet-enabled and distributed applications. The QTI specification is extensible and customizable to permit immediate adoption, even in specialized or proprietary systems. Leading suppliers and consumers of learning products, services and content contributed time and expertise to produce this final specification. The 1EdTech QTI specification, like all 1EdTech specifications, does not limit product designs by specifying user interfaces or pedagogical paradigms, or by establishing technology or policies that constrain innovation, interoperability, or reuse.
This document is intended to provide vendors with an overall understanding of the ASI specification, the relationship of this specification to other 1EdTech specifications, and a best practice guide derived from the experiences of those using the specification. Example Item types supported by the specification, examples of composite Item types, and complete XML examples for presenting an Assessment, Section, and Item are included. The Best Practice & Implementation Guide also includes a significant number of actual examples that describe how vendors can make the best use of the QTI specification. These examples, approximately eighty in total, are also useful as starting templates for each of the different forms of Assessment, Section and Item. Appendices provide the range of available DTDs, XDRs and XSDs (as appropriate), as well as a glossary of key terms and elements used throughout the specification.
1.2 Scope & Context
1.3 Structure of this Document
The structure of this document is:
1.4 Nomenclature
1.5 References
2. Relationship to Other Specifications
2.1 1EdTech Specifications
Version 1.2 of the 1EdTech Question & Test Interoperability specification is made up of ten documents:
The 1EdTech Question & Test Interoperability specification is related to several other 1EdTech specifications, both complete and in progress. This specification is intended to be consistent with these other initiatives wherever possible, in order to reduce redundancy and confusion between specifications. The related specifications are:
- 1EdTech Meta-data Specification V1.2 - the 1EdTech Q&TI specification shares a number of common data object elements with the 1EdTech Meta-data specification. A set of QTI-specific meta-data extensions is also used; these are defined within the Q&TI specifications themselves [MD, 01a], [MD, 01b], [MD, 01c];
- 1EdTech Content Packaging Specification V1.1.2 - the 1EdTech QTI data model is a subset of the Content Packaging data model, i.e., Q&TI Assessments, Sections and Items are defined as content and their XML can be inserted into a Content Packaging instance (a minimal packaging sketch follows this list) [CP, 01a], [CP, 01b], [CP, 01c];
- 1EdTech Learner Information Packaging Specification V1.0 - this specification can be used as an alternative results reporting mechanism [LIP, 01a], [LIP, 01b], [LIP, 01c];
- 1EdTech Sequencing Specification V1.0 - this specification is an extension of the 1EdTech Content Packaging specification that controls the way in which the associated content is sequenced [Seq, 02].
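As an illustration of the Content Packaging relationship noted in the list above, QTI ASI XML is normally carried inside a package as an ordinary content resource. The fragment below is a minimal sketch only; the resource type string, identifiers and file names are illustrative assumptions and should be checked against the Content Packaging binding and the QTI packaging guidance.

<manifest identifier="MANIFEST01">
  <organizations/>
  <resources>
    <!-- Illustrative only: the 'type' value, identifier and file names are assumptions -->
    <resource identifier="RES01" type="imsqti_xmlv1p2" href="mchc_ir_001a.xml">
      <file href="mchc_ir_001a.xml"/>
    </resource>
  </resources>
</manifest>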
2.2 Related Specifications
2.2.1 IEEE P1484
The IEEE Learning Technology Standardisation Committee P1484 is the only body engaged in the educational domain that has a recognised formal standing. Given the diversity of the fora represented by the participants in the IEEE, there exist a large number of working groups focused on specific activities, as well as more horizontal activities (such as the Architecture and Reference Model and the Glossary working groups) that attempt to tie the other work together. None of the current IEEE working groups and study groups (note: a study group is formed to do preliminary work to scope any subsequent working group in the particular area) is focused on Question & Test Interoperability.
2.2.2 Advanced Distributed Learning (ADL) Initiative
ADL is a US military programme, started by the White House in 1997, that aims to advance the use of state-of-the-art online training among the country's defence forces. There is some collaboration with experts in military training applications from other NATO countries. ADL is very focused on content for particular areas of training. It also has the Shareable Content Object Reference Model (SCORM v1.2) as its working architecture to encourage discussion and input on the emerging standards. Again, no separate Question & Test Interoperability specification development is underway, and the 1EdTech QTI specification is being considered for adoption in later versions of the SCORM, i.e., SCORM v1.4.
2.2.3 Aviation Industry CBT Committee (AICC)
The Aviation Industry CBT Committee is a membership-based international forum that develops recommendations on interoperable learning technology, principally for the commercial aviation and related industries. As such, its members include aircraft and equipment manufacturers, carriers, software and multimedia vendors, and a growing number of parties not directly engaged in the sector but nevertheless interested in the work being undertaken. A subgroup of the AICC is working with the ADL and other organisations from the IEEE LTSC. The 1EdTech Q&TI specifications have been presented to the AICC.
2.2.4 ISO/IEC JTC1/SC36 Learning Technology
On 10 November 1999, the ISO/IEC Joint Technical Committee 1 meeting in Seoul agreed resolution 6, which brought into existence Sub-Committee 36 - Learning Technology. The international secretariat for SC36 is provided by the US National Body: the American National Standards Institute (ANSI). ISO/IEC JTC1/SC36 is intended to address standardisation in the area of information technologies that support automation for learners, learning institutions, and learning resources. It is the intention that SC36 shall not create standards or technical reports that define educational standards, cultural conventions, learning objectives, or specific learning content. Its activity in the field of question and test has yet to be defined.
3. Overall Data Model
3.1 Information Model
The data model for the Q&TI is shown in Figure 3.1 (this is the same as that described in the QTI Information Model [QTI, 02a]).

The objects in this model and their key behaviours are:
- Assessment - the object that represents the Assessment data structure;
- Section - the object that represents the Section data structure;
- Item - the object that represents the Item data structure;
- Activity Selection - selection of the next activity determined by the progress and results obtained up to the moment of activity selection;
- Outcomes Processing - the reconciliation of all the evaluation outputs to produce an overall Assessment/Section evaluation;
- Scoring Weights - the scoring weights that are to be assigned to the results output from the response processing;
- Response Processing - the processing and evaluation of the user responses;
- Presentation - the rendering of the content and the possible responses;
- Examinee Record - the set of collated results that is output from the complete process. This is a 'life-long' record in that it contains the historical progress of the individual;
- Outcomes - the set of outcomes that are to be evaluated by the response processing object. These determine the scoring metrics to be applied to the response evaluations;
- Response - the responses that are supplied by the user of the Items i.e. the input user selections;
- Flow - the underlying presentation structure that defines the block relationship between the different material components;
- Material - the content that is to be displayed.
This structure shows the relationship between the three core data objects, namely Items, Sections and Assessments. The types of object that can be exchanged are shown in Figure 3.2 (Figure 4.1 in the 1EdTech QTI Information Model Specification [QTI, 02a]).

3.2 XML Schema Tree
The generic XML schema tree is shown in Figure 3.3.

This representation reflects the structure of an Item, Section and Assessment. This structure has three core components:
- Configuration - generation of the appropriate environment for the correct interpretation of the information contained within the object;
- Processing - the actual processing represented by the object e.g. the presentation of a question and the corresponding response processing and feedback;
- Sequencing - linkage to referenced objects and the selection and sequencing of the next object to be processed.
3.2.1 Item
The XML schema tree for the Item data structure is shown in Figure 3.4.

The corresponding XML schema trees for the <presentation> and <resprocessing> elements are shown in Figures 3.5 and 3.6 respectively.



3.2.2 Section
The Section data structure XML schema tree is shown in Figure 3.7.

3.2.3 Assessment
The Assessment data structure XML schema tree is shown in Figure 3.8.

3.2.4 Material Element
The Material element XML schema tree is shown in Figure 3.9.

3.2.5 Meta-data
The meta-data for the Item structure is shown in Figure 3.10.

4. Example Basic Item Types
The examples of the basic Item types are listed below, grouped by response type (a schematic pairing of response types and renderings is sketched after these lists). The Logical Identifier examples are:
- Standard True/False (text-based options) - choice-based rendering;
- Standard Multiple Choice (text-based options) - choice-based rendering;
- Standard Multiple Choice (image-based options) - choice-based rendering;
- Standard Multiple Choice (audio-based options) - choice-based rendering;
- Standard Multiple Response (text-based options) - choice-based rendering;
- Multiple Choice with Single Image (image-based options) - IHS-based rendering;
- Multiple Response with Multiple Images (image-based options) - IHS-based rendering;
- Multiple Choice (slider-based options) - slider-based rendering;
- Standard Order Objects (text-based objects) - object-based rendering;
- Standard Order Objects (image-based objects) - object-based rendering;
- Connect-the-points (image-based) - IHS-based rendering.
The XY co-ordinate examples are:
- Standard Image Hot Spot (single image) - IHS-based rendering;
- Connect-the-points (image-based) - IHS-based rendering.
The String examples are:
- Standard Single Fill-in-Blank - FIB-based rendering;
- Standard Multiple Fill-in-Blank - FIB-based rendering;
- Standard Short Answer (text required) - FIB-based rendering.
The Numerical examples are:
- Standard Integer Fill-in-Blank - FIB-based rendering;
- Standard Real number Fill-in-Blank - FIB-based rendering;
- Numerical entry with Slider - slider-based rendering.
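Each grouping above corresponds to a response type in the XML binding, which is then paired with a rendering element. The schematic below summarises the pairings used by the examples that follow; it is indicative rather than exhaustive, and the identifiers are placeholders.

<!-- Logical Identifier responses: render_choice, render_hotspot, render_slider, render_extension -->
<response_lid ident="R1" rcardinality="Single"> ... </response_lid>
<!-- XY co-ordinate responses: render_hotspot -->
<response_xy ident="R2" rcardinality="Single"> ... </response_xy>
<!-- String responses: render_fib -->
<response_str ident="R3" rcardinality="Single"> ... </response_str>
<!-- Numerical responses: render_fib, render_slider -->
<response_num ident="R4" rcardinality="Single"> ... </response_num>
<!-- Logical Group responses (e.g. drag-and-drop, Section 4.5) -->
<response_grp ident="R5" rcardinality="Multiple"> ... </response_grp>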
4.1 Logical Identifier
4.1.1 Standard True/False (Text)
Figure 4.1 shows a typical True/False multiple-choice question where the possible answers are presented in two different ways. The corresponding XML is listed after the figure. The user is expected to select either the 'Agree' or the 'Disagree' radio button.
[Figure 4.1a and Figure 4.1b - a True/False item rendered in two different ways]
The recommended XML using the V1.2 specification takes two forms, one for each of the two Figures. The XML instance for Figure 4.1a is:
<questestinterop> <qticomment> This is a simple True/False multiple-choice example using V1.2. The rendering is a standard radio button style. Response processing is incorporated. </qticomment> <item ident="IMS_V01_I_BasicExample001"> <presentation label="BasicExample001"> <flow> <material> <mattext>Paris is the Capital of France</mattext> </material> <response_lid ident="TF01" rcardinality="Single" rtiming="No"> <render_choice> <flow_label> <response_label ident="T"> <material><mattext>Agree</mattext></material> </response_label> <response_label ident="F"> <material><mattext>Disagree</mattext></material> </response_label> </flow_label> </render_choice> </response_lid> </flow> </presentation> <resprocessing> <outcomes> <decvar/> </outcomes> <respcondition title="Correct"> <conditionvar> <varequal respident="TF01">T</varequal> </conditionvar> <setvar action="Set">1</setvar> <displayfeedback feedbacktype="Response" linkrefid="Correct"/> </respcondition> </resprocessing> <itemfeedback ident="Correct" view="Candidate"> <flow_mat> <material><mattext>Yes, you are right.</mattext></material> </flow_mat> </itemfeedback> </item> </questestinterop> |
This XML code is available in the file: 'ims_qtiasiv1p2/basic/trfl_ir_001/trfl_ir_001a.xml'. The equivalent XML for Figure 4.1b is:
<questestinterop> <qticomment> This is a simple True/False multiple choice example using V1.2. The rendering is a standard radio button style. Response processing is incorporated. </qticomment> <item ident="IMS_V01_I_BasicExample001"> <presentation label="BasicExample001"> <flow> <material> <mattext>Paris is the Capital of France</mattext> </material> <response_lid ident="TF01" rcardinality="Single" rtiming="No"> <render_choice> <response_label ident="T"> <flow_mat> <material><mattext>Agree</mattext></material> </flow_mat> </response_label> <response_label ident="F"> <flow_mat> <material><mattext>Disagree</mattext></material> </flow_mat> </response_label> </render_choice> </response_lid> </flow> </presentation> <resprocessing> <outcomes> <decvar/> </outcomes> <respcondition title="Correct"> <conditionvar> <varequal respident="TF01">T</varequal> </conditionvar> <setvar action="Set">1</setvar> <displayfeedback feedbacktype="Response" linkrefid="Correct"/> </respcondition> </resprocessing> <itemfeedback ident="Correct" view="Candidate"> <flow_mat> <material><mattext>Yes, you are right.</mattext></material> </flow_mat> </itemfeedback> </item> </questestinterop> |
This XML code is available in the file: 'ims_qtiasiv1p2/basic/trfl_ir_001/trfl_ir_001b.xml'.
4.1.2 Standard Multiple Choice (Text)
Figure 4.2 shows a typical text-based multiple-choice question. The corresponding XML is listed after the figure. The user is required to choose one of the available options by clicking the appropriate radio button.

Using the V1.2 specification the XML could be:
<questestinterop> <qticomment> This is a simple multiple-choice example that conforms to V1.2. The rendering is a standard radio button style. Response processing is incorporated. </qticomment> <item title="Standard Multiple Choice Item" ident="IMS_V01_I_BasicExample002b"> <presentation label="BasicExample002a"> <flow> <material> <mattext> Which one of the listed standards committees is responsible for developing the token ring specification ? </mattext> </material> <response_lid ident="MCb_01" rcardinality="Single" rtiming="No"> <render_choice shuffle="Yes"> <response_label ident="A"> <flow_mat> <material><mattext>IEEE 802.3</mattext></material> </flow_mat> </response_label> <response_label ident="B"> <flow_mat> <material><mattext>IEEE 802.5</mattext></material> </flow_mat> </response_label> <response_label ident="C"> <flow_mat> <material><mattext>IEEE 802.6</mattext></material> </flow_mat> </response_label> <response_label ident="D"> |
<flow_mat> <material><mattext>IEEE 802.11</mattext></material> </flow_mat> </response_label> <response_label ident="E" rshuffle="No"> <flow_mat> <material><mattext>None of the above.</mattext> </material> </flow_mat> </response_label> </render_choice> </response_lid> </flow> </presentation> <resprocessing> <outcomes> <decvar vartype="Integer" defaultval="0"/> </outcomes> <respcondition title="Correct"> <conditionvar> <varequal respident="MCb_01">A</varequal> </conditionvar> <setvar action="Set">1</setvar> <displayfeedback feedbacktype="Response" linkrefid="Correct"/> </respcondition> </resprocessing> <itemfeedback ident="Correct" view="Candidate"> <flow_mat> <material><mattext>Yes, you are right.</mattext></material> </flow_mat> </itemfeedback> </item> </questestinterop> |
This XML code is available in the file: 'ims_qtiasiv1p2/basic/mchc_ir_001/mchc_ir_001a.xml'.
4.1.3 Standard Multiple Choice (Images)
Figure 4.3 shows a typical image-based multiple-choice question. The corresponding XML is listed after the figure. The user is required to select one of the options by clicking on the appropriate radio button.

The equivalent XML (without response processing) that conforms to the V1.2 specification is:
<questestinterop> <qticomment> This is a multiple-choice example with image content. The rendering is a standard radio button style. No response processing is incorporated. </qticomment> <item title="Standard MC with Images Item" ident="IMS_V01_I_BasicExample003"> <presentation label="BasicExample003"> <flow> <material> <mattext>Which symbol is the 'Stop' sign ?</mattext> </material> <response_lid ident="MC02" rcardinality="Single" rtiming="No"> <render_choice shuffle="Yes"> <flow_label> <response_label ident="A"> <material> <matimage imagtype="image/gif" uri="image1.gif"> </matimage> </material> </response_label> <response_label ident="B"> <material> <matimage imagtype="image/gif" uri="image2.gif"> </matimage> </material> </response_label> <response_label ident="C"> <material> <matimage imagtype="image/gif" uri="image3.gif"> </matimage> </material> </response_label> <response_label ident="D"> <material> <matimage imagtype="image/gif" uri="image4.gif"> </matimage> </material> </response_label> </flow_label> </render_choice> </response_lid> </flow> </presentation> </item> </questestinterop>
This XML code is available in the file: 'ims_qtiasiv1p2/basic/mchc_i_002/mchc_i_002.xml'. The equivalent example with the response processing incorporated is available in the file: 'ims_qtiasiv1p2/basic/mchc_ir_002/mchc_ir_002.xml'.
4.1.4 Standard Multiple Choice (Audio)
Figure 4.4 shows a typical audio-based multiple-choice question. The corresponding XML is listed after the figure. The user is required to make their choice by selecting the appropriate radio button; the audio for each option is played by clicking on the corresponding sound-source symbol.

The equivalent XML (without response processing) that conforms to the V1.2 specification is:
<questestinterop> <qticomment> This is a multiple-choice example with audio content. The rendering is a standard radio button style. No response processing is incorporated. </qticomment> <item title="Standard MC with Audio Item" ident="IMS_V01_I_BasicExample004"> <presentation label="BasicExample004"> <flow> <material> <mattext>What sound does this instrument make ?</mattext> <matimage imagtype="image/gif" uri="imageinstrument.gif"></matimage> </material> <response_lid ident="MC03" rcardinality="Single" rtiming="No"> <render_choice> <response_label ident="A"> <flow_mat> <material> <mataudio audiotype="audio/wav" uri="sound1.wav"> </mataudio> </material> </flow_mat> </response_label> <response_label ident ="B"> <flow_mat> <material> <mataudio audiotype="audio/wav" uri="sound2.wav"> </mataudio> </material> </flow_mat> </response_label> <response_label ident="C"> <flow_mat> <material> |
<mataudio audiotype="audio/wav" uri="sound3.wav"> </mataudio> </material> </flow_mat> </response_label> <response_label ident="D"> <flow_mat> <material> <mataudio audiotype="audio/wav" uri="sound4.wav"> </mataudio> </material> </flow_mat> </response_label> </render_choice> </response_lid> </flow> </presentation> </item> </questestinterop>
Note: The icon used to denote the sound file is dependent on the rendering system.
This XML code is available in the file: 'ims_qtiasiv1p2/basic/mchc_i_003/mchc_i_003.xml'. The equivalent example with the response processing incorporated is available in the file: 'ims_qtiasiv1p2/basic/mchc_ir_003/mchc_ir_003.xml'.
4.1.5 Standard Multiple Response (Text)
Figure 4.5 shows a typical multiple response question. The corresponding XML is listed after the figure. The user is expected to click on each of the correct solutions using the appropriate check buttons.

The equivalent XML (without response processing) that conforms to the V1.2 specification is:
<questestinterop> <qticomment> This is a multiple-response example. The rendering is a standard check button style. No response processing is incorporated. </qticomment> <item title="Standard Multiple Response Item" ident="IMS_V01_I_BasicExample005"> <presentation label="RS05"> <flow> <material> <mattext>Which of the following elements are used to form water ? </mattext> </material> <response_lid ident="MR01" rcardinality="Multiple" rtiming="No"> <render_choice shuffle="Yes" minnumber="1" maxnumber="4"> <response_label ident="A"> <flow_mat> <material> <mattext>Hydrogen</mattext> </material> </flow_mat> </response_label> <response_label ident="B"> <flow_mat> <material> <mattext>Helium</mattext> </material> </flow_mat> </response_label> <response_label ident="C"> <flow_mat> <material> <mattext>Carbon</mattext> </material> </flow_mat> </response_label> <response_label ident="D"> <flow_mat> <material> <mattext>Oxygen</mattext> </material> </flow_mat> </response_label> <response_label ident="E"> <flow_mat> <material> <mattext>Nitrogen</mattext> </material> </flow_mat> </response_label> <response_label ident="F"> <flow_mat> <material> <mattext>Chlorine</mattext> </material> </flow_mat> </response_label> </render_choice> </response_lid> </flow> |
</presentation> </item> </questestinterop> |
This XML code is available in the file: 'ims_qtiasiv1p2/basic/mrsp_i_001/mrsp_i_001.xml'. The equivalent example with the response processing incorporated is available in the file: 'ims_qtiasiv1p2/basic/mrsp_ir_001/mrsp_ir_001.xml'.
4.1.6 Multiple Choice with Image Hot Spot Rendering
Figure 4.6 shows a typical multiple-choice question using image hot spot rendering. The corresponding XML is listed after the figure. The user is expected to click on the appropriate hot spot within the image.

The equivalent XML (without response processing) that conforms to the V1.2 specification is:
<questestinterop> <qticomment> This is a multiple-choice with image hot spot. The rendering uses image-based hotspots. No response processing is incorporated. </qticomment> <item title="Multiple Choice with Image Hotspot Rendering Item" ident="IMS_V01_I_BasicExample006"> <presentation label="BasicExample006"> <flow> <flow> <material> <matimage imagtype="image/gif" uri="mchotspot1.gif" x0="0" width="300" y0="512" height="400"> </matimage> </material> </flow> <flow> <material> <mattext>What <mattext> <matemtext>city </matemtext> <mattext>is the capital of <mattext> <matemtext>France ?</matemtext> </material> </flow> <response_lid ident="MC04" rcardinality="Single" rtiming="No"> <render_hotspot> <response_label ident="A" rarea="Ellipse">100,100,2,2 </response_label> <response_label ident="B" rarea="Ellipse"> 150,150,2,2 </response_label> <response_label ident="C" rarea="Ellipse">180,200,2,2 </response_label> <response_label ident="D" rarea="Ellipse">280,230,2,2 </response_label> <response_label ident="E" rarea="Ellipse">30,80,2,2 </response_label> </render_hotspot> </response_lid> <flow> </presentation> </item> </questestinterop> |
This XML code is available in the file: 'ims_qtiasiv1p2/basic/mchc_i_004/mchc_i_004.xml'. The equivalent example with the response processing incorporated is available in the file: 'ims_qtiasiv1p2/basic/mchc_ir_004/mchc_ir_004.xml'.
4.1.7 Multiple Response with Multiple Image Hot Spot Rendering
Figure 4.7 shows a typical multiple response question using multiple image hotspot rendering. The corresponding XML is listed after the figure. The user is expected to use the mouse to click on the four wheels.

The V1.2 XML instance, including response processing, is:
<questestinterop> <qticomment> This is a multiple-response with image hot spot rendering. The rendering is hotspot. </qticomment> <item title="Multiple Response with Image Hotspot Rendering Item" ident="IMS_V01_I_mrsp_ir_002"> <presentation label="BasicExample007b"> <flow> <flow> <material> <matimage imagtype="image/gif" uri="tractor.gif" x0="100" width="100" y0="100" height="100"/> <matimage imagtype="image/gif" uri="bus.gif" x0="300" width="100" y0="0" height="100"/> </material> </flow> <flow> <material> <mattext>Identify all of the wheels on the vehicles displayed. </mattext> </material> <response_lid ident="MR02" rcardinality="Multiple" rtiming="No"> <render_hotspot minnumber="2" maxnumber="4"> <response_label ident="A" rarea="Ellipse">110,10,20,20 </response_label> <response_label ident="B" rarea="Ellipse">190,110,5,5 </response_label> <response_label ident="C" rarea="Ellipse">220,110,5,5 </response_label> <response_label ident="D" rarea="Ellipse">380,110,5,5 </response_label> </render_hotspot> </response_lid> </flow> </flow> </presentation> <resprocessing> |
<outcomes> <decvar defaultval="0"/> </outcomes> <respcondition> <qticomment>Scoring for the correct answer.</qticomment> <conditionvar> <varequal respident="MR02">A</varequal> <varequal respident="MR02">B</varequal> <varequal respident="MR02">C</varequal> <varequal respident="MR02">D</varequal> </conditionvar> <setvar action="Add">1</setvar> <displayfeedback feedbacktype="Response" linkrefid="Correct"/> </respcondition> <respcondition> <qticomment>Response trigger for incorrect answer.</qticomment> <conditionvar> <not> <and> <varequal respident="MR02">A</varequal> <varequal respident="MR02">B</varequal> <varequal respident="MR02">C</varequal> <varequal respident="MR02">D</varequal> </and> </not> </conditionvar> <setvar action="Add">0</setvar> <displayfeedback feedbacktype="Solution" linkrefid="CorrectSoln"/> </respcondition> </resprocessing> <itemfeedback ident="Correct" view="Candidate"> <flow_mat> <material><mattext>Yes, there are four wheels.</mattext></material> </flow_mat> </itemfeedback> <itemfeedback ident="CorrectSoln" view="Candidate"> <solution feedbackstyle="Complete"> <solutionmaterial> <flow_mat> <material> <mattext>The are two wheels on each vehicle.</mattext> <matimage imagtype="image/gif" uri="tractor1.gif" x0="100" width="100" y0="100" height="100"> </matimage> <matimage imagtype="image/gif" uri="bus1.gif" x0="300" width="100" y0="0" height="100"> </matimage> </material> </flow_mat> </solutionmaterial> </solution> </itemfeedback> </item> </questestinterop> |
This XML code is available in the file: 'ims_qtiasiv1p2/basic/mrsp_ir_002/mrsp_ir_001.xml'.
4.1.8 Multiple Choice with Slider Rendering
Figure 4.8 shows a typical multiple-choice question using slider rendering. The corresponding XML is listed after the figure. The user is expected to move the slider to point to the correct integer value.

Equivalent XML (without response processing):
<questestinterop> <qticomment> This is a multiple-choice with slider rendering. No response processing is incorporated. </qticomment> <item title="Multiple-choice and Slider rendering Item" ident="IMS_V01_I_BasicExample008"> <presentation label=" BasicExample008"> <flow> <flow> <material> <mattext>What is the value of 2 * 3 ?</mattext> </material> </flow> <flow> <response_lid ident="MC05" rcardinality="Single" rtiming="No"> <render_slider lowerbound="2" upperbound="10" step="2" startval="4" steplabel="Yes"> <response_label ident="a" rrange="absolute"> 2 </response_label> <response_label ident="b" rrange="Absolute"> 4 </response_label> <response_label ident="c" rrange="Absolute"> 6 </response_label> <response_label ident="d" rrange="Absolute"> 8 </response_label> <response_label ident="e" rrange="Absolute"> 10 </response_label> </render_slider> </response_lid> </flow> </flow> </presentation> </item> </questestinterop> |
Equivalent XML (with response processing):
<questestinterop> <qticomment> This is a multiple-choice with slider rendering. Response processing is included. </qticomment> <item title="Multiple Choice with Slider rendering Item" ident="IMS_V01_I_BasicExample008b"> <presentation label=" BasicExample008"> <flow> <flow> <material> <mattext>What is the value of 2 * 3 ?</mattext> </material> </flow> <flow> <response_lid ident="MC05" rcardinality="Single" rtiming="No"> <render_slider lowerbound="2" upperbound="10" step="2" startval="4" steplabel="Yes"> <response_label ident="a" rrange="absolute"> 2 </response_label> <response_label ident="b" rrange="Absolute"> 4 </response_label> <response_label ident="c" rrange="Absolute"> 6 </response_label> <response_label ident="d" rrange="Absolute"> 8 </response_label> <response_label ident="e" rrange="Absolute"> 10 </response_label> </render_slider> </response_lid> </flow> </flow> </presentation> <resprocessing> <outcomes> <decvar varname="SLIDECHOICE" vartype="Integer" defaultval="0"/> </outcomes> <respcondition> <qticomment>Scoring for the correct answer.</qticomment> <conditionvar> <varequal respident="MC05">c</varequal> </conditionvar> <setvar action="Add" varname="SLIDECHOICE">5</setvar> <displayfeedback feedbacktype="Response" linkrefid="Correct"/> </respcondition> <respcondition> <qticomment>Detecting the wrong answer.</qticomment> <conditionvar> <or> <varequal respident="MC05">a</varequal> <varequal respident="MC05">b</varequal> <varequal respident="MC05">d</varequal> <varequal respident="MC05">e</varequal> </or> </conditionvar> <displayfeedback feedbacktype="Response" linkrefid="Incorrect"/> </respcondition> </resprocessing> <itemfeedback ident="Correct" view="Candidate"> |
<flow_mat> <material><mattext>Correct.</mattext></material> </flow_mat> </itemfeedback> <itemfeedback ident="Incorrect" view="Candidate"> <flow_mat> <material><mattext>The correct answer is 6.</mattext></material> </flow_mat> </itemfeedback> <itemfeedback ident="Incorrect" view="Tutor"> <flow_mat> <material> <mattext>The student chose the wrong answer.</mattext> </material> </flow_mat> </itemfeedback> </item> </questestinterop> |
This XML code is available in the file: 'ims_qtiasiv1p2/basic/mchc_i_005/mchc_i_005.xml'. The equivalent example with the response processing incorporated is available in the file: 'ims_qtiasiv1p2/basic/mchc_ir_005/mchc_ir_005.xml'.
4.1.9 Standard Order Objects (Text)
Figure 4.9 shows a typical ordering question using text-based objects. The corresponding XML is listed after the figure. The user is expected to click on each text object and place them in the correct order.

<questestinterop> <qticomment> This is a standard ordering of a list of words. No response processing is incorporated. </qticomment> <item title="Standard Object Ordering of text Item" ident="IMS_V01_I_BasicExample009"> <presentation label="BasicExample009"> <flow> <material> <mattext>What is the correct order for the days of the week ? </mattext> </material> <response_lid ident="OB01" rcardinality="Ordered" rtiming="No"> <render_extension> <ims_render_object shuffle="Yes" orientation="Row"> <flow_label> <response_label ident="A"> <material><mattext>Monday</mattext></material> </response_label> <response_label ident="B"> <material><mattext>Thursday</mattext></material> </response_label> <response_label ident="C"> <material><mattext>Friday</mattext></material> </response_label> <response_label ident="D"> <material><mattext>Tuesday</mattext></material> </response_label> <response_label ident="E"> <material><mattext>Wednesday</mattext></material> </response_label> </flow_label> </ims_render_object> </render_extension> </response_lid> </flow> </presentation> </item> </questestinterop> |
Note: This example makes use of an extension i.e. ims_render_object. In a later version of the specification this extension may be adopted as part of the core specification.
This XML code is available in the file: 'ims_qtiasiv1p2/basic/oobj_i_001/oobj_i_001.xml'. The equivalent example with the response processing incorporated is available in the file: 'ims_qtiasiv1p2/basic/oobj_ir_001/oobj_ir_001.xml'.
4.1.10 Standard Order Objects (Image)
Figure 4.10 shows a typical ordering question using image-based objects. The corresponding XML is listed after the figure. The user is expected to move each of the objects around the screen using the mouse.

<questestinterop> <qticomment> This is a standard ordering of a group of images. </qticomment> <item title="Standard Object Ordering of images Item" ident="IMS_V01_I_BasicExample010"> <presentation label="BasicExample010"> <flow> <response_lid ident="OB02" rcardinality="Ordered" rtiming="No"> <material> <mattext>Put these objects in ascending order of size, starting with the smallest on the left hand side. </mattext> </material> <render_extension> <ims_render_object shuffle="Yes" orientation="Row"> <flow_label> <response_label ident="A"> <material> <matimage imagtype="image/gif" uri="object1.gif"> </matimage> </material> </response_label> <response_label ident="B"> <material> <matimage imagtype="image/gif" uri="object2.gif"> </matimage> </material> </response_label> <response_label ident="C"> <material> <matimage imagtype="image/gif" uri="object3.gif"> </matimage> </material> </response_label> <response_label ident="D"> <material> <matimage imagtype="image/gif" uri="object4.gif"> </matimage> </material> </response_label> <response_label ident="E"> |
<material> <matimage imagtype="image/gif" uri="object5.gif"> </matimage> </material> </response_label> </flow_label> </ims_render_object> </render_extension> </response_lid> </flow> </presentation> </item> </questestinterop>
Note: This example makes use of an extension i.e. ims_render_object. In a later version of the specification this extension may be adopted as part of the core specification.
This XML code is available in the file: 'ims_qtiasiv1p2/basic/oobj_i_002/oobj_i_002.xml'. The equivalent example with the response processing incorporated is available in the file: 'ims_qtiasiv1p2/basic/oobj_ir_002/oobj_ir_002.xml'.
4.1.11 Connect-the-Points
Figure 4.11 shows a typical connect-the-points question. The corresponding XML is listed after the figure. The user is expected to click on the appropriate points in the image to draw the corresponding figure.

The equivalent XML (without response processing):
<questestinterop> <qticomment> This is a standard connect-the-points example. The logical identifier response-type is used. No response processing is incorporated. </qticomment> <item title="standard connect-the-points item" ident="IMS_V01_I_BasicExample018"> <presentation label="BasicExample018"> <flow> <response_lid ident="CTP01" rcardinality="Multiple" rtiming="No"> <material> <mattext x0="50" width="200" y0="50" height="100">Connect the appropriate number of points to create a single right-angled triangle. </mattext> </material> <render_hotspot showdraw="Yes"> <material> <matimage imagtype="image/gif" uri="ctpoint.gif" x0 ="0" width ="400" y0="0" height="200"> </matimage> <material> <response_label ident="A" rarea ="Ellipse"> 300,20,1,1 </response_label> <response_label ident="B" rarea ="Ellipse"> 320,40,1,1 </response_label> <response_label ident="C" rarea ="Ellipse"> 380,100,1,1 </response_label> <response_label ident="D" rarea ="Ellipse"> 300,180,1,1 </response_label> <response_label ident="E" rarea ="Ellipse"> 240,120,1,1 </response_label> <response_label ident="F" rarea ="Ellipse"> 280,40,1,1 </response_label> </render_hotspot> </response_lid> </flow> </presentation> </item> <questestinterop> |
The equivalent XML (with response processing):
<questestinterop> <qticomment> This is a standard connect-the-points example. The logical identifier response-type is used. No response processing is incorporated. </qticomment> <item title="standard connect-the-points item" ident="IMS_V01_I_BasicExample018"> <presentation label="BasicExample018"> <flow> <response_lid ident="CTP01" rcardinality="Multiple" rtiming="No"> <material> <mattext x0="50" width ="200" y0="50" height ="100">Connect the appropriate number of points to create a single right-angled triangle. </mattext> </material> |
<render_hotspot showdraw="Yes"> <material> <matimage imagtype="image/gif" uri ="ctpoint.gif" x0 ="0" width="400" y0="0" height ="200"> </matimage> <material> <response_label ident="A" rarea="Ellipse">300,20,1,1 </response_label> <response_label ident="B" rarea="Ellipse">320,40,1,1 </response_label> <response_label ident="C" rarea="Ellipse"380,100,1,1 </response_label> <response_label ident="D" rarea="Ellipse">300,180,1,1 </response_label> <response_label ident="E" rarea="Ellipse">240,120,1,1 </response_label> <response_label ident="F" rarea="Ellipse">280,40,1,1 </response_label> </render_hotspot> </response_lid> </flow> </presentation> <resprocessing> <outcomes> <decvar varname="CTPCHOICE" vartype="Integer" defaultval="0"/> </outcomes> <respcondition> <qticomment>Scoring for the correct answer.</qticomment> <conditionvar> <and> <varequal respident="CTP01">A</varequal> <varequal respident="CTP01">D</varequal> <or> <varequal respident="CTP01">B</varequal> <varequal respident="CTP01">C</varequal> <varequal respident="CTP01">E</varequal> <varequal respident="CTP01">F</varequal> </or> </and> </conditionvar> <setvar action="Add" varname="CTPCHOICE">3</setvar> <displayfeedback feedbacktype="Response" linkrefid="Correct"/> </respcondition> </resprocessing> <itemfeedback ident="Correct" view="Candidate"> <material><mattext>Correct.</mattext></material> </itemfeedback> </item> <questestinterop> |
This XML code is available in the file: 'ims_qtiasiv1p2/basic/ctpt_i_001/ctpt_i_001.xml'. The equivalent example with the response processing incorporated is available in the file: 'ims_qtiasiv1p2/basic/ctpt_ir_001/ctpt_ir_001.xml'.
4.2 XY Co-ordinate
4.2.1 Standard Image Hot Spot
Figure 4.12 shows a typical image hot spot question. The corresponding XML is listed after the figure. The user is expected to click on the appropriate area in the image.

The equivalent XML (without response processing):
<questestinterop> <qticomment> This is a standard image hotspot example. No response processing is incorporated. </qticomment> <item title="standard image hotspot item" ident="IMS_V01_I_BasicExample011"> <presentation label="BasicExample011"> <flow> <material> <mattext>Identify the vdu.</mattext> </material> <flow> <response_xy ident="IHS01" rcardinality="Single" rtiming="No"> <render_hotspot> <material> <matimage imagtype="image/gif" uri="ihsvdu.gif" x0="0" width="300" y0="0" height="400"> </matimage> </material> <response_label ident="A" rarea="Rectangle">50,200,250,350 </response_label> </render_hotspot> </response_xy> </flow> </flow> </presentation> </item> </questestinterop>
The equivalent XML (with response processing):
<questestinterop> <qticomment> This is a standard image hotspot example. </qticomment> <item title="standard image hotspot item" ident="IMS_V01_I_BasicExample011b"> <presentation label="BasicExample011"> <flow> <material> <mattext>Identify the vdu.</mattext> </material> <flow> <response_xy ident="IHS01" rcardinality="Single" rtiming="No"> <render_hotspot> <material> <matimage imagtype="image/gif" uri="ihsvdu.gif" x0="0" width="300" y0="0" height ="400"> </matimage> <material> <response_label ident="A" rarea ="Rectangle">50,200,250,350 </response_label> </render_hotspot> </response_xy> </flow> </flow> </presentation> <resprocessing> <outcomes> <decvar varname="IHSSCORE" vartype="Integer" defaultval="1"/> |
</outcomes> <respcondition> <qticomment>Scoring for the correct answer.</qticomment> <conditionvar> <varinside respident="IHS01" areatype="Rectangle">50,200,250,350 </varinside> </conditionvar> <setvar action="Add" varname="IHSSCORE">1</setvar> <displayfeedback feedbacktype="Response" linkrefid="Correct"/> </respcondition> <respcondition> <qticomment>Scoring for the incorrect answer.</qticomment> <conditionvar> <not> <varinside respident="IHS01" areatype="Rectangle">50,200,250,350 </varinside> </not> </conditionvar> <setvar action="Subtract" varname="IHSSCORE">1</setvar> <displayfeedback feedbacktype="Response" linkrefid="Incorrect"/> </respcondition> </resprocessing> <itemfeedback ident="Correct" view="Candidate"> <flow_mat> <material><mattext>Correct.</mattext></material> </flow_mat> </itemfeedback> <itemfeedback ident="Incorrect" view="Candidate"> <flow_mat> <material><mattext>No, that is not the VDU.</mattext></material> </flow_mat> </itemfeedback> </item> </questestinterop> |
This XML code is available in the file: 'ims_qtiasiv1p2/basic/ihsp_i_001/ihsp_i_001.xml'. The equivalent example with the response processing incorporated is available in the file: 'ims_qtiasiv1p2/basic/ihsp_ir_001/ihsp_ir_001.xml'.
4.2.2 Connect-the-Points
Figure 4.13 shows a typical connect-the-points question. The corresponding XML is listed after the figure. The user is expected to click on the appropriate points in the image to draw the corresponding figure.

The equivalent XML (without response processing):
<questestinterop> <qticomment> This is a standard connect-the-points example. The XY response-type is used. </qticomment> <item title="standard connect-the-points item" ident="IMS_V01_I_BasicExample019"> <presentation label="BasicExample019"> <flow> <response_xy ident="CTP02" rcardinality="Multiple" rtiming="No"> <material> <mattext x0="50" width="200" y0="50" height="100">Connect the appropriate number of points to create a single right-angled triangle. </mattext> </material> <render_hotspot showdraw="Yes"> <material> <matimage imagtype="image/gif" uri="ctpoint.gif" x0="0" width="400" y0="0" height="200"> </matimage> <material> <response_label ident="A" rarea="Ellipse">300,20,1,1 </response_label> <response_label ident="B" rarea="Ellipse">320,40,1,1 </response_label> <response_label ident="C" rarea="Ellipse">380,100,1,1 </response_label> <response_label ident="D" rarea="Ellipse">300,180,1,1 </response_label> <response_label ident="E" rarea="Ellipse">240,120,1,1 </response_label> <response_label ident="F" rarea="Ellipse">280,40,1,1 </response_label> </render_hotspot> </response_xy> </flow> </presentation> </item> <questestinterop> |
The equivalent XML (with response processing):
<questestinterop> <qticomment> This is a standard connect-the-points example. The XY response-type is used. </qticomment> <item title="standard connect-the-points item" ident="IMS_V01_I_BasicExample019"> <presentation label="BasicExample019"> <flow> <response_xy ident="CTP02" rcardinality="Multiple" rtiming="No"> <material> <mattext>Connect the appropriate number of points to create a single right-angled triangle. </mattext> </material> |
<render_hotspot showdraw="Yes"> <material> <matimage imagtype="image/gif" uri="ctpoint.gif" x0="0" width="400" y0="0" height="200"> </matimage> <material> <response_label ident="A" rarea="Ellipse">300,20,1,1 </response_label> <response_label ident="B" rarea="Ellipse">320,40,1,1 </response_label> <response_label ident="C" rarea="Ellipse">380,100,1,1 </response_label> <response_label ident="D" rarea="Ellipse">300,180,1,1 </response_label> <response_label ident="E" rarea="Ellipse">240,120,1,1 </response_label> <response_label ident="F" rarea="Ellipse">280,40,1,1 </response_label> </render_hotspot> </response_xy> </flow> </presentation> <resprocessing> <outcomes> <decvar varname="CTPCHOICE" vartype="Integer" defaultval="0"/> </outcomes> <respcondition> <qticomment>Scoring for the correct answer.</qticomment> <conditionvar> <and> <varinside respident="CTP02" areatype="Ellipse">300,20,1,1 </varinside> <varinside respident="CTP02" areatype="Ellipse">300,180,1,1 </varinside> <or> <varinside respident="CTP02" areatype="Ellipse">320,40,1,1 </varinside> <varinside respident="CTP02" areatype="Ellipse">380,100,1,1 </varinside> <varinside respident="CTP02" areatype="Ellipse">240,120,1,1 </varinside> <varinside respident="CTP02" areatype="Ellipse">280,40,1,1 </varinside> </or> </and> </conditionvar> <setvar action="Add" varname="CTPCHOICE">3</setvar> <displayfeedback feedbacktype="Response" linkrefid="Correct"/> </respcondition> </resprocessing> <itemfeedback ident="Correct" view="Candidate"> <flow_mat> <material> <mattext>Correct.</mattext> </material> </flow_mat> </itemfeedback> </item> <questestinterop> |
This XML code is available in the file: 'ims_qtiasiv1p2/basic/ctpt_i_002/ctpt_i_002.xml'. The equivalent example with the response processing incorporated is available in the file: 'ims_qtiasiv1p2/basic/ctpt_ir_002/ctpt_ir_002.xml'.
4.3 String
4.3.1 Standard Fill-in-Blank (Text)
Figure 4.14 shows a typical text fill-in-blank (FIB) question. The corresponding XML is listed after the figure. The user is required to type the answer into the allocated space.

The recommended V1.2 compliant XML (with response processing) is:
<questestinterop> <qticomment> This is a standard fill-in-blank (text) example. </qticomment> <item title="Standard FIB string Item" ident="IMS_V01_I_fibs_ir_001"> <presentation label="BasicExample012b"> <flow> <material> <mattext>Complete the sequence: </mattext> </material> <flow> <material> <mattext>Winter, Spring, Summer, </mattext> </material> <response_str ident="FIB01" rcardinality="Single" rtiming="No"> <render_fib fibtype="String" prompt="Dashline" maxchars="6"> <response_label ident="A"/> <material> <mattext>.</mattext> </material> </render_fib> </response_str> </flow> </flow> </presentation> <resprocessing> <outcomes> <decvar varname="FIBSCORE" vartype="Integer" defaultval="0"/> </outcomes> <respcondition> <qticomment>Scoring for the correct answer.</qticomment> <conditionvar> <varequal respident="FIB01" case="Yes">Autumn</varequal> </conditionvar> <setvar action="Add" varname="FIBSCORE">1</setvar> <displayfeedback feedbacktype="Response" linkrefid="Correct"/> </respcondition> </resprocessing> <itemfeedback ident="Correct" view="Candidate"> <flow_mat> <material><mattext>Yes, the season of Autumn.</mattext></material> </flow_mat> </itemfeedback> </item> </questestinterop> |
This XML code is available in the file: 'ims_qtiasiv1p2/basic/fibs_ir_001/fibs_ir_001a.xml'.
4.3.2 Standard Multiple Fill-in-Blank (Text)
Figure 4.15 shows a typical text fill-in-blank question with multiple entries. The corresponding XML is listed after the figure. The user is expected to type the answers into the allocated spaces.

The recommended V1.2 compliant XML with response processing is:
<questestinterop> <qticomment> This is a standard multiple fill-in-blank (text) example. </qticomment> <item title="Standard FIB string Item" ident="IMS_V01_I_fibs_ir_002"> <presentation label="BasicExample013b"> <flow> <material> <mattext>Fill-in-the blanks in this text from Richard III: </mattext> </material> <flow> <material> <mattext>Now is the </mattext> </material> <response_str ident="FIB01" rcardinality="Single" rtiming="No"> <render_fib fibtype="String" prompt="Dashline" maxchars="6"> <response_label ident="A"/> </render_fib> </response_str> <material> <mattext> of our discontent made glorious </mattext> </material> <response_str ident="FIB02" rcardinality="Single" rtiming="No"> <render_fib fibtype="String" prompt="Dashline" maxchars="6"> <response_label ident="A"/> </render_fib> </response_str> <material> <mattext> by these sons of </mattext> </material> <response_str ident="FIB03" rcardinality="Single" rtiming="No"> <render_fib fibtype="String" prompt="Dashline" maxchars="4"> <response_label ident="A"/> </render_fib> </response_str> </flow> </flow> </presentation> <resprocessing> <outcomes> <decvar varname="FIBSCORE1" vartype="Integer" defaultval="0"/> |
</outcomes> <respcondition> <qticomment>Scoring for the correct answer.</qticomment> <conditionvar> <varequal respident="FIB01" case="Yes">Winter</varequal> <varequal respident="FIB02" case="Yes">Summer</varequal> <varequal respident="FIB03" case="Yes">York</varequal> </conditionvar> <setvar action="Add" varname="FIBSCORE1">3</setvar> <displayfeedback feedbacktype="Response" linkrefid="AllCorrect"/> </respcondition> </resprocessing> <resprocessing> <outcomes> <decvar varname="DUMMY"/> </outcomes> <respcondition> <qticomment>Detecting incorrect asnwers for feedback.</qticomment> <conditionvar> <not><varequal respident="FIB01" case="Yes">Winter</varequal></not> </conditionvar> <displayfeedback feedbacktype="Response" linkrefid="InCorrect1"/> </respcondition> </resprocessing> <resprocessing> <outcomes> <decvar/> </outcomes> <respcondition> <qticomment>Detecting incorrect asnwers for feedback.</qticomment> <conditionvar> <not><varequal respident="FIB02" case="Yes">Summer</varequal></not> </conditionvar> <displayfeedback feedbacktype="Response" linkrefid="InCorrect2"/> </respcondition> </resprocessing> <resprocessing> <outcomes> <decvar/> </outcomes> <respcondition> <qticomment>Detecting incorrect asnwers for feedback.</qticomment> <conditionvar> <not><varequal respident="FIB03" case="Yes">York</varequal></not> </conditionvar> <displayfeedback feedbacktype="Response" linkrefid="InCorrect3"/> </respcondition> </resprocessing> <itemfeedback ident="AllCorrect" view="Candidate"> <flow_mat> <material><mattext>All correct. Well done.</mattext></material> </flow_mat> </itemfeedback> <itemfeedback ident="InCorrect1" view="Candidate"> <flow_mat> <material> <mattext>No. The correct first answer is "Winter".</mattext> </material> </flow_mat> |
</itemfeedback> <itemfeedback ident="InCorrect2" view="Candidate"> <flow_mat> <material> <mattext>No. The correct second answer is "Summer".</mattext> </material> </flow_mat> </itemfeedback> <itemfeedback ident="InCorrect3" view="Candidate"> <flow_mat> <material> <mattext>No. The correct third answer is "York".</mattext> </material> </flow_mat> </itemfeedback> </item> </questestinterop> |
This XML code is available in the file: 'ims_qtiasiv1p2/basic/fibs_ir_002/fibs_ir_002.xml'.
4.3.3 Standard Short Answer
Figure 4.16 shows a typical short answer question. The corresponding XML is listed after the figure. The user is expected to type text into the space supplied.

The V1.0 and V1.1 compliant XML (without response processing) is:
<questestinterop> <qticomment> This is a standard fill-in-blank short answer example. No response processing is incorporated. </qticomment> <item title="Standard FIB short answer Item" ident="IMS_V01_I_BasicExample014"> <presentation label="BasicExample014"> <flow> <material> <mattext>In less than 100 words describe how you start a car.</mattext> </material> <response_str ident="FIB91" rcardinality="Ordered" rtiming="No"> <render_fib fibtype="String" prompt="Box" rows="20" columns="80"> <response_label ident="A"/> </render_fib> </response_str> </flow> </presentation> </item> </questestinterop> |
This XML code is available in the file: 'ims_qtiasiv1p2/basic/fibs_i_003/fibs_i_003.xml'. The equivalent example with the response processing incorporated is available in the file: 'ims_qtiasiv1p2/basic/fibs_ir_003/fibs_ir_003.xml'.
4.4 Numerical
4.4.1 Standard Fill-in-Blank (Decimal)
Figure 4.17 shows a typical FIB number. The corresponding XML is listed after the figure. The user is expected to type the appropriate number into the supplied box.

The equivalent XML (without response processing) is:
<questestinterop> <qticomment> This is a standard numerical fill-in-blank (decimal) example. No response processing is incorporated. </qticomment> <item title="Standard FIB numerical Item" ident="IMS_V01_I_BasicExample015"> <presentation label="BasicExample015"> <flow> <material> <mattext charset="ascii/us">Give the value of</mattext> <mattext charset="greek"> p </mattext> <mattext charset="ascii/us"> to three decimal places: </mattext> </material> <response_num ident="NUM01" rcardinality="Single" rtiming="No"> <render_fib fibtype="Decimal" prompt="Box" maxchars="6"> <response_label ident="A"/> </render_fib> </response_num> </flow> </presentation> </item> </questestinterop> |
The equivalent XML (with response processing) is:
<questestinterop> <qticomment> This is a standard numerical fill-in-blank (decimal) example. Response processing is incorporated. </qticomment> <item title="Standard FIB numerical Item" ident="IMS_V01_I_BasicExample015b"> <presentation label="BasicExample015b"> <flow> <material> <mattext charset="ascii/us">Give the value of </mattext> <mattext charset="greek"> p </mattext> <mattext charset="ascii/us"> to three decimal places: </mattext> </material> <response_num ident="NUM01" rcardinality="Single" rtiming="No" numtype="Decimal"> <render_fib fibtype="Decimal" prompt="Box" maxchars="6"> <response_label ident="A"/> </render_fib> </response_num> </flow> </presentation> <resprocessing> <outcomes> <decvar varname="REALSCORE" vartype="Integer" defaultval="0"/> </outcomes> <respcondition> <qticomment>Scoring for the correct answer.</qticomment> <conditionvar> <vargte respident="NUM01">3.141</vargte> <varlte respident="NUM01">3.149</varlte> </conditionvar> <setvar action="Add" varname="REALSCORE">1</setvar> |
<displayfeedback feedbacktype="Response" linkrefid="Correct"/> </respcondition> <respcondition> <qticomment>Scoring for the incorrect answer. </qticomment> <conditionvar> <not> <and> <vargte respident="NUM01">3.141</vargte> <varlte respident="NUM01">3.149</varlte> </and> </not> </conditionvar> <setvar action="Subtract" varname="REALSCORE">1</setvar> <displayfeedback feedbacktype="Response" linkrefid="Incorrect"/> </respcondition> </resprocessing> <itemfeedback ident="Correct" view="Candidate"> <flow_mat> <material> <mattext>Yes, you are correct. Well done.</mattext> </material> </flow_mat> </itemfeedback> <itemfeedback ident="Incorrect" view="Candidate"> <flow_mat> <material> <mattext>No. The correct answer is 3.142.</mattext> </material> </flow_mat> </itemfeedback> </item> </questestinterop> |
This XML code is available in the file: 'ims_qtiasiv1p2/basic/fibn_i_001/fibn_i_001.xml'. The equivalent example with the response processing incorporated is available in the file: 'ims_qtiasiv1p2/basic/fibn_ir_001/fibn_ir_001.xml'.
4.4.2 Standard Fill-in-Blank (Integer)
Figure 4.18 shows a typical numerical entry using FIB rendering. The corresponding XML is listed after the figure. The user must enter an integer.

<questestinterop>
<qticomment>
This is a standard numerical fill-in-blank (integer) example.
No response processing is incorporated.
</qticomment>
<item title="Standard FIB numerical Item" ident="IMS_V01_I_BasicExample016">
<presentation label="BasicExample016">
<flow>
<material>
<mattext charset="ascii/us">What is 13 x 13 ?</mattext>
</material>
<response_num ident="NUM02" rcardinality="Single" rtiming="No">
<render_fib fibtype="Integer" prompt="Asterisk" maxchars="3">
<response_label ident="A"/>
</render_fib>
</response_num>
</flow>
</presentation>
</item>
</questestinterop>
This XML code is available in the file: 'ims_qtiasiv1p2/basic/fibi_i_001/fibi_i_001.xml'. The equivalent example with the response processing incorporated is available in the file: 'ims_qtiasiv1p2/basic/fibi_ir_001/fibi_ir_001.xml'.
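The response processing in the shipped 'fibi_ir_001' example is not reproduced here. As a rough guide only, the following is a minimal sketch of the kind of response processing that could be added to the Item above; the outcome variable name 'INTSCORE' and the feedback identifier 'Correct' are illustrative and are not taken from the example file.
<resprocessing>
  <outcomes>
    <decvar varname="INTSCORE" vartype="Integer" defaultval="0"/>
  </outcomes>
  <respcondition>
    <qticomment>Scoring for the correct answer (13 x 13 = 169).</qticomment>
    <conditionvar>
      <varequal respident="NUM02">169</varequal>
    </conditionvar>
    <setvar action="Set" varname="INTSCORE">1</setvar>
    <displayfeedback feedbacktype="Response" linkrefid="Correct"/>
  </respcondition>
</resprocessing>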
4.4.3 Numerical Entry with Slider
Figure 4.19 shows a typical numerical FIB with slider rendering. The corresponding XML is listed after the figure. The user must move the slider until it displays the required answer.

<questestinterop> <qticomment> This is a standard numerical fill-in-blank using slider rendering. No response processing is incorporated. </qticomment> <item title="Standard FIB numerical Item with Slider rendering" ident="IMS_V01_I_BasicExample017"> <presentation label="BasicExample017"> <flow> <material> <mattext>How many degrees are there in a triangle ?</mattext> </material> <response_num ident="NUM04" rcardinality="Single" rtiming="No"> <render_slider lowerbound="1" upperbound="360" step="1" startval="90" steplabel="no" orientation="horizontal"> <response_label ident="A"/> </render_slider> </response_num> </flow> </presentation> </item> </questestinterop> |
This XML code is available in the file: 'ims_qtiasiv1p2/basic/fibi_I_002/fibi_I_001.xml'. The equivalent example with the response processing incorporated is available in the file: 'ims_qtiasiv1p2/basic/fibi_ir_002/fibi_ir_001.xml'.
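Again, the shipped response processing file is not reproduced here. A minimal sketch for the slider Item above might simply test the returned value directly; the variable name 'SLIDESCORE' and the feedback identifier are illustrative only.
<resprocessing>
  <outcomes>
    <decvar varname="SLIDESCORE" vartype="Integer" defaultval="0"/>
  </outcomes>
  <respcondition>
    <qticomment>A triangle contains 180 degrees.</qticomment>
    <conditionvar>
      <varequal respident="NUM04">180</varequal>
    </conditionvar>
    <setvar action="Set" varname="SLIDESCORE">1</setvar>
    <displayfeedback feedbacktype="Response" linkrefid="Correct"/>
  </respcondition>
</resprocessing>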
4.5 Logical Group
4.5.1 Drag-and-Drop (Images)
Figure 4.20 shows a typical drag-and-drop question. The corresponding XML is listed after the figure. The user is expected to click on the appropriate answer images and to place them in the appropriate response holders.

The equivalent XML (without response processing) is:
<questestinterop> <qticomment> This illustrates an example 'drag - drop' question. Candidates are required to place a label showing the name of a planet in the appropriate position to identify the planets in our solar system. Each correctly placed label gives the candidate one mark. If one or more of the labels is incorrectly placed, feedback is given naming the planets in correct order. </qticomment> <item title="The Planets" ident="qm_1052138399372757"> <presentation> <flow> <flow> <material> <mattext texttype="text/plain"> Place the text markers inside the relevant boxes to identify the planets of our solar system. </mattext> </material> </flow> <flow> <material> <mattext>A point will be awarded for every correct answer. </mattext> </material> </flow> <flow> <material> <matimage imagtype="image/jpg" uri="solar_system.jpg" height="220" width="560"/> </material> </flow> <response_grp ident="planets" rcardinality="Multiple"> <render_extension> <ims_render_object> <response_label ident="earth_source" match_max="1" match_group="mercury_target, venus_target, earth_target, mars_target, jupiter_target, saturn_target, uranus_target, neptune_target, pluto_target"> <material> <matimage imagtype="image/jpg" uri="earth.jpg" height="16" width="39" x0="300" y0="0"/> </material> </response_label> <response_label ident="venus_source" match_max="1" match_group="mercury_target, venus_target, earth_target, mars_target, jupiter_target, saturn_target, uranus_target, neptune_target, pluto_target"> <material> <matimage imagtype="image/jpg" uri="venus.jpg" height="16" width="50" x0="300" y0="50"/> </material> </response_label> <response_label ident="jupiter_source" match_max="1" match_group="mercury_target, venus_target, earth_target, mars_target, jupiter_target, saturn_target, uranus_target, neptune_target, pluto_target"> |
<material> <matimage imagtype="image/jpg" uri="jupiter.jpg" height="16" width="50" x0="300" y0="100"/> </material> </response_label> <response_label ident="mars_source" match_max="1" match_group="mercury_target, venus_target, earth_target, mars_target, jupiter_target, saturn_target, uranus_target, neptune_target, pluto_target"> <material> <matimage imagtype="image/jpg" uri="mars.jpg" height="16" width="58" x0="300" y0="150"/> </material> </response_label> <response_label ident="neptune_source" match_max="1" match_group="mercury_target, venus_target, earth_target, mars_target, jupiter_target, saturn_target, uranus_target, neptune_target, pluto_target"> <material> <matimage imagtype="image/jpg" uri="neptune.jpg" height="16" width="56" x0="300" y0="200"/> </material> </response_label> <response_label ident="pluto_source" match_max="1" match_group="mercury_target, venus_target, earth_target, mars_target, jupiter_target, saturn_target, uranus_target, neptune_target, pluto_target"> <material> <matimage imagtype="image/jpg" uri="pluto.jpg" height="16" width="34" x0="300" y0="250"/> </material> </response_label> <response_label ident="saturn_source" match_max="1" match_group="mercury_target, venus_target, earth_target, mars_target, jupiter_target, saturn_target, uranus_target, neptune_target, pluto_target"> <material> <matimage imagtype="image/jpg" uri="saturn.jpg" height="16" width="51" x0="300" y0="300"/> </material> </response_label> <response_label ident="uranus_source" match_max="1" match_group="mercury_target, venus_target, earth_target, mars_target, jupiter_target, saturn_target, uranus_target, neptune_target, pluto_target"> <material> <matimage imagtype="image/jpg" uri="uranus.jpg" height="16" width="62" x0="300" y0="350"/> </material> </response_label> <response_label ident="mercury_source" match_max="1" match_group="mercury_target, venus_target, earth_target, mars_target, jupiter_target, saturn_target, uranus_target, neptune_target, |
pluto_target"> <material> <matimage imagtype="image/jpg" uri="mercury.jpg" height="16" width="59" x0="300" y0="400"/>/> </material> </response_label> <response_label ident="mercury_target" rarea="Rectangle"> 40, 40, 25,80 </response_label> <response_label ident="venus_target" rarea="Rectangle"> 80, 80, 25,80 </response_label> <response_label ident="earth_target" rarea="Rectangle"> 120, 1200, 25,80 </response_label> <response_label ident="mars_target" rarea="Rectangle"> 160, 80, 25,80 </response_label> <response_label ident="jupiter_target" rarea="Rectangle"> 200, 40, 25,80 </response_label> <response_label ident="saturn_target" rarea="Rectangle"> 250, 50, 25,80 </response_label> <response_label ident="uranus_target" rarea="Rectangle"> 325, 60, 25,80 </response_label> <response_label ident="neptune_target" rarea="Rectangle"> 425, 80, 25,80 </response_label> <response_label ident="pluto_target" rarea="Rectangle"> 500, 120, 25,80 </response_label> </ims_render_object> </render_extension> </response_grp> </flow> </presentation> </item> <questestinterop> |
This XML code is available in the file: 'ims_qtiasiv1p2/basic/dobj_i_001/dobj_i_001.xml'. The equivalent example with the response processing incorporated is available in the file: 'ims_qtiasiv1p2/basic/dobj_ir_001/dobj_ir_001.xml'.
5. Example Composite Item Types
The composite examples supplied are described in the following subsections.
5.1 Multiple Choice Derivatives
5.1.1 Multiple Choice with Fill-in-Blank
Figure 5.1 shows the multiple-choice question with an additional FIB response opportunity. The corresponding XML is listed after the figure.

The equivalent V1.2 XML (with response processing) is:
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 |
<questestinterop> <qticomment> This is a composite item. It consists of a standard multiple choice question and a fill-in-blank alternative. Response processing for the Multiple-choice part is supplied. The free format material in the FIB should be returned for non-computer based marking. </qticomment> <item title="Composite Item" ident="IMS_V01_I_mcfb_ir_001"> <presentation label="CompExample001"> <flow> <material> <mattext>Which </mattext> <matemtext>city </matemtext> <mattext>is the capital of </mattext> <matemtext>England </matemtext> <mattext>and name another city in England ?</mattext> </material> <response_lid ident="Comp_MC01" rcardinality="Single" rtiming="No"> |
21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 |
<render_choice shuffle="Yes"> <response_label ident="A"> <flow_mat class="List"> <material><mattext>Sheffield</mattext></material> </flow_mat> </response_label> <response_label ident="B"> <flow_mat class="List"> <material><mattext>London</mattext></material> </flow_mat> </response_label> <response_label ident="C"> <flow_mat class="List"> <material><mattext>Manchester</mattext></material> </flow_mat> </response_label> <response_label ident="D"> <flow_mat class="List"> <material><mattext>Edinburgh</mattext></material> </flow_mat> </response_label> </render_choice> </response_lid> <response_str ident="Comp_FIB01" rcardinality="Single" rtiming="No"> <render_fib fibtype="String" prompt="Box"> <material> <mattext>Another city:</mattext> </material> <response_label ident="A"/> </render_fib> </response_str> </flow> </presentation> <resprocessing> <outcomes> <decvar varname="MCSCORE" vartype="Integer" defaultval="0"/> </outcomes> <respcondition title="Mcorrect"> <conditionvar> <varequal respident="Comp_MC01">B</varequal> </conditionvar> <setvar action="Set" varname="MCSCORE">1</setvar> <displayfeedback feedbacktype="Response" linkrefid="Mcorrect"/> </respcondition> </resprocessing> <itemfeedback ident="Mcorrect" view="Candidate"> <flow_mat> <material><mattext>Yes, you are right.</mattext></material> </flow_mat> </itemfeedback> </item> </questestinterop> |
The string-based answer will require human marking intervention. The key differences between this instance and the V1.0 equivalent are:
- The actual question contains emphasised text (lines 13-19);
- The flow structure is added to the <presentation> element;
- The <flow_mat> element is added to the <itemfeedback> element (lines 66-70).
This XML code is available in the file: 'ims_qtiasiv1p2/composite/mcfb_ir_001/mcfb_ir_001.xml'.
5.1.2 Matrix-based Multiple Response
Figure 5.2 shows a composite multiple-choice question arranged as a matrix i.e. only one answer per row is permitted. The corresponding XML is listed after the figure. The user must choose one option per row.

The equivalent V1.2 XML (with response processing) is:
<questestinterop> <qticomment> This is a composite item. It consists of a matrix of multiple-choice questions. </qticomment> <item title="Composite Item" ident="IMS_V01_I_matx_ir_002"> <presentation label="CompExample002b"> <flow> <material> <mattext> Which of the following are used to describe the passage of time ? </mattext> </material> <response_lid ident="Comp_MC01" rcardinality="Single" rtiming="No"> <render_choice shuffle="Yes"> <flow_label> <response_label ident="A"> <material><mattext>Hour</mattext></material> </response_label> <response_label ident="B"> <material><mattext>Galleon</mattext></material> </response_label> <response_label ident="C"> <material><mattext>Mile</mattext></material> </response_label> </flow_label> |
</render_choice> </response_lid> <response_lid ident="Comp_MC02" rcardinality="Single" rtiming="No"> <render_choice shuffle="Yes"> <flow_label> <response_label ident="A"> <material><mattext>Metre</mattext></material> </response_label> <response_label ident="B"> <material><mattext>Dozen</mattext></material> </response_label> <response_label ident="C"> <material><mattext>Decade</mattext></material> </response_label> </flow_label> </render_choice> </response_lid> <response_lid ident="Comp_MC03" rcardinality="Single" rtiming="No"> <render_choice shuffle="Yes"> <flow_label> <response_label ident="A"> <material><mattext>Tonne</mattext></material> </response_label> <response_label ident="B"> <material><mattext>Century</mattext></material> </response_label> <response_label ident="C"> <material><mattext>Score</mattext></material> </response_label> </flow_label> </render_choice> </response_lid> </flow> </presentation> <resprocessing> <outcomes> <decvar varname="MCSCORE1" vartype="Integer" defaultval="0"/> </outcomes> <respcondition title="MfullCorrect"> <conditionvar> <and> <varequal respident="Comp_MC01">A</varequal> <varequal respident="Comp_MC02">C</varequal> <varequal respident="Comp_MC03">B</varequal> </and> </conditionvar> <setvar action="Set" varname="MCSCORE1">3</setvar> <displayfeedback feedbacktype="Response" linkrefid="FullCorrect"/> </respcondition> <respcondition title="MtwoCorrect"> <conditionvar> <or> <and> <varequal respident="Comp_MC01">A</varequal> <varequal respident="Comp_MC02">C</varequal> <not><varequal respident="Comp_MC03">B</varequal></not> </and> <and> <varequal respident="Comp_MC01">A</varequal> |
<varequal respident="Comp_MC02">B</varequal> <not><varequal respident="Comp_MC03">C</varequal></not> </and> <and> <varequal respident="Comp_MC01">B</varequal> <varequal respident="Comp_MC02">C</varequal> <not><varequal respident="Comp_MC03">A</varequal></not> </and> </or> </conditionvar> <setvar action="Set" varname="MCSCORE1">2</setvar> <displayfeedback feedbacktype="Response" linkrefid="TwoCorrect"/> </respcondition> <respcondition title="MoneCorrect"> <conditionvar> <or> <and> <varequal respident="Comp_MC01">A</varequal> <not><varequal respident="Comp_MC02">C</varequal></not> <not><varequal respident="Comp_MC03">B</varequal></not> </and> <and> <varequal respident="Comp_MC01">C</varequal> <not><varequal respident="Comp_MC02">A</varequal></not> <not><varequal respident="Comp_MC03">B</varequal></not> </and> <and> <varequal respident="Comp_MC01">B</varequal> <not><varequal respident="Comp_MC02">C</varequal></not> <not><varequal respident="Comp_MC03">A</varequal></not> </and> </or> </conditionvar> <setvar action="Set" varname="MCSCORE1">1</setvar> <displayfeedback feedbacktype="Response" linkrefid="OneCorrect"/> </respcondition> </resprocessing> <itemfeedback ident="FullCorrect" view="Candidate"> <flow_mat> <material><mattext>Yes, you are right.</mattext></material> </flow_mat> </itemfeedback> <itemfeedback ident="TwoCorrect" view="Candidate"> <flow_mat> <material> <mattext>Only two of your answers are correct.</mattext> </material> </flow_mat> </itemfeedback> <itemfeedback ident="OneCorrect" view="Candidate"> <flow_mat> <material> <mattext>Only one of your answers is correct.</mattext> </material> </flow_mat> </itemfeedback> </item> </questestinterop> |
This XML code is available in the file: 'ims_qtiasiv1p2/composite/matx_ir_001/matx_ir_001.xml'.
6. Example XML Schema
The examples shown herein all conform to the Version 1.2 form of the full 1EdTech QTI specification. Specifically they include the usage of the <flow> and related elements and attributes.
6.1 Item Examples
6.1.1 Minimum Definition
The XML for the minimal single useful 'Item' is as follows (this is also shown schematically as Figure 4.1):
<questestinterop> <item ident="IMS_V01_I_BasicExample001a"> <presentation label="BasicExample001a"> <flow> <material> <mattext>Paris is the Capital of France ?</mattext> </material> <response_lid ident="TF01"> <render_choice> <response_label ident="T"> <material><mattext>Agree</mattext></material> </response_label> <response_label ident="F"> <material><mattext>Disagree</mattext></material> </response_label> </render_choice> </response_lid> </flow> </presentation> </item> </questestinterop>
6.1.2 Full Definition
The XML for a complete single useful 'Item' is as follows (this is also shown schematically as Figure 4.1 and is an extension of the XML code for the minimal single useful Item example):
<questestinterop> <qticomment> This is a simple True/False multiple-choice example. The rendering is a standard radio button style. </qticomment> <item ident="IMS_V01_I_BasicExample001b"> <duration>pH1</duration> <itemmetadata/> <objectives view="Candidate"> <flow_mat> <material> <mattext>To test your understanding of French cities.</mattext> </material> </flow_mat> </objectives> <objectives view="Scorer"> <flow_mat> <material> <mattext>Award marks for the right answer only.</mattext> </material> </flow_mat> </objectives> <itemcontrol hintswitch="Yes"/> <rubric> <flow_mat> <material> <mattext>Attempt all questions.</mattext> </material> </flow_mat> </rubric> <presentation label="BasicExample001b"> <flow> <material> <mattext>Paris is the Capital of France ?</mattext> </material> <response_lid ident="TF01" rcardinality="Single" rtiming="No"> <render_choice> <flow_label> <response_label ident="T"> <material><mattext>Agree</mattext></material> </response_label> </flow_label> <flow_label> <response_label ident="F"> <material><mattext>Disagree</mattext></material> </response_label> </flow_label> </render_choice> </response_lid> </flow> </presentation> <resprocessing> <outcomes><decvar/></outcomes> <respcondition title="Correct"> <conditionvar> <varequal respident="TF01">T</varequal> </conditionvar> <setvar action="Set">1</setvar> <displayfeedback feedbacktype="Response" linkrefid="Correct"/> </respcondition> </resprocessing> <itemfeedback ident="Correct" view="Candidate"> <flow_mat> <material><mattext>Yes, you are right.</mattext></material> </flow_mat> </itemfeedback> </item> </questestinterop>
This example is given in the example file: 'ims_qtiasiv1p2/basic/trfl_ir_002/trfl_ir_002.xml'
6.2 Section Examples
6.2.1 Minimum Definition
The XML for the minimal single useful 'Section' is as follows:
<questestinterop> <section ident="IMS_V01_S_Example001"> <item ident="IMS_V01_I_BasicExample001"> <presentation label="RS01"> <flow> <material> <mattext>Paris is the Capital of France ?</mattext> </material> <response_lid ident="TF01"> <render_choice> <response_label ident="T"> <material><mattext>Agree</mattext></material> </response_label> <response_label ident="F"> <material><mattext>Disagree</mattext></material> </response_label> </render_choice> </response_lid> </flow> </presentation> </item> </section> </questestinterop>
The XML for the minimal multiple 'Section' (each with one 'Item') is as follows:
<questestinterop> <section ident="IMS_V01_S_Example001"> <item ident="IMS_V01_I_BasicExample001"> <presentation label="RS01"> <flow> <material> <mattext>Paris is the Capital of France ?</mattext> </material> <response_lid ident="TF01"> <render_choice> <response_label ident="T"> <material><mattext>True</mattext></material> </response_label> <response_label ident="F"> <material><mattext>False</mattext></material> </response_label> </render_choice> </response_lid> </flow> </presentation> </item> </section> <section ident="IMS_V01_S_Example002"> <item ident="IMS_V01_I_BasicExample200"> <presentation label="RS20"> <flow> <material> <mattext>London is the Capital of Germany ?</mattext> </material> <response_lid ident="TF02"> <render_choice> <response_label ident="T"> <material><mattext>True</mattext></material> </response_label> <response_label ident="F"> <material><mattext>False</mattext></material> </response_label> </render_choice> </response_lid> </flow> </presentation> </item> </section> </questestinterop>
6.2.2 Full Definition
The XML for a complete useful 'Section' example (this contains two Items) is as follows:
<questestinterop> <qticomment> This example consists of two Sections. </qticomment> <section title="European Capitals" ident="IMS_V01_S_Example201"> <objectives view="Candidate"> <flow_mat> <material> <mattext>To assess knowledge of the capital cities in Europe.</mattext> </material> </flow_mat> </objectives> <objectives view="Tutor"> <flow_mat> <material> <mattext> To ensure that the student knows the difference between the Capital cities of France, UK, Germany, Spain and Italy. </mattext> </material> </flow_mat> </objectives> <item title="Capital of France" ident="I01" maxattempts="6"> <qticomment> This Item is also available in the accompanying example files. </qticomment> <itemmetadata/> <rubric view="Candidate"> <flow_mat> <material> <mattext>Choose only one of the choices available.</mattext> </material> </flow_mat> </rubric> <presentation label="Resp001"> <flow> <material> <mattext>What is the Capital of France ?</mattext> </material> <response_lid ident="LID01"> <render_choice shuffle="Yes"> <flow_label> <response_label ident="LID01_A"> <material><mattext>London</mattext></material> </response_label> </flow_label> <flow_label> <response_label ident="LID01_B"> <material><mattext>Paris</mattext></material> </response_label> </flow_label> <flow_label> <response_label ident="LID01_C"> <material><mattext>Washington</mattext></material> </response_label> </flow_label> <flow_label> <response_label ident="LID01_D" rshuffle="No"> <material><mattext>Berlin</mattext></material> </response_label> </flow_label> </render_choice> </response_lid> </flow> </presentation> <resprocessing> <qticomment/> <outcomes> <decvar vartype="Integer" defaultval="0"/> </outcomes> <respcondition> <qticomment>Scoring for the correct answer.</qticomment> <conditionvar> <varequal respident="LID01">LID01_B</varequal> </conditionvar> <setvar action="Set" varname="SCORE">10</setvar> <displayfeedback feedbacktype="Response" linkrefid="I01_IFBK01"/> </respcondition> </resprocessing> <itemfeedback title="Correct answer" ident="I01_IFBK01"> <flow_mat> <material> <mattext>Correct answer.</mattext> </material> </flow_mat> </itemfeedback> <itemfeedback ident="I01_IFBK02"> <solution> <solutionmaterial> <flow_mat> <material> <mattext>London is the Capital of England.</mattext> <mattext>Paris is the Capital of France.</mattext> <mattext>Washington is in the USA.</mattext> <mattext>Berlin is the Capital of Germany.</mattext> </material> </flow_mat> </solutionmaterial> </solution> </itemfeedback> <itemfeedback ident="I01_IFBK03" view="All"> <hint feedbackstyle="Multilevel"> <hintmaterial> <flow_mat> <material> <mattext>One of the choices is not in Europe.</mattext> </material> </flow_mat> </hintmaterial> <hintmaterial> <flow_mat> <material> <mattext>Berlin is the Capital of Germany.</mattext> </material> </flow_mat> </hintmaterial> <hintmaterial> <flow_mat> <material> <mattext>The Eiffel tower is in the Capital of France.</mattext> </material> </flow_mat> </hintmaterial> </hint> </itemfeedback> </item> </section> <section title="European Rivers" ident="IMS_V01_S_Example202"> <objectives view="Candidate"> <flow_mat> <material> <mattext>To assess your knowledge of the rivers in Europe.</mattext> </material> </flow_mat> </objectives> <objectives view="Assessor"> <flow_mat> <material> <mattext>Questions on the rivers in Germany, Spain, Italy and France. 
</mattext> </material> </flow_mat> </objectives> <item title="Rivers in France question" ident="I02"> <rubric view="Candidate"> <flow_mat> <material> <mattext>Choose all of the correct answers.</mattext> </material> </flow_mat> </rubric> <presentation label="Resp002"> <flow> <material> <mattext>Which rivers are in France ?</mattext> </material> <response_lid ident="LID02" rcardinality="Multiple"> <render_choice shuffle="Yes" minnumber="1" maxnumber="2"> <flow_label> <response_label ident="LID02_A"> <material><mattext>Seine</mattext></material> </response_label> </flow_label> <flow_label> <response_label ident="LID02_B"> <material><mattext>Thames</mattext></material> </response_label> </flow_label> <flow_label> <response_label ident="LID02_C"> <material><mattext>Danube</mattext></material> </response_label> </flow_label> <flow_label> <response_label ident="LID02_D"> <material><mattext>Loire</mattext></material> </response_label> </flow_label> </render_choice> </response_lid> </flow> </presentation> </item> <item title="Rivers in Germany" ident="I03"> <duration>pTH02</duration> <rubric view="Candidate"> <flow_mat> <material><mattext>Choose all of the correct answers.</mattext></material> </flow_mat> </rubric> <presentation label="Resp003"> <flow> <material> <matimage imagtype="image/jpeg" uri="rivers.jpg"></matimage> <mattext>Which rivers are in Germany ?</mattext> </material> <response_lid ident="LID03" rcardinality="Multiple"> <render_hotspot x0="500" y0="500" height="200" width="200"> <response_label ident="LID03_A" rarea="Ellipse"> 10,10,2,2 </response_label> <response_label ident="LID03_B" rarea="Ellipse"> 15,15,2,2 </response_label> <response_label ident="LID03_C" rarea="Ellipse"> 30,30,2,2 </response_label> <response_label ident="LID03_D" rarea="Ellipse"> 60,60,2,2 </response_label> <response_label ident="LID03_E" rarea="Ellipse"> 70,70,2,2 </response_label> </render_hotspot> </response_lid> </flow> </presentation> </item> </section> </questestinterop>
This example is given in the example file: 'ims_qtiasiv1p2/section/mchc_smimr_101/mchc_smimr_101.xml'
6.3 Assessment Examples
6.3.1 Minimum Definition
The XML for the minimal single useful 'Assessment' is as follows:
<questestinterop> <assessment ident="IMS_V01_A_Example301"> <section ident="IMS_V01_S_Example301"> <item ident="IMS_V01_I_BasicExample301"> <presentation label="RS01"> <flow> <material> <mattext>Paris is the Capital of France ?</mattext> </material> <response_lid ident="TF01"> <render_choice> <response_label ident="T"> <material><mattext>Agree</mattext></material> </response_label> <response_label ident="F"> <material><mattext>Disagree</mattext></material> </response_label> </render_choice> </response_lid> </flow> </presentation> </item> </section> </assessment> </questestinterop>
6.3.2 Full Definition
The following example consists of one Assessment with two Sections. One Section contains two Items and the other contains one Item.
<questestinterop> <assessment title="European Geography" ident="A01"> <qticomment>A Complex Assessment example.</qticomment> <objectives view="Candidate"> <flow_mat> <material> <mattext>To test your knowledge of European geography.</mattext> </material> </flow_mat> </objectives> <objectives view="Assessor"> <flow_mat> <material> <mattext>Tests the candidate's knowledge of European geography.</mattext> </material> </flow_mat> </objectives> <rubric view="Candidate"> <flow_mat> <material> <mattext>Attempt all questions.</mattext> </material> </flow_mat> </rubric> <outcomes_processing scoremodel="SumofScores"> <qticomment>Processing of the final accumulated assessment.</qticomment> <outcomes> <decvar/> </outcomes> <outcomes_feedback_test title="Failed"> <test_variable> <variable_test testoperator="LTE">9</variable_test> </test_variable> <displayfeedback feedbacktype="Response" linkrefid="Failed"/> </outcomes_feedback_test> <outcomes_feedback_test title="Passed"> <test_variable> <variable_test testoperator="GT">10</variable_test> </test_variable> <displayfeedback feedbacktype="Response" linkrefid="Passed"/> </outcomes_feedback_test> </outcomes_processing> <assessfeedback title="Failed" ident="Failed"> <flow_mat> <material> <mattext>You failed the test.</mattext> </material> </flow_mat> </assessfeedback> <assessfeedback title="Passed" ident="Passed"> <flow_mat> <material> <mattext>You passed the test.</mattext> </material> </flow_mat> </assessfeedback> <section title="European Capitals" ident="S01"> <objectives view="Candidate"> <flow_mat> <material> <mattext>To assess your knowledge of the capital cities in Europe. </mattext> </material> </flow_mat> </objectives> <objectives view="Tutor"> <flow_mat> <material> <mattext> To ensure that the student knows the difference between the Capital cities of France, UK, Germany, Spain and Italy.
</mattext> </material> </flow_mat> </objectives> <rubric view="Candidate"> <flow_mat> <material> <mattext>Attempt all questions.</mattext> </material> </flow_mat> </rubric> <outcomes_processing scoremodel="SumofScores"> <qticomment>Processing of the final accumulated Section.</qticomment> <outcomes> <decvar/> </outcomes> </outcomes_processing> <item title="Capital of France" ident="I01" maxattempts="6"> <rubric view="Candidate"> <flow_mat> <material> <mattext>Choose only one of the choices available.</mattext> </material> </flow_mat> </rubric> <presentation label="Resp001"> <flow> <response_lid ident="LID01"> <material> <mattext>What is the Capital of France ?</mattext> </material> <render_choice shuffle="Yes"> <response_label ident="LID01_A"> <flow_mat> <material><mattext>London</mattext></material> </flow_mat> </response_label> <response_label ident="LID01_B"> <flow_mat> <material><mattext>Paris</mattext></material> </flow_mat> </response_label> <response_label ident="LID01_C"> <flow_mat> <material><mattext>Washington</mattext></material> </flow_mat> </response_label> <response_label ident="LID01_D" rshuffle="No"> <flow_mat> <material><mattext>Berlin</mattext></material> </flow_mat> </response_label> </render_choice> </response_lid> </flow> </presentation> <resprocessing> <qticomment/> <outcomes> <decvar vartype="Integer" defaultval="0"/> </outcomes> <respcondition> <qticomment>Scoring for the correct answer.</qticomment> <conditionvar> <varequal respident="LID01">LID01_B</varequal> </conditionvar> <setvar action="Set" varname="SCORE">10</setvar> <displayfeedback feedbacktype="Response" linkrefid="I01_IFBK01"/> </respcondition> </resprocessing> <itemfeedback title="Correct answer" ident="I01_IFBK01"> <flow_mat> <material><mattext>Correct answer.</mattext></material> </flow_mat> </itemfeedback> <itemfeedback ident="I01_IFBK02"> <solution> <solutionmaterial> <flow_mat> <material> <mattext>London is the Capital of England.</mattext> </material> </flow_mat> <flow_mat> <material> <mattext>Paris is the Capital of France.</mattext> </material> </flow_mat> <flow_mat> <material> <mattext>Washington is in the USA.</mattext> </material> </flow_mat> <flow_mat> <material> <mattext>Berlin is the Capital of Germany.</mattext> </material> </flow_mat> </solutionmaterial> </solution> </itemfeedback> <itemfeedback ident="I01_IFBK03" view="All"> <hint feedbackstyle="Multilevel"> <hintmaterial> <flow_mat> <material> <mattext>One of the choices is not in Europe.</mattext> </material> </flow_mat> </hintmaterial> <hintmaterial> <flow_mat> <material> <mattext>Berlin is the Capital of Germany.</mattext> </material> </flow_mat> </hintmaterial> <hintmaterial> <flow_mat> <material> <mattext>The Eiffel tower is in the Capital of France. </mattext> </material> </flow_mat> </hintmaterial> </hint> </itemfeedback> </item> </section> <section title="European Rivers" ident="S02"> <objectives view="Candidate"> <flow_mat> <material> <mattext>To assess your knowledge of the rivers in Europe.</mattext> </material> </flow_mat> </objectives> <objectives view="Assessor"> <flow_mat> <material> <mattext>Questions on rivers in Germany, Spain, Italy and France.
</mattext> </material> </flow_mat> </objectives> <outcomes_processing scoremodel="SumofScores"> <qticomment>Processing of the final accumulated Section.</qticomment> <outcomes> <decvar/> </outcomes> </outcomes_processing> <item title="Rivers in France question" ident="I02"> <rubric view="Candidate"> <flow_mat> <material> <mattext>Choose all of the correct answers.</mattext> </material> </flow_mat> </rubric> <presentation label="Resp002"> <flow> <material> <mattext>Which rivers are in France ?</mattext> </material> <response_lid ident="LID02" rcardinality="Multiple"> <render_choice shuffle="Yes" minnumber="1" maxnumber="2"> <response_label ident="LID02_A"> <flow_mat> <material><mattext>Seine</mattext></material> </flow_mat> </response_label> <response_label ident="LID02_B"> <flow_mat> <material><mattext>Thames</mattext></material> </flow_mat> </response_label> <response_label ident="LID02_C"> <flow_mat> <material><mattext>Danube</mattext></material> </flow_mat> </response_label> <response_label ident="LID02_D"> <flow_mat> <material><mattext>Loire</mattext></material> </flow_mat> </response_label> </render_choice> </response_lid> </flow> </presentation> </item> <item title="Rivers in Germany" ident="I03"> <rubric view="Candidate"> <flow_mat> <material> <mattext>Choose all of the correct answers.</mattext> </material> </flow_mat> </rubric> <presentation label="Resp003"> <flow> <material> <matimage imagtype="image/gif" uri="rivers.gif"></matimage> <mattext>Which rivers are in Germany ?</mattext> </material> <response_lid ident="LID03" rcardinality="Multiple"> <render_hotspot x0="500" y0="500" height="200" width="200"> <response_label ident="LID03_A" rarea="Ellipse">10,10,2,2 </response_label> <response_label ident="LID03_B" rarea="Ellipse">15,15,2,2 </response_label> <response_label ident="LID03_C" rarea="Ellipse">30,30,2,2 </response_label> <response_label ident="LID03_D" rarea="Ellipse">60,60,2,2 </response_label> <response_label ident="LID03_E" rarea="Ellipse">70,70,2,2 </response_label> </render_hotspot> </response_lid> </flow> </presentation> </item> </section> </assessment> </questestinterop>
This example is given in the example file: 'ims_qtiasiv1p2/assessment/mchc_asmimr_101/mchc_asmimr_101.xml'.
6.4 The XML Example Files
The full set of example files, as referred to in Sections 4 and 5, is available as part of the 1EdTech QTI Resource Kit. These files are listed in Tables 6.1 (V1.2 compliant basic Item examples), 6.2 (V1.2 compliant advanced Item examples), 6.3 (V1.2 compliant composite Item examples), 6.4 (V1.2 compliant Section examples), 6.5 (V1.2 compliant Assessment examples), 6.6 (V1.2 compliant QTILite examples) and 6.7 (V1.2 compliant object bank examples). Each XML example directory contains the files necessary to support an example Assessment, Section and/or Item. The XML files are denoted by an '.xml' extension (the adopted naming convention is described in Appendix C). The following tables list the name of each example directory, the nature of the example in terms of data structures i.e. Assessment (A), Section (S) and/or Item (I), and a brief description of the example.
The directory for these files is: 'ims_qtiasiv1p2/basic/...'
The directory for these files is: 'ims_qtiasiv1p2/advanced/...'
The directory for these files is: 'ims_qtiasiv1p2/composite/...'
The directory for these files is: 'ims_qtiasiv1p2/section/...'
The directory for these files is: 'ims_qtiasiv1p2/assessment/...'
The directory for these files is: 'ims_qtiasiv1p2/qtilite/...'
The directory for these files is: 'ims_qtiasiv1p2/bank/...'
7. Implementation Guidance
7.1 Assessments
7.1.1 Elements and their Attributes
Assessmentcontrol
The assessmentcontrol element should be used to define the default conditions for the display of different types of feedback to users. The Assessment level definitions of the feedbackswitch, hintswitch and solutionswitch take precedence if no lower level definition is encountered i.e. within a Section or Item. This means that the Assessment level definition acts as the default state.
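As a minimal sketch, an Assessment-wide default that enables feedback but suppresses hints and solutions could be declared as follows; the attributes named are those referred to above, and the idents and values shown are illustrative only.
<assessment title="European Geography" ident="A01">
  <assessmentcontrol feedbackswitch="Yes" hintswitch="No" solutionswitch="No"/>
  <!-- Sections and Items follow; they inherit these settings unless they declare their own control element -->
  ...
</assessment>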
7.1.2 Groups of Elements
7.2 Sections
7.2.1 Elements and their Attributes
Sectioncontrol
The sectioncontrol element should be used to define the default conditions for the display of different types of feedback to users. The Section level definitions of the feedbackswitch, hintswitch and solutionswitch take precedence if no lower level definition is encountered i.e. within an Item. In the case of a clash with Assessment level definitions the Section level ones take precedence.
7.2.2 Groups of Elements
7.3 Items
7.3.1 Elements and their Attributes
Itemmetadata
Itemcontrol
The itemcontrol element should be used to define the default conditions for the display of different types of feedback to users. The Item level definitions of the feedbackswitch, hintswitch and solutionswitch take precedence over all other levels of definition.
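For example, a lower level control element overrides the higher level defaults for its own scope only. A minimal sketch (idents and values illustrative only):
<section title="European Capitals" ident="S01">
  <sectioncontrol feedbackswitch="Yes" hintswitch="No" solutionswitch="No"/>
  <item title="Capital of France" ident="I01">
    <!-- This Item re-enables hints even though the Section default disables them -->
    <itemcontrol feedbackswitch="Yes" hintswitch="Yes" solutionswitch="No"/>
    ...
  </item>
</section>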
7.3.2 Groups of Elements
Variable Manipulation
The manipulation of the scoring variables declared in the outcomes/decvar combination is contained within the conditionvar element. The variable comparisons are made individually using the elements defined as varequal, etc. and the state of these comparisons can be inverted using the logical 'NOT' element. The analysis of the period of the response activity is supported using the durequal, etc. elements (an associated default variable is not assumed). This mechanism will be developed further in V2.0 of the specification. The combination of the individual varequal, etc. elements is possible using two techniques:
- Implicit - the sequence of varequal, etc. elements within a conditionvar element is by definition an 'AND' condition. The usage of multiple conditionvar elements is also treated as an 'AND' condition. The 'OR' condition is achieved through the use of multiple respcondition elements. The sequence of these is equivalent to a logical inclusive 'OR' condition;
- Explicit - the usage of the logical and and or elements that combine the outcomes of each separate comparison and combine them in one consolidated state declaration for the conditionvar element.
It is recommended that the Implicit approach be used whenever possible. This approach results in more interoperable code. Examples of both approaches are given in the files: 'ims_qtiasiv1p2/advanced/fibs_ir_101a.xml' (usage of the and/or elements) and 'ims_qtiasiv1p2/advanced/fibs_ir_101b.xml' (implicit approach).
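As an illustration of the difference, the two fragments below test the same pair of responses; the first uses the implicit 'AND' of sequenced varequal elements, the second the explicit and element (the response identifiers and answers are reused from the example in Section 4.3.2):
<!-- Implicit: the sequence of varequal elements within one conditionvar is an 'AND' -->
<respcondition>
  <conditionvar>
    <varequal respident="FIB01" case="Yes">Winter</varequal>
    <varequal respident="FIB02" case="Yes">Summer</varequal>
  </conditionvar>
  <setvar action="Add" varname="FIBSCORE1">2</setvar>
</respcondition>
<!-- Explicit: the same test written using the 'and' element -->
<respcondition>
  <conditionvar>
    <and>
      <varequal respident="FIB01" case="Yes">Winter</varequal>
      <varequal respident="FIB02" case="Yes">Summer</varequal>
    </and>
  </conditionvar>
  <setvar action="Add" varname="FIBSCORE1">2</setvar>
</respcondition>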
Two further situations also need to be catered for when constructing response conditions:
- When the response has not been answered - this can be supported using two techniques, namely the unanswered element or the response_na element. The unanswered element is placed within conditionvar and is activated whenever that response has not been attempted. The response_na element is a proprietary extension facility that is hosted within each of the rendering elements i.e. render_choice, render_hotspot, etc.
- When none of the conditions within the conditionvar element are satisfied and some catch-all state needs to be declared. This default state is captured using the other element. When included in the conditionvar element, 'true' is returned if none of the other conditions has been triggered (this is not the same as all of the tests returning the false state, nor as no response having been offered). A sketch illustrating both the unanswered and other elements follows this list.
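The sketch below shows both techniques within a single resprocessing block; the use of a respident attribute on the unanswered element and the feedback identifiers are illustrative assumptions rather than details taken from a shipped example.
<resprocessing>
  <outcomes><decvar/></outcomes>
  <respcondition>
    <conditionvar>
      <varequal respident="TF01">T</varequal>
    </conditionvar>
    <setvar action="Set">1</setvar>
    <displayfeedback feedbacktype="Response" linkrefid="Correct"/>
  </respcondition>
  <respcondition>
    <qticomment>Triggered when the response has not been attempted.</qticomment>
    <conditionvar>
      <unanswered respident="TF01"/>
    </conditionvar>
    <displayfeedback feedbacktype="Response" linkrefid="NotAttempted"/>
  </respcondition>
  <respcondition>
    <qticomment>Catch-all for any response not matched by the conditions above.</qticomment>
    <conditionvar>
      <other/>
    </conditionvar>
    <displayfeedback feedbacktype="Response" linkrefid="TryAgain"/>
  </respcondition>
</resprocessing>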
7.5 Aggregated Scoring and Response Processing
The QTI specifications support scoring at three levels: the Item, the Section and the Assessment.
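Item level scoring is handled by the resprocessing element, as illustrated throughout Sections 4 and 5; Section and Assessment level scores are aggregated using the outcomes_processing element, as in the full Assessment example of Section 6.3.2. A minimal sketch of such an aggregation is shown below (the pass mark and the feedback identifier are illustrative only):
<outcomes_processing scoremodel="SumofScores">
  <outcomes>
    <decvar/>
  </outcomes>
  <outcomes_feedback_test title="Passed">
    <test_variable>
      <variable_test testoperator="GT">9</variable_test>
    </test_variable>
    <displayfeedback feedbacktype="Response" linkrefid="Passed"/>
  </outcomes_feedback_test>
</outcomes_processing>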
The response processing for Items is rich in features and as such it can support a very wide range of response-types. In the 1EdTech QTI Information Model [QTI, 02a] the cardinality of the responses is described in Figure 3.1 and consists of 'Single', 'Multiple' and 'Ordered' responses. The relationship between these, the five core response-types (response_lid, etc.) and the types of question are shown in Table 3.1 of the same document [QTI, 02a]. This gives rise to the data set as described in Section 4.2 of the 1EdTech QTI Information Model, and this set must be held internally by the test engine. It is this sequence of responses that is then applied to the response processing to determine the correctness or otherwise of the response, the corresponding scoring and the subsequent feedback (if any).
A key question is how the test engine is to decide the class of responses and their subsequent response processing, i.e. how does an implementation ascertain the sequence in which the response tests are to be applied to an Item that expects multiple responses. Consider the simplest case of a single response multiple choice question, "Which is the first working day of the week ?". The V1.2 compliant QTI XML code is shown below:
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 |
<questestinterop> <item title="Single response" ident="A"> <presentation label="BasicExample002a"> <flow> <material> <mattext>Which is the first working day of the week ?</mattext> </material> <response_lid ident="Mcb_01" rcardinality="Single" rtiming="No"> <render_choice> <response_label ident="A"> <material><mattext>Saturday</mattext></material> </response_label> <response_label ident="B"> <material><mattext>Monday</mattext></material> </response_label> <response_label ident="C"> <material><mattext>Wednesday</mattext></material> </response_label> <response_label ident="D"> <material><mattext>Tuesday</mattext></material> </response_label> <response_label ident="E"> <material><mattext>Sunday</mattext></material> </response_label> <response_label ident="F"> |
26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 |
<material><mattext>Friday</mattext></material> </response_label> <response_label ident="G"> <material><mattext>Thursday</mattext></material> </response_label> </render_choice> </response_lid> </flow> </presentation> <resprocessing> <outcomes><decvar/></outcomes> <respcondition title="Correct"> <conditionvar> <varequal respident="Mcb_01">B</varequal> </conditionvar> <setvar action="Set">1</setvar> <displayfeedback feedbacktype="Response" linkrefid="Correct"/> </respcondition> </resprocessing> <itemfeedback ident="Correct" view="Candidate"> <flow_mat> <material> <mattext>Yes, you are right.</mattext> </material> </flow_mat> </itemfeedback> </item> </questestinterop> |
The response processing test on line 39 checks whether the response was 'Monday', the correct answer. The system knows that a single response is required because the rcardinality attribute on line 8 has the value 'Single'.
The next stage is to consider a multiple response Item in which five responses are required. The question is "Which days are NOT the week-end ?". The resulting QTI XML code is shown below:
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 |
<questestinterop> <item title="Multiple responses" ident="B"> <presentation label="BasicExample002a"> <flow> <material> <mattext>Which days are NOT the week-end ?</mattext> </material> <response_lid ident="Mcb_01" rcardinality="Multiple" rtiming="No"> <render_choice> <response_label ident="A"> <material><mattext>Saturday</mattext></material> </response_label> <response_label ident="B"> <material><mattext>Monday</mattext></material> </response_label> <response_label ident="C"> <material><mattext>Wednesday</mattext></material> </response_label> <response_label ident="D"> <material><mattext>Tuesday</mattext></material> </response_label> <response_label ident="E"> |
23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 |
<material><mattext>Sunday</mattext></material> </response_label> <response_label ident="F"> <material><mattext>Friday</mattext></material> </response_label> <response_label ident="G"> <material><mattext>Thursday</mattext></material> </response_label> </render_choice> </response_lid> </flow> </presentation> <resprocessing> <outcomes><decvar/></outcomes> <respcondition title="Correct"> <conditionvar> <and> <varequal respident="Mcb_01">B</varequal> <varequal respident="Mcb_01">C</varequal> <varequal respident="Mcb_01">D</varequal> <varequal respident="Mcb_01">F</varequal> <varequal respident="Mcb_01">G</varequal> </and> </conditionvar> <setvar action="Set">1</setvar> <displayfeedback feedbacktype="Response" linkrefid="Correct"/> </respcondition> </resprocessing> <itemfeedback ident="Correct" view="Candidate"> <flow_mat> <material> <mattext>Yes, you are right.</mattext> </material> </flow_mat> </itemfeedback> </item> </questestinterop> |
The response processing tests on lines 40-44 (inclusive) check whether the responses are 'Monday', 'Tuesday', 'Wednesday', 'Thursday' and 'Friday', the correct answers. The system knows that several responses are required because the rcardinality attribute on line 8 has the value 'Multiple'. The key point to note is that the set of responses must be checked against the list of correct responses without relying on the order of the responses or the sequence in which the tests are applied - remember this is a 'Multiple' response only and NOT an 'Ordered' one. In the conditionvar element (lines 38-46) all of the tests MUST be true for the setvar condition to be triggered (line 42).
The final case is an 'Ordered' response, in which the candidate must place the days of the week in the correct sequence starting with Sunday. The corresponding V1.2 compliant QTI XML code is shown below:
<questestinterop> <item title="Multiple ordered response" ident="C"> <presentation label="BasicExample002a"> <flow> <material> <mattext>What is the correct order of the days of the week starting with Sunday ? </mattext> |
</material> <response_lid ident="Mcb_01" rcardinality="Ordered" rtiming="No"> <render_choice> <response_label ident="A"> <material><mattext>Saturday</mattext></material> </response_label> <response_label ident="B"> <material><mattext>Monday</mattext></material> </response_label> <response_label ident="C"> <material><mattext>Wednesday</mattext></material> </response_label> <response_label ident="D"> <material><mattext>Tuesday</mattext></material> </response_label> <response_label ident="E"> <material><mattext>Sunday</mattext></material> </response_label> <response_label ident="F"> <material><mattext>Friday</mattext></material> </response_label> <response_label ident="G"> <material><mattext>Thursday</mattext></material> </response_label> </render_choice> </response_lid> </flow> </presentation> <resprocessing> <outcomes><decvar/></outcomes> <respcondition title="Correct"> <conditionvar> <varequal respident="Mcb_01" index="1">E</varequal> <varequal respident="Mcb_01" index="2">B</varequal> <varequal respident="Mcb_01" index="3">D</varequal> <varequal respident="Mcb_01" index="4">C</varequal> <varequal respident="Mcb_01" index="5">G</varequal> <varequal respident="Mcb_01" index="6">F</varequal> <varequal respident="Mcb_01" index="7">A</varequal> </conditionvar> <setvar action="Set">1</setvar> <displayfeedback feedbacktype="Response" linkrefid="Correct"/> </respcondition> </resprocessing> <itemfeedback ident="Correct" view="Candidate"> <flow_mat> <material> <mattext>Yes, you are right.</mattext> </material> </flow_mat> </itemfeedback> </item> </questestinterop> |
7.6 1EdTech Harmonization
7.6.1 1EdTech Meta-data
7.6.2 1EdTech Content Packaging
The 1EdTech Content Packaging specification is to be used for the packaging of a QTI instance, both in terms of a single instance and the aggregation of several instances. Consider the following use-case required by QTI in which the QTI-XML instances for three assessments are to be packaged:
- The sets of assessments have to be created i.e. in the files 'assess1.xml', 'assess2.xml' and 'assess3.xml'. In each case the assessment has three associated files - a meta-data file and other material files (see Figure 7.1).
The issue becomes one of ensuring that, when these three assessment sets are packaged together, there is no name clash between:
- The 'image1.gif' files from assessments '1' and '2';
- The 'image2.jpg' files from assessments '2' and '3'.
The 1EdTech Content Packaging specification requires that all of the packaged files are uniquely named i.e. an explicit file directory structure has to be used to ensure that file clashes do not occur when creating the packaging manifest. The actual example is the XML used to refer to the 'image1.gif' file referenced in the XML instances for assessments '1' and '2'.

The original partial XML could be of the form:
Assessment '1' - original XML instance:
<material> <matimage mimetype="image/gif" encoding="uri">image1.gif</matimage> </material>
Assessment '2' - original XML instance:
<material> <matimage mimetype="image/gif" encoding="uri">image1.gif</matimage> </material>
There are two possible solutions to this problem. The first requires that a naming convention including directory names is implemented. This would mean that the two examples above now become:
Assessment '1' - revised XML instance:
<material> <matimage mimetype="image/gif" encoding="uri">assessment1/image1.gif</matimage> </material>
Assessment '2' - revised XML instance:
<material> <matimage mimetype="image/gif" encoding="uri">assessment2/image1.gif</matimage> </material>
The two files now have different names and so no clash occurs. The name can include any level of directories; the issue then becomes one of ensuring that the directory structure itself produces uniquely named paths.
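A minimal sketch of the corresponding 1EdTech Content Packaging manifest fragment is shown below. The resource identifiers, the resource type string, and the omission of the manifest namespace and meta-data are all illustrative simplifications; the Content Packaging specification itself should be consulted for the normative form.
<manifest identifier="MANIFEST01">
  <organizations/>
  <resources>
    <resource identifier="ASSESS1" type="imsqti_xmlv1p2" href="assessment1/assess1.xml">
      <file href="assessment1/assess1.xml"/>
      <file href="assessment1/image1.gif"/>
    </resource>
    <resource identifier="ASSESS2" type="imsqti_xmlv1p2" href="assessment2/assess2.xml">
      <file href="assessment2/assess2.xml"/>
      <file href="assessment2/image1.gif"/>
    </resource>
  </resources>
</manifest>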
7.6.3 1EdTech Learner Information Package
7.7 Naming Conventions
7.7.1 Identities and Labels
7.8 Scoping Rules
7.8.1 Identities and Labels
Scoping within XML is very limited. It is possible to create globally unique identifiers within a file by using the 'ID' attribute, and reference to these elements is possible through the usage of 'IDREF' and 'IDREFs'. During the development of the Q&TI specification this global uniqueness was considered too constraining and so the scoping rules listed in Table 7.1 should be followed whenever possible.
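For example, assuming (as the examples in this document do) that response identifiers are scoped to their enclosing Item, the same response_lid and response_label idents can be reused in different Items provided that the Item idents themselves remain unique. A sketch, with the Item content abbreviated:
<item ident="IMS_V01_I_ExampleScopeA">
  <presentation>
    <flow>
      <response_lid ident="MC01">
        <render_choice>
          <response_label ident="A"/>
        </render_choice>
      </response_lid>
    </flow>
  </presentation>
</item>
<item ident="IMS_V01_I_ExampleScopeB">
  <presentation>
    <flow>
      <!-- 'MC01' and 'A' are reused here without a clash because their scope is the enclosing Item -->
      <response_lid ident="MC01">
        <render_choice>
          <response_label ident="A"/>
        </render_choice>
      </response_lid>
    </flow>
  </presentation>
</item>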
8. Proprietary Extensions
The proprietary extensions facilities listed in Table 8.1 are supported as elements within the specifications:
Note: These elements are only used if the suppliers of the ASIs require proprietary features that are not supported by the available range of elements. It is recommended that these elements are used sparingly. Backwards compatibility with proprietary extensions will NOT be guaranteed in future versions of these specifications.
The extension features for 'Outcomes Processing' and 'Selection & Ordering' are described in the corresponding specifications.
9. V1.x/V2.0 Issues & Compatibility
The elements listed in Table 9.1 are used to indicate where new functionality will be added in V2.0 of the specifications:
Note: The structure of these elements will change in V2.0 of these specifications. Their main role is to indicate the type of functions to be included in later releases of these specifications and as such vendors are encouraged NOT to make use of these in V1.0, V1.1 and V1.2 implementations.
9.1 Function Requirements
9.2 Constraints
9.3 Compatibility Map

10. Conformance
The purpose of this statement is to provide a mechanism for customers to fairly compare vendors of assessment systems, tools and content. It is not required for a vendor to support every feature of the 1EdTech QTI specification; however, a vendor must detail its level of support with a "Conformance Statement". For example, vendors may choose to accept or publish QTI data but not to repackage it. Conformance is determined through two documents:
- Conformance summary - this is a summary that shows, in colloquial terms, the capabilities of a particular implementation with respect to the 1EdTech QTI specification;
- Interoperability statement - this is a detailed technical checklist that identifies all of the feature capabilities of the implementation in terms of the QTI specification functions.
10.1 Valid Data Issues
Vendors claiming conformance shall publish, accept, and/or repackage valid 1EdTech QTI data as defined by the DTD, including proprietary extensions where applicable. Vendors claiming their tools publish QTI shall export valid 1EdTech QTI data. Vendors claiming their system tools accept 1EdTech QTI data shall be able to parse and recognize valid 1EdTech QTI data. Vendors claiming their system tools repackage 1EdTech QTI data shall be able to "pass through" valid 1EdTech QTI data whether the tool recognizes the optional elements or not. Vendors claiming their assessment content conforms to this specification shall provide valid 1EdTech QTI data. Publishers claiming their content conforms to 1EdTech QTI shall provide valid 1EdTech QTI data.
10.2 Conformance Summary
Vendors claiming conformance must provide a "Conformance Summary", detailing their level of conformance, substantially similar to the information shown below, upon a reasonable request from a member of 1EdTech or a prospective customer. It is expected that this table is a summary of the information given in the 'Interoperability Statement'. The intention is for the 'Conformance Summary' to be informative in nature. Completion of the three columns is intended to reflect:
- Publish - this implies that the XML instance contains the identified elements. If such an element is not ticked then it will not occur within the exported QTI-XML instance(s);
- Accept - it is assumed that the ability to accept the contents of an element is accompanied by the ability to use, and if appropriate, display that content. If this is not the case but the content of the material can be exported then the 'Repackage' column can still be ticked;
- Repackage - this is the ability to import QTI-XML instances from one or more sources and to create a new instance that combines the imported information. It is not necessary for the repackaging system to be able to operate on the information supplied.
10.3 Interoperability Statement
The 'Interoperability Statement' addresses support for the various elements within the binding. The set of attributes is not considered. Inclusion of conformance with respect to attributes will be considered in later versions of the specification.
10.4 Completing a Conformance Summary
There is a close relationship between the 'Conformance Summary' and the 'Interoperability Statement'. The guidelines for completing these tables are:
- Any entry of 'Y' in the 'Object-bank level support' part of the 'Conformance Summary' should be reflected by the accompanying 'Object-bank Interoperability Statement'. The 'Conformance Summary' gives details on the 'Meta-data', 'Section' and 'Item' parts only of the 'Interoperability Statement';
- Any entry of 'Y' in the 'Assessment level support' part of the 'Conformance Summary' should be reflected by the accompanying 'Assessment Interoperability Statement'. The 'Conformance Summary' gives details on the 'Meta-data', 'Objectives', 'Rubric', 'Feedback' and 'Score Processing' parts only of the 'Interoperability Statement';
- Any entry of 'Y' in the 'Section level support' part of the 'Conformance Summary' should be reflected by the accompanying 'Section Interoperability Statement'. The 'Conformance Summary' gives details on the 'Meta-data', 'Objectives', 'Rubric', 'Feedback' and 'Score Processing' parts only of the 'Interoperability Statement';
- An Assessment must contain a Section and so support for Assessments must result in completion of both the Assessment and Section parts of the tables;
- Any entry of 'Y' in the 'Item level support' part of the 'Conformance Summary' should be reflected by the accompanying 'Item Interoperability Statement';
- It is the combination of answers under the 'Presentation' element within the 'Item' tables that dictates the types of questions that are supported, as indicated in the 'Conformance Summary' (an illustrative sketch of one such combination is given after this list). The key combinations required for each type of question are:
- Multiple-choice (including true/false) and multiple response questions require the 'response_lid' and 'render_choice' elements to be supported. Other possible renderings are 'render_slider' or 'render_hotspot'
- Fill-in-blank and short answer questions require the 'response_str' and 'render_fib' elements to be supported
- Numeric questions require the 'response_num' element to be supported. The rendering elements of either 'render_fib' or 'render_slider' must be supported
- Image hot spot questions require the 'response_xy' and 'render_hotspot' elements to be supported
- Support for response processing within an Item requires all of the 'resprocessing' sub-elements to be supported;
- If Items can support content based upon applets and/or embedded applications then the 'Other' part of the 'Material Content' of the 'Conformance Summary' must be answered as 'Y';
- The features that are subject to proprietary interpretation at the current time are: 'itemprecondition', 'itempostcondition', 'sectionprecondition' and 'sectionpostcondition', 'sectionselection', 'sectionsequence', 'itemselection', and 'itemsequence';
- Backwards compatibility from V1.2 to V1.1 to V1.01 and V1.0 is limited by:
- The 'qtimetadata' element within the 'assessmentmetadata', 'sectionmetadata' and 'itemmetadata' elements was introduced in V1.1
- The 'rubric' element was introduced to Assessments, Sections and Items in V1.01
- The 'flow', 'flow_mat' and 'flow_label' elements were introduced in V1.1. The lack of support for 'Flow' within the 'Conformance Summary' means that V1.1 Items are not supported
- Emphasised text was introduced as part of the V1.1 revisions
- All of the extension features within the 'Interoperability Statement' tables are proprietary. If these are used then an explanatory footnote describing the nature and significance of the extension should be supplied;
- It is important that the 'Interoperability Statement' is clear in showing what is and, perhaps more importantly, what is not supported. A descriptive conformance approach has been adopted to encourage vendors to be as clear as possible when describing the capabilities of their QTI-compliant systems.
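The sketch referred to in the list above illustrates the fill-in-blank combination, i.e., 'response_str' together with 'render_fib' (the identifiers, attribute values and question content are invented for illustration and are not taken from the distributed examples):
<item ident="FIB_SKETCH_001" title="Fill-in-blank sketch">
  <presentation>
    <material><mattext>The capital of France is </mattext></material>
    <response_str ident="FIB01" rcardinality="Single">
      <render_fib fibtype="String" prompt="Box" maxchars="20">
        <response_label ident="A01"/>
      </render_fib>
    </response_str>
  </presentation>
  <resprocessing>
    <outcomes><decvar/></outcomes>
    <respcondition>
      <!-- Case-insensitive string match against the expected answer -->
      <conditionvar><varequal respident="FIB01" case="No">Paris</varequal></conditionvar>
      <setvar action="Set">1</setvar>
    </respcondition>
  </resprocessing>
</item>
A system that ticks 'response_str' and 'render_fib' in its 'Item Interoperability Statement' would be expected to publish and/or accept Items of this general shape.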
10.5 An Example Conformance Statement
Table 10.1 shows an example Conformance Summary.
The system that is described by this summary has the following capabilities:
- It imports QTI-XML instances only, i.e., it does not export or repackage;
- It imports Items only, i.e., Assessments and Sections cannot be imported;
- An Item can consist of multiple-choice (including true/false and multiple response), FIB (short answer, numeric, string), drag-and-drop and image hot-spot questions;
- Support for objectives, rubric, flows, response processing, feedback, hints and solutions is available for Items;
- The content that can be handled includes text, emphasised text and images.
Appendix A - QTI XSDs, DTDs & XDRs
A1 - Overview
The Version 1.2 1EdTech Question & Test Interoperability XML Schema Definition (XSD) and Document Type Definitions (DTDs) are contained in a directory that has:
- xmla - the directory that contains all of the DTDs and XSDs in native XML Authority format. XML Authority (V2.2.1) is a product supplied by Extensibility, Inc.;
- mac - the directory that contains all of the XSDs and DTDs in text file format. These text files are designed for usage with Macintosh systems;
- ibm - the directory that contains all of the XSDs and DTDs in text file format. These text files are designed for usage with PC systems;
- unix - the directory that contains all of the XSDs and DTDs in text file format. These text files are designed for usage with Unix systems.
The further directory structure under each of these directories is identical. This further structure is:
Within each of the XSD directories is the file: ims_qtiasiv1p2.xsd, and within each of the DTD directories is the file: ims_qtiasiv1p2.dtd.
This approach means that the different types of XSD/DTD can be applied without requiring any editing of the associated source XML files. The full directory structure is given in Section A4 below.
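As a hedged sketch of what this means in practice (the relative system identifier below is an assumption that depends on where the instance file sits within the directory structure), an instance selects whichever DTD variant is required through its document type declaration alone:
<?xml version="1.0" encoding="UTF-8"?>
<!-- Switching to another variant (e.g., the core or QTILite DTD) only requires changing
     the system identifier; the instance content itself is left untouched -->
<!DOCTYPE questestinterop SYSTEM "ims_qtiasiv1p2.dtd">
<questestinterop>
  <!-- Items, Sections and/or Assessments go here -->
</questestinterop>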
A2 - Features of the Different XSDs/DTDs/XDRs
The key features of the different XSD/DTD implementations are:
- qtifullxsd/dtd - this is the full XSD/DTD with all of the corresponding comments. The comments correspond to those given in the 1EdTech QTI ASI XML Binding Specification v1.2;
- qtifullncxsd/dtd - this is the full XSD/DTD but with all of the comments removed. This makes the file smaller and it is easier to see the internal structure;
- qticorencxsd/dtd - the core features of the XSD/DTD are as per the full versions but with ALL extension and Version 2.0 specific elements removed. This will ensure that the 'xml' files using this XSD/DTD will be compatible with future releases of the specification;
- qtisectionncxsd/dtd - only those core elements that are used by the Section data structure are available. This means that ALL Assessment specific elements have been removed. This XSD/DTD is a further refinement of the core features;
- qtiitemncxsd/dtd - only those core elements that are used by the Item data structure are available. This means that ALL Assessment and Section specific elements have been removed. This XSD/DTD is a further refinement of the core features;
- qtilitencxsd/dtd - only those elements that support the QTILite specification are included. This can be considered as a constrained version of the qtiitemnc***.
A3 - Recommended Usage of the XSDs/DTDs
The recommended uses of the different XSDs/DTDs are:
- Select the set of XSDs/DTDs that suit your system. All of the mac/unix/ibm text versions are derived from the XML Authority version and created using the BBEdit text processing application;
- In most cases only the non-commented versions need to be used. The fully commented files are intended to be informative;
- The core set should be used if you are concerned with compatibility with later versions of the specification. However, this will prohibit the usage of proprietary extension features. The Version 2.0 specific elements are there to show where new developments to the specification are intended and you are recommended to AVOID using any xml files that require their inclusion;
- The Section and Item specific XSDs/DTDs should be used if you intend to import/export only Sections and/or Items. These simpler file structures will also make it easier to understand the structure of the full and core XSDs/DTDs;
- The QTILite version should be used as the initial approach when attempting to first use the specification. The minimal set of features is ideal for demonstrating the basic structure and approach of the QTI specification. Once mastered this provides an excellent basis for the usage of the more advanced features in the full specification.
A4 - Full Directory Structure
The full directory structure is:
Appendix B - Glossary of Terms
B1 - Q&TI Elements and Attributes
Appendix C - Examples Information
C1 - Proposed Naming Convention
A request has been made to introduce a naming convention for the 1EdTech QTI example files that is more informative. The request is for the name to reflect the nature of the content of the file.
A new naming convention is proposed. This convention is based upon two facets:
- Directory - this will reflect the class of the example;
- File - this will reflect the nature of the contents of the '.xml' file.
C1.1 Directory
The directory naming convention is proposed to be (the bold words are the corresponding directory names):
- composite - Composite Items examples;
- advanced - Advanced Items examples;
- assessment - Assessment examples;
- outcome - Outcomes Processing examples.
C1.2 File Naming
The file naming can now focus on the nature of the content in the files. The following convention is proposed:
[ABCD]_[A...Z]_[XYZ][*]_[<free form>]
where:
[ABCD] - a four-character string that reflects the type of question contained. The range of options includes:
- fibn - numeric fill-in-blank (not integer)
[A...Z] - a variable-length character string that describes the type of data structures contained. Any appropriate combination is used. The basic convention is:
- a - single assessment without score processing;
- ar - single assessment with score processing;
- am - multiple assessments without score processing;
- amr - multiple assessments with score processing;
- s - single section without score processing;
- sr - single section with score processing;
- sm - multiple sections without score processing;
- smr - multiple sections with score processing;
- i - single item without response processing;
- ir - single item with response processing;
- im - multiple items without response processing;
- imr - multiple items with response processing.
[XYZ] - a number in the range 001-999 (000 is reserved for later usage). This is the number of the example and should follow some form of numbering system.
[*] - a single character in the range 'a' to 'z'. This is used to denote differences between files that are alternative solutions to the same Assessment/Section/Item.
[<free form>] - free-format text that can be used to add any other descriptive information. It should be of alphanumeric form.
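As a hedged illustration of the proposed convention (this name is invented for the purpose of the example and is not one of the distributed files), a single numeric fill-in-blank Item with response processing, being the first alternative solution of example number 001, could be named fibn_ir_001a_basic.xml and would be stored under the directory corresponding to its example class.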
About This Document
Title | 1EdTech Question & Test Interoperability: ASI Best Practice & Implementation Guide |
Editors | Colin Smythe, Eric Shepherd, Lane Brewer, and Steve Lay |
Version | 1.2 |
Version Date | 11 February 2002 |
Status | Final Specification |
Summary | This document provides additional information regarding 1EdTech Question & Test Interoperability best practice and implementation. It is meant to complement the 1EdTech Question & Test Interoperability: ASI XML Binding and 1EdTech Question & Test Interoperability Information Model documents. |
Revision Information | 22 January 2002 |
Purpose | Defines the best practice and usage of the Assessment, Section & Item XML binding for the Question & Test Interoperability Information Model specification. |
Document Location | http://www.imsglobal.org/question/v1p2/imsqti_asi_bestv1p2.html |
List of Contributors
The following individuals contributed to the development of this document:
Revision History
Index
A
Administrating Authority 1
Administrator 1
ASI 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16
Assessment 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37
Assessment Elements
assessfeedback 1, 2, 3, 4
assessment 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21
assessmentcontrol 1, 2, 3, 4, 5
assessproc_extension 1, 2
Assessor 1, 2, 3, 4
Attributes
action 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24
apptype 1, 2
areatype 1, 2, 3, 4
audiotype 1, 2, 3, 4
case 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
charset 1, 2, 3, 4
class 1, 2, 3, 4, 5, 6, 7
columns 1, 2, 3, 4, 5
continue 1, 2
cutvalue 1
defaultval 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14
embedded 1, 2
encoding 1, 2, 3
entityref 1, 2, 3, 4
feedbackstyle 1, 2, 3, 4, 5, 6, 7
feedbackswitch 1, 2, 3, 4, 5, 6, 7
feedbacktype 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26
height 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17
hintswitch 1, 2, 3, 4, 5, 6, 7, 8
ident 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63
imagtype 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18
index 1, 2, 3, 4, 5, 6
label 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45
labelrefid 1, 2
linkrefid 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27
lowerbound 1, 2, 3, 4, 5
match_group 1, 2, 3
match_max 1, 2, 3
maxattempts 1, 2, 3
maxchars 1, 2, 3, 4, 5, 6, 7
maxnumber 1, 2, 3, 4, 5, 6
maxvalue 1, 2
members 1, 2, 3
minnumber 1, 2, 3, 4, 5, 6
minvalue 1, 2
numtype 1, 2, 3
orientation 1, 2, 3, 4, 5
prompt 1, 2, 3, 4, 5, 6, 7, 8
rarea 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
rcardinality 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38
respident 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24
rows 1, 2, 3
rrange 1, 2, 3
rshuffle 1, 2, 3, 4
rtiming 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30
scoremodel 1, 2, 3, 4, 5, 6, 7
setmatch 1, 2
showdraw 1, 2, 3, 4, 5, 6
shuffle 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14
solutionswitch 1, 2, 3, 4, 5, 6, 7
startval 1, 2, 3, 4, 5
step 1, 2, 3, 4, 5
testoperator 1, 2, 3
texttype 1, 2, 3
title 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46
upperbound 1, 2, 3, 4
uri 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25
varname 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19
vartype 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14
videotype 1, 2
view 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35
vocab_type 1, 2, 3, 4
width 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16
x0 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16
xmllang 1, 2, 3, 4
y0 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15
Author 1
C
Candidate 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29
Common Elements
altmaterial 1, 2, 3, 4, 5, 6
conditionvar 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30
decvar 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29
displayfeedback 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27
duration 1, 2, 3
durequal 1, 2, 3
durgt 1, 2
durgte 1, 2
durlt 1, 2
durlte 1, 2
fieldentry 1, 2, 3, 4, 5
flow_mat 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39
interpretvar 1, 2, 3
mat_extension 1, 2
matapplet 1, 2, 3
matapplication 1, 2, 3, 4
mataudio 1, 2, 3, 4, 5, 6
matbreak 1
matemtext 1, 2, 3, 4, 5, 6, 7
material 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86
material_ref 1, 2, 3
matimage 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24
matref 1, 2, 3, 4
mattext 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62
matvideo 1, 2, 3, 4, 5, 6
objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18
order 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
other 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17
outcomes 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36
outcomes_processing 1, 2, 3, 4, 5, 6, 7, 8, 9
presentation_material 1, 2, 3
qticomment 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39
qtimetadata 1, 2, 3, 4, 5, 6, 7, 8
qtimetadatafield 1, 2, 3, 4, 5
reference 1, 2, 3, 4, 5, 6, 7
rubric 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17
selection 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15
selection_ordering 1, 2, 3, 4, 5, 6
setvar 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26
unanswered 1, 2, 3, 4, 5
var_extension 1, 2
varequal 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24
vargt 1
vargte 1, 2, 3
varinside 1, 2, 3, 4
varlt 1, 2, 3, 4
varlte 1, 2, 3, 4
varsubset 1
varsubstring 1, 2
vocabulary 1, 2, 3, 4, 5, 6, 7, 8
Composite 1, 2, 3, 4
Conformance 1, 2, 3, 4
E
Elements
altmaterial 1, 2, 3, 4, 5, 6
assessfeedback 1, 2, 3, 4
assessment 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21
assessmentcontrol 1, 2, 3, 4, 5
assessproc_extension 1, 2
conditionvar 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30
decvar 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29
displayfeedback 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27
duration 1, 2, 3
durequal 1, 2, 3
durgt 1, 2
durgte 1, 2
durlt 1, 2
durlte 1, 2
fieldentry 1, 2, 3, 4, 5
fieldlabel 1, 2, 3, 4, 5
flow 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65
flow_label 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17
flow_mat 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39
hint 1, 2, 3, 4, 5, 6, 7, 8
hintmaterial 1, 2, 3, 4
interpretvar 1, 2, 3
itemcontrol 1, 2, 3, 4, 5, 6
itemfeedback 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28
itemmetadata 1, 2, 3, 4, 5, 6
itempostcondition 1, 2, 3
itemprecondition 1, 2, 3
itemproc_extension 1, 2
itemref 1, 2, 3
itemrubric 1, 2, 3
mat_extension 1, 2
matapplet 1, 2, 3
matapplication 1, 2, 3, 4
mataudio 1, 2, 3, 4, 5, 6
matbreak 1
matemtext 1, 2, 3, 4, 5, 6, 7
material 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86
material_ref 1, 2, 3
matimage 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24
matref 1, 2, 3, 4
mattext 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62
matvideo 1, 2, 3, 4, 5, 6
objectbank 1, 2, 3
objectives 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18
order 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
other 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17
outcomes 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36
outcomes_feedback_test 1, 2, 3
outcomes_processing 1, 2, 3, 4, 5, 6, 7, 8, 9
presentation 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63
presentation_material 1, 2, 3
qmd_feedbackpermitted 1
qmd_hintspermitted 1
qmd_itemtype 1
qmd_levelofdifficulty 1
qmd_material 1
qmd_maximumscore 1
qmd_renderingtype 1
qmd_responsetype 1
qmd_scoringpermitted 1
qmd_solutionspermitted 1
qmd_status 1
qmd_timedependence 1
qmd_timelimit 1
qmd_toolvendor 1
qmd_topic 1
qmd_typeofsolution 1
qmd_weighting 1
qticomment 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39
qtimetadata 1, 2, 3, 4, 5, 6, 7, 8
qtimetadatafield 1, 2, 3, 4, 5
questestinterop 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52
reference 1, 2, 3, 4, 5, 6, 7
render_choice 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32
render_extension 1, 2, 3, 4, 5, 6, 7, 8, 9
render_fib 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15
render_hotspot 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17
render_slider 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11
respcond_extension 1, 2
respcondition 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26
response_extension 1, 2, 3
response_grp 1, 2, 3, 4, 5, 6, 7, 8
response_label 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54
response_lid 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42
response_na 1, 2, 3
response_num 1, 2, 3, 4, 5, 6, 7, 8, 9
response_str 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
response_xy 1, 2, 3, 4, 5, 6, 7, 8
resprocessing 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32
rubric 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17
section 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25
sectioncontrol 1, 2, 3, 4, 5
sectionfeedback 1, 2, 3, 4
sectionpostcondition 1, 2, 3
sectionprecondition 1, 2, 3
sectionproc_extension 1, 2
sectionref 1, 2, 3
selection 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15
selection_ordering 1, 2, 3, 4, 5, 6
setvar 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26
solution 1, 2, 3, 4, 5, 6, 7, 8
solutionmaterial 1, 2, 3, 4, 5
test_variable 1, 2, 3
unanswered 1, 2, 3, 4, 5
var_extension 1, 2
varequal 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24
vargt 1
vargte 1, 2, 3
variable_test 1, 2, 3
varinside 1, 2, 3, 4
varlt 1, 2, 3, 4
varlte 1, 2, 3, 4
varsubset 1
vocabulary 1, 2, 3, 4, 5, 6, 7, 8
Extension Elements
assessproc_extension 1, 2
itemproc_extension 1, 2
mat_extension 1, 2
render_extension 1, 2, 3, 4, 5, 6, 7, 8, 9
respcond_extension 1, 2
response_extension 1, 2, 3
sectionproc_extension 1, 2
var_extension 1, 2
F
FIB 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15
Accessibility 1
Content Packaging 1, 2, 3, 4, 5, 6, 7
Learner Information Package 1, 2
I
Image Hot Spot 1, 2, 3, 4, 5
Interoperability structures
Assessment 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37
Item 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62
Section 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43
Invigilator 1
Item 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62
Item Elements
flow 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65
flow_label 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17
hint 1, 2, 3, 4, 5, 6, 7, 8
hintmaterial 1, 2, 3, 4
itemcontrol 1, 2, 3, 4, 5, 6
itemfeedback 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28
itemmetadata 1, 2, 3, 4, 5, 6
itempostcondition 1, 2, 3
itemprecondition 1, 2, 3
itemproc_extension 1, 2
itemrubric 1, 2, 3
presentation 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63
render_choice 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32
render_extension 1, 2, 3, 4, 5, 6, 7, 8, 9
render_fib 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15
render_hotspot 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17
render_slider 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11
respcond_extension 1, 2
respcondition 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26
response_extension 1, 2, 3
response_grp 1, 2, 3, 4, 5, 6, 7, 8
response_label 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54
response_lid 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42
response_num 1, 2, 3, 4, 5, 6, 7, 8, 9
response_str 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
response_xy 1, 2, 3, 4, 5, 6, 7, 8
resprocessing 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32
solution 1, 2, 3, 4, 5, 6, 7, 8
solutionmaterial 1, 2, 3, 4, 5
M
Meta-data
Description 1, 2, 3, 4, 5, 6
Elements
qmd_computerscored 1
qmd_feedbackpermitted 1
qmd_hintspermitted 1
qmd_levelofdifficulty 1
qmd_material 1
qmd_maximumscore 1
qmd_responsetype 1
qmd_scoringpermitted 1
qmd_solutionspermitted 1
qmd_status 1
qmd_timedependence 1
qmd_timelimit 1
qmd_toolvendor 1
qmd_topic 1
qmd_typeofsolution 1
qmd_weighting 1
Language 1
Objectives 1, 2, 3, 4, 5
Resource Identifier 1, 2, 3
status 1
Version 1, 2, 3, 4, 5, 6, 7, 8, 9, 10
Multiple choice 1, 2, 3, 4
Multiple response 1, 2, 3
O
Outcomes 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11
Outcomes processing
Attributes
outcomes 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36
outcomes_feedback_test 1, 2, 3
outcomes_processing 1, 2, 3, 4, 5, 6, 7, 8, 9
P
Participant
Administrator 1
Assessor 1, 2, 3, 4
Author 1
Candidate 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29
Invigilator 1
Proctor 1
Psychometrician 1
Scorer 1, 2
Tutor 1, 2, 3, 4
Proctor 1
Psychometrician 1
Q
QTILite 1, 2, 3, 4, 5, 6, 7, 8, 9
Question 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
R
Resource Identifier 1, 2, 3
Response 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39
Response processing 1, 2, 3, 4, 5, 6, 7
S
Scoring algorithms
SumofScores 1, 2, 3, 4, 5
Section 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43
Section Elements
itemref 1, 2, 3
sectioncontrol 1, 2, 3, 4, 5
sectionfeedback 1, 2, 3, 4
sectionpostcondition 1, 2, 3
sectionprecondition 1, 2, 3
sectionproc_extension 1, 2
Selection & ordering 1, 2, 3, 4, 5
Elements
order 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
selection 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15
selection_ordering 1, 2, 3, 4, 5, 6
Slider 1, 2, 3, 4, 5, 6
Solution 1, 2
T
True/false 1, 2, 3
Tutor 1, 2, 3, 4
V
Version 1.01 Additions
Elements
rubric 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17
Version 1.1 Additions
Attributes
flow 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65
flow_label 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17
flow_mat 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39
matbreak 1
qtimetadata 1, 2, 3, 4, 5, 6, 7, 8
qtimetadatafield 1, 2, 3, 4, 5
vocabulary 1, 2, 3, 4, 5, 6, 7, 8
Version 1.2 Additions
Attributes
cutvalue 1
vocab_type 1, 2, 3, 4
Elements
order 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
outcomes 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36
outcomes_feedback_test 1, 2, 3
outcomes_processing 1, 2, 3, 4, 5, 6, 7, 8, 9
selection 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15
selection_ordering 1, 2, 3, 4, 5, 6
X
XML 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74
XML Schema
DTD 1, 2, 3, 4, 5, 6
XDR 1
XSD 1, 2, 3
XSD 1, 2, 3
1EdTech Consortium, Inc. ("1EdTech") is publishing the information contained in this 1EdTech Question & Test Interoperability: ASI Best Practice & Implementation Guide ("Specification") for purposes of scientific, experimental, and scholarly collaboration only.
1EdTech makes no warranty or representation regarding the accuracy or completeness of the Specification.
This material is provided on an "As Is" and "As Available" basis.
The Specification is at all times subject to change and revision without notice.
It is your sole responsibility to evaluate the usefulness, accuracy, and completeness of the Specification as it relates to you.
1EdTech would appreciate receiving your comments and suggestions.
Please contact 1EdTech through our website at http://www.imsglobal.org
Please refer to Document Name: 1EdTech Question & Test Interoperability: ASI Best Practice & Implementation Guide Date: 11 February 2002