IMS Rubric Specification
Version 1.0 Final Specification
Copyright © 2005 IMS Global Learning Consortium, Inc. All Rights Reserved.
The IMS Logo is a registered trademark of IMS/GLC
Document Name: IMS Rubric Specification
Revision: 02 June 2005
IPR and Distribution Notices
Recipients of this document are requested to submit, with their comments, notification of any relevant patent claims or other intellectual property rights of which they may be aware that might be infringed by any implementation of the specification set forth in this document, and to provide supporting documentation.
IMS takes no position regarding the validity or scope of any intellectual property or other rights that might be claimed to pertain to the implementation or use of the technology described in this document or the extent to which any license under such rights might or might not be available; neither does it represent that it has made any effort to identify any such rights. Information on IMS's procedures with respect to rights in IMS specifications can be found at the IMS Intellectual Property Rights web page: http://www.imsglobal.org/ipr/imsipr_policyFinal.pdf.
Copyright © 2005 IMS Global Learning Consortium. All Rights Reserved.
Permission is granted to all parties to use excerpts from this document as needed in producing requests for proposals.
Use of this specification to develop products or services is governed by the license with IMS found on the IMS website: http://www.imsglobal.org/license.html.
The limited permissions granted above are perpetual and will not be revoked by IMS or its successors or assigns.
THIS SPECIFICATION IS BEING OFFERED WITHOUT ANY WARRANTY WHATSOEVER, AND IN PARTICULAR, ANY WARRANTY OF NONINFRINGEMENT IS EXPRESSLY DISCLAIMED. ANY USE OF THIS SPECIFICATION SHALL BE MADE ENTIRELY AT THE IMPLEMENTER'S OWN RISK, AND NEITHER THE CONSORTIUM, NOR ANY OF ITS MEMBERS OR SUBMITTERS, SHALL HAVE ANY LIABILITY WHATSOEVER TO ANY IMPLEMENTER OR THIRD PARTY FOR ANY DAMAGES OF ANY NATURE WHATSOEVER, DIRECTLY OR INDIRECTLY, ARISING FROM THE USE OF THIS SPECIFICATION.
1.1 Structure of this Document
2. Information Model
2.1 Rubric Class
2.2 Description Class
2.3 Outcome Class
2.3.1 Rules Class
2.3.2 requiresCriteria Class
2.4 DimensionOfQuality Class
3. Binding of Rubric
4. Use Cases
4.1 Multi-dimensional Assessment
4.1.1 Variant - Continuum Scoring
4.1.2 Multi-dimensional Assessment with Groupings
4.1.3 Multi-dimensional Assessment with Rollup
4.1.4 Criterion List
4.1.5 Cumulative Criterion List
4.2 Referring to Rubrics
5.1 Multi-dimensional with Rollup
Appendix A - Glossary
About This Document
List of Contributors
This specification deals with the representation of guidance as to how a portfolio has been, or is to be, assessed. A rubric carries information about the interpretation of merit or the criteria that have been used in that interpretation. That information is carried in statements and a structured model. The word 'guidance' is chosen deliberately: we are providing neither a model for assessment schemes in their full richness nor an information model to support automated evaluation software. The primary consumer of data produced according to this specification is not an assessor, since an actual assessment process would normally involve additional elements of guidance and professional expertise, as well as issues of moderation.
In the context of the IMS ePortfolio specification, Rubrics constitute one type of PortfolioPart, which may be included within a Portfolio and may be related to other Portfolio Parts. The Rubric Specification is published as a separate document to acknowledge that the data structure described herein may be useful in other e-learning contexts. However, the Rubric Specification explicitly treats Rubrics only in the context of ePortfolios and does not address these other possible uses.
In this specification, we use the term 'rubric' (see Appendix A) while recognizing that the term is used with different but closely related meanings in various English-speaking locales, and that it appears in various other European languages with usage that is generally less educationally oriented (a column; a heading). "Rubric" also has a common ecclesiastical sense, which we do not propose to address.
| Abbreviation | Definition |
| --- | --- |
| CSS | Cascading Style Sheets |
| NLII | EDUCAUSE National Learning Infrastructure Initiative |
| W3C | World Wide Web Consortium |
| XML | Extensible Markup Language |
| XSD | XML Schema Definition |
| XSL | Extensible Stylesheet Language |
NOTE: The attributes typed as "RDCEO" reflect the equivalent named structures of the IMS Reusable Definition of Competency or Educational Objective (RDCEO) Information Model v1.0 [RDCEO, 02]. The attributes typed as "LIP" reflect the structures of the IMS Learner Information Package specification [LIP, 01]. For more details please see these specifications.
See the use cases in sub-section 4 and the examples in sub-section 5 for guidance on how to use the Information Model to represent common assessment processes. See sub-section 4.1 for a description of the most common form of rubric used with ePortfolios in current practice. The Information Model does not make the distinction between dimension of quality and level of mastery explicit. The organizational groupings are realized as a hierarchy of 'dimensions of quality'.
This is typical of portfolio assessment and may be used for self-assessment, as well as for communicating the criteria to candidates. This sort of rubric defines a complex set of interrelated criteria and would usually be represented visually as a table. This sort of rubric generally has three elements:
- Dimensions of quality - the first column in a rubric lists a set of dimensions or areas to be assessed, such as "use of multimedia" or "grammar." In some rubrics, the dimensions and/or their associated commentaries are referred to as "criteria."
- Levels of mastery - each subsequent column represents a level of performance that is defined by the rubric such as "excellent" or "not yet competent."
- Commentaries - at the intersection of each dimension of quality and level of mastery, a rubric includes a textual description of the qualities of performances and products on that dimension at that level. For example, the dimension "spelling and grammar" at the level of "exemplary" might be described as "is free of errors in grammar, punctuation, word choice, spelling, and format."
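The three elements above can be sketched as a simple data structure. This is a minimal illustration only, not the normative IMS binding: the class and attribute names (`Rubric`, `commentary`, etc.) are ours, chosen to mirror the prose.

```python
from dataclasses import dataclass, field

@dataclass
class Rubric:
    # Illustrative model (not the normative binding): commentaries are
    # indexed by the intersection of a dimension of quality and a level
    # of mastery, as in the tabular form described above.
    dimensions: list   # dimensions of quality, e.g. "use of multimedia", "grammar"
    levels: list       # levels of mastery, e.g. "excellent", "not yet competent"
    commentaries: dict = field(default_factory=dict)

    def commentary(self, dimension, level):
        """Return the textual description at one cell of the matrix."""
        return self.commentaries[(dimension, level)]

rubric = Rubric(
    dimensions=["spelling and grammar"],
    levels=["exemplary"],
)
rubric.commentaries[("spelling and grammar", "exemplary")] = (
    "is free of errors in grammar, punctuation, word choice, spelling, and format"
)
```

In the table view, `dimensions` is the first column, `levels` supplies the subsequent column headings, and each entry in `commentaries` is one cell body.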
The implication of the main use case "Multi-dimensional Assessment" is that one of the levels of mastery would be selected for a given candidate in a given dimension of quality. In some cases, the levels of mastery may be indicative of milestones within a continuum, and a score may be presented that does not correspond to the score shown in a 'specific commentary'.
In this case, there are a number of criteria that are not organized into dimensions of quality but are only associated with levels of mastery. All level-linked criteria must be met to show that level of mastery.
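The cumulative case can be sketched as follows. Again the names (`criteria_by_level`, `level_shown`) and the sample criteria are illustrative assumptions, not part of the specification: criteria attach directly to levels of mastery, and a level is demonstrated only when every criterion linked to it is met.

```python
# Illustrative sketch of a cumulative criterion list: criteria are
# associated with levels of mastery rather than dimensions of quality.
criteria_by_level = {
    "competent": {"cites sources", "uses correct spelling"},
    "exemplary": {"cites sources", "uses correct spelling", "integrates multimedia"},
}

def level_shown(level, criteria_met):
    """All level-linked criteria must be met to show that level of mastery."""
    return criteria_by_level[level] <= set(criteria_met)
```

Note the cumulative character: the set of criteria for a higher level typically includes those of the levels beneath it.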
- A dimension of quality is identified, for example to present a piece of evidence as an item for assessment.
- The intersection of a dimension of quality and level of mastery is identified, for example to interpret a score.
- An individual criterion is identified, for example when expressing how a level of mastery is shown.
- A reference is found through an association between another part of the portfolio and the intersection of a dimension of quality and level of mastery.
- The reference is followed and a rubric fragment (or whole) retrieved.
- The rubric information is formatted to a human-readable form.
- The formatted information is included, or linked-to, in context.
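The retrieval and formatting steps above can be sketched in miniature. The reference syntax (`"dimension#level"`) and function names here are assumptions for illustration; the specification does not define a resolution mechanism.

```python
# Illustrative sketch: a portfolio part refers to the intersection of a
# dimension of quality and a level of mastery; the reference is followed,
# the rubric fragment retrieved, and the result formatted for display.
commentaries = {
    ("grammar", "exemplary"): "is free of errors in grammar and punctuation",
}

def resolve(ref):
    """Follow a hypothetical 'dimension#level' reference and retrieve the fragment."""
    dimension, level = ref.split("#")
    return dimension, level, commentaries[(dimension, level)]

def render(ref):
    """Format the retrieved rubric fragment to a human-readable form."""
    dimension, level, text = resolve(ref)
    return f"{dimension} / {level}: {text}"
```

A consuming application would typically either include the rendered text in context or link to it, as the final step above describes.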
This example illustrates how to use the Rubric specification to represent a rubric used for multi-dimensional assessment with rollup, as described in sub-sections 4.1 and 4.1.3. The example file is a representation of part of the University of Wisconsin-Stout Electronic Teaching Portfolio rubric, which is used with permission from Joan M. Vandervelde. The source rubric may be available at http://www.uwstout.edu/soe/profdev/eportfoliorubric.html. This example uses the RDCEO model defined in this document. The example file is located on the IMS ePortfolio website: http://www.imsglobal.org/ep/
Assessment is any systematic process of gathering information and making assertions about what a person or set of persons has learned, knows or can do. (While conflicting with the more restrictive definition of assessment as synonymous with testing presented in the IMS QTI specification and Abstract Framework, the broader definition presented here is much more accurate in the context of portfolio practice.)
The creator of an ePortfolio or part of an ePortfolio is the person or group principally responsible for making the meaningful content of the ePortfolio or part. (This definition differs from, but does not conflict with, the definition of "creator" in the Abstract Framework glossary.)
A rubric is a scoring guide that defines assessment criteria for each cell in a two-dimensional matrix of dimensions of quality by levels of mastery. Levels of mastery can be defined as a continuum or as a discrete set.
A part is a self-contained piece of information about the ePortfolio subject, or about anything contained in an ePortfolio. The meaning and significance of a part may draw from the relationships in which the part partakes.
The term element is used by IMS LIP to indicate the constituent units of the specification, at any level, both as structures and as instances. The top-level elements represent some of the parts of the ePortfolio.
| Title | IMS Rubric Specification |
| Editors | Colin Smythe (IMS), Mark McKell (IMS), Adam Cooper (Tribal) |
| Team Co-Leads | Darren Cambridge (EDUCAUSE), Andy Heath (EPICC) |
| Version Date | 02 June 2005 |
| Summary | This document describes the Rubric specification as it relates to the ePortfolio specification. |
| Revision Information | 02 June 2005 |
| Purpose | This document has been approved by the IMS Technical Board and is made available for adoption. |
To register any comments or questions about this specification please visit: http://www.imsglobal.org/developers/ims/imsforum/categories.cfm?catid=24
| Adam Cooper | Tribal Technology |
| Colin Smythe | IMS Global Learning Consortium, Inc. |
| Mark McKell | IMS Global Learning Consortium, Inc. |
IMS Global Learning Consortium, Inc. ("IMS/GLC") is publishing the information contained in this IMS Rubric Specification ("Specification") for purposes of scientific, experimental, and scholarly collaboration only.
IMS/GLC makes no warranty or representation regarding the accuracy or completeness of the Specification.
This material is provided on an "As Is" and "As Available" basis.
The Specification is at all times subject to change and revision without notice.
It is your sole responsibility to evaluate the usefulness, accuracy, and completeness of the Specification as it relates to you.
IMS/GLC would appreciate receiving your comments and suggestions.
Please contact IMS/GLC through our website at http://www.imsglobal.org
Please refer to Document Name: IMS Rubric Specification, Revision: 02 June 2005.