
1EdTech Rubric Specification

Version 1.0 Final Specification

Copyright © 2005 1EdTech Consortium, Inc. All Rights Reserved.
The 1EdTech Logo is a registered trademark of 1EdTech/GLC
Document Name: 1EdTech Rubric Specification
Revision: 02 June 2005

IPR and Distribution Notices

Recipients of this document are requested to submit, with their comments, notification of any relevant patent claims or other intellectual property rights of which they may be aware that might be infringed by any implementation of the specification set forth in this document, and to provide supporting documentation.

1EdTech takes no position regarding the validity or scope of any intellectual property or other rights that might be claimed to pertain to the implementation or use of the technology described in this document or the extent to which any license under such rights might or might not be available; neither does it represent that it has made any effort to identify any such rights. Information on 1EdTech's procedures with respect to rights in 1EdTech specifications can be found at the 1EdTech Intellectual Property Rights web page: http://www.imsglobal.org/ipr/imsipr_policyFinal.pdf.

Copyright © 2005 1EdTech Consortium. All Rights Reserved.

Permission is granted to all parties to use excerpts from this document as needed in producing requests for proposals.

Use of this specification to develop products or services is governed by the license with 1EdTech found on the 1EdTech website: http://www.imsglobal.org/license.html.

The limited permissions granted above are perpetual and will not be revoked by 1EdTech or its successors or assigns.

THIS SPECIFICATION IS BEING OFFERED WITHOUT ANY WARRANTY WHATSOEVER, AND IN PARTICULAR, ANY WARRANTY OF NONINFRINGEMENT IS EXPRESSLY DISCLAIMED. ANY USE OF THIS SPECIFICATION SHALL BE MADE ENTIRELY AT THE IMPLEMENTER'S OWN RISK, AND NEITHER THE CONSORTIUM, NOR ANY OF ITS MEMBERS OR SUBMITTERS, SHALL HAVE ANY LIABILITY WHATSOEVER TO ANY IMPLEMENTER OR THIRD PARTY FOR ANY DAMAGES OF ANY NATURE WHATSOEVER, DIRECTLY OR INDIRECTLY, ARISING FROM THE USE OF THIS SPECIFICATION.


Table of Contents


1. Introduction
1.1 Structure of this Document
1.2 Nomenclature
1.3 References

2. Information Model
2.1 Rubric Class
2.2 Description Class
2.3 Outcome Class
2.3.1 Rules Class
2.3.2 requiresCriteria Class
2.4 DimensionOfQuality Class

3. Binding of Rubric

4. Use Cases
4.1 Multi-dimensional Assessment
4.1.1 Variant - Continuum Scoring
4.1.2 Multi-dimensional Assessment with Groupings
4.1.3 Multi-dimensional Assessment with Rollup
4.1.4 Criterion List
4.1.5 Cumulative Criterion List
4.2 Referring to Rubrics

5. Examples
5.1 Multi-dimensional with Rollup

Appendix A - Glossary

About This Document
List of Contributors

Revision History



1. Introduction

This specification deals with the representation of guidance as to how a portfolio has been, or is to be, assessed. A rubric carries information about interpretation of merit or the criteria that have been used in the interpretation. That information is carried in statements and a structured model. The word 'guidance' is chosen deliberately since we are not providing a model for assessment schemes in their full glory, nor are we providing an information model to support automated evaluation software. The primary consumer of data produced according to this specification is not an assessor, since there would normally be additional elements of guidance and professional expertise, as well as issues of moderation, surrounding an actual assessment process.

In the context of the 1EdTech ePortfolio specification, Rubrics constitute one type of PortfolioPart, which may be included within a Portfolio and may be related to other Portfolio Parts. The Rubric Specification is a separate document to acknowledge that the data structure described herein may be useful in other elearning contexts. However, the Rubric Specification explicitly treats Rubrics only in the context of ePortfolios and does not address these other possible uses.

In this specification, we use the term 'rubric' (see Appendix A) while recognizing that the term is used with different but closely related meanings in various English-speaking locales and that it appears in various other European languages with usage that is generally less educationally-oriented (a column; a heading). "Rubric" also has a common ecclesiastical sense, which we do not propose to address.

1.1 Structure of this Document

The structure of this document is:

2. Information Model - The description of the Rubric Information Model.
3. Binding of Rubric - The description of the Rubric Binding.
4. Use Cases - The description of Rubric use cases.
5. Examples - Example of a Rubric.
Appendix A - Glossary of terms and definitions.

1.2 Nomenclature

CSS Cascading Style Sheets
NLII EDUCAUSE National Learning Infrastructure Initiative
W3C World Wide Web Consortium
XML Extensible Mark-up Language
XSD XML Schema Definition
XSL Extensible Stylesheet Language
XSLT XSL Transformations

1.3 References

[EP, 05a] 1EdTech ePortfolio Information Model v1.0, A. Jackl, D. Cambridge, 1EdTech/GLC, June 2005.
[EP, 05b] 1EdTech ePortfolio XML Binding v1.0, A. Jackl, D. Cambridge, S. Wilson, 1EdTech/GLC, June 2005.
[EP, 05c] 1EdTech ePortfolio Best Practice and Implementation Guide v1.0, D. Cambridge, J. Finnefrock, A. Jackl, 1EdTech/GLC, June 2005.
[GWS, 05] 1EdTech General Web Services Base Profiles Public Draft v1.0, C. Schroeder, S. Raju, C. Smythe, 1EdTech/GLC, January 2005.
[LIP, 01] 1EdTech Learner Information Package v1.0, C. Smythe, F. Tansey, R. Robson, 1EdTech/GLC, March 2001.
[RDCEO, 02] 1EdTech Reusable Definition of Competency or Educational Objective v1.0, A. Cooper, C. Ostyn, 1EdTech/GLC, October 2002.

2. Information Model

NOTE: The attributes typed as "RDCEO" reflect the equivalent named structures of the 1EdTech Reusable Definition of Competency or Educational Objective (RDCEO) Information Model v1.0 [RDCEO, 02]. The attributes typed as "LIP" reflect the structures of the 1EdTech Learner Information Package specification [LIP, 01]. For more details please see these specifications.

The UML diagram in Figure 2.1 is based on the profiling guidelines of the 1EdTech General Web Services specification [GWS, 05]. The modelling tool Poseidon by Gentleware was used to create the diagram.

Figure 2.1 Top level UML Model of a Rubric.

See the use cases in Section 4 and the examples in Section 5 for guidance on how to use the Information Model to represent common assessment processes. See sub-section 4.1 for a description of the most common form of rubric used with ePortfolios in current practice. The Information Model does not make the distinction between dimension of quality and level of mastery explicit. The organizational groupings are realized as a hierarchy of 'dimensions of quality'.

2.1 Rubric Class

The set of attributes for the Rubric class is summarized in Table 2.1.

Table 2.1 Summary of attributes for the Rubric class.

Attribute Name Type Mult. Description
comment Comment:LIP 0..1 Contains comments relevant to identifying the Rubric.
contentype Contentype:LIP 0..* Contains the content meta-data description concerning the index for the data, access rights, and time-stamps.
identifier AnyURI:RDCEO 0..1 A unique identifier for the Rubric.
title String:RDCEO 0..1 Title of the Rubric.
description Description:RDCEO 0..* Could contain an unstructured description or could be used alongside structured data as overview material.
rubricOutcome Outcome 0..* Collects the possible Outcomes of the assessment to which the Rubric relates.
dimensionOfQuality DimensionOfQuality 0..* Contains other Dimensions or an enumeration of the possible outcomes for a Dimension.
ext_rubric AnyType 0..* Allows for extension of the Rubric definition.

Attributes typed as LIP or RDCEO are formally defined in the respective specification documents.
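As an informal illustration only, a minimal Rubric instance carrying identification and an overview description might be sketched as below. The element forms, nesting, and identifier value shown here are assumptions made for readability; the normative element composition is given by the binding in Section 3 and the accompanying XSD.

    <rubric>
      <identifier>urn:example:rubric:writing-portfolio</identifier> <!-- hypothetical identifier -->
      <title>Writing Portfolio Rubric</title>
      <description>
        <langstring xml:lang="en">Guidance for assessing the written work in a portfolio.</langstring>
      </description>
      <!-- rubricOutcome and dimensionOfQuality structures (sub-sections 2.3 and 2.4) would follow here -->
    </rubric>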

2.2 Description Class

The attribute for the Description class is summarized in Table 2.2.

Table 2.2 Summary of attributes for the Description class.

Attribute Name Type Mult. Description
langstring String 1..* An unstructured commentary to describe the Rubric.

2.3 Outcome Class

The Outcome class attributes are described in Table 2.3.

Table 2.3 Attribute definitions for the Outcome class.

Attribute Name Type Mult. Description
identifier AnyURI:RDCEO 0..1 A unique identifier for the Outcome.
title String:RDCEO 0..1 Title of the Outcome.
description Description:RDCEO 0..* An unstructured definition of the Outcome.
definition Definition:RDCEO 0..* Definition as defined in the RDCEO specification.
rules Rules 0..* Rules for the application of the Outcome.
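Informally, and using the attribute names above as element names (an assumption for readability, not the normative binding), a single Outcome representing a level of mastery might look like:

    <rubricOutcome>
      <identifier>urn:example:outcome:exemplary</identifier> <!-- hypothetical identifier -->
      <title>Exemplary</title>
      <description>
        <langstring xml:lang="en">Work of consistently high quality across the assessed dimensions.</langstring>
      </description>
      <!-- an optional rules structure (sub-section 2.3.1) states when this Outcome applies -->
    </rubricOutcome>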

2.3.1 Rules Class

The Rules class attributes are described in Table 2.4.

Table 2.4 Attribute definitions for the Rules class.

Attribute Name Type Mult. Description
requiresTotalScore Integer 0..1 If Dimensions of quality are being used then this would be the sum of scores for the Dimensions. It may be the case that a mark scheme is being used that is not (or cannot be) included.
requiresCriteria Class 0..1 See requiresCriteria class below.
requiresComplex IMSextensionAny 0..1 Allows for implementers to extend the model for complex cases.

2.3.2 requiresCriteria Class

The requiresCriteria class attributes are described in Table 2.5.

Table 2.5 Attribute definitions for the requiresCriteria class.

Attribute Name Type Mult. Description
criteriaAtNumberRequired Integer 0..1 Allows for m-of-n situations. If absent then all criteria are required.
includeCriterion Identifier 0..* Allows for a criterion that is defined in one outcome to be referenced from another one for re-use.
criterion String 0..* A single description/definition of a criterion.
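The informal sketch below shows how the Rules and requiresCriteria classes can express an 'at least m of n criteria' condition together with a total-score threshold. The element forms, threshold values, and criterion texts are illustrative assumptions only; consult the binding and XSD for the normative representation.

    <rules>
      <requiresTotalScore>18</requiresTotalScore> <!-- illustrative threshold for summed dimension scores -->
      <requiresCriteria>
        <criteriaAtNumberRequired>2</criteriaAtNumberRequired> <!-- any 2 of the 3 criteria below -->
        <criterion>Cites at least three primary sources.</criterion>
        <criterion>Is free of errors in grammar and spelling.</criterion>
        <criterion>Includes a reflective commentary on the evidence presented.</criterion>
      </requiresCriteria>
    </rules>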

2.4 DimensionOfQuality Class

The DimensionOfQuality class attributes are described in Table 2.6.

Table 2.6 Attribute definitions for the DimensionOfQuality class.

Attribute Name Type Mult. Description
identifier AnyURI:RDCEO 0..1 A unique identifier for the Dimension.
title String:RDCEO 0..1 Title of the Dimension.
description Description:RDCEO 0..* Description of the Dimension.
dimensionOutcome Outcome 0..* Collects the possible Outcomes of the assessment to which the rubric relates.
dimensionOfQuality DimensionOfQuality 0..* Nesting allows for hierarchical grouping of dimensions of quality. A Dimension will either contain other dimensions OR an enumeration of the possible Outcomes for that Dimension.
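Since a Dimension contains either nested Dimensions or an enumeration of Outcomes, hierarchical grouping can be sketched informally as below: a grouping Dimension ('Writing') holds a leaf Dimension ('Spelling and grammar') whose Outcomes are its levels of mastery. Element forms and titles are illustrative assumptions, not the normative binding.

    <dimensionOfQuality>
      <title>Writing</title> <!-- grouping dimension: contains other dimensions, no outcomes -->
      <dimensionOfQuality>
        <title>Spelling and grammar</title> <!-- leaf dimension: enumerates its outcomes -->
        <dimensionOutcome>
          <title>Exemplary</title>
          <description>
            <langstring xml:lang="en">Is free of errors in grammar, punctuation, word choice, spelling, and format.</langstring>
          </description>
        </dimensionOutcome>
        <dimensionOutcome>
          <title>Not yet competent</title>
        </dimensionOutcome>
      </dimensionOfQuality>
    </dimensionOfQuality>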

3. Binding of Rubric

This specification defines an approach for representing the rubric using a small number of new elements for some parts of the above model and binding other parts to the 1EdTech RDCEO Specification.

Figure 3.1 The <rubric> element composition.

Figure 3.2 The <rubricOutcome> and <dimensionOutcome> elements composition.

Figure 3.3 The <dimensionOfQuality> element composition.

Figure 3.4 The <rules> element composition.
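Taken together, Figures 3.1 to 3.4 imply an element nesting along the lines of the outline below. This is an informal summary only; element names, ordering, and namespaces are governed by the XSD and the published example file.

    <rubric>
      <identifier> ... </identifier>
      <title> ... </title>
      <description> ... </description>
      <rubricOutcome>                <!-- Figure 3.2 -->
        <rules> ... </rules>         <!-- Figure 3.4 -->
      </rubricOutcome>
      <dimensionOfQuality>           <!-- Figure 3.3 -->
        <dimensionOutcome> ... </dimensionOutcome>
        <dimensionOfQuality> ... </dimensionOfQuality>
      </dimensionOfQuality>
    </rubric>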

The RDCEO specification is composed thus:

Figure 3.5 The <rdceo> element composition.

The LIP <contentype> element is composed thus:

Figure 3.6 The LIP <contentype> element composition.

4. Use Cases

4.1 Multi-dimensional Assessment

This is typical of portfolio assessment and may be used for self-assessment, as well as for communicating the criteria to candidates. This sort of rubric is for defining a complex set of interrelated criteria and would usually be visually represented as a table. This sort of rubric generally has three elements [1]:

  • Dimensions of quality - the first column in a rubric lists a set of dimensions or areas to be assessed, such as "use of multimedia" or "grammar." In some rubrics, the dimensions and/or their associated commentaries are referred to as "criteria."
  • Levels of mastery - each subsequent column represents a level of performance that is defined by the rubric such as "excellent" or "not yet competent."
  • Commentaries - at the intersection of each dimension of quality and level of mastery, a rubric includes a textual description of the qualities of performances and products on that dimension at that level. For example, the dimension "spelling and grammar" at the level of "exemplary" might be described as "is free of errors in grammar, punctuation, word choice, spelling, and format."

This use case is a generalization of a number of cases where the content of the 'commentary' differs. For example, in some cases a score value may be included, whereas in others it may not.
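In terms of the Information Model, each row of such a table typically maps to a DimensionOfQuality, each column heading to the title of a dimensionOutcome, and each cell commentary to that Outcome's description. A purely illustrative sketch for the 'use of multimedia' dimension mentioned above (element forms assumed, not normative):

    <dimensionOfQuality>
      <title>Use of multimedia</title>
      <dimensionOutcome>
        <title>Excellent</title>
        <description>
          <langstring xml:lang="en">Commentary for this cell; a score value may be included in the text if the scheme uses one.</langstring>
        </description>
      </dimensionOutcome>
      <dimensionOutcome>
        <title>Not yet competent</title>
        <description>
          <langstring xml:lang="en">Commentary for this cell.</langstring>
        </description>
      </dimensionOutcome>
    </dimensionOfQuality>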

4.1.1 Variant - Continuum Scoring

The implication of the main use case "Multi-dimensional Assessment" is that one of the levels of mastery would be selected for a given candidate in a given dimension of quality. In some cases, the levels of mastery may be indicative of milestones within a continuum, and a score may be presented that does not correspond to the score shown in a 'specific commentary'.

4.1.2 Multi-dimensional Assessment with Groupings

In some rubrics, the dimensions of quality are grouped into larger categories, such as "critical thinking" or "writing." These are groupings of convenience.

4.1.3 Multi-dimensional Assessment with Rollup

The previous use case may be extended by the requirement that the sum of scores in the individual dimensions of quality indicates an overall level of merit.
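Informally, rollup can be represented by giving each overall rubricOutcome a rules structure whose requiresTotalScore is the threshold for the summed dimension scores. The titles and thresholds below are illustrative assumptions only; Section 5 points to a complete, published example.

    <rubricOutcome>
      <title>Distinguished</title>
      <rules>
        <requiresTotalScore>32</requiresTotalScore> <!-- sum of per-dimension scores required -->
      </rules>
    </rubricOutcome>
    <rubricOutcome>
      <title>Proficient</title>
      <rules>
        <requiresTotalScore>24</requiresTotalScore>
      </rules>
    </rubricOutcome>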

4.1.4 Criterion List

In this case, there are a number of criteria that are not organized into dimensions of quality but are only associated with levels of mastery. All level-linked criteria must be met to show that level of mastery.

4.1.4.1 Variation - m of n

There may be more complex rules for the threshold of a level of mastery, such as meeting 'm of n' criteria or combinations of mandatory criteria plus 'm of n'.

4.1.5 Cumulative Criterion List

In some schemes, all criteria for a level of mastery and all criteria for the lower levels must be met.

4.2 Referring to Rubrics

Parts of the information contained within a rubric may need to be referenced, as well as the rubric as a whole. We can imagine cases where:

  • A dimension of quality is identified, for example to present a piece of evidence as an item for assessment.
  • The intersection of a dimension of quality and level of mastery is identified, for example to interpret a score.
  • An individual criterion is identified, for example when expressing how a level of mastery is shown.

The use of such references is a key usage scenario for this specification:

  • A reference is found through an association between another part of the portfolio and the intersection of a dimension of quality and level of mastery.
  • The reference is followed and a rubric fragment (or whole) retrieved.
  • The rubric information is formatted to a human-readable form.
  • The formatted information is included, or linked-to, in context.

The model for the rubric itself is separate from a model for showing the scores of a specific portfolio against a rubric.
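Because the Rubric, Outcome, and DimensionOfQuality classes all carry an optional identifier of type AnyURI, such references can be made by pointing at those identifiers from elsewhere in the portfolio. A minimal, purely illustrative sketch follows (the identifier value is hypothetical, and the referencing relationship itself is defined by the ePortfolio specification, not by this document):

    <dimensionOfQuality>
      <identifier>urn:example:rubric:writing:grammar</identifier> <!-- hypothetical identifier -->
      <title>Spelling and grammar</title>
      <!-- dimensionOutcome entries omitted -->
    </dimensionOfQuality>

    <!-- another portfolio part can then reference urn:example:rubric:writing:grammar,
         for example to present a piece of evidence for assessment against this dimension -->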

5. Examples

5.1 Multi-dimensional with Rollup

This example illustrates how to use the Rubric specification to represent a rubric used for multi-dimensional assessment with rollup, as described in sub-section 4.1 and sub-section 4.1.3. The example file is a representation of part of the University of Wisconsin-Stout Rubric of Electronic Teaching Portfolio, which is used with permission from Joan M. Vandervelde. The source rubric may be available at http://www.uwstout.edu/soe/profdev/eportfoliorubric.html. This example uses the RDCEO model as applied in this document. The example file is located on the 1EdTech ePortfolio website at http://www.imsglobal.org/ep/.

Appendix A - Glossary

Assessment is any systematic process of gathering information and making assertions about what a person or set of persons has learned, knows or can do. (While conflicting with the more restrictive definition of assessment as synonymous with testing presented in the 1EdTech QTI specification and Abstract Framework, the broader definition presented here is much more accurate in the context of portfolio practice.)

The creator of an ePortfolio or part of an ePortfolio is the person or group principally responsible for making the meaningful content of the ePortfolio or part. (This definition differs from, but does not conflict with, the definition of "creator" in the Abstract Framework glossary.)

ePortfolio: See 1EdTech ePortfolio Specification section 1.1.

An ePortfolio tool is a software application used to create, manage, render, and exchange ePortfolios.

The subject is the individual or group the ePortfolio is about.

The owner of an ePortfolio or part of an ePortfolio is the person or group who should be granted the ability to create, modify, and control access to that ePortfolio or part.

A product is any electronic document included in an ePortfolio or reference to an external artifact that could be presented as evidence of what a person has learned, knows, or can do.

A rubric is a scoring guide that defines assessment criteria for each cell in a two-dimensional matrix of dimensions of quality by levels of mastery. Levels of mastery can be defined as a continuum or a discrete set.

A view is a subset of an ePortfolio. A view can include any set of parts that can be included in an ePortfolio.

A part is a self-contained piece of information about the ePortfolio subject, or about anything contained in an ePortfolio. The meaning and significance of a part may draw from the relationships in which the part partakes.

The term element is used by 1EdTech LIP to indicate the constituent units of the specification, at any level, both as structures and as instances. The top-level elements represent some of the parts of the ePortfolio.

A relationship in the context of an ePortfolio is a relationship between parts of the ePortfolio. Significant relationships include those of evidencing, helping, and reflecting on.

About This Document

Title 1EdTech Rubric Specification
Editors Colin Smythe (1EdTech), Mark McKell (1EdTech), Adam Cooper (Tribal)
Team Co-Leads Darren Cambridge (EDUCAUSE), Andy Heath (EPICC)
Version 1.0
Version Date 02 June 2005
Status Final Specification
Summary This document describes the Rubric specification as it relates to the ePortfolio specification.
Revision Information 02 June 2005
Purpose This document has been approved by the 1EdTech Technical Board and is made available for adoption.
Document Location http://www.imsglobal.org/ep/epv1p0/imsrubric_specv1p0.html

To register any comments or questions about this specification please visit: http://www.imsglobal.org/developers/ims/imsforum/categories.cfm?catid=24

List of Contributors

The following individuals contributed to the development of this document:

Name Organization
Darren Cambridge EDUCAUSE
Adam Cooper Tribal Technology
Andy Heath EPICC
Colin Smythe 1EdTech Consortium, Inc.
Mark McKell 1EdTech Consortium, Inc.

Revision History

Version No. Release Date Comments
Base Document 1.0 29 March 2004 Initial version of the Rubric specification.
Public Draft 1.0 20 September 2004 Public Draft version of the Rubric specification.
Final Specification 1.0 02 June 2005 Final version of the Rubric specification.


[1] Level labels from Huba, Mary and Jann Freed (1999). Learner-Centered Assessment on College Campuses. Allyn and Bacon.

 

 

 

1EdTech Consortium, Inc. ("1EdTech/GLC") is publishing the information contained in this 1EdTech Rubric Specification ("Specification") for purposes of scientific, experimental, and scholarly collaboration only.

1EdTech/GLC makes no warranty or representation regarding the accuracy or completeness of the Specification.
This material is provided on an "As Is" and "As Available" basis.

The Specification is at all times subject to change and revision without notice.

It is your sole responsibility to evaluate the usefulness, accuracy, and completeness of the Specification as it relates to you.

1EdTech/GLC would appreciate receiving your comments and suggestions.

Please contact 1EdTech/GLC through our website at http://www.imsglobal.org

Please refer to Document Name: 1EdTech Rubric Specification, Revision: 02 June 2005