ABC on Scoring Rubrics Development for Large Scale Performance Assessment in Mathematics and Science
Abstract

As part of its technical assistance effort, Westat is developing an Occasional Papers series addressing issues of concern in conducting outcome evaluation. The first of these papers, on the development of scoring rubrics, has now been completed and is available for use and comment. Suggestions for additional papers are welcome.
Remember, Westat staff and their consultants are available to help you develop or review your outcome evaluation plans. NSF is providing the resources for this technical assistance. Please don't wait until the last minute to ask for help.
To suggest themes for occasional papers or request technical assistance, please contact Joy Frechtling. She can be reached at (301) 517-4006.
5.1 Off-line Sources
Airasian, P. (1997). Classroom Assessment. 3rd ed. New York: McGraw-Hill. (Note: Chapter 8 of Airasian's book, entitled "Performance Assessment," offers narrative text and samples pertaining to the development of scoring rubrics.)
American Educational Research Association, American Psychological Association, and National Council on Measurement in Education. (1999). Standards for Educational and Psychological Testing. Washington, DC: Author.
Arter, J. (1990). Performance Rubric Evaluation Form (Metarubric). Portland, OR: Northwest Regional Educational Laboratory.
Brewer, R. (1996). Exemplars: A Teacher's Solution. Underhill, VT: Exemplars.
Culham, R., and Spandel, V. (1993). Problems and Pitfalls Encountered by Raters. Developed at the Northwest Regional Educational Laboratory for the Oregon Department of Education.
Danielson, C. (1997). A Collection of Performance Tasks and Rubrics: Middle School Mathematics. Larchmont, NY: Eye on Education.
Danielson, C. (1997). A Collection of Performance Tasks and Rubrics: Upper Elementary School Mathematics. Larchmont, NY: Eye on Education.
Danielson, C., and Hansen, P. (1999). A Collection of Performance Tasks and Rubrics: Primary School Mathematics. Larchmont, NY: Eye on Education.
Danielson, C., and Marquez, E. (1998). A Collection of Performance Tasks and Rubrics: High School Mathematics. Larchmont, NY: Eye on Education.
Herman, J., Aschbacher, P., and Winters, L. (1992). A Practical Guide to Alternative Assessment. Alexandria, VA: Association for Supervision and Curriculum Development.
Johnson, B. (1996). The Performance Assessment Handbook: Designs from the Field and Guidelines for the Territory Ahead. Princeton, NJ: Eye on Education.
NOTE: Each of the two volumes of Johnson's work cited above contains a chapter entitled "Standards, Criteria, and Rubrics: Including Teachers and Students in the Search for Quality," replete with detailed samples of rubrics for a variety of subjects.
Lazear, D. (1998). The Rubrics Way: Using MI to Assess Understanding. Tucson, AZ: Zephyr Press.
Marcus, J. (1995). Data on the Impact of Alternative Assessment on Students. Unpublished manuscript. The Education Cooperative, Wellesley, MA.
Marzano, R., Pickering, D., and McTighe, J. (1993). Assessing Student Outcomes: Performance Assessment Using the Dimensions of Learning Model. Alexandria, VA: ASCD.
Perkins, D., Goodrich, H., Tishman, S., and Mirman Owen, J. (1994). Thinking Connections: Learning to Think and Thinking to Learn. Reading, MA: Addison-Wesley.
Taggart, G.L., Phifer, S.J., Nixon, J., and Wood, M. (Eds.). (1998). Rubrics: A Handbook for Construction and Use. Lancaster, PA: Technomic Publishing.
5.2 On-line Sources
Chicago Public School District
(Note: This site contains many examples of general scoring rubrics.)
Johnson County, Wyoming, School District #1 (Mathematics Assessment Rubrics)
National Center for Research on Evaluation, Standards, and Student Testing (CRESST)
New Jersey Statewide Assessment Sample Forms
Rubrics for Web Lessons, and S.C.O.R.E. Rubrics
Rubrics. http://wwwodyssey.on.ca/%7Eelaine/coxon/rubrics.htm
Rubrics for Web Lessons. http://webquest.sdsu.edu/rubrics/weblessons.htm
RMC Research Corporation
Kathy Schrock's Guide for Educators' Assessment - Rubrics http://www.schrockguide.net/assessment-and-rubrics.html
Spokane Public Schools, Washington State http://www.spokaneschools.org/site/default.aspx?PageID=1
Tasks [from the] Performance Assessment Links in Science [PALS]
(The tasks are arranged by grade range and by subject area. Within each task, select the link to Rubric.)
Toronto District School Board (Etobicoke) Research Department
Dr. Patrick J. Greene, a professor of education at Florida Gulf Coast University, web page
The Use of a Rubric for Assessment Purposes:
Mr. David Warlick, Instructional Technology Consultant
Rubric Construction Set:
This paper introduces the basic concepts of scoring rubrics and general procedures for rubric development. The focus is on holistic scoring of performance assessment items or tasks in standardized testing situations in mathematics and science.
A scoring rubric is the established criteria, including rules, principles, and illustrations, used in scoring responses to individual items and clusters of performance assessment items. It has three main functions: establishing objective criteria for judgment, setting clear expectations for teachers and students, and maintaining focus on the content and standards of student work.
There are two major ways to classify scoring rubrics. By depth of information provided, rubrics are either analytic or holistic; by breadth of application, they are either general or item specific.
Holistic scoring rates a student's work as a whole and produces a single score. It is preferred when a quick, consistent judgment is needed and when the skills being assessed are complex and interrelated; standardized assessments usually use holistic scoring. Analytic scoring judges each dimension of a performance item or task independently and produces both dimension scores and a total score. It provides more detailed information but takes more time, and it is mostly used for diagnostic purposes.
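The contrast between the two procedures can be sketched with a small example; the dimension names and point values below are hypothetical illustrations, not taken from any particular assessment.

```python
# Hypothetical example contrasting holistic and analytic scoring of one
# student response to a performance task (all names and values invented).

# Holistic scoring: a single judgment of the work as a whole.
holistic_score = 3  # e.g., on a 0-4 scale

# Analytic scoring: each dimension is judged independently, yielding
# dimension scores plus a total.
dimension_scores = {
    "problem_setup": 2,
    "computation": 3,
    "communication": 4,
}
total_score = sum(dimension_scores.values())

print(holistic_score)                  # one number for the whole response
print(dimension_scores, total_score)   # per-dimension detail plus a total
```

The extra detail in the analytic result is what makes it useful for diagnosis, at the cost of one judgment per dimension rather than one per response.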
A general scoring rubric applies to a family of similar performance tasks, such as presentations, while an item-specific rubric is designed for a particular item. Most standardized assessments in mathematics and science pair each performance assessment item with its own specific rubric.
A scoring rubric includes four important elements: the dimension, a definition and example of the dimension, a scale, and standards of excellence.
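One way to make the four elements concrete is to model them as a small data structure; the field names and the sample dimension below are illustrative assumptions, not the paper's terminology.

```python
from dataclasses import dataclass

@dataclass
class RubricDimension:
    """One dimension of a scoring rubric, carrying the four elements
    named above (field names are illustrative, not standard)."""
    dimension: str   # what is being judged
    definition: str  # definition and example of the dimension
    scale: list      # the points available, e.g., [0, 1, 2, 3]
    standards: dict  # score -> description of work at that level

# Hypothetical usage for a mathematics performance item:
reasoning = RubricDimension(
    dimension="Mathematical reasoning",
    definition="Justifies each step; e.g., explains why the formula applies",
    scale=[0, 1, 2, 3],
    standards={
        3: "Complete, correct justification",
        2: "Mostly correct with minor gaps",
        1: "Fragmentary justification",
        0: "No justification",
    },
)
print(max(reasoning.scale))  # 3
```

Writing a standards entry for every scale point forces the developer to state what excellence (and its absence) actually looks like at each level.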
A scoring rubric scale can be numerical, qualitative, or a combination of the two. A numerical scale is often used in mathematics and science performance items. The maximum possible points on a scale depend on factors such as the number of dimensions measured, cognitive stages, the weight of each dimension, and the developer's preference. Usually the total is between 2 and 6 points. The bottom line is to avoid having so many points that scorers find it hard to agree, or so few that the scale cannot distinguish between students.
A scoring rubric developer has three options: adopt, adapt, or start from scratch. If an existing rubric matches your item exactly, you may adopt it; otherwise, you may modify one to fit your needs. Building a rubric from scratch is the most difficult option, but for many standardized assessments in mathematics and science it is the only choice, because each item is new and measures a specific skill.
Scoring rubric development is an integrated process of writing, revising, piloting, and trying out the rubric until you are satisfied. It also requires teamwork. Generally, developing a scoring rubric involves nine steps.
The most common challenge in developing a rubric is writing it in clear and direct language. It is also useful to write positively and avoid unnecessary negative wording. Additionally, articulating the grading scheme in an easily understandable and clearly distinguishable way benefits both teachers and students.