
Claude Ostyn's Blog

Competency data standards and management, standards-based eLearning content development, SCORM tips and techniques, and whatever else seems relevant.

Tuesday, September 19, 2006

 

Scoring in SCORM

At the beginning, so to speak, the AICC specification defined only a raw score. That meant there was no way to know what a score actually meant. In practice, many people had interpreted the AICC specification on the basis of the examples it provided, and since all those examples were in the 0..100 range, people assumed that range was required. But that was not the actual specification: the allowed range was the non-negative range of signed short integers (0 to 0x7FFF = 32767). So if an LMS wanted to show the scores for multiple courses or content objects as percentages, there was no way to convert a raw score reliably, since 17 in one package might mean 17 out of 17 while 17 in another might mean 17 out of 150. This led to the addition of optional "min" and "max" elements in the AICC specification. Now, if you knew that min=0, max=17 and raw=17, you could show that as 100% with confidence.


Fast forward to SCORM 1.2. The cmi data model in SCORM 1.2 was based on the AICC data model, and min, max and raw came along with it. In practice, though, many systems and the people implementing them had assumed a "standard" scale of 0..100, and some leading LMSs do assume that this is the case. So it certainly does not hurt to use the same normalized scale, where min is always 0, max is always 100, and raw is relative to that scale. This is a best practice. By the way, SCORM 1.2 never got around to fixing the problem of a passing score for which no range is set. This is one more argument for applying the same best-practice 0..100 scale to the passing score; otherwise, an LMS setting a passing score of, say, 78 is utterly meaningless.
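As a sketch of this best practice, a SCO might report a normalized score through the SCORM 1.2 runtime API like this. The `api` handle is assumed to have been located already via the usual window/parent API discovery walk, and the helper name `reportNormalizedScore` is mine, not part of the specification:

```javascript
// Report a score on the best-practice 0..100 scale via the SCORM 1.2 API.
// `api` is the LMS-provided runtime object (the window.API discovery
// walk that finds it is not shown here).
function reportNormalizedScore(api, rawPercent) {
  // Clamp to the assumed 0..100 range so a stray value cannot
  // confuse an LMS that expects percentages.
  var value = Math.max(0, Math.min(100, rawPercent));
  api.LMSSetValue("cmi.core.score.min", "0");
  api.LMSSetValue("cmi.core.score.max", "100");
  api.LMSSetValue("cmi.core.score.raw", String(value));
  api.LMSCommit("");
}
```

All values are passed as strings, since the SCORM 1.2 API traffics only in strings.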


Scoring got cleaned up considerably in the IEEE 1484.11.1 standard, which was developed with participation from the AICC and SCORM teams, among others. SCORM 2004 implements the IEEE 1484.11.1 standard. There are still min, max and raw scores, but they are considered informative and optional. There is also a scaled score, and the scaled score is the only one used for formal reporting, for deciding whether a passing score has been achieved, and in the calculations involved in SCORM sequencing. The passing score is expressed on the same standard scale. In IEEE 1484.11.1 and SCORM 2004, the standard score range is -1 to 1, where in practice 0..1 maps exactly to the SCORM 1.2 assumed range of 0..100: a scaled score of 0.5 reliably represents 50%. Negative scores are allowed to represent "worse than zero", which sometimes makes sense, e.g. when failing in one SCO must count against succeeding in another SCO.


Going from SCORM 1.2 to SCORM 2004 is not very difficult, then: you can use whatever min, max and raw values fit your content object's design, and calculating a scaled score on the standard scale is usually easy. For example, if your max score is 26 and your raw score is 21 (as in 21 out of 26 questions answered correctly), the scaled score is simply 21/26.


What I would do in SCORM 1.2 with the example above is report min as 0, max as 100, and raw as 80.77.

This is calculated by scaling to 0..100 with the formulas:

cmi.core.score.min = 0

cmi.core.score.max = 100

cmi.core.score.raw = ((internal_raw - internal_min)* (100 / (internal_max - internal_min)))

where internal_raw = 21, internal_min = 0, and internal_max = 26.
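The scaling formula above can be wrapped in a small helper. This is only a sketch and the function name is mine; the arithmetic is exactly the formula given:

```javascript
// Scale an internal score to the best-practice 0..100 SCORM 1.2 range.
function toScorm12Raw(internalRaw, internalMin, internalMax) {
  var scaled = (internalRaw - internalMin) * (100 / (internalMax - internalMin));
  // Round to two decimals; cmi.core.score.raw is a decimal string
  // and two decimals is plenty of precision for a percentage.
  return Math.round(scaled * 100) / 100;
}
```

With the example values, `toScorm12Raw(21, 0, 26)` gives the 80.77 reported above.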


Going to SCORM 2004, the formula for the new cmi.score.scaled is even simpler:
cmi.score.scaled = ((internal_raw - internal_min)/(internal_max - internal_min))
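In code, the SCORM 2004 version is a one-liner (again a sketch; the function name is mine):

```javascript
// Compute cmi.score.scaled (the -1..1 standard range) from internal values.
// With internalMin = 0 this is just internalRaw / internalMax.
function toScorm2004Scaled(internalRaw, internalMin, internalMax) {
  return (internalRaw - internalMin) / (internalMax - internalMin);
}
```

For the running example, `toScorm2004Scaled(21, 0, 26)` yields 21/26, and a half-right result such as 13 out of 26 yields exactly 0.5, i.e. 50%.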

Monday, September 04, 2006

 

Best practices for Commit in SCORM

I added a very short document on best practices for the use of Commit (a.k.a. LMSCommit) in SCORM content to the resource docs at http://www.ostyn.com/resscormtech.htm#scripting . Hopefully this will help clarify things a little bit.

© 2011 Claude Ostyn.

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 2.5 License.