International Certification & Reciprocity Consortium
Scoring of Exams
Receiving Scores
All scores are reported to the designated IC&RC Member Board for distribution; IC&RC does not have the authority to release scores. This process takes approximately four to six weeks for paper-and-pencil exams and two to three weeks for computer-based testing (CBT) exams. Preliminary CBT scores are provided to candidates immediately following completion of the exam. Candidates seeking their official scores should contact their IC&RC Member Board. Contact information for all IC&RC Member Boards is available on the IC&RC website.
Reporting Scores
Scores are reported on a scale ranging from 200 to 800, and the minimum scaled passing score is 500 for all examinations. Candidates are provided with official score letters that report a final scaled score and the percentage of items answered correctly in each content domain.
Scaled Scores
Scaled scores are created when the number of questions answered correctly is mathematically transformed so that the passing score equals 500 on a scale starting at 200 and ending at 800.
This transformation is much like converting inches to centimeters. For example, a 10-inch ribbon is also 25.4 centimeters long. The length of the ribbon has not been changed, only the units of measurement used to describe it.
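IC&RC does not publish the exact transformation, but a linear mapping illustrates the idea. In the sketch below, the item count, raw cut score, and candidate score are all hypothetical.

```python
# Hypothetical sketch of a linear raw-to-scaled transformation.
# IC&RC does not publish its formula; every number here is invented.

def scale_score(raw: int, raw_cut: int, raw_max: int) -> float:
    """Map a raw score onto a 200-800 scale with the cut score at 500."""
    # Slope chosen so the raw cut lands at 500 and a perfect raw
    # score lands at 800; results are clipped to the 200-800 range.
    slope = (800 - 500) / (raw_max - raw_cut)
    return max(200.0, min(800.0, 500 + slope * (raw - raw_cut)))

# On a hypothetical 125-item form with a raw cut of 90, answering
# 95 items correctly scales to roughly 543.
print(scale_score(95, raw_cut=90, raw_max=125))  # 542.857...
```

Just as with the ribbon, the transformation changes the units, not the underlying performance.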
The use of scaled scores allows for direct comparison of exam scores from one form of the examination to another. For security purposes, IC&RC keeps multiple forms of each examination in circulation at all times, and candidates are randomly assigned a form. Scaled scores allow IC&RC to report scores for every form of an examination on the same 200-800 scale, with 500 as the passing score.
The use of scaled scores does not influence whether a candidate passes or fails an examination. Passing an IC&RC examination always depends on achieving the minimum passing score as determined through the process described below.
Determining a Passing Score
Passing scores for IC&RC exams are not based on a percentage of questions answered correctly. Instead, IC&RC uses a Modified Angoff Study to determine a cut score for each examination. The Angoff method uses a systematic and documented approach to establish accurate, reliable, and legally defensible pass/fail scores.
Cut scores are determined by a panel of Subject Matter Experts (SMEs) who work in the field and have demonstrated expertise in it. The SMEs work with IC&RC's professional testing company to discuss the specific knowledge, skills, and abilities needed to demonstrate minimum competence.
The SMEs evaluate and rate the difficulty of each question. These ratings are combined to determine the final cut score for the exam, which is subsequently transformed to an equivalent scaled score. All examination questions are weighted equally.
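As a rough illustration of how Angoff ratings combine, the sketch below averages hypothetical SME ratings per item and sums those averages into a raw cut score. The panel size, items, and ratings are invented.

```python
# Hypothetical Modified Angoff calculation. Each SME estimates the
# probability that a minimally competent candidate answers an item
# correctly; per-item averages are summed into a raw cut score.

ratings = {
    "item_1": [0.70, 0.65, 0.75],  # one rating per SME
    "item_2": [0.55, 0.60, 0.50],
    "item_3": [0.85, 0.80, 0.90],
}

item_means = {item: sum(r) / len(r) for item, r in ratings.items()}
raw_cut = sum(item_means.values())
print(round(raw_cut, 2))  # 2.1 -- the cut for this tiny 3-item exam
```

On a real exam the same arithmetic runs over every weighted item, and the resulting raw cut is then mapped to the scaled score of 500.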
Use of Multiple Exam Forms
For every IC&RC exam, there are multiple forms of the same examination. Each form uses different questions but tests the same content. Examination forms are updated and replaced on a continuous basis to ensure the security and integrity of the examination.
The use of multiple forms of the same exam does not make one form easier or more difficult to pass than another. IC&RC's testing company uses statistical data on each test question to evaluate the difficulty of each examination form. The examinations are constructed to minimize variations in difficulty from one form to another, and the passing score for each form is adjusted to account for any remaining differences in difficulty.
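That adjustment (equating) is performed by the testing company using detailed item statistics. The sketch below only illustrates the principle with invented numbers, using a form's mean proportion-correct as a crude stand-in for those statistics.

```python
# Hypothetical illustration of adjusting a raw passing score for form
# difficulty; real equating uses item-level statistics, not one average.

BASE_CUT = 90            # raw cut score on a hypothetical reference form
BASE_DIFFICULTY = 0.72   # mean proportion-correct on that form
N_ITEMS = 125            # hypothetical number of weighted items

def adjusted_cut(form_difficulty: float) -> int:
    """Shift the raw cut so a scaled 500 means the same standard on
    every form."""
    shift = (form_difficulty - BASE_DIFFICULTY) * N_ITEMS
    return round(BASE_CUT + shift)

print(adjusted_cut(0.704))  # harder form  -> lower raw cut (88)
print(adjusted_cut(0.736))  # easier form  -> higher raw cut (92)
```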
Use of Pretesting Items
On each IC&RC exam, there are unweighted items, also called pretest items, that do not influence final scores or pass/fail status. Pretest items are not identified on exams and appear randomly on all exam forms. IC&RC uses pretest items to pilot newly written questions and verify item quality before they are added to an examination as weighted questions.
Pretesting verifies that items are relevant to competency and measure proficiency, and it helps ensure the quality of future examinations. Because pretest items do not influence a candidate's score, candidates are protected against poorly performing items.
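In scoring terms, pretest items are simply excluded from the raw score, so a right or wrong answer on them changes nothing. The item names and pretest flag below are hypothetical.

```python
# Sketch of scoring that ignores pretest (unweighted) items.

responses = {"q1": True, "q2": False, "q3": True, "q4": True}
pretest_items = {"q3"}  # unidentified to the candidate; excluded here

raw_score = sum(
    correct
    for item, correct in responses.items()
    if item not in pretest_items
)
print(raw_score)  # 2 of 3 weighted items; q3 never counts either way
```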
Failing Scores
Candidates who do not pass their examination are provided with the percentage of correctly answered items in each content domain to help focus future study efforts. For security reasons, candidates are not provided with their raw score (the total number of questions answered correctly), the total percentage of questions answered correctly, or a copy of the examination to review.
It is important to note that because the number of questions within each domain varies, adding or averaging the percentage-correct scores across domains will NOT accurately reflect a candidate's overall examination score.
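The hypothetical numbers below make this concrete: a simple average weights every domain equally, while the true overall percentage weights each domain by its item count. The domain names, sizes, and scores are invented.

```python
# Why averaging domain percentages misleads (hypothetical data).

domains = {                  # (items in domain, items answered correctly)
    "Screening": (20, 18),   # 90% correct
    "Treatment": (60, 36),   # 60% correct
    "Ethics":    (20, 16),   # 80% correct
}

naive_average = sum(c / n for n, c in domains.values()) / len(domains)
true_overall = (sum(c for _, c in domains.values())
                / sum(n for n, _ in domains.values()))

print(f"{naive_average:.0%}")  # 77% -- the misleading simple average
print(f"{true_overall:.0%}")   # 70% -- the actual overall percentage
```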