III. TWO METHODS FOR ASSESSING TEST ITEM QUALITY
This section of the booklet presents two methods for collecting feedback on the quality of your test items: self-review checklists and student evaluation of test item quality. Information gathered with either method can help you identify strengths and weaknesses in your item writing.
CHECKLIST FOR EVALUATING TEST ITEMS
EVALUATE YOUR TEST ITEMS BY CHECKING THE SUGGESTIONS WHICH YOU FEEL YOU HAVE FOLLOWED.
Multiple-Choice Test Items
____ When possible, stated the stem as a direct question rather than as an incomplete statement.
____ Presented a definite, explicit and singular question or problem in the stem.
____ Eliminated excessive verbiage or irrelevant information from the stem.
____ Included in the stem any word(s) that might otherwise have been repeated in each alternative.
____ Used negatively stated stems sparingly. When used, underlined and/or capitalized the negative word(s).
____ Made all alternatives plausible and attractive to the less knowledgeable or skillful student.
____ Made the alternatives grammatically parallel with each other, and consistent with the stem.
____ Made the alternatives mutually exclusive.
____ When possible, presented alternatives in some logical order (e.g., chronologically, most to least).
____ Made sure there was only one correct or best response per item.
____ Made alternatives approximately equal in length.
____ Avoided irrelevant clues such as grammatical structure, well-known verbal associations, or connections between stem and answer.
____ Used at least four alternatives for each item.
____ Randomly distributed the correct response among the alternative positions, so that each position (a, b, c, d, e) held the correct answer in approximately equal proportion across the test (see the sketch following this checklist).
____ Used the alternatives "none of the above" and "all of the above" sparingly. When used, such alternatives were occasionally the correct response.
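The position-balancing suggestion above can be checked, or the answer key generated, mechanically. The following Python sketch is illustrative only; the function name and the 40-item, five-alternative example are assumptions, not part of the checklist.

    import random
    from collections import Counter

    def balanced_key(num_items, positions="abcde"):
        # Assign each position the correct response in approximately
        # equal proportion, then shuffle so the sequence is random.
        key = [positions[i % len(positions)] for i in range(num_items)]
        random.shuffle(key)
        return key

    key = balanced_key(40)   # hypothetical 40-item test
    print(Counter(key))      # 8 items keyed to each of a-e (40/5)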
True-False Test Items
____ Based true-false items upon statements that are absolutely true or false, without qualifications or exceptions.
____ Expressed the item statement as simply and as clearly as possible.
____ Expressed a single idea in each test item.
____ Included enough background information and qualifications so that the ability to respond correctly did not depend on some special, uncommon knowledge.
____ Avoided lifting statements from the text, lecture or other materials.
____ Avoided using negatively stated item statements.
____ Avoided the use of unfamiliar language.
____ Avoided the use of specific determiners such as "all," "always," "none," and "never," and qualifying determiners such as "usually," "sometimes," and "often."
____ Used more false items than true items, but not more than 15% additional false items (a worked example follows this checklist).
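The 15% rule above can be read as "false items may exceed true items by no more than 15 percent of the true items." Under that reading, which is an assumption since the booklet does not spell it out, the acceptable splits can be enumerated:

    def allowable_splits(total):
        # (true, false) splits with more false than true items, where
        # false exceeds true by at most 15% of true -- one reading of
        # the 15% rule, assumed here for illustration.
        return [(t, total - t) for t in range(total + 1)
                if t < total - t <= 1.15 * t]

    print(allowable_splits(40))   # [(19, 21)] for a 40-item test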
Matching Test Items
____ Included directions which clearly stated the basis for matching the stimuli with the responses.
____ Explained whether or not a response could be used more than once, and indicated where to write the answer.
____ Used only homogeneous material.
____ When possible, arranged the list of responses in some systematic order (e.g., chronologically, alphabetically); see the sketch following this checklist.
____ Avoided grammatical or other clues to the correct response.
____ Kept items brief (limited the list of stimuli to under 10).
____ Included more responses than stimuli.
____ When possible, reduced reading time by including only short phrases or single words in the response list.
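Several of these suggestions (alphabetized responses, more responses than stimuli, short entries in the response list) can be enforced when assembling the two lists. A minimal Python sketch follows; the function name and the invention/inventor data are hypothetical.

    import random

    def build_matching_lists(pairs, distractors, max_stimuli=10):
        # Keep the stimulus list brief, and build an alphabetized
        # response list that contains more responses than stimuli.
        stimuli = list(pairs)[:max_stimuli]
        random.shuffle(stimuli)
        responses = sorted({pairs[s] for s in stimuli} | set(distractors))
        return stimuli, responses

    pairs = {"telephone": "Bell", "phonograph": "Edison",
             "radio telegraphy": "Marconi", "printing press": "Gutenberg"}
    stimuli, responses = build_matching_lists(pairs, ["Tesla", "Whitney"])
    print(stimuli)      # 4 stimuli, randomly ordered
    print(responses)    # 6 responses, alphabetized

Note that the hypothetical lists also keep the material homogeneous (all inventions and inventors), as the checklist suggests.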
Completion Test Items
____ Omitted only significant words from the statement.
____ Did not omit so many words from the statement that the intended meaning was lost.
____ Avoided grammatical or other clues to the correct response.
____ Included only one correct response per item.
____ Made the blanks of equal length (see the sketch following this checklist).
____ When possible, deleted words at the end of the statement, after the student had been presented with a clearly defined problem.
____ Avoided lifting statements directly from the text, lecture or other sources.
____ Limited the required response to a single word or phrase.
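The equal-length-blank suggestion is easy to mechanize: if every blank is drawn to the same fixed width, blank length can never hint at the answer. A minimal Python sketch, with a hypothetical item:

    def make_completion_item(statement, answer, blank_width=15):
        # Replace the significant word with a fixed-width blank so the
        # blank's length gives no clue about the answer's length.
        assert answer in statement, "answer must appear in the statement"
        return statement.replace(answer, "_" * blank_width, 1)

    print(make_completion_item(
        "The instrument used to measure atmospheric pressure is the barometer.",
        "barometer"))

The example also deletes the word at the end of the statement, after a clearly defined problem has been presented.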
Essay Test Items
____ Prepared items that elicited the type of behavior you wanted to measure.
____ Phrased each item so that the student's task was clearly indicated.
____ Indicated for each item a point value or weight and an estimated time limit for answering.
____ Asked questions that elicited responses on which experts could agree that one answer is better than others.
____ Avoided giving the student a choice among optional items.
____ Administered several short-answer items rather than one or two extended-response items.
Grading Essay Test Items
____ Selected an appropriate grading model.
____ Avoided letting factors irrelevant to the learning outcomes being measured (e.g., handwriting, spelling, neatness) affect your grading.
____ Read and graded all class answers to one item before going on to the next item.
____ Read and graded the answers without looking at the students' names, to avoid possible preferential treatment (a sketch of this workflow follows the checklist).
____ Occasionally shuffled papers during the reading of answers.
____ When possible, asked another instructor to read and grade your students' responses.
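When essay responses are collected electronically, the anonymity, shuffling, and item-by-item suggestions describe a workflow that can be scripted. The Python sketch below is one possible arrangement; the data layout and function name are assumptions, not the booklet's procedure.

    import random

    def grading_order(papers, num_items):
        # Grade all answers to one item before moving to the next,
        # hide student names behind anonymous ids, and reshuffle the
        # papers between items.
        ids = {name: f"paper-{i:03d}" for i, name in enumerate(papers)}
        for item in range(1, num_items + 1):
            names = list(papers)
            random.shuffle(names)
            for name in names:
                yield ids[name], item, papers[name][item]

    # Hypothetical responses: student name -> {item number: answer text}.
    papers = {"Avery": {1: "...", 2: "..."},
              "Blake": {1: "...", 2: "..."}}
    for anon_id, item, answer in grading_order(papers, num_items=2):
        print(anon_id, item, answer)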
Problem Solving Test Items
____ Clearly identified and explained the problem to the student.
____ Provided directions which clearly informed the student of the type of response called for.
____ Stated in the directions whether or not the student must show work procedures for full or partial credit.
____ Clearly separated item parts and indicated their point values.
____ Used figures, conditions and situations which created a realistic problem.
____ Asked questions that elicited responses on which experts could agree that one solution and one or more work procedures are better than others.
____ Worked through each problem before classroom administration.
Performance Test Items
____ Prepared items that elicited the type of behavior you wanted to measure.
____ Clearly identified and explained the simulated situation to the student.
____ Made the simulated situation as "life-like" as possible.
____ Provided directions which clearly informed the students of the type of response called for.
____ When appropriate, clearly stated time and activity limitations in the directions.
____ Adequately trained the observer(s)/scorer(s) to ensure that they were fair in scoring the appropriate behaviors.
STUDENT EVALUATION OF TEST ITEM QUALITY
USING ICES QUESTIONNAIRE ITEMS TO ASSESS YOUR TEST ITEM QUALITY
The following ICES (Instructor and Course Evaluation System) questionnaire items can be used to assess the quality of your test items; each is listed with its original ICES catalogue number. You are encouraged to include one or more of these items on your ICES evaluation form to collect student opinions of the quality of your item writing.
Each item is answered on a rating scale whose endpoints are shown in parentheses.

102--How would you rate the instructor's examination questions? (Excellent ... Poor)
103--How well did examination questions reflect content and emphasis of the course? (Well related ... Poorly related)
109--Were exams, papers, reports returned with errors explained or personal comments? (Almost always ... Almost never)
114--The exams reflected important points in the reading assignments. (Strongly agree ... Strongly disagree)
115--Were the instructor's test questions thought provoking? (Definitely yes ... Definitely no)
116--Did the exams challenge you to do original thinking? (Yes, very challenging ... No, not challenging)
117--Examinations mainly tested trivia. (Strongly agree ... Strongly disagree)
118--Were there "trick" or trite questions on tests? (Lots of them ... Few if any)
119--Were exam questions worded clearly? (Yes, very clear ... No, very unclear)
121--How was the length of exams for the time allotted? (Too long ... Too short)
122--How difficult were the examinations? (Too difficult ... Too easy)
123--I found I could score reasonably well on exams by just cramming. (Strongly agree ... Strongly disagree)
125--Were exams adequately discussed upon return? (Yes, adequately ... No, not enough)

IV. ASSISTANCE OFFERED BY THE CENTER FOR INNOVATION IN TEACHING AND LEARNING (CITL)
The information in this booklet is intended for self-instruction. However, CITL staff members will consult with faculty who wish to analyze and improve their test item writing. The staff can also consult with faculty about other instructional problems. The Measurement and Evaluation Division of CITL also publishes a semi-annual newsletter, Measurement and Evaluation Q & A, which discusses various classroom testing and measurement issues. Instructors who wish to receive the newsletter or to request CITL assistance can call the Measurement and Evaluation Division at 333-3490.
V. REFERENCES FOR FURTHER READING
Ebel, Robert L.
Measuring educational achievement. Englewood Cliffs, New Jersey: Prentice-Hall, 1965, Chapters 4-6.
Ebel, Robert L.
Essentials of educational measurement. Englewood Cliffs, New Jersey: Prentice-Hall, 1972, Chapters 5-8.
Gronlund, N. E.
Measurement and evaluation in teaching. New York: Macmillan Publishing Co., 1976, Chapters 6-9.
Mehrens, W. A. & Lehmann, I. J.
Measurement and evaluation in education and psychology. New York: Holt, Rinehart & Winston, Inc., 1973, Chapters 7-10.
Nelson, C. H.
Measurement and evaluation in the classroom. New York: Macmillan Publishing Co., 1970, Chapters 5-8. Especially useful for science instruction.
Payne, David A.
The assessment of learning. Lexington, Mass.: D.C. Heath and Co., 1974, Chapters 4-7.
Scannell, D. P. & Tracy, D. B.
Testing and measurement in the classroom. New York: Houghton-Mifflin Co., 1975, Chapters 4-6.
Thorndike, R. L. (Ed.).
Educational measurement (2nd ed.). Washington, D.C.: American Council on Education, 1971, Chapter 9 (Performance testing) and Chapter 10 (Essay exams).