Psy 712
Spring 2013
Project 2 Overview
The purpose of Project 2 is to write a multiple-choice test.
Lab 2-1: Drafting Your Multiple-choice Test (due Thursday, Feb 14, 3pm; 3 marks)
Lab 2-2: Editing Your Multiple-choice Test (due Thursday, Feb 21, 3pm; 3 marks)
Lab 2-3: Finalizing Your Multiple-choice Test (due Thursday, Feb 28, 3pm; 10 marks)
An award will be given for the Best Multiple-choice Test
Project 2
Lab 2-1: Drafting Your Multiple-Choice Test
3 Marks
The purpose of this assignment is to write a multiple-choice test. The test will have 6 – 10 items that cover a
range of learning goals. The test will cover one of the chapters that are included on our first quiz.
Background Reading
1. Read Chapter 15: Principles of Objective Test Development, from Thorndike, R.M. (1997). Measurement
and evaluation in psychology and education, sixth edition. Upper Saddle River, NJ: Prentice-Hall. This is
available through e-reserves.
2. Read the Guidelines for Writing Items When There IS a Correct Answer, which was handed out during the
first class, and which is reproduced below.
Drafting your Multiple-choice Test
To create your multiple-choice test, follow these steps:
1) Content Domain
You will create a multiple-choice test that assesses knowledge of one of the chapters in your text
(Murphy and Davidshofer). You may select chapter 4, 5, or 11.
Write down the chapter number and name.
2) Divide the chapter into 4-10 sub-domains. This is the structure of the content domain. You may divide
the chapter based upon the chapter sub-headings, the main topics as you conceptualize them, the type of
knowledge (e.g., definition, concepts, theories, calculations), or based upon some other criteria.
3) Brain-Storming
Brain-storm as many multiple-choice items as you can for each sub-domain. Each item should have at
least 3 response options, and should ideally have 4 or 5. If you have 8 or more sub-domains, try for two
or three items each. If you have only a few sub-domains write 4 or 5 items for each. Ensure you have
at least two items for every sub-domain. Write the items underneath each sub-domain label.
For each item, put an asterisk next to the correct answer.
Try to create items for each of the categories in Bloom’s taxonomy for the cognitive domain:
knowledge, comprehension, application, analysis, synthesis, and evaluation. As you write each item,
label it according to Bloom’s category. For example:
Which measure of central tendency is calculated as the arithmetic average of all the numbers?
*a) mean
b) median
c) mode
d) none of the above
If one person drops out of the study, which measure of central tendency will almost certainly change?
*a) mean
b) median
c) mode
d) none of the above
What is the mean of 1, 1, 2, 5?
a) 1
b) 1.5
*c) 2.25
d) 4
In a dataset with 1000 values, the mean is 100 and the median is 150. What can you say about
the shape of the distribution?
a) it is positively skewed
*b) it is negatively skewed
c) it is normally distributed
d) it is bimodal
When data are severely skewed, which measure of central tendency gives the most accurate
portrayal of the “typical” score?
a) mean
*b) median
c) all measures of central tendency are equally representative of the “typical” score
d) when data are severely skewed, no measure of central tendency reflects the typical score
Brain-storm at least 15 items. If you don’t have at least 15 items, keep working at it until you get at
least 15. It is okay if some of the items overlap in content with some of the other items. If there are
three people on your team, this is easier if you first agree on the sub-domains and then each person
brainstorms as many items as they can for each sub-domain.
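The mean/median facts that the sample items above rely on are easy to verify. As a small illustrative sketch (Python, standard library only; not part of the assignment), the following checks the arithmetic behind the calculation items and the mean-versus-median relationship in a skewed dataset:

```python
import statistics

# The mean is the arithmetic average; the median is the middle value when sorted.
data = [1, 1, 2, 5]
print(statistics.mean(data))    # 2.25
print(statistics.median(data))  # 1.5 (average of the two middle values, 1 and 2)

# When extreme low scores pull the mean below the median,
# the distribution is negatively skewed:
skewed = [1, 10, 10, 10, 10]    # one extreme low score
print(statistics.mean(skewed) < statistics.median(skewed))  # True
```

A quick check like this is also a convenient way to verify the scoring key for any calculation items before handing out a quiz.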
4) Item Critique – Item Writing
Examine each item to check that it follows the guidelines for writing items that were discussed in
the Thorndike chapter and are listed below. Often, it is hard to know if an item that you have drafted is
clear, uses appropriate vocabulary, isn’t biased, etc. It is easier to spot potential problems with someone
else’s items. Therefore, I recommend that each team member ask the other team members for feedback
on the items that they wrote.
Sort your items into two categories: items that do follow the guidelines for item-writing given below
and items that do not. In order to sort your items, you should copy and paste all of the items from Step
3, and then sort them into these two categories for Step 4. Dr. Barchard needs you to submit your
work for each of the steps in this assignment so she can see the progression of your ideas. Therefore,
you need to provide separate answers to Step 3 and Step 4.
When people first start writing items, they usually find that the majority of their draft items violate
the guidelines. Try revising some of these items. Leave the poor items in the “violates guidelines”
section but add the revised items to the “follows guidelines” section.
5) Item Critique – Item Content
Now go through your items a second time to find the ones that measure the most important content.
Check whether you have items for each of the categories in Bloom’s taxonomy: knowledge,
comprehension, application, analysis, synthesis, and evaluation.
When you have finished this sorting, check whether you have at least 6 items that meet the guidelines
below, that measure important content, and that cover a range of learning outcomes (ensure that at least
3 items cover analysis, synthesis, and evaluation). If you don’t, you will need to write more items. You
can try two strategies. One strategy is to brain-storm new items from scratch. Another strategy is to
rephrase items that do not follow the guidelines but which do seem to have good content. Once you’ve
written new items, look them over again to check that they follow the guidelines. Continue this process
until you have at least 6 items that meet the guidelines and measure important content, with at least 3
items covering the higher level learning objectives.
You may find it difficult to create 6 well-written multiple-choice items from one chapter. It is
indeed difficult. You may find that few of your initial items measure the higher cognitive skills. It is
indeed harder to write these items. However, as long as you start work on your tests early, you should
be able to create good tests for this assignment and for the classes you teach in the future.
6) Item Selection
Select 6 – 10 items that you think measure important content and are well written. Try to include items
from every cognitive category. Do not include two items that assess the same idea. To indicate which
items you have selected, you can put a * next to the item, or you can create a separate list which
includes just those 6-10 items you have selected.
7) Feedback Form
Create a single page (it may be double sided) which contains
a) the names of your team members
b) the number and name of the chapter
c) the best 6 – 10 items.
Organize the items by the sub-domains. If there are some sub-domains where you have no
items, the space under those sub-domains will be blank.
For each item, indicate what cognitive category you were trying to assess.
Double space your items.
During class, you will receive feedback on your items from other people in this course.
What to Hand In
Create a single file that contains your answers to questions 1 through 7. Number your answers, to help
Dr. Barchard follow the development of your measure. Email Dr. Barchard this file as an attachment.
Print enough copies of your feedback form for all the other students in the class, and bring these to class
next week. You will receive feedback on your items from other people in class.
Guidelines for Writing Items
When There IS a Correct Answer
Source: The principles below (but not the examples given on the lines below the principles) were taken from Thorndike, R.M.
(1997). Measurement and Evaluation in Psychology and Education, sixth edition. Upper Saddle River, NJ: Prentice-Hall.
Read Chapter 15, Principles of objective test development, for explanations and examples of each principle.
General Principles
1. Keep the reading difficulty and vocabulary level of the test item as simple as possible.
2. Be sure each item has a correct or best answer on which experts would agree.
3. Be sure each item deals with an important aspect of the content area, and not with trivia.
4. Be sure each item is independent. Do not require students to get one item right in order to get another one right.
5. Avoid the use of trick questions. In trick questions, the item appears to be about one thing, but to answer it correctly, students
need to focus on an entirely different point.
6. Be sure the problem posed is clear and unambiguous.
Writing True-False Items
1. Ensure that the item is unequivocally true or false.
2. Avoid the use of specific determiners (all, never, no, always) or qualified statements (usually, sometimes, under certain
conditions, may be).
3. Avoid ambiguous and indefinite terms of degree or amount (frequently, greatly, to a considerable degree, in most cases).
4. Avoid the use of negative statements, and particularly double negatives.
5. Limit true-false statements to a single idea.
6. Make true and false statements approximately equal in length.
7. Include approximately the same number of true statements as false ones.
Writing Multiple-Choice Items
1. Be sure the stem of the item clearly formulates a problem.
2. Include as much of the item as possible in the stem, and keep options as short as possible.
3. Include in the stem only the material needed to make the problem clear and specific.
4. Use the negative only sparingly in an item.
5. Use novel material in formulating problems that measure understanding or ability to apply principles.
6. Be sure there is one and only one correct or clearly best answer.
When we write an item, we usually write the stem and correct answer together, and then write the distractors. But that
makes us think about the item stem in a particular way. Carefully consider if there is another way of thinking about
the item stem so that one of the distractors might also be true.
7. Be sure wrong answers are plausible.
8. Be sure no unintentional clues to the correct answer are given.
e.g., making the correct answer longer, repeating a word in the stem and the correct option, grammatical agreement
between stem and correct option (a versus an, plural agreement with verb), giving clues to one item in another item,
using a consistent pattern of correct responses such as T F T F or A B C D A B C D.
9. Use the option “none of these” or “none of the above” only when the keyed answer can be classified unequivocally as
correct or incorrect.
10. Avoid the use of “all of the above” in the multiple-choice item.
Writing Matching Items
1. Keep the set of statements in a single matching exercise homogeneous.
2. Keep the set of items relatively short.
3. Have the students choose answers from the column with the shorter statements.
4. Use a heading for each column that accurately describes its content.
5. Have more answer choices than the number of entries to be matched, unless answer choices can be used more than once.
6. Arrange the answer choices in a logical order.
7. Specify in the directions both the basis for matching and whether answer choices can be used more than once.
Preparing the Objective Test
1. Prepare more items than you will need.
2. Proofread the items.
3. Arrange items on the test so that they are easy to read.
4. Plan the layout of the test so that a separate answer sheet can be used to record answers.
5. Group items of the same format (true-false, multiple-choice, or matching) together.
6. Within item type, group together items dealing with the same content.
7. Arrange items so that difficulty progresses from easy to hard.
8. Write a set of specific directions for each item type.
Additional Item-Writing Principles from Kim Barchard
1. Avoid items that ask more than one thing at once, often identifiable by the use of the words “and” or “or”.
The person won’t know how to answer if they think that one part is true and the other part is false.
2. Avoid very long items.
Long items are likely to be asking more than one thing, and are likely to confuse the test taker. Sometimes we have short
questions about long stimuli. The stimuli can be complex, like a paragraph or a book, but the item itself should be short.
3. Avoid dependencies, such as questions that contain “if” and “because” and "when".
If part of the statement is true but the other part isn’t, or if both parts are true but the dependency is false, then the test
taker may not know how to answer. Only use this type of item if you really NEED the dependency. Otherwise, avoid it.
4. Avoid items that are likely to be answered correctly by everyone or no one. Items that are too easy or too hard do not tell
you anything about the people taking the test.
Usually, we make Relative Judgments: we want to compare this person to other people. If so, then if we give an item
that everyone gets the same score on, we haven't learned how people compare. Sometimes, we make Absolute
Judgments. In that situation, we want to compare each person to a standard. Examples: aircraft mechanic, surgeon,
brain damage. In that case, it's okay if everyone gets an item right or wrong, because you wanted to compare each
person to the standard, not to each other.
What is the mean of 2 and 4?
Write down the first 10 digits of pi.
Do you know your birthday?
5. Avoid biased items.
Intelligence test: Which picture is ugly? Pictures of women from different ethnic groups.
Canadian university-level statistics test using the word “Teeter totter”. This word may be unfamiliar to people who
grew up in another country.
6. Start preparing the test well in advance. I start preparing quizzes and exams at least one week in advance.
7. After you have drafted the entire test, ignore it for one or two days and then proof-read it.
8. Prepare the scoring key before handing out the quiz. This will help you identify problems with the items.
9. Take the test yourself and score your own answers. This identifies errors in both the items and the scoring key.
10. Assume that students will take 5 – 10 times as long as you do to complete the test.
11. Design your test so that there are enough items to ensure adequate internal consistency reliability (and so that the total test
grade does not change dramatically if a student misses one question).
12. In classroom situations, design your test so that all students have enough time to attempt every item. Students are more
likely to think that the test fairly assessed their knowledge.
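A common estimate of internal consistency is Cronbach's alpha, which grows with the number of items (other things being equal). As an illustrative sketch only (the item scores below are invented, and this is not a required part of the assignment), alpha can be computed directly from a matrix of item scores:

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for scores: rows = test takers, columns = items."""
    k = len(scores[0])                      # number of items
    def var(xs):                            # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical 0/1 (wrong/right) scores for five test takers on four items:
responses = [
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
]
print(round(cronbach_alpha(responses), 2))  # 0.74
```

With only four items even strongly related items give a modest alpha, which is one reason to include enough items on the test.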
Additional Item-Writing Principles from Others
For multiple choice questions,
1. Make sure the item stem asks a complete question. Students should be able to give the correct answer to an open-ended
question that includes nothing but the item stem (Linda Suskie, 2009).
2. Make sure that the answers/distractors use proper grammar and punctuation (Galton)
3. Use parallel sentence structure for all answers (Galton). If this is not possible, create pairs that have parallel structure, so
that each answer has one other answer with the same structure (Barchard)
4. For items that use fill-in-the-blanks, use five underlines (_____) so that the blank gives no clue to the answer. Make sure
that the syntax of all responses fit in the blank (upper/lower case, punctuation, grammar, etc.) (Galton).
5. Make all options as close in length as possible, and then order the responses from shortest to longest (Galton). Do not
make the correct answer longer than the rest. However, you may want to make one of the distractors longer than the rest
(Linda Suskie, 2009).
6. “Avoid repeating words between the stem and the correct response. Test-wise students will pick up this clue. (On the other
hand, verbal associations between the stem and a distracter can create an effective distracter.)” (Linda Suskie, 2009).
7. If the answers are numbers, order them from smallest to largest and use logical multiples (Galton).
8. “Line up responses vertically rather than horizontally. It’s much easier and less confusing to scan down a column than
across a line to find the correct answer. If you are using a paper test and your options are so short that this seems to waste
paper, arrange the test in two columns” (Linda Suskie, 2009).
Suskie, L. (2009). Assessing student learning: A common sense guide. San Francisco, CA: Jossey-Bass.