Draft—OpenBook Testing 1
Family Practice Clerkship Evaluation: Should We Use Open Book Tests
India L. Broyles, EdD
University of New England
College of Osteopathic Medicine
Peggy R. Cyr, MD
Maine Medical Center
Department of Family Practice
Neil Korsen, MD
Maine Medical Center
Department of Family Practice
A paper presented at the annual conference on PreDoctoral Education
sponsored by the Society for Teachers in Family Medicine
http://faculty.une.edu/com/ibroyles/documents/STFmPreDoc.doc
Austin, TX
January 31, 2003
Abstract
Evaluation studies in a variety of settings have shown that the conventional closed-book tests demonstrate
“only what students can do with whatever they have been able to memorize” (Feller, 1994, p. 235). In contrast,
open-book testing has been associated with “education for the future” (Feller, 1994, p.235). An examination of this
testing approach in a family medicine clerkship seeks to determine if this method more closely mirrors the discipline
of family medicine where practitioners refer daily to written resource materials in order to make clinical decisions
without compromising the learning and assessment process. The desired outcomes of the intervention were
observed: reducing the anxiety of students, wider reading of the text, learning the structure of the textbook as a
learning resource, and deeper understanding of concepts and principles rather than time spent on memorization. The
students appeared to approach the textbook and therefore, perhaps, the body of knowledge as a whole with the
orientation of a generalist. The MMC clerkship coordinator is recommending implementation of the open-book
approach to the Family Practice clerkship at all sites. This recommendation will also support advising students on
the preparation for an open-book test and on tactics for the best use of the textbook during the test.
Family Practice Clerkship Evaluation: Should We Use Open Book Testing?
Purpose:
Evaluation in medical education is an important and complex task with multiple
purposes. Izard (1992) describes the role of assessment (as cited in Theophilides and Dionysiou,
1994):
Assessment has the function of providing valid evidence of learning achievement in
order to inform students, to facilitate provision of further learning or to certify that a
required level has been reached. Teachers are able to develop and improve the
educational process if they have identified the strengths of their students and know which
areas of study require attention (p. 11).
In medical school, the 3rd and 4th year clerkships are experiential learning venues of a short
duration, usually 6-8 weeks. Because the focus is on the integration of basic medical science
into hospital and ambulatory settings, the evaluation of students continues the use of objective,
multiple-choice tests in combination with more subjective measures such as evaluation checklists
from preceptors, graded case presentations, and OSCEs. These latter instruments are used in an
effort to match assessment tools to the type of work evaluation and self-directed learning
expected later in professional practice. While the clerkship final exam remains the best predictor
of success in residency (Campos, et al, 1999), its correlation is low, probably because medical
students are a highly-uniform cohort in academic abilities (p. 93). The multiple-choice tests are
structured to assess a breadth of knowledge around topics and cases that may not be seen during
the short clerkship and are often based on a primary textbook. Two of the common exams used
in Family Medicine Clerkships are the National Board of Medical Examiners’ Shelf Exam and
the exam based on The Essentials of Family Medicine textbook by Sloan, Slatt, Curtis, and
Ebell. Trends in Family Medicine clerkships may mirror those of Internal Medicine clerkships in
which studies show that over the past decade, the use of the National Board of Medical
Examiners subject examination has increased (66% to 83%), use of faculty-developed
examinations has declined (46% to 27%), and the use of a clerkship standardized patient
examination increased sharply (2% to 27%). Minimum passing scores are required for a large
percentage of clerkships (80% for standardized subject tests and 65% for faculty developed test).
(Hemmer, et al).
For clerkships using the textbook-based test, multiple editions of the exam are drawn each
year from a national databank of approximately 1000 questions that are stratified to represent
material from each of the chapters in the book. Some of the questions have been modified to
clarify issues of language and all the questions are reviewed annually for updated content. As
the text grows larger with each new edition (now 827 pages), medical students find it more and
more difficult to read broadly, and several clerkships have begun selecting specific chapters for
testing.
Evaluation studies in a variety of settings have shown that the conventional closed-book tests
demonstrate “only what students can do with whatever they have been able to memorize” (Feller,
1994, p. 235). In contrast, open-book testing has been associated with “education for the future”
(Feller, 1994, p.235). Using this testing approach in the medical clerkship more closely mirrors
the discipline of family medicine where practitioners refer daily to written resource materials in
order to make clinical decisions.
This hospital-based Department of Family Practice supports a clerkship for approximately
one-third of the 95 third-year medical students from a university medical school. Because the
rotation is only four weeks, we decided to construct an educational experiment of open-book
testing. The purpose of this study is to determine if students benefit from having the opportunity
to consult the textbook during the exam. This approach should encourage students to explore the
text more fully, understand its organization, and utilize specific data and charts as reference
tools. More specifically the present study sought to answer the following questions:
1. Are there differences in student achievement between students working in an open-book
setting and those working in a closed-book setting?
2. Are tension and stress reduced by the knowledge of an open-book setting?
3. How do students prepare for an open-book format of a multiple-choice test?
4. How do students use the text during the examination?
Conceptual Framework:
Early studies of open-book testing in a variety of settings (Feldhusen, 1961; Jehu et al., 1970;
Michaels & Kieran, 1973; Weber et al., 1983) suggest several important outcomes:
- Reduces examination tension and stress;
- Promotes a fair examination;
- Leads to lasting learning outcomes;
- Reduces the unnecessary rote memorizing of facts, thus prompting students to prepare
themselves in more constructive ways.
More recently, Theophilides and Dionysiou (1996) found that open-book testing offered
students a vital self-evaluation mechanism that is considered an important outcome in modern
medical education. Their study found that self-evaluation takes place at two different stages:
First, during the study period and preparation for the examination, students assess
their learning gaps in course-content mastery and they act accordingly to complete
their knowledge. Secondly, at the end of the examination, students are in a position
to perform self-evaluation and judge the outcomes of their examination preparation.
(p. 165).
Methodology:
Subjects of this study were enrolled in the Family Medicine Clerkship during the academic
year 2002-2003 (18 months). During this clerkship, students were placed in clinical locations
for a four-week rotation. The majority of the learning occurred in the ambulatory settings with
about 15% of the time available for didactic learning activities. Methods used to evaluate
students on the family medicine clerkship included:
1. Clinical Evaluation of knowledge, skills, and attitudes from the primary preceptor
contributed 65% of the grade.
2. Case Presentation based on a practice patient that the medical student met in the early
weeks of the rotation. The case was focused on a common problem in family
medicine, and the student was required to make the session interactive. The case
grade constituted 15% of the grade.
3. Written final exam was 50 multiple-choice items generated from the national testbank
of the Essentials text, and the score comprised 20% of their grade. There is no
minimum passing score. Typical questions contained a short patient description or
"vignette" followed by a question testing diagnostic or management issues specific to
the case. Each question had one correct answer.
4. Composite Score with a maximum of 100 points; two of the three scores must be
above 90 for the student to receive an honors grade.
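The composite grading rule above can be sketched as a small helper. The weights (65/15/20) and the two-of-three-above-90 honors rule come from the text; the function names and the assumption that each component is scored on a 0-100 scale are illustrative.

```python
# Sketch of the clerkship composite grade described in the text.
# Assumption: each component score is on a 0-100 scale.

def composite(clinical: float, case: float, exam: float) -> float:
    """Weighted composite out of 100 points (65% / 15% / 20%)."""
    return 0.65 * clinical + 0.15 * case + 0.20 * exam

def honors(clinical: float, case: float, exam: float) -> bool:
    """Honors requires two of the three component scores above 90."""
    return sum(score > 90 for score in (clinical, case, exam)) >= 2
```

For example, `composite(90, 95, 85)` yields 89.75, and a student with component scores of 92, 95, and 85 would qualify for honors under this reading of the rule.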
During the clinical core year of 2001, the scores of 36 students were plotted on control
charts to see if the mean each month changed in any significant way. Tracking the high and low
values over time gave a 95% confidence interval around the mean of about the 80th percentile.
The null hypothesis tested was that the change from closed book to open book testing
would have no effect on student test scores. Scores were available for students who had their
Family Practice rotations in Maine, where open book tests were used, and in Vermont, where
closed book tests were used. They were available for 2001, when both sites used closed book
tests and for 2002, when Maine used open book tests.
Analysis of variance was used to look at the association between test scores and site,
year, and an interaction term including site and year. Stata 7.0 was used for data analysis.
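The site-by-year interaction the ANOVA tested can be illustrated as a difference-in-differences of cell means. The ANOVA itself was run in Stata; this Python sketch simply uses the four cell means reported later in the quantitative findings (MMC = Maine, UVM = Vermont).

```python
# Illustration of the site-by-year interaction using the cell means
# reported in the findings; the actual ANOVA was performed in Stata 7.0.
means = {
    ("MMC", 2001): 84.9, ("MMC", 2002): 89.1,
    ("UVM", 2001): 85.8, ("UVM", 2002): 82.7,
}

# Year-over-year change at each site.
mmc_change = means[("MMC", 2002)] - means[("MMC", 2001)]  # about +4.2
uvm_change = means[("UVM", 2002)] - means[("UVM", 2001)]  # about -3.1

# The interaction is the difference between these two changes.
interaction = mmc_change - uvm_change  # about 7.3 points
```

The roughly 7-point gap between the two sites' year-over-year changes is what the significant interaction term (p = 0.003) reflects.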
The clinical core students for 2002-2003 were given two versions of the same test for
completion in 90 minutes using an open-book format. Approximately three students on each
rotation took the test on the last Tuesday of the Clerkship at the Family Practice Center or the
Family Medicine conference room at the hospital. The test was monitored. After the test was
completed, each clerk was interviewed by one of the researchers, a curriculum and evaluation
specialist who has served as a consultant to the Family Medicine Department for the past 10
years. She did not know the medical students and they did not know her. The first four students
were interviewed by phone within several months of the test; others were interviewed
immediately after the examination each month for a total of 18 persons interviewed between June
and November, 2002. Of these 18 interviewees, only four were male.
We reviewed relevant research literature and developed a structured interview protocol
grounded in the literature. The following questions were established prior to interview with the
possibility of emergent questions flowing from answers within an interview and from one
interview to the next. Field notes were also recorded by the interviewer.
1. How did you prepare for the Open-Book test? (time and process) How was it
different from preparation for a closed-book test?
2. How did you feel going in to the Open-Book test? (more or less anxious)
3. How did you use the text during the test?
4. How would you prepare differently?
5. What are your feelings now about Open-Book tests?
6. Did you achieve your learning goals?
The interview answers were analyzed using a qualitative approach in which coding of
responses generated several themes for the question that had structural corroboration and
referential adequacy. Where appropriate, percentages of responses were calculated within a
theme. This qualitative analysis enabled us to look for meaning and act on that meaning. Using
qualitative reflection during coding brought to mind possible linkages and relationships (Ratcliff,
2002). These were further enhanced when the themes and supporting student quotations were
presented to the clerkship director for analysis and feedback. Through this analysis we began to
recognize the larger phenomena around the process of open-book testing.
Limitations of the Study
The quantitative statistical analysis is limited by the context for student evaluation in
clerkships. The experiment is considered a naturalistic inquiry in which student testing is
conducted in its traditional protocol, and experimental variables are not held constant. When
comparing two different years of testing, we recognize that the same students are not being
tested, and the exact test is not used again. Similarly within a given year, two forms of the test
are used. However, these tests have been derived from the same pool of test items which were
all developed and reviewed by the same task force. Students within a given year are not paired
with counterparts at the two clinical training sites. Students take the Family Medicine clerkship
and its test at different points in the training and there is opportunity for additional learning from
preceding clerkships such as pediatrics and internal medicine.
Quantitative Findings and Analysis
Are there differences in student achievement between students working in an open-book
setting and those working in a closed-book setting? There were test scores available for a total
of 150 students. The following table indicates the distribution by site and by year.

Table 1. Number of Students by Site and Year
Site      2001   2002
Maine       36     22
Vermont     53     39
The mean test score for 2001 across sites was 85.4 and for 2002 was 85.0. This difference
was not statistically significant. The mean score for Maine for both years was 86.5 and for
Vermont it was 84.5. This difference was not statistically significant, although the p value was
0.051. In 2001, the mean score for Maine was 84.9 and for Vermont was 85.8. In 2002, the mean
score for Maine was 89.1 and for Vermont was 82.7. Analysis of variance showed that the model
including both year and site was significant at p = 0.01. The interaction of site and year was
significant at p = 0.003. The adjusted R-squared was 0.055. When we examine the score range,
we see that students at both sites have had very high scores (92-98) both in closed book and open
book settings. Likewise, the low scores at MMC did not change appreciably (2001=74,
2002=78); however, the low scores at UVM in 2001 (62) were significantly lower than at MMC and
dropped even further in 2002 (56). Therefore, we see the standard deviation increasing.
We may want to ask the clerkship director at UVM about selection of students for off-site
placements.
Table 2. Test Scores by Year and Site
Year   Site   Mean Score +/- SD   Range   N
2001   MMC    84.9 +/- 5.6        74-92   36
2001   UVM    85.8 +/- 7.1        62-96   53
2002   MMC    89.1 +/- 6.3        78-98   22
2002   UVM    83.5 +/- 8.0        56-96   70
After examining these basic statistics, further analysis was needed to explore the
underlying trends, since the UVM scores declined by roughly as much as the MMC scores
increased. When we examine the mean scores by quarter, we see that most of the difference
occurred within one quarter in which UVM students had a dramatic slide in their scores. The
following quarter UVM students returned to their basic mean score, whereas the MMC students
maintained their higher levels. When we connect these results to our qualitative data, we note
that students have now begun to give advice to their colleagues about the best way to study for
the open book test and the best ways to utilize the book within the given time frame.
[Figure 1. Mean score by quarter, Q1 2001 through Q3 2002, for MMC and UVM; vertical axis 70-95.]
The change to open book testing in the Maine clerkship training site showed an
association with increased mean test scores. If the medical school decides to implement open-book
testing in this clerkship, they may want to change cutoffs for grading students in terms of
Honors and Pass distinctions. The differences in test scores using the textbook indicate that the
test items do indeed relate to the text although we did not analyze test items or identify items for
which the textbook was used.
Qualitative Findings and Analysis
In this section of the paper, we use the research questions to organize the qualitative results
and the resultant themes. Student quotations are used to exemplify the themes and show the
validity and reliability of the themes. The student’s database number is used in parentheses at
the end of the quotation.
How do these medical students prepare for a timed, open-book test? Two different
approaches to test preparation were noted. Approximately 25% of the students waited until the
last week or two of the rotation to read and to study the text, either because of constraints of the
rotation or admitted procrastination. “Because of the requirements of the rotation, I never had
time before.” (7) “I procrastinated until the last minute. About a week before the end, I began to
read.” (2) One might question how the timing also influenced the process of study. However,
most students began their reading at the beginning of the rotation and continued throughout. One
person admitted that outside factors influenced this process. “I started early in the rotation since
I knew I would be running the NYC marathon the last weekend.” (17)
Although the clerkship coordinator was “shocked” that 25% of the medical students waited
until the last week of the rotation to do the major reading and study after having been advised to
read each night on the topics of patient cases seen during the day, she did acknowledge that some
students have to travel to off-site locations for their clinical work which may affect their
available study time.
When asked how they prepared for the test, several key themes emerged. Some students
“read the whole book, a chapter or two per day, taking notes on each chapter.” (9). Others began
their reading in relation to the clinical cases of the day, but soon discovered they would need to
read additional chapters. “When I saw an interesting patient, I read further on the topic. If I
didn’t see common things, I read up on them. We were encouraged to read about our patients and
their treatments.” However, some students decided to be more selective in their reading of the
text, generally focusing on the second half. “I skimmed the first part. I focused on the diseases
to help with the daily outpatient encounters.” (12) “I skipped the first part on preventative
medicine. I focused on the common problems section beginning about page 200, going from
a(sthma) to z.” (1) “The first half of the text was not applicable, not high yield since it was
primarily about the physician/patient interaction.” (3) One student more fully described her
process: “I read, did problems, and highlighted important places to refer to.” (17)
Students also needed to know the structure of the book, not just its contents. So they
devised strategies for that process as well. “I made copies of the Table of Contents to keep from
looking for individual items. I plotted out the book so I knew where to go for different topics. I
became familiar with the charts.” (6) Several of the first students organized their textbook with
little colored tabs. Since the books were loaned to the students by the clerkship, these tabs were
then available to the remaining students. However, one student did not find them helpful. “I
didn’t use them; some items were spelled wrong. I used the index more than the tabs.” (6)
Several students whose rotations came later in the cycle chose not to concentrate on the
textbook. “I read a lot during this rotation. During any down moment I went to
UpToDate.com (a service for docs and students). When I saw something, I also read
Washington’s Manual. I didn’t really read this book. I read Blueprints in Medicine (Pediatrics).
I have been studying in Internal Medicine for 4 months, esp. heart disease, incorporation of
Peds/IM. I did not read everything; I looked things up when needed; I read to apply to patients.”
(15) “I did not do focused study. I had accumulated knowledge from this rotation and
previous ones. I spent a couple of hours per week and used lots of other books. I used more
specific texts in terms of the patient presenting, ex. neuromuscular.” (11)
In keeping with the literature on open-book testing, students touched on the issue of
memorization without prompting from the interviewer. “I was not trying to memorize, but
knowing the general ideas.” (13) “I didn’t do as much memorization; I made sure I understood
the point.” (17) This issue would emerge in answer to other questions as well.
Over 60% of the students did not change their study tactics because of the new format.
“Since the test was based on the book, I still would have done the same. I prepared similarly
because it was a timed test.” (1) Those who did change their study process noted that the time
spent in focused study was less. “I would have read the same amount, but then gone back and
memorized.” (14). They acknowledged that they would have spent more time, read more closely,
and more often. However, they felt that instead of memorizing facts and details, they were able
to read more for understanding.
The interviewer probed deeper to ascertain whether or not students would prepare
differently now that they have experienced the process. Over 80% of the students indicated that
they would not change their preparation process. Two students felt that they would have given
more time to the process. “I would read more carefully, studied more – in a similar process to a
closed-book test.” (3) “I would have put more time in than I did. The fact that it was an open-book
test lulled me. Next time, I would get to know the material better. I would study more.”
(2) At least one student focused on particular areas of deeper study such as on orthopedics
including low-back pain and ankle injury. (6) Several respondents used this question as another
opportunity to say that they would have changed the process of using the text during the test. In
general, students agreed that the test preparation was very similar to the process for a closed-book test.
One important goal of this study was to encourage broader reading of the text, whereas
the clerkship had begun to assign specific chapters because of the shortened rotation and ever-
lengthening textbook. It seems that students may have read a bit more broadly as desired by the
clerkship director; at least they deliberately learned more about the structure of the text. The first
half of the book, on physician-patient interaction and preventative medicine, was not as well studied.
Clerkship coordinators need to examine whether or not medical students acquire these
competencies through modeling and clinical teaching. The coordinator was disappointed to learn
that some students skipped the first section since those chapters do have questions on the
examination. “Students were never told that first chapters were not important. It appears that we
need to focus students on preventative medicine.” Students at this level generally have good
study habits as evidenced by their analysis of the textbook’s structure. This is one of the keys to
using Evidence-Based Medicine as a life-long learner. The structure of a timed test seems to
keep the students honest in their study and eliminates the use of the textbook as a crutch that
saves them from study. The researchers were especially pleased to see that students related their
study to the service of the whole patient in appearing to ask the question, “What does this mean
to me as a doctor?” Further, the researchers agree with their students that the learner misses
something when studying to memorize.
Although end-of-service tests are not considered high-stakes exams, medical students set
high expectations for their success at taking tests. One of the expected benefits of an open-book
approach is the reduction of tension and stress. Over 80% of the respondents described their
feelings when entering the testing situation to be less anxious, less stressful, more comfortable.
One student even used the word “safe” to describe his emotional state. (4) This comfort came in
part because specifics could be found in the text -- “I had the comfort of knowing that if I got
stuck on a detail or an item, I had something to fall back on.” (11) Even more, students felt they
had a good general idea about issues. One student described the experience as “more similar to a
clinical setting. We can’t be expected to have every detail; we can use a textbook or a PDA.”
(11) One student expects better retention of the knowledge gained during the rotation. “In
medicine you have a foundation but new information comes up all the time. You can’t expect to
memorize everything. When you stop using a knowledge base it deteriorates. In Family
Practice, if I don’t use something, I can go look it up if I need to. I understand the basic premise
from which to go look up something. I am constantly using my PDA, which helped me provide
better care when I could look up a new drug or way of treating.” (15)
For some students the tension was similar to other tests, simply because they were ready to
“get it over with” (3) or because they were not sure what to expect. (8) Several students
described additional pressure to do well since faculty may expect significantly higher scores with
an open-book format. One student had a history with open-book tests that were “harder than
might be expected. My assumption is that they ask more obscure questions. Subconsciously, I
worry that I may not be studying as hard – memorizing. I may flounder looking for answers.”
(14) When probed by the interviewer, the student admitted that this test did not follow that trend.
“This exam was okay. The questions that may have been hard would have been anyway.” (14)
Researchers felt it was important to know how students actually used the textbook during
the test in order to provide guidance to future test takers. The responders fell into three distinct
and equal groups. One-third of the students used the textbook for almost all items as they worked
through the test, and this caused some problems with time. “I looked up answers to every
question as I went along, even the ones I knew. I was racing at the end and didn’t look up the
last 10 questions.” (5) These students used a scanning approach when looking for the answers.
“I scanned looking for answers; read quickly; used the index, but it was not helpful. I used the
text for each question even when I was pretty sure of the answer. And that slowed me down. I
guessed at the last seven questions. (6) “Each item was taking me about 7-8 minutes. Looking
up every thing slowed me down and I began to run out of time.” (4)
Another third of the respondents generally used the text for each unknown question or to
confirm answers. “If I got to a question I didn’t know, I flipped to that part of the book.” (3)
“When I didn’t know an answer or questioned myself, I used the text to clarify.” (17) Most of
these students thought the text was well organized and that it was easy to find an answer.
However, one of the students from this group had a different opinion about searching the text for
answers. “The majority of questions I couldn’t look up a place for one answer. They needed
indirect answers.” (1) Another student used the index as a tool. “I looked up key words asking
myself, what could I find this under?”
With both these strategies, students often found themselves reading too much and losing
time in the process even though they are fairly fast test takers. (14, 1, 15, 2)
The third strategy was to go through the entire test, answering questions and marking those
needing clarification, then using the text for those questions as a group at the end of the test. (9,
10, 11, 12, 13, 18) Because this was not a controlled study, previous classmates had forewarned
some students about the time crunch when using a different strategy. (10) It was interesting to
note that one student did not find the book very helpful when needing clarification. “Using the
book made me less certain rather than more certain. I had more doubts.” (18)
As the interviews progressed, the researcher began asking students what types of questions
needed the text as a resource. Even though this question was asked immediately after the test,
most students gave very general answers. “Ones that had symptoms that fell under multiple
diagnoses.” (10) “I used the text when I had to differentiate between two choices. I also used it for
details, such as wound closure, and specific treatments for which I might use a PDA in an
office.” (11) “I used the text for specific treatments, types of medications, or first steps in
management.” (3) “I used the text for vaccinations (age), tables, and medications.” (12)
If the open-book format is accepted as appropriate protocol by the medical school, the
faculty may want to revise several of the questions. If the answer to a question is now available
in chart form in the text, the revision should ask the student to use the chart to make a decision,
“a triple jump” question with more problem solving required. Further, we need to examine each
question in the light of best practice in the field where “you may not get the latest information if
you don’t stop to look up a treatment or drug therapy.”
As part of their reflection on the process, the interviewer asked students how they feel
about open-book testing now that it was completed. It should be noted that they did not yet
know their scores; however, as competent test takers we might expect them to have a good
assessment of their success. Over 60% of the students had positive comments regarding this
process of assessment. Several reasons were noted. First, the format leveled the playing field
for everyone, yet was not a dis-incentive to study. (7) The process even encouraged the reading
of the entire book. (6) “I don’t think they ‘cut you any slack’ since one still must have the fund
of knowledge, yet not be required to have every treatment, dosage, or side effect.” (11) Second,
the format was realistic. “It makes sense as we are always looking things up – signs, symptoms,
treatments. They are never going to be in our head all at one time. It is important to know that
you keep on learning. You should be comfortable not having the answer right off.” (16) Third,
students agreed that their focus of study was on principles, not memorization. “This process is a
better way to show what you understand rather than what you memorize. You can’t get away
without reading; there’s not enough time to look up everything.” (13) “I usually memorize, but I
forget within a month. With the open-book format, I learned what I felt I needed. Maybe I don’t
have things ‘off the top of my head’; but in the long term, I will have just as good results.” (3)
Although there were many positive statements, over 50% of the students also added some
negative comments as well. One student clearly voiced those mixed feelings. “It takes the edge
off to not need to remember minutia, but there is a different type of anxiety – you second guess
yourself.” (16) Another student agreed that open-book tests are often more unpredictable. “I
was not sure about the time for using the book and whether I had time to look up small details.”
(8) One student thought that an open-book test was not as challenging as it should be, and that
this one in particular – a timed, multiple-choice exam – functioned in practice much like a
closed-book test. The interviewer probed further about her belief that the test should be challenging.
“When you walk in a room and see a patient, you should be able to think of this stuff. It should
be in your head. It is too easy to look up (in your PDA). Of course, one can’t know everything.
Two of my preceptors were very traditional.” (10) Another student agreed that she didn’t “push
myself to recall things on the spot” and considered that a negative: “I wouldn’t choose an
open-book test.” (17) One student felt that the fact that the test was timed precluded actually using the
book and felt that “we should know the answers.” (18)
One neutral response was based on an analysis of the grading criteria. “About 65% of the
grade is based on the preceptor evaluation with another 15% on case presentation. So with the
two items, I felt comfortable about the test as a minimal influence on the grade.” (1)
When the interviewer asked whether they had met their learning goals, many students looked
quizzical yet responded in the affirmative without much explication. Others had not set learning
goals in preparation for the test. “Not really, I just read the book as much as I could.” However,
one student gave a very detailed learning plan for both the rotation and for test preparation. “I
wanted to see more chronic cases (hypothyroidism, adult diabetes) since my next rotation is
Internal Medicine. I wanted to see how they were managed in Family Medicine. As far as the
acute situation, the answer is usually there: the patient knows and the doctor confirms. As far as
learning goals for the test, I wanted to be sure I knew about common problems of the older
patient (osteoarthritis, back pain). I was not as comfortable with an orthopedic evaluation even
though I saw lots of patients with shoulder and back problems.” (16) As a department, MMC
Family Medicine believes that medical students and residents should use an adult learning model
in which the learner conducts a self-assessment and sets learning goals at the beginning of each
rotation. For this clerkship, students do a skills inventory in advance and share it with their
preceptors. More emphasis needs to be placed on this process at several points in the clerkship,
including identification and selection of learning experiences as well as the preparation for
examinations.
Conclusions and Recommendations
Students using the open-book approach increased their mean score for the year 2002, but
the statistical difference from students using the closed-book format is partly attributable to the
reduced scores of the control-group students, a decline not explained by this research. We do see
a slight increase in the standard deviation as well as in the mean score. The true difference
between students at MMC and UVM seems to come after they are more familiar with the
preparation for and use of the open-book format. More importantly the desired outcomes of the
intervention were observed: reducing the anxiety of students, wider reading of the text, learning
the structure of the textbook as a learning resource, and deeper understanding of concepts and
principles rather than time spent on memorization. The students appeared to approach the
textbook, and therefore perhaps the body of knowledge as a whole, with the orientation of a
generalist.
The MMC clerkship coordinator is recommending implementation of the open-book
approach to the Family Practice clerkship at all sites. This recommendation will also support
advising students on the preparation for an open-book test and on tactics for the best use of the
textbook during the test. A further recommendation is to raise the bar for identifying
HONORS-level performance.
In keeping with this focus on learning the conceptual framework of the clinical sciences, at
least one student indicated that s/he expected longer retention of knowledge gained. We
recommend a follow-up study of knowledge and skill retention. Although our student population
for this study was small, it would be interesting to look for correlations between the Clerkship
test in family medicine and student success on the national board examinations.
Educational Significance
Many medical educators may challenge the use of open-book examinations, believing that
these short tests are also meant to prepare students for the national boards in terms of both
knowledge and test-taking skills. However, in a recent web-based discussion on the DR-ED
listserv, Baker (2003) called for a different perspective:
“Students respond to the evaluation system. And again we are falling into the trap of
making a memory-based evaluation system work extremely well. If we put the same
amount of effort into making a non-memory-based evaluation system work well,
everything would change in medical school. I have heard all the arguments about the need
to have memory-based exams so students do well on boards – but fact is – medical
education is in love with memory-based evaluation and it is afraid to try anything else,
even on a pilot basis (e.g., just one open-book/resource/PDA question on one exam
somewhere within the four years of medical school).”
But the issue goes beyond the problem of memory-based evaluation. Some older
physicians work from the perspective that “I saw this in the past, so I know what to do.” This
viewpoint needs to change as medical students, residents, and physicians look for the most
up-to-date and most clinically relevant evidence related to their cases. Family Practice physicians are
leading the charge for evidence-based medicine, asking, “What’s the data?”
As many elements of medical student evaluation and continuing medical education
assessment are delivered electronically through websites, the open-book format is becoming
acceptable. This approach is also more appropriate to the modern paradigm concerning medical
knowledge. Graduates of medical schools are no longer the custodians of medical knowledge.
Members of the extended health care team, as well as patients, have access through the World
Wide Web, the popular press, and the media to virtually all of the information necessary for decision
making – the prognosis of disease, the range of therapeutic options, and the complication rates
for diagnostic or therapeutic procedures. The real value to modern society lies in the doctor's
capacity to serve patients – to listen to them, to identify their needs, to explain and interpret the
information, and to apply unique clinical skills. The management and application of knowledge
rather than the retention of factual information must be a key focus. The textbook becomes a
reference book that is essential to the physician. Therefore, why not open-book exams?
References
Baker, D. (2003, January 27). Personal statement on the DR-ED listserv.
Campos-Outcalt, D., Witzke, D.B., & Fulginti, J.V. (1994). Correlations of family medicine
clerkship evaluations with scores on standard measures of academic achievement. Family
Medicine, 23(7), 85-88.
Feldhusen, J.F. (1961). An evaluation of college students’ reactions to open-book
examinations. Educational and Psychological Measurement, 21, 637-645.
Feller, M. (1994). Open-book testing and education for the future. Studies in Educational
Evaluation, 20, 235-238.
Hemmer, P.A., Szauter, K., Allbritton, T.A., & Elnicki, D.M. (2001). Internal medicine
clerkship directors’ use of and opinions about clerkship examinations. Critical Care Medicine,
29(6), 1268-1273.
Izard, J. (1992). Assessing learning achievement. Paris: UNESCO.
Jehu, D., Picton, C.J., & Cher, S. (1970). The use of notes in examinations. British
Journal of Educational Psychology, 40, 353-357.
Michaels, S., & Kieran, T.R. (1973). An investigation of open-book and closed-book
examinations in mathematics. The Alberta Journal of Educational Research, 19(3), 202-207.
Ratcliff, D. (2002, March 31). Qualitative research resources. Retrieved June 23,
2002, from http://don.ratcliff.net/qual/expq4.html
Rogers, P.L., Jacob, H., Rashwan, A.S., & Pinsky, M.R. (1994). Quantifying learning in
medical students during a critical care medicine elective: A comparison of three evaluation
instruments. Family Medicine, 26(2), 85-88.
Smith, S.R. (1999, August). Is it time to close the book on closed-book
examinations? Medicine and Health / Rhode Island, 82(3).
Theophilides, C., & Dionysiou, O. (1994). The major functions of the open-book
examination: A factor analytic study. In Improving University Teaching: Proceedings of the
19th International Conference (pp. 519-528). College Park: University of Maryland University
College.
Weber, L.J., McBee, K., & Krebs, J.E. (1983). Take-home tests: An experimental study.
Research in Higher Education, 18(2), 473-483.