Where are we with assessment and where are we going?
Cees van der Vleuten
University of Maastricht
This presentation can be found at: www.fdg.unimaas.nl/educ/cees/amee
Overview of presentation
- Where is education going?
- Where are we with assessment?
- Where are we going with assessment?
- Conclusions
Where is education going?
- School-based learning
  - Discipline-based curricula
  - (Systems) integrated curricula
  - Problem-based curricula
  - Outcome/competency-based curricula
Where is education going?
- Underlying educational principles:
  - Continuous learning of, or practicing with, authentic tasks (in steps of complexity; with constant attention to transfer)
  - Integration of cognitive, behavioural and affective skills
  - Active, self-directed learning, in collaboration with others
  - Fostering domain-independent skills and competencies (e.g. teamwork, communication, presentation, science orientation, leadership, professional behaviour…).
Where is education going?
- The same underlying educational principles, grounded in constructivism, cognitive psychology, cognitive load theory and collaborative learning theory, and supported by empirical evidence.
Where is education going?
- Work-based learning
  - Practice, practice, practice…
  - Optimising learning by:
    - More reflective practice
    - More structure in the haphazard learning process
    - More feedback, monitoring, guiding, reflection, role modelling
    - Fostering of learning culture or climate
    - Fostering of domain-independent skills (professional behaviour, team skills, etc.)
Where is education going?
- Work-based learning, grounded in deliberate practice and emerging work-based learning theories, and supported by empirical evidence.
Where is education going?
- Educational reform is on the agenda everywhere
- Education is professionalizing rapidly
- A lot of ‘educational technology’ is available
- How about assessment?
Overview of presentation
- Where is education going?
- Where are we with assessment?
- Where are we going with assessment?
- Conclusions
Expanding our toolbox…
[Pyramid: Knows - Knows how - Shows how - Does]
- Knows / Knows how: established technology of efficient written or computer-based high fidelity simulations (MCQ, Key Feature, Script Concordance Test, MEQs…)
Expanding our toolbox…
- Shows how: established technology of structured high fidelity in vitro simulations requiring behavioural performance (OSCE, SP-based testing, OSPE…)
Expanding our toolbox…
- Does: emerging technology of appraising in vivo performance (work-based assessment: clinical work-sampling, Mini-CEX, portfolio, practice visits, case orals…)
Expanding our toolbox…
- “Domain-specific” and “domain-independent” skills
- Emerging technology of appraising in vivo performance (self-, peer- and co-assessment, portfolio, multisource feedback, learning process evaluations…)
What have we learned?
- Competence is specific, not generic

Reliability as a function of testing time
Testing time (hours) | MCQ¹ | Case-based short essay² | PMP¹ | Oral exam³ | Long case⁴ | OSCE⁵ | Mini-CEX⁶ | Practice video assessment⁷ | Incognito SPs⁸
1 | 0.62 | 0.68 | 0.36 | 0.50 | 0.60 | 0.47 | 0.73 | 0.62 | 0.61
2 | 0.76 | 0.73 | 0.53 | 0.69 | 0.75 | 0.64 | 0.84 | 0.76 | 0.76
4 | 0.93 | 0.84 | 0.69 | 0.82 | 0.86 | 0.78 | 0.92 | 0.93 | 0.92
8 | 0.93 | 0.82 | 0.82 | 0.90 | 0.90 | 0.88 | 0.96 | 0.93 | 0.93

¹ Norcini et al., 1985; ² Stalenhoef-Halling et al., 1990; ³ Swanson, 1987; ⁴ Wass et al., 2001; ⁵ Petrusa, 2002; ⁶ Norcini et al., 1999; ⁷ Ram et al., 1999; ⁸ Gorter, 2002
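The pattern across all columns is that reliability rises chiefly with total testing time, that is, with the size of the sample of items, cases or encounters, rather than with any particular format. As a hedged illustration (the slide itself shows no formula, and the table values come from the cited generalisability studies), the classical Spearman-Brown relation gives the expected shape of this curve:

$$\rho_k = \frac{k\,\rho_1}{1 + (k - 1)\,\rho_1}$$

where $\rho_1$ is the reliability of one hour of testing and $\rho_k$ the projected reliability of $k$ hours. For example, extrapolating the one-hour oral-exam value of 0.50 to eight hours gives $\frac{8 \times 0.50}{1 + 7 \times 0.50} \approx 0.89$, close to the 0.90 reported in the table.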
What have we learned?
- Competence is specific, not generic
  - Any single point measure is flawed
  - One measure is no measure
  - No method is inherently superior
  - Subjectivity and unstandardised conditions are not something to be afraid of.
What have we learned?
- Competence is specific, not generic
- One method can’t do it all

Magic expectations…
[Pyramid: Does - direct observation methods, portfolio; Shows how - OSCEs; Knows how / Knows - key features (short cases)]
What have we learned?
- Competence is specific, not generic
- One method can’t do it all
  - One measure is no measure
  - We need a mixture of methods to cover the entire pyramid
  - We can choose from a rich toolbox!
What have we learned?
- Competence is specific, not generic
- One method can’t do it all
- Assessment drives learning
Assessment and learning
“The in-training assessment programme was perceived to be of benefit in making goals and objectives clear and in structuring training and learning. In addition, and not surprisingly, this study demonstrated that assessment fosters teaching and learning…” (Govaerts et al., 2004, p. 774)
Assessment and learning
“Feedback generally inconsistent with and lower than self-perceptions elicited negative emotions. They were often strong, pervasive and long-lasting…” (Sargeant et al., under editorial review)
Assessment and learning
“You just try and cram - try and get as many of those facts into your head just that you can pass the exam and it involves… sadly it involves very little understanding because when they come to the test, when they come to the exam, they’re not testing your understanding of the concept. They test whether you can recall ten facts in this way?” (Student quote from Cilliers et al., in preparation)
The continuous struggle
[Diagram: Curriculum - Assessment - Learner]
Assessment: content, format, programming/scheduling, regulations, standards, examiners…
What do we know?
- Competence is specific, not generic
- One method can’t do it all
- Assessment drives learning
  - Verify the consequences
  - Use the effect strategically
  - Educational reforms are only as good as the assessment allows them to be.
Overview of presentation
- Where is education going?
- Where are we with assessment?
- Where are we going with assessment?
- Conclusions
My assumptions
- Innovation in education programmes can only be as successful as the assessment programme is
- Assessment should reinforce the direction in which education is going
- Future directions should use our existing evidence on what matters in assessment.
The Big Challenge
- Established assessment technologies have been developed in the conventional psychometric tradition of standardisation, objectification and structuring
- Emerging technologies are in vivo and by nature less standardised, unstructured, noisy, heterogeneous, subjective
- Finding an assessment answer beyond the classic psychometric solutions is The Big Challenge for the future.
Design requirements for future assessment
- Dealing with real life:
  - In vivo assessment cannot and should not be (fully) standardised, structured and objectified
  - Includes quantitative AND qualitative information
  - Professional and expert judgement play a central role.
Design requirements for future assessment
- Dealing with learning:
  - All assessment should be meaningful to learning, thus information rich
  - Assessment should be connected to learning (the framework of the curriculum and the assessment are identical)
  - Assessment is ‘embedded’ in learning (this equals the ‘in vivo’ of educational practice and adds significantly to the complexity).
Design requirements for future assessment
- Dealing with sampling:
  - Assessment is programmatic:
    - Comprehensive; includes domain-specific and domain-independent skills
    - Combines sampling across many information sources, methods, examiners/judges, occasions…
    - Is planned, coordinated, implemented, evaluated and revised (just like a curriculum design).
Challenges we face
- Dealing with real life:
  - How to use professional judgement? Do we understand judgement?
  - How to elicit, structure and record qualitative information?
  - How to use (flexible) standards?
  - What strategies for sampling should we use? When is enough enough?
  - How to demonstrate rigour? What (psychometric, statistical, qualitative) models are appropriate?
Challenges we face
- Dealing with learning:
  - What are methodologies for embedding assessment (e.g. Wilson & Sloane, 2000)?
  - How to deal with the confounding of the teaching and assessor roles?
  - How to combine formative and summative assessment?
  - How to involve stakeholders?
  - How to educate stakeholders?
Challenges we face
- Dealing with sampling at the programme level:
  - What strategies are useful in designing a sampling plan or structure of an assessment programme?
  - How to combine qualitative and quantitative information?
  - How to use professional judgement in decision making on aggregated information?
  - How to longitudinally monitor competence development?
  - What are (new) strategies for demonstrating rigour in decision making? What formal models are helpful?
Contrasting views in approach

Conventional assessment | Programmatic embedded assessment
Assessment separate from learning | Assessment as part of learning
Method-centred | Programme-centred (based on an overarching cohesive structure)
Context free | Context matters (dynamic relation between an ability, a task and a context in which the task occurs - Epstein & Hundert, 2002)
Contrasting approaches in research

Conventional assessment | Programmatic embedded assessment
Rigour defined in direct (statistical) outcome measures | Rigour defined by evidence on the trustworthiness or credibility of the assessment process
Reliability/validity | Saturation of information, triangulation
Benchmarking | Accounting
Contrasting views in approach
[Slide: conventional assessment vs. programmatic embedded assessment - confused]
Overview of presentation
- Where is education going?
- Where are we with assessment?
- Where are we going with assessment?
- Conclusions
Conclusions
- Assessment has made tremendous progress
  - Good assessment practices based on established technology are implemented widely
  - Sharing of high quality assessment material has begun (IDEAL, UMAP, Dutch consortium)
Conclusions
- We are facing a major next step in assessment
  - We have to deal with the real world
  - The real world is not only the work-based setting but also the educational training setting
Conclusions
- To make that step:
  - We need to think out of the box
  - New methodologies to support assessment strategies
  - New methodologies to validate the assessment
Conclusions
- There is a lot at stake:
  - Educational reform depends on it
[Cartoon caption: “I’m here because I couldn’t change the assessment”]
Conclusions
- Let’s join forces to make that next step!

This presentation can be found at: www.fdg.unimaas.nl/educ/cees/amee