UNESCAP Evaluation Course, March 7-9, 2007
World Bank
Independent Evaluation Group
How to Increase the Utilization of Evaluations
Michael Bamberger
Session outline
1. Defining and measuring evaluation utilization
2. Reasons why many evaluations are under-utilized
3. Examples of evaluation utilization: the World Bank "Influential Evaluations" study
4. Ways to strengthen utilization
5. If time permits: further discussion on "presenting the message"
1. Defining and measuring evaluation utilization
Defining evaluation outcomes and impacts
 Use
• How evaluation findings are utilized by policymakers, managers and others
 Influence
• How the evaluation influenced decisions and actions
 Consequences
• How the process of conducting the evaluation, the findings and the recommendations affected the agencies involved and the target populations
• Consequences can be:
  • Positive or negative
  • Expected or unanticipated
Measuring evaluation outcomes and impacts
 Changes in individual:
• Knowledge
• Attitudes
• Behavior
 Changes in organizational behavior
 Changes in program design or implementation
 Changes in policies and planning
 Decisions on project continuation, expansion and funding
Measurement issues
 Time horizon
 Intensity
 Reporting bias
• Many agencies do not acknowledge they have been influenced
 Attribution
• How do we know the observed changes were due to the evaluation and not to other unrelated factors?
Attribution analysis
How do we know if observed changes were due
to the evaluation and not to other influences?
 Stakeholder interviews
 Surveys [pretest/posttest or posttest only]
 Analysis of planning and policy documents for
evidence of evaluation influence
 Analysis of mass media
 Key informants
Examples of attribution methodology
used in “Influential Evaluations”
See: "Influential evaluations: Detailed case studies", pp. 69-72
Attribution analysis framework
1. Identify potential impacts (effects)
2. Assess whether there is a plausible case for attributing part of the effects to the evaluation
3. Triangulation: what proportion of the effects can be attributed to the evaluation?
A. Comparison of 2 user surveys + Stakeholder opinion survey
Bangalore Citizens Report Cards
 Comparison of sample surveys of service users (in 1993 and 1999) found reported improvement of services [potential impact]
 Sample of 35 public service agencies, policymakers, mass media and civil society [corroborated influence of the survey in influencing improvements]
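The comparison of the two survey waves can be made concrete with a simple significance test. A minimal sketch in Python (statsmodels assumed available; the counts are purely hypothetical, not the actual Bangalore figures):

# Illustrative only: hypothetical counts, not the 1993/1999 survey data.
from statsmodels.stats.proportion import proportions_ztest

satisfied = [120, 310]   # respondents reporting satisfactory service: wave 1, wave 2
sample_n = [480, 520]    # respondents interviewed in each wave

# One-sided test of whether the satisfied proportion rose between the two waves
z_stat, p_value = proportions_ztest(satisfied, sample_n, alternative="smaller")

print(f"Wave 1 rate: {satisfied[0] / sample_n[0]:.0%}, Wave 2 rate: {satisfied[1] / sample_n[1]:.0%}")
print(f"z = {z_stat:.2f}, one-sided p = {p_value:.4f}")

A significant difference only establishes the potential impact; the stakeholder opinion survey is still needed to corroborate that the report cards, rather than other factors, drove the improvement.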
B. Testimonials from stakeholders
Bulgaria: Metallurgical Project and Indonesia: Village Water Supply
 Clients (metallurgical company, Development Bank; AUSAID Project Dept, Indonesian water agency) were asked to send testimonials by letter or e-mail confirming the influence of the evaluation.
 In Bulgaria, clients were asked to confirm the validity of the benefit projections and to confirm that the benefits were attributable to the evaluation.
C. Expert assessment + paper trail
India: Employment Assurance Program
 Considered too bureaucratically difficult to solicit Government views on the effectiveness of a government agency (the Evaluation Organization).
 Opinions of the Bank Resident Mission and other experts were solicited on the influence of the evaluation and the credibility of the estimates of cost savings and employment generation.
 Paper trail: specific references in the Five Year Plan and follow-up sector planning documents to how the evaluation was used.
D. Logical deduction from secondary sources
Public expenditure tracking study (education): Uganda
 Follow-up PETS study estimated increased funds utilization (potential evaluation impact)
 Extensive coverage of the report in the media
 Government documents show how the findings were used
 Reports show how community groups use the budget information posted in schools/media
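The core PETS calculation behind such findings is a comparison of funds allocated centrally with funds actually received by facilities. A minimal sketch in Python, using hypothetical grant figures rather than the Uganda data:

# Illustrative only: hypothetical capitation grants, not figures from the Uganda PETS.
allocated = {"school_A": 10_000, "school_B": 8_000, "school_C": 12_000}  # budgeted grants
received = {"school_A": 6_500, "school_B": 7_200, "school_C": 4_800}     # receipts reported by schools

total_allocated = sum(allocated.values())
total_received = sum(received.values())

share_reaching_schools = total_received / total_allocated
leakage_rate = 1 - share_reaching_schools

print(f"Share of funds reaching schools: {share_reaching_schools:.0%}")
print(f"Estimated leakage: {leakage_rate:.0%}")

Publishing school-level allocations, as was done in Uganda, lets community groups run the same comparison for their own school.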
2. Reasons why evaluations are under-utilized
# 1 Lack of ownership
 Evaluation focus and design are determined by donor agencies or outside "experts" with little real input from the client
 The "goals definition game" alienates clients
 Limited consultation with, and feedback to, clients
 Evaluation is seen as a threat
# 2 Timing
The evaluation findings are presented:
• Too late to be useful
• Too soon, before policymakers or managers have started to focus on the issues discussed in the report
# 3 Poor communication between evaluator and client
 Clients are not kept in the loop
 Clients may not like the evaluator's communication style
 Language problems
 Conceptual problems
 The "objectivity" paradigm limits contact and communication between evaluator and client
 The client does not share information with other stakeholders
# 4 Lack of flexibility and responsiveness to client needs
 Rigid design that cannot be adapted to client needs or changing circumstances
 Quasi-experimental design that cannot adapt indicators and data collection methods to changing circumstances
 "Objective" stance of the evaluator limits interaction with clients
 Timing: too early or too late
 Finance ministries try to force evaluation indicators and focus to correspond to budget line items
 National evaluation systems sometimes introduce top-down, uniform evaluation/reporting systems that do not reflect the reality of different agencies
# 5 Resource constraints
 Budget constraints affect:
• Data collection
• Data analysis
• Bringing staff together to participate in the evaluation process
• Translation into local languages
• Report preparation and dissemination
 Limited local expertise
# 6 Time constraints
 Too many demands on clients' and stakeholders' time
 The evaluators do not have enough time to:
• Consult with clients during evaluation planning
• Design and implement the evaluation properly
• Discuss the draft report with clients
• Organize effective dissemination meetings
# 7 Relevance
 The evaluation does not address the priority information needs of clients
 Much of the information is not considered useful
 The information is not analyzed and presented in the way that clients want:
• Too detailed
• Too general
# Factors external to the evaluation affecting utilization
 Problems with the evaluation system
 Dissemination mechanisms
 Political ethics (attitudes to transparency)
 Client's lack of long-term vision
 Government perception of evaluation
3. Examples of evaluation utilization: the World Bank "Influential Evaluations" study
How are evaluations used? When are they influential?
1. The evaluation is never the only factor: how does the evaluation complement other sources of information and advice?
2. Political cover for difficult decisions
3. Identifying "winners" and "losers" and showing how negative impacts can be mitigated
4. Credibility and perceived independence of the evaluator may be critical
[continued next page]
5. The big picture: helping decision-makers understand the influence of the social, economic and political context
6. Helping managers understand how political and other pressures limit project access to certain groups
7. Providing new knowledge or understanding
8. Catalytic function: bringing people together or forcing action
Types of influence that evaluations can have
1. India: Employment Assurance
• Broader interagency perspective helped identify duplications and potential cost savings
• Evaluation Office had high-level access to the Planning Commission
2. India: Citizen Report Cards
• Alerting management to service problems
• Providing quantitative data to civil society pressure groups
3. Indonesia: Village Water Supply
• Making policy-makers aware of the importance of gender issues and participatory approaches
[continued next slide]
4. Large Dams
• Created political space for introducing new social and environmental criteria for evaluating dams
• Launched a dialogue that facilitated creation of the World Commission on Dams
5. Pakistan: Wheat Flour Ration Shops
• Political cover for a sensitive political decision
• Showed how to mitigate negative consequences
[continued next page]
6. Uganda: Education expenditures
• Developed a methodology to document what everyone suspected (expenditure wastage)
• Provided documentation for civil society to press for improvements
7. Bulgaria: Metallurgical Project
• Alerting borrowers and the Development Bank to new EU legislation
• Showing how to avoid fines
• Showing how to advance the launch of mineral production
[continued next page]
8. China: Forestry Policy
• Legitimized questioning the logging ban
• Promoted more in-depth policy research
• Facilitated creation of the Forestry Task Force
What difference did the evaluation make?
1. Major cost savings (India, Bulgaria, Pakistan)
2. Increased financial benefits (Uganda, Bulgaria)
3. Forced action (Bangalore, Uganda)
4. Strengthened gender and participatory planning and management of water (Indonesia)
5. Introduced social assessment of dams but discouraged future investments (Dams)
[Continued next slide]
6. Increased efficiency of service delivery (India, Bangalore, Indonesia)
7. Facilitated creation of important policy agencies (Dams, China)
4. Ways to strengthen evaluation utilization
Ways to strengthen evaluation utilization
# 1. Deciding what to evaluate
# 2. Timing:
• When to start
• When to present the findings
# 3. Deciding how to evaluate
• Choosing the right methodology
# 4. Ensuring effective buy-in
• Stakeholder analysis and building alliances
• The importance of the scoping phase
• Formative evaluation strategies
• Constant communication with clients
# 5. Evaluation capacity building
# 6. Deciding what to say [see next section]
# 7. Deciding how to say it [see following section]
• Effective communication strategies
# 8. Developing a follow-up action plan
# 6. Deciding what to say
 Technical level
 Amount of detail
 Focus on a few key messages
 Target messages to key audiences
Sources of lessons about a program
 Evaluation findings
 Experience of practitioners
 Feedback from program participants
 Expert opinion
 Cross-discipline connections and patterns
 Strength of linkages to outcomes
Identifying evaluation lessons and generating meaning
 Tactics for generating meaning (Handout 1)
 Identifying high-quality lessons (Handout 2)
# 7. Presenting the message
1. Communication style and choice of media (Handout 3)
2. Focus the report on intended users
3. Quantitative and qualitative communication styles (Handout 4)
4. The client's preferred communication style (Handout 5)
5. Making claims
6. The importance of graphics
7. Who receives the evaluation report and who is invited to comment
If time permits ...
More detailed discussion on presenting the message
Presenting the message
1. Communication style and choice of media
2. Utilization-focused reporting principles
3. Quantitative and qualitative communication styles
4. The client's preferred communication style
5. Rules for written reports
6. Making claims
7. The importance of graphics
8. Who receives the report and who is invited to comment
1. Communication style and choice of media
 Continuous communication throughout the evaluation
• "No surprises"
• Educating the client on how to think about evaluation
 Short versus long reports
 Combining verbal and written presentations
 Slide shows
 Informal versus formal
Communication style ... continued
 Alternative media
• Internet
• Video
• Theater
• Dance
• Murals/paintings
• Posters
• Signs
Communication style ... continued
 Using the right language
• Making sure the written report is available in stakeholder languages
• Economical ways to translate
 Personal testimony
 Project visits
 Working with the mass media
2. Utilization-focused reporting principles
 Be intentional and purposeful about reporting
 Focus reports on primary intended users
 Avoid surprising stakeholders
 Think positive about negatives
 Distinguish dissemination from use
Source: Patton (1997), pp. 330-337
3. Quantitative and qualitative communication styles
 Tables versus text
 Types of evidence:
• Statistical analysis
• Case studies
• Stories
• Photos
• Boxes
• Site visits
• Testimonials
4. The client's preferred communication style
 Written and/or verbal
 Quantitative/qualitative
 Multiple presentations to different audiences
Understanding the communication style of the decision-maker
 High intellectual level: Robert McNamara, Elliot Richardson
"These two individuals were perfectly capable of understanding the most complex issues and absorbing details – absorbing the complexity, fully considering it in their own minds."
Lawrence Lynn, Professor of Public Administration, the Kennedy School; quoted in Patton (1997), pp. 58-59.
Decision-makers' communication styles ... continued
 Short personal stories: Ronald Reagan
Preferred Reader's Digest-type personal stories and anecdotes.
Decision-makers' communication styles ... continued
 Political animals: Joseph Califano [U.S. Secretary of Health, Education and Welfare]
"Califano is a political animal and has a relatively short attention span – highly intelligent, but an action-oriented person." The problem his political advisers had is that they tried to educate him in the classical rational way, without reference to any political priorities, or presenting alternatives that would appeal to a political, action-oriented individual.
Source: Patton (1997), p. 59.
5. Rules for written reports
 Show, don't tell
 Be brief and clear
 Build the story with paragraphs
 Write clear sentences
 Omit needless words
 Avoid jargon
[Vaughan and Buss]
Reporting results
 Arranging data for ease of interpretation
• Focusing the analysis
 Simplicity in data presentation
 Interpretation and judgment
 Making claims [see next slide]
 Useful recommendations
 A futures perspective on recommendations
Source: Patton (1997), pp. 321-324
6. Making claims
Claims can be assessed on two dimensions:
• Importance of claims: major or minor
• Rigor of claims: strong or weak
Characteristics of claims of major importance
 Involves having an impact
 Deals with an important social problem
 Affects large numbers of people
 Saves money and/or time
 Enhances quality
 Shows something can really be done about a problem
 Involves a model or approach that could be replicated
Characteristics of strong claims
 Valid, believable evidence
 Data covers a long period of time
 Claim is about a clear intervention with solid documentation
 About clearly specified outcomes and impacts
 Includes comparisons to program goals, over time, with other groups, with general trends or norms
Strong claims ... continued
 Evidence for claims includes replication:
• More than one site
• More than one staff member obtains outcomes
• Same results from different cohort groups
• Different programs obtained comparable results using the approach
 Based on more than one kind of evidence
 Clear, logical linkages between intervention and claimed outcomes
 Evaluators independent of staff
7. The importance of graphics
 Line graphs: trends over time
 Pie charts: parts of a whole
 Cluster bar charts: comparing several items
 Combination charts: bar chart and trend line
*** Don't overload graphs ***
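As an illustration of the last chart type, a minimal sketch of a combination chart, bars plus a trend line, in Python (matplotlib assumed available; the coverage figures are hypothetical):

# Illustrative only: hypothetical coverage figures, not from any of the case studies.
import matplotlib.pyplot as plt

years = [2003, 2004, 2005, 2006, 2007]
coverage = [42, 48, 55, 61, 70]  # e.g. share of households covered by a service (%)

fig, ax = plt.subplots()
ax.bar(years, coverage, color="steelblue", label="Coverage (%)")      # bars: yearly values
ax.plot(years, coverage, color="darkred", marker="o", label="Trend")  # line: trend over time
ax.set_xlabel("Year")
ax.set_ylabel("Households covered (%)")
ax.set_title("Combination chart: bars with a trend line")
ax.legend()
plt.show()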
8. Who receives the evaluation report and who is invited to comment?
 Restricted to a few key decision-makers and managers
 Participatory presentation/consultation with stakeholders
 Public comment
• Public hearings
• Via civil society
 Copies available to the press