Guidance for Uncertainty Scanning and Assessment at RIVM
Jeroen van der Sluijs, James Risbey, Penny Kloprogge
(Copernicus Institute, Utrecht)
Jerry Ravetz (RMC, London)
Silvio Funtowicz, Serafin Corral, Ângela Pereira (JRC, Ispra)
Bruna de Marchi
Rob Hoppe, Simone Huijs (Fac. Public Policy, Twente Univ.)
Marjolein van Asselt (ICIS, Maastricht)
Peter Jansen, Arthur Petersen, Anton van der Giessen (RIVM)
RIVM learning process
• <1999 Innovative methodological R&D&D on uncertainty assessment and management (e.g. TARGETS)
• 1999 De Kwaadsteniet affair
Fact sheets
• 1999-2000 National review MB/MV
• 2000 International audit
• >2000 Multi-disciplinary project
• Development of Guidance (“Leidraad”)
Goals
• Structured and transparent approach that facilitates awareness, identification, and incorporation of uncertainty
• May not reduce uncertainties, but provides a means to assess their potential consequences and to avoid the pitfalls of ignoring or being ignorant of uncertainties
• Guidance for the proper use of uncertainty tools and help against their misuse
• Provide useful uncertainty assessments (robust knowledge)
• Facilitate effective communication on uncertainties in terms of robustness of knowledge
• Fit RIVM's specific role in the decision-analytic cycle
Ingredients
• Typology of uncertainties
• Quick-scan
• Analytic checklist
• Toolbox
• Procedure for selection/tuning of tools
• Good practice guidelines
• Glossary
Sorts of uncertainty
• Inexactness (technical)
• Unreliability (methodological)
• Ignorance (epistemological)
(Funtowicz and Ravetz, 1990)
Locations of uncertainty
• Sociopolitical and institutional context
• System boundary & problem framing
– System boundary
– Problem framing
– Scenario framing (storylines)
• Model/instrument
– Indicators
– Conceptual model structure/assumptions
– Technical model structure
– Parameters
• Inputs
– Scenarios
– Data
Main steps
• Quick scan
• Problem framing
• Process/context assessment (history,
stakeholders, values)
• Communication
• (Assess limitations of) Environmental
assessment methods
• Uncertainty identification and prioritization
• Uncertainty analysis
• Review, evaluation, interpretation
• Reporting
Outputs Quickscan (1)
• Description of the problem
• Gauge of how well assessment tools address
the problem
• List of which uncertainties are salient on the
basis of problem structure
• Indication of whether to involve stakeholders
• Indication of where in the policy life cycle the problem is
• List of stakeholders
• Identification of areas of
agreement/disagreement on value dimensions
Outputs Quickscan (2)
• Prioritized list of salient uncertainties
• Communication plan: when and how to involve which stakeholders
• First indication of appropriate tools to address
uncertainties identified
• Assessment of attainable robustness of results + indication of what it might take to increase robustness
• Assessment of the relevance of results to the
problem
• Pitfalls and hints to facilitate effective
communication of results
Toolbox uncertainty analysis
• Sensitivity Analysis (screening, local, global)
• Error propagation equation (TIER 1)
• Monte Carlo (TIER 2; see the sketch after this list)
• Expert Elicitation
• NUSAP
• Scenario analysis
• Extended Quality Assurance (pedigree scheme)
• PRIMA
• Checklist model quality assistance
(see www.nusap.net)
• …...
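Note: TIER 1 and TIER 2 refer to analytic error propagation and Monte Carlo sampling, respectively. As a minimal illustration (not part of the Guidance itself), the sketch below propagates uncertainty through a hypothetical model emission = activity * emission_factor, comparing the first-order error propagation formula with a Monte Carlo estimate; all numbers and variable names are invented for the example.

```python
import numpy as np

# Hypothetical model (illustrative only): emission = activity * emission_factor
activity_mean, activity_sd = 100.0, 10.0   # activity data, arbitrary units
factor_mean, factor_sd = 2.5, 0.5          # emission factor, arbitrary units

# TIER 1: first-order error propagation.
# For y = f(x1, x2) with independent inputs,
# sigma_y^2 ~= (df/dx1)^2 * sigma_1^2 + (df/dx2)^2 * sigma_2^2;
# for a product y = x1 * x2 the partial derivatives are x2 and x1.
emission_mean = activity_mean * factor_mean
emission_sd_tier1 = ((factor_mean * activity_sd) ** 2 +
                     (activity_mean * factor_sd) ** 2) ** 0.5

# TIER 2: Monte Carlo propagation of the same two input distributions.
rng = np.random.default_rng(1)
n = 100_000
activity = rng.normal(activity_mean, activity_sd, n)
factor = rng.normal(factor_mean, factor_sd, n)
emission = activity * factor

print(f"TIER 1: mean = {emission_mean:.1f}, sd = {emission_sd_tier1:.1f}")
print(f"TIER 2: mean = {emission.mean():.1f}, sd = {emission.std():.1f}, "
      f"95% interval = [{np.percentile(emission, 2.5):.1f}, "
      f"{np.percentile(emission, 97.5):.1f}]")
```

Besides a mean and standard deviation, the Monte Carlo run yields a full output distribution (here summarized by a 95% interval), which is what distinguishes it as the higher tier.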
Toolbox
For each tool:
• Main purpose and use
• What sorts and locations of uncertainty are
addressed?
• Required resources
• Strengths and limitations
• Guidance on the application and hints on
complementarity with other tools
• Pitfalls
• References (handbooks, user-guides, web resources,
example studies, experts)
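To illustrate how a toolbox entry with these fields might be structured, here is a small sketch; the field names paraphrase the list above, and the example content for sensitivity analysis is illustrative, not the official Guidance text.

```python
from dataclasses import dataclass, field

@dataclass
class ToolEntry:
    """One toolbox entry; the fields mirror the list on the slide above."""
    name: str
    main_purpose: str
    sorts_addressed: list[str]        # inexactness / unreliability / ignorance
    locations_addressed: list[str]    # context, framing, model, inputs, ...
    required_resources: str
    strengths: list[str]
    limitations: list[str]
    application_hints: str
    pitfalls: list[str]
    references: list[str] = field(default_factory=list)

# Illustrative content only, not the official Guidance text.
sensitivity_analysis = ToolEntry(
    name="Sensitivity Analysis (screening, local, global)",
    main_purpose="Rank model inputs by their influence on key outputs",
    sorts_addressed=["inexactness"],
    locations_addressed=["parameters", "inputs"],
    required_resources="A runnable model; more computing time for global methods",
    strengths=["Shows which input uncertainties matter most for the output"],
    limitations=["Says little about unreliability or ignorance"],
    application_hints="Screen first, then apply global methods to the shortlist",
    pitfalls=["Varying one input at a time misses interactions"],
    references=["www.nusap.net"],
)
```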
Mapping Toolbox to typology
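The slide at this point presumably showed a matrix relating the tools to the sorts and locations of uncertainty introduced earlier; only the title survives in this transcript. Purely to illustrate the idea, such a mapping could be encoded as follows (the assignments shown are illustrative examples, not the Guidance's actual matrix).

```python
# Illustrative tool-to-typology mapping; the Guidance contains the actual matrix.
TOOL_TYPOLOGY = {
    "Sensitivity Analysis": {"sorts": ["inexactness"],
                             "locations": ["parameters", "inputs"]},
    "Monte Carlo":          {"sorts": ["inexactness"],
                             "locations": ["parameters", "inputs"]},
    "Expert Elicitation":   {"sorts": ["inexactness", "unreliability"],
                             "locations": ["parameters", "model structure"]},
    "NUSAP":                {"sorts": ["unreliability", "ignorance"],
                             "locations": ["parameters", "problem framing"]},
}

def tools_for(sort_of_uncertainty: str) -> list[str]:
    """Return the tools that address a given sort of uncertainty."""
    return [tool for tool, entry in TOOL_TYPOLOGY.items()
            if sort_of_uncertainty in entry["sorts"]]

print(tools_for("ignorance"))   # ['NUSAP']
```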
Review, synthesis and evaluation
• Synthesise quantitative & qualitative
results
• Revisit problem and assessment steps
• Frame findings in terms of robustness
of the environmental assessment
concerned
• Relevance of results to the problem
• Discuss implications of findings for
different settings of burden of proof
Reporting
• Context of communication
• Who are the target audiences?
• Language
• Method and style
• Content
Further work
• Expert Review
• User review
• Web-tool
• Demonstration on cases
• Training