Climate Change:
Moonshine, Millions of Models, & Billions of Data
New Ways to Sort Fact from Fiction
Bruce Wielicki
NASA Langley Research Center
June 5, 2008
VIMS Lecture
Outline
• Is that earthshine or moonshine?
• Fact vs fiction in climate change?
• How would you know an accurate climate model if you met one?
Climate System Energy Balance (Kiehl & Trenberth, 1997)
IPCC 2007 Climate Science Report
• Cloud feedback remains the largest climate sensitivity uncertainty: a factor of 3 in sensitivity by 2100.
  – Low clouds are the dominant problem.
  – For low clouds, albedo change dominates cloud feedback.
• The aerosol indirect effect on cloud remains the largest radiative forcing uncertainty: a factor of 3 in current radiative forcing.
• Climate model predictions of global temperature change don't diverge between high- and low-sensitivity climate systems until after 2040 to 2050.
  – But there is a large temperature difference by 2100 between low- and high-sensitivity systems.
  – Why? More sensitive models store heat in the ocean more rapidly, so their response is more delayed than that of less sensitive models.
  – We need to determine cloud feedback before the temperature record reveals the climate sensitivity (2040 to 2050 is too late).
Amount of change for a factor of 6 in climate model sensitivity (2 K to 12 K for doubling CO2)
• Dynamics variables: not very sensitive
• Cloud, radiation, and sea ice variables: very sensitive
• Weather = dynamics; Climate = energetics
• Need climate change OSSEs and climate observation requirements
Murphy et al., Nature, 2004
Global Dimming: Wild et al., 2004 GRL
"Several studies indicate that incident shortwave radiation at land surfaces has
significantly decreased between 1960 and 1990. Despite this, land temperature
has increased by over 0.4C over the same period."
[Figure: SW reflected flux anomaly; global albedo anomaly]
Earthshine: Palle et al., Science, May 2004
"These large variations in reflectance imply climatologically significant cloud-driven changes in Earth's radiation budget, which is consistent with the large tropospheric warming that has occurred over the most recent decades."
How large are these SW Flux Changes?
• Solar constant = 1365 Wm-2
• Global average TOA insolation = 1365/4 = 341 Wm-2
• Global average reflected SW flux ~ 100 Wm-2; albedo ~ 30%
• Global SW atmospheric absorption ~ 20%; surface insolation ~ 50%
• Global net energy balance: 341 (insolation) = 100 (SW) + 240 (LW)
• Anthropogenic radiative forcing (IPCC) ~ 0.6 Wm-2 per decade
• A low cloud feedback of 50% (SW dominated) ~ 0.3 Wm-2 per decade
• That is a relative change of 0.3%, or 1/20th of the earthshine/dimming changes
• 6 Wm-2 changes are HUGE: a factor of 10 larger than anthropogenic forcing
• Climate skeptics breathe a sigh of relief: global warming is lost in the noise
• Can natural variability really be this large? Fact or fiction?
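The back-of-envelope numbers on this slide can be checked in a few lines. The values below are the approximate figures quoted above; nothing else is assumed.

```python
# Back-of-envelope energy balance using the slide's approximate values.
solar_constant = 1365.0            # Wm-2
insolation = solar_constant / 4.0  # global average TOA insolation, ~341 Wm-2
reflected_sw = 100.0               # global average reflected SW flux, Wm-2
albedo = reflected_sw / insolation        # ~30%
outgoing_lw = insolation - reflected_sw   # ~240 Wm-2 emitted LW at balance

# Scales being compared on the slide:
anthro_forcing = 0.6                   # Wm-2 per decade (IPCC)
cloud_feedback = 0.5 * anthro_forcing  # 50% low-cloud feedback, Wm-2/decade
earthshine_change = 6.0                # Wm-2 implied by the earthshine data

print(f"insolation  = {insolation:.0f} Wm-2")
print(f"albedo      = {albedo:.0%}")
print(f"outgoing LW = {outgoing_lw:.0f} Wm-2")
print(f"feedback signal = {cloud_feedback / reflected_sw:.1%} of reflected SW")
print(f"earthshine change = {earthshine_change / cloud_feedback:.0f}x the feedback signal")
```

The last line is the point of the slide: the 6 Wm-2 earthshine change is 20 times the expected cloud-feedback signal, so one of the two numbers cannot be right.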
[Figure: SeaWiFS vs CERES comparison; mean difference 0.21 Wm-2]
Shows consistent calibration stability at < 0.3 Wm-2 per decade (95% confidence).
Unfortunately this only works for the tropical mean ocean (narrowband vs broadband issues).
Regional trends differ by +2 to -5 Wm-2/decade, SeaWiFS vs CERES.
Loeb et al., 2007, J. Climate
CERES Shortwave TOA Reflected Flux Changes: Ties to Changing Cloud Fraction
• Tropics drive global albedo variations: the global signal is in phase with the tropics and 1/2 the magnitude.
• Cloud fraction variations are the cause (not optical depth).
Unscrambling climate signal cause and effect requires a complete parameter set at climate accuracy, e.g. for forcing/response energetics: radiation, aerosol, cloud, land, snow/ice, temperature, humidity, precipitation.
Using CERES to Determine Length of Climate Data Record Needed to Constrain Cloud Feedback
[Figure annotation: half of the anthropogenic forcing of 0.6 Wm-2/decade]
Given climate variability, 15 to 20 years are required to first detect climate trends at the cloud feedback level with 90% confidence, and 18 to 25 years to constrain climate sensitivity to +/- 25%.
Loeb et al., 2007, J. Climate
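The "15 to 20 years" figure can be motivated with the standard trend-detection formula of Weatherhead et al. (1998), which is a common approach for this kind of estimate (the talk does not state its exact method). The noise level and autocorrelation below are illustrative assumptions, not values from the talk.

```python
import math

def years_to_detect(trend_per_year, sigma_noise, phi):
    """Weatherhead et al. (1998) estimate of the record length (years)
    needed to detect a linear trend, given the standard deviation and
    lag-1 autocorrelation of the natural variability."""
    return (3.3 * sigma_noise / abs(trend_per_year)
            * math.sqrt((1 + phi) / (1 - phi))) ** (2.0 / 3.0)

# Cloud-feedback-level signal from the talk: ~0.3 Wm-2 per decade.
trend = 0.3 / 10.0  # Wm-2 per year
sigma = 0.5         # Wm-2 interannual variability (assumed)
phi = 0.2           # lag-1 autocorrelation (assumed)
print(f"record needed: ~{years_to_detect(trend, sigma, phi):.0f} years")
```

With these assumed noise parameters the formula lands in the 15-to-20-year range quoted on the slide; larger variability or stronger autocorrelation pushes the required record out further.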
How well can we pull climate records from meteorological satellite data like ISCCP from geostationary orbit?
• Geo calibration and sampling errors dominate interannual signals.
• Uncertainty in geo trends is a factor of 10 larger than the climate goal: can we learn how to improve past data sets?
Loeb et al., 2007, J. Climate
Annual Mean Global SW TOA Flux Anomaly (Earthshine versus CERES: 2000 to 2004)
• Earthshine data imply a large change of 6 Wm-2 in global reflected SW flux: is the Earth's albedo changing? (Palle et al., Science, 2004)
• CERES shows an order of magnitude less variability than earthshine.
• The earthshine approach is incapable of capturing changes in global albedo at climate accuracy.
Loeb et al., 2007, GRL
"Global Dimming": is it real?
What about new CERES fusion satellite surface fluxes?
ARM/BSRN/CMDL/Surfrad Surface Radiation Sites
Surface SW Flux Validation Noise
• Spatial mismatch of a surface point to a satellite footprint area.
• Error decreases as simple 1/sqrt(N) random noise, but it takes 20 sites over 1 year, i.e. 10,000 samples, to reach 1 Wm-2.
(Wielicki, AMS Radiation Conference, 2006)
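The 1/sqrt(N) averaging-down claim is easy to sketch. The ~100 Wm-2 single-comparison mismatch noise below is an assumption, sized so that 10,000 samples reach the slide's 1 Wm-2; it is not a number from the talk.

```python
import math

import numpy as np

# Point-to-area mismatch treated as independent random noise: averaging
# N comparisons shrinks the error as 1/sqrt(N). The ~100 Wm-2 single-
# comparison noise is an assumption sized so that 10,000 samples (about
# 20 sites over 1 year) reach the slide's 1 Wm-2.
sigma_single = 100.0  # Wm-2 (assumed)
for n in (100, 1_000, 10_000):
    print(f"N={n:6d}: expected error ~ {sigma_single / math.sqrt(n):5.1f} Wm-2")

# Empirical check: means of N=10,000 noisy samples scatter at sigma/sqrt(N).
rng = np.random.default_rng(0)
means = rng.normal(0.0, sigma_single, size=(200, 10_000)).mean(axis=1)
rms = float(np.sqrt((means ** 2).mean()))
print(f"empirical rms of the N=10,000 means: {rms:.2f} Wm-2")
```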
CERES Surface Fluxes vs Surface Sites: Interannual Anomalies Consistent at 0.2% or 0.3 Wm-2
Global satellite sampling of radiation fields remains key: regional variability (climate noise) is very large, 10 times the global forcing of 0.6 Wm-2/decade, even when averaging 40 dispersed surface sites. Result from the GEWEX Radiative Flux Assessment (in progress).
Early Cloud Feedback Signals in the Arctic from CERES Data
Seiji Kato and the CERES Science Team
[Figure: mean cloud fraction at Barrow, AK; trends derived from Terra and Aqua data over the Arctic; linear fit to Terra; some missing days.]
• CERES: derived from MODIS radiances by the CERES cloud algorithm.
• Radar: derived from an ARM cloud radar.
• Lidar: derived from a micro-pulse lidar and a Vaisala ceilometer.
• Error bars and dashed lines indicate the maximum and minimum during 4 years.
• Snow/sea-ice fraction changed at a rate of 0.064 per decade (significant at an 80% confidence level).
• Cloud fraction changed at a rate of 0.047 per decade (significant at an 80% confidence level).
• Albedo change is insignificant at an 80% confidence level.
From Kato, S., N. G. Loeb, P. Minnis, J. A. Francis, T. P. Charlock, D. A. Rutan, E. E. Clothiaux, S. Sun-Mack, 2006:
Seasonal and Interannual Variations of Top-of-Atmosphere Irradiance and Cloud Cover over Polar Regions Derived
from the CERES Data Set, Geophys. Res. Lett., 33, L19804, doi:10.1029/2006GL026685.
Ocean Heat Content and Net Radiation
Changes in ocean heat content should equal the radiative flux anomalies.
Ocean cooling? Lyman et al., Science, 2006
• Net radiation: no
• Altimeter sea level: no
• GRACE ice sheets: no
1992 to 2003 data from Wong et al., J. Climate, 2006
Possible causes of the 2004/5 drop in ocean heat storage:
• Transition from XBT to ARGO ocean in-situ data and sampling?
• Cooling in the upper 750 m of the ocean, but larger heating deeper?
• Unmeasured heating under sea ice?
• The answer: a warm bias in XBTs (dominant pre-2002) and a cold bias in ARGO (dominant post-2002). With the biases corrected there is no cooling in 2004/5: no mystery.
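The bookkeeping behind "ocean heat content changes should equal radiative flux anomalies" is a unit conversion. A sketch, using standard approximate constants (Earth's surface area, seawater heat capacity) and the talk's 0.6 Wm-2/decade forcing scale; the assumption that the heat is deposited uniformly in the upper 750 m is purely illustrative.

```python
# Convert a global TOA radiative imbalance (Wm-2) into the ocean heat
# uptake it implies, assuming most of the excess heat ends up in the
# ocean. Constants are standard approximations.
EARTH_AREA = 5.1e14         # m2, surface area of the Earth
SECONDS_PER_YEAR = 3.156e7
OCEAN_FRACTION = 0.71
RHO_CP = 4.1e6              # J m-3 K-1, volumetric heat capacity of seawater

imbalance = 0.6             # Wm-2, order of the forcing quoted in the talk

joules_per_year = imbalance * EARTH_AREA * SECONDS_PER_YEAR
print(f"implied heat uptake: {joules_per_year:.1e} J/yr")

# If deposited uniformly in the upper 750 m (the XBT/ARGO layer above):
layer_volume = OCEAN_FRACTION * EARTH_AREA * 750.0
warming_per_decade = 10 * joules_per_year / (RHO_CP * layer_volume)
print(f"upper-750 m warming: {warming_per_decade:.3f} K/decade")
```

A tenth-of-a-degree-per-decade signal in the upper ocean is why small instrument biases (warm XBT, cold ARGO) were enough to create an apparent cooling.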
Climate Models: Uncertainty, Observations,
and OSSEs
Model Runs provided by ClimatePrediction.Net
research group at Oxford Univ., Myles Allen, David Stainforth
Model Run Analysis: Yong Hu and B. Wielicki
Ongoing discussions/debates with:
Dave Randall, Graeme Stephens, Tom Ackerman, Bill Collins, Kevin
Trenberth, Leo Donner, Christian Jacob, Bill Rossow, Phil Rasche,
Don Anderson, Max Suarez
Summer 2005 Gordon Conference on Climate
• What data do climate models need?
• When are climate models accurate enough?
• What are the climate observing system requirements?
– variables?
– time scales? spatial scales?
– priority for cost/benefit?
• How do we relate model & observation improvements
to narrowing uncertainty in climate prediction?
• Why can’t we get resources to attack the tough
problems?
Climate OSSEs
• How do we determine decadal climate prediction accuracy?
  – IPCC uses differences in model predictions: basically a preponderance of circumstantial evidence.
  – The IPCC AR4 draft admits there is no current way to relate errors in prediction of past climate to uncertainty in future prediction.
  – There is no current way to relate model improvements in predicting past climate states (e.g. 1900-2000) to prediction uncertainty.
  – This is not the case for weather prediction: why? There are 1000s of days of weather, but not 1000s of decades of climate, to test against.
• How do we determine climate observing system requirements?
  – For global decadal change we can use anthropogenic forcing, response, and feedbacks: e.g. Ohring et al., 2005, BAMS.
  – There is no known method for ANY other time/space scales (regional, zonal, seasonal, diurnal, etc.), which are critical for designing observations.
  – This is not the case for weather prediction: OSSEs exist.
  – How can we develop physically rigorous climate prediction OSSEs?
Stainforth et al.,
2005, Nature
Amount of change for a factor of 6 in climate model
sensitivity, by climate variable: clouds dominate
Murphy et al., Nature, 2004
How do we relate model errors versus observations to climate prediction uncertainty?
• PPE Planet “I” vs Planet “J” is a simulation of Climate Model vs Earth.
• Each planet is a different Earth-like physical climate system.
• The planets have physics which differ in the processes we don't understand.
• Can we use these planets' past climate differences to predict the differences in their future climates?
• This is in fact the same as asking whether we can use past climate differences between Earth and a climate model to predict the uncertainty in predicting the future climate of Earth.
• If this is not successful, we cannot predict our uncertainty in predicting the real Earth until after the fact.
Neural Net Structure
Input variables (Planet “I” minus Planet “J”, base state CO2 climate): TOA SW flux, TOA LW flux, total cloud fraction, convective cloud fraction, total precipitation, large-scale snowfall, large-scale rainfall, surface latent heat flux, surface net SW flux, surface net LW flux, surface net radiation
→ Neural Network →
Output variables (Planet “I” minus Planet “J”, 2xCO2 minus 1xCO2): surface temperature, summer U.S. precipitation, sea level, etc.
Neural Net Prediction of Climate Sensitivity
[Figure: Planet “I” minus Planet “J” doubled-CO2 global temperature change, neural net prediction vs actual; 95% confidence bound of +/- 0.8 C; 33 climate model variables.]
Neural net prediction of the doubled-CO2 global temperature change (uses the Planet I and J normal-CO2 climates only).
Y. Hu, B. Wielicki, M. Allen
Linear Regression Prediction of Climate Sensitivity
[Figure: Planet “I” minus Planet “J” doubled-CO2 global temperature change, linear regression prediction vs actual; 95% confidence bound of +/- 2.0 C, versus +/- 0.8 C for the neural net; 33 climate model variables.]
Prediction of the doubled-CO2 global temperature change (uses the Planet I and J normal-CO2 climates only!).
Y. Hu, B. Wielicki, M. Allen
Neural Net Results vs. Number of Variables
• Doubled-CO2 global temperature uncertainty (1σ):
  – 33 variables: 0.41 K
  – 11 variables: 0.66 K
  – 4 variables: 0.89 K
• The four variables with the largest constraint on climate sensitivity:
  – Top-of-atmosphere shortwave reflected flux
  – Total cloud fraction
  – Convective cloud fraction
  – Total precipitation
• The neural net is roughly 2.5 times more accurate than multiple linear regression.
Y. Hu, B. Wielicki, M. Allen
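The perturbed-physics emulation idea — train a statistical model on pairs of base-state differences and test how well it predicts the pairs' doubled-CO2 warming difference — can be sketched on synthetic data. Everything below (the 11 inputs, the nonlinear relation, the noise level) is invented for illustration; only the comparison of a neural net against multiple linear regression mirrors the slides.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for CPDN "planet pairs": 11 base-state difference
# variables per pair; the target mimics a nonlinear dependence of the
# doubled-CO2 warming difference on those variables (invented relation).
n_pairs, n_vars = 2000, 11
X = rng.normal(size=(n_pairs, n_vars))
y = np.tanh(X[:, 0]) + 0.5 * X[:, 1] * X[:, 2] + 0.1 * rng.normal(size=n_pairs)

X_tr, X_te, y_tr, y_te = X[:1500], X[1500:], y[:1500], y[1500:]

nn = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                  random_state=0).fit(X_tr, y_tr)
lin = LinearRegression().fit(X_tr, y_tr)

def rmse(model):
    return float(np.sqrt(np.mean((model.predict(X_te) - y_te) ** 2)))

print(f"neural net RMSE:        {rmse(nn):.2f}")
print(f"linear regression RMSE: {rmse(lin):.2f}")
# The net captures the interaction term that linear regression cannot,
# echoing the neural net's accuracy advantage reported on the slide.
```

Adding noise with a given bias and σ to the columns of X before training reproduces, in miniature, the observation-error experiment on the following slides.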
Climate OSSEs: Neural Net Structure with Observation Error
Input variables (Planet “I” minus Planet “J”, base state CO2 climate), each perturbed by an observation error (a bias and a σ): TOA SW flux, TOA LW flux, total cloud fraction, convective cloud fraction, total precipitation, large-scale snowfall, large-scale rainfall, surface latent heat flux, surface net SW flux, surface net LW flux, surface net radiation
→ Neural Network →
Output variables (Planet “I” minus Planet “J”, 2xCO2 minus 1xCO2): surface temperature, summer U.S. precipitation, sea level, etc.
The difference in neural net performance with and without observation errors isolates the effect of observation error on constraining climate uncertainty.
Effect of Observation Error on Neural Net Prediction Accuracy (2xCO2, Deg C)
If there is no observation constraint: σ = 1.5 K
Base state climate differences are used to predict (no decadal change information is used); error is specified as a percentage of the mean 2xCO2 change for each variable.
[Table: 1σ prediction accuracy for 33, 11, 4, 2, and 0 variables at observation errors of 0%, 10%, 25%, 50%, and 100% of the mean doubled-CO2 change; accuracy degrades from about 0.6 Deg C with many variables at low error to 1.57 Deg C with no observation constraint.]
Effect of Observation Error on Neural Net Prediction Accuracy (2xCO2, Deg C)
Climate change differences are used to predict (no base state information is used); all variables trivially related to Ts (e.g. PW) are avoided; error is specified as a percentage of the mean 2xCO2 change for each variable.
[Table: 1σ prediction accuracy for 11, 6, 2, and 0 variables at observation errors of 0%, 10%, 30%, 50%, and 100% of the mean doubled-CO2 change; accuracy degrades from about 0.1 Deg C with many variables at low error to about 1.2 Deg C with no observation constraint.]
Effect of Observation Error on Neural Net Prediction Accuracy (2xCO2, Deg C)
Climate change differences are used to predict (no base state information is used); all variables trivially related to Ts (e.g. PW) are avoided. This test applies the CPDN-trained neural net to the IPCC runs; error is specified as a percentage of the mean 2xCO2 change for each variable.
[Table: 1σ prediction accuracy for 6, 2, and 0 variables at observation errors of 0%, 10%, 30%, 50%, and 100% of the mean doubled-CO2 change; values range from 0.33 Deg C at low error to about 1.0 Deg C with no observation constraint.]
The 6 variables: TOA SW all-sky & clear-sky, TOA LW, precipitation, snowfall, surface SW down.
Next Steps:
• More complete metrics than simple global means
  – e.g. Manabe seasonal cloud forcing
  – e.g. recent seasonal temperature metrics: early results indicate they don't work for the PPE set
• Improve neural net accuracy with an array gate computer: greatly increases the complexity allowed (number of metrics)
• Use more realistic CPDN simulations:
  – coupled ocean/atmosphere (now 1000 runs)
  – realistic time-varying forcings
  – realistic decadal change
  – realistic decadal climate noise
Another View of Climate OSSEs
• Respond to the papers in Science, October 2007:
  – "Why Is Climate Sensitivity So Unpredictable?" by Gerald Roe and Marcia Baker
  – "Call Off the Quest" by Myles Allen and David J. Frame
• Demonstrate that the framework presented here can be used to define measurement accuracy requirements for cloud and other feedbacks, and for measurements like:
  – CERES (Clouds and the Earth's Radiant Energy System) instruments, which measure broadband radiation with the accuracy to detect changes in broadband solar, infrared, and net radiative energy globally, including the effect of clouds on radiation.
  – CLARREO, a future mission which will allow climate-change-accuracy calibration of a wide range of solar-reflected and thermal-emitted satellite sensors ("NIST in orbit").
CLARREO / CERES Science Connections:
Climate Sensitivity Uncertainty
The Climate Feedback System
Reducing uncertainty in predictions of ΔT is critical for public policy, since changes in global surface temperature drive changes in sea level and precipitation.
CLARREO / CERES Science Connections: Climate Sensitivity Uncertainty
Uncertainty in Feedback Defines Climate Sensitivity Uncertainty
[Figure: climate sensitivity distribution vs feedback factor f; the skewed tail of high climate sensitivity is inevitable in a feedback system; IPCC mean sensitivity and 2σ range marked.]
CLARREO / CERES Science Connections: Climate Sensitivity Uncertainty
IPCC Climate Feedback Uncertainty
[Bar chart: feedback factor uncertainty (2σ), current IPCC (Soden & Held 2006): Total 0.26, Cloud 0.24, Water Vapor + Lapse Rate 0.09, Surface Albedo 0.05.]
The uncertainty in climate feedback is driven by these three components. The feedback for the climate system is f = 0.62 ± 0.26 (2σ).
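The slide's numbers plug directly into the standard feedback relation ΔT = ΔT0/(1 − f) used by Roe and Baker (2007). The only value not on the slide is ΔT0 ≈ 1.2 K, the commonly used no-feedback (blackbody) response to doubled CO2.

```python
# Feedback relation: dT = dT0 / (1 - f). A symmetric uncertainty in f
# maps into a skewed uncertainty in dT: the high-sensitivity tail.
dT0 = 1.2                      # K, no-feedback 2xCO2 response (standard value)
f_mean, f_2sigma = 0.62, 0.26  # Soden & Held 2006, as quoted on the slide

def dT(f):
    return dT0 / (1.0 - f)

lo, mid, hi = dT(f_mean - f_2sigma), dT(f_mean), dT(f_mean + f_2sigma)
print(f"dT at f=0.36: {lo:.1f} K")   # lower 2-sigma bound
print(f"dT at f=0.62: {mid:.1f} K")  # central estimate
print(f"dT at f=0.88: {hi:.1f} K")   # upper 2-sigma bound: the long tail
print(f"asymmetry: +{hi - mid:.1f} K vs -{mid - lo:.1f} K")
```

The upper bound sits several degrees above the central estimate while the lower bound sits only about a degree below it, which is exactly the skewed tail the figure shows.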
CLARREO / CERES Science Connections: Climate Sensitivity Uncertainty
Current Climate Uncertainty
[Figure: ΔT for 2 x CO2 (°C) vs feedback factor f, with the current 2σ feedback uncertainty of 0.26.]
Current measured feedback uncertainties result in large uncertainties in the predicted ΔT (Roe and Baker, 2007). ΔT0 = the Earth's temperature response as a simple blackbody.
CLARREO Requirements Based on Reducing Feedback Uncertainty: Cloud Feedback Example
Reducing Climate Uncertainty Requires a More Accurate Measurement of Feedback
[Figure: ΔT for 2 x CO2 (°C) vs feedback factor f, with a reduced 2σ feedback uncertainty of 0.09.]
High-accuracy measurements of climate change can constrain predictions of ΔT through improved estimates of the feedback. The accuracy requirement is driven by the goal for climate uncertainty reduction.
CLARREO Requirements Based on Reducing Feedback Uncertainty: Cloud Feedback Example
CLARREO Reduces Climate Uncertainty
[Bar chart: feedback factor uncertainty (2σ), current IPCC (Soden & Held 2006) vs with CLARREO: Total 0.26 → 0.09, Cloud 0.24 → 0.08, Water Vapor + Lapse Rate 0.09 → 0.04, Surface Albedo 0.05 → 0.025.]
These feedback uncertainty goals define the climate change observation requirements.
CLARREO Requirements Based on Reducing Feedback Uncertainty: Cloud Feedback Example
Cloud Feedback Uncertainty Goal Defines the Observation Requirement
CLARREO Requirements Based on Reducing Feedback Uncertainty: Cloud Feedback Example
Decadal Trend Observation Requirement
The uncertainty goal for feedback factor f sets the observation goal for Net Cloud Radiative Forcing (CRF) at 1.2 Wm-2/K.
IPCC models predict 0.2 K/decade of warming in the next few decades, independent of sensitivity (because the warming is controlled by the slow ocean response time).
Therefore, the Net CRF observation goal is:
(1.2 Wm-2/K) * (0.2 K/decade) = 0.24 Wm-2/decade
CLARREO Requirements for Cloud Feedback
CLARREO/CERES Calibration Requirement for Measuring Cloud Feedback
The Net CRF observation goal sets the decadal calibration goal:
Net CRF = SW CRF + LW CRF
CRF = clear-sky minus all-sky TOA flux
Shortwave (SW): 0.24/50 = 0.5% (2σ)
Longwave (LW): 0.24/30 = 0.8% (2σ)
This requirement is four times more accurate than the current SW broadband channel absolute accuracy. It requires overlap for current observations (no gap), and/or CLARREO for future observations (a gap is OK).
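The requirement flow-down on these two slides is simple arithmetic. The mean SW and LW CRF magnitudes of ~50 and ~30 Wm-2 are the divisors implied by the slide; they are typical global-mean CRF values, not numbers stated explicitly.

```python
# Flow-down from the feedback goal to the calibration requirement.
crf_sensitivity_goal = 1.2  # Wm-2 per K, Net CRF observation goal
warming_rate = 0.2          # K per decade, near-term IPCC warming

crf_goal = crf_sensitivity_goal * warming_rate  # Wm-2 per decade
print(f"Net CRF observation goal: {crf_goal:.2f} Wm-2/decade")

# Decadal calibration goals, relative to typical CRF magnitudes
# (~50 Wm-2 SW, ~30 Wm-2 LW, as implied by the slide's divisors):
sw_crf, lw_crf = 50.0, 30.0
print(f"SW calibration goal: {crf_goal / sw_crf:.1%} (2 sigma)")
print(f"LW calibration goal: {crf_goal / lw_crf:.1%} (2 sigma)")
```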
CLARREO Requirements for Cloud Feedback
CLARREO Sampling Requirement
The Net CRF observation goal also sets the sampling requirements:
• A 20+ year record for the trend to exceed natural variability
• Full-swath sampling for low observation sampling noise
• A 20 km FOV or smaller to separate clear and cloudy scenes
Solution: CLARREO is required to calibrate broadband observations to the needed absolute accuracy; CERES provides the sampling of the Net CRF decadal change.
CLARREO Requirements for Cloud Feedback
Additional Climate Feedbacks
Similar climate model and data sampling analyses could be performed for other climate feedbacks:
• The water vapor/lapse rate feedback will require latitude- and height-profile requirements for temperature and humidity; this can be extended to spectral fingerprinting.
• Surface albedo (e.g. snow/ice) will require latitude-dependent requirements.
• Other feedbacks could also be considered in this framework.
• Climateprediction.net perturbed-physics modeling provides an ideal framework to explore these relationships.
CLARREO Requirements for Cloud Feedback
The Quest Has Just Begun
A new era of climate Observing System Simulation Experiments (OSSEs), and a new era of calibration:
• A new methodology for linking climate model uncertainties to observation requirements has been highlighted.
• The current large uncertainties in climate feedbacks are not inevitable, nor is the large uncertainty in climate sensitivity. CLARREO and CERES will likely play a key role.
• The example of cloud feedback linked to Net CRF does NOT eliminate the need to separately determine the aerosol indirect effect. This remains the largest radiative forcing uncertainty and must be subtracted from the observed decadal change in SW CRF.