LAND PRODUCT VALIDATION update
F. Baret, J. Nightingale, S. Garrigues, J. Nickeson
Missoula, 17 June 2009

Outline
• CEOS LPV context and structure
• Guidelines for best practices
• The way forward …

CEOS/WGCV/LPV
• CEOS (Committee on Earth Observation Satellites): a group of space agencies aiming at harmonizing their activities at the international level.
• Composed of several working groups (WG), including the WG on Calibration and Validation (WGCV).
• WGCV is made of several subgroups:
  - IVOS (Infrared and Visible Optical Sensors)
  - Microwave
  - SAR
  - Terrain Mapping
  - Atmospheric Chemistry
  - Land Product Validation (LPV)

Mission statement and goals
• Foster quantitative validation of higher-level global land products derived from remote sensing data
• Relay results to users
• Develop and promote international standards and protocols for field sampling, scaling, error budgeting, and data exchange
• Provide feedback to international structures (GEO/GEOSS, GCOS, GTOS, IGBP …) on:
  - product definition, accuracy, and quality assurance
  - requirements for future missions

Role of LPV within ECVs
• Essential Climate Variables (ECVs) are recognized to play a key role:
  - within international scientific structures (GCOS, GTOS)
  - for international conventions (UNFCCC), as verification to complement/compare with national (official) figures
• Strong requirements on the evaluation of product uncertainties: coordinated and consensus validation efforts are mandatory for all the ECVs

Implementation tools
• CEOS has no funding mechanism of its own
• Actions based on best efforts by space agencies
  - CEOS provides recommendations to space agencies
  - Possible direct actions on satellite data access
• Direct actions within the community
  - Synergizing existing projects
  - Initiating new projects
• Mostly a bottom-up approach

Products targeted
• Land cover (with GOFC-GOLD)
• Fire, mainly burnt area (with GOFC-GOLD)
• Biophysical variables (LAI, fAPAR)
• Albedo, BRDF, surface reflectance
• Land surface temperature and emissivity (with IVOS)
• Soil moisture
• Biomass? (in discussion with GOFC-GOLD)
• Snow?

LPV proposed new structure
Sub-groups by product family, with leads from the community, if possible from different continents:
  - Albedo: G. Schaepman & C. Schaaf
  - Land cover: M. Herold & M. Friedl
  - Fire: K. Tansey & L. Boschetti
  - Vegetation: R. Fernandes, S. Plummer, J. Nightingale
  - Land surface temperature & emissivity: S. Hook & J. Sobrino
  - Soil: W. Wagner & T. Jackson
Role of sub-group leads
• Coordinate validation activities at the global level:
  - Guidelines for best practices:
    • Lead the writing of a best-practices guideline document for validation
    • Ensure community consensus and publication/distribution
    • Ensure the document is updated when new data/methods become available
  - Validation activities: promote their development and implementation:
    • Data sharing
    • Data compilation
    • Implementation of validation exercises
    • Publication/distribution of results
• Convey information to and from the community
  - Plans/status/results of validation towards CEOS, international organizations, and the community

Communication tools
• LPV web site: http://lpvs.gsfc.nasa.gov
• LPV wiki: http://lpvs.pbworks.com/
• LPV listservs

Outline
• CEOS LPV context and structure
• Guidelines for best practices
• The way forward …

Best practices guidelines document
• The "best practices guidelines" should be:
  - based on current knowledge, tools, data, and methods
  - tested and easily repeatable
• The best practices guidelines document should:
  - define the best practices, including data and methods, to conduct validation of a satellite-derived land product
  - be a "living" document that is updated as tools, data, and methods are improved through scientific endeavour
• Process for endorsement by CEOS and the community
• Peer review process

Proposed common structure
1 Introduction
2 Validation
  2.1 Data sets
  2.2 Global validation
3 Intercomparison
  3.1 The satellite data
  3.2 Global intercomparison
4 Recommendations / Conclusions
5 References

Validation and intercomparison
• 'Calibration' is the process of quantitatively defining the system response to known, controlled signal inputs.
• 'Validation' refers to assessing the uncertainty of higher-level, satellite-sensor-derived products by analytical comparison to reference data, which is presumed to represent the target value.
• 'Intercomparison' of data products provides an initial indication of gross differences and possibly insights into the reasons for the differences. (Justice et al. 2000)
• Validation and intercomparison are mandatory and complementary (a minimal numerical sketch of the two analyses follows below):
  - Validation allows 'absolute' quantification of uncertainties, but is often limited by the number and quality of available reference data
  - Intercomparison provides a more exhaustive evaluation of consistencies/differences; it is required by users and when combining several products

Products definition and uncertainties
• Product definitions: ECVs are not always very clearly defined (GCOS/GTOS documents)
  - Need more process-model-related definitions
  - Feedback to GCOS/GTOS
• Required uncertainties attached to ECVs are not defined in a traceable way
  - Need more process-related evaluation of uncertainties
  - Threshold / optimal / target?
  - Feedback to GCOS/GTOS
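To make the distinction between validation and intercomparison concrete, here is a minimal sketch of the two analyses: validation of a product against reference measurements, and intercomparison between two products. All values, the number of sites, and the simple RMSE/bias metrics are illustrative assumptions, not part of any LPV protocol.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: LAI from two satellite products and in-situ reference
# values at 50 validation sites (all values are made up).
reference = rng.uniform(0.5, 5.0, 50)              # ground-based reference LAI
product_a = reference + rng.normal(0.1, 0.4, 50)   # candidate product A
product_b = reference + rng.normal(-0.3, 0.6, 50)  # candidate product B

def rmse(x, y):
    """Root mean square error between two matched samples."""
    return float(np.sqrt(np.mean((x - y) ** 2)))

def bias(x, y):
    """Mean signed difference, x minus y."""
    return float(np.mean(x - y))

# Validation: 'absolute' uncertainty of each product against the reference,
# limited to the sites where reference measurements exist.
for name, product in [("A", product_a), ("B", product_b)]:
    print(f"product {name} vs reference: RMSE={rmse(product, reference):.2f}, "
          f"bias={bias(product, reference):+.2f}")

# Intercomparison: relative consistency of the two products; it can be run
# everywhere, but it cannot say which product is closer to the truth.
print(f"product A vs product B: RMSE={rmse(product_a, product_b):.2f}, "
      f"bias={bias(product_a, product_b):+.2f}")
```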
Data sets for validation: sites
• Distribution of sites must be representative of surface types and states/conditions:
  - systematic (FRA 2010)
  - stratified
• Existing data sets:
  - Albedo: 19 BSRN sites
  - LAI/fAPAR: 80-100 sites
  - Land cover: 4300 points
• Need to capitalize on this information

Data sets for validation: measurements
• Reference measurements:
  - interpretation by experts based on HSR images (land cover, fire)
  - quantitative measurements (albedo, LAI-fAPAR, LST&E, soil moisture)
  - definition of the variable measured
  - footprint and scaling
• Upscaling chain (a minimal sketch is given at the end of this section):
  - Point measurements (PM, 1-50 m): DHP, LAI-2000 …; 10-20 GPS-located PM per ESU
  - Elementary Sampling Unit (ESU, 20-100 m): variable extraction and averaging at the ESU level; 30-50 ESU per site (3 km)
  - Empirical transfer function applied to the high spatial resolution image to produce a high spatial resolution biophysical variable map of the site
• Need for high spatial resolution images (Landsat/SPOT)

Global validation
• Metrics used (a sketch of the land-cover accuracy metrics is given at the end of this section):
  - Land cover: accuracy, user's/producer's accuracy, confusion matrix …
  - LAI-fAPAR: RMSE, weighted RMSE, biases …
• Users need more information on the structure of uncertainties

Satellite data for intercomparison
• Definition of variables: LCCS …
• Spatial sampling:
  - exhaustive (land cover)
  - systematic stratified (LAI, fAPAR, albedo …)
• Spatial support area: products must be brought to the same projection and resolution
• Temporal support period: products must be made synchronous, with the same temporal resolution
• Implies degradation of the original product characteristics

Global intercomparison: land cover
[Figure: example of a global land-cover intercomparison]

Global intercomparison: LAI
• Temporal continuity
• Temporal consistency
• Smoothness of temporal evolution
• Statistical distributions
• Scatterplots

Current status
[Figure: current validation stage by product — Level 4: operational validation; Level 3: land cover, biophysical, soil moisture; Level 2: fire, albedo, LST & emissivity; Level 1]

Operational validation: land cover
[Figure: operational land-cover validation framework — primary validation, comparative validation, updated validation/change, validation of new products, data reprocessing; links to regional datasets and networks, in-situ data, legend translations, global product synergy; a design-based sample of reference sites with updated, LCCS-based interpretations feeds a reference database that is statistically robust, consistent, harmonized, updated, and accessible; degree of usability and flexibility increases over time, building on existing global LC products]

Operational validation: biophysical (OLIVE)
1. Information to the community
2. Stand-alone comparison to existing products by a potential product producer: Test Mode
3. Actual validation, with results visible to the community and addition of the product to the community database: Validation Mode
4. Addition of sites for direct validation by individual contributors
[Figure: OLIVE system overview — user community; INFORMATION (objectives, results, references & contributors, database description, how to proceed, input formats); EVALUATION (histograms per PFT, scatterplots, temporal continuity, temporal consistency, spatial consistency, direct validation); candidate product and stand-alone report; internal database (existing products, BELMANIP2-Valid, BELMANIP2-Test, DIRECT, candidate contributions to DIRECT)]

Concluding remarks
• Very strong ECV context
• Need funding mechanisms for sustainable validation activities: validation costs!
• Importance of reference measurements:
  - data sharing
  - improved cooperation with existing and developing networks
  - availability of high spatial resolution images
• A lot of new products … and not all at the ultimate validation level … very challenging
• Mandatory for product improvement and combination
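Below is a minimal sketch of the ESU-to-site upscaling chain referenced on the "Data sets for validation: measurements" slide. Everything in it is assumed for illustration: the number of ESUs and point measurements, the NDVI predictor, the synthetic values, and the linear form of the empirical transfer function; an operational campaign would use real DHP/LAI-2000 measurements and Landsat/SPOT imagery.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical numbers: 40 ESUs on a 3 km site, 15 point measurements (PM)
# per ESU, and a 100 x 100 pixel high-spatial-resolution (HSR) NDVI image.
n_esu, n_pm = 40, 15
pm_lai = rng.uniform(0.5, 5.0, (n_esu, n_pm))   # DHP / LAI-2000 style PMs
esu_lai = pm_lai.mean(axis=1)                   # average the PMs at ESU level

# NDVI extracted from the HSR image at each GPS-located ESU
# (made-up values, loosely correlated with the ESU-level LAI).
esu_ndvi = 0.15 * esu_lai + 0.10 + rng.normal(0.0, 0.03, n_esu)

# Empirical transfer function: a simple linear fit NDVI -> LAI.
design = np.column_stack([esu_ndvi, np.ones(n_esu)])
(slope, intercept), *_ = np.linalg.lstsq(design, esu_lai, rcond=None)

# Apply the transfer function to the whole HSR image to obtain the
# high-resolution reference LAI map, then aggregate it (here a plain mean)
# to the product footprint for comparison with the satellite product.
hsr_ndvi = rng.uniform(0.1, 0.9, (100, 100))
hsr_lai_map = slope * hsr_ndvi + intercept
print(f"transfer function: LAI = {slope:.2f} * NDVI + {intercept:.2f}")
print(f"site-level reference LAI: {hsr_lai_map.mean():.2f}")
```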
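Finally, a sketch of the land-cover accuracy metrics listed on the "Global validation" slide: overall, user's, and producer's accuracy derived from a confusion matrix. The class names and counts are invented for illustration.

```python
import numpy as np

# Hypothetical 3-class land-cover confusion matrix: rows = mapped class,
# columns = reference class; the counts are invented for illustration.
classes = ["forest", "cropland", "urban"]
cm = np.array([[50,  5,  2],
               [ 8, 40,  4],
               [ 1,  3, 30]])

overall_accuracy = np.trace(cm) / cm.sum()

# User's accuracy: of the pixels mapped as class i, the fraction that is
# correct (row-wise). Producer's accuracy: of the reference pixels of
# class j, the fraction mapped correctly (column-wise).
users_accuracy = np.diag(cm) / cm.sum(axis=1)
producers_accuracy = np.diag(cm) / cm.sum(axis=0)

print(f"overall accuracy: {overall_accuracy:.2f}")
for name, ua, pa in zip(classes, users_accuracy, producers_accuracy):
    print(f"{name:9s} user's: {ua:.2f}  producer's: {pa:.2f}")
```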