Seminar on ESA 2010 Quality assessment
6 April 2016, Instituto Nacional de Estadística (INE), Madrid, Spain
Group Discussions - Summary
Eurostat
Summary of Group Discussions
• Group 1
• Proposed quantitative indicators
• Group 2
• Supporting metadata for quality assessments
• Group 3
• Complementary in-depth analysis and overall process
Group 1:
Proposed quantitative indicators
• Can National Accounts data quality be expressed
with quantitative indicators?
• The group agreed that the categories should be
measured, but opinions differed on the concrete
indicators to be used
• Maybe not directly, but we are trying to identify
good proxies and to recognise their limitations
clearly
Group 1:
Proposed quantitative indicators
• Main controversy: number / magnitude of
revisions as proxy for accuracy and reliability
• Might create a perverse incentive to revise less in
order to score well in the quality report
• Revision indicator is not perfect, but acceptable as
proxy
• Eurostat will clarify how revisions are measured
Group 1:
Proposed quantitative indicators
• The group largely agreed with the Keep / Drop
proposals
• Additionally suggested to be dropped:
• Number of subsequent data transmissions: OK
• The indicator 'delivery date of validated data minus
legal delivery date' was questioned
• Eurostat will clarify
• Coherence:
• coherently wrong versus incoherently right
• Current proposal is best proxy and good starting
point
Group 2:
Supporting metadata for quality assessments
• General remarks
• We should add the SIMS numbers to the table to
show the clear link between the categories and SIMS
• Double-check that we only use categories that are
defined in SIMS 2.0
• Confirmed that the metadata fields (ESMS) are
basically a one-off exercise with annual review and
not an annual reporting exercise
• We are not clear on the expected level of granularity
between the high level (ESA), the sub-domain level and
the existing inventories. The sub-domain level is preferred,
but the impact on implementation needs to be reviewed.
• We are not clear on the implementation timetable
Group 2:
Supporting metadata for quality assessments
• Suggested to be added: Confidentiality policy
• Emphasis on the difference between
“real” confidentiality (number of enterprises) and
“wrong” confidentiality (low reliability).
In principle, low reliability should be flagged as
such, but it is often flagged C
Group 2:
Supporting metadata for quality assessments
• Suggested for removal
• “Deviations between methodology and
compilation” → we all apply ESA; methods are
explained in “sources and methods” → OK
• Number of series breaks → can be seen in the data
and has no direct quality aspect → OK
• Cost and burden → difficult to measure, not
comparable across countries, unclear definition
(what to include or not to include), risk of
double counting.
There is also no clear link to quality.
Group 2:
Supporting metadata for quality assessments
• To be reviewed: data sources and compilation
methods
• In SIMS these are two separate fields, so we might
split them. In the GNI inventories they are in the
same chapter, so we might keep them together —
but would that violate SIMS?
• To be clarified: “changes between periods and
series breaks”
• Is this metadata or quality? It might be close
to the data itself (i.e. historical data versus
current data)
Group 2:
Supporting metadata for quality assessments
• To be clarified: “Statistical processing”
• Source data and data compilation also appear in
“Accessibility and clarity” → duplication?
• Clarified: “Metadata availability and metadata
completeness”
• This is understood to describe the quality process
itself (this exercise); it can be pre-filled and possibly
extended nationally if additional information is available
Group 3:
Complementary in-depth analysis and overall process
• Suggested to do only an annual exercise (~12-page
template) and no in-depth reviews.
• The LFS Quality Report process and model paved the
way forward – a report of around 20 pages for
Member States.
• The scope and content of the in-depth reviews is not clear
Group 3:
Complementary in-depth analysis and overall process
• The reduction of quality indicators was welcomed
(template reduced from 60+ pages to 45+ pages to
around 12 pages)
• The annual in-depth analysis is not clearly defined
and overlaps with the annual report proposal. It needs
to be reviewed and its content determined – and then a
decision taken on whether to include it in the annual
report. On top of regular annual quality reporting,
countries would need to report every year on a
different topic, which made it difficult to accept
that the amount of work had actually been reduced.
Group 3:
Complementary in-depth analysis and overall process
• Eurostat needs to apply the same scrutiny to the
in-depth analyses as to the annual report: decide
whether they can be incorporated into the annual
reporting, which items should be dropped, and whether
the overall in-depth analysis should be deleted.