Integrated Care
Patient reported outcome measures and patient reported experience measures: a rapid scoping review
Jack Chen MBBS PhD MBA (Exec)
Associate Professor
Simpson Centre for Health Services Research
SWS Clinical School and Ingham Institute for Applied Medical Research, Faculty of Medicine
University of New South Wales, Australia
Acknowledgement
The author would like to acknowledge and thank the senior staff of the Agency for Clinical Innovation of New South Wales for their helpful comments regarding the scope, methodology and content of the review, as well as their comments on the first draft of the report. The author is also grateful to Dr Lixin Ou for her help in searching, retrieving and reviewing the literature. The views expressed in the review are entirely those of the author and should not be interpreted as those of the Agency for Clinical Innovation, New South Wales; needless to say, any errors are the author's.
The Abbreviation List
ABS: Australian Bureau of Statistics
ACI: Agency for Clinical Innovation
ACSQH: Australian Commission on Safety and Quality in Health Care
AHRQ: Agency for Healthcare Research and Quality
AQoL-8D: Assessment of Quality of Life (8 Dimensions)
BCF: Better Care Fund
BHI: Bureau of Health Information
CER: Comparative effectiveness research
CIHI: Canadian Institute for Health Information
CALD: Culturally and linguistically diverse
CMS: Centers for Medicare and Medicaid Services
COSMIN: COnsensus-based Standards for the selection of health Measurement Instruments
CPHCRIN: Canadian Primary Health Care Research and Innovation Network
CTT: Classical Test Theory
ED: Emergency Department
EHR: Electronic health record
EQ-5D: The European Quality of Life (EuroQoL) five dimensions questionnaire
FACIT: Functional Assessment of Chronic Illness Therapy system
GEM: GRID-enabled Measures
GHS: Global health status
HCAHPS: Hospital Consumer Assessment of Healthcare Providers and Systems
HHS: Department of Health and Human Services (USA)
HIT: Health information technology
HRQoL: Health-related quality of life
HR-PROs: Health-Related Patient-Reported Outcomes
HUI: Health Utilities Index
IC: Integrated care
IOM: Institute of Medicine
IRT: Item Response Theory
KPIs: Key performance indicators
LHDs: Local health districts
MBS: Medicare Benefits Schedule
MCID: Minimal clinically important difference
MID: Minimally important difference
NCGC: National Clinical Guideline Centre
NEHTA: National E-Health Transition Authority
NHP: Nottingham Health Profile
NIH: National Institutes of Health
NQB: National Quality Board
NQF: National Quality Forum
NHS: National Health Service
NICE: National Institute for Health and Clinical Excellence
NQMC: National Quality Measures Clearinghouse
OECD: Organisation for Economic Co-operation and Development
OSG: Online support groups
PAIS: Patient Accreditation Improvement Survey
PBS: Pharmaceutical Benefits Scheme
PCC: Patient-centred care
PCEHR: Personally controlled electronic health record
PCORI: Patient-Centered Outcomes Research Institute
PDA: Personal digital assistant
PEx: Patient experience surveys
PHC: Primary health care
PREM: Patient reported experience measures
PRO: Patient reported outcome
PROQOLID: Patient-Reported Outcome and Quality of Life Instruments Database
PROM: Patient reported outcome measures
PROMIS: Patient-Reported Outcomes Measurement Information System
PRO-PM: Patient-reported outcome based performance measures
PS: Patient satisfaction
PSat: Patient satisfaction surveys
PSIv5: UltraFeedback's Patient Satisfaction Instrument Version 5
QALY: Quality adjusted life year
QPS: QPS Patient Satisfaction Survey
QoL: Quality of life
QWB-SA: Quality of Well-Being Scale (Self-Administered)
RBDM: Registry of Births, Deaths and Marriages
SF-12: The Short Form (12) Health Survey
SF-36: The Short Form (36) Health Survey
SNS: Social network sites
WaPEF: Warwick Patient Experience Framework
WHO: World Health Organisation
WHOQoL-BREF: World Health Organisation Quality of Life Instrument
Contents
The Abbreviation List. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Executive Summary. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1 | Introduction. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
1.1 Background. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
1.2 Methodology. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2 | Definitions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
2.1 Patient reported outcomes (PROs) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
2.2 The narrow and broad definitions of PROs . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
2.3 Patient reported outcome measures (PROMs) . . . . . . . . . . . . . . . . . . . . . . . . 15
2.4 Patient reported outcome measures (PROMs) vs patient reported experience measures (PREMs) . . . 15
2.5 Why PROMs/PREMs are important to measure? . . . . . . . . . . . . . . . . . . . . . . . 15
2.5.1 From a theoretical and conceptual point of view . . . . . . . . . . . . . . . . . . . . 15
2.5.2 From the clinical and health economic point of view . . . . . . . . . . . . . . . . 15
2.5.3 From quality improvement and a societal point of view . . . . . . . . . . . . . . 17
2.6 Framing PRO in a broader determinants of health model . . . . . . . . . . . . . . 17
2.7 The pathways from PRO to PRO-PM . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
3 | Selecting PROMs and PREMs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
3.1 Where to find PROM/PREM? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
3.2 How to choose a PROM/PREM – methodological issues . . . . . . . . . . . . . . . . 22
3.2.2 Other confounding factors of choosing a PROM/PREM . . . . . . . . . . . . . . . 26
3.2.3 Implications of different methods and modes on response rate, reliability and validity . . . 26
3.2.4 Minimally important differences and changes . . . . . . . . . . . . . . . . . . . . . . 26
3.2.5 Response shift, adaptation and other challenges to detect true change . . . 27
3.2.6 Generic versus disease- or condition-specific PROM . . . . . . . . . . . . . . . . . 28
3.2.7 Other methodological issues related to choosing PROM/PREMs . . . . . . . 29
3.3 Using PROM/PREM as quality measures: the desirable attributes . . . . . . . . 29
3.4 Other principles of selecting PROMs/PREMs in IC . . . . . . . . . . . . . . . . . . . . . . 30
4 | Capturing PROMs and PREMs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
4.1 The challenges in capturing PROMs and the promises of PROMIS . . . . . . . . 32
4.2 Generic HRQoL measures in primary care setting: results from an evidence review . . . 35
4.3 Measuring patient experience in a primary health care setting – an evidence review from Canada . . . 37
4.4 Measuring IC based on the AHRQ ‘Care Coordination Atlas’ . . . . . . . . . . . . . 39
4.5 Measuring patient experience in IC in the UK . . . . . . . . . . . . . . . . . . . . . . . . . 47
4.6 Measuring ‘continuity of care’ based on patient experience – a systematic review of the instruments . . . 56
4.7 The key points in measuring PREMs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
5 | Using PROMs and PREMs to improve health care . . . . . . . . . . . . . . . . . . . . . 59
5.1 A conceptual framework in understanding the effect of PROM/PREM. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
5.2 IC and PROMs/PREMs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
5.2.1 How is patient-centered care (PCC) defined?. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
5.2.2 Can the concept of PCC be measured? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
5.2.4 Patient experience vs patient satisfaction . . . . . . . . . . . . . . . . . . . . . . . . . . 63
5.3 International experience . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
5.3.1 The USA experience . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
5.3.2 The UK experience . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
5.3.3 The Sweden experience . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
5.3.4 The Denmark experience: the generic Integrated PRO System (WestChronic) . . . 66
5.4 Measuring PREMs in non-primary care settings in Australia . . . . . . . . . . . . . 68
5.5 Measuring PREMs in primary-care settings in Australia . . . . . . . . . . . . . . . . . 71
5.6 Social media, cost-effectiveness and IC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
6 | Future plans & research priorities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
6.1 Psychosocial behaviours in EHRs as part of PROMs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
6.2 Integrating PROMs with EHRs and other data sources: the need for robust health information infrastructure . . . . . . . 74
6.3 Developing PROM/PREM measures on important subgroups . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
6.4 Exploring the ways that the results can be better presented to different stakeholders . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
6.5 Investment in understanding population norm, cut-off, MID, responsiveness, response-shift of the PROMs/PREMs. 76
6.6 Investment in IRT/CAT technique and item banks for specific interested area. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
6.7 Developing suitable case-mix adjustment methodology for different stakeholders. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
7 | Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
8 | Appendices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
8.1 Appendix 1 – The list of systematic reviews on HR-PRO by COSMIN (489) (See attached document). . . . . . . . . . . . . . . . 80
8.2 Appendix 2 - Methodological issues related to the measurement properties of the PROMs/PREMs instruments
(COSMIN). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
8.3 Appendix 3 – The sources of the instruments reviewed.. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
Instrument Cost Notes: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
8.4 Appendix 4 – Measuring Patient Experiences in Primary Health Care Survey (Canadian) (see attached document). . 87
8.5 Appendix 5 – The June 2014 update of the AHRQ review results of the instrument in the care coordination
measurement (Care Coordination Atlas) (see attached document). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
8.6 Appendix 6 – Social Media and IC. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
8.6.1 Using social media to capture patient experience . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
8.6.3 Silver lining of the ‘cloud of patient experience’270 in a ‘perfect storm’271? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
8.7 Appendix 7 - Is IC cost-effective?. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
8.8 Appendix 8 - Patient experience framework. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94
8.8.1 The Picker principles . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94
8.8.2 NHS Patient Experience Framework. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94
8.8.3 IOM Patient experience framework . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 95
8.8.4 The Warwick patient experience framework (WaPEF). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 95
8.8.5 The AHRQ ‘care coordination’ framework . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
8.8.6 The patient experience frameworks and its implications on NSW IC. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100
References. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101
Tables
Table 2.5-1 The types and description of common health economic analyses. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
Table 2.7-1 Distinctions among PRO, PROM, and PRO-PM: Two Examples. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
Table 2.7-2 NQF Endorsement Criteria and their Application to PRO-PMs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
Table 3.2-1 Main characteristics of PRO methods issues. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
Table 3.2-2 The strengths and weaknesses of generic and condition specific PROMs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
Table 4.2-1 Overview of results from psychometric review. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
Table 4.3-1 Dimensions of patients’ experiences in primary health care. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
Table 4.4-1 Index of measures/instruments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
Table 4.4-2 Care coordination master measure mapping table, patient/family perspective†. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
Table 4.4-3 Care coordination master measure mapping table, healthcare professional(s) perspective† . . . . . . . . . . . . . . . . . . . 44
Table 4.4-4 Care coordination master measure mapping table, system representative(s) perspective†. . . . . . . . . . . . . . . . . . . . . 45
Table 4.4-5 The June 2014 update of AHRQ review results of instrument in care coordination measurement (Note: page
number refers to the page number in the original document)10. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
Table 4.5-1 Existing user/carer experience measures in large national surveys36. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
Table 4.5-2 The 18 supplementary questions developed for measuring IC experience in the UK setting . . . . . . . . . . . . . . . . . . . 50
Table 4.6-1 Quality of measurement properties and the interpretability per instrument (Adopted from Uijen et al. 2012197) . . . 57
Table 5.1-1 Possible outcome indicators for assessing the impact of the collection of PROM/PREM . . . . . . . . . . . . . . . . . . . . . . . 61
Table 5.3-1 Selected outcomes indicators for Domain 2 of NHS Outcome Framework. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
Table 5.3-2 Elements of clinical application of PRO.. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
Table 5.3-3 Characteristics of 22 projects involving implementations
of a generic PRO system. Projects with patient level use (n=14). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
Table 5.4-1 Frequency of domains used in the hospital patient experience and satisfaction surveys in Australia . . . . . . . . . . . 68
Table 5.4-2 Survey tools used among St Vincent’s hospitals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
Table 5.4-3 Patient experience and satisfaction surveys used in Australia . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
Table 8.2-1 Definitions of domains, measurement properties, and aspects of measurement properties. . . . . . . . . . . . . . . . . . . . 81
Table 8.2-2 Quality criteria for measurement properties . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
Table 8.6-1 Platforms and reported effects/outcomes.. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
Table 8.6-2 Potential sources of information for the ‘cloud of patient experience’. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
Table 8.6-3 Questions Asked for Anecdotal Comments and Ratings on NHS Choices. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92
Table 8.8-1 A narrative commentary on IOM patient experience framework. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 95
Table 8.8-2 The Warwick Patient Experiences Framework . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96
Table 8.8-3 Mechanisms for Achieving Care Coordination (Domains) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
Table 8.8-4 Relation Between the Care Coordination Measurement Framework and Other Key Sources. . . . . . . . . . . . . . . . . . . . 98
Figures
Figure 2.2-1 Types of PROs currently used in medical research. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
Figure 2.5-1 Patient-based evidence as part of the whole new evidence-base for high quality patient care. . . . . . . . . . . . . . . . 15
Figure 2.6-1 Framing PROs within existing conceptual models. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
Figure 2.7-1 Pathway from PRO to NQF-endorsed PRO-PM. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Figure 3.2-1 Types of respondent data and methods/modes of administration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
Figure 5.1-1 A hypothetical framework to understand the impact of
routinely collected PROs on patient health outcomes.. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
Figure 5.2-1 The care coordination ring . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
Figure 5.3-1 Duty of quality and the NHS Outcomes Framework. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
Figure 5.3-2 Using PROM in the Swedish Hip Arthroplasty Register. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
Figure 8.2-1 The diagram for completing the COSMIN checklist. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
Figure 8.2-2 COSMIN taxonomy of relationships of measurement properties.
Abbreviations: HR-PRO, health related-patient reported outcome . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
Figure 8.6-1 Cumulative number of online ratings of hospitals in England on the NHS Choices website . . . 90
Figure 8.8-1 Dimensions included in IC (i.e. person centered coordinated care) narratives (National Voice, 2013) . . . 94
Figure 8.8-2 Care Coordination Measurement Framework Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
Executive Summary
The Agency for Clinical Innovation (ACI) conducted a
rapid scoping review on Patient Reported Outcome
Measures (PROMs) and Patient Reported Experience
Measures (PREMs), with a particular focus on the NSW
Health Integrated Care (IC) Strategy. This expert scoping
paper has been produced to provide information,
guidance and recommendations on patient reported
measures question sets. The overall aim of patient
reported measures within the NSW Health IC strategy is
to ‘enable patients to provide direct, timely feedback about
their health related outcomes and experiences to drive
improvement and integration of health care across NSW’.
The literature demonstrates that patients who are
engaged in their health care tend to experience better
outcomes, and choose less costly interventions, such as
physical therapy for low back pain after they participate
in a process of shared decision making. Measuring
Patient Reported Outcomes (PROs) allows for better
communication and shared decision making between
patients and providers.
Using and selecting a measure
When considering the selection and use of a PROM
or PREM it is important to take into account the data
source (i.e. self-report vs. proxy), the mode, method and
setting of administration as well as generic vs. condition
specific PRO and the needs of the target population
(vulnerable populations, low literacy, language and
culture, functional abilities). It is important to critically
review the adopted PROM/PREM measures to see if
they are suitable to be used across different population
subgroups. Notable subgroups in Australia
include the Indigenous population, patients from
culturally and linguistically diverse (CALD) backgrounds,
women, children and the elderly, as well as patients
with mental health problems or cognitive impairment.
The burden of responding to the PROM/PREM on the
patient and the burden of collecting these should not
be underestimated.
When using a PROM/PREM as a quality measure in order
to affect change, PROM/PREM could also be assessed
according to the Agency for Healthcare Research and
Quality (AHRQ) framework of desirable attributes of
quality measures in three broad conceptual areas: (1) importance of a measure, (2) scientific soundness of a measure, and (3) feasibility of a measure.
One of the common challenges in applying PROMs in an
IC setting is how to choose a generic PROM that measures
HRQoL given the multitude of existing measurement
instruments. To meet this challenge, the Canadian Institutes of Health Research supported a group of researchers in 2013 to conduct a review. The researchers generated a short list of PROMs which included PROMIS, SF-12/SF-36 and EQ-5D; after reviewing the evidence, they found the strongest support for PROMIS.
Importantly, infrastructure needs to be in place for the
collection of PRO data from patients; data needs to be synthesised in a meaningful way and presented back to both patients and providers. This
should also assist with developing strategies for acting
on the information in a timely manner to improve
the clinical care for patients (evaluating the amount,
reasons and patterns of missing data is also important).
The need for robust health information infrastructure to
integrate PROMs with Electronic Health Records (EHRs)
and other data sources is an important consideration
particularly for the integration of any future PROM with
the EHRs or personally controlled electronic health
record (PCEHR).
Measurement and collection
Measuring Patient Reported Outcomes is important
as traditionally measured biomarkers often fail to
correspond with how a patient is actually feeling. Measuring Health Related Quality of Life (HRQoL), in particular, directly acknowledges that patients often value different outcomes than their providers.
PROs will provide a key component to understanding
the true burden of disease, especially in diseases that
are marked by morbidity but not necessarily mortality.
Measuring PROs allows for the assessment of a patient’s
health status entering therapy and identifying treatable
problems. Furthermore, it helps determine the degree as well as the sources of the patient's decreased ability to function and of physical, emotional and social problems.
There is unlikely to be universal agreement on the key
important dimensions of the patient experience to
measure amongst different health jurisdictions. The
choice of different dimensions/domains/subdomains
of the patient experience very much reflects the value
and preference of health systems, service providers and
patients themselves.
The systematic collection of PROs data has been
shown to be more reflective of underlying health
status than clinical reporting whilst also predicting
meaningful clinical outcomes including survival. PROs
data is feasible, efficient and valued by clinicians for
documentation and clinical decision making whilst
improving symptom management and patients’ overall
health status. Measuring PROs in this context also
allows for informed decisions regarding the change of
treatment plans and predicting the course of diseases
and outcomes.
Infrastructure
Current opportunities exist to engage patients in building capacity and infrastructure to capture PROs routinely and then use these data to develop performance measures to allow for accurate appraisals of health care quality and efficiency.
Internationally
The international experience, policy background and research interests are fast moving in relation to PROM/PREM in IC. The UK, USA and other European countries have provided many policy initiatives in patient centred care (PCC)/IC, as well as on how to utilise PROM/PREM to improve quality of care. However, the evidence base is still emerging. There is a generational change in developing and testing PROM/PREM, culminating in the success of the National Institutes of Health (NIH) funded PROMIS project, which is based on Item Response Theory (IRT) and Computerised Adaptive Testing (CAT).
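As a rough illustration of what IRT and CAT involve, the following sketch (Python, with invented item parameters) uses the two-parameter logistic IRT model, in which the probability of endorsing an item depends on the respondent's latent trait level and the item's discrimination and difficulty, and selects the next item to administer as the one carrying the most Fisher information at the current trait estimate. PROMIS item banks rest on this general principle, although the operational system is far more sophisticated; the item names and parameters below are illustrative assumptions only.

import math

# 2-parameter logistic IRT model: probability of endorsing an item given
# trait level theta, item discrimination a and item difficulty b.
def p_endorse(theta: float, a: float, b: float) -> float:
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Fisher information of an item at theta; a CAT administers the item that
# is most informative at the current provisional trait estimate.
def item_information(theta: float, a: float, b: float) -> float:
    p = p_endorse(theta, a, b)
    return a ** 2 * p * (1.0 - p)

item_bank = {  # hypothetical fatigue items: (discrimination, difficulty)
    "tired_most_days": (1.8, -0.5),
    "too_tired_to_work": (2.2, 0.4),
    "exhausted_after_rest": (1.5, 1.2),
}

current_theta = 0.1  # provisional estimate after earlier responses
next_item = max(item_bank,
                key=lambda name: item_information(current_theta, *item_bank[name]))
print("Next item to administer:", next_item)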
Overall, ten things that need to be considered when
planning how to measure changes in patient and carer
experience over time include:
1) Consider how the patient experience is being defined in
order to inform precisely what needs are to be measured.
2) Think about why the patient experience is being
measured and how the information will be used.
3) Assess whether it would be useful to combine
approaches so that both qualitative and quantitative
material is collected.
4) Consider whether to ask everyone using the services to
provide feedback or only a sample.
5) Think about whether the best time to collect
feedback is immediately after using the services when
experiences are fresh in people’s minds.
6) Allocate enough time at the outset to plan and test
measurement methods, particularly if these will be
used in the future to monitor change over time.
7) Think about how the end result needs to be presented
for a variety of audiences as this may shape how data
are collected. Potential outputs include statistical
averages, in-depth quotes or graphs.
8) Make sure that there is an appropriate infrastructure
at an organisational level to analyse and use patient
experience information.
9) Make sure that patients, carers, managers and health
professionals are all comfortable with why the
feedback is being collected and how it will be used.
Staff need to be on board as well as patients.
10) Ensure that patient experience measures are seen as one
component of a broader framework of measurement
and that all of the approaches work well together
without an excessive burden to either staff or patients.
1 | Introduction
1.1 Background
The rising burden of chronic disease, and the number of people with complex care needs in particular, requires the development of delivery systems that bring together a range of professionals and skills from the primary, acute, long-term and social care sectors. Failure to better integrate or coordinate services along the continuum of care may result in suboptimal outcomes, and the available evidence regarding integrated care (IC) programs points to a positive impact on the quality of patient care as well as improved health or patient satisfaction outcomes.
IC involves the provision of seamless, effective and efficient care that responds to all of a person's health needs, across physical and mental health, in partnership with the individual, their carers and family. It means developing a system of care and support that is based around the needs of the individual (so-called patient-centredness), provides the right care in the right place at the right time and makes sure funding is allocated to the most effective system of delivering healthcare.
The New South Wales Ministry of Health envisaged three clear directions for the future delivery of healthcare in NSW:
1. Keeping people healthy.
2. Providing world class clinical care.
3. Delivering truly IC.
As part of this vision, NSW Health developed a four-year strategic plan (2014-2017) and committed $120 million to implement new, innovative locally-led models of IC across the State, which aims to achieve a health system with services connected across many different providers and a focus on individual patient needs.
Four areas have been prioritised for investment:
1) HealtheNet: This allows for the mapping of different patient identifiers to create a single picture of the patient's information across all local health districts (LHDs), and is integrated with the national PCEHR to provide a comprehensive set of the patient's information. The HealtheNet program allows patients and their clinicians, hospitals and other healthcare providers to view and share health information, which will provide a more seamless healthcare experience.
2) Risk stratification: Developing tools for use by LHDs and healthcare providers to identify people at risk of illness or chronic disease who can be followed up by early and targeted intervention.
3) Patient Reported Outcome Measures (PROMs): Investing in better measurement, tracking and feedback of the patient outcomes across the system, with a view to achieving better patient follow-up and PCC.
4) Real time patient feedback: Investing in tools to measure the experience of the patient immediately after, or during, treatment, providing a more realistic gauge of patient satisfaction, as well as prompt feedback loops to clinicians and managers to address any issues.
1.2 Methodology
Given this policy background, a scoping review commissioned by the Agency for Clinical Innovation (ACI) was conducted to provide an analysis of the issues regarding applying PROMs/PREMs on a large scale, with a particular focus on PCC and IC.
Given the extremely wide range of topics covered within
this scoping review, a few strategies were adopted to
provide a quick scan on the literature. Structured searches
were conducted of PubMed, PsychInfo, Web of Science
and Scopus. A grey literature review was also conducted
through Google and Google Scholar. Literature searches
were also performed on related web sites including, but
not limited to, the International Foundation of Integrated
Care, Kings’ Fund, the AHRQ, New South Wales Ministry
of Health, Bureau of Health Information (BHI) NSW,
National Health Performance Agency, National Health
Service (NHS) Choice, The Royal Australian College of
General Practitioners (RACGP, Australia), Primary Health
Care Research and Information Services (PHCRIS), etc. We
initially focused on reviews in the area of IC, PROMs and
PREMs. We then extended the search to original articles
and editorials. Scopus’s top-down and bottom-up features
were used to snowball the searching of key references.
Overall, we scanned over 8,000 references and over 3,000
full-text documents were downloaded.
After a brief introduction to the policy background, we discussed the definitions and conceptual issues related to PROMs/PREMs, provided a general review of issues related to selecting and capturing PROMs/PREMs, and considered the application of PROMs/PREMs in different health care settings. Although the international and local experience with both PROMs and PREMs is described, we focused on reviewing the most prominent measurement instruments used in the IC setting, the methodological issues and the general considerations in applying PROMs and PREMs in an Australian context. Finally, we outlined several future policy and research directions.
2 | Definitions
Patient Reported Outcomes (PROs), Patient Reported Outcome Measures
(PROMs), Patient Reported Experience Measures (PREMs) and Patient
Reported Outcome Based Performance Measures (PRO-PM)
2.1 Patient reported outcomes (PROs)
Patient reported outcome (PRO) measures include
health status assessments, HRQoL, symptom reporting
measures, satisfaction with care, treatment satisfaction
measures, economic impact measures, and instruments
for assessing specific dimensions of the patient experience
such as depression and anxiety1. The US Food and Drug Administration (FDA) adopts a much broader definition2: “A PRO
is any report coming directly from patients about a health
condition and its treatment”, meaning that PROs capture
patients’ perspectives about how illness or new therapies
impact on, for example, their general well-being.
The concept of PROs refers to any report on the status of
a patient’s health condition that comes directly from the
patient, without interpretation of the patient’s response
by a clinician or anyone else. The National Quality Forum
(NQF) of the USA has interpreted patient-reported
outcomes (PROs) as an international term of art; the
word “patient” is intended to be inclusive of all persons,
including patients, families, caregivers and consumers.
Also it is intended to cover all persons receiving support
services, such as those with disabilities.
2.2 The narrow and broad definitions
of PROs
It should be noted that most of the relatively old literature, or the literature of European origin, tends to provide a narrow definition of PROs. That is, PROs were defined as including only HRQoL or quality of life (QoL), if defined differently from HRQoL, symptoms and side effects. Such a narrow definition did not include patients' reported experience, their satisfaction and expectations regarding their care, or any compliments or complaints they may have. All these concepts are often included in the term “patient reported experience measures” in most of the literature. However, recently the Institute of Medicine (IOM), AHRQ and the NQF in the USA adopted a much more liberal definition of PROs, with its key PRO domains including but not limited to3:
• HRQoL (including functional status);
• Symptoms and symptom burden (e.g. pain, fatigue);
• Experience with care and satisfaction (Figure 2.2-1); and
• Health behaviours (e.g. smoking, diet, exercise).
Figure 2.2-1 Types of PROs currently used in medical research: QoL, HRQoL, utility, patient satisfaction, symptoms (impairment) and activity limitations (disability).
The most commonly used PROMs assess symptoms and/
or functional limitations. These are commonly referred to
as HRQoL measures. The commonly used measures which
generate utility values also ask about symptoms and/or
functional limitations. Patient experience and satisfaction
is generally concerned with issues such as the process of
treatment and relationships with the clinical staff. QoL
measures address need fulfilment rather than symptoms
and/or functional limitations.4
2.3 Patient reported outcome measures (PROMs)
PROMs are the standard tools for directly eliciting the PROs, and their use has become the standard both in regulated and non-regulated clinical trials and other quality improvement initiatives, particularly for the assessment of symptoms and HRQoL. Systematic collection of PROs data has been shown to be feasible and efficient, to be more reflective of underlying health status than clinician reporting, to predict meaningful clinical outcomes including survival, to increase patient satisfaction with care, to be valued by clinicians for documentation and clinical decision making, and to improve symptom management as well as the patient's overall health status.
2.4 Patient reported outcome measures (PROMs) vs patient reported experience measures (PREMs)
With the very broad definition by the FDA, some will include the PREMs as part of the PROMs, such as the positions adopted by the NQF and AHRQ.3 However, given the long history of measuring patient experience/satisfaction in many countries, some would normally make a differentiation between the term PROMs (which specifically refers to HRQoL measures, patient symptoms and side effects) and the term PREMs (which refers to all patient care experiences, values, preferences, satisfaction, expectations, etc).5 6 In this report, we will describe both forms of patients' reports; differentiation will be made in order to facilitate our discussion.
2.5 Why PROMs/PREMs are important to measure?
2.5.1 From a theoretical and conceptual point of view
The growing recognition of the importance of measuring PROM/PREM is, in part, inspired by the inclusion of PCC as one of the six key components of quality of care by the IOM and, more recently, by the Triple Aim promoted by the Institute for Healthcare Improvement (IHI), which posits that the three goals of care are to improve health outcomes (including HRQoL), improve patient experience and reduce waste (or cost). In 2015, there will be, for the first time, a joint conference of international evidence-based medicine and international patient shared decision-making held in Sydney to emphasise the importance of patient-centeredness, PROM/PREM, and patient values and preferences (so-called patient-based evidence) in shaping the whole evidence-base of health care decision-making in combination with the clinical and economic evidence (Figure 2.5-1).
Figure 2.5-1 Patient-based evidence as part of the whole new evidence-base for high quality patient care (patient-based evidence, clinical evidence and economic evidence together underpin high quality patient care).
2.5.2 From the clinical and health economic
point of view
There are several reasons why PROM/PREMs are important from the clinical and health economic point of view:
1. Any experienced provider knows that traditionally measured biomarkers often fail to correspond with how a patient is actually feeling. For example, in diabetes clinicians usually measure the haemoglobin A1C level. This value is often used to make treatment decisions, like how aggressively to treat the diabetes, or whether to change to a new medication. But the problem is that some patients may have a low haemoglobin A1C but still feel listless or depressed despite their favourable laboratory values. In contrast, others with unfavourable levels may, nonetheless, feel upbeat and vigorous. Thus, the traditional outcome measured by healthcare providers (e.g. haemoglobin A1C levels) may fail to capture other aspects of health.
2. Patients rarely value traditional biomarkers in the same manner as providers. For example, patients with hypertension often fail to share the same enthusiasm as their providers in achieving specific blood pressure goals but are quick to comply with therapy when their hypertension leads to headaches or dizziness. Measuring HRQoL, in particular, directly acknowledges that patients often value different outcomes than their providers.
3. PROMs provide a key component to understanding the true burden of any disease. Traditional measures of the disease burden include the prevalence of a disease, direct and indirect expenditures of a disease, and the worker productivity decrements related to a disease. However, in order to fully appreciate the true burden of a disease, it is also important to appreciate the HRQoL decrement engendered by the disease. The notion of “weighting” diseases not only by their cost and prevalence but also by their HRQoL decrement has an innate sense of fairness and is a fundamental principle of health economics7 (Table 2.5-1). For this reason alone it is critical to carefully understand the HRQoL decrement of various diseases because that information may have policy implications when it comes to developing a healthcare budget.
4. PROM/PREMs are especially important in diseases that are marked by morbidity but not necessarily by mortality. PROM/PREMs have large clinical relevance in patients with disorders such as chronic migraine headache, sleep disorders or depression.
Table 2.5-1 The types and description of common health economic analyses
Cost–offset study: Cost analysis that compares costs incurred with (other) costs saved; does not consider alternative use of resources elsewhere.
Cost–minimisation analysis: As cost analysis but compares two or more interventions or programmes; assumes outcomes of different programmes to be broadly equivalent.
Cost–consequence analysis (CCA): Compares the costs and consequences of two or more alternatives, but does not aggregate or synthesise costs and consequences, and all health outcomes are left in natural units.
Cost–effectiveness analysis (CEA): Relates costs to a (typically single) common outcome between alternative interventions/programmes (which can also involve no intervention).
Cost–utility analysis (CUA): Relates costs to utilities as a measure of programme effect; results of CUA are typically expressed in terms of cost per healthy year or cost per quality adjusted life year (QALY) gained.
Cost–benefit analysis (CBA): Economic evaluation that values all costs and benefits in the same (monetary) units; results of CBA are typically expressed as a ratio of costs to benefits or a sum representing the net benefit (or loss) of one programme over another.
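To make the cost–utility analysis entry concrete, the following minimal sketch (in Python, with entirely hypothetical costs and utility weights) shows how utility values elicited from a PROM can be converted into QALYs and an incremental cost-effectiveness ratio (ICER); the figures are illustrative assumptions only, not results from this review.

# Minimal sketch: cost per QALY gained (cost-utility analysis).
# All figures are hypothetical and for illustration only.

def qalys(utility: float, years: float) -> float:
    """QALYs = utility weight (0 = dead, 1 = full health) x years lived."""
    return utility * years

# Hypothetical two-arm comparison over a two-year horizon.
usual_care = {"cost": 4_000.0, "utility": 0.70, "years": 2.0}
integrated_care = {"cost": 5_500.0, "utility": 0.78, "years": 2.0}

qaly_usual = qalys(usual_care["utility"], usual_care["years"])          # 1.40
qaly_ic = qalys(integrated_care["utility"], integrated_care["years"])   # 1.56

incremental_cost = integrated_care["cost"] - usual_care["cost"]   # 1,500
incremental_qalys = qaly_ic - qaly_usual                          # 0.16

icer = incremental_cost / incremental_qalys   # about 9,375 per QALY gained
print(f"ICER: ${icer:,.0f} per QALY gained")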
Some of the rationales put forward for measuring PROs from a clinical perspective include:
• better communication and shared decision making by patients and providers;
• assessing the health status of patients entering therapy and identifying treatable problems;
• determining the degree and sources of the patient's decreased ability to function;
• distinguishing among different types of problems including physical, emotional, and social;
• detecting adverse effects of therapy;
• monitoring the effects of disease progression and response to therapy;
• informing decisions about changing treatment plans; and
• predicting the course of a disease and the outcomes of care.
2.5.3 From quality improvement and a
societal point of view
Patient and family engagement is increasingly
acknowledged as a key component of a comprehensive
strategy (along with performance improvement and
accountability) to achieve a high quality and affordable
health system. Emerging evidence affirms that patients
who are engaged in their care tend to experience
better outcomes and choose less costly but effective
interventions if they have participated in a process of
shared decision-making, such as physical therapy for low
back pain. Promising approaches to involve patients and
their families at multiple levels are being implemented
across many countries. Such activities include consumers
serving on governance boards at hospitals and
contributing to system and practice redesign to make care
safer and more patient-centric. There are growing interests
from many different organisations to engage patients by
building capacity and infrastructure to capture PROMs/
PREMs routinely and then to use these data to develop
performance measures to allow for accurate appraisals of
quality and efficiency.3 8-16
There is also growing recognition and vision that PROM/
PREM, in combination with other important data sources
such as clinical data, genetic and biobank data, registry
data, and administrative data within and beyond the
health sector, can form the individual digital footprint that
will support better clinical management of patients as well as improved comparative effectiveness research.
2.6 Framing PRO in a broader
determinants of health model
The broad definition of PRO, adopted by the top
organisations in the USA, extends the PRO into a much broader care model including preventive care (population
at-risk), acute care (evaluation and initial management of
patients) and sub-acute care (follow-up care)12. It is believed
that PROs at each stage (such as health-related behaviour
at the preventive care stage) are all important and so is the
experience with care12(Figure 2.6-1).
Figure 2.6-1 Framing PROs within existing conceptual models. The figure links a determinants of health model (genetics and biometrics, physical environment, social environment, lifestyle and health behaviours) with a patient focused episode of care model (Phase 1: population at risk; Phase 2: evaluation and initial management; Phase 3: follow-up care) and maps the PRO categories across the episode over time: HRQoL/functional status, health-related behaviours, symptom/symptom burden and experience with care.
2.7 The pathways from PRO to PRO-PM
Another important trend is to use PRO to affect change
and improve quality of care. In the USA, PRO has been extended beyond its original clinical purposes, such as improving communication between patients and providers and monitoring the treatment effects of different therapies, to being used actively in monitoring the provider and health system. In 2013, the
NQF, the major US organisation that reviews and endorses
quality metrics, assembled an expert panel to develop
standards around the development of PRO-PMs. It aims
to use PROs to affect changes and improve quality of care
at all three levels: patients, providers and health systems.
As a result, the white paper that described a pathway for
developing such measures toward NQF endorsement3,
was endorsed by the International Society for Quality of
Life Research.
Two examples of the distinctions in purpose among the three concepts (i.e. PRO, PROM and PRO-PM) are presented in Table 2.7-1.
Table 2.7-1 Distinctions among PRO, PROM, and PRO-PM: Two Examples
Example 1: Patients with clinical depression
PRO (patient-reported outcome): Symptom: depression.
PROM (instrument, tool, single-item measure): PHQ-9, a standardised tool to assess depression.
PRO-PM (PRO-based performance measure): Percentage of patients with diagnosis of major depression or dysthymia and initial PHQ-9 score >9 with a follow up score <5 at 6 months (NQF #0711).
Example 2: Persons with intellectual or developmental disabilities
PRO (patient-reported outcome): Functional status/role: employment.
PROM (instrument, tool, single-item measure): Single-item measure on the National Core Indicators Consumer Survey: Do you have a job in the community?
PRO-PM (PRO-based performance measure): The proportion of people with intellectual or developmental disabilities who have a job in the community.
(Source: PROs in Performance Measurement (NQF, 2013)3)
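As an illustration of how the first PRO-PM above could be computed from routinely collected PROM data, the following sketch (Python, with hypothetical records and field names) calculates the percentage of patients with an initial PHQ-9 score above 9 whose follow-up score is below 5. A real implementation of NQF #0711 would follow the full measure specification, including timing windows, denominator exclusions and the handling of missing follow-up data.

# Minimal sketch of a PRO-PM: percentage of patients with an initial
# PHQ-9 score > 9 whose follow-up PHQ-9 score at ~6 months is < 5.
# Records and field names are hypothetical.

patients = [
    {"id": 1, "baseline_phq9": 14, "followup_phq9": 4},
    {"id": 2, "baseline_phq9": 11, "followup_phq9": 8},
    {"id": 3, "baseline_phq9": 18, "followup_phq9": None},  # lost to follow-up
    {"id": 4, "baseline_phq9": 7,  "followup_phq9": 3},     # not in denominator
]

denominator = [p for p in patients if p["baseline_phq9"] > 9]
numerator = [p for p in denominator
             if p["followup_phq9"] is not None and p["followup_phq9"] < 5]

rate = 100 * len(numerator) / len(denominator)
response_rate = 100 * sum(p["followup_phq9"] is not None
                          for p in denominator) / len(denominator)

print(f"PRO-PM: {rate:.0f}% remission ({len(numerator)}/{len(denominator)}); "
      f"follow-up response rate {response_rate:.0f}%")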
In order to achieve such a conceptual change, the NQF
developed specific guidelines on the pathways from PRO
to PRO-PM (Figure 2.7-1) which required organisations to
follow the explicit published endorsement criteria and
seeks endorsement from the NQF. (Table 2.7-2).
Figure 2.7-1 Pathway from PRO to NQF-endorsed PRO-PM (the NQF endorsement process builds from PRO to PROM to PRO-PM)
1. Identify the quality performance issue or problem
• Include input from all stakeholders including consumers and patients
2. Identify outcomes that are meaningful to the target population and are amenable to change
• Ask persons who are receiving the care and services
• Identify evidence that the outcome responds to the intervention
3. Determine whether patient-/person-reported information is the best way to assess the outcome of interest
• If a PRO is appropriate, proceed to step 4
4. Identify existing PROMs for measuring the outcome (PRO) in the target population of interest
• Many PROMs (instrument/scale/single-item) were developed and tested primarily for research
5. Select a PROM suitable for use in performance measurement
• Identify reliability, validity, responsiveness, feasibility in the target population (see characteristics in Appendix C)
6. Use the PROM in the real world with the intended target population and setting to:
• Assess status or response to intervention, provide feedback for self-management, plan and manage care or services, share decision-making
• Test feasibility of use and collect PROM data to develop and test an outcome performance measure
7. Specify the outcome performance measure (PRO-PM)
• Aggregate PROM data such as average change; percentage improved or meeting a benchmark
8. Test the PRO-PM for reliability, validity and threats to validity
• Analysis of threats to validity, e.g. measure exclusions; missing data or poor response rate; case mix differences and risk adjustment; discrimination of performance; equivalence of results if multiple PROMs specified
9. Submit the PRO-PM to NQF for consideration of NQF endorsement
• Detailed specifications and required information and data to demonstrate meeting NQF endorsement criteria
10. Evaluate the PRO-PM against the NQF endorsement criteria
• Importance to Measure and Report (including evidence of value to patient/person and amenable to change)
• Scientific Acceptability of Measure Properties (reliability and validity of PROM and PRO-PM; threats to validity)
• Feasibility
• Usability and Use
• Comparison to Related and Competing Measures to harmonise across existing measures or select the best measure
11. Use the endorsed PRO-PM for accountability and improvement
• Refine measure as needed
12. Evaluate whether the PRO-PM continues to meet NQF criteria to maintain endorsement
• Submit updated information to demonstrate meeting all criteria including updated evidence, performance and testing; feedback on use, improvement and unintended adverse consequences
(Source: PROs in Performance Measurement (NQF, 2013)3)
Table 2.7-2 NQF Endorsement Criteria and their Application to PRO-PMs

Importance to Measure and Report
Abbreviated NQF endorsement criteria:
a. High impact
b. Opportunity for improvement
c. Health outcome OR evidence-based process or structure of care
Considerations for evaluating PRO-PMs that are relevant to other performance measures:
• Does evidence support that the outcome is responsive to intervention?
• When should the evidence exception be allowed for performance measures focused solely on conducting an assessment (e.g., administering a PROM, lab test)?
Unique considerations for evaluating PRO-PMs:
• Patients/persons must be involved in identifying PROs for performance measurement (person-centered; meaningful).

Scientific Acceptability of Measure Properties
Abbreviated NQF endorsement criteria:
a. Reliability
1. precise specifications
2. reliability testing for either data elements or performance measure score
b. Validity
1. specifications consistent with evidence
2. validity testing for either data elements or performance measure score
3. exclusions
4. risk adjustment
5. identify differences in performance
6. comparability of multiple data sources
Considerations for evaluating PRO-PMs that are relevant to other performance measures:
• Data collection instruments (tools) should be identified (e.g., specific PROM instrument, scale, or single item).
• If multiple data sources (i.e., PROMs, methods, modes, languages) are used, then comparability or equivalency of performance scores should be demonstrated.
Unique considerations for evaluating PRO-PMs:
• Specifications should include standard methods, modes, languages of administration; whether (and how) proxy responses are allowed; standard sampling procedures; how missing data are handled; and calculation of response rates to be reported with the performance measure results.
• Reliability and validity should be demonstrated for both the data (PROM) and the PRO-PM performance measure score.
• Response rates can affect validity and should be addressed in testing.
• Differences in individuals' PROM values related to PROM instruments or methods, modes, and languages of administration need to be analysed and potentially included in risk adjustment.

(Source: PROs in Performance Measurement (NQF, 2013)3)
3 | Selecting PROMs and PREMs
3.1 Where to find PROM/PREM?
There are many sources that can be accessed to identify existing PROM/PREM tools. The PROMIS website provides many useful links (http://www.nihpromis.org/resources/resourcehome).
3.1.1 Online libraries and databases of PROs
A selection of online libraries and databases of PROs is as follows:

• PROQOLID (Patient-Reported Outcome and Quality of Life Instruments Database): MAPI Research Trust site: http://www.mapi-trust.org/. This database requires a subscription fee and currently includes over 900 different PRO instruments.

• PROMs group, Oxford (http://phi.uhce.ox.ac.uk/about.php): There is an online bibliographic database accessible through the website. However, it was only updated to December 2005 and no further update is planned.

• BiblioPRO (http://www.bibliopro.org): the only virtual library of tools in Spanish for HRQoL and other PROs. The new website, which started in 2011, currently includes more than 600 instruments in Spanish, of which 25%, including the SF-12 and Family SF, may be downloaded directly.

• National Quality Measures Clearinghouse (NQMC), AHRQ (http://www.qualitymeasures.ahrq.gov/hhs/index.aspx): Although this online source is not specifically designed for PROMs, it does include many potentially useful measures, in particular for measuring patient reported experiences, as this is a clearly defined domain in the AHRQ quality matrix.

• GEM (GRID-enabled Measures) Database (http://cancercontrol.cancer.gov/brp/gem.html): GEM, currently hosted by the National Cancer Institute of NIH, is an interactive website containing behavioural, social science and other scientific measures organised by theoretical constructs. GEM enables researchers to collaborate with others, encourages the use of common measures and facilitates the sharing of harmonised data:
1) Users contribute to the virtual community by adding or editing meta-data about constructs and measures.
2) Users rate and comment to drive consensus on best measures.
3) Users search for constructs (e.g. anxiety, depression), see definitions, view theoretical foundations and download associated measures.
4) Users search for measures and see attributes (e.g. definition, associated construct, target population, author, reliability, validity).
5) Users download and share datasets using GEM measures and constructs.

The goals of GEM are to enable users to collaborate with their peers to build consensus on the use of common measures and to facilitate broad-scale data sharing and harmonisation. GEM allows users to interact with each other in an online environment. Based upon a wiki platform, users contribute to the website by adding and editing information about measures and associated constructs and by providing feedback and ratings on the measures and constructs in GEM. The GEM database currently includes 892 measures and 389 constructs, and access to the information is free for registered users.

3.1.2 Systematic review and search of online bibliographic databases

A common approach to finding the most relevant PRO tools is to conduct a systematic review of relevant bibliographic databases such as Medline, PsychInfo or the Cochrane Library. The COSMIN website currently maintains an updated list (up to June 2014) of published systematic reviews on HR-PROs, and the list is included in Appendix 1.
3.2 How to choose a PROM/PREM –
methodological issues
3.2.1 Data source, mode and method
When choosing any specific PROM/PREM, a few factors should be considered in combination with issues related to its measurement properties. These factors include the data source (self-report vs proxy), mode of administration (self-administration vs interview), setting of administration (clinic, home, other) and scoring (classical test theory vs modern test theory).
The data source, mode and method of administration can also be combined in various ways, e.g. a patient might use the telephone to self-administer a PRO instrument, or an interviewer might use a computer to read questions and record answers17 (Figure 3.2-1).
The main characteristics, together with the strengths and weaknesses of each of these methods, are summarised in Table 3.2-1.17 The strengths and weaknesses should be carefully weighed before and after a particular instrument is chosen.
Figure 3.2-1 Types of respondent data and methods/modes of administration: data source (self-report vs proxy/observer), mode (self-administration vs interviewer-administration) and method (paper-and-pencil, telephone or computer). (Source: Cella et al 201217)
Table 3.2-1 Main characteristics of PRO methodological issues

Source of report
• Self: the individual responds about him/herself. Strengths: expert on own experience. Limitations: not always possible to assess directly, e.g. because of cognitive or communication deficits or age/developmental level.
• Proxy: the individual responds about someone else. Strengths: useful when the target of assessment is unable to respond; can provide complementary information. Limitations: may not accurately represent subjective or other experiences.

Mode of administration
• Self: the individual self-administers the PRO and records the responses. Strengths: cost-effective; may yield more participant disclosure; proceed at one's own pace. Limitations: potential for missing data; requires a simple survey design (e.g. minimal skip patterns).
• Interviewer: an interviewer reads questions out loud and records the responses. Strengths: allows more complex survey design (e.g. skip patterns); useful for respondents with reading, writing or vision difficulties. Limitations: interviewer costs; potential for bias (interviewer bias, social desirability bias, acquiescent response sets).

Method of administration
• Paper-and-pencil: patients self-administer the PRO using paper and a writing utensil. Strengths: cost-effective. Limitations: prone to data entry errors; data entry and scoring require more time; less amenable to incorporation within an EHR.
• Electronic: the patient self-administers the PRO using a computer- or telephone-based platform. Strengths: interactive; practical; increased comfort for reporting socially undesirable behaviours; minimises data entry errors; immediate scoring and feedback; amenable to incorporation within an EHR. Limitations: cost; potential discomfort with technology; accessibility; measurement equivalence.

Setting of administration
• Clinic: patients complete PROs when they arrive for clinic appointments. Strengths: real-time assessment of outcomes; feasibility with use of electronic methods of administration. Limitations: impact on clinic flow; interruptions resulting in missing data; patient anxiety; staff burden.
• Home: patients complete PROs at home prior to, or in between, clinic visits. Strengths: minimises impact on clinic flow; minimises staff burden. Limitations: accessibility; health information privacy; data security; patient safety.
• Other: patients complete PROs in other types of settings (e.g. skilled nursing, rehabilitation). Strengths: feasibility with electronic methods of administration. Limitations: cognitive capacity and potential need for a proxy.

Scoring
• Classical test theory: raw scores. Strengths: easy to implement and understand. Limitations: all items must be administered.
• Modern test theory: probabilistic approach. Strengths: enables CAT (tailored questions); shorter questionnaires with more precision. Limitations: difficult to implement and understand.
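The scoring row of Table 3.2-1 contrasts classical test theory (a raw score summed over all items) with modern test theory, which underpins computerised adaptive testing (CAT). The sketch below is a simplified illustration of that difference only, assuming a two-parameter logistic (2PL) IRT model; the items and parameters are invented for illustration and this is not the scoring algorithm of any particular instrument.

# Minimal sketch: CTT raw-sum scoring vs an IRT-style adaptive item choice.
# Item parameters (discrimination a, difficulty b) are hypothetical.
import math

items = {"pain_walking": (1.4, -0.5), "pain_stairs": (1.1, 0.3), "pain_rest": (0.9, 1.2)}

def ctt_raw_score(responses):
    # Classical test theory: every item is administered and simply summed.
    return sum(responses.values())

def p_endorse(theta, a, b):
    # 2PL model: probability of endorsing an item given latent trait theta.
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def most_informative_item(theta, remaining):
    # CAT picks the unadministered item with maximum Fisher information at the
    # current trait estimate, so fewer items can achieve the same precision.
    def info(name):
        a, b = items[name]
        p = p_endorse(theta, a, b)
        return a * a * p * (1.0 - p)
    return max(remaining, key=info)

print(ctt_raw_score({"pain_walking": 1, "pain_stairs": 0, "pain_rest": 0}))
print(most_informative_item(theta=0.0, remaining=list(items)))

The design point is simply that CTT requires the full item set, whereas an IRT-based CAT selects the next most informative item at each step, which is why instruments such as PROMIS can achieve precise measurement with fewer items per concept.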
3.2.2 Other confounding factors in choosing a PROM/PREM

There are a number of factors that may impede or complicate the decision to choose a particular PROM/PREM. These factors include:
• vulnerable populations (very young, old, fragile)
• low literacy
• language and culture (Indigenous people or people from a culturally or linguistically diverse background)
• functional abilities (both cognitive and physical).
3.2.3 Implications of different methods
and modes on response rate, reliability and
validity
Data collection methods
Data collection methods differ along a variety of dimensions. These include the degree of interviewer involvement and the level of interaction with the respondent. The channels of communication used (sight, sound, touch) can be critical; various combinations may prompt different issues of comprehension, memory stimulation, social influence affecting judgment, and response hurdles. Finally, the degree of technology used is a major consideration.
Using a different method or mode than originally
validated
Considering the implications of using a different method or mode than the one on which the PROM/PREM was originally validated is also important. Many existing PROMs/PREMs were initially validated in paper-and-pencil form. However, potential differences exist between paper-and-pencil and electronic-based PROM/PREM administration, ranging from differences in how items and responses are presented (e.g. items presented one at a time, size of text) to differences in participant comfort level in responding (e.g. ability to interact with an electronic-based platform).
Implications of using multiple methods and modes
The implications of using multiple methods and modes
also warrant consideration. One might choose to blend
methods for one or more reasons: cost reduction, faster
data collection, and optimisation of response rates.
When combining methods or modes (or both), users
must ensure that they can disentangle any effects of the
method or mode from other population characteristics.
This is especially true when respondents choose which
method or mode they prefer or when access issues
determine the choice of method or mode.
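As a minimal, hypothetical sketch of the point above, the snippet below compares mean PROM scores by administration mode within age strata, so that a crude mode difference is not confounded with a population characteristic such as age. All records and strata are invented; a real evaluation would use formal equivalence testing rather than this simple stratified comparison.

# Minimal sketch: a crude stratified check for mode effects (hypothetical data).
from collections import defaultdict
from statistics import mean

records = [
    {"mode": "paper", "age_group": "<65", "score": 72},
    {"mode": "paper", "age_group": "65+", "score": 64},
    {"mode": "web",   "age_group": "<65", "score": 75},
    {"mode": "web",   "age_group": "65+", "score": 63},
    {"mode": "web",   "age_group": "<65", "score": 78},
    {"mode": "paper", "age_group": "65+", "score": 66},
]

groups = defaultdict(list)
for r in records:
    groups[(r["age_group"], r["mode"])].append(r["score"])

# Compare modes within each age stratum rather than overall, so that a
# difference driven by who chose (or had access to) each mode is not
# mistaken for an effect of the mode itself.
for (age_group, mode), scores in sorted(groups.items()):
    print(age_group, mode, round(mean(scores), 1))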
Accounting for the impact of non-responders
Difficulties with data collection and questionnaire
completion are major barriers to the successful
implementation of PROM/PREM. Missing data may be classified as either item non-response (one or more missing items within a questionnaire) or unit non-response (the whole questionnaire is missing for a patient). Evaluating the amount, reasons and patterns of missing data is important.
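As a concrete illustration of the distinction above, the hypothetical snippet below classifies each returned questionnaire as complete, item non-response or unit non-response, and reports the unit response rate; the data structure is an assumption made for this sketch only.

# Minimal sketch: classifying item vs unit non-response for a PROM survey.
# A record maps item names to answers; None marks an unanswered item.

def classify(record, n_items):
    answered = sum(v is not None for v in record.values())
    if answered == 0:
        return "unit non-response"       # whole questionnaire missing
    if answered < n_items:
        return "item non-response"       # one or more items missing
    return "complete"

surveys = [
    {"q1": 3, "q2": 4, "q3": 2},            # complete
    {"q1": 5, "q2": None, "q3": 1},         # item non-response
    {"q1": None, "q2": None, "q3": None},   # unit non-response
]
labels = [classify(s, n_items=3) for s in surveys]
unit_response_rate = sum(l != "unit non-response" for l in labels) / len(labels)
print(labels, f"unit response rate = {unit_response_rate:.0%}")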
3.2.4 Minimally important differences and
changes
Historically, research has relied upon tests of statistical
significance to examine differences in PROM/PREM
scores between patients or within patients over time.
Attention has shifted to the concept of clinically significant
differences in PROM/PREM scores.
• Minimally important differences (MIDs) represent a specific approach to clinical significance and are defined as “…the smallest difference in score in the outcome of interest that informed patients or informed proxies perceive as important.”18
• Minimum clinically important differences (MCIDs) comprise an even more specific category of MID and are defined as “the smallest difference in score in the domain of interest which patients perceive as beneficial and which would mandate, in the absence of troublesome side effects and excessive cost, a change in the patient’s management.”19

The examination of clinically significant differences carries a number of important implications:18
• aids in the interpretation of PROMs/PREMs
• emphasises the importance of the patient perspective
• informs the evaluation of the success of a clinical intervention
• assists with sample size estimation.
No methodological “gold standard” currently exists for estimating MIDs;20 21 however, two primary methods are in use:
1) the anchor-based method and
2) the distribution-based method.
The anchor-based method of establishing MIDs assesses
the relationship between scores on the PROM/PREM and an
independent measure which is interpretable18, including:
• clinical anchors which are correlated with the PROM/PREM at the r ≥ 0.30 level, including clinical trial experience22
• transition ratings, which are within-person global ratings of change made by a patient.22 23
However, due to concerns about validity, it is
recommended that researchers examine the correlation
between pre- and post-test PROM/PREM scores and the
transition rating.24 Between-person differences made by
patients can also be used as anchors when establishing
MIDs for PROM/PREM measures.
Additional sources such as HRQoL-related functional
measures 22 23 and objective standards (e.g. hospital
admissions, time away from work24) can also be used as
anchors. Several limitations should be considered. Firstly,
the transition rating approach to anchor selection is
subject to recall bias on the part of the patient.20 Secondly,
global ratings may only account for some variance in
PROM/PREM scores.20 Thirdly, the anchor-based method does not take into consideration the measurement precision of the instrument.20
The distribution-based method represents the second
method of establishing MIDs in PROM/PREM. The
distribution-based method uses the statistical characteristics
of the PROM scores when establishing MIDs.25
The distribution-based approach evaluates change in
scores in relation to the probability that the change
occurred at random.20 Although several methods are available for applying a distribution-based approach to MID establishment, there is little consensus on the benchmarks for establishing clinically significant change.20
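As a worked sketch of the two approaches described above (hypothetical data throughout), the code below estimates an anchor-based MID as the mean PROM change among patients whose transition rating is "a little better", and distribution-based MIDs using the common half-standard-deviation and standard-error-of-measurement rules of thumb; the scores, ratings and the assumed reliability coefficient are all invented for illustration.

# Minimal sketch: anchor-based vs distribution-based MID estimation.
# Change scores, transition ratings and reliability are hypothetical.
from statistics import mean, stdev

changes = [8, 6, 1, -2, 12, 5, 0, 7]                 # follow-up minus baseline
anchor = ["little better", "little better", "same", "same",
          "much better", "little better", "same", "little better"]

# Anchor-based: mean change among patients reporting being "a little better"
anchor_mid = mean(c for c, a in zip(changes, anchor) if a == "little better")

# Distribution-based: 0.5 SD of baseline scores, and SEM = SD * sqrt(1 - reliability)
baseline = [40, 55, 47, 60, 38, 52, 49, 44]
half_sd_mid = 0.5 * stdev(baseline)
sem_mid = stdev(baseline) * (1 - 0.85) ** 0.5          # assumed reliability = 0.85

print(round(anchor_mid, 1), round(half_sd_mid, 1), round(sem_mid, 1))

Triangulating across such estimates, rather than relying on any single value, is consistent with the strategies listed below.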
Important strategies in choosing an MID include:
• multiple methods and triangulation should be used to determine the MID, given the limitations of the anchor- and distribution-based approaches20 26
• the final selection of MID values should be based on systematic review and an evaluation process such as the Delphi method26
• MID values should also be informed by a stakeholder consensus, which includes patient engagement and input, regarding the extent of change considered to be meaningful
• when considering MIDs for PROM/PREM, a single MID should only be applied to a situation involving that particular PROM/PREM, given that the MID varies by population/context26
• the distribution around the MID should be provided rather than just a single MID value27
• the proportion of patients who experience a clinically significant change should be calculated, because the criteria for assessing clinically important change in individuals do not directly translate to evaluating clinically important group differences.22
3.2.5 Response shift, adaptation and other
challenges to detect true change
The ability to detect true change over time in PROM/PREM
poses another barrier to the integrity of PROM/PREM
assessment. Often detecting true change is associated
with the phenomenon of response shift which has been
defined as:
“a change in the meaning of one’s self-evaluation of
a target construct as a result of: (a) a change in the
respondent’s internal standards of measurement (i.e. scale
recalibration); (b) a change in the respondent’s values (i.e. the
importance of component domains constituting the target
construct) or (c) a redefinition of the target construct (i.e.
reconceptualisation)” (p.1532).28
A change in perspective over time may result in patients’
attending to PROMs/PREMs in a systematically different
way from one time point to another.29
Response shift serves as a barrier to PROM/PREM
assessment:
• it threatens longitudinal PROM/PREM assessment validity, reliability and responsiveness29
• response shift can complicate the interpretation of PROM/PREM, since a change in PROM/PREM outcome may occur because of response shift, an effect of treatment, or both.30
Monitoring for response shift can aid PROM/PREM users
in interpreting longitudinal PROM/PREM data.31 Several
strategies have been proposed to identify response shift,
although each has limitations:
1) The “then test” compares an actual pre-test rating and a retrospective pre-test rating to assess for shift, but it is less robust than other methods of detecting response shift29 and is confounded with recall bias28 (a minimal worked sketch follows this list).
2) Structural equation modelling has also been proposed as a way to identify response shift; however, it is sensitive only if most of the sample is likely to make response shifts.32
3) Growth modelling creates a predictive growth curve model to investigate patterns in discrepancies between expected and observed scores, thus assessing response shift at the individual level.33 Although growth modelling enables users to detect both the timing and shape of the response shift31, it cannot differentiate between random error and response shift.28
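To make the "then test" concrete, the hypothetical sketch below decomposes observed change for one patient into a recalibration (response-shift) component and an adjusted, treatment-related component, using one conventional way of operationalising the method; the scores are invented and other formulations exist in the literature.

# Minimal sketch: then-test decomposition of change for one patient.
# pre  : rating made at baseline
# post : rating made at follow-up
# then : retrospective re-rating of baseline made at follow-up
def then_test(pre, post, then):
    observed_change = post - pre       # conventional change score
    adjusted_change = post - then      # change using the recalibrated baseline
    recalibration = pre - then         # response-shift (recalibration) component
    return observed_change, adjusted_change, recalibration

# Hypothetical patient: reports 60 at baseline and 70 at follow-up, but in
# retrospect rates their baseline as only 50 (internal standards have shifted).
print(then_test(pre=60, post=70, then=50))   # (10, 20, 10)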
3.2.6 Generic versus disease- or condition-specific PROM

One primary factor to consider when selecting a patient-level PROM is whether to use a generic or a condition-specific PROM. Several elements inform the selection of measures.34 35

1) Targeted population: The specific population of interest may indicate whether one opts to use a generic or condition-specific PRO. For example, if the target population comprises mainly healthy individuals, a generic measure may be the preferred choice. Conversely, when examining a specific subset of patients with a particular health concern, a condition-specific measure may be more appropriate.

2) Outcomes of interest: Generic measures may capture a different category of outcomes when compared to a condition-specific PRO. For example, a generic measure may assess the domains of general QoL, whereas a condition-specific PRO may measure symptoms expected to be directly addressed by a condition-specific intervention.

3) The assessment purpose: For example, for labelling purposes one should follow the FDA guidance, which states that pharmaceutical company claims of improved QoL must be specific to the QoL domain that was measured; the agency recommends that assessment of specific symptoms is an appropriate starting point.

The strengths and weaknesses of both generic and condition-specific measures can be summarised34 35 (Table 3.2-2).

Table 3.2-2 The strengths and weaknesses of generic and condition-specific PROMs

Generic PROMs
Strengths:
• allow for comparability across patients and populations (more suitable for comparison across groups than for individual use)
• allow assessments in terms of normative data which can be used to interpret scores
• enable evaluation against population norms or comparison with information about various disease conditions
• can be applied to individuals without specific health conditions
• can differentiate groups on indexes of overall health and well-being
Weaknesses:
• less sensitive to change than condition-specific measures
• may fail to capture important condition-specific constructs

Condition- or disease-specific PROMs
Strengths:
• greater sensitivity to change because they focus on the concerns pertinent to the given condition
• enable differentiation of groups at the level of specific symptoms or patient concerns
Weaknesses:
• introduce the notable difficulty of making comparisons across patient populations with different diseases or health conditions
In general, a combination of generic and condition-specific
measures is likely to be the best choice for performance
measurement purposes. Generic and condition-specific
PRO measures may measure different aspects of HRQoL
when administered in combination, resulting in a more
comprehensive assessment. Consequently, hybrid measurement systems such as the Functional Assessment of Chronic Illness Therapy (FACIT) system (www.facit.org) and PROMIS were developed to create item banks that are appropriate for use across common chronic disease conditions as well as for specific conditions, representing both global and targeted approaches.
3.2.7 Other methodological issues related to
choosing PROM/PREMs
There are many other complex conceptual and
methodological issues related to the measurement of
PROMs/PREMs that are presented in Appendix 2.
3.3 Using PROM/PREM as quality
measures: the desirable attributes
When using PROM/PREM as quality measures in order to effect change, PROM/PREM can also be assessed against the AHRQ framework of desirable attributes of quality measures in three broad conceptual areas: (1) importance of a measure, (2) scientific soundness of a measure, and (3) feasibility of a measure. AHRQ's criteria for users to judge a desirable quality measure are paraphrased below (more details can be found in NQMC's Template of Measures Attributes, http://www.qualitymeasures.ahrq.gov/about/template-of-attributes.aspx).
1) Importance of the measure
• Relevance to stakeholders - the topic area of the measure is of significant interest and is financially and strategically important to stakeholders (e.g. patients, clinicians, purchasers, public health officers, policy makers).
• Health importance - the aspect of health that the measure addresses is important, as defined by high prevalence or incidence and/or a significant effect on the burden of illness (i.e. effect on the mortality and morbidity of a population).
• Applicability to measuring the equitable distribution of health care (for health care delivery measures) or of health (for population health measures) - the measure can be stratified or analysed by subgroups to examine whether disparities in care or health exist amongst a diverse population of patients.
• Potential for improvement - there is evidence indicating a need for the measure because there is overall poor quality or variation in quality among organisations (for health care delivery measures) or overall poor quality of health or variation in quality of health among populations (for population health measures).
• Susceptibility to being influenced by the health care system - for health care delivery measures, the results of the measure relate to actions or interventions that are under the control of those providers whose performance is being measured, so that it is possible for them to improve that performance. For public health measures, the results should be susceptible to influence by the public health system.

2) Scientific soundness: clinical logic
• Explicitness of evidence - the evidence supporting the measure is explicitly stated.
• Strength of evidence - the topic area of the measure is strongly supported by the evidence, i.e. indicated to be of great importance for improving quality of care (for health care delivery measures) or improving health (for population health measures).

3) Scientific soundness: measure properties
• Reliability - the results of the measure are reproducible for a fixed set of conditions irrespective of who makes the measurement or when it is made; reliability testing is documented.
• Validity - the measure truly measures what it purports to measure; validity testing is documented.
• Allowance for patient/consumer factors as required - the measure allows for stratification or case-mix adjustment if appropriate.
• Comprehensible - the results of the measure are understandable for the user who will be acting on the data.

4) Feasibility
• Explicit specification of numerator and denominator - a measure should usually have explicit and detailed specifications for the numerator and denominator; statements of the requirements for data collection are understandable and implementable. Some measures that do not have explicit and detailed specifications for the numerator and denominator (e.g. measures that have counts or means) can be feasible for quality improvement purposes when used with a specified baseline, benchmark and/or target (a minimal sketch follows this list).
• Data availability - the data source needed to implement the measure is available and accessible within the timeframe for measurement. The costs of abstracting and collecting data are justified by the potential for improvement in care or health.
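As a minimal sketch of the explicit numerator/denominator specification referred to above, the hypothetical function below computes a rate-based measure with an exclusion rule applied to the denominator. The fields and criteria are invented for illustration and do not correspond to any NQMC measure.

# Minimal sketch: an explicitly specified rate-based quality measure.
# Denominator: eligible patients after exclusions; numerator: those meeting
# the measured criterion. All fields and criteria are hypothetical.

def rate_measure(patients):
    denominator = [p for p in patients
                   if p["eligible"] and not p["excluded"]]      # explicit exclusions
    numerator = [p for p in denominator if p["criterion_met"]]  # explicit numerator
    return len(numerator), len(denominator), 100 * len(numerator) / len(denominator)

cohort = [
    {"eligible": True,  "excluded": False, "criterion_met": True},
    {"eligible": True,  "excluded": False, "criterion_met": False},
    {"eligible": True,  "excluded": True,  "criterion_met": True},   # excluded
    {"eligible": False, "excluded": False, "criterion_met": True},   # not eligible
]
print(rate_measure(cohort))   # (1, 2, 50.0)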
3.4 Other principles of selecting PROMs/
PREMs in IC
Indicators can be more or less robust and meaningful depending on their characteristics and whether they meet certain criteria. Generic criteria that indicators should meet if they are to be useful include36:
• importance and relevance
• validity
• accuracy
• reliability
• feasibility
• meaningfulness
• implications for action
• avoidance of perverse incentives.

Wider considerations could also inform the selection of indicators36, such as:
• size of the population covered
• representation of important aspects of the care system
• wholly or partly within the control of care services, i.e. attributability
• change detectable within suitable time frames
• unambiguous interpretation
• likelihood of being meaningful to users, carers and the public
• likelihood of being meaningful to care professionals, managers and commissioners
• reflecting the user perspective and/or value for money perspective
• timeliness
• ability to assess the impact on inequalities between user groups and areas in terms of access and outcomes of care
• measurable from routinely collected data.
4 | Capturing PROMs and PREMs
4.1 The challenges in capturing PROMs
and the promises of PROMIS
Despite a growing interest in the integration of PROMs
into clinical practice, efforts have been hampered by a
number of challenges.
These include:
1. floor and ceiling effects that limit sensitivity to change;
2. lengthy questionnaires that increase patient burden;
3. a proliferation of measures of the same outcome
limiting the ability of decision makers to compare
results across studies;
4. some promising PROMs have not been validated
specifically in the clinical population under study;
5. a scarcity of evidence regarding the validity of PROMs,
despite the FDA urging that special attention be
paid to this in its guidelines on the use of PROMs for
pharmaceutical labelling claims.2
Collectively, these challenges have limited the use of PROs as endpoints within clinical trials and clinical practice, and have inhibited the adoption of key trial findings by practitioners. Due to the lack of standardised instruments validated in large heterogeneous populations, clinicians and policy makers believe that some instruments may not have decision-making relevance (external validity) in clinical practice.
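As a concrete illustration of the first challenge listed above, a floor or ceiling effect is commonly flagged when a substantial proportion of respondents score at the minimum or maximum of the scale, leaving no room to register deterioration or improvement. The sketch below uses hypothetical scores and an illustrative 15% flagging threshold; neither is a standard drawn from this report's sources.

# Minimal sketch: checking a PROM score distribution for floor/ceiling effects.
# Scores and the 15% flagging threshold are hypothetical illustrations.

def floor_ceiling(scores, minimum, maximum, threshold=0.15):
    floor = sum(s == minimum for s in scores) / len(scores)
    ceiling = sum(s == maximum for s in scores) / len(scores)
    return {
        "floor_pct": round(100 * floor, 1),
        "ceiling_pct": round(100 * ceiling, 1),
        "floor_effect": floor >= threshold,     # little room to detect worsening
        "ceiling_effect": ceiling >= threshold, # little room to detect improvement
    }

scores = [100, 95, 100, 88, 100, 100, 72, 100, 90, 100]
print(floor_ceiling(scores, minimum=0, maximum=100))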
“The clinical outcomes research enterprise would be
enhanced greatly by the availability of a psychometrically
validated dynamic system to measure PROs efficiently in
study participants with a wide range of chronic diseases
and demographic characteristics.”
National Institute of Health, 2003
The PROMIS Network, a component of the NIH’s Reengineering the Clinical Research Enterprise Program,
seeks to overcome the limitations in existing PRO
instruments by:
1. developing and testing large PRO item banks based on
IRT covering a wide range of concepts and constructs
such as pain, fatigue, physical functioning, emotional
distress and social role participation that have a major
impact on QoL across a variety of chronic diseases;
2. creating a CAT system for the assessment of PROs in
clinical research; and
3. creating a publicly available and updatable system for
accessing and using the item bank via the CAT system
known as Assessment Centre (www.assessmentcenter.net).
This initiative applies to a wide range of disorders
including cancer, congestive heart failure, depression,
arthritis, and multiple sclerosis as well as chronic pain
conditions. PROMIS is creating new paradigms for how
clinical research information is collected, used and
reported. The PROMIS initiative addresses a need in
the clinical research community for a rigorously tested
PROMs tool that utilises recent advances in information
technology, psychometrics, qualitative research, cognitive
research and health survey research.
PROMIS has many assessment options available to
measure self-reported health for clinical research and
practice. PROMIS assessment instruments are drawn
primarily from calibrated item banks (sets of well-defined and validated items) measuring concepts such
as pain, fatigue, physical function, depression, anxiety
and social function. These calibrated item banks can be
used to derive short forms (typically requiring 4-10 items per concept) or computerised adaptive testing (CAT, typically requiring 4-7 items per concept for more precise measurement). Assessments are available for children and
adults. Most PROMIS instruments are available through
Assessment Centre. Those which are not yet available on
Assessment Centre can be obtained by contacting the
PROMIS statistical centre through help@assessmentcenter.net. The Assessment Centre can be utilised for online or
offline computer-based administration or instruments
can be downloaded for paper administration or entry into
other data collection platforms. For registered users, all
the instruments, documentation and necessary computer platforms are free at the time of writing this report.
The instruments from PROMIS are available in the form
of item banks, short forms and profiles. Item banks are
calibrated items from which a summary score can be
obtained from a subset of items (i.e. via CAT or short form)
whereas scales are calibrated items from which a summary
score should be obtained only from the complete set of
items. Item pools are collections of related items that are
not intended to produce a summary score but instead are
to be used as single items. Short forms are static subsets of
item banks and profiles are fixed collections of short forms
measuring multiple concepts.
During the first phase of the initiative (2004 to 2009), PROMIS formed a network of researchers that developed questions or “items” to analyse five outcomes or “domains”. PROMIS is creating a psychometrically robust CAT system based on IRT to administer these items. In addition, it has developed a web-based system to give clinical researchers access to the item banks and the CAT system. Whether administered through an iterative CAT system that allows research flexibility or by paper-version short forms, PROMIS has already demonstrated improved efficiency and sensitivity in comparison with existing PROMs. Long-term trials are planned to address the issues of validity and sensitivity to change in clinical populations. The efficiency, flexibility and sensitivity of PROMIS give it the potential to become a widely accepted, standardised PRO measurement tool that will allow greater comparability of studies with a reduced burden on patients.

Figure: Data collection formats for patient reported outcomes. Conventional formats rely on static paper or electronic forms and customised or pre-specified short forms (which may include simple branching but no dynamic item selection), whereas item response theory enables dynamic electronic forms with full computer adaptive item selection, in which the next item is selected based on the previous response.

During the second phase (2009-2013) of the PROMIS initiative, it continued to advance the field of patient self-reporting in clinical research and practice by:
• developing new items and domains
• translating current and future items and domains into other languages such as Spanish and Chinese to facilitate international studies
• conducting validation studies in large-scale clinical trials in a variety of clinical populations
• making PROMIS tools accessible to a wider range of clinical researchers and patient-care communities, and optimising their usability for rapid adoption
• providing on-going education and outreach to familiarise users with new developments in PROMIS
• improving PROMIS tools to allow for better outcomes in clinical trials and, potentially, better individual and clinical decisions
• engaging stakeholders at all levels, by including interactions with other health-related federal agencies, forging new relationships with patients and patient organisations and establishing public-private partnerships to sustain PROMIS once Roadmap funding ends.
PROMIS II Network Structure 2009-2013 (overview)

Oversight: NIH; SMB; PROMIS Network Steering Committee (PNSC); PROMIS Network Executive Committee (PNEC); external collaborators.

Coordinating centres:
• PROMIS Statistical Centre (PSC), Northwestern University (Dr David Cella)
• PROMIS Network Centre (PNC), American Institutes for Research (AIR) (Dr San Keller)
• PROMIS Technology Centre (PTC), Northwestern University (Dr Richard Gershon)

Domain and early clinical validation: new domain (Arthur Stone); large clinical validation site (Arnold Potosky).

Domain development/early validation sites:
• Stephen Haley - PROs in children and young adults with disabilities
• Darren DeWalt - paediatrics: longitudinal study linking paediatric and adult item banks
• Christopher Forrest - paediatric PROMIS: advancing the measurement and conceptualisation of child health
• Kevin Weinfurt - sexual functioning
• Dinesh Khanna - development/validation of PROMIS GI distress
• Esi Morgan DeWitt - enhancing PROMIS in paediatric pain
• Lisa Shulman - development/validation of a self-efficacy item bank
• Donald Patrick - patient reported outcomes in routine clinical care of patients infected with HIV
• Paul Pilkonis - development/validation of mental health/sleep-wake function
• James Fries - improving assessment of PF/drug safety in health and disease
Source: NIH Clinical Outcomes Assessment: PROMIS Overview (http://commonfund.nih.gov/clinicalresearch/overview-dynamicoutcomes.aspx; accessed in Dec. 2011)
In an ambitious move, the PROMIS initiative aims to
achieve four values which almost all previous attempts to
develop PROs have failed to address:
1. comparability: the measures can be used and compared across different diseases, conditions and populations, as well as across the life course
2. reliability (precision) and validity: extensively tested against existing and legacy instruments such as the SF-36, in different formats (i.e. short-form, profile, scale), under different study populations and conditions and across the score continuum of the concept; this is extremely helpful in understanding the responsiveness and the floor and ceiling effects of the instruments
3. flexibility: it can be administered through paper and pen, touch-screen, smart phone, personal digital assistant (PDA) and the web. It can also incorporate specific instruments developed or adopted by individual researchers, and it is linkable to the EHR and other databases
4. inclusiveness: items were written simply, at an elementary-school reading level, and cognitive interviews were conducted for all items. Every item was pre-tested and then field tested in individuals with low literacy. Items have been translated into Spanish, and 33 other countries have also requested translations. PROMIS II will focus on children's measures.
Figure: PROMIS integrates the fields of health information technology, qualitative research, psychometrics, survey research and clinical research.
Source: NIH PROMIS: Advancing PRO Science in Clinical
Research and Patient Care: January 11, 2011 presentation
(http://www.nihpromis.org/whatsnew/whatsnewhome)
PROMIS has been used in successful NIH grant
applications, producing over 150 journal articles,
including cancer-related publications. PROMIS researchers
have presented on the topic worldwide on hundreds
of occasions in the last four years. Selected journal
publications by the PROMIS network since 200737-186 are
included in the references.
4.2 Generic HRQoL measures in
primary care setting: results from
an evidence review
One of the common challenges in applying PROMs in
a primary health care or IC setting is how to choose a
generic PROM that measures HRQoL given the multitude
of existing measurement instruments. To compare the
methodological quality of these instruments is an arduous
task. To meet this challenge, the Canadian Institutes of Health Research supported a group of researchers in 2013 to conduct a review answering the question: “What are the most effective ways to measure patient health outcomes of primary health care integration through Patient Reported Outcome Measurement Survey instruments?”187
The project took six steps to complete:
1) Long-list of generic PROMs: A comprehensive long-list
of all generic PROM instruments.
2) Short-list of generic PROMs: To include truly generic,
quantitative measures designed for adult populations
with high recent citation counts.
3) Descriptive overview of short-listed PROMs: To include
official translations, respondent burden including
required literacy/reading level, cost for using the
instrument and dimension coverage.
4) Review of PROM instrument ‘performance’:
Psychometric (e.g. reliability, validity, responsiveness
and interpretability) and decision-making (e.g. norm
reference sets, utility/preference scoring algorithm and
evidence of clinically relevant thresholds)
5) Additional information: Examples of use in a primary
and community care context and PROM-related
activity in other jurisdictions.
6) Workshop and recommendations: To review the
evidence and identify the ‘preferred’ instrument (or
instruments) for use in British Columbia (BC) primary
and community care reform.
The researchers generated a short-list of PROMs which
included:
1) Assessment of Quality of Life (AQoL-8D)
2) EuroQol EQ-5D
3) HowsYourHealth
4) Health Utilities Index (HUI)
5) Nottingham Health Profile (NHP)
6) PROMIS/Global Health Status (GHS)
7) Short-Form 36 (SF-36) and SF-12
8) Quality of Wellbeing Scale (QWB-SA)
9) World Health Organisation Quality of Life Instrument (WHOQoL-BREF).
A formal review approach was adopted which included
study selection, data extraction and quality assessment.
Data synthesis involved a scoring approach proposed
by the COSMIN-initiative for systematic reviews of the
measurement properties of instruments. The researchers
considered aspects of each instrument’s reliability, validity
and responsiveness following strict guidelines on scoring
in each of these categories and included two additional
categories: generalisability and comparison with other
PROM instruments.
Twenty-one of the 22 articles provided information about the
psychometric properties of the candidate PROM instruments.
An overview of the results from the psychometric review is
provided (see Table 4.2-1). A general conclusion is that the
SF-36 performed particularly well across most psychometric
dimensions and PROMIS was also a strong instrument
although the evidence base was smaller.
Table 4.2-1 Overview of results from psychometric review

                         AQoL   EQ-5D   SF-36   HUI    NHP    QWB-SA   WHOQoL-BREF   PROMIS
Internal consistency      +      n/a     +++     ?     +/-     n/a        +/-          +++
Reliability               ?      +/-     +++    +/-    +/-      ?          ?            ?
Content validity          -       -      ++     +/-    +/-     +/-         ?           +/-
Construct validity        ?       ?      +++     ?      -       ?         +++          ++
Cross-cultural validity   ?       +      +/-     +     +/-      ?         +++           ?
Criterion validity        ?      +/-      ?      ?      ?       ?          ?            ?
Responsiveness            +      --      +++    +/-     -      +/-         ?           ++
It is worth noting that although most of the instruments can provide utility scores, the AQoL was developed in Australia by Australian researchers and its utility scores were derived from an Australian sample. Likewise, all the instruments provide population norms, but only the AQoL and SF-12/SF-36 supply Australian population norms.
The project culminated in a workshop, held at the British Columbia Ministry of Health and involving a wide range of stakeholders. The primary objectives of the workshop were:
• to share the details of the review and evidence synthesis work undertaken by the research team
• to reflect and deliberate on the PROM instruments and their potential use
• to come to a consensus on which PROM instrument(s) should be recommended for use in BC.

Two rounds of voting and discussion took place during the workshop, after the review findings were presented and participants were given an opportunity to visually review the items within each measure. The final votes indicated a strong preference for two instruments, PROMIS (41 votes) and the SF (36 votes), and a desire to keep the EQ-5D (18 votes) as a back-up.
More information on the cost, sources and instrument
details is presented in Appendix 3.
4.3 Measuring patient experience in
a primary health care setting – an
evidence review from Canada
Three important and related scoping reports on measurement in primary health care (PHC), hosted by the Canadian Primary Health Care Research and Innovation Network (CPHCRIN), were commissioned. These three separate but related technical reports were companions to the Canadian Institute for Health Information (CIHI) suite of PHC organisational, provider and patient experience surveys. The patient experience report188 updates and builds on work that examined patient experiences in PHC and suggested six dimensions and 15 sub-dimensions that were deemed important in measuring patient experience in PHC (Table 4.3-1). The report was based on an extensive literature review, qualitative research and consultation with stakeholders.
The review188 included domains/subdomains that were
not included in the current version of the NHS patient
experience framework. Domains such as trust and confidence in the PHC system provided different insights from the patient perspective. In order to operationalise the domains, the authors conducted an extensive review of many related measurement instruments and retained 17 instruments that were widely used in the PHC setting. Of these 17 instruments and surveys, ten were created in a country other than Canada, one was an international initiative (the Commonwealth Fund International Health Policy survey) and nine were administered only in Canada. Through the mapping of different instrument items onto the conceptual framework presented (Table 4.3-1), a final 87-item survey questionnaire was developed. Given the large number of domains/subdomains included in the framework, the number of questions included is fairly large, which reflects the trade-off between the completeness/richness of the content and the complexity/multiplicity of the questions.
Table 4.3-1 Dimensions of patients’ experiences in primary health care188

Access
• First contact accessibility: The ability to obtain patient- or client-initiated needed care (including advice and support) from the provider of choice within a time frame appropriate to the urgency of the problem.[40]
• Accommodation: The relationship between how resources are organised to accept patients or clients (including appointment systems, hours of operation, walk-in facilities, telephone services) and the patients’ or clients’ ability to accommodate to these factors to realise access.[40]
• Economic accessibility: The extent to which direct or indirect costs related to care impeded decisions to access needed care or continue recommended care.

Interpersonal communication
• General communication: The ability of the provider to elicit and understand patient or client concerns, and to explain health and health care issues.[40, 47]
• Respectfulness: The ability of the primary care organisation and practitioners to provide care that meets the expectations of users about how people should be treated, such as regard for dignity and provision of adequate privacy.[40, 47]
• Shared decision-making: The extent to which patients or clients are involved in making decisions about their treatment.[47]
• Whole-person care: The extent to which providers address the physical, emotional and social aspects of a patient’s or client’s health and consider the community context in their care.[40]

Continuity and coordination
• Relational continuity: A therapeutic relationship between a patient or client and one or more identified providers that spans separate health care episodes and delivers care that is consistent with the patient’s or client’s biopsychosocial needs.[40]
• Information continuity: The extent to which information is used to make current care appropriate to the patient or client.
• Coordination: The provision and organisation of a combination of health services and information with which to meet a patient’s or client’s health needs, including services available from other community health service providers.[9, 10]
• Team functioning: The ability of primary health care providers to work effectively as a collaborative team to manage and deliver quality patient or client care.

Comprehensiveness of services
• Services provided: The provision, either directly or indirectly, of a full range of services to meet patients’ or clients’ health care needs. This includes health promotion, prevention, diagnosis and treatment of common conditions, referral to other clinicians, management of chronic conditions, rehabilitation, palliative care and, in some models, social services.[40]
• Health promotion and primary prevention: Health promotion is the process of enabling people to increase control over, and to improve, their health.[12] Primary prevention is directed towards preventing the initial occurrence of a disorder.[13]

Trust
• Trust: An expectation that the other person will behave in a way that is beneficial and that allows for risks to be taken based on this expectation. For example, patient or client trust in the physician provides a basis for taking the risk of sharing personal information.[48]

Patient-reported impacts of care
• Patient activation: Patient’s or client’s ability or readiness to engage in health behaviours that will maintain or improve their health status.[49, 50]
• Patient safety: Patient’s or client’s reports of medication errors (given or taken the wrong drug or dose) or incorrect medical or laboratory reports, and communication with their provider about not taking their prescribed medication or medication side effects.
• Confidence in the PHC system: The perception that allows patients or clients of health care to make decisions since they assume (and expect) relative certainty about providers delivering safe and technically competent care.[51]
4.4 Measuring IC based on the AHRQ
‘Care Coordination Atlas’
The AHRQ Atlas report189 described the domains and mechanisms that were deemed important for ‘care coordination’. The authors also conducted extensive and structured reviews and produced an ‘Atlas’ that maps existing instruments and survey questionnaires onto the domains and activities defined in its framework (Table 8.8-3).
The extensive review found 61 instruments, many of which have multiple versions for different subgroups of targeted populations (Table 4.4-1). These measures were then mapped onto all activities and domains stipulated in the framework at three levels (patients, providers and health system representatives), resulting in three mapping tables (Table 4.4-2; Table 4.4-3; Table 4.4-4) that could be used to develop specific instruments according to an individual project's needs and aims.
Detailed guidelines on how to match these measure components to evaluation project objectives are also provided.189
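As an illustration of how the mapping tables below can be used in practice, the hypothetical sketch that follows looks up which candidate instruments cover a given coordination activity from the patient/family perspective. Only a small, illustrative subset of Table 4.4-2 and Table 4.4-1 is reproduced in the code; it is not AHRQ software and a real evaluation would work from the full tables.

# Minimal sketch: looking up candidate instruments for a coordination activity.
# Only an illustrative subset of the Table 4.4-2 mapping is reproduced here;
# keys are framework domains and values are measure numbers from Table 4.4-1.

patient_family_map = {
    "Establish Accountability or Negotiate Responsibility":
        ["3", "4a", "4b", "4c", "6", "9b", "13", "14"],
    "Create a Proactive Plan of Care":
        ["6", "9b", "10", "11a", "16c", "21", "24", "37", "38a", "40"],
}

measure_index = {  # abridged extract of Table 4.4-1
    "3": "Coleman Measures of Care Coordination",
    "9b": "Care Transitions Measure (CTM-15)",
    "10": "Patient Assessment of Care for Chronic Conditions (PACIC)",
}

def candidates(domain):
    ids = patient_family_map.get(domain, [])
    return [(i, measure_index.get(i, "see Table 4.4-1")) for i in ids]

for measure_id, title in candidates("Create a Proactive Plan of Care"):
    print(measure_id, "-", title)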
Table 4.4-1 Index of measures/instruments

1. Assessment of Chronic Illness Care (ACIC)
2. ACOVE-2 Quality Indicators: Continuity and Coordination of Care Coordination
3. Coleman Measures of Care Coordination
4. Consumer Assessment of Healthcare Providers and Systems (CAHPS)
   a. Adult Primary Care 1.0
   b. Adult Specialty Care 1.0
   c. Child Primary Care 1.0
5. Care Coordination Measurement Tool (CCMT)
6. Client Perception of Coordination Questionnaire (CPCQ)
7. Collaborative Practice Scale (CPS)
   a. Nurse Scale
   b. Physician Scale
8. Breast Cancer Patient and Practice Management Process Measures
9. Care Transitions Measure (CTM)
   a. CTM-3
   b. CTM-15
10. Patient Assessment of Care for Chronic Conditions (PACIC)
11. Family-Centered Care Self-Assessment Tool
   a. Family Version
   b. Provider Version
12. ICU Nurse-Physician Questionnaire
   a. Long Version
   b. Short Version
13. Primary Care Assessment Survey (PCAS)
14. National Survey of Children With Special Health Care Needs (CSHCN)
15. Head and Neck Cancer Integrated Care Indicators
16. Medical Home Index (MHI)
   a. Long Version (MHI-LV)
   b. Short Version (MHI-SV)
   c. Medical Home Family Index and Survey (MHFIS)
17. Primary Care Assessment Tool (PCAT)
   a. Child Expanded Edition (PCAT-CE)
   b. Adult Expanded Edition (PCAT-AE)
   c. Facility Expanded Edition (PCAT-FE)
   d. Provider Expanded Edition (PCAT-PE)
18. Physician-Pharmacist Collaboration Instrument (PPCI)
19. Readiness for the Patient-Centered Medical Home
20. Family Medicine Medication Use Processes Matrix (MUPM)
21. Resources and Support for Self-Management (RSSM)
22. Continuity of Care Practices Survey
   a. Program Level (CCPS-P)
   b. Individual Level (CCPS-I)
23. Program of All-Inclusive Care for the Elderly (PACE)
24. Measure of Processes of Care (MPOC-28)
25. Care Evaluation Scale for End-of-Life Care (CES)
26. Oncology Patients' Perceptions of the Quality of Nursing Care Scale (OPPQNCS)
27. Care Coordination Services in Pediatric Practices
28. Collaboration and Satisfaction About Care Decisions (CSACD)
29. Follow Up Care Delivery
30. Family Satisfaction in the Intensive Care Unit (FS-ICU 24)
31. Korean Primary Care Assessment Tool (KPCAT)
32. Primary Care Multimorbidity Hassles for Veterans With Chronic Illnesses
33. Primary Care Satisfaction Survey for Women (PCSSW)
34. Personal Health Records (PHR)
35. Picker Patient Experience (PPE-15)
36. Physician Office Quality of Care Monitor (QCM)
37. Patient Perceptions of Care (PPOC)
38. PREPARED Survey
   a. Patient Version
   b. Carer Version
   c. Residential Care Staff Version
   d. Community Service Provider Version
   e. Medical Practitioner Version
   f. Modified Medical Practitioner Version
39. Health Tracking Household Survey
40. Adapted Picker Institute Cancer Survey
41. Ambulatory Care Experiences Survey (ACES)
42. Patient Perception of Continuity Instrument (PC)
43. Jefferson Survey of Attitudes Toward Physician-Nurse Collaboration
44. Clinical Microsystem Assessment Tool (CMAT)
45. Components of Primary Care Index (CPCI)
46. Relational Coordination Survey
47. Fragmentation of Care Index (FCI)
48. After-Death Bereaved Family Member Interview
49. Schizophrenia Quality Indicators for Integrated Care
50. Degree of Clinical Integration Measures
51. National Survey for Children's Health (NSCH)
52. Mental Health Professional HIV/AIDS Point Prevalence and Treatment Experiences Survey Part II
53. Cardiac Rehabilitation Patient Referral from an Inpatient Setting
54. Cardiac Rehabilitation Patient Referral from an Outpatient Setting
55. Patients with a Transient Ischemic Event ER Visit That Had a Follow Up Office Visit
56. Biopsy Follow Up
57. Reconciled Medication List Received by Discharged Patients
58. Transition Record with Specified Elements Received by Discharged Patients (Inpatient Discharges)
59. Timely Transmission of Transition Record
60. Transition Record with Specified Elements Received by Discharged Patients (Emergency Department Discharges)
61. Melanoma Continuity of Care - Recall System
Table 4.4-2 Care coordination master measure mapping table, patient/family perspective†
(Framework domains with key sources, i.e. measure numbers from Table 4.4-1)

COORDINATION ACTIVITIES
• Establish Accountability or Negotiate Responsibility: 3, 4a, 4b, 4c, 6, 9b, 11a, 13, 14, 16c, 17a, 17b, 26, 32, 37, 40, 42, 45, 48
• Communicate: 3, 4a, 4b, 4c, 6, 9b, 10, 11a, 13, 14, 16c, 17a, 17b, 24, 25, 26, 29, 30, 31, 32, 33, 37, 38a, 45, 48, 51
• Interpersonal Communication: 3, 4a, 4b, 4c, 6, 10, 11a, 13, 14, 16c, 17a, 17b, 21, 33, 35, 36, 37, 38b, 39, 40, 41, 42, 45, 48, 51
• Information Transfer: 3, 4a, 4b, 4c, 6, 9b, 10, 11a, 13, 14, 16c, 17a, 17b, 21, 24, 26, 29, 30, 31, 32, 33, 35, 36, 37, 38a, 38b, 39, 40, 41, 42, 45, 48, 49, 51
• Facilitate Transitions‡ (across settings): 9a, 9b, 13, 14, 16c, 17a, 17b, 21, 26, 31, 32, 37, 38a, 38b, 40, 42, 51
• Facilitate Transitions‡ (as coordination needs change): 11a, 14, 24
• Assess Needs and Goals: 3, 4a, 4b, 4c, 6, 9a, 9b, 10, 11a, 13, 14, 16c, 17a, 17b, 21, 24, 25, 26, 30, 31, 32, 33, 35, 37, 38a, 38b, 40, 41, 42, 45
• Create a Proactive Plan of Care: 6, 9b, 10, 11a, 16c, 21, 24, 37, 38a, 40
• Monitor, Follow Up and Respond to Change: 3, 4a, 4b, 4c, 6, 9b, 10, 11a, 13, 16c, 17a, 17b, 21, 24, 25, 26, 29, 31, 32, 33, 36, 37, 39, 40, 41, 45
• Support Self-Management Goals: 4a, 4b, 4c, 6, 9a, 9b, 10, 11a, 13, 16c, 17a, 17b, 21, 24, 25, 26, 29, 31, 32, 33, 35, 36, 37, 38a, 38b, 40, 41
• Link to Community Resources: 10, 11a, 16c, 17b, 21, 24, 31, 33, 38a, 38b
• Align Resources with Patient and Population Needs: 6, 11a, 14, 16c, 17a, 17b, 31, 38a, 38b, 51

BROAD APPROACHES POTENTIALLY RELATED TO CARE COORDINATION
• Teamwork Focused on Coordination: 6, 11a, 16c, 24, 25, 29, 30, 35, 36, 39, 40
• Healthcare Home: 4a, 4b, 4c, 16c, 17a, 17b, 45, 51
• Care Management: 11a, 14, 21, 51
• Medication Management: 4a, 4b, 4c, 6, 9a, 9b, 10, 17a, 17b, 21, 32, 35, 36, 37, 38a, 38b, 42, 48
• Health IT-enabled Coordination: 4a

† A key to measure numbers can be found in Table 4.4-1.
‡ All measure items addressing transitions are mapped to one of the specific transition types (across settings or as coordination needs change).
Table 4.4-3 Care coordination master measure mapping table, healthcare professional(s) perspective†
(Framework domains with key sources, i.e. measure numbers from Table 4.4-1)

COORDINATION ACTIVITIES
• Establish Accountability or Negotiate Responsibility: 5, 7a, 7b, 11b, 18, 20, 22b, 38c, 38d, 38e, 43, 46
• Communicate: 5, 7a, 7b, 11b, 12a, 12b, 17d, 22b, 23, 38e, 38f, 43, 46
• Interpersonal Communication: 7a, 7b, 8, 11b, 12a, 12b, 17d, 18, 22b, 28, 43
• Information Transfer: 5, 8, 11b, 12a, 12b, 17d, 18, 20, 22b, 23, 27, 38c, 38d, 38e, 38f
• Facilitate Transitions‡ (across settings): 5, 17d, 22b, 27, 43, 38c, 38d, 38e, 38f
• Facilitate Transitions‡ (as coordination needs change): 11b, 22b
• Assess Needs and Goals: 5, 11b, 12a, 12b, 17d, 20, 23, 27, 38d, 38e, 38f, 43, 46
• Create a Proactive Plan of Care: 5, 7b, 8, 11b, 12a, 22b, 23, 27, 38e, 38f
• Monitor, Follow Up and Respond to Change: 5, 11b, 12a, 12b, 17d, 20, 22b, 23
• Support Self-Management Goals: 5, 8, 11b, 17d, 20, 22b, 38d, 38e, 38f
• Link to Community Resources: 5, 11b, 17d, 22b, 27, 38e
• Align Resources with Patient and Population Needs: 5, 8, 11b, 17d, 20, 38d, 38e

BROAD APPROACHES POTENTIALLY RELATED TO CARE COORDINATION
• Teamwork Focused on Coordination: 7a, 7b, 11b, 12a, 12b, 18, 23, 27, 28, 43, 46
• Healthcare Home: 17d
• Care Management: 5, 11b, 22b, 27
• Medication Management: 17d, 18, 20
• Health IT-enabled Coordination: 12a, 17d

† A key to measure numbers can be found in Table 4.4-1.
‡ All measure items addressing transitions are mapped to one of the specific transition types (across settings or as coordination needs change).
Table 4.4-4 Care coordination master measure mapping table, system representative(s) perspective†
(Framework domains with key sources, i.e. measure numbers from Table 4.4-1)

COORDINATION ACTIVITIES
• Establish Accountability or Negotiate Responsibility: 1, 2, 15, 16a, 16b, 57, 58, 59, 60
• Communicate: 1, 16a, 16b, 17c, 22a, 34
• Interpersonal Communication: 17c, 22a, 52
• Information Transfer: 1, 2, 15, 16a, 17c, 22a, 34, 44, 49, 50, 52, 53, 54, 56, 57, 58, 59, 60
• Facilitate Transitions‡ (across settings): 15, 16a, 17c, 22a, 49, 50, 55, 57, 58, 59, 60
• Facilitate Transitions‡ (as coordination needs change): 16a, 16b, 22a
• Assess Needs and Goals: 1, 16a, 16b, 17c, 44, 49
• Create a Proactive Plan of Care: 1, 16a, 16b, 22a, 49, 52, 55, 58, 59, 60
• Monitor, Follow Up and Respond to Change: 1, 2, 3, 17c, 19, 22a, 44, 49, 54, 58, 59, 60, 61
• Support Self-Management Goals: 1, 16a, 17c, 19, 22a, 34, 49
• Link to Community Resources: 1, 16a, 17c, 22a, 44, 52
• Align Resources with Patient and Population Needs: 1, 2, 16a, 16b, 17c, 19, 49, 52

BROAD APPROACHES POTENTIALLY RELATED TO CARE COORDINATION
• Teamwork Focused on Coordination: 1, 44, 52
• Healthcare Home: 2, 3, 16a, 16b, 17c, 19, 47
• Care Management: 15, 16a, 16b, 22a, 49
• Medication Management: 2, 3, 17c, 57, 58, 60
• Health IT-enabled Coordination: 1, 16a, 17c, 19, 34, 44, 50

† A key to measure numbers can be found in Table 4.4-1.
‡ All measure items addressing transitions are mapped to one of the specific transition types (across settings or as coordination needs change).
Please note that descriptions of care coordination activities and interventions in the literature were often ambiguous, with no consensus on definitions of the terms. The broad approaches to care coordination frequently described only general processes or roles, without specifying who performs what action under what circumstances. These approaches also often had a wide scope, with goals of improving aspects of patient care beyond care coordination alone. The AHRQ working definitions for each framework domain in care coordination were developed by drawing on a variety of sources. After the publication of the Atlas, the authors further updated the instrument measures (June 2014) in another online appendix10, which added new instruments and updated those already included in Table 4.4-1. The newly included instruments are listed in Table 4.4-5, and the full details and specific questions asked in each instrument can be found in that document10 (Appendix 5). This update also included the Canadian Survey of Experiences with Primary Health Care Questionnaire that is discussed in the current report.
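In practice, each master mapping table functions as a simple lookup from a framework domain, within a given perspective, to the measure numbers keyed in Table 4.4-1. The sketch below illustrates this idea in Python; the lookup structure and helper function are hypothetical, and the measure numbers are copied from the 'Link to Community Resources' rows of Tables 4.4-2 to 4.4-4 purely for illustration.

```python
# Hypothetical lookup built from the care coordination master measure mapping tables.
# Keys: (perspective, framework domain) -> measure numbers keyed in Table 4.4-1.
MEASURE_MAP = {
    ("patient/family", "Link to Community Resources"):
        ["10", "11a", "16c", "17b", "21", "24", "31", "33", "38a", "38b"],
    ("healthcare professional", "Link to Community Resources"):
        ["5", "11b", "17d", "22b", "27", "38e"],
    ("system representative", "Link to Community Resources"):
        ["1", "16a", "17c", "22a", "44", "52"],
}


def candidate_measures(perspective, domain):
    """Return the Table 4.4-1 measure numbers mapped to a domain for one perspective."""
    return MEASURE_MAP.get((perspective, domain), [])


if __name__ == "__main__":
    for persp in ("patient/family", "healthcare professional", "system representative"):
        print(persp, "->", candidate_measures(persp, "Link to Community Resources"))
```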
Table 4.4-5 The June 2014 update of the AHRQ review of instruments for care coordination measurement
(Note: page numbers refer to pages in the original document)10

4d: Consumer Assessment of Healthcare Providers and Systems (CAHPS) Patient Centered Medical Home (PCMH) Supplementary Survey, Adult Version 2.0 (p. 3)
4e: Consumer Assessment of Healthcare Providers and Systems (CAHPS) Patient Centered Medical Home (PCMH) Supplementary Survey, Child Version 1.1 (p. 10)
41b: Primary Care Provider Ambulatory Care Experiences Survey (PCP ACES) (p. 14)
62: Team Survey for Program of All-Inclusive Care for the Elderly (PACE) (p. 19)
63: Medication Reconciliation for Ambulatory Care (p. 20)
64: Promoting Healthy Development Survey PLUS (PHDS-PLUS) (p. 21)
65: Canadian Survey of Experiences with Primary Health Care Questionnaire (p. 36)
66: Interpersonal Processes of Care Survey (IPC-II) (p. 80)
67: Brief 5 A's Patient Survey (p. 84)
68: Patient Perceived Continuity of Care from Multiple Providers (p. 86)
69: Relational and Management Continuity Survey in Patients with Multiple Long-Term Conditions (p. 91)
70: Patient Perceptions of Integrated Care Survey (PPIC) (p. 92)
71: Safety Net Medical Home Scale (SNMHS) (p. 99)
72: Parents' Perceptions of Primary Care (P3C) (p. 100)
73: Primary Care Questionnaire for Complex Pediatric Patients (p. 104)
74: Safety Net Medical Home Provider Experience Survey (p. 108)
75: Rhode Island Physician Health Information Technology Survey (p. 109)
76: Primary Care Medical Home Option Self-Assessment Tool (p. 116)
77: Communication with Referring Physicians Practice Improvement Module (CRP-PIM) (p. 130)
78: Safe Transitions Best Practice Measures for Community Physician Offices (p. 135)
79: National Survey of Physicians Organisations and the Management of Chronic Illness II (NSPO-2) (p. 158)
80: Patient-Centered Medical Home Assessment (PCMH-A) Tool (p. 202)
4.5 Measuring patient experience in
IC in the UK
Improved care coordination and integration of services, within the health care sector as well as across health, social care and other public services, is a priority for the UK government. The expectation is that IC will lead to more person-centered, coordinated care, improve outcomes for individuals, deliver more effective care and support, and provide better value from public expenditure.36 Many initiatives are underway or planned to bring about better integration of health, social care and other services so that people's needs are met more comprehensively and seamlessly. The aim of two recent initiatives in England, the Integrated Care and Support 'Pioneers' and the Better Care Fund (BCF), is to enable more effective partnership working across the NHS and local government sectors, including the commissioning and provision of public health, health and social care services, together with other Local Authority responsibilities.
The UK Department of Health (DH) recently commissioned two short-term projects: the first ran from November 2013 to February 2014 and aimed to provide advice on indicators of IC for individual and collective progress monitoring using routine data; the second was an early evaluative study of the first 15 months of the pioneers in the context of the BCF, with a report due in mid-2015. The first project, on the identification of indicators for measuring IC, was built in part on two earlier reports commissioned by the DH: the Picker Institute et al. (2013) report on options for measuring patient/user experience of IC190, and the Picker Institute/Oxford University (2013) report191 identifying potential survey questions for measuring patient/user experience of IC.
The authors of the first project produced a report on the measurement of IC, guidelines on using the indicators, and guidance on how to use routine quantitative data to measure trends in IC. Part of the report (section F, Appendix B)36 also provides a summary of the existing questions in current national surveys that have particular relevance for IC; these are outlined in Table 4.5-1. The 18 questions developed in a previous review191, combined with questions from the existing survey instruments, provided a less than ideal but nevertheless useful short-term solution for the pioneers testing different IC models in the UK (Table 4.5-2).
One important feature of the UK effort so far is that the PREMs developed for IC capture only the patient perspective (framed as 'I' statements), in contrast to the AHRQ framework, which also includes the perspectives of providers and health system representatives. It should be noted that the three-level perspective adopted by the AHRQ is consistent with the three recent Canadian reports of measurement scales and items (on patient, provider and health system experiences) in primary care settings.
Table 4.5-1 Existing user/carer experience measures in large national surveys36

(29) Proportion of people dying at home/place of their choosing. Data source: National End of Life Care Intelligence Network, End of Life Care Profiles.

(30) Improving people's experience of integrated care. Data source: NHS Outcomes Framework indicator 4.9; Adult Social Care Outcomes Framework indicator 3E. Notes: new ASCOF indicator from 2014/15.

(31) Safety: the proportion of people who use services who say that those services have made them feel safe and secure. Data source: Adult Social Care Outcomes Framework indicator 4B.

(32) GP Patient Survey questions. Data source: GP Patient Survey.
Q32. (For people with LTCs) In the last six months have you had enough support from local services or organisations to help you to manage your long-term health condition(s)? Please think about all services and organisations, not just health services.
Q33. How confident are you that you can manage your own health?
Q40. Do you know how to contact an out-of-hours GP service when the surgery is closed?

(33) Inpatient survey questions. Data source: Survey of adult inpatients.
Q60. Did hospital staff take your family or home situation into account when planning your discharge?
Q63. Did hospital staff discuss with you whether you would need any additional equipment in your home, or any adaptations made to your home, after leaving hospital?
Q64. Did hospital staff discuss with you whether you may need any further health or social care services after leaving hospital (e.g. services from a GP, physiotherapist or community nurse, or assistance from social services or the voluntary sector)?
Q65. Did you receive copies of letters sent between hospital doctors and your family doctor (GP)?

(34) A&E survey questions. Data source: Accident and emergency survey.
Q38. Did hospital staff take your family or home situation into account when you were leaving the A&E Department?
Q41. As far as you know, was your GP given all the necessary information about the treatment or advice that you received in the A&E Department?

(35) VOICES national bereavement survey questions. Data source: National bereavement survey (VOICES), Office for National Statistics.
Q3. When he/she was at home in the last three months of life, did he/she get any help at home from any of the services listed?
Q4. When he/she was at home in the last three months of life, did all these services work well together?
Q5. Overall, do you feel that you and your family got as much help and support from health and social services as you needed when caring for him/her?
Q27. Did the hospital services work well together with his/her GP and other services outside of the hospital?
Q44. Do you think he/she had enough choice about where he/she died?
Q46. Were you or his/her family given enough help and support by the healthcare team at the actual time of his/her death?
Q52. Since he/she died, have you talked to anyone from health and social services, or from a bereavement service, about your feelings about his/her illness and death?

(Adapted from Raleigh et al., 2014)36
Table 4.5-2 The 18 supplementary questions developed for measuring IC experience in the UK setting
For each question, the patient version, carer version, corresponding 'I' statement and testing notes are listed below.

Question: Q3.1 Have all your needs been assessed?
1 All of my needs have been assessed; 2 Some of my needs have been assessed; 3 None of my needs have been assessed; 4 Don't know/can't remember
Carer version: Q3.1 Have all your needs been assessed? (same response options)
'I' statement: All of my needs as a person are assessed.
Notes: Q3.1 is being taken forward for further testing for potential inclusion in the Inpatient Survey and ASCS.

Question: Q3.2a Were you involved as much as you wanted to be in decisions about your care and support?
1 Yes, definitely; 2 Yes, to some extent; 3 No
Carer version: Q3.2a Were you involved as much as you wanted to be in decisions about your care and support? (same response options)
'I' statement: I am involved in discussions and decisions about my care, support and treatment as I want to be.
Notes: Q3.2a is being taken forward for further testing for potential inclusion in the Inpatient Survey and ASCS, but not the Carers Survey (instead Q3.3a (carers version) will be tested). The researchers did not recommend using this question in a survey of carers.

Question: Q3.2b Were you involved as much as you wanted to be in decisions about your treatment?
1 Yes, definitely; 2 Yes, to some extent; 3 No
'I' statement: I am involved in discussions and decisions about my care, support and treatment as I want to be.
Notes: This question relates to 'treatment', so could be seen as quite health-centric. ICQIDG therefore recommended that survey owners use Q3.2a.

Question: Q3.3a Were your family or carer involved in decisions about your care and support as much as you wanted them to be?
1 Yes, definitely; 2 Yes, to some extent; 3 No; 4 There were no family or carers available to be involved; 5 I didn't want my family or carer to be involved in decisions about my care and support
Carer version: Q3.3a Were you involved as much as you wanted to be in decisions about the care and support of the person you care for?
1 Yes, definitely; 2 Yes, to some extent; 3 No; 4 I didn't want to be involved in decisions about care
'I' statement: My family or carer is also involved in these decisions as much as I want them to be.
Notes: Q3.3a (carers version) has been put forward for further testing for potential inclusion in the Carers Survey.

Question: Q3.3b Were your family or carer involved in decisions about your treatment as much as you wanted them to be?
1 Yes, definitely; 2 Yes, to some extent; 3 No; 4 There were no family or carers available to be involved; 5 I didn't want my family or carer to be involved in decisions about my treatment
Carer version: Q3.3b Were you involved as much as you wanted to be in decisions about treatment of the person you care for?
1 Yes, definitely; 2 Yes, to some extent; 3 No; 4 I didn't want to be involved in decisions about care
'I' statement: My family or carer is also involved in these decisions as much as I want them to be.

Question: Q3.4 Overall, do you feel that your carer/family has had as much support from health and social services as they needed?
1 Yes, they have had as much support as they needed; 2 They have had some support but not as much as they needed; 3 No, they have had little or no support; 4 They did not want/need support; 5 There are no family members or carers to support
Carer version: Q3.4 Overall, do you feel that you have had as much support from health and social services as you needed?
1 Yes, I have had as much support as I needed; 2 Yes, I have had some support but not as much as I needed; 3 No, I have had little or no support; 4 I did not want/need support
'I' statement: My carer/family have their needs recognised and are given support to care for me.
Notes: Q3.4 is being taken forward for further testing for potential inclusion in the ASCS and the Inpatient Survey.

Question: Q3.5 To what extent do you agree or disagree with the following statement: 'Health and social care staff always tell me what will happen next'
1 Strongly agree; 2 Agree; 3 Neither agree nor disagree; 4 Disagree; 5 Strongly disagree
Carer version: Q3.5 (same wording and response options)
'I' statement: I know in advance where I am going, what I will be provided with, and who will be my main point of professional contact.
Notes: This question has been put forward for further testing for potential inclusion in the ASCS, Carers Survey and CMHS. Responding to concerns raised over the notion of health and social care staff telling people what will happen next, ICQIDG has recommended that a variant should be tested as well as, or instead of, Q3.5: 'Q3.5a To what extent do you agree or disagree with the following statement: "Health and social care staff always ensure I know what will happen next"' (same response options). The variant of Q3.5 is being tested for the ASCS, Carers Survey, CMHS and Inpatient Survey.

Question: Q3.6 When health or social care staff plan care or treatment for you, does it happen?
1 Yes, it happens all of the time; 2 It happens most of the time; 3 It happens some of the time; 4 No
Carer version: Q3.6 Thinking about the person you care for, when health or social care staff plan care or treatment for them does it happen? (same response options)
'I' statement: When something is planned, it happens.
Notes: This question has been put forward for further testing for potential inclusion in the Carers Survey.

Question: Q3.7a To what extent do you agree or disagree with the following statement: 'My care and support is reviewed as often as it should be'
1 Strongly agree; 2 Agree; 3 Neither agree nor disagree; 4 Disagree; 5 Strongly disagree
Carer version: Q3.7a Thinking about the person you care for, to what extent do you agree or disagree with the following statement: 'Their care and support is reviewed as often as it should be' (same response options)
'I' statement: I have regular reviews of my care and treatment, and of my care and support plan.

Question: Q3.7b To what extent do you agree or disagree with the following statement: 'My treatment is reviewed as often as it should be'
1 Strongly agree; 2 Agree; 3 Neither agree nor disagree; 4 Disagree; 5 Strongly disagree
Carer version: Q3.7b Thinking about the person you care for, to what extent do you agree or disagree with the following statement: 'Their treatment is reviewed as often as it should be' (same response options)
'I' statement: I have regular reviews of my care and treatment, and of my care and support plan.

Question: Q3.8 To what extent do you agree or disagree with the following statement: 'My medicines are thoroughly reviewed as often as they should be'
1 Strongly agree; 2 Agree; 3 Neither agree nor disagree; 4 Disagree; 5 Strongly disagree
Carer version: Q3.8 Thinking about the person you care for, to what extent do you agree or disagree with the following statement: 'Their medicines are thoroughly reviewed as often as they should be' (same response options)
'I' statement: I have regular, comprehensive reviews of my medicines.
Notes: The carers version of this question has been put forward for further testing for potential inclusion in the Carers Survey.

Question: Q3.9 Do you have a named health or social care professional who coordinates your care and support?
1 Yes; 2 No, I coordinate my own care and support; 3 Don't know/not sure
Carer version: Q3.9 Do you have a named health or social care professional who coordinates your care and support? (same response options)
'I' statement: I always know who is coordinating my care.
Notes: Q3.9 is being taken forward for the Inpatient Survey and for further testing for potential inclusion in the ASCS, where the following additional answers will be included as Q3.9a: 'No, I need and/or would like someone to coordinate my care and support'; 'No, I don't have multiple needs so my care and support does not need coordinating'; 'No, for other reasons'.

Question: Q3.10 Do you know who to contact if you need to ask questions about your condition or treatment?
1 Yes, definitely; 2 Yes, to some extent; 3 No; 4 Don't know/can't remember
Carer version: Q3.10 Do you know who to contact if you need to ask questions about the condition or treatment of the person you care for? (same response options)
'I' statement: I always know who is coordinating my care.

Question: Q3.12 If you have questions, when can you contact the people treating and caring for you? Please tick ALL that apply
1 During normal working hours; 2 During the evening; 3 During the night; 4 Weekends; 5 Don't know/not sure
Carer version: Q3.12 If you have questions, when can you contact the people treating and caring for the person you care for? Please tick ALL that apply (same response options)
'I' statement: I have one first point of contact. They understand both me and my condition(s). I can go to them with questions at any time.

Question: Q3.13 Do you feel this person understands about you and your condition?
1 Yes, definitely; 2 Yes, to some extent; 3 No
Carer version: Q3.13 Do you feel this person understands about the person you care for and their condition? (same response options)
'I' statement: I have one first point of contact. They understand both me and my condition(s). I can go to them with questions at any time.
Notes: The carers version of this question has been put forward for further testing for potential inclusion in the Carers Survey.

Question: Q3.14 Do all the different people treating and caring for you work well together to give you the best possible care and support?
1 Yes, all of them work well together; 2 Most of them work well together; 3 Some of them work well together; 4 No, they do not work well together; 5 Don't know/not sure
Carer version: Q3.14 Thinking about the person you care for, do all the different people treating and caring for them work well together to give the best possible care and support? (same response options)
'I' statement: The professionals involved with my care talk to each other. We all work as a team.
Notes: Q3.14 (carers version) has been put forward for further testing for potential inclusion in the CMHS, Inpatient Survey, ASCS and Carers Survey.

Question: Q3.15 Do health and social care services help you live the life you want as far as possible?
1 Yes, definitely; 2 Yes, to some extent; 3 No
Carer version: Q3.15 Do health and social care services help you live the life you want as far as possible? (same response options)
'I' statement: Taken together, my care and support help me live the life I want to the best of my ability.
Notes: Q3.15 is being taken forward for further testing for potential inclusion in the ASCS and Inpatient Survey.

Question: Q3.17 To what extent do you agree or disagree with the following statement: 'In the last 12 months, health and social care staff have given me information about other services that are available to someone in my circumstances, including support organisations'
1 Strongly agree; 2 Agree; 3 Neither agree nor disagree; 4 Disagree; 5 Strongly disagree
Carer version: Q3.17 (same wording and response options)
'I' statement: I am told about the other services that are available to someone in my circumstances, including support organisations.
Notes: This question has been put forward for further testing for potential inclusion in the Carers Survey.
(Adapted from King et al., 2014)191
(* Cognitive testing of Q3.5, Q3.5a and Q3.14 for use in the CMHS found that the cohort did not fully understand that they were
being asked about their experience of a range of health and social care services, including services for both mental and physical
health. The questions have therefore not been included in the 2013/14 survey; further work is planned for the 2014/15 survey in
order to ensure that if included the questions are framed so as to collect the intended information.)
4.6 Measuring ‘continuity of care’
based on patient experience
– a systematic review of the
instruments
Like the term 'care coordination', the definitions of 'IC' are polymorphous, and different definitions may lead to different choices of measurement instruments. The concept of 'continuity of care' also seems entangled with 'IC' and 'coordinated care'.192-203 Uijen and colleagues conducted a systematic review197 on the measurement
conducted a systematic review197 on the measurement
properties of instruments in measuring ‘continuity of
care’ based on the COSMIN checklist204-208. The authors
searched from 1995 to October 2011 and included articles
describing the development and/ or evaluation of the
measurement properties of instruments measuring one
or more dimensions of continuity of care as defined
by the authors: (1) care from the same provider who
knows and follows the patient (personal continuity),
(2) communication and cooperation between care
providers in one care setting (team continuity), and (3)
communication and cooperation between care providers
in different care settings (cross-boundary continuity). The
authors included 24 articles describing the development
and/or evaluation of 21 instruments. Ten instruments
measured all three dimensions of continuity of care.
Instruments were developed for different groups of
patients or providers. For most instruments, three or four
of the six measurement properties were assessed (mostly
internal consistency, content validity, structural validity
and construct validity). Six instruments scored positively
on the quality of at least three of six measurement
properties. The authors concluded that most instruments
had suffered from poor methodological quality and did
not include all three dimensions of continuity of care
(Table 4.6-1). Based on the review results, the authors recommended the use of one of the four most promising instruments, depending on the target population: the Diabetes Continuity of Care Questionnaire209 (type II diabetes), the Alberta Continuity of Services Scale-Mental Health210 211 (patients with mental health problems), the Heart Continuity of Care Questionnaire212 (developed among patients with either congestive heart failure or atrial fibrillation) and the Nijmegen Continuity Questionnaire (PHC setting: patients with one or more chronic diseases197 202).
Table 4.6-1 Quality of measurement properties and interpretability per instrument (adapted from Uijen et al. 2012197)

Instrument | Internal consistency | Reliability | Measurement error | Content validity | Structural validity | Hypotheses testing | Differences in scores between subgroups | Floor/ceiling effects of subdomain(s) | Minimal important change (MIC) | Properties assessed, of which positive
CPCI | -- | na | na | +++ | ++ | + | Not reported | Unknown | Unknown | 4, 3 positive
VCC | -- | na | na | +++ | ++ | na | Not reported | Unknown | Unknown | 3, 2 positive
CCI | +++ | --- | na | + | +++ | +/- | Not reported | Floor and ceiling effect | Unknown | 5, 3 positive
CONNECT | ? | -- | na | ++ | na | ? | Not reported | Floor effect | Unknown | 4, 1 positive
CPCQ | --- | na | na | + | + | + | Not reported | Unknown | Unknown | 4, 3 positive
ACSS-MH | +/- | - | na | +++ | --- | + | Reported | Unknown | Unknown | 5, 2 positive
CCPS-I | ? | na | na | ? | na | na | Not reported | Unknown | Unknown | 2, 0 positive
CCPS-P | ? | na | na | + | na | ? | Not reported | Unknown | Unknown | 3, 1 positive
DCCS | ? | + | na | + | ? | - | Reported | Ceiling effect | Unknown | 5, 2 positive
HCCQ | +++ | na | na | + | -- | +++ | Not reported | Unknown | Unknown | 4, 3 positive
ECC-DM | --- | na | ? | na | ++ | ? | Reported | Unknown | Unknown | 4, 1 positive
King et al. (nameless) | ? | + | na | +++ | na | na | Not reported | Unknown | Unknown | 3, 2 positive
CONTINU-UM | na | + | ? | ? | na | na | Not reported | Unknown | Unknown | 3, 1 positive
DCCQ | + | na | na | + | ? | - | Not reported | No floor/ceiling effect | Unknown | 4, 2 positive
PCCQ | ? | na | na | ? | ? | ++ | Reported | Unknown | Unknown | 4, 1 positive
Ahgren et al. (nameless) | ? | na | na | + | na | na | Not reported | Unknown | Unknown | 2, 1 positive
CRP-PIM | na | ? | na | na | - | na | Not reported | Ceiling effect | Unknown | 2, 0 positive
CSI | ? | na | na | +++ | ? | na | Not reported | No floor/ceiling effect | Unknown | 3, 1 positive
Gulliford et al. (nameless) | + | na | na | ? | + | na | Reported | Unknown | Unknown | 3, 2 positive
CCCQ | +++ | --- | na | +++ | +++ | ? | Not reported | Ceiling effect | Unknown | 5, 3 positive
NCQ | +++ | +++ | ? | + | ? | +++ | Reported | No floor/ceiling effect | Unknown | 6, 4 positive

The first six rating columns are measurement properties; differences in scores between subgroups, floor/ceiling effects and MIC relate to interpretability; the final column tallies the number of measurement properties assessed and, of those, the number rated positive.
+++ or --- = strong evidence of a positive/negative result; ++ or -- = moderate evidence of a positive/negative result; + or - = limited evidence of a positive/negative result; +/- = conflicting evidence; ? = unknown, due to poor methodological quality; na = no information available.
Cross-cultural validity, criterion validity and responsiveness were not evaluated.
It should be noted that the review only included those measurement instruments specifically designed for ‘continuity of
care’ but not those instruments having multiple domains.
4.7 The key points in measuring
PREMs
Given the vast range of frameworks and methods used in measuring patient experience, it is not possible to say that one approach or tool is the most effective for measuring the patient experience. However, a recent scoping review suggested 10 things to consider when planning how to measure changes in patient and carer experience over time; these are paraphrased below213:
1) Consider how patient experience is being defined to
inform exactly what needs to be measured.
2) Think about why patient experience is being
measured and how the information will be used.
3) Assess whether it would be useful to combine
approaches so that both qualitative and more
quantitative material is collected.
4) Consider whether to ask everyone using the service or
only a sample to provide feedback.
5) Think about whether the best time to collect
feedback is immediately after using the service when
experiences are fresh in people’s minds.
6) Allocate enough time at the outset to plan and test
measurement methods, particularly if these will be
used in the future to monitor change over time.
7) Think about how the end-result needs to be presented
to various audiences as this may shape how the data
are collected. Potential outputs include statistical
averages, in-depth quotes or graphs.
8) Make sure that there is appropriate infrastructure at
an organisational level to analyze and use patient
experience information.
9) Make sure that patients, carers, managers and health
professionals are all comfortable with why feedback is
being collected and how it will be used. Staff need to
be on board as well as patients.
10) Ensure that patient experience measures are seen
as one component of a broader framework of
measurement and that all approaches work well
together without an excessive burden for either staff
or patients.
5 | Using PROMs and PREMs
to improve health care
5.1 A conceptual framework for understanding the effect of PROM/PREM
Greenhalgh et al (2005)214 proposed a framework (Figure 5.1-1) that depicts the mechanisms linking the routine collection of PROs to changes in patient outcomes. The authors posit that the multilayer mediators (i.e. changes to doctor-patient communication, monitoring treatment responses, detecting unrecognised problems, changes to patient health behaviour, changes to clinicians' management plans, and improved patient satisfaction) have complex relationships among them. Studies that unravel these complex relationships may help us understand whether and how routinely collected PROs work to improve the intended outcomes.
Figure 5.1-1 A hypothetical framework to understand the impact of routinely collected PROs on patient health outcomes. The framework starts with (A) the provision of information from health status measures to clinicians: alone or supplemented with management guidelines; with or without training in interpretation of scores; fed back once or several times; as graphical displays of scores or written summaries, with or without population norms and with or without previous scores. This can prompt (B) clinicians talking to patients about their feelings/health status and developing a shared view of treatment goals, health status and the reason for the visit (evidence: positive). Intermediate pathways include monitoring treatment response and detecting unrecognised problems (evidence: mixed); changes to patient health behaviour (visits to the clinician, adherence to treatment), changes to clinicians' management of the patient (changes to or initiation of treatment, referrals to other agencies, tests to investigate problems further, advice on problem management) and improved patient satisfaction (evidence: negative); leading ultimately to (H) improved health outcomes (evidence: negative).
Recently, Abernethy and colleagues215 argued that the routine collection of PROMs has the capacity to have an impact not only at the patient level but, by addressing the logistics of data linkage, can also grow to accommodate other clinical- and health system-level issues (e.g. evaluating the comparative effectiveness of treatments, monitoring quality of care, and translating basic science findings into clinical practice; Figure A). The integration of data systems will fuel rapid learning cancer care at the national and societal levels (see Figures A and B), making many types of research and system learning possible across institutions and health sectors. The benefits and implications of such rapid learning health care systems may include, but are not limited to, strong and effective quality improvement (QI), increased transparency, accountability and public reporting, better health system performance (monitoring, planning, financing, evaluating, responding) and better quality of care. The model starts at the patient level, with an ePRO dataset; new datasets can be sequentially added by warehousing or federated models. The key element is patient-level linkage.
Figure A: A data linkage framework. Starting from the patient-level ePRO dataset at the clinic level, clinical and administrative data, clinical trials and research data, and molecular and biological data are linked and aggregated up through the health system, national and societal levels.
Figure B: A learning health care framework.
(Note: Figures A and B adapted from Abernethy et al., 2010)
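The essential mechanic of Figure A, linking records from different datasets on a shared patient identifier so that ePRO responses can sit alongside clinical and administrative data, can be sketched as follows. The dataset names and fields are hypothetical; a production system would use a data warehouse or federated queries rather than an in-memory join.

```python
import pandas as pd

# Hypothetical extracts; in practice these would come from an ePRO collection system
# and an EHR or administrative data warehouse.
epro = pd.DataFrame({
    "patient_id": [101, 102, 103],
    "eq5d_index": [0.62, 0.71, 0.55],
    "assessment_date": ["2014-03-01", "2014-03-05", "2014-03-07"],
})
admin = pd.DataFrame({
    "patient_id": [101, 102, 104],
    "admissions_last_year": [2, 0, 1],
    "primary_diagnosis": ["heart failure", "diabetes", "COPD"],
})

# Patient-level linkage: join the two sources on the shared patient identifier.
linked = epro.merge(admin, on="patient_id", how="inner")
print(linked)
```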
Combining both frameworks, we developed a list of
outcome indicators from which the effectiveness of
PROM/PREM should be assessed (Table 5.1-1).
(Note: We used the term 'patient-provider communication' instead of 'doctor-patient communication' as proposed by Greenhalgh et al. (2005)214 in the current study.)

Table 5.1-1 Possible outcome indicators for assessing the impact of the collection of PROM/PREM
1. Patient-provider communication
2. Monitor treatment response
3. Detect unrecognised problems
4. Changes to patient health behaviour
5. Changes to patient management
6. Improved patient satisfaction
7. Improved health outcomes
8. Strong and effective quality improvement
9. Increased transparency, accountability and public reporting
10. Better system performance (monitoring, planning, financing, evaluating and responding)
(Source: Chen J. 2012216)
5.2 IC and PROMs/PREMs
5.2.1 How is patient-centered care (PCC)
defined?
The IOM defines PCC as: “Providing care that is respectful
of and responsive to individual patient preferences, needs
and values, and ensuring that patient values guide all
clinical decisions.”217 PCC supports active involvement
of patients and their families in the design of new
care models and in decision-making about individual
options for treatment. In broad terms, PCC is a model in
which providers partner with patients and their families
to identify and satisfy the full range of patient needs
and preferences, while simultaneously supporting the
professional and personal aspirations of their staff. It is
broadly described as an approach that puts the patient
at the center of the care process and that it is sensitive,
empathic and responsive to patients’ individual needs,
preferences and values. PCC is expected to contribute to a
beneficial outcome for patients (e.g. increased satisfaction
with care, improved adherence to treatment and reduced
symptom severity), for healthcare providers (e.g. increased
job satisfaction, reduction of malpractice complaints)
and for the healthcare system (e.g. appropriate use of
healthcare resources, decreased costs). PCC is also one of
the over-reaching goals of health advocacy, in addition to
safer medical systems and greater patient involvement in
healthcare delivery and design. Given that non-consumer
stakeholders often don’t know what matters most to
patients regarding their ability to get and stay well, care
that is truly patient-centered cannot be achieved without
active patient engagement at every level of care design
and implementation. Despite its popularity, the term and content of 'PCC' are not universally agreed and may mean different things to different people. For many, the fundamentals of what it means to be patient-centered remain unclear, and many organisations struggle with how to translate the concept into the day-to-day business of caring for patients and families.
5.2.2 Can the concept of PCC be measured?
PCC is widely accepted as an essential component of high-quality health care.217 218 Although highly recommended for providing health care, PCC is a vague and poorly conceptualised component of high-quality care.219 220 It could be viewed as a general approach for organising healthcare services, as illustrated by the patient-centered medical home care model221; a perspective for guiding the planning, delivery and coordination of the patient's overall care; a strategy for designing and implementing particular care-related events such as discharge from hospital and specific treatments such as self-management education222; and a style for communicating with patients.223 Differences in interpretation yield inconsistent operationalisation and application of PCC, which precludes its implementation with fidelity and the valid evaluation of its effectiveness in producing the desired outcomes within and across diverse healthcare contexts. Any effort to measure such a concept should first clarify the essential elements that are responsible for inducing its expected outcomes, so that the PCC elements can be operationalised in the development of an instrument to measure PCC.224

5.2.3 What is IC?

IC is a concept that has been widely but variously used in different health systems. A key challenge remains the lack of a common definition for a plethora of terminologies such as 'IC', 'coordinated care', 'collaborative care', 'managed care', 'disease management', 'case management', 'health/social care service user-centred care', 'chronic care', 'continuity of care', 'seamless care' and others. There is a lack of clear delineation of the conceptual boundaries between these terms225 and an absence of a sound analytical framework through which to examine the processes of integration.226

In the USA, the term 'care coordination' is used more widely. A systematic review commissioned by the AHRQ found that there were more than 40 different definitions of 'care coordination'.227 The authors of the review combined the common elements from many definitions to develop one working definition for use in identifying reviews of interventions in the vicinity of care coordination and, as a result, developed a purposely broad definition:

"Care coordination is the deliberate organisation of patient care activities between two or more participants (including the patient) involved in a patient's care to facilitate the appropriate delivery of health care services. Organising care involves the marshalling of personnel and other resources needed to carry out all required patient care activities and is often managed by the exchange of information among participants responsible for different aspects of care."

For some purposes, they noted that other definitions may be more appropriate. This is not surprising given the multiple stakeholders involved in care coordination, as depicted in Figure 5.2-1.

Figure 5.2-1 The care coordination ring. The central goal, meeting patient needs and preferences in the delivery of high-quality, high-value care, is surrounded by the elements to be coordinated: home care, test results, inpatient care, informal caregivers, specialty care (1) and (2), patient/family education and support, medications/pharmacy, primary care, community resources, medical history, mental health services and long-term care. The ring can be viewed from the patient/family perspective, the health care professional(s) perspective and the system representative(s) perspective. (Source: AHRQ, Care coordination atlas report)189
Integration in health care is not likely to follow a single
path and variations will be inevitable, given the nature
of the health-care ‘production’ process with its imprecise
boundaries between stages, the complex ways that
service users progress through the system and the
often probabilistic nature of the treatment process.7
Thus, analysts have identified different dimensions of
integration, most commonly differentiating the type,
breadth, degree and process of integration. Four main forms of integration were discussed228 229:
• Functional: integration of key support functions and activities such as financial management, strategic planning and human resource management.
• Organisational: for example, creation of networks, mergers and contracting.
• Professional: for example, joint working, group practices, contracting or strategic alliances of health-care professionals within and between institutions and organisations.
• Clinical: integration of the different components of clinical processes, such as coordination of care services for individual health-care service users or care pathways.
These types can occur in ways that have been described as
horizontal integration or vertical integration (also referred
to as breadth of integration)229. Horizontal integration links services that are on the same level in the process of health care, for example general practice and community care, facilitating organisational collaboration and communication between providers.
brings together organisations at different levels of a
hierarchical structure under one management umbrella,
for example primary and secondary or specialist care.
IC can be realised on a continuum of integration, referred to as the degree of integration.230 231 The degree can range from full integration, in which the integrated organisation is responsible for the full continuum of care including funding, to collaboration, which describes separate structures in which organisations retain their own service responsibility and funding criteria. Three levels were identified:
• Linkage: operates through the separate structures of existing health and social services systems, with organisations retaining their own service responsibilities, funding and eligibility criteria and operational rules.
• Coordination: as for linkage, but involves additional explicit structures and processes such as routinely shared information, discharge planning and case managers to coordinate care across the various sectors.
• Full integration: the integrated organisation/system assumes responsibility for all services, resources and funding.
One important message for those contemplating the
implementation of an IC system is that:
“Based on the evidence presented here, there may be a need
to revisit our understanding of what IC is and what it seeks
to achieve and the extent to which the strategy lends itself to
evaluation in a way that would allow for the generation of clear-cut evidence, given its polymorphous nature. Fundamentally,
it is important to understand whether IC is to be considered
an intervention that, by implication, ought to be cost-effective
and support financial sustainability or whether it is to be
interpreted and evaluated as a complex strategy to innovate
and implement long-lasting change in the way services in the
health and social-care sectors are being delivered and that
involve multiple changes at multiple levels. Evidence presented
here and elsewhere strongly points to the latter and initiatives
and strategies underway will require continuous evaluation over
extended periods of time enabling assessment of their impacts
both economic and on health outcomes if we are to generate
appropriate conclusions about program effectiveness and the
application of findings to inform decision making.”7
From a measurement point of view, the lack of clarity and consensus on the exact content, nature, mechanisms and domains included in IC will hamper efforts to develop the PROMs/PREMs needed to understand the process and improve practice by patients, providers and health systems.
5.2.4 Patient experience vs patient
satisfaction
A distinction can be made between patient’s satisfaction
and patient’s experience. Patient satisfaction surveys tend
to ask patients subjective questions about their satisfaction
with their care. In contrast, patient experience questions
relate to the patient’s objective experience during the entire
health center interaction. The patient experience is the sum
of a patient’s interaction with a health center and is the
patient’s perception of those interactions.
Patient satisfaction questions generally focus on how well a patient's expectations are met, their preferences and the quality of their care. They tend to be subjective and non-specific. Examples of satisfaction questions include: How do you rate your doctor's caring and concern for you? How satisfied are you with the appointment system in your health center?
Experience questions relate to the patient’s actual, more
objective experiences in the health center and aim to avoid
value judgments and the effects of existing expectations.
Examples of experience questions include: In the last 12
months, how many days did you usually have to wait for an
appointment when you needed care right away?
It could be argued that terms such as satisfaction
and experience have distinct meanings. For instance,
‘expectations’ may refer to people’s perceptions before
receiving care, ‘experience’ may relate to what happens
during care and ‘satisfaction’ may refer to information
collected afterwards. Alternatively, ‘experience’ could be
taken to describe things that happened and the extent
to which their needs are met whereas ‘satisfaction’ could
relate more to how they feel about those things.
While there are some detailed theoretical and academic
arguments about the difference between these terms,
for simplicity, for this report ‘experience’ means any
combination of satisfaction, expectations and experience
so long as it relates to feedback provided by those
using health services as well as their family or carers. For
consistency, the term ‘experience’ is used throughout
this report although terms such as ‘satisfaction’ and
‘expectations’ are used where needed to focus on those
particular concepts.
Patient experience is an important component of quality of care, which is itself a multi-dimensional concept (safe, effective, efficient, equitable, timely and patient-centered; IOM). As such, patient experience does not (and should not) reflect quality in other domains. Patient experience is itself also multi-dimensional, and there is no universal agreement on the exact domains/components of patient experience that should be measured (Appendix 8). However, patient experience is important in its own right. It is consistently and positively associated with other quality outcomes, including patient safety and clinical effectiveness, across a wide range of studies and healthcare facilities. Those providing high-quality clinical care tend to have better experiences reported by patients. Clinical quality and patient experience should be considered as distinct but inter-related aspects of quality.
5.3 International experience
5.3.1 The USA experience
In the USA, PROM/PREM are being increasingly recognised
as important for assessing clinical outcomes, advancing
quality improvement and informing technology
assessment and reimbursement decisions. Many national
organisations have recognised the importance of PROMs
in clinical care and the following significant activity has
already occurred:
1) NIH funding for the PROMIS and Neuro-QoL network.
2) Legislation authorising the Patient-Centered Outcomes
Research Institute (PCORI).
3) Grants from the AHRQ.
4) Activity at the Centers for Medicare and Medicaid
Services.
5) Regulatory guidelines from the FDA and other public
and private organisations. Integration of PROs into
EHRs is also a key component of the Center for
Medicare and Medicaid Services’ financial incentive
program to demonstrate “meaningful use” of EHRs.
6) Recent recommendations from the IOM to include psychosocial and behavioural PROMs in EHRs.
7) An AHRQ-commissioned report on building the national digital infrastructure (including EHRs and PROMs).

In the US, the health care reform movement is heading towards the development of accountable care organisations and the integration of PROs into EHRs, which is also becoming an important tool for measuring and demonstrating the success of care provided through accountable care organisations. The momentum is building towards national standardisation of PROs to provide real-time outcomes. A significant investment is being made in infrastructure to collect data from patients, synthesise the data in a meaningful way, present the data back to both patients and providers, and develop strategies for acting on the information to improve the clinical care of patients.

5.3.2 The UK experience
The NHS in the UK has led the application of the PROMs
over the years, moving from research to clinical practice
and quality improvement. More recently, PROM has been
applied to healthcare quality assessment. The program
launched by the UK Department of Health in April 2009
requires the collection of PROMs for selected surgical
procedures (hip or knee replacement, inguinal hernia,
varicose veins) within all the health facilities financed by
the NHS. Several questionnaires were used before and
after the procedure, thus allowing assessment of the
health outcomes reported by the patient. From 2009
to 2013 the total number of questionnaires was high
and steadily grew; in the period April 2009-March 2010
184,818 pre-procedure and 134,768 post-procedure
questionnaires were completed by the patients and
returned with a 17% and 11% increase respectively,
compared to the previous period (April 2009-March 2010).
The analysis of the UK’s experience offers many valuable
lessons in feasibility and resources, the most appropriate
methodology as well as in professional culture.
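The core analysis that this pre/post design enables is straightforward: each patient's post-procedure score is paired with their pre-procedure score and the average health gain is aggregated, for example by procedure or provider. A minimal sketch with hypothetical scores is shown below; the actual NHS PROMs methodology, including casemix adjustment, is considerably more involved.

```python
# Minimal sketch: average health gain from paired pre- and post-procedure PROMs.
# Scores are hypothetical values on a 0-48 scale where higher means better function.
pre_scores = {"pt1": 18, "pt2": 22, "pt3": 15, "pt4": 27}
post_scores = {"pt1": 40, "pt2": 35, "pt3": 33}  # pt4 returned no post-op questionnaire

# Keep only patients with both a pre- and a post-procedure questionnaire.
paired = [(pre_scores[p], post_scores[p]) for p in post_scores if p in pre_scores]
gains = [post - pre for pre, post in paired]
mean_gain = sum(gains) / len(gains)

print(f"Paired questionnaires: {len(paired)}")
print(f"Mean health gain: {mean_gain:.1f} points")
```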
Moreover, two of the five domains of the NHS Outcomes Framework put great emphasis on PROM/PREM9. Among the five domains of the outcomes framework, Domain 2 (Enhancing quality of life for people with long-term conditions) and Domain 4 (Ensuring that people have a positive experience of care) rely heavily on the collection of PROM/PREM to assess outcomes.

Figure 5.3-1 Duty of quality and the NHS Outcomes Framework. The framework comprises five domains, each underpinned by a duty of quality: Domain 1, Preventing people from dying prematurely; Domain 2, Enhancing quality of life for people with long-term conditions; Domain 3, Helping people to recover from episodes of ill health or following injury; Domain 4, Ensuring that people have a positive experience of care; and Domain 5, Treating and caring for people in a safe environment and protecting them from avoidable harm. The domains are supported by NICE quality standards (building a library of approximately 150 over 5 years), the commissioning outcomes framework, commissioning guidance, the QOF, provider payment mechanisms (tariff, standard contract, CQUIN) and commissioning/contracting arrangements (the NHS Commissioning Board for certain specialist services and primary care; GP consortia for all other healthcare services).
Table 5.3-1 Selected outcome indicators for Domain 2 of the NHS Outcomes Framework

Overarching indicator:
2. Health-related quality of life (HRQoL) for people with long-term conditions (EQ-5D)

Improvement areas:
Ensuring people feel supported to manage their condition
2.1 Proportion of people feeling supported to manage their condition
Improving functional ability in people with long-term conditions
2.2 Employment of people with long-term conditions
Reducing time spent in hospital by people with long-term conditions
2.3.i Unplanned hospitalisation for chronic ambulatory care sensitive conditions (adults)
2.3.ii Unplanned hospitalisation for asthma, diabetes and epilepsy in under 19s
Enhancing quality of life for carers
2.4 Health-related quality of life for carers (EQ-5D)
Enhancing quality of life for people with mental illness
2.5 Employment of people with mental illness

5.3.3 The Sweden experience

Sweden started nationwide use of PROMs through the disease-specific clinical databases (quality registers) established by the medical profession as early as 1975. One prominent example is the nationwide, prospective, observational follow-up program which included PROMs for the Swedish Hip Arthroplasty Register. The program started in 2002 and was gradually expanded to include all units performing total hip replacement in Sweden. The self-administered PROMs protocol comprised the EQ-5D instrument, the Charnley class categorisation and visual analogue scales for pain and satisfaction. A study232 showed that patients' response rates to the Registry were good. Patients eligible for total hip replacement generally reported low health-related QoL and suffered from pain. One year post-operatively the mean EQ-5D index increased to above the level of an age- and gender-matched population, with a considerable reduction in pain. Females, younger patients and those in Charnley category C reported a lower EQ-5D index pre-operatively than males, older patients and those in Charnley category A or B, respectively232. The study also showed that, despite the required structured organisation and effective data capture infrastructure, a nationwide implementation of a PROMs program is feasible, and the continuous collection of PROMs permits local and national improvement and allows for further health-economic evaluation as well as comparative effectiveness research (CER)233 (Figure 5.3-2).

Figure 5.3-2 Using PROM in the Swedish Hip Arthroplasty Register. The figure summarises how PROM in Swedish hip arthroplasty has enabled quality and efficiency improvements, with significant efficiency gains from PRO metrics in THA. The main indications for total hip arthroplasty are pain and impaired HRQoL, making PRO key; hence a standardised PROM protocol was introduced in 2002, including the Charnley functional categories, VAS and EQ-5D/EuroQol instruments, and covering all but one Swedish THA performers. With a cost of around USD 8,000 per THR patient (of which 61% is productivity loss) and more than 16,000 patients annually, there is significant potential for QoL and cost gains. Based on PROM output, two alternative treatment procedures could be properly assessed, with one-stage bilateral THA coming out on top with a better HRQoL profile at lower cost. Pre-operative anxiety/depression stands out as the deciding factor in THA satisfaction (post-operation satisfaction (VAS) plotted against mobility, self-care, usual activities, pain/discomfort, anxiety/depression, Charnley category and gender). (Source: AHRQ workshop discussion, 2012)
5.3.4 The Denmark experience: the generic
Integrated PRO System (WestChronic) 234
Similar to Sweden, Denmark also started to introduce
PROs in some disease-specific national clinical registers
in 2000.235 Despite the fact that logistic and scientific
challenges are similar across diagnostic groups and
applications, most PRO systems have been applied to a
single-patient group. To achieve integration of different
disease-specific groups, a system called WestChronic
was developed in Denmark in 2004 to accommodate
an arbitrary number of PRO projects with individual
questionnaires, protocols, patients and users. For the
patient and clinician, each implemented project appeared
as a unique PRO project with its own logo, domain,
website, email address, accompanying letters and contact
information, etc.
WestChronic has one basic module (basic PRO data collection and logistics) and three optional elements: 1) PRO-based clinical decision support, 2) PRO-based automated decision algorithms, and 3) other forms of communication (for details, see Table 5.3-2). While the first element is ubiquitous, the others are optional and only applicable at the patient level. Each element has its unique
methodological and organisational challenges. The
WestChronic has, to date, implemented 22 PRO projects
within 18 diagnostic groups, including cardiology,
neurology, rheumatology, nephrology, orthopedic
surgery, gynecology, oncology and psychiatry. Presently,
WestChronic includes 1756 items in 92 questionnaires
and 158 templates for personalised letters and emails. The
system was designed for both the patient and provider
groups. Although not explicitly stated, the data collected by the system can also be harnessed from a health system perspective.
Table 5.3-2 Elements of clinical application of PRO.
Base element (PRO data collection and logistics): questionnaire (items); criteria for inclusion and termination; data collection modes (web, paper, interview); approach modes (letter, email, telephone, texting); schedules of questionnaires/reminders.
Optional element 1 (PRO overview for clinical decision support): categorisation of PRO for clinical decision support; course-oriented graphic overview.
Optional element 2 (PRO-based automated decision algorithms): decision tree; action protocol.
Optional element 3 (other forms of communication): two-way communication; one-to-many communication.
(Source: Hjollund et al. 2014234)
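To illustrate the modular structure described above, the minimal Python sketch below shows how a PRO project combining the base data-collection element with an optional automated decision rule might be configured. All names, thresholds and actions are hypothetical illustrations, not part of the WestChronic implementation.

    from dataclasses import dataclass, field

    @dataclass
    class Questionnaire:                      # base element: PRO data collection and logistics
        name: str
        items: list                           # questionnaire items
        schedule_weeks: list                  # schedule of questionnaires/reminders
        modes: tuple = ("web", "paper")       # data collection modes

    @dataclass
    class DecisionRule:                       # optional element 2: automated decision algorithm
        item: str
        threshold: float
        action_if_below: str                  # action protocol when the score is below the threshold

    @dataclass
    class ProProject:
        name: str
        questionnaires: list
        decision_rules: list = field(default_factory=list)

        def decide(self, responses: dict) -> list:
            # Apply each automated decision rule to one set of PRO responses.
            actions = []
            for rule in self.decision_rules:
                score = responses.get(rule.item)
                if score is not None and score < rule.threshold:
                    actions.append(rule.action_if_below)
            return actions

    # Hypothetical follow-up project that flags routine visits as potentially unnecessary
    # when the reported symptom burden is low.
    project = ProProject(
        name="heart-failure-followup",
        questionnaires=[Questionnaire("HF-PRO", ["fatigue", "dyspnoea"], [0, 12, 24])],
        decision_rules=[DecisionRule("symptom_score", 3.0, "flag visit as potentially unnecessary")],
    )
    print(project.decide({"symptom_score": 2.0}))

Any real action protocol would be clinically defined and governed; the sketch does not attempt to capture that.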
The most recent study234 showed that in a total of 30,174
patients, 59,232 PRO assessments were collected using
92 different PRO questionnaires. Response rates of up to
93% were achieved for first-round questionnaires and up
to 99% during follow-up. For six diagnostic groups, PRO
data was displayed graphically to the clinician to facilitate
flagging of important symptoms as well as decision
support and in five diagnostic groups PRO data was used
for automatic algorithm-based decisions (Table 5.3-3).
The WestChronic system showed the feasibility and utility
of the implementation of all proposed protocols for data
collection and processing. The system has achieved high
response rates and longitudinal attrition is limited. The relevance of the questions, the mixed-mode principle and automated procedures have contributed to the high response rates. Furthermore, the development and implementation of a number of approaches and methods for clinical use of PRO have been possible without challenging the generic property.234
Table 5.3-3 Characteristics of 22 projects involving implementations of a generic PRO system: projects with patient-level use (n=14)

D: PRO for clinical overview (AmbuFlex I)
Level of aggregation: patient. Implemented projects: 5. Invoked elements: base + element 1. Recruitment: preadmission assessment. Primary aim: clinical decision support. Extension: local. In operation from: 2009. Patients (Jan 2014): 741. Questionnaires/patient: no limit. Response rate (primary): 75%; (follow-up): 82%.

E: PRO for automated cancelling of visits
Level of aggregation: group/patient. Implemented projects: 3. Invoked elements: base + element 2. Recruitment: clinic referral. Primary aim: efficient use of resource. Extension: national, selected hospitals. In operation from: 2011. Patients (Jan 2014): 1639. Questionnaires/patient: 3. Response rate (primary): N/A; (follow-up): 97%.

F: PRO for screening
Level of aggregation: group/patient. Implemented projects: 2. Invoked elements: base + element 2. Recruitment: hospital registers/clinical referral. Primary aim: screening for depression. Extension: local. In operation from: 2011. Patients (Jan 2014): 1740. Questionnaires/patient: 1/no limit. Response rate (primary): 88%; (follow-up): N/A.

G: PRO for clinical decision support (AmbuFlex II)
Level of aggregation: patient. Implemented projects: 3. Invoked elements: base + elements 1, 2. Recruitment: clinic referral. Primary aim: clinical decision support. Extension: regional. In operation from: 2012. Patients (Jan 2014): 3120. Questionnaires/patient: no limit. Response rate (primary): 93%; (follow-up): 99%.

H: Other forms of communication
Level of aggregation: patient. Implemented projects: 1. Invoked elements: base + elements 1, 3. Recruitment: clinic referral. Primary aim: communication (therapists and patient). Extension: local. In operation from: 2012. Patients (Jan 2014): 23. Questionnaires/patient: no limit. Response rate (primary): N/A; (follow-up): N/A.

Patient groups across the projects include chronic heart failure, rheumatoid arthritis, renal failure, hip/knee replacement, acute coronary syndrome, ADHD, endometriosis, heart transplant, lung cancer and prostatic cancer.
5.4 Measuring PREMs in non-primary care settings in Australia
Several reports reviewed15 236-238 the issues of measuring patient experience in Australia. The recent report from the
Australia Commission on Safety and Quality in Healthcare (ACSQH)15 provides a summation of the latest domains used
in the hospital patient experience and satisfaction surveys in Australia (Table 5.4-1). Since the publication of the ACSQH’s
report, the BHI also commissioned a report to review its in-hospital patient experience survey.237
Table 5.4-1 Frequency of domains used in the hospital patient experience and satisfaction surveys in Australia
The surveys compared were those used in WA, QLD, SA, ACT, NT, TAS, VIC and NSW, together with HCAHPS and the QPS Patient Satisfaction Survey; the percentage is the proportion of these surveys that include each domain.
• Access/waiting time/admission process: 90%
• Information sharing/communication: 90%
• Physical environment: 90%
• Overall satisfaction: 90%
• Involvement/participation: 80%
• Privacy/respect/dignity: 80%
• Consistency/coordination of care: 80%
• Discharge/continuity of care: 70%
• Pain control: 50%
• Safety/quality (i.e. hand hygiene, patient identification): 20%
Note: HCAHPS – Hospital Consumer Assessment of Healthcare Providers and Systems; QPS – QPS Patient Satisfaction Survey.
(Adapted; Source: ACSQH, 201215)
The ACSQH’s report15 also found that St Vincent’s Health Australia had recently conducted a review of its patient experiences
(PEx) and patient satisfaction (PSat) surveys in order to inform the development of a standard methodology. During this
review, two potential key performance indicators (KPIs) that could be used to measure patient experience were identified.
The two candidate KPIs were:
1) Likelihood of recommending the hospital to family and friends.
2) Overall rating of care.
The details of the different surveys used by St Vincent’s Hospitals are presented in Table 5.4-2.
Table 5.4-2 Survey tools used among St Vincent’s hospitals
State level
NSW
Valid
No. of
used
Frequency of
description reporting
tool
questions
Picker
PEx
Annual now
monthly, Q4
Yes
SHC, STJ
State wide
since 2007,
by IPSOS
SVPH
N/A
Press
PEx and
PSat
Monthly, Q4
Yes
PEx and
PSat
Monthly, Q4
Yes
PSat
Annual
PSat
Local*
SVH &
Tool
Ganey
Mater
N/A
Press
Ganey
QLD
HSNS
SVHB
Local
Nil but
required as
part of health
fund contract Local
SVHT
VIC
Tool
KPI 1
KPI 2
70-90**
yes
yes
57 +
yes
yes
55 + some
special
yes
Yes
No
72
yes
no
Annual
No
36
yes
yes
PSat
Continuous
No
65
yes
yes
No
21
yes
Yes
65
no
no
10 special
SV& MP
Nil
Local
PSat
Monthly
SVH M
Under
Local
development
PSat
6 Monthly
*Specific Day surgery and emergency tools used ** Current, proposed to reduce to 50-60
PEx – patient experience, PSat – patient satisfaction
SVH – St Vincent’s Hospital, SHC – Sacred Heart, STJ – St Joseph’s Hospital, SVOPH – St Vincent’s Private Hospital (public), HSNS
– Holy Spirit Northside Hospital, SVHB – St Vincent’s Hospital Brisbane, SVHT - St Vincent’s Hospital Toowoomba, SV & MP - St
Vincent and Mercy Private, SVHM - St Vincent’s Hospital Melbourne
In a survey of day hospitals conducted by the ACSQH, it was found that a range of survey tools was used. The QPS Patient Satisfaction Survey was developed by QPS Benchmarking, an Australian and New Zealand based health care quality improvement organisation. A range of sampling methods was used by the day hospitals: some surveyed all patients while others used random sampling. The QPS Benchmarking
Patient Satisfaction Survey (Version 3) has 20 questions
covering the following domains:
1) appointment/waiting times
2) information sharing/communication
3) respect and dignity
4) conduct of staff
5) physical environment
6) overall satisfaction
7) pain management
8) services and equipment
9) billing process
10) discharge/continuity of care.
The survey is conducted 1-2 weeks post-discharge. The
QPS scorecards automatically provide a numerator and
denominator as well as question by question graphical
and numerical results.
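As a rough illustration of the numerator/denominator logic such scorecards apply, the short Python sketch below computes a per-question satisfaction rate from hypothetical survey responses; the data and field names are invented for illustration only.

    # Hypothetical post-discharge survey responses: 1 = satisfied, 0 = not satisfied,
    # None = question not answered.
    responses = {
        "waiting_times":       [1, 1, 0, 1, None, 1],
        "respect_and_dignity": [1, 1, 1, 1, 1, 0],
    }

    for question, answers in responses.items():
        answered = [a for a in answers if a is not None]
        numerator, denominator = sum(answered), len(answered)
        if denominator:
            print(f"{question}: {numerator}/{denominator} = {100 * numerator / denominator:.0f}%")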
Table 5.4-3 Patient experience and satisfaction surveys used in Australia

Jurisdictions
• Northern Territory – tool: Local; administration: In-house; scope: Hospital; description: PEx
• Australian Capital Territory Health – tool: Victoria Patient Satisfaction Monitor; administration: Mail; scope: State-wide; description: PEx and PSat
• Tasmania – tool: Local; administration: Mail; scope: State-wide till 2007, hospital thereafter; description: PEx and PSat
• South Australia – tool: SACESS; administration: CATI; scope: State-wide; description: PEx and PSat
• Western Australia – tool: Local; administration: CATI; scope: State-wide; description: PEx and PSat
• Queensland – tool: Picker-based; administration: CATI; scope: State-wide; description: PEx and PSat
• Victoria – tool: Victoria Patient Satisfaction Monitor; administration: Mail and online; scope: State-wide; description: PEx and PSat

Private hospitals
• Eye Tech Day Surgery, QLD – tool: QPS; administration: CATI; scope: Hospital; description: PSat
• Colin Street Day Surgery, WA – tool: QPS; administration: CATI; scope: Hospital; description: PSat
• Mater Hospital North Sydney – tool: Press Ganey; administration: CATI; scope: Hospital; description: PEx and PSat
• Healthscope (44 hospitals) – tool: HCAHPS; administration: Mail; scope: National; description: PEx and PSat
• St Vincent’s Hospitals (2 public, 8 private) – tool: Picker and Press Ganey in NSW, local in QLD and VIC; administration: Mail; scope: National; description: PEx and PSat in select NSW hospitals, PSat in QLD and VIC
• Robina Procedure Centre, QLD – tool: Local; administration: In-house; scope: Hospital; description: PEx
• Centre for Digestive Diseases, NSW – tool: Local; administration: In-house; scope: Hospital; description: PEx
• Buderim Gastroenterology Centre, QLD – tool: Local; administration: In-house; scope: Hospital; description: PEx
• Liverpool Day Surgery, NSW – tool: Local; administration: In-house; scope: Hospital; description: PEx
• The Women’s Clinic Day Hospital, WA – tool: Local; administration: In-house; scope: Hospital; description: PEx
• The Eye Hospital, Tasmania – tool: Local; administration: In-house; scope: Hospital; description: PSat

CATI – computer-aided telephone interviewing; PEx – patient experience; PSat – patient satisfaction
The key findings of the ACSQH’s report are paraphrased as follows:
• There are differences in the methodologies, administration, scope, rating scales, inclusion and exclusion criteria, sampling, data analysis and reporting methods used across public and private hospitals.
• Most jurisdictions and some private hospitals are using a combination of patient experience and satisfaction questions in their surveys.
• There are documented impacts from the use of the surveys showing how feedback has informed service delivery.
• There was no significant difference in the frequency or number of domains used between private and public hospitals.
• In some jurisdictions and private hospital ownership groups, surveys are conducted state-wide or across one private hospital ownership group, while in certain instances individual institutions have developed and administered their own surveys.
• Most of the private hospitals included in this review use locally developed tools. These tools are often administered in-house.
• The surveys are not well suited for use with people who speak little or no English as translated versions are generally not available. The Northern Territory (NT) and Victoria have sought to address language barriers. In NT hospitals, meaningful pictures and symbols are incorporated within the surveys. In Victoria, the surveys are available in English and 16 community languages.
A point worth noting is that there has been little progress made since the 2009 review,236 which reported that, despite the existence of patient experience data in acute hospitals in all Australian states, this information was not utilised optimally to realise its potential value.
5.5 Measuring PREMs in primary care settings in Australia
There is currently no nationwide survey in Australia targeting patient experience in the primary care setting. However, several related surveys are described below.
The National Health Services Patient Experience Survey: The
Australian Bureau of Statistics (ABS) conducts this survey
covering a range of health services and ten domains. It
is administered at the population level. The hospital and
emergency department (ED) modules focus on rates of
admission to hospital within the last 12 months by age,
remoteness and sex. These two modules also contain
questions relating to reasons for visiting the ED or
hospital, whether staff listened and showed respect and
whether a satisfactory explanation of treatment was given.
The survey does not target people who have had a recent
hospital admission or ED presentation. The survey is not
designed for hospital-level reporting.
Commonwealth Fund Patient Experience Survey: The Commonwealth Fund runs and reports population-based patient experience surveys in 11 Organisation for Economic Co-operation and Development (OECD) countries in three-year cycles. Separately, specified healthcare providers and patient sub-populations are surveyed in the other years.
The ACSQH and the NSW BHI worked with the
Commonwealth Fund and increased the sample size of the
2010 population survey and reported on access to and use
of primary care services, use of specialists, out-of-pocket
costs, prescriptions and hospital and ED experiences.239
The Royal Australian College of General Practitioners (RACGP) Patient Satisfaction Survey: There is no mandatory national patient experience survey in the primary care setting in Australia by the RACGP. However, Criterion 2.1.2 of the RACGP Standards for general practices (4th edition) (the Standards) requires practices to:
• Collect feedback about their patients’ experiences by:
  – Option 1: using an RACGP-approved validated patient experience questionnaire; or
  – Option 2: developing and gaining RACGP approval of a practice-specific method for gaining patient feedback.
• Demonstrate that the information received from patients is used to help improve the practice.
There are four service providers conducting “RACGP-approved validated patient feedback questionnaire” surveys for accreditation purposes.
The college believes that “The carefully developed RACGP-approved questionnaires are in line with best available evidence and scientific knowledge about questionnaire development and administration. The questionnaires have been tested to ensure they measure patient experiences in a reliable way. Providers offering these questionnaires will also collate and analyse your patients’ responses and provide your practice with a report of the results”.
The four service providers are:
1. Insync Surveys incorporating UltraFeedback’s Patient
Satisfaction Instrument (PSIv5): 1800 143 733
2. cfep Survey’s Patient Accreditation Improvement
Survey (PAIS): (07) 3855 2093
3. Press Ganey Associates : (07) 5560 7400
4. SEHPA Patient Feedback Survey (available to practices
within the Cities of Casey and Greater Dandenong and
the Shire of Cardinia within Victoria): (03) 8792 1900
Further information on the survey can be accessed
from the PAIS website. There is very limited information
available from other service providers’ websites with
respect to the content and cost of the surveys.
The RACGP also provides the Patient Feedback Guide:
Learning from our patients (updated August 2014) which
has been developed to:
• assist general practices in understanding what is required to fulfil Criterion 2.1.2 of the RACGP Standards for general practices (4th edition)
• explain the options available to practices for meeting the requirements of Criterion 2.1.2
• provide guidelines to practices wishing to develop their own practice-specific patient feedback tool.
The RACGP Toolkit for developing practice-specific
questionnaires (the Toolkit) is a separate document
to be used in conjunction with the Guide. The Toolkit
provides a framework and a range of templates to assist
the development of a valid and reliable practice-specific
questionnaire. As such, applications that use the Toolkit will be approved more quickly. The approval process can take up to 6-12 months from initial submission of the application and can be quite resource intensive due to the requirements of the Guide; applications that utilise the Toolkit are anticipated to be approved within 8 weeks.
5.6 Social media, cost-effectiveness
and IC
There is increased use of social media in chronic disease management and in measuring PROMs/PREMs. A quick review of the evidence on its potential benefits and limitations is presented in Appendix 6. A tentative review of the evidence on the cost-effectiveness of IC is presented in Appendix 7.
6 | Future plans and
research priorities
This report identified several areas which need policy and research consideration in an Australian setting.
6.1 Psychosocial behaviours in EHRs
as part of PROMs
In a recent report12 released by the IOM, the Committee
on the Recommended Social and Behavioural Domains
and Measures for Electronic Health Records was asked
to recommend core measures of social and behavioural
domains for inclusion in all EHRs. The Committee
identified a parsimonious panel of measures that is
comprehensive, interoperable and efficient.12 These
“psychosocial vital signs” include four measures that are
already widely collected (race/ethnicity, tobacco use,
alcohol use and residential address) and eight additional
measures (education, financial resource strain, stress,
depression, physical activity, social isolation, exposure to
violence and neighbourhood median household income).
While recognising the additional time needed to collect
such data and act upon it, the committee concluded that
the health benefits of addressing these determinants
outweigh the added burden to providers, patients and
health care systems. Advances in research in the future
will likely point to additional measures that should be
included in the panel and periodic re-reviews should be
undertaken to assess them.
There is a need to consider which ‘psychosocial vital signs’ could be included as part of PROMs in the Australian setting.
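As a purely illustrative sketch, not a proposed schema, the IOM panel’s psychosocial vital signs could be represented as a small set of structured fields captured alongside PROMs; the field names and types below are hypothetical.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class PsychosocialVitalSigns:
        # Four measures already widely collected
        race_ethnicity: Optional[str] = None
        tobacco_use: Optional[str] = None
        alcohol_use: Optional[str] = None
        residential_address: Optional[str] = None
        # Eight additional measures recommended by the IOM committee
        education: Optional[str] = None
        financial_resource_strain: Optional[int] = None   # e.g. a brief screener score
        stress: Optional[int] = None
        depression: Optional[int] = None                  # e.g. a screening instrument score
        physical_activity: Optional[int] = None           # e.g. minutes per week
        social_isolation: Optional[int] = None
        exposure_to_violence: Optional[bool] = None
        neighbourhood_median_household_income: Optional[float] = None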
6.2 Integrating PROMs with EHRs
and other data sources: the need
for robust health information
infrastructure
The recent report commissioned by the AHRQ provided
a full blueprint on how to build a robust information
infrastructure and integrate PROM into the EHR in the
USA.240 The report posits two major arguments:
• The current lack of interoperability among data resources for EHRs is a major impediment to the unencumbered exchange of health information and the development of a robust health data infrastructure. Interoperability issues can be resolved only by establishing a comprehensive, transparent and overarching software architecture for health information.
• The twin goals of improved health care and lowered health care costs will be realised only if health-related data can be explored and exploited in the public interest for both clinical practice and biomedical research. This will require implementing technical solutions that both protect patient privacy and enable data integration across patients.
Given the major push from the IOM to integrate psychosocial and behavioural measures (as part of PROMs) into EHRs, it is important to explore the policy, technical and infrastructure requirements for enabling integration of any future PROM with the EHR (or PCEHR) in Australia.
Other key findings from the report are (paraphrased):
1. The criteria for Stage 1 and Stage 2 Meaningful Use: while
surpassing the 2013 goals set forth by the Department of
Health and Human Services (HHS) for EHR adoption, they
fall short of achieving meaningful use in any practical
sense. At present, large-scale interoperability amounts
to little more than replacing fax machines with the
electronic delivery of page-formatted medical records.
Most patients still cannot gain electronic access to their
health information. Rational access to EHRs for clinical
care and biomedical research does not exist outside the
boundaries of individual organisations. (Section 3.2)
2. Although current efforts to define standards for EHRs and
to certify Health Information Technology (HIT) systems
are useful, they lack a unifying software architecture to
support broad interoperability. Interoperability is best
achieved through the development of a comprehensive,
open architecture. (Section 5.1)
3. Current approaches for structuring EHRs and achieving
interoperability have largely failed to open up new
opportunities for entrepreneurship and innovation that
can lead to products and services that enhance health
care provider workflow and strengthen the connection
between the patient and the health care system, thus
impeding progress toward improved health outcomes.
(Section 5.1)
4. HHS has the opportunity to drive adoption and
interoperability of the EHR by defining successive stages
of Meaningful Use criteria that move progressively
from the current closed box system to an open software
architecture. (Section 5.2)
5. The biomedical research community will be a major
consumer of data from an interoperable health data
infrastructure. At present, access to health data is mostly
limited to proprietary datasets of selected patients. Broad
access to health data for research purposes is essential to
realising the long-term benefits of a robust health data
infrastructure. (Section 6.2)
6. The data contained in EHRs will increase tremendously
both in volume and in the diversity of input sources.
It will include genomic and other “omic” data, self-reported data from embedded and wireless sensors
and data gleaned from open sources. Some types of
personal health data, especially when combined, will
make it possible to decipher the identity of the individual
even when the data are stripped of explicit identifying
information, thus raising challenges for maintaining
patient privacy. (Section 6.3)
7. The US population is highly diverse, reflecting much of the
diversity of the global population. Therefore, important
research findings applicable to Americans are likely to
come from shared access to international health data.
Currently there is no coherent mechanism for accessing
such data for research. (Section 6.4)
8. Electronic access to health data will make it easier to
identify fraudulent activity but at present there is little
effort to do so using EHRs.
Most of these issues are not restricted to the USA, and some, such as the fragmentation of data sources, are more pronounced in the Australian setting. The recent review commissioned by the National Electronic Health Transition Authority (NEHTA) recommended 38 sweeping changes to its governance structure (including dissolving NEHTA), operation and the name of the PCEHR (to MyHR).241 The report viewed care integration as one of the areas in which meaningful use of the PCEHR could be most beneficial (citing both the ABS Patient Experience Survey results and the Kaiser Permanente case). While the effects of the proposed changes remain to be seen, it is important that the issues raised above are reviewed in the Australian setting. It should also be a priority to create a health information infrastructure that links PROM/EHR to other existing data sources (registries, administrative data, survey data, Medical Benefit Scheme (MBS), Pharmaceutical Benefit Scheme (PBS), Registry of Births, Deaths and Marriages (RBDM), biobanks, etc) and to take measures to ensure that the promised potential of utilising the PCEHR to facilitate care integration is realised.
6.3 Developing PROM/PREM
measures on important
subgroups
While there is no need to reinvent the wheel by creating
our own PROM/PREM if there is already a sound existing
instrument, one should still critically review the adopted
PROM/PREM measures to see if these PROMs/PREMs
are suitable to be used across different population
subgroups. Notable subgroups in Australia
include the Indigenous population, patients from
culturally and linguistically diverse (CALD) backgrounds,
women, children and the elderly as well as patients
with mental health problems or cognitive impairment.
The sound psychometric properties of an instrument
developed in another setting and population may not
be applicable to the subgroups in an Australian setting.
Thus, it is critical that we test the suitability of these
instruments in different Australian subgroups. In psychometric terms, this is the issue referred to as cross-cultural validity or measurement invariance.
In the UK, National Voices is currently developing narratives for IC specific to four particular user groups – mental health, children and young people, end of life care and frail older people – which will enable the development of user experience indicators specific to these groups.36 Another development in the UK is a project to develop a survey tool for measuring user-reported experience of IC among older people with a long-term condition. Its aim is to support health and social care services in England and, more widely, in an international context to measure and improve the quality of IC. Led by the Nuffield Trust in collaboration with the
Picker Institute, The King’s Fund, National Voices and the
International Foundation for Integrated Care, the project
is funded by the AETNA Foundation in the US and is due
for completion in 2015.
6.4 Exploring the ways that the results can be better presented to different stakeholders
The purpose of collecting PROMs/PREMs is to improve quality of care at three different levels (i.e. patient, provider and health system). It is critical to understand and develop the best ways to communicate the results derived from PROMs/PREMs to different stakeholders. One of the reasons for the lack of buy-in from doctors on collecting a PRO is that they did not see the relevance of PROs to their practice and failed to understand how the results could be used to improve their clinical practice. Useful strategies should be explored, including:
• Testing different clinical applications of available tools.
• Making explicit the utility of the collected PROM: screening, diagnosing, risk stratification and prognosis, indication for treatment, monitoring, and consistent use along the care pathway.
• Identifying the best methods for data collection (home vs health care settings, electronic vs paper and pen, classical test theory-based vs IRT-based, timing of data collection).
• Identifying the best methods for feedback and interpretation, including visual presentation.
• Identifying the most effective ways of training professionals (web-based, virtual world, face-to-face, etc).
• Involving relevant stakeholders in conceptualising, developing and testing PROMs/PREMs and in interpreting and presenting the results.
6.5 Investment in understanding
population norm, cut-off, MID,
responsiveness, response-shift of
the PROMs/PREMs
For any PROM/PREM instrument used, it is critically important that the population norm in an Australian setting, the clinically meaningful cut-off (or minimally important difference), the responsiveness of the measure in tracking change over time and the possible existence of response shift are thoroughly investigated. Many of these issues, if not addressed properly, will threaten the validity and interpretation of the results and make the whole effort of collecting the data meaningless. These issues are complex and often require dedicated investigation and carefully designed studies. Few studies have addressed these issues adequately.
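By way of illustration, some commonly used distribution-based quantities in this kind of work (stated generically, not tied to any particular instrument) are:

\[
\mathrm{SEM} = \mathrm{SD}_{\text{baseline}}\sqrt{1-r}, \qquad
\mathrm{SDC}_{\text{individual}} = 1.96 \times \sqrt{2} \times \mathrm{SEM} \approx 2.77\,\mathrm{SEM},
\]
\[
\mathrm{MID}_{\text{distribution-based}} \approx 0.5 \times \mathrm{SD}_{\text{baseline}},
\]

where r is a reliability coefficient such as a test-retest ICC. A change score smaller than the smallest detectable change cannot be distinguished from measurement error, which is one reason these quantities need to be estimated in the population in which the instrument will actually be used.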
6.6 Investment in IRT/CAT techniques and item banks for specific areas of interest
IRT/CAT has been successfully used in education and psychology for many decades, and PROMIS has shown the feasibility of its application in the health sector at scale. IRT/CAT has many conceptual advantages such as comparability, precision (reliability), flexibility, inclusiveness and reduced burden for respondents. Given the foreseeable growth in the number of PROMs/PREMs required, the burden of responding to PROMs/PREMs on patients and the burden of collecting them on health professionals should not be underestimated. However, developing IRT/CAT-based PROMs requires significant investment and expertise, and whether strategic investment in this area is warranted should be considered.

6.7 Developing suitable case-mix adjustment methodology for different stakeholders
Adequate case-mix or risk-adjustment methods are
an important part of developing meaningful quality
measures. As the aims, settings, and targeted populations
may vary for different PROM/PREM, the risk-adjustment
methods are also likely to be purpose-specific. Given the recent studies from the USA on the potentially misleading results of applying improper risk-adjustment methodology,242 243 it is important to investigate these issues, evaluate their implications in an Australian setting and develop appropriate risk-adjustment methodology.
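As a sketch of one common regression-based approach, offered only as an illustration rather than a recommended methodology for any particular PROM, a patient-level model of the post-treatment score

\[
\hat{y}_{ij} = \beta_0 + \beta_1\, y^{\text{pre}}_{ij} + \beta_2^{\top}\mathbf{x}_{ij}
\]

can be fitted, where \( y^{\text{pre}}_{ij} \) is the baseline PROM score and \( \mathbf{x}_{ij} \) contains case-mix variables (for example age, sex and comorbidity) for patient i treated by provider j. A provider's case-mix-adjusted result is then often summarised as the observed mean outcome minus the expected mean, \( \bar{O}_j - \bar{E}_j \), added back to the overall average. The choice of case-mix variables, and whether this kind of indirect standardisation is appropriate at all, is exactly the purpose-specific question raised above.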
7 | Summary
We have conducted a rapid review of the issues on PROMs
and PREMs with particular focus on their relevance to the
IC strategy in NSW. Points from the review include:
1) The international experience, policy background and
research interests are fast-moving in relation to PROM/
PREM and IC.
2) The UK, the USA and other European countries have
provided many policy initiatives in PCC/ICC as well as
how to utilise PROM/PREM to improve quality of care.
However, the evidence-base for most of the policy
initiatives is still emerging.
3) The recent review by the World Health Organisation (WHO) showed that the evidence on the cost-effectiveness of IC is weak and further highlighted the importance of measuring patient experience in its own right in IC.
4) There is a generational change in developing and testing PROMs/PREMs, culminating in the success of the NIH-funded PROMIS project, which is based on IRT and CAT.
5) With the increasing popularity of social media, there is a growing number of different methods to capture patient experience. More research is needed
to develop and evaluate different models/methods to
capture patient experience and inform policy.
6) Despite the ‘perfect storm’ in transforming health care
promised by the combination of PCC, PROM/PREM and
social media, the risks and benefits of different policies
should be carefully assessed and managed.
7) Despite the increasing number of PROMs/PREMs there
is a need to assess the methodological properties
of PROMs/PREMs in accordance with internationally
recognised guidelines such as COSMIN before making
a decision.
8) There is a good evidence-base for adopting one of the
generic HRQoL questionnaires in primary care or the
ICU setting. The final decision may be reached through
a working panel and a chosen consensus-reaching
process in NSW.
9) More disease- or condition- specific PROMs/PREMs
could be added to the system. The criteria used, the
process (how patients and stakeholders are involved)
and the results (how results are interpreted and fed
back to stakeholders, including patients) should be
clearly specified.
10) One important factor to consider in choosing any
particular PROM/PREM is its utility to patients,
providers and health systems and its ability to affect
change at different levels.
11) There is currently no international consensus on the definitions and constituent domains of many popular terms such as “PCC” or “IC”. Health jurisdictions have to
define their patient experience frameworks according
to their value propositions on different underlying
domains.
12) NSW Health could pioneer the work to operationalise
the indicators of patient experience measures on IC
in an Australian setting, learning from the existing
experience and tools developed in the UK and by the
AHRQ.
13) Critical leadership as well as strategy is needed to include psychosocial and behavioural variables in PROM/
PREM, integrate PROM/PREM with EHRs, and build
a sustainable big data infrastructure with linkage to
other sources of data.
14) Challenges remain in generating solid evidence on
IC and the effectiveness of PROMs/PREMs which
necessitate a committed and robust research agenda.
8 | Appendices
8.1 Appendix 1 – The list of
systematic reviews on HR-PRO
by COSMIN (489) (See attached
document)
8.2 Appendix 2 - Methodological
issues related to the measurement
properties of the PROMs/PREMs
instruments (COSMIN)
COSMIN is an international initiative aimed at improving
the selection of health measurement instruments.204-208
As part of this initiative, the COSMIN group developed a
critical appraisal tool (a checklist) containing standards
for evaluating the methodological quality of studies on
the measurement properties of health measurement
instruments (Figure 8.2-1). The COSMIN checklist
was developed in an international Delphi study as a
multidisciplinary, international collaboration with all
relevant expertise involved. The focus was on HR-PROs,
but the checklist is also useful for evaluating studies on
other kinds of health measurement instruments such as
performance-based instruments or clinical rating scales.
The COSMIN checklist can be used to evaluate the
methodological quality of studies on measurement
properties, for example in systematic reviews of
measurement properties. In systematic reviews it is
important to take the methodological quality of the
selected studies into account. If the results of high quality
studies differ from the results of low quality studies, this
can be an indication of bias. The COSMIN checklist can also
be used as a guide for designing or reporting a study on
measurement properties.
COSMIN provides detailed definitions of each
measurement property of an HR-PRO.
Figure 8.2-1 The diagram for completing the COSMIN checklist
[Flow diagram. Step 1: mark the measurement properties assessed in the article – A. Internal consistency; B. Reliability; C. Measurement error; D. Content validity (including face validity); construct validity, comprising E. Structural validity, F. Hypotheses testing and G. Cross-cultural validity; H. Criterion validity; I. Responsiveness; J. Interpretability. Step 2: are IRT methods used in the article? If yes, complete the IRT box. Step 3: complete the corresponding box A to J for each property marked in Step 1. Step 4: complete the Generalisability box for each property marked in Step 1.]
Step 1 is to determine which measurement properties are
evaluated in an article.
Step 2 is to determine if the statistical methods used in the
article are based on Classical Test Theory (CTT) or on IRT.
Step 3 is to evaluate the methodological quality of the
studies on the properties identified in step 1.
Step 4 is to assess the generalisability of the results of the
studies on the properties identified in step 1.
A detailed description of how to use the checklist, a
rationale for each item and suggestions for scoring the
items are provided in the COSMIN manual.
Figure 8.2-2 COSMIN taxonomy of relationships of measurement properties. Abbreviations: HR-PRO, health related-patient reported outcome205
[Taxonomy diagram. The quality of an HR-PRO comprises: reliability – internal consistency, reliability (test-retest, inter-rater, intra-rater) and measurement error (test-retest, inter-rater, intra-rater); validity – content validity (including face validity), construct validity (structural validity, hypotheses testing, cross-cultural validity) and criterion validity (concurrent validity, predictive validity); responsiveness; and interpretability.]
Table 8.2-1 Definitions of domains, measurement properties, and aspects of measurement properties205

Reliability (domain): The degree to which the measurement is free from measurement error.
Reliability (domain, extended definition): The extent to which scores for patients who have not changed are the same for repeated measurement under several conditions: for example, using different sets of items from the same HR-PROs (internal consistency), over time (test-retest), by different persons on the same occasion (inter-rater) or by the same persons (i.e., raters or responders) on different occasions (intra-rater).
Internal consistency (measurement property): The degree of the interrelatedness among the items.
Reliability (measurement property): The proportion of the total variance in the measurements which is because of ‘‘true’’ (a) differences among patients.
Measurement error (measurement property): The systematic and random error of a patient’s score that is not attributed to true changes in the construct to be measured.
Validity (domain): The degree to which an HR-PRO instrument measures the construct(s) it purports to measure.
Content validity (measurement property): The degree to which the content of an HR-PRO instrument is an adequate reflection of the construct to be measured.
Face validity (aspect): The degree to which (the items of) an HR-PRO instrument indeed looks as though they are an adequate reflection of the construct to be measured.
Construct validity (measurement property): The degree to which the scores of an HR-PRO instrument are consistent with hypotheses (for instance with regard to internal relationships, relationships to scores of other instruments, or differences between relevant groups) based on the assumption that the HR-PRO instrument validly measures the construct to be measured.
Structural validity (aspect): The degree to which the scores of an HR-PRO instrument are an adequate reflection of the dimensionality of the construct to be measured.
Hypotheses testing (aspect): Idem construct validity.
Cross-cultural validity (aspect): The degree to which the performance of the items on a translated or culturally adapted HR-PRO instrument are an adequate reflection of the performance of the items of the original version of the HR-PRO instrument.
Criterion validity (measurement property): The degree to which the scores of an HR-PRO instrument are an adequate reflection of a ‘‘gold standard’’.
Responsiveness (domain): The ability of an HR-PRO instrument to detect change over time in the construct to be measured.
Responsiveness (measurement property): Idem responsiveness.
Interpretability (b): The degree to which one can assign qualitative meaning - that is, clinical or commonly understood connotations - to an instrument’s quantitative scores or change in scores.
Abbreviations: HR-PROs, health-related patient-reported outcomes; CTT, classical test theory.
a: The word ‘‘true’’ must be seen in the context of the CTT, which states that any observation is composed of two components: a
true score and error associated with the observation. ‘‘True’’ is the average score that would be obtained if the scale was given an
infinite number of times. It refers only to the consistency of the score and not to its accuracy (ref Streiner & Norman244).
b: Interpretability is not considered a measurement property but an important characteristic of a measurement instrument.
COSMIN also provides a checklist (Boxes A-J) and a guided procedure for checking design issues as well as statistical methods issues (if applicable) for each measurement property within every domain. Note that the COSMIN checklist covers not only CTT but also IRT in both the design and statistical methods.
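For orientation, the contrast between the two frameworks can be written compactly. Under CTT an observed score is decomposed as

\[
X = T + E, \qquad \text{reliability} = \frac{\operatorname{Var}(T)}{\operatorname{Var}(T) + \operatorname{Var}(E)},
\]

whereas an IRT model, for example the two-parameter logistic model for a dichotomous item i, models the probability of endorsing the item as a function of the latent trait \(\theta\):

\[
P(X_i = 1 \mid \theta) = \frac{1}{1 + \exp\!\big(-a_i(\theta - b_i)\big)},
\]

with \(a_i\) the item discrimination and \(b_i\) the item location (difficulty). The COSMIN boxes apply different statistical standards depending on which framework a study uses.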
As of November 2014, the dedicated COSMIN website includes 489 downloadable systematic reviews on health measurement instruments. A list of these systematic reviews is included (Appendix 1).
Another important work that provided more detailed quantitative guidance on assessing the measurement properties of
PROM is by Terwee and colleagues (2007)245 and its main table is presented (Table 8.2-2).
Table 8.2-2 Quality criteria for measurement properties245

1. Content validity
Definition: The extent to which the domain of interest is comprehensively sampled by the items in the questionnaire.
Quality criteria (a, b):
+ A clear description is provided of the measurement aim, the target population, the concepts that are being measured, and the item selection AND target population and (investigators OR experts) were involved in item selection;
? A clear description of above-mentioned aspects is lacking OR only target population involved OR doubtful design or method;
- No target population involvement;
0 No information found on target population involvement.

2. Internal consistency
Definition: The extent to which items in a (sub)scale are intercorrelated, thus measuring the same construct.
Quality criteria:
+ Factor analyses performed on adequate sample size (7 * number of items and ≥100) AND Cronbach’s alpha(s) calculated per dimension AND Cronbach’s alpha(s) between 0.70 and 0.95;
? No factor analysis OR doubtful design or method;
- Cronbach’s alpha(s) <0.70 or >0.95, despite adequate design and method;
0 No information found on internal consistency.

3. Criterion validity
Definition: The extent to which scores on a particular questionnaire relate to a gold standard.
Quality criteria:
+ Convincing arguments that gold standard is ‘‘gold’’ AND correlation with gold standard ≥ 0.70;
? No convincing arguments that gold standard is ‘‘gold’’ OR doubtful design or method;
- Correlation with gold standard <0.70, despite adequate design and method;
0 No information found on criterion validity.

4. Construct validity
Definition: The extent to which scores on a particular questionnaire relate to other measures in a manner that is consistent with theoretically derived hypotheses concerning the concepts that are being measured.
Quality criteria:
+ Specific hypotheses were formulated AND at least 75% of the results are in accordance with these hypotheses;
? Doubtful design or method (e.g., no hypotheses);
- Less than 75% of hypotheses were confirmed, despite adequate design and methods;
0 No information found on construct validity.

5. Reproducibility
5.1. Agreement
Definition: The extent to which the scores on repeated measures are close to each other (absolute measurement error).
Quality criteria:
+ MIC < SDC OR MIC outside the LOA OR convincing arguments that agreement is acceptable;
? Doubtful design or method OR (MIC not defined AND no convincing arguments that agreement is acceptable);
- MIC ≥ SDC OR MIC equals or inside LOA, despite adequate design and method;
0 No information found on agreement.
5.2. Reliability
Quality criteria:
+ ICC or weighted Kappa ≥ 0.70;
? Doubtful design or method (e.g., time interval not mentioned);
- ICC or weighted Kappa < 0.70, despite adequate design and method;
0 No information found on reliability.

6. Responsiveness
Definition: The ability of a questionnaire to detect clinically important changes over time.
Quality criteria:
+ SDC or SDC < MIC OR MIC outside the LOA OR RR > 1.96 OR AUC ≥ 0.70;
? Doubtful design or method;
- SDC or SDC ≥ MIC OR MIC equals or inside LOA OR RR < 1.96 OR AUC < 0.70, despite adequate design and methods;
0 No information found on responsiveness.

7. Floor and ceiling effects
Definition: The number of respondents who achieved the lowest or highest possible score.
Quality criteria:
+ <15% of the respondents achieved the highest or lowest possible scores;
? Doubtful design or method;
- >15% of the respondents achieved the highest or lowest possible scores, despite adequate design and methods;
0 No information found on interpretation.

8. Interpretability
Definition: The degree to which one can assign qualitative meaning to quantitative scores.
Quality criteria:
+ Mean and SD scores presented of at least four relevant subgroups of patients and MIC defined;
? Doubtful design or method OR less than four subgroups OR no MIC defined;
0 No information found on interpretation.

MIC = minimal important change; SDC = smallest detectable change; LOA = limits of agreement; ICC = intraclass correlation; SD = standard deviation.
a: + = positive rating; ? = indeterminate rating; - = negative rating; 0 = no information available.
b: Doubtful design or method = lacking a clear description of the design or methods of the study, sample size smaller than 50 subjects (should be at least 50 in every (subgroup) analysis), or any important methodological weakness in the design or execution of the study.
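As a concrete illustration of the internal consistency criterion in Table 8.2-2 (Cronbach’s alpha per dimension, ideally between 0.70 and 0.95), the following Python sketch computes alpha for a simulated five-item subscale; the data are entirely artificial.

    import numpy as np

    def cronbach_alpha(scores: np.ndarray) -> float:
        # scores: rows = respondents, columns = items of one (sub)scale
        k = scores.shape[1]
        item_variances = scores.var(axis=0, ddof=1)
        total_variance = scores.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 1))                  # one underlying construct
    items = np.clip(np.round(2 + latent + rng.normal(scale=0.8, size=(200, 5))), 0, 4)

    alpha = cronbach_alpha(items)
    print(f"alpha = {alpha:.2f}",
          "within 0.70-0.95" if 0.70 <= alpha <= 0.95 else "outside 0.70-0.95")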
8.3 Appendix 3 – The sources of the instruments reviewed.
Instrument abbreviation (name): objective

AQoL (Assessment of Quality of Life): The AQoL is a multi-attribute utility measure for use in economic evaluation, measuring health-related quality of life. The descriptive system can be used to provide health profiles.

NHP (Nottingham Health Profile): To provide a brief indication of a patient’s perceived emotional, social and physical health problems.

SF-12® / SF-12v2™ (SF-12® Health Survey and SF-12v2™ Health Survey): Developed to be a much shorter, yet valid, alternative to the SF-36 for use in large surveys of general and specific populations as well as large longitudinal studies of health outcomes.

SF-36® / SF-36v2™ (SF-36® Health Survey and SF-36v2™ Health Survey): The SF-36 was developed during the Medical Outcomes Study (MOS) to measure generic health concepts relevant across age, disease, and treatment groups. The SF-12 is a subset of the SF-36.

HUI®: HUI3 (Health Utilities Index): To describe health status, measure within-attribute morbidity and health-related quality of life, and produce utility scores. There are three versions of the HUI: HUI, HUI2 and HUI3.

WHOQoL-BREF (World Health Organisation Quality of Life assessment instrument): To assess individuals’ perceptions of the quality of their life.

QWB-SA (Quality of Well-Being scale Self-Administered): To measure health-related quality of life, to monitor the health of populations over time, or to evaluate the efficacy and effectiveness of clinical therapies or practices using a preference-weighted self-administered measure.

EQ-5D (EuroQol EQ-5D): To assess health outcome from a wide variety of interventions on a common scale, for purposes of evaluation, allocation and monitoring. Of note, the EQ-5D-3L consists of the original EQ-5D descriptive system and the EQ visual analogue scale (EQ-VAS).

HOWSYOURHEALTH? (HowsYourHealth): Research has shown that basic information tailored to the needs of the respondent and their doctor or nurses is most likely to make communication better, place everyone “on the same page”, and increase confidence with self-care.
Instrument Cost Notes:
AQoL: “There are no fees associated with registration
or use of any of the Centre for Health Economics AQoL
instruments.” Information retrieved Jan 3, 2013 at:
http://www.aqol.com.au/documents/v4_AQoL_User_
Registration_Form_090710.pdf.
There are four identified versions of the AQoL available
online: AQoL-8D; AQoL-7D; AQoL-6D; AQoL-4D.
Information retrieved Jan 3, 2013 at: http://www.aqol.com.
au/choice-of-aqol-instrument.html.
Nottingham Health Profile: “No fees apply for
commercial / pharmaceutical companies” Information
retrieved Jan 3, 2013 at: http://www.outcomesdatabase.
org/node/669
SF12 and SF36: “All use of QualityMetric health surveys,
scoring algorithms, translations and benchmarking data
requires a signed license agreement.” Information retrieved
Jan 6, 2013 at http://www.sf-36.org/wantsf.aspx?id=1
Licensing fee information request form available at:
http://www.qualitymetric.com/RequestInformation/
SurveyInformationRequest/tabid/263/Default.aspx
(Retrieved Jan 6, 2013)
A sample of the SF12 can be reviewed (and completed),
but not downloaded, at: http://www.qualitymetric.com/
demos/TP_Launch.aspx?SID=52304 (Information retrieved
Jan 6, 2013)
A sample of the SF-36 can be reviewed (and completed),
but not downloaded, at: http://www.qualitymetric.com/
demos/TP_Launch.aspx?SID=100 (Information retrieved
Jan 6, 2013)
Health Utilities Index: Health Utilities Inc. does not
grant permission for copies of its proprietary materials
(e.g. questionnaires) to be distributed in grant proposals
or reports. The HUI Application and Interpretation
Package costs $CAN 4,000.00. This package includes
initial consultation to determine the most appropriate
questionnaires for use in a specific study and permission
to use one version of an HUI questionnaire and the
matching procedures manual set in one study. Additional
questionnaires and manuals are available for use at a cost
of $CAN 2,000.00 each per study. If the study requires more
than one questionnaire the fee schedule becomes more
complicated. For example, a study using two self-completed
questionnaires (e.g. self-complete and self-assessed
in both English and Spanish) will cost $CAN 6,000.00)
while a study using a self-completed and an intervieweradministered questionnaire can cost $CAN 8,000.00 (fee
reflects payment for one additional questionnaire and one
additional manual). HUI grants permission for use of its
proprietary materials (e.g. instrumentation) one study at a
time. (Additional information regarding the fee 55 schedule
is on the web site: http://healthutilities.biz/fees.htm .Source:
Email correspondence with Mr. Bill Furlong, General
Manager Health Utilities Inc, 24/09/2012.)
WHOQoL-BREF can be downloaded freely from the
World Health Organisation website. Information retrieved
January 3, 2013 at: http://www.who.int/substance_abuse/
research_tools/whoqolbref/en/ . The US WHOQoL
Center distributes the WHOQoL-BREF US English Version
(June 1997), with scoring instructions, free of charge as
electronic files. Information retrieved Jan 3, 2013 at: http://
depts.washington.edu/seaqol/WHOQOL-BREF
Quality of Well-Being Scale – Self-Administered
(QWB-SA): Copyright Agreement For Non-Profits
appears to indicate free use with restrictions. Information
retrieved Jan 6, 2013 at: https://hoap.ucsd.edu/qwb-info/
NotforProfit-Copyright.pdf. Terms and cost for use by non-profits indicate: “Scoring - HSRC Scoring - $57/hr or an algorithm for scoring can be purchased for $240. The scoring instructions are free of charge.” Information
retrieved Jan 6, 2013 at: https://hoap.ucsd.edu/qwb-info/
price-nonprofit.aspx Quality of Well-Being Scale – Self
Administered (QWB-SA) tool can be downloaded for
review at https://hoap.ucsd.edu/qwb-info/# (Retrieved Jan
6, 2013). There are two English versions. English 2-page
paper version “QWB-SA V1.04” available at: https://hoap.
ucsd.edu/qwb-info/EnglishQWB-SA_2.pdf (Retrieved Jan
6, 2013) English 4-page paper version “QWB-SA V1.04”
available at: https://hoap.ucsd.edu/qwb-info/EnglishQWB-SA_4.pdf (Retrieved Jan
EQ-5D: “Licensing fees are determined by the EuroQol
Executive Office on the basis of the user information
provided on the registration form. The amount is dependent
upon the type of study/trial/project, funding source, sample
size and number of requested languages. You are not
obligated to purchase by registering.” Information retrieved
Jan 3, 2013 at: http://www.euroqol.org/eq-5d-products/how-to-obtain-eq-5d.html. The User Guide can be downloaded
from www.euroqol.org homepage.
PROMIS Short Form v1.0 - Global Health Scale can be
obtained for free by email request. Information retrieved
Jan 3, 2013 at: https://www.assessmentcenter.net/PromisForms.aspx. While it appears use of the GHS is free, users must “agree that the PROMIS Health Organisation and PROMIS Cooperative Group provide access to PROMIS instruments (e.g. item banks, short forms, profile measures) subject to the PROMIS Terms and Conditions (PTAC).” Information retrieved Jan 3, 2013 at: http://www.assessmentcenter.net/documents/PROMIS%20Terms%20and%20Conditions%20v7.3.pdf.
8.4 Appendix 4 – Measuring Patient
Experiences in Primary Health
Care Survey (Canadian) (see
attached document)
8.5 Appendix 5 – The June 2014
update of the AHRQ review
results of the instrument in the
care coordination measurement
(Care Coordination Atlas) (see
attached document)
8.6 Appendix 6 – Social Media and IC
8.6.1 Using social media to capture patient experience
There is a growing number of studies that used social
media as a means to capture the patient experience.
The “traditional” methods have significant limitations
which include social desirability bias, a time lag between
experience and measurement and difficulty reaching
large groups of people. Information on social media could
be of value to overcome these limitations as this type of
media are easy to use and are used by the majority of the
population. Furthermore, an increasing number of people
share health care experiences online or rate the quality of
their health care provider on physician rating sites.
The question is whether this information is relevant to
determining or predicting the quality of health care and
how to maximise the value and minimise the risk of this
information. Verhoef and colleagues (2014)246 recently
conducted a scoping review which aimed to:
(1) map the literature on the association between social
media and quality of care,
(2) identify different mechanisms of this relationship, and
(3) determine a more detailed agenda for this relatively
new research area.
The authors described and categorised the studies according to the type of social media used and the related research questions, and consulted national and international stakeholders on the study results. The review identified 29 articles, of which 21 were concerned with health care rating sites. Several studies indicated a relationship between the information on social media and quality of health care. However, some
limitations exist, especially regarding the use of rating sites. For example, since rating is anonymous, rating values are not risk adjusted and are therefore vulnerable to fraud. Also, ratings are often based on only a few reviews and are predominantly positive. Furthermore, people providing feedback on health care via social media are presumably not always representative of the patient population.

Social media, and particularly rating sites, are an interesting new source of information about quality of care from the patient’s perspective. However, this new source should be used to complement traditional methods, since measuring quality of care via social media has other serious limitations. Future research should explore whether social media is suitable in practice for patients, health insurers, and governments to help them judge the quality performance of professionals and organisations.

8.6.2 Social media in chronic disease management
There is also increased interest in using social media in chronic disease management. Merolli and colleagues (2013)247 reviewed the literature on the effect of using social media in chronic disease management. Amongst the 19 studies identified, 10 were from the USA, 6 from Europe and 3 from Asia/Pacific. Of these studies, 10 can be classified as online support groups (OSG), such as discussion forums, bulletin boards and chat tools, 5 as social network sites (SNS), 1 as blogs and 1 as virtual worlds.
Of the studies that focused on a single chronic condition (N = 11), five explored cancer (two specifically breast cancer),248-252 three examined chronic pain-related conditions,253-255 one studied HIV/AIDS,256 one was on diabetes257 and one on rheumatoid arthritis.258 The remaining studies (N = 8) either explored multiple chronic disease groups or examined chronic disease in general.259-266
Of these studies, 12 occurred on earlier social platforms such as OSG.249 250 252-254 256 259 261 263-266 Three of the five SNS studies were created specifically for the purpose of an intervention,251 255 258 while the others utilised the existing platforms of Facebook257 and DailyStrength.260
Various reasons for using SNS included: enhancing self-management of disease;255 driving support and social interaction;251 258 ‘friending’ other participants;255 video-sharing (as well as user profile creation, photos and narratives);251 and fundraising, awareness, promotion and support267 (via the pre-existing SNS, Facebook). In the case of patients with type 1 diabetes, it appears that group members primarily use the platform to share information, request information from others and offer each other support.257 DailyStrength,260 a health-related
SNS for patients and carers, centers around the formation
of different support groups. Analysis of the conversation
content from support groups for breast cancer, diabetes
and fibromyalgia found the most common usage patterns
centered around ‘support’. The authors suggest that
in poorly understood and socially stigmatised chronic
diseases, support is the main activity or conversation to
occur, as well as coping and fitting within society.260
One study used blogging as a form of emotional catharsis
in chronic disease management.248 It appears that breast
cancer sufferers are one patient population that has
the most to gain from narrating their experiences. For
cancer sufferers, blogging has been used as a means
to self-manage emotions, problem-solve and share
information.248
One study investigated the potential of delivering
psychosocial interventions via a virtual world, such as
Second Life.262 It is suggested that virtual environments
may provide an ideal platform for the delivery of these
interventions for conditions such as depression and
other chronic diseases involving significant psychosocial
components.262
Table 8.6-1 Platforms and reported effects/outcomes (adapted from Merolli et al. 2014247)
The table cross-tabulates the social media platforms studied (OSG, comprising discussion forums/bulletin boards/chat and custom-built groups; the SNS Facebook and DailyStrength; blogs; and virtual worlds) against the effects/outcomes reported, citing the relevant studies in each cell. The reported effects/outcomes comprise: engagement/participation; social interactions (peer support; empowerment, including being better informed, social well-being, confidence in treatment, confidence with the practitioner, self-esteem, network building, acceptance and belonging, understanding, validation, optimism and control; information sharing); disease-specific knowledge; psychosocial impacts (emotional burden, catastrophising, pain-induced fear, depression, anxiety, stress, emotional expression and distress, coping, self-efficacy, psychosocial QoL); and physical condition impacts (pain severity, pain-related interference/limitation, perceived disability, functional limitation, physical QoL).
8.6.3 Silver lining of the ‘cloud of patient
experience’270 in a ‘perfect storm’271?
A growing number of patients are using the internet
to describe their experiences of healthcare.272 A recent
study showed a consistent increase in online voting on
the NHS Choices website272 (Figure 8.6-1). Comparison
between structured patient rating data online and patient
surveys has already shown significant associations at the
hospital level.273 274 The increasing availability of patients’
accounts of their care on blogs, social networks, Twitter
and hospital review sites presents an unprecedented
yet intriguing opportunity to advance the PCC agenda and
provide novel quality of care data. Greaves and colleagues
(2013) described this concept as the ‘cloud of patient
experience’.270 The authors argued that the collection and
aggregation of patients’ descriptions of their experiences
on the internet could be used to detect poor clinical
care. The techniques of natural language processing and
sentiment analysis can be used to transform unstructured
descriptions of patient experience on the internet into
usable measures of healthcare performance. These
analyses, if automated, can provide real-time feedback
into the healthcare system. In another commentary,
Rozenblum and Bates (2013)271 contend that the three
domains—patient-centered healthcare, social media and
the internet— are beginning to come together and have
the potential to create a major shift in how patients and
healthcare organisations connect, in effect, the ‘perfect
storm’, a phrase that has been used to describe a situation
in which a rare combination of circumstances results in an
event of unusual magnitude.275
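To make the idea of processing the ‘cloud of patient experience’ concrete, the sketch below shows, in Python, how free-text patient comments might be turned into a crude sentiment-based signal per hospital. It is a minimal illustration only, not the method used in the studies cited here: the tiny keyword lexicon, the example comments and the hospital names are all hypothetical, and a real system would rely on a validated sentiment model and properly sourced data.

```python
from collections import defaultdict

# Hypothetical, minimal sentiment lexicon; a real implementation would use a
# validated sentiment model rather than keyword counts.
POSITIVE = {"caring", "clean", "excellent", "helpful", "kind", "quick"}
NEGATIVE = {"dirty", "rude", "delay", "ignored", "unsafe", "painful"}

def comment_sentiment(text):
    """Score a single comment: +1 positive, -1 negative, 0 neutral/mixed."""
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return (score > 0) - (score < 0)

def hospital_sentiment(comments):
    """Aggregate (hospital, comment) pairs into a mean sentiment score per hospital."""
    scores = defaultdict(list)
    for hospital, text in comments:
        scores[hospital].append(comment_sentiment(text))
    return {h: sum(s) / len(s) for h, s in scores.items()}

# Hypothetical example data for illustration only.
comments = [
    ("Hospital A", "Staff were kind and the ward was clean"),
    ("Hospital A", "Long delay in ED and I felt ignored"),
    ("Hospital B", "Excellent, quick and helpful treatment"),
]
print(hospital_sentiment(comments))
```

In practice such scores would only become useful once aggregated over large volumes of comments and tracked over time, which is the point of the automated, real-time feedback described above.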
Figure 8.6-1 Cumulative number of online ratings of hospitals in England on the NHS Choices website (y-axis: cumulative number of hospital ratings, 0 to 25,000; x-axis: August 2008 to December 2011).
Few studies have investigated social media’s potential in
chronic disease. However, the existing studies showed a
positive impact on health status and patient experience
with none indicating any adverse events. Benefits
have been reported for psychosocial management via
the ability to foster support and share information.
There is less evidence of benefits for physical condition
management. These studies covered a very limited
range of social media platforms and there is an ongoing
propensity towards reporting investigations of earlier
social platforms, such as OSG, discussion forums and
message boards. For social media to form a more
meaningful part of effective chronic disease management,
interventions need to be tailored to the individualised
needs of patients, in particular with respect to identity,
flexibility, structure, narration and adaptation. High
methodological quality is required to investigate
how social media can best serve chronic disease
management.247
However, despite the great potential of harnessing the
‘cloud of patient experience’, the field is relatively new,
particularly in the healthcare sector. A study by Naslund
and colleagues (2014)276 used qualitative inquiry informed
by emerging techniques in online ethnography and
analysed 3,044 comments posted in 19 videos on YouTube
which were uploaded by individuals who self-identified as
having schizophrenia, schizoaffective disorder, or bipolar
disorder. The authors found peer support across four
themes: minimising a sense of isolation and providing
hope; finding support through peer exchange and
reciprocity; sharing strategies for coping with day-to-day
challenges of severe mental illness; and learning from
shared experiences of medication use and seeking mental
health care. These broad themes are consistent with
accepted notions of peer support in severe mental illness
as a voluntary process aimed at inclusion and mutual
advancement through shared experiences and developing
a sense of community. The study results suggested that
the lack of anonymity and associated risks of being
identified as an individual with severe mental illness on
YouTube seemed to be overlooked by those who posted
comments or uploaded videos. However, the authors noted that whether this platform can provide benefits for a wider community of individuals with severe mental illness remains uncertain.
The various sources of information that could be used in harnessing the ‘cloud of patient experience’ and the strengths and
limitations of the approach are presented (Table 8.6-2).
Table 8.6-2 Potential sources of information for the ‘cloud of patient experience’

Rating and feedback websites (examples: RateMDs277, Patient Opinion278, Iwantgreatcare279)
Information that could be used: ratings and free text descriptions of healthcare providers and individual clinicians.
Advantages: comments usually directly relate to the care experience.
Disadvantages: comparatively low usage; possibility of deliberate gaming.

Patient networks, discussion fora and blogs (examples: Patientslikeme280, Mumsnet281, Epatients.net282)
Information that could be used: patients’ and carers’ shared descriptions of their care and experiences.
Advantages: authentic voice of the patient; often well used in specific patient communities.
Disadvantages: may be a selection bias towards particular demographics (with higher socio-economic status) or interest groups.

Micro-blogs (example: Twitter283)
Information that could be used: tweets (short messages) directed towards hospitals or care providers.
Advantages: high volume of traffic, often tagged with the service they relate to.
Disadvantages: short, unstructured messages may contain minimal information about care quality.

Social networks (examples: Facebook284, Google+285)
Information that could be used: comments left on hospital or friends’ pages about care, or specific signals of appreciation (eg likes, ‘+1’s).
Advantages: high membership and usage by the public.
Disadvantages: the public rarely talks about healthcare on these platforms; content may be from employees rather than patients.

(Adapted from: Greaves et al. 2013270)
The NHS’s public reporting website—known as NHS
Choices—has been incorporating anecdotal comments from
patients regarding primary and hospital care since 2007286.
Publicly reporting patients’ narrative comments along with
numerical ratings of their experience and clinical quality
metrics presents opportunities as well as challenges for
reporting systems (Table 8.6-3).
Table 8.6-3 Questions Asked for Anecdotal Comments and Ratings on NHS Choices

Ratings on a scale (all on a 1-5 Likert-type scale)

Hospitals:
• How likely are you to recommend this hospital to friends and family if they needed similar care or treatment?
• How satisfied are you with the cleanliness of the area you were treated in?
• How satisfied are you that the hospital staff worked well together?
• How satisfied are you that you were treated with dignity and respect by the staff?
• How satisfied are you that you were involved in decisions about your care?
• How satisfied are you that the hospital provides same-sex accommodation?

Primary care practices:
• How likely are you to recommend this GP surgery to friends and family if they needed similar care or treatment?
• Are you able to get through to the surgery by telephone?
• Are you able to get an appointment when you want one?
• Do the staff treat you with dignity and respect?
• Does the surgery involve you in decisions about your care and treatment?
• This GP practice provides accurate and up to date information on services and opening hours.

Free text commentary (hospitals and primary care practices):
• Give your opinion in your own words. The more detail you can give, the more useful your review will be.

(Source: Greaves et al. 2014286)
Greaves and colleagues (2014)286 provided an in-depth review, based on the UK experience, of the five key design considerations for publicly reporting anecdotal comments, including how to collect, moderate and display comments and how to encourage the public and health care providers to use them. The
authors concluded that while anecdotal comments might
represent an untapped seam of valuable information
about service quality and a potential hook for engaging
patients to use comparative performance data, the jury is
still out on where narrative comments fit in the complex
landscape of quality measurement and reporting.
Another study (also by Greaves and colleagues, 2014)287
examined whether Tweets sent to NHS hospitals
contained information about quality of care. The authors
also compared sentiment on Twitter regarding hospitals
with established survey measures of patient experience
and standardised mortality rates. Using a mixed-method
study including a quantitative analysis of all 198,499
tweets sent to English hospitals over a year and a
qualitative directed content analysis of 1,000 random
tweets, the authors found that 11% of tweets to hospitals
contained information about quality of care, with the most
frequent topic being patient experience (8%). Comments
on effectiveness or safety of care were present, but less
common (3%). Of all quality of care related tweets, 77%
were positive in tone. Other topics mentioned in tweets
included messages of support to patients, fundraising
activities, self-promotion and dissemination of health
information. No associations were observed between
Twitter sentiment and conventional quality metrics.
The study showed that only a small proportion of tweets
directed at hospitals discussed quality of care and there
was no clear relationship between Twitter sentiment and
other measures of quality, potentially limiting Twitter as
a medium for quality monitoring. However, tweets did
contain useful information to target quality improvement
activity. Given that the UK Department of Health’s Information Strategy suggests the use of social media aggregation and sentiment analysis to provide a rapid indicator of hospital performance and early warnings of poor care288, and that the NHS has recently started aggregating and publishing social media sentiment about the NHS on a public website289, the recent enthusiasm of policy makers for using social media as a quality monitoring and improvement tool needs to be carefully considered and subjected to formal evaluation.287
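As an illustration of the kind of check performed in that study, the hedged sketch below compares a hospital-level Twitter sentiment score (for example, the proportion of positive tweets) with a conventional patient experience survey score using a rank correlation. All figures are invented for illustration; the study itself found no clear association between Twitter sentiment and conventional quality metrics.

```python
from scipy.stats import spearmanr  # rank correlation between the two hospital-level measures

# Hypothetical hospital-level data: proportion of quality-related tweets that were
# positive, and a conventional patient experience survey score (0-100). Values are invented.
twitter_positive = {"Hospital A": 0.71, "Hospital B": 0.80, "Hospital C": 0.62, "Hospital D": 0.75}
survey_score = {"Hospital A": 78.0, "Hospital B": 74.5, "Hospital C": 81.2, "Hospital D": 76.3}

hospitals = sorted(twitter_positive)
rho, p_value = spearmanr(
    [twitter_positive[h] for h in hospitals],
    [survey_score[h] for h in hospitals],
)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.2f}")
```

In practice such a comparison would also need to account for tweet volume, case mix and the small share of tweets that actually concern quality of care, which is part of why the evidence base remains weak.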
The current evidence suggests that real-time patient feedback harnessed from social media cannot be deemed a perfect test of an organisation’s clinical performance. However, the soft intelligence provided by this proposed approach, capturing and processing the cloud of patient experience, offers another way to look at health quality: not just clinical quality, but areas such as dignity and respect, cleanliness of the care environment, timeliness and efficiency of care, as well as ideas for improvement. In particular, the potential to channel the wisdom of patients in real time to create unique insights into the quality and safety of care, without expensive new infrastructure, is appealing.
8.7 Appendix 7 - Is IC cost-effective?
There are two separate important reviews, by researchers from the WHO7 and the Centre for Health Economics, University of York290, on the cost-effectiveness of IC. Despite the high expectations of the potential cost savings promised by the concept of ‘IC’, both reports found that there was relatively weak or no evidence to support the notion that IC is cost-effective, and many reasons have been offered to explain why. The reports also highlighted the paucity of high quality research on the effectiveness and cost-effectiveness of different IC models; the results from both reviews should perhaps be interpreted as ‘absence of evidence’ rather than ‘evidence of absence’.
The key messages from the WHO review report on
the cost-effectiveness of IC (paraphrased from Nolte &
Pitchforth 20147) include:
• The rising burden of chronic disease and the number of people with complex care needs require the development of delivery systems that bring together a range of professionals and skills from both the cure (healthcare) and care (long-term and social-care) sectors.
• Evidence that is available points to a positive impact of IC programs on the quality of patient care and improved health or patient satisfaction outcomes, but uncertainty remains about the relative effectiveness of different approaches and their impacts on costs.
• This review of published reviews confirms earlier reports of the shortage of robust evidence on the economic impact of IC.
• The term ‘IC’ is often not specifically examined; the most common concepts or terms were case management, care coordination, collaborative care or a combination of these.
• Utilisation and cost were the most common economic outcomes assessed by reviewers, but reporting of measures was inconsistent and the quality of the evidence was often low.
• There is evidence of cost-effectiveness of selected IC approaches but the evidence base remains weak.
• There may be a need to revisit our understanding of what IC is and what it seeks to achieve, and the extent to which the strategy lends itself to evaluation in a way that would allow for the generation of clear cut evidence.
• It is important to come to an understanding as to whether IC is to be considered an intervention or whether it is to be interpreted and evaluated as a complex strategy to innovate and implement long-lasting change in the way services in the health and social-care sectors are delivered, involving multiple changes at multiple levels.
8.8 Appendix 8 - Patient experience
framework
In order to operationalise the measurement of patient experience, a framework is needed to guide such activities. The framework needs to be specific, clearly defined and operable given its underlying concepts. A number of generic frameworks have been developed, among them the Picker principles (developed from the work of Gertis and colleagues291), the IOM framework, the National Health Council (2004) and the International Alliance of Patients’ Organisations292-296. Some of these frameworks are outlined below.
8.8.1 The Picker principles
Picker Europe and Picker Institute (USA) are dedicated
to advancing the principles of PCC. The eight Picker
Principles of PCC are:
1) access to reliable health advice
2) effective treatment by trusted professionals
3) participation in decisions and respect for preferences
4) clear, comprehensive information and support for self-care
5) attention to physical and environmental needs
6) emotional support, empathy and respect
7) involvement of and support for family and carers
8) continuity of care and smooth transitions.
8.8.2 NHS Patient Experience Framework
In October 2011, the NHS National Quality Board (NQB)
agreed on a working definition of patient experience to
guide the measurement of patient experience across the
NHS. This framework outlines those elements which are
critical to the patients’ experience in NHS Services. As noted
by the Department of Health, this framework is based on a
modified version of the Picker Institute Principles of PCC, an
evidence-based definition of a good patient experience.297
• Respect for patient-centered values, preferences and expressed needs, including: cultural issues; the dignity, privacy and independence of patients and service users; an awareness of QoL issues; and shared decision making.
• Coordination and integration of care across the health and social care system.
• Information, communication and education on clinical status, progress, prognosis and processes of care in order to facilitate autonomy, self-care and health promotion.
• Physical comfort, including pain management, help with the activities of daily living, and clean and comfortable surroundings.
• Emotional support and alleviation of fear and anxiety about such issues as clinical status, prognosis and the impact of illness on patients, their families and their finances.
• Welcoming the involvement of family and friends, on whom patients and service users rely, in decision-making and demonstrating awareness and accommodation of their needs as care-givers.
• Transition and continuity as regards information that will help patients care for themselves away from a clinical setting, and coordination, planning and support to ease transition.
• Access to care, with attention, for example, to time spent waiting for admission or time between admission and placement in a room in an in-patient setting, and waiting time for an appointment or visit in the out-patient, primary care or social care setting.
The UK government has defined ‘IC’ as ‘person-centred coordinated care’, based on the narratives developed by National Voices.298 The accompanying narrative of ‘I statements’ sets out a user-based perspective of how IC should be experienced (National Voices 2013).298 The key dimensions of ‘IC’ were also described (Figure 8.8-1). The government expects this definition of integration, as coordinated and personalised care, to be adopted and delivered by all localities through health, social care and other services.
Figure 8.8-1 Dimensions included in IC (i.e. person-centred coordinated care) narratives (National Voices, 2013). The central ‘I statement’ reads: ‘I can plan my care with people who work together to understand me and my carer(s), allow me control, and bring together services to achieve the outcomes important to me.’ The surrounding dimensions are care planning, information, my goals/outcomes, communication, transitions and decision making.
It should be noted that such a definition is extremely complex; it imposes the perspective of users as its organising principle and stands in contrast to ‘service integration’ models, which focus more on the interactions between, or across, specific services such as health and social care.191 298
8.8.3 IOM Patient experience framework
The IOM report defined PCC as one of the key dimensions of quality of care. A narrative commentary on each of the themes was provided by Staniszewska and colleagues (2014).296
Table 8.8-1 A narrative commentary on IOM patient experience framework

Compassion, empathy and responsiveness: Compassion and empathy were both important themes but appeared in more subtle forms within a number of wider generic themes, e.g. communication. Responsiveness emerged as a generic theme but was focused on the responsiveness of the service and the need for an individualised approach.

Co-ordination and integration: These themes were important but fitted more appropriately into the wider generic themes of continuity of care and responsiveness.

Information, communication and education: Information and communication emerged as two key themes but were separated to reflect the different content of the sub-themes identified. Education appeared in a number of the generic themes in different ways, including within support and information.

Physical comfort: Physical comfort was important but appeared in other more substantive generic themes, including responsiveness and lived experience.

Emotional support, relieving fear and anxiety: Emotional support was included in a much larger category of support. Elements of fear and anxiety were more subtle and appeared as part of a broader lived experience.

Involvement of family and friends: The role of family and friends was important and appeared in broader themes of lived experience and support.

(Source: Staniszewska and colleagues (2014)299)
8.8.4 The Warwick patient experience
framework (WaPEF)
The National Institute for Health and Clinical Excellence
(NICE) produces clinical guidelines, quality standards,
public health and health technology appraisal guidance
for the NHS in the UK. NICE commissioned the National
Clinical Guideline Centre (NCGC) to produce clinical
guidance and a quality standard on patient experience
in adult NHS services. As part of the process of developing the NICE guideline and quality standard ‘Patient experience in adult NHS services: improving the experience of care for people using adult NHS services’300, there was uncertainty about the robustness of published frameworks. A scoping review was therefore undertaken by the Royal College of Nursing Research Institute at the University of Warwick, in collaboration with the NCGC, to identify qualitative evidence of patient experience in adult cancer, diabetes and cardiovascular disease populations. As a result, based on the IOM framework, a new patient experience framework (namely, the Warwick Patient Experience Framework (WaPEF)) was developed and described299 (see Table 8.8-2).
Table 8.8-2 The Warwick Patient Experiences Framework

Patient as active participant: Reflects the role of patients as potential active participants in their health care, co-creators and co-managers of their health and use of services; responsible for self-care, participators in health care, shared decision-makers, self-managers, risk managers and life-style managers. Confidence in self-management is critical. Associated with issues of power and control.

Responsiveness of services (an individualised approach): Needing to be seen as a person within the health care system. The responsiveness of health services in recognising the individual and tailoring services to respond to the needs, preferences and values of patients, taking into account both shared requirements and individual characteristics (such as individuals’ expectations of the service, cultural background, gender, and subtle issues such as preferences for humour). Includes how well clinical needs are met (e.g. pain management) and evaluation of how well services perform from a patient perspective.

Lived experience: The recognition that individuals are living with their condition and experiencing it in a unique way, that family and broader life need to be taken into account and that all of these aspects of lived experience can affect self-care. Taking into account individual physical needs and cognitive needs because of condition. Everyday experiences, hopes, expectations, future uncertainty, feelings of loss, feelings of being morally judged and feelings of blame. Some of these experiences originate ‘outside’ of the health care system but are brought with the patient into the health system; other experiences may be affected by attitudes and expectations of health professionals.

Continuity of care and relationships: Initiating contact with services, interpretation of symptoms, co-ordination, access (barriers to), and availability of services, responsiveness of services and feelings of abandonment (when treatment ends or support is not made available). Being known as a person rather than ‘a number’. Trust in health care professional built up over time. Recognition/questioning of expertise of health care professional. Respect, including respect for patient’s expertise. Partnership in decision-making. Issues of power and control.

Communication: Needing to be seen as an individual; communication style and format (e.g. over telephone or in person); skills and characteristics of health care professional; body language (which can convey different information from that spoken); two-way communication and shared decision-making; compassion, empathy; the importance of the set-up of consultation (e.g. appropriate time for questions, appropriate physical environment and number of people present). Listening and paying attention to the patient. Enabling questions and providing answers.

Information: Information to enable self-care and active participation in health care, importance of information in shared decision-making, tailored information to suit the individual, patient wanting/not wanting information and timely information. Sources of information, including outside the health service (e.g. peer-support, internet). Quality of information. Sources of further information and support. Developing knowledge and understanding, and making sense of one’s health.

Support: Different preferences for support: support for self-care and individual coping strategies. Education. Need for emotional support, and need for hope. Responsiveness of health care professionals to individual support needs (may vary according to gender, age and ethnicity). Importance of peer-support, groups and voluntary organisations. Practical support. Family and friends support. Role of advocacy. Feeling over-protected, not wanting to be a burden.
The WaPEF summarises a complex patient experience
evidence base. The narrative description of each theme
in Table 8.8-2 is illustrative rather than exhaustive.
Furthermore, the themes and sub-themes contained in the
generic framework are complex and many connections
exist between them.
While the authors recommend possible international use of such a framework, they suggest that future international use of the WaPEF should incorporate separate reviews of themes and subthemes in particular country settings or populations, as the WaPEF has particular relevance to the UK setting. In addition, future studies should include a more rigorous quality assessment of the studies included in the evidence base.
8.8.5 The AHRQ ‘care coordination’ framework
All the above frameworks are generic and intended for the
patient population as a whole without specifically aiming
for ‘IC’ or ‘care coordination’, except for the National Voices framework (Figure 8.8-1). There is growing support for ‘care coordination’ concepts and practice in the USA by several
national organisations including the AHRQ, the IOM, and
the American College of Physicians, amongst others. While
evidence is starting to build about the mechanisms by
which care coordination contributes to patient-centered
high-value, high-quality care, the health care community
is currently struggling to determine how to measure the
extent to which this vital activity is, or is not, occurring.
To address this challenge, the AHRQ launched a research
project in 2010 aiming to develop an atlas to help
evaluators identify appropriate measures for assessing
care coordination interventions in research studies and
demonstration projects, particularly those measures focusing
on care coordination in ambulatory care. This effort involved
three steps: 1) developing a ‘care coordination’ framework;
2) identifying all measures that include the domains and
components of the ‘care coordination’ framework; 3) mapping
different measurement instruments into the framework.
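To make the mapping step (step 3) concrete, the sketch below shows one way such an atlas could be represented programmatically so that an evaluator can query which instruments cover a given domain and perspective. It is an illustrative Python sketch only: the instrument names and their mappings are hypothetical and are not the Atlas’s actual content.

```python
# Hypothetical, illustrative mapping of measurement instruments onto the AHRQ
# care coordination framework (domains and perspectives); not the Atlas's actual mappings.
atlas = {
    "Example Care Coordination Survey": {        # hypothetical instrument name
        "perspective": "patient/family",
        "domains": ["Communicate", "Facilitate Transitions", "Create a Proactive Plan of Care"],
    },
    "Example Team Process Checklist": {          # hypothetical instrument name
        "perspective": "health care professional(s)",
        "domains": ["Teamwork Focused on Coordination", "Monitor, Follow Up, and Respond to Change"],
    },
}

def instruments_covering(domain, perspective=None):
    """List instruments mapped to a domain, optionally restricted to one perspective."""
    return [
        name for name, mapping in atlas.items()
        if domain in mapping["domains"]
        and (perspective is None or mapping["perspective"] == perspective)
    ]

# Example query: which instruments assess the 'Communicate' domain from the
# patient/family perspective?
print(instruments_covering("Communicate", perspective="patient/family"))
```

Indexing instruments by domain and perspective in this way mirrors the Atlas’s stated purpose of helping evaluators find measures that fit a planned care coordination evaluation.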
In order to develop such a framework in step 1, research was
conducted to identify the mechanisms of coordinated care, which can be classified as either coordination activities
or broad approaches. The detailed coordination activities
(such as facilitate transitions between different care
settings) and broad approaches (such as teamwork based
coordination) were further conceptualised. Both coordination
activities and broad approaches were then stratified by
different perspectives (i.e. patients, providers and health
system representatives). Such a framework made the review
and mapping of different measurement instruments explicit.
A diagram illustrating this process is presented in Figure
8.8-2 and the detailed domains/components in Table 8.8-3.
Furthermore, the full definitions of all components were described in the original report, and the conceptual sources of both ‘coordination activities’ and ‘broad approaches’ were presented (Table 8.8-4). The original version mapped 62 measurement instruments, updated to 82 in June 2014.

Figure 8.8-2 Care Coordination Measurement Framework Diagram. The diagram shows the goal of coordinated care (see Chapter 2) achieved through two groups of mechanisms: coordination activities (actions hypothesised to support coordination, not necessarily executed in any structured way) and broad approaches (commonly used groups of activities and/or tools hypothesised to support coordination). Coordination effects are experienced in different ways depending on the perspective (health care professional(s), patient/family or system representative) and feed into coordination measures, all framed by context: settings, patient populations, timeframe, facilitators and barriers.
Table 8.8-3 Mechanisms for Achieving Care Coordination
(Domains)
COORDINATION ACTIVITIES
Establish Accountability or Negotiate Responsibility
Communicate
Facilitate Transitions
Assess Needs and Goals
Create a Proactive Plan of Care
Monitor, Follow Up, and Respond to Change
Support Self-Management Goals
Link to Community Resources
Align Resources with Patient and Population Needs
BROAD APPROACHES
Teamwork Focused on Coordination
Health Care Home
Care Management
Medication Management
Health IT-Enabled Coordination
Table 8.8-4 Relation Between the Care Coordination Measurement Framework and Other Key Sources
FRAMEWORK DOMAINS
KEY SOURCES
COORDINATION ACTIVITIES
Establish Accountability or
Negotiate Responsibility
NQF: Communication domain includes – all medical home team members work within
the same plan of care and are measurably co-accountable for their contributions to the
shared plan and achieving the patient’s goals.
Communicate
Antonelli: Care coordination competency – communicates proficiently; care
coordination function – manages continuous communication.
NQF: Framework domain – Communication available to all team members, including
patients and family.
Interpersonal Communication
Coiera: All information exchanged in health care forms a space; the communication
space is the portion of all information interactions that involves direct interpersonal
interactions, such as face-to-face conversations, telephone calls, letters, and email.
Information Transfer
MPR: Care coordination activity – send patient information to primary care provider.
NQF: Communication domain includes – availability of patient information, such as
consultation reports, progress notes, test results, and current medications to all team
members caring for a patient reduces the chance of error.
Facilitate Transitions
Antonelli: Care coordination function – supports/facilitates care transitions.
CMS Definition of Case Management: Case management services are defined for
transitioning individuals from institutions to the community.
NQF: Framework domain – transitions between and off settings of care are a special
case because currently they are fraught with numerous mishaps that can make care
uncoordinated, disconnected, and unsafe. Some care processes during transition
deserve particular attention, including involvement of team during hospitalisation,
nursing home stay, etc.; communication between settings of care; and transfer of
current and past health information from old to new home.
Assess Needs and Goals
Antonelli: Care coordination function – completes/analyzes assessments.
CMS Definition of Case Management: Case management includes assessment and
periodic reassessment of an eligible individual to determine service needs, including
activities that focus on needs identification, to determine the need for any medical,
educational, social, or other services.
MPR: Care coordination activity – assess patient’s needs and health status; develop goals.
Create a Proactive Plan of Care
Antonelli: Defining characteristic of care coordination – proactive, planned and
comprehensive; care coordination function – develops care plans with families; facile in
care planning skills.
CMS Definition of Case Management: Case management assessment includes
development and periodic revision of a specific care plan based on the information
collected through an assessment or reassessment that specifies the goals and actions
to address the medical, social, educational, and other services needed by the eligible
individual, including activities such as ensuring the active participation of the eligible
individual and working with the individual (or the individual’s authorised health care
decision maker) and others to develop those goals and identify a course of action to
respond to the assessed needs of the eligible individual.
MPR: Care coordination activity – develop a care plan to address needs.
NQF: Framework domain – Proactive Plan of Care and Follow-up is an established and
current care plan that anticipates routine needs and actively tracks up-to-date progress
toward patient goals.
Monitor, Follow Up, and
Respond to Change
Antonelli: Care coordination function – manages/tracks tests, referrals, and outcomes.
CMS Definition of Case Management: Case management assessment includes periodic
reassessment to determine whether an individual’s needs and/or preferences have
changed. Case management includes monitoring and follow-up activities, including
activities and contacts that are necessary to ensure that the care plan is effectively
implemented and adequately addresses the needs of the eligible individual. If there are
changes in the needs or status of the individual, monitoring and follow-up activities
include making necessary adjustments in the care plan and service arrangements with
providers.
MPR: Care coordination activities – monitor patient’s knowledge and services over time;
intervene as needed; reassess patients and care plan periodically.
NQF: Plan of Care domain includes – follow-up of tests, referrals, treatments, or other
services.
Support Self-Management
Goals
Antonelli: Defining characteristic of care coordination – promotes self-care skills and
independence; care coordination function – coaches patients/families.
MPR: Care coordination activity – educate patient about condition and self-care.
NQF: Plan of Care domain includes – self-management support.
Link to Community Resources
Antonelli: Care coordination competency – integrates all resource knowledge.
CMS Definition of Case Management: Case management includes referral and related
activities (such as scheduling appointments for the individual) to help an individual
obtain needed services, including activities that help link eligible individuals with
medical, social, educational providers, or other programs and services that are capable
of providing needed services to address identified needs and achieve goals specified in
the care plan.
MPR: Care coordination activity – arrange needed services, including those outside the
health system (meals, transportation, home repair, prescription assistance, home care).
NQF: Plan of Care domain includes – community services and resources. The Plan of
Care includes community and nonclinical services as well as traditional health care
services that respond to a patient’s needs and preferences and contribute to achieving
the patient’s goals.
Align Resources with Patient
and Population Needs
MPR: Care coordination activity – arrange needed services, including those within the
health system (preventive care with primary care provider; specialist visits; durable
medical equipment; acute care).
NQF: A principle of care coordination is that care coordination is important to
all patients, but some populations are particularly vulnerable to fragmented,
uncoordinated care on a chronic basis, including (not mutually exclusive): children with
special health care needs; the frail elderly; persons with cognitive impairments; persons
with complex medical conditions; adults with disabilities; people at the end of life;
low-income patients; patients who move frequently, including retirees and those with
unstable health insurance coverage; and behavioural health care patients.
BROAD APPROACHES
Teamwork focused on
Coordination
Antonelli: Care coordination competency – applies teambuilding skills; care
coordination function – facilitates team meetings.
Healthcare Home
NQF: Framework domain – Health Care Home is a source of usual care selected by the
patient (such as a large or small medical group, a single practitioner, a community
health center, or a hospital outpatient clinic).
Care Management
See elements of
CMS case management definition mapped under other domains.
Medication Management
MPR: Care coordination activity – review medications.
NQF: Transitions between and off settings domain includes medication reconciliation
Health IT-Enabled Coordination
Antonelli: Care coordination competency – adept with information technology; care coordination function – uses health information technology.
NQF: Framework domain – information systems – the use of standardised, integrated
electronic information systems with functionalities essential to care coordination is
available to all providers and patients.
Antonelli = Antonelli RC, McAllister JW, Popp J. Making care coordination a critical component of the pediatric health system: A
multidisciplinary framework. New York, NY: The Commonwealth Fund. May 2009. Publication No. 1277.
CMS Definition of Case Management = Centers for Medicare and Medicaid Services. Medicaid Program; Optional state plan case
management services. 42 Code of Federal Regulations 441.18 2007 4 December;72(232):68092-3.
Coiera = Coiera E. Guide to health informatics. 2nd ed. London, England: Hodder Arnold, a member of the Hodder Headline Group; 2003.
MPR = Coordinating care for Medicare beneficiaries: Early experiences of 15 demonstration programs, their patients, and providers: Report to Congress. Princeton, NJ: Mathematica Policy Research, Inc.; May 2004.
NQF = National Quality Forum. National Quality Forum-endorsed definition and framework for measuring care coordination.
Washington, DC: National Quality Forum; 2006.
8.8.6 The patient experience frameworks and their implications for NSW IC
There is unlikely to be universal agreement among different health jurisdictions on the key dimensions of the patient experience to measure. The choice of different dimensions/domains/subdomains of the patient experience very much reflects the values and preferences of health systems, service providers and patients themselves. These dimensions may have different priorities among health system representatives, service providers and patients. In general, more objective patient experience measures are preferred, as discussed above. It is important to reach some consensus among different stakeholders on the most important aspects of the patient experience in a particular setting, preferably with the participation of patients and carers.
References
1. Jones JB, Snyder CF, Wu AW. Issues in the design of
Internet-based systems for collecting patient-reported
outcomes. Quality of Life Research 2007;16(8):1407-17.
2. Guidance for Industry Patient-Reported Outcome
Measures: Use in Medical Product Development to Support
Labeling Claims.U.S. Department of Health and Human
Services Food and Drug Administration December, 2009.
3. National Quality Forum (NQF). Patient Reported
Outcomes (PROs) in Performance Measurement, 2013.
4. McKenna SP. Measuring patient-reported outcomes:
Moving beyond misplaced common sense to hard science.
BMC Medicine 2011;9.
5. Black N. Patient reported outcome measures could help
transform healthcare. BMJ (Online) 2013;346(7896).
6. Black N, Varaganum M, Hutchings A. Relationship
between patient reported experience (PREMs) and patient
reported outcomes (PROMs) in elective surgery. BMJ
Quality and Safety 2014;23(7):534-43.
7. Nolte E, Pitchforth E. What is the evidence on the
economic impacts of integrated care?: WHO Regional
Office for Europe and European Observatory on Health
Systems and Policies, 2014.
8. NICE. Developing NICE guidelines: the manual: National
Institute for Health and Care Excellence (NICE), 2014.
9. NHS Department of Health the UK. The NHS Outcomes
Framework 2013/14, 2014.
10. McDonald KM, Schultz E, Albin L, et al. Care
Coordination Measures Atlas--June 2014 Update, 2014.
11. JASON The MITRE Corporation. A Robust Health
Data Infrastructure: Prepared for:Agency for Healthcare
Research and Quality, 2014.
12. IOM(Institute of Medicine). Capturing Social and
Behavioral Domains and Measures in electronic Health
Records: Phase 2. Washington, DC, 2014.
13. Department of Health the UK. Developing an indicator
on user experience of integrated care for the NHS and
Adult social care outcomes frameworks, 2014.
14. NHPA(National Health Performance Authority). Three
Year Data Plan: 2014–15 to 2016–17, 2013.
15. Australian Commission on Safety and Quality in Health
Care. Review of patient experience and satisfaction surveys conducted within public and private hospitals in Australia, 2012.
16. Program EHC. Building a National Partnership Network Advancing Patient-Centered Outcomes Research: Agency for Healthcare Research and Quality, 2011.
17. Cella D, Hahn EA, Jensen SE, et al. Methodological
Issues In The Selection, Administration And Use Of Patient-Reported Outcomes In Performance Measurement In
Health Care Settings, 2012.
18. Brozek JL, Guyatt GH, Schünemann HJ. How a well-grounded minimal important difference can enhance transparency of labelling claims and improve interpretation of a patient reported outcome measure. Health and Quality of Life Outcomes 2006;4.
19. Jaeschke R, Singer J, Guyatt GH. Measurement of health status. Ascertaining the minimal clinically important difference (MCID). Control Clin Trials 1989;10(4):407-15.
20. Crosby RD, Kolotkin RL, Williams GR. Defining clinically
meaningful change in health-related quality of life. J Clin
Epidemiol 2003;56(5):395-407.
21. Lydick E, Epstein RS. Interpretation of quality of life
changes. Qual Life Res 1993;2(3):221-26.
22. Dworkin RH, Turk DC, Wyrwich KW, et al. Interpreting
the clinical importance of treatment outcomes in chronic
pain clinical trials: IMMPACT recommendations. The
journal of pain : official journal of the American Pain
Society 2008;9(2):105-21.
23. Guyatt GH. Making sense of quality-of-life data. Med
Care 2000;38(9 Suppl):175-79.
24. Guyatt GH, Norman GR, Juniper EF, et al. A critical look
at transition ratings. J Clin Epidemiol 2002;55(9):900-08.
25. Brozek JL, Guyatt GH, Schunemann HJ. How a
well-grounded minimal important difference can
enhance transparency of labelling claims and improve
interpretation of a patient reported outcome measure.
Health Qual Life Outcomes 2006;4:69-69.
26. Revicki D, Hays RD, Cella D, et al. Recommended
methods for determining responsiveness and minimally
important differences for patient-reported outcomes. J
Clin Epidemiol 2008;61(2):102-09.
27. Farivar SS, Liu H, Hays RD. Half standard deviation
estimate of the minimally important difference in
HRQOL scores? Expert Rev Pharmacoecon Outcomes Res
2004;4(5):515-23.
28. Schwartz CE, Sprangers MA. Methodological
approaches for assessing response shift in longitudinal
health-related quality-of-life research. Soc Sci Med
1999;48(11):1531-48.
29. Nolte S, Elsworth GR, Sinclair AJ, et al. The inclusion
of ‘then-test’ questions in post-test questionnaires alters
post-test responses: a randomized study of bias in health
program evaluation. Qual Life Res 2012;21(3):487-94.
30. Ring L, Höfer S, Heuston F, et al. Response shift masks
the treatment impact on patient reported outcomes
(PROs): The example of individual quality of life in
edentulous patients. Health and Quality of Life Outcomes
2005;3.
31. Brossart DF, Clay DL, Willson VL. Methodological and
statistical considerations for threats to internal validity
in pediatric outcome data: response shift in self-report
outcomes. J Pediatr Psychol 2002;27(1):97-9107.
32. Ahmed S, Bourbeau J, Maltais F, et al. The Oort
structural equation modeling approach detected a
response shift after a COPD self-management program
not detected by the Schmitt technique. J Clin Epidemiol
2009;62(11):1165-72.
33. Mayo NE, Scott SC, Ahmed S. Case management
poststroke did not induce response shift: the value of
residuals. J Clin Epidemiol 2009;62(11):1148-56.
34. Cella D, Nowinski CJ. Measuring quality of life in
chronic illness: the functional assessment of chronic illness
therapy measurement system. Arch Phys Med Rehabil
2002;83(12 Suppl 2):10-17.
35. Shearer D, Morshed S. Common generic measures
of health related quality of life in injured patients. Injury
2011;42(3):241-47.
36. Raleigh V, Bardsley M, Smith P, et al. Integrated care
and support Pioneers: Indicators for measuring the quality
of integrated care: Final report: The King’s Fund; Nuffield
Trust; Personal Social Services Research Unit, London
School of Economics; Health Services Research & Policy,
London School of Hygiene & Tropical Medicine, 2014.
45. Irwin DE, Stucky BD, Langer MM, et al. PROMIS Pediatric
Anger Scale: an item response theory analysis. Quality of
Life Research 2011:1-10.
46. Hung M, Clegg DO, Greene T, et al. Evaluation of the
PROMIS physical function item bank in orthopaedic patients.
Journal of Orthopaedic Research 2011;29(6):947-53.
47. Gibbons LE, Feldman BJ, Crane HM, et al. Migrating
from a legacy fixed-format measure to CAT administration:
calibrating the PHQ-9 to the PROMIS depression measures.
Quality of Life Research 2011:1-9.
48. Fries JF, Krishnan E, Rose M, et al. Improved
responsiveness and reduced sample size requirements
of PROMIS physical function scales with item response
theory. Arthritis Research & Therapy 2011.
49. Fries J, Rose M, Krishnan E. The PROMIS of better
outcome assessment: Responsiveness, floor and
ceiling effects, and internet administration. Journal of
Rheumatology 2011;38(8):1759-64.
37. Varni JW, Thissen D, Stucky BD, et al. PROMIS® Parent
Proxy Report Scales: an item response theory analysis
of the parent proxy report item banks. Quality of Life
Research 2011:1-18.
50. Flynn KE, Jeffery DD, Keefe FJ, et al. Sexual functioning
along the cancer continuum: Focus group results from the
Patient-Reported Outcomes Measurement Information
System (PROMIS®). Psycho-Oncology 2011;20(4):378-86.
38. Thissen D, Varni JW, Stucky BD, et al. Using the PedsQL™
3.0 asthma module to obtain scores comparable with
those of the PROMIS pediatric asthma impact scale (PAIS).
Quality of Life Research 2011:1-9.
51. Cook KF, Bamer AM, Roddey TS, et al. A PROMIS fatigue
short form for use by individuals who have multiple
sclerosis. Quality of Life Research 2011:1-10.
39. Sung VW, Marques F, Rogers RR, et al. Content
validation of the patient-reported outcomes measurement
information system (PROMIS) framework in women with
urinary incontinence. Neurourology and Urodynamics
2011;30(4):503-09.
40. Preston K, Reise S, Cai L, et al. Using the nominal
response model to evaluate response category
discrimination in the PROMIS emotional distress item
pools. Educational and Psychological Measurement
2011;71(3):523-50.
41. Pilkonis PA, Choi SW, Reise SP, et al. Item banks
for measuring emotional distress from the patient-reported outcomes measurement information system
(PROMIS®): Depression, anxiety, and anger. Assessment
2011;18(3):263-83.
42. Magasi S, Ryan G, Revicki D, et al. Content validity of
patient-reported outcome measures: perspectives from a
PROMIS meeting. Quality of Life Research 2011:1-8.
43. Lai JS, Cella D, Choi S, et al. How item banks and
their application can influence measurement practice
in rehabilitation medicine: A PROMIS fatigue item bank
example. Archives of Physical Medicine and Rehabilitation
2011;92(10 SUPPL.):S20-S27.
44. Junghaenel DU, Christodoulou C, Lai JS, et al.
Demographic correlates of fatigue in the US general
population: Results from the patient-reported outcomes
measurement information system (PROMIS) initiative.
Journal of Psychosomatic Research 2011;71(3):117-23.
52. Cella D, Lai JS, Stone A. Self-reported fatigue: One
dimension or more? Lessons from the functional assessment
of chronic illness therapy-fatigue (FACIT-F) questionnaire.
Supportive Care in Cancer 2011;19(9):1441-50.
53. Bajaj JS, Thacker LR, Wade JB, et al. PROMIS
computerised adaptive tests are dynamic instruments
to measure health-related quality of life in patients with
cirrhosis. Alimentary Pharmacology and Therapeutics
2011;34(9):1123-32.
54. Amtmann D, Cook KF, Johnson KL, et al. The PROMIS
initiative: Involvement of rehabilitation stakeholders
in development and examples of applications in
rehabilitation research. Archives of Physical Medicine and
Rehabilitation 2011;92(10 SUPPL.):S12-S19.
55. Yeatts KB, Stucky B, Thissen D, et al. Construction of
the pediatric asthma impact scale (PAIS) for the patient-reported outcomes measurement information system
(PROMIS). Journal of Asthma 2010;47(3):295-302.
56. Varni JW, Stucky BD, Thissen D, et al. PROMIS pediatric
pain interference scale: An item response theory
analysis of the pediatric pain item bank. Journal of Pain
2010;11(11):1109-19.
57. Rothrock NE, Hays RD, Spritzer K, et al. Relative to the
general US population, chronic diseases are associated
with poorer health-related quality of life as measured by
the Patient-Reported Outcomes Measurement Information
System (PROMIS). Journal of Clinical Epidemiology
2010;63(11):1195-204.
58. Riley WT, Rothrock N, Bruce B, et al. Patient-reported
outcomes measurement information system (PROMIS)
domain names and definitions revisions: Further
evaluation of content validity in IRT-derived item banks.
Quality of Life Research 2010;19(9):1311-21.
59. MacLaren VV, Best LA. Multiple addictive behaviors
in young adults: Student norms for the Shorter PROMIS
Questionnaire. Addictive Behaviors 2010;35(3):252-55.
60. Liu H, Cella D, Gershon R, et al. Representativeness of
the patient-reported outcomes measurement information
system internet panel. Journal of Clinical Epidemiology
2010;63(11):1169-78.
61. Irwin DE, Stucky BD, Thissen D, et al. Sampling plan and
patient characteristics of the PROMIS pediatrics large-scale
survey. Quality of Life Research 2010;19(4):585-94.
62. Irwin DE, Stucky B, Langer MM, et al. An item response
analysis of the pediatric PROMIS anxiety and depressive
symptoms scales. Quality of Life Research 2010;19(4):595-607.
63. Henly SJ. Editorial: The promise of PROMIS. Nursing
Research 2010;59(2):77.
64. Henly SJ. The promise of PROMIS. Nursing research
2010;59(2):77.
65. Hays RD, Bode R, Rothrock N, et al. The impact of next
and back buttons on time to complete and measurement
reliability in computer-based surveys. Quality of Life
Research 2010;19(8):1181-84.
66. Hahn EA, DeVellis RF, Bode RK, et al. Measuring social
health in the patient-reported outcomes measurement
information system (PROMIS): Item bank development and
testing. Quality of Life Research 2010;19(7):1035-44.
67. Gershon RC, Rothrock N, Hanrahan R, et al. The use
of PROMIS and assessment center to deliver Patient-Reported Outcome Measures in clinical research. Journal
of Applied Measurement 2010;11(3):304-14.
68. Flynn KE, Shelby RA, Mitchell SA, et al. Sleep-wake
functioning along the cancer continuum: Focus group
results from the Patient-Reported Outcomes Measurement
Information System (PROMIS®). Psycho-Oncology
2010;19(10):1086-93.
69. Eisenstein EL, Diener LW, Nahm M, et al. Impact of the
Patient-Reported Outcomes Management Information
System (PROMIS) upon the Design and Operation of Multicenter Clinical Trials: a Qualitative Research Study. Journal
of Medical Systems 2010:1-10.
70. Choi SW, Reise SP, Pilkonis PA, et al. Efficiency of static
and computer adaptive short forms compared to full-length measures of depressive symptoms. Quality of Life
Research 2010;19(1):125-36.
71. Cella D, Riley W, Stone A, et al. The patient-reported
outcomes measurement information system (PROMIS)
developed and tested its first wave of adult self-reported
health outcome item banks: 2005-2008. Journal of Clinical
Epidemiology 2010;63(11):1179-94.
72. Amtmann D, Cook KF, Jensen MP, et al. Development
of a PROMIS item bank to measure pain interference. Pain
2010;150(1):173-82.
73. Aletaha D. From the item to the outcome: The
promising prospects of PROMIS. Arthritis Research and
Therapy 2010;12(1).
74. Revicki DA, Kawata AK, Harnam N, et al. Predicting
EuroQol (EQ-5D) scores from the patient-reported
outcomes measurement information system (PROMIS)
global items and domain item banks in a United States
sample. Quality of Life Research 2009;18(6):783-91.
75. Revicki DA, Chen WH, Harnam N, et al. Development
and psychometric analysis of the PROMIS pain behavior
item bank. Pain 2009;146(1-2):158-69.
76. Klem M, Saghafi E, Abromitis R, et al. Building PROMIS
item banks: Librarians as co-investigators. Quality of Life
Research 2009;18(7):881-88.
77. Jeffery DD, Tzeng JP, Keefe FJ, et al. Initial report of
the cancer patient-reported outcomes measurement
information system (PROMIS) sexual function committee:
Review of sexual function measures and domains used in
oncology. Cancer 2009;115(6):1142-53.
78. Jansky LJ, Huang JC. A multi-method approach
to assess usability and acceptability: A case study of
the patient-reported outcomes measurement system
(PROMIS) workshop. Social Science Computer Review
2009;27(2):262-70.
79. Irwin DE, Varni JW, Yeatts K, et al. Cognitive
interviewing methodology in the development of
a pediatric item bank: A patient reported outcomes
measurement information system (PROMIS) study. Health
and Quality of Life Outcomes 2009;7.
80. Hays RD, Bjorner JB, Revicki DA, et al. Development
of physical and mental health summary scores from the
patient-reported outcomes measurement information
system (PROMIS) global items. Quality of Life Research
2009;18(7):873-80.
81. Fries JF, Krishnan E, Bruce B. Items, instruments,
crosswalks, and PROMIS. Journal of Rheumatology
2009;36(6):1093-95.
82. Fries JF, Krishnan E. What constitutes progress
in assessing patient outcomes? Journal of Clinical
Epidemiology 2009;62(8):779-80.
83. Fries JF, Cella D, Rose M, et al. Progress in assessing
physical function in arthritis: PROMIS short forms and
computerized adaptive testing. Journal of Rheumatology
2009;36(9):2061-66.
84. Fortune-Greeley AK, Flynn KE, Jeffery DD, et al.
Using cognitive interviews to evaluate items for
measuring sexual functioning across cancer populations:
Improvements and remaining challenges. Quality of Life
Research 2009;18(8):1085-93.
85. Walsh TR, Irwin DE, Meier A, et al. The use of focus
groups in the development of the PROMIS pediatrics item
bank. Quality of Life Research 2008;17(5):725-35.
86. Rose M, Bjorner JB, Becker J, et al. Evaluation of a preliminary physical function item bank supported the expected advantages of the Patient-Reported Outcomes Measurement Information System (PROMIS). Journal of Clinical Epidemiology 2008;61(1):17-33.
87. Neary PC, Boyle E, Delaney CP, et al. Construct validation of a novel hybrid virtual-reality simulator for training and assessing laparoscopic colectomy; results from the first course for experienced senior laparoscopic surgeons. Surgical Endoscopy and Other Interventional Techniques 2008;22(10):2301-09.
88. Christodoulou C, Junghaenel DU, DeWalt DA, et al. Cognitive interviewing in the evaluation of fatigue items: Results from the patient-reported outcomes measurement information system (PROMIS). Quality of Life Research 2008;17(10):1239-46.
89. Castel LD, Williams KA, Bosworth HB, et al. Content validity in the PROMIS social-health domain: A qualitative analysis of focus-group data. Quality of Life Research 2008;17(5):737-49.
90. Reeve BB, Hays RD, Bjorner JB, et al. Psychometric evaluation and calibration of health-related quality of life item banks: Plans for the Patient-Reported Outcomes Measurement Information System (PROMIS). Medical Care 2007;45(5 SUPPL. 1):S22-S31.
91. Hays RD, Liu H, Spritzer K, et al. Item response theory analyses of physical functioning items in the medical outcomes study. Medical Care 2007;45(5 SUPPL. 1):S32-S38.
92. DeWalt DA, Rothrock N, Yount S, et al. Evaluation of item candidates: The PROMIS qualitative item review. Medical Care 2007;45(5 SUPPL. 1):S12-S21.
93. Cella D, Yount S, Rothrock N, et al. The Patient-Reported Outcomes Measurement Information System (PROMIS): Progress of an NIH roadmap cooperative group during its first two years. Medical Care 2007;45(5 SUPPL. 1):S3-S11.
94. Cella D, Gershon R, Lai JS, et al. The future of outcomes measurement: Item banking, tailored short-forms, and computerized adaptive assessment. Quality of Life Research 2007;16(SUPPL. 1):133-41.
95. Ader DN. Developing the Patient-Reported Outcomes Measurement Information System (PROMIS). Medical Care 2007;45(5 SUPPL. 1):S1-S2.
96. Pallanti S, Bernardi S, Quercioli L. The Shorter PROMIS Questionnaire and the Internet Addiction Scale in the assessment of multiple addictions in a high-school population: Prevalence and related disability. CNS Spectrums 2006;11(12):966-74.
97. Fries JF, Bruce B, Cella D. The promise of PROMIS: Using item response theory to improve assessment of patient-reported outcomes. Clinical and Experimental Rheumatology 2005;23(5 SUPPL. 39):S53-S57.
98. Cook KF, Bamer AM, Roddey TS, et al. A PROMIS fatigue short form for use by individuals who have multiple sclerosis. Quality of Life Research 2012;21(6):1021-30.
99. Edelen MO, Tucker JS, Shadel WG, et al. Toward a more systematic assessment of smoking: Development of a smoking module for PROMIS®. Addictive Behaviors 2012;37(11):1278-84.
100. Forrest CB, Bevans KB, Tucker C, et al. Commentary:
The patient-reported outcome measurement information
system (PROMIS®) for children and youth: Application
to pediatric psychology. Journal of Pediatric Psychology
2012;37(6):614-21.
101. Irwin DE, Gross HE, Stucky BD, et al. Development of
six PROMIS pediatrics proxy-report item banks. Health and
Quality of Life Outcomes 2012;10.
102. Irwin DE, Stucky BD, Langer MM, et al. PROMIS
pediatric Anger scale: An item response theory analysis.
Quality of Life Research 2012;21(4):697-706.
103. Khanna D, Maranian P, Rothrock N, et al. Feasibility
and construct validity of PROMIS and “legacy” instruments
in an academic scleroderma clinic. Value in Health
2012;15(1):128-34.
104. Magasi S, Ryan G, Revicki D, et al. Content validity of
patient-reported outcome measures: Perspectives from a
PROMIS meeting. Quality of Life Research 2012;21(5):739-46.
105. Noonan VK, Cook KF, Bamer AM, et al. Measuring
fatigue in persons with multiple sclerosis: Creating a
crosswalk between the modified fatigue impact scale and
the PROMIS fatigue short form. Quality of Life Research
2012;21(7):1123-33.
106. Oude Voshaar MAH, ten Klooster PM, Taal E, et al.
Dutch translation and cross-cultural adaptation of the
PROMIS® physical function item bank and cognitive
pre-test in Dutch arthritis patients. Arthritis Research and
Therapy 2012;14(2).
107. Varni JW, Thissen D, Stucky BD, et al. PROMIS® parent
proxy report scales: An item response theory analysis
of the parent proxy report item banks. Quality of Life
Research 2012;21(7):1223-40.
108. Alonso J, Bartlett SJ, Rose M, et al. The case for an
international patient-reported outcomes measurement
information system (PROMIS®) initiative. Health and
Quality of Life Outcomes 2013;11(1).
109. PROMIS: A management platform for software supply
networks based on the linked data and OSLC. Proceedings
- International Computer Software and Applications
Conference; 2013.
110. Askew RL, Kim J, Chung H, et al. Development of
a crosswalk for pain interference measured by the BPI
and PROMIS pain interference short form. Quality of Life
Research 2013;22(10):2769-76.
111. Auriault F, Thollon L, Peres J, et al. The PROMIS model
to highlight the importance of the foetus to the validation
of a pregnant woman model. Computer Methods in
Biomechanics and Biomedical Engineering 2013;16(SUPPL
1):182-83.
112. Barile JP, Reeve BB, Smith AW, et al. Monitoring population health for Healthy People 2020: Evaluation of the NIH PROMIS® Global Health, CDC Healthy Days, and satisfaction with life instruments. Quality of Life Research 2013;22(6):1201-11.
113. Bevans KB, Gardner W, Pajer K, et al. Qualitative development of the PROMIS® pediatric stress response item banks. Journal of Pediatric Psychology 2013;38(2):173-91.
114. Bruce B, Fries J, Lingala B, et al. Development and assessment of floor and ceiling items for the PROMIS physical function item bank. Arthritis Research and Therapy 2013;15(5).
115. DeWalt DA, Thissen D, Stucky BD, et al. PROMIS pediatric peer relationships scale: Development of a peer relationships item bank as part of social health measurement. Health Psychology 2013;32(10):1093-103.
116. Farin E, Nagl M, Gramm L, et al. Development and evaluation of the PI-G: a three-scale measure based on the German translation of the PROMIS® pain interference item bank. Quality of Life Research 2013:1-11.
117. Flynn KE, Lin L, Cyranowski JM, et al. Development of the NIH PROMIS® Sexual Function and Satisfaction Measures in Patients with Cancer. Journal of Sexual Medicine 2013;10(SUPPL.):43-52.
118. Flynn KE, Reeve BB, Lin L, et al. Construct validity of the PROMIS® sexual function and satisfaction measures in patients with cancer. Health and Quality of Life Outcomes 2013;11(1).
119. Forrest CB, Bevans KB, Pratiwadi R, et al. Development of the PROMIS® pediatric global health (PGH-7) measure. Quality of Life Research 2013:1-11.
120. Gibbons LE, Feldman BJ, Crane HM, et al. Erratum: Migrating from a legacy fixed-format measure to CAT administration: Calibrating the PHQ-9 to the PROMIS depression measures (Quality of Life Research (2011) 20 (1349-1357) DOI 10.1007/s11136-011-9882-y). Quality of Life Research 2013;22(2):459-60.
121. Gipson DS, Selewski DT, Massengill SF, et al. Gaining the PROMIS perspective from children with nephrotic syndrome: A Midwest pediatric nephrology consortium study. Health and Quality of Life Outcomes 2013;11(1).
122. Hays RD, Spritzer KL, Amtmann D, et al. Upper-extremity and mobility subdomains from the Patient-Reported Outcomes Measurement Information System (PROMIS) adult physical functioning item bank. Archives of Physical Medicine and Rehabilitation 2013;94(11):2291-96.
123. Hays RD, Spritzer KL, Fries JF, et al. Responsiveness and minimally important difference for the Patient-Reported Outcomes Measurement Information System (PROMIS) 20-item physical functioning short form in a prospective observational study of rheumatoid arthritis. Annals of the Rheumatic Diseases 2013.
124. Hinds PS, Nuss SL, Ruccione KS, et al. PROMIS pediatric measures in pediatric oncology: Valid and clinically feasible indicators of patient-reported outcomes. Pediatric Blood and Cancer 2013;60(3):402-08.
125. Hung M, Baumhauer JF, Latt LD, et al. Validation of
PROMIS® physical function computerized adaptive tests
for orthopaedic foot and ankle outcome research. Clinical
Orthopaedics and Related Research 2013;471(11):3466-74.
126. Hung M, Stuart AR, Higgins TF, et al. Computerized
adaptive testing using the PROMIS physical function
item bank reduces test burden with less ceiling effects
compared to the short musculoskeletal function
assessment in orthopaedic trauma patients. Journal of
Orthopaedic Trauma 2013.
127. Jacobson CJ, Farrell JE, Kashikar-Zuck S, et al.
Disclosure and self-report of emotional, social, and
physical health in children and adolescents with chronic
pain - A qualitative study of PROMIS pediatric measures.
Journal of Pediatric Psychology 2013;38(1):82-93.
128. Kim J, Chung H, Amtmann D, et al. Measurement
invariance of the PROMIS pain interference item bank
across community and clinical samples. Quality of Life
Research 2013;22(3):501-07.
129. Kratz AL, Slavin MD, Mulcahey MJ, et al. An
examination of the PROMIS® pediatric instruments to
assess mobility in children with cerebral palsy. Quality of
Life Research 2013;22(10):2865-76.
130. Lai JS, Stucky BD, Thissen D, et al. Development and
psychometric properties of the PROMIS® pediatric fatigue
item banks. Quality of Life Research 2013;22(9):2417-27.
131. Paz SH, Spritzer KL, Morales LS, et al. Evaluation of the
patient-reported outcomes information system (PROMIS®) Spanish-language physical functioning items. Quality of
Life Research 2013;22(7):1819-30.
132. Pilkonis PA, Yu L, Colditz J, et al. Item banks for alcohol
use from the Patient-Reported Outcomes Measurement
Information System (PROMIS®): Use, consequences, and
expectancies. Drug and Alcohol Dependence 2013;130(1-3):167-77.
133. Schneider S, Choi SW, Junghaenel DU, et al.
Psychometric characteristics of daily diaries for the Patient-Reported Outcomes Measurement Information System
(PROMIS®): A preliminary investigation. Quality of Life
Research 2013;22(7):1859-69.
134. Williams MS, Snyder DC, Sloane R, et al. A comparison
of cancer survivors from the PROMIS study selecting
telephone versus online questionnaires. Psycho-Oncology
2013;22(11):2632-35.
135. Alves FSM, Pinto RMC, Mendonça TMS, et al.
Portuguese-language translation and cross-cultural
adaptation of the Fatigue domain of Patient-Reported
Outcomes Measurement Information System (PROMIS).
Cadernos de Saude Publica 2014;30(5):1103-10.
136. Amtmann D, Kim J, Chung H, et al. Comparing
CESD-10, PHQ-9, and PROMIS depression instruments
in individuals with multiple sclerosis. Rehabilitation
Psychology 2014;59(2):220-29.
137. Auriault F, Thollon L, Peres J, et al. Virtual
traumatology of pregnant women: The PRegnant car
Occupant Model for Impact Simulations (PROMIS). Journal
of Biomechanics 2014;47(1):207-13.
138. Baum G, Basen-Engquist K, Swartz MC, et al.
Comparing PROMIS computer-adaptive tests to the Brief
Symptom Inventory in patients with prostate cancer.
Quality of Life Research 2014:1-5.
139. Becker H, Stuifbergen A, Lee HY, et al. Reliability
and validity of PROMIS cognitive abilities and cognitive
concerns scales among people with multiple sclerosis.
International Journal of MS Care 2014;16(1):1-9.
140. Bevans M, Ross A, Cella D. Patient-Reported Outcomes
Measurement Information System (PROMIS): Efficient,
standardized tools to measure self-reported health and
quality of life. Nursing Outlook 2014.
141. Bjorner JB, Rose M, Gandek B, et al. Difference in
method of administration did not significantly impact item
response: An IRT-based analysis from the Patient-Reported
Outcomes Measurement Information System (PROMIS)
initiative. Quality of Life Research 2014;23(1):217-27.
142. Bjorner JB, Rose M, Gandek B, et al. Method of administration of PROMIS scales did not significantly impact score level, reliability, or validity. Journal of Clinical Epidemiology 2014;67(1):108-13.
143. Cella D, Choi S, Garcia S, et al. Setting standards for severity of common symptoms in oncology using the PROMIS item banks and expert judgment. Quality of Life Research 2014.
144. Chen WH, Lenderking W, Jin Y, et al. Is Rasch model analysis applicable in small sample size pilot studies for assessing item characteristics? An example using PROMIS pain behavior item bank data. Quality of Life Research 2014;23(2):485-93.
145. Choi SW, Schalet B, Cook KF, et al. Establishing a common metric for depressive symptoms: Linking the BDI-II, CES-D, and PHQ-9 to PROMIS Depression. Psychological Assessment 2014;26(2):513-27.
146. Christodoulou C, Schneider S, Junghaenel DU, et al. Measuring daily fatigue using a brief scale adapted from the Patient-Reported Outcomes Measurement Information System (PROMIS®). Quality of Life Research 2014;23(4):1245-53.
147. de Castro NFC, de Rezende CHA, Mendonça TMS, et al. Portuguese-language cultural adaptation of the Items Banks of Anxiety and Depression of the Patient-Reported Outcomes Measurement Information System (PROMIS). Cadernos de Saude Publica 2014;30(4):879-84.
148. Edelen MO. The PROMIS® smoking assessment toolkit: Background and introduction to supplement. Nicotine and Tobacco Research 2014;16(SUPPL.3):S170-S74.
149. Edelen MO, Stucky BD, Hansen M, et al. The PROMIS® smoking initiative: Initial validity evidence for six new smoking item banks. Nicotine and Tobacco Research 2014;16(SUPPL.3):S250-S60.
150. Edelen MO, Tucker JS, Shadel WG, et al. Development of the PROMIS® health expectancies of smoking item banks. Nicotine and Tobacco Research 2014;16(SUPPL.3):S223-S31.
151. Farin E, Nagl M, Gramm L, et al. Development and evaluation of the PI-G: a three-scale measure based on the German translation of the PROMIS® pain interference item bank. Quality of Life Research 2014;23(4):1255-65.
152. Fernández-López JA. PROMIS®: A platform to evaluate health status and the results of interventions. Semergen 2014;40(7):355-56.
153. Forrest CB, Bevans KB, Pratiwadi R, et al. Development of the PROMIS® pediatric global health (PGH-7) measure. Quality of Life Research 2014;23(4):1221-31.
154. Fries JF, Witter J, Rose M, et al. Item response theory, computerized adaptive testing, and PROMIS: Assessment of physical function. Journal of Rheumatology 2014;41(1):153-58.
155. Hansen M, Cai L, Stucky BD, et al. Methodology for developing and evaluating the PROMIS® smoking item banks. Nicotine and Tobacco Research 2014;16(SUPPL.3):S175-S89.
156. Hung M, Baumhauer JF, Brodsky JW, et al. Psychometric comparison of the PROMIS physical function CAT with the FAAM and FFI for measuring patient-reported outcomes. Foot and Ankle International 2014;35(6):592-99.
157. Hung M, Hon SD, Franklin JD, et al. Psychometric properties of the PROMIS physical function item bank in patients with spinal disorders. Spine 2014;39(2):158-63.
158. Irwin DE, Atwood CA, Jr., Hays RD, et al. Correlation
of PROMIS scales and clinical measures among chronic
obstructive pulmonary disease patients with and without
exacerbations. Quality of Life Research 2014.
159. Junghaenel DU, Schneider S, Stone AA, et al.
Ecological validity and clinical utility of Patient-Reported
Outcomes Measurement Information System (PROMIS®)
instruments for detecting premenstrual symptoms of
depression, anger, and fatigue. Journal of Psychosomatic
Research 2014;76(4):300-06.
160. Kroenke K, Yu Z, Wu J, et al. Operating Characteristics
of PROMIS Four-Item Depression and Anxiety Scales in
Primary Care Patients with Chronic Pain. Pain Medicine
(United States) 2014.
161. Lin FJ, Pickard AS, Krishnan JA, et al. Measuring health-related quality of life in chronic obstructive pulmonary
disease: Properties of the EQ-5D-5L and PROMIS-43 short
form. BMC Medical Research Methodology 2014;14(1).
162. Mohammed SI. Validity of patient reported outcome
measurement information system health assessment
questionnaire (PROMIS HAQ) for assessing disease
activity in Iraqi patients with active rheumatoid arthritis.
International Journal of Pharmacy and Pharmaceutical
Sciences 2014;6(8):332-34.
163. Niaura R. Delivering on its promises: The PROMIS®
smoking initiative item banks. Nicotine and Tobacco
Research 2014;16(SUPPL.3):S261-S62.
164. Overbeek CL, Nota SPFT, Jayakumar P, et al. The
PROMIS Physical Function Correlates With the QuickDASH
in Patients With Upper Extremity Illness. Clinical
Orthopaedics and Related Research® 2014.
165. Papuga MO, Beck CA, Kates SL, et al. Validation
of GAITRite and PROMIS as high-throughput physical
function outcome measures following ACL reconstruction.
Journal of Orthopaedic Research 2014;32(6):793-801.
166. Pilkonis PA, Yu L, Dodds NE, et al. Validation of
the depression item bank from the Patient-Reported
Outcomes Measurement Information System (PROMIS®) in
a three-month observational study. Journal of Psychiatric
Research 2014;56(1):112-19.
167. Ravens-Sieberer U, Devine J, Bevans K, et al.
Subjective well-being measures for children were
developed within the PROMIS project: Presentation of first
results. Journal of Clinical Epidemiology 2014;67(2):207-18.
168. Revicki DA, Cook KF, Amtmann D, et al. Exploratory and
confirmatory factor analysis of the PROMIS pain quality item
bank. Quality of Life Research 2014;23(1):245-55.
169. Rose M, Bjorner JB, Gandek B, et al. The PROMIS Physical
Function item bank was calibrated to a standardized metric
and shown to improve measurement efficiency. Journal of
Clinical Epidemiology 2014;67(5):516-26.
170. Schalet BD, Cook KF, Choi SW, et al. Establishing a
common metric for self-reported anxiety: Linking the
MASQ, PANAS, and GAD-7 to PROMIS Anxiety. Journal of
Anxiety Disorders 2014;28(1):88-96.
171. Selewski DT, Massengill SF, Troost JP, et al. Gaining the
Patient Reported Outcomes Measurement Information
System (PROMIS) perspective in chronic kidney disease: a
Midwest Pediatric Nephrology Consortium study. Pediatric
Nephrology 2014.
172. Senders A, Hanes D, Bourdette D, et al. Reducing
survey burden: Feasibility and validity of PROMIS measures
in multiple sclerosis. Multiple Sclerosis 2014;20(8):1102-11.
173. Shadel WG, Edelen MO, Tucker JS, et al. Development
of the PROMIS® nicotine dependence item banks. Nicotine
and Tobacco Research 2014;16(SUPPL.3):S190-S201.
174. Shadel WG, Edelen MO, Tucker JS, et al.
Development of the PROMIS® coping expectancies of
smoking item banks. Nicotine and Tobacco Research
2014;16(SUPPL.3):S202-S11.
175. Silva e Costa ZMS, Pinto RMC, Mendonça TMS, et al.
Brazilian-Portuguese translation and cultural adaptation of
the sleep and wake disturbances domains of the Patient-Reported Outcomes Measurement Information System
(PROMIS). Cadernos de Saude Publica 2014;30(7):1391-401.
176. Spiegel BMR, Hays RD, Bolus R, et al. Development
of the NIH patient-reported outcomes measurement
information system (PROMIS) gastrointestinal
symptom scales. American Journal of Gastroenterology
2014;109(11):1804-14.
177. Stachler RJ, Schultz LR, Nerenz D, et al. PROMIS
evaluation for head and neck cancer patients: A
comprehensive quality-of-life outcomes assessment tool.
Laryngoscope 2014;124(6):1368-76.
178. Stucky BD, Edelen MO, Tucker JS, et al. Development
of the PROMIS® negative psychosocial expectancies of
smoking item banks. Nicotine and Tobacco Research
2014;16(SUPPL.3):S232-S40.
179. Terwee CB, Roorda LD, De Vet HCW, et al. Dutch-Flemish translation of 17 item banks from the Patient-Reported Outcomes Measurement Information System
(PROMIS). Quality of Life Research 2014;23(6):1733-41.
180. Tucker CA, Cieza A, Riley AW, et al. Concept Analysis of
the Patient Reported Outcomes Measurement Information
System (PROMIS®) and the International Classification of
Functioning, Disability and Health (ICF). Quality of Life
Research 2014;23(6):1677-86.
181. Tucker JS, Shadel WG, Edelen MO, et al.
Development of the PROMIS® social motivations for
smoking item banks. Nicotine and Tobacco Research
2014;16(SUPPL.3):S241-S49.
182. Tucker JS, Shadel WG, Edelen MO, et al. Development
of the PROMIS® positive emotional and sensory
expectancies of smoking item banks. Nicotine and
Tobacco Research 2014;16(SUPPL.3):S212-S22.
183. Tyser AR, Beckmann J, Franklin JD, et al. Evaluation
of the PROMIS physical function computer adaptive
test in the upper extremity. Journal of Hand Surgery
2014;39(10):2047-51.
184. Varni JW, Magnus B, Stucky BD, et al. Psychometric
properties of the PROMIS® pediatric scales: Precision,
stability, and comparison of different scoring and
administration options. Quality of Life Research
2014;23(4):1233-43.
185. Varni JW, Thissen D, Stucky BD, et al. PROMIS® parent
proxy report scales for children ages 5-7 years: An item
response theory analysis of differential item functioning
across age groups. Quality of Life Research 2014;23(1):349-61.
186. Voshaar MAHO, Ten Klooster PM, Glas CAW, et al.
Calibration of the PROMIS physical function item bank in
Dutch patients with rheumatoid arthritis. PLoS ONE 2014;9(3).
187. Bryan S, Broesch J, Dalzell K, et al. What are the most
effective ways to measure patient health outcomes of
primary health care integration through PROM (Patient
Reported Outcome Measurement) instruments?: the
Canadian Institutes of Health Research, 2013.
188. Wong ST, Haggerty J. Measuring Patient Experiences
in Primary Health Care: A review and classification of
items and scales used in publicly-available questionnaires:
Centre for Health Services and Policy Research, University
of British Columbia, 2013.
189. McDonald KM, Schultz E, Albin L, et al. Care
Coordination Atlas Version 3 (Prepared by Stanford
University under subcontract to Battelle on Contract No.
290-04-0020). AHRQ Publication No.11-0023-EF. Rockville,
MD, 2010.
190. Graham C, Killpack C, Raleigh V, et al. Options
appraisal on the measurement of people’s experiences of
integrated care: The King's Fund, 2013.
191. King J, Gibbons E, Graham C, et al. Developing
measures of people’s self-reported experiences of
integrated care: Picker Institute Europe & University of
Oxford, 2013.
192. Uijen AA, Schers HJ, Schellevis FG, et al. How unique
is continuity of care? A review of continuity and related
concepts. Family Practice 2012;29(3):264-71.
193. Husain A, Barbera L, Howell D, et al. Advanced
lung cancer patients’ experience with continuity of care
and supportive care needs. Supportive Care in Cancer
2013;21(5):1351-58.
194. Aller MB, Vargas I, Garcia-Subirats I, et al. A tool
for assessing continuity of care across care levels: An
extended psychometric validation of the CCAENA
questionnaire. International Journal of Integrated Care
2013;13(OCT/DEC).
195. Continuity of care to optimise chronic disease
management in the community setting: An evidence-based analysis. Ontario Health Technology Assessment
Series 2013;13(6):1-41.
196. Uijen AA, Schers HJ, Schellevis FG, et al. Measuring
continuity of care: Psychometric properties of the
Nijmegen Continuity Questionnaire. British Journal of
General Practice 2012;62(600):e494-e502.
197. Uijen AA, Heinst CW, Schellevis FG, et al.
Measurement properties of questionnaires measuring
continuity of care: a systematic review. PLoS One
2012;7(7):e42256.
198. Uijen AA, Bosch M, Van Den Bosch WJHM, et al. Heart
failure patients’ experiences with continuity of care and its
relation to medication adherence: A cross-sectional study.
BMC Family Practice 2012;13.
199. Roland M. Continuity of care: Betrayed values or
misplaced nostalgia. International Journal of Integrated
Care 2012;12(OCTOBER - DECEMBER).
200. Haggerty JL, Roberge D, Freeman GK, et al. Validation
of a generic measure of continuity of care: When patients
encounter several clinicians. Annals of Family Medicine
2012;10(5):443-50.
201. Haggerty JL, Roberge D, Freeman GK, et al. Validation
of a generic measure of continuity of care: when patients
encounter several clinicians. Ann Fam Med 2012;10(5):443-51.
202. Uijen AA, Schellevis FG, Van Den Bosch WJHM, et al.
Nijmegen Continuity Questionnaire: Development and
testing of a questionnaire that measures continuity of
care. Journal of Clinical Epidemiology 2011;64(12):1391-99.
203. Joyce AS, Adair CE, Wild TC, et al. Continuity of
care: Validation of a self-report measure to assess client
perceptions of mental health service delivery. Community
Mental Health Journal 2010;46(2):192-208.
204. Mokkink LB, Terwee CB, Patrick DL, et al. The COSMIN
checklist for assessing the methodological quality of
studies on measurement properties of health status
measurement instruments: An international Delphi study.
Quality of Life Research 2010;19(4):539-49.
205. Mokkink LB, Terwee CB, Patrick DL, et al. The COSMIN
study reached international consensus on taxonomy,
terminology, and definitions of measurement properties
for health-related patient-reported outcomes. Journal of
Clinical Epidemiology 2010;63(7):737-45.
206. Mokkink LB, Terwee CB, Gibbons E, et al. Inter-rater
agreement and reliability of the COSMIN (COnsensus-based Standards for the selection of health status
Measurement Instruments) checklist. BMC Med Res
Methodol 2010;10:82-82.
207. Angst F. The new COSMIN guidelines confront
traditional concepts of responsiveness. BMC Med Res
Methodol 2011;11:152-52.
208. Terwee CB, Mokkink LB, Knol DL, et al. Rating the
methodological quality in systematic reviews of studies on
measurement properties: a scoring system for the COSMIN
checklist. Qual Life Res 2012;21(4):651-57.
209. Wei X, Barnsley J, Zakus D, et al. Assessing
continuity of care in a community diabetes program:
initial questionnaire development and validation. J Clin
Epidemiol 2008;61(9):925-31.
210. Durbin J, Goering P, Streiner DL, et al. Continuity
of care: validation of a new self-report measure for
individuals using mental health services. The journal of
behavioral health services & research 2004;31(3):279-96.
211. Adair CE, Wild TC, Joyce A, et al. Continuity of mental
health services study of Alberta: a research program on
continuity of mental health care: University of Calgary,
Canada, 2004.
212. Hadjistavropoulos HD, Biem HJ, Kowalyk KM.
Measurement of continuity of care in cardiac patients:
Reliability and validity of an in-person questionnaire.
Canadian Journal of Cardiology 2004;20(9):883-91.
213. de Silva D. Measuring patient experience: The Health
Foundation, 2013.
214. Greenhalgh J, Long AF, Flynn R. The use of patient
reported outcome measures in routine clinical practice:
Lack of impact or lack of theory? Social Science and
Medicine 2005;60(4):833-43.
215. Abernethy AP, Ahmad A, Zafar SY, et al. Electronic
patient-reported data capture as a foundation of rapid
learning cancer care. Med Care 2010;48(6 Suppl):S32-8.
216. Chen J. The impact of routine collection of Patient
Reported Outcome Measures on patients, providers and
health organisations in an oncologic setting: a rapid
review. An Evidence Check review brokered by the Sax
Institute, 2012.
217. Institute of Medicine. Crossing the Quality Chasm: A
New Health System for the 21st Century, 2001.
218. Crisp N. Creating a patient-led NHS: Delivering the NHS Improvement Plan, 2005.
219. Digioia A, Lorenz H, Greenhouse PK, et al. A patient-centered model to improve metrics without cost increase:
Viewing all care through the eyes of patients and families.
Journal of Nursing Administration 2010;40(12):540-46.
220. Smith S, Dewar B, Pullin S, et al. Relationship centred
outcomes focused on compassionate care for older people
within in-patient care settings. International Journal of
Older People Nursing 2010;5(2):128-36.
221. Stange KC, Nutting PA, Miller WL, et al. Defining and
measuring the patient-centered medical home. Journal of
General Internal Medicine 2010;25(6):601-12.
222. Hobbs JL. Editor’s Note. Nursing Research
2009;58(1):52-62.
223. Clayton MF, Latimer S, Dunn TW, et al. Assessing
patient-centered communication in a family
practice setting: How do we measure it, and whose
opinion matters? Patient Education and Counseling
2011;84(3):294-302.
224. Sidani S, Collins L, Harbman P, et al. Development of a
Measure to Assess Healthcare Providers’ Implementation
of Patient-Centered Care. Worldviews on Evidence-Based
Nursing 2014;11(4):248-57.
225. Kodner DL, Spreeuwenberg C. Integrated care:
meaning, logic, applications, and implications--a
discussion paper. Int J Integr Care 2002;2.
226. Goodwin N. The state of telehealth and telecare in the
UK: Prospects for integrated care. Journal of Integrated
Care 2010;18(6):3-10.
227. McDonald KM, Sundaram V, Bravata DM, et al. Care coordination. In: Shojania KG, McDonald KM, Wachter RM, et al., eds. Closing the quality gap: A critical analysis of quality improvement strategies. Technical Review 9 (Prepared by Stanford-UCSF Evidence-Based Practice Center under contract No. 290-02-0017). Vol 7. Rockville, MD: Agency for Healthcare Research and Quality, 2007.
228. Delnoij D, Klazinga N, Glasgow IK. Integrated care in
an international perspective. Int J Integr Care 2002;2.
229. Shortell SM, Gillies RR, Anderson DA. The new world
of managed care: creating organized delivery systems.
Health Aff (Millwood) 1994;13(5):46-64.
230. Ahgren B, Axelsson R. Evaluating integrated health
care: a model for measurement. Int J Integr Care 2005;5.
231. Leutz WN. Five laws for integrating medical and social
services: lessons from the United States and the United
Kingdom. Milbank Q 1999;77(1):77-110.
232. Rolfson O, Kärrholm J, Dahlberg LE, et al. Patient-reported outcomes in the Swedish Hip Arthroplasty Register: Results of a nationwide prospective observational study. Journal of Bone and Joint Surgery - Series B 2011;93-B(7):867-75.
233. Lindgren JV, Wretenberg P, Kärrholm J, et al. Patient-reported outcome is influenced by surgical approach in total hip replacement: A study of the Swedish Hip Arthroplasty Register including 42 233 patients. Bone and Joint Journal 2014;96-B(5):590-96.
234. Hjollund NHI, Larsen LP, Biering K, et al. Use of patient-reported outcome (PRO) measures at group and patient
levels: Experiences from the generic integrated PRO system,
WestChronic. Journal of Medical Internet Research 2014;16(2).
235. Green A. Danish clinical databases: an overview.
Scand J Public Health 2011;39(7 Suppl):68-71.
236. Kalucy L, Katterl R, Jackson-Bowers E. Patient
Experience of health care performance: Primary Health
Care Research & Information Service (PHCRIS), 2009.
237. Ipsos Social Research Institute. Development Report:
Adult Admitted Patient Survey: A report prepared for the
Bureau of Health Information. Sydney (NSW): Ipsos Social
Research Institute, 2013.
238. Pearse J. Review of patient satisfaction and
experience surveys conducted for public hospitals in
Australia: Health Policy Analysis Pty Ltd, 2005.
239. The Commonwealth Fund. International Health Policy Survey 2011. http://www.commonwealthfund.org/News/News-Releases/2011/Nov/International-Health-Policy-Survey.aspx.
240. Jensen RE, Snyder CF, Abernethy AP, et al. Review
of electronic patient-reported outcomes systems used
in cancer clinical care. Journal of oncology practice /
American Society of Clinical Oncology 2014;10(4):e215-22.
241. The National Electronic Health Transition Authority
(NEHTA). Review of the Personally Controlled Electronic
Health Record, 2013.
242. Wennberg JE, Staiger DO, Sharp SM, et al.
Observational intensity bias associated with illness
adjustment: cross sectional analysis of insurance claims.
BMJ 2013;346:f549.
243. Welch HG, Sharp SM, Gottlieb DJ, et al. Geographic
variation in diagnosis frequency and risk of death among
Medicare beneficiaries. JAMA : the journal of the American
Medical Association 2011;305(11):1113-8.
244. Streiner DL, Norman GR. Health measurement scales.
A practical guide to their development and use. Oxford,
UK: Oxford University Press, 2008.
245. Terwee CB, Bot SD, de Boer MR, et al. Quality criteria were proposed for measurement properties of health status questionnaires. J Clin Epidemiol 2007;60(1):34-42.
246. Verhoef LM, Van De Belt TH, Engelen L, et al. Social media and rating sites as tools to understanding quality of care: A scoping review. Journal of Medical Internet Research 2014;16(2).
247. Merolli M, Gray K, Martin-Sanchez F. Health outcomes and related effects of using social media in chronic disease management: A literature review and analysis of affordances. Journal of Biomedical Informatics 2013;46(6):957-69.
248. Chung DS, Kim S. Blogging activity among cancer patients and their companions: Uses, gratifications, and predictors of outcomes. Journal of the American Society for Information Science and Technology 2008;59(2):297-306.
249. Høybye MT, Dalton SO, Deltour I, et al. Effect of Internet peer-support groups on psychosocial adjustment to cancer: A randomised study. British Journal of Cancer 2010;102(9):1348-54.
250. Klemm P. Effects of online support group format (Moderated vs Peer-Led) on depressive symptoms and extent of participation in women with breast cancer. CIN - Computers Informatics Nursing 2012;30(1):9-18.
251. McLaughlin M, Nam Y, Gould J, et al. A video-sharing social networking intervention for young adult cancer survivors. Computers in Human Behavior 2012;28(2):631-41.
252. Setoyama Y, Yamazaki Y, Namayama K. Benefits of peer support in online Japanese breast cancer communities: differences between lurkers and posters. J Med Internet Res 2011;13.
253. Rodham K, McCabe C, Blake D. Seeking support: An interpretative phenomenological analysis of an Internet message board for people with Complex Regional Pain Syndrome. Psychology and Health 2009;24(6):619-34.
254. Schulz PJ, Rubinelli S, Mariotti G, et al. Meeting the ranging of informational needs of chronic low back pain sufferers: Conceptual design and rationale of the interactive website ONESELF. Disability and Rehabilitation 2009;31(25):2118-24.
255. Ruehlman LS, Karoly P, Enders C. A randomized controlled evaluation of an online chronic pain self management program. Pain 2012;153(2):319-30.
256. Mo PKH, Coulson NS. Developing a model for online support group use, empowering processes and psychosocial outcomes for individuals living with HIV/AIDS. Psychology and Health 2012;27(4):445-59.
257. Greene JA, Choudhry NK, Kilabuk E, et al. Online social networking by patients with diabetes: A qualitative evaluation of communication with Facebook. Journal of General Internal Medicine 2011;26(3):287-92.
258. Shigaki CL, Smarr KL, Yang G, et al. Social interactions in an online self-management program for rheumatoid arthritis. Chronic Illness 2008;4(4):239-46.
259. Bartlett YK, Coulson NS. An investigation into the empowerment effects of using online support groups and how this affects health professional/patient communication. Patient Education and Counseling 2011;83(1):113-19.
260. Chen AT. Exploring online support spaces: using
cluster analysis to examine breast cancer, diabetes and
fibromyalgia support groups. Patient Educ Couns 2011.
261. Griffiths KM, Calear AL, Banfield M, et al. Systematic review
on internet support groups (ISGs) and depression (1): do ISGs
reduce depressive symptoms? J Med Internet Res 2009.
262. Hoch DB, Watson AJ, Linton DA, et al. The feasibility
and impact of delivering a mind-body intervention in a
virtual world. PLoS One 2012;7.
263. Lorig KR, Ritter PL, Laurent DD, et al. The internet-based arthritis self-management program: A one-year
randomized trial for patients with arthritis or fibromyalgia.
Arthritis Care and Research 2008;59(7):1009-17.
264. Schubart JR, Stuckey HL, Ganeshamoorthy A, et
al. Chronic health conditions and internet behavioral
interventions: A review of factors to enhance user
engagement. CIN - Computers Informatics Nursing
2011;29(2):81-92.
265. van Uden-Kraan CF, Drossaert CHC, Taal E, et al.
Participation in online patient support groups endorses
patients’ empowerment. Patient Education and Counseling
2009;74(1):61-69.
266. Van Uden-Kraan CF, Drossaert CHC, Taal E, et al.
Empowering processes and outcomes of participation
in online support groups for patients with breast cancer,
arthritis, or fibromyalgia. Qualitative Health Research
2008;18(3):405-17.
267. Bender JL, Jimenez-Marroquin MC, Jadad AR. Seeking
support on Facebook: A content analysis of breast cancer
groups. Journal of Medical Internet Research 2011;13(1).
268. Mo PKH, Coulson NS. Living with HIV/AIDS and use
of online support groups. Journal of Health Psychology
2010;15(3):339-50.
269. Van De Belt TH, Engelen LJLPG, Berben SAA, et al.
Definition of health 2.0 and medicine 2.0: A systematic
review. Journal of Medical Internet Research 2010;12(2).
270. Greaves F, Ramirez-Cano D, Millett C, et al. Harnessing
the cloud of patient experience: Using social media to
detect poor quality healthcare. BMJ Quality and Safety
2013;22(3):251-55.
271. Rozenblum R, Bates DW. Patient-centred healthcare,
social media and the internet: The perfect storm? BMJ
Quality and Safety 2013;22(3):183-86.
272. Greaves F, Millett C. Consistently increasing numbers
of online ratings of healthcare in England. Journal of
Medical Internet Research 2012;14(3).
273. Bardach NS, Asteria-Peñaloza R, John Boscardin W, et
al. The relationship between commercial website ratings
and traditional hospital performance measures in the USA.
BMJ Quality and Safety 2013;22(3):194-202.
274. Greaves F, Pape UJ, King D, et al. Associations
between web-based patient ratings and objective
measures of hospital quality. Archives of Internal Medicine
2012;172(5):435-36.
275. Perfect storm. Wikipedia: The Free Encyclopedia. Wikimedia Foundation Inc. http://en.wikipedia.org/wiki/Perfect_storm (accessed 18 Nov 2014).
276. Naslund JA, Grande SW, Aschbrenner KA, et al.
Naturally occurring peer support through social media:
The experiences of individuals with severe mental illness
using YouTube. PLoS ONE 2014;9(10).
277. RateMDs. http://www.ratemds.com/ (accessed Nov 2014).
278. Patient Opinion. https://www.patientopinion.org.uk/ (accessed 18 Nov 2014).
279. iWantGreatCare. https://www.iwantgreatcare.org/ (accessed 18 Nov 2014).
280. PatientsLikeMe. http://www.patientslikeme.com/ (accessed 18 Nov 2014).
281. Mumsnet. http://www.mumsnet.com/ (accessed 18 Nov 2014).
282. E-patients.net. http://e-patients.net/ (accessed 18 Nov 2014).
283. Twitter. www.twitter.com (accessed 18 Nov 2014).
284. Facebook. www.facebook.com (accessed 18 Nov 2014).
285. Google+. https://plus.google.com/ (accessed 18 Nov 2014).
286. Greaves F, Millett C, Nuki P. England’s experience
incorporating “anecdotal” reports from consumers into
their national reporting system: Lessons for the United States of what to do or not to do? Medical Care Research
and Review 2014;71(4):65S-80S.
287. Greaves F, Laverty AA, Ramirez Cano D, et al. Tweets
about hospital quality: A mixed methods study. BMJ
Quality and Safety 2014;23(10):838-46.
288. Department of Health, the UK. The power of information: Putting all of us in control of the health and care information we need, 2012.
289. Insights—NHS England [Internet]. http://www.
insights.england.nhs.uk/ (accessed 18 Nov 2014).
290. Mason A, Goddard M, Weatherly H. Financial
mechanisms for integrating funds for health and social care:
an evidence review: Centre for Health Economics, University
of York, UK. http://www.york.ac.uk/media/che/documents/
papers/researchpapers/CHERP97_Financial_mechanisms_
integrating_funds_healtthcare_social_care_.pdf, 2014.
291. Gerteis M, Edgman-Levitan S, Daley J, et al. Through
the Patient's Eyes: Understanding and Promoting Patient-Centered Care. San Francisco: Jossey-Bass, 1993.
292. Sizmur S, Redding D. Core Domains for Measuring
Inpatients’ Experience of Care. Oxford: Picker Institute, 2009.
293. Shaller D. Patient-centered Care: What Does it Take?
United States: Picker Institute and the Commonwealth
Fund, 2007.
294. International Alliance of Patients’ Organisations. What
is Patient-centred Healthcare? A Review of Definitions and
Principles. London: IAPO, 2007.
295. Cronin C. Patient-centered Care: an Overview of
Definitions and Concepts. Washington, DC: National
Health Council, 2004.
296. Sizmur S, Redding D. Key Domains of the Experience of
Hospital Outpatients. Oxford: Picker Institute, 2010.
297. Department of Health the UK. NHS Patient Experience
Framework. 2011.
298. National Voices. A Narrative for Person-Centred
Coordinated Care 2012.
299. Staniszewska S, Boardman F, Gunn L, et al. The
Warwick Patient Experiences Framework: Patient-based
evidence in clinical guidelines. International Journal for
Quality in Health Care 2014;26(2):151-57.
300. NICE Clinical Guideline (CG 138). Patient Experience in
Adult NHS Services: Improving the Experience of Care for
People Using Adult NHS Services. London: National Clinical
Guidelines Centre, 2012.