Appendix E - Quality Indicators for MH PbR Framework – April 2013

APPENDIX E - DATA QUALITY ISSUES

In October 2012, the Health and Social Care Information Centre (HSCIC) raised data quality issues with the first set of indicators for which it had produced analysis. These issues may either affect the usefulness of individual indicators or, in themselves, be of interest as quality indicators. This document summarises the issues raised by the HSCIC.

Overview

MHMDS version 4, introduced in April 2011, includes a number of new data items and changes to the way in which some existing items are submitted and to how the data is processed. A great deal of validation occurs on submission of the data to ensure that it is submitted in the correct format and for the relevant time period. Nevertheless, users of the data should bear in mind that individuals' packages of care vary widely, and a variety of approaches are required to assure the accuracy, coverage and consistency of the submitted data. There is no single method for assessing the quality of the data, nor any practical way of ensuring that all the data submitted is correct.

The HSCIC highlighted issues that are relevant to individual indicators, summarised under the following headings:

1. Coverage – data being supplied where expected
2. Accuracy – code validity
3. Duplication – inconsistencies in the data where there are multiple records for a single item
4. Definitions – problems with interpreting requirements
5. Failure to supply End Dates – v4 requirements not fully implemented
6. Novelty and usage – items not prominent in published data are less well recorded

Further specific points raised by the HSCIC include:

Defining In Scope patients

One issue affects all indicators. The population to be used for the analysis is patients considered 'in scope' for clustering. There has been considerable work to define a method for identifying this group within the whole dataset, and the HSCIC has been publishing experimental figures by provider in its quarterly publications this year. The HSCIC has published a denominator in scope population (at the end of the reporting period) and the proportion of these patients that have been assigned to a cluster. Even for providers in the vanguard of PbR, these proportions are lower than expected. It is not clear whether this is due to something missing in the definition of 'in scope' patients, or whether the denominator is being inflated by providers submitting data for patients 'left open' on the books but no longer actually in care. The methodology for defining in scope patients has been refined and amended for these draft indicators, and it is hoped that there will be an improvement. However, on average, 30% of 'in scope' patients have not been assigned to a cluster (where cluster is NULL), and it is necessary to know whether this accurately reflects the implementation of PbR Clusters or whether the denominator is in some way inflated. Individual providers are encouraged to look carefully at the figures produced to help the HSCIC understand what is happening. As the method for identifying 'in scope' patients in the data underpins all the indicators, this is a key issue for consideration.

In Scope Definition (from HSCIC)

Spells of care open in each Reporting Period (generally one per patient per provider in the RP) where the patient has had at least 2 face to face (or other proper) contacts since their referral into the provider's service, OR at least one night in hospital (2 bed days). The referral may have been in an earlier RP, but if it was in 2010/11, the spell is automatically assumed to have included the required contacts/bed days.

Of these, EXCLUDE patients for whom the most recent Team Episode (by start date) was with an excluded team type (i.e. being assigned to a non-excluded team more recently than they were assigned to an excluded team puts them back 'in scope'). Do not exclude spells with no Team Episode or where the Team Type data is missing.

Excluded teams are:
Psychiatric Liaison
Psychological Therapy Service (IAPT)
Primary Care Mental Health Service
Forensic Service
Community Forensic Service
Learning Disability Service
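The selection rules above amount to a per-spell filter. The sketch below is illustrative only and is not the HSCIC's derivation: the record structures and field names (referral_date, face_to_face_contacts, hospital_nights, team_episodes) are assumptions made for the example, and the real processing works on the submitted MHMDS data items, in which team types are coded values rather than the names listed above.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

# Team types excluded from the 'in scope' population (see list above).
EXCLUDED_TEAM_TYPES = {
    "Psychiatric Liaison",
    "Psychological Therapy Service (IAPT)",
    "Primary Care Mental Health Service",
    "Forensic Service",
    "Community Forensic Service",
    "Learning Disability Service",
}

@dataclass
class TeamEpisode:
    start_date: date
    team_type: Optional[str]        # None where Team Type data is missing

@dataclass
class Spell:
    referral_date: date
    face_to_face_contacts: int      # face to face (or other qualifying) contacts since referral
    hospital_nights: int            # nights in hospital since referral
    team_episodes: List[TeamEpisode] = field(default_factory=list)

def in_scope(spell: Spell) -> bool:
    """Apply the in-scope rules to one spell of care open in the reporting period."""
    # A referral made in 2010/11 is assumed to have included the required
    # contacts / bed days.
    referred_2010_11 = date(2010, 4, 1) <= spell.referral_date <= date(2011, 3, 31)

    meets_activity = (
        referred_2010_11
        or spell.face_to_face_contacts >= 2
        or spell.hospital_nights >= 1
    )
    if not meets_activity:
        return False

    # Exclude only when the most recent Team Episode (by start date) has a known,
    # excluded team type. Spells with no Team Episode, or with missing Team Type
    # data, are not excluded.
    if spell.team_episodes:
        latest = max(spell.team_episodes, key=lambda ep: ep.start_date)
        if latest.team_type in EXCLUDED_TEAM_TYPES:
            return False
    return True
```

The indicator denominator would then be the spells for which this filter returns True, which is why any ambiguity in these rules, or in the underlying contact and Team Episode data, flows through to every indicator.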
Individual Indicators

The proportion of users in each cluster who are on CPA:

Recording CPA is well established in provider organisations, and CPA has been a feature of some well-established national indicators (cf. NHS Performance Framework). However, there are two known issues that might inflate the number recorded as being on CPA:

Changes to the way CPA is recorded in version 4 require an End Date to be supplied when the patient is no longer on CPA. There is some evidence across the dataset of Episode End Dates not being implemented consistently; this would mean the patient appears to still be on CPA when they are not.

There is anecdotal evidence that some providers are still recording CPA as if the old 'standard' CPA still needed recording, rather than only recording new refocused CPA (as redefined in 2008) as required.

Where CPA appears in an indicator, both these issues could apply.

The proportion of users on CPA who have had a review within the last 12 months:

This measure has been part of the NHS Performance Framework and is routinely produced from MHMDS. However, because of inconsistencies in the CPA Episode data, the HSCIC derived a flag to indicate where the data suggests there has been a 12 month continuous period of CPA, which might be calculated from separate, overlapping and duplicating Episodes on CPA. So the existing method covers up some issues of inconsistency in recording CPA. Additionally, the HSCIC has had feedback from providers that they would like the definition to be tightened up so that the numerator looks for a CPA Review within 12 months of the end of the current reporting period OR the end of the CPA Episode, whichever is sooner. The HSCIC used the existing definition but indicated that this would be a good opportunity to publicly revise it.

The accommodation status of all users:

Four different versions of this indicator were analysed:

(i) The proportion of users on CPA with a settled accommodation status.
(ii) The proportion of users on CPA with a valid accommodation status.
(iii) The proportion of all users with a settled accommodation status.
(iv) The proportion of all users with HoNOS Q11 scores of 0 or 1 – indicating no or little problem with accommodation.

Version (iv) uses a different data item for the accommodation numerator, and coverage is expected to be better. Versions (i) and (iii) compare the completeness of accommodation status recording between those on CPA and those not on CPA. Because the focus of national indicators (including NI 149) has been on patients aged 18-69 on CPA, recording of accommodation for this group is better.

Ethnicity completeness

MH providers provide the most accurate and comprehensive ethnic coding in the NHS. It is understandable that the more intensive clusters show slightly higher levels of coding, as the level of contact is more likely to provide an opportunity to check self-assigned ethnic group.

The intensity of care (bed days as a proportion of care days):

Two versions of this indicator provide a direct data quality comparison. The first uses a total count of bed days, derived from the Start and End of Ward Stays and the boundaries of the reporting period. The second cleanses this count to remove the impact of duplicate and overlapping records. The difference between the two numerators is of interest: it shows inaccuracies and inconsistencies in the data provided, since a person can only be in one bed at once and the maximum number of bed days in a period must be the end date minus the start date. These inaccuracies could arise from failure to provide End Dates, or possibly from issues with the extraction process from local systems that could give rise to duplication.
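The cleansed numerator described above amounts to merging duplicate and overlapping Ward Stay intervals and clipping them to the reporting period before counting days; the same interval-merging idea underlies the derived flag for a 12 month continuous period of CPA mentioned earlier. The sketch below is illustrative only, assuming simple (start, end) date pairs rather than real MHMDS Ward Stay records, and assuming that a missing End Date is treated as running to the end of the reporting period.

```python
from datetime import date
from typing import List, Optional, Tuple

def cleansed_bed_days(
    ward_stays: List[Tuple[date, Optional[date]]],
    rp_start: date,
    rp_end: date,
) -> int:
    """Count bed days in a reporting period for one patient, removing the effect
    of duplicate and overlapping Ward Stay records."""
    # Clip each stay to the reporting period; treat a missing End Date as
    # running to the end of the period.
    clipped = []
    for start, end in ward_stays:
        end = end or rp_end
        start, end = max(start, rp_start), min(end, rp_end)
        if start < end:
            clipped.append((start, end))

    # Merge overlapping or duplicated intervals so each night is counted once:
    # a person can only be in one bed at a time.
    clipped.sort()
    merged: List[Tuple[date, date]] = []
    for start, end in clipped:
        if merged and start <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))

    # By construction the result cannot exceed the length of the reporting
    # period (the upper bound noted in the text: end date minus start date).
    return sum((end - start).days for start, end in merged)
```

The uncleansed numerator would simply sum the clipped stay lengths without merging; the gap between the two counts is the duplication and inconsistency the HSCIC describes.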
The proportion of users with a crisis plan in place:

The requirement to provide information about a Crisis Plan was mandated from April 2012, so levels of recording are likely to increase over time and to be low during 2011/12. Variation may not reflect clinical practice but rather the time it takes to implement changes in recording and extracting data.

Summary Message from the HSCIC

The Community and Mental Health Team at HSCIC places great emphasis on assuring the quality of the data that we publish. Our quarterly release (http://www.ic.nhs.uk/article/2021/WebsiteSearch?q=Routine+Quarterly+MHMDS+REports&sort=Most+recent&size=10&page=1&area=both#top) includes a Background Statement of Quality as well as organisation level data quality measures. The Background Statement includes much information about our approach to assuring the data at various points in the data flow from submission to publication. Our assurance includes a feedback loop whereby issues are fed back to trusts and our own quality assurance processes are refined and updated to tackle emerging issues.