13. Performance measures are indicators which allow an organisation to see whether it is achieving its strategic objectives and compare itself against similar organisations as well as its own performance over time. For example, the Commission’s performance measures include the time it takes to respond to safeguarding concerns and the proportion of providers rated as inadequate or requires improvement that improve when re-inspected.
14. In its March 2012 report, the previous Committee criticised the Commission’s lack of adequate performance measures. The Commission has since updated the framework it uses to measure and report its performance, but not all the measures can be quantified and only 6 of the Commission’s 37 performance measures have specific targets. Without targets, members of the public and Parliament cannot know whether the Commission is performing well against its statutory duties to protect the health, safety and welfare of people who use health and social care services. The Commission told us that “as we develop our business plan for 2016–17, we will make sure that for every one of the things that we are committing to do, there is a performance indicator that is clear and can be counted”.
15. Since April 2015, the Commission has been responsible for market oversight of difficult-to-replace adult social care providers. From April 2016, it will also take on responsibility for providing assurance over how efficiently hospital trusts use their resources. We challenged the Department about how the Commission’s new responsibilities for assessing financial efficiency would fit alongside work already carried out by Monitor and the NHS Trust Development Authority to assess the financial sustainability of NHS trusts. The Department told us the Commission would provide the “single independent version of the truth” about a trust’s quality, safety, effectiveness, leadership and use of resources. It said that Monitor and the NHS Trust Development Authority would work together as a joint partnership, known as NHS Improvement, to provide professional support between inspections. The Commission added that it would focus on whether trusts are using resources to add value to the quality and safety of care, in contrast to work carried out by Monitor, which focuses on a trust’s balance sheet and whether it is in surplus or deficit.
16. Providers told us that the Commission’s reporting requirements already overlap with those of other bodies, which raises concerns that its new responsibilities will result in further duplication of effort. The Commission published a consultation document on the morning of our evidence session asking for views on how it should implement its new responsibilities for assessing the efficiency with which hospitals use their resources. It told us it expected its assessments to draw extensively on data already collected by Monitor and the NHS Trust Development Authority, as well as information published in trusts’ audited accounts. However, at this stage, it was unable to explain to us how, in practice, it would coordinate with, and draw on the expertise of, these bodies. The Department acknowledged that there was a risk of duplication in the system and said that it and the oversight bodies would need to work a lot harder to eliminate unnecessary burdens for providers.
17. The National Audit Office reported that the Commission had not recruited the staff with the skills it needed to support the new responsibilities it took on from April 2015 for providing financial oversight of difficult-to-replace adult social care providers. By October 2015, the Commission had filled only four of the five senior posts it needed. As a result, the Department temporarily shared these responsibilities with the Commission, overseeing the five largest providers of adult social care until October 2015.
18. The Department told us it was providing the Commission with sufficient time to develop and pilot an approach to carrying out its additional responsibilities for assessing use of resources by acute hospital trusts. However, the Commission admitted that it would not be ready to roll out its assessments of hospital efficiency to all hospitals until January 2017, almost a year after it takes on that responsibility in April 2016. It said it planned to pilot its new approach from April 2016, but it would take longer to consult on, and refine, its methodology. The Commission said it anticipated the work would most likely involve desk-based reviews of existing information, rather than substantial new fieldwork or data collection. To do this it expects to need a small pool of staff with analytical skills.
19. After our evidence session the Department wrote to us on the subject of the Commission’s new responsibilities. On assessing the use of resources by hospital trusts, it pointed out that the Health and Social Care Act 2008, as amended by the Care Act 2014, gave the Commission powers to carry out reviews and performance assessments of registered providers. It quoted the 2008 Act that “the assessment of the performance of a registered provider is to be by reference to whatever indicators of quality the Commission devises”. The Department argued that the Commission “is not under a duty to commence the use of resources ratings until it determines that the indicators are sufficiently robust to do so, nor does it have to do this work by a specific date”. The Department’s position is therefore that the Commission not being ready to carry out its new responsibilities until 2017 is not in breach of the relevant legislation. The Commission has, however, told the public that it will start assessing the use of resources by April 2016. There is therefore a risk that people will reasonably assume the Commission is providing a level of assurance over the efficient and effective use of resources that, in reality, it will not provide until much later.
20. It is becoming increasingly important for providers and commissioners to integrate services around the needs of patients, to improve people’s end-to-end experience of the health and social care services they receive. For example, the Chief Executive of Warrington and Halton Foundation Trust told us this was particularly important for managing the discharge of patients from hospital. In this example, the trust has to work closely with community and intermediate care providers to ensure a smooth transition and avoid a situation where patients stay in hospital longer than needed.
21. The Department and providers all agreed on the importance of regulation being focused on the needs of patients and local populations. However, providers told us that the current regulatory system is too heavily focused on single providers in isolation and does not do enough to look at the experiences of patients who receive care from more than one organisation. This can be particularly important when organisations have more than one area of responsibility or when they depend on others, for example when patients are discharged from hospital to community or intermediate care. The Department recognised that evaluating the quality of care across and between providers remains a key challenge.
22. A further complication is that a provider’s performance is influenced by decisions taken by local commissioners, but the Commission has no power to scrutinise clinical commissioning groups or local authorities. The Department told us it is developing, but has not yet completed, what it described as a ‘scorecard’ to assess the performance of clinical commissioning groups. It hoped that the Commission would be able to use this information, once available, populated and published, to assess the economy, efficiency and effectiveness of services to people living in a given locality.
22 The Care Quality Commission: Regulating the quality and safety of health and adult social care, 78th report of Session 2010-12, HC 1779, 30 March 2012
Prepared 9 December 2015