Free article: Value-added service auditing

Published: Tuesday, 17 June 2014

Tim Dallinger looks at how to audit your service with reference to the CQC’s five key questions.


  • Service audits should drive improvements in service provision; such an approach is proactive and leads to more effective service delivery.
  • Service audits should ensure that services are meeting the Care Quality Commission’s five key questions.
  • The most effective service audits will be those carried out by people who are not connected to the service but who have a good knowledge of the regulations, best practice and the type of service.

Self-regulation is back on the agenda, albeit in a less high-profile manner than before the very public failings of some services in the health and social care sector. Now the UK government, in the form of the Care Quality Commission (CQC), has proposed five key questions on which it will base the inspection and review process. These are:

  • Is it safe?
  • Is it effective?
  • Is it caring?
  • Is it responsive to people’s needs?
  • Is it well-led?

What does the CQC mean by these five questions?

By safe, the CQC means that people are protected from physical, psychological or emotional harm. Examples of unsafe care would include:

  • people getting infections because of poor hygiene and infection control
  • members of staff lifting people from their wheelchairs by holding them under their arms
  • staff who are unsure about some residents’ medical conditions because they have not been given instructions, support or guidance
  • a lack of systems in place to make sure people get the fluids they need to keep them hydrated.

By effective, the CQC means that people’s needs are met and their care is in line with nationally recognised guidelines and relevant National Institute for Health and Care Excellence (NICE) quality standards. Effective new techniques are used which give people the best chance of getting better or living independently. Examples of ineffective care would include:

  • people do not get the care they need because care plans have not been reviewed and updated
  • staff do not have the knowledge, skills and experience to meet service users’ needs
  • people’s nutritional needs are not met because staff are unaware of a specialist eating and drinking plan.

By caring, the CQC means that people are treated with compassion, respect and dignity and that care is tailored to their needs. Examples of uncaring provision include:

  • care home staff do not understand people’s individual needs
  • care staff do not spend time talking to people and making sure they have the opportunity to take part in activities that they enjoy
  • there is little stimulation for people using the service
  • staff do not interact positively with people or engage with them in any meaningful way.

By responsive, the CQC means that people get treatment and care at the right time, without excessive delay, and that they are listened to in a way that responds to their needs and concerns. Examples of non-responsive care include:

  • staff identify that a service user has a pressure ulcer on a Saturday; they fail to act on this until Monday morning, and during this time the pressure ulcer gets worse
  • a service user’s family express a concern about their relative’s care and the staff do not act on this for one week
  • a care agency is made aware of the risk of aggression from a service user; they do not review the risk assessment immediately.

By well-led, the CQC means that there is effective leadership, governance (clinical and corporate) and clinical involvement at all levels of the organisation, and an open, fair and transparent culture that listens to and learns from people’s views and experiences to make improvements. The focus of this is on quality. Examples of a service that is not well-led include:

  • decisions about the quality of care not being based on sound evidence and information
  • a lack of a good complaints procedure that drives improvement
  • failing to properly train and supervise staff
  • gaps in mandatory staff training, including moving and handling, safeguarding of adults and children, first aid and infection control
  • inconsistent records and a lack of evidence to show that care and treatment is being appropriately planned and delivered.

Quality assurance pitfalls

The Essential Standards of Quality and Safety place an obligation on care providers to keep the quality of their service under constant review. In reality, most providers carry out periodic quality assurance checks backed up with spot checks and responses to incidents. This is a rather reactive approach, as it is based on what has happened rather than on what is happening.

Another common pitfall is to base the quality audit solely on service user questionnaires. As all those who work in the sector will realise, the return rate is normally on the low side and responses tend to be skewed by how the service user feels about the service at that particular time. Often such questionnaires ask service users and their families to rate the service – for example, ‘good’, ‘satisfactory’ or ‘poor’. This provides little scope for constructive feedback that can be used to develop the service.

Another common failing of the service audit process is that audits are carried out by those responsible for delivering the service. They tend either not to notice the obvious, because they see it every day, or to ignore clear areas of concern, because they are ultimately responsible for them in the first place.

Effective service audits

The most effective service audits will be those carried out by people who are not connected to the service but who have a good knowledge of the regulations, best practice and the type of service. This leads to an objective audit with no vested interests. There are two downsides to this approach: firstly, the cost; and secondly, that the people carrying out the audit can rarely resist the temptation to suggest further work (for them) to address the areas of non-compliance they find. As a result, care providers may distrust the audit, making it a worthless exercise.

If the care provider is to carry out an in-house audit then it would be wise to design and agree a set format to be used. If this runs to many pages and the audit takes many days to complete then it will soon fall by the wayside.

It may be more effective to design an overview service audit which checks the main areas of compliance and, where this identifies concerns, to drill down further into those areas as part of the action plan.

An example of this would be a service audit which, as part of the ‘is it safe?’ question, looks at health and safety. One of the audit checks could be whether all staff have access to the health and safety policy. This is a zero-tolerance check: if 38 staff out of 40 have access to a copy of the policy, then the service has failed this check. The service should then, as part of the action plan, check the status of other key policies.
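The zero-tolerance logic described above can be sketched in a few lines. This is purely illustrative: the function name, the pass/fail wording and the figures are assumptions for the example, not part of any CQC requirement.

```python
def zero_tolerance_check(compliant: int, total: int) -> str:
    """Pass only if every member of staff meets the requirement.

    A single non-compliant member of staff fails the whole check,
    which is what makes this a zero-tolerance question.
    """
    return "pass" if compliant == total else "fail"

# 38 of 40 staff have access to the health and safety policy:
print(zero_tolerance_check(38, 40))  # fail
print(zero_tolerance_check(40, 40))  # pass
```

The same pattern could be reused across the other key policies identified in the action plan, with each check reported separately rather than averaged into an overall score.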

Value-added auditing

There is little point in carrying out a service audit merely to comply with some law or regulation. Such an attitude leads to the audit becoming a tick-box exercise in which areas of non-compliance are seen as a negative. If the aim is instead to identify areas of service improvement, then the auditing process adds value to the service, and when areas for improvement are identified these are regarded as a positive event.


Service audits should drive improvements in service provision; such an approach is proactive and leads to more effective service delivery. It also provides evidence of continuous service review, good leadership and a well-run service.



About the author

Tim Dallinger provides training and consultancy services to care homes, care agencies, local authorities and educational establishments, with an emphasis on practical techniques that work in the real world. You can contact Tim via email.

This article was first published in the January 2014 issue of Quality & Compliance Magazine.
