
RCGP Quick guide: Evaluation

Presenting an evaluation of any quality improvement activity helps to share its results. If the work will need to be formally presented, it is best to plan for this at the outset of the project.
An effective evaluation needs to present:

  • Project aims
  • Background
  • Intervention(s) made
  • Implementation and monitoring methods used
  • Data collected
  • Costings
  • Outputs achieved

How to
Step 1: Aim
Use a driver diagram and/or the Model for Improvement to clarify your aim.
Step 2: Background
Develop the background from your review of the context section.
Step 3: Intervention(s)
Use actions from your driver diagram.

  • The interventions need to be fully described
  • Say whether or not they changed as your programme progressed
  • Identify who your target audience was
  • Demonstrate whether or not they engaged
  • Share their experience.

Step 4: Methods
Use quality improvement tools to implement and monitor the intervention(s).
Step 5: Data
Present baseline data from the diagnosis section, together with continued monitoring data, for example as run charts.
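As an illustration only (not part of the RCGP guide), the sketch below shows how a simple run chart might be produced in Python with the matplotlib library, using made-up monthly audit figures and a baseline median line so that shifts after the intervention can be seen:

```python
# Illustrative run chart: made-up monthly audit data, not from the guide.
import statistics
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
percent_reviewed = [62, 64, 61, 63, 70, 72, 74, 73, 76, 78, 77, 80]

# Median of the assumed pre-intervention (baseline) months, here Jan-Jun.
baseline_median = statistics.median(percent_reviewed[:6])

plt.plot(months, percent_reviewed, marker="o", label="% of patients reviewed")
plt.axhline(baseline_median, linestyle="--",
            label=f"Baseline median ({baseline_median}%)")
plt.ylabel("% of patients reviewed")
plt.title("Run chart: monthly review rate")
plt.legend()
plt.show()
```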
Step 6: Costings
Draw costings from your review of the context section; they also form part of the description of the intervention(s).
Step 7: Outputs
Use run charts or SPC charts for quantitative data, and also describe qualitative results. The third part of the PDSA cycle, the Study stage, involves considering whether or not the change has brought about improvement.
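Again as an illustration only (not from the guide), a minimal Python sketch of an individuals (XmR) SPC chart on the same kind of made-up monthly data, using the standard 2.66 moving-range constant for the control limits:

```python
# Illustrative XmR (individuals) SPC chart on made-up monthly data.
import matplotlib.pyplot as plt

values = [62, 64, 61, 63, 70, 72, 74, 73, 76, 78, 77, 80]  # e.g. monthly % reviewed

mean = sum(values) / len(values)
moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)
ucl = mean + 2.66 * avg_mr  # upper control limit
lcl = mean - 2.66 * avg_mr  # lower control limit

plt.plot(values, marker="o", label="Monthly value")
plt.axhline(mean, label=f"Mean {mean:.1f}")
plt.axhline(ucl, linestyle="--", label=f"UCL {ucl:.1f}")
plt.axhline(lcl, linestyle="--", label=f"LCL {lcl:.1f}")
plt.title("SPC (XmR) chart: monthly review rate")
plt.legend()
plt.show()
```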
An evaluation should explain:

  • What you planned to do
  • Whether or not it worked
  • Why the actions taken were or were not successful
  • Any side-effects or unintended consequences of your programme.

Example: RCGP evaluation of quality improvement in diabetes care, drawing on multiple evidence types

  • Questionnaires
  • Semi-structured interviews
  • Data:
      • National Diabetes Audit data
      • Quality and Outcomes Framework data
      • Practice level data
  • Reflection templates
  • Minutes of meetings

These types of evidence were then applied in three domains:

  1. Context: a description of the various aspects of context, both internal (culture, leadership, teamwork, capacity for improvement) and external (evidence, regulation, technology and demographics), as well as perceived enabling factors and barriers to quality improvement.
  2. Implementation: a detailed description of the intervention itself, including the degree of engagement and actual implementation at the various levels compared with the original plan.
  3. Impact: an assessment of the impact of the intervention in both quantitative and qualitative terms.

Sometimes a more detailed evaluation is required, and a variety of methods can be used. These may require support from academic teams with expertise in evaluation.