https://whatworks.blog.gov.uk/2018/05/04/5-lessons-learned-about-what-works/

Five lessons we’ve learnt in our first five years


Last month marked exactly 5 years since the What Works Network was established to feed better evidence into the way we make decisions across the public sector.

We’ve come a long way since 2013. The network now comprises 10 independent research centres in areas such as health, education, and children’s social care.

But these are no ordinary research centres. They tell us what works in their respective policy areas by summarising the existing evidence base. Where the available evidence is weak, a number of centres have commissioned trials to provide commissioners, policymakers, and frontline workers with the answers they need. And these centres have developed innovative ways of helping decision makers act on this evidence.

Alongside the network, we’ve also established the Cross-Government Trial Advice Panel, which helps civil servants design, deliver and analyse high-quality impact evaluations. With almost 50 projects under our belt, we’ve learnt a thing or two about designing robust policy experiments.

Here are 5 lessons we’ve learnt along the way.

1. Randomised controlled trials aren’t the only way of assessing the impact of a policy or programme.

Randomised controlled trials are still considered the gold standard for measuring the impact of a policy, but where a trial would be impractical or unethical the Trial Advice Panel has sometimes recommended quasi-experimental designs such as propensity score matching or regression discontinuity analysis.

Quasi-experiments construct a comparison group statistically, without randomly assigning treatments to participants in the trial. They’re particularly useful in ethically sensitive situations where you already hold good data about participants – for instance, when sentencing offenders, where random assignment would clearly be unacceptable.
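To make the idea concrete, here is a minimal sketch of one such design – propensity score matching – in Python. The dataset, column names and the use of scikit-learn are illustrative assumptions rather than anything the Trial Advice Panel prescribes; the point is simply that the comparison group is built from observed characteristics instead of by randomisation.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def matched_effect(df: pd.DataFrame, treatment: str, outcome: str, covariates: list[str]) -> float:
    """Estimate the average effect on the treated via 1:1 nearest-neighbour
    matching on the propensity score (illustrative sketch only)."""
    # Model the probability of receiving the intervention given observed covariates.
    model = LogisticRegression(max_iter=1000).fit(df[covariates], df[treatment])
    scored = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])

    treated = scored[scored[treatment] == 1]
    control = scored[scored[treatment] == 0]

    # For each treated unit, find the untreated unit with the closest propensity score.
    nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    matched_controls = control.iloc[idx.ravel()]

    # Compare average outcomes between treated units and their matched comparators.
    return float(treated[outcome].mean() - matched_controls[outcome].mean())

# Hypothetical usage: a DataFrame with a 0/1 treatment column, an outcome column
# and background covariates (all names invented for this example).
# effect = matched_effect(df, "got_programme", "reoffended", ["age", "prior_convictions"])
```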

2. It’s socially acceptable to experiment on children!

In the Education Endowment Foundation’s (EEF) early days, there were concerns that testing out teaching approaches on children would be viewed as unethical and would not be accepted by schools. Five years on, the EEF has conducted over 150 trials, in which over a third of English schools have volunteered to take part. As a result, the EEF is responsible for an estimated 10% of education trials worldwide and their evidence is trusted by thousands of headteachers across the country.

Image caption: The Education Endowment Foundation has demonstrated that it is possible to run a robust and ethical trial in a school setting.

3. Finding out what doesn’t work is just as important as understanding what does.

It turns out that antibiotics are ineffective in treating most cases of sinusitis (despite being routinely prescribed). Transferring young people from the juvenile to the adult criminal justice system makes reoffending more likely. Repeating a school year, school uniforms, peer-to-peer teacher observation, and new school buildings do nothing for pupil attainment. Only by finding out what doesn’t work – and being transparent about it – can we identify where money can be saved and re-invested in effective interventions.

4. Mobilising evidence is harder than we first imagined.

Simply making evidence available isn’t enough to change practice on the ground. In one of its recent trials (dubbed the ‘Literacy Octopus’), the EEF tested various methods of engaging teachers with evidence, including printed practice guides, conferences and webinars, and found that none of them increased the likelihood of teachers adopting the recommended practices in the classroom.

Many of the What Works Centres are now trying to accelerate the adoption of new evidence through more sustained engagement with practitioners (for example, the Early Intervention Foundation’s Early Intervention Police Academy, the Centre for Ageing Better’s strategic partnerships with local authorities, and the EEF’s Research Schools Network).

5. Short-term effects don’t always mean long-term benefits.

The Early Intervention Foundation (EIF) has assessed over 100 programmes and has warned that studies which do not assess long-term outcomes (at least a year after the intervention) – or do not assess them well – cannot tell us whether short-term effects persist. How do we know, for example, that a programme to help people find employment delivers meaningful results if we only track participants for six months after they start a new job? To avoid issues like these, the EIF has recently released new guidance for evaluators.

We still have a lot to learn, and will keep this blog updated with our latest news. If you'd like to hear about new content, you can subscribe for email updates.
