Blog What Works

Cabinet Office

Meet the What Works Trial Advice Panel

Categories: Blog posts, Evaluation, Resources for civil servants

What Works Trial Advice Panel Members

Today sees the relaunch of the What Works Trial Advice Panel (TAP), which now includes experts with experience in a broader range of policy areas and evaluation methods. Meet a few of them below!

Dr Florentina Taylor - Senior Evaluation Manager, Education Endowment Foundation

Florentina has 12 years' research experience and has led over 130 research and evaluation projects, including managing 30 cluster and individual randomised controlled trials for the Education Endowment Foundation (EEF). She has worked closely with senior civil servants for several years, providing advice on pragmatic impact evaluation, data protection and data access.

The best piece of evaluation advice I've received:  'Don't believe everything you think!'

The evaluation I'm most proud of, in a funny way, is my first individual RCT, for which I manually randomised many hundreds of pupils to intervention and control, stratified by gender and subject choice. It took close to 30 hours of painstaking work and a giant stack of papers. Hilarious, in hindsight, but very good training for later.                                 

I enjoy riding my Kawasaki Z650 motorbike on quiet country lanes. Nothing clears the mind of p-value controversies better than the smell of farmed land at 60mph (75% CI).

Joseph Scarlett-Smith - Principal Behavioural Insight Advisor, HM Revenue & Customs

Joe is a Principal Behavioural Insight Advisor in HMRC's Behaviour Insight and Research team, the largest in-house behavioural science team in Government. Joe has been a member of the team for four years, over which time he has conducted large-scale experimental and quasi-experimental evaluations involving millions of taxpayers.

The best piece of evaluation advice I've received: There is often a direct trade-off between how complicated a trial is to set up, and how difficult it is to understand the data it produces.

The evaluation I'm most proud of is a mixed-method evaluation of a new guidance system for call-handlers that did not actually work (the new system didn't lead to any measurable difference in staff behaviour). Despite the failure of the intervention, we learned a lot about a whole host of different behaviours, lessons we have since applied (successfully!) in other areas.

I enjoy baking sourdough bread and cycling – I am basically the walking cliché of a 30-something ex-hipster.

Stacy Sharman - Head of Evidence and Evaluation, Department for International Trade

Stacy is currently the Head of Evidence and Evaluation at DIT, with 15 years' experience working as a researcher. She has commissioned several large-scale evaluations, including an evaluation of Traineeships in the further education context and an evaluation of local flood risk management at Defra.

The best piece of evaluation advice I've received: Pay attention to what the intervention is actually doing and the context in which it is delivered. Published papers sometimes do not describe the intervention in any detail, which makes replicating a successful intervention even more challenging.

The evaluation I'm most proud of is commissioning the Traineeships evaluation while working in Further Education and Skills. It was a complex multi-method evaluation that used propensity score matching to assess impact.

I enjoy cooking and spending time in the kitchen, where my current challenge is to make more vegetarian and meat-free dishes to reduce the amount of meat I eat.

Dr Sonia Ilie - Senior Research Fellow, University of Cambridge and RAND Europe

Sonia is a senior research fellow at the Faculty of Education, University of Cambridge, and a research leader at RAND Europe, an independent not-for-profit research organisation. She researches educational inequality and runs large-scale experimental and quasi-experimental evaluations of programmes tackling the socio-economic attainment gap in schools and inequitable access and outcomes in higher education.

The best piece of evaluation advice I've received is to always try to address the ‘if’ together with the ‘how’ – in other words to look at impact and process at the same time when evaluating an intervention.

The evaluation I'm most proud of is a randomised controlled trial of a university access charity's flagship programme. Deciding on the trial design took time and many iterations, but it resulted in a lean, practical, ethical and methodologically robust evaluation.

I enjoy a good crossword or number puzzle, preferably alongside some very good coffee. 

Meet the full panel here.
