The Office for Students is establishing an Evidence and Impact Exchange – an independent ‘What Works Centre’ dedicated to helping universities improve access and outcomes for students from underrepresented backgrounds. The organisations that will collectively establish the new centre have just been announced as King’s College London, Nottingham Trent University and the Behavioural Insights Team. In this article, Susannah Hume and Michael Bennett from King’s share what they’ve learnt to date on what works in improving diversity and inclusion in universities.
KCL widening participation event, 2017. Photo: David Tett
University: unfair and unequal?
In 1963, the Robbins Report determined that ‘courses of higher education should be available for all those who are qualified by ability and attainment to pursue them and who wish to do so’. Over the ensuing decades, progress has been made towards this principle.
But gaps in participation between those from different backgrounds have persisted. For instance, students from the highest-participation postcodes are 2.4 times as likely as those from the lowest-participation postcodes to attend university. At universities with the highest entry requirements, the gap widens to 5.7 times.
While universities have reported to government on their widening participation activities for over a decade, very little attention has been paid to whether they have been investing their resources in activities that actually make a difference to university participation rates for students from non-traditional backgrounds.
Recently, however, the Office for Students (OfS) has led growing calls for better evaluations to tell us ‘what works, how and for whom’, and we are starting to see this flow through into practice.
We were delighted when the OfS announced it would establish an Evidence and Impact Exchange to promote social mobility in higher education by encouraging ‘the generation, translation and adoption of high quality evidence and evaluation’. To us, this signalled a step-change in widening participation evaluation approaches.
So what works to increase representation at university?
At the moment, there is a long way to go before we have robust empirical evidence to inform effective practice as a sector.
If we are really to understand what works, why and for whom, we will need to harness whichever approaches are best suited to a given intervention, including a strong commitment to mixed methods. But focusing on randomised controlled trials (RCTs) and quasi-experimental methods, we can say the following:
● Outreach. There is some evidence from the US that ‘black box’ outreach interventions (comprising several activities such as test-taking and study-skills assistance, academic advising, mentoring, tutoring, college campus visits, and financial aid application assistance) can be effective. This is complemented by substantial realist and process evaluations of outreach activities in the UK.
● Assistance with forms. A large RCT in the US found that providing assistance to complete financial aid application forms increased university attendance.
● Ongoing contact. An RCT, again in the US, found that sustained contact over the summer, in the form of text messages and peer mentor outreach, helped reduce ‘summer melt’ among college-bound students.
● Role models. Research conducted by the Behavioural Insights Team and the University of Bristol suggests that inspirational talks and letters can be effective.
There are some glaring gaps in this list, including some of the cornerstones of widening participation in the UK, such as summer schools, volunteer tutoring and contextual admissions. There is also little evidence on what universities can do to raise the school-level attainment of disadvantaged students – something the Education Endowment Foundation and others argue is the biggest barrier to young people achieving their goals in education and work.
However, the existing evidence does provide a base on which universities can design, test and refine new approaches. As a sector, we are starting to do this, with a growing number of RCTs, including some driven by the OfS through the National Collaborative Outreach Programme evaluation, and increasing awareness of other methods for measuring the impact of interventions.
KCL widening participation event, 2017. Photo: David Tett
A whole-institution, whole-lifecycle approach
King’s set up the What Works Department in September 2017. The Department works with both Widening Participation and Student Success units to conduct research and evaluations of initiatives aimed at supporting underrepresented students to get to university and succeed once they’re here.
We’ve already started running trials that demonstrate that an individual institution can run an RCT on complex outreach activities. Right now, we’re excited about an RCT we’ve just run in partnership with four other universities, testing the effect of sending postcards from current undergraduates to their teachers (to see whether teachers feel more appreciated and motivated to support more students into university).
As our student body has become more diverse, we’ve also started applying a ‘what works’ approach to supporting student success, to make sure our services and schemes work for all students once they’ve made it to university. This has involved running fourteen RCTs to date, testing ways of encouraging students to engage with university life, with another two in the field and many more planned. We’re also using quasi-experimental, non-causal and qualitative research designs to understand what matters to students as well as what works.
Sharing what works
An important aspect of the What Works Department’s mission has always been to take the conversation outward: to publish the findings of our evaluations (both successful and not so successful), to partner with other institutions on shared research, and to support other institutions in developing the capacity to conduct evaluations and learn what works for their own students.
We are thrilled to have the opportunity to do this via the new Evidence and Impact Exchange.
We believe that robust, causal evaluation of universities’ widening participation and student success activities is a practical and moral imperative for the sector. It protects funding. It gives us confidence that we are doing the right thing for our students and outreach participants. And it allows us to build an evidence base on what works, how, and for whom.
If you’d like to find out more, you can read our research findings, both in reports and via our blog. More details on our plans for the Evidence and Impact Exchange can be found on the OfS blog.