Knowledge Exchange in Education Briefing Note #1: Working across the divide

This briefing note considers the relations between research, policy and practice, the limits of “disseminate to”, and how else to share responsibilities for system improvement in English education.

Working across the divides between policy, research and practice

Introduction

This briefing note sets out some early insights from the ESRC Education Research Programme (ERP) on the potential for partnership working in education in the UK. Educational researchers have a long history of engaging with a wide range of stakeholders (Whitty, 2006). Yet the terms on which the research community is expected to interact with either policy or practice have become increasingly narrowly defined (Penuel et al., 2020; Greany and Higham, 2018).

This briefing note considers the strengths and weaknesses of current approaches and how to bring about change.

Policy-driven education reform

Over the last few decades, education reform has become a routine part of government activity. Whether governments change or stay in power, reviewing curricula and (re)regulating for system improvement are simply part of what they do. Such an ongoing commitment to education reform has changed the relationships between research, policy and practice in the education field.

The earliest approaches to large-scale, policy-driven reform assumed that the answers were already out there – in the research base scholars had already produced, and/or in the prescriptions other governments were already following. Moving existing knowledge and know-how from here to there would bring about the improvements in education that governments desired. In practice, things have not turned out to be as simple as that.

For policymakers, such disappointments raise political questions about what to do next. This is particularly acute for any government whose strategy has not yielded the expected results. Do they stick with the current approach and apply more pressure, in the hope that the metrics will change; relinquish the evidence base on which the approach was founded and look for more certain answers elsewhere; blame practitioners for failing to follow the guidance given with sufficient enthusiasm; or blame researchers for not having the answers they wanted? Alternatively, they can bury the evidence and turn to other questions. Given the risks of policymaking in the public eye, any of the above might serve.

But research has a different responsibility: to learn from what is not working by interrogating the evidence with care. Policy, research and practice orientate differently to the business at hand. The best analysis begins by recognising that policy is made up of “complex interactions between ideas … institutions … and interests” which colour what people see, and that interacting with policymakers involves “framing a problem, negotiating its meaning and arguing for solutions” (Greenhalgh and Engebretsen, 2022, p 40). These are sense-making activities that structure the available evidence into a compelling narrative (Smith, 2013). This is a useful insight for education.

System improvement and theories of change

There is now a considerable literature in different areas of social policy committed to exploring the conditions under which interactions between policy, research and practice can lead to meaningful change. This has been built from empirical studies of efforts to drive system improvements. Systematic reviews of the literature have kept track of lessons learnt and the ways in which key concepts have developed over time.

The dominant vocabularies used to describe system improvement have certainly changed: from evidence-based practice (EBP) to evidence-informed practice to research-informed practice. Each step in this chain moves away from a secure hierarchy of knowledge claims towards greater recognition that the sufficiency of any particular evidence base depends on what happens in implementation, and that this cannot be fully known beforehand. Indeed, in some cases practice may precisely show up the limitations of an evidence base developed elsewhere (Greenhalgh, 2021).

As more attention has focused on recognising the differences between what researchers, policymakers and practitioners know, so more emphasis has been placed on processes of knowledge mobilisation, knowledge translation and knowledge exchange. These have replaced knowledge transfer as the dominant metaphor. The terms reflect the difficulties encountered in bringing about meaningful change through top-down prescription. Instead, the quest is on to develop other models: models that give greater weight to user perspectives and, by paying greater attention to variations in the social context where the evidence lands, identify more precisely what works, for whom, under what circumstances and why (Pawson and Tilley, 2004). Research builds knowledge for the future in this way, by interrogating the relative strengths and weaknesses of current approaches.

The relations between research, policy and practice in education

In reviewing the role research use plays in system improvement in education, the OECD distinguishes between strategies based on push mechanisms – linear dissemination from research producers to policy or practice; and those based on pull mechanisms – users’ needs informing the research (OECD, 2022, p 19).

The respective weight given to push or pull has consequences for how policymakers, researchers and practitioners interact. Strong backing from policymakers for “disseminate to” strategies can lock whole systems into particular patterns of interaction that are hard to change. Evidence-based practice in the English education system is a case in point.

Evidence-based practice in England

In England, interactions between policy, research and practice have been radically altered by a policy commitment to EBP. This has been hardwired into the system through a combination of structural innovation (endowing the Education Endowment Foundation (EEF) as a What Works Centre, to create a new evidence base by running randomised controlled trials (RCTs) and to disseminate research evidence to schools through new channels) and a range of policy levers requiring practitioners to use the evidence supplied to improve attainment (Pupil Premium spend reported to Ofsted; an EEF-accredited curriculum for Initial Teacher Training and career development).

In a high-stakes accountability regime of the kind England has, the push from an evidence base established in this way is strong. Expectations for measurable improvements are high. But the results across the system have been mixed. A decade of expenditure on EBP has not substantially improved pupil outcomes (40% still fail to achieve the required grades in English and Maths GCSE at 16), nor closed attainment gaps for the most disadvantaged (NAO, 2024). That the inbuilt structures inhibit rather than promote critical reflection on these outcomes may be part of the reason why.

The limits of “disseminate to”

Over time, EEF has moderated its approach. Current advice on using the Toolkit says:

The Toolkits do not make definitive claims as to what will work to improve outcomes in a given school. Rather they provide …‘best bets’ for what might work in your own context.

A ‘best bet’ that “might work” is very different from a ‘definitive claim’. In a system that runs on a strong “disseminate to” logic, this effectively makes users responsible for fixing “knowledge to action” gaps, while absolving those supplying the evidence from doing the same. This seems unfair. If the most definitive advice is that “evidence does not necessarily result in improved outcomes” then, at the very least, system expectations about what will follow from putting evidence into practice need to be dialled down.

The implementation guidance published most recently by the EEF acknowledges the uncertainties involved in translating evidence into practice (Sharples et al., 2024). To remedy this, it details a range of factors that schools ought to consider to achieve the best results. In consequence, the task of implementing an intervention becomes increasingly complex and time-consuming. Devolved to the front line in this way, there are no feedback loops built in that enable professional reflection on the value of the evidence to reach those operating higher up in the system. This prevents the system itself from revisiting the evidence it disseminates or learning from the very contexts it seeks to influence.

Sharing responsibilities for system improvement

Everybody has a responsibility to reflect critically on the evidence of what works in system improvement – whether as policymakers, researchers or practitioners. Currently, there are few ways of doing this in partnership, or of recognising the power relations that influence how these different groups interact. Things are very different in health, where there is greater recognition of the undue influence some actors may have, and an understanding that building in the perspectives of the less powerful is an essential resource. Health is also a far less centrally directed system: politicians do not expect to give prescribing advice.

There are other ways of sharing responsibilities for system improvement amongst researchers, policymakers and practitioners. Co-production, collaborative research, and research based on engagement with users at different points in the research and policy cycles are just some of them. These approaches assume that by engaging more deliberatively with a wider range of users, and taking their views into consideration, the research findings will better address different interests and concerns.

These approaches adopt a different kind of logic from “disseminate to”. Sometimes described as “linkage and exchange”, their theories of change place a higher value on learning through partnership (see Davies et al., 2015). They also recognise that working with others requires a willingness to listen and learn on everyone’s part. For this to happen successfully in education, a “disseminate to” logic that measures system success solely in attainment outcomes would have to be unwound. Many of the key questions in education cannot be answered in this way.

Conclusion

Disappointments with the “disseminate to” logic that EBP adopts rest neither with the quality of the evidence nor with the quality of the workforce. They are to do with the framing of the questions and the insistence on certain answers. This is particularly the case in education.

Approaches that understand that outcomes may be unpredictable in complex systems with many moving parts will build in opportunities for reflective deliberation amongst all those involved. This is very different from prescribing fidelity of implementation. On the surface it may appear less certain, but in practice it ensures a system that can learn from what it does. Instilling more productive interactions between the many different stakeholders in education rests on acknowledging that not knowing the answer is not a system threat. On the contrary, it is a prompt to finding out.

Author: Gemma Moss
Date: July 2024
Funder: ESRC
Grant Reference number: ES/W004917/1

To join the debate

This series of briefing notes invites readers to consider whether education has yet settled on the most productive ways for policymakers, researchers and other stakeholders in education to interact. We welcome further contributions debating the strengths and weaknesses of current approaches.

References

Davies, H.T.O., Powell, A. and Nutley, S. (2015) Mobilising knowledge to improve UK health care: learning from other countries and other sectors – a multimethod mapping study. Health Services and Delivery Research, 3(27).
Greany, T. and Higham, R. (2018) Hierarchy, Markets and Networks. London: IOE Press.
Greenhalgh, T. (2021) Miasmas, mental models and preventive public health: some philosophical reflections on science in the COVID-19 pandemic. Interface Focus, 11, 20210017.
Greenhalgh, T. and Engebretsen, E. (2022) The science-policy relationship in times of crisis: An urgent call for a pragmatist turn. Social Science & Medicine, 306, 115140.
NAO (2024) Improving educational outcomes for disadvantaged children. Session 2024-25, HC 125.
OECD (2022) Who Cares about Using Education Research in Policy and Practice? Strengthening Research Engagement. Educational Research and Innovation. Paris: OECD Publishing.
Pawson, R. and Tilley, N. (2004) Realist Evaluation.
Penuel, W.R. et al. (2020) Principles of Collaborative Education Research with Stakeholders: Toward Requirements for a New Research and Development Infrastructure. Review of Educational Research, 90(5), 627-674.
Sharples, J., Eaton, J. and Boughelaf, J. (2024) A School's Guide to Implementation. Guidance Report. EEF.
Smith, K. (2013) Beyond Evidence-Based Policy in Health: The interplay of ideas. London: Palgrave Macmillan.
Whitty, G. (2006) Education(al) research and education policy making: Is conflict inevitable? British Educational Research Journal, 32(2), 159-176.