

Great expectations dashed? Evidence, practice and implementation in education

Jim Hordern, University of Bath

This blog is a reflection on the recent ‘special section’ in Research Intelligence issue 144 entitled ‘How we can improve the use of research evidence (in practice)’, edited by Stephen Gorard. I argue here that the relationship between evidence and practice needs to be reconceived so that we can rethink ‘implementation’ in education.

BERA members invest considerable energies in trying to solve educational problems, often basing their research designs on assumptions about the primacy of specific educational purposes, such as effective teaching and learning (and how these can be measured). However, such assumptions about educational purposes and the nature of educational practice often run into unexpected barriers when evidence-led interventions are implemented. Attempting to change educational practice without respecting its distinctive relational character – and the hold this has on the public and professional mind – results in confusion. Researchers become frustrated by the responses of the ‘practitioners’, who may draw inconsistently on the endless summaries of ‘what the evidence says’ (Gorard, 2020, p. 17) to meet seemingly inexplicable ends.

‘Attempting to change educational practice without respecting its distinctive relational character – and the hold this has on the public and professional mind – results in confusion.’

But how do conceptions of practice affect change? Foray and Hargreaves (2003) suggest that educational innovation is ‘very slow’ (p. 11) with ‘weak spillovers’ and a ‘humanistic mode’ (p. 10) that has been resistant to the ‘epistemic culture of science’ (p. 15). While Foray and Hargreaves may have bemoaned this, there may well be very good reasons for it. We could argue that educational practice, with its relational pedagogic dynamic and moral sensibility, is fundamental to social organisation. It has outcomes that are multiple, unpredictable and often impossible to measure. There are plenty of educational goods which the public can grasp instinctively and are reluctant to see compromised. Such goods might include considering how to be human in a changing world, the ‘mystery of personal life’ (Buber, 1965 in Noddings, 2003, p. 244) and the dynamics of interpersonal relations, as much as engagement with subject knowledge and preparing for assessment, important as these may well be. It doesn’t seem unreasonable to assert that young people need space and time to grow and form identities, to discover what they enjoy in life and to learn to get along with others; a degree of stability may be helpful for this purpose. Most people would recognise that teachers have the important job of facilitating this process. If the institutional environments that are supposed to provide this are constantly battered by pressures to change in the name of the latest evidence, then some ‘public value’ is put at risk. Perhaps education needs to be a slow practice for its goods to flourish fully?

Those interested in enacting change in education might find Pressman and Wildavsky’s 1973 book Implementation useful. The book, which stimulated a rich tradition of policy implementation studies, was subtitled ‘How great expectations in Washington are dashed in Oakland’ and highlighted the problems that emerge if implementation is ‘conceived as a process that takes place after, and independent of, the design of policy’ (Pressman & Wildavsky, 1973, p. 143). In their studies of the implementation of public policy in the United States they found unexpected and unintended consequences everywhere. The implication for educational researchers is that practitioners are unlikely to implement recommendations unquestioningly ‘with fidelity’ (Wiliam, 2020, p. 31). In the case of educational practice there is a distinctive frame of reference available from which judgements about evidence can be made (Hordern, 2020), and this may lead to decisions that seem incomprehensible to those researchers with convictions about the merits of their interventions. An excessive focus on implementing evidence-based policy for assessment, for example, may be judged as crowding out the time and space for other, more nebulous but equally valuable educational purposes.

I don’t doubt that the goods of educational practice can be subverted through new interventions directed at educational institutions and the education workforce, causing considerable damage to young people and our societies in the process. Ideas imported from Taylorist management theory and neoclassical economics are, unfortunately, increasingly rife in educational circles, a consequence perhaps of a fragmented educational knowledge base which is not well served by the ‘New Science of Education’ (Whitty & Furlong, 2017, p. 28). While teachers would no doubt benefit from a better understanding of the methodologies used in the production of evidence, it seems just as important to be able to interrogate assumptions made about educational practice itself.

This blog is based in part on the article ‘Why close to practice is not enough: Neglecting practice in educational research’ by Jim Hordern, published in the British Educational Research Journal. It has been made free to view for a limited period, courtesy of our publisher, Wiley.

References
Foray, D., & Hargreaves, D. (2003). The production of knowledge in different sectors: A model and some hypotheses. London Review of Education, 1(1), 7–19.

Gorard, S. (2020). Editorial: The story so far about research evidence. Research Intelligence, 144, 17–18.

Hordern, J. (2020). Why close to practice is not enough: Neglecting practice in educational research. British Educational Research Journal.

Noddings, N. (2003). Is teaching a practice? Journal of Philosophy of Education, 37(2), 241–251.

Pressman, J., & Wildavsky, A. (1973). Implementation. Berkeley: University of California Press.

Whitty, G., & Furlong, J. (2017). Knowledge and the study of education: An international exploration. Didcot: Symposium.

Wiliam, D. (2020). Three concrete measures to make educational research more useful. Research Intelligence, 144, 30–32.