
Blog post

DfE and NFER on baseline testing: Do they know what they do?

Gemma Moss, Harvey Goldstein, Pam Sammons, Gwen Sinnott and Gordon Stobart

On 4 July 2018, the British Educational Research Association (BERA) expert panel on assessment published the report A baseline without basis (Goldstein et al 2018), in response to the Department for Education’s (DfE’s) plans to carry out baseline testing of all children on entry to reception classes in England. The panel considered whether the evidence from the assessment literature could justify such a test being used for accountability purposes – and concluded that it could not.

 

‘…[T]he government’s proposals, which will cost upward of £10 million, are flawed, unjustified, and wholly unfit for purpose.’

Goldstein et al 2018: 30

Since then, the DfE’s chosen supplier – the National Foundation for Educational Research (NFER) – has published further material on how it intends to implement baseline testing, but without addressing any of the key issues raised by BERA’s expert panel (NFER 2018). Likewise, in answer to a parliamentary question that directly raised the panel’s concerns, the DfE responded in only the most general terms. No assurances have been given about how the most substantive flaws that the expert report highlighted will be tackled (TheyWorkForYou 2018).

In this blog we address these responses in turn, drawing attention to some of the key issues that they are in effect ignoring.

The NFER document is entitled The Reception Baseline Assessment. While asserting that NFER is an evidence-informed organisation, the document fails to address the key threats to the validity of using baseline tests as a value-added (‘cohort’) measure at the end of primary schooling. BERA’s expert panel, in its critique, pointed out that almost all the available evidence indicates that such baseline tests are entirely unfit for just that purpose. If NFER is primarily concerned with advancing knowledge and practice on the basis of careful research, then it will need to recognise the limitations on how well schools can, under the current proposals, be held accountable for the attainment of their pupils. In its document, NFER chooses instead to present a reassuring picture of how well the exercise is supposed to work. Of course, it is possible that NFER, rather than simply refusing to acknowledge the limitations surrounding baseline testing, is actually unaware of them. That would be cause for even greater concern.

The DfE document, which is a direct response to the report by BERA’s expert panel, makes for depressing reading. It reiterates the mantra that the work to be undertaken ‘will be informed by an extensive evidence base of research’, but actually fails to respond to any of the points raised in Goldstein et al’s (2018) report – particularly those concerning reliability and validity and, crucially, the weak school-level prediction of key stage 2 (KS2) outcomes from baseline scores. The DfE seeks to justify its policy by noting that the ‘2017 public consultation on the future of the primary assessment system in England… drew support from a majority of respondents’. In other words, it seems to believe that carrying out a survey and taking the majority view trumps any need to respond rationally to an objective, evidence-based critique. Science cannot be equated with public acclaim, and in this case acclaim should not be used to avoid addressing the substantive weaknesses in the baseline design.

We continue to assert, in light of the evidence, that:

  • the baseline testing proposals are flawed
  • our critiques have not thus far been addressed in any public response
  • this episode reveals a dangerous flaw in the way in which public discourse about social science evidence has come to be handled by many policymakers.

Whereas in the past it was generally recognised that rational debate about social policy, using the best available evidence, was necessary, that debate has increasingly been replaced by ideology-based assertion and an unwillingness to listen to alternative views – even where doing so could prevent embarrassing policy U-turns at a later date. As appears to be the case with baseline testing, once a path has been set out, many policymakers are unwilling to engage in any further argument, even if the path chosen is a poor one. Those with knowledge and understanding are marginalised and often simply ignored. Such tactics are, of course, very effective in driving forward chosen policy directions, but one cannot help remarking upon the irony that those responsible for shaping education themselves betray attitudes inimical to learning and expertise.

It would be sad if an organisation such as NFER – which in the past was often at the forefront of educational innovation and research – were seen to align itself with such attitudes. Unlike policymakers, NFER does (or should) possess the expertise necessary to discuss and explain the pros and cons of its research activities.

Perhaps we can look forward to a detailed response from NFER to the panel’s critique? And, furthermore, to an acknowledgement from the DfE that, in the interests of good government, it needs to take another look at its plans?


References

Goldstein H, Moss G, Sammons P, Sinnott G and Stobart G (2018) A baseline without basis: The validity and utility of the proposed reception baseline assessment in England, London: British Educational Research Association. https://www.bera.ac.uk/bera-in-the-news/a-baseline-without-basis-the-validity-and-utility-of-the-proposed-reception-baseline-assessment-in-england

National Foundation for Educational Research [NFER] (2018) ‘The Reception Baseline Assessment’, Slough: NFER. https://www.nfer.ac.uk/media/2837/the-reception-baseline-assessment.pdf

TheyWorkForYou (2018) ‘Education: Assessments: Department for Education written question – answered on 12th July 2018’, HC Deb, 12 July 2018, cW. https://www.theyworkforyou.com/wrans/?id=2018-07-09.162116.h

 
