DfE and NFER on baseline testing: do they know what they are doing?
On 4 July, the British Educational Research Association (BERA) expert panel on assessment published the report “A baseline without basis” in response to the Department for Education’s (DfE) plans to carry out baseline testing of all children on entry to reception classes in England (https://www.bera.ac.uk/bera-in-the-news/a-baseline-without-basis-the-validity-and-utility-of-the-proposed-reception-baseline-assessment-in-england). The panel considered whether the evidence from the assessment literature could justify such a test being used for accountability purposes – and concluded that it could not:
“the government’s proposals, which will cost upward of £10 million, are flawed, unjustified, and wholly unfit for purpose”.
Since then the DfE’s chosen supplier, the National Foundation for Educational Research (NFER), has published further material on how it intends to implement baseline testing, but without addressing any of the key issues raised by the expert panel (https://www.nfer.ac.uk/media/2837/the-reception-baseline-assessment.pdf). Likewise, in answer to a Parliamentary Question directly raising the panel’s concerns (https://www.theyworkforyou.com/wrans/?id=2018-07-09.162116.h&s=Education+Assessments#g162116.r0), the DfE has responded only in the most general terms. No assurances have been given about how the most substantive flaws highlighted in the expert report will be tackled.
We take these responses in turn, drawing attention to some of the key issues they are ignoring.
1. The NFER document is titled ‘Reception Baseline Assessment’. While asserting that the NFER is an evidence-informed organisation, the document fails to address the key threats to the validity of using baseline tests as a value-added (“cohort”) measure at the end of primary schooling. The panel’s critique pointed out that almost all the available evidence indicates such baseline tests are unfit for precisely that purpose. If NFER is primarily concerned to advance knowledge and practice on the basis of careful research, it will need to recognise the limits on how well schools can be held accountable, under the current proposals, for the attainment of their pupils. Instead, the document presents a reassuring picture of how well the exercise is supposed to work. It is possible, of course, that the NFER is not refusing to acknowledge the limitations of baseline testing but is actually unaware of them – in which case we would be even more concerned.
2. The DfE document, which is a direct response to the panel’s report, makes for depressing reading. It reiterates the mantra that the work to be undertaken “will be informed by an extensive evidence base of research”, yet fails entirely to respond to any of the points raised in the panel’s report, especially those concerning reliability and validity and, crucially, the low school-level predictability from baseline to KS2. The DfE seeks to justify its policy by noting that the “2017 public consultation on the future of the primary assessment system in England … drew support from a majority of respondents.” In other words, it appears to believe that carrying out a survey and taking the majority view trumps any need to respond rationally to an objective, evidence-based critique. Science cannot be equated with public acclaim!
We continue to assert, in the light of the evidence, that the baseline testing proposals are flawed; that our critiques have not been addressed in any public response so far; and that this episode reveals a dangerous flaw in how many policymakers now handle public discourse about social science evidence.
Whereas in the past it was generally recognised that rational debate about social policy should draw on the best available evidence, such debate has often been replaced by ideologically based assertion and an unwillingness to listen to alternative views, even when doing so could prevent embarrassing changes of policy later. As with baseline testing, once a path has been set, many policymakers appear unwilling to engage in further argument, even if the path chosen is a poor one. Those with knowledge and understanding are marginalised and often simply ignored. Such tactics are, of course, very effective in driving forward chosen policy directions, but one cannot help remarking on the irony that those responsible for shaping education themselves betray attitudes inimical to learning and expertise.
It would be sad if an organisation such as NFER, itself often at the forefront of educational innovation and research in the past, were seen to ally itself with such policy attitudes. Unlike policymakers, NFER does, or should, possess the expertise to discuss and explain the pros and cons of its research activities. Perhaps we can look forward to a detailed response from NFER to the panel’s critique? As for the DfE, they should be ashamed of themselves!
HG, July 17, 2018