Tudge Fudges School Results and Funding
The Minister for Education, Alan Tudge, resorted to fudging figures to denigrate Australia’s school performance at The Age education summit last week. He held up the UK as the new benchmark for education performance, but he misrepresented its results by ignoring serious flaws in them and other evidence showing no improvement. He also fudged data on school funding and student results in Australia.
Tudge told the summit that the UK is the country we can most learn from because it has achieved a “remarkable turnaround” in its results in the OECD’s Programme of International Student Assessment (PISA) over the past decade despite cutting school funding. However, the turnaround is a statistical artefact because the increase is largely confined to 2018 and appears to be biased by very low school participation and high student exclusions from the tests.
The OECD report on PISA 2018 shows a statistically significant increase in the UK reading results of 10 points between 2009 and 2018, an increase of 9 points in mathematics and a decline of 10 points in science, which Tudge conveniently ignores. About 50% of the increase in reading and all the increase in mathematics occurred between PISA 2015 and 2018. These increases have been questioned because the UK failed to meet several of the OECD criteria for countries to be included in reporting PISA results.
The flaws are exposed by Professor John Jerrim of University College London in a paper to be published in the academic journal Review of Education. His analysis comprehensively refutes Tudge’s claim. He concludes: “There is clear evidence of an upward bias in the PISA 2018 data for England and Wales, with lower-achievers systematically excluded from the sample”.
Jerrim estimates that the combination of student exclusions, school non-response, student non-response and technical details about eligibility criteria meant that about 40% of the target UK student population did not participate in PISA 2018. This was the fourth lowest participation rate of the 79 countries participating in PISA 2018. Only Panama, the USA and Brazil had lower rates.
He shows that the PISA 2018 data for England (which accounts for 84% of the UK sample) clearly under-represents lower achieving students and over-represents higher achieving students. For example, 21% of the PISA sample were low achievers compared to 29% for the total population of the age group.
Another issue analysed by Jerrim is the school response rate. The OECD requires that 85% of sampled schools agree to take part in the study. However, both England and Northern Ireland failed to meet this standard with only 72% and 66% respectively participating. The overall rate for the UK was 73%. While there is provision to include replacement schools, the PISA technical criteria require a very high participation rate from such schools which was not met by the UK.
Even the UK Department for Education admits that the UK “did not fully meet the PISA 2018 participation requirement” [p. 188] because of the high school non-response. However, the OECD waved this through and agreed that the UK data should be included as fully comparable to other countries. Jerrim says the OECD has a very weak adjudication process to decide whether to include a country’s data in the PISA results.
Jerrim also shows that the UK had a high rate of student exclusion from the tests. Students can be excluded from the tests in various ways. Schools may decide not to test some students included in the sample, others may be declared ineligible because they moved school between the time the sample was designed and the time the test was implemented, parents may not consent for students to participate and some students in the sample may be absent on test day.
The OECD technical standards state that within-school exclusions should total less than 2.5% of the target population and that the combination of school-level and within-school exclusions should not exceed 5% of the target population. The UK failed to meet these standards. The within-school exclusion rate was 3.3% and the total exclusion rate was 5.5%. Jerrim notes that a strict application of PISA’s data quality criteria would have led the UK to be removed from the study as it was from PISA 2003 for similar breaches.
These exclusion rates are much higher than for many other countries participating in PISA 2018. The average within-school rate was 1.4% and the total exclusion rate was 3%. The total exclusion rates in Japan and South Korea were 0.1%. Such differences are likely to bias cross-country comparisons of PISA performance.
The overall high non-participation in the UK has clear potential to bias its PISA 2018 results. It creates large uncertainty about the results. This is likely to affect the reliability of comparisons to other countries and the extent to which results have changed over time.
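As a rough illustration of how these sources of non-participation compound, the 5.5% exclusion rate and 73% school response rate cited above, together with an assumed student response rate of about 87% (an illustrative assumption, not a figure from Jerrim’s paper, and one that ignores the OECD’s replacement-school and eligibility adjustments), imply coverage of roughly:

\[
(1 - 0.055) \times 0.73 \times 0.87 \approx 0.60,
\]

that is, about 40% of the target population not taking part, broadly consistent with Jerrim’s estimate.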
Jerrim estimates that the average PISA scores in England and Wales were inflated by the high non-participation rate in PISA 2018. The average PISA mathematics score for England was 504 points – significantly above the average across OECD countries. He estimates that, had a truly representative sample of the population taken the tests, England’s score would have been about 494. The report by the UK Department for Education on PISA 2018 shows this is almost the same score achieved in every PISA test from 2006 to 2015 (p. 96).
This is not the “remarkable” turnaround claimed by Tudge. Jerrim told Save Our Schools “it seems a strange thing to point to the UK as having a stunning turnaround. The results in PISA have been pretty flat, other than a relatively small uptick in maths in 2018”.
Jerrim also said that focussing on PISA was “pretty selective” because the UK Year 9 mathematics results in the Trends in International Mathematics and Science Study (TIMSS) did not increase between 2015 and 2019. Indeed, the UK Department for Education report on TIMSS 2019 shows that the UK mathematics results have stagnated since 2007 [p. 9] and there was a large decline in its science results [p. 13].
All this demonstrates that Tudge cherry-picked figures to suit his case. He ignored evidence that the increases in the UK PISA results are highly doubtful and he ignored other evidence that shows no improvement in UK results. He completely fudged his case.
Jerrim’s paper raises serious questions about the OECD adjudication process for deciding whether a country’s results should be accepted in a PISA cycle. He says it should be more transparent.
… the OECD needs to reconsider its technical standards, the strictness with which these are applied, and its data adjudication processes … the processes currently in place flatter to deceive and are nowhere near robust enough to support the OECD’s claims that PISA provides truly representative and cross-nationally comparable data.
His analysis of the dodgy PISA results in the UK raises the broader issue of the validity of international comparisons when there are so many loopholes for countries to rig their results. He states:
There remain many ways for countries to not test pupils who are technically part of the target population, with lower-achievers disproportionately likely to be removed from the sample.
Apart from selectively using the dodgy UK PISA results, Tudge also fudged school funding data in Australia in claiming that school funding per student, adjusted for inflation, increased by 60% since 2000. This is far from the truth. After adjusting for flaws in data from the Report on Government Services, we estimate the actual increase for all schools from 2001-02 to 2018-19 was only 19%, that is, an average increase of just over 1% per year. The increase for private schools was over double that for public schools. Government funding per student in private schools increased by 34% compared to only 15% for public schools. If school funding is failing to deliver better results as Tudge claims, this is mainly because money is being wasted on more privileged private schools instead of helping schools to overcome disadvantage in education.
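As a simple check of the “just over 1% per year” figure, 19% growth compounded over the 17 years from 2001-02 to 2018-19 implies an average annual increase of

\[
(1.19)^{1/17} - 1 \approx 0.010,
\]

or about 1% per year.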
Tudge also fudged Australia’s school results by highlighting the decline in PISA results for 15-year-old students and ignoring improving results in Year 12. The decline in PISA results is questionable because student motivation and effort are likely to be a factor in it. In contrast to Year 12 assessment, the PISA tests have no consequences for students, who don’t even get their results.
The OECD says that 73% of Australian students participating in PISA 2018 did not fully try in the tests. While there is no direct evidence of an increasing proportion of students not fully trying in the PISA tests over time, there is indirect evidence.
PISA data show that student dissatisfaction at school amongst 15-year-olds in Australia increased fourfold from 8% to 32% between PISA 2003 and 2018. This large increase in student dissatisfaction may have led to lower motivation and effort in PISA over time. The OECD says that the relationship between a feeling of belonging at school and performance in PISA is strong for students with the least sense of belonging [OECD 2016, p. 122]. Students who feel they do not belong at school have significantly lower levels of achievement in PISA than those who do feel they belong.
Tudge’s claim of declining school results is contradicted by other more significant data. The percentage of the estimated Year 12 population that completed Year 12 increased from 68% in 2001 to 79% in 2018, although there is an unexplained drop-off in 2019 [Report on Government Services 2007 & 2021]. The proportion of 20 to 24-year-olds who attained a Year 12 Certificate or equivalent increased from 79% in 2001 to 89% in 2019 [ABS, Education and Work, 2011 & 2020].
OECD data also shows that Australia had one of the larger increases in the OECD in the proportion of 25-34 year-olds who attained at least an upper secondary education. It increased by 19 percentage points from 71% in 2001 to 90% in 2019 [Education at a Glance 2002 & 2020].
These are indicators of an improving education system, not a deteriorating one. They are clearly inconvenient for Tudge, so he ignores them and relies solely on questionable figures that misrepresent Australia’s education performance.
Tudge’s fudges are designed to deny public schools the funding increases needed to ensure all students receive an adequate education and to improve equity in education. Instead, the Morrison Government has provided billions of dollars in special deals for private schools and conspired with state governments through bilateral funding agreements to continue to under-fund public schools.
No more fudges, Mr Tudge. Your fundamental task as the Commonwealth Minister for Education is to better support public schools and disadvantaged students to deliver improvements in equity in education.
9 May 2021
Trevor Cobbold
National Convenor
SOS - Fighting for Equity in Education