PISA results don’t look good, but before we panic let’s look at what we can learn from the latest test
The 2015 Program for International Student Assessment (PISA) results have been released – and at first glance, it does not look good for Australia.
On global comparisons, Australia performed equal 10th in science (down from 8th in 2012), 20th in maths (down from 17th) and 12th in reading (down from 10th).
There has been a steady decline in results since 2000, both in (admittedly crude) international rankings and in absolute mean scores.
No doubt, similar to the response to Trends in International Mathematics and Science Study (TIMSS) results last week, the media, politicians and education commentators will go into a panic over Australia sliding down the international rankings, falling standards in classrooms, and poor quality teachers.
Beyond the panicked headlines, what can we actually learn from the results of an international test that compares 15-year-olds’ science, maths and reading skills across 72 countries and economies?
We need to have a more considered discussion about these test results rather than leaping to quick conclusions about a failing education system.
The main domain of the 2015 test was scientific literacy – the application of scientific knowledge and skills to solve problems.
Australia’s average score was 510, significantly above the OECD average of 493.
However, there has been an overall decline of 17 points since 2006. With the exception of the Northern Territory and Tasmania, all states and territories performed well above the OECD average.
Singapore achieved the highest score of 556, which equates to roughly one and a half years more schooling than Australia.
While 61% of Australian students achieved the National Proficient Standard, just 11% were high performers (above the OECD average of 8%) and 18% were low performers (below the OECD average of 21%). This suggests that the majority of students might be doing okay, but few are excelling.
The OECD average for mathematical literacy was 490, with Australia achieving 494. This is significantly below 19 other countries, including Singapore at 564 points – a gap equivalent to about two and a half years of schooling.
The proportion of students reaching the National Proficient Standard in mathematical literacy was 61% in the Australian Capital Territory, but only 44% in Tasmania.
Singapore achieved the highest result of 535 in reading literacy, equating to about one year more of schooling than Australia’s score of 503. The OECD average was 493.
Once again, the Northern Territory and Tasmania performed significantly below the OECD average, while all other states scored well above it. The spread between the lowest and highest Australian performers was also significantly wider than the OECD average.
What the results mean
It is unhelpful to use the single country ranking to determine how we are going as there are significant variances between states/territories and school sectors (government, independent, Catholic).
Instead, we need to carefully disaggregate the data and consider the social and economic factors that influence performance across states, between schools, as well as the correlations between gender, Indigeneity, class, race, geographical location, and so on.
Australia has one of the widest ranges of student achievement, with what can be described as a long tail of underachievement.
For example, the difference in performance of students from the Australian Capital Territory and those in Tasmania and the Northern Territory is worth considering.
These differences are similar to those evident in performance on National Assessment Program – Literacy and Numeracy (NAPLAN).
Furthermore, there is a difference of nearly three years of schooling between students in the highest socioeconomic quartile and the lowest, with similar differences when comparing Indigenous with non-Indigenous students.
Interestingly, boys only marginally outperformed girls in scientific literacy, girls significantly outperformed boys in reading literacy, and there was no real difference in mathematical literacy.
It is also interesting to note that since the PISA tests began in 2000, the major federal education policy levers have included:
- Significantly increased federal funding to private schools under John Howard, followed by a commitment by Julia Gillard that no school would lose a dollar.
- Failure to implement Gonski’s needs-based funding of all schools.
- The introduction of NAPLAN, MySchool, the Australian Curriculum and the AITSL national teaching standards by the Rudd-Gillard governments.
- Increased emphasis on market measures for school provision, such as Independent Public Schools and school autonomy.
Yet over this time, the narrative of steady decline on PISA and TIMSS results continues, while educational inequality is on the rise.
Australia has one of the most segregated schooling systems in the world, and the OECD data show a strong correlation between high performance and social cohesion and equity in systems such as Singapore’s.
Secondary analysis of all PISA data over time further shows a strong correlation between equitable funding of schools and systemic performance on PISA.
If we want to address these sliding results then we must address the issue of educational inequality in Australia.
Social efficiency and social equity
There is a tension between the agendas of social efficiency and social equity, evident in how PISA results inform global and local education policy-making, including the emphasis on competing within a global knowledge economy.
It is worth noting how the economic rationalisation for greater educational equity plays out in the global policy field, particularly through testing regimes such as NAPLAN and PISA.
The challenge for policymakers, schools and teachers is how to respond to increasing pressure to lift test results on PISA, TIMSS and NAPLAN, while also addressing systemic inequality in order to ensure that every Australian student is given access to a meaningful education.
Equitable funding of schools, including redistribution to schools serving disadvantaged communities, remains a pressing policy issue in Australia.
However, it is unlikely that we will see much more than panic and moral crusading in the media commentary over the coming days.
Once the hyperventilating dies down, we need to take a long, careful look at these results and what they mean for a more equitable and high-performing Australian schooling system.