
PISA findings and implications for UK education

08 Dec 2016

A look at the OECD’s Programme for International Student Assessment (PISA) results, which this year had a focus on science

PISA tests students in OECD and partner countries every three years, cycling its main focus through science, reading and maths – this year focusing on science. The UK has not increased its score in science, mathematics or reading since 2006.

Key findings

PISA gives students a score in each subject, which is then used to rank countries. While the UK has not seen a statistically significant change in its scores, other countries’ scores have fallen, and the UK now stands 15th in science (21st in 2012), 27th in maths (26th in 2012) and 22nd in reading (23rd in 2012), out of 70 participants. Most OECD countries’ scores have also remained unchanged. The UK figure is a composite of the four nations: England scores highest, followed by Northern Ireland, Scotland and Wales, in the same order for all three subjects. The UK currently spends 27% more than the OECD average per student.

There is mixed news in terms of diversity – girls and boys are equally likely to get the highest grades in science and equally likely to want to go into a science-based profession. However, boys are still far more confident of their ability to achieve (more so than the average across OECD countries), and while girls want to go into science, they are three times as likely as boys to want to go into health professions and half as likely to see themselves as science and engineering professionals. Students in most parts of the UK exceed the OECD average (24%) for expecting to go into science professions, particularly in Northern Ireland (33%) compared with England (30%), Wales (28%) and Scotland (23%); the UK figure overall is 29%.

The UK also ranks highly in equity based on socio-economic status – particularly in Wales, where only 6% of the variation in performance is attributable to socio-economic status (11% in England, Northern Ireland and Scotland). This puts the UK above the OECD average (12.9%) and Wales at the same level as the top-ranked PISA countries.

Interpretation

Teaching

The UK’s scores may have stayed the same, but PISA points to factors that may have held it back from achieving higher. Teacher shortage was highlighted as a particular problem, with 43% of UK head teachers reporting that it had hampered their schools’ ability to teach – far above the OECD average of 30%. The figure was only 27% in Northern Ireland and 20% in Wales, but 45% in Scotland and England. This has already been highlighted by others and suggests a need for greater focus on teacher retention and training. Singapore, which leads the PISA rankings, has been held up as an example of good education through a centralised teaching system that recruits from the top 5% of graduates and provides excellent teacher support.

Recommendations for better education that can be drawn from the report include access to and use of extra-curricular science clubs and more science teaching – something the UK already does well. The same is true of access to science equipment: principals reported that there is “enough laboratory material that all courses can regularly use it” in 91% of UK schools (66% OECD average), and extra laboratory staff who support lessons are available in 91% of UK schools – something only 34% of OECD students have access to. 72% of UK schools attend science competitions, a resource shown to significantly increase test scores. However, there is a wide disparity between advantaged and disadvantaged schools in the UK, with only 68% of disadvantaged schools attending them compared to 88% of advantaged schools.

PISA highlights that a strong indicator of high scores is students’ perception of lessons as being adapted to the class. This is reported by only 48% of UK students, which, while slightly above the OECD average (45%), suggests room for improvement. However, it is hard to see how it could be implemented under the current system, where 82% of teachers describe their workload as unmanageable. It is also noted that PISA scores increase when teachers explain ideas more frequently, another indicator on which the UK is above average (65% compared to 55%).

Though not linked to performance in the summary statistics, it is worth noting that only 35% of UK students report that whole-class discussion takes place in many or all lessons, compared to an OECD average of 50%, and that 57% of students report that a teacher demonstrates an idea in many or all lessons, compared to a 54% OECD average. PISA shows that discussion does not increase test scores and that demonstration increases scores only half as much as explanation, raising some interesting questions about PISA test performance and how it relates to educating students in ‘doing science’.

[Figure: PISA score increases in relation to teaching practice. From PISA 2015 report, part II]

Devolution

One benefit of the four nations being assessed separately is that their strategies can be compared. Wales fell in science but improved in maths, and Scotland declined in reading, while England remained the highest scorer across all three subjects. However, Wales showed less socio-economic disparity, Northern Ireland higher expectations of going into science professions, and England and Scotland the greatest concerns about teaching staff shortages. This suggests that all four nations can learn from each other to improve the quality, and equality, of education.

Conclusions

For the government, yet another year of stalling UK scores means that the previous Education Secretary Nicky Morgan’s 2015 pledge to get the UK into the top five in PISA by 2020 would require serious improvement in the UK’s performance (or decreases in performance from our international peers).

The teacher shortages highlighted by PISA, especially in England, are concerning.

One solution is for immigration policy to encourage recruitment of secondary science and maths teachers – both currently on the shortage occupation list. CaSE has also previously called for policy stability towards education so that teachers can focus on teaching rather than navigating complex system changes. This may help make teaching a profession that people want to stay in.

It is encouraging to see movement in the right direction in gender diversity in STEM subjects, but more work is needed and schools’ accountability measures in the UK should include measurements of success at A-level by gender. In addition, actively improving diversity must be considered central to development, design and promotion of qualifications and careers.

Provision of practical teaching facilities was highlighted as important to PISA scores, and, perhaps more importantly, science is by nature a practical subject, so maintaining this provision is essential.

There are many criticisms of PISA, including that, as with other highly test-based systems, if the aim is succeeding in PISA, we risk implementing policies that prepare our children to succeed at tests, potentially at the cost of developing skills needed by future researchers, such as creativity and initiative. So while PISA results are enlightening and a helpful international comparison, if they are to be used to shape education strategy, as is increasingly the case, they should be interpreted with care and considered as part of a much wider evidence base.