Leaning tower of PISA – 7 serious skews
Like the real Leaning Tower of Pisa, the OECD PISA results are built on flimsy foundations and are seriously skewed. Nevertheless, they have become a major international attraction for educators, and have sparked off an annual educational ‘international arms race’.
In the UK, Gove’s education policy is rooted in, and firmly targeted at, the PISA results. The English Baccalaureate has PISA written all over it. PISA is also used as a political football by D-list celebrities and wannabes, like Toby Young and Katherine Birbalsingh, to beat the state system over the head. They stare at the leaning tower and swear blind that it’s straight.
Both left and right now use the ‘Sputnik’ myth, translated into the ‘Chinese competitiveness’ myth, to chase their own agendas – more state funding or more privatisation. This is a shame, as the last thing we need is yet another dysfunctional, deficit debate in education. Nations have different approaches to education, different demographic and social mixes and different agendas.
The problems in the data are extreme, as PISA compares apples and oranges. In fact, it compares huge watermelons with tiny seeds. PISA is seriously flawed because of the huge differences in demographics, socio-economic ranges and linguistic diversity within the tested nations. Here are seven skews as a starter:
Single city skew
Shanghai topped the table this year, but why was one city from an entire nation singled out? Could it be that Shanghai is China’s flagship city? To compare a set of students from the richest city in China (with, on average, three times the national income), one that has attracted a high proportion of high-achievers, with a similar sample drawn from across the whole of the US is odd. It smacks of political manipulation by the Chinese. There was also evidence, presented in the New York Times, of student priming. Imagine if the cohort had been drawn from rural China. This comparative method will fail, as there will be lots of outliers in the data. It is not surprising that small, homogeneous cities and states pop up at both the top and bottom of the table.
Immigrant skew
Different tested cohorts have different immigrant ratios. The difficulties that immigrants have with language, social adjustment, school and poverty seriously pollute the data. As one would expect, Finland and Shanghai have very small numbers of immigrants in their tested cohorts. It is bonkers to compare cohorts with radically different numbers of immigrant children.
Selective immigration skew
There is another odd skew associated with immigration: in some countries immigration is controlled, so that only wealthy or smart kids get through. There is, for example, only one country in the PISA results where immigrant students outscore the natives – Australia, where immigration is highly selective. There’s a huge difference between refugee immigration and cherry-picked immigration.
Linguistic skew
Associated with immigration is a curious linguistic skew – the tendency for smart immigrants to migrate towards English-speaking nations. This could mean that English-speaking nations benefit in the long term from such immigration but show poor short-term results, due to high first-generation immigration with associated language problems at school.
Reading skew
On reading, languages with regular structures are likely to do better than languages that are more irregular. The tests may favour languages with simplified spelling structures, such as Finnish. Reading data may also be skewed by reading habits, as PISA doesn’t recognise reading on screens. It’s big on books.
Subject skew
PISA measures academic subjects only, namely maths, reading and science. To be fair, PISA has recognised this flaw and is now embarking on a correction process. But is it right to judge education in these subjects only? One need only focus the curriculum heavily on these subjects to do well, which is exactly what many countries do. Dump sport, music, the arts and the humanities and you can produce stellar results.
Subject focus skew
Simple differences in taught curricula can also affect the results. In maths, for example, if you had taught series, you will have done well in the 2009 results, as a major set of test questions focused on extrapolating series. If series are not part of your curriculum, you will score badly.
I’m not alone in these concerns. Ofqual published a progress report (International Comparisons in Senior Secondary Assessment) in February 2011 making similar points.
They listed several major criticisms:
- differences between countries’ performance are not that large…usually statistically insignificant
- whether or not a country has moved up or down the league tables is not that meaningful partly because the absolute differences in scores between countries are not that great
- the constituent group of comparators changes from study to study and from year to year
They point to three major but dangerous assumptions that:
- items tested for are somehow an objective measure of what is best
- learners undertaking the study are a balanced representation of all learners at that stage of education
- learners sampled in each country are equally motivated to perform well in the tests
Additionally, these snapshot studies do not isolate variables and may well be skewed by “factors in the past that no longer apply”, such as “learner performance in an examination may be the result of curriculum developments undertaken” or “investment in education infrastructure some time in the past”. In other words, using the data to praise or blame the current system is unwise.
The great danger is that the world skews its curricula to fit PISA expectations, just as PISA itself moves away from its own curriculum-tested areas. This has already happened in the UK with Gove’s EBacc.
Gove has specified A*-C passes in five subject areas: English; maths; two sciences; ancient or modern history or geography; and a modern or ancient language. It has all the hallmarks of a PISA-led curriculum.
Devil’s in the detail
Politicians and activists distort PISA to meet their own ends. They cherry-pick results and recommendations, ignoring the detail. Finland is famously quoted by the right as a high-performing PISA country. Yet it is a small, homogeneous country with no streaming, high levels of vocational education, no substantial class divisions and no private schools – facts curiously ignored by PISA supporters.
One could quibble with the details of my analysis, but I’m convinced the PISA comparisons are riddled with skews and errors, many more than indicated above. The great danger, and it is already happening, is that people read causality into the data: it’s crap schools, crap teachers, money spent doesn’t matter, and so on. The scope for false causality is enormous, and it is exploited by politicians for their own ends.
“In the last 10 years we’ve plummeted in the PISA rankings.” We have heard this before from Michael Gove – he lied. UK results were excluded in 2000 (low response rate) and in 2003, as the data was dodgy. They were only gathered in 2006 and 2009. The PISA tests are not that important, but national test results have gone up – so what’s happening?
Late news: Is PISA data fatally flawed?
The Danish statistician Svend Kreiner says PISA is not reliable at all. In the reading tests, 28 questions were supposed to be equally difficult in every country. PISA has failed here, as differential item functioning (DIF) – items having different degrees of difficulty in different countries – is common. In fact, he couldn’t find a single item that worked without bias; items are more difficult in some countries than in others. He used his analysis to show that the UK could move up to 8th or down to 36th. PISA assumes that DIF has been eliminated, yet not one single item behaves the same across the 56 countries.
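For readers unfamiliar with DIF, a toy sketch may help. The Python below uses invented data (not PISA responses, and a much cruder check than Kreiner's Rasch-model analysis): it compares how often students in two hypothetical countries answer one focal item correctly, after matching students on their total score across the other items. If students of equal ability succeed at different rates on the same item, that item functions differently across countries – which is exactly the bias Kreiner describes.

```python
# Toy illustration of differential item functioning (DIF).
# All data here is invented; this is a simplified matched-ability check,
# not PISA's actual Rasch-based methodology.

from collections import defaultdict

# Each record: (country, total score on the other items, correct on focal item)
records = [
    # Country A
    ("A", 1, 0), ("A", 1, 0), ("A", 2, 1), ("A", 2, 0),
    ("A", 3, 1), ("A", 3, 1), ("A", 4, 1), ("A", 4, 1),
    # Country B: same ability strata, but the focal item proves harder
    ("B", 1, 0), ("B", 1, 0), ("B", 2, 0), ("B", 2, 0),
    ("B", 3, 0), ("B", 3, 1), ("B", 4, 1), ("B", 4, 0),
]

def stratum_rates(records):
    """Proportion correct on the focal item, per (country, ability stratum)."""
    counts = defaultdict(lambda: [0, 0])  # (country, stratum) -> [correct, total]
    for country, stratum, correct in records:
        counts[(country, stratum)][0] += correct
        counts[(country, stratum)][1] += 1
    return {key: c / n for key, (c, n) in counts.items()}

rates = stratum_rates(records)

# DIF signal: average gap in proportion-correct between the two countries,
# comparing only students of matched ability (same stratum).
strata = sorted({s for _, s, _ in records})
gaps = [rates[("A", s)] - rates[("B", s)] for s in strata]
avg_gap = sum(gaps) / len(gaps)
print(f"average matched-ability gap: {avg_gap:.3f}")  # 0.375 on this toy data
```

A gap near zero would mean the item is equally difficult for equally able students in both countries; here the sizeable gap flags the item as biased, and Kreiner's point is that every PISA reading item showed some such bias.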
The OECD does not compare results over the 10 years. Performance has not fallen; if anything, it’s flat.