The OECD released the PISA 2015 Technical Report chapter on data adjudication at the end of last month, and it found that Malaysia’s replacement schools performed better than the non-responding, initially selected schools that they replaced.
I wrote a piece about that, and about Malaysia’s participation in PISA 2015 more generally, for The Malaysian Insight—and I promise that it has less jargon than the preceding sentence. Read it here. Alternatively, here’s a version with the citation links:
Did the Education Ministry influence our PISA 2015 results?
by Hwa Yue-Yi
Like many other Malaysians, I want our country to have a good education system. So when the results of the Programme for International Student Assessment (PISA) 2015 were released last December, I was curious to see whether our 15-year-olds had developed more skills in reading, maths, and science than their peers in previous PISA rounds.
But, like many other Malaysians, I was disappointed. Not because of our average PISA scores – which had gone up – but because we weren’t included among other countries in the main database of PISA results. Apparently, we had been excluded because only half of the schools that had originally been chosen to take part in PISA 2015 had actually taken the test.
Within each PISA country, schools are randomly chosen to participate in the assessment, after considering certain school characteristics, such as the size of each school and whether it is urban or rural. The goal is to get a balanced, accurate picture of student learning across the whole country. Each selected school is also paired with a backup school. If one of the originally selected schools doesn’t want to take part, the backup school will be asked to participate instead. But when many originally selected schools drop out of PISA, it’s hard to tell if the results represent the country accurately.
In Malaysia, after including the backup schools, our weighted PISA 2015 response rate increased from 51% to 98%. However, at the end of last month, the OECD released an official report stating that Malaysia’s backup schools “had a significantly better result, on a national examination, than the non-responding schools in the original sample”.
So did the Ministry of Education strategically ask higher-performing backup schools to take the test, so that our PISA 2015 results would look better?
Looking at the evidence
To answer that question, let’s start with what the Ministry has said. Shortly after the results came out, the Ministry promised to release a report with full details on why we were excluded from the main PISA database.
Eight months later, no report has been released. However, in a March 2017 parliamentary reply to Tony Pua, the Ministry said that the 51% initial response rate was due to PISA 2015 being conducted using computers (unlike previous PISA rounds, which used paper-and-pencil tests). Because of this, some students were unfamiliar with computer-based tests and didn’t record their answers properly, and there were technical issues with data loss.
But the evidence suggests that computers probably weren’t to blame. And it also doesn’t seem likely that our low initial response rate was just a coincidence.
- The previous two times we participated in PISA, our weighted initial response rates were above 99%. This is also true of all five times we have participated in the TIMSS international assessments – including TIMSS 2015, which took place just six months before PISA 2015. Moreover, principals and teachers in Malaysia, as civil servants, usually obey government directives. It would be very surprising if half of the Malaysian school principals who were told to administer PISA simply refused.
- In PISA 2015, the Netherlands also had an initial response rate in the “unacceptable” range – although their 63% was higher than our 51%. But they were included in the main PISA database because national exam data showed that their PISA results probably weren’t biased. In contrast, the data submitted by our government showed that our PISA 2015 backup schools had higher national exam scores than expected, so our results may have been biased.
- On average, it’s likely – though unfair – that schools with better exam results also have better computer equipment, and that students in these schools are more familiar with computer-based tests. But it’s a lot less likely that (a) almost half of our initially selected schools faced computer-related difficulties in conducting PISA; but (b) such computer-related problems affected almost none of the backup schools, which had been selected using the same randomised process as the initially selected schools.
- The Ministry invested a lot of time and money to prepare for PISA – certainly enough to detect and solve computer issues. The PISA test was held in April 2015, but the Ministry had formed a committee on TIMSS and PISA by December 2013. Mock PISA tests were held as early as May 2014. In March 2015, the Ministry reported that students had been given PISA-style exercises to familiarise themselves with the test, and that teachers had been trained to conduct the computer-based test. In the weeks leading up to the test, students attended PISA training camps in hotels. Also, the OECD provided each country with a diagnostic programme several months in advance, so that PISA administrators could check if each participating school had adequate computers.
- PISA 2015 did not require fancy new computers – Windows XP was enough. Also, the test was delivered using USB sticks, so it did not need an internet connection.
All of this casts some doubt on what the Ministry has said about the problems with our PISA 2015 participation.
But why might the Education Ministry want to influence our PISA results?
Since our subpar TIMSS 2011 and PISA 2012 results were revealed, the government has been under tremendous pressure to improve our performance in international student assessments. As a result, the Malaysia Education Blueprint 2013–2025 uses these exams as the benchmark of educational quality: the aspiration is to be among the top third of countries participating in PISA and TIMSS.
But this might not be the best benchmark for Malaysian schools. For one thing, our ranking in PISA and TIMSS depends on which other countries decide to participate that year. If, for whatever reason, a lot of high-performing countries decide to drop out of TIMSS 2019, our relative ranking could rise, even if our average score doesn’t change.
If we want to benchmark our education system against international assessments, it would make more sense to use PISA proficiency levels, which are consistent from year to year. For example, we could say that, in PISA 2021, we want 80% of Malaysian 15-year-olds to reach at least Proficiency Level 3 for science, which means that they can identify evidence supporting a scientific claim and construct explanations in complex situations.
Moving beyond exam obsessions
Most Malaysians would agree that our education system is too exam-oriented. To its credit, the Education Ministry has been trying to address this, by introducing coursework elements to the PT3 and the STPM, and by initiating consultations on whether the UPSR should be abolished.
But in emphasising our PISA and TIMSS rankings, we are choosing to worry about even more exams.
This is especially sad because PISA and TIMSS are not meant to be that type of exam. They are not the sort of test where you must do as well as you can because it will affect your future. PISA and TIMSS give national-level results, not results for individual students or schools. And it’s hard to think of any ways in which our national future would be directly affected by these results. It’s unlikely that multinational companies would use PISA rankings to decide whether to invest in Malaysia – they might look at graduate skill levels instead. Some Malaysians may migrate to other countries so that their children can attend better schools, but our TIMSS and PISA scores are just one data point among many indications that our education system is struggling.
Instead, PISA and TIMSS are more like the weekly quizzes that you might have taken in school, so that both you and your teacher have an accurate picture of what you know, and what you still need to work on. Similarly, these international student assessments are meant to help education systems understand what their students are good at (analysing literary texts? applying maths principles to everyday problems?) and what they can do to improve. Increasingly, it looks like one way our Education Ministry can improve is by embodying the integrity and responsibility that the national curriculum espouses.
Learning lessons, old and new
To practise integrity and responsibility myself, I’d like to highlight some evidence that clears the Ministry of a suspicion raised earlier about our PISA results. When the PISA data were released last December, I noticed that 30% of students who took the PISA test were from fully residential schools (Sekolah Berasrama Penuh). This looked suspicious because fewer than 3% of students are enrolled in these asramas, which usually admit students based on their UPSR or PMR/PT3 results – so asrama students would probably get higher PISA scores than average.
However, when I later emailed the OECD to ask about this, they confirmed that the oversampling of asrama students was intentional, and that it was balanced out in all PISA calculations so that it did not bias our average results. It’s likely that the Education Ministry requested this oversampling to get a higher-resolution picture of asrama students’ performance – which is a legitimate reason, especially when the Education Blueprint discusses plans to internationally benchmark Malaysia’s education programmes for gifted children.
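For readers curious about how this balancing works: a minimal sketch of survey weighting, using entirely hypothetical numbers (the group shares and scores below are illustrations, not actual PISA figures). Each group’s contribution to the national average is scaled back to its true share of the population, so oversampling a high-scoring group does not inflate the result.

```python
# Minimal sketch of how survey weights undo deliberate oversampling.
# All numbers here are hypothetical, chosen only for illustration.

pop_share = {"regular": 0.97, "asrama": 0.03}     # share of students nationally
sample_share = {"regular": 0.70, "asrama": 0.30}  # share in the oversampled sample
mean_score = {"regular": 430.0, "asrama": 520.0}  # hypothetical group averages

# Naive sample mean: biased upward by the oversampled high-scoring group.
naive = sum(sample_share[g] * mean_score[g] for g in mean_score)

# Weighted mean: each group is scaled back to its true population share,
# recovering the unbiased national average.
weighted = sum(pop_share[g] * mean_score[g] for g in mean_score)

print(round(naive, 1))     # 457.0 (biased)
print(round(weighted, 1))  # 432.7 (unbiased)
```

With a 30% asrama sample against a 3% population share, the unweighted mean overstates the national average by about 24 points in this toy example; reweighting removes that gap, which is what the OECD confirmed happened in the official PISA calculations.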
So in this process of looking at our PISA 2015 results, I have re-learned the lesson that I must be unbiased in seeking an accurate picture of Malaysian education. As a citizen and researcher, I must hold the government accountable for how it uses our national resources to prepare our children for the future – but I must also give credit where it is due. And as a former SMK teacher, I know that there are many hardworking teachers and Ministry officials who work sacrificially for our children’s futures, and that it can be dispiriting when everyone seems pessimistic about our schools.
I hope that the Ministry reveals more evidence about our participation in PISA 2015, as they promised in December. And perhaps this evidence will show that the bias from the backup schools was unintentional and unavoidable. But more than that, I hope that the Ministry will learn all that they can from this PISA 2015 process, for the sake of the millions of children whose education they are entrusted with.