Are schools gaming NAPLAN to manipulate academic performance?
Private schools are more likely to exclude poorly performing students from NAPLAN testing via parental withdrawal, a practice that could skew academic rankings
Schools may be strategically manipulating participation in the National Assessment Program – Literacy and Numeracy (NAPLAN) to improve their public performance results, according to research from UNSW Business School.
The study found that schools with lower initial test scores relative to other schools teaching similar student bodies show higher rates of non-participation in NAPLAN tests following the public release of performance data on the My School website. The website, maintained by the Australian Curriculum, Assessment and Reporting Authority, is designed to provide transparency and accountability in Australia’s school education system through the publication of nationally consistent school-level data.
However, the research paper, Unintended consequences of school accountability reforms: Public versus private schools, published in the Economics of Education Review, found that certain schools are more likely to exclude lower-performing students – a move which would enhance a school’s overall test results. This increase in non-participation was largely driven by formal parental withdrawal, a process that parents can initiate by applying directly to the school principal.
“We found that the fraction of students withdrawn from testing in years after the 2010 launch of My School went up, while the fraction who were absent or exempt from testing remained roughly steady – with poorly-performing students far more likely than other students to be withdrawn from testing,” said Gigi Foster, a Professor in the School of Economics at UNSW Business School, who co-authored the research paper together with University of Melbourne Associate Professor Mick Coelli. “We further find that this increase in withdrawal rates occurred in schools that were initially reported on My School to be poor performers relative to peer schools.”
This behaviour, commonly referred to as “gaming the system,” undermines the objective of NAPLAN to provide a fair assessment of student performance across Australia, Prof. Foster asserted.
Private schools are more likely to game the system
The research identified a sector-specific response, with private schools more likely to adjust their testing pools compared with public or Catholic schools. This indicates a higher tendency among private schools to manipulate participation rates to maintain their reputations.
More specifically, it found that students at private schools are more than twice as likely as their peers at state schools to be withdrawn from NAPLAN tests in subsequent years if they received low scores in previous years.
“We provide some suggestive evidence that these higher rates of withdrawal in lower-performing schools were more prominent in independent private schools – which are famous for charging parents a pretty penny for the privilege of enrolling their kids – than in public schools,” said Prof. Foster. “These findings are consistent with a situation in which the increased withdrawal is used as a tactic to manipulate the image of a school’s quality: excluding more poor performers from testing makes the school look better than it otherwise would look on My School.”
How parents play a role in NAPLAN testing participation
The research paper noted that parents can formally apply to the school principal for the withdrawal of their child from NAPLAN testing “in the manner specified by the local testing authority” (usually the state’s department of education). NAPLAN testing protocols indicate that withdrawals are intended to address issues “such as religious beliefs and philosophical objections to testing,” but providing a considered explanation for withdrawal does not appear necessary.
Furthermore, the process of applying for withdrawal was not actively promoted by governments or the testing authority. “We believe that knowledge of the process was provided to parents directly by schools,” said the researchers, who noted that parents apply for withdrawal directly to the school, not to the testing authority or government.
Parents have reported pressure from schools to withdraw children from NAPLAN testing in surveys conducted by state education authorities, while the popular press (including The Sydney Morning Herald, news.com.au and The Herald Sun) has also reported on claims by parents that schools have instructed children not to sit the NAPLAN tests “in order to boost their chances of obtaining higher overall scores”, the research paper stated.
“The volume of such reports led the NSW Minister for Education to warn that principals or teachers found encouraging children not to sit the tests may face disciplinary action. Reports of parents coordinating the withdrawal of their children from testing so their school would not be included on the My School website have also been made.”
In the Australian setting prior to My School, the paper said, parents likely already had expectations of a school’s overall performance or “quality” based on the types of students it served. Parents may not, however, have had an informed view of how a school was performing relative to schools serving similar student cohorts.
“The similar-schools comparisons provided on My School are thus likely to have constituted an ‘information shock’ to the minority of parents who looked at My School, and more than likely to school leaders too. Consistent with this conjecture, the most negative views of school leaders about the My School website were reported by schools that had the weakest performance in similar-school comparisons,” the researchers said in their paper.
Unintended consequences of manipulating NAPLAN
Prof. Foster explained that she and research paper co-author Associate Professor Mick Coelli were familiar with overseas research evaluating government programs designed to make school performance more transparent to parents, and thereby to raise the accountability of schools. “The idea of such programs is to help improve educational outcomes, since with more information, parents would be expected to select higher-performing schools for their children to attend, thereby exerting competitive pressure on low-performing schools to either lift their game or close,” she said.
However, she said, these types of programs could have unintended consequences if not designed with careful thought. Because low-performing pupils at poorly performing schools were more likely not to sit the NAPLAN tests, Prof. Foster said, the My School program may have had the unintended consequence of hiding from public view the low English and numeracy skills of some of Australia’s weakest school students.
“This is somewhat ironic since the whole point of a school accountability program is generally to improve the education that students – and particularly disadvantaged students, whose parents may have few sources of information about school quality apart from the program – receive,” she said.
Data analysis and policy implications
In conducting their analysis, the researchers used data from the My School website spanning 2008 to 2015. This data set included standardised test scores and participation rates from a balanced panel of 6,981 schools over eight years.
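To make the kind of comparison involved concrete, here is a minimal sketch, in Python, of how withdrawal rates before and after the 2010 My School launch could be contrasted in such a school-level panel. The file name and column names (school_id, year, sector, pct_withdrawn) are assumptions for illustration only, not the researchers’ actual code or data.

```python
import pandas as pd

# Hypothetical balanced panel: one row per school per year (2008-2015),
# with the percentage of students formally withdrawn from NAPLAN testing.
# File and column names are illustrative assumptions, not the study's data.
panel = pd.read_csv("naplan_panel.csv")  # columns: school_id, year, sector, pct_withdrawn

# My School launched in 2010; compare average withdrawal rates before and after.
panel["post_my_school"] = panel["year"] >= 2010

# Mean withdrawal rate by school sector, pre- vs post-launch.
summary = (
    panel.groupby(["sector", "post_my_school"])["pct_withdrawn"]
    .mean()
    .unstack()
    .rename(columns={False: "pre_2010", True: "post_2010"})
)
summary["change"] = summary["post_2010"] - summary["pre_2010"]
print(summary)
```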
While the intention behind publicising school performance data was to drive improvements and transparency, Prof. Foster said, the results indicated that it could also incentivise undesirable behaviours.
“The government could fix a maximum percentage of students who may be excluded from testing on any given testing day, although the monitoring costs of this may be prohibitive and it may result in further unintended consequences, such as schools reducing the number of allowed exclusions amongst average performers – perhaps then creating pressure on sick or injured students to sit the tests – in order to create more ‘slots’ for weak performers to sit out the tests,” she said.
“Another option would be for the My School site to report the percentages of students excluded from testing at each school, while adding a note informing parents that a comparatively high fraction of students excluded from testing may signal that a school’s true average performance is lower than what it appears to be on the My School tables.”
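As a rough sketch of how such a flag might be computed (hypothetical only: My School does not publish this flag, and the comparison rule, threshold and data below are assumptions), a site could mark schools whose exclusion rate sits well above that of comparable schools:

```python
import pandas as pd

# Hypothetical exclusion rates (% of students withdrawn, absent or exempt);
# school names, values and the flag rule are assumptions for illustration.
schools = pd.DataFrame({
    "school": ["A", "B", "C", "D"],
    "pct_excluded": [2.1, 3.0, 9.5, 2.4],
})

# Flag a school as "comparatively high" if its exclusion rate is more than
# double the median of its comparison group (the threshold is assumed).
median_rate = schools["pct_excluded"].median()
schools["high_exclusion_flag"] = schools["pct_excluded"] > 2 * median_rate

print(schools)
```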
The researchers noted that their results – while striking – still leave open the question of what precisely drove schools’ gaming responses in Australia. “Do schools believe parents will respond to school performance data published on My School via changing their enrolment decisions? It does not appear that parents have responded that way. Are responses driven by a desire to ‘keep parents happy’ or to underpin public perceptions of the ‘prestige’ of the school?”
The paper concluded that it is unlikely that schools are responding out of fear of consequences from the government or some other oversight body, as those bodies already had access to NAPLAN results prior to My School. “Isolating the main drivers of gaming responses to the accountability mechanism in the Australian setting is a fruitful area for future research,” the researchers said.