NAPLAN – The National Assessment Program – Literacy and Numeracy.
To fully understand the folly of the NAPLAN experiment, introduced into our nation’s education system in 2008 by the then Federal Education Minister Julia Gillard, one should spend a few minutes on the official website:
Then you might like to consider the technical malfunction on Wednesday, 11 March 2026 that created havoc across the nation’s schools. NAPLAN 2026 experienced significant nationwide disruptions on its first day, with 1.3 to 1.4 million students in Years 3, 5, 7, and 9 affected by login failures, server errors, and frozen screens. The online platform issues forced many schools to pause or reschedule testing, with ACARA apologising for the widespread technical failure. Google “NAPLAN test problems” for more, including a video that explains it more fully. The Year 3 cohort was fine, as they are not yet required to face the beast electronically. The reasons for putting the tests online are described here by Joe Tognolini:
And as one parent wrote to The Conversation:
NAPLAN glitches
“Given that NAPLAN online requires students to develop and demonstrate typing skills as well as literacy and numeracy knowledge, it is imperative that the systems and programs delivering NAPLAN work properly. The whole process is stressful enough without students having tests delayed, timetables adjusted and more assessments done over a shorter period of time. How are this round of NAPLAN results going to be ameliorated to account for the unnecessary frustration and anxiety these glitches have caused the participating students? Sort it out very quickly or go back to writing because this round of testing will not allow students to present their best work.”
Anna Morgan, Footscray
Also, for an in-depth analysis of the history and cost of NAPLAN, see Jen Buchanan’s piece for “Future Schools” - https://futureschools.education/standardised-testing-is-it-worth-it-by-jen-buchanan-june-2020/
And she ends her treatise with this comment: "To foster and enable the educational prosperity of future generations, a paradigm shift in assumptions of how we have come to understand assessment from both a theoretical standpoint and in practice needs to be challenged. It is only then that we will be able to move from standardised assessment as a metric for gathering data, towards a more holistic education that gives a true measure of what success looks like for all. When did education become a product, and when did conformity become learning?"
That future she speaks of is now with us, and we have been caught short-sighted and ill-prepared to manage it. It is also important to know that NAPLAN, under the auspices of ACARA, is administered by the foreign-owned publishing company Pearson at a mind-blowing annual cost of $100m, and the total cost is far greater. Who would question such folly?
https://www.reddit.com/r/Teachers/comments/1glwk0/pearson_seems_to_control_everything_in_education/
Pearson Education Services plays a key role in supporting the Australian Curriculum, Assessment and Reporting Authority (ACARA) with the administration of NAPLAN tests, including services for scanning, processing, and marking, according to Pearson UK. They have been involved with the program for years, providing services across various Australian states.
Then you might ask the question: What’s wrong with NAPLAN?
Join Margaret Wu in cautioning against using NAPLAN data for accountability, or imagining we can use it to measure school effectiveness: https://www.youtube.com/watch?v=i-ptsrdyxBE
Basically, you can't. The measurement error when assessing a domain as large as literacy or numeracy with a few 45-minute tests is huge. The error in the progress measure comparing two such tests taken two years apart is larger still.
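The claim that the progress measure is even less reliable follows from basic error arithmetic: the errors of the two tests add in quadrature when you take the difference. A minimal sketch, using a purely hypothetical standard error (not Wu's actual figures):

```python
# Illustrative only: how measurement error in two test scores compounds
# in a "growth" (difference) measure. The standard error below is an
# assumed round number, not a real NAPLAN statistic.

import math

se_test = 30.0  # assumed standard error of a single test's scale score

# For two independent measurements, the variance of the difference is the
# sum of the variances, so the growth score's standard error exceeds
# either single test's error.
se_growth = math.sqrt(se_test**2 + se_test**2)

print(round(se_growth, 1))  # 42.4, about 1.4 times the single-test error
```

Whatever the true per-test error, the growth measure inherits roughly 1.4 times as much noise, which is why comparisons of two-year progress are even shakier than the individual results.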
Her submission to the Senate Inquiry makes these points:
☠️ NAPLAN results don’t accurately measure school performance.
☠️ ICSEA and growth measures don’t properly adjust for student background.
☠️ Large measurement errors in student scores make comparisons unreliable.
☠️ Student growth data has a high margin of error, making it misleading.
☠️ Teacher performance estimates based on NAPLAN are too imprecise.
☠️ Publishing NAPLAN results misleads parents and the public.
It would be interesting to see what alternative, if any, she has suggested. Published NAPLAN results can be misleading and misused, particularly when the tests fall well short of a "perfect" test in the measurement sense. However, making the results available to the public is fair and important, though they were never intended to serve as league tables, as has become the case. They are the base source of decision-making meant to corroborate Howard's "Parent Choice".
NAPLAN narrows the working definition of what constitutes literacy and numeracy and hence dumbs down teaching, pressuring school leaders to teach to the test (always denied). Many students don’t ‘try’, while some are asked not to attend school on test days (also denied, vehemently). NAPLAN results are not well connected to what is really going on in classrooms, so they are a weak driver of real change to improve outcomes.
There are many more criticisms of NAPLAN than Margaret’s, for sure! For one, NAPLAN really measures the extent to which a school teaches whatever NAPLAN measures. And a huge problem for all standardised tests is the narrowing of the curriculum, with the Arts the greatest losers. Yet as adults we spend much of our time listening to music, attending concerts and performances, visiting art galleries and enjoying what the Arts have to offer a vibrant and richly endowed society.
There is value in large-scale data sets to help schools identify trends and support effective deployment of resources (including use of teacher time). However, these must be triangulated with other data points and evidence gathered at the cohort and classroom level, in conjunction with teacher judgement and insights. This process is efficient and effective only if the purpose of data and evidence gathering is to know and respond to our students' needs and build on their strengths to help them develop. If not, it becomes low-impact work and a waste of school resources, at great expense to education budgets.
If one believes Margaret's maths, NAPLAN can't be used in that way. It may even do the opposite of using teacher time well, considering it takes up about two weeks gathering information that is of little use for classroom teaching.
Most parents would know nothing of Margaret’s research, yet NAPLAN results play a huge part in the Parent Choice decisions they make when choosing a school for their children. The system is rotten at the core, yet schools ride on their “Great NAPLAN Results,” even billboarding them, to attract clientele.
When I talk about large-scale data sets and the trends they can identify, I don't believe the responsibility for interpreting them should fall to classroom teachers. I am grateful that Margaret is highlighting the flaws of these large-scale data sets and how they are used. As I mentioned, data and evidence gathering should be efficient, effective and supportive of high-impact work. As soon as it draws time away from teachers planning for, delivering, and providing feedback on learning, it becomes a low-value, low-impact use of teacher time, the most valuable resource in our schools for nurturing our young people.
And there's more.