NAPLAN results show it isn’t the basics that are missing in Australian education



AAP/Dan Peled

Misty Adoniou, University of Canberra

The preliminary results of NAPLAN 2017 are out, and the news isn’t good. The annual test of our students’ literacy and numeracy skills shows that not much has changed since 2011 – coincidentally, or not, the year we began this annual circus of publicly reporting NAPLAN results.

In fact, it seems our kids are actually getting dumber – at least as measured by the NAPLAN tests.

Going backwards

This year’s Year 9 students first sat the test back in 2011, when they were in Year 3, so we can now track the cohort’s performance over time.

It is particularly useful to track their performance against the writing assessment task, as all the grade levels are marked against the same ten assessment criteria. Depending upon how they perform against each assessment criterion, they are assigned a Band level – ranging from Band 1, the lowest, to Band 10, the highest.

The minimum benchmark shifts for each year level, because we expect a different minimum level of writing performance from 14-year-olds than we do from eight-year-olds. So, in Year 3 the minimum benchmark is Band 2, and in Year 9 it is Band 6.

A gifted and talented Year 3 student could easily achieve a Band 6 or above, and it is conceivable a struggling Year 9 student may only reach a Band 2.

This year, a staggering 16.5% of Year 9 students across Australia were below benchmark in writing. Back in 2011, when those students were in Year 3, only 2.8% of them were below benchmark. Somehow we dropped the ball for thousands of those kids as they progressed through school.

The high-performing states of New South Wales, Victoria and the ACT cannot claim immunity from this startling increase in students falling behind as they progress through school. Their results show exactly the same trends. This is a nationwide problem.

https://datawrapper.dwcdn.net/DF78j/1/

It gets worse

Not only are the numbers of low-performing students increasing, but the inverse is occurring for our high-achieving students: their numbers decrease as they move through school.

This year, only 4.8% of Year 9 students across Australia performed far above the minimum benchmark – that is, at a Band 10 level. However, back in 2011, 15.7% of those same students were performing far above the minimum benchmark for Year 3 – that is, at a Band 6 or above.

https://datawrapper.dwcdn.net/Ae5s3/1/

The trend is strikingly similar across all the jurisdictions. As NSW congratulates itself on improving its Year 9 results, it might want to look a little closer to see what the figures are really saying.

In 2011 an impressive 20% of NSW Year 3 students were far above benchmark in writing. But by the time they had reached Year 9 this year, the number of them who were far above the benchmark had dwindled to a depressing 5.7%.

https://datawrapper.dwcdn.net/mrkEV/1/

What is happening?

Why do we start so well, and then lose both high performers and strugglers along the way? Isn’t school supposed to be growing their literacy skills, not diminishing them?

Well, the NAPLAN statistics not only illustrate the problem, they actually provide the explanation.

We don’t have an early years literacy “problem” in Australia. The percentage of students below benchmark in Year 3 translates into very small absolute numbers. In Victoria in 2016, for example, only around 450 Year 3 students were below benchmark.

It should be very easy to locate those children, and provide intensive interventions specifically designed for each student. But apparently we don’t.

By Year 5, those low performers across Australia are simply treading water, and our high performers start to slide. Then it all takes a dramatic turn for the worse in Year 7: the proportion of students below benchmark increases five-fold, while the proportion far above the benchmark shrinks to a third of what it was.

So, what is going on?

Well, reading and writing get harder in Year 4, and every year after that.

The Year 3 test is looking for evidence that the children have learned their basic reading and writing skills. They can decode the words on the page and comprehend their literal meaning. They can retell a simple story that is readable to others.

However, by Year 5, the test begins to assess the children’s ability to infer from and evaluate what they read, and to consider their audience as they write.

In Year 7 it is expected that children are now no longer learning to read and write, but that they are reading and writing to learn. To achieve this they need deep and technical vocabularies, and to be able to manipulate sentence structures in ways we do not and cannot in our spoken language.

And the NAPLAN results suggest that many of them cannot.

Instead, they are stuck with their basic literacy skills, obviously well learned in the early years of school. They can read – but only simple books with simple vocabulary, simple grammatical structures and simple messages. They can write – but they write the way they speak.

What’s the solution?

Raise our expectations of our students. And raise the quality and the challenge of the literacy work we do with them.

There has been a misplaced focus on “back-to-basics” literacy education in recent years. The last ten years of NAPLAN testing shows us we are already exemplary at the basics. It is the complex we are bad at.

It’s time to change tack. Our attention needs to focus on developing the deep comprehension skills of our upper-primary and high school students. And our teachers need – and want – the resources and the professional learning to help them do this.

Teachers must build their own understanding of the ways in which the English language works, so they can teach their students to read rich and complex literature for inference, to use complex language structures to craft eloquent and engaging written pieces, and to build sophisticated and deep vocabularies.

It isn’t the basics that are missing in Australian education; it is challenge and complexity.

And until we change our educational policy direction to reflect that, we will continue to fail to help our children grow into literate young adults – and that is bad news for us all.

Misty Adoniou, Associate Professor in Language, Literacy and TESL, University of Canberra

This article was originally published on The Conversation. Read the original article.

NAPLAN is ten years old – so how is the nation faring?



About 1.1 million students in Years 3, 5, 7 and 9 sat the 2017 NAPLAN tests in May.

Glenn C. Savage, University of Western Australia

The NAPLAN 2017 summary results have been released with the usual mix of criticism, high hopes and panic that marks the yearly unveiling of data.

This year’s results will generate particular interest, as 2017 is the tenth time NAPLAN has been conducted since it was first introduced in 2008.

The final report is not due until December, but the summary results provide a useful opportunity to reflect not only on how young Australians have fared over the past year, but also over the past decade.

What does NAPLAN test?

NAPLAN takes place every year and assesses Australian school students in years 3, 5, 7 and 9 across four domains: reading, writing, language conventions (spelling, and grammar and punctuation), and numeracy.

NAPLAN is a “census assessment”. This means it tests all young people in all schools (government and non-government) across Australia.

NAPLAN uses an assessment scale divided into ten bands to report student progress through Years 3, 5, 7 and 9. Band 1 is the lowest and 10 is the highest.

Each year, NAPLAN data for every school in the nation is published on the publicly accessible My School website.

The Australian Curriculum, Assessment and Reporting Authority (ACARA), which manages NAPLAN and My School, suggests the test and website increase transparency, and allow for fair and meaningful comparisons between schools.

Others, however, argue the website has transformed NAPLAN into a “high-stakes” test with perverse consequences.

How do 2017 data compare to 2016 data?

Compared to 2016 results, 2017 data show:

  • no statistically significant difference in achievement in any domain or year level at the national level;

  • South Australia had the only statistically significant change out of any state or territory, with a decline in Year 3 writing achievement;

  • New South Wales, Victoria and the Australian Capital Territory continue to be the highest-performing jurisdictions, scoring above the national average across the majority of domains and year levels; and

  • the Northern Territory continues to significantly underperform on all measures when compared with other jurisdictions (see, for example, Year 3 reading trends below).

How do 2017 data compare to 2008 data?

Compared to 2008, 2017 data show:

  • no statistically significant difference in achievement across the majority of domains and year levels at the national level;

  • statistically significant improvements at the national level in: spelling (years 3 and 5); reading (years 3 and 5); numeracy (year 5); and grammar and punctuation (year 3);

Year 3 Reading results: 2008-2017.

  • Year 7 writing is the only area to show a statistically significant decline in achievement at the national level (based on data from 2011 to 2017);

  • Queensland and Western Australia stand out positively, showing statistically significant improvements across a number of domains and year levels;

  • despite high mean achievement overall, there has been a plateauing of results in New South Wales, Victoria and the Australian Capital Territory; and

  • students have moved from lower to higher bands of achievement across most domains over the past ten years. This is illustrated in the following graph that shows band shifts in Year 3 reading (green) and Year 9 numeracy (blue).

From 2008 to 2017, there has been a gradual redistribution of students from lower bands of achievement to higher ones in many domains.

How many students meet the National Minimum Standards?

Another important NAPLAN indicator is the percentage of students meeting the National Minimum Standards (NMS).

NMS provide a measure of how many students are performing above or below the minimum expected level for their age across the domains.

The 2017 national portrait remains positive in relation to the NMS, with percentages over 90% for the majority of domains and year levels.

Year 9 numeracy has the highest NMS percentage of 95.8% at the national level.

Year 9 writing has the lowest NMS percentage of 81.5% at the national level.

The Northern Territory continues to lag significantly behind the rest of the nation across all domains and years, with NMS percentages falling distressingly low in some cases. For example, only 50% of Year 9 students in the Northern Territory meet the NMS for writing.

What are the implications moving forward?

It is safe to say the nation is standing still compared to last year and has not made any amazing leaps or bounds since the test was first introduced.

This will be of concern to many, given one of the main justifications for introducing NAPLAN (and committing major investments and resources to it) was to improve student achievement in literacy and numeracy.

The general lack of improvement in NAPLAN is also put into stark relief by steadily declining results by Australian students on the OECD’s Programme for International Student Assessment (PISA).

Those committed to NAPLAN see improving the test as the best way forward, along with improving the ways data are used by system leaders, policymakers, educators, parents and students.

One major change in 2018 is that schools will begin transitioning away from the current pen and paper version to NAPLAN online. ACARA hopes this will produce better assessment, more precise results and a faster turnaround of information.

Schools will initially move to NAPLAN online on an opt-in basis, with the aim of all schools being online by 2019.

Only time will tell whether NAPLAN online has the desired effects and whether the current cycle of stagnating results will continue.

Glenn C. Savage, Senior Lecturer in Public Policy and Sociology of Education, and ARC DECRA Fellow (2016-19), University of Western Australia

This article was originally published on The Conversation. Read the original article.