NAPLAN 2017: Results Have Largely Flat-Lined, and Patterns of Inequality Continue

The release of the 2017 NAPLAN National Report confirms preliminary findings released in August and offers deeper insights into achievement trends since the assessment program began a decade ago.

The results paint an overall portrait of plateauing student achievement in literacy and numeracy, mixed with pockets of improvement and persistent inequalities between young people from different backgrounds.

High-level trends over the past decade

NAPLAN takes place annually. It assesses Australian school students in years three, five, seven and nine across four domains: reading, writing, language conventions (spelling, and grammar and punctuation), and numeracy.

Nationally, NAPLAN results have flat-lined in most areas since testing was first conducted in 2008. Across the majority of domains and year levels, there are no statistically significant changes in achievement between 2008 and 2017.

Read more: NAPLAN is ten years old – so how is the nation faring?

Improvements can be seen in a limited number of domains and year levels. There are statistically significant increases in spelling (years three and five), reading (years three and five), numeracy (year five), and grammar and punctuation (year three).

[Figure: National numeracy trends 2008–2017. Source: ACARA]

Year seven writing is the only area to show a statistically significant decline. It remains a major area of concern.

State and territory comparisons reveal good news for Queensland and Western Australia. Both show improvements across a number of domains and year levels.

New South Wales, Victoria and the Australian Capital Territory show high achievement, but results have plateaued.

The Northern Territory continues to lag significantly behind the rest of the nation across all domains and year levels.

The vast majority of young people meet the National Minimum Standards (NMS). The NMS provide a measure of how many students are performing above or below the minimum expected level for their year level in each domain.

NMS percentages are over 90% for the majority of domains and year levels. But NMS percentages vary widely. For example, only 55.7% of students in the Northern Territory meet the NMS for year seven writing, compared to 90.8% in Victoria.

Background affects achievement

This year’s results show clear differences in achievement between young people from different backgrounds. In many cases, these differences reflect broader inequalities in Australian society.

Notable trends include:

  • gender differences are persistent. Female students perform significantly better than male students in writing, and grammar and punctuation across all year levels. For example, 88.1% of female students meet the NMS for year nine writing, compared to 75.4% of male students
  • students with a language background other than English (LBOTE) perform significantly better in spelling than non-LBOTE students across all year levels. LBOTE students have also shown gains since 2008 in reading (years three and five), grammar and punctuation (years three and seven), spelling (years three and five) and numeracy (year five)

[Figure: Year 5 spelling: students with a language background other than English (LBOTE) compared to non-LBOTE students. Source: ACARA]

  • Indigenous students have shown statistically significant gains since 2008 in reading (years three and five), spelling (years three and five), grammar and punctuation (years three and seven) and numeracy (years five and nine). But Indigenous students still trail significantly behind non-Indigenous students across all domains and year levels

[Figure: Year 5 reading: achievement differences between Indigenous and non-Indigenous students. Source: ACARA]

  • parental education is a key factor determining student achievement. For example, in year three grammar and punctuation, the mean scale score for a young person whose parents have a Bachelor degree or above is 479.7, compared to 369.6 for students whose parents have a Year 11 equivalent or below. Similar patterns are reflected across all domains and year levels
  • geographical location also has a major bearing on student achievement. For example, in year three grammar and punctuation, the mean scale score for young people in major cities was 450, compared to 411.5 for young people in outer regional locations and 284.6 for those in very remote locations.

As always, tread cautiously with data

NAPLAN is one useful measure of student achievement in Australian schooling. When interpreted carefully, it can help policy makers, researchers, school leaders, teachers, students and parents better understand and debate literacy and numeracy achievement. It also serves to highlight pockets of underachievement and disadvantage, and can play an important role in informing policy interventions and investments.

Read more: Evidence-based education needs standardised assessment

But NAPLAN is not an oracle and can only tell us so much. So we should treat these results carefully.

To get a more accurate picture of achievement trends, we need to take a number of indicators into consideration. These should go beyond the basics of literacy and numeracy to include achievement in ATAR subjects, year 12 attainment rates, and more.

NAPLAN results should also be considered in relation to other standardised assessments, which do not always tell the same story.

For example, the latest Progress in International Reading Literacy Study (PIRLS) suggests reading achievement among Australian children has improved significantly, whereas the OECD’s Programme for International Student Assessment (PISA) shows steadily declining Australian results in all areas, including reading.

It’s also important to analyse school- and student-level NAPLAN data, which will be released in March 2018. It will no doubt lead to another round of debates about the role of NAPLAN in our schools.

This article was first published in The Conversation on December 13, 2017 and is reproduced here, with appreciation, under a Creative Commons licence.

Editor’s comments

Comments by Richard Letts, Director, The Music Trust, Editor, Music in Australia Knowledge Base

My comments are those of an interested layperson.

  1. For the most part, the changes in scores are not statistically significant. There were not big changes followed by plateaux at new levels; scores have pretty much plateaued over the ten years of NAPLAN’s operation.
  2. For the whole student body, there is a small number of statistically significant improvements and only one statistically significant decline. The statistically significant changes are very small. It would be good to have expert comment on whether a statistically significant change in a NAPLAN score is, shall we say, pedagogically significant (a rough illustration of the distinction follows this list).
  3. It is interesting that although most NAPLAN scores for the whole student body are static, scores for some sections of the student body have improved – e.g. LBOTE students, who showed more positive change in most subjects than the student body overall and higher scores in spelling than native English speakers!
  4. NAPLAN does identify particular groups that are underperforming and so points to where more resources are needed.
  5. However, given that business as usual is having almost no impact on scores generally, the extra resources presumably should not be put into business as usual.
  6. NAPLAN is a measurement, not a curriculum. But it is to be expected that if it is to be taken as the measure of achievement, schools will invest time and energy in attempting to lift achievement in what NAPLAN measures. So NAPLAN is not a curriculum but it affects curriculum.
  7. NAPLAN does not measure arts achievement and therefore we continually hear reports of principals reducing or cancelling arts instruction and reapplying resources to the subjects measured by NAPLAN.
  8. Readers might refer to the item in the March edition of Loudmouth describing the strategy adopted by Feversham school in the UK. The school was failing. Instead of following the strategy of anxious Australian principals and adding more instruction in the subjects in which it was failing (literacy, maths…), it introduced a serious program of music. Attendance is now 98% and the school overachieves in the academic subjects.
  9. This seems to be too complicated a proposition to be considered by Australian departments of education. If you want to attach two planks one to the other, NAIL them. Umm, but perhaps you could bolt them or glue them. Or not use planks. What is the purpose of the planks anyway? Perhaps there is an entirely different and better solution.
  10. The author mentions that Australian achievement in the PISA scores has declined. As we understand it, PISA is primarily a ranking of countries in their comparative achievements in reading, maths and science. It would be possible for Australia’s scores to remain constant but for it to drop in the rankings because other countries improve. From where would PISA get more reliable scores for Australia than from NAPLAN?
  11. However, as we have reported a number of times, the countries that surpass Australia’s PISA ranking invest much more in music education than do we. This may partly account for their overall performance; we can at the least conjecture that music education has not damaged it.
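
To put point 2 concretely: with cohorts the size of a national year level, even a trivially small shift in mean scores can pass a significance test. The Python sketch below uses purely hypothetical numbers (a 3-point shift in the mean, a score standard deviation of about 70, and cohorts of 250,000 students; these are illustrative assumptions, not ACARA figures) to show how a negligible effect size can still yield a vanishingly small p-value.

```python
# Illustrative sketch only: the means, standard deviation and cohort
# size are hypothetical assumptions, not ACARA/NAPLAN figures.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
n, sd = 250_000, 70.0                    # assumed cohort size and score SD
scores_2008 = rng.normal(500.0, sd, n)   # hypothetical 2008 cohort
scores_2017 = rng.normal(503.0, sd, n)   # hypothetical 2017 cohort, +3 points

t_stat, p_value = stats.ttest_ind(scores_2017, scores_2008)
cohens_d = (scores_2017.mean() - scores_2008.mean()) / sd  # effect size

print(f"p-value:   {p_value:.1e}")   # far below 0.05: "statistically significant"
print(f"Cohen's d: {cohens_d:.3f}")  # roughly 0.04: a negligible effect
```

On conventional benchmarks an effect size below 0.2 is considered small, so a change of this magnitude could be statistically significant while meaning very little for classroom practice.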

Author

Glen C Savage

DATE PUBLISHED: 22 March 2018
