Tips for analysing NAPLAN results

When you receive your school’s SMART data package, it’s hard to know where to start in analysing all the information. Here are some suggestions for ways to approach it. I’m sure you do this already, but perhaps it’s helpful to know that we’re on the same wavelength!

1. Check overall performance in bands versus the state

Look at the student performance in bands by area – Overall Literacy and Numeracy. (It’s the bar chart called: Overall Literacy percentages in bands for Year 5/7/9: All students.) Check whether your school is above or below the state in each band. Of course, we want the red lines to be above the state averages in the higher bands and below the state averages in the lower bands! This will highlight any obvious areas of weakness. It’s also interesting to see if there are many students in the second-highest band but not many in the top band – this probably means that quite a few students could be extended and “pushed over” into the top band.
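If you like to keep your own record outside SMART, a quick way to see the gaps is to line the two sets of band percentages up side by side. Here’s a minimal sketch in Python – the band labels and figures are invented for illustration, so substitute the numbers from your own chart and the matching state figures:

```python
# Illustrative figures only - replace them with the percentages from your
# school's "percentages in bands" chart and the corresponding state figures.
school = {"Band 4": 10, "Band 5": 18, "Band 6": 30, "Band 7": 25, "Band 8": 12, "Band 9": 5}
state = {"Band 4": 8, "Band 5": 15, "Band 6": 27, "Band 7": 26, "Band 8": 16, "Band 9": 8}

for band in school:
    diff = school[band] - state[band]  # positive = above state, negative = below
    print(f"{band}: school {school[band]}%, state {state[band]}%, difference {diff:+d}")
```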

2. Look closely at test items below state average

You can export the item data to an Excel spreadsheet and build a table that shows how your students performed on each question, for example:

| Difficulty | Qu | Strand | Question Description | Syllabus Outcomes | State % | School % | Difference from State |
|---|---|---|---|---|---|---|---|
| 46 | 22 | Reading | Infers the nature of a character in a news report | English 4.7.1 | 28 | 18 | -10 |
| 45 | 39 | Reading | Infers a main idea in the conclusion of an information report | English 4.7.9, Science 4.10 | 28 | 20 | -8 |

This example shows that on these two items students performed significantly below the state average, so it’s worth looking at them in detail. Find the actual questions in the test paper and work out what the problem was.
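If you’re comfortable with a little scripting, the same table can be built automatically. The sketch below (Python with pandas) assumes you’ve saved the exported item analysis as a CSV with columns matching the table above – the file name and column headings are assumptions, not SMART’s actual export format. It calculates the difference from state and lists the items sitting five or more points below.

```python
import pandas as pd

# Assumed file name and column headings - adjust these to match your own export.
items = pd.read_csv("item_analysis.csv")  # columns: Difficulty, Qu, Strand,
                                          # Question Description, Syllabus Outcomes,
                                          # State %, School %

# Negative difference means the school is below the state average on that item.
items["Difference from State"] = items["School %"] - items["State %"]

# Flag items well below the state average (the 5-point threshold is a judgement call).
below = items[items["Difference from State"] <= -5].sort_values("Difference from State")
print(below[["Qu", "Strand", "Question Description", "Difference from State"]])
```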

3. Look at test items above the state average

It’s not all gloom and doom. Have a look at the items in which your students performed above the state average (more than 10 points above is terrific!). Highlight these at a staff meeting and discuss why your students might be doing well in these areas. Be careful with some items, especially spelling, as these can pop up both above and below the state averages and the descriptors are not detailed enough to be helpful.

4. Individual student growth

Have a look at the Top 30 students in the cohort and the Bottom 30. Of these, how many achieved the expected growth since the last NAPLAN? This will give you an indication of whether you’re extending your top students and supporting your students with special needs.

If you’re in a secondary school looking at Year 7, it’s worth remembering that the students have only been at the school for about four months before the test, so the results probably don’t reflect your school’s efforts. However, for Year 9, it’s all you! For that reason, I look at individual student growth in detail for Year 9 and just glance at Year 7.

You may like to draw up your own table showing the number of students who did or did not demonstrate the expected learning gain, for example:

| Number who did not show expected literacy gain | Overall literacy |
|---|---|
| Lowest 30 | 5 |
| Top 30 | 26 |

In this example, the results would suggest that the least able students are being supported well and are achieving learning gains, as 25 out of 30 achieved the expected literacy gain and only 5 did not. However, the top 30 students are a different story: most of them did not achieve the expected literacy gain, so perhaps you could consider more extension for your able students.
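If you’d rather tally this with a short script than by hand, here is a rough sketch. It assumes you have a CSV of matched students with a previous and current overall literacy score plus an expected-growth figure for each – the file and column names are assumptions, not SMART’s actual export – and it counts how many of the lowest 30 and top 30 (ranked on their previous score) met the expected gain.

```python
import pandas as pd

# Assumed file name and column headings - adjust to match your own growth export.
growth = pd.read_csv("student_growth.csv")  # columns: Student, Previous Score,
                                            # Current Score, Expected Growth

growth["Actual Growth"] = growth["Current Score"] - growth["Previous Score"]
growth["Met Expected"] = growth["Actual Growth"] >= growth["Expected Growth"]

# Rank the cohort by their earlier score, then look at each end of it.
ranked = growth.sort_values("Previous Score")
groups = {"Lowest 30": ranked.head(30), "Top 30": ranked.tail(30)}

for label, group in groups.items():
    did_not = int((~group["Met Expected"]).sum())
    print(f"{label}: {did_not} of {len(group)} did not show the expected literacy gain")
```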

Good luck with your analysis!

 
 
