A lesson for teachers from their least-loved exam
Illustration: Rocco Fazzari
THE system of NAPLAN testing in Australia has faults, but the survey into its effects says more about teachers than it does about the test. The name of the survey gives a good idea of its purpose: "The impacts of high stakes testing on school students and their families: An Educator's Perspective". But is NAPLAN really "high stakes" testing? What are the consequences of the test? Does any child stand to gain a prize, or entry to anything? Do they receive any kudos? Any reward? The answer to all these questions is no. NAPLAN tests may be high profile, they may be controversial, but are they high stakes? The answer is no. To paraphrase Mick "Crocodile" Dundee - that's not high stakes testing … try selective school testing, or the HSC.
Indeed, the survey reported that some students get bored practising for NAPLAN tests because they know "the result will not affect their semester reports".
One could argue that the biases of this survey are also obvious in the questions asked. Of the 25 questions, not one asked about the positive impacts of NAPLAN. Most questions sought to elicit negative aspects of the tests. Take one example: "Some researchers claim that NAPLAN can impact on student health and wellbeing. From your own experience as a teacher, have you ever had any students report the following problems as a result of NAPLAN?" It then went on to prompt a series of ill-effects such as sleeplessness, crying, feeling stressed, concern that they were too dumb, and so on.
Now consider another question that could have been asked: "Some researchers claim that NAPLAN is an excellent diagnostic tool to gauge student progress. From your own experience, have you ever found it useful in identifying a student's strengths and weaknesses, tailoring lessons, or devising an individual learning plan?" It's easy to see how the results might have been different.
Another issue is the problem of self-selection. The survey received almost six times as many responses from Queensland as from Victoria. Even when weighted for population, this suggests that respondents were motivated to answer the survey, not randomly selected.
Notwithstanding these obvious faults, the survey has highlighted a number of areas which need to be addressed. The lag between the administration of the tests and getting the results is too long. A year 3 student can learn a lot (or fall behind) in the months it takes to report results.
Teachers may not like the scrutiny NAPLAN - and the My School website, which reports the results - brings to school performance (or their own). They may not like the emphasis it places on numeracy and literacy. But that's part of the intention. As Professor Barry McGaw, chairman of the Australian Curriculum, Assessment and Reporting Authority, argues, these basic skills are absolutely vital for higher learning.
Before they became national, NAPLAN tests used to be called basic skills tests. Their primary purpose is as a diagnostic tool to help teachers, students and parents gauge how students are performing in those basic skills of writing, reading and maths - at a state and national level.
The truth is that NAPLAN tests have become high stakes only in the minds of some teachers. If children are feeling the pressure, then it's probably being transmitted from teachers. One of the more disappointing aspects of the survey was that many teachers are not using the tests to improve learning - 55 per cent said the tests were not a diagnostic tool.
Under 20 per cent used the tests to form individual learning plans, and about a quarter said they glanced at the data but didn't change their teaching practices. If there is a lesson out of this survey, it is that teachers - some teachers - need to take a chill pill, and use the tests to help children learn.