Senators urge overhaul of My School website to stop 'league table' of schools

The My School website should be overhauled so the performance of schools cannot be easily compared and converted into league tables, according to a cross-party Senate committee into the NAPLAN testing regime.

While the committee accepted that NAPLAN data was useful for students, schools and parents, it argued there were ''significant disbenefits'' to publishing the results in a way that allowed for a direct comparison of schools.

''One of [the] core elements of the My School website is the ability to compare schools, but given the number of variables involved in the testing process, and the almost infinite variation in students, a true comparison is not possible,'' the committee writes in its report.

Core rankings and comparative functions should be stripped from the website to ''limit the disingenuous use of data to rank schools''.

The NAPLAN (National Assessment Program - Literacy and Numeracy) tests are used to assess students in years 3, 5, 7 and 9 across Australia. Since the Rudd government introduced the My School website in 2010, newspapers have used the results to create league tables comparing different schools.

The committee's key recommendation is a quicker turnaround between testing and the delivery of results.

The committee - dominated by the Coalition when it did the bulk of its work last year - became Labor-dominated following the federal election.

The Australian Literacy Educators' Association told the committee it takes five months for NAPLAN results to arrive at schools - too long for them to be used as a tool to identify problems in the classroom and devise solutions.

According to a survey by the Australian Education Union, 58 per cent of teachers don't believe NAPLAN is an effective diagnostic tool.

''The school year moves at a rapid pace and the turnaround of many months [for NAPLAN results to become available] does not allow for meaningful intervention to ensure students across the spectrum of development are given the appropriate support they require,'' the committee finds.

The committee also said tests should be adapted to take into account the needs of students with disabilities and from non-English language backgrounds.

Federal Education Minister Christopher Pyne said last year the Coalition would consider banning the publication of NAPLAN results because they were ''skewing the way people teach''.

But Mr Pyne has since dramatically softened his rhetoric. On Thursday, he said: ''The government committed to review NAPLAN and the My School website to ensure it is meeting the needs of our students.''

Labor's education spokeswoman Kate Ellis said: ''Of course NAPLAN should be continually improved. It's a really important tool for picking up where schools and students are falling behind - and for working out where extra investment is needed.''

''NAPLAN results are just one factor that should be taken into account in assessing a school's performance.''


18 comments

  • NAPLAN is referred to as napalm in teaching circles because of its capacity to burn to the ground the love of learning. While I was in amongst teachers who had to pay attention to these results, it was a common reaction that the data only confirmed what teachers already knew about their students. It does not add value. The process is a waste of money and confers no real advantage in helping students find their passion in life and become lifelong learners.

    Commenter
    remotestidea
    Date and time
    March 28, 2014, 7:07AM
    • remotestidea,
      It's not about each teacher or student; yes, I'm sure they know their students well.
      NAPLAN is about being able to compare between schools within your area, within the state and within Australia.
      We used to have individual state assessments but no idea how they related to each other state; now we do.
      You've got your microscope glasses on, not the wide-angle lens. This is big picture stuff.

      Commenter
      Econorat
      Location
      Sydney
      Date and time
      March 28, 2014, 10:59AM
    • @econorat - 'remotestidea' is correct - NAPLAN is definitely about students, it's about student performance. It was designed as a diagnostic tool so that teachers could address student literacy and numeracy needs, it was NOT designed or intended to be used for comparing schools, thus it's not suitable for that purpose. If we're going to continue to compare school performance then another type of measure, designed for purpose, is required. Maybe a new comparative measure will include school/student performance in literacy and numeracy, and maybe other things as well, but it needs to be designed specifically for the purpose that its intended for.

      Commenter
      Truthful
      Location
      Sydney
      Date and time
      March 28, 2014, 2:12PM
  • Space inhibits a full comment on this article; however, as a retired teacher who specialised in working with children with learning difficulties, I always felt that these children should have been identified by Year 3 (first NAPLAN year), and that any remediation should have been underway for a considerable period of time. Therefore, I totally agree that NAPLAN, as a diagnostic tool, is not as effective as other tests which deliver results in a more timely fashion. I have many issues with NAPLAN, especially as it suffers from the classic test syndrome of being a one-off assessment of a child on one day. I also have a major concern with how parents interpret the results; far too many consider NAPLAN to be the be-all and end-all of test results, without placing the results in the context of other assessments, the ability of the child, etc, etc. "Given the number of variables involved in the testing process, and the almost infinite variation in students, a true comparison is not possible" is certainly a true statement. As a final point, I don't wish to hear that NAPLAN tests are standardised tests; they are not. The content varies from year to year, and the children are assessed based on the total cohort in that grade undergoing the assessment that year. Standardised tests have a standard content, and are measured against norms established over a period of time. It is not truly possible to compare NAPLAN results from one year to the next. I could comment further, but I'll leave it at that!

    Commenter
    Parto
    Location
    Wild West
    Date and time
    March 28, 2014, 7:31AM
    • Very well said, Parto.
      In her enthusiasm to achieve a much better outcome for all Australian children and to provide better opportunities for the disadvantaged, Julia made a huge mistake in using this instrument to measure results. However, there will still be bitter arguments among educators about the best methods of assessing educational outcomes, which should just prove to all of us how, not just controversial, but difficult this is.

      Commenter
      EM
      Date and time
      March 28, 2014, 8:14AM
    • From my own research I can make the following observations about NAPLAN numeracy tests.

      1. It presumes that all students should be taught the same thing at the same time in the same way. If nothing else this stifles creativity.
      2. About 40% of students either don't finish or guess more than half of the test.
      3. Many problems are not understood because of language problems, or are ambiguous or badly crafted.
      4. The results, if they were going to do any good, take too long.
      5. As a comparison between schools it is nonsense, given that kids are often excluded and at some schools kids are taught to the test.
      6. The test does not account for students who are capable but slower, or become anxious.

      Commenter
      DrPhil
      Date and time
      March 28, 2014, 11:47AM
  • If they implement the Gonski reforms as they were originally planned before Pyne started mucking around (after all, why take the word of experts who did a 2 year review over a couple of his mates?) then the imbalance between high end private and poor end public would be addressed.

    From a business perspective you can't make plans and improvements unless you measure where you are, i.e. create a baseline. That is what NAPLAN does, and the results are not always pretty. So rather than hide the bad news, how about looking at the causes and addressing them. That is what Gonski was to do: move the funding to where it was needed. Now I know that isn't in the government's interests, as that would have an impact on their families and friends and, as we know, the government is all about self-entitlement.

    Parents need information to make informed decisions, which are not just based on this tool but also school visits etc. Without this information coming out we will never address the funding inequality in our schools. What is worse is that we need more accountability in our school performances so that we can then start looking at why we are dropping down the OECD ladder. Until we do that (and NAPLAN helps), all we have is a flight to expensive private schools in the misguided belief they get a better education. They don't. What they get is the equivalent of buying a first-class ticket on the Titanic, where you go down last as the ship sinks, but you do go down in the end.

    We need to be able to measure performance so that we can identify what is wrong and see how we are going with fixing it.

    Commenter
    Lance
    Date and time
    March 28, 2014, 8:04AM
    • Lance, "start looking at why we are dropping down the OECD ladder" I would suggest that the reason for this is the poor quality of teachers in public schools and parents who are not interested in what their kids are doing.
      If we tripled the funding to public schools the only people to benefit would be the teachers who would go on strike for more pay.

      Commenter
      Thepres
      Date and time
      March 28, 2014, 10:52AM
    • Lance & Thepres. The downward spiral in OECD results coincided with the Howard government's massive increase of funding to the private sector at the cost of the public sector. Many (conservative and progressive) analyses have shown this result and the Gonski analysis & report was meant to address it. When we keep fiddling around the edges (as we have been ever since Howard), we will not have an increase in any OECD ranking (or NAPLAN data).

      Commenter
      JH
      Date and time
      March 28, 2014, 11:30AM
    • @Thepres, I agree with you it is mostly about the teacher. 8-9 out of 10 adults say they hate maths. When asked to pinpoint a time they felt they began to really hate maths, they point to grades 4 and 5. Further research showed the primary teachers who teach the entire curriculum, including maths, fit the same statistic for hating maths as the general adult population. So 8-9 out of 10 primary teachers also hate maths but are forced to teach it. The result is a lack of flexibility in teaching maths concepts because of the teacher's limitations. The answer is simple. As in many high-end private schools, state primary schools should hire qualified and passionate maths and science teachers if they want better results.

      Commenter
      DrPhil
      Date and time
      March 28, 2014, 12:14PM

Comments are now closed