
Needs-Based Funding isn’t the Answer

Ken Gannicott

Jun 20 2017


It would be a relief to be able to dismiss the latest evidence (March 2017) of Australia’s declining school performance as nothing more than fake news. Unfortunately, the facts are all too clear. Over recent years a stream of evidence from both NAPLAN scores and international assessments has confirmed that Australian students are being out-performed by their peers in many other countries. Those international “league tables” may not matter much, but Australia’s performance is also falling in an absolute sense (which matters a lot).

The evidence has been so consistent that there are even signs of “performance fatigue”. International results published in late 2016 prompted considerable shock that Australia was doing worse than Kazakhstan. Jokes about Sacha Baron Cohen’s Borat may have been predictable, but they guaranteed wide media coverage of our declining standards. Commentary about the latest reports from the Australian Curriculum, Assessment and Reporting Authority (ACARA), together with the full evidence from last year’s Programme for International Student Assessment (PISA) and Trends in International Mathematics and Science Study (TIMSS), has, with rare exceptions, taken on a noticeable air of weary resignation and “here we go again”. If we are to move beyond the hand-wringing, what is to be done to raise performance?

 

Social disadvantage and school performance

The dominant theme in the post-Gonski years is that underperformance is largely the result of socio-economic disadvantage. It is a matter of common sense that students who experience social disadvantage such as low parental income and occupation, disability, remote location, or lack of English proficiency are likely to be more expensive to educate than others. Stripped to the basics, this is what Gonski was all about. Its central recommendation for “needs-based funding”—described by the Grattan Institute as the “Holy Grail”—has become an article of faith among educators. Its essential element is that public funding should be allocated to each school on the basis of measured need—where “need” includes not just a resource standard to cover general recurrent funding, but also a loading (where necessary) based on various types of socio-economic disadvantage.

Studies of Australia’s secondary schools show a clear correlation between socio-economic advantage and academic performance. Primary schools show a similar pattern. Such a pattern is not unusual. What is unusual is that the gap between Australia’s highest- and lowest-performing students is larger than in many other OECD countries.

There is much in the “needs-based” line of argument that can be readily accepted, not least because at the conceptual level it contains nothing new. In all the millions of words written about the Gonski Report, it is often forgotten that pre-Gonski funding for both government and non-government schools had long been based on a combination of resource costs and loadings for socio-economic disadvantage. All states incorporated a needs-based component into their budgetary formula for government schools; all states based their support for non-government schools at least partly on various measures of students’ special needs.


While the idea had been accepted and implemented in Australia for many years, the pre-Gonski system had become clunky, lacked transparency, and owed more to historical arrangements than to a fresh appraisal of social disadvantage. Support for non-government schools was complex, politicised and often inequitable. The case for reforming a formidably intricate system was strong.

Despite the acknowledged structural deficiencies in the pre-Gonski system, it is quite wrong to suggest that it was failing to support disadvantaged schools. Needs-based funding was still getting through to the target schools. In 2013, before any Gonski funding became available, primary schools in the lowest 25 per cent of the socio-economic ranking received average funding of $14,301 per student. The remaining primary students, in the top 75 per cent of the socio-economic scale, were funded at $11,938 per student, a differential of 20 per cent. At the same time, performance remained stubbornly low. Between 2006 and 2015, when public allocations to schools already incorporated elements of needs funding, inflation-adjusted public expenditure per student increased, yet mean PISA scores in reading, maths and science all declined.

The fact that Australia already had in place a “simplified Gonski” going back almost to the Karmel Report of 1973, with little or no measurable improvement in student performance, should have rung alarm bells for the Gonski panel. It is true that Gonski envisaged a bigger role than previously for the “needs” component of each school’s funding package, but offered no evidence other than purely indicative estimates ranging up to 100 per cent of the basic resource standard—and even more in the case of multiple sources of disadvantage. In the absence of serious evidence, simply upping the ante on loadings for disadvantage (something which contributed to the report’s demise when faced with the brutal reality of public expenditure constraints) is not conceptually persuasive.

 

Why needs-based funding does not work

If it is intuitively obvious that disadvantaged students may be more costly to educate, and Australia has, however imperfectly, built such needs into its public funding for many years, why isn’t needs-based funding delivering the goods?

The first reason is that social disadvantage is not the whole explanation. Many socially advantaged schools perform poorly in NAPLAN. There are also plenty of schools that are around average on the disadvantage scale but perform better than the secondary NAPLAN average. These substantial variations mean that if all you know about a school is its Index of Social Advantage, you will not do a statistically reliable job of predicting its academic performance.

Our PM begs to differ: “The Turnbull Government will introduce real needs-based school funding, and increase investment as part of a new initiative that will give Australian students the quality education they deserve.”

Helping disadvantaged schools meet their genuine additional costs is one thing. Moving to a more emphatic needs-based system in the expectation that this will improve performance is likely to result in disappointment. We’ve got to search for other explanatory factors. When social advantage accounts for only 53 per cent of variation in performance, as at present, we’re tackling barely half the problem.

The second limitation of needs-based funding is that poor performance is not confined to the socially disadvantaged. Unfortunately, Australia’s performance right across the system gives cause for concern, including even our best students. The proportion of Australian students in the top categories of PISA mathematics fell from 20 per cent in 2003 to 15 per cent in 2012. In reading, the proportion of our top performers fell from 18 per cent to 13 per cent over the same period. Latest results from TIMSS show there has been very little change in Australian student achievement since 1995, and PISA shows a decline in more recent years. Meanwhile, many other countries have continued to improve, so that we are now significantly out-performed pretty much across the board.

This is a far cry from claims that the problem is mostly one of social disadvantage. It would be more accurate to say that performance in socially disadvantaged schools is a subset of a system-wide issue of poor performance in Australia. This means we must search for solutions that tackle performance right across our school system. Needs-based funding addresses, at best, only one component of the performance story; it is irrelevant for the majority of schools which do not have significant elements of social disadvantage.

 

Funding for performance: switching from inputs to outcomes

Perhaps the most serious concern about needs-based funding is that there is no guarantee it will be spent on performance-enhancing activities. This was fully acknowledged by Gonski, though without any exposition of what those activities might be. The money is likely to be spent in the honest belief that it is being put to educationally effective use, but the problem is that much of what was considered good educational practice in the past is now known to be ineffective and must be set aside and replaced. Needs-based funding is not exempt from this rule. Even targeting or earmarking the money is not sufficient if the targets themselves are not best practice in terms of enhancing performance.

It is a fair supposition that much of Australia’s poor performance in the last twenty years can be sheeted home to activities now known to produce little or no improvement in student performance. Among these activities are the training of more teachers, even at the cost of driving down entrance requirements; smaller classes; inadequacies in teaching, such as out-of-date notions of constructivist learning; the persistence of “whole language” in teaching reading, despite an accumulated mountain of evidence on the role of phonics; a curriculum which could be kindly described as a hodge-podge of fashionable fads; and low expectations of students. Nor should we forget, in the light of recent evidence, that Australian classrooms are among the most disruptive in the OECD, a problem not confined to poor schools. Needs-based funding, whatever its amount, will be ineffective if it is spent on activities that do nothing to improve performance.

What we need is a complete change of approach. Instead of providing funding for inputs and keeping our fingers crossed for improved performance, we should reserve a portion of school resources explicitly for performance improvement. The key elements of such a scheme would be:

1. Establish a “School Performance Board”, either as a new organisation or as a division of an existing agency (ACARA already has a pool of expertise on performance measurement) to receive applications from, and award grants to, schools to improve their performance. The scheme would be modelled on the Dawkins arrangements put in place for Australia’s universities after 1987 (and regularly fine-tuned in the intervening years). Research and research training are funded separately from undergraduate teaching; researchers have to apply for external block grants; and they are rated on their success in doing so. Just as in the case of university research, separate funding would signal the importance we attach to performance improvement in schools over and above routine recurrent operational expenditures.

2. The educational activities that qualify for a grant would be defined by the Performance Board. In recent years there has been what can reasonably be termed an explosion of studies taking an evidence-based view of “what works” in producing high student performance. That evidence is now well known to international agencies, many think-tanks, and even some education academics. It is becoming well understood in federal and state departments of education in Australia. For example, the New South Wales Centre for Education Statistics and Evaluation has set a high standard with its compilation of evidence-based practices for schools. (See Box 1 for a summary.) It is, however, fair to assume that few teachers will have been exposed to this evidence through their pre-service or in-service training. An urgent priority for the Performance Board would be to diffuse throughout schools, and to incorporate into professional development courses, a readily comprehensible summary of what does and does not work to improve performance.

3. The criteria on which grants could be applied for and awarded would be drawn from the evidence of Box 1 and similar sources. Schools would decide what they need to do to improve their performance in the light of the evidence, what it would cost, and the timing of their anticipated results. Schools would be responsible and accountable for achieving what they set out to do.

4. Consideration should be given to phasing out the direct funding of schools on the basis of social disadvantage as such: over time, they would be funded on the basis of what they propose to do about it educationally, and the performance outcomes they expect to achieve.

5. It does not have to be an overly bureaucratic procedure. With careful definition of “what works”, the procedure should be no more onerous than the existing requirement for each school to report its annual funding to ACARA.

 

Box 1: What works best in helping student performance?

 

High expectations. High expectations are linked with higher performance for all students. The reverse can also be true. Some students from disadvantaged backgrounds may be achieving less than their full potential due to lower expectations of their ability. A culture of high expectations needs to be supported by effective mechanisms, for example by curriculum differentiation.

Explicit teaching. Sometimes termed “direct instruction”, explicit teaching involves teachers clearly showing students what to do and how to do it, rather than having students discover or construct information for themselves. While some commentators dismiss this as rote learning, there is clear evidence that critical thinking processes depend on knowledge stored in long-term memory.

Effective feedback. Feedback is one of the most powerful influences on student achievement. Feedback that focuses on improving tasks, processes and student self-regulation usually has a positive effect.

Use of data to inform practice. Effective analysis of student data helps teachers identify areas in which students’ learning needs may require additional attention and development. High-quality assessment practice is crucial for analysis of student outcomes and wellbeing. Teachers need access to tools, skills and training to help them interpret and use this data effectively.

Classroom management. Data confirm a link between effective classroom management and student performance. Young teachers are likely to benefit from explicit support in developing effective classroom management strategies, so that they spend more time teaching and less time controlling students’ behaviour.

Wellbeing. Internationally, as well as in Australia, there is an increasing focus on student wellbeing, in recognition that the school years contribute to the development of the whole child, which in turn drives academic outcomes. Evidence suggests that higher levels of wellbeing are linked to higher academic achievement.

Collaboration. Teachers need to engage in professionalised collaboration that explicitly aims to improve teacher practices and student outcomes. A whole-of-school focus is needed to develop a culture of excellence.

Conclusion

Such a scheme is not without precedent. Worried by its poor school performance, as exemplified in dismal PISA results, Brazil revamped its funding and used its equivalent of NAPLAN to create an index for each primary school. The index traces each school’s trajectory from a 2005 baseline towards reaching the average PISA score by 2021, the year before the 200th anniversary of Brazilian independence. It is the responsibility of each school, working with its municipality and monitored by the state, to develop an improvement plan which addresses the challenges the school currently faces and which will generate progress at the required rate. Target and actual performance are compared, so that the public and parents, as well as school governance authorities, can see whether schools are meeting their targets.

If Kazakhstan can dramatically improve its performance, and Brazil can show the way in institutionalising performance improvement, can we afford to do any less?

Ken Gannicott was formerly Professor of Education and Head of the Graduate School of Education at the University of Wollongong. He now works as a private consultant.

 
