**3. Results**

#### *3.1. Initial Efforts in Mental Health Information*

The 1992 National Mental Health Strategy, which included an overarching policy and a plan, had data and accountability at its heart (Box 1).

**Box 1.** Extract from 1992 National Mental Health Policy.

There needs to be greater accountability and visibility in reporting progress in implementing the new national approach to mental health services. Currently mental health data collection is inconsistent and would not be adequate to enable an assessment to be made of the relative stage of development of the Commonwealth and each State/Territory Government in achieving the objectives outlined in the National mental health policy. It is essential that such a consistent system of monitoring and accountability be created.

> *National Mental Health Policy (Commonwealth of Australia 1992)*

The aim of this novel approach to accountability for mental health was to report on the progress being made by governments against the Strategy's agreed goals.

The Australian Health Ministers' Council established a working group to oversee the implementation of the Strategy. The National Mental Health Working Group comprised representatives from each state and territory, two from the Federal government, and the chair and deputy chair of the newly established National Community Advisory Group, which included consumers and carers. This working group established a set of 49 indicators to fulfil the accountability monitoring function recommended in the policy.

However, the data required to report against many of these indicators either did not exist or were not collected. The working group established a Mental Health Information Strategy Sub-Committee (MHISSC) [11] with the same representation as the working group, plus representatives from the Australian Bureau of Statistics (ABS), the Australian Institute of Health and Welfare (AIHW) and the Australian Private Hospitals' Association. The MHISSC developed a National Mental Health Data Dictionary and Minimum Data Set for Australia.

The MHISSC oversaw the development of a specific new data collection process designed to fulfil the Working Group's mental health reporting obligations under the Policy. This was conducted outside the structures already established by the National Health Information Agreement, which provided the framework for establishing national data collections and data standards [12].

The Federal government engaged consultants to manage the process of collecting and analysing data, and then published a series of National Mental Health Reports [13] to draw together material from all jurisdictions, as well as the private sector.

After a baseline was established in 1993, the first report was published in 1994 [14]. By the time the Commonwealth decided to cease the series, twelve editions had been produced. The final National Mental Health Report (2013) used 18 graphs or tables to describe the pace of reform [13].

This report, produced separately from other existing health data and by external consultants, became the key tool by which the community could track changes in the shape and nature of mental health care. Drawing on the definition provided earlier, the National Mental Health Report series had a clear focus on political accountability, purporting to enable governments to answer the question "Did we do what we agreed?" [13].

Over time, the collection and report became more robust, with data elements incorporated into different national minimum datasets [15]. It reflected a strong focus on the role of the states and territories as the main providers of care, for example, in delivering the policy goal of 'mainstreaming' mental health services.

The reporting also had a heavy emphasis on financial accountability, as described earlier, covering inputs such as spending and staffing, as well as outputs and administrative data such as treatment days and the numbers of services and clients.

The collection was not designed to drive a process of systemic quality improvement, nor reflect perspectives on accountability held by mental health stakeholders, such as consumers or even health professionals. Stakeholders from across the mental health sector and outside of government would prioritise accountability issues and questions different to those selected by the government [16].

#### *3.2. Limited Aims, Limited Performance*

The pursuit of even this rather limited dataset was challenging enough: obtaining agreement on data collection standards and definitions between nine Australian jurisdictions is difficult, and the process requires consensus across governments [17].

MHISSC then had to oversee the process by which each government obtained, vetted and cleaned the necessary data. This governmental approval was a slow process, causing delays in publication. For example, the data published in the 2013 National Mental Health Report pertained to the 2010–2011 financial year. This lag has not improved: in 2022, the AIHW's Mental Health Services in Australia website [18], now the key data resource, was still only able to report mental health expenditure up to 2018–2019.

There was no independent verification of the data provided to the Report and, particularly in the first years, the quality and range of data varied between jurisdictions. There was no way to marry annual mental health budget allocations to the actual expenditure or to the costs of services. These matters limited the extent to which data could be usefully interpreted for benchmarking between jurisdictions.

The data were only published at the jurisdictional level (i.e., by state and territory). This could be useful, revealing how the shape and nature of the mental health services available differ between the states. For example, the 2013 National Report showed that Tasmania offered 19.5 beds per 100,000 inhabitants in residential mental health care settings, while Queensland provided zero. However, the Report had no capacity to provide data at more disaggregated levels, preventing a more detailed and regional comparison of service patterns or other issues [13].

The 1997 Evaluation of the first national mental health plan, while noting the role of the National Mental Health Report, stated:

*Information in mental health is grossly undeveloped. The lack of nationally comparable data on service outputs, costs, quality and outcomes places major limitations on the extent to which the National Mental Health Strategy can achieve its objectives.* [19]

An initial \$135 m investment made by the then Federal Government to sponsor reform and accountability under the First Plan was not replicated in subsequent plans [20].

Key proponents of the national reforms noted that, under the Second National Mental Health Plan, momentum "waned" [21].

A decade later, the 'summative' evaluation of the 3rd National Mental Health Plan (2003–2008) repeated concerns about national monitoring and reporting mechanisms, pointing to duplication, waste and an inability to measure appropriate outcomes [22].

These concerns about data and accountability processes in mental health were echoed in repeated statutory reports and inquiries [23,24]. A report jointly prepared by the Human Rights and Equal Opportunity Commission and the [then] Mental Health Council of Australia found:

*The National Mental Health Strategy was developed over a decade ago to respond to obvious service failures and human rights concerns* ... *we do not yet have a national process for translating the policy rhetoric into real increases in resources, enhanced service access, accepted service standards or service accountability.* [25]

#### *3.3. Fragmentation of Effort, Minimal Improvement*

The ownership of responsibility for national mental health reporting shifted in 2006 from health ministers to first ministers, with the Council of Australian Governments (CoAG) agreeing to a \$5.5 bn National Action Plan on Mental Health [26]. The rationale for the CoAG's involvement is not entirely clear. There were two damning inquiries which required some political response [23,25]. The CoAG itself reported that its engagement was based on "a broad recognition that renewed government effort was needed to give greater impetus to the reform process" [26]. The Action Plan brought together the heads of all governments to focus on mental health for the first time and included its own list of outcomes and progress measures.

The CoAG's list had greater emphasis on social indicators, such as employment and education, than the mental health service indicators prioritised by the MHISSC. It also reflected greater engagement by the Federal government in mental health service provision. The CoAG Action Plan generated progress reports, again designed for the government to fulfil a level of political accountability and demonstrate "Are we doing what we said we would?" [27].

Several other reports and inquiries into mental health emerged in quick succession, recommending changes to the way data are reported or even proposing new sets of indicators [24,28] (see Table A1 for a timeline). These recommendations were not actioned.

The process of providing national accountability oversight in mental health has become increasingly confused, with multiple overlapping initiatives, policies, plans and datasets. This has dramatically widened the gap between planning and reporting on the one hand, and actual action and monitoring of mental health on the other. Key processes identified as part of effective policy development and evaluation are missing [29].

The 2012 National Mental Health Roadmap, for example, listed 11 'performance' indicators and 3 'contextual' indicators [30]. The 4th National Mental Health Plan and associated Implementation and Measurement Strategies listed 25 indicators [31]. It continued the CoAG's emphasis on broader measures of the social determinants of mental health, promising a "whole of government approach" so that:

*The public is able to make informed judgements about the extent of mental health reform in Australia, including the progress of the fourth plan, and has confidence in the information available to make these judgements. Consumers and carers have access to information about the performance of services responsible for their care across the range of health quality domains and are able to compare these to national benchmarks.* [32]

The National Mental Health Commission began in 2012 and soon produced its own annual National Mental Health Report [33], drawing on frameworks, indicators, case studies and stories rather than reporting against a consistent dataset. In 2014, the Commission was tasked with a review of mental health programs and services and reported, in 2015, on a lack of outcome-based evaluation data and accountability mechanisms [34]. It recommended a focus on a much smaller number of indicators, focusing much more on outcomes than outputs, together with a transition to a much more regionally based system of planning and reporting. The Commission's recommendations remain unimplemented.

The impetus towards greater accountability in mental health in relation to its social determinants was affirmed in the 2014 strategic plan of the NSW Mental Health Commission, which reported that spending on mental health by the NSW Department of Family and Community Services was greater than that by the NSW Department of Health [35]. Accountability for health care alone cannot provide a true picture of mental health.

Despite this, the 5th National Mental Health and Suicide Prevention Plan [36] and its accompanying Implementation Plan (2017) [37] promised monitoring and reporting around a more limited set of 24 core health indicators, focusing on safety and quality.

This Plan promised to draw on proxy data to deal with social determinant issues as part of this, for example, using the Australian Bureau of Statistics General Social Survey to report the social participation of people with a mental illness.

Leaving aside issues such as resources or political will, the infrastructure to support good data collection in mental health has been slow to evolve. Several other countries have developed sophisticated maps [38], permitting benchmarking and the comparison of key mental health services between jurisdictions. Such maps are new to Australia and are not yet driving decision-making. Alternative classifications and structures, such as the Australian Classification of Health Interventions (ACHI), have been demonstrated to be less than comprehensive when applied to mental health [39].

The history of Australian efforts in relation to data collection and reporting has left us with at best a partial picture—strong in relation to health and administrative data, but weak in other areas, particularly outside of hospitals and in relation to the broader social determinants of mental health. It is a situation described as "outcome blind" [40].

#### *3.4. Other Key Reporting Mechanisms in Mental Health*

There are two other key sources of mental health data in Australia. Unlike the National Strategy reporting, both have demonstrated some consistency.

The Australian Institute of Health and Welfare (AIHW) has published the Mental Health Services in Australia (MHSIA) data series since 1998–1999 [18], drawing on the National Mental Health Data Dictionary and Minimum Data Set originally developed by the MHISSC.

Other national minimum data sets have since been developed and incorporated into MHSIA reporting.


In 2021, this array of data permitted the publication of 35 tables of information. The AIHW also holds and manages an 'indicator library' [41] from which it derived a set of 26 Key Performance Indicators, covering issues such as rates of seclusion and restraint, rates of access to mental health care, and community contact pre- and post-discharge [42]. The AIHW also managed the National Mental Health Performance Framework [43] until the cessation of the CoAG in 2020.

The Productivity Commission prepares the Report on Government Services which, for 25 years, has included a section on mental health services [44,45]. Around 60 tables of information are published each year online, providing data at the state and territory levels across 13 key indicators.

There is considerable overlap between the AIHW and Productivity Commission reporting: both provide data on public mental health services, expenditure, staffing and access. Additionally, both publications focus on the health service aspects of mental health care rather than the broader social determinants, using proxy data derived from general community surveys to estimate and report on matters such as housing and employment. Both suffer from considerable delays in publication. Both report progress at the jurisdictional level, permitting, for example, a comparison of the proportion of all mental health-related emergency department presentations in public hospitals between Western Australia and Tasmania. The work of the AIHW and the Productivity Commission in reporting mental health data, even at this level, is helpful but, as recommended by the Productivity Commission Review (see below), more useful comparisons need to be established between regions, not states [46]. This more granular approach reflects the fact that regions may have more in common, and provide more valid benchmarks, than whole jurisdictions such as Victoria and NSW.

#### *3.5. The Productivity Commission Review 2020*

The Productivity Commission's 2020 report found duplication and a lack of clarity in mental health reporting arrangements and called for all governments to agree on a new set of realistic measures and outcomes. It suggested a new framework with six key areas and 47 identified indicators [46]. This was echoed by the Victorian Royal Commission, which reported in 2021 that:

*System leadership is weak, and accountability for how the system is managed is unclear.* [47]

These findings are a strong indictment of the approach taken in Australia so far.

Under various reporting structures, the MHISSC operated continuously until the CoAG was disbanded in May 2020 in favour of new National Cabinet reporting arrangements. Thus far, these arrangements appear rudimentary.

Eleven general health issues are listed under a 'Performance Reporting Dashboard', of which only one pertains to mental health. However, rather than providing any data or indicators, what is presented is simply a list of some projects undertaken in each jurisdiction, each under a green tick symbol and the word "Achieved" [48].

The final National Mental Health Report was published in 2013. There have been no evaluations of either the 4th or 5th National Mental Health Plans and, as stated, no evaluation of the Strategy overall. Despite the regular calls for annual and transparent reporting and monitoring of progress, there is no current system or process for this to occur.

In 2021, the Federal Government released its response to the Productivity Commission report [49], undertaking with the states and territories to establish a new National Agreement on Mental Health and Suicide Prevention by November 2021.
