Ssssh, don’t mention the AI



AI doesn’t vote (yet), so perhaps those in charge of health, disability and aged care should think a little more about the optics of their ‘efficiencies’.


There was a point on Wednesday afternoon when I was scratching around for a fresh angle on HSD’s coverage of the Integrated Assessment Tool and its “not-AI” black box, when an unexpected white knight rode in.

“I’m just wondering if you could briefly explain exactly how does the IAT determine the support level assigned to an older Australian,” came the dulcet tones of Senator Anne Ruston at Senate estimates.

I could have kissed her, readers. Seriously.

You can read our coverage of Senator Ruston’s grilling of DoHDA officials Blair Comley and Greg Pugh here.

Senator Ruston’s questions were good news on a number of levels, quite apart from the entertainment of watching the bureaucrats’ squeaky-bum moment.

For a start, it was a sign that the concern from providers, assessors, and client advocates is actually getting through to the people who can do something about it.

Second, it forced the Department to say publicly what we’ve known all along – humans have been locked out of a vital part of an older person’s chances of getting the care they need, even if the officials did prevaricate like the professional bureaucrats they are.

Third, it forced the Department to say that it is monitoring the way the IAT works and the results it’s pumping out, and to encourage people with concerns to get in touch.

“We’ve got staff that undertake in-depth desktop reviews,” said Mr Pugh, the first assistant secretary for access and home support.

“They also do some comparative analysis of aged care assessments across all assessment organisations, and we are doing that to make sure that there are consistent outcomes and ensuring there is continuous improvement in quality of practice.

“We do have an intention to regularly review the IAT classification algorithm. We will make requirements if we are seeing that it is not operating as intended, and we’ve been quite open with our state and territory and also our private assessment organisation partners in saying, if you are seeing issues where an adverse outcome is being made from the IAT, please let us know.

“We have started to receive some case studies that have been sent through, and all of that will factor into our review processes.”

Those are public utterances we can hold the Department to in the future. Will they change the Aged Care Rules and reinstate the clinical assessors’ ability to override the IAT?

Time will tell, but in the meantime, I encourage anyone with a beef to bombard Mr Pugh with emails.

What else have I learned this week?

DoHDA is incredibly sensitive about the use of the term “AI”, and given the Robodebt scandal, that’s not surprising.

Department secretary Blair Comley was very quick to intervene at estimates when he sensed the possibility that Senator Ruston might utter “AI”.

“Algorithm means many things to many people,” he said.

“The process takes those inputs, which is from the human assessor, then allows that to be used consistently across the program.

“I just don’t think there should be any impression that there’s just an automated process that doesn’t have human intervention forming assessments on the way through, and those assessments by the assessor can also take into account clinical judgement at the time to say which category of need people should sit within.”

But the fact is, this Department and its minister, Mark Butler, are hot for using algorithms to cut humans out of the process of assessing older people, and, it turns out, people with disability, for the care they need to live lives of dignity and quality.

Maybe Mr Butler and his colleagues genuinely believe the use of not-AI-just-an-algorithm will improve care quality and access.

Or maybe, they’re just trying to improve “efficiency”.

Or maybe, they’re just trying to save money.

Where else can we expect them to insert a bit of an AI-lgorithm?

Scope of practice is a big issue. Perhaps pharmacists and prescribing nurses will be able to get reassurance that they’re not making huge mistakes by consulting with a computer-generated decision-making assistance tool instead of, say, a GP?

What about triage in emergency departments? Maybe we can get a “decision tree” or a touchscreen with a picture of a human face to decide who needs to see a doctor now, and who stays in the waiting room for the next eight hours?

Are those possibilities, Mr Butler? Are those things on your agenda?

All these weasel words and all this nitpicking about the difference between AI and algorithms and decision trees and inputs and outputs are meaningless to the people most affected.

The bottom line is the Australian public don’t trust a computer, or a black box, or a non-human process or whatever you want to call it, to determine their quality of life and how much it’s going to cost them.

If Mr Butler and the DoHDA want acceptance of this kind of “efficiency” they need to learn to be better at the optics, the transparency and the communication.

Right now, they just look fidgety and shifty.

Perhaps they should remember a simple truth – AI doesn’t vote. Humans do.
