How the TGA can enable safe AI in healthcare: the clinician-in-the-loop exemption

Allowing AI-based SaMD to qualify for exemption where defined supervisory criteria consistent with the Clinician-in-the-Loop model are met would not weaken safety. It would formalise it.


Recently, after landing at Brisbane Airport, I switched on my Tesla’s AI-based supervised full self-driving mode for the 100km trip home.

For almost the entire journey I did not touch the pedals or steering wheel. Twice I intervened, not because it was unsafe, but because the system hesitated. A cabin camera monitored my face continuously. If I looked away, it warned me. If I ignored it, the system would disengage.

This technology has been operating in thousands of Australian vehicles for more than six months.

Tesla’s late 2025 US safety data reports roughly one crash per 10.2 million kilometres under supervised self-driving. The US national average sits closer to one crash per 1.1 million kilometres. On that comparison, supervised AI driving appears around nine times safer than the human average.

It complies with Australian law because I remain the legally responsible driver. The system assists; I carry the accountability.

In the world of AI, this is called Human in the Loop (HITL).

Healthcare’s inflection point

Healthcare now faces a similar moment.

AI tools are embedding themselves in clinical practice at speed. Recent commentary in Health Services Daily has examined the rapid uptake of AI scribes, now part of daily workflows despite regulatory ambiguity.

Others have analysed offshore consumer platforms presenting as “wellness” tools while offering increasingly clinical guidance.

The direction of travel is clear. Capability is advancing faster than regulatory interpretation.

The Therapeutic Goods Administration regulates software with a medical purpose as Software as a Medical Device. Since the 2021 amendments to the Therapeutic Goods Act 1989, providers must demonstrate conformity and secure inclusion on the Australian Register of Therapeutic Goods before lawful supply.

Assessment attaches to defined clinical functionality. Each feature with a medical purpose, and each substantive upgrade, is treated as a separate regulatory event.

On paper, this is defensible. In practice, it is slow.

As a practising dermatologist and CEO of Xestro, Australia’s leading cloud-only practice management system for private medical specialist practices, I can identify dozens of AI features that would materially improve safety and efficiency in clinical practice. Many are incremental. Some are transformative.

Yet industry discussions suggest a conformity assessment and ARTG inclusion cycle may take 12 to 16 months per feature. There is no clearly defined per-feature cost.

In a field evolving in quarters, not years, 12 to 16 months per iteration guarantees that some tools will be outdated before approval.

A sensible exemption mechanism

The legislation already contains a sensible exemption mechanism.

Broadly, certain software may fall outside full medical device regulation where it supports, but does not replace, clinical judgement and where a clinician can independently review the basis of a recommendation.

That principle aligns with most clinician-facing AI systems. They are designed to support workflows while leaving the final decision with the clinician.

The difficulty lies in interpretation.

The TGA has taken the position that AI-based SaMD is not eligible for that exemption. In January 2026, it reinforced that interpretation. The reasoning centres on transparency. If a system’s internal logic cannot be fully explained, making it a “black box”, the exemption is considered not to apply.

AI is therefore treated as categorically ineligible.

The transparency paradox

This creates a paradox. AI is excluded because its reasoning resembles human reasoning.

The human brain is itself opaque. We do not fully understand how cognition integrates symptoms, examination findings and investigations into a diagnosis. Clinicians cannot articulate the neural mechanisms underpinning their own judgement.

Yet we accept human reasoning because it operates within a framework of supervision, accountability and professional standards.

In daily practice, I do not dismiss the reasoning of a junior colleague or another specialist because their cognitive processes are not mechanistically transparent. I assess the conclusion, interrogate the evidence and decide whether to accept it. Responsibility rests with me.

If opacity alone disqualifies AI from exemption, consistency would require similar treatment of human judgement.

That is not workable.

The question is not whether reasoning is mechanistically explainable. The question is whether it is supervised, auditable and anchored to a responsible clinician.

The Clinician-in-the-Loop model

The Clinician-in-the-Loop model, effectively healthcare’s version of HITL, addresses that question directly.

Under the Clinician-in-the-Loop model:

  • AI outputs are clearly identified;
  • Clinicians can modify or delete suggestions;
  • Structured feedback can be provided;
  • Software providers log and respond to that feedback;
  • Systems record both the AI suggestion and the clinician’s final decision.

These safeguards are not theoretical. They are technically straightforward and consistent with existing healthcare software architecture. Electronic prescribing systems, My Health Record integration and state-based real-time prescription monitoring already rely on audit trails, authorship tracking and traceable decision points.
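
To illustrate how straightforward these safeguards are technically, here is a minimal sketch of an auditable decision record. The field names and structure are my own illustrative assumptions, not any vendor’s schema or a TGA requirement; the point is only that each decision point can capture the AI suggestion, the accountable clinician, and the final human decision.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class AiSuggestionRecord:
    """One auditable decision point: what the AI suggested,
    and what the supervising clinician did with it.
    Illustrative only; names are hypothetical."""
    suggestion_id: str
    clinician_id: str              # the accountable clinician
    ai_output: str                 # clearly identified as AI-generated
    model_version: str             # traceability across software upgrades
    created_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))
    final_decision: Optional[str] = None   # clinician may modify or delete
    accepted: Optional[bool] = None
    feedback: Optional[str] = None         # structured feedback to the provider

    def resolve(self, decision: str, accepted: bool,
                feedback: str = "") -> None:
        """Record the clinician's final decision alongside the suggestion,
        so both sides of the decision survive in the audit trail."""
        self.final_decision = decision
        self.accepted = accepted
        self.feedback = feedback

# Hypothetical usage: the AI proposes, the clinician disposes.
record = AiSuggestionRecord(
    suggestion_id="s-001",
    clinician_id="dr-smith",
    ai_output="Consider biopsy of lesion A",
    model_version="scribe-2.1",
)
record.resolve(decision="Biopsy arranged", accepted=True)
```

Nothing here is exotic: it is the same authorship-and-audit pattern already used in electronic prescribing and My Health Record integrations.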

This is evolutionary, not radical.

Allowing AI-based SaMD to qualify for exemption where defined supervisory criteria consistent with the Clinician-in-the-Loop model are met would not weaken safety. It would formalise it.

It would also bring tools already embedded in practice into clearer oversight.

AI scribes are widely used across Australia. Attempting retrospective prohibition would be impractical and disruptive. A Clinician-in-the-Loop exemption would permit enforceable standards without forcing tools into grey zones.

There is also a strategic dimension.

Offshore AI platforms are actively targeting Australian clinicians and patients. If domestic regulation is so slow and rigid that local providers cannot compete, the market will not pause. Capability will simply be imported. That weakens regulatory leverage.

A supervision-based interpretation would preserve the TGA’s authority while aligning regulation with clinical reality.

The analogy with Tesla’s AI-based supervised full self-driving is imperfect but instructive. In road transport, we have accepted that assisted autonomy can improve outcomes while keeping legal responsibility with the human operator. The driver remains accountable; the system supports.

Healthcare can adopt the same principle.

AI does not need to replace clinicians to deliver benefit. It can reduce administrative burden, surface overlooked information and provide structured decision support. In a system under strain, incremental gains matter.

The safeguard is not complete algorithmic transparency at the level of model weights. It is clear disclosure, professional accountability, auditability and the ability to override.

The Therapeutic Goods Act already provides the legislative framework. What is required is interpretive adjustment.

If AI-based SaMD remains categorically excluded from exemption, Australia risks lagging in safe, supervised adoption. If, instead, exemption is permitted where Clinician-in-the-Loop safeguards are demonstrably present, innovation and accountability can coexist.

That is not regulatory retreat. It is regulatory adaptation.

The clinician remains at the wheel. The system assists. Responsibility stays where it belongs.

Disclosure: I used AI systems, including Claude Sonnet and Google Gemini, as structured thinking partners while drafting this piece. I have reviewed and edited the final text and take full responsibility for its content.

Dr Bert Pruim is an Australian dermatologist and CEO of Xestro, an Australian cloud-only practice management and electronic medical record platform for private medical specialist practices.
