A sound plan can be made obsolete through no failure of the team that wrote it. Fixed timelines assume a stable world. Health tech has never operated in one.
There is a lot written about burnout in healthcare – clinicians drowning in documentation, nurses running beyond safe ratios, GPs with 15-minute slots trying to do 30 minutes of care.
There is almost nothing written about the people building the systems those clinicians depend on.
That silence has a cost – maybe a quality cost, maybe an innovation cost. Maybe the cost shows up in the experience – in software that works but never quite gets better, because the people who could see how to improve it were too depleted to try.
Burning out in health tech is not the same as burning out in any other software sector. The burden is specific, compounding, and almost entirely unacknowledged. Until we name it as a workplace safety issue – not a culture one – we won’t fix it.
And before anyone says it: this is not a resourcing problem.
You cannot hire your way out of a system that is structurally designed to consume you. Medical Software Industry Association CEO Emma Hossack told the Treasurer’s productivity roundtable that compliance and government approvals now consume 60–80% of team capacity in parts of this industry.
That is not a gap a few more engineers can close. It is the environment health tech teams operate in – and it shapes everything that follows.
The invisible roadmap
Health tech teams carry two jobs. The one on the roadmap, and the one that isn’t “sexy” to stakeholders and customers.
The one that isn’t sexy is ENORMOUS.
Maintaining integration points across a clinical platform – e-prescribing infrastructure, medicines databases, Medicare claiming, the Australian Immunisation Register, dispensing networks – is not a one-time build.
Every one of those systems changes. Claiming rules are updated. API specifications are versioned. Authentication requirements shift.
When any upstream system changes, every downstream vendor must respond. That work is invisible when done well and catastrophic when it isn’t. It almost never appears in a project plan.
The MSIA put it plainly in a recent briefing to government: vendors lack the bandwidth – people, cash, and sandboxed test environments – to fast-track implementation of evolving national standards. Seven million patient results a month currently fail to flow between medical clinics because legacy systems still can’t talk to each other.
That costs $420 million in repeat tests and preventable hospital admissions. And the people being asked to solve the integration problems are already stretched.
Then there are the working groups, standards bodies, government consultations, industry submissions. This work is not optional – because if you’re not in the room, the regulation gets written without you, the conformance requirement gets set without your input, and your team inherits the consequences.
It is never one meeting. It spans state and federal jurisdictions that frequently don’t align.
Take vaccination: the public health message is simple, but whether a vaccine is funded depends on the state, the year, the cohort, the program. That changes. The software has to be correct across all of it, in real time. That requires someone on your team following every consultation, every policy update, every jurisdictional variation – and translating it into product decisions.
That is not peripheral work.
This burnout is already visible in conformance discussions – vocalised by vendors who show up to argue the downstream effects of a rule before it gets set in stone, while simultaneously managing competing priorities across other government service updates.
It is never one change in isolation. Medicare, PBS, AIR, Drug Information, NDIS – the reform calendar doesn’t pause. And every update to every one of those systems lands on the same teams, at the same time, alongside everything else … because once it’s written, it’s inherited by every product that touches it.
And it is not a single person’s job. It takes a team to track, interpret, and translate what a conformance decision means in practice. When that team is depleted, or simply not in the room, the rules get written without them. Years of consequences, locked in, with no one there to say what it actually does to the people building on top of it.
Then there’s cybersecurity – regular penetration testing, vulnerability remediation, security audits. In healthcare, patient safety and regulatory obligations demand it. It doesn’t make the roadmap because it’s not a feature. But it happens every cycle, whether it was planned for or not.
None of this is optional. All of it is invisible in planning. All of it lands on the same people.
The customer reality no other sector faces
Your customers are not in a neutral emotional state when they contact you.
The clinicians using your platform are already at capacity – running a dispensing queue, running a practice, trying to give adequate time to a patient who needs more than the appointment allows.
When your software creates friction, they have nothing left for it. The tolerance is zero because everything else in their day already consumed it.
The patients using your product are often unwell, frightened, or navigating a system that already feels overwhelming. When the experience is confusing or broken, that lands differently than a frustrated user of a travel booking app.
And then there are the timeline conversations.
“You said it would be ready by X.” “Why isn’t it done?” Repeated emails. Aggressive follow-ups. Every request urgent. Every opportunity a wedge. Healthcare professionals – who understand better than most what burnout looks like – directing sustained frustration at software teams for delays caused by the exact complexity described above.
The team that moved the “targeted” date did it because the alternative was shipping something unsafe. That nuance rarely survives the inbox.
And here’s the thing – the moment you mention clinical or patient safety, the room goes quiet. For about an hour. Then the next urgent request arrives.
This is a regular occupational hazard in health tech teams. It is almost never named as one. And it accumulates.
There is also something standard UX and product training simply does not prepare you for.
Every course will tell you to test early, pilot with real users, iterate. Good advice – except your user is a pharmacist mid-dispensing queue, or a doctor running behind schedule with an emergency about to walk in.
A scheduled testing session still happens in real workflow, which means any mistake or delay has consequences not just for the research, but potentially for a patient in the room. Even a confirmed session gets abandoned. The feedback you get is partial, rushed, or simply never comes.
And the software itself has to mirror the full weight of that job – not just the care moment, but the funding and administrative burden alongside it – government services, claiming workflows, documentation requirements.
Clinicians don’t get to do only the clinical part. The system requires all of it. The software has to reflect all of it. That scope is almost always underestimated.
Psychological safety is not a ping-pong table
The wellbeing conversation in health tech gets flattened into culture perks – wellness budgets, Flexible Fridays, the proverbial ping-pong table, kombucha on tap, a bring-your-dog-to-work policy, a meditation app subscription nobody uses after February.
None of these is psychological safety.
Psychological safety – in the formal sense defined by Amy Edmondson’s research and recognised by Safe Work Australia as a genuine work health and safety matter – is the ability to raise a concern, name a problem, say “this timeline isn’t realistic”, without being punished for it.
It is the structural condition that allows a team to function honestly under pressure. (Thanks to the team at Mibo for this framing – doing important work in this space.)
In health tech it is particularly hard to build. The stakes feel too high to slow down. But the change management cost of not slowing down is higher.
And timelines are the mechanism that destroys it. Once a date has been committed externally, saying “this isn’t realistic” means arguing against a promise already made. That is a structurally silencing position. The timeline doesn’t just create overwork – it actively suppresses the people best placed to see the problem earliest.
Software is genuinely hard to deliver to a fixed date. This is not a new insight – the industry has known it for decades. Scope can flex. Quality can flex. Budget often doesn't flex. Time is the variable the market wants fixed, and the one the work resists most.
In health tech, every hidden dependency amplifies that resistance: integration changes, regulatory shifts, legacy systems, the irreducible pace of clinical change management. Committing to a fixed date before the scope is understood is a psychological safety risk dressed up as a delivery plan.
When people cannot speak up, problems don’t get solved – they get absorbed. Longer hours, quieter corners cut, people depleting themselves to keep up appearances.
Let’s stop punishing transparency
We exist to make healthcare better. That purpose is worth protecting.
Three things I’d like to leave the industry with:
Timeline movement is a safety call
When a health tech team moves a date, it is almost always because shipping on time would have meant shipping something unsafe. Frustration is understandable. But harassment is never acceptable.
The team moving the date is acting in the interest of patient safety – and of reducing friction. Friction for the clinician has a way of reaching the patient: a clunky workflow, a missed step, a moment where the software gets in the way of the care.
What the industry needs to work on is educating healthcare professional users on what a timeline change actually means – the vendor is protecting them, and that deserves a different kind of trust in return.
The standard playbook of transparent progress updates doesn’t land the same way when the audience is already at capacity. That requires a different kind of communication on both sides.
Even perfect planning breaks
Regulations shift. People move on. Technologies change. Commercial priorities pivot.
A plan that was sound six months ago can be genuinely obsolete through no failure of the team that wrote it. Fixed timelines assume a stable world.
Health tech has never operated in one.
We have almost no research on this
We have robust work on burnout in clinical roles. We have almost nothing on what breaks the people building the systems those clinicians depend on.
The hidden maintenance load, the compliance hours, the emotional charge, the sustained aggression from stressed users, the timeline that silences the person who can see the problem – these are specific, compounding conditions. They deserve to be studied with the same rigour we apply to clinical burnout.
If you’re living this – or leading a team that is – I’d like to hear about it.
Mina Giang is co-founder and head of product and experiences at Oexa, a Brisbane-based digital health technology company which created the Scripty app, a consumer-focused, free digital wallet that integrates with the national Active Script List.
This article was first published on Ms Giang's LinkedIn feed.



