Heidi Health and Lyrebird Health are at the forefront of AI scribe software transforming Aussie GP clinics. Melbourne GP Dr Grant Blashki has used an AI scribe like a “medical intern” in every appointment for a year. The software listens to patients, writes notes, and suggests differential diagnoses.
He says it’s “mostly surprisingly accurate,” though sometimes it mishears names or diagnoses. Consent is mandatory, but most patients don’t mind.
“I do ask for consent. Occasionally, people don’t want me to use it, which is absolutely fine, but almost everyone is comfortable with it, and it just streamlines the work,” Dr Blashki said. “It’s good for the patient because I’m concentrating on them.”
“If I was going to forget my stethoscope or my scribe software, I’ll take the scribe software.
It’s such a part of my work now.”
The big question: How safe is all this patient data? Blashki says he deletes all transcriptions after each consultation to protect sensitive information. Heidi Health CEO Dr Thomas Kelly stressed the company follows the strict privacy rules of each market it operates in: the Australian Privacy Principles (APP) in Australia, GDPR in Europe and HIPAA in the US. It also meets the ISO 27001 and SOC 2 security standards and undergoes third-party audits.
“Heidi now supports almost two million visits a week, and that’s around the world, from Australia, New Zealand, Canada, to the US and the UK,” Dr Kelly said. “In each region, data is stored in compliance with the healthcare regulations and privacy policies of the region.”
Lyrebird Health, also Melbourne-based, reported 200,000 consults last week in Australia alone. CEO Kai Van Lieshout said all data for Australian users is stored in sovereign Australian databases and is deleted after seven days unless doctors back up notes.
“We have not been hacked before and that’s something that is incredibly important.”
“For us, it is definitely, really gone,” Van Lieshout said.
“I know that because we’ve had doctors that have needed something that we’ve had …
that don’t realise that it’s deleted after seven days and there’s nothing we can do.”
John Lalor, an IT professor at the University of Notre Dame, warns that digital health data always carries some risk. More data generally means better AI, but also greater privacy exposure in the event of a hack. He urges transparency from AI scribe firms so that doctors and patients understand how their data is used.
“A lot of those models, they’re very data-driven, so the more data they have, usually the better they get,” Lalor said.
“Making sure that the firms are clear with how exactly the data is being used, because if there’s ambiguity in what they say, then there could be ambiguity in the interpretation as well.”
Heidi Health’s scribe also offers “magic notes”: summarised clinical encounters with suggested diagnoses. Blashki cautions that it is a tool, not a diagnosis, and doctors must verify its output. Lyrebird Health sticks to documenting conversations without suggesting diagnoses.
“Heidi does not provide a differential diagnosis absent the clinician and it is still up to the clinician to review their documentation for accuracy,” Dr Kelly said.
“We won’t try to tell the clinician what to do, if that makes sense,” Van Lieshout added.
About half the GPs at Blashki’s Melbourne clinic now use AI scribes every visit. Former Victorian Chief Health Officer Brett Sutton calls the tools “indispensable” but flags patient data security as the biggest concern. He calls on regulators to ensure AI scribe data is as safe as traditional clinical notes.
“I think the regulators need to make sure that it’s safe,” Dr Sutton said.
“Obviously, the clinicians who are using it have a responsibility for sensitive health information
to be properly recorded and stored and made safe.”
For patients uneasy about AI scribes, the option to decline or ask questions remains key. AI scribe software is speeding up clinical note-taking, but the balance between privacy and utility remains under close watch.
Watch 7.30 on ABC iview for the full story.