The double-edged sword of AI-driven medical records

21 Dec 2023

Anthony Mennillo, Head of Legal Services

Artificial intelligence (commonly referred to as AI) seems to have sprung upon the world and then advanced in leaps and bounds.
 
The capabilities of AI appear to be limited only by the creativity of the designer.  AI-based services like ChatGPT have already revolutionised many aspects of the way we live our lives.
 
In medical circles, three AI-based medical consult recording tools have been launched within a very short space of time, and with great excitement.
 
The first is ConsultNote.AI, an Australian-based product developed by two GPs.  The second is US-based HealthScribe, a product developed by the global giant, Amazon.  The third is mAIscribe, developed by a Melbourne-based GP and his lawyer wife.
 
All three products claim to do the same thing: automatically generate a medical progress note from an audio recording of the consultation.
 
ConsultNote.AI has been described as a “more pivotal revolution than when computers came into medicine in the ’80s and ’90s”1 and could well be a game changer for the medical profession.
 
Based on the video demonstration of ConsultNote.AI, the cause for excitement seems justified.
 
However, like any shiny new toy, in the wrong hands or used for improper purposes, this valuable tool could create more problems than it solves.
 
This article aims to highlight the pros and cons of AI-based medical recording software.

The benefits
Clearly, a program that accurately summarises a consult without the practitioner lifting a finger to type or write a note offers significant benefits for the doctor/patient relationship as well as for the medico-legal aspects of medical practice.
 
The practitioner’s focus can be entirely on the patient, without their attention being diverted to writing or typing notes.  The fear that something important said by the patient will go unrecorded or be overlooked is dispelled.
 
From a medico-legal perspective, the quality of the records (as demonstrated in the ConsultNote.AI video) is far superior to the record keeping of the majority of practitioners in Australia.
 
There is little doubt that the AI-generated record would satisfy clause 10.5 of the Board’s Code of Conduct, which requires practitioners to keep accurate, up-to-date and legible records that report relevant details of clinical history, clinical findings, investigations, diagnosis, information given to patients, medication, referral and other management in a form that can be understood by other health practitioners, that are contemporaneous, and that are sufficient to facilitate continuity of patient care.
 
From a Medicare compliance perspective, the AI-generated notes would very likely meet the descriptors of the Medicare Benefits Schedule as well as the requirements of the Health Insurance Regulations, under which each entry must:
  1. Include the date on which the service was rendered or initiated;
  2. Provide sufficient clinical information to explain the service; and
  3. Be completed at the time, or as soon as practicable after, the service was rendered or initiated.2
This is a particular area of vulnerability for many medical practitioners who find themselves the subject of a Medicare review and/or Professional Services Review.
 
There is also talk that an AI-based medical recording tool could start thinking for itself and write referrals and order investigations (blood tests, radiology) by reference to the records, identifying precisely the right investigation required and prompting appropriate follow-up.  (This is very likely a pitfall as well as a potential benefit.)

The pitfalls
While the benefits of AI-based medical recording programs are genuinely exciting, practitioners still need to proceed with caution.  The following are just some of the issues that need to be explored and addressed before any medical records AI tool can be successfully implemented:
  1. Privacy:

    1. Where and how the recording and summary of the consultation are stored, and the security of that storage

    2. If storage of the data involves an overseas-based service, whether privacy obligations (including disclosure to and consent of patients) have been met

    3. If and how the recording and transcript of the consultation are retained
    4. Whether the recording forms part of the patient record and, if so, whether the patient is entitled to a copy of it on request.
  2. Consent:

    1. From patients to enable a consultation to be recorded. Subject to some limited exceptions, it is illegal in Australia to make a recording of a conversation using a listening device without the express permission of all parties involved.

  3. Clinical:

    1. Over-reliance on the AI-produced summary. While it may be tempting for practitioners to rely entirely upon the summary produced by the AI tool, the practitioner is ultimately responsible for the final note produced, and it is therefore crucial to check the content of the note to ensure that it is accurate and contains the relevant detail

    2. Whether the AI program is sophisticated enough to identify and interpret the spoken word where English is not the first language of the patient and/or the practitioner

    3. AI-generated referrals for further investigations:
      1. The practitioner remains responsible for any referral generated, including whether it was clinically indicated, and for any follow-up to ensure the investigation is performed and discussed with the patient (where required)
      2. ACCC considerations. We all recall the legal hot water HealthEngine found itself in for misusing patient data3, and it is not too much of a stretch of the imagination to think of the marketing arrangements companies could use to put a preferred pathology company, radiology company or any other third-party referral at the forefront of an AI-based referral tool.
This article is not intended to provide answers to these issues and most of the answers cannot be known at this early stage of development.

Where to from here
MIGA endorses the AMA’s submission to the Department of Industry, Science and Resources calling for a common set of legislative principles to establish a compliance basis for all individuals involved in the use of AI that ensure:
  • Safety and quality of care provided to patients
  • Patient data privacy and protection
  • Appropriate application of medical ethics
  • Ensuring equity of access and equity of outcomes through elimination of bias
  • Transparency in how algorithms used by AI and ADM tools are developed and applied; and
  • That the final decision on treatment should always rest with the patient and the medical professional, while at the same time recognising the instances where responsibility will have to be shared between the AI (manufacturers), the medical professionals and service providers (hospitals or medical practices).4
There is no doubt that the AI-based medical recording tools referred to in this article (and the many that will follow) are a very exciting leap forward for the medical community.  However, practitioners should proceed with caution before adopting such software and ensure that the AI ‘t’s are crossed and the ‘i’s dotted.

With appropriate regulation in place, the opportunities provided by AI-based tools are endless and exciting.
 

1 Former RACGP presidential candidate launches AI software to reduce ‘cognitive burden’ on GPs, Australian Doctor, July 2023
2 Regulation 6 of the Health Insurance (Professional Services Review Scheme) Regulations 2019 - standards of adequate and contemporaneous records
3 https://www.accc.gov.au/media-release/healthengine-in-court-for-allegedly-misusing-patient-data-and-manipulating-reviews
4 https://www.ama.com.au/media/healthcare-sector-approach-ai-required#:~:text=In%20a%20submission%20to%20a,developers%20and%20no%20real%20governance