Doctors warned by MDU against using AI to draft complaint responses

Despite the allure of artificial intelligence for making everyday tasks easier, the Medical Defence Union (MDU) has warned its members against using AI technology to draft medical complaint responses.

In the latest issue of the MDU journal, the MDU explains that some doctors are turning to AI programs to help draft complaint responses. However, the leading medical defence organisation warned that doing so increases the risk of a response being inaccurate or perceived as insincere. Using AI in this way also raises issues of confidentiality and data protection.

Dr Ellie Mein, MDU medico-legal adviser, said:

“In the face of increased complaints and immense pressure on the health service, it’s only natural for healthcare professionals to want to find ways to work smarter. There are many ways in which AI technology is being used to improve the quality of patient care, such as in health screening. But when responding to patient concerns, there is no substitute for the human touch.

“That’s not to say that AI can’t act as a prompt to get you started but it’s vital that patient complaints are responded to in a suitably authentic and reflective manner.

“There have been cases where recipients who were suspicious of the wording in a complaint response were able to reproduce the same text by asking AI to draft a similar letter. Would you feel comfortable in this scenario and would the patient feel you had taken their complaint seriously?”

The MDU points out that the risks of using AI to draft a complaint response include:

  • Inaccuracy – AI-drafted responses may sound plausible but can contain inaccurate information, or use language or law from the tool’s country of origin (often the USA) rather than the UK. An example is the use of the word ‘plaintiff’ rather than ‘claimant’.
  • Confidentiality – The medical history set out in a complaint response is likely to be unique to that patient, and such confidential information must not be disclosed in an AI prompt. Patients need to know how their data will be processed, and data protection laws may prevent the transfer of data outside the UK.
  • False apologies – Generalised wording that AI often generates, such as ‘I am sorry you feel your care was poor’, is unlikely to address a patient’s concerns and may inflame the situation. Apologies need to be specific and genuine.
  • Omitting key information – This can include the offer of a meeting or the complainant’s right to refer the matter to the Ombudsman.
  • Inability to reflect – Reflection on the concerns raised is a necessary part of a complaint response, so outsourcing it to AI defeats that purpose.
