Fact checked by Shenaz Bagha


August 26, 2024
1 min read

‘Push back hard’: Liability negotiation critical when contracting with AI companies


As health care providers contract with artificial intelligence services, the management of liability and risk is a key concern to keep in mind, according to a presenter at the 2024 Association of Women in Rheumatology annual conference.

“This is a highly dynamic environment, and it is absolutely worth staying on top of, because it will impact patient care,” Chris Dwight, a partner at Poyner Spruill law firm, based in North Carolina, told attendees during the meeting. “It will impact clinical practice — because it does right now, and it will only continue to do so.”

“This is a highly dynamic environment, and it is absolutely worth staying on top of, because it will impact patient care,” Chris Dwight told attendees. Image: Adobe Stock

When health care providers or practices enter business relationships with AI service providers — for example, to perform AI-powered medical image analysis — the AI company is likely to be the one providing the contract, Dwight said. A key item to watch out for in the contract is who will be liable for mistakes made by the AI system.

“It is going to be critical that you understand and heavily negotiate the allocation of risk between the parties,” Dwight said.

He additionally advised practices to “push back hard and to require the AI provider to indemnify and hold the practice harmless if, in fact, there are errors in what the AI system produces.”

Dwight highlighted the potential dangers of AI “hallucinations,” a term for when a language model outputs false or misleading information.

“If that hallucination makes its way into a patient’s care, and there’s liability that results from that, your medical malpractice carrier is certainly going to want to know that there is a third party, in the AI provider, who can potentially be there to help make the practice whole,” he said.

Dwight also warned that AI service providers may try to cap recoverable damages at the amount of fees the practice has paid. That amount is “probably not insignificant,” he said, but it likely would only begin to cover legal fees in the event of a malpractice suit.

“We’d encourage you, in that setting, to look closely at the clause pertaining to limitation of liability to ensure that the AI provider has the obligation to come up with more than just a pittance if it is obligated to indemnify the practice,” he said.