What does AI mean for digital healthcare liability?

Artificial intelligence and robotics are changing the face of healthcare, driving a digital revolution across the industry.


Innovative treatment options now make full use of the mind-boggling capabilities of AI, allowing an extensive range of conditions to be treated more easily, cheaply and quickly. For patients, the benefits of these new treatments are substantial. However, the rapid pace of change is proving problematic where healthcare liability policies are concerned.

We are now entering uncharted waters, and there’s a distinct lack of case law and precedent to guide decision-making in AI-centred healthcare claims. Many existing E&O policies aren’t adapting to new treatment provisions fast enough, leaving healthcare providers exposed to new risks.

AI treatments are becoming mainstream

Artificial intelligence is being put to work throughout the healthcare industry. The technology is already being used to power processes in hospitals, rehabilitation facilities, addiction treatment centres and GP surgeries.

The capabilities of AI are far-reaching. The technology can be used to support diagnostics, flag abnormal scans, reduce operational costs and automate time-consuming administrative tasks.

AI technology is even being used to provide assistance for patients in the form of healthcare chatbots, designed to answer common queries and take some of the pressure off medical staff.

The new risks of AI-powered healthcare

New technology means new ways of providing advice, delivering treatments and using data. Whilst each of these brings enhanced opportunities for healthcare providers and patients alike, each also brings a significant upsurge in the exposures to be considered.

No technology is completely fail-safe, and healthcare providers must be fully aware of all risks associated with their use of AI. Not only must those risks be identified; it’s also vital that providers understand where liability would lie in the event of an AI-related failure.

Globally, AI is raising profound questions around medical responsibility. Normally, when something goes wrong, the blame can be traced easily: a misdiagnosis can be traced back to the physician who made it, and a faulty medical device that gives an incorrect reading and harms a patient can be traced back to its manufacturer. For AI, however, it isn’t yet clear who would be held responsible.

What does the future hold for AI healthcare insurance?

The absence of case law around AI-powered healthcare claims may well lead to ambiguity as to which party is responsible when medical malpractice is alleged.

There is likely to be considerable confusion and ambiguity as to where liability sits when AI operates within healthcare systems. What may well have held true for a medical malpractice policy over the last 100 years might not hold for the next two decades, given the adoption of technology-enabled solutions within healthcare.

Preparing for AI exposures

The potential of AI-powered healthcare is staggering, but as every trailblazing new idea emerges, so too does a set of exposures that insurance providers are unlikely to have encountered before.

The key danger stems largely from this lack of precedent, which could well lead to problematic cases where liability isn’t entirely clear. Most existing E&O policies are unlikely to adapt to the emergence of new healthcare models as quickly as is necessary, and this could spell disaster for providers who find themselves liable for AI-related data breaches or malpractice claims.

CFC offers bespoke healthcare insurance products that provide cover for allied health professionals, eHealth and digital medicine, medical billings and other wellness and fitness organisations. Get in touch to learn more about our healthcare policies.