Lewis & Tompkins, P.C. | Maryland | Virginia | Washington, D.C.


Medical malpractice, the use of AI and physician liability

Jan 19, 2021 | Medical Malpractice

Medical advancements are always occurring. They not only help patients in the Washington, D.C. metro area and elsewhere, but they also expand the abilities of medical professionals. Whether through quicker diagnoses, better treatment plans or higher-quality medical devices and equipment, medical innovation seems to be a very positive part of society. Nonetheless, these advancements can shift work from human medical professionals onto technology and artificial intelligence. Thus, when a patient suffers harm, it is natural to question whether that reliance played a role.

AI and medical malpractice

Using artificial intelligence for advice or as an aid while treating a patient may seem both beneficial and risky. In the event of a medical mistake, one would think that a doctor who relied on AI to make a medical decision or carry out a treatment would be held accountable for that choice. However, a study that surveyed potential jury members for a medical malpractice case suggests the opposite mindset. The study sought to better understand how others view liability when a physician relies on an AI tool and something goes wrong.

The study included 2,000 participants, each of whom was given four scenarios. The scenarios varied the AI recommendation between a standard and a nonstandard drug dosage, and the physician's decision to either accept or reject that recommendation. In all four scenarios, the physician's decision caused harm to the patient.


The study found that participants evaluated the physician's liability based on whether the treatment decision was one that could have been made by most physicians and by a reasonable physician. Higher scores on these measures meant lower liability. In other words, liability turned on two factors: first, whether the treatment provided was standard, and second, whether the physician decided to follow the AI recommendation.

While this study suggests that the threat of legal liability for accepting an AI's recommendation may not be as great as physicians might fear, it does not mean errors and mistakes cannot occur when utilizing or even relying on AI.

The last thing a patient and their loved ones expect is to end up in a worse medical condition than when they sought care or treatment. Thus, when a patient suffers harm or death due to medical negligence, it is important that they consider their rights and options, such as filing a medical malpractice action. This civil suit could help assign liability and help offset the damages and losses caused by the medical mistake.