AI bots democratize medicine

Artificial Intelligence-enabled systems that interpret radiologic studies are advancing to the point where they often match the best radiologists in the world.

And they are getting better every day. From CT scans to MRIs, angiograms to Doppler studies, the data captured is digital—and digital data can be read by machines.

Human doctors traditionally sit in dark rooms and interpret medical diagnostic images. A radiologist’s report is then given to the treating physician, who merges the objective digital image interpretations with the more subjective clinical presentation and exam. Often the radiologist reports more information than is relevant to the physician and patient; at other times, important information is missed. In a knee MRI, for example, degeneration of the meniscus may be reported, but this finding has no importance to a patient with bone-on-bone arthritis who needs a joint replacement.

As with genome interpretation, where data overload raises the risk of over-diagnosis, patients have generally not had direct access to their radiology images or reports. And assembling this data can take days, if not weeks.

This may soon change due to several parallel advances. AI bots will no longer be interpreting only isolated radiologic scans. Due to the wide availability of electronic medical records, a patient’s entire medical history will be available to these bots. In combination with data mining programs like LynxCare, all of the diagnostic information in a person’s medical record will be correlated with any symptom, exam finding, or lab result. And with the advance of do-it-yourself image uploading systems, any patient may be able to send their images not only to their doctor but to a cloud-based radiologic interpretation station as well.

Here’s an example of how we are working with the first stages of this system.

Let’s say a woman injures her knee while skiing in a remote part of Alaska. She gets an exam, X-ray, and MRI done by a local physician in a mountain clinic. The doctor enters this information into an electronic medical record and provides the patient with a web link. The patient uploads the links or the images directly, using a cloud-based radiology storage system like Purview.

The patient’s own physician, who subscribes to the future “Global Medical AI” network, is notified of the injury and provided with the AI bot-produced report, over-read by a supervising networked radiologist. This report combines all of the medical information with human and computer software interpretations. It not only diagnoses a torn anterior cruciate ligament (ACL), but highlights some important genomic information previously submitted to the doctor and patient: Due to a specific genomic pattern, a blood clot in the leg is 10% more likely to form during the post-injury period.

Since the AI bot has also reviewed the entire pharmacy records available for this patient (in addition to their credit card purchases of over-the-counter drugs), it is clear that the patient has been consuming high doses of NSAIDs. These, the bot notes, might increase the bleeding risk if the patient were to be placed on anticoagulants before being taken off the other drugs. In addition, the evaluation of the patient’s recent tweets indicates a new level of depression that may require extra psychological support during the postoperative healing phase.

So, before the skiing patient has ever left the remote mountain town, the level of medical insight about their underlying conditions, lifestyle choices, and potential risks exceeds most of what is known today. Yet, very few extra resources were consumed.

Several recent events are making this all possible:

1. The general acceptance and recognition of the lack of privacy in medical and consumer information. Electronic medical records are improving by sharing information about patients, no matter where they are being treated. On the positive side, this permits the doctor to know a great deal about the patient. All of their credit card or Apple Pay-like purchases are recorded, and that information is mined to paint a general picture of the cardholder. When combined with the patient’s Facebook and Twitter feeds, Instagram pictures, and other social media revelations, a highly detailed portrait of the patient emerges.

2. Because they are self-learning, AI bots and data mining technologies are improving more rapidly than humans ever could. They are not intimidated by vast amounts of data and can quickly correlate disparate information sources.

3. Genetic information able to predict disease or highlight risks is rapidly improving. Today we are researching a sliver of this information as it relates to injury and arthritis. While we still do not have a handle on exactly how certain conditions predicted by the genetic code are expressed or suppressed, such patterns will be more predictive after billions of people have been sequenced.

4. Patient access to direct medical resources — through medical data uploading services, along with data mining their own healthcare data — will democratize medical diagnoses. This will not reduce the physician’s role; it will empower doctors with potent interpretation tools, making them far better diagnosticians. And it will free patients from the tyranny of socialized medical care systems and insurance companies that, in their effort to drive down costs and improve profits for their executives and shareholders, have robbed millions of patients of the best that medicine can offer.

So it is time to embrace the belief that, at least in medicine, “all things digital are public.” And if they are going to be, we might as well all benefit from this transparency by mobilizing the best minds to keep us healthy—even if those minds are not human.

Dr. Kevin R. Stone is an orthopedic surgeon at The Stone Clinic and chairman of the Stone Research Foundation in San Francisco.
