For artificial intelligence (AI) in healthcare to succeed, there needs to be more transparency around the data that goes into an algorithm, explained John Halamka, M.D., president of Mayo Clinic Platform.
There is a lot of optimism about the use of AI in healthcare, but more transparency is needed to evaluate an algorithm's fitness for purpose, John Halamka, M.D., said during a session at the HIMSS21 Digital Program.
Halamka is an emergency physician by trade and the current president of Mayo Clinic Platform, which helps new ventures leverage emerging technologies such as AI, connected health care devices, and natural language processing.
For as long as there have been statistics and probability, there have been issues with algorithms.
“If I use a data set of 1 million Scandinavian Lutherans to create the most amazing EKG algorithm ever and then we decide to use it in Spain, is it going to work?” Halamka asked. “And this is our issue: the algorithms are only as good as the underlying training data. And yet, we don’t publish statistics with each algorithm describing how it was developed or its fitness for purpose.”
With an influx of AI in healthcare, there needs to be a “nutrition label” for algorithms. These labels would include the race/ethnicity, gender, geography, income, etc., of the data that went into creating an algorithm. In addition, the label should include a statistical measure of how well the algorithm works for a given population.
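Such a label could be represented as a simple structured record. The sketch below is a hypothetical schema, assuming fields for training-data demographics and per-group performance; the field names, example values, and the `covers` helper are illustrative assumptions, not an actual standard.

```python
from dataclasses import dataclass, field

@dataclass
class AlgorithmLabel:
    """A hypothetical 'nutrition label' for a clinical algorithm,
    summarizing who was in the training data and how well it performs."""
    name: str
    race_ethnicity: dict   # fraction of training data per group
    gender: dict
    geography: dict
    performance_by_group: dict = field(default_factory=dict)

# Illustrative label for the hypothetical EKG model Halamka describes
label = AlgorithmLabel(
    name="EKG classifier",
    race_ethnicity={"Scandinavian": 1.0},
    gender={"female": 0.5, "male": 0.5},
    geography={"Nordic countries": 1.0},
    performance_by_group={"overall": {"auroc": 0.97}},
)

def covers(label: AlgorithmLabel, group: str) -> bool:
    """Was this group represented in the training data at all?"""
    return label.race_ethnicity.get(group, 0.0) > 0.0
```

With a label like this, a clinician (or an EHR system) could check `covers(label, "Spanish")` before trusting the model's output for a Spanish patient, making the gap Halamka describes visible instead of hidden.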
The point of providing this information, said Halamka, is so that a physician can know whether the algorithm will work for the patient in front of them.
“That’s how we get to maturity” for the use of AI in healthcare, he said. “We can keep doing what we’re doing, we just need process maturity.”
The level of transparency that is needed would require some enforcement. While the FDA’s software and medical device reviewers can evaluate devices and closed-loop algorithms from a safety perspective, they wouldn’t be the right people to assess efficacy or bias. According to Halamka, there will need to be public-private collaboration.
He pointed to what HIMSS did for data standards: a group of experts with different opinions were brought together and came up with a harmonized approach. And he thinks it’s going to happen soon. In the past, he has written about the perfect storm for innovation, which happens when there is a sense of urgency, policy suggesting it needs to be done, and industry saying it’s the right thing to do.
“And that is what’s happening now,” he said. Articles published in medical journals and the lay media in just the last week have all described the need to address bias, ethics, and fairness in AI.
“It’s a top-of-mind issue,” Halamka said. “But to be honest, does the technology exist to allow us to flip a switch and make it live? It does not.”
Finally, he highlighted four grand challenges that Mayo has identified:
1. Gathering novel data. Collecting data from different sources to become part of the algorithm is important. For instance, if someone had a fever in January 2020, and GPS data could have shown they were in Wuhan, China, just weeks earlier, that patient would have been treated very differently. The challenge, Halamka said, is that there aren’t a lot of data standards yet for these novel data sources.
2. Creating discovery tools. When large data sets are brought together, there need to be tools that let investigators explore the data and create new algorithms. This will empower individuals who don’t have AI experience to become engaged in algorithm development.
3. Validation. This is what Halamka discussed through the rest of the session: how do we decide whether the algorithm is fit for its purpose and labeled properly?
4. Delivering the end result into the workflow. Information must be delivered to the physician immediately, while the patient is sitting in front of them, so they can put in an order at the point of care.
“Let us hope government, academia, and industry work on those four challenges, and we’ll all be in a better place,” Halamka said.