Machine learning and artificial intelligence could usher in the future of medicine, but are they being utilized to their full potential?
What are the relationships between artificial intelligence (AI), machine learning and deep learning? How can AI be used in personalized and precision medicine? Are healthcare systems implementing the proper tools to make sense of their data and learn from it? Our expert panel weighs in:
A Healthcare Analytics News® Peer Exchange®
Segment 4/11
Kevin R. Campbell, M.D.: Our second segment is all about analytics, AI and the search for answers. By now we have all heard about AI and big data analytics. In order to put things in perspective, it's important that we understand exactly what AI is and how it can help us in healthcare. For starters, we really need a better idea of the terms AI, machine learning, and deep learning.
The easiest way to think about the relationship of AI to machine learning and deep learning is to think of them as a group of Russian nesting dolls, or matryoshka dolls. At the center of the stack is deep learning, then machine learning around it, and the final and largest outer doll is AI. In the simplest of terms, artificial intelligence is the simulation of human intelligence by computer systems. Machine learning uses algorithms to analyze data, learn from it, and ultimately make decisions, determinations, and predictions.
In short, machine learning occurs when a computer evolves and learns, without explicit programming, through complex pattern recognition. With machine learning, a computer is able to modify itself when exposed to new data. Deep learning is a subset of machine learning and thus the smallest doll in our stack. Deep learning algorithms use artificial neural networks to recognize patterns and to cluster and classify data. Deep learning maps inputs to outputs and finds correlations. This is exactly what doctors do when we're making treatment decisions for a particular disease.
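The learning described above can be sketched in a few lines of code. This is a toy illustration, not anything from the panel: a single artificial "neuron" (the building block of the neural networks mentioned) that learns a mapping from inputs to outputs from labeled examples and modifies its own weights when it is wrong. The features, labels, and numbers are entirely made up.

```python
import math

def sigmoid(z):
    """Squash a score into a probability between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-z))

def train(examples, epochs=2000, lr=0.5):
    """Fit weights by gradient descent on labeled (features, label) pairs."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in examples:
            p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            err = p - y              # how wrong was the prediction?
            w[0] -= lr * err * x[0]  # the model modifies itself
            w[1] -= lr * err * x[1]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Map an input to an output: 1 = flag for review, 0 = normal."""
    return 1 if sigmoid(w[0] * x[0] + w[1] * x[1] + b) >= 0.5 else 0

# Hypothetical labeled data: [resting heart rate / 100, symptom score]
data = [([0.6, 0.1], 0), ([0.7, 0.2], 0), ([1.1, 0.8], 1), ([1.2, 0.9], 1)]
w, b = train(data)
```

Deep learning stacks many such neurons in layers, but the principle is the same: expose the model to labeled data and let it adjust itself to map inputs to outputs.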
So let’s dive right in. I want to start with you, Geeta. Are healthcare systems implementing the tools needed to, one, make sense of their data, and two, learn from it? Do you have any specific examples that you’re aware of?
Geeta Nayyar, M.D., MBA: I think every health system struggles with this, right? Number one, with the advent of the EHR [electronic health record] revolution, we have all of this data; that's the first piece. And now the question is, what do we do with the data, how do we analyze it? And then lastly, how do we actually action it, right? It's one thing to know you have a thousand diabetics in your panel, and that half of them are high risk and the other half are moderate risk; it's another to actually action that. Every hospital system is doing something different. There have been a number of investments made by the NIH [National Institutes of Health], [The University of Texas] MD Anderson [Cancer Center], and Stanford [University], so many leaders in this space that are using AI technologies, particularly in oncology and radiology, when it comes to precision medicine, predictive analytics, and personalized medicine, which is particularly important in the oncology space, right? What is the specific genetic DNA of the patient sitting in front of you, and what chemotherapy regimen is most tailored and most beneficial for them?
So, we’ve seen pieces of it. I wouldn’t say that anyone’s cracked the nut on it, but certainly there are health systems that are making investments in that space and trying to figure this out.
Kevin R. Campbell, M.D.: Dr. Albert, tell us how you and your company use big data and AI with the KardiaMobile ECG device. That's a specific application of a specific tool that can help patients and doctors.
David E. Albert, M.D.: Just as in radiology or dermatology, where an image is labeled (this is a melanoma, this is a mole), we have ECGs [electrocardiograms] that are labeled: this is atrial fibrillation, this is bigeminy. We've collected over 30 million ECG recordings over the last 7 years, and today we record over 1.25 million into our cloud every month. So that data allows us not only to see tremendous volumes of pathology but to label that data and feed it back, because, as machine learning does, it gets better and better. It makes mistakes and you teach it, and then it teaches itself: What are the reasons I made a mistake? How do I get it right next time? And that certainly goes on with deep learning inside a black box: you just give it the input and the answer, and it decides how to correlate the 2 together. So I would tell you that the great strides that have been made in voice recognition and image recognition, certainly in medical imaging in dermatology, radiology, and oncology, are benefits we're taking advantage of, because we have a simpler signal in the electrocardiogram. It's been computerized for 50 years, but we're going to make some new great strides. Instead of using 100 ECGs to learn, we're going to use 100 million.