Published: Wed, February 21, 2018
Health Care | By Belinda Paul

Google uses AI, deep learning to predict cardiovascular risk from retina scans

Google's health subsidiary, Verily, announced yesterday that its new AI algorithm can assess heart disease risk by analyzing scans of the back of a patient's eyes.

To test its accuracy, the AI was presented with the retinal images of two patients: one had suffered a heart attack five years prior, while the other had a clean bill of health.
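
The two-patient comparison described above is essentially a pairwise ranking test: the model passes a pair when it assigns the higher risk score to the patient who went on to have a cardiac event, and the fraction of correctly ordered pairs corresponds to the familiar AUC, or c-statistic. The sketch below illustrates that idea with made-up risk scores; it is not Google's evaluation code.

    # A minimal sketch of the two-patient test: given risk scores for patients
    # who later had a cardiac event and patients who did not, count how often
    # the event patient receives the higher score. The fraction of correctly
    # ordered pairs is equivalent to the AUC / c-statistic.
    # All scores below are made up for illustration.
    from itertools import product

    def paired_comparison_accuracy(event_scores, healthy_scores):
        """Fraction of (event, healthy) pairs where the event patient scores higher."""
        pairs = list(product(event_scores, healthy_scores))
        correct = sum(1 for e, h in pairs if e > h)
        ties = sum(1 for e, h in pairs if e == h)
        return (correct + 0.5 * ties) / len(pairs)

    # Hypothetical predicted risks from the algorithm:
    event_scores = [0.81, 0.42, 0.72]    # patients who had a heart attack
    healthy_scores = [0.35, 0.58, 0.20]  # patients with a clean bill of health
    print(paired_comparison_accuracy(event_scores, healthy_scores))  # 8/9 ~ 0.89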

Since Google's new method does not require a blood test, doctors will be able to determine an individual's cardiovascular risk faster and more efficiently. Typically, this assessment includes examining risk factors such as age, sex, smoking, blood pressure, and cholesterol, as well as taking into account whether the patient has another disease associated with increased cardiovascular risk, such as diabetes.
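
For context, a conventional risk-factor assessment combines exactly these variables into a single score, usually through a regression-style formula. The toy sketch below shows that structure only; the coefficients are invented for illustration, carry no clinical meaning, and do not correspond to SCORE, Framingham, or any real calculator.

    # Illustrative sketch only: a toy logistic-regression-style risk score built
    # from the risk factors the article lists (age, sex, smoking, blood pressure,
    # cholesterol, diabetes). The weights are made up for demonstration.
    import math

    def toy_cardiovascular_risk(age, is_male, is_smoker, systolic_bp,
                                cholesterol_mg_dl, has_diabetes):
        """Return a probability-like risk value between 0 and 1."""
        # Hypothetical coefficients chosen only to show the structure of
        # risk-factor-based scoring; they have no clinical validity.
        linear = (
            -8.0
            + 0.06 * age
            + 0.4 * is_male
            + 0.6 * is_smoker
            + 0.02 * systolic_bp
            + 0.005 * cholesterol_mg_dl
            + 0.7 * has_diabetes
        )
        return 1.0 / (1.0 + math.exp(-linear))  # logistic link

    print(f"{toy_cardiovascular_risk(58, True, True, 145, 230, False):.2f}")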

"They're taking data that's been captured for one clinical reason and getting more out of it than we now do", Luke Oakden-Rayner, a medical researcher from the University of Adelaide, told The Verge. Google previously found that these methods can accurately detect diabetic eye disease. "They're taking data that's been captured for one clinical reason and getting more out of it than we now do", said Oakden-Rayner.

According to Lily Peng, a researcher at Google, the computer vision algorithm can distinguish the retinal image of a smoker from that of a non-smoker at least seven times out of 10.
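
The article does not spell out Google's model, but a common way to build this kind of classifier (for example, smoker vs. non-smoker from retinal photographs) is to fine-tune a pretrained convolutional network with a single sigmoid output. The Keras sketch below is a minimal illustration under that assumption; the backbone choice, input size, and training setup are assumptions rather than details from the study.

    # A minimal sketch, not Google's actual model: a pretrained ImageNet backbone
    # fine-tuned as a binary classifier on retinal photographs.
    import tensorflow as tf

    def build_retina_classifier(input_shape=(299, 299, 3)):
        # Pretrained backbone with its classification head removed.
        backbone = tf.keras.applications.InceptionV3(
            include_top=False, weights="imagenet", input_shape=input_shape
        )
        x = tf.keras.layers.GlobalAveragePooling2D()(backbone.output)
        # Single sigmoid output: predicted probability the image is from a smoker.
        output = tf.keras.layers.Dense(1, activation="sigmoid")(x)
        model = tf.keras.Model(backbone.input, output)
        model.compile(optimizer="adam",
                      loss="binary_crossentropy",
                      metrics=["accuracy", tf.keras.metrics.AUC()])
        return model

    model = build_retina_classifier()
    model.summary()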

The researchers, however, cautioned that while the system achieved good results, matching the accuracy of SCORE, a standard blood-test-based technique for estimating cardiovascular risk that is commonly used in Europe, a dataset of fewer than 300,000 patients is still small for AI, and the method should be tested further.

This is a big step forward scientifically, Google AI officials said, because it is not imitating an existing diagnostic but rather using machine learning to uncover a surprising new way to predict these problems.

Although the idea of looking at your eyes to judge the health of your heart sounds unusual, it draws from a body of established research. If the algorithm's success rate grows and Google is able to fine-tune the technology, then it's possible that the method could become a quicker and cheaper way to identify at-risk patients.

What's compelling about the study is that it could point the way toward more accurate methods of predicting disease risk from anatomical changes that can be picked up in medical images.
