An AI for an Eye: How Deep Learning May Prevent Diabetes-Induced Blindness
There are many ways diabetes can be debilitating, even lethal. But one condition caused by the disease comes on without warning.
A patient “can go to sleep one night and wake up the next morning and be legally blind, with no previous symptoms,” said Jonathan Stevenson, chief strategy and information officer for Intelligent Retinal Imaging Systems (IRIS), speaking of the condition known as diabetic retinopathy.
While most complications of diabetes, such as heart disease, kidney disease and nerve damage, have overt symptoms, diabetic retinopathy can sneak up on a patient undetected unless spotted early by regular eye exams.
Making Diabetic Eye Exams Widely Available
Fewer than 40 percent of the world's 370 million people with diabetes get checked for diabetes-related eye conditions. To make matters worse, while the number of patients with diabetes has steadily grown in recent decades, the population of ophthalmologists has been shrinking.
IRIS is attempting to bridge this gap by making retinal exams quick, easy and widely available.
“We are trying to enable a workflow that gives the provider the data they need to make decisions, but not interrupt that sacred time spent with patients,” said Stevenson.
The idea that a patient with diabetes could go blind so suddenly and unnecessarily was too much for Dr. Sunil Gupta, who founded IRIS in 2011. What the young company subsequently discovered was that deep learning can detect early indicators of diabetic complications in the retina.
Now, IRIS is preparing to roll out an updated component of its cloud-based solution that quickly analyzes uploaded images and returns the results to caregivers, achieving 97 percent agreement with the readings of expert ophthalmologists.
Tapping Microsoft’s Latest Toolkits
Behind that solution is an approach combining NVIDIA GPUs and the TensorFlow machine learning library with Microsoft Azure Machine Learning services and the Microsoft Cognitive Toolkit (CNTK), which together make it possible to write low-level, hardware-agnostic algorithms.
Jocelyn Desbiens, lead innovator and data scientist for IRIS, said the company was one of the first organizations to make use of the Microsoft toolkits in this way. IRIS also uses Kubernetes to orchestrate its cloud-based containers, which run on the Microsoft Azure platform.
To build its model, IRIS obtained a dataset of about 10,000 retinal images and sifted through them to identify 8,000 high-quality images: 6,000 were used for training, while 2,000 were held out for validation.
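The curation step above boils down to a standard shuffled train/validation split. The sketch below illustrates that split with placeholder file names; the function name, seed, and naming scheme are illustrative assumptions, not details of IRIS's actual pipeline.

```python
import random

def split_dataset(image_paths, train_count=6000, val_count=2000, seed=42):
    """Shuffle image paths and partition them into training and validation sets.

    Shuffling before splitting avoids ordering bias (e.g. images grouped
    by clinic or capture date ending up entirely in one partition).
    """
    rng = random.Random(seed)  # fixed seed makes the split reproducible
    paths = list(image_paths)
    rng.shuffle(paths)
    train = paths[:train_count]
    val = paths[train_count:train_count + val_count]
    return train, val

# Placeholder names standing in for the 8,000 curated retinal images
images = [f"retina_{i:04d}.png" for i in range(8000)]
train_set, val_set = split_dataset(images)
print(len(train_set), len(val_set))  # 6000 2000
```

Holding the 2,000 validation images out of training entirely is what allows the reported agreement with ophthalmologists to be measured on images the model has never seen.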
The system can detect differences between the left and right eyes, as well as between diabetic and normal eyes. Ultimately, it recommends whether a patient needs to be referred to a physician or if the detected condition simply needs to be observed.
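The final step described above, turning a model's output into a "refer" or "observe" recommendation, can be sketched as a simple thresholding rule over predicted severity grades. The grade names and threshold below are illustrative assumptions, not IRIS's actual labels or decision logic.

```python
def recommend_action(severity_probs, refer_threshold=0.5):
    """Map predicted severity probabilities to a referral recommendation.

    severity_probs: dict of hypothetical severity grades to probabilities,
    e.g. {"none": 0.1, "mild": 0.2, "moderate": 0.5, "severe": 0.2}.
    If the combined probability of referable grades meets the threshold,
    the patient is flagged for referral to a physician.
    """
    referable = severity_probs.get("moderate", 0.0) + severity_probs.get("severe", 0.0)
    return "refer" if referable >= refer_threshold else "observe"

print(recommend_action({"none": 0.1, "mild": 0.2, "moderate": 0.5, "severe": 0.2}))  # refer
print(recommend_action({"none": 0.8, "mild": 0.15, "moderate": 0.05}))  # observe
```

Keeping the decision rule separate from the model itself lets clinicians tune how conservative the referral threshold is without retraining.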
Newer GPUs Up the Ante
All training and inferencing occur on NVIDIA GPUs running in IRIS’s Azure instance. IRIS has been at it long enough that it’s benefited from incredible advances in performance.
A few years ago, adopting NVIDIA Tesla K80 GPU accelerators slashed the time it took to train the company’s model on 10,000 images from a month to a week. Switching to the Tesla P100 shrunk that down to only a couple of days. And now with the Tesla V100, the process is down to half a day.
That time gain, Stevenson said, is how NVIDIA is enabling researchers and scientists to answer questions they’d never been able to tackle before — such as whether diabetic blindness can be identified ahead of time.
Even more Azure customers will soon be able to take advantage of these performance gains, as Microsoft has announced the preview of two new N-series Virtual Machines with NVIDIA GPU capabilities.
Eventually, IRIS intends to apply its understanding of the retina to assist in the treatment of other conditions. The retina in many ways, Stevenson said, is a window into a person’s health, providing clues about everything from autoimmune disorders and cancers to cardiovascular diseases.
Without divulging specifics, he made it clear that IRIS’s work won’t stop with diabetic blindness.
“By looking at features within the retinas,” Stevenson said, “we’re able to see other conditions that aren’t necessarily related to the eye.”