New DeepMind Project Aims To Improve Eye Disease Understanding

Artificial intelligence is advancing rapidly, and new applications for it seem to appear just as quickly. With the rise of neural networks and machine learning, the technology can be applied in just about any field. From business to technology and even medicine, researchers keep finding tasks for AI, many of which it can perform better than humans. Case in point: Google-owned DeepMind's state-of-the-art AI is being used in a partnership with Moorfields Eye Hospital, a renowned ophthalmic institute in London, to apply machine learning to detecting and identifying eye problems and diseases as early as possible. The plan is to teach the AI what a normal digital eye scan, or OCT, should look like, and have it search for any abnormalities.

A traditional OCT (Optical Coherence Tomography) scan shows detailed geometry and other data for a given part of the eye. Although tools exist to analyze these scans for abnormalities, they are often unable to comb through every detail or spot issues in the scans, leaving the bulk of the task to human doctors. Given the scans' complexity, it naturally takes a doctor quite some time to go over one, which in some cases could be the difference between catching an eye disease early enough to stem its development and catching it only in time to figure out how best to help a patient cope and adapt. Faster analysis could also enable more frequent scanning, allowing patients with ongoing or degenerative conditions to be monitored more closely.

On top of faster analysis, the project is meant to allow the AI to catch things that a human might miss when analyzing an OCT scan. Professor Sir Peng Tee Khaw of Moorfields Eye Hospital says the research could very well “revolutionise the way professionals carry out eye tests”, citing predictions that the number of people affected by sight loss will double by 2050 and arguing that it is vital to explore any and all avenues to improve current standards of care. Of the thousands of scans taken daily at Moorfields, DeepMind will be fed a set number and, where applicable, told what is wrong with each scan during the learning process, such as in the case of patients with known abnormalities. Once the AI learns to identify a wide range of common issues, the next step will be to let it analyze new scans on its own.
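
The article does not describe how DeepMind's system is actually built, but the training process outlined above is a standard supervised-learning setup: labelled scans go in, a predicted diagnosis comes out. The sketch below is a minimal, purely illustrative example of that idea in Python with PyTorch, using synthetic stand-in data; the model architecture, image size, and label categories are all assumptions for illustration, not details of the real project.

```python
# Illustrative sketch only -- not DeepMind's actual model or data.
# Assumes grayscale OCT slices resized to 128x128 and a handful of
# hypothetical diagnosis classes.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

NUM_CLASSES = 4  # assumed labels, e.g. normal / AMD / diabetic retinopathy / other

class OCTClassifier(nn.Module):
    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 32 * 32, num_classes)

    def forward(self, x):
        x = self.features(x)  # (batch, 32, 32, 32) after two 2x poolings of a 128x128 input
        return self.classifier(x.flatten(1))

# Synthetic stand-ins for labelled scans: real training would load anonymised
# OCT images together with the diagnosis recorded for each patient.
scans = torch.randn(256, 1, 128, 128)
labels = torch.randint(0, NUM_CLASSES, (256,))
loader = DataLoader(TensorDataset(scans, labels), batch_size=32, shuffle=True)

model = OCTClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for epoch in range(3):
    for batch_scans, batch_labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(batch_scans), batch_labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```

Once trained on enough labelled examples, a model of this general kind can be pointed at new, unseen scans and asked for the most likely diagnosis, which is the step the partnership describes as coming after the learning phase.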