New app uses smartphone selfies to screen for pancreatic cancer

Credit: Dennis Wise/University of Washington.

Pancreatic cancer has one of the worst prognoses—with a five-year survival rate of 9 percent—in part because there are no telltale symptoms or non-invasive screening tools to catch a tumor before it spreads.

Now, University of Washington researchers have developed an app that could allow people to easily screen for pancreatic cancer and other diseases—by snapping a smartphone selfie.

BiliScreen uses a smartphone camera, computer vision algorithms and machine learning tools to detect increased bilirubin levels in a person’s sclera, or the white part of the eye.

One of the earliest symptoms of pancreatic cancer, as well as other diseases, is jaundice, a yellow discoloration of the skin and eyes caused by a buildup of bilirubin in the blood.

The ability to detect signs of jaundice when bilirubin levels are minimally elevated—but before they’re visible to the naked eye—could enable an entirely new screening program for at-risk individuals.

In an initial clinical study of 70 people, the BiliScreen app—used in conjunction with a 3-D printed box that controls the eye’s exposure to light—correctly identified cases of concern 89.7 percent of the time, as measured against the blood test currently used.

“The problem with pancreatic cancer is that by the time you’re symptomatic, it’s frequently too late,” said lead author Alex Mariakakis, a doctoral student at the Paul G. Allen School of Computer Science & Engineering.

“The hope is that if people can do this simple test once a month—in the privacy of their own homes—some might catch the disease early enough to undergo treatment that could save their lives.”

The UW's UbiComp lab, which developed BiliScreen in collaboration with UW Medicine doctors, specializes in using cameras, microphones and other components of common consumer devices, such as smartphones and tablets, to screen for disease.

BiliScreen uses a smartphone’s built-in camera and flash to collect pictures of a person’s eye as they snap a selfie.

The team developed a computer vision system that automatically and effectively isolates the white parts of the eye, the only region the app needs to analyze.

The app then calculates the color information from the sclera—based on the wavelengths of light that are being reflected and absorbed—and correlates it with bilirubin levels using machine learning algorithms.
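The sketch below illustrates the general idea of that pipeline in Python, but it is not BiliScreen's implementation: it assumes a sclera mask has already been produced by a separate segmentation step, the color features are a simple choice of per-channel means, and a generic regressor stands in for the team's machine learning model. All function names and the placeholder training data are hypothetical.

```python
import cv2
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def sclera_color_features(image_bgr: np.ndarray, sclera_mask: np.ndarray) -> np.ndarray:
    """Mean color of the sclera pixels in a few color spaces.

    `sclera_mask` is a binary (H, W) mask with 1 for sclera pixels,
    produced by a segmentation step this sketch does not implement.
    """
    features = []
    for conversion in (None, cv2.COLOR_BGR2HSV, cv2.COLOR_BGR2LAB):
        converted = image_bgr if conversion is None else cv2.cvtColor(image_bgr, conversion)
        pixels = converted[sclera_mask.astype(bool)]  # (N, 3) sclera pixels
        features.extend(pixels.mean(axis=0))          # mean per channel
    return np.array(features)

# Hypothetical training data: color features from calibrated selfies paired
# with bilirubin levels (mg/dL) measured by the reference blood test.
X_train = np.random.rand(70, 9)      # placeholder feature matrix
y_train = np.random.rand(70) * 5.0   # placeholder bilirubin levels

model = RandomForestRegressor(n_estimators=100).fit(X_train, y_train)

def estimate_bilirubin(image_bgr: np.ndarray, sclera_mask: np.ndarray) -> float:
    """Predict a bilirubin level from one eye photo and its sclera mask."""
    return float(model.predict(sclera_color_features(image_bgr, sclera_mask)[None, :])[0])
```

In a real system the regressor would be trained on clinically measured bilirubin values rather than random placeholders, and the feature set would be chosen to track the yellowing of the sclera specifically.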

To account for different lighting conditions, the team tested BiliScreen with two different accessories: paper glasses printed with colored squares to help calibrate color and a 3-D printed box that blocks out ambient lighting.
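The paper-glasses accessory amounts to including color references of known value in every photo. A minimal sketch of that kind of calibration is shown below, assuming the colored squares have already been located in the image and their printed reference values are known; the linear correction model and the helper names here are illustrative assumptions, not BiliScreen's actual calibration procedure.

```python
import numpy as np

def fit_color_correction(measured_patches: np.ndarray, reference_patches: np.ndarray) -> np.ndarray:
    """Least-squares 3x3 matrix mapping measured RGB to reference RGB.

    Both arguments are (n_patches, 3) arrays: the mean RGB values sampled
    from the colored squares in the photo, and their known printed values.
    """
    M, _, _, _ = np.linalg.lstsq(measured_patches, reference_patches, rcond=None)
    return M

def apply_color_correction(image_rgb: np.ndarray, M: np.ndarray) -> np.ndarray:
    """Apply the fitted correction to every pixel of an (H, W, 3) image."""
    corrected = image_rgb.reshape(-1, 3).astype(np.float64) @ M
    return np.clip(corrected, 0, 255).reshape(image_rgb.shape).astype(np.uint8)
```

Normalizing the photo this way lets sclera color measurements taken under different indoor and outdoor lighting be compared on a common scale, which is what the 3-D printed box achieves physically by blocking ambient light.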

Using the app with the box accessory—reminiscent of a Google Cardboard headset—led to slightly better results.

Next steps for the research team include testing the app on a wider range of people at risk for jaundice and underlying conditions, as well as continuing to make usability improvements—including removing the need for accessories like the box and glasses.

“This relatively small initial study shows the technology has promise,” said co-author Dr. Jim Taylor, a professor in the UW Medicine Department of Pediatrics whose father died of pancreatic cancer at age 70.

“Pancreatic cancer is a terrible disease with no effective screening right now,” Taylor said. “Our goal is to have more people who are unfortunate enough to get pancreatic cancer to be fortunate enough to catch it in time to have surgery that gives them a better chance of survival.”