Feeling sick? Here are some medical apps for that.
Most people are familiar with staring at their phone in horror after looking up their symptoms when they feel slightly off colour – and being warned by WebMD that death is imminent.
However, the smartphone’s reputation as a somewhat dramatic diagnostician is starting to change. Researchers are developing medical apps that can diagnose and treat certain diseases faster, more effectively and more cheaply than ever before.
From selfies that help diagnose cancer to platforms that predict seizures, here are some of the medical apps that are disrupting diagnosis and treatment.
1. Heart Disease
A team of engineers from Caltech, Huntington Medical Research Institute and the University of Southern California (USC) have developed an app to monitor cardiovascular health.
The app works by monitoring left ventricular ejection fraction (LVEF), or in plainer terms, how much blood the heart pumps out with each beat. Patients hold the smartphone against their neck, where it can monitor blood flow by measuring the expansion and contraction of arterial walls.
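How the researchers turn that raw signal into an LVEF estimate isn’t detailed here, but the first step, finding individual heartbeats in an arterial-wall displacement trace, can be sketched. The snippet below is an illustration only: the displacement signal, sampling rate and thresholds are hypothetical, not the Caltech team’s actual method.

```python
import numpy as np
from scipy.signal import find_peaks

def beats_from_displacement(signal, fs):
    """Locate pulse cycles in a (hypothetical) arterial-wall displacement trace.

    `signal` is a 1-D array of wall displacement sampled at `fs` Hz, as might
    be recovered from a phone held against the neck. Returns the indices of
    the systolic peaks and the mean heart rate in beats per minute.
    """
    # Normalise, then look for peaks at physiologically plausible spacing
    # (here: no faster than 180 bpm, i.e. at least fs / 3 samples apart).
    norm = (signal - np.mean(signal)) / (np.std(signal) + 1e-9)
    peaks, _ = find_peaks(norm, distance=max(1, int(fs / 3)), prominence=0.5)

    if len(peaks) < 2:
        return peaks, None
    intervals = np.diff(peaks) / fs          # seconds between successive beats
    heart_rate = 60.0 / np.mean(intervals)   # beats per minute
    return peaks, heart_rate
```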
Trials found the app to be as effective as echocardiography, a procedure that requires a trained technician and an expensive ultrasound machine, and can take up to 45 minutes.
“What is exciting about this study is that it shows our technique is as accurate as echocardiography at estimating LVEF when both are compared to the gold standard of cardiac MRI,” said Caltech’s Mory Gharib, a professor of aeronautics and bioinspired engineering.
“This has the potential to revolutionise how doctors and patients can screen for and monitor heart disease.”
Following the success of the trials, researchers are exploring other ways to use the technology. They anticipate that it could be used to diagnose other heart conditions, such as the valve disease aortic stenosis and coronary artery blockages.
2. Pancreatic Cancer
While any cancer diagnosis is unwelcome, pancreatic cancer carries one of the worst prognoses. It has a five-year survival rate of just 9 per cent, partly because it produces no obvious early symptoms. Currently, the only pre-emptive screening tool is a lengthy, often expensive blood test.
This could soon change. Researchers from the University of Washington’s Ubiquitous Computing Lab have developed an app called BiliScreen, which could make early detection far easier. One of the earliest symptoms of pancreatic cancer is jaundice, caused by a buildup of bilirubin in the blood. However, by the time its characteristic yellow discolouration of the skin and sclera is visible to the naked eye, the cancer is usually quite advanced.
The app uses computer vision algorithms and machine learning tools to detect mildly elevated bilirubin levels in the sclera from a selfie, making early detection easier than ever before. In an initial clinical study, the app correctly identified cases of concern 89.7 per cent of the time.
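The team’s full pipeline isn’t reproduced here, but the core computer-vision idea, quantifying a yellow tint in the white of the eye before a person could see it, can be sketched. The snippet below is purely illustrative: it assumes the sclera has already been segmented by an earlier step, and a real system would also need a colour-calibration step to map the measurement to a bilirubin estimate.

```python
import cv2
import numpy as np

def sclera_yellowness(image_bgr, sclera_mask):
    """Score the yellowness of the sclera as a rough proxy for bilirubin.

    `image_bgr` is a selfie frame (OpenCV BGR, uint8) and `sclera_mask` is a
    hypothetical binary mask of the white of the eye produced by an upstream
    segmentation step. Returns the mean b* value of the masked pixels in
    CIELAB space; higher b* means a more yellow sclera.
    """
    lab = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2LAB)
    b_channel = lab[:, :, 2].astype(np.float32)
    # OpenCV stores b* offset by 128, so re-centre it around zero (neutral).
    b_centred = b_channel - 128.0
    return float(np.mean(b_centred[sclera_mask > 0]))
```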
Researcher Alex Mariakakis, a doctoral student at the Paul G Allen School of Computer Science and Engineering, said, “The hope is that if people can do this simple test once a month, in the privacy of their own homes, some might catch the disease early enough to undergo treatment that could save their lives.”
3. Brain Injuries
Another group of researchers at the University of Washington (UW) has developed an app called PupilScreen, which can quickly identify the symptoms of concussions.
The app uses the phone’s camera and flash to measure changes in the pupil’s response to light, and is particularly exciting because it can be more sensitive than current methods of identifying a concussion.
“Where a penlight exam checks for major brain injuries, PupilScreen can identify the more subtle changes associated with concussion, changes that are imperceptible to the human eye,” The Engineer reported.
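As a rough illustration of the ‘subtle changes’ PupilScreen looks for (not the team’s actual code), the sketch below assumes an upstream detector has already measured the pupil’s diameter in every video frame, and summarises that trace into standard pupillary-light-reflex features: how far, how fast and how soon the pupil constricts after the flash. The 5 per cent threshold is invented for the example.

```python
import numpy as np

def pupil_response_features(diameters_mm, fps, flash_frame):
    """Summarise the pupillary light reflex from a per-frame diameter trace.

    `diameters_mm` is a hypothetical array of pupil diameters (one per video
    frame) from an upstream pupil-detection model; `flash_frame` is the frame
    at which the camera flash fired.
    """
    d = np.asarray(diameters_mm, dtype=float)
    baseline = np.mean(d[:flash_frame])        # diameter before the flash
    post = d[flash_frame:]

    amplitude = baseline - np.min(post)        # how far the pupil constricts
    velocity = np.gradient(post) * fps         # mm per second
    peak_velocity = float(np.min(velocity))    # most negative = fastest constriction

    # Latency: first frame after the flash where the pupil has clearly begun
    # to shrink (here, by more than 5 per cent of baseline; an arbitrary cut-off).
    shrunk = np.nonzero(post < 0.95 * baseline)[0]
    latency_s = float(shrunk[0] / fps) if shrunk.size else None

    return {"amplitude_mm": amplitude,
            "latency_s": latency_s,
            "peak_velocity_mm_per_s": peak_velocity}
```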
The convenience and accuracy of the app mean it could be particularly valuable in sport, where diagnosis on the sidelines can be inaccurate and subject to outside influences.
“Having an objective measure that a coach or parent or anyone on the sidelines of a game could use to screen for concussion would truly be a game-changer,” said Shwetak Patel, a professor of computer science and engineering at UW.
“The vision we’re shooting for is having someone simply hold the phone up and use the flash,” said lead author Alex Mariakakis, a doctoral student in the Paul G Allen School of Computer Science & Engineering.
4. Epileptic Seizures
A team of engineers and researchers at the University of Melbourne are developing an app they hope will help people with epilepsy predict oncoming seizures.
Patients will be able to enter information about their medication, lifestyle factors, environmental factors and brain recordings. The app will then aggregate the information to predict the probability of experiencing a seizure. It will become more accurate over time as it learns about a patient’s specific triggers and seizure patterns.
There are several features that distinguish this app from its predecessors. The first is the significant amount of data it draws on, including the longest seizure-forecasting study undertaken in humans. The second is its capacity to treat seizure likelihood as a “continuum, not a duality”.
As researchers explained: “Forcing a forecast to take on only two possible outcomes [yes or no] means that these highly excitable states are misdiagnosed, making it difficult to refine or improve predictions. To reflect the brain’s changing state a more useful question to ask is: ‘What is the probability a person will have a seizure in the next hour?’.”
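To make the “continuum, not a duality” idea concrete, here is a toy sketch, not the Melbourne team’s actual model, of how logged factors could be combined into a probability of a seizure in the coming hour. The features, numbers and model choice are all invented for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented per-hour records for one patient: hours slept, missed medication
# (0/1), self-reported stress (0-10) and an EEG excitability index, with a
# label saying whether a seizure followed within the next hour.
X_history = np.array([
    [7.5, 0, 2, 0.20],
    [6.0, 0, 5, 0.35],
    [4.5, 1, 8, 0.70],
    [8.0, 0, 3, 0.25],
    [5.0, 1, 7, 0.65],
    [7.0, 0, 4, 0.30],
])
y_history = np.array([0, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X_history, y_history)

# Forecast for the coming hour: a probability on a continuum, not a yes/no call.
current_hour = np.array([[5.5, 1, 6, 0.55]])
risk = model.predict_proba(current_hour)[0, 1]
print(f"Estimated probability of a seizure in the next hour: {risk:.0%}")
```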
5. Respiratory Disease
Associate Professor Udantha Abeyratne at the University of Queensland’s School of Information Technology and Electrical Engineering has developed an app to help diagnose and measure respiratory illnesses, including asthma and pneumonia, by monitoring cough and respiratory sounds.
Traditionally, respiratory illnesses are diagnosed using a stethoscope. However, according to the researchers, “The information available from these sounds is compromised as the sound has to first pass through the chest musculature, which muffles high-pitched components of respiratory sounds.”
Instead, the app, known as ResApp, uses machine learning algorithms to diagnose respiratory problems by analysing cough and breathing sounds.
It compares these cough and breathing sounds, also known as signatures, against a large database of recordings with known clinical diagnoses. From there, the app’s machine learning tools can accurately produce a diagnosis, or measure the severity of a patient’s respiratory condition.
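As a rough illustration of the signature idea (not ResApp’s actual algorithm), the sketch below turns a cough recording into a compact feature vector that a separately trained classifier could compare against recordings with known diagnoses. The file name and the classifier are placeholders.

```python
import numpy as np
import librosa

def cough_signature(audio_path, sr=16000, n_mfcc=13):
    """Turn a cough recording into a fixed-length feature 'signature'.

    Computes MFCCs, a standard compact description of a sound's spectral
    shape, and summarises them with their mean and standard deviation over
    time. The resulting vector can be fed to a classifier trained on
    recordings with known clinical diagnoses.
    """
    audio, sr = librosa.load(audio_path, sr=sr)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=n_mfcc)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# Hypothetical usage with a model trained elsewhere on labelled coughs:
# features = cough_signature("cough.wav")
# diagnosis = trained_model.predict([features])
```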