Patients suffering from Amyotrophic Lateral Sclerosis, more commonly known as ALS, experience limited mobility and communication, leading to less personal freedom and a lower quality of life. However, two Stony Brook University professors are working to advance a multifunctional eye-gaze-based app that can help ALS patients gain back some of their lost independence.
According to Fusheng Wang, an associate professor in the Department of Computer Science in the College of Engineering and Applied Sciences, motor disability can be caused by ALS as well as other illnesses or trauma, including multiple sclerosis, cerebral palsy, spinal cord injury, traumatic brain injury, stroke, and muscular dystrophy.
“As the motor and speech capability of a patient at different stages evolves, assistive communication is essential,” said Wang. “However, the available capabilities of ALS patients differ from one another and keep evolving as the disease progresses. While there is a large space of assistive technologies, they come with major limitations such as a high price tag, limited availability, or a big, cumbersome design.”
To address this, Wang and fellow researcher Xiaojun Bi, also an associate professor in the Department of Computer Science, are working on a technology called EyeCanDo, a lightweight, affordable and accessible application to help ALS patients communicate in their daily life.
Patients can download EyeCanDo directly from the Apple App Store to an iPad Pro and begin using it immediately. Caregivers and family members will benefit from better understanding the needs of the patient, improving their physical and mental care. Nurcan Gursoy, MD, clinical neurophysiologist and co-director of the Stony Brook Neuromuscular Disease & ALS Center, is a co-investigator, providing clinical guidance and coordinating with Stony Brook’s ALS Clinic on recruiting patients and evaluating the EyeCanDo app.
According to Wang, the goal of the project is to develop a cost-effective, multi-modal assistive communication app that can run on an iPad and take full advantage of the available capabilities of the patient.
“These would include eye gaze, facial expressions, and brain-computer interfaces for optimal communication and improving the quality of life of ALS patients,” said Wang, who added that EyeCanDo is built upon AI-powered human-computer interaction, machine learning, signal processing and augmented reality.
“A patient will be able to look at the iPad and use eye gaze to control the EyeCanDo app for a wide variety of needs ranging from food and personal hygiene to web browsing, social media and entertainment,” added Bi.
EyeCanDo began as a project by a group of students in Wang’s lab at the 2018 Mount Sinai Health Hackathon for rare diseases. Motivated by this effort, Wang decided to continue the project in his lab, partnering with Bi’s lab, which brought expertise in human-computer interaction research, in particular text input for mobile devices. In 2021, Wang and Bi were awarded a two-year, $200,000 grant by the ALS Association to develop the technology. In May 2022 they received a second award, for $777,000, from the Department of Defense’s Congressionally Directed Medical Research Program to further advance their research.
The initial app was evaluated by 20 ALS patients in Stony Brook clinics, and 80 percent of the patients had positive experiences. Wang said the first major contribution of his group to the project was patient-centered technology development.
“In our early interviews with ALS patients, they all expressed grief over the progressive nature of their disease,” said Wang. “They have a profound sense of loss over not being able to do what they used to do. This can pertain to social situations or even something like simply cutting up food to eat. Everyone is affected in their own unique way. Some patients and caregivers are able to cope; others openly cry.”
Other contributions include making the app adaptable to users via machine-learning capabilities and finding ways to leverage a patient’s available capabilities, including gaze, facial expressions, and brainwaves, integrating them to achieve optimal communication performance.
The EyeCanDo project also provides Wang with opportunities to train students, including high school, undergraduate, master’s and PhD students. About 20 students have been involved in the project since 2018.
Wang’s academic research sits at the intersection of computer science and biomedical informatics, focusing on the challenges of delivering effective, scalable and expressive software systems for managing and querying complex, multidimensional biomedical big data. A second research goal is to develop novel methods and software systems that optimize the extraction, mining and understanding of biomedical data with improved efficiency, accuracy and interoperability to support biomedical research and healthcare.
“Bridging the gap between research and patient care is of high interest to my work,” said Wang. “This is the first project that lets us work directly with patients.”
– Robert Emproto