FEELS: a full-spectrum enhanced emotion learning system for assisting individuals with autism spectrum disorder
Abstract
Autism Spectrum Disorder (ASD) is a developmental disorder that can lead to a variety of social and communication challenges, and individuals with ASD are at a higher risk of loneliness and depression due to the disconnect and isolation they may feel from the rest of society. Interventions targeting improved emotion detection have been clinically shown to be promising; however, considerable barriers make it challenging to incorporate emotion detection into daily-life scenarios. Motivated by the need to fill this gap, we introduce the concept of FEELS, a full-spectrum enhanced emotion learning system that could serve as a tool to assist individuals with ASD. FEELS facilitates enhanced emotion detection by capturing a live video stream of individuals in real time, leveraging deep convolutional neural networks to detect facial landmarks, and using a custom hybrid neural network, consisting of a time-distributed feed-forward neural network and an LSTM neural network, to determine the emotional state of the individuals from the sequence of facial landmarks over time (a minimal sketch of such a hybrid model follows the abstract). The feasibility of this approach was explored through the construction of a proof-of-concept FEELS system that can distinguish between five basic emotional states: neutral, sad, happy, surprise, and anger. Future work will include extending the proof-of-concept FEELS system to detect more emotional states and evaluating the system in more naturalistic settings.