Researchers at Dartmouth say they have built the first smartphone app that combines AI with facial image processing software to reliably detect the onset of depression before the person affected even realizes something is wrong.
The app, MoodCapture, uses the phone's front-facing camera to regularly capture images of a person's face and surroundings, then analyzes the photos for clinical cues associated with depression. In a study of 177 people diagnosed with major depressive disorder, the app correctly identified early symptoms of depression 75% of the time.
According to the paper’s lead author, Andrew Campbell, this is the first time that images captured “in the wild” have been used to predict depression in this way.
MoodCapture relies on a similar pipeline of technologies, combining facial-recognition software with deep learning and AI hardware. “All someone has to do is unlock their phone, and MoodCapture knows how their depression works and can tell them to get help.”
MoodCapture is designed to analyze a burst of photos in real time each time the user unlocks their phone. The AI model draws connections between facial expressions and background details, such as eye gaze, changes in facial expression, and a person’s surroundings, that it has learned are important for assessing the severity of depression.
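To make that workflow concrete, here is a minimal, purely illustrative Python sketch of the kind of on-unlock pipeline described above. The feature names (eye_contact, expression_change, environment_brightness), the weights, and the alert threshold are assumptions made for illustration only; they are not the researchers' actual model or clinically validated values.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class FrameFeatures:
    # Hypothetical per-photo cues loosely matching what the article says the
    # model relies on: eye gaze, expression change, and the surroundings.
    eye_contact: float            # 0.0 (avoids camera) .. 1.0 (steady gaze)
    expression_change: float      # magnitude of expression shift vs. previous frame
    environment_brightness: float # crude stand-in for "surroundings" context


def depression_risk_score(frames: List[FrameFeatures]) -> float:
    """Toy stand-in for the app's deep-learning model: combine per-frame cues
    into a single 0..1 risk score. The hand-picked weights below are purely
    illustrative, not the trained MoodCapture model."""
    if not frames:
        return 0.0
    avg_eye_contact = sum(f.eye_contact for f in frames) / len(frames)
    avg_expression = sum(f.expression_change for f in frames) / len(frames)
    avg_brightness = sum(f.environment_brightness for f in frames) / len(frames)
    # In this sketch, lower eye contact, flatter expressions, and darker
    # surroundings push the score up.
    score = (0.5 * (1.0 - avg_eye_contact)
             + 0.3 * (1.0 - avg_expression)
             + 0.2 * (1.0 - avg_brightness))
    return max(0.0, min(1.0, score))


def on_phone_unlock(capture_burst: Callable[[], List[FrameFeatures]]) -> None:
    """Called each time the user unlocks the phone: capture a burst of photos,
    score them on-device, and nudge the user if the risk looks elevated."""
    frames = capture_burst()   # front-camera burst (hypothetical hook)
    score = depression_risk_score(frames)
    if score > 0.75:           # illustrative threshold, not taken from the study
        print("MoodCapture: consider reaching out for support.")


if __name__ == "__main__":
    # Example run with fabricated feature values for a three-photo burst.
    demo = [FrameFeatures(0.20, 0.10, 0.30),
            FrameFeatures(0.10, 0.05, 0.25),
            FrameFeatures(0.15, 0.10, 0.20)]
    on_phone_unlock(lambda: demo)
```

The point of the sketch is the overall flow, capture on unlock, score on-device, prompt the user, rather than any particular scoring rule.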
Jacobson, who runs the AIM HIGH Laboratory for AI and Mental Health, is a co-author of the paper.