ideas, innovations and apps for social work in the age of smartphones and social media
Can machines read human facial expressions? And can they do it like a social worker or psychotherapist? These are professionals trained to focus on empathy and on the diagnosis of emotions and moods. Can a virtual human diagnose depression while talking to someone?
Researchers at the University of Southern California are working on it. Louis-Philippe Morency, Research Assistant Professor at the Institute for Creative Technologies (ICT), specializes in the computational study of nonverbal social communication, a multi-disciplinary research topic that spans the fields of multimodal interaction, computer vision, machine learning, social psychology and artificial intelligence.
The video below shows SimSensei & MultiSense, two interactive technologies recently developed for multimodal perception and healthcare support:[youtube http://youtu.be/ejczMs6b1Q4]
- MultiSense automatically tracks and analyses, in real time, facial expressions, body posture, acoustic features, linguistic patterns and higher-level behaviour descriptors (e.g. attention, fidgeting). From these signals and behaviours, it infers indicators of psychological distress that directly inform the healthcare provider or the virtual human.
- SimSensei is a virtual human platform specifically designed for healthcare support, built on more than ten years of virtual human research and development at ICT. The platform enables an engaging face-to-face interaction in which the virtual human automatically reacts to the perceived user state and intent through its own speech and gestures.
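To get a feel for the "inferring indicators from signals" step, here is a deliberately simplified sketch in Python. This is not ICT's actual algorithm (which relies on computer vision and machine learning models); the signal names, weights, and the idea of a single weighted-average score are all illustrative assumptions, just to show how separately tracked behaviours could be fused into one distress indicator:

```python
# Toy illustration only: combine hypothetical nonverbal signal scores
# (each already normalized to the range 0..1) into a single
# "distress indicator". The real MultiSense system uses trained
# machine-learning models, not a hand-weighted average.

def distress_indicator(signals, weights=None):
    """Return a weighted average of per-signal scores in [0, 1].

    signals: dict mapping a signal name to a score in [0, 1].
    weights: optional dict of relative weights; defaults to equal weights.
    """
    if weights is None:
        weights = {name: 1.0 for name in signals}
    total = sum(weights[name] for name in signals)
    return sum(signals[name] * weights[name] for name in signals) / total

# Hypothetical per-session scores a tracker might produce:
example_signals = {
    "gaze_aversion": 0.7,    # fraction of time looking away
    "fidgeting": 0.4,        # motion-energy score
    "reduced_smiling": 0.6,  # inverse of smile frequency
}

score = distress_indicator(example_signals)
print(f"distress indicator: {score:.2f}")
```

In this toy version a higher score would simply flag the session for the healthcare provider's attention; the point is that each behaviour is measured independently and only the fused indicator is surfaced.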
Awesome technology! But can it replace real humans? What do you think? Share your ideas in the comments.