
1. WHAT DOES THE SCIENTIST DO IN “EMOTIONAL AI”?
The scientist and his team are working on a multi-sensory approach to emotional AI, in which facial expressions, body language, voluntary and involuntary movements and gestures, eye movements, speech patterns, intonation, and various linguistic and cultural parameters are observed and analyzed.
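A multi-sensory approach like this combines evidence from several channels rather than relying on any single one. Below is a minimal sketch of one common way to do that, weighted late fusion of per-modality emotion scores; the modality names, weights, and emotion labels are illustrative assumptions, not the team’s actual pipeline.

```python
# Minimal late-fusion sketch: combine per-modality emotion scores
# (face, gait, speech) into a single normalized estimate.
# All names, weights, and labels here are illustrative assumptions.
from dataclasses import dataclass
from typing import Dict, List

EMOTIONS = ["happy", "sad", "angry", "neutral"]

@dataclass
class ModalityReading:
    """Per-emotion scores produced by one sensing channel (e.g. face, gait, speech)."""
    name: str
    scores: Dict[str, float]  # emotion -> confidence, roughly in [0, 1]

def fuse(readings: List[ModalityReading], weights: Dict[str, float]) -> Dict[str, float]:
    """Weighted late fusion: merge per-modality scores into one normalized distribution."""
    fused = {e: 0.0 for e in EMOTIONS}
    for r in readings:
        w = weights.get(r.name, 1.0)
        for e in EMOTIONS:
            fused[e] += w * r.scores.get(e, 0.0)
    norm = sum(fused.values()) or 1.0
    return {e: v / norm for e, v in fused.items()}

if __name__ == "__main__":
    readings = [
        ModalityReading("face",   {"happy": 0.6, "neutral": 0.3, "sad": 0.1}),
        ModalityReading("gait",   {"happy": 0.2, "neutral": 0.5, "sad": 0.3}),
        ModalityReading("speech", {"happy": 0.5, "neutral": 0.4, "angry": 0.1}),
    ]
    weights = {"face": 0.5, "gait": 0.2, "speech": 0.3}  # hypothetical trust per channel
    fused = fuse(readings, weights)
    print(max(fused, key=fused.get), fused)
```

One appeal of late fusion is that each channel stays independent, so a noisy or missing modality (say, an occluded face) degrades the estimate gracefully rather than breaking it.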
2. HOW ARE AI-POWERED TREATMENT PROGRAMS HELPFUL?
AI-powered treatment programs can help assess a person’s mental and emotional health, direct them to appropriate resources, and suggest early support strategies. Talking to an AI can feel less risky and easier than talking to another person. At the same time, AI assistants that analyze nonverbal communication and speech patterns can help human therapists track a patient’s progress between sessions and provide the best possible care.
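As an illustration of what “tracking progress between sessions” could look like in practice, here is a hypothetical sketch that flags a downward mood trend for the human therapist; the 0-10 scoring scale, window size, and threshold are assumptions made for the example, not any specific program’s method.

```python
# Hypothetical sketch: flag a downward trend in daily mood scores
# (e.g. derived from speech patterns or self-report) between sessions.
from statistics import mean

def flag_decline(daily_scores, window: int = 3, drop: float = 1.0) -> bool:
    """Return True if the recent rolling average fell by more than `drop`
    compared to the preceding rolling average (scores on a 0-10 scale)."""
    if len(daily_scores) < 2 * window:
        return False  # not enough data to compare two windows
    earlier = mean(daily_scores[-2 * window:-window])
    recent = mean(daily_scores[-window:])
    return (earlier - recent) > drop

if __name__ == "__main__":
    scores = [7, 7, 6, 6, 4, 3, 3]  # mood trending down between sessions
    print("alert therapist:", flag_decline(scores))
```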
3. WHY ARE SCIENTISTS WORKING ON PROXEMO?
Self-driving cars and other autonomous robots could navigate their physical environments more safely if they could use non-verbal and postural cues to make the same inferences people do. The scientists are working with collaborators to program the brain of a wheeled robot called ProxEmo, which can read people’s body language and gauge their emotions.
“The idea is to build a future where robots can partner with and help humans accomplish their goals and tasks more safely, efficiently and effectively,” said the scientist. “The biggest problem is often people, and our research puts people back into problem solving to create a better world.”
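One way such body-language readings can feed back into navigation is through proxemics: adjusting how much personal space the robot keeps around a person. The sketch below is a minimal illustration in that spirit; the emotion-to-distance mapping and the numbers are assumptions made for the example, not ProxEmo’s published parameters.

```python
# Minimal emotion-aware proxemics sketch: widen the personal-space buffer
# around a pedestrian based on the emotion perceived from their gait.
# Distances and the emotion-to-buffer mapping are illustrative assumptions.
import math

# Hypothetical comfort distances (metres) per perceived emotion.
COMFORT_RADIUS = {
    "happy": 0.8,
    "neutral": 1.0,
    "sad": 1.2,
    "angry": 1.6,
}

def comfort_radius(emotion: str) -> float:
    """Buffer the robot should keep around a person showing this emotion."""
    return COMFORT_RADIUS.get(emotion, 1.0)

def waypoint_is_safe(robot_xy, person_xy, emotion: str) -> bool:
    """True if a candidate waypoint respects the person's emotion-adjusted space."""
    dx = robot_xy[0] - person_xy[0]
    dy = robot_xy[1] - person_xy[1]
    return math.hypot(dx, dy) >= comfort_radius(emotion)

if __name__ == "__main__":
    person = (2.0, 0.0)
    candidate = (1.0, 0.5)  # proposed next waypoint for the robot
    for emotion in ("happy", "angry"):
        ok = waypoint_is_safe(candidate, person, emotion)
        print(f"perceived={emotion!r}: waypoint {'OK' if ok else 'too close, replan'}")
```

The same waypoint can be acceptable near a relaxed person but too close to an agitated one, which is exactly the kind of socially aware adjustment an emotion-reading robot makes possible.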