GVC Emotion Recognition
In many GVC applications the first step is emotion recognition: the user speaks for a few seconds, the GVC Emotion Recognition algorithm measures hundreds of acoustic properties of the user's voice, and it distills from these cues an assessment of the user's emotional state.
What are the acoustic cues?
The acoustic cues that we measure are numerous: static and dynamic properties of pitch, intensity, resonances, dullness, sharpness, softness, tempo, and phrasing.
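As a concrete illustration, the sketch below extracts a few such cues (pitch, intensity, and spectral brightness as a rough correlate of sharpness vs. dullness) from a short recording and summarizes each with a static (mean) and a dynamic (standard deviation) statistic. It uses the open-source librosa library; the file name and the particular cue selection are illustrative assumptions, not the actual GVC feature set.

```python
# Sketch: summarizing a few acoustic cues with librosa
# (illustrative choices, not the actual GVC feature set).
import numpy as np
import librosa

def extract_cues(path):
    y, sr = librosa.load(path, sr=16000)       # a few seconds of the user's voice

    # Pitch contour (fundamental frequency), estimated with pYIN.
    f0, voiced, _ = librosa.pyin(y, fmin=65.0, fmax=400.0, sr=sr)
    f0 = f0[~np.isnan(f0)]                     # keep voiced frames only

    # Intensity (frame-wise RMS energy) and spectral brightness
    # (centroid, a rough correlate of "sharp" vs. "dull" timbre).
    rms = librosa.feature.rms(y=y)[0]
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)[0]

    # Static (mean) and dynamic (std) summary of each cue series.
    cues = {}
    for name, series in [("pitch", f0), ("intensity", rms), ("brightness", centroid)]:
        cues[f"{name}_mean"] = float(np.mean(series))
        cues[f"{name}_std"] = float(np.std(series))
    return cues

print(extract_cues("user_utterance.wav"))
```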
How does GVC know how to map acoustic cues to emotions?
There are two ways in which the GVC Emotion Recognition software can learn to interpret the acoustic cues as emotions.
The slow way is to crowd-source knowledge from the users of a basic emotion recognition app. Users provide self-reports, e.g. by judging whether the software recognized the appropriate emotion (answering yes or no). On the basis of these self-reports, GVC's home server gradually adapts its knowledge of the association between acoustic cues and emotional states. What emerges is a language-independent, culture-independent, and sex-independent body of knowledge about the emotions-acoustics interface.
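One plausible way to realize this gradual adaptation, sketched below under assumptions of our own (the prototype representation and the update rule are illustrative, not GVC's actual server logic), is to keep one prototype cue vector per emotion and nudge it toward an observed cue vector whenever a user confirms the recognition:

```python
import numpy as np

# Sketch of server-side adaptation from yes/no self-reports.
# Assumption: one prototype cue vector per emotion; a confirmed
# recognition pulls that prototype slightly toward the observed cues.
prototypes = {
    "happy": np.zeros(6),
    "sad": np.zeros(6),
    "neutral": np.zeros(6),
}

def recognize(cue_vector):
    """Return the emotion whose prototype is closest to the cues."""
    return min(prototypes, key=lambda e: np.linalg.norm(prototypes[e] - cue_vector))

def report(cue_vector, recognized, confirmed, lr=0.01):
    """Incorporate a single self-report ('yes'/'no' on the recognition)."""
    if confirmed:
        # Move the confirmed emotion's prototype toward these cues.
        prototypes[recognized] += lr * (cue_vector - prototypes[recognized])
    else:
        # A 'no' is weaker evidence: push the prototype slightly away.
        prototypes[recognized] -= 0.5 * lr * (cue_vector - prototypes[recognized])

# Example round-trip for one user interaction:
cues = np.random.rand(6)              # cue vector from the acoustic front end
guess = recognize(cues)
report(cues, guess, confirmed=True)   # user answered "yes"
```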
The fast way is to learn from the peculiarities of the user's own voice. During the first weeks and months of use, the app learns the user's average voice settings and learns to detect the smaller hour-to-hour and day-to-day changes that reflect the user's emotional state.
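A minimal sketch of such per-user calibration, again under our own assumptions (the smoothing constant and cue-vector format are illustrative), is to track a running mean and variance of the user's cues and score each new utterance as a deviation from that personal baseline:

```python
import numpy as np

class UserBaseline:
    """Running per-user baseline of acoustic cues (illustrative sketch).

    Tracks an exponentially weighted mean and variance of the cue
    vector, so that new utterances can be scored as deviations from
    the user's own average voice settings.
    """

    def __init__(self, dim, alpha=0.02):
        self.alpha = alpha      # smoothing constant (assumed value)
        self.mean = np.zeros(dim)
        self.var = np.ones(dim)

    def update(self, cues):
        delta = cues - self.mean
        self.mean += self.alpha * delta
        self.var = (1 - self.alpha) * (self.var + self.alpha * delta**2)

    def deviation(self, cues):
        """Z-scored deviation of this utterance from the user's baseline."""
        return (cues - self.mean) / np.sqrt(self.var + 1e-8)
```

Large entries in deviation(...) are exactly the hour-to-hour and day-to-day shifts the per-user model is listening for.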
What is the immediate use of good emotion recognition?
An app that understands how you are currently feeling can interact with you in a much more natural and harmonious way than an app that stays ignorant of your mood. This has immediate consequences in robotics.
What else can we expect from having good emotion recognition?
We can feed the results of our emotion recognition algorithm into algorithms that choose appropriate feedback for the user. At GVC we are primarily interested in the kinds of feedback that improve the user's performance and quality of life.
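As a toy illustration of that second stage (the emotion labels, feedback texts, and confidence threshold below are all assumptions for the sketch, not GVC's actual feedback policy):

```python
# Toy sketch: choosing feedback from a recognized emotion.
# Labels, messages, and threshold are illustrative assumptions.
FEEDBACK = {
    "stressed": "Your voice sounds tense. Consider a short breathing break.",
    "tired": "You sound fatigued. A brief pause may help your performance.",
    "happy": "You sound upbeat. Good moment to tackle a demanding task.",
}

def choose_feedback(emotion, confidence, threshold=0.7):
    """Return feedback only when the recognition is confident enough."""
    if confidence < threshold:
        return None  # stay silent rather than risk mood-inappropriate feedback
    return FEEDBACK.get(emotion)

print(choose_feedback("stressed", 0.85))
```

Gating the feedback on recognition confidence is one way to keep a mood-aware app from responding to a mood it has misread.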