A unified valence scale based on diagnosis of facial expression
Time: Wednesday, October 12th, 4:30pm - 5:30pm EDT
Location: Poster Gallery
Description: Affect-adaptive systems detect the emotional user state, assess it against the current situation, and adjust the interaction accordingly. Tools for real-time emotional state detection, such as the Emotient FACET engine (Littlewort et al., 2011), are based on the analysis of facial expressions. When developing affect-adaptive systems, the output of the diagnostic engine must be mapped onto theoretical models of emotion. The Circumplex Model of Affect (Russell, 1980) describes emotion on two dimensions: valence and arousal. FACET, however, outputs three separate valence classifiers: positive, neutral, and negative valence. The present study aimed to develop an algorithm that converts these three outputs into a unified valence scale. We used FACET to analyze valence-labeled images from the AffectNet database. In a multiple regression analysis, the FACET classifier values predicted the database valence labels and explained 38% of the variance. By inserting classifier values into the resulting regression equation, a unified valence score can be calculated that matches dimensional models of emotion. This research lays the groundwork for adaptation to the emotional user state based on the FACET engine. A future affect-adaptive system can now use FACET to detect the emotional user state on a unified valence dimension, which allows for distinct classification and interpretation of emotions.
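The mapping described above can be sketched in a few lines of Python. This is a minimal illustration, not the study's actual model: the classifier values and valence labels below are made-up placeholder numbers standing in for FACET outputs and AffectNet annotations, and the fitted coefficients are therefore illustrative only. The structure, though, follows the abstract: fit a multiple linear regression from the three classifier values to labeled valence, then plug new classifier values into the regression equation to obtain a single unified valence score.

```python
import numpy as np

# Hypothetical FACET-style classifier outputs per frame:
# columns are (positive, neutral, negative) evidence values.
X = np.array([
    [ 2.1, 0.3, -1.8],
    [ 0.2, 1.5, -0.1],
    [-1.7, 0.4,  2.0],
    [ 1.0, 0.8, -0.9],
    [-0.5, 1.1,  0.7],
])
# Hypothetical ground-truth valence labels on a [-1, 1] scale,
# standing in for the AffectNet annotations used in the study.
y = np.array([0.8, 0.1, -0.9, 0.5, -0.2])

# Multiple linear regression: intercept plus one weight per classifier,
# estimated by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])     # prepend intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # [b0, b_pos, b_neu, b_neg]

def unified_valence(pos, neu, neg):
    """Map three classifier values onto a single valence score by
    inserting them into the fitted regression equation."""
    return coef[0] + coef[1] * pos + coef[2] * neu + coef[3] * neg

# A frame with strong positive evidence should score higher than one
# with strong negative evidence.
print(unified_valence(2.1, 0.3, -1.8) > unified_valence(-1.7, 0.4, 2.0))
```

In a real pipeline the fit would of course use the full set of FACET-analyzed AffectNet images rather than five rows, and the proportion of explained variance (R²) would be checked against the reported 38%.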