This page is partly deprecated as of January 1st, 2018. You can find updated information at

Affective Learning Companion
The Affective Learning Companion is a flexible research platform for exploring social-emotional skills in human-machine interaction and for understanding how machines can work with people to better meet their needs. The platform enables a computational agent to sense and respond, in real time, to a user's non-verbal emotional cues. It draws on video, postural movements, mouse pressure, physiology, and other behaviors to infer, for example, whether the user is in a high or low state of interest, or feeling frustrated. We recently developed an animated agent that combines non-verbal mirroring (or its absence) with multiple kinds of affective and cognitive support during a frustrating learning episode. The system lets us control factors that were previously impossible to isolate, enabling for the first time the study of how these factors interact in helping learners develop the ability to persevere through frustrating learning episodes.
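The fusion of multiple sensor channels into a discrete learner state, as described above, can be sketched in a simplified form. Everything below is illustrative: the channel names follow the modalities the page lists (video, posture, mouse pressure, physiology), but the feature names, weights, and thresholds are hypothetical and not taken from the actual system.

```python
from dataclasses import dataclass

@dataclass
class AffectSnapshot:
    """One time-slice of normalized sensor features (all values in 0..1).

    Field names are hypothetical stand-ins for the modalities the
    platform senses; the real system's features are not specified here.
    """
    smile_probability: float   # from facial video analysis
    lean_forward: float        # postural movement (1.0 = leaning in)
    mouse_pressure: float      # normalized grip pressure on the mouse
    skin_conductance: float    # physiological arousal signal

def infer_state(s: AffectSnapshot) -> str:
    """Classify the learner's state with a simple weighted fusion.

    High arousal with low valence is read as frustration; high valence
    as interest. Weights and cutoffs are illustrative only.
    """
    arousal = 0.5 * s.skin_conductance + 0.5 * s.mouse_pressure
    valence = 0.6 * s.smile_probability + 0.4 * s.lean_forward
    if arousal > 0.6 and valence < 0.4:
        return "frustrated"
    if valence > 0.5:
        return "high interest"
    return "low interest"
```

For example, a snapshot with heavy mouse pressure, elevated skin conductance, and no smile would map to "frustrated", which in the full system would trigger the agent's affective support behaviors.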

Group Members:

MIT Media Lab, Room E15-419, 20 Ames Street, Cambridge, MA 02139