This page is partly deprecated as of January 1st, 2018. You can find updated information at

Affective Mirror
The Affective Mirror is an attempt to build a fully automated system that intelligently responds to a person's affective state in real time. Current work focuses on building an agent that realistically mirrors a person's facial expression and posture. The agent detects affective cues through a facial-feature tracker and a posture-recognition system developed in the Affective Computing group; based on the affect a person is displaying, such as interest, boredom, frustration, or confusion, the system responds with matching facial affect and/or posture. This project is designed to be integrated into the Learning Companion Project as an early phase of demonstrating rapport-building behaviors between the computer agent and the human learner.
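The detect-then-mirror loop described above can be sketched in a few lines. This is a hypothetical illustration, not the group's actual implementation: the affect categories come from the description, but the response table and the `mirror` function are invented names for the sake of the example.

```python
from enum import Enum


class Affect(Enum):
    """Affective states the description mentions, plus a neutral fallback."""
    INTEREST = "interest"
    BOREDOM = "boredom"
    FRUSTRATION = "frustration"
    CONFUSION = "confusion"
    NEUTRAL = "neutral"


# Hypothetical mapping from a detected affect to the agent's mirrored
# (facial expression, posture) response.
MIRROR_RESPONSES = {
    Affect.INTEREST: ("raised brows, slight smile", "lean forward"),
    Affect.BOREDOM: ("lowered lids, flat mouth", "slump back"),
    Affect.FRUSTRATION: ("furrowed brow, tight lips", "shift in seat"),
    Affect.CONFUSION: ("tilted head, asymmetric brow", "lean in and pause"),
    Affect.NEUTRAL: ("relaxed face", "upright posture"),
}


def mirror(detected: Affect) -> tuple:
    """Return the (face, posture) pair the agent should display to
    mirror the detected affect; fall back to neutral if unknown."""
    return MIRROR_RESPONSES.get(detected, MIRROR_RESPONSES[Affect.NEUTRAL])
```

In a real system, `detected` would come from the facial-feature tracker and posture-recognition pipeline each frame, and the returned pair would drive the agent's rendering.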

Group Members:

[mit][media lab + Room E15-419 + 20 Ames Street + Cambridge, MA 02139]