This page is deprecated as of January 1st, 2018. You can find current information on our group at


Applications for Affective Computing

Once the affective computing system has sensed the user's biosignals and recognized the patterns inherent in those signals, the system's Understanding module assimilates the data into its model of the user's emotional experience. The system can then communicate useful information about the user to applications that can act on it. What might affective computing applications look like, and how would they differ from current software applications?
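The sense-recognize-understand flow described above can be sketched in code. This is a minimal illustration under stated assumptions: all names here (`sense`, `recognize`, `understand`, `EmotionalState`) are hypothetical, and the threshold classifier is a stand-in for whatever pattern recognition a real system would use.

```python
from dataclasses import dataclass

@dataclass
class EmotionalState:
    label: str        # e.g. "frustrated", "interested"
    confidence: float

def sense(raw_samples):
    """Sensing stage: reduce raw biosignal samples to simple features."""
    mean = sum(raw_samples) / len(raw_samples)
    return {"mean_level": mean}

def recognize(features):
    """Pattern recognition stage: map features to a candidate emotion.
    A real system would use a trained classifier; this fixed threshold
    is only an illustrative stand-in."""
    if features["mean_level"] > 0.7:
        return EmotionalState("frustrated", 0.8)
    return EmotionalState("interested", 0.6)

def understand(state, user_model):
    """Understanding stage: fold the recognized state into a running
    model of the user's emotional experience."""
    user_model.setdefault("history", []).append(state.label)
    user_model["current"] = state
    return user_model

# An application queries the user model, not the raw biosignals.
model = understand(recognize(sense([0.9, 0.8, 0.85])), {})
print(model["current"].label)  # prints "frustrated"
```

The point of the layering is that applications never touch raw signals; they consume the Understanding module's interpretation.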

Perhaps the most fundamental application of affective computing will be to inform next-generation human interfaces that can recognize, and respond to, the emotional states of their users. Users who become frustrated or annoyed with a product would "send out signals" to the computer, at which point the application might respond in a variety of ways -- ideally in ways the user would see as "intuitive". For example, a computer piano tutor might change its pace and presentation based on naturally expressed signals that the user is interested, bored, confused, or frustrated. A video retrieval system might identify not just scenes featuring a particular actor or setting, but scenes with a particular emotional content: fast-forward to the "most exciting" scenes. Your wearable computer could pay attention to the things that increase your stress, helping you learn better strategies to protect your health when needed -- practicing preventive medicine.
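The piano-tutor example above can be made concrete with a small sketch. The state labels and pacing rules here are illustrative assumptions, not a documented design of any actual tutor.

```python
def next_lesson_pace(current_pace, affect):
    """Adapt lesson pace to a recognized affective state:
    slow down for confusion or frustration, speed up for boredom,
    hold steady when the user is interested."""
    if affect in ("confused", "frustrated"):
        return max(1, current_pace - 1)   # ease off the difficulty
    if affect == "bored":
        return current_pace + 1           # raise the challenge
    return current_pace                   # keep the current pace

print(next_lesson_pace(3, "frustrated"))  # prints 2
print(next_lesson_pace(3, "bored"))       # prints 4
```

The same shape of decision rule applies to the other examples: the application reads a recognized state and adjusts its own behavior accordingly.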
Beyond this leap in software's ability to respond sensitively to the user, affective computing lends itself immediately to a host of applications, a number of which are described below.

Current projects

MIT Media Lab, Room E15-419, 20 Ames Street, Cambridge, MA 02139