GIFGIF+ is a dataset of animated GIFs with labels of 17 emotions. It is semi-automatically collected using clustered multi-task learning based on the original crowdsourced GIFGIF dataset. More details can be found in the article:
Eight-Emotion Sentics Data:
This was the first data set generated as part of the MIT Affective Computing Group's research. The research question motivating the collection of this particular data set was: Will physiological signals exhibit characteristic patterns when a person experiences different kinds of emotional feelings? We wanted to know if patterns could be found, day in and day out, for a single individual, that could distinguish a set of eight affective states (in contrast with prior emotion-physiology research, which focused on averaging results from many people over a single session of less than an hour). We wanted to determine if it might be possible to build a wearable computer system that could learn to discriminate an individual's affective patterns based on skin-surface sensing. We did build such a system, which attained 81% classification accuracy among the eight states studied for this data set, separating emotions not only on arousal but also on valence. The pattern recognition aspects of this work are described in more detail in the article:
These data are available to other researchers. The data set consists of measurements of four physiological signals and eight affective states, taken once a day in a session lasting around 25 minutes, for over twenty days of recordings from an individual trying to keep every other aspect of the measurements constant (time of day, electrode placement, eliciting procedure, etc.). The four physiological signals are: blood volume pulse, electromyogram, respiration, and skin conductance. The eight states (from the Clynes sentograph protocol) are: neutral, anger, hate, grief, love, romantic love, joy, and reverence. The data are divided into two overlapping sets, which are described in more detail in Healey's PhD thesis (2000): Wearable and Automotive Systems for the Recognition of Affect from Physiology.
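To make the structure concrete, here is a minimal sketch of how one daily session (four signals plus the elicited state) might be represented and reduced to simple per-signal statistics, a common starting point for the kind of day-to-day pattern recognition described above. The class name, field names, and choice of features are illustrative assumptions, not the released file format:

```python
from dataclasses import dataclass
from statistics import mean, stdev

# The eight states from the Clynes sentograph protocol
EMOTIONS = ["neutral", "anger", "hate", "grief",
            "love", "romantic love", "joy", "reverence"]

@dataclass
class SenticsSession:
    """One ~25-minute daily session: four signals plus the elicited state.

    (Hypothetical container for illustration; the distributed data
    files are described in Healey's thesis.)
    """
    bvp: list   # blood volume pulse samples
    emg: list   # electromyogram samples
    resp: list  # respiration samples
    sc: list    # skin conductance samples
    emotion: str  # one of EMOTIONS

def features(session: SenticsSession) -> dict:
    """Compute per-signal mean and standard deviation (8 features total)."""
    out = {}
    for name in ("bvp", "emg", "resp", "sc"):
        x = getattr(session, name)
        out[name + "_mean"] = mean(x)
        out[name + "_std"] = stdev(x)
    return out

# Tiny synthetic example (not real data)
s = SenticsSession(bvp=[0.1, 0.3, 0.2], emg=[1.0, 1.2, 0.8],
                   resp=[0.5, 0.6, 0.4], sc=[2.0, 2.1, 2.2],
                   emotion="joy")
print(features(s)["sc_mean"])  # mean of [2.0, 2.1, 2.2] -> 2.1
```

A per-day feature vector like this, labeled with the session's state, is the kind of input a classifier could learn from to discriminate one individual's affective patterns.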
Permission is granted to use these data for research purposes. If your research with this data is published, please reference the data as: "Jennifer Healey and Rosalind W. Picard (2002), Eight-emotion Sentics Data, MIT Affective Computing Group, http://affect.media.mit.edu." Alternatively, you can say "the same data collected and used in the paper X" and cite one of our papers that used it.
Earlier publications where we used these data are:
Driver Stress Data:
Four types of physiological sensors were used to monitor drivers in natural driving situations that included rest, city driving, and highway driving tasks. Each drive lasted approximately ninety minutes. Data were recorded using an electromyogram placed on the shoulder, an electrocardiograph rhythm trace, a sensor detecting respiration through chest cavity expansion, and skin conductance sensors on both the hand and the foot. During the drive, several video cameras recorded the driver's facial expression, body motion, and road conditions. These videos were synchronized to the physiological records to provide ground truth for analysis. A set of hand annotations of these videos has been partly created. These data have been made freely available to researchers on PhysioNet, and the annotations will follow when they are completed.
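Synchronizing a video annotation to the physiological records amounts to mapping the annotation's timestamps onto sample indices at the signal's sampling rate. A minimal sketch (the sampling rate, annotation times, and function names here are assumptions for illustration):

```python
def time_to_sample(t_seconds: float, fs_hz: float) -> int:
    """Map a timestamp (seconds from the start of the drive)
    to the nearest sample index of a signal sampled at fs_hz."""
    return round(t_seconds * fs_hz)

def slice_signal(signal, t_start, t_end, fs_hz):
    """Extract the samples spanning an annotated video event."""
    return signal[time_to_sample(t_start, fs_hz):time_to_sample(t_end, fs_hz)]

# e.g. a hypothetical "merging onto highway" annotation from
# 120.0 s to 150.0 s, against a channel sampled at an assumed 31 Hz
fs = 31.0
sc = [0.0] * int(90 * 60 * fs)  # placeholder ~90-minute drive
event = slice_signal(sc, 120.0, 150.0, fs)
print(len(event))  # 30 s * 31 Hz -> 930 samples
```

The same index mapping works in the other direction, converting an interesting stretch of signal back to a video time range for inspection.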
These data are described in more detail in Healey's PhD thesis (2000): Wearable and Automotive Systems for the Recognition of Affect from Physiology (TR 526).
The data can be downloaded from: PhysioNet.
Permission is granted to use these data for research purposes. If your research with this data is published, please reference the data as: "Jennifer Healey and Rosalind W. Picard (2002), Driver Stress Data, MIT Affective Computing Group, http://affect.media.mit.edu."
Some research using this data was published in: