Jonah Fernandez
The Affective Grid
A central hub for EEG and emotion research data
Emotions shape how we perceive, decide, and interact with the world. Understanding their neural and physiological foundations has become one of the central challenges in cognitive and affective neuroscience. With the rise of affective computing—the interdisciplinary field that studies how machines can recognize, interpret, and simulate human emotions—there is a growing need for open, high-quality datasets that bridge psychology, neuroscience, and artificial intelligence.
The Affective Grid was created as a hub for open-access datasets and resources related to emotion and affective processes, including neural, physiological, and behavioral recordings. Its goal is to make valuable data resources more visible and accessible to researchers, students, and collaborators interested in exploring the neural signatures of emotion.
By curating links to publicly available EEG datasets and complementary materials, The Affective Grid supports transparency, reproducibility, and cross-disciplinary collaboration. It reflects a broader commitment to open science, encouraging the reuse of data for new analyses, comparative studies, and the development of innovative computational models of emotion.
Ultimately, The Affective Grid aims to connect a global community of scientists investigating how emotions emerge in the brain and to contribute to the evolution of affective computing as an open, data-driven discipline.
EEG Datasets
The SJTU team has pioneered several widely used EEG emotion datasets, setting a strong foundation for affective computing and brain–emotion research.
SEED (SJTU Emotion EEG Dataset)
- The original dataset in the series: 15 subjects recorded while watching film clips selected to elicit positive, neutral, and negative emotions.
- Used extensively for EEG-based emotion recognition research.
- Link: SEED Dataset
SEED-IV
- Covers four emotional states: happy, sad, neutral, and fear.
- Provides longer temporal recordings for more detailed analysis.
- Link: SEED Dataset
SEED-VII
- The latest entry in the SEED series, with updated stimuli and protocols spanning seven emotion categories.
- Includes multimodal recordings (EEG plus eye-tracking in some versions).
- Link: SEED Dataset
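Since the SEED recordings are distributed as MATLAB files, they can be inspected directly from Python. The snippet below is a minimal sketch assuming one .mat file per subject and session, with one (channels x samples) array per film-clip trial; the file name and variable layout are illustrative assumptions, so check them against your own copy of the dataset.

```python
from scipy.io import loadmat

# Illustrative path: SEED ships one .mat file per subject and session.
mat = loadmat("seed/1_20131027.mat")

# Each film-clip trial is stored under its own variable name; skip the
# metadata keys that loadmat adds ("__header__", "__version__", ...).
trial_keys = sorted(k for k in mat if not k.startswith("__"))
for key in trial_keys:
    eeg = mat[key]  # expected shape: (62 channels, n_samples) at 200 Hz
    print(key, eeg.shape)
```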
The DEAP consortium brought together multiple universities to create one of the most comprehensive and influential EEG emotion datasets to date.
DEAP (Dataset for Emotion Analysis using Physiological signals)
- Large-scale collaboration across European universities.
- Contains EEG and peripheral physiological recordings from 32 participants watching 40 one-minute music-video excerpts.
- Includes self-assessment ratings for valence, arousal, and dominance.
- One of the most cited EEG emotion datasets in affective computing.
- Link: DEAP: A Dataset for Emotion Analysis using Physiological and Audiovisual Signals
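DEAP's preprocessed release is distributed as one Python pickle file per participant, so a few lines suffice to reach the trial arrays. This is a minimal sketch assuming the data_preprocessed_python archive; the path is an illustrative assumption, and the array shapes follow the dataset's published description.

```python
import pickle

# Illustrative path into the extracted "data_preprocessed_python" archive.
with open("data_preprocessed_python/s01.dat", "rb") as f:
    subject = pickle.load(f, encoding="latin1")  # Python-2-era pickle

data = subject["data"]      # (40 trials, 40 channels, 8064 samples at 128 Hz)
labels = subject["labels"]  # (40 trials, 4 ratings: valence, arousal, dominance, liking)

eeg = data[:, :32, :]       # the first 32 channels are EEG; the rest are peripheral
print(eeg.shape, labels.shape)
```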
The University of the West of Scotland (UWS) research team developed the DREAMER dataset to advance multimodal emotion recognition using both EEG and ECG signals.
DREAMER
- EEG and ECG recordings from 23 participants viewing emotion-eliciting film clips, captured with low-cost, portable devices.
- Includes subjective ratings for valence, arousal, and dominance.
- Ideal for developing models that integrate brain and cardiac signals.
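DREAMER ships as a single MATLAB struct, which scipy can unpack into attribute-style objects. The sketch below is a best-effort illustration: the field names (Data, EEG.stimuli, ECG.stimuli, ScoreValence) follow the dataset's published description but are assumptions to verify against your copy.

```python
from scipy.io import loadmat

# squeeze_me/struct_as_record expose the nested MATLAB struct as attributes.
mat = loadmat("DREAMER.mat", squeeze_me=True, struct_as_record=False)
dreamer = mat["DREAMER"]

subject = dreamer.Data[0]           # first of the 23 participants
eeg_trial = subject.EEG.stimuli[0]  # expected: (n_samples, 14 channels) at 128 Hz
ecg_trial = subject.ECG.stimuli[0]  # expected: (n_samples, 2 leads) at 256 Hz
print(eeg_trial.shape, ecg_trial.shape, subject.ScoreValence[0])
```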
The AMIGOS team created a dataset to explore emotional dynamics in both individual and group contexts.
AMIGOS
- Multimodal recordings (EEG, ECG, GSR, and video).
- Designed for studying emotional and social interactions.
- Provides rich data for affective computing and human–computer interaction research.
- Link: AMIGOS: A Dataset for Affect, Personality and Mood Research on Individuals and Groups
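In multimodal sets like AMIGOS and DREAMER, the modalities are typically recorded at different sampling rates, so a common first step is to resample everything onto a shared time base before fusing features. The helper below is a generic sketch rather than a dataset-specific API; the rates in the example are illustrative only.

```python
import numpy as np

def resample_to(signal: np.ndarray, src_rate: float, dst_rate: float) -> np.ndarray:
    """Linear-interpolation resampling of a 1-D signal, enough to put
    modalities recorded at different rates on a shared time base."""
    src_t = np.arange(len(signal)) / src_rate
    n_out = int(len(signal) * dst_rate / src_rate)
    dst_t = np.arange(n_out) / dst_rate
    return np.interp(dst_t, src_t, signal)

# Illustrative rates; check the dataset documentation for the actual values.
ecg = np.random.randn(256 * 10)           # 10 s of a mock ECG lead at 256 Hz
ecg_128 = resample_to(ecg, 256.0, 128.0)  # aligned to a 128 Hz EEG stream
print(ecg_128.shape)                      # (1280,)
```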
Emotional Video Databases
A curated set of film excerpts validated for emotion elicitation along the valence and arousal dimensions, commonly used in affective neuroscience and psychophysiology research.
LIRIS-ACCEDE: a large collection of movie excerpts annotated with continuous valence and arousal scores, well suited to machine learning and affective computing applications (see the labeling sketch at the end of this section). All excerpts are shared under Creative Commons licenses.
- Link: LIRIS-ACCEDE
A multilingual film database offering clips rated for discrete emotions such as anger, happiness, fear, and sadness.
Video clips of actors portraying prototypical facial expressions, useful for emotion perception and recognition studies.
A large-scale multimodal emotion dataset featuring 1,102 videos of children aged 4–14, annotated for 17 affective states including six basic and nine complex emotions, as well as valence and neutrality. It is the largest dataset of its kind, enabling rich analysis of emotional expression and development in children.
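Continuously annotated corpora such as LIRIS-ACCEDE are often turned into classification targets by splitting the valence-arousal plane into quadrants. The function below is a minimal sketch assuming scores normalized to [0, 1]; the midpoint and label names are arbitrary illustrative choices, not part of any dataset's specification.

```python
def quadrant_label(valence: float, arousal: float, midpoint: float = 0.5) -> str:
    """Map continuous valence/arousal scores (assumed normalized to [0, 1])
    onto the four affective quadrants often used as class labels."""
    v = "high_valence" if valence >= midpoint else "low_valence"
    a = "high_arousal" if arousal >= midpoint else "low_arousal"
    return f"{v}/{a}"

# Example: a pleasant, calm excerpt falls in the high-valence/low-arousal quadrant.
print(quadrant_label(0.8, 0.2))  # -> "high_valence/low_arousal"
```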
All datasets are linked to their official sources to ensure proper access and citation. By grouping datasets by their originating team or institution, The Affective Grid makes it easier to explore, compare, and reuse emotion data for studies in affective neuroscience and computing.