{"id":303,"date":"2025-10-30T14:01:20","date_gmt":"2025-10-30T14:01:20","guid":{"rendered":"https:\/\/exit.udg.edu\/jonahfernandez\/?page_id=303"},"modified":"2025-11-25T11:09:09","modified_gmt":"2025-11-25T11:09:09","slug":"the-affective-grid","status":"publish","type":"page","link":"https:\/\/exit.udg.edu\/jonahfernandez\/index.php\/the-affective-grid\/","title":{"rendered":"The Affective Grid"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-page\" data-elementor-id=\"303\" class=\"elementor elementor-303\">\n\t\t\t\t\t\t<header class=\"elementor-section elementor-top-section elementor-element elementor-element-638faf5 elementor-section-stretched elementor-section-full_width elementor-section-content-top elementor-section-height-default elementor-section-height-default\" data-id=\"638faf5\" data-element_type=\"section\" data-e-type=\"section\" data-settings=\"{&quot;stretch_section&quot;:&quot;section-stretched&quot;,&quot;background_background&quot;:&quot;classic&quot;}\">\n\t\t\t\t\t\t\t<div class=\"elementor-background-overlay\"><\/div>\n\t\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-1facf8c\" data-id=\"1facf8c\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-9656a7d elementor-widget__width-inherit elementor-widget elementor-widget-heading\" data-id=\"9656a7d\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h1 class=\"elementor-heading-title elementor-size-default\">Jonah Fernandez<\/h1>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/header>\n\t\t\t\t<section class=\"elementor-section elementor-top-section 
elementor-element elementor-element-a484e89 elementor-section-stretched elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"a484e89\" data-element_type=\"section\" data-e-type=\"section\" data-settings=\"{&quot;stretch_section&quot;:&quot;section-stretched&quot;,&quot;background_background&quot;:&quot;classic&quot;}\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-25 elementor-top-column elementor-element elementor-element-8cc4647\" data-id=\"8cc4647\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-23b0a81 elementor-widget elementor-widget-text-editor\" data-id=\"23b0a81\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p style=\"text-align: center;\"><a href=\"https:\/\/exit.udg.edu\/jonahfernandez\/\"><strong>About me<\/strong><\/a><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t<div class=\"elementor-column elementor-col-25 elementor-top-column elementor-element elementor-element-9eb06e6\" data-id=\"9eb06e6\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-e39ee4b elementor-widget elementor-widget-text-editor\" data-id=\"e39ee4b\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p style=\"text-align: center;\"><a 
href=\"https:\/\/exit.udg.edu\/jonahfernandez\/index.php\/publications\/\"><strong>Publications<\/strong><\/a><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t<div class=\"elementor-column elementor-col-25 elementor-top-column elementor-element elementor-element-c5b67d7\" data-id=\"c5b67d7\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-165f4a6 elementor-widget elementor-widget-text-editor\" data-id=\"165f4a6\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p style=\"text-align: center;\"><a href=\"https:\/\/exit.udg.edu\/jonahfernandez\/index.php\/teaching\/\"><strong>Teaching<\/strong><\/a><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t<div class=\"elementor-column elementor-col-25 elementor-top-column elementor-element elementor-element-ddbb461\" data-id=\"ddbb461\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-6e55374 elementor-widget elementor-widget-text-editor\" data-id=\"6e55374\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p style=\"text-align: center;\"><a href=\"https:\/\/exit.udg.edu\/jonahfernandez\/index.php\/the-affective-grid\/\"><b>The Affective Grid<\/b><\/a><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-7c10c35 elementor-section-stretched elementor-section-boxed 
elementor-section-height-default elementor-section-height-default\" data-id=\"7c10c35\" data-element_type=\"section\" data-e-type=\"section\" data-settings=\"{&quot;stretch_section&quot;:&quot;section-stretched&quot;}\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-13a645f\" data-id=\"13a645f\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-4c004e2 elementor-widget elementor-widget-spacer\" data-id=\"4c004e2\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"spacer.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-spacer\">\n\t\t\t<div class=\"elementor-spacer-inner\"><\/div>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-e09cf80 elementor-section-stretched elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"e09cf80\" data-element_type=\"section\" data-e-type=\"section\" data-settings=\"{&quot;stretch_section&quot;:&quot;section-stretched&quot;}\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-83d424d\" data-id=\"83d424d\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-e155125 elementor-widget elementor-widget-heading\" data-id=\"e155125\" data-element_type=\"widget\" data-e-type=\"widget\" 
data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\"><strong><span style=\"color: #6daee3\"><span data-start=\"895\" data-end=\"912\">Th<\/span><span data-start=\"895\" data-end=\"912\">e Affective Grid<\/span><\/span><\/strong><\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-254b2c1 elementor-section-stretched elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"254b2c1\" data-element_type=\"section\" data-e-type=\"section\" data-settings=\"{&quot;stretch_section&quot;:&quot;section-stretched&quot;}\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-7530089\" data-id=\"7530089\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-ffd2e58 elementor-widget elementor-widget-text-editor\" data-id=\"ffd2e58\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><i>A central hub for EEG and emotion research data<\/i><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-3d4f443 elementor-section-stretched elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"3d4f443\" data-element_type=\"section\" data-e-type=\"section\" 
data-settings=\"{&quot;stretch_section&quot;:&quot;section-stretched&quot;}\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-a430275\" data-id=\"a430275\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-7c38b22 elementor-widget elementor-widget-spacer\" data-id=\"7c38b22\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"spacer.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-spacer\">\n\t\t\t<div class=\"elementor-spacer-inner\"><\/div>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-8532a8a elementor-section-stretched elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"8532a8a\" data-element_type=\"section\" data-e-type=\"section\" data-settings=\"{&quot;stretch_section&quot;:&quot;section-stretched&quot;}\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-fcf9550\" data-id=\"fcf9550\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-0916964 elementor-widget elementor-widget-text-editor\" data-id=\"0916964\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p data-start=\"414\" 
data-end=\"893\">Emotions shape how we perceive, decide, and interact with the world. Understanding their neural and physiological foundations has become one of the central challenges in cognitive and affective neuroscience. With the rise of <em data-start=\"639\" data-end=\"660\">affective computing<\/em>\u2014the interdisciplinary field that studies how machines can recognize, interpret, and simulate human emotions\u2014there is a growing need for open, high-quality datasets that bridge psychology, neuroscience, and artificial intelligence.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-84ae4bf elementor-section-stretched elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"84ae4bf\" data-element_type=\"section\" data-e-type=\"section\" data-settings=\"{&quot;stretch_section&quot;:&quot;section-stretched&quot;}\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-3c67a11\" data-id=\"3c67a11\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-2876ec2 elementor-widget elementor-widget-text-editor\" data-id=\"2876ec2\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><strong><span style=\"color: #6daee3;\"><em data-start=\"895\" data-end=\"912\">The Affective Grid<\/em><\/span><\/strong> was created as a hub for open-access datasets and resources related to emotion and affective processes, including neural, physiological, and behavioral recordings. 
Its goal is to make valuable data resources more visible and accessible to researchers, students, and collaborators interested in exploring the neural signatures of emotion.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-16368ce elementor-section-stretched elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"16368ce\" data-element_type=\"section\" data-e-type=\"section\" data-settings=\"{&quot;stretch_section&quot;:&quot;section-stretched&quot;}\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-4889882\" data-id=\"4889882\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-4ca4ba3 elementor-widget elementor-widget-text-editor\" data-id=\"4ca4ba3\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>By curating links to publicly available EEG datasets and complementary materials, <strong><span style=\"color: #6daee3;\"><em data-start=\"895\" data-end=\"912\">The Affective Grid<\/em><\/span><\/strong>\u00a0supports transparency, reproducibility, and cross-disciplinary collaboration. 
It reflects a broader commitment to open science, encouraging the reuse of data for new analyses, comparative studies, and the development of innovative computational models of emotion.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-5f543ba elementor-section-stretched elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"5f543ba\" data-element_type=\"section\" data-e-type=\"section\" data-settings=\"{&quot;stretch_section&quot;:&quot;section-stretched&quot;}\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-a9d64e4\" data-id=\"a9d64e4\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-de20e7b elementor-widget elementor-widget-text-editor\" data-id=\"de20e7b\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Ultimately, <strong><span style=\"color: #6daee3;\"><em data-start=\"895\" data-end=\"912\">The Affective Grid<\/em><\/span><\/strong>\u00a0aims to connect a global community of scientists investigating how emotions emerge in the brain and to contribute to the evolution of affective computing as an open, data-driven discipline.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-db5c1a9 elementor-section-stretched elementor-section-boxed elementor-section-height-default 
elementor-section-height-default\" data-id=\"db5c1a9\" data-element_type=\"section\" data-e-type=\"section\" data-settings=\"{&quot;stretch_section&quot;:&quot;section-stretched&quot;}\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-a056437\" data-id=\"a056437\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-e84f069 elementor-widget elementor-widget-spacer\" data-id=\"e84f069\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"spacer.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-spacer\">\n\t\t\t<div class=\"elementor-spacer-inner\"><\/div>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-b6e896e elementor-section-stretched elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"b6e896e\" data-element_type=\"section\" data-e-type=\"section\" data-settings=\"{&quot;stretch_section&quot;:&quot;section-stretched&quot;}\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-67dd72e\" data-id=\"67dd72e\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-871f558 elementor-widget elementor-widget-text-editor\" data-id=\"871f558\" data-element_type=\"widget\" data-e-type=\"widget\" 
data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<h3><strong><span style=\"color: #6daee3;\">EEG Datasets<\/span><\/strong><\/h3>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-d63cd0d elementor-section-stretched elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"d63cd0d\" data-element_type=\"section\" data-e-type=\"section\" data-settings=\"{&quot;stretch_section&quot;:&quot;section-stretched&quot;}\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-7e95cf8\" data-id=\"7e95cf8\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-931d239 elementor-widget elementor-widget-toggle\" data-id=\"931d239\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"toggle.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-toggle\">\n\t\t\t\t\t\t\t<div class=\"elementor-toggle-item\">\n\t\t\t\t\t<div id=\"elementor-tab-title-1541\" class=\"elementor-tab-title\" data-tab=\"1\" role=\"button\" aria-controls=\"elementor-tab-content-1541\" aria-expanded=\"false\">\n\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon elementor-toggle-icon-left\" aria-hidden=\"true\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon-closed\"><i class=\"fas fa-caret-right\"><\/i><\/span>\n\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon-opened\"><i class=\"elementor-toggle-icon-opened fas 
">
fa-caret-up\"><\/i><\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t\t<\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t<a class=\"elementor-toggle-title\" tabindex=\"0\">SJTU (Shanghai Jiao Tong University)<\/a>\n\t\t\t\t\t<\/div>\n\n\t\t\t\t\t<div id=\"elementor-tab-content-1541\" class=\"elementor-tab-content elementor-clearfix\" data-tab=\"1\" role=\"region\" aria-labelledby=\"elementor-tab-title-1541\"><p data-start=\"739\" data-end=\"810\">The SJTU team has pioneered several widely used EEG emotion datasets, setting a strong foundation for affective computing and brain\u2013emotion research.<\/p><hr \/><ul><li data-start=\"814\" data-end=\"851\"><strong><span style=\"color: #6daee3;\">SEED (SJTU Emotion EEG Dataset)<\/span><\/strong><\/li><li style=\"list-style-type: none;\"><ul data-start=\"854\" data-end=\"1021\"><li data-start=\"854\" data-end=\"930\"><p data-start=\"856\" data-end=\"930\">The original dataset: 62-channel EEG from 15 subjects watching film clips that elicit positive, neutral, and negative emotions.<\/p><\/li><li data-start=\"933\" data-end=\"997\"><p data-start=\"935\" data-end=\"997\">Used extensively for EEG-based emotion recognition research.<\/p><\/li><li data-start=\"1000\" data-end=\"1021\"><p data-start=\"1002\" data-end=\"1021\">Link: <a href=\"https:\/\/bcmi.sjtu.edu.cn\/home\/seed\/seed.html\">SEED Dataset<\/a><\/p><\/li><\/ul><\/li><\/ul><hr \/><ul><li data-start=\"1023\" data-end=\"1195\"><p data-start=\"1025\" data-end=\"1038\"><span style=\"color: #6daee3;\"><strong data-start=\"1025\" data-end=\"1036\">SEED-IV<\/strong><\/span><\/p><ul data-start=\"1041\" data-end=\"1195\"><li data-start=\"1041\" data-end=\"1101\"><p data-start=\"1043\" data-end=\"1101\">Covers four emotional states: happy, sad, neutral, fear.<\/p><\/li><li data-start=\"1104\" data-end=\"1171\"><p data-start=\"1106\" data-end=\"1171\">Provides longer temporal recordings for more detailed analysis.<\/p><\/li><li data-start=\"1174\" data-end=\"1195\"><p data-start=\"1176\" data-end=\"1195\">Link: <a 
href=\"https:\/\/bcmi.sjtu.edu.cn\/home\/seed\/seed-iv.html\">SEED Dataset<\/a><\/p><\/li><\/ul><\/li><\/ul><hr \/><ul><li data-start=\"1197\" data-end=\"1381\"><p data-start=\"1199\" data-end=\"1213\"><span style=\"color: #6daee3;\"><strong data-start=\"1199\" data-end=\"1211\">SEED-VII<\/strong><\/span><\/p><ul data-start=\"1216\" data-end=\"1381\"><li data-start=\"1216\" data-end=\"1281\"><p data-start=\"1218\" data-end=\"1281\">Latest in the SEED series with updated stimuli and protocols.<\/p><\/li><li data-start=\"1284\" data-end=\"1357\"><p data-start=\"1286\" data-end=\"1357\">Includes multimodal recordings (EEG + eye-tracking in some versions).<\/p><\/li><li data-start=\"1360\" data-end=\"1381\"><p data-start=\"1362\" data-end=\"1381\">Link: <a href=\"https:\/\/bcmi.sjtu.edu.cn\/home\/seed\/seed-vii.html\">SEED Dataset<\/a><\/p><\/li><\/ul><\/li><\/ul><\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t<div class=\"elementor-toggle-item\">\n\t\t\t\t\t<div id=\"elementor-tab-title-1542\" class=\"elementor-tab-title\" data-tab=\"2\" role=\"button\" aria-controls=\"elementor-tab-content-1542\" aria-expanded=\"false\">\n\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon elementor-toggle-icon-left\" aria-hidden=\"true\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon-closed\"><i class=\"fas fa-caret-right\"><\/i><\/span>\n\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon-opened\"><i class=\"elementor-toggle-icon-opened fas fa-caret-up\"><\/i><\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t\t<\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t<a class=\"elementor-toggle-title\" tabindex=\"0\">FACED (Tsinghua University)<\/a>\n\t\t\t\t\t<\/div>\n\n\t\t\t\t\t<div id=\"elementor-tab-content-1542\" class=\"elementor-tab-content elementor-clearfix\" data-tab=\"2\" role=\"region\" aria-labelledby=\"elementor-tab-title-1542\"><p>The FACED project introduces one of the most comprehensive EEG datasets focused on fine-grained emotional states, with particular emphasis on positive 
emotions, an area underrepresented in existing affective-computing resources.<\/p><hr \/><ul><li><span style=\"color: #99ccff;\"><strong><span style=\"color: #6daee3;\">FACED (Finer-grained Affective Computing EEG Dataset)<\/span><\/strong><\/span><ul><li>Recorded 32-channel EEG signals from 123 subjects, making it one of the largest EEG emotion datasets available.<\/li><li><p data-start=\"553\" data-end=\"651\">Participants viewed 28 emotion-elicitation video clips spanning nine emotion categories:<\/p><ul data-start=\"654\" data-end=\"773\"><li data-start=\"654\" data-end=\"709\"><p data-start=\"656\" data-end=\"709\"><em data-start=\"656\" data-end=\"666\">Positive<\/em>: amusement, inspiration, joy, tenderness<\/p><\/li><li data-start=\"712\" data-end=\"757\"><p data-start=\"714\" data-end=\"757\"><em data-start=\"714\" data-end=\"724\">Negative<\/em>: anger, fear, disgust, sadness<\/p><\/li><li data-start=\"760\" data-end=\"773\"><p data-start=\"762\" data-end=\"773\"><em data-start=\"762\" data-end=\"771\">Neutral<\/em><\/p><\/li><\/ul><\/li><li data-start=\"774\" data-end=\"872\"><p data-start=\"776\" data-end=\"872\">Designed to provide balanced representation across positive and negative emotional states.<\/p><\/li><li data-start=\"873\" data-end=\"1013\"><p data-start=\"875\" data-end=\"1013\">Enables robust research on both intra-subject and cross-subject affective recognition\u2014a major challenge in EEG emotion modeling.<\/p><\/li><li>Link: <a href=\"https:\/\/www.synapse.org\/Synapse:syn50614194\/wiki\/620378\">FACED &#8211; syn50614194 &#8211; Wiki<\/a><\/li><\/ul><\/li><\/ul><\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t<div class=\"elementor-toggle-item\">\n\t\t\t\t\t<div id=\"elementor-tab-title-1543\" class=\"elementor-tab-title\" data-tab=\"3\" role=\"button\" aria-controls=\"elementor-tab-content-1543\" aria-expanded=\"false\">\n\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon elementor-toggle-icon-left\" 
aria-hidden=\"true\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon-closed\"><i class=\"fas fa-caret-right\"><\/i><\/span>\n\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon-opened\"><i class=\"elementor-toggle-icon-opened fas fa-caret-up\"><\/i><\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t\t<\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t<a class=\"elementor-toggle-title\" tabindex=\"0\">DEAP Consortium (Queen Mary University of London, University of Twente, University of Geneva, EPFL)<\/a>\n\t\t\t\t\t<\/div>\n\n\t\t\t\t\t<div id=\"elementor-tab-content-1543\" class=\"elementor-tab-content elementor-clearfix\" data-tab=\"3\" role=\"region\" aria-labelledby=\"elementor-tab-title-1543\"><p>The DEAP consortium brought together multiple universities to create one of the most comprehensive and influential EEG emotion datasets to date.<\/p><hr \/><ul><li><span style=\"color: #6daee3;\"><strong data-start=\"1375\" data-end=\"1442\">DEAP (Dataset for Emotion Analysis using Physiological signals)<\/strong><\/span><ul><li data-start=\"1449\" data-end=\"1516\">Large-scale collaboration across European universities.<\/li><li data-start=\"1521\" data-end=\"1619\">Contains EEG and peripheral physiological recordings from 32 participants watching music videos.<\/li><li data-start=\"1624\" data-end=\"1695\">Includes self-assessment ratings for valence, arousal, and dominance.<\/li><li data-start=\"1700\" data-end=\"1768\">One of the most cited EEG emotion datasets in affective computing.<\/li><li data-start=\"1773\" data-end=\"1792\">Link:\u00a0<a href=\"https:\/\/eecs.qmul.ac.uk\/mmv\/datasets\/deap\/\">DEAP: A Dataset for Emotion Analysis using Physiological and Audiovisual Signals<\/a><\/li><\/ul><\/li><\/ul><\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t<div class=\"elementor-toggle-item\">\n\t\t\t\t\t<div id=\"elementor-tab-title-1544\" class=\"elementor-tab-title\" data-tab=\"4\" role=\"button\" aria-controls=\"elementor-tab-content-1544\" 
aria-expanded=\"false\">\n\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon elementor-toggle-icon-left\" aria-hidden=\"true\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon-closed\"><i class=\"fas fa-caret-right\"><\/i><\/span>\n\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon-opened\"><i class=\"elementor-toggle-icon-opened fas fa-caret-up\"><\/i><\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t\t<\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t<a class=\"elementor-toggle-title\" tabindex=\"0\">UWS (University of the West of Scotland)<\/a>\n\t\t\t\t\t<\/div>\n\n\t\t\t\t\t<div id=\"elementor-tab-content-1544\" class=\"elementor-tab-content elementor-clearfix\" data-tab=\"4\" role=\"region\" aria-labelledby=\"elementor-tab-title-1544\"><p data-start=\"2092\" data-end=\"2223\">The UWS\u00a0research team developed the DREAMER dataset to advance multimodal emotion recognition using both EEG and ECG signals.<\/p><hr \/><ul data-start=\"2225\" data-end=\"2492\"><li data-start=\"2225\" data-end=\"2492\"><p data-start=\"2227\" data-end=\"2240\"><span style=\"color: #6daee3;\"><strong data-start=\"2227\" data-end=\"2238\">DREAMER<\/strong><\/span><\/p><ul data-start=\"2243\" data-end=\"2492\"><li data-start=\"2243\" data-end=\"2321\"><p data-start=\"2245\" data-end=\"2321\">EEG and ECG recordings from participants viewing emotion-eliciting videos.<\/p><\/li><li data-start=\"2324\" data-end=\"2392\"><p data-start=\"2326\" data-end=\"2392\">Includes subjective ratings for valence, arousal, and dominance.<\/p><\/li><li data-start=\"2395\" data-end=\"2468\"><p data-start=\"2397\" data-end=\"2468\">Ideal for developing models that integrate brain and cardiac signals.<\/p><\/li><li data-start=\"2471\" data-end=\"2492\"><p data-start=\"2473\" data-end=\"2492\">Link:\u00a0<a style=\"background-color: #ffffff;\" href=\"https:\/\/zenodo.org\/records\/546113\">DREAMER: A Database for Emotion Recognition through EEG and ECG Signals from Wireless Low-cost Off-the-Shelf 
Devices<\/a><\/p><\/li><\/ul><\/li><\/ul><\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t<div class=\"elementor-toggle-item\">\n\t\t\t\t\t<div id=\"elementor-tab-title-1545\" class=\"elementor-tab-title\" data-tab=\"5\" role=\"button\" aria-controls=\"elementor-tab-content-1545\" aria-expanded=\"false\">\n\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon elementor-toggle-icon-left\" aria-hidden=\"true\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon-closed\"><i class=\"fas fa-caret-right\"><\/i><\/span>\n\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon-opened\"><i class=\"elementor-toggle-icon-opened fas fa-caret-up\"><\/i><\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t\t<\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t<a class=\"elementor-toggle-title\" tabindex=\"0\">AMIGOS Consortium (Queen Mary University of London, University of Trento)<\/a>\n\t\t\t\t\t<\/div>\n\n\t\t\t\t\t<div id=\"elementor-tab-content-1545\" class=\"elementor-tab-content elementor-clearfix\" data-tab=\"5\" role=\"region\" aria-labelledby=\"elementor-tab-title-1545\"><p data-start=\"2544\" data-end=\"2661\">The AMIGOS team created a dataset to explore emotional dynamics in both individual and group contexts.<\/p><hr \/><ul data-start=\"2663\" data-end=\"2910\"><li data-start=\"2663\" data-end=\"2910\"><p data-start=\"2665\" data-end=\"2677\"><span style=\"color: #6daee3;\"><strong data-start=\"2665\" data-end=\"2675\">AMIGOS<\/strong><\/span><\/p><ul data-start=\"2680\" data-end=\"2910\"><li data-start=\"2680\" data-end=\"2733\"><p data-start=\"2682\" data-end=\"2733\">Multimodal recordings (EEG, ECG, GSR, and video).<\/p><\/li><li data-start=\"2736\" data-end=\"2796\"><p data-start=\"2738\" data-end=\"2796\">Designed for studying emotional and social interactions.<\/p><\/li><li data-start=\"2799\" data-end=\"2886\"><p data-start=\"2801\" data-end=\"2886\">Provides rich data for affective computing and human\u2013computer interaction research.<\/p><\/li><li data-start=\"2889\" 
data-end=\"2910\"><p data-start=\"2891\" data-end=\"2910\">Link: <a href=\"https:\/\/www.eecs.qmul.ac.uk\/mmv\/datasets\/amigos\/\">AMIGOS: A Dataset for Affect, Personality and Mood Research on Individuals and Groups<\/a><\/p><\/li><\/ul><\/li><\/ul><\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-ec7c88d elementor-section-stretched elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"ec7c88d\" data-element_type=\"section\" data-e-type=\"section\" data-settings=\"{&quot;stretch_section&quot;:&quot;section-stretched&quot;}\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-8e5c54b\" data-id=\"8e5c54b\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-d4f12d6 elementor-widget elementor-widget-text-editor\" data-id=\"d4f12d6\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<h3><strong><span style=\"color: #6daee3;\">Emotional Video Databases<\/span><\/strong><\/h3>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-1449f56 elementor-section-stretched elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"1449f56\" data-element_type=\"section\" data-e-type=\"section\" 
data-settings=\"{&quot;stretch_section&quot;:&quot;section-stretched&quot;}\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-421c6ca\" data-id=\"421c6ca\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-1af35f3 elementor-widget elementor-widget-toggle\" data-id=\"1af35f3\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"toggle.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-toggle\">\n\t\t\t\t\t\t\t<div class=\"elementor-toggle-item\">\n\t\t\t\t\t<div id=\"elementor-tab-title-2821\" class=\"elementor-tab-title\" data-tab=\"1\" role=\"button\" aria-controls=\"elementor-tab-content-2821\" aria-expanded=\"false\">\n\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon elementor-toggle-icon-left\" aria-hidden=\"true\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon-closed\"><i class=\"fas fa-caret-right\"><\/i><\/span>\n\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon-opened\"><i class=\"elementor-toggle-icon-opened fas fa-caret-up\"><\/i><\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t\t<\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t<a class=\"elementor-toggle-title\" tabindex=\"0\">FilmStim<\/a>\n\t\t\t\t\t<\/div>\n\n\t\t\t\t\t<div id=\"elementor-tab-content-2821\" class=\"elementor-tab-content elementor-clearfix\" data-tab=\"1\" role=\"region\" aria-labelledby=\"elementor-tab-title-2821\"><p>A curated set of film excerpts validated for emotional elicitation across valence and arousal dimensions. 
Commonly used in affective neuroscience and psychophysiology research.<\/p><hr \/><ul><li style=\"list-style-type: none;\"><ul><li>Link: <a href=\"https:\/\/sites.uclouvain.be\/ipsp\/FilmStim\/\">FilmStim (Schaefer et al.)<\/a><\/li><\/ul><\/li><\/ul><\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t<div class=\"elementor-toggle-item\">\n\t\t\t\t\t<div id=\"elementor-tab-title-2822\" class=\"elementor-tab-title\" data-tab=\"2\" role=\"button\" aria-controls=\"elementor-tab-content-2822\" aria-expanded=\"false\">\n\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon elementor-toggle-icon-left\" aria-hidden=\"true\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon-closed\"><i class=\"fas fa-caret-right\"><\/i><\/span>\n\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon-opened\"><i class=\"elementor-toggle-icon-opened fas fa-caret-up\"><\/i><\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t\t<\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t<a class=\"elementor-toggle-title\" tabindex=\"0\">LIRIS-ACCEDE<\/a>\n\t\t\t\t\t<\/div>\n\n\t\t\t\t\t<div id=\"elementor-tab-content-2822\" class=\"elementor-tab-content elementor-clearfix\" data-tab=\"2\" role=\"region\" aria-labelledby=\"elementor-tab-title-2822\"><p>A large collection of movie excerpts annotated with continuous valence and arousal scores, ideal for machine learning and affective computing applications. 
All excerpts are shared under Creative Commons licenses.<\/p><hr \/><ul><li style=\"list-style-type: none;\"><ul><li>Link: <a href=\"https:\/\/liris-accede.ec-lyon.fr\/\">LIRIS-ACCEDE<\/a><\/li><\/ul><\/li><\/ul><\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t<div class=\"elementor-toggle-item\">\n\t\t\t\t\t<div id=\"elementor-tab-title-2823\" class=\"elementor-tab-title\" data-tab=\"3\" role=\"button\" aria-controls=\"elementor-tab-content-2823\" aria-expanded=\"false\">\n\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon elementor-toggle-icon-left\" aria-hidden=\"true\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon-closed\"><i class=\"fas fa-caret-right\"><\/i><\/span>\n\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon-opened\"><i class=\"elementor-toggle-icon-opened fas fa-caret-up\"><\/i><\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t\t<\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t<a class=\"elementor-toggle-title\" tabindex=\"0\">EmoFilm<\/a>\n\t\t\t\t\t<\/div>\n\n\t\t\t\t\t<div id=\"elementor-tab-content-2823\" class=\"elementor-tab-content elementor-clearfix\" data-tab=\"3\" role=\"region\" aria-labelledby=\"elementor-tab-title-2823\"><p>A multilingual film database offering clips rated for discrete emotions such as anger, happiness, fear, and sadness.<\/p><hr \/><ul><li style=\"list-style-type: none;\"><ul><li>Link: <a href=\"https:\/\/zenodo.org\/records\/7316999\">EmoFilm &#8211; A multilingual emotional speech corpus<\/a><\/li><\/ul><\/li><\/ul><\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t<div class=\"elementor-toggle-item\">\n\t\t\t\t\t<div id=\"elementor-tab-title-2824\" class=\"elementor-tab-title\" data-tab=\"4\" role=\"button\" aria-controls=\"elementor-tab-content-2824\" aria-expanded=\"false\">\n\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon elementor-toggle-icon-left\" aria-hidden=\"true\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon-closed\"><i class=\"fas fa-caret-right\"><\/i><\/span>\n\t\t\t\t\t\t\t\t<span 
class=\"elementor-toggle-icon-opened\"><i class=\"elementor-toggle-icon-opened fas fa-caret-up\"><\/i><\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t\t<\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t<a class=\"elementor-toggle-title\" tabindex=\"0\">Amsterdam Dynamic Facial Expression Set (ADFES)<\/a>\n\t\t\t\t\t<\/div>\n\n\t\t\t\t\t<div id=\"elementor-tab-content-2824\" class=\"elementor-tab-content elementor-clearfix\" data-tab=\"4\" role=\"region\" aria-labelledby=\"elementor-tab-title-2824\"><p>Video clips of actors portraying prototypical facial expressions, useful for emotion perception and recognition studies.<\/p><hr \/><ul><li style=\"list-style-type: none;\"><ul><li>Link: <a href=\"https:\/\/aice.uva.nl\/research-tools\/adfes-stimulus-set\/request-for-use\/request-for-use.html\">Request for use &#8211; Amsterdam Interdisciplinary Centre for Emotion (AICE) &#8211; University of Amsterdam<\/a><\/li><\/ul><\/li><\/ul><\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t<div class=\"elementor-toggle-item\">\n\t\t\t\t\t<div id=\"elementor-tab-title-2825\" class=\"elementor-tab-title\" data-tab=\"5\" role=\"button\" aria-controls=\"elementor-tab-content-2825\" aria-expanded=\"false\">\n\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon elementor-toggle-icon-left\" aria-hidden=\"true\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon-closed\"><i class=\"fas fa-caret-right\"><\/i><\/span>\n\t\t\t\t\t\t\t\t<span class=\"elementor-toggle-icon-opened\"><i class=\"elementor-toggle-icon-opened fas fa-caret-up\"><\/i><\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t\t<\/span>\n\t\t\t\t\t\t\t\t\t\t\t\t<a class=\"elementor-toggle-title\" tabindex=\"0\">EmoReact<\/a>\n\t\t\t\t\t<\/div>\n\n\t\t\t\t\t<div id=\"elementor-tab-content-2825\" class=\"elementor-tab-content elementor-clearfix\" data-tab=\"5\" role=\"region\" aria-labelledby=\"elementor-tab-title-2825\"><p>A large-scale multimodal emotion dataset featuring 1,102 videos of children aged 4\u201314, annotated for 17 affective states including 
six basic and nine complex emotions, as well as valence and neutrality. It is the largest dataset of its kind, enabling rich analysis of emotional expression and development in children.<\/p><hr \/><ul><li style=\"list-style-type: none;\"><ul><li>Link: <a href=\"https:\/\/github.com\/bnojavan\/EmoReact?tab=readme-ov-file\">GitHub &#8211; bnojavan\/EmoReact<\/a><\/li><\/ul><\/li><\/ul><\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-c91a784 elementor-section-stretched elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"c91a784\" data-element_type=\"section\" data-e-type=\"section\" data-settings=\"{&quot;stretch_section&quot;:&quot;section-stretched&quot;}\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-4faad21\" data-id=\"4faad21\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-068c93a elementor-widget elementor-widget-spacer\" data-id=\"068c93a\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"spacer.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-spacer\">\n\t\t\t<div class=\"elementor-spacer-inner\"><\/div>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-a62aa6a elementor-section-stretched elementor-section-boxed elementor-section-height-default elementor-section-height-default\" 
data-id=\"a62aa6a\" data-element_type=\"section\" data-e-type=\"section\" data-settings=\"{&quot;stretch_section&quot;:&quot;section-stretched&quot;}\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-4571e7f\" data-id=\"4571e7f\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-0a90d9a elementor-widget elementor-widget-text-editor\" data-id=\"0a90d9a\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>All datasets are linked to their official sources to ensure proper access and citation. By grouping datasets by their originating team or university,\u00a0<strong><span style=\"color: #6daee3;\"><em data-start=\"895\" data-end=\"912\">The Affective Grid<\/em><\/span><\/strong> makes it easier to explore, compare, and reuse emotion data for studies in affective neuroscience and computing.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>Jonah Fernandez About me Publications Teaching The Affective Grid The Affective Grid A central hub for EEG and emotion research data Emotions shape how we perceive, decide, and interact with the world. Understanding their neural and physiological foundations has become one of the central challenges in cognitive and affective neuroscience. 
With the rise of affective<a class=\"more-link\" href=\"https:\/\/exit.udg.edu\/jonahfernandez\/index.php\/the-affective-grid\/\">Continue reading <span class=\"screen-reader-text\">\u00abThe Affective Grid\u00bb<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-303","page","type-page","status-publish","hentry","entry"],"_links":{"self":[{"href":"https:\/\/exit.udg.edu\/jonahfernandez\/index.php\/wp-json\/wp\/v2\/pages\/303","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/exit.udg.edu\/jonahfernandez\/index.php\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/exit.udg.edu\/jonahfernandez\/index.php\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/exit.udg.edu\/jonahfernandez\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/exit.udg.edu\/jonahfernandez\/index.php\/wp-json\/wp\/v2\/comments?post=303"}],"version-history":[{"count":75,"href":"https:\/\/exit.udg.edu\/jonahfernandez\/index.php\/wp-json\/wp\/v2\/pages\/303\/revisions"}],"predecessor-version":[{"id":427,"href":"https:\/\/exit.udg.edu\/jonahfernandez\/index.php\/wp-json\/wp\/v2\/pages\/303\/revisions\/427"}],"wp:attachment":[{"href":"https:\/\/exit.udg.edu\/jonahfernandez\/index.php\/wp-json\/wp\/v2\/media?parent=303"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}