Feeling Positive? Predicting Emotional Image Similarity from Brain Signals
Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review
Standard
Feeling Positive? Predicting Emotional Image Similarity from Brain Signals. / Ruotsalo, Tuukka; Mäkelä, Kalle; Spapé, Michiel M.; Leiva, Luis A.
MM 2023 - Proceedings of the 31st ACM International Conference on Multimedia. Association for Computing Machinery, Inc., 2023. p. 5870-5878.
RIS
TY - GEN
T1 - Feeling Positive? Predicting Emotional Image Similarity from Brain Signals
AU - Ruotsalo, Tuukka
AU - Mäkelä, Kalle
AU - Spapé, Michiel M.
AU - Leiva, Luis A.
N1 - Publisher Copyright: © 2023 Owner/Author.
PY - 2023
Y1 - 2023
N2 - The present notion of visual similarity is based on features derived from image contents. This ignores the users' emotional or affective experiences toward the content, and how users feel when they search for images. Here we consider valence, a positive or negative quantification of affective appraisal, as a novel dimension of image similarity. We report the largest neuroimaging experiment that quantifies and predicts the valence of visual content by using functional near-infrared spectroscopy from brain-computer interfacing. We show that affective similarity can be (1) decoded directly from brain signals in response to visual stimuli, (2) utilized for predicting affective image similarity with an average accuracy of 0.58 and an accuracy of 0.65 for high-arousal stimuli, and (3) effectively used to complement affective similarity estimates of content-based models; for example, when fNIRS and image rankings are fused, the retrieval F-measure@20 is 0.70. Our work opens new research avenues for affective multimedia analysis, retrieval, and user modeling.
KW - affective computing
KW - bci
KW - ranking relevance
U2 - 10.1145/3581783.3613442
DO - 10.1145/3581783.3613442
M3 - Article in proceedings
AN - SCOPUS:85170399896
SP - 5870
EP - 5878
BT - MM 2023 - Proceedings of the 31st ACM International Conference on Multimedia
PB - Association for Computing Machinery, Inc.
T2 - 31st ACM International Conference on Multimedia, MM 2023
Y2 - 29 October 2023 through 3 November 2023
ER -