Title: Complex Affect Recognition in the Wild
Name(s): Nojavanasghari, Behnaz (Author); Hughes, Charles (Committee Chair); Morency, Louis-Philippe (Committee Co-Chair); Sukthankar, Gita (Committee Member); Foroosh, Hassan (Committee Member); University of Central Florida (Degree Grantor)
Type of Resource: text
Date Issued: 2017
Publisher: University of Central Florida
Language(s): English
Abstract/Description: Artificial social intelligence is a step towards human-like human-computer interaction. One important milestone towards building socially intelligent systems is enabling computers to process and interpret the social signals of humans in the real world. Social signals include a wide range of emotional responses, from a simple smile to expressions of complex affects. This dissertation revolves around computational models for social signal processing in the wild, using multimodal signals with an emphasis on the visual modality. We primarily focus on complex affect recognition, with a strong interest in curiosity. In this dissertation, we first present our collected dataset, EmoReact. We provide detailed multimodal behavior analysis across audio-visual signals and present unimodal and multimodal classification models for affect recognition. Second, we present a deep multimodal fusion algorithm that fuses information from the visual, acoustic, and verbal channels to achieve a unified classification result. Third, we present a novel system to synthesize, recognize, and localize facial occlusions. The proposed framework is based on a three-stage process: 1) synthesis of naturalistic occluded faces, including hand-over-face occlusions as well as other common occlusions such as hair bangs, scarves, and hats; 2) recognition of occluded faces and differentiation between hand-over-face and other types of facial occlusions; 3) localization of facial occlusions and identification of the occluded facial regions. The region of facial occlusion plays an important role in recognizing affect, and a shift in location can result in a very different interpretation; e.g., a hand over the chin can indicate contemplation, while a hand over the eyes may show frustration or sadness. Finally, we show the importance of considering facial occlusion type and region in affect recognition by achieving promising results in our experiments.
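To make the multimodal fusion idea in the abstract concrete, the sketch below shows one common pattern for fusing visual, acoustic, and verbal channels into a unified prediction: a small encoder per modality, concatenation, and a joint classification layer. This is a minimal illustration only, not the dissertation's actual architecture; the class name, feature dimensions, and class count are all assumptions.

```python
# Minimal sketch of multimodal fusion for affect classification.
# NOT the dissertation's architecture: all names and sizes are assumptions.
import torch
import torch.nn as nn

class FusionClassifier(nn.Module):
    def __init__(self, visual_dim, acoustic_dim, verbal_dim,
                 hidden_dim=64, num_classes=8):
        super().__init__()
        # One small encoder per modality (dimensions are hypothetical).
        self.visual = nn.Sequential(nn.Linear(visual_dim, hidden_dim), nn.ReLU())
        self.acoustic = nn.Sequential(nn.Linear(acoustic_dim, hidden_dim), nn.ReLU())
        self.verbal = nn.Sequential(nn.Linear(verbal_dim, hidden_dim), nn.ReLU())
        # Joint layer maps the concatenated representations to class scores.
        self.joint = nn.Linear(3 * hidden_dim, num_classes)

    def forward(self, v, a, t):
        # Encode each channel, concatenate, and classify jointly.
        fused = torch.cat([self.visual(v), self.acoustic(a), self.verbal(t)], dim=-1)
        return self.joint(fused)  # one unified prediction across modalities

# Usage with random stand-in features for a batch of 4 clips:
model = FusionClassifier(visual_dim=512, acoustic_dim=128, verbal_dim=300)
logits = model(torch.randn(4, 512), torch.randn(4, 128), torch.randn(4, 300))
```

Fusing learned intermediate representations, rather than averaging per-modality decisions, lets the classifier exploit cross-modal interactions, which is the motivation the abstract gives for a unified classification result.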
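The abstract's point that the occlusion region changes the interpretation of the same occluder can be shown with a toy lookup. Only the two region-to-cue pairs come from the abstract itself (chin indicating contemplation, eyes indicating frustration or sadness); the function name and fallback are hypothetical illustration, not the dissertation's method.

```python
# Toy illustration: a hand over different facial regions suggests
# different affective states. Only the chin and eyes entries come
# from the abstract; everything else is an assumption.
HAND_OVER_FACE_CUES = {
    "chin": "contemplation",
    "eyes": "frustration or sadness",
}

def interpret_hand_occlusion(region: str) -> str:
    """Map an occluded facial region to the coarse affect cue it may signal."""
    return HAND_OVER_FACE_CUES.get(region, "no region-specific cue known")

for region in ("chin", "eyes", "cheek"):
    print(f"hand over {region} -> {interpret_hand_occlusion(region)}")
```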
Identifier: CFE0007291 (IID), ucf:52163 (fedora)
Note(s): 2017-12-01; Ph.D.; Engineering and Computer Science, Computer Science; Doctoral; This record was generated from author-submitted information.
Subject(s): Affect Recognition
Persistent Link to This Record: http://purl.flvc.org/ucf/fd/CFE0007291
Restrictions on Access: campus 2023-06-15
Host Institution: UCF