Current Search: Affective Computing
- Title
- AFFECTIVE DESIGN IN TECHNICAL COMMUNICATION.
- Creator
- Rosen, Michael, Kitalong, Karla, University of Central Florida
- Abstract / Description
- Traditional human-computer interaction (HCI) is based on 'cold' models of user cognition; that is, models of users as purely rational beings based on the information-processing metaphor. An emerging perspective, however, suggests that for the field of HCI to mature, its practitioners must adopt models of users that consider broader human needs and capabilities. Affective design is an umbrella term for research and practice being conducted in diverse domains, all with the common thread of integrating emotional aspects of use into the creation of information products. This thesis provides a review of the current state of the art in affective design research and practice for technical communicators and others involved in traditional HCI and usability enterprises. The work is motivated by developing technologies and the growing complexity of interaction, which demand a more robust notion of HCI that incorporates affect into an augmented and holistic representation of the user and situated use.
- Date Issued
- 2005
- Identifier
- CFE0000590, ucf:46474
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000590
- Title
- Context-Centric Affect Recognition From Paralinguistic Features of Speech.
- Creator
- Marpaung, Andreas, Gonzalez, Avelino, DeMara, Ronald, Sukthankar, Gita, Wu, Annie, Lisetti, Christine, University of Central Florida
- Abstract / Description
- As the field of affect recognition has progressed, many researchers have shifted from unimodal approaches to multimodal ones. In particular, the trend in the paralinguistic speech affect recognition domain has been to integrate other modalities such as facial expression, body posture, gait, and linguistic speech. Our work focuses on integrating contextual knowledge into paralinguistic speech affect recognition. We hypothesize that a framework to recognize affect through paralinguistic features of speech can improve its performance by integrating relevant contextual knowledge. This dissertation describes our research to integrate contextual knowledge into the paralinguistic affect recognition process from acoustic features of speech. We conceived, built, and tested a two-phased system called the Context-Based Paralinguistic Affect Recognition System (CxBPARS). The first phase of this system is context-free and uses an AdaBoost classifier that applies data on acoustic pitch, jitter, shimmer, the Harmonics-to-Noise Ratio (HNR), and the Noise-to-Harmonics Ratio (NHR) to make an initial judgment about the emotion most likely exhibited by the human elicitor. The second phase then adds context modeling to improve upon the context-free classifications from phase I. CxBPARS was inspired by a human-subject study performed as part of this work, in which test subjects were asked to classify an elicitor's emotion strictly from paralinguistic sounds and were then provided with contextual information to improve their selections. CxBPARS was rigorously tested and found, in the worst case, to improve the success rate from the state of the art's 42% to 53%.
- Date Issued
- 2019
- Identifier
- CFE0007836, ucf:52831
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0007836
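As a rough illustration of the two-phase design described in the CxBPARS abstract above, the sketch below trains a context-free AdaBoost classifier on the five named acoustic features and then re-weights its class probabilities with a context prior. The emotion label set, the synthetic data, and the context prior are assumptions for illustration, not the dissertation's actual pipeline or results.

```python
# Hypothetical sketch of a two-phase recognizer in the spirit of CxBPARS.
# Phase 1: a context-free AdaBoost classifier over acoustic features
# (pitch, jitter, shimmer, HNR, NHR). Phase 2: re-weight the phase-1 class
# probabilities with a context prior. All data and priors are stand-ins.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

EMOTIONS = ["anger", "fear", "joy", "sadness", "neutral"]  # assumed label set

def train_phase1(X, y):
    """X: (n_samples, 5) array of [pitch, jitter, shimmer, hnr, nhr]."""
    clf = AdaBoostClassifier(n_estimators=100, random_state=0)
    clf.fit(X, y)
    return clf

def phase2_context(probs, context_prior):
    """Re-weight context-free probabilities with a context prior, renormalize."""
    adjusted = probs * context_prior
    return adjusted / adjusted.sum()

# Synthetic training data, for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = rng.integers(0, len(EMOTIONS), size=200)
clf = train_phase1(X, y)

utterance = rng.normal(size=(1, 5))          # acoustic features of one utterance
p_free = clf.predict_proba(utterance)[0]     # phase 1: context-free judgment
# Example context: a heated argument, so anger and fear get more prior weight.
prior = np.array([0.35, 0.25, 0.10, 0.15, 0.15])
p_ctx = phase2_context(p_free, prior)
print("phase 1:", EMOTIONS[int(np.argmax(p_free))])
print("phase 2:", EMOTIONS[int(np.argmax(p_ctx))])
```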
- Title
- ADAPTIVE INTELLIGENT USER INTERFACES WITH EMOTION RECOGNITION.
- Creator
- Nasoz, Fatma, Lisetti, Christine, University of Central Florida
- Abstract / Description
- The focus of this dissertation is on creating Adaptive Intelligent User Interfaces that facilitate more natural communication during human-computer interaction by recognizing users' affective states (i.e., emotions experienced by the users) and responding to those emotions by adapting to the current situation via an affective user model created for each user. Controlled experiments were designed and conducted in a laboratory environment and in a Virtual Reality environment to collect physiological data signals from participants experiencing specific emotions. Algorithms (k-Nearest Neighbor [KNN], Discriminant Function Analysis [DFA], Marquardt-Backpropagation [MBP], and Resilient Backpropagation [RBP]) were implemented to analyze the collected data signals and to find unique physiological patterns of emotions. The Emotion Elicitation with Movie Clips experiment was conducted to elicit Sadness, Anger, Surprise, Fear, Frustration, and Amusement from participants. Overall, the KNN, DFA, and MBP algorithms recognized emotions with 72.3%, 75.0%, and 84.1% accuracy, respectively. The Driving Simulator experiment was conducted to elicit driving-related emotions and states (panic/fear, frustration/anger, and boredom/sleepiness). The KNN, MBP, and RBP algorithms were used to classify the physiological signals by corresponding emotions; overall, KNN classified these three emotions with 66.3% accuracy, MBP with 76.7%, and RBP with 91.9%. Adaptation of the interface was designed to provide multi-modal feedback to users about their current affective state and to respond to users' negative emotional states in order to decrease the possible negative impacts of those emotions. A Bayesian Belief Network formalization was employed to develop the User Model, enabling the intelligent system to adapt appropriately to the current context and situation by considering user-dependent factors such as personality traits and preferences.
- Date Issued
- 2004
- Identifier
- CFE0000126, ucf:46201
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000126
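One building block in the abstract above is supervised classification of physiological signals by emotion. The fragment below is a minimal sketch of the k-Nearest Neighbor branch using scikit-learn; the three feature channels (skin conductance, heart rate, skin temperature), the synthetic data, and the label set are illustrative assumptions rather than the dissertation's actual protocol.

```python
# Minimal kNN sketch: classify emotions from physiological measurements.
# Feature channels and labels are assumptions; data is synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

EMOTIONS = ["sadness", "anger", "surprise", "fear", "frustration", "amusement"]

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 3))            # columns: [gsr, heart_rate, skin_temp]
y = rng.integers(0, len(EMOTIONS), 300)  # synthetic labels, illustration only

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)

# Scale features so distance is not dominated by one channel, then fit kNN.
model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
model.fit(X_tr, y_tr)
print(f"held-out accuracy: {model.score(X_te, y_te):.2f}")
```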
- Title
- Modeling Learner Mood in Realtime through Biosensors for Intelligent Tutoring Improvements.
- Creator
- Brawner, Keith, Gonzalez, Avelino, Boloni, Ladislau, Georgiopoulos, Michael, Proctor, Michael, Beidel, Deborah, University of Central Florida
- Abstract / Description
- Computer-based instructors, just like their human counterparts, should monitor the emotional and cognitive states of their students in order to adapt instructional technique. Doing so requires a model of student state to be available at run time, but this has historically been difficult. Because people differ, generalized models have been difficult to validate; and because a person's cognitive and affective states vary with time of day and season, individualized models face difficulties of their own. The simultaneous creation and execution of an individualized model, in real time, represents the remaining option for modeling such cognitive and affective states. This dissertation presents and evaluates four different techniques for creating cognitive and affective models on-line and in real time for each individual user, as alternatives to generalized models. Each technique makes predictions and modifies the model in real time, addressing the real-time data-stream problems of infinite length, new-concept detection, and concept change over time. Additionally, with the knowledge that a user is physically present, this work investigates the contribution that the occasional direct user query can add to the overall quality of such models. The research described in this dissertation finds that a reasonable-quality affective model can be created with an infinitesimal amount of time and without "ground truth" knowledge of the user, which is shown across three different emotional states. Creating a cognitive model in the same fashion, however, was not possible via direct AI modeling, even with all of the "ground truth" information available, which is shown across four different cognitive states.
- Date Issued
- 2013
- Identifier
- CFE0004822, ucf:49734
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0004822
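A central idea in the abstract above is building an individualized model on-line, in real time, from a biosensor stream, with the occasional direct user query supplying a label. The sketch below mimics that loop with scikit-learn's incremental SGDClassifier; the sensor features, the three assumed affective states, and the fixed query interval are illustrative assumptions, not the dissertation's techniques.

```python
# Hedged sketch of on-line, individualized affect modeling from a biosensor
# stream: the model is updated incrementally as windows arrive, and the user
# is queried directly for a label only once in a while. Sensor features,
# state names, and the query schedule are assumptions for illustration.
import numpy as np
from sklearn.linear_model import SGDClassifier

STATE_NAMES = ["bored", "engaged", "frustrated"]   # assumed affective states
CLASSES = np.arange(len(STATE_NAMES))

model = SGDClassifier(random_state=0)
rng = np.random.default_rng(2)
initialized = False

def sensor_window():
    """Stand-in for one window of biosensor features (e.g., EEG, GSR, HR)."""
    return rng.normal(size=(1, 4))

def ask_user():
    """Stand-in for the occasional direct user query ('How do you feel?')."""
    return int(rng.integers(len(STATE_NAMES)))

for t in range(500):
    x = sensor_window()
    if initialized:
        state = STATE_NAMES[int(model.predict(x)[0])]  # prediction at every step
    if t % 50 == 0:                                    # occasional user query
        label = ask_user()
        model.partial_fit(x, [label], classes=CLASSES if not initialized else None)
        initialized = True

print("latest prediction:", state)
```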
- Title
- An Exploratory Comparison of a Traditional and an Adaptive Instructional Approach for College Algebra.
- Creator
- Kasha, Ryan, Kincaid, John, Wiegand, Rudolf, Hartshorne, Richard, Morris, Cliff, University of Central Florida
- Abstract / Description
- This research effort compared student learning gains and attitudinal changes under two instructional approaches to the topic of functions in College Algebra. Attitudinal changes were measured with the Attitude Towards Mathematics Inventory (ATMI), which also provided four sub-scale scores for self-confidence, value of learning, enjoyment, and motivation. Furthermore, this research explored and compared relationships between students' level of mastery and their actual level of learning. The study implemented a quasi-experimental research design with a sample of 56 College Algebra students at a public state college in Florida. The sample was enrolled in one of two College Algebra sections: one section followed a self-adaptive instructional approach using ALEKS (Assessment and Learning in Knowledge Spaces), and the other followed a traditional approach using MyMathLab. Learning gains in each class were measured as the difference between pre-test and post-test scores on the topic of functions. Attitude changes in each class were measured as the difference between holistic ATMI scores, as well as each of the four sub-scale scores, administered once at the beginning of the semester and again after the unit on functions, approximately eight weeks into the course. An independent t-test indicated no significant difference in actual learning gains between the compared instructional approaches. Additional independent t-tests likewise indicated no statistically significant difference in attitude change, either holistically or on any of the four sub-scales. However, correlational analyses revealed a strong relationship between students' level of mastery learning and their actual learning level in each class, with the self-adaptive instructional approach showing a stronger correlation than the non-adaptive section, as measured by an r-to-z Fisher transformation test. The results of this study indicate that the self-adaptive instructional approach using ALEKS could more accurately report students' true level of learning than a non-adaptive instructional approach. Overall, this study found the compared instructional approaches to be equivalent in terms of learning and effect on students' attitude. While not statistically different, the results have implications for math educators, instructional designers, and software developers. For example, a non-adaptive instructional approach can be equivalent to a self-adaptive instructional approach in terms of learning, given appropriate planning and design. Future recommendations include further case studies of self-adaptive technology in developmental and college mathematics in other modalities, such as hybrid or on-line courses. This study should also be replicated on a larger scale with other self-adaptive math software and with other student populations, such as K-12. There is much potential for intelligent tutoring to supplement different instructional approaches, but it should not be viewed as a replacement for teacher-to-student interactions.
- Date Issued
- 2015
- Identifier
- CFE0005963, ucf:50821
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0005963
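The comparison of correlations mentioned in the abstract above is typically carried out with Fisher's r-to-z transformation. The sketch below shows the standard two-sample test for independent correlations; the correlation values and group sizes are invented for illustration and are not the study's reported figures.

```python
# Fisher r-to-z comparison of two independent correlations (standard procedure).
# The numbers plugged in at the bottom are made up for illustration.
import numpy as np
from scipy.stats import norm

def compare_correlations(r1, n1, r2, n2):
    """Two-tailed test of H0: rho1 == rho2 for independent samples."""
    z1, z2 = np.arctanh(r1), np.arctanh(r2)          # Fisher r-to-z transform
    se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))    # standard error of z1 - z2
    z_stat = (z1 - z2) / se
    p_value = 2 * norm.sf(abs(z_stat))               # two-tailed p-value
    return z_stat, p_value

# Illustrative numbers only: 28 students per section.
z_stat, p = compare_correlations(r1=0.85, n1=28, r2=0.60, n2=28)
print(f"z = {z_stat:.2f}, two-tailed p = {p:.3f}")
```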
- Title
- TOWARD BUILDING A SOCIAL ROBOT WITH AN EMOTION-BASED INTERNAL CONTROL AND EXTERNAL COMMUNICATION TO ENHANCE HUMAN-ROBOT INTERACTION.
- Creator
- Marpaung, Andreas, Lisetti, Christine, University of Central Florida
- Abstract / Description
- In this thesis, we aim to model some aspects of the functional role of emotions in an autonomous embodied agent. We begin by describing our robotic prototype, Cherry--a robot with the task of being a tour guide and an office assistant for the Computer Science Department at the University of Central Florida. Cherry did not have a formal emotion representation of internal states, but did have the ability to express emotions through her multimodal interface. The thesis presents the results of a survey we performed using our social informatics approach, in which we found that: (1) the idea of having emotions in a robot was warmly accepted by Cherry's users, and (2) the intended users were pleased with our initial interface design and functionalities. Guided by these results, we transferred our previous code to a human-height and more robust robot--Petra, the PeopleBot--where we began to build a formal emotion mechanism and representation for internal states to correspond to the external expressions of Cherry's interface. We describe our overall three-layered architecture and propose the design of the sensory-motor level (the first layer of the architecture), inspired by the Multilevel Process Theory of Emotion on one hand and by hybrid robotic architectures on the other. The sensory-motor level receives and processes incoming stimuli with fuzzy logic and produces emotion-like states without any further willful planning or learning. We discuss how Petra has been equipped with sonar and vision for obstacle avoidance, as well as vision for face recognition, which are used when she roams the hallway to engage in social interactions with humans. We hope that the sensory-motor level in Petra can serve as a foundation for further work in modeling the three-layered architecture of the Emotion State Generator.
- Date Issued
- 2004
- Identifier
- CFE0000286, ucf:46228
- Format
- Document (PDF)
- PURL
- http://purl.flvc.org/ucf/fd/CFE0000286
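The sensory-motor level described in the abstract above maps raw stimuli to emotion-like states with fuzzy logic. Below is a minimal, hand-rolled sketch of that idea; the membership functions, the two stimuli (sonar obstacle distance and face-recognition confidence), and the tiny rule base are assumptions for illustration, not the robot's actual design.

```python
# Hand-rolled fuzzy-logic sketch: fuzzify two stimuli and combine them into
# emotion-like state intensities without planning or learning. Membership
# shapes and rules are illustrative assumptions.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b over the interval [a, c]."""
    return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

def emotion_like_state(obstacle_dist_m, face_confidence):
    # Fuzzify the stimuli.
    obstacle_near = tri(obstacle_dist_m, 0.0, 0.2, 1.0)
    obstacle_far = tri(obstacle_dist_m, 0.5, 2.0, 4.0)
    familiar_face = face_confidence          # already in [0, 1]

    # Simple rule base (min for AND, complement of max for the neutral state).
    fear = obstacle_near
    joy = min(obstacle_far, familiar_face)
    neutral = 1.0 - max(fear, joy)
    return {"fear": fear, "joy": joy, "neutral": neutral}

print(emotion_like_state(obstacle_dist_m=0.3, face_confidence=0.9))
```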