Because of the increasing presence of intelligent agents in many aspects of human social life, social skills play a vital role in ensuring these systems exhibit acceptable and realistic behavior in social communication. Emotional intelligence is a key component of such social capability, so incorporating emotions into the behavior of intelligent agents is essential. Accordingly, several computational models of emotion have been proposed to develop intelligent agents that exhibit emotional, human-like behavior. However, most current computational models of emotion neglect the dynamic learning of the affective meaning of events from agents' experiences. Such models evaluate events in the environment along emotional dimensions without considering the context of the situation. They also derive agents' emotional states from predefined rules based on psychological theories, disregarding data-driven methods that can learn the relationships between appraisal variables and emotions from natural human data with fewer assumptions about the nature of those relationships. To address these issues, we propose a novel, unified affective-cognitive framework (EIAEC) to facilitate the development of emotion-aware intelligent agents. EIAEC uses appraisal theories to derive the emotional state of the agent in various situations. This paper makes four main contributions: 1- We design an efficient episodic memory that uses events and their conditional contexts to store and retrieve knowledge and experiences. This memory supports emotional expressions and decision-making adapted to the agent's situation. 2- We propose a novel method that learns context-dependent affective values associated with events from the agent's experiences in various contexts; appraisal variables are then derived from the elements and related metadata in episodic memory.
3- We propose a new data-driven method that maps appraisal variables to emotional states. 4- We develop a method that updates the activation values of actions using the agent's emotional states, modeling the influence of emotions on the agent's decision-making. Finally, we simulate driving scenarios in the proposed framework to demonstrate the emotions generated under different situations and conditions, and we show how the proposed method learns the affective meaning of the events and actions used in appraisal computation.
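The data-driven appraisal-to-emotion mapping described above can be illustrated with a minimal sketch. Everything here is hypothetical: the appraisal dimensions (desirability, expectedness, controllability), the labeled exemplars, and the nearest-neighbour mapping are illustrative stand-ins, not the EIAEC method, which would learn such a mapping from annotated human data.

```python
from math import dist

# Hypothetical toy dataset: appraisal vectors (desirability, expectedness,
# controllability) paired with emotion labels. A real model would learn
# this mapping from natural human data, as the abstract describes.
TRAIN = [
    ((0.9, 0.8, 0.7), "joy"),
    ((0.8, 0.1, 0.6), "surprise"),
    ((-0.8, 0.2, 0.1), "fear"),
    ((-0.9, 0.9, 0.2), "sadness"),
    ((-0.7, 0.7, 0.9), "anger"),
]

def predict_emotion(appraisal):
    """1-nearest-neighbour mapping from an appraisal vector to an emotion."""
    return min(TRAIN, key=lambda ex: dist(ex[0], appraisal))[1]

print(predict_emotion((0.85, 0.75, 0.65)))  # nearest to the "joy" exemplar
```

The point of the sketch is only that the appraisal-emotion relationship is induced from labeled examples rather than hand-written rules; any classifier could replace the nearest-neighbour step.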
Neurophysiological measurements, such as electroencephalography (EEG), can be used to derive insight into pilots' mental states during flight training and to track learning progress in order to optimize the training experience for each individual. Prior work has demonstrated that the level of fidelity of a flight simulation (2D Desktop vs. 3D VR) is associated with different cortical activity in relation to task demands. However, it remains unknown whether simulation fidelity affects flight performance, and whether this effect can be observed in EEG neurophysiological responses associated with workload. The current study therefore assessed whether an EEG-based index of workload and task engagement is predictive of performance during flight training in different simulation environments. We conducted a within-subject experiment with 53 novice participants who performed two flight tasks (speed change, medium turn) under two conditions (Desktop vs. VR). EEG signals were collected throughout the experiment to quantify mental workload using the beta-ratio. The VR condition showed increased beta-ratios in all lobes, including frontal and parietal areas, compared to the Desktop simulation. Additionally, we observed an effect of simulator environment on performance, as VR was associated with improved flight performance. However, we found no evidence of a relationship between the beta-ratio and performance. Our findings demonstrate that the brain responds differently to tasks in training environments of varying levels of fidelity. However, more research into the neurometrics of flight training is needed in order to develop brain-computer interfaces that can enhance current pilot training methods by providing personalized real-time feedback.
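A beta-ratio of the kind used in such studies can be sketched as follows. Note the assumptions: the formula used here is the common engagement index beta / (alpha + theta) from the EEG workload literature, and the band boundaries (4-8, 8-13, 13-30 Hz) are conventional choices; the study summarized above may define its index differently.

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Mean periodogram power of `signal` within the [lo, hi) Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].mean()

def beta_ratio(signal, fs):
    """Engagement-style index beta / (alpha + theta); the exact formula
    is an assumption here, not taken from the study."""
    theta = band_power(signal, fs, 4, 8)
    alpha = band_power(signal, fs, 8, 13)
    beta = band_power(signal, fs, 13, 30)
    return beta / (alpha + theta)

# Synthetic check: a dominant 20 Hz (beta) oscillation with a weak 10 Hz
# (alpha) component yields a ratio well above 1.
fs = 256
t = np.arange(fs * 2) / fs
eeg = np.sin(2 * np.pi * 20 * t) + 0.2 * np.sin(2 * np.pi * 10 * t)
print(beta_ratio(eeg, fs) > 1.0)  # True
```

In practice such indices are computed per electrode and time window (e.g. with Welch's method) rather than from a single raw trace.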
Cognition and learning are increasingly modeled as an associative activity of connectionist neural networks. However, only a few such models exist for continuous reading, which involves the delicate coordination of word recognition and eye movements. Moreover, these models are limited to the orthographic level of word processing with predetermined lexicons. Here, we present a conceptual design of a developmentally plausible neural network model of reading designed to simulate word learning, parafoveal preview activation of words, their later foveal recognition including phonological decoding, and forward saccade length as a control mechanism for the intake of new textual information. We discuss the theoretical advancements of the design and avenues for future development.
Emotions can be instrumental in shaping the cognition of an intelligent agent. This work presents another attempt to formalize emotions based on the Ortony-Clore-Collins (OCC) model. Specifically, we are interested in emotions whose appraisal evaluates the consequences for others. The formal modeling framework introduced here is based on the multiagent Affective Probabilistic Logic (AfPL), which allows us to compute the potential of a given emotion, representing the emotion's intensity. The value of this potential allows us to distinguish experienced emotions from mere affective responses using a threshold. The framework describes basic as well as compound emotions. An illustrative practical application scenario in the field of intelligent tutoring is analyzed, demonstrating that the model is robust and practically useful in real-life applications. Broader impact and future research directions are discussed.
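The thresholding step described above can be sketched minimally. The potential formula, the factor names, and the threshold value below are all illustrative assumptions, not AfPL's actual semantics; only the idea that an intensity above a threshold marks an experienced emotion comes from the abstract.

```python
# Hypothetical sketch: an emotion "potential" (intensity) is computed, and
# only potentials at or above a threshold count as an experienced emotion
# rather than a mere affective response.
THRESHOLD = 0.5  # illustrative value

def happy_for_potential(p_event, desirability_for_other, liking):
    """Toy intensity for OCC 'happy-for': the likelihood of the event,
    scaled by its desirability for the other agent and by how much the
    other is liked. The multiplicative form is an assumption."""
    return p_event * desirability_for_other * liking

def classify(potential, threshold=THRESHOLD):
    return "experienced emotion" if potential >= threshold else "affective response"

print(classify(happy_for_potential(0.9, 0.9, 0.8)))  # 0.648 -> experienced emotion
print(classify(happy_for_potential(0.5, 0.6, 0.5)))  # 0.150 -> affective response
```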
This paper explores the potential of adaptive network modeling for joint action and memory recall among the elderly through detecting interpersonal synchrony. As the aging population grows, there is a crucial need to focus on the health and social interaction of older adults. Based on research on the significance of social interaction and memory use for the elderly, as well as the role of interpersonal synchrony in joint action, this paper aims to analyse computationally how to enhance the positive effects of social interactions among older individuals by applying an adaptive network model. The research examines the concept of interpersonal synchrony and its impact on joint action, memory, and emotional well-being in elderly populations. Through simulation experiments and analysis, the study demonstrates the potential benefits of music for memory recall in older adults with cognitive decline, highlighting the importance of social interaction and emotional resonance. This study offers a valuable contribution to understanding and improving social interactions and memory recall among the elderly.