===== Papers =====

=== KES2024 ===
  * J. Ignatowicz, K. Kutt, and G. J. Nalepa, “**Evaluation and Comparison of Emotionally Evocative Image Augmentation Methods**,” //Procedia Computer Science//, vol. 246, pp. 3073–3082, 2024.
    * DOI: [[https://doi.org/10.1016/j.procs.2024.09.365|10.1016/j.procs.2024.09.365]]
    * [[https://doi.org/10.1016/j.procs.2024.09.365|Full text available online]]
    * ++Abstract | Experiments in affective computing are based on stimulus datasets that, in the process of standardization, receive metadata describing which emotions each stimulus evokes. In this paper, we explore an approach to creating stimulus datasets for affective computing using generative adversarial networks (GANs). Traditional dataset preparation methods are costly and time consuming, prompting our investigation of alternatives. We conducted experiments with various GAN architectures, including Deep Convolutional GAN, Conditional GAN, Auxiliary Classifier GAN, Progressive Augmentation GAN, and Wasserstein GAN, alongside data augmentation and transfer learning techniques. Our findings highlight promising advances in the generation of emotionally evocative synthetic images, suggesting significant potential for future research and improvements in this domain.++

=== IWINAC2024a ===
  * K. Kutt and G. J. Nalepa, “**Emotion Prediction in Real-Life Scenarios: On the Way to the BIRAFFE3 Dataset**,” in //Artificial Intelligence for Neuroscience and Emotional Systems. IWINAC 2024//, J. M. Ferrández Vicente, M. Val Calvo, and H. Adeli, Eds., Cham: Springer, 2024, pp. 465–475.
    * DOI: [[https://doi.org/10.1007/978-3-031-61140-7_44|10.1007/978-3-031-61140-7_44]]
    * [[https://www.researchgate.net/publication/381032511_Emotion_Prediction_in_Real-Life_Scenarios_On_the_Way_to_the_BIRAFFE3_Dataset|Full text available @ResearchGate]]
    * ++Abstract | Despite over 20 years of research in affective computing, emotion prediction models that would be useful in real-life out-of-the-lab scenarios such as health care or intelligent assistants have still not been developed. The identification of the fundamental problems behind this concern led to the initiation of the BIRAFFE series of experiments, whose main goal is to develop a set of techniques, tools and good practices to introduce personalized context-based emotion processing modules in intelligent systems/assistants. The aim of this work is to present the work-in-progress concept of the third experiment in the BIRAFFE series and discuss the results of the pilot study. After all conclusions have been drawn up, actual study will be carried out, and then the collected data will be processed and made available under the creative commons license as BIRAFFE3 dataset.++

=== IWINAC2024b ===
  * K. Kutt, M. Kutt, B. Kawa, and G. J. Nalepa, “**Human-in-the-Loop for Personality Dynamics: Proposal of a New Research Approach**,” in //Artificial Intelligence for Neuroscience and Emotional Systems. IWINAC 2024//, J. M. Ferrández Vicente, M. Val Calvo, and H. Adeli, Eds., Cham: Springer, 2024, pp. 455–464.
    * DOI: [[https://doi.org/10.1007/978-3-031-61140-7_43|10.1007/978-3-031-61140-7_43]]
    * [[https://www.researchgate.net/publication/381035542_Human-in-the-Loop_for_Personality_Dynamics_Proposal_of_a_New_Research_Approach|Full text available @ResearchGate]]
    * ++Abstract | In recent years, one can observe an increasing interest in dynamic models in the personality psychology research. Opposed to the traditional paradigm—in which personality is recognized as a set of several permanent dispositions called traits—dynamic approaches treat it as a complex system based on feedback loops between individual and the environment. The growing attention to dynamic models entails the need for appropriate modelling tools. In this conceptual paper we address this demand by proposing a new approach called personality-in-the-loop, which combines state-of-the-art psychological models with the human-in-the-loop approach used in the design of intelligent systems. This new approach has a potential to open new research directions including the development of new experimental frameworks for research in personality psychology, based on simulations and methods used in the design of intelligent systems. It will also enable the development of new dynamic models of personality in silico. Finally, the proposed approach extends the field of intelligent systems design with new possibilities for processing personality-related data in these systems.++

=== DSAA2023 ===
  * K. Kutt, Ł. Ściga, and G. J. Nalepa, “**Emotion-based Dynamic Difficulty Adjustment in Video Games**,” in //2023 IEEE International Conference on Data Science and Advanced Analytics (DSAA)//, 2023, pp. 1–5.
    * DOI: [[https://doi.org/10.1109/DSAA60987.2023.10302578|10.1109/DSAA60987.2023.10302578]]
    * ++Abstract | Current review papers in the area of Affective Computing and Affective Gaming point to a number of issues with using their methods in out-of-the-lab scenarios, making them virtually impossible to be deployed. On the contrary, we present a game that serves as a proof-of-concept designed to demonstrate that—being aware of all the limitations and addressing them accordingly—it is possible to create a product that works in-the-wild. A key contribution is the development of a dynamic game adaptation algorithm based on the real-time analysis of emotions from facial expressions. The obtained results are promising, indicating the success in delivering a good game experience.++

=== InfFusion2023 ===
  * J. M. Górriz //et al.//, “**Computational approaches to Explainable Artificial Intelligence: Advances in theory, applications and trends**,” //Information Fusion//, vol. 100, p. 101945, 2023.
    * DOI: [[https://doi.org/10.1016/j.inffus.2023.101945|10.1016/j.inffus.2023.101945]]
    * [[https://doi.org/10.1016/j.inffus.2023.101945|Full text available online]]
    * ++Abstract | Deep Learning (DL), a groundbreaking branch of Machine Learning (ML), has emerged as a driving force in both theoretical and applied Artificial Intelligence (AI). DL algorithms, rooted in complex and non-linear artificial neural systems, excel at extracting high-level features from data. DL has demonstrated human-level performance in real-world tasks, including clinical diagnostics, and has unlocked solutions to previously intractable problems in virtual agent design, robotics, genomics, neuroimaging, computer vision, and industrial automation. In this paper, the most relevant advances from the last few years in Artificial Intelligence (AI) and several applications to neuroscience, neuroimaging, computer vision, and robotics are presented, reviewed and discussed. In this way, we summarize the state-of-the-art in AI methods, models and applications within a collection of works presented at the 9th International Conference on the Interplay between Natural and Artificial Computation (IWINAC). The works presented in this paper are excellent examples of new scientific discoveries made in laboratories that have successfully transitioned to real-life applications.++

=== SciData2022 ===
  * K. Kutt, D. Drążyk, L. Żuchowska, M. Szelążek, S. Bobek, and G. J. Nalepa, “**BIRAFFE2, a multimodal dataset for emotion-based personalization in rich affective game environments**,” //Scientific Data//, vol. 9, no. 1, p. 274, 2022.
    * DOI: [[https://doi.org/10.1038/s41597-022-01402-6|10.1038/s41597-022-01402-6]]
    * [[https://doi.org/10.1038/s41597-022-01402-6|Full text available online]]
    * ++Abstract | Generic emotion prediction models based on physiological data developed in the field of affective computing apparently are not robust enough. To improve their effectiveness, one needs to personalize them to specific individuals and incorporate broader contextual information. To address the lack of relevant datasets, we propose the 2nd Study in Bio-Reactions and Faces for Emotion-based Personalization for AI Systems (BIRAFFE2) dataset. In addition to the classical procedure in the stimulus-appraisal paradigm, it also contains data from an affective gaming session in which a range of contextual data was collected from the game environment. This is complemented by accelerometer, ECG and EDA signals, participants’ facial expression data, together with personality and game engagement questionnaires. The dataset was collected on 102 participants. Its potential usefulness is presented by validating the correctness of the contextual data and indicating the relationships between personality and participants’ emotions and between personality and physiological signals.++

=== AfCAI2022 ===
  * K. Kutt, P. Sobczyk, and G. J. Nalepa, “**Evaluation of Selected APIs for Emotion Recognition from Facial Expressions**,” in //Bio-inspired Systems and Applications: from Robotics to Ambient Intelligence. IWINAC 2022 Proceedings, Part II//, 2022, pp. 65–74.
    * {{ :pub:kkt2022afcai.pdf |Full text draft available}}
    * ++Abstract | Facial expressions convey the vast majority of the emotional information contained in social utterances. From the point of view of affective intelligent systems, it is therefore important to develop appropriate emotion recognition models based on facial images. As a result of the high interest of the research and industrial community in this problem, many ready-to-use tools are being developed, which can be used via suitable web APIs. In this paper, two of the most popular APIs were tested: Microsoft Face API and Kairos Emotion Analysis API. The evaluation was performed on images representing 8 emotions—anger, contempt, disgust, fear, joy, sadness, surprise and neutral—distributed in 4 benchmark datasets: Cohn-Kanade (CK), Extended Cohn-Kanade (CK+), Amsterdam Dynamic Facial Expression Set (ADFES) and Radboud Faces Database (RaFD). The results indicated a significant advantage of the Microsoft API in the accuracy of emotion recognition both in photos taken en face and at a 45° angle. Microsoft’s API also has an advantage in the larger number of recognised emotions: contempt and neutral are also included.++

=== MRC2021b ===