Researchers recently published an article in the journal Scientific Data introducing AFFECT-HRI, a comprehensive new dataset of physiological data labeled with human affect, intended to foster empathetic and responsible human-robot interaction (HRI) with human-like, or anthropomorphic, service robots.
Background
In both human-to-human interactions and HRI, the affective state of individuals is significantly influenced by their counterparts, specifically through verbal communication, vocal tone, gestures, and other non-verbal cues. Human partners can detect these affective states, facilitating empathetic responses.
Although similar empathetic responses are expected from anthropomorphic service robots, which possess human-like features such as hands, eyes, and faces, these robots are not yet capable of demonstrating the same level of empathy as humans. The primary challenge lies in reliably recognizing human emotions—a task that current technology and emotion recognition methods are still struggling to perfect, albeit showing potential for improvement.
However, the datasets currently available for developing emotion recognition capabilities are not suitable for practical robotic applications. These datasets are typically derived from generic human-machine interaction studies, which may not accurately represent the dynamics of HRI. Additionally, recent studies have highlighted that the scarcity of open data, particularly involving physiological metrics, is a significant obstacle to advancing affective computing in HRI contexts.
The AFFECT-HRI Dataset
In this study, researchers introduced a new comprehensive dataset called AFFECT-HRI, aimed at mitigating the existing lack of open data in the field. For the first time, this dataset incorporates labeled physiological data indicative of human affect, including mood and emotion, gathered from a complex HRI study.
The AFFECT-HRI dataset employed a multi-method approach, integrating subjective human-affect assessments with objective physiological sensor data. Additionally, it encompassed participant insights on demographics, affect, and socio-technical questionnaire ratings, complemented by recordings of robot speech and gestures.
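To make this multi-method structure more concrete, the sketch below shows how a table of this kind might be loaded and inspected in Python with pandas. The file name, column names, and label values are illustrative assumptions for the sake of the example, not the published schema of the AFFECT-HRI release.

```python
import pandas as pd

# Hypothetical file name and columns; the actual AFFECT-HRI release may differ.
df = pd.read_csv("affect_hri_participant_001.csv")

# Assumed layout: one row per sensor sample, with a questionnaire-derived
# affect label attached to each experimental condition and scene.
print(df.columns.tolist())
# e.g. ['timestamp', 'eda', 'bvp', 'skin_temp', 'condition', 'scene', 'affect_label']

# Count samples per condition and affect label (illustrative).
print(df.groupby(["condition", "affect_label"]).size())
```

A layout like this keeps the objective sensor streams and the subjective affect labels in one table, which is convenient for training and validating emotion recognition models.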
In the HRI study, 146 participants (60 male, 85 female, and one identifying as diverse), aged 18 to 66, interacted with an anthropomorphic service robot in a complex and realistic retail scenario. Researchers chose physiological signals specifically because of their correlation with human affect.
Physiological signals are also less susceptible to deliberate human manipulation than voice or video data. A realistic retail scenario was chosen as the experimental environment because service robots have significant application potential in such settings.
Previous studies have underscored the importance of integrating expertise from the fields of law, computer science, and psychology in designing responsible, human-centered HRI. Therefore, this study considered five conditions—immoral, moral, liability, transparency, and neutral—encompassing perspectives from these research domains. This approach elicited diverse affective reactions and facilitated interdisciplinary investigations among psychology, law, and computer science.
Specifically, two anthropomorphic service robots were used for the psychology research domain, with each condition comprising three scenes: a handover, a request for sensitive personal information, and a product consultation.
The Experiment Design
The experimental design of the empirical study, involving a complex HRI, followed a three-phase approach: a preparation phase, an experimental manipulation phase, and a post-experimental phase.
In the preparation phase, participants wore an Empatica E4 wristband (E4) on their non-dominant wrist to minimize interference from arm movements. They also completed a pre-questionnaire assessing their current affective state and demographics, and a baseline measurement of their initial physiological state was taken at rest.
During the experimental manipulation phase, participants were first introduced to the experimental area to familiarize themselves with the robot, reducing its impact as an emotional trigger during the actual experiment. They were then instructed to interact with the robot within a designated scenario and complete a shopping list.
After initiating a conversation with the robot, participants engaged in three uninterrupted scenes. In the post-experimental phase, participants were directed to a quiet, separate room to complete the post-questionnaire. A second baseline measurement was then taken, and the E4 wristband was removed. Data collection thus involved two questionnaires and the E4 wristband, which is equipped with several physiological sensors.
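As an illustration of how the two resting baselines might be used when working with such recordings, the following sketch normalizes an electrodermal activity (EDA) trace from the interaction phase against a participant's baseline recording. The 4 Hz EDA sampling rate matches the Empatica E4's specification, but the arrays, durations, and helper function are assumptions for illustration rather than part of the published dataset.

```python
import numpy as np

def baseline_correct(signal: np.ndarray, baseline: np.ndarray) -> np.ndarray:
    """Subtract the mean of a resting-baseline recording from an interaction-phase signal.

    Both arrays are assumed to be 1-D EDA traces in microsiemens,
    sampled at the E4's 4 Hz EDA rate.
    """
    return signal - baseline.mean()

# Illustrative synthetic data standing in for real recordings.
rng = np.random.default_rng(0)
baseline_eda = 0.3 + 0.01 * rng.standard_normal(4 * 120)     # ~2 min resting baseline
interaction_eda = 0.5 + 0.05 * rng.standard_normal(4 * 600)  # ~10 min interaction

corrected = baseline_correct(interaction_eda, baseline_eda)
print(f"Mean EDA change vs. baseline: {corrected.mean():.3f} µS")
```

Baseline correction of this kind is a common first step before extracting affect-related features, since resting physiological levels vary considerably between individuals.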
Significance of the Work
The transparency condition resulted in a more positive and relaxed affect, whereas the liability condition had a notably negative impact on users' affect compared to the neutral condition. In the moral condition, participants exhibited a more relaxed affect compared to both the neutral and immoral conditions.
Additionally, the immoral condition elicited heightened arousal and stress compared to the neutral and moral conditions. The comprehensive HRI study collected physiological sensor data, pre- and post-questionnaire responses, ground-truth affect labels, and robot gesture and speech data.
In summary, the AFFECT-HRI dataset represents the first publicly available dataset that includes physiological data labeled with human affect in the context of HRI. This dataset holds potential for the development of new emotion recognition methods and technological capabilities, as well as the validation of existing approaches.
Journal Reference
Heinisch, J. S., Kirchhoff, J., Busch, P., Wendt, J., von Stryk, O., & David, K. (2024). Physiological data for affective computing in HRI with anthropomorphic service robots: The AFFECT-HRI data set. Scientific Data, 11(1), 1-22. https://doi.org/10.1038/s41597-024-03128-z, https://www.nature.com/articles/s41597-024-03128-z