Analysis of Emotions Using Multimodal Data: A Case Study

Abstract

In this case study, we hypothesized that sympathetic nerve activity would be higher during conversation with the PALRO robot, and that conversation would increase cerebral blood flow near Broca’s area. The facial expressions of a human subject were recorded, and cerebral blood flow and heart rate variability were measured during interactions with the humanoid robot. These multimodal data were time-synchronized to quantitatively verify changes from the resting baseline in facial expression, cerebral blood flow, and heart rate variability. In conclusion, the subject’s data indicated that sympathetic nervous activity was dominant, suggesting that the subject may have enjoyed and been excited while talking to the robot (normalized High Frequency < normalized Low Frequency: 0.22 ± 0.16 < 0.78 ± 0.16). Cerebral blood flow values were higher during conversation and in the resting state after the experiment than in the resting state before the experiment; talking thus increased cerebral blood flow in the frontal region. As the subject was left-handed, it was confirmed that the right side of the brain, where Broca’s area is located, was particularly activated (left < right: 0.15 ± 0.21 < 1.25 ± 0.17). In the sections where a “happy” facial emotion was recognized, the examiner-judged “happy” faces and the MTCNN “happy” results were generally consistent.


1. Introduction

Japan’s hyper-aged society requires medical professionals to improve the quality of care for older persons. The shortage of nursing staff in chronic care wards has become a problem due to the aging population and extremely low birth rate. Social support robots have therefore been developed recently to observe the health status of older people and to manage their health [1]. In Japan, so-called “communication robots,” equipped with voice and facial recognition features and other communication functions, are being introduced into medical and nursing care settings. The introduction of various robotic technologies, including not only communication robots but also therapy robots and monitoring sensors, has begun in developed countries with advanced medical environments [2] [3] [4].

The ability of a healthcare robot to determine a subject’s emotion and accurately convey it to others is essential [5]. Ngai et al. [6] assert that recognizing human emotions through the face modality alone is unreliable. Robots therefore need a high degree of multimodal perception: to provide appropriate feedback, they must understand the moods, goals, and character of the humans they interact with [7]. Along with facial recognition through convolutional neural networks, several studies have used bio-signals such as heart rate variability, electroencephalogram signals, and eye modality [8] [9]. Others have used visual-audio signals, lexicographic data, and questionnaire-based data [10]. However, the most effective emotion recognition data are gathered from biosignals [10].

The ability of communication robots to accurately detect emotions when interacting with subjects will improve the effectiveness of their function and use in the medical field.

However, many developmental challenges remain before today’s communication robots can converse naturally with humans [11] [12]. To this end, a communication robot must first be able to accurately analyze human emotions from multimodal data collected in healthy subjects [13] [14].

Nevertheless, the use of humanoid robots may have a positive effect on older adults with dementia and their caregivers [15] [16]. Inoue et al. [17] reported that 84.4% of the 77 older adults with dementia who participated showed a significant improvement in well-being after a program using the robot PALRO®.

On the other hand, it has been reported that robot-assisted activity using a communication robot stimulates neural activity in the region centered on the posterior cingulate gyrus and precuneus in cognitively healthy older adults, but does not significantly alter brain neural activity in cognitively impaired older adults [18]. It has also been noted that studies with higher levels of evidence are lacking.

In this case study, we hypothesized that sympathetic nerve activity would be higher during conversation with PALRO, and that conversation would increase cerebral blood flow near Broca’s area.

2. Materials and Methods

2.1. Subject

The subject was a 20-year-old, left-handed, female nursing student at Tokushima University. Exclusion criteria were severe communication problems, severe cardiac or cerebrovascular disease, and skin allergy to the electrode seals.

2.2. Research Design

This case study employed an Intentional Observational Clinical Research Design (IOCRD) [19], which simultaneously generates quantitative and qualitative data through a specialized observation and measurement process using advanced technical equipment. The measurement and analysis tools shown in Figure 1 provide the quantitative data: heart rate variability (hereinafter referred to as “HRV”), blood flow in the anterior part of the brain measured with the HOT2000 device (hereinafter referred to as “HOT2000”), and facial expressions analyzed using Multi-task Cascaded Convolutional Networks (hereinafter referred to as “MTCNN”). Qualitative data consisted of field notes and subjective evaluations of the subject’s facial expressions by three examiners. To achieve simultaneous acquisition of quantitative and qualitative data, measurement timelines were synchronized using radio clocks in the experimental (quantitative) and observational (qualitative) studies.

The method of analysis was to compare the conversation with PALRO against a resting-state baseline. Three examiners subjectively evaluated the emotions expressed by the subject’s facial expressions in these situations. In the computerized evaluation, seven facial expressions were classified by applying deep learning methods such as MTCNN. The left and right blood flow in the frontal lobes of the brain during conversation was obtained using the HOT2000. Autonomic nerve activity during conversation was analyzed using Bonaly Light, a real-time analysis software based on heart rate fluctuation, to obtain heart rate (HR-Mean), normalized High Frequency (HFnu), which relates to relative parasympathetic activity, and normalized Low Frequency (LFnu), which relates to relative sympathetic activity. Particular attention is given to the relative power of the high-frequency band: low relative power indicates feelings of anxiety and worry, and high relative power indicates the degree of empathy.

Figure 1. Research framework by Intentional Observational Clinical Research Design (IOCRD).

All data were time-aligned using the IOCRD, with each 20-second window reduced to 10 data points to match Bonaly Light, which outputs one analysis result every 2 seconds. The HOT2000 produces 100 raw data signals per second, so 2,000 data points were obtained over a 20-second period, and the average of the 200 data points in each 2-second period was used as the representative value. MTCNN outputs 30 results per second, each scoring 7 expression items as percentages, so 600 results are obtained in 20 seconds, and the 60 results in each 2-second period were reduced to one representative value. For the 20-second segments in which the emotion was judged to have changed, the video was pre-processed by cutting out the region of interest using the video editing software Shotcut (Meltytech, LLC) and applying a mosaic to all but the necessary portions, so that only the subject was visible, thereby increasing the accuracy of the analysis.
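To make this alignment concrete, the sketch below reduces each stream to the common 2-second timebase. It is not the authors’ code: the stream names, sampling rates, start time, and pandas layout are illustrative assumptions.

```python
import numpy as np
import pandas as pd

def to_2s_means(values: np.ndarray, rate_hz: float, start: str) -> pd.Series:
    """Average a fixed-rate signal over consecutive 2-second windows."""
    idx = pd.date_range(start=start, periods=len(values),
                        freq=pd.Timedelta(seconds=1 / rate_hz))
    return pd.Series(values, index=idx).resample("2s").mean()

start = "2023-07-01 10:00:00"                          # synchronized radio-clock start
hbt = to_2s_means(np.random.rand(2000), 100, start)    # HOT2000: 100 Hz -> 10 values / 20 s
happy = to_2s_means(np.random.rand(600), 30, start)    # MTCNN "happy" %: 30 fps -> 10 values
hrv = to_2s_means(np.random.rand(10), 0.5, start)      # Bonaly Light: one value / 2 s

aligned = pd.concat({"HbT": hbt, "happy_pct": happy, "HFnu": hrv}, axis=1)
print(aligned.head())
```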

2.3. Communication Robot

PALRO is an autonomous communication robot developed by Fuji Soft Incorporated. It is a compact humanoid robot (40 cm in height, weighing 1.8 kg) equipped with features for creating meaningful conversations and connections with humans. PALRO was chosen for this case study because its dedicated conversation and activity features are suitable for daily conversation, recreation, and health exercises with people [20]. Numerous institutions have sought to introduce PALRO, which has received endorsements from the Japanese Ministry of Health, Labor and Welfare under the “Special Project to Support the Introduction of Nursing Care Robots”. Moreover, PALRO has gained recognition as a communication robot for assessment within the Ministry of Economy, Trade and Industry’s “Project to Promote the Development and Introduction of Robotic Nursing Care Equipment”, acknowledging the evidence accumulated from diverse demonstration experiments [20].

2.4. Data Analysis Method

2.4.1. Video Processing Methods

The video editing software Shotcut was used to remove all unnecessary portions of the frame, leaving only the subject’s facial expressions to be captured. During video editing, mosaic processing was applied to areas that were not required for facial expression analysis by the AI. The video was adjusted to 640 × 480 pixels at 29.97 frames per second.
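As an illustration of this pre-processing step, the hedged sketch below uses OpenCV to pixelate everything outside a face region and resize frames to the study’s video format. The file names and the hard-coded face box are hypothetical; in the study this editing was done in Shotcut.

```python
import cv2

def mosaic_outside(frame, box, block=16):
    """Pixelate the whole frame, then paste the untouched face region back."""
    x, y, w, h = box
    small = cv2.resize(frame, None, fx=1 / block, fy=1 / block,
                       interpolation=cv2.INTER_LINEAR)
    mosaic = cv2.resize(small, (frame.shape[1], frame.shape[0]),
                        interpolation=cv2.INTER_NEAREST)
    mosaic[y:y + h, x:x + w] = frame[y:y + h, x:x + w]
    return mosaic

cap = cv2.VideoCapture("subject.mp4")                   # hypothetical input clip
out = cv2.VideoWriter("subject_roi.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"),
                      29.97, (640, 480))
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.resize(frame, (640, 480))               # match the study's video size
    out.write(mosaic_outside(frame, box=(220, 120, 200, 240)))
cap.release()
out.release()
```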

The term “region of interest” refers to a particular area that has been narrowed down for observation or measurement using imaging techniques; in JPEG 2000, for example, the region of interest is the specific region to which a significant amount of code is allocated during compression. For this case study, the main investigator reviewed all the videos, which lasted approximately 20 min, and the regions of interest were defined as segments lasting between 10 and 20 seconds that comprised emotional changes during the subject’s conversation with PALRO, together with the Face Emotion Recognizer results based on MTCNN face detection. The region of interest was extracted as the time before and after the main investigator judged the subject to be showing a “happy” face. For comparison, the resting period before the conversation with PALRO was also extracted, along with the time when the main investigator judged that the subject was expressing a “happy” face. To ensure validity, three examiners checked the regions of interest, and a region of interest was retained only where all three examiners agreed that the subject was happy.

2.4.2. Heart Rate Variability Analysis

HRV was measured using electrocardiography to evaluate the balance between the sympathetic and parasympathetic nerves. Bonaly Light was used as the measurement device, and measurements were made continuously from rest to the end of the conversation, for approximately 20 minutes.

The fluctuations in the R-R interval were frequency-analyzed every 2 seconds. The frequency band of the high-frequency (HF) component, an index reflecting parasympathetic activity, was set at 0.15 - 0.40 Hz, and that of the low-frequency (LF) component, an index reflecting both sympathetic and parasympathetic activity, at 0.04 - 0.15 Hz. These values were used to compute LFnu (normalized LF) and HFnu (normalized HF) using the formulas:

LFnu = LF / (LF + HF) × 100,

HFnu = HF / (LF + HF) × 100.
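A minimal sketch of how these normalized values can be computed from an R-R interval series, assuming a standard Welch power-spectrum approach with the band limits defined above; the paper does not describe Bonaly Light’s internal algorithm, so this is illustrative only.

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch

def lfnu_hfnu(rr_s):
    """LFnu/HFnu (in %) from R-R intervals in seconds, per the formulas above."""
    t = np.cumsum(rr_s)                              # R-peak times (s)
    fs = 4.0                                         # even resampling rate (Hz)
    grid = np.arange(t[0], t[-1], 1 / fs)
    rr_even = interp1d(t, rr_s, kind="cubic")(grid)  # evenly sampled tachogram
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, len(grid)))
    df = f[1] - f[0]
    lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df    # LF band power
    hf = pxx[(f >= 0.15) & (f <= 0.40)].sum() * df   # HF band power
    return lf / (lf + hf) * 100, hf / (lf + hf) * 100

rr = np.random.normal(0.8, 0.05, 300)                # illustrative R-R series (s)
lfnu, hfnu = lfnu_hfnu(rr)
print(f"LFnu = {lfnu:.1f}, HFnu = {hfnu:.1f}")
```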

There are many factors that influence HRV [21], and stress is one of them. Increased stress levels are said to decrease HRV, while a healthy balance in daily life is said to increase HRV. Decreased HRV is characterized by emotional dysregulation, reduced psychological flexibility, and deficits in social engagement, which are associated with reduced prefrontal cortex activity [22]. Decreased HRV and decreased cardiac vagal control have also been associated with an increased tendency to develop negative emotions such as anger, sadness, and fear, as well as an increased risk of anxiety disorders, depression, and cardiovascular disease and death [23]. The high-frequency component decreases with fear and anxiety, indicating that parasympathetic activity is reduced when unpleasant stimuli or emotions, such as stress, panic, and worry, are felt [24] [25].

High-level anxiety is associated with a decrease in the R-R interval and the high-frequency component [25]. Increased parasympathetic activity during emotion regulation is associated with situational empathic responses. Emotion regulation may be related to changes in situational empathy and autonomic responses, with a preferential predominance of the parasympathetic branch, possibly reflecting increased regulatory processes. In addition, individuals with high heart rate variability tend to have better emotional well-being than individuals with low heart rate variability [26]. Furthermore, previous studies have found that empathy for different emotional valences was associated with different changes in situational empathy and autonomic responses [25].

2.4.3. Cerebral Blood Flow Analysis Method

Prefrontal cortex (PFC) activity is engaged during communication [27]. In addition, previous research on PFC activity and emotion has shown that positive emotions activate cognitive function [28], while negative emotions decrease PFC activity [29]. Broca’s area, which is involved in language processing and language comprehension, is usually located in the left prefrontal area [30].

HOT-2000 (NeU Co., Ltd., Tokyo, Japan) is a portable wearable device designed for measuring brain activity (Figure 2). This two-channel system uses near-infrared light to track changes in blood flow concentration in the form of total hemoglobin (HbT) in the frontal lobe [31]. The system employs a laser emitting light at a wavelength of 810 nm and operates at a sampling rate of 10 Hz [31].

In this case study, anterior cerebral blood flow and pulse rate were measured in the left and right hemispheres. The device was worn with the headset positioned close to the forehead, just above the eyebrows. The HOT-2000 produces 100 raw data signals per second, and cerebral blood flow was evaluated as the average of the signals in each 2-second window.
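For illustration, the following sketch (fabricated data, not vendor code) shows the 100 Hz-to-2-second reduction for the two channels:

```python
import numpy as np

raw = np.random.rand(2000, 2)       # 20 s of left/right HbT at 100 Hz (fake data)
windows = raw.reshape(-1, 200, 2)   # 10 windows x 200 samples x 2 channels
hbt_2s = windows.mean(axis=1)       # one representative value per 2-s window
print(hbt_2s.shape)                 # (10, 2): 10 values each for left and right
```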

Figure 2. The HOT2000 and sensor locations.

2.4.4. Objective Facial Expression Analysis Method

Research in affective computing has focused on detecting and classifying human emotions from facial expressions, and advances in deep learning have increased the number and accuracy of facial emotion detection methods [32]. A trained method [33] can classify facial expressions after face detection with MTCNN [34]. Therefore, in our study, MTCNN was used for highly accurate face detection [35]. The facial expressions to be analyzed were set to seven categories: happy, anger, surprise, sad, fear, disgust, and neutral. The segments selected with the subjective method were then analyzed using MTCNN, a method based on deep learning. The results of the analysis were displayed relative to the seven expressions so that the sum of all items was 100%.
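A minimal sketch of this per-frame scoring, assuming the open-source FER (“Facial Expression Recognition”) package, which couples MTCNN face detection with a seven-expression classifier; the paper does not name its exact implementation, so the package choice and file name are assumptions.

```python
import cv2
from fer import FER

detector = FER(mtcnn=True)                   # use MTCNN for face detection
cap = cv2.VideoCapture("subject_roi.mp4")    # hypothetical pre-processed clip
scores = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    faces = detector.detect_emotions(frame)  # one dict per detected face
    if faces:
        scores.append(faces[0]["emotions"])  # e.g. {'happy': 0.91, 'neutral': 0.06, ...}
cap.release()
print(scores[0])
```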

The videos ran at 30 frames per second, and all 30 frames of each second were analyzed by MTCNN. An Excel file was created with a timeline of the per-frame facial expression results for the clipped video, with the largest of the seven expression proportions in each frame color-coded. The expression with the highest mean value among the analyzed items was taken as the MTCNN facial expression result for the subject.
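The aggregation step might look like the following sketch, where Dirichlet-distributed fake scores stand in for the per-frame percentages from the previous snippet, and the expression with the highest mean in each 2-second window is taken as representative:

```python
import numpy as np
import pandas as pd

cols = ["happy", "anger", "surprise", "sad", "fear", "disgust", "neutral"]
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.dirichlet(np.ones(7), size=600), columns=cols)  # 20 s at 30 fps
window = df.index // 60                            # 60 frames = one 2-s window
representative = df.groupby(window).mean().idxmax(axis=1)
print(representative.head())                       # e.g. window 0 -> "happy"
```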

2.4.5. Subjective Facial Expression Evaluation Method

The resting state was used as a baseline and compared with the time of conversation with PALRO. The emotions expressed by facial expressions during this conversation were subjectively assessed by three examiners, all healthcare professionals, to ensure validity.

The examiners viewed the video data without sound, matching the conditions under which the analysis tool, MTCNN, judges facial expressions from videos and images. To make the evaluation as objective as possible, the examiners chose the subject’s facial expression from a drop-down list of emotions (happy, anger, surprise, sad, fear, disgust, and neutral) created from MTCNN’s evaluation items.

For uniformity, Shotcut was used to extract the video segments, together with the corresponding heart rate variability and cerebral blood flow data, from 20 seconds before to 20 seconds after the point at which the nurse serving as the judgment standard determined that the facial expression had changed to a happy face.

2.5. Data Gathering Procedure

Data were collected in Building A of the Graduate School of Health Sciences, Tokushima University, in July 2023. A wall was placed behind the subject so that MTCNN would detect and analyze only the subject’s face. The distance between the subject and PALRO was 80 cm, and a digital video camera was set up as shown in Figure 3.

A Mem Calc/Bonaly Light (GMS Co., Ltd.) was attached to the subject’s chest to analyze heart rate variability. A cerebral blood flow measuring device, the HOT2000, was attached to the head to measure anterior cerebral blood flow. It was ensured that nothing else covered the subject’s face, to facilitate reading of facial expressions.

Figure 3. Experimental environment.

The subject was asked to rest with their eyes closed for 5 minutes in order to collect reference data. The subject then conversed with PALRO for 10 minutes. After the conversation, the subject again rested for 5 minutes with their eyes closed. The conversation was recorded while the examiners recorded their observations.

The communication mode software of PALRO allows for the remote control of PALRO’s words and movements and was used for intentional conversation with the subject. The contents of the remotely scripted conversation prompts were typed into the conversation application program using a tablet and keyboard. The length of the subject’s conversation with PALRO was recorded using a single digital video camera.

One examiner checked for heart rate variability output, notified the operator to adjust the end time, and recorded important events that occurred during the conversation and the radio clock time in field notes. This protocol was based on past studies [36] [37] .

2.6. Statistical Analysis Method

IBM SPSS Statistics for Windows, version 27.0 (IBM Corp., Armonk, NY) was used to statistically process the data. Paired t-tests and Welch’s ANOVA with post hoc testing (Games-Howell test) were used to analyze the data.
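The analyses were run in SPSS; the sketch below shows an equivalent open-source workflow (an assumption, not the authors’ scripts) using SciPy and pingouin, with fabricated example values shaped like the reported means and SDs.

```python
import numpy as np
import pandas as pd
import pingouin as pg
from scipy import stats

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "phase": ["before"] * 150 + ["during"] * 300 + ["after"] * 150,
    "HFnu": np.concatenate([rng.normal(0.47, 0.19, 150),   # rest before
                            rng.normal(0.22, 0.16, 300),   # conversation
                            rng.normal(0.42, 0.20, 150)]), # rest after
})

# Welch's ANOVA across phases, then Games-Howell post hoc comparisons
print(pg.welch_anova(data=df, dv="HFnu", between="phase"))
print(pg.pairwise_gameshowell(data=df, dv="HFnu", between="phase"))

# Paired t-test for left vs. right HbT within the same 2-s windows
left = rng.normal(0.15, 0.21, 300)
right = rng.normal(1.25, 0.17, 300)
print(stats.ttest_rel(left, right))
```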

2.7. Ethical Considerations

The study was conducted in accordance with the Declaration of Helsinki and approved by the Ethics Committee of Tokushima University Hospital (#3046). The researcher explained the purpose, objectives, methods, expected results, and disadvantages of the study to the subject. The right to withdraw at any time during the conduct of research was also explained. Consent to participate was obtained after the subject signed the informed consent form.

3. Results

There were no significant changes in heart rate before, during, or after the experiment. HFnu was significantly lower during the conversation (0.22 ± 0.16) than before the experiment (0.47 ± 0.19) (p < 0.001), and significantly higher at the end of the experiment (0.42 ± 0.20) than during the conversation (0.22 ± 0.16) (p < 0.001). Conversely, LFnu increased significantly during the conversation (0.78 ± 0.16) compared with before the experiment (0.53 ± 0.19) (p < 0.001), and decreased significantly at the end of the experiment (0.58 ± 0.20) compared with during the conversation (0.78 ± 0.16) (p < 0.001).

Changes in HbT showed a significant increase in cerebral blood flow on both the left and right sides. The values were greater during the conversation and after the experiment than before the experiment: left (0.03 ± 0.05 to 0.15 ± 0.21 to 0.55 ± 0.12), right (0.96 ± 0.07 to 1.25 ± 0.17 to 1.95 ± 0.25). Paired t-tests were used for left and right HbT changes, and significant differences were confirmed on the right side: during conversation (t = −118.13, p < 0.001) and at rest after the experiment (t = −80.50, p < 0.001) (Table 1).

Figure 4 shows the data corresponding to the instances where a “happy” face was detected, compared with the data in the same region of interest while at rest before the conversation. HFnu was dominant at rest, but LFnu became dominant during the happy facial expression in the conversation with PALRO. The total hemoglobin (HbT) values were multiplied by 100 so that they correspond in scale to the HRV values in Figure 4.

Figure 4. Happy face compared to resting state before conversation.

Table 1. Analysis of heart rate, sympathetic and parasympathetic nerves, and left and right cerebral blood flow before, during, and after conversation.

*p < 0.05, **p < 0.01, ***p < 0.001. SD: standard deviation; HR: heart rate; HFnu: high-frequency power of heart rate variability expressed in normalized units; LFnu: low-frequency power of heart rate variability expressed in normalized units. Multivariate analysis using Welch’s ANOVA and post hoc test (Games-Howell). Paired t-tests were used for left and right HbT changes, and significant differences were confirmed on the right side: during conversation (t = −118.13, p < 0.001) and at rest after the experiment (t = −80.50, p < 0.001).

4. Discussion

Observations showed that, overall, the conversation was successful and there were many occasions when the subject showed a “happy” facial expression. The MTCNN analysis also verified that “happy” was the most frequently detected expression. The MTCNN analysis and the examiners’ judgments of “happy” facial expressions were consistent, which shows MTCNN’s potential as a system for discerning happy facial expressions.

Since the subject is left-handed, Broca’s area is on the right [38] [39], and the right hemisphere is dominant during conversation. The cerebral blood flow values were higher during the conversation and in the resting state after the experiment than in the resting state before the experiment. We believe that frontal cerebral blood flow increases and the region is activated by conversation, especially on the right side.

Our findings show that LFnu became more elevated when the subject showed a happy facial expression, a trend similar to that reported by Hachenberger et al. [40], who used high-frequency power, low-frequency power, and the LF/HF ratio to associate positive and negative affect among young females throughout their daily activities over 7 days. Shi et al. [41] found that a higher LF/HF ratio was consistently associated with feeling enthusiastic and happy; the same study found that happiness and enthusiasm were associated with higher low-frequency power, lower high-frequency power, and a higher LF/HF ratio compared with sadness. A higher LF/HF ratio is generally considered to represent sympathetic activity, which some studies report to be dominant in a negative emotional state [42].

However, Hachenberger et al. [40] note that this correlation depends on the environment. They also point out that other studies have found a negative correlation between feeling enthusiastic and HRV-HF and the root mean square of successive differences (RMSSD), both of which represent parasympathetic activity. Hachenberger et al. [40] instead associate heart rate and the LF/HF ratio with positive affect, specifically with being enthusiastic and happy, implying that relative activation of the sympathetic nervous system compared with the parasympathetic nervous system (PNS), and/or PNS withdrawal, has beneficial effects on positive affect. This could suggest that the subject was aroused in a positive environment and began feeling enthusiastic during the conversation with PALRO.

Despite its portability, low cost, and functional range as a neuroimaging tool [43] [44], a drawback of functional near-infrared spectroscopy (NIRS) devices such as the HOT2000 is their shallow detection range, which limits the information they can provide about cortical activity [45]. Nevertheless, it is possible that the HOT2000 measures the right and left prefrontal cortex, including the frontotemporal region where Broca’s area is located. Modern functional neuroimaging relies on hemodynamic measures of brain function, the most prominent of which is blood-oxygen-level-dependent functional magnetic resonance imaging (fMRI). Advances in understanding the neural activity underlying cognitive functions have been made possible with fMRI, and it is currently considered the gold-standard non-invasive hemodynamic neuroimaging technology [46]. Therefore, advanced NIRS or functional MRI is necessary for accurate measurements.

The results of this case study will help clarify what features communication robots need for use in the healthcare field. However, the results come from a single subject, and there are limitations in generalizing them. Increasing the number of subjects and improving the reliability and validity of the data are future issues.

The analysis devices used in this case study were a mixture of contact and non-contact devices. It is difficult to wear a contact device continuously when interacting with a robot in a real medical setting. A future challenge is therefore to clarify how to analyze a subject’s emotions correctly using only non-contact devices.

5. Conclusion

In conclusion, the data from this case study subject indicated that sympathetic nervous activity was dominant, suggesting that the subject may have enjoyed and been excited by talking to the robot. Cerebral blood flow values were higher during conversation and in the resting state after the experiment than in the resting state before the experiment; talking increased cerebral blood flow in the frontal region. Since the subject was left-handed, the right side of the brain, where Broca’s area is located, was particularly activated. In the sections where a “happy” facial emotion was recognized, the examiner-judged “happy” faces and the MTCNN “happy” results were generally consistent.

Acknowledgements

The authors express their gratitude and appreciation to the subjects and research collaborators.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

References

[1] Abdi, J., Al-Hindawi, A., Ng, T. and Vizcaychipi, M.P. (2018) Scoping Review on the Use of Socially Assistive Robot Technology in Elderly Care. BMJ Open, 8, e018815.
https://doi.org/10.1136/bmjopen-2017-018815
[2] Raffard, S., Bortolon, C., Khoramshahi, M., Salesse, R.N., Burca, M., Marin, L., Bardy, B.G., Billard, A., Macioce, V. and Capdevielle, D. (2016) Humanoid Robots versus Humans: How Is Emotional Valence of Facial Expressions Recognized by Individuals with Schizophrenia? An Exploratory Study. Schizophrenia Research, 176, 506-513.
https://doi.org/10.1016/j.schres.2016.06.001
[3] Schoenhofer, S.O., Wynsberghe, A. and Boykin, A. (2019) Engaging Robots as Nursing Partners in Caring: Nursing as Caring Meets Care-Centered Value-Sensitive Design. International Journal for Human Caring, 23, 157-167.
https://doi.org/10.20467/1091-5710.23.2.157
[4] Miyagawa, M., Kai, Y., Yasuhara, Y., Ito, H., Betriana, F., Tanioka, T. and Locsin, R. (2020) Consideration of Safety Management When Using Pepper, a Humanoid Robot for Care of Older Adults. Intelligent Control and Automation, 11, 15-24.
https://doi.org/10.4236/ica.2020.111002
[5] Pepito, J.A., Ito, H., Betriana, F., Tanioka, T. and Locsin, R.C. (2020) Intelligent Humanoid Robots Expressing Artificial Humanlike Empathy in Nursing Situations. Nursing Philosophy, 21, e12318.
https://doi.org/10.1111/nup.12318
[6] Ngai, W.K., Xie, H., Zou, D. and Chou, K.-L. (2022) Emotion Recognition Based on Convolutional Neural Networks and Heterogeneous Bio-Signal Data Sources. Information Fusion, 77, 107-117.
https://doi.org/10.1016/j.inffus.2021.07.007
[7] Su, H., Qi, W., Chen, J., Yang, C., Sandoval, J. and Laribi, M.A. (2023) Recent Advancements in Multimodal Human-Robot Interaction. Frontiers in Neurorobotics, 17, Article ID: 1084000.
https://doi.org/10.3389/fnbot.2023.1084000
[8] Samadiani, N., Huang, G., Cai, B., Luo, W., Chi, C.-H., Xiang, Y., and He, J. (2019) A Review on Automatic Facial Expression Recognition Systems Assisted by Multimodal Sensor Data. Sensors, 19, Article No. 1863.
https://doi.org/10.3390/s19081863
[9] Akiyama, T., Matsumoto, K., Osaka, K., Tanioka, R., Betriana, F., Zhao, Y., Kai, Y., Miyagawa, M., Yasuhara, Y., Ito, H., Soriano, G. and Tanioka, T. (2022) Comparison of Subjective Facial Emotion Recognition and “Facial Emotion Recognition Based on Multi-Task Cascaded Convolutional Network Face Detection” between Patients with Schizophrenia and Healthy Participants. Healthcare, 10, Article No. 2363.
https://doi.org/10.3390/healthcare10122363
[10] Kumar, A., Sharma, K. and Sharma, A. (2022) MEmoR: A Multimodal Emotion Recognition Using Affective Biomarkers for Smart Prediction of Emotional Health for People Analytics in Smart Industries. Image and Vision Computing, 123, Article ID: 104483.
https://doi.org/10.1016/j.imavis.2022.104483
[11] Honig, S. and Oron-Gilad, T. (2018) Understanding and Resolving Failures in Human-Robot Interaction: Literature Review and Model Development. Frontiers in Psychology, 9, Article No. 861.
https://doi.org/10.3389/fpsyg.2018.00861
[12] Ozeki, T., Mouri, T., Sugiura, H., Yano, Y. and Miyosawa, K. (2020) Use of Communication Robots to Converse with People Suffering from Schizophrenia. ROBOMECH Journal, 7, Article No. 13.
https://doi.org/10.1186/s40648-020-00161-6
[13] Spezialetti, M., Placidi, G. and Rossi, S. (2020) Emotion Recognition for Human-Robot Interaction: Recent Advances and Future Perspectives. Frontiers in Robotics and AI, 7, Article ID: 532279.
https://doi.org/10.3389/frobt.2020.532279
[14] Alonso-Martín, F., Malfaz, M., Sequeira, J., Gorostiza, J. and Salichs, M. (2013) A Multimodal Emotion Detection System during Human-Robot Interaction. Sensors, 13, 15549-15581.
https://doi.org/10.3390/s131115549
[15] Lu, W.-I., Chen, Y.-W., Shen, C.-C., Tsai, P.-H., Chu, Y.-T., Hung, Y.-H., Chien, S.-Y., Lee, J. and Chao, S.-F. (2023) Social Robots for Older Adults in Medical Contexts. In: Kurosu, M. and Hashizume, A., Eds., Human-Computer Interaction, Springer Nature, Cham, 118-128.
https://doi.org/10.1007/978-3-031-35602-5_9
[16] Bonarini, A. (2020) Communication in Human-Robot Interaction. Current Robotics Reports, 1, 279-285.
https://doi.org/10.1007/s43154-020-00026-1
[17] Inoue, K., Yatsu, C., Yao, D.P.G., Kohno, M., Wada, K. and Yamamoto, S. (2022) Preliminary Study on the Benefits of Using the Robot PALRO® in Facilitating Leisure Programs for Older Adults with Dementia. Gerontechnology, 21, 1-7.
https://doi.org/10.4017/gt.2022.21.1.466.04
[18] Goda, A., Shimura, T., Murata, S., Kodama, T., Nakano, H. and Ohsugi, H. (2023) Effects of Robot-Assisted Activity Using a Communication Robot on Neurological Activity in Older Adults with and without Cognitive Decline. Journal of Clinical Medicine, 12, Article No. 4818.
https://doi.org/10.3390/jcm12144818
[19] Tanioka, T., Locsin, R.C., Betriana, F., Kai, Y., Osaka, K., Baua, E. and Schoenhofer, S. (2021) Intentional Observational Clinical Research Design: Innovative Design for Complex Clinical Research Using Advanced Technology. International Journal of Environmental Research and Public Health, 18, Article No. 11184.
https://doi.org/10.3390/ijerph182111184
[20] FUJISOFT Inc. PALRO: The State-of-the-Art Conversation Robot for Elderly Welfare Facilities. (In Japanese)
https://palro.jp/en/preventive-care/nursing-home.html
[21] Kranjec, J., Begus, S., Gersak, G., Sinkovec, M., Drnovsek, J. and Hudoklin, D. (2017) Design and Clinical Evaluation of a Non-Contact Heart Rate Variability Measuring Device. Sensors, 17, Article No. 2637.
https://doi.org/10.3390/s17112637
[22] Sgoifo, A., Carnevali, L., Alfonso Mde, L. and Amore, M. (2015) Autonomic Dysfunction and Heart Rate Variability in Depression. The International Journal on the Biology of Stress, 18, 343-352.
https://doi.org/10.3109/10253890.2015.1045868
[23] Jerath, R. and Beveridge, C. (2020) Respiratory Rhythm, Autonomic Modulation, and the Spectrum of Emotions: The Future of Emotion Recognition and Modulation. Frontiers in Psychology, 11, Article No. 1980.
https://doi.org/10.3389/fpsyg.2020.01980
[24] Abdullayev, R., Yildirim, E., Celik, B. and Topcu Sarica, L. (2019) Analgesia Nociception Index: Heart Rate Variability Analysis of Emotional Status. Cureus, 11, Article No. 4365.
https://doi.org/10.7759/cureus.4365
[25] Miu, A.C., Heilman, R.M. and Miclea, M. (2009) Reduced Heart Rate Variability and Vagal Tone in Anxiety: Trait versus State, and the Effects of Autogenic Training. Autonomic Neuroscience: Basic and clinical, 145, 99-103.
https://doi.org/10.1016/j.autneu.2008.11.010
[26] Mather, M. and Thayer, J. (2018) How Heart Rate Variability Affects Emotion Regulation Brain Networks. Current Opinion in Behavioral Sciences, 19, 98-104.
https://doi.org/10.1016/j.cobeha.2017.12.017
[27] Fukaya, Y., Kawaguchi, M. and Kitamura, T. (2020) Does Everyday Conversation Contribute to Cognitive Functioning? A Comparison of Brain Activity during Task-Oriented and Life-Worldly Communication Using Near-Infrared Spectroscopy. Gerontology and Geriatric Medicine, 6, 1-14.
https://doi.org/10.1177/2333721420980309
[28] Raschle, N.M., Fehlbaum, L.V., Menks, W.M., Euler, F., Sterzer, P. and Stadler, C. (2017) Investigating the Neural Correlates of Emotion-Cognition Interaction Using an Affective Stroop Task. Frontiers in Psychology, 8, Article No. 1489.
https://doi.org/10.3389/fpsyg.2017.01489
[29] Aoki, R., Sato, H., Katura, T., Matsuda, R. and Koizumi, H. (2013) Correlation between Prefrontal Cortex Activity during Working Memory Tasks and Natural Mood Independent of Personality Effects: An Optical Topography Study. Psychiatry Research: Neuroimaging, 212, 79-87.
https://doi.org/10.1016/j.pscychresns.2012.10.009
[30] Horwitz, B., Amunts, K., Bhattacharyya, R., Patkin, D., Jeffries, K., Zilles, K. and Braun, A.R. (2003) Activation of Broca’s Area during the Production of Spoken and Signed Language: A Combined Cytoarchitectonic Mapping and PET Analysis. Neuropsychologia, 41, 1868-1876.
https://doi.org/10.1016/S0028-3932(03)00125-8
[31] Takahashi, S., Sakurai, N., Kasai, S. and Kodama, N. (2022) Stress Evaluation by Hemoglobin Concentration Change Using Mobile NIRS. Brain Sciences, 12, Article No. 488.
https://doi.org/10.3390/brainsci12040488
[32] Jeong, D., Kim, B.-G. and Dong, S.-Y. (2020) Deep Joint Spatiotemporal Network (DJSTN) for Efficient Facial Expression Recognition. Sensors, 20, Article No. 1936.
https://doi.org/10.3390/s20071936
[33] Arriaga, O., Valdenegro-Toro, M., Muthuraja, M., Devaramani, S. and Kirchner, F. (2020) Perception for Autonomous Systems (PAZ).
http://arxiv.org/abs/2010.14541
[34] Ge, H., Dai, Y., Zhu, Z. and Wang, B. (2021) Robust Face Recognition Based on Multi-Task Convolutional Neural Network. Mathematical Biosciences and Engineering, 18, 6638-6651.
https://doi.org/10.3934/mbe.2021329
[35] Islam, Md.T., Ahmed, T., Rashid, A.B.M.R., Islam, T., Rahman, Md.S. and Habib, Md.T. (2022) Convolutional Neural Network Based Partial Face Detection. 7th International Conference for Convergence in Technology (I2CT), Pune, 7-9 April 2022, 1-6.
https://doi.org/10.48550/ARXIV.2206.14350
[36] Osaka, K. (2020) Development of the Model for the Intermediary Role of Nurses in Transactive Relationships with Healthcare Robots. International Journal for Human Caring, 24, 265-274.
[37] Osaka, K., Sugimoto, H., Tanioka, T., Yasuhara, Y., Locsin, R.C., Zhao, Y., Okuda, K. and Saito, K. (2017) Characteristics of a Transactive Phenomenon in Relationships among Older Adults with Dementia, Nurses as Intermediaries, and Communication Robot. Intelligent Control and Automation, 8, 111-125.
https://doi.org/10.4236/ica.2017.82009
[38] Cai, Q., Van Der Haegen, L. and Brysbaert, M. (2013) Complementary Hemispheric Specialization for Language Production and Visuospatial Attention. Proceedings of the National Academy of Sciences, 110, E322-E330.
https://doi.org/10.1073/pnas.1212956110
[39] Villar-Rodríguez, E., Palomar-García, M., Hernández, M., Adrián-Ventura, J., Olcina-Sempere, G., Parcet, M. and Ávila, C. (2020) Left-Handed Musicians Show a Higher Probability of Atypical Cerebral Dominance for Language. Human Brain Mapping, 41, 2048-2058.
https://doi.org/10.1002/hbm.24929
[40] Hachenberger, J., Li, Y.-M., Siniatchkin, M., Hermenau, K., Ludyga, S. and Lemola, S. (2023) Heart Rate Variability’s Association with Positive and Negative Affect in Daily Life: An Experience Sampling Study with Continuous Daytime Electrocardiography over Seven Days. Sensors, 23, Article No. 966.
https://doi.org/10.3390/s23020966
[41] Shi, H., Yang, L., Zhao, L., Su, Z., Mao, X., Zhang, L. and Liu, C. (2017) Differences of Heart Rate Variability between Happiness and Sadness Emotion States: A Pilot Study. Journal of Medical and Biological Engineering, 37, 527-539.
https://doi.org/10.1007/s40846-017-0238-0
[42] Shaffer, F. and Ginsberg, J.P. (2017) An Overview of Heart Rate Variability Metrics and Norms. Frontiers in Public Health, 5, Article No. 258.
https://doi.org/10.3389/fpubh.2017.00258
[43] McKendrick, R., Parasuraman, R., Murtza, R., Formwalt, A., Baccus, W., Paczynski, M. and Ayaz, H. (2016) Into the Wild: Neuroergonomic Differentiation of Hand-Held and Augmented Reality Wearable Displays during Outdoor Navigation with Functional Near Infrared Spectroscopy. Frontiers in Human Neuroscience, 10, Article No. 216.
https://doi.org/10.3389/fnhum.2016.00216
[44] Ayaz, H., Onaral, B., Izzetoglu, K., Shewokis, P.A., McKendrick, R. and Parasuraman, R. (2013) Continuous Monitoring of Brain Dynamics with Functional near Infrared Spectroscopy as a Tool for Neuroergonomic Research: Empirical Examples and a Technological Development. Frontiers in Human Neuroscience, 7, Article No. 871.
https://doi.org/10.3389/fnhum.2013.00871
[45] Doi, H., Nishitani, S. and Shinohara, K. (2013) NIRS as a Tool for Assaying Emotional Function in the Prefrontal Cortex. Frontiers in Human Neuroscience, 7, Article No. 770.
https://doi.org/10.3389/fnhum.2013.00770
[46] Scarapicchia, V., Brown, C., Mayo, C. and Gawryluk, J.R. (2017) Functional Magnetic Resonance Imaging and Functional Near-Infrared Spectroscopy: Insights from Combined Recording Studies. Frontiers in Human Neuroscience, 11, Article No. 419.
https://doi.org/10.3389/fnhum.2017.00419

Copyright © 2024 by authors and Scientific Research Publishing Inc.


This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.