Emotion Measurement Using Biometric Signal

Abstract

In recent years, research on the estimation of human emotions has been active, and its application is expected in various fields. Physiological measures, such as electroencephalography (EEG) and the root mean square of successive differences (RMSSD), are indicators that are less influenced by individual arbitrariness. The present study used EEG signals and RMSSD values to assess the emotions aroused by emotion-stimulating images and to investigate whether different emotions are associated with characteristic fluctuations in biometric signals. EEG and electrocardiogram (ECG) signals were recorded while the participants viewed emotionally stimulating images and answered questionnaires. Real-time emotion analysis software was used to identify the evoked emotions by describing them in the Circumplex Model of Affect based on the EEG signals and RMSSD values. Emotions other than happiness did not follow the Circumplex Model of Affect in this study. However, ventral attentional activity may have increased the RMSSD value for disgust, as the β/θ value increased in right-sided brain waves. Therefore, right-sided brain wave results are necessary when measuring disgust. Happiness can be assessed easily using the Circumplex Model of Affect for positive scene analysis. Improving the current analysis methods may facilitate the investigation of face-to-face communication using biometric signals in the future.

Share and Cite:

Miyagi, Y., Gocho, S., Miyachi, Y., Nakayama, C., Okada, S., Maruyama, K. and Oshima, T. (2024) Emotion Measurement Using Biometric Signal. Health, 16, 395-404. doi: 10.4236/health.2024.165028.

1. Introduction

In recent years, research on the estimation of human emotions has been active, and its application is expected in various fields [1]. However, although there are many findings and reports on psychophysiological measurements during emotional arousal, no clear relationship has been established between individual emotions and the psychophysiological responses that occur during their arousal [2].

Two types of data are obtained when evaluating psychological status: real feelings and intentional feelings. Intentional feelings were excluded from the analysis in previous studies [3]. Physiological indicators, such as biological reactions, have garnered interest because they are less affected by individual arbitrariness. Emotions have been measured using the following methods during biometric analyses: 1) psychological measures, such as questionnaires; 2) behavioral measures, such as facial expressions and gestures; and 3) physiological measures, such as blood pressure, heart rate, skin temperature, electroencephalogram (EEG), and electrocardiogram (ECG) [4]. Previous studies have revealed that facial expressions cannot be used to evaluate emotion in ethnic groups such as the Japanese, who arbitrarily hide certain emotional states depending on their surroundings [5]. Therefore, the physiological measures of ECG and EEG were used in the present study, as they are considered not to reflect intentional emotions.

Brain waves, which are weak potential fluctuations in the cerebral cortex, are classified into delta (0.5 - 3 Hz), theta (4 - 7 Hz), alpha (8 - 13 Hz), and beta (14 - 40 Hz) waves according to frequency [6]. Beta waves are observed during states of excitement and arousal, when the level of consciousness is high, whereas theta waves are observed during slumber and sleep, when the level of consciousness is low [7]. The β/θ ratio indicates the level of brain arousal [8]. Heart rate variability (HRV), which is derived from the ECG, is an indicator of cardiac autonomic tone that reflects the variability of adjacent RR intervals [9]. HRV analysis is divided into frequency-domain analysis, such as the low-frequency (LF) and high-frequency (HF) components, and time-domain analysis, such as the root mean square of successive differences (RMSSD) [10].
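To make these two indices concrete, the sketch below shows one way they could be computed: RMSSD from a series of RR intervals, and the β/θ ratio from the power of the beta and theta bands. The function names, the use of Welch's method, and the synthetic data are illustrative assumptions, not the processing performed by the study's equipment.

```python
import numpy as np
from scipy.signal import welch

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between adjacent RR intervals (ms)."""
    diffs = np.diff(np.asarray(rr_intervals_ms, dtype=float))
    return np.sqrt(np.mean(diffs ** 2))

def band_power(eeg, fs, low, high):
    """Power of an EEG signal within [low, high] Hz, estimated with Welch's method."""
    freqs, psd = welch(eeg, fs=fs, nperseg=min(len(eeg), 2 * fs))
    mask = (freqs >= low) & (freqs <= high)
    return np.trapz(psd[mask], freqs[mask])

def beta_theta_ratio(eeg, fs):
    """Beta (14-40 Hz) power divided by theta (4-7 Hz) power: a proxy for arousal."""
    return band_power(eeg, fs, 14, 40) / band_power(eeg, fs, 4, 7)

# Synthetic example data (illustrative only)
rr = [812, 790, 805, 821, 798]           # RR intervals in milliseconds
eeg = np.random.randn(5 * 500)           # 5 s of EEG sampled at 500 Hz
print(rmssd(rr), beta_theta_ratio(eeg, fs=500))
```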

In this study, the emotions aroused by emotionally stimulating images were evaluated using EEG and HRV signals to determine whether different emotions are associated with characteristic biometric signal variations.

2. Method

2.1. Apparatus

EEG and ECG equipment (Micro DAQ Terminal intercross-413, Intercross Corporation), electrode connectors (intercross-415-03, Intercross Corporation), and real-time analysis software (DAQ Master intercross-311, Intercross Corporation) were used in this study to evaluate the fluctuations in EEG and HRV during emotional changes. Two electrodes were placed across the heart to obtain the ECG measurements (Figure 1(A)). The electrodes were placed at C3, Cz, and C4 to obtain the EEG measurements (Figure 1(B)).

Figure 1. (A) Electrode placement for ECG and EEG, (B) Detail of EEG electrode placement.

Real-time emotion analysis software (Intercross-340, Intercross Inc.) was used to analyze the relationship between biometric signals and emotional variability. This software describes emotions on the Circumplex Model of Affect proposed by Russell, based on EEG and ECG signals. The Circumplex Model of Affect has been widely used in research [11] [12]. In this model, human emotions are arranged on a two-dimensional circular plane and expressed as valence (pleasant-unpleasant) and arousal (aroused-sleepy) [12] [13] [14]. The software estimates the “arousal level (aroused-sleepy)” via frequency analysis of the EEG (β/θ) and the “comfort level (pleasant-unpleasant)” via autonomic nerve analysis of the ECG (RMSSD). The changes in the RMSSD and β/θ values evoked by the external environment and stimuli are plotted on the Circumplex Model of Affect, and changes in emotion were confirmed from these biological signals.
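The software's internal mapping is proprietary, so the following is only a minimal sketch of the idea: the change in β/θ is treated as the arousal axis and the change in RMSSD as the valence axis, each expressed relative to a resting control. The normalization, sign conventions, and names are assumptions made purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class CircumplexPoint:
    valence: float   # pleasant (+) to unpleasant (-), derived from RMSSD change
    arousal: float   # aroused (+) to sleepy (-), derived from beta/theta change

def to_circumplex(beta_theta, rmssd_value, beta_theta_rest, rmssd_rest):
    """Place one observation on the circumplex plane as change relative to rest.

    Positive arousal means beta/theta rose above the resting control;
    positive valence means RMSSD rose above the resting control.
    """
    arousal = (beta_theta - beta_theta_rest) / beta_theta_rest
    valence = (rmssd_value - rmssd_rest) / rmssd_rest
    return CircumplexPoint(valence=valence, arousal=arousal)

point = to_circumplex(beta_theta=1.8, rmssd_value=42.0,
                      beta_theta_rest=1.5, rmssd_rest=35.0)
print(point)   # both axes positive: upper-right quadrant (aroused and pleasant)
```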

2.2. Procedure

The Nencki Affective Picture System (NAPS), which comprises 1356 realistic high-quality photographs [15], was used as the stimulus. Twelve characteristic images, comprising two images each for happiness, sadness, fear, surprise, anger, and disgust, were selected from the NAPS.

The participants were instructed to sleep well the day before the measurement, refrain from drinking alcohol and smoking the day before the measurement, and not eat or drink caffeine 2 h before the measurement to ensure that accurate EEG and ECG signals were obtained.

The EEG and ECG electrodes were attached to the participants on the measurement day. A 15-minute control period was used to familiarize the participants with the environment, temperature, and measuring equipment. Prior to presenting the images, the participants were instructed to look at the images carefully and to look away or close their eyes if they felt uncomfortable. The participants were informed that the screen would remain blank for a short duration before each image was presented and were encouraged to use this time to clear their minds of all thoughts, feelings, and memories. Each image stimulus was presented for 5 s, followed by a blank screen for 60 s, during which the participants completed the questionnaire. This process was repeated for all 12 images (see the sketch below), which were presented in random order for each participant.
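The timing described above (5-s image, 60-s questionnaire break, 12 images in a participant-specific random order) could be scripted roughly as follows. The paper does not describe the presentation software, so the callback functions, loop structure, and file handling here are purely illustrative assumptions.

```python
import random
import time

IMAGE_DURATION_S = 5     # image stimulus duration
BREAK_DURATION_S = 60    # blank screen; participant fills in the questionnaire

def run_session(image_paths, show_image, show_blank):
    """Present all images in random order with the timing used in the study.

    `show_image` and `show_blank` are placeholder callbacks for whatever
    presentation software is actually used.
    """
    order = list(image_paths)
    random.shuffle(order)                  # random order for each participant
    for path in order:
        show_blank()                       # brief blank screen before each image
        show_image(path)
        time.sleep(IMAGE_DURATION_S)
        show_blank()
        time.sleep(BREAK_DURATION_S)       # 60-s questionnaire period
    return order                           # record the order actually presented
```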

After each image, the questionnaire asked the participants to indicate which of six emotions they felt: happiness, sadness, fear, surprise, anger, or disgust. These six emotions have been studied extensively by psychologists [16]. In addition, a 10-cm visual analog scale (VAS) was used to rate emotional intensity, and the questionnaire asked whether the participant had viewed the image previously and whether they had looked away during the image presentation. The participants were instructed to write the evoked emotion in the margin if it was inconsistent with those listed in the questionnaire. To ensure that the questionnaire was completed accurately, three versions were prepared, each listing the emotions in a different order in the first section.

2.3. Participants

Ten healthy individuals, comprising four males and six females (age: 36.1 ± 15.8 years), participated in this study.

2.4. Data Analysis

Spearman’s rank correlation coefficient was used to analyze the correlation between the emotions evoked by the stimulus images and those described in the questionnaire. In addition, data with VAS scores >5 cm were categorized as “highly emotional” to confirm that the changes in the biometric signals were accurate.
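As a sketch of this step, Spearman's rank correlation between the image-evoked and questionnaire-reported emotions (coded as integers) and the VAS threshold could be computed as follows; the emotion coding, variable names, and example values are illustrative assumptions.

```python
import numpy as np
from scipy.stats import spearmanr

# Emotions coded as integers so that rank correlation can be computed (illustrative coding)
EMOTIONS = {"happiness": 0, "sadness": 1, "fear": 2,
            "surprise": 3, "anger": 4, "disgust": 5}

intended = ["happiness", "sadness", "fear", "disgust"]   # emotion targeted by each image
reported = ["happiness", "fear", "fear", "disgust"]      # emotion stated in the questionnaire
vas_cm = np.array([7.2, 3.1, 6.4, 8.0])                  # VAS intensity ratings in cm

rho, p = spearmanr([EMOTIONS[e] for e in intended],
                   [EMOTIONS[e] for e in reported])
print(f"Spearman rho = {rho:.3f}, p = {p:.3f}")

# Keep only trials rated as "highly emotional" (VAS > 5 cm)
highly_emotional = vas_cm > 5
print(f"{highly_emotional.sum()} of {len(vas_cm)} trials kept")
```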

The image stimulation (event) and post-image stimulation (post-event) data were compared to confirm differences in the rate of change of the biometric signals due to emotion. The presence of noise was determined visually when comparing the EEG and ECG data, and 20 s of the preceding resting period (60 s) was used as the control, according to the machine specifications. The real-time emotion analysis software was used to analyze the changes during image stimulation (5 s) and post-image stimulation (5 s) relative to this control. The average values for image stimulation and post-image stimulation were extracted from the real-time emotion analysis, and the Wilcoxon signed-rank test was used to confirm the variations in each biological signal.

The changes in the β/θ and RMSSD values during and after image stimulation were examined for each emotion. In addition, the following analysis was conducted to identify the factors contributing to the variations in the biometric signals of each emotion. The data were divided into two groups for each emotion: the first group comprised data showing an increase in the value of each biometric signal from image stimulation to post-image stimulation, and the second group comprised data showing a decrease in the value of each biological signal. The variations in the RMSSD value from image stimulation to post-image stimulation when the β/θ value increased and decreased were evaluated using the Wilcoxon signed-rank test. Similarly, the variations in the β/θ value when the RMSSD value increased and decreased from image stimulation to post-image stimulation were also evaluated using the Wilcoxon signed-rank test.
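A minimal sketch of this grouping and test, assuming paired event/post-event values per trial, is shown below. The data layout (dicts of per-trial arrays) and the example numbers are assumptions made for illustration only.

```python
import numpy as np
from scipy.stats import wilcoxon

def test_rmssd_by_beta_theta_direction(event, post):
    """Split trials by whether beta/theta rose or fell from event to post-event,
    then test the paired RMSSD change within each group (Wilcoxon signed-rank test).

    `event` and `post` are dicts holding 'beta_theta' and 'rmssd' arrays,
    with one value per trial (hypothetical data layout).
    """
    delta_bt = post["beta_theta"] - event["beta_theta"]
    results = {}
    for label, mask in [("beta_theta_up", delta_bt > 0),
                        ("beta_theta_down", delta_bt < 0)]:
        if mask.sum() >= 2:   # need at least a couple of pairs for the test
            stat, p = wilcoxon(event["rmssd"][mask], post["rmssd"][mask])
            results[label] = (stat, p)
    return results

# Illustrative data: six trials for one emotion at one electrode site
event = {"beta_theta": np.array([1.2, 1.5, 0.9, 1.1, 1.4, 1.3]),
         "rmssd":      np.array([30.0, 28.0, 35.0, 32.0, 29.0, 31.0])}
post  = {"beta_theta": np.array([1.4, 1.3, 1.2, 1.0, 1.6, 1.5]),
         "rmssd":      np.array([34.0, 30.0, 38.0, 33.0, 33.0, 35.0])}
print(test_rmssd_by_beta_theta_direction(event, post))
```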

2.5. Ethical Considerations

The procedure and objectives of this study were explained thoroughly in writing to the participants, and written consent was obtained. This study was approved by the Ethics Review Committee of Kinjo Gakuin University (approval date: 2020/2/25; approval no.: R19015).

3. Results

3.1. Features and Tendencies of the Emotions Evoked by the Questionnaire and Stimulus Images

Spearman’s rank correlation between the emotions evoked by the stimulus images and those stated in the questionnaire was low (correlation coefficient = 0.351). Therefore, whenever the emotion intended by the presented image differed from the emotion stated in the questionnaire, the emotion stated in the questionnaire was prioritized in the analysis.

3.2. The Circumplex Model of Affect during Image Stimulation

The Circumplex Model of Affect revealed that most of the biometric signals appeared in the upper-right or lower-right quadrant of the graph. However, the movements of these signals varied depending on the emotion.

3.3. Factors Contributing to the Variation among the Biological Signals of Each Emotion

Figure 2 depicts the movement of the RMSSD values of four emotions (disgust, happiness, sadness, and surprise) during (event) and after (post-event) image stimulation. Anger and fear were excluded from the analysis owing to the small number of responses. Happiness and disgust resulted in a significant increase in the RMSSD values from image stimulation to post-image stimulation (Figure 2). No trend was observed in the RMSSD values for sadness or surprise (Figure 2). Similarly, no trend was observed in the β/θ values of each emotion from image stimulation to post-image stimulation.

Figure 2. RMSSD variation from image stimulus to post-image stimulus (*p < 0.05; Wilcoxon signed-rank test).

A comparison of the average values for each emotion revealed a variation in the movement of the biometric signals. Therefore, the RMSSD results from image stimulation to post-image stimulation were divided into groups in which the β/θ value increased or decreased, to clarify the factors that cause variation among the biological signals of each emotion (Figure 3 and Figure 4). The RMSSD value for disgust increased significantly in C4 and tended to increase in C3 when the β/θ values increased from image stimulation to post-image stimulation (Figure 3(A)). No trend was observed in the changes in the RMSSD values for disgust in Cz when the β/θ value increased (Figure 3(A)). The RMSSD values for disgust increased significantly in Cz and C3 when the β/θ values decreased from image stimulation to post-image stimulation; however, the RMSSD variability in C4 showed no trend (Figure 3(B)). Figure 4 shows the results for happiness. The RMSSD values for happiness tended to increase in Cz and C3 when the β/θ values increased from image stimulation to post-image stimulation; however, the RMSSD variability in C4 showed no trend (Figure 4(A)). The RMSSD variability at all EEG measurement sites (C4, Cz, and C3) showed no trend when the β/θ values decreased from image stimulation to post-image stimulation (Figure 4(B)). No characteristic variations in the RMSSD or β/θ values were observed for the other emotions at any EEG measurement site.

4. Discussion

This study used EEG and HRV, which fluctuate with emotionally arousing image stimuli, to identify characteristic variations in the biometric signals for each emotion.

The majority of the data were observed in the upper or lower right when the results of the present study were applied to the Circumplex Model of Affect, indicating happiness or relaxation. In addition, the results revealed that the RMSSD value increases when the participants are happy (Figure 2) and that the RMSSD value tends to increase as the β/θ value increases (Figure 4). These findings indicate high levels of comfort and arousal, consistent with happiness in the Circumplex Model of Affect [13] [14]. These results suggest that happiness was the only emotion evoked in this study, according to the Circumplex Model of Affect.

Figure 3. Disgust: RMSSD change from image stimulus to post-image stimulus during β/θ change (A) increase in β/θ from image stimulus to post-image stimulus, (B) decrease in β/θ from image stimulus to post-image stimulus (*p < 0.05; Wilcoxon signed-rank test).

Emotions other than happiness did not follow the Circumplex Model of Affect; however, the RMSSD value for disgust increased significantly (Figure 2). The RMSSD value also increased as the β/θ value increased in C4 and when the β/θ value decreased in C3 and Cz (Figure 3). Thus, the movement of β/θ differs depending on the site of EEG measurement. Disgust has been reported to activate the cortical ventral attention network, which is usually controlled by a right-hemisphere brain network, as well as somatomotor activity in the C3, Cz, and C4 regions [17]. The increase in the β/θ value in C4 may be due to ventral attention activity in the C4 region, which measures the right side of the brain. Thus, C4 cannot be excluded during the evaluation of disgust.

The present study demonstrated that it is difficult to examine emotions other than happiness and disgust using EEG signals and RMSSD.

Figure 4. Happiness: RMSSD change from image stimulus to post-image stimulus during β/θ change (A) increase in β/θ from image stimulus to post-image stimulus, (B) decrease in β/θ from image stimulus to post-image stimulus (Wilcoxon signed-rank test).

Biometric signals could be used to examine face-to-face communication in the future by improving the current analysis methods. In addition, an objective and non-invasive evaluation of the effect of healthcare professionals’ communication skills on patients’ psychological state may be possible using biometric signals. Systems based on biometric signals could also be used to assess communication skills, helping to provide better medical care and to understand patients’ emotions in settings with limited contact, such as online clinics.

5. Limitation

Anger and fear were not analyzed owing to the small sample size. Furthermore, the EEG sites to which each emotion corresponded varied. In addition, no trend was observed in the RMSSD values for surprise or sadness.

HRV enhancement occurs when visual and auditory stimuli originate from the same source [18]. The effect on HRV was likely small because only visual stimuli were used in the experiment. The small number of anger and fear responses suggests that these emotions had less of an effect on HRV than the other emotions when only visual stimuli were used.

Furthermore, anger and fear preferentially activate the “dorsal attention,” “fronto-parietal,” and “default mode” networks in the cortex, whereas happiness, sadness, and disgust preferentially activate the “somatomotor” and “ventral attention (or salience)” networks [17]. The somatomotor network corresponds to the C3, Cz, and C4 sites, and a trend was observed for happiness and disgust in the present study (Figure 3 and Figure 4). Thus, C3, Cz, and C4 cannot be excluded from the measurement of happiness and disgust. Because the small sample size did not permit the analysis of anger and fear, future studies on emotions should also consider the frontal lobe and other measurement sites.

6. Conclusion

These results suggest that happiness can be evaluated using the Circumplex Model of Affect. Disgust did not fit the Circumplex Model of Affect, but an increased RMSSD value was shown to be characteristic of this emotion. No distinct trends were observed for emotions other than happiness or disgust in the present study. The current analysis method can be improved to investigate face-to-face communication using biometric signals in the future.

Acknowledgements

We express our sincere thanks to Dr. Tanaka for giving us useful advice on how to proceed with our research and the framework of our study.

Author Contributions

All authors conceived the study and assembled the conceptualization, methodology, validation, and visualization. Yukina Miyagi, Yuka Miyachi, and Taeyuki Oshima participated in data curation. Yukina Miyagi, Saori Gocho, Yuka Miyachi, Chika Nakayama, and Taeyuki Oshima participated in investigations and formal analysis. Yukina Miyagi and Taeyuki Oshima were involved in obtaining funding. Yukina Miyagi was responsible for manuscript resources, software, and project administration and prepared this manuscript for publication. Shoshiro Okada, Kenta Maruyama, and Taeyuki Oshima have supervised this research and reviewed the manuscript and writing process.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

References

[1] Dzedzickis, A., Kaklauskas, A. and Bucinskas, V. (2020) Human Emotion Recognition: Review of Sensors and Methods. Sensors, 20, 592.
https://doi.org/10.3390/s20030592
[2] Hama, H., Suzuki, N., Hama, Y., Umemoto, T. and Ooyama, T. (2001) Invitation to Emotional Psychology: An Approach to Feelings and Emotions [Translated from Japanese.]. SAIENSU-SHA, Tokyo.
[3] Miyata, Y. (1996) Brain and Mind [Translated from Japanese.]. Baifukan, Tokyo.
[4] Fujinaga, H. (2003) Heart Rate Fluctuation and Affect. The Wakayama Economic Review, 314, 23-57.
[5] Miyagi, Y., Gocho, S., Yamaguchi, N., Miyachi, Y., Nakayama, C., et al. (2023) Predicting the Effect of Pharmacist’s Communication with Patients: Medical Communication Analysis Using Facial Responses. Journal of Pharmaceutical Health Services Research, 14, 221-227.
https://doi.org/10.1093/jphsr/rmad029
[6] Okuma, T., Matsuoka, H. and Ueno, T. (2006) Electroencephalogram Decoding Step by Step Introduction [Translated from Japanese.]. Igaku Shoin, Tokyo.
[7] Katoh, Z. and Okubo, T. (2006) How to Measure Biological Functions for Beginners [Translated from Japanese.]. Japan Publication Service, Tokyo.
[8] Lubar, J.F. (1991) Discourse on the Development of EEG Diagnostics and Biofeedback for Attention-Deficit/Hyperactivity Disorders. Biofeedback and Self-Regulation, 16, 201-225.
https://doi.org/10.1007/BF01000016
[9] Hayashi, H. (1999) Clinical Application of Heart Rate Variability—Physiological Significance, Pathological Evaluation, and Prognostic Prediction [Translated from Japanese.]. Igaku Shoin, Tokyo.
[10] Shaffer, F. and Ginsberg, J.P. (2017) An Overview of Heart Rate Variability Metrics and Norms. Frontiers in Public Health, 5, 258.
https://doi.org/10.3389/fpubh.2017.00258
[11] Koelstra, S., Muhl, C., Soleymani, M., Lee, J.-S., Yazdani, A., et al. (2012) DEAP: A Database for Emotion Analysis; Using Physiological Signals. IEEE Transactions on Affective Computing, 3, 18-31.
https://doi.org/10.1109/T-AFFC.2011.15
[12] Russell, J.A. (1980) A Circumplex Model of Affect. Journal of Personality and Social Psychology, 39, 1161-1178.
https://doi.org/10.1037/h0077714
[13] Feldman Barrett, L. and Russell, J.A. (1998) Independence and Bipolarity in the Structure of Current Affect. Journal of Personality and Social Psychology, 74, 967-984.
https://doi.org/10.1037/0022-3514.74.4.967
[14] Russell, J.A. and Barrett, L.F. (1999) Core Affect, Prototypical Emotional Episodes, and Other Things Called Emotion. Journal of Personality and Social Psychology, 76, 805-819.
https://doi.org/10.1037/0022-3514.76.5.805
[15] Marchewka, A., Żurawski, Ł., Jednoróg, K. and Grabowska, A. (2013) The Nencki Affective Picture System (NAPS): Introduction to a Novel, Standardized, Wide-Range, High-Quality, Realistic Picture Database. Behavior Research Methods, 46, 596-610.
https://doi.org/10.3758/s13428-013-0379-1
[16] Peter, C. and Herbon, A. (2006) Emotion Representation and Physiology Assignments in Digital Systems. Interacting with Computers, 18, 139-170.
https://doi.org/10.1016/j.intcom.2005.10.006
[17] Wager, T.D., Kang, J., Johnson, T.D., Nichols, T.E., Satpute, A.B., et al. (2015) A Bayesian Model of Category-Specific Emotional Brain Responses. PLOS Computational Biology, 11, e1004066.
https://doi.org/10.1371/journal.pcbi.1004066
[18] Brouwer, A.-M., Van Wouwe, N., Mühl, C., Van Erp, J. and Toet, A. (2013) Perceiving Blocks of Emotional Pictures and Sounds: Effects on Physiological Variables. Frontiers in Human Neuroscience, 7, 295.
https://doi.org/10.3389/fnhum.2013.00295

Copyright © 2024 by authors and Scientific Research Publishing Inc.

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.