The Emotion Recognition in Psychology of Human-robot Interaction

Authors

  • Mengyao Zhao, Henan Polytechnic University, People's Republic of China

DOI:

https://doi.org/10.59388/pm00331

Keywords:

dynamics of human-robot interaction, emotion recognition, human emotions, human-robot interaction (HRI), human-robot interfaces, machine learning techniques, psychology

Abstract

The field of Human-Robot Interaction (HRI) has attracted significant attention in recent years, as researchers and practitioners seek to understand the psychological processes underlying interactions between humans and robots. One crucial area of focus within HRI is the psychology of emotion recognition, which plays a fundamental role in shaping the dynamics of human-robot interaction. This paper reviews the psychological background of human-robot interaction, emphasizing the importance of understanding human emotions in this domain. The concept of emotion recognition, a key component of human psychology, is explored in detail, and its relevance to HRI is highlighted: emotion recognition allows robots to perceive and interpret human emotions, enabling them to respond appropriately and improve the quality of interaction. The role of emotion recognition in HRI is examined from a psychological standpoint, shedding light on its implications for the design and development of effective human-robot interfaces. The paper then turns to the application of machine learning techniques for emotion recognition in HRI. Machine learning algorithms have shown promise in enabling robots to recognize and respond to human emotions, contributing to more natural and intuitive interactions, and their use in emotion recognition reflects the intersection of psychology and technological advances in HRI. Finally, the challenges associated with emotion recognition in HRI are discussed, including cross-cultural variation in emotional expression, individual differences, and the ethical implications of emotion detection. Addressing these challenges is pivotal to advancing the understanding and implementation of emotion recognition in human-robot interaction and underscores the interdisciplinary nature of this endeavor. In conclusion, the paper highlights the critical role of emotion recognition in the psychology of human-robot interaction and its potential to transform the way humans and robots engage with each other. By integrating insights from psychology, machine learning, and technology, advances in emotion recognition can pave the way for more empathetic and responsive human-robot interactions, opening new avenues for research and practical application in this burgeoning field.
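
To make the machine-learning discussion concrete, the short Python sketch below trains a conventional classifier to map pre-extracted feature vectors (for example, facial-landmark distances or speech prosody statistics) onto discrete emotion labels. It is an illustrative sketch only, not the method described in this paper: the synthetic data, the 32-dimensional feature size, and the four emotion labels are hypothetical placeholders, and the support vector machine is just one of the classifier families commonly applied to emotion recognition.

# Minimal illustrative sketch (not this paper's implementation): a classical
# machine-learning pipeline that maps pre-extracted features to emotion labels.
# Synthetic data stands in for a real emotion dataset.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import classification_report

EMOTIONS = ["anger", "happiness", "sadness", "surprise"]  # hypothetical label set

rng = np.random.default_rng(seed=0)
n_samples, n_features = 400, 32              # hypothetical 32-dim feature vectors
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, len(EMOTIONS), size=n_samples)  # placeholder labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# Scale the features, then fit an RBF-kernel support vector classifier.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test), target_names=EMOTIONS))

# In an HRI loop, the robot would extract the same features from its camera or
# microphone stream and call model.predict() to select an appropriate response.

In practice the placeholder arrays would be replaced by features extracted from a labelled emotion corpus, and deep models (CNNs, RNNs, or multimodal fusion networks) are often substituted for the SVM when raw images or audio are used directly.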

Published

2023-11-21

How to Cite

Zhao, M. (2023). The Emotion Recognition in Psychology of Human-robot Interaction. Psychomachina, 1(1), 1–11. https://doi.org/10.59388/pm00331

Issue

Vol. 1 No. 1 (2023)

Section

Articles