| [1] |
Gasteiger N, Hellou M, Ahn H S. Factors for personalization and localization to optimize human–robot interaction: A literature review[J]. International Journal of Social Robotics, 2023, 15(4): 689-701.
|
| [2] |
工业和信息化部. “十四五”机器人产业发展规划[EB/OL]. (2021-12-28)[2025-11-16].
|
| [3] |
工业和信息化部, 国家发展和改革委员会, 教育部, 等. “十四五”智能制造发展规划[EB/OL]. (2021-12-21)[2025-11-16].
|
| [4] |
中共中央. 中共中央关于制定国民经济和社会发展第十五个五年规划的建议[EB/OL]. (2025-10-28)[2025-11-16].
|
| [5] |
国家新一代人工智能治理专业委员会. 《新一代人工智能伦理规范》发布[EB/OL]. (2021-09-25)[2025-11-16].
|
| [6] |
European Commission. Artificial Intelligence Act[EB/OL]. (2024-08-01)[2025-11-16].
|
| [7] |
National Institute of Standards and Technology. AI risk management framework[EB/OL]. (2023-01-26)[2025-11-16].
|
| [8] |
Bartosiak N, Gałuszka A, Wojnar M. Implementation of a neural network for the recognition of emotional states by social robots, using “OhBot”[M]//Advances in Computational Intelligence. Cham: Springer Nature Switzerland, 2023: 181-193.
|
| [9] |
Laohakangvalvit T, Subsa-ard N, Fulini F Y, et al. Improving facial emotion recognition model in social robot using graph-based techniques with 3D face orientation[C]//2024 12th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW). Piscataway, New Jersey: IEEE, 2025: 234-237.
|
| [10] |
Yu Chuang, Tapus A. Multimodal emotion recognition with thermal and RGB-D cameras for human-robot interaction[C]//Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction. New York: ACM, 2020: 532-534.
|
| [11] |
Ramis S, Buades J M, Perales F J, et al. Using a social robot to evaluate facial expressions in the wild[J]. Sensors, 2020, 20(23): 6716.
|
| [12] |
Sham A H, Khan A, Lamas D, et al. Towards context-aware facial emotion reaction database for dyadic interaction settings[J]. Sensors, 2023, 23(1): 458.
|
| [13] |
Mishra C, Skantze G, Hagoort P, et al. Perception of emotions in human and robot faces: Is the eye region enough?[M]//Social Robotics. Singapore: Springer Nature Singapore, 2025: 290-303.
|
| [14] |
Ruiz-Garcia A, Webb N, Palade V, et al. Deep learning for real time facial expression recognition in social robots[M]//Neural Information Processing. Cham: Springer International Publishing, 2018: 392-402.
|
| [15] |
Biçer E, Takır Ş, Gürpınar C, et al. Masking and compression techniques for efficient action unit detection of children for social robots[C]//2022 30th Signal Processing and Communications Applications Conference (SIU). Piscataway, New Jersey: IEEE, 2022: 1-4.
|
| [16] |
Jaiswal S, Nandi G C. Optimized, robust, real-time emotion prediction for human-robot interactions using deep learning[J]. Multimedia Tools and Applications, 2023, 82(4): 5495-5519.
|
| [17] |
Verma A, Gavali M. Ensemble of large self-supervised transformers for improving speech emotion recognition[J]. International Journal of Data Mining, Modelling and Management, 2025, 17(2): 10065871.
|
| [18] |
Mishra R, Frye A, Rayguru M M, et al. Personalized speech emotion recognition in human-robot interaction using vision transformers[J]. IEEE Robotics and Automation Letters, 2025, 10(5): 4890-4897.
|
| [19] |
Grágeda N, Alvarado E, Mahu R, et al. Distant speech emotion recognition in an indoor human-robot interaction scenario[C]//INTERSPEECH 2023. ISCA, 2023: 3657-3661.
|
| [20] |
Ahuja S, Shabani A. Affective computing for social companion robots using fine-grained speech emotion recognition[C]//2023 IEEE Conference on Artificial Intelligence (CAI). Piscataway, New Jersey: IEEE, 2023: 331-332.
|
| [21] |
Szabóová M, Sarnovský M, Maslej Krešňáková V, et al. Emotion analysis in human–robot interaction[J]. Electronics, 2020, 9(11): 1761.
|
| [22] |
Ashok A, Pawlak J, Paplu S, et al. Paralinguistic cues in speech to adapt robot behavior in human-robot interaction[C]//2022 9th IEEE RAS/EMBS International Conference for Biomedical Robotics and Biomechatronics (BioRob). Piscataway, New Jersey: IEEE, 2022: 1-6.
|
| [23] |
Zhao Mingyi, Gong Linrui, Din A S. A review of the emotion recognition model of robots[J]. Applied Intelligence, 2025, 55(6): 364.
|
| [24] |
Staffa M, D'Errico L, Sansalone S, et al. Classifying human emotions in HRI: Applying global optimization model to EEG brain signals[J]. Frontiers in Neurorobotics, 2023, 17: 1191127.
|
| [25] |
Alimardani M, Hiraki K. Passive brain-computer interfaces for enhanced human-robot interaction[J]. Frontiers in Robotics and AI, 2020, 7: 125.
|
| [26] |
Mishra R, Welch K C. Towards forecasting engagement in children with autism spectrum disorder using social robots and deep learning[C]//SoutheastCon 2023. Piscataway, New Jersey: IEEE, 2023: 838-843.
|
| [27] |
Kothig A, Muñoz J, Mahdi H, et al. HRI physio lib: A software framework to support the integration of physiological adaptation in HRI[M]//Social Robotics. Cham: Springer International Publishing, 2020: 36-47.
|
| [28] |
Kothig A, Munoz J, Akgun S A, et al. Connecting humans and robots using physiological signals–closing-the-loop in HRI[C]//2021 30th IEEE International Conference on Robot & Human Interactive Communication (RO-MAN). Piscataway, New Jersey: IEEE, 2021: 735-742.
|
| [29] |
Li Chenghao, Seng K P, Ang L M, et al. Gait-to-gait emotional human-robot interaction utilizing trajectories-aware and skeleton-graph-aware spatial-temporal transformer[J]. Sensors, 2025, 25(3): 734.
|
| [30] |
Chen Luefeng, Feng Yu, Maram M A, et al. Multi-SVM based Dempster-Shafer theory for gesture intention understanding using sparse coding feature[J]. Applied Soft Computing, 2019, 85: 105787.
|
| [31] |
Powell H, Laban G, George J N, et al. Is deep learning a valid approach for inferring subjective self-disclosure in human-robot interactions?[C]//2022 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI). Piscataway, New Jersey: IEEE, 2022: 991-996.
|
| [32] |
Duncan J A, Alambeigi F, Pryor M W. A survey of multimodal perception methods for human-robot interaction in social environments[J]. ACM Transactions on Human-Robot Interaction, 2024, 13(4): 1-50.
|
| [33] |
Song Xinheng, Liu Chang, Xu Linci, et al. Affective computing methods for multimodal embodied AI human-computer interaction[J]. Aslib Journal of Information Management, 2025: 1-25.
|
| [34] |
Chen Luefeng, Li Min, Wu Min, et al. Coupled multimodal emotional feature analysis based on broad-deep fusion networks in human-robot interaction[J]. IEEE Transactions on Neural Networks and Learning Systems, 2024, 35(7): 9663-9673.
|
| [35] |
Chen Luefeng, Su Wanjuan, Feng Yu, et al. Two-layer fuzzy multiple random forest for speech emotion recognition in human-robot interaction[J]. Information Sciences, 2020, 509: 150-163.
|
| [36] |
Jiang Yutong, Shao Shuai, Dai Yaping, et al. A LLM-based robot partner with multi-modal emotion recognition[C]//Intelligent Robotics and Applications. Singapore: Springer, 2025: 71-83.
|
| [37] |
Liu Xiaofeng, Lv Qincheng, Li Jie, et al. Multimodal emotion fusion mechanism and empathetic responses in companion robots[J]. IEEE Transactions on Cognitive and Developmental Systems, 2025, 17(2): 271-286.
|
| [38] |
Hwang C L, Deng Yuchen, Pu Shihen. Human-robot collaboration using sequential-recurrent-convolution-network-based dynamic face emotion and wireless speech command recognitions[J]. IEEE Access, 2023, 11: 37269-37282.
|
| [39] |
Bethany G, Gupta M. A transformer based emotion recognition model for social robots using topographical maps generated from EEG signals[C]//Human-Computer Interaction. Cham: Springer, 2024: 262-271.
|
| [40] |
Shenoy S, Jiang Yusheng, Lynch T, et al. A self learning system for emotion awareness and adaptation in humanoid robots[C]//2022 31st IEEE International Conference on Robot and Human Interactive Communication (RO-MAN). Piscataway, New Jersey: IEEE, 2022: 912-919.
|
| [41] |
Lu I C, Huang J Y, Lee W P. An emotion-driven and topic-aware dialogue framework for human-robot interaction[J]. Advanced Robotics, 2024, 38(4): 267-281.
|
| [42] |
Tanevska A, Rea F, Sandini G, et al. A socially adaptable framework for human-robot interaction: Correction[J]. Frontiers in Robotics and AI, 2021, 8: 812583.
|
| [43] |
Churamani N, Barros P, Gunes H, et al. Affect-driven learning of robot behaviour for collaborative human-robot interactions[J]. Frontiers in Robotics and AI, 2022, 9: 717193.
|
| [44] |
Tian Leimin, Oviatt S. A taxonomy of social errors in human-robot interaction[J]. ACM Transactions on Human-Robot Interaction, 2021, 10(2): 1-32.
|
| [45] |
Chen Luefeng, Wu Min, Zhou Mengtian, et al. Information-driven multirobot behavior adaptation to emotional intention in human-robot interaction[J]. IEEE Transactions on Cognitive and Developmental Systems, 2018, 10(3): 647-658.
|
| [46] |
Tuyen N T V, Elibol A, Chong N Y. Learning bodily expression of emotion for social robots through human interaction[J]. IEEE Transactions on Cognitive and Developmental Systems, 2021, 13(1): 16-30.
|
| [47] |
Guerrieri A, Braccili E, Sgrò F, et al. Gender identification in a two-level hierarchical speech emotion recognition system for an Italian social robot[J]. Sensors, 2022, 22(5): 1714.
|
| [48] |
Bagheri E, Roesler O, Cao H L, et al. A reinforcement learning based cognitive empathy framework for social robots[J]. International Journal of Social Robotics, 2021, 13(5): 1079-1093.
|
| [49] |
Mascarenhas S, Guimarães M, Prada R, et al. FAtiMA toolkit: Toward an accessible tool for the development of socio-emotional agents[J]. ACM Transactions on Interactive Intelligent Systems, 2022, 12(1): 1-30.
|
| [50] |
Feng S, Sumioka H, Yamato N, et al. Effect of emotional expression on the impression of older people towards baby-like robots[C]//Proceedings of the 12th Conference on Human-Agent Interaction, HAI 2024. New York: ACM, 2024: 414-416.
|
| [51] |
Sobhani M, Smith J, Pipe A, et al. A novel mirror neuron inspired decision-making architecture for human-robot interaction[J]. International Journal of Social Robotics, 2024, 16(6): 1297-1314.
|
| [52] |
Kang H, Ben Moussa M, Thalmann N M. Nadine: A large language model-driven intelligent social robot with affective capabilities and human-like memory[J]. Computer Animation and Virtual Worlds, 2024, 35(4): e2290.
|
| [53] |
Antony V N, Stiber M, Huang C M. Xpress: A system for dynamic, context-aware robot facial expressions using language models[C]//2025 20th ACM/IEEE International Conference on Human-Robot Interaction (HRI). Piscataway, New Jersey: IEEE, 2025: 958-967.
|
| [54] |
Penčić M, Čavić M, Oros D, et al. Anthropomorphic robotic eyes: Structural design and non-verbal communication effectiveness[J]. Sensors, 2022, 22(8): 3060.
|
| [55] |
Löffler D, Schmidt N, Tscharn R. Multimodal expression of artificial emotion in social robots using color, motion and sound[C]//2018 13th ACM/IEEE International Conference on Human-Robot Interaction (HRI). Piscataway, New Jersey: IEEE, 2018: 334-343.
|
| [56] |
Korcsok B, Konok V, Persa G, et al. Biologically inspired emotional expressions for artificial agents[J]. Frontiers in Psychology, 2018, 9: 1191.
|
| [57] |
MacDonald S, Bretin R, ElSayed S. Evaluating transferable emotion expressions for zoomorphic social robots using VR prototyping[C]//2024 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). Piscataway, New Jersey: IEEE, 2024: 1087-1096.
|