Student Shen Zhihao received the Best Paper Award at the IEEE International Conference on Advanced Robotics & Mechatronics (ICARM)
Shen Zhihao (2nd-year doctoral student, Intelligent Robotics Area, Chong Laboratory) received the Best Paper Award at the IEEE International Conference on Advanced Robotics & Mechatronics (ICARM).
The IEEE International Conference on Advanced Robotics & Mechatronics (ICARM) is the premier international conference of the IEEE-SMC (Systems, Man, and Cybernetics Society) Technical Committee on Bio-mechatronics and Bio-robotics Systems and the IEEE-RAS (Robotics & Automation Society) Technical Committee on Neuro-Robotics Systems. This year's ICARM was held at the Toyonaka Campus of Osaka University from July 3 to 5, 2019.
■Date of Award
July 4, 2019
■Paper Title
Nonverbal Behavior Cue for Recognizing Human Personality Traits in Human-Robot Social Interaction
■Authors
Zhihao Shen, Armagan Elibol, and Nak Young Chong
■Abstract
In parallel to breathtaking advancements in robotics, more and more researchers have been focusing on enhancing the quality of human-robot interaction (HRI) by endowing robots with the ability to understand their users' intentions, emotions, and more. Personality traits can be defined as human characteristics that affect the behaviors of the speaker and listener, and the impressions they form of each other. In this paper, we propose a new framework that enables the robot to easily extract the participants' visual features, such as gaze, head motion, and body motion, as well as vocal features, such as pitch, energy, and Mel-Frequency Cepstral Coefficients (MFCC). The experiments were designed on the premise that the robot is an individual participant in the interaction; therefore, the interaction data were collected without any external devices other than the robot itself. The Pepper robot posed a series of questions and recorded the habitual behaviors of each participant, whose personality traits were meanwhile assessed with a questionnaire. Finally, a linear regression model was trained on the participants' habitual behaviors and the personality trait labels. For simplicity, we used binary labels to indicate whether a participant is high or low on each trait. The experimental results showed promising performance in inferring personality traits from the user's simple social cues during social communication with the robot, toward a long-term human-robot partnership.
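As a rough illustration of the pipeline summarized in the abstract, the sketch below trains a linear regression model on per-participant behavioral feature vectors and thresholds its output into a binary high/low decision for a single trait. This is not the authors' implementation: the feature values are synthetic placeholders standing in for the gaze, head-motion, body-motion, pitch, energy, and MFCC statistics extracted by the robot, and the dataset size, the 0.5 threshold, and the use of scikit-learn are assumptions made only for illustration.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_participants, n_features = 40, 10  # hypothetical sizes

# Placeholder feature matrix: one row of behavioral statistics per participant.
X = rng.normal(size=(n_participants, n_features))
# Placeholder binary labels: 1 = high on the trait, 0 = low (from a questionnaire).
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.3, size=n_participants) > 0).astype(float)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Fit a linear regression model on the binary labels, as described in the abstract.
model = LinearRegression().fit(X_train, y_train)

# Threshold the continuous prediction at 0.5 to obtain a high/low decision.
pred = (model.predict(X_test) >= 0.5).astype(float)
print("held-out accuracy:", (pred == y_test).mean())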
July 9, 2019