This thesis investigates robots with emotional facial expressions that, through a vision system, observe the emotional states conveyed by the artificial facial expressions of other robots and thereby interact with and influence one another. Using image feature processing, features are inferred from the aspect ratios of different expression images, and the coordinates of the current emotional state are statistically inferred from these features. The control points used to drive the variation of each artificial expression are also defined, and facial expressions carrying emotional states are classified and constructed on a two-dimensional emotion coordinate system, so that the change in expression features between different emotional states can be computed from their coordinate values. To bring the expression of emotion closer to typical human characteristics, these control points are passed through Bezier-curve computation to produce smoother expressions; this expression design is then combined with the emotion coordinate system to generate diverse and subtle emotional behavior outputs. These behaviors are fused into the robot's behavior design while in motion, so that the robot possesses diverse, subtle, and emotionally expressive overall behavior, and its facial emotional expression output brings its expression and movement closer to the richness of human life. Finally, overall performance is tested on physical robots: two robots with different emotional characteristics influence each other and change each other's emotional state, verifying their capability for emotional interaction as social robots. The robot hardware is built around a My RIO® main controller and two microcontrollers, an MSP430 and a MEGA2560; the software is developed on the LabVIEW platform, which executes the inference of the robots' interactive emotional behavior.
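As a minimal sketch of the inference step described above, the following Python fragment maps observed aspect ratios of facial entities to a point on a two-dimensional emotion coordinate system. The function names, the choice of valence/arousal axes, and the neutral reference ratios are all assumptions for illustration, not the thesis's actual parameters.

```python
# Hypothetical sketch: inferring a 2D emotion coordinate from the
# aspect ratios of facial entities. Axis semantics and neutral
# reference values are assumed, not taken from the thesis.

def aspect_ratio(width, height):
    """Width-to-height ratio of a facial entity (e.g. mouth or eyebrow)."""
    return width / height

def emotion_coordinate(mouth_ratio, brow_ratio,
                       neutral_mouth=3.0, neutral_brow=5.0):
    """Infer an assumed (valence, arousal) point from the deviation of
    the observed ratios from their neutral reference values."""
    valence = mouth_ratio - neutral_mouth   # wider mouth -> more positive
    arousal = neutral_brow - brow_ratio     # taller (raised) brow -> higher arousal
    return valence, arousal
```

In practice the thesis derives this mapping statistically from expression images rather than from fixed reference values; the sketch only shows the direction of the computation, from measured ratios to a state coordinate.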
This thesis discusses the design and generation of artificial facial expressions under different emotional states for a social robot, as well as the visual identification of the emotional state of another social robot displaying such expressions. The internal emotional state of the robot is traced continuously by a two-dimensional dynamic model, and the facial expression is generated in real time according to that state. The control points used to produce the graphic facial output are defined, and a continuous Bezier curve function constructed from these control points generates facial entities such as the eyebrows and mouth. The aspect ratio of certain facial entities is used as the observed feature for emotional-state identification. A study of typical emotional facial expressions is conducted, and the key expressions for each mood are used to construct the control-point parameters accordingly. A continuous function for the control points, dependent on the varying emotional state, is then obtained by curve fitting. The parametric form of the Bezier function produces smooth, dynamic variation of the facial expression that closely resembles a living creature. Behavior-based programming is used to control the whole robotic system, with a behavior arbitration mechanism producing the final coordinated behavior output. A principal reason for using artificial facial expressions is to maximize computational efficiency on a resource-limited, small-scale system. A real-time field test is conducted to verify the functionality of the vision system used for mood identification from facial expressions. The sensitivity of this vision system is analyzed, and the emotional interactions between robots are orchestrated through proper behavior design.
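The Bezier smoothing step above can be sketched as follows: a cubic Bezier curve evaluated from four facial control points yields a smooth outline for an entity such as the mouth. The control-point coordinates here are arbitrary placeholders, not values from the thesis.

```python
# A minimal sketch of the Bezier smoothing step: a cubic Bezier curve
# evaluated from four control points of a facial entity. The control
# points used below are assumed placeholder values.

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate the cubic Bezier polynomial at parameter t in [0, 1]."""
    u = 1.0 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return x, y

# Sample a smooth mouth curve from four assumed control points.
mouth = [cubic_bezier((0, 0), (1, 2), (3, 2), (4, 0), i / 20) for i in range(21)]
```

Varying the control points continuously with the emotional state, as the thesis describes, then animates the entity smoothly between expressions, which is inexpensive enough for a small embedded system.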
To test the performance of the overall robot, two robots with different emotional characteristics influence each other and change each other's emotional state, verifying their ability to interact emotionally as social robots. The hardware consists of two parts: a My RIO master controller together with two microcontrollers, an MSP430 and a MEGA2560. The software is developed on the LabVIEW platform, which executes the inference of the robots' interactive emotional behavior.