
結合體感與注意力辨識之情感運算研究-以3D校園導覽為例

The Affective Computing Study of Posture and Attention Recognition - An Example on The 3D School Guide System

Advisor: 吳明霓

Abstract


In recent years, the rise of Affective Computing has fundamentally changed how Human-Computer Interaction (HCI) is designed. To make the interaction between humans and computers more intuitive and humane, a growing number of studies treat human emotional behavior as a key factor in HCI and attempt to incorporate affective computing into their systems. This study therefore takes body movement and eye-movement behavior as factors of human emotional expression, and integrates posture-based emotion and attention-based emotion into a 3D campus guide system built from a real campus. While users tour the campus environment, the system analyzes their body movements and gaze fixations as the basis for emotion judgment, and displays feedback content corresponding to the region and the inferred emotion. In this way, the operational load is reduced and the interactive experience improved, yielding an intuitive, humane, and natural feel. The study developed two control modes that incorporate different emotions: posture-emotion control and attention-emotion control. To examine whether control modes with affective factors improve on previous interactive experiences, keyboard control and motion-sensing control were also included for comparison. The results show that most participants were satisfied with the overall interactive experience of the emotion-recognition modes, that these modes significantly reduced operational load, and that participants found them more interesting than conventional control modes. Finally, the study lists its limitations and directions for future improvement.

Parallel Abstract


Affective Computing has completely changed design thinking in Human-Computer Interaction (HCI) in recent years. To make HCI systems more intuitive and user-friendly, a growing number of studies regard human emotions and behaviors as a key element of HCI and try to introduce affective computing into their systems. This study therefore incorporates body movement and gaze behavior into a 3D School Guide System modeled on a real campus, integrating emotion expressed through posture and attention. While users tour the campus environment, the system analyzes their body movements and gaze as the basis for emotional judgment and shows feedback content corresponding to the region and the emotion. In this way, the operational burden can be reduced and the user's interactive experience improved, achieving an intuitive, user-friendly feel. The study developed two emotion-based operation modes: body-emotion operation and attention-emotion operation. To investigate whether adding emotional factors to the operation improves the interactive experience, keyboard operation and motion-sensing operation were also included for comparison. The results show that most participants were satisfied with the overall interactive experience of the modes that integrated emotion recognition; these modes not only reduced operating load significantly but were also judged more interesting than conventional operation modes. Finally, the study presents its limitations and directions for future work.
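The pipeline the abstract describes — inferring an emotion from gaze fixation and posture, then selecting region-specific feedback — can be illustrated with a minimal sketch. All function names, thresholds, emotion labels, and the feedback table below are illustrative assumptions for exposition, not the thesis's actual recognition model.

```python
# Hypothetical sketch of the emotion-to-feedback flow described in the abstract.
# Thresholds, labels, and the feedback table are assumed, not taken from the thesis.

def attention_emotion(dwell_ms: float, threshold_ms: float = 1500.0) -> str:
    """Read a long fixation on a region as interest, a short one as indifference."""
    return "interested" if dwell_ms >= threshold_ms else "indifferent"

def posture_emotion(openness: float) -> str:
    """Map a posture 'openness' score in [0, 1] to a coarse emotion label:
    an expanded posture reads as positive, a contracted one as negative."""
    if openness >= 0.66:
        return "positive"
    if openness <= 0.33:
        return "negative"
    return "neutral"

def region_feedback(region: str, emotion: str) -> str:
    """Pick the feedback content shown for a campus region given the emotion."""
    feedback = {
        ("library", "interested"): "Show the detailed library introduction.",
        ("library", "indifferent"): "Show only a brief highlight.",
    }
    return feedback.get((region, emotion), "Show the default guide content.")

if __name__ == "__main__":
    emotion = attention_emotion(dwell_ms=2100.0)
    print(region_feedback("library", emotion))
```

The point of the sketch is the data flow (sensor reading → emotion label → feedback lookup), which lets the guide system respond without explicit keyboard commands; the real system would replace the thresholds with trained recognition of Kinect-style skeleton data and eye-tracker fixations.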

