基於三維加速度資訊實現應用於手機環境上之情緒手勢輸入法

Affective Input with Gestures from Tracked Three-Axis Acceleration Information for Mobile Environments

Advisor: 鄭穎懋

Abstract


Emotional behavioral responses are an innate affective trait of human beings: everyday movements frequently carry emotional information, gestures are among the most common emotional behaviors, and people can draw on their own experience to understand what each other's emotional behavior conveys. Can such behavior, usually taken for granted, serve as an input method for devices? Mobile phones are technology products closely tied to people's emotional interactions, so we attempt to bring affective gestures into mobile applications. Taking advantage of the current popularity of built-in accelerometers, we use them as the medium for detecting gesture motions and propose an input-method framework for entering affective gestures. Furthermore, applications that reinforce the expression of emotional information by extending the display of emotional behavior have already been developed abroad; emotion is also a fundamental motivation for sharing experiences, and sharing emotional information is a growing trend in today's applications. We therefore take the development of an affective gesture input method as the entry point and foundation for emotional interaction and sharing, summarizing the characteristics of people's affective gesture behavior and developing an input approach. The goal of this thesis is to develop an affective gesture input method suited to mobile environments, and the research is divided into three stages: exploring the features of affective gestures, comparing the recognition algorithms that could serve as the input method's classification mechanism, and planning and implementing an application scenario on an actual handset. The first stage proceeds along two directions, affective gesture sensitivity and the types of affective gestures; starting from the users' point of view, it identifies gestures suitable for affective input on mobile devices, examines how users' perceptions of gesture features and patterns correspond to emotions, and outlines suitable gesture forms on which to base the input method. The second stage builds on the findings of the first: the derived rules are combined with different sets of gesture feature parameters and fed into different recognition methods for a comprehensive comparative evaluation, in order to identify a suitable classification mechanism to serve as the recognizer of the input method. Finally, we present a mobile application scenario, implement the analysis results in a mobile environment, and carry out the corresponding verification. We hope that this attempt, starting from an affective gesture input method, can serve as infrastructure for enhancing people's emotional interaction and exchange, and that it can be applied to other uses as well.
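As a rough illustration of the second stage (comparing recognition methods over different gesture-feature combinations), the sketch below extracts simple statistical features from windowed three-axis acceleration samples and cross-validates two generic classifiers. The feature set, classifier choices, and parameter values are assumptions for illustration only, not the recognition mechanism adopted in this thesis.

```python
# Minimal sketch (not the thesis's actual pipeline): statistical features from
# windowed three-axis acceleration samples, followed by a cross-validated
# comparison of two off-the-shelf classifiers. Feature choices, classifier
# choices, and parameter values are illustrative assumptions.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

def extract_features(window):
    """window: (N, 3) array of x/y/z acceleration samples for one gesture."""
    magnitude = np.linalg.norm(window, axis=1)
    return np.concatenate([
        window.mean(axis=0),                  # mean acceleration per axis
        window.std(axis=0),                   # variability per axis
        [magnitude.max(), magnitude.mean()],  # peak and average intensity
    ])

def compare_classifiers(gesture_windows, emotion_labels):
    """Cross-validate several candidate recognizers on the same feature set."""
    X = np.array([extract_features(w) for w in gesture_windows])
    y = np.array(emotion_labels)
    candidates = {
        "kNN": KNeighborsClassifier(n_neighbors=3),
        "SVM (RBF)": SVC(kernel="rbf"),
    }
    return {name: cross_val_score(clf, X, y, cv=5).mean()
            for name, clf in candidates.items()}
```

In practice the candidate set would be replaced by whichever recognition methods are actually evaluated in the comparison stage, and the feature vector by the gesture-feature parameter combinations derived in the first stage.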

English Abstract


Mobile technologies mediate the expression of feelings and emotions via communicative channels such as voice calls, text messages (SMS), and multimedia messages (MMS). Gestures, on the other hand, provide behavioural cues and social signals that help people understand affective messages in face-to-face communication, and this information is usually lost in electronic communication without video. In recent years, mobile devices with built-in accelerometers have become popular, enabling gestural applications such as music control and games. Motion detection has also been used as an interaction method for recognizing hand gestures, entering text, and similar tasks. The aim of this study is to facilitate affective interaction with gestures on mobile handsets, which requires exploring the features of affective gestures for interaction via mobile devices. This paper presents a user study of gestures for affective interaction via mobile devices, in which three-axis acceleration information was used to capture and evaluate users' emotions through their affective gestures. Features of affective gestures applicable in the context of mobile devices were measured, and a user study of affective gestures based on users' expressions is presented. First, the boundaries between the distinct gestural types for different emotions are extracted, and the gestural features of the different affective gestures are listed. Second, the raw acceleration data of the different affective gestures are measured and then analyzed via machine learning; we compared different gestural features and recognition methods and present a suitable model for gestural classification. Finally, we present a scenario and a user study for affective input with gestures in a mobile environment, and implement the gestural input method to classify different emotions. The results of the user tests show the achieved emotion recognition rates.
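To make the input pipeline concrete, the sketch below shows one plausible way to segment candidate gestures from a continuous acceleration stream before classification, by thresholding the deviation of the acceleration magnitude from gravity. The threshold and minimum-length values are assumptions for illustration and are not taken from this study.

```python
# Minimal sketch (assumptions, not taken from the thesis): segmenting candidate
# gestures from a continuous three-axis acceleration stream by thresholding the
# deviation of the acceleration magnitude from gravity. Threshold and minimum
# segment length are illustrative values.
import numpy as np

GRAVITY = 9.81          # m/s^2, magnitude expected when the handset is at rest
START_THRESHOLD = 2.0   # m/s^2 deviation that marks gesture onset (assumed)
MIN_SAMPLES = 10        # discard segments shorter than this (assumed)

def segment_gestures(stream):
    """stream: (N, 3) acceleration samples; yields (start, end) index pairs."""
    deviation = np.abs(np.linalg.norm(stream, axis=1) - GRAVITY)
    active = deviation > START_THRESHOLD
    start = None
    for i, is_active in enumerate(active):
        if is_active and start is None:
            start = i                      # gesture onset
        elif not is_active and start is not None:
            if i - start >= MIN_SAMPLES:   # keep only sufficiently long segments
                yield (start, i)
            start = None
    if start is not None and len(active) - start >= MIN_SAMPLES:
        yield (start, len(active))         # stream ended mid-gesture
```

Each yielded segment could then be passed to a feature extractor and classifier such as the one sketched above.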


Cited by


王靜茹 (2013). 應用情緒運算於現場表演之APP開發研究 [Master's thesis, National Chiao Tung University]. Airiti Library. https://doi.org/10.6842/NCTU.2013.00707
