
Hand Gesture Recognition with Time-of-Flight Camera for Smart Glasses

Advisor: Chiou-Shann Fuh (傅楸善)
The full text will be available for download on 2027/03/27.

Abstract


There is much to explore in human-computer interaction for Augmented Reality (AR). The integration of mid-air hand gestures is a particularly important topic, since gestures are an important communication tool in every culture. In this thesis, we propose LuGesture, three gesture sets that are accurate, fast, comfortable, power-efficient, and low-cost. These gestures are recognized with a low-resolution 8x8-pixel Time-of-Flight (ToF) depth camera. The recognition algorithms are computationally lightweight and can run on the Micro-Controller Unit (MCU) of a pair of smart glasses. We also explore several user interfaces to accompany our gesture sets; each interface is chosen according to the characteristics of the corresponding gesture set, and our experiments show excellent results. Our algorithms and user interfaces have been successfully integrated on Jorjin J7EF Plus smart glasses; the integration method and workflow are described in detail in this thesis.

Abstract (English)


There is much to explore for human-computer interaction in Augmented Reality (AR). Integration of mid-air hand gestures is an exceptionally important topic, since gestures facilitate important communication in every culture. In this thesis, we propose three mid-air hand gesture sets (LuGesture) for AR that are accurate, fast, comfortable, power-efficient, and low-cost. These gestures are designed to be recognized with an 8x8-pixel low-resolution Time-of-Flight (ToF) depth camera (ST VL53L5CX). The gesture recognition algorithms are also designed to be fast and can run on a Micro-Controller Unit (MCU: ST STM32F401CE) on a pair of smart glasses. We also explore a few different User Interfaces (UI) to accompany LuGesture. Each UI is chosen based on the characteristics of the corresponding gesture set, and each has shown excellent results in our experiments. Our algorithms and UIs have been packaged and tested successfully on Jorjin J7EF Plus smart glasses. The data pipeline is also described in this thesis.
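The abstract does not detail the recognition algorithms, but one common lightweight approach for such low-resolution depth frames is centroid tracking: threshold each 8x8 frame to find the hand, track the centroid across frames, and classify the displacement as a swipe. The sketch below illustrates this idea in Python; the 300 mm hand threshold, the frame values, and all function names are illustrative assumptions, not the thesis's actual method.

```python
import numpy as np

# Illustrative sketch (not the thesis algorithm): classify a mid-air swipe
# from a sequence of 8x8 depth frames, as a low-resolution ToF sensor such
# as the ST VL53L5CX might produce. Threshold and data are assumptions.

HAND_NEAR_MM = 300  # assumed: pixels closer than 300 mm belong to the hand

def hand_centroid(frame):
    """Return the (row, col) centroid of near pixels, or None if no hand."""
    mask = frame < HAND_NEAR_MM
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

def classify_swipe(frames):
    """Classify a swipe from the net centroid motion over a frame sequence."""
    centroids = [c for c in (hand_centroid(f) for f in frames) if c is not None]
    if len(centroids) < 2:
        return "none"
    dr = centroids[-1][0] - centroids[0][0]  # vertical displacement (rows)
    dc = centroids[-1][1] - centroids[0][1]  # horizontal displacement (cols)
    if max(abs(dr), abs(dc)) < 1.0:          # too little motion to count
        return "none"
    if abs(dc) >= abs(dr):
        return "right" if dc > 0 else "left"
    return "down" if dr > 0 else "up"

# Synthetic example: a 2x2 hand blob moving left to right across the grid.
frames = []
for col in range(0, 7, 2):
    f = np.full((8, 8), 2000)   # background at ~2 m
    f[3:5, col:col + 2] = 150   # hand at ~15 cm
    frames.append(f)
print(classify_swipe(frames))   # -> right
```

Because each frame is only 64 pixels and the per-frame work is a threshold plus a mean, this style of algorithm fits comfortably within the compute and memory budget of an MCU-class processor, which is consistent with the design goals stated in the abstract.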

