即時手勢辨識系統及其於戰場情資顯示平台之應用

A Real-Time Hand Gesture Recognition System and its Application in a Battlefield Information Platform

Advisor: 蘇木春

Abstract


In recent years, hand gesture recognition has drawn experts from many fields into research on human-computer interaction, with common applications including game control, robotic arm operation, robot control, and home appliance control. Its simple and intuitive style of operation is gradually replacing traditional remote controls and input devices.

This thesis proposes a real-time hand gesture recognition system based on depth images and applies it to the control of a battlefield information platform built on NASA World Wind. In addition to displaying basic world map data, the platform we developed provides a wealth of geographic and environmental information to meet the needs of front-line military units.

The system is implemented as follows. The depth images and skeleton data read from the depth camera are first preprocessed to isolate the arm region, and a palm segmentation algorithm then extracts the palm. The distances from every point on the palm contour to the palm centroid form a curve that describes the hand shape. Because this curve is affected by the rotation angle and size of the palm, it is not suitable as a recognition feature on its own; the thesis therefore samples the curve and applies the fast Fourier transform to obtain a set of FFT coefficients in the frequency domain. Since different hand shapes produce different coefficient values, a decision tree classifies six hand shapes from these coefficients. These hand shapes are then combined with upward, downward, leftward, and rightward swipes of both hands to form the six commands needed to operate the battlefield information platform.

The goal of this work is to offer military personnel a new-generation battlefield environment platform that, besides conventional keyboard-and-mouse control, incorporates the latest human-computer interface technology and replaces traditional input devices with gesture control. Finally, every gesture-control function of the system was verified through a series of experiments. In the hand shape recognition experiment the accuracy reached 96.1%, and in the combined gesture detection and recognition experiment the overall average accuracy reached 97.9%, even at different viewing angles.
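To make the feature extraction step concrete, the following is a minimal sketch of the contour-to-centroid distance curve and its FFT descriptor. It assumes a binary palm mask has already been segmented from the depth image and uses OpenCV 4.x and NumPy; the 64-point sampling length and the 8 retained coefficients are illustrative choices, not values taken from the thesis.

```python
import numpy as np
import cv2

def fft_shape_descriptor(hand_mask, n_samples=64, n_coeffs=8):
    """Sketch of the contour-to-centroid distance curve feature.

    hand_mask : 2-D uint8 binary image of the segmented palm.
    Returns the low-frequency FFT magnitudes of the resampled
    distance curve, normalized so the result is scale-invariant.
    """
    # Take the largest external contour as the palm boundary.
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    contour = max(contours, key=cv2.contourArea).reshape(-1, 2)

    # Palm centroid from image moments.
    m = cv2.moments(hand_mask, binaryImage=True)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]

    # Distance from every boundary point to the centroid.
    dist = np.hypot(contour[:, 0] - cx, contour[:, 1] - cy)

    # Resample the curve to a fixed length so the FFT size is constant.
    idx = np.linspace(0, len(dist) - 1, n_samples)
    curve = np.interp(idx, np.arange(len(dist)), dist)

    # FFT magnitudes are unchanged by a circular shift of the curve's
    # starting point; dividing by the DC term removes the scale factor.
    spectrum = np.abs(np.fft.fft(curve))
    return spectrum[1:n_coeffs + 1] / spectrum[0]
```

Dividing by the DC term makes the descriptor insensitive to hand size, and taking magnitudes discards the starting-point dependence that in-plane rotation introduces, which is why the raw distance curve becomes usable as a hand-shape feature.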

Parallel Abstract


Over the past few years, gesture recognition research in human-computer interaction has attracted experts' attention in various fields; common applications include game control, robotic arm operation, robot control, household appliance control, and so on. Because it is convenient and intuitive, hand-gesture-based control will gradually replace traditional remote controls and input devices. This thesis presents a real-time hand gesture recognition system based on depth images and applies it to a battlefield information platform built on NASA World Wind. The NASA World Wind-based platform not only displays world map information but also offers a range of geographic and environmental information that satisfies military needs. The proposed hand gesture recognition system is implemented as follows. First, we locate the arm region in an image captured by the Kinect using the necessary depth image preprocessing and skeleton tracking operators. Second, a hand extraction algorithm segments the hand shape from the arm region. We then use the curve of distances between the hand boundary and the hand center as a feature describing the hand shape. However, this feature is unsuitable for direct hand shape recognition because it is still affected by the rotation angle and the size of the hand. We therefore sample the distance curve and transform it with the fast Fourier transform to obtain its frequency-domain coefficients. Because different hand shapes yield different Fourier coefficients, a decision tree operating on these coefficients is adopted to recognize six hand shapes. Combinations of these hand shapes and movements of both hands are then used as commands to control the battlefield information platform. The aim of this system is to provide a new-generation battlefield platform for the military: in addition to traditional keyboard and mouse operation, it introduces the latest human-computer interaction technology. Finally, several experiments were designed to evaluate the functionality of the proposed real-time hand gesture recognition system. In the hand shape recognition experiments the accuracy was 97.1%, and in the two-hand command experiments the accuracy reached 97.42% even at different angles.
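The abstract does not describe the structure or thresholds of the decision tree, so the sketch below stands in a generic scikit-learn decision tree trained on labeled FFT descriptors; the training arrays, the depth limit, and the six-class labels are purely illustrative assumptions, not the thesis's actual classifier.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Illustrative stand-ins: in the real system each row of X_train would be
# the FFT descriptor of one training frame and y_train its hand-shape
# label (0-5 for the six hand shapes).
rng = np.random.default_rng(0)
X_train = rng.random((300, 8))
y_train = rng.integers(0, 6, size=300)

tree = DecisionTreeClassifier(max_depth=6, random_state=0)
tree.fit(X_train, y_train)

def recognize_hand_shape(descriptor: np.ndarray) -> int:
    """Map one per-frame FFT descriptor to a hand-shape id (0-5)."""
    return int(tree.predict(descriptor.reshape(1, -1))[0])

# A downstream controller would then combine the recognized shapes of
# both hands with their tracked swipe directions (up/down/left/right)
# to issue one of the six platform commands.
```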

Parallel Keywords

No data
