
利用攝影機二維影像做三維手勢追蹤

3D Hand Posture Tracking Using 2D Camera Images

Advisor: 蔡淳仁

Abstract


This thesis builds a complete hand posture tracking system based on a 3D hand model. For the 2D image processing, we use the Zhengyou Zhang camera calibration routines provided by OpenCV, together with a codebook model for foreground/background discrimination, to extract and track the hand region in the camera images. For the 3D model, we create a 3D hand model with the Blender 3D modeling tool and convert it to the model format of the OGRE 3D library, so that the tracking program can perform basic control of the model and compare it against the 2D hand images to compute the motion parameters of the 3D model. For posture tracking, we experimented with several algorithms: one simply uses properties of the 2D contour to locate the fingertip and palm positions; another follows the Point Distribution Model (PDM) approach, computing the mean shape of the model and applying PCA to the training data so that linear hand motion can be tracked along the eigenvectors; a refined version of PDM replaces the PCA training data with joint rotation angles, so the resulting motion better matches real hand movement. Finally, we use the simplest brute-force approach, perturbing the pose of the 3D model over a large set of candidates to find the model parameters that best match the 2D hand image, and build the basic tracking system on this.
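A minimal sketch of this pre-processing stage, assuming Python with OpenCV 4.x (cv2) and NumPy. The codebook background model used in the thesis comes from OpenCV's legacy C interface and is not exposed in the cv2 bindings, so cv2.createBackgroundSubtractorMOG2 is substituted here purely as a stand-in for foreground/background discrimination; the calibration part follows OpenCV's standard Zhang-style chessboard workflow.

```python
import cv2
import numpy as np


def calibrate_camera(chessboard_images, pattern_size=(9, 6)):
    """Zhang-style calibration from chessboard views; returns the camera
    matrix and distortion coefficients."""
    objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2)
    obj_points, img_points, image_size = [], [], None
    for img in chessboard_images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        image_size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, pattern_size)
        if found:
            obj_points.append(objp)
            img_points.append(corners)
    _, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
        obj_points, img_points, image_size, None, None)
    return camera_matrix, dist_coeffs


def hand_regions(frames, camera_matrix, dist_coeffs):
    """Undistort each frame, segment the foreground, and yield the largest
    foreground contour as the hand candidate."""
    # MOG2 stands in for the thesis's codebook background model (legacy C API).
    bg_model = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)
    for frame in frames:
        rectified = cv2.undistort(frame, camera_matrix, dist_coeffs)
        fg_mask = bg_model.apply(rectified)
        fg_mask = cv2.morphologyEx(fg_mask, cv2.MORPH_OPEN,
                                   np.ones((5, 5), np.uint8))
        contours, _ = cv2.findContours(fg_mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        hand = max(contours, key=cv2.contourArea) if contours else None
        yield rectified, fg_mask, hand
```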

Keywords

Hand Posture Tracking; Model-Based

Abstract (English)


In this thesis, we present the design of a 3D model-based hand posture tracking system using a single sequence of 2D camera images. The system is composed of several stages. In the pre-processing stage, we extract the hand region from each 2D camera image using the OpenCV functions for image rectification and a trained codebook model for foreground/background separation. The 3D hand model is constructed using Blender, a 3D graphics authoring tool, and then converted to the OGRE format so that the OGRE rendering engine can control the movement of the 3D hand model. In the hand posture tracking stage, we implemented several existing methods proposed by other researchers. The extraction of fingertip and palm positions is achieved simply by using the contours of the 2D image. We also implemented the Point Distribution Model (PDM) method, which requires a set of hand posture training data and computes the mean shape and eigenvectors by applying Principal Component Analysis (PCA). We then refined the PDM method by changing the PCA training data from landmark positions to the rotation angles of the joints; in this way, the movement of the animated hand conforms to the kinematic model of a real hand. Finally, this thesis proposes a framework that tracks the hand posture by brute-force matching between candidate postures of the 3D hand model and the observed 2D hand contour to determine the most likely 3D hand posture.
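The PDM/PCA shape model and the brute-force matching step can be illustrated with a short sketch, again assuming Python with NumPy and OpenCV. The render_contour callable is a hypothetical placeholder for the OGRE rendering step that projects a candidate 3D hand posture to a 2D outline; the rows of training_shapes may hold either landmark coordinates (the original PDM) or joint rotation angles (the refined variant).

```python
import itertools

import cv2
import numpy as np


def train_pdm(training_shapes, n_modes=8):
    """training_shapes: (M, D) array with one flattened training shape per row
    (landmark x/y pairs in the original PDM, or joint rotation angles in the
    refined variant)."""
    mean_shape = training_shapes.mean(axis=0)
    centered = training_shapes - mean_shape
    # Principal Component Analysis via SVD of the centered training data.
    _, singular_values, vt = np.linalg.svd(centered, full_matrices=False)
    eigenvectors = vt[:n_modes].T                      # (D, n_modes) modes
    variances = singular_values[:n_modes] ** 2 / (len(training_shapes) - 1)
    return mean_shape, eigenvectors, variances


def synthesize_shape(mean_shape, eigenvectors, b):
    """Reconstruct a plausible shape from mode weights b: x = x_bar + P b."""
    return mean_shape + eigenvectors @ b


def brute_force_match(observed_contour, render_contour, param_grid):
    """Exhaustively score candidate posture parameters against the observed
    2D hand contour and return the best-matching parameter vector.

    render_contour(params) is a hypothetical stand-in for the OGRE rendering
    step: it should return the projected outline of the 3D hand model posed
    with `params`, as an OpenCV contour array."""
    best_params, best_score = None, np.inf
    for params in itertools.product(*param_grid):
        candidate = render_contour(np.asarray(params))
        score = cv2.matchShapes(observed_contour, candidate,
                                cv2.CONTOURS_MATCH_I1, 0.0)
        if score < best_score:
            best_params, best_score = np.asarray(params), score
    return best_params, best_score
```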

Keywords (English)

Hand Posture Tracking; Model-Based

