  • Degree thesis

以FPGA實現單一像素感測器之即時人機介面

Real-Time Single Image Sensor Man-Machine Interface Implementation with FPGA

Advisors: 陳耀煌, 陳朝烈

Abstract


Most man-machine interface systems use software as the core of their image-processing development. Software makes it easy to implement a wide variety of image-processing algorithms, but it depends on high-end hardware (such as a high-clock-rate CPU and large amounts of RAM). On non-PC devices, and especially in embedded or portable products, this inevitably increases power consumption and hardware cost. This thesis therefore proposes a real-time buffering method for image processing with a single camera input, and implements a prototype of this image-processing integrated circuit on an FPGA, yielding a man-machine interaction interface with real-time tracking, low memory requirements, and no need for wearable devices; we also present a verification environment and methodology for the system. The system comprises an image-gradient computation and binarization unit, an object-identification unit, a motion-detection unit, a motion-trajectory computation unit, a limb-movement judgment unit, and a signal-transmission unit. The whole system can be verified on a single FPGA (Xilinx Spartan-3 XC3S1000). The system clock reaches 43 MHz, the recognition speed supports image input up to 140 fps at 640×480, the resolution reaches 4.267 pixels/cm, the chip area is about 440K gate count with 24.75 KB of Block RAM, and the system performs simultaneous real-time multi-object tracking of both hands and the head.
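The first pipeline stage named above, gradient computation and binarization, can be illustrated with a minimal software sketch. This is an assumption for illustration only: the thesis does not specify its exact operator here, so the sketch uses simple pixel differences and the hardware-friendly |gx| + |gy| magnitude approximation, which avoids multipliers and square roots.

```python
# Minimal sketch of a gradient + binarization stage (illustrative only;
# the operator and threshold are assumptions, not the thesis's circuit).

def gradient_binarize(frame, threshold):
    """frame: 2-D list of 8-bit grayscale values -> 2-D list of 0/1."""
    h, w = len(frame), len(frame[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            gx = frame[y][x + 1] - frame[y][x]   # horizontal difference
            gy = frame[y + 1][x] - frame[y][x]   # vertical difference
            mag = abs(gx) + abs(gy)              # cheap gradient magnitude
            out[y][x] = 1 if mag > threshold else 0
    return out

# A vertical edge between dark (0) and bright (200) columns is marked:
frame = [[0, 0, 200, 200]] * 4
print(gradient_binarize(frame, 50))
```

In hardware, a streaming version of this stage needs only a few line buffers rather than a full frame store, which is consistent with the low-memory, real-time buffering approach described above.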

Parallel Abstract


Common imaging-based man-machine interactions are mostly realized with flexible software solutions, which still require high-end hardware components such as a high-performance CPU and large amounts of memory. For embedded products with limited CPU power and memory, such as portable devices, low-cost LSI chip solutions are more desirable. We propose an FPGA implementation of a man-machine interaction system that uses a single image sensor and real-time buffering image-processing techniques. We achieve real-time body-movement tracking and identification without large memory requirements and without the need for wearable devices. We also propose a method and environment setup for verifying the interaction system. The whole system is implemented on a Xilinx Spartan-3 FPGA (XC3S1000). Under a 43 MHz clock, the processing speed is 140 fps at 640×480. Using a single 640×480 NTSC sensor, the maximum resolution of body motion is 4.267 pixels/cm. The gate count and memory requirement of the whole system, including a USB 1.1 interface, are 440K gates and 24.75 KB respectively, and the system achieves simultaneous multi-object tracking of two hands and the face.
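The quoted throughput and resolution figures are mutually consistent under one plausible reading (an inference, not stated in the abstract): a streaming pipeline that consumes one pixel per clock cycle at 43 MHz yields almost exactly 140 frames per second at 640×480, and 4.267 pixels/cm implies a horizontal field of view of roughly 1.5 m.

```python
# Consistency check, assuming one pixel processed per clock cycle
# (a common FPGA streaming design; the abstract does not state this).
clock_hz = 43_000_000
pixels_per_frame = 640 * 480          # 307,200 pixels
fps = clock_hz / pixels_per_frame
print(round(fps))                     # → 140

# 4.267 pixels/cm across a 640-pixel-wide frame implies the horizontal
# field of view at the working distance:
field_width_cm = 640 / 4.267
print(round(field_width_cm))          # → 150
```

The near-exact match between 43 MHz / 307,200 and the stated 140 fps suggests the clock was chosen to saturate the sensor's pixel rate.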
