In this study, we innovatively use an eye tracker and a Kinect together as an input device for human-computer interaction. Speech recognition first launches the GUI program; eye-tracking technology then maps the user's gaze to the mouse cursor, while the Kinect recognizes lip motion to perform mouse clicks. These integrated conveniences let users operate the computer more easily and serve as a bridge for communicating with others. In the experimental application, the user moves the cursor to a target point by gaze, and the click function operates in two modes: in the first mode, a blink lasting more than 30 frames triggers a click; in the second, opening the mouth triggers a click. By combining eye movement and lip motion, we aim to give users more diverse input-control options when operating human-computer interaction applications.
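The two click modes described above can be sketched as a simple per-frame decision loop. This is a minimal illustration only, with assumed names (`Frame`, `detect_clicks`, `BLINK_FRAMES`) and simulated frame data; a real system would read gaze and face data from the eye tracker and Kinect SDKs rather than from a list.

```python
from dataclasses import dataclass

# Assumed threshold: a click fires once eye closure exceeds 30 frames
# (our reading of the "more than 30 frames" blink criterion).
BLINK_FRAMES = 30

@dataclass
class Frame:
    """One simulated sensor frame (hypothetical structure)."""
    gaze_x: float      # gaze point, used as the cursor position
    gaze_y: float
    eyes_closed: bool  # from the eye tracker
    mouth_open: bool   # lip shape from the Kinect

def detect_clicks(frames, mode="blink"):
    """Map gaze to cursor positions and report frames where a click fires.

    mode="blink": click when eyes stay closed longer than BLINK_FRAMES.
    mode="mouth": click whenever the mouth is detected open.
    """
    cursor, clicks = [], []
    closed_run = 0
    for i, f in enumerate(frames):
        cursor.append((f.gaze_x, f.gaze_y))  # gaze drives the cursor
        if mode == "blink":
            closed_run = closed_run + 1 if f.eyes_closed else 0
            if closed_run == BLINK_FRAMES + 1:  # just exceeded the threshold
                clicks.append(i)
        elif mode == "mouth":
            if f.mouth_open:
                clicks.append(i)
    return cursor, clicks
```

Keeping the two modes behind one function makes it easy to let the user switch selection styles at runtime, which is the kind of flexible input control the application aims for.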