  • Thesis

Attention estimation from the integrated IMU and eye-tracking perception of smart glasses

Advisor: 陳自強

Abstract


Wearable mobile learning is an inevitable trend. As wearable technology has matured, related products have sprung up on the market, yet overlapping functionality and the lack of a killer application once left wearable technology at an impasse. In addition, attention is a basic prerequisite for effective learning, and estimating and recording attention helps learners review and improve their learning process. In view of this, this study uses the eye information and IMU data provided by wearable smart glasses, together with an eye-tracking method, to construct an attention-estimation system.

This study adopts an Android-based smart-glasses platform to combine eye-movement and IMU information and realize the attention-estimation system. For eye tracking, the task is first divided into eyeball detection and eye-corner detection: the Minimum Average Gray Value method is used to locate the rough eyeball contour, and, to cope with varying lighting conditions, the Otsu algorithm is used to detect the eye-corner contour. Gaze tracking and gaze projection are then performed to obtain the trajectory of eye movement; the average error of the eye-tracking result is a viewing angle of 1.82 degrees. For attention recognition, the system is divided into data acquisition, feature extraction, feature selection, classification, and a voting mechanism. In data acquisition, attentive and inattentive sessions are recorded under seven different scenarios to obtain eye-movement and IMU data. In feature extraction, 41 features are extracted in consideration of their characteristics, and suitable feature parameters are chosen through four different feature-selection methods. Finally, a genetic algorithm (GA) is used to optimize the parameters of a support vector machine (SVM), K-fold cross-validation verifies the reliability, and a voting mechanism estimates the attention level. The highest accuracy for classifying attention versus inattention under the seven scenarios is 86.10%, and for the CPT experiment the highest accuracies of the three-class and two-class cases are 81.12% and 83.44%, respectively. Finally, a comparison with related literature demonstrates the feasibility and reliability of this study.
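The eye-tracking front end rests on two standard image-processing steps named above: a minimum-average-gray-value search for the dark pupil region and Otsu thresholding for robustness to lighting changes. Below is a minimal sketch of those two steps, assuming OpenCV and NumPy; the function names, the 40-pixel window size, and the synthetic test image are illustrative assumptions, not the thesis implementation.

```python
# Sketch of the two thresholding ideas named in the abstract (illustrative only):
# a minimum-average-gray-value search for the dark pupil region and Otsu's method
# for binarizing the eye image under varying lighting.
import cv2
import numpy as np

def darkest_window_center(eye_gray, win=40):
    """Return (x, y) of the window with the minimum average gray value,
    i.e. a coarse pupil-center estimate."""
    # Box-filter so each pixel holds the mean of its win x win neighborhood.
    means = cv2.blur(eye_gray.astype(np.float32), (win, win))
    y, x = np.unravel_index(np.argmin(means), means.shape)
    return int(x), int(y)

def otsu_binarize(eye_gray):
    """Binarize with Otsu's automatically chosen threshold (robust to lighting)."""
    _, mask = cv2.threshold(eye_gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    return mask

if __name__ == "__main__":
    # Synthetic gray eye image with a dark circular "pupil" for demonstration.
    img = np.full((240, 320), 200, dtype=np.uint8)
    cv2.circle(img, (160, 120), 25, 40, -1)
    print("pupil estimate:", darkest_window_center(img))
    print("foreground pixels:", int(np.count_nonzero(otsu_binarize(img))))
```

In practice the thesis pairs these segmentation results with gaze projection to map pupil position onto the viewing plane; the sketch only covers the segmentation step.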

Parallel Abstract (English)


Wearable mobile learning is becoming an inevitable trend. As wearable technologies mature, related applications have flooded the market, but the field still lacks a killer application that can lead and spread wearable technology. At the same time, attention is essential to effective learning, and estimating and recording attention helps learners improve and review their learning process. In light of this, an attention-estimation system is constructed in this study from the IMU information and eye-tracking perception of smart glasses.

In this study, we combine eye-tracking perception with IMU information to realize the attention-estimation system on Android-based smart glasses. First, eye tracking is divided into eyeball detection and eye-corner detection. The Minimum Average Gray Value method is used to find the rough eyeball contour, and, to handle various lighting changes, the Otsu algorithm is used to detect the contour of the eye corners. Eye movement is then obtained by determining the line of sight and its projection point; the resulting average error is a viewing angle of 1.82 degrees. For attention recognition, the system is divided into five parts: data acquisition, feature extraction, feature selection, classification, and a voting mechanism. Eye-movement and IMU data are acquired by recording attentive and inattentive sessions in seven different scenarios. In feature extraction, forty-one features are extracted according to their characteristics, and suitable features are selected with four different feature-selection methods. Finally, the parameters of the SVM are optimized with a genetic algorithm, the reliability is validated with K-fold cross-validation, and a voting mechanism estimates the attention level. The classification accuracy for attention versus inattention reaches up to 86.10% under the seven scenarios, and the three-class and two-class results of the Continuous Performance Test experiment are 81.12% and 83.44%, respectively. Finally, a comparison of our results with related papers indicates the feasibility and reliability of this work.
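The classification stage described above (GA-tuned SVM parameters, K-fold validation, majority voting over segments) can be sketched as follows, assuming scikit-learn. The population size, mutation scale, parameter ranges, and synthetic feature data are illustrative assumptions rather than the settings used in the thesis.

```python
# Sketch of GA-optimized SVM hyperparameters with K-fold cross-validation as the
# fitness, plus a majority vote over per-segment predictions (illustrative only).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def fitness(log_c, log_gamma, X, y, k=5):
    """K-fold cross-validated accuracy of an RBF SVM with the given parameters."""
    clf = SVC(C=10.0 ** log_c, gamma=10.0 ** log_gamma, kernel="rbf")
    return cross_val_score(clf, X, y, cv=k).mean()

def ga_optimize_svm(X, y, pop_size=10, generations=15):
    """Evolve (log10 C, log10 gamma) pairs: keep the best half, mutate to refill."""
    pop = rng.uniform([-1, -4], [3, 0], size=(pop_size, 2))      # search ranges
    for _ in range(generations):
        scores = np.array([fitness(c, g, X, y) for c, g in pop])
        elite = pop[np.argsort(scores)[::-1][: pop_size // 2]]   # selection
        children = elite + rng.normal(0, 0.3, size=elite.shape)  # mutation
        pop = np.vstack([elite, children])
    scores = np.array([fitness(c, g, X, y) for c, g in pop])
    best = pop[np.argmax(scores)]
    return 10.0 ** best[0], 10.0 ** best[1], scores.max()

def vote(segment_predictions):
    """Majority vote over the 0/1 labels predicted for consecutive segments."""
    return int(np.bincount(np.asarray(segment_predictions)).argmax())

if __name__ == "__main__":
    # Synthetic stand-in for the 41-dimensional eye/IMU feature vectors.
    X = rng.normal(size=(120, 41))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
    C, gamma, acc = ga_optimize_svm(X, y)
    print(f"best C={C:.3g}, gamma={gamma:.3g}, CV accuracy={acc:.3f}")
    print("voted label:", vote([1, 0, 1, 1, 0]))
```

The voting step reflects the design choice in the abstract: individual segment classifications are noisy, so fusing several of them by majority vote gives a more stable attention-level estimate.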

Parallel Keywords

Wearable, Smart Glasses, Eye-Tracking, IMU, Attention Level
