

Accuracy Assessment of Both CFB and GFB Eye Tracking Algorithms with an Artificial Eyeball

Advisor: 石勝文


Abstract


The applications of eye tracking systems are increasingly widespread, but there is relatively little discussion on how to objectively measure the accuracy of such systems. In this thesis, we construct an eye-tracking system using both Glint Feature Based (GFB) and Contour Feature Based (CFB) techniques, and use custom-made artificial eyeballs to measure its accuracy. We used 3D printing to create several artificial eyeball models, changing the color of the resin or spray-painting the surface to give the artificial eyeball a clear pupil contour. For the eye-tracking methods we used, black resin can be placed directly behind the pupil to avoid unwanted refraction and reflection, eliminating the need for a hollow interior. Additionally, we experimented with different fillings to simulate the aqueous humor, since the refractive index of the optical lens we used is 1.779. Among the materials we could work with, wax (refractive index 1.54) proved the most effective. Furthermore, since the GFB method relies on specular reflections (glints) for gaze estimation, we used a ring light source in our system, assuming its center is located at the optical center of the camera. To evaluate the system's performance when the ring light source is offset from this position, we conducted simulation experiments. We examined how gaze-tracking accuracy is affected by the Z-axis position and the radius of the ring light source, by noise added to the pupil boundary and the glints, and by noise present in the images used for camera calibration. According to our experimental data, gaze-tracking accuracy does decrease as the distance between the center of the ring light source and the optical center of the camera increases: an offset of 30 mm along the Z-axis from the optical center resulted in an error of approximately 0.25°, while the influence of the ring light source's radius was less significant.
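The rationale for the wax filling can be illustrated with Snell's law: the closer the filler's refractive index is to that of the lens (1.779), the less a ray bends at the lens–filler interface. The following is a minimal illustrative sketch, not code from the thesis; the comparison materials and the 20° incidence angle are assumptions chosen for the example:

```python
import math

def refraction_angle(n1, n2, incidence_deg):
    """Snell's law: n1*sin(t1) = n2*sin(t2). Returns the refraction
    angle in degrees, or None on total internal reflection."""
    s = n1 / n2 * math.sin(math.radians(incidence_deg))
    if abs(s) > 1.0:
        return None  # total internal reflection
    return math.degrees(math.asin(s))

N_LENS = 1.779   # refractive index of the optical lens (from the thesis)
theta_in = 20.0  # example incidence angle inside the lens (assumed)

# Hypothetical fillers: the smaller the deviation, the better the match.
for name, n_fill in [("air", 1.000), ("water", 1.333), ("wax", 1.540)]:
    t2 = refraction_angle(N_LENS, n_fill, theta_in)
    print(f"{name}: refracted to {t2:.2f} deg "
          f"(deviation {abs(t2 - theta_in):.2f} deg)")
```

Under these assumed numbers, the ray refracted into wax deviates the least from its original direction, consistent with wax being the best of the fillers the authors could handle.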
Since an error of 0.25° is far smaller than the 6.2087° error caused by 1 pixel of noise in the camera calibration images, we conclude that the systematic error caused by the light source deviating from the camera's optical center is negligible. It can therefore be inferred that measurements using a coaxial light source provide reliable gaze information. To verify the gaze estimation results with artificial eyeballs, we placed the artificial eyeballs at twelve different orientations for gaze tracking and then computed the angle between the orientation of the artificial eyeball and the estimated gaze. The average angular difference was between 2° and 5°. Although the size of the lens restricts the rotation range of the artificial eyeball, our results still demonstrate the potential of artificial eyeballs in the development of gaze-tracking technology.
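The reported 2°–5° figure is the angle between the known artificial-eyeball orientation and the estimated gaze direction, which for two 3D direction vectors is the arccosine of their normalized dot product. A minimal sketch of that computation (the example vectors below are hypothetical, not thesis data):

```python
import math

def angular_error_deg(gt, est):
    """Angle in degrees between a ground-truth direction vector and an
    estimated gaze direction vector (both 3D, not necessarily unit)."""
    dot = sum(a * b for a, b in zip(gt, est))
    norm_gt = math.sqrt(sum(a * a for a in gt))
    norm_est = math.sqrt(sum(b * b for b in est))
    c = max(-1.0, min(1.0, dot / (norm_gt * norm_est)))  # clamp for safety
    return math.degrees(math.acos(c))

# Hypothetical example: an estimate slightly off the true optical axis.
print(angular_error_deg((0.0, 0.0, 1.0), (0.05, 0.0, 1.0)))
```

Averaging this quantity over the twelve tested orientations yields the kind of accuracy figure reported above.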

