
A Study of Rapid Input-Mode Switching Based on Different Finger Pad Areas

TouchSense: Expanding Touchscreen Vocabulary using Different Areas of Users’ Finger Pads

Advisor: 陳彥仰

Abstract


TouchSense is a technique that enables users to switch input modes quickly on smart wearable devices. Consider the smart watch: it is still operated through an ordinary touchscreen, yet because wearables are designed to be small, the interaction surface they offer is correspondingly limited, and conventional input methods fit this context poorly (for example, the small screen makes it difficult to perform a two-finger pinch to zoom a map). The underlying reason is that with a conventional interface, each finger tap conveys only a binary signal (touching or not touching), which severely constrains the interaction.

TouchSense proposes a new idea: we assign different functions to different areas of the user's index finger, so that touching the screen with a different part of the finger invokes a different input mode, breaking out of the current touchscreen interaction framework. We designed two user studies to determine how well users can distinguish different areas of their finger and to derive design guidelines for this kind of interaction. We then mounted two inertial measurement units (IMUs), one on the user's finger and one on the smart watch, and used an SVM classifier to recognize the finger's pose relative to the touchscreen, thereby inferring in real time which area of the finger the user intends to touch the screen with. Finally, we conducted several experiments comparing TouchSense with several mode-switching methods commonly used on smart watches; the results show that TouchSense gives users better mode-switching performance.
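As a rough illustration of the recognition step described above, the sketch below shows how per-touch orientation features derived from the two IMUs might be fed to an SVM classifier. The feature layout, placeholder data, kernel, and function names are assumptions for illustration only, not the thesis's actual pipeline.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Placeholder training data: relative orientation (roll, pitch, yaw) of the
    # finger-worn IMU with respect to the watch-worn IMU at touch-down, plus a
    # label for which finger area made contact (e.g. 0=pad, 1=tip, 2=right side).
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(300, 3))       # stand-in for recorded features
    y_train = rng.integers(0, 3, size=300)    # stand-in for annotated touch areas

    # RBF-kernel SVM; the thesis states only that an SVM is used, so the kernel
    # and hyperparameters here are illustrative defaults.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X_train, y_train)

    def classify_touch(relative_orientation):
        """Predict which finger area produced a touch-down, given (roll, pitch, yaw)."""
        features = np.asarray(relative_orientation, dtype=float).reshape(1, -1)
        return int(clf.predict(features)[0])

    print(classify_touch([0.1, -0.4, 0.8]))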

English Abstract


We present TouchSense, which provides additional touchscreen input vocabulary by distinguishing the areas of users' finger pads contacting the touchscreen. It requires minimal touch input area and minimal movement, making it especially ideal for wearable devices such as smart watches and smart glasses. For example, users of a calculator application on a smart watch could tap normally to enter numbers, and tap with the right side of their fingers to enter the operators (e.g., +, -, =). Results from two human-factor studies showed that users could tap a touchscreen with five or more distinct areas of their finger pads. Also, they were able to tap with more distinct areas closer to their fingertips. As a proof of concept, we developed a TouchSense smart watch prototype using inertial measurement sensors, and three example applications: a calculator, a map viewer, and a text editor. In a follow-up study, we further reported user performance and user feedback on the TouchSense applications.
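The calculator example above can be pictured as each on-screen key carrying two meanings, with the classified touch type selecting which one is entered. The sketch below is a minimal, hypothetical illustration of that dispatch; the key layout, labels, and function names are assumptions, not the prototype's actual interface.

    # Assumed labels produced by the touch classifier.
    NORMAL_PAD = "pad"
    RIGHT_SIDE = "right"

    # Each on-screen key maps a touch type to the character it enters.
    KEYS = {
        "k1": {NORMAL_PAD: "1", RIGHT_SIDE: "+"},
        "k2": {NORMAL_PAD: "2", RIGHT_SIDE: "-"},
        "k3": {NORMAL_PAD: "3", RIGHT_SIDE: "="},
    }

    def on_tap(key_id: str, touch_type: str, expression: str) -> str:
        """Append the digit or operator for key_id, depending on how it was touched."""
        return expression + KEYS[key_id][touch_type]

    # Example: "1" tapped normally, then "+" entered by tapping the same key
    # with the right side of the finger.
    expr = on_tap("k1", NORMAL_PAD, "")
    expr = on_tap("k1", RIGHT_SIDE, expr)   # -> "1+"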

