Automatic screen rotation improves the viewing experience and usability of mobile devices, but current gravity-based approaches do not support all postures, such as lying on one side, and manual rotation switches require explicit user input. Our survey of 513 users shows that 42% experience unintentional auto-rotation into an incorrect viewing orientation at least several times a week, and 24% rate the problem as very serious to extremely serious.

We present two approaches, iRotate and iGrasp, that automatically rotate mobile device screens to match the user's viewing orientation. iRotate augments the gravity-based approach: it uses the device's front camera to detect the user's face and rotates the screen accordingly. It requires no explicit user input and supports different user postures and device orientations. We have implemented an iRotate prototype that works in real time on the iPhone and iPad, and we assess its accuracy and limitations through a 20-participant feasibility study.

iGrasp automatically rotates the screen to match the user's viewing orientation based on how the device is being grasped. Our insight is that users' grasps are consistent within each orientation but differ significantly between orientations. Our prototype embeds a total of 32 light sensors along the four sides and the back of an iPod Touch and uses a support vector machine (SVM) to recognize grasps at 25 Hz. We collected usage data from 6 users under 54 different conditions: 1) grasping the device with the left, right, or both hands, 2) scrolling, zooming, or typing, 3) in portrait, landscape-left, or landscape-right orientation, while 4) sitting or lying on one side. Results show that our grasp-based approach is promising: the iGrasp prototype correctly rotated the screen 86.7-90.5% of the time when training and testing on different users.
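The core idea behind iRotate, quantizing a detected face angle to the nearest screen orientation, can be sketched as follows. This is a minimal illustration, not the thesis implementation: the face detector itself (a front-camera pipeline) is assumed to exist and to report the face's roll angle in degrees, and the function name is hypothetical.

```python
# Hypothetical sketch: snap a detected face roll angle (in degrees) to the
# nearest of the four 90-degree screen orientations. The actual iRotate
# prototype obtains the face angle from the front camera in real time.

ORIENTATIONS = ["portrait", "landscape-left", "upside-down", "landscape-right"]

def orientation_from_face_roll(roll_deg: float) -> str:
    """Quantize a face roll angle to the nearest of four orientations."""
    idx = int(round((roll_deg % 360) / 90.0)) % 4
    return ORIENTATIONS[idx]

print(orientation_from_face_roll(5))    # portrait
print(orientation_from_face_roll(85))   # landscape-left
print(orientation_from_face_roll(-92))  # landscape-right
```

Because the mapping depends only on the face angle relative to the screen, it works regardless of the device's attitude with respect to gravity, which is what lets a face-based approach handle postures such as lying on one side.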
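The iGrasp pipeline, classifying a vector of 32 light-sensor readings into a viewing orientation with an SVM, can be sketched as below. The sensor data here is synthetic (each orientation is assumed to occlude a different subset of sensors); the thesis uses real readings from an instrumented iPod Touch sampled at 25 Hz, and the helper names are invented for illustration.

```python
# Hypothetical sketch of iGrasp-style grasp classification: train an SVM on
# 32-dimensional light-sensor vectors and predict the viewing orientation.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
LABELS = ["portrait", "landscape-left", "landscape-right"]

def synthetic_grasp(label: str, n: int) -> np.ndarray:
    # Assumption for this sketch: each orientation occludes a different
    # block of the 32 sensors, which then read low light levels.
    occluded = {"portrait": (0, 10),
                "landscape-left": (10, 20),
                "landscape-right": (20, 32)}[label]
    base = np.ones(32)
    base[occluded[0]:occluded[1]] = 0.1
    return base + 0.05 * rng.standard_normal((n, 32))

# Build a small training set (40 samples per orientation) and fit the SVM.
X = np.vstack([synthetic_grasp(label, 40) for label in LABELS])
y = np.repeat(LABELS, 40)
clf = SVC(kernel="rbf").fit(X, y)

# Classify one new grasp sample, as would happen at each 25 Hz tick.
sample = synthetic_grasp("landscape-left", 1)
print(clf.predict(sample)[0])  # landscape-left
```

In the real system the classifier's output would drive the screen rotation directly, replacing or augmenting the gravity signal; the 86.7-90.5% accuracy reported in the abstract is for models trained and tested on different users, a harder setting than this toy per-run example.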