EyeMU Interactions: Gaze + IMU Gestures on Mobile Devices
As smartphone screens have grown in size, single-handed use has become more cumbersome. Interactive targets that are easily seen can be hard to reach, particularly notifications and upper menu bar items. Users must either adjust their grip to reach distant targets or use their other hand. In this research, we show how gaze estimation using a phone's user-facing camera can be paired with IMU-tracked motion gestures to enable a new, intuitive, and rapid interaction technique on handheld phones. We describe our proof-of-concept implementation and gesture set, built on state-of-the-art techniques and capable of self-contained execution on a smartphone. In our user study, we found a mean Euclidean gaze error of 1.7 cm and a seven-class motion gesture classification accuracy of 97.3%.

Citation: Andy Kong, Karan Ahuja, Mayank Goel, and Chris Harrison. 2021. EyeMU Interactions: Gaze + IMU Gestures on Mobile Devices. In Proceedings of the 2021 International Conference on Multimodal Interaction (ICMI '21). Association for Computing Machinery, New York, NY, USA.
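To make the interaction concrete, the following is a minimal Python sketch of how a gaze point and an IMU gesture classification might be fused: the estimated gaze selects the on-screen target, and a recognized motion gesture triggers an action on it. The seven-gesture label set, window length, statistical features, and nearest-centroid classifier here are all illustrative assumptions for the sketch, not the paper's actual gaze network or gesture model.

    # Hypothetical sketch only; names and parameters are assumed, not from the paper.
    import numpy as np

    GESTURES = ["none", "flick_left", "flick_right", "flick_up",
                "flick_down", "pull", "push"]   # assumed 7-class label set
    WINDOW = 50                                 # assumed ~0.5 s buffer at 100 Hz

    def imu_features(window: np.ndarray) -> np.ndarray:
        # Summary statistics over a (WINDOW, 6) accelerometer+gyroscope buffer.
        return np.concatenate([window.mean(0), window.std(0),
                               window.min(0), window.max(0)])  # 24-dim vector

    class NearestCentroidGestures:
        # Toy stand-in for a trained gesture classifier.
        def __init__(self, centroids: np.ndarray):
            self.centroids = centroids          # shape (7, 24), one row per class

        def predict(self, window: np.ndarray) -> str:
            f = imu_features(window)
            return GESTURES[int(np.argmin(
                np.linalg.norm(self.centroids - f, axis=1)))]

    def dispatch(gaze_xy, targets, gesture):
        # Fire the gesture's action on whichever target the gaze point falls in.
        for name, (x0, y0, x1, y1) in targets.items():
            if x0 <= gaze_xy[0] <= x1 and y0 <= gaze_xy[1] <= y1 \
                    and gesture != "none":
                return f"{gesture} -> {name}"
        return None

    # Demo with random (untrained) centroids and a fake IMU buffer.
    rng = np.random.default_rng(0)
    clf = NearestCentroidGestures(rng.normal(size=(7, 24)))
    win = rng.normal(size=(WINDOW, 6))
    print(dispatch((2.0, 1.0), {"notification": (0, 0, 4, 2)}, clf.predict(win)))

In a real deployment, the classifier would be trained on labeled IMU windows and the gaze coordinates would come from a camera-based gaze estimator; the fusion logic (gaze selects, gesture acts) is the part this sketch is meant to illustrate.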