Detecting users handedness for ergonomic adaptation of mobile user interfaces

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Often, we operate mobile devices using only one hand. The hand then serves two purposes: holding the device and operating the touch screen with the thumb. The current trend towards larger screens, however, makes it close to impossible for users with average hand sizes to reach all parts of the screen (especially the top area). One solution is to offer adaptive user interfaces for such one-handed interactions. However, these modes have to be triggered manually and thus induce a critical overhead. Furthermore, they are designed to bring all content closer regardless of whether the phone is operated with the left or the right hand. In this paper, we present an algorithm that determines the user's interacting hand from their unlocking behavior. Our algorithm correctly distinguishes one- and two-handed usage as well as left- and right-handed unlocking in 98.51% of all cases. This is achieved through a k-nearest-neighbor comparison of the smartphone's internal sensor readings during the unlocking process.
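The classification approach described in the abstract can be sketched in a few lines. The following is an illustrative k-nearest-neighbor classifier only: the feature vectors, labels, and k value are invented for the example and do not reflect the paper's actual sensor features or parameters.

```python
# Minimal k-nearest-neighbor sketch of the handedness-detection idea:
# each unlock gesture is summarized as a fixed-length vector of sensor
# statistics and labeled by majority vote among its k nearest labeled
# training gestures. All values below are illustrative, not from the paper.
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """Return the majority label among the k training vectors
    closest to the query vector (Euclidean distance)."""
    dists = sorted((math.dist(vec, query), label) for vec, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical training set: (sensor feature vector, handedness label)
train = [
    ((0.9, -0.2, 0.1), "right-one-handed"),
    ((0.8, -0.1, 0.2), "right-one-handed"),
    ((-0.9, 0.3, 0.1), "left-one-handed"),
    ((-0.8, 0.2, 0.2), "left-one-handed"),
    ((0.0, 0.0, 0.9), "two-handed"),
    ((0.1, -0.1, 0.8), "two-handed"),
]

print(knn_classify(train, (0.85, -0.15, 0.15)))  # → right-one-handed
```

In practice the feature vector would be built from the phone's accelerometer and gyroscope readings recorded during the unlock gesture; the paper reports that this comparison distinguishes the four usage modes in 98.51% of cases.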

Original language: English
Title of host publication: MUM '15: Proceedings of the 14th International Conference on Mobile and Ubiquitous Multimedia
Number of pages: 5
Publisher: Association for Computing Machinery
Publication date: 2015
ISBN (Print): 978-1-4503-3605-5
Publication status: Published - 2015
Event: 14th International Conference on Mobile and Ubiquitous Multimedia, MUM 2015 - Linz, Austria
Duration: 30 Nov 2015 - 2 Dec 2015


Conference: 14th International Conference on Mobile and Ubiquitous Multimedia, MUM 2015
Sponsor: ACM Special Interest Group on Computer-Human Interaction (SIGCHI), Association for Computing Machinery, Johannes Kepler Universität Linz, University of Applied Sciences Upper Austria (FH OOE)

Research areas:

• Adaptive interfaces, Ergonomics, Handedness, Sensor fusion, Unlocking
