Peter Aaby
Advancing touch-based continuous authentication by automatically extracting user behaviours
Abstract
Smartphones provide convenient access to online banking, social media, photos, and entertainment, all of which have become integral to our daily lives. However, their mobile nature also opens new avenues for unauthorised access to user data. To combat this, device manufacturers typically provide a screen lock mechanism. Yet traditional lock screen authentication can be inconvenient and only offers protection at the point of user verification. This thesis therefore demonstrates the potential of touch-based behavioural biometrics for continuous authentication, which could significantly enhance smartphone security. The behaviours studied here are modelled exclusively from smartphone touchscreen inputs obtained from publicly available data, and the results show promising effectiveness in addressing these security concerns.
The initial objective was to assess, through feature selection, which behaviours are universally exhibited by users and which are unique to individuals, in order to improve the performance of the 60 different continuous authentication models being tested. Of the 30 features evaluated across five feature selection methods, features related to pressure appeared in 81% of models, demonstrating their importance for most users; removing non-important features did not degrade performance. The subsequent research sought to model users independently of their directional navigation and instead rely more on these essential features.
In this field, several models are typically produced for each user, one per gesture direction, when testing authentication methods. However, since features describe behaviour, this thesis demonstrates that a single omni-directional model can be employed instead, while prioritising less complex hyperparameters. Results show that evaluating 35 users with an omni-directional model can achieve an AUC score of 89% and an EER of 17.9% when authenticating on five gestures, outperforming more complex bi-directional techniques. Furthermore, the omni-directional model performs better with the oldest published feature set than with more recent efforts in engineering new features.
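The AUC and EER figures above summarise how well genuine-user scores separate from impostor scores. As a minimal illustration of how an EER could be computed (the thesis's own evaluation pipeline is not described in this abstract, and the scores below are made up), one can sweep a decision threshold over the scores and find the point where the false acceptance rate (FAR) and false rejection rate (FRR) meet:

```python
# Hypothetical sketch of EER computation; scores are illustrative only.

def roc_points(genuine, impostor):
    """Sweep thresholds over all observed scores; return (FAR, FRR) pairs."""
    thresholds = sorted(set(genuine) | set(impostor))
    points = []
    for t in thresholds:
        far = sum(s >= t for s in impostor) / len(impostor)  # impostors accepted
        frr = sum(s < t for s in genuine) / len(genuine)     # genuine users rejected
        points.append((far, frr))
    return points

def eer(genuine, impostor):
    """Equal error rate: the error level where FAR and FRR are closest."""
    return min((abs(far - frr), (far + frr) / 2)
               for far, frr in roc_points(genuine, impostor))[1]

# Fully separated example scores, so the EER is zero:
genuine = [0.9, 0.8, 0.85, 0.7, 0.95]
impostor = [0.3, 0.5, 0.4, 0.6, 0.2]
print(eer(genuine, impostor))  # 0.0
```

A lower EER means genuine and impostor score distributions overlap less; the 17.9% EER reported above corresponds to the threshold at which 17.9% of impostor attempts are accepted and 17.9% of genuine attempts are rejected.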
When considering the necessary prioritisation of features unique to individuals, it becomes apparent that engineering and evaluating these manually is infeasible at scale. To address this, this thesis proposes a move towards automatic feature extraction through the novel TouchEncoding method, which transforms touch behaviour into image encodings that enable computer vision models to authenticate users. The results of this approach were superior to those of all related work, with an AUC score of 96.7% and an EER of 8.5% across 74 users when authenticating on a single gesture. Performance further improved to 99.1% AUC and 3.6% EER when authenticating using five gestures. This underscores the effectiveness of the TouchEncoding method and paves the way for future work in this area.
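The core idea of transforming touch behaviour into an image can be sketched as follows. Note that the actual TouchEncoding design is described in the thesis itself, not in this abstract; the grid resolution, the (x, y, pressure) sample format, and the use of pressure as pixel intensity here are all assumptions made purely for illustration:

```python
# Illustrative sketch only -- not the thesis's actual TouchEncoding method.
# A gesture is assumed to be a list of (x, y, pressure) samples normalised
# to [0, 1]; it is rasterised onto a small grid with pressure as pixel
# intensity, producing the kind of image a vision model could consume.

GRID = 8  # assumed encoding resolution

def encode_gesture(samples, grid=GRID):
    """Map (x, y, pressure) samples onto a grid x grid intensity image."""
    image = [[0.0] * grid for _ in range(grid)]
    for x, y, pressure in samples:
        col = min(int(x * grid), grid - 1)
        row = min(int(y * grid), grid - 1)
        image[row][col] = max(image[row][col], pressure)  # keep peak pressure
    return image

# A short diagonal swipe with rising pressure (made-up data):
swipe = [(0.1, 0.1, 0.2), (0.4, 0.5, 0.6), (0.9, 0.9, 0.8)]
img = encode_gesture(swipe)
```

The appeal of such an encoding is that it sidesteps manual feature engineering: once touch behaviour is an image, a standard convolutional network can learn discriminative patterns directly, which is consistent with the abstract's stated motivation for automatic feature extraction.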
Citation
Aaby, P. (2024). Advancing touch-based continuous authentication by automatically extracting user behaviours. (Thesis). Edinburgh Napier University.
Thesis Type | Thesis |
---|---|
Deposit Date | Aug 21, 2024 |
Publicly Available Date | Aug 21, 2024 |
DOI | https://doi.org/10.17869/enu.2024.3787401 |
Award Date | Jul 5, 2024 |
Files
Advancing touch-based continuous authentication by automatically extracting user behaviours (PDF, 5 MB)