
Wearable Wisdom: A Bi-modal Behavioral Biometric Scheme for Smartwatch User Authentication

Attaullah Buriro; Flaminia Luccio
2024-01-01

Abstract

Multi-modal biometric systems, which leverage more than one sensor, have been shown to be superior to uni-modal systems in both accuracy and robustness against attacks. In this paper, we present a novel behavioral multi-biometric user authentication framework that employs data fusion for biometric-based authentication. The proposed scheme extracts two types of data, Electromyography (EMG) and movement, from a single clapping action performed by the user, via a worn smartwatch. During the user enrollment phase, the scheme creates a digital identity of the user based on the arm-movement and EMG signatures generated over the entire clapping action. During the verification phase, the scheme authenticates the user based on the detected combined EMG and movement signature. The experimental analysis of the proposed framework, employing a Deep Neural Network classifier, demonstrates its efficacy: our classifier achieves a True Accept Rate (TAR) of 94.37%, a False Accept Rate (FAR) of just 0.11%, and an accuracy of 97.13%. Furthermore, we experimentally demonstrate that augmenting the training dataset via Generative Adversarial Networks leads to even better performance, with an improved TAR (up to ≈96%) and accuracy (up to 97.94%) and a reduced FAR (from 0.11% to 0.082%). These results showcase the effectiveness and robustness of the proposed bi-modal behavioral biometric scheme for smartwatch user authentication.
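The fusion-and-verification pipeline described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: all dimensions (EMG channels, motion axes, hidden-layer size), the toy statistical features, the randomly initialised weights, and the 0.5 decision threshold are assumptions chosen for the example. It shows feature-level fusion, where per-modality feature vectors from one clapping episode are concatenated and scored by a small feed-forward network.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(signal):
    """Toy per-channel statistics (mean, std, min, max) as a feature vector."""
    return np.concatenate([signal.mean(axis=0), signal.std(axis=0),
                           signal.min(axis=0), signal.max(axis=0)])

def fuse(emg_feat, motion_feat):
    """Feature-level (early) fusion: concatenate the two modality vectors."""
    return np.concatenate([emg_feat, motion_feat])

def mlp_score(x, w1, b1, w2, b2):
    """Tiny feed-forward network: one ReLU hidden layer, sigmoid match score."""
    h = np.maximum(0.0, x @ w1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))

# Simulated raw windows from one clap: 200 samples,
# 8 EMG channels and 6 motion axes (accelerometer + gyroscope) — assumed sizes.
emg = rng.standard_normal((200, 8))
motion = rng.standard_normal((200, 6))

x = fuse(extract_features(emg), extract_features(motion))  # 32 + 24 = 56 dims

# Randomly initialised weights stand in for the enrolled user's trained model.
w1 = rng.standard_normal((x.size, 16)) * 0.1
b1 = np.zeros(16)
w2 = rng.standard_normal(16) * 0.1
b2 = 0.0

score = mlp_score(x, w1, b1, w2, b2)
accept = score >= 0.5  # verification decision against a fixed threshold
print(f"match score = {score:.3f}, accept = {accept}")
```

In the paper's terms, enrollment would train the network on the user's clapping samples, and verification would compare a fresh sample's score against the threshold; fusing both modalities before classification is what lets the single clapping gesture yield two complementary signatures.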

Documents in ARCA are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10278/5057505