
CLAPMETRICS: Decoding Users’ Gender and Age Through Smartwatch Gesture Dynamics

Attaullah Buriro; Flaminia Luccio
In press

Abstract

Smartwatches offer a unique platform for the noninvasive estimation of soft biometric attributes such as age and gender. This capability is valuable for advancing personalized healthcare, strengthening security through behavioral biometrics, and refining user interfaces. Although traditional biometrics, such as the face, are well established, smartwatch sensors offer an avenue for biometric data collection that remains largely unexplored, presenting promising possibilities for wearable devices. In this paper, we examine the viability of estimating gender and age from smartwatch sensor data. Specifically, we propose a method that leverages the natural arm movements produced by clapping, captured by the smartwatch's sensors, and combines them with a Deep Neural Network to predict these personal attributes. Experiments on clap-generated micro-movements, collected using our custom application, confirm the effectiveness of the proposed method: we report accuracies of 98.77% for gender estimation and 99.44% for age estimation.
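The abstract describes feeding clap-induced arm-movement signals from smartwatch sensors into a Deep Neural Network classifier. As an illustrative sketch only (the paper's actual architecture, window length, and features are not given here), the pipeline can be pictured as a small multilayer perceptron over a flattened window of 3-axis accelerometer samples; all layer sizes and weights below are hypothetical placeholders, with random weights standing in for a trained model.

```python
import numpy as np

WINDOW = 128   # samples per clap window (assumed, not from the paper)
AXES = 3       # accelerometer x, y, z
HIDDEN = 64    # hidden layer width (illustrative)

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Randomly initialised weights stand in for a trained network.
W1 = rng.normal(0.0, 0.1, (WINDOW * AXES, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0.0, 0.1, (HIDDEN, 2))  # 2 output classes, e.g. gender
b2 = np.zeros(2)

def predict(window: np.ndarray) -> int:
    """Flatten a (WINDOW, AXES) sensor window and run the MLP forward pass."""
    x = window.reshape(-1)
    h = relu(x @ W1 + b1)
    p = softmax(h @ W2 + b2)
    return int(p.argmax())

# Example: classify one synthetic clap window.
clap = rng.normal(0.0, 1.0, (WINDOW, AXES))
label = predict(clap)  # 0 or 1
```

In practice the same forward pass with a 4-way (or similar) output layer would handle age-group estimation; the two tasks differ only in the label space.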
Proceedings of the 19th International Conference on Emerging Technologies (ICET 2024)
Files in this record:
No files are associated with this record.

Documents in ARCA are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this record: https://hdl.handle.net/10278/5081021