CLAPMETRICS: Decoding Users’ Gender and Age Through Smartwatch Gesture Dynamics
Attaullah Buriro; Flaminia Luccio
In press
Abstract
Smartwatches offer a unique platform for the noninvasive estimation of soft biometric attributes such as age and gender. This capability is crucial for advancing personalized healthcare, enhancing security through behavioral biometrics, and refining user interfaces for improved interaction. Although traditional biometrics, such as the face, have been widely exploited, smartwatch sensors offer a fresh avenue for biometric data collection that remains largely unexplored, presenting exciting possibilities for wearable devices. In this paper, we explore the viability of estimating gender and age from smartwatch sensor data. Specifically, we propose a method that leverages the natural arm movements produced by clapping, captured by the smartwatch's sensors, and feeds them to a Deep Neural Network to predict these personal attributes. Our experiments on clap-generated micro-movements, collected with a custom-built application, confirm the effectiveness of the proposed method: we report accuracies of 98.77% and 99.44% for gender and age estimation, respectively.
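The abstract does not specify the network architecture or input format. As a purely illustrative sketch, a small 1-D convolutional network over fixed-length windows of smartwatch motion data might look as follows; the class name ClapNet, the 6-channel input (3-axis accelerometer plus 3-axis gyroscope), the 128-sample window, and all layer sizes are assumptions for illustration, not the authors' published model.

```python
# Illustrative sketch only (NOT the paper's implementation): a minimal 1-D CNN
# that maps fixed-length windows of smartwatch sensor data to a soft-biometric
# prediction (e.g., binary gender). Channel count, window length, and all
# layer sizes are hypothetical assumptions.
import torch
import torch.nn as nn

class ClapNet(nn.Module):  # hypothetical name
    def __init__(self, channels: int = 6, classes: int = 2):
        super().__init__()
        # Two convolutional blocks extract motion features from the raw
        # sensor window (assumed: 3-axis accelerometer + 3-axis gyroscope).
        self.features = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # pool over time -> one feature vector
        )
        self.classifier = nn.Linear(64, classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, window) -> (batch, classes) logits
        return self.classifier(self.features(x).squeeze(-1))

if __name__ == "__main__":
    model = ClapNet()
    dummy = torch.randn(4, 6, 128)  # 4 synthetic clap windows, 128 samples each
    print(model(dummy).shape)       # torch.Size([4, 2])
```

Under these assumptions, age estimation would use the same pipeline with the output layer sized to the number of age groups; the paper's actual architecture and preprocessing may differ.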