BlowPrint: Blow-Based Multi-Factor Biometrics for Smartphone User Authentication

Abstract

Biometric authentication is a widely used security mechanism that leverages unique physiological or behavioral characteristics to authenticate users. Multi-factor biometrics (MFB) integrates multiple biometric modalities, e.g., physiological and behavioral biometrics, to mitigate the limitations inherent in single-factor biometric systems. The primary research challenge in MFB lies in identifying novel behavioral techniques that meet critical criteria, including high accuracy, high usability, non-invasiveness, resilience against spoofing and other known attacks, and low computational cost. Despite ongoing advancements, current behavioral biometric techniques often fall short of one or more of these requirements.

In this work, we propose BlowPrint, a novel behavioral biometric technique that authenticates users based on how they blow on their phone. In brief, we assume that the way users blow on a phone screen produces distinctive acoustic patterns, which can serve as a unique behavioral biometric identifier for effective user identification or authentication. The acoustic characteristics of blowing, such as differences in pattern, intensity, frequency, and timing, are unique to each person, making the technique highly accurate, non-invasive, and highly resistant to spoofing and other attacks. Moreover, it can be performed concurrently with, and seamlessly integrated into, other physiological techniques such as facial recognition, thereby enhancing usability.
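To make the idea concrete, the sketch below shows one way a raw blow recording could be reduced to a one-dimensional acoustic time series (a short-time RMS energy envelope) suitable for the time-series comparison described later. The frame length, hop size, and sampling rate are illustrative assumptions; the abstract does not fix the exact acoustic features.

```python
# Minimal sketch (not the paper's exact feature extraction): turn a mono blow
# recording into a 1-D acoustic time series via a short-time RMS energy envelope.
import numpy as np


def rms_envelope(signal: np.ndarray, frame_len: int = 1024, hop: int = 512) -> np.ndarray:
    """Short-time RMS energy of a mono recording; frame/hop sizes are assumptions."""
    frames = [
        signal[start:start + frame_len]
        for start in range(0, len(signal) - frame_len + 1, hop)
    ]
    return np.array([np.sqrt(np.mean(f ** 2)) for f in frames])


# Toy usage with synthetic audio standing in for a real microphone capture.
sr = 16_000                                   # assumed sampling rate
t = np.linspace(0, 2.0, 2 * sr, endpoint=False)
blow = np.random.default_rng(1).normal(scale=np.hanning(t.size), size=t.size)
print(rms_envelope(blow).shape)
```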

To assess BlowPrint’s effectiveness, we conduct an empirical study with 50 participants, from whom we collect blow-acoustic and facial feature data in both sitting and standing postures. We then compute similarity scores for the blow-acoustic data using several time-series similarity algorithms, and extract facial features with a pretrained FaceNet-512 model. Finally, we combine the similarity scores of the two modalities through score-level fusion and measure accuracy with a machine learning-based classifier. The proposed method achieves an accuracy of 99.35% for blow acoustics, 99.96% for facial recognition, and 99.82% for the combined approach. These results demonstrate BlowPrint’s effectiveness in terms of authentication accuracy, spoofing resilience, usability, and non-invasiveness.
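As a rough illustration of this pipeline, the sketch below combines a DTW-based blow-acoustic similarity, a cosine similarity over FaceNet-512 embeddings, weighted-sum score-level fusion, and a logistic-regression decision step. The specific similarity measure, fusion weights, and classifier are assumptions made for illustration, not the paper’s exact configuration.

```python
# Hedged sketch of a score-level fusion pipeline; all choices below are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression


def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Dynamic-time-warping distance between two 1-D blow-acoustic time series."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return float(cost[n, m])


def blow_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Map DTW distance to a similarity score in (0, 1]."""
    return 1.0 / (1.0 + dtw_distance(a, b))


def face_similarity(emb_a: np.ndarray, emb_b: np.ndarray) -> float:
    """Cosine similarity between two FaceNet-512 embeddings."""
    return float(np.dot(emb_a, emb_b) / (np.linalg.norm(emb_a) * np.linalg.norm(emb_b)))


def fused_score(s_blow: float, s_face: float, w_blow: float = 0.5) -> float:
    """Weighted-sum score-level fusion; the equal weighting is an assumption."""
    return w_blow * s_blow + (1.0 - w_blow) * s_face


# Toy end-to-end example with placeholder data standing in for real measurements.
rng = np.random.default_rng(0)
s_blow = blow_similarity(rng.random(40), rng.random(40))    # enrolled vs. probe blow series
s_face = face_similarity(rng.random(512), rng.random(512))  # enrolled vs. probe face embedding
fused = fused_score(s_blow, s_face)

# A simple classifier over fused scores of genuine/impostor pairs (toy labels).
X = rng.uniform(0.0, 1.0, size=(200, 1))
y = (X[:, 0] > 0.6).astype(int)
clf = LogisticRegression().fit(X, y)
print("accept" if clf.predict([[fused]])[0] == 1 else "reject")
```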

Publication
Accepted at The 30th European Symposium on Research in Computer Security (ESORICS’25), Springer