Biblio

Belman, Amith K., Paul, Tirthankar, Wang, Li, Iyengar, S. S., Śniatała, Paweł, Jin, Zhanpeng, Phoha, Vir V., Vainio, Seppo, Röning, Juha.  2020.  Authentication by Mapping Keystrokes to Music: The Melody of Typing. 2020 International Conference on Artificial Intelligence and Signal Processing (AISP). :1–6.
Expressing Keystroke Dynamics (KD) in the form of sound opens new avenues for applying sound-analysis techniques to KD. However, this mapping is not straightforward: the varied feature space, differences in feature magnitudes, and the need for human interpretability of the resulting music all introduce complexities. We present a musical interface to KD by mapping keystroke features to music features. Musical elements such as melody, harmony, rhythm, pitch, and tempo are varied according to the magnitude of their corresponding keystroke features. A pitch-embedding technique makes the music discernible among users. Data from 30 users, who typed fixed strings multiple times on a desktop, show that these auditory signals are distinguishable between users both by standard classifiers (SVM, Random Forests, and Naive Bayes) and by human listeners.
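
As a rough illustration of the kind of keystroke-to-music mapping the abstract describes, the Python sketch below converts keystroke timing features into a note sequence. The feature names, the pitch formula, and the per-user offset (standing in for the pitch-embedding idea) are illustrative assumptions, not the authors' actual mapping.

    # Minimal sketch (assumptions, not the paper's exact mapping): turn keystroke
    # timing features into a short note sequence, with a per-user pitch offset
    # standing in for the pitch-embedding idea mentioned in the abstract.
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class Keystroke:
        key: str          # the character typed
        dwell_ms: float   # how long the key was held down
        flight_ms: float  # gap since the previous key-up (0 for the first key)

    def keystrokes_to_notes(strokes: List[Keystroke],
                            user_pitch_offset: int = 0,
                            base_pitch: int = 60) -> List[Tuple[int, float, float]]:
        """Return (midi_pitch, duration_s, onset_gap_s) triples.

        Pitch varies with the key identity plus a per-user offset; note duration
        follows dwell time; the gap between notes follows flight time, so rhythm
        and tempo track typing speed.
        """
        notes = []
        for s in strokes:
            pitch = base_pitch + (ord(s.key) % 12) + user_pitch_offset
            duration = max(0.05, s.dwell_ms / 1000.0)
            gap = max(0.0, s.flight_ms / 1000.0)
            notes.append((pitch, duration, gap))
        return notes

    # Example: the same text typed with two different user offsets yields
    # different melodies over the same rhythm.
    sample = [Keystroke("p", 95, 0), Keystroke("a", 80, 120),
              Keystroke("s", 110, 90), Keystroke("s", 70, 150)]
    print(keystrokes_to_notes(sample, user_pitch_offset=0))
    print(keystrokes_to_notes(sample, user_pitch_offset=5))

In this sketch the per-user offset plays the role of the pitch embedding: it shifts the melody into a user-specific register, which is what would let listeners or classifiers tell users apart from the resulting audio.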