Touch dynamics for affective states recognition: your smartphone knows how you feel since you unlock it

Fabrizio Balducci, Berardina De Carolis, Donato Impedovo, Giuseppe Pirlo
Computer Science dept., University of Bari "A. Moro", Bari, Italy
fabrizio.balducci@uniba.it, berardina.decarolis@uniba.it, donato.impedovo@uniba.it, giuseppe.pirlo@uniba.it

Abstract—Touch dynamics is the behavioral biometric trait that regards how the user interacts with devices equipped with touch displays: the dynamic patterns drawn through swipe movements can be used to identify the user who is accessing the smartphone. In this paper we investigate whether the same data can also be used to recognize some emotional states. To this aim, an Android app was designed to simulate unlock patterns and to collect the data needed to calculate numeric features, which have been used not only for identification purposes but also to classify three negative affective states: anxiety, stress and depression. Results obtained so far are encouraging and indicate that Random Forest is capable of reaching good classification accuracy both on touch numerical features and on negative emotional states, also exploiting behavioral information such as the hand and the finger used in the execution.

Keywords—touch dynamics, swipe features, affective classification, negative emotions, machine learning.

Copyright © 2019 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).

I. INTRODUCTION

Security systems use biometric traits to establish the identity of a person based on characteristic features which are difficult to counterfeit and cannot be lost or forgotten. Biometric traits can be physiological, when a direct measure of a human body part can be performed (e.g. iris, fingerprint), or behavioral, when an action performed by the user is measured (e.g. handwritten signature, gait).
Behavioral biometrics also involve a cognitive aspect, because the actions performed are learned over time and can change depending on environmental, psycho-emotional and physical conditions [1]. From this perspective, it has been demonstrated that the emotional state influences human movements and actions such as, for example, speech [2], facial expressions [3], body language [4] and writing [5].

The biometric trait considered in this work is touch dynamics, the behavioral trait related to how the user interacts with the touch screen of a device (e.g. smartphone, tablet). Exploiting touch dynamics in this context means investigating features like the pressure applied on the screen, the finger-display contact area, the speed of swipes, the variation of the device sensors (e.g. accelerometer), and so on.

One of the advantages of exploiting this biometric trait is total transparency, since the user does not need to use unfamiliar devices, wear sensors, or interact differently from his habits: the data are recorded and analyzed automatically in a natural way. As a side effect of the identification/verification task, due to the intrinsic capabilities of a behavioral biometric, the emotional states of the user can also be revealed. In this work, negative emotional states (anxiety, depression and stress) have been considered and collected by means of a specific questionnaire. Results obtained so far indicate that Random Forest is capable of reaching good classification accuracy both on touch numerical features and on negative emotional states, also exploiting behavioral information such as the hand and the finger used.

The work is organized as follows: Section II presents related work; Section III introduces the negative affective states; Section IV describes the smartphone app; Section V depicts the dataset and the numeric features; Section VI presents the experimental phase and, finally, Section VII contains conclusions and future work directions.

II. RELATED WORK

An important part of the studies on user behavior on devices concerns password entry and how users approach it. Draffin et al. [6] asked 20 users to type their password on a specifically devoted keyboard, observing that micro-behavior features can identify a non-authorized user within 5 keypresses 67.7% of the time. In [7], 85 users had to enter two numeric PINs (of 4 and 8 digits) holding the phone with the left hand and interacting with the right index finger, reaching a verification equal error rate under 3.65%. Further studies focused on more complex tasks: in [8] users were required to insert the phrase 'the quick brown fox jumped over the lazy ghost.' in addition to the common password, reaching a minimum identification EER of 12.5%, while Meng et al. [9] provided 20 users with a phone running modified Android software that recorded all user touches, authenticating different users with an average error rate of about 7.8%. Syed et al. [10] simulated a common everyday interaction task by asking users to search for something in the smartphone (e.g. a specific image within a list of other images), gathering information about the interaction modes of users with three different devices (a 4.8-inch display phone and two tablets of 7 and 10 inches respectively). The study in [11] proposes a simple game of comparing two images; 30 users performed the task on 3 different smartphones for about 3 minutes. In Putri et al. [12], 29 users performed different tasks on a Zenfone device, such as answering questionnaires, general web browsing, map searches on Google Maps and small writing tasks, with a subsequent classification between device owners and impostors. Finally, the work of Liu et al. [13] adopted patterns for user classification and involved 113 users completing point patterns composed of 4 to 9 steps: 10 samples were collected at the beginning of the experiment and 10 after 7 weeks, when 7 users tried to emulate the patterns of the 113 users 5 times; these last patterns have been used as a test set to simulate impostors.

To the best of the authors' knowledge, few works aim at recognizing emotions from touch dynamics. Gao et al. [14] built a system to recognize four emotional states (Excited, Relaxed, Frustrated and Bored), showing that pressure features discriminate frustration from the other three states, while stroke length features mainly discriminate boredom from a relaxed state. The classification results were interesting, since the proposed approach discriminates between the 4 emotional states reaching between 69% and 77% of correct recognition. These results highlight the potential of using touch behavior as a non-obtrusive way to measure users' emotional states in contexts where touch-based devices are used. Similarly, Maramis et al. [15] use haptic touch data acquired from Android smartphones for unobtrusive, real-life emotion recognition by exploiting the association between four emotions and haptic touch; the proposed method achieves very promising classification accuracy using a mixture of feature extraction and machine learning based classification techniques.
III. THE DASS-42 QUESTIONNAIRE AND AFFECTIVE STATES

Stress, anxiety and depression are responses to the challenges of everyday life, and it is useful to detect and prevent them before they impact individual health and daily actions: if a device detects a negative emotional state, an intelligent system can support the user with appropriate solutions, for example by lightening his work schedule or by organizing the environment and the interface in a more relaxing way with suitable icons, colors, sounds and brightness.

Stress is a general adaptation syndrome designed to re-establish a new internal balance following changes at the humoral, organic and biological levels; the physical response shows tachycardia, muscle contraction and other factors typical of the "fight or flight" response. Anxiety is a state characterized by intense feelings of concern and fear, often unfounded, related to a specific environmental stimulus and associated with a failed adaptation response; it is often accompanied by palpitations, shortness of breath and tremor, with a "fight or flight" response. Depression is a disorder characterized by mood episodes accompanied by low self-esteem and loss of interest in normally pleasant activities; it is a debilitating disease that involves both the affective and cognitive spheres, affecting work, sleep and physical health with a strong impact on lifestyle and quality of life.

The DASS-42 ('Depression, Anxiety and Stress Scales') questionnaire [16] was created at the University of New South Wales (Australia) to achieve comparable results in the evaluation of the three emotional states. It consists of 42 questions, 14 for each category, referring to the 7 days preceding the interview; each answer is evaluated on a 4-point scale and the final score of each category is the sum of the individual answers. The scale has already been used in tasks similar to those considered here: specifically, it has been used to determine writers' emotional states when performing handwriting, which is considered a behavioral biometric like touch dynamics [5]. In this work, after an entire pattern execution session, the questionnaire is used to assign a label to the collected subject's data, associating them with his affective state while performing the experimental session.
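To make the labeling step concrete, the Python fragment below sketches how the three DASS-42 subscale scores can be computed from the 42 answers of a session. The item-to-scale mapping reproduces, to the best of our knowledge, the published scale of [16] (it should be verified against the original instrument), while function and variable names are our own illustration rather than the actual app code.

    # Sketch: computing the DASS-42 subscale scores used to label a session.
    # Answers are assumed to arrive as a list of 42 integers in {0,1,2,3}
    # (the 4-point scale), in questionnaire order (0-indexed items).
    ITEMS_PER_SCALE = {
        "depression": [2, 4, 9, 12, 15, 16, 20, 23, 25, 30, 33, 36, 37, 41],
        "anxiety":    [1, 3, 6, 8, 14, 18, 19, 22, 24, 27, 29, 35, 39, 40],
        "stress":     [0, 5, 7, 10, 11, 13, 17, 21, 26, 28, 31, 32, 34, 38],
    }

    def dass42_scores(answers):
        """Sum the 14 answers (each rated 0-3) belonging to each scale."""
        assert len(answers) == 42 and all(0 <= a <= 3 for a in answers)
        return {scale: sum(answers[i] for i in items)
                for scale, items in ITEMS_PER_SCALE.items()}

    # Example: label all swipes of a session with the questionnaire outcome.
    session_label = dass42_scores([1, 0, 2] * 14)

The three resulting sums are the per-category scores attached to the swipe data collected in the same session.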
IV. THE EMOTOUCH LOCK APP

To date, no public dataset relating touch tasks to emotional states is available. For the purpose of data collection, an application called EmoTouch Lock (EMOtion and TOUCH dynamics in a LOCK screen) has been specifically developed: it simulates unlock patterns on an Android smartphone while collecting data on the user's touch behavior. At the first usage the user is required to provide age and gender information; afterwards, 4 tabs are displayed: Survey and Easy/Medium/Difficult Pattern.

Fig. 1: the three patterns proposed for the task, with different difficulty.

The first tab gives access to the DASS-42 questionnaire, while the three 'pattern tabs' offer a touch sequence to be executed (Fig. 1) whose difficulty depends on the number of swipes required. When a specific task is proposed by the system, for example the Easy one, a pop-up appears (Fig. 2, left) containing a tutorial that shows how to complete the sequence without lifting the finger from the touch screen. Next, a screen with the 9 points to be linked following the proposed pattern is presented (Fig. 2, right). User data are recorded from the first touch until the finger is lifted; whether the inserted pattern is correct or not, the data are sent to the server.

Fig. 2: (left) tutorial tab for the Easy pattern; (right) the execution tab.

V. SWIPE AND NUMERICAL FEATURES

The raw values acquired by the sensors must be transformed and adapted to be used effectively in machine learning classifiers and models.

A. Swipe Dynamics

The first stage is swipe extraction. A swipe is a touch interaction with no sensible curvature. Figure 3 provides an example of the execution of the 'Medium' difficulty task. Considering the pattern execution sequence (from A to D), black dots are the sampled coordinate projections of the user's touch, while red dots highlight the instants where the touch changes direction: a swipe is the sequence of dots sampled until a direction change (the ideal red line in the figure). To determine the red points in a pattern execution, the angle formed by each sequence of three consecutive black points with respect to the horizontal axis has been computed and empirically matched against a threshold of 135 degrees.

Fig. 3: the plot of touch movements (black dots) while executing the 'Medium' pattern. Red dots highlight direction changes (swipes, red lines).
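A minimal sketch of this segmentation step is shown below, assuming the touch samples arrive as time-ordered (x, y) pairs. The 135-degree threshold is the one reported in the text; function and variable names (angle_at, split_into_swipes) are our own illustration rather than the actual app code.

    import math

    TURN_THRESHOLD_DEG = 135  # empirical threshold reported above

    def angle_at(prev, cur, nxt):
        """Internal angle (degrees) at 'cur': 180 means the three points
        are collinear, small values mean a sharp direction change."""
        a1 = math.atan2(prev[1] - cur[1], prev[0] - cur[0])
        a2 = math.atan2(nxt[1] - cur[1], nxt[0] - cur[0])
        ang = abs(math.degrees(a1 - a2)) % 360
        return 360 - ang if ang > 180 else ang

    def split_into_swipes(points, threshold=TURN_THRESHOLD_DEG):
        """Cut the sampled (x, y) touch points into swipes wherever the
        angle formed by three consecutive points drops below the threshold."""
        swipes, current = [], [points[0]]
        for prev, cur, nxt in zip(points, points[1:], points[2:]):
            current.append(cur)
            if angle_at(prev, cur, nxt) < threshold:  # direction change: cut
                swipes.append(current)
                current = [cur]  # the new swipe starts at the turning point
        current.append(points[-1])
        swipes.append(current)
        return swipes

Applied to the 'Medium' pattern of Fig. 3, such a procedure returns one list of points per red segment, each of which is then described by the numerical features of the next subsection.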
B. Numerical Features

A set of numerical features has been extracted from each swipe, as reported in Table I: 14 features characterize each single swipe. Among them, 'line deviation' and 'swipe direction' are the most important: the former represents a change in the touch movement, while the latter considers the deviation of the movement from the 'ideal' trajectory requested by the task, indicating how linearly the user drew the swipe.

TABLE I. Numerical features calculated from the drawn swipes in each task.

ID  Feature         Description
1   distance        Swipe length
2   speed           Swipe speed = length / duration
3   lineDev         Relationship between distance and displacement
4   PressDev        Pressure variance on the screen
5   sizeVar         Screen contact area variance
6   lineDir         Swipe direction
7   PPtouchsPerc    Percentage of touches in a certain direction, considering consecutive point pairs
8   PPangleVar      Variance of the direction of the point-to-point vectors
9   SPtouchsPerc    Percentage of touches in a certain direction as 'start point - point x'
10  SPangleVar      Variance of the direction of the 'start point - point x' vectors
11  EPtouchsPerc    Percentage of touches in a certain direction as 'end point - point x'
12  EPangleVar      Variance of the direction of the 'end point - point x' vectors
13  YaxisAngleVar   Variance of the angle formed between points and the Y axis
14  XaxisAngleVar   Variance of the angle formed between points and the X axis

Some features in Table I have been calculated in different ways depending on the points contained in a swipe, as follows (see the sketch after this list):

● PP: the feature has been calculated point-to-point, i.e. all the points are considered in consecutive pairs (point 1 - point 2, point 2 - point 3, point 3 - point 4, etc.)
● SP: the first point of each pair is always the starting point of the sequence (start point - point 1, start point - point 2, start point - point 3, etc.)
● EP: unlike SP, the end point is taken into account rather than the starting point
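A minimal sketch of the feature computation follows, assuming a swipe is the list of (x, y) points produced by the segmentation step. Only features 1-2 and the three angle-variance variants (8, 10, 12) are shown; names such as pairs and angle_variance are illustrative, not the actual implementation.

    import math
    import statistics

    def pairs(points, mode):
        """Build the point pairs for the PP / SP / EP feature variants."""
        if mode == "PP":   # consecutive pairs: (p1,p2), (p2,p3), ...
            return list(zip(points, points[1:]))
        if mode == "SP":   # anchored at the start point: (p0,p1), (p0,p2), ...
            return [(points[0], p) for p in points[1:]]
        if mode == "EP":   # anchored at the end point: (pN,p0), (pN,p1), ...
            return [(points[-1], p) for p in points[:-1]]
        raise ValueError(mode)

    def direction(a, b):
        """Direction of the vector a->b, in degrees from the horizontal axis."""
        return math.degrees(math.atan2(b[1] - a[1], b[0] - a[0]))

    def angle_variance(points, mode):
        """Features 8, 10, 12: variance of the pair-vector directions."""
        return statistics.pvariance(direction(a, b) for a, b in pairs(points, mode))

    def swipe_features(points, duration_s):
        """Features 1-2 plus the three angle-variance variants."""
        length = sum(math.dist(a, b) for a, b in pairs(points, "PP"))
        return {
            "distance": length,
            "speed": length / duration_s,
            "PPangleVar": angle_variance(points, "PP"),
            "SPangleVar": angle_variance(points, "SP"),
            "EPangleVar": angle_variance(points, "EP"),
        }

The remaining features (pressure and contact-area variances, line deviation, directional percentages and the axis-angle variances) follow the same pattern, varying only the per-pair quantity being aggregated.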
VI. EXPERIMENTAL SESSION

A. Experimental Setup

In order to perform the experiments, 40 distinct subjects (university students, mean age of 22 years, 6 females and 34 males) have been asked to execute the swipe tasks, with a minimum of 5 patterns for each difficulty level, for a total of at least 15 patterns per user. The only constraint was that a new pattern could be executed only at least 10 minutes after the completion of the previous one, while the subjects used the app in an uncontrolled real environment, recording data on the train, while walking, sitting and so on; moreover, there was no limit on the interaction mode, since subjects were able to complete the patterns using the most comfortable hand and fingers (such information has been requested at the end of each task execution). Due to the freedom given to participants, not all of them reached the same number of recorded patterns; therefore, after a data cleaning operation and in order to maximize the possibility of comparisons, it was decided to divide them into three experimental groups, also overlapped in order to consider the subjects' skill progression over time. The entire dataset has thus been balanced into the following:

● Dataset A: 40 users with 6 attempts for each task
● Dataset B: 22 users with 10 attempts for each task
● Dataset C: 10 users with 16 attempts for each task

B. Machine Learning Classification

At the initial stage, a set of classifiers has been adopted to evaluate classification accuracy at the single swipe level (i.e. not at the entire pattern level). The following have been considered (a comparison sketch follows at the end of this section):

● J48: algorithm for the generation of a C4.5 decision tree, pruned and not pruned
● Support Vector Machine: a model that assigns one of two classes, separated through support vectors
● Random Forest: an ensemble classifier of decision trees
● Bayesian Network: a probabilistic model that exploits variables and their conditional dependencies
● Naive Bayes: requires knowledge of the a priori and conditional probabilities related to the problem

Classification results considering the whole dataset are reported in Table II; a 10-fold cross validation mode has been employed to reduce the impact of the variance in choosing the training and test examples. The best performance has been obtained with the Random Forest: 76% of accuracy.

TABLE II. Classification results of swipes with features on the whole dataset.

Classifier       Accuracy
J48              65%
SVM              43%
Random Forest    76%
Bayes Network    56%
Naïve Bayes      47%

Random Forest has then been used for the further tests; the results related to negative emotional states classification with 10-fold cross validation are reported in Table III, where 15 users have been randomly selected with 184 swipe samples each. It can be observed that the recognition accuracy is near 70% and that all results improve when exploiting the further behavioral feature set. The best recognized emotional state is stress, with 73.6% and 78% respectively, while depression is the one that benefits most from the new features (+5%).

TABLE III. Classification results on affective state recognition exploiting further behavioral features, with the Random Forest classifier.

Feature set                                      Stress    Anxiety    Depression
basic features (Table I)                         73.6%     69.5%      72.2%
basic features (Table I) + hand and finger data  78%       74.1%      77.2%

Finally, the Random Forest classifier has been employed to perform identity verification tests, checking whether it is possible to associate each single swipe with the user who produced it, in order to highlight the user characterization provided by this behavioral biometric trait. In other words, the goal is to understand whether it is possible to have identity verification as well as emotional state recognition at unlock time. In this case, due to the reduced dimension of the sub-datasets, a leave-one-out setup has been adopted for each of them: if the dataset contains n swipe feature examples, n-1 are used for training and the remaining one for testing, repeated n times. As evaluation metric the Equal Error Rate (EER) has been chosen: it indicates the operating point at which the proportion of false acceptances equals the proportion of false rejections (the lower the EER value, the higher the accuracy of the biometric system).
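The evaluation pipeline can be sketched as follows, assuming the per-swipe features are arranged in a NumPy matrix X with user labels y. scikit-learn is used here as an illustrative stand-in for the actual toolkit (DecisionTreeClassifier approximates J48, and no Bayesian Network counterpart is included); the leave-one-out EER estimate follows the definition given above.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.svm import SVC
    from sklearn.naive_bayes import GaussianNB
    from sklearn.model_selection import (cross_val_score, cross_val_predict,
                                         LeaveOneOut)

    def compare_classifiers(X, y):
        """10-fold cross-validated accuracy at the single-swipe level."""
        models = {
            "J48 (C4.5-like tree)": DecisionTreeClassifier(),
            "SVM": SVC(),
            "Random Forest": RandomForestClassifier(),
            "Naive Bayes": GaussianNB(),
        }
        return {name: cross_val_score(m, X, y, cv=10).mean()
                for name, m in models.items()}

    def loo_eer(X, y, claimed_user):
        """Leave-one-out EER for verifying one claimed identity."""
        genuine = (y == claimed_user)
        # Out-of-sample probability that each swipe belongs to the claimed
        # user (column 1 of predict_proba corresponds to the 'True' class).
        scores = cross_val_predict(RandomForestClassifier(), X, genuine,
                                   cv=LeaveOneOut(),
                                   method="predict_proba")[:, 1]
        far, frr = [], []
        for t in np.sort(scores):
            far.append((scores[~genuine] >= t).mean())  # false acceptances
            frr.append((scores[genuine] < t).mean())    # false rejections
        i = int(np.argmin(np.abs(np.array(far) - np.array(frr))))
        return (far[i] + frr[i]) / 2

The EER is read off at the threshold where the two error curves cross; averaging the per-user values over a sub-dataset yields figures comparable to those of Table IV.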
TABLE IV. User identity verification exploiting swipe numerical features.

Dataset   EER
A         23%
B         16%
C         12%

From Table IV it emerges that the best-recognized subjects are those with the highest number of attempts for each task (dataset C): although they are fewer than in the other groups, their greater characterization can be attributed to the increasingly specific data, along with a progression in the mastery of the execution skills. Results are noteworthy, with the highest error being only 23% on dataset A and the best results on dataset C, where the error decreases by about 50%.

VII. CONCLUSION AND FUTURE WORK

This study introduces touch dynamics as a biometric trait able to recognize users' emotional states, information useful to characterize and distinguish users as well as to adapt the contents and behavior of smartphones. A set of tasks related to touch unlock patterns has been exploited to extract numerical features on a smartphone device, demonstrating that touch swipes are useful for recognizing identity, with an EER of 12% for the most practiced users using the Random Forest. Emotional state recognition has been performed adopting the DASS-42 questionnaire, whose results will be further investigated in future studies; in this case, classification accuracy is around 70%. Starting from this work several improvements are possible, such as adding more features, classification methods and swipe patterns to test how users improve their performance. Equally necessary is a set of users balanced between males and females, along with a feature selection study to investigate which features give the major informative contribution. Furthermore, the proposed methodology could be adapted to new contexts, exploiting swipes made by users while moving around the device interface or in other applications.

ACKNOWLEDGMENT

This work is supported by the Italian Ministry of Education, University and Research within the PRIN2017 - BullyBuster project - A framework for bullying and cyberbullying action detection by computer vision and artificial intelligence methods and algorithms. CUP: H94I19000230006.

REFERENCES

[1] D. Impedovo and G. Pirlo, "Automatic signature verification in the mobile cloud scenario: survey and way ahead," IEEE Transactions on Emerging Topics in Computing.
[2] S. Deb and S. Dandapat, "Multiscale Amplitude Feature and Significance of Enhanced Vocal Tract Information for Emotion Classification," IEEE Transactions on Cybernetics, vol. 49, no. 3, pp. 802-815, 2019.
[3] A. Majumder, L. Behera and V. K. Subramanian, "Automatic Facial Expression Recognition System Using Deep Network-Based Data Fusion," IEEE Transactions on Cybernetics, vol. 48, no. 1, pp. 103-114, 2018.
[4] D. McColl, C. Jiang and G. Nejat, "Classifying a Person's Degree of Accessibility From Natural Body Language During Social Human-Robot Interactions," IEEE Transactions on Cybernetics, vol. 47, no. 2, pp. 524-538, 2017.
[5] L. Likforman-Sulem, A. Esposito, M. Faundez-Zanuy, S. Clémençon and G. Cordasco, "EMOTHAW: A Novel Database for Emotional State Recognition From Handwriting and Drawing," IEEE Transactions on Human-Machine Systems, vol. 47, no. 2, pp. 273-284, April 2017.
[6] B. Draffin, J. Zhu, and J. Zhang, "KeySens: Passive user authentication through micro-behavior modeling of soft keyboard interaction," in Mobile Computing, Applications, and Services, G. Memmi and U. Blanke, Eds. Cham: Springer International Publishing, 2014, pp. 184-201.
[7] N. Zheng, K. Bai, H. Huang, and H. Wang, "You are how you touch: User verification on smartphones via tapping behaviors," in 2014 IEEE 22nd International Conference on Network Protocols, Oct 2014, pp. 221-232.
[8] G. Kambourakis, D. Damopoulos, D. Papamartzivanos, and E. Pavlidakis, "Introducing touchstroke: Keystroke-based authentication system for smartphones," Security and Communication Networks, vol. 9, no. 6, pp. 542-554, Apr. 2016. http://dx.doi.org/10.1002/sec.1061
[9] Y. Meng, D. S. Wong, R. Schlegel, and L.-f. Kwok, "Touch gestures based biometric authentication scheme for touchscreen mobile phones," in Information Security and Cryptology, M. Kutyłowski and M. Yung, Eds. Heidelberg: Springer Berlin Heidelberg, 2013, pp. 331-350.
[10] Z. Syed, J. Helmick, S. Banerjee, and B. Cukic, "Effect of user posture and device size on the performance of touch-based authentication systems," in 2015 IEEE 16th International Symposium on High Assurance Systems Engineering, Jan 2015, pp. 10-17.
[11] S. Eberz, G. Lovisotto, A. Patané, M. Kwiatkowska, V. Lenders, and I. Martinovic, "When your fitness tracker betrays you: Quantifying the predictability of biometric features across contexts," in 2018 IEEE Symposium on Security and Privacy (SP), May 2018, pp. 889-905.
[12] A. N. Putri, Y. D. W. Asnar, and S. Akbar, "A continuous fusion authentication for Android based on keystroke dynamics and touch gesture," in 2016 International Conference on Data and Software Engineering (ICoDSE), Oct 2016, pp. 1-6.
[13] C.-L. Liu, C.-J. Tsai, T.-Y. Chang, W.-J. Tsai, and P.-K. Zhong, "Implementing multiple biometric features for a recall-based graphical keystroke dynamics authentication system on a smart phone," Journal of Network and Computer Applications, vol. 53, pp. 128-139, 2015.
[14] Y. Gao, N. Bianchi-Berthouze, and H. Meng, "What does touch tell us about emotions in touchscreen-based gameplay?," ACM Transactions on Computer-Human Interaction, vol. 19, no. 4, pp. 31:1-31:30, 2012.
[15] C. Maramis, L. Stefanopoulos, I. Chouvarda, and N. Maglaveras, "Emotion recognition from haptic touch on Android device screens," 2018. doi: 10.1007/978-981-10-7419-6_34.
[16] P. F. Lovibond and S. H. Lovibond, "The structure of negative emotional states: Comparison of the Depression Anxiety Stress Scales (DASS) with the Beck Depression and Anxiety Inventories," Behaviour Research and Therapy, vol. 33, no. 3, pp. 335-343, 1995.