Biblio
This paper presents a computational model for managing the first impressions of warmth and competence that an Embodied Conversational Agent makes on the user. These impressions are important to manage because they can affect the user's perception of the agent and their willingness to continue the interaction. The model aims at detecting the user's impressions of the agent and producing appropriate verbal and nonverbal agent behaviours in order to maintain a positive impression of warmth and competence. The user's impressions are recognized with a machine learning approach based on facial expressions (action units), which are important indicators of users' affective states and intentions. The agent adapts its verbal and nonverbal behaviour in real time, using a reinforcement learning algorithm that takes the user's impressions as reward to select the most appropriate combination of verbal and nonverbal behaviours to perform. A user study testing the model in a contextualized interaction with users is also presented. Our hypothesis is that users' ratings differ when the agent adapts its behaviour according to the reinforcement learning algorithm, compared to when it does not adapt to the user's reactions (i.e., when it selects its behaviours randomly). The study shows a general tendency for the agent to perform better with the model than in the random condition. Significant results show that users' ratings of the agent's warmth are influenced by their a priori attitudes towards virtual characters, and that users judged the agent as more competent when it adapted its behaviour than in the random condition.
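The abstract does not specify the reinforcement learning algorithm in detail; a minimal sketch of how such an adaptation loop could look, assuming an epsilon-greedy bandit over behaviour combinations with the detected impression score as reward. The behaviour names, exploration rate, and the stand-in impression detector are all hypothetical, not taken from the paper:

```python
import random

# Sketch of the adaptation loop: each verbal/nonverbal behaviour
# combination is an arm; the detected user impression is the reward.
# Behaviour labels below are illustrative placeholders.
BEHAVIOURS = [
    ("warm_words", "smile"), ("warm_words", "gesture"),
    ("competent_words", "smile"), ("competent_words", "gesture"),
]

q_values = {b: 0.0 for b in BEHAVIOURS}  # estimated reward per behaviour
counts = {b: 0 for b in BEHAVIOURS}
EPSILON = 0.1                            # exploration rate (assumed)

def select_behaviour():
    """Explore with probability EPSILON, otherwise exploit the best estimate."""
    if random.random() < EPSILON:
        return random.choice(BEHAVIOURS)
    return max(q_values, key=q_values.get)

def update(behaviour, impression_reward):
    """Incremental-mean update of the behaviour's value estimate."""
    counts[behaviour] += 1
    q_values[behaviour] += (impression_reward - q_values[behaviour]) / counts[behaviour]

def detect_impression():
    """Stand-in for the action-unit-based impression classifier."""
    return random.uniform(0.0, 1.0)

# One simulated adaptation step.
b = select_behaviour()
update(b, detect_impression())
```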
The goal of this work is to enable user authentication via finger inputs on ubiquitous surfaces, leveraging low-cost physical vibration. We propose VibWrite, which extends finger-input authentication beyond touch screens to any solid surface for smart access systems (e.g., access to apartments, vehicles, or smart appliances). It integrates passcode, behavioral and physiological characteristics, and surface dependency to provide a low-cost, tangible, and enhanced security solution. VibWrite builds upon a touch-sensing technique using vibration signals that can operate on surfaces made of a broad range of materials. It differs significantly from traditional password-based approaches, which authenticate only the password itself rather than the legitimate user, and from behavioral-biometrics solutions, which usually require specific or expensive hardware (e.g., a touch screen or fingerprint reader), raise privacy concerns, and are vulnerable to smudge attacks. VibWrite is based on new algorithms that discriminate fine-grained finger inputs and supports three independent passcode secrets (PIN numbers, lock patterns, and simple gestures) by extracting unique features in the frequency domain that capture both behavioral and physiological characteristics such as contact area and touch force. VibWrite is implemented with a single pair of a low-cost vibration motor and receiver that can easily be attached to any surface (e.g., a door panel, a desk, or an appliance). Our extensive experiments demonstrate that VibWrite authenticates users with high accuracy (e.g., over 95% within two trials) and a low false-positive rate (e.g., less than 3%), and is robust against various types of attacks.
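The abstract only states that unique features are extracted in the frequency domain; one plausible sketch of such an extraction step, assuming band-wise spectral energies of the received vibration signal that would then feed a per-user classifier. The sampling rate, band edges, window, and normalization are our assumptions, not details from the paper:

```python
import numpy as np

FS = 48_000  # assumed receiver sampling rate (Hz)
# Assumed vibration bands of interest, below the Nyquist limit of 24 kHz.
BANDS = [(15_000, 17_000), (17_000, 19_000), (19_000, 21_000)]

def spectral_features(signal: np.ndarray) -> np.ndarray:
    """Return normalized per-band energies for one touch segment."""
    window = signal * np.hanning(len(signal))          # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(window)) ** 2        # power spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    feats = np.asarray(
        [spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in BANDS]
    )
    # Normalize so absolute touch force scales out; relative band shape remains.
    return feats / (feats.sum() + 1e-12)

# Example: features for a 20 ms segment of simulated receiver samples.
segment = np.random.randn(int(0.02 * FS))
print(spectral_features(segment))
```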
The proliferation of wearable devices with embedded sensors, e.g., smartwatches and activity trackers, has already shown great potential for monitoring and inferring human daily activities. This paper reveals a serious security breach of wearable devices in the context of divulging secret information (i.e., key entries) while people access key-based security systems. Existing methods of obtaining such secret information rely on the installation of dedicated hardware (e.g., a video camera or a fake keypad), or on training with labeled data from body sensors, which restricts their use in practical adversary scenarios. In this work, we show that a wearable device can be exploited to discriminate mm-level distances and directions of the user's fine-grained hand movements, enabling attackers to reproduce the trajectories of the user's hand and further recover the secret key entries. In particular, our system confirms the possibility of using the sensors embedded in wearable devices, i.e., accelerometers, gyroscopes, and magnetometers, to derive the moving distance of the user's hand between consecutive key entries regardless of the pose of the hand. Our Backward PIN-Sequence Inference algorithm exploits the inherent physical constraints between key entries to infer the complete key entry sequence. Extensive experiments were conducted with over 5,000 key entry traces collected from 20 adults on key-based security systems (i.e., ATM keypads and regular keyboards), tested on different kinds of wearables. Results demonstrate that the technique achieves 80% accuracy with only one try and more than 90% accuracy with three tries; to our knowledge, this is the first technique to reveal personal PINs by leveraging wearable devices without labeled training data or contextual information.
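One building block the abstract describes is deriving the hand's movement between consecutive key entries from inertial data. A rough sketch of how that could work, assuming double integration of gravity-free linear acceleration with a zero-velocity correction at each key press; the sampling rate and the stillness assumption at the presses are ours, and the actual system's pipeline may differ:

```python
import numpy as np

FS = 100.0  # assumed IMU sampling rate (Hz)

def displacement_between_keys(lin_acc: np.ndarray) -> np.ndarray:
    """Integrate linear acceleration (N x 3, m/s^2) between two key presses
    into a 3-D displacement vector (m).

    Assumes the hand is momentarily still at both presses, so the linearly
    accumulated velocity drift is removed before the second integration.
    """
    dt = 1.0 / FS
    vel = np.cumsum(lin_acc, axis=0) * dt              # first integration
    # Zero-velocity update: force velocity to zero at both endpoints by
    # subtracting a linear drift ramp.
    drift = np.outer(np.linspace(0.0, 1.0, len(vel)), vel[-1])
    vel -= drift
    return vel.sum(axis=0) * dt                        # second integration

# Example with synthetic data: 0.5 s of noisy acceleration between presses.
acc = np.random.randn(50, 3) * 0.05
print(displacement_between_keys(acc))
```

The resulting per-pair displacement estimates, combined with the keypad's known geometry, are the kind of physical constraints a backward inference over candidate key sequences could exploit.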