Biblio

Filters: Keyword is gestures
2020-09-11
Ababtain, Eman, Engels, Daniel.  2019.  Security of Gestures Based CAPTCHAs. 2019 International Conference on Computational Science and Computational Intelligence (CSCI). :120–126.
We present a security analysis of several gesture CAPTCHA challenges designed to operate on mobile devices. Mobile gesture CAPTCHA challenges use the accelerometer and gyroscope inputs of a mobile device to allow a human to solve a simple test by physically manipulating the device. We evaluated the security of gesture CAPTCHAs on mobile devices and found them resistant to a range of common automated attacks. Our study shows that using accelerometer and gyroscope readings as the input for solving a CAPTCHA is difficult for malware but easy for a real user. Gesture CAPTCHAs are effective in differentiating between humans and machines.
Ababtain, Eman, Engels, Daniel.  2019.  Gestures Based CAPTCHAs: The Use of Sensor Readings to Solve CAPTCHA Challenge on Smartphones. 2019 International Conference on Computational Science and Computational Intelligence (CSCI). :113–119.
We present novel CAPTCHA challenges based on user gestures, designed for mobile devices. A gesture CAPTCHA challenge is a security mechanism to prevent malware from gaining access to network resources from a mobile device. Mobile devices contain a number of sensors that record the physical movement of the device. We used the accelerometer and gyroscope data as inputs to our novel CAPTCHAs to capture the physical manipulation of the device. We conducted an experimental study on a group of people and found that younger people are able to solve this type of CAPTCHA challenge successfully in a short amount of time, while using accelerometer readings caused difficulties for some older people.
2017-10-18
Zha, Xiaojie, Bourguet, Marie-Luce.  2016.  Experimental Study to Elicit Effective Multimodal Behaviour in Pedagogical Agents. Proceedings of the International Workshop on Social Learning and Multimodal Interaction for Designing Artificial Agents. :1:1–1:6.

This paper describes a small experimental study into the use of avatars to remediate the lecturer's absence in voice-over-slide material. Four different avatar behaviours are tested. Avatar A performs all the upper-body gestures of the lecturer, which were recorded using a 3D depth sensor. Avatar B is animated using few random gestures in order to create a natural presence that is unrelated to the speech. Avatar C only performs the lecturer's pointing gestures, as these are known to indicate important parts of a lecture. Finally, Avatar D performs "lecturer-like" gestures, but these are desynchronised with the speech. Preliminary results indicate students' preference for Avatars A and C. Although the effect of avatar behaviour on learning did not prove statistically significant, students' comments indicate that an avatar that behaves quietly and only performs some of the lecturer's gestures (pointing) is effective. The paper also presents a simple empirical method for automatically detecting pointing gestures in Kinect recorded lecture data.