Biblio

Filters: Author is Yalçin, Özge Nilay
2019-12-16
DiPaola, Steve, Yalçin, Özge Nilay.  2019.  A multi-layer artificial intelligence and sensing based affective conversational embodied agent. 2019 8th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW). :91–92.

Building natural, conversational virtual humans is a task of formidable complexity. We believe that, especially when building agents that interact affectively with biological humans in real time, a cognitive science-based, multi-layered sensing and artificial intelligence (AI) systems approach is needed. For this demo, we show a working version (through live human interaction) of our modular system: a natural, conversational 3D virtual human built from AI and sensing layers. These layers sense the human user via facial emotion recognition, voice stress, the semantic meaning of words, eye gaze, heart rate, and galvanic skin response. These inputs are combined with AI sensing and recognition of the environment using deep learning-based natural language captioning and dense captioning. All of these signals are processed by our AI avatar system, enabling an affective and empathetic conversation driven by an NLP topic-based dialogue engine that uses facial expressions, gestures, breath, eye gaze, and voice in two-way, back-and-forth conversations with the sensed human. Our lab has built these systems in stages over the years.
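The abstract gives no implementation details, so the following is only a minimal illustrative sketch of the kind of multi-layer fusion it describes: per-layer affect signals from the user-sensing modalities are combined into a single estimate that drives the avatar's response style. All class names, fields, thresholds, and behaviors here are assumptions for illustration, not the authors' code.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical per-layer affect scores in [-1, 1], one per sensing modality
# mentioned in the abstract (facial emotion, voice stress, word semantics,
# eye gaze, heart rate, galvanic skin response).
@dataclass
class SensingFrame:
    facial_emotion: float
    voice_stress: float
    word_sentiment: float
    gaze_engagement: float
    heart_rate_arousal: float
    gsr_arousal: float

def fuse_affect(frame: SensingFrame) -> dict:
    """Combine the per-layer signals into a rough valence/arousal estimate."""
    valence = mean([frame.facial_emotion, frame.word_sentiment, frame.gaze_engagement])
    arousal = mean([frame.voice_stress, frame.heart_rate_arousal, frame.gsr_arousal])
    return {"valence": valence, "arousal": arousal}

def select_avatar_behavior(affect: dict) -> str:
    """Map the fused affect estimate to an empathetic response style for the avatar."""
    if affect["arousal"] > 0.5 and affect["valence"] < 0.0:
        return "calming tone, slow breath, soft gaze"
    if affect["valence"] > 0.3:
        return "upbeat tone, smiling expression, open gestures"
    return "neutral tone, attentive gaze"

if __name__ == "__main__":
    frame = SensingFrame(0.2, 0.7, -0.3, 0.4, 0.8, 0.6)
    affect = fuse_affect(frame)
    print(affect, "->", select_avatar_behavior(affect))
```

In the actual system the fusion and dialogue layers would be far richer (topic-based NLP, environment captioning, gesture and breath control); this sketch only shows how independent sensing layers might feed one shared affective state.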