Biblio

Filters: Keyword is emotions
2021-03-29
Xu, X., Ruan, Z., Yang, L..  2020.  Facial Expression Recognition Based on Graph Neural Network. 2020 IEEE 5th International Conference on Image, Vision and Computing (ICIVC). :211–214.

Facial expressions are one of the most powerful, natural, and immediate means for human beings to present their emotions and intentions. In this paper, we present a novel method for fully automatic facial expression recognition. Facial landmarks are detected to characterize facial expressions, and a graph convolutional neural network is proposed for feature extraction and facial expression classification. Experiments were performed on three facial expression databases. The results show that the proposed FER method achieves recognition accuracy of up to 95.85%.
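The abstract above describes running graph convolutions over detected facial landmarks. As an illustrative sketch only (the landmark graph, layer sizes, and normalization here are common GCN conventions, not details taken from the paper), one graph-convolution layer over landmark coordinates might look like:

```python
import numpy as np

def normalize_adjacency(adj):
    """Symmetrically normalize an adjacency matrix with self-loops:
    A_hat = D^{-1/2} (A + I) D^{-1/2} (standard GCN preprocessing)."""
    a = adj + np.eye(adj.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a.sum(axis=1))
    return a * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def graph_conv_layer(x, adj_norm, weight):
    """One graph-convolution layer: ReLU(A_hat @ X @ W)."""
    return np.maximum(adj_norm @ x @ weight, 0.0)

# Toy example: 5 facial landmarks, each with (x, y) coordinates as features.
rng = np.random.default_rng(0)
landmarks = rng.normal(size=(5, 2))       # node features (landmark positions)
adj = np.zeros((5, 5))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4)]:  # chain of neighboring landmarks
    adj[i, j] = adj[j, i] = 1.0

a_hat = normalize_adjacency(adj)
w = rng.normal(size=(2, 8))               # 2 input dims -> 8 hidden dims
hidden = graph_conv_layer(landmarks, a_hat, w)
print(hidden.shape)  # (5, 8): one 8-d feature vector per landmark
```

Stacking such layers and pooling the node features into a single vector would feed a conventional classifier over the expression categories.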

2020-05-22
Geetha, R., Rekha, Pasupuleti, Karthika, S.  2018.  Twitter Opinion Mining and Boosting Using Sentiment Analysis. 2018 International Conference on Computer, Communication, and Signal Processing (ICCCSP). :1–4.

Social media has become one of the most effective and precise indicators of public opinion. This paper discusses a strategy that uses Twitter data to gauge public conviction. The public expresses sentiments on specific entities with diverse strengths and intensities, and these sentiments are strongly related to personal mood and emotions. Many methods and lexical resources have been proposed to extract sentiments toward various opinions from natural-language texts. This article proposes an approach for boosting Twitter sentiment classification using various sentiment proportions as meta-level features. The analysis was performed on tweets about the iPhone 6.
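The abstract mentions using sentiment proportions as meta-level features for classification. A minimal sketch of that idea (the tiny lexicons and feature layout here are hypothetical; real systems would draw on lexical resources such as SentiWordNet):

```python
# Hypothetical toy lexicons, for illustration only.
POSITIVE = {"great", "love", "amazing", "good"}
NEGATIVE = {"bad", "hate", "awful", "poor"}

def sentiment_proportions(tweet):
    """Return the fraction of positive and negative lexicon words in a tweet."""
    tokens = tweet.lower().split()
    if not tokens:
        return (0.0, 0.0)
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    return (pos / len(tokens), neg / len(tokens))

def add_meta_features(base_features, tweet):
    """Append sentiment proportions to an existing feature vector,
    so a downstream classifier sees them as meta-level features."""
    return list(base_features) + list(sentiment_proportions(tweet))

features = add_meta_features([0.3, 0.7], "love the iphone 6 but battery is bad")
print(features)  # [0.3, 0.7, 0.125, 0.125]
```

The augmented vector would then be fed to whatever base classifier the pipeline uses, letting lexicon-derived signals boost the learned model.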

2020-02-10
Schneeberger, Tanja, Scholtes, Mirella, Hilpert, Bernhard, Langer, Markus, Gebhard, Patrick.  2019.  Can Social Agents elicit Shame as Humans do? 2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII). :164–170.
This paper presents a study that examines whether social agents can elicit the social emotion shame as humans do. For that, we use job interviews, which are highly evaluative situations per se. We vary the interview style (shame-eliciting vs. neutral) and the job interviewer (human vs. social agent). Our dependent variables include observational data on the social signals of shame and shame regulation, as well as self-assessment questionnaires on felt uneasiness and discomfort in the situation. Our results indicate that social agents can elicit shame to the same extent as humans. This provides insights into the impact of social agents on users and the emotional connection between them.
2019-01-31
Simmons, Andrew J., Curumsing, Maheswaree Kissoon, Vasa, Rajesh.  2018.  An Interaction Model for De-Identification of Human Data Held by External Custodians. Proceedings of the 30th Australian Conference on Computer-Human Interaction. :23–26.

Reuse of pre-existing industry datasets for research purposes requires a multi-stakeholder solution that balances the researcher's analysis objectives with the need to engage the industry data custodian, whilst respecting the privacy rights of human data subjects. Current methods place the burden on the data custodian, who may not be sufficiently trained to fully appreciate the nuances of data de-identification. Through modelling of functional, quality, and emotional goals, we propose a de-identification-in-the-cloud approach whereby the researcher proposes the analyses along with the extraction and de-identification operations, while the industry data custodian retains secure control over authorising the proposed analyses. We demonstrate our approach through the implementation of a de-identification portal for sports club data.
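One common de-identification operation a portal like the one described might apply is keyed pseudonymization of direct identifiers. This sketch is an assumption for illustration, not the paper's actual implementation; the key name and record schema are invented:

```python
import hashlib
import hmac

# Hypothetical: the data custodian keeps this secret key, so researchers
# receive stable pseudonyms but cannot reverse them to real identities.
CUSTODIAN_KEY = b"secret-held-by-data-custodian"

def pseudonymize(member_id: str) -> str:
    """Replace a direct identifier with a keyed, irreversible pseudonym."""
    digest = hmac.new(CUSTODIAN_KEY, member_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:12]

def deidentify(records):
    """De-identify sports-club records before releasing them for analysis."""
    return [
        {"player": pseudonymize(r["player"]), "score": r["score"]}
        for r in records
    ]

rows = [{"player": "Alice Smith", "score": 42},
        {"player": "Alice Smith", "score": 17}]
out = deidentify(rows)
# The same person maps to the same pseudonym, preserving longitudinal links.
print(out[0]["player"] == out[1]["player"])  # True
```

A keyed HMAC (rather than a plain hash) matters here: without the custodian-held key, an attacker could re-identify subjects by hashing candidate names.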

2018-11-28
Sandbank, Tommy, Shmueli-Scheuer, Michal, Herzig, Jonathan, Konopnicki, David, Shaul, Rottem.  2017.  EHCTool: Managing Emotional Hotspots for Conversational Agents. Proceedings of the 22nd International Conference on Intelligent User Interfaces Companion. :125–128.

Building conversational agents is becoming easier thanks to the profusion of designated platforms. Integrating emotional intelligence in such agents contributes to positive user satisfaction. Currently, this integration is implemented using calls to an emotion analysis service. In this demonstration we present EHCTool, which aims to detect problematic conversation states where emotions are likely to be expressed by the user and to notify the conversation designer about them. Using its exploration view, the tool assists the designer in managing and defining appropriate responses for these cases.

2017-10-18
Yang, Yang, Ma, Xiaojuan, Fung, Pascale.  2017.  Perceived Emotional Intelligence in Virtual Agents. Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems. :2255–2262.

In March 2016, several online news media reported on the inadequate emotional capabilities of interactive virtual assistants. While significant progress has been made in the general intelligence and functionality of virtual agents (VA), emotionally intelligent (EI) VAs have not yet been thoroughly explored. We examine users' perception of the EI of virtual agents through Zara the Supergirl, a virtual agent that conducts question-and-answering-style conversational testing and counseling online. The results show that, overall, users perceive an emotion-expressing VA (EEVA) to be more emotionally intelligent than a non-emotion-expressing VA (NEEVA). However, simple affective expression may not be sufficient for EEVA to be perceived as fully EI.