Tirupattur, Praveen, Yogesh Singh Rawat, Concetto Spampinato, and Mubarak Shah. 2018. ThoughtViz: Visualizing Human Thoughts Using Generative Adversarial Network. Proceedings of the 26th ACM International Conference on Multimedia, 950-958.

Studying human brain signals has always attracted great attention from the scientific community. In Brain Computer Interface (BCI) research, for example, changes in brain signals related to specific tasks (e.g., thinking of something) are detected and used to control machines. While extracting spatio-temporal cues from brain signals to classify the state of the human mind is a well-explored path, decoding and visualizing brain states is new and futuristic. Following this latter direction, in this paper, we propose an approach that is able not only to read the mind, but also to decode and visualize human thoughts. More specifically, we analyze brain activity, recorded by an ElectroEncephaloGram (EEG), of a subject thinking about a digit, a character, or an object, and visually synthesize the thought item. To accomplish this, we leverage the recent progress of adversarial learning by devising a conditional Generative Adversarial Network (GAN) that takes encoded EEG signals as input and generates corresponding images. In addition, since collecting large amounts of EEG data is not trivial, our GAN model allows for learning distributions with limited training data. Performance analysis carried out on three different datasets (brain signals of multiple subjects thinking of digits, characters, and objects) shows that our approach is able to effectively generate images from a person's thoughts. The results also demonstrate that EEG signals explicitly encode cues from thoughts, which can be effectively used for generating semantically relevant visualizations.
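
To illustrate the conditioning idea described in the abstract (a GAN generator that receives an encoded EEG vector alongside noise), here is a minimal PyTorch sketch. All layer sizes, the feature dimension `EEG_FEAT_DIM`, and the class names are illustrative assumptions, not the architecture from the paper; the EEG encoder itself is stood in by a random vector.

```python
# Sketch of a conditional GAN conditioned on an EEG feature vector.
# Dimensions and layer choices are assumptions, not the paper's design.
import torch
import torch.nn as nn

EEG_FEAT_DIM = 100   # assumed size of the encoded EEG vector
NOISE_DIM = 100      # assumed size of the random noise vector
IMG_CHANNELS = 3
IMG_SIZE = 64

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        # Concatenated noise + EEG encoding is upsampled to a 64x64 image.
        self.net = nn.Sequential(
            nn.ConvTranspose2d(NOISE_DIM + EEG_FEAT_DIM, 256, 4, 1, 0),
            nn.BatchNorm2d(256), nn.ReLU(True),
            nn.ConvTranspose2d(256, 128, 4, 2, 1), nn.BatchNorm2d(128), nn.ReLU(True),
            nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.BatchNorm2d(64), nn.ReLU(True),
            nn.ConvTranspose2d(64, 32, 4, 2, 1), nn.BatchNorm2d(32), nn.ReLU(True),
            nn.ConvTranspose2d(32, IMG_CHANNELS, 4, 2, 1), nn.Tanh(),
        )

    def forward(self, noise, eeg_feat):
        # Condition the generator by concatenating noise with the EEG encoding.
        z = torch.cat([noise, eeg_feat], dim=1).unsqueeze(-1).unsqueeze(-1)
        return self.net(z)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(IMG_CHANNELS, 64, 4, 2, 1), nn.LeakyReLU(0.2, True),
            nn.Conv2d(64, 128, 4, 2, 1), nn.BatchNorm2d(128), nn.LeakyReLU(0.2, True),
            nn.Conv2d(128, 256, 4, 2, 1), nn.BatchNorm2d(256), nn.LeakyReLU(0.2, True),
            nn.Conv2d(256, 256, 4, 2, 1), nn.LeakyReLU(0.2, True),
        )
        # Real/fake score depends on both the image and the EEG condition.
        self.fc = nn.Sequential(nn.Linear(256 * 4 * 4 + EEG_FEAT_DIM, 1), nn.Sigmoid())

    def forward(self, img, eeg_feat):
        h = self.conv(img).flatten(1)
        return self.fc(torch.cat([h, eeg_feat], dim=1))

if __name__ == "__main__":
    G, D = Generator(), Discriminator()
    noise = torch.randn(8, NOISE_DIM)
    eeg_feat = torch.randn(8, EEG_FEAT_DIM)  # stand-in for an EEG encoder's output
    fake = G(noise, eeg_feat)                # (8, 3, 64, 64)
    score = D(fake, eeg_feat)                # (8, 1)
    print(fake.shape, score.shape)
```

In this kind of setup, both the generator and the discriminator see the EEG encoding, so the generator is pushed to produce images consistent with the recorded brain activity rather than arbitrary samples; the paper's actual encoder, losses, and handling of limited training data are described in the publication itself.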