Bibliography
Immersive technologies have been touted as empathetic media, yet this potential has not been fully explored through machine learning integration. Our demo explores proxemics in mixed-reality (MR) human-human interactions. We developed a system in which spatial features can be manipulated in real time by identifying emotions from distinct combinations of facial micro-expressions and tonal analysis. The Magic Leap One, the first commercial spatial computing head-mounted (virtual retinal) display (HMD), serves as the interactive interface. We prototype a novel spatial user interface visualization element that leverages the affordances of mixed reality by introducing both a spatial and an affective component to interfaces.
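The abstract does not specify how the facial and tonal signals are combined or how the resulting emotion drives the spatial interface. The following minimal Python sketch illustrates one plausible pipeline under stated assumptions: the classifier functions are stubs standing in for trained models, the emotion label set is assumed, and the mapping from emotion to a proxemic offset is purely illustrative, not the authors' method.

```python
# Hypothetical sketch of an emotion-to-space pipeline like the one described above.
# classify_face / classify_tone are stand-ins for trained facial micro-expression
# and vocal tone models; all names and mapping values are assumptions.

from dataclasses import dataclass

EMOTIONS = ["neutral", "joy", "anger", "sadness"]  # assumed label set


@dataclass
class EmotionEstimate:
    label: str
    confidence: float


def classify_face(frame) -> dict:
    """Return per-emotion probabilities from a camera frame (stub)."""
    return {"neutral": 0.5, "joy": 0.3, "anger": 0.1, "sadness": 0.1}


def classify_tone(audio_window) -> dict:
    """Return per-emotion probabilities from an audio window (stub)."""
    return {"neutral": 0.4, "joy": 0.4, "anger": 0.1, "sadness": 0.1}


def fuse(face_probs: dict, tone_probs: dict, face_weight: float = 0.6) -> EmotionEstimate:
    """Late fusion: weighted average of the two modality distributions."""
    fused = {e: face_weight * face_probs[e] + (1 - face_weight) * tone_probs[e]
             for e in EMOTIONS}
    label = max(fused, key=fused.get)
    return EmotionEstimate(label=label, confidence=fused[label])


def spatial_offset(estimate: EmotionEstimate) -> float:
    """Map the fused emotion to a proxemic adjustment (metres) of virtual content.

    The mapping is illustrative only; a real system would tune it empirically.
    """
    mapping = {"neutral": 0.0, "joy": -0.3, "anger": 0.5, "sadness": 0.2}
    return mapping[estimate.label] * estimate.confidence


if __name__ == "__main__":
    estimate = fuse(classify_face(frame=None), classify_tone(audio_window=None))
    print(estimate, "->", f"{spatial_offset(estimate):+.2f} m")
```

In this kind of late-fusion design the per-modality classifiers stay independent, so either the facial or the tonal model can be swapped out without retraining the other; the fused label then parameterizes whatever spatial property the MR interface exposes.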
In this paper, we present work in progress toward a vision of personalized views of visual analytics interfaces in the context of collaborative analytics in immersive spaces. In particular, we are interested in the sense of immersion, responsiveness, and personalization afforded by gaze-based input. By combining large-screen visual analytics tools with eye tracking, a collaborative visual analytics system can become egocentric for each analyst without disrupting the collaborative nature of the experience. We present a prototype system and several ideas for real-time personalization of views in visual analytics.
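As a rough illustration of how gaze input could personalize a shared view, the Python sketch below tracks per-user attention over rectangular view regions of a large display and derives a private emphasis weight for each region. It is a minimal sketch under assumptions not stated in the abstract: gaze samples are assumed to arrive in shared-display pixel coordinates, the region names, smoothing constant, and "emphasis" effect are all hypothetical.

```python
# Hypothetical sketch of gaze-driven view personalization.
# Assumes each collaborator's eye tracker reports gaze in shared-display pixels;
# region names, the smoothing constant, and the emphasis weighting are illustrative.

from dataclasses import dataclass, field


@dataclass
class ViewRegion:
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, gx: float, gy: float) -> bool:
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h


@dataclass
class PersonalState:
    # Per-user attention scores, decayed via exponential smoothing so the
    # personal emphasis fades when the user looks elsewhere.
    attention: dict = field(default_factory=dict)

    def update(self, regions, gx, gy, alpha: float = 0.2):
        for r in regions:
            hit = 1.0 if r.contains(gx, gy) else 0.0
            prev = self.attention.get(r.name, 0.0)
            self.attention[r.name] = (1 - alpha) * prev + alpha * hit

    def emphasis(self, name: str) -> float:
        """Weight in [0, 1] that this user's private overlay applies to a view."""
        return self.attention.get(name, 0.0)


if __name__ == "__main__":
    regions = [ViewRegion("scatterplot", 0, 0, 960, 1080),
               ViewRegion("timeline", 960, 0, 960, 1080)]
    user = PersonalState()
    for gaze in [(300, 500)] * 10:  # this user keeps fixating on the scatterplot
        user.update(regions, *gaze)
    print({r.name: round(user.emphasis(r.name), 2) for r in regions})
```

Because the attention state is kept per user and only affects that user's overlay, the shared display itself is untouched, which is one way the egocentric view could avoid disrupting the group's common ground.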