Biblio

Filters: Keyword is mixed reality
2022-01-25
Kozlova, Liudmila P., Kozlova, Olga A..  2021.  Expanding Space with Augmented Reality. 2021 IEEE Conference of Russian Young Researchers in Electrical and Electronic Engineering (ElConRus). :965–967.
Replacing real life with virtual space has long ceased to be just a theory. Among the wide variety of visualization systems, those that project non-existent objects into real-world space stand out in particular. Augmented reality technology has thus found application in many different fields. The article discusses the general concepts and principles of building augmented reality systems.
Lu, Lu, Duan, Pengshuai, Shen, Xukun, Zhang, Shijin, Feng, Huiyan, Hu, Yong.  2021.  Gaze-Pinch Menu: Performing Multiple Interactions Concurrently in Mixed Reality. 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). :536–537.
Performing interactions using gaze and pinch has been shown to be an efficient interaction method in Mixed Reality, as such techniques provide users with concise and natural experiences. However, executing a task through a sequence of individual interactions is inefficient in some application scenarios. In this paper, we propose the Gaze-Pinch Menu, whose core concept is to reduce unnecessary operations by combining several interactions. Using this technique, users can continuously perform multiple interactions on a selected object concurrently without changing gestures. The user study results show that our Gaze-Pinch Menu improves operational efficiency effectively.
Gonsher, Ian, Lei, Zhenhong.  2021.  Prototype of Force Feedback Tool for Mixed Reality Applications. 2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct). :508–509.
This prototype demonstrates the viability of manipulating both physical and virtual objects with the same tool in order to maintain object permanence across both modes of interaction. Using oppositional force feedback, provided by a servo, and an augmented visual interface, provided by the user’s smartphone, this tool simulates the look and feel of a physical object within an augmented environment. The tool can also manipulate physical objects that are not part of the augmented reality, such as a physical nut. By integrating both modes of interaction into the same tool, users can fluidly move between them, manipulating both physical and virtual objects as the need arises. By overlaying this kind of visual and haptic augmentation onto a common tool such as a pair of pliers, we hope to further explore scenarios for collaborative telepresence in future work.
2021-11-29
Li, Taojin, Lei, Songgui, Wang, Wei, Wang, Qingli.  2020.  Research on MR Virtual Scene Location Method Based on Image Recognition. 2020 International Conference on Information Science, Parallel and Distributed Systems (ISPDS). :109–113.
In order to solve the problem of accurately positioning a mixed reality virtual scene in physical space, this paper first analyzes the positioning principle of mixed reality virtual scenes. Second, based on a comparison of three developer kits (ARToolKit, ARTag, and Vuforia) and two image optimization algorithms (AHE and ACE), the Vuforia development tool is chosen to complete the marker-based tracking and registration task, and the ACE algorithm is used to optimize the marker images, improving the efficiency, stability, and accuracy of image recognition and registration. Multi-target recognition and registration technology is then used to realize multi-point positioning of the virtual scene. Finally, HoloLens glasses are used as the hardware carrier to verify the above method. The experimental results show that the method not only realizes precise positioning of the MR virtual scene based on image recognition, but also ensures the absolute position of the virtual model in real space, bringing users a more realistic virtual experience. Keywords: mixed reality, multi-person collaboration, virtual positioning, gesture interaction.
2019-02-22
Roberts, Jasmine.  2018.  Using Affective Computing for Proxemic Interactions in Mixed-Reality. Proceedings of the Symposium on Spatial User Interaction. :176–176.

Immersive technologies have been touted as empathetic mediums. This capability has yet to be fully explored through machine learning integration. Our demo seeks to explore proxemics in mixed-reality (MR) human-human interactions. The author developed a system where spatial features can be manipulated in real time by identifying emotions corresponding to unique combinations of facial micro-expressions and tonal analysis. The Magic Leap One, the first commercial spatial computing head-mounted (virtual retinal) display (HMD), is used as the interactive interface. A novel spatial user interface visualization element is prototyped that leverages the affordances of mixed reality by introducing both a spatial and an affective component to interfaces.

2017-06-27
Hu, Gang, Bin Hannan, Nabil, Tearo, Khalid, Bastos, Arthur, Reilly, Derek.  2016.  Doing While Thinking: Physical and Cognitive Engagement and Immersion in Mixed Reality Games. Proceedings of the 2016 ACM Conference on Designing Interactive Systems. :947–958.

We present a study examining the impact of physical and cognitive challenge on reported immersion for a mixed reality game called Beach Pong. Contrary to prior findings for desktop games, we find significantly higher reported immersion among players who engage physically, regardless of their actual game performance. Building a mental map of the real, virtual, and sensed world is a cognitive challenge for novices, and this appears to influence immersion: in our study, participants who actively attended to both physical and virtual game elements reported higher immersion levels than those who attended mainly or exclusively to virtual elements. Without an integrated mental map, in-game cognitive challenges were ignored or offloaded to motor response when possible in order to achieve the minimum required goals of the game. From our results we propose a model of immersion in mixed reality gaming that is useful for designers and researchers in this space.