Biblio

Filters: Keyword is immersive systems
2018-01-10
Cheng, Lung-Pan, Marwecki, Sebastian, Baudisch, Patrick.  2017.  Mutual Human Actuation. Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology. :797–805.
Human actuation is the idea of using people to provide large-scale force feedback to users. The Haptic Turk system, for example, used four human actuators to lift and push a virtual reality user; TurkDeck used ten human actuators to place and animate props for a single user. While the experience of human actuators was decent, it was still inferior to the experience these people could have had, had they participated as users. In this paper, we address this issue by making everyone a user. We introduce mutual human actuation, a version of human actuation that works without dedicated human actuators. The key idea is to run pairs of users at the same time and have them provide human actuation to each other. Our system, Mutual Turk, achieves this by (1) offering shared props through which users can exchange forces while obscuring the fact that there is a human on the other side, and (2) synchronizing the two users' timelines such that their way of manipulating the shared props is consistent across both virtual worlds. We demonstrate mutual human actuation with an example experience in which users pilot kites through storms, tug fish out of ponds, are pummeled by hail, battle monsters, hop across chasms, push loaded carts, and ride in moving vehicles.
Valkov, Dimitar, Flagge, Steffen.  2017.  Smooth Immersion: The Benefits of Making the Transition to Virtual Environments a Continuous Process. Proceedings of the 5th Symposium on Spatial User Interaction. :12–19.
In this paper we discuss the benefits and limitations of, as well as different implementation options for, smooth immersion into an HMD-based IVE. We evaluated our concept in a preliminary user study in which we tested users' awareness, reality judgment, and experience in the IVE when using different transition techniques to enter it. Our results show that a smooth transition to the IVE improves the user's awareness and may increase the perceived interactivity of the system.
Cordeil, Maxime, Cunningham, Andrew, Dwyer, Tim, Thomas, Bruce H., Marriott, Kim.  2017.  ImAxes: Immersive Axes As Embodied Affordances for Interactive Multivariate Data Visualisation. Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology. :71–83.
We introduce ImAxes, an immersive system for exploring multivariate data using fluid, modeless interaction. The basic interface element is an embodied data axis. The user can manipulate these axes like physical objects in the immersive environment and combine them into sophisticated visualisations. The type of visualisation that appears depends on the proximity and relative orientation of the axes with respect to one another, which we describe with a formal grammar. This straightforward composability leads to a number of emergent visualisations and interactions, which we review and then demonstrate with a detailed multivariate data analysis use case.
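As an illustrative aside, a grammar of this kind — visualisation type chosen from axis proximity and relative orientation — can be sketched as a small rule set. The thresholds, rule names, and output labels below are invented for illustration and are not taken from the paper:

```python
import math

def compose(axis_a, axis_b, dist_threshold=0.2):
    """Pick a visualisation from the spatial relation of two embodied axes.

    Each axis is a (position, direction) pair of 3-D tuples, with direction
    assumed to be a unit vector. The rules are an illustrative stand-in
    for the paper's formal grammar.
    """
    (pa, da), (pb, db) = axis_a, axis_b
    dist = math.dist(pa, pb)
    if dist > dist_threshold:
        return "separate histograms"        # too far apart: no composition
    cos = sum(x * y for x, y in zip(da, db))
    if abs(cos) < 0.3:                      # roughly perpendicular axes
        return "2-D scatterplot"
    return "parallel coordinates"           # close and roughly parallel

# Two nearby, perpendicular axes compose into a scatterplot:
print(compose(((0, 0, 0), (1, 0, 0)), ((0.1, 0, 0), (0, 1, 0))))
```

The appeal of such a grammar is that new visualisations "emerge" without any explicit mode switch: the spatial arrangement itself is the command.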
2017-06-27
Lebeck, Kiron, Kohno, Tadayoshi, Roesner, Franziska.  2016.  How to Safely Augment Reality: Challenges and Directions. Proceedings of the 17th International Workshop on Mobile Computing Systems and Applications. :45–50.

Augmented reality (AR) technologies, such as those in head-mounted displays like Microsoft HoloLens or in automotive windshields, are poised to change how people interact with their devices and the physical world. Though researchers have begun considering the security, privacy, and safety issues raised by these technologies, to date such efforts have focused on input, i.e., how to limit the private information to which AR applications receive access. In this work, we focus on the challenge of output management: how can an AR operating system allow multiple concurrently running applications to safely augment the user's view of the world? That is, how can the OS prevent apps from (for example) interfering with content displayed by other apps or with the user's perception of critical real-world context, while still allowing them sufficient flexibility to implement rich, immersive AR scenarios? We explore the design space for the management of visual AR output, propose a design that balances OS control with application flexibility, and lay out the research directions raised and enabled by this proposal.
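As a toy illustration of output management (the region model and policy here are invented for this sketch, not the paper's or HoloLens's actual design), an AR OS might vet each app draw request against regions of the view it must keep clear:

```python
def approve_draw(request, critical_regions):
    """Toy AR output policy: reject app content that would occlude a
    critical real-world region (e.g. a windshield's view of the road).

    Regions are axis-aligned rectangles (x0, y0, x1, y1) in view space.
    """
    def overlaps(a, b):
        # Rectangles overlap unless one lies entirely to the side of the other.
        return not (a[2] <= b[0] or b[2] <= a[0] or a[3] <= b[1] or b[3] <= a[1])
    return all(not overlaps(request, c) for c in critical_regions)

road = (0.2, 0.4, 0.8, 1.0)   # hypothetical region the OS keeps clear
print(approve_draw((0.0, 0.0, 0.2, 0.2), [road]))  # True: corner widget is fine
print(approve_draw((0.3, 0.5, 0.5, 0.7), [road]))  # False: would occlude the road
```

The design tension the paper explores is exactly here: a policy this rigid is safe but limits rich AR content, while no policy at all lets one app blind the user.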

Tsai, Wan-Lun, Hsu, You-Lun, Lin, Chi-Po, Zhu, Chen-Yu, Chen, Yu-Cheng, Hu, Min-Chun.  2016.  Immersive Virtual Reality with Multimodal Interaction and Streaming Technology. Proceedings of the 18th ACM International Conference on Multimodal Interaction. :416–416.

In this demo, we present an immersive virtual reality (VR) system which integrates multimodal interaction sensors (i.e., smartphone, Kinect v2, and Myo armband) and streaming technology to improve the VR experience. The integrated system solves two common problems in most VR systems: (1) the very limited playing area due to the transmission cable between the computer and the display/interaction devices, and (2) the non-intuitive way of controlling virtual objects. We use Unreal Engine 4 to develop an immersive VR game with 6 interactive levels to demonstrate the feasibility of our system. In the game, the user not only can freely walk within a large playing area surrounded by multiple Kinect sensors but can also select virtual objects to grab and throw with the Myo armband. The experiment shows that our approach is workable for immersive VR experiences.

Ramos Mota, Roberta C., Cartwright, Stephen, Sharlin, Ehud, Hamdi, Hamidreza, Costa Sousa, Mario, Chen, Zhangxin.  2016.  Exploring Immersive Interfaces for Well Placement Optimization in Reservoir Models. Proceedings of the 2016 Symposium on Spatial User Interaction. :121–130.

As the oil and gas industry's ultimate goal is to uncover efficient and economic ways to produce oil and gas, well optimization studies are crucially important for reservoir engineers. Although this task has a major impact on reservoir productivity, it is challenging for reservoir engineers to perform, since it involves time-consuming flow simulations to search a large solution space for an optimal well plan. Our work aims to provide engineers with a) an analytical method to perform static connectivity analysis as a proxy for flow simulation, b) an application to support well optimization using our method, and c) an immersive experience that benefits engineers and supports their needs and preferences when performing the design and assessment of well trajectories. For the latter purpose, we explore our tool in three immersive environments: a CAVE with a tracked gamepad; an HMD with a tracked gamepad; and an HMD with a Leap Motion controller. This paper describes our application and its techniques in each of the different immersive environments. It also describes our findings from an exploratory evaluation conducted with six reservoir engineers, which provided insight into our application and allowed us to discuss the potential benefits of immersion for the oil and gas domain.

Mahfoud, Eli, Lu, Aidong.  2016.  Gaze-directed Immersive Visualization of Scientific Ensembles. Proceedings of the 2016 ACM Companion on Interactive Surfaces and Spaces. :77–82.

The latest advances in head-mounted displays (HMDs) for augmented reality (AR) and mixed reality (MR) have produced commercialized devices that are gradually accepted by the public. These HMDs are generally equipped with head tracking, which provides an excellent input to explore immersive visualization and interaction techniques for various AR/MR applications. This paper explores the head tracking function on the latest Microsoft HoloLens, where gaze is defined as the ray starting at the head location and pointing forward. We present a gaze-directed visualization approach to study ensembles of 2D oil spill simulations in mixed reality. Our approach allows users to place an ensemble as an image stack in a real environment and explore the ensemble with gaze tracking. The prototype system demonstrates the challenges and promising effects of gaze-based interaction in state-of-the-art mixed reality.
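A head-defined gaze ray of this kind — origin at the head, direction forward — reduces to a ray-plane intersection against the image stack. The sketch below is illustrative only (the slice layout and function name are assumptions, not from the paper):

```python
def gaze_hit_slice(head_pos, forward, slice_z_values):
    """Intersect a head-forward gaze ray with a stack of z-aligned image
    planes; return the index of the nearest slice in front of the user,
    or None if the ray never reaches the stack.
    """
    ox, oy, oz = head_pos
    dx, dy, dz = forward
    if abs(dz) < 1e-9:          # gaze parallel to the slices: no hit
        return None
    hits = []
    for i, z in enumerate(slice_z_values):
        t = (z - oz) / dz       # ray parameter where this plane is crossed
        if t > 0:               # keep only planes in front of the user
            hits.append((t, i))
    return min(hits)[1] if hits else None

# Looking straight ahead at a stack one metre away selects the first slice:
print(gaze_hit_slice((0, 0, 0), (0, 0, 1), [1.0, 2.0, 3.0]))  # → 0
```

In a real HoloLens application the head pose would come from the platform's head-tracking API each frame; the selected slice index would then drive which ensemble member is highlighted.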

Todi, Kashyap, Degraen, Donald, Berghmans, Brent, Faes, Axel, Kaminski, Matthijs, Luyten, Kris.  2016.  Purpose-Centric Appropriation of Everyday Objects As Game Controllers. Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems. :2744–2750.

Generic multi-button controllers are the most common input devices used for video games. In contrast, dedicated game controllers and gestural interactions increase immersion and playability. Room-sized gaming has opened up possibilities to further enhance the immersive experience, and provides players with opportunities to use full-body movements as input. We present a purpose-centric approach to appropriating everyday objects as physical game controllers for immersive room-sized gaming. Virtual manipulations supported by such physical controllers mimic real-world function and usage. Doing so opens up new possibilities for interactions that flow seamlessly from the physical into the virtual world. As a proof of concept, we present a 'Tower Defence'-styled game that uses four everyday household objects as game controllers, each of which serves as a weapon to defend the players' base from enemy bots. Players can use 1) a mop (or a broom) to sweep enemy bots away directionally; 2) a fan to scatter them; 3) a vacuum cleaner to suck them up; 4) a mouse trap to destroy them. Each controller is tracked using a motion capture system. A physics engine integrated into the game ensures virtual objects act as though they are manipulated by the actual physical controller, providing players with a highly immersive gaming experience.

Ravenet, Brian, Bevacqua, Elisabetta, Cafaro, Angelo, Ochs, Magalie, Pelachaud, Catherine.  2016.  Perceiving Attitudes Expressed Through Nonverbal Behaviors in Immersive Virtual Environments. Proceedings of the 9th International Conference on Motion in Games. :175–180.

Virtual Reality and immersive experiences, which allow players to share the same virtual environment as the characters of a virtual world, have recently gained more and more interest. In order to conceive these immersive virtual worlds, one of the challenges is to give the characters that populate them the ability to express behaviors that can support the immersion. In this work, we propose a model capable of controlling and simulating a conversational group of social agents in an immersive environment. We describe this model, which was previously validated using a regular screen setting, and we present a study measuring whether users recognize the attitudes expressed by virtual agents through real-time generated nonverbal-behavior animations in an immersive setting. Results mirrored those of the regular screen setting, providing further insights for improving players' experiences by integrating them into immersive simulated group conversations with characters that express different interpersonal attitudes.

Borba, Eduardo Zilles, Cabral, Marcio, Montes, Andre, Belloc, Olavo, Zuffo, Marcelo.  2016.  Immersive and Interactive Procedure Training Simulator for High Risk Power Line Maintenance. ACM SIGGRAPH 2016 VR Village. :7:1–7:1.

This project shows a procedure-training simulator targeted at the operation and maintenance of overland distribution power lines. The simulator focuses on workplace safety and risk assessment of common daily operations such as fuse replacement and power-cut activities. The training system is implemented using VR goggles (Oculus Rift) and a real scenario matched precisely with its Virtual Reality counterpart. The real scenario is composed of a real "basket" and a stick, both the actual equipment used in daily training; both are tracked by a high-precision infrared camera system (OptiTrack), providing a high degree of immersion and realism. In addition to the scenario, the user is fully tracked: head, shoulders, arms, and hands. This tracking allows a faithful simulation of the participant's movements in the virtual world, and thus a precise evaluation of movements as well as ergonomics. The virtual scenario was carefully designed to accurately reproduce, in a coherent way, all relevant spatial, architectonic, and natural features typical of the urban environment, reflecting the variety of challenges that real cities might impose on the activity. The system consists of two modules: the first is the Instructor Interface, which helps create and control different challenging scenarios and follow the student's reactions and behavior; the second is the simulator itself, presented to the student through the VR goggles. The training session can also be viewed on a projected screen by other students, enabling learning through observation of peers' mistakes and successes, much like a martial arts dojo.
The simulator features various risk scenarios such as: different climates - sun, rain and wind; different lighting conditions - day, night and artificial; different types of electrical structures; transformer fire and explosion; short-circuit and electric arc; defective equipment; many obstacles - trees, cars, windows, swarm of bees, etc.

Hu, Gang, Bin Hannan, Nabil, Tearo, Khalid, Bastos, Arthur, Reilly, Derek.  2016.  Doing While Thinking: Physical and Cognitive Engagement and Immersion in Mixed Reality Games. Proceedings of the 2016 ACM Conference on Designing Interactive Systems. :947–958.

We present a study examining the impact of physical and cognitive challenge on reported immersion for a mixed reality game called Beach Pong. Contrary to prior findings for desktop games, we find significantly higher reported immersion among players who engage physically, regardless of their actual game performance. Building a mental map of the real, virtual, and sensed world is a cognitive challenge for novices, and this appears to influence immersion: in our study, participants who actively attended to both physical and virtual game elements reported higher immersion levels than those who attended mainly or exclusively to virtual elements. Without an integrated mental map, in-game cognitive challenges were ignored or offloaded to motor response when possible in order to achieve the minimum required goals of the game. From our results we propose a model of immersion in mixed reality gaming that is useful for designers and researchers in this space.

Bonada, Santiago, Veras, Rafael, Collins, Christopher.  2016.  Personalized Views for Immersive Analytics. Proceedings of the 2016 ACM Companion on Interactive Surfaces and Spaces. :83–89.

In this paper we present work-in-progress toward a vision of personalized views of visual analytics interfaces in the context of collaborative analytics in immersive spaces. In particular, we are interested in the sense of immersion, responsiveness, and personalization afforded by gaze-based input. Through combining large screen visual analytics tools with eye-tracking, a collaborative visual analytics system can become egocentric while not disrupting the collaborative nature of the experience. We present a prototype system and several ideas for real-time personalization of views in visual analytics.

Luboschik, Martin, Berger, Philip, Staadt, Oliver.  2016.  On Spatial Perception Issues In Augmented Reality Based Immersive Analytics. Proceedings of the 2016 ACM Companion on Interactive Surfaces and Spaces. :47–53.

Among other domains, the field of immersive analytics makes use of Augmented Reality techniques to successfully support users in analyzing data. When displaying ubiquitous data integrated into everyday life, spatial immersion issues such as depth perception, data localization, and object relations become relevant. Although there is a variety of techniques to deal with these issues, they are difficult to apply if the examined data or the reference space are large and abstract. In this work, we discuss observed problems in such immersive analytics systems and the applicability of current countermeasures, to identify needs for action.