Immersion (virtual reality)
Immersion into virtual reality is a metaphorical use of the experience of submersion, applied to representation, fiction or simulation. Immersion can also be defined as the state of consciousness in which the awareness of physical self of a "visitor" (Maurice Benayoun) or "immersant" (Char Davies) is transformed by being surrounded by an engrossing, often artificial, environment. The term is used to describe partial or complete suspension of disbelief, enabling action or reaction to stimuli encountered in a virtual or artistic environment. The degree to which the virtual or artistic environment faithfully reproduces reality determines the degree of suspension of disbelief. The greater the suspension of disbelief, the greater the degree of presence achieved.
Types of immersion
- Tactical immersion
- Tactical immersion is experienced when performing tactile operations that involve skill. Players feel "in the zone" while perfecting actions that result in success.
- Strategic immersion
- Strategic immersion is more cerebral, and is associated with mental challenge. Chess players experience strategic immersion when choosing a correct solution among a broad array of possibilities.
- Narrative immersion
- Narrative immersion occurs when players become invested in a story, and is similar to what is experienced while reading a book or watching a movie.
Staffan Björk and Jussi Holopainen, in Patterns In Game Design, divide immersion into similar categories, but call them sensory-motoric immersion, cognitive immersion and emotional immersion, respectively. In addition to these, they add a new category:
- Spatial immersion
- Spatial immersion occurs when a player feels the simulated world is perceptually convincing. The player feels that he or she is really "there" and that a simulated world looks and feels "real".
Immersive virtual reality
Immersive virtual reality is a hypothetical future technology that today exists mostly in the form of virtual reality art projects. It consists of immersion in an artificial environment in which the user feels just as immersed as they usually feel in consensus reality.
Direct interaction of the nervous system
The most commonly considered method would be to induce the sensations that make up the virtual reality directly in the nervous system. In functionalism/conventional biology, we interact with consensus reality through the nervous system, so all sensory input arrives as nerve impulses. The user would receive inputs as artificially stimulated nerve impulses, while the system would capture the CNS outputs (natural nerve impulses) and process them, allowing the user to interact with the virtual reality. Natural impulses between the body and the central nervous system would need to be prevented. This could be done by nanorobots attached to the brain's wiring that block out natural impulses while injecting the digital impulses describing the virtual world. A feedback system between the user and the computer storing the information would also be needed. Considering how much information such a system would have to handle, it would likely be based on quantum computers.
- Understanding of the nervous system
A comprehensive understanding of which nerve impulses correspond to which sensations, and which motor impulses correspond to which muscle contractions, would be required. This would allow the correct sensations to be produced in the user, and the correct actions to occur in the virtual reality. The Blue Brain Project is currently the most promising research effort in this direction, aiming to understand how the brain works by building very-large-scale computer models.
- Ability to manipulate CNS
The nervous system itself would need to be manipulated. While non-invasive devices using radiation have been postulated, invasive cybernetic implants are likely to become available sooner and to be more accurate. Manipulation could occur at any stage of the nervous system; the spinal cord is likely the simplest site, since all nerves pass through it, so it could serve as the only site of manipulation. Molecular nanotechnology is likely to provide the degree of precision required, and could allow the implant to be built inside the body rather than inserted by an operation.
- Computer hardware/software to process inputs/outputs
A very powerful computer, probably (but not necessarily) a strong AI, would be required to process all the inputs from the CNS, run a simulation of a virtual reality approaching the complexity of consensus reality, and translate its events into a complete set of nerve impulses for the user. Strong artificial intelligence might also be required to write the program for a convincing alternate reality.
Immersive digital environments
Immersive digital environments could be thought of as synonymous with virtual reality, but without the implication that actual "reality" is being simulated. An immersive digital environment could be a model of reality, but it could also be a complete fantasy user interface or abstraction, as long as the user of the environment is immersed within it. The definition of immersion is wide and variable, but here it is assumed to mean simply that the user feels like part of the simulated "universe". How successfully an immersive digital environment can immerse the user depends on many factors, such as believable 3D computer graphics, surround sound, interactive user input, and other factors such as simplicity, functionality and potential for enjoyment. New technologies are currently under development which claim to bring realistic environmental effects to the player's environment, such as wind, seat vibration and ambient lighting.
To create a sense of full immersion, the five senses (sight, sound, touch, smell, taste) must perceive the digital environment to be physically real. Immersive technology can perceptually fool the senses through:
- Panoramic 3D displays (visual)
- Surround sound acoustics (auditory)
- Haptics and force feedback (tactile)
- Smell replication (olfactory)
- Taste replication (gustatory)
Once the senses reach a sufficient belief that the digital environment is real, the user must then be able to interact with the environment in a natural, intuitive manner. Various immersive technologies such as gestural controls, motion tracking, and computer vision respond to the user's actions and movements. Brain control interfaces (BCI) respond to the user's brainwave activity.
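As a rough illustration of the motion-tracking interaction described above, the sketch below (a minimal, hypothetical example, not taken from any particular product) shows the basic computation behind a head-tracked display: converting a tracked head orientation (yaw and pitch) into the camera view direction that the renderer would use.

```python
import math

def view_direction(yaw_deg, pitch_deg):
    """Convert a tracked head yaw/pitch (in degrees) into a unit view vector.

    Convention (an assumption for this sketch): yaw rotates about the
    vertical y-axis, pitch tilts about the x-axis, and (0, 0) looks
    straight ahead down the negative z-axis, as in typical 3D graphics.
    """
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    # Spherical-to-Cartesian conversion for the view direction.
    x = -math.sin(yaw) * math.cos(pitch)
    y = math.sin(pitch)
    z = -math.cos(yaw) * math.cos(pitch)
    return (x, y, z)
```

Each frame, a head-tracking system would feed the latest yaw/pitch readings into a function like this and re-render the scene from the resulting direction, which is what produces the 360-degree view described above.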
Examples and applications
Training and rehearsal simulations run the gamut from part-task procedural training (often "buttonology", for example: which button to push to deploy a refueling boom) through situational simulation (such as crisis response or convoy driver training) to full-motion simulations that train pilots, soldiers and law enforcement in scenarios too dangerous to practice with actual equipment and live ordnance.
Examples range from simple arcade computer games to massively multiplayer online games, and include training programs such as flight and driving simulators. Entertainment environments such as motion simulators immerse the riders/players in a virtual digital environment enhanced by motion, visual and aural cues. One motion simulator takes riders to the Virunga Mountains in Rwanda to meet a tribe of mountain gorillas; another is a ride that journeys through the arteries and heart to witness the build-up of plaque, teaching riders about cholesterol and health.
In parallel with scientists, artists like Knowbotic Research, Donna Cox, Rebecca Allen, Robbie Cooper, Maurice Benayoun, Char Davies, and Jeffrey Shaw use the potential of immersive virtual reality to create physiological or symbolic experiences and situations.
Other examples of immersion technology include physical environments / immersive spaces with surrounding digital projections and sound, such as the CAVE, and the use of head-mounted displays for viewing movies, with head tracking and computer control of the image presented, so that the viewer appears to be inside the scene. The next generation is VIRTSIM, which achieves total immersion through motion capture and wireless head-mounted displays for teams of up to thirteen immersants, enabling natural movement through space and interaction in both the virtual and physical space simultaneously.
Practical Applications at Ford Motor Company
The development process used at Ford Motor Company may seem unusual to anyone unfamiliar with virtual reality, but it is impressive nonetheless. The innovative methods used in Ford's virtual reality lab have even sparked the interest of NASA. The lab is led by technical expert Elizabeth Baron, who, along with her team, supports some of the most advanced car designs in the auto industry.
Ford Motor Company uses 3D imaging to help construct the interiors of its vehicles. Inside the iVE lab, Ford has a test car that supports the motion software. The test car is built to resemble an actual car, with doors, seats, and a steering wheel. It is fixed to a platform with built-in overhead cameras facing the driver. Through the software, engineers can develop a car's interior and display it through the virtual reality headset of the person sitting in the test car. The cameras facing the test car allow the user full visual mobility while looking at the vehicle's interior. The motion sensing detects when the user turns their head and gives them a 360-degree view. The appearance of the interior can be switched from a Mustang to an Escape in as little as five minutes, which allows easy comparison of concept designs.
The Cave Automatic Virtual Environment (CAVE) is also used by developers to examine both the exterior and interior of concept designs. While standing in the CAVE, users need 3D glasses fitted with motion sensors to control the view. Images developed in design software are projected onto the walls of the room to give users an idea of what future products are intended to look like. To change the view of the car from exterior to interior, the person wearing the 3D glasses simply sits in a chair positioned at the center of the room.
Although Ford has state-of-the-art equipment in its labs, engineers can duplicate the same visual experience with a mobile station. This allows Ford to give public demonstrations at events like the North American International Auto Show and at various universities across the country. Baron is a keynote speaker who has traveled with the mobile station to demonstrate the work being done with this technology. Ford's mobile station consists of some of the same components found in the onsite company lab: seats, pedals, and a steering wheel, which give the user an opportunity to physically grasp what they are seeing through the virtual reality headset. The mobile station also includes two top-of-the-line 3D televisions that display the same images as the headset, which is useful when demonstrating to larger audiences.
This software gives engineers real-world feedback on the overall appearance of future products. Engineers can make recommendations, for example suggesting that certain components be placed in a more convenient position. Being able to raise these ideas before production lets the company make full use of its resources. The innovative methods applied in the virtual reality lab have sped up production and reduced costs, ultimately making better products for customers. Virtual reality technology has advanced dramatically compared with five years ago, and it is hard to imagine what the next advancement will be.
Simulation sickness
Simulation sickness, or simulator sickness, is a condition in which a person exhibits symptoms similar to motion sickness caused by playing computer, simulation, or video games. (The makers of the Oculus Rift are working to solve simulator sickness.)
Motion sickness due to virtual reality is very similar to simulation sickness and to motion sickness due to films. In virtual reality, however, the effect is more acute because all external reference points are blocked from vision, the simulated images are three-dimensional, and in some cases stereo sound may also give a sense of motion. Studies have shown that exposure to rotational motions in a virtual environment can cause significant increases in nausea and other symptoms of motion sickness (So, R.H.Y. and Lo, W.T. (1999) "Cybersickness: An Experimental Study to Isolate the Effects of Rotational Scene Oscillations," Proceedings of IEEE Virtual Reality '99 Conference, March 13–17, 1999, Houston, Texas. IEEE Computer Society, pp. 237–241).
- Adams, Ernest (July 9, 2004). "Postmodernism and the Three Types of Immersion". Gamasutra. Retrieved 2007-12-26.
- Björk, Staffan; Jussi Holopainen (2004). Patterns In Game Design. Charles River Media. p. 206. ISBN 1-58450-354-8.
- Joseph Nechvatal, Immersive Ideals / Critical Distances. LAP Lambert Academic Publishing. 2009, pp. 367–368
- Joseph Nechvatal, Immersive Ideals / Critical Distances. LAP Lambert Academic Publishing. 2009, pp. 48–60
- Alternate reality game
- Cave automatic virtual environment
- Environmental sculpture
- Immersive design
- Immersive technology
- Interactive art
- Simulation sickness
- Motion sickness
- Neo-conceptual art
- Simulated reality
- Sound art
- Sound installation
- Video installation
- Virtual art
- Reyes, Stephanie. "Ford brings virtual reality presentation to UCF." September 2012. http://www.centralfloridafuture.com/news/ford-brings-virtual-reality-presentation-to-ucf-1.2767843
- Christiane Paul, Digital Art, Thames & Hudson Ltd.
- Oliver Grau, "Virtual Art: From Illusion to Immersion" MIT-Press, Cambridge 2003
- Timothy Murray, Derrick de Kerckhove, Oliver Grau, Kristine Stiles, Jean-Baptiste Barrière, Dominique Moulon, Maurice Benayoun Open Art, Nouvelles éditions Scala, 2011, French version, ISBN 978-2-35988-046-5
- Allen Varney, (August 8, 2006). "Immersion Unexplained" in "The Escapist"
- Frank Popper, "From Technological to Virtual Art", MIT Press. ISBN 0-262-16230-X.
- Oliver Grau (Ed.), Media Art Histories, MIT-Press, Cambridge 2007
- Joseph Nechvatal, "Immersive Excess in the Apse of Lascaux", Technonoetic Arts 3, no3. 2005
- Björk, Staffan; Jussi Holopainen (2004). Patterns In Game Design. Charles River Media. p. 423. ISBN 1-58450-354-8.
- Edward A. Shanken, Art and Electronic Media. London: Phaidon, 2009. ISBN 978-0-7148-4782-5
- Joseph Nechvatal Towards an Immersive Intelligence: Essays on the Work of Art in the Age of Computer Technology and Virtual Reality (1993-2006). Edgewise Press. New York, N.Y. 2009
- Joseph Nechvatal, Immersive Ideals / Critical Distances. LAP Lambert Academic Publishing. 2009
- Annual Summit on Immersive Technology
- PDF download of Joseph Nechvatal's book Immersive Ideals / Critical Distances. LAP Lambert Academic Publishing. 2009
- Audio and Game Immersion PhD thesis about game audio (the IEZA Framework) and immersion.
- "Improvising Synesthesia: Comprovisation of Generative Graphics and Music" by Joshua B. Mailman, in Leonardo Electronic Almanac v.19 no.3, Live Visuals, 2013, pp.352-84, about two immersive systems for improvising music and graphics through dance-like motion detected by an infrared video camera and other sensors.