Cave automatic virtual environment
A cave automatic virtual environment (better known by the recursive acronym CAVE) is an immersive virtual reality environment in which projectors are directed at between three and six of the walls of a room-sized cube. The name is also a reference to the allegory of the Cave in Plato's Republic, in which a philosopher contemplates perception, reality, and illusion.
General characteristics of the CAVE
The first CAVE was developed at the University of Illinois Chicago's Electronic Visualization Laboratory by Carolina Cruz-Neira, Daniel J. Sandin, David Pape, and a small team of graduate students. A CAVE is typically a video theater situated within a larger room. Its walls are usually rear-projection screens, although flat-panel displays are becoming more common; the floor can be a downward-projection screen, a bottom-projected screen, or a flat-panel display. The projection systems must be very high-resolution because viewers stand close to the screens, so very small pixels are needed to retain the illusion of reality.

The user wears 3D glasses inside the CAVE to see the 3D graphics it generates. People using the CAVE can see objects apparently floating in the air and can walk around them, getting a proper view of what they would look like in reality. Tracking was initially done with electromagnetic sensors but has largely moved to infrared cameras; the frames of early CAVEs had to be built from non-magnetic materials such as wood to minimize interference with the electromagnetic sensors, a limitation that infrared tracking removed. A user's movements are tracked by sensors, typically attached to the 3D glasses, and the video continually adjusts to retain the viewer's perspective. Computers control both this aspect of the CAVE and the audio: multiple speakers placed at multiple angles provide 3D sound to complement the 3D video.
A lifelike visual display is created by projectors positioned outside the CAVE and controlled by the physical movements of a user inside it. A motion capture system records the user's position in real time, and stereoscopic LCD shutter glasses convey a 3D image: the computers rapidly generate a pair of images, one for each of the user's eyes, based on the motion capture data, and the glasses are synchronized with the projectors so that each eye sees only the correct image. Because the projectors sit outside the cube, mirrors are often used to reduce the throw distance required between projector and screen. One or more computers drive the projectors; clusters of commodity desktop PCs are popular for this because they cost less and run faster than the specialized graphics systems previously required.
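The per-eye, viewer-centered perspective described above can be sketched as an off-axis (asymmetric) frustum computation: for each wall, the frustum is derived from the wall's corner positions and the tracked eye position. The following is a minimal illustration of this standard technique under assumed conventions; the function names, coordinate frames, and wall dimensions are our own, not any CAVE product's actual API.

```python
import numpy as np

def frustum_extents(pa, pb, pc, eye, near):
    """Off-axis frustum for one CAVE wall.  pa, pb, pc are the wall's
    lower-left, lower-right, and upper-left corners in tracker
    coordinates; eye is the tracked eye position.  Returns the
    (left, right, bottom, top) extents at the near plane, in the form
    expected by an asymmetric-frustum projection such as glFrustum."""
    pa, pb, pc, eye = (np.asarray(p, float) for p in (pa, pb, pc, eye))
    vr = (pb - pa) / np.linalg.norm(pb - pa)   # screen right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)   # screen up axis
    vn = np.cross(vr, vu)                      # screen normal, toward the eye
    va, vb, vc = pa - eye, pb - eye, pc - eye  # eye-to-corner vectors
    d = -va.dot(vn)                            # eye-to-screen distance
    return (vr.dot(va) * near / d, vr.dot(vb) * near / d,
            vu.dot(va) * near / d, vu.dot(vc) * near / d)

def stereo_eyes(head, right_axis, ipd=0.065):
    """Left/right eye positions from the tracked head position;
    65 mm is a typical interpupillary distance."""
    head, right_axis = np.asarray(head, float), np.asarray(right_axis, float)
    return head - right_axis * ipd / 2, head + right_axis * ipd / 2

# Front wall: a 2 m x 2 m screen one metre ahead of the tracker origin.
pa, pb, pc = (-1, -1, -1), (1, -1, -1), (-1, 1, -1)
eye_l, eye_r = stereo_eyes(head=(0.2, 0.0, 0.0), right_axis=(1, 0, 0))
print(frustum_extents(pa, pb, pc, eye_l, near=0.1))
print(frustum_extents(pa, pb, pc, eye_r, near=0.1))
```

Note how a centered eye yields a symmetric frustum, while an off-center eye yields an asymmetric one; recomputing these extents every frame for every wall and eye is what keeps the walls' images seamless as the viewer moves.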
Software and libraries designed specifically for CAVE applications are available, and there are several techniques for rendering the scene. Three scene graphs are in popular use today: OpenSG, OpenSceneGraph, and OpenGL Performer. OpenSG and OpenSceneGraph are open source, while OpenGL Performer is free but its source code is not included.
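To illustrate what a scene graph does, here is a toy sketch of hierarchical transform accumulation in Python. It mimics the concept shared by OpenSG, OpenSceneGraph, and OpenGL Performer (group nodes, local transforms, depth-first traversal) but is not the API of any of those libraries; all names are hypothetical.

```python
import numpy as np

class Node:
    """Minimal scene-graph node: a local 4x4 transform plus children."""
    def __init__(self, transform=None, geometry=None):
        self.transform = np.eye(4) if transform is None else np.asarray(transform, float)
        self.geometry = geometry          # e.g. a mesh name; None for pure group nodes
        self.children = []

def translate(x, y, z):
    m = np.eye(4)
    m[:3, 3] = (x, y, z)
    return m

def draw(node, parent_matrix=np.eye(4), out=None):
    """Depth-first traversal: accumulate transforms down the tree and
    emit (geometry, world_matrix) pairs, as a renderer would."""
    out = [] if out is None else out
    world = parent_matrix @ node.transform
    if node.geometry is not None:
        out.append((node.geometry, world))
    for child in node.children:
        draw(child, world, out)
    return out

# A car body with a wheel attached to it: the wheel's world position
# combines the car's transform with its own local offset.
root = Node()
car = Node(translate(5, 0, 0), geometry="car_body")
wheel = Node(translate(1, 0, -0.5), geometry="wheel")
car.children.append(wheel)
root.children.append(car)
for name, m in draw(root):
    print(name, m[:3, 3])
```

The value of this structure in a CAVE is that the renderer can re-traverse the same graph once per wall and per eye, changing only the camera, while application code edits the graph in one place.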
CAVELib is the original application programmer's interface (API) developed for the CAVE system created at the Electronic Visualization Laboratory at the University of Illinois Chicago. The software was commercialized in 1996 and further enhanced by Mechdyne Corporation. CAVELib is a low-level VR software package: it abstracts away window and viewport creation, viewer-centered perspective calculations, display to multiple graphics channels, multi-processing and multi-threading, cluster synchronization and data sharing, and stereoscopic viewing. Developers create all of the graphics for their environment, and CAVELib makes them display properly. The API is platform-independent, enabling developers to create high-end virtual reality applications on Windows and Linux (IRIX, Solaris, and HP-UX are no longer supported). CAVELib-based applications are externally configurable at run time, making an application executable independent of the display system.
Mechdyne's Conduit is a commercial software package that makes a small number of existing 3D OpenGL applications (such as CATIA V5, Pro/ENGINEER, and Unigraphics) work directly in a CAVE without any source code modification. Working like an OpenGL driver, it takes the model definitions of the existing application, streams them to a PC cluster, and changes the camera so that the viewpoint follows the tracking system.
EON Icube is a hardware and software package developed by EON Reality that uses PC-based technology to create a multi-sided immersive environment in which participants may be completely surrounded by virtual imagery and 3D sound. The Icube software supports edge blending and can create full quad-buffered stereo images in 3D.
TechViz's TechViz XL is a commercial software package that makes existing 3D OpenGL applications (such as Dassault's suite, Siemens PLM's suite, Creo, and Autodesk's suite) work directly in a CAVE, or in any other virtual or augmented reality system available today, without any source code modification. Working like an OpenGL driver, it takes all of the information from the existing application, streams it to a Windows or Linux PC or PC cluster, and changes the camera so that the viewpoint follows the tracking system. A user can then employ TechViz XL's tools to create a dynamic virtual session with the model. A virtual session can also be controlled directly from the user's application, with changes updating instantly in the virtual system; bi-directional associativity can be established if users wish to drive changes from both the system and the application.
VR Juggler is a suite of APIs designed to simplify the VR application development process. VR Juggler allows the programmer to write an application that will work with any VR display device, with any VR input devices, without changing any code or having to recompile the application. Juggler is used in over 100 CAVEs worldwide.
CoVE is a suite of APIs designed to enable the creation of reusable VR applications. CoVE provides programmers with an API to develop multi-user, multi-tasking, collaborative, cluster-ready applications with rich 2D interfaces using an immersive window manager and windowing API to provide windows, menus, buttons, and other common widgets within the VR system. CoVE also supports running X11 applications within the VR environment.
Equalizer is an open-source rendering framework and resource management system for multipipe applications, ranging from single-pipe workstations to VR installations. Equalizer provides an API for writing parallel, scalable visualization applications that are configured at run time by a resource server.
Syzygy is a freely distributed grid operating system for PC-cluster virtual reality, tele-collaboration, and multimedia supercomputing, developed by the Integrated Systems Laboratory at the Beckman Institute of the University of Illinois at Urbana–Champaign. This middleware runs on Mac OS, Linux, Windows, and IRIX. C++, OpenGL, and Python applications (as well as other regular computer applications) can run on it and be distributed for VR.
Avango is a framework for building distributed virtual reality applications. It provides a field/fieldcontainer-based application layer similar to VRML. Within this layer a scene graph, based on OpenGL Performer, input sensors, and output actuators are implemented as runtime loadable modules (or plugins). A network layer provides automatic replication/distribution of the application graph using a reliable multi-cast system. Applications in Avango are written in Scheme and run in the scripting layer. The scripting layer provides complete access to fieldcontainers and their fields; this way distributed collaborative scenarios as well as render-distributed applications (or even both at the same time) are supported. Avango was originally developed at the VR group at GMD, now Virtual Environments Group at Fraunhofer IAIS and was open-sourced in 2004.
CaveUT is an open source mutator for Unreal Tournament 2004. Developed by PublicVR, CaveUT leverages existing gaming technologies to create a CAVE environment. By using Unreal Tournament's spectator function CaveUT can position virtual viewpoints around the player's "head". Each viewpoint is a separate client that, when projected on a wall, gives the illusion of a 3D environment.
Quest3D is a real-time 3D engine and development platform suitable for CAVE implementations.
Vrui (Virtual Reality User Interface) is a development toolkit that handles real-time rendering, head tracking, etc. in multi-display environments such as the CAVE. 3DVisualizer, LidarViewer, and several other software packages were developed using Vrui to provide visualization tools for specific data types. These tools have been publicly released with continuing development by the Keck Center for Active Visualization in Earth Sciences. Oliver Kreylos maintains Vrui documentation and source code on his website.
inVRs is a framework that provides a clearly structured approach to designing highly interactive and responsive virtual environments (VEs) and networked virtual environments (NVEs). It is developed following open-source principles (LGPL) and is easy to use with CAVEs and a variety of input devices.
VR4MAX is a package for real-time 3D rendering and development of interactive 3D models and simulators based on Autodesk 3ds Max content. VR4MAX Extreme supports multi-projection for CAVE implementations and provides extensive tracking support.
libGlass is a general-purpose distributed computing library that has been used extensively in distributed computer graphics applications. Many applications have been run with it in a five-sided CAVE, for example an astronomy application, an arcade-style flight simulator, and OpenGL demos.
P3D VirtualSight is a software solution designed to provide an immersive, photorealistic 3D experience of digital aspect mockups at 1:1 scale. P3D VirtualSight supports multiple stereoscopic display modes, can be interfaced with various tracking systems, and can power configurations such as multi-screen devices, image walls based on juxtaposed projections, CAVE systems, and head-mounted displays.
Vizard is a multi-purpose virtual reality development platform by WorldViz for building, rendering, and deploying 3D visualization and simulation applications in stereoscopic multi-display environments such as the CAVE. The software lets users control 3D content, CAD workflows, rendering clusters, visual displays, motion tracking, and user interaction from a single platform. A joint solution with SensoMotoric Instruments also allows eye tracking to be incorporated.
Quazar3D Immersive is a commercial software package for building and managing immersive digital environments including CAVEs, power walls, and cylindrical projection systems. Its key feature is a management console for easy configuration of the whole rendering cluster. Features such as VRPN, quad-buffered stereo, hardware and software synchronization, and off-axis stereo for planar and cylindrical projections are supported.
Dice by Immersion is an acronym for Digital Immersive and Compact Environment. It is an affordable, premium turnkey CAVE-type solution developed by Immersion SAS (France), including hardware (screens, mechanics, projectors, tracking, workstation...), a software suite (MiddleVR and Unity), and services (a three-year warranty covering parts, labour, and consumables).
3D Virtual Spaces by Satavision are CAVE-type solutions including both hardware and software developed by Satavision Ltd. They are built to customer-specific requirements, and the content the customer wishes to use is converted into CAVE-compatible stereoscopic content. These spaces serve multiple purposes: as tools for planning, research, or marketing, in educational settings, or as an effective way to increase sales.
VisCube by Visbox is a line of affordable, high-performance CAVE systems that fit within existing spaces, eliminating time-consuming and costly room modifications. VisCube CAVE systems are available as either standalone displays or turn-key VR systems with tracking and software.
vrCluster by Pixela Labs is a plugin for Unreal Engine 4 that launches applications into stereoscopic multi-screen environments. It supports VRPN tracking, OpenGL quad-buffer stereo, vsync, gsync, and nvswapsync, and is easily configurable via a GUI toolset. Source code is available on GitHub: https://github.com/vrCluster/vrCluster
To create an image that is not distorted or out of place, the displays and sensors must be calibrated. The calibration process depends on the motion capture technology being used. Optical or inertial-acoustic systems only require configuring the origin and the axes used by the tracking system. Calibration of electromagnetic sensors (like those used in the first CAVE) is more complex. In that case a person puts on the special glasses needed to see the images in 3D, and the projectors fill the CAVE with many one-inch boxes set one foot apart. The person then takes an instrument called an ultrasonic measurement device, which has a cursor in the middle of it, and positions the device so that the cursor is visually in line with each projected box. This process can continue until almost 400 different boxes have been measured; each time the cursor is placed inside a box, a computer program records its location and sends the location to another computer. If the points are calibrated accurately, there should be no distortion in the images projected in the CAVE. Calibration also lets the CAVE correctly identify where the user is located and precisely track their movements, allowing the projectors to display images based on where the person is inside the CAVE.
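The grid-measurement procedure above amounts to collecting pairs of known and reported positions and fitting a correction between them. As a simplified sketch (real electromagnetic-tracker calibration typically uses a nonlinear or piecewise correction over the measured grid), a least-squares affine fit over the sampled points might look like this; the function name and the synthetic distortion are illustrative only.

```python
import numpy as np

def fit_affine_correction(measured, true):
    """Least-squares affine map (3x3 matrix A plus offset t) such that
    A @ m + t approximates the true position for each measured point m.
    `measured` and `true` are (N, 3) arrays of corresponding points:
    the distorted positions reported by the tracker and the known grid
    positions at which they were sampled."""
    measured, true = np.asarray(measured, float), np.asarray(true, float)
    # Homogeneous design matrix: one row [x, y, z, 1] per sample point.
    X = np.hstack([measured, np.ones((len(measured), 1))])
    coeffs, *_ = np.linalg.lstsq(X, true, rcond=None)
    return coeffs[:3].T, coeffs[3]        # A is 3x3, t is a 3-vector

# Synthetic example: a tracker that scales and shifts its readings.
rng = np.random.default_rng(0)
true = rng.uniform(-1, 1, (400, 3))                   # ~400 grid points
measured = true @ np.diag([1.02, 0.97, 1.01]) + [0.05, -0.02, 0.0]
A, t = fit_affine_correction(measured, true)
corrected = measured @ A.T + t
print(np.abs(corrected - true).max())   # residual error after correction
```

Once fitted, the correction is applied to every tracker reading at run time, so the rendered viewpoint matches where the user actually stands.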
The concept of the original CAVE has been reapplied and is currently used in a variety of fields, and many universities own CAVE systems. Many engineering companies use CAVEs to enhance product development: prototypes of parts can be created and tested, interfaces can be developed, and factory layouts can be simulated, all before spending any money on physical parts. This gives engineers a better idea of how a part will behave within the product as a whole. CAVEs are also increasingly used for collaborative planning in the construction sector.
The EVL team at UIC released the CAVE2 in October 2012. Similar to the original CAVE, it is a 3D immersive environment but is based on LCD panels rather than projection.