Synthetic vision system
Synthetic vision was developed by NASA and the U.S. Air Force in the late 1970s and 1980s in support of advanced cockpit research, and in the 1990s as part of the Aviation Safety Program. Development of the High Speed Transport (HST) fueled NASA research in the 1980s and 1990s. In the early 1980s, the USAF recognized the need to improve cockpit situation awareness to support the piloting of ever more complex aircraft, and pursued SVS (sometimes called pictorial format avionics) as an integrating technology for both manned and remotely piloted systems. NASA initiated industry involvement in early 2000 with major avionics manufacturers. Researchers such as E. Theunissen at Delft University of Technology in the Netherlands contributed substantially to the development of SVS technology.
Synthetic vision provides situational awareness to operators by using terrain, obstacle, geo-political, hydrological and other databases. A typical SVS application uses a set of databases stored on board the aircraft, an image-generator computer, and a display. The navigation solution is obtained through the use of GPS and inertial reference systems.
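As a rough illustration of how GPS and inertial sources might be combined into a single navigation solution, the sketch below applies a simple complementary filter that corrects a drifting inertial position toward the GPS fix each update cycle. The function name, weight, and values are hypothetical and not drawn from any particular avionics implementation.

```python
# Hypothetical sketch: blending GPS and inertial position estimates into a
# single navigation solution with a simple complementary filter.

def blend_nav(gps_pos, inertial_pos, gps_weight=0.02):
    """Nudge the drifting inertial estimate toward the GPS fix each cycle.

    gps_weight is the fraction of the GPS/inertial disagreement corrected
    per update; small values keep the solution smooth between GPS fixes.
    """
    return tuple(i + gps_weight * (g - i)
                 for g, i in zip(gps_pos, inertial_pos))

# One update step: the inertial solution has drifted 50 m east of the GPS fix.
gps = (1000.0, 2000.0, 1500.0)       # metres: (north, east, altitude)
inertial = (1000.0, 2050.0, 1500.0)
blended = blend_nav(gps, inertial)   # east component moves toward 2000.0
```

The image generator would then query the terrain database around the blended position to render the scene.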
Highway In The Sky (HITS), or Path-In-The-Sky, is often used to depict the projected path of the aircraft in perspective view. Pilots acquire an instantaneous understanding of the current as well as the future state of the aircraft with respect to terrain, towers, buildings and other environmental features.
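A minimal sketch of the perspective mapping behind such a display: a world-space path waypoint is rotated into the aircraft's heading frame and projected onto the screen with a pinhole camera model. This is illustrative only (yaw only; pitch, roll, and display calibration are omitted, and all names and numbers are assumptions).

```python
import math

def project_point(point, eye, heading_rad, focal_px=500.0):
    """Project a world point (north, east, up) into screen coordinates
    for a forward-looking perspective view. Returns None when the point
    is behind the eye point and therefore not drawn."""
    d_north, d_east, d_up = (p - e for p, e in zip(point, eye))
    # Rotate the offset into the aircraft frame: x forward, y right.
    fwd = d_north * math.cos(heading_rad) + d_east * math.sin(heading_rad)
    right = -d_north * math.sin(heading_rad) + d_east * math.cos(heading_rad)
    if fwd <= 0.0:
        return None
    return (focal_px * right / fwd,   # screen x: positive to the right
            focal_px * d_up / fwd)    # screen y: positive above the horizon

# A waypoint 1000 m ahead and 100 m right of the aircraft at the same
# altitude appears 50 px right of centre, on the horizon line.
symbol = project_point((1000.0, 100.0, 0.0), (0.0, 0.0, 0.0), heading_rad=0.0)
```

Drawing a series of such projected waypoints, connected as boxes or segments, produces the "highway" the pilot flies through.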
NASA also used synthetic vision for remotely piloted vehicles (RPVs), such as the Highly Maneuverable Aircraft Technology (HiMAT) testbed (see Sarrafian, 1984). According to the NASA report, the aircraft was flown by a pilot in a remote cockpit; control signals were uplinked from the flight controls in the remote cockpit on the ground to the aircraft, and aircraft telemetry was downlinked to the remote-cockpit displays (see photo). The remote cockpit could be configured with either nose camera video or a 3D synthetic vision display. SV was also used for simulations of the HiMAT. Sarrafian reports that the test pilots found the visual display comparable to the output of the camera on board the RPV.
Similar research continued in the U.S. military services and at universities around the world. In 1995-1996, North Carolina State University flew a 17.5%-scale F-18 RPV using Microsoft Flight Simulator to create the three-dimensional projected terrain environment. Recreational use of synthetic vision for RPVs substantially preceded this, however. Bruce Artwick introduced Flight Simulator in 1980, and, most directly, the RC Aerochopper RPV simulation used synthetic vision to help aspiring RC helicopter pilots learn to fly.
According to the "RC Aerochopper Owners Manual", published in 1986 by Ambrosia Microcomputer Products, Inc., the system included joystick flight controls that connected to an Amiga computer and display. The software included a three-dimensional terrain database for the ground as well as some man-made objects. This database was basic, representing the terrain with relatively few polygons by today's standards. The program simulated the dynamic three-dimensional position and attitude of the aircraft, using the terrain database to create a projected 3D perspective display. The realism of this RPV pilot-training display was enhanced by allowing the user to adjust the simulated control-system delays and other parameters.
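The adjustable control-delay idea can be pictured as a fixed-length FIFO queue between the stick and the simulated aircraft, as in the sketch below. The class name and frame counts are hypothetical, not the product's actual implementation.

```python
from collections import deque

class DelayedControl:
    """Delay control inputs by a fixed number of simulation frames,
    mimicking an adjustable control-system delay."""

    def __init__(self, delay_frames, neutral=0.0):
        # Pre-fill the queue so early reads return a neutral stick position.
        self.buffer = deque([neutral] * delay_frames)

    def step(self, stick_input):
        """Queue this frame's input; return the input from delay_frames ago."""
        self.buffer.append(stick_input)
        return self.buffer.popleft()

# With a 3-frame delay, the first stick deflection reaches the
# simulated aircraft only on the fourth frame.
ctl = DelayedControl(delay_frames=3)
outputs = [ctl.step(x) for x in [1.0, 1.0, 1.0, 1.0]]
# outputs == [0.0, 0.0, 0.0, 1.0]
```

Letting the user tune `delay_frames` reproduces the sluggish response of a real radio link and airframe, which is what made the trainer useful practice.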
In 2005, after years of research under NASA's "Turning Goals Into Reality" program, a synthetic vision system was installed on a Gulfstream V test aircraft as part of the GVSITE project. Much of the experience gained during that program led directly to the introduction of certified SVS on later aircraft.
The first FAA-certified application of a synthetic vision system (2009) was available as part of the Gulfstream PlaneView flight deck in the form of the Synthetic Vision-Primary Flight Display (SV-PFD), which replaces the traditional blue-over-brown artificial horizon with computer-generated terrain overlaid with normal PFD symbology. Since then, many newer glass-cockpit systems, such as the Garmin G1000 and the Rockwell Collins Pro Line Fusion, have offered synthetic terrain, and a number of lower-cost "experimental"-class avionics systems offer it as well. App developers such as ForeFlight, Garmin, and Hilton Software have developed synthetic vision systems for iPad and Android tablets.
Enhanced vision
Enhanced vision is a related technology that incorporates information from aircraft-based sensors (e.g., near-infrared cameras, millimeter wave radar) to provide vision in limited-visibility environments.
Night vision systems have been available to pilots of military aircraft for many years. More recently, business jets have added similar capabilities to enhance pilot situational awareness at night and in poor visibility due to weather or haze. The first civil certification of an enhanced vision system on an aircraft was pioneered by Gulfstream Aerospace using a Kollsman IR camera. Originally offered as an option on the Gulfstream V, it was made standard equipment in 2003 when the Gulfstream G550 was introduced, and it followed on the Gulfstream G450 and Gulfstream G650. As of 2009, Gulfstream had delivered over 500 aircraft with a certified EVS installed. Other aircraft OEMs followed, with EVS now available on some Bombardier and Dassault business jet products. Boeing has begun offering EVS on its line of Boeing Business Jets and is likely to include it as an option on the B787 and B737 MAX.
The Gulfstream EVS and later EVS II systems use an IR camera mounted in the aircraft's nose to project a raster image on the head-up display (HUD). The IR image on the HUD is conformal to the outside scene, meaning that objects detected by the IR camera are the same size as, and aligned with, objects outside the aircraft. Thus, in poor visibility, the pilot can view the IR camera image and transition seamlessly to the outside world as the aircraft gets closer.
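Conformality reduces to a fixed angle-to-pixel mapping: a feature at a given real-world angle off the boresight must be drawn at the pixel where that angle appears through the display. The sketch below shows the idea under assumed display parameters; the field of view, resolution, and function name are illustrative, not Gulfstream's specification.

```python
import math

def angle_to_hud_px(angle_deg, hud_fov_deg=30.0, hud_width_px=1024):
    """Map an azimuth angle off boresight to a HUD pixel column so that
    symbology overlays the outside scene 1:1 (conformal display).

    A tangent mapping matches a rectilinear display: equal pixel spacing
    corresponds to equal tangent increments, not equal angles.
    """
    half_fov = math.radians(hud_fov_deg / 2.0)
    # Pixels per unit of tangent, fixed by the display's field of view.
    scale = (hud_width_px / 2.0) / math.tan(half_fov)
    return scale * math.tan(math.radians(angle_deg))

# An object dead ahead maps to the display centre (offset 0); an object at
# the edge of the assumed 30-degree field of view maps to the display edge.
centre = angle_to_hud_px(0.0)
edge = angle_to_hud_px(15.0)
```

Keeping this mapping identical for the IR raster and the overlaid symbology is what lets camera-detected objects line up with their real-world counterparts.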
The advantage of EVS is that safety is enhanced in nearly all phases of flight, especially during approach and landing in limited visibility. A pilot on a stabilized approach is able to recognize the runway environment (lights, runway markings, etc.) earlier in preparation for touchdown. Obstacles such as terrain, structures, and vehicles or other aircraft on the runway that might not otherwise be seen are clearly visible on the IR image.
The FAA grants some additional operating minimums to aircraft equipped with certified enhanced vision systems, allowing Category I approaches to Category II minimums. Typically an operator is permitted to descend closer to the runway surface (as low as 100 ft) in poor visibility, improving the chances of spotting the runway environment prior to landing. Aircraft not equipped with such systems would not be allowed to descend as low and often would be required to execute a missed approach and fly to a suitable alternate airport.
Other sensor types have been flown for research purposes, including active and passive millimeter wave radar. In 2009, DARPA provided funding to develop "Sandblaster", a millimeter wave radar-based enhanced vision system installed on helicopters that enables the pilot to see and avoid obstacles in the landing area even when they are obscured by smoke, sand, or dust.
The combination of dissimilar sensor types, such as long wave IR, short wave IR, and millimeter wave radar, can help ensure that real-time video imagery of the outside scene is available to the pilot in all visibility conditions. For example, long wave IR sensor performance can be degraded in some types of large-droplet precipitation, where millimeter wave radar would be less affected.
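At its simplest, such multi-sensor blending can be pictured as a per-pixel weighted average whose weights shift with conditions. The sketch below is purely illustrative; real systems use considerably more sophisticated registration and fusion, and the function name, values, and weighting scheme are assumptions.

```python
def fuse_pixels(ir_value, mmw_value, ir_weight):
    """Blend one pixel from a long wave IR image with the corresponding
    millimetre-wave radar return.

    ir_weight would come from an assessment of conditions: lowered in
    heavy precipitation (where IR degrades), raised in clear air (where
    the IR image has better resolution).
    """
    ir_weight = max(0.0, min(1.0, ir_weight))   # clamp to [0, 1]
    return ir_weight * ir_value + (1.0 - ir_weight) * mmw_value

# Clear air: trust the sharper IR image more.
clear = fuse_pixels(200.0, 120.0, ir_weight=0.75)   # -> 180.0
# Heavy rain: shift weight toward the radar channel.
rain = fuse_pixels(200.0, 120.0, ir_weight=0.25)    # -> 140.0
```

Applying such a blend across every pixel, with weights driven by sensor self-assessment, keeps a usable image on the display as conditions change.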
Regulations and standards
RTCA DO-315B / EUROCAE ED-179B (September 2011) defines minimum aviation system performance standards for EVS, SVS, CVS and EFVS.
References
- Knox et al.: "Description of Path-In-The-Sky Contact Analog Piloting Display", NASA Technical Memorandum 74057, October 1977
- Sarrafian, S: "Simulator Evaluation of a Remotely Piloted Vehicle Lateral Landing Task Using a Visual Display", NASA Technical Memorandum 85903, August 1984 
- Stern, D: "RC Aerochopper Owners Manual", Ambrosia Microcomputer Products, Inc., 1986
- Theunissen et al.: "Guidance, Situation Awareness and Integrity Monitoring with an SVS+EVS", AIAA GNC Conference Proceedings, August 2005
- Way et al.: "Pictorial Format Display Evaluation", USAF AFWAL-TR-34-3036, May 1984