Augmented reality

From Wikipedia, the free encyclopedia
Revision as of 15:16, 19 November 2010

Wikitude World Browser on the iPhone 3GS uses GPS and a solid state compass
AR Tower Defense game on the Nokia N95 smartphone (Symbian OS) uses fiduciary markers

Augmented reality (AR) is a term for a live direct or indirect view of a physical, real-world environment whose elements are augmented by virtual, computer-generated sensory input such as sound or graphics. It is related to a more general concept called mediated reality, in which a view of reality is modified (possibly even diminished rather than augmented) by a computer. As a result, the technology functions by enhancing one's current perception of reality.

In the case of augmented reality, the augmentation is conventionally in real time and in semantic context with environmental elements, such as sports scores on TV during a match. With the help of advanced AR technology (e.g. adding computer vision and object recognition), information about the user's surrounding real world becomes interactive and digitally usable. Artificial information about the environment and the objects in it can be stored and retrieved as an information layer on top of the real-world view. The term augmented reality is believed to have been coined in 1990 by Thomas Caudell, an employee of Boeing at the time.[1]

Augmented reality research explores the application of computer-generated imagery in live video streams as a way to expand the real world. Advanced research includes the use of head-mounted displays and virtual retinal displays for visualization purposes, and the construction of controlled environments containing any number of sensors and actuators.

Definition

There are two commonly accepted definitions of augmented reality today. One was given by Ronald Azuma in 1997.[2] Azuma's definition says that Augmented Reality

  • combines real and virtual
  • is interactive in real time
  • is registered in 3D

Additionally, Paul Milgram and Fumio Kishino defined Milgram's Reality-Virtuality Continuum in 1994.[3] They describe a continuum that spans from the real environment to a purely virtual environment. In between lie Augmented Reality (closer to the real environment) and Augmented Virtuality (closer to the virtual environment).

Milgram's Continuum
Mann's Continuum
Mediated Reality continuum showing four points: Augmented Reality, Augmented Virtuality, Mediated Reality, and Mediated Virtuality on the Virtuality and Mediality axes

This continuum has been extended into a two-dimensional plane of "Virtuality" and "Mediality".[4] In this taxonomy of reality, virtuality, and mediality, the origin R denotes unmodified reality. A continuum across the Virtuality axis V includes reality augmented with graphics (Augmented Reality) as well as graphics augmented by reality (Augmented Virtuality). The taxonomy also includes modification of reality, virtuality, or any combination of the two, denoted by moving up the Mediality axis. Further up this axis we find mediated reality, mediated virtuality, or any combination of these; further up and to the right are virtual worlds that respond to a severely modified version of reality. Mediated reality generalizes concepts such as mixed reality: it includes the virtuality-reality continuum (mixing), but, in addition to additive effects, also includes multiplicative effects (modulation), as in (sometimes deliberately) diminished reality. More generally, it considers that reality may be modified in various ways; the mediated reality framework describes devices that deliberately modify reality as well as devices that modify it accidentally.

More recently, the term augmented reality has been blurred a bit due to the increased interest of the general public in AR.

Examples

Commonly known examples of AR are the yellow "first down" lines seen in television broadcasts of American football games using the 1st & Ten system, and the colored trail showing location and direction of the puck in TV broadcasts of ice hockey games. The real-world elements are the football field and players, and the virtual element is the yellow line, which is drawn over the image by computers in real time. Similarly, rugby fields and cricket pitches are branded by their sponsors using Augmented Reality; giant logos are inserted onto the fields when viewed on television. In some cases, the modification of reality goes beyond mere augmentation. For example, advertisements may be blocked out (partially or wholly diminished) and replaced with different advertisements. Such replacement is an example of Mediated reality, a more general concept than AR.

Television broadcasts of swimming events also often include a virtual line indicating the position of a swimmer keeping pace with the current world record.

Another type of AR application uses projectors and screens to insert objects into the real environment, enhancing museum exhibitions for example. The difference from a simple TV screen is that these objects are related to the environment of the screen or display, and that they are often interactive as well.

Many first-person shooter video games simulate the viewpoint of someone using AR systems. In these games the AR can be used to give visual directions to a location, mark the direction and distance of another person who is not in line of sight, give information about equipment such as remaining bullets in a gun, and display a myriad of other images based on whatever the game designers intend. Such an interface is also called a head-up display.

In current applications such as cars and airplanes, this usually takes the form of a head-up display integrated into the windshield.

The F-35 Lightning II has no head-up display; instead, all targets tracked by the aircraft's situational awareness and sensor fusion systems are presented in the pilot's helmet-mounted display. The helmet-mounted display provides an augmented reality system that allows the pilot to look through his own aircraft as if it were not there.[5]

History

  • 1957-62: Morton Heilig, a cinematographer, creates and patents a simulator called Sensorama with visuals, sound, vibration, and smell.[6]
  • 1966: Ivan Sutherland invents the head-mounted display suggesting it was a window into a virtual world.
  • 1975: Myron Krueger creates Videoplace that allows users to interact with virtual objects for the first time.
  • 1989: Jaron Lanier coins the phrase Virtual Reality and creates the first commercial business around virtual worlds.
  • 1992: Tom Caudell coins the phrase Augmented Reality while at Boeing helping workers assemble cables into aircraft.
  • 1992: L.B. Rosenberg develops one of the first functioning AR systems, called VIRTUAL FIXTURES, at the U.S. Air Force Armstrong Labs, and demonstrates benefit on human performance.[7][8]
  • 1992: Steven Feiner, Blair MacIntyre and Doree Seligmann present first major paper on an AR system prototype, KARMA, at the Graphics Interface conference. Widely cited version of the paper is published in Communications of the ACM next year.
  • 1994: Julie Martin creates the first Augmented Reality theater production, Dancing In Cyberspace, funded by the Australian Government's Australia Council for the Arts. It featured dancers and acrobats manipulating body-sized virtual objects in real time, projected into the same physical space and performance plane; the acrobats appeared immersed within the virtual objects and environments. The naked-eye installation was powered by Silicon Graphics computers and a Polhemus sensing system.
  • 1999: Hirokazu Kato (加藤 博一) creates ARToolKit at HITLab, where it is further developed by other HITLab scientists; it is demonstrated at SIGGRAPH that year.
  • 2000: Bruce H. Thomas develops ARQuake, the first outdoor mobile AR game, which is demonstrated at the International Symposium on Wearable Computers.
  • 2008: Wikitude AR Travel Guide launches on Oct. 20, 2008 with the G1 Android phone.
  • 2009: Wikitude Drive - AR Navigation System launches on Oct. 28, 2009 for the Android platform.
  • 2009: AR Toolkit is ported to Adobe Flash (FLARToolkit) by Saqoosha, bringing augmented reality to the web browser.

Technology

Hardware

The main hardware components for augmented reality are: display, tracking, input devices, and computer. A combination of a powerful CPU, camera, accelerometers, GPS, and solid-state compass is often present in modern smartphones, which makes them promising platforms for augmented reality.

Display

There are three major display techniques for Augmented Reality:

  1. Head Mounted Displays
  2. Handheld Displays
  3. Spatial Displays
Head Mounted Displays

A Head Mounted Display (HMD) places images of both the physical world and registered virtual graphical objects over the user's view of the world. HMDs are either optical see-through or video see-through in nature. An optical see-through display employs half-silvered mirror technology to allow views of the physical world to pass through the lens while graphical overlay information is reflected into the user's eyes. The HMD must be tracked with a six-degrees-of-freedom sensor; this tracking allows the computing system to register the virtual information to the physical world. The main advantage of HMD AR is the immersive experience for the user, as the graphical information is slaved to the user's view. The most common products employed are the MicroVision Nomad, Sony Glasstron, Vuzix AR[9] and I/O Displays.

Handheld Displays

Handheld Augmented Reality employs a small computing device with a display that fits in a user's hand. All handheld AR solutions to date have employed video see-through techniques to overlay graphical information on the physical world. Initially, handheld AR relied on sensors such as digital compasses and GPS units for six-degrees-of-freedom tracking; this gave way to fiducial marker systems such as ARToolKit, and today vision-based systems such as SLAM and PTAM are being employed for tracking. Handheld AR promises to be the first commercial success for AR technologies. Its two main advantages are the portable nature of handheld devices and the ubiquity of camera phones.

Spatial Displays

Instead of the user wearing or carrying the display, as with head mounted displays or handheld devices, Spatial Augmented Reality (SAR) makes use of digital projectors to display graphical information on physical objects. The key difference of SAR is that the display is separated from the users of the system. Because the displays are not associated with each user, SAR scales naturally to groups of users, allowing collocated collaboration. SAR has several advantages over traditional head mounted displays and handheld devices. The user is not required to carry equipment or wear the display over their eyes, which makes spatial AR a good candidate for collaborative work, as the users can see each other's faces; a system can also be used by multiple people at the same time without each having to wear a head mounted display. Spatial AR does not suffer from the limited display resolution of current head mounted displays and portable devices: a projector-based display system can simply incorporate more projectors to expand the display area. Where portable devices have only a small window into the world for drawing, a SAR system can display on any number of surfaces in an indoor setting at once. The tangible nature of SAR makes it an ideal technology to support design, as it provides both graphical visualisation and passive haptic sensation for the end users: people are able to touch physical objects, and it is this process that provides the passive haptic sensation.[2][10][11][12]

Tracking

Modern mobile augmented reality systems use one or more of the following tracking technologies: digital cameras and/or other optical sensors, accelerometers, GPS, gyroscopes, solid state compasses, RFID, and wireless sensors. Each of these technologies offers a different level of accuracy and precision. Most important is tracking the pose and position of the user's head, so that the user's view can be augmented. The user's hand(s), or a handheld input device, can also be tracked to provide a 6DOF interaction technique. Stationary systems can employ 6DOF tracking systems such as Polhemus, ViCON, A.R.T, or Ascension.
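The six-degrees-of-freedom pose such trackers report is commonly represented as a rigid transform: a 3x3 rotation plus a 3-vector translation. A minimal illustrative sketch in Python/NumPy (not tied to any particular tracker's API) shows how a tracked head pose lets the system express a world-space point in the viewer's frame:

```python
import numpy as np

def pose_matrix(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous rigid transform from a 3x3 rotation
    and a 3-vector translation (one 6DOF tracker sample)."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def world_to_head(point_world: np.ndarray, head_pose: np.ndarray) -> np.ndarray:
    """Express a world-space point in the tracked head's frame by
    applying the inverse of the head pose."""
    p = np.append(point_world, 1.0)  # homogeneous coordinates
    return (np.linalg.inv(head_pose) @ p)[:3]

# A head sample: 90-degree yaw (rotation about the vertical axis),
# positioned at (1, 0, 0) in world coordinates.
yaw = np.array([[0.0, -1.0, 0.0],
                [1.0,  0.0, 0.0],
                [0.0,  0.0, 1.0]])
pose = pose_matrix(yaw, np.array([1.0, 0.0, 0.0]))
print(world_to_head(np.array([2.0, 0.0, 0.0]), pose))  # → [ 0. -1.  0.]
```

Real trackers additionally filter and predict these samples to hide latency, but the registration step reduces to transforms of this shape.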

Input devices

Input for augmented reality is a current open research question. Some systems, such as the Tinmith system, employ pinch-glove techniques. Another common technique is a wand with a button on it. In the case of a smartphone, the phone itself can be used as a 3D pointing device, with its 3D position recovered from the camera images.

Computer

Camera-based systems require a powerful CPU and a considerable amount of RAM to process camera images. Wearable computing systems employ a laptop in a backpack configuration, while stationary systems can use a traditional workstation with a powerful graphics card. Sound processing hardware can also be included in augmented reality systems.

Software

To merge real-world camera images and virtual 3D imagery consistently, virtual images must be attached to real-world locations in a visually realistic way. This means that a real-world coordinate system, independent of the camera, must be recovered from the camera images. That process is called image registration and is part of Azuma's definition of augmented reality.
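Once such a coordinate system is recovered, registering a virtual object amounts to projecting its world coordinates into the camera image. A pinhole-camera sketch in NumPy (the focal lengths and principal point below are hypothetical intrinsics, and the camera pose R, t is assumed already known):

```python
import numpy as np

def project(point_world, R, t, fx, fy, cx, cy):
    """Project a 3D world point into pixel coordinates with the pinhole
    model: camera coords = R @ X + t, then perspective divide and
    scaling by the focal lengths (fx, fy) about the principal point."""
    Xc = R @ point_world + t          # world -> camera frame
    u = fx * Xc[0] / Xc[2] + cx       # perspective projection
    v = fy * Xc[1] / Xc[2] + cy
    return np.array([u, v])

# Identity camera pose; a point 2 m straight ahead lands at the
# principal point (the image centre for these hypothetical intrinsics).
R = np.eye(3)
t = np.zeros(3)
print(project(np.array([0.0, 0.0, 2.0]), R, t,
              fx=800.0, fy=800.0, cx=320.0, cy=240.0))  # → [320. 240.]
```

A virtual graphic drawn at the returned pixel coordinates appears attached to that real-world location as long as R and t track the camera.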

Augmented reality image registration uses different methods of computer vision, mostly related to video tracking. Many computer vision methods for augmented reality are inherited from similar visual odometry methods.

Usually these methods consist of two stages. In the first stage, interest points, fiducial markers, or optical flow are detected in the camera images. This stage can use feature detection methods such as corner detection, blob detection, edge detection, or thresholding, and/or other image processing methods.
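As an illustration of the first stage, a corner detector can be sketched in a few lines of NumPy. This follows the classic Harris formulation (response R = det(M) - k·trace(M)², where M is the structure tensor of the image gradients); the 3x3 window and k value are illustrative choices, not a production detector:

```python
import numpy as np

def harris_response(img: np.ndarray, k: float = 0.05) -> np.ndarray:
    """Harris corner response: positive at corners, negative along
    edges, near zero in flat regions."""
    Iy, Ix = np.gradient(img.astype(float))      # image gradients
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    def box3(a):  # sum over a 3x3 neighbourhood (zero-padded borders)
        p = np.pad(a, 1)
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3))

    Sxx, Syy, Sxy = box3(Ixx), box3(Iyy), box3(Ixy)
    det = Sxx * Syy - Sxy * Sxy                  # det of structure tensor
    trace = Sxx + Syy
    return det - k * trace ** 2

# A white square on black: the strongest responses lie at its corners.
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0
R = harris_response(img)
print(np.unravel_index(np.argmax(R), R.shape))
```

Interest points found this way (or marker outlines, for fiducial systems) are the measurements fed to the second, pose-recovery stage.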

In the second stage, a real-world coordinate system is restored from the data obtained in the first stage. Some methods assume that objects with known 3D geometry (or fiducial markers) are present in the scene and make use of that data; in some of these cases, the entire 3D structure of the scene must be precalculated beforehand. If part of the scene is unknown, simultaneous localization and mapping (SLAM) can be used to map the relative positions of fiducial markers or 3D models. If no assumption about the scene's 3D geometry is made, structure-from-motion methods are used. Methods used in the second stage include projective (epipolar) geometry, bundle adjustment, rotation representation with the exponential map, and Kalman and particle filters.
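The exponential-map rotation representation mentioned above encodes a rotation as a single 3-vector whose direction is the rotation axis and whose norm is the angle; Rodrigues' formula converts it to a rotation matrix. A small NumPy sketch (illustrative of the representation, not of any particular tracking library):

```python
import numpy as np

def exp_map(w: np.ndarray) -> np.ndarray:
    """Rodrigues' formula: convert an axis-angle vector w (direction =
    rotation axis, norm = angle in radians) to a 3x3 rotation matrix."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)                 # zero vector -> identity rotation
    k = w / theta                        # unit rotation axis
    K = np.array([[0.0, -k[2], k[1]],    # cross-product (skew) matrix
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

# A quarter turn about the z-axis maps the x-axis onto the y-axis.
R = exp_map(np.array([0.0, 0.0, np.pi / 2]))
print(np.round(R @ np.array([1.0, 0.0, 0.0]), 6))  # → [0. 1. 0.]
```

This 3-parameter form is popular in bundle adjustment and filtering because it is minimal and free of the constraints a full 3x3 matrix must satisfy.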

Applications

Current applications

Advertising: Marketers have started to use AR to promote products via interactive AR applications. For example, at the 2008 LA Auto Show, Nissan unveiled the concept vehicle Cube and presented visitors with a brochure which, when held against a webcam, showed several versions of the vehicle.[13] In August 2009, Best Buy ran a circular with an augmented reality code that allowed users with a webcam to interact with the product in 3D.[14] In 2010, Walt Disney used mobile augmented reality to connect a movie experience to outdoor advertising.

Support with complex tasks: Complex tasks such as assembly, maintenance, and surgery can be simplified by inserting additional information into the field of view. For example, labels can be displayed on parts of a system to clarify operating instructions for a mechanic performing maintenance on the system.[15] AR can include images of hidden objects, which can be particularly effective for medical diagnostics or surgery. Examples include a virtual X-ray view based on prior tomography or on real-time images from ultrasound and microconfocal probes[16] or open NMR devices. A doctor could even observe the fetus inside the mother's womb.[17] See also Mixed reality.

Navigation devices: AR can augment the effectiveness of navigation devices for a variety of applications. For example, building navigation can be enhanced to support the maintenance of industrial plants, and outdoor navigation can be augmented for military operations or disaster management. Head-up displays or personal display glasses in automobiles can provide navigation hints and traffic information. Such displays can be useful for airplane pilots, too: head-up displays in fighter jets were among the first AR applications, offering full interactivity, including eye pointing.

Industrial Applications: AR can be used to compare the data of digital mock-ups with physical mock-ups for efficiently finding discrepancies between the two sources. It can further be employed to safeguard digital data in combination with existing real prototypes, and thus save or minimize the building of real prototypes and improve the quality of the final product.

Military and emergency services: AR can be applied to military and emergency services as wearable systems to provide information such as instructions, maps, enemy locations, and fire cells.

Prospecting: In the fields of hydrology, ecology, and geology, AR can be used to display an interactive analysis of terrain characteristics. Users could use, and collaboratively modify and analyze, interactive three-dimensional maps.[citation needed]

Art: AR can be incorporated into artistic applications that allow artists to create art in real time over reality such as painting, drawing, modeling, etc. One such example of this phenomenon is called Eyewriter that was developed in 2009 by Zachary Lieberman and a group formed by members of Free Art and Technology (FAT), OpenFrameworks and the Graffiti Research Lab to help a graffiti artist, who became paralyzed, draw again.[18]

Architecture: AR can be employed to simulate planned construction projects.[19]

Sightseeing: Models may be created to include labels or text related to the objects/places visited. With AR, users can rebuild ruins, buildings, or even landscapes as they previously existed.[20]

Collaboration: AR can help facilitate collaboration among distributed team members via conferences with real and virtual participants. The Hand of God is a good example of a collaboration system.[21] See also Mixed reality.

Entertainment and education: AR can be used in the fields of entertainment and education to create virtual objects in museums and exhibitions, theme park attractions (such as Cadbury World), games (such as ARQuake), and books.[22] See also Mixed reality.

Music: Pop group Duran Duran included interactive AR projections into their stage show during their 2000 Pop Trash concert tour.[23] Sydney band Lost Valentinos launched the world's first interactive AR music video on 16 October 2009, where users could print out 5 markers representing a pre-recorded performance from each band member which they could interact with live and in real-time via their computer webcam and record as their own unique music video clips to share via YouTube.[24][25]

Future applications

Augmented reality is costly to develop, so the future of AR depends on whether those costs can be reduced. If AR technology becomes affordable, it could be very widespread, but for now major industries are the main buyers with the opportunity to utilize this resource.

  • Expanding a PC screen into the real environment: program windows and icons appear as virtual devices in real space and are eye or gesture operated, by gazing or pointing. A single personal display (glasses) could concurrently simulate a hundred conventional PC screens or application windows all around a user
  • Virtual devices of all kinds, e.g. replacement of traditional screens, control panels, and entirely new applications impossible in "real" hardware, like 3D objects interactively changing their shape and appearance based on the current task or need.
  • Enhanced media applications, like pseudo holographic virtual screens, virtual surround cinema, virtual 'holodecks' (allowing computer-generated imagery to interact with live entertainers and audience)
  • Virtual conferences in "holodeck" style
  • Replacement of cellphone and car navigator screens: eye-dialing, insertion of information directly into the environment, e.g. guiding lines directly on the road, as well as enhancements like "X-ray"-views
  • Virtual plants, wallpapers, panoramic views, artwork, decorations, illumination etc., enhancing everyday life. For example, a virtual window could be displayed on a regular wall showing a live feed of a camera placed on the exterior of the building, thus allowing the user to effectually toggle a wall's transparency
  • With AR systems getting into mass market, we may see virtual window dressings, posters, traffic signs, Christmas decorations, advertisement towers and more. These may be fully interactive even at a distance, by eye pointing for example.
  • Virtual gadgetry becomes possible. Any physical device currently produced to assist in data-oriented tasks (such as the clock, radio, PC, arrival/departure board at an airport, stock ticker, PDA, PMP, informational posters/fliers/billboards, in-car navigation systems, etc.) could be replaced by virtual devices that cost nothing to produce aside from the cost of writing the software. Examples might be a virtual wall clock, a to-do list for the day docked by your bed for you to look at first thing in the morning, etc.
  • Subscribable group-specific AR feeds. For example, a manager on a construction site could create and dock instructions including diagrams in specific locations on the site. The workers could refer to this feed of AR items as they work. Another example could be patrons at a public event subscribing to a feed of direction and information oriented AR items.
  • AR systems, combined with text-to-speech software, can help the visually impaired navigate much more effectively.
  • Computer games which make use of position and environment information to place virtual objects, opponents, and weapons overlaid in the player's visual field.

Notable researchers

Conferences

  • 1st International Workshop on Augmented Reality, IWAR'98, San Francisco, Nov. 1998.
  • 2nd International Workshop on Augmented Reality (IWAR'99), San Francisco, Oct. 1999.
  • 1st International Symposium on Mixed Reality (ISMR'99), Yokohama, Japan, March 1999.
  • 2nd International Symposium on Mixed Reality (ISMR'01), Yokohama, Japan, March 2001.
  • 1st International Symposium on Augmented Reality (ISAR 2000), Munich, Oct. 2000.
  • 2nd International Symposium on Augmented Reality (ISAR 2001), New York, Oct. 2001.
  • 1st International Symposium on Mixed and Augmented Reality (ISMAR 2002), Darmstadt, Oct. 2002.
  • 2nd International Symposium on Mixed and Augmented Reality (ISMAR 2003), Tokyo, Oct. 2003.
  • 3rd International Symposium on Mixed and Augmented Reality (ISMAR 2004), Arlington, VA, Nov. 2004.
  • 4th International Symposium on Mixed and Augmented Reality (ISMAR 2005), Vienna, Oct. 2005.
  • 5th International Symposium on Mixed and Augmented Reality (ISMAR 2006) Santa Barbara, Oct. 2006.
  • 6th International Symposium on Mixed and Augmented Reality (ISMAR 2007) Nara, Japan, Nov. 2007.
  • 7th International Symposium on Mixed and Augmented Reality (ISMAR 2008) Cambridge, United Kingdom, Sep. 2008.
  • 8th International Symposium on Mixed and Augmented Reality (ISMAR 2009) Orlando, Oct. 2009.
  • Augmented Reality Developer Camp (AR DevCamp) in Mountain View, Dec. 2009.
  • 9th International Symposium on Mixed and Augmented Reality (ISMAR 2010) Seoul, Korea, Oct. 2010.

Software

Free software

Non-commercial use

Books

  • Woodrow Barfield, and Thomas Caudell, eds. Fundamentals of Wearable Computers and Augmented Reality. Mahwah, NJ: Lawrence Erlbaum, 2001. ISBN 0-8058-2901-6.
  • Oliver Bimber and Ramesh Raskar. Spatial Augmented Reality: Merging Real and Virtual Worlds. A K Peters, 2005. ISBN 1-56881-230-2.
  • Michael Haller, Mark Billinghurst and Bruce H. Thomas. Emerging Technologies of Augmented Reality: Interfaces and Design. Idea Group Publishing, 2006. ISBN 1-59904-066-2 , publisher listing
  • Rolf R. Hainich. "The end of Hardware : A Novel Approach to Augmented Reality" 2nd ed.: Booksurge, 2006. ISBN 1-4196-5218-4. 3rd ed. ("Augmented Reality and Beyond"): Booksurge, 2009, ISBN 1-4392-3602-X.
  • Stephen Cawood and Mark Fiala. Augmented Reality: A Practical Guide, 2008, ISBN 1-934356-03-4

Television & film

  • The television series Dennō Coil depicts a near-future where children use AR glasses to enhance their environment with games and virtual pets.
  • The television series Firefly depicts numerous AR applications, including a real-time medical scanner which allows a doctor to use his hands to manipulate a detailed and labeled projection of a patient's brain.
  • In the 1993 ABC miniseries Wild Palms, a Scientology-like organization used holographic projectors to overlay virtual reality images over physical reality.
  • In the movie Iron Man, Tony Stark (Robert Downey Jr.) uses an augmented reality system to design his super-powered suit. The suit itself also uses augmented reality technology.

Literature

  • The books Halting State by Charles Stross and Rainbows End by Vernor Vinge include augmented reality primarily in the form of virtual overlays over the real world. Halting State mentions Copspace, which is used by cops, and the use by gamers to overlay their characters onto themselves during a gaming convention. Rainbows End mentions outdoor overlays based on popular fictional universes from H. P. Lovecraft and Terry Pratchett among others.
  • The term "Geohacking" has been coined by William Gibson in his book Spook Country, where artists use a combination of GPS and 3D graphics technology to embed rendered meshes in real world landscapes.
  • In The Risen Empire, by Scott Westerfeld, most - if not all - people have their own "synesthesia": an AR menu unique to the user that is projected in front of them, which only they can see. It is controlled by hand gestures, blink patterns, where the user is looking, clicks of the tongue, etc.
  • In the Greg Egan novel Distress, the 'Witness' software used to record sights and sounds experienced by the user can be set up to scan what the user is seeing and highlight people the user is looking out for.
  • In the Revelation Space series of novels by Alastair Reynolds, characters frequently employ "Entoptics", which are essentially a highly developed form of augmented reality, going so far as to entirely substitute natural perception.

Games

  • The table top role-playing game, Shadowrun, introduced AR into its game world. Most of the characters in the game use viewing devices to interact with the AR world most of the time.
  • Cybergeneration, a table top role-playing game by R. Talsorian, includes "virtuality", an augmented reality created through v-trodes, cheap, widely available devices people wear at their temples.
  • In the video game Heavy Rain, Norman Jayden, an FBI profiler, possesses a set of experimental augmented reality glasses called an "Added Reality Interface", or ARI. It allows him to rapidly investigate crime scenes and analyze evidence.
  • In Dead Space the RIG worn by Isaac Clarke is thoroughly equipped with augmented reality technology, including a navigation system that projects a line along the best route to his destination, and a system that displays images, video and text in front of him.

See also

References

  1. ^ Brian X. Chen (2009-08-25). "If You're Not Seeing Data, You're Not Seeing". Wired Magazine. Retrieved 2009-08-26. "The interactive system is no longer a precise location, but the whole environment; interaction is no longer simply a face-to-screen exchange, but dissolves itself in the surrounding space and objects. Using an information system is no longer exclusively a conscious and intentional act."
  2. ^ a b R. Azuma, A Survey of Augmented Reality Presence: Teleoperators and Virtual Environments, pp. 355–385, August 1997.
  3. ^ P. Milgram and A. F. Kishino, Taxonomy of Mixed Reality Visual Displays IEICE Transactions on Information and Systems, E77-D(12), pp. 1321-1329, 1994.
  4. ^ Mediated Reality with implementations for everyday life, 2002 August 6th, Presence Connect, the on line companion to the MIT Press journal PRESENCE: Teleoperators and Virtual Environments, MIT Press
  5. ^ "F-35 Distributed Aperture System EO DAS." Youtube.com. Retrieved: 07 October 2010.
  6. ^ http://www.google.com/patents?q=3050870
  7. ^ a b L. B. Rosenberg. The Use of Virtual Fixtures As Perceptual Overlays to Enhance Operator Performance in Remote Environments. Technical Report AL-TR-0089, USAF Armstrong Laboratory, Wright-Patterson AFB OH, 1992.
  8. ^ a b L. B. Rosenberg, "The Use of Virtual Fixtures to Enhance Operator Performance in Telepresence Environments" SPIE Telemanipulator Technology, 1993.
  9. ^ Vuzix AR
  10. ^ Ramesh Raskar, Spatially Augmented Reality, First International Workshop on Augmented Reality, Sept 1998
  11. ^ David Drascic of the University of Toronto is a developer of ARGOS: A Display System for Augmenting Reality. David also has a number of AR related papers on line, accessible from his home page.
  12. ^ Augmented reality brings maps to life July 19, 2005
  13. ^ company website
  14. ^ Best Buy goes 3D, even augmented reality isn't safe from advertising
  15. ^ Steve Henderson, Steven Feiner. "ARMAR:Augmented Reality for Maintenance and Repair (ARMAR)". Retrieved 2010-01-06.
  16. ^ Peter Mountney, Stamatia Giannarou, Daniel Elson and Guang-Zhong Yang. "Optical Biopsy Mapping for Minimally Invasive Cancer Screening". In proc. MICCAI(1), 2009, pp. 483-490. Retrieved 2010-07-07.
  17. ^ "UNC Ultrasound/Medical Augmented Reality Research". Retrieved 2010-01-06.
  18. ^ Zachary Lieberman. "THE EYEWRITER". Retrieved 2010-04-27.
  19. ^ Anish Tripathi. "AUGMENTED REALITY : AN APPLICATION FOR ARCHITECTURE". Retrieved 2010-01-06.
  20. ^ Patrick Dähne, John N. Karigiannis. "Archeoguide: System Architecture of a Mobile Outdoor Augmented Reality System". Retrieved 2010-01-06.
  21. ^ Aaron Stafford, Wayne Piekarski, and Bruce H. Thomas. "Hand of God". Retrieved 2009-12-18.
  22. ^ Jose Fermoso. "Make Books 'Pop' With New Augmented Reality Tech". Retrieved 2010-10-01.
  23. ^ Pair, J., Wilson, J., Chastine, J., Gandy, M. "The Duran Duran Project: The Augmented Reality Toolkit in Live Performance". The First IEEE International Augmented Reality Toolkit Workshop, 2002. (photos and video)
  24. ^ Gizmodo: Sydney Band Uses Augmented Reality For Video Clip
  25. ^ cnet: Augmented reality in Aussie film clip
  26. ^ "Knowledge-based augmented reality". ACM. July 1993.
  27. ^ Wagner, Daniel (September 29, 2009). "First Steps Towards Handheld Augmented Reality". ACM. Retrieved 2009-09-29.
  28. ^ Piekarski, Wayne and Thomas, Bruce (Oct. 2001). "Tinmith-Metro: New Outdoor Techniques for Creating City Models with an Augmented Reality Wearable Computer". IEEE. http://www.computer.org/portal/web/csdl/doi/10.1109/ISWC.2001.962093
  29. ^ Kato, H., Billinghurst, M. "Marker tracking and hmd calibration for a video-based augmented reality conferencing system.",In Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR 99), October 1999.
  30. ^ ARToolKit SourceForge page
  31. ^ ARToolWorks company website
  32. ^ Dybis, Karen (August 6, 2010). "Some Genuine Detroit 'School Pride'". Time. Retrieved October 7, 2010.

Media related to Augmented reality at Wikimedia Commons