
Augmented reality

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by EhJJ (talk | contribs) at 05:29, 30 April 2011 (Head–mounted: fix link (dab)). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Wikitude World Browser on the iPhone 3GS uses GPS and a solid state compass
AR Tower Defense game on the Nokia N95 smartphone (Symbian OS) uses fiduciary markers

Augmented reality (AR) is generally accepted as a term for a live direct or indirect view of a physical, real-world environment whose elements are augmented by computer-generated sensory input, such as sound or graphics, though some have called for a much broader definition of augmented reality.[1] AR is related to a more general concept called mediated reality, in which a view of reality is modified (possibly even diminished rather than augmented) by a computer. The technology thus enhances one's current perception of reality; by contrast, virtual reality replaces the real world with a simulated one.

Augmentation is conventionally in real-time and in semantic context with environmental elements, such as sports scores on TV during a match. With the help of advanced AR technology (e.g. adding computer vision and object recognition) the information about the surrounding real world of the user becomes interactive and digitally manipulable. Artificial information about the environment and its objects can be overlaid on the real world. The term augmented reality is believed to have been coined in 1990 by Thomas Caudell, working at Boeing.[2]

Research explores the application of computer-generated imagery in live-video streams as a way to enhance the perception of the real world. AR technology includes head-mounted displays and virtual retinal displays for visualization purposes, and construction of controlled environments containing sensors and actuators.

Definition

Ronald Azuma offered a definition in 1997.[3] Azuma's definition says that Augmented Reality combines real and virtual, is interactive in real time and is registered in 3D.

Additionally Paul Milgram and Fumio Kishino defined Milgram's Reality-Virtuality Continuum in 1994.[4] They describe a continuum that spans an entirely real environment to a purely virtual environment. In between are Augmented Reality (closer to the real environment) and Augmented Virtuality (closer to the virtual environment).

Milgram's Continuum
Mann's Continuum
Mediated Reality continuum showing four points: Augmented Reality, Augmented Virtuality, Mediated Reality, and Mediated Virtuality on the Virtuality and Mediality axes

Taxonomy of reality, virtuality, mediality

This continuum has been extended into a second dimension that incorporates Mediality.[5] On a graph, the origin R at the bottom left denotes unmodified reality. A continuum across the Virtuality axis V includes reality augmented with additional information (AR), as well as virtual reality augmented by reality (Augmented Virtuality or AV). Unmediated AV simulations are constrained to match the real world behaviorally if not in contents.

The mediality axis measures modification of AV, AR and mixes thereof. Moving away from the origin on this axis, the depicted world becomes increasingly different from reality. Diagonally opposite from R are virtual worlds that have no connection to reality. This taxonomy includes the virtuality–reality continuum (mixing), but in addition to additive effects it also covers modulation and/or diminishment of reality. Mediation encompasses both deliberate and unintentional modifications.[citation needed]

Examples

Sports

AR has become common in sports telecasting. The yellow "first down" line seen in television broadcasts of American football games shows the line the offensive team must cross to receive a first down, using the 1st & Ten system. The real-world elements are the football field and players, and the virtual element is the yellow line, which augments the image in real time. Similarly, in ice hockey an AR colored trail shows the location and direction of the puck. Sections of rugby fields and cricket pitches display sponsored images.

Swimming telecasts often add a line across the lanes to indicate the position of the current record holder as a race proceeds, allowing viewers to compare the current race to the best performance.

As an example of mediated (diminished) reality, the network may hide a real message or replace a real ad message with a virtual message.

Other

First-person shooter video games can simulate a player's viewpoint using AR to give visual directions to a location, mark the direction and distance of another person who is not in line of sight, and give information about equipment such as remaining ammunition. This is done using a virtual head-up display.[citation needed]

Head-up displays, whether in cars such as some BMW 7 Series models or in airplanes, are typically integrated into the windshield.

The F-35 Lightning II instead displays information in the pilot's helmet-mounted display, which allows the pilot to look through the aircraft's walls as if floating in space.[6]

History

  • 1957-62: Morton Heilig, a cinematographer, creates and patents a simulator called Sensorama with visuals, sound, vibration, and smell.[7]
  • 1966: Ivan Sutherland invents the head-mounted display and positions it as a window into a virtual world.
  • 1975: Myron Krueger creates Videoplace to allow users to interact with virtual objects for the first time.
  • 1989: Jaron Lanier coins the phrase Virtual Reality and creates the first commercial business around virtual worlds.
  • 1990: Tom Caudell coins the phrase Augmented Reality while at Boeing helping workers assemble cables into aircraft.[8]
  • 1992: L.B. Rosenberg develops one of the first functioning AR systems, called VIRTUAL FIXTURES, at the U.S. Air Force Research Laboratory—Armstrong, and demonstrates benefits to human performance.[9][10]
  • 1992: Steven Feiner, Blair MacIntyre and Doree Seligmann present the first major paper on an AR system prototype, KARMA, at the Graphics Interface conference. A widely cited version of the paper is published in Communications of the ACM in 1993.
  • 1993: Loral WDL, with sponsorship from STRICOM, performed the first demonstration combining live AR-equipped vehicles and manned simulators. Unpublished paper, J. Barrilleaux, "Experiences and Observations in Applying Augmented Reality to Live Training", 1999.[11]
  • 1994: Julie Martin creates the first augmented reality theater production, Dancing In Cyberspace. Funded by the Australia Council for the Arts, it features dancers and acrobats manipulating body-sized virtual objects in real time, projected into the same physical space and performance plane, so that the acrobats appear immersed within the virtual objects and environments. The installation used Silicon Graphics computers and a Polhemus sensing system.
  • 1998: Spatial Augmented Reality introduced at University of North Carolina at Chapel Hill by Raskar, Welch, Fuchs.[12]
  • 1999: Hirokazu Kato (加藤 博一) created ARToolKit at HITLab, where AR later was further developed by other HITLab scientists, demonstrating it at SIGGRAPH.
  • 2000: Bruce H. Thomas develops ARQuake, the first outdoor mobile AR game, demonstrating it in the International Symposium on Wearable Computers.
  • 2008: Wikitude AR Travel Guide launches on Oct. 20, 2008 with the G1 Android phone.[13]
  • 2009: Wikitude Drive, AR navigation system launched on Oct. 28, 2009 for the Android platform.
  • 2009: AR Toolkit was ported to Adobe Flash (FLARToolkit) by Saqoosha, bringing augmented reality to the web browser.[14]
  • 2011: The Nintendo 3DS comes packaged with 6 AR cards that can be used as fiduciary markers, allowing various mini-games to be played with virtual objects appearing in the camera view.

Technology

Hardware

The main hardware components for augmented reality are the processor, display, sensors and input devices. These elements, specifically the CPU, display, camera and MEMS sensors such as the accelerometer, GPS and solid state compass, are often present in modern smartphones, which makes them prospective AR platforms.

Display

There are three major display techniques for Augmented Reality: head–mounted displays, handheld displays and spatial displays.

Head–mounted

A head-mounted display (HMD) places images of both the physical world and registered virtual graphical objects over the user's view of the world. HMDs are either optical see-through or video see-through. Optical see-through displays employ half-silvered mirrors to pass images through the lens and overlay information reflected into the user's eyes. The HMD must be tracked with a sensor that provides six degrees of freedom; this tracking allows the system to align virtual information to the physical world. The main advantage of HMD AR is the user's immersive experience: the graphical information is slaved to the view of the user.[15]

Handheld

Handheld displays employ a small display that fits in a user's hand. All handheld AR solutions to date opt for video see-through. Initially handheld AR employed fiduciary markers, and later GPS units and MEMS sensors such as digital compasses and six-degree-of-freedom accelerometer and gyroscope units. Today SLAM markerless trackers such as PTAM are starting to come into use. Handheld display AR promises to be the first commercial success for AR technologies. The two main advantages of handheld AR are the portable nature of handheld devices and the ubiquitous nature of camera phones. The disadvantages are the physical constraint of the user having to hold the handheld device out in front of them at all times, as well as the distorting effect of classically wide-angled mobile phone cameras compared to the real world as viewed through the eye.[16]

Spatial

Instead of the user wearing or carrying the display, as with head-mounted displays or handheld devices, Spatial Augmented Reality (SAR)[12] makes use of digital projectors to display graphical information onto physical objects. The key difference in SAR is that the display is separated from the users of the system. Because the displays are not associated with each user, SAR scales naturally up to groups of users, allowing for collocated collaboration. SAR has several advantages over traditional head-mounted displays and handheld devices. The user is not required to carry equipment or wear the display over their eyes, which makes spatial AR a good candidate for collaborative work, as the users can see each other's faces, and a system can be used by multiple people at the same time without each having to wear a head-mounted display. Spatial AR also does not suffer from the limited display resolution of current head-mounted displays and portable devices: a projector-based display system can simply incorporate more projectors to expand the display area, and where portable devices have only a small window into the world for drawing, a SAR system can display on any number of surfaces of an indoor setting at once. The drawbacks are that projector-based SAR systems do not work well in sunlight and require a surface on which to project the computer-generated graphics; augmentations cannot simply hang in the air as they do with handheld and HMD-based AR. The tangible nature of SAR, though, makes it an ideal technology to support design, as SAR provides both graphical visualisation and passive haptic sensation for the end users: people are able to touch physical objects, and it is this process that provides the passive haptic sensation.[3][12][17][18][19]

Tracking

Modern mobile augmented reality systems use one or more of the following tracking technologies: digital cameras and/or other optical sensors, accelerometers, GPS, gyroscopes, solid state compasses, RFID and wireless sensors. These technologies offer varying levels of accuracy and precision. Most important is the position and orientation of the user's head. Tracking the user's hand(s) or a handheld input device can provide a 6DOF interaction technique.[20]
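
As an illustration of how such sensors can be combined, the following is a minimal sketch of a complementary filter, a common way to fuse drifting but smooth gyroscope rates with noisy but drift-free accelerometer tilt readings into one orientation estimate. The function name, blend factor and sample values are illustrative assumptions, not taken from any particular AR system.

```python
def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98):
    """Fuse gyroscope rate readings (rad/s) with accelerometer-derived
    tilt angles (rad) into one smoothed orientation estimate.

    The gyro integrates well over short spans but drifts; the
    accelerometer is noisy but drift-free, so each sample blends
    the integrated gyro angle with the absolute accelerometer angle.
    alpha is an assumed blend factor, not a standard value."""
    angle = accel_angles[0]          # initialize from the absolute sensor
    estimates = []
    for rate, accel in zip(gyro_rates, accel_angles):
        angle = alpha * (angle + rate * dt) + (1 - alpha) * accel
        estimates.append(angle)
    return estimates
```

With a constant gyro bias and a level accelerometer, the integrated drift stays bounded instead of growing without limit, which is the property that makes this filter useful for head and device tracking.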

Input devices

Techniques include the pinch glove,[21] a wand with a button, and a smartphone that signals its position and orientation from camera images.

Computer

The computer analyzes the sensed visual and other data to synthesize and position augmentations.

Software and algorithms

A key measure of AR systems is how realistically they integrate augmentations with the real world. The software must derive real-world coordinates, independent of the camera, from camera images. That process is called image registration and is part of Azuma's definition of Augmented Reality.

Image registration uses different methods of computer vision, mostly related to video tracking. Many computer vision methods for augmented reality are inherited from visual odometry. These methods usually consist of two stages. The first stage detects interest points, fiduciary markers or optical flow in the camera images, using feature detection methods such as corner detection, blob detection, edge detection or thresholding, and/or other image processing methods.
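
As a sketch of this first stage, the following minimal Python implementation of the Harris corner detector (one of the corner detection methods mentioned above) computes image gradients, accumulates a structure tensor over a small window, and flags pixels whose corner response is high. The window size, sensitivity constant k and threshold are illustrative assumptions; production trackers use optimized library routines rather than pure Python.

```python
def harris_corners(img, k=0.04, threshold=0.5):
    """Minimal Harris corner detector on a 2D grayscale image
    (list of lists of floats). Returns (row, col) pairs of pixels
    whose corner response exceeds `threshold`."""
    h, w = len(img), len(img[0])
    # Image gradients by central differences (clamped at the borders).
    ix = [[(img[y][min(x + 1, w - 1)] - img[y][max(x - 1, 0)]) / 2.0
           for x in range(w)] for y in range(h)]
    iy = [[(img[min(y + 1, h - 1)][x] - img[max(y - 1, 0)][x]) / 2.0
           for x in range(w)] for y in range(h)]
    corners = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Structure tensor summed over a 3x3 window.
            sxx = sxy = syy = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    gx, gy = ix[y + dy][x + dx], iy[y + dy][x + dx]
                    sxx += gx * gx
                    sxy += gx * gy
                    syy += gy * gy
            # Harris response: det(M) - k * trace(M)^2.
            # Large for corners, near zero for flat areas, negative for edges.
            r = (sxx * syy - sxy * sxy) - k * (sxx + syy) ** 2
            if r > threshold:
                corners.append((y, x))
    return corners
```

The response is large only where the window contains strong gradients in two directions, which is why edges and flat regions are rejected while corners, the stable interest points a tracker needs, are kept.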

The second stage restores a real-world coordinate system from the data obtained in the first stage. Some methods assume objects with known geometry (or fiduciary markers) are present in the scene; in some of those cases the 3D structure of the scene should be precalculated. If part of the scene is unknown, simultaneous localization and mapping (SLAM) can map relative positions. If no information about scene geometry is available, structure-from-motion methods such as bundle adjustment are used. Mathematical methods used in the second stage include projective (epipolar) geometry, geometric algebra, rotation representation with the exponential map, Kalman and particle filters, nonlinear optimization and robust statistics.
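
As a sketch of the Kalman filtering mentioned above, a scalar Kalman filter can smooth one coordinate of a tracked marker's position between frames. The noise variances below are illustrative assumptions, and real AR systems filter the full six-degree-of-freedom pose rather than a single scalar, but the predict/update structure is the same.

```python
def kalman_1d(measurements, q=1e-3, r=0.1):
    """Scalar Kalman filter for a roughly constant quantity, e.g. one
    coordinate of a tracked marker. q is the assumed process noise
    variance, r the assumed measurement noise variance."""
    x, p = measurements[0], 1.0   # state estimate and its variance
    out = []
    for z in measurements:
        p += q                    # predict: uncertainty grows over time
        g = p / (p + r)           # Kalman gain: trust in the measurement
        x += g * (z - x)          # update: move estimate toward measurement
        p *= (1 - g)              # update: uncertainty shrinks
        out.append(x)
    return out
```

With jittery measurements around a fixed true position, the filtered output settles near the true value with far less frame-to-frame oscillation, which is what keeps overlaid graphics from visibly shaking.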

Applications

Applications as of 2011

Advertising: Marketers started to use AR to promote products via interactive AR applications. For example, at the 2008 LA Auto Show, Nissan unveiled the concept vehicle Cube and presented visitors with a brochure which, when held against a webcam, showed alternate versions of the vehicle.[22] In August 2009, Best Buy ran a circular with an augmented reality code that allowed users with a webcam to interact with the product in 3D.[23] In 2010 Walt Disney used mobile AR to connect a movie experience to outdoor advertising.[24]

Task support: Complex tasks such as assembly, maintenance, and surgery can be simplified by inserting additional information into the field of view. For example, labels can be displayed on parts of a system to clarify operating instructions for a mechanic who is performing maintenance on the system.[25][26] AR can include images of hidden objects, which can be particularly effective for medical diagnostics or surgery. Examples include a virtual X-ray view based on prior tomography or on real time images from ultrasound and microconfocal probes[27] or open NMR devices. AR can enhance viewing a fetus inside a mother's womb.[28] See also Mixed reality.

Navigation: AR can augment the effectiveness of navigation devices. For example, building navigation can be enhanced to aid in maintaining industrial plants. Outdoor navigation can be augmented for military operations or disaster management. Head-up displays or personal display glasses in automobiles can provide navigation and traffic information. Head-up displays are currently used in fighter jets. These systems include full interactivity, including gaze tracking.

Industrial: AR can be used to compare digital mock-ups with physical mock-ups for efficiently finding discrepancies between them. It can safeguard digital data together with existing real prototypes, and thus reduce the number of real prototypes and improve the quality of the final product.[citation needed]

Military and emergency services: Wearable AR can provide information such as instructions, maps, enemy locations, and fire cells.

Art: AR can help create art in real time integrating reality such as painting, drawing and modeling. AR art technology has helped disabled individuals to continue pursuing their passion.[29]

Architecture: AR can simulate planned construction projects.[30]

Sightseeing: Guides can include labels or text related to the objects/places visited. With AR, users can rebuild ruins, buildings, or even landscapes as they previously existed.[31]

Collaboration: AR can help facilitate collaboration among distributed team members via conferences with real and virtual participants.[32]

Entertainment and education: AR can create virtual objects in museums and exhibitions, theme park attractions,[33] games[34][35] and books.[36]

Performance: AR can enhance concert and theater performances. For example, artists can allow listeners to augment their listening experience by adding their performance to that of other bands/groups of users.[37][38][39]

Translation: AR systems can provide dynamic subtitles in the user's language.[40][41]

Potential applications

Possible extensions include:

  • Devices: Create new applications that are physically impossible in "real" hardware, such as 3D objects interactively changing their shape and appearance based on the current task or need.
  • Multi-screen simulation: Display multiple application windows as virtual monitors in real space and switch among them with gestures and/or redirecting head and eyes. A single pair of glasses could "surround" a user with application windows.
  • Holodecks: Enhanced media applications, like pseudo holographic virtual screens and virtual surround cinema.
  • Automotive: eye-dialing, navigation arrows on roadways
  • "X-ray vision"
  • Furnishings: plants, wallpapers, panoramic views, artwork, decorations, posters, illumination etc. For example, a virtual window could show a live feed of a camera placed on the exterior of the building, thus allowing the user to toggle a wall's transparency.
  • Public displays: Window dressings, traffic signs, Christmas decorations, advertisements.
  • Gadgets: Clock, radio, PC, arrival/departure board at an airport, stock ticker, PDA, PMP, informational posters/fliers/billboards.
  • Group-specific feeds: For example, a construction manager could display instructions including diagrams at specific locations. Patrons at a public event could subscribe to a feed of directions and/or program notes.
  • Speech synthesis: Render location/context-specific information via spoken words.
  • Prospecting: In hydrology, ecology, and geology, AR can be used to display an interactive analysis of terrain characteristics. Users can collaboratively modify and analyze interactive three-dimensional maps.

Notable researchers

Conferences

  • 1st International Workshop on Augmented Reality, IWAR'98, San Francisco, Nov. 1998.
  • 2nd International Workshop on Augmented Reality (IWAR'99), San Francisco, Oct. 1999.
  • 1st International Symposium on Mixed Reality (ISMR'99), Yokohama, Japan, March 1999.
  • 2nd International Symposium on Mixed Reality (ISMR'01), Yokohama, Japan, March 2001.
  • 1st International Symposium on Augmented Reality (ISAR 2000), Munich, Oct. 2000.
  • 2nd International Symposium on Augmented Reality (ISAR 2001), New York, Oct. 2001.
  • 1st International Symposium on Mixed and Augmented Reality (ISMAR 2002), Darmstadt, Oct. 2002.
  • 2nd International Symposium on Mixed and Augmented Reality (ISMAR 2003), Tokyo, Oct. 2003.
  • 3rd International Symposium on Mixed and Augmented Reality (ISMAR 2004), Arlington, VA, Nov. 2004.
  • 4th International Symposium on Mixed and Augmented Reality (ISMAR 2005), Vienna, Oct. 2005.
  • 5th International Symposium on Mixed and Augmented Reality (ISMAR 2006) Santa Barbara, Oct. 2006.
  • 6th International Symposium on Mixed and Augmented Reality (ISMAR 2007) Nara, Japan, Nov. 2007.
  • 7th International Symposium on Mixed and Augmented Reality (ISMAR 2008) Cambridge, United Kingdom, Sep. 2008.
  • 8th International Symposium on Mixed and Augmented Reality (ISMAR 2009) Orlando, Oct. 2009.
  • Augmented Reality Developer Camp (AR DevCamp) in Mountain View, Dec. 2009.
  • 9th International Symposium on Mixed and Augmented Reality (ISMAR 2010) Seoul, Korea, Oct. 2010.
  • 10th International Symposium on Mixed and Augmented Reality (ISMAR 2011) Basel, Switzerland Oct. 2011

Software

Open source software

  • ARToolKit, an open source (dual-license: GPL, commercial) C-library to create augmented reality applications; was ported to many different languages and platforms like Android, Flash or Silverlight; very widely used in augmented reality related projects
  • ArUco, a minimal library for augmented reality applications based on OpenCV; BSD-licensed, available for Linux and Windows
  • mixare, Open-source (GPLv3) augmented reality engine for Android and iPhone; works as an autonomous application and for developing other implementations
  • OpenMAR, Open Mobile Augmented Reality component framework for the Symbian platform, released under EPL; the project website is down
  • Argon, an augmented reality browser by Georgia Tech's GVU Center that uses a mix of KML and HTML/JavaScript/CSS to allow developing AR applications; any web content (with appropriate meta-data and properly formatted) can be converted into AR content; currently available only for the iPhone, website is down
  • Goblin, BSD licensed, Microsoft XNA based
  • PTAM, Non-commercial use only
  • ARTag, Downloads unavailable after 12/21/2010 due to licensing restrictions

Books

  • Woodrow Barfield, and Thomas Caudell, eds. Fundamentals of Wearable Computers and Augmented Reality. Mahwah, NJ: Lawrence Erlbaum, 2001. ISBN 0-8058-2901-6.
  • Oliver Bimber and Ramesh Raskar. Spatial Augmented Reality: Merging Real and Virtual Worlds. A K Peters, 2005. ISBN 1-56881-230-2.
  • Michael Haller, Mark Billinghurst and Bruce H. Thomas. Emerging Technologies of Augmented Reality: Interfaces and Design. Idea Group Publishing, 2006. ISBN 1-59904-066-2 , publisher listing
  • Rolf R. Hainich. "The end of Hardware : A Novel Approach to Augmented Reality" 2nd ed.: Booksurge, 2006. ISBN 1-4196-5218-4. 3rd ed. ("Augmented Reality and Beyond"): Booksurge, 2009, ISBN 1-4392-3602-X.
  • Stephen Cawood and Mark Fiala. Augmented Reality: A Practical Guide, 2008, ISBN 1-934356-03-4

Television, film

  • The television series Dennō Coil depicts a near-future where children use AR glasses to enhance their environment with games and virtual pets.
  • In the Terminator movie series, all Terminator models, beginning with the T-800 series, use augmented reality systems to "see".
  • The television series Firefly depicts numerous AR applications, including a real-time medical scanner which allows a doctor to use his hands to manipulate a detailed and labeled projection of a patient's brain.
  • In the 1993 ABC miniseries Wild Palms, a Scientology-like organization used holographic projectors to overlay virtual reality images over physical reality.
  • In the movie Iron Man, Tony Stark (Robert Downey Jr.) uses an augmented reality system to design his super-powered suit. The suit itself also uses augmented reality technology.
  • In the Philippines, during their first automated elections (2010), ABS-CBN News and Current Affairs used augmented reality during the counting of votes for all national and local candidates and in delivering news reports. ABS-CBN still uses augmented reality in its TV Patrol news programs.
  • In Minority Report, Tom Cruise stands in front of a supercomputer using AR technology.
  • In the movie Mission Impossible 2, Tom Cruise uses augmented reality technology via a set of sunglasses to debrief himself on his forthcoming mission, Chimera, after he completes climbing a mountain at the very outset of the movie.
  • In the movie RoboCop, RoboCop uses Augmented Reality tech via his Head-mounted display to get into the details of a particular person or status quo.
  • In the movie They Live, aliens on Earth use a hypnotic radio frequency that causes the human population to see generated images and adverts masking billboards that actually contain subliminal messaging. Curiously, one must wear a head-mounted display (in this case, a pair of sunglasses) in order not to see the AR.[45]
  • NBC School Pride debuts AR alive: Letters alive in the Communication & Media Arts High School in Detroit, Michigan [46]

Literature

  • The books Halting State by Charles Stross and Rainbows End by Vernor Vinge and the Daemon series by Daniel Suarez include augmented reality primarily in the form of virtual overlays over the real world. Halting State mentions Copspace, which is used by cops, and the use by gamers to overlay their characters onto themselves during a gaming convention. Rainbows End mentions outdoor overlays based on popular fictional universes from H. P. Lovecraft and Terry Pratchett among others. The Daemon series features the "Darknet", which connects human followers and allows them to create their own ranking system and economy among other features.
  • The term "Geohacking" was coined by William Gibson in his book Spook Country, where artists use a mix of GPS and 3D graphics technology to embed rendered meshes in real world landscapes.
  • In The Risen Empire, by Scott Westerfeld, most if not all people have their own "synesthesia": an AR menu unique to the user, projected in front of them and visible only to its owner. It is controlled by hand gestures, blink patterns, gaze direction, clicks of the tongue, etc.
  • In the Greg Egan novel Distress, the 'Witness' software used to record sights and sounds experienced by the user can be set up to scan what the user is seeing and highlight people the user is looking out for.
  • In the Revelation Space series of novels by Alastair Reynolds, characters frequently employ "Entoptics", essentially a highly developed form of augmented reality that can go so far as to entirely substitute for natural perception.
  • In the book The California Voodoo Game by Larry Niven and Steve Barnes, the game players use LCD displays with what the book calls Dreamtime technology to add virtual overlays to the real world.

Games

  • The tabletop role-playing game Shadowrun introduced AR into its game world; most of the characters in the game use viewing devices to interact with the AR world.
  • Cybergeneration, a table top role-playing game by R. Talsorian, includes "virtuality", an augmented reality created through v-trodes, cheap, widely available devices people wear at their temples.
  • In the video game Heavy Rain, Norman Jayden, an FBI profiler, possesses a set of experimental augmented reality glasses called an "Added Reality Interface", or ARI. It allows him to rapidly investigate crime scenes and analyze evidence.
  • In Dead Space the RIG worn by Isaac Clarke is thoroughly equipped with augmented reality technology, including a navigation system that projects a line along the best route to his destination, and a system that displays images, video and text in front of him.

Comics

Tools

see Augmented reality#Software

See also

References

  1. ^ "If we accept that all Reality happens in our mind and is by definition, Mediated by our senses...then surely any technology that extends our memory, provides additional information, clues, hints or expanded interaction is a form of Augmented Reality." Rob Manson (2011-04-13). "Time to review the definition of AR...again". AR-UX.com. Retrieved 2011-04-29.
  2. ^ "The interactive system is no longer a precise location, but the whole environment; interaction is no longer simply a face-to-screen exchange, but dissolves itself in the surrounding space and objects. Using an information system is no longer exclusively a conscious and intentional act." Brian X. Chen (2009-08-25). "If You're Not Seeing Data, You're Not Seeing". Wired. Retrieved 2009-08-26.
  3. ^ a b R. Azuma, A Survey of Augmented Reality Presence: Teleoperators and Virtual Environments, pp. 355–385, August 1997.
  4. ^ P. Milgram and A. F. Kishino, Taxonomy of Mixed Reality Visual Displays IEICE Transactions on Information and Systems, E77-D(12), pp. 1321-1329, 1994.
  5. ^ Mediated Reality with implementations for everyday life, 2002 August 6th, Presence Connect, the on line companion to the MIT Press journal PRESENCE: Teleoperators and Virtual Environments, MIT Press
  6. ^ "F-35 Distributed Aperture System EO DAS." Youtube.com. Retrieved: 07 October 2010.
  7. ^ http://www.google.com/patents?q=3050870
  8. ^ Tom Caudell
  9. ^ a b L. B. Rosenberg. The Use of Virtual Fixtures As Perceptual Overlays to Enhance Operator Performance in Remote Environments. Technical Report AL-TR-0089, USAF Armstrong Laboratory, Wright-Patterson AFB OH, 1992.
  10. ^ a b L. B. Rosenberg, "The Use of Virtual Fixtures to Enhance Operator Performance in Telepresence Environments" SPIE Telemanipulator Technology, 1993.
  11. ^ Experiences and Observations in Applying Augmented Reality to Live Training
  12. ^ a b c Ramesh Raskar, Greg Welch, Henry Fuchs Spatially Augmented Reality, First International Workshop on Augmented Reality, Sept 1998
  13. ^ Wikitude AR Travel Guide
  14. ^ Saqoosha
  15. ^ The most common products employed are the MicroVision Nomad, Sony Glasstron, Vuzix AR and I/O Displays.
  16. ^ Feiner, Steve. "Augmented reality: a long way off?". AR Week. Pocket-lint. Retrieved 3 March 2011.
  17. ^ David Drascic of the University of Toronto is a developer of ARGOS: A Display System for Augmenting Reality. David also has a number of AR related papers on line, accessible from his home page.
  18. ^ Augmented reality brings maps to life July 19, 2005
  19. ^ Feiner, Steve. "Augmented reality: a long way off?". AR Week. Pocket-lint.com. Retrieved 3 March 2011.
  20. ^ Stationary systems can employ 6DOF track systems such as Polhemus, ViCON, A.R.T, or Ascension.
  21. ^ Tinmith
  22. ^ company website
  23. ^ Vlad Savov. "Best Buy goes 3D, even augmented reality isn't safe from advertising".
  24. ^ AR at Disney
  25. ^ The big idea:Augmented Reality
  26. ^ Steve Henderson, Steven Feiner. "ARMAR:Augmented Reality for Maintenance and Repair (ARMAR)". Retrieved 2010-01-06.
  27. ^ Peter Mountney, Stamatia Giannarou, Daniel Elson and Guang-Zhong Yang. "Optical Biopsy Mapping for Minimally Invasive Cancer Screening". In proc. MICCAI(1), 2009, pp. 483-490. Retrieved 2010-07-07.
  28. ^ "UNC Ultrasound/Medical Augmented Reality Research". Retrieved 2010-01-06.
  29. ^ One such example of this phenomenon is called Eyewriter that was developed in 2009 by Zachary Lieberman and a group formed by members of Free Art and Technology (FAT), OpenFrameworks and the Graffiti Research Lab to help a graffiti artist, who became paralyzed, draw again. Zachary Lieberman. "The Eyewriter". Retrieved 2010-04-27.
  30. ^ Anish Tripathi. "Augmented Reality: An Application for Architecture". Retrieved 2010-01-06.
  31. ^ Patrick Dähne, John N. Karigiannis. "Archeoguide: System Architecture of a Mobile Outdoor Augmented Reality System". Retrieved 2010-01-06.
  32. ^ The Hand of God is a good example of a collaboration system. Aaron Stafford, Wayne Piekarski, and Bruce H. Thomas. "Hand of God". Retrieved 2009-12-18.
  33. ^ Theme park attraction:Cadbury World
  34. ^ ARQuake
  35. ^ Eye of Judgement
  36. ^ Jose Fermoso. "Make Books 'Pop' With New Augmented Reality Tech". Retrieved 2010-10-01.
  37. ^ Pop group Duran Duran included interactive AR projections into their stage show during their 2000 Pop Trash concert tour. Pair, J., Wilson, J., Chastine, J., Gandy, M. "The Duran Duran Project: The Augmented Reality Toolkit in Live Performance". The First IEEE International Augmented Reality Toolkit Workshop, 2002. (photos and video)
  38. ^ Sydney band Lost Valentinos launched the world's first interactive AR music video on 16 October 2009, where users could print out 5 markers representing a pre-recorded performance from each band member, interact with them live and in real-time via their computer webcam, and record the result as their own unique music video clips to share via YouTube. Gizmodo: Sydney Band Uses Augmented Reality For Video Clip
  39. ^ cnet: Augmented reality in Aussie film clip
  40. ^ iPhone application Word Lens injects subtitles in the desired language into video. [1] Alexia Tsotsis "Word Lens Translates Words Inside of Images. Yes Really." TechCrunch (December 16, 2010)
  41. ^ [2] N.B. "Word Lens: This changes everything" The Economist: Gulliver blog (December 18, 2010)
  42. ^ "Knowledge-based augmented reality". ACM. July 1993.
  43. ^ Wagner, Daniel (September 29, 2009). "First Steps Towards Handheld Augmented Reality". ACM. Retrieved 2009-09-29.
  44. ^ Piekarski, Wayne and Thomas, Bruce (October 1, 2001). "Tinmith-Metro: New Outdoor Techniques for Creating City Models with an Augmented Reality Wearable Computer". IEEE. Retrieved 2010-11-09.
  45. ^ Miles, Stuart. "Top 10 uses of augmented reality in the movies". AR Week. Pocket-lint.com. Retrieved 1 March 2011.
  46. ^ Dybis, Karen (August 6, 2010). "Some Genuine Detroit 'School Pride'". Time. Retrieved October 7, 2010.

Media related to Augmented reality at Wikimedia Commons