Augmented reality
Augmented reality (AR) is a term for a live direct or indirect view of a physical, real-world environment whose elements are augmented by computer-generated sensory input, such as sound or graphics. It is related to a more general concept called mediated reality, in which a view of reality is modified (possibly even diminished rather than augmented) by a computer. As a result, the technology functions by enhancing one's current perception of reality. By contrast, virtual reality replaces the real world with a simulated one.
Augmentation is conventionally in real-time and in semantic context with environmental elements, such as sports scores on TV during a match. With the help of advanced AR technology (e.g. adding computer vision and object recognition) the information about the surrounding real world of the user becomes interactive and digitally manipulable. Artificial information about the environment and its objects can be overlaid on the real world. The term augmented reality is believed to have been coined in 1990 by Thomas Caudell, working at Boeing.[1]
Research explores the application of computer-generated imagery in live-video streams as a way to enhance the perception of the real world. AR technology includes head-mounted displays and virtual retinal displays for visualization purposes, and construction of controlled environments containing sensors and actuators.
Definition
Ronald Azuma offered a definition in 1997.[2] Azuma's definition says that Augmented Reality combines real and virtual, is interactive in real time and is registered in 3D.
Additionally Paul Milgram and Fumio Kishino defined Milgram's Reality-Virtuality Continuum in 1994.[3] They describe a continuum that spans an entirely real environment to a purely virtual environment. In between are Augmented Reality (closer to the real environment) and Augmented Virtuality (closer to the virtual environment).
Taxonomy of Reality, Virtuality, Mediality
This continuum has been extended into a second dimension that incorporates Mediality.[4] On a graph, the origin R at the bottom left denotes unmodified reality. The virtuality axis V spans reality augmented with additional information (Augmented Reality) as well as virtual reality augmented by reality (Augmented Virtuality, or AV). Unmediated AV simulations are constrained to match the real world behaviorally, if not in content.
The mediality axis measures modification of AR, AV, and combinations of the two. Moving away from the origin on this axis, the depicted world becomes increasingly different from reality; diagonally opposite R lie virtual worlds with no connection to reality. This taxonomy includes the reality-virtuality continuum (mixing) but, in addition to additive effects, also covers modulation and/or diminishment of reality. Mediation encompasses both deliberate and unintentional modifications.[citation needed]
Examples
Sports
AR has become common in sports telecasting. The yellow "first down" line seen in television broadcasts of American football games shows the line the offensive team must cross to receive a first down, using the 1st & Ten system. The real-world elements are the football field and players; the virtual element is the yellow line, which augments the image in real time. Similarly, in ice hockey an AR colored trail shows the location and direction of the puck. Sections of rugby fields and cricket pitches display sponsored images.
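One simple way such an overlay can avoid painting over players is chroma keying: the virtual line is composited only onto pixels that look like the field. The following toy sketch illustrates the idea; the colors, tolerance, and function name are illustrative assumptions, not the broadcast system's actual values.

```python
import numpy as np

def draw_first_down_line(frame, x, field_color=(40, 120, 40), tol=60):
    """Composite a vertical yellow line at column x, but only over
    pixels whose color is close to the field color, so players
    crossing the line are not painted over (chroma keying)."""
    out = frame.copy()
    column = out[:, x, :].astype(int)
    dist = np.abs(column - np.array(field_color)).sum(axis=1)
    mask = dist < tol                  # grass-like pixels only
    out[mask, x, :] = (255, 255, 0)    # draw the yellow line there
    return out

# Toy frame: a green "field" with one dark "player" pixel in the line's path.
frame = np.full((4, 4, 3), (40, 120, 40), dtype=np.uint8)
frame[1, 2] = (10, 10, 10)             # the player occludes the line
result = draw_first_down_line(frame, 2)
```

The real system also has to locate the field in camera coordinates as the broadcast camera pans and zooms; this sketch covers only the compositing step.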
Swimming telecasts often add a virtual line across the lanes indicating the position of the current record holder as the race proceeds, allowing viewers to compare the race in progress against the best performance.
As an example of mediated (diminished) reality, the network may hide a real message or replace a real ad message with a virtual message.
Other
A museum exhibition might use projectors and screens to insert "objects" into the real environment. These objects relate to the particular location where they appear and can be interactive.[citation needed]
First-person shooter video games can simulate a player's viewpoint using AR to give visual directions to a location, mark the direction distance of another person who is not in line of sight and give information about equipment such as remaining ammunition. This is done using a virtual head-up display.[citation needed]
AR head-up displays in cars[which?] or airplanes are typically integrated into the windshield.[citation needed]
The F-35 Lightning II instead displays information on the pilot's helmet-mounted display, allowing the pilot to look "through" the aircraft's structure as if floating in space.[5]
History
- 1957-62: Morton Heilig, a cinematographer, creates and patents a simulator called Sensorama with visuals, sound, vibration, and smell.[6]
- 1966: Ivan Sutherland invents the head-mounted display and positions it as a window into a virtual world.
- 1975: Myron Krueger creates Videoplace to allow users to interact with virtual objects for the first time.
- 1989: Jaron Lanier coins the phrase Virtual Reality and creates the first commercial business around virtual worlds.
- 1992: Tom Caudell coins the phrase Augmented Reality while at Boeing helping workers assemble cables into aircraft.[7]
- 1992: L.B. Rosenberg develops one of the first functioning AR systems, called VIRTUAL FIXTURES, at the U.S. Air Force Research Laboratory—Armstrong, and demonstrates benefits to human performance.[8][9]
- 1992: Steven Feiner, Blair MacIntyre and Doree Seligmann present the first major paper on an AR system prototype, KARMA, at the Graphics Interface conference. A widely cited version of the paper is published in Communications of the ACM in 1993.
- 1994: Julie Martin creates the first augmented reality theater production, Dancing In Cyberspace. Funded by the Australia Council for the Arts, it featured dancers and acrobats manipulating body-sized virtual objects in real time, projected into the same physical space and performance plane. The acrobats appeared immersed within the virtual objects and environments. The installation used Silicon Graphics computers and a Polhemus sensing system.
- 1998: Spatial Augmented Reality introduced at University of North Carolina at Chapel Hill by Raskar, Welch, Fuchs.[10]
- 1999: Hirokazu Kato (加藤 博一) created ARToolKit at HITLab, where AR later was further developed by other HITLab scientists, demonstrating it at SIGGRAPH.
- 2000: Bruce H. Thomas develops ARQuake, the first outdoor mobile AR game, demonstrating it in the International Symposium on Wearable Computers.
- 2008: Wikitude AR Travel Guide launches on Oct. 20, 2008 with the G1 Android phone.[11]
- 2009: Wikitude Drive - AR Navigation System launched on Oct. 28, 2009 for the Android platform.
- 2009: AR Toolkit was ported to Adobe Flash (FLARToolkit) by Saqoosha, bringing augmented reality to the web browser.[12]
Technology
Hardware
The main hardware components for augmented reality are: display, tracking, input devices, sensors, and processor. These elements, specifically the CPU, camera, display, accelerometer, GPS, and solid-state compass, are often present in modern smartphones, which makes them prospective AR platforms.
Display
The three major display techniques for augmented reality are head-mounted displays, handheld displays, and spatial displays.
Head–mounted
A head-mounted display (HMD) places images of both the physical world and registered virtual graphical objects over the user's view of the world. HMDs are either optical see-through or video see-through. Optical see-through designs employ half-silvered mirrors to pass images of the world through the lens while reflecting the overlaid information into the user's eyes. The HMD must be tracked with sensors that provide six degrees of freedom. This tracking allows the system to align virtual information to the physical world. The main advantage of HMD AR is the user's immersive experience: the graphical information is slaved to the user's view.[13]
Handheld
Handheld displays employ a small display that fits in a user's hand. All handheld AR solutions to date opt for video see-through. Initially handheld AR employed fiduciary markers, and later GPS units and MEMS sensors such as digital compasses and six-degrees-of-freedom accelerometer-gyroscopes. Today, markerless SLAM trackers such as PTAM are starting to come into use. Handheld display AR promises to be the first commercial success for AR technologies. The two main advantages of handheld AR are the portable nature of handheld devices and the ubiquitous nature of camera phones.
Spatial
Instead of the user wearing or carrying the display, as with head-mounted displays or handheld devices, Spatial Augmented Reality (SAR) makes use of digital projectors to display graphical information onto physical objects.[10] The key difference in SAR is that the display is separated from the users of the system. Because the displays are not associated with each user, SAR scales naturally to groups of users, allowing collocated collaboration. SAR has several advantages over traditional head-mounted displays and handheld devices. The user is not required to carry equipment or wear the display over their eyes, which makes spatial AR a good candidate for collaborative work, as users can see each other's faces, and a system can be used by multiple people at the same time without each having to wear a head-mounted display. Spatial AR does not suffer from the limited display resolution of current head-mounted displays and portable devices: a projector-based display system can simply incorporate more projectors to expand the display area. Where portable devices offer only a small window into the world for drawing, a SAR system can display on any number of surfaces of an indoor setting at once. The tangible nature of SAR makes it an ideal technology to support design, as SAR provides both graphical visualisation and passive haptic sensation for the end users: people are able to touch physical objects, and it is this process that provides the passive haptic sensation.[2][10][14][15]
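Aligning a projector's output with a flat surface is commonly handled with a planar homography estimated from point correspondences between projector pixels and surface points. A minimal sketch of applying such a mapping follows; the matrix values are hypothetical stand-ins for a homography that a calibration step would estimate.

```python
import numpy as np

def apply_homography(H, x, y):
    """Map a 2D point through a 3x3 planar homography
    using homogeneous coordinates."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Hypothetical homography: scale by 2 and translate by (10, 5), the kind
# of mapping computed from four projector/surface point correspondences.
H = np.array([[2.0, 0.0, 10.0],
              [0.0, 2.0,  5.0],
              [0.0, 0.0,  1.0]])
u, v = apply_homography(H, 3.0, 4.0)  # projector pixel for surface point (3, 4)
```

Pre-warping every output pixel through the inverse of this mapping makes the projected image appear undistorted on the surface.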
Tracking
Modern mobile augmented reality systems use one or more of the following tracking technologies: digital cameras and/or other optical sensors, accelerometers, GPS, gyroscopes, solid state compasses, RFID and wireless sensors. These technologies offer varying levels of accuracy and precision. Most important is the position and orientation of the user's head. Tracking the user's hand(s) or a handheld input device can provide a 6DOF interaction technique.[16]
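A common lightweight way to fuse such sensors is a complementary filter, which blends an integrated gyroscope rate (accurate short-term, but drifting) with an accelerometer tilt estimate (noisy, but drift-free). This is an illustrative sketch under assumed values, not a production tracker; the function name and all numbers are made up.

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse a gyroscope rate with an accelerometer tilt estimate.
    alpha weights the integrated gyro; (1 - alpha) pulls the
    estimate toward the accelerometer, bounding gyro drift."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Simulate a stationary device whose gyro has a constant drift bias
# of 0.5 deg/s while the accelerometer reports a 10-degree tilt.
angle = 0.0
for _ in range(1000):
    angle = complementary_filter(angle, gyro_rate=0.5,
                                 accel_angle=10.0, dt=0.01)
# The estimate settles near the true tilt instead of drifting unboundedly.
```

More capable systems replace this with Kalman or particle filters, which the Software section below mentions in the context of pose estimation.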
Input devices
Techniques include the pinch glove,[17] a wand with a button and a smartphone that signals its position and orientation from camera images.
Computer
The computer analyzes the sensed visual and other data to synthesize and position augmentations.
Software
A key measure of AR systems is how realistically they integrate augmentations with the real world. The software must derive real world coordinates, independent from the camera, from camera images. That process is called image registration and is part of Azuma's definition of Augmented Reality.
Image registration uses different methods of computer vision, mostly related to video tracking. Many of these methods are inherited from visual odometry and usually consist of two stages. The first stage detects interest points, fiduciary markers, or optical flow in the camera images, using feature-detection methods such as corner detection, blob detection, edge detection, or thresholding, and/or other image-processing methods.
The second stage restores a real-world coordinate system from the data obtained in the first stage. Some methods assume objects with known geometry (or fiduciary markers) are present in the scene; in some of those cases the scene's 3D structure must be precalculated beforehand. If part of the scene is unknown, simultaneous localization and mapping (SLAM) can map relative positions. If no information about scene geometry is available, structure-from-motion methods are used. Methods used in the second stage include projective (epipolar) geometry, bundle adjustment, rotation representation with the exponential map, and Kalman and particle filters.
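The camera pose recovered in the second stage is what lets the system draw virtual content at the right pixel: a 3D point in the registered world coordinates is projected into the image, for instance via the pinhole camera model. A minimal sketch, with an illustrative focal length and principal point:

```python
import numpy as np

def project_point(X_world, R, t, f, cx, cy):
    """Project a 3D world point into pixel coordinates using a
    pinhole camera with rotation R, translation t, focal length f,
    and principal point (cx, cy)."""
    X_cam = R @ X_world + t        # world -> camera coordinates
    x, y, z = X_cam
    return (f * x / z + cx, f * y / z + cy)

# Camera at the origin looking down +Z; a virtual label 2 m ahead
# and 0.5 m to the right.
R = np.eye(3)
t = np.zeros(3)
u, v = project_point(np.array([0.5, 0.0, 2.0]), R, t, f=800, cx=320, cy=240)
# The virtual object is drawn at pixel (u, v) in each video frame.
```

Re-estimating R and t every frame and re-projecting is what keeps augmentations "attached" to the physical world as the camera moves.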
Applications
Applications as of 2011
Advertising: Marketers started to use AR to promote products via interactive AR applications. For example, at the 2008 LA Auto Show, Nissan unveiled the concept vehicle Cube and presented visitors with a brochure which, when held against a webcam, showed alternate versions of the vehicle.[18] In August 2009, Best Buy ran a circular with an augmented reality code that allowed users with a webcam to interact with the product in 3D.[19] In 2010 Walt Disney used mobile AR to connect a movie experience to outdoor advertising.[20]
Task support: Complex tasks such as assembly, maintenance, and surgery can be simplified by inserting additional information into the field of view. For example, labels can be displayed on parts of a system to clarify operating instructions for a mechanic who is performing maintenance on the system.[21][22] AR can include images of hidden objects, which can be particularly effective for medical diagnostics or surgery. Examples include a virtual X-ray view based on prior tomography or on real time images from ultrasound and microconfocal probes[23] or open NMR devices. AR can enhance viewing a fetus inside a mother's womb.[24] See also Mixed reality.
Navigation: AR can augment the effectiveness of navigation devices. For example, building navigation can be enhanced to aid in maintaining industrial plants. Outdoor navigation can be augmented for military operations or disaster management. Head-up displays or personal display glasses in automobiles can provide navigation and traffic information. Head-up displays are currently used in fighter jets. These systems include full interactivity, including gaze tracking.
Industrial: AR can be used to compare digital mock-ups with physical mock-ups for efficiently finding discrepancies between them. It can safeguard digital data in combination with existing real prototypes, and thus reduce the number of real prototypes and improve the quality of the final product.
Military and emergency services: Wearable AR can provide information such as instructions, maps, enemy locations, and fire cells.
Prospecting: In hydrology, ecology, and geology, AR can be used to display an interactive analysis of terrain characteristics. Users can collaboratively modify and analyze interactive three-dimensional maps.[citation needed]
Art: AR can help create art in real time that integrates with reality, such as painting, drawing, and modeling. AR art technology has helped disabled individuals continue pursuing their passion.[25]
Architecture: AR can simulate planned construction projects.[26]
Sightseeing: Guides can include labels or text related to the objects/places visited. With AR, users can rebuild ruins, buildings, or even landscapes as they previously existed.[27]
Collaboration: AR can help facilitate collaboration among distributed team members via conferences with real and virtual participants.[28]
Entertainment and education: AR can create virtual objects in museums and exhibitions, theme park attractions,[29] games[30] and books.[31]
Performance: AR can enhance concert and theater performances. For example, artists can allow listeners to augment their listening experience by adding the artists' performance to that of other bands/groups of users.[32][33][34]
Translation: AR systems can provide dynamic subtitles in the user's language.[35][36]
Video games: Consoles and handheld devices such as the PlayStation 3, PlayStation Portable, Nintendo 3DS, and smartphones can produce some form of augmented reality with the proper software. Games use cameras, either sold separately or built into the machines, and fiduciary markers or MEMS sensors to register the camera's position and/or direction in the game engine's coordinate system. For example, The Eye of Judgment uses special cards as markers, producing a virtual monster on top of the marker in the video image. Other video games, such as Zombie ShootAR, utilize the camera's live stream for optical tracking and 3D rendering; the software thus recognizes the environment and integrates (3D) content into live video streams.
Potential applications
Possible extensions include:
- Devices: Create new applications that are physically impossible in "real" hardware, such as 3D objects interactively changing their shape and appearance based on the current task or need.
- Multi-screen simulation: Display multiple application windows as virtual monitors in real space and switch among them with gestures and/or redirecting head and eyes. A single pair of glasses could "surround" a user with application windows.
- Holodecks: Enhanced media applications, like pseudo holographic virtual screens and virtual surround cinema.
- Automotive: eye-dialing, navigation arrows on roadways
- "X-ray vision"
- Furnishings: plants, wallpapers, panoramic views, artwork, decorations, posters, illumination etc. For example, a virtual window could show a live feed of a camera placed on the exterior of the building, thus allowing the user to toggle a wall's transparency.
- Public displays: Window dressings, traffic signs, Christmas decorations, advertisements.
- Gadgets: Clock, radio, PC, arrival/departure board at an airport, stock ticker, PDA, PMP, informational posters/fliers/billboards.
- Group-specific feeds: For example, a construction manager could display instructions including diagrams at specific locations. Patrons at a public event could subscribe to a feed of directions and/or program notes.
- Speech synthesis: Render location/context-specific information via spoken words.
Notable researchers
- Ivan Sutherland invented the first AR head mounted display at Harvard University.
- Steven Feiner, Professor at Columbia University, is a leading pioneer of augmented reality, and author of the first paper on an AR system prototype, KARMA (the Knowledge-based Augmented Reality Maintenance Assistant), along with Blair MacIntyre and Doree Seligmann.[37]
- L.B. Rosenberg developed one of the first known AR systems, called Virtual Fixtures, while working at the U.S. Air Force Armstrong Labs in 1991, and published the first study of how an AR system can enhance human performance.[8][9]
- Mark Billinghurst and Daniel Wagner jump started the field of AR on mobile phones. They developed the first marker tracking systems for mobile phones and PDAs.[38]
- Bruce H. Thomas and Wayne Piekarski developed the Tinmith system in 1998.[39] They, along with Steve Feiner and his MARS system, pioneered outdoor augmented reality.
Conferences
- 1st International Workshop on Augmented Reality, IWAR'98, San Francisco, Nov. 1998.
- 2nd International Workshop on Augmented Reality (IWAR'99), San Francisco, Oct. 1999.
- 1st International Symposium on Mixed Reality (ISMR'99), Yokohama, Japan, March 1999.
- 2nd International Symposium on Mixed Reality (ISMR'01), Yokohama, Japan, March 2001.
- 1st International Symposium on Augmented Reality (ISAR 2000), Munich, Oct. 2000.
- 2nd International Symposium on Augmented Reality (ISAR 2001), New York, Oct. 2001.
- 1st International Symposium on Mixed and Augmented Reality (ISMAR 2002), Darmstadt, Oct. 2002.
- 2nd International Symposium on Mixed and Augmented Reality (ISMAR 2003), Tokyo, Oct. 2003.
- 3rd International Symposium on Mixed and Augmented Reality (ISMAR 2004), Arlington, VA, Nov. 2004.
- 4th International Symposium on Mixed and Augmented Reality (ISMAR 2005), Vienna, Oct. 2005.
- 5th International Symposium on Mixed and Augmented Reality (ISMAR 2006) Santa Barbara, Oct. 2006.
- 6th International Symposium on Mixed and Augmented Reality (ISMAR 2007) Nara, Japan, Nov. 2007.
- 7th International Symposium on Mixed and Augmented Reality (ISMAR 2008) Cambridge, United Kingdom, Sep. 2008.
- 8th International Symposium on Mixed and Augmented Reality (ISMAR 2009) Orlando, Oct. 2009.
- Augmented Reality Developer Camp (AR DevCamp) in Mountain View, Dec. 2009.
- insideAR Augmented Reality Conference (insideAR 2010) Munich, Sept. 2010.
- 9th International Symposium on Mixed and Augmented Reality (ISMAR 2010) Seoul, Korea, Oct. 2010.
- 10th International Symposium on Mixed and Augmented Reality (ISMAR 2011) Basel, Switzerland Oct. 2011
Software
Free software
- ARToolKit is a cross-platform C-library for the creation of augmented reality applications. It was ported to many different languages and platforms like Android, Flash or Silverlight. It is also very widely used in augmented reality related projects.
- mixare - Open-source (GPLv3) augmented reality engine for Android. It works as a completely autonomous application and can also be used as a basis for custom implementations.
- OpenMAR - Open Mobile Augmented Reality component framework for the Symbian platform, released under EPL
- ArUco - a minimal library for Augmented Reality applications based on OpenCv.
Books
- Woodrow Barfield, and Thomas Caudell, eds. Fundamentals of Wearable Computers and Augmented Reality. Mahwah, NJ: Lawrence Erlbaum, 2001. ISBN 0-8058-2901-6.
- Oliver Bimber and Ramesh Raskar. Spatial Augmented Reality: Merging Real and Virtual Worlds. A K Peters, 2005. ISBN 1-56881-230-2.
- Michael Haller, Mark Billinghurst and Bruce H. Thomas. Emerging Technologies of Augmented Reality: Interfaces and Design. Idea Group Publishing, 2006. ISBN 1-59904-066-2 , publisher listing
- Rolf R. Hainich. "The end of Hardware : A Novel Approach to Augmented Reality" 2nd ed.: Booksurge, 2006. ISBN 1-4196-5218-4. 3rd ed. ("Augmented Reality and Beyond"): Booksurge, 2009, ISBN 1-4392-3602-X.
- Stephen Cawood and Mark Fiala. Augmented Reality: A Practical Guide, 2008, ISBN 1-934356-03-4
In popular culture
Television & film
- The television series Dennō Coil depicts a near-future where children use AR glasses to enhance their environment with games and virtual pets.
- In the Terminator movie series, all Terminator models, beginning with the T-800 series, use augmented reality systems to "see".
- The television series Firefly depicts numerous AR applications, including a real-time medical scanner which allows a doctor to use his hands to manipulate a detailed and labeled projection of a patient's brain.
- In the 1993 ABC miniseries Wild Palms, a Scientology-like organization used holographic projectors to overlay virtual reality images over physical reality.
- In the movie Iron Man, Tony Stark (Robert Downey Jr.) uses an augmented reality system to design his super-powered suit. The suit itself also uses augmented reality technology.
- In the Philippines, during their first automated elections (2010), ABS-CBN News and Current Affairs used augmented reality during the counting of votes for all National and Local Candidates and in delivering news reports. ABS-CBN still uses augmented reality in its TV Patrol news programs.
- In Minority Report Tom Cruise stands in front of a supercomputer using AR technology.
- NBC's School Pride debuts the augmented reality product Letters alive at the Communication & Media Arts High School in Detroit, Michigan.[40]
Literature
- The books Halting State by Charles Stross and Rainbows End by Vernor Vinge include augmented reality primarily in the form of virtual overlays over the real world. Halting State mentions Copspace, which is used by cops, and the use by gamers to overlay their characters onto themselves during a gaming convention. Rainbows End mentions outdoor overlays based on popular fictional universes from H. P. Lovecraft and Terry Pratchett among others.
- The term "Geohacking" has been coined by William Gibson in his book Spook Country, where artists use a combination of GPS and 3D graphics technology to embed rendered meshes in real world landscapes.
- In The Risen Empire, by Scott Westerfeld, most, if not all, people have their own "synesthesia": an AR menu unique to the user that is projected in front of them, though each person can see only their own synesthesia menu. It is controlled by hand gestures, blink patterns, gaze direction, clicks of the tongue, etc.
- In the Greg Egan novel Distress, the 'Witness' software used to record sights and sounds experienced by the user can be set-up to scan what the user is seeing and highlight people the user is looking out for.
- In Alastair Reynolds' Revelation Space series of novels, characters frequently employ "Entoptics", essentially a highly developed form of augmented reality that can go so far as to entirely substitute for natural perception.
- In The California Voodoo Game by Larry Niven and Steve Barnes, the game's players use LCD displays, part of what the book calls Dreamtime technology, to add virtual overlays to the real world.
Games
- The tabletop role-playing game Shadowrun introduced AR into its game world. Most characters in the game use viewing devices to interact with the AR world.
- Cybergeneration, a table top role-playing game by R. Talsorian, includes "virtuality", an augmented reality created through v-trodes, cheap, widely available devices people wear at their temples.
- In the video game Heavy Rain, Norman Jayden, an FBI profiler, possesses a set of experimental augmented reality glasses called an "Added Reality Interface", or ARI. It allows him to rapidly investigate crime scenes and analyze evidence.
- In Dead Space the RIG worn by Isaac Clarke is thoroughly equipped with augmented reality technology, including a navigation system that projects a line along the best route to his destination, and a system that displays images, video and text in front of him.
See also
- Alternate reality game
- ARQuake
- Augmented browsing
- Augmented virtuality
- Augmented Reality-based testing
- Bionic contact lens
- Brain in a vat
- Camera resectioning
- Computer-mediated reality
- Cyborg
- Head-mounted display
- Mixed reality
- Mediated reality
- Simulated reality
- Viractualism
- Virtual retinal display
- Virtuality Continuum
- Virtual reality
- Wearable computer
References
- ^ "The interactive system is no longer a precise location, but the whole environment; interaction is no longer simply a face-to-screen exchange, but dissolves itself in the surrounding space and objects. Using an information system is no longer exclusively a conscious and intentional act."Brian X. Chen (2009-08-25). "If You're Not Seeing Data, You're Not Seeing". Wired Magazine. Retrieved 2009-08-26.
- ^ a b R. Azuma, "A Survey of Augmented Reality", Presence: Teleoperators and Virtual Environments, pp. 355–385, August 1997.
- ^ P. Milgram and F. Kishino, "A Taxonomy of Mixed Reality Visual Displays", IEICE Transactions on Information and Systems, E77-D(12), pp. 1321-1329, 1994.
- ^ Mediated Reality with implementations for everyday life, 2002 August 6th, Presence Connect, the on line companion to the MIT Press journal PRESENCE: Teleoperators and Virtual Environments, MIT Press
- ^ "F-35 Distributed Aperture System EO DAS." Youtube.com. Retrieved: 07 October 2010.
- ^ http://www.google.com/patents?q=3050870
- ^ Tom Caudell
- ^ a b L. B. Rosenberg. The Use of Virtual Fixtures As Perceptual Overlays to Enhance Operator Performance in Remote Environments. Technical Report AL-TR-0089, USAF Armstrong Laboratory, Wright-Patterson AFB OH, 1992.
- ^ a b L. B. Rosenberg, "The Use of Virtual Fixtures to Enhance Operator Performance in Telepresence Environments" SPIE Telemanipulator Technology, 1993.
- ^ a b c Ramesh Raskar, Greg Welch, Henry Fuchs Spatially Augmented Reality, First International Workshop on Augmented Reality, Sept 1998
- ^ Wikitude AR Travel Guide
- ^ Saqoosha
- ^ The most common products employed are as follows: MicroVision Nomad, Sony Glasstron, Vuzix AR and I/O Displays. Vuzix AR
- ^ David Drascic of the University of Toronto is a developer of ARGOS: A Display System for Augmenting Reality. David also has a number of AR related papers on line, accessible from his home page.
- ^ Augmented reality brings maps to life July 19, 2005
- ^ Stationary systems can employ 6DOF track systems such as Polhemus, ViCON, A.R.T, or Ascension.
- ^ Tinmith
- ^ company website
- ^ Vlad Savov. "Best Buy goes 3D, even augmented reality isn't safe from advertising".
- ^ AR at Disney
- ^ The big idea:Augmented Reality
- ^ Steve Henderson, Steven Feiner. "ARMAR:Augmented Reality for Maintenance and Repair (ARMAR)". Retrieved 2010-01-06.
- ^ Peter Mountney, Stamatia Giannarou, Daniel Elson and Guang-Zhong Yang. "Optical Biopsy Mapping for Minimally Invasive Cancer Screening". In proc. MICCAI (1), 2009, pp. 483-490. Retrieved 2010-07-07.
- ^ "UNC Ultrasound/Medical Augmented Reality Research". Retrieved 2010-01-06.
- ^ One such example of this phenomenon is called Eyewriter that was developed in 2009 by Zachary Lieberman and a group formed by members of Free Art and Technology (FAT), OpenFrameworks and the Graffiti Research Lab to help a graffiti artist, who became paralyzed, draw again. Zachary Lieberman. "THE EYEWRITER". Retrieved 2010-04-27.
- ^ Anish Tripathi. "AUGMENTED REALITY : AN APPLICATION FOR ARCHITECTURE". Retrieved 2010-01-06.
- ^ Patrick Dähne, John N. Karigiannis. "Archeoguide: System Architecture of a Mobile Outdoor Augmented Reality System". Retrieved 2010-01-06.
- ^ The Hand of God is a good example of a collaboration system. Aaron Stafford, Wayne Piekarski, and Bruce H. Thomas. "Hand of God". Retrieved 2009-12-18.
- ^ Theme park attraction: Cadbury World
- ^ ARQuake
- ^ Jose Fermoso. "Make Books 'Pop' With New Augmented Reality Tech". Retrieved 2010-10-01.
- ^ Pop group Duran Duran included interactive AR projections into their stage show during their 2000 Pop Trash concert tour. Pair, J., Wilson, J., Chastine, J., Gandy, M. "The Duran Duran Project: The Augmented Reality Toolkit in Live Performance". The First IEEE International Augmented Reality Toolkit Workshop, 2002. (photos and video)
- ^ Sydney band Lost Valentinos launched the world's first interactive AR music video on 16 October 2009, where users could print out 5 markers representing a pre-recorded performance from each band member, interact with them live and in real time via their computer webcam, and record their own unique music video clips to share via YouTube. Gizmodo: Sydney Band Uses Augmented Reality For Video Clip
- ^ cnet: Augmented reality in Aussie film clip
- ^ The iPhone application Word Lens injects subtitles in the desired language into video. [1] Alexia Tsotsis, "Word Lens Translates Words Inside of Images. Yes Really.", TechCrunch (December 16, 2010)
- ^ [2] N.B. "Word Lens: This changes everything" The Economist: Gulliver blog (December 18, 2010)
- ^ "Knowledge-based augmented reality". ACM. July, 1993.
{{cite web}}
: Check date values in:|date=
(help) - ^ Wagner, Daniel (September 29, 2009). "First Steps Towards Handheld Augmented Reality". ACM. Retrieved 2009-09-29.
- ^ Piekarski, Wayne and Thomas, Bruce (October 1, 2001). "Tinmith-Metro: New Outdoor Techniques for Creating City Models with an Augmented Reality Wearable Computer". IEEE. http://www.computer.org/portal/web/csdl/doi/10.1109/ISWC.2001.962093
- ^ Dybis, Karen (August 6, 2010). "Some Genuine Detroit 'School Pride'". Time. Retrieved October 7, 2010.
Media related to Augmented reality at Wikimedia Commons