The reflectance field over a human face was first captured in 1999 by Paul Debevec et al. and presented at SIGGRAPH 2000. Their method for isolating the light that travels under the skin relied on the established observation that light reflecting off the air-to-oil interface of the skin retains its polarization, while light that travels under the skin loses its polarization.
Using this information, a light stage was built by Debevec et al., consisting of
- Moveable digital camera
- Moveable simple light source (full rotation with adjustable radius and height)
- Two polarizers set into various angles in front of the light and the camera
- A computer running relatively simple programs

The setup enabled the team to find the subsurface-scattering component of the bidirectional scattering distribution function (BSDF) over the human face, which was required for fully virtual cinematography with ultra-photorealistic digital look-alikes, similar to effects seen in the films The Matrix Reloaded, The Matrix Revolutions and others since the early 2000s.
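The polarization principle behind the setup can be sketched numerically. Given one image taken with the camera's polarizer parallel to the light's and one taken with the polarizers crossed, the polarization-preserving surface (specular) reflection and the depolarized subsurface component can be separated by simple arithmetic. This is an illustrative NumPy sketch of the general technique, not the team's actual pipeline; the function name is hypothetical.

```python
import numpy as np

def separate_reflectance(parallel, cross):
    """Polarization-difference separation (illustrative sketch).

    parallel: image with camera polarizer parallel to the light's polarizer
    cross:    image with the two polarizers crossed

    Specular (surface) reflection keeps its polarization, so it survives in
    the parallel image but is blocked in the crossed one. Subsurface
    scattering depolarizes the light, so half of it passes either polarizer
    and it appears equally in both images.
    """
    parallel = np.asarray(parallel, dtype=np.float64)
    cross = np.asarray(cross, dtype=np.float64)
    diffuse = 2.0 * cross            # depolarized light: half reaches the sensor
    specular = parallel - cross      # polarized surface reflection remains
    return np.clip(specular, 0.0, None), diffuse
```

Under this model the cross-polarized image records roughly half the subsurface light and none of the surface reflection, which is why subtracting it from the parallel image leaves the specular term.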
Following this scientific success, Debevec et al. constructed more elaborate versions of the light stage at the University of Southern California's Institute for Creative Technologies (USC ICT). Ghosh et al. built the seventh version, the USC light stage X. In 2014, President Barack Obama had his image and reflectance captured with the USC mobile light stage.
Examples of use
- Human image synthesis whose results are hard to tell apart from images of a real human captured with an imaging technology
- Digital Emily, presented at SIGGRAPH 2008, was a project in which the reflectance field of actress Emily O'Brien was captured using the USC light stage 5, and a prerendered digital look-alike was made in association with Image Metrics. Footage came from USC light stages 5 and 6.
- Digital Ira was a fairly convincing real-time rendering presented at SIGGRAPH 2013 in association with Activision. While Digital Emily was a pre-computed simulation, Digital Ira ran in real time and still looked fairly realistic. The field is rapidly moving from movies to computer games and leisure applications. Footage includes the USC light stage X.
- The Presidential Portrait, by USC ICT in conjunction with the Smithsonian Institution, was produced using the latest mobile light stage. It included texture, feature and reflectance capture with a high-resolution multi-camera setup as well as additional handheld scanners. A 3D-printed bust of the President was also produced.
- ESPER LightCage is a geodesic frame with cross-polarized lights that can be programmed to create realistic lighting conditions for capturing images suitable for processing into high-resolution 3D meshes. It allows users to emulate the pioneering techniques developed by Paul Debevec and the USC ICT for generating diffuse maps, normal maps, specular maps and subsurface-scattering maps.
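Normal-map generation from images lit from multiple known directions, as mentioned above, is classically done with photometric stereo. The following NumPy sketch shows the simplest Lambertian variant as an illustration of the idea; it is a generic textbook method, not the ESPER or USC pipeline, and the function name is hypothetical.

```python
import numpy as np

def photometric_stereo(images, light_dirs):
    """Estimate per-pixel surface normals and albedo from grayscale images
    lit from known directions (classic Lambertian photometric stereo).

    images:     array of shape (k, h, w), one image per light direction
    light_dirs: array of shape (k, 3), unit light direction vectors

    Solves, per pixel, the least-squares system L @ g = I, where
    g = albedo * normal, then splits g into albedo and unit normal.
    """
    k, h, w = images.shape
    I = images.reshape(k, -1)                               # (k, h*w)
    G, *_ = np.linalg.lstsq(light_dirs, I, rcond=None)      # (3, h*w)
    albedo = np.linalg.norm(G, axis=0)
    normals = (G / np.maximum(albedo, 1e-12)).T.reshape(h, w, 3)
    return normals, albedo.reshape(h, w)
```

With at least three non-coplanar light directions the per-pixel system is fully determined; a light stage provides many more directions, which makes the least-squares fit robust to noise and shadowed samples.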
- Debevec, Paul; Tim Hawkins; Chris Tchou; Haarm-Pieter Duiker; Westley Sarokin; Mark Sagar (2000). "Acquiring the reflectance field of a human face". Proceedings of the 27th annual conference on Computer graphics and interactive techniques - SIGGRAPH '00. ACM. pp. 145–156. doi:10.1145/344779.344855. ISBN 1581132085.
- "Scanning and printing a 3D portrait of President Barack Obama". University of Southern California. 2013. Retrieved 2015-11-04.
- Paul Debevec animates a photo-real digital face - Digital Emily 2008
- Debevec, Paul (2013). "Digital Ira - A real-time animatable face demonstration". University of Southern California. Retrieved 2013-08-10.
- "LightCage | Photogrammetry light stage system for 3D face scanning". ESPER. Retrieved 2019-02-21.