Google Camera
Developer(s) | Google, Google Research
---|---
Initial release | April 16, 2014
Stable release | 6.2.024 / March 29, 2019
Operating system | Android
Type | Camera
License | Proprietary
Google Camera (Gcam) is a camera application developed by Google for Android. Development began in 2011 at X (then Google X), led by Marc Levoy, who developed image fusion technology for Google Glass.[1] It was initially supported on all devices running Android 4.4 KitKat and higher, but is now only officially supported on Google's Pixel devices. It was publicly released for Android 4.4+ on the Google Play Store on April 16, 2014,[2] and removed from public view on February 17, 2016.[3]
Features
Google Camera contains a number of features that can be activated either in the Settings page or on the row of icons at the top of the app.
Pixel Visual/Neural Core
Starting with Pixel devices, the camera app has been aided with hardware accelerators to perform its image processing. The first generation of Pixel phones used Qualcomm's Hexagon DSPs and Adreno GPUs to accelerate image processing. The Pixel 2 and Pixel 3 (but not the Pixel 3a) include the Pixel Visual Core to aid with image processing. The Pixel 4 introduced the Pixel Neural Core.[4]
HDR+
Unlike traditional High-dynamic-range (HDR) imaging, HDR+ (also known as HDR+ on) uses computational photography techniques to achieve higher dynamic range. HDR+ takes continuous burst shots with short exposures. When the shutter is pressed, the last 5–15 frames are analysed to pick the sharpest shots (using lucky imaging), which are selectively aligned and combined with image averaging. HDR+ also uses semantic segmentation to detect faces, which are brightened using synthetic fill flash, and to darken and denoise skies. HDR+ reduces noise and improves colors, while avoiding blown-out highlights and motion blur. HDR+ was introduced on the Nexus 6 and brought back to the Nexus 5.[5][6][7]
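The align-and-merge idea behind HDR+ can be illustrated with a short sketch: score each burst frame for sharpness, keep the best ones, and average them to reduce noise. This is a simplified Python illustration (assuming grayscale frames as NumPy arrays), not Google's implementation, which also performs tile-based alignment and robust merging.

```python
import numpy as np

def sharpness(frame: np.ndarray) -> float:
    # Rough sharpness score: variance of image gradients (lucky-imaging proxy).
    gy, gx = np.gradient(frame.astype(np.float32))
    return float(np.var(gx) + np.var(gy))

def merge_burst(frames: list, keep: int = 5) -> np.ndarray:
    # Keep the sharpest `keep` frames and average them; real HDR+ also aligns
    # tiles between frames before merging, which is omitted here.
    ranked = sorted(frames, key=sharpness, reverse=True)
    stack = np.stack(ranked[:keep]).astype(np.float32)
    return stack.mean(axis=0)  # averaging reduces noise roughly by sqrt(keep)
```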
HDR+ enhanced
Unlike HDR+/HDR+ On, 'HDR+ enhanced' mode does not feature Zero Shutter Lag (ZSL). Like Night Sight, HDR+ enhanced features positive shutter lag (PSL): it captures images after the shutter is pressed. HDR+ enhanced is similar to HDR+ from the Nexus 5, Nexus 6, Nexus 5X and Nexus 6P. It is believed to use underexposed and overexposed frames like Apple's Smart HDR. HDR+ enhanced captures increase the dynamic range compared to HDR+ on. HDR+ enhanced on the Pixel 3 uses the learning-based AWB algorithm from Night Sight.[8][9]
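The ZSL/PSL distinction comes down to when frames are captured relative to the shutter press. A minimal sketch of the two behaviours, using a hypothetical ring buffer of viewfinder frames (class and method names are illustrative only):

```python
from collections import deque

class BurstCapture:
    """Illustrative contrast between zero and positive shutter lag."""

    def __init__(self, buffer_size: int = 15):
        # Viewfinder frames are continuously pushed into a ring buffer.
        self.ring = deque(maxlen=buffer_size)

    def on_viewfinder_frame(self, frame):
        self.ring.append(frame)

    def shutter_zsl(self):
        # Zero shutter lag: reuse frames captured before the press.
        return list(self.ring)

    def shutter_psl(self, capture_frame, n: int = 9):
        # Positive shutter lag: start capturing after the press, e.g. longer
        # or bracketed exposures as in HDR+ enhanced and Night Sight.
        return [capture_frame() for _ in range(n)]
```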
Live HDR+
Starting with the Pixel 4, Live HDR+ replaced HDR+ on, featuring a WYSIWYG viewfinder with a real-time preview of HDR+. Live HDR+ uses the learning-based AWB algorithm from Night Sight and averages up to 9 underexposed pictures.[10][11]
Dual Exposure Controls
'Live HDR+' mode features Dual Exposure Controls, which provide a slider for brightness (capture exposure) and a slider for shadows (tone mapping).[10][11]
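How two such sliders might act on an image can be sketched as follows: the brightness slider scales overall exposure, while the shadows slider lifts only the darker tones. The gain-plus-gamma mapping below is a generic illustration, not Google's tone-mapping pipeline.

```python
import numpy as np

def apply_dual_exposure(img: np.ndarray, brightness: float, shadows: float) -> np.ndarray:
    # img: float image in [0, 1]; brightness: exposure gain in stops;
    # shadows: 0 leaves shadows alone, 1 lifts them strongly.
    out = img * (2.0 ** brightness)           # capture-exposure style gain
    gamma = 1.0 / (1.0 + shadows)             # gamma < 1 brightens dark tones most
    return np.clip(out, 0.0, 1.0) ** gamma    # simple global tone-mapping curve
```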
Motion Photos
Google Camera's Motion photo mode is similar to HTC's Zoe and iOS's Live Photos. When enabled, a short, silent video clip of relatively low resolution is paired with the original photo. Motion Photos was introduced on the Pixel 2. Motion Photos is disabled in HDR+ enhanced mode.[12][13][14]
Video Stabilization
Fused Video Stabilization, a technique that combines Optical Image Stabilization (OIS) and Electronic/Digital Image Stabilization (EIS), can be enabled for significantly smoother video. This technique also corrects rolling shutter distortion and focus breathing, among various other problems. Fused Video Stabilization was introduced on the Pixel 2.[10][15]
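Conceptually, the electronic half of the technique estimates the camera's motion path, smooths it, and warps each frame toward the smoothed path. A minimal, translation-only sketch of that idea (the real pipeline also fuses gyroscope and OIS data and corrects rolling shutter):

```python
import numpy as np

def smooth_trajectory(shifts: np.ndarray, window: int = 15) -> np.ndarray:
    # shifts: (N, 2) per-frame camera translations (dx, dy).
    traj = np.cumsum(shifts, axis=0)                  # camera path over time
    kernel = np.ones(window) / window
    return np.stack([np.convolve(traj[:, i], kernel, mode="same")
                     for i in range(2)], axis=1)      # moving-average smoothing

def stabilizing_offsets(shifts: np.ndarray) -> np.ndarray:
    # Per-frame correction (applied as a crop/warp) so the rendered path
    # follows the smoothed trajectory instead of the shaky one.
    return smooth_trajectory(shifts) - np.cumsum(shifts, axis=0)
```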
Super Res Zoom
Super Res Zoom is a multi-frame super-resolution technique introduced with the Pixel 3 that shifts the image sensor to achieve higher resolution, which Google claims is equivalent to 2-3x optical zoom. It is similar to drizzle image processing. Super Res Zoom can also be used with the telephoto lens; for example, Google claims the Pixel 4 can capture 8x zoom at near-optical quality.[16][17]
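The shift-and-add idea behind multi-frame super-resolution (and drizzle) can be sketched as follows: frames captured with known sub-pixel offsets are accumulated onto a finer output grid and averaged. The nearest-neighbour accumulation below is a simplification, not Google's algorithm, which uses robust kernel-based merging.

```python
import numpy as np

def shift_and_add(frames, offsets, scale: int = 2) -> np.ndarray:
    # frames: list of (H, W) arrays; offsets: matching (dy, dx) sub-pixel
    # shifts in input pixels. Samples are accumulated onto a scale-x finer grid.
    h, w = frames[0].shape
    acc = np.zeros((h * scale, w * scale), dtype=np.float64)
    weight = np.zeros_like(acc)
    for frame, (dy, dx) in zip(frames, offsets):
        yi = np.clip(np.round((np.arange(h)[:, None] + dy) * scale).astype(int), 0, h * scale - 1)
        xi = np.clip(np.round((np.arange(w)[None, :] + dx) * scale).astype(int), 0, w * scale - 1)
        np.add.at(acc, (yi, xi), frame)       # scatter samples onto the fine grid
        np.add.at(weight, (yi, xi), 1.0)
    return acc / np.maximum(weight, 1e-6)     # average wherever samples landed
```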
Smartburst
Smartburst is activated by holding the shutter button down. Whilst the button is held down, up to 10 shots per second are captured. Once released, the best pictures captured are automatically highlighted.
Different 'creations' can be produced from the captured pictures:
- Moving GIF - an animated GIF to capture action or images containing a high amount of movement.
- 'All-smile' - a single photo in which everyone is smiling and not blinking; produced by taking different parts of every photo in the burst.
- Collage - when taking 'selfies', a collage similar to that of a Photo booth is generated.
Top Shot
When Motion Photos is enabled, Top Shot analyzes up to 90 additional frames from 1.5 seconds before and after the shutter is pressed. The Pixel Visual Core is used to accelerate the analysis, and frames are ranked using computer vision techniques based on object motion, motion blur, auto exposure, auto focus, and auto white balance. About 10 additional photos are saved, including an additional HDR+ photo of up to 3 MP. Top Shot was introduced on the Pixel 3.[18]
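The ranking step can be sketched as scoring each candidate frame on a few quality signals and keeping the top-scoring ones. The signals and weights below are illustrative placeholders, not the model Google describes.

```python
import numpy as np

def frame_score(frame: np.ndarray, prev: np.ndarray) -> float:
    # Toy quality score for a grayscale frame: sharper (less motion blur)
    # and stiller (less frame-to-frame difference) frames score higher.
    gy, gx = np.gradient(frame.astype(np.float32))
    sharpness = np.var(gx) + np.var(gy)
    motion = np.mean(np.abs(frame.astype(np.float32) - prev.astype(np.float32)))
    return float(sharpness - 0.5 * motion)

def top_shots(frames, keep: int = 10):
    scores = [frame_score(f, frames[max(i - 1, 0)]) for i, f in enumerate(frames)]
    order = np.argsort(scores)[::-1][:keep]
    return [frames[i] for i in order]
```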
Other features
- Computational Raw - Google Camera supports capturing JPEG and DNG files simultaneously. The DNG files are also processed with Google's HDR+ computational photography. Computational Raw was introduced on the Pixel 3.[7]
- Motion Auto Focus - maintains focus on any subject/object in the frame. Motion Auto Focus was introduced on the Pixel 3.[19]
- Frequent Faces - allows the camera to remember faces. The camera will try to ensure those faces are in focus, smiling and not blinking.[17]
- Location - location information obtained via GPS and/or Google's location service can be added to pictures and videos when enabled.
Functions
Like most other camera applications, Google Camera offers different 'functions' or 'modes', allowing the user to take different types of photo or video.[20]
Slow Motion
Slow motion video can be captured in Google Camera at either 120 or, on supported devices, 240 frames per second.[21]
Panorama
Panoramic photography is also possible with Google Camera. Four types of panoramic photo are supported: horizontal, vertical, wide-angle and fisheye. Once the Panorama function is selected, one of these four modes can be selected at a time from a row of icons at the top of the screen.[22]
Photo Sphere
Google Camera allows the user to create a 'Photo Sphere', a 360-degree panorama photo, originally added in Android 4.2 in 2012.[23] These photos can then be embedded in a web page with custom HTML code or uploaded to various Google services.[24]
Portrait
Portrait mode offers an easy way for users to take 'selfies' or portraits with a Bokeh effect, in which the subject of the photo is in focus and the background is slightly blurred. This effect is achieved via the information from dual-pixel sensors when available (such as the Pixel 2 and Pixel 3), and the application of machine learning to identify what should be kept in focus and what should be blurred out. Portrait mode was introduced on the Pixel 2.[25][26][27]
Additionally, a "face retouching" feature can be activated which cleans up blemishes and other imperfections from the subject's skin.[28]
The Pixel 4 featured an improved Portrait mode that utilises the telephoto lens for more accurate depth maps; the blur effect is applied at the raw stage, before tone mapping, for a more realistic bokeh effect.[11]
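The synthetic-blur step can be sketched as follows: given a depth map (or segmentation mask), keep pixels near the subject's depth sharp and blend in a blurred background elsewhere. The Gaussian blur and hard threshold below are simplifications standing in for the learned depth estimation and disc-shaped blur of the real pipeline.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def synthetic_bokeh(img: np.ndarray, depth: np.ndarray,
                    subject_depth: float, tolerance: float = 0.1) -> np.ndarray:
    # img: (H, W, 3) float image; depth: (H, W) normalized depth map.
    # Pixels within `tolerance` of subject_depth stay sharp; the rest are blurred.
    blurred = np.stack([gaussian_filter(img[..., c], sigma=8) for c in range(3)], axis=-1)
    mask = (np.abs(depth - subject_depth) < tolerance).astype(np.float32)[..., None]
    return mask * img + (1.0 - mask) * blurred
```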
Playground
In late 2017, with the debut of the Pixel 2 and Pixel 2 XL, Google introduced AR Stickers, a feature that, using Google's new ARCore platform, allowed the user to superimpose augmented reality animated objects on their photos and videos. With the release of the Pixel 3, AR Stickers was rebranded to Playground.[29][30]
Google Lens
The camera offers functionality powered by Google Lens, which allows it to copy text it sees; identify products, books and movies and search for similar ones; identify animals and plants; and scan barcodes and QR codes, among other things.
Photobooth
The Photobooth mode allows the user to automate the capture of selfies. The AI is able to detect the user's smile or funny faces and shoot the picture at the best time without any action from the user, similar to Google Clips. This mode also features two-level AI processing of the subject's face, which can be enabled or disabled in order to soften its skin. Motion Photos functionality is also available in this mode. The white balance is also adjustable to defined presets.[31]
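The automatic trigger can be sketched with off-the-shelf face and smile detectors: capture whenever a detected face contains a smile. The OpenCV Haar cascades used here are a generic stand-in, not Google's on-device model.

```python
import cv2

# Standard Haar cascades bundled with the opencv-python package.
face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_smile.xml")

def should_capture(frame_bgr) -> bool:
    # Return True when at least one detected face contains a smile.
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        face_region = gray[y:y + h, x:x + w]
        if len(smile_cascade.detectMultiScale(face_region, 1.7, 20)) > 0:
            return True
    return False
```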
Night Sight
Night Sight is based on a similar principle to exposure stacking, used in astrophotography. Night Sight uses modified HDR+ or Super Res Zoom algorithms. Once the user presses the trigger, multiple long-exposure shots are taken, up to 15 exposures of 1/15 second or 6 exposures of 1 second, to create up to a 6-second exposure. Motion metering and tile-based processing of the image reduce, if not cancel, the effect of the user's motion and hand shake, resulting in a clear and properly exposed shot. Google claims it can handle up to ~8% displacement from frame to frame, and each frame is broken into around 12,000 tiles. Night Sight also introduced a learning-based AWB algorithm for more accurate white balance in low light.[32][33][11]
Night Sight also works well in daylight, improving white balance, detail and sharpness. Like HDR+ enhanced, Night Sight features positive shutter lag (PSL). Night Sight also supports a delay timer as well as an assisted selector for the focus, featuring three options (far, close and auto-focus). Night Sight was introduced with the Pixel 3; all older Pixel phones were updated with support.[34][35][36][37]
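The trade-off behind motion metering can be sketched as an exposure schedule: the more motion is detected, the shorter each frame's exposure and the more frames are captured. The limits follow the figures quoted above (15 × 1/15 s up to 6 × 1 s); the thresholds and interpolation are illustrative, not Google's.

```python
def night_sight_schedule(motion_level: float):
    # motion_level in [0, 1]: 0 = tripod-still scene, 1 = lots of motion.
    # Returns (num_frames, per_frame_exposure_seconds).
    if motion_level < 0.1:          # very still: a few long exposures
        return 6, 1.0               # 6 x 1 s -> 6 s total
    if motion_level > 0.8:          # lots of motion: many short exposures
        return 15, 1.0 / 15.0       # 15 x 1/15 s -> 1 s total
    exposure = 1.0 - 0.9 * motion_level          # shorten exposure as motion grows
    return min(15, int(6.0 / exposure)), exposure
```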
Astrophotography
Astrophotography mode averages up to 15 exposures of 16 seconds each to create a 4-minute exposure. It activates automatically when in Night Sight mode. Astrophotography mode includes improved algorithms to remove hot pixels. It was introduced with the Pixel 4, and brought back to the Pixel 3 and Pixel 3a.[10][38][11]
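Hot-pixel suppression on long exposures is commonly done by comparing each pixel with its local median and replacing clear outliers. The sketch below shows that generic technique; it is not Google's specific algorithm.

```python
import numpy as np
from scipy.ndimage import median_filter

def suppress_hot_pixels(img: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    # img: (H, W) float image. Pixels far brighter than their 3x3
    # neighbourhood median are treated as hot and replaced by that median.
    med = median_filter(img, size=3)
    hot = (img - med) > threshold
    out = img.copy()
    out[hot] = med[hot]
    return out
```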
Unofficial ports
While Google Camera is made solely for specific Google hardware, many developers have released unofficial ports that backport its newest features to older Google phones, and even to phones from other brands, sometimes enabling in-development features not yet enabled in the official app.
For example, in 2016, a modified version brought HDR+ with Zero Shutter Lag (ZSL) back to the Nexus 5X and Nexus 6P.[39] In mid-2017, a modified version of Google Camera was created for any smartphone equipped with a Snapdragon 820, 821 or 835 processor.[40] In 2018, developers released modified versions enabling Night Sight on non-Pixel phones.[41]
References
- ^ X, The Team at (2017-06-06). "Meet Gcam: The X graduate that gave us a whole new point of view". Medium. Retrieved 2019-10-15.
- ^ Kellex (16 April 2014). "Google Camera Quick Look and Tour". Droid Life.
- ^ http://www.appbrain.com/app/com.google.android.GoogleCamera
- ^ "Introducing the HDR+ Burst Photography Dataset". Google AI Blog. Retrieved 2019-08-02.
- ^ Shankland, Stephen (October 21, 2016). "How Google's Pixel phone builds a better photo". CNET. Retrieved 2019-10-14.
- ^ "HDR+: Low Light and High Dynamic Range photography in the Google Camera App". Google AI Blog. Retrieved 2019-10-14.
- ^ a b "5 ways Google Pixel 3 camera pushes the boundaries of computational photography". DPReview. Retrieved 2019-10-15.
- ^ "HDR+ on vs HDR+ Enhanced? - Post #5". forum.xda-developers.com. Retrieved 2018-04-05.
- ^ Patel, Idrees (2017-11-20). "Google Explains Decisions Made on the Pixel 2 Camera". xda-developers. Retrieved 2019-10-14.
- ^ a b c d Made by Google '19, retrieved 2019-10-16
- ^ a b c d e "These are the most important Google Pixel 4 camera updates". DPReview. Retrieved 2019-10-18.
- ^ "Behind the Motion Photos Technology in Pixel 2". Google AI Blog. Retrieved 2019-10-14.
- ^ "Motion Stills – Create beautiful GIFs from Live Photos". Google AI Blog. Retrieved 2019-10-14.
- ^ Segan, Sascha (September 11, 2015). "How Apple's 'Live Photos' Can Win Where HTC's Zoe Lost". PCMAG. Retrieved 2019-10-14.
- ^ "Fused Video Stabilization on the Pixel 2 and Pixel 2 XL". Research Blog. Retrieved 2018-04-05.
- ^ "See Better and Further with Super Res Zoom on the Pixel 3". Google AI Blog. Retrieved 2019-10-14.
- ^ a b "Google Pixel 4 Promises 'Studio-Like Photos Without the Studio'". petapixel.com. Retrieved 2019-10-16.
- ^ "Top Shot on Pixel 3". Google AI Blog. Retrieved 2019-10-15.
- ^ Kundu, Kishalaya (2018-10-12). "10 Best Google Pixel 3 Camera Features". Beebom. Retrieved 2019-10-15.
- ^ ZenTalk. "Google Camera HDR+ Manual setting of all parameters version". ZenTalk. Retrieved 2018-04-05.
- ^ "Google Camera - Apps on Google Play". Google Play. 2018-04-05. Retrieved 2018-04-05.
- ^ Biersdorfer, J. D. (2016-05-23). "Going Wide With Google Camera". The New York Times. ISSN 0362-4331. Retrieved 2018-04-05.
- ^ "Android 4.2 Jelly Bean Has Arrived: Photo Sphere Panoramic Camera, Gesture Typing, Wireless HDTV Streaming – TechCrunch". techcrunch.com. Retrieved 2018-04-05.
- ^ "Photo Sphere". Android Central. 2016-04-26. Retrieved 2018-04-05.
- ^ "Portrait mode on the Pixel 2 and Pixel 2 XL smartphones". Google AI Blog. Retrieved 2019-10-14.
- ^ "Learning to Predict Depth on the Pixel 3 Phones". Google AI Blog. Retrieved 2019-10-14.
- ^ Ltd, Guiding Media Pvt (2017-12-26). "How to Use Portrait Mode in Google Pixel 2: Cool Tips". Guiding Tech. Retrieved 2018-04-05.
- ^ "Download Google Camera App with Motion Photo + Face Retouching on the Google Pixel". xda-developers. 2017-10-13. Retrieved 2018-04-05.
- ^ "How to use AR stickers on the Google Pixel or Pixel 2". Android Authority. 2017-12-12. Retrieved 2018-04-05.
- ^ "See your world differently with Playground and Google Lens on Pixel 3". Google. 2018-10-09. Retrieved 2019-10-15.
- ^ "Take Your Best Selfie Automatically, with Photobooth on Pixel 3". Google AI Blog. Retrieved 2019-10-15.
- ^ "Night Sight: Seeing in the Dark on Pixel Phones". Google AI Blog. Retrieved 2019-10-14.
- ^ "See the light with Night Sight". Google. 2018-11-14. Retrieved 2019-10-14.
- ^ Savov, Vlad (2018-11-14). "Google gives the Pixel camera superhuman night vision". The Verge. Retrieved 2019-10-14.
- ^ "The Pixel's Night Sight camera mode performs imaging miracles". Engadget. Retrieved 2019-10-14.
- ^ "Pixel Night Sight also works in daylight, reducing noise and boosting resolution". Android Police. 2018-11-14. Retrieved 2019-10-14.
- ^ Savov, Vlad (2018-11-26). "Google's Night Sight is subtly awesome in the daytime, too". The Verge. Retrieved 2019-10-14.
- ^ "Behind the scenes: Google's Pixel cameras aren't trying to be cameras at all". Android Authority. 2019-10-15. Retrieved 2019-10-16.
- ^ Chow, Charles (2016-11-05). "Camera NX V4 Bring ZSL Photo Shooting with HDR+ on Nexus, Same as Pixel Phone's Way (Update for N6P)". ChromLoop. Retrieved 2019-10-15.
- ^ Andrew Liptak (August 12, 2017). "Google's Pixel camera software has been made to work on other recent Android phones".
- ^ "Get Google Camera port with Night Sight for Xiaomi Mi 5, Essential Phone". xda-developers. 2018-10-27. Retrieved 2019-10-15.
Further reading
- Eric Ravenscraft (18 June 2014). "How to Get the Most Out of the New Google Camera for Android". Lifehacker.
- Jimmy Westenberg (12 December 2017). "How to use AR Stickers on the Google Pixel or Pixel 2". Android Authority.
External links
- Google Play application page (No longer available for non-Nexus or Pixel devices.)