Digital cinematography
Digital cinematography is the process of capturing (recording) motion pictures as digital video images rather than on film stock. Digital capture may occur on video tape, hard disks, flash memory, or other media which can record digital data, through the use of a digital movie camera or other digital video camera. As digital technology has improved, this practice has become dominant; since the mid-2010s, most movies across the world have been both captured and distributed digitally.[1][2][3]
Many vendors have brought products to market, including traditional film camera vendors like Arri and Panavision, as well as new vendors like RED, Blackmagic, Silicon Imaging, Vision Research and companies which have traditionally focused on consumer and broadcast video equipment, like Sony, GoPro, and Panasonic.
A commonly cited disadvantage of recording and storing visual data digitally is resolution: large- and medium-format film frames can record very high levels of detail (medium-format still film has been estimated as equivalent to hundreds of megapixels), while the sensors in most commercially available digital cinema and still cameras resolve considerably less. The primary motivations behind digital cinematography are instead ease of use, lower cost, and compact storage, and filmmakers interested in reproducing the fine detail and texture of reality sometimes prefer analogue formats for that reason.[4]
History
Beginning in the late 1980s, Sony marketed the concept of "electronic cinematography," utilizing its analog Sony HDVS professional video cameras. The effort met with very little success, but it led to one of the earliest electronically shot feature films, Julia and Julia (1987).[5] In 1998, with the introduction of HDCAM recorders and 1920 × 1080 pixel digital professional video cameras based on CCD technology, the idea, now re-branded as "digital cinematography," began to gain traction in the market.[citation needed] Shot and released in 1998, The Last Broadcast is believed by some to be the first feature-length movie shot and edited entirely on consumer-level digital equipment.[6]
In May 1999 George Lucas challenged the supremacy of film as the movie-making medium for the first time by including footage shot with high-definition digital cameras in Star Wars: Episode I – The Phantom Menace. The digital footage blended seamlessly with the footage shot on film, and Lucas announced later that year that he would shoot the sequels entirely on high-definition digital video. Also in 1999, digital projectors were installed in four theaters for the showing of The Phantom Menace. In June 2000, Star Wars: Episode II – Attack of the Clones began principal photography, shot entirely on the Sony HDW-F900 as Lucas had announced; the film was released in May 2002. In May 2001 Once Upon a Time in Mexico was also shot in 24-frame-per-second high-definition digital video, in a format partially developed at George Lucas's urging, using a Sony HDW-F900 camera,[7] after Robert Rodriguez was introduced to the camera at Lucas's Skywalker Ranch facility while editing the sound for Spy Kids. Two lesser-known movies, Vidocq (2001) and Russian Ark (2002), were also shot with the same camera, the latter notably consisting of a single long take.
Today, cameras from companies like Sony, Panasonic, JVC and Canon offer a variety of choices for shooting high-definition video. At the high-end of the market, there has been an emergence of cameras aimed specifically at the digital cinema market. These cameras from Sony, Vision Research, Arri, Silicon Imaging, Panavision, Grass Valley and Red offer resolution and dynamic range that exceeds that of traditional video cameras, which are designed for the limited needs of broadcast television.
In 2009, Slumdog Millionaire became the first movie shot mainly on digital cameras to win the Academy Award for Best Cinematography,[8] and Avatar, the highest-grossing movie in the history of cinema, was not only shot on digital cameras but also earned most of its box-office revenue from digital rather than film projection.
In late 2013, Paramount became the first major studio to distribute movies to theaters in digital format, eliminating 35mm film entirely.[9] Anchorman 2 was the last Paramount production to include a 35mm film version, while The Wolf of Wall Street was the first major movie distributed entirely digitally.[9]
Technology
Digital cinematography captures motion pictures digitally in a process analogous to digital photography. While there is no clear technical distinction that separates the images captured in digital cinematography from video, the term "digital cinematography" is usually applied only in cases where digital acquisition is substituted for film acquisition, such as when shooting a feature film. The term is seldom applied when digital acquisition is substituted for video acquisition, as with live broadcast television programs.
Recording
Cameras
Professional cameras include the Sony CineAlta(F) Series, Blackmagic Cinema Camera, RED ONE, Arriflex D-20, D-21 and Alexa, Panavision's Genesis, Silicon Imaging SI-2K, Thomson Viper, Vision Research Phantom, the IMAX 3D camera based on two Vision Research Phantom cores, Weisscam HS-1 and HS-2, GS Vitec noX, and the Fusion Camera System. Independent filmmakers have also pressed low-cost consumer and prosumer cameras into service for digital filmmaking.
Sensors
Digital cinematography cameras capture images using CMOS or CCD sensors, usually in one of two arrangements.
Single chip cameras designed specifically for the digital cinematography market often use a single sensor (much like digital photo cameras), with dimensions similar in size to a 16 or 35 mm film frame or even (as with the Vision 65) a 65 mm film frame. An image can be projected onto a single large sensor exactly the same way it can be projected onto a film frame, so cameras with this design can be made with PL, PV and similar mounts, in order to use the wide range of existing high-end cinematography lenses available. Their large sensors also let these cameras achieve the same shallow depth of field as 35 or 65 mm motion picture film cameras, which many cinematographers consider an essential visual tool.[10]
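The shallow depth of field that large sensors provide can be illustrated with the standard thin-lens depth-of-field approximation. The sketch below compares a Super 35-sized sensor with a small 2/3-inch broadcast sensor at the same framing; the focal lengths and circle-of-confusion values are illustrative assumptions, not figures from this article:

```python
def depth_of_field_m(focal_mm, f_number, subject_m, coc_mm):
    """Approximate total depth of field (metres) from the standard
    thin-lens near/far limit formulas around the subject distance."""
    f = focal_mm / 1000.0           # focal length in metres
    c = coc_mm / 1000.0             # circle of confusion in metres
    hyperfocal = f * f / (f_number * c) + f
    near = subject_m * (hyperfocal - f) / (hyperfocal + subject_m - 2 * f)
    if hyperfocal <= subject_m:
        return float("inf")         # subject at/beyond hyperfocal distance
    far = subject_m * (hyperfocal - f) / (hyperfocal - subject_m)
    return far - near

# Same framing of a subject at 3 m: the Super 35-sized sensor needs a
# longer lens (50 mm here) than the small broadcast sensor (about 18 mm),
# and the longer focal length dominates, giving a much shallower depth of
# field despite the larger circle of confusion.
dof_s35 = depth_of_field_m(50, 2.8, 3.0, coc_mm=0.025)
dof_broadcast = depth_of_field_m(18, 2.8, 3.0, coc_mm=0.011)
assert dof_s35 < dof_broadcast
```

This is why, at equal apertures and framing, larger sensors isolate subjects from their backgrounds more strongly.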
Video formats
Unlike other video formats, which are specified in terms of vertical resolution (for example, 1080p, which is 1920×1080 pixels), digital cinema formats are usually specified in terms of horizontal resolution. As a shorthand, these resolutions are often given in "nK" notation, where n is a multiplier of 1024, such that the horizontal resolution of a corresponding full-aperture, digitized film frame is exactly 1024n pixels. Here the "K" has a customary meaning corresponding to the binary prefix "kibi" (Ki).
For instance, a 2K image is 2048 pixels wide and a 4K image is 4096 pixels wide. Vertical resolutions vary with aspect ratio: a 2K image with an HDTV (16:9) aspect ratio is 2048×1152 pixels, a 2K image with an SDTV or Academy (4:3) ratio is 2048×1536 pixels, one with a Panavision (2.39:1) ratio is 2048×856 pixels, and so on. Moreover, because the "nK" notation denotes a container rather than an exact image width, a 2K image that excludes, for example, the space reserved for the typical 35mm film soundtrack is only 1828 pixels wide, with vertical resolutions rescaling accordingly. This has produced a plethora of motion-picture-related video resolutions, which is quite confusing and often redundant given the few projection standards in use today.
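The arithmetic above can be captured in a small helper. This is a sketch; the rounding of heights to even pixel values is a common delivery-format convention assumed here:

```python
def nk_dimensions(n, aspect_w, aspect_h):
    """Width and height (pixels) of an "nK" image with the given aspect
    ratio: width is n x 1024, height rounded to the nearest even pixel."""
    width = n * 1024
    height = round(width * aspect_h / aspect_w / 2) * 2
    return width, height

assert nk_dimensions(2, 16, 9) == (2048, 1152)    # 2K HDTV ratio
assert nk_dimensions(2, 4, 3) == (2048, 1536)     # 2K Academy ratio
assert nk_dimensions(2, 239, 100) == (2048, 856)  # 2K Panavision ratio
assert nk_dimensions(4, 16, 9) == (4096, 2304)    # 4K HDTV ratio
```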
All formats designed for digital cinematography are progressive scan, and capture usually occurs at the same rate of 24 frames per second established as the standard for 35mm film. Some films, such as The Hobbit: An Unexpected Journey, use a high frame rate of 48 fps, although in some theatres it was also released in a 24 fps version, which many fans of traditional film prefer.
The DCI standard for cinema usually relies on a 1.89:1 aspect ratio, thus defining the maximum container size for 4K as 4096×2160 pixels and for 2K as 2048×1080 pixels. When distributed in the form of a Digital Cinema Package (DCP), content is letterboxed or pillarboxed as appropriate to fit within one of these container formats.
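The container fit can be sketched as follows. The DCI container sizes are from the text; the scaling helper itself is illustrative:

```python
DCI_CONTAINERS = {"2K": (2048, 1080), "4K": (4096, 2160)}   # 1.89:1

def fit_in_container(src_w, src_h, container="2K"):
    """Scale a source image to fit the chosen DCI container while
    preserving its aspect ratio; the unused area becomes black bars."""
    cw, ch = DCI_CONTAINERS[container]
    scale = min(cw / src_w, ch / src_h)
    w = int(src_w * scale) // 2 * 2     # keep dimensions even
    h = int(src_h * scale) // 2 * 2
    if w == cw and h == ch:
        mode = "exact fit"
    elif w == cw:
        mode = "letterbox"              # bars above and below
    else:
        mode = "pillarbox"              # bars at the sides
    return w, h, mode

# A 2.39:1 "scope" picture letterboxes in the 2K container, while a
# 1.85:1 "flat" picture pillarboxes.
assert fit_in_container(2048, 858) == (2048, 858, "letterbox")
assert fit_in_container(1998, 1080) == (1998, 1080, "pillarbox")
```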
In the last few years,[when?] 2K has been the most common format for digitally acquired major motion pictures; however, as new camera systems gain acceptance, 4K is becoming more prominent (as the 1080p format was before). In 2009, at least two major Hollywood films, Knowing and District 9, were shot in 4K on the RED ONE camera, followed by The Social Network in 2010. The Arri Alexa captures a 2.8K image.
Data storage
Broadly, two workflow paradigms are used for data acquisition and storage in digital cinematography.
Tape-based workflows
In a tape-based workflow, video is recorded to tape on set. This video is then ingested into a computer running non-linear editing software, using a deck. Upon ingestion, the digital video stream is converted from tape to computer files. These files can be edited directly or converted to an intermediate format for editing. The video is then output in its final format, possibly to a film recorder for theatrical exhibition, or back to video tape for broadcast use. The original video tapes are kept as an archival medium. The files generated by the non-linear editing application contain the information necessary to retrieve footage from the proper tapes, should the footage stored on the computer's hard disk be lost. With the increasing convenience of file-based workflows, tape-based workflows have become marginal in recent years.
File-based workflows
Digital cinematography has mostly shifted towards "tapeless" or "file-based" workflows. This trend has accelerated with the increased capacity and reduced cost of non-linear storage solutions such as hard disk drives, optical discs, and solid-state memory. In a tapeless workflow, digital video is recorded as digital files onto random-access media like optical discs, hard disk drives or flash-memory-based digital "magazines". These files can be easily copied to another storage device, typically to a large RAID (array of computer disks) connected to an editing system. Once the data is copied from the on-set media to the storage array, the media are erased and returned to the set for more shooting.
Such RAID arrays, whether "managed" (for example, SANs and NASes) or "unmanaged" (for example, JBODs on a single computer workstation), are necessary because of the throughput required for real-time (about 320 MB/s for 2K at 24 fps) or near-real-time playback in post-production, which exceeds the throughput available from a single, even fast, hard disk drive. Such requirements are often termed "on-line" storage. Post-production work not requiring real-time playback performance (typically lettering, subtitling, versioning and similar visual effects) can be migrated to slightly slower RAID stores.
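The throughput figure is easy to reproduce with a back-of-envelope calculation. The sketch below assumes full-aperture 2K DPX frames (2048×1556) stored as 10-bit RGB packed into 4 bytes per pixel, a common DPX layout assumed here rather than stated in the text:

```python
def data_rate_mb_s(width, height, bytes_per_pixel, fps):
    """Sustained data rate in MB/s for uncompressed playback."""
    return width * height * bytes_per_pixel * fps / 1e6

# Full-aperture 2K at 24 fps, 4 bytes per pixel: on the order of the
# ~320 MB/s figure cited for real-time 2K playback, and well beyond the
# sustained throughput of a single hard disk drive of the era.
rate = data_rate_mb_s(2048, 1556, 4, 24)
assert 300 < rate < 310
```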
Short-term archiving, when required, is accomplished by moving the digital files onto "slower" RAID arrays (still of either managed or unmanaged type, but with lower performance), where playback capability is poor to non-existent (except via proxy images), but minimal editing and metadata harvesting remain feasible. Such intermediate requirements fall into the "mid-line" storage category.
Long-term archiving is accomplished by backing up the digital files from the RAID, using standard practices and equipment for data backup from the IT industry, often to data tapes (like LTOs).
Chroma subsampling
Most digital cinematography systems further reduce data rate by subsampling color information. Because the human visual system is much more sensitive to luminance than to color, lower resolution color information can be overlaid with higher resolution luma (brightness) information, to create an image that looks very similar to one in which both color and luma information are sampled at full resolution. This scheme may cause pixelation or color bleeding under some circumstances. High quality digital cinematography systems are capable of recording full resolution color data (4:4:4) or raw sensor data.
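The savings from chroma subsampling can be quantified directly from the J:a:b notation, counting samples in the standard J-pixel-wide, two-row reference block:

```python
def relative_data_rate(j, a, b):
    """Data rate relative to 4:4:4 for a J:a:b chroma subsampling scheme,
    assuming equal bit depth per sample. In a J-wide, 2-high block there
    are 2*J luma samples, plus a first-row and b second-row samples for
    each of the two chroma channels."""
    samples = 2 * j + 2 * (a + b)   # Y samples + Cb and Cr samples
    full = 2 * j * 3                # all three channels at full rate
    return samples / full

assert relative_data_rate(4, 4, 4) == 1.0    # 4:4:4: no reduction
assert relative_data_rate(4, 2, 2) == 2 / 3  # 4:2:2: two-thirds the data
assert relative_data_rate(4, 2, 0) == 0.5    # 4:2:0: half the data
```

So a 4:2:0 recording carries half the raw data of full-resolution color at the same bit depth, which is why acquisition formats lean on subsampling so heavily.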
Intra- vs. Inter-frame compression
Most compression systems used for acquisition in digital cinematography compress footage one frame at a time, as if the video stream were a series of still images; this is called intra-frame compression. Inter-frame compression systems further compress data by examining and eliminating redundancy between frames. This yields higher compression ratios, but displaying a given frame usually requires the playback system to decompress a number of frames before and after it. In normal playback this is not a problem, as each successive frame is played in order, so the preceding frames have already been decompressed. In editing, however, it is common to jump to specific frames and to play footage backwards or at different speeds; the need to decompress extra frames in these situations can cause performance problems for editing systems. Inter-frame compression is also disadvantageous because the loss of a single frame (say, due to a flaw writing data to tape) will typically ruin all frames until the next keyframe. An inter-frame compressed video stream consists of groups of pictures (GOPs), each of which has only one full frame, the I-frame, plus a handful of frames that reference it (P-frames and B-frames). If the I-frame is lost to a transmission or media error, none of the dependent frames can be displayed and the whole GOP is lost. In the case of the HDV format, for instance, this may result in as many as 6 frames being lost with 720p recording, or 15 with 1080i.[11]
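The GOP failure mode can be sketched with a toy model. This is a deliberate simplification (real B-frames reference frames in both directions, so the true dependency graph is more involved):

```python
def frames_lost(gop, damaged_index):
    """Given a GOP as a string of frame types (e.g. "IBBPBB") and the
    index of a damaged frame, return how many frames become undecodable.
    Simplified model: losing the I-frame ruins the whole GOP; losing a
    P-frame ruins it and everything after it; a lost B-frame affects
    only itself, since no other frame references it."""
    kind = gop[damaged_index]
    if kind == "I":
        return len(gop)                  # nothing in the GOP decodes
    if kind == "P":
        return len(gop) - damaged_index  # later frames reference it
    return 1                             # B-frames are leaves

# A 15-frame 1080i HDV-style GOP: a damaged I-frame loses all 15 frames,
# matching the worst case cited above.
assert frames_lost("IBBPBBPBBPBBPBB", 0) == 15
assert frames_lost("IBBPBBPBBPBBPBB", 1) == 1
```

With intra-frame compression, by contrast, the same model would always return 1: each frame stands alone.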
Digital distribution
For theaters with digital projectors, digital films may be distributed digitally, either shipped to theaters on hard drives or sent via the Internet or satellite networks. Digital Cinema Initiatives, LLC, a joint venture of Disney, Fox, MGM, Paramount, Sony Pictures Entertainment, Universal and Warner Bros. Studios, has established standards for digital cinema projection. In July 2005, they released the first version of the Digital Cinema System Specification,[12] which encompasses 2K and 4K theatrical projection. They also offer compliance testing for exhibitors and equipment suppliers.
Theater owners initially balked at installing digital projection systems because of the high cost and concern over increased technical complexity. However, new funding models, in which distributors pay a "digital print" fee to theater owners, have helped to alleviate these concerns. Digital projection also offers increased flexibility in showing trailers and pre-show advertisements and allows theater owners to more easily move films between screens or change how many screens a film is playing on; in addition, the higher quality of digital projection provides a better experience to help attract consumers who can now access high-definition content at home. These factors have made digital projection an increasingly attractive prospect for theater owners, and the pace of adoption has been rapidly increasing.
Since some theaters currently don't have digital projection systems, even if a movie is shot and post-produced digitally, it must be transferred to film if a large theatrical release is planned. Typically, a film recorder will be used to print digital image data to film, to create a 35 mm internegative. After that the duplication process is identical to that of a traditional negative from a film camera.
Comparison with film cinematography
Resolution
Unlike a digital sensor, a film frame does not have a regular grid of discrete pixels.
Determining resolution in digital acquisition seems straightforward, but it is significantly complicated by the way digital camera sensors work in practice. This is particularly true of high-end digital cinematography cameras that use a single large Bayer-pattern CMOS sensor. A Bayer-pattern sensor does not sample full RGB data at every point; instead, each photosite is filtered for red, green or blue, and a full-color image is assembled from this checkerboard of color by a demosaicing algorithm. Generally, the actual resolution of a Bayer-pattern sensor falls somewhere between its "native" value and half that figure, with different demosaicing algorithms producing different results. Additionally, most digital cameras (both Bayer and three-chip designs) employ optical low-pass filters to avoid aliasing; such filters further reduce resolution.
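A toy illustration of why a Bayer sensor's effective resolution falls below its photosite count: each output pixel must borrow the colors it did not sample from neighboring photosites. The sketch below uses a crude nearest-neighbour reconstruction, far simpler than the edge-aware demosaicing algorithms real cameras use:

```python
def bayer_channel(x, y):
    """Color of the photosite at (x, y) in an RGGB Bayer mosaic:
    even rows alternate R,G; odd rows alternate G,B."""
    if y % 2 == 0:
        return "R" if x % 2 == 0 else "G"
    return "G" if x % 2 == 0 else "B"

def demosaic_nearest(mosaic):
    """Very crude nearest-neighbour demosaic of a 2-D list of raw sensor
    values (even dimensions assumed): each output pixel takes the nearest
    sample of each color from its 2x2 Bayer cell, so each RGB value is
    shared by four pixels -- which is why effective resolution drops."""
    h, w = len(mosaic), len(mosaic[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            # snap to the top-left corner of the 2x2 cell containing (x, y)
            cx, cy = (x // 2) * 2, (y // 2) * 2
            r = mosaic[cy][cx]            # R at (even, even)
            g = mosaic[cy][cx + 1]        # one of the two G samples
            b = mosaic[cy + 1][cx + 1]    # B at (odd, odd)
            row.append((r, g, b))
        out.append(row)
    return out

# A uniform grey patch: every photosite reads 100, so each reconstructed
# pixel comes out neutral.
assert demosaic_nearest([[100, 100], [100, 100]])[0][0] == (100, 100, 100)
```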
Grain and noise
Film has a characteristic grain structure, and different film stocks have different grain. Digitally acquired footage lacks this grain structure; instead, it exhibits electronic sensor noise.
Digital intermediate workflow and archiving
The process of using digital intermediate workflow, where movies are color graded digitally instead of via traditional photochemical finishing techniques, has become common.
In order to utilize digital intermediate workflow with film, the camera negative must first be processed and then scanned to a digital format. Some filmmakers have years of experience achieving their artistic vision using the techniques available in a traditional photochemical workflow, and prefer that finishing/editing process.
Digitally shot movies can be printed, transferred or archived on film. Large-scale digital productions are often archived on film, as it provides a safer medium for storage, with benefits for insurance and storage costs.[13] As long as the negative does not completely degrade, it will always be possible to recover the images from it in the future, regardless of changes in technology, since all that is involved is simple photographic reproduction.
In contrast, even if digital data is stored on a medium that will preserve its integrity, highly specialized digital equipment will always be required to reproduce it. Changes in technology may thus render the format unreadable or expensive to recover over time. For this reason, film studios distributing digitally-originated films often make film-based separation masters of them for archival purposes.[13]
Reliability
Film proponents have argued that digital cameras lack the reliability of film, particularly when filming sequences at high speed or in chaotic environments, because of technical glitches in digital cameras. Cinematographer Wally Pfister noted that for his shoot on the film Inception, "Out of six times that we shot on the digital format, we only had one usable piece and it didn't end up in the film. Out of the six times we shot with the Photo-Sonics camera and 35mm running through it, every single shot was in the movie."[14] Michael Bay stated that when filming Transformers: Dark of the Moon, 35mm cameras had to be used for slow-motion shots and for sequences where the digital cameras were subject to strobing or electrical damage from dust.[15]
Criticism and concerns
Some film directors such as Christopher Nolan,[16] Paul Thomas Anderson[17] and Quentin Tarantino have publicly criticized digital cinema, and advocated the use of film and film prints. Tarantino has suggested he may retire because he will no longer be able to have his films projected in 35mm in most American cinemas. Tarantino considers digital cinema to be simply "television in public."[18] Christopher Nolan has speculated that the film industry's adoption of digital formats has been driven purely by economic factors as opposed to digital being a superior medium to film: "I think, truthfully, it boils down to the economic interest of manufacturers and [a production] industry that makes more money through change rather than through maintaining the status quo."[16]
Another concern with digital image capture is how to archive all the digital material. Archiving digital material has turned out to be extremely costly, and it creates issues in terms of long-term preservation. In a 2007 study, the Academy of Motion Picture Arts and Sciences found that the cost of storing 4K digital masters is "enormously higher – 1100% higher – than the cost of storing film masters." Furthermore, digital archiving faces challenges due to the insufficient longevity of today's digital storage: no current medium, be it magnetic hard drive or digital tape, can reliably store a film for a hundred years, something that properly stored and handled film can do.[19] Although this also used to be the case with optical discs, in 2012 Millenniata, Inc., a digital storage company based in Utah, released M-DISC, an optical storage solution designed to last up to 1,000 years, offering a possible viable medium for long-term digital storage.[20][21]
See also
References
- ^ "Qube Cinema Supports Cinecolor in Its Transition to Digital Cinema in Latin America". qubecinema.com.
- ^ "How Digital Conversion Is Killing Independent Movie Theaters". Rolling Stone.
- ^ Michael Hurley (2 January 2014). "Studios Abandon Film, Small Theaters Struggle -- And Ther - Indiewire". Indiewire.
- ^ "Film vs. Digital: A Comparison of the Advantages and Disadvantages". PetaPixel. 26 May 2015. Retrieved 2016-06-28.
- ^ "Julia and Julia (1987)". IMDb.
- ^ The Last Broadcast is A First: The Making of a Digital Feature http://www.thelastbroadcastmovie.com/
- ^ "Robert Rodriguez Film Once Upon a Time in Mexico This is a structural review". WriteWork. Retrieved 2013-04-22.
- ^ "Silicon Imaging". siliconimaging.com.
- ^ a b Megan Geuss (January 18, 2014). "Anchorman 2 was Paramount's final release on 35mm film". Ars Technica (via Los Angeles Times). Retrieved 2014-01-20.
- ^ "Putting the FULL FRAME confusion to bed – Personal View Talks". Personal-view.com. Retrieved 2013-04-22.
- ^ recording adamwilt.com
- ^ Digital Cinema System Specification
- ^ a b "KODAK Color Asset Protection Film 2332". Motion.kodak.com. Retrieved 2013-04-22.
- ^ "From The Dark Knight to Inception, Wally Pfister, ASC refuses to compromise". kodak. Retrieved 2013-05-19.
- ^ "TRANSFORMERS: DARK OF THE MOON Edit Bay Visit! Steve Watches 20 Minutes of the Movie and Interviews Michael Bay for Over 2 Hours!". Collider. December 8, 2010. Retrieved 2013-05-19.
- ^ a b Merchan, George (2012-04-15). "Christopher Nolan talks film vs. digital, his take on CGI, his disinterest in 3D, and much more in insightful DGA interview – Movie News". JoBlo.com. Retrieved 2013-04-22.
- ^ "pta on digital vs. film". YouTube. 2006-08-10. Retrieved 2013-04-22.
- ^ "Quentin Tarantino: 'I can't stand digital filmmaking, it's TV in public' – Movies News". Digital Spy. 2012-11-30. Retrieved 2013-04-22.
- ^ "The Digital Dilemma. Strategic issues in archiving and accessing digital motion picture materials". Academy of Motion Picture Arts and Sciences. 2007.
- ^ "What is M-Disc™? » The M-DISC™". Mdisc.com. Retrieved 2013-04-22.
- ^ Harris, Robin (2013-01-14). "The 1,000 year DVD is here". ZDNet. Retrieved 2013-04-22.