Digital cinematography is the process of capturing (recording) a motion picture using digital image sensors rather than film stock. As digital technology has improved in recent years, this practice has become dominant. Since the mid-2010s, most movies across the world have been both captured and distributed digitally.
Many vendors have brought products to market, including traditional film camera vendors like Arri and Panavision, as well as new vendors like RED, Blackmagic, Silicon Imaging, Vision Research and companies which have traditionally focused on consumer and broadcast video equipment, like Sony, GoPro, and Panasonic.
As of 2017, professional 4K digital cameras are approximately equal to 35mm film in resolution and dynamic range; however, digitally acquired footage still has a slightly different look from film. Some filmmakers still prefer to shoot on analog film formats to achieve the desired results.
History
In the late 1980s, Sony began marketing the concept of "electronic cinematography", utilizing its analog Sony HDVS professional video cameras. The effort met with very little success; however, it led to one of the earliest feature movies shot on high-definition video, Julia and Julia (1987).
Rainbow (1996) was the world's first film to make extensive use of digital post-production techniques. It was shot entirely with Sony's first solid-state electronic cinematography cameras and featured over 35 minutes of digital image processing and visual effects; all post-production, sound effects, editing and scoring were completed digitally. The digital high-definition image was transferred to 35mm negative via an electron beam recorder for theatrical release.
The first digitally filmed and post-produced feature film was Windhorse, shot in Tibet and Nepal in 1996 on a prototype of the Digital Betacam Sony DVW-700WS and the prosumer Sony DCR-VX1000. The offline editing (Avid) and the online post-production and color work (Roland House / da Vinci) were also all digital. The film, transferred to 35mm negative for theatrical release, won Best U.S. Feature at the Santa Barbara Film Festival in 1998.
In 1998, with the introduction of HDCAM recorders and 1920 × 1080 pixel digital professional video cameras based on CCD technology, the idea, now re-branded as "digital cinematography," began to gain traction in the market. Shot and released in 1998, The Last Broadcast is believed by some to be the first feature-length video shot and edited entirely on consumer-level digital equipment.
In May 1999, George Lucas challenged the supremacy of film as a movie-making medium for the first time by including footage shot with high-definition digital cameras in Star Wars: Episode I - The Phantom Menace. The digital footage blended seamlessly with the footage shot on film, and later that year he announced he would film the sequels entirely on high-definition digital video. Also in 1999, digital projectors were installed in four theaters for the showing of The Phantom Menace. In June 2000, principal photography began on Star Wars: Episode II - Attack of the Clones, shot entirely on the Sony HDW-F900 as Lucas had previously stated; the film was released in May 2002. In May 2001, Once Upon a Time in Mexico was also shot in 24-frame-per-second high-definition digital video, a format partly developed by George Lucas, using a Sony HDW-F900 camera, after Robert Rodriguez was introduced to the camera at Lucas's Skywalker Ranch facility while editing the sound for Spy Kids. Two lesser-known movies, Vidocq (2001) and Russian Ark (2002), had also been shot with the same camera, the latter notably consisting of a single long take.
Today, cameras from companies like Sony, Panasonic, JVC and Canon offer a variety of choices for shooting high-definition video. At the high-end of the market, there has been an emergence of cameras aimed specifically at the digital cinema market. These cameras from Sony, Vision Research, Arri, Silicon Imaging, Panavision, Grass Valley and Red offer resolution and dynamic range that exceeds that of traditional video cameras, which are designed for the limited needs of broadcast television.
In 2009, Slumdog Millionaire became the first movie shot mainly in digital to be awarded the Academy Award for Best Cinematography. The same year, Avatar, the highest-grossing movie in the history of cinema at the time, was not only shot on digital cameras but also earned the majority of its box-office revenue from digital rather than film projection.
In late 2013, Paramount became the first major studio to distribute movies to theaters in digital format, eliminating 35mm film entirely. Anchorman 2 was the last Paramount production to include a 35mm film version, while The Wolf of Wall Street was the first major movie distributed entirely digitally.
Technology
Digital cinematography captures motion pictures digitally in a process analogous to digital photography. While there is no clear technical distinction that separates the images captured in digital cinematography from video, the term "digital cinematography" is usually applied only in cases where digital acquisition is substituted for film acquisition, such as when shooting a feature film. The term is seldom applied when digital acquisition is substituted for video acquisition, as with live broadcast television programs.
Recording
Cameras
Professional cameras include the Sony CineAlta (F) series, Blackmagic Cinema Camera, RED ONE, Arriflex D-20, D-21 and Alexa, Panavision's Genesis, Silicon Imaging SI-2K, Thomson Viper, Vision Research Phantom, the IMAX 3D camera based on two Vision Research Phantom cores, Weisscam HS-1 and HS-2, GS Vitec noX, and the Fusion Camera System. Independent filmmakers have also pressed low-cost consumer and prosumer cameras into service for digital filmmaking.
Sensors
Digital cinematography cameras capture images using CMOS or CCD sensors, usually in one of two arrangements.
Single chip cameras designed specifically for the digital cinematography market often use a single sensor (much like digital photo cameras), with dimensions similar in size to a 16 or 35 mm film frame or even (as with the Vision 65) a 65 mm film frame. An image can be projected onto a single large sensor exactly the same way it can be projected onto a film frame, so cameras with this design can be made with PL, PV and similar mounts, in order to use the wide range of existing high-end cinematography lenses available. Their large sensors also let these cameras achieve the same shallow depth of field as 35 or 65 mm motion picture film cameras, which many cinematographers consider an essential visual tool.
Video formats
Unlike other video formats, which are specified in terms of vertical resolution (for example, 1080p, which is 1920×1080 pixels), digital cinema formats are usually specified in terms of horizontal resolution. As a shorthand, these resolutions are often given in "nK" notation, where n is the multiplier of 1024 such that the horizontal resolution of a corresponding full-aperture, digitized film frame is exactly 1024n pixels. Here the "K" has a customary meaning corresponding to the binary prefix "kibi" (Ki).
For instance, a 2K image is 2048 pixels wide and a 4K image is 4096 pixels wide. Vertical resolution, however, varies with aspect ratio: a 2K image with an HDTV (16:9) aspect ratio is 2048×1152 pixels, one with an SDTV or Academy ratio (4:3) is 2048×1536 pixels, and one with a Panavision ratio (2.39:1) is 2048×856 pixels, and so on. Because the "nK" notation does not correspond to a single horizontal resolution per format, a 2K image that excludes, for example, the area reserved for the typical 35mm film soundtrack is only 1828 pixels wide, with the vertical resolution rescaling accordingly. This has led to a plethora of motion-picture-related video resolutions, which is confusing and often redundant given the small number of projection standards in use today.
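The arithmetic behind these figures is straightforward; the following sketch (illustrative only, with function names of my own choosing) derives the pixel dimensions quoted above from the "nK" convention.

```python
# Illustrative sketch: deriving "nK" pixel dimensions for the aspect ratios
# mentioned above. Widths follow the convention nK = n * 1024 pixels; heights
# are rounded down, as in the figures quoted in the text.

def nk_width(n):
    """Horizontal resolution of an nK full-aperture frame."""
    return n * 1024

def frame_size(n, ratio_w, ratio_h):
    """Return (width, height) for an nK frame at the given aspect ratio."""
    width = nk_width(n)
    height = int(width * ratio_h / ratio_w)
    return width, height

if __name__ == "__main__":
    for label, rw, rh in [("HDTV 16:9", 16, 9),
                          ("Academy 4:3", 4, 3),
                          ("Panavision 2.39:1", 2.39, 1)]:
        w, h = frame_size(2, rw, rh)
        print(f"2K {label}: {w}x{h}")   # 2048x1152, 2048x1536, 2048x856
```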
All formats designed for digital cinematography are progressive scan, and capture usually occurs at the same 24-frame-per-second rate established as the standard for 35mm film. Some films, such as The Hobbit: An Unexpected Journey, were shot at a high frame rate of 48 fps, although in some theatres they were also released in a 24 fps version, which many fans of traditional film prefer.
The DCI standard for cinema usually relies on a 1.89:1 aspect ratio, thus defining the maximum container size for 4K as 4096×2160 pixels and for 2K as 2048×1080 pixels. When distributed in the form of a Digital Cinema Package (DCP), content is letterboxed or pillarboxed as appropriate to fit within one of these container formats.
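As an illustrative sketch of the letterbox/pillarbox decision, the following assumes the 2K DCI container of 2048×1080 pixels and the common "scope" (2048×858) and "flat" (1998×1080) picture sizes; the helper name is my own, not part of any DCP toolchain.

```python
# Illustrative sketch: fitting a picture into the 2K DCI container (2048x1080).
# Content wider than the container's ~1.896:1 ratio gets letterboxed; narrower
# content gets pillarboxed.

DCI_2K = (2048, 1080)

def fit_to_container(src_w, src_h, container=DCI_2K):
    """Scale the source to fit the container while preserving aspect ratio."""
    cw, ch = container
    scale = min(cw / src_w, ch / src_h)
    out_w, out_h = int(src_w * scale), int(src_h * scale)
    if out_h < ch:
        mode = "letterbox"
    elif out_w < cw:
        mode = "pillarbox"
    else:
        mode = "fills container"
    return out_w, out_h, mode

print(fit_to_container(2048, 858))    # "scope" 2.39:1 picture -> letterboxed
print(fit_to_container(1998, 1080))   # "flat" 1.85:1 picture  -> pillarboxed
```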
In the early years of digital cinematography, 2K was the most common format for digitally acquired major motion pictures; however, as new camera systems gained acceptance, 4K became more prominent. The Arri Alexa captured a 2.8K image. During 2009, at least two major Hollywood films, Knowing and District 9, were shot in 4K on the RED ONE camera, followed by The Social Network in 2010. As of 2017, 4K cameras are commonplace, with most high-end films being shot at 4K resolution.
Data storage
Broadly, two workflow paradigms are used for data acquisition and storage in digital cinematography.
Tape-based workflows
With a video-tape-based workflow, video is recorded to tape on set. This video is then ingested into a computer running non-linear editing software, using a deck. Upon ingestion, the digital video stream from tape is converted to computer files. These files can be edited directly or converted to an intermediate format for editing. Video is then output in its final format, possibly to a film recorder for theatrical exhibition, or back to video tape for broadcast use. The original video tapes are kept as an archival medium. The files generated by the non-linear editing application contain the information necessary to retrieve footage from the proper tapes, should the footage stored on the computer's hard disk be lost. With the increasing convenience of file-based workflows, tape-based workflows have become marginal in recent years.
File-based workflows
Digital cinematography has mostly shifted towards "tapeless" or "file-based" workflows. This trend has accelerated with the increased capacity and reduced cost of non-linear storage solutions such as hard disk drives, optical discs, and solid-state memory. With tapeless workflows, digital video is recorded as digital files onto random-access media like optical discs, hard disk drives or flash-memory-based digital "magazines". These files can be easily copied to another storage device, typically to a large RAID (redundant array of independent disks) connected to an editing system. Once the data has been copied from the on-set media to the storage array, the media are erased and returned to the set for more shooting.
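On-set offloading of this kind is typically handled by dedicated software that verifies each copy by checksum before a card is erased; the following is a minimal sketch of that idea (illustrative only, not any particular product's workflow, with function names of my own choosing).

```python
# Minimal sketch of a verified "offload": copy camera files from on-set media
# to the editing RAID and confirm each copy by checksum before the card is
# erased. Illustrative only; real offload tools add logging, retries, reports.
import hashlib
import shutil
from pathlib import Path

def sha256_of(path, chunk_size=8 * 1024 * 1024):
    """Hash a file in chunks so large camera files do not exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def offload(card_dir, raid_dir):
    """Copy every file from the card to the RAID, verifying each checksum."""
    card_dir, raid_dir = Path(card_dir), Path(raid_dir)
    for src in card_dir.rglob("*"):
        if not src.is_file():
            continue
        dst = raid_dir / src.relative_to(card_dir)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)
        if sha256_of(src) != sha256_of(dst):
            raise IOError(f"Checksum mismatch for {src}; do not erase the card")
    return "verified"
```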
Such RAID arrays, whether "managed" (for example, SANs and NASes) or "unmanaged" (for example, JBODs on a single computer workstation), are necessary because of the throughput required for real-time (about 320 MB/s for 2K at 24 fps) or near-real-time playback in post-production, which far exceeds the throughput available from a single, even fast, hard disk drive. Such requirements are often termed "online" storage. Post-production work that does not require real-time playback performance (typically lettering, subtitling, versioning and similar visual effects) can be migrated to slightly slower RAID stores.
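The quoted ~320 MB/s figure can be reproduced with a back-of-the-envelope calculation, assuming (my assumption) uncompressed 2K RGB frames stored at 16 bits per channel, as in 16-bit DPX files:

```python
# Back-of-the-envelope check of the ~320 MB/s figure for 2K @ 24 fps,
# assuming uncompressed RGB at 16 bits per channel (e.g. 16-bit DPX files).
width, height = 2048, 1080           # 2K DCI container
channels = 3                         # R, G, B
bits_per_channel = 16                # assumed bit depth
fps = 24

bytes_per_frame = width * height * channels * bits_per_channel // 8
throughput_mb_s = bytes_per_frame * fps / 1e6
print(f"{bytes_per_frame / 1e6:.1f} MB per frame, {throughput_mb_s:.0f} MB/s")
# -> roughly 13.3 MB per frame and ~318 MB/s, in line with the figure above.
```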
Short-term archiving, "if ever", is accomplished by moving the digital files into "slower" RAID arrays (still of either managed and unmanaged type, but with lower performances), where playback capability is poor to non-existent (unless via proxy images), but minimal editing and metadata harvesting still feasible. Such intermediate requirements easily fall into the "mid-line" storage category.
Long-term archiving is accomplished by backing up the digital files from the RAID, using standard practices and equipment for data backup from the IT industry, often to data tapes (like LTOs).
Chroma subsampling
Most digital cinematography systems further reduce data rate by subsampling color information. Because the human visual system is much more sensitive to luminance than to color, lower resolution color information can be overlaid with higher resolution luma (brightness) information, to create an image that looks very similar to one in which both color and luma information are sampled at full resolution. This scheme may cause pixelation or color bleeding under some circumstances. High quality digital cinematography systems are capable of recording full resolution color data (4:4:4) or raw sensor data.
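A rough sketch of the savings: counting samples per 2×2 block of pixels shows why 4:2:2 carries about two-thirds, and 4:2:0 about half, of the data of full-resolution 4:4:4 (an illustrative calculation, not drawn from any particular codec specification).

```python
# Illustrative sketch: relative data rates of common chroma-subsampling schemes.
# Counts are samples per 2x2 block of pixels (one luma sample per pixel, plus
# however many chroma sample pairs the scheme keeps).

def samples_per_2x2(scheme):
    luma = 4                             # Y is always kept at full resolution
    chroma_pairs = {"4:4:4": 4,          # Cb/Cr for every pixel
                    "4:2:2": 2,          # Cb/Cr for every other column
                    "4:2:0": 1}[scheme]  # Cb/Cr once per 2x2 block
    return luma + 2 * chroma_pairs

for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    rel = samples_per_2x2(scheme) / samples_per_2x2("4:4:4")
    print(f"{scheme}: {samples_per_2x2(scheme)} samples per 2x2 block "
          f"({rel:.0%} of full-resolution data)")
```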
Intra- vs. Inter-frame compression
Most compression systems used for acquisition in the digital cinematography world compress footage one frame at a time, as if the video stream were a series of still images. This is called intra-frame compression. Inter-frame compression systems can further compress data by examining and eliminating redundancy between frames. This leads to higher compression ratios, but displaying a single frame usually requires the playback system to decompress a number of frames before and after it. In normal playback this is not a problem, as each successive frame is played in order, so the preceding frames have already been decompressed. In editing, however, it is common to jump to specific frames and to play footage backwards or at different speeds. Because of the need to decompress extra frames in these situations, inter-frame compression can cause performance problems for editing systems.

Inter-frame compression is also disadvantageous because the loss of a single frame (say, due to a flaw writing data to a tape) will typically ruin all the frames until the next keyframe occurs. In the case of the HDV format, for instance, this may result in as many as 6 frames being lost with 720p recording, or 15 with 1080i. An inter-frame-compressed video stream consists of groups of pictures (GOPs), each of which has only one full frame and a handful of other frames that refer to it. If that full frame, called the I-frame, is lost due to a transmission or media error, none of the P-frames or B-frames (the frames that reference it) can be displayed; in this case, the whole GOP is lost.
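The editing and error-recovery costs described above can be illustrated with a small sketch; the GOP size and the assumption of simple forward prediction are illustrative, not tied to any particular codec.

```python
# Illustrative sketch: why inter-frame (GOP-based) compression makes random
# access and error recovery harder than intra-frame compression. Frame indices
# and GOP size are arbitrary; assume simple forward prediction (I + P frames).

GOP_SIZE = 15  # e.g. a long-GOP format comparable to 1080i HDV

def frames_needed_to_decode(target):
    """Frames that must be decoded before `target` can be displayed."""
    i_frame = (target // GOP_SIZE) * GOP_SIZE   # the preceding I-frame
    return list(range(i_frame, target + 1))

def frames_lost_if_i_frame_damaged(i_frame_index):
    """With the I-frame gone, every frame in its GOP is undecodable."""
    return list(range(i_frame_index, i_frame_index + GOP_SIZE))

print(frames_needed_to_decode(37))               # must decode frames 30..37
print(len(frames_lost_if_i_frame_damaged(30)))   # the whole 15-frame GOP is lost
```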
Digital distribution
For theaters with digital projectors, digital films may be distributed digitally, either shipped to theaters on hard drives or sent via the Internet or satellite networks. Digital Cinema Initiatives, LLC, a joint venture of Disney, Fox, MGM, Paramount, Sony Pictures Entertainment, Universal and Warner Bros. Studios, has established standards for digital cinema projection. In July 2005, they released the first version of the Digital Cinema System Specification, which encompasses 2K and 4K theatrical projection. They also offer compliance testing for exhibitors and equipment suppliers.
Theater owners initially balked at installing digital projection systems because of high cost and concern over increased technical complexity. However, new funding models, in which distributors pay a "digital print" fee to theater owners, have helped to alleviate these concerns. Digital projection also offers increased flexibility in showing trailers and pre-show advertisements and allows theater owners to more easily move films between screens or change how many screens a film plays on. In addition, the higher quality of digital projection provides a better experience that helps attract consumers who can now access high-definition content at home. These factors have made digital projection an increasingly attractive prospect for theater owners, and the pace of adoption has increased rapidly.
Since some theaters currently do not have digital projection systems, even if a movie is shot and post-produced digitally, it must be transferred to film if a large theatrical release is planned. Typically, a film recorder is used to print digital image data to film, creating a 35 mm internegative; from that point, the duplication process is identical to that of a traditional negative from a film camera.
Comparison with film cinematography
Resolution
Unlike a digital sensor, a film frame does not have a regular grid of discrete pixels.
Determining resolution in digital acquisition seems straightforward, but it is significantly complicated by the way digital camera sensors work in the real world. This is particularly true for high-end digital cinematography cameras that use a single large Bayer-pattern CMOS sensor. A Bayer-pattern sensor does not sample full RGB data at every point; instead, each pixel is biased toward red, green or blue, and a full-color image is assembled from this checkerboard of color by processing the image through a demosaicing algorithm. Generally, with a Bayer-pattern sensor, actual resolution will fall somewhere between the "native" value and half that figure, with different demosaicing algorithms producing different results. Additionally, most digital cameras (both Bayer and three-chip designs) employ optical low-pass filters to avoid aliasing; suboptimal anti-aliasing filtering can further reduce system resolution.
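To illustrate why a Bayer sensor's delivered resolution falls below its pixel count, the following sketch simulates what an RGGB sensor actually records from a full-color image: only one color sample per photosite survives, and the remaining two-thirds of the color data must be estimated by the demosaicing algorithm. The layout and function name are illustrative assumptions, not any camera's actual pipeline.

```python
# Illustrative sketch: a Bayer-pattern sensor records only one colour channel
# per pixel (RGGB layout assumed here); the missing colour data are estimated
# afterwards by a demosaicing algorithm.
import numpy as np

def bayer_mosaic(rgb):
    """Simulate what an RGGB Bayer sensor captures from a full-colour image."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red   at even rows, even cols
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green at even rows, odd cols
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green at odd rows,  even cols
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue  at odd rows,  odd cols
    return mosaic

# A 4x4 test image: only one of three colour samples survives at each site,
# which is why delivered resolution ends up below the sensor's pixel count.
rgb = np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8)
print(bayer_mosaic(rgb))
```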
Grain and noise
Film has a characteristic grain structure, and different film stocks have different grain.
Digitally acquired footage lacks this grain structure; instead, it exhibits electronic noise.
Digital intermediate workflow and archiving
The process of using digital intermediate workflow, where movies are color graded digitally instead of via traditional photochemical finishing techniques, has become common.
In order to utilize digital intermediate workflow with film, the camera negative must first be processed and then scanned to a digital format. Some filmmakers have years of experience achieving their artistic vision using the techniques available in a traditional photochemical workflow, and prefer that finishing/editing process.
Digitally shot movies can be printed, transferred or archived on film. Large-scale digital productions are often archived on film, as it provides a safer medium for storage, which benefits insurance and storage costs. As long as the negative does not completely degrade, it will always be possible to recover the images from it in the future, regardless of changes in technology, since all that is involved is simple photographic reproduction.
In contrast, even if digital data is stored on a medium that will preserve its integrity, highly specialized digital equipment will always be required to reproduce it. Changes in technology may thus render the format unreadable or expensive to recover over time. For this reason, film studios distributing digitally-originated films often make film-based separation masters of them for archival purposes.
Reliability
Film proponents have argued that digital cameras lack the reliability of film, particularly when filming sequences at high speed or in chaotic environments, because of technical glitches in digital cameras. Cinematographer Wally Pfister noted that for his shoot on the film Inception, "Out of six times that we shot on the digital format, we only had one useable piece and it didn't end up in the film. Out of the six times we shot with the Photo-Sonics camera and 35mm running through it, every single shot was in the movie." Michael Bay stated that when filming Transformers: Dark of the Moon, 35mm cameras had to be used for slow-motion shots and for sequences where the digital cameras were subject to strobing or electrical damage from dust.
Criticism and concerns
Some film directors such as Christopher Nolan, Paul Thomas Anderson and Quentin Tarantino have publicly criticized digital cinema, and advocated the use of film and film prints. Tarantino has suggested he may retire because he will no longer be able to have his films projected in 35mm in most American cinemas. Tarantino considers digital cinema to be simply "television in public." Christopher Nolan has speculated that the film industry's adoption of digital formats has been driven purely by economic factors as opposed to digital being a superior medium to film: "I think, truthfully, it boils down to the economic interest of manufacturers and [a production] industry that makes more money through change rather than through maintaining the status quo."
Another concern with digital image capture is how to archive all the digital material. Archiving digital material is turning out to be extremely costly, and it creates issues in terms of long-term preservation. In a 2007 study, the Academy of Motion Picture Arts and Sciences found that the cost of storing 4K digital masters is "enormously higher - 1100% higher - than the cost of storing film masters." Furthermore, digital archiving faces challenges due to the insufficient longevity of today's digital storage: no current medium, be it magnetic hard drives or digital tape, can reliably store a film for a hundred years, something that properly stored and handled film can do. Although this also used to be the case with optical discs, in 2012 Millenniata, Inc., a digital storage company based in Utah, released M-DISC, an optical storage solution designed to last up to 1,000 years, offering the possibility of digital storage as a viable long-term option.
See also
- Digital versus film photography
- Filmizing
- List of motion picture topics
- Motion picture film scanner