For another view of this topic see digital cinema.
A highly recommended film featuring notable film and video people discussing and illustrating this issue is Side By Side (2012), available from sources such as Netflix.
New Insights in an Ongoing Debate:
Film vs. Videotape
Which is better: film or videotape?
The fact is, each is superior in some ways; it depends on your needs.
At the same time we need to acknowledge the fact that much of the information about the "inferiority of video" is no longer valid. Even so, old beliefs persist.
First, let's look at the advantages of film.
Advantages of Film
The TV Production modules concentrate on television, so here we tend to emphasize the advantages of video. Even so, film has some advantages.
First, film has been around for more than 100 years and some of the earliest films can still be shown. In contrast, in its comparatively short history video has gone through dozens of incompatible formats, most of which are no longer around. Thus, film has a major archival advantage.
After more than 100 years of film production, a rich and highly sophisticated tradition has grown up around film. Unlike video production where newcomers may quickly find themselves functioning as camerapersons and even in some cases as directors, the feature film tradition typically involves long, highly competitive apprenticeships.
Less motivated people tend to drop out in favor of those who are more persistent, and dedicated.
Because of the rich heritage of film, the production and postproduction processes have not suffered from a lack of talent or supporting industries. In Southern California alone there are thousands of companies that specialize in various aspects of film production. (At the same time, it should be noted that with the increased use of video, film companies that had been in business for decades have now gone out of business.)
Comparing the closing credits of a major film feature with those of a typical video production provides some measure of the differences that still exist between the two media. (Try sitting through the closing credits of Pearl Harbor or Ratatouille!)
For decades, film has enjoyed rather consistent worldwide standards. A 16mm film can be broadcast on any of the world's broadcast systems, regardless of the broadcast standard, and a 35mm film can be shown in almost any theater in the world.
Video, on the other hand, has not only progressed through many formats, but there are now a half-dozen incompatible broadcast standards being used in various parts of the world. For producers with an eye on international distribution, film has for decades been the obvious choice.
But today, many productions are being shot "hi-def" (on high-definition video) and then (if necessary) converted to film.
Finally, there is the issue of familiarity. Many directors who grew up with film are comfortable with it, and they know how it will render scenes. They like the look of film -- including the slightly softer "Hollywood" ambiance created by technical issues we'll discuss later.
Relative Equipment Durability
It is commonly assumed that film equipment is more durable than digital video equipment -- primarily because film cameras are mechanically much simpler.
Although this may be true, professional video equipment--especially with the new solid-state recording media--is now being successfully used under the most extreme conditions.
Unlike film cameras, these video cameras have no moving parts that can be adversely affected by weather extremes. Color film stock, in contrast, will suffer problems such as color shifts when stored and used in extreme temperatures.
Technical Quality Compared
It is commonly believed that the quality of 35mm motion picture film as viewed on television is better than that of video. If we are talking about artistic differences, then film may still have an advantage for the historical reasons we've noted.
Although artistic differences between film and videotape are difficult to measure, purely technical differences are not. This brings us to the following statement.
If production conditions are controlled and if comparisons are made solely on the basis of sharpness and color fidelity, the best 35mm film will be slightly inferior to the best video, assuming the latest professional-quality video equipment is used and the final result is broadcast.
As controversial as this statement might be with some film people, the reason becomes obvious when the production process for each medium is traced.
First, it is important to realize that if a signal from a video camera is recorded on the highest-quality process, no discernible difference will be noted between the picture coming from the camera and the picture that is later electronically reproduced.
With film intended for broadcast the process is far more complex.
First, the image is recorded on negative film. Typically, the original negative film is then used to make a master positive, or intermediate print. From the master positive a "dupe" (duplicate) negative is created, and from that a positive release print is made. This adds up to a minimum of three generations.
At each step things happen: color and quality variations are introduced by film emulsions and processing, there is a subtle optical degradation of the image, and the inevitable accumulation of dirt and scratches on the film surface starts.
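The way these losses stack up across generations can be sketched as a simple multiplicative model. The retention figure below is purely hypothetical, chosen only to show how even modest per-generation losses compound across the negative-to-release-print chain:

```python
def remaining_detail(generations, retention_per_generation):
    """Fraction of original fine detail left after N duplication steps,
    assuming each generation retains the same fraction of the previous one."""
    return retention_per_generation ** generations

# Assume (purely for illustration) each generation keeps 90% of fine detail.
stages = ["camera negative", "master positive", "dupe negative", "release print"]
for gen, stage in enumerate(stages):
    print(f"{stage}: {remaining_detail(gen, 0.90):.2f} of original detail")
```

With even a generous 90% retention per step, the release print would carry only about 73% of the original fine detail, which is why detail lost here cannot later be "enhanced" back.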
After all of these steps, the film release print is projected into a video camera to convert it to an electronic signal, which is where the video signal started out in the first place.

There is also this: unlike video, film is based on a mechanical process. As the film goes through the gate of a camera and projector, there is an inevitable loss of perfect registration. This is easy to see when you sit close to a large motion picture screen and note the ever-so-slight variations in the placement of sharp (primarily horizontal) lines. This is often referred to as judder, and it results in a slight blurring of projected film images.
To understand more of the film-video sharpness difference we must bear several other factors in mind. Film is theoretically capable of resolving several times more detail than standard video.
But, since film loses much of its sharpness on its route from film camera to television camera, electronic image enhancement is routinely used to restore lost sharpness when the film is converted to video. Although image enhancement sharpens the overall look of the film image, once lost, subtle details cannot be enhanced back into existence.

At the same time, video is becoming capable of resolving ever-greater levels of fine detail. Eastman Kodak has announced a CCD chip with 16,777,216 pixels, double the resolution of standard 35mm film.
But the sharpness of video isn't necessarily a plus.
Many people think the slightly softer look of film is actually one of its advantages. For one thing, the soft ambiance surrounding the film image is subconsciously if not consciously associated with "Hollywood filmmaking."
There are also subtle tonal and color changes with film which, while not representing the true values of the original subject matter, are subconsciously associated with film and its historical heritage.
At the same time, the slightly sharper image of video is associated with news and the live coverage of events, subject matter that is very much in contrast to the normal fare of feature films.
For these reasons the technical quality of "hi-def" is often subtly degraded in various ways to look more like film.
Coping With Brightness Ranges
Until recently, video cameras simply could not handle the brightness range of film. (Remember, 30:1 is the maximum brightness range for many home receivers.)
If film exposure is carefully controlled, a bright window in the background of a scene, for example, will not adversely affect the reproduction of surrounding tones.
As a result of early experience with professional tube-based video cameras, many producers concluded that film had a major advantage over video. And, at that point, it clearly did.
But times have changed. One video camera (the Phantom 65), demonstrated at the 2008 NAB convention, can handle a 10,000,000:1 contrast ratio -- or 23 f-stops of exposure latitude in the same scene.
In a demonstration, the camera was able to clearly see the burning filament in a clear, lit 500-watt bulb and, at the same time, reproduce background objects.
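The relationship between f-stops and contrast ratio is simple doubling math: each f-stop represents a doubling of light, so the ratio is 2 raised to the number of stops. A quick sketch:

```python
import math

def stops_from_ratio(contrast_ratio):
    """Each f-stop doubles the light, so stops = log2(ratio)."""
    return math.log2(contrast_ratio)

def ratio_from_stops(stops):
    """Inverse: a given number of stops spans a 2**stops ratio."""
    return 2 ** stops

# The 30:1 range of many home receivers is only about 5 stops...
print(round(stops_from_ratio(30), 1))   # 4.9
# ...while 23 stops of latitude is roughly an 8.4-million-to-1 ratio,
# close to the 10,000,000:1 figure quoted above.
print(ratio_from_stops(23))             # 8388608
```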
The Red One video camera from the Red Digital Cinema Camera Company has a 4K resolution, exceeding the best broadcast HDTV.
Other digital cameras being used in production are Sony's F35 and F23, Panavision's Genesis, and Arri's Arriflex D-21.
This graphic shows the relative pixel resolution of several ultra-high definition formats.
There is also a less obvious difference between film and video.
The NTSC analog film-to-video conversion process requires some technical "fancy footwork" that results in the introduction of almost subliminal effects associated with the film image on TV.
NTSC video is transmitted at 30 frames per second, and the frame rate for film is 24 per second. (The machine shown on the right converts film images to video.)
Because there is no nice, neat math associated with dividing 30 by 24, the only way to make the conversion is to regularly scan some film frames twice.
This results in a subtle high-speed jitter, a type of artifact that has become associated (if only subconsciously) with the film image on TV.
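In practice the conversion uses what is known as a "2:3 pulldown" cadence: film frames are alternately held for two and then three video fields. This minimal Python sketch counts whole fields and ignores interlace details such as field dominance:

```python
def pulldown_2_3(film_frames):
    """Expand a sequence of 24 fps film frames into 60 Hz NTSC fields
    using the 2:3 cadence: frames alternately occupy 2, then 3 fields."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeat = 2 if i % 2 == 0 else 3
        fields.extend([frame] * repeat)
    return fields

one_second = list(range(24))   # one second of film: frames 0..23
fields = pulldown_2_3(one_second)
print(len(fields))             # 60 fields -> 30 interlaced video frames
print(fields[:10])             # [0, 0, 1, 1, 1, 2, 2, 3, 3, 3]
```

The uneven 2-3-2-3 repetition is exactly the "some frames scanned an extra time" described above, and it is the source of the subtle cadence viewers associate with film on TV.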
With the SECAM and PAL broadcast standards used in non-NTSC countries, the conversion process is easier. Both of these video systems operate at 25 frames per second, very close to the 24 fps used in film. The 1 fps difference is almost impossible to detect, so adjusting the film camera or projector rate to 25 fps is a common solution.
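The arithmetic behind that "almost impossible to detect" claim is worth a quick look; simply running 24 fps film at 25 fps speeds everything up by about 4 percent:

```python
film_fps, pal_fps = 24, 25
speedup = pal_fps / film_fps                 # 25/24, about 1.042

runtime_100min_film = 100 / speedup          # minutes when played at 25 fps
print(f"{(speedup - 1) * 100:.1f}% speedup") # 4.2% speedup
print(f"{runtime_100min_film:.0f} min")      # a 100-minute film runs ~96 min
```

One side effect of this approach (not mentioned above) is that audio also plays about 4 percent fast, which slightly raises its pitch unless it is corrected.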
DI - the Intermediate Digital Step
By 2005, major motion pictures were using the advantages of digital imaging (DI) as an intermediate step between the color negative film shot in the camera and the final release print copied for use within theaters. (Here, we are talking about films made for theatrical release.)
Scanning the film into digital form provides much more control over color correction and artistic color changes.
Of course once in digital form visual effects with video are much easier and less expensive than with film.
One of the quality compromises involved in HDTV has been the need to compress the signal.
However, as the cost of digital recording and storage has decreased, we are seeing some production facilities move to uncompressed (4:4:4, 10-bit) video recording and editing. Silence Becomes You, released in 2005, was billed as the world's first uncompressed 4:4:4 feature production -- shot with a video camera and later converted to film.
Once this approach is more widely adopted, we'll see a major jump in image quality and post-production speed and economy, making the switch to "hi-def" even more attractive.
So-called digital cinema or e-cinema (electronic cinematography) is rapidly gaining ground, especially since it is becoming almost impossible for most theater patrons to distinguish between it and film.
E-cinema is now preferred by many independent "filmmakers," and major "film" competitions now have more entries on video than on film.
The major disadvantage of the move to digital cinema has been the limited resolution of video projectors. But the latest generation is based on projector imagers with a 4-megapixel resolution -- twice that of the previous generation. The detail possible with these projectors exceeds that of 35mm film projection.
Now the major stumbling block for digital cinema is the great initial investment in equipment--the projector and the associated computer. However, once this investment is made, major savings can be realized.
Directors of Photography in film often resist moving to video equipment because "everything is different." It can take decades to move up to a Director of Photography position, and old habits and patterns of thinking are difficult to break.
For this reason, video camera manufacturers have made some of their cameras resemble the operation of film cameras.
The video camera shown here uses standard 35mm motion picture lenses.
This means that directors of (film) photography do not have to abandon all that they have learned over the years with film camera lenses.
Previously, we mentioned the almost subliminal effect that the NTSC film-to-video process creates. To make video look even more like film, even this "double-step" effect (resulting from the extra film fields being regularly added) can be electronically created. In fact, everything, right down to electronically generated random specks of "dust," can be added to the video image! (For a time -- and for questionable reasons -- video was being made to look like film -- bad film, in fact -- by adding a host of electronic scratches, dirt, and even flash frames.)
This extreme step aside, the first practical step in creating a "film look" with video is the use of camera filters; a number of filters are commonly used to make video look like film (if that's your goal).
Film also can have a more saturated color appearance. With sophisticated video equipment this can be simulated by adjusting the color curves in a video editor. It can also be addressed in postproduction by channeling video through computer programs such as Photoshop CS3, After Effects, or Chroma Match.
By softening the image to smudge the digital grid of video, and reducing the contrast, you can take additional steps to make video look like film.
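Those two steps (softening and contrast reduction) can be illustrated with a minimal NumPy sketch. This is purely a toy model -- real "film look" work uses the dedicated filters and color tools mentioned above, not a box blur:

```python
import numpy as np

def soften(image, radius=1):
    """Simple box blur: average each pixel with its neighbors to smudge
    the hard digital grid (a stand-in for a real diffusion filter)."""
    padded = np.pad(image, radius, mode="edge")
    out = np.zeros_like(image, dtype=float)
    size = 2 * radius + 1
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    return out / size ** 2

def reduce_contrast(image, amount=0.2):
    """Pull pixel values toward mid-gray, compressing the tonal range."""
    mid = 0.5
    return mid + (image - mid) * (1.0 - amount)

frame = np.random.rand(48, 64)          # a stand-in video frame, values 0..1
film_look = reduce_contrast(soften(frame))
print(film_look.shape)                  # (48, 64)
```

After processing, the tonal range is compressed from 0..1 to roughly 0.1..0.9, and fine pixel-level detail is averaged away -- a crude version of the softer, lower-contrast film appearance described above.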
Of course, the question is why would you want to degrade the quality of one medium to match another?
Possibly it's a matter of what people get used to. When people first heard high-fidelity audio, they didn't like it. After listening to music and voice for decades on low quality radio and phonograph speakers, they had become used to this as "the standard" in audio quality, and anything else--even something much better--didn't sound right.
Purely technical considerations aside, the primary underlying difference between film and video lies in the way each is shot.
Film is normally shot in a single-camera style, and video is often shot using a multiple-camera production approach.
In film each individual scene can be carefully set up, staged, lit, rehearsed, and shot. Generally, a number of takes are made of each scene and the best one is edited into the final production.
As they strive for perfection in today's high-budget feature film productions, some directors re-shoot scenes many times before they are satisfied. (Possibly the record is held by one well-known film director who reportedly shot the same scene 87 times.)
Quite in contrast, video is generally shot with several time-code synchronized cameras covering several angles simultaneously.
Instead of lighting being optimized for one camera angle, it must hold up for three or more camera angles at the same time. This means that it's generally lit in a rather flat manner, which sacrifices dimension and form. And, with the exception of single-camera production, multiple takes in video are not the rule.
Film and Videotape Costs
The minute-for-minute cost of 16mm and 35mm film and processing is hundreds of times more than the cost of broadcast-quality video recording.
Offsetting the savings with video is the initial cost of video equipment.
Depending on levels of sophistication, the initial investment in video production and postproduction equipment can easily be ten times the cost of film equipment.
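The trade-off between cheap recording and expensive equipment is a simple break-even calculation. All figures below are hypothetical, chosen only to show the shape of the arithmetic, not actual prices:

```python
# Hypothetical figures purely for illustration -- actual costs vary widely.
film_cost_per_min = 400.0     # stock + processing, dollars per recorded minute
video_cost_per_min = 2.0      # tape/solid-state media, dollars per minute
extra_video_gear = 250_000.0  # added up-front cost of the video equipment

savings_per_min = film_cost_per_min - video_cost_per_min
break_even_minutes = extra_video_gear / savings_per_min
print(f"Video pays for itself after about {break_even_minutes:.0f} recorded minutes")
```

Under these assumed numbers the equipment premium is recovered after roughly 628 recorded minutes, which is why high-volume operations tend to absorb the initial investment quickly.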
On the other hand, there is a substantial cost savings in using video for postproduction (visual effects, editing, etc.). As we've noted, for these and other reasons film productions intended for television are routinely transferred to video. This transfer can take place as soon as the film comes out of the film processor.
Reversal of the negative film to a positive image, complete with needed color correction, can be done electronically as the film is being transferred to video. From this point on all editing and visual effects are done by the video process. The negative film is then locked away in a film vault and kept in perfect condition.
Even for film productions intended for theatrical release, major time and cost savings can be realized by transferring the film to video for editing. The video version can then be used as a "blueprint" for editing the film.
Will Video Replace Film?
So will video or digital imaging soon replace film for prime-time TV production?
Yes, even though a few directors maintain that the look of film can't be matched by video, eventually video will replace film in motion picture work. Aesthetic issues aside, the transition is being driven by pure economics.
In 2011, the majority of productions done for TV were mastered on video.
Digital Update 05/05/2013
According to the National Association of Theatre Owners trade group, by 2012 more than 85% of the 4,044 U.S. theaters, representing 34,161 screens, had gone digital.
Those that haven't will either have to spend $60,000 or more on digital equipment or be forced to close, because soon movies will all be distributed on computer disks rather than on film.
Theaters that can't afford the move to digital are planning to close -- some after decades of serving small towns around the country.
Not only do digital "films" represent a major cost savings in duplication and distribution, but the technical quality (sharpness and clarity) of the image can be superior to film.
Many film buffs, including many film and TV directors, still strongly argue this point, of course. However, when "Hollywood" is 100% digital, this issue may only be a matter of historic interest.
© 2013, All Rights Reserved