Comparing Digital to Workprint
A great deal has been written about the two different ways
of editing. There's the old school, where you get down and
dirty in the editing room with workprint and mag track overflowing
from several trim bins, grease pencils that keep breaking,
Mylar tape that sticks to everything but won't lay flat at
the splice, and a flatbed that keeps snapping your film.
How about the new school? Just ease into that multi-directional
editing chair, adjust the indirect lighting in that clean,
climate-controlled editing bay, and let your fingers and mind
fly through the infinite permutations of possible editing
decisions, all by pressing a few buttons.
But before you label me another digital-non-linear-convert,
you need to hear the rest of the story.
With the advent of digital editing in the 1980s, debate erupted
over which type of editing (digital or workprint) was the
best way to deal with motion pictures. Since 1985, I have
worked with over 200 different editors, from first-timer film
students to chain-smoking old hands, and the jury is still
out on the subject; however, the tide is shifting heavily
in favor of digital editing. Obviously, the vast majority
of people under 35 have grown up with computers, and naturally
lean toward digital editing. The lion's share of film schools
have already converted, or are in the process of converting,
their facilities to accommodate the digital editing format,
and even we old-timers have recognized the value of non-linear
editing. Only the purists, a few editors, and some film instructors
are the holdouts. Being a bit of a purist myself, I do mourn
the passing of another vestige of a seemingly bygone era.
But my point here is not to look back at what was, but
rather look at what is: just what are the technical and creative
pros and cons for each approach, and how, in some instances,
do they work together?
It is easy to understand the allure of digital non-linear
editing. The colorist will sync your dailies at the point
of transfer. All of your scenes and takes are carefully ordered
in bin files as you digitize into the computer. Any notes
can be easily filed with your dailies in the computer, and
never lost. You can edit with speed and ease, saving different
edited sections for comparison, and make changes with the
stroke of a key with no frames lost. You don't have to visualize
your effects--you render them out.
But is there a dark side to this? No, I'm not talking about
a ghost in the machine--just something that's difficult to
define. Is something lost, or unaccounted for, in the digital editing
process? Something that the old-timers understand, but don't
consciously think about in the editing process? Something
that anyone who hasn't been exposed to physically editing
in film would miss? Something that could undermine their project?
In a word, yes. Some of my purist friends talk about it as
an organic approach to film editing. Many believe that actually
handling the film--physically marking, cutting, and splicing
the workprint--allows the editor/filmmaker to better understand
the essence of his project. This is a little too esoteric
for me. I would rather consider it from a 'what-is' point of view.
Here are the issues I will cover, and you can pick the ones
you would like more information on:
1. The technical pros & cons of editing on a digital non-linear system.
2. The subjective pros & cons of editing on a digital non-linear system.
3. Why is it necessary to build a relationship between the film and the video?
4. Why does someone edit on film at all?
5. What does the 3/2 pull down have to do with editing?
6. How can film and digital editing work together?
7. Why is a 24 fps editing system preferable to a 30 fps system?
8. Why does film play at 24 fps and video at 30 (or is it 29.97?) fps?
1. The Technical Pros & Cons of Editing on a Digital Non-Linear System
I've made a straight-up comparison of costs between workprint
editing and digital/non-linear. With no discounts, deals,
or favors, the costs worked out to be about the same. The only
difference was the time involved in the edit. The digital
cost more, but took less time, and vice-versa.
I stated the advantages of digital editing above. So what are
the cons? Everything has its weakness. In the case of a digital
non-linear system, it can be as general as human error in
translating the film to video, or as specific as creating
bad FLEx files. Each of these problems won't necessarily show
up until you try to output your cut footage to tape or when
you start your negative cut. By not being able to see the
full frame on the video monitor, one can create a world of
hurt in the finished product. An "invisible" flaw (boom mike,
a grip's elbow) can appear larger than life when projected
on the big screen.
With regard to fades and dissolves, there are standard lengths
of effects (16, 24, 32, 48, 64, 96 and 128 frames) that a
contact printer can do. Working with these standards, the
negative cutter can save you money by building 'A&B' rolls
during your negative cut (you'll have to 'A/B' roll in 16mm
anyway), so no optical printer work is needed. Anything but
fades and dissolves in these lengths--like freeze frames,
slow motion not done in the camera, reverse action, etc. will
demand a printed optical. And they ain't cheap. Some versions
of the Avid actually throw up a $ sign every time you create
one, just to remind you that the alimony payment might be late.
A little note here: there has always been the desire (particularly
in Hollywood) to use opticals, because the effect is rendered
at the interpositive stage (a positive image) rather than
at the negative stage. This is said to create a smoother effect--but
with an added effect on your pocketbook. (I must admit that
fades are superior at the interpositive stage.) In general,
the typical filmgoer won't be able to see the difference--except
possibly in scenes that shift between radically different
lighting. Even then, it is often not an issue. It always comes
back to the fact that filmmaking is a balance between cost and quality.
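The standard lengths above lend themselves to a quick check. This is only a sketch of the rule as stated (the set of lengths comes straight from the paragraph above; the function name is my own):

```python
# Fade/dissolve lengths (in frames) that a contact printer can handle,
# as listed above. Anything else requires a printed optical.
CONTACT_PRINTER_LENGTHS = {16, 24, 32, 48, 64, 96, 128}

def needs_optical(effect_length_frames):
    """True if an effect of this length forces a costly printed optical."""
    return effect_length_frames not in CONTACT_PRINTER_LENGTHS

print(needs_optical(48))  # False: a standard 2-second dissolve at 24 fps
print(needs_optical(50))  # True: non-standard length, off to the optical printer
```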
Of course, there is the possibility of a glitch in the system.
It could be something simple like a few bad key numbers that
would have stuck the shot of the monkey into your big dramatic
scene, and your negative cutter catches it. More likely, (and
even worse) something major happens, like half of the edits
don't have key numbers, just question marks.
All computers are susceptible to loss of information. It can
be anything from a power outage to a hard disk failure. I
was even told once that a temperature variation between the
head and the platter of a hard drive can cause a glitch.
Any one of these things can cause a problem with one shot or
a huge portion of the project.
The idea behind these scare tactics is to underline the need
to be informed, be prepared, take precautions, and have contingency
plans in place for these possible problems. Continue on and
at least some of these problems will be addressed. It's always
better to address a crisis before it happens, rather than
in the middle of one.
2. The Subjective Pros and Cons of Editing
on a Digital Non-Linear System
Once again, the advantages of editing digitally are pretty
self-evident: the ease of the edit; the speed of the edit;
the ease of changing an edit to see a scene occurring as "another
possibility." Digital editing allows the filmmaker the ability
to see fades and dissolves (as well as a number of other cool
effects) on screen, and generally offers a greater sense of
creative freedom to the user. There is no doubt the digital
non-linear editing system allows you to expand your editing options.
But what if you get so involved in discovering the next possible
change in your edit that you lose your original direction?
What if the speed of the edits won't allow you to absorb the
visual content of the scene, and you miss that needed cutaway,
pause, or look? An old film editor I knew back in my
Hollywood days once remarked that there are 3 stories in every
film: the film that is written, the film that is shot, and
the film that is edited. Don't get lost in the whiz-bang of
the machine. Stay true to your vision and your intuition.
Another factor to consider is that you can take in more at one
time on a small video monitor than on the big screen. The
theater screen is large--there is much more up there to look at. So,
on the monitor you might have the tendency to cut away from
a scene too soon. Once it's on the big screen, you may realize
that the audience didn't have time to take in the impact of the scene.
When dealing with optical effects, there is something else
to consider. Whether or not the system boasts the ability
to render an effect to look like film, it simply will not
look the same on the big screen. This
has to do with the light curve of the film and D-min and D-max
principles. I'm not going to go into that now, but suffice
it to say that film effects appear shorter on
the screen than on the video monitor. Fade-outs and fade-ins
are quicker, and the elapsed time for black in-between appears
longer. Dissolves also appear shorter on the big screen.
Remember: Knowledge is king, and having an understanding of
the possible pitfalls you can encounter will better prepare
you for the editing process on a digital non-linear system.
3. Why Is It Necessary To Build a Relationship
Between the Film and the Video?
It is necessary to build a relationship between film and video
if you are planning to project your work on film.
Film expresses time and quantity in feet and frames (using
printed numbers sometimes known as Keykode), while video expresses
time and quantity in hours, minutes, seconds, and frames (Timecode).
Film runs at 24 frames per second, while standard NTSC video
plays at 29.97 frames per second (I'll tell you about PAL
another time). The reality is that in transferring your negative to video and
digitizing it into a computer, you are creating an "imaginary"
film that doesn't exist in the real world. So how does the
editor know that the video correctly relates to the film (meaning
that the timecode correctly relates to the Keykode), and that
a negative cutter can make heads or tails out of what is supposed
to be the final edited version?
A relationship must be built between the "real," physical
negative and the "imaginary" stage of the digitally edited
output. The bridge between the two is established at the telecine
(film-to-video transfer) stage. Keep this thought, and remember
that at the transfer stage, you will have created a 3/2 pull
down, and will have transferred at 29.97 non-drop fps. Both
of those terms are covered in further detail on the Editing
on Video page of this site.
A correct transfer begins by asking the laboratory to do a
'Zero/A frame transfer'. The "zero" refers to the zero frame
in the Keykode running numbers. In 16mm film, the key number
appears every 20 frames (2 key numbers per foot). The
frames count from zero to 19 before the key number changes.
In 35mm, the key number appears every 16 frames, at 16 frames per foot.
By instructing the telecine operator to punch the first 'zero'
frame, as well as the last visible frame of each camera roll,
the colorist can easily locate those frames on his/her video monitor.
The colorist can then reset his/her field/frame count to 'A'
at the zero frame. Because 24 frames of film become 30 frames
of video every second, video frames
are identified not only by the count from zero to 19 in 16mm
(or zero to 15 in 35mm), but also with a following letter
designation: A, B, C, C, D. The letter relates the video frame
to the film frame, and the repeated letter C indicates the
extra frame of video created in the 3/2 pull down. We then
get 30 frames of video for every 24 frames of film.
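The letter scheme just described can be sketched in a few lines of code (an illustration of the counting only, using the article's A, B, C, C, D labeling; not telecine software):

```python
# 3/2 pull down: every 4 film frames (A, B, C, D) become 5 video
# frames (A, B, C, C, D) -- the repeated C is the extra video frame.

def pulldown_labels(film_frames):
    """Return the video-frame letter labels for a run of film frames."""
    labels = []
    for i in range(film_frames):
        letter = "ABCD"[i % 4]
        labels.append(letter)
        if letter == "C":  # the duplicated frame in each group of four
            labels.append(letter)
    return labels

print("".join(pulldown_labels(4)))  # ABCCD
print(len(pulldown_labels(24)))     # 30 -- one second of film becomes 30 video frames
```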
With a written record of both punches, the colorist, editor, and
negative cutter can compare the numbers to what appears on
the monitor. It is important to realize that this needs to
happen with every exposed camera roll, including short ends.
What would happen if the punches and the 'zero/A' frame were not
reset? The barcode reader on the telecine machine would figure
out that it was a different number after it went a few feet
into a new camera roll, but in 16mm you have one chance in
twenty that the frame count remained the same when the lab
spliced one camera roll onto the next.
Without establishing a film-to-video relationship for every camera
roll, all of the frame counts on the edits that come from
the camera rolls could be off by as many as 19 frames. Most
of the editors I know would have a hissy fit if they discovered
a one-frame discrepancy. Imagine if their edit points were
off by ten or fifteen frames! By having the lab take care
of this in advance, and performing a zero/A frame transfer,
the relationship built between the video dailies and the camera-roll
negative is known and verifiable.
If the video is digitized into an Avid or Lightworks system
editing at 24 fps, the zero/A frame relationship also helps
maintain the correct frames by allowing the editing system
to drop the "phony" fields from the transfer during the digitizing
process (if you're editing on a 30 fps system, this doesn't
apply). The editor can check the transfer for accuracy by
referring to the written information from the lab and comparing
it to the zero-frame punches seen on the video.
The punches create a relationship between Timecode and Keykode in the
transfer, using the zero/A frame. It is important that the
transferred video have visible timecode and keykode "burn"
windows for visual reference. As an aside, it seems that the
new Spirit Telecines do not require that the film be punched,
due to the fact that the machine can establish the zero frame
based on the position of the barcode on the negative.
4. Why Does Someone Edit on Film At All?
In a word... reality. That reality translates into two key points
about editing in workprint. The first is, unlike digital editing
where the film exists in a nether-world of ones and zeros,
workprint editing has a direct physical relationship with
the negative (and the magnetic track). That relationship is
a one-to-one frame ratio. There is one physical frame of workprint
for every physical frame of negative. No extra fields of video--no
"pretend" ones and zeros--just the simple physical reality
of workprint and negative. This enables the editor to know
that exactly what is produced in the workprint will be duplicated
in the negative cut.
Second, and equally important, is the ability to see the edited
workprint as it will be seen as an end product--on the big
screen. Since you are editing on physical film, the cut workprint
can be moved from the flatbed to the projector. Having also
edited your magnetic track, this process enables you to screen
both in an interlocked presentation. So you can hear, as well
as see, what you have edited.
Being able to see exactly what you have in your edited version
also eliminates those ugly surprises that can be overlooked
on the video monitor or the small flatbed screen. The car
parked 'off screen' in your western. The sound boom that should
be in the 'safe area'. The light stand that is definitely
'off camera'. Although everyone always strives to avoid these
kinds of problems, it's a lot better to discover them at the
editing stage, rather than in your answer print. Also, it's
always better to cut your negative only once, rather than
trying to fix a problem in a recut.
5. What Does the 3/2 Pull Down Have To Do With Editing?
Because of the 3/2 pull down, you have created 5 frames of
video for every 4 frames of film. You know that when editing
on video, time is expressed in timecode. If you digitize into
a digital non-linear editing system that edits at 24 fps,
there is no issue because the system will drop the extra field,
making sure that the edits are expressed in a 1-video-frame
to 1-film-frame relationship.
If you digitize into a 30 fps system, a true relationship
(film-to-video) exists only for edits made on frames divisible
by 5. Let me explain. One-half of a film second is 12 frames,
one-half of a video second is 15 frames. If you divide 15
by 5, you get 3. If you subtract 3 from 15, you get 12. Using
this equation, you can determine whether or not you have a
direct video-to-film relationship. But what happens if you
cut on the 14th frame, or the 18th frame, or the 6th frame?
Any video frame not divisible by 5 does not have a physical
relationship to a film frame. There is a little bit of time
(either extra at the end of an edit, or a little less at the
head of a cut) that exists in the "imaginary" film, but not
in the actual camera rolls.
This dilemma can be solved by converting your edit decision list
to a 24 frames per second cut list by crunching the numbers
in a program like Excalibur.
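The divisible-by-5 rule works out to a one-line test. Here is a sketch of the arithmetic only (the function name is mine; in practice the cut-list software does this match-back for you):

```python
# On a 30 fps system, 5 video frames cover 4 film frames. A video
# frame lands exactly on a film frame only when its count is a
# multiple of 5.

def film_frame_for(video_frame):
    """Return the matching film frame, or None if there is no exact match."""
    if video_frame % 5 == 0:
        return video_frame * 4 // 5
    return None

print(film_frame_for(15))  # 12 -- half a video second maps to half a film second
print(film_frame_for(14))  # None -- no physical film frame corresponds
```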
6. How Can Film and Digital Editing Work Together?
The pros of digital editing being time savings and ease of
work, and the pros of workprint editing being the relationship
with the final film, the editor (if possible) needs to use
both. This is accomplished every day in Hollywood for both
big- and small-budget productions.
The big budget boys and girls have both workprint and video
dailies made. That is to say they have a workprint and magnetic
track made from the negative and recorded sound. Those are
then synced together in daily rolls and transferred to video
for digital editing.
The editor edits the film in a digital domain, and creates
an Assemble List that he/she passes off to the assistant editor.
The assistant editor then assembles the workprint based on
that list for screening and review. If any changes are needed,
they are made to both the digital edit and the workprint.
When everyone is satisfied, the workprint is turned over to
the negative cutter for the negative conform.
An alternative is having all of the editing done on video
prior to any workprint being generated. A pull list of select
takes (those takes used in the editing process) is generated
from the digital non-linear system. This is used as a guide
to build rolls of negative that will be printed by the lab.
The workprint is then conformed to the digital edit list,
and projected for the editor prior to moving to the negative
cutting stage. As an added bonus, most negative cutters offer
discounts for projects that will be conformed from workprint.
7. Why Is a 24 Fps Editing System Preferable To a 30 Fps System?
As described earlier, one of the advantages of the 24 fps system
is its ability to drop the extra fields that were created
in the 3/2 pull down at the telecine stage. By doing this,
the digital system is editing in a one-video-frame to one-film-frame
relationship. This eliminates the issue of editing on frames
of video that don't have a direct relationship to a frame of film.
The 24 fps system also offers the ability to recognize dupe
shots (the same piece or partial piece of negative used more
than once), and shots that are bumped up against themselves
in a cutback. (You need to drop at least 2 frames in 16mm,
and 1 frame in 35mm, between cutbacks because the frames are
needed by the negative cutter to make his/her splices.) Let's
look at this in a different way: You have chosen to use a
long shot that will be used twice in the final edit. The outgoing
frame of the first shot is frame 6 (counting from zero to
19) of a certain key number. You need to drop (not use) frames
7 and 8. So the first frame of the cutback (the next piece
of the shot that is used) is frame 9. The 24 fps system will
help you do this, thus saving you the cost of creating a dupe.
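The cutback arithmetic above is simple enough to write down (assuming the two-frames-in-16mm splice rule stated earlier; the function name is my own):

```python
# 16mm cutback: the negative cutter needs at least 2 frames of slack
# between the outgoing frame and the incoming frame of the cutback.
FRAMES_TO_DROP_16MM = 2

def first_cutback_frame(outgoing_frame, frames_to_drop=FRAMES_TO_DROP_16MM):
    """First usable frame of the next piece of the same shot."""
    return outgoing_frame + frames_to_drop + 1

print(first_cutback_frame(6))  # 9 -- frames 7 and 8 are dropped, as in the example
```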
The system will also tell you if you are creating an optical
effect. By definition this is a shot that requires an optical
printer (as compared to a contact printer) to complete. This
can include fades or dissolves that are not the correct length
for contact printing, duplicate shots, titles over picture,
freeze frames, slow or fast motion, reverse motion or repositioning,
split screen, etc. This would also include any shots that
were digitally composited and recorded out to film.
Since the 30 fps system was developed for video editing (television),
it doesn't care or compensate for its limitations when finishing
on film. You can use the same shot 100 times, and it wouldn't
matter. You can turn a shot upside down and backwards, and
because it's meant to remain within the digital realm, it
doesn't have to adhere to the physical requirements of film.
A 30 fps system is a lot less expensive to rent, but you need
to pay attention to several key factors, or they will come
back to bite you in the butt--either costing you money you
weren't expecting, or forcing you to change your edit after
you have locked picture.
8. Why Does Film Play At 24 Fps And Video
At 30 (Or Is It 29.97?) Fps?
I could give the long version, but my goal here is to inform,
and not put you to sleep. When film was first developed (if
you'll pardon the pun), a number of different projection speeds
(frames per second) were tried. The slowest speed that produced
a realistic moving image without noticeable strobing was 24 fps.
You know how the old silent movies flicker a little? That's
because they were filmed at 18 or 20 fps. 24 fps was also
needed for the sound to record properly on magnetic stock.
Naturally, you can always project faster, but that uses more
film, and therefore costs more money. And everybody knows
films cost way too much as it is. (I understand that tests
are proceeding on a system which projects at 30 fps, but uses
the same amount of film by reducing the frame size down from
4-perfs to 3-perfs. The problem isn't getting someone to use
a 3-perf camera, it's getting the theaters to change the gates
of the projectors to accommodate a 3-perf set up.)
Okay, so we are happily projecting film at 24 fps, and along
comes black and white television. TV uses an interlaced image
by creating 2 fields per frame, with each field scanning
every other line of the 525 lines on the TV screen. Together,
the 2 fields create a complete image (one frame). Domestic
TV also uses a 60 hertz cycle, so they found that 24 fps didn't
work, but 30 fps would because it divides evenly into 60. The
question then was how to get 30 frames out of 24 frames. The
answer was the 3/2 pull down. By creating a third field in
every other frame transferred, you get 30 frames of video
from 24 frames of film.
All of this worked wonderfully because the black and white
information could flow at the 30 frames per second rate. Then
along came color television. There was more information in
the 30 frames than could be packed into the one second. The
only answer was to slow down the frame rate to 29.97. This
is why a program on television runs 0.1% slower than the same
show on the big screen.
That's great information, but how would this affect the editing
process? This happens in two ways. The first revolves around
the transfer rate. You have a choice of transferring at 29.97
non-drop frame (for regular transfer), or 29.97 drop frame
(whose timecode stays in step with real clock time). The drop frame
transfer should only be used for shows not going back to film.
When editing on a 24 fps system, which eliminates the extra
fields generated in the transfer, you must always transfer
at 29.97 non-drop frame. Otherwise you are dropping the very
frame numbers the system needs to keep the film-to-video relationship intact.
The second effect involves the edited sound. Since you have
transferred at 29.97 non-drop frame, the editing and output
is slower (by 0.1%) than the finished film will play.
When taking the sound off any digital system (this includes
sound editing systems), remember you have two choices. The
system can output at either 23.976 fps or 24 fps. If you take
it off the digital system at 23.976 fps, it must be sped up
to 24 fps before transferring to an optical sound track. Otherwise,
you will have this "great sound track" that slowly drifts
out of sync.
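That drift is easy to quantify. A back-of-the-envelope sketch (23.976 fps is the exact pulled-down rate, 24/1.001; the function is mine):

```python
# Sound played at 23.976 fps against picture at 24 fps runs 0.1% slow,
# so sync drifts steadily over the length of the film.
FILM_FPS = 24.0
PULLED_DOWN_FPS = 24.0 / 1.001  # ~23.976 fps

def drift_seconds(runtime_minutes):
    """Seconds of sync drift accumulated over the given runtime."""
    runtime_s = runtime_minutes * 60
    return runtime_s * (FILM_FPS / PULLED_DOWN_FPS - 1)

print(round(drift_seconds(10), 2))  # over half a second after one 10-minute reel
```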