
Resolution - Digital versus 35mm Film


FoodLover



Hi Guys,

 

Thanks for all the great explanations in terms of cinematography. I find it interesting that the sensors top out at about 50 fps. If I remember right, film cameras (Arriflex/Panavision) can do speeds up to something like 200 fps.

 

One question that would interest me though: Once they have the film in the bag, how do they do editing nowadays (I'm not talking fx here, just normal scene to scene type work). Do they scan in the film and then work digitally, or do they still use the traditional cutting desk?

 

Andreas

Link to post
Share on other sites


From Wikipedia - Film Editing entry:

 

"Today, most films are edited digitally (on systems such as Avid or Final Cut Pro) and bypass the film positive workprint altogether. In the past, the use of a film positive (not the original negative) allowed the editor to do as much experimenting as he or she wished, without the risk of damaging the original.

When the film workprint had been cut to a satisfactory state, it was then used to make an edit decision list (EDL). ..... With the advent of digital intermediate ("DI"), the physical negative does not necessarily need to be physically cut and hot spliced together; rather the negative is optically scanned into computer(s) and a cut list is conformed by a DI editor."

 

As to frame rate and the data flow required, it has been noted in the video from the Nikon D90 and Canon 5DMk2 that there is occasionally a "jello effect" or wobbly distortion of fast action from top to bottom in the frame - somewhat analogous to the effect from early focal plane shutters where, for example, the wheels of a passing race car were distorted into ovals because the wheel moved during the time interval between exposure of the top and bottom of the picture.

 

With huge pixel counts there is enough delay in the transfer of data that the time difference becomes visible from the top to the bottom of the frame, at least as reassembled into HDTV by the cameras' processors.
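As a rough sketch of that top-to-bottom delay (purely illustrative numbers, not measurements from either camera), the skew only depends on how long the sensor takes to read out and how fast the subject crosses the frame:

```python
# Toy model of rolling-shutter skew: rows are read out one after another,
# so lower rows see the subject slightly later than the top row.
# Readout time and subject speed below are assumptions for illustration.

def row_skew_px(row, total_rows, readout_time_s, subject_speed_px_per_s):
    """Horizontal displacement of a given row relative to row 0."""
    row_delay_s = readout_time_s * row / total_rows
    return subject_speed_px_per_s * row_delay_s

# ~30 ms full-frame readout, 1080 rows, subject crossing at 5000 px/s:
print(round(row_skew_px(1079, 1080, 0.030, 5000)))   # ~150 px of lean at the bottom
```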

Link to post
Share on other sites

I agree with Luigi! Andy, your explanations are very much appreciated!

 

On a side note:

 

Resolution is not a high priority in cinema because any individual image is only visible for 1/24 of a second and because shutter speeds are often not much faster.

 

Actually, every frame is projected twice in order to make movement look smooth to the human eye. In the cinema, while watching a 90-minute feature, you're in complete darkness for 45 minutes.
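That "half the time in darkness" figure is just the projector's shutter duty cycle applied to the running time (the 50% figure here is the usual rough assumption for a two-blade shutter):

```python
# With the shutter closed roughly half of each frame cycle (during pull-down
# and between the doubled flashes), the dark time of a feature is simply:
feature_minutes = 90
dark_fraction = 0.5          # assumed typical shutter duty cycle
print(feature_minutes * dark_fraction)   # 45 minutes spent in darkness
```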

 

Which is why studios hire "stills" photographers to shoot the promotional pictures, rather than just grab frames from the movies themselves.

 

Another, perhaps even more important reason is that at the end of an expensive shooting day all that's left is the negative. Nobody touches it, except for one or two people in the lab. It's worth gold.

Link to post
Share on other sites

Hi Guys,

One question that would interest me though: Once they have the film in the bag, how do they do editing nowadays (I'm not talking fx here, just normal scene to scene type work). Do they scan in the film and then work digitally, or do they still use the traditional cutting desk?

 

Depends on the film. I would imagine almost all films are cut on a digital workstation nowadays. Lower-budget productions would cut on a telecined version at low resolution - like mini DV. Then you go back and cut the negative (or work print) to match the edited digital version. Then you make your distribution prints. A higher-budget version of that would be to telecine to some HD format. Of course, if your final output is HD (or SD) then you can skip the final print. My brother in film school does this - shoot on film, telecine, output to DVD, so there is no final print.

 

A higher-budget version would take the scenes that made the final cut (and maybe alternates) and scan them at 2k or 4k, then work the heck out of them digitally for color grading, effects, etc., then make a print from the digital files. This is the digital intermediate (DI) process mentioned above. I don't think anyone does a 2k or 4k scan of all of the original negative - that would have to be really expensive.

 

A side note about the high-speed stuff: even a hand-cranked 16mm camera from the '40s and '50s can do 64 fps. It's significantly easier to overcrank film than digital; just run the gears faster.

 

As far as the Bayer resolution thing goes, I don't really know. You aren't interpolating all the information, just mostly the color. Though you'd run into some problems with luminance resolution (for lack of a better description) if you were taking pictures of red and green grids, stuff like that doesn't happen much in the real world. Bayer filters and the resulting interpolation are a lot better than what they do in the video world, where they often take non-interpolated data and then throw out a bunch of color resolution upon storage. DV has atrocious color sampling and DVD isn't great either.
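To put rough numbers on the "mostly the color is interpolated" point, here is a small sketch of how a standard Bayer tile divides its photosites between the channels (just the counting, not an actual demosaic):

```python
# Count how a 2x2 Bayer tile (RGGB) splits its photosites between colours.
# Every pixel location only records one channel; the other two are interpolated.

def bayer_sample_fractions(tile="RGGB"):
    """Fraction of photosites devoted to each colour in one Bayer tile."""
    return {c: tile.count(c) / len(tile) for c in "RGB"}

print(bayer_sample_fractions())   # {'R': 0.25, 'G': 0.5, 'B': 0.25}
# Half the sites are green, which carries most of the luminance detail,
# so perceived sharpness suffers less than the colour channels do.
```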

Link to post
Share on other sites

One interesting thing we can learn from cinematography and their professional scanning (for digital intermediates): scan at higher resolutions to oversample and avoid grain aliasing, even when the scan resolution is way beyond the resolution of the film.

 

Good Blu-ray transfers (1080 lines of resolution) are therefore quite often scanned at 6k or 8k, then downsampled to 4k for storage/archiving and further down-rezzed to 1080p... Transfers made with ordinary telecines at 1080p to 2k are much less sharp.
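A minimal one-dimensional sketch of why the oversample-then-downsample route helps (a crude box average stands in for whatever low-pass filter a real scanner pipeline uses):

```python
import numpy as np

def downsample(signal, factor, prefilter=True):
    """Decimate by `factor`, optionally box-averaging first as a crude low-pass."""
    if prefilter:
        signal = np.convolve(signal, np.ones(factor) / factor, mode="same")
    return signal[::factor]

x = np.arange(8000)
grain = np.sin(2 * np.pi * x / 3.1)            # stand-in for fine film grain

naive = downsample(grain, 4, prefilter=False)  # grain aliases into coarse false detail
clean = downsample(grain, 4, prefilter=True)   # oversampled-then-filtered version
print(round(naive.std(), 2), round(clean.std(), 2))  # the aliased version keeps far more grain energy
```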

 

Professional digital cinematography tries to avoid color interpolation at any cost (many artifacts are extremely distracting). The 1080p cameras all have quite high sensor resolutions to get full 1080p 4:4:4 (every pixel has the full color information); some use prisms (3 sensors) or filtered sensors (6MP -> 2MP for every color channel, or even 12MP -> 2MP for R/G/B with two different exposures to increase dynamic range). Only lately have some "newcomers" tried to market Bayer-filtered sensors as superior by generating large images (12MP sensor -> 4k, color interpolated)...

 

There are much faster digital cameras (WEISSCAM) but they don't reach the IQ of digital cinematography cameras.

 

Soon they will finally reach real 4k resolution (the target for digital cinema), but they need a 37MP sensor (at 18x24mm!), which creates new problems (small photosites -> lower DR and increased noise again; max frame rate is only 30 fps).
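For a sense of how small those photosites get, here is the back-of-the-envelope pitch calculation for the figures quoted above (square pixels assumed):

```python
def pixel_pitch_um(megapixels, width_mm, height_mm):
    """Approximate photosite pitch in microns for a given sensor area."""
    area_um2 = (width_mm * 1000) * (height_mm * 1000)
    return (area_um2 / (megapixels * 1e6)) ** 0.5

print(round(pixel_pitch_um(37, 24, 18), 1))   # ~3.4 um for the 37MP, 18x24mm case
print(round(pixel_pitch_um(10, 24, 18), 1))   # ~6.6 um for a 10MP sensor of the same size
# Each halving of the pitch quarters the light collected per photosite,
# which is where the lower DR and higher noise come from.
```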

 

Film is still too good to get replaced for most artistic reasons right now...

 

Steven Spielberg and his editor are famous for still cutting film traditionally. Don't use the technology marketing tries to sell to you, but the technology that works best for YOU!

Link to post
Share on other sites

Well, there is of course the 3 CCD technology of Panasonic, which uses a separate CCD for each RGB colour.

 

Oh I know. Many consumer DV cams have 3 CCD chips. The kicker is that DV compression then proceeds to go to 4:1:1 (in NTSC) and tosses a bunch of that color info out. Likewise DVD is 4:2:0, no matter what the origin is. When I said DV, I didn't mean the camera, lens, or chip, I meant the recording format.
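To make the 4:1:1 / 4:2:0 comparison concrete, here is the usual way of reading that J:a:b notation, counted over the standard two-row reference block:

```python
# Fraction of chroma samples stored (per colour-difference channel),
# relative to full 4:4:4, over the usual 2-row-by-4-pixel reference block.

def chroma_fraction(j, a, b):
    return (a + b) / (2 * j)

for name, scheme in {"4:4:4": (4, 4, 4),    # every pixel keeps full colour
                     "4:2:2": (4, 2, 2),    # half the horizontal chroma
                     "4:2:0": (4, 2, 0),    # DVD/MPEG-2
                     "4:1:1": (4, 1, 1)}.items():   # NTSC DV
    print(name, chroma_fraction(*scheme))
# Both DV (4:1:1) and DVD (4:2:0) keep only a quarter of the colour samples.
```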

 

There are also plenty of cameras out there that do higher frame rates. I've used several that did 10,000 fps or even 100,000 fps. The problems with those are image quality and storage.

 

The pieces are here, but putting them all together in a usable package is taking some time. Film has a lot going for it in cinema origination in the sense that it operates at any frame rate you want, has 4:4:4 color sampling, and is a reasonably robust medium that provides an archival copy immediately. And decent resolution :)

Link to post
Share on other sites

  • 2 weeks later...

Great thread, but it doesn't explain to me something simple, which is why most digital images I've seen, either on a computer screen or a print, look soft, or even out of focus, compared to most scanned 35mm negs I've seen.

 

Having looked at many M8-generated pics on this site, there always seems to be a vast difference in resolution - they are either the sharpest or some of the least sharp, either wonderfully colourful or washed out. This may be purely a personal perception, but I'm curious to know if anyone else has the same feeling. I can't believe this is always down to a choice by the photographer.

 

And don't get me started on digital point-and-shoots (not Leica). Yet to see a pic from one that didn't look like it was taken with gauze over the lens.

Link to post
Share on other sites

Great thread, but it doesn't explain to me something simple, which is why most digital images I've seen, either on a computer screen or a print, look soft, or even out of focus, compared to most scanned 35mm negs I've seen.

 

Having looked at many M8-generated pics on this site, there always seems to be a vast difference in resolution - they are either the sharpest or some of the least sharp, either wonderfully colourful or washed out. This may be purely a personal perception, but I'm curious to know if anyone else has the same feeling. I can't believe this is always down to a choice by the photographer.

 

And don't get me started on digital point-and-shoots (not Leica). Yet to see a pic from one that didn't look like it was taken with gauze over the lens.

That answer is simple. Most digital images are taken with cameras which have an AA filter, essentially a matte piece of glass in front of the sensor, which makes it necessary to apply "sharpening" (contrast enhancement) in postprocessing. The rest is caused by either the photographer's choice or the photographer's skill in postprocessing. Especially on the Web, "sharpness" has little to do with resolution.

Link to post
Share on other sites

Interesting. I have naively used resolution and sharpness as interchangeable terms. Your answer means that the original digital images are often 'dumbed-down' so that they are more easily 'smartened-up' in post processing! Omigod.

Link to post
Share on other sites

A really good thread with some very informed argument.

 

The perceived web resolution or sharpness of images on this or any other web site is also affected by the contrast of the image. If the dynamic range can take it, an image destined for web or digital projection can often be improved with a touch of increased contrast after converting to 8-bit sRGB JPEGs.

 

Modern sensors such as the full frame Sony (as also used in the Nikon D3x) and Canon have a much improved dynamic range rivalling some medium format digital backs, but none yet have the total file quality that a true 16-bit digital back provides, IMHO.

 

Little mention has been made of DR in this film-v-digital debate. High-end DSLRs are currently producing 13 stops in most reviews (depending on the system of measurement used). Good film has always been considered able to beat digital in the DR stakes, and I wonder if that argument is also now over in favour of digital?

Link to post
Share on other sites

Interesting. I have naively used resolution and sharpness as interchangeable terms. Your answer means that the original digital images are often 'dumbed-down' so that they are more easily 'smartened-up' in post processing! Omigod.

 

Well, if you consider that web images are 900 pixels wide at most, that means that a camera with 1 Mp would be ample in terms of resolution. The only reason that better cameras do indeed produce better web images is that in the shrinking process the finer transients are interpolated to an illusion of better quality, and of course dynamic range, contrast, etc. are factors too. I always find it a bit sad to see everything that is in my photograph more or less disappear when I resize. On the other hand, seeing a full-res print compared to the screen is always a happy moment :)
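The arithmetic behind the "1 Mp would be ample" remark, assuming a 900-pixel-wide image at a 3:2 aspect ratio:

```python
width_px = 900
height_px = int(width_px * 2 / 3)        # 3:2 aspect -> 600 px
print(width_px * height_px / 1e6)        # 0.54 megapixels on screen
# Everything beyond that in a 10+ Mp file is discarded (or folded into
# smoother transients) when the image is resized for the web.
```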

 

And yes - an AA filter is a quality-reducing thing. That is the reason that a few cameras, specifically the M8, the DMR and digital backs, don't use them, and that other makers are slowly moving away from them. Of course, a user will see moiré effects immediately, whilst a softening of the image is less obvious when it has been covered up by "sharpening".

One would imagine that to be effective, an AA filter would have to affect at least one adjacent pixel. Assuming that the blurring is a linear gradient we come to 1/2 pixel - full pixel - 1/2 pixel, so the smearing out of the incoming information would amount to halving the sensor resolution, which is a good explanation for the fact that the M8 and DMR, being 10 Mp cameras, can easily compete with cameras up to 20 Mp in the final image.
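A toy version of that 1/2 - 1 - 1/2 smearing, showing what it does to detail at the sensor's finest (Nyquist) pitch - purely an illustration of the assumption above, not a model of any real filter:

```python
import numpy as np

kernel = np.array([0.5, 1.0, 0.5])
kernel = kernel / kernel.sum()                     # the assumed 1/2 - 1 - 1/2 blur

nyquist_pattern = np.tile([1.0, 0.0], 50)          # finest alternating detail a sensor can record
blurred = np.convolve(nyquist_pattern, kernel, mode="same")[1:-1]  # drop edge samples

print(nyquist_pattern.max() - nyquist_pattern.min())   # 1.0 : full contrast before the filter
print(round(blurred.max() - blurred.min(), 3))          # 0.0 : that contrast is wiped out
# Detail at half that frequency survives with reduced contrast, which is
# roughly the "halving the sensor resolution" intuition above.
```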

 

Of course I am hoping that a sensor expert will move in and tell me I'm spouting nonsense ;)

 

Of course the basic sensor resolution of the high-Mp sensor is still there, and will show up at extremely high magnification as more lines per mm, but for myself I think of it in this analogy: an AA filter introduces distortion, turning block waves into sine waves, and the subsequent sharpening compresses the sine waves into narrow peaks with flats in between. Translate this to sound: a higher frequency response, but a harsher sound for the higher-resolution filtered option.

Link to post
Share on other sites

Two things. First, I think the lack of an AA filter is just a bad design. Moiré can't be corrected properly after the fact. Now whether the anti-aliasing is done by a blurring filter, or an active optical element, or the lack of resolving power of the lens is unimportant - we should have anti-aliasing going on.

 

Second, I don't think digital cameras actually have 13 stops of range. I've yet to see it. A lot of reviewers take pictures of step wedges, crank the hell out of the raw developing, and then say, look at all those stops! Or even worse, they say 14-bit converters = 14 stops. Anyone can shoot a current digital camera and see the usable dynamic range is a little better than slides, 7-8 stops. Maybe 9. I've not seen anything that shows that current DSLRs actually have 13 stops of range. The problem with those other methods of determining range is that they usually ignore noise. Noise usually completely eats up the bottom two bits. And the linear nature of the cutoff in the highlights makes that a tricky area too.
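For what it's worth, the usual engineering way to put a number on sensor DR is the ratio of the biggest recordable signal to the noise floor; the electron counts below are made up purely to show why "14-bit converter" and "14 stops" are different claims:

```python
import math

def dr_stops(full_well_electrons, read_noise_electrons):
    """Dynamic range in stops: log2 of full-well capacity over the noise floor."""
    return math.log2(full_well_electrons / read_noise_electrons)

print(round(dr_stops(60000, 25), 1))   # ~11.2 stops with a noisy readout
print(round(dr_stops(60000, 8), 1))    # ~12.9 stops with a cleaner one
# Even these are upper bounds; they say nothing about how usable the last
# stop or two above the noise floor actually looks.
```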

 

Negative film usually has about 10 stops in the linear and almost-linear part of the curve. More importantly, the shoulder seems to go on for days. I don't know how many stops are up there in the films I shoot. I do know that it's really, really hard to completely blow them out to pure white. And the compressed nature of the shoulder further helps us out, since highlights will make a more gradual transition to being blown out, whereas with digital, it's just BAM, blown out. I'm not talking about specially processed negative film either. Standard-processed Tri-X or Portra 400 has just a huge dynamic range. What you can't do if you want to see some of that range is scan on a minilab scanner. Either do a wet print for B&W with the right contrast and dodge and burn, or scan on a decent scanner like a Nikon Coolscan.

 

In my mind, there are a lot of myths floating around about digital capture. The whole expose-to-the-right thing is founded on silly thought processes. I'm not saying there isn't a reason to do that, but it's not so you can get your image data up where all the levels are.

Link to post
Share on other sites

First, I think the lack of an AA filter is just a bad design. Moiré can't be corrected properly after the fact.

 

Without an AA filter the moiré issue occurs very, very occasionally - I happened to notice it on a couple of shots at the weekend, the first I can remember in a long time.

 

With an AA filter the softness occurs in every shot.

 

Personally I'm happy with the M8 approach.

 

The idea for normally exposing to the right is that for every stop (one bit, with linear encoding) you expose to the left, you're losing half of the possible sample values. I'm assuming this would be less of an issue if the A/D conversion were performed using a logarithmic scale rather than a linear one.
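Counting the code values per stop for a hypothetical 12-bit linear A/D makes that halving explicit:

```python
# Each stop below clipping gets half the code values of the one above it
# when the encoding is linear (12-bit ADC assumed purely for illustration).
bits = 12
for stop in range(5):
    levels = 2 ** (bits - stop) - 2 ** (bits - stop - 1)
    print(f"{stop} stop(s) below clipping: {levels} code values")
# 0 -> 2048, 1 -> 1024, 2 -> 512, 3 -> 256, 4 -> 128
```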

Link to post
Share on other sites

I would rather have a sensor with 40 Mp and an AA filter :) The lack of the AA filter, in my mind, is a compromise to get sharpness up where they want it to be with the available technology. Once our resolution gets high enough, they will either reinstate AA filters, rely on blurring intrinsic to the optics to act as an AA filter, or introduce it in some other manner that I'm unaware of.

 

As far as the expose-to-the-right business goes, what you stated is myth as far as I know. It does make sense to put your peak brightness value at the right side of the curve for several reasons, but not for that one. First, why throw away range to the right of the peak - put it at your maximum brightness, maximizing your range, and adjust in post; this will give you more shadows. Other reasons for doing this have to do with the noise characteristics. Noise rises with signal level, so all those extra levels used by exposing to the right go into digitizing photon shot noise. (Photon noise goes as the square root of the number of photons.) This does however help you out by giving you cleaner shadows. Other sources of noise factor in here too. The end result is the same - sometimes it is best to expose to the right. But the cause and effects of this process are not the typical 'accessing the larger number of levels in the higher bits' argument.
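The square-root relationship mentioned there, with purely illustrative photon counts, shows why the benefit of a brighter exposure turns up as cleaner shadows rather than as "more levels":

```python
import math

# Photon shot noise: SNR = N / sqrt(N) = sqrt(N), so quadrupling the
# captured light only doubles the signal-to-noise ratio.
for photons in (100, 400, 1600, 6400):
    print(f"{photons:5d} photons -> SNR about {math.sqrt(photons):.0f}:1")
```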

 

Here's a good link to read through.

Link to post
Share on other sites

Ummm... high-resolution MF digital backs have no AA filter. I think it is just the other way around - an AA filter is a compromise to avoid moiré, which can be handled in software nowadays for a large part, but a compromise with loss of quality. I think the future will see fewer and fewer AA-filtered sensors in the higher segment. Canon is already weakening the AA filters on its newest models. And, btw, as explained above, resolution is not the same as sharpness. An AA-filtered sensor may have a very high resolution, but it will not be as sharp as it could be.

Link to post
Share on other sites

Ummm... high-resolution MF digital backs have no AA filter. I think it is just the other way around - an AA filter is a compromise to avoid moiré, which can be handled in software nowadays for a large part, but a compromise with loss of quality. I think the future will see fewer and fewer AA-filtered sensors in the higher segment. Canon is already weakening the AA filters on its newest models. And, btw, as explained above, resolution is not the same as sharpness. An AA-filtered sensor may have a very high resolution, but it will not be as sharp as it could be.

 

That's nice that high-resolution MF backs don't have AA filters. Moiré cannot be properly handled in post processing. You can patch it up, but you are still compromising your imaging. As resolution increases for a given format size, there will be less and less reason not to anti-alias the image*. Aliasing can occur in any sampled system, and it sucks when it does. It's all about a system's Nyquist frequency, and when you start playing games trying to sample above that frequency, or run without a low-pass filter, you can get artifacts at lower frequencies.

 

I understand resolution isn't sharpness and vice versa. At some point, if we get our pixel density up high enough, the diffraction limit on our lenses' resolving power will be visible at all apertures. Once we can start resolving the Airy disk at the point of focus, I would think we can totally do away with AA filtering, because the resolution limits of the lenses will introduce enough blur to prevent aliasing in our sampling.

 

*Again, this can come from sources other than a traditional filter. An active optical element or blurriness from the limited resolving power of the lens would work as well. Once we reach that point, you might say, "Ha! I told you so, no more AA filter." I'd say in response that I don't care about the filter itself, just that we have the proper amount of blur for a given sensor's Nyquist frequency to prevent aliasing.
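As a rough check of that diffraction argument, here is the Airy-disk diameter (2.44 x wavelength x f-number) compared against a few-micron pixel pitch - the wavelength and apertures below are just illustrative:

```python
def airy_diameter_um(f_number, wavelength_nm=550):
    """Diameter of the Airy disk (out to the first dark ring), in microns."""
    return 2.44 * (wavelength_nm / 1000) * f_number

for n in (2.8, 5.6, 11):
    print(f"f/{n}: Airy disk ~ {airy_diameter_um(n):.1f} um")
# ~3.8 um at f/2.8, ~7.5 um at f/5.6, ~14.8 um at f/11.
# Once pixel pitch drops to a few microns, moderate apertures already blur
# past the sensor's Nyquist limit and do the AA filter's job optically.
```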

Link to post
Share on other sites

