LuLa "Equivalent-Lens" debate


ho_co



For technicians only:

If you're not following the section-by-section growth of the article at Luminous Landscape, you should check it out.

 

Two knowledgeable people with different optical backgrounds are (relatively) coolly engaged in a very interesting and (somewhat) scholarly debate over some of the ramifications of the introduction of digital sensors into the field of optics.

 

The first section covers some of the same territory as the Olaf Stefanus LFI 3/2006 article "Form Follows Format," after Peter Karbe's introduction of the concept of "equivalent aperture" alongside "equivalent focal length."

 

From there, two different approaches to optics and sensors arise between the two scholars, and the repercussions are startling.

 

One thing is sure: We thought we had a good understanding of the physics of light, but our understanding of how light interacts with the physical nature of digital sensors is still in flux (if you'll pardon the pun).

 

--HC


1. Yes, it is a generally interesting discussion - but

 

2. I have to say I just shoot my lenses. A 90 'cron or a 50 'lux at full aperture will blur backgrounds at certain subject/background distances, on both silicon and film. Worrying about whether the result is precisely equivalent to what they would do on a bigger or smaller image area, or precisely equivalent to a 107.32mm f/2.356 lens on a different image area, seems to belong in the "angels on the head of a pin" category of philosophy.

 

My 90 'cron (as a "120mm" f/2 lens) blurs backgrounds more effectively than my Sony R1 zoom (as a "120mm" f/4.8 lens) primarily because of the difference in aperture. The relative crop factor of a 72mm-lens-cropped-1.6x vs. 90mm-lens-cropped-1.33x plays a role, but it is minor compared to plain old f/stop.
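For anyone who wants to see the arithmetic behind that comparison, here is a minimal sketch. The crop factors (1.33x for the M8, 1.6x for the R1) are taken from the post above; the entrance-pupil comparison is my own addition, since for distant backgrounds blur scales roughly with the physical aperture diameter.

```python
# "Equivalent lens" arithmetic from the post above. Crop factors are the
# poster's figures; the pupil-diameter comparison is an added illustration.

def equivalent_lens(focal_mm, f_number, crop):
    """Full-frame lens with (roughly) the same framing and depth of field:
    scale both focal length and f-number by the crop factor."""
    return focal_mm * crop, f_number * crop

def pupil_mm(focal_mm, f_number):
    """Physical aperture (entrance pupil) diameter in millimetres."""
    return focal_mm / f_number

# 90mm f/2 on the M8 (1.33x crop) -> roughly a 120mm f/2.7 equivalent
print(equivalent_lens(90, 2.0, 1.33))

# ~72mm f/4.8 at the long end of the R1 zoom (1.6x crop)
print(equivalent_lens(72, 4.8, 1.6))

# The physical glass doing the blurring: 45mm of pupil vs. 15mm
print(pupil_mm(90, 2.0), pupil_mm(72, 4.8))
```

The two lenses frame almost identically (~120mm equivalent), but the 'cron has three times the physical aperture diameter, which is the "plain old f/stop" difference doing the work.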

 

In the film days, was there this much - agonizing - over the "crop factors" and "lens equivalents" between, say, a Leica, a Rollei, a Hasselblad and a 4x5? Not that I remember.

 

[edit] on going back to see the newest comments, it seems like the conversation has departed from the "equivalent focal length" concept, and gotten into diffraction-limited-sharpness vs. DOF sharpness. Which is OK, and certainly a good reminder that stopping down a lot (with any lens, on any image surface) will not give one maximum resolution. Which is why astronomers are constantly trying to build wider and wider telescopes - other things being equal, a WIDER aperture equals better resolution.
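As a side note, the telescope point is the Rayleigh criterion: the smallest resolvable angle scales as wavelength over aperture diameter. A quick sketch with my own illustrative numbers, assuming green light at 550 nm:

```python
import math

# Rayleigh criterion: smallest resolvable angle ~ 1.22 * wavelength / aperture.
# Wavelength and aperture in metres; result converted to arcseconds.

def rayleigh_arcsec(wavelength_m, aperture_m):
    theta = 1.22 * wavelength_m / aperture_m   # radians
    return math.degrees(theta) * 3600          # arcseconds

# 10cm amateur scope, Hubble-sized mirror, Keck-sized mirror
for d in (0.1, 2.4, 10.0):
    print(f"{d:5.1f} m aperture -> {rayleigh_arcsec(550e-9, d):.3f} arcsec")
```

The wider the aperture, the smaller the resolvable angle, which is exactly why bigger telescopes resolve finer detail.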

 

But again, as a photojournalist, my images are more "light-limited" than "diffraction-limited". I focus on my main subject, and the aperture (and resolution or DOF) will be determined by things such as the field of view I want, the light available, the ISO noise I will tolerate, and the shutter speed I need to stop or blur action in a story-telling way.


Andy--

Maybe I just need a strong dose of perspective.

 

If I understand the arguments, one of these guys is saying that some cameras can't produce a better picture beyond, say, f/8. By his formula, stopping any M8 lens down beyond f/7.7 will reduce performance not because of lens-related diffraction, but because of the optical influence of pixel size. He says one of his digicams reaches its optimum performance at f/2, but wide open the lens is only f/2.8. His point is that when you stop down beyond the optimum, you are doing the same thing as removing pixels from the sensor. I think he says a 1Ds Mk II stopped down to f/22 has effectively reduced its pixel count to 6MP.
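Out of curiosity, here is one crude way to sanity-check that kind of claim. Everything below is my own back-of-the-envelope model, not the author's: I assume green light (550 nm), an Airy-disk diameter of 2.44 x wavelength x f-number at the sensor, and roughly 7.2 µm pixels for the 1Ds Mk II.

```python
# Toy model: once the Airy disk outgrows a pixel, scale the useful pixel
# count down by the (pixel/Airy)^2 area ratio. All figures approximate.

def airy_diameter_um(f_number, wavelength_nm=550):
    """Airy-disk diameter at the sensor, in micrometres."""
    return 2.44 * wavelength_nm * f_number / 1000.0

def effective_mp(total_mp, pixel_pitch_um, f_number):
    """Pixels 'left over' once diffraction blurs detail across them."""
    d = airy_diameter_um(f_number)
    if d <= pixel_pitch_um:
        return total_mp
    return total_mp * (pixel_pitch_um / d) ** 2

# 1Ds Mk II: ~16.7 MP at ~7.2 um pitch (approximate figures)
for n in (5.6, 8, 16, 22):
    print(f"f/{n}: ~{effective_mp(16.7, 7.2, n):.1f} MP")
```

A hard cutoff like this is far harsher than the ~6MP figure quoted above, which presumably comes from a proper MTF analysis rather than a step function, but the trend it shows - pixels effectively vanishing as you stop down - is the same.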

 

The considerations go far beyond merely figuring the crop factor and then using it to calculate an equivalent f-stop for depth of field; that was last year's news. :)

 

And the other author responds, "You don't know beans about it." Both cite sources, and the sources are a good adjunct to what I thought I knew about optics.

 

But what most interests me is that they both feel that an anti-aliasing filter is necessary in a 35mm-form-factor camera, despite the fact that Canon expressly weakened the AA filter in the 5D, despite the fact that no one turns on the DMR's AA filter, and despite the fact that the M8 lacks an AA filter completely.

 

One of them uses math and logic that sounds convincing to math-challenged me. The other approaches the matter practically by saying, "I was going to refute your argument point by point, but then I found these pictures that prove I'm right."

 

You're right about our not doing anything like this in film days. I think either the whole debate is an interesting digression, or else these guys understand something that makes digital *absolutely* different from analog imagery.

 

And as I said, there's Leica sitting on the side, not taking part in the discussion and making better images without the AA filter these guys think is necessary, and coming out smelling like a rose (albeit a magenta one). :p

 

Or maybe I'm just all excited over nothing. :cool: Once again. :(

 

 

[edit] on going back to see the newest comments, it seems like the conversation has departed from the "equivalent focal length" concept, and gotten into diffraction-limited-sharpness vs. DOF sharpness. Which is OK, and certainly a good reminder that stopping down a lot (with any lens, on any image surface) will not give one maximum resolution. Which is why astronomers are constantly trying to build wider and wider telescopes - other things being equal, a WIDER aperture equals better resolution.

 

But again, as a photojournalist, my images are more "light-limited" than "diffraction-limited". I focus on my main subject, and the aperture (and resolution or DOF) will be determined by things such as the field of view I want, the light available, the ISO noise I will tolerate, and the shutter speed I need to stop or blur action in a story-telling way.

Good points all.

 

My first thought was that astronomy sought wider apertures strictly for light-grasp, but then I remembered the VLA telescopes which clearly have a different purpose.

 

And again, one could say that the practical approach is all that's needed, because the optical designers must have made the best compromise they could.

 

Both good points and I don't argue with either.

 

But IF the one gentleman is correct that the physical size of a pixel can be optimized, couldn't that mean a big change in the industry? He says his digicam has, what, 8MP, but cannot use them because his lens isn't fast enough: The extra pixels are simply marketing.

 

Extending the argument, there would be a very strong reason to stop the "My MP is bigger than yours" wars. We all agree that there's a limit to our need for more pixels, but if there is a physical proof that putting more pixels behind a given lens isn't just unnecessary, but actually gives NO advantage, we could start concentrating on lower noise, better ergonomics and lens improvements.

 

And where are the cameras without AA filters in all this? I remember that when people said to Leica that the DMR at 10MP would be too little, too late, Leica simply replied that 10MP was the best resolution for their lenses at that time. Had they already calculated all this stuff into the sensor spec? Or is all this discussion meaningless?

 

--HC


I haven't put numbers through formulas, but it seems to me that if the M8 with its 10MP gets its sharpest images up to f/8, with softening beyond that, then had Leica chosen 12MP or 16MP at the same sensor size, we might all be stopping down only to f/5.6 or f/4! That would be quite a dramatic loss of artistic flexibility. I can see why Leica are pursuing FF sensors. With full frame and the same pixel density, we get about 17MP, or about the same as the 1Ds Mk II. I wonder if Canon stopped there for the same reason?
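Here is a rough check of that guess, with assumptions of my own: an M8 sensor of about 27 x 18 mm, square pixels, green light, and the strict criterion (mentioned later in the thread) that diffraction starts to matter once the Airy disk outgrows a single pixel.

```python
import math

# Pixel pitch vs. diffraction-limited f-stop for a fixed ~27 x 18 mm sensor.
# Assumptions: square pixels, 550 nm light, Airy diameter = 2.44 * wl * N,
# and "limit" = f-stop where the Airy disk equals one pixel pitch.

SENSOR_MM2 = 27.0 * 18.0

def pixel_pitch_um(megapixels):
    """Square-pixel pitch in micrometres for the assumed sensor area."""
    return math.sqrt(SENSOR_MM2 * 1e6 / (megapixels * 1e6))

def limiting_f_stop(megapixels, wavelength_nm=550):
    """Solve 2.44 * wavelength * N = pitch for N."""
    return pixel_pitch_um(megapixels) * 1000 / (2.44 * wavelength_nm)

for mp in (10, 12, 16):
    print(f"{mp} MP -> pitch {pixel_pitch_um(mp):.1f} um, "
          f"limit ~f/{limiting_f_stop(mp):.1f}")
```

By this strict single-pixel criterion the limits land near f/5.2, f/4.7 and f/4.1; a more forgiving criterion of two pixels (or a CoC of twice the pixel pitch) roughly doubles those numbers, which is about where the f/8-to-f/11 figures for the current M8 come from.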


Both writers are getting a bit feisty, aren't they? The debate, and the Brian Wandell papers, have made my list of weekend reading candidates. They do both assume that the Canon family of cameras pretty much exhausts the application space, and that resolution involves matching the image the lens delivers to the available sensors at the Nyquist frequency. Those of us lucky enough to have no AA filter should not expect that our lenses' resolving power is giving us accurate detail below the pixel spacing, even with a little help from software. Texture that looks right, maybe, but detail will work only if you know it in advance and tell the software what to create in those areas.

 

Enough until I read this stuff carefully. I know and respect one of the authors, and the other one looks highly qualified as well. I imagine that they are solving slightly different problems.

 

scott


...For technicians only...

Sorry Howard, just a postcard shooter here. ;)

 

...For the FF sensor (or 35 mm film) this gives a CoC of about 30 micrometers...

30 micrometers = 0.03 mm.

0.03 mm is indeed the usual CoC value for film and FF sensors.

That leads to 0.03 mm ÷ 1.5 = 0.02 mm for APS-C sensors.

FWIW



Howard,

For technicians only, NOT! I think it is extremely important for the day-to-day practice of digital photography, and for our expectations of its future, to understand that diffraction now comes into play not only at the lens.

 

I tried to point people to the Cambridge article in my post "More MPX for M8 and DMR?" because the little applet on that site makes clear that in practice you are not going to get a Leica-quality image if you stop down past f/8. It still seems to me that the only hope of much more quality and size than we already have in the M8 and DMR is in tri-color sensitivity, not in simple multiplication of pixels. Going full frame just is not going to make much real difference; going tri-color really will.

 

I am glad LuLa is bringing the problem to the attention of a wider audience. It might increase demand for a better Foveon-style sensor instead of simply more megapixels.

 

Joe


Nathan Myrvhold knows the physics he's talking about; if you don't recognize the name, he used to be CTO at Microsoft. He got his PhD in physics from Princeton and post-doc'd with Hawking at Cambridge. The discussion is pretty interesting esp. when you consider what the marketers at various camera companies would like you to believe. I was struck by the AA filter discussion also & will be interested to see how/if it continues.

Thanks, Bob, for info on Myrvhold. As Scott said, these folks seem to know what they're talking about but are probably solving two different problems.

 

And yes, Scott, the discussion does seem to be heating up a bit--something they couldn't do in scholarly journals! :)

 

Joe, I'm sorry that I missed your earlier post on the topic. And I agree it's not just for techies. (I made that remark primarily to entice some of the people who have since commented, just in case they hadn't seen the article.)

 

I'm fascinated that two obviously knowledgeable people are having such a chat over *optics,* a subject we all thought was pretty much already baked and bagged.

 

As Scott said, they take the viewpoint that Canon has exhausted the knowledge in the field, but that's something DMR and M8 users might argue with.

 

There's clearly more to learn here, and it may force new directions in digital imagery.

 

--HC


Great article, but I've known for a long time (including the film days) that diffraction limiting put the practical "sharpest" aperture of a variety of fine lenses at not much more than f/8.

 

In other words, this is why a lot of serious landscape photographers used larger format than 35mm film :)

 

And yes, stopping a 1Ds Mk II down to f/16 definitely creates less resolution in the file, and it may be equivalent to 2MP worth of resolution; but that's way more than any *other* dSLR at f/16; imagine what you have left with a 20D :)

 

I always thought the practical limit for the M8 / DMR is around f/8 at most; even Puts's venerable reviews of Leica M and R lenses note that sharpness on the *sharpest* Leica lenses falls off past f/5.6 to f/8 due to diffraction... even with film.


I'm a bit annoyed by the page "Diffraction Limited Photography: Pixel Size, Aperture and Airy Disks" which Myhrvold links to. (My apologies; I inadvertently misspelled his name earlier.)

 

For example, the page author says that light "begins to disperse or 'diffract' when squeezed through a small hole." His definition of Circle of Confusion is similarly sloppy. But he does introduce some information new to me, like the fact that some Nikon cameras' pixels are slightly rectangular.

 

The page is well laid out and the interactive elements are fun and illustrative of the author's points, but the "example" of pictures shot with the 20D is meaningless because it doesn't/can't distinguish between lens diffraction and loss of detail caused by the light's spill onto adjacent pixels. Myhrvold's recommendation that we try the experiment ourselves is to be taken seriously.

 

Also, if I understand what the Cambridge-In-Color author is saying, isn't this statement topsy-turvy:

"Since the physical size of the lens aperture is larger for telephoto lenses (f/22 is a larger aperture at 20 mm than at 100 mm)...." :confused:

I'm guessing that '20 mm' and '100 mm' refer to focal lengths? Then the error would just be a typo.

 

Please understand: These criticisms are minor, and maybe I'm being picky. Despite being a bit sloppy, the page is definitely useful, attractive and informative. The interactive panels make it very easy to visualize the points made.

 

But as Jamie said, Leica lenses are generally very well corrected wide open; some are nearly diffraction limited. That being the case, calculating that an M8's performance will be diffraction-limited at f/22 due to pixel size is of somewhat questionable usefulness, since it will already be limited by lens diffraction by f/5.6 or f/8. Although the calculator shows that increasing the pixel count (and reducing pixel size) would bring diffraction-limited performance to a wider aperture, I think Leica lenses still meet reasonable criteria: If the M8 were equipped with a 20MP sensor but still with the same Crop Factor, using a CoC of twice the pixel size, the diffraction limitation due to pixel size would drop from f/8 to f/5.6--still on the acceptable edge of best lens performance.
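That last step is just scaling: for a fixed sensor size, pixel pitch (and hence the diffraction-limited f-stop under the CoC-of-two-pixels criterion) shrinks as the square root of the pixel count. A one-line check:

```python
import math

# For a fixed sensor size, pixel pitch scales as 1/sqrt(pixel count), and so
# does the diffraction-limited f-stop under a CoC-of-two-pixels criterion.

def scaled_limit(f_stop, old_mp, new_mp):
    """Diffraction-limited f-stop after changing the pixel count."""
    return f_stop * math.sqrt(old_mp / new_mp)

# 10 MP limited at f/8 -> limit for a 20 MP sensor of the same size
print(f"f/{scaled_limit(8, 10, 20):.1f}")
```

8 / sqrt(2) ≈ 5.7, i.e. the standard f/5.6 stop.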

 

--HC


  • 2 weeks later...
Has anyone seen this Excel spreadsheet (viewable in OpenOffice or NeoOffice), "Available Diffraction-Free Stops vs. Print Size for Several Digital Cameras," which compares the diffraction for different cameras? The max f-stop for the Leica M8 given there is f/11. Any ideas how this fits with the rest of the discussion?

 

Unfortunately I can't see that from this cybercafé in the mountains (with a French keyboard, too), but you'll notice from the title that the Excel estimator is looking at diffraction vs. the resolution desired in the print, while Nathan's point was that diffraction should be considered to matter as soon as the diffraction broadening makes the smallest possible detail in the image broader than a pixel.

 

scott


Interesting work, but notice that it's based on viewing (for the Leica) a 10"x15" print at 10" viewing distance:

 

"Our desired resolution at the print is 5 lp/mm (a data density of 254 ppi). Having calculated the enlargement factor suffered when making a 254 ppi reciprocal of 5 lp/mm (1/5th mm, or 0.2 mm). One can not stop down below this f-stop (to smaller apertures) without forcing Airy disk diameters to grow, and thus, visibly soften the entire print, if viewed at a distance of 10 inches. Note that if we viewed the print at twice that distance, it would look sharp until we stopped down two additional stops. Another way to get two additional stops of depth of field without concern for visible diffraction would be to make the print only half as large as the dimensions shown, while still viewing it at a distance of 10 inches."


This is a very interesting thread. Three observations:

 

1. I run a series of on-tripod test shots at all apertures of each new lens out my dining room window, where there is a NYC cityscape that includes millions of bricks, thousands of windows, long straight lines, and opportunities for flare in the AM and high contrast in the PM. All of the current generation of Leica lenses that I have are outstanding at f/8. At f/11 any diffraction issue is very minor and has, in my view, no photographic significance. At f/16 loss of sharpness is noticeable, and at f/22 very much so.

 

2. This may be the technical underpinning of "f/11 and be there" (sometimes attributed to C-B, but I'm not certain of the original source - it is sometimes quoted as "f/8 and be there"). f/11 gives maximum scope for zone or hyperfocal focusing before suffering noticeable losses to diffraction (or, for that matter, any losses at all if you're shooting Tri-X).

 

3. While an exposure at f/22 on an M8 may contain the same amount of "information" as an exposure at f/8 on a much lower resolution sensor, the outcome from a photographic standpoint is dramatically different.

 

When you hit the diffraction limit, light-dark transitions become softer and the finest detail is lost. In many circumstances this simply doesn't matter. Does anyone know how "sharp" the original of Joe Rosenthal's image on Mount Suribachi is?

 

On the other hand, when you run out of sensor resolution the consequences are jaggies, rice crispies and all kinds of digital artifacts - which generally are visually offensive (at least to me) in all cases.

 

So the exchange on LuLa that started this thread more or less confirms what we already knew.


I noted in a recent shoot that at f/16 my 24mm lens showed less definition on the chimp-screen. In fact, the loss of definition was dramatic and clearly visible even on the little screen. So I arranged to shoot everything at f/8.

 

I was planning to test all my lenses at the next shoot to see whether they all give up at f/11 or f/16. It's probably true that I just didn't notice the softening at f/11 (at least on the 2-inch screen).


So the exchange on LuLa that started this thread more or less confirms what we already knew.

 

Hey--didn't know this was even an issue so I'm glad the thread is here! Not a clue.

 

So, these would constitute an M8 "safe zone" to stay within for technically crisp images?

 

* f/8 is preferred, but f/11 is possible (varies by lens), to keep maximum sharpness given the sensor's resolution.

 

* Hand-held shutter speed no slower than 1/200 (1/250 preferable). (Read on another thread that the risk of sharpness loss from hand shake rises as one starts to dip below this into the 1/125s, 1/60s and 1/30s - although many people shoot there without issue.)

 

* ISO degrades image quality noticeably when set to more than 640 (800 equivalent) when you pixel-peep.

 

No complaints here - being *able* to select ISO 1250 or 2500 and crank the speed down to 1/30 while opening up to f/8 lets me get shots I couldn't have gotten otherwise. It's just not "ideal" if I want to stay in a safe zone for the crispest *possible* images from a technical perspective.

 

Thanks everyone for contributing--very helpful!


Archived

This topic is now archived and is closed to further replies.
