
58 minutes ago, Photoworks said:

Many programs already have it as an option to correct. Plus AI is getting stupid good at refocusing.

I would say don't shoot at f/22 if you don't need to. I have noticed it more strongly in macro photography.

Interesting. What is this app? 

Edit, oh I see it's C1, never noticed this setting before. 

48 minutes ago, adan said:

Don't confuse CAUSE and EFFECT. Diffraction is CAUSED solely [...]

Uh oh ... adan, I'd never expected to see such an eerie article from you, of all forumners. It's full of errors and misconceptions from start to end.

.

53 minutes ago, adan said:

If a sensel is physically larger than the diffraction blur circle projected on it, it will not record the blurriness. It will register the point light as one pixel's brightness—as close to a point as the camera can get. Nor will it smoothly record increasing blurriness, until the blur is big enough to affect more than one pixel.

Yeah—the 'one Airy disk on one pixel' fallacy.

.

56 minutes ago, adan said:

Really the same thing as "camera shake blur" with high-megapixel sensors.

The same thing indeed—but not in the way you think of it.

.

57 minutes ago, adan said:

The higher-resolution recording/sampling does not CAUSE the camera to shake more, or the resulting blur to be larger—but it can RECORD and REVEAL any motion-blur more, for any given amount of shake.

For any given amount of shake, the resulting blur streaks are always the same length, no matter how many pixels will record it. If there are more pixels, then they will be smaller, rendering the same streak more accurately. Hence, the higher-resolving sensor will always yield the more detailed image, even in the presence of diffraction blur or motion blur.
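This claim can be sketched numerically. A minimal Python example (the streak length, sensor width, and pixel counts are hypothetical round numbers for illustration):

```python
# A minimal sketch (all numbers hypothetical): the physical length of a
# motion-blur streak on the sensor is fixed by the shake itself; the pixel
# count only changes how finely that same streak is sampled.

SENSOR_WIDTH_MM = 36.0    # 35-mm-format sensor width
STREAK_LENGTH_MM = 0.05   # blur streak produced by a given shake (assumed)

def streak_in_pixels(horizontal_pixels: int) -> float:
    """Length of the same physical streak, measured in pixels."""
    pixel_pitch_mm = SENSOR_WIDTH_MM / horizontal_pixels
    return STREAK_LENGTH_MM / pixel_pitch_mm

for label, width_px in [("24 MP", 6000), ("60 MP", 9504)]:
    print(f"{label}: streak = {STREAK_LENGTH_MM} mm = "
          f"{streak_in_pixels(width_px):.1f} px")
```

The streak stays 0.05 mm on both sensors; the higher pixel count just spans it with more, smaller pixels.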

.

1 hour ago, adan said:

It will depend on focal length. Shorter focal lengths will produce diffraction effects sooner than longer focal lengths. As I mentioned above, diffraction is related to the absolute diameter of the aperture ...

This is utter nonsense, and you know it.

Shorter focal lengths will produce diffraction blur to a lesser extent than longer focal lengths. The same angular deflection will cause a spatial deflection that increases with distance. So the same angular deflection causes more diffraction effects at longer focal lengths. That's why the absolute diffraction blur depends on the relative aperture (the f-number), not the absolute diameter thereof (in millimeters).

And then, the same absolute diffraction blur causes the greater damage to the image sharpness the smaller the image format is. So larger image formats can take more diffraction blur.
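Both points can be sketched in a few lines of Python (the wavelength and format widths are assumed values; Airy disk diameter ≈ 2.44 λ N is the standard approximation):

```python
# Sketch (assumed values): the absolute diffraction blur depends only on the
# f-number N (Airy disk diameter ~ 2.44 * wavelength * N), while the relative
# damage to the image depends on the format size.

WAVELENGTH_MM = 550e-6  # green light, 550 nm, expressed in mm

def airy_diameter_mm(f_number: float) -> float:
    """Diameter of the Airy disk (first null to first null)."""
    return 2.44 * WAVELENGTH_MM * f_number

blur_mm = airy_diameter_mm(11)  # f/11, regardless of focal length
for fmt, width_mm in [("35-mm format", 36.0), ('8x10" format', 254.0)]:
    print(f"{fmt}: Airy disk {blur_mm * 1000:.1f} um = "
          f"1/{width_mm / blur_mm:.0f} of the image width")
```

The same f/11 Airy disk is a much smaller fraction of the 8×10" frame, which is the sense in which larger formats "can take more diffraction blur."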

.

1 hour ago, adan said:

Whereas the following settings will produce essentially identical diffraction effects:

21 mm set to an aperture of 10.5 mm (f/2)
50 mm set to an aperture of 10.5 mm (f/4.8)
90 mm set to an aperture of 10.5 mm (f/8.6)
280 mm set to an aperture of 10.5 mm (f/27)

Congratulations! Here you are confusing cause and effect. Remember your own post's first sentence?

The diffraction at a 10.5 mm aperture will always be the same—that much is true. But the resulting effect is not the same. Instead, it will increase with focal length. That's why the diffraction effect on the sensor depends on relative, not absolute aperture.

Maybe you got confused by the "f-stop = focal length divided by four" rule mentioned in the article referred to by SrMi. If so, then you obviously missed the fact that this rule refers to standard lenses. In this rule, a 300 mm lens is not a telephoto lens for 35-mm format but a standard lens for 8×10" format. So these focal lengths indirectly refer to different image formats.
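A short Python check of this rebuttal, using the same fixed 10.5 mm aperture and an assumed 550 nm wavelength, shows the Airy disk on the sensor growing with focal length:

```python
# With the aperture diameter fixed at 10.5 mm, the f-number N = f / 10.5 grows
# with focal length, and with it the Airy disk projected onto the sensor.
# The wavelength (550 nm) is an assumed value.

WAVELENGTH_MM = 550e-6

for focal_mm in (21, 50, 90, 280):
    n = focal_mm / 10.5
    airy_um = 2.44 * WAVELENGTH_MM * n * 1000  # Airy disk diameter in microns
    print(f"{focal_mm:>3} mm at f/{n:.1f}: Airy disk ~ {airy_um:.1f} um")
```

The diffraction at the 10.5 mm aperture is the same in each case, but the blur that lands on the sensor is roughly thirteen times larger at 280 mm than at 21 mm.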

.

1 hour ago, adan said:

... at which aperture will diffraction become notable on the SL3's 60-megapixel sensor?

And the answer is: With the same lens, the diffraction will become notable at the very same aperture as it will become notable on any other 35-mm-format sensor. The diffraction limit solely depends on the lens quality and not at all on the sensor resolution (for a given sensor size). Better lenses have their diffraction limit at wider apertures.

Your article will confuse and mislead many readers of this fine forum. Please delete it!


9 hours ago, 01af said:

Don't listen to that nonsense. Those two ... umm, 'experts' just promote the usual misconceptions about diffraction and pixel size.

.

 

David Farkas also discussed in a past video how a lens out-resolved a sensor (or vice versa). 🤪  I take their technical comments with a grain of salt.  But some of the content is entertaining nonetheless.

Jeff


7 minutes ago, Jeff S said:

David Farkas also discussed in a past video how a lens out-resolved a sensor (or vice versa).

Yeah, sure he did. But then, lenses don't out-resolve sensors, and sensors don't out-resolve lenses. David Farkas, considering his position in the Leica world, should know better. After all, he did use a digital camera or two before, didn't he?


18 minutes ago, 01af said:

Yeah, sure he did. But then, lenses don't out-resolve sensors, and sensors don't out-resolve lenses. David Farkas, considering his position in the Leica world, should know better. After all, he did use a digital camera or two before, didn't he?

I know… emailed him the article from Roger Cicala explaining the myth, but never received a response.

Jeff


2 hours ago, 01af said:

Uh oh ... adan, I'd never expected to see such an eerie article from you, of all forumners. It's full of errors and misconceptions from start to end......etc.

For any given amount of shake, the resulting blur streaks are always the same length, no matter how many pixels will record it. If there are more pixels, then they will be smaller, rendering the same streak more accurately. Hence, the higher-resolving sensor will always yield the more detailed image, even in the presence of diffraction blur or motion blur.

Just to clarify - you are claiming that a diffraction artifact, or a motion blur streak, will appear the same regardless of the number (and therefore size) of pixels per unit image area?

That a sensor with a single huge 24mm x 36mm pixel will record it just as visibly as a sensor with 6 or 18 or 24 or 60 or 1000 megapixels?



11 minutes ago, adan said:

Just to clarify—you are claiming that a diffraction artifact, or a motion blur streak, will appear the same regardless of the number (and therefore size) of pixels per unit image area?

No. Didn't you read what I wrote?

.

11 minutes ago, adan said:

That a sensor with a single huge 24 × 36 mm pixel will record it just as visibly as a sensor with 6 or 18 or 24 or 60 or 1000 megapixels?

Don't play dumb.


49 minutes ago, Jeff S said:

I know… emailed him the article from Roger Cicala explaining the myth, but never received a response.

Jeff

Sounds like you may have missed the now famous Peter Karbe videos explaining this topic in great detail, in English. 

 


9 minutes ago, LBJ2 said:

Sounds like you may have missed the now famous Peter Karbe videos explaining this topic in great detail, in English. 

I do not remember Peter Karbe explaining anything, but he just threw out a number  (100MP?) to be misinterpreted at will 😜.

An educated and lively discussion occurred on DPR in the Photographic Science and Technology forum:

Are Perceptual Megapixels stupid?

 


15 minutes ago, SrMi said:

I do not remember Peter Karbe explaining anything, but he just threw out a number  (100MP?) to be misinterpreted at will 😜.

An educated and lively discussion occurred on DPR in the Photographic Science and Technology forum:

Are Perceptual Megapixels stupid?

 

You wrote "I do not remember Peter Karbe explaining anything, but he just threw out a number (100MP?) to be misinterpreted at will 😜."

Okay then. Never mind. 


9 minutes ago, LBJ2 said:

You wrote "I do not remember Peter Karbe explaining anything, but he just threw out a number (100MP?) to be misinterpreted at will 😜."

Okay then. Never mind. 

I was referring to explaining what that 100MP limit means. Do you remember differently?


Just now, SrMi said:

I was referring to explaining what that 100MP limit means. Do you remember differently?

His videos are really worth a proper and patient listen or two. Peter Karbe shares a significant amount of information in a fairly easy-to-understand format for those interested in these sorts of things. I've had the opportunity to speak with a few current/active optical engineers on this topic previously, but I must say Peter Karbe really takes it to another level for those of us on the other side of the camera to better comprehend, IMO.


1 minute ago, LBJ2 said:

His videos are really worth a proper and patient listen or two. Peter Karbe shares a significant amount of information in a fairly easy-to-understand format for those interested in these sorts of things. I've had the opportunity to speak with a few current/active optical engineers on this topic previously, but I must say Peter Karbe really takes it to another level for those of us on the other side of the camera to better comprehend, IMO.

You avoided answering my question.

Even though I am averse to watching YouTube videos, I try to watch every video with Peter Karbe, in German or in English 😄. I even referenced his comments in the DPR discussion that I linked to.


1 hour ago, LBJ2 said:

Sounds like you may have missed the now famous Peter Karbe videos explaining this topic in great detail, in English. 

 

I emailed him quite a while ago, and even asked him how he reconciled Cicala’s and Karbe’s comments at the time.

Jeff


57 minutes ago, 01af said:

Didn't you read what I wrote?

Yes I did.

Seems to be mostly unsupported and disorganized "o-pee-nyuns posing as facts," interspersed with thinly-veiled (and equally unsupported) ad hominem attacks.

"Yeah—the 'one Airy disk on one pixel' fallacy - This is utter nonsense, and you know it - Here you are confusing cause and effect - Don't play dumb."

Save that stuff for politics on Twitter/X. If the only arguments you can make involve comments about the mental condition or capability of others in the discussion, that is presumptive evidence that the impersonal technical argument behind them is inadequate.

You did say the following (emphasis added):

4 hours ago, 01af said:

For any given amount of shake, the resulting blur streaks are always the same length, no matter how many pixels will record it.

When you said "no matter how many pixels," you placed no limit on "many" - therefore your categorical statement as written applies for any number of pixels, from 1 to 1,000,000,000 and beyond.

It is an incorrect statement if it fails in the case of a single, large pixel. Basic science. It falls into the category of "a beautiful theory slain by one ugly little fact."

It also fails in the case of a small number of large pixels - if the artifact is not large enough to affect more than one of them.

When I asked if that was actually what you meant, your only response was yet another personal remark - while avoiding actually answering the question, or revising your claim so that it would hold up against any experimental result. Perhaps, for example: "No matter how many pixels (but at least two) will record it."


22 minutes ago, Jeff S said:

I emailed him quite a while ago, and even asked him how he reconciled Cicala’s and Karbe’s comments at the time.

Jeff

IMO Peter Karbe pretty much lays out the relevant high-level facts, with illustrations and examples that we can apply going forward, for those interested.

FWIW, Roger has answered why he wrote what he did, and added a bit more detail about his intent and whom he was addressing. See the DPR thread referenced above.

https://www.dpreview.com/forums/post/66954443

RCicala (Contributing Member) • Re: Are Perceptual Megapixels stupid? • In reply to SrMi, Mar 21, 2023

SrMi wrote:

In the often-referenced appendix Why Perceptual Megapixels are Stupid, Roger Cicala explains why claiming that a lens can resolve a certain number of megapixels does not make sense.

On the other hand, Leica’s Peter Karbe said in a presentation that Leica’s SL-APO lenses are prepared for more than 100MP sensors.

ProfHankD also disagrees with Roger. I wondered about ProfHankD’s statement that the 45MP FF sensor will out-resolve most lenses wide open, and his answer was (DPR post):

I don't disagree with Roger very often, but his simple MTF math is a little too simple. First off, "MTF maxes at 1.0" makes no sense in terms of resolution -- it maxes at 1.0 for contrast. We normally quote resolution at a fixed contrast (e.g., MTF30 is 30%) or contrast at a fixed resolution; multiplying contrasts at a fixed resolution doesn't tell you at what resolution your target contrast threshold will be reached. It's simply not that linear. Beyond that, the Perceptual MP numbers are supposed to be system MTF numbers approximating human perception (whatever contrast ratio that means; DxOMark created the PMP metric, but doesn't document the exact computation) -- from DxOMark, the same lens often gets a different PMP rating on a different body. So, yes, quoting a single PMP number for a lens independent of body used would be wrong.

Is Roger wrong? Can sensors out-resolve lenses?

"Well, first off, Roger's wrong all the damn time. Sometimes, like this time, he even knows it as he writes it, so to speak.

As Jack pointed out, I was aiming for as simple as possible, and gave up accuracy to try to get understanding among the group who were losing their minds over DxOMark's metric, and the "you have to buy a new lens for your new camera" marketing.

As an oversimplified generalization my math pretty well holds for decent lenses on reasonable cameras, but there would be lots of exceptions, especially towards the extremes of high resolution sensors and inadequate lenses. I didn't mean it with anything like scientific accuracy. But the argument that I was fighting was "if you put your 20 perceptual megapixel lens on a 20 megapixel sensor, you get 20 megapixels. If you put it on a 40 megapixel camera you still get 20 megapixels".

I will take full responsibility for using oversimplified and somewhat inaccurate math, but I want full credit for being more accurate than DxOs pseudoscience. At least I showed my "formula" 

All that being said, though, perceptual megapixels are stupid. Even the name is stupid. 

Roger"

 

https://www.dpreview.com/forums/post/66957672

RCicala (Contributing Member) • Re: Are Perceptual Megapixels stupid? • In reply to chrisfisheye, Mar 22, 2023

"I think all the above have merit and are correct depending on specific circumstances.

What does NOT have merit was the way the general community was interpreting perceptual megapixels, which was basically the lowest of camera resolution or lens resolution was what the overall resolution would be. Basically, they were saying "if a lens perceptual megapixel is 20, then there's no reason to have a camera greater than 20 megapixels; it won't make a difference". This got encouraged by the not-so-subtle manufacturer's "lens rated for 40 megapixels" marketing.

I was simply (oversimply for this subforum, but consider who I was writing to) trying to walk people back from that ledge and point out that they would indeed see a significant difference with either a better lens OR a better camera.

My off the cuff formula definitely breaks down at very high resolution or with very bad lenses, but given a starting point of adequate lens on adequate camera, I think it's pretty accurate as a general description of consumer range equipment."


This was interesting until it went into the sandpit.  

@01af’s statement that “For any given amount of shake, the resulting streaks are always the same length, no matter how many pixels record it” is of course true, as is Andy’s statement that more pixels on the sensor will record that blur more accurately.  The blur is the same, but the ability of the sensor to record the blur increases with resolution.  This is surely not hard to grasp.  With higher-resolution sensors, the increase in blur is really only there if you go looking for it, in the same way that a large printed image may look blurry where a smaller print looks sharp.

Back on topic, the same really applies to diffraction - at the sensor, the amount of diffraction is the same for a given focal length and aperture.  The difference is that with more pixels, the diffraction is recorded more accurately.  Sensel size, sensor size and MP count don’t increase diffraction; they simply record it more accurately.
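That sampling argument can be made concrete with a rough Python sketch (the pixel pitches and the f/11, 550 nm figures are assumed round numbers):

```python
# The Airy disk at f/11 is the same physical size on any 35-mm-format sensor;
# a finer pixel pitch simply samples it with more pixels.

AIRY_UM = 2.44 * 0.55 * 11  # ~14.8 um Airy disk diameter at f/11, 550 nm

for label, pitch_um in [("24 MP (~5.9 um pitch)", 5.9),
                        ("60 MP (~3.8 um pitch)", 3.8)]:
    print(f"{label}: Airy disk spans ~{AIRY_UM / pitch_um:.1f} pixels")
```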

I’m somewhat baffled by Andy’s conclusion that diffraction is greater with shorter focal lengths, and less of an issue with longer focal lengths.  While lenses are telecentric from 50mm on (for the M system), I had always intuited that diffraction was more of an issue with telephoto lenses.  I was taught to photograph with the smallest apertures for a given lens to use the best part of the glass.  Since using Leicas, that approach has flipped on its head, somewhat.

I’m rambling - my question is more about the effect of diffraction on focal lengths.

Edit - I found the answer in Wikipedia (assuming it is correct):

The ability of an imaging system to resolve detail is ultimately limited by diffraction. This is because a plane wave incident on a circular lens or mirror is diffracted as described above. The light is not focused to a point but forms an Airy disk having a central spot in the focal plane whose radius (as measured to the first null) is

Δx = 1.22 λ N,

where λ is the wavelength of the light and N is the f-number (focal length f divided by aperture diameter D) of the imaging optics; this is strictly accurate for N ≫ 1 (the paraxial case).

So, if we take wavelength as constant (for this purpose), as focal length increases the Airy disk central spot (detail) also increases.  I guess that is a statement of the bleeding obvious, but dividing the focal length by the aperture diameter (NOT the f-stop) introduces the key point.

I don’t have the bandwidth to re-work Andy’s figures (and I suspect there might be more to it than the figures show), but if we accept Andy’s aperture diameter of 10.5mm, then N in the above formula is 2.0 for a 21mm lens, 4.76 for 50mm, and 12.86 for 135mm.  Taking a constant wavelength λ and a fixed aperture diameter D (note: D, not the f-stop), the factor applied to 1.22λ increases with focal length, and therefore the Airy central spot increases in size.
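The figures above can be checked with a few lines of Python (the wavelength is assumed to be 550 nm):

```python
# N = focal length / 10.5 mm; Airy radius = 1.22 * lambda * N (the Wikipedia
# formula quoted above). The radius grows with focal length at a fixed
# aperture diameter.

WAVELENGTH_UM = 0.55  # assumed green light

for focal_mm in (21, 50, 135):
    n = focal_mm / 10.5
    radius_um = 1.22 * WAVELENGTH_UM * n
    print(f"{focal_mm} mm: N = {n:.2f}, Airy radius ~ {radius_um:.2f} um")
```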

That doesn’t quite support Andy’s apparent conclusion that diffraction is more of an issue with wide angle lenses at wider f-stops …


3 minutes ago, IkarusJohn said:

I had always intuited that diffraction was more of an issue with telephoto lenses.

Well, refraction and dispersion (spreading colors into different-sized image circles - longitudinal chromatic aberration) is definitely more troublesome with long lenses. Which is why Leica's first APOchromatic lenses were all teles (R system 180, 280, 400, etc.) As were Nikon's ED lenses.

 

17 minutes ago, IkarusJohn said:

my question is more about the effect of diffraction on focal lengths.

I do know, from "accidental" real-world personal experimentation, that an excellent 28mm Elmarit-M v.4 produced quite clear across-the-image diffraction blur at f/16, quite a bit more than the much-longer 75mm APO-Summicron-M ASPH at f/16.

Both at the close-focus limit - which would make the 75 even more "extended" from the image plane. Both in the studio, where the 75 (with strobes, at min. focus, at f/11-16) is my go-to lens for razor-sharp still lifes and such. Both on the same original M10 at 24 megapixels - where f/11 would be the point diffraction is supposed to become noticeable.

