Dr. G Posted April 24, 2024 Share #1

On a recent Red Dot Forum YouTube video they were talking about diffraction. I always thought that diffraction was a function of the lens, but they mentioned that diffraction is directly related to the sensor resolution/pixel size, and that with the SL3 diffraction begins at f/8. A lower-megapixel sensor would require stopping down further before the effects of diffraction start to show. f/8 seems like a very low f-stop at which to start seeing diffraction. I need to try to find the video again, but they don't index them and they're all over two hours long.
TeleElmar135mm Posted April 24, 2024 Share #2

7 minutes ago, Dr. G said: On a recent Red Dot Forum Youtube video they were talking about diffraction. .... I need to try and find the video again, but they don't index them and they're all over 2 hours long.

They are absolutely right. Google for diffraction and you'll find calculators for specific megapixel counts.
jaapv Posted April 24, 2024 Share #3

Maybe a bit elliptical, Michael 😉 Well, there are two types of diffraction: one is purely lens diffraction, caused by small apertures "bending" the light at the blade's edge, which is mainly of interest on film; the other is diffraction considered throughout the digital lens/sensor imaging system. Google "Nyquist frequency". https://www.edmundoptics.com/knowledge-center/application-notes/imaging/sensors-and-lenses/
Chaemono Posted April 24, 2024 Share #4

Reid Reviews recently tested how one of the non-APO Summicron lenses behaves on the SL3, including when diffraction sets in.
01af Posted April 24, 2024 Share #5

1 hour ago, Dr. G said: I always thought that diffraction was a function of a lens ...

Diffraction is a property of the light. From a photographer's point of view, it's a function of the relative aperture, i.e. the f-number (more diffraction with smaller apertures = higher f-numbers). Diffraction blur, in turn, depends on the film or sensor format (less with larger formats).

1 hour ago, Dr. G said: ... but [in the Red Dot Forum Youtube video] they mentioned that diffraction was directly related to the sensor resolution/pixel size ...

This is a common misconception and total bunkum. In fact, neither diffraction nor diffraction blur has anything to do with pixel size or sensor resolution.
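To make the f-number dependence concrete, here is a minimal sketch (my own illustration, not from any poster in this thread) of the standard Airy-disk formula d ≈ 2.44 · λ · N, assuming green light at 550 nm. Note that only the wavelength and the f-number appear in it; pixel size does not.

```python
# Diameter of the Airy disk (to the first minimum) for green light at
# various f-numbers: d = 2.44 * wavelength * N. Pixel size appears
# nowhere in the formula -- the blur is set by the optics alone.

WAVELENGTH_UM = 0.55  # green light, 550 nm, expressed in microns

def airy_disk_diameter_um(f_number: float,
                          wavelength_um: float = WAVELENGTH_UM) -> float:
    """Airy disk diameter in microns for a given f-number."""
    return 2.44 * wavelength_um * f_number

for n in (2.8, 5.6, 8, 11, 16, 22):
    print(f"f/{n}: Airy disk ≈ {airy_disk_diameter_um(n):.1f} µm")
```

Running this shows the disk growing from roughly 4 µm at f/2.8 to nearly 30 µm at f/22, which is the "more diffraction with higher f-numbers" relationship in numbers.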
Jeff S Posted April 24, 2024 Share #6

3 hours ago, Dr. G said: On a recent Red Dot Forum Youtube video they were talking about diffraction. .... I need to try and find the video again, but they don't index them and they're all over 2 hours long.

They are indexed after posting. Diffraction is discussed at the 10:00 mark.

Jeff
SrMi Posted April 24, 2024 Share #7

One of the more authoritative articles on diffraction: Diffraction Limited Pixels? Really? Enjoy.
01af Posted April 25, 2024 Share #8

10 hours ago, Jeff S said: Diffraction discussed at 10:00 min ...

Don't listen to that nonsense. Those two ... umm, 'experts' just promote the usual misconceptions about diffraction and pixel size.

10 hours ago, SrMi said: One of the more authoritative articles on diffraction: Diffraction Limited Pixels? Really? Enjoy.

Ugh ... lengthy, but correct, for a change. Thank you for that link! It is one of the very few places on the Internet where the relationship of detail, resolution, diffraction, and pixel size is explained properly. Another is here (use Google Translate if you don't read German).
LBJ2 Posted April 25, 2024 Share #9

15 hours ago, Dr. G said: On a recent Red Dot Forum Youtube video they were talking about diffraction. .... I need to try and find the video again, but they don't index them and they're all over 2 hours long.

I find these articles, illustrations and calculators a handy reference:

LENS DIFFRACTION & PHOTOGRAPHY https://www.cambridgeincolour.com/tutorials/diffraction-photography.htm

DIGITAL CAMERA DIFFRACTION, PART 2 https://www.cambridgeincolour.com/tutorials/diffraction-photography-2.htm
jaapv Posted April 25, 2024 Share #10

These articles (like all CiC articles) are not incorrect, but they are simplified to the extent that they place the wrong emphasis. In the end, diffraction blur is a product of the complete sensor-lens system; discussing the separate contributing principles is not a complete explanation.
Dr. G Posted April 25, 2024 Author Share #11

1 hour ago, jaapv said: These articles (like all CiC articles) are not incorrect but simplified to the extent that they have the wrong emphasis. ....

So where does the SL3 used with APO lenses start showing signs of diffraction? I have always defaulted to f/8 when balancing a larger depth of field with maximum sharpness, and f/11 for casual landscapes. I'm wondering if I need to use different apertures now.
jaapv Posted April 25, 2024 Share #12

Practically, in real life rather than in theory: in general it depends on the subject and circumstances. I would use f/8 as a basic rule, to be forgotten as circumstances change. Don't forget that in practice you will only see diffraction effects with a rock-solid tripod, critical subjects, and an excellent lens, unless you take things to extremes.
01af Posted April 25, 2024 Share #13

9 minutes ago, Dr. G said: So where does the Leica SL3 used with APO lenses start showing signs of diffraction?

Exactly where the Leica SL and SL2 start showing signs of diffraction, too ... I'd guess from f/2.8 or f/3.5 on, at the frame's center.

9 minutes ago, Dr. G said: I have always defaulted to f/8 when balancing a larger depth of field with maximum sharpness and f/11 for casual landscapes.

That's a good choice. You may want to use f/4 or f/5.6 for maximum sharpness at the plane of focus when there's no particular demand for large depth of field, and f/16 or f/22 for maximum depth of field. Always choose your aperture according to your requirements, and don't let fear of diffraction blur hold you back! The Apo-Summicron-SL lenses offer an aperture range from f/2 to f/22, so make full use of it!
BernardC Posted April 25, 2024 Share #14

The best resource on diffraction is Richard Feynman's QED: The Strange Theory of Light and Matter. Summary: diffraction is what happens when light bends around an edge, such as an aperture blade. It's always there, but the proportion of diffracted light goes up as the aperture gets smaller.

You'll find lots of online arguments about when diffraction starts to affect images. It's generally agreed that diffraction isn't noticeable at apertures wider than f/5.6 in traditional 35mm photography. Other photographic processes, like photolithography, suffer from diffraction at much wider apertures. If Red Dot says that diffraction "begins" at f/8, that's not necessarily false, but it's something everyone needs to determine for themselves. You might start to lose fine detail sooner, but that might not be relevant to your own work. By f/11 or f/16 you get lower overall contrast, which again may not be all that important to you. Diffraction is not a hard barrier; it's one aspect of imaging that gets progressively worse as you stop down. At the same time, other aberrations get progressively better, and depth of field increases.
01af Posted April 25, 2024 Share #15

14 minutes ago, BernardC said: The best resource on diffraction is Richard Feynman's QED: The Strange Theory of Light and Matter. Summary: Diffraction is what happens when light bends around an edge, such as an aperture blade. It's always there, but the proportion of diffracted light goes up as the aperture gets smaller. [...]

That's all fine, but it doesn't address the pixel-size myth. After all, it's a textbook about physics, not about digital photography ... which hardly surprises anyone, as the book was published in 1985, when digital cameras didn't yet exist.
Photoworks Posted April 25, 2024 Share #16 (edited)

Many programs already have an option to correct for it, and AI is getting stupidly good at refocusing. I would say don't shoot at f/22 if you don't need to. I have noticed it more strongly in macro photography.

Edited April 25, 2024 by Photoworks
adan Posted April 25, 2024 Share #17 (edited)

Don't confuse CAUSE and EFFECT.

Diffraction is CAUSED solely by the absolute diameter of the lens aperture (in mm, microns, hundredths of an inch, whatever; ignore f-stops, which are relative diameters). The EFFECT is a progressively blurrier image of a point light source as the aperture's absolute diameter gets smaller. That is all there is to say about diffraction itself.

(BTW, all photographs are a collection of "point brightnesses" - Georges Seurat had it right; he just chose jumbo-sized "points." 😁 ) https://upload.wikimedia.org/wikipedia/commons/6/67/A_Sunday_on_La_Grande_Jatte%2C_Georges_Seurat%2C_1884.png

However, as a practical matter, a blur circle may or may not be recorded by a given imaging surface. Homogeneous film will record blur circles of all sizes, down to the limit of the film's recording ability. A digital sensor is not homogeneous: it is an array of discrete and separate units of silicon (plus other materials) called pixels or sensels. If a pixel/sensel is physically larger than the diffraction blur circle projected onto it, it will not record the blurriness. It will register the point light as one pixel's brightness - as close to a point as the camera can get. Nor will it smoothly record increasing blurriness until the blur is big enough to affect more than one pixel. The more pixels for a given area (increased sampling frequency, i.e. more, smaller pixels), the more accurately the blur will be rendered, and the more apparent it becomes.

It is really the same thing as "camera shake blur" with high-megapixel sensors. The higher-resolution recording/sampling does not CAUSE the camera to shake more, or the resulting blur to be larger - but it can RECORD and REVEAL any motion blur more, for any given amount of shake.

As to Dr. G's second question: it will depend on focal length. Shorter focal lengths will produce diffraction effects sooner than longer focal lengths.
As I mentioned above, diffraction is related to the absolute diameter of the aperture, in mm, or tenths of an inch, or whatever unit you prefer. (Why? Because light wavelengths are also absolute in size: green light with a wavelength of 530nm will remain that wavelength regardless of anything else - otherwise it ceases to be "green.")

A 21mm lens at "f/16" has an absolute aperture diameter of 21/16, or 1.3125mm - small diameter, more diffraction
A 50mm lens at "f/16" has an absolute aperture diameter of 50/16, or 3.125mm - medium-small diameter, somewhat less diffraction
A 90mm lens at "f/16" has an absolute aperture diameter of 90/16, or 5.625mm - large diameter, less diffraction
The 90-280 zoom at 280mm and "f/16" has an absolute aperture diameter of 280/16, or 17.5mm - very large diameter, much less diffraction

Whereas the following settings will produce essentially identical diffraction effects:

21mm set to an aperture of 10.5mm (f/2)
50mm set to an aperture of 10.5mm (f/4.8)
90mm set to an aperture of 10.5mm (f/8.6)
280mm set to an aperture of 10.5mm (f/27)

As to the practical problem (at which aperture will diffraction become notable on the SL3's 60-megapixel sensor?), here is a basic back-of-the-envelope calculation based on this source: http://www.sfu.ca/~gchapman/e894/e894l13i.pdf

An aperture diameter of 7mm will produce an Airy disk of 3.5 microns from a focused red laser beam. The pixel pitch (width/height of one pixel) of the 60-megapixel Leica/Sony sensors is 3.76 microns.

So for the 21mm, f/2.8-f/3.0 (if one is really picky - and for red light; other colors/wavelengths may vary a small amount). For the 50mm, f/6.6. For the 280mm, f/40.

But, again, being practical: if one is shooting color (where debayering will swap some data around, pixel to neighboring pixel, to get colors other than red/green/blue), those numbers can probably be doubled. Which puts even the 21 Summicron-SL close to the suggested "safe" range of f/8, and the other lenses even better.
And one can add to that any residual aberrations (there is no such thing as a perfect lens).

Edited April 25, 2024 by adan
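The arithmetic in the post above can be sketched as a short script (my own illustration of the same point: diffraction scales with the absolute aperture diameter, i.e. focal length divided by f-number):

```python
# Absolute aperture diameter in mm for a given focal length and f-number.
# Diffraction depends on this absolute diameter, not on the f-number
# alone, so the same physical diameter gives comparable diffraction
# regardless of focal length.

def aperture_diameter_mm(focal_length_mm: float, f_number: float) -> float:
    """Physical aperture diameter = focal length / f-number."""
    return focal_length_mm / f_number

# Same f-number, very different absolute diameters:
for fl in (21, 50, 90, 280):
    print(f"{fl}mm at f/16 -> {aperture_diameter_mm(fl, 16):.4f} mm")

# Same absolute diameter (10.5 mm), very different f-numbers:
for fl in (21, 50, 90, 280):
    print(f"{fl}mm at a 10.5 mm aperture -> f/{fl / 10.5:.1f}")
```

The second loop reproduces the f/2, f/4.8, f/8.6, f/27 series from the post: four lenses at wildly different f-stops, all with essentially identical diffraction.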
LBJ2 Posted April 25, 2024 Share #18 (edited)

2 hours ago, Dr. G said: So where does the SL3 used with APO lenses start showing signs of diffraction? ....

f/8 as a practical rule of thumb, so to speak. However, I typically don't go beyond f/7.1 when and where I think diffraction is a potential visual issue, sometimes staying a bit wider depending on the lens. The cambridgeincolour calculators I posted above are also a practical reference point to check out. In the calculator's advanced mode you can add some of the other factors that might come into play, to help refine things further as needed.

But also, diffraction can sometimes be the least of our worries when dealing with other factors that might be more beneficial to prioritize, as discussed in this article.

Edited April 25, 2024 by LBJ2
BernardC Posted April 25, 2024 Share #19

17 minutes ago, 01af said: That's all fine—but doesn't address the pixel size myth. After all, it's a textbook about physics and not about digital photography

True, but it explains what diffraction is (and isn't). One can always follow up and do the maths for any specific wavelength of light. One other thing the book explains is that diffraction values aren't absolute. For one thing, aperture mechanisms are not infinitely thin or razor sharp. Pinhole photographers have known this for a very long time (two pinholes with the same diameter don't perform the same), but it's often glossed over for general photography. People quote diffraction charts as if they were universal constants, like the speed of light in a vacuum.

The main takeaways are:

Diffraction doesn't suddenly appear at a single f-stop. It's always there, but it gets more noticeable as you stop down.

The effect of diffraction progresses from "not noticeable because of other aberrations" to "barely noticeable" to "the main source of image degradation". Where you draw the line depends on a lot of factors, of which pixel size is one (but rarely the most important one). Magnification is a much more important factor.

You should draw your own conclusions, based on your own experience.

The "pixel size myth" is based on a comparison between the Airy disk size and the pixel size. Frankly, it's very unlikely to be a factor in pictorial photography. Nobody will judge your prints based on Airy disk size alone.
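For anyone curious where the often-quoted per-sensor numbers come from, here is a sketch (my own, using the common but, as argued above, debatable criterion that diffraction "sets in" once the Airy disk spans about two pixels) of the threshold f-number that criterion predicts. The 3.76 µm pitch for the 60 MP sensor is taken from adan's post; the other pitches are approximate values for illustration.

```python
# A common (and debatable) rule of thumb calls a sensor "diffraction
# limited" when the Airy disk diameter reaches about two pixel widths
# (the Nyquist sampling limit). Solving
#     2.44 * wavelength * N = 2 * pitch
# for N gives the threshold f-number for a given pixel pitch.

WAVELENGTH_UM = 0.55  # green light, 550 nm

def threshold_f_number(pixel_pitch_um: float) -> float:
    """f-number at which the Airy disk spans ~2 pixels (green light)."""
    return 2 * pixel_pitch_um / (2.44 * WAVELENGTH_UM)

# Approximate full-frame pixel pitches (assumed, for illustration):
# ~60 MP, ~47 MP, ~24 MP, ~12 MP.
for pitch in (3.76, 4.3, 5.9, 8.4):
    print(f"{pitch} µm pitch -> threshold ≈ f/{threshold_f_number(pitch):.1f}")
```

By this criterion the 60 MP sensor "limits" around f/5.6 and a 24 MP sensor around f/9, which is roughly the range quoted in the video; the point made throughout the thread is that this marks where blur becomes resolvable, not where images visibly degrade.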
Eclectic Man Posted April 25, 2024 Share #20

You may also like to have a look at the Wikipedia page https://en.wikipedia.org/wiki/Diffraction_spike, which discusses 'sun stars' and diffraction spikes:

"Diffraction spikes are lines radiating from bright light sources, causing what is known as the starburst effect or sunstars in photographs and in vision. They are artifacts caused by light diffracting around the support vanes of the secondary mirror in reflecting telescopes, or edges of non-circular camera apertures, and around eyelashes and eyelids in the eye. ...

Iris diaphragms with moving blades are used in most modern camera lenses to restrict the light received by the film or sensor. While manufacturers attempt to make the aperture circular for a pleasing bokeh, when stopped down to high f-numbers (small apertures) its shape tends towards a polygon with the same number of sides as blades. Diffraction spreads out light waves passing through the aperture perpendicular to the roughly straight edge, each edge yielding two spikes 180° apart. As the blades are uniformly distributed around the circle, on a diaphragm with an even number of blades the diffraction spikes from blades on opposite sides overlap. Consequently, a diaphragm with n blades yields n spikes if n is even, and 2n spikes if n is odd."

Basically it is not 'when diffraction starts' that matters, it is 'when diffraction becomes noticeable and intrusive'. See images from the JWST, for example, which exhibit large diffraction spikes around 'foreground' stars.