laowai_ Posted July 26, 2019 #81
21 hours ago, pico said: I don't know what a sub-pixel is. Would it be a virtual pixel calculated from neighboring pixels? We have that in conventional algorithmic PP software. Would it be pixels created by statistical analysis of any given pattern of global and neighboring pixels? We have that, too. So, I'm curious regarding the goal of AI pixel sampling.
Hi Pico, I have no insight into PP software algorithms (stating the obvious), and no idea what could be improved or how it could be done. Let me take the opportunity to elaborate a bit on the thought behind my question: for digital converters there is the concept of ENOB (effective number of bits). Would it make sense for photographers to think, for example, in terms of an effective number of pixels instead of the absolute number of physical pixels on the sensor itself?
laowai_ Posted July 26, 2019 #82
4 hours ago, jaapv said: Maybe he is referring to oversampling. Scroll down in the linked article. https://en.wikipedia.org/wiki/Optical_transfer_function Which, BTW, demonstrates the pointlessness of the original question.
I do apologize that my contribution is considered pointless. My aim was to add value to the discussion; I regret that I fell short of that goal.
pico Posted July 26, 2019 #83 (edited)
2 hours ago, laowai_ said: For digital converters there is the concept of ENOB (effective number of bits). Would it make sense for photographers to think, for example, in terms of an effective number of pixels instead of the absolute number of physical pixels on the sensor itself?
In a sense we perceptually exploit the concept of ENOB every time we select an image size or print. Image resizing also uses various PP algorithms, usually user-selected. ENOB is commonly used to characterize converter circuits: it expresses how much of the recorded range is signal rather than noise, and it does not lower the number of bits finally recorded. A visual analogy: viewing a digital image at 1:1 from a sensor with a large number of sensels makes noise more evident than viewing the same image at lower magnification, because the input receptor (our eyes) has lower resolution - it sees only an effective number of pixels. It is a relatively old technique. Today gosh only knows what other means they use to control noise.
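Since ENOB keeps coming up, here is a minimal sketch, in Python, of the standard formula that converts a converter's measured SINAD into effective bits. The "effective number of pixels" analogue laowai_ asks about has no comparably standard definition, so only the ADC side is shown; the 74 dB figure is just an illustrative value, not a measurement of any particular camera.

```python
# Minimal sketch of the standard ENOB relation for an A/D converter:
# ENOB = (SINAD - 1.76 dB) / 6.02 dB, where SINAD is the measured
# signal-to-noise-and-distortion ratio and the constants come from the
# quantization-noise model of an ideal converter.

def enob(sinad_db: float) -> float:
    """Effective number of bits for a converter with the given SINAD in dB."""
    return (sinad_db - 1.76) / 6.02

if __name__ == "__main__":
    # Example: a nominally 14-bit converter measuring 74 dB SINAD
    print(f"ENOB = {enob(74.0):.1f} bits")   # ~12.0 effective bits
```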
farnz Posted July 26, 2019 #84
ENOB: a tool for stopping Entropy from running away with the cake. Pete.
jaapv Posted July 26, 2019 #85
3 hours ago, laowai_ said: I do apologize that my contribution is considered pointless. My aim was to add value to the discussion; I regret that I fell short of that goal.
Not the contribution - the concept is widespread on the internet, so it is legitimate to address it. Unfortunately, its widespread nature doesn't make it any more valid.
ChicagoMatthew Posted July 26, 2019 Author #86
On 7/25/2019 at 2:31 AM, jaapv said: I use a lot of lenses... And I know that the concept "designed for film" bears no relationship to reality. The whole concept of lens resolution is far too convoluted to be caught in simplistic ideas like this. If you want to refer to (useless) resolution tests, the film used in the past was Technical Pan, which outresolves any present-day sensor with up to 200 cycles/mm.
So you don't believe that newer lenses, with new coatings and computerized manufacturing techniques, produce better resolution than older lenses AND you think a resolution test is useless? We don't have to get stuck on "film era lenses" if you don't like that term. I'm just saying that the tolerances for film are less precise than for digital, and newer lenses are made to that tolerance.
jaapv Posted July 26, 2019 #87
Yes - lens resolution tests are useless; they "should be relegated to the dustbin of history" according to Erwin Puts (Leica Lens Compendium, pp. 75ff). Resolution is not even a discussion point in lens design any more. See the various interviews with Peter Karbe. The simplistic conflation of the optical correction of lenses and their OTF with the resolution of sensors is, frankly, meaningless. Nor is the "tolerance of film" argument valid. Older lenses were designed as well as the designers could, irrespective of the medium used.
farnz Posted July 26, 2019 #88
Newer is not necessarily better. (Sometimes it is, but sometimes it isn't.) Plus it could be argued that some newer lenses may be designed more 'loosely' because the designers know that optical aberrations can be corrected with in-camera software, which wasn't available in the pre-digital era. Pete.
ChicagoMatthew Posted July 26, 2019 Author #89
21 minutes ago, jaapv said: Nor is the "tolerance of film" argument valid. Older lenses were designed as well as the designers could, irrespective of the medium used.
Regardless of the medium used? Pre-digital, there was only one medium. And yeah, they designed lenses as well as they could... and now they can produce better lenses, and better results. What do you think about this article, which seems to support the assertion that an outmatched lens will produce soft images? https://www.dtcommercialphoto.com/do-megapixels-matter-and-is-medium-format-really-better/
jaapv Posted July 26, 2019 #90
That is about pixel size, medium format and the point spread function (important aspects), but it has very little to do with historical lens design. BTW, lenses were not tested on film during design and production by Leica in the fifties, sixties and seventies, but on large projections and digitally (the COMO program).
ChicagoMatthew Posted July 26, 2019 Author #91
1 minute ago, jaapv said: That is about pixel size, medium format and the point spread function, but it has very little to do with historical lens design. BTW, lenses were not tested on film during design and production by Leica in the fifties, sixties and seventies, but on large projections and digitally (the COMO program).
Either we are talking about two different things, or you conveniently didn't read the entire article. Yes, it's about pixel size and the point spread function, AND about whether older lenses work well on newer sensors. Here's the part that's most relevant to this thread and my original question.
Directly quoted from: https://www.dtcommercialphoto.com/do-megapixels-matter-and-is-medium-format-really-better/
Old Lenses + New Sensors = Not-So-Great Systems
Take, for example, some of the world's finest lensmakers – Leica, Zeiss, Schneider, and Rodenstock to name a few. In the film era, these companies produced some of the sharpest optics ever seen. Whether they were wicked-fast 35mm lenses, or large-format lenses covering massive image circles, they were all well engineered enough to resolve down to the size of a few film grains. To put it in terms of what we just discussed, as long as their PSFs could be engineered to be smaller than those clusters of grains, the lenses would generate razor-sharp images. And they certainly delivered, for a long time. But modern pixels can be much smaller than film grain clusters, and the PSFs of those old lenses that may have been good enough back then may now be too wide for the increasingly tiny pixels on new sensors. That's why sharper, "digital" versions of these lenses have been released.
Let's look at a much more recent, widespread example though. On the 35mm Full Frame DSLR (I'll just refer to this as 35mm FF going forward) side of things, consider that Canon's excellent 5D system doubled in resolution between 2013 and 2016, reaching 50 MP by the end of the short 3-year period. But even the newest entries in their flagship line of L lenses were designed with the 22 MP 5D Mk III and its predecessors in mind. This led to widespread complaints that the 50 MP 5DS R produced soft images. And it did. But not because the sensor was lacking in resolution – the lenses were not of high enough quality to accommodate the level of stress the sensor subjected them to. Their corresponding PSFs were too wide to take advantage of the jump in megapixels. So more pixels might mean better images, but only if you have the glass to support them.
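To put rough numbers on the pixel-pitch side of the article's argument, here is a small Python sketch. The horizontal pixel counts are the nominal resolutions of the two Canon bodies mentioned, and the Airy-disk formula covers diffraction only (no aberrations), so it understates real-world PSF widths; treat it as an order-of-magnitude illustration rather than a test result.

```python
# Compare pixel pitch on a 36 mm-wide full-frame sensor with the
# diffraction-limited blur spot (Airy disk diameter = 2.44 * wavelength * N).

SENSOR_WIDTH_MM = 36.0
WAVELENGTH_MM = 550e-6          # green light, ~550 nm

def pixel_pitch_um(h_pixels: int) -> float:
    """Pixel pitch in micrometres for a given horizontal pixel count."""
    return SENSOR_WIDTH_MM / h_pixels * 1000.0

def airy_diameter_um(f_number: float) -> float:
    """Airy-disk diameter (to the first minimum) in micrometres."""
    return 2.44 * WAVELENGTH_MM * f_number * 1000.0

for name, h_pixels in [("5D Mark III (~22 MP)", 5760), ("5DS R (~50 MP)", 8688)]:
    print(f"{name}: pixel pitch ~ {pixel_pitch_um(h_pixels):.1f} um")

for n in (2.8, 5.6, 8, 11):
    print(f"f/{n}: Airy disk ~ {airy_diameter_um(n):.1f} um")
```

The output (roughly 6.3 µm vs. 4.1 µm pitch, against a blur spot growing from about 3.8 µm at f/2.8 to about 15 µm at f/11) shows why the denser sensor runs into the lens's PSF sooner, which is the article's point.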
jaapv Posted July 26, 2019 #92
I have used Canon L lenses in the past - a rather sad experience compared to their 1950s LTM ones, which I still use today.
adan Posted July 27, 2019 #93
That article is basically what Rollei and Hasselblad and Mamiya were advertising 60 years ago or more: "If you want high resolution, don't buy finer and finer-grained film for your 35mm camera - get a medium-format camera with a physically larger piece of film (or silicon)!" MF brochures and ads were full of this type of blurb.
Salivating over a 61 MP sensor ("finer and finer grain") in 35mm format (24mm x 36mm) is doing exactly what R, H, M, and that article all tell you NOT to do - trying to get more and more detail out of a "small" image area, regardless of the quality of the lens.
The point they make about the point spread function is correct (if a bit lacking in detail). However, what it really means is that any small sensor, and any lens for that format, is going to run into the limits of the PSF sooner than a larger sensor and a longer focal length. The prime contributors to light from a point "spreading" are quantum interference between photons ("spooky action at a distance") and diffraction. And diffraction is related to the absolute aperture of the lens - that is, the aperture expressed in millimeters or inches rather than f/stops. A 50mm lens set to f/11 has an absolute aperture about 4.5 mm across. An equivalent lens (by field of view) for a 54 x 40 mm sensor (digital medium format) will have a focal length of ~80mm, and its absolute aperture at f/11 will be 7.27 mm across. The second lens will produce less diffraction and less point spread relative to its frame than the first lens, at the same field of view and f/stop (a quick numeric check follows below this post). Add to that that (as the article says) a 61 MP MF sensor has larger pixels that are more tolerant of the light spreading than a "Barnack-format" 61 MP sensor.
Suffice it to say that Leitz/Leica has been "all over" the PSF in optimizing their lenses ever since installing their first programmable digital computer for lens design and optimization - in 1953. If that is the criterion, you won't find better.
______________
As to lenses "designed for digital," I agree with you on that, and not Jaap. But that design change has little to do with better resolution as such. "Digital lenses" are optimized for the different structure of a digital sensor, and the problems they solve have to do with telecentricity, refraction, vignetting and color shifts rather than lines per mm - the old "Sony fuzzy-corners" or "Italian-flag color shift" problems. If you're from Chi-town (I have been), you know that there are streets in the Loop, between the skyscrapers, that don't get any sun for weeks on end when the sun is low in the winter. Digital sensors are kinda like that - they really, really prefer the "sun" to always be straight overhead in "summer high-noon" position. A lens that can do that even with wide angles of view is a "telecentric" lens - it bends the light differently so that it hits the sensor as much as possible from straight overhead. The Leica SL (and Zeiss Otus) lenses are telecentric - and quite large and long for their focal lengths. So are traditional SLR lenses, to some extent, because they have to clear the moving viewing mirror anyway. The SL/Otus lenses are "designed for digital" - the SLR lenses just lucked out in being already "digital-ready."
BTW, telecentric lenses can be used on film, assuming compatible mounts and such - they are just unnecessarily large, since they solve a problem that doesn't exist with film.
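A quick numeric check of the figures in the post above, assuming a purely diffraction-limited lens and 550 nm light: the absolute apertures match the quoted values, and expressing the Airy disk as a fraction of the frame width shows why the larger format is less troubled by the same amount of point spread. This is only a sketch of the geometry described, not a model of any actual lens.

```python
# Absolute aperture = focal length / f-number. At a fixed f-number the
# Airy-disk diameter is the same in absolute terms, but it occupies a smaller
# fraction of the wider medium-format frame.

WAVELENGTH_MM = 550e-6   # green light, ~550 nm

def absolute_aperture_mm(focal_length_mm: float, f_number: float) -> float:
    return focal_length_mm / f_number

def airy_fraction_of_frame(f_number: float, frame_width_mm: float) -> float:
    airy_mm = 2.44 * WAVELENGTH_MM * f_number   # Airy-disk diameter in mm
    return airy_mm / frame_width_mm

examples = [
    ("50 mm on 24x36", 50.0, 36.0),
    ("~80 mm on 40x54", 80.0, 54.0),
]
for name, focal, frame_w in examples:
    ap = absolute_aperture_mm(focal, 11)
    frac = airy_fraction_of_frame(11, frame_w)
    print(f"{name}: aperture at f/11 ~ {ap:.2f} mm, "
          f"Airy disk ~ {frac * frame_w * 1000:.1f} um "
          f"({frac:.5f} of frame width)")
```

Both lenses give an Airy disk of about 15 µm at f/11, but on the 54 mm-wide frame that blur is a noticeably smaller fraction of the image, and it falls on larger pixels.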
Chaemono Posted July 27, 2019 #94
45 minutes ago, adan said: The point they make about the point spread function is correct (if a bit lacking in detail). However, what it really means is that any small sensor, and any lens for that format, is going to run into the limits of the PSF sooner than a larger sensor and a longer focal length. The prime contributors to light from a point "spreading" are quantum interference between photons ("spooky action at a distance") and diffraction. And diffraction is related to the absolute aperture of the lens - that is, the aperture expressed in millimeters or inches rather than f/stops.
Can this be corrected by algorithms when the camera records the RAW data?
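For what it's worth, the kind of algorithmic correction the question alludes to is usually some form of PSF deconvolution. Whether any camera applies it when recording RAW data isn't established in this thread; the snippet below is only a minimal illustration of the principle, assuming the PSF is known exactly and using scikit-image's Richardson-Lucy implementation on a synthetic image.

```python
import numpy as np
from scipy.signal import convolve2d
from skimage.restoration import richardson_lucy

def gaussian_psf(size: int = 9, sigma: float = 1.5) -> np.ndarray:
    """Small Gaussian kernel used here as a stand-in point spread function."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return psf / psf.sum()

rng = np.random.default_rng(0)
scene = rng.random((128, 128))                       # synthetic "sharp" scene
psf = gaussian_psf()
blurred = convolve2d(scene, psf, mode="same", boundary="symm")

# Richardson-Lucy deconvolution: iteratively estimates the sharp image from
# the blurred observation, given the (assumed known) PSF.
restored = richardson_lucy(blurred, psf, 30)
```

In practice the PSF is never known exactly and the data are noisy, so such correction can only partially undo diffraction and lens blur; it does not recover detail the optics never delivered.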