
Voigtländer Lenses, LoCA and Purple Fringing


paulcurtis



50 minutes ago, otto.f said:

What’s LoCA? It would be nice if in-crowd talk could be avoided. It’s easy on a computer to write these abbreviations out.

A very common abbreviation for a very long set of words.

https://www.google.com/search?client=safari&rls=en&q=lens+loca&ie=UTF-8&oe=UTF-8

//

With the CV APO lenses, we are seeing the absolute raw output in the resulting DNG files, warts and all. For the Leica lenses, lens corrections are baked into the embedded DNG lens profile, and this profile cannot be turned off in Lightroom/Photoshop. For the CVs, just check the CA correction option and adjust the purple/green defringe sliders to taste when you encounter challenging lighting that shows these flaws.

For my CV 50 APO, I use the M lens profile for the 50 f/2 (first choice) simply so I see less vignetting wide open, which helps for metering of the scene and for in-camera image review on the LCD. I edit my DNGs in Capture One, which does not apply the lens profile corrections for M lenses by default, so I'm not worried about it applying the corrections for a different lens. If you use Lightroom/PS, then probably turning off the lens profile in-camera is best.

Edit to add: With my Canon RF 50 f/1.2 and a few other lenses, I use Zeiss T* UV filters. I have noticed the hard UV cutoff that is almost exclusive to the Zeiss UV is helpful in reducing the severity of these issues. The B+W UV/IR cut filter will work in much the same way with the added benefit of IR-cut security. I have the Zeiss filters on order for my M CV lenses, so I've yet to see if they help with the CV. I prefer to just use the lens hood and ditch the lens cap anyway, so I'm going to use them regardless.

Edited by hdmesa

3 hours ago, paulcurtis said:

That's really interesting. Of course it begs the question of how you could ever correct for it, because I am guessing that all rays are going to cross like that. You would need to correct the exit from the last piece of glass so it doesn't refract the wavelengths differently. Plus, real light is a spread of wavelengths, not just RGB. This is why I am not an optical designer.

So I'm guessing APO is about that last-element correction (I suppose it involves more than the last element, though).

Interesting to point out that the fringe is purple because there is no purple wavelength; or rather, it's magenta, not purple. Magenta is made up by our brains from red and blue light but doesn't exist in the visual spectrum at all (see the CIE colour diagrams). Although you may also mean that it's actually ultraviolet light being focused in front, in which case: do UV filters help with PF?

This is interesting

cheers
Paul

We are quite beyond my understanding of colour there, although I am trying to study that area of our art.

It would be interesting to see what difference it might make, but the UV cut should be at the correct wavelength; that's what @hdmesa is saying about the Zeiss filters.


3 hours ago, LBJ2 said:

I didn't realize until now that you are the man behind www.47-degree.com, which was unknown to me until today. 👍 I hope you don't mind if I share some of the data on your website with others... 😎

Yep, that is my work. I don’t mind you sharing a link to my website at all!


9 minutes ago, Harpomatic said:

What do you mean?

This thread amounts to maybe 3 people trying to be clever and pontificating about nothing at all.

How's that?


14 hours ago, hdmesa said:

The CV APO lenses – we are seeing the absolute raw output in the resulting DNG files, warts and all. For the Leica lenses, lens corrections are baked into the embedded DNG lens profile, and this profile cannot be turned off in Lightroom/Photoshop. For the CVs, just check the CA correction option and adjust purple/green fringing tool to taste when you encounter challenging lighting situations that show these flaws.

FWIW, if you are curious about DNG files, RawDigger is an app that will let you open them in their basic raw format, pre-debayer. There's not that much that can be done in a DNG: you can tone map, store colour matrix transforms and obviously tag the file with metadata. There's no lens correction as such beyond tagging the DNG, so whichever app you use to debayer will do what it wants with the data. So there's no treatment of any LoCA or PF going on; the DNG is just a transport for the sensor data. You can learn a lot digging through DNGs.
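For anyone who wants to see what "pre-debayer" means without installing RawDigger, here is a minimal numpy sketch on synthetic data (the 4x4 array is a stand-in for a real mosaic, not a DNG reader):

```python
import numpy as np

# Synthetic stand-in for raw sensor data: one value per photosite.
mosaic = np.arange(16, dtype=np.uint16).reshape(4, 4)

# RGGB layout: R at (even, even), G1 at (even, odd),
# G2 at (odd, even), B at (odd, odd).
r  = mosaic[0::2, 0::2]
g1 = mosaic[0::2, 1::2]
g2 = mosaic[1::2, 0::2]
b  = mosaic[1::2, 1::2]

# Each plane holds a quarter of the photosites; no colour interpolation
# has happened yet, which is why per-channel clipping is visible here
# but hidden after debayering.
print(r.shape, g1.shape, g2.shape, b.shape)  # (2, 2) (2, 2) (2, 2) (2, 2)
```

This is all a raw inspector is doing at heart: showing those four planes separately, with their own histograms, before any interpolation or reconstruction.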

It's worth looking at basic colour science too. Often the DNG stores a sensor colour space and a transform into XYZ space, and the app doing the debayer then moves that to whichever colourspace is needed. In that process there are numerous places for colour gamut mapping which can adversely affect the image. Luckily, on the stills side of the world these workflows are pretty solid and accurate. On the motion side it's more of a wild west! It is possible for some camera hardware to treat the data in some way going into the DNG - so it is feasible that Leica is doing something, but it's quite rare.
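A sketch of that transform step; the matrix values below are invented for illustration - the real ones live in the DNG's colour matrix tags and are camera-specific:

```python
import numpy as np

# Hypothetical camera-RGB -> XYZ matrix. Real values come from the
# DNG's colour matrix tags and differ per camera.
CAM_TO_XYZ = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.7, 0.1],
    [0.0, 0.1, 0.9],
])

def camera_to_xyz(rgb):
    """Map demosaiced camera RGB into XYZ. The editing app then maps
    XYZ into the working/output colourspace, and every such hop is a
    place where gamut mapping can bite."""
    return CAM_TO_XYZ @ np.asarray(rgb, dtype=float)

# A neutral patch stays neutral because each row sums to 1.0 here.
print(camera_to_xyz([1.0, 1.0, 1.0]))  # [1. 1. 1.]
```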

There are interesting things here. For example, did you know Lightroom is always doing highlight recovery, and an awful lot of skies coming out of Lightroom are recovered rather than accurate representations of what was there? Apps like RawDigger can show you each RGGB channel before any clipping. You can learn and understand what the manufacturer is doing with the colour filters and their sensitivities.

I've not looked at Leica DNGs, but lots of others! I think it's the ultimate destination for a pixel peeper gone extreme. A caveat: my background is VFX & post-production, so knowing this was my bread and butter (my excuse).

cheers
Paul


7 hours ago, steve 1959 said:

This thread amounts to maybe 3 people trying to be clever and pontificating about nothing at all.

How's that?

I must have misunderstood your previous post: with pretentious.com you must have been indicating yourself? 

Are you trying to say that in a forum where people pontificate to no end about the smallest minutiae of lens rendering, with vastly different opinions and with several (or dozens of) different lenses in the same focal lengths to get different flavours of imaging, talking about the different design choices made by the Cosina-Voigtländer designers and trying to better understand chromatic aberrations is pretentious? I must be missing something here!

I'm sorry you were coerced to read our pontifications. I hope you'll be set free soon.


7 hours ago, steve 1959 said:

This thread amounts to maybe 3 people trying to be clever and pontificating about nothing at all.

:)

Your opinion, which is fine. 

My background is VFX & post; it's my profession to understand this. In motion, issues can compound and cost a lot of money. I remember one green-screen shoot where vintage lenses were used that was near impossible to key - I tracked it down to odd diffraction of the colours which prevented good edges, even though it looked good to the eye.

So in my creative work I have my eye and what I like, and in professional work I need to ensure things run smoothly. But like lots of others on here I've spent decades with lenses and personally find it fascinating. The lens designer is very much an unsung hero, but I feel that to appreciate that you have to understand a bit.

cheers
Paul


4 hours ago, paulcurtis said:

My background is vfx & post, it's my profession to understand this. ...

Just a guess? 

https://www.imdb.com/name/nm0004271/

Edited by LBJ2

23 minutes ago, LBJ2 said:

Sure, that's part of what I do. I try to avoid mainstream work as much as possible; it's not the best working environment, especially with families. I worked in Soho in the 90s and consulted for lots of software companies then, set up indie pipelines, and did development work, which is still bread and butter these days, mixed with producing now.

More recently there's a lot of ML and deep learning stuff going on, which is interesting because a lot of lens characteristics can be taught to an ML setup. So, for example, PF and LoCA are something I'm experimenting with removing by teaching a neural net what a perfect version is and then using that to alter source images. I've just started experimenting with whether I can change bokeh after the shot, to smooth it out and make what I like. The photographer in me cringes at the thought, but computational photography is valid and offers interesting possibilities. This started from tackling deblurring in motion, especially after stabilisation.
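As an illustration of how such (clean, degraded) training pairs can be synthesised - a toy construction, not Paul's actual pipeline - displacing the R and B planes relative to G produces a crude fringe-like colour split at edges:

```python
import numpy as np

def add_fringe(rgb, shift=1):
    """Crude fringe simulator: roll the red plane one way and the
    blue plane the other, leaving green alone. Hard edges pick up
    coloured halos, giving a (clean, degraded) pair to train on."""
    out = rgb.copy()
    out[..., 0] = np.roll(rgb[..., 0], shift, axis=1)    # red shifted right
    out[..., 2] = np.roll(rgb[..., 2], -shift, axis=1)   # blue shifted left
    return out

# A hard vertical edge: black on the left, white on the right.
clean = np.zeros((4, 8, 3))
clean[:, 4:, :] = 1.0
degraded = add_fringe(clean)   # edge columns now have split colours
```

A network trained on many such pairs learns the inverse mapping; real pipelines would model the chromatic shift far more faithfully than this single-pixel roll.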

All good fun.

cheers
Paul

 


2 minutes ago, paulcurtis said:

Sure that's part of what i do. ...

 

I definitely recognized experience in your comments prior to knowing your background. More than good fun for me, Paul. I appreciate when industry experienced pros are willing to share on the public forums. Some of us need all the help we can get.

Very interesting to know you are also working on applying ML/AI to PF and LoCA problems. It seems this area will continue to draw a lot of funding for these types of projects.


6 hours ago, paulcurtis said:

FWIW if you are curious about DNG files then RawDigger is an app that will let you open them in their basic raw format - pre-debayer. ...

Yes, I said the corrections were baked into the embedded profile in the DNG, not made to the DNG itself. Capture One allows you to use or not use the manufacturer profile, but Lightroom does not allow you to remove any profile flagged as mandatory. The Q/Q2 and my GFX have the mandatory flags, so I'm assuming the M is the same way, but maybe not; I haven't tried the M10-R in LR yet to verify.

That said, we do know it's the manufacturer that is writing the data to the RAW files, so we'd never know what kind of manipulations they do. One example that was discovered is Canon baking noise reduction at lower ISOs into its R5 RAW files in order to increase the apparent dynamic range. This can be seen in the DR charts on the Photons to Photos site, where the NR is denoted by downward-facing triangles. This isn't something flagged or embedded in a profile; it is done to the RAW itself.
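For a sense of the arithmetic involved: engineering dynamic range is the ratio of full-well capacity to read noise, in stops, so noise reduction that halves the apparent read noise adds exactly one stop of apparent DR. The numbers below are illustrative, not R5 measurements:

```python
import math

def dr_stops(full_well_e, read_noise_e):
    """Engineering dynamic range in stops: log2(full well / read noise)."""
    return math.log2(full_well_e / read_noise_e)

native  = dr_stops(50_000, 3.0)   # ~14.0 stops with illustrative numbers
with_nr = dr_stops(50_000, 1.5)   # NR halves the apparent read noise

print(round(with_nr - native, 3))  # 1.0 stop gained, purely from NR
```

Which is why the charts flag those points: the extra stops come from processing the raw data, not from the sensor itself.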


17 hours ago, hdmesa said:

That said, we do know it’s the manufacturer that is writing the data to the RAW files, so we’d never know what kind of manipulations they do. One example that was discovered is Canon baking in noise reduction at lower ISOs to its R5 RAW files in order to increase the apparent dynamic range. This can be seen in the DR charts on the Photos-to-photos site where the NR is denoted by downward-facing triangles. This isn’t something flagged or embedded in a profile, this is done to the RAW itself. 

Yes, true. This is always one of the issues with cinema camera testing: RAW isn't really RAW sometimes, so people can get in a twist about the DR of various cameras without really understanding what's going on or being able to work with it. I think Sony also do something similar, no? I remember issues with the a7 range 'eating' stars at night (although that might be the RAW compression rather than NR in that case).

What is interesting is how a sensor achieves its range. We like to think in terms of "camera A has a DR of 14 stops", for example, but behind the scenes it's a lot more complicated than that. An example is the original Sony FS700, which was unique because you could get RAW DNGs out of it via a third party - meaning you get to see behind the scenes for once. In that case I noticed that the green channel was two stops more sensitive than the other channels, which effectively gave an extended dynamic range. In a scene outside, the green channel could clip, and then you are reliant on the red and blue channel data not clipping so you can guess what the green might have been. It means that in low light it was a good performer, but outside, blue would clip quite fast too and you're reliant on red. So you would see colour skewing in highlight areas because of this reconstruction, and a lack of colour detail in shadows because only green was recording.
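A toy version of that green-channel guesswork (a sketch of the idea, not the FS700's or any real debayer's method): learn the typical G/(R+B) ratio from unclipped pixels, then use it to estimate what clipped green photosites would have read.

```python
import numpy as np

CLIP = 1.0  # normalised clipping point

def recover_green(r, g, b):
    """Where green has clipped, estimate it from red and blue using
    the mean G/(R+B) ratio of the pixels that did not clip."""
    r, g, b = (np.asarray(x, dtype=float) for x in (r, g, b))
    clipped = g >= CLIP
    ok = ~clipped & (r + b > 0)
    ratio = np.mean(g[ok] / (r[ok] + b[ok]))
    out = g.copy()
    out[clipped] = ratio * (r[clipped] + b[clipped])
    return out

# Unclipped pixels here follow g = 0.5 * (r + b); the last pixel clipped.
r = np.array([0.2, 0.4, 0.8])
g = np.array([0.3, 0.6, 1.0])   # last value hit the clip point
b = np.array([0.4, 0.8, 0.9])
recovered = recover_green(r, g, b)  # last green estimated as 0.5 * 1.7 = 0.85
```

The guess is only as good as the ratio assumption, which is exactly why reconstructed highlights skew in colour.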

Then I noticed that Lightroom is always doing highlight reconstruction whether you want it or not, and as I mentioned above it means that a lot of skies in LR are reconstructed automagically and have colour casts; I think we are so used to it that we don't actually even question them. Again, highlight reconstruction is something that can happen in camera or out of camera. If one of the colour channels clips, you can make a rough guess as to what the colour might have been based on the colours around it; you can be quite sophisticated, as modern debayer solutions look at gradients and edges to help that process. It's a shame that LR doesn't let you choose debayer algorithms, because some are better in some situations than others. I quite often find dodgy edges in LR because of this.

This is why it's quite enlightening to look at the RAW colours before they get debayered, because all this reconstruction happens at that point. It's also why white balancing after a RAW shot is possible.
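The white-balance point can be made concrete: on linear raw data, WB is nothing more than a per-channel gain applied before or during debayer, which is why it can be chosen freely after the shot. A minimal sketch:

```python
import numpy as np

def white_balance(raw_rgb, gains):
    """Apply per-channel multipliers to linear raw data. The sensor
    records unbalanced values; any white balance is just a choice
    of gains applied afterwards."""
    return np.clip(np.asarray(raw_rgb, dtype=float) * np.asarray(gains), 0.0, 1.0)

# A grey card shot under warm light comes out red-heavy and blue-poor;
# gains chosen for that light neutralise it (each channel lands at ~0.5).
patch = [0.4, 0.5, 0.3]
neutral = white_balance(patch, [1.25, 1.0, 5 / 3])
```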

19 hours ago, LBJ2 said:

Very interesting to know you are also working with applying ML/AI to PF and LoCa problems. Seems this area will continue to draw a lot of funding for these type projects. 

Thanks for the interest. This is a really interesting area and in some ways really goes against my principles. Just take a look at how much work the simple(?) iPhone is doing these days. Each photo is made from around 15 different shots all doing different things - depth, noise - all contributing to the 'data' it has to work with, and then from that an image is calculated, not recorded. What is concerning is that the image has had face recognition and smoothing applied; it can tweak the colours of grass and skies, and who knows where it will all end. It's feasible that in the not too distant future you take a snap and that image doesn't really represent what was there, but a best memory of it.

cheers
Paul


On 3/20/2021 at 6:45 PM, hdmesa said:

...

Edit to add: With my Canon RF 50 f/1.2 and a few other lenses, I use Zeiss T* UV filters. I have noticed the hard UV cutoff that is almost exclusive to the Zeiss UV is helpful in reducing the severity of these issues. The B+W UV/IR cut filter will work in much the same way with the added benefit of IR-cut security. I have the Zeiss filters on order for my M CV lenses, so I've yet to see if they help with the CV. I prefer to just use the lens hood and ditch the lens cap anyway, so I'm going to use them regardless.

What camera are you using? Canon R series? Or Leica M8 / M9? Or analog / film cameras?

I ask because I find it hard to believe that the Zeiss UV filter (which has a sharp cut at 410nm), and particularly the B+W 486 UV & IR cut filter (which cuts sharply at 390nm), should impact the image on stock digital cameras. The reason is that the sensor glass effectively cuts away both UV and IR on modern digital cameras. I have seen measured graphs of these sensor stacks, and they include a sharp UV/IR cutting filter as well as a "hot mirror" type filter.

That said, these graphs don't include digital Leica M cameras like M8 and M9 which have different filters, so there may be an explanation there... 

(I have gone the odd route and had my Nikon Z6 modified with Kolari Vision Ultra-Thin sensor glass to optimize the camera for M-mount lenses. This conversion strips the hard UV/IR cut filter, so a UV filter is actually needed for that reason, which is what led me to research the topic.)


That is not correct. The UV is cut off by the lens; all modern lenses are UV filtered. The sensor is still UV sensitive: it is quite possible to do digital UV photography using a lens that is transparent to UV, like the Summarit 50/1.5, with a B+W 403 filter. As for IR filtering, up to today Leica M cameras have quite thin and not fully effective IR filters, for technical reasons (search the forum and you'll find dozens of IR photographs). It is quite possible to do (near) IR photography with the M10 using a B+W 093 filter, even handheld. Using a 092 filter is more difficult, but still possible.

Canon is a different story: the IR filter is quite thick and thus effective, so the camera needs to be modified.


10 hours ago, jaapv said:

That is not correct - the UV is cut off by the lens, all modern lenses are UV filtered; the sensor is still UV sensitive. ...

First: your point confirms my point that using a Zeiss UV filter shouldn't improve colour fringing / CAs, which was the context and the reason I posted in the first place.

As to the rest:

UV below about 350 or 360nm is cut by the glass in the lens, plus some more by the coating. However, the range between that and visible light (400nm) is transmitted to some degree. Those who do UV photography won't use a standard modern lens, since it doesn't let much UV through, but I can confirm that some UV does pass through modern, regular lenses. I have two full-spectrum cameras plus filters for UV and IR photography, so I know this first hand.

The B+W 403 (UG-1) filter is a dual-band filter that leaks a considerable amount of near IR (with a peak at 750nm). Given that the sensor itself is much more sensitive to IR than UV, shooting that filter without a proper IR-cut filter stacked on top will lead to IR images with just a hint of UV at best. Regarding the B+W 092 and 093 filters, I think you are mixing them up; see the attached image below. Shooting the 092 with an unmodified camera (even a regular digital camera) will be no problem, but for shooting the 093 filter you will need extreme levels of exposure. There are some who shoot IR images with normal, stock digital cameras; they just raise the ISO and exposure time considerably. That said, digital Leica M cameras may have a weaker UV and IR cut than other cameras. The M8 is notorious for leaking IR, but I have no personal experience with digital Leica M cameras.
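The effect of stacking can be put in numbers: the combined transmission is simply the product of the individual curves. The sample values below are rough illustrations, not measured data for the 403 or any particular hot mirror:

```python
import numpy as np

# Illustrative transmission samples (fractions) at a few wavelengths.
wavelength_nm = np.array([330, 400, 550, 670, 750])
ug1_like      = np.array([0.70, 0.05, 0.00, 0.30, 0.80])  # dual-band: UV pass + red/IR leak
ir_cut_like   = np.array([0.90, 0.95, 0.95, 0.40, 0.00])  # hot-mirror-style IR cut

# Stacking filters multiplies their curves: the 750 nm IR leak is
# killed by the IR cut (0.8 * 0.0 = 0.0) while the UV band at 330 nm
# survives (0.7 * 0.9 = 0.63).
stacked = ug1_like * ir_cut_like
```

That is why the 403 needs a proper IR-cut filter stacked with it on an IR-sensitive sensor: without the second curve, the 750 nm leak dominates the exposure.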

Kolari Vision makes a living replacing the stock sensor glass on various digital cameras (for UV and IR photography etc.) and has measured these stock filters (see the link in my post above). They say that in the M9 the sharp-cutting UV/IR filter is fused to the sensor, while the IR- and red-absorbing (greenish-blue coloured) filter is removable. The latter filter has close to zero transmission already at 750nm according to their measurements. See here: https://kolarivision.com/product/leica_m9_repair/

 


2 hours ago, LarsHP said:

The B+W 403 (UG-1) filter is a dual-band filter that leaks a considerable amount of near IR (with a peak at 750nm). Given that the sensor itself is much more sensitive to IR than UV,

Which is the reason that sensors have an IR filter: it cuts off the IR transmission through the 403. Your remark is only valid for modified sensors; mine was aimed at standard sensors.


26 minutes ago, jaapv said:

Which is the reason that sensors have an IR filter, which cuts off the transmission through the filter. Your remark is only valid for modified sensors, mine was aimed at standard sensors.

Well, for unmodified cameras you will basically get a black filter using the B+W 403, since it opens up only in the UV plus the red and IR range, and the stock sensor glass cuts heavily at both ends. The transmission peak of the 403 filter in the UV range is around 330nm, which means only special UV-capable lenses will let any UV through. At the other end, the filter opens up already around 670nm. In other words, with a regular lens and a stock camera it will only pass deep red and perhaps a little IR.

