
M8's Kodak Sensor


MP3



The sensor as such has no ability at all to tell the frequency of light hitting it - all that happens is that any photon that hits a cell generates an electron (I simplify; quantum efficiency etc. means fewer electrons than photons). Those electrons are then counted, which gives a measure of light intensity. It's the Bayer filter over the sensor that makes certain cells sensitive to red, green or blue, because that's all that passes through. But once infra-red has passed through the red part of the Bayer filter and generated an electron, there is absolutely no way for the camera electronics, firmware or any post-processing software to know whether the electron was the result of red light or of infra-red.
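To put numbers on that indistinguishability, here is a minimal Python sketch; the two quantum-efficiency figures are invented for illustration, not Kodak data:

def electrons(red_photons, ir_photons, qe_red=0.35, qe_ir=0.20):
    # Electrons counted in a red-filtered cell: red and IR both contribute.
    # The hypothetical quantum efficiencies are illustrative only.
    return qe_red * red_photons + qe_ir * ir_photons

# Pure red light vs. a red/IR mix that yields the identical count:
a = electrons(red_photons=1000, ir_photons=0)    # 0.35 * 1000     = 350.0
b = electrons(red_photons=800, ir_photons=350)   # 0.35 * 800 + 70 = 350.0
print(a == b)  # True - after capture, the two scenes cannot be told apart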

 

You can play a bit with the relative sensitivities of the red/green/blue cells. E.g., if you make the red cells a little less sensitive, you can decrease IR sensitivity a bit. This may be part of the new firmware's revised color matrix. But sooner or later you will start to impact on overall color balance.
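A toy numerical illustration of that trade-off (the matrix and pixel values are made up, not the M8 firmware's):

import numpy as np

# Damp the red channel by 15% as a crude stand-in for a revised color matrix.
matrix = np.diag([0.85, 1.0, 1.0])

ir_contaminated = np.array([0.9, 0.5, 0.6])   # e.g. dark fabric plus an IR leak
true_red_subject = np.array([0.9, 0.1, 0.1])  # a genuinely red object

print(matrix @ ir_contaminated)   # [0.765 0.5 0.6] - the cast is reduced...
print(matrix @ true_red_subject)  # [0.765 0.1 0.1] - ...but real reds dull identically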

 

Sandy


That’s beside the point. For all intents and purposes, it is an IR-blocking filter:

 

And not a very good one. Since you write on technical issues for LFI, do you have any insight into the analysis behind the decision that it was better to thin the filter, using an existing material and accepting IR sensitivity, rather than to use a dichroic and address the extra internal reflections, or to find a material that would absorb more strongly and could thus be thinner? While I have heard this defended as the result of a careful study, I have not heard any details, and all statements seem to have been filtered by marketing people who cannot be held responsible for technical accuracy.

 

My personal opinion is that an M9 will emerge someday, but it's not based on talking to anyone in Solms.

 

scott


There are absorption filters (absorbing IR) and dichroic interference filters (reflecting IR). That's the terminology as it is commonly used.

 

Dichroic - get all the definitions here: dichroic - encyclopedia article about dichroic.

 

And for an engineer designing a camera, this difference matters a lot.

 

Yup, the Epson R-D1 definitely shows that.

 

 

Only if the filter is of the absorption kind, and only if it is a filter in front of the lens. The absorption filter in front of the sensor also absorbs some red and thus favours blue and green, but that only counteracts the sensor's increased sensitivity to red. In other words, the IR-absorbing cover glass also corrects for the red cast caused by the CCD.

 

Again, in the Epson R-D1s this is correct to a large extent. :)

 

 

That’s beside the point. For all intents and purposes, it is an IR-blocking filter:

 

That definition, now, makes sense! :)


Steve is correct on the adapter - the Canon mount is the closest to the film, so it's possible to use an adapter for other lenses on a Canon SLR, but not the other way round.

 

Sandy

 

I am not talking about the need to space the lens further because the Canon body is thinner. That's obvious. Once the lens is focused at infinity for each camera, the angle of light to the sensor will be the same for a given area. Positioning the lens further or closer would alter focus.

 

So in my example, when I move my Nikkor 28mm PC (or any other lens) from a Nikon FF body to a Canon FF body, the angle of light from center to edge will be the same.

 

So what I don't understand is the claim that specific cameras have specific differences in the angle of light that comes from the lens. This is a lens design issue, not a camera design issue. I don't see the relevance of the flange distance to measuring an angle from center to edge. Why choose this location?

 

Of course a camera such as the M8 that lacks a mirror can have a lens mounted closer to the sensor and thus accommodate a design that uses a wider angle of light. But there is nothing to stop Leica from designing retrofocus wide-angle or other more telecentric lenses for this and future models. This is something they are probably considering.

 

Surely a 90mm Elmar does not have the same angle as a 24mm. And there is nothing stopping a DSLR manufacturer from making lenses that sit further back and require one to raise the mirror. I used to have a Nikkor 21mm and a Nikkor 8mm that worked this way.

 

What it really comes down to is that for acceptable performance, the angle of light that hits the sensor has to be within a certain range to meet the sensor's requirements, not the camera body's.


do you have any insight into the analysis behind the decision that it was better to thin the filter, using an existing material and accepting IR sensitivity, rather than to use a dichroic and address the extra internal reflections, or to find a material that would absorb more strongly and could thus be thinner?

A filter with a higher absorption ratio for infrared would have been a viable solution, but apparently the material with the desired characteristics doesn't exist (yet). A dichroic interference filter wouldn't just cause additional reflections, which are difficult to deal with in the tight space left between filter and lens. It would also introduce a version of the red-vignetting/cyan-corners syndrome that is much more severe than the one caused by dichroic filters in front of the lens. With a filter in front of the lens, the amount of red vignetting depends on the focal length and little else. That's because the incident angles of all the rays of light emanating from a given point will (with the possible exception of macro shots) vary only slightly. But for a filter in front of the sensor, there will be a lot of variation in the incident angle, and it will also vary with the aperture setting. The type of lens factors in as well. All in all, the cyan-corner issue is much harder, if not impossible, to solve, so a solution with the filter in front of the lens was to be preferred.
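For readers who want to see the angle dependence, the standard first-order blue-shift formula for an interference filter can be sketched in Python; the 700 nm cutoff and the effective index of 1.8 are assumed illustrative values, not data for any actual filter:

import math

def cutoff_nm(theta_deg, lambda0=700.0, n_eff=1.8):
    # lambda(theta) = lambda0 * sqrt(1 - (sin(theta) / n_eff)^2):
    # a dichroic's cutoff slides toward shorter wavelengths as the
    # incident angle steepens.
    s = math.sin(math.radians(theta_deg)) / n_eff
    return lambda0 * math.sqrt(1.0 - s * s)

for theta in (0, 10, 20, 30):
    print(theta, round(cutoff_nm(theta), 1))
# 0 -> 700.0, 10 -> 696.7, 20 -> 687.2, 30 -> 672.5:
# at steep angles the filter starts eating into visible red - cyan corners.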



Alan,

 

The reason for all the flange talk is that in an SLR, the flange is the limiting distance - you can't have any element of the lens closer than that, so that in turn defines the maximum angle. For the M series, the flange isn't the limiting distance, which is why in the table I showed, I also gave the number for a lens that protrudes backwards by 1cm into the body. I will admit to simplifying a bit :D - (a) you can design a lens that keeps light more parallel than the simple exit pupil calculation I did suggests (e.g., Olympus Four Thirds lenses, at least in the original version of the spec), and (b) the diameter of the exit pupil also has an impact. Etc., etc. But the simple calculation shows the magnitude of the problem that the Leica engineers had to deal with.
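A rough reconstruction of that back-of-envelope calculation, hedged accordingly: the registers below are the published flange distances, the half-diagonals follow from the sensor sizes (18 x 27 mm for the M8, 24 x 36 mm for full frame), and the last row is the illustrative "lens protruding 1 cm into the body" case from the post:

import math

def corner_angle(pupil_dist_mm, half_diag_mm):
    # Treat the exit pupil as sitting at the given distance from the
    # sensor and compute the chief-ray angle at the sensor corner.
    return math.degrees(math.atan(half_diag_mm / pupil_dist_mm))

print(round(corner_angle(46.5, 21.6), 1))  # Nikon F register, full frame: ~24.9 deg
print(round(corner_angle(44.0, 21.6), 1))  # Canon EF register, full frame: ~26.1 deg
print(round(corner_angle(27.8, 16.2), 1))  # Leica M register, M8 sensor:  ~30.2 deg
print(round(corner_angle(17.8, 16.2), 1))  # M lens reaching 10mm inside:  ~42.3 deg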

 

Sandy


Alan,

 

But the simple calculation shows the magnitude of the problem that the Leica engineers had to deal with.

 

Sandy

 

OK, that's where you are coming from. I think we all knew, starting years ago, that accommodating existing wide-angle lenses would be the big problem for a digital Leica M. It is clear that their solution has been a compromise that may or may not be acceptable to all users. Simply making some new retrofocus wide-angle lenses would be an easy solution for those who are unhappy about the cyan corners.


So what I don't understand is the claim that specific cameras have specific differences in the angle of light that comes from the lens. This is a lens design issue, not a camera design issue. I don't see the relevance of the flange distance to measuring an angle from center to edge. Why choose this location?

 

Differing systems have different registers (lens-to-film distance) - it's a system issue.

Check out this link:

 

mounts.htm

 

Of course a camera such as the M8 that lacks a mirror can have a lens mounted closer to the sensor and thus accommodate a design that uses a wider angle of light. But there is nothing to stop Leica from designing retrofocus wide-angle or other more telecentric lenses for this and future models. This is something they are probably considering.

 

That's an idea that would make all previous M glass obsolete, and it would be a difficult sell, to say the least.

 

Surely a 90mm Elmar does not have the same angle as a 24mm. And there is nothing stopping a DSLR manufacturer from making lenses that sit further back and require one to raise the mirror. I used to have a Nikkor 21mm and a Nikkor 8mm that worked this way.

 

The worst-case scenario must be accommodated.

 

What it really comes down to is that for acceptable performance, the angle of light that hits the sensor has to be within a certain range to meet the sensor's requirements, not the camera body's.

 

This is the idea behind microlenses: they do increase this angular capacity, but it can only be stretched so far. The measurements listed were just an indication of the differences between systems; you can perhaps see from the list of registers that the M's is shorter than any other.

 

Perhaps the fact that full-frame Canons, despite much longer registers, still have trouble with falloff in ultra-wide-angle lenses should alert you to the difficulties that had to be dealt with in the M8. Add to that the accommodation of lenses over 50 years old.
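A toy model of that angular limit (the acceptance angle and falloff law here are invented for illustration; Kodak has not published such numbers in this thread):

import math

def relative_response(chief_ray_deg, acceptance_deg=25.0):
    # Crude model: full response inside the acceptance cone,
    # cosine-squared rolloff on whatever exceeds it. Offset microlenses,
    # as in the M8, effectively shift this limit outward.
    excess = max(0.0, chief_ray_deg - acceptance_deg)
    return math.cos(math.radians(excess)) ** 2

for angle in (0, 20, 30, 40):
    print(angle, round(relative_response(angle), 3))
# 0 -> 1.0, 20 -> 1.0, 30 -> 0.992, 40 -> 0.933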


Yes - and a very simple solution it was: use a smaller sensor to cut out the edge and corner rays, where the problem lies, so one can use a more effective filter. Not that the R-D1 filter is that effective - the Epson R-D1 is, after the Leica M8, one of the most IR-sensitive cameras around. The main difference is that they were spared the Internet hype.

 

There is another solution - design some new retrofocus wide-angle lenses that do not suffer from the same issues, and use a full-size 35mm sensor.

 

What is cheaper for the customer - having to use a filter on every lens and effectively junking everything above 75, or buying one more 28 or 21mm?

 

I would have preferred a full-frame Leica with a decent IR filter, even if I had to buy a more expensive wide.

 

This was a commercial decision, not a technical one.

 

Edmund


There is another solution - design some new retrofocus wide-angle lenses that do not suffer from the same issues, and use a full-size 35mm sensor.

 

What is cheaper for the customer - having to use a filter on every lens and effectively junking everything above 75, or buying one more 28 or 21mm?

 

I would have preferred a full-frame Leica with a decent IR filter, even if I had to buy a more expensive wide.

 

This was a commercial decision, not a technical one.

 

Edmund

 

That would junk the existing wides as well, though, Edmund. Clearly what they attempted was to accommodate as many M lenses as they could; given a rangefinder's preferred use of wides, that seems a little severe.

 

Add to that, the math may still work out unfavourably for falloff and edge softness for pretty well half the kit, and I suspect it would.

 

That, I would think, is an engineering compromise.

 

It is not for nothing that the R-D1 is a deeper crop; yet still this full-frame obsession exists, and still, somehow, Leica are wrong.


There is another solution - design some new retrofocus wide-angle lenses that do not suffer from the same issues, and use a full-size 35mm sensor.

 

Better yet is to design and sell such lenses with built-in filters, as in the 15mm Super-Elmar, the Elmarit, and the 16mm fisheye.


A filter with a higher absorption ratio for infrared would have been a viable solution, but apparently the material with the desired characteristics doesn't exist (yet). A dichroic interference filter wouldn't just cause additional reflections, which are difficult to deal with in the tight space left between filter and lens. It would also introduce a version of the red-vignetting/cyan-corners syndrome that is much more severe than the one caused by dichroic filters in front of the lens. With a filter in front of the lens, the amount of red vignetting depends on the focal length and little else. That's because the incident angles of all the rays of light emanating from a given point will (with the possible exception of macro shots) vary only slightly. But for a filter in front of the sensor, there will be a lot of variation in the incident angle, and it will also vary with the aperture setting. The type of lens factors in as well. All in all, the cyan-corner issue is much harder, if not impossible, to solve, so a solution with the filter in front of the lens was to be preferred.

 

This seems to be garbled. The exit pupil distance is never less than the focal length, and in the modern ASPH designs it is greater than the focal length. The Elmarit 28/2.8 ASPH is close to symmetric; the 21 and 24 Elmarits and the 28 Summicron are definitely "telecentric", as evidenced either by their physical lengths or by the relative vignetting, measured on an optical bench, that Leica reports in their specs. That means that the range of angles at which light reaches the sensor is smaller than the range of angles of view at which light enters the front of the lens. So red vignetting due to a dichroic at the sensor has a smaller magnitude than that due to a filter in front.

 

As far as I have seen in actual measurements, there is no aperture variation in red vignetting, whether due to the sensor filter or a filter in front, and there are calculations that say this should be the case. There is aperture variation in luminance vignetting, and a considerable amount of lens-specific difference, so if Leica is willing to incorporate those differences in its firmware, it should be reasonable to add the exit-pupil dependence as well.

 

Your first statement sounds right -- an absorptive filter makes a lot of sense, but perhaps the extra time it would take to qualify a different supplier, a new material, or both couldn't be justified.

 

scott


Several posts above have got this wrong, so let me try to be really clear.

 

The rays of light coming out of a lens appear to leave from a point called the exit pupil. In a symmetric lens design, the exit pupil is right in the center of the lens (usually right where the aperture blades are), but in most modern designs the lens is longer and the exit pupil can sit behind the lens center. However, the gain in lens length is greater than the rearward shift of the exit pupil, so the exit pupil moves away from the film plane - the exit-pupil-to-film distance increases. It is a characteristic only of the lens design and of the distance of the object in focus.

 

The maximum angle of view depends only on the lens effective focal length. This makes it easy to predict, but it is larger than the maximum angle at which light hits the sensor.
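To illustrate the distinction with numbers: a hypothetical 28mm lens on full frame (half-diagonal ~21.6mm), with the exit pupil distances assumed for illustration:

import math

half_diag = 21.6  # mm, full-frame half-diagonal
f = 28.0          # mm, effective focal length

# Half angle of view at infinity: set by focal length alone.
print(round(math.degrees(math.atan(half_diag / f)), 1))  # ~37.6 deg

# Chief-ray angle at the sensor corner: set by the exit pupil distance.
for exit_pupil_mm in (28.0, 40.0):  # symmetric vs. more retrofocus design
    cra = math.degrees(math.atan(half_diag / exit_pupil_mm))
    print(exit_pupil_mm, round(cra, 1))
# 28.0 -> 37.6 (equal, the symmetric case), 40.0 -> 28.4 (shallower incidence)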

 

OK?

 

scott


The exit pupil distance is never less than the focal length, and in the modern ASPH designs it is greater than the focal length. The Elmarit 28/2.8 ASPH is close to symmetric; the 21 and 24 Elmarits and the 28 Summicron are definitely "telecentric", as evidenced either by their physical lengths or by the relative vignetting, measured on an optical bench, that Leica reports in their specs. That means that the range of angles at which light reaches the sensor is smaller than the range of angles of view at which light enters the front of the lens. So red vignetting due to a dichroic at the sensor has a smaller magnitude than that due to a filter in front.

There seems to be a misunderstanding here. I wasn't talking about the maximum incident angle, but rather about the distribution of angles within a cone of light converging on the sensor. For all the light emanating from a point in the scene and converging on a point in the sensor plane, there is not one incident angle but a range of angles - not just the chief ray but also the marginal rays have to be considered.

 

For example, let's consider a lens that is telecentric on the image side. The chief rays will be perpendicular to the sensor, but the marginal rays will not, and their angle depends on the aperture. As a result, we would get a cyan cast that is uniform across the image, with an intensity depending on the aperture. With non-telecentric lenses, there will be red vignetting, the amount of which depends on the aperture.


There seems to be a misunderstanding here. I wasn't talking about the maximum incident angle, but rather about the distribution of angles within a cone of light converging on the sensor. For all the light emanating from a point in the scene and converging on a point in the sensor plane, there is not one incident angle but a range of angles - not just the chief ray but also the marginal rays have to be considered.

 

For example, let's consider a lens that is telecentric on the image side. The chief rays will be perpendicular to the sensor, but the marginal rays will not, and their angle depends on the aperture. As a result, we would get a cyan cast that is uniform across the image, with an intensity depending on the aperture. With non-telecentric lenses, there will be red vignetting, the amount of which depends on the aperture.

 

You are, in absolute terms, perfectly right. But for the cyan cast to appear and be a problem, the angle(s) of incidence of the light rays on the sensor must be large. This is not the case for the effect generated by the lens aperture, because this aperture is small.

Take a worst-case symmetric 28mm f/2. The exit pupil is 14mm in diameter; divided by 2, that is 7mm. The distance from pupil to sensor is 28mm, and arctan(7/28) is 14 degrees, which does not create any practical cyan problem at the center. Going to the side of the sensor, it is true that these 14 degrees add to the angle of the central ray on one side of the cone, but at the same time they subtract on the other side, so the mean value is always that of the central ray.
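Sergio's arithmetic checks out in a few lines of Python (the 16mm image height is an added example, roughly the M8 corner):

import math

pupil_r, pupil_dist = 7.0, 28.0  # mm: the symmetric 28mm f/2 worst case

# At the center: marginal rays arrive at arctan(7/28) = ~14 degrees.
print(round(math.degrees(math.atan(pupil_r / pupil_dist)), 1))  # 14.0

# At an image point 16mm off-axis (near the M8 corner):
h = 16.0
upper = math.degrees(math.atan((h + pupil_r) / pupil_dist))  # ~39.4 deg
lower = math.degrees(math.atan((h - pupil_r) / pupil_dist))  # ~17.8 deg
chief = math.degrees(math.atan(h / pupil_dist))              # ~29.7 deg
print(round((upper + lower) / 2, 1), round(chief, 1))        # 28.6 vs 29.7

# The two sides of the cone cancel exactly in tangent space
# ((23 + 9) / (2 * 28) = 16/28) and very nearly in angle, which is
# Sergio's point: the mean stays at the central ray.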

Please consider that English is not my language, and also that I am a philosophy prof...

Sergio


Please consider that English is not my language, and also that I am a philosophy prof...

Sergio

 

Sergio, you are right, and perfectly clear. The effect he is citing is small compared to the overall cast that appears at the edges of a wide-angle image. Joseph Wisniewsky pointed out some months ago that, according to his calculations, the aperture-dependent effects mostly cancel out. I have taken measurements from Sean Reid's red-vignetting studies and plotted them at various apertures. There seems to be no aperture dependence in the relative red vignetting, although there is in the overall vignetting. I wrote this up and posted it on Sean Reid's site, http://www.reidreviews.com , where it is linked from the very end of his article on 28mm lenses.

 

scott


Sergio, you are right, and perfectly clear. The effect he is citing is small compared to the overall cast that appears at the edges of a wide-angle image. Joseph Wisniewsky pointed out some months ago that, according to his calculations, the aperture-dependent effects mostly cancel out. I have taken measurements from Sean Reid's red-vignetting studies and plotted them at various apertures. There seems to be no aperture dependence in the relative red vignetting, although there is in the overall vignetting. I wrote this up and posted it on Sean Reid's site, http://www.reidreviews.com , where it is linked from the very end of his article on 28mm lenses.

 

scott

 

Nice to hear from you again, Scott.

I know that you are a happy user of your M8. I am planning to post a comparison of the Canon 35/2 and the 35 Cron ASPH that is a little surprising.

My site is now operational.

Regards.

Sergio


Yes, the sensor sees IR insofar as the photodiodes are sensitive to IR. But it cannot see IR in the sense that IR could be distinguished from red, green, or blue. Of the red, green, and blue filters of the Bayer array, the red and blue filters have a higher IR transmission than the green filters, and thus IR is seen as magenta (red + blue).

Ah, but then the problem should be addressed at the Bayer array's red and blue filters, lowering their IR transmission to better match the green's. I assume there's a reason this hasn't been done, so perhaps there is an art to balancing the overall transmission.
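A sketch of the magenta arithmetic, with invented leakage factors rather than measured CFA data:

import numpy as np

ir_leak = np.array([0.30, 0.05, 0.20])  # assumed IR leakage of the R, G, B filters

scene_grey = np.array([0.5, 0.5, 0.5])  # neutral subject, no IR present
with_ir = scene_grey + 0.5 * ir_leak    # same subject plus an IR component

print(with_ir)  # [0.65 0.525 0.6] - R and B are lifted over G: magenta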

