APO Summicron 50 ASPH review


helged



Ok, señor. I suppose I should get you in touch with my university physics professors. Their teaching methods and knowledge apparently need to be updated.

 

I should add my chem professor's contact info as well (since he used the same analogy).


That's right—but "not much" still is a lot more than "not at all."

 

 

 

....

 

?

 

"Not much more" means "not much more," in English or mathematically, so it never means "a lot more."


Sigh ... :rolleyes:

 

This is exactly NOT how it works.

 

So don't expect him to bother you with elementary truths.

 

so then please prove it or provide a reference, as someone else has asked...


so then please prove it or provide a reference, as someone else has asked...

 

I think 01af means that, for instance, in film photography, if you had a top lens but a mediocre or bad enlarger lens in your darkroom to print with, your top lens would not shine in the print.

This chain reasoning cannot be applied to lens and sensor, because a better lens will still give better results even on a 'bad' sensor.

The bigger difference between the Nocti and the APO Summicron on the MM is due to the fact that the Nocti is not an APO, which is more critical in B&W.


I think 01af means that, for instance, in film photography, if you had a top lens but a mediocre or bad enlarger lens in your darkroom to print with, your top lens would not shine in the print.

Arrgh! No, this definitely is NOT what I mean. :mad:

 

 

This chain reasoning cannot be applied to lens and sensor because a better lens will still give better results ...

This "chain reasoning" can NEVER be applied to chains of bandwidth-limited information-transmitting devices.


 

So if you switch from a good lens to a better lens then the sharpness of the final image will improve, no matter what sensor you're using.

 

 

 

What I said is still true.



?

 

"Not much more" means "not much more," in English or mathematically, so it never means "a lot more."

:confused: Quantifying unquantifiable concepts now? Define a non-subjective difference between "not much" and "a lot," please.

?

 

"Not much more" means "not much more," in English or mathematically, so it never means "a lot more."

 

Mathematically there is no "not much more." You actually try to limit the error and quantify it to be as low as possible. Now, if I understand you and Olaf correctly, you both acknowledge the error, but you say it is small while Olaf says it is actually significant. It would be good if both of you could quantify it :D


Mathematically there is no "not much more." You actually try to limit the error and quantify it to be as low as possible. Now, if I understand you and Olaf correctly, you both acknowledge the error, but you say it is small while Olaf says it is actually significant. It would be good if both of you could quantify it :D

 

In the absence of anything else constructive, just use the usual definition 1/n = 1/a + 1/b, where a, b are the resolutions of the two components and n is the net resolution. This is good enough in most cases, and good enough for our purposes. Let "a" be the lowest-res component and ka, k > 1, be the other. Then

 

n = 1 / (1/a + 1/(ka)) = ak / (k + 1) < a

 

and you see the net resolution is bounded by that of the lowest res component. And so you can use terms like "limiting factor" and "wasted resolution."

 

Now to see how the author knows his lens is high res compared to his sensor/film, let "a" be any sensor resolution and ka be the lens resolution; this time k>0. Then differentiate n with respect to "a" to get k/(k + 1); that is how much the net resolution changes with a change in sensor resolution. So if the lens is low res compared to the sensor, i.e. k is small enough, the net resolution will not change much when you swap in sensors of various resolutions.
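The two claims above can be checked numerically. A minimal Python sketch of the quoted combination rule; all numbers are arbitrary illustrative values (lp/mm), not measurements:

```python
# Minimal sketch of the 1/n = 1/a + 1/b combination rule quoted above.
# All numbers are arbitrary illustrative values (lp/mm), not measurements.

def net_resolution(a: float, b: float) -> float:
    """Net resolution n of two cascaded components: 1/n = 1/a + 1/b."""
    return 1.0 / (1.0 / a + 1.0 / b)

# With b = k*a and k = 2, the closed form gives n = a*k/(k+1) = 80*2/3.
print(net_resolution(80.0, 160.0))  # 53.33... -- below the weaker component (80)

# Sensitivity: with a low-res lens fixed, swapping in much sharper
# sensors barely moves the net resolution.
lens = 40.0
for sensor in (80.0, 160.0, 320.0):
    print(sensor, round(net_resolution(sensor, lens), 2))
```

Doubling the sensor resolution twice over only nudges the net result upward, which is the "wasted resolution" point.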


120,

Please post something worthy of being published in the International Journal of Optics and Applications, or The Optical Society, or other acknowledged scientific publications, since apparently anything else is not good enough.

 

 

In the absence of anything else constructive, just use the usual definition 1/n = 1/a + 1/b, where a, b are the resolutions of the two components and n is the net resolution. This is good enough in most cases, and good enough for our purposes. Let "a" be the lowest-res component and ka, k > 1, be the other. Then

 

n = 1 / (1/a + 1/(ka)) = ak / (k + 1) < a

 

and you see the net resolution is bounded by that of the lowest res component. And so you can use terms like "limiting factor" and "wasted resolution."

 

Now to see how the author knows his lens is high res compared to his sensor/film, let "a" be any sensor resolution and ka be the lens resolution; this time k>0. Then differentiate n with respect to "a" to get k/(k + 1); that is how much the net resolution changes with a change in sensor resolution. So if the lens is low res compared to the sensor, i.e. k is small enough, the net resolution will not change much when you swap in sensors of various resolutions.

 

 

Good enough = subjective. It invalidates the rest of the post, I fear.

120,

Please post something worthy of being published in the International Journal of Optics and Applications, or The Optical Society, or other acknowledged scientific publications, since apparently anything else is not good enough.

 

I got the formula from a SPIE book, actually. It's the one usually used to illustrate the basic principle; I don't think it claims to do more. I did not have time to look for anything else.

 

I used to edit for SIAM, but not the imaging sciences journal.


In the absence of anything else constructive, just use the usual definition 1/n = 1/a + 1/b, where a, b are the resolutions of the two components and n is the net resolution. This is good enough in most cases, and good enough for our purposes. Let "a" be the lowest-res component and ka, k > 1, be the other. Then

 

n = 1 / (1/a + 1/(ka)) = ak / (k + 1) < a

 

and you see the net resolution is bounded by that of the lowest res component. And so you can use terms like "limiting factor" and "wasted resolution."

 

Now to see how the author knows his lens is high res compared to his sensor/film, let "a" be any sensor resolution and ka be the lens resolution; this time k>0. Then differentiate n with respect to "a" to get k/(k + 1); that is how much the net resolution changes with a change in sensor resolution. So if the lens is low res compared to the sensor, i.e. k is small enough, the net resolution will not change much when you swap in sensors of various resolutions.

 

 

Good enough = subjective. It invalidates the rest of the post, I fear.

-- -- --

 

I got the formula from a SPIE book, actually. It's the one usually used to illustrate the basic principle; I don't think it claims to do more. I did not have time to look for anything else.

 

I used to edit for SIAM, but not the imaging sciences journal.

 

120,

Humor really doesn't come through here does it?


:confused: Quantifying unquantifiable concepts now? Define a non-subjective difference between "not much" and "a lot," please.

 

I was not ignoring your posts ... the redo is:

Pt. 1

For resolutions x and y, 1/x + 1/y is bigger than 1/x and 1/y, so net res 1/(1/x + 1/y) is smaller than x and y.

Pt. 2

If x is small (low res) and y is big, then 1/x is big and 1/y is small, so changing y has relatively little effect on 1/x + 1/y, and changes in resolution will be harder to notice.
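Pt. 1 and Pt. 2 can be checked with a few arbitrary numbers (a hypothetical sketch, not measured data):

```python
# Numeric check of Pt. 1 and Pt. 2 above; x is the low-res bottleneck.
def net(x: float, y: float) -> float:
    return 1.0 / (1.0 / x + 1.0 / y)

x = 30.0
for y in (60.0, 120.0, 240.0):   # progressively better second component
    n = net(x, y)
    assert n < x and n < y       # Pt. 1: net is below both components
    print(y, round(n, 2))        # gains shrink as y grows (Pt. 2)
```

Note the net keeps improving as y improves, so "not at all" would be wrong, but each doubling of y buys less, which is "not much."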


01af,

Am I correct in thinking that this could be expressed by cascading the MTF graphs of the components within the imaging chain (although MTF data for sensors doesn't seem to be too easy to obtain)? So what may be shown is that any better lens, with a higher MTF, will, when combined with a sensor's MTF, result in a cascaded MTF reduced by a smaller amount than it would be by a poorer lens? (It's a very long time since I was involved with MTF testing!)

 

Yes, exactly.

 

I don't think this is correct. If the MTF of a shared component is low in a region, then the MTFs of the two systems will not differ appreciably there. Again, there is often a limiting factor in the system.
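One way to picture the cascading question: the system MTF at each spatial frequency is (approximately) the product of the component MTFs. A toy sketch, where the curve model and the f50 values are my own invented assumptions, not measured data:

```python
# Toy MTF cascade: system MTF ~= product of component MTFs at each frequency.
# The curve model and the f50 values are invented for illustration only.

def mtf(f: float, f50: float) -> float:
    """Toy MTF curve falling to 0.5 at spatial frequency f50."""
    return 0.5 ** ((f / f50) ** 2)

sensor_f50 = 40.0
for f in (10.0, 20.0, 40.0, 80.0):
    better = mtf(f, 60.0) * mtf(f, sensor_f50)   # sharper lens x sensor
    poorer = mtf(f, 30.0) * mtf(f, sensor_f50)   # weaker lens x sensor
    print(f, round(better, 3), round(poorer, 3))
```

The better lens lifts the cascaded MTF at every frequency, but in the region where the sensor's own MTF has collapsed the absolute gap between the two systems shrinks, which is roughly what both sides of this exchange are pointing at.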

 

I have looked for confirmation of your idea Olaf and cannot find it anywhere, so I'm asking again for a reference. Thanks...

