Guest mc_k Posted June 14, 2010 Share #41

Thanks for the info.
luigi bertolotti Posted June 14, 2010 Share #42

Sorry, I am entering this interesting discussion late... I would like to get one point clear in my mind, for I never thought in depth about when and WHY a certain lens really starts to suffer from diffraction. Jaap's Wikipedia link gives a very simple formula:

"The ability of an imaging system to resolve detail is ultimately limited by diffraction. This is because a plane wave incident on a circular lens or mirror is diffracted as described above. The light is not focused to a point but forms an Airy disk having a central spot in the focal plane with radius to first null of x = 1.22 λN, where λ is the wavelength of the light and N is the f-number (focal length divided by diameter) of the imaging optics."

I take this to be RIGHT, i.e. the diameter of the Airy disk depends simply and linearly on the f-stop, so diffraction is always minimal when the lens is wide open, and this is completely independent of other factors related to lens design. But, as clearly stated above, it is a LIMIT on lens resolution. Given that, on a certain lens, the correction of aberrations (which "makes" the resolution) is, generally speaking, better the more the diaphragm is closed (since the math that describes the corrections, afaik, gives the best results for light rays that are not far from the lens axis), there is, for a given design, a point (an f-stop) at which resolution stops improving and starts to worsen, because the linear formula above starts to "command" the resolution. All right up to this point?

So, if a lens has excellent resolution, it can happen that diffraction starts to take its toll at the first f-stop after wide open, while on "roughly" designed lenses it may never dominate even at full closure (about 35 years ago I used a very modest 6x9 folder given to me by an uncle... an unnamed 105 mm f/6.3 lens made of 3 elements... I remember it performed better at f/22 than at f/8).
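The formula above can be sketched numerically. A minimal Python example (the 6 µm pixel pitch is an assumed value for illustration, not taken from any particular camera) computing the Airy disk diameter 2 × 1.22 λN for green light at several f-stops:

```python
# Airy disk diameter to first null: d = 2 * 1.22 * wavelength * N
WAVELENGTH_M = 550e-9  # green light, in metres

def airy_disk_diameter_um(f_number, wavelength=WAVELENGTH_M):
    """Diameter of the Airy disk (to first null) in micrometres."""
    return 2 * 1.22 * wavelength * f_number * 1e6  # metres -> micrometres

PIXEL_PITCH_UM = 6.0  # assumed sensor pixel pitch, for illustration only

for n in (2.8, 5.6, 11, 22):
    d = airy_disk_diameter_um(n)
    note = "larger than 2 pixels" if d > 2 * PIXEL_PITCH_UM else "below 2 pixels"
    print(f"f/{n:<4} Airy disk = {d:5.1f} um  ({note})")
```

As the linear relation predicts, each full stop roughly multiplies the disk diameter by √2, with no dependence on the lens design at all.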
What puzzles me (and I would like some hints from the experts) is the question of chromatic aberration. I know the theory: light rays of different colours focus on different planes, and there are several classic design methods to attenuate this. Given that the diffraction effect (= radius of the Airy disk) also depends on wavelength (= colour), can it be that, at a certain f-stop, the effects of chromatic aberration and diffraction are at levels where one nearly compensates the other, so that THIS ends up being the optimum f-stop in this sense? I mean... it's a matter of radii... points of different colours, being in focus on different planes, have different radii at the neg/sensor plane... but the radius "originated" by diffraction is different for each of them (owing to the different wavelengths)... could it be that, at a certain stop, the respective radii are almost the same (= at a minimum of their difference), making it the "ideal" setting? Sorry in advance if this is a trivial or nonsense question...
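The wavelength dependence asked about here is easy to quantify with the same first-null formula x = 1.22 λN. A small sketch comparing the Airy radius for blue, green, and red light at one f-stop (the wavelengths are round illustrative values):

```python
# First-null Airy radius x = 1.22 * wavelength * N, returned in micrometres
def airy_radius_um(wavelength_nm, f_number):
    return 1.22 * wavelength_nm * 1e-3 * f_number  # nm -> um via 1e-3

F_NUMBER = 8  # example f-stop
for colour, wl_nm in (("blue", 450), ("green", 550), ("red", 650)):
    r = airy_radius_um(wl_nm, F_NUMBER)
    print(f"{colour:5} ({wl_nm} nm): r = {r:.2f} um")
```

Note that the ratio between the three radii is fixed by the wavelengths alone: stopping down scales all three disks together rather than bringing them closer to equal, which is relevant to the objection raised further down the thread.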
jaapv Posted June 14, 2010 Share #43

It is not a nonsense question. Chromatic aberration, both type 1 and type 2, shows up as circles, coloured ones in this case, around the Lichtberg of the Airy disk. I think you mean that those "waves" would synchronize with the Airy disk interference pattern. It is quite possible, I think; I have never seen it mentioned, but maybe it would only work for specific wavelengths.
01af Posted June 14, 2010 Share #44

"[...] All right up to this point?"

Yes.

"What puzzles me (and I would like some hints from the experts) is the question of chromatic aberration: [...] given that the diffraction effect (= radius of the Airy disk) also depends on wavelength (= colour), can it be that, at a certain f-stop, the effects of chromatic aberration and diffraction are at levels where one nearly compensates the other, so that THIS ends up being the optimum f-stop in this sense?"

A math professor sits in a room with three students. All of a sudden, five students get up and leave. The professor thinks to himself: if two more entered this room now, I'd be all alone at last.

A clever combination of glasses with normal dispersion, low dispersion, and anomalous dispersion can reduce or cancel out chromatic aberrations. Dispersion is anomalous (not normal) when it has a negative sign across at least part of the visible spectrum. Normal glass refracts blue more than green, green more than yellow, and yellow more than red. In a glass with anomalous dispersion, this behaviour is reversed, at least in certain parts of the spectrum. So you have a value with a positive sign and another with a negative sign, which can cancel each other out. But you cannot have an Airy disk with a negative radius that would add up to zero with another disk of positive radius. And you wouldn't want that anyway, because if the Airy disks were cancelled out entirely, there would be no image at all.
luigi bertolotti Posted June 15, 2010 Share #45

"Yes. A math professor sits in a room with three students. All of a sudden, five students get up and leave. The professor thinks to himself: if two more entered this room now, I'd be all alone at last. [...] But you cannot have an Airy disk with a negative radius (*) that would add up to zero with another disk of positive radius. And you wouldn't want that anyway, because if the Airy disks were cancelled out entirely, there would be no image at all."

(*) That is clear... the existence of negative numbers is certain; that of negative dimensions is still to be proved. But I vaguely wondered about a NORMALIZATION of the radii due to the two effects... imagine (with a certain exaggeration) 3 coloured points that, due to chromatic aberration, have 3 different radii in the theoretical focus plane, but, due to diffraction, also get 3 different radii... the two effects sum (algebraically? or is it the well-known "sum of inverses" formula quoted elsewhere?) in such a way that the total radii are "the same", or anyway, for certain wavelengths (and a certain f-stop), very very similar... i.e., the colours are equally rendered.
jaapv Posted June 15, 2010 Share #46

Anyway, if the professor and the students were elementary particles, the anecdote would probably be perfectly possible :D
mjh Posted June 19, 2010 Share #47

"But I vaguely wondered about a NORMALIZATION of the radii due to the two effects... the two effects sum in such a way that the total radii are "the same", or anyway, for certain wavelengths (and a certain f-stop), very very similar... i.e., the colours are equally rendered."

It cannot possibly work. Lateral chromatic aberration results in different magnification factors for different wavelengths, so the red light from some small object gets projected onto one point in the image while the blue light from the same object gets projected onto another point. Diffraction also depends on the wavelength, so basically the amount of blurriness varies with the wavelength. This means that those dislocated red and blue points depicting the same small object will show different amounts of blur, but diffraction does nothing to reunite those points, except in some kind of general mush if you take diffraction to its extremes. There is no way diffraction could act as an anti-CA.
mjh Posted June 19, 2010 Share #48

"Not sure how you got from Airy disks to the formula above, which I understand as a rule of thumb."

The formula R = 1 / (1/R1 + 1/R2) is indeed useful in many contexts. For example, it could also be used to describe how the total write speed of a camera depends on the speed of the camera electronics and the speed of the card. It would predict that even a slow camera will profit from an expensive high-speed card, but only marginally, which turns out to be verifiable in practice. One can think of the smaller of the two factors R1 and R2 as the system's bottleneck. If you want to improve things overall, you should concentrate on the bottleneck.

Say you have a camera with a sensor of a certain resolution. Used with a mediocre lens of lower resolution than the sensor's, the images taken with this set-up will turn out equally mediocre. When you invest in a better lens that does justice to the sensor resolution, this will pay off, as the total image resolution will be much improved. Investing in a still better (and much more expensive) lens would give you images of even higher quality, but the improvement would be smaller, and you might come to the conclusion that it wasn't worth the expense.

If you draw a curve of the total resolution as a function of the resolution of one component (say the lens), keeping the resolution of the other component (say the sensor) constant, you can distinguish roughly four parts of the resulting curve: In the first part, any increase in lens resolution leads to a marked improvement in the overall resolution. In the second part, overall resolution continues to increase visibly, but despite the effort put into improving the lens resolution, the overall effect is diminishing. In the third part, while the overall resolution still shows some measurable increase, you would have trouble discerning it visibly. In the fourth part (all the rest of the curve up to infinity), even measurements stop showing the increase in resolution that still exists, theoretically.

In reality (the reality in which you have to make buying decisions and in which the amount of money you can spend is limited), you would also need to associate cost functions with R1 and R2. Typically the cost will not grow linearly with resolution, but more like exponentially. Combining those cost functions with the R = 1 / (1/R1 + 1/R2) formula tells you that one quickly reaches a point where further improvement of one component is a waste of money; investing the same amount in improving the other component would yield a much higher return on investment. In other words, it pays to invest in better lenses, but there is a point at which you should leave the lenses alone and rather invest in a higher-resolution sensor. And of course the same principle also applies in the other direction: there are times when increasing the sensor resolution would offer no real benefit, as either the resolution bottleneck is in the lenses, or the theoretical increase in sensor resolution is more than offset by the increase in noise and the decrease in dynamic range that goes along with smaller pixels.
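The bottleneck behaviour described above is easy to play with numerically. A minimal sketch of the rule of thumb (the lp/mm figures are made-up illustrative values, not measurements of any real lens or sensor):

```python
def combined_resolution(r_lens, r_sensor):
    """Rule-of-thumb system resolution: R = 1 / (1/R1 + 1/R2)."""
    return 1 / (1 / r_lens + 1 / r_sensor)

SENSOR_LPMM = 80  # illustrative sensor resolution, line pairs per mm

# Watch the diminishing returns as the lens gets better and better:
for lens_lpmm in (40, 80, 160, 320):
    total = combined_resolution(lens_lpmm, SENSOR_LPMM)
    print(f"lens {lens_lpmm:3} lp/mm -> system {total:5.1f} lp/mm")
```

With these numbers, doubling the lens from 160 to about 320 lp/mm gains the system only about 11 lp/mm, because the sensor is the bottleneck; the system resolution can approach but never exceed the 80 lp/mm of the weaker component.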
This topic is now archived and is closed to further replies.