MTF Curves ~ Sensor Resolution

k-hawinkler

But I don't agree with this bizarre statement, quote:

"Describing this in terms of one component outresolving the other, while technically incorrect, gets the point across."

 

No, it doesn't get the point across.

 

Statement 1 is plain and simple confusing and wrong!

It is wrong, but in a mostly harmless way. Whether it is confusing would depend on the point you want to get across.

 

The real problem with the imagery invoked by talking about ‘outresolving’ in this context is that it suggests some kind of competition when of course none exists. Lens and sensor do not compete; rather, they work together to create an image. The sensor alone cannot create an image, and while the lens alone could, its image would be ephemeral. Talking about lenses and sensors outresolving each other would be tantamount to talking about mothers or fathers being better at producing offspring.

 

But the impression conveyed when claiming that, say, a certain lens ‘outresolves’ a certain sensor is this – imagine two team-mates squabbling: The lens could say it didn’t need to get any better; it was already better than the sensor and the only way the two as a team could improve would be by the sensor improving its resolution.

 

And that would be technically incorrect. The performance of the team would actually improve even if just the lens did improve and the already ‘outresolved’ sensor did nothing about its performance. However, the net gain in the performance of the team would be minimal then, despite all the efforts of the lens. Whereas if the sensor did improve while the lens did nothing, the overall jump in performance could be huge.

 

So the ‘outresolving’ imagery provides useful advice, even when it isn’t technically correct: it correctly suggests which component to spend money on when trying to improve overall performance/resolution.
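One way to make this "team performance" point concrete is a common rule of thumb for combining resolutions: 1/R_sys² = 1/R_lens² + 1/R_sensor². To be clear, this quadrature formula and the example numbers are my own illustration, not something stated above, and it is only an approximation. A quick sketch:

```python
import math

def system_resolution(lens_lpmm, sensor_lpmm):
    """Rule-of-thumb combined resolution: 1/R_sys^2 = 1/R_lens^2 + 1/R_sensor^2."""
    return 1.0 / math.sqrt(1.0 / lens_lpmm**2 + 1.0 / sensor_lpmm**2)

# A 300 lp/mm lens on a 100 lp/mm sensor: the system manages ~95 lp/mm.
print(round(system_resolution(300.0, 100.0)))  # 95

# Improving the already 'outresolving' lens to 500 lp/mm barely helps:
print(round(system_resolution(500.0, 100.0)))  # 98

# Improving the sensor to 200 lp/mm instead gives a huge jump:
print(round(system_resolution(300.0, 200.0)))  # 166
```

The numbers show exactly the asymmetry described above: upgrading the weaker component moves the system far more than upgrading the stronger one, yet upgrading either one always helps a little.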


So, I eventually have to get a display matching that machine's performance.

Any recommendations will be welcome.

 

K.H.,

 

The screen of the iMac with 5K (5120x2880) Retina display is driven by an AMD Radeon R9 M290X graphics chip. However, the spec sheet for this chip lists the maximum resolution as 4K only. I cannot figure out what enabling magic lets Apple drive this 5K screen.

 

In the fall of 2012 I bought the first-generation Apple 15" MacPro with Retina display, featuring an Nvidia graphics chip that drives a native 2880x1800 screen, and I use the not-so-powerful but easy-to-use editing software "Aperture" to post-process my raw files.

 

After I convert them into JPEG files in the AdobeRGB color space, I browse the JPEGs on the Dell 27" UltraSharp 2711 (2560x1440 pixels, set to AdobeRGB) powered by an AMD V4900 graphics card (on a PC) that supports 2560x1440 output. If a JPEG file is not to my satisfaction I return to Aperture for fine tuning. Normally it takes no more than three or four iterations. As a hobbyist I think that's enough.

 

As the 15" MacPro also supports 2560x1440 output via an HDMI port, I did once connect an HDMI cable to the Dell monitor, but the rendering did not match that driven by the V4900. As alternatives, you may also consider the better-regarded Eizo or NEC monitors.

 

My files look much more vivid, with better 3D perspective, not to mention the details, on the 5K screen than on the Dell screen. The 5K monitor is really alluring! It is an excellent tool to explore the potential of the A7R + Leica lenses.

 

Dell also brought out a 5K monitor, the UP2715K, in September. However, I don't know which graphics card could drive it.

 

Best Regards,

 

Thomas Chen


It is wrong, but in a mostly harmless way. [...] So the ‘outresolving’ imagery provides useful advice, even when it isn’t technically correct: it correctly suggests which component to spend money on when trying to improve overall performance/resolution.

 

 

Thanks Michael for the technical explanations. Very clear. I agree.

 

Where I don't agree is that no harm is done by Statement 1. I personally never attached, or intended to attach, to the term "outresolve" anything other than a comparison of values; namely larger, equal, smaller. Of course creating an image depends on the functioning of sensor and lens and other things. So a correct physical description of how a lens and sensor work jointly to produce an image is a good starting point. That analysis should also include a sensitivity discussion of the circumstances under which it is more beneficial to improve one component rather than the other.

 

I think a lot of misunderstanding in this thread could have been avoided by correctly stating the physical facts and their implications without erroneous oversimplification.

 

Thanks again.


K.H.,

The screen of the iMac with 5K (5120x2880) Retina display is driven by an AMD Radeon R9 M290X graphics chip. [...]

Thomas Chen

 

Thanks Thomas.

 

Did you mean to refer to a Mac Pro or MacBook Pro?

 

Also, if I am not mistaken, even Thunderbolt 2 doesn't have the bandwidth necessary to drive the display of the 5K Apple iMac with Retina display as an external display.

 

So, at this stage I am not sure which display to get for my late 2013 Mac Pro.


Please note that the Retina display does not even quite render sRGB (99%, basically close enough) and covers only 88% of Adobe RGB, making it unsuitable for advanced photo editing. Besides, the contrast cannot be turned down enough to get a suitable greyscale. The screen is designed to give an ultrasharp, contrasty, poppy rendering, exactly what one does not want for editing.

The Dell UltraSharp Premier screens are much better in that respect, with 99% AdobeRGB.

Eizo CG and NEC Spectraview are still ahead because the colour constancy across the screen is better.

 

Colour space, colour constancy and contrast are the criteria here. Resolution is less important.

Edited by jaapv

Please note that the Retina display does not even quite render sRGB (99%, basically close enough) but covers only 88% of Adobe RGB, making it unsuitable for advanced photo editing.

 

 

Jaap,

 

Thanks. That's my understanding as well.

 

For me it sure would be nice if displays were available that could show every pixel of an entire 36 or even 50 MP image. :D
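For scale: a 36 MP A7R file is 7360x4912 pixels, while even a 5K panel is only 5120x2880 (~14.7 MP), so a display that shows every pixel at 1:1 is still some way off. (These pixel counts are my own figures from published specs, not from the posts above.) A trivial sketch:

```python
def shows_every_pixel(image_wh, display_wh):
    """True if the display can show the whole image at 1:1 (one image pixel per screen pixel)."""
    return image_wh[0] <= display_wh[0] and image_wh[1] <= display_wh[1]

A7R = (7360, 4912)         # ~36 MP
DISPLAY_5K = (5120, 2880)  # ~14.7 MP

print(shows_every_pixel(A7R, DISPLAY_5K))  # False: a 5K panel holds well under half the pixels
```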


Thanks Thomas.

Did you mean to refer to a Mac Pro or MacBook Pro? [...] So, at this stage I am not sure which display to get for my late 2013 Mac Pro.

 

Sorry, what I was referring to is the MacBook Pro with Retina display.

 

Yes, the 5K Apple iMac cannot play the role of an external display. I'm not sure whether a DisplayPort cable with sufficient bandwidth (a DisplayPort 1.3 cable?) is available for the Dell 5K monitor.


Eizo CG and NEC Spectraview are still ahead because the colour constancy across the screen is better.

 

Colour space, colour constancy and contrast are the criteria here. Resolution is less important.

 

Jaap,

 

Thanks a lot for your comment. You are right.

 

I'll edit my files based on the Dell 2711 (98% AdobeRGB) and browse them on the 5K iMac to get the best of both sides. I just want to jump the gun to get the "Decca Sound" quality image perception.

 

All the Best,

 

Thomas Chen


... For example the Sony sensor in the Nikon D800/E or Sony A7R has about 100 line pairs per mm (lppm) resolution, whereas the APO-R 280/4 has about 500 lppm.

 

Sorry for probably being OT but I'm intrigued by this resolution of "500 lppm". May I ask where this number comes from?

 

I may be wrong but I seem to remember that a diffraction-limited f/4 lens would have a cutoff frequency (in the MTF curve) of about 450 lppm. Close to 500, true - but is the Apo-Telyt 280/4 indeed diffraction-limited? (And "cutoff" seems to mean that there is basically no contrast left at this spatial frequency.)

 

Taking the idea of a diffraction-limited lens a tiny bit further one might estimate that for f/4, 40 lppm would give a diffraction-limited MTF value of about 89%, 20 lppm one of about 94%. Leica's datasheet, on the other hand, gives MTF values of about 80 and 90%, respectively, which are _extremely_ respectable but do not seem to be diffraction-limited. They also seem to point to a "real" cutoff frequency of about 250 lppm for that lens.
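These estimates can be checked with the standard formula for the diffraction-limited MTF of a circular aperture under incoherent illumination, MTF(ν) = (2/π)[arccos(ν/νc) − (ν/νc)√(1 − (ν/νc)²)], with cutoff νc = 1/(λN). A sketch (assuming λ = 550 nm; the wavelength choice is my assumption):

```python
import math

def diffraction_mtf(freq_lpmm, f_number, wavelength_mm=550e-6):
    """Diffraction-limited MTF of a circular aperture (incoherent illumination)."""
    cutoff = 1.0 / (wavelength_mm * f_number)  # cutoff frequency in lp/mm
    x = freq_lpmm / cutoff
    if x >= 1.0:
        return 0.0  # no contrast at or beyond the cutoff
    return (2.0 / math.pi) * (math.acos(x) - x * math.sqrt(1.0 - x * x))

# f/4 at 550 nm: cutoff = 1 / (0.00055 mm * 4), i.e. the "about 450 lppm" figure
print(round(1.0 / (550e-6 * 4)))          # 455
print(round(diffraction_mtf(40, 4), 2))   # 0.89 at 40 lp/mm
print(round(diffraction_mtf(20, 4), 2))   # 0.94 at 20 lp/mm
```

So the ~450 lppm cutoff and the 89%/94% values quoted above all follow from the same formula.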

 

And since one would like to transfer some finite contrast one might end up at something like 40% contrast at 125 lppm. This still shows (to me) how very, very capable that lens is. And at the same time, it seems rather well-matched to the Sony sensor mentioned above, doesn't it?

 

There may be a lot wrong with my thoughts and little estimates. All the more I'm interested in learning where the original 500 lppm value comes from... Thanks for any pointers!

 

(And now back to topic.)


Sorry for probably being OT but I'm intrigued by this resolution of "500 lppm". May I ask where this number comes from? [...]

Many thanks stefans4 for bringing us back on topic. Great question.

 

When I wrote that down I thought I had read somewhere a sentence by Erwin Puts claiming "almost diffraction limited wide open" and 1000 lines per mm (lpm) resolution for the APO-R 280/4. Anyway, that stuck in my mind.

 

I also have on my computer a pdf file, entitled:

 

Leica R-Lenses

by Erwin Puts

September 2003

Chapter 3: 180 mm and 280 mm lenses

__ LEICA APO-ELMARIT-R 180 mm f/2.8

__ LEICA APO-TELYT-R 280 mm f/4

Among other things he writes, quote:

 

"Even so, the optical quality of the 280 mm lens is higher. Here we can detect the limit of the MTF graphs when we restrict ourselves to 40 Lp/mm as the highest frequency. There are sound arguments for this limit, but when dealing with very high performance lenses, the information may not be as we want it to be. The 280 mm f/4 Apo-Telyt-R lens is one of the very few lenses that is truly diffraction-limited. This means that the optical aberrations are so small that the size and shape of the image point is governed solely by physical laws. The absolute limit can be found at 450 Lp/mm. The most amazing feature is the following: a contrast value of 50% for 50 Lp/mm is the normal limit for high quality 35 mm photography.

The 280 mm f/4 Apo-Telyt-R lens delivers a resolution of 150 Lp/mm with 50% contrast. Often the lower limit for usable contrast is set at 20%. At this value this lens still delivers an outstanding 300 Lp/mm."

 

Erwin Puts then discusses film related issues, quote:

 

"The big question is: how do we obtain this performance on the negative?

 

__ High-resolution photography

Let us make it clear from the start. Under practical circumstances, we can achieve a visible and usable resolution of more than 150 Lp/mm on microfilm (Agfa Copex and Kodak Technical Pan).

At first sight this may appear to be a bit disappointing. But 150 Lp/mm are 300 separate lines in one millimeter and that means that every single line has a width of 0.003 mm – an exceedingly small number!

Between two black lines there is a single white separation of a mere 0.003 mm in width. The smallest halo caused by the lens or by the grain in the emulsion will reduce that separation line to a dark gray one, making the difference between black and white disappear. The same holds for the slightest movement of camera or subject.

Occasionally you will read about film emulsions that are capable of resolving 700 Lp/mm or more in normal photographic situations (film-lens combination). In this case we have a line width of less than 0.0007 mm and that is minute in the extreme. But these theoretical claims are not so important because the results have never been seen or documented.

The 280 mm f/4 Apo-Telyt-R, which has a theoretical (i.e. computed) resolving power of 450 Lp/mm (depending on the wavelength that is being used), can resolve 250 Lp/mm with a contrast of 50%, of which approximately 150 Lp/mm can actually be recorded on film. The 180 mm f/2.8 Apo-Elmarit-R has values that are a bit lower."

 

Here is the url: http://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&uact=8&ved=0CB4QFjAA&url=http%3A%2F%2Fleica-camera.pl%2Fwp-content%2Fdownloads%2FPuts_Column_180-280_mm_Leica_R_Lenses_en.pdf&ei=UXuPVLKHKpLdoATpuoLoBQ&usg=AFQjCNF1IrLGQmhWZhgaxXbgbtCkyzDOyg&bvm=bv.82001339,d.cGU

 

Thanks again for your great question.

Edited by k-hawinkler

I'm going to throw my hat in the ring because I enjoy the technical aspects of our hobby. I'm not an expert, though. Even the statements I make below that sound definite are probably based only on rough understanding, but I hope they can contribute to improving our understanding of these issues.

 

So, please take everything I write with a grain of salt and a smiley face. :)

 

Let me start by saying that I don't think you can do this by reading spec sheets. Here's why:

 

The spec sheets tell us how many sensor elements (sensels) are on an M9, so we can calculate ~145 px/mm, but this does not translate into 72.5 lp/mm except for lines that run either perfectly vertical or horizontal. Let's say, for example, that these are "perfect" pixels with a "perfect" lens. With horizontal lines at 72.5 lp/mm, pixels running top to bottom alternate between black and white. Now, shift the lines to an inclination of 1%. If the pixel at (1,1) is black and at (1,2) is white, then, with the slight angle, the pixel at (101,1) is white and at (101,2) is black. What happens to the pixels at (50,1) and (50,2)? They're both 50% grey because they're averaging the black and white lines which each cover half of those pixels.

This is called moiré, which gets worse when you add color filter arrays and is why most digital cameras come with a low-pass filter: at the sensor's finest resolution, with a perfect lens, per-sensel detail is likely to create false data, which is often worse from an imaging standpoint than simple noise. Adding a low-pass filter blurs out those finest details, avoiding false data and enabling lines at angles (including curves) to be resolved about as well as lines that are horizontal or vertical, but limiting the maximum resolution of even a sensor with "perfect" sensels. (And most lenses have low enough contrast at the >100 lp/mm of a D800 or NEX-7 to make low-pass filters arguably unnecessary.)

Since Leica (along with every other camera manufacturer I'm aware of) doesn't publish the specs of its low-pass system, which would need to include both the cut-off resolution and the slope of the transition, it is impossible to derive maximum resolution from the spec sheet.
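The pixel-pitch arithmetic above can be sketched in a couple of lines (the exact M9 and A7R pixel counts are my own assumptions from published specs):

```python
def nyquist_lp_per_mm(pixels_along_edge, edge_mm):
    """Sensor Nyquist limit along one axis: half the sampling rate, in lp/mm."""
    return pixels_along_edge / edge_mm / 2.0

# Leica M9: ~3472 pixels over the 24 mm short edge -> ~145 px/mm, ~72 lp/mm
print(round(nyquist_lp_per_mm(3472, 24.0), 1))   # 72.3

# Sony A7R: ~4912 pixels over 24 mm -> ~102 lp/mm, the "approx. 100" often quoted
print(round(nyquist_lp_per_mm(4912, 24.0), 1))   # 102.3
```

As the paragraph above explains, these are upper bounds that hold only for perfectly vertical or horizontal lines; inclined detail aliases well before this limit.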

 

Thanks. I thought I would put Statement 1 out there and see how other folks react to it. Here (The GetDPI Photography Forums - View Single Post - Sony FE 16-35/f4) is feedback from Sergio Lovisolo of Varese, Italy, that also addresses the first point you are making, quote:

 

"A sensor has a resolution that is strictly related to the number of pixels in a defined surface (pixel density). That resolution can be measured in line pairs per mm, and in the case of the a7r this value is approx. 100 for vertical and horizontal lines (a little less for oblique). If the lens has a resolution greater than 100, the part exceeding 100 cannot be correctly recorded by the sensor, and creates aliasing, better known by photographers as moiré. Every time you see aliasing on a digital image, you know that the lens has more resolution than the sensor. When the sensor reaches its limit, it can no more cooperate with the lens to produce a better image. I do this measurement with every new lens I get."

 

Similarly, I don't think MTF curves can provide the desired data to determine maximum resolution. The style of chart that is most common today--contrast as a function of distance from axis, charted at specific resolutions--definitely won't. It is impossible to reliably extrapolate performance at 70 or 100 lp/mm from data at 5, 10, 20, and 40 lp/mm. What you want is a graph of contrast as a function of resolution, charted at specific distances from the axis, which I've seen used by Angenieux and Erwin Puts.

However, even this style of chart won't tell you what is limiting contrast at a given resolution. For example, coma will affect maximum practical resolution differently than lateral chromatic aberration, and obviously rays at high angles of incidence may be well focused but not accessible to the sensor. MTF charts need to exclude that sort of information in order to present the information that they do. The MTF chart of a nearly perfect lens--showing, say, 80% contrast at 200 lp/mm and 22mm off axis--can tell us that a lens is phenomenal along the plane of focus, but it doesn't tell us whether the lens will work nicely with even a 1 megapixel sensor. So, MTF curves, even though they're valuable sources of information, aren't useful for our theoretical analysis of the "outresolving" question.

 

(I think this subject can be analyzed theoretically if one has all the pertinent data. With full design information for the entire imaging chain, an engineer could balance the trade-offs of improving the sensor versus improving the lens without ever going beyond a theoretical model of the camera. This information isn't available to us, though. It isn't a matter of "theoretical analysis" being impossible but rather "our theoretical analysis" being the thing that won't work.)

 

Thankfully, though, we're photographers and not only armchair analysts. We have actual cameras and actual lenses to work with. So, if the problem at hand is identifying the component of our imaging process that has the strongest deleterious effect on the fine resolution of final output, and if we assume that we're using best practices such as using a tripod and credible processing so that we're limited to analyzing our lenses and our cameras, what can we do?

 

I think it simply comes down to looking at the photos. If the finest details are rendered with high acutance--one or two pixels to transition from bright to dark--then you'd probably see bigger gains going with a higher-megapixel sensor than with a new lens. If it takes thirty pixels to draw a faint spot where there actually was a black bird against a bright cloud, then you'd probably see bigger gains going with a better lens (or lens hood) than a new camera. If it usually takes three to five pixels (before sharpening) to transition from bright white to deep dark, then the optical system is likely well balanced. These are rough numbers off the top of my head, but they indicate how I'd analyze the issue.

 

So, what's a better lens? Yes, you could compare MTF charts of two comparable lenses (e.g., Macro-Elmarit-R 60mm and Summilux-R 50mm) to determine whether one would have better peak resolution than the other. But I think too many things change between types of lenses and eras of design, not to mention different brands, to tell from the spec sheet alone. Test photos are invaluable here.

 

Personally, though, I don't think of resolution as the limiting factor of modern lenses, including good lenses that are over 30 years old. As much as I enjoy exploring the fine details in photos, both my own and from others, I'm more affected by how lenses fail than by how they succeed. That is, I'd sacrifice resolution to avoid harshness, or, in statistical terms, I'll accept lower precision in exchange for higher accuracy. Frankly, this is what I love about my Leica lenses: compared to everything else that I've shot, they have less false color and fewer false shapes. Which is to say, they have well balanced aberrations, which I prefer even over lenses that have higher resolution. I mention all of this because it may tell you that my priorities don't align with yours, so your analysis may take a different shape than what I've discussed here.

 

Whether a lens can (or cannot) or does (or does not) outresolve a sensor, or vice versa, is mostly a question of how the terms are defined. From what I've read, everybody's points have merit but those who disagree are talking at cross-purposes. Once the issue is understood, though, I still don't think the practical answer changes. The bottom line, as far as I'm aware, is that imaging is complicated and there's no analytic shortcut available to us. So, we look at pictures. We make informed guesses to buy what seems like the best gear for our needs. Then, we go out and take photos. From time to time, we might even share what we've learned and what we've captured with our family, friends, and a handful or two of the good folks we've met on the internet.

 

Cheers,

Jon

 

Again, I think you are trying to address an important point, namely how to determine whether the sensor or the lens has higher resolution. Your point about looking at images seems to be the way forward. For continuation please see the next post.


Again, here (The GetDPI Photography Forums - View Single Post - Sony FE 16-35/f4) is how Sergio Lovisolo proceeds, quote:

 

"in photo 1 you'll see a test pattern, prepared following instructions provided by Norman Koren, creator of Imatest, the program used by professionals to test lenses.

15799031877_a9fae3bfd2_o.jpg

 

 

photo 2 is a 100% crop of the center, showing performance of the elmarit 28-2,8 II full open. Strong aliasing is visible, centered at 100 lppm, and from the fact that false detail is shown in the area between 100 and 200, we can roughly estimate lens resolution with good contrast up to 130/150. The lens is out resolving the sensor. A precise measurement can be effectuated by positioning the camera farther from the target.

15797484800_1f69a677c7_o.jpg

 

photo 3 shows performance of the elmarit full open in the corner. We can see approx. 90 or slightly more lppm (a very respectable value) and no aliasing; the sensor is out resolving the lens. (the elmarit creates aliasing also in the corners stopping down to 5,6)

15984098682_1f871f8cab_o.jpg

 

 

in the last crop it is also easy to evaluate the effect of vignetting.

 

a simple test like this with every new lens, for peace of mind."

 

I hope this helps. Any feedback is appreciated. Thanks.
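The "false detail" Sergio describes is classic frequency folding: any pattern frequency beyond the Nyquist limit reappears mirrored below it. A minimal sketch (the ~205 samples/mm sampling rate for the A7R is my own figure, derived from its pixel pitch):

```python
def aliased_frequency(f_lpmm, sample_rate_per_mm):
    """Apparent frequency of a sampled pattern: frequencies fold about Nyquist."""
    folded = f_lpmm % sample_rate_per_mm
    return min(folded, sample_rate_per_mm - folded)

# A7R-like sampling: ~205 samples/mm, so Nyquist is ~102 lp/mm.
# A 130 lp/mm target pattern shows up as false detail near 75 lp/mm:
print(aliased_frequency(130.0, 205.0))  # 75.0

# Detail below Nyquist passes through unchanged:
print(aliased_frequency(90.0, 205.0))   # 90.0
```

This is why seeing aliasing between the 100 and 200 lppm marks on the test chart tells you the lens is still delivering real contrast above the sensor's limit.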


Thanks to Erwin Puts we have a pretty good idea of the resolution characteristics of one lens, namely the APO-R 280/4.

For vertical and horizontal lines we also know approximately the resolution (lppm) of the Sony A7R sensor.

In the following table I have listed sensor characteristics of cameras of interest to me.

 

[Table of camera sensor characteristics: image attachment visible to registered members only.]

 

It seems that, with regard to the resolution characteristics of the APO-R 280/4 lens, all these sensors should behave similarly in the absence of a low-pass filter.

Of course, that's not true for other characteristics important to photographers. :D

Edited by k-hawinkler

... and still there are situations where replacing one component by a higher-resolution version would give tangible benefits whereas replacing the other would not. Describing this in terms of one component outresolving the other, while technically incorrect, gets the point across.

It might get the point across in that particular situation ... but it also will result in a blatant misconception of what actually is going on—which then inevitably will rear its ugly head as soon as we are looking at the next situation.

 

That's why I disapprove of technically incorrect explanations, even if they might be able to get one point across.

 

______________________________

When all my lenses are 1.5× longer now, do I have to buy a bigger bag?


K.H.,

 

The 5K iMac is really a horrible tool.

 

Today I loaded more files, created by a wider range of cameras and lenses, onto this machine again, and those images that are not so crisp or not exactly spot-on in focusing can be sorted out so easily.

 

This machine qualifies as an effective evaluation platform when discussing image quality, the interaction of lens and sensor, or even the skill of one's PP.

 

I'm glad that all my files that have been posted in LUF, created by Leica lenses on the A7R, look more than wonderful on it.

 

All the Best,


What happens at 45 degrees off axis - I assume that the sensor 'resolution' drops off with angle?

 

That's my understanding as well, as mentioned in post #71 in the blue text.

To me the sensor resolution values in the table are upper limits.

So the resolution of the APO-R 280/4 lens is higher than that of the sensors of any of the cameras listed - trying to avoid the, to some, obviously emotionally laden term "outresolve".

Edited by k-hawinkler