
I conducted a quick and dirty little test to check the latency of the M11 when in rangefinder mode vs live-view mode.

Holding the camera in my right hand, I held out my left arm so I could see it in the OVF / EVF. Starting with my hand open, I touched my index finger to my thumb and back to an outstretched position as quickly as I could. As my finger hit my thumb, I clicked the shutter. I did this around a dozen times in a row for each camera setting.

Understand that I'm going by feel here; I'm not responding to the scene. Try tapping your index finger and thumb together on each hand at the same time: you can easily tell when your timing is off. I know this is as unscientific as it gets, but the results were repeatable enough to show a pattern.

When looking through the OVF, with live view switched off, each image showed my thumb and finger still touching - so the timing was spot on.

Turning on live view and looking through the EVF to take each shot, every image showed my thumb and finger separated (about halfway back to an outstretched hand). Finally, keeping live view on but looking through the OVF instead, the results were identical to the EVF test.

So using the EVF, or using the OVF with live view still switched on, you'll get the same noticeable latency - but using the camera in 'rangefinder mode' with live view switched off, there is noticeably less latency.

 

  • Like 1
  • Thanks 5
8 minutes ago, John Ricard said:

Unfortunate, but good to know.

Someone can probably set up a better test to really quantify the difference. The good news from my point of view is that using the camera with just the OVF doesn't have this lag. I was worried that, because it meters off the sensor full time, there would be an equal delay, but that doesn't seem to be the case.

And this isn't latency from the live-view image itself being slightly delayed (causing me to click the shutter later because I'm observing the scene a fraction of a second behind real life) - this seems to be latency introduced simply by having live view engaged.

I can't see why this is the case though - the shutter has to close-open-close-open in each case.  Live view itself must be introducing a delay of its own when engaged. 


1 hour ago, Stevejack said:

I can't see why this is the case though - the shutter has to close-open-close-open in each case.  Live view itself must be introducing a delay of its own when engaged. 

All EVFs (and live views) introduce a non-negligible amount of delay.

The camera needs to sample the image off the sensor, decode the data, demosaic it, re-encode it for the display, overlay all the info over it, etc.

All of these steps take processing time.
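Those steps can be pictured as a latency budget that simply sums per-stage costs. A sketch in Python - the stage names follow the list above, but every millisecond figure is an invented placeholder, not a measurement of any real camera:

```python
# Hypothetical per-stage latency budget for an EVF / live-view pipeline.
# Stage names follow the post; the millisecond figures are illustrative only.
pipeline_ms = {
    "sensor readout": 16,          # e.g. one frame period at ~60 fps
    "decode sensor data": 3,
    "demosaic": 5,
    "re-encode for display": 4,
    "overlay UI info": 2,
    "display refresh": 8,          # roughly half a 60 Hz frame, on average
}

total = sum(pipeline_ms.values())
for stage, ms in pipeline_ms.items():
    print(f"{stage:22s} {ms:3d} ms")
print(f"{'total EVF delay':22s} {total:3d} ms")
```

Even with optimistic numbers for each stage, the sum easily lands in the tens of milliseconds, which is the order of delay being discussed in this thread.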

  • Like 3

Just now, orcinus said:

All EVFs (and live views) introduce a non-negligible amount of delay.

The camera needs to sample the image off the sensor, decode the data, demosaic it, re-encode it for the display, overlay all the info over it, etc.

All of these steps take processing time.

Good point - I guess I was thinking about the shutter actuation and not about all the processing that's active during live view and slowing things down.

  • Like 1

I think there is at least one recent mirrorless camera (I can't remember which) that can "go back in time" and find the image that was being read from the sensor at the instant the shutter was pressed. Or maybe that is still aspirational, and coming soon.

I think it requires a "stacked" BSI sensor, with its own dedicated processor and buffer memory attached right to the back of the sensor (for processing and storage speed), and electronic shutter - usually global (all pixels read at once - no rolling shutter).

Because the camera has to hold in memory maybe 10-30 images (virtually a video stream) to find the "40-millisecond-older" picture.

http://image-sensors-world.blogspot.com/2019/03/sony-announces-stacked-bsi-274um-global.html
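A minimal sketch of how such a pre-capture feature could work: keep a rolling buffer of timestamped frames and, on shutter press, rewind to the frame closest to a fixed lag before the press. The class name, the 8 ms frame spacing, and the 40 ms rewind are all illustrative assumptions (the 40 ms taken loosely from the post):

```python
from collections import deque

class PreCaptureBuffer:
    """Rolling buffer of (timestamp_ms, frame) pairs; oldest frames drop off."""
    def __init__(self, max_frames=30):
        self.frames = deque(maxlen=max_frames)

    def push(self, timestamp_ms, frame):
        self.frames.append((timestamp_ms, frame))

    def frame_at_press(self, press_ms, lag_ms=40):
        """Return the buffered frame closest to lag_ms before the press."""
        target = press_ms - lag_ms
        return min(self.frames, key=lambda tf: abs(tf[0] - target))

buf = PreCaptureBuffer()
for i in range(30):                 # e.g. frames arriving 8 ms apart (~120 fps)
    buf.push(i * 8, f"frame{i}")

ts, frame = buf.frame_at_press(press_ms=240)
print(ts, frame)                    # picks the frame nearest 200 ms
```

The point of the dedicated stacked-sensor memory is exactly this: the buffer has to absorb a video-rate stream continuously, before the photographer has decided anything.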

......................

As to testing lag, back when everyone had an audio turntable, a technique was to put on a 78 or 45 rpm record and try to snap the shutter at the instant the label was exactly "right side up" through the finder. The number of degrees the label had moved past "top dead center" by the time the film was exposed, divided by the rotation speed, gave the total lag (reaction time plus camera mechanics).

(also useful for checking long shutter exposures - the length of the visible blur streaks divided by the rotation speed would show whether your "1/8th second" was accurate. ;) )
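The turntable arithmetic in code form - the 30-degree reading is an invented example, not a measured value:

```python
def lag_ms(degrees_past_top, rpm):
    """Total lag (reaction time + camera) implied by how far the label rotated."""
    degrees_per_second = rpm * 360 / 60
    return degrees_past_top / degrees_per_second * 1000

# A 45 rpm record turns 270 degrees per second, so catching the label
# 30 degrees past "top dead center" implies roughly 111 ms of total lag.
print(round(lag_ms(30, 45)), "ms")
```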

But I think your finger method was not bad - and innovative!

  • Like 1
  • Thanks 2


17 minutes ago, adan said:

I think there is at least one recent mirrorless camera (I can't remember which) that can "go back in time" and find the image that was being read from the sensor at the instant the shutter was pressed.

That's fascinating! I think Apple does something similar with their 'Live Photos'. The sensor is always recording while you're framing your shot, so when you click the shutter it keeps the previous few frames and saves a few extra frames at the end, and you get a small looped video embedded in your image, with your actual "image" somewhere in the middle of the video.

  • Thanks 1

I have always shot my Leica M10 in manual exposure mode, sometimes with the EVF. I always set my exposure well before shooting a picture. Having not used the M11 yet, I am not sure how this camera will respond. Can anyone tell me what effect this method will have on shutter lag with the M11?


Thanks!


33 minutes ago, adan said:

I think there is at least one recent mirrorless camera (I can't remember which) that can "go back in time" and find the image that was being read from the sensor at the instant the shutter was pressed. Or maybe that is still aspirational, and coming soon.

I think it requires a "stacked" BSI sensor, with its own dedicated processor and buffer memory attached right to the back of the sensor (for processing and storage speed), and electronic shutter - usually global (all pixels read at once - no rolling shutter).

Because the camera has to hold in memory maybe 10-30 images (virtually a video stream) to find the "40-millisecond-older" picture.

http://image-sensors-world.blogspot.com/2019/03/sony-announces-stacked-bsi-274um-global.html

......................

As to testing lag, back when everyone had an audio turntable, a technique was to put on a 78 or 45 rpm record and try to snap the shutter at the instant the label was exactly "right side up" through the finder. The number of degrees the label had moved past "top dead center" by the time the film was exposed, divided by the rotation speed, gave the total lag (reaction time plus camera mechanics).

(also useful for checking long shutter exposures - the length of the visible blur streaks divided by the rotation speed would show whether your "1/8th second" was accurate. ;) )

But I think your finger method was not bad - and innovative!

Olympus m43 cameras have had a similar feature for quite a while. Using the electronic shutter, a half-press starts recording and a full press saves the n pictures taken before the shutter press.

  • Like 1
  • Thanks 1

6 minutes ago, hmathias said:

I have always shot my Leica M10 in manual exposure mode, sometimes with the EVF. I always set my exposure well before shooting a picture. Having not used the M11 yet, I am not sure how this camera will respond. Can anyone tell me what effect this method will have on shutter lag with the M11?


Thanks!

No real difference from the M10 - just use it like you always have (no live view) and it will feel the same, albeit with a different shutter sound.

I'll often just use sunny 16, or meter for the highlights / shadows and then adjust the aperture as I move in and out of the light. I don't pay too much attention to the camera's metering once I'm actually shooting. 

Edited by Stevejack
  • Like 1
  • Thanks 1

Interesting!

I did similar tests of lag some years ago photographing a digital stopwatch, pressing the shutter button the moment it hit a certain value, and seeing what the camera recorded.

I did the same to test delays from switching on and waking from sleep; at the time this was a controversy for the M240.


For what it’s worth, I always find these lag tests a tad pointless, as you will almost always be the limiting factor, not the EVF delay or shutter delay.

Human reaction times are:

  • 250ms average for reaction to visual input
  • 170ms for sound input
  • 150ms for touch/feel

Fastest recorded reactions to visual stimuli are around 120ms. Reflexes (that bypass the brain altogether) are 80ms.

Yes, all the delays in the system add up, so the camera does have an influence, obviously - but if the camera introduces a 50ms delay, that will be insignificant next to your 250ms. Also bear in mind that these figures are for simple stimuli, like a light flashing, not reactions to complex visual patterns.
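A back-of-envelope version of that point - the 250ms reaction time is from the list above, while the 50ms camera delay is the assumed figure:

```python
reaction_ms = 250       # average human reaction to a visual stimulus (from the post)
camera_lag_ms = 50      # assumed camera-added delay

total_ms = reaction_ms + camera_lag_ms
camera_share = camera_lag_ms / total_ms
print(f"total lateness: {total_ms} ms; camera's share: {camera_share:.0%}")
```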

Edited by orcinus
  • Like 2

22 minutes ago, orcinus said:

For what it’s worth, I always find these lag tests a tad pointless, as you will almost always be the limiting factor, not the EVF delay or shutter delay.

Human reaction times are:

  • 250ms average for reaction to visual input
  • 170ms for sound input
  • 150ms for touch/feel

Fastest recorded reactions to visual stimuli are around 120ms. Reflexes (that bypass the brain altogether) are 80ms.

Yes, all the delays in the system add up, so the camera does have an influence, obviously - but if the camera introduces a 50ms delay, that will be insignificant next to your 250ms. Also bear in mind that these figures are for simple stimuli, like a light flashing, not reactions to complex visual patterns.

I do understand what you’re saying and I agree with you, but in this situation, if you were trying to time a photograph (via anticipation, not reaction) of, say, a bird with its wings at the top of their stroke, or a person throwing a ball so that the ball is still in the air just next to the catcher’s outstretched hands - anything where you are anticipating the moment to click the shutter rather than reacting to it - you’ll have better success with live view switched off on the M11, because of that latency.
 

You explained it well in your first post; obviously I was only thinking about the extra shutter movement due to the now-permanent on-sensor metering.
 

I was able to repeat the test dozens of times with the same result every time, so it merits consideration if, like me, you assumed there would be no difference on the M11.

  • Like 2

1 hour ago, orcinus said:

For what it’s worth, I always find these lag tests a tad pointless, as you will almost always be the limiting factor, not the EVF delay or shutter delay.

Human reaction times are:

  • 250ms average for reaction to visual input
  • 170ms for sound input
  • 150ms for touch/feel

Fastest recorded reactions to visual stimuli are around 120ms. Reflexes (that bypass the brain altogether) are 80ms.

Yes, all the delays in the system add up, so the camera does have an influence, obviously - but if the camera introduces a 50ms delay, that will be insignificant next to your 250ms. Also bear in mind that these figures are for simple stimuli, like a light flashing, not reactions to complex visual patterns.

It doesn’t really work like that in reality, though. The lag happens after you press the shutter, so the longer the lag, the later the moment in time you are capturing.
 

So for the M10 the lag is quoted at around 40ms and for the M11 at around 60ms, which roughly equates to 1/25s vs 1/15s - a significant slice of time, I think, especially with fast movement.
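To translate those lag figures into subject movement: the 40ms and 60ms lags are the ones quoted above, while the 2 m/s hand speed is an arbitrary example:

```python
def travel_mm(lag_ms, speed_m_per_s):
    """Distance a subject moves during the shutter lag (m/s * ms = mm)."""
    return speed_m_per_s * lag_ms

# A hand or wingtip moving at 2 m/s:
for camera, lag in [("M10", 40), ("M11", 60)]:
    print(f"{camera}: {lag} ms lag -> {travel_mm(lag, 2):.0f} mm of subject travel")
```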

  • Like 1

6 minutes ago, sebben said:

It doesn’t really work like that in reality, though. The lag happens after you press the shutter, so the longer the lag, the later the moment in time you are capturing.

Re-read the last bit.

It absolutely works that way - it doesn’t matter if you’re 250ms late (0ms of camera lag) or 300ms late (50ms of camera lag). You’re still a quarter of a second late.

Edited by orcinus

1 hour ago, Stevejack said:

you’ll have better success with live view switched off on the M11 due to that latency. 

No you won’t; you’ll just capture it a bit sooner - but you’ll still be far off from the moment.

As long as the delay is consistent, you will learn to anticipate; whether the delay is 250, 300, or 350ms is irrelevant.


3 minutes ago, orcinus said:

Re-read the last bit.

It absolutely works that way - it doesn’t matter if you’re 250ms late (0ms of camera lag) or 300ms late (50ms of camera lag). You’re still a quarter of a second late.

There is a difference in what the image is, though. Your reaction time doesn't enter the equation, as the M11 always captures an image that is 1/50s (20 milliseconds) behind the M10's.

  • Like 1

9 minutes ago, orcinus said:

Re-read the last bit.

It absolutely works that way - it doesn’t matter if you’re 250ms late (0ms of camera lag) or 300ms late (50ms of camera lag). You’re still a quarter of a second late.

I am not sure here. Why? In many situations your brain will anticipate what happens next. If you want to take an image of your kid hitting the soccer ball, you won't wait until you see the foot touching the ball and only then start your reaction time; you will kind of synchronize with the action to press the shutter. I don't think our brain can also factor in an additional delay from the shutter lag.

I believe an optical viewfinder and a short shutter lag is the best bet.

 

Edited by tom0511

Personally I'm not a camera machine-gunner - I've proven to myself in the past that an OVF RF camera is capable of capturing what I saw in single-shutter-release mode - but it makes sense why modern cameras, mostly mirrorless, offer options to shoot at high frame rates, 10 fps or higher for instance.

In anticipation, when you keep the shutter pressed for, say, two seconds at a firing rate of 10fps, those 20 images, apart from clogging your memory card, increase the odds of capturing the "decisive moment", provided it lasts longer than 1/100 of a second :lol:
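A back-of-envelope version of those burst odds. At 10 fps, frames land 100 ms apart, so a fleeting event is only guaranteed to appear in some frame if it lasts at least one frame interval; the 50 ms moment duration below is my assumption:

```python
fps = 10
burst_s = 2
frame_interval_ms = 1000 / fps            # 100 ms between frames at 10 fps

frames = int(fps * burst_s)               # 20 images on the card
moment_ms = 50                            # assumed length of the "decisive moment"

# Chance that at least one frame lands inside a randomly-timed moment;
# it becomes a certainty once the moment lasts a full frame interval.
p_hit = min(moment_ms / frame_interval_ms, 1.0)
print(f"{frames} frames, {p_hit:.0%} chance of catching a {moment_ms} ms moment")
```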

Most examples of M11 pictures here on LUF are studies of sedentary subjects, and probably most shooters are slower than they used to be, so maybe the M11's live view doesn't need to be blazing fast. Call me a cynic, but Leica know their customers.

  • Like 1
  • Haha 2

1 hour ago, sebben said:

It doesn’t really work like that in reality, though. The lag happens after you press the shutter, so the longer the lag, the later the moment in time you are capturing.
 

So for the M10 the lag is quoted at around 40ms and for the M11 at around 60ms, which roughly equates to 1/25s vs 1/15s - a significant slice of time, I think, especially with fast movement.

FYI, Leica says the increase is 10ms, so that would be around 50ms for the M11. The Nikon D5 has a shutter lag of 40ms when prefocused.

