hey_giulio · Posted May 16, 2021 · Author · Share #21

Dear SrMi, let me just say: wow! This is why I come to this forum. Your test results are very interesting, and I will try to replicate them as soon as possible. By the way, did you by any chance also test the shutter lag with live view off, to get a baseline result for the camera? The fact that this shutter lag's duration depends on the shutter speed is quite an interesting development, and it actually makes me more curious than before about the technical explanation for the issue.

I am also interested in the live view "display delay" you mentioned. I understand this to be the time the image from the sensor needs to be processed and sent to the screen. But how does it work in practice? Suppose you conduct your test through live view and press the shutter when you see 6.60. At that moment, in reality, the timer is already at 6.61 (or 6.62) according to your measurements. Then the shutter lag takes effect and, if you shoot at 1/180 s, you get an image of the timer at 6.67. So the "real" shutter lag is 0.06 (or 0.05) s? Did I get this right? If so, would pressing the shutter release button when you see 6.60 in the optical viewfinder get you a 6.66 (or 6.65) image? (It defeats the purpose of live view, but it's just to understand how this works.)
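The arithmetic in the question above can be written out explicitly. A minimal sketch, using the hypothetical timer readings from the discussion (the exact values of the delays are assumptions, not measurements):

```python
# Hypothetical numbers from the discussion; all values are in seconds.
seen_on_lcd = 6.60     # timer value visible on the LCD when the button is pressed
actual_time = 6.61     # real timer value at that moment (display delay ~0.01 s)
captured = 6.67        # timer value recorded in the resulting photo

# Total delay from "what you saw" to "what you got":
total_delay = captured - seen_on_lcd        # ~0.07 s

# Subtracting the display delay isolates the shutter lag itself:
display_delay = actual_time - seen_on_lcd   # ~0.01 s
shutter_lag = total_delay - display_delay   # ~0.06 s

print(f"total delay ~{total_delay:.2f} s, shutter lag ~{shutter_lag:.2f} s")
```

On this reading, shooting through the optical viewfinder (no display delay) would indeed capture 6.66, since only the shutter lag applies.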
SrMi · Posted May 16, 2021 · Share #22

1 hour ago, hey_giulio said: Dear SrMi, let me just say: wow! […]

What I did is the following:
a) let the timer run on my iPad;
b) turn on live view on the rear LCD of my M10M and point the camera at the iPad;
c) start video recording on my iPhone, with both the iPad and the camera's LCD in the frame;
d) take a picture;
e) stop recording video on the iPhone.

Looking at the recorded video, I see the time displayed on the iPad and the time shown in the camera's live view; the difference between the two is the live view delay. On the video, I check the iPad time at the moment the image was taken and compare it to the time recorded in the picture; the difference is the shutter lag. I will try to measure the shutter lag with live view off, using the sound of the shutter as the trigger.
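The steps above reduce to two simple subtractions over the video frames. A minimal sketch with made-up frame readings (the numbers are illustrative assumptions, not SrMi's actual data):

```python
# Hypothetical per-frame readings from the recorded video, in seconds:
# each pair is (time on the iPad, time shown on the camera's LCD) in one frame.
frames = [(6.50, 6.48), (6.55, 6.53), (6.60, 6.58)]

# Display delay: how far the live-view image trails reality, averaged over frames.
display_delay = sum(ipad - lcd for ipad, lcd in frames) / len(frames)

ipad_at_capture = 6.60   # iPad reading in the frame where the shutter fires
photo_reading = 6.67     # timer value recorded in the resulting photograph
shutter_lag = photo_reading - ipad_at_capture

print(f"display delay ~{display_delay:.3f} s, shutter lag ~{shutter_lag:.3f} s")
```

Averaging the display delay over several frames helps, since a single frame can only resolve time to the video's frame interval (~33 ms at 30 fps).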
hey_giulio · Posted May 17, 2021 · Author · Share #23

Dear SrMi, I have just had a little time to conduct a quick test of my own, with an alternative method that is far less scientific than yours but based on a "big data" logic.

Before doing this I tried photographing a timer on screen. However, when I used a slow shutter speed to try to reproduce your test result, albeit less scientifically, I ran into an obstacle: at slow shutter speeds the numbers of the timer were not legible because of the length of the exposure. I therefore changed method, using this movement test on screen:

My test was the following:
1) try to photograph the white dot when it reaches the bottom of the screen, without live view, at 1/250 s;
2) the same, with live view on, at 1/250 s;
3) the same, with live view on, at 1/15 s.

Of course, to compensate for human error, I took multiple shots in each scenario. The results were the following:
- in case 1), on average I succeeded in catching the white dot at the bottom;
- in case 2), on average I did not succeed: the white dot was a little higher than the bottom;
- in case 3), on average I did not succeed either: the white dot was a little higher than the bottom. This was noticeable even though there was a bit of blur due to the length of the exposure.

From the above I deduce that there may be no difference, or very little, in the live view shutter lag between different shutter speeds. What do you think? As usual, many thanks!
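The "big data" logic here is just averaging: individual shots are noisy because of human reaction time, but the mean offset over many shots exposes any systematic delay. A minimal sketch with invented per-shot offsets (all numbers are hypothetical, chosen only to mirror the pattern of results described above):

```python
# Hypothetical per-shot offsets of the white dot from the bottom of the screen,
# as a fraction of screen height; one list per test scenario.
trials = {
    "no live view, 1/250": [0.00, 0.01, -0.01, 0.00],
    "live view, 1/250":    [0.04, 0.05, 0.03, 0.04],
    "live view, 1/15":     [0.05, 0.04, 0.04, 0.05],
}

# Averaging many shots cancels reaction-time noise (which scatters both ways),
# leaving the systematic offset caused by the live-view delay.
for scenario, offsets in trials.items():
    mean = sum(offsets) / len(offsets)
    print(f"{scenario}: mean offset {mean:+.3f} of screen height")
```

If the two live-view means come out roughly equal, as in the pattern reported above, that supports the conclusion that the live-view lag does not depend much on shutter speed.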
SrMi · Posted May 17, 2021 · Share #24

42 minutes ago, hey_giulio said: Dear SrMi, I have just had a little time to conduct a quick test of my own […]

Interesting approach. When testing, did you frame through the optical viewfinder or through live view? If you used live view, then the delay of live view itself contributed to the measurement, not only the shutter lag. When I tested using your approach, my "precision" varied quite a bit.
hey_giulio · Posted May 17, 2021 · Author · Share #25

Dear SrMi, thanks for taking the time to comment on my test. I always shot through the optical viewfinder, even when live view was turned on. As soon as I can, I will repeat the test actually framing through live view when it is active, just to see whether it changes anything in my (I repeat, far less scientifically rigorous than yours) test results.

Precision in this test of course varies greatly because of the human element involved, though I noticed that practice and enough repetitions give quite consistent results. These are not meant to be scientific proofs at all, but they are interesting to me because the setup is quite similar to the problem I actually experienced in real use: say you want to catch a pedestrian crossing the street at exactly the moment when the pedestrian is in the middle. With live view, I noticed I was always a little behind...
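The pedestrian example gives a feel for what a delay of a few hundredths of a second means in practice. A rough sketch, with an assumed walking speed and an assumed combined delay (both numbers are illustrative, not measured):

```python
# Hypothetical values: a pedestrian at a typical walking pace, and an assumed
# combined live-view delay (display delay plus shutter lag) of ~0.07 s.
walking_speed = 1.4   # m/s, typical walking speed
total_delay = 0.07    # s, assumed combined delay when shooting via live view

# Distance the pedestrian covers between "seen on the LCD" and "captured":
offset = walking_speed * total_delay
print(f"pedestrian moves ~{offset:.2f} m during the delay")
```

Roughly 10 cm of movement is enough to miss "exactly in the middle" by a visible margin, which matches the feeling of always being a little behind when framing via live view.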