bencoyote

Members
  • Posts

    521
  • Joined

  • Last visited

Reputation Activity

  1. Like
    bencoyote got a reaction from ellisson in Skin tones with the M-P240   
    I'm not sure recalibration over a short time interval would work, because of the way the ColorMunki fits to the screen. But you obviously get the point. ;-) Thankfully LCDs are much more color-stable than phosphors being blasted by a particle accelerator, i.e. a CRT.
     
    Looks like a good reference. I'm not sure books like that existed when I was working with color professionally. Digital was brand new and people were still figuring it out. The big guys, e.g. Victoria's Secret (who were our heroes), had their workflows figured out, and I was part of the 2nd or 3rd crop: those gurus passed their knowledge down to us so that we could build parts of it into the next generation of machines to make their lives easier.
     
    We used to say, "you have to get the objective colors correct first. Then everything beyond that is artistic." You have to get the objective part right first because the image has to go somewhere. Even if you are just posting it on Instagram, people will see it on a different device, and unless the particular way your colors are wrong happens to be exactly the same as someone else's, none of the fine tuning will be preserved.
     
    I think we've got the objective-color idea pretty well covered on this thread. That is science, not art; it is the easy part. Where the combined forum, with its vast experience, can really add something is in the art of it: taking the metaphoric, artistic, and semi-technical language of Photoshop and converting it into the artistic and emotive language of skin tone. What makes a particular skin tone wrong? Why do you like a particular set of colors? What does it say to you?
     
    For example, I find Elmar's color settings posted earlier in the thread really interesting. I haven't gone so far as to try them yet, and I don't know what situation they were designed for, but I find the concept really interesting.
  2. Like
    bencoyote got a reaction from Tortuga in Skin tones with the M-P240   
    I used to do a lot of work with color, specifically color printing. Color is a very complicated topic, and in the color theory classes my employer sent me to, the problem the original poster described was introduced on the first day. They used it as a starting point on the topic of sensation vs. perception. It would be impossible for me to distill weeks of classroom study down to a short forum post, but let me try to get at the essence of it:
     
    There is an objective, measurable interaction between an object and the light that hits it. We can build sensors that capture a portion of this interaction in a way similar to the way our eyes do. This part is objectively measurable.
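That "object times light times sensor" idea can be sketched in a few lines. All the spectra below are made-up coarse samples for illustration, not CIE or real sensor data; the point is just that the channel value is a plain weighted sum, with nothing perceptual about it.

```python
# Toy model of the objective part of color: integrate (here, sum) the light
# reflected by a surface, weighted by a sensor channel's spectral sensitivity.
# All numbers are illustrative, not standards data.
wavelengths = [450, 500, 550, 600, 650]          # nm, very coarse sampling
reflectance = [0.10, 0.25, 0.60, 0.70, 0.65]     # a warm, skin-like surface
illuminant  = [0.90, 1.00, 1.00, 0.95, 0.85]     # roughly daylight-shaped
sensor_red  = [0.02, 0.05, 0.30, 0.90, 0.70]     # hypothetical red-channel sensitivity

def channel_response(reflect, light, sensitivity):
    """Sum of reflectance x illuminant x sensitivity across the samples."""
    return sum(r * l * s for r, l, s in zip(reflect, light, sensitivity))

red = channel_response(reflectance, illuminant, sensor_red)
print(red)
```

Swap in a different illuminant and the same surface produces a different number — which is exactly why white balance exists.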
     
    When we look at the world, what we think we see is our perception. The thing we have in our mind is more like an HDR, focus-stacked, white-balance-corrected, sharpened image, heavily modified based upon our past experience. This is what we call perception, and by the time your brain is done manipulating it this way and that, it bears little resemblance to the actual sensations your eyes generate.
     
    When we look at a still picture, several layers of that processing are no longer available. We can't shift our focus or the white balance or a whole bunch of other factors. Furthermore, because it is a still image, we can look at it longer, and so things we wouldn't have time to notice while processing the torrent of sensory data coming from our eyes in a moving scene are able to bubble up to our awareness.
     
    A really big factor to keep in mind when looking at a photo is the very real difference between objective reality and perceived reality. When you are paused, looking at a captured still photo, there is a very strong desire in some people to apply one of the last steps in the brain's automatic post-processing to the image: to bring in your vast experience about what you think something like skin tones should look like and override the objectively captured information recorded by the camera.
     
    That is not to say the camera is always exactly correct. As Jaap pointed out, even the camera has to infer some things, like white balance. If you are just trying to be artistic and make a pretty picture, do whatever you want. However, if you are doing product photography and need to make sure the clothes look the same on the model on the runway with carefully designed lighting, in the catalog, on the ultimate consumer in the store under likely fluorescent lights, at home under tungsten lights, and outside, then you have to do a huge amount of work.
     
    1) As Jaap said, profile the camera with a ColorChecker Passport. You will need to do this under various lighting conditions: bright daylight, overcast, and a couple of indoor lighting conditions. With outdoor natural light, two things that matter more than most people think are elevation and the amount of water vapor in the air. So if you are more than about 1000 m above or below the altitude at which you calibrated your camera, you should run through the color calibration again. The same is true if you go from a very moist area, like near the ocean, to a very dry place. Different cameras are more or less sensitive to these changes; in my experience the M is a bit more sensitive to altitude than my T. I don't remember noting a difference with water vapor. When I first calibrated my Leica cameras, I remember thinking, "Wow, these Germans really are into objective reality vs. making colors look good (which is what other camera vendors often do -- ehm, Olympus, Panasonic)." I also remember noting that my color profiles were not really that far off from the "Embedded" profile that was in the camera. However, ACR and LR's color profile was way, way off. So one of the things I put in my default develop presets for Leica cameras is to change from "Adobe Standard" to "Embedded profile".
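What profiling software does with those chart shots can be sketched in miniature. Real ColorChecker profiling fits a full color matrix or lookup table; the toy version below fits just one least-squares gain per channel, and the patch values are invented for illustration.

```python
# Simplified stand-in for camera profiling: fit one gain per channel that maps
# the camera's measured chart-patch values onto the chart's known reference
# values. Real profiling tools fit a full 3x3 matrix or a LUT; the numbers
# here are made up for illustration.
measured  = [(0.40, 0.30, 0.20), (0.80, 0.62, 0.45), (0.20, 0.35, 0.55)]
reference = [(0.44, 0.30, 0.18), (0.88, 0.62, 0.40), (0.22, 0.35, 0.49)]

def fit_gains(meas, ref):
    """Per-channel least-squares gain: g = sum(m*r) / sum(m*m)."""
    gains = []
    for c in range(3):
        num = sum(m[c] * r[c] for m, r in zip(meas, ref))
        den = sum(m[c] * m[c] for m in meas)
        gains.append(num / den)
    return gains

gains = fit_gains(measured, reference)
# Apply the fitted gains to every measured patch.
corrected = [tuple(g * v for g, v in zip(gains, patch)) for patch in measured]
```

Shooting the chart again at a different altitude or humidity and re-fitting is exactly the recalibration step described above, just with a much richer model.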
     
    2) When you say the colors don't look right, what are you looking at them on? The camera's LCD, your computer's monitor, what? Have you calibrated it? How big is its color gamut? Is it good enough to represent the colors recorded? And you haven't messed with the brightness or contrast or any other settings on the monitor since you calibrated it, have you? Oh, and one more thing: what is the ambient illumination source in the room, and how does it change throughout the day? Screens are not 100% black-body absorbers, so the light source in the room can mix with the light coming out of the monitor and distort even your sensation of color. If you really want to do it right, you should only edit your photos in a dark room with no natural light, probably on a brand-new MacBook Pro that you have calibrated with something like a ColorMunki Photo.
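The gamut question is checkable with arithmetic: convert a measured color from CIE XYZ to linear sRGB with the standard matrix and see whether any channel falls outside [0, 1]. The two sample colors below are my own illustrative picks.

```python
# Can an sRGB monitor even show this color? Convert CIE XYZ (D65) to linear
# sRGB using the standard sRGB matrix; channels outside [0, 1] mean the
# monitor's gamut cannot represent the color.
M = [( 3.2406, -1.5372, -0.4986),
     (-0.9689,  1.8758,  0.0415),
     ( 0.0557, -0.2040,  1.0570)]

def xyz_to_linear_srgb(x, y, z):
    return tuple(a * x + b * y + c * z for a, b, c in M)

def in_srgb_gamut(x, y, z, eps=1e-6):
    return all(-eps <= v <= 1 + eps for v in xyz_to_linear_srgb(x, y, z))

print(in_srgb_gamut(0.2, 0.2, 0.2))   # a neutral grey: displayable
print(in_srgb_gamut(0.1, 0.4, 0.1))   # a very saturated green: out of gamut
```

A wide-gamut monitor moves the boundary outward, but the same question — "does the recorded color fit?" — still has to be asked.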
     
    And all of that is long before you ever try to print something. There you have to deal with the reflectivity and spectral neutrality of the paper, the metamerism of inks or pigments, and finally the limited gamut of colors possible in printing.
     
    If you want to keep it really simple: buy a brand-new MBP, use the Embedded profile rather than Adobe Standard, do your editing only at night with the lights off, and remember that there is an objective way things actually are and there is an artistic preconception of how you believe things should be.
  3. Like
    bencoyote reacted to jmahto in Leica repair wait times   
    My dealer closed shop a couple of months ago after 30 years in business: Keeble & Shuchat in Palo Alto. A great shop that could not survive the online age. I blame all the people who used it as a free demo shop.
  4. Like
    bencoyote reacted to jonoslack in Skin tones with the M-P240   
    Thank you, Ben. This is such a wonderful encapsulation of what I feel about colour - I could add:
     
    "There is no such thing as a 'correct' white balance in a scene which has mixed lighting (which includes any image with shade in it)."
     
    More literally, if there is a variation in colour temperature in a scene, there isn't a way to get the white balance right.
     
    The thing about people's brains doing HDR, focus stacking, and white balance correction is that (probably) we all do it differently, and if that's the case, then we almost certainly also look at images with our own personal set of 'corrections'.
     
    The upshot of this (to me) is that
    1. If you're doing colour corrected work, in controlled lighting conditions, or if you're working in a team on a photographic project - then all the monitors and printer profiles should be properly calibrated and checked daily but
    2. If you're shooting in mixed light (which includes light with shadows) - then there is no such thing as 'correct', and you should be aiming for 'excellent' - which is what the end user likes (or indeed what the photographer likes, in my case), whether they're looking on Facebook or at a high-quality print. "Correct" is not an option, so why strive for anything other than "Brilliant"!
  5. Like
    bencoyote reacted to jaapv in Skin tones with the M-P240   
    No, the Colormunki sits between my screens on a small tripod and measures the ambient light, to adjust the pre-calibrated screens. As you probably know it measures the ambient light as part of the calibration process.
  6. Like
    bencoyote got a reaction from jonoslack in Skin tones with the M-P240   
  7. Like
    bencoyote got a reaction from paulmac in Skin tones with the M-P240   
  8. Like
    bencoyote got a reaction from LocalHero1953 in Skin tones with the M-P240   
  9. Like
    bencoyote got a reaction from mp58 in Skin tones with the M-P240   
  10. Like
    bencoyote got a reaction from elmars in Skin tones with the M-P240   
  11. Like
    bencoyote got a reaction from ellisson in Skin tones with the M-P240   
    I used to do a lot of work with color and specifically color printing. Color is a very complicated topic and in the color theory classes that my employer sent me to the problem that the original poster described was introduced on the first day. They kind of used it as a starting point on the topic of sensation vs. perception. It would be impossible for me to distill weeks of classroom study down to a short forum post but let me try to get at the essence of it:
     
    There is an objective measurable interaction between an object and the light that hits it. We can build sensors that capture a portion of this interaction in a way that is similar to the way that our eyes do. This is objectively measurable.
     
    When we look at the world what we think we see is our perception. The thing that we have in our mind is something more like a HDR, focus stacked, white balance corrected, sharpened image which is highly modified in many ways based upon our past experience. This is what we call perception and by the time that your brain is done manipulating it this way and that it bears little resemblance to the actual sensations that your eyes generate.
     
    When we look at a still picture several layers of that processing are no longer available. We can't shift our focus or the white balance or a whole bunch of other factors. Furthermore because it is a still image we can look at it longer.and so things that we wouldn't have time to notice when trying to process the torrent of sensory data coming from our eyes in a moving scene are able to bubble up to our awareness.
     
    A really big factor to keep in mind when when looking at a photo is the very real difference between objective reality and perceived reality. When you are paused looking at a captured still photo, there is a very strong desire in some people to apply one of the last steps in your brain's automatic post processing to the image.That is to bring in your vast experience about what you think something like skin tones should look like and override the objectively captured information recorded by the camera.
     
    That is not to say the camera is always exactly correct. As Jaap pointed out even the camera has to infer some things like white balance. If you are just trying to be artistic and make a pretty picture do whatever you want. However, if you are doing product photography and you need to make sure that the clothes look the same on the model on the runway with carefully designed lighting, in the catalog, and on the ultimate consumer in the store with their likely florescent lights, and a home with Tungsten lights, and outside then you have to do a huge amount of work.
     
    1) As Jaap said profile the camera with a Color Checker passport. You will need to do this under various lighting conditions bright daylight, overcast, and a couple of indoor lighting conditions. With outdoor natural light two things that matters more than most people think it does is elevation and the amount of water vapor in the air. So if you are more than about 1000m higher than lower than the altitude which you calibrated your camera then you want run through the color calibration again. The same is true if you go from a very moist area like near the ocean to a very dry place. Different cameras are more or less sensitive to these changes. In my experience the M is a bit more sensitve to altitude than my T. I don't remember noting a difference with water vapor. When I first calibrated my Leica cameras, I remember thinking "Wow these Germans really are into objective reality vs. making colors look good (which is what other camera vendors often do -- ehm Olympus, Panasonic)." I also remember noting that my color profiles were not really that far off of the "Embedded profile" that was in the camera. However, ACR and LR's color profile was way way off. So one of the things that I put in my default develop presets is to change from "Adobe Standard" to "Embedded profile" for Leica cameras.
     
    2) When you say the colors don't look right, what are you looking that them on? The LCD, your computer's monitor, what? Have you calibrated it? How big is its color gamut? Is it good enough to represent the colors the colors recorded? And you haven't messed with the brightness or contrast or any other settings on the monitor since you calibrated it have you? Oh and one more thing what is the ambient illumination source in the room and how does it change throughout the day? The screens are not 100% black body absorbers and so the light source in the room can mix with with the light coming out of the monitor to distort even your sensation of color. If you really want to do it right, you should only edit your photos in a dark room with no natural light on probably a brand new MacBook Pro that you have calibrated with something like a ColorMunki Photo.
     
    And all of that is long before you ever try to print something. There you have to deal with the reflectivity and spectral neutrality of the paper, the metamersim of inks or pigments and finally the limited gamut of colors possible with printing.
     
If you want to keep it really simple: buy a brand-new MBP, use "Embedded profile" rather than "Adobe Standard", only do your editing at night with the lights off, and remember that there is an objective way things actually are and there is an artistic preconception of how you believe things should be.
  12. Like
    bencoyote got a reaction from jonoslack in Skin tones with the M-P240   
I used to do a lot of work with color, and specifically color printing. Color is a very complicated topic, and in the color theory classes my employer sent me to, the problem the original poster described was introduced on the first day. They used it as a starting point for the topic of sensation vs. perception. It would be impossible for me to distill weeks of classroom study down to a short forum post, but let me try to get at the essence of it:
     
There is an objective, measurable interaction between an object and the light that hits it. We can build sensors that capture a portion of this interaction in a way similar to the way our eyes do. This part is objectively measurable.
     
When we look at the world, what we think we see is our perception. The thing we have in our mind is more like an HDR, focus-stacked, white-balance-corrected, sharpened image, heavily modified by our past experience. This is what we call perception, and by the time your brain is done manipulating it this way and that, it bears little resemblance to the actual sensations your eyes generate.
     
When we look at a still picture, several layers of that processing are no longer available. We can't shift our focus or the white balance or a whole bunch of other factors. Furthermore, because it is a still image we can look at it longer, and so things we wouldn't have time to notice while processing the torrent of sensory data coming from our eyes in a moving scene are able to bubble up to our awareness.
     
A really big factor to keep in mind when looking at a photo is the very real difference between objective reality and perceived reality. When you are paused, looking at a captured still photo, there is a very strong desire in some people to apply one of the last steps of the brain's automatic post-processing to the image: to bring in your vast experience about what something like skin tones should look like and override the objectively captured information recorded by the camera.
     
That is not to say the camera is always exactly correct. As Jaap pointed out, even the camera has to infer some things, like white balance. If you are just trying to be artistic and make a pretty picture, do whatever you want. However, if you are doing product photography and need the clothes to look the same on the model on the runway with carefully designed lighting, in the catalog, on the consumer in the store under their likely fluorescent lights, in a home with tungsten lights, and outside, then you have to do a huge amount of work.
     
  13. Like
    bencoyote got a reaction from jmahto in Skin tones with the M-P240   
  14. Like
    bencoyote got a reaction from Wyck in Skin tones with the M-P240   
  15. Like
    bencoyote got a reaction from LocalHero1953 in Skin tones with the M-P240   
  16. Like
    bencoyote got a reaction from JPP1 in Skin tones with the M-P240   
  17. Like
    bencoyote reacted to jaapv in Skin tones with the M-P240   
No, I use the ColorChecker to make a profile. I have a whole series by now for different types of light. If I cannot get the skin tone just so, I move over to C1.
The problem with skin is that it is multilayered, each layer has its own IR character, and it is not uniform in colour. However, when we look at a person our brain records a more or less uniform colour. The camera records the colours as they objectively are, hence the unnatural look. Add the IR issue and you have your problem.
  18. Like
    bencoyote got a reaction from ELAN in M 11   
    I do appreciate how difficult it is to innovate when the essence of the design is minimalism.
At some point the tool used to make the art is not the limiting factor; it is the artist's creative vision.
     
I'd say for most of us that point was reached with either the M9 or the M (Typ 240). The M10 may offer some nice-to-have refinements, but since we are already limited by our own ability rather than the camera's, and the arms race between camera vendors is winding down as the masses rely on their cell phones, leaving the pro market a low-volume business, I don't expect the M11 to come out any time soon.
     
    The things that I would ask for if I were specifying the M11 now are:
1) More dynamic range from the sensor. This probably also implies more bits per pixel per color channel, even if the resolution stays the same.
2) A case design that facilitates replacing and upgrading the electronics. In the same way that you can upgrade a computer by replacing the motherboard and CPU while keeping the same commodity case, design the camera so that the rangefinder and the user controls stay the same and the sensor/processor logic board can be swapped quickly as technology evolves. What I would do is have the screen and all the buttons and controls plug into one flex cable attached to the processor/sensor assembly. This would be exactly the same on all variations of the camera, and the IO processor on this side of the camera would identify itself to the sensor/processor assembly so that the firmware knows how to implement the UI. That way an M11-D with no screen would have the same electronics as an M11. The firmware would say: no screen, no need to do preview, and I read the ISO from the wheel on the back rather than from the knob where the rewind spool used to be.
    3) Better weather and dust sealing in the case.
4) Replace the direct mechanical connection between the RF and the lens with an electronic one, or in some other way allow the camera to be adjusted in the field to keep the RF focus accurate.
5) Make the vertical adjustment of the RF field-serviceable.
Those last three could leverage the network of Leica boutiques to offload much of the routine work currently done by the repair services in Allendale and Wetzlar.
6) Work with Adobe to add encryption and error detection and correction to a new version of DNG, and implement that in the camera.
    7) Give the camera a USB-C port for charging in-camera and tethered shooting.
    8) Make the external charger use USB-C rather than AC.
9) Provide some sort of haptic signal, or put stops on the shutter speed dial, so that you either can't go, or at least know when you go, from 1/4000 to B and vice versa.
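The self-identifying UI board proposed in point 2 can be sketched in a few lines. Everything here is invented for illustration (Leica's actual firmware and board protocol are unknown): the body reports a variant ID, and the sensor/processor firmware enables only the features that body actually has.

```python
# Sketch of a body/UI board identifying itself to the sensor/processor
# assembly. All IDs, names, and fields are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class BodyDescriptor:
    model: str
    has_screen: bool
    iso_control: str   # where the firmware reads the ISO setting from

BODIES = {
    0x01: BodyDescriptor("M11",   has_screen=True,  iso_control="rear_wheel"),
    0x02: BodyDescriptor("M11-D", has_screen=False, iso_control="top_dial"),
}

def configure_firmware(body_id: int) -> dict:
    """Enable features based on what the attached body reports."""
    body = BODIES[body_id]
    return {
        "live_view": body.has_screen,   # no screen -> no preview pipeline
        "menus": body.has_screen,
        "iso_source": body.iso_control,
    }

print(configure_firmware(0x02))
# {'live_view': False, 'menus': False, 'iso_source': 'top_dial'}
```

The design choice is the one the post argues for: one electronics assembly for every body variant, with behavioral differences driven entirely by the descriptor the UI side reports.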
  19. Like
    bencoyote got a reaction from BCMielke in Best settings to conserve battery   
I think the real solution is not some magic combination of settings but simply having sufficient spare batteries in your pocket.
     
Most of the things that matter are already baked into the hardware design and the firmware. Almost everything you can do through settings without compromising functionality saves only a few percent by comparison.
     
I would say you might want to do some testing with the GPS on and compare it to the GPS off. GPS chips aren't the power hogs they were a few years ago, and it is reasonably likely that even with GPS off in the settings the chip is still powered and computing your position, so the only thing you would lose is the geo-tagging of your photos.
     
    For the T2, or whatever model follows the TL, a couple of things that I would like are:
    1) USB-C charging and downloading, using the full capability of USB PD (Power Delivery) to provide enough voltage and wattage to charge the battery in-camera as quickly as in the external charger. USB-C with PD can now deliver up to 100W at voltages up to 20V; this is plenty to charge a camera battery.
    2) The external charger should also use USB-C with USB PD. Instead of plugging directly into mains AC, there should be a USB-C adapter; this way you can charge from a car adapter or a battery pack as well as from an AC outlet. The adapters for the various national plugs would then attach to the USB-C adapter.
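    To put rough numbers on why PD matters for charge times, the arithmetic is just energy over power. The battery capacity and the 85% efficiency figure below are assumptions for illustration, not measured values for any particular Leica battery:

```python
def charge_time_hours(capacity_wh: float, charger_watts: float,
                      efficiency: float = 0.85) -> float:
    """Rough charge time: energy to store divided by usable input power.

    The efficiency factor covers conversion/charging losses (assumed).
    """
    return capacity_wh / (charger_watts * efficiency)

# Assumed pack: roughly 1800 mAh at 7.4 V, i.e. about 13.3 Wh.
battery_wh = 1.8 * 7.4

print(round(charge_time_hours(battery_wh, 2.5), 1))   # 6.3 h at old 5 V / 500 mA USB
print(round(charge_time_hours(battery_wh, 45.0), 1))  # 0.3 h at a 45 W PD profile
```

    Real chargers taper the current near full charge, so actual times run longer; the sketch only shows the order-of-magnitude difference PD makes.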
  20. Like
    bencoyote got a reaction from jmahto in Lost/stolen Leica M240 with 35mm pre asph lux and 50mm pre asph Lux   
    I had previously registered my lenses and camera with the Leica Member's Area on their web site and then put a comment that they had been stolen.
    Then I mailed Leica and told them as much. They sent me a nice email saying that they would take note of the serial numbers and if they ever appeared at service they would notify me as the rightful owner.
     
    When I've bought used equipment, most of the time when I try to register it on the member area I discover that the previous owner has failed to unregister it, and the web interface prevents me from registering it. The general procedure has been that I email Leica a copy of my receipt from the reputable shop where I bought the used camera or lens; they unregister the previous owner and then I can register my ownership.
     
    I also tried to email all the local and big national camera shops which deal in used equipment so that they at least would be aware: Leica Store SF, Samy's Camera, KEH, Adorama, B&H, Popflash...
     
    I have a Google Docs spreadsheet with all my serial numbers, how much I paid for each item, where I bought it, and the current retail price. This made it fairly easy to fill out the police report and make the claim with the insurance company.
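    A sketch of that kind of inventory, with the two totals an insurance claim typically asks for. The serial numbers, items, and prices below are made up for illustration, not my actual gear:

```python
import csv
import io

# Hypothetical inventory: serial, item, price paid, seller, current retail.
inventory_csv = """serial,item,paid_usd,bought_from,retail_usd
4711001,M240 body,5500,Leica Store SF,6500
3901002,35mm Summilux pre-ASPH,2200,KEH,3400
3801003,50mm Summilux pre-ASPH,1900,Popflash,3600
"""

rows = list(csv.DictReader(io.StringIO(inventory_csv)))

# The replacement-cost and purchase-cost totals for the claim.
paid_total = sum(float(r["paid_usd"]) for r in rows)
retail_total = sum(float(r["retail_usd"]) for r in rows)

print(paid_total, retail_total)  # 9600.0 13500.0
```

    Keeping it as CSV (or a shared spreadsheet exported to CSV) means the same file feeds the police report, the insurer, and any script you write later.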
  21. Like
    bencoyote got a reaction from pico in M 11   
    I do appreciate how difficult it is to innovate when the essence of the design is minimalism.
    At some point the tool used to make the art is not the limiting factor, it is the artist's creative vision.
     
    I'd say for most of us, that point was reached with either the M9 or the M type 240. The M10 may have some nice-to-have refinements, but since we are already limited by our own ability rather than the camera's, and the arms race between camera vendors is winding down as the masses rely on their cell phones, leaving the pro market a low-volume business, I don't expect the M11 to come out any time soon.
     
    The things that I would ask for if I were specifying the M11 now are:
    1) More dynamic range from the sensor. This probably also implies more bits per pixel per color channel even if the resolution stays the same.
    2) A case design that facilitates replacing and upgrading the electronics. In the same way that you could upgrade your computer by swapping the motherboard and CPU into the same commodity case, the rangefinder and user controls would remain the same while the sensor and processor logic board could be replaced quickly as technology evolves. What I would do is have the screen and all the buttons and controls plug into one flex cable attached to the processor/sensor assembly. This would be exactly the same on all variations of the camera, and the IO processor in this part of the camera would identify itself to the sensor/processor assembly so that it knows how to implement the UI. That way an M11-D with no screen would have the same electronics as an M11. The firmware would say: no screen, no need to render a preview, and read the ISO from the wheel on the back rather than from the knob where the rewind spool used to be.
    3) Better weather and dust sealing in the case.
    4) Replace the direct mechanical connection between the RF and the lens with an electronic one, or in some other way allow the camera to be adjusted in the field to keep the RF focus accurate.
    5) Make the vertical adjustment of the RF field serviceable.
    Those last three could leverage the network of Leica boutiques to offload much of the routine work currently done by the repair services in Allendale and Wetzlar.
    6) Work with Adobe to add encryption and error detection and correction to a new version of DNG, and implement that in the camera.
    7) Give the camera a USB-C port for charging in-camera and tethered shooting.
    8) Make the external charger use USB-C rather than AC.
    9) Provide some sort of haptic signal, or put stops on the shutter speed dial, so that you either can't go, or at least know when you go, from 1/4000 to B and vice versa.
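    On point 6, it's worth separating detection from correction. Detection can be approximated today with a sidecar checksum, which is roughly what an in-camera implementation would automate; true correction would require ECC data (e.g. Reed-Solomon parity) written into the format itself. A minimal detection-only sketch in Python, with a stand-in file rather than a real DNG:

```python
import hashlib
from pathlib import Path

def _sidecar(dng_path: Path) -> Path:
    """Path of the checksum file stored next to the image."""
    return dng_path.parent / (dng_path.name + ".sha256")

def write_sidecar_hash(dng_path: Path) -> str:
    """Store a SHA-256 of the file; detects (but cannot repair) bit rot."""
    digest = hashlib.sha256(dng_path.read_bytes()).hexdigest()
    _sidecar(dng_path).write_text(digest)
    return digest

def verify(dng_path: Path) -> bool:
    """Recompute the hash and compare it with the stored sidecar value."""
    current = hashlib.sha256(dng_path.read_bytes()).hexdigest()
    return current == _sidecar(dng_path).read_text()

# Demo: write a file, hash it, then simulate corruption.
p = Path("demo.dng")
p.write_bytes(b"stand-in for real DNG bytes")
write_sidecar_hash(p)
print(verify(p))   # True
p.write_bytes(b"bit-rotted bytes")
print(verify(p))   # False
```

    Baking this into DNG itself, as suggested above, would let any reader flag a corrupted raw file without needing a separate sidecar workflow.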
  22. Like
    bencoyote reacted to flyalf in Best used 90 for m240   
    I have two 90mm lenses: the f/4 macro (old) and the f/2 APO. I find the f/2 hard to handle; my fingers always fumble to find the focus ring since its diameter is smaller than the outer part, probably because I don't use it often. Both are really excellent in terms of IQ. The 90 macro is a great travel lens.
  23. Like
    bencoyote got a reaction from Archiver in Introduction & advise on switch Fuji -> Leica TL   
    I'd say I made a somewhat similar jump. I went from the Olympus E-M1 to the original Leica T. In summary, I'd say that they nailed it. They gave you what you need as a photographer in a very clean, simple user interface. In my opinion, this frees you to focus on the art of photography and composition and not on fiddling with the camera. I think that when it was released several people said something to the effect of: this is the camera that Apple would have designed if they made cameras. It really is that simple.
     
    Now here is the important thing to understand. There is a bit of a caveat in what I just said: "They gave you what you NEED as a photographer." It really is what you need and not a whit more. You are not going to find the whiz-bang fancy feature that makes your job as a photographer easier in one specific situation. I would say that the UIs of other cameras are cluttered up with hundreds of features that look good on marketing brochures. Whether these features are well implemented and well thought out really doesn't matter; other cameras have them, so they can be listed on the marketing materials and show up in reviews. The Leica TL has none of that. You have the bare necessities of what you need as a photographer and that is it.
     
    In my experience there is an implied understanding in that. Leica has a long history in photography, and for most of it cameras were primitive mechanical devices. Adding features was just not possible the way it is now with these modern computer-controlled digital recording devices we call cameras. Nevertheless, photographers managed to do some stunning things. They found tricks, techniques, and workarounds that made these primitive cameras capture amazing shots. Leica, with its long history and deep experience in photography, in essence assumes that you know those tricks. So all the special-purpose options and modes buried in way too many buttons, levers, and menus on other cameras, cluttering up the UI, just aren't there on the TL.
     
    The end result for me is a mantra that I developed when shooting with the T: "Be a better photographer." If I find myself pushing up against the capability of the camera, wishing that it would do just a little bit more, I say to myself, "be a better photographer". For example: do I need 80 frames per second burst mode with continuous tracking AF to capture just the right shot? No, I need to recognize and anticipate the decisive moment, be pre-focused, and hit the shutter at just the right time.
     
    So in the end, if you want a beautiful, uncluttered camera that focuses you on the art and practice of photography rather than on device operation, the TL is a good camera. If you want to do anything out of the mainstream of photography, then be prepared to go back and figure out how people did things before cameras were software-controlled computers. I kind of enjoy this kind of thing.
     
    All of that being said there are some things which I wish were different and some truly missing features in the Leica TL.
    1) If you are doing studio work, the fact that you can't use the EVF and the hot shoe to trigger flash at the same time can be a bit of a challenge. You can work around this by using the internal flash to trigger remote flashes, but there is also the problem that you can't turn off exposure simulation, so the camera thinks your image will be black when you dial in the settings for your studio lights.
    2) I'm not a big fan of the external EVF. I think it is really important to have, but I find that it makes the camera cumbersome to carry and put into a bag. I really feel like they need to make a version of the TL with the EVF built in like it is on the Q.
    3) I wish the TL were weather sealed to some extent. The place where the flash pops out always freaks me out.
    4) I hope they upgraded the TL's USB charging port from the old 2.5W (5V at 500mA) to 12W like many cell phones, so that it could charge faster. (Note: I haven't verified whether they have done this on the new TL.) Likewise, I wish that the external charger used 12W USB or USB-C rather than having a plug.
    5) There are a couple of long-standing bugs which never seem to get resolved. E.g. when using AF, pressing the shutter half way cancels the review of the previous shot, but when you are using MF you just have to wait. This makes taking a rapid succession of shots in MF, as a sort of manual drive mode, impossible.
  24. Like
    bencoyote got a reaction from Kyros Moutsouris in Introduction & advise on switch Fuji -> Leica TL   
  25. Like
    bencoyote reacted to pico in Encryption for professional cameras.   
    Cop: "Let's see what you have in your camera."
    Photog: "Film"
    Cop: "WTF is film?"