Depends on the camera's flash-sync speed - approximately. The on-off behavior of LEDs becomes noticeable as banding at about 1/360th second with my M10 (metal-shutter sync speed 1/180th second). So I make sure the exposure time is longer than that (1/250th, 1/180th, 1/125th, 1/60th, or 1/30th).

With a fully-electronic-readout "shutter," that may be different, and more related to the speed where "exposure/read-time distortion" sets in - such as getting a stretched and distorted picture of a fast car going past. As famously demonstrated by Jacques-Henri Lartigue in 1912: his focal-plane shutter, moving vertically during the exposure, "read out" the wheels first, and then progressively later and later the top of the car and the people, as the car moved past. So they are sloped.

https://www.sothebys.com/en/buy/auction/2021/classic-photographs/automobile-delage-grand-prix-de-lautomobile-club

Now, just imagine the daylight had also flickered on and off a few times during that exposure. There would be exposure "bands" of bright and dark, at different heights, as well as the sloping car.

Data I find for the FPL is that it takes 21.7 milliseconds for the sensor to read out entirely, equating to about 1/46th of a second to capture the whole picture (longer than the M10's 1/180th or 1/360th of a second). So yes, that LONGER read-out time should show less light-flicker banding - but probably more fast-moving-subject distortion (like Lartigue's).
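For anyone who wants to put rough numbers on that, here is a minimal sketch of the arithmetic. The 21.7 ms readout figure is the one quoted above; the 120 Hz flicker rate is my own assumption (US mains-driven LED lighting), not something from the FPL specifications.

```python
# Rough arithmetic for the flicker-banding discussion above.
# Assumption: LED lighting flickering at twice the 60 Hz US mains frequency.
flicker_hz = 120
readout_s = 0.0217          # ~21.7 ms full-sensor readout quoted for the FPL

print(f"full readout is about 1/{1 / readout_s:.0f} s")        # -> about 1/46 s
print(f"{readout_s * flicker_hz:.1f} flicker cycles during one readout")

# The more flicker cycles that fit inside the exposure time, the more the
# bright and dark phases average out, and the fainter the banding.
for shutter_s in (1/360, 1/250, 1/180, 1/125, 1/60, 1/30):
    cycles = shutter_s * flicker_hz
    print(f"1/{round(1 / shutter_s)} s exposure -> {cycles:.2f} flicker cycles")
```

Run it and 1/360th covers only about a third of a flicker cycle (hence visible bands), while 1/30th spans four full cycles and largely evens out.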
-
Common newbie mistake - the Fuji X-Pro is NOT a rangefinder camera. Having a viewing window does not make a camera a rangefinder camera, any more than having windows in one's house turns it into a "rangefinder house." 🤪 Here are two more cameras with window viewing (but no rangefinders), just like the Fuji - from opposite ends of the price spectrum. They are not "rangefinder" cameras either.
-
It is unlikely to be an equipment failure. It is a natural, somewhat common, and expected effect in digital photography - so common that most digital photography post-processing software includes (at some expense to the software maker) a control intended exactly to get rid of such color fringing. It can happen any time with any lens/camera/sensor (except the Leica Monochrom-only cameras, obviously 😉 ).

This is from Adobe's Camera Raw, but most other editing programs will have a similar set of controls.

jaapv suggests "blooming" - I suspect color aliasing (a feature of even healthy Bayer-pattern sensors used with extra-sharp lenses: http://www.tedfelix.com/ColorAliasing/index.html ) and/or chromatic aberrations (no such thing as a perfect lens, even for Leicas). Localized overexposure (such as these sunlit and shiny hairs, compared to the rest of the dog's black fur) can emphasize it.
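To illustrate why a Bayer sensor can invent color on a perfectly neutral subject, here is a toy sketch. The stripe pattern, sensor size, and RGGB layout are invented purely for the example - no real raw data involved:

```python
import numpy as np

# Scene: neutral black/white vertical stripes at exactly the pixel pitch --
# finer detail than any single colour channel of the mosaic can resolve.
h, w = 8, 8
scene = np.repeat(np.tile([1.0, 0.0], w // 2)[None, :], h, axis=0)

# RGGB Bayer layout: each photosite records only one colour channel, but
# since the scene is pure luminance, each site records the scene value.
r_mask = np.zeros((h, w), bool); r_mask[0::2, 0::2] = True
b_mask = np.zeros((h, w), bool); b_mask[1::2, 1::2] = True
g_mask = ~(r_mask | b_mask)

print("average R seen:", scene[r_mask].mean())  # 1.0 -- red sites hit white columns
print("average G seen:", scene[g_mask].mean())  # 0.5
print("average B seen:", scene[b_mask].mean())  # 0.0 -- blue sites hit black columns
# A neutral subject demosaics into strong red/blue tinting along the detail:
# colour aliasing, with nothing wrong with the camera at all.
```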
-
⚠️ Wednesday (September 24th): Server Migration ⚠️
adan replied to LUF Admin's topic in About the Leica Forum
(nevermind) -
⚠️ Wednesday (September 24th): Server Migration ⚠️
adan replied to LUF Admin's topic in About the Leica Forum
Hmmm - the new server is failing to accept posts. It records them, but does not add them to a thread. ... Never mind - it also seems to be a caching issue; refreshing the page permitted the posts. -
Since we seem to be into orange cats - Tut's little buddy Cali (for Calico). Inherited from my mother. (M10, 50mm Summicron v.3)
-
Hey, at least they didn't set fire to it, like Jimi. 😜
-
⚠️ Wednesday (September 24th): Server Migration ⚠️
adan replied to LUF Admin's topic in About the Leica Forum
👍 -
Possibly. I think the problem is that the Q3 sensor (from Sony) uses phase-detect autofocus rather than contrast-detect - which means it has paired phase-detect pixels right on the image sensor itself (and the firmware is set up for those). But the focus-sensing pixels cannot also produce image data (they are feeding their output to the AF system), so they result in two-pixel "dead-pixel pinholes" in the photograph, at each of the focus points.

https://blog.reikanfocal.com/2023/05/how-it-works-on-sensor-phase-detect-autofocus/

Fortunately the pinholes get blurred out of existence by the demosaicing/debayerizing processing of COLOR sensor output, which is how we get full-color images out of only red, green, and blue-filtered pixels anyway.

https://en.wikipedia.org/wiki/Demosaicing

But a key selling point of the Monochrom cameras is that they gain extra practical resolution precisely because they do not have a Bayer-pattern filter array, and do not demosaic and blur neighboring pixels together. So either Leica gets a sharp true-Monochrom image from that sensor - but with intentionally "dead" pixels scattered across the image. Or Leica blurs out the "dead" AF pixels with otherwise unnecessary demosaicing - which blurs the whole image a bit. The result being it is no sharper than the same image simply made with the color Q3 and de-saturated.

I suppose one could argue that a little blurring, or a handful of "dead" pixels intentionally supplied right from the factory, with 60 Mpixels to start with, is acceptable. But I'm not sure the dedicated Monochrom users would buy that argument.
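As an illustration of what "patching over" those AF photosites on a monochrome sensor amounts to, here is a minimal sketch. The sensor size, PDAF layout, and simple 3x3 neighbour averaging are all my own stand-ins for the example, not Leica's or Sony's actual correction:

```python
import numpy as np

# Fake monochrome raw frame with a handful of "PDAF" photosites that carry
# no image data (positions are invented for the example).
rng = np.random.default_rng(0)
raw = rng.uniform(0.2, 0.8, size=(12, 12))
pdaf = np.zeros_like(raw, dtype=bool)
pdaf[np.ix_(np.arange(2, 12, 4), np.arange(3, 12, 4))] = True
raw[pdaf] = 0.0                          # the "dead-pixel pinholes"

# Fill each PDAF site with the average of its valid neighbours,
# the way dead-pixel correction / demosaicing-style interpolation would.
filled = raw.copy()
for r, c in zip(*np.nonzero(pdaf)):
    r0, r1 = max(r - 1, 0), min(r + 2, raw.shape[0])
    c0, c1 = max(c - 1, 0), min(c + 2, raw.shape[1])
    window, ok = raw[r0:r1, c0:c1], ~pdaf[r0:r1, c0:c1]
    filled[r, c] = window[ok].mean()

# The holes are gone, but every patched pixel is now a local average --
# i.e. slightly blurred -- which is exactly the trade-off described above.
```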
-
I expect if you asked Stefan Daniel* that question, he would respond, in words he has used before, that the Q2 (M or otherwise) is "a finished product." Meaning: "We are not designing Q2s of any kind anymore, let alone spending money re-engineering them for a different lens (including rewriting the firmware and changing other electronics to match) and a different back construction. We have moved on."

It's a bit like asking why, once Leica had the technology for an M9 Monochrom, they did not go backwards and produce an M8 Monochrom. Once the M9 was introduced, the M8 was also "a finished product." The Leica CEO around the time of the M8 (Steven Lee) was fired for, among other things, suggesting that the M8 would be perpetually upgradeable. That is just not how digital camera technology works.

______________
* long-time Leica product manager and now executive VP for technology and operations.
-
Well, there have always been adapters (by Leica and others) to mount M lenses on the SL series. An SL user can certainly ask about them. But in the M line, there is only one APO-90mm-Summicron-M ASPH - introduced 1998, still in production, and still the only one.

https://wiki.l-camera-forum.com/leica-wiki.en/index.php/90mm_f/2_ASPH_Apo-Summicron-M

The other 90mm Summicrons for M (1957, 1980) are neither ASPH nor APO.

https://wiki.l-camera-forum.com/leica-wiki.en/index.php/Summicron_(I)_f%3D_9_cm_1:2
https://wiki.l-camera-forum.com/leica-wiki.en/index.php/90mm_f/2_Summicron-M_III
-
does anyone know what this is at the bottom of my sensor?
adan replied to ClemFandango's topic in Leica M9 / M-E
My guess (going by the color) is that a "mental midget" 🤪 previous owner was trying to push loose some recalcitrant edge-of-sensor dust with a blue/cyan plastic toothpick or similar, and broke off the tip in the process.

https://us.amazon.com/Plastic-Cocktail-Toothpicks-Soodhalter-Swords/dp/B01CCERPKY
https://www.walmart.com/ip/JUNTEX-200Pcs-Food-Grade-Plastic-Toothpick-Travel-Portable-Double-Head-Dental-Oral-Sticks-Interdental-Floss-Cleaners-Tartar-Removal-Picks-Brush-Stora/1005833535

My personal solution, if I wanted to remove this shard from a camera I owned, would be to slide a thin-but-fairly-firm piece of paper (i.e. typing or document paper, NOT tissue paper) between the shard and the sensor glass, and then use tweezers to grab and pull the shard loose - the piece of paper serving as a backstop to prevent any metal-to-glass contact. Just make sure the camera battery is fresh, so that the shutter won't close unexpectedly. -
Bottom Line: Leitz/Leica's reputation for "precision" was based on their engineering skills, not on their record-keeping skills (which at times amounted to hand-written records in ledgers.) Especially as translated by amateurish collectors and "list-makers," often years after the fact.
-
In the USA, the trademark "M12™ camera" is already in use. Since it is a digital camera, it probably is also a "Computer & Software Product & Electrical & Scientific Product." https://www.milwaukeetool.com/Products/2324-21 Therefore, that "trademark space" is already taken, for use in the USA - can't have two functionally-similar products trying to use the same trademark (that is kind of the whole idea of trademarks!) More commentary on the "EVF M rumored" thread's recent pages.
-
Do the mechanical moving parts of the RF mechanism have a built-in electronic function or capability? No. Neither do the purely mechanical movements of, for example, the M10 baseplate being removed or replaced, or the M10/M11 ISO dial being rotated. Yet both of those are detected simply by adding tiny magnets and magnetic-field detectors/encoders that produce an electromagnetic signal - informing the CPU that the baseplate has been removed (so it can display a warning on the rear screen), or that the camera's ISO dial has been rotated to position x, so the CPU can adjust the exposure and metering accordingly.

I expect, as a resident of Dorset, you share the UK fascination with railways. The very same overall concept of a purely mechanical event (a train entering a certain "block" or section of the rail route) automatically triggering an electrical or electronic response (powered semaphores, warning lights, clanging bells, and gates all reacting automatically, up to miles away) has been used by railways for nearly a century.

Just glue or screw a micromagnet onto one of the moving levers (or even the parallax-correcting moving framelines) of the RF mechanism at some position, and a magnetic-field detector (or "encoder") onto the non-moving camera body right beside or above that section of the lever.

https://audemars.com/micro-magnets-manufacturing/
https://www.thethriftybot.com/products/thrifty-absolute-magnetic-encoder

If the RF mechanism moves even slightly, the magnetic field registers as weaker or stronger at the detector/encoder location. The detector sends a "Hey, the lens is being focused" signal to the camera's CPU - which can be a very simple binary movement/no-movement, on/off, 1-or-0 signal. The CPU checks that electronic viewing is actually in use (maybe), and then sends a signal to the display system to "magnify and zoom in on the center of the electronic viewing image, for 5 or 10 seconds." All "at the speed of light," more or less.
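In firmware terms, the whole chain could be as simple as the sketch below - a hypothetical polling loop, with read_encoder and set_evf_magnification as made-up stand-ins for whatever the real camera firmware would use, and the threshold and timing values invented for the example:

```python
import time

FOCUS_THRESHOLD = 5        # invented: minimum change in raw encoder counts
ZOOM_HOLD_SECONDS = 5      # zoom stays on this long after the last movement

def focus_assist_loop(read_encoder, set_evf_magnification):
    """Hypothetical sketch of the logic described above.

    read_encoder() returns the current reading from the magnetic-field
    detector next to the RF lever; set_evf_magnification() tells the
    display system whether to zoom.  Both are stand-ins, not real APIs.
    """
    last = read_encoder()
    zoom_until = 0.0
    while True:
        now = time.monotonic()
        reading = read_encoder()
        if abs(reading - last) > FOCUS_THRESHOLD:
            # RF lever moved -> "Hey, the lens is being focused"
            zoom_until = now + ZOOM_HOLD_SECONDS
        last = reading
        set_evf_magnification(now < zoom_until)
        time.sleep(0.01)   # poll roughly 100 times per second
```

The point is only that a binary "movement / no movement" signal, plus a timer, is all the electronics the idea requires.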