Deliberate1 Posted August 29, 2017 #1

Friends, I am working with Puget Computers to configure a new box. As we went through the components, we came to the video card, and the rep said that I should go with a 10-bit card if my monitor supports it. He indicated that it can create a more nuanced color space. I then read that a 10-bit card can reduce banding in certain circumstances. I contacted NEC and was told that my pro-level monitor does support 10-bit processing. Consequently, my question is whether whatever subtle differences I might detect on a 10-bit monitor with a 10-bit video card will be observable in a print, and if so, under what circumstances. Or is the benefit only detectable, if at all, on the monitor?

Obliged,
David
Topic: 8 bit vs. 10 bit video card - observable differences?
Jeff S Posted August 29, 2017 #2

Any link in the print workflow chain, including camera/lens, operating system, screen, editing software, printer, paper/profile, inks, display lighting, and more, can potentially affect print results. And then there's the most important link: the user's eye, judgment, and skill. Weston made great prints using a light bulb. Others struggle with the most expensive gear. A couple of bits, fine. But let's not lose the forest for the trees.

Jeff
jrp Posted August 29, 2017 #3

Well, if you've got it you may as well use it, but the differences will be very subtle. Can you tell the difference between 8-bit JPEGs and 16-bit TIFFs on your set-up?
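jrp's question can be made concrete with a quick sketch. The practical difference between 8-bit and 16-bit files shows up mainly when a file is pushed hard in editing; the numbers below (a 3-stop darken/brighten round trip) are purely illustrative, not anyone's actual workflow:

```python
import numpy as np

# Hypothetical illustration: a smooth ramp of tones stored at 8-bit vs 16-bit,
# then pushed hard in post (darkened 3 stops, saved, re-brightened 3 stops).
ramp = np.linspace(0.0, 1.0, 4096)  # idealized continuous tones

def round_trip(values, levels):
    # quantize to the file's bit depth
    q = np.round(values * (levels - 1)) / (levels - 1)
    # darken by 3 stops (divide by 8) and re-quantize, as saving would
    q = np.round((q / 8) * (levels - 1)) / (levels - 1)
    # brighten back up by 3 stops
    return np.clip(q * 8, 0, 1)

lev8 = len(np.unique(round_trip(ramp, 2**8)))
lev16 = len(np.unique(round_trip(ramp, 2**16)))
print(lev8, lev16)  # far fewer distinct tones survive the 8-bit round trip
```

On a straight print with no heavy edits the two files can look identical; the extra bits are headroom, not visible detail.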
Deliberate1 Posted August 29, 2017 Author #4

Gents, obliged for your comments. They track the "real world" conversation I had with the tech guy. To your point, jrp, no, I have not been able to see the difference between 8- and 16-bit images, though I also understand that if pushed hard, a 16-bit TIFF file is going to have more post-processing malleability than an 8-bit JPEG, just as the file from my S 006 will be more robust than one from my M9. That said, I have found that the greatest influence on the appearance of a print, no matter my fanatical tweaking, is the piece of glass that goes on top of it, and the lighting of the exhibition space. Frankly, I struggle under those circumstances to discern the difference between paper finishes, or manufacturers for that matter.

All that said, I have made some images where banding is discernible and gross, especially in black skies in the area around a bright moon, and blue skies around the sun or other bright light sources. If having a 10-bit video card would enhance the transitions under those extreme circumstances and diminish or eliminate banding, then I would feel the additional cost of the 10-bit card to be justified.

David
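The banding David describes around bright light sources is, on the display side, a quantization effect: a gentle gradient spans only a handful of output levels, and each level becomes a visible band. A minimal sketch (the tone values are assumed for illustration, not taken from any actual image) of how many distinct steps each bit depth can place across a subtle night-sky gradient:

```python
import numpy as np

# Hypothetical dark-sky gradient: tones between 2% and 6% of full brightness,
# the kind of near-black transition where banding is most visible.
sky = np.linspace(0.02, 0.06, 2000)

def distinct_tones(values, bits):
    # number of distinct output levels the gradient lands on at this bit depth
    levels = 2**bits - 1
    return len(np.unique(np.round(values * levels)))

print(distinct_tones(sky, 8))   # only 11 steps across the whole gradient
print(distinct_tones(sky, 10))  # 42 steps, so each band is far narrower
```

Four times as many levels per channel means each band covers a quarter of the tonal distance, which is why a full 10-bit pipeline (card, cable, monitor, and software) can smooth exactly this kind of transition. Whether that survives into a print depends on the printer driver's own bit depth and dithering, which this sketch says nothing about.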
Jeff S Posted August 29, 2017 #5

I stock several different glass types, including museum glass when the situation warrants. A print viewing booth can be helpful for previewing prints under display lighting conditions. ImagePrint can also be useful, since it includes custom profiles for most papers and for different lighting conditions, and has a constant soft-proof mode. Many people underestimate the degree to which display conditions matter. All the "bits" in the world can get lost in poor display conditions.

Jeff
Deliberate1 Posted August 29, 2017 Author #6

"Many people underestimate the degree to which display conditions matter. All the 'bits' in the world can get lost in poor display conditions." - Jeff

Bingo.
Ko.Fe. Posted August 29, 2017 #7

NVIDIA has supported 10-bit per channel since 2011: http://nvidia.custhelp.com/app/answers/detail/a_id/3011/~/10-bit-per-color-support-on-nvidia-geforce-gpus

For example, the NVIDIA Quadro K420 supports 10-bit per color channel, and where I am it costs well under $200. IMO, it is better to have a card where you could enable 10-bit instead of 8-bit per channel, for future use. You never know what could happen in the next couple of years. All of a sudden the OS could switch to 10-bit. Just because. It is Microsoft, after all.
Archived
This topic is now archived and is closed to further replies.