My hardware setup is:
- Asus P8Z77-V LE motherboard
- Intel i7-3770
- nVidia GTX 660 Ti graphics card -> display port / hdmi
- -or-
- sometimes Intel 4000 graphics -> display port
- Dell U2713H monitor
(To see this, just make a new image in an image editor and fill it with a gradient over a fairly restricted range of grays, such as 55 - 65. I was surprised when I first tried this by just how obvious the banding is.)
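Any image editor works for this, but the test image can also be generated programmatically. A minimal Python sketch (no libraries needed; it writes a binary PPM, and the filename is just an example):

```python
# Write a horizontal gradient from gray 55 to gray 65 as a binary PPM.
# At 8 bits/channel this range contains only 11 distinct levels, so the
# bands are clearly visible; 10 bits/channel would give ~4x as many steps.
width, height = 1024, 256

row = bytearray()
for x in range(width):
    g = 55 + round(x * 10 / (width - 1))   # ramp 55..65 inclusive
    row += bytes((g, g, g))

with open("gradient.ppm", "wb") as f:
    f.write(b"P6\n%d %d\n255\n" % (width, height))
    f.write(bytes(row) * height)

levels = sorted(set(row[::3]))
print(levels)   # only 11 distinct gray levels across the whole image
```

Eleven levels spread across 1024 pixels means each band is roughly 90 pixels wide, which is why the steps are so hard to miss.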
So I started looking around to see if I could get at the 10 bits / channel that my monitor is capable of.
At first it appeared that the only way to do this was to buy a professional workstation graphics card, but these are dramatically more expensive than their consumer equivalents. Then I found that the nVidia Linux drivers are fully 10 bit capable (usually referred to as 30 bit color) on all relatively recent cards!
After a couple of false starts I set up an xorg.conf file: (this includes the fix to enable 2560 x 1440 -> Option "ModeValidation" "NoMaxPClkCheck")
Section "Screen"
    Identifier "Screen0"
    Device     "Card0"
    Monitor    "Monitor0"
    Option     "ModeValidation" "NoMaxPClkCheck"
    DefaultDepth 30
    SubSection "Display"
        Viewport 0 0
        Depth 30
    EndSubSection
EndSection

And now nvidia-settings reports:

X screen 0
Depth: 30
Here is the relevant line from Xorg.0.log:

[     4.600] (**) NVIDIA(0): Depth 30, (--) framebuffer bpp 32

However there are now some weird artifacts on the screen, so it looks like someone is still outputting 8 bit RGBA.
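One plausible explanation for artifacts like these (a sketch, not a diagnosis of this particular setup): a depth-30 framebuffer at 32 bpp typically packs pixels as 2 spare bits plus three 10-bit channels, so any client that still packs its pixels assuming 8-bit ARGB gets its channels reinterpreted. A toy Python illustration of the mismatch (the packing layouts are assumptions for the example):

```python
# An app assuming 8-bit channels might pack a pixel as A8R8G8B8;
# a 30-bit visual stores the same 32 bits as 2:10:10:10.  Unpacking
# one layout with the other scrambles the channels.

def pack_a8r8g8b8(r, g, b, a=255):
    """Pack 8-bit channels into one common 32-bit layout."""
    return (a << 24) | (r << 16) | (g << 8) | b

def unpack_x2r10g10b10(word):
    """Unpack the same 32 bits as 2 pad bits + three 10-bit channels."""
    r = (word >> 20) & 0x3FF
    g = (word >> 10) & 0x3FF
    b = word & 0x3FF
    return r, g, b

# A mid-gray written by a client assuming 8-bit channels...
word = pack_a8r8g8b8(128, 128, 128)
# ...read back as a 10-bit pixel: nowhere near gray.
print(unpack_x2r10g10b10(word))   # (1016, 32, 128)
```

Mid-gray comes out as near-maximum red with very little green, which is the kind of garish, wrong-colored output one would expect from an 8-bit client on a 30-bit visual.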