MATLAB Programming/Psychtoolbox/How to achieve 10-bit color resolution on OS X

From Wikibooks, open books for an open world

It may be of interest to some in the group that I've achieved and measured 10-bit color resolution with the OSX PTB 1.0.6.

BACKGROUND

At issue is whether 10-bit color resolution can be achieved with the OSX PTB 1.0.6 and video cards sporting 10-bit DACs. Even if the hardware supports 10-bit depth, it has remained unclear whether the software pipeline (video card drivers, OpenGL, PTB, etc.) would truncate double-precision Matlab pixel values at the 8-bit (256 levels) or 10-bit (1024 levels) stage. Although 8-bit artifacts can sometimes be detected by visually inspecting the display, the best test is to take a photometer to the screen and make measurements. [NOTE: an even easier way is to send the video card outputs through a splitter and look at the voltages produced on an oscilloscope - ABM].

HARDWARE

  • Power Mac G5 Dual 2.7 GHz
  • 4 GB DDR SDRAM
  • ATI Radeon® X800 XT Mac Edition (has 10-bit DACs)
  • Iiyama HM204DT "Vision Master 514" 22" color monitor
  • Minolta LS-110 hand-held photometer + tripod

SOFTWARE

  • OSX 10.4.5
  • Matlab 7.0.4 for Mac
  • OSX Psychtoolbox 1.0.6

IMAGE-MATRIX VALUES VS. LUT VALUES

Critically, as outlined by Mario Kleiner, 10-bit values with the PTB 1.0.6 can only be obtained by manipulating *values* inside the look-up table (LUT). To illustrate this point, suppose you want to display a 50%-contrast grating. The wrong way to do it is to define a 50%-contrast image matrix in Matlab:

127.5 * (1 + 0.5 * grating)

and display that image using a normalized LUT that spans the full 0.0 to 1.0 range of values. This inevitably results in less-than-10-bit quantization.
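To see why, consider the following sketch (variable names are illustrative, not from the original post): the image matrix is quantized to 8 bits in the framebuffer, so a 50%-contrast grating collapses onto roughly 128 distinct levels before the LUT is ever consulted, and no LUT remapping can reintroduce the missing intermediate values.

```matlab
% Sketch: count the distinct framebuffer levels a 50%-contrast
% image matrix can occupy (at most ~128 of the 256 available).
x = linspace(0, 2*pi, 1024);
grating = sin(x);                           % full-range grating, -1 to 1
img = uint8(127.5 * (1 + 0.5 * grating));   % values quantized to 8 bits
nLevels = numel(unique(img));               % roughly 128 distinct levels
```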

The right way (or at least the 10-bit way) to do it is to produce a nominally full-contrast grating:

127.5 * (1 + 1.0 * grating)

in the Matlab image matrix but display the image using a normalized LUT whose values only span the 0.25 to 0.75 range [NOTE: this would enable access to a range defined by 9 bits ... for a 10-bit range, the limits are 0.375 to 0.625 - ABM]. Unfortunately, LUTs in the PTB have only 256 slots, so you must choose 256 values out of the 1024 available. This limitation can be problematic in cases where you need to display, say, a very low-contrast pattern next to a very high-contrast one.
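Putting the two pieces together, here is a minimal sketch of the 10-bit approach (names are illustrative; `windowPtr` is assumed to come from a prior `Screen('OpenWindow', ...)` call):

```matlab
% Full-contrast grating in the image matrix: uses all 256 8-bit levels.
x = linspace(0, 2*pi, 1024);
img = uint8(127.5 * (1 + 1.0 * sin(x)));

% Normalized LUT spanning only 0.375 to 0.625: its 256 slots fall on 256
% consecutive 10-bit DAC steps, producing an effective 50% contrast.
lut = repmat(linspace(0.375, 0.625, 256)', 1, 3);
Screen('LoadNormalizedGammaTable', windowPtr, lut);
```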

MATLAB SCRIPT

Here's the Matlab script I've used to measure luminance:

% --------------------------------------
function Test10Bits

     p.screenNumber = max(Screen('Screens'));
     AssertOpenGL;
     Screen('Preference', 'SkipSyncTests', 1);

     % Flat table: 256 slots, all three channels set to the same value.
     masterGammaTable = ones(256, 3);

     windowPtr = Screen('OpenWindow', p.screenNumber, 255, [], 32, 2);
     Screen('LoadNormalizedGammaTable', windowPtr, masterGammaTable);

     Screen('FillRect', windowPtr, 0);
     Screen('Flip', windowPtr);

     % Scale the whole LUT by the requested normalized voltage.
     % Press Return on an empty line to quit.
     while true
         volts = input('Normalized Voltage Value (0-1): ');
         if isempty(volts)
             break;
         end
         thisGammaTable = volts * masterGammaTable;
         Screen('LoadNormalizedGammaTable', windowPtr, thisGammaTable);
     end

     Screen('CloseAll');

end
% --------------------------------------

Valid entries in a normalized LUT are bounded between 0.0 and 1.0, inclusive. In an 8-bit system, the 0.0 to 1.0 range is divided into 256 possible values, each separated from the previous one by an increment of about 0.00392. In a 10-bit system, the range is divided into 1024 values, with increments of about 0.00098. Therefore, if you step through the 0.0 to 1.0 range in increments of 0.001, a 10-bit system will increase luminance on every step, whereas an 8-bit system will increase luminance only on every fourth (sometimes third) step.
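This stepping behaviour can be checked numerically, independently of the Psychtoolbox (a sketch of the arithmetic, not a measurement):

```matlab
% Step a normalized LUT value from 0 to 1 in 0.001 increments and count
% how often the quantized DAC code actually changes.
v = 0:0.001:1;
changes8  = nnz(diff(round(v * 255)));    % 255 changes: every 3rd-4th step
changes10 = nnz(diff(round(v * 1023)));   % 1000 changes: every step
```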

PHOTOMETER MEASUREMENTS

Measurement precision varies depending on photometer setup. In my case, it was important to put the photometer on a tripod (rather than using it handheld) and set it to a long integration period to average out the noise and get reliable readings. Photometer precision also varies with the range of luminances being measured (e.g. my photometer drops from two decimal places to one when it goes above 100 cd/m^2). Long story short, continuous photometer readings jumped noticeably on every 0.001 LUT increment I tested, which indicates that my display has 10-bit precision.

Hope this helps.

Cheers

Stéphane