
Looking good: Monitor calibration under X

By Nathan Willis on February 08, 2005 (8:00:00 AM)


A lot of Linux users seem resigned to the notion that the X Window System is a second-class citizen in the calibration world. They couldn't be more wrong. Just as Linux allows you the flexibility to hand-tune your kernel configuration and optimize your disk drive performance to the manufacturer's limits, you can calibrate your monitor with enough precision to satisfy the pros. Here's how.

Looks are everything

To begin with, understand that all display calibration has one goal: appearing correct to your eye. An image -- whether it's a JPEG file or a live video feed -- is a big matrix of pixel values. The job of the computer monitor is to map those pixels to glowing dots that look right to you. That means, for one thing, that if your monitor resides in a brightly lit neon lamp showroom, calibrating it will result in different settings than the same model used in an unlit basement.

That said, calibration isn't completely individualistic either. Basically, a calibrated display should map an absolute black pixel to the blackest color that it can produce, an absolute white pixel to the whitest color that it can produce, and smoothly scale the shades in between. This task is complicated slightly by the fact that not everybody agrees on what color white is. It is further complicated by the fact that CRTs and LCDs don't generate a linear increase in brightness from a linear increase in voltage, so some math is required to make them paint their signals to the screen correctly.

But calibration really begins before you even touch the monitor. Remember the neon-light showroom? However well you calibrate that monitor, it would look better if you improved the viewing conditions; red and blue ambient light bouncing off the glass interferes with your vision. If you have control over your computing environment, reducing screen glare and using white light bulbs will give you an improvement. I don't generally recommend replacing your lights with special daylight-balanced bulbs, but doing so would help if you must color-match images on your display with other items, such as printouts.

One last word: calibration will not make a bad monitor look like a good one. Calibration means getting optimal performance from a piece of hardware. Finding the optimal hardware is a different issue, and some monitors are just better than others. Almost every monitor has built-in hardware brightness and contrast controls, and most have color temperature control as well. If your hardware doesn't have these controls or they are malfunctioning, calibration won't make up the difference.

Step One: Learning colors with Lord Kelvin

In order to have a good, objective definition of color, physicists quantify it by something called color temperature -- the temperature to which you would have to heat a black-body radiator for it to glow in a particular shade.

The general standard for normal daylight is 6,500K (kelvins). Hopefully you have a hardware "color" control on your monitor. Most monitors have at least two presets, 5,000K and 9,300K, which translate respectively to mustard yellow and an irritating blue. If you're lucky enough to have a 6,500K preset, that is the one you want. If you don't, check your monitor's manual and see what your options are. In most cases you can adjust a slider to any value in between the presets. Fire up your calculator and find the correct position. If you don't have that kind of control, you can eyeball the color temperature setting by switching off all artificial lights in the room and comparing the display's white to sunlight hitting a white piece of paper.
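If your slider runs between the two presets, a quick shell calculation gives you a starting position -- assuming, and it is only an assumption, that the slider interpolates linearly between 5,000K and 9,300K:

	$ echo "scale=3; (6500 - 5000) / (9300 - 5000)" | bc
	.348

In that case, 6,500K sits roughly 35% of the way up from the 5,000K preset.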

If you have no idea what the presets are or what the color control does and you have no outside windows, you could try the same with a white LED light, but beware that "white" LEDs are not standardized. But honestly, if your situation is that dire, your time is better spent writing angry letters to your monitor's manufacturer.

Step Two: Black is black, white is white

Next we set the minimum and maximum values the monitor can produce to absolute black and absolute white, respectively. The front of your monitor has buttons or dials labeled Contrast and Brightness. Here's the first of many potential surprises: these names are firmly entrenched in monitor and television manufacturing, but they are incorrect.

The control labeled Contrast actually controls the brightness of the display. The control labeled Brightness controls the black level. I will refer to them as "the Contrast control" and "the Brightness control" rather than just contrast and brightness, so as not to make this more confusing than it already is.

We are interested in getting the brightest white our display can produce, so set the Contrast control to 100%, or whatever its maximum is labeled.

We are also interested in getting the darkest black our display can produce, so we will adjust that with the Brightness control. Turn off all room lighting, and you will see that the "black" produced by the monitor is actually glowing ever so slightly -- compare a black image to the unlit glass at the edges of your display.

To set the correct black level for your display, download Norman Koren's combined gamma/black-level chart and open it in an image viewer. Norman Koren's site is the best resource for detailed information on monitor calibration, and he provides these charts free of charge. The left-hand column is a graduated gamma scale; the columns on the right are black bars of increasing luminance.

Lower the Brightness control on your display, then raise it bit by bit until you can just see black bar "A" against the background, level with the number 2.2 on the gamma scale. When you have done this, you should be able to see black bar "B" much more clearly. If you can't, don't worry; you will come back to double-check this step after setting the gamma correction in the next step.

Step Three: Gray is gray. Sort of.

We have now affixed the black and white "endpoints" of the monitor's tonal scale where they belong, so all that's left is to fit a straight line between them. Unfortunately, reality throws us another curve at this point -- a power curve. The trouble is that a CRT does not produce a 50% luminance pixel when given a 50% voltage. The actual luminance is generated according to a power function, approximately

Luminance = C * value^gamma + B

Here C is the brightness as determined by the Contrast control, value is the pixel value normalized to the [0,1] interval, and B is the black level as set by the Brightness control. We already maximized C and minimized B. The gamma exponent is the property that most directly controls the slope of the resulting function: what we want is a gamma that gives us a nice smooth gradient over the grays between black and white.
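To see what this curve does in practice, take a 50% gray pixel and a gamma of 2.5 (a typical CRT value, as the next paragraph explains), setting C to 1 and B to 0 for simplicity. bc's math library can evaluate the power function through its exp and log functions:

	$ echo "scale=4; e(2.5 * l(0.5))" | bc -l
	.1767

A pixel that should appear half as bright as white comes out at just under 18% luminance, which is why an uncorrected display crushes the midtones into shadow.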

On a modern CRT display, the native hardware gamma is approximately 2.5. This is a property of the hardware, and you cannot change it. The sRGB colorspace, an abstract color model for computer displays, was designed around its creators' finding that a gamma of 2.2 best complements the human eye's response to light (Macs, for historical reasons, do not adhere to this standard, but you need to do the right thing, regardless of Steve Jobs' example). So to get a computer monitor to show your eye all the grays that it can see, you will have to apply a software gamma correction that results in an overall gamma of 2.2.

Mathematica tells me 2.5 divided by 2.2 is around 1.136. Under ideal circumstances, this is the gamma correction factor that we would set with our software. Open Koren's combined gamma/black-level test image again and look at the gamma column, either squinting, blurring your eyes, or standing back from the screen. Find the point on the chart where the horizontal lines blend in perfectly with the gray background; the number on the scale is your system's current gamma. If it's 2.2, you are done.

But it's probably not. Open a terminal window so that you can see both it and the chart. Type the command xgamma -gamma 1.136. Your display should change ever so slightly. Look for the new gamma on the chart, and re-run the xgamma command, tweaking the value until you match 2.2 on the chart. These gamma charts work on a simple principle: at the right gamma, a 50% gray image will have the same average luminance as a test pattern composed of 50% black and 50% white pixels. Koren's images are more complicated than the classic "box within a box" model because they are scaled to show you a range of possible gamma values.
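A tuning session, then, is a few rounds of guess-and-check. The intermediate value below is illustrative; let the chart, not the arithmetic, have the final word:

	$ echo "scale=3; 2.5 / 2.2" | bc    # the ideal-world starting point
	1.136
	$ xgamma -gamma 1.136
	$ xgamma -gamma 1.15               # the chart still read above 2.2, so raise the correction
	$ xgamma                           # with no arguments, xgamma reports the current setting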

The xgamma command -- a component of the standard XFree86/X.org X server distributions -- takes a gamma correction value as its argument, which is why we started with 1.136. You can specify different gamma correction values for the red, green, and blue signals with the -rgamma, -ggamma, and -bgamma arguments respectively, precise to three decimals. To decide if you need to tweak these gamma correction values separately, look for aberrant coloration in neutral grays. Any will do, but Koren provides a nice multi-value chart on his Web site. This is better than attempting to use separate color "box within a box" images because the pure signals are very hard to read. My own monitor is balanced at Red 1.05, Green 1.15, Blue 1.136.
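For example, to load the three values quoted above in a single command -- your monitor's numbers will almost certainly differ:

	$ xgamma -rgamma 1.05 -ggamma 1.15 -bgamma 1.136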

That's it. Your display is properly calibrated. I wish it were more complicated, but it isn't.

Well, there is one thing left to do: automate the gamma-correction step. There are two ways to do this: have the xgamma command executed every time you log in, or set it system-wide in your XF86Config file.

On a system where you cannot edit XF86Config, put the proper xgamma command in a .login or .xsession file -- but add the -quiet flag to suppress the output. If you use GNOME or KDE, let the desktop environment's session manager handle this instead, since its start-up code may bypass .xsession and the like. I run GNOME, so I would simply launch gnome-session-properties and add my xgamma command to the Startup Programs tab.
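For the .xsession route, a minimal sketch looks like this; the per-channel values are my monitor's and the window manager is only a placeholder, so substitute your own:

	#!/bin/sh
	# Apply the gamma correction silently before the session starts
	xgamma -quiet -rgamma 1.05 -ggamma 1.15 -bgamma 1.136
	# Hand off to whatever session you normally run
	exec wmaker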

But since I have root access to my desktop machine, I add a gamma-correcting line to my /etc/X11/XF86Config file, so it is applied for everyone at system startup. It goes in the Monitor section:

Section "Monitor"
	Identifier   "Monitor0"
	VendorName   "Vendor0"
	ModelName    "P96"
	DisplaySize  330   240
	HorizSync    30.0 - 94.0
	VertRefresh  48.0 - 120.0
	Gamma 1.05 1.15 1.136
EndSection
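The Gamma entry accepts either a single correction factor applied to all three channels or, as here, separate values for red, green, and blue.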

Other options and a warning

Mac and PC users have a variety of available display-calibration programs, ranging from freeware to extremely expensive. But as we have seen, there isn't a lot to the calibration process (profiling for a color-managed workflow is another story, and really a different task). Most of the calibration software out there relies on the same steps we have just gone through: showing you a test image and asking you to tweak a setting until you determine it is correct. But the test pattern displayed by one of these programs is no better than a test pattern from a calibration Web site.

There are a handful of semi-useful GUI tools for changing the X server's gamma correction: tkgamma, an aging Tcl/Tk program; KGamma, which is now built into KDE's control center; and Nvidia's nvidia-settings tool, which works only with Nvidia graphics cards.

But I wouldn't bother with any of them; they are all merely front ends to the xgamma command, and they make it less useful. For one thing, the test images they provide are less helpful than Koren's because they have no marked scales. Furthermore, tkgamma and KGamma have large numerical steps hardcoded into their "sliders," so you get far less precision than when using xgamma directly. KGamma adds a further interface annoyance by putting the gamma-correction values in what appear to be editable text boxes -- without making them editable! And, obviously, if you don't have an Nvidia card, that company's tool is useless to you.

A word about over-correction: some high-end displays have a "Gamma" control built in to the monitor's on-screen controls. But remember: the monitor's gamma is a physical property of the device, and it cannot be altered. What these controls do is apply a gamma-correction factor, just as we would do in software. But if you add one gamma-correction on the monitor's controls, and another in your X server, they will be multiplied and overshoot the correct value.
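For instance, if the monitor's menu applies a 1.136 correction and you also run xgamma -gamma 1.136, the combined correction is 1.136 * 1.136, or about 1.29, and the effective overall gamma lands at roughly 2.5 / 1.29 = 1.94 rather than 2.2 -- visibly washed-out midtones.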

I suspect that a high-end display doing its own gamma correction can be more precise than xgamma can, so if you have this option, try it and do no gamma correction with your X server. Or if you want to purchase a high-end display for me, I'd be happy to try it both ways and tell you which looks better.

The future's so bright, I gotta adjust my gamma-correction

Sticking with this hardware-is-better-than-software theme, a few companies make hardware display calibration tools; they look like three-legged spiders that stick to the surface of your monitor and feed information back through USB cables. To my knowledge, none of these devices is supported under Linux or other "alternative" operating systems. If I am wrong and you know of one, please contact me and let me know; I would very much like to see it. Manufacturers of these devices include LaCie, ColorVision, and GretagMacbeth (maker of the Eye-One), if you wish to make polite inquiries.

Despite what I said about today's woeful crop of GUI calibration programs, I would really like to see a more full-fledged graphical tool for X users, so that all of the relevant information is in one easy-to-use place. As desktop Linux makes further inroads into professional-level graphics, there will be more people who need such software. There may be only a handful of applications today that support 16-bit-depth pixels and color management, but every day there are more. If you're interested in undertaking such a project, send me an email and I'll tell you everything I know. It won't take long, promise.

In the meantime, today there are more than enough resources at your disposal to properly calibrate your display. For further study I recommend that you begin at Norman Koren's Monitor Calibration and Gamma site; in addition to the calibration charts I have already talked about, he explains in great detail the process involved and why some things (like individual-color gamma charts) just don't work, and he has good, up-to-date links to other sites.


Comments


Easier way, at least in Suse

Posted by: Anonymous Coward on February 10, 2005 01:11 AM
Gamma calibration is available under Suse 9.2, KDE 3.3.0, under Control Center --> Peripherals --> Display.
You can play to your heart's content so long as you don't select the option "save to XFConfig" till you are done. It works great.

Kevin


Re:Easier way, at least in Suse

Posted by: Anonymous Coward on February 13, 2005 04:43 AM
That is KGamma.


only when necessary

Posted by: Anonymous Coward on February 11, 2005 12:32 AM
Don't set contrast to 100% unless you edit photos for print; it will ruin your eyes. Actually, setting brightness to ZERO also may reduce eye wear.


Re:only when necessary

Posted by: Anonymous Coward on February 13, 2005 01:23 AM
That is flat out not true.

Furthermore, if your monitor hurts your eyes, it's due to refresh rate or bad lighting in the room. And setting the brightness or contrast too low will strain your eyes because you will lose clarity and be squinting at the display.

For maximum clarity, proper color rendering, and minimal eye strain, you calibrate your monitor. By definition, that is what calibration is. This is ridiculously bad "advice."


xgamma and ATI fglrx

Posted by: Anonymous Coward on February 26, 2005 08:35 PM
Like some other users of ATI's fglrx drivers, I find that the 'xgamma' command does not work. As a result, KGamma does not work either.

Try fglrx-xgamma instead, although it does not seem quite as user friendly.


fixed tkgamma link

Posted by: Anonymous Coward on April 07, 2005 04:10 AM
the freshmeat entry was pointing to a broken place, fixed that.


Re:xcalib ?

Posted by: Anonymous Coward on April 07, 2005 09:24 AM
That's a beautiful idea! And a reason to put ICC support in X11...

Amusingly enough, the now commercially dominated ICC stuff looks suspiciously like the built-in color profiling in X11R5. See Xcms at the end of chapter 6 in "Volume 8: X Window System Administrator's Guide".

This might even tempt me to put Windows on a Linux box to try it out...

Maybe VMware (not likely QEMU) with one of those tools that just tell you to change everything (it was either Spyder or Monaco, I don't remember), as opposed to the Eye-One (GretagMacbeth)...

Hopefully it will at least work with pcc and a free *nix.


then raise it bit by bit until you can see ...

Posted by: Anonymous Coward on January 22, 2006 12:11 AM
What should I do when, at contrast 100% and brightness 100% with no gamma correction, I don't see black bar A at all? With gamma correction around 1.7 it's better, but the monitor looks way too bright...

Maybe I need a new monitor... I hope I don't need new eyes...


Re:xcalib ?

Posted by: Anonymous Coward on March 02, 2006 01:04 AM
Xcalib does not apply a screen profile -- it only uses the calibration info stored in the profile to adjust the graphics card's LUT so that the monitor is calibrated.
The purpose of the profile itself is mainly conversion between colour spaces (e.g. Adobe RGB to the monitor profile). The calibration info is something extra added to it.
I have read somewhere that calibrating the monitor under a different OS and then using the information in Linux may not be the best approach, since different OSes may process information for the graphics card in different ways.
Does anybody have more information?


Re:xcalib ?

Posted by: Anonymous Coward on July 22, 2006 02:14 PM
In all OSes, only color-managed applications do gamut conversions. But you do need the correct LUTs loaded if you used a calibration that affects the LUTs (most of them do).

Unless somebody takes the time to measure patches with a colorimeter on a Linux machine that has the correct LUTs (from a profile made on a Windows machine) loaded via Xcalib, it's impossible to tell how accurate the profile is.

I assume it's very accurate. There's not much room to go wrong.

<a href="http://photo.net/bboard/q-and-a-fetch-msg?msg_id=00HMf8&tag=200607211107" title="photo.net">http://photo.net/bboard/q-and-a-fetch-msg?msg_id=<nobr>0<wbr></nobr> 0HMf8&tag=200607211107</a photo.net>

Serge.


Colorimeter-based approach

Posted by: Anonymous Coward on July 29, 2006 08:35 AM
Although it has been possible to properly calibrate a monitor under Linux for some time now, it is not well known.

Essentially, you either use Argyll CMS to natively calibrate the monitor (only a couple of X-Rite colorimeters are supported) or use a profile created in Windows or Mac OS on the same machine.

Then you need to load LUTs from the vcgt tag of the profile on startup, using either Xcalib or the dispwin module of Argyll CMS.

Then you need to specify the monitor profile in color-managed applications. There are several color-managed applications for Linux (Bibble Pro, CinePaint, Scribus, etc.).

There's a Wikipedia article on the subject:
http://en.wikipedia.org/wiki/Linux_color_management

Serge


xcalib ?

Posted by: Administrator on February 10, 2005 07:31 PM
Hi,

Have you tried xcalib (http://www.etg.e-technik.uni-erlangen.de/web/doe/xcalib/), which lets you apply a screen profile (generated with "spiders" under the other OS) to the X server? I would be interested in your opinion...

Michal


trouble with slackware 10.1 and monitor LG Flatron

Posted by: Administrator on January 06, 2006 08:45 AM
Hi,
I am a newbie in Linux, and my monitor's contrast is poor; the colors are ugly...

Can anyone help me, please, or point me to a place where I can get help?

My email: alex_florentino@hotmail.com

Sorry for my English.

Thanks very much.


Looking good: Monitor calibration under X

Posted by: Anonymous [ip: 83.83.225.61] on November 09, 2007 08:28 AM
Just for the record: at the moment (11/2007), look to Argyll CMS (CLI) and LProf (check the CVS; currently under development) for hardware support.


nVidia Control Panel for Gamma Correction

Posted by: Anonymous [ip: 72.86.92.55] on December 09, 2007 08:01 AM
I recommend the nVidia Control Panel, if available, over the console for gamma correction.


Calibration hardware available under Linux Red-Hat based systems.

Posted by: Anonymous [ip: 79.22.150.233] on January 02, 2008 09:39 PM
