

High Dynamic Range images under Linux

By Nathan Willis on December 21, 2005 (8:00:00 AM)


Not all image files are created equal. Most of us know this from working with the everyday formats like PNG, JPEG, and TIFF, each of which has its own pros and cons. But cutting-edge applications from cinematography to computer vision demand more range, color depth, and accuracy than these formats can deliver. That demand drove the development of what are called High Dynamic Range file formats. Luckily for us, Linux is a first-class citizen in the HDR image world.

You may have seen the HDR acronym in reference to computer gaming as well. Video card manufacturers use it to refer to rendering scenes with very large contrast ranges. These rendering techniques are not related to HDR imaging and HDR image file formats directly -- although, as we will see, game designers make use of HDR image formats to maximize visual quality.

... but some are less equal than others

The trouble with older image formats is that they were designed with output devices in mind. Older formats used a limited number of bits per pixel and had small color palettes, because display devices at the time handled fewer bits per pixel and smaller color palettes. Before digital still and video cameras, few people cared about the color balance of their monitor.

Today, most of our entertainment media are being created and processed on computers. This includes scanned motion picture film, digital video, and computer-generated images in two and three dimensions that must blend in seamlessly with real life.

While we could modify existing image formats by tacking on extra bits per pixel, researchers and industry opted for a better idea: define an image format that can describe the entire scene in at least as much detail as the human eye can resolve -- even if it can't all be displayed on today's equipment. This concept is known as scene-referred imaging.

To display a scene-referred image on lowly CRT monitors or DVDs, of course, you have to create a mapping function to compress it for each display device, but the technique is archival: you can re-map the original to each destination without a generation's loss in quality. Even better, working with scene-referred data follows the principle of keeping information around as long as possible, which gives you more flexibility, no clipping, fewer accumulated errors, and smoother undos.
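As a toy illustration of such a mapping function (a sketch of the general idea, not any particular standard's operator), a Reinhard-style global tone curve compresses unbounded scene-referred luminance into the display's [0, 1] range:

```python
def tone_map(lum, white=4.0):
    """Reinhard-style global tone-mapping operator (a sketch).
    Compresses scene-referred luminance (any non-negative value)
    into the display range [0, 1]. `white` is the smallest scene
    luminance that maps to full display white; brighter values clip."""
    mapped = lum * (1.0 + lum / (white * white)) / (1.0 + lum)
    return min(mapped, 1.0)
```

The point is that the mapping is applied at display time; the scene-referred original keeps its full range, so you can re-map it later for a better display without quality loss.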

HDR image formats all have a few points in common. First, they encode color information in reference to a device-independent color model such as CIE XYZ or CIE Lab, so they can reference every color the human eye can see (and possibly more), unlike the limited gamut of color models such as sRGB.

Second, they try to preserve the large contrast ratios we experience in the real world. The eye can resolve about four orders of magnitude of contrast at any one moment, but because it adapts to a wide range of lighting conditions, those four orders of magnitude may fall anywhere within a much larger range (arguably 15 or so, but who's counting -- it's not easy to test).

Last but not least, HDR image formats try to be of uniform quality across the entire visible range -- if a format does a smashing job on the highlights but is murky in the shadows, it's a waste of effort. The eye does not respond to light linearly, and that means finding an encoding scheme that quantizes it usefully but with predictable numerical error. Usually HDR formats pick a logarithmic encoding, but some are far more complicated. HDR wizard Greg Ward has written an excellent (albeit highly technical) exploration of the topic.
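To see why logarithmic coding gives predictable error, consider a toy 10-bit log quantizer (an illustration of the principle only, not the actual Cineon or LogLuv curve): each code step multiplies the brightness by a constant ratio, so the relative error is the same in the shadows as in the highlights.

```python
import math

def log_encode10(lin, lo=1e-3, hi=1.0):
    """Quantize a linear value in [lo, hi] to a 10-bit code so that
    each code step is a constant brightness *ratio* -- i.e. constant
    relative error across the whole range. Illustrative values only."""
    lin = min(max(lin, lo), hi)
    return round(1023 * math.log(lin / lo) / math.log(hi / lo))

def log_decode10(code, lo=1e-3, hi=1.0):
    """Invert the encoding: map codes 0..1023 back to linear values."""
    return lo * (hi / lo) ** (code / 1023)
```

Over this three-decade range the round-trip relative error stays below roughly 0.4% everywhere, whereas a linear 10-bit quantizer would waste codes on highlights and fall apart in the shadows.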

Meeting these conditions is not a trivial task. In addition, you probably also want a format that's space-efficient, at least a little flexible on things like compression, and that avoids those downright weird situations that often arise in mathematical color models, such as negative colors and hues brighter than pure white, which are hard to explain to the camera-shopping public.

The formats

One of the earliest HDR formats to see widespread use was Kodak's Cineon. Cineon is a 10-bits-per-channel logarithmic encoding designed to work with motion picture film. It is supported natively in a number of high-end film scanners and recorders (and by "high-end" we are talking five- and six-digit prices), including a line of devices built by Kodak for digital film restoration.

These days, Cineon has been superseded by the Digital Moving Picture Exchange (DPX) format, which is an SMPTE standard (SMPTE 268M-2003) that extends Cineon in several useful ways. First, while Cineon can store data only in 10-bit log form and Kodak's reference color space, DPX can use other encodings and bit depths as well. Second, DPX adds several header fields (such as timecode and sampling rate) useful for video processing. Finally, DPX aligns header and data at fixed offsets, allowing file updates without having to read in and write out the entire file.

Though DPX has now officially replaced Cineon, the expensive Cineon-only digitization hardware is still widespread, so both variants are still supported in most HDR software. Kodak provides a test image "target" in Cineon format, if you are curious.

OpenEXR was developed by Industrial Light and Magic and released under a modified BSD license in 2003. It supports 16-bit floating point, 32-bit floating point, and 32-bit integer pixels. It covers more than the entire visible color spectrum, and more than 10 orders of magnitude in brightness. The file format allows for pluggable compression schemes (of which wavelet compression is the most common, but not required) and allows extra, non-color channels like alpha, depth, or other information that may be useful in computer-generated art.

ILM used the EXR format internally for years before releasing it as free software; since the release it has become popular with other special effects and animation studios. NVIDIA and ATI both support EXR data in their graphics cards; EXR's 16-bit Half type corresponds to the half data type in NVIDIA's Cg language. The OpenEXR download page has precompiled binaries for Windows and OS X, and a hefty package of sample files you can work with on any system.
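The tradeoffs of the 16-bit Half pixel type are easy to probe from Python, whose struct module implements the same IEEE 754 half-precision bit layout (a quick illustration; OpenEXR itself is a C++ library):

```python
import struct

def to_half(x):
    """Round-trip a Python float through a 16-bit IEEE half-precision
    value -- the same bit layout as OpenEXR's Half pixel type
    (1 sign bit, 5 exponent bits, 10 mantissa bits)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]
```

A Half keeps about three significant decimal digits and tops out at 65504, which sounds modest until you remember the values are floating point: precision is *relative*, so the format spends its bits evenly across shadows and highlights, just as an HDR encoding should.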

Other HDR formats include RGBE, which grew out of the Radiance ray tracer. Since ray tracing works by calculating all the light in a scene (i.e., not just the few orders of magnitude visible on a monitor), Radiance's developers needed a way to model more brightness than a normal RGB scheme allowed.

RGBE uses 8-bit RGB triples plus an 8-bit exponent denoting the magnitude of the pixel's brightness, extending the contrast range to 76 orders of magnitude -- far, far, far more than the human eye can see. I mean, far. The creators of the format later created LogLuv, an extension of the extension-friendly TIFF format, sanely using more of their bits for the color and fewer for the exponent.
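The shared-exponent trick is simple enough to sketch in a few lines of Python (an illustration of the scheme, not Radiance's actual C implementation):

```python
import math

def float_to_rgbe(r, g, b):
    """Pack linear RGB into Radiance's RGBE: three 8-bit mantissas
    sharing one 8-bit exponent, biased by 128."""
    m = max(r, g, b)
    if m < 1e-32:
        return (0, 0, 0, 0)
    mant, exp = math.frexp(m)            # m == mant * 2**exp, 0.5 <= mant < 1
    scale = mant * 256.0 / m
    return (int(r * scale), int(g * scale), int(b * scale), exp + 128)

def rgbe_to_float(r, g, b, e):
    """Unpack RGBE back to linear RGB floats."""
    if e == 0:
        return (0.0, 0.0, 0.0)
    f = math.ldexp(1.0, e - 128 - 8)     # 2**(e-136): undo the mantissa scaling
    return (r * f, g * f, b * f)
```

Because the exponent byte spans powers of two from roughly 2^-128 to 2^127, the representable brightness covers about 76 decimal orders of magnitude -- the figure quoted above.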

In 2004, Adobe released a specification it called Digital Negative (DNG) files. DNG is designed to be a wrapper for various flavors of RAW camera and sensor data, and not for general-purpose image files used in editing -- but it is technically an HDR format.

The software

Now to the fun part. On Linux systems, the popular ImageMagick command-line graphics library supports Cineon and DPX files. The ImageMagick offshoot/fork GraphicsMagick, on the other hand, relegates Cineon to "legacy" format status, has a far more complete DPX read/write implementation, supports LogLuv TIFF, and lists OpenEXR on its "to do" list.

Among the GUI image editors, CinePaint handles the most: Cineon, DPX, OpenEXR, and LogLuv TIFF. The GIMP can handle Cineon and DPX through plugins (and the Cineon plugin is more recent than the DPX plugin). Krita supports OpenEXR natively, and inherits the ability to read Cineon and DPX from its ImageMagick dependency.

Raytracers, 3D modelers, and animators in the proprietary software realm can almost always output to OpenEXR, and this is true of Linux offerings like Maya and Shake. Several open source rendering and 3D apps -- including Yafray and Blender -- either support OpenEXR now or are working on it.

There are a few HDR-specific applications available for Linux as well. exrtools supports (surprise!) OpenEXR images, while pfstools supports OpenEXR and RGBE.

HDRIE and HDRShop are both multi-format HDR image editors, though they don't seem to be under active development.

The personal site of HDRShop creator Paul Debevec lists some other software titles. And Greg Ward -- the man behind Radiance, RGBE, and LogLuv TIFF -- has some free utilities on his personal site as well.

Of course, once you have found your software of choice, finding and creating HDR images becomes an issue. Rendering them from a computer-generated scene is straightforward enough, and good digital cameras today capture enough data to make HDR image formats worth using. For film, drum scanning at a professional photo lab squeezes just about every pixel out of the frame, although it will cost you. Alternatively, there are techniques for combining multiple bracketed exposures into a single HDR image.
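The multi-exposure technique (popularized by Debevec and Malik) boils down to a weighted average of the bracketed shots, each normalized by its shutter time. Here is a minimal per-pixel sketch, assuming the pixel values have already been linearized to [0, 1]:

```python
def merge_exposures(values, times):
    """Estimate relative scene radiance at one pixel from bracketed shots.
    values: linearized pixel values in [0, 1], one per exposure.
    times:  the corresponding shutter times in seconds."""
    def weight(z):
        # Hat function: trust mid-tones; distrust near-black (noisy)
        # and near-white (clipped) samples.
        return min(z, 1.0 - z)
    num = sum(weight(z) * (z / t) for z, t in zip(values, times))
    den = sum(weight(z) for z in values)
    return num / den if den > 0 else 0.0
```

Each exposure contributes its own estimate of the radiance (pixel value divided by shutter time); the weighting simply favors the exposures where that pixel landed in the sensor's sweet spot.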

But remember, even if you aren't using 32-bits-per-channel originals, storing your data in an HDR image format saves you from roundoff, clipping, and all kinds of computational error that accumulates during the editing process. And what's a few extra bits between friends, anyway?



Comments on High Dynamic Range images under Linux

Note: Comments are owned by the poster. We are not responsible for their content.

W2g, HDR, imho.

Posted by: Anonymous Coward on December 21, 2005 10:20 PM
W2G, HDR, IMHO. I knew HDR would finally get its due.

An unrelated but amusing poem. Who can read it?

%*<>~#4

Wakka wakka bang splat tick tick hash,
Caret quote back-tick dollar dollar dash,
Bang splat equal at dollar underscore,
Percent splat wakka wakka twiddle number four,
Ampersand bracket bracket dot dot slash,
Vertical-bar curly-bracket comma comma CRASH


Re:W2g, HDR, imho.

Posted by: Anonymous Coward on December 22, 2005 01:22 AM
C'mon, stop trying to screw with Google already!



Posted by: Anonymous Coward on December 21, 2005 06:24 PM
Astronomers need HDR images too. We're not so interested in the response of the eye as simply not discarding information from detectors that have a higher dynamic range than typical image formats.

A commonly used format is FITS, or Flexible Image Transport System. netpbm has converters for this format, and GIMP can open them, but I think they get converted to 24 bit images once they're opened.

I read once that to change this would be a major re-write of the GIMP... is that true? GIMP's scriptability would be very useful for astronomers if it handled HDR images properly.



Posted by: Anonymous Coward on December 22, 2005 02:18 AM
If I can figure out how to read FITS images (which should be easy, given the netpbm source), then I guess it would be a matter of a day or so to make Krita read those, too. And we don't need to downgrade images to 24 bits, we can handle 32 bits/channel.

As for scripting; Krita is scriptable in Python and Ruby, but we need to extend the scripting API a lot. Of course, one way of determining what needs to be present in such an API is through cooperation with people who actually need it.

So, if interested, mail me at:



Posted by: Anonymous Coward on December 23, 2005 10:40 PM
I just wanted to clarify for people who may not be picking this up: GIMP works with images that are 8 bits per each of the three additive color channels (red, green, and blue), for a total of 24-bit color. HDR-capable programs like Krita and CinePaint can work with up to 32 bits per channel, for a total of 96-bit color. So we are talking about a huge difference in the amount of stored information.

Also note that most mid- to high-end digital cameras with a RAW format capture images at 12 bits per channel, so most RAW conversion programs can convert these to 16 bits per channel to preserve all the original information and leave a little extra room for image adjustments.



Posted by: Administrator on December 23, 2005 12:50 AM
The GIMP uses 8-bit samples internally. CinePaint (originally derived from GIMP) also supports 16-bit integer, 16-bit float, and 32-bit float. CinePaint is capable of supporting true HDR, but GIMP cannot.

ImageMagick and GraphicsMagick can support as many as 32 bits per sample, but only to add additional resolution rather than range. They are not HDR applications.



Posted by: Anonymous Coward on December 21, 2005 06:34 PM
The medical domain uses DICOM images for this (I've seen up to 16 bits per channel), which can be viewed fine with GIMP. However, since most high-dynamic-range image types are domain-specific, they define a lot of 'context' describing how the images were acquired. For instance, GIMP cannot create DICOM images, since it cannot create all the required context (patient name, scanner type, date, ...).

So while viewing high-dynamic images is convenient, I don't see the need for a one-size-fits-all high-dynamic image type.



Posted by: Anonymous Coward on December 22, 2005 02:18 AM
I think this is just useless hype that won't benefit me or the majority of users in any way at all.



Posted by: Anonymous Coward on December 22, 2005 05:15 AM
Suggestion: if you have a Windows computer, download Day of Defeat: Source and Half-Life 2: Lost Coast.

Then tell me it's hype.


Hu ? And 16 bits TIFF ?

Posted by: Anonymous Coward on December 22, 2005 04:48 PM
16-bit TIFF is HDR, no? It's an old, well-documented, and well-supported format!


Re:Hu ? And 16 bits TIFF ?

Posted by: Anonymous Coward on December 22, 2005 06:01 PM
Not really -- the dynamic range of an image format has more to do with how you're supposed to interpret it than with the bit count. A 16-bit TIFF generally assumes the same peak-white brightness as an 8-bit TIFF, so all the extra bits go in the dark end. This is a good thing, but at some point you hit noise, which effectively reduces the real dynamic range. If you make your 16-bit file from a single digital camera capture, you're probably not going to get better than a 10-11 bit signal. That's barely enough to let your eye see a perfect image over a single stop of range, but over 7-8 stops of range, with a bit of non-linear coding (gamma, for instance), it works well for taking a picture. However, in the real world the dynamic range of your average day will far exceed that (20+ stops, which is *another* 12 bits, everything else being equal).

If you want to deal with linear images you need more bits for that too. This all means you need to be clever about where you spend your bits; the OpenEXR Half data format does a good job for 90% of the interesting scenes used in movie production, more if you adjust your reference exposure.


Re:Hu ? And 16 bits TIFF ?

Posted by: Anonymous Coward on December 22, 2005 07:24 PM
I have a digital camera capture with a 12-bit signal, if I'm not mistaken :-)

I'm currently seeking a long-term method to archive my work files (from .cr2 files, the native RAW format of Canon), and I looked at 16-bit TIFF. But after reading your good comment, I'm wondering what I'll finally choose.

OpenEXR+CinePaint could be a good workflow (to be tested), but the .cr2->EXR conversion is not currently possible: I'll have to modify dcraw to support EXR output, I'm afraid.


Re:Hu ? And 16 bits TIFF ?

Posted by: Anonymous Coward on December 23, 2005 06:18 PM
Well, if all you've got is single images, then 16-bit TIFF is good, as is cr2. For archiving, you have to know everything you might want to do; as such, I'd archive the cr2, as somebody in the future might come up with a better de-Bayer than you have currently available. In general, archive the capture medium and the final, and perhaps the steps you took to get there if this is programmatic -- e.g., in a scriptable image editing system you'd save the script.

I believe there is a tool that does cr2->exr -- photomatics?

Modify dcraw... hmmm, we've done that; it's not the easiest bit of code to work with, due to its reverse-engineered heritage.


Re:Hu ? And 16 bits TIFF ?

Posted by: Administrator on December 23, 2005 01:01 AM
It is quite true that HDR is about *range* rather than resolution. HDR can be represented with a single bit if all that is needed is to represent a certain brightness and darkness. Normal image formats use all their bits to encode the visible intensities in a scene. Extended-range formats like Cineon/DPX establish black and white points and use a log encoding so that considerable values can be stored in the intervals below visible black and above visible white, but they are not true HDR.
True HDR formats use only a small portion of their numeric range for the visible part and use the rest to encode darkness and brightness well beyond human vision. For example, some HDR formats have sufficient range to encode the darkness under a rock on the dark side of the moon and the brightness of the surface of the sun.

See Greg Ward's High Dynamic Range Image Encodings article for an excellent overview of bits vs. range.



Posted by: Anonymous Coward on December 28, 2005 02:16 AM
What does HDRshop have to do with Linux or even open source?

It only runs on Windows and is closed source.

Not even HDRIE has a version yet that runs on anything other than Windows.

What do these have to do with anything?


GIMP and HDR is nonsense

Posted by: Anonymous Coward on January 19, 2006 07:29 AM
Mentioning an old-fashioned 8-bit editor like GIMP in relation to HDR is out of scope here. It has no HDR or high-precision capabilities.

Did the author confuse it with Film Gimp, the Hollywood branch of GIMP from many years ago, which was rejected by the GIMP developers?


This story has been archived. Comments can no longer be posted.
