So I had a quick question about color management, calibration, and profiling.
A long time ago I borrowed a Spyder2 from a friend and calibrated (profiled?) my screen. The screen I use is a 15" Powerbook G4. That was almost 2 years ago, I'm sure the calibration has drifted a little since.
I recently ordered an X-Rite i1 Display 2 and while I was at it I picked up the X-Rite Color Checker chart. These are the two tools I need to get consistent color profiles across my workflow, right?
I received the ColorChecker chart today and tried it out. I took a picture of the chart outside in sunlight, and then another one inside where the chart was lit by a tungsten bulb. Following the directions on the Adobe website for their DNG Profile Editor, I loaded the shots into the editor, had it generate the profile tables for the two charts, and saved the resulting profile for use in LR2. I used the two-shot profiling method, which needs one chart shot at 6500K and one at 2850K. From the two shots the DNG Profile Editor calculates the correction values for each illuminant, and since there are two tables it can interpolate the corrections based on the measured white balance of any given image.
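For what it's worth, here's a sketch of how I understand that interpolation to behave. This isn't Adobe's code, just an illustration: as I read the DNG spec, the blend between the two calibration tables is linear in inverse correlated color temperature (1/CCT), and the function names here are my own.

```python
# Sketch only: hypothetical helpers illustrating a DNG-style blend
# between the two calibration tables. Not Adobe's implementation.

def interp_weight(cct, cct_a=2850.0, cct_d65=6500.0):
    """Weight given to the 2850 K table, clamped to [0, 1].

    Linear in inverse CCT, which is how the DNG spec describes
    interpolating between its two calibration illuminants.
    """
    if cct <= cct_a:
        return 1.0
    if cct >= cct_d65:
        return 0.0
    return (1.0 / cct - 1.0 / cct_d65) / (1.0 / cct_a - 1.0 / cct_d65)

def blend_tables(table_a, table_d65, cct):
    """Elementwise blend of the two correction tables for a shot's CCT."""
    w = interp_weight(cct)
    return [[w * a + (1.0 - w) * d for a, d in zip(row_a, row_d)]
            for row_a, row_d in zip(table_a, table_d65)]
```

So a shot white-balanced at, say, 4200K gets a mixture of both tables rather than one or the other, which is why the two-shot method handles intermediate lighting reasonably well.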
(Note: to get DNG files from my RAW NEF files I opened them in LR, removed any correction parameters, and exported them as DNG files. Is this correct? Should I have white balanced the DNGs in LR to force them to 6500k/2850k values before exporting?)
Now, in theory, and provided that I have a properly calibrated screen, if I load up the picture of the colorchecker chart in LR and apply the profile I created in the camera calibration section, it *should* look exactly like the actual chart, right? Since the camera's sensor has been calibrated in the DNG Profile editor and the screen has been calibrated with a colorimeter, the color output on the screen should match exactly what I see in real life.
Looking at the ColorChecker picture in LR2 after applying the profile, the colors still don't exactly match the real chart. I was actually surprised at how subtle the changes were going from the ACR default profile to the custom one I had just made. Maybe I needed to shoot it under more controlled conditions (the chart was not in direct sunlight, and I'm sure the tungsten bulb shot wasn't a pure setup either)? Where can I shoot the test charts so that they are under exact 6500K and 2850K light? I don't have a studio or anything like that, so I'm not sure I can get conditions that precise.
So I'm hoping that once the Display2 arrives and I re-calibrate my screen, I should then get color rendition that is pretty close to what I see on the chart in real life, yes? Or perhaps my display is simply too crappy to accurately show colors? Granted it's a laptop screen but as I recall the 15" displays on the Powerbook G4s were pretty good as laptop displays go.
Am I understanding what is supposed to happen with these pieces of color calibration hardware (colorimeter and chart) or am I missing something?
For the record I have a D80 and shoot RAW using the AdobeRGB color space. Does it matter what lens is used to shoot the color charts? I have the Nikon 18-200VR and the Nikon 12-24 f/4. Do I need to make color profiles for each lens?
#1. "RE: X-Rite colorchecker theory" | In response to Reply # 0 | BJNicholls, Charter Member | Tue 14-Oct-08 03:11 PM
You're using a butter knife to do brain surgery. The Apple notebook displays aren't adequate for high-end image editing. If you're serious about getting reproduction-level accuracy, you need a professional-caliber display with IPS LCD technology, which gives you wide viewing-angle flexibility without sacrificing color and tonal accuracy.
As to your workflow, "sunlight" describes a wide range of color temperatures and there's no way to calibrate given the color temperature changes from time of day, season, cloud cover, smog, etc. Tungsten lighting is less variable but it's not like you can rely on any tungsten bulb to deliver a "tungsten" color temperature. There are warmer and cooler bulbs. If you're doing repro work, then look at photographic flood bulbs in tungsten or daylight color balance. Or use studio strobes or speedlights with their very consistent color temperature.
This kind of exercise in search of high levels of color accuracy (perfect isn't physically possible) is useful if you do studio product photography or reproduction work where color accuracy is highly critical. For general photography, "accurate" color is often not very pleasing. That's why there are (were) so many different film and photo paper types that had their own color characteristics and why there are so many digital settings to achieve similar color results.
Yes, lenses have an effect on color rendition so to do a really accurate profile you'd have to use a fixed color temperature light source and use a specific lens. Nikon lenses tend to have similar color rendition (not perfectly matched, however) and you'll get different color from a Zeiss, Sigma, or Tamron lens.
So are you doing repro work? If so, invest in a professional display and some better glass. If not, don't go chasing color accuracy to the level you seem to think is necessary. Not only is it not necessary, it's not desirable.
#2. "RE: X-Rite colorchecker theory" | In response to Reply # 1 | TiggerGTO, Nikonian since 22nd Feb 2006 | Tue 14-Oct-08 05:07 PM
Also, remember that even if you have everything calibrated accurately, the color checker chart will not look identical to the image on the screen. It can't. The color checker chart is reflective, and the display is luminescent. That's before you even start talking about whether your LCD display can render the full color spectrum shown in the chart.
When I've talked to people who know what they're talking about, the advice is to set the printed output at a right angle to the display, so that you have to turn your head to look at either one. You cannot compare a reflective image to a luminescent image side by side.
A Nikonian in North Carolina
#3. "RE: X-Rite colorchecker theory" | In response to Reply # 0
Thanks for the comments. No, I'm not doing repro work, just general photography. If color accuracy is so poor (and so unrealistic on luminescent displays vs. reflective media), then why does everyone make such a big deal about calibrating their display? No, I don't need perfect accuracy, but I'd like things to be pretty close. I suppose as a scientist I try to go a little further than normal. I know my Apple LCD display isn't the greatest, but it's supposed to be at least decent at color reproduction. Exciting news a couple of days ago, though, from Apple and their new LED-backlit LCD panel. I might actually save up and splurge for one of those... I know that LED backlighting has done wonders for color gamuts on LCD displays; in some cases LED-backlit LCDs have gamuts bigger than AdobeRGB!
Anyways, I received my colorimeter in the mail yesterday and profiled my monitor. That, combined with the DNG profile I created with the ColorChecker chart, is giving me pretty good results now. Actually, the calibration I thought was neatest was the camera sensor. I am getting much more vivid (and accurate) reds now that I am using a custom sensor profile for my DSLR. The difference isn't too obvious, but on pictures with red in them it can be pretty noticeable. Worth the $60 or so if you are just a casual user? Maybe not, but again, as a scientist I thought it was cool.
What should I be using as target whitepoint and gamma values? I created several profiles, all with a target gamma of 2.2 and various whitepoints: 6500K, 4200K (which is what the colorimeter measured as my ambient light), and native. Which should I be using? Do I understand it correctly that in the ideal case one would measure the average color temperature of the light falling on the wall where a picture will be displayed and use that as the whitepoint? Then everything will look really yellow and weird, but only then can you get a realistic idea of what the picture will look like under light of that temperature? What is the purpose of using values like 6500K (D65) or 5000K (D50)? Are those only for specialized commercial printing-press situations? Otherwise should I just stick with native?
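On the gamma part of my own question, it may help others to see what the 2.2 target actually specifies. This is just an idealized sketch (real display transfer curves, and sRGB's piecewise curve, differ slightly), with function names of my own invention:

```python
# Idealized sketch: a "target gamma" of 2.2 means calibration adjusts
# the display so the signal-to-light transfer curve follows
# output = input ** 2.2, with signals normalized to the range 0..1.

def display_luminance(signal, gamma=2.2):
    """Relative luminance the display emits for a normalized signal."""
    return signal ** gamma

def encode(linear, gamma=2.2):
    """Inverse: the signal needed to reproduce a linear-light value."""
    return linear ** (1.0 / gamma)

# A mid-gray signal of 0.5 comes out at about 22% of peak luminance,
# which roughly matches how the eye perceives "middle" brightness.
mid = display_luminance(0.5)   # ~0.218
```

That perceptual spacing is the whole point: a gamma-2.2 display spends its limited signal levels where the eye is most sensitive, in the shadows.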
#4. "RE: X-Rite colorchecker theory" | In response to Reply # 3 | Yachtsman, Registered since 10th Jun 2006 | Wed 15-Oct-08 05:19 PM | edited Wed 15-Oct-08 05:20 PM by Yachtsman
I use the Eye1 Display2 to calibrate my flatscreen monitor. When I recently switched printers I found I had major problems with the colour of my prints and critically reexamined every part of my system and my routine.
X-Rite were adamant that I should use a gamma of 2.2, a colour temperature of 6500K, and a luminance of 110 cd/m². I had previously used the native white point. Like a good little soldier, I did as I was told. However, I have to say I've changed virtually every parameter known to man to track down the problem and found they made virtually no visible difference.
Between 6000 and 6500 I defy you to see a difference!
#5. "RE: X-Rite colorchecker theory" | In response to Reply # 3 | BJNicholls, Charter Member | Wed 15-Oct-08 09:17 PM | edited Wed 15-Oct-08 09:19 PM by BJNicholls
I don't share the opinion that LCD-to-print color accuracy is necessarily poor, but that depends very much on the performance of your display, your calibration, and your methodology for checking prints against the display. The primary use of color calibration is to simulate printed output, but you don't get to that point in your workflow until you're soft-proofing to your printer+paper profile.
"Decent at color reproduction" is a relative term (and can vary a lot with a tilt of the notebook's display). Apple has LED backlights in MacBook Pros. While that helps battery life and can give you more color gamut, it doesn't make a display more suitable for image editing. The bigger issue with notebooks is that they typically use TN display technology, which means significant color and tonal shifts with minor changes in viewing angle. My workstation LCD has a wider-than-AdobeRGB color gamut without an LED light source, but it also uses IPS LCD technology, which is the technology I'd recommend for image editing work.
There's nothing wrong with pretty good results, and the DNG profiler can work really well to deliver more accurate color. Accuracy isn't always a photographer's friend: Velvia and Kodachrome are prized not for their accuracy, but for the flavor they impart to images. What's most important for my workflow is an accurate soft proof of my images for print reproduction, and knowing that someone with a good calibrated display will see the image as I intend.
The most recommended whitepoints for printing are D50 and D65. D55 is what I use since I'm a graphic designer and the warmer whitepoint is better suited for judging offset printed images, but I think D50 is too warm for the average viewing color temp for my intended viewers. D65 is a good choice for photographers since good photos are usually viewed under less yellow lighting. But the real answer is that you should calibrate to the same color temperature light you use in your print viewing station. Part of the workflow that most people don't do is getting a good viewing light source and using that exclusively when you're judging print color. You'll appreciate that the more you limit variables, the less subjective your color judgements will be.
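To make the whitepoint discussion concrete: the "calibrate to your viewing light" advice rests on chromatic adaptation, i.e. remapping colors so neutrals stay neutral under a different illuminant. Here's a hedged sketch using the standard Bradford transform and the usual CIE 2° whitepoint values; the function names are my own, not from any particular calibration package.

```python
# Sketch: adapt XYZ colors balanced for D65 to their equivalent
# appearance under D50, via the standard Bradford cone-space scaling.

BRADFORD = [[ 0.8951,  0.2664, -0.1614],
            [-0.7502,  1.7135,  0.0367],
            [ 0.0389, -0.0685,  1.0296]]

D65 = (0.95047, 1.0, 1.08883)   # XYZ of the D65 white point
D50 = (0.96422, 1.0, 0.82521)   # XYZ of the D50 white point

def _mul(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def _inv3(m):
    """Inverse of a 3x3 matrix via the adjugate."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    return [[(e * i - f * h) / det, (c * h - b * i) / det, (b * f - c * e) / det],
            [(f * g - d * i) / det, (a * i - c * g) / det, (c * d - a * f) / det],
            [(d * h - e * g) / det, (b * g - a * h) / det, (a * e - b * d) / det]]

def adapt_d65_to_d50(xyz):
    """Bradford adaptation: scale in cone space by the whitepoint ratio."""
    src = _mul(BRADFORD, list(D65))
    dst = _mul(BRADFORD, list(D50))
    cone = _mul(BRADFORD, list(xyz))
    adapted = [cone[i] * dst[i] / src[i] for i in range(3)]
    return _mul(_inv3(BRADFORD), adapted)
```

By construction, the D65 white itself maps exactly onto the D50 white, which is the sense in which a calibrated system keeps "paper white" looking white under the viewing light.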
As you suggest, if it was practical you'd create a profile for the display lighting on each gallery print. If you indeed are printing for a gallery setting, you might well calibrate to the tungsten or halogen lighting. In practice, for photo printing, D65 is a pretty good "daylight" soft proofing color. It can get really hairy to try to manage all the variables and it's usually not necessary except for specialized reproduction. I've been working images for giclee (the expensive word for inkjet) reproductions of oil paintings. Doing that kind of repro work makes you acutely aware of all the variables that conspire to make reproductions imperfect.
Using the ambient light feature of your colorimeter is a poor workflow unless you work with a decent color of ambient light and that lighting is consistent. The real value of an ambient light reading would be for location calibration for a digital projector. The worst is to have a device like a Huey that reads ambient lighting and adjusts display colors as they change. This could be justified if you had a display set up in a lobby and the idea was to make the display accurate to the setting. But that's the opposite of creating a consistent editing and viewing environment.
Sorry for the too-much-information replies and didactic tone, it's a byproduct of the project I've been working on.