"Moiré and False Color. A discussion amongst Nikonians." Tue 27-Mar-12 10:55 PM by DigitalDarrell
As I write this new book, Mastering the Nikon D800, I have to be very careful to make sure I understand certain highly technical matters correctly; otherwise, I might write something inaccurate. I am going to be writing about moiré and false color potential in the D800E and how to avoid it. Therefore, I wanted to discuss this issue with a few Nikonian color scientists. Here is my current understanding of the cause of false color and moiré interference patterns:
False Color Cause
From what I understand about the low-pass filter, all it does is slightly divert the light rays so that surrounding pixels share the color. That way, no one pixel can report a color inaccurately when the fineness of the subject's pattern closely matches the fineness of the sensor, a situation that does not allow normal color interpolation.
In other words, without the low-pass (AA) filter (e.g., the D800E), a single pixel being hit by a pixel-sized white light source will report its own color, red, for instance, even though the light is white. That causes a red false color. The other pixels (green and blue) can do the same incorrect reporting when the source light has the same pinpoint fineness as the sensor's pixels. By blurring the light rays slightly, surrounding pixels in the RGB Bayer pattern receive some of the same light and normal color interpolation can then occur.
Moiré interference patterns can also result when the fineness of the subject's pattern approaches the fineness of the sensor's pixels (within 50%). The frequency of overlapping patterns can cause interference patterns that are nearly impossible to remove once allowed into the image.
Consequently, the Nikon D800 has a relatively weak low-pass (AA) filter (per my own direct observation of my D800's images) to solve both problems. Since this anti-aliasing action blurs the image, the camera must resharpen it.
Do you detect a flaw in my reasoning or understanding?
#1. "RE: Moiré and False Color. A discussion amongst Nikonians." In response to Reply # 0 Tue 27-Mar-12 07:51 AM by adangus
The first thing to understand is what a low pass filter is and how it works. This can become quite complex quite quickly; and so to simplify to the extent possible, a low pass filter is a system that removes higher frequency components. In the case of the 2-D signals that the optical sensor is detecting, this is something like a blur filter in Photoshop. Another way of thinking about this would be a function that averages signals or light over a number of adjacent pixels.
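One way to get a feel for the averaging idea is a one-dimensional sketch. This is my own illustration, not anything from an actual camera pipeline: a simple box filter (a moving average) knocks a rapidly alternating signal down toward its mean, which is exactly the "low pass" behavior described above.

```python
# A minimal sketch of a low-pass filter as a local average, using a
# 1-D row of pixel intensities to stand in for the 2-D sensor signal.
import numpy as np

def box_lowpass(signal, width=2):
    """Average each sample with its neighbors (a box blur)."""
    kernel = np.ones(width) / width
    return np.convolve(signal, kernel, mode="same")

# A rapidly alternating (high-frequency) pattern...
fine_detail = np.array([1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0])
# ...is flattened toward its mean of 0.5 by the averaging filter.
print(box_lowpass(fine_detail))
```

The high-frequency alternation disappears entirely, while a slowly varying signal would pass through nearly unchanged; that is the sense in which averaging is "low pass".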
So, in this context, take a look at these diagrams from Nikon USA. In the case of using the optical low pass filter (OLPF), there are two "birefringent" glass elements. These split the incoming beam of light into two slightly separated beams. For the OLPF, there is a circular polarizing filter between the two birefringent elements; and this ensures that the second element breaks the two beams into another pair of light beams. For the case without OLPF, there is no polarizer and the second birefringent element recombines the original beam.
Whether you completely understand the physics of that process doesn't matter too much. The point is that the OLPF causes the incoming photons of light to spread out over 4 adjacent sensor elements. Each photon will take one or the other of those 4 paths. Now consider that exactly the same process is going on for the other adjacent locations on the surface of the sensor. In this way, light that would go to only one sensor element is spread out across 4 elements; and looking at the process the other way, each sensor element is getting light from 4 adjacent locations on the first birefringent plate.
The sensor itself has some form of color filter array (CFA) as well. A typical choice is a Bayer filter. Note that this filter has exactly 4 color filter elements; and so the 1:4 beam splitter that is the OLPF is exactly matched to the color filter array. By spreading the incoming light through the OLPF onto the CFA, each sensor element gets a "share" of the color from the scene.
However, each picture element in the sensor has just one color channel associated with it. By the time you get a processed RAW file into Photoshop or whatever, each pixel has the usual three red, green and blue color channels. The function in RAW processing that takes these individual color channels per sensor element and combines them into three color channels per pixel is called de-mosaicing or Bayer interpolation.
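To make the one-channel-per-element idea concrete, here is a toy sketch of de-mosaicing. This is my own illustration, assuming an RGGB Bayer layout and deliberately crude per-cell interpolation; real RAW converters use far more sophisticated algorithms.

```python
# A toy de-mosaic: each 2x2 RGGB cell's single R, averaged G, and single B
# samples are copied to all four pixels of that cell, turning one raw
# channel per sensor element into three channels per pixel.
import numpy as np

def demosaic_nn(mosaic):
    """mosaic: 2-D array of raw values under an assumed RGGB filter.
    Returns an H x W x 3 RGB image."""
    h, w = mosaic.shape
    rgb = np.zeros((h, w, 3))
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            r = mosaic[y, x]                           # top-left: red
            g = (mosaic[y, x + 1] + mosaic[y + 1, x]) / 2  # the two greens
            b = mosaic[y + 1, x + 1]                   # bottom-right: blue
            rgb[y:y + 2, x:x + 2] = (r, g, b)
    return rgb

# A uniformly lit gray scene gives equal raw values everywhere, so every
# reconstructed pixel comes out neutral gray.
flat = np.full((4, 4), 0.5)
print(demosaic_nn(flat)[0, 0])   # [0.5 0.5 0.5]
```

The interesting failures discussed in this thread occur when the scene is not this well behaved and adjacent cells of the mosaic see very different light.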
Without the OLPF, false color could arise if the structure of colored elements in some part of the scene is matched in some way to the pattern of elements in the sensor. While this is possible, it's relatively rare. A more likely situation is color fringing at edges between light and dark elements in a scene. This is a case of false color triggered by sharp (1 pixel or so) boundaries between parts of the scene so that the de-mosaicing algorithm is tricked into making bad choices for color channel data in pixels at the boundary. It can look like chromatic aberration but it is not due to lens defects. This is often purple fringing. You can look at a few examples of different de-mosaicing algorithms and how they yield different false color effects here. As discussed below, the anti-aliasing effects of the OLPF help eliminate these false color effects at edges.
Much more likely is a Moiré pattern. Moiré patterns arise when two highly regular patterns overlap one another. In the present case, this can occur if some pattern in the scene overlaps the pattern in the sensor. It should be understood that the D800E is not unique in regard to being sensitive to Moiré; no medium or large format digital camera or back uses an OLPF. So, the Phase One and Hasselblad guys are used to having to deal with Moiré in post processing and software like Capture One builds in functions to handle this possibility. As a result, you do not want to be shooting architecture with a D800E in JPG or TIFF since the only way to handle the Moiré that can arise is in RAW post processing. There's a decent article on this over on the Phase One site here.
As you mention, the downside of using an OLPF is that the image will be somewhat blurred; more than without the OLPF. Hence, all RAW images require sharpening to some extent, which is just to say that one needs to apply a high pass filter in PP to compensate for the effect of the OLPF before the sensor. This would be less essential for PP on a D800E image; and so expect that the sharpening functions cooked into programs like Capture NX2 would be different for the 800E and the 800 (or 700 or D3 or whatever).
Just as for high-pixel density cameras like the Phase Ones and Hasselblads, the expectation for the 800E is that the pixel density will ultimately be down-sampled in some way. This might be in s/w to produce a lower density JPG for the web or optically through creating a print. Hence, false color (at edges) and Moiré can arise (but can be dealt with in post).
My own humble guess is that the improved sharpness that could be captured by the 800E's sensor is likely to be limited in many cases by the optics of the lens. For the 800E sensor, a sensor element is about 4.88µm, so call it 10µm per line pair. For a Phase One IQ140, the 40Mp sensor is 43.9mmx32.9mm so it's around 12µm per line pair. For the Phase One IQ180, its 80Mp sensor is 53.7mmx40.4mm giving about the same 10µm per line pair as the 800E. So far so good. But now consider the 800 with its OLPF. By splitting the incoming light over pairs of adjacent sensor elements in both the horizontal and vertical directions, its effective resolution is halved to about 20µm per line pair.
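For anyone who wants to check the line-pair arithmetic above, here is a quick back-of-the-envelope calculation. The sensor dimensions are the published approximate figures, and the results are rough.

```python
# Back-of-the-envelope line-pair widths from sensor size and pixel count.
import math

def lp_width_um(sensor_w_mm, sensor_h_mm, megapixels):
    """Width of one line pair (two pixel pitches) in micrometers."""
    pixels_wide = math.sqrt(megapixels * 1e6 * sensor_w_mm / sensor_h_mm)
    pitch_um = sensor_w_mm * 1000 / pixels_wide
    return 2 * pitch_um

print(round(lp_width_um(35.9, 24.0, 36.3), 1))  # D800/D800E: ~9.7 um
print(round(lp_width_um(43.9, 32.9, 40.0), 1))  # Phase One IQ140: ~12 um
print(round(lp_width_um(53.7, 40.4, 80.0), 1))  # Phase One IQ180: ~10.4 um
```

These line up with the figures quoted above: the 800E and the IQ180 land at about 10µm per line pair, the IQ140 at about 12µm.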
Consider this against the D700. Its sensor element size is more like 8.5µm and it includes an OLPF, so its resolution is more like 34µm per line pair. What this implies is that glass that works quite well for a D700 may yield much more observable chromatic aberration and edge diffraction on a D800E. This says to use the best possible glass with an 800E body. No big news there. However, it might also suggest being selective in the choice of RAW processor, favoring one whose lens correction functions for CA and distortion fully account for the combination of lens and body.
As the story of the 800E continues to evolve, I'll be interested to see the quality of support for this body with a broad variety of lens options from application providers like Nikon, Adobe, DXO, Phase One, Apple, et al. I personally have been quite displeased with what Adobe has delivered relative to the D700 as far as CA correction is concerned. My own personal favorites are NX2 and Capture One. This is not to say that DXO does a bad job; but my experience with it has been that it seems to process lens correction in a way that's sensitive to exposure. As a result, DXO yields different results for images in a bracketed set making HDR rather difficult to achieve together with RAW processing that includes lens correction.
But now I'm way off topic...
There is another function to the OLPF, and that is anti-aliasing. Imagine for a moment that the scene has some component whose pattern is finer than the roughly 10µm-per-line-pair limit when it's focused on the sensor of the 800E. To make things really simple, say we somehow make a fine grating of alternating light and dark lines, light it uniformly, and then set it up so that we get 6µm between line pairs on the sensor of a D800E. If you think about this for a moment, you'll see that the 800E's sensor is going to lie about the pattern that it sees. This lie is called aliasing, and it happens broadly whenever a digital system samples a signal that varies faster than half the sampling rate (the Nyquist limit). I've cooked the numbers here so that this 6µm grating is going to look like a much coarser one of roughly 26µm line pairs, since frequencies above the Nyquist limit fold back down to lower apparent frequencies. This possibility of aliasing with the 800E yields more cases in which optical patterns can occur that look like Moiré against the aliased version of the signal. The Wikipedia article I've referenced includes a few good examples of this (although I believe that the clever brick wall image has more to do with the sampling of the JPG image than the original camera shot).
It would be different with a D800 using an OLPF as an anti-aliasing filter. Because the OLPF averages the grating, by the time the light gets to the sensor it will have been averaged out to look like a uniform gray (a blend of the light and dark lines). Hence, no aliasing will occur. Instead of the 6µm line pairs coming out as a coarse false pattern from the 800E, the D800 image will be a flat gray.
Anyone with an 800E might try a few games with this by focusing on a window screen at various distances, with a slight tilt to the camera and a fixed focal length and aperture on the lens. Beyond some critical distance, you'll see this effect begin as the spacing between the lines of the image of the mesh on the sensor gets smaller than the distance between sensor elements. With an 800, when it gets past the distance where it could resolve the distinct lines in the screen's mesh, the mesh will just blur out.
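The frequency-folding arithmetic can be checked numerically. This is my own idealized sketch, assuming a point-sampling sensor at the D800E's 4.88µm pitch with no OLPF and no pixel aperture; a real sensor integrates over each photosite, which softens the effect somewhat.

```python
# A numerical sketch of aliasing for an idealized point-sampling sensor.
pitch = 4.88                 # sample spacing in micrometers
fs = 1 / pitch               # sampling rate, cycles per micrometer
nyquist = fs / 2             # ~0.102 cyc/um, i.e. ~9.8 um line pairs

f_signal = 1 / 6.0           # a 6 um line-pair grating, above Nyquist
# A frequency above Nyquist folds down to |fs - f|:
f_alias = abs(fs - f_signal)
print(round(1 / f_alias, 1))  # apparent line-pair width, ~26 um
```

So a grating the sensor cannot resolve does not vanish; it reappears as a much coarser phantom pattern, which is exactly the raw material for Moiré.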
#2. "RE: Moiré and False Color. A discussion amongst Nikonians." In response to Reply # 1
Excellent response! I am going to study the resources you mentioned and add to my knowledge on this subject. In the book, I will be focusing more on how to avoid or minimize false color and moiré than all this deep detail. I just wanted to make sure I understood the concepts so that I can discuss it with authority. Thanks for your valuable input!
#3. "RE: Moiré and False Color. A discussion amongst Nikonians." In response to Reply # 2
I meant to add a conjecture.
I believe that the OLPF in the 800 may reduce the volume of light passing through it relative to the configuration in the 800E. This would suggest that metrics like dynamic range and noise might be somewhat better (if only marginally) for the 800E than the 800.
This seems to be borne out so far from Bill Claff's data for the two bodies, with the 800E scoring 13.61EV and the 800 scoring 13.53EV in his table just now. Not a big difference, at 1/10th of a stop; but perhaps worth noting.
#4. "RE: Moiré and False Color. A discussion amongst Nikonians." In response to Reply # 3 Tue 27-Mar-12 06:31 PM by TomCurious
Bay Area, US
>I meant to add a conjecture.
>
>I believe that the OLPF in the 800 may reduce the volume of light passing through it relative to the configuration in the 800E. This would suggest that metrics like dynamic range and noise might be somewhat better (if only marginally) for the 800E than the 800.
From the material I have seen, the OLPF in both models has the same material and the same thickness, so I think it's unlikely that the light is attenuated more in one model than the other. The difference is in the orientation of the filters.
>This seems to be borne out so far from Claff's data for the two bodies, with the 800E scoring 13.61EV and the 800 scoring 13.53EV in his table just now. Not a big difference, at 1/10th of a stop; but perhaps worth noting.
Perhaps it's in the margin of error of his test? Impossible to say without knowing how accurate the test is.
#5. "RE: Moiré and False Color. A discussion amongst Nikonians." In response to Reply # 0
I think it is useful to point out that while the majority opinion is that moire is the only aliasing artifact worth worrying about, there are a few photographers (such as myself) who are displeased by all types of aliasing.
This difference of opinion can play out in a discussion where one photographer says that AA filters are not needed if you are shooting landscapes.
To me, any image with aliasing, including a landscape, has a distinctly "digital" look to it, like TRON or The Matrix. Where others see a desirable "crunchiness", I see jaggies, stair-stepping, sparkling, wavy lines, bands, fringing, popping, strobing, and false detail. For example, a landscape shot with an AA filter may show only a green blur where the grass is, whereas one shot without an AA filter will show individual blades of grass as perfectly straight vertical lines. The latter is more "detail", but to me it is "false" -- grass doesn't really look like that.
For example, here are two photos of the same subject, one taken with an OLPF and one without:
#6. "RE: Moiré and False Color. A discussion amongst Nikonians." In response to Reply # 5 Thu 29-Mar-12 10:40 PM by adangus
It seems to me that there are a couple of ways to think about this. One is in terms of information. The sensor for the D800/800E has the claimed 36Mp. After an exposure is taken, each pixel has 14 bits; so this is 504 Mbits. Of course, I'm not accounting for any loss of information due to noise, just the information capacity of the sensor.
What's reported by the RAW NEF has 3 times as much capacity though, since each pixel has 3 color channels; rounding a touch, let's say we're getting 1.5 Gbits from an original 0.5 Gbits. Of course, this process, through Bayer interpolation, cannot create information. But the apparent information capacity of the RAW file is 3 times greater than the apparent capacity of the sensor.
This is the same for both the D800 and D800E, even though the interpolation methods might be different for each. Arguably, in order to claim improved sharpness, the D800E should have higher real information content for any scene. A natural question is whether or not this can be true. It seems logical enough to believe that the D800's OLPF will reduce information flowing through it to the sensor. It does this by confounding the paths taken by any given information-carrying photon into the sensor's detector wells. But information is also lost by filtering photons before detection on the basis of their energy (or wavelength) by the operation of the color filter array (CFA); and both the 800 and 800E do this.
In the case of the 800, the nominal point spread function (PSF) would be roughly 4 pixels broad, ignoring any lens effects. In the case of the 800E, it should be just 1 pixel broad. This suggests that the information in the 800 sensor is closer to 3/4 of 504 Mbit or 378 Mbit. I divide by 4 to account for the 4-way split into the 4 cells of the CFA and multiply by 3 to account for the 3 color channels reported per sensor element.
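The bit-count arithmetic above is easy to verify. These are raw bit capacities only, ignoring noise, as noted earlier.

```python
# Raw information-capacity arithmetic for the D800/D800E sensor.
sensor_bits = 36e6 * 14          # 36 Mp at 14 bits per sensor element
raw_bits = sensor_bits * 3       # 3 color channels per pixel after interpolation
olpf_bits = sensor_bits * 3 / 4  # the rough 4-way OLPF split estimate

print(sensor_bits / 1e6)  # 504.0 Mbit
print(raw_bits / 1e9)     # 1.512 Gbit
print(olpf_bits / 1e6)    # 378.0 Mbit
```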
The 800E would not apparently lose information because of an OLPF; but consideration of its true resolving power shows that it must be reporting information inaccurately for a wide variety of cases. Assuming that its pixel size is about 5µm, it should resolve line pairs of 10µm in width. That would be 100 lp/mm and would be exceptional resolution. However, if one actually put a white light sinusoidal stripe pattern of 100 lp/mm on the 800E sensor, the result would be either a cyan or a yellow line alternating with a black line. I say this with the assumption that the white line would either register exactly with the Bayer filter's green and blue stripe (cyan) or with its green and red stripe (yellow). Of course, interpolation might change these false colors even further; but you get the idea that you can only get to the highest resolution by introducing false color.
Similar odd results would occur by illuminating the sensor with a 100 lp/mm blue only, red only or green only sinusoidal stripe pattern. If the blue stripe happened to fall on the green & red strip of the sensor, the result would just be uniformly black. No interpolation could save this result. The result would be similar for a red-only stripe pattern if the illuminated stripe fell on the sensor's green & blue strip. By shifting these illumination patterns one-half pixel in either direction the result would be an alternating pattern of false colors. Different odd results would occur if the illumination was re-oriented to lie at 45 degrees to the CFA. Going back to the green-only stripe illumination, this is the only one for which false colors would not arise, if and only if the illumination exactly registers with the CFA's strips. Any offsets or mis-alignment will again yield false color fringing.
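The stripe argument can be illustrated with a trivial sketch of my own, again assuming an RGGB layout: a white, one-pixel-wide vertical line only excites the channels physically present in the column it lands on, so it can only be reported as yellow or cyan, never white.

```python
# Which color channels does a one-pixel-wide white line excite?
# Assumed RGGB Bayer cell: row 0 = R G, row 1 = G B.
bayer = [["R", "G"],
         ["G", "B"]]

def channels_hit(column):
    """Channels excited by a white line falling on the given sensor column."""
    return {row[column % 2] for row in bayer}

print(sorted(channels_hit(0)))  # ['G', 'R'] -> reported as yellow
print(sorted(channels_hit(1)))  # ['B', 'G'] -> reported as cyan
```

No column contains all three channels, which is the whole point: at this spatial frequency there is simply no white-capable sampling site for the line to land on.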
Of course, this is an extremely artificial case that I'm constructing here. Nonetheless, it's instructive for being able to describe how the 800E could break down into false color patterns in, perhaps, isolated parts of a real scene.
Here is an example of the sort of goofy results that might occur as a consequence of not using an OLPF in conjunction with a Bayer CFA and demosaicing:
The original is a standard grayscale resolution chart squeezed down in size so that it begins to be of the order of the CFA and sensor pixels.
It was interpolated with a nearest neighbor algorithm, which is far from the best approach admittedly, but excellent for showing the kind of artifacts that I'm referring to. Note that I've scaled both of these up by a factor of 3 to better show the results.
Whether the D800 has a standard Bayer filter and what kind of interpolation is used is anybody's guess. So, this image is just an example of the kind of result that can occur for false color. This kind of "fringing" can look like CA, but it is strictly an artifact of digital processing of the image.
This raises a natural question as to what the "true" resolving power of the 800E would be if it were not pushed past this non-linear limit of false color. I would be compelled to answer that the resolution can't be better than that of the CFA, since averaging over this cell size is essential to avoiding false color. In this case, the cell size is about 10µm on a side and we can get around 50 lp/mm out of it without too much ado. This is just what one would expect for the 800, and this is pretty darn good.
But with the 800 we get a graceful degradation in performance because of the OLPF; with the 800E, there may be any number of odd artifacts in cases where the scene happens to include patterns with information at spatial frequencies above 50 lp/mm.