Not quite it.
That is obviously a shortened explanation of why the blue pigment appears pink. As shown earlier, we can re-create this effect.
I also mentioned in that post that the simple equal mix of the RGB color plates (from the Spirit Raw Images hosted by NASA) is only accurate for some of the images.
Why not all?
To explain this we need to look a little more at the Pancam and how it transmits its data. From our Pancam technical brief, we discover that the onboard computer on the rover (which controls the Pancam) has the ability to perform a limited set of image-processing tasks, one of which is:
(4) rudimentary automatic exposure control capability to maximize the SNR of downlinked data while preventing data saturation
This means that the brightness of all three color plates has been amplified to give the highest range of brightness for each plate. I don't know the graphics term for it, but the equivalent audio term would be something like hard limiting, so I'll use that.
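The hard-limiting idea can be sketched in a few lines of Python. This is my own minimal illustration, not the rover's actual exposure algorithm: treat a greyscale plate as a flat list of 0-255 values and apply whatever gain pushes its brightest pixel to 255.

```python
def hard_limit(plate):
    """Scale a greyscale plate so its brightest pixel hits 255,
    mimicking the per-filter auto-exposure described above.
    'plate' is a flat list of 0-255 brightness values."""
    peak = max(plate)
    if peak == 0:
        return plate[:]  # all-black plate: nothing to scale
    gain = 255 / peak
    return [min(255, round(p * gain)) for p in plate]

# A dim plate (say, the blue filter of a reddish Martian scene):
blue_plate = [10, 40, 80, 120]
print(hard_limit(blue_plate))  # the brightest value, 120, is pushed up to 255
```

The key point is that each filter gets its own gain, so a dim blue plate ends up looking just as bright as a strong red plate.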
Basically, in each of the three filter pics the exposure has been set so that the brightest part of the picture from each filter correlates with the absolute maximum brightness for that channel. For example, the brightest part of the red channel is FF0000, green is 00FF00, and blue is 0000FF. (Obviously they all come in as b/w pics, so in each black-and-white plate there is a perfect range from 000000 (absolute black) to FFFFFF (absolute white).)
You can test this by opening one of the black-and-white plates (Photoshop again, sorry). Select either 000000 or FFFFFF as the working color, then go to the Select menu and choose Color Range. Set Fuzziness to zero and click OK. For each extreme you will find at least a few pixels.
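The same Color Range check can be done in code. A minimal sketch, again treating a plate as a flat list of pixel values: a hard-limited plate should contain at least one absolute-black and one absolute-white pixel.

```python
def has_full_range(plate):
    """The programmatic equivalent of the Photoshop Color Range test:
    does this greyscale plate contain both an absolute-black (0)
    and an absolute-white (255) pixel?"""
    return min(plate) == 0 and max(plate) == 255

print(has_full_range([0, 30, 200, 255]))   # True  - looks hard limited
print(has_full_range([20, 30, 200, 240]))  # False - a normal exposure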
You can test the converse of this theory with a photo taken on Earth. Choose any photo taken on Earth (a good one to try is that autumn road picture that comes with Windows XP). Open it in Photoshop and set its blending options so only the blue channel is showing. It's very dark, and there are no 0000FF pixels at all; in fact there are only a few 0000AA pixels, and they are in the whitish parts. You can try this with any picture taken on Earth. Try to avoid pictures with solid black and white in them, however, or something silly like a rainbow: white requires bright amounts of all of R, G and B to show, and the rainbow is self-explanatory.
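The blue-channel check is easy to do without Photoshop too. The pixel values below are made up for illustration (a hypothetical daylight scene with no pure white or black in frame), but the pattern is what you would see: the brightest blue value falls well short of FF.

```python
# Hypothetical (R, G, B) pixels from an ordinary daylight Earth photo:
pixels = [(180, 140, 90), (120, 160, 60), (200, 190, 170), (90, 60, 30)]

# Look at only the blue channel and find its brightest value.
brightest_blue = max(b for _, _, b in pixels)
print(f"brightest blue value: {brightest_blue:02X}")  # well short of FF
```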
By sending each plate with its colors spread across the full range, you gain the maximum amount of data from each plate. Once you know the calibration information it is easy to scale each channel back down to its correct level and get the images looking as they should. If you were to send the images at equal exposure levels, the signal-to-noise ratio would be lower, and any slight error in one of the blue/green channels would be more noticeable.
Now, this only throws out the color balance on images where the original plates were not already almost even. Unfortunately that covers most of the pics where the rover isn't visible. Remember, each plate is hard limited when transmitted back, so for this not to change the look of the simple-combined image, the original plates would have to already be almost hard limited. There are a few where this is the case.
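Undoing the hard limit is just dividing the gain back out, if you know it. A minimal sketch, with a made-up gain value standing in for the real calibration data (which NASA has and we don't):

```python
def restore_exposure(plate, gain):
    """Undo the downlink amplification, assuming the gain applied to
    this filter were known. The gain here is hypothetical - the real
    value comes from the rover's calibration information."""
    return [round(p / gain) for p in plate]

# A hard-limited plate whose original peak was 120, so gain = 255/120:
limited = [21, 85, 170, 255]
print(restore_exposure(limited, 255 / 120))  # back to roughly [10, 40, 80, 120]
```

With all three channels restored this way, the plates would sit back at their true relative brightness, which is why the calibrated NASA images look so different from a naive equal mix.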
Now, a way to test this is to get these images:
You will have to shrink the first one from 1024 to 512. The 'EFF' is a prefix for 1024x1024 and 'EDN' is for 512x512; I don't know why, that's just the pattern I've noticed. These are the 3 plates that make up the top of the little silver pole and the corner of the sat-dish visible in the panorama.
Now, this pole has very bright almost white areas in the reflection. Thus all plates should be fairly even in exposure levels.
Combining them in Photoshop (in the manner mentioned before), we get:
Which is extremely close to the colors in the panorama, with slightly less of a red tint.
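The Photoshop channel merge amounts to nothing more than using each greyscale plate directly as one RGB channel. A minimal sketch, with plates as flat lists of pixel values:

```python
def combine_plates(red, green, blue):
    """Equal-mix combine: zip three greyscale plates together so each
    one becomes one channel of the resulting RGB image."""
    return [(r, g, b) for r, g, b in zip(red, green, blue)]

# Two pixels of made-up plate data (a bright whitish pixel, a dim one):
r_plate = [250, 100]
g_plate = [240, 90]
b_plate = [230, 60]
print(combine_plates(r_plate, g_plate, b_plate))
```

No per-channel scaling is applied here, which is exactly why this only looks right when the original plates were already close to hard limited.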
Yet when we use other 3-plate series from Sol 05 (which makes up most of the panorama), such as these ones, we get a completely different look, even though they are combined in the exact same manner. This is the effect of having all channels hard limited.
You can re-create this effect by choosing Auto Levels in Photoshop. While this is often handy for brightening up images and so forth, it does not work well when you are dealing with images that are predominantly one color, and whose brightest and darkest points are not shades of grey.
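You can see why with a small sketch of what Auto Levels roughly does: a per-channel min-max stretch. The pixel values below are invented to mimic a reddish scene, but they show the failure mode: once each channel is stretched independently, the red-dominant and blue-weak channels come out identical, and the tint is destroyed.

```python
def auto_levels(plate):
    """Per-channel min-max stretch, roughly what Photoshop's
    Auto Levels does to each channel independently."""
    lo, hi = min(plate), max(plate)
    if hi == lo:
        return plate[:]
    return [round((p - lo) * 255 / (hi - lo)) for p in plate]

# A reddish scene: red channel strong everywhere, blue weak everywhere.
red  = [200, 150, 100]
blue = [60, 40, 20]
print(auto_levels(red))   # [255, 128, 0]
print(auto_levels(blue))  # [255, 128, 0] - stretched to match red;
                          # the red tint is gone
```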
How does NASA do it?
Well, clearly high-end, purpose-made image-processing software is a big part of it. They also have all the relevant calibration and exposure information from the rover.
Are we boned?
Not at all. Any picture with white and black, or bright red, green and blue in it, will look almost exact when mixed evenly. And what is the one thing we have that has all of these? The sundial.
So for any photo of the sundial (such as the ones shown earlier in the thread), we can be fairly sure the colors will be accurate when mixed evenly. The convenient thing about the sundial is that it has mirrors on it to show the Martian sky, so with any plate series of L4-, L5- and L6-filtered plates, we can see a close approximation of the Martian sky.
We can see the sky color in the little mirrors at the edges of the sundial.
Now, the one flaw with all this is the fact that a slight, constant hue of any sort would be removed by the equalisation of all the channels. So if anything, all these pics would really have a slight overall tint that the even mix has stripped out.
Among the multitude of images from Spirit, a nice pair for comparison are these.
There is a series on Sol 8 which looks like a test of almost all the filters on one hill. This is good news, as it allows us to compare the difference between using an L2 filter as the red channel and using an L4 filter.
The results are below. REMEMBER these are normalized color images, not real color.
The slide on the left looks less red than the one on the right. Obviously the channels are normalised, so the colours are not true, but it is a good visual example of the idea that choosing the near-infrared filter for the red channel will actually give the appearance of less red than would using the L4 filter for the red plate.