posted on Jan, 24 2006 @ 05:27 PM
Have you ever wondered why, when you combine the R, G, and B photos from NASA, they come out with a blue sky? More to the point, have you felt this was significant and exposed a cover-up?
Fear no longer!
I shall now relate Dr. Wendy Calvin's and Dr. James R. Carr's explanation of how a raw image becomes a "true color image".
The basics of remote sensing: the received data are converted to numbers from 0 to 255, and each number represents an "intensity" value for a specific part of the spectrum.
In the case of the Mars rovers, each filter passes not a single wavelength but a continuum of wavelengths (instead of simply 430 nm it is, for instance, 410-450 nm), so you need to integrate across that band to get the actual 0-255 value of a given pixel.
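To make that integration step concrete, here is a minimal sketch in Python. The wavelength samples, radiance values, and filter response are all made-up illustrative numbers, not actual rover data; the idea is just that the pixel value comes from integrating intensity times filter response over the band, then quantizing to 8 bits.

```python
import numpy as np

def trapz(y, x):
    """Trapezoid-rule integral of samples y over points x."""
    return float(np.sum((y[:-1] + y[1:]) / 2.0 * np.diff(x)))

# Hypothetical spectral radiance sampled across a 410-450 nm passband.
wavelengths = np.array([410.0, 420.0, 430.0, 440.0, 450.0])  # nm
radiance = np.array([0.30, 0.45, 0.60, 0.50, 0.35])          # arbitrary units
response = np.array([0.2, 0.8, 1.0, 0.8, 0.2])               # filter transmission

# Band-averaged intensity: integrate radiance * response over the band,
# normalized by the integrated filter response.
band_avg = trapz(radiance * response, wavelengths) / trapz(response, wavelengths)

# Quantize to an 8-bit pixel value, assuming a full-scale radiance of 1.0.
full_scale = 1.0
pixel = int(round(min(band_avg / full_scale, 1.0) * 255))
print(pixel)  # → 130
```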
To ensure that the pixels of a given image all have the correct values to reproduce any color of the spectrum, they must be calibrated against a target: an image of the calibration target is taken immediately before or after. The target includes a white circle (now slightly off-color from dust, so the figures have changed some) which, in the Martian atmosphere, should register a value of 255.
So if the average value over a section of pixels on that target is, say, 240, you have to rescale all the pixel values in the image so that 240 maps to 255.
More complicated than that is the integration required to confirm you have the correct wavelengths for a color photograph.
NASA has a program to do this, but anyone can implement it with the algorithms given on a specific website... I'm sorry I don't have it saved to my Favorites, but Kano has posted a less detailed post from the same author of that website, Dr. Bell.
With calibrated images you merely need to combine the R, G, and B equivalents and adjust for contrast, and you'll have a rough approximation of true color (only analogue film can truly capture color, since digital cameras have the problem of interpreting everything as numbers).
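The combine-and-adjust step above can be sketched like this. The three 2x2 frames are made-up stand-ins for calibrated single-filter images, and the contrast adjustment shown is just one simple choice (a linear stretch), not the specific adjustment NASA uses.

```python
import numpy as np

# Hypothetical calibrated single-filter frames (2x2 pixels, 0-255).
red   = np.array([[200, 150], [120,  90]], dtype=np.uint8)
green = np.array([[130, 140], [110,  80]], dtype=np.uint8)
blue  = np.array([[ 90, 100], [100,  70]], dtype=np.uint8)

# Stack into an H x W x 3 color image (channel order R, G, B).
color = np.dstack([red, green, blue])
print(color.shape)  # → (2, 2, 3)

# A simple linear contrast stretch: map the darkest value to 0
# and the brightest to 255.
f = color.astype(np.float64)
stretched = ((f - f.min()) / (f.max() - f.min()) * 255).astype(np.uint8)
```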
So if you've ever wondered why, when you follow the standard procedures, you get a blue sky... this is why.