This effect comes down to two things.
First, the way digital images are captured. In a consumer digital camera, you have an image sensor. These image sensors respond to light, but on their own they can't tell one color from another. When light hits them, they generate an analog signal. That signal is then converted into a number by an analog-to-digital converter. You end up with a number ranging from 0, corresponding to black, up to 255, corresponding to white. But this only gives you a black and white image. To generate a color image, each element in the sensor has a tiny color filter in front of it; in the usual arrangement, half of them have green filters and a quarter each have red and blue filters. Each pixel in the resulting image is created by combining nearby red, green, and blue values into a final color. This works great, except you've essentially given up some of your sensor's resolution, because each color is only sampled at a fraction of the pixels.
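To make that concrete, here's a rough sketch in Python (assuming numpy is available; this is not any camera's actual pipeline) of how a mosaic of filtered pixels gets collapsed into color pixels, and why that costs resolution:

import numpy as np

def naive_demosaic(raw: np.ndarray) -> np.ndarray:
    """raw: (H, W) array of 0-255 sensor values under an RGGB filter mosaic.
    Returns an (H/2, W/2, 3) color image, so each color pixel is built from
    a 2x2 block of sensor pixels."""
    r  = raw[0::2, 0::2]                   # red-filtered sites
    g1 = raw[0::2, 1::2]                   # first green site in each block
    g2 = raw[1::2, 0::2]                   # second green site
    b  = raw[1::2, 1::2]                   # blue-filtered sites
    g  = (g1.astype(np.float32) + g2) / 2  # average the two green samples
    return np.dstack([r, g, b]).astype(np.uint8)

# Example: a fake 4x4 sensor readout becomes a 2x2 color image.
raw = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
print(naive_demosaic(raw).shape)  # (2, 2, 3)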
The cameras NASA uses get around this by putting a single filter in front of the whole sensor instead of a tiny filter on each pixel. So each image is captured at full resolution, but the data in the picture only corresponds to one color. However, standard red, green, and blue filters don't necessarily give you as much scientific value as other color filters could. Since the filter sits in front of the sensor rather than being built into it, you can mount many different filters on a wheel and rotate whichever one you want into place. Rather than just red, green, and blue, you can put things like infrared and ultraviolet filters in there. So you're no longer capturing red, green, and blue values that can be used to create a true color image. You're capturing wavelengths of light the human eye can't see. So even if you combine them into a "true color" image, it's not going to look exactly like what your eye (or a standard camera) would see, because the data it captured is different from the data your eyes use to see.
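Here's a hedged sketch of what building a color composite from separate filter-wheel frames might look like (assuming numpy and Pillow; the filenames and filter choices are made up for illustration, and this is not NASA's actual processing pipeline):

import numpy as np
from PIL import Image

def composite(red_path, green_path, blue_path):
    # Each file is a full-resolution grayscale frame shot through one filter.
    channels = [np.asarray(Image.open(p).convert("L"))
                for p in (red_path, green_path, blue_path)]
    rgb = np.dstack(channels)  # stack the three frames into an (H, W, 3) image
    return Image.fromarray(rgb, mode="RGB")

# If the three frames were taken through actual red, green, and blue filters,
# this is roughly a true-color image. Swap an infrared or ultraviolet frame
# into one channel and you get a false-color image: perfectly valid data,
# but not what your eye would see.
img = composite("filter_red.png", "filter_green.png", "filter_blue.png")
img.save("composite.png")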
Second, the issue of white balance. White balance refers to how a camera determines what color a neutral gray is. Let's say you're out in the sun. Sunlight is slightly yellow. So if you hold up a white piece of paper, it will actually be slightly yellow in color. Your brain adjusts for this effect automatically, so you see the white as white. But a camera records the actual light as it receives it, which will be yellow. You (or more likely your camera, automatically) adjust the color values in the image so whites look white. You can observe this effect by locking the white balance on your camera indoors under incandescent bulbs and then taking a picture outside. Everything will look blue, because the camera had to add blue to the image so the yellow indoor light looked white. This applies to the images from Mars because the sunlight is filtering through a different atmosphere, giving it a different color than it would have here on Earth. The values can be adjusted in the image so the white balance is close to what it is here on Earth, but it won't be perfect. This can affect the colors in the whole image.
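A rough sketch of that adjustment, assuming numpy and Pillow (the image file and patch coordinates are placeholders, and real mission processing is far more involved), is to scale each channel so a region known to be neutral gray comes out gray:

import numpy as np
from PIL import Image

def white_balance(img: np.ndarray, patch) -> np.ndarray:
    """img: (H, W, 3) RGB array; patch: (y0, y1, x0, x1) bounds of a region
    that should be neutral, e.g. a gray calibration target."""
    y0, y1, x0, x1 = patch
    means = img[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)  # per-channel average
    gains = means.mean() / means   # boost the channels the lighting suppressed
    balanced = img * gains         # apply the per-channel gains
    return np.clip(balanced, 0, 255).astype(np.uint8)

# Placeholder filename and patch location, purely for illustration.
raw = np.asarray(Image.open("mars_frame.png").convert("RGB"), dtype=np.float32)
fixed = white_balance(raw, patch=(100, 120, 100, 120))
Image.fromarray(fixed).save("mars_frame_balanced.png")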
So, to sum it up, there are valid scientific reasons for the images from Phoenix to look the way they do.