posted on Mar, 1 2011 @ 06:43 PM
After looking at the original photos, this is the result.
The green areas are just some bright pixels, probably the result of overloaded pixels in the blue+green sensor.
One interesting result of looking at this photo is that it shows that Google uses the IRB colour images instead of the RGB ones, so the colours are nothing like what we see on Google Mars.
This is what it looks like in the original IRB image.
And this is the same area in the RGB version.
Another thing we can see, once more, is that the Google image has too much contrast, resulting in many missing details. It probably also makes the image more compressible, allowing faster transfer rates.
The shadow in the wrong direction looks like a small crater in the original photo, so that explains it (I think).
PS: the IRB images are made with the three channels HiRISE uses (infrared, red and blue+green), so the colour is not a good representation of the original colours, although it shows the different types of materials on the ground better. The RGB images are not really RGB, because HiRISE doesn't have separate green and blue channels, only a wider blue+green channel. The images are created with the red from the red channel, the green from the blue+green channel, and the blue as the result of a somewhat strange formula:
Information for Scientific Users of HiRISE Color Products
The synthetic blue image digital numbers (DNs) consist of the BG image DN multiplied by 2, minus 30% of the RED image DN, for each pixel.
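In code, the quoted formula could be sketched like this. This is just an illustration of the stated rule, not HiRISE's actual pipeline; the 8-bit DN range and the clamping to 0-255 are my assumptions.

```python
def synthetic_blue(bg_dn, red_dn):
    """Synthetic blue DN = (BG DN * 2) - (30% of RED DN), per pixel.

    Assumes 8-bit DNs (0-255) and clamps the result to that range;
    the real HiRISE products may use a different bit depth.
    """
    dn = bg_dn * 2 - 0.3 * red_dn
    return max(0, min(255, round(dn)))

# Example: a mid-bright blue+green pixel with an equally bright red pixel.
print(synthetic_blue(100, 100))  # 200 - 30 = 170
```

So a pixel that is bright in red but dark in blue+green gets its synthetic blue pushed down, which is roughly why vegetation-free reddish terrain still ends up looking reddish in the RGB products.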
edit on 1/3/2011 by ArMaP because: I forgot the source.