

Examining NASA's Method Of Producing Color Images From Mars



posted on Feb, 8 2004 @ 02:41 PM
Images obtained as RGB composites using Pancam filters L2 L5 L6 and L4 L5 L6.

Example one:

L2 L5 L6

L4 L5 L6

Example two:

L2 L5 L6

L4 L5 L6

Earth Reference L4 L5 L6

Which is the proper way to create RGB images from Mars?

L2 L5 L6 is what NASA is using on most of the images at their press page.
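Mechanically, building either composite is the same operation: each filter frame is a grayscale image, and three of them are stacked as the R, G, and B channels. A minimal sketch with NumPy, using tiny synthetic arrays in place of the real raw Pancam frames (all pixel values here are illustrative):

```python
import numpy as np

def rgb_composite(red_frame, green_frame, blue_frame):
    """Stack three grayscale filter frames into one RGB image."""
    return np.stack([red_frame, green_frame, blue_frame], axis=-1)

# Illustrative 2x2 frames standing in for raw Pancam data
l4 = np.array([[200, 180], [190, 170]], dtype=np.uint8)  # 600 nm -> red
l5 = np.array([[120, 110], [115, 105]], dtype=np.uint8)  # 530 nm -> green
l6 = np.array([[ 60,  55], [ 58,  50]], dtype=np.uint8)  # 480 nm -> blue

img = rgb_composite(l4, l5, l6)  # an L4 L5 L6 composite
print(img.shape)                 # (2, 2, 3)
```

Swapping the L2 frame in for L4 as the first argument gives the L2 L5 L6 version; the stacking itself is identical, so the whole debate is about which frame feeds the red channel.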

[Edited on 8-2-2004 by ArchAngel]


posted on Feb, 8 2004 @ 03:08 PM
If you read Kano's thread and understand it (rough on the brain), then you will know WHY NASA's color pictures look the way they do.

You must look deeper to understand WHY they used L2 instead of L4.

Please read the Pancam Investigation.

On page 74 you can read the "MER Requirements Relevant to Pancam Calibration and Testing".

Line one:

Acquire at least one RGB and at least one stereo 360-degree panoramic image of each landing site with
the Pancam. Image one exposed rock that is also analyzed by another instrument.

This sounds like they should be creating an RGB color panorama for each lander. But you must look deeper into the report to see the lawyering going on.

On page 53:

One of these per rover, in RGB color (L2: 753 nm, L5: 535 nm, and L6: 483 nm)
and stereo (R2: 754 nm), is called for by the formal MER Level 1 Mission Success requirements
(Table 2).

This implies the SOURCE will be L2 L5 L6 for the composite RGB image.

I think we can see from the above that this is not the correct way to obtain an RGB image. The OUTPUT may be RGB, but the INPUT used for the images is not.

Continue reading to the very last pages of the report and look at the test color images: they used L4 L5 L6 for testing and created great color pictures.


[Edited on 8-2-2004 by ArchAngel]

posted on Feb, 8 2004 @ 03:16 PM
Let's look at the definition of RGB:

Red (700 nm)
Green (546.1 nm)
Blue (435.8 nm)

Now let's look at the response of the Pancam:

L2 750 nm
Bandpass 20

L3 670 nm
Bandpass 16

L4 600 nm
Bandpass 17

L5 530 nm
Bandpass 19

L6 480 nm
Bandpass 27

L7 430 nm
Bandpass 25

L8 440 nm
Bandpass 20

The response curve is not so simple, and can be found in the Pancam Investigation on page 95.

But it is clear that L2 is NOT the proper filter to use for creating RGB composites.
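As a quick sanity check, using only the filter center wavelengths listed above (this ignores the full response curves the report gives on page 95), we can ask which Pancam filter center sits nearest each RGB primary:

```python
# Nearest Pancam filter center to each RGB primary, by wavelength only.
# Values copied from the list above; this ignores the actual response
# curves, which are given in the Pancam report on page 95.
primaries = {"R": 700.0, "G": 546.1, "B": 435.8}  # nm
filters = {"L2": 750, "L3": 670, "L4": 600,
           "L5": 530, "L6": 480, "L7": 430, "L8": 440}  # nm

for channel, target in primaries.items():
    nearest = min(filters, key=lambda f: abs(filters[f] - target))
    print(channel, nearest, round(abs(filters[nearest] - target), 1))
```

By center wavelength alone the nearest matches come out as L3, L5, and L8; L2 at 750 nm sits beyond the 700 nm red primary, at the edge of the visible range, which is consistent with the point above that L2 is a poor choice for a true-color red channel.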

[Edited on 8-2-2004 by ArchAngel]

posted on Feb, 8 2004 @ 05:47 PM
Let's start with the images at NASA's press page that include the sundial and see how the described method compares with what they are publishing. This is, after all, the evidence they provided to counter the questions about the color.


Press Release

The sundial looks much like the ones I created from the raw data, other than being reduced in size and using a higher level of JPEG compression.

So let's recreate it and see what ours looks like. I would post it right here, but I can't: the raw data for it is not on NASA's site. They must have it, because this image is up on their press page, but they have not yet shared it even though the data was received over a week ago.


Press Release

This one has a VERY similar image, supposedly from a different Sol [data pack]. It is unfortunate that NASA has not yet posted the data for this, although they received it over a week ago.


Full Size Image
Press Release

If you look at the sundial in the full-size image you will see that the colors are wrong. They are using L2 in place of L4 again.

Now let's make our own. Guess what?

This time there is data, but there is only data for one small strip, and yes, it does include the sundial.

The image on the right is a normal RGB composite. The left has green reduced 20% and blue reduced 40%.

Notice that the color of the ground is not as red.
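The "reduced 20% / 40%" adjustment described above is just a per-channel multiplication. A sketch (the pixel values are made up, and `scale_channels` is an illustrative helper, not anything from NASA's pipeline):

```python
import numpy as np

def scale_channels(img, r=1.0, g=1.0, b=1.0):
    """Multiply each RGB channel by a factor, clipping to the 8-bit range."""
    out = img.astype(np.float64) * np.array([r, g, b])
    return np.clip(out, 0, 255).astype(np.uint8)

# One illustrative pixel; the left-hand image above reduces
# green by 20% and blue by 40%
img = np.array([[[200, 150, 100]]], dtype=np.uint8)
print(scale_channels(img, g=0.8, b=0.6))  # [[[200 120  60]]]
```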

Other than the image discussed in Kano's thread there are no more with the sundial on NASA's page.

We cannot verify the work NASA has done to prove the color of Mars.

[Edited on 8-2-2004 by ArchAngel]

posted on Feb, 8 2004 @ 07:41 PM
From NASA:

Press Release

The image has been reduced from the original 1024X1024 to 500X500.

Here is what it looks like with a L4 L5 L6 RGB composite from the raw data:

They have done a bit to tweak the data. This is an image with the red channel brightness turned up 40% and contrast up 20%, blue reduced 15%, and green reduced just slightly [approx.]:

Compare with NASA.

Was that much adjustment warranted?
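For reference, the kind of adjustment described above (a brightness scale plus a contrast stretch on a single channel) can be sketched as follows. The mid-gray pivot of 128 and the helper name are assumptions for illustration, not NASA's actual processing:

```python
import numpy as np

def adjust(channel, brightness=1.0, contrast=1.0):
    """Scale brightness, then stretch contrast about mid-gray (128)."""
    c = channel.astype(np.float64) * brightness
    c = (c - 128.0) * contrast + 128.0
    return np.clip(c, 0, 255)

# Red channel turned up 40% in brightness and 20% in contrast,
# applied to an illustrative pixel value of 100
red = np.full((1, 1), 100.0)
print(adjust(red, brightness=1.4, contrast=1.2))  # approximately 142.4
```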

[Edited on 8-2-2004 by ArchAngel]

posted on Feb, 8 2004 @ 10:01 PM
Archangel, surely I have explained enough by now why equalized images are incorrect. The onboard computer on the rover has automatic exposure control that amplifies the images so that each channel covers all brightness values, to increase the SNR of the transmission back to Earth. It also sends the exposure information (along with a lot of other engineering data) back with the images, and from that the images are re-created. When you simply combine the raw color plates after they have had the exposure amplified, you are making an image with adjusted values; you have to take the plates back to their original exposure levels to re-create a more realistic image.
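The de-equalization described above can be sketched as a per-channel rescale by exposure time. This is a deliberate simplification of the real radiometric correction, and the function name and exposure values are illustrative:

```python
import numpy as np

def deequalize(channel, exposure_ms, reference_ms):
    """Undo onboard auto-exposure by rescaling a channel's counts to a
    common reference exposure time (counts are proportional to
    exposure, so counts * t_ref / t puts all channels on one scale).
    A simplification of the actual radiometric correction."""
    scaled = channel.astype(np.float64) * reference_ms / exposure_ms
    return np.clip(scaled, 0, 255)

# Illustrative: a dim blue channel exposed twice as long as the
# reference comes out artificially bright; rescaling halves it.
blue = np.array([[240.0]])
print(deequalize(blue, exposure_ms=160, reference_ms=80))  # [[120.]]
```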

As for the image with the sundial/airbags/landscape (the one you have added Marvin to), I could have sworn I had already pointed out in another thread why that is so obviously flawed. It's actually a good example of the folly of equalizing the channels: that is a late-afternoon shot, but with the channels equalized it appears as bright as a midday shot on Earth.

As for the L2 filter, as mentioned many times previously, it is used because it is a better tool for the geologists attached to the mission to analyze the rocks and surroundings. Instead of wasting time/bandwidth sending additional L4 information, the decision has been made in most cases to just use the L2 for the red channel. Now, for the pigments etc. on the rover this obviously causes a few rather obvious quirks. But (and here's the point) it doesn't change the look of the landscape much at all.

We can point at the blue-pink shift on the blue color chip on the sundial all we like. But go out and find the images of the landscape that have been taken with L2456 (there are quite a few), then run them through Photoshop with rudimentary exposure control. If you keep the same settings on both channels, you will find the coloration of the image really doesn't change a great deal. This was explained, and an example given, in my original thread, but you seem to have ignored this.

The approximate exposure levels are found in the Maestro information and are mentioned in the PanCam discussion thread.

I'm also curious why you have started a few threads in the last two days on the same subject; please try to avoid this in future.

posted on Feb, 9 2004 @ 05:40 PM
Latest "color" image from Mars:

Full Size Image

From its new location at the inner edge of the small crater surrounding it, the Mars Exploration Rover Opportunity was able to look out to the plains where its backshell (left) and parachute (right) landed. Opportunity is currently investigating a rock outcropping with its suite of robotic geologic tools. This approximate true-color image was created by combining data from the panoramic camera's red, green and blue filters.

The data for this image is missing too, so we can't try to reproduce NASA's method.

[Edited on 9-2-2004 by ArchAngel]

posted on Feb, 10 2004 @ 12:29 AM
Here is a simple L456 RGB composite from Sol 35. No adjustments.

Compare with reference image.

What is different?

posted on Feb, 10 2004 @ 05:40 AM
Are there any satellite images showing the topography of this area? Just wondering, since the lander seems to be standing on a ridge of some sort (see the last image posted by ArchAngel); the lander seems way too far toward the horizon unless it is standing on the edge of something. See?

