SCI/TECH: What Color Is Mars, Really?

posted on Jan, 12 2004 @ 06:10 AM
As for the color replacement from the green tab on the sundial: the same thing can be achieved by using a color range selection. I selected 5 pixels from the exact center of the green tab and replaced every occurrence of these exact colors throughout the image. We get this:

From this I believe it is a fairly safe assumption that this is a flawed method. These points on the sundial cannot be green; this is simply not a full-visible-spectrum picture.

An over-exaggerated metaphor would be a Coke can sitting on green grass. We know the can is red, but it is not safe to assume that all shades of grey in the picture are going to be red.

posted on Jan, 12 2004 @ 06:53 AM

Originally posted by Kano An over-exaggerated metaphor would be the case of a coke can sitting on green grass. We know the can is red. But it is not safe to assume that all shades of grey in the picture are going to be red.
Exactly... only that some shades of gray are the same shade of gray that was once green. We still know there was a "catastrophic" color-shift in the panorama image, but we're not absolutely certain the same color-shift is in all images, because we don't have the tell-tale blue-as-pink color swatch to tell us (we can assume, but can't be certain it's the same color-shift).

posted on Jan, 12 2004 @ 07:51 AM
Pancam
Pancam uses 1024×2048-pixel Mitel CCD array detectors developed for the MER Project. The arrays are operated in frame-transfer mode, with one 1024×1024-pixel region constituting the active imaging area and the adjacent 1024×1024 region serving as a frame-transfer buffer. The frame-transfer buffer has an opaque cover that prevents >99% of light at all wavelengths from 400 to 1100 nm from being detected by this region of the CCD. The pixels are contiguous, and the pitch is 12 μm in both directions. The arrays are capable of exposure times from 0 msec (to characterize the "readout smear" signal acquired during the ~5 msec required to transfer the image to the frame-transfer buffer) to 30 sec. Under expected operating conditions, the arrays have at least 150,000 electrons of full-well depth and a read noise of less than 50 electrons. Dark current varies with temperature and is negligible at -55°C; the signal-to-noise ratio is greater than 200 at all signal levels above 20% of full scale. The detector response has a linearity >99% for signals between 10% and 90% of full well.

Each array is combined with optics and a small filter wheel to form one eye of a multispectral, stereoscopic imaging system. The optics for both cameras consist of identical 3-element symmetrical lenses with an effective focal length of 38 mm and a focal ratio of f/20, yielding an IFOV of 0.28 mrad/pixel and a square FOV of 16.8°×16.8° per eye. The optics and filters are protected from direct exposure to the martian environment by a sapphire window at the front of the optics barrel. The optical design provides for more than 90% of the encircled energy to be contained in an area equal to 3×3 IFOVs, and 99% in an area equal to 5×5 IFOVs, across the entire range of spectral responsivity of the instrument and over the required operating temperature range for performance of Pancam within specifications (-55°C to 0°C).

The optical design allows Pancam to maintain optimal focus from infinity to within about 1.5 meters of the cameras. At ranges closer than 1.5 meters, Pancam images suffer from some defocus blur. For example, at a range of 80 cm (the approximate distance to the Pancam calibration target), the defocus blur is about 10 pixels. Each filter wheel has eight positions, allowing multispectral sky imaging and surface mineralogic studies in the 400-1100 nm wavelength region. The left wheel contains one clear (empty) position. The remaining filter wheel positions are filled with narrowband interference filters that are circular and 10 mm in diameter, and that have the central wavelengths and bandpasses listed in Table 2.1.2-1. One filter on each eye has an ND5.0 coating to allow direct imaging of the Sun at two wavelengths.

LEFT CAMERA..............RIGHT CAMERA
L1. EMPTY................R1. 430 (SP)*
L2. 750 (20).............R2. 750 (20)
L3. 670 (20).............R3. 800 (20)
L4. 600 (20).............R4. 860 (25)
L5. 530 (20).............R5. 900 (25)
L6. 480 (25).............R6. 930 (30)
L7. 430 (SP)*............R7. 980 (LP)*
L8. 440 Solar ND.........R8. 880 Solar ND
*SP indicates short-pass filter; LP indicates long-pass filter
Table 2.1.2-1: Pancam Multispectral Filter Set: Wavelength (and Bandpass) in nm

Radiometric calibration of both Pancam cameras will be performed with an absolute accuracy of 7% or better and a relative (pixel-to-pixel) precision of 1% or better. Calibration will be achieved using a combination of preflight calibration data and in-flight images of a Pancam calibration target carried by the rover. The Pancam calibration target is placed within unobstructed view of both camera heads and will be fully illuminated by the Sun between at least 10:00 AM and 2:00 PM local solar time for nominal rover orientations.

The target has three gray regions of variable reflectivity (approximately 20%, 40%, and 60%) and four colored regions (peak reflectance in the blue, green, red, and near-IR) for colorimetric calibration. It includes a vertical post that will cast a shadow simultaneously across all three gray surfaces at some time within the 10:00 AM to 2:00 PM nominal operating range. The calibration target is large enough that defocus blur will not produce significant degradation of the calibration images.

The two Pancam eyes are mounted on a mast on the rover deck. The mast is referred to as the Pancam Mast Assembly (PMA), and also includes several key components for the Mini-TES. The PMA is erected to the vertical position by a deployment actuator at its base. The cameras are located on a "camera bar" with a boresight 180° from the Mini-TES boresight. The rover navigation cameras (Navcams) are also located on this same camera bar and point in the same direction as Pancam. The boresight of the Pancam cameras is approximately 1.3 m above the martian surface with the PMA in the deployed position. The cameras are moved together by 90° in elevation using a geared brush motor on the camera bar. The entire PMA head, including the cameras, can be rotated 360° in azimuth by a geared brush motor assembly. A separate geared brush motor provides elevation actuation for the Mini-TES elevation mirror assembly. Hard stops are provided for all actuation axes. The two Pancam eyes are separated by 30 cm horizontally and have a 1° toe-in. This separation and toe-in provide an adequate convergence distance for scientifically useful stereo topographic and ranging solutions to be obtained from the near-field (5-10 m) to approximately 100 m from the rover. Pointing control is

posted on Jan, 12 2004 @ 09:04 AM
RGB pictures can never be 'true color', as it is impossible to re-create every color the eye can perceive using only standard red, green and blue.

A brief explanation can be found here:

To explain visually, we can use the CIE diagram.

Basically, using whatever points we pick for red, green and blue, we can only make a triangle, and never cover every possible visible color.
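To make the triangle idea concrete, here's a small self-contained sketch. The primary coordinates below are approximate sRGB-style values used purely for illustration (they are not the PanCam's actual points); the test itself is just a standard point-in-triangle check on the CIE xy plane:

```python
def in_gamut(p, r, g, b):
    """Return True if chromaticity point p lies inside the triangle
    with vertices r, g, b (all (x, y) pairs on the CIE diagram).

    Uses the sign of the cross product against each edge: the point is
    inside when it sits on the same side of all three edges.
    """
    def side(a, b, c):
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
    s1, s2, s3 = side(r, g, p), side(g, b, p), side(b, r, p)
    return (s1 >= 0 and s2 >= 0 and s3 >= 0) or (s1 <= 0 and s2 <= 0 and s3 <= 0)

# Illustrative sRGB-like primaries on the CIE xy diagram (approximate).
R, G, B = (0.64, 0.33), (0.30, 0.60), (0.15, 0.06)

print(in_gamut((0.31, 0.33), R, G, B))  # near the white point -> True
print(in_gamut((0.10, 0.80), R, G, B))  # saturated green outside the triangle -> False
```

Any color whose chromaticity falls outside the triangle simply cannot be reproduced by mixing those three primaries, which is the whole point of the CIE diagram argument above.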

As Dr. Bell stated in his email, the choice for the Red channel in this picture was incoming light with a wavelength of 750 nm. This corresponds to light at the extreme end of the visible spectrum, at the beginning of infra-red light.

Now, it seems the blue pigment used for the color chip on the sundial (and elsewhere on the electrical connectors) is especially bright at 750 nm. I suspect that if anyone can get their hands on Dr. Bell's papers about the choice of pigments used, it would shed some light. But it appears the quirk we see in the photos is a result of the choice of wavelengths to set as Red, Green, and Blue, combined with the reflective properties of the pigments used.

Perhaps the longer wavelength setting was chosen to increase the range of colours the camera is able to pick up.

Remember, as someone pointed out, it is not simply a matter of taking a color photo. The PanCam needs to take 3 photos with different filters to pick up the 3 wavelengths. These wavelengths are then assigned to RGB and combined to give the images we see.

It seems, therefore, that it should also be possible to record some images using the L4, L5 and L6 filters as R, G and B respectively, which I imagine we will see as the mission progresses.

Remember, shifting the arbitrary points for RGB is just like increasing the size of the triangle shown inside the CIE Chromaticity Diagram. Only when it is reconverted to the standard RGB shown by our computers is the triangle shifted back into its normal position. Thus, for the majority of the spectrum, the change in perceived colour is minimal, but for colours close to exact 'Red', 'Green' and 'Blue' the shift is somewhat more noticeable. Also, the effect of the extra-bright blue chip in the near-infra-red range gives it the interesting pink look.

posted on Jan, 12 2004 @ 09:23 AM
Right... certainly. But it still doesn't explain how Blue became Hot Pink.
That's still the big mystery here... a giant leap across the color wheel.

posted on Jan, 12 2004 @ 09:59 AM
Pink isn't that far from blue in color space.

Here's a quick way to re-create the effect yourself, btw.

The name of each image in the raw images directory shows what filter the image was taken with (yes, those black and white ones).

For example 2P126644567ESF0200P2095L2M1.JPG was taken with the L2 filter, which we know is at 750nm.
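Assuming all the filenames follow the layout of that example (with the filter token sitting just before the trailing "M1"-style code — this is inferred from the example, not from an official naming spec), a few lines of Python can pull the filter out and look up its wavelength:

```python
import re

def filter_code(filename):
    """Extract the filter code (e.g. 'L2') from a raw rover image filename.

    Assumes the filter token is L or R plus a digit, just before the
    final 'M1'-style suffix -- inferred from the example filename above,
    not from an official naming specification.
    """
    m = re.search(r'([LR]\d)M\d\.JPG$', filename, re.IGNORECASE)
    return m.group(1) if m else None

# Center wavelengths (nm) from the left-camera filter table quoted earlier.
WAVELENGTHS = {'L2': 750, 'L3': 670, 'L4': 600, 'L5': 530, 'L6': 480}

code = filter_code('2P126644567ESF0200P2095L2M1.JPG')
print(code, WAVELENGTHS[code])  # -> L2 750
```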

All the raw images seem to follow this format. Here's a way to re-create the effect seen by shifting the redpoint (that's all that's been done, and it actually makes the surface seem less red):

Photoshop-only explanation here.
Download these 2 sets of 3 images.
Series 1.

Series 2.

Now, in the first series the Red component is from filter L4 (600 nm), and in the second the filter is L2 (750 nm). The green and blue filters are the same for both, at L5 (530 nm) and L6 (480 nm) respectively.

To combine these, we will start with the first series. Open the L4 filtered image in Photoshop first; this will be the background. Then open the L5 image and copy/paste it as a layer over the L4 (layer 1), then copy/paste the L6 image as a layer over both (layer 2). Now all you have to do is right-click on the L6 layer, go to blending options, advanced blending, and make sure only the blue channel is selected (deselect the other 2); this makes that layer the blue channel. Now do the same for layer 1 (L5), but select the green channel. You don't have to do anything to the background layer: as long as you haven't changed the opacity, red is the only thing that can show through from it.

You should now have a regular, true-colour image of the sundial, since those three wavelengths are roughly what standard RGB corresponds to.

Now, if we repeat the process with the second series of images, using the L2 layer as the background (and therefore the red channel), we get a completely different looking sundial.

Try it for yourself.
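For anyone without Photoshop: the recipe above amounts to dropping each filtered frame into one channel of an RGB image. A minimal sketch in plain Python (grayscale images represented as 2D lists of 0-255 values, so no image library is needed):

```python
def combine_rgb(red, green, blue):
    """Combine three grayscale images (2D lists of 0-255 values) into one
    RGB image, i.e. a 2D list of (r, g, b) tuples.

    red/green/blue would be the L4 (or L2), L5 and L6 filtered frames.
    """
    return [
        [(r, g, b) for r, g, b in zip(row_r, row_g, row_b)]
        for row_r, row_g, row_b in zip(red, green, blue)
    ]

# A toy 1x2 "image": one pixel bright in red, one bright in blue.
red   = [[200,  40]]
green = [[ 60,  60]]
blue  = [[ 50, 220]]
print(combine_rgb(red, green, blue))  # -> [[(200, 60, 50), (40, 60, 220)]]
```

Swapping the `red` plane from the L4 frame to the L2 frame is the whole "redpoint shift" — nothing else in the pipeline changes.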

Here are the two processed images:

Series 1:

Series 2:

All this from a little shift of the redpoint by 150 nm. You'll also notice it doesn't really change the look of much apart from the extreme colours. I will make a diagram to show how the color space is transposed.

posted on Jan, 12 2004 @ 10:38 AM
Ok, an explanation of why some colors look out of place in the L2/L5/L6 filtered image.

Firstly we have the full visible spectrum.

We can then map our RGB colorspace onto it:

The curved grey region is the entire visible spectrum. The white triangle is the region of colours displayable by RGB. (The L4, L5 and L6 filters correspond to the points R, G and B.)

This is the space recorded when the L4 filter is replaced with the L2 filter (i.e. shifting the Red point by 150 nm, to the very edge of infra-red).

Notice there is a region recorded that is outside the visible spectrum. (The bottom right corner of the RGB triangle).

Now, when we display the composite RGB image back on our monitors, the colorspace recorded by the PanCam (with regions outside the visible) is transposed onto the displayable region shown in the first image. Thus a small region of infra-red is now added to the end of the red channel as it is squashed into the displayable region.

Now, this means anything that is very reflective in the near-infra-red spectrum (for example the blue pigment) gets a massive boost in the Red channel when transposed. By comparing the L2 and L4 images of the green chip, we can see the green chip is also quite a bit more reflective in L2 than in L4. Thus the blue pigment appears pink, and the green a kind of beige.
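A toy example with made-up numbers (illustrative reflectances, not measured values) shows the effect. The "blue" pigment is assumed dark through most of the visible but bright at 750 nm:

```python
# Made-up reflectances of a "blue" pigment at the filter wavelengths,
# scaled 0-255 as the camera would record them (illustrative only).
pigment = {480: 180, 530: 60, 600: 40, 750: 230}

true_color = (pigment[600], pigment[530], pigment[480])  # L4/L5/L6 as R,G,B
shifted    = (pigment[750], pigment[530], pigment[480])  # L2/L5/L6 as R,G,B

print(true_color)  # -> (40, 60, 180): low red, high blue -- reads as blue
print(shifted)     # -> (230, 60, 180): high red AND high blue -- pink/magenta
```

The only change is which wavelength feeds the red channel; the pigment's near-IR brightness does the rest.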

Also, you'll notice that the transposition would actually make the environment look less red, as anything in the true red range (600 nm) would be shifted to a slightly shorter apparent wavelength and appear more orange.

Now, a possible reason for this choice of Red channel: by shifting the red channel to the extreme edge of the visible spectrum, more data can be recorded that would otherwise be missed if red were taken at 600 nm (i.e. anything from 610-750 nm would be lost). The mission is a geological one, not a picture-taking one. They can also be fairly sure that there are going to be few to no bright blue or green items on the surface, and this can be confirmed by using the L4, L5 and L6 filter set periodically. So using L2 instead of L4 as the red point allows more of the visible spectrum to be recorded in the same image, with some slight quirks that don't really affect the mission of Spirit on Mars.

[Edited on 12-1-2004 by Kano]

posted on Jan, 12 2004 @ 10:54 AM
"So... what color is Mars... really?"

i have a great idea for finding out.......

Step 1
Go to a store, don't tell anyone what store (but a place like Wal-Mart or Kmart or Target (except they're French-owned))
Step 2
Go back to the sporting goods area (make sure you're not being followed)
Step 3
Purchase a brand new high-end telescope (don't tell the clerk what you will be looking at, since he might give you the telescope that the US government has colored the lenses on)
Step 4
Take it home (make sure you're not being followed) and set it up on a clear moonless night (so it's harder for the spy craft to see you doing so), then look at the planet Mars with the telescope. IT'S RED, JUST LOOK, IT'S RED.

posted on Jan, 12 2004 @ 11:37 AM
We're talking about how it would look from the surface of the planet.

[Edited on 12-1-2004 by TheBandit795]

posted on Jan, 12 2004 @ 11:44 AM
He's right, though. It is obvious that there's no massive discoloration of the surface due to this effect, for the reasons explained above.

posted on Jan, 12 2004 @ 11:51 AM

Originally posted by Kano He's right, though. It is obvious that there's no massive discoloration of the surface due to this effect, for the reasons explained above.
Unknown, really. We can certainly infer, from ground-based observations, that the surface color is similar to what we're seeing in these pictures. However, by applying the same logic one would assume the surface of the Earth is mostly blue when seen from afar... and we know that effect is due to our atmosphere. I suppose the next logical question, following Kano's excellent contributions in colorspace theory (I'm not in agreement that it all applies in this case, but it's a great example of why ATS is the place for the discussion), is: why not publish pictures in unfiltered colorspace? I suppose the easiest "conspiracy" answer is that the uber-techies at NASA assume we normal humans wouldn't understand the true colors of Mars.

posted on Jan, 12 2004 @ 12:03 PM

why not publish pictures in unfiltered colorspace? I suppose the easiest "conspiracy" answer is that the uber-techies at NASA assume we normal humans wouldn't understand the true colors of Mars.

I'm sure there will be pictures taken using L4 as the red channel instead of L2. The simplest answer is as above, though: the mission really isn't to take pictures for the people at home.

There's a mission I forgot to give to people; I'll try to do it myself also. What I would really like to find (and what would answer all of our questions) is a picture (in parts) in the Raw images folder comprising the L4, L5 and L6 filters. Keep an eye out for this as you browse around. Most images use L2 as the Red channel, but if we could find a series using L4, L5 and L6, it's a 'true color' image.

Although, as stated before, with a 'true color' image you actually lose data from the beyond-red area of the spectrum. But still, it would answer the questions posed here.

As mentioned earlier, you can tell what filter an image was taken with by looking here:


Means it is taken with the L4 filter (600 nm).
A series would have all the rest of the filename the same; we are looking for a series with (at least) L4, L5 and L6.

What would be amazing would be a 4-part series that also has the L2 filtered image, so we could compare the same scene with a different basepoint setting for the Red channel.
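Hunting for such a series by hand is tedious. Assuming the filename layout holds as inferred earlier in the thread (the filter token just before the trailing "M1"-style code; the second prefix below is a made-up example, not a real image name), a script could group the raw filenames and flag complete L4/L5/L6 sets:

```python
import re

def find_true_color_series(filenames):
    """Group raw-image filenames by everything except the filter token and
    report the groups that contain all of L4, L5 and L6.

    Assumes the filter token ([LR] plus a digit) sits just before the final
    'M1'-style suffix -- inferred from examples, not an official spec.
    """
    groups = {}
    for name in filenames:
        m = re.search(r'^(.*)([LR]\d)(M\d\.JPG)$', name, re.IGNORECASE)
        if m:
            key = (m.group(1), m.group(3))
            groups.setdefault(key, set()).add(m.group(2).upper())
    return [key for key, filters in groups.items()
            if {'L4', 'L5', 'L6'} <= filters]

names = ['2P126644567ESF0200P2095L4M1.JPG',
         '2P126644567ESF0200P2095L5M1.JPG',
         '2P126644567ESF0200P2095L6M1.JPG',
         '2P999999999ESF0200P2095L2M1.JPG']  # made-up second prefix
print(find_true_color_series(names))
# -> [('2P126644567ESF0200P2095', 'M1.JPG')]
```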

posted on Jan, 12 2004 @ 12:18 PM
I believe (I really hope libliam doesn't see this) and have always felt that NASA does alter images in a way to hide the truth from us. Look at the blue hue around the perimeter of Mars.

[Edited on 12-1-2004 by 29MV29]


posted on Jan, 12 2004 @ 12:21 PM
Yeah, there was that topic not so long ago about the blue outline of Mars in the above photo...

posted on Jan, 12 2004 @ 12:24 PM

Originally posted by TheBandit795
Yeah, there was that topic not so long ago about the blue outline of Mars in the above foto...

Yeah, but that was in RATS, where not everyone can see or comment on the blue hue.

posted on Jan, 12 2004 @ 12:37 PM

Originally posted by 29MV29
I believe (I really hope libliam doesn't see this) and have always felt that NASA does alter images in a way to hide the truth from us. Look at the blue hue around the perimeter of Mars.

NASA released that image. Do you see the logic flaw?

The blue hue visible in these Hubble shots is ice crystals in the upper atmosphere.

I moved the thread regarding this phenomenon out of RATS so people can see the information provided for themselves.

You can see it here:

posted on Jan, 12 2004 @ 02:39 PM
I was watching a show on the Science Channel about Mars and the Viking lander (I think that was the first one), or whatever the first Mars lander was called, taking the first pictures of Mars and them coming back with a blue sky; then the next day they came back with red pics, saying there was a problem with the blue calibration. I am looking for those pics right now. Kinda reminded me of when NASA found the Mars face, made a big deal about it one day, and the next day said it was weird shadowing. Seems to me like NASA is getting "corrected" an awful lot. One more thing: why is it so hard for them to calibrate the cameras? All the cameras I have ever used had pretty good calibration right off the bat (at least you can tell what is red and what is blue). I would assume that NASA probably has better cameras than I do. Just seems strange to me!

posted on Jan, 12 2004 @ 03:00 PM
I haven't found any of the originals, but I did find this:

posted on Jan, 12 2004 @ 03:00 PM
I tried to get some other color references for the Spirit craft. This is a photo of the balloons taken on Earth. Does it look a bit modified, like the images from Mars? It has way too much red. Take a look and decide for yourself.

posted on Jan, 12 2004 @ 03:08 PM
Looks like they did this with the sundial before, on the Pathfinder and Viking missions.

Check out this site...

[Edited on 12-1-2004 by The Real Deal]
