
BBC article suspecting NASA is manipulating the colors


posted on Jan, 29 2004 @ 07:54 PM

Originally posted by mikromarius
Thanks for the explanation on the dust. Finally, someone who actually answers questions instead of whining about how stupid I am. I want these questions answered, that's all. And I see now that I turned 17% into 0.17%, sorry about that. But 1/6th isn't much. They still seem to fall much more quickly than they should.


Again, that is precisely because there is no air (to speak of) to slow the fall: from the moment the fall starts, in less than a second or so the dust would likely be falling FASTER on the Moon than it would on Earth.

Air resistance plays a HUGE role in our intuitive observations about what looks like a "natural" speed, simply because it dominates the environment where we live our lives.

I'm sure if we had lived all of our lives observing the behavior of objects in a near-vacuum, it would look natural for all objects to fall fast, and things like dust and feathers falling slowly would look absolutely freakish.

Such folks might no doubt wonder if their exploration agency was attaching wires to some objects in order to slow their fall, because it just didn't look right.
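To put rough numbers on the dust question, here's a quick back-of-the-envelope sketch in Python. The grain size and density are just assumptions for illustration, not mission data:

```python
# Compare a small dust grain falling on Earth (where Stokes drag caps its
# speed) against the Moon (vacuum, so nothing caps it at all).
# Assumed values: a hypothetical 50-micron rock grain.

EARTH_G = 9.81          # m/s^2
MOON_G = 1.62           # m/s^2, roughly 1/6 of Earth's
AIR_VISCOSITY = 1.8e-5  # Pa*s, dry air near 20 C
RADIUS = 50e-6          # m (assumed grain radius)
DENSITY = 3000.0        # kg/m^3 (assumed rock density)

# Stokes terminal velocity: for grains this small, drag balances weight
# almost immediately, so this is effectively the Earth fall speed.
v_term_earth = 2.0 * RADIUS**2 * DENSITY * EARTH_G / (9.0 * AIR_VISCOSITY)

# On the Moon there is no air, so speed just grows linearly with time.
def moon_fall_speed(t_seconds: float) -> float:
    return MOON_G * t_seconds

print(f"Earth terminal velocity: {v_term_earth:.2f} m/s")          # ~0.91 m/s
print(f"Moon speed after 1 s:    {moon_fall_speed(1.0):.2f} m/s")  # 1.62 m/s
print(f"Moon overtakes Earth at: {v_term_earth / MOON_G:.2f} s")   # ~0.56 s
```

So for fine dust, the lunar grain really is falling faster within about half a second, exactly as described above.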




The camera registers LIGHT, not colors. The way that the camera can produce "color" images is to put a filter in front of it that filters out all frequencies EXCEPT the one that you are interested in.


And I am fully aware of that. My question is simply: how do they manage to get a fluorescent, or rather phosphorescent, effect in the blue hues without using a second light source emitting UV or IR light? You don't get this effect by simply putting a color filter in front of the camera, as far as I know.


OK, I think I may see the problem here. You have to realize that sunlight comes in what is called a "blackbody spectrum". It's a goofy name, since it's used to describe the behavior of things that emit light because of temperature.

Our sun is one of those objects. The Planck radiation law describes how the spectrum of emitted light changes as the temperature of an object rises.

For blackbody radiation, temperature is really the ONLY determining factor in the spectral distribution. It's actually a little easier to demonstrate the results than describe them.

Try using this applet to see what a blackbody spectrum looks like.

Our sun's surface is at roughly 6000 kelvin, so click and drag on the thermometer and set it to about 6000.

Notice how very wide the spread of frequencies is? Sunlight contains plenty of ultraviolet and infrared frequencies, going WAY outside the visible spectrum.

Now, it's useful here on Earth that the peak of the distribution lands near the center of the human-visible spectrum. Some would say that eyes evolved to see those frequencies precisely BECAUSE they are quite bright.

But sunlight is more than sufficient as a light source for CCD filters that go WAY, way outside of the human-visible spectrum. No "extra" light source is needed, since we have a nicely-hot blackbody radiation source at the center of our solar system.
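If you'd rather check the numbers than trust the applet, here's a minimal sketch of Planck's law, evaluated at a couple of visible wavelengths and at the 750 nm near-IR point discussed below:

```python
import math

H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
K_B = 1.381e-23  # Boltzmann constant, J/K

def planck(wavelength_m: float, temp_k: float) -> float:
    """Spectral radiance B(lambda, T) in W * sr^-1 * m^-3."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.exp(H * C / (wavelength_m * K_B * temp_k)) - 1.0
    return a / b

SUN_T = 6000.0  # K, the rough figure used above
for nm in (500, 600, 750):
    print(f"{nm} nm: {planck(nm * 1e-9, SUN_T):.2e}")
# 500 nm: ~3.2e13, 600 nm: ~2.9e13, 750 nm: ~2.1e13 -- the near-IR point
# is the same order of magnitude as the visible peak.
```

The 750 nm output is only modestly dimmer than the visible peak, which is the point: sunlight alone is plenty for a near-IR filter.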



And unless they tweak or "compress" the colors back into the visible spectrum, the effects won't even be visible. You would have to scale up the invisible light in order to see it.


Actually, they don't have to scale it up at all. All they do is take a signal from 750 nm (which the blackbody curve shows is still quite bright compared to visible frequencies), and when they combine it, the computer in effect "pretends" that the 750 nm signal was actually a 600 nm signal.

It's not that they've necessarily amped it up... they've just taken the curve and shifted it sideways. This is, of course, just plain wrong to do... and especially so when they have the tools right there to see the REAL curve instead of pretending that one signal is another.
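To be concrete about what "shifting sideways" means in practice, here's a minimal sketch of that channel substitution. The array names and filter wavelengths are illustrative, not the actual pipeline:

```python
import numpy as np

def false_color(ir_750nm, green_530nm, blue_480nm):
    """Stack three filtered grayscale frames into one RGB image.

    Nothing is amplified: the 750 nm frame is simply RELABELED as the
    red channel, so the computer "pretends" IR light was red light.
    """
    return np.dstack([ir_750nm, green_530nm, blue_480nm])

# Three toy 2x2 filtered frames, values in 0..1:
ir = np.full((2, 2), 0.8)   # bright in the near-IR
g  = np.full((2, 2), 0.3)
b  = np.full((2, 2), 0.2)

rgb = false_color(ir, g, b)  # shape (2, 2, 3); the IR signal shows as red
```

Anything with a strong 750 nm response comes out looking intensely red, even if it reflects very little actual red light.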



Why on Earth NASA is doing this in their press pictures is quite odd in my opinion. And their argument that every bloody blue thing on Mars is painted with some kind of superpaint in order to "calibrate" the pictures is just not good enough.


I'm having a lot of trouble accepting that explanation as well. It makes no sense to me that they would DELIBERATELY try to make so many things high-response in the IR when the natural curves are so different.

I could almost buy it on the sundial, since that is a calibration tool, and it is useful to be able to calibrate the high-IR right-lens filters. But why use so many high-IR-response pigments elsewhere?

If anything, doing so would almost certainly result in LESS data being received when a portion of the Rover is in the scene, due to the normalization of the channels. The Rover might very well present the brightest IR signal, and thereby prevent capturing a lot of the more subtle IR data from surrounding materials.
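A toy example of why that normalization hurts, assuming a simple divide-by-maximum scheme (the real processing is more sophisticated, but the effect is the same):

```python
import numpy as np

def normalize(frame: np.ndarray) -> np.ndarray:
    """Scale a channel so its brightest pixel becomes 1.0."""
    return frame / frame.max()

ground = np.array([0.10, 0.12, 0.15])  # subtle IR variation in the soil
with_rover = np.append(ground, 0.90)   # one high-IR-response rover pixel

print(normalize(ground))      # [0.67 0.80 1.00] -- differences easy to see
print(normalize(with_rover))  # [0.11 0.13 0.17 1.00] -- soil detail crushed
```

One hot pixel from a high-IR pigment sets the scale for the whole channel, squeezing the subtle ground signal into a narrow, nearly uniform band.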



posted on Jan, 30 2004 @ 07:51 AM

Originally posted by BarryKearns
Actually, they don't have to scale it up at all. All they do is take a signal from 750 nm (which the blackbody curve shows is still quite bright compared to visible frequencies), and when they combine it, the computer in effect "pretends" that the 750 nm signal was actually a 600 nm signal.


Exactly what I meant. Only I said scale up when I really meant scale down. In other words, tamper with the colors: make one color become another.


It's not that they've necessarily amped it up... they've just taken the curve and shifted it sideways.


Well, maybe pitch is a more correct word to use. Speaking in sound terms: they have taken a frequency we can't hear, pitched it down so we can hear it, and presented the result as a sound from outer space. Why would they do that? It sounds a little strange, to say the least.
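The sound analogy can even be written out directly; a small sketch with made-up numbers:

```python
# Take an ultrasonic tone (inaudible, like the 750 nm signal is invisible)
# and drop it an octave at a time until it lands in the audible band.

ULTRASONIC_HZ = 44_000       # above the ~20 kHz ceiling of human hearing
AUDIBLE_CEILING_HZ = 20_000

pitched_down = float(ULTRASONIC_HZ)
while pitched_down > AUDIBLE_CEILING_HZ:
    pitched_down /= 2.0      # one octave down per step

print(pitched_down)  # 11000.0 Hz -- audible now, but no longer the sound
                     # that was actually recorded
```

You can hear *something* afterward, but presenting it as "what it sounds like" is exactly the complaint being made about the colors.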


This is, of course, just plain wrong to do... and especially so when they have the tools right there to see the REAL curve instead of pretending that one signal is another.



Why on Earth NASA is doing this in their press pictures is quite odd in my opinion. And their argument that every bloody blue thing on Mars is painted with some kind of superpaint in order to "calibrate" the pictures is just not good enough.


I'm having a lot of trouble accepting that explanation as well. It makes no sense to me that they would DELIBERATELY try to make so many things high-response in the IR when the natural curves are so different.

I could almost buy it on the sundial, since that is a calibration tool, and it is useful to be able to calibrate the high-IR right-lens filters. But why use so many high-IR-response pigments elsewhere?


Yes, every blue thing obviously has this special pigment in its paint. Odd.


If anything, doing so would almost certainly result in LESS data being received when a portion of the Rover is in the scene, due to the normalization of the channels. The Rover might very well present the brightest IR signal, and thereby prevent capturing a lot of the more subtle IR data from surrounding materials.



Well, I don't know, I just find it odd that they make the choices they do.

Blessings,
Mikromarius



posted on Jan, 31 2004 @ 12:18 AM
Red planet gets redder, with NASA help

The American space agency NASA has been accused of doctoring its pictures of Mars to make the Martian surface conform to our impression of the Red Planet.

NASA, it is claimed, digitally "tweaked" drab brown scenery to make it redder, and removed green patches to hide evidence of life.

The theories gained credence after NASA told New Scientist magazine that "getting the colours right is a surprisingly difficult and subjective job".

Most of the pictures have been taken through green, blue and infra-red filters instead of green, blue and standard red filters, which would have produced more accurate colours.

The infra-red filters over-emphasised the redness of the planet, turning blue objects a deep burgundy red or, in some cases, a hot pink, while greens appeared a dirty mustard yellow.

Dr Jim Bell, who worked with NASA on the Mars rovers' cameras, said infra-red filters were used because they helped geologists to distinguish rock types.


[Edited on 1/31/2004 by Bangin]



posted on Jan, 31 2004 @ 01:42 AM
Interesting thread. I used to use both IR B/W and color-shifted IR film for various reasons, due to the ability to see some things better, or just to see them, period. If I were NASA and wanted to "hide" any grass or plants or green weenies, the LAST thing I would do would be to use IR filters. From what I can tell of the Opportunity site at least, the area it landed in would respond much like a water surface as far as IR film is concerned. (Best guess.) If there were any plants (chlorophyll) there, they would stand out like the plants in the water of this IR shot:

www.biphoto.com...

So if NASA is trying to "hide" life on Mars, they had better stop using that IR filter post haste.
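For anyone who wants to put numbers on that: the standard way IR imagery separates vegetation from water is a ratio index, and even textbook reflectance values (not Mars data, just typical figures) show how dramatic the split is:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized difference vegetation index: +1 = strong IR reflector."""
    return (nir - red) / (nir + red)

# Typical textbook reflectances, purely for illustration:
print(ndvi(nir=0.50, red=0.08))  # vegetation: ~ +0.72, glows in IR
print(ndvi(nir=0.02, red=0.04))  # open water:  ~ -0.33, nearly black in IR
```

Chlorophyll reflects near-IR strongly while water absorbs it, so an IR filter is about the worst possible choice for hiding plants.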

/\/ight\/\/ing


