
Is The Mars Rover Cam Life-Blind?

page: 6

posted on Feb, 15 2004 @ 11:22 PM
Okay... I'm reading this... and my head HURTS like hell. Why would NASA want to hide the fact that there's algae on Mars? What harm would it do if we knew there was algae, or even plant life, up there?




posted on Feb, 15 2004 @ 11:24 PM
What is the actual mechanism of the Rover's RGB filters - aren't they just variable circuits controlled by software like the CCD cameras in television studios? Why not occasionally adjust the color calibration to have broad RGB overlap? I don't understand why the narrow filter options are so limited. I can understand why they might want them to be limited in some circumstances, but colors that mimic ordinary, average human eyesight would also be handy in my opinion.


[Edited on 15-2-2004 by Condorcet]



posted on Feb, 15 2004 @ 11:41 PM
"What is the actual mechanism of the Rover's RGB filters - aren't they just variable circuits controlled by software like the CCD cameras in television studios?"

No. They are physical transmittance filters: pieces of filter film mounted on a wheel over the cam lens. The wheel is rotated so that different filters sit in front of the lens as needed.

"Why not occasionally adjust the color calibration to have broad RGB overlap?"

This cannot be done. There are five filters that cover the RGB range, with gaps, and with overlap outside the RGB standard. True 'color' pictures are impossible, but you can get fairly close, using all the information, with the L456 composite (the L4, L5, and L6 filters).
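To see what an L456-style composite amounts to in practice, here is a minimal sketch in Python/NumPy. The band-to-channel mapping and the toy pixel values are assumptions for illustration, not calibrated Pancam data:

```python
import numpy as np

def approx_color(l4, l5, l6):
    """Stack three single-filter frames into an approximate-color image.

    l4, l5, l6: 2-D float arrays of equal shape (red-, green-, and
    blue-region bands).  The band-to-channel mapping and the crude
    global normalisation are illustrative, not calibrated.
    """
    rgb = np.stack([l4, l5, l6], axis=-1)   # R <- L4, G <- L5, B <- L6
    rgb = rgb / rgb.max()                   # crude global normalisation
    return np.clip(rgb, 0.0, 1.0)

# Toy frames: a scene that is brightest in the red-region band.
h, w = 4, 4
l4 = np.full((h, w), 0.8)
l5 = np.full((h, w), 0.3)
l6 = np.full((h, w), 0.2)
img = approx_color(l4, l5, l6)
print(img.shape)   # (4, 4, 3)
```

A real composite would apply per-band calibration gains rather than a single global scale, which is one reason the result is only "fairly close" to true color.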

"I don't understand why the narrow filter options are so limited."

These are used to look at a narrow part of the spectrum. The CCD measures photons, not light frequencies. You must block what you do NOT want to see so that it does not wash out what you are looking for.
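The "block what you do NOT want" point can be shown numerically. The sketch below uses an invented spectrum and ideal top-hat filters: a narrow feature dominates the narrow-band count but is washed out in a broad-band one.

```python
import numpy as np

# Wavelength grid (nm) and a toy scene spectrum: a bright broadband
# background plus a narrow feature at 670 nm (both invented).
wl = np.arange(400, 701, 1.0)
scene = 1.0 + 5.0 * np.exp(-((wl - 670.0) / 5.0) ** 2)

def bandpass(center, width):
    """Ideal top-hat transmittance: 1 inside the band, 0 outside."""
    return (np.abs(wl - center) <= width / 2).astype(float)

def ccd_counts(spectrum, filt):
    """A CCD counts photons, not frequencies: sum spectrum x transmittance."""
    return float(np.sum(spectrum * filt))   # 1 nm grid spacing

narrow = ccd_counts(scene, bandpass(670, 20))   # feature dominates
broad  = ccd_counts(scene, bandpass(550, 300))  # feature washed out
print(narrow, broad)
```

The feature supplies most of the narrow-band counts but only a small fraction of the broad-band ones, which is exactly the wash-out being described.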

"I can understand why they might want them to be limited in some circumstances, but colors that mimic ordinary, average human eyesight would also be handy in my opinion."

I agree. An RGB set would not have added much to cost. A slightly bigger filter wheel on one cam is all that would be needed. The added weight would have been very little.


[Edited on 15-2-2004 by Condorcet]



posted on Feb, 15 2004 @ 11:53 PM
www.lpi.usra.edu...

Here you can see details of the cam, including the filter wheel. It is not very large, and it is the key to what the camera can and cannot see.



posted on Feb, 15 2004 @ 11:55 PM

Originally posted by dreamlandmafia
Okay... I'm reading this... and my head HURTS like hell. Why would NASA want to hide the fact that there's algae on Mars? What harm would it do if we knew there was algae, or even plant life, up there?



Because it does not suit their purposes, yet...

Many things lead me to believe this is possible, not just the fact that they can hide things if they want.



posted on Feb, 16 2004 @ 12:07 AM
Ahhh, thanks ArchAngel. I just researched the mechanics of it at howstuffworks.com and it corroborates your information.

You also write

"An RGB set would not have added much to cost. A slightly bigger filter wheel on one cam is all that would be needed. The added weight would have been very little."

Exactly!


[Edited on 16-2-2004 by Condorcet]



posted on Feb, 16 2004 @ 10:36 AM

Originally posted by ArchAngel
The cam CAN be used to find life, but it is not simple, and you must know what you are looking for first...


Thanks for finally coming out and admitting what I've been saying all along... that your ORIGINAL contention about the CAM being life-blind is quite simply false. That's progress, and a hopeful sign.

If your original thread title, and subsequent arguments had been "Is NASA trying to avoid looking for life in their picture releases?", you would have heard little (if any) objection from me... because you would be properly framing your argument as usage of the tool, not the tool being broken.




I understand what you are saying.


For someone who "understands" what I'm saying, it's remarkable how little that "understanding" impinges on the argument that you are offering. I'd expect exactly the same lack of addressing of substantive issues from someone who DOESN'T understand it.

If your responses are indistinguishable from someone who doesn't understand, why would anyone believe that you actually DO understand?



You do not understand what I am saying.


I'd wager that I understand the thrust of your point a heck of a lot better than you think. I simply believe that the "argument" you offer, even when understood, is effectively meaningless.

To summarize, your objection is that the Pancam can be used to facilitate lying. It can be selectively used to create false impressions.

That makes it no different than any other data-gathering tool in existence, and makes it no different than the English language for that matter.

Just because a tool can be used for evil purposes does not make the tool evil, nor does it make it broken.

The tool can also be used for good purposes, and despite the fact that it is designed specifically for geological purposes, it can be used effectively to search for a myriad of different life forms.

It is a tool that can be used for good, or for evil. It can be used effectively, or deliberately used ineffectively... for instance, if someone wanted to hide something.

The problem is, you are acting as if this situation for the Pancam is somehow different from any other set of filters that might be used in front of any CCD camera in existence.

It's not. Any tool, and any set of filters, can be used to deceive, and some are much less effective than others at DISTINGUISHING THE CAUSE of a particular color signal... and that action is one that reveals much more truth, instead of creating more deception.

Let's take the two non-Pancam filter sets you included in the initial graphic you posted. (Astronomik and IDAS/Hutech Type III RGB)

Assume that either of those filter sets had been the set that was sent up instead of the existing ones. Here's your task:

Assume that the Pancam is looking at a uniform concentration of chlorophyll B laden material, lit with sunlight at high noon. Compute the (relative) pre-normalized CCD counts for any given pixel in the image (assume all pixels give identical values), for each of the filter types. Show all work. It's not necessary to come up with exact values, merely the relative signal sizes from each filter.

(Hint: it's proportional to the "area under the filter response curve" when the curve values are multiplied by the response curve values of the underlying CCD camera, which is already published. You'll need to use the RESPONSE curves / reflectance data for chlorophyll B... not the absorption curves.)
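As a rough illustration of the computation being asked for, here is a sketch with stand-in curves. None of these are the published Pancam, CCD, or chlorophyll data; they are invented to show the shape of the calculation (counts proportional to the area under illumination x reflectance x filter x QE):

```python
import numpy as np

wl = np.arange(400, 701, 1.0)   # nm

# Stand-in curves (NOT the published Pancam/CCD/chlorophyll data):
illum = np.ones_like(wl)                                # flat noon illumination
refl  = (0.05                                           # dark baseline
         + 0.25 * np.exp(-((wl - 550) / 30) ** 2)       # green reflectance bump
         + 0.6 / (1 + np.exp(-(wl - 690) / 10)))        # red-edge rise
qe    = np.clip((wl - 350) / 400, 0, 1)                 # CCD response, weak in blue

def tophat(center, width):
    return (np.abs(wl - center) <= width / 2).astype(float)

def relative_counts(filt):
    """Pre-normalised signal: area under illum x reflectance x filter x QE."""
    return float(np.sum(illum * refl * filt * qe))      # 1 nm grid spacing

broad  = {c: relative_counts(tophat(c, 100)) for c in (450, 550, 650)}
narrow = {c: relative_counts(tophat(c, 20))  for c in (450, 550, 650)}
print(broad)
print(narrow)
```

Swapping in the real published response and reflectance curves would turn these relative numbers into the answer the exercise asks for.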

Based on that finding, for each of those filter sets, roughly what "color" (computer monitor RGB value set) would the hypothetical field of chlorophyll B register as on your screen? How would that result differ from looking at it with your own eyes?

Most importantly, given raw image data from those filter sets and the combined color signal of an unknown image which showed a "greenish" tint, how would you go about distinguishing whether the color value in the picture was a result of chlorophyll B being present, versus some kind of greenish mineral like olivine?

When you can answer those questions, you should know why it is technically much BETTER to have multiple non-overlapping narrow bandpass filters, instead of three broad overlapping curves.

The RGB curves you referenced are nice for producing images that mimic human vision reasonably well... but you should recognize that human vision is a GROSSLY POOR TOOL for distinguishing WHY something is colored the way it is. Human vision basically can't tell the (color) difference between plant life and paint that is of the same basic shade. Sure, it can give hints that something might be useful to examine in more detail, but so can a wide variety of other filter data sets.
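A contrived example makes that concrete: two different reflectance spectra can be built so that broad RGB-style bands read them identically (a metameric pair), while a narrow band separates them at once. The spectra below are invented for the demonstration, not real plant or mineral data:

```python
import numpy as np

wl = np.arange(400, 700, 1.0)   # nm

def band(lo, hi):
    return ((wl >= lo) & (wl < hi)).astype(float)

# Two invented reflectance spectra, built so their broad-band
# integrals match exactly: a flat green plateau vs a taller,
# narrower one.  Nothing here is real plant or mineral data.
plant   = 0.1 * (band(400, 500) + band(600, 700)) + 0.3 * band(500, 600)
mineral = 0.1 * (band(400, 500) + band(600, 700)) + 0.6 * band(500, 550)

def counts(spectrum, filt):
    return float(np.sum(spectrum * filt))

# Broad R/G/B counts match pair by pair -> the same on-screen color.
for lo, hi in ((600, 700), (500, 600), (400, 500)):
    print(counts(plant, band(lo, hi)), counts(mineral, band(lo, hi)))

# A narrow 565-585 nm band separates them immediately
# (plant ~6, mineral 0).
narrow = band(565, 585)
print(counts(plant, narrow), counts(mineral, narrow))
```

Broad overlapping filters integrate away exactly the spectral detail that identifies the cause of the color; narrow bands preserve it.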

Since this is a science mission, it is far more useful to gather data that can be used to identify WHY a signal is present, than it is to simply mimic what a human would see.

That's why so much of today's hyperspectral orbital analysis is done with a large number of narrow-bandpass, non-overlapping filters... because it gives good answers for solving problems and answering questions, instead of just making pretty pictures.



[Edited on 2-16-2004 by BarryKearns]



posted on Feb, 16 2004 @ 12:51 PM
(L3 image)

(L7 image)



What is different?

'Bandpass' filters are a great tool for observation, and a tool for hiding what is observed.

[Edited on 16-2-2004 by ArchAngel]



posted on Feb, 16 2004 @ 01:06 PM
What the hell are you talking about?

You realise that an image of red/white stripes would come out as black/white stripes through the blue and green filters, and as solid white in the red filter image, right?

Plus the fact that the JPL logo and other things on the rover are designed to be reflective at certain wavelengths, to assist in calibration.



posted on Feb, 16 2004 @ 01:12 PM

What the hell are you talking about?


Here is but one example of how things can be hidden with a camera using narrow bandpass filters.



posted on Feb, 16 2004 @ 01:24 PM

Originally posted by ArchAngel
(L3 image)

(L7 image)

What is different?


What is different? The wavelengths of the photons that were detected, exactly as expected. These images prove the OPPOSITE of what you think they do. Even with only two images, we see that the JPL logo has a significantly different response curve than the ground in that picture, due to the more drastic shift in brightness when comparing the two images.

Those two images REVEAL information, rather than hiding it.
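The way a pair of filter images reveals material differences can be sketched with a band ratio. The pixel values and band names below are made up; the point is only that a patch with a steep red/blue slope, like the logo pigment, jumps out of a per-pixel ratio:

```python
import numpy as np

# Two toy filter images of the same scene (values invented): ground
# pixels reflect similarly in both bands, while a "logo" patch is
# bright in the red-region band and dark in the blue-region band.
l3 = np.full((5, 5), 0.50)   # red-region filter image
l6 = np.full((5, 5), 0.45)   # blue-region filter image
l3[1:3, 1:4] = 0.80          # logo patch, red band
l6[1:3, 1:4] = 0.10          # same patch, blue band

ratio = l3 / l6              # per-pixel band ratio
logo_mask = ratio > 2.0      # flag material with a steep red/blue slope
print(int(logo_mask.sum()))  # 6 -- the logo patch pops out
```

This is the simplest form of the spectral discrimination the narrow filters enable: the ratio carries information that neither image alone, nor a broad-band color shot, would pin down.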



'Bandpass' filters are a great tool for observation, and a tool for hiding what is observed.


The English language is a great tool for conveying information, and a tool for disseminating disinformation.
Which conspiracy caused language to come into existence?

A mind is a terrible thing to waste.

You keep alleging understanding on your part... are you planning on demonstrating that understanding?

"Otto: Don't call me stupid.

Wanda: Oh, right, to call you stupid would be an insult to stupid people. I've worn dresses with higher IQs. I've known sheep that could outwit you, but you think you're an intellectual, don't you, ape?

Otto: Apes don't read philosophy.

Wanda: Yes, they do Otto, they just don't understand it."

-- "A Fish Called Wanda"



posted on Feb, 16 2004 @ 01:26 PM
Barry,

do you know Howard by any chance?



posted on Feb, 16 2004 @ 01:30 PM
Well, you realise the examples you have are from the L3 and L6 filters, and as such are from a red-region filter and a blue-region filter. Of course they will show different things.

This EXACT phenomenon would occur with a regular RGB filter set if the area around the JPL logo was white and the logo itself was red. White reflects all parts of the spectrum; the red reflects only the red. Thus the blue filter image would look as the L6 image does, and the red filter image would look as the L3 image does.

In this case the pigment on the JPL logo actually has a much more specific reflectance spectrum. A better example for these claims would have been to show the L234 series of the same shot and see the difference there. Even though it's false, it would be more impressive.



posted on Feb, 16 2004 @ 01:34 PM

Originally posted by THENEO
Barry,

do you know Howard by any chance?


To the best of my recollection, I don't know anyone named Howard.

[Edited on 2-16-2004 by BarryKearns]



posted on Feb, 16 2004 @ 01:57 PM
I think he means me.


Neo hates it when anyone uses logic, science and common sense.

BTW I have enjoyed your posts in this thread. Mr. Big says to say: good job.




posted on Feb, 16 2004 @ 05:52 PM

Well, you realise the examples you have are from the L3 and L6 filters, and as such are from a red-region filter and a blue-region filter. Of course they will show different things.


And in this example there is something hidden. The tool can be used to show or hide. It depends on what you are looking for, and what filters you have.


Even with only two images, we see that the JPL logo has a significantly different response curve than the ground in that picture, due to the more drastic shift in brightness when comparing the two images.


Look at the others in this set. With selective release of data you can hide things in plain sight.

[Edited on 16-2-2004 by ArchAngel]



posted on Feb, 16 2004 @ 06:49 PM

Originally posted by ArchAngel

Well, you realise the examples you have are from the L3 and L6 filters, and as such are from a red-region filter and a blue-region filter. Of course they will show different things.


And in this example there is something hidden. The tool can be used to show or hide. It depends on what you are looking for, and what filters you have.


No, nothing is hidden. That's the fundamental flaw in your logic. What is being shown is different ASPECTS of a scene... just like the R channel, the G channel, and the B channel of a standard RGB image show different ASPECTS of a scene.

You could make the same no-value argument that RGB can be used to "show or hide" depending on which shots you chose to release.

The individual filter shots only "hide" things from those who are too ignorant to understand what they are looking at, and too ignorant to realize that seeing additional aspects of a scene always provides MORE information, never less.

Consider someone taking black-and-white photographs of an elephant. The photographer can stand at various orientations with respect to the elephant, and pictures taken at different orientations and zoom levels will provide different data.

A close-up shot of the side of the elephant, and the rear, will fail to show the tusks. If you want to use a perverse form of logic, one might say that the photographer can use the power of orientation to "show or hide", but that doesn't mean that the power of orientation is somehow evidence of a conspiracy.

Instead, someone who understands the nature of multiple dimensions will recognize that shots from different orientations will provide DIFFERENT data, and that having a wider variety of those orientations can provide more data than having a smaller set of "standard" shots that all photographs must use.



Even with only two images, we see that the JPL logo has a significantly different response curve than the ground in that picture, due to the more drastic shift in brightness when comparing the two images.


Look at the others in this set. With selective release of data you can hide things in plain sight.


You're still not getting it. In the example above, the photographer could choose to stand at a variety of orientations, and could "hide" the fact that the elephant had tusks through judicious choices of where they stood.

That doesn't mean that the ability to take shots from multiple angles should set off alarm bells and suspicions of a conspiracy. It simply means that there are more options. The fact that some options can be used to create a false impression is ABSOLUTELY NOT any kind of evidence that something sinister is going on.

Nothing can be "hidden in plain sight" using the Pancam, for those who understand what the data actually means. Some people can misunderstand the implications of the data. That says nothing meaningful about the tool itself, since that concept can be applied to ANY DATA-GATHERING TOOL.

Instead, it speaks to the ignorance of those who act like they understand the implications of the data, when they do not.

The Pancam only reveals data, it does not hide it. The fact that it doesn't reveal EVERYTHING YOU WANT, in the specific format that you want, does not mean it is "blind". It simply means that it is a tool that is built for extremely useful purposes other than satisfying your personal demands... and its existence implies nothing more than that.

The two other RGB filter sets that you included in your first graphic can also be used to "hide" things under your logic, and I've given you detailed instructions on how to demonstrate that for yourself. Have you completed the work yet, or will you ignore this like you've ignored all the other opportunities offered to show yourself where your logic is breaking down?

The amount of scientifically useful data that would have come from either of those two filter sets is substantially LESS than the amount of scientifically useful data produced by the Pancam filters.

Those filter sets are great for producing human-friendly pictures, but that's about it... they would add no real data of substance to the mission.

Since you seem to keep going on about how the Pancam "can be used" to hide things, I'd like to ask a basic question: what tool COULD NOT BE USED to hide things under that definition, and still provide a meaningful amount of scientific visual data for the purposes of identifying what is being seen?

I don't think such a tool exists... in which case, it wouldn't have mattered what NASA had chosen to send, you'd still be able to offer exactly the same argument.

Please, by all means, show us which tool can produce meaningful, discriminatory visual data but CANNOT be mis-used by operators who choose to use only a portion of the functionality.

That would obviously be the tool that you think they should have sent instead of the Pancam, right?

Heck, the RGB filter sets you referenced "hide" things UNDER PERFECT OPERATING CONDITIONS, even when the data from every single filter is released, and are almost useless when it comes to trying to identify the underlying causes of the signals!

You can prove it to yourself... are you willing to do the necessary work?



posted on Feb, 16 2004 @ 06:52 PM

The Pancam only reveals data, it does not hide it.


It can hide something when the background is the same.


You're still not getting it.


That would be you.

[Edited on 16-2-2004 by ArchAngel]



posted on Feb, 16 2004 @ 07:11 PM
link   
Let me restate one of my earlier points, to hopefully illustrate why the pursuit of RGB as a standard for what "should have been sent" is so utterly flawed.

Any pictures that could have been produced from this mission using any given set of RGB filters you can imagine, would end up coming down to three pixel signals, each ranging from 0 to 255.

The red signal could be any integer from 0 to 255 inclusive, for 256 total red values. Likewise for the green frame, and the blue one.

So any given picture would have a potential 16 million colors or so for any given pixel.

Tell me... can you point out even ONE of those 16 million combinations which indicates that the color was caused by chlorophyll of any kind, as opposed to some kind of mineral deposit?

Which RGB value would that be? When you take a picture of chlorophyll, what unique RGB results do you think you can possibly get?



posted on Feb, 16 2004 @ 07:16 PM
"The red signal could be any integer from 0 to 255 inclusive, for 256 total red values. Likewise for the green frame, and the blue one. "

PDS is a 12-bit format, not 8-bit. The 24-bit JPEGs from the rover site may only have 256 potential values per channel, but the cam transmits data in PDS form. It must be downsampled to convert to JPEG.
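A quick sketch of what that downsampling costs. A plain bit-shift conversion is assumed here for illustration; the actual pipeline may scale or stretch differently:

```python
import numpy as np

# PDS raw samples are 12-bit (0..4095); web JPEGs carry 8 bits per
# channel (0..255).  A plain bit-shift conversion drops the low 4
# bits, so every 16 raw levels collapse into one JPEG level.
raw12 = np.array([0, 16, 17, 4095], dtype=np.uint16)
jpeg8 = (raw12 >> 4).astype(np.uint8)
print(jpeg8.tolist())   # [0, 1, 1, 255] -- 16 and 17 become indistinguishable
```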


