NASA Prepares to Launch COLOR BLIND Probe

posted on Dec, 16 2005 @ 10:46 PM
New Horizons Color Cam cannot create true color images.

All the way to Pluto and they forgot to send it to the eye doctor first.

Next time I want a second opinion.


pluto.jhuapl.edu...

MVIC operates at visible wavelengths - using the same light by which we see - and has 4 different filters for producing color maps. One filter is tailored to measure the methane frost distribution over the surface, while the others are more generic and cover blue, red and near-infrared colors, respectively. MVIC also has two panchromatic filters, which pass essentially all visible light, for when maximum sensitivity to faint light levels is required. In all cases, the light passes from the telescope through the filters and is focused onto a charge coupled device (CCD). (Although the MVIC CCD is a unique, sophisticated device, virtually all consumer digital cameras use CCDs.)


Remember when NASA sent a Rover all the way to Mars with a colorblind cam that could not tell the difference between the orange sands and blue-green algae?

Unlike last time, they are not allowing access to the bandwidths of the transmittance filters without paying!

[edit on 16-12-2005 by ArchAngel]




posted on Dec, 16 2005 @ 11:49 PM
What the heck are you talking about? It's standard procedure in order to extract the most information possible from a CCD device. Even your average camera has filters, one for each color. So why is it a big deal that it has filters specifically tailored for each color?

They take one exposure through each color filter, then recombine the images to form a visible color image. More details:

So what's the big deal?

www.badastronomy.com...
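The recombination step described above can be sketched in a few lines of NumPy. This is a minimal illustration, not anyone's actual pipeline; the arrays and values are hypothetical stand-ins for real filter exposures:

```python
import numpy as np

def composite_rgb(red_exposure, green_exposure, blue_exposure):
    """Stack three grayscale exposures (one per filter) into an RGB image.

    Each input is a 2-D array of pixel intensities in [0, 1]; the output
    is an H x W x 3 array suitable for display as a colour image.
    """
    return np.stack([red_exposure, green_exposure, blue_exposure], axis=-1)

# Hypothetical 2x2 exposures standing in for real filter frames.
r = np.array([[1.0, 0.0], [0.5, 0.2]])
g = np.array([[0.0, 1.0], [0.5, 0.2]])
b = np.array([[0.0, 0.0], [0.5, 0.2]])

rgb = composite_rgb(r, g, b)
print(rgb.shape)  # (2, 2, 3)
```

The whole debate in this thread is about what the three input frames mean: the stacking works for any three filters, but the result only approximates true colour if the filters approximate the eye's RGB response.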



posted on Dec, 16 2005 @ 11:51 PM

Originally posted by SilentFrog
What the heck are you talking about?


That's my question as well....

Are you trying to say that NASA is trying to hide the existence of blue-green algae on Pluto?


If so, you're even more hardcore than I thought



posted on Dec, 16 2005 @ 11:54 PM
maybe it's the planet they are trying to hide from the algae?

i say just put a bigger zoom lens and a very large flash bulb on the hubble!



posted on Dec, 16 2005 @ 11:59 PM

What the heck are you talking about? It's standard procedure in order to extract the most information possible from a CCD device. Even your average camera has filters, one for each color. So why is it a big deal that it has filters specifically tailored for each color?


Not like these.

I found the ranges of the filters.

pluto.jhuapl.edu...

425 - 550 nm (Blue);
540 - 700 nm (Red);
780 - 1000 nm (IR);
860 - 910 nm (CH4)

Compare with RGB


You can't make a color picture using these transmittance filters.

You can create a representative color picture using any data, but true color requires RGB filters on the camera.
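The argument above can be made concrete by checking the quoted bandpasses against the visible range. The ranges come from the post; the coverage check itself is my own quick sketch:

```python
# Bandpasses quoted above, in nanometres.
filters = {
    "Blue": (425, 550),
    "Red":  (540, 700),
    "IR":   (780, 1000),
    "CH4":  (860, 910),
}

VISIBLE = (400, 700)  # rough limits of human vision

# Which filters fall (at least partly) inside the visible range?
visible_filters = [name for name, (lo, hi) in filters.items()
                   if lo < VISIBLE[1] and hi > VISIBLE[0]]
print(visible_filters)  # ['Blue', 'Red']

# Crossover between the two visible filters.
overlap = (max(425, 540), min(550, 700))
print(overlap)  # (540, 550): only a 10 nm crossover
```

So of the four filters, only two sample visible wavelengths at all, which is the core of the "can't make three-channel true colour" complaint.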



posted on Dec, 17 2005 @ 12:08 AM

Are you trying to say that NASA is trying to hide the existence of blue-green algae on Pluto?


Not on Pluto, but I had a small hope of seeing some on Mars.

In much the same way as this cam is color blind so are the ones on the Mars rovers.

None carry a standard RGB filter set for creating true color images.

The result is that certain colors will look different, and by fine-tuning the frequencies you could make something look just like the background and seem to disappear.



posted on Dec, 17 2005 @ 12:12 AM
sometimes i wonder if some of you guys are just too damn smart for your own good..

can you lend me a few cells? I'd like to learn quantum physics so i can create a black hole and use it as a garbage disposal



posted on Dec, 17 2005 @ 12:17 AM

Originally posted by SilentFrog
What the heck are you talking about? It's standard procedure in order to extract the most information possible from a CCD device. Even your average camera has filters, one for each color. So why is it a big deal that it has filters specifically tailored for each color?

They take one exposure through each color filter, then recombine the images to form a visible color image. More details:

So what's the big deal?

www.badastronomy.com...


You might find my Mars Cam thread at Bad-Astro more interesting.


No matter where you place the curves, there are huge gaps between all the filters. They're basically narrow bandpass filters, and not suited to RGB imaging.

As has been stated over and over again, they will not create a true RGB image. There's no crossover between the filters, even if you use the best case curve based on their bandpass. The L3 filter, at 670nm, is the closest to a typical Red filter in an RGB set, which usually has a peak transmittance around 650nm. The L4 filter at 600nm falls into the typical crossover between green and red on an RGB set, or "orange".

The difference is that an RGB filter set has a much, much broader bandpass - a typical red RGB filter covers from 570 (min transmittance) to around 700 (min transmittance) with peak transmittance at 650nm. Green, from 450 to 650 peaking at 550, and blue from 350 to 550, peaking around 450. That wide crossover between filters is what allows an RGB set to provide a much better rendition. Not only that, the crossovers are usually at the 50% or higher transmission points. Those non-existent crossovers on the pancam are at 0% transmittance. No color data at all for specific wavelengths between filters.

You can not, and will not, ever get an RGB image from narrowband filters that effectively eliminate some wavelengths from reaching the detector.

www.bautforum.com...
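The crossover argument in the quote above can be illustrated with idealised triangular filter curves. The triangular shape is my own simplification (real filter curves are smoother); the bandpass endpoints and peaks are the typical RGB figures quoted in the post:

```python
def triangle(wl, lo, peak, hi):
    """Idealised filter transmittance: rises linearly from lo to peak,
    falls linearly from peak to hi, zero outside [lo, hi]."""
    if wl <= lo or wl >= hi:
        return 0.0
    if wl <= peak:
        return (wl - lo) / (peak - lo)
    return (hi - wl) / (hi - peak)

# Typical RGB bandpasses from the quote (nm): min, peak, min.
red   = (570, 650, 700)
green = (450, 550, 650)
blue  = (350, 450, 550)

# In the green/red crossover region, BOTH filters still transmit:
wl = 600
print(triangle(wl, *green))  # 0.5
print(triangle(wl, *red))    # 0.375
```

With the narrow MVIC-style bandpasses, by contrast, both curves would read 0.0 in the gap between filters, so no colour data at all is recorded for those wavelengths, which is exactly the point the quote makes.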



posted on Dec, 17 2005 @ 07:31 AM
You know, RGB is not the only method to artificially recreate colour.

TVs do not use RGB in their transmissions, so does that mean that the colours we see on TV are all wrong?

From Wikipedia:



NTSC saves only 11% of the original blue and 30% of the red


Does that mean that the skies seen on an NTSC TV system are reddish?



posted on Dec, 17 2005 @ 10:19 AM

Originally posted by ArMaP
You know, RGB is not the only method to artificially recreat colour.

TVs do not use RGB in their transmissions, so does that mean that the colours we see on TV are all wrong?


As with all modern NASA probes, New Horizons will transmit imaging data in PDS format, which is RGB.

THE SOURCE is RGB format.

It can be converted to other formats, but the origin will always be RGB.



posted on Dec, 17 2005 @ 10:38 AM
Yes, but what I was trying to say is that the origin of a TV stream can be RGB, yet an NTSC system will only show some 11% of the blue and 30% of the red, and we do not think "Hey, there is almost no blue in these images". The fact that we do not see all the elements does not stop us from finding it a very good reproduction of the real colours.
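The 11% and 30% figures quoted from Wikipedia likely correspond to the weights NTSC gives blue and red when forming the luminance (brightness) signal. A sketch of that standard luma calculation, with the hue restored separately by the chroma signals:

```python
def ntsc_luma(r, g, b):
    """NTSC luminance: blue contributes only 11.4%, red 29.9%,
    green 58.7% of perceived brightness."""
    return 0.299 * r + 0.587 * g + 0.114 * b

# Pure blue looks dim on a black-and-white NTSC set...
print(ntsc_luma(0, 0, 1))  # 0.114
# ...pure red only somewhat brighter...
print(ntsc_luma(1, 0, 0))  # 0.299
# ...but on a colour set the separate chroma signals restore the hue,
# which is why NTSC skies do not come out reddish.
```

The point being made: throwing away most of the blue and red *bandwidth* still yields convincing colour, because the system is matched to how human vision weights those channels.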

If the infrared and near-infrared are more important, then I guess that the lack of other colours can be regarded as "collateral damage".


But looking at the ranges you provided, I see they go from 425nm to 700nm, so there are no colours lacking from the blue 425nm to the red 700nm; only almost all of the violet and the darkest red are missing.



posted on Dec, 17 2005 @ 11:03 AM

But looking at the ranges you provided, I see they go from 425nm to 700nm, so there are no colours lacking from the blue 425nm to the red 700nm; only almost all of the violet and the darkest red are missing.


The blue almost fits the RGB green range, and the red nearly fits the RGB red range.

The IR channel is out of the range of human sight.

They have effectively shifted the spectrum down one step.

The results will not be anywhere near true color.



posted on Dec, 17 2005 @ 05:59 PM
I'm slightly confused here. Maybe you can clear it up. As a side note, I do research in biophysics with lasers, so I use optics pretty much every day of the week.

So here is what I think they would do... could you point out where my reasoning is flawed?

From your transmittance graph, I can see that the filters map pretty much across the whole color range. So the logical thing, I'm thinking here, is to take a picture with the CCD through each filter. This will give 3 pictures in black and white. Using something like a reference template, they map each of those channels to a reference color, then composite those 3 channels to get an RGB picture. The compositing step does involve guesswork as to things like relative intensity, but I suppose that NASA has a lot of experience in this domain, and so isn't exactly going to release utterly erroneous pictures.

I fail to see how the process is any different from that used in commercial digital cameras. I'd even say that it's superior. In commercial CCD cameras, groups of 3 pixels have differing filters, and then the camera uses statistical magic to approximate the color of each pixel of the image, from the R, G and B channels around it. That's why you see a lot of noise in dark areas from cheaper cameras.
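The per-pixel interpolation described above (colour filter arrays in a Bayer-style mosaic) can be sketched as a crude nearest-neighbour demosaic. The mosaic layout and values here are a hypothetical simplification; real cameras use much more sophisticated interpolation:

```python
import numpy as np

def demosaic_nearest(raw):
    """Crude demosaic of an RGGB mosaic: each 2x2 cell of the sensor
    yields one RGB pixel, averaging the two green samples.
    (Real cameras interpolate neighbouring cells instead, which is the
    'statistical magic' mentioned above.)"""
    out = np.zeros((raw.shape[0] // 2, raw.shape[1] // 2, 3))
    out[..., 0] = raw[0::2, 0::2]                          # R sample
    out[..., 1] = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2  # two G samples
    out[..., 2] = raw[1::2, 1::2]                          # B sample
    return out

# One hypothetical 2x2 RGGB cell: R=0.8, G=0.4 and 0.6, B=0.2
raw = np.array([[0.8, 0.4],
                [0.6, 0.2]])
print(demosaic_nearest(raw)[0, 0])  # [0.8 0.5 0.2]
```

This is also why sequential full-frame exposures through separate filters (as on a probe) can beat a consumer mosaic sensor: every pixel gets a real sample in every channel instead of an interpolated one.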

So again, what is the big deal?



posted on Dec, 17 2005 @ 06:05 PM

From your transmittance graph, I can see that the filters map pretty much across the whole color range. So the logical thing, I'm thinking here, is to take a picture with the CCD through each filter. This will give 3 pictures in black and white.


Only two filters are in the visible spectrum.

The other two are IR.

You need three filters roughly matching the RGB standard.

You cannot create a true color image, or anything close with these filters.

You can create representative color pictures, but that's true for any filter set.


I fail to see how the process is any different from that used in commercial digital cameras. I'd even say that it's superior.


The difference is that the commercial camera uses a proper RGB filter set while Ralph does not.

[edit on 17-12-2005 by ArchAngel]


jra

posted on Dec, 17 2005 @ 10:45 PM

Originally posted by ArchAngel
You cannot create a true color image, or anything close with these filters.

You can create representative color pictures, but that's true for any filter set.


And hence the reason why they call them false colour photos. They choose the filters for a reason. Mostly for scientific purposes.



I fail to see how the process is any different from that used in commercial digital cameras. I'd even say that it's superior.


The difference is that the commercial camera uses a proper RGB filter set while Ralph does not.


But a commercial camera isn't going to work out in space. Plus I believe it's easier to transmit a B&W image than a colour one, since the signal has to contain less information.

So why is this all a big deal anyway? No probe, as far as I know, has taken true colour images.

[edit on 17-12-2005 by jra]



posted on Feb, 16 2006 @ 11:44 PM
www.abovetopsecret.com...

NASA did it again.

They sent a colorblind probe all the way to Mars....

MOD EDIT: No need to revive one of your old threads to plug a new one... Let's continue discussion there, shall we? This thread is closed now.

[edit on 2/17/2006 by cmdrkeenkid]


