posted on Sep, 8 2010 @ 01:12 PM
Originally posted by Astyanax
reply to post by Soylent Green Is People
The photographer took 3 black-and-white pictures through 3 different filters, then combined those black-and-white pictures (projected through
color lenses) to make an "approximate" true color picture.
This is also how old colour movie stock used to be preserved, because colour film degrades much more rapidly than B&W. George Lucas did it to the Star
Wars prints, I believe.
I suppose it's all digital now.
That's also the idea behind digital photography.
All digital cameras are essentially color-blind -- i.e., the sensor that captures the image cannot by itself distinguish one color from another.
All digital cameras (yours and the ones NASA is using on Mars) first need to view the scene through various color filters, each of which produces a
grey-scale image whose brightness records how much of that color was present. The camera then combines those separate grey-scale images into what it
thinks the color should be and puts it all into one image.
This happens so fast inside your camera that you don't even realize it is happening. NASA, however, receives the grey-scale images from Mars and
then translates them into color here on Earth. That way they keep the individual "separated" grey-scale images, which can give them more
information than if the images were pre-converted to color before being sent to Earth.
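For anyone curious, the "combine separated grey-scales into color" step is simple to sketch in code. This is just a minimal illustration using NumPy, with tiny made-up 2x2 arrays standing in for the red-, green-, and blue-filtered photos -- not NASA's actual pipeline:

```python
import numpy as np

# Hypothetical grey-scale images, one per filter. In reality these would
# be full-size photos taken through red, green, and blue filters.
red   = np.array([[255,   0], [128,  64]], dtype=np.uint8)
green = np.array([[  0, 255], [128,  64]], dtype=np.uint8)
blue  = np.array([[  0,   0], [255,  64]], dtype=np.uint8)

# Stacking the three single-channel images along a new last axis
# yields an ordinary RGB color image.
rgb = np.stack([red, green, blue], axis=-1)

print(rgb.shape)   # (2, 2, 3)
print(rgb[0, 0])   # [255 0 0] -- bright only in the red channel: a red pixel
print(rgb[1, 1])   # [64 64 64] -- equal intensities in all three: a grey pixel
```

The same idea works whether the three grey-scale frames come from a Mars rover, a consumer camera's filtered sensor, or century-old glass plates.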
This Russian photographer did the same thing -- getting color from black-and-white film.
edit on 9/8/2010 by Soylent Green Is People because: added last line to clarify