kanu, I may have got my cameras mixed up, but that doesn't alter the following illogical point:
1. NASA insists that the IR filter is needed to capture geological detail; I can agree with that.
2. Using any IR filter (from any camera) adjusts the response of a composite colour image so that blue becomes magenta, etc.; I won't argue.
3. They take a photo of the lander, which is of absolutely no geological or scientific importance. We have better photos of the lander taken on
Earth. Therefore this has been done for publicity.
So if you are taking a picture for publicity, why on earth use a filter that distorts the colours? Good god, any numpty with a PC and the data from
each filter can construct a decent colour photo. NASA, with its resources, could do this automatically as the data streams in, and yet fails to do
so. Why? In fact it is perfectly possible to have two sets of images: true colour for the press and public, and false colour with extra detail for
the scientists. It doesn't take a Cray, just a PC.
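Just to show how little is involved, here is a minimal sketch of the "numpty with a PC" job, using made-up 2x2 frames in place of real downlinked filter data (NumPy only; the frame values and the channel choices are hypothetical):

```python
import numpy as np

def compose_rgb(red_frame, green_frame, blue_frame):
    """Stack three single-filter greyscale frames into one RGB image,
    normalised to the 0-1 range."""
    stack = np.stack([red_frame, green_frame, blue_frame], axis=-1).astype(float)
    lo, hi = stack.min(), stack.max()
    return (stack - lo) / (hi - lo) if hi > lo else stack * 0.0

# Fake 2x2 filter frames standing in for camera data (hypothetical values).
r = np.array([[10, 20], [30, 40]])
g = np.array([[ 5, 15], [25, 35]])
b = np.array([[ 0, 10], [20, 30]])

# "True colour" set: frames from visible-band filters go to their own channels.
true_colour = compose_rgb(r, g, b)

# "False colour" set: swap a different filter's frame (here just reordered,
# standing in for an IR frame) into the red channel for the scientists.
false_colour = compose_rgb(b, g, r)

print(true_colour.shape)  # (2, 2, 3)
```

The point being: both sets come from exactly the same downlinked frames, so producing one does not cost you the other.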
My references to overhead images indicating water were not of dry lake beds but wet ones; I'll try to dig out the original frame references.
malcr: for a better understanding of how the PanCam makes the composite images (e.g. the panoramas) for press releases, and why some of the pigments
on the rover appear strange, read this thread: