reply to post by ArMaP
I am replying to myself to answer some doubts I had and add some things.
About this image:
The filters used were L2, L5 and L7, corresponding to wavelengths of 753 nm, 535 nm
and 432 nm, which are near-infrared, green and violet (not ultraviolet). So, as I said, the filters were not the right ones for a near true
colour image, but I was wrong in thinking it was the ultraviolet (the Rovers do not have ultraviolet filters on the panoramic cameras); it was the
infrared that was changing the colours the most.
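For anyone who wants to experiment with the raw frames themselves, making a colour image from three single-filter photos is just stacking them into the red, green and blue channels; which filter you put into which channel is exactly where a "false colour" image comes from. The wavelengths below are the published Pancam values, but the tiny frames are hypothetical stand-ins for real data:

```python
import numpy as np

# Pancam left-eye filter centre wavelengths (nm); L2 is near-infrared,
# L4 to L7 span the visible range.
FILTERS = {"L2": 753, "L4": 601, "L5": 535, "L6": 482, "L7": 432}

def composite(red_band, green_band, blue_band):
    """Stack three single-filter frames into an RGB image.

    Each argument is a 2-D float array in [0, 1]; the caller decides
    which filter feeds which channel.
    """
    return np.dstack([red_band, green_band, blue_band])

# Hypothetical 2x2 frames standing in for real Pancam data; the dust
# is assumed bright in near-infrared (L2).
ir  = np.full((2, 2), 0.9)   # L2
grn = np.full((2, 2), 0.5)   # L5
vio = np.full((2, 2), 0.2)   # L7

# An L2/L5/L7 stack like the one in the image above: the red channel
# actually holds infrared data, so reds are exaggerated.
false_colour = composite(ir, grn, vio)
```

Swapping the first argument for an L4 (601 nm) frame would give the nearest thing to true colour the left camera can produce.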
I asked if NASA had said that the photo was true colour, and what they said about that photo was that it was "a false-color stretch", which may be
considered right or wrong, depending on our point of view.
But in that press release they also presented this photo,
and they said that it was "approximately true-color", which is not true.
Even if they used the radiometrically corrected images, those filters do not capture only visible light (L2 is near-infrared), so it is not approximately true colour.
Luckily, the images of the sundial are available for this Sol (1368) with the filters we need, L4, L5 and L6, and by using them we get this
image, which we can see has the right colours. Even using the radiometrically
corrected images (which show what the scene looked like when it was photographed), the result does not look like the panorama from which
the sundial in the first photo was taken.
Although it looks like it is covered by a red "mist", the image shows that the colours are the right ones: the blue and the green patches have the right colours.
If we use the radiometrically corrected images but without taking the radiometric data into account, this is what we get, and even if this is not what
the scene looked like, to me these look like the most probable colours of the objects if they were here on Earth and not "painted" red by the dust in
the air (which is supposed to be the cause of the red "mist").
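One rough way to undo that kind of global colour cast is to use the sundial itself: its grey ring should be colour-neutral, so each channel can be scaled until the ring comes out grey. The numbers below are invented for illustration, not measured Pancam values:

```python
import numpy as np

# Hypothetical RGB values: under the red Martian sky the grey ring
# photographs with a strong red cast.
grey_ring_rgb = np.array([0.60, 0.40, 0.30])

scene = np.array([
    [0.60, 0.40, 0.30],   # the grey ring itself
    [0.12, 0.16, 0.45],   # the blue patch, also tinted red
])

# Scale each channel so the known-neutral grey ring becomes neutral.
gains = grey_ring_rgb.mean() / grey_ring_rgb
balanced = scene * gains
```

After balancing, the grey ring's three channels are equal and the blue patch loses its red tint, which is essentially the "as if it were on Earth" look described above.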
So, the image before the last should be what NASA considers the nearest to true colour, never an image where blue and green are transformed into
pink and orange.
Now this image.
As I thought, this image was altered (but not by NASA) for a specific purpose, the original is probably this one.
And as a way of seeing what happens when we use the wrong filters, this is what this image would look like if the filters were the same as in the first
image in this post: infrared, green and violet.
So, as we can see, whenever a photo of the sundial shows a pink patch where the blue patch should be, that means that they used the L2 (infrared)
filter instead of the L4 (red, although considering the wavelength, more orange than red) filter.
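The pink-patch effect can be sketched numerically: many blue pigments reflect strongly in the near-infrared, so feeding the L2 frame into the red channel makes a patch that is dark in visible red suddenly bright there, and blue plus red reads as pink/magenta. The per-filter reflectances below are invented for illustration, not measured values for the MarsDial:

```python
# Hypothetical reflectances of the sundial's blue patch in each filter:
# dark in visible red (L4) but bright in near-infrared (L2).
blue_patch = {"L2": 0.8, "L4": 0.1, "L5": 0.2, "L6": 0.7}

def channel_rgb(patch, red_filter):
    """Build an (R, G, B) triple, choosing which filter feeds red."""
    return (patch[red_filter], patch["L5"], patch["L6"])

true_ish = channel_rgb(blue_patch, "L4")   # low red -> looks blue
ir_red   = channel_rgb(blue_patch, "L2")   # high red -> looks pink/magenta
```

With L4 in the red channel the patch is (0.1, 0.2, 0.7), clearly blue; with L2 it becomes (0.8, 0.2, 0.7), which is exactly the pink patch that gives the wrong filter away.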
As for the Clementine photos, I am still looking.