I have not (yet) found a set of photos that covers all 16 filters, so I will use these from Spirit, from Sol 9, to show what I mean.
These are the 7 photos from the left panoramic camera, the one that carries the visible-light filters.
(I use the radiometrically adjusted photos but without the corresponding
Clear filter - a real greyscale photo with all the wavelengths that the CCD accepts.
753nm filter - near infrared.
673nm filter - red.
601nm filter - orange.
535nm filter - green.
482nm filter - blue.
432nm filter - violet.
As we can see, some of the colours in the colour chart appear brighter at some wavelengths and darker at others, because each material reflects those wavelengths in different ways.
If we use the red, green and blue filters to build a colour image, this is what we get.
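The compositing step itself is just channel stacking: each single-filter greyscale photo becomes one channel of the RGB image. A minimal Python sketch of that idea (the tiny 2x2 arrays are invented placeholders standing in for the real calibrated filter frames):

```python
import numpy as np

def compose_rgb(red, green, blue):
    """Stack three single-filter greyscale frames into an RGB array.

    Each input is a 2-D uint8 array. In this post's example the 673nm
    frame feeds the red channel, the 535nm frame the green channel and
    the 482nm frame the blue channel.
    """
    return np.dstack([red, green, blue]).astype(np.uint8)

# Placeholder frames (uniform brightness) instead of the real Pancam photos.
red_673 = np.full((2, 2), 180, dtype=np.uint8)
green_535 = np.full((2, 2), 110, dtype=np.uint8)
blue_482 = np.full((2, 2), 60, dtype=np.uint8)

rgb = compose_rgb(red_673, green_535, blue_482)
print(rgb.shape)  # (2, 2, 3)
```

Swapping a "wrong" frame into one of the three slots (say, the 753nm photo in place of the 673nm one) is exactly the same operation, which is why the mistake is so easy to make.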
By using the orange, green and blue filters we get an image that looks closer to the original, probably because these are narrow band filters and the 673nm red filter sits too far towards the deep red.
Things start to look strange when we use photos from wavelengths that are too far from the intended colour; if we use infrared instead of red we get this.
If we use violet instead of blue we get this: too yellow.
This happens for all "wrong" filters, so when we use orange, blue and violet instead of orange, green and blue we get this:
In the above example, as we have two wrong components, most of the colours in the image are wrong.
Whenever the wrong components are used, no amount of colour tweaking will bring back the right colours, because they never existed in the original photos.
It's like typing on a keyboard that is missing some letters: you can still write many words correctly, but the more varied your vocabulary, the more likely it is that some words will come out with missing letters.
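A toy numeric example of why the missing information cannot be recovered afterwards (the reflectance values are invented purely for illustration):

```python
# Invented reflectances for two different surfaces, sampled at the
# 753nm (infrared), 673nm (red), 535nm (green) and 482nm (blue) filters.
surface_a = {"ir": 0.80, "red": 0.70, "green": 0.30, "blue": 0.10}
surface_b = {"ir": 0.80, "red": 0.20, "green": 0.30, "blue": 0.10}

# Composite built with the wrong filter: infrared in the red channel.
wrong_a = (surface_a["ir"], surface_a["green"], surface_a["blue"])
wrong_b = (surface_b["ir"], surface_b["green"], surface_b["blue"])

# The two surfaces are identical in the wrong-filter composite, even
# though their true red reflectances differ, so no colour correction
# applied to that composite can ever tell them apart.
print(wrong_a == wrong_b)                    # True
print(surface_a["red"] == surface_b["red"])  # False
```

Any colour-tweaking step only transforms the three numbers that were recorded; two surfaces that recorded the same three numbers will stay identical no matter what transform is applied.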
I tried to reproduce the "MER2RGB" process referenced in the opening post, but my results do not look like the images on their site, so I guess I must be doing something wrong.