posted on Jan, 21 2010 @ 05:19 AM
They look rather like sensor/processing artefacts to me, but I'll happily admit I don't know for sure.
The initial images from the STEREO spacecraft are posted at very low resolution, and they are processed to boost contrast. That means any
defects, and even the JPEG compression block artefacts (do you think those little faint squarish boxes are real objects..?), get artificially enhanced, even
though they are not real detail.
Later (usually a month or so), NASA finishes processing the full resolution raw files and replaces those poor quality initial images with much better
versions - if you browse backwards through the data, you will find that at some point the images change to better versions.
The full processing uses a very sophisticated image engine that not only squeezes every last minute detail from the gathered data,
but also uses a 'subtraction' method to eliminate junk caused by sensor problems, hot or failed pixels, etc. They take test 'dark frames' to
determine how best to do this, and they average that data over a period, including frames taken after the original image. Those future frames
obviously don't exist yet when the image first comes down, which is one reason the process takes so long. It's all documented if you look hard.
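The dark-frame trick is simple enough to sketch. This is just an illustration with made-up numbers (not NASA's actual pipeline): average several frames taken with no light hitting the sensor, then subtract that 'master dark' from the real image, and the hot pixels cancel out because they appear identically in both.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fake data: three 4x4 "dark" frames with a hot pixel at (1, 2).
# The hot pixel shows up in every dark frame, just as it would on a real sensor.
darks = [rng.normal(10, 1, (4, 4)) for _ in range(3)]
for d in darks:
    d[1, 2] += 500.0

# Average the darks into a single "master dark" frame
master_dark = np.mean(darks, axis=0)

# Fake "light" frame: real signal (~100 counts) plus the same sensor defect
light = rng.normal(100, 1, (4, 4))
light[1, 2] += 500.0

# Subtracting the master dark removes the hot pixel but keeps the signal
calibrated = light - master_dark
```

After the subtraction, the pixel at (1, 2) drops from roughly 600 counts back to the ~90 counts of genuine signal above the dark level, while the rest of the frame is barely touched.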
Now, for those who think this means that NASA have plenty of time to manipulate the data.. Well, no. The original .FTS (FITS) raw images are in
fact available to the public as soon as they are taken. If you want better resolution images earlier, you can download your very own FITS
processing software, grab the originals as they come in, and play to your heart's content in the hope of beating NASA to it. You can even get hold
of the same powerful image processing engine they use, but it won't run on a PC - it needs much more grunt.
I've tried using basic FITS software, just as an experiment. While I did manage to get a few decent images, it was hard work with rather
unfriendly software, and of course you then have to wait for the dark frames to come in, then work out how to average and subtract them...
That's why most folk just let NASA get on with it. It's frikkin' hard work!! (And NASA's full archiving service, which is where you get the FITS
images, was also pretty unfriendly last time I used it.)
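FITS itself is a simple, well-documented format: the header is a series of 80-character keyword 'cards' (padded out into 2880-byte blocks), terminated by an END card, with the raw pixel data following. Here's a toy sketch of pulling keywords out of a header, using a synthetic header rather than a real file; proper FITS software handles all the edge cases this ignores.

```python
# Toy FITS header parser, for illustration only.
# Real FITS headers are padded to 2880-byte blocks; the card layout is:
#   columns 1-8:  keyword, columns 9-10: "= ", columns 11-80: value / comment

def parse_fits_header(raw: bytes) -> dict:
    """Parse 80-character FITS header cards into a keyword -> value dict."""
    cards = {}
    for i in range(0, len(raw), 80):
        card = raw[i:i + 80].decode("ascii")
        keyword = card[:8].strip()
        if keyword == "END":          # END card terminates the header
            break
        if card[8:10] == "= ":        # only cards with a value field
            value = card[10:].split("/")[0].strip()  # drop trailing comment
            cards[keyword] = value
    return cards

# Synthetic minimal header: three cards padded to 80 characters, then END
header = b"".join(
    s.ljust(80).encode("ascii")
    for s in [
        "SIMPLE  =                    T",
        "BITPIX  =                   16",
        "NAXIS   =                    2",
        "END",
    ]
)

info = parse_fits_header(header)
```

In practice you'd just use an off-the-shelf FITS library rather than rolling your own, but the point stands: the raw format is open and readable, so there's nothing stopping anyone from checking the originals themselves.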
As for the 'shot into the Sun' image.. Try being a little scientific with your lens flare display. Move the camera around slightly so the Sun is
located just a little left of centre, then a little up and left, then up, then up and right, then right, and so on, taking images as you go. (But see the
warning below..) Then show us the results. You might learn something as you watch the flares move around..
Lens flares are INEVITABLE when shooting into the Sun. Extremely high quality lenses (e.g. the best Leica/Zeiss/Zuiko glass, perhaps even the odd
Nikon/Canon/Minolta pro-series) may suppress them so well that they are almost invisible, but if that camera is a consumer compact... Not
a snowball's chance in hell. You WILL get little blobs, streaks and sparkles that are simply NOT real. They come from light bouncing around in all
directions - internal reflections and refractions off the lens elements, the aperture mechanism and the inside surfaces of the lens barrel..
But do bear in mind that it ISN'T wise to leave a compact (i.e. non-DSLR) camera pointing at the Sun for more than 10 seconds or so, as the sensor
is being exposed to damaging IR and UV radiation focused by the lens to a small, intense point. While the sensor is probably
somewhat protected by an IR/UV filter, that filter isn't perfect, and after a while... permanent damage may result. The cheaper the camera, the greater
the danger of damage.