posted on Jan, 12 2013 @ 02:02 PM
reply to post by wildespace
Well geez, wildespace, you really made my day with that comment, my whole year so far even!
Perhaps, with the advent of space tourism
The Canadian clown who went up there never mentioned the stars or the moon or the sun, as far as I remember. Imagine going to a space station and not seeing
space? Well, he did see some, but just the usual Earth-dominated shots, no astro-photography.
taken with a regular camera,
Well, I didn't know till I went and looked that Voyager used vidicons for its sensors. They were quite the devices, designed for very low 'light'
levels, but on Voyager they were really spectrographic units looking for chemical, mineral, etc. spectra. Sure, you can make a picture out of the
different wavelength spectral images, but your eyes or a regular camera couldn't detect it. You will only have the type of light we need when you are
close to an object with an atmosphere or ionosphere dense enough to create transverse waves of the proper wavelength. The distance that light is
visible from depends on how much light is created, and that could be a long way with a large gassy object like Saturn. All the real action in space,
though, goes on in the UV/EUV and X-ray wavelengths.
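To give a sense of where those bands sit relative to visible light, here is a rough sketch of my own (not from the post) using the standard photon-energy relation E = hc/λ; the example wavelengths and band labels are conventional approximations, not exact boundaries.

```python
# Illustration only: converting wavelengths to photon energies with
# E = h * c / lambda, to compare X-ray/EUV/UV bands against visible light.

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electronvolt

def photon_energy_ev(wavelength_nm):
    """Photon energy in eV for a given wavelength in nm."""
    return H * C / (wavelength_nm * 1e-9) / EV

# Example wavelengths for each band (boundaries are approximate):
for label, nm in [("X-ray", 1.0), ("EUV", 30.0), ("UV", 280.0),
                  ("visible (green)", 550.0)]:
    print(f"{label:>16}: {nm:7.1f} nm -> {photon_energy_ev(nm):9.2f} eV")
```

The shorter the wavelength, the higher the photon energy, which is why the X-ray and EUV entries come out orders of magnitude above visible light.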
Voyager is quite the machine though.
With this image, they used a clear filter, which means that all wavelengths are getting through to the vidicon, and you have no idea what wavelength
is being captured; it could all be UV. The vidicon was sensitive to UV down to 280 nm, but only went up to 680 nm, in the orange, and not into the IR.
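As a minimal sketch of the point above (my own illustration, using the 280–680 nm figures quoted in the post): with a clear filter, anything inside the vidicon's sensitive band is recorded together, so a single clear-filter frame can't tell you which wavelengths actually contributed.

```python
# Band limits as stated above; treat them as the quoted figures, not
# authoritative instrument specs.
VIDICON_MIN_NM = 280.0  # stated UV cutoff
VIDICON_MAX_NM = 680.0  # stated orange/red cutoff (no IR)

def vidicon_detects(wavelength_nm):
    """True if the wavelength falls inside the stated sensitive band."""
    return VIDICON_MIN_NM <= wavelength_nm <= VIDICON_MAX_NM

print(vidicon_detects(300.0))  # near-UV: inside the band, recorded
print(vidicon_detects(850.0))  # near-IR: outside the band, not recorded
```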
The images from shuttle/ISS are from a normal Nikon DSLR
I know that, but they have to use the Earth's atmosphere/ionosphere to make anything visible to the camera; they will not point them out into deep
space, where the depth of ionosphere above
the ISS/Shuttle is too sparse to make objects visible. They always look sideways, through
atmosphere. Once it sinks in, wmd, you'll wonder how they managed to pull the wool over your eyes for so long.