
Why Can't We Get Photos Of ALL Sides Of the Moon?


posted on Jan, 12 2013 @ 02:26 AM

Originally posted by GaryN


Sorry, but you do not understand the instruments used to obtain those images. They are nothing like your eyes or a regular camera, which cannot see IR or UV and cannot detect the plane-wave fronts; the simple lens of your eye just cannot do it. You need to accept that humans in space will always have to use instruments in order to be able to see what's out there.


Yes, and once again I suggest YOU recheck what YOU assume about what these cameras see with regard to IR & UV.

When you make an assumption and someone posts an image that contradicts your claim, you move the goalposts, and you keep doing that.

These instruments are very much like your eyes. Remember, your eye is a detector; it's your brain that assembles the picture!

The lens of your eye projects an image onto the retina's light-sensitive cells; they react and produce a signal that travels along the optic nerve, where the picture is assembled by your brain.

A digital camera's lens projects the image onto the sensor (CCD or CMOS), which turns it into a signal that is converted and processed by the camera's processing chip (its "brain"), and the image is stored in memory.
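The eye/camera analogy above can be sketched in code. This is a deliberately simplified toy model (made-up numbers, no demosaicing or colour science): a "sensor" step that turns light into raw signal values, and a "processing chip" step that assembles the final picture.

```python
# Toy model of the detector-vs-brain analogy: the sensor only records raw
# signal values; a separate processing step assembles the usable image.
# All numbers here are illustrative, not real camera behaviour.

def sensor_read(scene, gain=1.0):
    """Each photosite reacts to incoming light and produces a raw signal,
    like the retina's light-sensitive cells."""
    return [[min(255, int(v * gain)) for v in row] for row in scene]

def process(raw):
    """The camera's processing chip (or the brain) assembles the picture;
    here that is just a simple contrast stretch to the full 0-255 range."""
    flat = [v for row in raw for v in row]
    lo, hi = min(flat), max(flat)
    span = max(hi - lo, 1)
    return [[(v - lo) * 255 // span for v in row] for row in raw]

scene = [[10, 50], [120, 200]]      # a tiny 2x2 "scene" of light levels
image = process(sensor_read(scene))
```

The point of the split is the one the post makes: neither the retina nor the sensor "sees" a picture on its own; the picture only exists after the processing stage.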

Modern digital cameras have an IR filter on them, but you can get special models like the Canon 60Da.

Canon 60 Da

This camera has the IR filter removed for astrophotographers, and as you can see from the top two images at the link, it produces better images of nebulae & galaxies.

It will produce strange colours if used for normal images unless a filter is used, and you also have problems with autofocus.
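The "strange colours" effect can be illustrated with a toy sketch. The assumption here (a simplification, not real sensor calibration) is that with the IR-cut filter removed, near-IR light leaks mainly into the red channel, so a neutral grey patch comes out with a red cast:

```python
# Toy illustration of why removing the IR-cut filter shifts colours:
# near-IR is assumed to leak mainly into the red channel (a simplification).
# Values are illustrative 0-255 channel levels.

def capture(visible_rgb, ir_level, ir_cut=True):
    r, g, b = visible_rgb
    if not ir_cut:
        # Without the IR-cut filter, IR adds to the red channel's signal.
        r = min(255, r + ir_level)
    return (r, g, b)

neutral_scene = (100, 100, 100)  # a grey patch under IR-rich light
with_filter = capture(neutral_scene, ir_level=80, ir_cut=True)
without_filter = capture(neutral_scene, ir_level=80, ir_cut=False)
```

With the filter the grey stays neutral; without it the red channel is inflated, which is the colour cast astrophotographers have to correct for in normal shooting.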

The images from the shuttle/ISS are from a normal Nikon DSLR. Also, like I said in a previous post, you kept going on about no images of stars/clusters/moon taken from the shuttle/ISS, yet when links were posted you claimed to have posted the images yourself on other threads. Care to explain that?
edit on 12-1-2013 by wmd_2008 because: (no reason given)




posted on Jan, 12 2013 @ 02:02 PM
reply to post by wildespace
 

Well geez wildespace, you really made my day with that comment, my whole year so far even!




Perhaps, with the advent of space tourism


The Canadian clown who went up there never mentioned the stars or moon or sun, as far as I remember. Imagine going to a space station and not seeing space? Well, he did see some, but just the usual Earth-dominated shot; no astrophotography.
www.dailymail.co.uk...

@jra



taken with a regular camera,


Well, I didn't know till I went and looked that Voyager used vidicons for its sensors. They were quite the device, designed for very low 'light' levels, but on Voyager they were really spectrographic units looking for chemical and mineral etc. spectra. Sure, you can make a picture out of the different wavelength spectral images, but your eyes or a regular camera couldn't detect it. You will only have the type of light we need when you are close to an object with an atmosphere or ionosphere dense enough to create transverse waves of the proper wavelength. The distance that light will be visible from depends on how much light is created, and that could be a long way with a large gassy object like Saturn. All the real action in space, though, goes on in the UV/EUV and X-ray wavelengths.
Voyager is quite the machine though.
voyager.jpl.nasa.gov...
With this image, they used a clear filter, which means that all wavelengths are getting through to the vidicon, and you have no idea what wavelength is being captured; it could all be UV. The vidicon was sensitive to UV down to 280 nm, but only went up to 680 nm, the orange, and not to IR wavelengths.
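The spectral ranges being argued over in this thread can be laid out as a quick sketch. The vidicon band (280-680 nm) is the figure given in the post above; the human-vision band used here (~380-700 nm) is a commonly quoted approximation, not a value from the thread:

```python
# Sketch of the wavelength bands under discussion.
# Vidicon range is the figure claimed in the thread; the human range is
# an assumed, commonly quoted approximation.

VIDICON_RANGE = (280, 680)  # nm, per the post above
HUMAN_RANGE = (380, 700)    # nm, approximate human visual sensitivity

def detectable(wavelength_nm, band):
    lo, hi = band
    return lo <= wavelength_nm <= hi

# Near-UV at 300 nm: inside the vidicon's band, outside human vision.
print(detectable(300, VIDICON_RANGE), detectable(300, HUMAN_RANGE))
```

On these numbers, a clear-filter vidicon exposure could indeed include near-UV light the eye would miss, while both eye and vidicon overlap across most of the visible band.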




The images from shuttle/ISS are from a normal Nikon DSLR


I know that, but they have to use the Earth's atmosphere/ionosphere to make anything visible to the camera; they will not point them out into deep space, where the depth of ionosphere above the ISS/shuttle is too sparse to make objects visible. They always look sideways, through the atmosphere. Once it sinks in, wmd, you'll wonder how they managed to pull the wool over your eyes for so long.



posted on Jan, 12 2013 @ 02:21 PM

Originally posted by GaryN
The Canadian clown who went up there never mentioned the stars or moon or sun, as far as I remember. Imagine going to a space station and not seeing space? Well he did see some, but just the usual Earth dominated shot, no astro-photography.

Space is a very dark place. When you're inside a well-lit spaceship and you look out of the window, you will see mostly blackness. Stars, galaxies and nebulae are very faint and require dark-adapted sight. That said (and I have mentioned this before), there has been astrophotography done from the ISS. There are also several side-facing windows on the ISS, through which you can see much further up into space.


Originally posted by GaryN
Well I didn't know till I went and looked that Voyager used vidicons for its sensors. They were quite the device, designed for very low 'light' levels, but on Voyager they were really spectrographic units looking for chemical and mineral etc spectra.

Huh? The vidicon cameras on Voyager were not spectrographs. The spectrographs (or rather spectrometers) were separate instruments from the cameras. You can see it well at the link you posted: voyager.jpl.nasa.gov...

Spectrometers or spectrographs don't produce a photographic image; they produce a spectrum or measure the intensity at various wavelengths. You can't take a picture with them. The vidicon cameras on Voyager, on the other hand, did take photographic images. Voyager 1 Scientific instruments

Those cameras were sensitive throughout most of the visible spectrum, up to 640 nm (red).
edit on 12-1-2013 by wildespace because: (no reason given)


jra

posted on Jan, 12 2013 @ 02:50 PM

Originally posted by GaryN
...on Voyager they were really spectrographic units looking for chemical and mineral etc spectra.


I think you are mixing up several instruments here. The Imaging Science System is the vidicon camera that took all the photos, like the famous "pale blue dot". Other instruments were used for spectrography, like the Infrared Interferometer Spectrometer, the Ultraviolet Spectrometer and the Plasma Spectrometer.


With this image, they used a clear filter, which means that all wavelengths are getting through to the vidicon, and you have no idea what wavelength is being captured, it could all be UV.


What image are you referring to? And what about the fact that the "pale blue dot" photo was taken with green, blue and violet filters? No ultraviolet was used.
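Combining separate filtered exposures into one colour picture, as described for the "pale blue dot", can be sketched as channel assignment. The mapping below (green/blue/violet frames placed straight into R/G/B output channels) is purely illustrative; the real Voyager colour processing involved calibration this sketch ignores:

```python
# Sketch: building a colour image from monochrome frames taken through
# different filters, by assigning each frame to an output channel.
# The green/blue/violet -> R/G/B mapping is an assumption for illustration,
# not the actual Voyager calibration pipeline.

def composite(frames):
    """frames: dict mapping filter name -> 2D list of brightness values.
    Returns a 2D list of (R, G, B) tuples."""
    first = next(iter(frames.values()))
    h, w = len(first), len(first[0])
    order = ["green", "blue", "violet"]  # assumed channel assignment
    return [[tuple(frames[f][y][x] for f in order) for x in range(w)]
            for y in range(h)]

frames = {
    "green":  [[100, 110]],
    "blue":   [[ 90,  95]],
    "violet": [[ 80,  85]],
}
rgb = composite(frames)
```

This is why "taken with green, blue and violet filters" still yields a single colour photo: each filtered frame contributes one channel of the final image.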





