originally posted by: swanne
a reply to: wmd_2008
a reply to: wildespace
Online calculators agree with my computation.
Check for yourselves: angular diameter calculator
The Sun's diameter is 0.0046491 AU, and the distance of Pluto is 39.5 AU.
My calculations give 0.006743639 degrees.
The online calculator rounds it up to 0.007.
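For anyone who wants to double-check without the online calculator, here's a minimal Python sketch of the angular-size formula such calculators implement, plugging in the figures quoted above:

```python
import math

# Figures quoted in the post above (both in AU).
size_au = 0.0046491   # the value used here for the Sun's diameter
distance_au = 39.5    # Pluto's distance from the Sun

# Exact angular-size formula; at angles this small it is
# indistinguishable from the small-angle ratio size/distance.
angle_rad = 2 * math.atan(size_au / (2 * distance_au))
angle_deg = math.degrees(angle_rad)

print(f"{angle_deg:.9f} degrees")  # 0.006743639, matching the post
```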
originally posted by: swanne
All right. According to my calculations, the Sun would appear as a bright dot, some 0.006743639 of a degree in Pluto's sky. For comparison, Sirius is about 0.005 of a degree as seen from Earth.
originally posted by: jimmyx
It's neither a sunrise nor a sunset; Pluto is way, way too far away. The "sun" would be a tiny speck of dust, barely visible, if at all.
originally posted by: Dr UAE
Have a look at this image, guys. Notice the two shining lines in the sky. Are those meteors?
They’ve thought about it, actually. But the truth is, we’re probably better off the way things are.
To find out about space cameras, we got in touch with Noam Izenberg, a planetary scientist working on the MESSENGER probe, which is now circling Mercury taking pictures. He told us there are basically two reasons space photography is mostly in black and white. The first, as you rightly suppose, is that grayscale images are often more useful for research.
In principle, most digital cameras, including cheap Walmart models in addition to the custom-built jobs on space probes, are monochrome, or more accurately panachrome. Each of the pixel-sized receptors in a digital camera sensor is basically a light bucket; unmodified, their combined output is simply a grayscale image generated from all light in the visible spectrum and sometimes beyond.
To create a color image, each pixel on a typical earthbound camera has a filter in front of it that passes red, green, or blue light, and the camera’s electronics add up the result to create the image we see, similar to a color TV. In effect, filtering dumbs down each panachrome pixel so that it registers only a fraction of the light it’s capable of seeing. Granted, the human eye works in roughly the same way. The fact remains, in an earthbound camera, some information is lost.
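To make that trade-off concrete, here's a toy Python/NumPy sketch (hypothetical values, not any real camera's pipeline) of an RGGB Bayer mosaic: each sensor pixel keeps only one of the three colour channels, discarding the rest.

```python
import numpy as np

# A tiny hypothetical 4x4 scene with full RGB information at every pixel.
rng = np.random.default_rng(0)
scene = rng.random((4, 4, 3))  # axes: row, column, (R, G, B)

# RGGB Bayer pattern: each photosite sits behind a single colour filter.
bayer = np.zeros((4, 4))
bayer[0::2, 0::2] = scene[0::2, 0::2, 0]  # red filter sites
bayer[0::2, 1::2] = scene[0::2, 1::2, 1]  # green filter sites
bayer[1::2, 0::2] = scene[1::2, 0::2, 1]  # green filter sites
bayer[1::2, 1::2] = scene[1::2, 1::2, 2]  # blue filter sites

# The sensor records 16 numbers instead of 48: two thirds of the colour
# information is thrown away and must be interpolated (demosaiced) back.
print(scene.size, "values in the scene ->", bayer.size, "recorded")
```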
Space cameras are configured differently. They're designed to measure not just all visible light but also the infrared and ultraviolet light past each end of the visible spectrum. Filtering is used primarily to make scientifically interesting details stand out. “Most common planetary camera designs have filter wheels that rotate different light filters in front of the sensor,” Izenberg says. “These filters aren’t selected to produce ‘realistic’ color that the human eye would see, but rather to collect light in wavelengths characteristic of different types of rocks and minerals,” to help identify them.
True-color images — that is, photos showing color as a human viewer would perceive it — can be approximated by combining exposures shot through different visible-color filters in certain proportions, essentially mimicking what an earth camera does. However, besides not inherently being of major scientific value, true-color photos are a bitch to produce: all the variously filtered images must be separately recorded, stored, and transmitted back to Earth, where they’re assembled into the final product. An 11-filter color snapshot really puts the squeeze on storage space and takes significant transmission time.
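Mechanically, the assembly step is just stacking the separately downlinked frames into display channels. A toy sketch (the equal weighting is an assumption; real processing calibrates for each filter's response):

```python
import numpy as np

# Three hypothetical monochrome exposures, each shot through a different
# visible-light filter and transmitted to Earth separately.
red_frame = np.full((2, 2), 0.8)
green_frame = np.full((2, 2), 0.5)
blue_frame = np.full((2, 2), 0.3)

# Approximate true colour by stacking the filtered frames into RGB.
true_color = np.stack([red_frame, green_frame, blue_frame], axis=-1)
print(true_color.shape)  # (2, 2, 3): one RGB triple per pixel
```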
Given limited opportunities, time, and bandwidth, a better use of resources often is a false-color image — for example, an infrared photo of rocks revealing their mineral composition. At other times, when the goal is to study the shape of the surface, measuring craters and mountains and looking for telltale signs of tectonic shifts or ancient volcanoes, scientists want black-and-white images at maximum resolution so they can spot fine detail.
Terrific, you say. But don’t scientists realize the PR value of a vivid color photo?
They realize it all right. But that brings up the second reason most NASA images aren’t in color. The dirty little secret of space exploration is that a lot of the solar system, and for that matter the cosmos, is pretty drab. “The moon is 500 shades of gray and black with tiny spatterings of greenish and orangish glass,” Izenberg says. “Mars is red-dun and butterscotch with white ice at the poles. Jupiter and glorious Saturn are white/yellowish/brown/reddish. Hubble's starscapes are white or faintly colored unless you can see in the infrared and ultraviolet.”
As for Mercury, Izenberg’s bailiwick, NASA has posted on its website detailed color photos showing vast swaths of the planet’s surface. If the accompanying text didn’t tell you they were true-color, you’d never know.
False-color images are often a lot more interesting. The colors aren’t faked, exactly; rather, they’re produced by amplifying modest variations in the visible spectrum and adding in infrared and ultraviolet. Some of the less successful examples look like a Hare Krishna tract, but done skillfully the result can be striking. The spectacular full-color nebula images from the Hubble Space Telescope were all produced by black-and-white sensors with color filters.
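A false-colour composite is built the same way; it just assigns non-visible bands to display channels and stretches the contrast. A toy sketch (band values and the linear stretch are illustrative assumptions):

```python
import numpy as np

def stretch(band):
    """Amplify modest variations: rescale a band to span 0..1."""
    lo, hi = band.min(), band.max()
    return (band - lo) / (hi - lo) if hi > lo else band

# Hypothetical monochrome frames from three filter-wheel positions.
infrared = np.array([[0.51, 0.53], [0.52, 0.58]])
visible = np.array([[0.40, 0.41], [0.42, 0.40]])
ultraviolet = np.array([[0.10, 0.12], [0.11, 0.19]])

# Map IR -> red, visible -> green, UV -> blue after stretching, so
# subtle spectral differences become vivid on screen.
false_color = np.stack([stretch(infrared),
                        stretch(visible),
                        stretch(ultraviolet)], axis=-1)
print(false_color[0, 0], false_color[1, 1])  # strongly contrasting pixels
```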
For what it’s worth, some colleagues of Izenberg’s a few years ago floated the idea of doing as you suggest — putting an off-the-shelf digital camera on a probe in addition to the more expensive models. The idea didn’t get off the ground, as it were, partly out of concerns the camera wouldn’t survive the extreme temperatures of space. But chances are the raw results wouldn’t have been all that impressive anyway. Experience suggests a good space photo needs a little … eh, don’t call it show biz. Call it art.
— Cecil Adams
originally posted by: Spacespider
So Pluto is only black and white?
The camera has taken many color photos... but when we get too close, we get black and white...
originally posted by: swanne
The Sun's diameter is 0.0046491 AU,
and the distance of Pluto is 39.5 AU.
originally posted by: wmd_2008
a reply to: Soylent Green Is People
Actually, consumer digital cameras use a Bayer filter, except for Sigma DSLRs, which use a three-layer sensor with one layer for each colour.
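Schematically, the difference is just how many colour samples each photosite records (a back-of-the-envelope sketch, not either vendor's actual readout):

```python
pixels = 24_000_000  # a hypothetical 24-megapixel sensor

bayer_samples = pixels * 1    # one colour filter per photosite
stacked_samples = pixels * 3  # three stacked layers per photosite

# A Bayer camera interpolates the missing two thirds in software;
# a layered (Foveon-style) sensor measures all three at every site.
print(bayer_samples, stacked_samples)
```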
originally posted by: Saint Exupery
No, its radius is that (roughly; I got 0.00465240).
I plugged twice that (to get the diameter) into your calculator with the 39.5 AU value and got 0.013 degrees, which is 46.8 seconds of arc.
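Redone in Python for anyone following along (same formula as before, with the radius doubled):

```python
import math

radius_au = 0.00465240  # the Sun's radius in AU, from the post above
distance_au = 39.5      # Pluto's distance in AU

diameter_au = 2 * radius_au
angle_deg = math.degrees(2 * math.atan(diameter_au / (2 * distance_au)))

print(f"{angle_deg:.4f} degrees")        # 0.0135, which rounds to 0.013
print(f"{angle_deg * 3600:.1f} arcsec")  # 48.6 unrounded; 46.8 comes from
                                         # converting the rounded 0.013 figure
```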