
# A Sunset on Pluto


posted on Sep, 18 2015 @ 06:40 AM

True that. We only know the smallest fraction of a fraction about the universe at large, and the stuff we see is AMAZING. Not that the stuff we see on Earth isn't amazing too (too bad humans don't stop to check it out anymore and just wantonly destroy it...), but damn! The stuff in the wider universe is just fascinating, and sometimes terrifying when you try to conceptualize the scales of everything.

posted on Sep, 18 2015 @ 07:05 AM

Online calculators agree with my computation.

Check for yourselves: angular diameter calculator

The sun diameter is 0.0046491 AU, and the distance of Pluto is 39.5 AU.

My calculations give 0.006743639.
The online calculator rounds it up to 0.007.

edit on 18-9-2015 by swanne because: (no reason given)
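The arithmetic behind this can be sketched with the small-angle approximation, using the exact figures quoted above (0.0046491 AU and 39.5 AU):

```python
import math

def angular_size_deg(size_au, distance_au):
    """Small-angle approximation: angular size in degrees of an
    object of the given linear size seen from the given distance.
    (Valid because the Sun is tiny compared to Pluto's distance.)"""
    return math.degrees(size_au / distance_au)

# The figures quoted in the post:
theta = angular_size_deg(0.0046491, 39.5)
print(theta)          # ~0.006744 degrees, matching the 0.006743639 above
print(theta * 3600)   # ~24.3 arcseconds
```

Note this reproduces the posted number exactly; whether 0.0046491 AU is the Sun's radius or diameter is debated further down the thread.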

posted on Sep, 18 2015 @ 07:54 AM

originally posted by: swanne

Online calculators agree with my computation.

Check for yourselves: angular diameter calculator

The sun diameter is 0.0046491 AU, and the distance of Pluto is 39.5 AU.

My calculations give 0.006743639.
The online calculator rounds it up to 0.007.

These are the figures you quoted:

originally posted by: swanne

All right. According to my calculations, the Sun would appear as a bright dot, some 0.006743639 of a degree in Pluto's sky. For comparison, Sirius is about 0.005 of a degree as seen from Earth.

How can Sirius, which is about 1.5 million miles in diameter (not even twice the Sun's) at a distance of 8.6 light-years from Earth, look almost the same size as the Sun seen from Pluto? You have made a mistake somewhere; from Pluto the Sun would be about 0.014 degrees.
edit on 18-9-2015 by wmd_2008 because: (no reason given)


posted on Sep, 18 2015 @ 08:16 AM

originally posted by: jimmyx
It's neither a sunrise nor a sunset; it's way, way too far away. The "sun" would be a tiny speck of dust, barely visible, if at all.

There is definite sunlight that falls upon Pluto, and a discernible terminator line separating daylight from the darkness of the night side. Therefore, the Sun is visible.

The lighting on Pluto at "local noon" would be similar to the lighting conditions on Earth at dusk or dawn. To see first-hand what the lighting conditions on Pluto at noon look like, see the link below:

Pluto Time

edit on 9/18/2015 by Soylent Green Is People because: (no reason given)
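The "Pluto Time" comparison follows from the inverse-square law. A minimal sketch, using rough ballpark illuminance values (~100,000 lux for direct noon sunlight on Earth, ~0.2 lux for a full moon) that are illustrative assumptions, not figures from the thread:

```python
# Sunlight dims with the square of distance from the Sun.
EARTH_NOON_LUX = 100_000   # rough ballpark for direct noon sunlight (assumption)
FULL_MOON_LUX = 0.2        # rough ballpark for a bright full moon (assumption)

def sunlight_lux(distance_au):
    """Approximate illuminance of direct sunlight at a given distance
    from the Sun, via the inverse-square law."""
    return EARTH_NOON_LUX / distance_au ** 2

pluto_noon = sunlight_lux(32.9)    # Pluto's 2015 distance, from later in the thread
print(pluto_noon)                  # ~92 lux -- dim indoor / twilight lighting
print(pluto_noon / FULL_MOON_LUX)  # still hundreds of times a full moon
```

So "noon" on Pluto is dim by Earth standards, but nothing like true darkness, which squares with the visible terminator in the New Horizons images.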

posted on Sep, 18 2015 @ 08:37 AM
Have a look at this image, guys. Notice the two shining lines in the sky; are those meteors?

posted on Sep, 18 2015 @ 08:45 AM

originally posted by: Dr UAE
Have a look at this image, guys. Notice the two shining lines in the sky; are those meteors?

They might be stars or maybe two of Pluto's moons.

I'm guessing that maybe the exposure time (the time the camera shutter needed to be open to properly expose the image) was long enough that the stars in the background appear as streaks as the spacecraft moves during the exposure.

The spacecraft would tilt/change orientation to keep the camera fixed on Pluto during the exposure as the spacecraft zipped past Pluto, thus there is no motion blur with Pluto, but due to the tilting of the camera to stay fixed on Pluto, the stars (or moons) would appear to move relative to the camera.

edit on 9/18/2015 by Soylent Green Is People because: (no reason given)

posted on Sep, 18 2015 @ 10:18 AM

originally posted by: swanne

Online calculators agree with my computation.

Check for yourselves: angular diameter calculator

The sun diameter is 0.0046491 AU, and the distance of Pluto is 39.5 AU.

My calculations give 0.006743639.
The online calculator rounds it up to 0.007.

And 0.007 degrees is a whopping 25.2 arcseconds. Sirius is only 0.006 arcseconds wide.
edit on 18-9-2015 by wildespace because: (no reason given)

posted on Sep, 18 2015 @ 10:36 AM

I think you are wrong as well: the Sun is 865,000 miles across, so it's 1,392,083 km.

posted on Sep, 18 2015 @ 10:47 AM
While looking for images of the Sun from Pluto, I found this and find it fascinating. It's the Earth and Moon seen from Saturn; you can clearly see they're bigger and brighter than the background stars. (The image was taken in 2013 by the Cassini spacecraft.)

The Sun would be bigger and much brighter than that from Pluto, considering the distance and the size difference between the Earth, Moon, and Sun.

edit on 18-9-2015 by SuperFrog because: (no reason given)

posted on Sep, 18 2015 @ 10:49 AM
So Pluto is only black and white??
The camera has taken many color photos... but when we get close, we get black and white...

posted on Sep, 18 2015 @ 10:55 AM
Nice pictures, a shame they are not in colour.

posted on Sep, 18 2015 @ 10:56 AM
Oh, not again....

They’ve thought about it, actually. But the truth is, we’re probably better off the way things are.

To find out about space cameras, we got in touch with Noam Izenberg, a planetary scientist working on the MESSENGER probe, which is now circling Mercury taking pictures. He told us there are basically two reasons space photography is mostly in black and white. The first, as you rightly suppose, is that grayscale images are often more useful for research.

In principle, most digital cameras, including cheap Walmart models in addition to the custom-built jobs on space probes, are monochrome, or more accurately panachrome. Each of the pixel-sized receptors in a digital camera sensor is basically a light bucket; unmodified, their combined output is simply a grayscale image generated from all light in the visible spectrum and sometimes beyond.

To create a color image, each pixel on a typical earthbound camera has a filter in front of it that passes red, green, or blue light, and the camera’s electronics add up the result to create the image we see, similar to a color TV. In effect, filtering dumbs down each panachrome pixel so that it registers only a fraction of the light it’s capable of seeing. Granted, the human eye works in roughly the same way. The fact remains, in an earthbound camera, some information is lost.

Space cameras are configured differently. They're designed to measure not just all visible light but also the infrared and ultraviolet light past each end of the visible spectrum. Filtering is used primarily to make scientifically interesting details stand out. “Most common planetary camera designs have filter wheels that rotate different light filters in front of the sensor,” Izenberg says. “These filters aren’t selected to produce ‘realistic’ color that the human eye would see, but rather to collect light in wavelengths characteristic of different types of rocks and minerals,” to help identify them.

True-color images — that is, photos showing color as a human viewer would perceive it — can be approximated by combining exposures shot through different visible-color filters in certain proportions, essentially mimicking what an earth camera does. However, besides not inherently being of major scientific value, true-color photos are a bitch to produce: all the variously filtered images must be separately recorded, stored, and transmitted back to Earth, where they’re assembled into the final product. An 11-filter color snapshot really puts the squeeze on storage space and takes significant transmission time.

Given limited opportunities, time, and bandwidth, a better use of resources often is a false-color image — for example, an infrared photo of rocks revealing their mineral composition. At other times, when the goal is to study the shape of the surface, measuring craters and mountains and looking for telltale signs of tectonic shifts or ancient volcanoes, scientists want black-and-white images at maximum resolution so they can spot fine detail.

Terrific, you say. But don’t scientists realize the PR value of a vivid color photo?

They realize it all right. But that brings up the second reason most NASA images aren’t in color. The dirty little secret of space exploration is that a lot of the solar system, and for that matter the cosmos, is pretty drab. “The moon is 500 shades of gray and black with tiny spatterings of greenish and orangish glass,” Izenberg says. “Mars is red-dun and butterscotch with white ice at the poles. Jupiter and glorious Saturn are white/yellowish/brown/reddish. Hubble's starscapes are white or faintly colored unless you can see in the infrared and ultraviolet.”

As for Mercury, Izenberg’s bailiwick, NASA has posted on its website detailed color photos showing vast swaths of the planet’s surface. If the accompanying text didn’t tell you they were true-color, you’d never know.

False-color images are often a lot more interesting. The colors aren’t faked, exactly; rather, they’re produced by amplifying modest variations in the visible spectrum and adding in infrared and ultraviolet. Some of the less successful examples look like a Hare Krishna tract, but done skillfully the result can be striking. The spectacular full-color nebula images from the Hubble Space Telescope were all produced by black-and-white sensors with color filters.

For what it’s worth, some colleagues of Izenberg’s a few years ago floated the idea of doing as you suggest — putting an off-the-shelf digital camera on a probe in addition to the more expensive models. The idea didn’t get off the ground, as it were, partly out of concerns the camera wouldn’t survive the extreme temperatures of space. But chances are the raw results wouldn’t have been all that impressive anyway. Experience suggests a good space photo needs a little … eh, don’t call it show biz. Call it art.

Source: www.straightdope.com...

posted on Sep, 18 2015 @ 11:12 AM

originally posted by: Spacespider
So Pluto is only black and white??
The camera has taken many color photos... but when we get close, we get black and white...

As 'SuperFrog' pointed out above, ALL pictures taken with digital cameras are, for all intents and purposes, grayscale (i.e., the image sensors in your consumer digital camera and in the digital cameras on a spacecraft can only detect grayscale -- not color).

To get colorized images from those grayscales, your digital camera views each picture through a variety of filters -- red, green, and blue. Each filtered image creates a slightly different grayscale with slightly different light intensities. Your camera's computer then uses pre-written algorithms to analyze each grayscale pixel (as seen through the different filters), determine what color those grayscales represent (based on what we know about color theory), and combine the results. The end result is the colorized image that you see.

All of this happens in less than one second, so you don't really know what is going on inside your camera.

Space imagery is slightly different in that the grayscale images themselves (seen through various filters of different wavelengths of the spectrum) are a valuable tool just as they are. So the grayscale images are what is sent back to Earth to be analyzed as grayscales, or they can be run through the colorization algorithms later to get an idea of the colors.

By sending back the grayscale images rather than having the computers on board the spacecraft do the colorizing and then sending the color image back, they can preserve the valuable data contained in the various filtered versions of each image. If it were combined and colorized on the spacecraft, some of that information would be lost.
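The ground-side combination step described above can be sketched in plain Python; the tiny 2x2 "exposures" are illustrative stand-ins for three separately filtered grayscale frames, not real spacecraft data:

```python
# Three grayscale exposures of the same scene, each taken through a
# different color filter (pixel values 0-255). Illustrative 2x2 frames.
red_exposure   = [[230, 50], [100, 25]]
green_exposure = [[75, 200], [100, 25]]
blue_exposure  = [[25, 50], [230, 25]]

def combine(r, g, b):
    """Stack three filtered grayscale frames into one RGB image:
    each output pixel is the (R, G, B) triple taken from the three
    frames. The original filtered frames remain untouched as data."""
    return [[(r[y][x], g[y][x], b[y][x]) for x in range(len(r[0]))]
            for y in range(len(r))]

rgb = combine(red_exposure, green_exposure, blue_exposure)
print(rgb[0][0])   # (230, 75, 25) -- a reddish pixel
```

Because the three filtered frames are kept separate until this step, scientists can also recombine them later with different weightings (or swap in infrared/ultraviolet frames) to produce false-color views.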

posted on Sep, 18 2015 @ 11:21 AM
a reply to: Soylent Green Is People

Actually, consumer digital cameras use a Bayer filter, except for Sigma DSLRs, which use a three-layer sensor with one layer for each colour.

edit on 18-9-2015 by wmd_2008 because: (no reason given)


posted on Sep, 18 2015 @ 11:23 AM

originally posted by: swanne
The sun diameter is 0.0046491 AU,

No, its radius is that (roughly - I got 0.00465240). I plugged twice that (to get the diameter) into your calculator with the 39.5 AU value and got 0.013 degrees, which is 46.8 seconds of arc. However...

and the distance of Pluto is 39.5 AU.

That's the average distance. Pluto is only ~25 years past perihelion, so it's only ~32.9 AU from the Sun, which yields an angular diameter of ~58 seconds of arc.

A simpler approximation is to say that Pluto is ~30 times further from the Sun than Earth, therefore the Sun will appear ~1/30 the size in the sky as it appears from Earth.

As a side note, the Sun would be ~ 1/1000 as bright, which is still much, much brighter than a full moon.
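These corrected figures are easy to verify; a minimal sketch using the radius value quoted above:

```python
import math

SUN_RADIUS_AU = 0.00465240   # Sun's radius (not diameter), in AU

def angular_diameter_deg(distance_au):
    """Angular diameter of the Sun, in degrees, seen from a given
    distance, via the small-angle approximation."""
    return math.degrees(2 * SUN_RADIUS_AU / distance_au)

mean = angular_diameter_deg(39.5)   # Pluto's mean distance
now  = angular_diameter_deg(32.9)   # Pluto's distance in 2015
print(mean, mean * 3600)   # ~0.0135 deg, ~48.6 arcsec
print(now, now * 3600)     # ~0.0162 deg, ~58.3 arcsec

# Brightness falls with the square of distance, hence "~1/1000":
print(32.9 ** 2)           # sunlight at Pluto is ~1/1082 of Earth's
```

The ~58 arcsecond figure matches the post above, and the inverse-square ratio confirms the "~1/1000 as bright" side note.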

posted on Sep, 18 2015 @ 11:34 AM
You need to run for President, as you have a vision that no one else does.

posted on Sep, 18 2015 @ 11:47 AM

originally posted by: wmd_2008
a reply to: Soylent Green Is People

Actually consumer digital cameras use a Bayer Filter except for Sigma DSLR's which use a 3 layer sensor one layer for each colour.

Yes. Thank you for the clarification. The Bayer Filter is a combined red/green/blue filter. I was just trying to simplify it in concept.

By the way, Curiosity's cameras use a Bayer filter in their image path.

edit on 9/18/2015 by Soylent Green Is People because: (no reason given)

posted on Sep, 18 2015 @ 12:35 PM

These images show how the Sun appears to the instruments at that distance:

www3.telus.net...
www3.telus.net...

posted on Sep, 18 2015 @ 02:24 PM

originally posted by: Saint Exupery

No, its radius is that (roughly - I got 0.00465240).

Oh. Damn, you are quite right.

I plugged twice that (to get the diameter) into your calculator with the 39.5 AU value and got 0.013 degrees, which is 46.8 seconds of arc.

Yes, my own manual calculation agrees: I now get 0.01348728 degrees. That makes more sense.

Thanks for figuring out the fault! Something as stupid as confusing radius and diameter.

posted on Sep, 18 2015 @ 03:20 PM

I spotted it because it is MY number-1 boo-boo on the spreadsheets I make, so I have to watch myself.
