It depends on how fast the ship was traveling to Alpha Centauri. Say it traveled at roughly the speed of the spacecraft that went to the Moon. It took the astronauts about 3 days to get there, and by then the signal was delayed by about 1.3 seconds, the Moon's distance in light time. So even that video wasn't real time, it was already delayed, though with that delay spread over a 3-day trip you'd hardly notice it. The same thing would happen if the craft continued on to Alpha Centauri: the delay keeps growing as the distance grows. Given the trip would take over 100,000 years at that speed, the delay would eventually build up to about 4.4 years, the light travel time from Alpha Centauri to Earth.
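A back-of-the-envelope sketch of those numbers, assuming a constant cruise speed of about 11 km/s (roughly Apollo's departure speed from Earth; the real spacecraft slowed as it coasted, so these are only rough illustrative figures):

```python
C_KM_S = 299_792.458          # speed of light, km/s
LY_KM = 9.4607e12             # kilometres per light-year
DIST_LY = 4.37                # distance to Alpha Centauri, light-years
SPEED_KM_S = 11.0             # assumed constant cruise speed, km/s

distance_km = DIST_LY * LY_KM
trip_years = distance_km / SPEED_KM_S / (86_400 * 365.25)
delay_years = distance_km / C_KM_S / (86_400 * 365.25)

print(f"trip time:   {trip_years:,.0f} years")   # well over 100,000 years
print(f"final delay: {delay_years:.2f} years")   # about 4.37 years
```

So the feed never "cuts out"; the one-way delay just grows smoothly from a fraction of a second at launch to about 4.4 years at arrival.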
originally posted by: Crumbles
Can someone answer my question? Light-speed communication is the limit; of course everyone knows this. I was randomly thinking the other day: if we sent out a probe and it live-streamed from launch all the way to, let's say, Alpha Centauri, wouldn't we get real-time images from start to finish, since there is never a cut in the broadcast? Just a weird thought. I know I'm wrong, but not sure why. I know light from one light-year away takes a year to reach us, but if we're getting a constant feed, how would that work?
originally posted by: moebius
There would be no "glitches" or skipped frames.
The signal itself would simply be red-shifted in proportion to the spacecraft's speed. Because each frame is sent from slightly farther away than the last, the stream arrives stretched out in time.
For example, you would be sending 30 frames per second, but each second of footage might take 2 seconds to receive.
originally posted by: pheonix358
Which is slow motion.
When the craft gets to where it is going, the feed would still be delayed by about four years, but the slow-motion effect would cease.
But yes, delayed by the time light would take to go from there to here.
P