posted on Sep, 7 2012 @ 05:41 AM
Originally posted by seabhac-rua
The sounds of the astronauts speaking were recorded separately and added to the film afterwards, or simply, as in a live broadcast played
Audio and visual feeds are generally sent as two separate signals (vision and audio) to the receiver, which picks them up and plays both back. This can be to a tape, a monitor, or whatever. Even when recording off the screen at NASA, broadcasters would have been producing two feeds. Between early Apollo and
later Apollo there were tonnes of changes to exactly how the data was transmitted, so the rest of that is a lot more reading, but the info is available.
I do know there were issues that caused interference in the early missions with multiple feeds at once.
This has to do with how things were done in the editing room back in the 60s/70s and who was contracted to do the editing in the first
place. I don't think that having 100% accurate continuity was high on the editors' priority list back in those days,
Been a while since I did film history, but: continuity is actually worse now than it has ever been. That said, the 60s and 70s had the cinéma vérité
movement in France, and mixing images with different audio to make a point or push an agenda was becoming much more common. Not that it was a first, but
WWII footage mostly used a voice-over to drive the images (propaganda style). From the 60s and 70s onwards it became much more common to
present a reality as if it were entirely observational 'truth' when it was in fact heavily edited to support the filmmaker's agenda.
It's a topic for another thread, because I suspect this one should die since the original question has been answered, but... I imagine most of those changes
have come about over the decades from production companies retiming the footage for segments. The communications delay was also removed in later
releases/restorations, which caused some confusion.
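For anyone curious why that delay is noticeable enough to edit out: it follows straight from the Earth-Moon distance and the speed of light. A quick back-of-envelope sketch (using the average Earth-Moon distance; the exact delay varied with the Moon's position during each mission):

```python
# Back-of-envelope: radio signal delay between Earth and the Moon.
# Uses the average Earth-Moon distance; the real delay varied slightly
# with the Moon's orbital position during each mission.
EARTH_MOON_KM = 384_400            # average distance, km
SPEED_OF_LIGHT_KM_S = 299_792.458  # km per second

one_way_delay = EARTH_MOON_KM / SPEED_OF_LIGHT_KM_S
round_trip_delay = 2 * one_way_delay

print(f"one-way delay:    {one_way_delay:.2f} s")     # about 1.28 s
print(f"round-trip delay: {round_trip_delay:.2f} s")  # about 2.56 s
```

So every exchange between Houston and the crew carried roughly 1.3 seconds of dead air each way, which is exactly the gap that later re-edits sometimes tightened up.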