posted on Oct, 27 2010 @ 11:54 PM
Ok, before everyone's heads explode in fear, can someone show me a sample of this being used on a complex background, and not the uniform one the
YouTube video showed?
It's one thing to track an object in real time and remove it from a uniform background, but you'd never get a decent result using it in a room full
of people. Suddenly you'd have parts of random people scattered all over the removed area. So no, it won't work in Washington, at least not convincingly.
And you can guarantee that if this were used on a moving object, it would take more than 41ms to do, as each frame would have to be completely
analysed. This, from what I can gather from the video, is using the first frame as a reference frame, and extrapolating from that data to work on
the subsequent frames. Mocha has a similar feature in its 2D planar tracker, where it uses a set frame as the master frame and then can handle
rotation, panning, perspective etc. from that, and can even keep a consistent track if the area becomes partially obstructed by another object.
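For what it's worth, that ~41ms figure is roughly the per-frame time budget at 24 fps, the standard film frame rate — that's my guess at where the number comes from, not something the video states:

```python
# Per-frame time budget at common frame rates. ~41.7 ms at 24 fps is
# presumably the source of the "41ms" figure; anything slower than the
# budget means the effect can't keep up with live video.
for fps in (24, 25, 30):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
```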
A moving object however has a constantly changing background, and would require each frame to be analysed individually.
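To make the reference-frame idea concrete, here's a minimal sketch of removal against a static background — the function name and the mask are hypothetical, not taken from the video. The tracker supplies a per-frame mask of the object, and masked pixels are simply swapped for pixels from the clean first frame. This is exactly why a changing background breaks it: the reference pixels are only valid while the background stays put.

```python
import numpy as np

def remove_object(frame, reference, mask):
    """Composite clean-reference pixels over the masked object region.
    Only plausible with a static camera and static background, as argued
    above. frame/reference: HxWx3 arrays; mask: HxW bool (True = object)."""
    out = frame.copy()
    out[mask] = reference[mask]
    return out

# Toy 2x2 "video": background is grey (100), the tracked object is white (255).
reference = np.full((2, 2, 3), 100, dtype=np.uint8)  # clean first frame
frame = reference.copy()
frame[0, 0] = 255                                    # object enters the shot
mask = np.zeros((2, 2), dtype=bool)
mask[0, 0] = True                                    # tracker's object mask

cleaned = remove_object(frame, reference, mask)      # object replaced by background
```

If the background moved between the reference frame and the current frame, the pasted-in pixels would no longer match their surroundings — which is the erratic result described below.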
Still, it is a neat idea... But it's a shame so many people will think it's being used to erase people from live feeds, when the result would be
erratic at best, and obviously faked at worst.