

That 'Minority Report' interface? It's now only $70 away

posted on May, 21 2012 @ 09:37 PM

If you've ever watched "Minority Report" and wished that you could interact with your computer just by waving your hands around, then you might want to take a look at the Leap. It's a tiny gadget that will supposedly let you control your devices by waving your hands, wiggling your fingers, or fidgeting with a pen.

Minority Report interface--LEAP

This is pretty cool, and inexpensive. I'm not sure what the uses would be for it, and I'd think it would be terrible for someone who talks with their hands while on the phone. For $70 I might even give it a try to see what it can do. It would be nice if you could set it up so your computer detects motion and starts streaming video while you're away from work/home. What do you guys think... any ideas?

posted on May, 21 2012 @ 09:53 PM
reply to post by SUICIDEHK45

I see a whole new generation of people struggling to get to grips (pun intended) with the digital world. Like the generation before mine (I'm 40), who couldn't adapt to computers, automation and all that... I see a glorious future for me as I move uncoordinatedly with my e-glove on my hand, my e-spectacles on my head and my e-phones in my ears... all the while walking in the wrong direction. I think my e-boot needs re-booting. Yeah...

edit on 21/5/12 by LightSpeedDriver because: Typo

posted on May, 21 2012 @ 10:18 PM
I still think the brain-wave-reading device for a few hundred bucks (you can buy them now) is the future.

I mean, just recently there was a lady who managed to manipulate a robotic arm with only her brain.

And yet this is almost the same concept, without surgery.

You see, the trick is not in 'moving' things. The trick is imagining it. You could imagine a toaster popping up and have that associated with 'grab'. So each time your brain fires neurons a certain way, the device picks up a specific brainwave pattern, and the computer associates it with an action. That's how we work. We don't have specific brain functions per se; we have a learned control experience. That's why a baby has to learn to walk. Its brain says 'move leg' and it keeps trying until the baby can move its leg in a walking manner, etc.

This is why a lot of people are scared of this tech, too; they think "ooooomg, it's reading my brain, I will have none of this!" when it's not. It's no different from when you 'train' your touch-screen device. You calibrate it. It doesn't care if you hit the center of each point it wants to 'learn' -- only YOU do. It treats what it learns as a valid response and acts accordingly.
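To make the calibration analogy concrete, here's a minimal sketch (all names and the "feature vectors" are made up for illustration, not from any real BCI SDK): during a training pass you store one recorded pattern per action, and at run time you map a new reading to whichever stored pattern it's closest to -- the device never "understands" your brain, it just matches patterns you taught it.

```python
import math

def calibrate(samples):
    """samples: {action_name: feature_vector} recorded during a training pass."""
    return dict(samples)

def classify(patterns, reading):
    """Return the action whose stored pattern is nearest to the new reading."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(patterns, key=lambda action: distance(patterns[action], reading))

# Hypothetical calibration data: one vector per imagined action.
patterns = calibrate({
    "grab":    [0.9, 0.1, 0.2],
    "release": [0.1, 0.8, 0.3],
})

# A new reading close to the stored "grab" pattern maps to "grab".
print(classify(patterns, [0.85, 0.15, 0.25]))
```

Like a touch-screen calibration grid, nothing here checks whether your input was "correct" -- whatever pattern you produce during calibration becomes the valid response for that action.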

I think this is the future of things. Combined with 3D glasses with a HUD and a set of environment-aware applications, using your brain alone to control things that would ordinarily take extreme dexterity or effort could be done effortlessly with machines that, even now, rely on a non-aware processor.

I can't wait .. hope I have enough of my brain left to work some of the things that come from it all.. lol

posted on May, 22 2012 @ 02:25 PM
Very cool... I had wondered about hacking a Kinect to achieve the same goal, but this is way simpler and cheaper.
