Artificial Intelligence could one day spell the end for mankind, according to Stephen Hawking. The renowned theoretical physicist has warned that humanity faces an uncertain future as technology learns to think for itself and adapt to its environment, writes The Daily Mail.

In an article written for the Independent, Professor Hawking discusses Transcendence, Johnny Depp's latest film, which explores a world where computers can surpass the abilities of humans. Professor Hawking wrote that dismissing the film purely as science fiction could be the 'worst mistake in history'. He argues that developments in digital personal assistants such as Siri, Google Now and Cortana are merely symptoms of an IT arms race, and that these will 'pale against what the coming decades will bring', writes The Daily Mail.

However, he notes the technology could bring significant benefits, with the potential to eradicate war, disease and poverty. "Success in creating AI would be the biggest event in human history," he wrote. "Unfortunately, it might also be the last, unless we learn how to avoid the risks."

Militaries throughout the world are working to develop autonomous weapon systems in the short and medium term, with the UN simultaneously working to ban them, he wrote. "Looking further ahead, there are no fundamental limits to what can be achieved," said Professor Hawking. "There is no physical law precluding particles from being organized in ways that perform even more advanced computations than the arrangements of particles in human brains."

IBM has already developed smart chips that mimic the brain, which could lead to sensor systems replicating the brain's capacity for perception, action and thought, writes The Daily Mail. It could even allow computer scientists to develop a machine with a brain more intelligent than a human's, Professor Hawking wrote.
"As Irving Good realized in 1965, machines with superhuman intelligence could repeatedly improve their design even further, triggering what Vernor Vinge called a singularity," Professor Hawking wrote. Experts are not prepared for these scenarios, he wrote. By comparison, he wrote, if aliens were to tell us they would arrive within a few decades, scientists would not simply sit waiting for their arrival.

"Although we are facing potentially the best or worst thing to happen to humanity in history, little serious research is devoted to these issues. All of us should ask ourselves what we can do now to improve the chances of reaping the benefits and avoiding the risks."
originally posted by: beezzer
a reply to: MConnalley
Stephen Hawking is a smart guy.
Now if Carl, the bathroom attendant at ATS headquarters said it, I'd have my doubts.
originally posted by: symptomoftheuniverse
Obviously biological entities will merge with a.i.
A.i. will realise that the best machines are biological. Rust is # for an a.i.
Over the eons of time, wow signals are it.
The Amish know the score.
2 - 1 England