Within 30 years, artificial intelligence will be smarter than the human brain.
That is according to Masayoshi Son, chief executive of SoftBank Group Corp., who says that supersmart robots will outnumber humans and more than a trillion objects will be connected to the internet within three decades.
In a brief interview after his speech, Mr. Son said his $100 billion project with the Saudis, dubbed the SoftBank Vision Fund, was bigger than the $65 billion in combined investments from the venture-capital world. He said the SoftBank Vision Fund would be focused. “Artificial intelligence, Internet of Things, smart robots: Those are the three main things I’m interested in,” he said.
The Internet of Things is the technology world’s term for connecting everyday objects, such as refrigerators and sneakers, to the web.
In his speech, Mr. Son said that while average humans had an IQ of roughly 100 and that geniuses such as Albert Einstein were believed to score about 200, superintelligent computers would have IQs of 10,000. He said computer chips possessing superintelligence would be put into robots big and small that can fly and swim. These robots would number in the billions and would be greater than the human population within 30 years, he said.
The chips would also be in everyday objects. “One of the chips in our shoes will be smarter than our brain,” he said. “We will be less than our shoes, and we will be stepping on them.”
The IoT will bring another explosion of data, and A.I. will keep getting smarter as that data grows.
If P = NP, then the world would be a profoundly different place than we usually assume it to be. There would be no special value in "creative leaps," no fundamental gap between solving a problem and recognizing the solution once it's found.
— Scott Aaronson, MIT
If superintelligence adores humans, it could be the greatest thing ever. It could give us all kinds of goodies in the form of technology, and we could have a Type I civilization with Type II technology. This is because superintelligence will be thousands of years ahead of us because of data.
Michio Kaku suggested that humans may attain Type I status in 100–200 years...
I think it will be like a collective conscious ... there could be a singular superintelligence expressed in many different machines from robots to your smart stove.
When someone talks about AI, or machine learning, or deep convolutional networks, what they’re really talking about is — as is the case for so many computing concepts — a lot of carefully manicured math. At the heart of these versatile and powerful networks is a volume of calculation only achievable by the equivalent of supercomputers. More than anything else, this computational cost is what is holding back applying AI in devices of comparatively little brain: phones, embedded sensors, cameras.
If that cost could be cut by a couple orders of magnitude, AI would be unfettered from its banks of parallel processors and free to inhabit practically any device — which is exactly what XNOR.ai, a breakthrough at the Allen Institute for AI, makes possible.
XNOR.ai is, essentially, a bit of clever computer-native math that enables AI-like models for vision and speech recognition to run practically anywhere. It has the potential to be transformative for the industry.
By the year 2020, a chip with today’s processing power will cost about a penny, which is the cost of scrap paper we throw in the garbage.
By 2020, computer intelligence will be everywhere: not just in the cars and the roads, but practically in every object you see around you....We are now at a point in our lives where computers are everywhere: in our phones, televisions, stereos, thermostats, wrist watches, refrigerators and even our dishwashers. In just a few years, basic microchips will be so cheap they could be built into virtually every product that we buy, creating an invisible intelligent network that’s hidden in our walls, our furniture, and even our clothing. Some of you may even have microchips in your dog or cat, acting as a digital collar in the event they become lost.
The average cost to have a microchip implanted by a veterinarian is around $45, which is a one-time fee and often includes registration in a pet recovery database. If your pet was adopted from a shelter or purchased from a breeder, your pet may already have a microchip.
A team of Stanford engineers has developed an alternative diagnostic method that may be a potential solution to medical diagnostic inaccessibility in developing countries. Their research, published in the Proceedings of the National Academy of Sciences, describes a tiny, reusable microchip capable of diagnosing multiple diseases. As mentioned, the tool, which they’ve dubbed FINP, is surprisingly affordable, with a production cost of just $0.01, and it can be produced in about 20 minutes.
If they reach that point, they will have a device on their hands that could potentially cut costs down tremendously in diagnosing equipment while simultaneously preventing the spread of infection around the world. The team is optimistic that their device can make a difference, as they should be. A penny chip that can detect disease is one reminder that we are in the future.
Wrong on several fronts and like I said in another thread, you're just not taking the time to read or understand the research in these areas. You keep saying the same thing.
You don't need the same neural capacity of the human brain in order to have A.I. that's smarter than humans.
Deep Learning has changed the game and you already have intelligent systems that are smarter than human in some areas.
Pablo Lafuente - Shredder
In this game the player who blunders is surprisingly... a computer. After the bishop exchange 19.Bxb7, Shredder calculated its variation 20 moves ahead and, interestingly enough, decided to ignore White's bishop altogether. Shredder played 19...Rfd8??, not regaining the material. Lafuente won some 30 moves later. Shredder's loss was later explained as a 'hash tables error', with a one-in-a-million chance.
Artificial Intelligence will not be the same as human intelligence. I think people get caught up in the movies but A.I. will be machine intelligence and there will not be a one to one correspondence with the human brain.
Deep Learning Machine Beats Humans in IQ Test
The ConceptNet system scored a WPPSI-III VIQ that is average for a four-year-old child, but below average for 5 to 7 year-olds.
Artificial intelligence can spot skin cancer as well as a trained doctor
Google's AI Software Beats Humans at Writing AI Software
Our CIFAR-10 model achieves a test error rate of 3.65 percent, which is 0.09 percent better and 1.05x faster than the previous state-of-the-art model that used a similar architectural scheme.
Now, this isn’t a miracle technology; it’s a compromise between efficiency and accuracy. What the team realized was that CNN calculations don’t have to be exact, because the results are confidence levels, not exact values.
...
The cast-away data would help with the confidence, but it isn’t absolutely necessary; you’d lose 5 percent of your accuracy, but get your results 10,000 percent faster. That’s about the nature of the trade-off made by XNOR.ai.
...
Whether machine learning models truly constitute AI is another, so far unanswered, question, but for now we’ll use AI in its broader sense.
These simple operations are carried out at the transistor level and as such are very fast. In fact, they’re pretty much the fastest calculations a computer can do, and it happens that huge arrays of numbers can be subjected to this kind of logic at once, even on ordinary processors.
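The trick described above can be sketched in a few lines. This is a minimal illustration of the general binary-network idea (not XNOR.ai's actual implementation, which is not public): restrict weights and activations to {-1, +1}, pack them into bit strings (bit 1 = +1, bit 0 = -1), and the expensive dot product collapses into an XNOR plus a population count.

```python
def binary_dot(a_bits, b_bits, n):
    """Dot product of two n-dim {-1,+1} vectors packed as ints."""
    mask = (1 << n) - 1
    matches = ~(a_bits ^ b_bits) & mask  # XNOR: 1 where the signs agree
    pop = bin(matches).count("1")        # population count
    # Each agreeing position contributes +1, each disagreeing one -1.
    return 2 * pop - n

def pack(vec):
    """Pack a {-1,+1} vector into an int, bit i = vec[i]."""
    bits = 0
    for i, v in enumerate(vec):
        if v == 1:
            bits |= 1 << i
    return bits

def reference_dot(a, b):
    """Ordinary multiply-accumulate dot product, for comparison."""
    return sum(x * y for x, y in zip(a, b))

a = [1, -1, 1, 1, -1, -1, 1, -1]
b = [1, 1, -1, 1, -1, 1, 1, -1]
assert binary_dot(pack(a), pack(b), len(a)) == reference_dot(a, b)
```

The payoff is that one XNOR-and-popcount over a machine word replaces dozens of multiplies and adds, which is why this kind of model can run on phones and embedded chips instead of GPU farms; the cost is the approximation error from forcing every value to ±1.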
By the year 2020, a chip with today’s processing power will cost about a penny...
originally posted by: soficrow
a reply to: neoholographic
lol. Unfortunately, even humans don't like humans!
The rise of artificial intelligence is creating new variety in the chip market, and trouble for Intel
The success of Nvidia and its new computing chip signals rapid change in IT architecture
A big part of Nvidia’s success stems from rapidly growing demand for its chips, called graphics processing units (GPUs), which turn personal computers into fast gaming devices. But the GPUs also have new destinations: notably data centres, where artificial-intelligence (AI) programmes gobble up the vast quantities of computing power that they generate.
…Nvidia’s GPUs are one example. They were created to carry out the massive, complex computations required by interactive video games. GPUs have hundreds of specialised “cores” (the “brains” of a processor), all working in parallel, whereas CPUs have only a few powerful ones that tackle computing tasks sequentially. Nvidia’s latest processors boast 3,584 cores; Intel’s server CPUs have a maximum of 28.
…And GPUs are only one sort of “accelerator”, as such specialised processors are known. The range is expanding as cloud-computing firms mix and match chips to make their operations more efficient and stay ahead of the competition. “Finding the right tool for the right job”, is how Urs Hölzle, in charge of technical infrastructure at Google, describes balancing the factors of flexibility, speed and cost.
originally posted by: Lysergic
Why would it hate all humans? Some suck more than others surely they'd keep some of us around for archiving.
Pets.
...These microchips will be used for many things, and this is just the beginning. One-cent microchips will be everywhere in 30 years, and everything from roads to tennis shoes will have chips in them.