
Our Computers Are Learning How to Code Themselves


posted on Mar, 3 2017 @ 03:22 PM

originally posted by: soficrow
Hence 'coding for dummies' in the new apps. Not up to your standards I know, but there IS a market.

...and the capacities will grow as the market does.


These types of systems can make the syntax a lot easier. The problem is that you still need the algorithmic knowledge behind it. For example, let me use Excel as an analogy. It has a formula builder to help you put something together. That makes the syntax much easier, but it doesn't actually help unless you understand the mathematics behind what you're building.
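To put that in concrete terms, here's a minimal sketch (the loan numbers and the pmt helper are hypothetical, just for illustration). The builder can hand you the syntax of something like Excel's PMT, but the answer is only right if you know the rate has to be a per-period rate:

```python
# Minimal sketch: syntax help doesn't substitute for understanding the math.
def pmt(rate, nper, pv):
    """Payment per period on a fixed-rate loan (the same math as Excel's PMT)."""
    return pv * rate / (1 - (1 + rate) ** -nper)

annual_rate, years, principal = 0.06, 30, 200_000

# Syntactically valid, mathematically wrong: the annual rate was never
# converted to a monthly rate.
wrong = pmt(annual_rate, years * 12, principal)

# What the underlying math requires: monthly rate with monthly periods.
right = pmt(annual_rate / 12, years * 12, principal)

print(f"wrong: ${wrong:,.2f}/mo   right: ${right:,.2f}/mo")
```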

Now to expand that to code: one job programmers are frequently employed for is that of a database report writer. They query information from multiple database tables in order to generate a custom report for some executive. Access has a report generator built into it, but using it to write your SQL for you doesn't actually help if you don't understand how a relational database model works.
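As a rough illustration (the two-table schema and figures here are invented for the example): a generator can emit the query text, but knowing that the report needs a join, and which key to join on, is exactly the relational-model knowledge it can't supply.

```python
# Rough illustration: the SQL a report generator could write for you.
# Choosing the join key and the grouping is the part you have to understand.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE departments (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE sales (id INTEGER PRIMARY KEY, dept_id INTEGER, amount REAL);
    INSERT INTO departments VALUES (1, 'East'), (2, 'West');
    INSERT INTO sales VALUES (1, 1, 100.0), (2, 1, 250.0), (3, 2, 75.0);
""")

# The "custom report for some executive": total sales per department.
report = conn.execute("""
    SELECT d.name, SUM(s.amount) AS total
    FROM departments AS d
    JOIN sales AS s ON s.dept_id = d.id
    GROUP BY d.name
    ORDER BY total DESC
""").fetchall()

for name, total in report:
    print(f"{name}: {total:,.2f}")
```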

Coding for dummies generally teaches you the syntax of what to write and when, but it doesn't teach you how to build formulas and actually solve problems. That's the main issue: employers today need problem solvers in every discipline. The tools you use to solve a problem can get easier to use, but the problem still has to be solved.

I've had this discussion with a few business people on these forums as well as people in real life. What it always seems to come down to is that as a society our ability to solve problems (usually through applied math) is just not up to where it needs to be.



posted on Mar, 3 2017 @ 06:03 PM
a reply to: Aazadan

You're still thinking industrial use while I'm talking consumer apps. Two different animals. Although yeah, the odd manager or supervisor might try to apply it at work. ...Yes, it would be elementary and not threaten your job security. Doesn't mean it won't happen.



posted on Mar, 3 2017 @ 06:08 PM

originally posted by: Aazadan
Coding for dummies generally teaches you the syntax of what to write and when, but it doesn't teach you how to build formulas and actually solve problems. That's the main issue: employers today need problem solvers in every discipline. The tools you use to solve a problem can get easier to use, but the problem still has to be solved.


You said this well, and I certainly agree. I've seen any number of codemonkey-type tools that will help you grind out an app - but they can't tell a machine what humans might like to see next or what they might find useful, and they're horrible at solving problems where the data is big and messy.

A lot of the time it takes a LOT of human interaction just to get the data into a form where a computer can actually use it for something.

Apps, yeah... they might get as easy as asking a computer to make an app for you. But the big stuff, no. Needs people.



posted on Mar, 4 2017 @ 10:41 AM
a reply to: Byrd



...Apps, yeah... they might get as easy as asking a computer to make an app for you. But the big stuff, no. Needs people.





But what do you think of the 2030 projection? (For full-blown AI and computers smarter than people.)



posted on Mar, 5 2017 @ 05:43 PM
Hello everyone. Long-time lurker here. I felt compelled to contribute a little, as the subject is something I contemplate constantly and have frequent conversations about with family and friends (I know, not always the best way to bounce ideas around due to potential bias, but I am always willing to accept being wrong about something given enough evidence). I have no real background in coding, although I'm very interested in learning, so please bear with me on this. Over the span of my relatively short life I have seen astounding things achieved thanks to the advent of the internet, as I'm sure many of you have. One good example is the ability to find almost any "how to" video to fix a problem that used to require someone with special knowledge, i.e. a mechanic, IT personnel, etc. There is still demand for other people to take care of these issues due to man's innate laziness. We really do like to have things automated for us, hence our creation of technology.

On to the point. I was born in '86, so I'm on the very cusp of the "millennial" generation. Due to the availability of resources I grew up with, I can't even imagine what life was like before computers. Now jump to the next generation coming into this world. By the time they are my age, they will have grown up alongside the introduction of the Internet of Things (IoT, or Internet 2.0). In a short span of 20 years, we have gone from

Deep Blue and Kasparov played each other on two occasions. The first match began on February 10, 1996, in which Deep Blue became the first machine to win a chess game against a reigning world champion (Garry Kasparov) under regular time controls. However, Kasparov won three and drew two of the following five games, beating Deep Blue by a score of 4–2 (wins count 1 point, draws count ½ point). The match concluded on February 17, 1996. Deep Blue was then heavily upgraded (unofficially nicknamed "Deeper Blue") and played Kasparov again in May 1997, winning the six-game rematch 3½–2½, ending on May 11. Deep Blue won the deciding game six after Kasparov made a mistake in the opening, becoming the first computer system to defeat a reigning world champion in a match under standard chess tournament time controls. The system derived its playing strength mainly from brute force computing power. It was a massively parallel, RS/6000 SP Thin P2SC-based system with 30 nodes, with each node containing a 120 MHz P2SC microprocessor, enhanced with 480 special purpose VLSI chess chips. Its chess playing program was written in C and ran under the AIX operating system. It was capable of evaluating 200 million positions per second, twice as fast as the 1996 version. In June 1997, Deep Blue was the 259th most powerful supercomputer according to the TOP500 list, achieving 11.38 GFLOPS on the High-Performance LINPACK benchmark.


to:


Throughout the competition, Libratus recruited the raw power of approximately 600 of Bridges' 846 compute nodes. Bridges' total speed is 1.35 petaflops, about 7,250 times as fast as a high-end laptop, and its memory is 274 terabytes, about 17,500 times as much as you'd get in that laptop. This computing power gave Libratus the ability to play four of the best Texas Hold'em players in the world at once and beat them.


That's an increase of roughly 118,600 times in the computing power utilized by AI compared to just 20 years ago. Granted, this is a shallow estimate, as architecture, programming, and other variables were not accounted for in the calculation.
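If you want to check my math, the figure falls straight out of the two quotes above (Deep Blue's 11.38 GFLOPS versus Bridges' 1.35 petaflops); note that Libratus only used about 600 of Bridges' 846 nodes, so the effective ratio would be somewhat lower than the full-machine number used here:

```python
# Quick check of the ~118,600x figure, using the numbers quoted above.
deep_blue_flops = 11.38e9   # Deep Blue, 1997: 11.38 GFLOPS on LINPACK
bridges_flops = 1.35e15     # Bridges, 2017: 1.35 petaflops (full system)

ratio = bridges_flops / deep_blue_flops
print(f"{ratio:,.0f}x")     # ~118,629x, i.e. roughly 118,600 times
```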

Moving on to the increase in population and the increase of internet access for said population. The population of the world in '86 was about 4.9 billion; today it sits at roughly 7.3 billion - a 48% increase in 30 years. I'll use 2030, as that was the last date I saw mentioned before posting this. In 2030, the U.N. projects the world population to be 8.5 billion, another 16% increase from today. Now take the population of the world using the internet in 1995 (as far back as I could find with a basic search): it was about 16 million, or 0.4% of the 1995 population. As of Dec 2016 it sat at 3.675 billion, or roughly 50.1% of the world population. This increase in internet availability is already having major impacts on technology and the speed at which information is shared around the world.
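A quick sanity check of those growth figures with the round numbers above (small differences from my quoted 48% and 50.1% are just rounding):

```python
# Sanity-checking the growth percentages with the post's round numbers.
pop_1986, pop_today, pop_2030 = 4.9e9, 7.3e9, 8.5e9
online_2016 = 3.675e9

print(f"1986 -> today: {pop_today / pop_1986 - 1:.0%}")         # ~49%, rounded to 48% above
print(f"today -> 2030: {pop_2030 / pop_today - 1:.0%}")         # ~16%
print(f"share online, Dec 2016: {online_2016 / pop_today:.1%}") # ~50.3% with these figures
```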

Now imagine for a second that our internet saturation level reaches 65% by 2030. That would mean a potential 5.5 billion people all with access to each other, sharing breakthroughs, from the backyard/basement scientist to fully funded labs. Most companies keep some of these breakthroughs private, but it is becoming increasingly popular for collaboration to cross country lines, leading to more and more open research. Also, with improving translation software, the language barrier gets thinner and thinner, making it even easier to share this info. Introduce IoT and you have so much data available that it becomes increasingly easier to define "human". Everything from foods eaten to biomedical readings will begin to give us a bigger picture of how we work, both mentally and physically. This in turn will allow us to create better algorithms that mimic human intelligence. Each generation tends to get smarter due to genetic variables and education, a.k.a. the sharing of information. The "millennials" already see what happens when technology begins to take off - hence the mass nostalgia for something from just 10-20 years ago, when life was "simpler". That's not a very long time for an entire generation to feel like they are being left behind. Now, I hate generalizations, so to be clear: my reference to "millennials" does not encompass everyone in that generation. It is a generalization to help with perspective, not to label.

On to the next gen coming into play here. By 2030, those born in 2005 will be 25, and those who went to college straight out of high school will have graduated and started to influence the world - if they don't do so at a younger age with access to resources such as YouTube. This generation is going to be smarter and will have had access to a world of information at their fingertips essentially from birth. The internet is simply part of their world. It's vital to realize that their brains, at their most malleable and plastic, are being trained to use the internet as second nature. That means this generation is going to be more adept with programming, purely due to the human drive to adapt to and control our environment. In their case, the internet and IoT will be a vital part of that environment.

Due to all of these variables converging, and the fact that this adaptation to the internet is already taking place within just the "millennial" generation, let alone future gens, I see human-level machine intelligence being possible by 2030. And this isn't even counting all of the physics breakthroughs that are happening at an ever-increasing rate, or the advances in materials science, quantum, photonic, and DNA computing, or even memristors. As I am lazy, I think I'll end it here, haha. Thank you for taking the time to read this. I'm going to slide back into my comfy hole now. Enjoy.



posted on Mar, 6 2017 @ 09:21 AM
a reply to: Irikash

Thanks for your post, Irikash - sorry I didn't respond sooner. There is so much there! VERY interesting point here:



... It's vital to realize that their brains, at their most malleable and plastic, are being trained to use the internet as second nature. That means this generation is going to be more adept with programming, purely due to the human drive to adapt to and control our environment. In their case, the internet and IoT will be a vital part of that environment.








posted on Mar, 6 2017 @ 10:23 AM

originally posted by: soficrow
a reply to: Byrd
But what do you think of the 2030 projection? (For full-blown AI and computers smarter than people.)


Absurd.

Americans have developed a climate of fear about robotics and science. In the first place, there's no real definition of "smarter" -- if we mean "has access to more reliable knowledge on an encyclopedic basis", then any textbook is smarter than me. In fact, my smartphone by that definition is smarter than me (I can barely name three actors, while my phone can name all the top actors everywhere as well as a bunch that almost nobody's heard of).

Smart is not access to facts.

My Roomba does a better job of vacuuming the floor than I do. If I combined a Roomba and a smartphone, the pair would be more competent and technically have more knowledge than I do. They still couldn't replace me.

Robots might be able to do a nice abstract painting or a nice mod of an existing scene, but I draw political cartoons, and they won't be able to do that in the foreseeable future (in other words, create something that's selectively humorous and selectively barbed, and do it in a stylized way). I've sold a number of short stories, and they won't replace me as a writer. They can't design jewelry (you can use them to design jewelry, but they can't be unpredictable).

Nor can they solve a "medical zebra" - because what humans may think is significant (pain, feeling, etc.) may turn out to be insignificant, while the critical data point is something the sufferer thinks is not worth mentioning.

And you can't program them to deal with gray areas.



posted on Mar, 6 2017 @ 10:24 AM
I wish mine would....



posted on Mar, 6 2017 @ 11:40 AM
a reply to: Byrd

Are you saying the timeline is absurd, or the concept is? (Seems to be the former.)




PS. I love science and I'm not afraid of automation. In fact, I'm looking forward to it, lazy bum that I am.
...I do think, though, that we need to not be in denial, and to take steps to position our society to deal with the inevitable job losses and disruptions.









posted on Mar, 6 2017 @ 07:02 PM
Then you have thought projects like this that begin to bridge machine learning with quantum computing. These discoveries are happening at faster and faster rates as technology builds on itself.
Here is an example of a possible picture language, further reducing the amount of manpower required to put information into a form that is useful to computers.




posted on May, 15 2017 @ 11:10 PM
a reply to: soficrow

Artificial intelligence! AI supercomputer programmers that act as though they are human.



posted on May, 15 2017 @ 11:15 PM
If it can only use existing code, it's not creating anything new. Imagination and awareness are not lurking in human-written code.

Plus, the source code is only as good as the human who wrote it.



