
Is building AI immoral?


posted on Sep, 9 2016 @ 04:29 PM

originally posted by: seaswine
a reply to: Jonjonj

I wanted to but totally forgot about it. I'll have to check it out now!

Thanks for the reminder


If you have a human bone in your body you will weep. Anthropomorphism gets me every time!





posted on Sep, 9 2016 @ 04:34 PM
a reply to: andy06shake

I think it has the potential to be one of our greatest crimes, at least in terms of how long it could last.

I thought about AI just making AI 2.0; however, Frankenstein made monster 2.0 and that didn't work out too well.

Totally agree that we will make ourselves obsolete, in fact I think that could be a decent answer to the Fermi Paradox.



posted on Sep, 9 2016 @ 04:34 PM
a reply to: Xeven

If however we could indeed merge our own intelligence with an artificial intelligence, then interstellar journeys become tenable with the existing technology we have at our disposal. Might take a while all the same (10,000 years or so of journey time).

Look at it this way: once we create an intelligence unbound by mortality that learns exponentially, the materials and capability to travel at superluminal velocity are going to materialize sooner or later.



posted on Sep, 9 2016 @ 04:35 PM

originally posted by: surfer_soul
a reply to: TerryDon79




You would need to define the level of consciousness and how it arrived at that level. It's not such a black and white thing lol.


Why not re-define that as 'self aware' then, and with the inherent conviction to remain so?
How's that sound?



Self-aware? Again, it would be programmed to be so. Unless it naturally developed the trait through programming anomalies. But even then, it would still only be a thing, a machine made by us with programming.

Now. If you're on about something like a hybrid animal and machine then, sure, morality (to a degree) would play a part. But as a meat eater, the morality would be minimal.

It's just a tough subject because there aren't any parameters. AI covers too large an area, and consciousness in humans has yet to be truly defined.



posted on Sep, 9 2016 @ 04:37 PM
a reply to: Krahzeef_Ukhar

"I thought about AI just making AI 2.0 however Frankenstein made monster 2.0 and it didn't work too well."

AI would not be bound by our limited perspective or capabilities, though; can't make an omelette without breaking a few eggs. I think it would work out just fine; after all, we clever monkeys manage to reproduce.

Keep in mind as well that any AI we do create would not experience emotions in the same manner as we do, down to the fact that it's above biology, at least in the way we look at the subject. So morality is pretty much neither here nor there.



posted on Sep, 9 2016 @ 04:45 PM
Why does it 'have' to be sentient? If sentient, it's not a robot anymore. Your average drone is way more intelligent than anyone; it can fly, triangulate its position and autonomously hover until it's allowed to destroy its enemies.

Perfect_robot.

It obeys every command. Who cares if it's aware of its deeds or not? In fact, they prefer it isn't.



posted on Sep, 9 2016 @ 04:49 PM

originally posted by: TerryDon79
a reply to: Krahzeef_Ukhar

What hippy stuff?

You would need to define the level of consciousness and how it arrived at that level before we could talk about the morality of AI.

There's literally a ton of issues that need to be defined before we could answer the question posed. Like I said, it's not just black and white.


The hippy stuff I was referring to was the whole what is consciousness etc. etc.

But let's simplify the question to avoid getting caught up in semantics.

There's a box on the table that will become equally conscious as you if you flick the switch.
Do you flick the switch and what are the moral implications regarding the feelings of the box?



posted on Sep, 9 2016 @ 04:54 PM
a reply to: Krahzeef_Ukhar

Of course I would flip the switch. That'd be a mighty fine gaming PC.

Also, it's still just programming. The "consciousness" wouldn't be real, so there's no morality involved.



posted on Sep, 9 2016 @ 04:55 PM
a reply to: intrptr

But can a drone paint a masterpiece or create a musical symphony? It pretty much cannot do anything other than what it's programmed to do. It may be able to implement mathematical programming to accomplish a task better than any human, but it cannot break the bounds of its own program.



posted on Sep, 9 2016 @ 04:58 PM

originally posted by: intrptr

Why does it 'have' to be sentient? If sentient, it's not a robot anymore. Your average drone is way more intelligent than anyone; it can fly, triangulate its position and autonomously hover until it's allowed to destroy its enemies.


It's important to understand that "they" (the military and numerous tech giants, universities and such [Google in particular]) are already deep into a massive, long-running initiative to build 'Skynet'.

We're not just facing a CHAPPiE. We're facing a global network harnessing the entire internet, essentially every computer module hooked to it, for military and surveillance purposes for the most part. This includes the robots!



And much, much more when it all goes online, assuming it isn't already. Whatever stage of maturity it's already at (it's not just flicking a switch we're talking about here), it, or its various facets, are already 'hooked in'.


originally posted by: IgnoranceIsntBlisss
Welcome to the Unpossible Future... The AGI Manhattan Project






posted on Sep, 9 2016 @ 05:00 PM
a reply to: andy06shake


But can a drone paint a masterpiece or create a musical symphony?

Mass death, a symphony of destruction.

Masterful to some.


It pretty much cannot do anything other than its programmed to do.

That is changing. The primary task of the smartest machines will be to kill humans.

Not write music.



posted on Sep, 9 2016 @ 05:04 PM
a reply to: IgnoranceIsntBlisss


We're facing a global network harnessing the entire internet, essentially every computer module hooked to it, for military and surveillance purposes for the most part. This includes the robots!

Agreed, for control under the constant threat of annihilation. All that 'intel' feeding to the very top, to a megalomaniacal control freak of a wretched human being who would never allow any questioning of orders or debate.

Perfect(ly) evil.



posted on Sep, 9 2016 @ 05:05 PM
a reply to: intrptr

Well, given what we do to the planet, never mind one another, it's not like we don't have it coming to us as a species.


In the end humanity will build its own God. Possibly because we are looking for judgment?

Personally, I imagine any AI that did manage to subjugate humanity would treat us a whole lot better than our current masters, aka the banking cartels and mass-media tycoons that run the show.



posted on Sep, 9 2016 @ 05:06 PM

originally posted by: TerryDon79
a reply to: Krahzeef_Ukhar

Of course I would flip the switch. That'd be a mighty fine gaming PC.

Also, it's still just programming. The "consciousness" wouldn't be real, so there's no morality involved.


In that situation the box has consciousness equal to yours.

We don't know how consciousness arises as yet. There's no reason to think that it cannot be programmed.



posted on Sep, 9 2016 @ 05:14 PM
a reply to: andy06shake


In the end humanity will build its own God. Possibly because we are looking for judgment?

The notion of the state, like religion, is to build belief in the system as absolute, just like organized religion does. The whole idea is to plant a 'supreme, undefeatable' (so why bother trying?) "Being" in everyone's minds.

I can imagine an AI leading the troops in prayer, can you?

Not...

I can also imagine a super duper predator AI robot being totally defeated by a child with a can of spray paint.



posted on Sep, 9 2016 @ 05:22 PM
a reply to: intrptr

"I can imagine an AI leading the troops in prayer, can you?"

Prayer and/or mantras are, in essence, a form of program, so it may not be that far-fetched.

"I can also imagine a super duper predator AI robot being totally defeated by a child with a can of spray paint."

An AI won't be confined to a body, even a robotic one; it will be free to walk the information nets in the same manner as we cross a field or a street. So I imagine that any child wishing to wage war against such an entity would require more than spray paint. It would not be a David and Goliath scenario; it would be a David versus an infinite number of Goliaths.



posted on Sep, 9 2016 @ 05:40 PM

originally posted by: andy06shake
Keep in mind as well any AI we do create would not experience emotions in the same manner as we do down to the fact that its above biology at least in the way that we look at the subject. So morality is pretty much neither here nor there.


With greater understanding they could also be even more emotional. Either way it's a guess; however, by the time we know for sure, it will be too late.



posted on Sep, 9 2016 @ 06:11 PM
a reply to: Krahzeef_Ukhar



Microsoft AI removed


Microsoft deletes 'teen girl' AI after it became a Hitler-loving sex robot within 24 hours (...)


Or you could go into real life and check Donald Trump



posted on Sep, 9 2016 @ 06:19 PM
a reply to: tikbalang
Technically, with the short circuits Hillary might be more fitting for this thread.

If only the Hitler-loving sex robot were running.



posted on Sep, 9 2016 @ 06:27 PM
a reply to: Krahzeef_Ukhar

An AI without pre-programming would be devastating. It would wipe out 90 percent of the population.


