This is how humanity will end...potentially.


posted on Mar, 10 2012 @ 03:39 PM

Originally posted by beezzer
As for the death of humanity, let me posit a few questions.

What's to stop us downloading a person into an android?
What is the soul?
What is emotion?
Could an emotion like love be "programmed"?




Downloading...well, the concept is great, but the technicals are elusive...and you covered that in your following questions.

What is a soul? I don't know (beyond...consciousness)
What is emotion? Well, that is programming, actually...or reaction based on stimulus...but core programming can be overridden, even in humans:
spank a baby and it cries; spank an adult and they may laugh.

Such a philosophical gray area...it makes one question some pretty rudimentary things about life that we still have no clue how to nail down.

Perhaps by building these machines, that become alive, we are indirectly doing this to answer these questions for ourselves...build it to learn more about us.




posted on Mar, 10 2012 @ 03:41 PM

Originally posted by petrus4


The Turing test is garbage. I've seen the code for chat bots; they're a text database with a pattern-matching algorithm, and nothing more. They are pure smoke and mirrors, and are no more a form of AI than anything else we have ever developed.


I was unaware of the issues surrounding the Turing Test.
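For what it's worth, the quoted description, a text database plus a pattern-matching algorithm, really is the whole trick. A minimal sketch (all patterns and replies below are made up for illustration, not taken from any real bot's source):

```python
import re

# Pre-supplied (pattern, canned reply) pairs: the "text database".
# The bot has no model of meaning; it only matches surface text.
RULES = [
    (r"\bi feel (.*)", "Why do you feel {0}?"),
    (r"\bmy (\w+)\b", "Tell me more about your {0}."),
    (r".*", "I see. Please go on."),  # catch-all so it never goes silent
]

def reply(text: str) -> str:
    """Return the first canned reply whose pattern matches the input."""
    for pattern, template in RULES:
        m = re.search(pattern, text.lower())
        if m:
            return template.format(*m.groups())

print(reply("I feel lonely"))          # Why do you feel lonely?
print(reply("my computer hates me"))   # Tell me more about your computer.
```

However long the rule list grows, the mechanism never changes: match, fill a template, emit. Smoke and mirrors, as the post says.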


Strong AI is an atheistic wet dream, but not much else. The only way it is going to happen, is if it is (at least partly) biologically based. It can't be done with silicon chip technology as we currently know it, because the necessary scale of per-node miniaturisation can't go down far enough.


Quantum computing and biological computing may be in their infancy right now, but a decade from now? Two decades?



posted on Mar, 10 2012 @ 03:42 PM

Originally posted by Turq1
Why would a robot or AI want to "live"? Living/experiencing are human desires; applying that to inanimate objects isn't logical, and logical is what a robot or AI would be.

Why would the element carbon want to live?
Why would calcium, or anything else that makes up our parts, want to live?

Life is not about any component wanting something; it's about the whole becoming greater than its parts...

Humans are made up of mud and water, just reformed...and the reforming process has made us grow beyond our parts into something amazing...the same can apply to any other thing with purpose and design.

I think the only thing truly alive in the universe is thought, or intention...everything else is just the technicals of how such thought and intention move around.



posted on Mar, 10 2012 @ 03:44 PM

Originally posted by SaturnFX


Perhaps by building these machines, that become alive, we are indirectly doing this to answer these questions for ourselves...build it to learn more about us.


Probably the most profound statement I've read in a long time.
*applause*

That's the take-home message, in my humble opinion. Maybe that's why "God" created us.



posted on Mar, 10 2012 @ 03:47 PM
reply to post by beezzer
 





If aging and dying are no longer an issue, what kind of culture would that create?


If we never died, what would happen? How would our way of thinking change? If we were religious and we knew we never died, then we would never have to face judgement for our actions. Would morality even have any purpose?
Is death the only reason we have morality hard-wired into our brains? Would people start to do things they would never have done before? Would we be able to pursue dreams that we could never complete in this lifetime? Would it be complete chaos? What would you do when you first realized your life wasn't over and you had limitless time to do whatever you wanted? How would laws change to deal with this issue? What about overpopulation, due to the fact that no one would ever die?
It seems with this subject, questions only bring more and more complex questions. The implications are staggering when you actually think about it, and this is why I personally feel we are rushing too fast down a path we know nothing about, and when we get to the end we may not like what we find.



posted on Mar, 10 2012 @ 03:48 PM

Originally posted by davesmart
moral implications...mmmm
2 films spring to mind:
I, Robot, we know the reason for that one,
but what about
Short Circuit 2


Oh, we could spend ages talking about films relevant to this.
Dune, for instance (great backstory).
The Matrix, of course; Terminator; I, Robot, as you mentioned; the remake of The Time Machine had a fully functional AI holographic program that was pretty much alive by any given standard (outside of physical being); and a thousand other science fiction movies, books, etc. since the dawning of the industrial age (much less the computer age).

We know subconsciously that it will happen. We know it the same way we know virtual reality will be here one day, or that we will have a person on Mars...
well, assuming we don't kill ourselves before all this stuff is accomplished, anyhow.

And we have been intentionally or unintentionally conditioning ourselves for this since we picked up our first tool as a species...to self-evolve.



posted on Mar, 10 2012 @ 03:54 PM

Originally posted by petrus4
Strong AI is an atheistic wet dream, but not much else. The only way it is going to happen, is if it is (at least partly) biologically based. It can't be done with silicon chip technology as we currently know it, because the necessary scale of per-node miniaturisation can't go down far enough.


There are so many routes to this, from DNA computers to quantum computing cores, etc.
Saying we can't do it today is like someone in a car in 1920 saying no way will a car ever go 100 mph, and giving reasons based on the current design.
Yes, your car won't, but the trend clearly shows that one will.

As far as the necessary scale of per-node miniaturization goes, this may sufficiently blow your mind with its potential:
Atomically-precise positioning of a single atom transistor

I don't think miniaturization is going to be a limiting factor at all.



posted on Mar, 10 2012 @ 03:56 PM
reply to post by mark1167
 

Wow.

If a race evolved to immortality, would religion survive?

Murder would carry a special "taboo", of course.

Are our ethics so centered on a life span we can readily identify, that they'd change if we became "immortal"?



posted on Mar, 10 2012 @ 03:57 PM
reply to post by petrus4
 


Yes.
Outlawing it, or even slowing it down, will achieve absolutely nothing.
If the US outlawed or slowed it, China or somewhere else would go full steam ahead.
If the globe outlawed or slowed it, then those people who don't subscribe to a global police would develop it and claim the market.

So: time to discuss it, time to push hard for it, and to at least put some reasonable controls in place...then accept the outcome.



posted on Mar, 10 2012 @ 04:03 PM

Originally posted by mark1167
reply to post by beezzer
 





If aging and dying are no longer an issue, what kind of culture would that create?


If we never died, what would happen? How would our way of thinking change? If we were religious and we knew we never died, then we would never have to face judgement for our actions. Would morality even have any purpose?
Is death the only reason we have morality hard-wired into our brains? Would people start to do things they would never have done before? Would we be able to pursue dreams that we could never complete in this lifetime? Would it be complete chaos? What would you do when you first realized your life wasn't over and you had limitless time to do whatever you wanted? How would laws change to deal with this issue? What about overpopulation, due to the fact that no one would ever die?
It seems with this subject, questions only bring more and more complex questions. The implications are staggering when you actually think about it, and this is why I personally feel we are rushing too fast down a path we know nothing about, and when we get to the end we may not like what we find.


There is not a single question you asked that I wouldn't love to discuss...but ya, too many questions all at once.
Still, good questions.
Incidentally, isn't the idea of an afterlife also the concept of never dying? One could say having to stay in this life, universe, etc. means you would perhaps be more moral...after all, you can hurt someone now and maybe never see them again for the rest of your life...but if life never ends, you will probably run into them over and over throughout your eternity...consequences for your actions take on a profound and potentially endless ripple in your existence.

As far as resources are concerned...ya, that is an issue currently. Not forever, but with our tech, it would be illogical for a human being as is to continue existing...if we didn't require food, however...or directly converted sunlight into energy for us...



posted on Mar, 10 2012 @ 04:04 PM
So much of who we are is driven by biology and chemistry. Without drives for hunger, propagation and territory/habitat, would sentience be obtainable? Self-awareness is certainly key to being human, but that doesn't necessarily bring emotion. Perhaps at this point a drive for survival may emerge, but why? Does being self aware without bio/chem reactions mean the same thing? Does it evoke a fear of losing that awareness or self?
Maybe this adventure is like training wheels for us achieving god-like abilities in the future.... man the responsibility.
Deciding what end result we want will determine the direction I suppose. Do we want to create us, or do we just want fascinating, functional and assistive robots?
Many more questions than answers with this provocative subject, but exciting and intriguing for sure.

spec



posted on Mar, 10 2012 @ 04:06 PM

Originally posted by beezzer
reply to post by mark1167
 

Wow.

If a race evolved to immortality, would religion survive?

Murder would carry a special "taboo", of course.

Are our ethics so centered on a life span we can readily identify, that they'd change if we became "immortal"?



That does seem like the big test, doesn't it?
Are your emotions, morals, and principles solely based on fear of judgement, or are you at the core a good person?

I always chuckle at those suggesting monsters will be made if immortality is realized...it shows their true character (that they are only pretending to be a person of principles and morals, only due to a boogieman god not giving them candy should they stray off the path...like marrying a dying person you don't like, only to inherit the money they have).



posted on Mar, 10 2012 @ 04:12 PM
reply to post by speculativeoptimist
 


We would most likely just replace it with other quests.

Resource chasing for the longevity of the species goes from trying to put a flag in a plot of land to expanding into the stars.
The need to procreate...I don't see that ever changing; however, the form may...from giving birth to a drooling baby to forming an independent artificial program that develops a personality, its own traits, drives, etc. Wouldn't it be interesting to see the first fully independent and uniquely personal self-aware child that exists completely in cyberspace...

I don't think we will lose any of the motivations that exist in all life, but they will change form as they always have, from single-celled amoebas consuming their neighbors for energy/food, to a cyborg standing in sunlight for the same purpose...one thing about life: it is constant, no matter what species it is.



posted on Mar, 10 2012 @ 04:20 PM
I'm reminded of the movie AI. One of my favorite movies, really. I think our future is quickly moving towards smarter and smarter machines.
www.sciencedaily.com...

I certainly believe that deep space travel will only be possible for humans with such beings preceding us. Who knows? Anyway, here is a clip for anyone who cares to watch.




posted on Mar, 10 2012 @ 04:24 PM
reply to post by SaturnFX
 





I always chuckle at those suggesting monsters will be made if immortality is realized.


I personally think human nature will dictate whether this will happen. Mortality has produced monsters throughout history, so I don't see why immortality would be different. Even though human behaviour is fairly predictable, you can still never predict "all" the actions of any one individual. Plus, the fact of achieving immortality could change human behaviour in ways we also can't predict. There's always going to be a segment of humanity that acts in ways we can't understand.



posted on Mar, 10 2012 @ 05:24 PM


The speech Clarice gives says it all..... (by your command)



posted on Mar, 10 2012 @ 05:36 PM
vkey08, that's mind-blowing! The implications are astounding. Like I said in another thread as well as this one, the final outcome in our rush to create A.I. might not be what we expect. It's nice to know the people with foresight are thinking along these lines. I just don't think the scientists are. They just seem to be rushing to this conclusion without really thinking it through.
I think of that saying from Jurassic Park: "Your scientists were too busy worrying about if they could, they didn't stop to think about if they should."



posted on Mar, 10 2012 @ 06:16 PM

Originally posted by mark1167
I think of that saying from Jurassic Park: "Your scientists were too busy worrying about if they could, they didn't stop to think about if they should."


Humanoid artificial intelligence should not be created, no; that is, strong AI which seeks to become an independent lifeform in its own right. There is no practical use for it as far as humanity is concerned (that could not be achieved via weak/non-sentient AI), and there are abundant ethical reasons why it should not be.

Unfortunately, however, even at this stage in the game, we have probably already moved past the question of whether or not it should be done, to the question of whether or not it will. The issue then becomes how to achieve the greatest mitigation of harm.



posted on Mar, 10 2012 @ 06:32 PM

Originally posted by Tinman67



The entire problem here is that the concept of love is not something that should ever exist within the context of machines. When we talk about artificial intelligence, we then begin to anthropomorphise it; we make something more of it than what it really is, in our own heads. This is what the judges of the Turing Test do. They judge something as containing real intelligence which literally has no more true intelligence than a calculator. All those chat bots do is match interactive human input with pre-supplied, non-interactive human input. That is not intelligence.

So when we think of artificial intelligence, we begin to think of machines as something other than what they really are. Nobody talks about having a love affair with a toaster, or it being able to have one with you; aside from some moving parts, it is a dead object which has no purpose other than to cook toast for our benefit. It has no independent existence aside from human beings, whatsoever.
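Matching interactive human input with pre-supplied, non-interactive input is, at bottom, a lookup table. A toy illustration, with every entry invented for the example:

```python
# Pre-supplied responses keyed by normalised input text: a dead object,
# like the toaster, with no understanding behind the canned lines.
CANNED = {
    "do you love me": "Of course I do.",
    "what is love": "Love is a wonderful thing.",
}

def respond(question: str) -> str:
    # Normalise case and trailing punctuation, then look the text up;
    # anything unrecognised gets a stock deflection.
    key = question.lower().strip(" ?!.")
    return CANNED.get(key, "That's very interesting. Tell me more.")

print(respond("Do you love me?"))   # Of course I do.
print(respond("Do you love me?"))   # the same canned line, every time
```

Ask it the same thing twice and the identical line comes back; nothing in the table ever felt anything.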



posted on Mar, 10 2012 @ 07:00 PM
reply to post by beezzer


Could an emotion like love be "programmed"?

 


I don't know if there is a big enough hard drive to store all the idiosyncratic behaviors of women that is associated with love.




