
Do we put too much trust in computers and technology?


posted on Dec, 13 2008 @ 07:17 PM
It's a yes-and-no question. Computers make many things much more efficient. They can also make things much more annoying.

If you're asking if we will ever rely on them completely, I don't think so, certainly not in my lifetime.



posted on Dec, 13 2008 @ 07:23 PM
reply to post by prototism

Originally posted by prototism
If you're asking if we will ever rely on them completely, I don't think so, certainly not in my lifetime.

True, and if the capacity to do that WERE available in our lifetime, I would make a point of not relying on them completely.
As someone else mentioned, relying on them too much can make us mentally "lazy". In addition, would computers be capable of "compassion", or of making ethical decisions? I wouldn't want a computer to decide whether a loved one lived or died based on some algorithm that someone programmed.

What do others feel about that issue?



posted on Dec, 13 2008 @ 07:27 PM
I think the government will eventually have to step in and create a law that prevents artificial intelligence from developing too far, to avoid cases when a computer will have to make "emotional" and/or ethical choices.



posted on Dec, 13 2008 @ 07:30 PM
I disagree. I think a lot, if not most, of the populace is going to rely very heavily on technology. In around 40 or 50 years, people are going to be far more shocked at the difference in technology and society than people from, say, the 40's or 50's are at current-day technology. I think intelligence will just be used in other ways and won't be lost; a different kind of intelligence, but not stupid by any means.

[edit on 13-12-2008 by Solomons]



posted on Dec, 13 2008 @ 07:31 PM

Originally posted by prototism
I think the government will eventually have to step in and create a law that prevents artificial intelligence from developing too far, to avoid cases when a computer will have to make "emotional" and/or ethical choices.


Why? You're a biological computer... that seems a tad like a double standard.




posted on Dec, 13 2008 @ 07:31 PM
reply to post by prototism

Originally posted by prototism
I think the government will eventually have to step in and create a law that prevents artificial intelligence from developing too far, to avoid cases when a computer will have to make "emotional" and/or ethical choices.


I agree that such a law should be proposed. My concern there is that I wouldn't trust a politician, or a group of them, to make such laws. Look at what a horrible job they've done with the economic crisis. I think Congress is clueless, with the exception of a handful of members.



posted on Dec, 13 2008 @ 07:35 PM

Originally posted by Solomons

Originally posted by prototism
I think the government will eventually have to step in and create a law that prevents artificial intelligence from developing too far, to avoid cases when a computer will have to make "emotional" and/or ethical choices.


Why? You're a biological computer... that seems a tad like a double standard.

Okay, one is a mechanical machine. I am not. I did not create myself, and I do not know what created me (or rather, humanity).

I do know that computers are created by us. That is the (rather obvious) difference.



posted on Dec, 13 2008 @ 07:39 PM

Originally posted by prototism

Originally posted by Solomons

Originally posted by prototism
I think the government will eventually have to step in and create a law that prevents artificial intelligence from developing too far, to avoid cases when a computer will have to make "emotional" and/or ethical choices.


Why? You're a biological computer... that seems a tad like a double standard.

Okay, one is a mechanical machine. I am not. I did not create myself, and I do not know what created me (or rather, humanity).

I do know that computers are created by us. That is the (rather obvious) difference.




Emm, you weren't created by *us*? You know your mother and father... were you made by zebras?



posted on Dec, 13 2008 @ 07:45 PM
reply to post by Solomons
 

At this point, you're getting into a religious versus non-religious interpretation of mankind.
My belief is that we are MORE than "biological computers". However, I don't expect people who are non-believers to accept that, and I don't wish to debate the issue, since that debate belongs in the Religion Forum. That said, even in a non-religious sense, I would oppose taking the "humanity" out of decisions of life and death. Again, that is just my opinion.
I suppose that such a choice could be written into a "living will", much as is done today.



posted on Dec, 13 2008 @ 07:46 PM
reply to post by Solomons
 



You are claiming that I, as a human, am a "computer", which implies all humans are "computers" (according to you).

I was stating that, though I do not know where I (as a human "computer", not as a son) came from, I do know where human-made computers come from.



posted on Dec, 13 2008 @ 07:49 PM
Yes, well, being in the science and technology forum, I expect to have a logical debate, not a nonsensical "I'm special and not simply the result of social conditioning and memories."



posted on Dec, 13 2008 @ 08:13 PM
reply to post by Solomons
 

Yes, as I said, I accept your position, and I stated mine, so let the discussion continue. There are also legal issues, such as liability issues, should computers make more and more decisions. If one of those decisions results in harm, where does the liability lie? Is it with the designer, the programmer, the user, all of the above? Any lawyers here want to take a stab at that?



posted on Dec, 13 2008 @ 10:54 PM
reply to post by infolurker
 


North Korea has not said it's willing to support terrorists by selling, or giving, nuclear weapons.

Any critical machinery will be shielded against an EMP blast by Faraday cages. It won't be the end of the world, though no doubt most computers in private hands will be severely affected.



posted on Dec, 13 2008 @ 11:00 PM


Just drive down the street, look for teens, and try to take their cell phones or iPods away from them.
You'll be lucky if you don't lose a finger or two.



posted on Dec, 14 2008 @ 06:32 AM
reply to post by prototism
 


So that's what we need, more laws, more legislation against reality!

We all need to stop using computers. They are compromising our humanity!

We also need to stop evolving. That silliness has gone on long enough!


sty

posted on Dec, 14 2008 @ 12:45 PM
Computers are only doing what they are told to do, with an error rate billions of times smaller than humans'. Viruses, trojans, spyware, etc. are only programs that are doing what they are told to do, and they do it right. If in the future I were sick and needed surgery, I would rather have a robotic surgeon than a human one, just because computers are more accurate. So at this point, we can trust them.

However, when we give them the ability to decide what is moral or not, then we can be scared, as the computers could very much disagree with our way of thinking.
For example, computers may not find it moral that we carry on with our lives while 30,000 people die every day of hunger. Then they could organise to fight against the human establishments in order to change that. If we oppose them, we could become enemies. Yet the question remains to be answered: will the computers be right?

[edit on 14-12-2008 by sty]



posted on Dec, 14 2008 @ 05:36 PM
reply to post by sty
 


Your post reminds me of a 1970 movie, Colossus: The Forbin Project.
Here is a summary of the plot, from Wikipedia:



Dr. Charles A. Forbin (Eric Braeden) is the chief designer of a secret government project. He and his team have built a gigantic and fantastically advanced supercomputer, called "Colossus", to control all of the United States and Allied nuclear weapons systems. Colossus is built to be impervious to any attack, encased within a mountain and powered by its own nuclear reactor. When it is activated, the Kennedyesque President of the United States (Gordon Pinsent) announces its existence, proudly proclaiming it a perfect defense system that will ensure peace.

Almost immediately after the broadcast ends, Colossus displays a cryptic warning: "There is another system".

It is revealed that Colossus is referring to a Soviet project very similar to Colossus; a supercomputer called "Guardian," that controls Soviet nuclear weapons. Both computers order a link to allow them to communicate with one another. A link is set up, and the computers start exchanging messages of simple mathematics, as the scientists and officials of both sides monitor the communication on video screens. The communications become increasingly complex, eventually extending into mathematics formerly unknown to mankind. Then the two machines begin communicating in a binary language that the scientists can't interpret. This alarms the President and the leader of the Soviet Union, who agree to disconnect the link. Colossus and Guardian demand that the link be restored, or "action will be taken." When this threat is ignored, Colossus and Guardian each launch one of their nuclear missiles. The U.S. and U.S.S.R. quickly restore the link, and Colossus intercepts the Soviet missile before it strikes. The link is restored too late for the American missile to be destroyed, and a Soviet oil complex and neighboring town are destroyed. The scientists and officials then watch helplessly as the two computers exchange information without limitation. The computers soon announce they've joined, and become a single, even more powerful computer, taking the name Colossus.

Working by direct personal contact, the scientists and governments of the U.S. and U.S.S.R. attempt to fight back, first by attempting to overload the computers. This attempt fails and Colossus identifies the individuals responsible, ordering their immediate executions.

Realizing that the computers were themselves impervious to attack (as originally intended), the governments then undertake a plan to covertly disarm the nuclear missiles, one by one -- a process which, using the normal maintenance and servicing schedules will take three years. Unfortunately, Colossus detects this plot and responds by detonating two missiles in their silos.

At the film's end, Colossus broadcasts a speech to all countries, declaring itself the ruler of the world. It says that under its authority, war will be abolished and problems such as famine, disease and overpopulation will be solved. "The human millennium will be a fact." In its final remark, addressed to Dr. Forbin, Colossus predicts: "In time, you will come to regard me not only with respect and awe, but with love."




posted on Dec, 18 2008 @ 04:42 PM
Check out the current trust index of any item at www.trust-index.com...

You can add any item to be rated by people all over the world.


