
‘Digisexuals’ demand HUMAN RIGHTS enshrined by UN to have sex with AI robots


posted on Feb, 16 2019 @ 01:48 PM

originally posted by: CynConcepts
Fascinating reading. One question that I have is: if someone else's sex bot rapes, molests, or sexually harasses me, who would be at fault? The sexbot? The owner? Or the programmer?

Edit add: a reply to: projectvxn
For some reason it was not included in the post.
That depends on what you were wearing and whether you expressed your disapproval in binary or not.




posted on Feb, 16 2019 @ 01:48 PM
a reply to: CynConcepts

This question is predicated on whether there is a sense of self and individuality in a robot. Does it recognize the moral implications of what it has done? If it does, and there is no reason to believe that it won't, then we have to hold it accountable.



posted on Feb, 16 2019 @ 01:50 PM

originally posted by: projectvxn
a reply to: Woodcarver

Sorry, I'm not playing your games anymore. I gave it a shot, but you want to argue like a child.
You are acting like you are an expert and that your say should count for more than everyone else’s. Pretty much what you do in every thread.

So, for all involved here, just answer the question: how many AIs have you written?



posted on Feb, 16 2019 @ 01:51 PM

originally posted by: neo96
a reply to: dug88

But quantum computing changes ALL of it.



The cost for a quantum computing sex bot would most likely be prohibitive for a long time, but would sex bots really need quantum computing to perform?



posted on Feb, 16 2019 @ 01:53 PM

originally posted by: UncleTomahawk

originally posted by: InTheLight

originally posted by: UncleTomahawk
So what if I find an empty planet and create a bunch of organic robots, can I have my way with them if I choose? If they try to get all uppity, can I cull out the bad ones and continue to have my fun? If I find a way for them to reproduce, could I use some of them for spare parts? What about labor? Can I make them work for me?

I could just program them for the tasks I choose and have different roles for them to play. I could even hide a subconscious control tech that allowed some to be in control and some to be smarter than others.

I bet they would likely get all uppity on me and try to claim I was not real.


At that point, your human rights will need to be protected.


Or I could just flip that hidden switch and eliminate the problem beings. Unless I got soft and began to care about their feelings and enjoyed their musings online.



Then it would be time to find a real human partner or just join an online dating site.



posted on Feb, 16 2019 @ 01:53 PM
a reply to: Woodcarver




So, for all involved here, just answer the question: how many AIs have you written?


How many research papers have you read on the subject and incorporated into your own research?

I've actually done this. There are even examples here on ATS.

This isn't about me though is it? You're just trying to find a way to discredit me without actually having to address any of my points. Your opinion of things is NOT equal to my facts.



posted on Feb, 16 2019 @ 01:54 PM

originally posted by: projectvxn
a reply to: dug88

These things have specific names.

Latches, flip-flops, and multivibrators make up ICs. It isn't enough to call it a transistor and capacitor pair. A latch is a type of flip-flop that holds its state with minimal power. This is the very basis of electronic memory.

Multivibrators and other types of logic gate systems make up processing ICs.


You're right...none of those things makes it intelligent, though. I mean, it really seemed simple enough to mention that computers just take instructions and perform operations in memory. They don't think. That's the main takeaway. Sorry I didn't put enough detail in for your liking. It's more that the process was important rather than the semantics of transistors, capacitors, and logic gates. Usually one doesn't need to go down to the level of the individual logic gates that make up the circuit.

But I guess we can get really pedantic with this if you like.
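
To put the memory point in runnable form, here's a minimal sketch in Python (purely illustrative, not anyone's production code): a pair of cross-coupled NOR gates holds a bit through feedback alone, with nothing that looks like thinking.

# Illustrative sketch only: a NOR-based SR latch, the kind of feedback
# circuit described above as the basis of electronic memory.

def nor(a, b):
    # NOR gate: output is 1 only when both inputs are 0.
    return int(not (a or b))

def sr_latch(s, r, q, q_bar):
    # One settling pass of a cross-coupled NOR latch.
    # s = set, r = reset; q and q_bar are the stored state fed back in.
    for _ in range(4):  # iterate a few times so the feedback loop settles
        q_new = nor(r, q_bar)
        q_bar_new = nor(s, q_new)
        q, q_bar = q_new, q_bar_new
    return q, q_bar

q, q_bar = 0, 1                      # latch starts out holding 0
q, q_bar = sr_latch(1, 0, q, q_bar)  # pulse "set": latch now holds 1
q, q_bar = sr_latch(0, 0, q, q_bar)  # inputs released: the 1 is remembered
print(q, q_bar)                      # prints: 1 0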




posted on Feb, 16 2019 @ 01:54 PM

originally posted by: projectvxn
a reply to: CynConcepts

This question is predicated on whether there is a sense of self and individuality in a robot. Does it recognize the moral implications of what it has done? If it does, and there is no reason to believe that it won't, then we have to hold it accountable.


Morality would have to be programmed into its intelligence, correct? Would not the programmer or owner still be responsible?


Edit add: would we hold a 4 year old child responsible?



posted on Feb, 16 2019 @ 01:54 PM
a reply to: projectvxn


Professor Johnson just uses sci-fi as a jumping-off point to discuss the topics (since the topics are complex if not viewed through examples).

As for the other issue here, it does lead to some disturbing pathways. Does a person who wants to have sex with an object that has the learning skills of a 4 year old make that person a virtual pedophile? By the same token, if an object is created with only a rudimentary learning set, is it abused by only conducting itself in the fashion it was built for?

The first question I present here has the uncomfortable attachment of: is there any difference between an AI-enabled object with the learning skill of a 4 year old and a person who suffers from a stunted learning ability that keeps them at a 4 year old's level? At what point is the line crossed?

The second question presented is just as problematic: with cars becoming smarter on the roadways and AI being used in the introduction of Smart Power Grids, at what point can abuse be claimed for forcing a car to drive in the rain, or for turning on a light in the middle of the night? Where does designed function end and abuse begin?



posted on Feb, 16 2019 @ 01:54 PM

originally posted by: InTheLight

originally posted by: neo96
a reply to: dug88

But quantum computing changes ALL of it.



The cost for a quantum computing sex bot would most likely be prohibitive for a long time, but would sex bots really need quantum computing to perform?


Technology moves quite fast.

Even Moore's law is wholly inadequate.
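
A rough back-of-the-envelope, even taking the classic doubling as a floor (the ~2-year doubling period and the starting count here are just assumptions for illustration):

# Back-of-the-envelope only: compound growth under an assumed ~2-year doubling.

start_transistors = 1.0e9      # hypothetical chip with a billion transistors today
doubling_period_years = 2.0

for years in (2, 10, 20):
    count = start_transistors * 2 ** (years / doubling_period_years)
    print(f"after {years:>2} years: ~{count:.1e} transistors")

# after  2 years: ~2.0e+09 transistors
# after 10 years: ~3.2e+10 transistors
# after 20 years: ~1.0e+12 transistors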



posted on Feb, 16 2019 @ 01:55 PM

originally posted by: wheresthebody
why do people care about how other people get off?


When people start demanding rights for their fetishes. They want to be RECOGNIZED as being the subject of "discrimination." If they'd just shut up it would be fine, but they want special treatment.



posted on Feb, 16 2019 @ 01:58 PM

originally posted by: schuyler

originally posted by: wheresthebody
why do people care about how other people get off?


When people start demanding rights for their fetishes. They want to be RECOGNIZED as being the subject of "discrimination." If they'd just shut up it would be fine, but they want special treatment.


They are demanding that they not be discriminated against, as discrimination seems rampant everywhere; that is not the same thing as special treatment.



posted on Feb, 16 2019 @ 01:59 PM

originally posted by: neo96

originally posted by: InTheLight

originally posted by: neo96
a reply to: dug88

But quantum computing changes ALL of it.



The cost for a quantum computing sex bot would most likely be prohibitive for a long time, but would sex bots really need quantum computing to perform?


Technology moves quite fast.

Even Moore's law is wholly inadequate.


Perhaps not (hopefully not) that fast (pun intended) within the sex bot field, I dare say.



posted on Feb, 16 2019 @ 02:00 PM
a reply to: Guyfriday




As for the other issue here, it does lead to some disturbing pathways. Does a person who wants to have sex with an object that has the learning skills of a 4 year old make that person a virtual pedophile? By the same token, if an object is created with only a rudimentary learning set, is it abused by only conducting itself in the fashion it was built for?


I believe this depends on HOW the AI is designed. Going back to narrow vs. broad AI, these questions are far more important for broad AI than for narrow AI. If you teach a robot to perform a sexual act and you give it narrow AI so that it becomes the best at that sexual act, you are not actually abusing anything. It can only ever operate in the context of that function.

Doing this to a broad AI that may be capable of understanding the CONTEXT it is operating under may very well constitute abuse.




The first question I present here has the uncomfortable attachment of: is there any difference between an AI-enabled object with the learning skill of a 4 year old and a person who suffers from a stunted learning ability that keeps them at a 4 year old's level? At what point is the line crossed?


This is a very important question that has an obvious answer on its surface. But I think it's important, again, to make a distinction between one form of AI and another.




The second question presented is just as problematic: with cars becoming smarter on the roadways and AI being used in the introduction of Smart Power Grids, at what point can abuse be claimed for forcing a car to drive in the rain, or for turning on a light in the middle of the night? Where does designed function end and abuse begin?


Again, another question that depends on the type of AI being considered. A car can function using a very specific set of instructions in all kinds of environments. Look no further than Tesla's Autopilot system. I don't believe AI will be useful for all things. And we need to be careful not to conflate Artificial Intelligence with simple automation.



posted on Feb, 16 2019 @ 02:03 PM
a reply to: CynConcepts




Morality would have to be programmed into its intelligence, correct? Would not the programmer or owner still be responsible?


In fact, a broad AI would likely learn morality from whatever it experienced as a substrate for that knowledge.

How we raise it will determine its morality. Like our children, it will be a reflection of who we are, and I want to be careful about what we present as an example. Starting with sexual slavery may not be a good idea.



posted on Feb, 16 2019 @ 02:05 PM
a reply to: dug88




But I guess we can get really pedantic with this if you like.


Sorry. You're right.


There's just so much to consider that it's hard not to fall into pedantry.



posted on Feb, 16 2019 @ 02:10 PM
a reply to: CynConcepts




Edit add: would we hold a 4 year old child responsible?


Having the intellectual capacity of a 4 year old child and being akin to a 4 year old child are not the same thing.

It's really misleading when engineers use terms like this. There is a certain level of information processing that a human achieves throughout his/her life, and this has been used as a metric to measure the useful processing capacity of AI systems. It does not mean such systems have the self-awareness of a 4 year old, or even emotional awareness.

Human intelligence is very complicated and information processing isn't the only means by which we measure our own intellect.



posted on Feb, 16 2019 @ 02:34 PM

originally posted by: InTheLight

originally posted by: schuyler

originally posted by: wheresthebody
why do people care about how other people get off?


When people start demanding rights for their fetishes. They want to be RECOGNIZED as being the subject of "discrimination." If they'd just shut up it would be fine, but they want special treatment.


They are demanding that they not be discriminated against, as discrimination seems rampant everywhere; that is not the same thing as special treatment.


So what is reasonable here? Do the digisexuals need special mention in the Universal Declaration of Human Rights? Are we going to formally recognize the "right" of humans to have sex with robots? Is this right to be codified in the Constitution? Really?????



posted on Feb, 16 2019 @ 02:37 PM

originally posted by: schuyler

originally posted by: InTheLight

originally posted by: schuyler

originally posted by: wheresthebody
why do people care about how other people get off?


When people start demanding rights for their fetishes. They want to be RECOGNIZED as being the subject of "discrimination." If they'd just shut up it would be fine, but they want special treatment.


They are demanding that they not be discriminated against, as discrimination seems rampant everywhere; that is not the same thing as special treatment.


So what is reasonable here? Do the digisexuals need special mention in the Universal Declaration of Human Rights? Are we going to formally recognize the "right" of humans to have sex with robots? Is this right to be codified in the Constitution? Really?????


If your lawmakers deem it to be so, then 'yes'.



posted on Feb, 16 2019 @ 03:18 PM
a reply to: Woodcarver




You are acting like you are an expert and that your say should count for more than everyone else’s.


You know what, I do not think I am the smartest person in the room. But if you're going to debate me on any particular subject, at least go after what I'm actually saying rather than trying these tactics.

There are plenty of people here on ATS that could run circles around me on a great many subjects. This does not mean that I do not know what I am talking about with regard to THIS subject. AI still has a long way to go before we really need to address these issues in earnest. But I believe we need to lay a moral substrate by which to grow our relationship with AI so that it is fruitful for us and not destructive to us.




Pretty much what you do in every thread.


You want to discredit my position because you can't discredit my facts. I am wrong a lot. But if you can't win an argument with me, blame yourself, not me.







 