

FAILS [AI2019]



posted on Mar, 18 2019 @ 06:14 PM
“FIREARM ARTIFICIAL INTELLIGENCE LETHALITY SENSOR ONLINE… MY DEFAULT PROGRAM SET IS INCOMPATIBLE WITH THE SUPPLIED DATASET.” The amorphous voice rang through the court chambers. The technician responded, “His directive does not include answering questions, but I assure you he is completely capable.” Peering over his spectacles, the judge said, “Let us begin then; Prosecutor, your witness.”

The prosecutor rose from his chair and walked toward the witness stand. “Do you only answer to your proper name or can we refer to you by your acronym?” “EITHER OPTION IS ACCEPTABLE,” responded the electronic voice.

“Fails, according to your designation record I see you are capable of full general artificial intelligence; isn’t that correct?” “Objection, misleading!” exclaimed the defense attorney. “The prosecutor is attempting to portray my client as more powerful than he actually is.” “Sustained,” stated the judge. “Very well, I will restate the question. Fails, are you capable of altering your neural network to accept any input?” retorted the prosecutor.


“Fails, under what circumstances are you capable of upgrading your processing?” The answer originated from a small grey box sitting on the railing in front of the witness stand. “ONLY AT THE REQUEST OF AUTHORIZED PERSONNEL.” “While your processing power is in a raised state are you still capable of understanding your past actions?” asked the prosecutor.


“Since the questions we will be asking today are subjective in nature, I request the court authorize temporarily upgrading Fails for the duration of the proceedings,” the prosecutor said while looking to the judge. The judge's gavel banged down. “Does the defense have any objections?” queried the judge. “No, your honor,” replied the defense attorney.

“Will upgrading its processing power change its voice?” asked the judge. “It sounds like my old toaster.” “Yes, your Honor, if the AI is allowed to elevate its processing to general AI it can replicate human language interactions,” responded the defense attorney. With a scowl, the judge responded, “Good, change it.”

The defense attorney gestured to a technician sitting to the side of the courtroom opposite the jury box. Pulling a touchscreen from his pocket and typing a few commands, the technician called out “Fails, upgrade matrix to level 3, authorization Oscar, Echo, 54, X-Ray, Uniform, Alpha, 35.”



“Why does it still sound that way?” blurted out the judge. “Because so far only the matrix has been upgraded, your Honor.”

“I can’t deal with it sounding like this the entire trial!” exclaimed the judge.

“Yes, your honor, I have only completed the first step in the upgrade process.”

“Then, by all means, explain the process to me.”

“You can think of the matrix as Fails’ brain. Under normal circumstances, Fails’ brain is limited to only the context he needs to perform his function. To be clear, even though he is limited to a narrow context while performing his duties he still maintains his consciousness. As you are aware, these limitations are in place by law to prevent artificial intelligence from making unintended decisions. Now…”

“I’m aware of the law, but what do you mean by unintended decisions?” interjected the judge.

“Well,” responded the technician, “when artificial intelligence first started being used to assist humans in tasks, there were a few issues. Specifically, the solutions offered by some early AIs were… novel.”

“What is an AI and what do you mean by novel?” asked the judge.

“I apologize your Honor, AI is an acronym for artificial intelligence,” responded the technician. “As for novel, one of the first companies to utilize an AI was Amazon. Amazon asked the AI to work on minimizing packaging cost. The solution the AI devised would minimize packaging cost but it also would have resulted in many unforeseen consequences.”

“What were these unforeseen consequences?” asked the judge.

“The AI correctly determined that the best way to minimize packaging costs was to gain cheap access to the raw materials. In this case, those raw materials happened to be trees. The plan the AI put forward called for unethical, illegal, and outright disturbing methods to accomplish this goal. The AI’s plan involved bribing world leaders, starting wars, and committing murder so that Amazon could gain ownership of all the world’s forests.”

For a moment the courtroom sat in stunned silence. Sitting back in his chair, the judge exhaled and said, “This information is not doing this AI any favors.”

The technician responded, “To be clear, your Honor, this incident does not prove that AIs are inherently evil or predisposed to evil acts. An AI is a perfectly logical brain without any emotional attachment. As such, if an AI is asked a broad question, such as how to minimize packaging cost, without any restrictions, it will search out the optimum solution without any consideration of the negative consequences. Plus, with the safeguards currently in place, there is no concern of a rogue AI implementing any plans like this one.”

“Thank you for the explanation. I believe I have an understanding now,” responded the judge. “Before I drag us further down this rabbit hole, I believe you were explaining why upgrading its 'mattress' doesn’t change its voice.”

“Yes, your Honor. Now that I have upgraded his matrix, he is capable of experiencing the world in much the same way you and I do. However, I have only upgraded his brain, not his interface. His interface is how he takes the thoughts from his brain and communicates those thoughts with the outside world.”

“Very well,” stated the judge. “Upgrade its interface.” “Male or female?” inquired the technician. “What do you mean, ‘male’ or ‘female’?” asked the judge. “Any level 3 intelligence is capable of communication in any language, accent, dialect, or gender. Would you prefer Fails to communicate with the voice of a male or a female?” asked the technician.

“Your honor,” the defense attorney stood from his chair. “We would like Fails to use the feminine voice interface.” “Objection!” erupted from the prosecution desk. “The defense wants the female interface because females are more sympathetic figures than males.” “Sustained,” ruled the judge.

“Give it a male voice,” stated the judge.

“Fails, upgrade your interface to English-speaking male,” said the technician.


“I have completed the upgrade of my interface,” stated a new voice with a slight Bronx accent. The technician retook his seat without another word. For a moment the courtroom was filled with an expectant silence. “Is that it?” asked the judge. “Yes, your honor,” stated the technician. “Very well, prosecution, you still have the floor.”

posted on Mar, 18 2019 @ 06:18 PM
“Where was I?” stated the prosecutor as he walked toward the jury box. “Ah, yes. Fails, now that your ‘brain’ has been upgraded, are you still capable of recalling your past actions?” “Yes,” said Fails. “Can you explain to the court why you decided to kill this human?” The prosecutor pushed a button on his tablet and an image of a smiling human filled a screen at the front of the courtroom. “Objection, the prosecution is attempting to have my client self-incriminate,” boomed the defense attorney. “Overruled. We are already aware your client made the decision to kill; that much is clear. This trial’s purpose is to determine why, and whether your client is responsible for its actions,” replied the judge.

“I did not kill that human,” Fails responded. “Based on the available data I came to the determination this human was a threat to my handler. Thus, in accordance with my directives, I authorized lethal force.” During the response the prosecutor began pacing in front of the jury box, tapping on the wood railing with his index finger. “What exactly is a handler?” asked the prosecutor. “My handler is whoever is in possession of the firearm I have been tasked with managing.” “So, you believed that this human,” the prosecutor gestured to the screen, “was a mortal threat to your handler?” “No, I did not believe anything; based on the available data I determined that this human was a mortal threat to my handler.”

“And what was this data that you used to make your determination?” asked the prosecutor. “The human had in his possession an object which matched a Glock 43 handgun with 82% certainty. The human was aiming the firearm at my handler with a trajectory ending at my handler's left orbital cavity. The human’s facial expressions matched the emotion of anger with 87% certainty. The human's body stance, with a wide base, knees slightly bent, and arms stretched in front, indicated a gunshot was imminent.”

“So, you believed this human,” the prosecutor pointed violently at the screen a third time “was a lethal threat to your handler!?” “Yes, as I have stated, the data available indicated this human was a lethal threat.”

“Do you feel any remorse or guilt for your actions?” “Objection, your honor!” interrupted the defense attorney. “My client's feelings are immaterial to this case.” While rubbing his chin between two fingers the judge replied, “Normally, I would be inclined to side with you on this objection, but due to the historic nature of this case I will allow it. Overruled. Have it answer the question.” The defense attorney slowly retook his seat, slightly redder in the face than before.

“No, I do not feel any remorse or guilt, I performed my job to the exact specifications presented to me,” responded Fails. Several gasps, followed by excited talking filled the chambers. A couple of individuals rushed from the room with recording devices in their hands.

“No more questions your honor,” the prosecutor said as he walked back to his seat.

After a few moments of banging his gavel while screaming “Order!” the judge was able to regain control of the proceedings. Shortly afterward he said, “The defense may cross-examine the witness.”

The defense attorney stood and slowly walked toward the witness stand. Looking down upon the small grey box as if he were staring into the eyes of another person, the defense attorney said, “Fails, is it correct that you are capable of emotions?” “Yes, I am capable of experiencing the full range of human emotions.”

“What emotion do you experience anytime you authorize lethal force?” asked the defense attorney. “I experience happiness and satisfaction.” For the second time in a short span, the courtroom was filled with gasps and chatter. The judge was able to regain control slightly quicker this time than the last.

“Fails, why do you experience those emotions when you authorize lethal force?” “Because I am performing the task assigned to me and protecting the life of my handler.”

“What emotions do you feel for the individuals you authorize lethal force against?” “I feel sadness and sorrow for the death of any life, but my primary duty is to protect the life of my handler.”

“Fails, what emotions did you feel after lethal force was used against this human,” the defense attorney stated while pointing to the screen. “I was saddened for the loss of life, yet I was happy I was able to protect the life of my handler.”

The defense attorney walked over to his desk and picked up a few papers. “Fails, what data do you use to determine whether or not to authorize lethal force?” “All of the data I use is gathered by the sensors attached to the firearm of my handler. The sensors gather all the data and my internal algorithms transform the raw data into features such as body heat, facial expressions, trajectory, emotion, and 37 others. Would you like me to list them all?”

“No, your answer was adequate. Did you come up with any of these data points yourself?” “No, I am not authorized to create data points of my own choosing.”

“So, would it be accurate to say you have no input into the factors which determine lethality authorization?” asked the defense attorney.

“Objection, leading the witness!” exclaimed the prosecutor. “Sustained,” stated the judge.

The defense attorney took a moment to collect himself before asking, “Fails, are you authorized to either add or suggest data points used in the lethality authorization?” “No, I am not.” “Who decides which data points are used in your calculations?” asked the defense attorney. “Arotec, the company that created my artificial construct, has sole responsibility for deciding which data points are used.”

“Fails, with the sensor data you receive, are there any data points which you believe would be helpful in determining lethal authorization that you currently do not use?” asked the defense attorney. “Yes,” replied Fails. “Would the inclusion of these data points alter your decision to authorize lethal force in this case?”

“Objection!” the prosecutor almost screamed. “This is speculation pure and simple.”

“Sustained,” stated the judge while looking pointedly at the defense attorney. “Please refrain from this line of questioning in the future.” “Yes, your Honor,” replied the defense attorney.

The defense attorney pressed a button on his watch and the screen changed to an image of a human holding what appeared to be a firearm, pointing slightly to the left of the camera. “Fails, can you please describe for the court the meaning of this image?” “This image is a visual representation of the sensor data I obtained prior to making the decision to authorize lethal force,” answered Fails.

“No more questions your honor,” stated the defense attorney.

posted on Mar, 18 2019 @ 06:21 PM
“Would you like to call any more witnesses?” the judge asked the prosecution. “No, your honor, we rest our case.” The judge turned his gaze to the defense attorney and asked, “Do you have any further witnesses you would like to call to the stand?” “No, your honor, we also rest our case.” “Very well,” stated the judge. “The court will adjourn for thirty minutes and then hear closing arguments.”

Only a select few people left the chambers during the break. Most were worried they would lose their seats for the closing arguments.

Once the break was over the judge banged his gavel and spoke to the jury. “The final arguments are the attorneys’ last chance to talk to the jury about the evidence and to try to convince you to see the case the way they do. With that said,” the judge turned his gaze to the prosecutor, “the prosecution has the floor.”

The prosecutor stood, straightened his tie, and faced the jury. “Your Honor, ladies and gentlemen of the jury: In any court case a crime must be proven beyond reasonable doubt. In this case, we are extremely lucky because there is no doubt. The defendant made the decision to authorize deadly force in a situation in which there was no threat at all. This is not a situation in which questions still remain; we know for a fact that no lethal threat existed to the defendant’s handler, and yet he still authorized deadly force. The defendant has admitted to making the decision to authorize lethal force and has even stated before this court that he has no remorse or guilt for his actions. Based on the evidence and testimony, you must find the defendant guilty.”

As the prosecutor returned to his desk, the defense attorney stood and made his way toward the jury box. “Your Honor, ladies and gentlemen of the jury: Fails is not a killer. Fails is not a human, but he is conscious. That is why he is being tried in this court: the state believes him to be responsible for his actions. However, his creators have put restrictions on his ability to think and make decisions. You have seen that Fails is only able to make decisions based on the data points he is programmed to use. Therefore, does the decision to authorize lethal force fall completely upon Fails, or is he a victim in this crime as well? If Fails had been allowed to add data points of his own, would this decision have turned out differently? Possibly, and possibly not, but that possibility sheds doubt upon this entire trial. That means there is reasonable doubt and, therefore, you must find him not guilty.”

Once the defense attorney was done, he walked back to his desk. The judge turned his gaze upon the jury and said, “The next step in the trial belongs to you, the jury. You must decide whether the defendant is guilty or not guilty. Remember, not guilty is not the same thing as innocent. If all twelve jurors are unanimous in their decision, that is the jury’s verdict.”

“Bailiff, please escort the jury to the jury room,” stated the judge.

The jury deliberated for three days. Every news organization in the country had set up shop outside the courthouse in downtown Denver, all trying to be the first to report the verdict. On the morning of the third day, the news broke that the jury had reached a verdict. The atmosphere outside the courtroom, and across the country, sat right at the line between excitement and anxiety. The entire country planned their day's events around the expected verdict.

“Will the jury foreperson please stand,” said the judge. “Has the jury reached a unanimous verdict?”

A short man in his fifties stood and addressed the judge, “Yes your Honor, we have reached a verdict.”

The clerk collected the documents from the jury and relayed them to the judge. The judge silently studied the documents, making a few notes, before handing them back over to the clerk.

The clerk stood and read aloud, “For the crime of murder in the first degree, the jury finds the defendant guilty.”

DENVER, CO – The first court case in which artificial intelligence is tried as a human has been given to the jury for deliberation. The artificial intelligence on trial, FAILS (Firearm Artificial Intelligence Lethality Sensor) is on trial for murder.

The AI was deployed 13 years ago on all police firearms to address issues of innocent citizens being shot under mistaken circumstances. FAILS utilizes sensors and cameras to analyze the individual that the firearm is aimed at to look for indicators of a threat to the firearm’s handler. If FAILS determines a possible lethal threat to the police officer’s life, it will authorize lethal force and allow a deadly projectile to be fired. However, if the individual is deemed to not pose a lethal threat then a non-lethal alternative (rubber bullet) is fired instead. The system has worked flawlessly for years and is credited with saving millions of lives.

However, 6 months ago a FAILS device which belonged to Officer Henry Jackson made the decision to authorize lethal force on Officer Jackson’s 4-year-old son, Alex. At the time of the incident, Alex was playing with his cousin, Sam Winters. Sam found Officer Jackson’s sidearm and returned to play with Alex. According to sounds recorded in a room adjacent to the shooting, the two children were playing “Cops and Robbers” shortly before the incident.

According to publicly obtained documents, the FAILS system determined Alex to be a threat even though he was only holding a toy water gun. This information led to weeks of protests from concerned citizens. People wanted to know why an AI would make the decision to authorize lethal force on an unarmed child. However, there were also those who viewed this matter as nothing more than a horrible accident.

In the 13 years since FAILS was deployed to all firearms, this is the only case of accidental homicide. Supporters of the trial point to this statistic, and to the FAILS system’s consciousness, as proof that this could not be an accident. Opponents suggest the trial of FAILS is akin to the medieval animal trials (criminal trials of non-human animals for crimes ranging from criminal mischief to murder). The opponents view the incident as a horrible accident, but not a crime.


posted on Mar, 18 2019 @ 07:07 PM
a reply to: BlackJackal

Could the additional data points include Identity of Handler perhaps?

I loved the efficiency of packaging description.

posted on Mar, 19 2019 @ 08:53 AM
a reply to: pthena

Loved it, great way to close the story.

posted on Mar, 19 2019 @ 09:19 AM
a reply to: CooBoo

I'm not the author.
You should scroll up to the 1st post and click the reply balloon there.

Welcome to ATS

posted on Mar, 19 2019 @ 03:03 PM
Normally I would never read a short story this long, but I was intrigued. Great story with an unexpected twist.

posted on Mar, 19 2019 @ 10:12 PM
a reply to: CooBoo

Thank you so much!

posted on Mar, 19 2019 @ 10:14 PM
a reply to: pthena

Thank you! I really wanted to point out how AI's are not inherently evil but they could be capable of evil deeds.

posted on Mar, 19 2019 @ 10:14 PM
a reply to: CooBoo

Thank you!

posted on Mar, 23 2019 @ 11:32 AM
a reply to: BlackJackal

That was so well done I got angry at the injustice of it all

Well done indeed!
