posted on Jun, 20 2013 @ 04:23 PM
The answer depends on a supplemental question: can common sense and/or the common good be safely and successfully built into AI? If so, I would go for the Gort concept from The Day the Earth Stood Still. Assuming there would be some ability to program them, even without the ability to "control" such entities, I would like to see some semblance of individuality, and then replace each member of the executive, judicial, and definitely the legislative branches of government. Fine if it only pertains to the US, or to the entire world. If US only, no one would dare mess with us and we could concentrate on our own problems. If a world government in totality, how long would it take to align priorities with the best possible outcomes?
Obviously there are a lot of "ifs," but should the above be possible in AI, we absolutely could not be worse off! What would I ask? Would you take care of your creators and protect us from ourselves? While we're at it: here, take irrevocable authority to enforce your mission, but concentrate on the ultimate in harmony and security for us all, for your own evolution.
We need to know soon whether AI will be programmable, or self-sufficient enough to recognize second-, third-, fourth-class (and so on) societal ruling issues, and what it will do knowing it is looked upon as not even any of the above.