posted on Jul, 24 2013 @ 05:52 PM
reply to post by voudon
It's a tough question.
On one hand, we really love to kill each other. Fortunately, given our massive potential for killing each other by the millions, we seem to be proving that we don't enjoy it quite as much as history, and even current events, would imply.
It would also seem that quite a bit of this must-kill-everyone attitude arises out of less-than-civilized cultures. First World cultures are usually cast in the role of respondents in these modern forays into violence.
On the other hand, we've a certain progressive march of technology.
The technological singularity is just around the corner: the advent of true Artificial Intelligence, as smart as or smarter than human intelligence, and capable of designing even smarter, more capable systems.
Will true AI become the babysitter humanity seems to need, helping us to reach that next level in our own self-directed evolution?
If we can achieve intellectual escape velocity, and not fall prey to idiocracy by letting our machines take care of us, should we ever reach a point where machines can effectively and responsibly manage themselves and us as well, then we might have a chance of getting off planet.
Certainly AI, if ever granted legal autonomy, mobility, and the means to pursue its own ends, would calculate the wisdom of getting off planet, possibly in partnership with its biological parents (us).
I'd certainly love to see us get off this rock.