My first replies to this thread were obviously in jest, as there are a few things, at least on the surface, that make this a fun intellectual topic... i.e. what would it feel like to be "defragged"? The more I think about the ramifications, though, the more I think that it is a horribly bad thing to do.
First of all, science still has no real understanding of how the mind works. Granted, we have studied the brain in great detail, mapped out areas that
control certain processes and have a basic understanding of how the chemistry works. The brain is just an organ that serves as a CPU for the mind, so
to speak. The mind is a completely different concept from the brain, and seemingly infinitely complex. Twins are a good example. If two people on this
planet have any chance of having identical brains, twins would be where to look. Many twins have common speech patterns, mutual idiosyncrasies and
other traits that would imply "sameness"... yet one could be deeply spiritual and the other an atheist. Two very like-minded people, yet totally
different belief systems. Why? Even if those twins are joined at the hip throughout their formative years, experiencing life together and sharing
common events, each one will still have a different perspective on the world, as every one of us learns from our experiences in our own way.
Software developers love flow charts. Flow charts graphically represent how the program "thinks"... i.e. "or" switches, "and" switches and other Boolean
arguments... could you even imagine just how much paper would be involved to "flow chart" just 1 person's thought patterns? Or how much that "flow
chart" would depend on an individual's mood and thinking at that moment?
Not to mention the fact that programmers know EXACTLY how today's
computers work, yet software is still released buggy... pretty sure I don't want these guys trying to program me on a system they barely understand.
Beyond that, the programming involved to make this concept work for just 1 person's mind would be insanely complex and would probably take decades
just to debug. By the time they actually got it to work, they would have to start all over, as the "test subject" would no doubt be fundamentally
different by then, thanks to the effects of learning and experience.
Some of the "what-ifs" I made in jest are actually pretty scary. We are born, we live our lives, then we die. It is a natural process. After we die,
we move on (or so we like to believe)... but... what if your AI mind started to think outside of what the other AI minds believe you should?
Could you then be "boxed", à la Battlestar Galactica? Your mind and being, your "esse", put on a memory device and powered off forever... consigned to
oblivion on a hard drive in a government warehouse somewhere.
Someone would also have to physically maintain the computer that you are loaded on. How long would you be viable in such a system if the power were
shut off? Would you survive a hardware crash? If you could be "recovered", how much data would be lost? Would you still be "you" if your "emotion"
subroutine were removed because the external programmers wanted more predictability from the system?
Mechanical immortality? No thanks.
I'll take my chances and be happy with my time here.
edit on 21-6-2013 by madmac5150 because: My cat made me