ItCameFromOuterSpace wrote:
"There's a book called 'Feed' that everyone should read. I can't remember the author, but I read it about 13 years ago and it's about this exact thing."
I was given that very same book by my English teacher as a gift (still don't know why, but I'm sure it was because I really enjoyed reading 1984 and Brave New World).
The idea of having a chip in my own or another person's brain for entertainment purposes (note: I'm not against it for medical reasons) isn't something I would jump on the bandwagon for. It's a pretty bad idea because:
1. Unless the device runs something open and auditable like Linux, it can be hacked - just look at the list of devices that companies claimed couldn't be hacked (cough, PSP, cough). And even Linux doesn't guarantee that the device is safe. Now imagine that someone figures out how to control what the device sends out, and decides to "think" up their own interface for hacking purposes.
2. When something becomes that integrated with another object, not only is it harder to fix, but the more likely it is to break. Real-world example: Windows 8. On most newer PCs, Windows 8 isn't just an OS; it's tied into the UEFI firmware (what used to be the BIOS) as well - I had to force Windows 8 off this laptop, kicking & screaming all the way. Were it not so attached to everything, the uninstall would have taken maybe an hour at most.
Now, say you have a device that not only manages your memories, but also manages your heart rate, body temperature, muscle actions, etc. What do you think would happen if that device suddenly failed? My bet: you'd be like a computer that cannot boot - frozen. You might not be able to see, or even walk, because this device has wired itself into being a part of you.
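To put that in programmer terms, here's a toy sketch of the failure mode (purely hypothetical - no real implant works like this, and every function name here is made up by me). The point is that when one tightly-integrated loop runs everything, a crash in the entertainment feature takes the vital functions down with it:

    import time

    def manage_memory():
        pass  # placeholder: vital subsystem

    def manage_heart_rate():
        pass  # placeholder: vital subsystem

    def manage_temperature():
        pass  # placeholder: vital subsystem

    def recall_feed():
        # the "entertainment" feature hits a bug...
        raise RuntimeError("feed parser crashed")

    def main_loop():
        # one loop drives everything: no isolation between subsystems
        while True:
            manage_memory()
            manage_heart_rate()
            manage_temperature()
            recall_feed()  # ...and the uncaught exception kills the whole loop,
            time.sleep(0.01)  # heart rate and temperature management included

    main_loop()  # this is the computer that cannot boot

In isolated systems, that crash would cost you your entertainment feed and nothing else. Wire it all together and it's a full-body blue screen.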
My only hope is that if this tech does go mainstream, I can keep my job in the tech industry - although somehow, I don't think I'd stand much chance competing against someone who can think up a program, versus me, who can only type as fast as the computer lets me.
-fossilera