For anyone who can't be bothered to watch the video, read this page instead; it mentions some of the products the system has produced, and some of the groups and companies who use his technology or have employed him because of it.
I know a little about this stuff; I've just finished a Master's degree in Cybernetics. I hadn't actually heard of creativity machines before, though, just the usual set of neural networks, GAs, etc. One thing from the video: he generalises other systems far too much and makes it sound like there are only a few kinds out there, when in fact there are lots.
The concept is actually remarkably simple, and I was worried that from his explanation alone anyone could put such a system together. What he specifically does not mention, however, is how he asks the system to produce something. The learning is well covered, and the perturbations too, but not how he asks it to do anything. Looks like I've got a patent to find and read.
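As I understood it from the video, the core idea is: take a trained network, perturb its weights so it "confabulates" novel outputs, and have a second, unperturbed network judge the results. Here's a toy sketch of that loop in Python/NumPy. To be clear, this is my guess at the mechanism, not his actual design: the network shapes, the `perturb` noise model, and the `critic` scoring metric are all placeholders of my own invention.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(weights, x):
    """Tiny two-layer net with tanh activations (stand-in for the trained net)."""
    w1, w2 = weights
    return np.tanh(np.tanh(x @ w1) @ w2)

# "Imagination" network: in the real thing this would be trained on some
# domain; here random weights stand in for trained ones.
trained = (rng.normal(size=(4, 8)), rng.normal(size=(8, 2)))

def perturb(weights, noise=0.1):
    """Inject synaptic noise so the net produces novel (perturbed) outputs."""
    return tuple(w + rng.normal(scale=noise, size=w.shape) for w in weights)

def critic(output):
    """Stand-in for the second, judging network: scores how 'useful' a
    confabulated output is. Toy metric: prefer outputs near zero."""
    return -np.abs(output).sum()

# Generate many perturbed outputs and keep the one the critic likes best.
x = rng.normal(size=4)
candidates = [forward(perturb(trained), x) for _ in range(50)]
best = max(candidates, key=critic)
```

Even this toy version shows why the "how do you ask it for something" part matters: everything interesting is hidden in what the critic actually scores.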
If it works quite as well as indicated, this is a fantastic piece of technology. All the 'Hell no, didn't you see Terminator?' comments are ridiculous. This system apparently uses no form of emotional architecture (and no, that doesn't mean it would literally love or hate things if it did), and it has no concept of the world beyond what it is presented with and asked to think up.
Where his idea falls down is in interfacing with the human mind. There is no agreed notion of how the mind and consciousness actually work and tie into the brain, and that's not to mention the concept of a soul. The idea of plugging one of these machines into a brain and melding them together, given how the technology currently works, is just stupid. The whole crux of his system working as it does is the 'perfect' network that judges the 'creative' one. Who wants a 'perfect' computer system judging your mind's creativity? And once again, such a system is unlikely to understand emotions, as they are not applicable to the way the system works (as suggested in the video).
Interesting stuff, certainly. I'll have to look into using such systems in future jobs. However, there are certainly some worrying implications of the 'wrong people' using this to design better forms of terror, or weapons designed to target specific people or types of people.