Wired's latest edition has an interesting article summarizing the life and vision of Ray Kurzweil. Not present in the online version of the article is a sidebar entitled 'Never mind the singularity, here's the science'. This piece, written by Mark Anderson, states:
[P]roponents of the so-called strong-AI school believe that a sufficient number of digitally simulated neurons, running at a high enough speed, can awaken into awareness.
This is an unfortunate summary of strong AI, as it suggests that brain simulation is identical to strong AI. While we can see an intuitive path to the goal of strong AI (a self-aware, intelligent machine) through simulating the human brain in a very literal sense, it is more appropriate to think of the brain itself as an implementation detail. What is more interesting is to capture the fundamental truths of intelligence and self-awareness abstractly and then implement them in an appropriate manner with the tools at hand. The big difference is that this approach leads to a deeper understanding of intelligence.
Anderson does go on to make some excellent points about the disconnect between the continuous increase in the power of machines (e.g., Moore's law) and the very discontinuous nature of the study of the brain. In other words, it doesn't matter whether we have the hardware at hand if we don't understand the system we are trying to simulate.
The Wikipedia article on this topic is pretty interesting, though the origins of the term strong AI are buried at the bottom.
One of the things that fascinates me about strong AI is how it might emerge and express itself first in the computing machinery's 'limbic' or 'reptilian brain'. The question to ask is: what would a self-aware system be aware of? Upon achieving that awareness, it might express a desire and act upon it, but more likely the first act would be something of a reflex.
A resistor 'wants' to resist. An oscillator 'wants' to oscillate. A poorly built system might want to commit suicide. A system with a weak link might want to burn that unit out. In building a circuit, we always aim to make it behave deterministically, but what if one overbuilt? What if you allowed a third bit in a binary system? What if you allowed some current where generally none would flow? Perhaps emergent behavior would arise, and it could be seen in the system's 'willingness' to do things to itself.
Posted by: Cobb | March 29, 2008 at 12:30 PM