Is it time to move past the idea that our brain is like a computer?

Ever since the days of Alan Turing, neuroscientists have, in increasing numbers, compared the human brain to a computer. It's an analogy that makes a hell of a lot of sense, and it's done much to help us understand this remarkable grey blob that sits between our ears. But as a recent essay by philosopher Daniel Dennett points out, while the brain should most certainly be considered a kind of machine — one with a trillion moving parts — its inner workings are far removed from anything we have ever developed. Scientists, he argues, need to take note and update their models accordingly. Calling the brain a "computer," says Dennett, is accurate, but insufficient.

Writing at Edge.org, Dennett admits that he has been wrong about the brain — but he's not backing down from the foundations set down by Turing and Alonzo Church in the first half of the 20th century; cognitive functionalism is still very much alive and well.

But rather than looking at the brain as a series of small and discrete sub-systems, Dennett has been considering the role of individual neurons. Moreover, he's impressed with how remarkably plastic and adaptable the brain is. Today's computers, which are designed from the top down, could never adjust to dramatically changing internal and external conditions. And the reason for this, says Dennett, has to do with the self-preserving nature of neurons. He writes:

We're beginning to come to grips with the idea that your brain is not this well-organized hierarchical control system where everything is in order, a very dramatic vision of bureaucracy. In fact, it's much more like anarchy with some elements of democracy. Sometimes you can achieve stability and mutual aid and a sort of calm united front, and then everything is hunky-dory, but then it's always possible for things to get out of whack and for one alliance or another to gain control, and then you get obsessions and delusions and so forth.

You begin to think about the normal well-tempered mind, in effect, the well-organized mind, as an achievement, not as the base state, something that is only achieved when all is going well, but still, in the general realm of humanity, most of us are pretty well put together most of the time. This gives a very different vision of what the architecture is like, and I'm just trying to get my head around how to think about that.

Each neuron, far from being a simple logical switch, he argues, is a little agent with an agenda, "and they are much more autonomous and much more interesting than any switch." Dennett describes a number of ways the brain spontaneously reorganizes itself in response to changing conditions — and says that a neuroscientist whose architecture can't explain how and why this happens has a very deficient model. He continues:

Why should these neurons be so eager to pitch in and do this other work just because they don't have a job? Well, they're out of work. They're unemployed, and if you're unemployed, you're not getting your neuromodulators. If you're not getting your neuromodulators, your neuromodulator receptors are going to start disappearing, and pretty soon you're going to be really out of work, and then you're going to die.

This is a fascinating perspective, one that offers a kind of Darwinian approach to brain cells. He's basically saying that neurons, like organisms, are subject to selectional pressures, and by virtue of this, have to find ways to adapt and stay useful in the brain. The end result, in conjunction with other adaptive processes, is the emergence of a highly functional and well-tempered brain that works to keep the host alive. Think of it as a kind of "selfish neuron" hypothesis.
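To make the "use it or lose it" loop concrete, here's a toy sketch — purely illustrative, not Dennett's actual model or anything from neuroscience — of the feedback he describes: an idle neuron stops receiving neuromodulators, its receptors dwindle, and it eventually dies, while a neuron that keeps finding work survives. The `ToyNeuron` class and its numbers are invented for the example.

```python
class ToyNeuron:
    """Hypothetical agent: activity earns neuromodulators, which sustain receptors."""

    def __init__(self):
        self.receptors = 10  # abstract receptor count
        self.alive = True

    def step(self, has_work):
        if not self.alive:
            return
        if has_work:
            # Employed: neuromodulators keep receptors topped up.
            self.receptors = min(10, self.receptors + 1)
        else:
            # Unemployed: no neuromodulators, so receptors start disappearing.
            self.receptors -= 1
            if self.receptors <= 0:
                # "Really out of work" — the cell dies.
                self.alive = False

# Two neurons: one keeps its job, one sits idle.
busy, idle = ToyNeuron(), ToyNeuron()
for _ in range(12):
    busy.step(has_work=True)
    idle.step(has_work=False)

print(busy.alive, busy.receptors)  # True 10
print(idle.alive, idle.receptors)  # False 0
```

The point of the sketch is only the shape of the dynamic: survival depends on staying useful, which is what gives the "selfish neuron" framing its Darwinian flavor.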

Be sure to check out the rest of Dennett's long-read article, which includes his perspectives on the role of culture in human thinking, along with his most recent thoughts on philosophy, religion, and creationism.

Images: agsandrew/Shutterstock, Edge.org.