Saturday, June 4, 2016

Your brain is not a computer

Below we are reposting the opening paragraphs of (and a link to) an article called “The empty brain” from Aeon, surely one of the best science journals on the web.[1] This article is by Robert Epstein, former editor of Psychology Today, and is a direct and refreshing attack on a deeply misguided view of the brain that is ubiquitous both in science (especially in the dubious field of neuroscience) and educated public opinion. It's the view that the brain is a computer. But as the article's subtitle states (and as the article goes on to demonstrate with some ingenious examples): “Your brain does not process information, store memories or retrieve knowledge.” If this statement startles you, it's only a measure of how steeped we all are in what Epstein rightly calls “shoddy thinking” about the brain. Of course it isn't really Epstein's contention that the brain is “empty” - the title is meant to be provocative and to underscore that what we normally think of as happening inside the brain is just an EMPTY METAPHOR.

Let me add a couple of points that I think relate to Epstein's excellent essay. First, there is “shoddy thinking” in science not just about the brain but even more so about the mind. The last couple of decades have seen the rise of 'neuroscience' in university science departments, often with extravagant funding. But this field is premised on a deeply misguided philosophical reductionism, which claims to understand the mind as 'brain circuitry'. Neuroscience was supposed to unlock the secrets of consciousness, but it hasn't and there is no reason to believe it ever will. Marxists who pay attention to philosophical issues will know that leading thinkers in the classical tradition like Engels (Dialectics of Nature) and Trotsky (Philosophical Notebooks) anticipated that the narrow empiricism of the natural sciences would eventually lead to just such a blind alley. I don't think it's far-fetched to imagine that in coming decades and centuries, neuroscience will take its place alongside phrenology in the catalogue of pseudo-sciences. It is perhaps not coincidental that Epstein approvingly cites the dialectical biologist Steven Rose, whose book, The Future of the Brain, champions the irreducible uniqueness of every individual brain and argues that this organ is intimately tied to the history of each individual including their social history. [2]

Second, I think the problems Epstein details with the brain-as-computer metaphor raise big questions about the viability of Artificial Intelligence, another field now much in vogue in academic science. If the brain isn't a computer, then using algorithms and microchips to produce intelligence is never going to work. A computer computes but it doesn't think. It mechanically adds 7 + 5 but it DOESN'T UNDERSTAND 12. To understand means to relate something to lived experience. A computer has no such experience and never will. Computers are wonderful slaves and will become ever better at being that. In a world whose organizing principle is human need rather than private profit, computer technology will eliminate much of the drudgery of work (rather than, as at present, compounding it). But massive databases and electronic speed are not the same as thinking. Turn Descartes on his head and you get to the gist of the matter: I am, therefore I think.

Frank Brenner


No matter how hard they try, brain scientists and cognitive psychologists will never find a copy of Beethoven’s 5th Symphony in the brain – or copies of words, pictures, grammatical rules or any other kinds of environmental stimuli. The human brain isn’t really empty, of course. But it does not contain most of the things people think it does – not even simple things such as ‘memories’.
Our shoddy thinking about the brain has deep historical roots, but the invention of computers in the 1940s got us especially confused. For more than half a century now, psychologists, linguists, neuroscientists and other experts on human behaviour have been asserting that the human brain works like a computer.
To see how vacuous this idea is, consider the brains of babies. Thanks to evolution, human neonates, like the newborns of all other mammalian species, enter the world prepared to interact with it effectively. A baby’s vision is blurry, but it pays special attention to faces, and is quickly able to identify its mother’s. It prefers the sound of voices to non-speech sounds, and can distinguish one basic speech sound from another. We are, without doubt, built to make social connections.
A healthy newborn is also equipped with more than a dozen reflexes – ready-made reactions to certain stimuli that are important for its survival. It turns its head in the direction of something that brushes its cheek and then sucks whatever enters its mouth. It holds its breath when submerged in water. It grasps things placed in its hands so strongly it can nearly support its own weight. Perhaps most important, newborns come equipped with powerful learning mechanisms that allow them to change rapidly so they can interact increasingly effectively with their world, even if that world is unlike the one their distant ancestors faced.
Senses, reflexes and learning mechanisms – this is what we start with, and it is quite a lot, when you think about it. If we lacked any of these capabilities at birth, we would probably have trouble surviving.
But here is what we are not born with: information, data, rules, software, knowledge, lexicons, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols, or buffers – design elements that allow digital computers to behave somewhat intelligently. Not only are we not born with such things, we also don’t develop them – ever.
We don’t store words or the rules that tell us how to manipulate them. We don’t create representations of visual stimuli, store them in a short-term memory buffer, and then transfer the representation into a long-term memory device. We don’t retrieve information or images or words from memory registers. Computers do all of these things, but organisms do not.

[2] For a discussion of the work of Steven Rose and the other dialectical biologists, see our essay "A defense of positivism in the guise of a defense of science".

1 comment:

Mark said...

That's an interesting topic, and I agree with the author that organisms are not computers. Although I am not qualified to comment on the state of neuroscience or the degree to which it is encumbered with computer-like analogies, I just want to comment on how the understanding of the brain has helped computers and what might be called artificial intelligence.

One example the author cites is that although we can recognize a dollar bill, there is no literal representation of the dollar bill in our heads. It is actually the same with modern image and facial recognition technology. A dollar bill may be represented as a composite of many images of dollar bills "experienced" (captured by means of a digitizer) by the image processor (an algorithm); that composite is then used in turn to recognize a dollar bill upon a further encounter. The representation might even be used to produce an image that outlines the main features of a dollar bill without literally reproducing every detail, as a photograph would.
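The composite-template idea described above can be sketched in a few lines of Python. This is only a toy illustration of the principle (averaging many examples into one template, then matching by distance); it is not how any production face or image recognizer actually works, and the "images" here are just made-up short pixel lists:

```python
import math

def composite_template(images):
    """Average many example 'images' (equal-length pixel lists)
    into a single composite representation."""
    n = len(images)
    return [sum(px) / n for px in zip(*images)]

def distance(a, b):
    """Euclidean distance between two pixel vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognize(image, templates):
    """Label the image with the nearest composite template."""
    return min(templates, key=lambda label: distance(image, templates[label]))

# Toy data: 4-pixel 'images' of two hypothetical classes.
dollar_examples = [[0.9, 0.1, 0.8, 0.2], [1.0, 0.0, 0.7, 0.3]]
coin_examples = [[0.1, 0.9, 0.2, 0.8], [0.0, 1.0, 0.3, 0.7]]

templates = {
    "dollar": composite_template(dollar_examples),
    "coin": composite_template(coin_examples),
}

print(recognize([0.85, 0.15, 0.75, 0.25], templates))  # prints "dollar"
```

Note that the composite is nothing like a stored copy of any one bill: it is a statistical blur of many encounters, which is precisely the commenter's point.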

Of course it's "shoddy thinking" to believe that because a computer can recognize faces or other objects, humans must do "image processing", or that the brain is limited to processing "information" (electrical signals produced by the nervous system). This conjecture (unproven) raises a whole host of issues spanning biology, the physical nature of reality, and philosophy that can't be examined in a brief comment. The arguments of John Searle might be a good starting place to examine this and other conjectures.