No parent would want a teacher to teach a science course using a textbook from, say, 1910.
And, in 2110, no parent will want a teacher to teach science using today’s textbook.
In teaching science, then, to avoid a version of the presentist fallacy, it’s important to retain a sense for what is fact—and how we know it’s a fact—what is supposition or hypothesis, and what is plain unquestioned assumption (which, pejoratively, we may call superstition). And textbooks are generally not good at this; they too often present science not as a creative process, but as a finished product, as dogma.
Here's one example. If I had to vote for the greatest illusion or superstition of our age, I would say it’s the illusion that the brain thinks. (Not that you don't need a brain in order to think; you do. But it is you who think; the brain is an instrument. And a metaphor.)
Assume everything we know of the brain and neurons and neural activity is true—it isn’t, it can’t be, and there’s a lot we just don’t even pretend we know. Picture a vast network of electrochemical activity among neurons, impulses racing this way and that. Picture it down to the smallest activity of an ion across a membrane. Picture it in its trillion-connection complexity. And realize that, if you want to find, say, a thought or an emotion, there’s no there there.
Eventually, for instance, if only in a thought experiment, we could draw all the connections and interactions in a brain, down to the molecular, atomic, or subatomic level; give us a whiteboard large enough, sufficient time, and any tools we need. (Yes, indeterminacy and entanglement might make our comprehension impossible, but these just substitute scientific magic for the old-fashioned kind.)
Let us chart the “action potentials” of neural impulses, and the movement and effects of neurotransmitters. (Did you know that the impulse along an auditory nerve is the same—exactly the same—as the impulse along an optic nerve? Watching the impulse pass, you simply cannot tell whether this chemical activity relates to vision or to hearing; the impulse is void of quality. Why do we suppose that following it down the rabbit hole of complexity into which it vanishes will yield insight?) What we won’t find is a single thought, emotion, or memory, although—and even this is supposition—we may find the correlates or material or organizational traces of these. We may well be able to look at a configuration of neurons or molecules or particles—now represented in marker on our whiteboard—and correlate some brain configuration with some emotional state or thought, but it should be clear that the configuration is not and can never be an emotion or a thought or anything else that is fundamental to human experience or value.
(We can successfully perform this same thought experiment, however, with a computer. In the computer, if we know how to read the code, we can discover exactly what’s stored there, what’s calculated there, and so on. That’s because the computer isn’t conscious, isn’t thinking, isn’t feeling, isn’t anything but an apparatus, despite our science fiction fantasies. What’s in the computer is what we put there. The computer’s “activity” may yield startling and counter-intuitive results; it may “solve” problems of complexity beyond the reach of smart human beings in many lifetimes. But the thinking behind this work doesn’t reside in the machinery. It resides in the ingenuity and creativity of the programmers.
When we say that the brain is a computer—unless we are speaking metaphorically—we are not only making a category error; we have things exactly backward. One aspect or set of aspects of the brain is computer-like, but let’s not forget that minds and brains existed for a long, long time before the computer, and that the computer existed in the work and minds of those, like Charles Peirce and Charles Babbage, who imagined the computer before the technology existed that could bring it to reality.)
I’m not disputing the association and correlation of brain activity with thinking, perceiving, emoting, breathing, heart-beating, or running. You need a brain to do brain-associated things. A pianist needs a piano, and preferably a well-functioning one. But when we say, for example, that “the brain thinks,” we not only indulge a supposition that has never been demonstrated; we irrationally load what is essentially an electrochemical flowchart with impossible hopes, dreams, and assumptions.
The consequences of this thinking—or should I say, this lack of thinking, this assuming—are dire. Where, in fact, are consequence, morality or ethics, creativity, or humanity in this picture? The answer is, nowhere.
None of this is to argue against brain research. Far from it. Someone near to me is recovering from a closed head injury, and I am in awe of and grateful to the doctors and researchers who have helped with his recovery and helped us to understand what has occurred and what is occurring. I wish them all speed and good fortune in learning more and more about how the brain works and, when necessary, how to help it heal. But medicine is not (or not necessarily) meaning.
To bring this back to the education of students in a school, we have to acknowledge that we do not serve them well if we freight them with our superstitions, no matter how fervently we believe them.