"WE'RE ON THE Verge Of profound changes in our ability
to manipulate the brain" says Paul Root Wolpe a bioethicist at the
University of Pennsylvania. He isn't kidding.
The dawning age of neuroscience promises not just
new treatments for Alzheimer's and other brain diseases but enhancements to
improve memory, boost intellectual acumen, and fine-tune our emotional
responses. "The next two decades will be the golden age of neuroscience"
Moreno, a bioethicist at the University of Virginia.
"We're on the threshold of the kind of rapid growth of information in
neuroscience that was true of genetics 15 years ago." One man's golden age
is another man's dystopia.
One of the more vociferous critics of such research
is Francis Fukuyama, who warns in his book "Our Posthuman Future" that "we
are already in the midst of this revolution" and "WE SHOULD USE THE POWER OF
THE STATE TO REGULATE IT" (emphasis his). In May a cover story in the
usually pro-technology Economist worried that "neuroscientists may soon be
able to screen people's brains to assess their mental health, to distribute
that information, possibly accidentally, to employers or insurers, and to
"fix" faulty personality traits with drugs or implants on demand."
There are good reasons to consider the ethics of
tinkering directly with the organ from which all ethical reflection arises.
Most of those reasons boil down to the need to respect the rights of the
people who would use the new technologies. Some of the field's moral issues
are common to all biomedical research: how to design clinical trials
ethically, how to ensure subjects' privacy, and so on. Others are peculiar
to neurology. It's not clear, for example, whether people suffering from
neurodegenerative disease can give informed consent to be experimented on.
Last May the Dana Foundation sponsored an entire
conference at Stanford on "neuroethics." Conferees deliberated over issues
like the moral questions raised by new brain scanning techniques, which some
believe will lead to the creation of truly effective lie detectors.
Participants noted that scanners might also be able to pinpoint brain
abnormalities in those accused of breaking the law, thus changing our
perceptions of guilt and innocence.
Most nightmarishly, some worried that governments
could one day use brain implants to monitor and perhaps even control citizens.
But most of the debate over neuroethics has not
centered around patients' or citizens' autonomy, perhaps because so many of
the field's critics themselves hope to restrict that autonomy in various
ways. The issue that most vexes them is the possibility that neuroscience
might enhance previously "normal" human brains.
The tidiest summation of their complaint comes from
the conservative columnist William Safire. "Just as we have antidepressants
today to elevate mood," he wrote after the Dana conference, "tomorrow we can
expect a kind of Botox for the brain to smooth out wrinkled temperaments, to
turn shy people into extroverts, or to bestow a sense of humor on a born
grouch. But what price will human nature pay for these nonhuman artifices?"
Truly effective neuropharmaceuticals that improve
moods and sharpen mental focus are already widely available and taken by
millions. While there is some controversy about the effectiveness of Prozac,
Paxil, and Zoloft, nearly 30 million Americans have taken them, with mostly positive results.
In his famous 1993 book Listening to Prozac, the
psychiatrist Peter Kramer describes patients taking the drug as feeling
"better than well." One Prozac user, called Tess, told him that when she
isn't taking the medication, "I am not myself."
One Pill Makes You Smarter...
That's exactly what worries Fukuyama, who thinks
Prozac looks a lot like Brave New World's soma. The pharmaceutical industry,
he declares, is producing drugs that "provide self-esteem in the bottle by
elevating serotonin in the brain." If you need a drug to be your "self,"
these critics ask, do you really have a self at all?
Another popular neuropharmaceutical is Ritalin, a
drug widely prescribed to remedy attention deficit hyperactivity disorder
(ADHD), which is characterized by agitated behavior and an inability to
focus on tasks.
Around 1.5 million schoolchildren take Ritalin,
which recent research suggests boosts the activity of the neurotransmitter
dopamine in the brain.
Like all psychoactive drugs, it is not without
controversy. Perennial psychiatric critic Peter Breggin argues that millions
of children are being "drugged into more compliant or submissive state[s]"
to satisfy the needs of harried parents and school officials. For Fukuyama,
Ritalin is prescribed to control rambunctious children because "parents and
teachers ... do not want to spend the time and energy necessary to
discipline, divert, entertain, or train difficult children the old-fashioned way."
Unlike the more radical Breggin, Fukuyama
acknowledges that drugs such as Prozac and Ritalin have helped millions when
other treatments have failed.
Still, he worries about their larger social
consequences. "There is a disconcerting symmetry between Prozac and
Ritalin," he writes. "The former is prescribed heavily for depressed women
lacking in self-esteem; it gives them more of the alpha-male feeling that comes
with high serotonin levels.
Ritalin, on the other hand, is prescribed largely
for young boys who do not want to sit still in class because nature never
designed them to behave that way. Together, the two sexes are gently nudged
toward that androgynous median personality, self-satisfied and socially
compliant, that is the current politically correct outcome in American society."
Although there are legitimate questions here,
they're related not to the chemicals themselves but to who makes the
decision to use them. Even if Prozac and Ritalin can help millions of
people, that doesn't mean schools should be able to force them on any
student who is unruly or bored.
But by the same token, even if you accept the most
radical critique of the drug -- that ADHD is not a real disorder to begin
with -- that doesn't mean Americans who exhibit the symptoms that add up to
an ADHD diagnosis should not be allowed to alter their mental state
chemically, if that's an outcome they want and a path to it they're willing to take.
Consider Nick Megibow, a senior majoring in
philosophy at Gettysburg College. "Ritalin made my life a lot better," he
reports. "Before I started taking Ritalin as a high school freshman, I was
doing really badly in my classes.
I had really bad grades, Cs and Ds mostly. By
sophomore year, I started taking Ritalin, and it really worked amazingly. My
grades improved dramatically to mostly As and Bs. It allows me to focus and
get things done rather than take three times the amount of time that it
should take to finish something." If people like Megibow don't share
Fukuyama's concerns about the wider social consequences of their medication,
it's because they're more interested, quite reasonably, in feeling better
and living a successful life.
What really worries critics like Safire and Fukuyama
is that Prozac and Ritalin may be the neuropharmacological equivalent of
bearskins and stone axes compared to the new drugs that are coming.
Probably the most critical mental function to be
enhanced is memory. And this, it turns out, is where the most promising work
is being done. At Princeton, biologist Joe Tsien's laboratory famously
created smart mice by genetically modifying them to produce more NMDA brain
receptors, which are critical for the formation and maintenance of memories.
Tsien's mice were much faster learners than their
unmodified counterparts. "By enhancing learning, that is, memory
acquisition, animals seem to be able to solve problems faster," notes Tsien.
He believes his work has identified an important target that will lead other
researchers to develop drugs that enhance memory.
A number of companies are already hard at work
developing memory drugs. Cortex Pharmaceuticals has developed a class of
compounds called AMPA receptor modulators, which enhance the glutamate-based
transmission between brain cells.
Preliminary results indicate that the compounds do
enhance memory and cognition in human beings.
Memory Pharmaceuticals, co-founded by Nobel laureate
Eric Kandel, is developing a calcium channel receptor modulator that
increases the sensitivity of neurons and allows them to transmit information
more speedily and a nicotine receptor modulator that plays a role in
synaptic plasticity. Both modulators apparently improve memory.
Another company, Targacept, is working on the
nicotinic receptors as well. All these companies hope to cure the memory
deficits that some 30 million baby boomers will suffer as they age. If these
compounds can fix deficient memories, it is likely that they can enhance
normal memories as well.
Tsien points out that a century ago the encroaching
senility of Alzheimer's disease might have been considered part of the
"normal" progression of aging. "So it depends on how you define normal," he
says. "Today we know that most people have less good memories after age 40,
and I don't believe that's a normal process."
And so we face the prospect of pills to improve our
mood, our memory, our intelligence, and perhaps more. Why would anyone
object to that?
Eight objections to such enhancements recur in
neuroethicists' arguments. None of them is really convincing.
1. Neurological enhancements permanently change the brain.
Erik Parens of the Hastings Center, a bioethics
think tank, argues that it's better to enhance a child's performance by
changing his environment than by changing his brain -- that it's better to,
say, reduce his class size than to give him Ritalin. But this is a false dichotomy.
Reducing class size is aimed at changing the child's
biology too, albeit indirectly. Activities like teaching are supposed to
induce biological changes in a child's brain, through a process called learning.
Fukuyama falls into this same error when he suggests
that even if there is some biological basis for their condition, people with
ADHD "clearly ... can do things that would affect their final degree of
attentiveness or hyperactivity. Training in character, determination, and
environment more generally would all play important roles."
So can Ritalin, and much more expeditiously, too.
"What is the difference between Ritalin and the Kaplan SAT review?" asks the
Dartmouth neuroscientist Michael Gazzaniga. "It's six of one and a half
dozen of the other.
If both can boost SAT scores by, say, 120 points, I
think it's immaterial which way it's done."
2. Neurological enhancements are anti-egalitarian. A
perennial objection to new medical technologies is the one Parens calls
"unfairness in the distribution of resources." in other words, the rich and
their children will get access to brain enhancements first, and will thus
acquire more competitive advantages over the poor.
This objection rests on the same false dichotomy as
the first. As the University of Virginia's Moreno puts it, "We don't stop
people from giving their kids tennis lessons." If anything, the new
enhancements might increase social equality.
Moreno notes that neuropharmaceuticals are likely to
be more equitably distributed than genetic enhancements, because "after all,
a pill is easier to deliver than DNA."
3. Neurological enhancements are self-defeating. Not
content to argue that the distribution of brain enhancements won't be
egalitarian enough, some critics turn around and argue that it will be too
egalitarian. Parens has summarized this objection succinctly: "if everyone
achieved the same relative advantage with a given enhancement, then
ultimately no one's position would change; the enhancement would have failed
if its purpose was to increase competitive advantage."
This is a flagrant example of the zero-sum approach
that afflicts so much bioethical thought. Let's assume, for the sake of
argument, that everyone in society will take a beneficial brain-enhancing
drug. Their relative positions may not change,
but the overall productivity and wealth of society
would increase considerably, making everyone better off. Surely that is a social good.
4. Neurological enhancements are difficult to refuse.
Why exactly would everyone in the country take the
same drug? Because, the argument goes, competitive pressures in our go-go
society will be so strong that a person will be forced to take
memory-enhancing drugs just to keep up with everyone else. Even if the law
protects freedom of choice, social pressures will draw us in.
For one thing, this misunderstands the nature of the
technology. It's not simply a matter of popping a pill and suddenly zooming
ahead. "I know a lot of smart people who don't amount to a row of beans,"
says Gazzaniga. "They're just happy underachieving, living life below their
potential. So a pill that pumps up your intellectual processing power
won't necessarily give you the drive and ambition to use it."
Beyond that, it's not as though we don't all face
competitive pressures anyway -- to get into and graduate from good
universities, to constantly upgrade skills, to buy better computers and more
productive software, whatever.
Some people choose to enhance themselves by getting
a Ph.D. in English; others are happy to stop their formal education after
high school. It's not clear why a pill should be more irresistible than
higher education, or why one should raise special ethical concerns while the
other does not.
5. Neurological enhancements undermine good character.
For some critics, the comparison to higher education
suggests a different problem. We should strive for what we get, they
suggest; taking a pill to enhance cognitive functioning is just too easy. As
Fukuyama puts it: "The normal, and morally acceptable, way of overcoming low
self-esteem was to struggle with oneself and with others, to work hard, to
endure painful sacrifices, and finally to rise and be seen as having done so."
"By denying access to brain-enhancing drugs, people
like Fukuyama are advocating an exaggerated stoicism," counters Moreno. "I
don't see the benefit or advantage of that kind of tough love." Especially
since there will still be many different ways to achieve things and many
difficult challenges in life. Brain-enhancing drugs might ease some of our
labors, but as Moreno notes, "there are still lots of hills to climb, and
they are pretty steep" Cars, computers, and washing machines have
tremendously enhanced our ability to deal with formerly formidable tasks.
That doesn't mean life's struggles have disappeared -- just that we can now
tackle the next ones.
6. Neurological enhancements undermine personal
responsibility. Carol Freedman, a philosopher at Williams College, argues
that what is at stake "is a conception of ourselves as responsible agents,
not machines." Fukuyama extends the point, claiming that "ordinary people"
are eager to "medicalize as much of their behavior as possible and thereby
reduce their responsibility for their own actions." As an example, he
suggests that people who claim to suffer from ADHD "want to absolve
themselves of personal responsibility."
But we are not debating people who might use an ADHD
diagnosis as an excuse to behave irresponsibly. We are speaking of people
who use Ritalin to change their behavior. Wouldn't it be more irresponsible
of them to not take corrective action?
7. Neurological enhancements enforce dubious norms.
There are those who assert that corrective action
might be irresponsible after all, depending on just what it is that you're
trying to correct. People might take neuropharmaceuticals, some warn, to
conform to a harmful social conception of normality.
Many bioethicists -- Georgetown University's
Margaret Little, for example -- argue that we can already see this process in
action among women who resort to expensive and painful cosmetic surgery to
conform to a social ideal of feminine beauty. Never mind for the moment that
beauty norms for both men and women have never been so diverse.
Providing and choosing to avail oneself of that
surgery makes one complicit in norms that are morally wrong, the critics
argue. After all, people should be judged not by their physical appearances
but by the content of their characters.
That may be so, but why should someone suffer from
society's slights if she can overcome them with a nip here and a tuck there?
The norms may indeed be suspect, but the suffering is experienced by real
people whose lives are consequently diminished. Little acknowledges this
point, but argues that those who benefit from using a technology to conform
have a moral obligation to fight against the suspect norm. Does this mean
people should be given access to technologies they regard as beneficial only
if they agree to sign on to a bioethical fatwa?
Of course, we should admire people who challenge
norms they disagree with and live as they wish, but why should others be
denied relief just because some bioethical commissars decree that society's
misdirected values must change?
Change may come, but real people should not be
sacrificed to some restrictive bioethical utopia in the meantime. Similarly,
we should no doubt value depressed people or people with bad memories just
as highly as we do happy geniuses, but until that glad day comes people
should be allowed to take advantage of technologies that improve their lives
in the society in which they actually live.
Furthermore, it's far from clear that everyone will
use these enhancements in the same ways. There are people who alter their
bodies via cosmetic surgery to bring them closer to the norm, and there are
people who alter their bodies via piercings and tattoos to make them more
individually expressive. It doesn't take much imagination to think of
unusual or unexpected ways that Americans might use mind-enhancing technologies.
Indeed, the war on drugs is being waged, in part,
against a small but significant minority of people who prefer to alter their
consciousness in socially disapproved ways.
8. Neurological enhancements make us inauthentic.
Parens and others worry that the users of brain-altering chemicals are less
authentically themselves when they're on the drug. Some of them would reply
that the exact opposite is the case. In "Listening to Prozac", Kramer
chronicles some dramatic transformations in the personalities and attitudes
of his patients once they're on the drug. The aforementioned Tess
tells him it was "as if I had been in a drugged state all those years and
now I'm clearheaded?"
Again, the question takes a different shape when one
considers the false dichotomy between biological and "nonbiological"
enhancements. Consider a person who undergoes a religious conversion and
emerges from the experience with a more upbeat and attractive personality.
Is he no longer his "real" self? Must every religious convert be
deprogrammed? Even if there were such a thing as a "real" personality, why
should you stick with it if you don't like it? If you're socially withdrawn
and a pill can give you a more vivacious and outgoing manner, why not go for it?
After all, you're choosing to take responsibility
for being the "new" person the drug helps you to be. Authenticity and
Responsibility "Is it a drug-induced personality or has the drug cleared
away barriers to the real personality?" asks the University of
Surely the person who is choosing to use the drug is
in a better position to answer that question than some bioethical busybody.
This argument over authenticity lies at the heart of the neuroethicists'
objections. If there is a single line that divides the supporters of
neurological freedom from those who would restrict the new treatments, it is
the debate over whether a natural state of human being exists and, if so,
how appropriate it is to modify it. Wolpe makes the point that in one sense
cognitive enhancement resembles its opposite, Alzheimer's disease.
A person with Alzheimer's loses her personality.
Similarly, an enhanced individual's personality may become unrecognizable to
those who knew her before. Not that this is unusual.
Many people experience a version of this process
when they go away from their homes to college or the military. They return
as changed people with new capacities, likes, dislikes, and social styles,
and they often find that their families and friends no longer relate to them
in the old ways.
Their brains have been changed by those experiences,
and they are not the same people they were before they went away. Change
makes most people uncomfortable, probably never more so than when it happens
to a loved one.
Much of the neuro-Luddites' case rests on a belief
in an unvarying, static personality, something that simply doesn't exist.
It isn't just personality that changes over time. Consciousness itself is
far less static than we've previously assumed, a fact that raises
contentious questions of free will and determinism. Neuroscientists are
finding more and more of the underlying automatic processes operating in the
brain, allowing us to take a sometimes disturbing look under our own hoods.
"We're finding out that by the time we're conscious of doing something, the
brain's already done it," explains Gazzaniga.
Consciousness, rather than being the director of our
activities, seems instead to be a way for the brain to explain to itself why
it did something.
Haunting the whole debate over neuroscientific
research and neuroenhancements is the fear that neuroscience will undercut
notions of responsibility and free will. Very preliminary research has
suggested that many violent criminals do have altered brains.
At the Stanford conference, Science editor Donald
Kennedy suggested that once we know more about brains, our legal system will
have to make adjustments in how we punish those who break the law. A
murderer or rapist might one day plead innocence on the grounds that "my
amygdala made me do it." There is precedent for this: The legal system
already mitigates criminal punishment when an offender can convince a jury
he's so mentally ill that he cannot distinguish right from wrong.
Of course, there are other ways such discoveries
might pan out in the legal system, with results less damaging to social
order but still troubling for notions of personal autonomy.
One possibility is that an offender's punishment
might be reduced if he agrees to take a pill that corrects the brain defect
he blames for his crime.
We already hold people responsible when their drug
use causes harm to others -- most notably, with laws against drunk driving.
Perhaps in the future we will hold people responsible if they fail to take
drugs that would help prevent them from behaving in harmful ways. After all,
which is more damaging to personal autonomy, a life confined to a jail cell
or roaming free while taking a medication?
The philosopher Patricia Churchland examines these
conundrums in her forthcoming book, Brain-Wise: Studies in Neurophilosophy.
"Much of human social life depends on the expectation that agents have
control over their actions and are responsible for their choices," she
writes. "In daily life it is commonly assumed that it is sensible to punish
and reward behavior so long as the person was in control and chose knowingly."
And that's the way it should remain, even as we
learn more about how our brains work and how they sometimes break down.
Churchland points out that neuroscientific research by scientists like the
University of Iowa's Antonio Damasio strongly shows that emotions are an
essential component of viable practical reasoning about what a person
should do. In other words, neuroscience is bolstering philosopher David
Hume's insight that "reason is and ought only to be the slave of the
passions." Patients whose affects are depressed or lacking due to brain
injury are incapable of judging or choosing between courses of action.
Emotion is what prompts and guides our choices.
Churchland further argues that moral agents come to be morally and
practically wise not through pure cognition but by developing moral beliefs
and habits through life experiences. Our moral reflexes are honed through
watching and hearing about which actions are rewarded and which are
punished; we learn to be moral the same way we learn language.
Consequently, Churchland concludes, "the default
presumption that agents are responsible for their actions is
empirically necessary to an agent's learning, both emotionally and
cognitively, how to evaluate the consequences of certain events and the
price of taking risks." It's always risky to try to derive an "ought" from
an "is," but neuroscience seems to be implying that liberty -- i.e., letting
people make choices and then suffer or enjoy the consequences -- is
essential for inculcating virtue and maintaining social cooperation. Far
from undermining personal responsibility, neuroscience may end up strengthening it.
For Neurological Liberty
Fukuyama wants to "draw red lines" to distinguish
between therapy and enhancement, "directing research toward the former while
putting restrictions on the latter." He adds that "the original purpose of
medicine is, after all, to heal the sick, not turn healthy people into gods."
He imagines a federal agency that would oversee neurological research,
prohibiting anything that aims at enhancing our capacities beyond some
notion of the human norm. "For us to flourish as human beings, we have to
live according to our nature, satisfying the deepest longings that we as
natural beings have"
Fukuyama told the Christian review Books & Culture
last summer. "For example, our nature gives us tremendous cognitive
capabilities, capability for reason, capability to learn, to teach ourselves
things, to change our opinions, and so forth.
What follows from that? A way of life that permits
such growth is better than a life in which this capacity is shriveled and
stunted in various ways." This is absolutely correct. The trouble is that
Fukuyama has a shriveled, stunted vision of human nature, leading him and
others to stand athwart neuroscientific advances that will make it possible
for more people to take fuller advantage of their reasoning and learning capabilities.
Like any technology, neurological enhancements can
be abused, especially if they're doled out -- or imposed -- by an unchecked
authority. But Fukuyama and other critics have not made a strong case for
why individuals, in consultation with their doctors, should not be allowed
to take advantage of new neuroscientific breakthroughs to enhance the
functioning of their brains.
And it is those individuals that the critics will
have to convince if they seriously expect to restrict this research. It's
difficult to believe that they'll manage that. In the 1960s many states
outlawed the birth control pill, on the grounds that it would be too
disruptive to society.
Yet Americans, eager to take control of their
reproductive lives, managed to roll back those laws, and no one believes
that the pill could be re-outlawed today. Moreno thinks the same will be
true of the neurological advances to come.
"My hunch," he says, "is that in the United States,
medications that enhance our performance are not going to be prohibited."
When you consider the sometimes despairing tone that Fukuyama and others
like him adopt, it's hard not to conclude that on that much, at least, they agree.
Ronald Bailey is Reason's science correspondent
and the editor of "Global Warming and Other Eco-Myths: How the Environmental
Movement Uses False Science to Scare Us to Death" (Prima Publishing).