This fall, Princeton University Press will publish Harden’s book, “The Genetic Lottery: Why DNA Matters for Social Equality,” which attempts to reconcile the findings of her field with her commitments to social justice. As she writes, “Yes, the genetic differences between any two people are tiny when compared to the long stretches of DNA coiled in every human cell. But these differences loom large when trying to understand why, for example, one child has autism and another doesn’t; why one is deaf and another hearing; and—as I will describe in this book—why one child will struggle with school and another will not. Genetic differences between us matter for our lives. They cause differences in things we care about. Building a commitment to egalitarianism on our genetic uniformity is building a house on sand.”
Harden understands herself to be waging a two-front campaign. On her left are those inclined to insist that genes don’t really matter; on her right are those who suspect that genes are, in fact, the only things that matter. The history of behavior genetics is the story of each generation’s attempt to chart a middle course. When the discipline first began to coalesce, in the early nineteen-sixties, the memory of Nazi atrocities rendered the eugenics threat distinctly untheoretical. The reigning model of human development, which seemed to accord with postwar liberal principles, was behaviorism, with its hope that environmental manipulation could produce any desired outcome. It did not take much, however, to notice that there is considerable variance in the distribution of human abilities. The early behavior geneticists started with the premise that our nature is neither perfectly fixed nor perfectly plastic, and that this was a good thing. They conscripted as their intellectual patriarch the Russian émigré Theodosius Dobzhansky, an evolutionary biologist who was committed to anti-racism and to the conviction that “genetic diversity is mankind’s most precious resource, not a regrettable deviation from an ideal state of monotonous sameness.”
The field’s modern pioneers were keen to establish that their interest lay in academic questions, and they prioritized the comparatively clement study of animals. In 1965, John Paul Scott and John L. Fuller reported that, despite the discernible genetic differences among dog breeds, there did not seem to be categorical distinctions that might allow one to conclude that, say, German shepherds were smarter than Labradors. The most important variations occurred on an individual level, and environmental conditions were as important as innate qualities, if not more so.
This era of comity did not last long. In 1969, Arthur Jensen, a respected psychologist at Berkeley, published an article called “How Much Can We Boost IQ and Scholastic Achievement?” in the Harvard Educational Review. Jensen coolly argued that there was an I.Q. gap between the races in America; that the reason for this gap was at least partly genetic, and thus, unfortunately, immutable; and that policy interventions were unlikely to thwart the natural hierarchy. The Jensen affair, which extended for more than a decade, prefigured the publication of “The Bell Curve”: endless public debate, student protests, burned effigies, death threats, accusations of intellectual totalitarianism. As Aaron Panofsky writes in “Misbehaving Science,” a history of the discipline, “Controversies wax and wane, sometimes they emerge explosively, but they never really resolve and always threaten to reappear.”
The problem was that most of Jensen’s colleagues agreed with some of his basic claims: it did seem that there was something akin to “general intelligence” in humans, that it could be meaningfully measured with I.Q. tests, and that genetic inheritance had a good deal to do with it. Critics quickly pointed out that the convoluted social pathways that led from genes to complex traits rendered any simple notion of genetic “causation” silly. In 1972, Christopher Jencks, a sociologist at Harvard, proposed the thought experiment of a country in which red-haired children were prevented from going to school. One might anticipate that such children would demonstrate a weaker reading ability, which, because red hair is genetic in origin, would be conspicuously linked to their genes—and would, in some bizarre sense, be “caused” by them.
Richard Lewontin, a geneticist and a staunch egalitarian, developed a different analogy. Imagine a bag of seed corn. If you plant one handful in nutrient-poor soil, and another in rich loam, there will be a stark difference in their average stalk height, irrespective of any genetic predisposition. (There will also be greater “inequality” among the well-provisioned plants; perhaps counterintuitively, the more uniformly beneficial the climate, the more pronounced the effects of genetic difference.) Jensen’s racial comparison was thus unwarranted and invidious: it was absurd to think, in the America of 1969, that different races enjoyed equally bountiful circumstances.
Behavior geneticists emphasized that their own studies showed that poorer children adopted by wealthy families saw substantial gains in average I.Q. This finding, it later emerged, obtained on a societal basis as well. The scholar James Flynn found that, for reasons that are not entirely understood, the average I.Q. of a population increases significantly over time: most people living a hundred years ago, were they given contemporary I.Q. tests, would easily have qualified as what early psychometricians called, with putative technical precision, “morons” or “imbeciles.” Such tests might be measuring something real, but whatever it is cannot be considered “purely” biological or inflexible.
Our ability to remediate genetic differences was thus a separate moral question. In 1979, the economist Arthur Goldberger published a mordant rejoinder to social conservatives who argued that genetic differences rendered the welfare apparatus supererogatory. “In the same vein, if it were shown that a large proportion of the variance in eyesight were due to genetic causes, then the Royal Commission on the Distribution of Eyeglasses might well pack up,” he wrote. Just because outcomes might be partly genetic didn’t mean that they were inevitable.
As twin studies proliferated throughout the nineteen-eighties, their results contributed to substantial changes in our moral intuitions. When schizophrenia and autism, for example, turned out to be largely heritable, we no longer blamed these disorders on cold or inept mothers. But, for such freighted traits as intelligence, liberals remained understandably anxious and continued to insist that differences—not just on a group level but on an individual one—were merely artifacts of an unequal environment. Conservatives pointed out that an à-la-carte approach to scientific findings was intellectually incoherent.
In 1997, Turkheimer, perhaps the preëminent behavior geneticist of his generation, published a short political meditation called “The Search for a Psychometric Left,” in which he called upon his fellow-liberals to accept that they had nothing to fear from genes. He proposed that “a psychometric left would recognize that human ability, individual differences in human ability, measures of human ability, and genetic influences on human ability are all real but profoundly complex, too complex for the imposition of biogenetic or political schemata. It would assert that the most important difference between the races is racism, with its origins in the horrific institution of slavery only a very few generations ago. Opposition to determinism, reductionism and racism, in their extreme or moderate forms, need not depend on blanket rejection of undeniable if easily misinterpreted facts like heritability.” He concluded, “Indeed it had better not, because if it does the eventual victory of the psychometric right is assured.”
Having endured the summer of 2020 trapped indoors in the oppressive Austin heat, Harden was grateful for an invitation to spend this past June at Montana State University, in Bozeman. A recent influx of out-of-town wealth had accelerated during the pandemic, and the town’s industrial fixtures had been ruthlessly spruced up to suit the needs of remote knowledge workers. Harden, who has moss-colored eyes, a wry smile, and an earnest nonchalance, met me at a coffee shop that looked as though it had been airlifted that morning from San Francisco. She wore a soft flannel shirt, faded stone-washed jeans, and dark Ray-Ban sunglasses. The air was hot and dry, but Harden is the sort of person who seems accompanied by a perpetual breeze. “ ‘The Bell Curve’ came out when I was twelve years old, and somehow that’s still what people are talking about,” she said. “There’s a new white dude in every generation who gets famous talking about this.” Virtually every time Harden gives a presentation, someone asks about “Gattaca,” the 1997 movie about a dystopia structured by genetic caste. Harden responds that the life of a behavior geneticist resembles a different nineties classic: “Groundhog Day.”
Harden was raised in a conservative environment, and though she later rejected much of her upbringing, she has maintained a convert’s distrust of orthodoxy. Her father’s family were farmers and pipeline workers in Texas, and her grandparents—Pentecostalists who embraced faith healing and speaking in tongues—were lifted out of extreme poverty by the military. “It was the classic tale of the government’s deliberate creation of a white middle class,” she said. Her father served as a Navy pilot, then took a job flying for FedEx, and Harden and her brother grew up in an exurb of Memphis. Harden scandalized her Christian high school when, at fifteen, she wrote a term paper about “The Bell Jar.” She has not recapitulated the arc of her parents’ lives. “They’re still very religious—very suspicious of the mainstream media, secular universities, secular anything, which has accelerated in the Trump years.”
Harden’s parents insisted that she stay in the South for college, and Furman University, a formerly Baptist college in South Carolina, gave her a full scholarship based on her near-perfect SAT scores. She received paid summer fellowships in rodent genetics, and found that she preferred the grunt work of the lab bench to the difficult multitasking required by the jobs in waitressing and retail to which she was accustomed. She only later realized that the point of the program was to draw students from underrepresented backgrounds into science. At twenty, she applied to graduate school in clinical psychology. Her father’s only comment was “I was afraid you were going to say that.” She was rejected almost everywhere, but Turkheimer, noting her lab experience and her exceptionally high quantitative G.R.E. scores, invited her for an interview. She wore a new Ann Taylor suit and he wore Tevas. Turkheimer’s e-mail avatar is the Greek letter psi, for “psychology,” set against the Grateful Dead logo; he offered her admission on the condition that she stop calling him “sir.”
Her experiences as an apprentice scientist were only part of the reason that she grew disillusioned with evangelicalism: “There was this incredible post-9/11 nationalism—flags on the altar next to crosses—that infected my church to a point that felt immoral and gross. Sometimes I feel like I sat through eleven years of Christian school and absorbed all the things they didn’t intend for me to absorb. I thought we were following a social-justice ethos in which the meek shall inherit the earth, and I must’ve missed the track that was the run-up to the Iraq War.” Turkheimer recommended a local psychoanalyst, who, Harden said, took her on as a “charity case.”
It might have seemed peculiar that a behavior geneticist was recommending analytic treatment, but Turkheimer had long been known for his belief that biological explanations for behavior were unlikely ever to supplant cultural and psychological ones. Turkheimer’s longtime rival, the prolific researcher Robert Plomin, believed otherwise, predicting that we would one day achieve molecular-level purchase on what makes people who they are. Turkheimer associated himself with what Plomin lamented as “the gloomy prospect”—the notion that the relevant processes were too messy and idiosyncratic to be fixed under glass. The prospect was gloomy, Turkheimer said, only from the perspective of a social scientist. As a person, he had a more sanguine view: “In the long run, the gloomy prospect always wins, and no one would want to live in a world where it did not.”
This did not mean that behavior genetics was useless, only that it required a modest perspective on what could be achieved: twin studies might never explain how a given genotype made someone more likely to be depressed, but they could help avoid the kind of mistaken inference that blamed bad parenting. Harden’s work in Turkheimer’s lab remained squarely within this tradition. For example, the state of Texas spent a lot of money on school programs to promote sexual abstinence, on the basis of research that showed a correlation between adolescent sexuality and subsequent antisocial behavior. Harden used a twin study to demonstrate that a twin who began having sex early showed no greater likelihood of engaging in risky behavior than her twin who had abstained. In other words, both behaviors might be the expression of some underlying predisposition, but no causal arrow could be drawn. She did similar work to show that the idea of “peer pressure” as a driver of adolescent substance abuse was, at best, a radical oversimplification of an extremely complex transactional dynamic between genes and environment.