Genes can be scary stuff if you don’t understand them. In 1994, psychologist Richard Herrnstein and policy analyst Charles Murray warned in their bestselling book The Bell Curve that we live in an increasingly stratified world where the “cognitive elite”—those with the best genes—are more and more isolated from the cognitive/genetic underclass. “Genetic partitioning,” they called it. There was no mistaking their message:
“The irony is that as America equalizes the [environmental] circumstances of people’s lives, the remaining differences in intelligence are increasingly determined by differences in genes . . . Putting it all together, success and failure in the American economy, and all that goes with it, are increasingly a matter of the genes that people inherit.”
Stark and terrifying—and thankfully quite mistaken. The authors had fundamentally misinterpreted a number of studies, becoming convinced that roughly 60 percent of each person’s intelligence comes directly from his or her genes. But genes don’t work that way. “There are no genetic factors that can be studied independently of the environment,” explains McGill University’s Michael Meaney, one of the world’s leading experts on genes and development. “And there are no environmental factors that function independently of the genome. [A trait] emerges only from the interaction of gene and environment.”
While Herrnstein and Murray adhered to a particular ideological agenda, they also seem to have been genuinely hobbled in their analysis by a common misunderstanding of how genes work. We’ve all been taught that we inherit complex traits like intelligence straight from our parents’ DNA in the same way we inherit simple traits like eye color. This belief is continually reinforced by the popular media. As an illustration, USA Today recently explained heredity in this way:
“Think of your own genetic makeup as the hand of cards you were dealt at conception. With each conception in a family comes a new shuffling of the deck and a new hand. That’s partly why little Bobby sleeps through the night as a baby, always behaves and seems to love math, while brother Billy is colicky, never listens and already is the head of a gang in kindergarten.”
Genes dictate. Genes instruct. Genes determine. For more than a century, this has been the widely accepted explanation of how each of us becomes us. In his famous pea-plant experiments of the 1850s and ’60s, Gregor Mendel demonstrated that basic traits like seed shape and flower color were reliably passed from one generation to the next through dominant and recessive “heritable factors” (Mendel’s phrase, coined before the word “gene” was introduced). After eight years and 28,000 plants, Mendel had proved the existence of genes—and seemed to prove that genes alone determined the essence of who we are. Such was the unequivocal interpretation of early-20th-century geneticists.
That notion is with us still. “Genes set the stage,” affirms USA Today. The environment has an impact on all of our lives, to be sure, but genes come first; they set specific lower and upper limits of each person’s potential abilities. Where did your brother get that amazing singing voice? How did you get so tall? Why can’t I dance? How is she so quick with numbers?
“It’s in the genes,” we say.
That’s what The Bell Curve authors thought, too. None of these writers realized that over the last two decades Mendel’s ideas have been thoroughly upgraded—so much so that one large group of scientists now suggests that we wipe the slate clean and construct an entirely new understanding of genes.