This week my father preserved for me a series of editorials from the Wall Street Journal by Charles Murray, of The Bell Curve fame, arguing his thesis for the reality of g, which he identifies as an inherent and inherited “intelligence factor” that differentiates the smart from the dumb. Distribution of g in the population follows a normal, or “bell,” curve, and he points up many of the oft-touted depressing statistics of U.S. public education and explains them–convincingly, in my view–in terms of the normal statistical distribution of intelligence in our population. He revives the spectre of the IQ score, and although he acknowledges quibbles about the accuracy of the tools used to measure it, he also advocates its phenomenological legitimacy. He deals summarily with Gardner’s multiple intelligences theory, citing relatively convincing evidence that g is a real phenomenon and can’t be wished away by egalitarian reformers. He recognizes how the notion of uneven distribution of g chafes against our ideals of equality, and the political difficulties attendant on making policy decisions based on a worldview which is in this sense elitist.
I see Murray as one of a small but growing vocal minority of intellectuals who are prepared to acknowledge that human beings are in most meaningful ways determined by their genes. As biology and neurobiology advance, we come to understand more and more how even very complex human behaviors can be predicted genetically. This is certainly not the first time in history that a deterministic elitist movement has surfaced, but it may well prove to be the first time that the unpleasant awareness of genetic determinism is answered by an ethical technical solution. Before long, it seems obvious to any scientifically informed observer, biochemistry will allow human beings to achieve meaningful control of their genetic destinies, at which point a political battle will ensue between the forces that advocate non-intervention in genetic fate and those who recognize biochemical eugenics as an escape from determinism.
Brief meditation on human nature leads me to predict that the battle will be a short one. Voices in favor of accepting determinism–such as Murray et al.–run up against the ubiquitous phenomenological fact of choice: Whether it is real or not, human beings experience a process of decision making that causes them to behave as if they have some measure of control over their fates. Although most rational adults can be persuaded to admit, if pressured, that there are things in life over which they have no control, most of them would also prefer that it not be so. If offered a choice between the certainty of a brilliant and beautiful and happy child and the luck of the draw, which of us would leave it to fate?
Practical eugenic technology is not with us now, and may well not materialize until twenty years hence. Even if it takes that long, however, it still seems likely that we will find ourselves living with a technology that can correct our genes before we find ourselves living in a political culture prepared to accept that they determine our fates. In that most probable case, Murray’s arguments, though convincing, come too late on the scene. Even if we begin now to implement the policy regime he advocates, it’s likely that by the time reforms come into place the biology on which they are founded will be subject to the same socioeconomic pressures which corrupt the system now. Western culture has lived in denial of biological determinism for decades now, and in resentment of it for millennia–are we going now to give in and accept it on the very eve of our liberation? Better now to begin preparing for that future culture of eugenic control, to begin steering now toward Keillor’s Lake Wobegon, where all the children are above average, and away from Huxley’s Brave New World, where minds are manufactured to meet the demands of industry.