Does America Still Have What It Takes?

Why the American spirit of innovation is in trouble, and what culture has to do with it.
A construction worker connects two cables suspended high above New York City during the construction of the Empire State Building, 1931. Photo credit: Lewis W. Hine/George Eastman House/Getty Images.
Charles Murray
April 1, 2014

Some years ago, I conducted an ambitious research project to document and explain patterns of human accomplishment across time and cultures. My research took me from 800 BCE, when Homo sapiens’ first great surviving works of thought appeared, to 1950, my cut-off date for assessing lasting influence. I assembled world-wide inventories of achievements in physics, biology, chemistry, geology, astronomy, mathematics, medicine, and technology, plus separate inventories of Western, Chinese, and Indian philosophy; Western, Chinese, and Japanese art; Western, Arabic, Chinese, Indian, and Japanese literature; and Western music. These inventories were analyzed using quantitative techniques alongside standard qualitative historical analysis. The result was Human Accomplishment: The Pursuit of Excellence in the Arts and Sciences (2003).

My study confirmed important patterns. Foremost among them is that human achievement has clustered at particular times and places, including Periclean Athens, Renaissance Florence, Sung China, and Western Europe of the Enlightenment and the Industrial Revolution. But why? What was special about those times and places? In the book’s final chapters, I laid out my best understanding of the environment within which great accomplishment occurs.

In what follows, I want to conduct an inquiry into the ways in which the environment of achievement in early 21st-century America corresponds or fails to correspond to the patterns of the past. As against pivotal moments in the story of human accomplishment, does today’s America, for instance, look more like Britain blooming at the end of the 18th century or like France fading at the end of the 19th century? If the latter, are there idiosyncratic features of the American situation that can override what seem to be longer-run tendencies?

To guide the discussion, I’ll provide a running synopsis, in language drawn from Human Accomplishment, of the core conditions that prevailed during the glorious periods of past achievement. I’ll focus in particular on science and technology, since these are the fields that preoccupy our contemporary debates over the present course and future prospects of American innovation.


1. Wealth, Cities, Politics

I begin with enabling conditions. They don’t explain how the fires of innovative periods are ignited—we’ll come to that later—but they help explain how those fires are sustained.

  • Accomplishment in the sciences and technology is facilitated by growing national wealth, both through the additional resources that can support those endeavors and through the indirect, spillover effect of economic vitality on cultural vitality.

What is the relation between innovation and economic growth? The standard account assumes that the former is a cause and the latter is an effect. To judge from past accomplishment in fields other than technology, however, the causal arrow points in the other direction as well. Growing wealth encouraged a competitive art market in Renaissance Florence, providing incentives for the young and talented to enter the field. Growing wealth in 18th-century Europe enabled patrons to support the work of the great Baroque and classical composers. Similarly with technological innovation: growing wealth is not only caused by it but helps to finance the pure and applied research that leads to it.

Growing national wealth also appears to have a more diffuse but important effect: encouraging the cultural optimism and vibrancy that accompany significant achievement. With only one conspicuous exception—Athens in the fourth century BCE, which endured a variety of catastrophes as it produced great philosophy and literature—accomplishment of all sorts flourishes in a context of prosperity.

In assessing contemporary America’s situation from this angle, the big unanswered question is whether the upward growth curve that has characterized the nation’s history will continue or whether our present low-growth mode is a sign of creeping economic senescence. It is too soon to say, but if the latter proves to be the case, innovation can be expected to diminish. No society has ever been economically sluggish and remained at the forefront of technological innovation.

  • Streams of accomplishment become self-reinforcing as new scientists and innovators build on the models before them.

Statistically, one of the strongest predictors of creativity in a given generation is the number of important creative figures in the two preceding generations. By itself, the correlation tells us only that periods of creativity tend to last longer than two generations. The reasons are unknown, but one specific causal factor has been noted by writers going all the way back to the Roman historian Velleius Paterculus in the first century CE. Explaining the improbable concentration of great accomplishment in Periclean Athens, Paterculus observed that “genius is fostered by emulation, and it is now envy, now admiration, which enkindles imitation.” In the modern era, the psychologist Dean Simonton has documented the reality underlying Paterculus’s assertion: a Titian is more likely to appear in the 1520s if Michelangelo and Leonardo were being lionized in the 1500s; a James Clerk Maxwell is more likely to turn his mathematical abilities to physics in the 1850s if Michael Faraday was a national hero in the 1840s.

By this standard, American culture would seem to be going downhill. It’s likely that individuals within most technological industries still have heroes, unknown to the public at large, who serve as models. People within the microchip industry know about Jack Kilby, Robert Noyce, and Gordon Moore; people within the energy-development industry know about George Mitchell. But such local fame is not what inspires members of one generation to emulate members of the preceding generation or generations.

In part, the declining visibility of outsized individuals reflects the increasingly corporate nature of technological innovation itself. Insiders may be aware of the steps that led to the creation of the modern microchip or the development of slickwater fracturing, but those steps have no counterpart to the moments when Samuel Morse telegraphed “What hath God wrought” and Alexander Graham Bell said “Mr. Watson, come here,” or to the day when Thomas Edison watched an incandescent bulb with a carbon filament burn for 13.5 hours after hundreds of other filaments had failed. Even Steve Jobs and Bill Gates, the most famous people involved in the development of the personal computer, didn’t actually invent anything themselves.

In part, too, the decline I’m tracing here reflects a larger cultural shift. In America, inventors once loomed large in the popular imagination. In the classroom, schoolchildren throughout the 19th and early 20th centuries grew up on the stories of Bell and Morse and Edison, of Eli Whitney, Robert Fulton, the Wright brothers, Henry Ford, and more—as well as on stories of awe-inspiring technological achievements like the building of the transcontinental railway and the Panama Canal. Popular fiction celebrated inventors and scientists—Sinclair Lewis’s Arrowsmith provoked a surge of interest among young people in becoming medical researchers—and Hollywood made movies about them. There are still occasional exceptions (the movies Apollo 13 and The Social Network come to mind), but they are rare. The genre is out of fashion, as is the ethos that supported it.  

  • Streams of accomplishment are fostered by the existence of cities that serve as centers of human capital and that supply audiences and patrons for the arts and sciences.

Here is one enabling condition for accomplishment that today’s America obviously meets, and has met since the latter half of the 19th century. In Silicon Valley, we see a new kind of critical mass of human capital, centered not in a city but in a geographical area and producing spectacular innovation. The Internet and all its ancillary effects have created a new way to achieve such critical masses: today, the world’s dozen top researchers in one or another arcane topic are likely to be in continual contact, trading drafts of papers, exchanging results, arguing and inspiring each other as effectively as if they were working in the same laboratory.

When I formulated this enabling condition for my 2003 book, I was still working from traditional models of centers of human capital. Today the formulation can usefully be rephrased to embrace the great leap forward fueled by the revolution in information technology (IT).

  • Streams of accomplishment are fostered by political regimes that give de-facto freedom of action to their potential scientists and inventors.

All great scientific and technological accomplishments prior to the 19th century occurred in societies that we would consider unfree by today’s standards. Most took place in autocracies. What this suggests is that the crucial factor is not freedom in the political and legal sense but de-facto freedom of action. The latter was provided by most nations of Western Europe from the 15th century onward.

By this measure, too, America has been sliding downward. Technically, its innovators of tomorrow still live in a free society. But the environment within which they operate is increasingly subject to two large and growing constraints.

The first comprises the regulatory regimes at the federal, state, and local levels that make it impossible or financially unfeasible to implement innovation in many sectors of the economy. The second is the American system of tort law with its rules governing class-action suits and punitive damages. These render many kinds of innovations vulnerable to ruinous liability if attorneys can convince a jury that a product is to blame for anything that goes wrong.

The combination of health, safety, and environmental regulation with tort liability bears down heavily on the risks presented by new products and procedures—even when those risks are measurably fewer and smaller than those presented by the products and procedures they replace. The result is to instill in potential innovators a version of the precautionary principle attributed to Thomas Schelling: “Never do anything for the first time.”

It is impossible to say with certainty where such fields as medical technology, pharmaceuticals, or cheap energy would be today in the absence of these constraints, but the phenomenal innovation we have seen in IT may afford a contrasting lesson. No one in the federal regulatory jungle has been able to devise an excuse for regulating the number of transistors that a microchip may contain; nor have class-action lawyers been able to make a case that Google’s zillions of linked servers cause cancer. The nature of information technology has left it uniquely free among growing American industries of the last few decades—and the results speak for themselves.

Constraints on de-facto freedom of action may well constitute the single most decisive factor in impeding American technological innovation over the last half-century. It is a sad commentary, but true: for people who want to shake up the world by building a better mousetrap, when it comes to most kinds of mousetraps, America is no longer all that free.


2. Raw Materials   

  • The magnitude and the content of a stream of accomplishment in a given domain vary according to the richness and age of the organizing structure.

Part of what determines the rate of innovation in science and technology is the abundance, or lack of it, of raw material. A good way to think about this is through the idea of organizing structures.

Imagine you are a young painter at the beginning of the 15th century, practicing your art as painters have been doing since time out of mind. Then, in 1413, in the piazza in front of Florence’s Baptistery, Filippo Brunelleschi unveils the techniques of linear perspective. Now you are able to portray a three-dimensional world on a two-dimensional canvas with unprecedented fidelity. Your stock of raw material has suddenly and hugely expanded.

Linear perspective is an example of an organizing structure, one whose result in history was an outpouring of great art. Similar organizing structures include polyphony and tonal harmony in music, which opened up vast new raw material for composers, and the novel, which did the same for writers. In today’s world, the graphical user interface with windows and mice that we now take for granted, developed at Xerox PARC in the 1970s, constitutes another such organizing structure, one that afforded huge new possibilities to programmers.

The degree of creativity triggered by an organizing structure can be measured in two dimensions. One is the structure’s inherent richness. Both checkers and chess enjoy organizing structures, but chess’s is much the richer, making the potential for accomplishment in that game commensurately greater. Something similar may be said of the sonnet versus the novel: many beautiful sonnets have been written, but the organizing structure of that form is much more restrictive than the novel’s.

The second dimension is the structure’s age. However rich they may be, organizing structures do grow old. In the arts, talented creators in each generation want to do new things; although the form of the classical symphony may well have room for more great works to appear, young composers want to try something else. In science, the aging process works differently. The discovery of E=mc² can happen only once. Sooner or later, each scientific discipline not only ages, it “fills up.”

Some disciplines—human anatomy and the geography of the earth, for example—are for practical purposes completely filled up. Others are in an advanced stage, like a jigsaw puzzle of a landscape missing only the sky: more remains to be done, but doing it is not going to change materially the state of knowledge. For example, it is believed that many varieties of insects are yet to be discovered, but it is unlikely that their discovery will change entomologists’ understanding of insects.

Something analogous has happened to innovation in the tasks of everyday life. The technology for the easing of everyday life is not an organizing structure, strictly defined. Think of it rather as a large bin. At the beginning of the 18th century, the bin was almost empty. Cooking food was laborious; so was cleaning clothes; so was keeping the house warm. Would people have preferred not having to carry all their water to the house from a well or to chop wood for the fireplace? If asked, they would have said yes, of course. They might not have been able to envision how these chores could be eliminated, but they knew they were defects in daily life. There were many such defects, on which inventive minds might exercise their inventiveness.

The 18th century saw the rapid spread of the Franklin stove (invented in 1741) for heating rooms, an early version of the kitchen stove (1780), and the flush toilet with ball-valve and siphon (1778). The next hundred years saw both the stove and the toilet spread throughout middle-class society, along with gas lighting, running water, and sewage systems in cities; then came electric lighting, central heating, hot water on demand in bathroom and kitchen, the carpet sweeper, sewing machine, canned foods, and iceboxes using commercially manufactured ice. The first half of the 20th century witnessed the advent of the washing machine and dryer, the refrigerator, dishwasher, home freezer, garbage disposal, vacuum cleaner, and frozen foods. The first commercial microwave oven went on sale in 1947.

By 1950, revolutionary innovation in the technology of daily life was over. Since then, we have seen refinements. Microwave ovens didn’t become affordable and widespread until the 1970s. The range and quality of precooked foods has expanded by orders of magnitude. We now have Cuisinarts, bread makers, electric pasta machines, and a dozen other kitchen gadgets that didn’t exist in 1950. We have sybaritic options for our bathtubs and showers that were invented after 1950. But the effects of these improvements on daily life, compared with, say, the effect of not having and then having indoor plumbing, are trivial. The bin has largely been filled up.

A similar story could be told of transportation. As of 1700, we could move on land no more rapidly than a horse could gallop. At sea, a transatlantic voyage took several weeks. By 1960, more than a half-century ago, trains, cars, ships, and airplanes had reached their present speeds and levels of comfort, with only minor changes—the bullet train, still not available in the U.S., and the short-lived Concorde plane—since then. Hypersonic international travel will be introduced at some point in the future, and not many years from now cars will drive themselves; but these, too, are attractive enhancements of a bin that has been largely filled up—at least until the time Star Trek’s transporter becomes a reality.

But there is an important exception: both in scientific knowledge and in technological innovation, some of the remaining gaps are huge. In pure science, dark matter is still a mystery; unlocking that mystery is sure to be a landmark event in the history of human knowledge. The reality of “quantum entanglement” is now accepted, but it involves some sort of unexplained instantaneous effect: not just faster than the speed of light but instantaneous at unlimited distances. Who knows what our eventual understanding of that phenomenon will lead to?

In technology, the obvious example of a bin that is not yet close to being filled up is the one produced by the IT revolution. Those who compare the effects of that revolution with the effects of the industrial revolution are not being hyperbolic. Thanks to the rich organizing structures of the microchip, the graphical user interface, and the Internet, we are probably nearing an apogee of innovation in this field that has few parallels in human history. The potential effects are so open-ended that a whole movement—the “singularity” movement, associated most prominently with the names of Ray Kurzweil and Vernor Vinge—has come into being to predict and, some fear, control how humanity itself will be transformed. Technical advances in genomics and genetics offer the prospect of other sweeping revolutions with no less ambiguous effects.

In sum, we are living at a time when, because of inherent constraints, scientific accomplishment and technological innovation are declining in some fields. Simultaneously, an explosion of innovation is taking place in other fields where the state of knowledge still exhibits important gaps and the potential for advance is still rich. We should not be surprised to see uneven rates of innovation in different fields. Some of the unevenness may be attributable to features of American culture or politics, the subject of the next section; some may be due to the workings of organizing structures.


3. The Need for Purpose and Autonomy

Enabling conditions help explain how periods of innovation continue. Organizing structures help explain the magnitude and the content of the innovations themselves. But neither category explains how the fires of innovation are ignited, or why they die out. In Human Accomplishment, I proposed two places to look for an answer: first, the sources of personal energy that impel potential innovators to realize their potential; second, the characteristics of the milieu in which they grow up.

I’ll begin with the sources of personal energy, which also come in a pair: purpose and autonomy. The two are closely intertwined.

  • A major stream of human accomplishment is fostered by a culture in which the most talented people believe that life has a purpose and that the function of life is to fulfill that purpose.
  • A major stream of human accomplishment is fostered by a culture that encourages the belief that individuals can act efficaciously as individuals, and encourages them to do so.

In science and technology, people with a strong sense of “This is what I have been put on earth to do”—people who have a sense of vocation—are more likely to try to accomplish great things than are equally talented people who don’t have that sense. The reason is self-evident. People who choose these fields because they see them as the way to fulfill their destiny also tend to set their sights on ambitious, even grandiose goals. People who go into science or technology for the paycheck are less likely to do that.

People with a sense of vocation are also more likely to succeed in achieving the great goals they set for themselves. Suppose a talented scientist without a sense of vocation is assigned to an intellectually arduous task. He is less likely than a person with a sense of vocation to come up with the breakthrough, because the overriding reality about great accomplishment is that it almost always requires incredibly hard work. No other finding emerges so consistently from studies of the lives of the great figures in the sciences, arts, business, and academia. Fame can occur overnight, and is not necessarily connected either with merit or with hard work. Prodigious raw talent occasionally produces an isolated gem. But the highest forms of achievement virtually always require a long apprenticeship, persistence in the face of setbacks, single-mindedness (often obsessive), and brutally long hours. As an anonymous Greek poet put it: “Before the gates of excellence the high gods have placed sweat.”

In addition to believing that life has a purpose, people also need to believe that they have the power and even the responsibility to fulfill that purpose through their own independent acts: to believe that they are autonomous, efficacious individuals. To see the role that autonomy plays, consider the case of classical China.

China has always enjoyed an intellectually talented, industrious population, and historically China’s stock of human capital led to a sophisticated civilization that in many ways was more advanced than the West’s until halfway through the second millennium CE. But in China, one’s role in life was defined in terms of one’s obligations to family, especially to parents. The eighteen-year-old in classical China who set out to follow his star in defiance of his parents’ wishes was not just headstrong or willful, as he might have been seen in the West, but behaving so very, very badly that no Chinese son of good character would consider such a course.

Classical Chinese culture also disapproved of open, vociferous intellectual argument. Correspondingly, it did not foster the kind of “I’m right, you’re wrong, and I’ll prove it” frame of mind that has been central to the West’s scientific and technological progress.

These aspects of China’s culture did not prevent many Chinese from achieving great innovation—paper and gunpowder are just the most famous of dozens of important innovations that occurred first in China. But the absence of a tradition of individualism lowered the creative energy that the human capital of China was capable of generating.

Through the Middle Ages, the West also lacked such a tradition. Not even the golden age of Hellenic philosophy espoused individualism as we think of it today. The polis took precedence. Nor did the advent of Christianity bring individualism immediately to the West. In some ways, Christian theology was individualistic from its inception—teaching that all persons are created in the image of God, are equal in the sight of God, and are invited into a personal relationship with God. But Christianity as it was practiced for its first 1,200 years did not attach much importance to individual accomplishment of great things in this life. To the contrary: many of the most talented young people were drawn into a monastic life of prayer and contemplation in preparation for the life to come.

Then in the 13th century came Thomas Aquinas, teaching that humans are morally autonomous beings who best serve God by using all of their capacities of intellect and will, whether to unravel the mysteries of the universe or to create works of beauty. The humanism that Aquinas grafted onto Christianity’s central promises of eternal salvation and a personal relationship with God created a potent force.

It is increasingly accepted by historians who have explored the question that the single most powerful cultural force fostering Western individualism has been post-Aquinian Christianity, augmented later by the Reformation and its contribution to what Max Weber would call the Protestant Ethic. Historians still argue about the specific role of the Reformation in this process, but in the terms I am using here, post-Aquinian Christianity fostered both purpose and autonomy.


And America? Throughout most of its history, American culture has run with the concept of the autonomous individual as no other culture has ever done. One of the signal features of American exceptionalism is the fierce belief that, if they are willing to work hard enough, people can achieve whatever they set their minds to.

But that sense of autonomy has been deteriorating for at least a half-century.

One of the most important psychological measures for predicting success in life (apart from IQ score) is one’s place on the “locus of control” scale. This positions people on a spectrum from “highly internal”—i.e., believing that one’s fate is within one’s own control—to “highly external”—i.e., believing that one’s fate is determined by outside forces. In other words, the locus-of-control scale is a direct measure of the sense of autonomy. According to a meta-analysis of 97 studies with results running from 1960 to 2002, locus of control among college students fell steadily over the course of that four-decade-plus span, with the average student of 2002 displaying a lower (less “internal”) sense of autonomy than did 80 percent of college students in 1960.

Apart from the social-science data, indicators in everyday life reveal how much the traditional American veneration of individuals triumphing by dint of perseverance and hard work has faded. A few decades ago, it would have been unthinkable for a president of the United States to make President Obama’s “You didn’t build that” speech, celebrating the supremacy of the collective and denigrating the contribution of the individual. It would have been political suicide. No longer.

The data on sense of purpose tell a similar story of decline. We know from questions asked by the General Social Survey that people attach more and more importance to job security and short working hours and less and less importance to work that “gives a feeling of accomplishment.” The government’s Current Population Survey tells us that the percentage of employed males who work fewer than 40 hours a week has been rising even in the healthiest economies, and so has the percentage of males who aren’t in the labor force even when they’re in their prime working years and even in periods when the economy has had jobs for anyone who has wanted to work for any number of hours per week.

Those trends began among men with lower levels of education. In the last decade, they have been increasing among the college-educated as well. Among the latter, the percentage of men who work more than 48 hours a week has been decreasing since the turn of the century. There do remain niches in the economy where people routinely work long hours—new hires at prestigious hedge funds and investment banks, associates at top law firms seeking to make partner, and in much of the IT industry. There also remain many people in other fields who love their vocations and work long hours all their lives. But for American culture as a whole, the drive to find meaning in work and to do whatever it takes to be the best appears to have been diminishing.


What accounts for the declines in purpose and autonomy? I have already hinted at the answer, which resides in the second place to look for the sources that ignite or suppress the fires of innovation: namely, the cultural milieu in which potential innovators grow up.

One plausible part of the answer is secularization. If you have been put on earth for a purpose, the universe must have a purpose, which in turn necessitates some form of God. Since 1972, the proportion of Americans aged thirty to forty-nine who are explicitly nonbelievers has quintupled, reaching 20 percent in 2010. Another 30 percent in the same year said they had a religion but attended worship services no more than once a year. Both of these trends have accelerated in the last two decades.

Secularization is particularly evident among intellectuals. In the population at large, explicit atheists may be at only 20 percent, but among members of the National Academy of Sciences, 65 percent in one poll said they did not believe in God. To put it another way, the people who are best positioned to be great innovators in science and technology are precisely the people who are now least likely to have a sense of vocation coming from God. Most of the people who make it into the National Academy of Sciences have had a strong sense of vocation without religion, so religion is not a necessary condition for a sense of vocation. But it helps. The question to be asked is: how much has secularization contributed to vocational ennui among those with the intellectual potential to become great scientists or innovators?

Growing wealth and security are also implicated. For the first time in human history, a high proportion of the most talented people in advanced societies get advanced educations, find good careers, and take prosperity and security for granted. Furthermore, affluence and technology have proliferated attractive leisure alternatives: second homes, trips abroad, and a multitude of time-intensive avocations that were unknown a half-century ago and that compete with spending 60 hours a week in the laboratory.

Combine all this with a worldview that says there is no God, no destiny, no ultimate good, and it is natural that people develop what I have elsewhere called the “Europe Syndrome,” after the Western European countries where it is most visible: a way of life based on the belief that humans are collections of chemicals activated by conception and deactivated by death, and that the purpose of life is to while away the intervening years as pleasantly as possible, with as little trouble as possible.

Along with secularization, the Europe Syndrome is spreading in contemporary America. I cannot put coefficients to the size of the effect, but the existence of the trend is not open to argument. Fewer young Americans than in earlier stages of our history come to adulthood assuming that they have a purpose in life, that they are impelled to fulfill that purpose, and that they can expect to do so through their own efforts. It follows that, to some extent, the net amount of National Creative Energy brought to scientific and technological innovation must suffer as well.


4. Transcendental Goods 

  • A major stream of accomplishment in any domain requires a well-articulated vision of, and use of, the transcendental goods relevant to that domain.

Closely associated with the roles of purpose and autonomy in stimulating great achievement is the final condition that I identified in Human Accomplishment: the concept of transcendental goods.

“Transcendental” refers to perfect qualities that lie beyond direct, complete experience. In the classical Western tradition, the worth of something that exists in our world can be characterized by the three dimensions known as the true, the beautiful, and the good. Those three are what I mean by transcendental goods. Since the good is not a term in common use these days, I should specify that I am using it in Aristotle’s sense in the opening statement of the Nicomachean Ethics:

Every art and every inquiry, and similarly every action and pursuit, is thought to aim at some good; and for this reason the good has rightly been declared to be that at which all things aim.

To put this in operational terms: if a culture possesses a coherent, well-articulated sense of what constitutes excellence in being human, it has a conception of the good as I am using it.

The discussion of transcendental goods in Human Accomplishment focused on the arts, where the concepts of beauty and the good are crucially important. In contrast, the transcendental good that has mattered most in science and technology has been truth. Many scientists have also seen beauty in the laws of mathematics and physics, but the beauty was incidental. Above all else, science has been a search for the truth, and technology has consisted in the application of those truths. During the last century, the abiding devotion to truth has helped to keep science vital.

But what happens when a society that still believes in pursuing truth in the sciences becomes heedless of the good? That society brings a lack of moral seriousness to moral problems. This is our condition, and it has come upon us at a peculiarly dangerous juncture in human history.

Until now, science hasn’t possessed the power to tinker with the nature of the human animal. Within a few decades at most, progress in genetic engineering will give science that power. Perhaps it will occur in the form of designer babies, perhaps in ways of linking human consciousness with computers, perhaps in ways that are still unforeseeable. But in one form or another, it will be within the power of human beings to alter the nature of our very humanity. We will arrive at this crossroads just as philosophical and religious discourse about the good—the nature of human flourishing—is nearly inaudible.

We have brilliant people thinking about such issues. Leon Kass’s writings on the intersection of bioethics and the nature of what it means to be human are profound. The “singularity” movement, as I mentioned earlier, has prompted some well-known people to argue for the beneficent implications of a transformation in human intelligence; others, like Kass, point to the potential menace. But the latter do their work in a cultural milieu that disdains them, and the uses to which the new technologies will be put will likely be determined by that cultural milieu.

Making the case that ours has become a cultural milieu indifferent to the good is the subject for a book. Here I will leave it as an assertion: if genetic engineering develops a capability of the kind I have been describing, that capability will be realized somewhere in the world with little resistance by the general public, under governments that cannot be trusted to act wisely.

Imagine, for example, that it becomes possible to engineer male fetuses so that they are no longer any more aggressive than female fetuses. Should the government permit individual parents to make such choices? What should be the government’s role in forbidding, encouraging, or even requiring such choices? That is just one of dozens of issues that will need to be assessed not just on a case-by-case basis but in the context of a rich, rigorous discourse on what it means to be human. The prospect of trying to address such questions in a culture that increasingly rejects the belief that human life has a transcendental dimension is profoundly troubling.


5. How America Matches Up

How, then, does the ideal culture of innovation match up with today’s America? Our evidence straddles two quite different conclusions.

On the positive side, the beginning of the 21st century has seen the opening of new and rich organizing structures in information technology and genetics that permit prodigious innovation. So far, America is still at the forefront of innovation in both of those domains.

On the negative side:

  • As America increasingly resembles Europe economically, it must be expected increasingly to resemble Europe in terms of its innovative energy.
  • Celebration of innovation and innovators is out of fashion.
  • The restrictions on de facto freedom of action are already extreme in many industries, and can be presumed eventually to encroach on information technology. In the realm of genetic innovation involving humans, where some regulation is desirable, bad regulation will needlessly inhibit the desirable forms of genetic innovation as well; the history of government regulation is hardly encouraging on this point.
  • As reflected in American attitudes and behaviors, the fostering of a sense of purpose and autonomy has declined in our culture.

These negatives are not etched in stone. Changes in tort law and regulation could have major effects on the climate of innovation. American cultural norms have been known to change quickly and dramatically in the past, and could do so again. When some of the large gaps in our scientific knowledge are filled, we might see huge effects on the amount and quality of human capital turned toward innovation.

Finally, in assessing contemporary America as it looks from the template drawn in Human Accomplishment, I have not taken into account the specific dynamics that might make America an exceptional case. In light of that template, though, it is clear that if we are to override historical tendencies and avoid deep trouble, we had better have at our disposal some of those exceptional dynamics. For when a government is increasingly hostile to innovation, as America’s is, and a society is decreasingly industrious, as America’s is, and a culture stops lionizing innovators, as America’s has, and elites increasingly deny that life has transcendent purpose, as America’s do, innovation must be expected to diminish markedly.

To return to the contrast I suggested at the outset: today we bear little resemblance to England at the end of the 18th century, and look a lot like France fading at the end of the 19th.





The Course of Cultural Genius

Why some cultures rise while others fall, and still others revive—a brief survey.

A scroll painting of the Emperor Shenzong of the Song Dynasty (960-1279 CE), noted for its cultural flowering and economic achievement. Courtesy Wikimedia Commons.
Dean Keith Simonton
April 6 2014

Charles Murray’s essay, “Does America Still Have What It Takes?,” provides a brief but comprehensive treatment of an issue—the rise and fall of civilizations, cultures, or nations—that has plagued thinkers for many centuries. He mentions the ancient Roman historian Velleius Paterculus (c. 19 B.C.E. – c. 31 C.E.), who speculated on why geniuses in literature and philosophy tended to cluster into periods of intense creativity, only to yield ground quickly with the passage of time. Much later, the Muslim historian Ibn Khaldun, in his Muqaddimah (1377), contributed an important systematic treatment of historical cycles in politics and the arts.

The first genuine attempt to address this phenomenon in a more modern, scientific manner can be credited to the Swiss botanist Alphonse de Candolle. In particular, in his Histoire des Sciences et des Savants depuis deux Siècles (1873), Candolle collected objective data on the scientific achievements of various European nations, and from these data attempted to identify the political, religious, cultural, educational, and other forces that promoted or inhibited outstanding scientific contributions. Candolle also offered observations on how the center of scientific activity tended to shift from one nation to another, and even predicted that English—not French or German—would eventually become the primary language of scientific communication.

Not only did that latter forecast prove correct, but it also explains why Candolle’s pioneering work would remain relatively unknown and largely forgotten; to this day, it has never been translated into English. In ignorance, then, subsequent scholars returned to less objective and less quantitative approaches to the question. An example is Alfred Kroeber’s classic Configurations of Culture Growth (1944).

In fact, when, exactly one century after Candolle’s work, I decided to focus on this same area of inquiry for my doctoral dissertation, I too was unaware of his earlier efforts. In the event, my dissertation applied modern econometric methods to historical data on the fluctuations in creative activity over the course of Western civilization, from the ancient Greeks to recent times. Among its findings is one cited by Murray in his Mosaic article: the number of eminent creators in a given generation is a positive function of the number of eminent creators active in the previous two generations. This finding was later replicated for other civilizations, especially those of China and Japan.

Creative genius, then, does not emerge de novo but rather rests on the shoulders of high-impact role models and mentors. Working across consecutive generations, this same influence can help to maintain a fairly high level of creative activity for at least a century. It takes a while for the initial impetus of genius to exhaust its sociocultural momentum.


And there is yet another result, one that Murray acknowledges but does not extensively treat: many civilizations undergo rebirths. Golden ages may decline into silver and later into dark ages, but not all cultures stay submerged in darkness. Sometimes, the subsequent revival may produce a new golden age that surpasses anything seen before. China, one such long-lived civilization, has experienced multiple revivals during the course of its history.

Besides occurring across whole civilizations, such resuscitations of innovative activity can appear in individual nations or regions within a larger civilization, or in specific domains of creative achievement. In the Western scientific tradition, for example, several nations experienced transient dry periods, as can be seen in the history of Italian, French, Dutch, British, German, and Scandinavian science. Indeed, during and after the scientific revolution in early modernity, no one single nation was able to maintain consistently high levels of creative activity.

The obvious question is what factors enable a nation or civilization to recover its creative vitality. The complete inventory of potential booster-shots is as yet unknown. Certainly some of the positive influences on creativity that are enumerated in Murray’s article can be rejuvenated, while some of the negative influences that he notes can be reversed. Thus, many civilizations have experienced new vigor when they have opened themselves to input from other cultures, often through the medium of extensive and diverse immigration. In general, open or heterogeneous societies have a distinct advantage over closed or homogeneous societies.

Perhaps the biggest unanswerable question right now concerns the long-term effects of the Internet and social networks on stimulating and maintaining this quality of openness and heterogeneity. In all of human history, the capacity for cultural hybridization has never been greater than now. We cannot yet know whether the resulting global cultural mix will create a new civilization with an unprecedented gift for innovation in technology, science, and the arts. Notwithstanding all of the sociocultural weaknesses that Charles Murray pinpoints, even the United States may continue to be a full participant in a new golden age.


Dean Keith Simonton is Distinguished Professor of psychology at the University of California, Davis. Among his hundreds of publications are Genius, Creativity, and Leadership (1984), Psychology, Science, and History (1990), Creativity in Science (2004), and an edited volume, The Wiley Handbook of Genius, due out in June. 



Will Israel Have What It Takes?

Thanks to a mix of old values—mutual responsibility—and new ones—individual freedom—Israel is thriving; but challenges loom.

An Israeli-made Ecoppia solar-panel cleaning system. Courtesy Ecoppia.
Ran Baratz
April 9 2014

In “Does America Still Have What It Takes?,” Charles Murray soberly registers the decline of certain central cultural characteristics that historically have sustained America’s success as a global center of progress and innovation. His warnings on these matters apply to the democratic West as a whole, where the “Europe syndrome,” as he aptly calls it, is a widespread affliction. But what about Israel? Obviously, this “start-up nation” is not competing with the U.S. for world dominance in innovation; but is there anything to be learned from a comparison between the two countries and their respective cultures?

Let me start with Murray’s list of “enabling conditions” for innovation in science and technology, of which the first is national wealth. Relatively speaking, Israel’s public debt is now at a reasonable level, having been reduced from 275 percent of GDP in the 1980s to 70-80 percent of GDP more recently; by comparison, in the same period, U.S. national debt moved from 40 to over 100 percent of GDP. Similarly with other enabling conditions named by Murray, including the high status accorded to innovators, the existence of cities or other “centers of human capital,” and the freedom to invent. On all of these indices, Israel’s situation isn’t bad at all.


What about the larger, cultural context, the surrounding milieu of attitudes and beliefs that, in Murray’s telling, both foster and drive the spirit of innovation within individuals and societies?

Israel is a highly ideological place. It sometimes seems that every Israeli is fanatically devoted to one or another ideology that he regards as his personal property. For some, that ideology is Judaism; for others, it’s the overarching imperative to establish a “model society”; still others dream of creating a “new Middle East”; and on it goes. Those lacking a positive ideology can always find refuge in a negative one, blaming all their discontents on the settlers, the ultra-Orthodox, the Arabs, the Left, the Right. . . .

This ideological surfeit is no coincidence. It is a direct legacy of Israel’s birth after World War II, when a bedrock conviction in the rightness of the Zionist cause was needed simply in order for the country to survive. In its first decades, as, against overwhelming odds, Israelis were engaged in a desperate struggle for their national existence, that same Zionist spark, in the form of deep personal commitment to the security and flourishing of the state, provided the sense of “purpose and autonomy” that Murray names as a dual prerequisite for igniting the “fires of innovation” within a society.

Israelis striving to excel in those years derived a deep sense of purpose from contributing to their as-yet-insecure state. For its part, Israeli society knew how to appreciate their contributions, to condone their eccentricities (if they had them), and to give them backing and honor.

Like all honeymoons, however, this one didn’t last forever. First, power corrupts: the rewards granted by the state became disproportionate, morphing into spoils lavished by the government on party apparatchiks and lobbyists of different shapes and sizes. Second, just like America and Europe if on a much reduced scale, Israel today, even as it enjoys its relative economic prosperity, is encountering the price to be paid for prosperity in terms of cultural and personal mores.

An Israeli friend in high-tech related his contrasting impressions of visits to French and American companies. In France, he found exceptionally gifted engineers who were knowledgeable, committed, and creative—until, that is, 5:00 in the afternoon or the eve of one of the many vacations available to them; then they disappeared. In the U.S., the engineers in the companies he visited were primarily from India. As individuals, they were as gifted as those in France, but they worked sixteen hours a day, took few if any vacations, and did not profess to be in the least disadvantaged.

My friend reported that, personally, he identified with the French. He, too, wanted to be with his wife and children at the end of the day, and to take vacations. But, overall, the French are out of the race. In the world of high-tech, speed and flexibility rule; a delay of two weeks in getting a product to market can make the difference between success and failure.

Israel stands somewhere in-between these two models. Although in general the economy suffers from relatively low workforce productivity, the high-tech sector is an exception. For one thing, the great immigration from the former Soviet Union in the 1990s brought waves of educated personnel with a strong work ethic and a clear affinity for technology. For another, as Dan Senor and Saul Singer argue in Start-Up Nation, military service, especially in elite units, plays a central role in shaping the mentality of Israeli high-tech workers. In the army, the soldier learns to be “mission-focused,” not to watch the clock, and at times assumes a level of personal responsibility that has no equivalent in civilian life. Bringing these qualities into the world of innovation produces clear payoffs. Young Israelis have launched large and complex projects with impressive speed, justifying the high degree of foreign investment in this sector of the economy.

Next, creativity. For Israelis, necessity—often, immediate national necessity—is indeed the mother of invention. Thus, Israel specializes in water-saving agriculture, security and weapons systems, and information technology. For better or worse, none of these is likely to fade in importance or urgency any time soon. Here, too, military service provides good training, as, contrary to stereotype, elite military units are constantly called upon to display creative problem-solving of the kind that budding scientists and engineers are much less likely to learn in a university setting.

One could say that Israel has benefited twice over from its existential struggle. That struggle imparts a sense of purpose to those who contribute to the security of the state while also developing important innovative skills. At the same time, living under threat also imparts a sense of realism that contrasts refreshingly with the moral and intellectual relativism that Murray rightly decries in the West. To be sure, Israel is not immune from this disease, but it is far less pervasive than elsewhere.

Again in contrast to the U.S. and much of Europe, secularization still lags in Israel. There are many reasons to rejoice at this state of affairs, although frankly it may not be so great for a culture of innovation. Judaism doesn’t object to self-fulfillment, but it doesn’t place it at center stage among life’s goals, much less make a cult of it. On the other hand, for believing Jews (and Israeli citizens), it also insists on many respites from work: not just the weekly Sabbath but also much of the month devoted to the High Holy Days and the festival of Sukkot, and the extended vacation of Passover in the spring. There is personal and social good in so much rest—but also a price to be paid in the global competitive economy.


In sum, and on balance, Israel’s situation measures up favorably against Murray’s metrics. Indeed, while the U.S. in his telling is becoming more “European” – moving from individualism to statism – Israel is moving in the opposite direction. In its youth, its most enterprising citizens had to contend with the suffocating hand of socialist institutions and a spirit of excessive communitarianism. Since the 1980s, this has changed. From a state where all creative energies were invested in defense and settlement, it is fast becoming a strong, free society where individuals can contribute to a whole host of new fields not directly tied to survival.

But every change carries with it new problems, and certain old problems have likewise persisted into the present. To conclude, I’ll note four challenges that seem central in light of Murray’s penetrating analysis.

The economic challenge. This can be split into two parts. The first is regulation: Israel suffers from an extremely complex web of rules and regulations that primarily harm small and medium-sized businesses. These form a critical component of Israel’s overall financial health: in 2012, some 450,000 small businesses generated 45 percent of GNP and employed 55 percent of the labor force. Yet they are an abused sector of the Israeli economy, heavily taxed and politically ignored. When it comes to the ease of doing business, or the cost of starting a business, Israel fares very poorly compared to the OECD and, on some measures, even compared to the rest of the world. Niall Ferguson recently wrote that compared with England, it’s hard to start a business in the U.S.; he clearly hasn’t tried to start one in Israel. Any Israeli politician who made it his mission to reduce the huge tax and regulatory burdens on business would render a great service to his country.

The second part of the economic challenge is well known: the low rate at which ultra-Orthodox (haredi) men and Arab women participate in the workforce. This issue transcends economics; solving it depends on a much more complex process of cultural change inside communities that have long seen themselves as embattled minorities.

The political challenge. The positive trends that I’ve outlined are easily reversible, and Israeli politics is a fickle battleground. Today’s Israelis remain very receptive to arguments based on considerations of “social justice,” and their state-run educational system has trained them to look to government as the solution to all problems. If an Israeli prime minister were to say, as President Obama said, “you didn’t build that,” most would not object. At a time when the shockwaves of global economic problems are, somewhat belatedly, reaching Israel too, it’s easy to invoke the populist argument in favor of nationalization and redistribution of wealth and against the free market.

The educational challenge. Anyone who wants to foretell the medium- and long-term future need only look at today’s children. The picture isn’t pretty: Israel faces the worst crisis in scientific and technological education in its history. The reason once again is that all the eggs are in one basket—namely, the state, which has made a royal mess of things. This is not the place for an extended discussion, but one central problem is the government’s determination to wage war against private education. Although it borders on the irrational to think that any government ministry can successfully run a system with more than a million and a half students and hundreds of thousands of teachers, this fatal conceit has so far proved unbudgeable. As long as it remains so, it will be difficult if not impossible for education in Israel to recover.

The cultural challenge. Individualism is a complex and nuanced phenomenon, easily misunderstood. Israel needs to complete the transition from its formerly excessive communitarianism to a model that better balances the needs of individuals and communities, but it is hard to achieve such a balance without first swinging all the way to the other extreme. Unfortunately, the post-modern conception of individualism, according to which all forms of self-fulfillment are not only legitimate but equal, can cause as much harm to Israel’s social fabric as it has done elsewhere. If freedom comes to mean nothing more to Israelis than licentiousness and nihilism, severed from individual and communal responsibility, from social institutions and accepted norms of behavior, their uniquely moral society will quickly fall apart.

Israel’s situation today is much better than it was 40 years ago. But Murray provides an important warning from the experience of the modern West. Affluence and success exact a cost—on the one hand, an ever-growing reliance on the “nanny state”; on the other hand, moral and societal disintegration. To navigate wisely, Israelis will need to leaven the values of individual freedom and self-fulfillment with their inherited ethic of mutual help and civic responsibility, political participation, attachment to tradition and religion, and national pride. By achieving a balance of these elements, they might just enable a culture of innovation, and much, much more, to flourish in the Jewish state.


Ran Baratz is a professor of ancient philosophy at Shalem College, founder and director of the Israeli nonprofit El Haprat, and founding editor of Mida, a Hebrew-language website of news and opinion. He is executive director in Israel of the Tikvah Fund’s program in political thought, economics, and strategy.



Fear of Falling

How global competition can spur technological innovation and keep America (and Israel) dynamic.

Walter Russell Mead
April 16 2014

Prediction is always difficult, especially (as Yogi Berra is alleged to have said) about the future. Charles Murray’s brilliant essay on the future of American innovation handles the difficulties well—and, like all good forecasters, he knows when and how to be appropriately Delphic in his pronouncements. “Let Athens trust to her wooden walls,” said the Pythoness as the Persians approached, without specifying whether by “walls” she meant the fortifications around the Acropolis or the hulls of the Athenian fleet. When it comes to the future of American innovation, Murray concludes that it could go either way: European-style malaise, combined with the depredations of the tort bar, could slow innovation to a crawl and put America on the inexorable road to decline; but decline is still a choice, not a fate.

In one sense, there is little to add to Murray’s conclusion; so many factors affect a topic as complex as his that an uncertain verdict about the future is intellectually all but mandatory. But Murray doesn’t just state the obvious; his look at the specific conditions associated with innovation (or the lack of it) in many different cultures and eras offers an important checklist as we think about the state of American life today.


To Murray’s list of factors affecting the potential for dynamic innovation in societies over time, there is one thing I would add. It is a factor particularly relevant to today’s America, and taking it into consideration would lead, in my opinion, to a somewhat more robust confidence in the idea that America’s age of innovation is not yet behind us. That additional factor is competition.

Historically, one of the chief reasons why the European states were able to power ahead of the rest of the world in the last 600 years was the constant press of competition among them. From the Italian Renaissance, when military rivalry drove rapid innovation in offensive and defensive armaments, right up through the cold war, the need to avoid falling behind drove countries and cultures to foster and support innovation of all kinds.

The second Hundred Years’ War between Britain and France (1688-1815) wasn’t just won on the playing fields of Eton; it was won through the development of the Bank of England, advances in navigation and shipbuilding, the promotion of foreign trade, and in many other ways. Similarly, the cold war was seen to involve an all-out competition between Soviet Communism and American (and, more broadly, Western) democracy. Moscow’s launch of the Sputnik satellite in 1957 led to a strong push in the United States to develop the natural sciences at every level.

Perhaps the leading example today of a society driven into innovation is Israel. Limited in natural resources (though perhaps, from a hydrocarbon standpoint, less limited than once thought), and surrounded on all sides by current or potential enemies, Israeli policy makers and citizens understand all too well that the future of their society depends on, among other factors, their ability to stay ahead in the science-and-technology sweepstakes.

So, as we attempt to peer dimly into the future of American innovation, it is worth asking: will international competition continue to play a strong role in impelling Americans, whether they like it or not, into supporting the kinds of policies and institutions that promote innovation?

Some have argued that, since the end of the cold war, the United States need no longer worry about competition of this kind. Russia, they say, even in its newly aggressive Putinist guise, is only a shadow of its former Soviet self. China, though certainly ambitious, remains a regional power, and many believe or hope that ultimately it will even choose to become a “responsible stakeholder” in an America-backed world order. Elsewhere, we are told, with the possible and temporary exception of Iran, none of today’s various rogue states (like North Korea) or members of the Axis of Anklebiters (Venezuela, Cuba, Bolivia) has the potential truly to threaten the United States on a large enough scale to trigger the kind of civilizational confrontation we faced during the cold war.


As I began by saying, one cannot be sure about the future. Still, I have my doubts that the United States is going to enjoy a quiet life in the years and decades to come. On the contrary: we are likely to be living in a world dominated by both old and new kinds of international competition. The ability to develop and apply new techniques and technology, and to manage the social and political consequences of accelerating change, will be key to national success and survival.

In some ways, we have already begun to see the emergence of a new kind of competition that is directly affecting the way both the American government and other governments approach innovation. Only a few years ago, the digital world was regarded as a virtual Garden of Eden, a peaceful paradise where ugly realities like state competition played little part. (Remember when email from Nigerians soliciting our banking information was the most dangerous form of crime on the Internet?) But those days are over. If the Stuxnet virus can disable centrifuges in Iran, one can only imagine what other kinds of havoc can be wreaked through the ubiquitous and indispensable worldwide web.

Increasingly, the need to protect and police the digital universe has led to sharp, zero-sum competition among powerful states, with enormous implications for security. This competition is strengthening the ties between states and technology companies. Google’s business model, for example, very much depends on an American-style Internet with open access; the modern American state, for its part, is linked to Silicon Valley and its capacity to innovate as tightly as Louis XIV was linked to the genius of Vauban.

It did not take Edward Snowden to alert powers around the world that, in the technologies that shape the age of the Internet, they too must innovate or die. Every drone strike in Pakistan, every piece of news about the revolution in military affairs, hammers home the lesson that the future belongs to those who can seize and hold a place on the technological cutting edge.

Fear has been one of the principal drivers of innovation since Archimedes built machines to defend the city of Syracuse from its Roman besiegers, and fear will continue to drive the human race in the 21st century. While fear alone may be insufficient to create a flowing stream of innovation of the kind that Murray describes, in the American context it is likely to tip the scale in favor of the forces in American society that can keep us dynamic and open to change.


Walter Russell Mead is professor of foreign affairs and humanities at Bard College and editor-at-large of the American Interest, where his blog, Via Meadia, appears daily. Among his books are Special Providence: American Foreign Policy and How it Changed the World and God and Gold: Britain, America, and the Making of the Modern World.



Terminal Sclerosis

Can American innovation survive the paralysis of American government?

Illustration of the buildup of plaque in arteries. By AstraZeneca.
Charles Murray
April 22 2014

I read the commentaries on my essay by Dean Keith Simonton and Ran Baratz with interest, but have nothing substantive to add. Walter Russell Mead’s argument about the role of competition triggered a variety of thoughts, all of which are preliminary; it is not a topic that I have investigated systematically. For example, what if classical China had not coalesced into a single huge state, but had consisted of four or five smaller, competing states? Would competition have spurred those different Chinas to accelerate their technological development or invent the scientific method? It is a fascinating and not implausible possibility. With regard to Europe, Mead is surely right that rivalry spurred technological and economic responses.

But as Dean Simonton points out, all of the major European nations experienced “transient dry periods” during those same centuries. The locus of scientific and technological accomplishment moved northward, from Italy during the Renaissance to northern Europe and especially Great Britain thereafter. The experience of Europe teaches that it is possible to drop out of technological competition, as Italy did, or never to enter the competition at all, as in the case of Spain and Portugal.

With regard to the role of rivalry among nations in keeping America innovative, I have one observation and one working hypothesis. The observation is that the United States has already shown itself willing to drop out of competitions that do not directly affect its security. The obvious example is the manned space program. The United States now depends on Russian launch vehicles to get its astronauts to the International Space Station, and shows no alarm at the prospect of being left far behind by the Chinese in years to come. The national ethos that prompted the Apollo program seems completely lost.

My working hypothesis is that the United States government still will respond to rivalries that affect our national security or competitive position in the world when the threat is imminent and obvious but not otherwise. Suppose that China were found to be developing military technology that could give it a decisive advantage over the United States. I still have hopes that America’s leaders would mount a new Manhattan Project to counter it.

But I added that caveat, "imminent and obvious," for a reason. Walter Russell Mead is correct that, for example, "the modern American state, for its part, is as linked to Silicon Valley and its capacity to innovate as tightly as Louis XIV was linked to the genius of Vauban." But does the United States government understand that reality? Will it make appropriate policy even if it does understand?

I, like many others, have been bemused by the seeming paralysis and incompetence of the federal government in recent years. This is not a reference specifically to the Obama administration. Instead, I think we are witnessing an advanced stage of the systemic sclerosis that Mancur Olson described in The Rise and Decline of Nations: Economic Growth, Stagflation, and Social Rigidities (1982). The options open to a sclerotic government are tightly limited by the special interests that have grown up within and around it. Sclerotic bureaucracies seek primarily to serve themselves, not the missions they are assigned. And even when a sclerotic government realizes that it is in peril, it is neither agile nor responsive, and is often unable even to prevent itself from doing harm to its own long-term interests.

To put my working hypothesis in terms of Silicon Valley, it seems quite possible to me that the modern American state will effectively cripple the information-technology industry’s innovativeness, oblivious to its dependence on that innovativeness. The same obliviousness will characterize the federal government’s policies toward other centers of American innovation. Whether I am right depends on how far sclerosis in the American government has progressed. My own assessment is that the sclerosis is close to terminal. But that’s a topic for another day.


Charles Murray, the W. H. Brady Scholar at the American Enterprise Institute, is the author of, among other books, Losing Ground (1984), The Bell Curve (with Richard J. Herrnstein, 1994), Human Accomplishment (2003), and Coming Apart (2012).
