If the next 100 years produce nothing else of lasting note, they may give us the first conclusive test results from the vast, uncontrolled experiment we have been running on several billion human beings.
We are a young species on an old planet. Homo sapiens sapiens has only been around for a few dozen millenniums, and the basic model has barely evolved in any meaningful Darwinian sense. But the combination of relentless technological progress and social manipulation has made it possible to alter this creature's environment radically in intervals far shorter than a single lifetime.
As a result, the physical and mental world we inhabit has changed more--and faster, and more often--in the past 200 years than it did in the previous 20,000. In terms of sheer existential pressure on any individual, the transition from the last Ice Age to the present interglacial period was positively trivial compared with the shocks imposed by the second half of the 20th century.
The value and nature of work, the roles of men and women, the definition of family, the status of racial groups, the distribution of wealth, the ever-varying roster of bitter enemies and trusted allies (Japan and Germany have been both, within 30 years!)--all these and more have undergone revolutionary transformations. Many are still shifting under our feet.
At the same time, we have burdened ourselves with neural stresses that it would be illegal to impose on a lab animal. Cell phones, beepers and inescapable global e-mail. Incessant forced mobility. Evanescent serial loyalties to neighborhoods and employers. Marriages that fracture and refracture into permutations unimagined by Caligula. Obligatory multiculturalism. Tenfold increases in "communication" by electronic means, and tenfold reductions in person-to-person contact. All these, set to the frantic clamor of gadgetry and the tempo of pop music, constitute the modern whiplash gestalt.
Life rhythms are divorced from nature. Advanced societies run 24 hours a day, 365 days a year, in climate-controlled buildings where few can see a window and none can open one. For every new labor-saving device, people report feeling more rushed. Silence has become distracting, if not pathological.
Of course, it is quite possible that the human organism is sufficiently pliable that it can accommodate cultural mutation on an unprecedented time scale. It is, however, equally possible that modern humans, no less than our furrier forebears, are ill-equipped to "do their own thing" in a world of persistent flux--at least without subsiding into pandemic depression, anxiety and ostensibly senseless violence. People may yet require a society and a world view that give their lives meaning and structure through shared rituals and tribal bonds.
Nonetheless, modern industrial democracies, which have evolved with ultra-biological speed and near-fanatical emphasis on change, secular ecumenism and individual expression, provide fewer anthropological anchor points every year.
They also place frequently unwelcome demands on people who already feel only the most tenuous sense of belonging to an entity the size of the United States, with its 270 million conspicuously diverse inhabitants. These tensions may rise. At present, between one-third and one-half of our population growth comes from immigration. By mid-century, half of all Americans will be black or Hispanic. Engineering a true commonweal that serves all these constituencies presents an immense challenge.
It will be even harder if the traditional middle class continues to erode, in effect converting America into a two-class society of the increasingly affluent techno-adept and increasingly useless retro-drones. The 21st century will doubtless draw this problem into sharp, if not bloody, focus.
(Note, however, that despite the rise of the new cognocracy, the vast majority of highly educated people do not have even the dimmest comprehension of the principles behind the devices on which their lives and livelihoods depend. Every year, real understanding of humanity's most important systems and devices resides in a smaller fraction of the populace. For the rest, modern technology might as well be magic. This trend may constitute the greatest potential for tyranny in history.)
The world's great melting pots already are cooling fast, and people are ever more prone to identify themselves with smaller geographic, social and ethnic groups. Half a century ago, there were 50 nations on the planet. Now there are nearly four times as many, and further fission is likely. This trend is apparent from the "breakaway republics" of the former Soviet Union to the great paradox of American politics: intense enthusiasm for local issues and indifference to national elections.
It has recently been the function of enlightened world opinion to minimize the importance of these perceived differences, whether between feuding religious factions in Northern Ireland or ethnic rivals in Bosnia. The next century, however, could show this to be a Sisyphean task.
Liberal democracy, the political model that recently appeared destined to be the future of civilization, may already be an anachronism, judging from planetwide evidence. Both "multicultural" democracy and communism rely on a populace that is equipped to feel empathy for, and make sacrifices on behalf of, a wide variety of people with drastically dissimilar interests. In the 21st century, some governments may find those feelings in appallingly short supply.
Even countries with strikingly homogeneous populations will face ample trauma. The 21st century will likely reveal, for the first time, just how many human residents the beleaguered Earth can sustain. World population more than tripled in the 20th century and could easily double again in the next 100 years. Absent some catastrophe, it will almost certainly reach 9 billion by 2100, 50 percent more than the current figure, and will likely hit 10.5 billion. Almost all the growth will occur in Asia and Africa. Collectively, these new lives will dwarf the total combined population of Europe and the Americas, which is estimated to be no more than 1.8 billion by 2025 and won't grow much during the rest of the century.
We'll be feeling the effects within a decade. Today, half the world's population is under the age of 25. Thirty-five out of every 100 people in developing countries are under 15. That age distribution amounts to a very short fuse on a population bomb that is set to explode at the same time that inexpensive public health measures are drastically reducing infant mortality and death from diseases such as malaria and cholera. At least half of the added population will live in "supercities," of which the three largest by 2015 will be Tokyo, Bombay and Lagos, each with 25 million inhabitants or more.
Most of the growth will be far from our shores. The United Nations projects that the 21st century's largest countries will be, in order of population, India, China, Pakistan, the United States and Nigeria. However, even the remotest Third World birth spikes will be felt in our society, chiefly in the form of changing prices for petroleum and food.
Unless some new energy sources or water-distribution schemes arise, wars over those two increasingly scarce commodities seem dismayingly probable. Even if fuel and water were abundant, though, ensuring that even a portion of the added billions could exist above the poverty level would exceed any collective social effort in memory.
Meanwhile, the industrial nations--with no more than one-fifth of the world's population--will grow progressively and spectacularly older. Right now, about 9 percent of the world's population is over 60. By 2100, the United Nations estimates, it will be nearly 30 percent, and one out of every 12 persons will be over 80. Nearly all of the very old will be in developed countries.
Many experts assume that 100 will not be an uncommon age for affluent persons to reach in the mid-21st century. Some believe 120 is possible, as more diseases of old age are vanquished and life-extension techniques are perfected.
If those projections are even partly accurate, the geezer boom will utterly transform the economies and cultures of advanced nations. The financial resources required to support an additional two to four decades of life would, at a minimum, eliminate the prospect of inheritance in all but the wealthiest families.
Substantial life extension would raise government spending on the elderly from about one-third of all federal outlays now to well over half by mid-century, by which time nearly one out of three Americans will be over 65.
It would also raise the even more troublesome question of what the aged are supposed to do with their bonus time, shift a titanic amount of business effort toward elder care and amusement, and could prompt unprecedented resentment among younger generations. In those circumstances, new euthanasia programs and "right-to-die" laws may find surprisingly broad support.
Suppose that it somehow proves possible to elevate a billion people--a modest one-fourth of the century's added population--to some approximation of working-class status within a generation or two. To be sure, U.S. exports would soar. But the consumption of resources would place an unexampled strain on the planet's agricultural "carrying capacity."
For any given family, education and social advancement typically result in fewer children per couple. But they also produce extraordinary demand for energy-intensive goods such as meat, a notably inefficient food source because land must first be employed to raise grains, which are then fed to animals, which are in turn eaten by humans. Indeed, demand for beef, pork and chicken has already doubled in the past 50 years, and is expected to double again by 2050.
Unfortunately, worldwide grain acreage per capita has been declining for years. How world agriculture can meet demand without extensive destruction of currently forested areas, stupendous increases in irrigation (for which fresh water is not presently available), and widespread adoption of heavily bioengineered crops is an open and ominous question.
That question, however, pales in comparison to the impending energy puzzle. Unless some cost-effective source can be found in the form of fusion reactors, solar energy conversion, fuel cells or the like, humanity will continue to power its growth with fossil materials.
There are now more than 700 million automobiles in the world; by 2050, the number will probably exceed a billion. Equivalent growth is likely in other energy-consuming sectors, including business and residential use. Unless all these devices can somehow be made to run on hydrogen, electricity from fuel cells or some kind of alcohol, demand for scarce petroleum will continue to rise. But none of those alternative fuels exists naturally; all require energy to produce. Meanwhile, consumption of electricity will have increased at least fourfold--a demand that can only be met by coal-fired generators, barring some amazing breakthrough.
Thus it appears that under any but the most Draconian scenario of greenhouse gas reduction (and probably irrespective of what sorts of Kyoto-like agreements industrial democracies may make), humanity will continue to pump billions of tons of carbon dioxide into the atmosphere for the foreseeable future. How much of it will stay there, and what effect it will have on the planet, are among the creepy unknowns of climate science.
At present, civilization adds 6 billion or 7 billion tons of carbon to the air every year, about half of which will remain there for an expected 100-year lifetime. So far, the other half has been absorbed somehow--presumably by vegetation and oceans--and global warming has been confined to about 1.1 degrees Fahrenheit over the past 120 years, nearly all of it in the Northern Hemisphere during winter evenings.
Nobody knows if or when the planet's carbon "sinks" will reach saturation, what will happen if they do, and exactly what other "feedback" systems are involved in climate change. (For example, increased CO2 in the air makes many kinds of plants grow much larger, thus increasing their ability to sop up airborne carbon.)
The answers should be unambiguously clear by the end of the 21st century, when global average temperatures are expected to be 3 to 8 degrees Fahrenheit warmer and sea levels six to 20 inches higher. We will know whether, and to what extent, human behavior has altered the Earth's climate--and, incidentally, whether we will have the flexibility to adapt.
By that time, however, very little of what now passes for enlightened environmental ideology, or indeed any of the cognitive staples we inherited from the Industrial Revolution, may have survived.
The fundamental moral and aesthetic distinction between "natural" order and "artificial" disruption of nature--currently embodied in squabbling over "organic" foods--may disappear entirely, or become merely a parlor topic for the wealthy.
After all, some of the most sought-after medical innovations of the 21st century will be profoundly unnatural: transplant organs from bioengineered pigs and other animals; spare tissues from vat cultures; chemical therapies custom-designed to compensate for natural deficiencies in the genomes of patients; neural prostheses that allow paralyzed limbs to move again, or blind eyes to see. Each of these initiatives is already in progress.
Moreover, it will not be possible for even the most stubbornly reactionary cultures to halt the inevitable development of artificially enhanced animals, whether ultra-brainy chimps, Homo sapiens II or a brand-new kind of organism built from genetic scratch. Ditto for the creation of computing devices that can equal or exceed the human mind, at least in ability to perform useful tasks, teach themselves and generalize from experience. Both are plausible arrivals in the coming century.
By one expert estimate, the human brain is only a million times smarter than a good desktop PC; the slightly less clever mouse has merely 100,000 times as much computing power. (That's for general-purpose thought. Already computers can beat world champions at specialized activities such as chess.) These may seem like large numbers. They aren't. If computing capability continues to double every couple of years--and few doubt that it will--then an artificial mouse-grade brain is only decades away.
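For readers inclined to check, the arithmetic behind "only decades away" can be sketched in a few lines. The 100,000-fold gap and the two-year doubling period are the estimates quoted above; everything else follows from them:

```python
import math

# Figures quoted in the text: a mouse brain has roughly 100,000 times the
# general-purpose computing power of a good desktop PC, and computing
# capability doubles about every two years.
gap = 100_000
doubling_period_years = 2

# Number of doublings needed to close a 100,000x gap: log base 2 of the gap.
doublings_needed = math.log2(gap)                      # about 16.6 doublings

# Total time at one doubling every two years.
years_to_mouse_grade = doublings_needed * doubling_period_years

print(f"{doublings_needed:.1f} doublings, ~{years_to_mouse_grade:.0f} years")
```

Roughly 17 doublings, or a bit over 30 years: large-sounding ratios collapse quickly under steady exponential growth, which is the whole force of the argument.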
Such entities may constitute an endless retinue of servants for mankind. Alternatively, they might put the stolid, slow-evolving human race out of business, replacing our antiquated wet brains with new molecular-scale computers that can be fed on a handful of electrons.
It would be a fitting end to the ongoing deglorification of our species. The centuries since Copernicus have not been flattering to humanity's notions of its own uniqueness and importance. The 21st century may be the one in which we discover that life is not only commonplace in the cosmos, but that many of the characteristics we consider distinctively human are easily duplicated.
If the current pace of neuroscience research continues, it is also likely that we will be able to understand consciousness on the biochemical level, and in fact to re-create in a computer or in laboratory goo our proudest, most coveted possession--the process whereby thought becomes aware of itself.
Or, we might not. Prediction is a notoriously myopic (and human) art.
In 1900, not even the most brilliant scientists knew what atoms were; many doubted their very existence. Yet a mere 45 years later, the atom split the sky over Hiroshima, altering human history in an instant.
None of the politicians who created the Social Security system in 1935 suspected that the average American's life span would double in fewer than 100 years. Physicians in the early 1980s were no more able to foresee the global horror of HIV than their 14th-century counterparts were to anticipate the Black Death.
Perhaps the only thing one can conclude with certainty about the coming century, beyond the inevitable spew of ultra-tech chattels and rococo diversions, is that the unimaginable will occur--often.
Curt Suplee writes about science and technology for The Post.