The End of Food

By Paul Roberts
Houghton Mifflin. 390 pp. $26
August 17, 2008

Chapter One

Starving for Progress

In the late 1940s, anglers who fished the waters of the Hudson River near Orangetown, New York, noticed something odd about the trout they were reeling in: every year, the fish were getting larger. Fishermen rarely complain about big fish, but because the creatures in question were being hooked downstream from Lederle Laboratories, a pharmaceutical company, some may have wondered whether the phenomenon was entirely natural. Eventually, the fish stories reached the upper management at Lederle, where they piqued the curiosity of Thomas Jukes, a brilliant biophysicist and expert in the new field of vitamin nutrition, who decided to investigate. Jukes knew that Lederle discharged its factory wastes in great piles near the river. He also knew that one such waste product was a residual mash left over from the fermentation process that Lederle used to make its hot new antibiotic tetracycline. Jukes surmised that the mash was leaching into the river and being eaten by the fish, and that something in the mash-Jukes dubbed it a "new growth factor"-was making them larger.

Initially, Jukes suspected the factor might be vitamin B12, a newly identified nutrient that was known to boost growth in laboratory animals. The vitamin was a byproduct of fermentation, so it was very likely to be in the mash. But when Jukes and a colleague, Robert Stokstad, tested the mash, they found something quite unexpected, and even world-changing: although B12 was indeed present, the new growth factor wasn't that vitamin but the tetracycline itself. When mixed with cornmeal and fed to baby chickens, even tiny doses of the amber-colored antibiotic boosted growth rates by an unprecedented 25 percent.

Jukes wasn't sure why this was happening. He speculated (correctly, as it turned out) that the tetracycline was treating the intestinal infections that are routine in closely confined farm animals, and that calories that normally would have been consumed by the chicks' immune system were going instead to make bigger muscles and bones. In any case, the phenomenon wasn't limited to baby chickens. Other researchers soon confirmed that low, subtherapeutic doses of tetracycline increased growth in turkeys, calves, and pigs by as much as 50 percent, and later studies showed that antibiotics made cows give more milk and caused pigs to have more litters, more piglets per litter, and piglets with larger birth weights. When the discovery was announced to the world in 1950, Jukes's new growth factor was the closest thing anyone had ever seen to free meat and a welcome development amid rising concerns over food supplies in war-torn Europe and burgeoning Asia. As the New York Times put it, tetracycline's "hitherto unsuspected nutritional powers" would have "enormous long-range significance for the survival of the human race in a world of dwindling resources and expanding populations."

Jukes's discovery would indeed have enormous long-range significance, although not quite in the ways the Times envisioned. By the middle of the twentieth century, the global food system was in the throes of a massive transformation. In even the poorest of nations, thousand-year-old methods of farming and processing were being replaced by a new industrial model of production that could generate far more calories than had been possible even a generation earlier-and which seemed poised to end the cycle of boom and bust that had plagued humanity for eons. But the great revolution was incomplete. For all our great success in industrializing grains and other plants, the more complex biology of our cattle, hogs, chickens, and other livestock defied the mandates of mass production. By the early twentieth century, meat-the food that humans were built for and certainly the food we crave-was still so scarce that populations in Asia, Europe, and even parts of the United States suffered physical and mental stunting, and by the end of World War II, experts were predicting global famine.

Then, abruptly, the story changed. In the aftermath of the war, a string of discoveries by researchers like Thomas Jukes in the new fields of nutrition, microbiology, and genetics made it possible to produce meat almost as effortlessly as we produced corn or canned goods. We learned to breed animals for greater size and more rapid maturation. We moved our animals from pastures and barnyards and into far more efficient sheds and feed yards. And we boosted their growth with vitamins and amino acids, hormones and antibiotics (it would be years before anyone thought to ask what else these additives might do). This livestock revolution, as it came to be known, unleashed a surge in meat production so powerful that it transformed the entire food sector and, for a brief time, allowed many of us to return to the period of dietary history that had largely defined us as a species-and where the story of the modern food economy properly begins.

By most accounts, that narrative started about three million years ago, with Australopithecus, a diminutive ancestor who lived in the prehistoric African forest and ate mainly what could be found there-fruits, leaves, larvae, and bugs. Australopithecus surely ate some meat (probably scavenged from carcasses, as he was too small to do much hunting), but most of his calories came from plants, and this herbivorous strategy was reflected in every element of Australopithecus's being. His brain and sensory organs were likely optimized to home in on the colors and shapes of edible (and poisonous) plants. His large teeth, powerful jaws, and oversize gut were all adapted to coarse, fibrous plant matter, which is hard to chew and even harder to digest. Even his small size-he stood barely four feet tall and weighed forty pounds-was ideal for harvesting fruit among the branches.

So perfectly did Australopithecus match his herbaceous diet that our story might well have ended there. Instead, between 3 million and 2.4 million years ago, Australopithecus got a shove: the climate began to cool and dry out, and the primeval jungle fragmented into a mosaic of forest and grasslands, which forced our ancestors out of the trees and into a radically new food strategy. In this more open environment, early humans would have found far less in the way of fruits and vegetables but far more in the way of animals, some of which ate our ancestors, and some of which our ancestors began to eat. This still wasn't really hunting, but scavenging carcasses left by other predators-yet now with an important difference: our ancestors were using stone tools to crack open the leg bones or skulls, which other predators typically left intact, to get at the calorie-rich, highly nutritious marrow and brains. Gradually, their feeding strategies improved. By around 500,000 years ago, the larger, more upright Homo erectus was using crude weapons to hunt rodents, reptiles, even small deer. Erectus was still an omnivore and ate wild fruit, tubers, eggs, bugs, and anything else he could find. But animal food-muscle, fat, and the soft tissues like brains and organs-now made up as much as 65 percent of his total calories, almost the dietary mirror image of Australopithecus.

On one level, this shift away from plants and toward animal food was simple adaptation. All creatures choose feeding strategies that yield the most calories for the least effort (anthropologists call this optimal foraging behavior), and with fewer plant calories available, our ancestors naturally turned to animal foods as the simplest way to replace those calories. But what is significant is this: even if the move toward meat began out of necessity, the consequences went far beyond replacing lost calories. In the economics of digestion, animal foods give a far greater caloric return on investment than plants do. It might take more calories to chase down a frisky antelope on the veldt than to pluck fruit in the forest. But for that extra investment, Homo erectus earned more calories-far more. Fat and muscle are more calorie dense than plants are and thus offer more energy per mouthful. Animal foods are also easier to digest, so their calories can be extracted faster. In all, meat provided more calories, and thus more energy, that could then be used for hunting, fighting, territorial defense, and certainly mating. Meat was also a more reliable food source; by shifting to meat, prehistoric man could migrate from Africa to Europe, where colder winters and lack of year-round edible vegetation would have made an herbivorous diet impossible.

But meat's real significance to human evolution was probably not the quantity of calories it contained but the quality of these new calories. Because animal and human tissues have the same sixteen amino acids (whereas most plant-based proteins contain just eight), animal converts readily into human: meat is the ideal building block for meat. That's why bodybuilders eat a lot of meat; it also helps explain why, as our ancestors ate more animal foods, their bodies grew larger. Whereas Australopithecus stood four feet tall, Homo erectus was a strapping six feet in height, and much stronger, which made him better at eluding predators and hunting. (The point isn't that meat made us big but that by eating more meat, our ancestors could then adapt more readily to an environment where greater size and strength were advantageous. But once attained, our new stature had to be maintained, which is one reason our ancestors sought out larger prey animals; not only did these big beasts supply a lot of calories, they also supplied more fat per pound than did smaller animals.) As important, Homo erectus's skull was a third larger than that of Australopithecus, and the brain inside vastly more developed-an adaptation known as encephalization that was also related to the meatier diet. Just as muscle grows best on a diet of meat, brains thrive on the fatty acids, and especially on two long-chain fatty acids, the omega-3 fat docosahexaenoic acid (DHA) and the omega-6 fat arachidonic acid (AA), which are abundant in animal fats and soft tissues. Plants have omega-3 and omega-6 fatty acids, too, but these are shorter forms and can't provide the same nutritional benefits.

Fatty acids were just the start. The brain is what's known as expensive tissue-not only does it need lots of DHA to grow so large, it also needs lots of calories to create all the chemical neurotransmitters upon which mental activity depends. The bigger the brain, the more calories it requires, which is why across the zoological spectrum, bigger brains tend to be found with bigger bodies. A sperm whale, for example, can support a twenty-pound brain mainly because it also has a massive stomach and heart. But humans defied this brain-body pattern. In the millions of years between Australopithecus and Homo erectus, brain size nearly tripled, yet body size barely doubled. Somehow, the human body was fueling a very large brain with a relatively small set of body organs. How? Again, the likely answer was meat. Recall that meat is more calorie dense and easier to digest than plants. According to paleoanthropologist Leslie Aiello, coauthor of the expensive-tissue theory, as our ancestors ate more meat and fewer plants, they no longer needed the large primate gut to digest all the plant matter. Over time, the gut shrank to about 60 percent of the size of other primates'-a critical development, as digestive systems themselves consume lots of energy, and having a smaller gut meant more available calories for larger brains. (In a similar development, because we weren't required to grind up so much plant matter, our jaws and teeth became smaller.) This is not a claim for dietary determinism: meat didn't "make" monkeys human. Many factors interacting in complex ways spurred the changes in our ancestors' physiology that ultimately produced modern humans. But it's also clear that without more animal foods, their bodies and brains couldn't have gotten larger. And without those bigger bodies and brains, they couldn't have become the intelligent, tool-using, highly effective hunters who were able to spread so quickly from Africa to the Middle East, Asia, and finally Europe. It's probably not entirely coincidental that the several offshoots of Australopithecus that remained herbivorous became extinct.

In any case, by around 180,000 years ago, as the first of the four ice ages began, animal foods dominated and defined the human food strategy. Neanderthals and, later, the Cro-Magnons, the first anatomically modern humans, were primarily hunters. Each had its own strategies, but both relied heavily on the mastodon, bison, woolly rhinoceros, and other arctic megafauna that had been driven southward into human territory by expanding glaciers. To prehistoric hunters, these big animals were walking meat markets-risky to hunt, but offering a huge payoff. By some estimates, Cro-Magnon hunters were earning as much as fifteen thousand calories per hour-far more than their predecessors. In fact, although Cro-Magnon foraged for plants and tubers, eggs, insects, fruits, and honey, two-thirds of their calories came from animal foods, making their diet, as Mike Richards at Oxford University has shown, nearly identical to that of bears, wolves, and other "top-level carnivores."

By the start of the last Ice Age, eighteen thousand years ago, big-game hunting had become the brutally efficient practice that was celebrated in cave paintings-and that elevated humans to a kind of dietary elitism. Daily life must still have been solitary, poor, nasty, brutish, and short: infant mortality was high, work was dangerous, and treatment for injury or infection was nonexistent, which helps explain why the average life expectancy may have been eighteen years. That, coupled with low birthrates (in part because infants couldn't digest meat and unprocessed plants and so had to be breastfed longer, which delayed subsequent pregnancies), kept population growth nearly flat. By some estimates, world population cycled at around one million for tens of thousands of years. Still, from the strict standpoint of food economics-that is, the quantity and quality of available calories-Cro-Magnon was fabulously wealthy. Indeed, so ideally matched were these ancestors to their diets that those who did survive childhood trauma and hunting accidents were probably healthier than most of their modern descendants: according to Neil Mann, an expert in paleonutrition at RMIT University in Melbourne, Australia, fossil remains from this early period show none of the diet-related chronic diseases that plague us today.

The good times couldn't last. By 11,000 years ago, a warming climate had drawn the big, cold-weather game back northward, away from human settlement. In their place came smaller, faster species, like gazelle, antelope, and deer, which required new hunting skills and weapons. Our Cro-Magnon forebears adapted: the bow and arrow, for example, was clearly invented to hit a smaller, faster target. (Neanderthal, by contrast, may not have had the ability to update his hunting strategy, and so followed the big animals into extinction.) Yet ultimately, new technologies couldn't save the hunting life. By modern estimates, even well-equipped hunters during this period were probably earning fewer than fifteen hundred calories an hour hunting small game-quite a comedown from the high times of yore. As hunting success faltered, tribes had little choice but to supplement hunting with gathering: nuts and berries, edible roots, peas and other legumes, as well as various seeded grasses such as wild wheat and barley. This hunting and gathering, or broad-spectrum, strategy kept our ancestors alive, but barely. Research suggests that they needed hours and hours of effort, over many miles of territory, to find enough to eat. As the centuries went by, it became clear that the extractive food strategies, built around whatever foods nature made available, were failing. At some point, our ancestors would have to begin producing food. They would have to become farmers.

If the shift from hunting to farming was largely involuntary, our ancestors at least had the benefit of excellent timing. The same warming trend that took away the big animals had also expanded the range of certain edible plants and grasses, notably wheat and barley, into human territory. Such expansions had occurred before, but this time was different in several important ways. First, our ancestors now had a deep knowledge of plants, gleaned over several thousand years of foraging; they had certainly learned, for example, that wheat and barley were edible, and that fruit pits left in a garbage pile could sprout. Second, they had the beginnings of the social organization necessary to tackle the complex and large-scale task of farming. Third, and perhaps most important, they were highly motivated: the same weeds that successful big-game hunters had ignored must now have looked very interesting indeed.


Excerpted from The End of Food by Paul Roberts Copyright © 2008 by Paul Roberts. Excerpted by permission.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.