There are so many mantras in the modern food movement that keeping track of any single goal can be a tall task. But no matter whom you ask, at least one concern seems to arise time and time again. The ascent of big food companies, which has allowed Americans to eat more food, more cheaply, and at more times of the year than ever before, has led to a massive decline in the amount of food we source from nearby farms.
The response to our current, largely commercial food system is often a simple reminder that feeding this many finicky people is a job too big for localized food systems. All the romantic things that come with local food—small farms, seasonal produce, less guilt—are difficult to provide when cities like New York house almost 10 million people.
There is little doubt that both demographic and agricultural trends have made it more difficult than ever to feed cities with food grown nearby. Look no further than the death of the small American family farm for evidence of the difficulties.
But there's actually good reason to believe the future of local food is brighter than many make it seem. Even though the potential for locally sourced food has decreased over time, it hasn't fallen nearly as far as the amount of locally sourced food we actually eat today.
It remains surprisingly high, in fact. As much as 90 percent of Americans could eat food grown within 100 miles of their home, according to a new study by professors Andrew Zumkehr and J. Elliott Campbell, who teach engineering at the University of California. More than 80 percent of mouths could be fed with food grown within 50 miles.
The researchers, who recently published their findings in the journal Frontiers in Ecology and the Environment, used a theoretical model in which all cropland is used to produce foods eaten as part of a standard diet. It doesn't mimic exactly what people eat today, but it's close—it mirrors, for instance, the number of calories currently consumed in different food groups, such as vegetables, meats, grains, and eggs. They then estimated per capita food demand in different areas and the food production associated with different cropland, and calculated an allocation of foods from farmed areas that maximized the number of people eating food grown within 100 miles of where they live.
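To make the allocation idea concrete, here is a toy sketch of it, not the authors' actual model: hypothetical farms with a calorie supply are matched to hypothetical cities with a calorie demand, and each city draws from farms within a fixed radius. The study solves this as a proper optimization over real cropland and diet data; this sketch uses a simple greedy assignment and made-up numbers purely for illustration.

```python
from math import hypot

RADIUS = 100  # miles; the study's "local" cutoff

# Hypothetical farms: (x, y, calories available)
farms = [(0, 0, 500), (60, 0, 300), (300, 0, 200)]
# Hypothetical cities: (x, y, calories demanded, a proxy for population)
cities = [(30, 0, 600), (310, 0, 300)]

def fed_locally(cities, farms, radius=RADIUS):
    """Greedy allocation: each city draws from farms within `radius`
    until its demand is met or nearby supply runs out. Returns the
    fraction of total demand that can be met locally."""
    supply = [f[2] for f in farms]
    fed = 0
    total = sum(c[2] for c in cities)
    for cx, cy, demand in cities:
        need = demand
        for i, (fx, fy, _) in enumerate(farms):
            if need <= 0:
                break
            if hypot(cx - fx, cy - fy) <= radius and supply[i] > 0:
                take = min(need, supply[i])
                supply[i] -= take
                need -= take
        fed += demand - need
    return fed / total

print(round(fed_locally(cities, farms), 2))  # → 0.89
```

In this toy example the first city is fully fed by its two nearby farms, while the second city's lone nearby farm falls short, so about 89 percent of total demand is met locally. The real model replaces the greedy loop with an allocation chosen to maximize that fraction nationwide.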
The maps below, plucked from the study, show how and where the ability to source food locally has changed over the years. The size of the bubbles corresponds with the number of people in each city. The colors show what percentage of the population in those cities could eat locally, if the food system were designed to feed as many people locally as possible.
In 1900, perhaps unsurprisingly, there were only a couple of places where people couldn't eat food grown not far from their backyards if they wanted to. Today, the story is quite different.
But the fact that large swaths of the country could, theoretically, subsist on local farm systems is encouraging, especially considering the scope and uniqueness of the study. Campbell's research is among the first of its kind. Previous attempts to gauge the scale of local food systems have focused on a regional rather than national scale. What's more, the conversation around local food, as one of the movement's champions Michael Pollan noted in a statement about the study, "has been hobbled by too much wishful thinking and not enough hard data."
It also, too often, consists of grandstanding about the benefits of local food, without considering or discussing the actual feasibility of turning it into a reality.
The takeaway isn't that we can flip the food system on its head overnight, but rather that a good deal more of the country might be able to eat locally than previously thought, which is a big deal in and of itself.