Homesteading: Lessons from Ancient History
What is the price of urban living?
At the core of the modern homesteading movement is the rejection of urban life and a desire to return to a more decentralized, self-reliant rural lifestyle. In many (but not all) cases, this migration out of cities and into the country coincides with a commitment to a healthier diet, often including increased intake of animal-derived protein and decreased reliance on carbohydrates to meet caloric requirements. Is this just a modern fad, or are there historical precedents supporting what many experience as a form of intuitive wisdom?
Let’s Start with Decentralization.
Three major, interconnected problems in urban life are the provision of sufficient clean water, the disposal of human waste, and the control of infectious diseases. As Jill once candidly observed during a summer trip to Rome, “the place smells like it has been lived in for thousands of years.” Downtown Atlanta has a similar summer perfume without the excuse of millennia of history. Historically, as urbanization develops, human waste contaminates drinking water sources, and the combination of contaminated water and close habitation enables the development and spread of a wide variety of human microbial parasites, otherwise known as infectious diseases. In a sense, the history of cities is dominated by attempts to solve this inconvenient problem through innovative technology. The miraculous Roman aqueducts are one notable example, and the development of the London sewer system is another.
One reason that many cite for migrating from high-density cities to rural regions is a sense of growing decay and deteriorating infrastructure in cities, and a desire to minimize a wide variety of risks, particularly for families with children. Is there an easy way to estimate the costs and benefits of urban versus rural living? Is there an “Urban Penalty” to be paid when choosing city life over country living?
One easy measure used throughout human history is infant mortality. Our children and our elderly are the most vulnerable to the effects of poor nutrition, poor sanitation, and infectious diseases, but tracking infant mortality provides the most clear-cut data, since there is a clear start (birth) and endpoint (death) that are closely coupled when measured over the first year of life. This is why the recent increase in infant mortality in the United States is so alarming. 16th-century Europe provides a useful case study, as modern sanitation systems had yet to be developed and deployed. In a sense, by looking back through time to an earlier period of urbanization, we can gain insight into a future where more advanced urban decay may become commonplace.
Infant mortality (deaths in the first year of life) in 16th-century Europe was generally much higher in urban settings than in rural ones. This was part of the well-documented “urban penalty” (or “urban graveyard” effect) that characterized early modern European cities until significant improvements began in the mid-to-late 18th century. Rural areas offered better survival chances for infants due to lower population density, better access to fresh air, cleaner water sources, less exposure to crowd diseases, and more widespread maternal breastfeeding. Take a moment to reflect on that: these reasons remain core justifications for many choosing to migrate back to the land in the 21st century!
Historians describe cities in this period as demographic sinks that required constant immigration from the countryside just to maintain (let alone grow) their populations. During the 16th century, cities (especially larger ones like London, Paris, and the major trade centers) suffered from a variety of problems that are still relevant today:
High population density, with the associated rapid spread of infectious diseases (smallpox, typhus, dysentery or “bloody flux”, tuberculosis, influenza, etc.). COVID-19 triggered a major migration from urban to rural areas, often into second homes for those wealthy enough to afford them. Imagine what would happen if we faced a true high-mortality infectious disease outbreak.
Poor sanitation and hygiene
Contaminated water and food. In the modern urban setting, we now also contend with “food deserts”.
Frequent epidemic outbreaks (including periodic plague visitations)
Greater reliance on artificial feeding or wet-nursing (especially among wealthier urban families, who frequently sent their infants to rural wet-nurses, often with mixed results). In the modern context, urban families usually employ a variety of “day care” solutions so that both parents can return to work to cover the financial costs of their urban lifestyle.
For a 16th-century European family, having an infant in a bustling city was statistically far riskier than in the countryside, often by a factor of 1.5 or more in terms of mortality risk during the first year of life.
Precise Europe-wide statistics for the 16th century are scarce and vary by region, but reconstructions from parish registers, family reconstitutions, and bills of mortality provide the following rough picture:
Overall European infant mortality (1500–1600): Typically 200–300+ deaths per 1,000 live births (20–30% or higher), with some areas approaching 350 per 1,000 in the worst urban conditions.
England (best-studied case): Around 140–250 infant deaths per 1,000 on average, but clearly higher in urban areas than rural ones. Deaths from infectious diseases were explicitly noted as higher in towns.
Large cities (e.g., London in the late 16th–early 17th centuries): Often 250–350+ infant deaths per 1,000 (sometimes peaking higher during epidemics), significantly worse than in the countryside.
Rural areas: Generally 100–200 deaths per 1,000 in healthier agricultural regions, though still very high by modern standards.
The urban–rural gap was substantial and persistent throughout the 16th century, with cities acting as major mortality hotspots. This pattern held across much of northern and western Europe (England, northern Germany, France, Belgium, the Netherlands, and Luxembourg). However, regional variations existed (e.g., breastfeeding practices influenced outcomes more in some areas than others). The urban penalty only began to diminish significantly after ~1750, with marked improvements in infant survival in cities during the second half of the 18th century (due to changes in breastfeeding practices, public health measures, and declining epidemic intensity).
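For readers who like to sanity-check these figures, here is a minimal back-of-the-envelope sketch in Python. The midpoint values are illustrative assumptions taken from the ranges quoted above, not precise historical estimates:

```python
# Rough urban-penalty calculation using the midpoints of the
# 16th-century ranges cited above (illustrative figures only).
urban_imr = (250 + 350) / 2   # large-city infant deaths per 1,000 live births
rural_imr = (100 + 200) / 2   # rural infant deaths per 1,000 live births

penalty = urban_imr / rural_imr
print(f"Urban: {urban_imr:.0f} per 1,000 live births ({urban_imr / 10:.0f}%)")
print(f"Rural: {rural_imr:.0f} per 1,000 live births ({rural_imr / 10:.0f}%)")
print(f"Implied urban penalty: {penalty:.1f}x")
# At these midpoints the ratio works out to roughly 2.0x, comfortably
# above the "factor of 1.5 or more" quoted earlier.
```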
Of note, in modern times, the prevalence of asthma among children in urban areas is also much higher than in rural areas.
What about Diet?
Wheat and bread are not the only factors driving the development of cities, but reliable caloric surpluses were the indispensable foundation that made large-scale urbanization possible in the ancient world. And in many ways, they still underpin urban life today. Returning to Rome, think about the Italian obsession with pasta, or the bread-baking traditions throughout the northern regions of Europe.
Wheat (and products made from it) played a central role in enabling early urbanization, particularly in the ancient Near East, which was the cradle of the first true cities in Mesopotamia (modern Iraq) and ancient Egypt. This connection is one of the most fundamental in human history: the Neolithic Revolution (~10,000–8,000 BCE) involved the domestication of wheat (especially emmer and einkorn varieties) in the Fertile Crescent, which created reliable food surpluses. These surpluses were the key preconditions for supporting large non-farming populations, specialization of labor, social hierarchies, and eventually full-fledged cities and states.
Wheat (along with barley) had unique advantages over other early crops that made it particularly well-suited to fueling urban growth:
High yield and storability: Wheat produced dense, calorie-rich grains that could be harvested, threshed, stored for years in granaries, and transported without spoiling quickly.
Taxability and “legibility” to states: Anthropologist James C. Scott argues in “Against the Grain” (2017) that cereal grains like wheat are visible, divisible, assessable, storable, and “rationable.” Rulers could efficiently tax, measure, and redistribute wheat stores, unlike root and tuber crops that grow underground and are harder to confiscate.
Caloric density and processing into bread: Wheat could be ground into flour and baked into bread (or fermented into beer), providing a concentrated, portable staple food ideal for feeding dense urban populations, workers, soldiers, and elites.
Irrigation compatibility: In river valleys like the Tigris–Euphrates (Mesopotamia) and the Nile (Egypt), annual floods deposited nutrient-rich silt, allowing massive wheat and barley production. Organized irrigation canals could create surpluses far beyond subsistence needs.
Without these key wheat characteristics, sustaining cities of tens of thousands (e.g., Uruk in Mesopotamia ~3500 BCE, with ~50,000+ people) would have been nearly impossible. The domestication and cultivation of wheat are inextricably interwoven with both urbanization and the development of the early state, with its taxation and standing armies. But a diet rich in carbohydrates and poor in animal protein comes at a cost.
To illustrate this point, I will use some of the ideas presented in two recent posts on “X”. Both were provided by Sama Hoole (@SamaHoole), who examined monastery records as part of his research to gain insight from historical “natural experiments” in dietary change.
Medieval monasteries maintained detailed, private health records tracking the well-being of their communities. In the early medieval era, monks regularly ate fish, pork, and game when available, along with cheese and eggs every day. These records show typical medieval health patterns: injuries, occasional illnesses, and mostly active lives into old age.
Church doctrine evolved, leading to stricter rules among religious orders. Meat came to be considered spiritually corrupting, prompting monks to adopt bread-based diets with beer for liquid nutrition. However, meticulous records from the same monasteries 50 years later tell a very different story.
Obesity begins to appear frequently in these records, with gout becoming known as a typical “monk’s disease.” Documentation of arthritis also rises notably, alongside descriptions of lethargy, swelling, joint pain, and metabolic issues. Monasteries with the strictest meat restrictions tended to have the worst health outcomes. Conversely, monks who ate fish and cheese on fasting days fared better than those who ate only bread and beer.
The irony is striking. They banned meat for spiritual purity but caused physical decline. They believed animal food corrupted the soul, so instead they ate grain, which damaged their metabolism. Medieval doctors documented this clearly: they prescribed meat to sick monks who recovered, then returned them to bread and beer diets, and watched them deteriorate again. The monks kept detailed records of their own decline, blaming it on spiritual weakness rather than nutritional harm.
In the late 700s AD, Scandinavian farming was limited due to a short growing season, poor soil, and barely viable grain crops. Most Norse relied on fishing, hunting, herding, and raiding for sustenance. Their diet primarily consisted of fatty fish like salmon and herring, seal, whale blubber when accessible, dairy from cattle and goats, and occasional pork. Grain served as a supplement to an animal protein diet. Children grew up consuming raw milk and fatty fish as their main dietary staples.
In agricultural Europe, peasants typically subsisted on simple meals of grain porridge and occasional vegetables, with minimal meat. Social class governed access to protein: the nobility ate meat more frequently, while everyone else primarily consumed grain.
In 793 AD, Vikings raided Lindisfarne (a tidal island located off the northeast coast of England), with accounts describing the raiders as unusually tall, physically powerful, and terrifying. This pattern persisted for 250 years throughout Europe. Viking success was due not only to tactics but also to the raiders' physical superiority. The average Viking stood between 5'9" and 5'11", with strong bones, excellent teeth, and robust health. Skeletal analysis reveals minimal deficiencies and quick healing.
The typical European peasant stood about 5'4" to 5'6". They commonly experienced dental cavities, nutritional deficiencies, and had weaker skeletons. Vikings weren't genetically superior; they consumed fatty fish and animal products, which built stronger bodies, whereas Europeans mainly ate grains.
When Vikings settled in conquered territories such as Normandy, England, and Ireland, a pattern emerged: the first generation remained physically dominant and retained the “Norse” size; the second generation began to shrink; and by the third generation, they looked like everyone else.
What changed? Diet. Norse settlers began eating more grain and less fish and meat, relying more on local agriculture. After three generations, their physical advantage was lost.
This is documented in Normandy. In 911 AD, Rollo's Vikings settled there. They were notably taller and stronger than the Franks. By 1000 AD, their descendants resembled the French they had conquered. These were the same genetics, but different diets across generations led to different outcomes.
The opposite is also true. The Norse settlers in Iceland kept fishing and hunting. They did not convert to a grain-based diet. This animal-based diet supported the population's vitality for centuries.
The Viking Age came to a close around 1066, not because Europeans became better fighters, but because Vikings gradually blended into the agricultural European lifestyle. The fierce warriors who terrorized Europe for over 250 years gradually disappeared as they adopted local diets. After just three generations of eating grains, they started to resemble the very people they had previously raided.
The key lesson is that meat-eating sea raiders from remote lands successfully conquered farming societies because their animal-based diets conferred a stronger, taller phenotype than that of their farm-based counterparts. However, that physical advantage disappeared over generations as diets changed.
The transition from hunter-gatherer lifestyles to agriculture, known as the Neolithic Revolution, began around 10,000–12,000 years ago. It involved a major dietary shift toward carbohydrate-rich foods, particularly domesticated cereals such as wheat, barley, millet, and rice. This change meant that people ceased to lead nomadic lifestyles, thereby increasing sedentism and population density and encouraging animal domestication. These changes produced well-documented alterations in disease patterns, as evidenced by analyses of skeletal remains, dental records, and bioarchaeological data.
While the shift was not uniformly catastrophic (regional variation existed, and some health indicators improved), studies indicate a general deterioration in many aspects of health among early farmers relative to their hunter-gatherer predecessors. This was not a simple “worse in every way” story.
Some studies (e.g., from the eastern Mediterranean) report a complex profile: higher infectious disease incidence but fewer traumatic injuries overall. Fertility rose sharply, as settled life and reliable calories allowed more frequent pregnancies; this offset higher mortality and enabled population growth despite poorer individual health. James C. Scott’s “Against the Grain” emphasizes that early farmers often faced worse nutrition, more disease, and more strenuous labor than foragers, and that the rise of early states later exacerbated these conditions through taxation and coercion.
The key (anthropologically documented) disease and health changes most directly linked to the Neolithic transition to a carbohydrate-rich diet included the following:
Dental caries (cavities) and periodontal disease increased worldwide.
This is the most consistent and dramatic change. Hunter-gatherers typically had very low rates of caries due to diverse, often low-starch diets with minimal simple sugars. The adoption of starchy grains dramatically increased caries prevalence, often severalfold or more, as oral bacteria thrived on the new grain-based diet.
Nutritional deficiencies and related conditions increased.
Cereal-heavy diets were often lower in protein, fiber, micronutrients (e.g., iron, zinc, vitamin C, B vitamins), and dietary diversity than varied forager diets. This contributed to iron-deficiency anemia and stunted growth: average adult stature often declined by several centimeters (5–10 cm in many early farming populations). A similar decline is evident in the late 19th century, partly the result of infant weaning practices that relied on thin gruel porridges. Our ancestors weren’t necessarily short because of starvation; rather, their grain-rich diets were a critical factor.
Infectious and zoonotic diseases increased.
While not solely due to carbohydrates, the dietary shift indirectly amplified risks through sedentism and overcrowding, as discussed above. Close proximity to domesticated animals facilitated the spread of pathogens and gave rise to zoonotic diseases such as the plague. Poorer overall nutrition also weakened immune responses. Skeletal evidence shows higher rates of nonspecific infection markers (periostitis), and some specific diseases (e.g., tuberculosis) first appear or increase.
In essence, the transition from a lifestyle built around diverse foraging to a carbohydrate-dominated diet and permanent settlements provided caloric stability, but at a price. It enabled larger populations but at the cost of increased dental caries, nutritional stress markers, and greater vulnerability to certain deficiencies and infections. These effects have been observed throughout human history when meat-based diets are abandoned, and, to some extent, this pattern persists in the present day.
In the United States, the modern diet has reshaped the American phenotype in subtle but profound ways. Widespread consumption of ultra-processed, calorie-dense yet nutrient-poor foods has produced a population that is heavier but less muscular, taller but structurally weaker, and metabolically older at younger ages. High intakes of refined carbohydrates and seed oils, combined with lower protein and micronutrient density, promote visceral fat accumulation, insulin resistance, chronic inflammation, and earlier puberty, while soft, processed foods contribute to reduced jaw development, dental crowding, and airway problems. The result is an “overfed but undernourished” phenotype: people who are larger in size but biologically more fragile, with higher rates of chronic disease, skeletal weakness, and metabolic dysfunction than earlier American generations.
So, get ahead of the curve. Escape to the country, and consider increasing dietary animal protein and nutrient-dense vegetables and fruits while decreasing carbohydrates. Stay away from ultra-processed foods. For yourself, your family, and especially for the growth, health, and well-being of your children.