We do not know much about how long humans lived before 1750. Around that time, the first national population data were collected for Sweden and Finland. For earlier eras, we have some life tables constructed for municipal populations, members of the nobility, and other groups that were probably not representative of the national population at large (11,12). After 1750 and even today, we have extensive and highly reliable mortality information for only a subset of national populations.
For the Middle Ages and earlier, mortality levels have been estimated based on data gleaned from tombstone inscriptions, genealogical records, and skeletal remains (1). The accuracy of such estimates has been a subject of dispute (13-16). In studies based on skeletal remains, a key issue is the attribution of age based on bone fragments. Another problem for estimation based on either skeletal or tombstone data is uncertainty about the age structure of the population, which affects mortality estimates based solely on the distribution of ages at death. The only practical solution is to assume that the population was "stationary," implying a long-term zero growth rate and unchanging levels of fertility and mortality, and even an unchanging age pattern of mortality. Clearly, these assumptions are always violated, but the resulting estimates are useful nonetheless.
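The stationarity shortcut can be made concrete. In a stationary population, the observed distribution of ages at death coincides with the life table's d_x column, so life expectancy at birth reduces to the mean age at death, and remaining life expectancy at age x to the mean years lived beyond x among those who reached x. A minimal sketch with hypothetical (not historical) ages at death, assuming exact ages are known rather than completed years:

```python
# Hypothetical ages at death, as might be read from tombstones or
# skeletal remains (illustrative numbers only, not real data).
ages_at_death = [0.5, 1, 2, 24, 30, 35, 41, 55, 63, 70]

def remaining_life_expectancy(ages, x):
    """e_x under the stationary-population assumption: the mean number
    of years lived beyond age x by those who survived to age x."""
    survivors = [a for a in ages if a >= x]
    return sum(a - x for a in survivors) / len(survivors)

e0 = remaining_life_expectancy(ages_at_death, 0)    # mean age at death
e10 = remaining_life_expectancy(ages_at_death, 10)
```

If the population was actually growing rather than stationary, deaths are tilted toward the younger (larger) cohorts, and this calculation tends to understate true life expectancy, which is one reason such estimates are disputed.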
For mortality data derived from subpopulations, there is also the issue of whether the data are representative of some larger population. Who gets buried in a society, and who gets a tombstone? Which societies have regular burial practices, as opposed, say, to burning their dead? What kinds of populations have complete genealogical records from a particular time period? Thus, for many reasons, all estimates of mortality or longevity from the preindustrial period (roughly, before 1750) should be viewed with caution. The many sources of bias in these estimates include both positive and negative factors, which tend to balance each other to some extent (17), but the estimates remain inaccurate and/or unrepresentative by amounts that cannot be well quantified.
Although these historical estimates may be either too high or too low, they nonetheless provide a useful description of the general contours of the history of human longevity. For example, most scholars agree that life expectancy at birth (or e0, in the notation of demographers and actuaries) was probably in the 20s for early human populations. Some very disadvantaged societies might have had life expectancies in the teens, whereas others may have been in the 30s. With historical levels of life expectancy in the 20s, compared to around 75 to 80 years today in wealthy countries, the average length of life has roughly tripled.
Most of this increase was due to the reduction of infant and child mortality. It used to be the case, for example, that remaining life expectancy at age 1 was greater than at birth, because the toll of infant mortality was so high. The difference between premodern periods and today is less stark if we consider life expectancy at higher ages. Instead of the tripling of life expectancy at birth, remaining life expectancy at higher ages has roughly doubled over the course of human history. At age 10, for example, life expectancy (i.e., expected years lived after age 10) may have risen from around 30 to 33 years to almost 70 years (Thatcher AR. Life tables from the Stone Age to William Farr, unpublished manuscript, 1980). At age 50, it may have gone from around 14 years to more than 30 years (17).
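The life-table quantities behind these comparisons (e0, e1, and so on) can be sketched with toy numbers. The schedule below is hypothetical and compressed into five ages — not historical data — and uses the standard approximation that deaths within an age interval occur, on average, at mid-interval. It illustrates how heavy infant mortality can make remaining life expectancy at age 1 exceed that at birth:

```python
def life_expectancies(qx):
    """Given q_x, the probability of dying between exact ages x and x+1,
    return e_x (remaining life expectancy) for each age x.

    Uses the standard mid-interval approximation: those who die within
    an interval live, on average, half of it."""
    # l[x]: probability of surviving from birth to exact age x
    l = [1.0]
    for q in qx:
        l.append(l[-1] * (1.0 - q))
    # L[x]: person-years lived in the interval [x, x+1)
    L = [(l[x] + l[x + 1]) / 2.0 for x in range(len(qx))]
    # T[x]: person-years lived above age x; e_x = T[x] / l[x]
    e, T = [], 0.0
    for x in reversed(range(len(qx))):
        T += L[x]
        e.append(T / l[x])
    return list(reversed(e))

# Toy 5-age schedule with 30% infant mortality; the final q_x is 1.0
# so the table closes out.
ex = life_expectancies([0.30, 0.10, 0.05, 0.05, 1.0])
```

With mortality this heavy in the first year, ex[1] comes out larger than ex[0]: a child who survives to age 1 has more expected remaining years than a newborn, exactly the premodern pattern described above.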