Lifespan vs Life Expectancy

Discussion in 'Human Science' started by Gustav, Oct 3, 2012.

  1. Gustav Banned

    Messages:
    12,575
    i seem to have got suckered into non-critical thought in this matter
    ie: "nasty, brutish and short" is not entirely accurate


    Human Lifespans Nearly Constant for 2,000 Years


    The Centers for Disease Control and Prevention, often the harbinger of bad news about E. coli outbreaks and swine flu, recently had some good news: The life expectancy of Americans is higher than ever, at almost 78.

    Discussions about life expectancy often involve how it has improved over time. According to the National Center for Health Statistics, life expectancy for men in 1907 was 45.6 years; by 1957 it rose to 66.4; in 2007 it reached 75.5. Unlike the most recent increase in life expectancy (which was attributable largely to a decline in half of the leading causes of death including heart disease, homicide, and influenza), the increase in life expectancy between 1907 and 2007 was largely due to a decreasing infant mortality rate, which was 9.99 percent in 1907; 2.63 percent in 1957; and 0.68 percent in 2007.

    But the inclusion of infant mortality rates in calculating life expectancy creates the mistaken impression that earlier generations died at a young age; Americans were not dying en masse at the age of 46 in 1907. The fact is that the maximum human lifespan — a concept often confused with "life expectancy" — has remained more or less the same for thousands of years. The idea that our ancestors routinely died young (say, at age 40) has no basis in scientific fact.

    Yet this myth is widespread, and repeated by both the public and professionals. A few examples:

    * An article on Egyptian pyramid builders in the November 2001 issue of "National Geographic" noted, "Despite the availability of medical care the workers' lives were short. On average a man lived 40 to 45 years, a woman 30 to 35."

    * In a 2005 press release for the TV show "Nightline," a producer wrote, "I am 42 years old. I live in a comfortable home with my family…. I'm lucky. If I were in Sierra Leone, the poorest country in Africa, chances are I'd be dead at my age. The life expectancy there is 34 years of age."

    * A Dec. 18, 2003, Reuters news story on the impact of AIDS in Africa reported that "A baby girl born now in Japan could expect to live 85 years, while one born in Sierra Leone probably would not survive beyond 36."

    Such statements are completely wrong; most people in Sierra Leone are not dropping dead at age 34. The problem is that giving an "average age" at which people died tells us almost nothing about the age at which an individual person living at the time might expect to die.

    Again, the high infant mortality rate skews the "life expectancy" dramatically downward. If a couple has two children and one of them dies in childbirth while the other lives to be 90, stating that on average the couple's children lived to be 45 is statistically accurate but meaningless. Claiming a low average age of death due to high infant mortality is not the same as claiming that the average person in that population will die at that age.
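The skew described above is easy to reproduce. Here is a minimal Python sketch using invented numbers (a 30% infant mortality rate and survivors who all die at exactly 70 — illustrative only, not real demographic data):

```python
# Illustrative cohort: 1,000 births, 30% infant mortality,
# and (for simplicity) every survivor dying at exactly 70.
births = 1000
infant_mortality = 0.30           # fraction dying in the first year
adult_age_at_death = 70

infant_deaths = int(births * infant_mortality)   # 300
survivors = births - infant_deaths               # 700

# "Life expectancy at birth" averages over everyone, infants included.
mean_age_all = (infant_deaths * 0 + survivors * adult_age_at_death) / births

print(mean_age_all)        # 49.0 -- looks as if people "died at 49"
print(adult_age_at_death)  # 70   -- what a surviving adult could expect
```

No one in this toy cohort actually dies anywhere near 49; the 49 is purely an artifact of averaging in the infant deaths.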

    Of course, infant mortality is only one of many factors that influence life expectancy, including medicine, crime, and workplace safety. But when it is calculated in, it often creates confusion and myths.

    When Socrates died at the age of 70 around 399 B.C., he did not die of old age but instead by execution. It is ironic that ancient Greeks lived into their 70s and older, while more than 2,000 years later modern Americans aren't living much longer.


    thread inspired by One-Third of Married People Think of Sex as a Chore
    instead of choring for 46 years as was supposedly the case way back when, we now have to chore for 74.
     
  3. tashja Registered Senior Member

    Messages:
    715
    Not everyone has to ''chore.''

    Some are actually living the good life and would love to expand life past 100.
     
  5. Rhaedas Valued Senior Member

    Messages:
    1,516
    I take life span to mean how long the human body can stay functional before the aging process causes natural breakdown and death, whereas life expectancy is how long you can expect a person to live based on surroundings, family history, diseases, and injuries.

    I would probably agree that people ages ago could have lived the same amount of time, if there were no other circumstances at play. But the lack of medical care, both treatment and preventative, would certainly have played a big part. Get a nasty injury or illness back then, your chances of regaining full health would be a lot less than with 20th century medicine at your side. Isn't it true that there weren't as many wounded veterans of the American Civil War (or others) as in modern times, because usually infection from the wounds would kill them eventually?
     
  7. Read-Only Valued Senior Member

    Messages:
    10,296
    Yes, that's true. And it would be more accurate to say "in a short period of time" rather than "eventually." That time was generally measured in mere days, rarely ever a full week.
     
  8. Gustav Banned

    Messages:
    12,575

    yes
    i find it bogus to choose arbitrary points from which to calculate life expectancy. it has to be from the point of birth to be of any value. sure your chances to 40 are better if you make it past 2. likewise, better chances to 80 if you make it past 40 but why skew the data like that? life expectancy is exactly that, the period b/w birth and death.

    It can be argued that it is better to compare life expectancies of the period after adulthood to get a better handle on life span. Even during childhood, life expectancy can take a huge jump as seen in the Roman Life Expectancy table at the University of Texas where at birth the life expectancy was 25 but at the age of 5 it jumped to 48. Studies like Plymouth Plantation; "Dead at Forty" and Life Expectancy by Age, 1850–2004 similarly show a dramatic increase in life expectancy once adulthood was reached. (wiki)
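The jump in the quoted table can be mimicked with a toy conditional calculation. All numbers below are invented for illustration; they only echo the shape of such tables, not any real data:

```python
# Toy death distribution: age at death -> number of deaths, out of 1,000 births.
deaths = {0: 400, 5: 100, 30: 150, 50: 200, 70: 150}

def life_expectancy(deaths, from_age):
    """Mean remaining years for those who survive to `from_age`."""
    cohort = {age: n for age, n in deaths.items() if age >= from_age}
    total = sum(cohort.values())
    return sum((age - from_age) * n for age, n in cohort.items()) / total

print(life_expectancy(deaths, 0))   # 25.5 -- expectancy at birth
print(life_expectancy(deaths, 5))   # 37.5 -- jumps once infancy is survived
```

Dropping the 400 infant deaths from the denominator is all it takes to move the figure from the mid-20s to the high 30s — the same mechanism behind the Roman table's 25-to-48 jump.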


    as for lifespan, give me the maximum recorded age and i will assign that value to it
     
  9. iceaura Valued Senior Member

    Messages:
    30,994
    One reason is it prevents unwarranted assumptions about the quality of the lives being lived.

    The implications for living conditions are much different if people are dying in their forties, as opposed to either succumbing as infants or living to 80. If making it to 7 means one has good odds for 70, life is not "nasty, brutish, and short" by most people's assessments.

    And by implication, preventing childhood vulnerabilities from being lethal can improve the stats a lot without implying improvements in people's lives - so assumptions of improved lives from such statistics are deeply flawed.

    Somewhere there's a movie set in rural Japan about an old woman - in her 60s? - whose teeth are still good, an uncanny circumstance that prevents her children from disposing of her and making room (food, etc) for the grandchildren to grow up. It's a tense, troubled situation - and based on real historical circumstances, not too long ago. The feature of interest here is that this settled, agricultural community had a fairly low rate of child mortality, but limited resources - so some hard choices obtained, and life was what we would call brutal, despite what was probably a comparatively long birth-to-death lifespan compared with, say, the Northern Cheyenne of aboriginal America circa 1700 (the tallest people in the world at the time, as far as John Komlos has discovered) or the Cro-Magnon peoples of stone age Europe (likewise tall, strong, healthy adults).
     
  10. Fraggle Rocker Staff Member

    Messages:
    24,690
    An exhibit in the Smithsonian says that, indeed, human life expectancy did decrease significantly between the Agricultural Revolution 12KYA and the zenith of the Roman Empire. It says that analysis of skeletons of the last cavemen shows that an adult who had survived childhood (infant mortality was about 80% in those days) was likely to reach an age between 50 and 55. But the skeletons of the Romans show that the average adult in that era lived only into his mid-20s.

    The reason was the Agricultural Revolution itself. Stone Age people were hunters who ate meat every day. Meat is a perfectly balanced food for humans because the bodies of other animals contain exactly the same balance of amino acids, vitamins and minerals as our own bodies. But as agriculture spread, people ate more grains and less meat. By the Roman Era cities were so crowded that it was physically impossible to raise enough meat in the nearby farms to feed them.

    Grains are not a perfectly balanced food. They don't even have the right ratio of amino acids to support human life. You can correct that by mixing them with the protein in legumes, but still the vitamin and mineral content in the total mixture is pathetic. Try surviving only on peanut butter sandwiches for a year! Probably the first thing that will happen is that all your teeth will fall out.

    Of course the ruling class got the meat, and the rest of the aristocracy was able to get eggs and dairy products, which are nutritionally identical to meat. They also had a more varied diet of vegetables, which provide many of the nutrients lacking in grains.

    But the peasants, as well as the huge population of slaves, "lived by bread alone." Or should I say "died by bread alone." Of course that's a bit of an exaggeration or none of them would have even lived to age 25.
     
  11. quadraphonics Bloodthirsty Barbarian Valued Senior Member

    Messages:
    9,391
    A "perfectly balanced food" would not exhibit the exact same balance of amino acids, vitamins and minerals as our own bodies. This is because the purpose of consuming food is not simply to replace parts of our extant bodies, but also to provide fuel for our bodies to use for various activities. So a perfectly balanced food should have a lot more carbohydrates than are found inside human body tissues.

    You are right that a grain-only diet is a bummer, especially in the classical world. However, you elide the basic causal chain in the situation: the reason that people were eating so much more grain after the Agricultural Revolution is exactly that there was now enough food to sustain vastly greater populations, albeit mainly on grains. Those people who were dying at age 25 are people who would either never have been born, or would have died in infancy.
     
  12. R1D2 many leagues under the sea. Valued Senior Member

    Messages:
    2,321
    Good one F.R.
    I would like to add that if I do live past, say, 80 years old, I hope I still have my mind. A person with memory loss or Alzheimer's disease in old age does not, in my view, have a good lifespan, because it's similar to being dead, though not exactly. They may provide some comfort to some people and a severe burden to others. It's like living without any idea of what living means or is. I hope that in my lifespan I don't fall into Alzheimer's disease, or wreck a car or something and lose my mental capability, or most of the use of my limbs.
    If I got paralyzed from the neck down, I could not "vanquish myself" if need be, like after getting mauled by a rabid dog.
     
  13. Fraggle Rocker Staff Member

    Messages:
    24,690
    The protein in meat can easily and efficiently be used as energy. Amino acids can be broken down into carbohydrates and the nitrogen discarded through the kidneys and lungs as waste.

    Many species of vertebrates are predators that eat meat exclusively or almost exclusively. As our ancestral species invented more effective weapons, our digestive system adapted to a higher proportion of meat in our diet and lost the huge bacteria-hosting gut that allows other apes to digest cellulose. By the time Homo sapiens arose, with our singularly large brains that require massive amounts of protein, we were obligate carnivores. We are now the apex predator on this planet, dining on the flesh of both bears and sharks.

    A small band of lazy, curious wolves took up residence on the outskirts of one of our villages, looking longingly at the gigantic piles of perfectly good food we left lying around. They were astounded to discover that we were happy to let them eat our garbage and clean up the place, so long as they used their night vision to protect us from other predators, tagged along on our hunting expeditions and used their superior sense of smell to find game, and did NOT eat the human babies. Eventually they developed into a distinct subspecies of wolf, adapted to the much lower-protein diet of a scavenger. The predictable result of this evolution is that dogs have significantly smaller brains than wolves.

    The Agricultural Revolution was a true paradigm shift. Cultivation of plants and domestication of animals produced the first food surplus that ever existed. Before then, people were at the mercy of the climate. Without permanent homes, pottery, draft animals or wagons, their food preservation and storage technology was limited to what they could carry on their hunting and gathering journeys, and perhaps a little that could be successfully concealed from scavengers in a base camp. So as the human population increased to the carrying capacity of the land, in a bad year rival tribes had no choice but to encroach on each other's territory and fight over the food. This is the reason that more than half of adult Paleolithic skeletons show clear evidence of death by violence.

    The Neolithic Revolution both permitted and required humans to fundamentally change their behavior and settle in permanent villages. The surplus food mitigated the need for nearby tribes to compete. In fact economies of scale and division of labor encouraged once-rival tribes to come together and form larger villages, which produced a greater surplus and ultimately allowed a few people to choose "careers" outside of the food production and distribution "industry," such as weavers, potters and musicians.

    For a long time, agriculture did not particularly focus on grains. Figs were one of the first cultivated crops in the Old World, and peppers had that distinction in the New World. Animal husbandry provided plenty of meat for the early settlements, and much of the pasture land was used to grow fodder for the herbivorous meat animals, rather than grains for human consumption. People still liked to go hunting (after all, they still do today!) and with the aid of their newfound partners, the dogs with their phenomenal sense of smell and their faster running speed over short distances, they brought home exotic game to enhance the daily diet of goat and pig meat. (Like the dogs, these animals are scavengers. They too were attracted by our garbage and basically self-domesticated. But they didn't get nearly as good a deal out of it as the dogs.)

    This is the current paradigm of old age, so we all need to find a way to acclimate to it. Alzheimer's is the leading cause of death for retired Americans. You're more likely to die a slow death from that, than something quicker and more merciful like a stroke or heart attack. You'll lose your dignity and your entire sense of who you are, while your loved ones watch in helpless grief, and the nursing home keeps your body alive so they can keep charging you rent.

    At my age (69) I know many people whose parents are in nursing homes (including my wife's mother), and many of those parents have Alzheimer's (although at age 95 she does not). I think "burden" is the operative word here, not "comfort."

    To put it more bluntly, it's "living" the way an oak tree or a carrot lives.

    Fate does not care about your hopes. If you live to be 85, the probability is about 40% that you will have Alzheimer's.

    This is why assisted suicide must be legalized.
     
