Lengthening Lives: 7 Ways Science Has Changed Lifespans

Michelle Powell-Smith - October 30, 2016

Over the course of the 20th century, the average lifespan in the developed world increased by 60 percent. A baby boy born in 1900 had an expected lifespan of 47 years. In 2000, his great-great grandson could expect to live to 75. What made the difference? What major changes became common in the 20th century, and what might push that number even higher in the future?

Pasteurization

The United States Food and Drug Administration defines pasteurization as "a process that kills harmful bacteria by heating milk to a specific temperature for a set period of time…[it] kills harmful organisms responsible for such diseases as listeriosis, typhoid fever, tuberculosis, diphtheria, and brucellosis." The process was invented by the scientist Louis Pasteur in 1864. Pasteurization is not intended to kill all microbes present in a food or fluid, and is commonly used for milk, juice, and canned foods.

Pasteur originally developed pasteurization to prevent spoilage in beer and wine. He had researched fermentation extensively, and was seeking a solution to problems in the wine industry. Pasteurization of young wines allowed them to be aged without spoilage. The technology was not applied to milk until significantly later.

In the United States in the 1870s and later, milk spoiled quickly, and flavorings or chemicals were sometimes added to disguise spoilage. In 1886, German scientist Franz von Soxhlet proposed pasteurizing bottled milk in the same way as wine.

As children were the largest consumers of milk, they were the ones most at risk from milk-borne illnesses. As late as 1907, hundreds of children in New York City alone died each year from illnesses contracted from milk. The city did not institute widespread pasteurization until 1914, following a typhoid epidemic. After the introduction of pasteurization, the infant mortality rate dropped from 240 per 1,000 live births to 71 per 1,000 live births.

It took several decades for pasteurization to become the norm across the United States, and even today, raw milk is linked to outbreaks of milk-borne illnesses. The sale of raw milk is prohibited in some states. Good sanitary and hygiene practices, as well as careful animal care, reduce, but do not eliminate, the bacterial risks associated with raw milk. In addition, quick consumption of milk products, rather than shipping and storage, reduces the time available for dangerous bacterial growth.

Clean Water

Access to clean drinking water and appropriate wastewater management has also had a substantial impact on human life spans. Without clean water, whether in the past or in parts of the world today, people are at risk for a wide range of water-borne parasites and illnesses. Today, some 1.8 billion people rely on contaminated drinking water sources, so unfortunately, this issue isn’t just history.

Many different illnesses can be spread through contaminated water sources (and let’s be clear here: when we say contaminated, we’re referring to fecal contamination, or poop), including cholera, dysentery, typhoid fever, and campylobacteriosis. To understand how this impacted the past, you need to think about where people got their water, and how they handled both human and animal waste of various sorts.

In some historical periods and places, like ancient Rome, people had access to relatively clean water sources: Rome’s fountains provided drinking water delivered by aqueduct from the countryside. That was, however, an exception to the water options generally available. In most cases, people got water from local streams, rivers, and ponds, which ranged from relatively clean, in the case of a rural spring, to downright disgusting, in the case of the River Thames.

Providing clean water is a two-part process. People need access to sources of clean water, like wells, springs or fountains, but also need a way to appropriately manage waste. That means latrines situated well away from flowing water, indoor plumbing, and measures to ensure that other forms of waste, like the leftovers from a slaughterhouse, are not dumped into the water.

While efforts to provide clean water date to the ancient world, the first municipal water purification plant didn’t open until 1804. Municipal water filtration was introduced in London in the 1850s. These measures were especially important as pollution grew with the industrial revolution, putting traditionally clean water sources at risk. Today, wells and water purification systems are still needed to provide clean, safe water in many parts of the world.

Germ Theory

Germ theory states that many diseases are caused by the presence and actions of specific microorganisms within the body. Before germ theory, doctors believed that diseases were the result of miasma, a sort of bad air, or of an imbalance of components in the body called humors. The discovery of microorganisms came well before germ theory, but it was only with germ theory that the importance of antiseptics, and later antibiotics, came to be understood.

In fact, microorganisms were recognized as early as 1677, when Antonie van Leeuwenhoek saw them through his microscope. This was the first definitive proof of the existence of life forms too small to see with the naked eye; Leeuwenhoek called them animalcules. When scientists later saw these “animalcules” in the blood of individuals with diseases, they believed the organisms were the result of the illness, not the cause of it.
Before the late 19th century, there was no understanding of germ theory, and hospitals were especially dangerous places. A number of scientists were essential to the development of germ theory, including Ignaz Semmelweis, Joseph Lister, and John Snow; however, it was the research of Louis Pasteur that took germ theory mainstream.

Joseph Lister published the first paper on the importance of antiseptics in a medical setting in 1867, advocating the use of carbolic acid as an antiseptic. Semmelweis had published a paper in 1861 on puerperal, or childbirth, fever, noting the risks associated with unclean environments.

While the use of antiseptics dramatically reduced the risk of infection, they did not become widespread until the 1890s. It did not become routine to clean and disinfect surgical tools, operating theaters, or even to wash the surgeon’s hands until around 1900. With the advent of antiseptic procedures and understanding of germ theory, the risk of infection for wounds and surgeries was dramatically reduced. In addition to providing the essential understanding of antiseptics, germ theory was also critical to a thorough understanding of diseases of many different types.

Vaccination

The principle behind inoculation or vaccination is relatively simple. A milder, weakened, or similar variant of an illness is introduced to the body to trigger a natural immune response. The immune response then provides the individual with long-term resistance to the illness. The first immunization significantly predated any real understanding of germ theory.

The first form of inoculation was variolation: the controlled transfer of material from a smallpox lesion to a healthy person’s arm, intended to produce a very mild version of the illness. The individual would become sick, but recover and gain immunity. Variolation had been practiced in Asia since at least the 1500s.

In the 1790s, Edward Jenner recognized that milkmaids, who commonly contracted cowpox, were immune to or largely resistant to smallpox. Unlike smallpox, which was quite serious and could lead to death in many cases, cowpox was a relatively mild illness. Jenner took pus from a cowpox sore and inoculated a young boy with it, later exposing the child to smallpox on multiple occasions. The boy never developed smallpox. By 1800, 100,000 people had been vaccinated against smallpox in Europe.

Few further advances in vaccination came until Pasteur’s development of a rabies vaccine in 1885. The polio vaccine later brought an end to the polio epidemics of the 1950s, and additional vaccines were developed for many other previously common illnesses, including diphtheria, pertussis, measles, mumps, and rubella.

In the United States alone, each year, immunizations prevent some 33,000 deaths. The availability of vaccinations has dramatically reduced childhood mortality rates, thereby extending the overall life span.

Antibiotics

Bacterial illnesses and infections once killed very large numbers of people around the world. Today, many of those infections are relatively minor illnesses, quickly remedied with antibiotic medications. Antibiotics are a fairly recent innovation. Before the discovery of antibiotic compounds, common illnesses like strep throat were regularly fatal, and illnesses like bacterial meningitis were fatal 90 percent of the time. Today, a trip to the doctor’s office and a prescription can treat, and often cure, many of these conditions.

In 1928, the scientist Alexander Fleming recognized that a mold growing in his laboratory, a Penicillium mold, could kill Staphylococcus aureus bacteria. Fleming and other scientists continued research on a compound derived from the mold, now called penicillin. By 1941, they found that penicillin could cure a range of different bacterial illnesses. Penicillin was used to treat war injuries during World War II, and was widely available to the general public by the late 1940s.

Research on new, stronger, and different antibiotics began almost immediately. While penicillin was very effective, it could not treat every bacterial infection. During the 1950s, a number of other antibiotics were developed to treat additional conditions and kill other types of bacteria. In addition to curing a number of bacterial infections, antibiotics have also facilitated the growth of surgery, the use of chemotherapy, and other essential medical treatments.

While antibiotics have dramatically reduced the risk of death from common infections, they are not without their own problems. Today, antibiotic resistance is an increasing problem, as more and more bacteria have evolved to resist the actions of different types of antibiotics. Overuse of antibiotics, as well as of other antibacterial products, has significantly contributed to the problem. Research continues into new drugs to treat resistant bacteria, and new policies are in place to reduce the overuse of antibiotics.

Access to and the use of antibiotics is closely linked to the dramatic extension of human life expectancy over the course of the 20th century.

Chemotherapy

Today, chemotherapy is a first-course treatment for many types of cancer, and is also used to treat a number of other conditions, including autoimmune disorders. Drugs used for chemotherapy damage cells, typically by interfering with the cell cycle, the process of new cell creation. Cancer cells grow and reproduce more quickly than healthy cells, and so are more vulnerable to chemotherapy; however, chemotherapy also damages the healthy cells in the body, causing severe side effects.

Chemotherapy was first developed during World War II. Military scientists conducting experiments with various forms of mustard gas realized that sailors exposed to it experienced changes in their bone marrow. During the course of this research, they found that nitrogen mustard was effective against lymphoma. Not long after, a compound related to folic acid was found to treat acute leukemia. This drug was the precursor to methotrexate, a medication still in use today.

Before the invention of chemotherapy, options to treat cancer were relatively few. Small, localized cancers could be removed surgically, and later radiation could be used for some cancers. Chemotherapy provided new treatments for cancers that could not be treated in the past. Today, multiple chemotherapy drugs are typically used together, and can be used alone for some types of cancers, or as adjuvant therapy following surgery.

Research continues to improve chemotherapy options for patients. Today, more targeted drugs may offer less severe side effects, or be designed to respond to specific types of cancers. In addition, new drugs can accompany chemotherapy regimes to reduce side effects. While most types of cancer were once a death sentence, today, survival rates are quite good for many different types of cancer.

DNA and Genetics

The last of our scientific discoveries that have extended average life expectancy is not yet history; in fact, it has had only a minimal impact so far. In the future, however, it may be as important as any of the other discoveries we’ve discussed.

DNA was first isolated in 1869, but its double-helix structure was not identified until 1953, by James Watson and Francis Crick. The genetic code was finally cracked during the 1960s, and the human genome was not fully sequenced until 2003.

The impact of DNA on the human life span has not been too significant yet on a broad scale. Today, our understanding of genetics can provide additional information to allow people to take preventative measures. For instance, we know that some types of cancer are linked to particular genes, and that preventative therapeutic measures may be worthwhile for at-risk individuals. Genetic counseling and pre-implantation testing, when used in combination with fertility treatments, can allow parents who carry genetic disorders to have healthy babies.

In the long term, increased understanding of genetics may lead to individualized drugs for a variety of conditions. For instance, cancer treatments could be engineered to destroy a specific tumor, or drugs could be chosen to correct health conditions from depression to high blood pressure. Greater understanding of genetics may also lead to earlier diagnosis of a variety of health conditions, with the potential for prevention and treatment, and an increased ability to prevent the development of genetic disorders.