Did People Have Allergies 100 Years Ago?
Determining the prevalence of allergies 100 years ago is challenging due to the lack of comprehensive healthcare data and standardized diagnostic criteria. However, general observations suggest that allergies were not as prevalent as they are today. The evidence that does exist consists of anecdotal reports of rare cases of allergy, limited records of conditions such as hay fever, and indirect indicators of allergic disease.

Overview of Allergy History
The history of allergies stretches back centuries, with early descriptions appearing in ancient Greek and Chinese literature. However, it was not until the 19th and early 20th centuries that the medical community began to study allergic reactions scientifically.
In 1873, Charles Blackley, a British physician, conducted the first recorded experiments demonstrating the existence of pollen allergy. He applied various plant pollens to his own skin and nasal membranes and recorded the reactions they provoked. In 1906, the term “allergy” was coined by the Viennese pediatrician Clemens von Pirquet to describe the hypersensitivity reactions he observed in some of his patients.
During the early 1900s, several key figures made significant contributions to the understanding of allergies. In 1902, Charles Richet and Paul Portier described anaphylaxis, the severe and sometimes fatal form of hypersensitivity, work for which Richet later received a Nobel Prize. In 1911, Leonard Noon and John Freeman in London reported the first use of immunotherapy to treat hay fever, and in 1921 Carl Prausnitz and Heinz Küstner demonstrated that serum from an allergic individual could transfer hypersensitivity to a non-allergic person, an early clue that antibodies were involved in allergic reactions.
In the 1920s and 1930s, doctors and researchers continued to expand their knowledge of allergic diseases, identifying a growing list of allergens, from pollens to food proteins, and exploring the immune response to these sensitizing agents.
Observations from this period also foreshadowed the hygiene hypothesis: the idea that improved sanitation and decreased exposure to microbes in early infancy leave children more susceptible to allergies later in life. The hypothesis was not formally proposed until 1989, by the epidemiologist David Strachan, and it gained wider recognition through the 1990s as researchers such as Thomas Platts-Mills assembled more concrete evidence.
Prevalence of Allergies 100 Years Ago
Allergies have been recognized for centuries, but the medical community only began to study allergic reactions scientifically in the 19th and early 20th centuries. As noted above, measuring the prevalence of allergies 100 years ago is difficult because of the lack of comprehensive healthcare data and standardized diagnostic criteria. The available evidence from the early 1900s, anecdotal reports of rare cases, limited records of conditions such as hay fever, and indirect indicators of allergic disease, suggests that allergies were not as prevalent as they are today.
Origin and Development of Allergies
Allergies have been recognized for centuries, yet it was not until the 19th century that the medical community began to study allergic reactions scientifically. In 1906, pediatrician Clemens von Pirquet coined the term “allergy” to describe a hypersensitivity reaction in which a person’s immune system responds to a usually harmless substance as if it were harmful. This paved the way for extensive research into the causes and development of allergies.
The understanding of allergies has also evolved as new allergens have been identified in foods, medications, and the environment. Peanut allergy, for example, was rare a century ago but has risen sharply in recent decades. Allergies to other common food allergens, such as eggs and cows’ milk, have likewise become increasingly prevalent.
While genetics play a role in the development of allergies, research has also considered the influence of environmental factors such as pollution, chemical exposure, and changes in lifestyle and dietary habits. Studies suggest that early life experiences and exposures may influence immune tolerance and potentially increase the risk of allergic sensitization. As previously noted, the hygiene hypothesis proposes that exposure to dirt and bacteria during childhood may be protective against allergies and that a lack of such exposure may lead to an increased risk of developing allergies.
Measuring Allergy Prevalence in the Past
Measuring the prevalence of allergies in the past is challenging because awareness and understanding of allergic disease, and of food allergy in particular, were limited. For much of the 20th century the medical community was skeptical of the diagnosis and was not actively researching the field. As a result, the perceived prevalence of food allergy during this period may not accurately reflect the actual rate of allergic disease.
Accurately measuring the prevalence of allergies in the past is important because it helps us understand how allergic disease has changed over time. Such research can provide insight into the environmental and lifestyle factors that contribute to the development of allergies, and it can also lead to a better understanding of the role of genetics.
General Observations on Allergy Prevalence in the Early 1900s
In the early 1900s, the prevalence of allergic diseases was not widely recognized, and the medical community had little understanding of their causes and treatments. Some of the earliest observations on allergy prevalence concerned hay fever, also known as allergic rhinitis. Early physicians noticed that hay fever was more common in urban areas than in rural ones, which prompted research into the role of environmental exposures, such as pollen, in causing allergic reactions. Hay fever was prevalent in many areas, particularly in regions with high levels of pollen production, such as the Midwest and East Coast of the United States.
The association between pollen and hay fever was first defined in 1870, and by 1900 the disease was common among the “leisured class,” according to Dr. Thomas A.E. Platts-Mills in “The Allergy Epidemics: 1870-2010,” published in the Journal of Allergy and Clinical Immunology. Platts-Mills further notes that by 1940 hay fever had become epidemic.
Causes of Allergy 100 Years Ago
During the early 1900s, the causes of allergies were not fully understood. However, as the 20th century progressed, various factors began to emerge as possible causes of allergic reactions. These factors included changes in diet, the hygiene hypothesis, and the use of vaccines.
Dietary Changes in the 20th Century
Throughout the 20th century, there were significant changes in the way people ate, which affected overall health and is thought to have contributed to the rising prevalence of allergies. Industrialization, globalization, and mass production techniques made it easier to produce food cheaply, leading to increased consumption of processed food. The growing number of people eating outside the home and the rise of fast food chains also made it harder to determine the source and composition of the food being eaten.
One of the main drivers of dietary change was the implicit assumption that modern food technology and manufacturing processes were better than traditional methods. With the growing use of additives and preservatives, it was thought that food could be produced more easily and safely. However, additive and preservative use has since been linked with the increased occurrence of allergies in the Western world, and chemical flavorings were thought to entice people to keep consuming the same foods, increasing reliance on a limited diet.
Another significant change in the Western diet was the shift away from fibrous fruits and vegetables toward pasture-raised and grain-fed meat. Meat consumption increased across the Western world, leading to the mass production of beef and pork on a scale never before seen in human history. This high rate of meat consumption has been linked to various health concerns, including an increase in allergies.
These changes highlight the contrast between traditional diets and the modern diet of processed food, refined cereals, and fast food, with its higher intake of fat and sugar. The shift to the modern diet is believed to have contributed to the worldwide rise in allergies.
Hygiene Hypothesis and Its Impact on Allergies
The hygiene hypothesis postulates that reduced exposure to bacteria and other microorganisms in the environment leaves the immune system poorly trained and increases the risk of developing allergies. In other words, being too clean is not always good for our health; some level of exposure to microbes appears to be needed to educate the immune system.
Over the last century, various changes in lifestyle and sanitation practices have contributed to the rise in allergy prevalence. Modern antibiotics have reduced the incidence of infectious diseases, but they have also reduced exposure to beneficial bacteria, and modern sanitation has limited childhood contact with dirt and the microorganisms it carries. The result is a change in the types of microorganisms people encounter, which may leave the immune system more prone to overreacting to otherwise harmless allergens.
Studies have supported the hygiene hypothesis, suggesting that certain microbial exposures in early childhood can reduce the risk of developing allergies. For instance, children growing up in households with pets or on farms have lower rates of allergies than children in pet-free households, and having older siblings has also been associated with a reduced risk of developing allergies.
Vaccines and Their Influence on Prevalence of Allergies
In the past century, the widespread use of vaccines has had a significant impact on public health, with knock-on effects for the prevalence of allergies. Vaccines work by introducing a small, harmless part of a disease-causing microbe, such as a virus or bacterium, into the body, which triggers an immune response. This response builds immunity so that future exposure to the microbe does not cause the actual disease.
The development of vaccines has contributed to a decrease in many infectious diseases that once caused severe and sometimes life-threatening illness. For example, before the measles vaccine became widely available, many children, particularly those with a primary immunodeficiency disorder, a genetic condition that affects the immune system, died from severe complications of the infection. With the introduction of vaccines, this risk has been greatly reduced.
Immunization, the act of receiving a vaccine to build immunity to a particular disease, has further influenced the picture. By preventing diseases whose course can include severe reactions such as anaphylaxis, immunization may have indirectly contributed to a decrease in the overall incidence of serious allergic reactions.
Vaccines induce immunity through the production of immunoglobulins, also known as antibodies. These are proteins that the immune system produces in response to a particular antigen; they bind to the antigen and help clear it from the body. Immunoglobulins also serve as a form of immune memory, allowing the body to mount a quicker and more effective response if it encounters the same antigen again.
Various types of vaccines and vaccine-like treatments exist, including some that target allergens themselves. Allergen-specific immunotherapy, for example, is used for individuals with allergies to specific substances such as pollen or dust; it works by gradually increasing exposure to the allergen, allowing the immune system to build tolerance over time. Vaccines have also been developed for infectious diseases that can cause severe illness, including the pneumococcal vaccine and the flu vaccine, and these significantly reduce the risk of serious complications associated with these infections.
Types of Allergies 100 Years Ago
While not much was known about allergies or their causes a hundred years ago, historical evidence suggests that people did suffer from a variety of allergic reactions during that time, including hay fever, food allergies, and insect sting reactions.
Hay fever or seasonal allergic rhinitis became increasingly prevalent during the second half of the 19th century, and our understanding of these conditions has evolved since then. Some researchers suggest that changes in agriculture and public hygiene practices contributed significantly to the rise of hay fever and other respiratory symptoms during this time.
Starting in the 19th century, major changes in public hygiene practices began to take place. The discovery of the link between sewage and enteric disease led to significant sanitation initiatives, including public sanitation systems and modern plumbing, while health education campaigns raised awareness of basic hygiene practices such as handwashing. These measures took time to be widely adopted but eventually became standard practice. They almost certainly changed the types of microorganisms people were exposed to and, consistent with the hygiene hypothesis, may have left immune systems more prone to reacting against otherwise harmless allergens.