CHAPTER NINE: What Would Have Happened to Polio if We Hadn’t Intervened?


The Rise & Demise of Polio the Great Crippler

Polio is perhaps better known for its ability to cripple than its ability to kill. It must have been horrifying to see otherwise healthy children, older children for the most part, crippled, sometimes for life, and often unable to breathe for themselves, although many did eventually recover either partially or fully.

To put this disease into perspective alongside the greater destroyers of the past: even at its peak, infection with Polio did not spell death or disability for the vast majority of the population, as one might imagine. Indeed, most people would not have known that they had a Polio infection at all, as the statistical data below strongly suggest.

POLIO

Permanent paralysis, fortunately, occurs in only 0.5% of infections. The majority of infections (72%) do not lead to any symptoms. About a quarter of cases (24%) result in “abortive” poliomyelitis which leads to nonspecific symptoms for a few days, such as a fever or a cold, and 1-5% of cases lead to “non-paralytic aseptic meningitis”, in which the patient suffers from stiff limbs for up to 10 days.

Sophie Ochmann and Max Roser (2018)

[78]

https://ourworldindata.org/polio
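To make these proportions concrete, here is a minimal sketch in Python that simply restates the rounded percentages quoted above for a notional 100,000 infections. The figures are illustrative only and come directly from the Ochmann and Roser excerpt; the categories are rounded, so they need not sum exactly to 100%.

```python
# Illustrative only: the rounded outcome shares quoted above (Ochmann & Roser 2018)
# applied to a notional 100,000 poliovirus infections.
infections = 100_000

outcome_shares = {
    "no symptoms at all": 0.72,                      # ~72% of infections
    "abortive poliomyelitis (minor illness)": 0.24,  # ~24%
    "non-paralytic aseptic meningitis": 0.01,        # lower bound of the quoted 1-5%
    "permanent paralysis": 0.005,                    # ~0.5%
}

for outcome, share in outcome_shares.items():
    print(f"{outcome}: roughly {int(infections * share):,} per {infections:,} infections")
```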

A clue to how Polio, a previously benign infection commonly transmitted without major complications, became a crippler for a small minority of the population dovetails with the review of infectious diseases discussed throughout this study: natural exposure leading to long-term community resistance.

For instance, the article excerpted below highlights that many plagues of old were triggered by major social and economic shifts, and points out that epidemic Polio (Poliomyelitis) was common to many industrialised nations precisely because of modernity: one of the great shifts of our modern era was that we became a good deal more sanitised.

The origin of plagues: old and new science

There are numerous examples of old viruses that have caused new epidemics as a consequence of changes in human practices and social behavior.

Epidemic poliomyelitis emerged in the first half of this century when modern sanitation delayed exposure of the virus …

Krause R.M., (1992)

http://www.sciencemag.org/site/feature/data/diseases/PDFs/257-5073-1073.pdf

[79]

This observation runs counter, of course, to the widely held belief that it was a lack of hygiene and sanitation that led to the eruption of many contagions of the past, as addressed previously in this series; the Polio epidemics, by contrast, may have been stimulated by a lack of exposure to naturally occurring background pathogens, precisely because of improved sanitation.

Again, we have supporting evidence to suggest that it may not be the pathogens themselves that are the culprits entirely, but our level of previous exposure and familiarity with them – a theme that runs throughout this study.

For instance, the idea that a tipping of the balance in our social and economic practices was a major trigger for Polio becoming more virulent is supported by a growing body of studies. These strongly suggest that Polio paralysis, in particular, was a problem for older children who tended to be better off, so much so that Polio became known as a middle-class plague, as the title of the following excerpt indicates. This, of course, was the very social class that was becoming increasingly sanitised; with indoor running water, plumbing and toilets, they were unwittingly becoming the most susceptible.

The Middle-class Plague: Epidemic Polio and the Canadian State, 1936-37

Thus, despite medical science-and, ironically, because of improving public health and personal hygiene standards that delayed what had earlier been an endemic, invisible, harmless and almost universal gastrointestinal infection-during the first half of this century epidemics of paralytic polio escalated throughout the industrialized world…

Rutty, C. J. (2006, 278)

http://www.healthheritageresearch.com/cbmhbchm_v13n2rutty.pdf

[80]

This is a common theme throughout our near-fully modernised nations, and Ireland is no exception, as the following recollections of the Polio era reveal.

Polio: The Deadly Summer of 1956

Polio, once the most feared of diseases, is about to be eradicated

Dr Gerard McCarthy, the medical officer for Cork county, pointed this out, saying:

“The higher the standard of living, the greater the tendency towards the disease. Generally the well-washed and well-laundered children are the more susceptible.”

Maureen O’Sullivan, a Red Cross nurse, noticed that

“80 per cent of the victims came from affluent or semi-affluent families.”

Cockburn, P., (2010, Oct. 27th The Independent)

https://www.independent.co.uk/life-style/health-and-families/features/polio-the-deadly-summer-of-1956-2117253.html

[81]

Those who had always been the least vulnerable were the vast majority of people who had grown up exposed naturally to the Polioviruses (there are three main strains). More often than not, they had Polio, passed it on silently to others without complications or even obvious symptoms, and gained life-long immunity unbeknownst to themselves.

This built up a shielding effect, and there would have been strong, robust community immunity. However, once small infants and young children were no longer playing in the dirt and living in close proximity to these pathogens, problems began to ensue later in childhood: as outlined in the following, infection in infancy was far less problematic than facing a first infection as an older child.

The origin of plagues: old and new science

Before the introduction of modern sanitation, polio infection was acquired during infancy, at which time it seldom caused
paralysis but provided lifelong immunity against subsequent polio infection and paralysis in later life.

Krause R.M., (1992)

http://www.sciencemag.org/site/feature/data/diseases/PDFs/257-5073-1073.pdf

[82]

Embedded in this scenario is the very reason why we seem to have become generationally immune to a whole range of much deadlier pathogens over the course of generations. Nature seems to provide protection when it is most needed. If we miss that opportunity, whether as an individual growing up or as a nation developing towards modernity, we shift the delicate balance between pathogens and us as their natural hosts: improved sanitation, booming populations in more urbanised settings, and greater world interconnectedness that gives more virulent infections more opportunity to spread all cut off essential exposures at the right time and to the correct degree. History shows the disastrous consequences of upsetting this balance, and Polio is a case in point, albeit on a much lesser scale.

This brings us to another, interrelated factor that may explain why younger children tended to be less susceptible to paralysis when infected with Polio, and consequently why older children, facing the infection for the first time, tended to suffer its worst impact.

How Mothers Pass on Protection Against Disease to Their Offspring

PROTECTION OF INFANTS BY MATERNAL ANTIBODIES

As early as the 1930s, it was observed that mothers transfer antibodies to their infants, thus providing infants with some degree of protection against diseases such as measles, diphtheria and poliomyelitis…

Maternal antibodies can protect infants from infections and modify the severity of infectious diseases in infants for a varying period of time, depending on the level of placental transmission and the rate of decay of passively acquired antibodies.

Rie, A.V., Wendelboe, A.M. and Englund, J.A. (2005)

https://www.hpsc.ie/a-z/vaccinepreventable/pertussiswhoopingcough/niac/File,13702,en.pdf

[83]

The generational protection afforded to infants and young children on first exposure to natural environmental pathogens essentially comes from their mothers, as described above. If mothers themselves were less exposed to background polioviruses while growing up, and so had not accumulated regular contact with and increasing resilience to these pathogens, then their offspring could not benefit from maternal protective antibodies against the Polioviruses.

Therefore, the most vulnerable mothers, and subsequently the most vulnerable children, were typically those least exposed to such pathogens at an early age. Moreover, during the Polio epidemic era in many industrialised nations, breastfeeding, a well-recognised source of antibody protection against the environmental pathogens in circulation and another mode of protection for infants, was rapidly declining, or at least becoming increasingly shorter in duration.

In combination, these factors would have contributed significantly to the vulnerability of older children as they grew up and encountered the Polioviruses, potentially all three strains, for the first time; they were neither primed to deal with them nor cross-protected by previous exposure to different strains. Some of the factors that conspired to fuel the flames of epidemic Polio are outlined in the article below:

The Spatial Dynamics of Poliomyelitis in the United States: From Epidemic Emergence to Vaccine-Induced Retreat, 1910–1971

According to the hygiene model, successive improvements in levels of sanitation during the late nineteenth and early twentieth centuries would account for a reduced level of faecal exposure to poliovirus in early infancy, thereby reducing the level of latent immunization in the population.

Contingent on these developments, Nathanson and Martin… postulate that the appearance of epidemic poliomyelitis resulted from several concomitant changes:

1) a reduction in levels of maternal antibody as booster infections became less common; and

2) a reduction in the frequency of antibody levels sufficient to produce cross-protection between virus types. . .

Both of these developments would reduce the average duration of passive protection of infants.

3) An increase in the average age of primary infections.

Together, these changes would drastically increase the proportion of children at risk of a paralytic infection.

Trevelyan, B., Smallman-Raynor, M., & Cliff, A. D. (2005)

[84]

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1473032

So you can imagine what our health professionals made of it all. They were indeed rather perplexed. The world was living with renewed optimism, having just come out of the Second World War, with all its modern living, and people were finally beginning to see light at the end of the contagion tunnel. Yet more and more older children and younger adults began showing fairly extreme symptoms such as paralysis, and these same individuals were for the most part, as suggested above, from some of the best-laundered and well-scrubbed homes.

However, it gets worse. Little did our health professionals think for one moment that the modern medical marvels that, in some minds, had saved us from the plagues of the bad old days could be implicated in adding fuel to the already smouldering fire. As indicated by the chronology of when and to what degree vaccines were introduced, epidemic Polio came on the heels of massive vaccination campaigns, against Diphtheria in particular, with Tetanus and Pertussis (Whooping Cough) vaccinations combined a little later in the form of the DTP. This is discussed as a contributing factor in the excerpt below.

Polio provocation: solving a mystery with the help of history

During the summer of 1951, a medical mystery in the USA erupted into a crisis, stimulating professional debate and public anxiety. The issue was polio provocation, a health risk facing unvaccinated children in polio endemic regions. Leading specialists were at a loss to explain the condition.

As the poliovirus was widespread before the discovery of an effective vaccine in 1955, evidence that some paediatric injections could incite polio infection and paralysis led to extraordinary shifts in health policy and calculated efforts to mitigate the risk. At the core of this discussion were physicians and public health researchers, whose efforts to formulate a clinical theory drove both policy and the impetus for scientists to unravel the underlying mechanism.

During the 1940s and 1950s, physicians and public health researchers in the USA, UK, Canada, and Australia sought to understand the nature of polio and how it attacked the grey matter of the spinal cord… some physicians noticed a correlation between certain medical interventions and polio paralysis… it was not until the end of World War II that injection-induced polio emerged as a public health concern. The application of epidemiological surveillance and statistical methods enabled researchers to trace the steady rise in polio incidence along with the expansion of immunisation programmes for diphtheria, pertussis, and tetanus.

…Concern about polio provocation lay dormant for decades, but resurfaced in the 1980s when large international aid agencies, such as Rotary International and WHO, expanded their immunisation programmes in low-income nations. In some areas of Africa, where polio was endemic, public health workers began to report cases of paralysis after immunisations against common childhood diseases. Since these observations were decades removed from earlier published findings, many health professionals supposed they were witnessing a new phenomenon… Published historical accounts drew some attention to the matter, but it was not until severe epidemics erupted in India during the 1990s that fresh clinical evidence became available. …For the first time, health professionals working in polio endemic regions had scientific evidence that paediatric injections could incite paralysis.

Mawdsley, S.E., (2014)

https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(14)61251-4/fulltext

[85]

The mechanism for this is documented in the following excerpt from one of these very important studies, dating to the late nineteen-nineties.

Mechanism of injury-provoked poliomyelitis

Provocation poliomyelitis: In persons incubating wild poliovirus infection, intramuscular injections (e.g. DTP) may provoke paralysis in the injected limb (…).

ABSTRACT

Skeletal muscle injury is known to predispose its sufferers to neurological complications of concurrent poliovirus infections. This phenomenon, labeled “provocation poliomyelitis,” continues to cause numerous cases of childhood paralysis due to the administration of unnecessary injections to children in areas where poliovirus is endemic.

Recently, it has been reported that intramuscular injections may also increase the likelihood of vaccine-associated paralytic poliomyelitis in recipients of live attenuated poliovirus vaccines. We have studied this important risk factor for paralytic polio in an animal system for poliomyelitis and have determined the pathogenic mechanism linking intramuscular injections and provocation poliomyelitis.

Skeletal muscle injury induces retrograde axonal transport of poliovirus and thereby facilitates viral invasion of the central nervous system and the progression of spinal cord damage. The pathogenic mechanism of provocation poliomyelitis may differ from that of polio acquired in the absence of predisposing factors.

Gromeier, M., & Wimmer, E. (1998)

[86]

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC110068/

And to make matters even worse, there was a growing realisation that some of our other newer medical interventions to tackle the bugs may also have been fanning the flames of the Polio epidemics; chief among them was tonsillectomy (removal of the tonsils, usually in older children).

Polio provocation: solving a mystery with the help of history

One of the first medical procedures implicated in the causation of polio was tonsil surgery. A study of more than 2000 case histories in the 1940s by the Harvard Infantile Paralysis Commission concluded that tonsillectomies led to a significant risk of respiratory paralysis due to bulbar polio.

Although proponents of the theory did not entirely oppose tonsillectomies, they cautioned that such interventions should be avoided during epidemics. Reflecting the growing body of evidence that tonsillectomies could provoke polio, many doctors in the USA adjusted their surgical procedures to account for disease-endemic factors. “The policy of the United States Army”, Major-General E A Noyes acknowledged in 1948, “has been to stop tonsil and adenoid operations during epidemics”.

Even though laboratory technology at the time was not sufficiently advanced to unravel the mechanism, published evidence affected clinical practice.

Concerns about tonsillectomies coincided with indications that paediatric injections could also incite polio paralysis…

Mawdsley, S.E., (2014)

https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(14)61251-4/fulltext

[87]

We are only now beginning to gain an insight into some plausible biological mechanisms that may have exacerbated the epidemic rise in paralytic Polio as outlined in the following.

‘A Wicked Operation’? Tonsillectomy in Twentieth-Century Britain

Some academic surgeons also opposed tonsillectomy by building on earlier suggestions that the tonsils were protective against infectious diseases like poliomyelitis. They emphasised the immunological compromise created by ‘three open wounds’ and believed the tonsils conferred ‘immunity’ against such diseases, … as part of a circle of lymphoid tissue in the throat known as ‘Waldeyer’s ring’… Logically, it would follow that tonsillectomy could cause both a short-term risk and long-term predisposition to infectious diseases.

A retrospective study of the 1947–8 South Australian poliomyelitis epidemic vindicated these predictions. In over half of bulbar poliomyelitis cases, the individual had received tonsillectomy at least five years prior, supporting an increasingly popular theory: that susceptibility persisted after wounds healed…

Immunological and epidemiological arguments converged and provided mutual reinforcement. It was not enough to defer surgery until next year: tonsillectomy would require large-scale reduction.

Dwyer-Hemmings, L. (2018)

[88]

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5883156/

Furthermore, we are only in more recent times beginning to understand how important the tonsils are in your first line of defence against many infectious diseases, but particularly against Polio due to the way in which it enters the body naturally, via the mouth in particular.

Moreover, there is also a clue here to the age group most vulnerable to Polio (pre-puberty), as the tonsils seem to be most active at this time, as indicated in the excerpt below. Removing them may have had a fairly big impact, leaving children wide open to attacks of the worst type, because the natural pathogen filters that would otherwise have dampened down the viruses, their first line of defence, had been removed surgically.

Tonsillitis

Tonsils produce certain types of disease-fighting white blood cells. Therefore, the tonsils are believed to act as the immune system’s first line of defense against bacteria and viruses that enter your mouth.

This function may make them particularly vulnerable to infection and inflammation. The problem is more common in children because the immune system function of tonsils is most active before puberty. Also, unlike an adult’s immune system, a child’s system has had less exposure to bacteria and viruses and has yet to develop immunities to them.

Mayo Clinic (2011)

[89]

https://www.mayoclinic.org/tests-procedures/tonsillectomy/about/pac-20395141

Obviously, if these medical interventions and generational improvements in hygiene did conspire to create a perfect storm of vulnerability, particularly amongst those who could most easily afford such medical procedures and came from the most modernised homes, then these particularly nasty Polio outbreaks converging into larger epidemics were essentially more man-made than naturally occurring. If so, we should expect to see some variation between our respective nations, unlike the more closely matched pattern we have observed for the natural rise, fall, peaks and troughs of other, much deadlier contagions of the past.

And this is what we do see when we examine the mortality graphs from different regions showing the impact of Polio within our respective nations. Take, for instance, the United States, as shown in Fig. 15.

Fig. 15: The Polio vaccine first became widely available in the United States in 1955. Graph generated from statistics of annual mortality from Polio since official records began in the United States (with a small data gap for deaths). Source of raw data: Sophie Ochmann and Max Roser (2018), “Polio”, published online at OurWorldInData.org, https://ourworldindata.org/polio (file: reported-paralytic-polio-cases-and-deaths-in-the-united-states-since-1910(1).csv).
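For readers who want to reproduce a chart along the lines of Fig. 15 from the raw file named in the caption, the following is a minimal Python sketch. The column names used here (“Year”, “Cases”, “Deaths”) are assumptions made for illustration; check the actual header row of the OurWorldInData CSV and substitute accordingly.

```python
# Minimal sketch: plot reported US polio cases and deaths from the CSV named in
# the caption above. Column names "Year", "Cases" and "Deaths" are assumed.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv(
    "reported-paralytic-polio-cases-and-deaths-in-the-united-states-since-1910(1).csv"
)

fig, ax = plt.subplots(figsize=(10, 5))
ax.plot(df["Year"], df["Cases"], color="dimgrey", label="Reported cases")
ax.plot(df["Year"], df["Deaths"], color="silver", label="Deaths")
ax.axvline(1955, linestyle="--", linewidth=1, color="black",
           label="Vaccine widely available (1955)")
ax.set_xlabel("Year")
ax.set_ylabel("Annual count")
ax.set_title("Reported paralytic polio cases and deaths, United States")
ax.legend()
plt.tight_layout()
plt.show()
```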

As you can see from the above graph (Fig. 15) for the United States, which has a small data gap in the early 1940s for Polio mortality (the smaller, light grey curve) compared with Polio cases (the larger, dark grey curve), there is a very pronounced spike in 1916, shortly after official records of Polio began. This seemingly came out of nowhere, and at no time afterwards, even at its height, did Polio cause such destruction in terms of deaths (cases, by contrast, were greater in the 1950s, but mortality was significantly lower in that era).

The other aspect of this large spike that makes it unusual is its concentrated origin and confined regional impact (New York was its epicentre, and it spread only a limited range beyond). This, together with the virus’s unique signature, has led one scholar to investigate the possibility of an epidemic of unnatural origin, as suggested in the abstract of the study below.

The 1916 New York City Epidemic of Poliomyelitis: Where did the Virus Come From?

Previous accounts of the 1916 devastating epidemic have been faulty. The unique features of the epidemic and its sudden appearance have never been explained. A New York laboratory was passaging poliovirus in primate brains, a technique which increased pathogenicity. I propose that highly virulent virus escaped and caused the epidemic. Scientists, technical and animal house staff were unaware that they could be infected by poliovirus which could then infect others. All laboratory workers must be constantly reminded of the dangers which can arise from the escape of pathogens from their work.

…An Extraordinary Epidemic
The definitive account of the epidemic from the US Public Health Service […] concluded that the outbreak had remarkable features: the extent and intensity was beyond all previous experience; the origin was remarkably definite in time and place; there was a strikingly uniform radial spread from this focus (…) with intensity progressively decreasing in proportion to the distance from the original focus; and a demonstrable mathematical regularity in its whole evolution.

These features were never experienced again. Three other aspects were not noted at the time: the number of children age 2 yr affected was the highest ever recorded […]; the case fatality rate of 25 % was the highest ever recorded… the epidemic started in early May, well before the normal summer polio season. p. 13

…Polio has a particularly hidden risk as it was not realised that adults with immunity could be infected and shed virus, infecting other adults as well as children. Most infected children would also shed virus, but without any symptoms. Some years ago I visited an Indian laboratory which monitored stools sent from paralysed children. The scientist in charge was unaware that staff who handled the stools could be infected and shed virus in the community even though they were themselves immune…

It is not possible to prove that the 1916 epidemic was caused by the escape of MV from the Rockefeller Institute… However it is a remarkable coincidence that a unique neurotropic strain of poliovirus was developed a few miles from an epidemic caused by a uniquely pathogenic strain of the virus.

Wyatt, H. V., (2011, 13-14)

https://benthamopen.com/contents/pdf/TOVACJ/TOVACJ-4-13.pdf

[90]

In the absence of any other convincing cause for this Polio epidemic, and given the evidence presented in the rest of the study excerpted above, the United States chart of Polio cases and deaths should perhaps be read in the light of this potential anomaly (as you will see when we examine the charts from other regions, this large spike for the same general period is absent there).

Now, setting this anomaly aside, when we review the more general era of Polio’s eruption and its impact in the United States, we can clearly see that some of its worst epidemics (after 1916) and greatest numbers of deaths occur from the late 1940s, with some decline thereafter; the last peak with a significant number of deaths occurs around 1952 (Fig. 15). Reviewing Figure 15 closely, we can see that the number of cases also peaks at this point (1952), followed by a steep decline; cases then rise again before dropping sharply, yet this later rise is not reflected in the number of deaths, which continues to decline significantly over the corresponding period of relatively high case incidence.

Note that for Polio in particular, as we have previously seen for many other, once far deadlier contagions, an increase in the number of obvious cases or infections does not equal an increase in the number of deaths and disabilities. This is clear for the United States (Fig. 15), where the overall number of deaths drops dramatically after 1952 even as cases rise sharply again soon afterwards, before themselves dropping dramatically just before the vaccine was made widely available in 1955.

Now consider the Canadian mortality rates for the same broad period (Fig. 16). Canada also introduced a vaccine in 1955, shortly after the United States, and here we find an even more dramatic decline in deaths from Polio just prior to our efforts to eliminate the disease. There is a small peak after the vaccine was introduced in 1955, after which deaths from Polio return to the very low levels seen just before the vaccine became widely available, and they continue to decline thereafter. (Note that for Canada we only have data for mortality, not paralysis or cases, but we can assume that the number of cases is not simply proportional to the number of deaths.)


Fig. 16: The Polio vaccine became widely available in Canada in 1955, shortly after its introduction in the United States. Statistics derived from the table in Figure 1 showing the annual mortality rate from Polio per 100,000 of population since official records began in Canada (http://www.healthheritageresearch.com/Polio-Biologicals34-2-June2006sdarticle.pdf), and from “The Middle-class Plague: Epidemic Polio and the Canadian State, 1936-37”, Christopher J. Rutty (http://www.healthheritageresearch.com/cbmhbchm_v13n2rutty.pdf). The graph covers Polio deaths across Canada.

It looks as though Canada had seen the last of Polio’s major impact in terms of deaths on the eve of the introduction of the vaccine in 1955; the span of relatively consistent, deadly Polio in Canada really ran from the late 1920s through part of the 1930s. Mortality fluctuates quite considerably thereafter, until a final spike over a few short years from the early 1950s, which peaks in 1953 and plummets rapidly just before the vaccine was introduced.

The Middle-class Plague: Epidemic Polio and the Canadian State, 1936-37

The many serious polio epidemics that occurred between the late 1920s and early 1950s have received minimal historical attention… During the 1950 – 55 period there were wide natural variations in incidence of paralytic polio in Canada that peaked in 1953 and then sharply declined in 1954 and 1955. National incidence remained low during 1956 and 1957, when it dropped to an attack rate level not seen since 1926.

Rutty, C. J. (2006, 279)

http://www.healthheritageresearch.com/cbmhbchm_v13n2rutty.pdf

[91]


Fig. 17: The vaccine became widely available in the UK from about 1959 onwards. Graph produced using raw data on rates of paralysis and mortality from Polio in the UK since official records began (Acute Poliomyelitis became notifiable in 1912; deaths include late effects). Source: Incidence Rates of Poliomyelitis in England, Annual Reports of the Registrar General, Communicable Disease Surveillance Centre, London; records obtained by The British Polio Fellowship for PHI. http://www.post-polio.org/ir-eng.html

Now compare the later introduction of the new vaccine with the Polio paralysis cases and individual deaths in the United Kingdom. When we examine the mortality and paralysis statistics (Fig. 17), we find that these had already declined significantly after the terrible Polio outbreaks of the later 1940s, peaking and then falling dramatically after 1951. Polio paralysis spiked somewhat after deaths had declined to ever smaller numbers, but the worst was truly over by the time the vaccine eventually made its way out to the wider population, starting on the eve of the 1960s. This rather convoluted saga is summarised, as far as possible, in the following excerpt.

Vaccine innovation and adoption: polio vaccines in the UK, the Netherlands and West Germany, 1955-1965

Medical history

IPV in the UK

Eventually, in 1956, a new vaccination programme was planned with British IPV, … However, only small amounts of vaccine were available from the British manufacturer. In May and June 1956 the vaccine was administered to children between two and nine years of age, whose parents had to register them beforehand… On the whole, 29 per cent of all British children in that age group had been registered for vaccination, but only 10 per cent of those registered, or 3 per cent of all children born between 1947 and 1956, could be inoculated in 1956 with the specified two injections. The limited supplies available extended no further than this.

Why did the British attitude, which had been so much in favour of the Salk vaccine, change so quickly in 1955 and why were the risks so radically reassessed that for two years imports of US vaccine were impossible? …An additional source of delay had to do with the supply of vaccine. For spring 1957, Glaxo had promised to put a certain amount of vaccine at the disposal of the health ministry, but there were again delays in production…. Growing public pressure for vaccination was a main factor leading to a change of views in the British health administration…

In the Ministry of Health the question “If British supplies are adequate, should we buy only British?” was discussed repeatedly in November and December 1958. Although Canadian and American products were much cheaper … and should therefore have been attractive to the NHS with its chronic shortage of funds, other criteria could clearly override cost considerations. These were partly to do with safety…

For 1958 the Ministry of Health agreed to administer vaccine to all children under fifteen. In order to meet this goal the imported vaccine would be used, provided it had undergone an additional British test. Furthermore, parents should be able to choose, and would have the right to refuse to allow their children to be inoculated with imported vaccine. This complicated scheme added to the relatively slow progress of vaccination in the UK…

Lindner, U., & Blume, S. S. (2006)

[92]

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1592614/

In Ireland, it should be noted that although every death was tragic and significant to those affected, the numbers are really very small in the scheme of things, with the death toll in the worst years peaking at between 50 and 60 individuals (Fig. 18).

Obviously, paralysis cases would have been significantly higher, judging by statistics from regions such as the UK. Ireland, however, sees an earlier intensification of Polio (as reflected in the relative death rates) than the UK: Polio’s impact rises as we approach the late 1930s and peaks rather aggressively during the earlier 1940s. Certainly, in this era, it must have been terrifying indeed. But Polio’s impact lessened soon after, as seen in the ever-decreasing peaks of deaths from that time on, with the final peak following soon after the vaccine trials began in Ireland in 1957 (see excerpt below).


Fig. 18: The vaccine started to be used on a trial basis in 1957. Chart of the annual number of individual deaths in Ireland from Polio since deaths from the disease were first recorded (1922). Source: chart generated from the annual statistical reports published since records began, “Annual Reports on Marriages, Births and Deaths in Ireland, from 1864 to 2000”, courtesy of An Phríomh-Oifig Staidrimh, Central Statistics Office (CSO). © Copyright dig-press.com.

Polio, once the most feared of diseases, is about to be eradicated

Official statistics show that overall 499 people got the disease in Ireland in 1956, of whom 220 were from the Cork area. The number admitted to hospital suspected of having polio was slightly larger. The figures seem low, but they conceal the fact that by the end of the year most people in Cork had got the disease, although very often without realising it. By 1957 the first trials of the Salk vaccine were taking place in Ireland …

Cockburn, P., (2010, Oct. 27th The Independent)

https://www.independent.co.uk/life-style/health-and-families/features/polio-the-deadly-summer-of-1956-2117253.html

[93]

It would appear that much of the population was already fairly well immunised by natural means, at least in Cork, as indicated in the excerpt above, and presumably the rest of Ireland was well on its way to becoming more resilient to Polio’s worst effects, as the ever-declining peaks in deaths seen in Figure 18 appear to indicate.

Bearing in mind that Polio had its greater negative impact in Ireland towards the latter half of the 1930s (deaths only began to be recorded from 1922 onwards; see Fig. 18), this ties in with observations on the ground at the time, which also give us an insight into the rate of Polio paralysis (from which many would have recovered) across Ireland as a whole, and into the specific impact of Polio during the last recorded major eruption of 1956 (before the vaccine trials), as recorded by one medical officer.

The 1956 Polio Epidemic in Cork

In 1956, 499 cases of paralytic poliomyelitis, resulting in twenty deaths, were notified to the Department of Health for the entire country. …There were 220 cases and five deaths in Cork city and county, a comparatively low fatality rate of 2.3%, particularly if the 1942–3 epidemic is taken as the benchmark.

Dr Gerald P. McCarthy, MOH for County Cork, concluded, with some justification in the light of these figures, that the poliomyelitis outbreak in the city and county was no more than ‘a trifling epidemic’.

History Ireland (2006, May/June issue)

[94]

http://www.historyireland.com/20th-century-contemporary-history/the-1956-polio-epidemic-in-cork/
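As a quick arithmetic check on the fatality rates quoted above, using only the figures given in the excerpt (499 cases and 20 deaths nationally; 220 cases and 5 deaths in Cork city and county):

```python
# Case fatality rates implied by the 1956 notification figures quoted above.
national_cases, national_deaths = 499, 20
cork_cases, cork_deaths = 220, 5

print(f"National case fatality rate: {national_deaths / national_cases:.1%}")  # ~4.0%
print(f"Cork case fatality rate:     {cork_deaths / cork_cases:.1%}")          # ~2.3%, as quoted
```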

Similarly, we can see from Canada’s short history of previous Polio outbreaks that the disease overall had a greater and more sustained impact in the earlier era, and that the worst was seemingly over by the time the vaccine was introduced (Fig. 16).

The United States shows a comparatively later rise in fatalities (leaving aside the 1916 anomaly) and a correspondingly later decline in deaths compared to these other regions, yet even with the relatively early introduction of a vaccine, deaths had still declined significantly well before this intervention (Fig. 15). And certainly in the UK (Fig. 17), the fall in deaths from Polio, followed closely by a dramatic drop in paralysis, essentially indicated an overall resolution prior to the relatively delayed introduction of the vaccine to combat the disease.

However, our medical history books tend to focus only upon Polio deaths and paralysis at their worst, as if the contagion would have continued to be a plague of our modern age had we not intervened; understandably, therefore, everyone naturally associated this intervention with the remarkably positive turn of events (the rather dramatic declines in paralysis and deaths) that followed soon after.

Although the first trials certainly looked promising, and I am sure that lives were saved and disabilities avoided initially, it seems that the honeymoon period was almost over before it ever started, as highlighted in the following excerpt.

History of polio vaccination

In 1954, the inactivated vaccine was tested in a placebo-controlled trial, which enrolled 1.6 million children in Canada, Finland and the United States […]. In April 1955, Salk’s vaccine was adopted throughout the United States. The incidence of paralytic poliomyelitis in the United States decreased from 13.9 cases per 100 000 in 1954 to 0.8 cases per 100 000 in 1961[…].

Some disadvantages of the Salk vaccine in that time were the decrease of the titres of the circulating antibody within a few years of vaccination, the further circulation of wild PV and its implications in outbreaks, and the large number of monkeys (about 1500) needed to be sacrificed to produce every 1 million inactivated doses. .. Shortly after the licensing of Salk vaccine, the failure of inactivation of vaccine virus at Cutter Laboratories, Berkeley, was followed by 260 cases of poliomyelitis with PV type 1 and 10 deaths.

… The use of the highly virulent Mahoney strain in vaccine production has been controversial and after the Cutter incident, even more so. In Sweden, the Brunenders strain for type 1 was preferred. In 1980, concentration and purification of polio antigens were introduced into the manufacture of IPV and the immunogenicity of the vaccine was increased.

Baicus A. (2012)

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3782271/

[95]

As noted above, the rate of Polio paralysis dropped dramatically after this major trial of the injectable, killed vaccine (the Inactivated Polio Vaccine, IPV) given to over one and a half million schoolchildren. Without going into the ins and outs of everything that unfolded in the early stages of the rollout of the new Polio vaccine (IPV), things like the Cutter incident, and without mentioning the monkeys, we will fast-forward a few years and review why we started using the oral form of the competing vaccine, the OPV (Sabin’s attenuated, weakened-virus oral type), instead of the IPV (Salk’s inactivated, killed-virus injectable type). One of the reasons relates to what was alluded to above: the seemingly rather short-lived protection that Salk’s new vaccine against Polio provided, among a few other things.

A Brief History of Polio Vaccines

Many virologists were of the opinion that Salk’s vaccine could not provide long-lasting protection and that this could only be achieved with Sabin’s live-attenuated version. Only a live vaccine, it was argued, had sufficient immunogenicity to provide protection. In contrast, an inactivated vaccine would have to be re-administered regularly…

On the basis of these trials, Sabin’s vaccine was deemed the better of the two. It was found to confer longer-lasting immunity, so that repeated boosters were not necessary, and acted quickly, immunity being achieved in a matter of days. Taken orally (on a sugar cube or in a drink), the vaccine could be administered more readily than the Salk vaccine, which had to be injected.

Most importantly, the Sabin vaccine offered the prospect of passive vaccination because it caused an active infection of the bowel that resulted in the excretion of live-attenuated virus. Thus, through fecal matter and sewage the Sabin vaccine could help to protect those who had not been vaccinated. In August 1960, the U.S. Surgeon General recommended licensing of the Sabin vaccine. The oral vaccine gradually supplanted its rival and by 1968, Salk’s vaccine was no longer being administered in the United States, and U.S. pharmaceutical companies had stopped producing it.

Blume, S., & Geesink, I., (2000, 1593-1594)

http://science.sciencemag.org/content/288/5471/1593

[96]

All in all, following some successful trials, the OPV (Oral Polio Vaccine, Sabin’s oral form) emerged as the front-runner. It quickly became the mainstay vaccine across most developed nations, first administered on sugar cubes, if anyone is old enough to remember, and later (and presently) as drops into the mouth. As noted above, in many ways it mimicked the natural form of the infection, but in an attenuated (less potent) form, and it depended upon spreading the vaccine virus to unvaccinated people and the community at large, which is, of course, somewhat different to what most other vaccines are intended to do, i.e., stop the actual spread of the infection.

So, in effect, the OPV was seen to provide much longer protection, and given the way Polio circulated in its natural state (silently for the most part, since most infected people would not know they had Polio and could easily pass it along to others, more often than not giving them life-long immunity and even cross-strain protection, i.e., immunity to one or two strains helped protect against a third), it seemed like a sure bet that the oral vaccine could mimic nature and achieve something similar.

The idea, of course, was to attenuate the viruses (weaken them) and to induce the gut-level (mucosal) immunity that stops the problematic replication in the gut. The aim was to make enough people immune (even the unvaccinated could gain some immunity from close contact with a vaccinated individual), thus producing broad-reaching community immunity (usually referred to as herd immunity) in a safer way than wild virus infection, since the vaccine virus was attenuated (much as a mother can attenuate a pathogen for her newborn through infancy, allowing its immune system to get familiar with the bug without the danger). It was believed that this was, therefore, a much safer way to gain protection for everyone. It was simply a case of reaching enough people (children, for the most part), and soon the world would be entirely free of Polio the crippler once and for all.

As you can imagine, things did not quite pan out the way that had been hoped. The OPV was deemed the weapon of choice for stopping outbreaks in their tracks, and indeed it did seem to do this rather successfully in the early days when vaccination coverage was very high (or possibly as natural immunity was restored to much of our populations). But once natural Polio had essentially stopped circulating across our industrialised nations (at least in any obvious way, since Polio’s continued spread among our communities would only become obvious through ill effects such as paralysis or death; perhaps most had already become silently immune without anyone noticing?) and we could begin to see the wood for the trees, we noticed that the only ones getting paralysed were those who had gotten it from the vaccine itself.

When Can We Stop Using Oral Poliovirus Vaccine?

Why must OPV vaccination be stopped?

Vaccine-associated paralytic poliomyelitis was recognized shortly after the introduction of OPV, with cases occurring in both vaccinees and their contacts.

The time is coming when the only cause of polio is likely to be the vaccine used to prevent it.

Hull, H.F., & Minor, P.D., (2005, 2033 – 2035)

https://academic.oup.com/jid/article/192/12/2033/838973

[97]

This was one of the main reasons why we, and most other first-world countries, went back to using the inactivated (killed) IPV-type vaccine around the 2000s: the IPV seemed to be effective enough as long as Polio was no longer circulating and the risk of infection with the wild virus was very low.

However, the OPV remained the mainstay vaccine of choice for low-income or less industrialised nations, as it was easier to administer and much cheaper. But its long-term use revealed another issue, especially where natural wild Polio was still circulating in some pockets: as discussed below, the OPV strains themselves started to circulate in their own right and even evolve into new lineages.

Evolution of the Sabin Vaccine into Pathogenic Derivatives without Appreciable Changes in Antigenic Properties: Need for Improvement of Current Poliovirus Surveillance

Viruses constituting the Sabin oral polio vaccine (OPV) are inherently genetically unstable (…). Upon reproduction in vaccinees and their contacts, they tend to lose attenuating mutations. At a certain step in their evolution, vaccine-derived polioviruses (VDPV) have virtually no phenotypic distinctions from wild polioviruses and are able to circulate, especially in underimmunized communities, causing sporadic cases of the disease and outbreaks (…).

Such VDPV pose a serious challenge to the Global Polio Eradication Initiative (…) because the real eradication of polioviruses should obviously include not only “true” wild viruses but also VDPV (…). Since it is unlikely that the usage of OPV (inevitably producing new VDPV) will be discontinued soon, i.e., until its replacement by inactivated polio vaccine (…), the identification of VDPV is a key issue for the eradication program.

Yakovenko, M. L., Korotkova, E.A., Ivanova, O. E., Eremeeva, T.P., Samoilovich, E., Uhova, I., Gavrilin, G.V., & Agol, V.I., (2009, 3402)

https://jvi.asm.org/content/jvi/83/7/3402.full.pdf

[98]

As you can tell from the above excerpt, this issue of the OPV going wild, causing outbreaks and recirculating Polio in a newly evolved, vaccine-derived state, rather defeats the purpose of trying to eliminate wild-type Polio in the first place.

It now looks as though some of these OPV vaccine strains have evolved from humble origins in the lab, even having their own family tree with their common ancestor, the vaccine virus itself, at its base, as excerpted below. It seems it is not the Polioviruses in their original state that we need to pay attention to, but those of our own making.

“The evolutionary pathway to virulence of an RNA virus”

In the past, all three oral polio vaccine (OPV) strains have reverted and caused outbreaks, but this occurs most commonly with the vaccine’s poliovirus type 2 (OPV2), as has happened dozens of times in places such as Belarus, China, Egypt, Madagascar, and Nigeria.

And because wild poliovirus type 2 was eradicated in 1999, public health workers began using a bivalent vaccine that used attenuated versions of only strains 1 and 3 in April 2016.

Using 424 sequences of circulating OPV2-derived virulent viruses, representing about 30 independent outbreaks of vaccine-derived poliovirus, the researchers created an evolutionary tree showing how the vaccine-derived viruses evolved in parallel from their common ancestor, the vaccine virus.

Through this analysis, the researchers could also determine the order in which the virulence-promoting mutations occurred…

Pathway to Polio Virulence Revealed

Using epidemiological and laboratory data, scientists have mapped out a sequence of mutations through which the attenuated oral polio vaccine reverts to a virulent virus.

Taylor, P., (The Scientist 2017, March 23rd)

[99]

https://www.the-scientist.com/daily-news/pathway-to-polio-virulence-revealed-31788

The following excerpt highlights just how widespread these vaccine strains gone wild, or VDPVs (Vaccine-Derived Polioviruses), are, and it also points out the potential problem, particularly for those who are not vaccinated or cannot be reached for vaccination.

When Can We Stop Using Oral Poliovirus Vaccine?

Ample molecular data are now available to demonstrate that vaccine viruses can revert to full neurovirulence […]. Outbreaks of polio in China, Egypt, Haiti, Madagascar, and the Philippines caused by circulating, neurovirulent vaccine-derived polioviruses (VDPVs) demonstrate that these revertent strains are fully transmissible and pose significant population risks.

VDPV outbreaks are associated with incomplete vaccine coverage over a period of years, allowing a large population of susceptible children to accumulate […]. Worldwide, only 70%–80% of children receive 3 routine doses of diphtheria-tetanus-pertussis and OPV in their first year of life. Many of the poorest countries in the world are unable to vaccinate even 50% of their children.

Under these circumstances, continuing to use OPV after eradication is very risky

Hull, H.F., & Minor, P.D., (2005, 2033 – 2035)

https://academic.oup.com/jid/article/192/12/2033/838973

[100]

It would certainly be difficult to see who was even infected with these vaccine mutants gone wild. However, we may not have to worry too much, as they seem to behave like the wild Polioviruses: recall that even at the height of the worst Polio epidemics, most people who got Polio showed no symptoms and did not become paralysed, and many of those who did eventually recovered from their ordeal.

Therefore, many people may already be immune, even to vaccine-derived strains, without ever knowing they had been infected. But yes, this certainly makes the worldwide eradication of Polio quite a task, like trying to find the proverbial needle in a haystack. So the plan in later years has been to use a combination of both vaccines, the OPV and the IPV, to stop the spread and finish off the final eradication.

Now, since most of the world had been using the OPV, and since it was the IPV (to which we later switched back) that was thought to require boosters in adulthood, even after the full four childhood doses recommended by health officials, then, if the immunity from the OPV really was very long-term, most of us who are old enough to remember the sugar cubes, or at least anyone who received the OPV up to the year 2000, should still be immune. But, as it turns out, the protection from the OPV lasts a surprisingly short time, as seen in a number of large-scale studies excerpted further on.

Studies that have looked back at large, previously vaccinated populations, using detailed records and clinical testing of stool samples, have discovered that the OPV’s protection may not be as long-lasting as previously thought. Indeed, it now looks as though, at least in less industrialised regions where the wild Polioviruses were still circulating, the OPV may have provided mucosal (gut) protection for only up to a year!

Waning Intestinal Immunity After Vaccination With Oral Poliovirus Vaccines in India

Infection with OPV (vaccine “take”) is highly seasonal in India and results in intestinal mucosal immunity that appears to wane significantly within a year of vaccination…

Waning intestinal mucosal immunity after vaccination with OPV is likely to make the interruption of wild-type poliovirus transmission more challenging. We have previously found frequent infection with wild-type polioviruses among healthy, OPV-vaccinated children in contact with children with poliomyelitis […].

Grassly N. C, Jafari H, Bahl S, Sethi, R., Deshpande, J. M., Wolff, C., Sutter, R. W., & Aylward R. B. (2012, Conclusion)

https://academic.oup.com/jid/article/205/10/1554/949286

[101]

This gut immunity is important to how the OPV is intended to work: it is not that you avoid infection with Polio altogether, but that you get a much-weakened form which should not keep replicating in your gut. Therefore, if the gut immunity does not hold for long because the vaccine’s effect wanes, the virus is likely to start replicating there again. It may still be a mild form, but this causes issues for the spread of the virus within the population when the endgame is eradication.

Asymptomatic wild-type poliovirus infection in India among children with previous oral poliovirus vaccination.

Mucosal immunity induced by oral poliovirus vaccine (OPV) is imperfect and potentially allows immunized individuals to participate in asymptomatic wild-type poliovirus transmission in settings with efficient fecal-oral transmission of infection.

We examined the extent of asymptomatic wild-type poliovirus transmission in India by measuring the prevalence of virus in stool samples obtained from 14,005 healthy children who were in contact with 2761 individuals with suspected poliomyelitis reported during the period 2003-2008…

Wild-type poliovirus serotypes 1 and 3 were isolated from the stool samples of 103 … healthy contacts, respectively. Among contacts of individuals with laboratory-confirmed poliomyelitis, 27 (12.7%) of 213 and 29 (13.9%) of 209 had serotypes 1 and 3, respectively, isolated from their stool samples. The odds ratio of excreting serotype 1 wild-type poliovirus was 0.13 (95% confidence interval, 0.02-0.87) among healthy children reporting 6 doses of OPV, compared with children reporting 0-2 doses. However, two-thirds of healthy children who excreted this virus reported >or=6 doses, and the prevalence of this virus did not decrease with age over the sampled range.

…Although OPV is protective against infection with poliovirus, the majority of healthy contacts who excreted wild-type poliovirus were well vaccinated. This is consistent with a potential role for OPV-vaccinated children in continued wild-type poliovirus transmission and requires further study.

Grassly, N. C., Jafari, H., Bahl, S., Durrani, S., Wenger, J., Sutter, R. W., & Aylward, R. B. (2010)

https://www.ncbi.nlm.nih.gov/pubmed/20367459

[102]

However, ironically perhaps, because our nations for the most part used the oral vaccine for so long before switching back to the original injectable form, we may have been immunised by a combination of naturally attenuated and artificially attenuated strains anyway, given the freedom the vaccine viruses had to recombine and produce mutated forms of themselves. Certainly, if these studies are anything to go by, and given that not so long ago our own nations were still living with wild Polio circulating without it causing too much harm, then if the OPV really wore off so rapidly, did this leave our populations wide open to Polio returning in its own colours?

Seemingly not. The nature of Polio, with asymptomatic spread (infected individuals showing no obvious signs of Polio could easily pass it to others, who in turn might show no symptoms), would have facilitated wide community spread of the wild Polioviruses as more and more people became silently infected once the OPV wore off, without too much mishap. We would therefore not have had much evidence of wild Polio still being present, and many more people would have been exposed (and therefore fully immunised naturally) without us necessarily being aware of it.

And since we used the OPV for so long, and it tried to mimic the natural form of immunisation, perhaps, ironically, the surprisingly short-term protection it conferred against the wild Polioviruses, together with vaccine virus strains evolving and acting more like wild Poliovirus, silently infecting whole communities without most people even knowing it, means we may have become immune even to lab-created mutants by more natural means as well.


Please feel free to share your thoughts below…



References

[78] Ochmann, S., and Roser, M., (2018) Polio, OurWorldInData.org. https://ourworldindata.org/polio

[79] Krause R.M., (1992) The origin of plagues: old and new, Science, Vol. 257, pp. 1073-1075 http://www.sciencemag.org/site/feature/data/diseases/PDFs/257-5073-1073.pdf

[80] Rutty, C. J. (2006) The Middle-class Plague: Epidemic Polio and the Canadian State, 1936-37, Canadian Bulletin of Medical History, Vol. 13 [2], p. 278. doi:10.3138/CBMH.13.2.277. http://www.healthheritageresearch.com/cbmhbchm_v13n2rutty.pdf

[81] Cockburn, P., (2010) Polio, once the most feared of diseases, is about to be eradicated. [This essay by Patrick Cockburn, first published in The Independent in 1999], The Independent (October 27th 2010) https://www.independent.co.uk/life-style/health-and-families/features/polio-the-deadly-summer-of-1956-2117253.html

[82] Krause R.M., (1992) The origin of plagues: old and new, Science, Vol. 257, pp. 1073-1075 http://www.sciencemag.org/site/feature/data/diseases/PDFs/257-5073-1073.pdf 

[83] Rie, A.V., Wendelboe, A.M. & Englund, J.A. (2005) How Mothers Pass on Protection Against Disease to their Offspring. The Pediatric Infectious Disease Journal, Vol. 24, [5], May 2005 https://www.hpsc.ie/a-z/vaccinepreventable/pertussiswhoopingcough/niac/File,13702,en.pdf

[84] Trevelyan, B., Smallman-Raynor, M., & Cliff, A. D. (2005). The Spatial Dynamics of Poliomyelitis in the United States: From Epidemic Emergence to Vaccine-Induced Retreat, 1910–1971. Annals of the Association of American Geographers, Vol. 95 [2], pp. 269–293. doi: 10.1111/j.1467-8306.2005.00460.x

[85] Mawdsley, S. E., (2014) Polio provocation: solving a mystery with the help of history, The Art of Medicine, The Lancet, Vol. 384, [9940], pp. 300-301, doi: 10.1016/S0140-6736(14)61251- https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1473032/

[86] Gromeier, M., & Wimmer, E. (1998). Mechanism of injury-provoked poliomyelitis. Journal of Virology, Vol. 72, [6]. pp. 5056-60. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC110068/

[87] Mawdsley, S. E., (2014) Polio provocation: solving a mystery with the help of history, The Art of Medicine, The Lancet, Vol. 384, [9940], pp. 300-301, doi: 10.1016/S0140-6736(14)61251- https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1473032/

[88] Dwyer-Hemmings L. (2018). A Wicked Operation? Tonsillectomy in Twentieth-Century Britain. Medical history, Vol. 62, [2], pp.217-241. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5883156/

[89] Mayo Clinic (2011), Tonsillitis, Mayo Clinic https://www.mayoclinic.org/tests-procedures/tonsillectomy/about/pac-20395141

[90] Wyatt, H.V., (2011) The 1916 New York City Epidemic of Poliomyelitis: Where did the Virus Come From? The Open Vaccine Journal, Vol. 4, pp. 13,16, Abstract https://benthamopen.com/contents/pdf/TOVACJ/TOVACJ-4-13.pdf

[91] Rutty, C. J. (2006) The Middle-class Plague: Epidemic Polio and the Canadian State, 1936-37, Canadian Bulletin of Medical History, Vol. 13 [2], p. 279. doi:10.3138/CBMH.13.2.277. http://www.healthheritageresearch.com/cbmhbchm_v13n2rutty.pdf

[92] Lindner, U., & Blume, S. S. (2006) Vaccine innovation and adoption: polio vaccines in the UK, the Netherlands and West Germany, 1955-1965, Medical history, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1592614/

[93] Cockburn, P., (2010) Polio, once the most feared of diseases, is about to be eradicated. (1st published in The Independent in 1999), The Independent (October 27th 2010) https://www.independent.co.uk/life-style/health-and-families/features/polio-the-deadly-summer-of-1956-2117253.html

[94] History Ireland (2006), The 1956 polio epidemic in Cork, History of Ireland, Vol. 14, [3] http://www.historyireland.com/20th-century-contemporary-history/the-1956-polio-epidemic-in-cork/

[95] Baicus A. (2012). History of polio vaccination. World journal of virology, Vol. 1, [4] pp.108-14. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3782271/

[96] Blume, S., & Geesink, I., (2000), A Brief History of Polio Vaccines: Essay on Science and Society, Science, Vol. 288, [5471], pp. 1593-1594, doi: 10.1126/science.288.5471.1593 http://science.sciencemag.org/content/288/5471/1593

[97] Hull, H. F., & Minor, P. D. (2005) When Can We Stop Using Oral Poliovirus Vaccine? The Journal of Infectious Diseases, Vol. 192, [12], pp. 2033–2035. https://doi.org/10.1086/498171 https://academic.oup.com/jid/article/192/12/2033/838973

[98] Yakovenko, M. L., Korotkova, E.A., Ivanova, O. E., Eremeeva, T.P., Samoilovich, E., Uhova, I., Gavrilin, G.V., & Agol, V.I., (2009) Evolution of the Sabin Vaccine into Pathogenic Derivatives without Appreciable Changes in Antigenic Properties: Need for Improvement of Current Poliovirus Surveillance, Journal of Virology, Vol. 83, [7], p. 3402, doi:10.1128/JVI.02122-08 https://jvi.asm.org/content/jvi/83/7/3402.full.pdf

[99] Taylor, P., (2017) Pathway to Polio Virulence Revealed, The Scientist (2017, March 23rd) https://www.the-scientist.com/daily-news/pathway-to-polio-virulence-revealed-31788

[100] Hull, H. F., & Minor, P. D. (2005) When Can We Stop Using Oral Poliovirus Vaccine? The Journal of Infectious Diseases, Vol. 192, [12], pp. 2033–2035, doi: 10.1086/498171 https://academic.oup.com/jid/article/192/12/2033/838973

[101] Grassly N. C, Jafari H, Bahl S, Sethi, R., Deshpande, J. M., Wolff, C., Sutter, R. W., & Aylward R. B. (2012), Waning Intestinal Immunity After Vaccination With Oral Poliovirus Vaccines in India, The Journal of Infectious Diseases, Vol. 205, [10], Conclusion, pp. 1554–1561, doi: 10.1093/infdis/jis241 https://academic.oup.com/jid/article/205/10/1554/949286

[102] Grassly, N. C., Jafari H, Bahl S, Durrani S., Wenger J, Sutter R. W, & Aylward R. B., (2010) Asymptomatic wild-type poliovirus infection in India among children with previous oral poliovirus vaccination. The Journal of Infectious Diseases, Vol. 201, [10] Abstract, pp. 1535-1543. doi: 10.1086/651952. https://www.ncbi.nlm.nih.gov/pubmed/20367459



NATURAL IMMUNITY MAY HAVE SAVED MILLIONS!

TABLE OF CONTENTS

PREFACE

DEDICATION

INTRODUCTION

From the Great Plague to Polio in Ireland & Beyond

PART ONE

CHAPTER ONE

Whatever happened to the Bubonic Plague & What has Chickenpox got to do with it?

CHAPTER TWO

Cholera: the Disease that inspired Bram Stoker to write Dracula & a Tale of a few other pathogens

CHAPTER THREE

The Many ‘Typhoid Marys’

CHAPTER FOUR

TB the Modern Plague of the 20th Century?

CHAPTER FIVE

Would We Survive the Spanish Flu if it Re-emerged Today?

PART TWO

CHAPTER SIX

Did We Really Eradicate Smallpox: Don’t count your Children before they’ve had the Pox?

CHAPTER SEVEN

Scarlet Fever Returns, but it is a whole lot less lethal

CHAPTER EIGHT

The Almost Universal Decline in Deaths from Infectious Diseases of Childhood

CHAPTER NINE

What Would Have Happened to Polio If We Hadn’t Intervened?

EPILOGUE


 

THIS STUDY IS DEDICATED TO:

All those who gave their Lives & Suffered Untold Disabilities

On the Front Line of Our Developing Nations
Who Fought with such Valour
Unknowingly – Defending Our Future Immunity
Against the Greatest Scourges of Humanity.

INTRODUCTION

Recently emerging archaeological evidence suggests that black rats and their fleas may be innocent after all of being the true cause of the Plague of the Middle Ages. And if the old Plague returned to taunt our modern communities today (we now know, for example, that it carries the same genes as the original Plague), it seems we would not begin to die in our millions: a third of Ireland’s population, and the same proportion in Europe, would not be wiped out as before, and the Herpes family of viruses, which includes Chickenpox, may be the reason why!

By re-examining history from the perspective of the bugs themselves, and by understanding their behaviour within us as their hosts, an unexpected story begins to emerge of natural resistance to all sorts of once deadlier pathogens and of our ultimate immunity to them over the course of generations.

There is no better example of this than the great worldwide Spanish Flu pandemic of 1918/19. Most of those who became infected survived, and the vast majority suffered only a mild form, apparently owing to previous exposure either to the same strain in its initially less aggressive first wave or, for those old enough, to earlier Influenza pandemics. These survivors carried within them natural resistance against every major Influenza strain that emerged thereafter, as did the vast majority of subsequent generations, who became resistant not just to the strains circulating while they were growing up, but to major pandemic strains that were yet to appear.

That future has come and gone, and it is remarkable just how many of us are robustly resistant to almost all the circulating strains, no matter what the season. The most surprising part is that we may be able to pass this protection against the worst effects of previously experienced bugs (quite possibly attenuated, or rendered less lethal, by being filtered and disarmed by our immune systems) to our offspring, and they to theirs. Immunity molecules echo far into the future, further than any of us could have imagined.

Heritable immunity operates outside, above and beyond your genes, and it is this newly emerging science that holds the key to explaining how we now seem to be nearing the end of the Great War with a vast array of bugs that once wiped out entire families, destroyed communities and left massive holes in our populations. Therefore, perhaps we don’t need to worry so much about those viruses, or bacteria for that matter, outsmarting our immunologists and health professionals, now that the war is effectively over and only a few skirmishes remain.

Essentially, this study has established the rise and demise of all the major contagions as a near-universal pattern. As such, this widely shared phenomenon, experienced across and throughout our now developed nations, cannot be explained by the more commonly offered means, typically improved nutrition, economics, hygiene and so on, because it seems highly improbable that we all experienced the very same degree of hygienic intervention, economic prosperity and dietary improvement at exactly the same time, within and across our vast and far-flung nations from Ireland to Iceland and from Australia to America, so as to account for this remarkably similar and near-simultaneous decline in deaths from the same infectious contagions.

And certainly, this pattern of dramatically declining deaths across our nations cannot be accounted for by our medical interventions, as these measures do not correlate with what we see in the mortality statistics or the historical accounts within our respective regions. In other words, all of the major contagions are historically documented to have declined in their deadliness long before, or entirely without, our efforts to intervene in their demise.

Even for contagions that are widely believed to have been stopped in their tracks and ultimately eradicated via our medical interventions, such as Smallpox, closer inspection of previously un-investigated sources and statistics would say otherwise.

The answer to how these greater and lesser killer pathogens may have become less deadly to our populations, near-simultaneously, lies in how these same pathogens may have initially gained the upper hand, or gotten a more deadly foothold within our now modern nations in the first place. It seems that the more you are exposed to natural background pathogens, the more resistance you build up against them.

For instance, as this study documents and explores, it would appear that as we were beginning to become cleaner, more affluent and generally better fed, counter-intuitively perhaps, it was a shift in our normal exposure to such natural background pathogens that may have contributed significantly to the terrible shift from benign to deadly and debilitating.

This is exemplified in the studies relating to the rise of Polio at the other end of the plague-like spectrum. Polio, the great crippler, has often been referred to as the Middle-Class disease. This is due to the fact that it was often, unfortunately, the most affluent older children from hygienic homes that were the most susceptible to the Poliovirus’s worst impact.

Everyone else was essentially naturally immune to Polio, having been frequently exposed from an early age to natural Polioviruses that lived alongside us in the general background. However, the better-off children were now playing less in the dirt and around other children, and were living with new indoor plumbing and flushable toilets, which left them the most vulnerable because they were artificially shielded against such germs. This is reflected both in the historical accounts of the time and in later research into the nature of the eruptions within our fully modern nations.

In other words, it seems that the original eruptions of even the greatest pathogen attacks known to humankind, from the Plague to Typhus or Cholera, as well as lesser types like Polio, were probably initiated by these typically more benign pathogens finding increasing opportunities to colonise greater numbers of susceptible people, as our decreasing exposure left us, their natural hosts, with less and less resilience to them.

It is proposed here that this shift in ecology occurred because, as we developed and advanced towards modern living (even back in the Middle Ages people were becoming more modernised), increasing urbanisation, densely growing populations and world interconnectedness through expanding commerce gave these opportunistic pathogens, at different historical times, the upper hand, until our swelling populations had built up enough resistance and immunity to return each pathogen to its more natural state once again.

Another situation where a pathogen can wreak devastation is when a given population has never encountered it before and therefore has no existing immunity or resilience, as is well documented throughout early modern history when indigenous peoples of the new world first encountered pathogen-carrying explorers from the old.

Once a particular pathogen became out of balance within a population, or was being encountered for the first time, it would impact the least resilient victims the most, and the contagion would begin spreading, often with lethal devastation, at least initially. In other words, it does not look as if any amount of cleaning up sewers, removing flea-infested rat heaps or delousing could impede its spread into the wider public once it caught hold; it spread like wildfire until those who survived had adequate immunity to keep the bug at bay. Nor did it matter how fit and healthy indigenous peoples were when they encountered such a new contagion; the bug didn’t care. But thankfully, all this was not relentless, and as harsh as these eruptions were, this hard-earned resilience would seemingly echo down the generations, as indicated above.

Overall, there would be an immunising effect from such exposures over time, to the point where the pathogen could only impact the least resilient (least exposed) portions of a population, namely the infants and children. Even this more vulnerable group was protected by their mothers’ antibodies (gained from the mothers’ own exposure) until they could deal with the pathogen on their own, as our medical literature on the subject clearly shows. Eventually, almost all of the population grew up to become immune, or at least strongly resistant, to once deadlier infectious diseases; indeed, as the historical accounts show, these very often became relatively benign infections of childhood, and ultimately the threat would pass entirely.

This pattern, as noted in the preface, is inscribed upon the statistical mortality charts and graphs which form the backbone of this entire study. When we assess these against the historical accounts of each contagion through time, from its rise to its demise, it becomes obvious that we are indeed robustly resistant (if not fully and ancestrally immune) to the worst impact of these bugs within our more modern and fully developed world.

Whichever end of the contagion spectrum we are viewing, in terms of time or scale, the story is essentially the same. The overarching pattern is that they all became significantly and increasingly less deadly (fairly tame, in fact) as the generations passed, with one reassuring caveat: the older the contagion, the more deadly its impact at a population level, and conversely, the younger the contagion, the less deadly it was within our populations as a whole.

This common pattern, being demonstrably near universal, should in principle also apply to less developed nations, who are only a little behind where we are now. Indeed, there is some indication that such a welcome reprieve from previously deadlier contagions is also occurring in parts of the world less developed than ours.

Essentially, Natural Immunity May Have Saved Millions!

Kindle and book edition: Natural Immunity May Have Saved Millions

Please send any feedback that you think might be helpful before publication in Spring 2019.

 

CHAPTER FIVE: Would we survive the Spanish Flu if it re-emerged today?

Link to Previous Chapter 

Just to put Influenza into perspective in terms of its impact across our emerging modern nations, the following quote will give you an idea of the sort of numbers of deaths we are talking about. Bear in mind that the world by the early 20th Century was much more heavily populated than it had been back in the Middle Ages, when the Great Plague ran rampant.

The influenza pandemic of 1918

The influenza pandemic of 1918-1919 killed more people than the Great War, known today as World War I (WWI), at somewhere between 20 and 40 million people. It has been cited as the most devastating epidemic in recorded world history. More people died of influenza in a single year than in four-years of the Black Death Bubonic Plague from 1347 to 1351. Known as “Spanish Flu” or “La Grippe” the influenza of 1918-1919 was a global disaster…

Bodies pil[l]ed up as the massive deaths of the epidemic ensued. Besides the lack of health care workers and medical supplies, there was a shortage of coffins, morticians and gravediggers (…). The conditions in 1918 were not so far removed from the Black Death in the era of the bubonic plague of the Middle Ages.

 Billings, M. (1997)

https://virus.stanford.edu/uda/

[70]

Why it came to be known as the Spanish Flu is revealed below, along with a little of the background to how it may have gotten such an easy ride through our populations, given the extremely unusual circumstances that prevailed at the end of World War I.

Limerick City and the Spanish Influenza Epidemic, 1918-19

The name ‘Spanish Influenza’ came about not because it originated in Spain but because Spain was the first country to report, uncensored and unbiased, on the spread of the disease, due to its neutrality in World War I.  It occurred in three waves ; the first in spring 1918, the second in October/ November 1918 and the third in spring 1919 …

The beginning of the pandemic in spring 1918 is thought to result from soldiers returning to their home countries. The demobilisation of troops in November 1918 (Armistice Day) could possibly account for the second wave of influenza which proved to be more deadly than its predecessor… in Chicago, the police were instructed to arrest anybody who sneezed in public

….as the epidemic in America gathered speed, school children even came up with a rhyme about it to skip by…

I had a little bird and its name was Enza,

I opened the window and in-flew-Enza.

Buckley, M., (2014, 81)

[71]

https://www.ucc.ie/en/media/academic/appliedsocialstudies/cstpdfs/vol6/MargaretBuckley.pdf

The situation in Ireland as recorded at the time and established from the historical accounts since reveals the impact felt across our nations as we were going through a near-déjà vu of the Great Plague of the Middle Ages many centuries earlier. The following documents the Flu’s arrival on Ireland’s shores.

Limerick City and the Spanish Influenza Epidemic, 1918-19

In Ireland, the first verifiable outbreak of the first wave can be traced to Cobh, when a US Naval ship, the USS Dixie, docked there in May 1918. It seems that the first wave was somewhat more contained th[a]n the subsequent waves as it did not affect the entire country … Confirmation of the onset of the second wave came from Howth during late September and this time all areas of the country were infected. By Christmas, all counties had suffered an outbreak in both rural and urban areas…

ibid

[72]

https://www.ucc.ie/en/media/academic/appliedsocialstudies/cstpdfs/vol6/MargaretBuckley.pdf

The death toll was massive for such a small nation as Ireland. However, it was proportional to what other regions all around the world, both great and small, were feeling. No wonder our medical professionals were desperately trying to find some means of intervention, but as indicated in the review below, this alas proved unsuccessful.

Review of: The Last Irish Plague: The Great Flu Epidemic in Ireland 1918-19, by Catriona Foley

…between spring 1918 and early summer 1919, resulted in the sickness of over 800,000 people on this island, and the related death of almost 21,000 of them (statistics of the Registrar General of Ireland 1918-19)…

From the early months of the Influenza Pandemic – in Ireland as elsewhere, attempts were underway in universities and laboratories in pursuit of a therapeutic vaccine for influenza. They did not succeed: this influenza type infection was undoubtedly lethal, and they knew that it was not a bacteria; but they simply did not know, at this stage in the pandemic, precisely what order of complexity they were dealing with.

Jones, M. (2016), Ill-Prepared

[73]

Dublin Book Reviews

http://www.drb.ie/essays/ill-prepared

However, this should be put into perspective. It didn’t mean that everyone else was immune to actually getting the same Flu that wiped out millions worldwide; it just meant that many got the Flu but, for the most part, didn’t die from it or, indeed, even suffer that much, comparatively speaking, as highlighted in the following excerpt:

1918 Influenza: the mother of all pandemics. Could a 1918-like Pandemic Appear Again? If So, What Could We Do About It?

In its disease course and pathologic features, the 1918 pandemic was different in degree, but not in kind, from previous and subsequent pandemics. Despite the extraordinary number of global deaths, most influenza cases in 1918 (>95% in most locales in industrialized nations) were mild and essentially indistinguishable from influenza cases today.

Taubenberger, J. K., & Morens, D. M. (2006)

[74]

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3291398/

Therefore, as tragic and ultimately devastating as the Spanish Flu was for so many families and communities, the vast majority survived the infection and even got off quite lightly in terms of suffering. It is hard to believe just how many survived the great Flu epidemic.
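As a rough back-of-envelope check, taking the Registrar General's rounded figures quoted above (over 800,000 ill and almost 21,000 deaths), the little sketch below works out the implied case-fatality and survival proportions; it simply restates the quoted numbers and is not a result from the review itself.

```python
# Rounded figures from the Registrar General of Ireland, 1918-19,
# as quoted in the review excerpted earlier.
reported_cases = 800_000
reported_deaths = 21_000

case_fatality = reported_deaths / reported_cases
print(f"Implied case fatality among the recorded sick: {case_fatality:.1%}")      # ~2.6%
print(f"Implied survival among the recorded sick:      {1 - case_fatality:.1%}")  # ~97.4%
```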

The Taming of the Flu

However, at the time nobody knew when the deaths would cease, or whether they themselves would be next. It was all the more tragic because the whole world had been in turmoil from the Great War, and what should have been a time of celebration and exuberance turned into a doomsday scenario that caused more deaths than the war itself!

And then, almost as suddenly as Enza flew in, this great pestilence flew out again. We know this from accounts on the ground at the time within Ireland, as indicated in a newspaper article in the Irish Times from the era.

November 9th, 1918: Relief as deaths from 1918 Spanish flu epidemic began to decline

Irish Times

IT WAS authoritatively stated yesterday that the influenza epidemic in Dublin is abating.

The statement was based on the fact that there are very few fresh cases within the past few days… On the whole, he stated, there was a decline.

Joyce. J (2009, Nov. 9th)

[75]

https://www.irishtimes.com/opinion/november-9th-1918-relief-as-deaths-from-1918-spanish-flu-epidemic-began-to-decline-1.768576

A similarly dramatic decline in deaths was beginning to occur around the world. As indicated below, it seems that immunity was developing rapidly, in proportion to the degree of previous exposure individuals had experienced, which gives us a very strong clue as to why deaths from Influenza declined almost as rapidly as they had risen during the second wave.

Pathogenic Responses among Young Adults during the 1918 Influenza Pandemic

…Mortality Rates among Nurses and Medical Officers

During the 1918 pandemic period, military nurses and medical officers were intensively and repeatedly exposed to the influenza A (H1N1) pandemic strain in clinics, in ambulances, and on crowded open wards. However, during the lethal second wave, nurses and medical officers of the Australian Army had influenza-related illness rates similar to, but mortality rates lower than, any other occupational group …

Similar observations were made in other groups of military and civilian health care workers … These findings suggest that the occupational group with the most intensive exposure to the pandemic strain had relatively low influenza-related pneumonia mortality rates during the second wave …

Mortality Rates among Military Members with Least Service

During the fall of 1918, all 40 large mobilization/training camps throughout the United States and Puerto Rico were affected by influenza epidemics … During the camp epidemics, influenza–pneumonia mortality rates were inevitably highest among the soldiers with the least military service. In the US Army overall, 60% of those who died of influenza-related pneumonia were soldiers with <4 months of military service …

Shanks, G. D., & Brundage, J. F. (2012)

[76]

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3310443/

In other words, the more exposed you were, the greater your resistance to the worst effects of these pathogens became; conversely, the less exposed and the more naïve your immune system, the more vulnerable you were in the face of the pathogen.

Another dimension to this immunity conferred by natural exposure is addressed below, this time regarding pre-exposure to the same pandemic strain. The first wave of the Spanish Flu was milder than the second, much more severe wave, even though both were caused by the same strain, and the older age group who got off most lightly during the second wave may have had some pre-existing resistance from direct exposure to something like the Influenza strain that caused the Russian Flu pandemic of 1889/90, almost thirty years earlier. All of this might help explain, at least in part, the unusual age group that the Spanish Flu impacted most adversely, namely some of the strongest, healthiest and fittest members of our populations.

How historical disease detectives are solving mysteries of the 1918 flu

The pattern of deaths by age was also intriguing. Young adults in their late 20s were at heightened risk. In contrast, influenza infections were frequent among teenagers, but these infections were mild. Senior adults were also less likely than young adults to die from influenza …

These studies show that populations that experienced an early and often mild pandemic wave in the first half of 1918 fared better in the deadly autumn wave. Our hypothesis was eventually supported when virologists recovered a pandemic-like virus from preserved lung tissue of U.S. soldiers who died in summer 1918. This suggested that the pathogen responsible for the early waves was the novel pandemic virus.

Why were older adults spared? One popular explanation is that well-connected populations who had seen influenza in the 19th century would be protected upon the return of a similar virus decades later. This is known as the “antigen recycling” hypothesis.

This hypothesis gained more traction during the 2009 pandemic, when older populations had higher levels of prior antibodies and therefore were less likely to die than younger populations.

…Moreover, patterns of infection and death may depend upon people’s prior immunity, imprinted by circulation of similar viruses within the last century.

The Conversation (2018, 5th March)

https://theconversation.com/how-historical-disease-detectives-are-solving-mysteries-of-the-1918-flu-91887

[77]

As also noted above, this evidence suggests that natural immunity and resistance built up through prior exposure over a lifetime provided similar protection to our older populations during the most recently documented Influenza pandemic, the Swine Flu of 2009, which, as you will see further on, thankfully left a relatively minuscule mark on our populations as a whole.

Further to what was highlighted in the article excerpt above, it may also be that those we might expect to be most susceptible, such as young infants, small children and the older generations, particularly the elderly, were more likely to have been exposed to the gentler early form of the pandemic strain before it became a whole lot more aggressive, because these age groups were more likely to remain at home and within their immediate environments than the young adult population.

Given the highly unusual circumstances at the end of the Great War (WWI), this would also have put the young adult population in the direct firing line, without much in the way of defence, as they would have been highly mobile compared with their counterparts who stayed closer to home.

Further clues to this phenomenon of resistance built up from prior natural exposure come from individuals who were exposed to the Spanish Flu as children safely tucked away at home, lived through the ordeal, and were still alive in our modern era to tell the tale, as documented in the following excerpt.

Neutralizing antibodies derived from the B cells of 1918 influenza pandemic survivors.

Nature

Little is known about naturally occurring adaptive immunity to this virus; however, some elderly survivors are still living. We sought to determine whether survivors exhibited evidence of acquired immunity to the virus. Expression of the 1918 HA antigen allowed us to identify and characterize protective antibodies induced by natural exposure of humans to the 1918 pandemic virus.

We identified a panel of 32 subjects aged 91-101 years (i.e., aged 2 to 12 years in 1918), many of whom recalled a sick family member in the household during the pandemic, which suggested direct exposure to the virus. Of the subjects tested, 100% exhibited serum neutralizing activity against the 1918 virus .., and 94% had serologic reactivity to the 1918 HA (…), even though these samples were obtained nearly 90 years after the pandemic.

Thus, these studies reveal that survivors of the 1918 influenza pandemic possess highly functional, virus-neutralizing antibodies to this uniquely virulent virus, and that humans can sustain circulating B memory cells to viruses for many decades after exposure – well into the tenth decade of life.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2848880/

Yu, X., Tsibane, T., McGraw, P. A., House, F. S., Keefer, C. J., Hicar, M. D., Tumpey, T. M., Pappas, C., Perrone, L. A., Martinez, O., Stevens, J., Wilson, I. A., Aguilar, P. V., Altschuler, E. L., Basler, C. F., … Crowe, J. E. (2008). Neutralizing antibodies derived from the B cells of 1918 influenza pandemic survivors. Nature, 455(7212), 532-6.

[78]

Now, that should have been the end of it, and everyone would have been immune, or at least robustly resistant, to the more deadly impact of the great Flu. However, a whole plethora of Flu viruses came after. See the summary of what happened below:

Your flu risk may be linked to the year you were born
CNN

Scientists stunned by own discovery
Influenza A viruses can be categorized into two groups, and within these groups, there are subtypes: H1, H2 and H5 are in group 1, and H3 and H7 are in group 2. Only three subtypes — H1, H2 and H3 — have circulated in humans worldwide from 1918 to 2015, according to the study.

As it turned out, the researchers found that people born before 1968 were more likely to be exposed to the group 1 viruses H1N1 or H2N2 and were less likely to suffer or die from infections with the group 1 virus H5N1 infections later in life. In 1968, there was an influenza pandemic that had a multinational impact.

The 1968 pandemic marked the transition from an era of group 1 viruses to a group 2-dominated one, the researchers wrote in the study.
Therefore, people born after 1968 were more likely to be exposed to the group 2 virus H3N2 at a young age and were less likely to suffer or die from infections with the group 2 virus H7N9 later in life.

For both groups, exposure at a young age not only lowered the risk of a severe infection with either H5N1 or H7N9, it reduced the risk of death by up to 80%, the researchers wrote in their study.

 Howard, J., (2016, 10th November)

[79]

https://edition.cnn.com/2016/11/10/health/flu-risk-birth-year/index.html
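The birth-year pattern described in the CNN piece can be boiled down to a very crude rule of thumb. The sketch below encodes only that simplification, assuming first childhood exposure matched the dominant circulating group and treating 1968 as the cross-over year; it is an illustration of the idea, not the study's actual statistical model.

```python
# A deliberately crude rule of thumb based on the birth-year pattern
# described in the article above: people born before the 1968 pandemic
# were most likely first exposed ("imprinted") to group 1 viruses (H1, H2),
# while those born afterwards were most likely imprinted with the group 2
# virus H3.  The study found that early imprinting tracked with lower
# severity for avian strains of the SAME group (H5N1 in group 1, H7N9 in
# group 2) later in life.  This is a simplification, not the study's model.

def likely_imprinting_group(birth_year: int) -> str:
    """Return the HA group a person was most plausibly imprinted with."""
    return "group 1 (H1/H2)" if birth_year < 1968 else "group 2 (H3)"

for year in (1950, 1968, 1990):
    print(year, "->", likely_imprinting_group(year))
```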

But one of these strains has turned out to be a little stubborn, as seen in the following excerpt. The point, however, is that it may just be a matter of time, as can be seen when reviewing the cross-protection provided to us as hosts against the longer-circulating major strains of Influenza and their variants over the course of history so far.

“The Problem Child of Seasonal Flu”: Beware This Winter’s Virus
H3N2 is deadlier than many other influenza strains

For a long time, it was flu dogma that only one influenza A virus could circulate at once. The H1N1 virus that caused the 1918 Spanish flu disappeared when the H2N2 virus that touched off the Asian flu pandemic emerged in 1957. Then in 1968, H3 muscled out H2.

But in 1977, something odd happened. H1N1 reappeared—likely as the result of a laboratory accident. And what was thought to be impossible—two influenza A strains circulating at the same time—was shown to be possible.

When the 2009 pandemic started, flu researchers hoped it would push the reset button. They hoped the new virus—an H1N1 virus that had been circulating in pigs—would drive out both the old H1N1 and H3N2.

The old H1N1 viruses did disappear. But H3N2 viruses didn’t budge. For the time being, we’re stuck with this unpleasant virus.

Branswell, H. (2018, 9th January)

https://www.scientificamerican.com/article/ldquo-the-problem-child-of-seasonal-flu-rdquo-beware-this-winter-rsquo-s-virus/

[80]

As highlighted by the title of the above article, this persistent strain was the one to be feared most. But that season has come and gone as of the time of writing and, thankfully, it did not cause massive death tolls; besides, as discussed previously, exposure to this particular type confers fairly solid cross-protection against other, similar group strains that might be encountered later in life.

Therefore, our latest problem child (H3N2), having only erupted in the late 1960s, may eventually disappear (or become less obvious) like its older counterparts; it is seemingly just a matter of exposure and time until almost all of us become immune to that one as well.

Furthermore, we can gain a real-time insight into such infection from natural exposure, and the immunity that follows, across different Flu seasons in the following study. It reveals just how quickly and robustly our immune systems adapt to all sorts of strains, including cross-protection from more familiar types to those that have morphed rather dramatically, and shows that the natural immune response can even deal with less-than-natural strains, such as the returned A H1N1 (possibly lab-escaped) strain of the 1970s.

Infection with influenza A H1N1: Effect of past experience on natural challenge

SUMMARY

Following its reintroduction in 1978 influenza A H1N1 spread widely in the child population. By the autumn of 1979, 75% of 11-year olds entering a boys’ boarding school had detectable antibody.

The protective effect of previous experience could be assessed during two outbreaks in the school. In the first outbreak in 1979, 90 % of those known to have been infected in the previous year were protected against reinfection…

Previous experience conferred over 90 % protection against infection.

Between the 1979 and the 1983 outbreaks there was no overt evidence of A H1N1 activity in the school although a few sporadic infections were identified in those investigated routinely or in connection with other infections…

First, with the re-emergence of A H1N1 in 1978, infections were virtually confined to young people. Those old enough to have had experience of strains before 1957 seemed to be immune.

Secondly, our own observations on A H3N2 in the school (Hoskins et al. 1979) suggested that natural infection gave good protection even against strains which had undergone considerable antigenic drift.

Thirdly, the 1979 outbreak showed that recent infection with A H1N1 gave good protection against reinfection.

Davies, J. R., Grilli, E. A., & Smith, A. J. (1985)

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2129641/

[81]

Not only did the boys survive their ordeal and become almost all immune after being directly exposed to a strange attenuated mutant gone wild (whatever its actual point of origin, by most accounts it re-emerged through some form of human involvement), but they also showed immunity to Flu strains upon re-exposure in later Flu seasons.

Also of interest, noted in the above article, is the observation from other related studies that exposure to the A H3N2 strain (that newer, trickier type which appears to have morphed considerably by fully natural means, what is called antigenic drift) resulted in fairly robust resistance, even though the boys had not previously been exposed to that particular variant. This is what is called cross-protection: protection gained from exposure to another Flu strain that is not too dissimilar.

All in all, it would appear that we build up quite an arsenal of immunity against all sorts of strains over the course of our lives through continual exposure, sometimes without even knowing it. The really good news is that these seasonal exposures can give you long-lasting protection, and also set you up with some mighty protection against pandemic strains that you haven’t even experienced yet, as highlighted in the following excerpt.

Age-dependence of the 1918 pandemic

4. Birth year dependence of H5N1 and H7N9 avian flu cases

A growing body of epidemiological evidence indicates reduced risk of pandemic infection in those with previous seasonal exposure, and lifelong protection against viruses of different subtypes but within the same HA Group

Woo, G (2018, 11)
Presented to the Institute & Faculty of Actuaries

https://www.actuaries.org.uk/documents/age-dependence-1918-pandemic

[82]

That is fairly impressive cross-protection: exposure to naturally circulating Flu viruses can provide protection against future pandemic strains (new mutations with worldwide impact) that haven’t even emerged yet!

It certainly is beginning to look as if our immune systems have been doing a fine job of defending us over the generations, and it may be worth being exposed to the real thing, since at the very least we gain protection, both direct and indirect, against future strains. This becomes clearly evident when we take a closer look at our most recent pandemic, the first of the 21st Century.

As we revisit that most recent worldwide pandemic, the 2009 Swine Flu outbreak, more research has emerged suggesting that the cause of such low mortality figures across our nations (we will discuss the low mortality of this most recent pandemic shortly) may relate to the strains of Flu you were exposed to throughout your life, whether you knew it or not, as indicated above and, more specifically, by the epidemiological data below.

Immunity to Pre-1950 H1N1 Influenza Viruses Confers Cross-Protection against the Pandemic Swine-Origin 2009 A (H1N1) Influenza Virus

The 2009 H1N1 influenza virus outbreak is the first pandemic of the twenty-first century.

Epidemiological data reveal that of all the people afflicted with H1N1 virus, <5% are over 51 y of age. Interestingly, in the uninfected population, 33% of those >60 y old have pre-existing neutralizing Abs against the 2009 H1N1 virus.

This finding suggests that influenza strains that circulated 50–60 y ago might provide cross-protection against the swine-origin 2009 H1N1 influenza virus.

[83]

Immunity to Pre-1950 H1N1 Influenza Viruses Confers Cross-Protection against the Pandemic Swine-Origin 2009 A (H1N1) Influenza Virus
Ioanna Skountzou, Dimitrios G. Koutsonanos, Jin Hyang Kim, Ryan Powers, Lakshmipriyadarshini Satyabhama, Feda Masseoud, William C. Weldon, Maria del Pilar Martin, Robert S. Mittler, Richard Compans and Joshy Jacob
J Immunol August 1, 2010, 185 (3) 1642-1649; DOI: https://doi.org/10.4049/jimmunol.1000091

http://www.jimmunol.org/content/185/3/1642

This dynamic protection across our lifespans, preparing us for future outbreaks through exposure, is perhaps the very reason why we see such a dramatic decline in deaths from Influenza since the Spanish Flu era, as seen in the graph below (Fig. 11). It really does look as though we have become robustly resistant, if not fairly immune, to just about all the strains that have circulated over the last century.

Flu Mortality Ireland

Fig. 11: Individual number of annual deaths from Influenza in Ireland from 1864 until the mid-1990s, when the cause of death was classified differently, being combined with Pneumonia in the register. Source: chart generated from the annual statistics reports, from when records began until Influenza was no longer listed as the exclusive cause of death, in the “Annual Reports on Marriages, Births and Deaths in Ireland,” courtesy of: An Phríomh-Oifig Staidrimh, Central Statistics Office (CSO), http://www.cso.ie/en/statistics/birthsdeathsandmarriages/archive/annualreportsonmarriagesbirthsanddeathsinirelandfrom1864to2000/. © Copyright dig-press.com

Notice in Figure 11, which shows the number of deaths recorded annually from Influenza alone in Ireland since official records began, that there were epidemic years prior to the more significant rise in deaths during the Spanish Flu pandemic of 1918-19. You will also see that the major spikes, representing deaths from pronounced epidemics thereafter, become smaller and smaller as we progress through the 20th Century.

Note also that the Irish data does not record deaths from Influenza after the mid-nineteen-nineties, as the official cause of death changed at that point to include Pneumonia. Furthermore, we know from the continued data recorded by other sources that deaths from Influenza after this point, or even during the first pandemic of the 21st Century, would hardly register on the above graph compared with what went before. It is only really since 2009/10 (the Swine Flu era) that the vaccine has been offered in Ireland to a broader range of individuals beyond the elderly and their carers.

This overall decline in deaths since the Spanish Flu era to marginal figures in our modern era is a pattern shared across our diverse nations where statistics of this kind are available. For instance, if we examine the graphs generated from statistics relating to deaths resulting from Influenza within the U.S. and compare this data directly with the Irish graph, the only difference between them is one of scale.

The absolute number of deaths for a massive population like that of the U.S. would obviously be greater than for the relatively tiny population of a country like Ireland. See Figure 1, Crude mortality per 100,000 population, by influenza season (July to June of the following year), for seasons 1900–1901 to 2003–2004 (a), in Doshi, P. (2008), ‘Trends in Recorded Influenza Mortality: United States, 1900–2004’ https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2374803/ [84].
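Because raw death counts cannot be compared directly between a country the size of Ireland and one the size of the United States, comparisons of this kind rely on rates rather than counts. The short sketch below shows the standard conversion to a crude rate per 100,000, using purely illustrative numbers rather than figures from either dataset.

```python
def crude_rate_per_100k(deaths: float, population: float) -> float:
    """Crude mortality rate per 100,000 population."""
    return deaths / population * 100_000

# Purely illustrative numbers, NOT taken from the Irish or US datasets:
# the same rate corresponds to very different absolute death counts.
print(crude_rate_per_100k(300, 3_000_000))        # small country: 10.0 per 100,000
print(crude_rate_per_100k(10_000, 100_000_000))   # large country: 10.0 per 100,000
```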

Similarly, near-identical patterns of proportional deaths from Influenza over essentially the same timeframe can be found in ‘Twentieth-century mortality trends in England and Wales’, Figure 5, showing the age-standardised mortality rates for Influenza in England and Wales from 1901 to 2000, by Griffiths C and Brock A (2003) https://www.ons.gov.uk/ons/rel/hsq/health-statistics-quarterly/no–18–summer-2003/twentieth-century-mortality-trends-in-england-and-wales.pdf [85].

Indeed, declining deaths from Influenza since the great pandemic of 1918-19 would appear to be a near-universal pattern across our developing nations, as an investigation into any of the available studies and statistics recording deaths from this once deadlier contagion over a comparable timeframe consistently demonstrates.

This commonality of dramatic decline is further supported by the worldwide estimates of deaths recorded for each of the major Influenza pandemics that swept near-simultaneously across almost all our nations, commencing with the mother of them all, the Spanish Flu of 1918 to around 1920.

Influenza Virus (Flu)

There were three influenza pandemics in the 20th century – the “Spanish” flu of 1918-19, the “Asian” flu of 1957-58, and the “Hong Kong” flu of 1968-69.

The 1918 flu, caused by a strain of H1N1, was by far the most deadly. More than 500,000 people died in the United States as a result of the Spanish flu, and up to 50 million people may have died worldwide…

The 1957 pandemic was due to a new H2N2 strain of influenza virus that caused the deaths of two million people, while the 1968 pandemic resulted from an H3N2 strain that killed one million people.

One pandemic has occurred so far in the 21st century. This was due to the novel swine-origin H1N1 virus which emerged in 2009.

Baylor College of Medicine (1998-2008)

https://www.bcm.edu/departments/molecular-virology-and-microbiology/emerging-infections-and-biodefense/influenza-virus-flu

[86]

And our most recent Swine Flu pandemic of 2009/10 produced figures as low as just under 20,000, as seen in the following, although these figures were from laboratory-confirmed cases only.

Pandemic (H1N1) 2009 

WHO: Weekly update

6 AUGUST 2010 – As of 1 August 2010, worldwide more than 214 countries and overseas territories or communities have reported laboratory confirmed cases of pandemic influenza H1N1 2009, including over 18449 deaths.

World Health Organisation (2010, update no. 12)

https://www.who.int/csr/don/2010_08_06/en/

[87]

Since then, these figures have been revised upwards, because they were felt to under-represent the true toll; when projection models were used, the estimated figure was substantially higher, but still significantly lower than for any of the preceding pandemics.

Putting this into perspective for Ireland, the total number of documented deaths here during the Swine Flu pandemic came to just over twenty (as obtained from tracking the main stories and statistics given by our major news outlets at the time); as tragic as this was for those involved, the number would not even be visible on the graph above if we plotted it.

Take, for example, the preceding influenza season (2007/08), just before the Swine Flu outbreak of 2009: just two registered deaths were reported in the over-65s (usually the most susceptible group) in the whole of Ireland.

Summary Report of 2007/2008 Influenza

Mortality Data

(Ireland)

During the 2007/2008 influenza season, two deaths attributed to influenza were registered with the General Register Office. These deaths were both in adults over 65 years of age, one in HSE-NW registered in week 8 2008 and one in HSE-S registered in week 14 2008. It should be noted that the death registered in HSE-S was not a laboratory confirmed case of influenza.

 The Health Protection Surveillance (2009)

https://www.hpsc.ie/a-z/respiratory/influenza/seasonalinfluenza/surveillance/influenzasurveillancereports/seasonsummaries/File,3418,en.pdf

[88]

As further support for the fact that deaths from Influenza have continued to plummet to historic lows since the Spanish Flu, statistics are also available on the actual incidence of Influenza itself, and these suggest that we are not even catching the Flu as often these days.

For instance, a study from the UK which followed Flu seasons spanning forty years (up to just before the last pandemic of 2009) noted that cases of Influenza have gradually receded, as highlighted in the following, with an indication that the ability of the virus to keep drifting into new epidemic strains may, in fact, have its limits:

Lessons from 40 years’ surveillance of influenza in England and Wales

We show a gradually decreasing trend in the incidence of respiratory illness associated with influenza virus infection (influenza-like illness; ILI) over the 40 years and speculate that there are limits to how far an existing virus can drift and yet produce substantial new epidemics.

Fleming, D. M., & Elliot, A. J. (2007). Lessons from 40 years’ surveillance of influenza in England and Wales. Epidemiology and Infection, Vol. 136, [7], pp. 866-75.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2870877/

[89]

Therefore, perhaps it is a good thing that our rates of mortality and morbidity, along with our cases of Influenza, had already dwindled significantly, thanks to this highly robust cross-protective immunity, well before we began to intervene in the natural immunity cycle to any great extent; where would we be otherwise, without our exposure? In any case, it seems we haven’t been able to come up with a good alternative to natural immunity, as you will see in the following article, so perhaps our failures are a blessing in disguise.

Are Flu Viruses Smarter than us?

Here’s why it’s so hard to make a better flu vaccine

Imagine you work in a high-security building. It uses facial recognition technology to keep out known intruders. It works well, until someone figures out how to use clever makeup, or even just grow a moustache to game the cameras. No matter how often the intruders are caught, new infiltrators find new disguises to help them get in.

That’s a little bit how the immune system works, and the flu virus is gaming that recognition technology. It sneaks past the body’s immune system to cause misery and mayhem, even as new vaccines update the biological equivalent of facial recognition software. Each year, a new influenza vaccine is formulated and distributed, and each year, viruses develop ways to evade them. Flu vaccines are never as effective as other vaccines, and the current vaccine only provides partial protection against the ongoing flu epidemic.

It’s an annual guessing game of sorts, one backed by data but also plagued with uncertainty. And when the guesses don’t exactly match the reality, as happened this past year, it can mean a dismal and deadly flu season.

“We’ll do the best we can,” said Daum, a Chicago doctor who heads the Food and Drug Administration advisory committee that makes the recommendations. But “the virus is smarter than we are at this point. I don’t know of any disease that plagues us more. It’s very, very frustrating and a very inexact science. . . . We do it with varying luck, and I think the luck is mostly the virus’s whim.

Fox, M. (2018 Feb. 14th)

[90]

 https://www.nbcnews.com/health/health-news/here-s-why-it-s-so-hard-make-better-flu-n848081

The reason, I suppose, that our health officials worry so much about protecting us from Influenza is that not many of them think to look at the actual mortality statistics, which paint a somewhat more reassuring, and certainly more realistic, picture of what is happening on the ground. By all the accounts discussed so far, almost all of us are already fairly resistant to all strains of these pathogens.

But you might not blame them if you saw the kind of statistical estimates and projections they follow, which are essentially built upon assumptions of a pandemic of Spanish Flu-like proportions lurking behind every normal Flu season. These models have been criticised as often being too broad, ill-defined and frequently contradictory, and they vary widely from one another depending upon the systems used, as discussed in some detail by Doshi, P. (2008), in ‘Trends in Recorded Influenza Mortality: United States, 1900–2004’ https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2374803/ [91].

As noted earlier, and as you will see from the recent study below, actually getting the Flu isn’t as common as you might think, and for almost all of us it isn’t even that problematic. Just because the Flu circulates each season doesn’t mean you’ll get it; most of us don’t, and most of those who do don’t even know they have the infection because it is so mild (asymptomatic), according to a long-term study excerpted below:

Three-quarters of people with flu have no symptoms

“‘77% of flu infections’ have no symptoms, say experts,” reports ITV News.

The news is based on a large community-based study carried out in England, which found that most people with influenza (“flu”) don’t have symptoms, and even if they do, only a small proportion go to a doctor.

The study was part of Flu Watch – a larger, ongoing study to assess the impact of flu on public health in England – and analysed five groups of people over six periods of influenza transmission, between 2006 and 2011.

Participants provided blood samples before and after the influenza season, so that the amount of antibodies in the blood could be measured. They were then contacted every week so that cough, cold, sore throat, or any “flu-like illness” could be noted down. If any of these were experienced, participants were asked to complete a symptom diary and to take a nasal swab to test for the influenza virus.

Approximately 20% of people had an increase in antibodies against influenza in their blood after an influenza “season”. However, around three-quarters of infections were symptom-free, or so mild that they weren’t identified through weekly questioning.

This is very much a “good news, bad news” story. It is good news in that so many people with a flu infection are spared the burden of a nasty infection. However, limiting the spread of a future pandemic could be challenging, as it would be unclear who is infected.

NHS, News (2014, 17th March)

https://www.nhs.uk/news/medical-practice/three-quarters-of-people-with-flu-have-no-symptoms/

[92]
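Taking the Flu Watch percentages quoted above at face value, roughly 20% of people showing a seasonal rise in antibodies and roughly three-quarters of those infections passing without noticeable symptoms, the sketch below spells out what that would mean for 1,000 people in a single season; it is a simple restatement of the quoted figures, not a result from the study itself.

```python
# Percentages taken from the Flu Watch summary quoted above.
population = 1_000
infected_fraction = 0.20       # ~20% show a seasonal rise in antibodies
asymptomatic_fraction = 0.77   # ~77% of those infections are symptom-free

infected = population * infected_fraction
asymptomatic = infected * asymptomatic_fraction
symptomatic = infected - asymptomatic

print(f"Per {population} people in one season: ~{infected:.0f} infected, "
      f"~{asymptomatic:.0f} with no noticeable symptoms, "
      f"~{symptomatic:.0f} with symptoms")
```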

Thankfully, as we have already been exposed to so many strains of Influenza over the generations, it seems that we are, for the most part, ready for just about any form that those influenza viruses can morph into.

Why Revive a Deadly Flu Virus?

New York Times Magazine

Flu viruses mutate very rapidly, and each season’s version is a little different. But your immune system preserves a memory of its previous encounters with a flu, which are dragged up, like old photographs from the back of a closet, every time your system responds to a new flu invasion.

Shreeve, J. (2006, Jan. 29th)

http://www.nytimes.com/2006/01/29/magazine/why-revive-a-deadly-flu-virus.html

[93]

The name of the game would appear to be exposure: the more exposed you are, the greater your resilience and ultimate immunity, even to strains you may not have directly encountered (cross-strain protection), as the studies discussed above strongly suggest.

And certainly, this long-term memory is very encouraging indeed. There doesn’t appear to be any fundamental distinction between the Spanish Flu of 1918-19 and the 2009 pandemic strain some ninety years later that would make one genetically more virulent than the other; the main difference appears to be our level of innate and generational resilience to the virus and all its variants, built up through simple exposure.

This gives us hope, particularly should something like the Spanish-type Flu ever be released upon our populations in, say, a bioterrorist attack, a scenario addressed in the following excerpt.

Scientists Believe They Have Explained The Great Flu Outbreak Of 1918

…The good news here is that much of the population has now been immunized against numerous strains of flu. While these might not be enough to stop people getting sick from a novel version, it should keep the death rates down if we experience something as potentially devastating as the 1918 outbreak again.

 Luntz, S. (2014, 4th May)

http://www.iflscience.com/health-and-medicine/scientists-believe-they-have-explained-great-flu-outbreak-1918

[94]

So, essentially, nothing happened to the Spanish Flu; it never went anywhere in particular. It is probably just finding it difficult to infiltrate our mighty defences thanks to our sophisticated protein recognition system. In other words, we have been naturally immunised. Now, isn’t that reassuring?

But it gets better, as even more reassuring is the fact that, overall, these Influenza viruses may have already lost a great deal of their earlier killing power by virtue of having circulated within us, their host, over so many generations. This further supports the idea that if the Spanish Flu were genetically engineered and released out into the public, most of us might not even notice.

A clue came from investigating communities that got off rather lightly even as the most virulent eruptions of the Spanish Flu reached their tentacles into some of the remotest parts of the world. Many communities with no previous exposure, mild or otherwise, to this viral pathogen nevertheless survived relatively unscathed. How?

The Places that Escaped the Spanish Flu

BBC

“These communities basically shut themselves down,” explains Howard Markel, an epidemiological historian at the University of Michigan who was one of the authors of the study. “No one came in and no one came out. Schools were closed and there were no public gatherings. We came up with the term ‘protective sequestration’, where a defined and healthy group of people are shielded from the risk of infection from outsiders.”

..When these measures were lifted in November 1918, as reports of cases in San Francisco were on the decline, the base experienced only mild cases, but at least three people did die…

But there may be some benefit to keeping the virus out for as long as is possible. American Samoa implemented a five-day quarantine for all boats that kept influenza from its shores until 1920. When it finally did arrive, the virus appears to have lost much of its sting and there were no deaths attributed to influenza in a population of more than 8,000. The main island of Samoa to the northwest, however, lost around a fifth of its population to the pandemic…

A similar story unfolded on the Australian island of Tasmania, which implemented strict quarantine measures for boats arriving on its shores that required all passengers and crew to be isolated for seven days. When the infection penetrated the island in August 1919, medical officers reported that it was a milder infection than that on the mainland. The death rate on Tasmania was one of the lowest recorded worldwide.

Gray, R. (2018, 24th October)

http://www.bbc.com/future/story/20181023-the-places-that-escaped-the-spanish-flu

[95]

What happened? Quarantine should have meant that these people were just as vulnerable; they had merely delayed the inevitable. But that is not what occurred. The cause of this strange but very reassuring anomaly in the timing and decreasing impact of the Spanish Flu pandemic becomes clearer when we look to other similar patterns documented in epidemiological studies and observations on the ground at the time, summarised below:

The Story of Influenza

[96]

One of the more interesting epidemiologic findings in 1918 was that the later in the second wave someone got sick, the less likely he or she was to die, and the more mild the illness was likely to be.

This was true in terms of how late in the second wave the virus struck a given area, and, more curiously, it was also true within an area. That is, cities struck later tended to suffer less, and individuals in a given city struck later also tended to suffer less. Thus west coast American cities, hit later, had lower death rates than east coast cities, and Australia, which was not hit by the second wave until 1919, had the lowest death rate of any developed country.

Again, more curiously, someone who got sick 4 days into an outbreak in one place was more likely to develop a viral pneumonia … than someone who got sick 4 weeks into the outbreak in the same place…

The best data on this comes from the U.S. Army. Of the Army’s 20 largest cantonments, in the first five affected, roughly 20 percent of all soldiers with influenza developed pneumonia. Of those, 37.3 percent died (…).

In the last five camps affected—on average 3 weeks later—only 7.1 percent of influenza victims developed pneumonia. Only 17.8 percent of the soldiers who developed pneumonia died (…).

Inside each camp the same trend held true. Soldiers struck down early died at much higher rates than soldiers in the same camp struck down late.

Cities struck later in the epidemic also usually had lower mortality rates…

The same pattern held true throughout the country and the world… places hit later tended to suffer less.
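To see how striking that difference is, it may help to combine the two percentages reported for each group of camps into a single figure: the implied share of all influenza cases that ended in death. The sketch below simply multiplies the numbers quoted above and assumes, as the source does, that deaths occurred via pneumonia; it is illustrative arithmetic, not a re-analysis.

```python
# Combine the US Army cantonment figures quoted above into an overall
# death rate per influenza case (illustrative arithmetic only).

def deaths_per_flu_case(pneumonia_rate, pneumonia_fatality):
    """Share of flu cases ending in death, assuming deaths occurred only via pneumonia."""
    return pneumonia_rate * pneumonia_fatality

early_camps = deaths_per_flu_case(0.20, 0.373)   # first five camps affected
late_camps = deaths_per_flu_case(0.071, 0.178)   # last five camps, ~3 weeks later

print(f"Early camps: ~{early_camps:.1%} of flu cases died")          # ~7.5%
print(f"Late camps:  ~{late_camps:.1%} of flu cases died")           # ~1.3%
print(f"Roughly a {early_camps / late_camps:.0f}-fold difference")   # ~6-fold
```

In other words, a soldier who caught influenza in one of the early camps was roughly six times more likely to die of it than a soldier infected only three weeks later.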

As discussed in the rest of that article, it was difficult to reconcile some of the hypotheses offered for this strange phenomenon, and another explanation was therefore required. One hypothesis is offered by the author of the above study; although admittedly highly speculative, it did appear to fit the evidence a good deal better and is therefore excerpted in summary form below:

…At the peak of the pandemic, then, the virus seemed to still be mutating rapidly, virtually with each passage through humans, and it was mutating toward a less lethal form.

We do know that after a mild spring wave, after a certain number of passages through humans, a lethal virus evolved. Possibly after additional passages it became less virulent. This makes sense particularly if the virus was immature when it erupted in September, if it entered the human population only a few months before the lethal wave.

[ibid]

[97]

https://www.ncbi.nlm.nih.gov/books/NBK22148/

In other words, this rapid mutation towards increased virulence and then, ultimately, a much less lethal form may have been due to the number of pathogen passages (infections in individuals and their spread to others) through their human hosts. That doesn’t mean that this strain became genetically mutated; it just means that something may have occurred within the behaviour of the viral pathogens in the context of their human hosts – a dilution or filtering effect as the viruses passed through and between more and more people (hosts) during the outbreak.

Now, to take this a little further: none of this would show up genetically, so we have to infer a plausible scenario from what we know of the behaviour of viruses within a host and of some of the defences we might put up against them. For instance, unlike bacteria, viruses are not free-living independent organisms; they require cells/hosts to begin replicating. Perhaps they took a while to establish the full takeover (hijacking) of the host’s molecular machinery and get going (the milder first wave), but once the immune system identified what they were doing (wreaking havoc in the second wave), it found a way to begin disarming these viral replicators, and we find the milder impact again during the third wave.

This would explain how those exposed later in the pandemic, even in the same army barracks, towns and cities would fare better than those exposed earlier. It would also begin to explain how those remote islanders were able to protect themselves just long enough to only encounter the much attenuated/filtered viral form of the infection and still gain immunity.

We could call this the generational immunising effect, which concurs completely with McNeill’s hypothesis as highlighted in the introduction of this present study [98]. This immunising effect may have made these viruses more and more attenuated each time they passed through a generation. Is this why our mortality graphs all look so similar, and why the strains of Influenza look weaker and weaker with each passing season and pandemic over the course of the last hundred years?

Moreover, there is also an indication that Nature has taken care of our future resilience and immunity too. Recently emerging molecular evidence takes this dynamic of non-genetic transference of protection somewhat further. It is now looking quite likely that our immune systems can memorise past battles with pathogens and carry that memory well into the future, across the generations, even when the threat is no longer present in any obvious way, as indicated in the excerpt below.

RETHINKING THE ORIGIN OF CHRONIC DISEASES

Some modern-day diseases reflect the capacity of organisms to “memorize” responses to external signals and transmit them across generations; …

the original causative agent may not be extant today, but “memory” of the infection has persisted.

Shoja, M.M. et al. (2012)

[99]

BioScience, Volume 62, Issue 5, 1st May 2012,

https://academic.oup.com/bioscience/article/62/5/470/236430/Rethinking-the-Origin-of-Chronic-Diseases

This type of generational imprinting, the passing on of environmental information and of all the necessary adaptations gleaned from past battles with a vast array of pathogens and threats, obviously tells us that it is not all in the genes as we once thought. As the title and the rest of the following excerpt suggest, our ancestral battle with the bugs is not a purely genetic one.

Your Immune System Is Made, Not Born

New research dispels the belief that the strength of the body’s defense system is genetically programmed

Landhuis, E. (2015)

[100]

http://discovermagazine.com/2006/nov/cover

We have only in more recent times begun to gain deeper insights into just how adaptable and responsive the immune system actually is. Seemingly, just about every living thing can rapidly respond to and defend itself from danger and threats, particularly infectious disease, without having to wait around for millions of years in the hope of ending up with the lucky genes that will save it.

Now, it looks very likely that we can inherit this hard-fought-for immunity, not just from our mothers directly, but from their mothers and perhaps generations of mothers before them, as suggested by the following study – at least in pigeons, though that may only be because we haven’t yet looked elsewhere for generational immunity transference. We can certainly pass on molecular imprints for just about everything else that has been studied, and these memory molecules go back as far as we have looked.

GRANDMOTHERS CAN PASS IMMUNITY TO THEIR GRANDCHILDREN, AT LEAST IN PIGEONS

At the moment of birth, a newborn leaves behind its safe protective environment and enters a world teeming with bacteria, parasites, viruses, and infectious agents of all sorts. However, the babies do have one trump card: antibodies and immune compounds passed across the placenta from their mothers. These short-lived molecules can dip into mom’s immunological experience to protect the newborn until the immune system gets up to speed.

Now, a new study in pigeons suggests that some baby birds owe their early immunity not just to their mothers, but to their grandmothers as well.

…previous research has suggested that these early maternal immune compounds may have “educational effects” on the newborn’s developing immune profile—that they may somehow be priming the system to be on the lookout for common local diseases or parasites…

Shultz, D. (2015)

[101]

http://www.sciencemag.org/news/2015/11/grandmothers-can-pass-immunity-their-grandchildren-least-pigeons

In other words, as indicated in the excerpt above, adaptation and resistance to disease can be handed down through the generations. This, of course, goes directly against our current dogma of genetically driven adaptation, but with so many studies emerging in support of this non-genetic inheritance, that dogma is finally changing and more studies are exploring such adaptive forms of evolution.

This gives us hope that, even if a great infectious contagion of the past that once devastated our communities and loved ones were to return, natural immunity across the generations means that even our children’s children may not have to face the same devastation. This could be the very reason why we no longer hear of those once deadlier and, thankfully for most of us, long-forgotten diseases. Maybe Influenza will ultimately go the same way. In other words, we may have forgotten those once deadlier diseases that our ancestors battled with, but our immune systems thankfully have not.

So the future looks brighter, perhaps, than we had imagined. In answer to the question posed at the beginning, would we survive the Spanish Flu if it re-emerged today, I think we most certainly would, and it seems our children and their offspring might actually become fully immune to it in the not too distant future, and would survive it even if it re-emerged generations later. Perhaps, therefore, it would be prudent to leave this natural generational immunity cycle well alone, so that we can continue to pass on those long-lived memory molecules to our offspring!

And it is hoped that, by that stage, if Influenza or Pneumonia still remains an issue for certain more vulnerable populations, we will have further developed our vitamin C therapies to the degree where nobody need be dying of these once deadlier contagions of the past.

But just in case, and in the meantime as we are not fully out of the woods yet, what we knew back in the day, and indeed already know today, regarding the role of vitamin C (which is critical in supporting the immune system to fight all types of infections, and particularly the serious complications arising from such attacks) could help greatly.

Alternative Interventions & Prevention

Abstract

In the early literature, vitamin C deficiency was associated with pneumonia. After its identification, a number of studies investigated the effects of vitamin C on diverse infections. ..

 …Influenza A infection in mice resulted in a decrease in vitamin C … and in vitamin C deficiency influenza led to greater lung pathology …

Decreases in vitamin C levels during various infections imply that vitamin C administration might have a treatment effect on many patients with infections…

Hemilä, H. (2017, Abstract)

[102]

https://www.ncbi.nlm.nih.gov/pubmed/28353648

And we are beginning to see vitamin C therapy used, with encouraging results, in the real human subjects who need it the most, as exemplified below:

Vitamin C and acute respiratory infections

Three controlled studies recorded a reduction of at least 80% in the incidence of pneumonia in the vitamin C group, and one randomised trial reported substantial treatment benefit from vitamin C in elderly UK patients hospitalized with pneumonia or bronchitis. It seems that the preventive effects of supplementation are mainly limited to subjects with low dietary vitamin C intake, but therapeutic effects may occur in wider population groups. Further carefully designed trials are needed to explore the effects of vitamin C.

Hemilä H. & Douglas R. M., (1999)

[103]

https://www.ncbi.nlm.nih.gov/pubmed/10488881

References:
[70] Billings, M. (1997) The influenza pandemic of 1918, Stanford Education https://virus.stanford.edu/uda/ 
[71] Buckley, M. (2014) Limerick City and the Spanish Influenza Epidemic, 1918-19, Critical Social Thinking, Vol. 6, p. 81. School of Applied Social Studies, University College Cork, Ireland. https://www.ucc.ie/en/media/academic/appliedsocialstudies/cstpdfs/vol6/MargaretBuckley.pdf
[72] ibid
[73] Jones, M. (2016) Ill-Prepared, Review of: The Last Irish Plague: The Great Flu Epidemic in Ireland 1918-19, by Catriona Foley, Dublin Book Reviews http://www.drb.ie/essays/ill-prepared
[74] Taubenberger, J. K., & Morens, D. M. (2006) 1918 Influenza: the mother of all pandemics. Could a 1918-like Pandemic Appear Again? If So, What Could We Do About It? Emerging infectious diseases, Vol.12, [1], pp. 15-22 https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3291398/
[75] Joyce. J (2009) November 9th, 1918: Relief as deaths from 1918 Spanish flu epidemic began to decline, The Irish Times (Nov. 9th, 2009) https://www.irishtimes.com/opinion/november-9th-1918-relief-as-deaths-from-1918-spanish-flu-epidemic-began-to-decline-1.768576
[76] Shanks, D., & Brundage, J. F. (2012) Pathogenic Responses among Young Adults during the 1918 Influenza Pandemic, Journal of Infectious Disease. Vol.18, [2]: 201-207.  doi: 10.3201/eid1802.102042 https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3310443
[77] The Conversation (2018), How historical disease detectives are solving mysteries of the 1918 flu, (5th March 2018)
https://theconversation.com/how-historical-disease-detectives-are-solving-mysteries-of-the-1918-flu-91887
[78] Yu, X., Tsibane, T., McGraw, P. A., House, F. S., Keefer, C. J., Hicar, M. D., Tumpey, T. M., Pappas, C., Perrone, L. A., Martinez, O., Stevens, J., Wilson, I. A., Aguilar, P. V., Altschuler, E. L., Basler, C. F., … Crowe, J. E. (2008). Neutralizing antibodies derived from the B cells of 1918 influenza pandemic survivors. Nature, Vol. 455, [7212], pp. 532-6. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2848880/
[79] Howard, J., (2016) Your flu risk may be linked to the year you were born
CNN (10th November 2016) https://edition.cnn.com/2016/11/10/health/flu-risk-birth-year/index.html
[80] Branswell, H. (2018) “The Problem Child of Seasonal Flu”: Beware This Winter’s Virus. H3N2 is deadlier than many other influenza strains, Scientific American, (9th January 2018) https://www.scientificamerican.com/article/ldquo-the-problem-child-of-seasonal-flu-rdquo-beware-this-winter-rsquo-s-virus/
[81] Davies, J. R., Grill, E. A., & Smith, A. J. (1985) Infection with influenza A H1N1: The effect of past experience on natural challenge, Journal of Hygiene, Vol. 96, Summary, pp. 345-352. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2129641/
[82] Woo, G (2018) Age-dependence of the 1918 pandemic, 4. Birth year dependence of H5N1 and H7N9 avian flu cases, Paper presented to the Institute & Faculty of Actuaries, p. 11, London. PDF https://www.actuaries.org.uk/documents/age-dependence-1918-pandemic
[83] Ioanna Skountzou, Dimitrios G. Koutsonanos, Jin Hyang Kim, Ryan Powers, Lakshmipriyadarshini Satyabhama, Feda Masseoud, William C. Weldon, Maria del Pilar Martin, Robert S. Mittler, Richard Compans and Joshy Jacob (2010) Immunity to Pre-1950 H1N1 Influenza Viruses Confers Cross-Protection against the Pandemic Swine-Origin 2009 A (H1N1) Influenza Virus, Journal of Immunology, Vol. 185 [3], pp. 1642-1649; DOI: https://doi.org/10.4049/jimmunol.1000091 http://www.jimmunol.org/content/185/3/1642
[84] Doshi P. (2008). Trends in recorded influenza mortality: United States, 1900-2004. American journal of public health, Vol. 98 [5], pp. 939-45. Figure, 1. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2374803/
[85] Griffiths C., & Brock A (2003) Twentieth Century Mortality Trends in England and Wales. Health Statistics Quarterly, Issue 18, Figure 5: age-standardised mortality rates for Influenza in England and Wales from 1901 to 2000, pp. 5–17. [Available online as PDF] https://www.ons.gov.uk/ons/rel/hsq/health-statistics-quarterly/no–18–summer-2003/twentieth-century-mortality-trends-in-england-and-wales.pdf
[86] Baylor College of Medicine (1998-2008) Influenza Virus (Flu) https://www.bcm.edu/departments/molecular-virology-and-microbiology/emerging-infections-and-biodefense/influenza-virus-flu
[87] WHO (World Health Organisation) (2010) Pandemic (H1N1) 2009, WHO: Weekly update (2010, update no. 12) https://www.who.int/csr/don/2010_08_06/en/
[88] Health Protection Surveillance Centre (HPSC) (2009), Summary Report of the 2007/2008 Influenza Season (HPSC Influenza Summary Report v1.0), Mortality Data, https://www.hpsc.ie/a-z/respiratory/influenza/seasonalinfluenza/surveillance/influenzasurveillancereports/seasonsummaries/File,3418,en.pdf
[89] Fleming, D. M., & Elliot, A. J. (2007). Lessons from 40 years’ surveillance of influenza in England and Wales. Epidemiology and infection, Vol. 136 [7], pp. 866-75. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2870877/
[90] Fox, M. (2018) Are Flu Viruses Smarter than us? Here’s why it’s so hard to make a better flu, nbcnews.com (Feb. 14th 2018) https://www.nbcnews.com/health/health-news/here-s-why-it-s-so-hard-make-better-flu-n848081
[91] Doshi, P. (2008). Trends in Recorded Influenza Mortality: United States, 1900–2004, American Journal of Public Health. Vol. 98 [5]: pp. 939–945. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2374803/ doi: 10.2105/AJPH.2007.119933 PMCID: PMC2374803
[92] NHS, News (2014) Three-quarters of people with flu have no symptoms, National Health Service Medical News (17th March 2014) https://www.nhs.uk/news/medical-practice/three-quarters-of-people-with-flu-have-no-symptoms/
[93] Shreeve, J. (2006), Why Revive a Deadly Flu Virus? New York Times Magazine online, (Jan. 29th 2006) http://www.nytimes.com/2006/01/29/magazine/why-revive-a-deadly-flu-virus.html
[94] Luntz, S. (2014), Scientists Believe They Have Explained The Great Flu Outbreak Of 1918, IFLScience.com (May 4th 2014) http://www.iflscience.com/health-and-medicine/scientists-believe-they-have-explained-great-flu-outbreak-1918 
[95] Gray, R., (2018) The Places that Escaped the Spanish Flu, BBC (24th October 2018) http://www.bbc.com/future/story/20181023-the-places-that-escaped-the-spanish-flu
[96] Barry, J. M. (2005) The Story of Influenza: 1918 Revisited: Lessons and Suggestions for Further Inquiry, in, eds., Knobler, S. L., Mack, A., Mahmoud, A. & Lemon, S. M., The Threat of Pandemic Influenza: Are We Ready? Workshop Summary, National Academies Press, p. 63 https://www.ncbi.nlm.nih.gov/books/NBK22148/
[97] [ibid] Barry, J. M. (2005) The Story of Influenza: 1918 Revisited: Lessons and Suggestions for Further Inquiry, in, eds., Knobler, S. L., Mack, A., Mahmoud, A. & Lemon, S. M., The Threat of Pandemic Influenza: Are We Ready? Workshop Summary, National Academies Press, Summary https://www.ncbi.nlm.nih.gov/books/NBK22148/
[98] McNeill, W.H. (1976) Plagues and Peoples, Anchor Books, New York, USA.
[99] Shoja, M.M., Tubbs, R. S., Ghaffari, A., Loukas, M. & Agutter, P.S. (2012) Rethinking the Origin of Chronic Diseases, BioScience, Vol. 62, [5], Abstract, pp. 470–478,https://doi.org/10.1525/bio.2012.62.5.8 https://academic.oup.com/bioscience/article/62/5/470/236430/Rethinking-the-Origin-of-Chronic-Diseases
[100] Landhuis, E. (2015), Your Immune System Is Made, Not Born, Discover Magazine (Jan. 29th 2015) http://discovermagazine.com/2006/nov/cover
[101] Shultz, D. (2015) Grandmothers can pass immunity to their grandchildren, at least in pigeons, Sciencemag.com (Nov. 10th 2015) http://www.sciencemag.org/news/2015/11/grandmothers-can-pass-immunity-their-grandchildren-least-pigeons
[102] Hemilä, H. (2017), Vitamin C and Infections. Alternative Interventions & Prevention, Nutrients, Vol. 9, [4], Abstract: doi: 10.3390/nu9040339.  https://www.ncbi.nlm.nih.gov/pubmed/28353648
[103] Hemilä H., & Douglas R. M. (1999) Vitamin C and Acute Respiratory Infections, International Journal of Tuberculosis Lung Disease, Vol. 3, [9]: Abstract, pp. 756-61.  https://www.ncbi.nlm.nih.gov/pubmed/10488881

PART TWO starts here:

CHAPTER SIX: Don’t Count Your Children ’til they’ve had the Pox!

CHAPTER NINE: The Forgotten Clinical Trials of Prevention & Recovery from Scarlet Fever, Diphtheria, Whooping Cough (Pertussis), Measles, Mumps, Rubella & Polio


Now, there were some very promising therapies using vitamins clinically to help prevent, reduce the impact of, speed recovery from, and even fully cure the whole gamut of infectious diseases previously discussed. These vitamin therapies (some already mentioned with regard to TB, Influenza and Pneumonia, for example) rose to clinical prominence within the earlier part of the 20th Century.

Although these therapies never became widespread enough to have impacted upon the overall mortality decline of our respective nations, they are important to mention from a historical perspective. Firstly, they reveal avenues of medical care that could have been further developed in order to address the complications that still arose from time to time from such infectious diseases, particularly those of childhood.

Secondly, as most of these therapies were being applied prior to our modern vaccination epoch and before antibiotics, they are arguably even more relevant today. They offer possible solutions to the rising Superbugs that can by now adapt to just about all of our antibiotics, and could perhaps also help with the increasingly age-inappropriate outbreaks of some of these diseases as we grapple with the short-lived protection typically conferred by artificial injections rather than natural infections.

The history of such therapies starts with the progress made with the immune support provided by Vitamin A, as described in the research paper excerpt below.

The Historical Evolution of Thought Regarding Multiple Micronutrient Nutrition

Progress with Vitamin A

In Denmark from 1910–1920, Carl Bloch and Olaf Blegvad … observed high mortality in children who were hospitalized with vitamin A deficiency. The mortality rate of vitamin A-deficient children was reduced by ~54% by treating the children with cod-liver oil and whole milk, two rich sources of vitamin A. In the late 1920s, vitamin A was recognized to have an effect on immunity to infection, and vitamin A became known as the antiinfective vitamin …

Largely through the influence of Mellanby, vitamin A underwent a period of intense clinical investigation. Between 1920 and 1940, at least 30 trials were conducted to determine whether vitamin A could reduce the morbidity and mortality from infectious diseases, including respiratory disease, measles, puerperal sepsis, and tuberculosis…

By the 1930s, it was established that vitamin A supplementation could reduce morbidity and mortality in young children. In 1932, Joseph Ellison… showed that vitamin A supplementation reduced the mortality of vitamin A-deficient children with measles by nearly 60%.

Vitamin A became a mainstream preventive measure; cod-liver oil was part of the morning routine for millions of children and was acknowledged in saving the lives of children from poor families in England…

Semba, R.D. (2012)

The Journal of Nutrition

[160]

https://academic.oup.com/jn/article/142/1/143S/4630750

Cod-Liver Oil – Anyone old enough to remember it?


Fig. 26: Vintage photograph of public health promotion of Cod Liver Oil alongside the article excerpt as above.

A clear illustration of how these clinical studies filtered out to the broader public can be seen in the advertisements of the era, such as those for Cod Liver Oil, promoted as protection against some of the worst effects and complications that could still arise, though thankfully less and less frequently, as we entered the mid-20th Century.

Fig. 27: Vintage advertisement for ‘Squibb’s’ cod-liver oil. Source: Masterjohn, C. (2015) Did Cod Liver Oil Contribute to the Historical Decline in Measles Mortality and Mortality From Other Infectious Diseases? https://www.westonaprice.org/did-cod-liver-oil-contribute-to-the-historical-decline-in-measles-mortality-and-mortality-from-other-infectious-diseases/

See the highlights from the poster below:

  • whooping cough, measles, mumps, • chicken pox, scarlet fever…may do greater harm than most mothers think. But the children have lighter cases, they recover quicker and are less likely to be left with some permanent injury, if they build up good general resistance in advance to fight them…

  • There is a way to prevent the “common” diseases from resulting seriously. … “resistance-building” Vitamin A! Vitamin A is the important factor which increases their fighting power in time of illness. It helps to set up a defense against the attacking disease germs…

  • In fact, good cod-liver oil is one of the richest sources of Vitamin A mothers can give. Don’t wait until your child catches one of the “common” diseases. Give him Squibb Cod-Liver Oil now!

Vintage advertisement for ‘Squibb’s’ cod-liver oil

[161]

https://www.westonaprice.org/did-cod-liver-oil-contribute-to-the-historical-decline-in-measles-mortality-and-mortality-from-other-infectious-diseases/

Many clinical studies from the era show that different vitamin therapies were given to patients suffering the ill effects of a broad range of highly infectious diseases. However, high-dose vitamin C therapy seems to have been the most often applied, and it takes up the bulk of the medical literature on intervening in the complications arising in certain individuals from viral or bacterial infections.

In these reports, high-dose Vitamin C therapy is consistently described as powerfully effective in reducing the duration and severity of illness, and the deaths and disabilities that could sometimes arise from a broad range of both viral and bacterial diseases, but only when applied in a timely manner and in fairly high doses.

The literature indicates that high doses of vitamin C, in particular, were necessary at the time of infection because many of these diseases used up the body’s Vitamin C during the attack, and the immune system therefore required a great deal more of it to fight the infection.

For instance, McCormick (the physician who was rather vocal in pointing out that our medical interventions could not have been responsible for the huge reduction in mortality seen in the statistical data, as discussed in the previous chapter) was an avid promoter of high-dose Vitamin C therapy, having himself had such success with it.

McCormick demonstrated clinically how high dose Vitamin C therapies could be used with great success against the worst effects of a vast range of both viral and bacterial infections, including Scarlet Fever, as summarised below.

VITAMIN C IN THE PROPHYLAXIS AND THE

 THERAPY OF INFECTIOUS DISEASES

1951

(The Author’s Experience)

In the author’s private practice during the past ten years, over 5,000 tests for vitamin-C status have been made… In many cases of deficiency, where the dietary intake indicates a subnormal in-take of vitamin C over a lengthy period, the correlated clinical history shows repeated occurrence of infectious processes…

Several cases of scarlet fever were given vitamin-C therapy, intravenously and orally, 2,000 mg. daily. In each case the fever dropped to normal in a few hours and the patients were symptom-free within three or four days.

 McCormick, W.J  (1951)

[162]

https://www.seanet.com/~alexs/ascorbate/195x/mccormick-wj-arch_pediatrics-1951-v68-n1-p1.htm

And certainly, as you will see as we proceed, McCormick was not alone in taking such an approach to the complications that could arise from a broad range of infections. Slowly but surely, our modern practitioners are beginning to see the value of such timely interventions for an extraordinary range of infections, including Tetanus, as indicated in the following excerpt:

Vitamin C and Infections.

Alternative Interventions & Prevention

Abstract

…A total of 148 animal studies indicated that vitamin C may alleviate or prevent infections caused by bacteria, viruses, and protozoa.

One controlled trial reported treatment benefits for tetanus patients. The effects of vitamin C against infections should be investigated further.

Hemilä, H (2017)

[163]

Returning to the older clinical studies, we find that Vitamin C was even used with good success for treating cases of Whooping Cough (Pertussis) – a summary of some of the studies is outlined below:

A FURTHER REPORT ON THE ASCORBIC ACID TREATMENT OF WHOOPING COUGH

From the Department of Physiology and Pharmacology, University of Manitoba

IN a previous communication two of us (M.J.O. and B.M.U.) gave an account of the treatment of 10 cases of whooping cough with ascorbic acid (synthetic vitamin C). While the small number of cases forbade any statistical conclusions they nevertheless did show that this treatment had an almost specific effect in decreasing the intensity and duration of the disease.

At the time of forwarding the above paper we believed this to be an entirely new system of treatment, but we have since discovered that Otani … had published his results in treating 81 cases of whooping cough with ascorbic acid, and we take this opportunity of acknowledging his priority and confirming his results. His method of treatment was the intravenous injection of the same brand of ascorbic acid (Redoxon – Hoffmann-La Roche) as we have used orally, and his patients were drawn from hospital clinics, while ours were treated in the home.

He does not give much detail in the paper but his general conclusions are matched by ours….

Ormerod, M. J., UnKauf, B. M., & White, F. D. (1937)

[164]

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC536087/

And again, we find reports of the highly effective use of Vitamin C treatment in combating complications from yet another infection, namely Diphtheria, as summarised in another medical paper from around the late 1940s. Note that the title includes Poliomyelitis (Polio), which will be addressed below.

Journal of Southern Medicine & Surgery

The Treatment of Poliomyelitis and Other Virus Diseases with Vitamin C

Harde et al. reported that diphtheria toxin is inactivated by vitamin C in vitro and to a lesser extent in vivo. I have confirmed this finding, indeed extended it. Diphtheria can be cured in man by the administration of massive frequent doses of … (vitamin C) given intravenously and/or intramuscularly. To the synthetic drug, by mouth, there is little response, even when 1000 to 2000 mg. is used every two hours.

This cure in diphtheria is brought about in half the time required to remove the membrane and give negative smears by antitoxin. This membrane is removed by lysis when “C” is given, rather than by sloughing as results with the use of the antitoxin.

An advantage of this form of therapy is that the danger of serum reaction is eliminated. The only disadvantage of the ascorbic acid therapy is the inconvenience of the multiple injections.

Klenner, F.R., (1949, 211)

[165]

https://ia800301.us.archive.org/3/items/southernmed1111949char/southernmed1111949char.pdf 

And what is less often focused upon, if widely known at all, are the efforts that were made to bring Polio under control using high-dose vitamin C therapy as an interim measure, at least until vaccines could be made available.

For instance, deaths from Polio were minuscule compared to the far greater deadliness of something like Measles in its heyday, and as tragic as Polio was for the individuals directly impacted, what made it so terrifying was perhaps its high visibility in terms of its greater fallout: paralysis. It is that, presumably, which prompted, with some urgency, the only available treatment at the time that could stave off the worst effects of Polio.

Hence, we have quite a large number of studies using vitamin C therapy (fairly high doses at regular intervals) to treat the disease, often with great success, alongside its use against many other viral infections; some examples are given in the references and excerpts below. You will also notice, from the era in which these interventions were being used, that treatment against the worst effects of Polio was in many ways understood as a stop-gap until other medical technologies could be made available, as alluded to in the opening of the excerpt that follows:

Journal of Southern Medicine & Surgery

The Treatment of Poliomyelitis and Other Virus Diseases with Vitamin C

Since immunization against poliomyelitis comparable to that against other bacterial diseases is still a matter of the future, it suggested itself that some antibiotic could be found that would destroy this scourge …

These results were so consistently positive that we did not hesitate to try its effectiveness against all types of virus infections.

The frequent administration of massive doses of vitamin C was so encouraging in the early days of the 1948 epidemic of poliomyelitis that a review of the literature was begun…

In the poliomyelitis epidemic in North Carolina in 1948, 60 cases of this disease came under his care. The treatment employed was vitamin C in massive doses. It was given like any other antibiotic every two to four hours. The initial dose was 1000 to 2000 mg, depending on age.

This schedule was followed for 24 hours. After this time the fever was consistently down, so the vitamin C was given 1000 to 2000 mg every six hours for the next 48 hours. All patients were clinically well after 72 hours.

Klenner, F.R., (1949, 211-212)

[166]

https://ia800301.us.archive.org/3/items/southernmed1111949char/southernmed1111949char.pdf
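Purely to give a sense of the scale of dosing described in that 1948 account, the sketch below works out the totals implied by the quoted schedule, assuming doses were given around the clock at the stated intervals. This is historical arithmetic drawn from the excerpt above, not a treatment protocol.

```python
# Totals implied by the dosing schedule quoted above (historical arithmetic only).

dose_mg = (1000, 2000)           # initial dose range, "depending on age"
first_day_intervals_h = (2, 4)   # "every two to four hours" for the first 24 hours
later_interval_h = 6             # "every six hours" for the next 48 hours

min_first_day = dose_mg[0] * (24 // first_day_intervals_h[1])   # 1000 mg every 4 h
max_first_day = dose_mg[1] * (24 // first_day_intervals_h[0])   # 2000 mg every 2 h
min_later_day = dose_mg[0] * (24 // later_interval_h)
max_later_day = dose_mg[1] * (24 // later_interval_h)

print(f"First 24 hours: {min_first_day:,} to {max_first_day:,} mg")   # 6,000 to 24,000 mg
print(f"Each later day: {min_later_day:,} to {max_later_day:,} mg")   # 4,000 to 8,000 mg
```

Set against a normal dietary intake measured in tens of milligrams a day, these figures make clear why such regimens were described at the time as massive doses.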

Below is a recommended list of further clinical studies from the general pre-Polio-vaccine era using mainly high-dose Vitamin C therapies, with an emphasis on Polio, as this was a relatively newly emergent contagion at the time.

Orthomolecular Medicine News Service

The Forgotten Research of Claus W. Jungeblut, M.D.

Robert Landwehr. The origin of the 42-year stonewall of vitamin C. Journal of Orthomolecular Medicine, 1991, Vol 6, No 2, p 99-103.

Dr. Jungeblut’s 22 research reports were published in the Journal of Experimental Medicine and are available at: Link

…Other key papers regarding vitamin C include:

Jungeblut CW. Inactivation of poliomyelitis virus in vitro by crystalline vitamin C (ascorbic acid). J Exper Med, 1935, October; 62:517-521

Jungeblut CW. Vitamin C therapy and prophylaxis in experimental poliomyelitis. J Exp Med, 1937. 65: 127-146.
Jungeblut CW. Further observations on vitamin C therapy in experimental poliomyelitis. J Exper Med, 1937. 66: 459-477.
Jungeblut CW, Feiner RR. Vitamin C content of monkey tissues in experimental poliomyelitis. J Exper Med, 1937. 66: 479-491.
Jungeblut CW. A further contribution to vitamin C therapy in experimental poliomyelitis. J Exper Med, 1939. 70:315-332.

Orthomolecular Medicine News (2013)

[167]

http://orthomolecular.org/resources/omns/v09n16.shtml

The real advantage of these therapies, particularly for bacterial infections such as Scarlet Fever, Diphtheria, Pertussis (Whooping Cough) and even Tetanus, is that, as Fleming warned almost at the moment of presenting his medical wonder, the bacteria would soon adapt and find ways to outwit us. In our more modern era, that has, unfortunately, become increasingly true.

Historically, it is interesting to note that in many ways these vitamin therapies and antibiotics were at loggerheads with each other in the early days of their application, presumably due to their fundamentally different approaches to treating disease, as the following excerpts suggest.

THE JOURNAL OF SOUTHERN

MEDICINE AND SURGERY

VOL.CIII APRIL, 1951 No. 4

Massive Doses of Vitamin C and the Virus Diseases

It has been reported that one of the mold-derived drugs, in addition to being a good antibiotic, is a super-vitamin. Conversely, we argue that vitamin C, besides being an essential vitamin, is a superantibiotic…

Hippocrates declared the highest duty of medicine to be to get the patient well. He further declared that, of several remedies, physicians should choose the least sensational. Vitamin C would seem to meet both these requirements.

Klenner, F.R. (1951, 101, 107)

[168]

http://www.mv.helsinki.fi/home/hemila/CP/Klenner_1951_ch.pdf

Or to put it another way…

VITAMIN C IN THE PROPHYLAXIS AND THE

 THERAPY OF INFECTIOUS DISEASES

The author’s experience leads to the conclusion that the principle of trying to eradicate disease by concentrating our attack against the associated micro-organisms by means of toxic antibiotics is fundamentally unsound. If we wish to eliminate a desert or swamp we do not proceed to cut down the sage brush and cactus of the former or the lush characteristic verdure of the latter.

Instead, we change the condition of the soil. By irrigation we make the desert blossom like a rose, and by drainage we change the flora of the swamp.

The late Dr. Alexis Carrel …has said: “Microbes and viruses are to be found everywhere, in the air, in the water, in our food… Nevertheless, in many people they remain inoffensive… This is natural immunity…

McCormick, W. J  (1951)

[169]

https://www.seanet.com/~alexs/ascorbate/195x/mccormick-wj-arch_pediatrics-1951-v68-n1-p1.htm

Basically, antibiotics seemed to appeal more to the emerging medical authorities of the era, who went with Alexander Fleming’s new discovery instead, and as they say, the rest is history.

Or is it? The good news is that in more recent times some seriously effective interventions have been demonstrated in emergency medicine using just that: high-dose vitamin C in critical situations, as highlighted below.

What would appear most promising for our future offspring, should they run into any trouble with these older pathogens, are the more recent interventions using high-dose vitamin C in life-threatening emergencies, which have produced what look like almost miraculous results, although such results, of course, need to be treated with caution.

This (perhaps not so new) protocol has been used with great success in an increasing number of emergencies and is becoming more widely adopted by clinics, as outlined in the following article. It is beginning to look genuinely hopeful for the future of emergency medicine against the most life-threatening outcomes that can sometimes follow an otherwise non-threatening viral, fungal or bacterial (respiratory) infection.

Could Vitamin C Be the Cure for Deadly Infections?

A new protocol that includes this common nutrient could save millions of lives—and has already sparked a raging debate among doctors

Dr. Joseph Varon, a pulmonologist and researcher at the University of Texas Health Science Center in Houston. “It does sound too good to be true,” Varon said over the phone. “But my mortality rates have changed dramatically. It is unreal. Everything we have tried in the past didn’t work. This works.”

To test Marik’s protocol, Catravas and his team cultured endothelial cells from lung tissue and exposed them to the endotoxin found in septic patients. Vitamin C alone did nothing. Neither did steroids. When administered together, however, the cells were restored to normal levels.

“We have a clinical answer,” Catravas says. “We have part of the mechanistic answer. There is satisfaction in that as a scientist. There is also satisfaction knowing that a lot of people worldwide are going to get an amazing benefit.”

Morrison, J. (2017, June 27th), smithsonian.com

[170]

https://www.smithsonianmag.com/science-nature/could-deadly-infections-be-cured-vitamin-c-180963843/#KuU5q4FYRfDrEd7v.99

 

Note that this article refers to Sepsis. The reason the protocol would appear to have wide-reaching benefits across viral and even bacterial infections is that Sepsis is the worst-case outcome of the body failing to cope with an otherwise less threatening viral or bacterial infection; it is the complication that can sometimes turn such an infection deadly.

The body can become overwhelmed by its own over-reaction, in a sense (Sepsis), which is why high-dose vitamin C intervention (according to the protocol above) could ultimately save millions of lives worldwide. This also highlights the all-important fact that it is often not the infection or pathogen invasion itself that kills or maims, but the body’s inability to deal with it, particularly in our more modern times, now that much of the population appears robustly resistant, if not fully immune, to many of the contagions discussed thus far (at least those with natural exposure).

That is perhaps the main reason why we are running into so many longer-term difficulties in attempting to eradicate and stop the natural circulation of contagions that are otherwise benign (at least for the vast majority of our populations in industrialised regions) once they have settled down to normal background levels. We should instead be finding ways of dealing with the adverse events that arise from time to time within an individual, intervening when required under life-threatening conditions, rather than trying to eradicate, eliminate and generally stamp out the natural circulation of pathogens that we need to be exposed to (even if we aren’t aware of it), so that our immune systems can keep reinforcing (boosting) our resilience to them and we don’t end up in an emergency situation where our bodies simply cannot deal with the onslaught. This is exemplified in the final chapter, on Polio, next.


References:

[160] Semba, R.D. (2012) The Historical Evolution of Thought Regarding Multiple Micronutrient Nutrition, The Journal of Nutrition, Vol. 142, [1], Progress with Vitamin A, https://doi.org/10.3945/jn.110.137745 [Available online from academic.oup.com] https://academic.oup.com/jn/article/142/1/143S/4630750

[161] Masterjohn, C. (2015) Did Cod Liver Oil Contribute to the Historical Decline in Measles Mortality and Mortality From Other Infectious Diseases? Westonaprice.org, (April 6th 2015) [Available online from westonaprice.org] https://www.westonaprice.org/did-cod-liver-oil-contribute-to-the-historical-decline-in-measles-mortality-and-mortality-from-other-infectious-diseases/

[162] McCormick, W.J (1951) Vitamin C in the Prophylaxis and Therapy of Infectious Diseases, Archives of Pediatrics, Vol. 68, [1]. https://www.seanet.com/~alexs/ascorbate/195x/mccormick-wj-arch_pediatrics-1951-v68-n1-p1.htm

[163] Hemilä, H. (2017). Vitamin C and Infections. Alternative Interventions & Prevention, Nutrients, Vol. 9, [4]. Abstract, pii: E339. doi: 10.3390/nu9040339. https://www.ncbi.nlm.nih.gov/pubmed/28353648

[164] Ormerod, M. J., UnKauf, B. M., & White, F. D. (1937). A Further Report on the Ascorbic Acid Treatment of Whooping Cough, Department of Physiology and Pharmacology. Canadian Medical Association Journal, Vol. 37 [3], pp. 268–272. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC536087

[165] Klenner, F. R. (1949) The Treatment of Poliomyelitis and Other Virus Diseases with Vitamin C, Journal of Southern Medicine & Surgery, Vol. 111 [1], p. 211 [Available online as PDF from archive.org] https://ia800301.us.archive.org/3/items/southernmed1111949char/southernmed1111949char.pdf

[166] Klenner, F. R. (1949) The Treatment of Poliomyelitis and Other Virus Diseases with Vitamin C, Journal of Southern Medicine & Surgery, Vol. 111 [1], pp. 211-212 [Available online as PDF from archive.org] https://ia800301.us.archive.org/3/items/southernmed1111949char/southernmed1111949char.pdf

[167] Orthomolecular Medicine News Service (2013), Vitamin C and Polio, The Forgotten Research of Claus W. Jungeblut, M.D., in (ed.) Andrew W. Saul, Orthomolecular Medicine News Service (August 7th, 2013) http://orthomolecular.org/resources/omns/v09n16.shtml

[168] Klenner, F.R. (1951) Massive Doses of Vitamin C and the Virus Diseases, Journal of Southern Medicine & Surgery, Vol. 113 [4], pp. 101-107 [Available online as PDF] http://www.mv.helsinki.fi/home/hemila/CP/Klenner_1951_ch.pdf

[169] McCormick, W. J (1951) Vitamin C in the Prophylaxis and Therapy of Infectious Diseases, Archives of Pediatrics, Vol. 68, [1].

[170] Morrison, J. (2017) Could Vitamin C Be the Cure for Deadly Infections? A new protocol that includes this common nutrient could save millions of lives—and has already sparked a raging debate among doctors, Smithsonian Magazine (June 27th 2017) https://www.smithsonianmag.com/science-nature/could-deadly-infections-be-cured-vitamin-c-180963843/#KuU5q4FYRfDrEd7v.99

 


CHAPTER EIGHT: The Almost Universal Decline in Deaths from some of the Deadliest Contagions Known to Children



Fig. 18: Chart of the annual number of deaths recorded in Ireland from the combined major killers of all ages, though impacting mostly infants and children: Scarlet Fever, Whooping Cough, Measles, Diphtheria & Polio. Source: Chart generated using the annual statistics reports since records began, “Annual Reports on Marriages, Births and Deaths in Ireland, from 1864 to 2000”, courtesy of: An Phríomh-Oifig Staidrimh, Central Statistics Office (CSO), http://www.cso.ie/en/statistics/birthsdeathsandmarriages/archive/annualreportsonmarriagesbirthsanddeathsinirelandfrom1864to2000/. © Copyright dig-press.com
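For readers who would like to reproduce a chart of this kind from the same source material, a minimal sketch is given below. It assumes the annual death counts from the CSO reports have first been transcribed by hand into a CSV file; the filename and column names here are hypothetical and are not part of the CSO publications.

```python
# Minimal sketch for reproducing a chart like Fig. 18, assuming the CSO annual
# death counts have been transcribed into a CSV (filename and columns hypothetical).

import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical layout: one row per year, one column per disease, values = recorded deaths
df = pd.read_csv("ireland_infectious_deaths_1864_2000.csv", index_col="year")

for disease in ["scarlet_fever", "whooping_cough", "measles", "diphtheria", "polio"]:
    plt.plot(df.index, df[disease], label=disease.replace("_", " ").title())

plt.xlabel("Year")
plt.ylabel("Recorded deaths per year (all ages)")
plt.title("Deaths in Ireland from selected infectious diseases, 1864-2000")
plt.legend()
plt.show()
```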

Bear in mind that the mortality statistics plotted on the chart above (Fig. 18) only commence in 1864, when official registration began, and we know from the historical record that some of these contagions were much deadlier prior to this era. This should hopefully place the enormity of the destruction felt during the earlier era into perspective, especially when we review the rather dramatic decline in deaths from these same diseases throughout much of the first half of the 20th Century.

It should be noted that any infectious contagion not presented in Figure 18, such as Mumps, Rubella or Chickenpox, is omitted because deaths from it were always historically so rare that it would barely register on the above graph, even if plotted.

In contrast, recall the enormous death toll from the fevers, which included Scarlatina, that were much deadlier before official records began, and the historical accounts indicating that Measles began to take over where these left off in terms of deadliness. We also know from the historical record that Whooping Cough (Pertussis) was much deadlier in Ireland before our official statistical records began. It is rather remarkable to see just how far these once deadlier contagions plummeted, to all-time lows, becoming relatively benign infections of childhood by around the mid-20th Century, as is clearly evident in Figure 18.

What makes this all the more surprising is that the end of this great contagion cycle, as we will discuss in more detail throughout this current chapter, coincides with the launch of our more modern vaccine era. This, of course, means that these medical interventions cannot have caused the vast bulk of the decline in deaths in the first place.

Recall that in some cases, such as Scarlet Fever, we never had a commercially available vaccine to intervene in the disease at all, yet this once deadlier contagion became so remarkably tame that, when it returned to taunt our children in the present era, our children survived! Antibiotics arrived too late to account for the vast bulk of the decline in this disease and in all the other bacterial contagions too. Furthermore, antibiotics don’t work on any of the viral infections, such as Measles (I should add that the vaccine against Measles came to Ireland much later than to other regions, in 1985, when everything was essentially done and dusted).

And, as discussed throughout this study thus far, no particular social, economic or nutritional factor can account for such a dramatic decline in our mortality rates either, since, judging by the historical accounts, these contagions knocked on almost every family’s door to taunt our children across all social and economic divides.

Moreover, as you will see later in this discussion, the pattern of declining deaths from this broad range of contagions (as illustrated for Ireland in Figure 18) could be described as an almost universal phenomenon, as it is shared across a diverse range of developed nations over a very similar timeframe. This commonly shared pattern, in and of itself, would tend to suggest a more natural cause for the remarkable decline in deaths to relative benignity, whereby, by the middle of the 20th Century, parents could certainly count more of their children than their counterparts back in the middle of the 19th Century.

Now, focusing on Figure 18 above, you will see the unusual humps and bumps in the pattern of deaths from Diphtheria, which does not conform to the steep decline from high mortality to extremely low levels seen in the others. In other words, although Diphtheria comes to a fairly abrupt end as a killer around the same time, it doesn’t appear to have reached the heights of destruction in the earlier period that some of its counterparts did.

This may be an artefact of our recording system, peculiar to both Ireland and Britain, partly because in the earlier days Diphtheria was indistinguishable from Scarlet Fever, as described by Charles Creighton (1894) in the previous chapter, and partly because Diphtheria was often related to Croup as a cause of death; judging by references within the annual death registration statistics for Ireland and Britain, there was historically some confusion over this classification.

Therefore, the mortality statistics for Diphtheria may be somewhat underrepresented (particularly in the earlier era of statistical recording), and the exclusion of Croup as a Diphtheria-related cause of death may account for the unusual pattern of mortality for this disease compared with the more typical pattern illustrated above (Fig. 18) for most of the other contagions. It should be noted also that the more typical steep downward-sloping decline in deaths from Diphtheria over a corresponding period is recorded for other regions.

This finally brings us to Polio, another shared infectious disease of our modern world, which erupted rather late compared to all the other great contagions. If we look a little closer (Figure 18, the white mortality curve on the bottom right of the graph), we can see that it too follows the more typical rise, peak and decline in mortality, albeit in a much scaled-down version compared to the other infectious diseases.

Polio, as a cause of death, appears to be the last of all the contagions, which reiterates the pattern that arose inadvertently from this study: the earlier and older the pathogen when it rises to deadly and debilitating prominence, the more lethal its impact; conversely, the later and younger the contagion, the less lethal its impact upon our populations as a whole.

Polio will be dealt with in its own chapter (the last one) because, just as this study started with the Plague, the earliest contagion we can assess, Polio will finish the discussion as the final contagion to plague our developed nations – hence the main title of this book.

Essentially, the historical record demonstrates that it is not the rate of infection which matters (and certainly by the mid-20th Century our nations were beginning to see a post-war baby boom, giving the pathogens circulating at the time fresh fodder in abundance), but the number of deaths and disabilities (mortality and morbidity) resulting from such attacks.

All in all, judging by the full assessment of the major facts as presented here, it would appear that within our now modernised nations our older generation could quite rightly wear T-shirts that said: Been There, Done That, Immune Already! And even those of us who never gained such vital long-term immunity and resistance through direct exposure to some of these once deadlier pathogens could still wear the T-shirt, as our ancestors seemingly passed this hard-earned protection on to us.

This raises the interesting and encouraging prospect that, as less developed nations move into greater modernity, they too will soon follow suit in terms of dramatically falling death rates from such previously more lethal contagions! The natural cycle of generational contagion, resolving ultimately in an accommodation between these pathogens and us as their hosts – a cycle essentially inscribed in the mortality graphs and data of our more developed nations – would certainly strongly suggest that this should be the case.

The Almost Universal Decline of Some of the Deadliest Contagions Known to Children

As discussed throughout this study, Ireland shows a fairly close correspondence, in terms of its mortality statistics from a similar range of infectious diseases, to many other developed nations, and this certainly continues to be the case as we examine the statistical patterns for the great contagions that once haunted our children the most.

For instance, if we swapped the ‘c’ in Iceland for the ‘r’ in Ireland, it would be difficult to tell which country we were talking about, as the mortality statistics are so similar for the very same diseases over a broadly similar timeframe.

The Development of Infant Mortality in Iceland, 1800–1920

The great epidemic infant and child killers of the nineteenth century, such as measles and whooping cough, had lost much of their virulence. Occasionally, they were even successfully coped with in individual places with quarantine measures. By 1920 Iceland had become relatively safe for infants and young children in comparison with the dreadful situation prevailing around the mid-nineteenth century.

Loftur Guttormsson and Ólöf Garðarsdóttir (2002)

[133]

https://pdfs.semanticscholar.org/d338/90ffb7c01490bde7a729270285926ea3b17e.pdf

This pattern of dramatically declining mortality rates from the once deadlier contagions of childhood is also evident within Australia and just about everywhere we look where a statistical assessment of this type has been carried out. Thus, this pattern of plummeting deaths, particularly throughout the first half of the 20th Century, from a broad range of similar contagions that impacted children the most, could be called a near-universal phenomenon.

For example, we can see from examining similar mortality graphs for this period showing deaths per year from once deadlier contagions such as Scarlet Fever, Measles and Pertussis (Whooping Cough) from within England and Wales (See Fig. 19 below) that, apart from scale, these graphs would be difficult to tell apart from the corresponding Irish statistics for the same contagions over the same timeframe.

England and Wales Mortality Measles, Scarlet Fever and Whooping Cough

Fig. 19: Reproduced and rearranged, keeping the essential data from small inserts within larger graphs of cases of the same diseases in England and Wales, derived from: Smallman-Raynor, M, Cliff, A (2012), Atlas of Epidemic Britain: A Twentieth Century Picture, Oxford University Press, Oxford. p.50, figure 4:18 (Measles); p. 52, figure 4:24 (Whooping Cough); p.49, figure 4:15 (Scarlet Fever). https://books.google.ie/books

Obviously, England and Wales, having a much greater population than Ireland, is not directly comparable in terms of the actual numbers of deaths recorded annually from Scarlet Fever, Pertussis (Whooping Cough) and Measles (the statistics accompanying Figure 19 above are given below). But if we focus on the percentages when reading the following statistics, we can see that the fate of these contagions, in terms of their deadliness, shows a close correspondence over the same timeframe, and that they ultimately become fully resolved at around the same time.

Atlas of Epidemic Britain: A Twentieth Century Picture

A total of 67,791 deaths from scarlet fever were recorded in England and Wales during the twentieth century, with the overwhelming majority (over 99 percent) occurring in the period 1901-45…

A total of 274,347 deaths from measles were recorded in England and Wales during the twentieth century, with over 98 percent occurring in the period 1901-45…

A total of 233,698 deaths from whooping cough were recorded in England and Wales during the twentieth century, with 97 percent occurring in the period 1901-45…

Smallman-Raynor, M, Cliff, A (2012), pp. 50, 52, 49,

[134]

https://books.google.ca/books?id=iMnN4fZrj70C&pg=PA48#v=onepage&q&f=false

Earlier data for England and Wales, generated by Cumpston (1927) and covering the mid-19th Century up to around the first quarter of the 20th Century, show the clear decline in deaths from Measles (Figure 6a), Pertussis/Whooping Cough (Figure 6b) and Scarlet Fever (Figure 6c). These graphs, reproduced in Infectious Diseases and Human Population History (1996) by Dobson, A. P. and Carper, E. R. (1996, 119) https://academic.oup.com/bioscience/article-abstract/46/2/115/252374 [135], are indistinguishable from the Irish data for much of this same period – other than in scale, relative to our respective populations.

Now moving across the Atlantic to the United States, we find that this overall pattern of plummeting deaths from a similar range of childhood contagions is also recorded as illustrated in Figure 20 (below) and this compares remarkably closely with the same data from Ireland.

Measles and Whooping Cough mortality, Ireland and United States
Fig. 20: Dramatic decline in the annual number/rate of deaths from Measles and Whooping Cough (Pertussis) in the pre-vaccine era, compared between Ireland and the United States. Irish data generated using annual statistics reports since records began – “Annual Reports on Marriages, Births and Deaths in Ireland, from 1864 to 2000” courtesy of An Phríomh-Oifig Staidrimh, Central Statistics Office CSO, link. © Copyright dig-press.com American data reproduced from Tavia Gordon, Public Health Reports (1896-1970), Vol. 68, No. 4 (Apr. 1953), figure 3. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2024011/ Link to PDF

When we compare the death rates from similar childhood contagions across a corresponding timeframe within the U.S., again, apart from scale relative to our population sizes, the graphs generated are fairly difficult to tell apart. The main difference is indeed one of scale: the graph for Ireland shows the actual number of deaths per year recorded officially since records began, whereas the graph for the United States shows the rate of deaths per 100,000 of its population. Below is the statistical data accompanying the graph from the United States (Fig. 20), with the addition of the rates for Diphtheria.

Mortality in the United States 1900 – 1950

Mortality for the communicable diseases of childhood fell sharply between 1900 and 1950…

By 1950 death rates for diphtheria, measles, whooping cough, and scarlet fever had declined to a small fraction of their values at the beginning of the century …

In 1900, this group of diseases was responsible for 242.6 deaths per 100,000 children under 15. In 1950, these diseases together caused fewer than 5 deaths for every 100,000 children.

Gordon, T., (1953, 441).

[136]

 https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2024011/
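To put Gordon's figures into rough perspective, here is a quick back-of-the-envelope calculation – a minimal sketch in Python using only the two numbers quoted above (the "fewer than 5 per 100,000" figure is an upper bound, so the result is itself a lower bound on the decline):

```python
# Combined death rate from diphtheria, measles, whooping cough and scarlet fever
# (deaths per 100,000 children under 15), as quoted from Gordon above.
rate_1900 = 242.6
rate_1950 = 5.0  # "fewer than 5" in 1950, so this is an upper bound

decline_pct = (rate_1900 - rate_1950) / rate_1900 * 100
print(f"Decline, 1900 to 1950: at least {decline_pct:.1f}%")  # prints ~97.9%
```

In other words, on these figures the combined death rate from this group of childhood diseases fell by roughly 98 percent over the half-century.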

McCormick furthers this pattern in the following excerpt, where he highlights the dramatic drop in death rates from a broad range of similar contagions throughout the first half of the 20th Century, comparing the statistics for the United States with those for Canada, with an emphasis upon Diphtheria.

 …THERAPY OF INFECTIOUS DISEASES

J. McCORMICK, M.D.

Toronto, Canada.

1951

Diphtheria

Prior to the present century this disease was the major scourge of infancy and childhood. The mortality rates in the United States, for consecutive ten-year periods from 1900 to 1940, were as follows: 40, 21, 15, 5 and 1.

For the city of Toronto, for ten-year periods from 1885 to 1945, the rates were as follows: 132, 66, 34, 19, 8 and 3…

A similar general decline in incidence and mortality rates for other infectious diseases, notably scarlet fever, whooping cough, measles, mumps, rheumatic fever and typhoid fever, has also been recorded.

[137]

https://www.seanet.com/~alexs/ascorbate/195x/mccormick-wj-arch_pediatrics-1951-v68-n1-p1.htm
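McCormick's decade figures can be read in the same way. The short sketch below is illustrative only: it takes the rates exactly as he reports them (the period labels in the comments, and the small helper function, are our own) and tabulates the fall from each ten-year period to the next. In both series the rate collapses by well over 95 percent from start to finish (40 down to 1 for the United States, 132 down to 3 for Toronto), the same overall shape traced by the national curve in Figure 21 further on.

```python
# Diphtheria mortality rates for consecutive ten-year periods, exactly as quoted
# from McCormick (1951) above; the period labels are ours, for illustration only.
us_rates = [40, 21, 15, 5, 1]            # United States, 1900-1940
toronto_rates = [132, 66, 34, 19, 8, 3]  # City of Toronto, 1885-1945

def decade_falls(rates):
    """Percentage fall from each ten-year period to the next."""
    return [round((a - b) / a * 100, 1) for a, b in zip(rates, rates[1:])]

print("US, fall per period (%):     ", decade_falls(us_rates))       # [47.5, 28.6, 66.7, 80.0]
print("Toronto, fall per period (%):", decade_falls(toronto_rates))  # [50.0, 48.5, 44.1, 57.9, 62.5]
```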

Now, antibiotics cannot have impacted the greater portion of these mortality statistics at a population level, as deaths from these infections had already plummeted to such low levels by the time antibiotics began circulating more widely. Certainly, antibiotics would have been somewhat helpful in addressing complications that could still arise in this era (post-mid-nineteen-forties) at a localised and individual level; however, as noted previously, antibiotics do not work on viral infections.

Furthermore, as McCormick stresses, much of the decline in deaths from many of these once deadlier contagions occurred prior to, if not in the absence of, our broader use of prophylaxis (prevention, typically via vaccination), meaning that these interventions cannot have been the main cause of this most welcome decline in deaths, as summarised in the excerpt below.

…PROPHYLAXIS AND THERAPY OF INFECTIOUS DISEASES

J. McCORMICK, M.D.

Toronto, Canada.

1951

The usual explanation offered for this changed trend in infectious diseases has been the forward march of medicine in prophylaxis and therapy; but, from a study of the literature, it is evident that these changes in incidence and mortality have been neither synchronous with nor proportionate to such measures.

…Likewise, the decline in diphtheria, whooping cough and typhoid fever began fully fifty years prior to the inception of artificial immunization and followed an almost even grade before and after the adoption of these control measures.

In the case of scarlet fever, mumps, measles and rheumatic fever there has been no specific innovation in control measures, yet these also have followed the same general pattern in incidence decline…

…On this same subject McKinnon … says: “Quite obviously then, all the factors mentioned are not adequate in themselves to explain the recorded decline. Some other factor or factors must have been operating during this period and it is necessary to cast farther afield in search of them…”

[138]

https://www.seanet.com/~alexs/ascorbate/195x/mccormick-wj-arch_pediatrics-1951-v68-n1-p1.htm

 

DIPHTHERIA PRIOR TO THE VACCINE ERA

Picking up where McCormick left off: an antitoxin intended to stave off the worst effects of Diphtheria was, after Jenner's Cowpox vaccine, one of the earliest such preparations to become widely available throughout our developed nations. However, when we examine the timing of the introduction of the Diphtheria antitoxin across our different regions, as investigated within this present section, we find – confirming McCormick's observation above – that the great decline in mortality from the Diphtheria infection does not typically correspond with this medical intervention.

Similarly, this disparity between our vaccine policies and our respective declining Diphtheria mortality rates – the latter occurring independently of this medical intervention – is captured in the following paraphrase of one immunologist's speech of the early 1940s, given within McKeown's 1979 study.

The Role of Medicine

… the reduction of mortality from diphtheria in the 1940s did not everywhere coincide with the introduction of immunization…

McKeown, T. (1979, 47)

[139]

http://peaceworkspartners.org/vault/Oxford/DPHPC/Health and Development Course/Int Dev Readings HT10/1a. Main Theories/McKeown The Role of Medicine 1979.pdf

Now, focussing in again on the United States: while searching the historical archives, it was difficult to pinpoint exactly when the toxoid to combat Diphtheria became widely enough used to impact upon the mortality statistics in this region, but judging by a number of historical accounts, we can estimate that it was within the decade prior to the nineteen forties, and perhaps not until the late nineteen thirties. However, this comes with a caveat: these anti-Diphtheria campaigns tended to be focussed upon the great urban centres, as you will see below.

Now, when we assess this against the actual mortality data (annual deaths since records began, as presented below in Figure 21), it is difficult to reconcile this data with these campaigns being the main cause of the plummeting mortality rates as, of course, the vast bulk of the decline in deaths from Diphtheria in the United States occurs prior to these more localised efforts.

Diphtheria US individual deaths

Fig. 21: Annual death rate per 100,000 of population from Diphtheria in the U.S., 1900–1960. Graph generated from tabulations given in ‘Vaccination and the Control of Seven Infectious Diseases in the US -1900-1970’ (Blood, B., 2000-2013) http://www.dolmetsch.com/USDiseaseData1900to1970.html#table9

As you can see from the graph above (Fig. 21) Diphtheria follows the more typical pattern of declining mortality rates comparable to many of the other once greater contagions. For instance, it is not dissimilar to the pattern of decline from another bacterial contagion that was also much deadlier in the earlier days – namely, Scarlet Fever, which, as discussed in detail in the previous chapter, never had any particular medical intervention that could account for its decline.

As indicated above, the efforts to combat Diphtheria within the United States (and indeed elsewhere) were mostly confined to the great cities, as alluded to in the excerpt below. Therefore, it seems unlikely that these localised efforts would have impacted the mortality statistics relating to deaths from Diphtheria at a national level. However, these urban campaigns are documented as having had a positive impact, as highlighted in the following excerpt, along with the fact that Britain was slow to follow suit with its own anti-Diphtheria campaigns.

American Journal of Public Health

Immunization and the American Way: 4 Childhood Vaccines

Diphtheria: Enthusiasm and Empiricism

The Ministry of Health did not become convinced of the efficacy of diphtheria immunization until after 1940, by which time it had become clear that the incidence of diphtheria had in fact dropped further in cities using toxin–antitoxin aggressively (such as New York City and Toronto) than it had in London…Just why diphtheria immunization gained acceptance more slowly in Britain than in the United States is not fully clear.

Baker, J.P (2000, 200-201)

[140]

http://ajph.aphapublications.org/doi/pdf/10.2105/AJPH.90.2.199

But as it turns out, perhaps Britain did not have to do anything at all, as death rates from Diphtheria began to plummet to all-time lows just as its own campaigns against the disease were beginning to get off the ground. Indeed, this dramatic drop in the rate of deaths from Diphtheria occurs on the eve of the campaign in Scotland (before mass vaccination was rolled out).

This can be assessed by looking closely at the mortality curve for Scotland (Fig. 22 below – the solid line with dots) and comparing it with the timing of the Scottish campaign, which commenced in 1941, as documented in ‘The Scottish Diphtheria Immunization Campaign (1941 to 1942)’, Russell, A. (1943) [141] http://journals.sagepub.com/doi/pdf/10.1177/003591574303601001. This timing would have been in line with the broader national campaigns seen throughout the rest of Britain.

Diphtheria graph

Fig. 22: The steep rises and falls representing deaths per 100,000 of our respective populations from Diphtheria, well before the main availability of vaccination (for infants to begin with), for Northern Ireland, Scotland and England/Wales. Note that Éire (ROI) started its infant vaccination scheme earlier – yet it does not follow the same pattern of decline until some time later. Chart reproduced from the 1947 annual statistics report, p. xix, Diagram No. 2 – “Annual Reports on Marriages, Births and Deaths in Ireland, from 1864 to 2000” courtesy of: An Phríomh-Oifig Staidrimh, Central Statistics Office CSO, link

Note that Éire, as plotted on the graph above (Fig. 22, solid black curve), relates to the newly formed Republic of Ireland of this era, and its mortality curve deviates somewhat from the other regions in that it lags behind by a few years in terms of its final resolution. However, if we hypothetically pushed this line back a few years, it would broadly correspond in its peaks and troughs with the others. We will discuss this deviation in the statistics plotted for Éire shortly.

Now returning to Scotland: as documented in the excerpt below, we find that the initial enthusiasm within Scotland for mass vaccination rapidly began to dwindle. This was probably because the worst was essentially over, and parents could sense this on the ground as they counted more of their children surviving the Diphtheria infection than they had done previously.

The Scottish Diphtheria Immunization Campaign (1941 to 1942)

During the second half of 1942, as the figures indicate, there was a serious falling-away from the earlier enthusiastic response and it is probable that at the moment only about 60 % of the child population is protected. It is to be hoped that this figure will not decrease further, otherwise Scotland may, sooner rather than later, find herself back to the unsatisfactory position which existed three years ago.

Russell, A. (1943, 32)

[142]

http://journals.sagepub.com/doi/pdf/10.1177/003591574303601001

As indicated above, the coverage had declined significantly after the first anti-Diphtheria campaign of 1941, and this, of course, suggests that this degree of coverage would not have been sufficient to account for the continued tumbling rates of deaths from Diphtheria as clearly illustrated in Figure 22.

And although the dramatic drop in deaths from Diphtheria was obviously a very welcome state of affairs, as the health official stresses in the above excerpt, it was felt that there was a need for continued vaccination lest the old Diphtheria return again. Little did he, or any of his peers, know that Diphtheria's reign of terror was actually just about to come to a rather abrupt and ultimate end, irrespective of the dramatically falling rates of artificial protection (via vaccination). (See Fig. 22 for the dramatic drop to virtually no deaths by 1947.)

Now focussing in on Ireland (leaving aside Northern Ireland, which had by this era come under the British public health and mortality statistics system): as you can see from the graph above (Fig. 22), Éire lags behind the other regions to some extent in terms of its final resolution of Diphtheria as a killer, as noted earlier.

This is of particular interest when we consider that the Republic of Ireland (Éire) began its mass vaccination rollout using the Diphtheria antitoxin significantly earlier than Britain/the UK (inclusive of Northern Ireland, Scotland, England and Wales). This earlier introduction differed from that in these other regions in that it was not driven by governmental health policies, but by those testing a vaccine experimentally, with a view to making it commercially available to our respective governmental health authorities, as indicated below.

The Irish Examiner

Vaccine Trials: Dark chapter that needs answers

(1st December 2014)

…findings published in today’s Irish Examiner reveal … In the earliest known trials, conducted between 1930 and 1935, Wellcome’s APT anti-diphtheria vaccine was administered to 2,051 children in 24 residential institutions in Dublin, Cork, and Tipperary and to more than 40,000 children among the general child population.

Dwyer, M. (2014)

https://www.irishexaminer.com/viewpoints/analysis/vaccine-trials-dark-chapter-that-needs-answers-300251.html

[143]

Now, if we check the annual average number of infants actually born in Ireland for this timeframe by going to the Registrar General's Report (1940) for the period 1930-39 https://www.cso.ie/en/media/csoie/releasespublications/documents/birthsdm/archivedreports/P-VS_1940.pdf (1941, viii) [144], we find that a sizable portion of the child population was vaccinated as part of these trials in the same general era.

This should have been adequate to register on the mortality graphs for Diphtheria, but, in fact, quite the opposite is the case. It took longer for the Republic of Ireland to see a final resolution of deaths from Diphtheria, and while other regions were seeing their mortality rates go into free-fall either just prior to, or near-simultaneously with, the British 1941 anti-Diphtheria campaign, Ireland was witnessing a huge rise in deaths over the same general period. Fortunately, however, Ireland also sees its own final epidemic of deaths from Diphtheria plummet shortly thereafter (see Fig. 22 – solid line).

For instance, just prior to this final resolution within the newly founded Irish Republic (independent of Britain), we can gain a more detailed insight into what was happening on the ground here in Ireland during this tumultuous final epidemic. This is documented in historical accounts of one doctor's passionate campaign to increase the uptake of vaccination against Diphtheria in Ireland as the final epidemic was climbing to more deadly prominence.

The campaign against diphtheria in Cork Street Fever Hospital, 1934-1952

The early immunisation schemes were not without their complications, which were recognised by Dr McSweeney a year later in 1942. In his medical report for the year, Dr McSweeney stated that of the 127,000 children inoculated against diphtheria in Dublin during the previous 10 years, approximately 35% had been treated by the ‘one shot’ method now regarded as unreliable.

After a period of 5-7 years some children immunised in this way became susceptible again; 137 of 594 patients with diphtheria admitted in Cork Street in 1942 had given a previous history of receiving a full course of diphtheria prophylactic…

However, Dr McSweeney noted in the following year’s report that diphtheria rarely killed a child who had had even one course of injections…

He continued to be a firm advocate of the immunisation programme, now modified to consist of the administration of two injections of alum precipitated toxoid (APT) or three injections of toxoid antitoxin floccules (TAF).

Dr McSweeney noted in his 1948 report that the year gone by was the first in sixty years in which no death from diphtheria had been recorded in Cork Street. He stated that ‘although immunisation on a large scale has helped to produce this favourable state of things, it must be conceded that the decline in virulence of the causative organism – a cyclical phenomenon – has been a causative factor’…

Brady. F. (2015, 6th February RCPI Heritage Blog)

http://rcpilibrary.blogspot.com/2015/02/the-campaign-against-diphtheria-in-cork.html

[145]

Recall the context for Dr McSweeney's remarks: neither he, nor anyone else, could have known that this was Diphtheria's last hurrah as a killer pathogen. McSweeney himself notes in his reports the surprising decline in deaths as his hospital beds emptied and the few patients who did come in with Diphtheria tended to walk out alive; and it is of interest, as documented above, that within a few short years came the first year in sixty in which no deaths from Diphtheria occurred in his hospital at all!

Now, if we review Figure 22 again, you will perhaps see that in 1947 it wasn't just the beds in his hospital that were emptying, but those across the nation as a whole. Of course, nobody would have believed at the time that such a final resolution of a previously more lethal contagion across the entire nation could be possible. However, McSweeney did attribute this decline in deaths from Diphtheria, at least in part, to some natural cycle between the pathogen and us as its host.

Furthermore, it is perhaps worth reiterating another factor that McSweeney reveals in his reports, and that is the fairly short-lived protection afforded by the Diphtheria antitoxin. This, of course, brings to mind the issue with the Cowpox vaccine against Smallpox. Such waning immunity is a much more common feature of artificial, vaccine-derived immunity than most of us realise (there is a substantial amount of scientific research carried out by immunologists that supports this statement).

Indeed, when vaccines were first introduced, it was truly believed (and greatly hoped) that preventing the circulation of infectious disease would result in fewer of our populations getting infected – which did initially occur for a certain number of years, until immunity began to wane, a well-known phenomenon of almost all vaccines investigated – and therefore in less mortality and morbidity (death and destruction).

Of course, this was predicated upon the belief that vaccine immunity was for life, or close to it (and that one shot would give protection), and upon the hope that we could create something artificial that would effectively mimic our natural immune response while producing only a milder form of the infection; it did not turn out to be as robust as everyone had hoped. This contrasts quite spectacularly with the documented evidence for natural generational immunity against Diphtheria, which seems to apply to all the once deadlier natural contagions across the board. See excerpt below:

The Changing Epidemiology of Diphtheria in the Vaccine Era

The Journal of Infectious Diseases, Volume 181

Diphtheria

Most newborn infants passively acquired antibodies from their mothers via the placenta. In 1914 in Vienna … and in 1923 in New York City …, ∼80% of newborns showed evidence of diphtheria immunity …

During the first several months of life, this passive immunity waned and was gradually replaced by active immunity, which was acquired through increasing exposure to natural infection.

By 15 years of age, 80% of the children had acquired natural immunity against diphtheria.

Galazka A. and Dittmann, S.,  (2000)

[146]

https://academic.oup.com/jid/article/181/Supplement_1/S2/840806

In other words, mothers who had themselves been exposed as children to the Diphtheria pathogen were able to pass on this protection to their young until they could fend for themselves. This exposure was seemingly essential to keep the generational cycle of robust resilience going; thus, it is most likely this cyclical phenomenon, rather than any particular intervention on our part, that accounts for the ever-declining deaths from Diphtheria until full resolution across our modern nations.

It seems that the worst effects of Diphtheria were already all but resolved by the time most of our nations were even thinking about taking such measures to combat it. However, as we vaccinate almost all of our infants today against it, this might have unintended longer-term impacts that nobody could have foreseen, as outlined in the following:

The Changing Epidemiology of Diphtheria in the Vaccine Era

The Journal of Infectious Diseases, Volume 181

Diphtheria

Changes in Immunity Patterns by Age

Changes in the age-wise distribution of the immunity patterns usually have been explained by the argument that immunization led to a marked decrease in the incidence of the disease and to a subsequent reduction of the reservoir of toxigenic C. diphtheriae organisms.

In the prevaccine era, exposure to toxigenic strains of diphtheria organisms was common, and this provided natural boosts to the development and maintenance of immunity against diphtheria. Children were susceptible, and most adults remained immune to the disease.

However, after immunization of children became widespread, diphtheria became rare, so exposure to these bacteria (and the concomitant natural boost of immunity) become uncommon.

If adults do not have natural exposure to diphtheria-causing organisms or receive booster doses of diphtheria toxoid, their immunity induced by childhood immunization wanes, and they become susceptible to the disease …

Galazka A. and Dittmann, S.,  (2000)

[147]

https://academic.oup.com/jid/article/181/Supplement_1/S2/840806


PERTUSSIS (WHOOPING COUGH) PRIOR TO THE VACCINE ERA

Pertussis whooping cough Mortality Ireland

Fig. 23: Decline in deaths from Pertussis (Whooping Cough) in the pre-vaccine era in Ireland. Arrow represents when the vaccine against Pertussis (Whooping Cough) was first introduced in 1952 into Ireland. Chart generated using annual statistics reports since records began – “Annual Reports on Marriages, Births and Deaths in Ireland, from 1864 to 2000” courtesy of An Phríomh-Oifig Staidrimh, Central Statistics Office CSO, link. © Copyright dig-press.com

Deaths from Pertussis (Whooping Cough), as shown above (Fig. 23), had clearly become rare by the time the vaccine became widely available (see arrow). A vaccine against Pertussis (Whooping Cough) only became widely available in Ireland by 1952 (see history of vaccine introduction into Ireland) [148]. The vaccine was introduced into the UK around 1950 (Smallman-Raynor, M and Cliff, A, 2012, 52) https://books.google.ca/books?id=iMnN4fZrj70C&pg=PA48#v=onepage&q&f=false [149].

Recall the graphs from England and Wales showing mortality statistics for Pertussis (Whooping Cough), illustrated earlier in Figure 19, which were near identical (apart from scale) in their pattern of decline in deaths over the same timeframe, and the comparative graphs for mortality rates from the disease within the United States presented in Figure 20? This pattern of dramatically declining deaths from Pertussis (Whooping Cough) is easier to see in the stand-alone mortality graph for the United States (Fig. 24) below.

Whooping Cough U.S. individual deaths
Fig. 24: Annual death rate per 100,000 of population from Whooping Cough (Pertussis) in the U.S., 1900–1960 (with data gap). Graph generated from tabulations given in ‘Vaccination and the Control of Seven Infectious Diseases in the US -1900-1960s’ (Blood, B., 2000-2013) http://www.dolmetsch.com/USDiseaseData1900to1970.html#table9

As you can see from inspecting the annual mortality data for Pertussis (Whooping Cough) within the U.S., the most dramatic decline (even allowing for the data gap) occurs essentially within the era before the vaccine came into wider use – only a year or two prior to its wider use in Britain (as indicated by official websites relating to public health in the United States) – and so the vaccine cannot be credited as the cause of this near-universal decline in mortality from Pertussis (Whooping Cough) across our respective nations.

Nor can antibiotics account for the decline in deaths at a population level, as deaths from this bacterial disease had already plummeted to all-time lows by the time we entered the antibiotic era.

The cause of the overall decline and ultimate resolution across our different nations may, therefore, once again be down to a more natural phenomenon relating to the generational transference of protection. We find support for this in the different dynamics of the disease before and after the vaccine era: once most infants were vaccinated, a population of more susceptible individuals was created, seemingly due to the loss of natural boosting via exposure.

An Evaluation of Pertussis Vaccine

Pertussis

..in the prevaccine era, when pertussis circulation was high and symptomatic infections more common than they are now, most people got a natural immune boost by coming in contact with infection before their immunity had completely waned…

With so little circulating pathogen people’s immunity was rarely boosted, thereby creating a large pool of people susceptible to pertussis, and allowing epidemic outbreaks in the current era despite high vaccine coverage.

Mortimer, E.A., Jr. & Jones, P.K. (1979)

 [150]

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2997163/

https://www.jstor.org/stable/4452397?googleloggedin=true#page_scan_tab_contents

Pertussis (Whooping Cough) is seemingly akin to those other great bacterial diseases such as Scarlet Fever and Tuberculosis, in that it also appears to require natural boosting via exposure for us to remain fully protected from its worst effects. However, with the introduction of mass vaccination, it is worrying that we are no longer getting our much-needed boosters in a safe and natural way.

For instance, although it would appear that, in the long term and on a generational level, once deadlier contagions such as Whooping Cough and Diphtheria (like all the once deadlier contagions discussed thus far) have resolved themselves for the most part, either without, or essentially prior to, our efforts to intervene, in the case of Pertussis there is an ever-increasing issue with vaccine waning and asymptomatic, silent carriers (as indicated in the rest of the article excerpted above). Artificial protection (vaccination), as more and more studies are showing, can make you into a Typhoid Mary type: you go around neither whooping nor even knowing you are infected.

Back in the pre-vaccine era, you knew you had the Whoop, and people would hear you coming if they felt a need to avoid the natural boost. However, you typically got it in childhood, when it was a lot less awful to have, and you could then boost the immune systems of the adults around you.

Epidemiology of Whooping Cough

Clinical Description

Pertussis, or whooping cough,.. is a serious epidemic respiratory infection caused by Bordetella pertussis,..

In fact, evidence suggests that immunized… adults in developed nations are the most common source of pertussis infections in neonates and children.

Institute of Medicine (US) (1991)

[151]

https://www.ncbi.nlm.nih.gov/books/NBK234373/

Although these days most of us don't typically die from having Pertussis, unfortunately, infants under two months of age (neonates; see excerpt above) are becoming increasingly vulnerable due to a well-recognised phenomenon whereby they no longer get a natural buffering from the environment via their mother's antibodies, which she would once have acquired from her own childhood exposure to the naturally circulating Pertussis organism. This is why we now offer vaccination against Pertussis to pregnant women in our modern era! I thought nature was doing a fine job herself.

Another consequence of the natural immunity cycle being somewhat interrupted by our efforts to intervene via vaccination is that it has forced the natural strains to adapt and evolve, as proposed in the following excerpt from a paper in the Nature journal Scientific Reports.

Whole-genome sequencing reveals the effect of vaccination on the evolution of Bordetella pertussis

… since the 1990s, pertussis resurgence has been observed in developed countries that have attained high vaccination coverage such as the Netherlands, the United Kingdom and the United States…

These results provide new and crucial evidence that the immune pressure from vaccination is one major driving force for the evolution of B. pertussis.

Yinghua Xu, Bin Liu, Kirsi Gröndahl-Yli-Hannuksila, Yajun Tan, Lu Feng, Teemu Kallonen, Lichan Wang, Ding Peng, Qiushui He, Lei Wang & Shumin Zhang
(2015, Abstract)

[152]

https://www.nature.com/articles/srep12888

The Irish schedule given to parents is similar to that of other regions and is generally as follows: the Pertussis vaccine is given at 2, 4, and 6 months of life, combined in a single injection with vaccines against five other potential infections (the so-called 6 in 1).

A booster vaccine is then given at 4-5 years of age (the 4 in 1), and another booster dose is given in the first year of second-level school, combined in one injection with vaccines against other diseases. Yet even when a child is fully up to date, their vaccine protection can still wear off after a few years, as summarised on the following health provider's website relating to Pertussis. (If you examine the duration of protection for all the other main infections discussed throughout this study, you will find a similar story, with the exception of Measles, which we will review in the final section of this chapter.)

Duration of protection by vaccine

Estimated duration of protection from vaccine after receipt of all recommended doses…

Pertussis (whooping cough): 4-6 years

The Immunisation Advisory Centre (2017)

[153]

http://www.immune.org.nz/vaccines/efficiency-effectiveness
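For readers who like to see it laid out at a glance, the following is a minimal sketch (not an official schedule) restating the Irish Pertussis doses described above as a simple data structure, together with the 4-6 year protection window quoted above from the Immunisation Advisory Centre. The ages and labels are taken directly from the text, and the variable names are our own; anything not mentioned in the text (exact products, doses for other diseases) is deliberately left out.

```python
# The Irish childhood Pertussis schedule as described in the text above (a rough,
# illustrative restatement only; combination names are the colloquial labels used here).
pertussis_schedule = [
    ("2 months",  "primary dose 1", "6 in 1"),
    ("4 months",  "primary dose 2", "6 in 1"),
    ("6 months",  "primary dose 3", "6 in 1"),
    ("4-5 years", "booster",        "4 in 1"),
    ("1st year of second-level school", "booster", "combined injection"),
]

# Estimated duration of protection after all recommended doses, per the
# Immunisation Advisory Centre figure quoted above.
protection_window_years = (4, 6)

for age, dose, given_as in pertussis_schedule:
    print(f"{age:<35} {dose:<15} (given as {given_as})")
print(f"Estimated protection after full course: "
      f"{protection_window_years[0]}-{protection_window_years[1]} years")
```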


MEASLES PRIOR TO THE VACCINE ERA

If we review the graphs presented at the beginning of this chapter, which include mortality figures for Measles along with the other childhood contagions that plummeted throughout much of the 20th Century across our different nations, we can see just how similar our mortality rates are, and how Measles became a fairly benign infection of children at around the same time. Certainly, compared to the generations of slaughter that this contagion once wreaked amongst our childhood populations, this was a vast improvement – one that obviously occurred via natural means well before we began to intervene.

For instance, even if we go back to a time when Measles had not fully resolved itself but was significantly tamer than in the earlier era – say around the early 1940s – we find that having the infection at the age-appropriate time was relatively harmless even then. If we look to the historical archives for accounts from this era, well before a Measles vaccine was even thought of, we find from a report relating to England and Wales (which of course would apply to any of our modern nations) a very revealing commentary, as documented in a more recent study.

Atlas of Epidemic Britain: A Twentieth Century Picture

During the first three full years of notification, 1940-42, over 1.15 million cases of measles were reported.

With this new information to hand, William Butler could inform a meeting of the Royal Statistical Society that:

“In looking through the Annual Reports of a number of Medical Officers of Health for the year 1940, I was impressed by the small number of deaths recorded in proportion to the number of cases notified in the districts to which they related.”

Smallman-Raynor, M and Cliff , A (2012, 50)

[154]

https://books.google.ca/books?id=iMnN4fZrj70C&pg=PA48#v=onepage&q&f=false

We are talking about over a million cases of Measles and so few deaths – and this is back in 1940! Was nearly everyone somehow immune? According to the CDC, as quoted below, on the eve of the vaccine era in the United States, yes, indeed, almost everyone was.

Centers for Disease Control and Prevention

Epidemiology and Prevention of Vaccine-Preventable Diseases, 13th Edition

Before a vaccine was available, infection with measles virus was nearly universal during childhood, and more than 90% of persons were immune by age 15 years…

CDC (2015, 209)

https://www.cdc.gov/vaccines/pubs/pinkbook/downloads/meas.pdf

[155]

This article also emphasises the often fatal aspect of having Measles, but this is based upon figures from less developed nations, which are presumably where we once found ourselves only decades earlier; and it looks, according to our mortality graphs, as if it is just a matter of time before their mortality graphs look the very same.

Then our vaccine era begins – apparently there was no longer any real need for it, but, as we could do it, we did do it …

American Journal of Public Health

Vol. 103 [8]

Measles Vaccination Before the Measles-Mumps-Rubella Vaccine

US and UK IMMUNIZATION POLICY, 1963–1968

Any decision to begin mass measles vaccination in the early 1960s thus involved numerous uncertainties. Was the disease serious enough? Would parents feel it worth having their children vaccinated?

And if mass vaccination did seem justified, should the live or the killed vaccine (or a combination of both) be used?

In the United States, experience with the polio vaccines played a major role in shaping the consensus that gradually emerged.

…Approximately 15 million children were given one of the new measles vaccines starting with their licensing in 1963 and continuing until mid-1966, and the reported incidence of the disease fell by half... On the basis of this success, with material and financial support from the Centers for Disease Control and Prevention, and inspired by the social and political climate of the time, in 1967 a campaign was launched to eliminate measles from the United States.

“To those who ask me ‘Why do you wish to eradicate measles?’” wrote Alexander Langmuir, chief epidemiologist at the Centers for Disease Control and Prevention from 1949 to 1970, I reply with the same answer that Hillary used when asked why he wished to climb Mt. Everest. He said “Because it is there.” To this may be added, “… and it can be done.”…

Mass measles immunization began in Britain in 1968. In Sweden it began in 1971 and in the Netherlands not until 1976.

Hendriks, J., & Blume, S. (2013)

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4007870/

[156]

As you can see from the excerpt above, other regions followed suit at varying times thereafter in their implementation of the first vaccines against Measles. As also indicated above, this shift towards feeling a need for such an intervention was very much built upon the seeming success of the Polio vaccine (this subject will be discussed in the final chapter of this book).

Now, one region not mentioned above is Ireland. A vaccine against Measles was only first introduced here in 1985, as documented in the history of vaccine introduction into Ireland [157]. Below is a graph of the officially recorded individual deaths from Measles since records began (Fig. 25). As you can see, this is remarkably late indeed, considering that by the time of the Measles vaccine's broader use, the mortality graph had flatlined. In other words, deaths and disabilities from having Measles were virtually unheard of by this time.

Measles Ireland

Fig. 25: Decline in deaths from Measles in the pre-vaccine era in Ireland. Arrow represents when the Measles vaccine became widely available in Ireland. Chart generated using annual statistics reports since records began – “Annual Reports on Marriages, Births and Deaths in Ireland, from 1864 to 2000” courtesy of An Phríomh-Oifig Staidrimh, Central Statistics Office CSO, link. © Copyright dig-press.com

We used to think that high vaccination coverage could mimic the type of ‘herd immunity’ (community protection, since so many of the population were already immune – recall that this was estimated at c. 90% by about 15 years of age in the pre-vaccine era), replacing the natural life-long immunity that almost everyone had acquired by the time they had grown up, and thereby protecting the community as a whole.

However, this, alas, turned out not to be the much-hoped-for outcome. You see, in many ways, what happened over the course of time could be summarised succinctly as: the honeymoon period is over.

This is because we now know – a generation or two, or even three in some regions, on from replacing natural immunity with artificial immunity – that infants rapidly became the most vulnerable, as in most of our modern nations they are not offered the vaccine before one year of age (ironically, this is mostly because the vaccine did not work so well when it interfered with the mother's naturally acquired maternal antibodies, as documented frequently within the scientific literature).

As discussed previously within this study, naturally acquired maternal antibodies against seemingly all infectious diseases that have been circulating widely within the environment to which our ancestors were exposed are passed along via the maternal line and reinforced and boosted in each generation more rapidly thereafter. The critical part of this natural immunity cycle would, therefore, be the pathogens a mother is exposed to as a child when growing up and what challenges her own mother and grandmother had to face.

This maternal priming (preparation) helps to shield a newborn by providing a natural buffering (a kind of attenuated, less virulent form of exposure) against the circulating pathogens within the environment the infant is born into. This then gives the infant a taste of, and familiarity with, the infection without causing harm.

However, now that we vaccinate against something like Measles on such a mass scale, and given that our vaccines have been perhaps too successful in attempting to eradicate Measles, children under one year of age are left the most vulnerable, as vaccinated mothers simply cannot offer them the same degree of protection via their artificially acquired immunity.

This very issue is discussed in the excerpt below, which also touches on breastfeeding – once a common way of maintaining an infant's protection via maternal antibodies, and now significantly declined. Notably, this same article suggests vaccinating earlier because of all of the above.

New measles vaccination schedules in the European countries?.

Summary

Over the last 5 years, a number of outbreaks of measles have occurred in several European Union (EU) countries. Many of these outbreaks continue and/or continued for more than 1 year after the notification of the first case. Curiously in many of measles outbreaks about 10% of the patients were less than 12 months of age. All these patients according to the current EU countries vaccination calendars were not yet vaccinated against measles…

Most of mothers between 30 and 40 years of age are not vaccinated against measles, and many of them are not naturally immune against measles. These mothers do not pose antibodies against measles and in turn do not provide […] protection for their infants.

In conclusion, administrating the first dose of measles vaccine in the EU countries should be considered before 12 months of age, most probably at 9 months of age…

vaccinated mothers aged between 30 and 40 years old provide very low levels of antibodies to their infants, not sufficient to protect them over the first 12 months of their lives …

… the prevalence of breast feeding is lowering or at least reduced in duration all over the EU countries, especially over the last 20 years because of several social and economic changes. Maternal milk provides antibodies which offers major protection for infants against many bacterial and viral infections including measles ..

Allam M. F. (2014)

[158]

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4718332/

In other words, vulnerable infants used to get their robust protection from their mothers who, back before the vaccine era, would themselves have been exposed to the Measles virus – getting the infection as a child was almost a rite of passage, the disease having become so relatively benign before we started vaccinating against it – and who therefore became immune for life, bolstering the other 10% who had not yet been there, done that, and become immune already.

Moreover, this type of natural immunity prior to the wider use of vaccination against Measles comes into sharper relief as we begin to realise that vaccine-derived immunity wanes significantly over time, just like all the rest, as encapsulated in the following excerpt dealing with Measles vaccine (secondary) failure – another way of saying waning artificial immunity.

 Secondary measles vaccine failures identified by measurement of IgG avidity: high occurrence among teenagers vaccinated at a young age.

…measles-vaccine failures are more common than was previously thought, particularly among individuals vaccinated in early life, long ago, and among re-vaccinees. Waning immunity even among individuals vaccinated after 15 months of age, without the boosting effect of natural infections should be considered a relevant possibility in future planning of vaccination against measles.

Paunio M, Hedman K, Davidkin I, Valle M, Heinonen OP,

Leinikki P, Salmi A, Peltola H. (2000)

Epidemiology Infection. Vol. 124 [2], pp.263-71.

https://www.ncbi.nlm.nih.gov/pubmed/?term=Paunio%20M%5BAuthor%5D&cauthor=true&cauthor_uid=10813152

[159]


References:


[133] Guttormsson, L and Garðarsdóttir, Ó (2002) The Development of Infant Mortality in Iceland, 1800–1920, Hygiea Internationalis, An Interdisciplinary Journal for the History of Public Health, Vol. 3 [1] pp. 151-176 [Available online as PDF] DOI: 10.3384/hygiea.1403-8668.0231151 https://pdfs.semanticscholar.org/d338/90ffb7c01490bde7a729270285926ea3b17e.pdf

[134] Smallman-Raynor, M, and Cliff, A (2012), Atlas of Epidemic Britain: A Twentieth Century Picture, Oxford University Press, Oxford. p.50, figure 4:18 (Measles); p. 52, figure 4:24 (Whooping Cough); p.49, figure 4:15 (Scarlet Fever). https://books.google.ca/books?id=iMnN4fZrj70C&pg=PA48#v=onepage&q&f=false

[135] Dobson, AP. and Carper, ER (1996) Infectious Diseases and Human Population History: Throughout history the establishment of disease has been a side effect of the growth of civilization, BioScience, 46, Issue [2,] pp. 115–126, DOI: 10.2307/1312814 [Available online as PDF] https://academic.oup.com/bioscience/article-abstract/46/2/115/252374

[136] Gordon, T. (1953) Mortality in the United States, 1900-1950. Public Health Reports, Vol. 68 [4], p. 441. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2024011/

[137] McCormick, W.J (1951) Vitamin C in the Prophylaxis and Therapy of Infectious Diseases, Archives of Pediatrics, Vol. 68, [1]. https://www.seanet.com/~alexs/ascorbate/195x/mccormick-wj-arch_pediatrics-1951-v68-n1-p1.htm

[138] McCormick, W.J (1951) Vitamin C in the Prophylaxis and Therapy of Infectious Diseases, Archives of Pediatrics, Vol. 68, [1]. https://www.seanet.com/~alexs/ascorbate/195x/mccormick-wj-arch_pediatrics-1951-v68-n1-p1.htm

[139] Mc Keown, T (1979) The Role of Medicine: Dream, Mirage, or Nemesis? Basil Blackwell, Oxford. p. 47 [Available online as PDF] http://peaceworkspartners.org/vault/Oxford/DPHPC/Health%20and%20Development%20Course/Int%20Dev%20Readings%20HT10/1a.%20Main%20Theories/McKeown%20The%20Role%20of%20Medicine%201979.pdf

[140] Baker, J.P. (2000) Immunization and the American Way: 4 Childhood Vaccines, Diphtheria: Enthusiasm and Empiricism, American Journal of Public Health, Vol. 90, [2], 200-201 http://ajph.aphapublications.org/doi/pdf/10.2105/AJPH.90.2.199

[141] Russell, A. (1943) The Scottish Diphtheria Immunization Campaign (1941 to 1942) Sectional Proceedings of the Royal Society of Medicine Vol. XXXV1, p. 32 http://journals.sagepub.com/doi/pdf/10.1177/003591574303601001

[142] Russell, A. (1943) The Scottish Diphtheria Immunization Campaign (1941 to 1942) Sectional Proceedings of the Royal Society of Medicine Vol. XXXV1, p. 32 http://journals.sagepub.com/doi/pdf/10.1177/003591574303601001

[143] Dwyer, M (2014) Vaccine Trials: Dark chapter that needs answers, The Irish Examiner (1st December 2014) https://www.irishexaminer.com/viewpoints/analysis/vaccine-trials-dark-chapter-that-needs-answers-300251.html

[144] Registrar General’s Report (1940) for the period 1930-39, (1941, viii). https://www.cso.ie/en/media/csoie/releasespublications/documents/birthsdm/archivedreports/P-VS_1940.pdf

[145] Brady, F (2015) The campaign against diphtheria in Cork Street Fever Hospital, 1934-1952, RCPI (Royal College of Physicians of Ireland) blogspot (9th February 2015) http://rcpilibrary.blogspot.com/2015/02/the-campaign-against-diphtheria-in-cork.html

[146] Galazka, A., & Dittmann S. (2000). The Changing Epidemiology of Diphtheria in the Vaccine Era, The Journal of Infectious Diseases, Vol. 181, [1], pp. S2–S9, https://doi.org/10.1086/315533 https://academic.oup.com/jid/article/181/Supplement_1/S2/840806

[147] Galazka, A., & Dittmann S. (2000). The Changing Epidemiology of Diphtheria in the Vaccine Era, The Journal of Infectious Diseases, Vol. 181, [1], pp. S2–S9, https://doi.org/10.1086/315533 https://academic.oup.com/jid/article/181/Supplement_1/S2/840806

[148] National Immunisation Office, (2018) Previous vaccine schedules https://www.hse.ie/eng/health/immunisation/whoweare/vacchistory.htm

[149] Smallman-Raynor, M, and Cliff, A (2012), Atlas of Epidemic Britain: A Twentieth Century Picture, Oxford University Press, Oxford.  https://books.google.ca/books?id=iMnN4fZrj70C&pg=PA48#v=onepage&q&f=false

[150] Mortimer E.A Jr, & Jones P.K. (1979). An Evaluation of Pertussis Vaccine [with Discussion] in, Charles H. Rammelkamp, Jr., Symposium (eds.,). Reviews of Infectious Diseases, Vol. 1, [6], Issue (Nov. – Dec., 1979), pp. 927-934 Abstract, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2997163/

[151] Institute of Medicine (US) (1991), Epidemiology of Whooping Cough, Clinical Description, in, Howson CP, Howe CJ, Fineberg HV, (eds.,), Effects of Pertussis and Rubella Vaccines: A Report of the Committee to Review the Adverse Consequences of Pertussis and Rubella Vaccines, Washington (DC): National Academies Press (US). https://www.ncbi.nlm.nih.gov/books/NBK234373/

[152] Yinghua Xu, Bin Liu, Kirsi Gröndahl-Yli-Hannuksila, Yajun Tan, Lu Feng, Teemu Kallonen, Lichan Wang, Ding Peng, Qiushui He, Lei Wang & Shumin Zhang
(2015). Whole-genome sequencing reveals the effect of vaccination on the evolution of Bordetella pertussis, Scientific Reports. Vol. 5, [12888], Abstract. https://www.nature.com/articles/srep12888

[153]  The Immunisation Advisory Centre (2017). Duration of protection by vaccine http://www.immune.org.nz/vaccines/efficiency-effectiveness

[154] Smallman-Raynor, M, and Cliff, A (2012), Atlas of Epidemic Britain: A Twentieth Century Picture, Oxford University Press, Oxford. https://books.google.ca/books?id=iMnN4fZrj70C&pg=PA48#v=onepage&q&f=false

[155] CDC (2015) Epidemiology and Prevention of Vaccine-Preventable Diseases, 13th Edition, 209 https://www.cdc.gov/vaccines/pubs/pinkbook/downloads/meas.pdf

[156] Hendriks, J., & Blume, S. (2013). Measles Vaccination Before the Measles-Mumps-Rubella Vaccine. American Journal of Public Health, Vol. 103 [8], pp.1393–1401. http://doi.org/10.2105/AJPH.2012.301075 https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4007870/

[157] National Immunisation Office, (2018) Previous vaccine schedules https://www.hse.ie/eng/health/immunisation/whoweare/vacchistory.htm

[158] Allam M. F. (2014). New measles vaccination schedules in the European countries?. Journal of preventive medicine and hygiene, Vol. 55 [1], Summary, pp. 33-4. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4718332/

[159] Paunio M, Hedman K, Davidkin I, Valle M, Heinonen OP, Leinikki P, Salmi A, Peltola H. (2000). Secondary measles vaccine failures identified by measurement of IgG avidity: high occurrence among teenagers vaccinated at a young age. Epidemiology Infection. Vol. 124 [2], pp. 263-71. https://www.ncbi.nlm.nih.gov/pubmed/?term=Paunio%20M%5BAuthor%5D&cauthor=true&cauthor_uid=10813152



CHAPTER SEVEN: Scarlet Fever Returns, but it is a whole lot less lethal


Just a historical note on the peculiar relationship between Scarlet Fever and Diphtheria: it should be said that we don't know much about Scarlet Fever from the earlier era (pre-1800s), but we do understand that it was fairly widespread and existed in some form entangled with the fairly similar disease of Diphtheria, as documented in the excerpt below, taken from Charles Creighton's 1894 history of epidemics.

A History of Epidemics in Britain

CHAPTER VII.

SCARLATINA AND DIPHTHERIA.

Scarlatina and diphtheria have to be taken together in a historical work for the reason that certain important epidemics of the 18th century, both in Britain and in the American colonies, which were indeed the first of the kind in modern English experience, cannot now be placed definitely under the one head or the other, nor divided between the two.

Creighton, C.  (1894, 678)

[109]

https://www.gutenberg.org/files/43671/43671-h/43671-h.htm#Page_693

Creighton also documents regular and widespread outbreaks of throat infections, akin to what we would call Scarlet Fever, that often led to fatalities throughout Ireland, Britain and parts of the U.S. over the course of the 18th Century (the 1700s). However, it wasn't really until the 19th Century (the 1800s), when specific epidemics erupted around the same time within Ireland, Scotland, England and Wales, that Scarlet Fever became understood as a more distinct disease, as indicated in the excerpt that follows.

A History of Epidemics in Britain

Vol. II.

SCARLATINA

… The general prevalence of malignant scarlet fever in the first years of the 19th century is farther shown by the accounts from Ireland, which were recalled by Graves in a clinical lecture of the session 1834-35, during the prevalence of a scarlet fever as malignant as that of thirty years before…

“In the year 1801,” he says, “in the months of September, October, November and December, scarlet fever committed great ravages in Dublin, and continued its destructive progress during the spring of 1802.

It ceased in summer, but returned at intervals during the years 1803-4, when the disease changed its character; and although scarlatina epidemics recurred very frequently during the next twenty-seven years, yet it was always in the simple or mild form, so that I have known an instance where not a single death occurred among eighty boys attacked in a public institution.

The epidemic of 1801-2-3-4, on the contrary, was extremely fatal, sometimes terminating in death (as appears by the notes of Dr Percival kindly communicated to me) so early as the second day. It thinned many families in the middle and upper classes of society, and even left not a few parents childless. Its characters seem to have answered to the definition of the scarlatina maligna of authors.”

The long immunity from malignant scarlatina which Graves asserts for Ireland after 1804, is made probable also for England and Scotland after 1805…

It is not until 1831 that we begin to hear much of malignant scarlatina again. But it is clear that scarlet fever was common enough all through that interval, probably in its milder form. It was now the usual epidemic trouble of schools.

Creighton, C.  (1894, 722-3)

[110]

https://www.gutenberg.org/files/43671/43671-h/43671-h.htm#Page_693

Supporting this observation, Scarlet Fever took a turn for the worse and became a far more lethal contagion of pandemic proportions across many of the developing nations of the era. Bearing in mind that parents also had to contend with other consistently deadly contagions of the period, such as Measles, Diphtheria and Whooping Cough (Pertussis), the excerpt below indicates that Scarlet Fever became so deadly that it began to supersede all of these other contagions.

AETIOLOGY

Scarlet fever–past and present

In the early nineteenth century, the clinical presentation of the disease appears to have changed for the worse. Lethal epidemics were seen in Tours, France, in 1824; in Dublin, Ireland, in 1831; and in Augusta, Georgia, during 1832-33. Similarly, in Great Britain, the fatality rate from scarlet fever increased from between 1 and 2 % to more than 15% in 1834. From 1840 until 1883, scarlet fever became one of the most common infectious childhood disease to cause death in most of the major metropolitan centers of Europe and the United States, with case fatality rates that reached or exceeded 30% in some areas–eclipsing even measles, diphtheria, and pertussis.

Smith, T.C. (2011)

[111]

http://aetiologyblog.com/2011/07/06/scarlet-fever-in-hong-kong/

In other words, you can imagine how parents felt as Smallpox was beginning to recede, only to find themselves losing children to Scarlet Fever. And as Scarlet Fever began to recede in its deadliness, they had to contend with the rise of Measles as another potential killer of their children, along with Whooping Cough (Pertussis) and others; but, as will be revealed in the next chapter, even these eventually became rather tame.

Again, like all the other great contagions, when Scarlet Fever rose to deadly prominence it knew no social boundaries. When Scarlatina came to visit, it didn’t matter how poor or well off you were; she could still knock on your family’s door. For instance, as highlighted earlier in Creighton’s (1894) discussion of the devastation wrought by Scarlet Fever in Ireland at the beginning of the 1800s, even some of the more affluent families were thinned by its impact. Indeed, the devastation often touched the families of some of our best-known figures of historical renown, as highlighted in the following excerpt:

Scarlet fever–past and present

AETIOLOGY

Children were always the worst affected, and proved to be highly susceptible. Charles Darwin lost two children to scarlet fever in the 1850s. Scarlet fever is also believed to have caused the 19-month old Helen Keller to lose her hearing and sight. John Rockefeller lost a two-year old grandson to scarlet fever, which is why Rockefeller University remains one of the world’s leading biomedical research centers in the world today.

Smith, T.C. (2011)

[112]

http://aetiologyblog.com/2011/07/06/scarlet-fever-in-hong-kong/

Scarlet Fever was so common and so dreaded in childhood that it worked its way into children’s literature such as Little Women and The Velveteen Rabbit. Below is a short excerpt from the latter.

SYNOPSIS

The Velveteen Rabbit, or How Toys Become Real tells the story of a stuffed rabbit made of velveteen…

And then, one day, the Boy was ill.

His face grew very flushed, and he talked in his sleep, and his little body was so hot that it burned the Rabbit when he held him close.  Strange people came and went in the nursery, and a light burned all night and through it all the little Velveteen Rabbit lay there, hidden from sight under the bedclothes, and he never stirred, for he was afraid that if they found him some one might take him away, and he knew that the Boy needed him…

Presently the fever turned, and the Boy got better.  He was able to sit up in bed and look at picture books, while the little Rabbit cuddled close at his side.  And one day, they let him get up and dress

The Boy was going to the seaside tomorrow.  Everything was arranged, and now it only remained to carry out the doctor’s orders.  They talked about it all, while the little Rabbit lay under the bedclothes, with just his head peeping out, and listened.  The room was to be disinfected, and all the books and toys that the Boy had played with in bed must be burnt.

“Hurrah!” thought the little Rabbit.  “Tomorrow we shall go to the seaside!”

…Just then Nana caught sight of him.

“How about this old Bunny?” she asked.

“That?” said the doctor.  “Why, it’s a mass of scarlet fever germs!– Burn it at once.

Williams, M., (1922, 33-36)

[113]

https://archive.org/stream/thevelveteenrabb11757gut/11757.txt

Thankfully, both the Boy and, it seems, the Rabbit survive the ordeal with Scarlet Fever, leaving the reader with some hope that the disease could be got over without lasting harm. And hope was certainly needed: at the coal face, so to speak, there was a great deal of pessimism, as we felt wholly helpless and unable to prevent this slaughter of the innocent. Below are some quotes from the period which reveal this bewilderment, alongside the hope that medical advances would one day hold the key to such senseless destruction of life.

Chapter 5

The Historiography of Social Medical Improvement

… the situation in February 1885 remained bleak:

The prevention of scarlet fever is as yet an unsolved problem. I trust such men as Pasteur and Koch will turn their attention to it; my only hopes of a satisfactory answer lie in that direction…

Davies was not alone in his pessimism…:

Yet, as knowledge and administrative resources now stand, official powers of preventing this murderous disease are, practically speaking, insignificant; and such general advice as may be given for individual preventive purposes has so little likelihood of being applied except in select cases, that, as regards the main mass of sufferers, it may seem almost insincere and derisory…

Bristol Historical Resource (2000)

[114]

http://humanities.uwe.ac.uk/bhr/Main/abstract_health/Health_5.htm

Creighton draws our attention to the remarkable pattern of this disease as it rose from relative meekness to lethal virulence for several decades from the mid-19th Century, and he also describes how it began to return to its milder form by the 1880s within England and Wales. Bear in mind that Ireland, Scotland and many other regions follow this same pattern of peaks and troughs over the same period.

A History of Epidemics in Britain

Vol. II

The enormous number of deaths from scarlatina during some thirty or forty years in the middle of the 19th century will appear in the history as one of the most remarkable things in our epidemiology. There can be no reasonable doubt that this scarlatinal period was preceded by a whole generation with moderate or small mortality from that disease, just as it is now being followed by annual death-rates which are less than a half, perhaps not more than a third, of the average during forty years before 1880.

Creighton, C.  (1894, 72)

[115]

https://www.gutenberg.org/files/43671/43671-h/43671-h.htm#Page_693

As tabulated by Creighton (1894), although Scarlet Fever’s reign of terror continued to ebb, the statistics also demonstrate that Measles, and to some extent Diphtheria, began to pick up where Scarlet Fever left off, at least up to the time of Creighton’s review, published in the last decade of the 19th Century. This pattern is illustrated in the following graph for the era under discussion (Figure 16). As noted above, the ultimate fate of all of these once-deadlier contagions of childhood will be reviewed in more detail, right up to our present era, in the next chapter; for now, we will continue with the fate of Scarlet Fever.

[Figure 16: Scarlet Fever, Measles and Diphtheria deaths, England and Wales]

Fig. 16: Graph generated using tabulations of individual annual death statistics in England and Wales from 1837 to 1880 compiled by Creighton, C. (1894) A History of Epidemics in Britain, Volume II, From the Extinction of Plague to the Present Time, pp. 722-3, Cambridge University Press, Cambridge. https://www.gutenberg.org/files/43671/43671-h/43671-h.htm#Page_693
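For anyone curious to reproduce this kind of chart, a minimal sketch is given below. It assumes Creighton’s tabulated annual deaths have first been transcribed into a spreadsheet; the file name and column names used here (creighton_annual_deaths.csv; year, scarlet_fever, measles, diphtheria) are hypothetical placeholders rather than anything supplied with the original tables.

# A minimal sketch of how a chart like Fig. 16 might be regenerated.
# It assumes Creighton's tables have first been transcribed into a CSV file;
# the file name and column names below are hypothetical placeholders.
import pandas as pd
import matplotlib.pyplot as plt

# Expected columns: year, scarlet_fever, measles, diphtheria
deaths = pd.read_csv("creighton_annual_deaths.csv")

fig, ax = plt.subplots(figsize=(10, 5))
for column in ["scarlet_fever", "measles", "diphtheria"]:
    ax.plot(deaths["year"], deaths[column], label=column.replace("_", " ").title())

ax.set_xlabel("Year")
ax.set_ylabel("Registered deaths, England and Wales")
ax.set_title("Annual deaths, 1837-1880 (after Creighton 1894)")
ax.legend()
plt.tight_layout()
plt.show()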

As we move into the 20th Century, we discover something that the earlier statisticians could not have foreseen: Scarlet Fever ceases to be a major threat to children and becomes a relatively benign disease of childhood. This historical pattern of rise, peak and ultimate decline in Scarlet Fever’s killing power is highlighted in the following commentary, dating to the 1940s, on Britain.

A Century of Changes in the Mortality and Incidence of the Principal Infections of Childhood

At the beginning of the statistical era scarlet fever, which had been increasing in virulence since about 1825, was already a great killing disease though it reached its peak in 1863 with a death-rate of 3,966 per million living under 15. In the ’eighties and ’nineties the mortality of scarlet fever declined rapidly, and this fact accounts to a considerable extent for the reduction in the general death-rates at ages 1-5, 5-10, 10-15 …

The behaviour of scarlet fever, which is generally ascribed to change of type, has never been more adequately explained, and it would be interesting to re-examine the problem in the light of modern knowledge of its bacteriology.

Gale, A. H. (1945, 20).

[116]

https://adc.bmj.com/content/archdischild/20/101/2.full.pdf

This latter point, regarding whether Scarlet Fever changed her type, is answered in more recent times, as you will see further on. In the meantime, below are the updated numbers of annual deaths in Ireland, which clearly show the continued decline in the deadliness of Scarlet Fever to the point where it became a fairly benign childhood infection.

[Figure 17: Scarlet Fever deaths, Ireland]

Fig. 17: Chart of the annual number of deaths in Ireland from Scarlet Fever, from the beginning of official records up to the period when deaths from this disease were no longer registered. Source: chart generated from the annual statistical reports since records began, “Annual Reports on Marriages, Births and Deaths in Ireland, from 1864 to 2000”, courtesy of An Phríomh-Oifig Staidrimh, Central Statistics Office (CSO), link. © Copyright dig-press.com

The main pattern of the rise and fall of Scarlet Fever from 1864, as reflected in the chart above (Fig. 17) showing the number of deaths recorded annually since official records began, is partially mirrored in the historical overview of the disease outlined in the following excerpt:

Scarlet fever–past and present

AETIOLOGY

Historical data suggest at least three epidemiologic phases for scarlet fever. In the first, which appears to have begun in ancient times and lasted until the late eighteenth century, scarlet fever was either endemic (always present at a low level) or occurred in relatively benign outbreaks separated by long intervals.

In the second phase (~1825-1885), scarlet fever suddenly began to recur in cyclic and often highly fatal urban epidemics. In the third phase (~1885 to the present), scarlet fever began to manifest as a milder disease in developed countries, with fatalities becoming quite rare by the middle of the 20th century.

In both England and the United States, mortality from scarlet fever decreased beginning in the mid-1880s. By the middle of the twentieth century, the mortality rate from scarlet fever again fell to around 1%.

Smith, T.C. (2011)

[117]

http://aetiologyblog.com/2011/07/06/scarlet-fever-in-hong-kong/

A drop to around 1 per cent in the death rate from Scarlet Fever by the middle of the 20th Century is fairly spectacular. The pattern seen in the above chart of deaths from Scarlet Fever in Ireland (Fig. 17) corresponds very closely with the historical documentation for the same timeframe charted for England and Wales and elsewhere.

Moreover, Scarlet Fever cases (infections) remained very prevalent at a time when deaths from the disease had become very rare in our more modern era (the 1950s and 60s). This is clearly charted by the steep decline in deaths from Scarlet Fever, set against the correspondingly high incidence of cases recorded in official statistics for infants and children in England and Wales even up to the 1940s, as illustrated in ‘A Century of Changes in the Mortality and Incidence of the Principal Infections of Childhood’, fig. 14 (Gale, A.H., Medical Officer, Ministry of Education, 1945) http://adc.bmj.com/content/archdischild/20/101/2.full.pdf [118].
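It is worth being explicit about the distinction this observation rests on: deaths in the population can collapse even while infections remain widespread, because what actually fell was the case-fatality rate (deaths per recorded case) rather than the incidence. The short sketch below simply illustrates that distinction; the helper functions and every figure in it are purely illustrative and are not drawn from Gale’s or Creighton’s tables.

# A minimal sketch of the distinction the argument above rests on: a population
# death rate (deaths per million living) can collapse even while infections stay
# common, because what fell was the case-fatality rate (deaths per recorded case).
# Every number below is purely illustrative and is not taken from Gale or Creighton.

def death_rate_per_million(deaths, population):
    """Annual deaths per million people in the population."""
    return deaths / population * 1_000_000

def case_fatality_rate_percent(deaths, cases):
    """Percentage of recorded cases that ended in death."""
    return deaths / cases * 100

# Hypothetical mid-20th-century-style year: plenty of cases, very few deaths.
recorded_cases = 100_000
recorded_deaths = 500
children_under_15 = 10_000_000

print(death_rate_per_million(recorded_deaths, children_under_15))   # 50 per million
print(case_fatality_rate_percent(recorded_deaths, recorded_cases))  # 0.5 per cent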

Again, this close correspondence in the decline in mortality from Scarlet Fever over the same timeframe can be seen in the graphs presented in Figure 4.15 (Scarlet Fever) from the turn of the 20th Century until modern times, with the overwhelming majority of the decline in deaths (over 99 per cent) occurring in the period 1901-45, as documented in ‘Atlas of Epidemic Britain: A Twentieth Century Picture’, Smallman-Raynor, M., and Cliff, A. (2012) [119] https://books.google.ca/books?id=iMnN4fZrj70C&pg=PA48#v=onepage&q&f=false

This near-simultaneous decline in deaths from Scarlet Fever is also illustrated for the United States over a similar timeframe in the graph (figure 1) showing deaths per 100,000 from Rheumatic Fever and Scarlet Fever https://www.econ.ucla.edu/costa/figureschicago.pdf in ‘Health at Older Ages: The Causes and Consequences of Declining Disability among the Elderly’ (Costa, D.L., 2009) [120]. Also see the graphs in ‘Mortality in the United States, 1900-1950’ (Gordon, T., 1953, figure 3) https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2024011/ [121].

It would be difficult to attribute this rather dramatic decline in deaths from Scarlet Fever to anything other than a natural cause. Scarlet Fever is a bacterial disease and can therefore be tackled with antibiotics, yet antibiotics only became widely available after World War II, around the mid-1940s (see excerpt below), by which time Scarlet Fever was already a relatively tame infection for the vast majority of children. Nor was a vaccine ever fully developed to combat it (this will be reviewed further on).

About Antimicrobial Resistance

Brief History of Antibiotics

Penicillin, the first commercialized antibiotic, was discovered in 1928 by Alexander Fleming.  While it wasn’t distributed among the general public until 1945, it was widely used in World War II for surgical and wound infections among the Allied Forces.  It was hailed as a “miracle drug” and a future free of infectious diseases was considered.  When Fleming won the Nobel Prize for his discovery, he warned of bacteria becoming resistant to penicillin in his acceptance speech.

CDC (2015)

[122]

https://www.cdc.gov/drugresistance/about.html

As indicated above, Fleming’s warning has unfortunately turned out to be quite correct: we are now battling superbugs brought on by the over-use of such treatments against relatively trivial infections, and those bacterial critters have apparently adapted to just about everything we can throw at them. As noted above, vaccination cannot have had any impact on the decline in deaths from Scarlet Fever either; although a Scarlet Fever vaccine was being developed in the early days, it never made it into wider public use apart from a few experimental attempts, as documented in the following excerpts, which should give you an idea of its history and why it was never successfully marketed:

The Rockefeller University, Harvey Society Lectures. 22.

Aetiology of Scarlet Fever

Krumwiede, Nicoll and Pratt … in 1914 observed an accidental infection of a laboratory worker, who sucked into her mouth a … mixture of living streptococci containing Streptococcus scarlatinae…

In 1923 the same workers … repeated their efforts to produce scarlet fever in human volunteers.

In the second series of observations a haemolytic streptococcus obtained from the infected finger of a nurse suffering from wound scarlet fever was used for purposes of inoculation.

Five volunteers were inoculated by swabbing the tonsils and pharynx with four-day-old cultures of the streptococcus in question.

Three of these individuals remained without evidence of infection and one suffered from sore throat and fever without a rash.

Dochez, A.R., (1926).

https://pdfs.semanticscholar.org/c85b/153d6901f8dc262982ded19d13a86154c1a4.pdf

[123]

…Some years later…

Alphonse Raymond Dochez

April 21, 1882-June 30, 1964

National Academy of Sciences, Washington D.C

…Dochez injected melted agar under the skin of a young pig and inoculated it with his Pr 1 strain of streptococcus. A few days later, at about eleven o’clock at night, he called me excitedly on the phone to come up to his laboratory. There was his little pig, as rosy as a boiled lobster! The logical next step was to immunize a horse by the same procedure, and this Doh, as he was familiarly called, started in the stables of the Rockefeller Institute.”

… An epidemic of scarlet fever in New Haven provided the first large-scale opportunity to test the serum, and its beneficial effects were enthusiastically reported by a number of observers. Unfortunately, Dochez’s studies of streptococcal infections were terminated by a legal decision handed down in favor of George F. and Gladys H. Dick of Evanston, Illinois. They had obtained British patent 243675, dated November 28, 1924, and U.S. patent 1,547,369, dated July 28, 1925, making broad claims as to the isolation of streptococci specific to scarlet fever, the preparation of a scarlatinal toxin, the injection of animals to obtain an antitoxin, and the antitoxin itself.

Dochez had earlier taken out British patent 232181, dated April 14, 1924, but his U.S. patent 1,585,090, dated May 18, 1926, and assigned to the Presbyterian Hospital of New York, was issued after that of the Dicks.

http://www.nasonline.org/publications/biographical-memoirs/memoir-pdfs/dochez-alphonse.pdf

[124]

Heidelberger, M., Kneeland, Y. Jr., and Mills Price, K. (1971)

According to the records, the legal battle continued right up to the 1940s, around the same time that antibiotics were beginning to work their miracle; for certain individuals, these would have been a most welcome rescue. However, as noted above, at a population level deaths from Scarlet Fever had already become quite rare by the time this new medical intervention became more widely available, so it cannot account for the overall decline in deaths from the disease, as supported by the historical and statistical data presented thus far.

Indeed, even today, Scarlet Fever is listed as a fairly benign infection, and considering that a vaccine was never available to combat this childhood infection, this is reassuring, as the advice below indicates.

Scarlet fever: Overview

Prevention

There is no scarlet fever vaccine. But it’s a good idea to take the same kind of general precautions you would take to avoid colds or respiratory infections. These include washing your hands often and avoiding close contact with anyone who has scarlet fever and might still be contagious.

A lot of children carry scarlet fever germs in their throat without getting ill, especially in the winter. These children are usually not contagious, even if they are shown to have the germs. So there’s no need to take measures to prevent infection in those cases.

Informed Health Online (2014)

https://www.ncbi.nlm.nih.gov/books/NBK279620/

[125]

Interestingly, Scarlet Fever appears to be one of those diseases where having had it once does not mean you are immune for life, as indicated in the following excerpt.

Scarlet fever: Overview

Unlike with many other childhood diseases, having had scarlet fever in the past doesn’t make you immune to future scarlet fever infections. So people can have it more than once in their lives.

Informed Health Online (2014)

https://www.ncbi.nlm.nih.gov/books/NBK279620/

[126]

Considering that this form of immunity requires continual boosting to keep you resistant to repeated attacks, perhaps Scarlet Fever needs plenty of silent carriers (like TB and Typhoid) to keep the pathogen circulating and quietly infecting the population, largely without their knowing (at least in our more modern developed nations). This type of carrier-based resistance is indicated in the following:

Scarlet fever: the disease in the UK

The Pharmaceutical Journal

Figures suggest that up to 40 per cent of the population are asymptomatic carriers, with low infectivity and little risk of developing complications.

Marshall, S. (2006)

https://www.pharmaceutical-journal.com/opinion/comment/scarlet-fever-the-disease-in-the-uk/10001690.fullarticle

[127]

Then Scarlet Fever returned to taunt our children here in the 21st Century. We had forgotten just how tame this pathogen had become, and when it suddenly reemerged we simply equated its return in our modern world with a return to the kind of fatalities seen in the 19th and earlier 20th Centuries. It was a global comeback out of seemingly nowhere, and its impact left everyone scratching their heads.

THE ONCE-DEADLY SCARLET FEVER IS MAKING A WEIRD COMEBACK AROUND THE WORLD

What is happening!

After decades of decline, scarlet fever is once again on the rise in the UK and other places around the world, and doctors are scrambling to figure out why.

Beginning in 2014, the infection started to steadily rise, and in 2016, over 19,000 cases from 620 outbreaks were reported, mostly in schools and nurseries. This represents a seven-fold increase since 2011.

Starr, M., (2017)

[128]

https://www.irishtimes.com/life-and-style/health-family/scarlet-fever-is-back-in-the-21st-century-1.3291330

You can begin to imagine the fear as we realised that this was the very same disease, with the very same symptoms, as the Scarlet Fever of old. Would our children start dying from the disease in their thousands, as in the tragic stories of the dark days of the Victorian era emblazoned on our imaginations?

 Scarlet fever: the disease in the UK

The Pharmaceutical Journal

…it may not sound terrible based on those symptoms, but it was responsible for 36,000 registered deaths in the first decade of the 20th century in England and Wales, and was a leading cause of child mortality…

Marshall, S.  (2006)

https://www.pharmaceutical-journal.com/opinion/comment/scarlet-fever-the-disease-in-the-uk/10001690.fullarticle

[129]

This was a true pandemic, spreading almost worldwide. It was horrifying for the poor parents of infants and children who caught the disease, as they remembered the stories of the massive death toll once exacted by the dreaded strawberry tongue.

Scarlet Fever, a Disease of Yore, Is Making a Comeback

The reason for the sudden surge remains a mystery,

Scientific American

Scarlet fever, a disease that struck fear into the heart of parents when cases surged in the days of yore, appears to be making an unexpected and puzzling comeback in parts of the world. England and Wales have seen a substantial rise in scarlet fever cases starting in 2014.

The number of cases tripled from 2013 and continued to increase in 2015 and 2016, with England and Wales last year recording the highest number of cases there in a half-century, British scientists reported Monday in the journal Lancet Infectious Diseases. Similar and in some cases even larger surges of scarlet fever have been reported in recent years in South Korea, Vietnam, China, and Hong Kong. Hong Kong, which saw a tenfold rise in cases, continues to report increased annual counts five years after the resurgence was first noticed.

The reason for the sudden and surprising increase is a mystery. And the authors of a commentary that accompanied the article urge other countries to be on the lookout for similar spikes in cases.

Branswell, H. (2017)

[130]

https://www.scientificamerican.com/article/scarlet-fever-a-disease-of-yore-is-making-a-comeback/

However, as the pandemic continued to sweep through our now-developed nations, its impact proved nowhere near as devastating as we had begun to anticipate. Different causes for the mildness of this modern pandemic were considered, as seen in the following.

Scarlet fever: the disease in the UK

The Pharmaceutical Journal

…The most obvious reason for a resurgence in a bacterial infection would be a new strain of the disease that spreads more easily and is possibly antibiotic-resistant – but molecular genetic testing has ruled this out.

Instead, tests showed a range of already established strains of the bacteria, leaving researchers still looking for a possible cause.

Meanwhile, the 2016 statistics put incidence at 33.2 cases per 100,000 people, with 1 in 40 cases being admitted to hospital (although around half of those get discharged the same day).

Marshall, S. (2006)

https://www.pharmaceutical-journal.com/opinion/comment/scarlet-fever-the-disease-in-the-uk/10001690.fullarticle

[131]
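As a rough sanity check, the incidence figure quoted above squares rather well with the case count of over 19,000 reported earlier. The short sketch below works through that arithmetic; note that the England and Wales population figure used is an assumed round number, not one taken from any of the excerpts.

# A rough consistency check on the figures quoted above: 33.2 cases per 100,000
# and 1 in 40 cases admitted to hospital. The population figure is an assumed
# round number for England and Wales in 2016, not one given in the excerpts.
population = 58_000_000
incidence_per_100k = 33.2
admission_fraction = 1 / 40

implied_cases = incidence_per_100k / 100_000 * population
implied_admissions = implied_cases * admission_fraction

print(round(implied_cases))       # roughly 19,000, matching the case count reported earlier
print(round(implied_admissions))  # roughly 480 hospital admissions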

This posed the obvious question: if it was not a new strain, but essentially the Scarlet Fever of old, the pathogen that once killed so many, then what on earth was going on? The fact that it was the same disease, yet, amazingly, produced hardly any deaths and only rather brief, typically uneventful hospital visits, left researchers even more puzzled than before, as indicated in the following excerpt:

Scarlet Fever, a Disease of Yore, Is Making a Comeback

The reason for the sudden surge remains a mystery,

Scientific American

“The strains didn’t give us the answer. We were really pinning our hopes on those, because that’s the most obvious answer,” she noted. “We’re left thinking what on earth it could be. We don’t have an answer at the moment.” Even though scarlet fever does not have to be reported to the CDC, Lamagni said a surge in the United States would be hard to miss. “If they were seeing what we’re seeing, they would know about it. It is unusual,” she said.

Branswell, H. (2017)

[132]

https://www.scientificamerican.com/article/scarlet-fever-a-disease-of-yore-is-making-a-comeback/

We now know that the pathogen has not changed genetically: it is still the same pathogen that killed thousands annually back in the day. Perhaps the answer to this quandary over Scarlet Fever’s return in our fully modern era points to another cause entirely.

It looks very likely that our immune systems have become highly educated with regard to this particular pathogen, and that we have achieved this through continuous exposure and boosting (recall that around 40 per cent of us are silent carriers) over a very long time. This would provide very long-term resilience to the pathogen across the generations, and it would be perfectly natural, if the disease returned in our modern era, for it to behave exactly as it did: giving us all a much-needed booster, lest our immune systems forget. See it as a kind of fire drill, a way of keeping our defences up to date. Is this, then, why Scarlet Fever was a whole lot less lethal when it returned in our more modern era?

The most reassuring aspect of this discussion, therefore, is that our immune systems have apparently been quietly working away in the background, maintaining this protection all the while. That resilience certainly stood our children in good stead, and it was fully put to the test when Scarlet Fever returned in the 21st Century and the dreaded deaths and horrors of the Victorian era did not descend upon them.

Fortunately, it would appear from the evidence presented above that, with all the generations of exposure (albeit largely invisible and silent) prior to this event, we were ready for Scarlatina’s return, even if we did not know it at the time. It does make you think, though: what if we had successfully developed a vaccine for wider use? How would that have affected our natural exposure to, and subsequent boosting from, the real background pathogen? Had such a vaccine succeeded (and not become caught up in litigation), it might unwittingly have shielded our infants, children and mothers from the very exposure and boosting required to prevent the infection getting the upper hand in the first place.


References

[109] Creighton, C. (1894) A History of Epidemics in Britain, Volume II, From the Extinction of Plague to the Present Time, p. 678, Cambridge University Press, Cambridge. https://www.gutenberg.org/files/43671/43671-h/43671-h.htm#Page_693

[110] Creighton, C. (1894) A History of Epidemics in Britain, Volume II, From the Extinction of Plague to the Present Time, pp. 722-3, Cambridge University Press, Cambridge. https://www.gutenberg.org/files/43671/43671-h/43671-h.htm#Page_693

[111] Smith, T.C. (2011) Scarlet fever–past and present, Aetiology Blog (July 6th 2011) [Available online at aetiologyblog.com] http://aetiologyblog.com/2011/07/06/scarlet-fever-in-hong-kong/

[112] Smith, T.C. (2011) Scarlet fever–past and present, Aetiology Blog (July 6th 2011) [Available online at aetiologyblog.com] http://aetiologyblog.com/2011/07/06/scarlet-fever-in-hong-kong/

[113] Williams, M. (1922) The Velveteen Rabbit, or How Toys Become Real, Doubleday & Company, Inc., New York. [Available on Project Gutenberg] https://archive.org/stream/thevelveteenrabb11757gut/11757.txt

[114] Bristol Historical Resource (2000) Chapter 5: The Historiography of Social Medical Improvement, in (eds) Ian Archer, Spencer Jordan, Keith Ramsey, Peter Wardley and Matthew Woollard [Available online at the BHR website] http://humanities.uwe.ac.uk/bhr/Main/abstract_health/Health_5.htm

[115] Creighton, C. (1894) A History of Epidemics in Britain, Volume II, From the Extinction of Plague to the Present Time, p. 72, Cambridge University Press, Cambridge. https://www.gutenberg.org/files/43671/43671-h/43671-h.htm#Page_693

[116] Gale, A. H. (1945). A Century of Changes in the Mortality and Incidence of the Principal Infections of Childhood. Archives of Disease in Childhood, Vol. 20, [101], pp. 2–21. http://adc.bmj.com/content/archdischild/20/101/2.full.pdf

[117] Smith, T.C. (2011) Scarlet fever–past and present, Aetiology Blog (July 6th 2011) [Available online at aetiologyblog.com] http://aetiologyblog.com/2011/07/06/scarlet-fever-in-hong-kong/

[118] Gale, A. H. (1945). A Century of Changes in the Mortality and Incidence of the Principal Infections of Childhood. Archives of Disease in Childhood, Vol. 20, [101], pp. 2–21. http://adc.bmj.com/content/archdischild/20/101/2.full.pdf

[119] Smallman-Raynor, M, and Cliff, A (2012), Atlas of Epidemic Britain: A Twentieth Century Picture, Oxford University Press, Oxford. p.50, figure 4:18 (Measles); p. 52, figure 4:24 (Whooping Cough); p.49, figure 4:15 (Scarlet Fever). https://books.google.ie/books

[120] Costa, D. L. (2009) Health at Older Ages: The Causes and Consequences of Declining Disability among the Elderly, in (eds.) David M. Cutler and David A. Wise, Selection from a published volume from the National Bureau of Economic Research, University of Chicago Press [Available online at the National Bureau of Economic Research NBER] http://www.nber.org/chapters/c11109.pdf

[121] Gordon, T. (1953) Mortality in the United States, 1900-1950. Public Health Reports, Vol. 68 [4], pp. 441–444. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2024011/ with links to pdf of the timeline of the introduction of each antibiotic type and the period of its resistance. https://www.cdc.gov/drugresistance/pdf/5-2013-508.pdf

[122] CDC (2015) About Antimicrobial Resistance, Brief History of Antibiotics, Centers for Disease Control and Prevention (CDC) website, https://www.cdc.gov/drugresistance/about.html

[123] Dochez, A.R. (1926) The Rockefeller University, Harvey Society Lectures. 22, Aetiology of Scarlet Fever, https://pdfs.semanticscholar.org/c85b/153d6901f8dc262982ded19d13a86154c1a4.pdf

[124] Heidelberger, M., Kneeland, Y. Jr., and Mills Price, K. (1971), Alphonse Raymond Dochez, April 21, 1882-June 30, 1964, National Academy of Sciences, Washington, D.C. http://www.nasonline.org/publications/biographical-memoirs/memoir-pdfs/dochez-alphonse.pdf

[125] Informed Health Online (2014) Scarlet fever: Overview, Cologne, Germany: Institute for Quality and Efficiency in Health Care (IQWiG); https://www.ncbi.nlm.nih.gov/books/NBK279620/

[126] Informed Health Online (2014) Scarlet fever: Overview, Cologne, Germany: Institute for Quality and Efficiency in Health Care (IQWiG), [Updated 2017 Apr 6] Available from: https://www.ncbi.nlm.nih.gov/books/NBK279620/

[127] Marshall, S., (2006) Scarlet fever: the disease in the UK, The Pharmaceutical Journal (July Issue 2006). https://www.pharmaceutical-journal.com/opinion/comment/scarlet-fever-the-disease-in-the-uk/10001690.fullarticle

[128] Starr, M., (2017) The Once-Deadly Scarlet Fever is making a Weird Comeback around the World – What is happening! Irish Times, (November 29th 2017) https://www.irishtimes.com/life-and-style/health-family/scarlet-fever-is-back-in-the-21st-century-1.3291330

[129] Marshall, S., (2006) Scarlet fever: the disease in the UK, The Pharmaceutical Journal (July Issue 2006). https://www.pharmaceutical-journal.com/opinion/comment/scarlet-fever-the-disease-in-the-uk/10001690.fullarticle

[130] Branswell, H. (2017) Scarlet Fever, a Disease of Yore, Is Making a Comeback – The reason for the sudden surge remains a mystery, Scientific American (November 28th 2017). https://www.scientificamerican.com/article/scarlet-fever-a-disease-of-yore-is-making-a-comeback/

[131] Marshall, S. (2006) Scarlet fever: the disease in the UK, The Pharmaceutical Journal (July Issue 2006). https://www.pharmaceutical-journal.com/opinion/comment/scarlet-fever-the-disease-in-the-uk/10001690.fullarticle

[132] Branswell, H. (2017) Scarlet Fever, a Disease of Yore, Is Making a Comeback – The reason for the sudden surge remains a mystery, Scientific American (November 28th 2017). https://www.scientificamerican.com/article/scarlet-fever-a-disease-of-yore-is-making-a-comeback/


CHAPTER EIGHT: The Almost Universal Decline in Deaths from some of the Deadliest Contagions Known to Children

