
Anti-vaxxers in Free Fall

I read a news story in which a man is dying of Covid-19 in the hospital.  He is asked whether he regrets not getting vaccinated and rallies enough to reply, “No, I don’t believe in the vaccine.”  So what then does he believe in?  Systemic viral infection, suffering, and death?  If you don’t believe in vaccination, you don’t believe in modern medicine in toto.  You don’t believe in bacteriology, virology, cellular biology, microbiology, or immunology.  What then is left to prevent, diagnose, and treat disease?  Trump-ish medievalism, mysticism, shamanism, divine intervention?

A study by researchers at Harvard’s Brigham and Women’s Hospital used natural language processing to comb through 5,307 electronic patient records of adult type 2 diabetics living in Massachusetts and followed by their primary care physicians between 2000 and 2014.  They found that 43% (2,267) of patients refused to begin insulin therapy when their doctors recommended it.  Further, diabetics who declined the recommendation not only had higher blood sugar levels than those who began insulin, but had greater difficulty achieving glycemic control later on.[1]  So what do the insulin-declining diabetics believe in?  Chronic heart and kidney disease, blindness, and amputation – the all but inevitable sequelae of poorly managed diabetes?

The problem, really an epistemological problem, is that such people apparently have no beliefs at all – unless one imputes to them belief in disease, suffering, and death, and, in the case of Covid vaccine resisters, the prerogative to inflict them on others.  This is not the same as holding a scientifically specious belief system that unintentionally harms others.  During the yellow fever epidemic that left Philadelphia in ruins in 1793, Dr. Benjamin Rush, highly acclaimed throughout the newborn nation, set about his curative mission by draining his patients, in successive bleedings, of up to four pints of blood while simultaneously purging them (i.e., evacuating their bowels) with copious doses of toxic mercury.

Rush’s “Great Purge,” adopted by his followers, added hundreds, perhaps thousands, to the death toll in Philadelphia alone.  But at least Rush’s “system” derived from a belief system.  He did in fact find a theoretical rationale for his regimen in an essay by the Virginia physician and mapmaker John Mitchell.  Describing the Virginia outbreak of 1741, Mitchell noted that in yellow fever the “abdominal viscera were filled with blood, and must be cleaned out by immediate evacuation.”[2]  Bleeding, of course, was conventional treatment for all manner of disease in 1793, so Mitchell’s recommendation came as no surprise.  Taken in conjunction with the system of mercuric purges employed by Dr. Thomas Young during the Revolutionary War, it gave Rush all the grounding he required for a ruinously misguided campaign that greatly extended the recovery time of those it did not kill.  But, yes, he had his theory, and he believed in it.

In the early 19th century, Napoleon, sweeping through Europe, conquers the north Italian province of Bolzano, which in 1807 he incorporates into Bavaria.  Two years later, when the Bavarian government mandates smallpox vaccination for all residents, the newly absorbed Italians launch an armed revolt, partly because they believe vaccination will inject Protestantism into their Catholic veins.[3]

All right, it is a nonsensical belief, even in 1809, but it is still a belief of sorts.  It is epistemically flawed, because it fails to stipulate what exactly makes a substance inherently Protestant in nature; nor does it posit a mechanism of transmission whereby a Protestant essence seeps into liquid smallpox vaccine in the first place.  In the realm of ethics, it suggests that the possibility of death pales alongside the certainty of spiritual contamination by a fluid that, however neutral in life-saving potency, is injected by a Protestant hand.

Only slightly less ridiculous to modern ears is the mid-19th-century belief that general anesthesia via ether or chloroform, introduced by James Young Simpson in 1847, must be withheld from women giving birth.  The reason?  Genesis 3:16 enjoins women to bring forth new life in suffering.  Forget that the belief was espoused solely by certain men of the cloth and male physicians,[4] and rested on a highly questionable rendering of the biblical Hebrew.  Forget as well that, for Christians, Christ’s death redeemed humankind, relieving women of the need to relive the primal curse.  Bear in mind further that the alleged curse would also forbid, inter alia, the use of forceps, caesarean operations, and embryotomy.  A woman with a contracted pelvis would die undelivered because she was guilty of a sin over which she had no control – that of having a contracted pelvis.[5]

In a secular nation whose founding documents assert everyone’s right to pursue happiness in his or her own pain-free terms, we see the primal curse as archaic misogynist drivel, no less absurd than the belief that the Bible, through some preternatural time warp, forbids vaccination.  But, hey, it’s a free country, and if a mid-19th-century or early-21st-century man chooses to believe that anesthesia permits men to escape pain whenever possible but women only in male-sanctioned circumstances, so be it.  It is a belief.

Now it’s 1878, and the worst yellow fever epidemic in American history is sweeping across the lower Mississippi Valley, taking lives and destroying commerce in New Orleans, Memphis, and the surrounding cities and towns to which refugees are streaming.  The epidemic will reach the Ohio Valley, bringing deadly Yellow Jack to Indiana, Illinois, and Ohio.  Koch’s monograph on the bacteriology of sepsis (wound infection) appears that very year, and neither his work nor that of Lister is universally accepted in the American South.  Nor would its precepts have counted for much in the face of a viral (not bacterial) invader carried up the Mississippi from Havana.

What can city boards of health do in the face of massive viral infection, suffering, and death?  Beyond imposing stringent new sanitary measures, they can quarantine ships arriving in their harbors until all infected crew members have either died or been removed and isolated.  This will prevent the newly infected from infecting others and crippling cities still further – assuming, that is, a belief system in which yellow fever is contagious and spread from person to person.

But in 1878 Memphis, where by September the epidemic is claiming 200 lives a day, this “modern” belief is widely contested among the city’s physicians.  Some are contagionists, who believe that disease is caused by invisible entities that are transmissible.  But others, greater in number, favor the long-held theory that infectious disease results from “miasma” or bad air – air rendered toxic by decaying plant and animal matter in the soil.  If you believe miasma causes disease, then you’re hard-pressed to understand how quarantining ships laden with sick people will do anything to control the epidemic.

This was precisely the position of the 32 Memphis physicians who defeated the city council’s plan to institute a quarantine and set up a quarantine station.  Quarantine is pointless in the face of bad air.  The city’s only recourse, so held the 32, was to alter the “epidemic constitution” of the atmosphere by inundating it with smoke.  Cannon blasts and blazing barrels of tar up and down city streets – that’s the ticket to altering the atmospheric conditions that create infectious disease.[6]

The miasmic theory of disease retained a medical following throughout the 1870s, after which it disappeared in the wake of bacteriology.  But in Memphis in 1878, bad air was still a credible theory in which physicians could plausibly believe.  And this matter of reasonable belief – reasonable for a particular time and place – takes us back to the hospitalized Covid patient of 2021 who, with virtually his last breath, defends his decision to remain unvaccinated because he doesn’t believe in the vaccine.  What is the knowledge base that sustains his disbelief?  There isn’t any.  He has no beliefs, informed or otherwise, about bacteriology, virology, cellular biology, or immunology.  At best, he has decided to accept what someone equally belief-less has told him about Covid vaccination, whether personally, in print, or over the internet.

It is no different among the 43% of Massachusetts diabetics who, a century after Banting’s and Best’s miraculous discovery, declined insulin therapy when their doctors recommended it.  Their disbelief is actually a nonbelief because it is groundless.  For some, perhaps, the refusal falls back on a psychological inability to accept that one is diabetic enough to warrant insulin.  They resist the perceived stigma of being insulin-dependent diabetics.[7]  Here at least the grounds of refusal are intelligible and remediable.  An insulin phobia does not sustain real-world belief; it is an impediment to such belief in relation to diabetes and insulin, illness and long-term health, lesser and greater life expectancy.

Back in the present, I read another news story in which two unvaccinated hospital nurses explain to a journalist that they have refused Covid vaccination because the vaccines’ effectiveness is based on “junk data.”  Really?  Here there is the glimmering of a belief system, since scientific data can be more or less robust, more or less supportive of one or another course of action.

But what exactly makes Covid vaccine data worthless, i.e., junk?  And how have these two nurses acquired the expertise in epidemiology, population statistics, and data analysis to pass judgment on data deemed credible and persuasive by scientists at Pfizer, Moderna, Johnson & Johnson, the CDC, and the WHO?  And how, pray tell, have they gained access to these data?  Like all opponents of vaccine science, they pontificate out of ignorance, as if the mere act of utterance conferred truth-value on what is uttered.  It’s an extreme example of asserting as fact what remains to be demonstrated (the fallacy of petitio principii), the legacy of an ex-president who elevated pathological lying to a political art form.

Even the nurses pale alongside the anti-vax protester who is pictured in a news photo holding a sign that reads, “Vaccines Kill.”[8]  Whom do they kill and under what circumstances?  Does he mean all vaccines are deadly and kill people all the time, or just certain vaccines, such as the Covid vaccine?  But what does it matter?  The sign holder doesn’t know anything about any vaccines.  Does he really believe that everything we know about the history of vaccine science from the time of Jenner is bogus, and that children who once died from smallpox, cholera, yellow fever, diphtheria, pertussis, typhoid, typhus, tetanus, and polio are still dying in droves, now from the vaccines they receive to protect them from these infectious diseases during the earliest years of life?  Is the demographic fact that, owing to vaccination and other public health measures, life expectancy in the U.S. has increased from 47 in 1900 to 77 in 2021 also based on junk data?  In my essay, “Anti-vaccinationism, American Style,” I provide statistics on the total elimination in the U.S. of smallpox and diphtheria, and the virtual elimination of polio.  Were my claims also based on junk data?  If so, I’d appreciate being directed to the data that belie these facts and demonstrate that, in point of fact, vaccines kill.

Maybe the man with the sign has an acquaintance who got sick from what he believed to be a vaccine?  Perhaps someone on his internet chat group heard of someone else who became ill, or allegedly died, after receiving a vaccine.  Of course, death can follow vaccination without being caused by it.  Do we then assume that the man with the sign and like-minded protesters are well-versed in the difference between causation and correlation in scientific explanation?

We know that for a tiny number of individuals aspirin kills.[9]  So why doesn’t the man hold up a sign that reads, “Aspirin Kills”?  Here at least, he would be calling attention to a scientific fact that people with GI conditions should be aware of.  We know that sugary drinks have been linked to 25,000 deaths in the U.S. each year.  Why not a sign, “Soda Kills”?  It would at least be based on science.  He chooses not to proclaim the lethality of aspirin or soda because he cares no more about aspirin- or soda-related deaths than Covid-related deaths.  If he did, then, like the two nurses with their junk data and the Covid patient announcing disbelief in Covid vaccination on his deathbed, he would have to anchor his belief in consensually accepted scientific facts – a belief that someone, anyone, might find believable.

He is no different from other American anti-vaxxers I read about in the paper.  They are the epistemological Luddites of our time, intent on wrecking the scientific machinery of disease prevention, despite profound ignorance of vaccine science and its impact on human affairs since the late 18th century.  Indeed, they see no need to posit grounds of belief of any kind, since their anger – at Covid, at Big Government, at Big Science, at Big Medicine, at Big Experts – fills the epistemic void.  It fuels what they offer in place of the science of disease prevention:  the machinery of misinformation that is their stock in trade.

And therein is the source of their impotence.  They have fallen into an anti-knowledge black hole, and struggle to fashion an existence out of anger that – to push the anti-matter trope a little further – repels rational thought.  Their contrarian charge is small solace for the heightened risks of diseases, suffering, and death they incur, and, far less conscionably, impose on the rest of us.

______________________

[1] N. Hosomura, S. Malmasi, et al., “Decline of Insulin Therapy and Delays in Insulin Initiation in People with Uncontrolled Diabetes Mellitus,” Diabetic Med., 34:1599-1602, 2017.

[2] J. H. Powell, Bring Out Your Dead:  The Great Plague of Yellow Fever in Philadelphia in 1793 (Phila: Univ. of Pennsylvania Press, 1949), 76-78.

[3] My thanks to my friend Marty Meyers for bringing to my attention this event of 1809, as reported by Emma Bubola, “In Italy’s Alps, Traditional Medicine Flourishes, as Does Covid,” New York Times, December 16, 2021.

[4] With reason, wrote Elizabeth Cady Stanton in The Woman’s Bible (1895), “The Bible and the Church have been the greatest stumbling blocks in the way of women’s emancipation.”

[5] For a fuller examination of the 19th-century debate on the use of general anesthesia during childbirth, see Judith Walzer Leavitt, Brought to Bed: Childbearing in America, 1750-1950 (NY:  OUP, 1986), ch. 5.

[6] On the measures taken to combat the epidemic in Memphis, including the rift between contagionist and noncontagionist physicians, see John H. Ellis, Yellow Fever and Public Health in the New South (Lexington: Univ. Press of Kentucky, 1992), ch. 3.

[7] A. Hussein, A. Mostafa, et al., “The Perceived Barriers to Insulin Therapy among Type 2 Diabetic Patients,” African Health Sciences, 19:1638-1646, 2019.

[8] Now, sadly, we have gone from hand-written “Vaccines Kill” signs to highway billboards, e.g., https://www.kxxv.com/hometown/mclennan-county/a-new-billboard-in-west-claims-vaccines-kill.

[9] Patients prescribed aspirin before developing a GI bleed or perforation are prominent among those killed by aspirin.  See A. Lanas, M. A. Perez-Aisa, et al., “A Nationwide Study of Mortality Associated with Hospital Admission Due to Severe Gastrointestinal Events and Those Associated with Nonsteroidal Antiinflammatory Drug Use,” Am. J. Gastroenterol., 100:1685-1693, 2005; S. Straube, M. R. Tramèr, et al., “Mortality with Upper Gastrointestinal Bleeding and Perforation,” BMC Gastroenterol., 8:41, 2009.

Remembering Cholera in the Time of Covid

It came in crippling waves, the deadly intestinal infection that, in 1832 alone, killed 150,000 Americans.  Its telltale symptom was copious watery diarrhea (“rice water”) accompanied by heavy vomiting, loss of internal fluids and electrolytes, and dehydration; hence its name, cholera, from the Greek for “flow of bile.”  Severe cases quickly proceeded to hypovolemic shock and could kill otherwise healthy persons, children and adults alike, within hours.  Historians refer to the cholera epidemics of the 19th century, but the epidemics, especially those of 1832, 1848, and 1866, were in fact pandemics, spreading from nation to nation and continent to continent.

     Orthodox or “regular” physicians of the first half of the 19th century had no clue about its cause, mode of transmission, or remedy, so they speculated wildly in all directions.  Contagionists believed it spread from person to person.  Noncontagionists attributed it to poisonous fumes, termed miasma, which emanated from soil and decaying matter of all kinds.  Some attributed it to atmospheric changes; others thought it a byproduct of fermentation.  Physician-moralists could be counted on to ascribe it to the moral failings of the underclass.  Withal, the regulars resorted, one and all, to “heroic” treatments, with bleeding and toxic chemicals, especially calomel (mercury) and laudanum (opium-laced alcohol), having pride of place.  These treatments only hastened the death of seriously ill patients, which, given the extent of their suffering, may have been an unwitting act of mercy.  Physician-induced deaths fueled the appeal of homeopathy and botanical medicine, far milder approaches that, so their practitioners avowed, caused far fewer deaths than the horrid regular remedies.

Caricature of 1832, depicting cholera sufferer with nightmarish “remedies” of the day.

     The suffering public, seeing the baleful effects of conventional remedies, grew disgusted with doctors.  The Cholera Bulletin, a newsletter put out by a group of New York physicians over the summer of 1832, grimly acknowledged in its July 23 issue the “fierce onslaught” of cholera and doctors alike in felling the afflicted: “For Cholera kills and Doctors slay, and every foe will have its way!”  After a new wave of cholera reached American shores in 1848, New York’s Sunday Dispatch lambasted traditional medical science as “antiquated heathen humbug, utterly unworthy of the middle of the nineteenth century.”  “Cholera was a most terrible affliction,” chimed in the New York Herald a year later, “but bad doctors and bad drugs are worse.  The pestilence might come now and then; physicians we had always with us.”[1]

     And yet, amid such loathing of doctors and their so-called remedies, science marched on in the one domain in which forward movement was possible.  Throughout the second half of the 19th century, cholera was the catalyst that brought Europe and eventually America into the proto-modern era of public health management of infectious disease.  Then as now, measures that safeguard the public from life-threatening infectious disease are good things.  In 1853, after another cholera epidemic reached Edinburgh, there was no political posturing about “rights” – presumably the right of British citizens to get sick and die.  Parliament, the “Big Government” of the day, resolved to go after the one major, recurrent infectious disease for which a vaccine was at hand:  smallpox.  The Vaccination Act of 1853 grew out of this resolve.  Among other things, it instituted compulsory smallpox vaccination, with all infants to be vaccinated within the first three months of life (infants in orphanages were given four months).  Physicians were obligated to send certificates of vaccination to local birth registrars, and parents who did not comply were subject to fines or imprisonment.  The requirement was extended under the Vaccination Act of 1867.[2]

     New York City followed suit a decade later, when the state legislature created the Metropolitan Board of Health.  The Board responded to the outbreak of cholera in 1866 by mandating the isolation of cholera patients and disinfection of their excretions.  When a new epidemic, which travelled from India to Egypt, erupted in 1883, French and German teams descended on Egypt in search of the specific microorganism responsible for cholera.  The prize went to Robert Koch, who isolated the cholera vibrio in January 1884.

     In 1892, when a cholera epidemic originating in Hamburg brought 100 cases to New York, the city mobilized with the full force of the new science of bacteriology.  The Board of Health lost no time in establishing a Division of Pathology, Bacteriology, and Disinfection, which included a state-of-the-art bacteriological laboratory under the direction of Hermann Biggs.  The lab, as we have seen, came into its own in the fight against diphtheria, but it was the threat of cholera that brought it into existence.  A year later, in 1893, Congress passed the National Quarantine Act, which created a national system of quarantine regulations that included specific procedures for the inspection of immigrants and cargos.  It was to be administered by the U.S. Marine Hospital Service, forerunner of the Public Health Service.

     In the late 1840s, the Bristol physician William Budd argued that contaminated sewage was the source of cholera, and in 1854 the physician John Snow traced the source of a cholera outbreak in his Soho, London neighborhood to contaminated well water.  But it was the Hamburg epidemic that proved beyond doubt that cholera was waterborne, and Koch himself demonstrated that water filtration was the key to its control.[3]  Now we rarely hear of cholera, since the water and waste management systems that came into existence in the last century eliminated it from the U.S. and Europe.[4]  Anti-vax libertarians would no doubt take exception to the Safe Drinking Water Act of 1974, which empowers the EPA to establish and enforce national water quality standards.  There it is again, the oppressive hand of Big Government, denying Americans the freedom to drink contaminated water and contract cholera.  Where has our freedom gone?

Caricature of 1866, “Death’s Dispensary,” giving contaminated drinking water as a source of cholera.

     The gods will never stop laughing at the idiocy of humankind.  Here we are in 2021 and, thanks to the foundation laid down by 19th-century scientists, gifted scientists of our own time have handed us, in astoundingly little time, an understanding of the coronavirus, its mode of transmission, and a pathway to prevention and containment.  We have in hand safe and effective vaccines that reduce the risk of infection to minuscule proportions and ensure that, among the immunized, infection from potent new strains of the virus will be mild and tolerable, and certainly not life-threatening.

     Yes, a small percentage of those who receive Covid vaccines will have reactions, and, among them, a tiny fraction will become ill enough to require treatment, even hospitalization.  But they will recover and enjoy immunity thereafter.  Such “risks” pale alongside those incurred by their forebears, who sought protection from smallpox in the time-tested manner of their own ancestors.  In America, a century before the discovery of Edward Jenner’s cowpox-derived vaccine, colonists protected themselves from recurrent smallpox epidemics through inoculation with human smallpox “matter.”  The procedure, termed variolation, was long practiced in parts of Asia and the Ottoman Empire, reaching Britain and America in 1721.  It involved inoculating the healthy with pus scraped from skin ulcers of those already infected, and was informed by the ancient observation that smallpox could be contracted only once in a lifetime.[5]  The variolated developed a mild case of smallpox which, so it was hoped, would confer protection against the ravages of future epidemics.

     And they were essentially right: over 98% of the variolated survived the procedure and achieved immunity.[6]  To be sure, the risk of serious infection was greater with variolation than with Edward Jenner’s cowpox-derived vaccine, but the latter, which initially relied on the small population of English cows that contracted cowpox and on person-to-person inoculation, was a long time in coming.  It took the United States most of the 19th century to produce and distribute an adequate supply of Jennerian vaccine.  Long before the vaccine was widely available, when the death rate from naturally acquired smallpox was roughly 30%,[7] Americans joined Europeans, Asians, and Africans in accepting the risks of variolation.  For George Washington, as we noted, the risks paled alongside the very real risk that the Continental Army would collapse from smallpox:  he had every soldier variolated before beginning military operations at Valley Forge in 1777.[8]

    But here we are in 2021, with many Americans unwilling to accept the possibility of any Covid vaccine reaction at all, however transient and tolerable.  In so doing, they turn their backs on more than two hundred years of scientific progress, of which the successful public health measures in Europe and America spurred by the cholera epidemics form an important chapter.  The triumph of public health, which antedated by decades the discovery of bacteria, accompanied increased life expectancy and vastly improved quality of life wrought by vaccine science, indeed, by science in general. 

     Witness Britain’s Great Exhibition of 1851, a scant three years after the cholera epidemic of 1848. Under the dome of the majestic Crystal Palace, science was celebrated in all its life-affirming possibilities.  In medicine alone, exhibits displayed mechanically enhanced prosthetic limbs, the first double stethoscope, microscopes, surgical instruments and appliances of every kind, and a plethora of pharmaceutical extracts and medicinal juices (including cod liver oil).[9] Topping it off was a complete model of the human body that comprised 1,700 parts.  Science promised better lives and a better future; scientific medicine, which by 1851 had begun to include public health measures, was integral to the promise.

     But here we are in 2021, replete with anti-vaccinationists who choose to endanger themselves, their children, and members of their communities.  They are anti-science primitives in our midst, and I behold them with the same incredulity with which visitors to Jurassic Park beheld living, breathing dinosaurs.  Here are people who repudiate both public health measures (mask wearing, curfews, limits on group gatherings) and vaccination science in a time of global pandemic.  For them, liberty is a primal thing that antedates the social contract, of which our Constitution is a sterling example.  It apparently includes the prerogative to get sick and make others sick to the point of death.  The anti-vaccinationists, prideful in their ignorance and luxuriant in their fantasies of government control, remind me of what the pioneering British anthropologist Edward Tylor termed “survivals,” by which he meant remnants of cultural conditions and mindsets irrelevant to the present.  Dinosaurs, after all, didn’t care about the common good either.


[1] Newsletter and press quotations from Charles Rosenberg, The Cholera Years: The United States in 1832, 1849, and 1866 (Chicago: University of Chicago Press, 1962), 68, 161, 155.

[2] Dorothy Porter & Roy Porter, “The Politics of Prevention:  Anti-Vaccinationism and Public Health in Nineteenth-Century England,” Medical History, 32:231-252, 1988.

[3] Thomas D. Brock, Robert Koch: A Life in Medicine and Bacteriology (Wash, DC: ASM Press, 1998 [1988]), 229-230, 255.

[4] The last reported case of cholera in the U.S. was in 1949.  Cholera, sadly, remains alive and well in a number of African countries. 

[5] In China and Asia Minor, where variolation originated, dried smallpox scabs blown into the nose was the mode of inoculation. 

[6] José Esparza, “Three Different Paths to Introduce the Smallpox Vaccine in Early 19th Century United States,” Vaccine, 38:2741-2745, 2020.

[7] Ibid.

[8] Andrew W. Artenstein, et al., “History of U.S. Military Contributions to the Study of Vaccines against Infectious Diseases,” Military Medicine, 170[suppl]:3-11, 2005.

[9] C. D. T. James, “Medicine and the 1851 Exhibition,” J. Royal Soc. Med., 65:31-34, 1972.

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.



The War on Children’s Plague

In the early 19th century, doctors called it angina maligna (gangrenous pharyngitis) or “malignant sore throat.”  Then in 1826, the French physician Pierre-Fidèle Bretonneau grouped these conditions together as diphtherite.  It was a horrible childhood disease in which severe inflammation of the upper respiratory tract gave rise to a false membrane, a “pseudomembrane,” that covered the pharynx, larynx, or both.  The massive tissue growth prevented swallowing and blocked airways, often leading to rapid death by asphyxiation.  It felled adults and children alike, but younger children were especially vulnerable.  Looking back on the epidemic that devastated New England in 1735-1736, the lexicographer Noah Webster termed it “literally the plague among children.”  It was the epidemic, he added, in which families often lost all, or all but one, of their children.

A century later, diphtheria epidemics continued to target the young, especially those in cities.  Diphtheria, not smallpox or cholera, was “the dreaded killer that stalked young children.”[1]   It was especially prevalent during the summer months, when children on hot urban streets readily contracted it from one another when they sneezed or coughed or spat.  The irony is that a relatively effective treatment for the disease was already in hand.

In 1884, Robert Koch’s assistant, Friedrich Loeffler, published a paper identifying the bacillus – the rod-shaped bacterium Corynebacterium diphtheriae first observed by Edwin Klebs – as the cause of diphtheria.  German scientists immediately went to work, injecting rats, guinea pigs, and rabbits with live bacilli, and then injecting their blood serum – blood from which cells and clotting factors have been removed – into infected animals to see if the diluted serum could produce a cure.  Then they took blood from the “immunized” animal, reduced it to the cell-free blood liquid, and injected it into healthy animals.  The latter, to their amazement, did not become ill when injected with diphtheria bacilli.  This finding was formalized in the classic paper of Emil von Behring and Shibasaburo Kitasato of 1890, “The Establishment of Diphtheria Immunity and Tetanus Immunity in Animals.”  For this, von Behring was awarded the very first Nobel Prize in Medicine in 1901.

Thus the birth of blood serum therapy, precursor of modern vaccines and antibiotics alike.  By the early 1890s, Emile Roux and his associates at the Pasteur Institute discovered that infected horses, not the rabbits used by Behring and Kitasato, produced the most potent diphtheria serum of all.  Healthy horses injected with a heat-killed broth culture of diphtheria, it was found, could survive repeated inoculations with the live bacilli.  The serum, typically referred to as antitoxin, neutralized the highly poisonous substances – the exotoxins – secreted by diphtheria bacteria.

And there was more:  horse serum provided a high degree of protection for another mammal, viz., human beings.  Among people who received an injection of antitoxin, only one in eight developed symptoms on exposure to diphtheritic individuals.  In 1895 two American drug companies, H. K. Mulford of Philadelphia and Parke, Davis of Detroit, began manufacturing diphtheria antitoxin.  To be sure, their drug provided only short-term immunity, but it sufficed to cut the U.S. death rate among hospitalized diphtheria patients in half.  This fact, astonishing for its time, fueled the explosion of disease-specific antitoxins, some quite effective, some less so.  By 1904 Mulford alone had antitoxin preparations for anthrax, dysentery, meningitis, pneumonia, tetanus, streptococcus infections, and of course diphtheria.

Colorful Mulford antitoxin ad from the early 20th century, featuring, of course, the children.

In the era of Covid-19, there are echoes all around of the time when diphtheria permeated the nation’s everyday consciousness.  Brilliant scientists, then and now, deploying all the available resources of laboratory science, developed safe and effective cures for a dreaded disease.  But more than a century ago, the public’s reception of a new kind of preventive treatment – an injectable horse-derived antitoxin – was unsullied by the resistance of massed anti-vaccinationists of the sort whose anti-scientific claims are now amplified by that great product of 1980s science, the internet.

To be sure, in the 1890s and early 20th century, fringe Christian sects anticipated our own selectively anti-science Evangelicals.  It was sacrilegious, they claimed, to inject the blood product of beasts into human arms, a misgiving that did nothing to assuage their hunger for enormous quantities of beef, pork, and lamb.  Obviously, their God had given them a pass to ingest bloody animal flesh.  Saving children’s lives with animal blood serum was apparently a different matter. 

During the summer months, parents lived in anxious expectation of diphtheria every day their children ventured on to city streets.  Their fear was warranted and not subject to the denials of self-serving politicians.  In 1892, New York City’s Health Department established the first publicly funded bacteriological laboratory in the country, and between 1892 and the summer of 1894, the lab proved its worth by developing a bacteriological test for diagnosing diphtheria.  Infected children could now be sent to hospitals and barred from public schools.  Medical inspectors, armed with the new lab tests, went into the field to enforce a plethora of health department regulations. 

Matters were simplified still further in 1913, when the Viennese pediatrician Bela Schick published the results of experiments demonstrating how to test children for the presence or absence of diphtheria antitoxin without sending their blood to a city lab. Armed with the “Schick test,” public health physicians and nurses could quickly and painlessly determine whether or not a child was immune to diphtheria.  For the roughly 30% of New York City school children who had positive reactions, injections of antitoxin could be given on the spot.  A manageable program of diphtheria immunization in New York and other cities was now in place.    

What about public resistance to the new proto-vaccine?  There was very little outside of religious fringe elements.  In the tenement districts, residents welcomed public health inspectors into their flats.  Intrusion into their lives, it was understood, would keep their children healthy and alive, since it led to aggressive intervention under the aegis of the Health Department.[2]  And it was not only the city’s underserved, immigrants among them, who got behind the new initiative.  No sooner had Hermann Biggs, head of the city’s bacteriological laboratory, set in motion the lab’s inoculation of horses and preparation of antitoxin, than the New York Herald stepped forward with a fund-raising campaign that revolved around a series of articles dramatizing diphtheria and its “solution” in the form of antitoxin injections.  The campaign raised sufficient funds to provide antitoxin for the Willard Parker Hospital, reserved for patients with communicable diseases, and for the city’s private physicians as well.  In short order, the city decided to provide antitoxin to the poor free of charge, and by 1906 the Health Department had 318 diphtheria antitoxin stations administering free shots in all five boroughs.[3][4]

A new campaign by New York City’s Diphtheria Prevention Commission was launched in 1929 and lasted two years.   As was the case three decades earlier, big government, represented by state and municipal public health authorities, was not the problem but the solution.  To make the point, the Commission’s publicity campaign adopted military metaphors.  The enemy was not government telling people what to do; it was the disease itself along with uncooperative physicians and recalcitrant parents.  “The very presence of diphtheria,” writes Evelynn Hammonds, “became a synonym for neglect.”[5]     

The problem with today’s Covid anti-vaccinationists is that their opposition to vaccination is erected on a foundation of life-preserving vaccination science of which they, their parents, their grandparents, and their children are beneficiaries.  They can shrug off the need for Covid-19 vaccination because they have been successfully immunized against the ravages of debilitating childhood diseases.  Unlike adults of the late-19th and early-20th centuries, they have not experienced, up close and personal, the devastation wrought summer after summer, year after year, by the diphtheria bacillus.  Nor have they lost children to untreated smallpox, scarlet fever, cholera, tetanus, or typhus.  Nor, finally, have they, in their own lives, beheld the miraculous transition to a safer world in which children stopped contracting diphtheria en masse, and when those who did contract the disease were usually cured through antitoxin injections.

In the 1890s, the citizens of New York City had it all over the Covid vaccine resisters of today.  They realized that the enemy was not public health authorities infringing on their right to keep themselves and their children away from antitoxin-filled syringes. No, the enemy was the microorganism that caused them and especially their children to get sick and sometimes die. 

Hail the supreme common sense that led them forward, and pity those among us for whom the scientific sense of the past 150 years has given way to the frontier “medical freedom” of Jacksonian America.  Anti-vaccinationist rhetoric, invigorated by the disembodied camaraderie of internet chat groups, does not provide a wall of protection against Covid-19.  Delusory thinking is no less delusory because one insists, in concert with others, that infection can be avoided without the assistance of vaccination science.  The anti-vaccinationists need to be vaccinated along with the rest of us.  A healthy dose of history wouldn’t hurt them either.


[1] Judith Sealander, The Failed Century of the Child: Governing America’s Young in the Twentieth Century (Cambridge: Cambridge Univ. Press, 2003), p. 326.

[2] Evelynn Maxine Hammonds, Childhood’s Deadly Scourge: The Campaign To Control Diphtheria in New York City, 1880-1930 (Baltimore: Johns Hopkins University Press, 1999), 84-86.

[3] William H. Park, “The History of Diphtheria in New York City,” Am. J. Dis. Child., 42:1439-1445, 1931.

[4] Marian Moser Jones, Protecting Public Health in New York City: Two Hundred Years of Leadership, 1805-2005 (NY: New York City Department of Health and Mental Hygiene, 2005), 20.                                     

[5] Hammonds, op. cit., p. 206.

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.

Vaccinating Across Enemy Lines

There are periods in American history when scientific progress is in sync with governmental resolve to exploit that progress.  This was the case in the early 1960s, when advances in vaccine development were matched by the Kennedy Administration’s efforts to vaccinate the nation and improve the public’s health.  And the American public wholeheartedly supported both the emerging generation of vaccines and the government’s resolve to place them in the hands – or rather arms – of as many Americans as possible. The Vaccination Assistance Act of 1962 grew out of this three-pronged synchrony.[1]

Between 1963 and 1965, a severe outbreak of rubella (German measles) lent support to those urging Congress to approve Title XIX (the Medicaid provision) of the Social Security Act of 1965.  And Congress rose to the task, passing into law the “Early and Periodic Screening, Diagnosis, and Treatment” amendments to Title XIX.  The latter affirmed the right of every American child to receive comprehensive pediatric care, including vaccination.

The timing was auspicious.  In 1963, Merck, Sharp & Dohme began shipping its live-virus measles vaccine, trademarked Rubeovax, which had to be administered with standardized immune globulin (Gammagee).  In 1967 MSD combined the measles vaccine with smallpox vaccine as Dryvax, and then, a year later, released a more attenuated live measles vaccine (Attenuvax) that did not require coadministration of immune globulin.[2]  MSD marketing reminded parents that mumps, long regarded as a benign childhood illness, was now associated with adult sterility.  It too bowed to science and responsible parenting, with its incidence among American children falling 98% between 1968 and 1985.

Crowd waiting for 1962 oral polio vaccination

America’s commitment to vaccination was born of the triumphs of American medicine during WWII and came to fruition in the early 1950s, just as Cold War fears of nuclear war gripped the nation and pervaded everyday life.  Grade school nuclear attack drills, “duck and cover” animations, basement fallout shelters with cabinets filled with canned food – I remember all too well these scary artifacts of a 1950s childhood. Competition with the Soviet Union suffused all manner of scientific, technological, public health-related, and athletic endeavor. The Soviets leapt ahead in the space race with the launching of Sputnik in 1957.  The U.S. retained an enormous advantage on the ground with the size and destructive power of its nuclear arsenal.

Less well known is that, in the matter of mass polio vaccination, countries in the Eastern Bloc – Hungary, Czechoslovakia, Poland – led the way.  Hungary’s intensive annual vaccination campaigns, launched in 1957 with Salk vaccine imported from the U.S. and Sabin vaccine imported from the U.S.S.R. in 1959, were the prototype for the World Health Organization’s (WHO) global strategy of polio eradication.  Czechoslovakia became the first nation to eradicate polio in 1959; Hungary followed in 1963.[3]

It is tempting to absorb the narrative of polio eradication into Cold War politics, especially the rhetoric of the vaccination campaigns that mobilized the public. Throughout the Eastern Bloc, mass vaccination was an aspect of pro-natalist policies seeking to increase live births, healthy children, and, a bit down the road, productive workers. Eradication of polio, in the idiom of the time, subserved the reproduction of labor. In the U.S., the strategic implications of mass vaccination were framed differently.  During the late 50s and early 60s, one in five American applicants for military service was found medically unfit.  Increasing vaccination rates was a cost-effective way of rendering more young men fit to serve their nation.[4]   

But there is a larger story that subsumes these Cold War rationales, and it is a story, surprisingly, of scientific cooperation across the Iron Curtain.  Amid escalating Cold War tensions, the United States and Soviet Union undertook a joint initiative, largely clandestine, to develop, test, and manufacture life-saving vaccines.  The story begins in 1956, when the U.S. State Department and Soviet Ministry of Foreign Affairs jointly facilitated collaboration between Albert Sabin and two leading Soviet virologists, Mikhail Chumakov and Anatoli Smorodintsev.  Their shared goal was the manufacture of Sabin’s oral polio vaccine on a scale sufficient for large-scale testing in the Soviet Union. With a KGB operative in tow, the Russians travelled to Sabin’s laboratory in the Cincinnati Children’s Hospital, and Sabin in turn flew to Moscow to continue the brainstorming.  

     Two years later, shipments of Sabin’s polio virus strains, packed in dry ice, arrived in the Soviet Union, and shortly thereafter, with the blessing of post-Stalin Kremlin leadership, the mass trials began.  The Sabin vaccine was given to 10 million Russian school children, followed by millions of young adults.  A WHO observer, the American virologist Dorothy Horstmann, attested to the safety of the trials and the validity of their findings.  The vaccine has long since stopped polio transmission everywhere in the world except Afghanistan and Pakistan.

     No sooner was the Sabin live-virus vaccine licensed than Soviet scientists developed a unique process for preserving smallpox vaccine in harsh environments.  With freeze-dried vaccine now available, Viktor Zhdanov, a Soviet virologist and Deputy Minister of Health, boldly made the case for global smallpox eradication at the 1958 meeting of the World Health Assembly, WHO’s governing body.  After the meeting, he did not wait patiently for the WHO to act: he led campaigns both to produce smallpox vaccine and to solicit donations from around the world.[5]  His American colleague-in-arms in promoting freeze-dried vaccine was the public health physician and epidemiologist Donald Henderson, who led the 10-year international vaccination campaign that eliminated smallpox by 1977.[6]

What can we learn from our Cold War predecessors?  The lesson is self-evident: we learn from them that science in the service of public health can be an enclave of consensus, what Dora Vargha, the historian of Cold War epidemics, terms a “safe space,” among ideological combatants with the military resources to destroy one another. The Cold War is long gone, so the safe space of which Vargha writes is no longer between geopolitical rivals with fingers on nuclear triggers.

But America in 2021 is no longer a cohesive national community.  Rather, we inhabit a fractured national landscape that erupts, with demoralizing frequency, into a sociopolitical battle zone.  The geopolitical war zone is gone, but Cold War-type tensions play out in the present.  Right-wing extremists, anti-science Evangelicals, purveyors of a Trump-like notion of insular “greatness” – these overlapping segments of the population increasingly pit themselves against the rest of us:  most Democrats, liberals, immigrants, refugees, defenders of the social welfare state that took shape after the Second World War.  Their refusal to receive Covid-19 vaccination is absorbed into a web of breezy rhetoric:  that they’ll be okay, that the virus isn’t so bad, that the vaccines aren’t safe, that they come to us from Big Government, which always gets it wrong.  Any and all of the above.  In fact, the scientific illiterati are led by their anger, and the anger shields them from the relevant knowledge – of previous pandemics, of the nature of a virus, of the human immune system, of the role of antibodies in protecting us from invading pathogens, of the action of vaccines – that would lead them to set aside their beliefs and get vaccinated.

When the last wave of antivaccinationism washed across these shores in the early 1980s, it was led by social activists who misappropriated vaccination in support of their cause.  Second-wave feminists saw vaccination as part of the patriarchal structure of American medicine, and urged women to be skeptical about vaccinating their children, citing the possibility of reactions to measles vaccine among children allergic to eggs.  It was a classic instance of throwing out the baby with the bathwater which, in this case, meant putting the children at risk because the bathwater reeked of male hubris.  Not to be left out of the antiscientific fray, environmentalists, in an act of stupefying illogic, deemed vaccines an environmental pollutant – and one, according to writers such as Harris Coulter, associated with psychiatric illness.[7]                                

Matters are now much worse.  Antivaccinationism is no longer aligned, however misguidedly, with a worthy social cause.  Rather, it has been absorbed into a far-reaching skepticism about government which, according to many right-wing commentators and their minions, intrudes in our lives, manipulates us, constrains our freedom of choice, and uses our tax dollars to fund liberal causes.

Even in the absence of outright hostility, there is a prideful indifference to vaccination, partly because it is a directive from Big Government, acting in concert with what is, after all, Big Pharmaceutical Science.  But we have always needed Big Government and Big Science to devise solutions to Big Problems, such as a global pandemic that has already claimed over 560,000 American lives.  Without American Big Government, in cooperation with British Big Government, overseeing the manufacture and distribution of penicillin among collaborating pharmaceutical firms, the miracle drug would not have been available in time for D-Day.  Big Government made it happen.  A decade later, the need for international cooperation transcended the bonds of wartime allies.  It penetrated the Iron Curtain in the wake of the global polio and smallpox epidemics that began in 1952 and continued throughout the decade.

The last thing we need now is a reprise of that era’s McCarthyism, when anyone could be tainted, if not blacklisted, by the mere accusation of contact with communists or communism.  That is, we do not need a nation in which, for part of the population, anything bearing the stamp of Big Government is suspected of being a deception that infringes on some Trumpian-Hobbesian notion of “freedom” in a state of (market-driven) nature.

If you want to make America “great” again, then start by making Americans healthy again.  Throughout the 1960s, the imperative of vaccination overcame the anxieties of American and Soviet officials given to eyeing one another warily atop growing nuclear stockpiles.  They brought the scientists together, and the result was the mass testing that led to the eradication of polio.  Then America rallied around the Soviet creation of freeze-dried smallpox vaccine, and largely funded the manufacture and distribution that resulted in the eradication of smallpox.

Now things are better.  We live in an era in which science enables us to alter the course of a global pandemic.  It is time for antivaccinationists to embrace the science, indeed, to celebrate the science and the gifted scientists whose grasp of it enabled them to create safe and effective Covid-19 vaccines in astonishingly little time.  You’ve got to get your vaccine.  It’s the only way. 


[1] Elena Conis, Vaccine Nation: America’s Changing Relationship with Immunization (Chicago: University of Chicago Press, 2014), 20.

[2] Louis Galambos, with Jane Eliot Sewell, Networks of Innovation: Vaccine Development at Merck, Sharp & Dohme, and Mulford, 1895-1995 (Cambridge: Cambridge University Press, 1995), 96-98, 106-107.

[3] Dora Vargha, “Between East and West: Polio Vaccination Across the Iron Curtain in Cold War Hungary,” Bull. Hist. Med., 88:319-345, 2014; Dora Vargha, “Vaccination and the Communist State,” in The Politics of Vaccination (online pub date: March 2017).

[4] Conis, Vaccine Nation, 27.

[5] Erez Manela, “A Pox on Your Narrative: Writing Disease Control into Cold War History,” Diplomatic History, 34:299-323, 2010.

[6] Peter J. Hotez, “Vaccine Diplomacy:  Historical Perspective and Future Directions,” PLoS Neglected Trop. Dis., 8:e2808, 2014; Peter J. Hotez, “Russian-United States Vaccine Science: Preserving the Legacy,” PLoS Neglected Trop. Dis., 11:e0005320, 2017.

[7] The feminist and environmentalist antivaccination movements of the 1980s are reviewed at length in Conis, Vaccine Nation, chapters 5 and 6.

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.

Anti-vaccinationism, American Style

Here is an irony:  America’s staggering production of generations of scientific brainpower coexists with many Americans’ deep skepticism about science.  Donald Trump, a prideful scientific illiterate, rode to power on the backs of many others who, like him, were skeptical about science and especially about the role of scientific experts in modern life.  He maintains their allegiance still.

Why does this surprise us?  Anti-intellectualism was burned into the national character early in American history.  Those skeptical of this claim should read Richard Hofstadter’s brilliant twin studies of the 1960s, Anti-Intellectualism in American Life and The Paranoid Style in American Politics.  From the beginning of the American Experiment, democracy was antithetical to so-called European “elitism,” and this ethos gained expression, inter alia, in antebellum medicine.

The Founding Fathers, an intellectual elite in defense of democracy, were not part of the movement away from science.  When Benjamin Waterhouse introduced Edward Jenner’s smallpox vaccine to America in 1800, Washington, Adams, and Jefferson hailed it as the greatest discovery of modern medicine.  They appreciated the severity of smallpox, which had ravaged the Continental Army during the War of Independence.  Indeed, Washington was so desperate to rein in its decimation of his troops that, in 1777, he inoculated his entire army with pus from active smallpox lesions, knowing that the resulting infections would be milder and far less likely to cause fatalities than smallpox naturally contracted.  When Jefferson became president in 1801, he pledged to introduce the vaccine to the American public, because “it will be a great service indeed rendered to human nature to strike off the catalogue of its evils so great a one as the smallpox.”  Not to be outdone in support of Jenner’s miraculous discovery, Jefferson’s successor, James Madison, signed into law in 1813 “An Act to Encourage Vaccination.”  Among its provisions was the requirement that the U.S. postal service “carry mail containing vaccine materials free of charge.”[1]

But this appreciation of the vaccine was short-lived, and Jefferson’s hope that the value of vaccination would seep into public consciousness was never realized.  In Jacksonian America, the Founding Fathers’ belief that medical progress safeguarded democracy gave way to something far less enlightened:  democracy now meant that everyone could be, indeed should be, his own doctor.  Most Americans had no need for those with university educations, much less clinical experience in governmentally managed public hospitals.  Jacksonian America emerged as what the historian Joseph Kett termed the “Dark Age of the profession.”[2]  During this time, the nation could lay claim to a medical elite only because a few monied medical intelligentsia – John Collins Warren, Valentine Mott, Philip Syng Physick, William Gibson, and David Hosack, among them – found their way to European medical centers in London, Edinburgh, and somewhat later, Paris.

Otherwise, it was every man for himself, which usually meant every woman for herself and her family.  Homeopaths, herbalists, Thomsonians, eclectics, hydropaths, phrenologists, Christian Scientists, folk healers, faith healers, uroscopians, chromo-thermalists – each exemplified the democratic mind in action.[3]  Sad to say, homegrown “regular” American medicine of the day, with its reliance on depletive (bleeding, vomiting, purging) and stimulative (alcohol, quinine) treatments, was no better and often worse.  The belief, Galenic in origin, that all diseases were variants of the same global type of bodily dysregulation is startlingly close to Donald Trump’s holistic medieval approach to bodily infection and its treatment.

The birth of scientific medicine in the decades following the Civil War could not still the ardor of America's scientific illiterati. The development of animal blood-derived serums (antitoxins), forerunners of modern antibiotics, was anathema to many. Among them were religionists, mainly Christian, for whom injecting the blood product of a horse or sheep into the human body was not only repugnant but sinful.  Better to let children be stricken with smallpox, diphtheria, and tetanus, sometimes to the point of death, than violate what they construed as divine strictures – strictures, be it noted, not intimated, much less codified, in the body of doctrine of any of the five major world religions.[4]

Antivaccinationists of the early 20th century were an unhappy lot.  They were unhappy about the proliferation of medicines (“biologics”) for treating illness.  And they deeply resented the intrusion of the State into domains of parental decision-making in the form of newly empowered social workers, visiting nurses, and educators.  In fact, antivaccinationism was part and parcel of resistance to all things progressive, including scientific medicine.[5]  Holdovers from the free-wheeling anything-goes medicine of antebellum America – especially devotees of homeopathy and, of late, chiropractic – were prominent in its ranks.    

Now, in the face of a global pandemic no less lethal than the Great Influenza of 1918-1919, we hear the same irrational musings about the dangers of vaccines that animated the scientific illiterati at the turn of the 20th century. For the foes of public health, any misstep in the manufacture or storage of smallpox vaccine – a much greater possibility over a century ago than today – was enough to condemn vaccination outright. In 1901, smallpox vaccination of school children in Camden, NJ led to an outbreak of 100 cases of tetanus, with nine deaths.  Historians believe that, in all probability, the outbreak resulted not from a contaminated batch of vaccine but rather from poor care of the vaccination site.  But Congress accepted the possibility of contamination, and the incident led to passage of the Biologics Control Act of 1902.[6]  Henceforth every manufacturer of vaccine had to be licensed by the Secretary of the Treasury (relying on the Public Health Service's Hygienic Laboratory), and each package of vaccine had to be properly labeled and dated and was subject to inspection.[7]

And this leads to a second irony: the more preventive medicine advanced, incorporating additional safeguards into vaccine production, storage, and administration, the greater the resistance of the illiterati.  Throughout the 20th century and right down to the present, the antebellum notion of science-free "medical freedom" has continued to hold sway.  Then and now, it means the right to put children at risk for major infectious diseases that could result in death – and the right, further, to pass disease, possibly severe and occasionally fatal, on to others.

It follows that, then and now, the science illiterati are skeptical of, if not distressed by, the State's commitment to public health.  It was Oklahoma Senator Robert Owen's proposed legislation of 1910 to combine five federal health agencies into a cabinet-level Department of Public Health that pushed the opponents of medical "tyranny" onward. The Anti-Vaccination League of America, formed in 1908, was joined by the National League for Medical Freedom in 1910.  Eight years later, they were joined by the American Medical Liberty League.  For all three groups, anti-Progressivism was in full swing. "Medical freedom" exempted children not only from compulsory vaccination but from medical examinations at school.  Further, young adults were not to be subjected to premarital syphilis tests. Nor did the groups' expansive view of medical tyranny flinch in the face of public education about communicable disease: municipal campaigns against diphtheria were to be forbidden entirely.

With the deaths of the founders of the Anti-Vaccination League (Charles Higgins) and the American Medical Liberty League (Lora Little) in 1929 and 1931, respectively, antivaccinationism underwent a dramatic decline.  The Jacksonian impulse that fueled the movement simply petered out, and by the later '30s, Americans finally grasped that mainstream medicine was not simply another medical sect. It was the real deal:  a medicine grounded in laboratory research that effectively immunized against disease, promoted relief and cure of those already infected, and thereby saved lives.

But was the embrace of scientific healing really universal?  One pinnacle of life-depriving anti-science persisted well beyond the 1930s.  Consider the belief of some Christian sects that certain life-saving medical interventions must be withheld from children on religious grounds.  It was only in 1982, more than nine decades after von Behring's discovery of diphtheria antitoxin launched the era of serum therapy, that criminal charges were first brought against parents who had withheld necessary treatment from their children.  Of the 58 cases of such parental withholding of care, 55 involved fatalities.[8]  Child deaths among Christian Scientists alone included untreated diabetes (leading to diabetic ketoacidosis), bacterial meningitis, and pneumonia.  Things are better for the children now, since even U.S. courts that have overturned parents' criminal convictions have come around to the mainstream view that religious exemption laws are no defense against criminal neglect – a fine insight for the judiciary to have arrived at more than a century after serum therapy scored major triumphs in the treatment of rabies, diphtheria, tetanus, pneumococcal pneumonia, and meningococcal meningitis.

Should vaccination for the Covid-19 virus be a requirement for attendance in public and private schools?  How can the question even be asked?  As early as 1827, a Boston school committee ordered teachers to require entering students to give evidence of smallpox vaccination.[9]  Statewide vaccination requirements for smallpox followed in Massachusetts in 1855, New York in 1862, Connecticut in 1872, and Pennsylvania in 1895.  And the vaccinations were effective across the board:  they quickly brought under control outbreaks of smallpox underway at the time, and they prevented their recurrence.  These laws and those that followed were upheld by the Supreme Court in 1922 in Zucht v. King.[10]

Twentieth-century vaccines were developed for pertussis in 1914, diphtheria in 1926, and tetanus in 1938.  In 1948 the three were combined and given to infants and toddlers at regular intervals as the DTP vaccine.  There was no hue and cry in 1948 or the years that followed. And yet the same fear of vaccination that the New York State Health Department had to overcome when it launched its statewide drive to immunize children against diphtheria now renders a new generation of parents resistant to mandatory Covid-19 vaccination for their own children.

Bear in mind that the anti-science rhetoric of today's illiterati can be mobilized just as easily to resist DTP or any subsequent vaccine administered to their children. Why subject a child to DTP vaccination?  Perhaps combining three different vaccines into one injection entails heightened risks. Perhaps the batch of vaccine in the hands of one's own doctor has been contaminated.  Perhaps one's child will be among the minuscule number who have a minor allergic reaction.  And, after all, children who contract diphtheria, pertussis, and/or tetanus will hardly die from their infections, especially with the use of antibiotics. Why inject foreign matter into healthy infants?  It is the very argument adduced by the opponents of diphtheria vaccine a century ago.

The problem with antivaccinationist rhetoric in the 21st century is that its proponents are all beneficiaries of more than a century of mandatory vaccination policy.  If they had lived in a society bereft of vaccines – or, for the unvaccinated, bereft of the immunity conferred by the vast herd of immunes – they would have led very different lives.  Indeed, some would not be here to celebrate solipsism masquerading as individualism.  Their specious intuitions about the risks of vaccination are profoundly anti-social, since they compromise the public's health. Parents who decide not to vaccinate their children put the entire community at risk.  The community includes not only their own children, but all those who desire protection yet cannot receive it:  children too young to be vaccinated, those with actual medical contraindications to vaccination, and the minuscule number who have been vaccinated but remain unprotected.[11]

Nor is it strictly a matter of providing equal protection to individuals who seek, but cannot receive, the protection afforded by compulsory vaccination. In a secular society, religious objections to vaccination pale alongside the health of the community. Whether framed in terms of a "compelling state interest" in mitigating a health threat (Sherbert v. Verner [1963]) or the individual's obligation to comply with "valid and neutral laws of general applicability" whatever their incidental religious implications (Employment Division, Department of Human Resources of Oregon v. Smith [1990]), the U.S. Supreme Court has consistently held that mandatory vaccination laws need not allow religious exemptions of any kind.

Antivaccinationists might bear in mind a few particulars as they align themselves with the infectious dark ages.  Between 1900 and 1904, an average of 48,164 cases of smallpox and 1,528 smallpox deaths were reported each year. With the arrival of compulsory vaccination in schools, the rate fell drastically and outbreaks of smallpox ended in 1929. The last case of smallpox in the U.S. was reported in 1949.[12]  

Among American children, diphtheria was a major cause of illness and death through 1921, when 206,000 cases and 15,520 deaths were recorded.  Before Emil von Behring's diphtheria antitoxin became available in 1894 to treat infected children, the death rate among children struck down could reach 50%. Within several years, use of the antitoxin brought it down to 15%.[13]  Then, beginning in the late 1920s, diphtheria immunization was introduced and diphtheria rates fell dramatically, both in the U.S. and in other countries that vaccinated widely. Between 2004 and 2008, no cases of diphtheria were recorded in the U.S.[14]

Between 1951 and 1954, paralytic polio cases in the United States averaged 16,316 a year, of which 1,879 resulted in death. Then science came to the rescue.  Jonas Salk's killed-poliovirus vaccine became available in 1955, and Albert Sabin's live-poliovirus alternative four years later. By 1962, there were fewer than 1,000 cases a year and, in every year thereafter, fewer than 100 cases.[15]

Now, alas, some parents still worry that the measles component of the MMR (measles, mumps, rubella) vaccine, available since 1971, may lead to childhood autism.  Why?  Because the disease-promoting mythologies of the illiterati die hard; resist them at all costs.  Autism is a neuro-developmental disorder with a strong genetic component; its genesis lies in the first year of life, before the vaccine is even administered.  None of the epidemiologists who have studied the issue has found any evidence whatsoever of an association, not among normal children and not among high-risk children with autistic siblings.[16]  The fact is that children who do not receive the measles vaccine have been found to be 35 times more likely to contract measles than the vaccinated.[17]  And measles is no laughing matter. Contracted later in life, measles and mumps are serious diseases and can be deadly.  They were among the major systemic infections that felled soldiers during the Civil War, the Spanish-American War, the Anglo-Boer War, and World War I.[18]

All of which leads to a conclusion in the form of an admonishment.  Accept the fact that you live in a secular society governed by law and by a network of agencies, commissions, and departments lawfully charged with safeguarding public health.  Do your part to sustain the social contract that came into existence when the Founding Fathers, elitists molded by European thought who had imbibed the social contractualism of John Locke, wrote the American Constitution.

Vaccination is a gift that modern science bestows on all of us – vaccination proponents and opponents alike. When one of the two FDA-authorized Covid-19 vaccines comes to a clinic or storefront near you, run, don't walk, to get your and your children's shots. Give thanks to the extraordinarily gifted scientists at Pfizer and Moderna who created these vaccines and demonstrated their effectiveness and safety. And make sure that everyone's children grow up, to paraphrase the U.S. Army's old recruiting slogan, to be all they can be.


[1] Dan Liebowitz, "Smallpox Vaccination: An Early Start of Modern Medicine in America," J. Community Hosp. Intern. Med. Perspect., 7:61-63, 2017 (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5463674).

[2] Joseph F. Kett, The Formation of the American Medical Profession: The Role of Institutions, 1780-1860 (New Haven: Yale University Press, 1968), p. vii. 

[3] Robert E. Riegel, Young America, 1830-1840 (Westport, CT: Greenwood Press, 1973 [1949]), pp. 314-315, quoted at 314.

[4] John D. Grabenstein, "What the World's Religions Teach, As Applied to Vaccines and Immune Globulins," Vaccine, 31:2011-2023, 2013.

[5] James Colgrove, "'Science in a Democracy': The Contested Status of Vaccination in the Progressive Era and the 1920s," Isis, 96:167-191, 2005.

[6]  Harry F. Dowling, Fighting Infection: Conquests of the Twentieth Century (Cambridge, MA: Harvard University Press, 1977), 38; Harry M. Marks, The Progress of Experiment: Science and Therapeutic Reform in the United States, 1900-1990 (Cambridge: Cambridge University Press, 1997), 73-74.

[7] Jonathan Liebenau, Medical Science and Medical Industry: The Formation of the American Pharmaceutical Industry (Baltimore: Johns Hopkins University Press, 1987), 89-90.

[8] Janna C. Merrick, "Spiritual Healing, Sick Kids and the Law: Inequities in the American Healthcare System," Amer. J. Law & Med., 29:269-300, 2003, at 280.

[9] John Duffy, "School Vaccination: The Precursor to School Medical Inspection," J. Hist. Med. & Allied Sci., 33:344-355, 1978.

[10] Kevin M. Malone & Alan R. Hinman, "Vaccination Mandates: The Public Health Imperative and Individual Rights," in Law in Public Health Practice (2009), 262-284, at 272.

[11] Alan R. Hinman, et al., "Childhood Immunization: Laws that Work," J. Law, Med. & Ethics, 30(suppl):122-127, 2002.

[12] Frank Fenner, et al., Smallpox and its Eradication (Geneva: World Health Organization, 1988).

[13] Karie Youngdahl, “Early Uses of Diphtheria Antitoxin in the United States,” The History of Vaccines, August 2, 2010 (https://www.historyofvaccines.org/content/blog/…).

[14] Epidemiology and Prevention of Vaccine-Preventable Diseases, 11th Edition (The Pink Book). National Immunization Program, Centers for Disease Control and Prevention (http://www.cdc.gov/vaccines/Pubs/pinkbook/downloads/dip.pdf); Diphtheria. WHO, Regional Office for the Western Pacific (http://www.wpro.who.int/health_topics/diphtheria).

[15] CDC. Annual summary 1980: Reported Morbidity and Mortality in the United States. MMWR 1981;29; CDC, Reported Incidence of Notifiable Diseases in the United States, 1960. MMWR 1961;9.

[16] Frank DeStefano & Tom T. Shimabukuro, “The MMR Vaccine and Autism,” Ann. Rev. Virol., 6:585-600, 2019.

[17] Hinman, op. cit. (note 11).

[18] Paul E. Stepansky, Easing Pain on the Western Front:  American Nurses of the Great War and the Birth of Modern Nursing Practice (Jefferson, NC:  McFarland, 2020), 36, 50, 96, 144.

 

Copyright © 2020 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in their courses and seminars let him know via info[at]keynote-books.com.