
Unmasked and Unhinged

The Great Influenza, the Spanish Flu, a viral infection spread by droplets and mouth/nose/hand contact, laid low the residents of dense American cities, and spurred municipal officials to take new initiatives in social distancing.[1]  City-wide bans on public gatherings included closing schools, theaters, motion picture houses, dance halls, and – perish the thought – saloons.  In major cities, essential businesses that remained open had to comply with new regulations, including staggered opening and closing times to minimize crowd size on streets and in trolleys and subways.  Strict new sanitation rules were the order of the day.  And yes, eight western cities, not satisfied with preexisting regulations banning public spitting and the use of common cups, or even new regulations requiring the use of cloth handkerchiefs when sneezing or coughing, went the whole nine yards:  they passed mask-wearing ordinances.

In San Francisco and elsewhere, outdoor barbering and police courts were the new normal.

The idea was a good one; its implementation another matter.  In the eight cities in question, those who didn’t make their own masks bought masks sewn from wide-mesh gauze, not the tightly woven medical gauze, four to six layers thick, worn in hospitals and recommended by authorities.  Masks made at home from cheesecloth were more porous still.  Nor did most bother washing or replacing masks with any great frequency.  Still, these factors notwithstanding, the consensus is that masks did slow the rate of viral transmission, if only as one component of a “layered” strategy of protection.[2]  Certainly, as commentators of the time pointed out, masks at least shielded those around the wearer from direct, in-your-face (literally) droplet infection from sneezes, coughs, and spittle.  Masks couldn’t hurt, and we now believe they helped.

Among the eight cities that passed mask-wearing ordinances, San Francisco took the lead.  Its mayor, James Rolph, with a nod to the troops packed in transport ships taking them to war-torn France and Belgium, announced that “conscience, patriotism and self-protection demand immediate and rigid compliance” with the mask ordinance. By 1918, masks were entering hospital operating theaters, especially among assisting nurses and interns.[3]  But mask-wearing in public by ordinary people was a novelty.  In a nation gripped by life-threatening influenza, however, most embraced masks and wore them proudly as emblems of patriotism and public-mindedness.   Local Red Cross volunteers lost no time in adding mask preparation to the rolling of bandages and knitting of socks for the boys overseas.

A trolley conductor waves off an unmasked citizen. The image is from Seattle, another city with a mask-wearing ordinance.

But, then as now, not everyone was on board with face masks.  Then as now, there were protesters.  In San Francisco, they were small in number but large in vocal reach.  The difference was that in 1918, cities like San Francisco meant business, with violators of mask laws fined $5 or $10 or imprisoned for 10 days.  On the first day the ordinance took effect, 110 were arrested, many with masks dangling around their necks.  In mid-November,

San Francisco police arrest “mask slackers,” one of whom has belatedly put on a mask.

following the signing of the Armistice, city officials mistakenly believed the pandemic had passed and rescinded the ordinance.  At noon, November 21, at the sound of a city-wide whistle, San Franciscans rose as one and tossed their masks onto sidewalks and streets.   In January, however, following a spike in the number of influenza cases, a second mask-wearing ordinance was passed by city supervisors, at which point a small, self-styled Anti-Mask League – the only such League in the nation – emerged on the scene.[4]  

A long line of San Franciscans waiting to purchase masks in 1919.  A few already have masks in place.

The League did not take matters lying down, nor were its members content to point out that masks of questionable quality, improperly used and infrequently replaced, probably did less good than their proponents suggested.  Their animus was trained on the very concept of obligatory mask-wearing, whatever its effect on transmission of the as yet unidentified influenza microbe.  At a protest on January 27, “freedom and liberty” was their mantra.  Throwing public health to the wind, they lumped together mask-wearing, the closing of city schools, and the medical testing of children in school.  Making sure sick children did not infect healthy classmates paled alongside the sacrosanctity of parental rights.  For the protesters, then as now, parental rights meant freedom to act in the worst interests of the child.

___________________

One wants to say that the Anti-Mask League’s short-lived furor over mask-wearing, school closings, and testing of school children is long behind us.  But it is not.  In the matter of contagious infectious disease – and expert recommendations to mitigate its impact – what goes around comes around.  In the era of Covid-19, when San Francisco mayor London Breed ordered city residents “to wear face coverings in essential businesses, in public facilities, on transit and while performing essential work,” an animated debate on mask-wearing among city officials and the public ensued.  A century of advance in the understanding of infectious disease, including the birth and maturation of virology, still counts for little among the current crop of anti-maskers.  Their “freedom” to opt for convenience trumps personal safety and the safety of others.  Nor does a century of improvements in mask fabrics, construction, comfort, and effectiveness mitigate the adolescent wantonness of this freedom one iota.

“Liberty and freedom.”  Just as the Anti-Mask League’s call to arms masked a powerful political undertow, so too with the anti-vaxxers and anti-maskers of the present.  Times change; some Americans – a much higher percentage now than in 1918 – do not.  Spearheaded by Trumpian extremists mired in fantasies of childlike freedom from adult responsibility, the “anti” crowd still can’t get its head around the fact that protecting the public’s health – through information, “expert” recommendations and guidelines, and, yes, laws – is the responsibility of government.  The responsibility operates through the Commerce Clause of the Constitution, which gives the federal government broad authority to impose health measures to prevent the spread of disease from a foreign country.  It operates through the Public Health Service Act, which gives the Secretary of Health and Human Services authority to lead federal health-related responses to public health emergencies.  And it operates through the 10th Amendment to the Constitution, which grants states broad authority to take action during public health emergencies.  Quarantine and restricted movement of those exposed to contagious disease, business restrictions, stay-at-home orders – all are among the “broad public health tools” available to governors.[5]

When a catastrophe, natural or man-made, threatens public health and safety, this responsibility, this prerogative, this Constitutional mandate, may well come down with the force of, well, mandates, which is to say, laws.  At such moments in history, we are asked to step up and accept the requisite measure of inconvenience, discomfort, and social and economic restriction because it is intrinsic to the civil liberties that make us a society of citizens, a civil society. 

Excepting San Francisco’s anti-masker politicos, it is easier to make allowances for the inexpert mask wearers of 1918 than for the anti-mask crusaders of today.  In 1918, many simply didn’t realize that pulling masks down below the nose negated whatever protection the masks provided.  The same is true of the well-meaning but guileless who made small holes in the center of their masks to allow for insertion of a cigarette.  It is much harder to excuse the Covid-19 politicos who resisted mask-wearing during the height of the pandemic and now refuse to don face masks in supermarkets and businesses as requested by store managers.  The political armor that shields them from prudent good sense, respect for store owners, and the safety of fellow shoppers is of a decidedly baser metal.

The nadir of civic bankruptcy is their virulent hostility toward parents who, in compliance with state, municipal, and school board ordinances – or even in their absence – send their children to school in face masks.  The notion that children wearing protective masks are in some way being abused, tormented, damaged pulls into its orbit all the rage-filled irrationality of the corrosive Trump era.  Those who would deny responsible parents the right to act responsibly on behalf of their children are themselves damaged.  They bring back to life, in a new and chilling context, that diagnostic warhorse of the asylum psychiatrists (“alienists”) and neurologists of the 19th century:  moral insanity.

The topic of child mask-wearing, then and now, requires an essay of its own.  By way of prolegomenon, consider the British children pictured below.  They are living their lives – walking to school, sitting in their classrooms, and playing outdoors – with bulky gas masks in place during the Blitz of London in 1940-1941.  How could their parents subject them to these hideous contraptions?  Perhaps parents sought to protect their children, to the extent possible, from smoke inhalation and gas attack during German bombing raids.  It was a response to a grave national emergency.  A grave national emergency.  You know, like a global pandemic that to date has brought serious illness to over 46.6 million Americans and claimed over 755,000 American lives.

 


[1] For an excellent overview of these initiatives, see Nancy Tomes, “’Destroyer and Teacher’: Managing the Masses During the 1918-1919 Influenza Pandemic,” Public Health Rep., 125(Suppl 3):48-62, 2010.  My abbreviated account draws on her article.

[2] P. Burnett, “Did Masks Work? — The 1918 Flu Pandemic and the Meaning of Layered Interventions,” Berkeley Library, Oral History Center, University of California, May 23, 2020  (https://update.lib.berkeley.edu/2020/05/23/did-masks-work-the-1918-flu-pandemic-and-the-meaning-of-layered-interventions).  Nancy Tomes, “’Destroyer and Teacher’” (n. 1), affirms that the masks were effective enough to slow the rate of transmission. 

[3]  Although surgical nurses and interns in the U.S. began wearing masks after 1910, surgeons themselves generally refused until the 1920s: “the generation of head physicians rejected them, as well as rubber gloves, in all phases of an operation, as they were considered ‘irritating’.”  Christine Matuschek, Friedrich Moll, et al., “The History and Value of Face Masks,” Eur. J. Med. Res., 25: 23, 2020.

[4] My brief summary draws on Brian Dolan, “Unmasking History: Who Was Behind the Anti-Mask League Protests During the 1918 Influenza Epidemic in San Francisco,” Perspectives in Medical Humanities, UC Berkeley, May 19, 2020.  Another useful account of the mask-wearing ordinance and the reactions to it is the “San Francisco” entry of The American Influenza Epidemic of 1918-1919: A Digital Encyclopedia, produced by the University of Michigan Center for the History of Medicine and Michigan Publishing (www.influenzaarchive.org/city/city-sanfrancisco.html).

[5] American Bar Association, “Two Centuries of Law Guide Legal Approach to Modern Pandemic,” Around the ABA, April 2020 (https://www.americanbar.org/news/abanews/publications/youraba/2020/youraba-april-2020/law-guides-legal-approach-to-pandem).

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.

Remembering Cholera in the Time of Covid

It came in crippling waves, the deadly intestinal infection that, in 1832 alone, killed 150,000 Americans.  Its telltale symptom was copious watery diarrhea (“rice water”) accompanied by heavy vomiting, loss of internal fluids and electrolytes, and dehydration; hence its name, cholera, from the Greek “flow of bile.”  Severe cases quickly proceeded to hypovolemic shock and could kill otherwise healthy persons, children and adults, within hours.  Historians refer to the cholera epidemics of the 19th century, but the epidemics, especially those of 1832, 1848, and 1866, were in fact pandemics, spreading from nation to nation and continent to continent.

     Orthodox or “regular” physicians of the first half of the 19th century had no clue about its cause, mode of transmission, or remedy, so they speculated wildly in all directions.  Contagionists believed it spread from person to person.  Noncontagionists attributed it to poisonous fumes, termed miasma, which emanated from soil and decaying matter of all kinds.  Some attributed it to atmospheric changes; others thought it a byproduct of fermentation.  Physician-moralists could be counted on to ascribe it to the moral failings of the underclass.  Withal, the regulars resorted, one and all, to “heroic” treatments, with bleeding and toxic chemicals, especially calomel (mercury) and laudanum (opium-laced alcohol), having pride of place.  These treatments only hastened the death of seriously ill patients, which, given the extent of their suffering, may have been an unwitting act of mercy.  Physician-induced deaths fueled the appeal of homeopathy and botanical medicine, far milder approaches that, so their practitioners avowed, caused far fewer deaths than the horrid regular remedies.

Caricature of 1832, depicting cholera sufferer with nightmarish “remedies” of the day.

      The suffering public, seeing the baleful effects of conventional remedies, grew disgusted with doctors.  The Cholera Bulletin, a newsletter put out by a group of New York physicians over the summer of 1832, grimly acknowledged in its July 23 issue the “fierce onslaught” of cholera and doctors in felling the afflicted: “For Cholera kills and Doctors slay, and every foe will have its way!”  After a new wave of cholera reached American shores in 1848, New York’s Sunday Dispatch lambasted traditional medical science as “antiquated heathen humbug, utterly unworthy of the middle of the nineteenth century.”  “Cholera was a most terrible affliction,” chimed in the New York Herald a year later, “but bad doctors and bad drugs are worse.  The pestilence might come now and then; physicians we had always with us.”[1]

     And yet, amid such loathing of doctors and their so-called remedies, science marched on in the one domain in which forward movement was possible.  Throughout the second half of the 19th century, cholera was the catalyst that brought Europe and eventually America into the proto-modern era of public health management of infectious disease.  Then as now, measures that safeguard the public from life-threatening infectious disease are good things.  In 1853, after another cholera epidemic reached Edinburgh, there was no political posturing about “rights” – presumably the right of British citizens to get sick and die.  Parliament, the “Big Government” of the day, resolved to go after the one major, recurrent infectious disease for which a vaccine was at hand:  smallpox.  The Vaccination Act of 1853 grew out of this resolve.  Among other things, it instituted compulsory smallpox vaccination, with all infants to be vaccinated within the first three months of life (infants in orphanages were given four months).  Physicians were obligated to send certificates of vaccination to local birth registrars, and parents who did not comply were subject to fines or imprisonment.  The requirement was extended under the Vaccination Act of 1867.[2]

     New York City followed suit a decade later, when the state legislature created the Metropolitan Board of Health.  The Board responded to the outbreak of cholera in 1866 by mandating the isolation of cholera patients and disinfection of their excretions.  When a new epidemic, which travelled from India to Egypt, erupted in 1883, French and German teams descended on Egypt in search of the specific microorganism responsible for cholera.  The prize went to Robert Koch, who isolated the cholera vibrio in January 1884.

     In 1892, when a cholera epidemic originating in Hamburg brought 100 cases to New York, the city mobilized with the full force of the new science of bacteriology.  The Board of Health lost no time in establishing a Division of Pathology, Bacteriology, and Disinfection, which included a state-of-the-art bacteriological laboratory under the direction of Hermann Biggs.  The lab, as we have seen, came into its own in the fight against diphtheria, but it was the threat of cholera that brought it into existence.  A year later, in 1893, Congress passed the National Quarantine Act, which created a national system of quarantine regulations that included specific procedures for the inspection of immigrants and cargos.  It was to be administered by the U.S. Marine Hospital Service, forerunner of the Public Health Service.

     In the late 1840s, the Bristol physician William Budd argued that contaminated sewage was the source of cholera, and in 1854 the surgeon John Snow traced the source of a cholera outbreak in his Soho, London neighborhood to contaminated well water.  But it was the Hamburg epidemic that proved beyond doubt that cholera was waterborne, and Koch himself demonstrated that water filtration was the key to its control.[3]  Now, we rarely hear of cholera, since water and waste management systems that came into existence in the last century eliminated it from the U.S. and Europe.[4]  Anti-vax libertarians would no doubt take exception to the Safe Drinking Water Act of 1974, which empowers the EPA to establish and enforce national water quality standards.  There it is again, the oppressive hand of Big Government, denying Americans the freedom to drink contaminated water and contract cholera.  Where has our freedom gone? 

Caricature of 1866, “Death’s Dispensary,” giving contaminated drinking water as a source of cholera.

     The gods will never stop laughing at the idiocy of humankind.  Here we are in 2021 and, thanks to the foundation laid down by 19th-century scientists, gifted scientists of our own time have handed us, in astoundingly little time, an understanding of the coronavirus, its mode of transmission, and a pathway to prevention and containment.  We have in hand safe and effective vaccines that reduce the risk of infection to minuscule proportions and ensure that, among the immunized, infection from potent new strains of the virus will be mild and tolerable, and certainly not life-threatening.

     Yes, a small percentage of those who receive Covid vaccines will have reactions, and, among them, a tiny fraction will become ill enough to require treatment, even hospitalization.  But they will recover and enjoy immunity thereafter.  Such “risks” pale alongside those incurred by their forebears, who sought protection from smallpox in the time-tested manner of their day.  In America, a century before the discovery of Edward Jenner’s cowpox-derived vaccine, colonists protected themselves from recurrent smallpox epidemics through inoculation with human smallpox “matter.”  The procedure, termed variolation, originated in China and Asia Minor, was taken up in the Ottoman Empire, and reached Britain and America in 1721.  It involved inoculating the healthy with pus scraped from skin ulcers of those already infected, and was informed by the ancient observation that smallpox could be contracted only once in a lifetime.[5]  The variolated developed a mild case of smallpox which, so it was hoped, would confer protection against the ravages of future epidemics.

     And they were essentially right: over 98% of the variolated survived the procedure and achieved immunity.[6]  To be sure, the risk of serious infection was greater with variolation than with Edward Jenner’s cowpox-derived vaccine, but the latter, which initially relied on the small population of English cows that contracted cowpox and on person-to-person inoculation, was a long time in coming.  It took the United States most of the 19th century to produce and distribute an adequate supply of Jennerian vaccine.  Long before the vaccine was widely available, when the death rate from naturally acquired smallpox was roughly 30%,[7] Americans joined Europeans, Asians, and Africans in accepting the risks of variolation.  For George Washington, as we noted, the risks paled alongside the very real risk that the Continental Army would collapse from smallpox:  he had every soldier variolated before beginning military operations at Valley Forge in 1777.[8]
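     For readers who want the arithmetic spelled out, the comparison implicit in these two figures – 98% survival with variolation, roughly 30% mortality from naturally acquired smallpox – can be put crudely as follows (a back-of-the-envelope sketch assuming the quoted rates, not a formal risk analysis):

\[
P(\text{death} \mid \text{variolation}) \approx 1 - 0.98 = 0.02, \qquad
P(\text{death} \mid \text{natural smallpox}) \approx 0.30,
\]
\[
\text{relative risk} \approx \frac{0.30}{0.02} = 15.
\]

By this crude reckoning, the variolated ran roughly one-fifteenth the risk of death faced by those who contracted smallpox naturally – which is why Washington’s decision was less a gamble than it might sound.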

    But here we are in 2021, with many Americans unwilling to accept the possibility of any Covid vaccine reaction at all, however transient and tolerable.  In so doing, they turn their backs on more than two hundred years of scientific progress, of which the successful public health measures in Europe and America spurred by the cholera epidemics form an important chapter.  The triumph of public health, which antedated by decades the discovery of bacteria, accompanied the increased life expectancy and vastly improved quality of life wrought by vaccine science – indeed, by science in general.

     Witness Britain’s Great Exhibition of 1851, a scant three years after the cholera epidemic of 1848. Under the dome of the majestic Crystal Palace, science was celebrated in all its life-affirming possibilities.  In medicine alone, exhibits displayed mechanically enhanced prosthetic limbs, the first double stethoscope, microscopes, surgical instruments and appliances of every kind, and a plethora of pharmaceutical extracts and medicinal juices (including cod liver oil).[9] Topping it off was a complete model of the human body that comprised 1,700 parts.  Science promised better lives and a better future; scientific medicine, which by 1851 had begun to include public health measures, was integral to the promise.

     But here we are in 2021, replete with anti-vaccinationists who choose to endanger themselves, their children, and members of their communities.  They are anti-science primitives in our midst, and I behold them with the same incredulity that visitors to Jurassic Park beheld living, breathing dinosaurs.  Here are people who repudiate both public health measures (mask wearing, curfews, limits on group gatherings) and vaccination science in a time of global pandemic.  For them, liberty is a primal thing that antedates the social contract, of which our Constitution is a sterling example.  It apparently includes the prerogative to get sick and make others sick to the point of death.  The anti-vaccinationists, prideful in their ignorance and luxuriating in their fantasies of government control, remind me of what the pioneering British anthropologist Edward Tylor termed “survivals,” by which he meant remnants of cultural conditions and mindsets irrelevant to the present.  Dinosaurs, after all, didn’t care about the common good either.


[1] Newsletter and press quotations from Charles Rosenberg, The Cholera Years: The United States in 1832, 1849, and 1866 (Chicago: University of Chicago Press, 1962), 68, 161, 155.

[2] Dorothy Porter & Roy Porter, “The Politics of Prevention:  Anti-Vaccinationism and Public Health in Nineteenth-Century England,” Medical History, 32:231-252, 1988.

[3] Thomas D. Brock, Robert Koch: A Life in Medicine and Bacteriology (Washington, DC: ASM Press, 1998 [1988]), 229-230, 255.

[4] The last reported case of cholera in the U.S. was in 1949.  Cholera, sadly, remains alive and well in a number of African countries. 

[5] In China and Asia Minor, where variolation originated, dried smallpox scabs blown into the nose was the mode of inoculation. 

[6] José Esparza, “Three Different Paths to Introduce the Smallpox Vaccine in Early 19th Century United States,” Vaccine, 38:2741-2745, 2020.

[7] Ibid.

[8] Andrew W. Artenstein, et al., “History of U.S. Military Contributions to the Study of Vaccines against Infectious Diseases,” Military Medicine, 170(suppl):3-11, 2005.

[9] C. D. T. James, “Medicine and the 1851 Exhibition,” J. Royal Soc. Med., 65:31-34, 1972.

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.



The War on Children’s Plague

In the early 19th century, doctors called it angina maligna (gangrenous pharyngitis) or “malignant sore throat.”  Then in 1826, the French physician Pierre-Fidele Bretonneau grouped both together as diphtherite.  It was a horrible childhood disease in which severe inflammation of the upper respiratory tract gave rise to a false membrane, a “pseudomembrane,” that covered the pharynx, larynx, or both.  The massive tissue growth prevented swallowing and blocked airways and often led to rapid death by asphyxiation.  It felled adults and children alike, but younger children were especially vulnerable.  Looking back on the epidemic that devastated New England in 1735-1736, the lexicographer Noah Webster termed it “literally the plague among children.”  It was the epidemic, he added, in which families often lost all, or all but one, of their children.

A century later, diphtheria epidemics continued to target the young, especially those in cities.  Diphtheria, not smallpox or cholera, was “the dreaded killer that stalked young children.”[1]   It was especially prevalent during the summer months, when children on hot urban streets readily contracted it from one another when they sneezed or coughed or spat.  The irony is that a relatively effective treatment for the disease was already in hand.

In 1882, Robert Koch’s assistant, Friedrich Loeffler, published a paper identifying the bacillus – the rod-shaped bacterium Corynebacterium diphtheriae first identified by Edwin Klebs – as the cause of diphtheria.  German scientists immediately went to work, injecting rats, guinea pigs, and rabbits with live bacilli, and then injecting their blood serum – blood from which cells and clotting factors have been removed – into infected animals to see if the diluted serum could produce a cure.  Then they took blood from the “immunized” animal, reduced it to the cell-free blood liquid, and injected it into healthy animals.  The latter, to their amazement, did not become ill when injected with diphtheria bacilli.  This finding was formalized in the classic paper of Emil von Behring and Shibasaburo Kitasato of 1890, “The Establishment of Diphtheria Immunity and Tetanus Immunity in Animals.”  For this, von Behring was awarded the very first Nobel Prize in Medicine in 1901.

Thus the birth of blood serum therapy, precursor of modern vaccines and antibiotics alike.  By the early 1890s, Emile Roux and his associates at the Pasteur Institute discovered that infected horses, not the rabbits used by Behring and Kitasato, produced the most potent diphtheria serum of all.  Healthy horses injected with a heat-killed broth culture of diphtheria, it was found, could survive repeated inoculations with the live bacilli.  The serum, typically referred to as antitoxin, neutralized the highly poisonous substances – the exotoxins – secreted by diphtheria bacteria.

And there was more:  horse serum provided a high degree of protection for another mammal, viz., human beings.  Among people who received an injection of antitoxin, only one in eight developed symptoms on exposure to diphtheritic individuals.  In 1895 two American drug companies, H. K. Mulford of Philadelphia and Parke, Davis of Detroit, began manufacturing diphtheria antitoxin.  To be sure, their drug provided only short-term immunity, but it sufficed to cut the U.S. death rate among hospitalized diphtheria patients in half.  This fact, astonishing for its time, fueled the explosion of disease-specific antitoxins, some quite effective, some less so.  By 1904 Mulford alone had antitoxin preparations for anthrax, dysentery, meningitis, pneumonia, tetanus, streptococcus infections, and of course diphtheria.

Colorful Mulford antitoxin ad from early 20th century, featuring, of course, the children

In the era of Covid-19, there are echoes all around of the time when diphtheria permeated the nation’s everyday consciousness. Brilliant scientists, then and now, deploying all the available resources of laboratory science, developed safe and effective cures for a dreaded disease.  But more than a century ago, the public’s reception of a new kind of preventive treatment – an injectable horse-derived antitoxin – was unsullied by the resistance of massed anti-vaccinationists whose anti-scientific claims are amplified by that great product of 1980s science, the internet. 

To be sure, in the 1890s and early 20th century, fringe Christian sects anticipated our own selectively anti-science Evangelicals.  It was sacrilegious, they claimed, to inject the blood product of beasts into human arms, a misgiving that did nothing to assuage their hunger for enormous quantities of beef, pork, and lamb.  Obviously, their God had given them a pass to ingest bloody animal flesh.  Saving children’s lives with animal blood serum was apparently a different matter. 

During the summer months, parents lived in anxious expectation of diphtheria every day their children ventured onto city streets.  Their fear was warranted and not subject to the denials of self-serving politicians.  In 1892, New York City’s Health Department established the first publicly funded bacteriological laboratory in the country, and between 1892 and the summer of 1894, the lab proved its worth by developing a bacteriological test for diagnosing diphtheria.  Infected children could now be sent to hospitals and barred from public schools.  Medical inspectors, armed with the new lab tests, went into the field to enforce a plethora of health department regulations.

Matters were simplified still further in 1913, when the Viennese pediatrician Bela Schick published the results of experiments demonstrating how to test children for the presence or absence of diphtheria antitoxin without sending their blood to a city lab. Armed with the “Schick test,” public health physicians and nurses could quickly and painlessly determine whether or not a child was immune to diphtheria.  For the roughly 30% of New York City school children who had positive reactions, injections of antitoxin could be given on the spot.  A manageable program of diphtheria immunization in New York and other cities was now in place.    

What about public resistance to the new proto-vaccine?  There was very little outside of religious fringe elements.  In the tenement districts, residents welcomed public health inspectors into their flats.  Intrusion into their lives, it was understood, would keep their children healthy and alive, since it led to aggressive intervention under the aegis of the Health Department.[2]  And it was not only the city’s underserved, immigrants among them, who got behind the new initiative.  No sooner had Hermann Biggs, head of the city’s bacteriological laboratory, set in motion the lab’s inoculation of horses and preparation of antitoxin, than the New York Herald stepped forward with a fund-raising campaign that revolved around a series of articles dramatizing diphtheria and its “solution” in the form of antitoxin injections.  The campaign raised sufficient funds to provide antitoxin for the Willard Parker Hospital, reserved for patients with communicable diseases, and for the city’s private physicians as well.  In short order, the city decided to provide antitoxin to the poor free of charge, and by 1906 the Health Department had 318 diphtheria antitoxin stations administering free shots in all five boroughs.[3][4]

A new campaign by New York City’s Diphtheria Prevention Commission was launched in 1929 and lasted two years.   As was the case three decades earlier, big government, represented by state and municipal public health authorities, was not the problem but the solution.  To make the point, the Commission’s publicity campaign adopted military metaphors.  The enemy was not government telling people what to do; it was the disease itself along with uncooperative physicians and recalcitrant parents.  “The very presence of diphtheria,” writes Evelynn Hammonds, “became a synonym for neglect.”[5]     

The problem with today’s Covid anti-vaccinationists is that their opposition to vaccination is erected on a foundation of life-preserving vaccination science of which they, their parents, their grandparents, and their children are beneficiaries.  They can shrug off the need for Covid-19 vaccination because they have been successfully immunized against the ravages of debilitating childhood diseases.  Unlike adults of the late-19th and early-20th centuries, they have not experienced, up close and personal, the devastation wrought summer after summer, year after year, by the diphtheria bacillus.  Nor have they lost children to untreated smallpox, scarlet fever, cholera, tetanus, or typhus.  Nor, finally, have they, in their own lives, beheld the miraculous transition to a safer world in which children stopped contracting diphtheria en masse, and when those who did contract the disease were usually cured through antitoxin injections.

In the 1890s, the citizens of New York City had it all over the Covid vaccine resisters of today.  They realized that the enemy was not public health authorities infringing on their right to keep themselves and their children away from antitoxin-filled syringes. No, the enemy was the microorganism that caused them and especially their children to get sick and sometimes die. 

Hail the supreme common sense that led them forward, and pity those among us for whom the scientific sense of the past 150 years has given way to the frontier “medical freedom” of Jacksonian America.  Anti-vaccinationist rhetoric, invigorated by the disembodied camaraderie of internet chat groups, does not provide a wall of protection against Covid-19.  Delusory thinking is no less delusory because one insists, in concert with others, that infection can be avoided without the assistance of vaccination science.  The anti-vaccinationists need to be vaccinated along with the rest of us.  A healthy dose of history wouldn’t hurt them either.


[1] Judith Sealander, The Failed Century of the Child: Governing America’s Young in the Twentieth Century (Cambridge: Cambridge Univ. Press, 2003), p. 326.

[2] Evelynn Maxine Hammonds, Childhood’s Deadly Scourge: The Campaign To Control Diphtheria in New York City, 1880-1930 (Baltimore: Johns Hopkins University Press, 1999), 84-86.

[3] William H. Park, “The History of Diphtheria in New York City,” Am. J. Dis. Child., 42:1439-1445, 1931.

[4] Marian Moser Jones, Protecting Public Health in New York City: Two Hundred Years of Leadership, 1805-2005 (NY: New York City Department of Health and Mental Hygiene, 2005), 20.                                     

[5] Hammonds, op. cit., p. 206.

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.

Anti-vaccinationism, American Style

Here is an irony:  America’s staggering production of generations of scientific brainpower coexists with many Americans’ deep skepticism about science.  Donald Trump, a prideful scientific illiterate, rode to power on the back of many others who, like him, were skeptical about science and especially the role of scientific experts in modern life.  He maintains their allegiance still.

Why does this surprise us?  Anti-intellectualism was burned into the national character early in American history.  Those skeptical of this claim should read Richard Hofstadter’s brilliant twin studies of the 1960s, Anti-Intellectualism in American Life and The Paranoid Style in American Politics.  From the beginning of the American Experiment, democracy was antithetical to so-called European “elitism,” and this ethos gained expression, inter alia, in antebellum medicine.

The Founding Fathers, an intellectual elite in defense of democracy, were not part of the movement away from science.  When Benjamin Waterhouse introduced Edward Jenner’s smallpox vaccine to America in 1800, Washington, Adams, and Jefferson hailed it as the greatest discovery of modern medicine.  They appreciated the severity of smallpox, which had ravaged the Continental Army during the War of Independence.  Indeed, Washington was so desperate to rein in its decimation of his troops that, in 1777, he inoculated his entire army with pus from active smallpox lesions, knowing that the resulting infections would be milder and far less likely to cause fatalities than smallpox naturally contracted.  When Jefferson became president in 1801, he pledged to introduce the vaccine to the American public, because “it will be a great service indeed rendered to human nature to strike off the catalogue of its evils so great a one as the smallpox.”  Not to be outdone in support of Jenner’s miraculous discovery, Jefferson’s successor, James Madison, signed into law in 1813 “An Act to Encourage Vaccination.”  Among its provisions was the requirement that the U.S. postal service “carry mail containing vaccine materials free of charge.”[1]

But this appreciation of the vaccine was short-lived, and Jefferson’s hope that the value of vaccination would seep into public consciousness was never realized.  In Jacksonian America, the Founding Fathers’ belief that medical progress safeguarded democracy gave way to something far less enlightened:  democracy now meant that everyone could be, indeed should be, his own doctor.  Most Americans had no need for those with university educations, much less clinical experience in governmentally managed public hospitals.  Jacksonian America emerged as what the historian Joseph Kett termed the “Dark Age of the profession.”[2]  During this time, the nation lay claim to a medical elite only because a few monied members of the medical intelligentsia – John Collins Warren, Valentine Mott, Philip Syng Physick, William Gibson, and David Hosack, among them – found their way to European medical centers in London, Edinburgh, and, somewhat later, Paris.

Otherwise, it was every man for himself, which usually meant every woman for herself and her family.  Homeopaths, herbalists, Thomsonians, eclectics, hydropaths, phrenologists, Christian Scientists, folk healers, faith healers, uroscopians, chromo-thermalists – each exemplified the democratic mind in action.[3]  Sad to say, homegrown “regular” American medicine of the day, with its reliance on depletive (bleeding, vomiting, purging) and stimulative (alcohol, quinine) treatments, was no better and often worse.  The belief, Galenic in origin, that all diseases were variants of the same global type of bodily dysregulation is startlingly close to Donald Trump’s holistic medieval approach to bodily infection and its treatment.

The birth of scientific medicine in the decades following the Civil War could not still the ardor of America’s scientific illiterati.  The development of animal blood-derived serums (antitoxins), forerunners of modern antibiotics, was anathema to many.  Among them were religionists, mainly Christian, for whom injecting the blood product of a horse or sheep into the human body was not only repugnant but sinful.  Better to let children be stricken with smallpox, diphtheria, and tetanus, sometimes to the point of death, than violate what they construed as divine strictures – strictures, be it noted, not intimated, much less codified, in the body of doctrine of any of the five major world religions.[4]

Antivaccinationists of the early 20th century were an unhappy lot.  They were unhappy about the proliferation of medicines (“biologics”) for treating illness.  And they deeply resented the intrusion of the State into domains of parental decision-making in the form of newly empowered social workers, visiting nurses, and educators.  In fact, antivaccinationism was part and parcel of resistance to all things progressive, including scientific medicine.[5]  Holdovers from the free-wheeling anything-goes medicine of antebellum America – especially devotees of homeopathy and, of late, chiropractic – were prominent in its ranks.    

Now, in the face of a global pandemic no less lethal than the Great Influenza of 1918-1919, we hear the same irrational musings about the dangers of vaccines that animated the scientific illiterati at the turn of the 20th century.  For the foes of public health, any misstep in the manufacture or storage of smallpox vaccine – a much greater possibility over a century ago than today – was enough to condemn vaccination outright.  In 1901, smallpox vaccination of school children in Camden, NJ led to an outbreak of 100 cases of tetanus, with nine deaths.  Historians believe that, in all probability, the outbreak resulted not from a contaminated batch of vaccine but rather from poor care of the vaccination site.  But Congress accepted the possibility of contamination, and the incident led to passage of the Biologics Control Act of 1902.[6]  Henceforth every manufacturer of vaccine had to be licensed by the Secretary of the Treasury (relying on the PHS Laboratory of Hygiene), and each package of vaccine had to be properly labeled and dated and was subject to inspection.[7]

And this leads to a second irony: the more preventive medicine advanced, incorporating additional safeguards into vaccine production, storage, and administration, the greater the resistance of the illiterati.  Throughout the 20th century and right down to the present, the antebellum notion of science-free “medical freedom” continues to hold sway.  Then and now, it means the right to put children at risk for major infectious disease that could result in death – and the right, further, to pass disease, possibly severe and occasionally fatal, on to others.

It follows that, then and now, the science illiterati are skeptical of, if not distressed by, the State’s commitment to public health.  It was Oklahoma Senator Robert Owen’s proposed legislation of 1910 to combine five federal departments into a cabinet-level Department of Public Health that pushed the opponents of medical “tyranny” onward.  The Anti-Vaccination League of America, formed in 1908, was joined by the National League for Medical Freedom in 1910.  Eight years later, they were joined by the American Medical Liberty League.  For all three groups, anti-Progressivism was in full swing.  “Medical freedom” meant exempting children not only from compulsory vaccination but from medical examinations at school.  Further, young adults should not be subjected to premarital syphilis tests.  Nor did the groups’ expansive view of medical tyranny flinch in the face of public education about communicable disease:  municipal campaigns against diphtheria were to be forbidden entirely.

With the deaths of the founders of the Anti-Vaccination League (Charles Higgins) and the American Medical Liberty League (Lora Little) in 1929 and 1931, respectively, antivaccinationism underwent a dramatic decline.  The Jacksonian impulse that fueled the movement simply petered out, and by the later ’30s, Americans finally grasped that mainstream medicine was not simply another medical sect.  It was the real deal:  a medicine grounded in laboratory research that effectively immunized against disease, promoted relief and cure of those already infected, and thereby saved lives.

But was the embrace of scientific healing really universal?  A pinnacle of life-depriving anti-science occurred well beyond the 1930s.  Consider the belief of some Christian sects that certain life-saving medical interventions must be withheld from children on religious grounds.  It was only in 1982, more than 90 years after von Behring’s discovery of diphtheria antitoxin launched the era of serum therapy, that criminal charges were first brought against parents who had withheld necessary treatment from their children.  Of the 58 cases of such parental withholding of care, 55 involved fatalities.[8]  Child deaths among Christian Scientists alone included untreated diabetes (leading to diabetic ketoacidosis), bacterial meningitis, and pneumonia.  Now things are better for the children, since even U.S. courts that have overturned parents’ criminal convictions have come around to the mainstream belief that religious exemption laws are not a defense of criminal neglect – a fine insight for the judiciary to have arrived at more than a century after serum therapy scored major triumphs in the treatment of rabies, diphtheria, tetanus, pneumococcal pneumonia, and meningococcal meningitis.

Should vaccination for the Covid-19 virus be a requirement for attendance in public and private schools?  How can the question even be asked?  As early as 1827, a Boston school committee ordered teachers to require entering students to give evidence of smallpox vaccination.[9]  Statewide vaccination requirements for smallpox followed in Massachusetts in 1855, New York in 1862, Connecticut in 1872, and Pennsylvania in 1895.  And the inoculations were effective across the board.  They quickly brought outbreaks of smallpox underway at the time of inoculation under control, and they prevented their recurrence in the future. These laws and those that followed were upheld by the Supreme Court in 1922 in Zucht v. King.[10]      

Twentieth-century vaccines were developed for pertussis in 1914, diphtheria in 1926, and tetanus in 1938.  In 1948 the three were combined and given to infants and toddlers at regular intervals as the DTP vaccine.  There was no hue and cry in 1948 or the years to follow.  And yet, the same fear of vaccination that the New York State Health Department had to overcome when it launched its statewide drive to immunize children against diphtheria now renders a new generation of parents resistant to mandatory Covid-19 vaccination for their own children.

Bear in mind that the anti-science rhetoric of today’s illiterati can be mobilized just as easily to resist DTP or any subsequent vaccine administered to their children.  Why subject a child to DTP vaccination?  Perhaps combining three different vaccines into one injection entails heightened risks.  Perhaps the batch of vaccine in the hands of one’s own doctor has been contaminated.  Perhaps one’s child will be among the minuscule number that have a minor allergic reaction.  And, after all, children who contract diphtheria, pertussis, and/or tetanus will hardly die from their infections, especially with the use of antibiotics.  Why inject foreign matter into healthy infants? – the very argument adduced by the opponents of diphtheria vaccine a century ago.

The problem with antivaccinationist rhetoric in the 21st century is that its proponents are all beneficiaries of more than a century of mandatory vaccination policy.  If they lived in a society bereft of vaccines – or, for the unvaccinated, of the immunity conferred by the vast herd of immunes – they would have led very different lives.  Indeed, some would not be here to celebrate solipsism masquerading as individualism.  Their specious intuitions about the risks of vaccination are profoundly anti-social, since they compromise the public’s health.  Parents who decide not to vaccinate their children put the entire community at risk.  The community includes not only their own children, but all those who desire protection but cannot receive it:  children too young to be vaccinated, those with actual medical contraindications to vaccination, and the minuscule number who have been vaccinated but remain unprotected.[11]

Nor is it strictly a matter of providing equal protection to individuals who seek, but cannot receive, the protection afforded by compulsory vaccination.  In a secular society, religious objections to vaccination pale alongside the health of the community.  Whether framed in terms of a “compelling state interest” in mitigating a health threat (Sherbert v. Verner [1963]) or the individual’s obligation to comply with “valid and neutral laws of general applicability” whatever their incidental religious implications (Employment Division, Department of Human Resources of Oregon v. Smith [1990]), the U.S. Supreme Court has consistently held that mandatory vaccination laws need not allow religious exemptions of any kind.

Antivaccinationists might bear in mind a few particulars as they align themselves with the infectious dark ages.  Between 1900 and 1904, an average of 48,164 cases of smallpox and 1,528 smallpox deaths were reported each year. With the arrival of compulsory vaccination in schools, the rate fell drastically and outbreaks of smallpox ended in 1929. The last case of smallpox in the U.S. was reported in 1949.[12]  

Among American children, diphtheria was a major cause of illness and death through 1921, when 206,000 cases and 15,520 deaths were recorded.  Before Emil von Behring’s diphtheria antitoxin became available in 1894 to treat infected children, the death rate among children struck down, especially during the hot summer months, could reach 50%.  Within several years, use of the antitoxin brought it down to 15%.[13]  Then, by the late 1920s, diphtheria immunization was introduced and diphtheria rates fell dramatically, both in the U.S. and other countries that vaccinated widely.  Between 2004 and 2008, no cases of diphtheria were recorded in the U.S.[14]
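A quick check of the 1921 figures, assuming the case and death counts just cited, shows how far antitoxin had already shifted the odds:

\[
\frac{15{,}520 \text{ deaths}}{206{,}000 \text{ cases}} \approx 0.075 = 7.5\%,
\]

a recorded case-fatality rate well below the 50% of the pre-antitoxin years, and below even the 15% figure for the severest cases – in part, no doubt, because the recorded cases include milder infections as well.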

Between 1951 and 1954, paralytic polio cases in the United States averaged 16,316 a year, of which 1,879 resulted in death.  Then science came to the rescue.  Jonas Salk’s dead-poliovirus vaccine became available in 1955, and Albert Sabin’s live-poliovirus variant four years later.  By 1962, there were fewer than 1,000 cases a year and, in every year thereafter, fewer than 100 cases.[15]

Now, alas, some parents still worry that the measles component of the MMR (measles, mumps, rubella) vaccine available since 1971 may lead to childhood autism.  Why?  The disease-promoting mythologies of the illiterati must be resisted at all costs.  Autism is a neuro-developmental disorder with a strong genetic component; its genesis is during the first year of life, before the vaccine is even administered.  None of the epidemiologists who have studied the issue has found any evidence whatsoever of an association, not among normal children and not among high-risk children with autistic siblings.[16]  The fact is that children who do not receive a measles vaccine have been found 35 times more likely to contract measles than the vaccinated.[17]  And measles is no laughing matter.  When contracted later in life, measles and mumps are serious and can be deadly.  They were among the major systemic infections that felled soldiers during the Civil War, the Spanish-American War, the Anglo-Boer War, and World War I.[18]

All of which leads to a conclusion in the form of an admonishment.  Accept the fact that you live in a secular society governed by law and a network of agencies, commissions, and departments lawfully enjoined to safeguard public health.  Do your part to sustain the social contract that came into existence when the Founding Fathers, elitists molded by European thought who had imbibed the social contractualism of John Locke, wrote the American Constitution.

Vaccination is a gift that modern science bestows on all of us – vaccination proponents and opponents alike. When one of the two FDA-approved Covid-19 vaccines comes to a clinic or storefront near you, run, don’t walk, to get your and your children’s shots. Give thanks to the extraordinarily gifted scientists at Pfizer and Moderna who created these vaccines and demonstrated their effectiveness and safety. Make sure that everyone’s children grow up, paraphrasing the U.S. Army’s old recruiting slogan, to be all they can be.   


[1] Dan Liebowitz, “Smallpox Vaccination: An Early Start of Modern Medicine in America,” J. Community Hosp. Intern. Med. Perspect., 7:61-63, 2017 (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5463674).

[2] Joseph F. Kett, The Formation of the American Medical Profession: The Role of Institutions, 1780-1860 (New Haven: Yale University Press, 1968), p. vii. 

[3] Robert E. Riegel, Young America, 1830-1840 (Westport, CT: Greenwood Press, 1973 [1949]), pp. 314-315, quoted at  314. 

[4] John D. Grabenstein, “What the World’s Religions Teach, As Applied to Vaccines and Immune Globulins,” Vaccine, 31:2011-2023, 2013.

[5] James Colgrove, “’Science in Democracy’: The Contested Status of Vaccination In the Progressive Era and the 1920s,” Isis, 96:167-191, 2005.

[6]  Harry F. Dowling, Fighting Infection: Conquests of the Twentieth Century (Cambridge, MA: Harvard University Press, 1977), 38; Harry M. Marks, The Progress of Experiment: Science and Therapeutic Reform in the United States, 1900-1990 (Cambridge: Cambridge University Press, 1997), 73-74.

[7] Jonathan Liebenau, Medical Science and Medical Industry: The Formation of the American Pharmaceutical Industry (Baltimore: Johns Hopkins, 1987), 89-90.

[8] Janna C. Merrick, “Spiritual Healing, Sick Kids and the Law: Inequities in the American Healthcare System,” Amer. J. Law & Med., 29:269-300, 2003, at 280.

[9] John Duffy, “School Vaccination: The Precursor to School Medical Inspection,” J. Hist. Med. & Allied Sci., 33:344-355, 1978.

[10] Kevin M. Malone & Alan R. Hinman, “Vaccination Mandates: The Public Health Imperative and Individual Rights,” Law in Public Health Practice (2009), 262-284, at 272.

[11] Alan R. Hinman, et al., “Childhood Immunization: Laws that Work,” J. Law, Med. & Ethics, 30(suppl):122-127, 2002.

[12] Frank Fenner, et al., Smallpox and its Eradication (Geneva: World Health Organization, 1988).

[13] Karie Youngdahl, “Early Uses of Diphtheria Antitoxin in the United States,” The History of Vaccines, August 2, 2010 (https://www.historyofvaccines.org/content/blog/…).

[14] Epidemiology and Prevention of Vaccine-Preventable Diseases, 11th Edition (The Pink Book). National Immunization Program, Centers for Disease Control and Prevention (http://www.cdc.gov/vaccines/Pubs/pinkbook/downloads/dip.pdf); Diphtheria. WHO, Regional Office for the Western Pacific (http://www.wpro.who.int/health_topics/diphtheria).

[15] CDC. Annual summary 1980: Reported Morbidity and Mortality in the United States. MMWR 1981;29; CDC, Reported Incidence of Notifiable Diseases in the United States, 1960. MMWR 1961;9.

[16] Frank DeStefano & Tom T. Shimabukuro, “The MMR Vaccine and Autism,” Ann. Rev. Virol., 6:585-600, 2019.

[17] Hinman, op. cit. (note 11).

[18] Paul E. Stepansky, Easing Pain on the Western Front:  American Nurses of the Great War and the Birth of Modern Nursing Practice (Jefferson, NC:  McFarland, 2020), 36, 50, 96, 144.

 

Copyright © 2020 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in their courses and seminars let him know via info[at]keynote-books.com.