Category Archives: General

Christian Healthcare for Christian Nationalists

 

Christian Nationalism (CN):   The belief that the United States is, and always has been, a Christian nation.  An oxymoron in the context of the explicit language of the Bill of Rights and the Constitution.

CN Proponents:  Titans of American Ignorance in all its anti-historical, anti-rationalist, anti-democratic glory.  The nation defined by the Bill of Rights and the Constitution does not permit the qualifier "Christian."  The phrase is quite literally nonsensical.

America’s Founding Fathers:   A group of educated gentlemen, some avowed Christians and others deists influenced by English and French freethinkers. Whatever their personal convictions, the Founders collectively established  a secular republic predicated on religious freedom and the separation of Church and State.  The documents they bequeathed to us and that continue to shape our national sense of self – the Declaration of Independence and the Constitution – do not establish a Christian nation.[i]

_____________________

It’s August 21, 1790, and George Washington sets pen to paper and writes a letter to the Hebrew Congregation of Newport, Rhode Island.  Following the state’s ratification of the Constitution, Washington congratulates the Newport congregants for joining a nation where “every one shall sit in safety under his own vine and fig-tree and there shall be none to make him afraid.”   And he continues: “For happily the Government of the United States gives to bigotry no sanction, to persecution no assistance, requires only that they who live under its protection should demean themselves as good citizens, in giving on all occasions their effectual support.”

Now it's June 7, 1797, and Washington's successor, John Adams, adds his ringing endorsement to the Senate's unanimous ratification of the Treaty of Tripoli.  All the nation's "citizens and inhabitants thereof" are enjoined "faithfully to observe and fulfill the said Treaty and every clause and article thereof."  Article 11 of the Treaty begins with this avowal: "the government of the United States of America is not in any sense founded on the Christian Religion."  Nothing in the rest of the Article mitigates the plain meaning of this statement.

And now on New Year’s Day, 1802, Thomas Jefferson composes a letter to the Danbury Baptist Association.  Here the  third president asserts, famously, that “the legitimate powers of the government reach actions only, & not opinions.”   It followed that Jefferson contemplate[d] with “sovereign reverence that act of the whole American people which declared that their legislature should ‘make no law respecting an establishment of religion, or prohibiting the free exercise thereof,’ thus building a wall of separation between Church & State.”[ii]

These words tell us how the first three American presidents understood the nation they helped create.  But more important than these words, or any other words they wrote or spoke, is the document they and their colleagues signed in 1787 and bequeathed to future generations.  This charter of government, the Constitution of the United States, took effect on March 4, 1789 and has guided the nation these past 233 years.

____________________

Christian Nationalism, code for an un-American Christian Nation-State, seeks to overthrow the Constitution, the founding document on which the American Republic is predicated.  The prospect of a Christian Nation that consigns the values, principles, and precepts of the Constitution to the dustbin of history is the stuff of nightmares.  Many nightmares.  What follows is a gloss on one of them:  What might well follow, indeed, what ought to follow, in the domain of healthcare after the CNs come to power:

  1.  Society’s commitment to public health would be overturned by the Supreme Court.  Jehovah’s Witnesses and Christian Scientists would be entitled, as a matter of law, to deprive their children of life-saving blood transfusions and tissue and organ transplants.  If you’re a Christian Scientist parent, for example, go ahead and let your children die, as they have in the past, from untreated diabetes (leading to diabetic ketoacidosis), bacterial meningitis, and pneumonia.  What American courts define as criminal neglect would be sanctioned – as long as it resulted from one or another variety of Christian belief.  A litmus test for membership in the Christian Nation could be repudiation of compulsory childhood vaccination, with parents who deny their children vaccination on religious grounds applauded for putting their children at risk, and sometimes sacrificing them, out of adherence to their version of a Christian Life.  Similarly, during times of pandemic, Christians who, as beneficiaries of Divine protection, chose to ignore quarantine directives from leftist health organizations like the CDC and WHO would  receive the blessing of the  State.  All such groups would be following in the footsteps of many contemporary Evangelicals.  As Covid-19 gripped the nation and the world, Evangelicals from California to Florida, courageous Christians all, refused to follow social distancing and stay-at-home guidelines; they continued to assemble for communal worship in churches and homes, placing themselves and their communities in peril.[iii]
  2. America's long and inglorious tradition of discrimination in medical education would be rejuvenated on behalf of the Christian state.  Examples of exclusion by race, religion, and gender abound, and they can be drawn on to guide Christian Nationalists in any number of discriminatory practices for marginalizing the presence of non-Christians in American healthcare.  Consider only that by the mid-1930s, over 2,000 American medical students, 95% of whom were Jews, were driven to Europe to pursue medicine.  Seven years later, Charles Drew wrote a blistering letter to the editor of JAMA, protesting the AMA's racially based exclusion of qualified black applicants whose state chapters refused them membership, thereby keeping them out of the national organization.  The American Nurses Association (ANA) was little better.  Founded in 1896, it allowed qualified black nurses from states with racist state chapters direct admittance to the national organization only in 1950.  The Georgia chapter, incidentally, continued to exclude blacks until 1961, and retreated only after the ANA threatened to expel it from the national organization.[iv]  And let us not forget quota systems, implemented to keep Jews out of both elite universities and medical schools after World War I.  After all, they were followed by the quota system implemented in the Immigration Acts of 1921 and 1924, a device to keep East European immigrants out of the country – a project no doubt congenial to Christian Nationalists.[v]
  3. Christian Healthcare would enjoin believing Christians to follow the dictates of conscience in deploying life-saving medications, procedures, and technologies on nonbelievers.  EMTs and medics, for example, would no longer be legally or professionally obligated to provide assistance to Jews, Muslims, Hindus, atheists, and other non-Christians.  This would require a Constitutional amendment, since the Constitution makes no allowance for conscience as a ground for violating laws and lawfully implemented directives, as in the denial of life-saving medical interventions.  The First Amendment provides only for freedom of religion, understood as the freedom to practice the religion of one's choice through voluntary affiliation with one or another House of Worship (or no House of Worship at all).
  4. It follows that Christian physicians, nurses, and other providers would be free, as practicing Christians, to provide services only to Christians. They might, at their conscience-driven discretion, avoid nonbelievers entirely or simply privilege the needs (as to medications, nourishment, and allocation of scarce resources) of Christians.  Self-evidently, Christian surgeons would be under no legal, professional, or moral obligation to operate on Jews, Muslims, Hindus, atheists, and other nonbelievers; nor would Christian anesthesiologists be required to administer anesthesia to them.  Professional codes of ethics would have to be revised (i.e., Christianized) accordingly.  In toto, under the auspices of a Christian nation, there would be a vast expansion of the “refusal laws” that individual states have passed to free hospitals, physicians, and nurses from any obligation to provide patients with abortions and other reproductive services, including contraceptives, genetic counseling, infertility treatment, STD and HIV testing, and treatment of victims of sexual assault.  Constitutional amendments would be required on this score as well, since such “laws of conscience,” whatever their religious moorings, have no legal, judicial, or moral status in the Constitution.
  5. Following the example of the National Blood Program of 1941, the blood bank set up to provide Caucasian-only blood to the American armed forces, all nationally sanctioned blood banks would be limited to Christian donors.[vi]  There is ample historical precedent regarding the sacrosanctity of Christian blood and blood products; witness the Italian residents of Bolzano who, newly absorbed into Bavaria by Napoleon in 1807, launched an armed revolt against mandatory smallpox vaccination lest Protestantism be injected into their Catholic veins.  Over a century later, a Nazi military directive forbidding the transfusion of Jewish blood into the veins of German military personnel led to the death of countless war wounded.  America was little better in the collection, identification, and storage of blood.  The Red Cross Blood Donor Program, after refusing the blood of black Americans for a year, began accepting it in January 1942.  But it continued to segregate blood by donor race until 1950; southern states like Arkansas and Louisiana held firm to segregated blood collection until the early 1970s.[vii]  These precedents would be seized on in the time of CN.  In the new America, the blood of nonbelievers could be collected by their respective agencies and made available to hospitals and clinics amenable to receiving and storing impure blood for non-Christian patients.  Institutions that continued to permit cross-religious transfusions would require signed waivers from Christian patients willing to accept transfusions of non-Christian blood under exigent circumstances.  Such waivers could be incorporated into Living Wills.

Christian Healthcare is only one of the societal transformations that await the ascendancy of Christian Nationalism.  The anti-intellectual disemboweling of American public education, especially in the South, is already well under way; where will it end up when the white CNs assume control?  To those who espouse it, I say:  Congratulations.  You have destroyed the America envisioned by the Founding Fathers and enshrined in the Constitution and Bill of Rights.  You have replaced the wall of separation between Church and State with a wall of separation between Christian and non-Christian.  In so doing, you have laid the seedbed for one more theocracy, a Christian sibling to the virulently anti-democratic Muslim theocracies of the Middle East.

The American theocracy will reach its apotheosis over time.  But when the Christian Nationalists assume political control, there will be immediate changes.  The United States will all at once be a two-tier society stratified along religious lines.  It will not only be Jews who, failing to throw their votes to Christian leaders, will have to watch their backs.  Everyone who opposes Christian National hegemony will be at risk.  We will all have to, in the ex-president’s  subtle formulation, “watch it.”

What to call the new state?  Christian Nationalists may profitably analogize from the example of Saudi Arabia.  If we replace “Saudi” (i.e., the Kingdom of Saud) with the New Testament’s “Kingdom of God,” and let “New Jerusalem” stand  for it, we arrive at a suitable replacement for the United States of America.  Here, Christian Nationalists, is the nation of your dreams and our nightmares.  I give you  New Jerusamerica.

January 6, 2021

______________________

[i] I am grateful to my friend and colleague of many decades, Professor Jeffrey Merrick, for his help in formulating my comments on the Founding Fathers, religion, and the founding of the American Republic.  Among recent books elaborating in scholarly detail these comments, see especially Steven K. Green, Inventing a Christian America:  The Myth of the Religious Founding (New York:  OUP, 2015).

[ii] Washington’s and Jefferson’s letters and Adams’ remarks to Congress are in the public domain and widely reproduced on the internet.

[iii] Ed Kilgore, “Many Evangelicals are Going to Church Despite Social-Distancing Guidelines,” New York Magazine, April 17, 2020  (https://nymag.com/intelligencer/2020/04/many-evangelicals-defying-guidelines-on-in-person-gatherings.html); Bianca Lopez, “Religious Resistance to Quarantine Has a Long History,”  (https://blog.smu.edu/opinions/2020/08/07/religious-resistance-to-quarantine-has-a-long-history).  “In numerous parts of the United States,”  Lopez writes, “certain stripes of Christianity and quarantine orders stand in direct opposition, resulting in deadly outcomes due to the COVID-19 pandemic.”

[iv] Edward C. Halperin, "The Jewish Problem in Medical Education, 1920-1955," J. Hist. Med. & Allied Sci., 56:140-167, 2001, at 157-158; Patricia D'Antonio, American Nursing: A History of Knowledge, Authority, and the Meaning of Work (Baltimore: Johns Hopkins, 2010), 130.

[v] David Oshinsky, Bellevue: Three Centuries of Medicine and Mayhem at America's Most Storied Hospital (NY: Doubleday, 2016), 196-198; Ian Robert Dowbiggin, Keeping America Sane: Psychiatry and Eugenics in The United States and Canada, 1880-1940 (Ithaca: Cornell University Press, 1997), 224-227.

[vi] Charles E. Wynes, Charles Richard Drew: The Man and The Myth (Urbana: Univ. Illinois Press, 1988), 67; “Nazi Order Prohibiting Jewish Blood for Transfusions Causing Death of Many Soldiers,” JTA Daily News Bulletin, March 2, 1942 (https://www.jta.org/archive/nazi-order-prohibiting-jewish-blood-for-transfusions-causing-death-of-many-soldiers).  Note that I am not addressing Christian sects, like Jehovah’s Witnesses, whose members refuse blood transfusions altogether, only those that accept transfusions, but only of Christian blood.

[vii] Thomas A. Guglielmo, "Desegregating Blood: A Civil Rights Struggle to Remember," February 4, 2018 (https://www.pbs.org/newshour/science/desegregating-blood-a-civil-rights-struggle-to-remember).  For a lengthier consideration of blood and race in American history, see Spencie Love, One Blood: The Death and Resurrection of Charles R. Drew (Chapel Hill: Univ. North Carolina Press, 1996), 139-160.

Copyright © 2022 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in their courses and seminars let him know via info[at]keynote-books.com.

Malaria in the Ranks

Malaria (from Italian "bad air"): Infection transmitted to humans by the bites of mosquitoes carrying single-celled parasites, most commonly Plasmodium (P.) vivax and P. falciparum.  Mosquito vector discovered by Ronald Ross of the Indian Medical Service in 1897.  Symptoms: Initially recurrent ("intermittent") fever, then constant high fever, violent shakes and shivering, nausea, vomiting.  Clinical descriptions as far back as Hippocrates in the fifth century B.C. and earlier still in Hindu and Chinese writings.  Quinine: Bitter-tasting alkaloid from the bark of cinchona (quina-quina) trees, indigenous to the Andes of South America.  Used to treat malaria from the 1630s through the 1920s, when more effective synthetics became available.  Isolated from cinchona bark in 1820 by the French chemists Pierre Joseph Pelletier and Joseph Caventou.  There you have it.  Now read on.

_______________________

It’s 1779 and the British, commanded by  Henry Clinton adopt a southern strategy to occupy the malaria-infested Carolinas. The strategy appears successful, as British troops commanded by Charles Lord Cornwallis capture Charleston on March 29, 1780.  But appearances can be deceiving. In reality, the Charleston campaign has left the British force debilitated.  Things get worse when Cornwallis marches inland in June, where his force is further ravaged by malarial  fever carried by Anopheles mosquitoes and Plasmodium parasites.  Lacking quinine, his army simply melts away in the battle to follow. Seeking to preserve what remains of his force, Cornwallis looks to the following winter as a time to  recuperate and rebuild.  But it is not to be.  Clinton sends him to Yorktown, where he occupies a fort between two malarial swamps on the Chesapeake Bay.  Washington swoops south and, aided by French troops, besieges the British.  The battle is over almost before it has begun.  Cornwallis surrenders to Washington, but only after his army has succumbed to malarial bombardment by the vast army of mosquitoes. The Americans have won the Revolutionary War.  We owe American independence to Washington’s command, aided, unwittingly, by mosquitoes and malaria.[1] 

Almost two centuries later, beginning in the late 1960s, malaria again joins America at war.  Now the enemy is communism, and the site is Vietnam.  The Republic of Korea (ROK), in support of the war effort, sends over 30,000 soldiers and tens of thousands of civilians to Vietnam.  The calculation is plain enough: South Korea seeks to consolidate America’s commitment to its economic growth and military defense in its struggle with North Korean communism after the war.  It works, but there is an additional, major benefit:  ROK medical care of soldiers and civilians greatly strengthens South Korean capabilities in managing infectious disease and safeguarding public health.  Indeed, at war’s end in 1975, ROK is an emergent powerhouse in malaria research and the treatment of parasitic disease.  Malaria has again played a part in the service of American war aims.[2]

Winners and losers aside, the battle against malaria is a thread that weaves its way through American military history.  When the Civil War erupted in 1861, outbreaks of malaria and its far more lethal cousin, yellow fever, did not discriminate between the forces of North and South.  Mosquito-borne disease mowed down combatants with utter impartiality.  For many, malarial infection was the enemy that precluded engagement of the enemy.  But there were key differences.  The North had the U.S. Army Laboratory, comprising laboratories in Astoria, New York, and Philadelphia.  In close collaboration with Powers and Weightman, one of only two American pharmaceutical firms then producing quinine, the Army Laboratory provided Union forces with ample purified quinine in standardized doses.  Astute Union commanders made sure their troops took quinine prophylactically, with troops summoned to their whiskey-laced quinine ration by the command, "fall in for your quinine."

Confederate troops were not so lucky.  The South lacked chemists able to isolate quinine from cinchona bark; nor did a Spanish embargo permit the drug's importation.  So the South had to rely on various plants and plant barks, touted by the South Carolina physician and botanist Francis Peyre Porcher as effective quinine substitutes.  But Porcher's quinine substitutes were all ineffective, and the South had to make do with the meager supply of quinine it captured or smuggled.  It was a formula for defeat, malarial and otherwise.[3]

Exactly 30 years later, in 1891, Paul Ehrlich announced that the application of  a chemical stain, methylene blue, killed malarial microorganisms and could be used to treat malaria.[4]   But nothing came of Ehrlich’s breakthrough seven years later in the short-lived Spanish-American War of 1898.   Cuba was a haven for infectious microorganisms of all kinds, and, in a campaign of less than four months, malaria mowed down American troops with the same ease it had in the Civil War.  Seven times more Americans died from tropical diseases than from Spanish bullets.  And malaria topped the list.  

As the new century approached, mosquitoes were, in both senses, in the air.  In 1900, Walter Reed returned to Cuba to conduct experiments with paid volunteers; they established once and for all that mosquitoes were the disease vector of yellow fever; one could not contract the disease from "fomites," i.e., the soiled clothing, bedding, and other personal matter of those infected.  Two years later, Ronald Ross received the Nobel Prize in Medicine for his work on the role of mosquitoes in the transmission of malaria.[5]  But new insight into the mosquito vector of yellow fever and malaria did not mitigate the dismal state of affairs that came with World War I.  The American military was no better prepared for the magnitude of malaria outbreaks than it had been during the Civil War.  At least 1.5 million soldiers were incapacitated as malaria spread across Europe from southeast England to the shores of Arabia, and from the Arctic to the Mediterranean.  Major epidemics broke out in Macedonia, Palestine, Mesopotamia, Italy, and sub-Saharan Africa.[6]

In the Great War, malaria treatment fell back on quinine, but limited knowledge of malarial parasites compromised its effectiveness.  Physicians of the time could not differentiate between the two species of parasite active in the camps – P. vivax and P. falciparum.  As a result, they could not optimize treatment doses according to these somewhat different types of infection.  Malarial troops, especially those with falciparum, paid the price.  Except for the French, whose vast malaria control plan spared their infantry from infection and led to victory over Bulgarian forces in September 1918, malaria's contribution to the Great War was what it had always been in war – it was the unexpected adversary of all.

Front cover of "The Illustrated War Times," showing WWI soldiers, probably Anzacs, taking their daily dose of quinine at Salonika, 1916.

In 1924, the problem that had limited the effectiveness of quinine during the Great War was addressed when the German pharmacologist Wilhelm Roehl, working with Bayer chemist Fritz Schönhöfer, developed the synthetic antimalarial Plasmoquin, which was far more effective against malaria than quinine.[7]  By the time World War II erupted, another antimalarial, Atabrine (quinacrine, mepacrine), synthesized in Germany in 1930, was available.  It would be the linchpin of the U.S. military's malaria suppression campaign, as announced by the Surgeon General in Circular Letter No. 56 of December 9, 1941.  But the directive had little impact in the early stages of the war.  U.S. forces in the South Pacific were devastated by malaria, with as many as 600 malaria cases for every 1,000 GIs.[8]  Among American GIs and British Tommies alike, the daily tablets were handed out erratically.  Lackluster command and side effects were part of the problem:  The drug turned skin yellow and occasionally caused nausea and vomiting.  From these side effects, the yellowing skin in particular, GIs leapt to the conclusion that Atabrine would leave them sterile and impotent after the war.  How they leapt to this conclusion is anyone's guess, but there was no medical information available to contradict it.[9]

The anxiety bolstered the shared desire of some GIs to evade military service.  A number of them tried to contract malaria in the hope of discharge or transfer – no one was eager to go to Guadalcanal.  Those who ended up hospitalized often prolonged their respite by spitting out their  Atabrine pills.[10]   When it came to taking Atabrine, whether prophylactically or as treatment, members of the Greatest Generation could be, well, less than great.

Sign posted at 363rd Station Hospital in Papua New Guinea in 1942, sternly admonishing U.S. Marines to take their Atabrine.

Malarial parasites are remarkably resilient, with chemically resistant strains emerging time and again.  New strains have enabled malaria to stay ahead of the curve, chemically speaking.  During the Korean War (1950-1953), both South Korean and American forces fell victim to vivax malaria.  American cases decreased with the use of chloroquine, but the improvement was offset by a rash of cases back in the U.S., where hypnozoites (dormant malarial parasites) came to life with a vengeance and caused relapses.  The use of yet another antimalarial, primaquine, during the latter part of the war brought malaria under better control.  But even then, in the final year of the war 3,000 U.S. and 9,000 ROK soldiers fell victim.[11]  In Vietnam, malaria reduced the combat strength of some American units by half and felled more troops than bullets.  Between 1965 and 1970, the U.S. Army alone reported over 40,000 cases.[12]  Malaria control measures were strengthened, yes, but so were the parasites, with the spread of drug-resistant falciparum and the emergence of a new chloroquine-resistant strain.

Malaria’s combatant role in American wars hardly ends with Vietnam.  It was a destructive force in 1992, when American troops joined the UN Mission “Operation Restore Hope” in Somalia.  Once more, Americans resisted directives to take a daily dose of  preventive medicine, now Mefloquine, a vivax antimalarial developed by the Army in 1985.  As with Atabrine a half century earlier, false rumors of  debilitating side effects led soldiers to stop taking it.  And as with Atabrine, malaria relapses knocked out soldiers following their return home, resulting in the largest outbreak of malaria stateside since Vietnam.[13] 

In Somalia, as in Vietnam, the failure of commanders to educate troops about the importance of "chemoprophylaxis" and to institute "a proper antimalarial regimen" was the primary culprit.  As a result, "Use of prophylaxis, including terminal prophylaxis, was not supervised after arrival in the United States, and compliance was reportedly low."[14]  It was another failure of malaria control for the U.S. military.  A decade later, American combat troops went to Afghanistan, another country with endemic malaria.  And there, yet again, "suboptimal compliance with preventive measures" – preventive medication, use of insect repellents, chemically treated tent netting, and so forth – was responsible for "delayed presentations" of malaria after a regiment of U.S. Army Rangers returned home.[15]  Plus ça change, plus c'est la même chose.

Surveying American history, it seems that the only thing more certain than malarial parasites during war is the certainty of war itself.  Why are both still the case?  As to the first question, understanding the importance of "chemoprophylaxis" in the service of personal and public health (including troop strength in the field) has never been a strong suit of Americans.  Nor have preventive measures, whether applying insecticides and tent netting (or wearing face masks), been congenial, historically, to libertarian Americans who prefer freedom in a Hobbesian state of nature to responsible civic behavior.  Broad-based public-school education on the public health response to epidemics and pandemics throughout history, culminating in the critical role of preventive measures in containing Coronavirus, might help matters.  In the military domain, Major Peter Weima sounded this theme in calling attention to the repeated failure of education in the spread of malaria among American troops in World War II and Somalia.  He stressed "the critical contribution of education to the success of clinical preventive efforts.  Both in WWII and in Somalia, the failure to address education on multiple levels contributed to ineffective or only partially effective malaria control."[16]  As to why war, in all its malarial ingloriousness, must accompany the human experience, there is no easy answer.

_____________________

[1] Peter McCandless, "Revolutionary fever: Disease and war in the lower South, 1776-1783," Trans. Am. Clin. Climatol. Assn., 118:225-249, 2007.  Matt Ridley provides a popular account in The Evolution of Everything: How New Ideas Emerge (NY: Harper, 2016).

[2] Mark Harrison & Sung Vin Yim, “War on Two Fronts: The fight against parasites in Korea and Vietnam,” Medical History, 61:401-423, 2017.  

[3] Robert D. Hicks, “’The popular dose with doctors’: Quinine and the American Civil War,” Science History Institute, December 6, 2013 (https://www.sciencehistory.org/distillations/the-popular-dose-with-doctors-quinine-and-the-american-civil-war).

[4] Harry F. Dowling, Fighting Infection: Conquests of the Twentieth Century  (Cambridge, MA: Harvard Univ. Press, 1977), 93.

[5] Thomas D. Brock, Robert Koch: A Life in Medicine and Bacteriology (Wash, DC: ASM Press, 1998 [1988]), 263.

[6] Bernard J. Brabin, "Malaria's contribution to World War One – The unexpected adversary," Malaria Journal, 13, 497, 2014; R. Migliani, et al., "History of malaria control in the French armed forces: From Algeria to the Macedonian Front during the First World War" [trans.], Med. Santé Trop, 24:349-61, 2014.

[7] Frank Ryan, The Forgotten Plague: How the Battle Against Tuberculosis was Won and Lost (Boston: Little, Brown, 1992), 90-91.

[8]  Peter J. Weima, “From Atabrine in World War II to Mefloquine in Somalia: The role of education in preventive medicine,” Mil. Med., 163:635-639, 1998, at 635.

[9] Weima, op. cit., p. 637, quoting Major General, then Captain, Robert Green during the  Sicily campaign  in August 1943:  “ . . . the rumors were rampant, that it made you sterile…. people did turn yellow.”

[10] Ann Elizabeth Pfau, Miss Yourlovin (NY:  Columbia Univ. Press, 2008), ch. 5.

[11] R. Jones, et al., "Korean vivax malaria. III. Curative effect and toxicity of Primaquine in doses from 10 to 30 mg daily," Am. J. Trop. Med. Hyg., 2:977-982, 1953; Joon-Sup Yeom, et al., "Evaluation of Anti-Malarial Effects," J. Korean Med. Sci., 5:707-712, 2005.

[12] B. S. Kakkilaya, “Malaria in Wars and Victims” (malariasite.com).

[13] Weima, op. cit.  Cf. M. R. Wallace et al., “Malaria among United States troops in Somalia,” Am. J. Med., 100:49-56, 1996.

[14] CDC, “Malaria among U.S. military personnel returning from Somalia, 1993,” MMWR, 42:524-526, 1993.

[15] Russ S. Kotwal, et al., "An outbreak of malaria in US Army Rangers returning from Afghanistan," JAMA, 293:212-216, 2005, at 214.  Of the 72% of the troops who completed a postdeployment survey, only 31% reported taking both their weekly tablets and continuing with their "terminal chemoprophylaxis" (taking medicine, as directed, after returning home).  Contrast this report with one for Italian troops fighting in Afghanistan from 2002 to 2011.  Their medication compliance was measured at 86.7%, with no "serious adverse events" reported and no cases of malaria occurring in Afghanistan.  Mario S. Peragallo, et al., "Risk assessment and prevention of malaria among Italian troops in Afghanistan, 2002 to 2011," J. Travel Med., 21:24-32, 2014.

[16] Weima, op. cit., 638.

 

Copyright © 2022 by Paul E. Stepansky.  All rights reserved.  The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.

 
 

Why Pellagra Matters

It was the dread disease of the four "D"s:  dermatitis, diarrhea, dementia, and death.  The symptoms were often severe: deep red rashes, with attendant blistering and skin sloughing on the face, lips, neck, and extremities; copious liquid bowels; and deepening dementia with disorganized speech and a host of neurological symptoms.  Death followed up to 40% of the time.  The disease was reported in 1771 by the Italian Francesco Frapolli, who observed it among the poor of Lombardy.  They called it pelagra – literally "rough skin" in the dialect of northern Italy.  Frapolli popularized the term (which later acquired a second "l"), unaware that the Spanish physician Don Gaspar Casal had described the same condition in 1735.

Cases later identified as pellagra occasionally appeared in American medical journals after the Civil War, but epidemic pellagra erupted only at the dawn of the 20th century.  Between 1902 and 1916, it ravaged mill towns in the American South.  Reported cases declined during World War I, but resumed their upward climb in 1919, reaching crisis proportions in 1921-1922 and 1927.  Nor was pellagra confined to the South.  Field workers and day laborers throughout the country, excepting the Pacific Northwest, fell victim.  Like yellow fever, pellagra was initially perceived as a regional problem but was elevated to the status of a public health crisis for the nation.  But it was especially widespread and horrific in the South.

In the decades following the Civil War, the South struggled to rebuild a shattered economy through a textile industry that revolved around cotton.  Pellagra found its mark in the thousands of underpaid mill workers who spun the cotton into yarn and fabric, all the while subsisting on a diet of cheap corn products and little else:  cornbread, grits, syrup, brown gravy, fatback, and coffee were the staples.  The workers' meager pay came as credit checks good only at company stores, and what company stores stocked, and what the workers could afford to buy, was corn meal.  They lacked the time, energy, and means to supplement starvation diets with fresh vegetables grown on their own tiny plots.  Pellagra sufferers ("pellagrins") subsisted on corn; pellagra, it had long been thought, was all about corn.  Unsurprisingly, then, it did not stop at the borders of southern mill towns.  It also victimized the corn-fed residents of state-run institutions:  orphanages, prisons, asylums.

In 1908, when cases of pellagra at southern state hospitals were increasing at an alarming rate, James Woods Babcock, the Harvard-educated superintendent of the South Carolina State Hospital and a pellagra investigator himself, organized the first state-wide pellagra conference.[1]  It was held at his own institution, and generated animated dialogue and camaraderie among the 90 attendees.  It was followed a year later by a second conference, now billed as a national pellagra conference, also at Babcock's hospital.  These conferences underscored both the seriousness of pellagra and the divided opinions about its causes, prevention, and treatment.

At the early conferences, roughly half the attendees, dubbed Zeists (from Zea mays, or maize), were proponents of the centuries-old corn theory of pellagra.  What is it about eating corn that causes the disease?  "We don't yet know," they answered, "but people who contract pellagra subsist on corn products.  Ipso facto, corn must lack some nutrient essential to health."  The same claim had been made by Giovanni Marzari in 1810.  The Zeists were countered by anti-Zeists under the sway of germ theory.  "A deficiency disease based on some mysterious element of animal protein missing in corn?  Hardly.  There has to be a pathogen at work, though it remains to be discovered."  Perhaps the microorganism was carried by insects, as with yellow fever and malaria.  The Italian-born British physician Louis Sambon went a step further.  He claimed to have identified the insect in question:  it was a black fly or sand fly of the genus Simulium.

Germ theory gained traction from a different direction.  “You say dietary reliance on corn ‘causes’ pellagra?  Well, maybe so, but it can’t be a matter of healthy corn.  The corn linked to pellagra must be bad corn,  i.e., corn contaminated by a protozoon.”  Thus the position argued at length by no less than Cesare Lombroso, the pioneer of criminal anthropology.  Like Sambon, moreover, he claimed to have the answer:  it was, he announced, a  fungus, Sporisorium maidis, that made corn moldy and caused pellagra.  But many attendees were unpersuaded by the “moldy corn” hypothesis.  For them pellagra wasn’t a matter of any type of corn, healthy, moldy, or otherwise. It was an infectious disease pure and simple, and some type of microorganism had to be the culprit.  How exactly the microorganism entered the body was a matter for continued theorizing and case reports at conferences to come.         

And there matters rested until 1914, when Joseph Goldberger, a public health warrior of Herculean proportions, entered the fray.  A Jewish immigrant from Hungary, educated at the City College of New York and Bellevue Hospital Medical College (later NYU Medical School), Goldberger was a leading light of the Public Health Service's Hygienic Laboratory.  A veteran epidemic fighter, he had earned his stripes battling yellow fever in Mexico, Puerto Rico, and the South; typhoid in Washington, DC; typhus in Mexico City; and dengue fever in Texas.[2]  With pellagra now affecting most of the nation, Goldberger was tapped by Surgeon General Rupert Blue to head south and determine once and for all the cause, treatment, and prevention of pellagra.

Joseph Goldberger, M.D.

Goldberger was up to the challenge.  He took the South by storm and left a storm of anger and resentment in his wake.  He began in Mississippi, where reported cases of pellagra would increase from 6,991 in 1913 to 10,954 in  1914.  In a series of “feeding experiments” in two orphanages in the spring of 1914, he introduced lean meat, milk, and eggs into the children’s diets; their pellagra vanished.  And Goldberger and his staff were quick to make a complementary observation:  In all the institutions they investigated, not a single staff member ever contracted pellagra.  Why?  Well, the staffs of orphanages, prisons, and asylums were quick to take for themselves whatever protein-rich foods came to their institutions.  They were not about to make do with the cornbread, corn mush, and fatback given to the hapless residents.  And of course their salaries, however modest, enabled them to procure animal protein on the side. 

Joseph Goldberger, with his assistant C. H. Waring, in the Baptist Orphanage near Jackson, Mississippi in 1914, in the painting by Robert Thom.

       

All right, animal protein cleared up pellagra, but what about residents of state facilities whose diets provided enough protein to protect them from pellagra?  Were there any?  And, if so, could pellagra be induced in them by restricting them to corn-based diets?  Goldberger observed that the only wards of the state who did not contract pellagra were the residents of prison farms.  It made sense:  They alone received some type of meat at mealtime, along with farm-grown vegetables and buttermilk.  In collaboration with Mississippi governor Earl Brewer, Goldberger persuaded 11 residents of Rankin State Prison Farm to restrict themselves to a corn-based diet for six months.  At the study's conclusion, the prisoners would have their sentences commuted, a promise put in writing.  The experiment corroborated Goldberger's previous findings:  Six of the 11 prisoners contracted pellagra, and, ill and debilitated, they became free men when the experiment ended in October 1915.

Now southern cotton growers and textile manufacturers rose up in arms.  Who was this Jewish doctor from the North – a representative of "big government," no less – to suggest they were crippling and killing mill workers by consigning them to corn-based diets?  No, they and their political and medical allies insisted, pellagra had to be an infectious disease spread from worker to worker or transmitted by an insect.  To believe otherwise, to suggest the southern workforce was endemically ill and dying because it was denied essential nutrients – this would jeopardize the textile industry and its ability to attract investment dollars outside the region.  Goldberger, supremely unfazed by their commitment to science-free profit-making, then undertook the most lurid experiment of all.  Joined by his wife Mary and a group of colleagues, he hosted a series of "filth parties" in which the group transfused pellagrin blood into their veins and ingested tablets consisting of the scabs, urine, and feces of pellagra sufferers.  Sixteen volunteers at four different sites participated in the experiment, none of whom contracted the disease.  Here was definitive proof:  pellagra was not an infectious disease communicable person-to-person.[3]

The next battle in Goldberger's war was a massive survey of over 4,000 residents of textile villages throughout the Piedmont of South Carolina.  It began in April 1916 and lasted 2.5 years, with data analysis continuing, in Goldberger's absence, after America's entry into the Great War.  The survey, which drew on the statistical skills of his PHS colleague Edgar Sydenstricker, was remarkable for its time and place.  Homes were canvassed to determine the incidence of pellagra in relation to sanitation, food accessibility, food supply, family size and composition, and family income.  Sydenstricker's statistical analysis of 747 households with 97 cases of pellagra showed that the proportion of families with pellagra markedly declined as income increased.  "Whatever the course that led to an attack of pellagra," he concluded, "it began with a light pay envelope."[4]

But Goldberger was not yet ready to trade his suit of armor for the coat of a lab researcher.  In September 1919, the PHS reassigned him to Boston, where he joined his old mentor at the Hygienic Laboratory, Milton Rosenau, in exploring influenza with human test subjects.  Once the Spanish Flu had subsided, he was able to return to the South, and just in time for a new spike in pellagra rates.  By the spring of 1920, wartime prosperity was a thing of the past.  Concurrent dips in both cotton prices and tobacco profits led to depressed wages for mill workers and tenant farmers, and a new round of starvation diets led to dramatic increases in pellagra.  It was, wrote The New York Times on July 25, 1921, quoting a PHS memo, one of the "worst scourges known to man."[5]

So Goldberger took up arms again, and in PHS-sponsored gatherings and southern medical conferences withstood virulent denunciations, often tinged with anti-Semitism.  Southern health officers like South Carolina's James A. Hayne dismissed the very notion of deficiency disease as "an absurdity."  Hayne angrily refused to believe that pellagra was such a disease because, well, he simply refused to believe it – a dismissal that sadly prefigures the Covid-deniers who refused to accept the reality of a life-threatening viral pandemic because, well, they simply refused to believe it.[6]

As late as November 1921, at a meeting of the Southern Medical Association, most attendees insisted that pellagra was caused by infection, and that Goldberger's series of experiments was meaningless.  But they were meaningless only to those blinded to any and all meanings that reflected poorly on the South and its ability to feed its working class.  Even the slightest chink in the physicians' self-protective armor would have let in the epidemiological plausibility of Goldberger's deficiency model.  How could they fail to see that pellagra was a seasonal disease that reappeared every year in late spring or early summer, exactly like endemic scurvy and beriberi, both of which were linked to dietary deficiencies?

Back in the Hygienic Laboratory in Washington, Goldberger donned his lab coat and, beginning in 1922, devised a series of experiments involving both people and dogs.  Seeking to find an inexpensive substitute for the meat, milk, and eggs unavailable to the southern poor, he tested a variety of foods and chemicals, one at a time, to see if one or more of them contained the unknown pellagra preventative, now dubbed the "P-P factor."  He was not inattentive to vitamins, but in the early '20s there were only vitamins A, B, and C to consider, none of which contained the P-P factor.  It was not yet understood that vitamin B was not a single vitamin but a vitamin complex.  Only two dietary supplements, the amino acid tryptophan and, surprisingly, brewer's yeast, were found to have reliably preventive and curative properties.[7]

Brewer's yeast was inexpensive and widely available in the South.  It would soon be put to the test.  In the spring of 1927, following two seasons of declining cotton prices, massive flooding of 16,570,627 acres of the lower Mississippi River Valley lowered wages and increased food prices still further.  The result was a drastic increase in pellagra.  So Goldberger, with Sydenstricker at his side, headed South yet again, now hailed on the front page of the Jackson Daily News as a returning hero.  After a three-month survey of tenant farmers, whose starvation diet resembled that of the mill workers interviewed in 1916, he arranged for the shipment of 12,000 pounds of brewer's yeast to the hardest hit regions.  Three cents' worth of yeast per day cured most cases of pellagra in six to ten weeks.  "Goldberger," writes Kraut, "had halted an American tragedy."[8]  Beginning with the flood relief of 1927, Red Cross and state-sponsored relief efforts after natural disasters followed Goldberger's lead.  Red Cross refugee camps in 1927 and thereafter educated disaster victims about nutrition and pellagra and served meals loaded with the P-P factor.  On leaving the camps, field workers could take food with them; families with several sick members were sent off with parcels loaded with pellagra preventives.

But the scientific question remained:  What exactly did brewer’s yeast, tryptophan, and two other tested products, wheat germ and canned salmon, have in common?  By 1928, Goldberger, who had less than a year to live,[9] was convinced it was an undiscovered vitamin, but the discovery would have to await the biochemistry of the 1930s.  In the meantime, Goldberger’s empirical demonstration that inexpensive substitutes for animal protein like brewer’s yeast prevented and cured pellagra made a tremendous difference in the lives of the South’s workforce.   Many thousands of lives were saved.

___________________ 

It was only in 1912, when pellagra ripped through the South, that Casimir Funk, a Polish-born American biochemist, like Goldberger a Jew, formulated the vita-amine or vitamine hypothesis to designate organic molecules essential to life but not synthesized by the human body, thereby pointing to the answer Goldberger sought.[10]  Funk's research concerned beriberi, a deficiency disease that causes a meltdown of the nervous system and cardiac problems to the point of heart failure.  In 1919, he determined that it resulted from the depletion of thiamine (vitamin B1).  The covering term "vita-amine" reflected his (mistaken) belief that other deficiency diseases – scurvy, rickets, pellagra – would be found to result from the absence of different amine (i.e., nitrogen-containing) molecules.

In the case of pellagra, niacin (aka vitamin B3, aka nicotinic acid/nicotinamide) proved the missing nutrient, Goldberger's long sought-after P-P factor.  In the course of his research, Funk himself had isolated the niacin molecule, but its identification as the P-P factor was only made in 1937 by the American biochemist Conrad Elvehjem.  The circle of discovery begun with Frapolli's observations in Lombardy in 1771 was closed between 1937 and 1940, when field studies on pellagrins in northern Italy conducted by the Institute of Biology of the NRC confirmed the curative effect of niacin.[11]

Now, ensnared for 2.5 years by a global pandemic that continues to sicken and kill throughout the world, we are understandably focused on communicable infectious diseases.  Reviewing the history of pellagra reminds us that deficiency diseases too have plagued humankind, and in turn brought forth the best that science – deriving here from the collaboration of laboratory researchers, epidemiologists, and public health scientists – has to offer.  Louis Pasteur, Robert Koch, and Walter Reed are the names that leap to the foreground in considering the triumphs of bacteriology.  Casimir Funk, Joseph Goldberger, Edgar Sydenstricker, and Conrad Elvehjem are murky background figures who barely make it onto the radar.

In the developed world, pellagra is long gone, though it remains  common in Africa, Indonesia, and China.  But the entrenched commercial and political interests that Goldberger and his PHS team battled to the mat are alive and well.  Over the course of the Covid pandemic, they have belittled public health experts and bewailed CDC protocols that limit “freedom” to contract the virus and infect others.  In 1914, absent Goldberger and his band of Rough Riders, the South would have languished with a seasonally crippled labor force far longer than it did.  Mill owners, cotton-growing farmers, and politicians would have shrugged and accepted the death toll as a cost of doing business.

Let us pause, then, and pay homage to Goldberger and his PHS colleagues.  They were heroes willing to enter an inhospitable region of the country and, among other things, ingest pills of pellagrin scabs and excreta to prove that pellagra was not a contagious disease.  There are echoes of Goldberger in Anthony Fauci, William Schaffner, Ashish Jha, and Leana Wen as they relentlessly fan the embers of  scientific awareness among those who resist an inconvenient truth: that scientists, epidemiologists, and public health officers know things about pandemic management that demagogic politicians and unfit judges do not.  Indeed, the scientific illiterati appear oblivious to the fact that the health of the public is a bedrock of the social order, that individuals ignore public health directives and recommendations at  everyone’s peril.  This is no less true now than it was in 1914.  Me?  I say,  “Thank you, Dr. Goldberger.  And thank you, Dr. Fauci.” 

___________________________           

[1] My material on Babcock and the early pellagra conferences at the South Carolina State Hospital comes from Charles S. Bryan, Asylum Doctor: James Woods Babcock and the Red Plague of Pellagra (Columbia: Univ. of South Carolina Press, 2014), chs. 3-5.

[2] Alan Kraut, Goldberger's War: The Life and Work of a Public Health Crusader (NY: Hill & Wang, 2004), 7.

[3] To be sure, the "filth parties" did not rule out the possibility of animal or insect transmission of a microorganism.  Goldberger's wife Mary, incidentally, was transfused with pellagrin blood but didn't ingest the filth pills.

[4] Kraut, Goldberger’s War, 164.

[5] Quoted in Kraut, Goldberger’s War, 190.

[6] On Hayne, Goldberger’s loudest and most vitriolic detractor among southern public health officers, see Kraut, pp. 118, 194; Bryan, Asylum Doctor, pp. 170, 223, 232, 239; and Elizabeth Etheridge, The Butterfly Caste: A Social History of Pellagra in the South (Westport, CT: Greenwood, 1972), 42, 55, 98-99, 110-111.  This is the same James Hayne who in October 1918, in the midst of the Great Pandemic, advised the residents of South Carolina that “The disease itself is not so dangerous: in fact, it is nothing more than what is known as ‘Grippe’” (“Pandemic and Panic: Influenza in 1918 Charleston” [https://www.ccpl.org/charleston-time-machine/pandemic-and-panic-influenza-1918-charleston#:~:text=Pandemic%20and%20panic%20visited%20Charleston,counter%20a%20major%20health%20crisis]).

[7] The tryptophan experiments were conceived and conducted by Goldberger's assistant, W. F. Tanner, who, after Goldberger's return to Washington, continued to work out of the PHS laboratory at Georgia State Sanitarium (Kraut, Goldberger's War, 203-204, 212-214).

[8] Kraut, Goldberger’s War, 216-222, quoted at 221. 

[9] Goldberger died from hypernephroma, a rare form of kidney cancer, on January 17, 1929.   Prior to the discovery of niacin, in tribute to Goldberger, scientists referred to the P-P factor as Vitamin G. 

[10] The only monographic study of Funk in English, to my knowledge, is Benjamin Harrow, Casimir Funk, Pioneer in Vitamins and Hormones (NY: Dodd, Mead, 1955).  There are, however, more recent articles providing brief and accessible overviews of his achievements, e.g., T. H. Jukes, "The prevention and conquest of scurvy, beriberi, and pellagra," Prev. Med., 18:877-883, 1989; Anna Piro, et al., "Casimir Funk: His discovery of the vitamins and their deficiency disorders," Ann. Nutr. Metab., 57:85-88, 2010.

[11] Renato Mariani-Costantini & Aldo Mariani-Costantini, “An outline of the history of pellagra in Italy,” J. Anthropol. Sci., 85:163-171, 2007.

 

Copyright © 2022 by Paul E. Stepansky.  All rights reserved.  The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.

Anti-vaxxers in Free Fall

I read a news story in which a man is dying of Covid-19 in the hospital.  He is asked whether he regrets not getting vaccinated and rallies enough to reply, “No, I don’t believe in the vaccine.”  So what then does he believe in?  Systemic viral infection, suffering, and death?  If you don’t believe in vaccination, you don’t believe in modern medicine in toto.  You don’t believe in bacteriology, virology, cellular biology, microbiology, or immunology.  What then is left to prevent, diagnose, and treat disease?  Trump-ish medievalism, mysticism, shamanism, divine intervention?

A study by researchers at Harvard’s Brigham and Women’s Hospital used natural language processing to comb through 5,307 electronic patient records of adult type 2 diabetics living in Massachusetts and followed by their primary care physicians between 2000 and 2014.  They found that 43% (2,267) of patients refused to begin insulin therapy when their doctors recommended it.  Further, diabetics who declined the recommendation not only had higher blood sugar levels than those who began insulin, but had greater difficulty achieving glycemic control later on.[1]  So what do the insulin-declining diabetics believe in?  Chronic heart and kidney disease, blindness, and amputation – the all but inevitable sequelae of poorly managed diabetes?

The problem, really an epistemological problem, is that such people apparently have no beliefs at all – unless one imputes to them belief in disease, suffering, and death, and, in the case of Covid vaccine resisters, the prerogative to inflict them on others.  Theirs is not even a scientifically specious belief system that unintentionally infects others.  During the Yellow Fever epidemic that left Philadelphia in ruins in 1793, Dr. Benjamin Rush, highly acclaimed throughout the newborn nation, set about his curative missions by draining his patients, in successive bleedings, of up to four pints of blood while simultaneously purging them (i.e., evacuating their bowels) with copious doses of toxic mercury.

Rush's "Great Purge," adopted by his followers, added hundreds, perhaps thousands, to the death toll in Philadelphia alone.  But at least Rush's "system" derived from a belief system.  He did in fact find a theoretical rationale for his regimen in an essay by the Virginia physician and mapmaker John Mitchell.  Describing yellow fever in Virginia in 1741, Mitchell noted that in yellow fever the "abdominal viscera were filled with blood, and must be cleaned out by immediate evacuation."[2]  Bleeding, of course, was conventional treatment for all manner of disease in 1793, so Mitchell's recommendation came as no surprise.  Taken in conjunction with the system of mercuric purges employed by Dr. Thomas Young during the Revolutionary War, Mitchell's essay gave Rush all the grounding he required for a ruinously misguided campaign that greatly extended the recovery time of those it did not kill.  But, yes, he had his theory, and he believed in it.

In the early 19th century, Napoleon, sweeping through Europe, conquers the north Italian province of Bolzano, which in 1807 he incorporates into Bavaria.  Two years later, when the Bavarian government mandates smallpox vaccination for all residents, the newly absorbed Italians launch an armed revolt, partly because they believe vaccination will inject Protestantism into their Catholic veins.[3]

All right, it is a nonsensical belief, even in 1809, but it is still a belief of sorts.  It is epistemically flawed, because it fails to stipulate what exactly makes a substance inherently Protestant in nature; nor does it posit a mechanism of transmission whereby a Protestant essence seeps into liquid smallpox vaccine in the first place.  In the realm of ethics, it suggests that the possibility of death pales alongside the certainty of spiritual contamination by a fluid that, however neutral in life-saving potency, is injected by a Protestant hand.

Only slightly less ridiculous to modern ears is the mid-19th-century belief that general anesthesia via ether or chloroform, introduced into obstetric practice by James Young Simpson in 1847, must be withheld from women giving birth. The reason? Genesis 3:16 enjoins women to bring forth new life in suffering. Forget that the belief was espoused solely by certain men of the cloth and male physicians,[4] and rested on a highly questionable rendering of the biblical Hebrew. Forget as well that, for Christians, Christ’s death redeemed humankind, relieving women of the need to relive the primal curse. Note further that the alleged curse would also forbid, inter alia, the use of forceps, caesarean operations, and embryotomy. A woman with a contracted pelvis would die undelivered because she was guilty of a sin over which she had no control – that of having a contracted pelvis.[5]

In a secular nation whose founding documents assert everyone’s right to pursue happiness on his or her own pain-free terms, we see the primal curse as archaic misogynist drivel, no less absurd than the belief that the Bible, through some preternatural time warp, forbids vaccination. But, hey, it’s a free country, and if a mid-19th-century or early-21st-century man chooses to believe that anesthesia permits men to escape pain whenever possible but women only in male-sanctioned circumstances, so be it. It is a belief.

Now it’s 1878, and the worst yellow fever epidemic in American history is sweeping across the lower Mississippi Valley, taking lives and destroying commerce in New Orleans, Memphis, and the surrounding cities and towns to which refugees are streaming. The epidemic will reach the Ohio Valley, bringing deadly Yellow Jack to Indiana, Illinois, and Ohio. Koch’s monograph on the bacteriology of sepsis (wound infection) is published that very year, and neither his work nor Lister’s is universally accepted in the American South. Nor would its precepts have counted for much in the face of a viral (not bacterial) invader carried up the Mississippi from Havana.

What can city boards of health do in the face of massive viral infection, suffering, and death?  Beyond imposing stringent new sanitary measures, they can quarantine ships arriving in their harbors until all infected crew members have either died or been removed and isolated.  This will prevent the newly infected from infecting others and crippling cities still further – assuming, that is, a belief system in which yellow fever is contagious and spread from person to person.

But in 1878 Memphis, where by September the epidemic is claiming 200 lives a day, this “modern” belief is widely contested among the city’s physicians.  Some are contagionists, who believe that disease is caused by invisible entities that are transmissible.  But others, greater in number, favor the long-held theory that infectious disease results from “miasma” or bad air – air rendered toxic by decaying plant and animal matter in the soil.  If you believe miasma causes disease, then you’re hard-pressed to understand how quarantining ships laden with sick people will do anything to control the epidemic.

This was precisely the position of the 32 Memphis physicians who defeated the city council’s plan to institute a quarantine and set up a quarantine station. Quarantine is pointless in the face of bad air. The city’s only recourse, so held the 32, was to alter the “epidemic constitution” of the atmosphere by inundating it with smoke. Cannon blasts and blazing barrels of tar up and down city streets – that’s the ticket to altering the atmospheric conditions that create infectious disease.[6]

The miasmic theory of disease retained a medical following throughout the 1870s, after which it disappeared in the wake of bacteriology.  But in Memphis in 1878, bad air was still a credible theory in which physicians could plausibly believe.  And this matter of reasonable belief – reasonable for a particular time and place – takes us back to the hospitalized Covid patient of 2021 who, with virtually his last breath, defends his decision to remain unvaccinated because he doesn’t believe in the vaccine.  What is the knowledge base that sustains his disbelief?  There isn’t any.  He has no beliefs, informed or otherwise, about bacteriology, virology, cellular biology, or immunology.  At best, he has decided to accept what someone equally belief-less has told him about Covid vaccination, whether personally, in print, or over the internet.

It is no different among the 43% of Massachusetts diabetics who, a century after Banting and Best’s miraculous discovery, declined insulin therapy when their doctors recommended it. Their disbelief is actually a nonbelief because it is groundless. For some, perhaps, the refusal reflects a psychological inability to accept that one is diabetic enough to warrant insulin; they resist the perceived stigma of being insulin-dependent diabetics.[7] Here at least the grounds of refusal are intelligible and remediable. An insulin phobia does not sustain real-world belief; it is an impediment to such belief in relation to diabetes and insulin, illness and long-term health, lesser and greater life expectancy.

Back in the present, I read another news story in which two unvaccinated hospital nurses explain to a journalist that they have refused Covid vaccination because the vaccines’ effectiveness is based on “junk data.”  Really?  Here there is the glimmering of a belief system, since scientific data can be more or less robust, more or less supportive of one or another course of action.

But what exactly makes Covid vaccine data worthless, i.e., junk? And how have these two nurses acquired the expertise in epidemiology, population statistics, and data analysis to pass judgment on data deemed credible and persuasive by scientists at Pfizer, Moderna, Johnson & Johnson, the CDC, and the WHO? And how, pray tell, have they gained access to these data? Like all opponents of vaccine science, they pontificate out of ignorance, as if the mere act of utterance confers truth-value on what is being uttered. It is an extreme example of asserting as fact what remains to be demonstrated (the fallacy of petitio principii), the legacy of an ex-president who elevated pathological lying to a political art form.

Even the nurses pale alongside the anti-vax protester who is pictured in a news photo holding a sign that reads, “Vaccines Kill.”[8] Whom do they kill, and under what circumstances? Does he mean that all vaccines are deadly and kill people all the time, or just certain vaccines, such as the Covid vaccine? But what does it matter? The sign holder doesn’t know anything about any vaccines. Does he really believe that everything we know about the history of vaccine science from the time of Jenner is bogus, and that children who once died from smallpox, cholera, yellow fever, diphtheria, pertussis, typhoid, typhus, tetanus, and polio are still dying in droves, now from the vaccines they receive to protect them from these infectious diseases during the earliest years of life? Is the demographic fact that, owing to vaccination and other public health measures, life expectancy in the U.S. increased from 47 in 1900 to 77 in 2021 also based on junk data? In my essay “Anti-vaccinationism, American Style,” I provide statistics on the total elimination in the U.S. of smallpox and diphtheria and the virtual elimination of polio. Were my claims also based on junk data? If so, I’d appreciate being directed to the data that belie these facts and demonstrate that, in point of fact, vaccines kill.

Maybe the man with the sign has an acquaintance who believes a vaccine made him sick. Perhaps someone in his internet chat group heard of someone else who became ill, or allegedly died, after receiving a vaccine. Of course, death can follow vaccination without being caused by it. Do we then assume that the man with the sign and like-minded protesters are well versed in the difference between causation and correlation in scientific explanation?

We know that for a tiny number of individuals aspirin kills.[9] So why doesn’t the man hold up a sign that reads, “Aspirin Kills”? Here, at least, he would be calling attention to a scientific fact that people with GI conditions should be aware of. We know that sugary drinks have been linked to some 25,000 deaths in the U.S. each year. Why not a sign reading “Soda Kills”? It would at least be based on science. He chooses not to proclaim the lethality of aspirin or soda because he cares no more about aspirin- or soda-related deaths than about Covid-related deaths. If he did, then, like the two nurses with their junk data and the Covid patient announcing disbelief in Covid vaccination on his deathbed, he would have to anchor his belief in consensually accepted scientific facts – a belief that someone, anyone, might find believable.

He is no different from the other American anti-vaxxers I read about in the paper. They are the epistemological Luddites of our time, intent on wrecking the scientific machinery of disease prevention despite profound ignorance of vaccine science and its impact on human affairs since the late 18th century. Indeed, they see no need to posit grounds of belief of any kind, since their anger – at Covid, at Big Government, at Big Science, at Big Medicine, at Big Experts – fills the epistemic void. It fuels what they offer in place of the science of disease prevention: the machinery of misinformation that is their stock in trade.

And therein lies the source of their impotence. They have fallen into an anti-knowledge black hole and struggle to fashion an existence out of anger that – to push the trope a little further – repels rational thought. Their contrarian charge is small solace for the heightened risks of disease, suffering, and death they incur and, far less conscionably, impose on the rest of us.

______________________

[1] N. Hosomura, S. Malmasi, et al., “Decline of Insulin Therapy and Delays in Insulin Initiation in People with Uncontrolled Diabetes Mellitus,” Diabetic Med., 34:1599-1602, 2017.

[2] J. H. Powell, Bring Out Your Dead: The Great Plague of Yellow Fever in Philadelphia in 1793 (Philadelphia: Univ. of Pennsylvania Press, 1949), 76-78.

[3] My thanks to my friend Marty Meyers for bringing to my attention this event of 1809, as reported by Emma Bubola, “In Italy’s Alps, Traditional Medicine Flourishes, as Does Covid,” New York Times, December 16, 2021.

[4] With reason, wrote Elizabeth Cady Stanton in The Woman’s Bible (1895), “The Bible and the Church have been the greatest stumbling blocks in the way of women’s emancipation.”

[5] For a fuller examination of the 19th-century debate on the use of general anesthesia during childbirth, see Judith Walzer Leavitt, Brought to Bed: Childbearing in America, 1750-1950 (NY: OUP, 1986), ch. 5.

[6] On the measures taken to combat the epidemic in Memphis, including the rift between contagionist and noncontagionist physicians, see John H. Ellis, Yellow Fever and Public Health in the New South (Lexington: Univ. Press of Kentucky, 1992), ch. 3.

[7] A. Hussein, A. Mostafa, et al., “The Perceived Barriers to Insulin Therapy among Type 2 Diabetic Patients,” African Health Sciences, 19:1638-1646, 2019.

[8] Now, sadly, we have gone from hand-written “Vaccines Kill” signs to highway billboards, e.g., https://www.kxxv.com/hometown/mclennan-county/a-new-billboard-in-west-claims-vaccines-kill.

[9] Patients prescribed aspirin before developing a GI bleed or perforation are prominent among those killed by aspirin.  See A. Lanas, M. A. Perez-Aisa, et al., “A Nationwide Study of Mortality Associated with Hospital Admission and Those Associated with Nonsteroidal Antiinflammatory Drug Use,” Am. J.  Gastroenterol., 100:1685-1693, 2005; S. Straube, M. R. Trainer, et al., “Mortality with Upper Gastrointestinal Bleeding and Perforation,” BMC Gastroenterol., 8: 41, 2009.

Unmasked and Unhinged

The Great Influenza, the Spanish Flu, a viral infection spread by droplets and mouth/nose/hand contact, laid low the residents of dense American cities and spurred municipal officials to take new initiatives in social distancing.[1] City-wide bans on public gatherings included closing schools, theaters, motion picture houses, dance halls, and – perish the thought – saloons. In major cities, essential businesses that remained open had to comply with new regulations, including staggered opening and closing times to minimize crowd size on streets and in trolleys and subways. Strict new sanitation rules were the order of the day. And yes, eight western cities, not satisfied with preexisting regulations banning public spitting and the use of common cups, or even new regulations requiring the use of cloth handkerchiefs when sneezing or coughing, went the whole nine yards: they passed mask-wearing ordinances.

In San Francisco and elsewhere, outdoor barbering and police courts were the new normal.

The idea was a good one; its implementation was another matter. In the eight cities in question, those who didn’t make their own masks bought masks sewn from wide-mesh gauze, not the tightly woven medical gauze, four to six layers thick, worn in hospitals and recommended by authorities. Masks made at home from cheesecloth were more porous still. Nor did most bother washing or replacing their masks with any great frequency. Still, these factors notwithstanding, the consensus is that masks did slow the rate of viral transmission, if only as one component of a “layered” strategy of protection.[2] Certainly, as commentators of the time pointed out, masks at least shielded those around the wearer from direct, in-your-face (literally) droplet infection from sneezes, coughs, and spittle. Masks couldn’t hurt, and we now believe they helped.

Among the eight cities that passed mask-wearing ordinances, San Francisco took the lead.  Its mayor, James Rolph, with a nod to the troops packed in transport ships taking them to war-torn France and Belgium, announced that “conscience, patriotism and self-protection demand immediate and rigid compliance” with the mask ordinance. By 1918, masks were entering hospital operating theaters, especially among assisting nurses and interns.[3]  But mask-wearing in public by ordinary people was a novelty.  In a nation gripped by life-threatening influenza, however, most embraced masks and wore them proudly as emblems of patriotism and public-mindedness.   Local Red Cross volunteers lost no time in adding mask preparation to the rolling of bandages and knitting of socks for the boys overseas.

A trolley conductor waves off an unmasked citizen. The image is from Seattle, another city with a mask-wearing ordinance.

But, then as now, not everyone was on board with face masks. Then as now, there were protesters. In San Francisco, they were small in number but large in vocal reach. The difference was that in 1918, cities like San Francisco meant business: violators of the mask law were fined $5 or $10 or imprisoned for 10 days. On the first day the ordinance took effect, 110 people were arrested, many with masks dangling around their necks. In mid-November, following the signing of the Armistice, city officials mistakenly believed the pandemic had passed and rescinded the ordinance. At noon on November 21, at the sound of a city-wide whistle, San Franciscans rose as one and tossed their masks onto sidewalks and streets. In January, however, following a spike in the number of influenza cases, city supervisors passed a second mask-wearing ordinance, at which point a small, self-styled Anti-Mask League – the only such league in the nation – emerged on the scene.[4]

San Francisco police arrest “mask slackers,” one of whom has belatedly put on a mask.

A long line of San Franciscans waiting to purchase masks in 1919.  A few already have masks in place.

The League did not take matters lying down, nor was it content to point out that masks of questionable quality, improperly used and infrequently replaced, probably did less good than their proponents suggested. Its animus was trained on the very concept of obligatory mask-wearing, whatever its effect on transmission of the still unidentified influenza microbe. At a protest on January 27, “freedom and liberty” was the mantra. Throwing public health to the wind, the protesters lumped together mask-wearing, the closing of city schools, and the medical testing of children in school. Making sure sick children did not infect healthy classmates paled alongside the sacrosanctity of parental rights. For the protesters, then as now, parental rights meant the freedom to act in the worst interests of the child.

___________________

One wants to say that the Anti-Mask League’s short-lived furor over mask-wearing, school closings, and the testing of school children is long behind us. But it is not. In the matter of contagious infectious disease – and expert recommendations to mitigate its impact – what goes around comes around. In the era of Covid-19, when San Francisco mayor London Breed ordered city residents “to wear face coverings in essential businesses, in public facilities, on transit and while performing essential work,” an animated debate on mask-wearing among city officials and the public ensued. A century of advances in the understanding of infectious disease – including the birth and maturation of virology – still counts for little among the current crop of anti-maskers. Their “freedom” to opt for convenience trumps personal safety and the safety of others. Nor does a century of improvements in mask fabrics, construction, comfort, and effectiveness mitigate the adolescent wantonness of this freedom one iota.

“Liberty and freedom.” Just as the Anti-Mask League’s call to arms masked a powerful political undertow, so too with the anti-vaxxers and anti-maskers of the present. Times change; some Americans – a much higher percentage now than in 1918 – do not. Spearheaded by Trumpian extremists mired in fantasies of childlike freedom from adult responsibility, the “anti” crowd still can’t get its head around the fact that protecting the public’s health – through information, “expert” recommendations and guidelines, and, yes, laws – is the responsibility of government. That responsibility operates through the Commerce Clause of the Constitution, which gives the federal government broad authority to impose health measures to prevent the spread of disease from foreign countries or between states. It operates through the Public Health Service Act, which gives the Secretary of Health and Human Services authority to lead federal health-related responses to public health emergencies. And it operates through the 10th Amendment to the Constitution, which reserves to the states broad authority to take action during public health emergencies. Quarantine and restricted movement of those exposed to contagious disease, business restrictions, stay-at-home orders – all are among the “broad public health tools” available to governors.[5]

When a catastrophe, natural or man-made, threatens public health and safety, this responsibility, this prerogative, this Constitutional mandate may well come down with the force of, well, mandates, which is to say, laws. At such moments in history, we are asked to step up and accept the requisite measure of inconvenience, discomfort, and social and economic restriction, because such acceptance is intrinsic to what makes us a society of citizens, a civil society.

Excepting San Francisco’s anti-mask politicos, it is easier to make allowances for the inexpert mask wearers of 1918 than for the anti-mask crusaders of today. In 1918, many simply didn’t realize that pulling masks down below the nose negated whatever protection the masks provided. The same is true of the well-meaning but guileless who made small holes in the center of their masks to allow for the insertion of a cigarette. It is much harder to excuse the Covid-19 politicos who resisted mask-wearing during the height of the pandemic and now refuse to don face masks in supermarkets and businesses when asked to by store managers. The political armor that shields them from prudent good sense, respect for store owners, and concern for the safety of fellow shoppers is of a decidedly baser metal.

The nadir of civic bankruptcy is their virulent hostility toward parents who, in compliance with state, municipal, and school board ordinances – or even in their absence – send their children to school in face masks. The notion that children wearing protective masks are in some way being abused, tormented, damaged pulls into its orbit all the rage-filled irrationality of the corrosive Trump era. Those who would deny responsible parents the right to act responsibly on behalf of their children are themselves damaged. They bring back to life, in a new and chilling context, that diagnostic warhorse of 19th-century asylum psychiatrists (“alienists”) and neurologists: moral insanity.

The topic of child mask-wearing, then and now, requires an essay of its own. By way of prolegomenon, consider the British children of the London Blitz of 1940-1941, photographed living, walking to school, sitting in their classrooms, and playing outdoors with bulky gas masks in place. How could their parents subject them to these hideous contraptions? Because their parents sought to protect them, to the extent possible, from smoke inhalation and gas attack during German bombing raids. It was a response to a grave national emergency. A grave national emergency. You know, like a global pandemic that to date has infected more than 46.6 million Americans and claimed over 755,000 American lives.



[1] For an excellent overview of these initiatives, see Nancy Tomes, “‘Destroyer and Teacher’: Managing the Masses During the 1918-1919 Influenza Pandemic,” Public Health Rep., 125 (Suppl 3): 48-62, 2010. My abbreviated account draws on her article.

[2] P. Burnett, “Did Masks Work? — The 1918 Flu Pandemic and the Meaning of Layered Interventions,” Berkeley Library, Oral History Center, University of California, May 23, 2020  (https://update.lib.berkeley.edu/2020/05/23/did-masks-work-the-1918-flu-pandemic-and-the-meaning-of-layered-interventions).  Nancy Tomes, “’Destroyer and Teacher’” (n. 1), affirms that the masks were effective enough to slow the rate of transmission. 

[3]  Although surgical nurses and interns in the U.S. began wearing masks after 1910, surgeons themselves generally refused until the 1920s: “the generation of head physicians rejected them, as well as rubber gloves, in all phases of an operation, as they were considered ‘irritating’.”  Christine Matuschek, Friedrich Moll, et al., “The History and Value of Face Masks,” Eur. J. Med. Res., 25: 23, 2020.

[4] My brief summary draws on Brian Dolan, “Unmasking History: Who Was Behind the Anti-Mask League Protests During the 1918 Influenza Epidemic in San Francisco?” Perspectives in Medical Humanities, UC Berkeley, May 19, 2020. Another useful account of the mask-wearing ordinance and the reactions to it is the “San Francisco” entry of The American Influenza Epidemic of 1918-1919: A Digital Encyclopedia, produced by the University of Michigan Center for the History of Medicine and Michigan Publishing (www.influenzaarchive.org/city/city-sanfrancisco.html).

[5] American Bar Association, “Two Centuries of Law Guide Legal Approach to Modern Pandemic,” Around the ABA, April 2020 (https://www.americanbar.org/news/abanews/publications/youraba/2020/youraba-april-2020/law-guides-legal-approach-to-pandem).

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.

Everything You Didn’t Want to Know About Typhoid

Extreme fatigue; dangerously high fever; severe abdominal pain; headaches; diarrhea or constipation; nausea and vomiting – the symptoms of severe typhoid fever can be a panoply of horrors. As with cholera, the bacteria in question – Salmonella typhi, cousin to the Salmonella that causes food poisoning – find a home in water and food contaminated with human feces. The infection is contracted only by humans, and it is highly contagious. More people contract it from human contact – often from unwashed hands following defecation – than from drinking contaminated water or ingesting contaminated food. But the latter are hardly incidental causes. At least two billion people worldwide, the World Health Organization tells us, drink feces-contaminated water.[1]

And the story gets worse. Through the 19th century, “chronic carriers” could not be conceptualized, much less detected. They were symptom-free folks in whom typhi found safe harbor in the gall bladder, traveling with stored bile through the bile duct into the small intestine en route to fecal expulsion. The chronic carriers brought infection to entire communities in sudden, explosive outbreaks; typhoid is a prime example of what epidemiologists term a “fulminant” disease (from the Latin fulmināre, to strike with lightning). And worse still, the ranks of chronic carriers were enlarged by some of those who contracted the disease and recovered. Typhi lived on in their gall bladders as well, and were passed on to others via the same fecal-oral route.

The Mother of All Chronic Carriers, the super spreader who comes down to us as Typhoid Mary, was one Mary Mallon, an Irish cook who passed on typhi to no fewer than 53 members of seven prominent Manhattan-area households between 1900 and 1906. In 1907 she was quarantined in a bungalow on New York’s North Brother Island near Riverside Hospital, only to resume her career as cook and super spreader on her release in 1910. Tracked down five years later, she was whisked back to her island bungalow, where she lived out her remaining 23 years.

Here is what Salmonella typhi do once ingested through the mouth. Absent sufficient gastric acid to neutralize them in the stomach, the bacteria make their way to the terminal portion of the small intestine and enter the cells lining it. The intestinal cells respond to the invaders with a massive inflammatory response that can lead to rupture – a hole through which intestinal contents drain into the abdomen, with attendant severe pain. And from there matters go from bad to worse. Without fast, effective treatment, the bacteria penetrate lymphatic tissue and enter the bloodstream, which shuttles them to other organs: the liver, the spleen, the bone marrow. In the worst cases, bacterial ulceration extends through the lining of the ileum, from which typhi flood the body, carrying infection to the brain, heart, and pancreas. Death is now around the corner; only major abdominal surgery holds any prospect of survival. It is a pernicious disease of microbial migratory urgency.

Improvements in water treatment and personal hygiene, along with antibiotic therapy and – yes! – a newly effective vaccine for adults, brought typhoid to its knees in the United States after World War II. But the disease is alive and well in Central and South America, Africa, and parts of Asia, where it claims between 11 and 21 million victims and some 200,000 deaths each year.[2] Typhi has evolved along with the antibiotics that control it, and multidrug-resistant (MDR) strains remain deadly. And even here, on these ostensibly sanitized shores, typhi can still make its presence known. As recently as 2010, an outbreak traced to imported frozen fruit pulp sickened nine Americans, five in California and four in Nevada.[3]

But such instances are aberrational, and in the northern hemisphere typhoid fever has long since vanished from anyone’s disease-monitoring radar. Now federal and state governments, the bane of anti-vaccine irrationalists and mask-wearing naysayers, make sure we don’t drink water or eat food contaminated by microbe-laced feces. But it was not so for our forebears. In the Civil War, typhoid fever devastated North and South alike; the Union Army’s great general hospital in West Philadelphia, Satterlee Hospital, was constructed in 1862 largely to cope with its victims. In the Spanish-American War of 1898, typhoid fever shared center stage with yellow fever and, at war’s end, rated its own federal investigative commission. Chaired by Walter Reed, the Typhoid Commission determined that contact among soldiers (“comrade contact”) was primarily responsible for the transmission of typhoid fever in military camps.[4] Four years later, Koch’s investigations during a typhoid epidemic in Trier, Germany, led him to generalize the Commission’s finding: typhoid fever was contracted less from contaminated water or sewage than from asymptomatic carriers; the “carrier hypothesis” was among his final significant contributions.[5]

The era of modern typhoid prevention began in 1897, when Almroth Wright, then a pathologist at the British Army’s Medical School at Netley Hospital, published a paper on the typhoid vaccine he had developed from killed typhi. The Army took note and, in the South African War that began two years later, made very limited use of it: of 330,000 British troops, only 14,000 received the vaccine. It was effective in this limited trial but never caught on after the war.[6] Beginning in 1902, the U.S. government’s Public Health and Marine Hospital Service, renamed the Public Health Service in 1912, focused its research on typhoid. Progress was made, and by the time America entered WWI, the PHS’s Hygienic Laboratory had developed an antityphoid vaccine.[7] American troops sailing to France in 1917 were not asked how they felt about receiving a typhoid vaccine; they received their mandatory shots and boarded their ships. Those who were not vaccinated stateside received their shots on arriving at their camps. Vaccination was not negotiable. The obligation to live and fight for the nation trumped the freedom to contract typhoid, suffer, and possibly die.

“A Monster Soup Commonly Called Thames Water,” a mid-19th-century etching depicting the stew of disease-promoting organisms in the river that supplied drinking water to Londoners.

The vaccine dramatically reduced the incidence of typhoid, but the disease still wrought damage in field and base hospitals, especially among unvaccinated European troops who had been fighting since 1914. American nurses who arrived in northern France and Belgium in advance of the troops recalled their misery at being transferred to typhoid wards, which, as one noted, were “gloomy and dark.” Another recalled a typhoid scourge that crippled her hospital and created an urgent need to find space outside the hospital for the typhoid patients.[8]

_______________________________

The current governors of Texas and Florida would surely look askance at the history of typhoid control, since a key aspect of it – allowing children on school premises to drink only water subjected to antimicrobial treatment – ignores parental freedom of choice. Parents decide what their children eat, and they should be free to determine what kind of water their children drink. Children are not military enlistees obligated to remain healthy in the service of the nation. What right do school boards have to abrogate the freedom of parents to determine what kind of water their children drink? Why should children be mandated to drink water subjected to modern sanitary treatment that robs it of Salmonella typhi along with Vibrio cholerae, poliovirus, and dysentery-causing Shigella? Shouldn’t parents be free to have their children partake of nature’s bounty, to drink fresh water from streams and rivers, not to mention untreated well water contaminated with human feces and the pathogens it harbors?

And here is the Covid connection.  If local school boards and municipal authorities lack the authority to safeguard children, to the extent possible, through obligatory wearing of facemasks, then surely they lack the authority to force them to drink water filtered through layers of state and federal regulation informed by modern science.  Let parents be free to parent; let their children pay the steep, even life-threatening price.      

Did I mention that young children, along with immune-compromised young adults, are at greatest risk for contracting typhoid?  Well, now you know, and now, perhaps, we can return to reality.  State governors who do not understand the legal and moral imperative of acting in the best interests of the child[9] are unfit for public office of any sort.  In point of fact, they are unfit. Who wants governors who, in denying adults the right to act responsibly in the best interests of children, sanction child abuse?  Let them crawl back into the existential dung heap whence they emerged.    


[1] https://www.who.int/news-room/fact-sheets/detail/drinking-water.

[2] https://www.cdc.gov/typhoid-fever/health-professional.html.

[3] https://www.cdc.gov/salmonella/2010/frozen-fruit-pulp-8-25-10.html.

[4] Victor C. Vaughan, A Doctor’s Memories (Indianapolis: Bobbs-Merrill, 1926), 369ff., 386.

[5] Thomas D. Brock, Robert Koch: A Life in Medicine and Bacteriology (Wash, DC: ASM Press, 1998 [1988]), 255-256.

[6] Gwyn Macfarlane, Alexander Fleming: The Man and the Myth (Cambridge: Harvard University Press, 1984), 54-55.

[7] Victoria A. Harden, Inventing the NIH: Federal Biomedical Research Policy, 1887-1937 (Baltimore:  Johns Hopkins University Press, 1986), 41.

[8] Grace McDougall, A Nurse at the War:  Nursing Adventures in Belgium and France (NY: McBride, 1917), 111, 117; Alice Isaacson, Diary of 1917, Library & Archives, Canada, letter of 16 February 1917. 

[9] Joseph Goldstein, Anna Freud, et al., In the Best Interests of the Child (New York:  Free Press, 1986).

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.


Remembering Cholera in the Time of Covid

It came in crippling waves, the deadly intestinal infection that, in 1832 alone, killed 150,000 Americans. Its telltale symptom was copious watery diarrhea (“rice water”) accompanied by heavy vomiting, loss of internal fluids and electrolytes, and dehydration; hence its name, cholera, from the Greek for “flow of bile.” Severe cases quickly proceeded to hypovolemic shock and could kill otherwise healthy persons, children and adults alike, within hours. Historians refer to the cholera epidemics of the 19th century, but the epidemics, especially those of 1832, 1848, and 1866, were in fact pandemics, spreading from nation to nation and continent to continent.

     Orthodox or “regular” physicians of the first half of the 19th century had no clue about its cause, mode of transmission, or remedy, so they speculated wildly in all directions. Contagionists believed it spread from person to person. Noncontagionists attributed it to poisonous fumes, termed miasma, which emanated from soil and decaying matter of all kinds. Some attributed it to atmospheric changes; others thought it a byproduct of fermentation. Physician-moralists could be counted on to ascribe it to the moral failings of the underclass. Withal, the regulars resorted, one and all, to “heroic” treatments, with bleeding and toxic chemicals, especially calomel (mercury) and laudanum (a tincture of opium in alcohol), having pride of place. These treatments only hastened the death of seriously ill patients, which, given the extent of their suffering, may have been an unwitting act of mercy. Physician-induced deaths fueled the appeal of homeopathy and botanical medicine, far milder approaches that, so their practitioners avowed, caused far fewer deaths than the horrid regular remedies.

Caricature of 1832, depicting a cholera sufferer beset by the nightmarish “remedies” of the day.

     The suffering public, seeing the baleful effects of conventional remedies, grew disgusted with doctors. The Cholera Bulletin, a newsletter put out by a group of New York physicians over the summer of 1832, grimly acknowledged in its July 23 issue the “fierce onslaught” of cholera and doctors alike in felling the afflicted: “For Cholera kills and Doctors slay, and every foe will have its way!” After a new wave of cholera reached American shores in 1848, New York’s Sunday Dispatch lambasted traditional medical science as “antiquated heathen humbug, utterly unworthy of the middle of the nineteenth century.” “Cholera was a most terrible affliction,” chimed in the New York Herald a year later, “but bad doctors and bad drugs are worse. The pestilence might come now and then; physicians we had always with us.”[1]

     And yet, amid such loathing of doctors and their so-called remedies, science marched on in the one domain in which forward movement was possible. Throughout the second half of the 19th century, cholera was the catalyst that brought Europe and eventually America into the proto-modern era of public health management of infectious disease. Then as now, measures that safeguard the public from life-threatening infectious disease are good things. In 1853, after another cholera epidemic reached Edinburgh, there was no political posturing about “rights” – presumably the right of British citizens to get sick and die. Parliament, the “Big Government” of the day, resolved to go after the one major, recurrent infectious disease for which a vaccine was at hand: smallpox. The Vaccination Act of 1853 grew out of this resolve. Among other things, it instituted compulsory smallpox vaccination, with all infants to be vaccinated within the first three months of life (infants in orphanages were given four months). Physicians were obligated to send certificates of vaccination to local birth registrars, and parents who did not comply were subject to fines or imprisonment. The requirement was extended under the Vaccination Act of 1867.[2]

     New York City followed suit, with the state legislature creating the Metropolitan Board of Health in 1866. The Board responded to the cholera outbreak of that year by mandating the isolation of cholera patients and the disinfection of their excretions. When a new epidemic, traveling from India to Egypt, erupted in 1883, French and German teams descended on Egypt in search of the specific microorganism responsible for cholera. The prize went to Robert Koch, who isolated the cholera vibrio in January 1884.

     In 1892, when a cholera epidemic originating in Hamburg brought 100 cases to New York, the city mobilized with the full force of the new science of bacteriology. The Board of Health lost no time in establishing a Division of Pathology, Bacteriology, and Disinfection, which included a state-of-the-art bacteriological laboratory under the direction of Hermann Biggs. The lab, as we have seen, came into its own in the fight against diphtheria, but it was the threat of cholera that brought it into existence. A year later, in 1893, Congress passed the National Quarantine Act, which created a national system of quarantine regulations, including specific procedures for the inspection of immigrants and cargoes. It was to be administered by the U.S. Marine Hospital Service, forerunner of the Public Health Service.

     In the late 1840s, the Bristol physician William Budd argued that contaminated sewage was the source of cholera, and in 1854 the London physician John Snow traced the source of a cholera outbreak in his Soho neighborhood to contaminated well water. But it was the Hamburg epidemic that proved beyond doubt that cholera was waterborne, and Koch himself demonstrated that water filtration was the key to its control.[3] Now we rarely hear of cholera, since the water and waste management systems that came into existence over the last century eliminated it from the U.S. and Europe.[4] Anti-vax libertarians would no doubt take exception to the Safe Drinking Water Act of 1974, which empowers the EPA to establish and enforce national water quality standards. There it is again, the oppressive hand of Big Government, denying Americans the freedom to drink contaminated water and contract cholera. Where has our freedom gone?

Caricature of 1866, “Death’s Dispensary,” depicting contaminated drinking water as a source of cholera.

     The gods will never stop laughing at the idiocy of humankind. Here we are in 2021 and, thanks to the foundation laid down by 19th-century scientists, gifted scientists of our own time have handed us, in astoundingly little time, an understanding of the coronavirus, its mode of transmission, and a pathway to prevention and containment. We have in hand safe and effective vaccines that reduce the risk of infection to minuscule proportions and ensure that, among the immunized, infection from potent new strains of the virus will almost always be mild and tolerable, and only rarely life-threatening.

     Yes, a small percentage of those who receive Covid vaccines will have reactions, and, among them, a tiny fraction will become ill enough to require treatment, even hospitalization. But they will recover and enjoy immunity thereafter. Such “risks” pale alongside those incurred by their forebears, who sought protection from smallpox in the time-tested manner of generations past. In America, decades before Edward Jenner’s cowpox-derived vaccine, colonists protected themselves from recurrent smallpox epidemics through inoculation with human smallpox “matter.” The procedure, termed variolation, had been practiced for centuries in Asia and the Ottoman Empire before reaching Britain and America in 1721. It involved inoculating the healthy with pus scraped from the skin ulcers of those already infected, and it was informed by the ancient observation that smallpox could be contracted only once in a lifetime.[5] The variolated developed a mild case of smallpox which, so it was hoped, would confer protection against the ravages of future epidemics.

     And they were essentially right: over 98% of the variolated survived the procedure and achieved immunity.[6]   To be sure, the risk of serious infection was greater with variolation than with Edward Jenner’s cowpox-derived vaccine, but the latter, which initially relied on the small population of English cows that contracted cowpox and person-to-person inoculation, was a long time in coming.  It took the United States most of the 19th century to maintain and distribute an adequate supply of Jennerian vaccine.  Long before the vaccine was widely available, when the death rate from naturally acquired smallpox was roughly 30%,[7] Americans joined Europeans, Asians, and Africans in accepting the risks of variolation. For George Washington, as we noted, the risks paled alongside the very real risk that the Continental Army would collapse from smallpox:  he had every soldier variolated before beginning military operations in Valley Forge in 1777.[8]

     But here we are in 2021, with many Americans unwilling to accept the possibility of any Covid vaccine reaction at all, however transient and tolerable. In so doing, they turn their backs on more than two hundred years of scientific progress, of which the successful public health measures in Europe and America spurred by the cholera epidemics form an important chapter. The triumph of public health, which antedated the germ theory of disease by decades, accompanied the increased life expectancy and vastly improved quality of life wrought by vaccine science – indeed, by science in general.

     Witness Britain’s Great Exhibition of 1851, a scant three years after the cholera epidemic of 1848. Under the dome of the majestic Crystal Palace, science was celebrated in all its life-affirming possibilities.  In medicine alone, exhibits displayed mechanically enhanced prosthetic limbs, the first double stethoscope, microscopes, surgical instruments and appliances of every kind, and a plethora of pharmaceutical extracts and medicinal juices (including cod liver oil).[9] Topping it off was a complete model of the human body that comprised 1,700 parts.  Science promised better lives and a better future; scientific medicine, which by 1851 had begun to include public health measures, was integral to the promise.

     But here we are in 2021, replete with anti-vaccinationists who choose to endanger themselves, their children, and members of their communities. They are anti-science primitives in our midst, and I behold them with the same incredulity with which visitors to Jurassic Park beheld living, breathing dinosaurs. Here are people who repudiate both public health measures (mask wearing, curfews, limits on group gatherings) and vaccination science in a time of global pandemic. For them, liberty is a primal thing that antedates the social contract, of which our Constitution is a sterling example. It apparently includes the prerogative to get sick and make others sick to the point of death. The anti-vaccinationists, prideful in their ignorance and luxuriant in their fantasies of government control, remind me of what the pioneering British anthropologist Edward Tylor termed “survivals,” by which he meant remnants of cultural conditions and mindsets irrelevant to the present. Dinosaurs, after all, didn’t care about the common good either.


[1] Newsletter and press quotations from Charles Rosenberg, The Cholera Years: The United States in 1832, 1849, and 1866 (Chicago: University of Chicago Press, 1962), 68, 161, 155.

[2] Dorothy Porter & Roy Porter, “The Politics of Prevention:  Anti-Vaccinationism and Public Health in Nineteenth-Century England,” Medical History, 32:231-252, 1988.

[3] Thomas D. Brock, Robert Koch: A Life in Medicine and Bacteriology (Wash, DC: ASM Press, 1998 [1988]), 229-230, 255.

[4] The last reported case of cholera in the U.S. was in 1949.  Cholera, sadly, remains alive and well in a number of African countries. 

[5] In China and Asia Minor, where variolation originated, dried smallpox scabs blown into the nose was the mode of inoculation. 

[6] José Esparza, “Three Different Paths to Introduce the Smallpox Vaccine in Early 19th Century United States,” Vaccine, 38:2741-2745, 2020.

[7] Ibid.

[8] Andrew W. Artenstein, et al., “History of U.S. Military Contributions to the Study of Vaccines against Infectious Diseases,” Military Medicine, 170[suppl]:3-11. 2005.  

[9] C. D. T. James, “Medicine and the 1851 Exhibition,” J. Royal Soc. Med., 65:31-34, 1972.

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.



Vaccinating Across Enemy Lines

There are periods in American history when scientific progress is in sync with governmental resolve to exploit that progress.  This was the case in the early 1960s, when advances in vaccine development were matched by the Kennedy Administration’s efforts to vaccinate the nation and improve the public’s health.  And the American public wholeheartedly supported both the emerging generation of vaccines and the government’s resolve to place them in the hands – or rather arms – of as many Americans as possible. The Vaccination Assistance Act of 1962 grew out of this three-pronged synchrony.[1]

Between 1963 and 1965, a severe outbreak of rubella (German measles) lent support to those urging Congress to approve Title XIX (the Medicaid provision) of the Social Security Act of 1965. And Congress rose to the task, passing into law the “Early and Periodic Screening, Diagnosis, and Treatment” amendments to Title XIX. The latter affirmed the right of every American child to receive comprehensive pediatric care, including vaccination.

The timing was auspicious. In 1963, Merck, Sharp & Dohme began shipping its live-virus measles vaccine, trademarked Rubeovax, which had to be administered with a standardized immune globulin (Gammagee). In 1967 MSD combined the measles vaccine with smallpox vaccine as Dryvax, and then, a year later, released a more attenuated live measles vaccine (Attenuvax) that did not require coadministration of immune globulin.[2] MSD marketing reminded parents that mumps, long regarded as a benign childhood illness, was now associated with adult sterility. Mumps, too, bowed to science and responsible parenting, its incidence among American children falling 98% between 1968 and 1985.

Crowd waiting for 1962 oral polio vaccination. Creator: CDC/Mr. Stafford Sm

America’s commitment to vaccination was born of the triumphs of American medicine during WWII and came to fruition in the early 1950s, just as Cold War fears of nuclear war gripped the nation and pervaded everyday life.  Grade school nuclear attack drills, “duck and cover” animations, basement fallout shelters with cabinets filled with canned food – I remember all too well these scary artifacts of a 1950s childhood. Competition with the Soviet Union suffused all manner of scientific, technological, public health-related, and athletic endeavor. The Soviets leapt ahead in the space race with the launching of Sputnik in 1957.  The U.S. retained an enormous advantage on the ground with the size and destructive power of its nuclear arsenal.

Less well known is that, in the matter of mass polio vaccination, countries in the Eastern Bloc – Hungary, Czechoslovakia, Poland – led the way. Hungary’s intensive annual vaccination campaigns, launched in 1957 with Salk vaccine imported from the U.S. and supplemented in 1959 with Sabin vaccine imported from the U.S.S.R., were the prototype for the World Health Organization’s (WHO) global strategy of polio eradication. Czechoslovakia became the first nation to eradicate polio, in 1959; Hungary followed in 1963.[3]

It is tempting to absorb the narrative of polio eradication into Cold War politics, especially the rhetoric of the vaccination campaigns that mobilized the public. Throughout the Eastern Bloc, mass vaccination was an aspect of pro-natalist policies seeking to increase live births, healthy children, and, a bit down the road, productive workers. Eradication of polio, in the idiom of the time, subserved the reproduction of labor. In the U.S., the strategic implications of mass vaccination were framed differently.  During the late 50s and early 60s, one in five American applicants for military service was found medically unfit.  Increasing vaccination rates was a cost-effective way of rendering more young men fit to serve their nation.[4]   

But there is a larger story that subsumes these Cold War rationales, and it is a story, surprisingly, of scientific cooperation across the Iron Curtain.  Amid escalating Cold War tensions, the United States and Soviet Union undertook a joint initiative, largely clandestine, to develop, test, and manufacture life-saving vaccines.  The story begins in 1956, when the U.S. State Department and Soviet Ministry of Foreign Affairs jointly facilitated collaboration between Albert Sabin and two leading Soviet virologists, Mikhail Chumakov and Anatoli Smorodintsev.  Their shared goal was the manufacture of Sabin’s oral polio vaccine on a scale sufficient for large-scale testing in the Soviet Union. With a KGB operative in tow, the Russians travelled to Sabin’s laboratory in the Cincinnati Children’s Hospital, and Sabin in turn flew to Moscow to continue the brainstorming.  

Two years later, shipments of Sabin’s polio virus strains, packed in dry ice, arrived in the Soviet Union, and shortly thereafter, with the blessing of the post-Stalin Kremlin leadership, the mass trials began. The Sabin vaccine was given to 10 million Russian school children, followed by millions of young adults. A WHO observer, the American virologist Dorothy Horstmann, attested to the safety of the trials and the validity of their findings. The Sabin vaccine has long since stopped polio transmission everywhere in the world except Afghanistan and Pakistan.

Polio was not the only front of Cold War cooperation. Soviet scientists also developed a unique process for preserving smallpox vaccine in harsh environments. With freeze-dried vaccine now available, Viktor Zhdanov, a Soviet virologist and Deputy Minister of Health, boldly proposed to the 1958 meeting of the World Health Assembly, WHO’s governing body, the feasibility of global smallpox eradication. After the meeting, he did not wait patiently for the WHO to act: he led campaigns both to produce smallpox vaccine and to solicit donations of it from around the world.[5] His American colleague-in-arms in promoting freeze-dried vaccine was the public health physician and epidemiologist Donald Henderson, who led the 10-year international vaccination campaign that eliminated smallpox by 1977.[6]

What can we learn from our Cold War predecessors?  The lesson is self-evident: we learn from them that science in the service of public health can be an enclave of consensus, what Dora Vargha, the historian of Cold War epidemics, terms a “safe space,” among ideological combatants with the military resources to destroy one another. The Cold War is long gone, so the safe space of which Vargha writes is no longer between geopolitical rivals with fingers on nuclear triggers.

But America in 2021 is no longer a cohesive national community. Rather, we inhabit a fractured national landscape that erupts, with demoralizing frequency, into a sociopolitical battle zone. The geopolitical war zone is gone, but Cold War-type tensions play out in the present. Right-wing extremists, anti-science Evangelicals, purveyors of a Trump-like notion of insular “greatness” – these overlapping segments of the population increasingly pit themselves against the rest of us: most Democrats, liberals, immigrants, refugees, defenders of the social welfare state that took shape after the Second World War. Their refusal to receive Covid-19 vaccination is absorbed into a web of breezy rhetoric: that they’ll be okay, that the virus isn’t so bad, that the vaccines aren’t safe, that they come to us from Big Government, which always gets it wrong. Any and all of the above. In fact, the scientific illiterati are led by their anger, and the anger shields them from the relevant knowledge – of previous pandemics, of the nature of a virus, of the human immune system, of the role of antibodies in protecting us from invading pathogens, of the action of vaccines on that immune system – that would lead them to set aside their misgivings and get vaccinated.

When the last wave of antivaccinationism washed across these shores in the early 1980s, it was led by social activists who misappropriated vaccination in support of their causes. Second-wave feminists saw vaccination as part of the patriarchal structure of American medicine and urged women to be skeptical about vaccinating their children, citing the possibility of reactions to measles vaccine among children allergic to eggs. It was a classic instance of throwing out the baby with the bathwater, which, in this case, meant putting children at risk because the bathwater reeked of male hubris. Not to be left out of the antiscientific fray, environmentalists, in an act of stupefying illogic, deemed vaccines an environmental pollutant – and one, according to writers such as Harris Coulter, associated with psychiatric illness.[7]

Matters are now much worse.  Antivaccinationism is no longer aligned, however misguidedly, with a worthy social cause.  Rather, it has been absorbed into a far-reaching skepticism about government which, according to many right-wing commentators and their minions, intrudes in our lives, manipulates us, constrains our freedom of choice, and uses our tax dollars to fund liberal causes.

Even in the absence of outright hostility, there is a prideful indifference to vaccination, partly because it comes as a directive from Big Government acting in concert with what is, after all, Big Pharmaceutical Science.  But we have always needed Big Government and Big Science to devise solutions to Big Problems, such as a global pandemic that has already claimed over 560,000 American lives.  Without American Big Government, in cooperation with British Big Government, overseeing the manufacture and distribution of penicillin among collaborating pharmaceutical firms, the miracle drug would not have been available in time for D-Day.  Big Government made it happen.  A decade later, the need for international cooperation transcended the bonds of wartime allies.  It penetrated the Iron Curtain in the wake of the global polio and smallpox epidemics that began in 1952 and continued throughout the decade.

The last thing we need now is a reprise of that era's McCarthyism, when anyone could be tainted, if not blacklisted, by the mere accusation of contact with communists or communism.  That is, we do not need a nation in which, for part of the population, anything bearing the stamp of Big Government is suspected of being a deception that infringes on some Trumpian-Hobbesian notion of "freedom" in a state of (market-driven) nature.

If you want to make America "great" again, then start by making Americans healthy again.  Throughout the late 1950s and 1960s, the imperative of vaccination overcame the anxieties of American and Soviet officials given to eyeing one another warily atop growing nuclear stockpiles.  They brought the scientists together, and the result was the mass testing that led to the near-eradication of polio.  Then America rallied around the Soviet creation of freeze-dried smallpox vaccine, and largely funded the manufacture and distribution that resulted in the eradication of smallpox.

Now things are better.  We live in an era in which science enables us to alter the course of a global pandemic.  It is time for antivaccinationists to embrace the science, indeed, to celebrate the science and the gifted scientists whose grasp of it enabled them to create safe and effective Covid-19 vaccines in astonishingly little time.  You’ve got to get your vaccine.  It’s the only way. 


[1] Elena Conis, Vaccine Nation: America's Changing Relationship with Immunization (Chicago: University of Chicago Press, 2014), 20.

[2] Louis Galambos, with Jane Eliot Sewell, Networks of Innovation: Vaccine Development at Merck, Sharp & Dohme, and Mulford, 1895-1995 (Cambridge: Cambridge University Press, 1995), 96-98, 196-107.

[3] Dora Vargha, "Between East and West: Polio Vaccination Across the Iron Curtain in Cold War Hungary," Bull. Hist. Med., 88:319-345, 2014; Dora Vargha, "Vaccination and the Communist State," in The Politics of Vaccination (online pub date: March 2017).

[4] Conis, Vaccine Nation, 27.

[5] Erez Manela, "A Pox on Your Narrative: Writing Disease Control into Cold War History," Diplomatic History, 34:299-323, 2010.

[6] Peter J. Hotez, "Vaccine Diplomacy: Historical Perspective and Future Directions," PLoS Neglected Trop. Dis., 8:e2808, 2014; Peter J. Hotez, "Russian-United States Vaccine Science: Preserving the Legacy," PLoS Neglected Trop. Dis., 11:e0005320, 2017.

[7] The feminist and environmentalist antivaccination movements of the 1980s are reviewed at length in Conis, Vaccine Nation, chapters 5 and 6.

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.

Anti-vaccinationism, American Style

Here is an irony:  America's staggering production of generations of scientific brainpower coexists with many Americans' deep skepticism about science.  Donald Trump, a prideful scientific illiterate, rode to power on the backs of many others who, like him, were skeptical about science and especially about the role of scientific experts in modern life.  He maintains their allegiance still.

Why does this surprise us?  Anti-intellectualism was burned into the national character early in American history.  Those skeptical of this claim should read Richard Hofstadter's brilliant twin studies of the 1960s, Anti-Intellectualism in American Life and The Paranoid Style in American Politics.  From the beginning of the American Experiment, democracy was cast as antithetical to so-called European "elitism," and this ethos gained expression, inter alia, in antebellum medicine.

The Founding Fathers, an intellectual elite in defense of democracy, were not part of the movement away from science.  When Benjamin Waterhouse introduced Edward Jenner's smallpox vaccine to America in 1800, Adams and Jefferson hailed it as the greatest discovery of modern medicine.  The Founders appreciated the severity of smallpox, which had ravaged the Continental Army during the War of Independence.  Indeed, Washington was so desperate to rein in its decimation of his troops that, in 1777, he inoculated his entire army with pus from active smallpox lesions, knowing that the resulting infections would be milder and far less likely to cause fatalities than smallpox naturally contracted.  When Jefferson became president in 1801, he pledged to introduce the vaccine to the American public, because "it will be a great service indeed rendered to human nature to strike off the catalogue of its evils so great a one as the smallpox."  Not to be outdone in support of Jenner's miraculous discovery, Jefferson's successor, James Madison, signed into law in 1813 "An Act to Encourage Vaccination."  Among its provisions was the requirement that the U.S. postal service "carry mail containing vaccine materials free of charge."[1]

But this appreciation of the vaccine was short-lived, and Jefferson's hope that the value of vaccination would seep into public consciousness was never realized.  In Jacksonian America, the Founding Fathers' belief that medical progress safeguarded democracy gave way to something far less enlightened:  democracy now meant that everyone could be, indeed should be, his own doctor.  Most Americans had no need for those with university educations, much less clinical experience in governmentally managed public hospitals.  Jacksonian America emerged as what the historian Joseph Kett termed the "Dark Age of the profession."[2]  During this time, the nation laid claim to a medical elite only because a few moneyed members of the medical intelligentsia – John Collins Warren, Valentine Mott, Philip Syng Physick, William Gibson, and David Hosack, among them – found their way to European medical centers in London, Edinburgh, and, somewhat later, Paris.

Otherwise, it was every man for himself, which usually meant every woman for herself and her family.  Homeopaths, herbalists, Thomsonians, eclectics, hydropaths, phrenologists, Christian Scientists, folk healers, faith healers, uroscopians, chromo-thermalists – each exemplified the democratic mind in action.[3]  Sad to say, homegrown “regular” American medicine of the day, with its reliance on depletive (bleeding, vomiting, purging) and stimulative (alcohol, quinine) treatments, was no better and often worse.  The belief, Galenic in origin, that all diseases were variants of the same global type of bodily dysregulation is startlingly close to Donald Trump’s holistic medieval approach to bodily infection and its treatment.

The birth of scientific medicine in the decades following the Civil War could not still the ardor of America's scientific illiterati.  The development of animal blood-derived serums (antitoxins), forerunners of modern antibiotics, was anathema to many.  Among them were religionists, mainly Christian, for whom injecting the blood product of a horse or sheep into the human body was not only repugnant but sinful.  Better to let children be stricken with smallpox, diphtheria, and tetanus, sometimes to the point of death, than violate what they construed as divine strictures – strictures, be it noted, not intimated, much less codified, in the body of doctrine of any of the five major world religions.[4]

Antivaccinationists of the early 20th century were an unhappy lot.  They were unhappy about the proliferation of medicines ("biologics") for treating illness.  And they deeply resented the intrusion of the State into domains of parental decision-making in the form of newly empowered social workers, visiting nurses, and educators.  In fact, antivaccinationism was part and parcel of resistance to all things progressive, including scientific medicine.[5]  Holdovers from the freewheeling, anything-goes medicine of antebellum America – especially devotees of homeopathy and the newly founded chiropractic – were prominent in its ranks.

Now, in the face of a global pandemic no less lethal than the Great Influenza of 1918-1919, we hear the same irrational musings about the dangers of vaccines that animated the scientific illiterati at the turn of the 20th century.  For the foes of public health, any misstep in the manufacture or storage of smallpox vaccine – a much greater possibility over a century ago than today – was enough to condemn vaccination outright.  In 1901, smallpox vaccination of school children in Camden, NJ, led to an outbreak of 100 cases of tetanus, with nine deaths.  Historians believe that, in all probability, the outbreak resulted not from a contaminated batch of vaccine but rather from poor care of the vaccination site.  But Congress accepted the possibility of contamination, and the incident led to passage of the Biologics Control Act of 1902.[6]  Henceforth every manufacturer of vaccine had to be licensed by the Secretary of the Treasury (relying on the PHS Laboratory of Hygiene), and each package of vaccine had to be properly labeled and dated and was subject to inspection.[7]

And this leads to a second irony: the more preventive medicine advanced, incorporating additional safeguards into vaccine production, storage, and administration, the greater the resistance of the illiterati.  Throughout the 20th century and right down to the present, the antebellum notion of science-free "medical freedom" has continued to hold sway.  Then and now, it means the right to put children at risk of major infectious diseases that could result in death – and the right, further, to pass disease, possibly severe and occasionally fatal, on to others.

It follows that, then and now, the science illiterati are skeptical of, if not distressed by, the State's commitment to public health.  It was Oklahoma Senator Robert Owen's proposed legislation of 1910 to combine five federal departments into a cabinet-level Department of Public Health that pushed the opponents of medical "tyranny" onward.  The Anti-Vaccination League of America, formed in 1908, was joined by the National League for Medical Freedom in 1910.  Eight years later, they were joined by the American Medical Liberty League.  For all three groups, anti-Progressivism was in full swing.  "Medical freedom" meant exempting children not only from compulsory vaccination but from medical examinations at school.  Further, young adults should not be subjected to premarital syphilis tests.  Nor did the groups' expansive view of medical tyranny flinch in the face of public education about communicable disease: municipal campaigns against diphtheria were to be forbidden entirely.

With the deaths of the founders of the Anti-Vaccination League (Charles Higgins) and the American Medical Liberty League (Lora Little) in 1929 and 1931, respectively, antivaccinationism underwent a dramatic decline.  The Jacksonian impulse that fueled the movement simply petered out, and by the later 1930s Americans finally grasped that mainstream medicine was not simply another medical sect.  It was the real deal:  a medicine grounded in laboratory research that effectively immunized against disease, promoted relief and cure of those already infected, and thereby saved lives.

But was the embrace of scientific healing really universal?  A deadly strain of anti-science persisted well beyond the 1930s.  Consider the belief of some Christian sects that certain life-saving medical interventions must be withheld from children on religious grounds.  It was only in 1982, nearly a century after von Behring's discovery of diphtheria antitoxin launched the era of serum therapy, that criminal charges were first brought against parents who had withheld necessary treatment from their children.  Of the 58 cases of such parental withholding of care, 55 involved fatalities.[8]  Child deaths among Christian Scientists alone included untreated diabetes (leading to diabetic ketoacidosis), bacterial meningitis, and pneumonia.  Now things are better for the children, since even U.S. courts that have overturned parents' criminal convictions have come around to the mainstream view that religious exemption laws are not a defense against criminal neglect – a fine insight for the judiciary to have arrived at more than a century after serum therapy scored major triumphs in the treatment of rabies, diphtheria, tetanus, pneumococcal pneumonia, and meningococcal meningitis.

Should vaccination for the Covid-19 virus be a requirement for attendance in public and private schools?  How can the question even be asked?  As early as 1827, a Boston school committee ordered teachers to require entering students to give evidence of smallpox vaccination.[9]  Statewide vaccination requirements for smallpox followed in Massachusetts in 1855, New York in 1862, Connecticut in 1872, and Pennsylvania in 1895.  And the inoculations were effective across the board.  They quickly brought under control smallpox outbreaks underway at the time, and they prevented their recurrence.  These laws and those that followed were upheld by the Supreme Court in 1922 in Zucht v. King.[10]

Twentieth-century vaccines were developed for pertussis in 1914, diphtheria in 1926, and tetanus in 1938.  In 1948 the three were combined and given to infants and toddlers at regular intervals as the DTP vaccine.  There was no hue and cry in 1948 or in the years that followed.  And yet the same fear of vaccination that the New York State Health Department had to overcome when it launched a statewide drive to immunize children against diphtheria now renders a new generation of parents resistant to mandatory Covid-19 vaccination for their own children.

Bear in mind that the anti-science rhetoric of today's illiterati can be mobilized just as easily to resist DTP or any subsequent vaccine administered to their children.  Why subject a child to DTP vaccination?  Perhaps combining three different vaccines into one injection entails heightened risks.  Perhaps the batch of vaccine in the hands of one's own doctor has been contaminated.  Perhaps one's child will be among the minuscule number that have a minor allergic reaction.  And, after all, children who contract diphtheria, pertussis, and/or tetanus will hardly die from their infections, especially with the use of antibiotics.  Why inject foreign matter into healthy infants at all – the very argument adduced by the opponents of diphtheria vaccine a century ago?

The problem with antivaccinationist rhetoric in the 21st century is that its proponents are all beneficiaries of more than a century of mandatory vaccination policy.  If they had lived in a society bereft of vaccines – or, for the unvaccinated, bereft of the immunity conferred by the vast herd of immunes – they would have led very different lives.  Indeed, some would not be here to celebrate solipsism masquerading as individualism.  Their specious intuitions about the risks of vaccination are profoundly anti-social, since they compromise the public's health.  Parents who decide not to vaccinate their children put the entire community at risk.  The community includes not only their own children, but all those who desire protection but cannot receive it:  children too young to be vaccinated, those with actual medical contraindications to vaccination, and the minuscule number who have been vaccinated but remain unprotected.[11]

Nor is it strictly a matter of providing equal protection to individuals who seek, but cannot receive, the protection afforded by compulsory vaccination.  In a secular society, religious objections to vaccination pale beside the health of the community.  Whether framed in terms of a "compelling state interest" in mitigating a health threat (Sherbert v. Verner [1963]) or the individual's obligation to comply with "valid and neutral laws of general applicability" whatever their incidental religious implications (Employment Division, Department of Human Resources of Oregon v. Smith [1990]), the U.S. Supreme Court has consistently held that mandatory vaccination laws need not allow religious exemptions of any kind.

Antivaccinationists might bear in mind a few particulars as they align themselves with the infectious dark ages.  Between 1900 and 1904, an average of 48,164 cases of smallpox and 1,528 smallpox deaths were reported each year. With the arrival of compulsory vaccination in schools, the rate fell drastically and outbreaks of smallpox ended in 1929. The last case of smallpox in the U.S. was reported in 1949.[12]  

Among American children, diphtheria was a major cause of illness and death through 1921, when 206,000 cases and 15,520 deaths were recorded.  Before Emil von Behring's diphtheria antitoxin became available in 1894 to treat infected children, the death rate among stricken children could reach 50%.  Within several years, use of the antitoxin brought it down to 15%.[13]  Then, by the late 1920s, diphtheria immunization was introduced, and diphtheria rates fell dramatically, both in the U.S. and in other countries that vaccinated widely.  Between 2004 and 2008, no cases of diphtheria were recorded in the U.S.[14]

Between 1951 and 1954, paralytic polio cases in the United States averaged 16,316 a year, of which 1,879 resulted in death.  Then science came to the rescue.  Jonas Salk's killed-poliovirus vaccine became available in 1955, and Albert Sabin's live-poliovirus variant four years later.  By 1962, there were fewer than 1,000 cases a year and, in every year thereafter, fewer than 100 cases.[15]
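A quick arithmetic aside may help fix the scale of these figures.  The sketch below (in Python, purely illustrative; the labels, rounding, and variable names are mine, while the case and death counts are those cited in the three preceding paragraphs) computes approximate case-fatality rates, that is, deaths per hundred reported cases.

# Illustrative only: approximate case-fatality rates computed from the
# figures cited above (annual averages or single-year totals).
cited_figures = {
    # disease and period: (reported cases, reported deaths)
    "smallpox, 1900-1904 (annual average)": (48_164, 1_528),
    "diphtheria, 1921": (206_000, 15_520),
    "paralytic polio, 1951-1954 (annual average)": (16_316, 1_879),
}

for label, (cases, deaths) in cited_figures.items():
    case_fatality = deaths / cases * 100  # deaths per 100 reported cases
    print(f"{label}: {cases:,} cases, {deaths:,} deaths -> roughly {case_fatality:.1f}% fatal")

Run as written, the sketch yields roughly 3 deaths per 100 reported smallpox cases, 7 to 8 per 100 diphtheria cases, and more than 11 per 100 paralytic polio cases – the pre-vaccine toll the antivaccinationists would have us court again.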

Now, alas, some parents still worry that the measles component of the MMR (measles, mumps, rubella) vaccine available since 1971 may lead to childhood autism.  Why?  Because the disease-promoting mythologies of the illiterati die hard.  Resist them at all costs.  Autism is a neuro-developmental disorder with a strong genetic component; its genesis is during the first year of life, before the vaccine is even administered.  None of the epidemiologists who have studied the issue has found any evidence whatsoever of an association, not among normal children and not among high-risk children with autistic siblings.[16]  The fact is that children who do not receive a measles vaccine have been found to be 35 times more likely to contract measles than the vaccinated.[17]  And measles is no laughing matter.  When contracted later in life, measles and mumps are serious and can be deadly.  They were among the major systemic infections that felled soldiers during the Civil War, the Spanish-American War, the Anglo-Boer War, and World War I.[18]

All of which leads to a conclusion in the form of an admonishment.  Accept the fact that you live in a secular society governed by law and a network of agencies, commissions, and departments lawfully enjoined to safeguard public health.  Do your part to sustain the social contract that came into existence when the Founding Fathers, elitists molded by European thought who had imbibed the social contractualism of John Locke, wrote the American Constitution.

Vaccination is a gift that modern science bestows on all of us – vaccination proponents and opponents alike.  When one of the two FDA-authorized Covid-19 vaccines comes to a clinic or storefront near you, run, don't walk, to get your and your children's shots.  Give thanks to the extraordinarily gifted scientists at Pfizer and Moderna who created these vaccines and demonstrated their effectiveness and safety.  Make sure that everyone's children grow up, to paraphrase the U.S. Army's old recruiting slogan, to be all they can be.


[1] Dan Liebowitz, "Smallpox Vaccination: An Early Start of Modern Medicine in America," J. Community Hosp. Intern. Med. Perspect., 7:61-63, 2017 (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5463674).

[2] Joseph F. Kett, The Formation of the American Medical Profession: The Role of Institutions, 1780-1860 (New Haven: Yale University Press, 1968), p. vii. 

[3] Robert E. Riegel, Young America, 1830-1840 (Westport, CT: Greenwood Press, 1973 [1949]), pp. 314-315, quoted at  314. 

[4] John D. Grabenstein, "What the World's Religions Teach, As Applied to Vaccines and Immune Globulins," Vaccine, 31:2011-2023, 2013.

[5] James Colgrove, “’Science in Democracy’: The Contested Status of Vaccination In the Progressive Era and the 1920s,” Isis, 96:167-191, 2005.

[6]  Harry F. Dowling, Fighting Infection: Conquests of the Twentieth Century (Cambridge, MA: Harvard University Press, 1977), 38; Harry M. Marks, The Progress of Experiment: Science and Therapeutic Reform in the United States, 1900-1990 (Cambridge: Cambridge University Press, 1997), 73-74.

[7] Jonathan Liebenau, Medical Science and Medical Industry: The Formation of the American Pharmaceutical Industry (Baltimore: Johns Hopkins, 1987), 89-90.

[8] Janna C. Merrick, "Spiritual Healing, Sick Kids and the Law: Inequities in the American Healthcare System," Amer. J. Law & Med., 29:269-300, 2003, at 280.

[9] John Duffy, "School Vaccination: The Precursor to School Medical Inspection," J. Hist. Med. & Allied Sci., 33:344-355, 1978.

[10] Kevin M. Malone & Alan R. Hinman, "Vaccination Mandates: The Public Health Imperative and Individual Rights," in Law in Public Health Practice (2009), 262-284, at 272.

[11] Alan R. Hinman, et al., "Childhood Immunization: Laws that Work," J. Law, Med. & Ethics, 30(suppl):122-127, 2002.

[12] Frank Fenner, et al., Smallpox and its Eradication (Geneva: World Health Organization, 1988).

[13] Karie Youngdahl, “Early Uses of Diphtheria Antitoxin in the United States,” The History of Vaccines, August 2, 2010 (https://www.historyofvaccines.org/content/blog/…).

[14] Epidemiology and Prevention of Vaccine-Preventable Diseases, 11th Edition (The Pink Book). National Immunization Program, Centers for Disease Control and Prevention (http://www.cdc.gov/vaccines/Pubs/pinkbook/downloads/dip.pdf); Diphtheria. WHO, Regional Office for the Western Pacific (http://www.wpro.who.int/health_topics/diphtheria).

[15] CDC. Annual summary 1980: Reported Morbidity and Mortality in the United States. MMWR 1981;29; CDC, Reported Incidence of Notifiable Diseases in the United States, 1960. MMWR 1961;9.

[16] Frank DeStefano & Tom T. Shimabukuro, “The MMR Vaccine and Autism,” Ann. Rev. Virol., 6:585-600, 2019.

[17] Hinman, op. cit. (note 11).

[18] Paul E. Stepansky, Easing Pain on the Western Front:  American Nurses of the Great War and the Birth of Modern Nursing Practice (Jefferson, NC:  McFarland, 2020), 36, 50, 96, 144.

 

Copyright © 2020 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in their courses and seminars let him know via info[at]keynote-books.com.

Covid-19 and Trump’s Medieval Turn of Mind

“We ought to give it [hydroxychloroquine] a try . . . feel good about it. That’s all it is, just a feeling, you know, smart guy. I feel good about it.” – Donald J. Trump, March 20, 2020

“I see the disinfectant, where it knocks it out in a minute. One minute. And is there a way we can do something like that, by injection inside or almost a cleaning?  Because you see it gets in the lungs, and it does a tremendous number on the lungs. So, it would be interesting to check that.” – Donald J. Trump, April 23, 2020

“So, supposing we hit the body with a tremendous —  whether it’s ultraviolet or just very powerful light — and I think you said that that hasn’t been checked, but you’re going to test it. And then I said, supposing you brought the light inside the body?” – Donald J. Trump, April 23, 2020

______

Viewed from the standpoint of the history of medicine, the Great Influenza (aka the Spanish Flu) of 1918-1919 and the Coronavirus pandemic of today, separated by a century, share a basic commonality.  Both are pandemics of the modern era, in which treatments for specific diseases grow out of the findings of laboratory science and scientific medicine.  The development of serum therapy, which transferred to humans, via injection, whatever antitoxins resided in the purified blood of immune animals, had by 1918 proven effective, albeit to varying degrees, against diseases such as rabies, diphtheria, tetanus, and typhoid fever.

Today we anxiously await the development of a Covid-19 vaccine.  In 1918, health professionals and the public waited impatiently for new serums to combat gas gangrene and the pandemic flu.  And given the state of medical progress at the time – viruses had yet to be identified and differentiated from bacteria – their optimism was reasonable.  By the spring of 1918, 5,000 units of an anti-gangrene serum had reached AEF hospitals in Europe, of which 2,500 units had been used by the time of the Armistice.  For the Spanish Flu, two different injectable serums were available to overseas American nurses by the fall of 1918.

The predictable failure of these serums should not obscure the fact that in 1918 management of the Spanish Flu was squarely in the hands of mainstream scientists and physicians.  President Woodrow Wilson, for his part, stood back from the whirl of suffering and death around him.  He maintained a steely silence about the whole business, refusing to mention the pandemic in even a single public address.  His preoccupation with the war and the ensuing Paris Peace Conference was total, and it precluded even the simplest expression of sympathy for stricken Americans and their families.  Here he anticipated by a century President Donald Trump.

Wilson held his peace.  Now we behold President Donald Trump, who, in his own preoccupation with self-promotion and self-congratulation, buttressed by denial of the pandemic's magnitude, cannot remain silent.  Not even for a day.  But what is he telling us?  How is he helping us cope with the fury of another global pandemic?  His musings – contradictory, impulsive, and obsessively self-serving – would have stunned Americans of 1918.  For Trump seems to have dispensed with scientific medicine altogether.  To understand his "spin" on the pandemic, we must go back much further than the Great Influenza and look again at the Black Death of the mid-14th century.

In October, 1347, a vessel, probably originating off the Crimean Peninsula, docked in Messina, Sicily.  It was laden with infected rats, themselves laden with infected fleas.  Together, the rats and fleas brought the Black Death, likely a combination of bubonic and hemorrhagic plague, to Europe.  Physicians of the time, wedded to Hippocratic and Galenic notions of illness and health, confronted plague with the therapeutics derived from this paradigm.  Bleeding (venesection) was typically the first order of business, since blood was associated with body heat.  Bleeding would cool down a body overheated by fever and  agitation, thereby restoring balance among the four humors that corresponded to the four elements of the universe: black bile [earth], yellow bile [fire], phlegm [water], and blood [air].

When bleeding and the regulation of the Galenic non-naturals (food and drink, motion, rest, evacuation, the passions) failed to restore health, physicians turned to what was to them an observable fact:  that plague was literally in the winds.  It was contained, that is, in miasmic air that was unbearably foul-smelling, hence corrupt and impure.  For some, the miasma resulted from a particular alignment of the planets; for many others it was pinned on the Jews, whom they held to be a poisonous race bent on poisoning the air.  But for most European physicians, no less than for priests and laymen, the miasmic air came directly from an enraged God who, disgusted with sinning humankind, breathed down the corrupt vapor to wipe it out.

How, then, were 14th-century physicians to combat a pollution of Divine origin?  Galen came to the rescue, with heat again at the center of plague therapeutics.  Heat, it was known, had the ability to eliminate foul-smelling air, perhaps even lethally foul-smelling air.  What was needed to combat plague was actually more heat, not less.  Make fires everywhere to purify the air.  This was the advice Guy de Chauliac, surgeon to the Papal Court in Avignon, gave Pope Clement VI, whose presumed sanctity did not prevent him from isolating himself from Court and servants and spending his days seated between two enormous log fires.  Among the infected, a more draconian application of heat was often employed:  doctors lanced plague victims' inflamed buboes (boils) and applied red-hot pokers directly to the open wounds.

Medieval thinking also led to treatments based on Galen's theory of opposites.  Purities cancel impurities.  If you wanted to avoid the plague, physicians advised, you drank copious amounts of the urine of the non-infected; collecting and distributing healthy urine became a community project throughout the continent.  If you were of means and would rather not drink urine, the ingestion of crushed sapphires would work just as well.

English peasants adopted a more benign path to purification:  they stuffed their dwellings with sweet scented flowers and aromatic herbs.  Here they followed the example of Europe’s plague doctors, those iconic bird-men who stuffed the huge beak extensions of their masks with dried flowers and odoriferous herbs to filter out pestilence from the air they breathed. Good smells, after all, were the opposite of airborne foulness.

A 14th-century plague doctor, dressed to ward off the miasma

On the other hand, in another variation of Galenic thinking, physicians sought a dissonant foulness powerful enough to vanquish the foulness in the air.  Villagers lined up to stick their heads in public latrines.  Some physicians favored a more subtle variant.  They lanced the infected boils of the stricken and applied a paste of gum resin, roots of white lilies, and dried human excrement.  The synergism among the ingredients, they believed, would act as a magical restorative.  This, in any event, was the opinion of the eminent Italian physician Gentile da Foligno, whose treatise on the Black Death was widely read and who, inter alia, was among the first European physicians to study plague victims by dissecting their corpses.  Needless to say, the treatment did him no good, and he died of plague in 1348.  Other physicians developed their own topical anodynes.  Snakes, when available, were cut up and rubbed onto a plague victim's infected boils.  Pigeons were cut up and rubbed over the victim's entire body.

Now, 672 years after the Black Death wiped out more than 40% of the world's population, we behold an astonishing recrudescence of the medieval mind:  we are led through a new plague by a presidential medievalist who "feels good" about nonscientific remedies based on the same intuitive search for complementarities and opposites that medieval physicians proffered to plague patients in the mid-14th century.  Heat kills things; heat obliterates atmospheric impurities; heat purifies.  Perhaps, then, it can rid the body of viral invaders.  Disinfectants such as bleach are microbe killers.  We wipe a countertop with Clorox and rid it of virus.  Can't we do the same thing by injecting bleach into the human body?  What bleach does to healthy tissue, to internal organs, to blood chemistry – these are science questions an inquiring 8th grader might put to her teacher.  But such questions would not occur to a medieval physician or to Donald Trump.  They simply fall outside the paradigm of Galenic medicine in which both operate.  In this world, with its reliance on whole-body states calling forth whole-body, re-balancing interventions, there is no possibility of weighing the pros and cons of specific treatments for specific ailments (read: different types of infection, local and systemic).  The concept of immunological specificity is literally unthinkable.

Injecting or ingesting bleach has an internal logic no greater than that of the 14th-century Flagellants, who roamed across continental Europe in a frenzy of penitential self-abuse that left them lacerated if not dead.  It made perfectly good 14th-century sense – though not, be it noted, to Clement VI, who condemned the practice as heretical.  Withal, the Flagellants believed that self-mortification and the excruciating pain it entailed could assuage a wrathful God and induce Him to stop blowing death down on humankind.  But science belied their self-purifying intentions. The roving Flagellants, leaving paths of infected blood and entrails behind them, became a vector for the transmission of plague.  For our medieval president, the path is one of toxic verbal effluvia no less dangerous than infected blood and entrails in spreading Covid-19.

We want to believe that no one living in 2020 can possibly lend credence to anything Trump has to say about infectious illness, virology, pandemics, scientific research, and post-medieval medicine.  When it comes to Covid-19, he is an epistemic vacuity whose medieval conjectures would never make it past the family dinner table or the local bar.  But he is the president, and he speaks with the authority of high office.  So his musings, grounded in Galenic-type notions and feelings, have an a priori valence.  As such, they will continue to lead many astray – away from prudent safeguards, away from mainstream medicine, indeed, away from an appreciation of the scientific expertise that informs these safeguards and treatments.

Hippocratic-Galenic medicine, with its notions of balance, synergy, complementarity, and opposites, retains its appeal to many.  But prescientific, feeling-based intuitions about disease are always dangerous, and positively deadly in a time of global pandemic. In the aftermath of Trump’s pronouncement about the logic of injecting  household disinfectants to combat Covid-19, poison control centers across the country were flooded with inquiries about the advisability of imbibing household bleach.  As to hydroxychloroquine, “More Deaths, No Benefit from Malaria Drug in VA Virus Study,” reported AP News on the first use of hydroxychloroquine in a small-scale nationwide study of VA patients hospitalized with Covid-19.

Is this surprising?  Whether or not hydroxychloroquine or any other drug or household disinfectant or chopped up animal remains is safe and effective against Covid-19 is an empirical question subject to laboratory research and clinical study.  But who exactly sets the agenda?  Who, that is, decides which existing pharmaceuticals or household products or smashed animal parts are worthy of scientific investigation?  Experts with knowledge of pharmacology, infectious disease, and virology or an intellectually null and void president for whom science matters only as a handmaiden to political objectives?  Pity those who follow him in his medieval leap of faith.

By fanning the flames of Hippocratic-Galenic notions about heat, light, the neutralizing effect of opposites, and the shared efficacy of substances with complementary or analogical properties, Trump himself has become a vector for the transmission of plague.  Bleach kills microbes on a countertop.  Shouldn't it therefore kill the Covid-19 virus in the human body?  Hydroxychloroquine kills the protozoan parasite Plasmodium that causes malaria.  Shouldn't it therefore kill Covid-19 viruses within the human body?  Wouldn't a really "solid" seasonal flu vaccine provide people with a measure of resistance to Covid-19?  No, no, and no.  Would that Mr. Trump would "feel good" about a more benign medieval variant, perhaps donning a garland of garlic cloves at press briefings.  Better still, following the example of the plague doctors, he could wear a mask in public, if only to satisfy those of us whose heads are not buried in medieval muck.  Given the clear and present danger of his treatment preferences to public health, however, we would be best served if he were simply muzzled until Election Day.

 

Copyright © 2020 by Paul E. Stepansky.  All rights reserved.  The author kindly requests that educators using his blog essays in their courses and seminars let him know via info[at]keynote-books.com.