Tag Archives: Robert Koch

Remembering Cholera in the Time of Covid

It came in crippling waves, the deadly intestinal infection that, in 1832 alone, killed 150,000 Americans.  Its telltale symptom was copious watery diarrhea (“rice water”) accompanied by heavy vomiting, loss of internal fluids and electrolytes, and dehydration; hence its name, cholera, from the Greek “flow of bile.”  Severe cases quickly proceeded to hypovolemic shock and could kill otherwise healthy persons, children and adults, within hours.  Historians refer to the cholera epidemics of the 19th century, but the epidemics, especially those of 1832, 1848, and 1866, were in fact pandemics, spreading from nation to nation and continent to continent.

     Orthodox or “regular” physicians of the first half of the 19th century had no clue about its cause, mode of transmission, or remedy, so they speculated wildly in all directions.  Contagionists believed it spread from person to person.  Noncontagionists attributed it to poisonous fumes, termed miasma, which emanated from soil and decaying matter of all kinds.  Some attributed it to atmospheric changes; others thought it a byproduct of fermentation.  Physician-moralists could be counted on to ascribe it to the moral failings of the underclass.  Withal, the regulars resorted, one and all, to “heroic” treatments, with bleeding and toxic chemicals, especially calomel (mercury) and laudanum (opium-laced alcohol), having pride of place.  These treatments only hastened the death of seriously ill patients, which, given the extent of their suffering, may have been an unwitting act of mercy.  Physician-induced deaths fueled the appeal of homeopathy and botanical medicine, far milder approaches that, so their practitioners avowed, caused far fewer deaths than the horrid regular remedies.

Caricature of 1832, depicting cholera sufferer with nightmarish “remedies” of the day.

      The suffering public, seeing the baleful effects of conventional remedies, grew disgusted with doctors. The Cholera Bulletin, a newsletter put out by a group of New York physicians over the summer of 1832, grimly acknowledged in its July 23 issue, the “fierce onslaught” of cholera and doctors in felling the afflicted: “For Cholera kills and Doctors slay, and every foe will have its way!”  After a new wave of cholera reached American shores in 1848, New York’s Sunday Dispatch lambasted traditional medical science as “antiquated heathen humbug, utterly unworthy of the middle of the nineteenth century.”  “Cholera was a most terrible affliction,” chimed in the New York Herald a year later, “but bad doctors and bad drugs are worse.  The pestilence might come now and then; physicians we had always with us.”[1]

     And yet, amid such loathing of doctors and their so-called remedies, science marched on in the one domain in which forward movement was possible.  Throughout the second half of the 19th century, cholera was the catalyst that brought Europe and eventually America into the proto-modern era of public health management of infectious disease.  Then as now, measures that safeguard the public from life-threatening infectious disease are good things.  In 1853, after another cholera epidemic reached Edinburgh, there was no political posturing about “rights” – presumably the right of British citizens to get sick and die.  Parliament, the “Big Government” of the day, resolved to go after the one major, recurrent infectious disease for which a vaccine was at hand:  smallpox.  The Vaccination Act of 1853 grew out of this resolve.  Among other things, it instituted compulsory smallpox vaccination, with all infants to be vaccinated within the first three months of life (infants in orphanages were given four months).  Physicians were obligated to send certificates of vaccination to local birth registrars, and parents who did not comply were subject to fines or imprisonment.  The requirement was extended under the Vaccination Act of 1867.[2]

     New York City followed suit a decade later, when the state legislature created the Metropolitan Board of Health.  The Board responded to the outbreak of cholera in 1866 by mandating the isolation of cholera patients and disinfection of their excretions.  When a new epidemic, which travelled from India to Egypt, erupted in 1884, French and German teams descended on Egypt in search of the specific microorganism responsible for cholera.  The prize went to Robert Koch, who isolated the cholera vibrio in January 1884.  

     In 1892, when a cholera epidemic originating in Hamburg brought 100 cases to New York, the city mobilized with the full force of the new science of bacteriology.  The Board of Health lost no time in establishing a Division of Pathology, Bacteriology, and Disinfection, which included a state-of-the-art bacteriological laboratory under the direction of Hermann Biggs.  The lab, as we have seen, came into its own in the fight against diphtheria, but it was the threat of cholera that brought it into existence.  A year later, in 1893, Congress passed the National Quarantine Act, which created a national system of quarantine regulations that included specific procedures for the inspection of immigrants and cargoes.  It was to be administered by the U.S. Marine Hospital Service, forerunner of the Public Health Service.

     In the late 1840s, the Bristol physician William Budd argued that contaminated sewage was the source of cholera, and in 1854 the surgeon John Snow traced the source of a cholera outbreak in his Soho, London neighborhood to contaminated well water.  But it was the Hamburg epidemic that proved beyond doubt that cholera was waterborne, and Koch himself demonstrated that water filtration was the key to its control.[3]  Now, we rarely hear of cholera, since water and waste management systems that came into existence in the last century eliminated it from the U.S. and Europe.[4]  Anti-vax libertarians would no doubt take exception to the Safe Drinking Water Act of 1974, which empowers the EPA to establish and enforce national water quality standards.  There it is again, the oppressive hand of Big Government, denying Americans the freedom to drink contaminated water and contract cholera.  Where has our freedom gone? 

Caricature of 1866, “Death’s Dispensary,” giving contaminated drinking water as a source of cholera.

     The gods will never stop laughing at the idiocy of humankind.  Here we are in 2021 and, thanks to the foundation laid down by 19th century scientists, gifted scientists of our own time have handed us, in astoundingly little time, an understanding of the coronavirus, its mode of transmission, and a pathway to prevention and containment.  We have in hand safe and effective vaccines that reduce the risk of infection to minuscule proportions and ensure that, among the immunized, infection from potent new strains of the virus will be mild and tolerable, and certainly not life-threatening.

     Yes, a small percentage of those who receive Covid vaccines will have reactions, and, among them, a tiny fraction will become ill enough to require treatment, even hospitalization.  But they will recover and enjoy immunity thereafter.  Such “risks” pale alongside those incurred by their forebears, who sought protection from smallpox in the time-tested manner of their own day.  In America, a century before the discovery of Edward Jenner’s cowpox-derived vaccine, colonists protected themselves from recurrent smallpox epidemics through inoculation with human smallpox “matter.”  The procedure, termed variolation, originated in Asia and spread through the Ottoman Empire, reaching Britain and America in 1721.  It involved inoculating the healthy with pus scraped from skin ulcers of those already infected, and was informed by the ancient observation that smallpox could be contracted only once in a lifetime.[5]  The variolated developed a mild case of smallpox which, so it was hoped, would confer protection against the ravages of future epidemics.

     And they were essentially right: over 98% of the variolated survived the procedure and achieved immunity.[6]  To be sure, the risk of serious infection was greater with variolation than with Edward Jenner’s cowpox-derived vaccine, but the latter, which initially relied on the small population of English cows that contracted cowpox and on person-to-person inoculation, was a long time in coming.  It took the United States most of the 19th century to develop and distribute an adequate supply of Jennerian vaccine.  Long before the vaccine was widely available, when the death rate from naturally acquired smallpox was roughly 30%,[7] Americans joined Europeans, Asians, and Africans in accepting the risks of variolation.  For George Washington, as we noted, those risks paled alongside the very real possibility that the Continental Army would collapse from smallpox:  he had every soldier variolated before beginning military operations at Valley Forge in 1777.[8]

    But here we are in 2021, with many Americans unwilling to accept the possibility of any Covid vaccine reaction at all, however transient and tolerable.  In so doing, they turn their backs on more than two hundred years of scientific progress, of which the successful public health measures in Europe and America spurred by the cholera epidemics form an important chapter.  The triumph of public health, which antedated the discovery of bacteria by decades, went hand in hand with the increased life expectancy and vastly improved quality of life wrought by vaccine science, indeed, by science in general.

     Witness Britain’s Great Exhibition of 1851, a scant three years after the cholera epidemic of 1848. Under the dome of the majestic Crystal Palace, science was celebrated in all its life-affirming possibilities.  In medicine alone, exhibits displayed mechanically enhanced prosthetic limbs, the first double stethoscope, microscopes, surgical instruments and appliances of every kind, and a plethora of pharmaceutical extracts and medicinal juices (including cod liver oil).[9] Topping it off was a complete model of the human body that comprised 1,700 parts.  Science promised better lives and a better future; scientific medicine, which by 1851 had begun to include public health measures, was integral to the promise.

     But here we are in 2021, replete with anti-vaccinationists who choose to endanger themselves, their children, and members of their communities.  They are anti-science primitives in our midst, and I behold them with the same incredulity with which visitors to Jurassic Park beheld living, breathing dinosaurs.  Here are people who repudiate both public health measures (mask wearing, curfews, limits on group gatherings) and vaccination science in a time of global pandemic.  For them, liberty is a primal thing that antedates the social contract, of which our Constitution is a sterling example.  It apparently includes the prerogative to get sick and make others sick to the point of death.  The anti-vaccinationists, prideful in their ignorance and luxuriant in their fantasies of government control, remind me of what the pioneering British anthropologist Edward Tylor termed “survivals,” by which he meant remnants of cultural conditions and mindsets irrelevant to the present.  Dinosaurs, after all, didn’t care about the common good either.


[1] Newsletter and press quotations from Charles Rosenberg, The Cholera Years: The United States in 1832, 1849, and 1866 (Chicago: University of Chicago Press, 1962), 68, 161, 155.

[2] Dorothy Porter & Roy Porter, “The Politics of Prevention:  Anti-Vaccinationism and Public Health in Nineteenth-Century England,” Medical History, 32:231-252, 1988.

[3] Thomas D. Brock, Robert Koch: A Life in Medicine and Bacteriology (Washington, DC: ASM Press, 1998 [1988]), 229-230, 255.

[4] The last reported case of cholera in the U.S. was in 1949.  Cholera, sadly, remains alive and well in a number of African countries. 

[5] In China and Asia Minor, where variolation originated, the mode of inoculation was insufflation of dried smallpox scabs into the nose.

[6] José Esparza, “Three Different Paths to Introduce the Smallpox Vaccine in Early 19th Century United States,” Vaccine, 38:2741-2745, 2020.

[7] Ibid.

[8] Andrew W. Artenstein, et al., “History of U.S. Military Contributions to the Study of Vaccines against Infectious Diseases,” Military Medicine, 170[suppl]:3-11, 2005.

[9] C. D. T. James, “Medicine and the 1851 Exhibition,” J. Royal Soc. Med., 65:31-34, 1972.

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.



The War on Children’s Plague

In the early 19th century, doctors called it angina maligna (gangrenous pharyngitis) or “malignant sore throat.”  Then, in 1826, the French physician Pierre-Fidèle Bretonneau grouped the two under a single name, diphtherite.  It was a horrible childhood disease in which severe inflammation of the upper respiratory tract gave rise to a false membrane, a “pseudomembrane,” that covered the pharynx, larynx, or both.  The massive tissue growth prevented swallowing and blocked airways, often leading to rapid death by asphyxiation.  It felled adults and children alike, but younger children were especially vulnerable.  Looking back on the epidemic that devastated New England in 1735-1736, the lexicographer Noah Webster termed it “literally the plague among children.”  It was the epidemic, he added, in which families often lost all, or all but one, of their children.

A century later, diphtheria epidemics continued to target the young, especially those in cities.  Diphtheria, not smallpox or cholera, was “the dreaded killer that stalked young children.”[1]   It was especially prevalent during the summer months, when children on hot urban streets readily contracted it from one another when they sneezed or coughed or spat.  The irony is that a relatively effective treatment for the disease was already in hand.

In 1884, Robert Koch’s assistant, Friedrich Loeffler, published a paper identifying the bacillus – the rod-shaped bacterium Corynebacterium diphtheriae first observed by Edwin Klebs – as the cause of diphtheria.  German scientists immediately went to work, injecting rats, guinea pigs, and rabbits with live bacilli, and then injecting their blood serum – blood from which cells and clotting factors have been removed – into infected animals to see if the serum could produce a cure.  Then they took blood from the “immunized” animal, reduced it to the cell-free blood liquid, and injected it into healthy animals.  The latter, to their amazement, did not become ill when injected with diphtheria bacilli.  This finding was formalized in the classic paper of Emil von Behring and Shibasaburo Kitasato of 1890, “The Establishment of Diphtheria Immunity and Tetanus Immunity in Animals.”  For this, von Behring was awarded the very first Nobel Prize in Medicine in 1901.

Thus the birth of blood serum therapy, precursor of modern vaccines and antibiotics alike.  By the early 1890s, Émile Roux and his associates at the Pasteur Institute discovered that infected horses, not the rabbits used by Behring and Kitasato, produced the most potent diphtheria serum of all.  Healthy horses injected with a heat-killed broth culture of diphtheria, it was found, could survive repeated inoculations with the live bacilli.  The serum, typically referred to as antitoxin, neutralized the highly poisonous substances – the exotoxins – secreted by diphtheria bacteria.

And there was more:  horse serum provided a high degree of protection for another mammal, viz., human beings.  Among people who received an injection of antitoxin, only one in eight developed symptoms on exposure to diphtheritic individuals.  In 1895 two American drug companies, H. K. Mulford of Philadelphia and Parke, Davis of Detroit, began manufacturing diphtheria antitoxin.  To be sure, their drug provided only short-term immunity, but it sufficed to cut the U.S. death rate among hospitalized diphtheria patients in half.  This fact, astonishing for its time, fueled the explosion of disease-specific antitoxins, some quite effective, some less so.  By 1904 Mulford alone had antitoxin preparations for anthrax, dysentery, meningitis, pneumonia, tetanus, streptococcus infections, and of course diphtheria.

Colorful Mulford antitoxin ad from early 20th century, featuring, of course, the children

In the era of Covid-19, there are echoes all around of the time when diphtheria permeated the nation’s everyday consciousness. Brilliant scientists, then and now, deploying all the available resources of laboratory science, developed safe and effective cures for a dreaded disease.  But more than a century ago, the public’s reception of a new kind of preventive treatment – an injectable horse-derived antitoxin – was unsullied by the resistance of massed anti-vaccinationists whose anti-scientific claims are amplified by that great product of 1980s science, the internet. 

To be sure, in the 1890s and early 20th century, fringe Christian sects anticipated our own selectively anti-science Evangelicals.  It was sacrilegious, they claimed, to inject the blood product of beasts into human arms, a misgiving that did nothing to curb their hunger for enormous quantities of beef, pork, and lamb.  Obviously, their God had given them a pass to ingest bloody animal flesh.  Saving children’s lives with animal blood serum was apparently a different matter.

During the summer months, parents lived in anxious expectation of diphtheria every day their children ventured on to city streets.  Their fear was warranted and not subject to the denials of self-serving politicians.  In 1892, New York City’s Health Department established the first publicly funded bacteriological laboratory in the country, and between 1892 and the summer of 1894, the lab proved its worth by developing a bacteriological test for diagnosing diphtheria.  Infected children could now be sent to hospitals and barred from public schools.  Medical inspectors, armed with the new lab tests, went into the field to enforce a plethora of health department regulations. 

Matters were simplified still further in 1913, when the Viennese pediatrician Bela Schick published the results of experiments demonstrating how to test children for the presence or absence of diphtheria antitoxin without sending their blood to a city lab. Armed with the “Schick test,” public health physicians and nurses could quickly and painlessly determine whether or not a child was immune to diphtheria.  For the roughly 30% of New York City school children who had positive reactions, injections of antitoxin could be given on the spot.  A manageable program of diphtheria immunization in New York and other cities was now in place.    

What about public resistance to the new proto-vaccine?  There was very little outside of religious fringe elements.  In the tenement districts, residents welcomed public health inspectors into their flats.  Intrusion into their lives, it was understood, would keep their children healthy and alive, since it led to aggressive intervention under the aegis of the Health Department.[2]   And it was not only the city’s underserved, immigrants among them, who got behind the new initiative.  No sooner had Hermann Biggs, head of the city’s bacteriological laboratory, set in motion the lab’s inoculation of horses and preparation of antitoxin, than the New York Herald stepped forward with a fund-raising campaign that revolved around a series of articles dramatizing diphtheria and its “solution” in the form of antitoxin injections.  The campaign raised sufficient funds to provide antitoxin for the Willard Parker Hospital, reserved for patients with communicable diseases, and for the city’s private physicians as well.  In short order, the city decided to provide antitoxin to the poor free of charge, and by 1906 the Health Department had 318 diphtheria antitoxin stations administering free shots in all five boroughs.[3][4]

A new campaign by New York City’s Diphtheria Prevention Commission was launched in 1929 and lasted two years.   As was the case three decades earlier, big government, represented by state and municipal public health authorities, was not the problem but the solution.  To make the point, the Commission’s publicity campaign adopted military metaphors.  The enemy was not government telling people what to do; it was the disease itself along with uncooperative physicians and recalcitrant parents.  “The very presence of diphtheria,” writes Evelynn Hammonds, “became a synonym for neglect.”[5]     

The problem with today’s Covid anti-vaccinationists is that their opposition to vaccination is erected on a foundation of life-preserving vaccination science of which they, their parents, their grandparents, and their children are beneficiaries.  They can shrug off the need for Covid-19 vaccination because they have been successfully immunized against the ravages of debilitating childhood diseases.  Unlike adults of the late-nineteenth and early-20th centuries, they have not experienced, up close and personal, the devastation wrought summer after summer, year after year, by the diphtheria bacillus.  Nor have they lost children to untreated smallpox, scarlet fever, cholera, tetanus, or typhus.  Nor, finally, have they, in their own lives, beheld the miraculous transition to a safer world in which children stopped contracting diphtheria en masse, and when those who did contract the disease were usually cured through antitoxin injections.

In the 1890s, the citizens of New York City had it all over the Covid vaccine resisters of today.  They realized that the enemy was not public health authorities infringing on their right to keep themselves and their children away from antitoxin-filled syringes. No, the enemy was the microorganism that caused them and especially their children to get sick and sometimes die. 

Hail the supreme common sense that led them forward, and pity those among us for whom the scientific sense of the past 150 years has given way to the frontier “medical freedom” of Jacksonian America.  Anti-vaccinationist rhetoric, invigorated by the disembodied camaraderie of internet chat groups, does not provide a wall of protection against Covid-19.  Delusory thinking is no less delusory because one insists, in concert with others, that infection can be avoided without the assistance of vaccination science.  The anti-vaccinationists need to be vaccinated along with the rest of us.  A healthy dose of history wouldn’t hurt them either.


[1] Judith Sealander, The Failed Century of the Child: Governing America’s Young in the Twentieth Century (Cambridge: Cambridge Univ. Press, 2003), p. 326.

[2] Evelynn Maxine Hammonds, Childhood’s Deadly Scourge: The Campaign To Control Diphtheria in New York City, 1880-1930 (Baltimore: Johns Hopkins University Press, 1999), 84-86.

[3] William H. Park, “The History of Diphtheria in New York City,” Am. J. Dis. Child., 42:1439-1445, 1931.

[4] Marian Moser Jones, Protecting Public Health in New York City: Two Hundred Years of Leadership, 1805-2005 (NY: New York City Department of Health and Mental Hygiene, 2005), 20.                                     

[5] Hammonds, op. cit., 206.

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.