
Everything You Didn’t Want to Know About Typhoid

Extreme fatigue; dangerously high fever; severe abdominal pain; headaches; diarrhea or constipation; nausea and vomiting – the symptoms of severe typhoid fever can be a panoply of horrors.  Like cholera, the bacteria in question – Salmonella typhi, cousin to the Salmonella that causes food poisoning – find a home in water and food contaminated with human feces.  The infection is contracted only by humans, and it is highly contagious.  More persons contract it from human contact – often from unwashed hands following defecation – than from drinking contaminated water or ingesting contaminated food.  But the latter are hardly incidental causes.  At least two billion people worldwide, the World Health Organization tells us, drink feces-contaminated water.[1]

And the story gets worse. Through the 19th century, “chronic carriers” could not be conceptualized, much less detected.  They were symptom-free folks in whom typhi found safe harbor in the gall bladder, traveling with stored bile through the bile duct into the small intestine en route to fecal expulsion.  The chronic carriers brought infection to entire communities in sudden, explosive outbreaks; typhoid is a prime example of what epidemiologists term a “fulminant” disease (from the Latin fulmināre, to strike with lightning).  And worse still, the ranks of chronic carriers were enlarged by some of those who contracted the disease and recovered.  Typhi lived on in their gall bladders as well, and were passed on to others via the same fecal-oral route.

The Mother of All Chronic Carriers, the Super Spreader who comes down to us as Typhoid Mary, was one Mary Mallon, an Irish cook who passed typhi on to no fewer than 53 members of seven prominent Manhattan-area households between 1900 and 1906.  In 1907 she was quarantined in a bungalow on New York’s North Brother Island near Riverside Hospital, only to resume her career as cook and super spreader on her release in 1910.  Tracked down five years later, she was whisked back to her island bungalow, where she lived out her remaining 23 years.

Here is what Salmonella typhi do once ingested through the mouth.  Absent sufficient gastric acid to neutralize them in the stomach, the bacteria make their way to the terminal portion of the small intestine and enter the cells lining it.  Intestinal cells respond to the invaders with a massive inflammatory response that can lead to an intestinal rupture, a hole, through which intestinal contents drain into the abdomen, with attendant and severe pain.  And from there matters go from bad to worse.  Without fast, effective treatment, the bacteria penetrate lymphatic tissue and enter the bloodstream, which shuttles them to other organs: the liver, the spleen, bone marrow.  In the worst cases, bacterial ulceration can extend all the way to the terminal lining of the ileum, from which typhi flood the body, carrying infection to the brain, heart, and pancreas.  Death is now around the corner; only major abdominal surgery holds any prospect of survival.  It is a pernicious disease of microbial migratory urgency.

Improvements in water treatment and personal hygiene, along with antibiotic therapy and – yes! – a newly effective vaccine for adults, brought typhoid to its knees in the United States after World War II.  But the disease is alive and well in Central and South America, Africa, and parts of Asia, where it claims between 11 and 21 million victims and some 200,000 deaths each year.[2]  Typhi has evolved along with the antibiotics that control it, and multidrug-resistant (MDR) strains remain deadly.  And even here, on these ostensibly sanitized shores, typhi can still make its presence known.  As recently as 2010, nine Americans contracted typhoid, five in California and four in Nevada.[3]

But such instances are aberrational, and in the northern hemisphere typhoid fever has long since vanished from anyone’s disease-monitoring radar.  Now federal and state governments, the bane of anti-vaccine irrationalists and anti-mask naysayers, make sure we don’t drink water or eat food contaminated by microbe-laced feces.  But it was not so for our forebears.  In the Civil War, typhoid fever devastated North and South alike; the Union Army’s Satterlee General Hospital in West Philadelphia was constructed in 1862 largely to cope with its victims.  In the Spanish-American War of 1898, typhoid fever shared center stage with yellow fever and, at war’s end, rated its own federal investigative commission.  Chaired by Walter Reed, the Typhoid Commission determined that contact among soldiers (“comrade contact”) was primarily responsible for the transmission of typhoid fever in military camps.[4]  Four years later, Koch’s investigations during a typhoid epidemic in Trier, Germany, led him to generalize the Commission’s finding: typhoid fever was contracted less from contaminated water or sewage than from asymptomatic carriers; the “carrier hypothesis” was among his final significant contributions.[5]

The era of modern typhoid prevention began in 1897, when Almroth Wright, then a pathologist at the British Army’s Medical School at Netley Hospital, published a paper on the typhoid vaccine he had developed with killed typhi.  The Army took note and, in the South African War that began two years later, made very limited use of it: of 330,000 British troops, only 14,000 received the vaccine.  It was effective in this limited trial but never caught on after the war.[6]  Beginning in 1902, the U.S. government’s Public Health and Marine Hospital Service, renamed the Public Health Service in 1912, focused its research on typhoid.  Progress was made, and by the time America entered WWI, the PHS’s Hygienic Laboratory had developed an antityphoid vaccine.[7]  American troops sailing to France in 1917 were not asked how they felt about receiving a typhoid vaccine; they received their mandatory shots and boarded their ships.  Those who were not vaccinated stateside received their shots on arriving at their camps.  Vaccination was not negotiable.  The obligation to live and fight for the nation trumped the freedom to contract typhoid, suffer, and possibly die.

“A Monster Soup Commonly Called Thames Water,” a mid-19th-century etching depicting the stew of disease-promoting organisms in the river that supplied drinking water to Londoners.

The vaccine dramatically reduced the incidence of typhoid, but the disease still wrought damage in field and base hospitals, especially among unvaccinated European troops who had been fighting since 1914.  American nurses who arrived in northern France and Belgium in advance of the troops remembered their misery at being transferred to typhoid wards, which, as one recalled, were “gloomy and dark.”  Another recalled a typhoid scourge that crippled her hospital and created an urgent need to find space outside the hospital for the typhoid patients.[8]

_______________________________

The current governors of Texas and Florida would surely look askance at the history of typhoid control, since a key aspect of it – allowing children on school premises to drink only water subjected to antimicrobial treatment – ignores parental freedom of choice.  Parents decide what their children eat, and they should be free to determine what kind of water they drink.  Children are not military enlistees obligated to remain healthy in the service of the nation.  What right do school boards have to abrogate the freedom of parents to determine what kind of water their children drink?  Why should children be mandated to drink water subjected to modern sanitary treatment that robs it of Salmonella typhi along with Vibrio cholerae, Poliovirus, and dysentery-causing Shigella?  Shouldn’t parents be free to have their children partake of nature’s bounty, to drink fresh water from streams and rivers, not to mention untreated well water contaminated with human feces and the pathogens it harbors?

And here is the Covid connection.  If local school boards and municipal authorities lack the authority to safeguard children, to the extent possible, through obligatory wearing of facemasks, then surely they lack the authority to force them to drink water filtered through layers of state and federal regulation informed by modern science.  Let parents be free to parent; let their children pay the steep, even life-threatening price.      

Did I mention that young children, along with immune-compromised young adults, are at greatest risk for contracting typhoid?  Well, now you know, and now, perhaps, we can return to reality.  State governors who do not understand the legal and moral imperative of acting in the best interests of the child[9] are unfit for public office of any sort.  In point of fact, they are unfit. Who wants governors who, in denying adults the right to act responsibly in the best interests of children, sanction child abuse?  Let them crawl back into the existential dung heap whence they emerged.    


[1] https://www.who.int/news-room/fact-sheets/detail/drinking-water.

[2] https://www.cdc.gov/typhoid-fever/health-professional.html.

[3] https://www.cdc.gov/salmonella/2010/frozen-fruit-pulp-8-25-10.html.

[4] Victor C. Vaughan, A Doctor’s Memories (Indianapolis: Bobbs-Merrill, 1926), 369ff., 386.

[5] Thomas D. Brock, Robert Koch: A Life in Medicine and Bacteriology (Washington, DC: ASM Press, 1998 [1988]), 255-256.

[6] Gwyn Macfarlane, Alexander Fleming: The Man and the Myth (Cambridge: Harvard University Press, 1984), 54-55.

[7] Victoria A. Harden, Inventing the NIH: Federal Biomedical Research Policy, 1887-1937 (Baltimore:  Johns Hopkins University Press, 1986), 41.

[8] Grace McDougall, A Nurse at the War: Nursing Adventures in Belgium and France (New York: McBride, 1917), 111, 117; Alice Isaacson, Diary of 1917, Library and Archives Canada, letter of 16 February 1917.

[9] Joseph Goldstein, Anna Freud, et al., In the Best Interests of the Child (New York:  Free Press, 1986).

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.


Remembering Cholera in the Time of Covid

It came in crippling waves, the deadly intestinal infection that, in 1832 alone, killed 150,000 Americans.  Its telltale symptom was copious watery diarrhea (“rice water”) accompanied by heavy vomiting, loss of internal fluids and electrolytes, and dehydration; hence its name, cholera, from the Greek for “flow of bile.”  Severe cases quickly proceeded to hypovolemic shock and could kill otherwise healthy persons, children and adults, within hours.  Historians refer to the cholera epidemics of the 19th century, but the epidemics, especially those of 1832, 1848, and 1866, were in fact pandemics, spreading from nation to nation and continent to continent.

     Orthodox or “regular” physicians of the first half of the 19th century had no clue about its cause, mode of transmission, or remedy, so they speculated wildly in all directions.  Contagionists believed it spread from person to person.  Noncontagionists attributed it to poisonous fumes, termed miasma, which emanated from soil and decaying matter of all kinds.  Some attributed it to atmospheric changes; others thought it a byproduct of fermentation.  Physician-moralists could be counted on to ascribe it to the moral failings of the underclass.  Withal, the regulars resorted, one and all, to “heroic” treatments, with bleeding and toxic chemicals, especially calomel (mercury) and laudanum (opium-laced alcohol), having pride of place.  These treatments only hastened the death of seriously ill patients, which, given the extent of their suffering, may have been an unwitting act of mercy.  Physician-induced deaths fed the appeal of homeopathy and botanical medicine, far milder approaches that, so their practitioners avowed, caused far fewer deaths than the horrid regular remedies.

Caricature of 1832, depicting a cholera sufferer with nightmarish “remedies” of the day.

      The suffering public, seeing the baleful effects of conventional remedies, grew disgusted with doctors.  The Cholera Bulletin, a newsletter put out by a group of New York physicians over the summer of 1832, grimly acknowledged in its July 23 issue the “fierce onslaught” of cholera and doctors alike in felling the afflicted: “For Cholera kills and Doctors slay, and every foe will have its way!”  After a new wave of cholera reached American shores in 1848, New York’s Sunday Dispatch lambasted traditional medical science as “antiquated heathen humbug, utterly unworthy of the middle of the nineteenth century.”  “Cholera was a most terrible affliction,” chimed in the New York Herald a year later, “but bad doctors and bad drugs are worse.  The pestilence might come now and then; physicians we had always with us.”[1]

     And yet, amid such loathing of doctors and their so-called remedies, science marched on in the one domain in which forward movement was possible.  Throughout the second half of the 19th century, cholera was the catalyst that brought Europe and eventually America into the proto-modern era of public health management of infectious disease.  Then as now, measures that safeguard the public from life-threatening infectious disease are good things.  In 1853, after another cholera epidemic reached Edinburgh, there was no political posturing about “rights” – presumably the right of British citizens to get sick and die.  Parliament, the “Big Government” of the day, resolved to go after the one major, recurrent infectious disease for which a vaccine was at hand: smallpox.  The Vaccination Act of 1853 grew out of this resolve.  Among other things, it instituted compulsory smallpox vaccination, with all infants to be vaccinated within the first three months of life (infants in orphanages were given four months).  Physicians were obligated to send certificates of vaccination to local birth registrars, and parents who did not comply were subject to fines or imprisonment.  The requirement was extended under the Vaccination Act of 1867.[2]

     New York City followed suit a decade later, when the state legislature created the Metropolitan Board of Health.  The Board responded to the outbreak of cholera in 1866 by mandating the isolation of cholera patients and disinfection of their excretions.  When a new epidemic, which travelled from India to Egypt, erupted in 1883, French and German teams descended on Egypt in search of the specific microorganism responsible for cholera.  The prize went to Robert Koch, who isolated the cholera vibrio in January 1884.

     In 1892, when a cholera epidemic originating in Hamburg brought 100 cases to New York, the city mobilized with the full force of the new science of bacteriology.  The Board of Health lost no time in establishing a Division of Pathology, Bacteriology, and Disinfection, which included a state-of-the-art bacteriological laboratory under the direction of Hermann Biggs.  The lab, as we have seen, came into its own in the fight against diphtheria, but it was the threat of cholera that brought it into existence.  A year later, in 1893, Congress passed the National Quarantine Act, which created a national system of quarantine regulations that included specific procedures for the inspection of immigrants and cargos.  It was to be administered by the U.S. Marine Hospital Service, forerunner of the Public Health Service.

     In the late 1840s, the Bristol physician William Budd argued that contaminated sewage was the source of cholera, and in 1854 the London physician John Snow traced the source of a cholera outbreak in his Soho neighborhood to contaminated well water.  But it was the Hamburg epidemic that proved beyond doubt that cholera was waterborne, and Koch himself demonstrated that water filtration was the key to its control.[3]  Now we rarely hear of cholera, since the water and waste management systems that came into existence in the last century eliminated it from the U.S. and Europe.[4]  Anti-vax libertarians would no doubt take exception to the Safe Drinking Water Act of 1974, which empowers the EPA to establish and enforce national water quality standards.  There it is again, the oppressive hand of Big Government, denying Americans the freedom to drink contaminated water and contract cholera.  Where has our freedom gone?

Caricature of 1866, “Death’s Dispensary,” depicting contaminated drinking water as a source of cholera.

     The gods will never stop laughing at the idiocy of humankind.  Here we are in 2021 and, thanks to the foundation laid down by 19th-century scientists, gifted scientists of our own time have handed us, in astoundingly little time, an understanding of the coronavirus, its mode of transmission, and a pathway to prevention and containment.  We have in hand safe and effective vaccines that reduce the risk of infection to minuscule proportions and ensure that, among the immunized, infection from potent new strains of the virus will be mild and tolerable, and certainly not life-threatening.

     Yes, a small percentage of those who receive Covid vaccines will have reactions, and, among them, a tiny fraction will become ill enough to require treatment, even hospitalization.  But they will recover and enjoy immunity thereafter.  Such “risks” pale alongside those incurred by their forebears, who sought protection from smallpox in the time-tested manner of their ancestors.  In America, three-quarters of a century before the discovery of Edward Jenner’s cowpox-derived vaccine, colonists protected themselves from recurrent smallpox epidemics through inoculation with human smallpox “matter.”  The procedure, termed variolation, originated in parts of Europe and the Ottoman Empire in the early 16th century, reaching Britain and America two centuries later, in 1721.  It involved inoculating the healthy with pus scraped from the skin ulcers of those already infected, and was informed by the ancient observation that smallpox could be contracted only once in a lifetime.[5]  The variolated developed a mild case of smallpox which, so it was hoped, would confer protection against the ravages of future epidemics.

     And they were essentially right: over 98% of the variolated survived the procedure and achieved immunity.[6]  To be sure, the risk of serious infection was greater with variolation than with Edward Jenner’s cowpox-derived vaccine, but the latter, which initially relied on the small population of English cows that contracted cowpox and on person-to-person inoculation, was a long time in coming.  It took the United States most of the 19th century to secure and distribute an adequate supply of Jennerian vaccine.  Long before the vaccine was widely available, when the death rate from naturally acquired smallpox was roughly 30%,[7] Americans joined Europeans, Asians, and Africans in accepting the risks of variolation.  For George Washington, as we noted, the risks paled alongside the very real risk that the Continental Army would collapse from smallpox: he had every soldier variolated before beginning military operations at Valley Forge in 1777.[8]

    But here we are in 2021, with many Americans unwilling to accept the possibility of any Covid vaccine reaction at all, however transient and tolerable.  In so doing, they turn their backs on more than two hundred years of scientific progress, of which the successful public health measures in Europe and America spurred by the cholera epidemics form an important chapter.  The triumph of public health, which antedated by decades the identification of disease-causing bacteria, went hand in hand with the increased life expectancy and vastly improved quality of life wrought by vaccine science, indeed, by science in general.

     Witness Britain’s Great Exhibition of 1851, a scant three years after the cholera epidemic of 1848. Under the dome of the majestic Crystal Palace, science was celebrated in all its life-affirming possibilities.  In medicine alone, exhibits displayed mechanically enhanced prosthetic limbs, the first double stethoscope, microscopes, surgical instruments and appliances of every kind, and a plethora of pharmaceutical extracts and medicinal juices (including cod liver oil).[9] Topping it off was a complete model of the human body that comprised 1,700 parts.  Science promised better lives and a better future; scientific medicine, which by 1851 had begun to include public health measures, was integral to the promise.

     But here we are in 2021, replete with anti-vaccinationists who choose to endanger themselves, their children, and members of their communities.  They are anti-science primitives in our midst, and I behold them with the same incredulity with which visitors to Jurassic Park beheld living, breathing dinosaurs.  Here are people who repudiate both public health measures (mask wearing, curfews, limits on group gatherings) and vaccination science in a time of global pandemic.  For them, liberty is a primal thing that antedates the social contract, of which our Constitution is a sterling example.  It apparently includes the prerogative to get sick and make others sick to the point of death.  The anti-vaccinationists, prideful in their ignorance and luxuriant in their fantasies of government control, remind me of what the pioneering British anthropologist Edward Tylor termed “survivals,” by which he meant remnants of cultural conditions and mindsets irrelevant to the present.  Dinosaurs, after all, didn’t care about the common good either.


[1] Newsletter and press quotations from Charles Rosenberg, The Cholera Years: The United States in 1832, 1849, and 1866 (Chicago: University of Chicago Press, 1962), 68, 161, 155.

[2] Dorothy Porter & Roy Porter, “The Politics of Prevention:  Anti-Vaccinationism and Public Health in Nineteenth-Century England,” Medical History, 32:231-252, 1988.

[3] Thomas D. Brock, Robert Koch: A Life in Medicine and Bacteriology (Washington, DC: ASM Press, 1998 [1988]), 229-230, 255.

[4] The last reported case of cholera in the U.S. was in 1949.  Cholera, sadly, remains alive and well in a number of African countries. 

[5] In China and Asia Minor, where variolation originated, dried smallpox scabs blown into the nose was the mode of inoculation. 

[6] José Esparza, “Three Different Paths to Introduce the Smallpox Vaccine in Early 19th Century United States,” Vaccine, 38:2741-2745, 2020.

[7] Ibid.

[8] Andrew W. Artenstein, et al., “History of U.S. Military Contributions to the Study of Vaccines against Infectious Diseases,” Military Medicine, 170(suppl):3-11, 2005.

[9] C. D. T. James, “Medicine and the 1851 Exhibition,” J. Royal Soc. Med., 65:31-34, 1972.

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.

