
Why Pellagra Matters

It was the dread disease of the four “D”s:  dermatitis, diarrhea, dementia, and death.  The symptoms were often severe: deep red rashes, with attendant blistering and skin sloughing on the face, lips, neck, and extremities; copious liquid bowels; and deepening dementia with disorganized speech and a host of neurological symptoms.  Death followed up to 40% of the time.  The disease was reported in 1771 by the Italian Francesco Frapolli, who observed it among the poor of Lombardy.  They called it pelagra – literally “rough skin” in the dialect of northern Italy.  Frapolli popularized the term (which later acquired a second “l”), unaware that the Spanish physician Don Gaspar Casal had described the same condition in 1735.

Case reports later identified as pellagra occasionally appeared in American medical journals after the Civil War, but epidemic pellagra erupted only at the dawn of the 20th century.  Between 1902 and 1916, it ravaged mill towns in the American South.  Reported cases declined during World War I, but resumed their upward climb in 1919, reaching crisis proportions in 1921-1922 and 1927.  Nor was pellagra confined to the South.  Field workers and day laborers throughout the country, excepting the Pacific Northwest, fell victim.  Like yellow fever before it, pellagra was initially perceived as a regional problem but soon rose to the status of a national public health crisis.  Still, it was especially widespread and horrific in the South.

In the decades following the Civil War, the South struggled to rebuild a shattered economy through a textile industry that revolved around cotton.  Pellagra found its mark in the thousands of underpaid mill workers who spun the cotton into yarn and fabric, all the while subsisting on a cheap, corn-heavy diet:  cornbread, grits, syrup, brown gravy, fatback, and coffee were the staples.  The workers’ meager pay came as credit checks good only at company stores, and what the company stores stocked – and what the workers could afford to buy – was corn meal.  They lacked the time, energy, and means to supplement starvation diets with fresh vegetables grown on their own tiny plots.  Pellagra sufferers (“pellagrins”) subsisted on corn; pellagra, it had long been thought, was all about corn.  Unsurprisingly, then, it did not stop at the borders of southern mill towns.  It also victimized the corn-fed residents of state-run institutions:  orphanages, prisons, asylums.

In 1908, when cases of pellagra at southern state hospitals were increasing at an alarming rate, James Woods Babcock, the Harvard-educated superintendent of the South Carolina State Hospital and a pellagra investigator himself, organized the first state-wide pellagra conference.[1]  It was held at his own institution, and generated animated dialogue and camaraderie among the 90 attendees.  It was followed a year later by a second conference, now billed as a national pellagra conference, also at Babcock’s hospital.  These conferences underscored both the seriousness of pellagra and the divided opinions about its causes, prevention, and treatment.

At the early conferences, roughly half the attendees, dubbed Zeists (from Zea mays, or maize), were proponents of the centuries-old corn theory of pellagra.  What is it about eating corn that causes the disease?  “We don’t yet know,” they answered, “but people who contract pellagra subsist on corn products.  Ipso facto, corn must lack some nutrient essential to health.”  The same claim had been made by Giovanni Battista Marzari in 1810.  The Zeists were countered by anti-Zeists under the sway of germ theory.  “A deficiency disease based on some mysterious element of animal protein missing in corn?  Hardly.  There has to be a pathogen at work, though it remains to be discovered.”  Perhaps the microorganism was carried by insects, as with yellow fever and malaria.  The Italian-born British physician Louis Sambon went a step further.  He claimed to have identified the insect in question:  it was a black fly or sand fly of the genus Simulium.

Germ theory gained traction from a different direction.  “You say dietary reliance on corn ‘causes’ pellagra?  Well, maybe so, but it can’t be a matter of healthy corn.  The corn linked to pellagra must be bad corn, i.e., corn contaminated by some microorganism.”  Such was the position argued at length by no less a figure than Cesare Lombroso, the pioneer of criminal anthropology.  Like Sambon, moreover, he claimed to have the answer:  it was, he announced, a fungus, Sporisorium maidis, that made corn moldy and caused pellagra.  But many attendees were unpersuaded by the “moldy corn” hypothesis.  For them pellagra wasn’t a matter of any type of corn, healthy, moldy, or otherwise.  It was an infectious disease pure and simple, and some type of microorganism had to be the culprit.  How exactly the microorganism entered the body was a matter for continued theorizing and case reports at conferences to come.

And there matters rested until 1914, when Joseph Goldberger, a public health warrior of Herculean proportions, entered the fray.  A Jewish immigrant from Hungary, educated at the Free Academy of New York (later CUNY) and Bellevue Hospital Medical College (later NYU Medical School), Goldberger was a leading light of the Public Health Service’s Hygienic Laboratory.  A veteran epidemic fighter, he had earned his stripes battling yellow fever in Mexico, Puerto Rico, and the South; typhoid in Washington, DC; typhus in Mexico City; and dengue fever in Texas.[2]   With pellagra now affecting most of the nation, Goldberger was tapped by Surgeon General Rupert Blue to head south and determine once and for all the cause, treatment, and prevention of pellagra.  

Joseph Goldberger, M.D.

Goldberger was up to the challenge.  He took the South by storm and left a storm of anger and resentment in his wake.  He began in Mississippi, where reported cases of pellagra would increase from 6,991 in 1913 to 10,954 in  1914.  In a series of “feeding experiments” in two orphanages in the spring of 1914, he introduced lean meat, milk, and eggs into the children’s diets; their pellagra vanished.  And Goldberger and his staff were quick to make a complementary observation:  In all the institutions they investigated, not a single staff member ever contracted pellagra.  Why?  Well, the staffs of orphanages, prisons, and asylums were quick to take for themselves whatever protein-rich foods came to their institutions.  They were not about to make do with the cornbread, corn mush, and fatback given to the hapless residents.  And of course their salaries, however modest, enabled them to procure animal protein on the side. 

Joseph Goldberger, with his assistant C. H. Waring, in the Baptist Orphanage near Jackson, Mississippi in 1914, in the painting by Robert Thom.

       

All right, animal protein cleared up pellagra, but what about residents of state facilities whose diets provided enough protein to protect them from pellagra?  Were there any?  And, if so, could pellagra be induced in them by restricting them to corn-based diets?  Goldberger observed that the only wards of the state who did not contract pellagra were the residents of prison farms.  It made sense:  They alone received some type of meat at mealtime, along with farm-grown vegetables and buttermilk.  In collaboration with Mississippi governor Earl Brewer, Goldberger persuaded 11 residents of Rankin State Prison Farm to restrict themselves to a corn-based diet for six months.  At the study’s conclusion, the prisoners would have their sentences commuted, a promise put in writing.  The experiment corroborated Goldberger’s previous findings:  Six of the 11 prisoners contracted pellagra, and, ill and debilitated, they became free men when the experiment ended in October 1915.

Now southern cotton growers and textile manufacturers rose up in arms.  Who was this Jewish doctor from the North – a representative of “big government,” no less – to suggest they were crippling and killing mill workers by consigning them to corn-based diets?  No, they and their political and medical allies insisted, pellagra had to be an infectious disease spread from worker to worker or transmitted by an insect.  To believe otherwise, to suggest the southern workforce was endemically ill and dying because it was denied essential nutrients – this would jeopardize the textile industry and its ability to attract investment dollars outside the region.  Goldberger, supremely unfazed by their commitment to science-free profit-making, then undertook the most lurid experiment of all.  Joined by his wife Mary and a group of colleagues, he hosted a series of “filth parties” in which the group transfused pellagrin blood into their veins and ingested tablets consisting of the scabs, urine, and feces of pellagra sufferers.  Sixteen volunteers at four different sites participated in the experiment, none of whom contracted the disease.  Here was definitive proof:  pellagra was not an infectious disease communicable person-to-person.[3]

The next battle in Goldberger’s war was a massive survey of over 4,000 residents of textile villages throughout the Piedmont of South Carolina.  It began in April 1916 and lasted 2.5 years, with data analysis continuing, in Goldberger’s absence, after America’s entry into the Great War.  Drawing on the statistical skills of his PHS colleague, Edgar Sydenstricker, the survey was remarkable for its time and place.  Homes were canvassed to determine the incidence of pellagra in relation to sanitation, food accessibility, food supply, family size and composition, and family income.  Sydenstricker’s statistical analysis of 747 households with 97 cases of pellagra showed that the proportion of families with pellagra markedly declined as income increased.  “Whatever the course that led to an attack of pellagra,” he concluded, “it began with a light pay envelope.”[4]
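For readers who want to see the logic of Sydenstricker’s tabulation in present-day terms, here is a minimal sketch – with entirely hypothetical brackets and counts, not the survey’s actual figures – of how pellagra incidence by household income can be computed.  Everything in it (the bracket labels, the numbers, the little script itself) is an illustrative assumption, not a reconstruction of the 1916-1918 data.

```python
# A minimal, hypothetical sketch of an incidence-by-income tabulation in the
# spirit of Sydenstricker's survey.  The numbers are invented for illustration;
# they are NOT the survey's actual figures.

# Each tuple: (income bracket, households canvassed, households with pellagra)
hypothetical_data = [
    ("lowest income bracket",  150, 40),
    ("second bracket",         200, 25),
    ("third bracket",          180, 12),
    ("highest income bracket", 217,  5),
]

print(f"{'Income bracket':25} {'Households':>10} {'With pellagra':>14} {'Rate %':>7}")
for bracket, households, cases in hypothetical_data:
    rate = 100 * cases / households  # pellagra incidence per 100 households
    print(f"{bracket:25} {households:>10} {cases:>14} {rate:>7.1f}")
```

The point of the exercise is simply the pattern Sydenstricker reported:  as the income bracket rises, the proportion of households with pellagra falls.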

But Goldberger was not yet ready to trade his suit of armor for the coat of a lab researcher.  In September 1919, the PHS reassigned him to Boston, where he joined his old mentor at the Hygienic Laboratory, Milton Rosenau, in exploring influenza with human test subjects.  Once the Spanish Flu had subsided, he was able to return to the South, and just in time for a new spike in pellagra rates.  By the spring of 1920, wartime prosperity was a thing of the past.  Concurrent dips in both cotton prices and tobacco profits led to depressed wages for mill workers and tenant farmers, and a new round of starvation diets led to dramatic increases in pellagra.  It was, wrote The New York Times on July 25, 1921, quoting a PHS memo, one of the “worst scourges known to man.”[5]

So Goldberger took up arms again, and in PHS-sponsored gatherings and southern medical conferences, withstood  virulent denunciations, often tinged with anti-Semitism.  Southern health care officers like South Carolina’s James A. Hayne dismissed the very notion of deficiency disease as “an absurdity.”  Hayne angrily refused to believe that pellagra was such a disease because, well, he simply refused to believe it – a dismissal sadly prescient of Covid-deniers who refused to accept the reality of a life-threatening viral pandemic because, well, they simply refused to believe it.[6]   

As late as November 1921, at a meeting of the Southern Medical Association, most attendees insisted that pellagra was caused by infection, and that Goldberger’s series of experiments was meaningless.  But they were meaningless only to those blinded to any and all meanings that reflected poorly on the South and its ability to feed its working class.  Even the slightest chink in the physicians’ self-protective armor would have let in the epidemiological plausibility of Goldberger’s deficiency model.  How could they fail to see that pellagra was a seasonal disease that reappeared every year in late spring or early summer, exactly like endemic scurvy and beriberi, both of which were linked to dietary deficiencies?

Back in the Hygienic Laboratory in Washington, Goldberger donned his lab coat and, beginning in 1922, devised a series of experiments involving both people and dogs.  Seeking to find an inexpensive substitute for the meat, milk, and eggs unavailable to the southern poor, he tested a variety of foods and chemicals, one at a time, to see if one or more of them contained the unknown pellagra preventative, now dubbed the “P-P factor.”  He was not inattentive to vitamins, but in the early ’20s, there were only vitamins A, B, and C to consider, none of which contained the P-P factor.  It was not yet understood that vitamin B was not a single vitamin but a vitamin complex.  Only two dietary supplements, the amino acid tryptophan and, surprisingly, brewer’s yeast, were found to have reliably preventive and curative properties.[7]

Brewer’s yeast was inexpensive and widely available in the South.  It would soon be put to the test.  In June 1927, following two seasons of declining cotton prices, massive flooding of 16,570,627 acres of the lower Mississippi River Valley lowered wages and increased food prices still further.  The result was a drastic increase in pellagra.  So Goldberger, with Sydenstricker at his side, headed South yet again, now hailed on the front page of the Jackson Daily News as a returning hero.  After a three-month survey of tenant farmers, whose starvation diet resembled that of the mill workers interviewed in 1916, he arranged for shipment of 12,000 pounds of brewer’s yeast to the hardest hit regions.  Three cents’ worth of yeast per day cured most cases of pellagra in six to ten weeks.  “Goldberger,” writes Kraut, “had halted an American tragedy.”[8]  Beginning with flood relief in 1927, Red Cross and state-sponsored responses to natural disasters followed Goldberger’s lead.  Red Cross refugee camps in 1927 and thereafter educated disaster victims about nutrition and pellagra and served meals loaded with P-P factor.  On leaving the camps, field workers could take food with them; families with several sick members were sent off with parcels loaded with pellagra preventives.

But the scientific question remained:  What exactly did brewer’s yeast, tryptophan, and two other tested products, wheat germ and canned salmon, have in common?  By 1928, Goldberger, who had less than a year to live,[9] was convinced it was an undiscovered vitamin, but the discovery would have to await the biochemistry of the 1930s.  In the meantime, Goldberger’s empirical demonstration that inexpensive substitutes for animal protein like brewer’s yeast prevented and cured pellagra made a tremendous difference in the lives of the South’s workforce.   Many thousands of lives were saved.

___________________ 

It was only in 1912, when pellagra ripped through the South, that Casimir Funk, a Polish-born American biochemist, like Goldberger a Jew, formulated the vita-amine or vitamine hypothesis to designate organic molecules essential to life but not synthesized by the human body, thereby pointing to the answer Goldberger sought.[10]  Funk’s research concerned beriberi, a deficiency disease that ravages the nervous system and causes cardiac problems to the point of heart failure.  In 1919, he determined that it resulted from the depletion of thiamine (vitamin B1).  The covering term “vita-amine” reflected his (mistaken) belief that other deficiency diseases – scurvy, rickets, pellagra – would be found to result from the absence of other amine (i.e., nitrogen-containing) molecules.

In the case of pellagra, niacin (aka vitamin B3, aka nicotinic acid/nicotinamide) proved to be the missing nutrient, Goldberger’s long sought-after P-P factor.  In the course of his research, Funk himself had isolated the niacin molecule, but its identification as the P-P factor was only made in 1937 by the American biochemist Conrad Elvehjem.  The circle of discovery begun with Frapolli’s observations in Lombardy in 1771 was closed between 1937 and 1940, when field studies on pellagrins in northern Italy conducted by the Institute of Biology of the NRC confirmed the curative effect of niacin.[11]

Now, ensnared for 2.5 years by a global pandemic that continues to sicken and kill throughout the world, we are understandably focused on communicable infectious diseases.  Reviewing the history of pellagra reminds us that deficiency diseases too have plagued humankind, and in turn brought forth the best that science – deriving here from the collaboration of laboratory researchers, epidemiologists, and public health scientists – has to offer.  Louis Pasteur, Robert Koch, and Walter Reed are the names that leap to the foreground in considering the triumphs of bacteriology.  Casimir Funk, Joseph Goldberger, Edgar Sydenstricker, and Conrad Elvehjem are murky background figures who barely make it onto the radar.

In the developed world, pellagra is long gone, though it remains  common in Africa, Indonesia, and China.  But the entrenched commercial and political interests that Goldberger and his PHS team battled to the mat are alive and well.  Over the course of the Covid pandemic, they have belittled public health experts and bewailed CDC protocols that limit “freedom” to contract the virus and infect others.  In 1914, absent Goldberger and his band of Rough Riders, the South would have languished with a seasonally crippled labor force far longer than it did.  Mill owners, cotton-growing farmers, and politicians would have shrugged and accepted the death toll as a cost of doing business.

Let us pause, then, and pay homage to Goldberger and his PHS colleagues.  They were heroes willing to enter an inhospitable region of the country and, among other things, ingest pills of pellagrin scabs and excreta to prove that pellagra was not a contagious disease.  There are echoes of Goldberger in Anthony Fauci, William Schaffner, Ashish Jha, and Leana Wen as they relentlessly fan the embers of  scientific awareness among those who resist an inconvenient truth: that scientists, epidemiologists, and public health officers know things about pandemic management that demagogic politicians and unfit judges do not.  Indeed, the scientific illiterati appear oblivious to the fact that the health of the public is a bedrock of the social order, that individuals ignore public health directives and recommendations at  everyone’s peril.  This is no less true now than it was in 1914.  Me?  I say,  “Thank you, Dr. Goldberger.  And thank you, Dr. Fauci.” 

___________________________           

[1] My material on Babcock and the early pellagra conferences at the South Carolina State Hospital comes from Charles S. Bryan, Asylum Doctor:  James Woods Babcock and the Red Plague of Pellagra (Columbia: Univ. of South Carolina Press, 2014), chs. 3-5.

[2] Alan Kraut, Goldberger’s War:  The Life and Work of a Public Health Crusader (NY: Hill & Wang, 2004), 7.

[3] To be sure, the “filth parties” did not rule out the possibility of animal or insect transmission of a microorganism.  Goldberger’s wife Mary, incidentally, was transfused with pellagrin blood but didn’t ingest the filth pills.

[4] Kraut, Goldberger’s War, 164.

[5] Quoted in Kraut, Goldberger’s War, 190.

[6] On Hayne, Goldberger’s loudest and most vitriolic detractor among southern public health officers, see Kraut, pp. 118, 194; Bryan, Asylum Doctor, pp. 170, 223, 232, 239; and Elizabeth Etheridge, The Butterfly Caste: A Social History of Pellagra in the South (Westport, CT: Greenwood, 1972), 42, 55, 98-99, 110-111.  This is the same James Hayne who in October 1918, in the midst of the Great Pandemic, advised the residents of South Carolina that “The disease itself is not so dangerous: in fact, it is nothing more than what is known as ‘Grippe’” (“Pandemic and Panic: Influenza in 1918 Charleston” [https://www.ccpl.org/charleston-time-machine/pandemic-and-panic-influenza-1918-charleston]).

[7] The tryptophan experiments were conceived and conducted by Goldberger’s assistant, W. F. Tanner, who, after Goldberger’s return to Washington, continued to work out of the PHS laboratory at Georgia State Sanitarium        (Kraut, Goldberger’s War, 203-204, 212-214).

[8] Kraut, Goldberger’s War, 216-222, quoted at 221. 

[9] Goldberger died from hypernephroma, a rare form of kidney cancer, on January 17, 1929.   Prior to the discovery of niacin, in tribute to Goldberger, scientists referred to the P-P factor as Vitamin G. 

[10] The only monographic study of Funk in English, to my knowledge, is Benjamin Harrow, Casimir Funk, Pioneer in Vitamins and Hormones (NY:  Dodd, Mead, 1955).  There are, however, more recent articles providing brief and accessible overviews of his achievements, e.g., T. H. Jukes, “The prevention and conquest of scurvy, beriberi, and pellagra,” Prev. Med., 18:877-883, 1989; Anna Piro, et al., “Casimir Funk: His discovery of the vitamins and their deficiency disorders,” Ann. Nutr. Metab., 57:85-88, 2010.

[11] Renato Mariani-Costantini & Aldo Mariani-Costantini, “An outline of the history of pellagra in Italy,” J. Anthropol. Sci., 85:163-171, 2007.

 

Copyright © 2022 by Paul E. Stepansky.  All rights reserved.  The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.

Anti-vaxxers in Free Fall

I read a news story in which a man is dying of Covid-19 in the hospital.  He is asked whether he regrets not getting vaccinated and rallies enough to reply, “No, I don’t believe in the vaccine.”  So what then does he believe in?  Systemic viral infection, suffering, and death?  If you don’t believe in vaccination, you don’t believe in modern medicine in toto.  You don’t believe in bacteriology, virology, cellular biology, microbiology, or immunology.  What then is left to prevent, diagnose, and treat disease?  Trump-ish medievalism, mysticism, shamanism, divine intervention?

A study by researchers at Harvard’s Brigham and Women’s Hospital used natural language processing to comb through 5,307 electronic patient records of adult type 2 diabetics living in Massachusetts and followed by their primary care physicians between 2000 and 2014.  They found that 43% (2,267) of patients refused to begin insulin therapy when their doctors recommended it.  Further, diabetics who declined the recommendation not only had higher blood sugar levels than those who began insulin, but had greater difficulty achieving glycemic control later on.[1]  So what do the insulin-declining diabetics believe in?  Chronic heart and kidney disease, blindness, and amputation – the all but inevitable sequelae of poorly managed diabetes?

The problem, really an epistemological problem, is that such people apparently have no beliefs at all – unless one imputes to them belief in disease, suffering, and death, and in the case of Covid vaccine resisters, the prerogative to inflict them on others.  This is not the same thing as holding a scientifically specious belief system that unintentionally does harm to others.  During the Yellow Fever epidemic that left Philadelphia in ruins in 1793, Dr. Benjamin Rush, highly acclaimed throughout the newborn nation, set about his curative missions by draining his patients, in successive bleedings, of up to four pints of blood while simultaneously purging them (i.e., evacuating their bowels) with copious doses of toxic mercury.

Rush’s “Great Purge,” adopted by his followers, added hundreds, perhaps thousands, to the death toll in Philadelphia alone.  But at least Rush’s “system” derived from a belief system.  He did in fact find a theoretical rationale for his regimen in an essay by the Virginia physician and mapmaker John Mitchell.  Describing yellow fever in Virginia in 1741, Mitchell noted that in yellow fever the “abdominal viscera were filled with blood, and must be cleaned out by immediate evacuation.”[2]  Bleeding, of course, was conventional treatment for all manner of disease in 1793, so Mitchell’s recommendation came as no surprise.  Taken in conjunction with the system of mercuric purges employed by Dr. Thomas Young during the Revolutionary War, Mitchell’s essay gave Rush all the grounding he required for a ruinously misguided campaign that greatly extended the recovery time of those it did not kill.  But, yes, he had his theory, and he believed in it.

In the early 19th century, Napoleon, sweeping through Europe, conquers the north Italian province of Bolzano, which in 1807 he incorporates into Bavaria.  Two years later, when the Bavarian government mandates smallpox vaccination for all residents, the newly absorbed Italians launch an armed revolt, partly because they believe vaccination would inject Protestantism into their Catholic veins.[3]

All right, it is a nonsensical belief, even in 1809, but it is still a belief of sorts.  It is epistemically flawed, because it fails to stipulate what exactly makes a substance inherently Protestant in nature; nor does it posit a mechanism of transmission whereby a Protestant essence seeps into liquid smallpox vaccine in the first place.  In the realm of ethics, it suggests that the possibility of death pales alongside the certainty of spiritual contamination by a fluid that, however neutral in life-saving potency, is injected by a Protestant hand.

Only slightly less ridiculous to modern ears is the mid-19th-century belief that general anesthesia via ether or chloroform, introduced by James Young Simpson in 1847, must be withheld from women giving birth.  The reason?  Genesis 3.16 enjoins women to bring forth new life in suffering.  Forget that the belief was espoused solely by certain men of the cloth and male physicians,[4] and was based on a highly questionable rendering of the biblical Hebrew.  Forget as well that, for Christians, Christ’s death redeemed humankind, relieving women of the need to relive the primal curse.  Bear in mind further that the alleged curse would also forbid, inter alia, use of forceps, caesarean operations, and embryotomy.  A woman with a contracted pelvis would die undelivered because she was guilty of a sin over which she had no control – that of having a contracted pelvis.[5]

In a secular nation founded on everyone’s right to pursue happiness in his or her own pain-free terms, we see the primal curse as archaic misogynist drivel, no less absurd than belief that the Bible, through some preternatural time warp, forbids vaccination.  But, hey, it’s a free country, and if a mid-19th-century or early-21st-century man chooses to believe that anesthesia permits men to escape pain whenever possible but women only in male-sanctioned circumstances, so be it.  It is a belief.

Now it’s 1878, and the worst yellow fever epidemic in American history is sweeping across the lower Mississippi Valley, taking lives and destroying commerce in New Orleans, Memphis and surrounding cities and towns to which refugees are streaming.  The epidemic will reach the Ohio Valley, bringing deadly Yellow Jack to Indiana, Illinois, and Ohio.  Koch’s monograph on the bacteriology of sepsis (wound infection) appeared that very year, and neither his work nor that of Lister is universally accepted in the American South.  Nor would its precepts have counted for much in the face of a viral (not bacterial) invader carried up the Mississippi from Havana.

What can city boards of health do in the face of massive viral infection, suffering, and death?  Beyond imposing stringent new sanitary measures, they can quarantine ships arriving in their harbors until all infected crew members have either died or been removed and isolated.  This will prevent the newly infected from infecting others and crippling cities still further – assuming, that is, a belief system in which yellow fever is contagious and spread from person to person.

But in 1878 Memphis, where by September the epidemic is claiming 200 lives a day, this “modern” belief is widely contested among the city’s physicians.  Some are contagionists, who believe that disease is caused by invisible entities that are transmissible.  But others, greater in number, favor the long-held theory that infectious disease results from “miasma” or bad air – air rendered toxic by decaying plant and animal matter in the soil.  If you believe miasma causes disease, then you’re hard-pressed to understand how quarantining ships laden with sick people will do anything to control the epidemic.

This was precisely the position of the 32 Memphis physicians who defeated the city council’s plan to institute a quarantine and set up a quarantine station.  Quarantine is pointless in the face of bad air.  The city’s only recourse, so held the 32, was to alter the “epidemic constitution” of the atmosphere by inundating it with smoke.  Cannon blasts and blazing barrels of tar up and down city streets – that’s the ticket to altering the atmospheric conditions that create infectious disease.[6]

The miasmic theory of disease retained a medical following throughout the 1870s, after which it disappeared in the wake of bacteriology.  But in Memphis in 1878, bad air was still a credible theory in which physicians could plausibly believe.  And this matter of reasonable belief – reasonable for a particular time and place – takes us back to the hospitalized Covid patient of 2021 who, with virtually his last breath, defends his decision to remain unvaccinated because he doesn’t believe in the vaccine.  What is the knowledge base that sustains his disbelief?  There isn’t any.  He has no beliefs, informed or otherwise, about bacteriology, virology, cellular biology, or immunology.  At best, he has decided to accept what someone equally belief-less has told him about Covid vaccination, whether personally, in print, or over the internet.

It is no different among the 43% of Massachusetts diabetics who, a century after Banting’s and Best’s miraculous discovery, declined insulin therapy when their doctors recommended it.  Their disbelief is actually a nonbelief because it is groundless.  For some, perhaps, the refusal falls back on a psychological inability to accept that one is diabetic enough to warrant insulin.  They resist the perceived stigma of being insulin-dependent diabetics.[7]  Here at least the grounds of refusal are intelligible and remediable.  An insulin phobia does not sustain real-world belief; it is an impediment to such belief in relation to diabetes and insulin, illness and long-term health, lesser and greater life expectancy.

Back in the present, I read another news story in which two unvaccinated hospital nurses explain to a journalist that they have refused Covid vaccination because the vaccines’ effectiveness is based on “junk data.”  Really?  Here there is the glimmering of a belief system, since scientific data can be more or less robust, more or less supportive of one or another course of action.

But what exactly makes Covid vaccine data worthless, i.e., junk?  And how have these two nurses acquired the expertise in epidemiology, population statistics, and data analysis to pass judgment on data deemed credible and persuasive by scientists at Pfizer, Moderna, Johnson & Johnson, the CDC, and the WHO?  And how, pray tell, have they gained access to these data?  Like all opponents of vaccine science, they pontificate out of ignorance, as if the mere act of utterance conferred truth-value on what is being uttered.  It’s an extreme example of asserting as fact what remains to be demonstrated (the fallacy of petitio principii), the legacy of an ex-president who elevated pathological lying to a political art form.

Even the nurses pale alongside the anti-vax protester who is pictured in a news photo holding a sign that reads, “Vaccines Kill.”[8]  Whom do they kill and under what circumstances?  Does he mean all vaccines are deadly and kill people all the time, or just certain vaccines, such as the Covid vaccine?  But what does it matter?  The sign holder doesn’t know anything about any vaccines.  Does he really believe that everything we know about the history of vaccine science from the time of Jenner is bogus, and that children who once died from smallpox, cholera, yellow fever, diphtheria, pertussis, typhoid, typhus, tetanus, and polio are still dying in droves, now from the vaccines they receive to protect them from these infectious diseases during the earliest years of life?  Is the demographic fact that, owing to vaccination and other public health measures, life expectancy in the U.S. has increased from 47 in 1900 to 77 in 2021 also based on junk data?  In my essay, Anti-vaccinationism, American Style, I provide statistics on the total elimination in the U.S. of smallpox and diphtheria, and virtual elimination of polio.  Were my claims also based on junk data?  If so, I’d appreciate being directed to the data that belie these facts and demonstrate that, in point of fact, vaccines kill.

Maybe the man with the sign has an acquaintance who got sick from what he believed to be a vaccine?  Perhaps someone on his internet chat group heard of someone else who became ill, or allegedly died, after receiving a vaccine.  Of course, death can follow vaccination without being caused by it.  Do we then assume that the man with the sign and like-minded protesters are well-versed in the difference between causation and correlation in scientific explanation?

We know that for a tiny number of individuals aspirin kills.[9]  So why doesn’t the man hold up a sign that reads, “Aspirin Kills”?  Here at least, he would be calling attention to a scientific fact that people with GI conditions should be aware of.  We know that sugary drinks have been linked to 25,000 deaths in the U.S. each year.  Why not a sign, “Soda Kills”?  It would at least be based on science.  He chooses not to proclaim the lethality of aspirin or soda because he cares no more about aspirin- or soda-related deaths than Covid-related deaths.  If he did, then, like the two nurses with their junk data and the Covid patient announcing disbelief in Covid vaccination on his deathbed, he would have to anchor his belief in consensually accepted scientific facts – a belief that someone, anyone, might find believable.

He is no different than other American anti-vaxxers I read about in the paper. They are the epistemological Luddites of our time, intent on wrecking the scientific machinery of disease prevention, despite profound ignorance of vaccine science and its impact on human affairs since the late 18th century.  Indeed, they see no need to posit grounds of belief of any kind, since their anger – at Covid, at Big Government, at Big Science, at Big Medicine, at Big Experts – fills the epistemic void.  It fuels what they offer in place of the science of disease prevention:  the machinery of misinformation that is their stock in trade.

And therein is the source of their impotence.  They have fallen into an anti-knowledge black hole, and struggle to fashion an existence out of anger that – to push the anti-matter trope a little further – repels rational thought.  Their contrarian charge is small solace for the heightened risks of diseases, suffering, and death they incur, and, far less conscionably, impose on the rest of us.

______________________

[1] N. Hosomura, S. Malmasi, et al., “Decline of Insulin Therapy and Delays in Insulin Initiation in People with Uncontrolled Diabetes Mellitus,” Diabetic Med., 34:1599-1602, 2017.

[2] J. H. Powell, Bring Out Your Dead:  The Great Plague of Yellow Fever in Philadelphia in 1793 (Phila: Univ. of Pennsylvania Press, 1949), 76-78.

[3] My thanks to my friend Marty Meyers for bringing to my attention this event of 1809, as reported by Emma Bubola, “In Italy’s Alps, Traditional Medicine Flourishes, as Does Covid,” New York Times, December 16, 2021.

[4] With reason, wrote Elizabeth Cady Stanton in The Woman’s Bible (1895), “The Bible and the Church have been the greatest stumbling blocks in the way of women’s emancipation.”

[5] For a fuller examination of the 19th-century debate on the use of general anesthesia during childbirth, see Judith Walzer Leavitt, Brought to Bed: Childbearing in America, 1750-1950 (NY:  OUP, 1986), ch. 5.

[6] On the measures taken to combat the epidemic in Memphis, including the rift between contagionist and noncontagionist physicians, see John H. Ellis, Yellow Fever and Public Health in the New South (Lexington: Univ. Press of Kentucky, 1992), ch. 3.

[7] A. Hussein, A. Mostafa, et al., “The Perceived Barriers to Insulin Therapy among Type 2 Diabetic Patients,” African Health Sciences, 19:1638-1646, 2019.

[8] Now, sadly, we have gone from hand-written “Vaccines Kill” signs to highway billboards, e.g., https://www.kxxv.com/hometown/mclennan-county/a-new-billboard-in-west-claims-vaccines-kill.

[9] Patients prescribed aspirin before developing a GI bleed or perforation are prominent among those killed by aspirin.  See A. Lanas, M. A. Perez-Aisa, et al., “A Nationwide Study of Mortality Associated with Hospital Admission and Those Associated with Nonsteroidal Antiinflammatory Drug Use,” Am. J.  Gastroenterol., 100:1685-1693, 2005; S. Straube, M. R. Trainer, et al., “Mortality with Upper Gastrointestinal Bleeding and Perforation,” BMC Gastroenterol., 8: 41, 2009.

Everything You Didn’t Want to Know About Typhoid

Extreme fatigue; dangerously high fever; severe abdominal pain; headaches; diarrhea or constipation; nausea and vomiting – the symptoms of severe typhoid fever can be a panoply of horrors.  Like cholera, the bacteria in question – Salmonella typhi, cousin to the Salmonella that causes food poisoning – find a home in water and food contaminated with human feces.  The infection is contracted only by humans, and it is highly contagious.  More persons contract it from human contact – often from unwashed hands following defecation – than from drinking contaminated water or ingesting contaminated food.  But the latter are hardly incidental causes.  At least two billion people worldwide, the World Health Organization tells us, drink feces-contaminated water.[1]

And the story gets worse.  Through the 19th century, “chronic carriers” could not be conceptualized, much less detected.  They were symptom-free folks in whom typhi found safe harbor in the gall bladder, where they traveled with stored bile through the bile duct into the small intestine en route to fecal expulsion.  The chronic carriers brought infection to entire communities in sudden, explosive outbreaks; typhoid is a prime example of what epidemiologists term a “fulminant” disease (from the Latin fulmināre, to strike with lightning).  And worse still, the ranks of chronic carriers were enlarged by some of those who contracted the disease and recovered.  Typhi lived on in their gall bladders as well, and were passed on to others via the same fecal-oral route.

The Mother of all Chronic Carriers, the Super Spreader who comes down to us as Typhoid Mary, was one Mary Mallon, an Irish cook who passed on typhi to no fewer than 53 members of seven prominent Manhattan-area households between 1900 and 1906.  In 1907 she was quarantined in a bungalow on New York’s North Brother Island near Riverside Hospital, only to resume her career as cook-super spreader on release in 1910.  Tracked down five years later, she was whisked back to her island bungalow, where she lived out her remaining 23 years.

Here is what Salmonella typhi do once ingested through the mouth.  Absent sufficient gastric acid to neutralize them in the stomach, the bacteria make their way to the terminal portion of the small intestine, the ileum, and enter the cells lining it.  From there they penetrate the underlying lymphatic tissue and enter the blood stream, which shuttles them to other organs:  the liver, the spleen, bone marrow.  Without fast, effective treatment, matters go from bad to worse.  The intestinal lining mounts a massive inflammatory response, and in the worst cases bacterial ulceration eats clear through the wall of the ileum, leaving a hole through which intestinal contents drain into the abdomen, with attendant and severe pain, while typhi flood the body, carrying infection to the brain, heart, and pancreas.  Death is now around the corner; only major abdominal surgery holds any prospect of survival.  It is a pernicious disease of microbial migratory urgency.

Improvements in water treatment and personal hygiene, along with antibiotic therapy and – yes! – a newly effective vaccine for adults, brought typhoid to its knees in the United States after World War II.  But the disease is alive and well in Central and South America, Africa, and parts of Asia, where it claims between 11 and 21 million victims and some 200,000 deaths each year.[2]  Typhi has evolved along with the antibiotics that control it, and multidrug-resistant strains (MDR) remain deadly.  And even here, on these ostensibly sanitized shores, typhi can still make its presence known.  As recently as 2010, nine Americans contracted typhoid, five in California and four in Nevada.[3] 

But such instances are aberrational, and in the northern hemisphere typhoid fever has long since vanished from anyone’s disease-monitoring radar.  Now federal and state governments, the bane of anti-vaccine irrationalists and anti-mask naysayers, make sure we don’t drink water or eat food contaminated by microbe-laced feces.  But it was not so for our forebears.  In the Civil War, typhoid fever devastated North and South alike; the Union Army’s Satterlee General Hospital in West Philadelphia was constructed in 1862 largely to cope with its victims.  In the Spanish-American War of 1898, typhoid fever shared center stage with yellow fever and, at war’s end, rated its own federal investigative commission.  Chaired by Walter Reed, the Typhoid Commission determined that contact among soldiers (“comrade contact”) was primarily responsible for the transmission of typhoid fever in military camps.[4]  Four years later, Koch’s investigations during a typhoid epidemic in Trier, Germany, led him to generalize the Commission’s finding:  typhoid fever was contracted less from contaminated water or sewage than from nonsymptomatic carriers; the “carrier hypothesis” was among his final significant contributions.[5]

The era of modern typhoid prevention began in 1897, when Almroth Wright, then a pathologist at the British Army’s Medical School at Netley Hospital, published a paper on the typhoid vaccine he had developed with killed typhi.  The Army took note and, in the South African war that began two years later, made very limited use of it:  of 330,000 British troops, only 14,000 received the vaccine.  It was effective in this limited trial but never caught on after the war.[6]  Beginning in 1902, the U.S. government’s Public Health and Marine Hospital Service, renamed the Public Health Service in 1912, focused its research on typhoid.  Progress was made, and by the time America entered WWI, the PHS’s Hygienic Laboratory had developed an antityphoid vaccine.[7]  American troops sailing to France in 1917 were not asked how they felt about receiving a typhoid vaccine; they received their mandatory shots and boarded their ships.  Those who were not vaccinated stateside received their shots on arriving at their camps.  Vaccination was not negotiable.  The obligation to live and fight for the nation trumped the freedom to contract typhoid, suffer, and possibly die.

“A Monster Soup Commonly Called Thames Water,” a mid 19th-century etching depicting the stew of disease-promoting organisms in the river that supplied drinking water to Londoners.

The vaccine dramatically reduced the incidence of typhoid, but the disease still wrought damage in field and base hospitals, especially among unvaccinated European troops who had been fighting since 1914.  American nurses who arrived in northern France and Belgium in advance of troops remembered their misery at being transferred to typhoid wards, which, as one recalled, were “gloomy and dark.”  Another recalled a typhoid scourge that crippled her hospital and created an urgent need to find space outside the hospital for the typhoid patients.[8]

_______________________________

The current governors of Texas and Florida would surely look askance at the history of typhoid control, since a key aspect of it – allowing children on school premises to drink only water subjected to antimicrobial treatment – ignores parental freedom of choice.  Parents decide what their children eat, and they should be free to determine what kind of water they drink.  Children are not military enlistees obligated to remain healthy in the service of the nation.  What right do school boards have to abrogate the freedom of parents to determine what kind of water their children drink?  Why should they be mandated to drink water subject to modern sanitary treatment that robs it of Salmonella typhi along with Vibrio cholerae, Poliovirus, and dysentery-causing Shigella?  Shouldn’t they be free to have their children partake of nature’s bounty, to drink fresh water from streams and rivers, not to mention untreated well water contaminated with human feces and the pathogens it harbors?

And here is the Covid connection.  If local school boards and municipal authorities lack the authority to safeguard children, to the extent possible, through obligatory wearing of facemasks, then surely they lack the authority to force them to drink water filtered through layers of state and federal regulation informed by modern science.  Let parents be free to parent; let their children pay the steep, even life-threatening price.      

Did I mention that young children, along with immune-compromised young adults, are at greatest risk for contracting typhoid?  Well, now you know, and now, perhaps, we can return to reality.  State governors who do not understand the legal and moral imperative of acting in the best interests of the child[9] are unfit for public office of any sort.  In point of fact, they are unfit. Who wants governors who, in denying adults the right to act responsibly in the best interests of children, sanction child abuse?  Let them crawl back into the existential dung heap whence they emerged.    


[1] https://www.who.int/news-room/fact-sheets/detail/drinking-water.

[2] https://www.cdc.gov/typhoid-fever/health-professional.html.

[3] https://www.cdc.gov/salmonella/2010/frozen-fruit-pulp-8-25-10.html.

[4] Victor C. Vaughan, A Doctor’s Memories (Indianapolis: Bobbs-Merrill, 1926), 369ff., 386.

[5] Thomas D. Brock, Robert Koch: A Life in Medicine and Bacteriology (Wash, DC: ASM Press, 1998 [1988]), 255-256.

[6] Gwyn Macfarlane, Alexander Fleming: The Man and the Myth (Cambridge: Harvard University Press, 1984), 54-55.

[7] Victoria A. Harden, Inventing the NIH: Federal Biomedical Research Policy, 1887-1937 (Baltimore:  Johns Hopkins University Press, 1986), 41.

[8] Grace McDougall, A Nurse at the War:  Nursing Adventures in Belgium and France (NY: McBride, 1917), 111, 117; Alice Isaacson, Diary of 1917, Library and Archives Canada, letter of 16 February 1917.

[9] Joseph Goldstein, Anna Freud, et al., In the Best Interests of the Child (New York:  Free Press, 1986).

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.


Remembering Cholera in the Time of Covid

It came in crippling waves, the deadly intestinal infection that, in 1832 alone, killed 150,000 Americans.  Its telltale symptom was copious watery diarrhea (“rice water”) accompanied by heavy vomiting, loss of internal fluids and electrolytes, and dehydration; hence its name, cholera, from the Greek “flow of bile.”  Severe cases quickly proceeded to hypovolemic shock and could kill otherwise healthy persons, children and adults, within hours.  Historians refer to the cholera epidemics of the 19th century, but the epidemics, especially those of 1832, 1848, and 1866, were in fact pandemics, spreading from nation to nation and continent to continent.

     Orthodox or “regular” physicians of the first half of the 19th century had no clue about its cause, mode of transmission, or remedy, so they speculated wildly in all directions.  Contagionists believed it spread from person to person.  Noncontagionists attributed it to poisonous fumes, termed miasma, which emanated from soil and decaying matter of all kinds.  Some attributed it to atmospheric changes; others thought it a byproduct of fermentation.  Physician-moralists could be counted on to ascribe it to the moral failings of the underclass.  Withal, the regulars resorted, one and all, to “heroic” treatments, with bleeding and toxic chemicals, especially calomel (mercury) and laudanum (opium-laced alcohol), having pride of place.  These treatments only hastened the death of seriously ill patients, which, given the extent of their suffering, may have been an unwitting act of mercy.  Physician-induced deaths fueled the appeal of homeopathy and botanical medicine, far milder approaches that, so their practitioners avowed, caused far fewer deaths than the horrid regular remedies.

Caricature of 1832, depicting a cholera sufferer with the nightmarish “remedies” of the day.

      The suffering public, seeing the baleful effects of conventional remedies, grew disgusted with doctors.  The Cholera Bulletin, a newsletter put out by a group of New York physicians over the summer of 1832, grimly acknowledged in its July 23 issue the “fierce onslaught” of cholera and doctors in felling the afflicted:  “For Cholera kills and Doctors slay, and every foe will have its way!”  After a new wave of cholera reached American shores in 1848, New York’s Sunday Dispatch lambasted traditional medical science as “antiquated heathen humbug, utterly unworthy of the middle of the nineteenth century.”  “Cholera was a most terrible affliction,” chimed in the New York Herald a year later, “but bad doctors and bad drugs are worse.  The pestilence might come now and then; physicians we had always with us.”[1]

     And yet, amid such loathing of doctors and their so-called remedies, science marched on in the one domain in which forward movement was possible.  Throughout the second half of the 19th century, cholera was the catalyst that brought Europe and eventually America into the proto-modern era of public health management of infectious disease.  Then as now, measures that safeguard the public from life-threatening infectious disease are good things.  In 1853, after another cholera epidemic reached Edinburgh, there was no political posturing about “rights” – presumably the right of British citizens to get sick and die.  Parliament, the “Big Government” of the day, resolved to go after the one major, recurrent infectious disease for which a vaccine was at hand:  smallpox.  The Vaccination Act of 1853 grew out of this resolve.  Among other things, it instituted compulsory smallpox vaccination, with all infants to be vaccinated within the first three months of life (infants in orphanages were given four months).  Physicians were obligated to send certificates of vaccination to local birth registrars, and parents who did not comply were subject to fines or imprisonment.  The requirement was extended under the Vaccination Act of 1867.[2]

     New York City followed suit a decade later, when the state legislature created the Metropolitan Board of Health.  The Board responded to the outbreak of cholera in 1866 by mandating the isolation of cholera patients and disinfection of their excretions.  When a new epidemic, which travelled from India to Egypt, erupted in 1883, French and German teams descended on Egypt in search of the specific microorganism responsible for cholera.  The prize went to Robert Koch, who isolated the cholera vibrio in January 1884.

     In 1892, when a cholera epidemic originating in Hamburg brought 100 cases to New York, the city mobilized with the full force of the new science of bacteriology.  The Board of Health lost no time in establishing a Division of Pathology, Bacteriology, and Disinfection, which included a state-of-the-art bacteriological laboratory under the direction of Hermann Biggs.  The lab, as we have seen, came into its own in the fight against diphtheria, but it was the threat of cholera that brought it into existence.  A year later, in 1893, Congress passed the National Quarantine Act, which created a national system of quarantine regulations that included specific procedures for the inspection of immigrants and cargos.  It was to be administered by the U.S. Marine Hospital Service, forerunner of the Public Health Service.

     In the late 1840s, the Bristol physician William Budd argued that contaminated sewage was the source of cholera, and in 1854 the surgeon John Snow traced the source of a cholera outbreak in his Soho, London neighborhood to contaminated well water.  But it was the Hamburg epidemic that proved beyond doubt that cholera was waterborne, and Koch himself demonstrated that water filtration was the key to its control.[3]  Now, we rarely hear of cholera, since water and waste management systems that came into existence in the last century eliminated it from the U.S. and Europe.[4]  Anti-vax libertarians would no doubt take exception to the Safe Drinking Water Act of 1974, which empowers the EPA to establish and enforce national water quality standards.  There it is again, the oppressive hand of Big Government, denying Americans the freedom to drink contaminated water and contract cholera.  Where has our freedom gone? 

Caricature of 1866, “Death’s Dispensary,” depicting contaminated drinking water as a source of cholera.

     The gods will never stop laughing at the idiocy of humankind.  Here we are in 2021 and, thanks to the foundation laid down by 19th-century scientists, gifted scientists of our own time have handed us, in astoundingly little time, an understanding of the coronavirus, its mode of transmission, and a pathway to prevention and containment.  We have in hand safe and effective vaccines that reduce the risk of infection to minuscule proportions and ensure that, among the immunized, infection from potent new strains of the virus will almost always be mild and tolerable, and only rarely life-threatening.

     Yes, a small percentage of those who receive Covid vaccines will have reactions, and, among them, a tiny fraction will become ill enough to require treatment, even hospitalization.  But they will recover and enjoy immunity thereafter.  Such “risks” pale alongside those incurred by their forebears, who sought protection from smallpox in the time-tested manner of earlier generations.  In America, a century before the discovery of Edward Jenner’s cowpox-derived vaccine, colonists protected themselves from recurrent smallpox epidemics through inoculation with human smallpox “matter.”  The procedure, termed variolation, had long been practiced in China and parts of the Ottoman Empire, reaching Britain and America in 1721.  It involved inoculating the healthy with pus scraped from skin ulcers of those already infected, and was informed by the ancient observation that smallpox could be contracted only once in a lifetime.[5]  The variolated developed a mild case of smallpox which, so it was hoped, would confer protection against the ravages of future epidemics.

     And they were essentially right: over 98% of the variolated survived the procedure and achieved immunity.[6]   To be sure, the risk of serious infection was greater with variolation than with Edward Jenner’s cowpox-derived vaccine, but the latter, which initially relied on the small population of English cows that contracted cowpox and person-to-person inoculation, was a long time in coming.  It took the United States most of the 19th century to maintain and distribute an adequate supply of Jennerian vaccine.  Long before the vaccine was widely available, when the death rate from naturally acquired smallpox was roughly 30%,[7] Americans joined Europeans, Asians, and Africans in accepting the risks of variolation. For George Washington, as we noted, the risks paled alongside the very real risk that the Continental Army would collapse from smallpox:  he had every soldier variolated before beginning military operations in Valley Forge in 1777.[8]

     But here we are in 2021, with many Americans unwilling to accept the possibility of any Covid vaccine reaction at all, however transient and tolerable.  In so doing, they turn their backs on more than two hundred years of scientific progress, of which the successful public health measures in Europe and America spurred by the cholera epidemics form an important chapter.  The triumph of public health, which antedated by decades the advent of bacteriology, accompanied the increased life expectancy and vastly improved quality of life wrought by vaccine science, indeed, by science in general.

     Witness Britain’s Great Exhibition of 1851, a scant three years after the cholera epidemic of 1848. Under the dome of the majestic Crystal Palace, science was celebrated in all its life-affirming possibilities.  In medicine alone, exhibits displayed mechanically enhanced prosthetic limbs, the first double stethoscope, microscopes, surgical instruments and appliances of every kind, and a plethora of pharmaceutical extracts and medicinal juices (including cod liver oil).[9] Topping it off was a complete model of the human body that comprised 1,700 parts.  Science promised better lives and a better future; scientific medicine, which by 1851 had begun to include public health measures, was integral to the promise.

     But here we are in 2021, replete with anti-vaccinationists who choose to endanger themselves, their children, and members of their communities.  They are anti-science primitives in our midst, and I behold them with the same incredulity that visitors to Jurassic Park beheld living, breathing dinosaurs.  Here are people who repudiate both public health measures (mask wearing, curfews, limits on group gatherings) and vaccination science in a time of global pandemic.  For them, liberty is a primal thing that antedates the social contract, of which our Constitution is a sterling example.  It apparently includes the prerogative to get sick and make others sick to the point of death.  The anti-vaccinationists, prideful in their ignorance and luxuriating in their fantasies of government control, remind me of what the pioneering British anthropologist Edward Tylor termed “survivals,” by which he meant remnants of cultural conditions and mindsets irrelevant to the present.  Dinosaurs, after all, didn’t care about the common good either.


[1] Newsletter and press quotations from Charles Rosenberg, The Cholera Years: The United States in 1832, 1849, and 1866 (Chicago: University of Chicago Press, 1962), 68, 161, 155.

[2] Dorothy Porter & Roy Porter, “The Politics of Prevention:  Anti-Vaccinationism and Public Health in Nineteenth-Century England,” Medical History, 32:231-252, 1988.

[3] Thomas D. Brock, Robert Koch: A Life in Medicine and Bacteriology (Wash, DC: ASM Press, 1998 [1988]), 229-230, 255.

[4] The last reported case of cholera in the U.S. was in 1949.  Cholera, sadly, remains alive and well in a number of African countries. 

[5] In China and Asia Minor, where variolation originated, dried smallpox scabs blown into the nose was the mode of inoculation. 

[6] José Esparza, “Three Different Paths to Introduce the Smallpox Vaccine in Early 19th Century United States,” Vaccine, 38:2741-2745, 2020.

[7] Ibid.

[8] Andrew W. Artenstein, et al., “History of U.S. Military Contributions to the Study of Vaccines against Infectious Diseases,” Military Medicine, 170[suppl]:3-11. 2005.  

[9] C. D. T. James, “Medicine and the 1851 Exhibition,” J. Royal Soc. Med., 65:31-34, 1972.

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.

