The War on Children’s Plague

In the early 19th century, doctors called it angina maligna (gangrenous pharyngitis) or “malignant sore throat.”  Then in 1826, the French physician Pierre-Fidele Bretonneau grouped both together as diphtherite.  It was a horrible childhood disease in which severe inflammation of the upper respiratory tract gave rise to a false membrane, a “pseudomembrane,” that covered the pharynx, larynx, or both.  The massive tissue growth prevented swallowing and blocked airways and often led to rapid death by asphyxiation.  It felled adults and children alike, but younger children were especially vulnerable.  Looking back on the epidemic that devastated New England in 1735-1736, the lexicographer Noah Webster termed it “literally the plague among children.”  It was the epidemic, he added, in which families often lost all, or all but one, of their children.

A century later, diphtheria epidemics continued to target the young, especially those in cities.  Diphtheria, not smallpox or cholera, was “the dreaded killer that stalked young children.”[1]   It was especially prevalent during the summer months, when children on hot urban streets readily contracted it from one another when they sneezed or coughed or spat.  The irony is that a relatively effective treatment for the disease was already in hand.

In 1882, Robert Koch’s assistant, Friedrich Loeffler, published a paper identifying the bacillus – the rod-shaped bacterium Corynebacterium diphtheriae first identified by Edwin Klebs – as the cause of diphtheria.  German scientists immediately went to work, injecting rats, guinea pigs, and rabbits with live bacilli, and then injecting their blood serum – blood from which cells and clotting factors have been removed – into infected animals to see if the serum could produce a cure.  Then they took blood from the “immunized” animals, reduced it to the cell-free blood liquid, and injected it into healthy animals.  The latter, to their amazement, did not become ill when injected with diphtheria bacilli.  This finding was formalized in the classic paper of Emil von Behring and Shibasaburo Kitasato of 1890, “The Establishment of Diphtheria Immunity and Tetanus Immunity in Animals.”  For this, von Behring was awarded the very first Nobel Prize in Medicine in 1901.

Thus the birth of blood serum therapy, precursor of modern vaccines and antibiotics alike.  By the early 1890s, Emile Roux and his associates at the Pasteur Institute discovered that immunized horses, not the rabbits used by Behring and Kitasato, produced the most potent diphtheria serum of all.  Healthy horses injected with a heat-killed broth culture of diphtheria, it was found, could survive repeated inoculations with the live bacilli.  The serum, typically referred to as antitoxin, neutralized the highly poisonous substances – the exotoxins – secreted by diphtheria bacteria.

And there was more:  horse serum provided a high degree of protection for another mammal, viz., human beings.  Among people who received an injection of antitoxin, only one in eight developed symptoms on exposure to diphtheritic individuals.  In 1895 two American drug companies, H. K. Mulford of Philadelphia and Parke Davis of Detroit, began manufacturing diphtheria antitoxin.  To be sure, their drug provided only short-term immunity, but it sufficed to cut the U.S. death rate among hospitalized diphtheria patients in half.  This fact, astonishing for its time, fueled the explosion of disease-specific antitoxins, some quite effective, some less so.  By 1904 Mulford alone had antitoxin preparations for anthrax, dysentery, meningitis, pneumonia, tetanus, streptococcus infections, and of course diphtheria.

Colorful Mulford antitoxin ad from the early 20th century, featuring, of course, the children

In the era of Covid-19, there are echoes all around of the time when diphtheria permeated the nation’s everyday consciousness. Brilliant scientists, then and now, deploying all the available resources of laboratory science, developed safe and effective cures for a dreaded disease.  But more than a century ago, the public’s reception of a new kind of preventive treatment – an injectable horse-derived antitoxin – was unsullied by the resistance of massed anti-vaccinationists whose anti-scientific claims are amplified by that great product of 1980s science, the internet. 

To be sure, in the 1890s and early 20th century, fringe Christian sects anticipated our own selectively anti-science Evangelicals.  It was sacrilegious, they claimed, to inject the blood product of beasts into human arms, a misgiving that did nothing to curb their appetite for enormous quantities of beef, pork, and lamb.  Obviously, their God had given them a pass to ingest bloody animal flesh.  Saving children’s lives with animal blood serum was apparently a different matter.

During the summer months, parents lived in anxious expectation of diphtheria every day their children ventured onto city streets.  Their fear was warranted and not subject to the denials of self-serving politicians.  In 1892, New York City’s Health Department established the first publicly funded bacteriological laboratory in the country, and between 1892 and the summer of 1894, the lab proved its worth by developing a bacteriological test for diagnosing diphtheria.  Infected children could now be sent to hospitals and barred from public schools.  Medical inspectors, armed with the new lab tests, went into the field to enforce a plethora of health department regulations.

Matters were simplified still further in 1913, when the Viennese pediatrician Bela Schick published the results of experiments demonstrating how to test children for the presence or absence of diphtheria antitoxin without sending their blood to a city lab. Armed with the “Schick test,” public health physicians and nurses could quickly and painlessly determine whether or not a child was immune to diphtheria.  For the roughly 30% of New York City school children who had positive reactions, injections of antitoxin could be given on the spot.  A manageable program of diphtheria immunization in New York and other cities was now in place.    

What about public resistance to the new proto-vaccine?  There was very little outside of religious fringe elements.  In the tenement districts, residents welcomed public health inspectors into their flats.  Intrusion into their lives, it was understood, would keep their children healthy and alive, since it led to aggressive intervention under the aegis of the Health Department.[2]   And it was not only the city’s underserved, immigrants among them, who got behind the new initiative.  No sooner had Hermann Biggs, head of the city’s bacteriological laboratory, set in motion the lab’s inoculation of horses and preparation of antitoxin, than the New York Herald stepped forward with a fund-raising campaign that revolved around a series of articles dramatizing diphtheria and its “solution” in the form of antitoxin injections.  The campaign raised sufficient funds to provide antitoxin for the Willard Parker Hospital, reserved for patients with communicable diseases, and for the city’s private physicians as well.  In short order, the city decided to provide antitoxin to the poor free of charge, and by 1906 the Health Department had 318 diphtheria antitoxin stations administering free shots in all five boroughs.[3][4]

A new campaign by New York City’s Diphtheria Prevention Commission was launched in 1929 and lasted two years.   As was the case three decades earlier, big government, represented by state and municipal public health authorities, was not the problem but the solution.  To make the point, the Commission’s publicity campaign adopted military metaphors.  The enemy was not government telling people what to do; it was the disease itself along with uncooperative physicians and recalcitrant parents.  “The very presence of diphtheria,” writes Evelynn Hammonds, “became a synonym for neglect.”[5]     

The problem with today’s Covid anti-vaccinationists is that their opposition to vaccination is erected on a foundation of life-preserving vaccination science of which they, their parents, their grandparents, and their children are beneficiaries.  They can shrug off the need for Covid-19 vaccination because they have been successfully immunized against the ravages of debilitating childhood diseases.  Unlike adults of the late 19th and early 20th centuries, they have not experienced, up close and personal, the devastation wrought summer after summer, year after year, by the diphtheria bacillus.  Nor have they lost children to untreated smallpox, scarlet fever, cholera, tetanus, or typhus.  Nor, finally, have they, in their own lives, beheld the miraculous transition to a safer world in which children stopped contracting diphtheria en masse, and when those who did contract the disease were usually cured through antitoxin injections.

In the 1890s, the citizens of New York City had it all over the Covid vaccine resisters of today.  They realized that the enemy was not public health authorities infringing on their right to keep themselves and their children away from antitoxin-filled syringes. No, the enemy was the microorganism that caused them and especially their children to get sick and sometimes die. 

Hail the supreme common sense that led them forward, and pity those among us for whom the scientific sense of the past 150 years has given way to the frontier “medical freedom” of Jacksonian America.  Anti-vaccinationist rhetoric, invigorated by the disembodied camaraderie of internet chat groups, does not provide a wall of protection against Covid-19.  Delusory thinking is no less delusory because one insists, in concert with others, that infection can be avoided without the assistance of vaccination science.  The anti-vaccinationists need to be vaccinated along with the rest of us.  A healthy dose of history wouldn’t hurt them either.


[1] Judith Sealander, The Failed Century of the Child: Governing America’s Young in the Twentieth Century (Cambridge: Cambridge Univ. Press, 2003), p. 326.

[2] Evelynn Maxine Hammonds, Childhood’s Deadly Scourge: The Campaign To Control Diphtheria in New York City, 1880-1930 (Baltimore: Johns Hopkins University Press, 1999), 84-86.

[3] William H. Park, “The History of Diphtheria in New York City,” Am. J. Dis. Child., 42:1439-1445, 1931.

[4] Marian Moser Jones, Protecting Public Health in New York City: Two Hundred Years of Leadership, 1805-2005 (NY: New York City Department of Health and Mental Hygiene, 2005), 20.                                     

[5] Hammonds, op. cit., p. 206.

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.

Vaccinating Across Enemy Lines

There are periods in American history when scientific progress is in sync with governmental resolve to exploit that progress.  This was the case in the early 1960s, when advances in vaccine development were matched by the Kennedy Administration’s efforts to vaccinate the nation and improve the public’s health.  And the American public wholeheartedly supported both the emerging generation of vaccines and the government’s resolve to place them in the hands – or rather arms – of as many Americans as possible. The Vaccination Assistance Act of 1962 grew out of this three-pronged synchrony.[1]

Between 1963 and 1965, a severe outbreak of rubella (German measles) lent support to those urging Congress to approve Title XIX (the Medicaid provision) of the Social Security Act of 1965.  And Congress rose to the task, passing into law the “Early and Periodic Screening, Diagnosis, and Treatment” amendments to Title XIX.  The latter affirmed the right of every American child to receive comprehensive pediatric care, including vaccination.

The timing was auspicious.  In 1963, Merck, Sharp & Dohme began shipping its live-virus measles vaccine, trademarked Rubeovax, which had to be administered with standardized immune globulin (Gammagee).  In 1967 MSD combined the measles vaccine with smallpox vaccine as Dryvax, and then, a year later, released a more attenuated live measles vaccine (Attenuvax) that did not require coadministration of immune globulin.[2]   MSD marketing reminded parents that mumps, long regarded as a benign childhood illness, was now associated with adult sterility.  Mumps, too, bowed to science and responsible parenting, its incidence among American children falling 98% between 1968 and 1985.

Crowd waiting for 1962 oral polio vaccination.  Creator: CDC/Mr. Stafford Sm

America’s commitment to vaccination was born of the triumphs of American medicine during WWII and came to fruition in the early 1950s, just as Cold War fears of nuclear war gripped the nation and pervaded everyday life.  Grade school nuclear attack drills, “duck and cover” animations, basement fallout shelters with cabinets filled with canned food – I remember all too well these scary artifacts of a 1950s childhood. Competition with the Soviet Union suffused all manner of scientific, technological, public health-related, and athletic endeavor. The Soviets leapt ahead in the space race with the launching of Sputnik in 1957.  The U.S. retained an enormous advantage on the ground with the size and destructive power of its nuclear arsenal.

Less well known is that, in the matter of mass polio vaccination, countries in the Eastern Bloc – Hungary, Czechoslovakia, Poland – led the way.  Hungary’s intensive annual vaccination campaigns, launched in 1957 with Salk vaccine imported from the U.S. and, beginning in 1959, Sabin vaccine imported from the U.S.S.R., were the prototype for the World Health Organization’s (WHO) global strategy of polio eradication.  Czechoslovakia became the first nation to eradicate polio in 1959; Hungary followed in 1963.[3]

It is tempting to absorb the narrative of polio eradication into Cold War politics, especially the rhetoric of the vaccination campaigns that mobilized the public. Throughout the Eastern Bloc, mass vaccination was an aspect of pro-natalist policies seeking to increase live births, healthy children, and, a bit down the road, productive workers. Eradication of polio, in the idiom of the time, subserved the reproduction of labor. In the U.S., the strategic implications of mass vaccination were framed differently.  During the late 50s and early 60s, one in five American applicants for military service was found medically unfit.  Increasing vaccination rates was a cost-effective way of rendering more young men fit to serve their nation.[4]   

But there is a larger story that subsumes these Cold War rationales, and it is a story, surprisingly, of scientific cooperation across the Iron Curtain.  Amid escalating Cold War tensions, the United States and Soviet Union undertook a joint initiative, largely clandestine, to develop, test, and manufacture life-saving vaccines.  The story begins in 1956, when the U.S. State Department and Soviet Ministry of Foreign Affairs jointly facilitated collaboration between Albert Sabin and two leading Soviet virologists, Mikhail Chumakov and Anatoli Smorodintsev.  Their shared goal was the manufacture of Sabin’s oral polio vaccine on a scale sufficient for large-scale testing in the Soviet Union. With a KGB operative in tow, the Russians travelled to Sabin’s laboratory in the Cincinnati Children’s Hospital, and Sabin in turn flew to Moscow to continue the brainstorming.  

Two years later, shipments of Sabin’s polio virus strains, packed in dry ice, arrived in the Soviet Union, and shortly thereafter, with the blessing of post-Stalin Kremlin leadership, the mass trials began.  The Sabin vaccine was given to 10 million Russian school children, followed by millions of young adults.  A WHO observer, the American virologist Dorothy Horstmann, attested to the safety of the trials and the validity of their findings.  The oral vaccine has long since stopped polio transmission everywhere in the world except Afghanistan and Pakistan.

Even as the Sabin live-virus vaccine moved toward licensure, Soviet scientists developed a unique process for preserving smallpox vaccine in harsh environments.  With freeze-dried vaccine now available, Viktor Zhdanov, a Soviet virologist and Deputy Minister of Health, boldly proposed to the 1958 meeting of the World Health Assembly, WHO’s governing body, the feasibility of global smallpox eradication.  After the meeting, he did not wait patiently for the WHO to act: he led campaigns both to produce smallpox vaccine and to solicit donations from around the world.[5]  His American colleague-in-arms in promoting freeze-dried vaccine was the public health physician and epidemiologist Donald Henderson, who led a 10-year international vaccination campaign that eliminated smallpox by 1977.[6]

What can we learn from our Cold War predecessors?  The lesson is self-evident: science in the service of public health can be an enclave of consensus, what Dora Vargha, the historian of Cold War epidemics, terms a “safe space” among ideological combatants with the military resources to destroy one another.  The Cold War is long gone, so the safe space of which Vargha writes is no longer between geopolitical rivals with fingers on nuclear triggers.

But America in 2021 is no longer a cohesive national community.  Rather, we inhabit a fractured national landscape that erupts, with demoralizing frequency, into a sociopolitical battle zone.  The geopolitical war zone is gone, but Cold War-type tensions play out in the present.  Right-wing extremists, anti-science Evangelicals, purveyors of a Trump-like notion of insular “greatness” – these overlapping segments of the population increasingly pit themselves against the rest of us:  most Democrats, liberals, immigrants, refugees, defenders of the social welfare state that took shape after the Second World War.  Their refusal to receive Covid-19 vaccination is absorbed into a web of breezy rhetoric:  that they’ll be okay, that the virus isn’t so bad, that the vaccines aren’t safe, that they come to us from Big Government, which always gets it wrong.  Any and all of the above.  In fact, the scientific illiterati are led by their anger, and the anger shields them from relevant knowledge – of previous pandemics, of the nature of a virus, of the human immune system, of the role of antibodies in protecting us from invading antigens, of the action of vaccines on blood chemistry – that would lead them to sequester their beliefs and get vaccinated.

When the last wave of antivaccinationism washed across these shores in the early 1980s, it was led by social activists who misappropriated vaccination in support of their cause.  Second-wave feminists saw vaccination as part of the patriarchal structure of American medicine, and urged women to be skeptical about vaccinating their children, citing the possibility of reactions to measles vaccine among children allergic to eggs.  It was a classic instance of throwing out the baby with the bathwater, which, in this case, meant putting the children at risk because the bathwater reeked of male hubris.  Not to be left out of the antiscientific fray, environmentalists, in an act of stupefying illogic, deemed vaccines an environmental pollutant – and one, according to writers such as Harris Coulter, associated with psychiatric illness.[7]

Matters are now much worse.  Antivaccinationism is no longer aligned, however misguidedly, with a worthy social cause.  Rather, it has been absorbed into a far-reaching skepticism about government which, according to many right-wing commentators and their minions, intrudes in our lives, manipulates us, constrains our freedom of choice, and uses our tax dollars to fund liberal causes.

Even in the absence of outright hostility, there is a prideful indifference to vaccination, partly because it is a directive from Big Government, acting in concert with what is, after all, Big Pharmaceutical Science.  But we have always needed Big Government and Big Science to devise solutions to Big Problems, such as a global pandemic that has already claimed over 560,000 American lives.  Without American Big Government, in cooperation with British Big Government, overseeing the manufacture and distribution of penicillin among collaborating pharmaceutical firms, the miracle drug would not have been available in time for D-Day.  Big Government made it happen.  A decade later, the need for international cooperation transcended the bonds of wartime allies.  It penetrated the Iron Curtain in the wake of global polio and smallpox epidemics that began in 1952 and continued throughout the decade.

The last thing we need now is a reprise of that era’s McCarthyism, when anyone was tainted, if not blacklisted, by mere accusation of contact with communists or communism.  That is, we do not need a nation in which, for part of the population, anything bearing the stamp of Big Government is suspected of being a deception that infringes on some Trumpian-Hobbesian notion of “freedom” in a state of (market-driven) nature.

If you want to make America “great” again, then start by making Americans healthy again.  Throughout the 1960s, the imperative of vaccination overcame the anxieties of American and Soviet officials given to eyeing one another warily atop growing nuclear stockpiles.  They brought the scientists together, and the result was the mass testing that led to the eradication of polio.  Then America rallied around the Soviet creation of freeze-dried smallpox vaccine, and largely funded the manufacture and distribution that resulted in the eradication of smallpox.

Now things are better.  We live in an era in which science enables us to alter the course of a global pandemic.  It is time for antivaccinationists to embrace the science, indeed, to celebrate the science and the gifted scientists whose grasp of it enabled them to create safe and effective Covid-19 vaccines in astonishingly little time.  You’ve got to get your vaccine.  It’s the only way. 


[1] Elena Conis, Vaccine Nation: America’s Changing Relationship with Immunization (Chicago: University of Chicago Press, 2014), 20.

[2] Louis Galambos, with Jane Eliot Sewell, Networks of Innovation: Vaccine Development at Merck, Sharp & Dohme, and Mulford, 1895-1995 (Cambridge: Cambridge University Press, 1995), 96-98, 196-107.

[3] Dora Vargha, “Between East and West: Polio Vaccination Across the Iron Curtain in Cold War Hungary,” Bull. Hist. Med., 88:319-345, 2014; Dora Vargha, “Vaccination and the Communist State,” in The Politics of Vaccination (online pub date: March 2017).

[4] Conis, Vaccine Nation, 27.

[5] Erez Manela, “A Pox on Your Narrative: Writing Disease Control into Cold War History,” Diplomatic History, 34:299-323, 2010.

[6] Peter J. Hotez, “Vaccine Diplomacy: Historical Perspective and Future Directions,” PLoS Neglected Trop. Dis., 8:e3808, 2014; Peter J. Hotez, “Russian-United States Vaccine Science: Preserving the Legacy,” PLoS Neglected Trop. Dis., 11:e0005320, 2017.

[7] The feminist and environmentalist antivaccination movements of the 1980s are reviewed at length in Conis, Vaccine Nation, chapters 5 & 6.

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.

Antivaccinationism, American Style

Here is an irony:  America’s staggering production of generations of scientific brainpower coexists with many Americans’ deep skepticism about science.  Donald Trump, a prideful scientific illiterate, rode to power on the back of many others who, like him, were skeptical about science and especially the role of scientific experts in modern life.  He maintains their allegiance still.

Why does this surprise us?  Anti-intellectualism was burned into the national character early in American history.  Those skeptical of this claim should read Richard Hofstadter’s brilliant twin studies of the 1960s, Anti-Intellectualism in American Life and The Paranoid Style in American Politics.  From the beginning of the American Experiment, democracy was antithetical to so-called European “elitism,” and this ethos gained expression, inter alia, in antebellum medicine.

The Founding Fathers, an intellectual elite in defense of democracy, were not part of the movement away from science.  When Benjamin Waterhouse introduced Edward Jenner’s smallpox vaccine to America in 1800, Washington, Adams, and Jefferson hailed it as the greatest discovery of modern medicine.  They appreciated the severity of smallpox, which had ravaged the Continental Army during the War of Independence.  Indeed, Washington was so desperate to rein in its decimation of his troops that, in 1777, he inoculated his entire army with pus from active smallpox lesions, knowing that the resulting infections would be milder and far less likely to cause fatalities than smallpox naturally contracted.  When Jefferson became president in 1801, he pledged to introduce the vaccine to the American public, because “it will be a great service indeed rendered to human nature to strike off the catalogue of its evils so great a one as the smallpox.” Not to be outdone in support of Jenner’s miraculous discovery, Jefferson’s successor, James Madison, signed into law in 1813, “An Act to Encourage Vaccination.” Among its provisions was the requirement that the U.S. postal service “carry mail containing vaccine materials free of charge.”[1]

But this appreciation of the vaccine was short-lived, and Jefferson’s hope that the value of vaccination would seep into public consciousness was never realized.  In Jacksonian America, the Founding Fathers’ belief that medical progress safeguarded democracy gave way to something far less enlightened:  democracy now meant that everyone could be, indeed should be, his own doctor.  Most Americans had no need for those with university educations, much less clinical experience in governmentally managed public hospitals.  Jacksonian America emerged as what the historian Joseph Kett termed the “Dark Age of the profession.”[2]  During this time, the nation laid claim to a medical elite only because a few monied medical intelligentsia – John Collins Warren, Valentine Mott, Philip Syng Physick, William Gibson, and David Hosack, among them – found their way to European medical centers in London, Edinburgh, and somewhat later, Paris.

Otherwise, it was every man for himself, which usually meant every woman for herself and her family.  Homeopaths, herbalists, Thomsonians, eclectics, hydropaths, phrenologists, Christian Scientists, folk healers, faith healers, uroscopians, chromo-thermalists – each exemplified the democratic mind in action.[3]  Sad to say, homegrown “regular” American medicine of the day, with its reliance on depletive (bleeding, vomiting, purging) and stimulative (alcohol, quinine) treatments, was no better and often worse.  The belief, Galenic in origin, that all diseases were variants of the same global type of bodily dysregulation is startlingly close to Donald Trump’s holistic medieval approach to bodily infection and its treatment.

The birth of scientific medicine in the decades following the Civil War could not still the ardor of America’s scientific illiterati.  The development of animal blood-derived serums (antitoxins), forerunners of modern antibiotics, was anathema to many.  Among them were religionists, mainly Christian, for whom injecting the blood product of a horse or sheep into the human body was not only repugnant but sinful.  Better to let children be stricken with smallpox, diphtheria, and tetanus, sometimes to the point of death, than violate what they construed as divine strictures – strictures, be it noted, not intimated, much less codified, in the body of doctrine of any of the five major world religions.[4]

Antivaccinationists of the early 20th century were an unhappy lot.  They were unhappy about the proliferation of medicines (“biologics”) for treating illness.  And they deeply resented the intrusion of the State into domains of parental decision-making in the form of newly empowered social workers, visiting nurses, and educators.  In fact, antivaccinationism was part and parcel of resistance to all things progressive, including scientific medicine.[5]  Holdovers from the free-wheeling anything-goes medicine of antebellum America – especially devotees of homeopathy and, of late, chiropractic – were prominent in its ranks.    

Now, in the face of a global pandemic no less lethal than the Great Influenza of 1918-1919, we hear the same irrational musings about the dangers of vaccines that animated the scientific illiterati at the turn of the 20th century.  For the foes of public health, any misstep in the manufacture or storage of smallpox vaccine – a much greater possibility over a century ago than today – was enough to condemn vaccination outright.  In 1901, smallpox vaccination of school children in Camden, NJ, led to an outbreak of 100 cases of tetanus, with nine deaths.  Historians believe that, in all probability, the outbreak resulted not from a contaminated batch of vaccine but rather from poor care of the vaccination site.  But Congress accepted the possibility of contamination, and the incident led to passage of the Biologics Control Act of 1902.[6]  Henceforth every manufacturer of vaccine had to be licensed by the Secretary of the Treasury (relying on the PHS Laboratory of Hygiene), and each package of vaccine had to be properly labeled and dated and was subject to inspection.[7]

And this leads to a second irony: the more preventive medicine advanced, incorporating additional safeguards into vaccine production, storage, and administration, the greater the resistance of the illiterati.  Throughout the 20th century and right down to the present, the antebellum notion of science-free “medical freedom” continues to hold sway.  Then and now, it means the right to put children at risk for major infectious disease that could result in death – and the right, further, to pass disease, possibly severe and occasionally fatal, on to others.

It follows that, then and now, the science illiterati are skeptical, if not distressed, by the State’s commitment to public health.  It was Oklahoma Senator Robert Owen’s proposed legislation of 1910 to combine five federal departments into a cabinet-level Department of Public Health that pushed the opponents of medical “tyranny” onward. The Anti-Vaccination League of America, formed in 1908, was joined by the National League for Medical Freedom in 1910.  Eight years later, they were joined by the American Medical Liberty League.  For all three groups, anti-Progressivism was in full swing. “Medical freedom” not only exempted children from compulsory vaccination, but from medical examinations at school.  Further, young adults should not be subjected to premarital syphilis tests. Nor did the groups’ expansive view of medical tyranny flinch in the face of public education about communicable disease: municipal campaigns against diphtheria were to be forbidden entirely. 

With the deaths of the founders of the Anti-Vaccination League (Charles Higgins) and the American Medical Liberty League (Lora Little) in 1929 and 1931, respectively, antivaccinationism underwent a dramatic decline.  The Jacksonian impulse that fueled the movement simply petered out, and by the later ’30s, Americans finally grasped that mainstream medicine was not simply another medical sect.  It was the real deal:  a medicine grounded in laboratory research that effectively immunized against disease, promoted relief and cure of those already infected, and thereby saved lives.

But was the embrace of scientific healing really universal?  A pinnacle of life-depriving anti-science occurred well beyond the 1930s.  Consider the belief of some Christian sects that certain life-saving medical interventions must be withheld from children on religious grounds.  It was only in 1982, 81 years after von Behring received the first Nobel Prize in Medicine for the diphtheria antitoxin that launched the era of serum therapy, that criminal charges were first brought against parents who had withheld necessary treatment from their children.  Of the 58 cases of such parental withholding of care, 55 involved fatalities.[8]  Child deaths among Christian Scientists alone included untreated diabetes (leading to diabetic ketoacidosis), bacterial meningitis, and pneumonia.  Now things are better for the children, since even U.S. courts that have overturned parents’ criminal convictions have come around to the mainstream belief that religious exemption laws are not a defense of criminal neglect – a fine insight for the judiciary to have arrived at more than a century after serum therapy scored major triumphs in the treatment of rabies, diphtheria, tetanus, pneumococcal pneumonia, and meningococcal meningitis.

Should vaccination for the Covid-19 virus be a requirement for attendance in public and private schools?  How can the question even be asked?  As early as 1827, a Boston school committee ordered teachers to require entering students to give evidence of smallpox vaccination.[9]  Statewide vaccination requirements for smallpox followed in Massachusetts in 1855, New York in 1862, Connecticut in 1872, and Pennsylvania in 1895.  And the inoculations were effective across the board.  They quickly brought under control outbreaks of smallpox underway at the time of inoculation, and they prevented recurrences.  These laws and those that followed were upheld by the Supreme Court in 1922 in Zucht v. King.[10]

Twentieth-century vaccines were developed for pertussis in 1914, diphtheria in 1926, and tetanus in 1938.  In 1948 the three were combined and given to infants and toddlers at regular intervals as the DTP vaccine.  There was no hue and cry in 1948 or the years to follow.  And yet, the same fear of vaccination that the New York State Health Department had to overcome when it launched its statewide drive to immunize children against diphtheria now renders a new generation of parents resistant to mandatory Covid-19 vaccination for their own children.

Bear in mind that the anti-science rhetoric of today’s illiterati can be mobilized just as easily to resist DTP or any subsequent vaccine administered to their children.  Why subject a child to DTP vaccination?  Perhaps combining three different vaccines into one injection entails heightened risks.  Perhaps the batch of vaccine in the hands of one’s own doctor has been contaminated.  Perhaps one’s child will be among the minuscule number that have a minor allergic reaction.  And, after all, children who contract diphtheria, pertussis, and/or tetanus will hardly die from their infections, especially with the use of antibiotics.  Why inject foreign matter into healthy infants? – the very argument adduced by the opponents of diphtheria vaccine a century ago.

The problem with antivaccinationist rhetoric in the 21st century is that its proponents are all beneficiaries of more than a century of mandatory vaccination policy.  If they lived in a society bereft of vaccines – or, for the unvaccinated, the immunity conferred by the vast herd of immunes – they would have led very different lives.  Indeed, some would not be here to celebrate solipsism masquerading as individualism.  Their specious intuitions about the risks of vaccination are profoundly anti-social, since they compromise the public’s health.  Parents who decide not to vaccinate their children put the entire community at risk.  The community includes not only their own children, but all those who desire protection but cannot receive it:  children too young to be vaccinated, those with actual medical contraindications to vaccination, and the minuscule number who have been vaccinated but remain unprotected.[11]

Nor is it strictly a matter of providing equal protection to individuals who seek, but cannot receive, the protection afforded by compulsory vaccination.  In a secular society, religious objections to vaccination pale alongside the health of the community.  Whether framed in terms of a “compelling state interest” in mitigating a health threat (Sherbert v. Verner [1963]) or the individual’s obligation to comply with “valid and neutral laws of general applicability” whatever their incidental religious implications (Employment Division, Department of Human Resources of Oregon v. Smith [1990]), the U.S. Supreme Court has consistently held that mandatory vaccination laws need not allow religious exemptions of any kind.

Antivaccinationists might bear in mind a few particulars as they align themselves with the infectious dark ages.  Between 1900 and 1904, an average of 48,164 cases of smallpox and 1,528 smallpox deaths were reported each year. With the arrival of compulsory vaccination in schools, the rate fell drastically and outbreaks of smallpox ended in 1929. The last case of smallpox in the U.S. was reported in 1949.[12]  

Among American children, diphtheria was a major cause of illness and death through 1921, when 206,000 cases and 15,520 deaths were recorded.  Before Emil von Behring’s diphtheria antitoxin became available in 1894 to treat infected children, the death rate among children struck down, especially during the hot summer months, could reach 50%.  Within several years, use of the antitoxin brought it down to 15%.[13]  Then, by the late 1920s, diphtheria immunization was introduced and diphtheria rates fell dramatically, both in the U.S. and other countries that vaccinated widely.  Between 2004 and 2008, no cases of diphtheria were recorded in the U.S.[14]

Between 1951 and 1954, paralytic polio cases in the United States averaged 16,316 a year, of which 1,879 resulted in death.  Then science came to the rescue.  Jonas Salk’s dead-poliovirus vaccine became available in 1955, and Albert Sabin’s live-poliovirus variant four years later.  By 1962, there were fewer than 1,000 cases a year and, in every year thereafter, fewer than 100 cases.[15]

Now, alas, some parents still worry that the measles component of the MMR (measles, mumps, rubella) vaccine available since 1971 may lead to childhood autism.  Why?  Because they have absorbed the disease-promoting mythologies of the illiterati, which must be resisted at all costs.  Autism is a neuro-developmental disorder with a strong genetic component; its genesis is during the first year of life, before the vaccine is even administered.  None of the epidemiologists who have studied the issue has found any evidence whatsoever of an association, not among normal children and not among high-risk children with autistic siblings.[16]  The fact is that children who do not receive a measles vaccine have been found 35 times more likely to contract measles than the vaccinated.[17]  And measles is no laughing matter.  When contracted later in life, measles and mumps are serious and can be deadly.  They were among the major systemic infections that felled soldiers during the Civil War, the Spanish-American War, the Anglo-Boer War, and World War I.[18]

All of which leads to a conclusion in the form of an admonishment.  Accept the fact that you live in a secular society governed by law and a network of agencies, commissions, and departments lawfully enjoined to safeguard public health.  Do your part to sustain the social contract that came into existence when the Founding Fathers, elitists molded by European thought who had imbibed the social contractualism of John Locke, wrote the American Constitution.

Vaccination is a gift that modern science bestows on all of us – vaccination proponents and opponents alike. When one of the two FDA-approved Covid-19 vaccines comes to a clinic or storefront near you, run, don’t walk, to get your and your children’s shots. Give thanks to the extraordinarily gifted scientists at Pfizer and Moderna who created these vaccines and demonstrated their effectiveness and safety. Make sure that everyone’s children grow up, paraphrasing the U.S. Army’s old recruiting slogan, to be all they can be.   


[1] Dan Liebowitz, “Smallpox Vaccination: An Early Start of Modern Medicine in America,” J. Community Hosp. Intern. Med. Perspect., 7:61-63, 2017 (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5463674).

[2] Joseph F. Kett, The Formation of the American Medical Profession: The Role of Institutions, 1780-1860 (New Haven: Yale University Press, 1968), p. vii. 

[3] Robert E. Riegel, Young America, 1830-1840 (Westport, CT: Greenwood Press, 1973 [1949]), pp. 314-315, quoted at 314.

[4] John D. Grabenstein, “What the World’s Religions Teach, As Applied to Vaccines and Immune Globulins,” Vaccine, 31:2011-2023, 2013.

[5] James Colgrove, “‘Science in a Democracy’: The Contested Status of Vaccination in the Progressive Era and the 1920s,” Isis, 96:167-191, 2005.

[6]  Harry F. Dowling, Fighting Infection: Conquests of the Twentieth Century (Cambridge, MA: Harvard University Press, 1977), 38; Harry M. Marks, The Progress of Experiment: Science and Therapeutic Reform in the United States, 1900-1990 (Cambridge: Cambridge University Press, 1997), 73-74.

[7] Jonathan Liebenau, Medical Science and Medical Industry: The Formation of the American Pharmaceutical Industry (Baltimore: Johns Hopkins University Press, 1987), 89-90.

[8] Janna C. Merrick, “Spiritual Healing, Sick Kids and the Law: Inequities in the American Healthcare System,” Amer. J. Law & Med., 29:269-300, 2003, at 280.

[9] John Duffy, “School Vaccination: The Precursor to School Medical Inspection,” J. Hist. Med. & Allied Sci., 33:344-355, 1978.

[10] Kevin M. Malone & Alan R. Hinman, “Vaccination Mandates: The Public Health Imperative and Individual Rights,” in Law in Public Health Practice (2009), 262-284, at 272.

[11] Alan R. Hinman, et al., “Childhood Immunization: Laws that Work,” J. Law, Med. & Ethics, 30(suppl):122-127, 2002.

[12] Frank Fenner, et al., Smallpox and its Eradication (Geneva: World Health Organization, 1988).

[13] Karie Youngdahl, “Early Uses of Diphtheria Antitoxin in the United States,” The History of Vaccines, August 2, 2010 (https://www.historyofvaccines.org/content/blog/…).

[14] Epidemiology and Prevention of Vaccine-Preventable Diseases, 11th Edition (The Pink Book). National Immunization Program, Centers for Disease Control and Prevention (http://www.cdc.gov/vaccines/Pubs/pinkbook/downloads/dip.pdf); Diphtheria. WHO, Regional Office for the Western Pacific (http://www.wpro.who.int/health_topics/diphtheria).

[15] CDC. Annual summary 1980: Reported Morbidity and Mortality in the United States. MMWR 1981;29; CDC, Reported Incidence of Notifiable Diseases in the United States, 1960. MMWR 1961;9.

[16] Frank DeStefano & Tom T. Shimabukuro, “The MMR Vaccine and Autism,” Ann. Rev. Virol., 6:585-600, 2019.

[17] Hinman, op. cit. (note 11).

[18] Paul E. Stepansky, Easing Pain on the Western Front:  American Nurses of the Great War and the Birth of Modern Nursing Practice (Jefferson, NC:  McFarland, 2020), 36, 50, 96, 144.

Copyright © 2020 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in their courses and seminars let him know via info[at]keynote-books.com.

Covid-19 and Trump’s Medieval Turn of Mind

“We ought to give it [hydroxychloroquine] a try . . . feel good about it. That’s all it is, just a feeling, you know, smart guy. I feel good about it.” – Donald J. Trump, March 20, 2020

“I see the disinfectant, where it knocks it out in a minute. One minute. And is there a way we can do something like that, by injection inside or almost a cleaning?  Because you see it gets in the lungs, and it does a tremendous number on the lungs. So, it would be interesting to check that.” – Donald J. Trump, April 23, 2020

“So, supposing we hit the body with a tremendous —  whether it’s ultraviolet or just very powerful light — and I think you said that that hasn’t been checked, but you’re going to test it. And then I said, supposing you brought the light inside the body?” – Donald J. Trump, April 23, 2020

______

Viewed from the standpoint of history of medicine, the Great Influenza (aka the Spanish Flu) of 1918-1919 and the Coronavirus pandemic of today, separated by a century, share a basic commonality. Both are pandemics of the modern era, where treatments for specific diseases grow out of the findings of laboratory science and scientific medicine. The development of serology, which transferred to humans, via injection, whatever antitoxins resided in the purified blood of immune animals, had by 1918 proven effective, albeit to varying degrees, with diseases such as rabies, diphtheria, tetanus, and typhoid fever.

Today we anxiously await the development of a Covid-19 vaccine.  In 1918, health professionals and the public waited impatiently for new serums to combat gas gangrene and the pandemic flu.  And given the state of medical progress of the time – viruses had yet to be identified and differentiated from bacteria – their optimism was reasonable.  By the spring of 1918, 5,000 units of an anti-gangrene serum had reached AEF hospitals in Europe, of which 2,500 units had been used by the time of the Armistice.  For the Spanish Flu, two different injectable serums were available to overseas American nurses by the fall of 1918.

The predictable failure of these serums should not obscure the fact that in 1918 management of the Spanish Flu was squarely in the hands of mainstream scientists and physicians.  President Woodrow Wilson, for his part, stood back from the whirl of suffering and death around him.  He maintained a steely silence about the whole business, refusing to mention the pandemic in even a single public address.  His preoccupation with the war and ensuing Paris Peace Conference was total, and precluded even the simplest expression of sympathy for stricken Americans and their families.  Here he anticipated by a century President Donald Trump.

Wilson held his peace.  Now we behold President Donald Trump, who, in his own preoccupation with self-promotion and self-congratulation, buttressed by denial of the pandemic’s magnitude, cannot remain silent.  Not even for a day.  But what is he telling us?  How is he helping us cope with the fury of another global pandemic?  His musings – contradictory, impulsive, and obsessively self-serving – would have stunned Americans of 1918.  For Trump seems to have dispensed with scientific medicine altogether.  To understand his “spin” on the pandemic, we must go back much further than the Great Influenza and look again at the Black Death of the mid-14th century.

In October, 1347, a vessel, probably originating off the Crimean Peninsula, docked in Messina, Sicily.  It was laden with infected rats, themselves laden with infected fleas.  Together, the rats and fleas brought the Black Death, likely a combination of bubonic and hemorrhagic plague, to Europe.  Physicians of the time, wedded to Hippocratic and Galenic notions of illness and health, confronted plague with the therapeutics derived from this paradigm.  Bleeding (venesection) was typically the first order of business, since blood was associated with body heat.  Bleeding would cool down a body overheated by fever and  agitation, thereby restoring balance among the four humors that corresponded to the four elements of the universe: black bile [earth], yellow bile [fire], phlegm [water], and blood [air].

When bleeding and the regulation of Galenic non-naturals (food and drink, motion, rest, evacuation, the passions) failed to restore health, physicians turned to what was to them an observable fact:  that plague was literally in the winds.  It was contained, that is, in miasmic air that was unbearably foul-smelling, hence corrupt and impure.  For some, the miasma resulted from a particular alignment of the planets; for many others it was pinned on the Jews, a poisonous race, they believed, that sought to poison the air.  But for most European physicians, no less than for priests and laymen, the miasmic air came directly from an enraged God who, disgusted with sinning humankind, breathed down the corrupt vapor to wipe them out.

How, then, were 14th-century physicians to combat a pollution of Divine origin?  Galen came to the rescue, with heat again at the center of plague therapeutics.  Heat, it was known, had the ability to eliminate foul-smelling air, perhaps even lethally foul-smelling air.  What was needed to combat plague was actually more heat, not less.  Make fires everywhere to purify the air.  This was the advice Guy de Chauliac, surgeon to the Papal Court in Avignon, gave Pope Clement VI, whose presumed sanctity did not prevent him from isolating himself from Court and servants and spending his days seated between two enormous log fires.  Among the infected, a more draconian application of heat was often employed:  doctors lanced plague victims’ inflamed buboes (boils) and applied red-hot pokers directly to their open wounds.

Medieval thinking also led to treatments based on Galen’s theory of opposites.  Purities cancel impurities.  If you want to avoid the plague, physicians advised, drink copious amounts of the urine of the non-infected; collecting and distributing healthy urine became a community project throughout the continent.  If you were of means and would rather not drink urine, the ingestion of crushed sapphires would work just as well.

English peasants adopted a more benign path to purification:  they stuffed their dwellings with sweet scented flowers and aromatic herbs.  Here they followed the example of Europe’s plague doctors, those iconic bird-men who stuffed the huge beak extensions of their masks with dried flowers and odoriferous herbs to filter out pestilence from the air they breathed. Good smells, after all, were the opposite of airborne foulness.

A 14th-century plague doctor, dressed to ward off the miasma

On the other hand, in another variation of Galenic thinking, physicians sought a dissonant foulness powerful enough to vanquish the foulness in the air.  Villagers lined up to stick their heads in public latrines.  Some physicians favored a more subtle variant.  They lanced the infected boils of the stricken and applied a paste of gum resin, roots of white lilies, and dried human excrement.  The synergism among the ingredients, they believed, would act as a magical restorative.  This, in any event, was the opinion of the eminent Italian physician Gentile da Foligno, whose treatise on the Black Death was widely read and who, inter alia, was among the first European physicians to study plague victims by dissecting their corpses.  Needless to say, the treatment did him no good, and he died of plague in 1348.  Other physicians developed their own topical anodynes.  Snakes, when available, were cut up and rubbed onto a plague victim’s infected boils.  Pigeons were cut up and rubbed over the victim’s entire body.

Now, 672 years after the Black Death wiped out more than 40% of world population, we behold an astonishing recrudescence of the medieval mind:  we are led through a new plague by a presidential medievalist who “feels good” about nonscientific remedies based on the same intuitive search for complementarities and opposites that medieval physicians proffered to plague patients in the mid-14th century.  Heat kills things; heat obliterates atmospheric impurities; heat purifies.  Perhaps, then, it can rid the body of viral invaders.  Disinfectants such as bleach are microbe killers.  We wipe a counter top with Clorox and rid it of virus.  Can’t we do the same thing by injecting bleach into the human body?  What bleach does to healthy tissue, to internal organs, to blood chemistry – these are science questions an inquiring 8th grader might put to her teacher.  But such questions could not occur to a medieval physician or to Donald Trump.  They simply fall outside the paradigm of Galenic medicine in which both operate.  In this world, with its reliance on whole-body states calling forth whole-body, re-balancing interventions, there is no possibility of weighing the pros and cons of specific treatments for specific ailments (read: different types of infection, local and systemic).  The concept of immunological specificity is literally unthinkable.

Injecting or ingesting bleach has an internal logic no greater than that of the 14th-century Flagellants, who roamed across continental Europe in a frenzy of penitential self-abuse that left them lacerated if not dead.  It made perfectly good 14th-century sense – though not, be it noted, to Clement VI, who condemned the practice as heretical.  Withal, the Flagellants believed that self-mortification and the excruciating pain it entailed could assuage a wrathful God and induce Him to stop blowing death down on humankind.  But science belied their self-purifying intentions. The roving Flagellants, leaving paths of infected blood and entrails behind them, became a vector for the transmission of plague.  For our medieval president, the path is one of toxic verbal effluvia no less dangerous than infected blood and entrails in spreading Covid-19.

We want to believe that no one living in 2020 can possibly lend credence to anything Trump has to say about infectious illness, virology, pandemics, scientific research, and post-medieval medicine.  When it comes to Covid-19, he is an epistemic vacuity whose medieval conjectures would never make it past the family dinner table or the local bar.  But he is the president, and he speaks with the authority of high office.  So his musings, grounded in Galenic-type notions and feelings, have an a priori valence.  As such they will continue to lead many astray – away from prudent safeguards, away from mainstream medicine, indeed, away from an appreciation of the scientific expertise that informs these safeguards and treatments.

Hippocratic-Galenic medicine, with its notions of balance, synergy, complementarity, and opposites, retains its appeal to many.  But prescientific, feeling-based intuitions about disease are always dangerous, and positively deadly in a time of global pandemic. In the aftermath of Trump’s pronouncement about the logic of injecting  household disinfectants to combat Covid-19, poison control centers across the country were flooded with inquiries about the advisability of imbibing household bleach.  As to hydroxychloroquine, “More Deaths, No Benefit from Malaria Drug in VA Virus Study,” reported AP News on the first use of hydroxychloroquine in a small-scale nationwide study of VA patients hospitalized with Covid-19.

Is this surprising?  Whether or not hydroxychloroquine or any other drug or household disinfectant or chopped up animal remains is safe and effective against Covid-19 is an empirical question subject to laboratory research and clinical study.  But who exactly sets the agenda?  Who, that is, decides which existing pharmaceuticals or household products or smashed animal parts are worthy of scientific investigation?  Experts with knowledge of pharmacology, infectious disease, and virology or an intellectually null and void president for whom science matters only as a handmaiden to political objectives?  Pity those who follow him in his medieval leap of faith.

By fanning the flames of Hippocratic-Galenic notions about heat, light, the neutralizing effect of opposites, the shared efficacy of substances with complemental or analogical properties, Trump himself has become a vector for the transmission of plague. Bleach kills microbes on a counter top.  Shouldn’t it therefore kill the Covid-19 virus in the human body?  Hydroxychloroquine kills the protozoan parasite Plasmodium that causes malaria.  Shouldn’t it therefore kill Covid-19 viruses within the human body?  Wouldn’t a really “solid” seasonal flu vaccine provide people with a measure of resistance to Covid-19?  No, no, and no. Would that Mr. Trump would “feel good” about a more benign medieval variant, perhaps donning a garland of garlic cloves at press briefings.  Better still, following the example of the plague doctors, he could wear a mask in public, if only to satisfy those of us whose heads are not buried in medieval muck.  Given the clear and present danger of his treatment preferences to public health, however, we would be best served if he were simply muzzled until election day.


Copyright © 2020 by Paul E. Stepansky.  All rights reserved.  The author kindly requests that educators using his blog essays in their courses and seminars let him know via info[at]keynote-books.com.

The Picture They Wouldn’t Publish


Here is the picture the publisher, McFarland & Co., refused to include in Easing Pain on the Western Front, my study of the American and Canadian nurses of World War I.  It was taken in a French field hospital in 1918, and shows a Red Cross nurse clasping the hand of a French soldier, a poilu, most of whose face has been blown off by artillery. She gazes outward stoically, as if to say: “Yes, I am a nurse, and this is the kind of boy the ambulances bring to us during the rushes.  Soon the surgeons will decide whether he will live and whether they can do anything for him.  For now I hold his hand.”  A more junior sister stands beside her and bears witness.

Why did the publisher exclude this powerful image from a book about the nurses of the Great War?  I was told the problem was one of image quality, viz., the picture did not reproduce at 300dpi and was pixelated.  So it had to go.

Earlier I had made the case for including WWI photos of nurses in action that did not meet the publisher’s quality threshold in this way:

Not all books are created equal; not all pictures are created equal; and not all pictures are equally relevant to the story an author tells.  Criteria of inclusion should accommodate, at least to some extent, subject matter and narrative.  In the context of this particular book, I find it difficult to imagine anyone finding fault with the publisher for including rare and graphic images, never seen by anyone except a handful of medical historians, that illustrate – and do so poignantly – my central arguments about the emergence of scientific nursing practice on the Western front.  Certainly, should McFarland choose to allow a small number of lower resolution images, I would gratefully acknowledge the publisher’s kind accommodation in the interest of a more vivid historical presentation.

With the photo at issue, however, my efforts were unavailing.  After all, the publisher opined, it would be criticized, presumably by reviewers, for including such a low-resolution photo in one of its books.  Really?  If any of my readers know of a single instance in which a book publisher has been criticized in print for including a rare, low-resolution period photo in a work of history, please let me know.

Now, the photo is graphic and unsettling.  Was the subject of the photo the underlying reason the publisher excluded it from the book?  Probably not.  I do think, however, that the horrific image of a faceless soldier fortified the publisher’s resolve to exclude it on “low resolution” grounds.  But it is precisely the nature of the picture – what it shows us – that speaks to the subject of this particular book.  I sought its inclusion not as yet another depiction of the horror of modern mechanized warfare, a reminder of what exploding artillery shells do to human faces at close range.  Rather, the photo provides poignant visual representation of a Great War nurse in action, of the steadiness and steadfastness with which she faced up to the care of the faceless.

Even more to the point, the photo shows one key instrument deployed by nurses in this war and, to some extent, all wars: the hand.  We behold the nursing hand as an instrument of stabilizing connection, of calming containment.  Easing Pain examines the many uses of the nurse’s hand in diagnosis and treatment.  To these uses must be added the hand as an instrument of touch-mediated attachment.  Seen thus, the photo is a wartime embodiment in extremis of touching and being touched as a vehicle of therapeutic “holding” in the sense of the British pediatrician and psychoanalyst D. W. Winnicott.  The role of the hand in nursing care antedates and postdates the era of nursing professionalization.  I explore the topic at length in my previous book, In the Hands of Doctors:  Touch and Trust in Medical Care.

Here then is the context in which the photo would have been introduced.  References to the nurses quoted therein may be found in the book.

_______________

Nursing hands also monitor the nurse’s own performance, especially the acclimatization of new nurses to the demands of the reception hut.  Shirley Millard reports how her hands “get firmer, faster.  I can feel the hardness of emergency setting in.  Perhaps after a while I won’t mind.”  More importantly, nursing hands stabilize soldiers whose fear and pain off the battlefield leave them overwhelmed and child-like.  With soldiers who arrive at casualty clearing stations in surgical shock, massive blood loss is compounded by sepsis, pain, and anxiety, making it incumbent on nurses not only to institute stabilizing measures, but to make the soldier feel “he is in good and safe hands.”  Touch is a potent instrument for inducing this feeling.  Soldiers clutch hands as they ask, “Is it all right?  Don’t leave me.”  But it is usually not all right, and it is the nurse’s hand that provides a lifeline of human attachment to relieve a desolation that is often wordless:  “Reaching down to feel his legs before I could stop him, he uttered a heartbreaking scream.  I held his hand firmly until the drug I had given him took effect.”  When panic overwhelms and leaves soldiers mute, the hand communicates what the voice cannot:  “He seized my hand and gripped it until it hurt . . . He looked up at me desperately, hanging onto my hand in his panic.”  The hand offers consolation when there are no words:  “The bandage around his eyes was soaked with tears.  I sat on his bed and covered his hand with mine.”

The nurse’s hands mark attachment and impending loss.  Soldiers become terrified at the time of surgery.  The reality of amputation, the painful aftercare it will entail, and the kind of life it permits thereafter can be overwhelming.  It is 1915, and the American Maud Mortimer is in a field hospital at the edge of Belgium, only five miles from the firing line.  A patient with whom she has connected, “Petit Père,” is about to have his leg amputated.  He makes her promise that when he comes around from the anesthesia she will be there, and that she will “hold his hand through the first most painful dressing.”  The amputation complete, he gazes up at her:  “Hold my hand tight and I will scream no more.”

But the attachment can transcend treatment-related trauma and become perduring.  Now it is April 1918, three years later, and a pause in the action permits Helen Boylston’s hospital to ship 26 ward patients to England.  One of her patients, Hilley, begs her to let him remain.  “I went out to the ambulance with him,” she recounts, “and he clung tightly to my hand all the way.  I almost cried.”  Such separation, with the hand-clinging it elicits, reminds us that a wounded soldier’s parting from his nurses can be a major loss, even when it is a prelude to greater safety and fuller recovery.  The vigorous hand-clinging of the living, even in loss, is far preferable to the enfeebled squeeze of the dying.  With the latter, the nurse’s hand becomes an instrument of palliation, interposing human touch between living and dying, easing the transition from one to the other:  “I held his hand as he went . . .  Near the end he saw me crying and patted my hand with his two living fingers to comfort me.”  Expressions of gratitude and affection, hand-communicated, are part of the process.  The hand continues to communicate as the body shuts down:  “He was ever so good and tried to take milk and food almost up to the end but he was unable to speak and not really conscious, though he could hold my hand and squeeze it which was so sweet of him.”

________________

EASING PAIN ON THE WESTERN FRONT

American Nurses of the Great War and the Birth of Modern Nursing Practice

Paul E. Stepansky

McFarland & Co.    978-1476680019    2020    244pp.    19 photos    $39.95pbk/$19.95 Kindle eBook

Available now from Amazon

 


 


JUST RELEASED: Easing Pain on the Western Front (McFarland, 2020)

We are pleased to announce the publication of Easing Pain on the Western Front: American Nurses of the Great War and the Birth of Modern Nursing Practice, the book that has grown out of Paul Stepansky’s popular series of blog essays, “Remembering the Nurses of WWI.”  It has been lauded as “an important contribution to scholarship on nurses and war” (Patricia D’Antonio, Ph.D.) that is simultaneously “the gripping story of nurses who advanced their profession despite the emotional trauma and physical hardships of combat nursing” (Richard Prior, DNP).  The Preface is presented to the readers of “Medicine, Health, and History” in its entirety.

____________________

Dr. Stepansky is the featured author in the Princeton Alumni Weekly

Listen to him discuss Easing Pain on the Western Front with the editor of the Journal of the American Association of Nurse Practitioners in this special JAANP Podcast

____________________

PREFACE

Studies in the history of nursing have their conventions, and this study of American nursing in World War I does not adhere to them.  It is not an examination of the total experience of military nurses during the Great War.  Excepting only the first chapter, which addresses the American war fever of 1917 and the shared circumstances of the nurses’ enlistment, little attention is paid to aspects of their lives that have engaged other historians.  I do not review the nurses’ families of origin, their formative years, or their reasons for entering the nursing profession.  Similarly, there is relatively little in these pages of the WWI nurses as women, of their role in the history of the women’s movement, or of their personal relationships, romantic or otherwise, with the men with whom they served.

In their place Easing Pain on the Western Front focuses on nursing practice, by which I mean the actual caregiving activities of America’s Great War nurses and their Canadian and British comrades.  These activities comprise the role of nurses in diagnosis, in emergency interventions, in medication decisions, in the use of available technologies, and in devising creative solutions to treatment-resistant and otherwise atypical injuries.  And they include, in these several contexts, nurses’ evolving relationship with the physicians alongside whom they worked.  Among historians of nursing, Christine Hallett stands out for weaving issues of nursing practice into her excellent accounts of the allied nurses of WWI, with a focus on the nurses of Britain and the British Dominions.  Hallett has no counterpart among historians of American nursing, and as a result no effort has been made to gauge the impact of WWI on the trajectory of nurse professionalization in America, both inside and outside the military.

This study begins to address this lacuna from a perspective that stands alongside nursing history.  It comes from a historian of ideas who works in the history of American medicine.  It is avowedly historicist in nature, grounded in the assumption that nursing practice is not a Platonic universal with a self-evident, objectivist meaning.  Rather this type of practice, like all types of practice, is historically determined, with the line between medical treatment and nursing care becoming especially fluid during times of crisis.  It is intended to supplement the existing literature on WWI nurses, especially the excellent work of Hallett and other informative studies of British, Canadian, and Australian nurses, respectively.  The originality of the work lies in its focus on American nurses, its thematic emphasis on nursing activities, and its argument for the surprisingly modern character of the latter.  Close study of nursing practice, especially in the context of specific battlefield injuries, wound infections, and infectious diseases, yields insights that coalesce into a new appreciation of just how much frontline “doctoring” these nurses actually did.

The case for modern nursing practice in WWI is strengthened by comparative historical inquiry that renders the study, I hope, a more general contribution to the history of military nursing and medicine.  In each of the chapters to follow, I work into the narrative comparative treatment of WWI nursing with nursing in the American Civil War of 1861-1865, the Spanish-American War of 1898, and/or the Anglo-Boer War of 1899-1902.  The gulf that separates Great War nursing from that of wars only two decades earlier, we will see, is wide and deep.  In the concluding chapter, I invert the case for modernity by looking forward from America’s Great War nurses to the American nurses who served in World War II and Vietnam.  If Great War nurses had little in common with the Civil War nurses who preceded them by a half century, they share a great deal, surprisingly, with their successors in Vietnam a half century later.

The focus on nursing practice, then, far from being restrictive in scope, opens to a wide range of issues – medical, cultural, political, and military. Consider, for example, the very notion of healthcare practice, which is determined by a confluence of factors.  Specific theories of disease, rationales for treatment of specific conditions, putative mechanisms of cure, and the grounds for “proving” cure are all central to the historical study of healthcare practice.  Female nursing practice, with the bodily intimacies it entails, also implicates considerations of gender – of what, keeping to the time frame of this study, early-twentieth-century female hands might do to male bodies, and what the males who “owned” these bodies might comfortably permit female hands to do.  Depending on the historical location of nursing practice, issues of social class, nationality, ethnicity, and race may count for as much or more than gender.

By channeling our gaze onto what nurses of the Great War actually did with wounded, suffering, distraught, and dying soldiers, we learn about the many factors that enter into combat nursing at one moment in modern history.  The focus on nursing practice provides a new perspective on the medical advances of the Great War; the role of nurses in making and implementing these advances; the new professional status that accompanied this process; and the American military’s emerging appreciation of trained nurses, indeed of female officers in general.  The focus on nursing practice draws a different picture of the evolution of military personnel policy and the changing social mores and political pressures that accompanied this evolution.

The fate of WWI combat nurses’ role back on the homefront in the aftermath of war is a separate story that I address but briefly in the concluding chapter.  More work should be done on the relationship between military and civilian nursing practice, especially the fate of combat nurses who return to civilian nursing after wartime service.  To keep to the subject matter of this study, the experience of America’s WWI nurses back in civilian hospitals offers an illuminating window into the frustrations and accomplishments of nurses, indeed of professional women in general, in the American workplace in the two decades between the world wars.

For American readers Easing Pain on the Western Front may prove interesting for another reason.  Our increasing reliance on nurses to meet the health care challenges of the twenty-first century, especially in the realm of primary care, underscores the relevance of the unsuspectedly modern Great War nurse providers.  Indeed, they offer a fascinating point of departure for ongoing debates by nurses, physicians, social scientists, and politicians about the scope of practice of nurse practitioners in relation to physicians.  This is because America’s Great War nurses, no less than the nurses of other combatant nations, had to step up during battle “rushes” that overwhelmed their surgeon colleagues.  At such times, and in the weeks and months of intensive care that followed, they became autonomous clinical providers, true forebears of the nurse practitioners and advanced practice nurses of the present day.  The fact that their professional leap forward occurred in understaffed casualty clearing stations and field hospitals on the Western front during the second decade of the twentieth century lends salience to their accomplishments.

Notwithstanding the many streams of contingency that flow into nursing practice at a given moment in history, I hasten to point out that I am not a historian of gender and that my comments on gender, not to mention ethnicity and race, are sparing.  I invoke them only in the context of specific nursing activities, especially when they were raised by the nurses themselves.  The same may be said of the nurses’ personal lives.  I ignore neither the emotional toll of combat nursing nor the psychological adaptations to which it gave rise.  But here again these issues are addressed primarily in relation to nursing activities, especially the nurses’ perceptions of and reactions to the wounded, ill, and dying they cared for.  I leave to others more comprehensive study of the gender-related, racial, and psychological aspects of American military nursing in World War I and other wars, noting only that the scholarship of Darlene Clark Hine, Margaret Sadowski, and Kara Dixon Vuic has begun to mine this rich vein of nursing history to great effect.

The cohort of nurses at the heart of this study is not limited to American nurses.  They include Canadian and British nurses as well.  In the case of the former, the ground for inclusion is not especially problematic, since many Canadian nurses trained in the United States; indeed, some both trained and worked in the States prior to their wartime service.  Ella Mae Bongard, for example, a Canadian from Picton, Ontario, trained at New York Presbyterian Hospital, practiced in New York for two years after graduating in 1915, and then volunteered with the U.S. Army Nursing Corps.  She ended up at a British hospital in Étretat, where she served with several of her Presbyterian classmates.  The Canadian Alice Isaacson, who served with the Canadian Army Medical Corps, was a naturalized American citizen.  Among other members of the Canadian Nurse Corps, Mary Catherine Nichols Gunn trained in Ferrisburg, VT, and initially worked at Nobel Hospital outside Seattle; Annie Main Gee trained at Minneapolis City Hospital with postgraduate studies at New York Polyclinic; and Eleanor Jane McPhedran trained at New York Hospital School of Nursing and worked in the area for three years after graduating.  In all, the training of Canadian nurses and the nursing services they provided were very much in line with American nursing.

In the case of several prominent British nurses cited throughout the work – Kate Luard, Edith Appleton, Dorothea Crewdson – I am arguably on less certain ground.  British nurses – veiled and addressed as “Sister” – cannot, strictly speaking, play a role in American nursing practice during the Great War.  I include them nonetheless for several reasons.  The fact is that many British nurses, no less than the Canadians, served abroad for the duration of the war; usually their wartime service extended beyond the Armistice of November 1918 by up to a year.  The duration of their wartime experience makes their diaries and letters, taken en masse, more revelatory of treatment-related issues than those of their American colleagues, whose term of service was a year or less.  The reflections of Canadian and British nurses on nursing practice on the Western front lend illustrative force to the same battlefield injuries, systemic infections, and psychological traumata encountered by the American nurses as well.

Shared nursing practices were reinforced by considerable interchange among the allied nurses.  Within months of the outbreak of war, American Red Cross (ARC) nurses, all native born and white per ARC requirements, sailed to Europe to lend a hand.  On September 12, 1914, the first 126 departed from New York Harbor on a relief ship officially renamed Red Cross for the duration of the voyage.  A second group of 12 nurses joined three surgeons on a separate vessel destined for Serbia.  Other nurse contingents followed over the next several months, all part of the American Red Cross’s “Mercy Mission.”

Technically ARC nurses were envoys of a neutral nation, and those in the initial group ended up not only in England and France but in Russia, Austria-Hungary, and Germany as well.  In Germany the ARC worked in concert with the German Red Cross, and American nurses like Caroline Bauer, stationed in Kosel in 1915, expressed genuine fondness for the “brave and good” German soldiers under her care.  For the majority of nurses, however, pre-1917 service in British and French hospitals, despite some initial tensions with British supervisors, reinforced ideological and emotional bonds and introduced American nurses to the realities of combat nursing on the Western front.

Even after America’s entry into the war in 1917, American nurses were typically assigned on arrival to British or Canadian hospitals, where they continued their tutelage under senior Canadian and British nursing sisters until returning to their units once their hospitals were ready for them.  Allowing for occasional exceptions, the same medicines (sometimes with different names) were administered, the same procedures performed, and the same technologies employed by the nurses of the allied nations.  To ignore what the Canadian and British nurses have to say about the same issues of nursing practice encountered by the Americans would enervate the study without leading to any refinement of its thesis.

And so, aided by the testimony of Canadian and British nurses,  I am secure in my thesis as it pertains to American nursing and the birth of modern nursing practice.  That being said, I leave it to scholars more knowledgeable than I about Canadian and British nursing history to validate, amend, or reject the thesis in relation to their respective nations.

Finally, it bears noting that in a work about nursing practice that draws on the recollections of a cohort of American, Canadian, and British nurses, each nurse is very much her own person with a personal story to tell.  The memoirs, letters, and diaries that frame this study provide elements of these stories, of how each nurse’s experience interacted with her family history, training, personality, temperament, and capacity for stress management.

In a general way, reactions to the actualities of nursing on the Western front fall along a spectrum of psychological and existential possibilities.  At one pole is the affirmation of the combat nursing life provided by Dorothea Crewdson: “I enjoy life here very much indeed. Wonderfully healthy and free.”  At the opposite pole are the mordant reflections of the writer Mary Borden, for whom “The nurse is no longer a woman.  She is dead already, just as I am – really dead, and past resurrection.”  The gamut of reactions, and the richly idiomatic language through which they were expressed, are woven into my narrative at every turn.  I am most concerned, however, with the nurses’ transition from one mindset to another, especially the abruptness with which the life-affirming brio of Crewdson gave way to the horror, demoralization, and depersonalization of Borden.  The happy excitement and prideful sense of participation in the war effort with which American nurses set out for the front often dissipated shortly after they arrived and saw the human wreckage that would be the locus of their “nursing.”

Signposts of personal transformation, which I gather together as epiphanies, represent my point of departure in chapter 1.  But in the chapters that follow, these elements of personal biography are subordinate to my focus on nursing practice through a cohort analysis.  Fleshing out the individual stories that undergirded such transformations – the chronicles of strong, often overpowering emotions that took nurses to the point of physical or nervous collapse – is the stuff of biography and falls beyond the task I have set myself.  It suffices to recall that nurses, no less than the soldiers they cared for, fell victim to what, in the parlance of the war, was termed “shell shock,” even though medical and military personnel steadfastly refused to pin this label on them.  But most of the time the nurses’ descent into the horrific gave rise to adaptive strategies – compartmentalization, dissociation, psychic numbing, black humor – that enabled them to labor on in the service of their soldiers, their “boys.”

It is with respect to the nurses’ shared ability to bracket their personal stories in the service of a nascent professionalism – a professionalism that segued into medical diagnosis and procedural caregiving far removed from the world of their training and prewar experience – that they reached their full stature.  In so doing, they provide an historical example, deeply moving, of the kind of self-overcoming for which we reserve the term “hero.”

_______________

EASING PAIN ON THE WESTERN FRONT

American Nurses of the Great War and the Birth of Modern Nursing Practice

Paul E. Stepansky

McFarland & Co.     978-1476680019     2020     244pp.     19 photos     $39.95pbk.

Available now from Amazon 

 

 


 

 

 

Telemedicine Rising

In a “Viewpoint” published in JAMA a month ago,[1] Michael Nochomovitz and Rahul Sharma suggest that the time has come to create a new medical specialty: virtual medicine.  Extrapolating from the manner in which medical specialties have traditionally arisen (viz., “by advances in technology and expansion of knowledge in care delivery”), they submit that telemedicine has advanced to the point of providing the basis for a new kind of specialty care.  Telemedicine, as they define it, comprises various web-based telecommunications modalities, among them social media, teleconferencing, and video face-to-face communications with patients.  They place before us medical virtualists, physicians who “will spend the majority or all of their time caring for patients using a virtual medium.”

Unlike today’s physicians, who make use of this or that “virtual medium” haphazardly and without formal training, the virtualist will achieve a set of “core competencies” through formal training.  Their curriculum for certification, according to the authors, should include “knowledge of legal and clinical limitations of virtual care, competencies in virtual examination using the patient or families, ‘virtual visit presence training,’ inclusion of on-site clinical measurements, as well as continuing education.” Among the techniques in their arsenal will be those aimed at achieving “good webside manner” (authors’ italics).

Now, far be it from me to discourage the use of remote technologies to render health care delivery more efficient and especially to bring primary care to underserved communities.  The value of “remote surgery” that ranges from telementoring and remote guidance to actual robotic operations is well-documented.  But is a new medical virtualism specialty really in our best interest?  Certainly, telemedicine will play an increasing role in medicine; the question is whether this “role” should become the basis of a bounded specialty.  This would make the medical virtualist the first medical practitioner whose practice excluded (or drastically marginalized) in-person contact with patients, making it radically different from nonpractice specialties such as pathology or diagnostic radiology.

It is problematic especially in the cognitive specialties.  We would have a subspecies of primary care doctors who specialized in care that was patient-uncentered, i.e., that was premised on the self-sufficiency of piece-person care as opposed to whole-person care.  The proposal takes the current fragmentation of care among subspecialists and refashions it into a virtue.  That is, we will have virtuous physicians who only practice virtual medicine and feel good about doing so.  Such care differs from subspecialty care in a key respect:  we typically see our subspecialists in the flesh.  We can ask them questions, demand explanations, and criticize them for not giving us the time and attention we seek.  In the absence of adequate time and attention, we can seek out a different subspecialist who is more patient-centered and welcoming.   With the medical virtualist, on the other hand, dehumanization is integral to the specialty itself.  The patient has no recourse; he is outside the virtualist’s purview altogether.

It is striking that the issue of patient trust is nowhere mentioned in the article, even though empirical research suggests that trust is the “basic driver” of patient satisfaction.  It has been linked to less treatment anxiety, greater pain tolerance, and greater compliance.[2]  But the authors subordinate all such issues to their focus on efficiency and ease of use.  As such, their case rests on the assumption that informed the patient rights movement of the 1970s and ’80s:  that patients are simply consumers in search of a commodity.  Now, a half century after passage of the Patient’s Bill of Rights, the commodity is increasingly mediated by technology.[3]  And “the success of technology-based services,” according to the authors, “is not determined by hardware and software alone but by ease of use, perceived value, and workflow optimization.”  The need to humanize the delivery of technology, to convey to the patient some sense of what I have termed “caring technology,”[4] falls outside a conversation framed in terms of consumerist values.

But once we factor trust into the equation, we open a can of worms.  For patient trust implicates the doctor’s touch, which includes both the laying on of hands and the implementation of office-based procedures.  It also implicates human qualities such as caring, empathy, and the willingness to tolerate ambiguity.  Finally, it puts us in contact with the  Hippocratic Oath, in which ethical obligations revolve entirely around physicians treating patients who are full-fledged human beings, fellow sufferers.  This is why Jennifer Edgoose and Julian Edgoose, writing in a recent issue of Annals of Family Medicine about “Finding Hope in the Face-to-Face,” begin with this sentence:  “The daily work of clinicians is conducted in face-to-face encounters, whether in exam rooms, homes, or alongside hospital beds, but little attention has been paid to the responsibilities and ethical implications generated by this dimension of our relational work.”[5]  Among these implications is the physician’s obligation not merely to be an instrument of diagnosis and treatment, but also to contain the patient’s “wounded humanity” in the sense of Pellegrino.[6]

I wrote In the Hands of Doctors precisely to explore, both historically and in the present, this dimension of physicians’ “relational work,” including the better and worse ways in which it can appropriate technologies that are not only sought after by patient-consumers, but viewed as remote and intimidating by patient-persons.  Physicians who know their patients as wounded and vulnerable can humanize technology by pulling it into a trusting doctor-patient relationship.

These thoughts are a counterpoise to the authors’ brief for “the medical virtualist.”  Their proposal is provocative and troubling.  It inverts figure and ground, so that telemedicine, heretofore an adjunct to face-to-face care, becomes the ground of a specialty in which face-to-face care is incidental at best.  In the domain of primary care, it segues into the philosophical question of who or what primary care virtualists are being trained to care for.  Can one be a primary care physician of any type and care for some “thing” other than whole persons?  The status of virtualism in surgical specialties is no doubt different.

I invite others to reply to this posting with their thoughts on a topic that will only grow in importance in the years ahead.

_______________________

[1] Michael Nochomovitz & Rahul Sharma, “Is It Time for a New Medical Specialty?  The Medical Virtualist,” JAMA, 319:437-438, 2018.

[2] Paul E. Stepansky, In the Hands of Doctors: Touch and Trust in Medical Care (Montclair: Keynote, 2017), 21 and references cited therein.

[3] Stepansky, In the Hands of Doctors, 133-135.

[4] Stepansky, In the Hands of Doctors, 82-98.

[5] Jennifer Y. C. Edgoose & Julian M. Edgoose, “Finding Hope in the Face-to-Face,” Ann. Fam. Med., 15:272-274, 2017.

[6] E. D. Pellegrino, Humanism and the Physician (Knoxville:  University of Tennessee Press, 1979), 124, 146, 184, and passim.

 

The Politics of Medical Freedom

Winner of an Independent Publisher Book Awards Bronze Medal for 2017, Paul Stepansky’s In the Hands of Doctors:  Touch and Trust in Medical Care is now available in paperback and as an eBook.  For the paperback edition, Stepansky has written a new preface, a stirring defense of Obamacare as a path to universal health care in America.  It is given here in its entirety in appreciation of the readers of “Medicine, Health, and History.”

~

In the Hands of Doctors:  Touch and Trust in Medical Care

Preface to the Paperback Edition
Copyright © 2017 by Paul E. Stepansky.  All rights reserved.

~

 In our time, political speech and writing are largely the defense of the indefensible.
                                         — George Orwell, “Politics and the English Language” (1946)

Now, less than a year after publication of In the Hands of Doctors, the Patient Protection and Affordable Care Act of 2010 (aka Obamacare), which I roundly endorse in this book, is gravely imperiled.  Congressional Republican legislators have joined a Republican President in a commitment to repeal the bill that has provided health insurance to over 20 million previously uninsured Americans.  The legislation thus far presented by the U.S. Senate to replace it (the “Better Care Reconciliation Act of 2017”) would, according to the Congressional Budget Office, leave 15 million Americans uninsured in 2018 and 22 million by 2026.  Proposed cuts and caps to the Medicaid budget, which are part of the legislation, would, according to the CBO, decrease enrollment in the program by 16% over the next decade.  In brief, these cuts and caps would jeopardize the health and well-being of the one in five Americans and one in three American children dependent on the support provided by Medicaid.  Disabled and other special-needs children as well as elderly nursing home residents would suffer the most.  A Congressional vote simply to repeal Obamacare absent new legislation would have even more catastrophic consequences.
      Congressional opponents of the Affordable Care Act, no less than President Donald Trump, appear to live in a hermetically sealed bubble that makes only grazing contact with the socioeconomic ground below.  They share space in the bubble with colliding political abstractions that they grasp, one after the other, and radio back down to earth.  The political bubble dwellers offer us yet again the palliatives of context-free “medical choice” and “medical freedom” as remedies for the real-world suffering addressed by Obamacare.
     But these terms, as used by politicians, do not speak to the realities of American health care in 2017.  Rather, they hearken back to the era of the Founding Fathers, when issues of health, illness, freedom, and tyranny were much simpler.  Freedom, as the founders understood it, operated only in the intertwined realms of politics and religion.  How could it be otherwise?  Medical intervention did not affect the course of illness; it did not enable people to feel better and live longer and more productive lives.  With the exception of smallpox inoculation, which George Washington wisely made mandatory among colonial troops in the winter of 1777, governmental intrusion into the health of its citizenry was nonexistent, even nonsensical.
     Until roughly the eighth decade of the nineteenth century, you got sick, you recovered (often despite doctoring), you lingered on in sickness, or you died.  They were the options.  Medical freedom, enshrined during the Jacksonian era, meant being free to pick and choose your doctor without any state interference.  So liberty-loving Americans picked and chose among mercury-dosing, bloodletting “regulars,” homeopaths, herbalists, botanical practitioners, eclectics, hydropaths, phrenologists, Christian Scientists, folk healers, and faith healers.   State legislatures stood on the sidelines, rescinding or gutting medical licensing laws and applauding the new pluralism.  It was anarchy, but anarchy in the service of medical freedom of choice.
     Now, mercifully, our notion of medical freedom has been reconfigured by two centuries of medical progress.  We don’t just get sick and die.  We get sick and get medical help, and, mirabile dictu, the help actually helps.  In antebellum America, deaths of young people under 20 accounted for half the national death rate, which was more than three times the death rate today.  Now our children don’t die of smallpox, cholera, yellow fever, dysentery, typhoid, and pulmonary and respiratory infections before they reach maturity.  When they get sick in early life, their parents take them to the doctor and they almost always get better.  Their parents, on the other hand, especially after reaching middle age, don’t always get better.  So they get ongoing medical attention to help them live longer and more comfortably with chronic conditions such as diabetes, coronary heart disease, inflammatory bowel disease, Parkinson’s, and many forms of cancer.
     When our framers drafted the Constitution, the idea of being free to live a productive and relatively comfortable life with long-term illness did not compute.  You died from diabetes, cancer, bowel obstruction, neurodegenerative disease, and major infections.  Among young women,  such infections included the uterine infection that routinely followed childbirth.  A major heart attack simply killed you.  You didn’t receive dialysis and possibly a kidney transplant when you entered kidney failure.  Major surgery, performed on the kitchen table if you were of means or in a bacteria-infested public hospital if you were not, was rarely attempted because it invariably resulted in massive blood loss, infection, and death.
     So, yes, our framers intended our citizenry to be free of government interference, including the Obamacare “mandate” that impinges on Americans who choose to opt out of the program.  But then, with the arguable exception of Benjamin Franklin, the framers never envisioned a world in which freedom could be extended by access to expert medical care that relieves suffering, often effects cure, and prolongs life.  But then, neither could they envision the enfranchisement of former slaves and women, the progressive income tax, compulsory vaccination, publicly supported health clinics, mass screening for TB, diabetes, and  syphilis, or Medicare and Medicaid.  Throughout the antebellum era, when physicians were reviled by the public and when neither regular medicine nor the rival alternative sects could stem the periodic waves of cholera, yellow fever, and malaria that decimated local populations, it mattered little who provided one’s doctoring.  Many, like the thousands who paid $20.00 for the right to practice Samuel Thomson’s do-it-yourself botanical system, chose to doctor themselves.
     Those who seek repeal of Obamacare without a credible legislative alternative that provides equal, preferably greater, health benefits to all Americans seem challenged by the very idea of medical progress.  Their use of terms like “choice” and “freedom” invokes an eighteenth-century political frame of reference to deprive Americans of a kind of freedom associated with a paradigm-shift that arose only in the final quarter of the nineteenth century.  It was only then that American medicine began its transition to what we think of as modern medicine.  Listerian antisepsis and asepsis; laboratory research in bacteriology, immunology, and pharmacology; laboratory development of specific remedies for specific illnesses; implementation of public health measures informed by bacteriology; modern medical education; and, yes, government regulation to safeguard the public from incompetent practitioners and toxic medications – all were part of the transition.  The Jacksonian impulse persisted into the early twentieth century, flaring up in organized opposition to compulsory childhood vaccination, and  finally petering out in the 1930s, by which time it was universally accepted that scientific medicine was, well, scientific, and, as such, something more than one medical sect among many.
     “We hold these truths to be self-evident,” Thomas Jefferson began the second paragraph of the Declaration of Independence, “that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.”  What Jefferson did not stipulate, indeed what he could not stipulate in his time and place, were the hierarchical relationships among these rights.  Now, in the twenty-first century, we are able to go beyond an eighteenth-century mindset in which  “Life, Liberty, and the pursuit of Happiness” functioned as a noun phrase whose unitary import derived from the political tyrannies of King George III and the British parliament.  Now we can place life at the base of the pyramid and declare that quality of life is indelibly linked to liberty and the pursuit of happiness.  To the extent that quality of life is diminished through disease, liberty and the pursuit of happiness are necessarily compromised.  In the twenty-first century, health is life; it is life poised to exercise liberty and pursue happiness to the fullest.
     Why, then,  is it wrong to require all citizens to participate in a national health plan, either directly or through a mandate (i.e., a tax on those who opt out), that safeguards the right of people to  efficacious health care regardless of their financial circumstances, their employment status, and their preexisting medical conditions?  What is it about the Obamacare mandate that has proven so troubling to our legislators?  When you buy a house in this country, you pay local property taxes that support the local public schools.  These taxes function like the mandate:  They have a differential financial impact on people depending on whether they directly benefit from the  system sustained by the tax.  To wit, you pay the tax whether or not you choose to send your children to the public schools, indeed, whether or not you have children at all.  You are obligated to subsidize the public education of children other than your own because public education has been declared a public good by the polity of which you are a part.  The same goes for that portion of local taxes that provides police and fire protection.  We accept a mandate to support policemen and firefighters whether or not we will ever need them, in the knowledge that other members of the community assuredly will.  Similarly, those who opt out of Obamacare should pay a price, because they remain part of a society committed to health as a superordinate value without which liberty and the pursuit of happiness are enfeebled.
     It is inconceivable that the Founding Fathers would have found unconstitutional or unfair or governmentally oppressive a law that provided life-promoting health care that enabled more Americans to discharge the duties of citizenship and live more freely and productively in pursuit of happiness.  They declared that citizens – all of whom, be it noted, were white, propertied males – were entitled to life consistent with the demands and entitlements of representative democracy.  Their pledge, their declaration, was not in support of a compromised life that limited the ability to fulfill those demands and enjoy those entitlements.
     When, in our own time, the word “choice,” as used by Republican politicians, means that millions of Americans who rely on Obamacare will end up leading compromised lives, the word becomes semantically contorted and ethically bankrupt.  The absence of Obamacare does not, ipso facto, empower lower-income, assistance-dependent Americans to buy the comprehensive health insurance they need, especially when the tax credits under legislation proposed thus far provide far less support than the subsidization lower-income families now receive.  Freeing insurers from Obamacare regulations so that they can offer inadequate policies that lower-income Americans can afford to buy does nothing but maximize the medical risks of these financially choice-less Americans.  Here is a fact:  Economic circumstances wipe out the prerogative to make prudent choices in one’s own best interest.  For lower-income Americans, a panoply of inadequate choices is not the pathway to right-minded decision making.  With the Senate’s “Better Care Reconciliation Act,” unveiled in June and updated in July, 2017, millions of low-income Americans, especially those dependent on Obamacare subsidies and Medicaid, would have had an absence of credible and affordable choices for obtaining health care adequate to their needs.  The call simply to repeal the Affordable Care Act, which the Senate has rejected as of this writing, would take us back to a status quo ante when millions of Americans were either priced out of, or completely denied, health coverage.
     Of course, adult citizens may repudiate mainstream health care altogether on the basis of philosophical or religious  predilections.  Christian Scientists and Jehovah’s Witnesses, for example, hold theological beliefs that often lead them to refuse medical treatment.  Certainly, they are free to pursue health through spiritual healing or, in the manner of medieval Christians, to disavow corporeal health and earth-bound life altogether.  But by law they cannot deny their children, who deserve to live to maturity and make their own choices, the healing power of modern medicine, whether it comes in the form of insulin, antibiotics, blood transfusions, or surgery.  Nor should they be allowed to undermine social and political arrangements, codified in law, that support everyone else’s right to pursue life and happiness through twenty-first century medicine.  Those who, prior to the Affordable Care Act, had inadequate insurance or no insurance at all are not struggling to free themselves from the clutches of federal regulation; they are not crying out for new free market health plans through which they can exercise freedom of choice.  Rather, they are struggling to put food on the table and keep themselves and their families healthy.  To this end, they need to be free to avail themselves of what modern medicine has to offer, unencumbered by politically debased notions of freedom and choice.
     At this moment in history, in a fractionated society represented by a President and Congressional leaders whose daily missives bear out George Orwell’s acute observation about the corruption of language brought on by political orthodoxies, In the Hands of Doctors may have a wistful ring.  I hope not.  I am addressing the personal side of health care – the reality of a doctor and patient, alone in a consulting room, often surrounded by high-tech diagnostic aids but always containing those vital low-tech instruments with which one person reaches out to the other:  the physician’s eyes and hands and voice.  The human face of doctoring, which now includes the doctoring of nurse practitioners and physician assistants, remains essential to the success of any doctor-patient relationship, whatever the institutional arrangements that bring together this doctor and this patient, the former to help the latter.
     Endeavoring to understand the several aspects – and possibilities – of the doctor-patient relationship, I write about the nature of clinical caring; the relation between caring and patient trust; the need to recruit and train physicians who can bring this caring sensibility to their patients; the role of empathy in medical caring; and the obligation of medical educators to revivify primary care medicine to meet the critical shortage of frontline physicians within underserved American communities.  These issues will not go away, whatever the fate of Obamacare.
     When federal legislation, through the practical assistance it provides, extends the reach of trusting doctor-patient relationships to the most vulnerable  groups in society, it has a  function that is both binding and enabling.  It fortifies the webbing that underlies the increasingly disparate parts of our national mosaic.  Obamacare, the Children’s Health Insurance Program, Medicaid – these programs do not “bring us together” in a feel good way.  They do, however, prevent a free fall in which the subcommunities and interest groups into which society has decomposed land so far apart they are no longer in hailing distance of one another.  As to the enabling function, a comprehensive medical safety net for all Americans – let’s call it what it is: universal health care – revitalizes political democracy by extending to all Americans a greater possibility of life, liberty, and the pursuit of happiness.  In the everyday world, this pursuit boils down to the ability of more people to stay on the job or to work from home rather than not work at all.  Society benefits, since chronically ill people pursuing happiness under the umbrella of universal health care will better resist the complications and collateral illnesses that follow from their primary illness or illnesses.  Society also benefits by enabling healthier happiness-pursuers to avoid hospitalization and, among the elderly, to push back the time when nursing home care is required.  And finally, society benefits by seeing to it that all children, especially those who are disabled, receive every medical advantage as they traverse their own challenging paths to productive, choice-wielding citizenship.
     Obamacare is a far cry from universal health care, but for all its limitations and current financial straits, it has provided these binding and enabling functions for millions of Americans previously without a medical safety net.  Woe to politicians who shred it in the name of choice, a pleasing vacuity that evades the reality of disease and pain among many who are relieved to have a single avenue of subsidized care where none was previously available.
     Health care should be a national trust; everyone deserves what twenty-first century medicine has to offer, regardless of how much or how little choice can be worked into the offering.  Politicians who feel otherwise are enemies of the polity.  Jefferson, who as president helped set up the first smallpox vaccination clinics in the south and then, in retirement, planned a state-supported clinic to provide free medical care to those who could not afford it, would not have brooked the empty insistence that medical freedom and medical choice, unhinged from socioeconomic reality, trump access to medical care per se.  Nor, for that matter, does choice, whatever it may or may not mean, obviate our moral obligation as a society to see to it that best available treatment, whatever the pathway that leads to it, means treatment rendered by caring doctors willing to know their patients as people.

~

In the Hands of Doctors:  Touch and Trust in Medical Care, 1st pbk ed

Paul E. Stepansky

978-0983080770        2016/2017pbk        348pp        $26.95pbk

Release date: September 12, 2017

Order now at Amazon.com

 

 

 

Remembering the Nurses of WWI (VI)

[The sixth and final essay about the gallant nurses of World War I commemorating the centennial of America’s entry into the war on April 6, 1917.  The startling parallels between the medical and nursing responses to the Great Pandemic of 1918 and the Coronavirus Pandemic of 2020 are elaborated in Easing Pain on the Western Front:  American Nurses of the Great War and the Birth of Modern Nursing Practice  (McFarland, 2020)].    

PLAGUE

Influenza.  The Plague.  The Great Pandemic of 1918.  The Spanish Flu, which, as best we can determine, originated not in Spain but in Camp Funston in northeastern Kansas and Camp Oglethorpe in northwestern Georgia in March and April, 1918.  From there it spread to other army camps, then to France via troops disembarking at Brest, then to the rest of Europe, then to the rest of the world.  By the time the epidemic had passed, world population had been reduced by 50 to 100 million, or by 3% to 5%.[1]

During 1918 and 1919, 47% of all deaths in the United States were from influenza and its complications, with over 675,000 deaths in all.   The first wave of the disease in March and April 1918 was relatively mild, as the virus learned to adapt to humans via passage from person to person.  But the second wave, which began in August, was deadly.  Philadelphia officials foolishly refused to cancel a Liberty Loan Parade scheduled for September 28.  Within three days, every hospital bed in the city’s 31 hospitals was filled; within 10 days the pandemic exploded to hundreds of thousands ill, with hundreds of deaths each day. By October 12, 4,500 Philadelphians had died from the flu; a few weeks later the total was nearly 11,000.  Neighboring New York City lost over 21,000 during the same period.

One characteristic of the Spanish flu was that, unlike typical influenza, it targeted younger victims, aged 20-40.  The fault was in their very youth, as their immune systems mounted a massive response to the virus, filling their lungs with so much fluid and debris that the exchange of oxygen became impossible.  Victims lapsed into unconsciousness and drowned in their own internal secretions.  Others remained alive long enough to have bacteria swarm into their compromised lungs and compound viral infection with bacterial pneumonia; the result was either death or a lengthy convalescence.

The American Expeditionary Force (AEF) consisted of healthy young men.  Influenza hit them hard.  At home, things grew desperate in army training camps and cantonments.  It was the first and only time the number of seriously ill soldiers exceeded the military’s total hospital capacity; the army had to take over barracks and use them as hospitals.  It was no better overseas, where the virus took advantage of the conditions of trench warfare to evolve into its lethal form.  By the U.S. War Department’s own reckoning, the flu eventually sickened 26% of the AEF – over a million men – and accounted for 82% of the army’s total deaths from disease.[2]

Influenza patients at a naval training station in California, 1918

The serious shortage of nurses to care for stricken soldiers spurred the American Red Cross to action.  It struggled to ship out the 1,000 nurses a week requested by the army. On the home front, it set aside its policy of racial exclusion and enrolled African American nurses for “special service.” In early December, two emergency detachments of nurses of color set out for army hospitals at Camp Sherman in Ohio and Camp Grant in Illinois, respectively. There they were assigned to general wards filled with white soldiers.  To the surprise of skeptical white chief nurses, their service was exemplary, whatever the drudgery assigned them.[3]

____________

The story of the pandemic of 1918 is the story of modern medicine not yet modern enough to grasp the characteristics of viruses:  their structure; the manner in which they invaded and infected cells; even whether they were liquid or particle in nature.  The best laboratory techniques of the time could identify viruses only as the mysterious “something” left over after infected blood samples or respiratory secretions were passed through a Chamberland filter, whose minute pores filtered out all known bacteria.  Since, with hoof-and-mouth disease and yellow fever, filtered solution or “filtrate” had been shown to make healthy animals sick, it must contain an imperceptible, nonspecifiable something that passed on the disease.  But this eliminative logic had no operational consequences; it was a therapeutic dead end.  So researchers played to the strength of their time, a golden age for linking specific bacteria to specific illnesses:  they searched for a bacteriological culprit for The Great Flu.  And they thought they had found one.  Plausibly, they believed the Spanish flu was caused by the Bacillus influenzae discovered by the German bacteriologist Richard Pfeiffer in 1892.  And they were wrong.

Bacteriology in 1918 might be adequate, at least some of the time, to the secondary bacterial infections (especially pneumonia) that followed the weakening of the immune system caused by the virus.  But in the face of the virus itself, it was helpless.  Only in 1934, when a new flu epidemic raged in Puerto Rico, would Thomas Francis of the Rockefeller Institute, utilizing a technique for viral transmission in animals developed by his colleague, Richard Shope, isolate the Type A influenza virus.

_____________

If the warring armies of 1918 were hit hard by the flu, it was often the nurses, no less than the infected soldiers, who took it on the chin.  This included the nurses of the American Expeditionary Force.  During the initially mild phase of the epidemic, ignorant of what was to come, overseas nurses were content to add the flu to the list of infections they combatted and to which, often enough, they fell victim.  Being bedridden themselves was simply a vicissitude of the job – a cost of the business of frontline nursing.  “It’s not that I mind being in bed,” wrote Helen Boylston in February, 1918.  “I don’t even mind having flu and trench fever.”  Two months later, she recorded that the flu was back again, “and everybody has it, including me.  I’ve run a temperature of one hundred and two for three days, can hardly breathe, and have to sleep on four pillows at night.”  But she kept her suffering to herself and soldiered on:  “But I’m not talking about it, because I don’t want to be sent to Villa Tino [for rest and treatment].”  Of course, when the influenza struck in full force in the fall of 1918, nurses, whatever their resolve, were not spared.  In the U.S., 127 army nurses died from flu, and an untold number, probably another 100, died in Europe.  Katherine Magrath, the chief nurse of Base Hospital 68 in Nièvre in central France, buried 12 of her nurses in a single month.  After each funeral, she avoided looking at the faces of her surviving nurses lest she wonder “which would be the next to be absent from the dismal scene.”[4]

Unable to treat the flu at its source, the nurses did what they had grown accustomed to doing for the desperately ill:  They bore witness to suffering and tried to ease it.  Their witnessing was different from that of the doctors.  In their diaries, for example, they record the sensory experience of hands-on care of those who grew sicker and then died.  Influenza cases had swamped the nurses, wrote Shirley Millard at the beginning of April, 1918, and of the soldiers she added:  “When they die, as about half of them do, they turn a ghastly dark gray and are taken out at once and cremated.”  Forty of 160 patients had the flu, and the staff was coming down with it.  So reported Hoosier nurse Maude Essig from Base Hospital 32 in the French resort town of Contrexéville in November.  “The odors,” she added, “are bad.”[5]

Treatment of the flu-ridden called forth everything in the nurses’ toolbox.  They quickly learned the course of the illness, and they drew on everything they had to strengthen the heart, ease respiration, and attenuate suffering.  If they could not tame the virus, they could, with luck, keep patients alive long enough for their immune systems to rally and join the struggle.  During the early days of the pandemic, Beatrice Hopkinson wrote in her diary how flu-ridden patients were stripped of their clothing in one tent, bathed in disinfectant, and distributed among different wards.  But disinfectants, she and others quickly learned, were unavailing.  Patients who were very ill often died from pneumonia within days, Hopkinson found.[6]  For those whose fate was not sealed, the trick was to bring down their high fevers with alcohol baths and to keep their hearts beating and their lungs exchanging oxygen and carbon dioxide.  Here is how nurses coped with a signal corps switchboard operator – one of the first “Hello Girls” – stricken with influenza on board the transport ship Olympic in September 1918:

Risking their own lives, nurses placed warm mustard packs on her chest to dilate the capillaries, stimulate her nervous system, and help her cough up the mucus that could drown her.  They aspirated her lungs, sponged her body with alcohol, applied camphorated oil every hour, gave her salt-solution enemas, and spoon-fed her concoctions of milk, eggs, and whiskey.  The first week, Conroy received four hypodermic injections of digitalis to control her pulse and strengthen her heartbeat.[7]

None of these ministrations attacked the virus, but they kept Conroy alive, and after 17 days her fever finally broke.  She, no less than the nurses who saved her, went on to contribute to the war effort in France.

In 1916, prior to the epidemic, nurses who began to feel “influenza-ish” might resort to brisk walks and a “good hot mustard bath.”  But when they took ill after the epidemic set in, they resorted to large doses of quinine and aspirin or, alternatively, to quinine and “a stiff dose of whiskey” to keep going.[8]  By the fall of 1918, their wards had become “influenza departments,” and they wondered how long they could resist infection.  Inevitably, nurses became patients, taking to their beds to await transfer to the nearby convalescent homes set aside for them – referred to colloquially as “Sick Sisters” – for treatment and recuperation.[9]  For some, only physical collapse on the wards could remove them from the second battlefield.

The flu paid no heed to the Armistice that ended hostilities on November 11, 1918.  It raged on in base hospitals.  It subsided in intensity in the final month of the year only to return with renewed virulence in the new year.  The nurses, for their part, remained in base hospitals throughout France and Belgium, serving not only bedridden soldiers but the local populace as well.  Then, pooling their efforts with the women physicians serving in the American Women’s Hospitals (AWH) sponsored by the Medical Women’s National Association, they fanned out from northern France and Belgium to Serbia, the Near East, and even Russia.  Working hand in hand with their medical colleagues, the nurses established public health programs for civilian populations that had gone without medical, surgical, dental, and nursing care since 1914.  Divided into mobile units, AEF nurses and AWH physicians established weekly house-call and dispensary routes that took them to battle-scarred villages throughout the regions they served.  During the seven months that AWH No. 1 was based in Luzancy in north-central France, for example, its units made 3,626 house calls to the 20 villages on their regular schedule and to 45 outlying villages as well.  In virtually every village, chronic disease management shared center stage with dental and gynecological care.  And among the diseases with which nurses and physicians continued to do battle, typhoid fever and influenza had pride of place.[10]

We have considered the manner in which nursing interventions could become curative simply by dint of their frequency and intensity – not to mention the confident bravado with which they were administered.  Nowhere was this in greater evidence than with the soldiers and civilians stricken with virulent flu in the fall of 1918 and winter of 1919.  Nurses in the American Expeditionary Force, no less than their sisters-in-arms in the British Expeditionary Force, stayed on and nursed on, no matter the apparent inevitability of death.  Theirs was the implicit hope that caring interventions could at any point turn the tide, if only in the sense of gaining a brief reprieve during which the body’s depleted healing resources might rally.

Nurses were great naturalists.  Fevers might break.  Hearts might resume normal rhythms.  Lungs might expel enough infectious matter to resume respiration.  To be sure, the worst of the influenza victims almost always died.  But then so did the worst of the postsurgical patients, the worst of the gassed patients, the worst of the soldiers with multiple injuries and multiple amputations.  It mattered not.  Nursing professionals professed an ethic of caring grounded in, but not limited by, the scientific medicine of the time.  When all else had failed, when surgeons and physicians had given up on a patient, nursing care could still be a clinical tipping point that loosened the grasp of the grim reaper.  Always the nurses gave it their all.  Bits from the wreckage might still be saved despite the “unutterable woe.”[11]  Let one nurse, Britain’s Kate Luard, distinguished recipient of the Royal Red Cross Medal and Bar, speak for all in a diary entry from the fall of 1916:

There is no form of horror imaginable, on any part of the human body, that we can’t tackle ourselves now, and no extreme of shock or collapse is considered too hopeless to cope with, except the few who die in a few minutes after admission.[12]

__________________

[1] There is abundant secondary literature on the Great Pandemic of 1918.  An excellent, readable overview is John M. Barry, The Great Influenza: The Epic Story of the Deadliest Plague in History (NY: Viking, 2004).  Those interested in the pandemic’s impact on the American Expeditionary Force and the war in general should begin with Carol R. Byerly, Fever of War: The Influenza Epidemic in the U.S. Army during World War I (NY: New York University Press, 2005).  A lively account of the search for the virus that caused the pandemic in the decades after the war is Gina Kolata, Flu: The Story of the Great Influenza Pandemic of 1918 and the Search for the Virus That Caused It (NY: Farrar, Straus and Giroux, 2011).

[2] Cited by Byerly, Fever of War, 6.

[3] Lavinia L. Dock, et al., History of the American Red Cross (NY: Macmillan, 1922), 404-410.

[4] Helen Dore Boylston, Sister: The War Diary of a Nurse (NY: Ives Washburn, 1927), loc 694; Mary T. Sarnecky, A History of the U.S. Army Nurse Corps (Philadelphia: University of Pennsylvania Press, 1999), 121.

[5] Shirley Millard, I Saw Them Die: Diary and Recollections (New Orleans, LA: Quid Pro, 2011), loc 472; Alma S. Wooley, “A Hoosier Nurse in France: The World War I Diary of Maude Frances Essig” (https://scholarworks.iu.edu/journals/index.php/imh/article/view/10683/15077), entry of October 7, 1918.

[6] Beatrice Hopkinson, Nursing through Shot & Shell: A Great War Nurse’s Story, ed. Vivien Newman (South Yorkshire: Pen & Sword, 2014), loc 1999.

[7] Elizabeth Cobbs, The Hello Girls:  America’s First Women Soldiers (Cambridge:  Harvard University Press, 2017), 134.

[8] Edith Appleton, A Nurse at the Front:  The First World War Diaries, ed. R. Cowen (London:  Simon & Schuster UK, 2012), 102; Maude Essig, “World War I Diary,” entry of October 27, 1918; Boylston, War Diary, loc 1348.

[9] Dorothea Crewdson, Dorothea’s War: A First World War Nurse Tells her Story, ed. Richard Crewdson (London: Weidenfeld & Nicolson, 2013), loc 5344, 5379.

[10] On the role of American women physicians in WWI and its aftermath, see Ellen S. More, “‘A Certain Restless Ambition’: Women Physicians and World War I,” American Quarterly, 41 (1989): 636-660; Lettie Gavin, American Women in World War I (Niwot, CO: University Press of Colorado, 1997), 157-178; and Kimberly Jensen, Mobilizing Minerva: American Women in the First World War (Urbana: University of Illinois Press, 2008), 77-97.  Statistics on AWH No. 1’s service while based in Luzancy are given in Jensen, Mobilizing Minerva, 110.

[11] [Kate Norman Derr], “Mademoiselle Miss”:  Letters from an American Girl Serving with the Rank of Lieutenant in a French Army Hospital at the Front, preface by R. C. Cabot (Boston:  Butterfield, 1916), 21.

[12] John & Caroline Stevens, eds., Unknown Warriors: The Letters of Kate Luard, RRC and Bar, Nursing Sister in France, 1914-1918 (Stroud: History Press, 2014), loc 1277.

Copyright © 2017 by Paul E. Stepansky.  All rights reserved.  The author kindly requests that educators using his blog essays in their courses and seminars let him know via info[at]keynote-books.com.

 

Remembering the Nurses of WWI (V)

“They were very pathetic, these shell shocked boys.”

[The fifth of a series of essays about the gallant nurses of World War I commemorating the centennial of America’s entry into the war on April 6, 1917.  Learn more about the nursing response to shell shock in Easing Pain on the Western Front:  American Nurses of the Great War and the Birth of Modern Nursing Practice (McFarland, 2020)].

Every war has mental casualties, but each war has its own way of understanding them.  Each war, that is, has its own nomenclature for what we now term “psychiatric diagnosis.”  To Napoleon’s Surgeon in Chief, Dominique-Jean Larrey, we owe the diagnosis nostalgie (nostalgia); it characterized soldiers whose homesickness left them depressed and unable to fight.  This was in 1815.  During the American Civil War, nostalgia remained in vogue, but a new term, “irritable heart” (aka “soldier’s heart” or “Da Costa’s Syndrome”), was coined just after the war to label soldiers whose uncontrollable shivering and trembling had been accompanied by rapid heartbeat and difficulty breathing.  During the 10-week Spanish-American War of 1898, soldiers who broke down mentally amid heat, bugs, bullets, and rampant typhoid fever were diagnosed with “tropical weakness.”

And this brings us to World War I, the war that bequeathed the diagnosis of shell shock.  At first, the nurses of WWI were no less baffled than the doctors by the variable expressions of shell shock – most cases of which, it was learned, arose some distance from the exploding shells at the Front.  The term was coined in 1915 by the British physician and psychologist Charles Myers, then part of a volunteer medical unit in France.  Myers soon realized the term was a misnomer:  he had coined it after seeing three soldiers whose similar psychological symptoms followed the concussive impact of artillery shells bursting at close range, only to discover that many men with the same symptoms had been nowhere near exploding shells.  In May, 1917, the American psychiatrist Thomas Salmon, with the approval of the War Department, traveled to England to observe how the British treated their shell shocked soldiers; he returned home convinced that shell shock was a real disorder, not to be taken for malingering, and one amenable to psychological treatment.  Salmon was the Medical Director of the National Committee for Mental Hygiene, so his report to the Surgeon General carried weight, and the Army began making arrangements for treating the mental casualties that, he predicted, would flood overseas and stateside hospitals following America’s entry into the war.[1]

Nurses were unconcerned with the animated debate among physicians on the nature of shell shock.  Was it a kind of brain concussion that resulted from the blast force of exploding shells?  A physiological response to prolonged fear?  A psychological reaction to the impact of industrial warfare?  A product of nervous shock analogous to that suffered by victims of railway accidents in the later nineteenth century?  A Freudian-type “war neurosis” that plugged into earlier traumas the soldier had suffered?  They did not care.  Theirs was the everyday world of distressed soldiers, whose florid symptoms overlaid profound anxiety and for whom the reliving of trauma and its aftermath occurred throughout the day.  Theirs was the world, that is, of management and containment.

A shell shocked soldier in a trench during the Somme offensive of September, 1916

The management part could be bemused, good-naturedly patronizing, a tad irritated.  Shell shock victims, after all, made unusual demands on nurses.  The patients were always falling out of bed and otherwise “shaking and stammering and looking depressed and scared.”  Simple tasks like serving meals could be a project, as attested by the British nursing assistant Grace Bignold, who, prior to becoming a VAD in 1915, worked as an orderly at a London convalescent home.  There, she recalled,

One of the things I was told was that when I was serving meals . . . always to put the plate down very carefully in front of them and to let them see me do it.  If you so much as put a plate down in front of them in the ordinary way, when they weren’t looking, the noise made them almost jump through the roof – just the noise of a plate being put on a table with a cloth on it?[2]

Accommodation by the orderlies at mealtime paled alongside the constant burden on ward nurses, who had to calm hospitalized shell shocked soldiers when exploding shells and overhead bombs rocked the hospital, taking patients back to the Front and causing a recurrence of the anxiety attendant on what they had seen or done or had done to them or to others.  And both, perhaps, paled alongside the burden of nurses in the ambulance trains that transported the shell shocked out of the trenches or off the battlefield.  “It was a horrible thing,” wrote the British VAD and ambulance nurse, Claire Elise Tisdall,

because they sometimes used to get these attacks, rather like epileptic fits in a way.  They became quite unconscious, with violent shivering and shaking, and you had to keep them from banging themselves about too much until they came round again.  The great thing was to keep them from falling off the stretchers, and for that reason we used to take just one at a time in the ambulance. . . . these were the so-called milder cases; we didn’t carry the dangerous ones.  They always tried to keep that away from us and they came in a separate part of the train.[3]

The latter were the “hopeless mental cases” destined, Tisdall recalled, for “a special place” – i.e., a mental hospital, in England a “neurasthenic centre.”  But how to tell the difference?  The line between “milder” and “severe” cases of shell shock was subjectively drawn and constantly fluctuating; soldiers who arrived in the hospital with some combination of headaches, tremors, a stutter, memory loss, and vivid flashback dreams might become psychosomatically blind, deaf, or mute, or develop paralyzed or spastic limbs, after settling into base hospitals and the care of nurses.  In their diaries and letters home, the nurses’ characterizations were not only patronizing but sometimes unkind:  shock patients, often incontinent, were “very pathetic”; they formed “one of the most pitiful groups” of soldiers.  Dorothea Crewdson referred to them as “dithery shell shocks” and “old doddering shell shocks.”  A patient who without warning got out of bed and raced down the hall clad only in his nightshirt was a “dotty poor dear.”  “It is sad to see them,” wrote Edith Appleton.  “They dither like palsied old men, and talk all the time about their mates who were blown to bits, or their mates who were wounded and never brought in.  The whole scene is burnt into their brains and they can’t get rid of the sight of it.”[4]

It is in the containment aspects of their care of the shell shocked that the nurses evinced the same caring acceptance they brought to all their patients.  After all, shell shocked patients, however they presented, were wounded soldiers, and their suffering was as real and intense as that of comrades with bodily wounds.  The nursing historian Christine Hallett, who writes of the WWI nurses with great sympathy and insight, credits nurses working with the shell shocked with an almost preternatural psychoanalytic sensibility in containing the trauma that underlay their symptoms.  The nurses, she claims, aligned themselves with their patients, however disruptive their outbursts and enactments, since they “sensed that insanity would be a ‘normal’ response for any man who fully realized the deliberateness of the destruction that had been unleashed on him.”  Hence, she continues,

Nurses conspired with their patients to ‘ignore’ or ‘forget’ the reality of warfare until it was safe to remember.  In this way they ameliorated the effect of the ‘psychic splintering’ caused by trauma.  They contained the effects of this defensive fragmentation – the ‘forgetting’ and the ‘denial’ – until patients were able to confront their memories, incorporate them as part of themselves and become ‘whole’ beings again.[5]

I follow Hallett in her insistence that nurses usually ignored the directive not to “spoil” shell shocked patients.  All too often, they let themselves get involved with them at the expense of maintaining professional distance.[6]  

But then the nurses were equally caring and equally prone to personal connection with all their patients, mental or not.  They were not psychotherapists, and the dizzying demands of their long days and nights did not permit empathic engagement in the psychoanalytic sense.  What the demands did permit was the all-too-human realization that the shell shocked had experienced something so horrible as to require a gentleness, a lightness of touch, a willingness to accept strange adaptive defenses that, with the right kind of nursing, might peel away slowly over time.  Here, for example, is one of Hallett’s examples of “emotional containment” on the part of the Australian army nurse Elsie Steadman:

It was very interesting work, some of course could not move, others could not speak, some had lost their memory, and did not even know their own names, others again had very bad jerks and twitching.  Very careful handling these poor lads needed, for supposing a man was just finding his voice, to be spoken to in any way that was not gentle and quiet the man ‘was done,’ and you would have to start all over again to teach him to talk, the same things applied to walking, they must be allowed to take their time.[7]

This sensitivity, this “very careful handling” of the shell shocked, was no different from the sensitivity of the mealtime orderlies, who knew to “put the plate down very carefully in front of them,” always making sure that the shell shocked saw them do it.  And of course there were accommodations out of the ordinary, a remarkable example of which comes from Julia Stimson, the American chief nurse of British Base Hospital 21 (and amateur violinist).  Writing to her parents in late November, 1917, she related “an interesting little incident” that began when a patient knocked on her door and asked for the Matron:

He was so wobbly he almost had to lean up against the wall.  “Somebody told me,” he said, “that you had a violin.  I am a professional violinist and I have not touched a violin for five months, and today I couldn’t stand it any longer, so I got up out of bed to come and find you.”  I made him come in and sit down.  As it happened I had a new violin and bow, which had been bound for our embryo orchestra, here in my office.  The violin was not tuned up, but that didn’t matter.  The man had it in shape in no time and then he began to play and how he could play!  We let him take the violin down to his tent, and later sent him some of my music.  He was a shell shock, and all the evening and the next few days until he was sent to England he played to rapt audiences of fellow patients.[8]

With the shell shocked, the therapeutic gift of the WWI nurses resided less in their ability to empathize than in their acceptance that their patients had experienced horrors that could not be empathized with.  Their duty, their calling, was simply to stay with these soldiers in an accepting manner that coaxed them toward commonality among the wounded – the sense that their symptoms and the underlying terror were not only understandable but unexceptional and well within the realm of nursing care.  In this sense – in the sense of a daily willingness to be with these soldiers in all their bodily dysfunction, mental confusion, and florid symptomatic displays – the nurses strove to normalize shell shock for the shell shocked.  After all, the shell shocked, however dithery, shaking, and stammering, were depressed and scared “only at times.”  Otherwise, continued Dorothea Crewdson, “they are very cheery and willing.”  Mary Stollard, a British nurse working with shell shocked soldiers at a military hospital in Leeds, noted that many of the boys were very sensitive to being incontinent.

They’d say, “I’m terribly sorry about it, Sister, it’s shaken me all over and I can’t control it.  Just imagine, to wet the bed at my age!”  I’d say, “We’ll see to that.  Don’t worry about it.”  I used to give them a bedpan in the locker beside them and keep it as quiet as possible.  Poor fellows, they were so embarrassed – especially the better-class men.[9]

But such embarrassment was a relic of civilian life.  It had no place among battle-hardened nurses who coped daily with the sensory overload of trench warfare:  the overpowering stench of gangrenous infections and decaying flesh; the sight of mutilated soldiers without faces or portions of torso, not to mention missing arms and legs; the screams of gassed soldiers, blind and on fire and dying in unspeakable pain.  Alongside such things, how off-putting could incontinence be?  The fact is that shell shocked soldiers, no less than the nurses themselves, were warriors.  Warriors are wounded and scarred in many ways; nurses themselves fell victim to shell shock, even if they were not officially diagnosed as such.[10]  Knowing full well that shell shocked soldiers declared physically unfit and shipped back home were often subject to stigma and humiliation, Ellen La Motte offered this dismal prognosis for one who had lost the ability to walk and could no longer serve the nation:  “For many months he had faced death under the guns, a glorious death.  Now he was to face death in another form.  Not glorious, shameful.”[11]

_________________________

[1] Earl D. Bond, Thomas W. Salmon – Psychiatrist (NY: Norton, 1950), 83-84.

[2] Julia C. Stimson, Finding Themselves:  The Letters of an American Army Chief Nurse in a British Hospital in France (NY: Macmillan, 1918), 41; Dorothea Crewdson, Dorothea’s War:  A First World War Nurse Tells her Story, ed. Richard Crewdson (London: Weidenfeld & Nicolson, 2013), loc 4383; Grace Bignold, in Lyn MacDonald, The Roses of No Man’s Land (London: Penguin, 1993), 233.

[3]  Claire Elise Tisdall, in MacDonald, Roses of No Man’s Land, 233-34.

[4] Mary Stollard, in MacDonald, Roses of No Man’s Land, 231-32; Stimson, Finding Themselves, 41; Crewdson, Dorothea’s War, loc 967; Edith Appleton, A Nurse at the Front:  The First World War Diaries, ed. R. Cowen (London:  Simon & Schuster UK, 2012), 184.

[5] Christine Hallett, Containing Trauma:  Nursing Work in the First World War (Manchester: Manchester University Press, 2009), 163.

[6] Hallett, Containing Trauma, 165, 177.

[7] Hallett, Containing Trauma, 172-73.

[8] Stimson, Finding Themselves, 163.

[9] Crewdson, Dorothea’s War,  loc 4383; Stollard, in MacDonald, Roses of No Man’s Land, 232.

[10] E.g., Crewdson, Dorothea’s War, loc 4914; in “Blind,” Mary Borden writes of herself as “jerk[ing] like a machine out of order.  I was away now, and I seemed to be breaking to pieces.”  She was sent home as “tired.”  Mary Borden, The Forbidden Zone (London: Hesperus, 2008 [1929]), 103.  On the military’s unwillingness to diagnose women as “shell shocked,” see Hannah Groch-Begley, “The Forgotten Female Shell-Shock Victims of World War I,” The Atlantic, September 8, 2014 (https://www.theatlantic.com/health/archive/2014/09/world-war-ones-forgotten-female-shell-shock-victims/378995).

[11] Ellen N. La Motte, The Backwash of War: The Human Wreckage of the Battlefield as Witnessed by an American Hospital Nurse (NY: Putnam’s, 1916), 239.

Copyright © 2017 by Paul E. Stepansky.  All rights reserved.  The author kindly requests that educators using his blog essays in their courses and seminars let him know via info[at]keynote-books.com.
