Category Archives: War & Medicine

Malaria in the Ranks

Malaria (from Italian “bad air”): Infection transmitted to humans by mosquito bites containing single-celled parasites, most commonly Plasmodium (P.) vivax and P. falciparum.  Mosquito vector discovered by Ronald Ross of the Indian Medical Service in 1897.  Symptoms:  Initially recurrent (“intermittent”) fever, then constant high fever, violent shakes and shivering, nausea, vomiting.  Clinical descriptions as far back as Hippocrates in the fifth century B.C. and earlier still in Hindu and Chinese writings.  Quinine:  Bitter-tasting alkaloid from the bark of cinchona (quina-quina) trees, indigenous to South America.  Used to treat malaria from the 1630s through the 1920s, when more effective synthetics became available.  Isolated from cinchona bark in 1820 by French chemists Pierre Joseph Pelletier and Joseph Caventou.  There you have it.  Now read on. 

_______________________

It’s 1779, and the British, commanded by Henry Clinton, adopt a southern strategy to occupy the malaria-infested Carolinas.  The strategy appears successful:  British forces besiege Charleston at the end of March 1780 and capture it in May.  But appearances can be deceiving.  In reality, the Charleston campaign has left the British force debilitated.  Things get worse when Charles Lord Cornwallis, left in command by Clinton, marches inland in June, where his force is further ravaged by malarial fever carried by Anopheles mosquitoes and their Plasmodium parasites.  Lacking quinine, his army simply melts away in the battles to follow.  Seeking to preserve what remains of his force, Cornwallis looks to the following winter as a time to recuperate and rebuild.  But it is not to be.  Clinton sends him to Yorktown, where he occupies a fort between two malarial swamps on the Chesapeake Bay.  Washington swoops south and, aided by French troops, besieges the British.  The battle is over almost before it has begun.  Cornwallis surrenders to Washington, but only after his army has succumbed to malarial bombardment by the vast army of mosquitoes.  The Americans have won the Revolutionary War.  We owe American independence to Washington’s command, aided, unwittingly, by mosquitoes and malaria.[1] 

Almost two centuries later, beginning in the late 1960s, malaria again joins America at war.  Now the enemy is communism, and the site is Vietnam.  The Republic of Korea (ROK), in support of the war effort, sends over 30,000 soldiers and tens of thousands of civilians to Vietnam.  The calculation is plain enough: South Korea seeks to consolidate America’s commitment to its economic growth and military defense in its struggle with North Korean communism after the war.  It works, but there is an additional, major benefit:  ROK medical care of soldiers and civilians greatly strengthens South Korean capabilities in managing infectious disease and safeguarding public health.  Indeed, at war’s end in 1975, ROK is an emergent powerhouse in malaria research and the treatment of parasitic disease.  Malaria has again played a part in the service of American war aims.[2]

Winners and losers aside, the battle against malaria is a thread that weaves its way through American military history.  When the Civil War erupted in 1861, outbreaks of malaria and its far more lethal cousin, yellow fever, did not discriminate between the forces of North and South.  Parasites mowed down combatants with utter impartiality.  For many, malarial infection was the enemy that precluded engagement of the enemy.  But there were key differences.  The North had the U.S. Army Laboratory, comprising laboratories in Astoria, New York, and Philadelphia.  In close collaboration with Powers & Weightman, one of only two American pharmaceutical firms then producing quinine, the Army Laboratory provided Union forces with ample purified quinine in standardized doses.  Astute Union commanders made sure their troops took quinine prophylactically, with troops summoned to their whiskey-laced quinine ration with the command, “fall in for your quinine.” 

Confederate troops were not so lucky.  The South had no domestic source of cinchona bark and no chemists able to extract its quinine alkaloid; nor did the Union blockade permit the drug’s importation.  So the South had to rely on various plants and plant barks, touted by the South Carolina physician and botanist Francis Peyre Porcher as effective quinine substitutes.  But Porcher’s quinine substitutes were all ineffective, and the South had to make do with the meager supply of quinine it captured or smuggled.  It was a formula for defeat, malarial and otherwise.[3] 

Exactly 30 years later, in 1891, Paul Ehrlich announced that the application of a chemical stain, methylene blue, killed malarial microorganisms and could be used to treat malaria.[4]  But nothing came of Ehrlich’s breakthrough when war arrived seven years later in the short-lived Spanish-American War of 1898.  Cuba was a haven for infectious microorganisms of all kinds, and, in a campaign of less than four months, malaria mowed down American troops with the same ease it had in the Civil War.  Seven times more Americans died from tropical diseases than from Spanish bullets.  And malaria topped the list.  

As the new century approached, mosquitoes were, in both senses, in the air.  In 1900, Walter Reed returned to Cuba to conduct experiments with paid volunteers; they established once and for all that mosquitoes were the disease vector of yellow fever; one could not contract the disease from “fomites,” i.e., the soiled clothing, bedding, and other personal matter of those infected.  Two years later, Ronald Ross received the second Nobel Prize in Medicine ever awarded for his work on the role of mosquitoes in the transmission of malaria.[5]  But new insight into the mosquito vector of yellow fever and malaria did not mitigate the dismal state of affairs that came with World War I.  The American military was no better prepared for the magnitude of malaria outbreaks than it had been during the Civil War.  At least 1.5 million soldiers were incapacitated, as malaria spread from southeast England to the shores of Arabia, and from the Arctic to the Mediterranean.  Major epidemics broke out in Macedonia, Palestine, Mesopotamia, Italy, and sub-Saharan Africa.[6]

In the Great War, malaria treatment fell back on quinine, but limited knowledge of malarial parasites compromised its effectiveness.  Physicians of the time could not differentiate between the two species of parasite active in the camps – P. vivax and P. falciparum.  As a result, they could not optimize treatment doses for these somewhat different types of infection.  Malarial troops, especially those with falciparum, paid the price.  Except for the French, whose vast malaria control plan spared their infantry from infection and led to victory over Bulgarian forces in September 1918, the combatants found malaria to be what it had always been in war – the unexpected adversary of all.

Front cover of “The Illustrated War Times,” showing WWI soldiers, probably Anzacs, taking their daily dose of quinine at Salonika, 1916.

In 1924, the problem that had limited the effectiveness of quinine during the Great War was addressed when the German pharmacologist Wilhelm Roehl, working with Bayer chemist Fritz Schönhöfer, developed the synthetic antimalarial Plasmoquin, which was far more effective against malaria than quinine.[7]  By the time World War II erupted, another antimalarial, Atabrine (quinacrine, mepacrine), synthesized in Germany in 1930, was available.  It would be the linchpin of the U.S. military’s malaria suppression campaign, as announced by the Surgeon General in Circular Letter No. 56 of December 9, 1941.  But the directive had little impact in the early stages of the war.  U.S. forces in the South Pacific were devastated by malaria, with as many as 600 malaria cases for every 1,000 GIs.[8]  Among American GIs and British Tommies alike, the daily tablets were handed out erratically.  Lackluster command and side effects were part of the problem:  the drug turned skin yellow and occasionally caused nausea and vomiting.  From the side effects, the yellowing skin in particular, GIs leapt to the conclusion that Atabrine would leave them sterile and impotent after the war.  How they leapt to this conclusion is anyone’s guess, but there was no medical information available to contradict it.[9]   

The anxiety bolstered the shared desire of some GIs to evade military service.  A number of them tried to contract malaria in the hope of discharge or transfer – no one was eager to go to Guadalcanal.  Those who ended up hospitalized often prolonged their respite by spitting out their  Atabrine pills.[10]   When it came to taking Atabrine, whether prophylactically or as treatment, members of the Greatest Generation could be, well, less than great.

Sign posted at 363rd Station Hospital in Papua New Guinea in 1942, sternly admonishing U.S. Marines to take their Atabrine.

Malarial parasites are remarkably resilient, with chemically resistant strains emerging time and again.  New strains have enabled malaria to stay ahead of the curve, chemically speaking.  During the Korean War (1950-1953), both South Korean and American forces fell to P. vivax.  American cases decreased with the use of chloroquine, but the improvement was offset by a rash of cases back in the U.S., where hypnozoites (dormant malarial parasites) came to life with a vengeance and caused relapses.  The use of yet another antimalarial, primaquine, during the latter part of the war brought malaria under better control.  But even then, in the final year of the war 3,000 U.S. and 9,000 ROK soldiers fell victim.[11]  In Vietnam, malaria reduced the combat strength of some American units by half and felled more troops than bullets.  Between 1965 and 1970, the U.S. Army alone reported over 40,000 cases.[12]  Malaria control measures were strengthened, yes, but so were the parasites, with the spread of drug-resistant falciparum and the emergence of a new chloroquine-resistant strain.  

Malaria’s combatant role in American wars hardly ends with Vietnam.  It was a destructive force in 1992, when American troops joined the UN mission “Operation Restore Hope” in Somalia.  Once more, Americans resisted directives to take their preventive medicine, now mefloquine, an antimalarial developed by the Army in 1985.  As with Atabrine a half century earlier, false rumors of debilitating side effects led soldiers to stop taking it.  And as with Atabrine, malaria relapses knocked out soldiers following their return home, resulting in the largest stateside outbreak of malaria since Vietnam.[13] 

In Somalia, as in Vietnam, the failure of commanders to educate troops about the importance of “chemoprophylaxis” and to institute “a proper antimalarial regimen” was the primary culprit.  As a result, “Use of prophylaxis, including terminal prophylaxis, was not supervised after arrival in the United States, and compliance was reportedly low.”[14]  It was another failure of malaria control for the U.S. military.  A decade later, American combat troops went to Afghanistan, another country with endemic malaria.  And there, yet again, “suboptimal compliance with preventive measures” – preventive medication, use of insect repellents, chemically treated tent netting, and so forth – was responsible for “delayed presentations” of malaria after a regiment of U.S. Army Rangers returned home.[15]  Plus ça change, plus c’est la même chose. 

Surveying American history, it seems that the only thing more certain than malarial parasites during war is the certainty of war itself.  Why is this still the case?  As to the first part – malaria’s persistence – understanding the importance of “chemoprophylaxis” in the service of personal and public health (including troop strength in the field) has never been a strong suit of Americans.  Nor has the importance of preventive measures, whether applying insecticides and tent netting or wearing face masks, been congenial, historically, to libertarian Americans who prefer freedom in a Hobbesian state of nature to responsible civic behavior.  Broad-based public-school education on the public health response to epidemics and pandemics throughout history, culminating in the critical role of preventive measures in containing Coronavirus, might help matters.  In the military domain, Major Peter Weima sounded this theme in calling attention to the repeated failure of education in the spread of malaria among American troops in World War II and Somalia.  He stressed “the critical contribution of education to the success of clinical preventive efforts. Both in WWII and in Somalia, the failure to address education on multiple levels contributed to ineffective or only partially effective malaria control.”[16]  As to why war, in all its malarial ingloriousness, must accompany the human experience, there is no easy answer.   

_____________________

[1]  Peter McCandless, “Revolutionary fever:  Disease and war in the lower South, 1776-1783,” Trans. Am. Clin. Climatol. Assoc., 118:225-249, 2007.  Matt Ridley provides a popular account in The Evolution of Everything:  How New Ideas Emerge (NY: Harper, 2016).

[2] Mark Harrison & Sung Vin Yim, “War on Two Fronts: The fight against parasites in Korea and Vietnam,” Medical History, 61:401-423, 2017.  

[3] Robert D. Hicks, “’The popular dose with doctors’: Quinine and the American Civil War,” Science History Institute, December 6, 2013 (https://www.sciencehistory.org/distillations/the-popular-dose-with-doctors-quinine-and-the-american-civil-war).

[4] Harry F. Dowling, Fighting Infection: Conquests of the Twentieth Century  (Cambridge, MA: Harvard Univ. Press, 1977), 93.

[5] Thomas D. Brock, Robert Koch: A Life in Medicine and Bacteriology (Wash, DC: ASM Press, 1998 [1988]), 263.

[6] Bernard J Brabin, “Malaria’s contribution to World War One – The unexpected adversary,” Malaria Journal, 13, 497, 2014;  R. Migliani, et al., “History of malaria control in the French armed forces:  From Algeria to the Macedonian Front  during the First World War” [trans.], Med. Santé Trop, 24:349-61, 2014.

[7] Frank Ryan, The Forgotten Plague:  How the Battle Against Tuberculosis was Won and Lost (Boston:  Little, Brown, 1992), 90-91.

[8]  Peter J. Weima, “From Atabrine in World War II to Mefloquine in Somalia: The role of education in preventive medicine,” Mil. Med., 163:635-639, 1998, at 635.

[9] Weima, op. cit., p. 637, quoting Major General, then Captain, Robert Green during the  Sicily campaign  in August 1943:  “ . . . the rumors were rampant, that it made you sterile…. people did turn yellow.”

[10] Ann Elizabeth Pfau, Miss Yourlovin (NY:  Columbia Univ. Press, 2008), ch. 5.

[11] R. Jones, et al., “Korean vivax malaria. III. Curative effect and toxicity of Primaquine in doses from 10 to 30 mg daily,” Am. J. Trop. Med. Hyg., 2:977-982, 1953;  Joon-Sup Yeom, et al., “Evaluation of Anti-Malarial Effects,” J. Korean Med. Sci., 5:707-712, 2005.

[12] B. S. Kakkilaya, “Malaria in Wars and Victims” (malariasite.com).

[13] Weima, op. cit.  Cf. M. R. Wallace et al., “Malaria among United States troops in Somalia,” Am. J. Med., 100:49-56, 1996.

[14] CDC, “Malaria among U.S. military personnel returning from Somalia, 1993,” MMWR, 42:524-526, 1993.

[15] Russ S. Kotwal, et al., “An outbreak of malaria in US Army Rangers returning from Afghanistan,” JAMA, 293:212-216, 2005, at 214.  Of the 72% of the troops who completed a postdeployment survey, only 31% reported taking both their weekly tablets and continuing with their “terminal chemoprophylaxis” (taking medicine, as directed, after returning home).  Contrast this report with one for Italian troops fighting in Afghanistan from 2002 to 2011.  Their medication compliance was measured at 86.7%, with no “serious adverse events” reported and no cases of malaria occurring in Afghanistan.  Mario S. Peragallo, et al., “Risk assessment and prevention of malaria among Italian troops in Afghanistan, 2002 to 2011,” J. Travel Med., 21:24-32, 2014.

[16] Weima, op. cit., 638.

 

Copyright © 2022 by Paul E. Stepansky.  All rights reserved.  The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.

 
 

Everything You Didn’t Want to Know About Typhoid

Extreme fatigue; dangerously high fever; severe abdominal pain; headaches; diarrhea or constipation; nausea and vomiting – the symptoms of severe typhoid fever can be a panoply of horrors.  Like cholera, the bacteria in question – Salmonella typhi, cousin to the Salmonella that causes food poisoning – find a home in water and food contaminated with human feces.  The infection is contracted only by humans, and it is highly contagious.  More persons contract it from human contact – often from unwashed hands following defecation – than from drinking contaminated water or ingesting contaminated food.  But the latter are hardly incidental causes.  At least two billion people worldwide, the World Health Organization tells us, drink feces-contaminated water.[1]

And the story gets worse.  Through the 19th century, “chronic carriers” could not be conceptualized, much less detected.  They were symptom-free folks in whom typhi found safe harbor in the gall bladder, where they traveled with stored bile through the bile duct into the small intestine en route to fecal expulsion.  The chronic carriers brought infection to entire communities in sudden, explosive outbreaks; typhoid is a prime example of what epidemiologists term a “fulminant” disease (from the Latin fulmināre, to strike with lightning).  And worse still, the ranks of chronic carriers were enlarged by some of those who contracted the disease and recovered.  Typhi lived on in their gall bladders as well, and were passed on to others via the same fecal-oral route.

The Mother of all Chronic Carriers, the Super Spreader who comes down to us as Typhoid Mary, was one Mary Mallon, an Irish cook who passed on typhi to no fewer than 53 members of seven prominent Manhattan-area households between 1900 and 1906.  In 1907 she was quarantined in a bungalow on New York’s North Brother Island near Riverside Hospital, only to resume her career as cook-super spreader on release in 1910.  Tracked down five years later, she was whisked back to her island bungalow, where she lived out her remaining 23 years. 

Here is what Salmonella typhi do once ingested through the mouth.  Absent sufficient gastric acid to neutralize them in the stomach, the bacteria make their way to the terminal portion of the small intestine and enter the cells lining it.  Intestinal cells respond to the invaders with a massive inflammatory response that can lead to intestinal rupture, a hole through which intestinal contents drain into the abdomen, with attendant severe pain.  And from there matters go from bad to worse.  Without fast, effective treatment, the bacteria penetrate lymphatic tissue and enter the blood stream, which shuttles them to other organs:  the liver, the spleen, bone marrow.  In the worst cases, bacterial ulceration can extend all the way to the terminal lining of the ileum, from which typhi flood the body, carrying infection to the brain, heart, and pancreas.  Death is now around the corner; only major abdominal surgery holds any prospect of survival.  It is a pernicious disease of microbial migratory urgency.    

Improvements in water treatment and personal hygiene, along with antibiotic therapy and – yes! – a newly effective vaccine for adults, brought typhoid to its knees in the United States after World War II.  But the disease is alive and well in Central and South America, Africa, and parts of Asia, where it claims between 11 and 21 million victims and some 200,000 deaths each year.[2]  Typhi have evolved along with the antibiotics that control them, and multidrug-resistant (MDR) strains remain deadly.  And even here, on these ostensibly sanitized shores, typhi can still make their presence known.  As recently as 2010, nine Americans contracted typhoid, five in California and four in Nevada.[3] 

But such instances are aberrational, and in the northern hemisphere typhoid fever has long since vanished from anyone’s disease-monitoring radar.  Now federal and state governments, the bane of anti-vaccine irrationalists and mask-wearing naysayers, make sure we don’t drink water or eat food contaminated by microbe-laced feces.  But it was not so for our forebears.  In the Civil War, typhoid fever devastated North and South alike; one of the Union Army’s largest general hospitals, Satterlee Hospital in West Philadelphia, was constructed in 1862 largely to cope with its victims.  In the Spanish-American War of 1898, typhoid fever shared center stage with yellow fever and, at war’s end, rated its own federal investigative commission.  Chaired by Walter Reed, the Typhoid Commission determined that contact among soldiers (“comrade contact”) was primarily responsible for the transmission of typhoid fever in military camps.[4]  Four years later, Koch’s investigations during a typhoid epidemic in Trier, Germany led him to generalize the Commission’s finding: typhoid fever was contracted less from contaminated water or sewage than from nonsymptomatic carriers; the “carrier hypothesis” was among his final significant contributions.[5] 

The era of modern typhoid prevention began in 1897, when Almroth Wright, then a pathologist at the British Army’s Medical School at Netley Hospital, published a paper on the typhoid vaccine he had developed with killed typhi.  The Army took note and, in the South African War that began two years later, made very limited use of it:  of 330,000 British troops, only 14,000 received the vaccine.  It was effective in this limited trial but never caught on after the war.[6]  Beginning in 1902, the U.S. government’s Public Health and Marine Hospital Service, renamed the Public Health Service in 1912, focused its research on typhoid.  Progress was made, and by the time America entered WWI, the PHS’s Hygienic Laboratory had developed an antityphoid vaccine.[7]  American troops sailing to France in 1917 were not asked how they felt about receiving a typhoid vaccine; they received their mandatory shots and boarded their ships.  Those who were not vaccinated stateside received their shots on arriving at their camps.  Vaccination was not negotiable.  The obligation to live and fight for the nation trumped the freedom to contract typhoid, suffer, and possibly die.  

“A Monster Soup Commonly Called Thames Water,” a mid 19th-century etching depicting the stew of disease-promoting organisms in the river that supplied drinking water to Londoners.

The vaccine dramatically reduced the incidence of typhoid, but the disease still wrought damage in field and base hospitals, especially among unvaccinated European troops who had been fighting since 1914.  American nurses who arrived in northern France and Belgium in advance of troops recalled their misery at being transferred to typhoid wards, which, as one recalled, were “gloomy and dark.”  Another recalled a typhoid scourge that crippled her hospital and created an urgent need to find space outside the hospital for the typhoid patients.[8]

_______________________________

The current governors of Texas and Florida would surely look askance at the history of typhoid control, since a key aspect of it – allowing children on school premises to drink only water subjected to antimicrobial treatment – ignores parental freedom of choice.  Parents decide what their children eat, and they should be free to determine what kind of water they drink.  Children are not military enlistees obligated to remain healthy in the service of the nation.  What right do school boards have to abrogate the freedom of parents to determine what kind of water their children drink?  Why should children be mandated to drink water subjected to modern sanitary treatment that robs it of Salmonella typhi along with Vibrio cholerae, Poliovirus, and dysentery-causing Shigella?  Shouldn’t parents be free to have their children partake of nature’s bounty, to drink fresh water from streams and rivers, not to mention untreated well water contaminated with human feces and the pathogens it harbors?

And here is the Covid connection.  If local school boards and municipal authorities lack the authority to safeguard children, to the extent possible, through obligatory wearing of facemasks, then surely they lack the authority to force them to drink water filtered through layers of state and federal regulation informed by modern science.  Let parents be free to parent; let their children pay the steep, even life-threatening price.      

Did I mention that young children, along with immune-compromised young adults, are at greatest risk for contracting typhoid?  Well, now you know, and now, perhaps, we can return to reality.  State governors who do not understand the legal and moral imperative of acting in the best interests of the child[9] are unfit for public office of any sort.  Who wants governors who, in denying adults the right to act responsibly in the best interests of children, sanction child abuse?  Let them crawl back into the existential dung heap whence they emerged.    


[1] https://www.who.int/news-room/fact-sheets/detail/drinking-water.

[2] https://www.cdc.gov/typhoid-fever/health-professional.html.

[3] https://www.cdc.gov/salmonella/2010/frozen-fruit-pulp-8-25-10.html.

[4] Victor C. Vaughan, A Doctor’s Memories (Indianapolis: Bobbs-Merrill, 1926), 369ff., 386.

[5] Thomas D. Brock, Robert Koch: A Life in Medicine and Bacteriology (Wash, DC: ASM Press, 1998 [1988]), 255-256.

[6] Gwyn Macfarlane, Alexander Fleming: The Man and the Myth (Cambridge: Harvard University Press, 1984), 54-55.

[7] Victoria A. Harden, Inventing the NIH: Federal Biomedical Research Policy, 1887-1937 (Baltimore:  Johns Hopkins University Press, 1986), 41.

[8] Grace McDougall, A Nurse at the War:  Nursing Adventures in Belgium and France (NY: McBride, 1917), 111, 117; Alice Isaacson, Diary of 1917, Library & Archives, Canada, letter of 16 February 1917. 

[9] Joseph Goldstein, Anna Freud, et al., In the Best Interests of the Child (New York:  Free Press, 1986).

Copyright © 2021 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in courses and seminars let him know via info[at]keynote-books.com.


The Picture They Wouldn’t Publish

 

Here is the picture the publisher, McFarland & Co., refused to include in Easing Pain on the Western Front, my study of the American and Canadian nurses of World War I.  It was taken in a French field hospital in 1918, and shows a Red Cross nurse clasping the hand of a French soldier, a poilu, most of whose face has been blown off by artillery. She gazes outward stoically, as if to say: “Yes, I am a nurse, and this is the kind of boy the ambulances bring to us during the rushes.  Soon the surgeons will decide whether he will live and whether they can do anything for him.  For now I hold his hand.”  A more junior sister stands beside her and bears witness.

Why did the publisher exclude this powerful image from a book about the nurses of the Great War?  I was told the problem was one of image quality, viz., the picture did not reproduce at 300 dpi and was pixelated.  So it had to go.

Earlier I had made the case for including WWI photos of nurses in action that did not meet the publisher’s quality threshold in this way:

Not all books are created equal; not all pictures are created equal; and not all pictures are equally relevant to the story an author tells.  Criteria of inclusion should accommodate, at least to some extent, subject matter and narrative.  In the context of this particular book, I find it difficult to imagine anyone finding fault with the publisher for including rare and graphic images, never seen by anyone except a handful of medical historians, that  illustrate – and do so poignantly – my central arguments about the emergence of scientific nursing practice on the Western front.  Certainly, should McFarland choose to allow a small number of lower resolution images, I would gratefully acknowledge the publisher’s kind accommodation in the interest of a more vivid historical presentation.

With the photo at issue, however, my efforts were unavailing.  After all, the publisher opined, it would be criticized, presumably by reviewers, for including such a low-resolution photo in one of its books.  Really?  If any of my readers know of a single instance in which a book publisher has been criticized in print for including a rare, low-resolution period photo in a work of history, please let me know.

Now, the photo is graphic and unsettling.  Was the subject of the photo the underlying reason the publisher excluded it from the book?  Probably not.  I do think, however, that the horrific image of a faceless soldier fortified the resolve to exclude it on “low resolution” grounds.  But it is precisely the nature of the picture – what it shows us – that speaks to the subject of this particular book.  I sought its inclusion not as yet another depiction of the horror of modern mechanized warfare, a reminder of what exploding artillery shells do to human faces at close range.  Rather, the photo provides poignant visual representation of a Great War nurse in action, of the steadiness and steadfastness with which she faced up to the care of the faceless.

Even more to the point, the photo shows one key instrument deployed by nurses in this war and, to some extent, all wars:  the hand.  We behold the nursing hand as an instrument of stabilizing connection, of calming containment.  Easing Pain examines the many uses of the nurse’s hand in diagnosis and treatment.  To these uses must be added the hand as an instrument of touch-mediated attachment.  Seen thus, the photo is a wartime embodiment in extremis of touching and being touched as a vehicle of therapeutic “holding” in the sense of the British pediatrician and psychoanalyst D. W. Winnicott.  The role of the hand in nursing care antedates and postdates the era of nursing professionalization.  I explore the topic at length in my previous book, In the Hands of Doctors:  Touch and Trust in Medical Care.

Here then is the context in which the photo would have been introduced.  References to the nurses quoted therein may be found in the book.

_______________

Nursing hands also monitor the nurse’s own performance, especially the acclimatization of new nurses to the demands of the reception hut.  Shirley Millard reports how her hands “get firmer, faster.  I can feel the hardness of emergency setting in.  Perhaps after a while I won’t mind.” More importantly, nursing hands stabilize soldiers whose fear and pain off the battlefield leave them overwhelmed and child-like.  With soldiers who arrive at casualty clearing stations in surgical shock, massive blood loss is compounded by sepsis, pain, and anxiety, making it incumbent on nurses not only to institute stabilizing measures, but to make the soldier feel “he is in good and safe hands.” Touch is a potent instrument for inducing this feeling.  Soldiers clutch hands as they ask, “Is it all right?  Don’t leave me.”  But it is usually not all right, and it is the nurse’s hand that provides a lifeline of human attachment to relieve a desolation that is often wordless:  “Reaching down to feel his legs before I could stop him, he uttered a heartbreaking scream.  I held his hand firmly until the drug I had given him took effect.”  When panic overwhelms and leaves soldiers mute, the hand communicates what the voice can not:  “He seized my hand and gripped it until it hurt . . . He looked up at me desperately, hanging onto my hand in his panic.”  The hand offers consolation when there are no words:  “The bandage around his eyes was soaked with tears.  I sat on his bed and covered his hand with mine.”

The nurse’s hands mark attachment and impending loss.  Soldiers become terrified at the time of surgery.  The reality of amputation, the painful aftercare it will entail, and the kind of life it permits thereafter can be overwhelming.  It is 1915, and the American Maud Mortimer is in a field hospital at the edge of Belgium, only five miles from the firing line.  A patient with whom she has connected, “Petit Père,” is about to have his leg amputated.  He makes her promise that when he comes around from the anesthesia she will be there, and that she will “hold his hand through the first most painful dressing.”  The amputation complete, he gazes up at her:  “Hold my hand tight and I will scream no more.”

But the attachment can transcend treatment-related trauma and become perduring.  Now it is April 1918, three years later, and a pause in the action permits Helen Boylston’s hospital to ship 26 ward patients to England.  One of her patients, Hilley, begs her to let him remain.  “I went out to the ambulance with him,” she recounts, “and he clung tightly to my hand all the way.  I almost cried.”  Such separation, with the hand-clinging it elicits, reminds us that a wounded soldier’s parting from his nurses can be a major loss, even when it is a prelude to greater safety and fuller recovery.  The vigorous hand-clinging of the living, even in loss, is far preferable to the enfeebled squeeze of the dying.  With the latter, the nurse’s hand becomes an instrument of palliation, interposing human touch between living and dying, easing the transition from one to the other:  “I held his hand as he went . . .  Near the end he saw me crying and patted my hand with his two living fingers to comfort me.”  Expressions of gratitude and affection, hand-communicated, are part of the process.  The hand continues to communicate as the body shuts down:  “He was ever so good and tried to take milk and food almost up to the end but he was unable to speak and not really conscious, though he could hold my hand and squeeze it which was so sweet of him.”

________________

EASING PAIN ON THE WESTERN FRONT:  American Nurses of the Great War and the Birth of Modern Nursing Practice

Paul E. Stepansky

McFarland & Co.    978-1476680019    2020    244pp.    19 photos    $39.95pbk/$19.95 Kindle eBook


Copyright © 2020 by Paul E. Stepansky.  All rights reserved.  The author kindly requests that educators using his blog essays in their courses and seminars let him know via info[at]keynote-books.com.

Remembering the Nurses of WWI (V)

“They were very pathetic, these shell shocked boys.”

[The fifth of a series of essays about the gallant nurses of World War I commemorating the centennial of America’s entry into the war on April 6, 1917.  Learn more about the nursing response to shell shock in Easing Pain on the Western Front:  American Nurses of the Great War and the Birth of Modern Nursing Practice (McFarland, 2020)].

Every war has mental casualties, but each war has its own way of understanding them.  Each war, that is, has its own nomenclature for what we now term “psychiatric diagnosis.”  To Napoleon’s Surgeon in Chief, Dominique-Jean Larrey, we owe the diagnosis nostalgie (nostalgia); it characterized soldiers whose homesickness left them depressed and unable to fight.  This was in 1815.  During the American Civil War, nostalgia remained in vogue, but a new term, “irritable heart” (aka “soldier’s heart” or “Da Costa’s Syndrome”), was coined just after the war to label soldiers whose uncontrollable shivering and trembling had been accompanied by rapid heartbeat and difficulty breathing.  During the 10-week Spanish-American War of 1898, soldiers who broke down mentally amid heat, bugs, bullets, and rampant typhoid fever were diagnosed with “tropical weakness.”

And this brings us to World War I, the war that bequeathed the diagnosis of shell shock.  At first, the nurses of WWI were no less baffled than the doctors by the variable expressions of shell shock – most cases of which, it was learned, arose some distance from the exploding shells at the Front.  The term was coined in 1915 by the British physician and psychologist Charles Myers, then part of a volunteer medical unit in France.  Myers immediately realized the term was a misnomer:  he coined it after seeing three soldiers whose similar psychological symptoms followed the concussive impact of artillery shells bursting at close range, only to discover that many men with the same symptoms were nowhere near exploding shells.  In May, 1917, the American psychiatrist Thomas Salmon, with the approval of the War Department, traveled to England to observe how the British treated their shell shocked soldiers; he returned home convinced that shell shock was a real disorder, not to be taken for malingering, and a disorder that was amenable to psychological treatment.  Salmon was the Medical Director of the National Committee for Mental Hygiene, so his report to the Surgeon General carried weight, and the Army began making arrangements for treating the mental casualties that, he predicted, could flood overseas and stateside hospitals following America’s entry into the war.[1]

Nurses were unconcerned with the animated debate among physicians on the nature of shell shock.  Was it a kind of brain concussion that resulted from the blast force of exploding shells?  A physiological response to prolonged fear?  A psychological reaction to the impact of industrial warfare?  A product of nervous shock analogous to that suffered by victims of railway accidents in the later nineteenth century?  A Freudian-type “war neurosis” that plugged into earlier traumas that the soldier had suffered?  They did not care.  Theirs was the everyday world of distressed soldiers, whose florid symptoms overlaid profound anxiety and for whom the reliving of trauma and its aftermath occurred throughout the day.  Theirs was the world, that is, of management and containment.

A shell shocked soldier in a trench during the Somme offensive of September, 1916

The management part could be bemused, good-naturedly patronizing, a tad irritated.  Shell shock victims, after all, made unusual demands on nurses.  The patients were always falling out of bed and otherwise “shaking and stammering and looking depressed and scared.”  Simple tasks like serving meals could be a project, as attested to by the British nursing assistant Grace Bignold, who, prior to becoming a VAD in 1915, worked as an orderly at a London convalescent home.  There, she recalled,

One of the things I was told was that when I was serving meals . . . always to put the plate down very carefully in front of them and to let them see me do it.  If you so much as put a plate down in front of them in the ordinary way, when they weren’t looking, the noise made them almost jump through the roof – just the noise of a plate being put on a table with a cloth on it? [2]

Accommodation by the orderlies at mealtime paled alongside the constant burden on ward nurses who had to calm hospitalized shell shocked soldiers when exploding shells and overhead bombs rocked the hospital, taking patients back to the Front and causing a recurrence of the anxiety attendant to what they had seen or done or had done to them or to others.  And both, perhaps, paled alongside the burden of nurses in the ambulance trains that transported the shell shocked out of the trenches or off the battlefield.  “It was a horrible thing,” wrote the British VAD and ambulance nurse, Claire Elise Tisdall,

because they sometimes used to get these attacks, rather like epileptic fits in a way.  They became quite unconscious, with violent shivering and shaking, and you had to keep them from banging themselves about too much until they came round again.  The great thing was to keep them from falling off the stretchers, and for that reason we used to take just one at a time in the ambulance. . . . these were the so-called milder cases; we didn’t carry the dangerous ones.  They always tried to keep that away from us and they came in a separate part of the train.[3]

The latter were the “hopeless mental cases” destined, Tisdall recalled, for “a special place,” i.e., a mental hospital – in England, a “neurasthenic centre.”  But how to tell the difference?  The line between “milder” and “severe” cases of shell shock was subjectively drawn and constantly fluctuating; soldiers who arrived in the hospital with some combination of headaches, tremors, a stutter, memory loss, and vivid flashback dreams might become psychosomatically blind, deaf, or mute, or develop paralyzed or spastic limbs after settling into base hospitals and the care of nurses.  In their diaries and letters home, the nurses’ characterizations were not only patronizing but sometimes unkind:  shock patients, often incontinent, were “very pathetic”; they formed “one of the most pitiful groups” of soldiers.  Dorothea Crewdson referred to them as “dithery shell shocks” and “old doddering shell shocks.”  A patient who without warning got out of bed and raced down the hall clad only in his nightshirt was a “dotty poor dear.”  “It is sad to see them,” wrote Edith Appleton.  “They dither like palsied old men, and talk all the time about their mates who were blown to bits, or their mates who were wounded and never brought in.  The whole scene is burnt into their brains and they can’t get rid of the sight of it.”[4]

It is in the containment aspects of their care of the shell shocked that the nurses evinced the same caring acceptance they brought to all their patients.  After all, shell shocked patients, however they presented, were wounded soldiers, and their suffering was as real and intense as that of comrades with bodily wounds.  The nursing historian Christine Hallett, who writes of the WWI nurses with great sympathy and insight, credits nurses working with the shell shocked with an almost preternatural psychoanalytic sensibility in containing the trauma that underlay their symptoms.  The nurses, she claims, aligned themselves with the patients, however disruptive their outbursts and enactments, since they “sensed that insanity would be a ‘normal’ response for any man who fully realized the deliberateness of the destruction that had been unleashed on him.”  Hence, she continues,

Nurses conspired with their patients to ‘ignore’ or ‘forget’ the reality of warfare until it was safe to remember.  In this way they ameliorated the effect of the ‘psychic splintering’ caused by trauma.  They contained the effects of this defensive fragmentation – the ‘forgetting’ and the ‘denial’ – until patients were able to confront their memories, incorporate them as part of themselves and become ‘whole’ beings again.[5]

I follow Hallett in her insistence that nurses usually ignored the directive not to “spoil” shell shocked patients.  All too often, they let themselves get involved with them at the expense of maintaining professional distance.[6]  

But then the nurses were equally caring and equally prone to personal connection with all their patients, mental or not.  They were not psychotherapists, and the dizzying demands of their long days and nights did not permit empathic engagement in the psychoanalytic sense, beyond the all-too-human realization that the shell shocked had experienced something so horrible as to require a gentleness, a lightness of touch, a willingness to accept strange adaptive defenses that, with the right kind of nursing, might peel away slowly over time.  Here, for example, is one of Hallett’s examples of “emotional containment” on the part of the Australian army nurse Elsie Steadman:

It was very interesting work, some of course could not move, others could not speak, some had lost their memory, and did not even know their own names, others again had very bad jerks and twitching.  Very careful handling these poor lads needed, for supposing a man was just finding his voice, to be spoken to in any way that was not gentle and quiet the man ‘was done,’ and you would have to start all over again to teach him to talk, the same things applied to walking, they must be allowed to take their time.[7]

This sensitivity, this “very careful handling” of the shell shocked, was no different than the sensitivity of the mealtime orderlies, who knew to “put the plate down very carefully in front of them,” always making sure that the shell shocked saw them do it.  And of course there were accommodations out of the ordinary, a remarkable example of which comes from Julia Stimson, the American chief nurse of British Base Hospital 21 (and amateur violinist). Writing to her parents in late November, 1917, she related “an interesting little incident” that began when a patient knocked on her door and asked for the Matron:

He was so wobbly he almost had to lean up against the wall.  “Somebody told me,” he said, “that you had a violin.  I am a professional violinist and I have not touched a violin for five months, and today I couldn’t stand it any longer, so I got up out of bed to come and find you.”  I made him come in and sit down.  As it happened I had a new violin and bow, which had been bound for our embryo orchestra, here in my office.  The violin was not tuned up, but that didn’t matter.  The man had it in shape in no time and then he began to play and how he could play!  We let him take the violin down to his tent, and later sent him some of my music.  He was a shell shock, and all the evening and the next few days until he was sent to England he played to rapt audiences of fellow patients.[8]

With the shell shocked, the therapeutic gift of the WWI nurses resided less in their ability to empathize than in their acceptance that their patients had experienced horrors that could not be empathized with.  Their duty, their calling, was simply to stay with these soldiers in an accepting manner that coaxed them toward commonality among the wounded, the sense that their symptoms and the underlying terror were not only understandable but unexceptional and well within the realm of nursing care.  In this sense – in the sense of a daily willingness to be with these soldiers in all their bodily dysfunction, mental confusion, and florid symptomatic displays – the nurses strove to normalize shell shock for the shell shocked.  After all, the shell shocked, however dithery, shaking, and stammering, were depressed and scared “only at times.”  Otherwise, continued Dorothea Crewdson, “they are very cheery and willing.”  Mary Stollard, a British nurse working with shell shocked soldiers at a military hospital in Leeds, noted that many of the boys were very sensitive to being incontinent.

They’d say, “I’m terribly sorry about it, Sister, it’s shaken me all over and I can’t control it.  Just imagine, to wet the bed at my age!”  I’d say, “We’ll see to that.  Don’t worry about it.”  I used to give them a bedpan in the locker beside them and keep it as quiet as possible.  Poor fellows, they were so embarrassed – especially the better-class men.[9]

But such embarrassment was a relic of civilian life.  It had no place among battle-hardened nurses who coped daily with the sensory overload of trench warfare:  the overpowering stench of gangrenous infections and decaying flesh; the sight of mutilated soldiers without faces or portions of torso, not to mention missing arms and legs; the screams of gassed soldiers, blind and on fire and dying in unspeakable pain.  Alongside such things, how off-putting could incontinence be?  The fact is that shell shocked soldiers, no less than the nurses themselves, were warriors.  Warriors are wounded and scarred in many ways; nurses themselves fell victim to shell shock, even if they were not officially diagnosed as such.[10]  Knowing full well that shell shocked soldiers declared physically unfit and shipped back home were often subject to stigma and humiliation, Ellen La Motte offered this dismal prognosis for one who had lost the ability to walk and could no longer serve the nation:  “For many months he had faced death under the guns, a glorious death.  Now he was to face death in another form.  Not glorious, shameful.”[11]

_________________________

[1] Earl D. Bond, Thomas W. Salmon – Psychiatrist (NY: Norton, 1950), 83-84.

[2] Julia C. Stimson, Finding Themselves:  The Letters of an American Army Chief Nurse in a British Hospital in France (NY: Macmillan, 1918), 41; Dorothea Crewdson, Dorothea’s War:  A First World War Nurse Tells her Story, ed. Richard Crewdson (London: Weidenfeld & Nicolson, 2013), loc 4383; Grace Bignold, in Lyn MacDonald, The Roses of No Man’s Land (London: Penguin, 1993), 233.

[3]  Claire Elise Tisdall, in MacDonald, Roses of No Man’s Land, 233-34.

[4] Mary Stollard, in MacDonald, Roses of No Man’s Land, 231-32; Stimson, Finding Themselves, 41; Crewdson, Dorothea’s War, loc 967; Edith Appleton, A Nurse at the Front:  The First World War Diaries, ed. R. Cowen (London:  Simon & Schuster UK, 2012), 184.

[5] Christine Hallett, Containing Trauma:  Nursing Work in the First World War (Manchester: Manchester University Press, 2009), 163.

[6] Hallett, Containing Trauma, 165, 177.

[7] Hallett, Containing Trauma, 172-73.

[8] Stimson, Finding Themselves, 163.

[9] Crewdson, Dorothea’s War,  loc 4383; Stollard, in MacDonald, Roses of No Man’s Land, 232.

[10] E.g., Crewdson, Dorothea’s War, loc 4914.  In “Blind,” Mary Borden writes of herself as “jerk[ing] like a machine out of order.  I was away now, and I seemed to be breaking to pieces.”  She was sent home as “tired.”  Mary Borden, The Forbidden Zone (London: Hesperus, 2008 [1929]), 103.  On the military’s unwillingness to diagnose women as “shell shocked,” see Hannah Groch-Begley, “The Forgotten Female Shell-Shock Victims of World War I,” The Atlantic, September 8, 2014 (https://www.theatlantic.com/health/archive/2014/09/world-war-ones-forgotten-female-shell-shock-victims/378995).

[11] Ellen N. La Motte, The Backwash of War: The Human Wreckage of the Battlefield as Witnessed by an American Hospital Nurse (NY: Putnam’s, 1916), 239.

Copyright © 2017 by Paul E. Stepansky.  All rights reserved.  The author kindly requests that educators using his blog essays in their courses and seminars let him know via info[at]keynote-books.com.

Remembering the Nurses of WWI (IV)

“Mustard gas burns.  Terrific suffering.”

 [The fourth of a series of essays about the gallant nurses of World War I commemorating the centennial of America’s entry into the war on April 6, 1917.  The nursing care of soldiers exposed to poison gas on the Western Front is explored at greater length in chapter 4 of  Easing Pain on the Western Front:  American Nurses of the Great War and the Birth of Modern Nursing Practice (McFarland, 2020)].

Now, sadly, chemical weapons are back in the news.  But large-scale chemical warfare reaches back over a century.  In WWI, Germany released 5,730 cylinders of chlorine gas across a four-mile stretch of no-man’s-land into the Allied lines during the Second Battle of Ypres in April, 1915.  Thus the birth of chemical warfare.  Britain replied in kind, releasing cylinders of chlorine gas during the Battle of Loos in September of that year, and Germany upped the horror in July, 1917, delivering artillery shells filled with dichlor-ethyl-sulphide or “mustard gas” just prior to the Third Battle of Ypres.

Chlorine gas attacked the airways.  Severe respiratory swelling and inflammation killed many instantly, and the rest struggled to nearby casualty clearing stations with acute congestion of the lungs, pneumonia, and blindness.  Soldiers who had inhaled the most gas arrived with heavy discharge of a frothy yellow fluid from their noses and mouths as they drowned in their own secretions.  For the rest, partial suffocation persisted for days, and long-term survivors had permanent lung damage, chronic bronchitis, and occasionally heart failure.  Mustard gas burned the skin and respiratory tract, stripping the mucous membrane off the bronchial tubes and causing violent inflammation of the eyes.  Victims were left in excruciating pain and utterly helpless.[1]

Nurses, no less than physicians, were initially confused about the nature of the gas and the severity of its effects.[2]  But they quickly came up to speed and realized that soldiers suffering from poison gas posed a nursing challenge no less formidable than those dying from gangrenous wounds.  Nurses were accustomed to losing patients, but not to being powerless to provide comfort care, to ease patients’ agony during their final days.  How to nurse on when nursing was unavailing, when the burns were so terrible that “nothing seems to give relief”?[3]

WWI nurses in gas masks treat soldiers after a gas attack

Of course, nurses did what little they could.  Inflamed eyes were repeatedly irrigated with alkaline solution.  Respirators soaked in hyposulphite could be provided to patients able to use them.  At American Base Hospital 32, soldiers who had breathed in mustard gas were given a mixture of guaiacol, camphor, menthol, oil of thyme, and eucalyptus that caused them to expectorate inflammatory material.  According to Maude Essig, an American Red Cross nurse who worked at the hospital, the nurses helped devise it.[4]

According to Essig, the mixture provided some temporary relief to soldiers with burning throats and mouths.  But nurses otherwise echoed a shared sense of impotence when it came to making gassed patients comfortable.  During the Second Battle of Ypres, when chlorine gas was first used by the Germans, Canadian nurse Agnes Warner recalled the initial wave of gassed troops:  “There they lay, fully sensible, choking, suffocating, dying in horrible agonies.  We did what we could, but the best treatment for such cases had yet to be discovered, and we felt almost powerless.”[5]   Shirley Millard was graphic in describing the severe burn patients who rendered nursing futile.  “Gas cases are terrible,” she wrote at war’s end in November, 1918.

They cannot breathe lying down or sitting up.  They just struggle for breath, but nothing can be done . . . their lungs are gone . . . literally burnt out.  Some with their eyes and faces entirely eaten away by the gas, and bodies covered with first degree burns.  We try to relieve them by pouring oil on them.  They cannot be bandaged or even touched.[6]

Whereas soldiers with even the worst of battlefield wounds usually did not complain, the gas cases “invariably are beyond endurance and they cannot help crying out.”  Millard’s judgment was affirmed by many others.  Maude Essig wrote of a “star patient,” one Leo Moquinn, who “was terribly burned with mustard gas while carrying a pal of his three-quarters of a mile to safety after the gas attack.”  Except for his back, she added, his “entire body is one third-degree burn.  He cannot see and has developed pneumonia and he is delirious.”[7]  Such were the burn patients.

Essig’s reference to pneumonia alludes to the multitude of infectious diseases that accompanied battlefield wounds and complicated (or prevented) recovery.  Pneumonia could be rampant during winter months; gangrene and tetanus were prevalent year round.  Typhoid was partially controlled by the antityphoid serum injections troops received, usually prior to disembarkation but otherwise in the reception huts of clearing stations and field hospitals.  But bronchitis, trench fever, diphtheria, cholera, dysentery, meningitis, measles, mumps, erysipelas,[8] and, finally, influenza, were not.  Nurses recorded deaths resulting from various combinations of the foregoing, such as Edith Appleton’s “poor little boy, Kerr,” who died of gas, pneumonia, and bronchitis.[9]

Infected shrapnel and gunshot wounds could be irrigated or bathed continuously in antiseptics, first developed in the 1870s and packed in sterile dressings available in sealed paper packages since 1893.[10]  But in the preantibiotic era, nursing care of systemic infections was limited to the same palliatives we employ today:  rest, warmth, hydration, nutrition, aspirin (and, back then, quinine), all amplified by the nurse’s caring, maternal presence.

Trench foot, a combination of fungal infection, frostbite, and poor circulation, was endemic during the winter months, when soldiers lived in trenches flooded with icy water, often waist-high, for days on end.  They struggled into clearing stations with feet that were “hideously swollen and purple,” feet that were “raw with broken blisters and were wrapped in muddy, dripping bandages.”[11]  But trench feet, however disabling, at least permitted more active measures.  In addition to giving morphine, nurses had a treatment protocol to follow, such as this one at a British Military Hospital in the winter of 1917:

We had to rub their feet every morning and every evening with warm olive oil for about a quarter of an hour or so, massage it well in and wrap their feet in cotton wool and oiled silk – all sorts of things just to keep them warm – and then we put big fisherman’s socks on them.  Their feet were absolutely white, swollen up and dead.  Some of their toes dropped off with it, and their feet looked dreadful.  We would say, “I’ll stick a pin in you.  Can you feel it?”  Whenever they did feel the pin-prick we knew that life was coming back, and then we’d see a little bit of pink come up and everybody in the ward would cheer.[12]

It is the dizzying confluence of multiple battlefield injuries, many gangrenous, with the effects of poison gas and intercurrent infectious diseases that threatened to, and occasionally did, overwhelm the WWI nurses.  Reading their diaries and memoirs, one sees time and again how the nurses’ calling, amplified by the camaraderie of other nurses, surgeons, and orderlies who felt similarly called, overpowered resignation and despair.  In a diary entry of September 14, 1916, Kate Luard referred to the “very special nursing” required by soldiers with multiple severe injuries.  She had in mind

The man with two broken arms has also a wound in the knee-joint – in a splint – and has had his left eye removed today.  He is nearly crazy.  Another man has compound fractures of both legs, one arm, and head, and is quite sensible.  Another has both legs amputated, and a compound fracture of [the] arm.  These people – as you may imagine – need very special nursing.[13]

If one adds to such clusters the serious general infections that often accompanied battlefield injuries, one has some sense of what nurses were up against, and just how special their nursing had to be.  When influenza, the deadly Spanish flu, began to swamp clearing stations and hospitals in the spring of 1918, nurses simply added it to the list of challenges to be met with the resources at hand.  And they did so in the knowledge that as many as half of the infected would die.[14]  Beatrice Hopkinson, a British auxiliary nurse, recorded the new protocol developed at her General Hospital in St. Omer to meet the rush of influenza patients:

During those early days of the flu the treatment was to strip the patients in one tent, their clothing going immediately to the fumigator.  Then, the patient was bathed in disinfectant and taken to the different wards.  Some of the patients were very ill and died with pneumonia after a few days.[15]

The early days of the pandemic gave way to the later days, and after the Armistice was signed on November 11, 1918, nurses occasionally felt boredom, even mild malaise, when the demands of “special nursing” relented and they increasingly found themselves nursing “mostly mild influenza cases.”[16]

I admire the nurses of WWI because they did what was required of them absent any preexisting sense of what they could be required to do, absent, that is, anything approaching a “job description.”  Without medical residents, internists, and infectious disease specialists to fall back on, they collapsed specialism into global care-giving identities.  This meant they managed multiple war wounds and intercurrent infections, prioritizing among them and continuously adjusting treatment goals in the manner of highly skilled primary care physicians.  By the same token, they realized the importance of compassion in the face of ameliorative impotence.  Somehow they found time to be present, to slip into a ward with a soldier dying of gas gangrene every few minutes “to do something perfectly useless that might perhaps give a ray of comfort.”[17]

Ironically, given the environment in which they labored and their “patient population” of soldiers in extremis, the nurses embodied the values of primary care medicine, since they took upon themselves the role of primary caregivers obligated to stay with their patients through thick and thin, to summon senior colleagues and surgeons as needed, and to ease life transitions, whether to recovery, convalescence, lifelong disability, or death.[18]  And they did so whatever the weight of multiple assaults on their own bodily and mental integrity.

Nurses, technically noncombatants, suffered alongside the troops.  During rushes, their clearing stations, hospitals, and living quarters were under land and air assault and occasionally took direct hits.  They contracted infectious diseases, especially flu,[19] during which they usually carried on with the aid of simple analgesics until they felt better or worse.  When Helen Boylston became feverish in November, 1918, a symptom she attributed to diphtheria, she braced herself for a long-awaited evening dance with “quantities of quinine and finally a stiff dose of whiskey, and I felt ready for anything.” But not ready enough, it turned out.  She collapsed at the dance with a bad chill and had to be carried to her bed.  When she went on duty the following day, she became delirious in the ward and was lugged off by an orderly and subsequently seen by a doctor. “So here I am,” she wrote in her diary.  “I’ve developed a heart and a liver, and am as yellow as a cow-lily.  I have to lie flat on my back and be fed.  For three days I lay motionless all day long, not caring to move or to speak, I was so tired.”  Boylston was soon joined by a second nurse with diphtheria, placing the camp “in a panic,” with every staff member now given daily throat cultures.[20]

Despite training in the use of gas masks in the event of direct shelling, mask-less nurses suffered the effects of poison gas from daily proximity to patients on whom the shells had landed.  Their own vulnerability to gas attack and attenuated exposure to the poison lent special intensity to their care of burn victims.  They understood, with Maude Essig, that mustard gas burns indeed meant “terrific suffering.”[21]  Whether infected or poisoned, they usually labored on until they collapsed or were so near collapse that medical colleagues ordered them out of the wards, whether to bed, to a general hospital for treatment, or to a nearby convalescent home for recuperation and a desperately needed “time out.”[22]

Civil War nurses too eased transitions to death, but their nursing goal during a soldier’s final days was to reconfigure mortal battlefield injury into the promise of a beneficent afterlife.  So they stayed with the dying, eliciting final confessions of sinful living, allowing soldiers to reminisce and reflect, and soliciting (and writing down) words of comfort to sustain family members in believing that their soldier had died a “good death.”[23]  World War II, on the other hand, witnessed the development of new vaccines, a national blood bank program, the widespread availability of sulfa drugs in 1941 and penicillin in 1944, major advances in the control of shock and bleeding and in battlefield surgery, and much greater speed of evacuation of the seriously wounded to European and stateside base hospitals.  Taken together, these advances created a buffer between nurses and the prolonged witnessing of soldiers dying in unrelievable pain.

It was the nurses of WWI who took it on the chin.  They could not sustain themselves and their patients with the naturalistic view of the afterlife popular during the Civil War.[24]  Nor did they have the benefit of more “modern” technology and organization to shield them, if only somewhat, from the experiential onslaught of  dying soldiers.   It was not death per se but the agony of dying – from infected battle wounds and/or systemic infections,  gas gangrene, chlorine and mustard gas,  rushed amputations followed by reinfection and blood loss –  that took them to their own existential no-man’s-land, the kind we encounter in the writings of Mary Borden and Ellen LaMotte.

In the summer of 1917, the nurses at No. 12 General Hospital on the outskirts of Rouen struggled with a gas victim whose paroxysms of coughing came every minute and a half “by the clock,” and who had not slept in four days.  To quiet him, they rigged up a croup tent under which they took turns holding a small stove that heated a croup kettle from which the soldier could breathe the steam.  When sleep finally came, they were “ready to get down on their knees in gratitude, his anguish had been so terrible to watch.”  To their head nurse, Julia Stimson, they remarked that “they could not wish the Germans any greater unhappiness than to have them have to witness the sufferings of a man like that and know that they had been the cause of it.”[25]

It was bearing witness to unrelievable suffering that was the worst assault borne by the nurses.  “It is dreadful to be impotent, to stand by grievously stricken men it is impossible to help, to see the death-sweat gathering on young faces, to have no means of easing their last moments.  This is the nearest to Hell I have yet been.”  This is the voice of an anonymous British Red Cross nurse, unsettled by the dying Belgian soldiers she encountered on ambulance runs in the fields of West Flanders in the winter of 1915.  The American nurses at No. 12 General Hospital brushed up against this same hell, and they could think of no greater punishment for enemy combatants than to witness what they witnessed, often for weeks on end.  And yet the nurses of WWI were not stymied by seeming impotence in the face of pain.  They labored on to the breaking point in the service of soldiers who, all too often, were already broken.  This makes them warriors of care and, in a devotion to patients that was literally and not metaphorically self-less, heroes of the first rank.

_______________________

[1] Christine E. Hallett, Veiled Warriors:  Allied Nurses of the First World War (Oxford: OUP, 2014), 79-80, 203.

[2] E.g., Julia C. Stimson, Finding Themselves: The Letters of an American Army Chief Nurse in a British Hospital in France (NY: Macmillan, 1918), 80; John & Caroline Stevens, eds., Unknown Warriors:  The Letters of Kate Luard, RRC and Bar, Nursing Sister in France 1914-1918 (Stroud: History Press, 2014), loc 1945.

[3] Maude Frances Essig, My Trip with Uncle Sam, 1917-1919:  How We Won World War I, unpublished journal written during the summer, 1919, entry of March 24, 1918.

[4] Agnes Warner, ‘My Beloved Poilus’ (St. John: Barnes, 1917), loc 861.

[5] Warner, ‘My Beloved Poilus’, loc 814.

[6] Shirley Millard, I Saw Them Die: Diary and Recollections (New Orleans, LA: Quid Pro, 2011), loc 514.

[7] Essig, My Trip with Uncle Sam,  entry of March 24, 1918.

[8] Erysipelas is an acute bacterial infection of the upper dermis, usually of the arms, legs, and/or face, that is accompanied by red swollen rashes.  Without antibiotic treatment, it can spread through the bloodstream and cause sepsis.

[9] Edith Appleton, A Nurse at the Front:  The First World War Diaries, ed. R. Cowen (London:  Simon & Schuster UK, 2012), 111.

[10] Rodney D. Sinclair & Terence J. Ryan, “A Great War for Antiseptics,” Australas. J. Dermatol., 34:115-118, 1993.  These nineteenth-century antiseptics included salicylic acid, thymol, eucalyptus oil, aluminum acetate, and boric acid.

[11] Helen Dore Boylston, Sister: The War Diary of a Nurse (NY: Ives Washburn, 1927), loc 154.

[12] Kathleen Yarwood (VAD, Dearnley Military Hospital), in Lyn MacDonald, The Roses of No Man’s Land (London: Penguin, 1993 [1980]), 197-198.

[13] Luard, Letters, loc 1245.

[14] Millard, I Saw Them Die, loc 472.

[15] Beatrice Hopkinson, Nursing through Shot & Shell: A Great War Nurse’s Story, ed. Vivien Newman (South Yorkshire: Pen & Sword, 2014), loc 1999.

[16] Hopkinson, Nursing Through Shot & Shell, loc 2609.

[17] [Kate Norman Derr] “Mademoiselle Miss”: Letters from an American Girl Serving with the Rank of Lieutenant in a French Army Hospital at the Front, preface by Richard C. Cabot (Boston:  Butterfield, 1916), 76-77.

[18] For an exposition of these values and how they gained expression in American medicine in the nineteenth and twentieth centuries, extending through “general practice” of the 1950s and ’60s, see Paul E. Stepansky, In the Hands of Doctors:  Touch and Trust in Medical Care (Santa Barbara: Praeger, 2016).

[19] “The flu is back again and everybody has it, including me.  I’ve run a temperature of one hundred and two for three days, can hardly breathe, and have to sleep on four pillows at night.” Boylston, Sister, loc 630.

[20] Boylston, Sister, loc 1350, 1357.

[21] Essig, My Trip with Uncle Sam, entry of March 23, 1918.

[22] E.g., Luard, Letters, loc 1247:  “Sister D, the Mother of all the Abdominals, has her marching orders and goes down to Rouen to a General Hospital tomorrow.  Her loss is irreparable.”  Edith Appleton  recounts taking care of three sick nurses and a sick VAD at one time:  “I have begun to feel like a perpetual night nurse to the sick sisters as I have another one to look after tonight with an abscess in her ear”(A Nurse at the Front, p. 123).  Maude Essig contracted erysipelas in the spring of 1918 and reported feeling “awfully sick” the following fall, when she relied on “quinine and aspirin in large doses” to keep going (My Trip with Uncle Sam, entries of April 9, 1918, April 14, 1918, and October 27, 1918).

[23] Drew Gilpin Faust, This Republic of Suffering:  Death and The American Civil War (New York: Vintage, 2008), chapter 1.

[24] Faust, Republic of Suffering, pp. 178, 187.

[25] Stimson, Finding Themselves, pp. 80-81.

Copyright © 2017 by Paul E. Stepansky.  All rights reserved.  The author kindly requests that educators using his blog essays in their courses and seminars let him know via info[at]keynote-books.com.

Remembering the Nurses of WWI (III)

“He’s saved, and that makes up for much.”

[The third of a series of essays about the gallant nurses of World War I commemorating the centennial of America’s entry into the war on April 6, 1917.  Learn more about the range of battlefield injuries and infectious diseases treated by nurses on the Western Front in Easing Pain on the Western Front: American Nurses of the Great War and the Birth of Modern Nursing Practice (McFarland, 2020)]. 

Of course the surgeons of WWI could only save so many lives.  During battle “rushes,” when they operated up to 16 hours a day, they had to husband operative energy for soldiers who were savable, especially those whose saving could land them back in the trenches.  Many cases were deemed hopeless and simply handed back to the Sisters, to provide what meager palliative care they could while the soldiers awaited death in the tent set aside for them, the Moribund Ward.  But the Sisters sometimes refused to let matters rest, recognizing that the surgeons, often operating at breakneck speed in a state of exhaustion, did not have the last word on life and death.  So soldiers out of  surgeons’ hands might still find themselves in  nurses’ hands, where they were beneficiaries of  nursing so intensive and prolonged that, against all odds, it segued into a curative regimen.

Kate Norman Derr, an American nurse trained by the French Red Cross in 1914 and assigned to a French Army Hospital near the trenches of the Marne in 1915, recalled an Arab soldier who arrived at the hospital barely conscious.  His seven suppurating wounds led to two successive operations, after which surgeons pronounced him hopeless and handed him back to Nurse Derr:

It is one of the few dressings I have had that really frightened me; for it was so long, and every day for a week or more, I extracted bits of cloth and fragments of metal, sometimes at a terrifying depth.  Besides my patient was savage and sullen, all that is ominous in the Arab nature.  Gradually, however, the suppuration ceased, the fever fell, and suddenly one day Croya smiled.[1]

An American Red Cross nurse at work outside a French field hospital, 1915

MGH-trained Helen Dore Boylston, working in the post-surgical bone ward of her Base Hospital in the winter of 1918, was no stranger to surgical aftercare.  Boylston enjoyed her 40 patients, and singled out a plucky Australian of over sixty with a leg “torn to pieces.”  “He’s a Crotchety old darling, always raging and roaring about something,” she wrote her family:

One day, when I was here before, he complained of a pain in his thigh and began to run quite a temp.  As his leg was laid wide open anyhow, I took a look along the bone, Dad meantime cursing the roof off.  I found a walled-in pus pocket, and picking up a scalpel told Dad he’d better look out of the window for a minute, as I was going to have to hurt him.  Then, before he knew what I was about, I had slit the thing open.  At least two cupfuls of pus poured out, and his relief was tremendous at once.  Of course his temp dropped, too.  I put in a packing and watched it for a few days.  It cleared up promptly.  That was absolutely all.[2]

Nor did interventionist nursing end with bedside surgery.  Nurses often believed rehabilitation was possible when doctors did not, and they proved their point with paralyzed soldiers who, so the surgeons declared, would never walk again.  Consider Agnes Warner, a Canadian nurse working at the American Hospital in Neuilly, France.  Casualties from Alsace poured into the hospital in the spring of 1915, at which time a surgeon remarked that one of her patients, her “poor paralyzed man,” would never walk again.  Unfazed by the pronouncement and unwilling to rest content with giving the patient English lessons to help pass the time, she devised a program of rehabilitation that incorporated electrical stimulation, which only became available at the Hospital in late June.  Three weeks later, she had her paralyzed man out on the balcony, where he enjoyed fresh air for the first time in six months.  She was assigned another patient paralyzed from the waist down a month later, and then in mid-July she proudly reported on both patients:

My paralyzed man stood up alone last Sunday for the first time and now he walks, pushing a chair before him like a baby.  He is the happiest thing you can imagine; for seven months he has had no hope of ever walking again. . . . My prize patient, Daillet walks down stairs by himself now . . .We are all proud of him.  The doctor who sent him here from Besancon came in the other day to see how he was getting on and he could not believe it when he saw him.[3]

Worst of all were soldiers whose gaping wounds and limbless stumps were saturated with anaerobic bacteria of the genus Clostridium – soil-dwelling bacteria that thrive in the absence of oxygen – from the heavily fertilized fields of Flanders and Northern France.  The bacteria entered cavities through dirt and debris picked up by exploding shell fragments; bullets and shrapnel typically drove pieces of bacteria-infested clothing into the body.  The result was the dreaded gas gangrene, easily detected by the darkened muscle, bubbling sound, and overpowering stench[4] emanating from the infected limb or body cavity.  Nurses could smell such cases a mile away (so to speak), and dreaded removing original aid station bandaging, often four or five days old, that revealed the “hideous and hopeless color of gangrene.”[5]  Prompt treatment in a Casualty Clearing Station (CCS), which typically meant amputation of an infected limb and antiseptic irrigation, might save a soldier’s life.  But left unattended in trenches and on battlefields for three, four, even five days, soldiers arrived at clearing stations with septicemia (blood poisoning), which foretold an agonizing death, often within hours, almost always within a few days.

Among the multitude of stressors that made up ward nursing in CCSs and field hospitals, ministering to dying gangrenous soldiers was at the top of the list.  What is remarkable is that even here nurses occasionally rejected the medical verdict and resolved to nurse on with those awaiting death in the Moribund Ward.  This was true of Kate Luard, who, in the midst of the Battle of Arras in May, 1917, battled on for her dying soldiers.  She was, she wrote home, “engaged in a losing battle with gas gangrene again – in the Moribund Tent – a particularly fine man, too.”  But then, a month later, she began working with “two given-up boys” who could not be revived the preceding day.  Still, the boys seemed to her “not hopeless” and she resolved to “work” on them.  The result repaid the effort,  “and after more resuscitation they are now both comfortably bedded in one of the Acute Surgicals, each with a leg off and a fair chance of recovery.”  A few days later, she wrote that her “two resuscitated boys in the Moribund Ward are all right.”  To be sure, many dying soldiers were revived only to develop gangrene above their amputations and die,  but Luard never stopped trying.  If one of her  gangrenous boys was “going wrong” on a particular day,  she would counter that “moribund head cases are smoking pipes and eating eggs and bread and butter. The kidney man is being dressed with [the antiseptic] Flavine and has had a leg off and is nearly convalescent!”[6]

The vast majority of nursing saves went unrecorded, perhaps noticed at the time by a colleague, a supervisor, even the Head Matron.  Without the wartime diaries and letters the nurses left behind, we would have little inkling of their quiet struggles to keep forsaken soldiers alive.  Such struggles take us far from the world of high-tech nursing, even in its low-tech WWI incarnation.  What we behold, rather, is hard-core, soft-touch nursing, abetted by a Rube Goldberg inventiveness in making use of materials at hand, somehow garnering materials not easily obtainable, and then patiently titrating treatments (including food intake) in a manner responsive to states of severe, even deathlike, debilitation.

A little Night Sister in the Medical last night pulled a man round who was at the point of death, in the most splendid way.  He had bronchitis and acute Bright’s Disease, and Captain S. and the Day Sister had all but given him up; but at 10:30 p.m., as a last resource, Captain S. talked about a Vapour Bath [steaming up his room], and the little Sister got hold of a Primus [stove] and some tubing and a kettle and cradles, and got it going, and did it again later, and this morning the man was speaking and swallowing, and back to earth again.  He is still alive tonight, but not much more.[7]

You will like to hear of the living skeleton with wounds in back and hands and shoulder that they brought me filthy and nearly dead from another pavilion.  That was nine days ago.  I diagnosed him as a case of neglect and slow starvation, and treated him accordingly – malted milk, eggs, soup, and alcohol to the fore.  His dressing took one and a half hours every day, and all nourishment given a few drops at a time, and early all the time, for he was almost too weak to lift an eyelid, much less a finger.  This morning he actually laughed with me and tried to clench his fist inside the dressings to show me how strong he was.  He’s saved, and that makes up for much.[8]

I happened on a corpse-like child [a teenage soldier] the other day being brought into the Moribund Ward to die and we got to work on resuscitation, with some success.  He had been bleeding from his subclavian artery and heard them leave him for dead in his shell-hole.  But he crawled out and was eventually tended in a dug-out by ‘a lad what said prayers with me,’ and later the hole in his chest was plugged and he reached us – what was left of him.  When, after two days, he belonged to this world again, I got Capt. B. to see him, and he got Major C. to operate and tied the twisted artery which I had re-plugged – he couldn’t be touched before – and cover with muscle the hole through which he was breathing, and he is now a great hero known as ‘the Prince of Wales’.[9]

Nor was orthopedic inventiveness beyond the pale.  In fracture wards up and down the Front, war nurses were adepts of the Balkan frames affixed to beds, virtuoso adjusters of the heavy weights and cables that maintained constant traction of fractured long bones suspended from above.  But they improvised as well.  Kate Derr provides an example, describing the ingenious contraption she rigged up for a soldier with badly damaged joints.  She wrote home from Vitry in April, 1917 of her “lastingly satisfactory” work on a soldier who had “double anthrotomie [deep lacerations] of the knees.”  She explained that

when he came the insteps were bent like a ballet-dancer’s.  Even admitting his recovery, which seemed impossible, he would be obliged to go about on the points of his toes, the knees being permanently stiff.   At first, after ‘peeling’ with every conceivable dissolvent, I began just the slightest effleurissage [circular stroking] which developed into massage, and then I invented an apparatus . . . A board about 14 inches square was padded with cotton and swathed neatly in a bandage.  This was laid vertical against the soles of the feet which I tried to place as nearly as possible in a normal position.  Then I attached a bandage (having no elastic, which would have been better) to the head rail of the bed on one side, passed it around the board and up the other side, fastening it again to the rail as taut as possible.  The knot was tightened twice a day.  Result – in two weeks those refractory feet had regained a proper attitude.[10]

Such dedication to severely injured patients persisted in the face of bombings that reached and sometimes destroyed the clearing stations and field hospitals in which the nurses worked.  Nurses too were casualties of war and disease.  In Belgium in the fall of 1917, enemy bombs destroyed the 58th General Scottish Hospital adjacent to Beatrice Hopkinson’s own 59th.  Hopkinson watched while orderlies from her hospital “stooped over bunches of twigs in various places and picked up something, putting it in the sheet.  They were the arms and legs and other pieces of the patients that had been bombed and blown right out into the [outlying] park.”[11]  Back in her own hospital, with bombs continuing to fall, she confided to her diary that “My knees just shook and, had I allowed it, my teeth would have rattled; but I had to be brave for my patients’ sake.  When they saw the womenfolk apparently without fear it kept them brave.”[12]

Nurses like Hopkinson, Warner, Luard, and Derr did not see themselves as brave.  Rather, their sense of duty was so powerful that it sequestered fear and compelled action in ways that would have been incomprehensible to their non-nursing selves.  “I never realized what the word ‘duty’ meant until this War,” Hopkinson remarked.  Hers was the courage of the Hippocratic caregiver, who subordinates self-interest to the patient’s well-being.  For the nurses of WWI, such subordination extended to self-preservation itself.  I admire them because their sense of mission remained unswerving as moribund wards swelled and they failed, time and again, to “pull round” those too far gone to be pulled.  Living and working amid the bodies of those they failed to save – perhaps because they lived and worked among those they failed – the nurses remained certain of who they were and what they did.  They were vindicated by their calling.  Thus Kate Luard during the Battle of Arras in the spring of 1917:

There is no form of horror imaginable, on any part of the human body, that we can’t tackle ourselves now, and no extreme of shock or collapse is considered too hopeless to cope with, except the few who die in a few minutes after admission.[13]

And with the resolve to nurse on, even during bombing raids that imperiled them, came defiant resiliency.  The clearing stations right off the Front were, in the words of the American nurse and poet Mary Borden, the second battlefield – a battlefield littered with caregiving paraphernalia that combated and succumbed to the inexorability of death.  So why did the nurses labor on?  “He’s saved, and that makes up for much,” declaimed Kate Derr in the fall of 1915.  To which Kate Luard added her own gloss a year and a half later:

Some of us and Capt. B. have been having a bad fit of pessimism over them all lately, wondering what is the good of operations, nursing, rescues, or anything, when so many have died in the end.  But even a few miraculous recoveries buck one up to begin again.[14]

____________________

[1] [Kate Norman Derr] “Mademoiselle Miss”: Letters from an American Girl Serving with the Rank of Lieutenant in a French Army Hospital at the Front, preface by Richard C. Cabot (Boston:  Butterfield, 1916), 67.

[2] Helen Dore Boylston, Sister: The War Diary of a Nurse (NY: Ives Washburn, 1927), 237.

[3] Agnes Warner, ‘My Beloved Poilus’ (St. John: Barnes, 1917), loc 119, 221, 324, 782.

[4] On the stench of gas gangrene, which suffused entire wards, see, for example, Edith Appleton, A Nurse at the Front:  The First World War Diaries, ed. R. Cowen (London:  Simon & Schuster UK, 2012), 194-95, 240:  “One feels the horrible smell in one’s throat and nose all the time.”

[5] Shirley Millard, I Saw Them Die: Diary and Recollections (New Orleans, LA: Quid Pro, 2011), loc 1192.  Even among gangrenous patients who survived, changing the dressings twice a day was an “agonizing procedure.”  Beatrice Hopkinson, Nursing through Shot & Shell: A Great War Nurse’s Story, ed. Vivien Newman (South Yorkshire: Pen & Sword, 2014), loc 409.

[6] John & Caroline Stevens, eds., Unknown Warriors:  The Letters of Kate Luard, RRC and Bar, Nursing Sister in France 1914-1918 (Stroud: History Press, 2014), loc 1790, 1704, 1713, 1722.

[7] Luard, Letters, loc 306-315.

[8] Derr, “Mademoiselle Miss,” p. 47.

[9] Luard, Letters, loc 2232.  Sadly, ‘the Prince of Wales’ died several days later.

[10] Derr, “Mademoiselle Miss,” pp. 95-96.

[11] Plus ça change, plus c’est la même chose.  See Ann Jones’s graphic description of the work of army specialists in Mortuary Affairs who retrieve and bag body parts and liquefied innards of our fallen soldiers in Afghanistan.  Ann Jones, They Were Soldiers:  How the Wounded Return from America’s Wars – The Untold Story (Chicago: Haymarket, 2013), chapter 1 (“Secrets: The Dead”).

[12] Hopkinson, Nursing through Shot & Shell, loc 1442, 1498.

[13] Luard, Letters, loc 1273.

[14] Mary Borden, The Forbidden Zone, ed. H. Hutchison (London: Hesperus, 2008 [1928]), 83; Derr, “Mademoiselle Miss,” p. 47; Luard, Letters, loc 1767.


Remembering the Nurses of WWI (I)

“Real war at last.  Can hardly wait.  Here we go!”

[The first of six essays about the gallant nurses of World War I commemorating the centennial of America’s entry into the war on April 6, 1917.  The outgrowth of these essays is the book, Easing Pain on the Western Front:  American Nurses of the Great War and the Birth of Modern Nursing Practice (McFarland, 2020).  Hear Paul Stepansky discuss the book with the editor of the Journal of the American Association of Nurse Practitioners in a special JAANP podcast.]

It was the all-too-common story of the WWI nurses, the narrative thread that linked the vagaries of their wartime experiences.  The war was to be the adventure of a lifetime.  The opportunity to serve on the Western Front was not to be missed, not by hospital-trained nurses and not by lightly trained volunteer nurses.  For both groups, the claim of duty was suffused with the excitement of grand adventure.  Beginning in the spring of 1917, the war abroad was the event of the season.  Julia Stimson, a Vassar graduate who, as superintendent of nursing at Barnes Hospital, led the St. Louis base hospital unit to Europe in May, 1917, was overwhelmed with the honor bestowed on her and the opportunities it promised.  “To be in the front ranks in this most dramatic event that ever was staged,” she wrote her mother, was “all too much good fortune for any one person like me.”  For 28-year-old Shirley Millard, a Red Cross volunteer nurse from Portland, Oregon, rushed to a field hospital near Soissons in March, 1918, the prospect of nursing work at Chateau Gabriel, close to the Front, was a dream come true:  “It is so exciting and we are all thrilled to have such luck.  Real war at last.  Can hardly wait.  Here we go!”  “I haven’t the least fear or worry in the world.  Am ready for anything,” averred Minnesotan Grace Anderson, a reserve nurse and nurse-anesthetist who embarked from New York harbor in July, 1918.  Serving in a base hospital or, more exciting still, in a field hospital or casualty clearing station only miles from the Front, was to be invited to the Grand Cotillion.  Volunteer and army nurses alike were typically well-bred young women of substance, often upper-class substance.  They were adventuresome and patriotic and given over to a sense of duty informed by literary culture, not battlefield experience.  So they experienced happiness on receiving the call; they would make their families proud.[1]

But their sense of exhilaration at being invited to the Patriotic Ball quickly gave way to stunned amazement at the “work” before them.  The wounds of French, British, and, soon enough, American troops were literally unimaginable to them and then, in the fevered atmosphere of post-battle “rushes,” wrenchingly imaginable, indeed omnipresent. They grew familiar with the horrid stench of gas gangrene, which crackled beneath the surface of the infected body part or parts and almost always presaged quick death. Under the mentoring of senior nurses, the Sisters, young American women learned how to prep patients for surgery.  In the process, they encountered cases in which “there are only pieces of men left.”  And yet, having no choice, they quickly made their peace with the stumps of severed limbs and concavities of missing stomachs, faces and eyes and began to help clean, irrigate, and dress what remained, before and after surgery, if surgery could even be attempted.  Like their seniors, they learned to remain unflinching in the face of the many soldiers who arrived “unrecognizable as a human being.”  And they retained composure before soldiers as young as sixteen or seventeen  — “children,” they would say — who arrived at Casualty Clearing Stations (CCSs) caked in mud and blood and covered with lice – children with three, five, nine, even eleven wounds.  They learned to accept that many soldiers would die in a matter of hours or days, but to join this realization to an obligation to provide what comfort they could.  They ended up working hard to keep the dying alive long enough to warm up and pass under morphine and chloroform, all the while holding their nurse-mother’s hand.[2]

They could not operate on Rochard and amputate his leg, as they wanted to do.  The infection was so high, into the hip, it could not be done.  Moreover, Rochard had a fractured skull as well.  Another piece of shell had pierced his ear, and broken into his brain, and lodged there.  Either wound would have been fatal, but it was the gas gangrene in his torn-out thigh that would kill him first.[3]

Here is “a poor youngster with both legs broken, both arms wounded, one eye shot out and the other badly damaged,” there a “poor lad” who “had both eyes shot through and there they were, all smashed and mixed up with the eyelashes.  He was quite calm, and very tired.  He said, ‘Shall I need an operation?  I can’t see anything’.”  Within a week of arrival at her field hospital, Shirley Millard wrote of “bathing [a soldier’s] great hip cavity where a leg once was,” while “a long row of others, their eyes fastened upon me, await their turn.”  And she followed with the kind of litany offered by many others:  “Gashes from bayonets.  Flesh torn by shrapnel.  Faces half shot away.  Eyes seared by gas; one here with no eyes at all.  I can see down into the back of his head.”  Helen Dore Boylston, an MGH-trained nurse who served with the Harvard Medical Unit from 1915 on, presents an indelible image that affected her for life and affects us still:

There were strings of from eight to twenty blind boys filing up the road, clinging tightly and pitifully to each other’s hands, and led by some bedraggled limping youngster who could still see . . . I wonder if I’ll ever be able to look at marching men anywhere again without seeing those blinded boys, with five and six wound stripes on their sleeves, struggling painfully along the road.[4]

A soldier with gangrenous wounds oozing everywhere might morph into a “mass of very putrid rottenness long before he died.”  Such was the experience of Edith Appleton, who continued:  “The smell was so very terrible I had to move him right away from everyone, and all one could do was dress and redress. Happily I don’t think he could smell it himself but I have never breathed a worse poison.”[5]

All too soon after arrival, then, the cheery young American nurses beheld the fearless young soldiers – or remnants thereof – who came to clearing stations and base hospitals in funereal processions of ambulances.  The fearless young men had become “wretched, restless beings.”  For Shirley Millard, “The crowded, twisted bodies, the screams and groans, made one think of the old engraving in Dante’s Inferno.  More came, and still more.”  A “rush” during the German offensive of late March, 1918 brought 1,100 wounded to Helen Boylston’s base hospital in 24 hours, with three operating teams performing some 90 emergency operations that night and the nights to follow.  The operating room nurse, she recalled, “walked up and down between the tables with a bottle of aromatic spirits of ammonia in one hand and a bottle of brandy in the other, ready to pounce on the next person who wilted.”  At Beatrice Hopkinson’s CCS 47, just outside Amiens, the situation was even worse.  During the March rush many thousands of patients passed through the doors in only a few days and kept seven operating tables working day and night.

And so the narratives captured in these diaries, journals, and memoirs turn a corner into blackness, as the nurses themselves undergo a kind of existential decomposition.  The volunteer nurses in particular, many little older than the combatants, became war-weary and war-wise in ways that choked off the childish exhilaration with which they had embarked. They found themselves at the threshold of their own nonnegotiable no-woman’s land. The nurse, wrote Mary Borden in The Forbidden Zone,

is no longer a woman.  She is dead already, just as I am – really dead, past resurrection.  Her heart is dead.  She killed it.  She couldn’t bear to feel it jumping in her side when Life, the sick animal, choked and rattled in her arms.  Her ears are deaf; she deafened them.  She could not bear to hear Life crying and mewing.  She is blind so that she cannot see the torn parts of men she must handle.  Blind, deaf, dead – she is strong, efficient, fit to consort with gods and demons – a machine inhabited by the ghost of a woman – soulless, past redeeming, just as I am – just as I will be.[7]

Nurses bore up, but in the process many were ground down, their pre-war values pulverized into dust.  Comprehending trench warfare from the perspective of the body, they became freighted with the pointlessness of the horror: the multitude of mutilated, infection-saturated, and lifeless young bodies.  It was, for Helen Boylston, less tragic than unutterably stupid.

Today a ditch is full of Germans, and tomorrow it is full of Englishmen.  Neither side really wants the silly muddy ditch, yet they kill each other persistently, wearily, ferociously, patiently, in order to gain possession of it.  And whoever wins, it has won – nothing.[8]

They pondered the paradox of pain – the impossibility of knowing its nature in another along with the inability to nurse without imagining it.  They grew into a capacity for shame – shame in their own strength, in their ability to stand firm and straight beside a bed “whose coverings are flung here and there by the quivering nerves beneath it.”  They empathized with shell-shocked patients who, having endured the prospect of “glorious death” under the guns, were sent home “to face death in another form. Not glorious, shameful.”  And finally there was the shame, thinly veiled, attendant to witnessing the unremitting pain of the dying.  “No philosophy,” reflected Enid Bagnold, “helps the pain of death.  It is pity, pity, pity, that I feel, and sometimes a sort of shame that I am here to write at all.”[9]

And then, as hostilities drew to a close, there were the larger reflections, the alterations of life philosophy that grew out of nursing their boys. For Helen Boylston,

The war has done strange things to me.  It has given me a lot and taken away a lot.  It has taught me that nothing matters, really.  That people do not matter, and things do not matter, and laces do not matter, except for a minute.  And the minute is always now.[10]

For Shirley Millard, Armistice Day and the immediate dismissal of her unit of volunteer nurses marked her epiphany:

Only then did the enormous crime of the whole thing begin to come home to me.  All very well to celebrate, I thought, but what about Charley?  All the Charlies? What about Donnelly, Goldfarb, Wendel, Auerbach? And Rene?  And the hundreds, thousands of others.[11]

The enormity of the crime and the absurd reasoning that justified it coalesced in the wartime essays of Ellen La Motte and Mary Borden, one recurrent theme of which is the impossibility of a good death in war, where the very effort to “restore” bodies and minds that are shattered, literally and figuratively, becomes oxymoronic.  War, they insist, occurs in an alternate universe where any claim to morality is, from the standpoint of ordinary life, self-willed delusion.  In this universe, surgeons function as cavalier automatons and even life-saving surgery is specious, because the lives saved, more often than not, are no longer human lives, psychologically or physically. In this alternate universe, death withheld, ironically, is the ultimate act of inhumanity.[12]

What makes the nurses of World War I gallant is that so many of them were able to bracket their encroaching horror, with its undercurrents of anger, depression, and numbing – and simply care for their patients.  They were able to function as nurses in a nurses’ hell.  Military directives pushed them to an even lower circle of the Inferno, since the nurses’ primary task, they were told over and over, was to get injured troops back to the Front as soon as possible.  They were to fix up serviceable (and hence service-able) soldiers so that they could be reused at least one more time, until breakdown precluded further servicing and, with it, the soldier’s obligation to serve.

But the nurses knew better and unfailingly did better.  Nursing practice, it turns out, had its own moral imperative, so that military directives were downplayed, often cast to the wind.  As the nursing historian Christine Hallett observes, the emotional containment nurses provided for suffering and needy soldiers did not – indeed could not – preclude caring.[13]  In essays to follow, I hope to explore further the remarkable elements of this caring, which blurred the boundary between comfort care and healing and took nursing practice into the domains of emergency medicine, infectious disease management, surgery, and psychotherapy.  It is as agents of care and caring that the nurses of World War I rose to the status of gallants.  Flying in the face of military priorities and surgical fatalism, they bravely dispensed cure in a manner true to the word’s etymology, the Latin curare, a “taking care of” that privileges the patient’s welfare above all else.

_____________________

[1] Julia C. Stimson, Finding Themselves: The Letters of an American Army Chief Nurse in a British Hospital in France (NY: Macmillan, 1918), 3-4; Shirley Millard, I Saw Them Die: Diary and Recollections, ed. E. T. Gard (New Orleans: Quid Pro, 2011), Kindle location (loc) 388; Shari Lynn Wigle, Pride of America: The Letters of Grace Anderson, U.S. Army Nurse Corps, World War I (Rockville, MD: Seaboard, 2007), 9.

[2] Agnes Warner, ‘My Beloved Poilus’ (St. John: Barnes, 1917), loc 75; Beatrice Hopkinson, Nursing Through Shot & Shell: A Great War Nurse’s Story, ed. V. Newman (South Yorkshire: Pen & Sword, 2014), loc 1425; Helen Dore Boylston, Sister: The War Diary of a Nurse (NY: Washburn, 1927), loc 463; Enid Bagnold, A Diary Without Dates (London: Heinemann, 1918),  125: “Among his eleven wounds he has two crippled arms.”

[3] Ellen N. La Motte, The Backwash of War: The Human Wreckage of the Battlefield as Witnessed by an American Hospital Nurse (NY: Putnam’s, 1916), 51-52.

[4] Edith Appleton, A Nurse at the Front: First World War Diaries, ed. R. Cowen (London: Simon & Schuster UK, 2012), 138, 161; Millard, I Saw Them Die, loc 428; Boylston, Sister, loc 463.

[5] Dorothea Crewdson, Dorothea’s War: A First World War Nurse Tells her Story, ed. Richard Crewdson (London: Weidenfeld & Nicolson, 2013), loc 1189; Appleton, Nurse at the Front, 189.

[6] Crewdson, Dorothea’s War, loc 1192; Millard, I Saw Them Die, loc 388; Boylston, Sister, loc 1101; Hopkinson, Nursing Through Shot & Shell, loc 1719, 1780.

[7] Mary Borden, The Forbidden Zone, ed. H. Hutchison (London: Hesperus, 1928/2008), 44.

[8] Boylston, Sister, loc 648.

[9] Bagnold, Diary Without Dates, loc 25, 104; La Motte, Backwash of War, 139.

[10] Boylston, Sister, loc 1373.

[11] Millard, I Saw Them Die, loc 1562.

[12] All the brief essays in La Motte’s The Backwash of War and Borden’s The Forbidden Zone circle around these and related themes.  Among them, I was especially moved by La Motte’s “Alone,” “Locomotor Ataxia,” and “A Surgical Triumph,” and Borden’s “Rosa,” “Paraphernalia,” and “In the Operating Room.”

[13] Christine E. Hallett, Containing Trauma:  Nursing Work in the First World War (Manchester: Manchester University Press, 2009), 177.

Copyright © 2017 by Paul E. Stepansky.  All rights reserved.

You Touch Me

Etymologically, the word “touch” (from the old French touchier) is a semantic cornucopia.  In English, of course, common usage embraces dual meanings. We make tactile contact, and we receive emotional contact.  The latter meaning is usually passively rendered, in the manner of receiving a gift:  we are the beneficiary of someone else’s emotional offering; we are “touched” by a person’s words, gestures, or deeds.  The duality extends to the realm of healthcare:  as patients, we are touched physically by our physicians (or other providers) but, if we are fortunate, we are also touched emotionally by their kindness, concern, empathy, even love.  Here the two kinds of touching are complementary.  We are examined (and often experience a measure of  contact comfort through the touch)  and then comforted by the physician’s sympathetic words; we are touched by the human contact that follows from physical touch.

For nurses, caregiving as touching and being touched has been central to professional identity.  The foundations of nursing as a modern “profession” were laid down on the battlefields of Crimea and the American South during the mid-nineteenth century.  Crimean and Civil War nurses could not “treat” their patients, but they “touched” them literally and figuratively and, in so doing, individualized their suffering.  Their nursing touch was amplified by the caring impulse of mothers:  they listened to soldiers’ stories, sought to keep them warm, and especially sought to nourish them, struggling to pry their food parcels away from corrupt medical officers.  In the process, they formulated a professional ethos that, in privileging patient care over hospital protocol, was anathema to the professionalism associated with male medical authority.[1]

This alternative, comfort-based vision of professionalism is one reason, among others, that nursing literature is more nuanced than medical literature in exploring the phenomenology and dynamic meanings of touch. It has fallen to nursing researchers to isolate and appraise the tactile components of touch (such as duration, location, intensity, and sensation) and also to differentiate between comforting touch and the touch associated with procedures, i.e., procedural touch.[2]  Buttressing the phenomenological viewpoint of Husserl and Merleau-Ponty with recent neurophysiologic research, Catherine Green has argued that nurse-patient interaction, with its “heavily tactile component,” promotes an experiential oneness:  it “plunges the nurse into the patient situation in a direct and immediate way.”  To touch, she reminds us, is simultaneously to be touched, so that the nurse’s soothing touch not only promotes deep empathy with the patient’s plight but actually “constitutes” the nurse herself (or himself) in her (or his) very personhood.[3]  Other nurse researchers question the intersubjective convergence presumed by Green’s rendering.  A survey of hospitalized patients, for example, documents that some patients are ambivalent toward the nurse’s touch, since for them it signifies not only care but also control.[4]

After World War II, the rise of sophisticated monitoring equipment in hospitals pulled American nursing away from hands-on, one-on-one bedside nursing.  By the 1960s, hospital nurses, no less than physicians, were “proceduralists” who relied on cardiac and vital function monitors, electronic fetal monitors, and the like for “data” on the patients they “nursed.”  They monitored the monitors and – for educators critical of this turn of events, psychiatric nurses especially – had become little more than monitors themselves.

As the historian Margarete Sandelowski has elaborated, this transformation of hospital nursing had both an upside and a downside.  It elevated the status of nurses by aligning them with postwar scientific medicine in all its burgeoning technological power.  Nurses, the skilled human monitors of the machines, were key players on whom hospitalized patients and their physicians increasingly relied.  In the hospital setting, they became “middle managers,”[5] with command authority over their wards. Those nurses with specialized skills – especially those who worked in the newly established intensive care units (ICUs) – were at the top of the nursing pecking order.  They were the most medical of the nurses, trained to diagnose and treat life-threatening conditions as they arose.  As such, they achieved a new collegial status with physicians, the limits of which were all too clear.  Yes, physicians relied on nurses (and often learned from them) in the use of the new machines, but they simultaneously demeaned the “practical knowledge” that nurses acquired in the service of advanced technology – as if educating and reassuring patients about the purpose of the machines; maintaining them (and recommending improvements to manufacturers); and utilizing them without medical supervision was something any minimally intelligent person could do.

A special predicament of nursing concerns the impact of monitoring and proceduralism on a profession whose historical raison d’être was hands-on caring, first on the battlefields and then at the bedside.  Self-evidently, nurses with advanced procedural skills had to relinquish that most traditional of nursing functions: the laying on of hands.  Consider hospital-based nurses who worked full-time as x-ray technicians and microscopists in the early 1900s; who, beginning in the 1930s, monitored  polio patients in their iron lungs; who, in the decades following World War II, performed venipuncture as full-time IV therapists; and who, beginning in the 1960s, diagnosed and treated life-threatening conditions in the machine-driven ICUs.  Obstetrical nurses who, beginning in the late 1960s, relied on electronic fetal monitors to gauge the progress of labor and who, on detecting “nonreassuring” fetal heart rate patterns, initiated oxygen therapy or terminated oxytocin infusions – these “modern” OB nurses were worlds removed from their pre-1940s forebears, who monitored labor with their hands and eyes in the patient’s own home.  Nursing educators grew concerned that, with the growing reliance on electronic metering, nurses were “literally and figuratively ‘losing touch’ with laboring women.”[6]

Nor did the dilemma for nurses end with the pull of machine-age monitoring away from what nursing educators long construed as “true nursing.”  It pertained equally to the compensatory efforts to restore the personal touch to nursing in the 1970s and 80s.  This is because “true nursing,” as understood by Florence Nightingale and several generations of twentieth-century nursing educators, fell back on gendered touching; to nurse truly and well was to deploy the feminine touch of caring women.  If “losing touch” through technology was the price paid for elevated status in the hospital, then restoring touch brought with it the re-gendering (and hence devaluing) of the nurse’s charge:  she was, when all was said and done, the womanly helpmate of physicians, those masculine (or masculinized) gatekeepers of scientific medicine in all its curative glory.[7]  And yet, in the matter of touching and being touched, gender takes us only so far.  What then of male nurses, who insist on the synergy of masculinity, caring, and touch?[8]  Is their touch ipso facto deficient in some essential ingredient of true nursing?

As soon as we enter the realm of soothing touch, with its attendant psychological meanings, we encounter a number of binaries.  Each pole of a binary is a construct, an example of what the sociologist Max Weber termed an “ideal type.”  The question-promoting, if not questionable, nature of these constructs only increases their heuristic value.  They give us something to think about.  So we have “feminine” and “masculine” touch, as noted above.  But we also have the nurse’s touch and, at the other pole, the physician’s touch.  In the gendered world of many feminist writers, this binary replicates the gender divide, despite the historical and contemporary reality of women physicians and male nurses.

But the binary extends to women physicians themselves.  In their efforts to gain entry to the world of male American medicine, female medical pioneers adopted two radically different strategies.  At one pole, we have the touch-comfort-sympathy approach of Elizabeth Blackwell, which assigned women their own feminized domain of practice (child care, nonsurgical obstetrics and gynecology, womanly counseling on matters of sanitation, hygiene, and prevention).  At the opposite pole we have the research-oriented, scientific approach of Mary Putnam Jacobi and Marie Zakrzewska, which held that women physicians must be physicians in any and all respects.  Only with state-of-the-art training in the medical science (e.g., bacteriology) and treatments (e.g., ovariotomy) of the day, they held, would women doctors achieve what they deserved:  full parity with medical men.  The binary of female physicians as extenders of women’s “natural sphere” versus female physicians as physicians pure and simple runs through the second half of the nineteenth century.[9]

Within medicine, we can perhaps speak of the generalist touch (analogous to the generalist gaze[10]) that can be juxtaposed with the specialist touch.  Medical technology – especially tools that amplify the physician’s senses – invites another binary.  There is the pole of direct touch and the pole of touch mediated by instrumentation.  This binary spans the divide between “direct auscultation,” with the physician’s ear on the patient’s chest, and “mediate auscultation,” with the stethoscope linking (and, for some nineteenth-century patients, coming between) the physician’s ear and the patient’s chest.

Broader than any of the foregoing is the binary that pushes beyond the framework of comfort care per se.  Consider it a meta-binary.  At one pole is therapeutic touch (TT), whose premise of a preternatural human energy field subject to disturbance and hands-on (or hands-near) remediation is nothing if not a recrudescence of Anton Mesmer’s “vital magnetism” of the late eighteenth century, with the TT therapist (usually a nurse) taking the role of Mesmer’s magnétiseur.[11]  At the opposite pole is transgressive touch.  This is the pole of boundary violations typically, though not invariably, associated with touch-free specialties such as psychiatry and psychoanalysis.[12]  Transgressive touch signifies inappropriately intimate, usually sexualized, touch that violates the boundaries of professional caring and results in the patient’s dis-comfort and dis-ease, sometimes to the point of leaving the patient traumatized, i.e., “touched in the head.”  It also signifies the psychological impairment of the therapist, who, in another etymologically just sense of the term, may be “touched,” given his or her gross inability to maintain a professional treatment relationship.

These binaries invite further scrutiny, less on account of the extremes than of the shades of gray that span each continuum.  Exploration of touch is a messy business, a hands-on business, a psycho-physical business.  It may yield important insights, but perhaps only fitfully, in the manner of – to invoke a meaning that arose in the early nineteenth century – touch and go.


[1] See J. E. Schultz, “The inhospitable hospital: gender and professionalism in civil war medicine,” Signs, 17:363-392, 1992.

[2]  S. J. Weiss, “The language of touch,” Nurs. Res., 28:76-80, 1979; S. J. Weiss, “Psychophysiological effects of caregiver touch on incidence of cardiac dysrhythmia,” Heart and Lung, 15:494-505, 1986; C. A. Estabrooks, “Touch in nursing practice: a historical perspective: 1900-1920,” J. Nursing Hist., 2:33-49, 1987; J. S. Mulaik, et al., “Patients’ perceptions of nurses’ use of touch,” W. J. Nursing Res., 13:306-323, 1991.

[3] C. Green, “Philosophic reflections on the meaning of touch in nurse-patient interactions,” Nurs. Phil., 14:242-253, 2013; quoted at pp. 250-251.

[4] Mulaik, “Patients’ perceptions of nurses’ use of touch,” pp. 317-318.

[5] “Middle managers” is the characterization of the nursing historian Barbara Melosh, in “Doctors, patients, and ‘big nurse’: work and gender in the postwar hospital,” in E. C. Lagemann, ed., Nursing History: New Perspective, New Possibilities (NY: Teachers College Press, 1983), pp. 157-179.  

[6] M. Sandelowski, Devices and Desires:  Gender, Technology, and American Nursing (Chapel Hill: University of North Carolina Press, 2000), p. 166.

[7] On the revalorization of the feminine in nursing in the Nursing Theory Movement of the 70s and 80s, see Sandelowski, Devices and Desires, pp. 131-134.

[8] See R. L. Pullen, et al., “Men, caring, & touch,”  Men in Nursing, 7:14-17, 2009.

[9] The work of Regina Morantz-Sanchez is especially illuminating of this binary and the major protagonists at the two poles.  See R. Morantz, “Feminism, professionalism, and germs: the thought of Mary Putnam Jacobi and Elizabeth Blackwell,” American Quarterly, 34:459-478, 1982, with a slightly revised version of the paper in R. Morantz-Sanchez, Sympathy and Science: Women Physicians in American Medicine (Chapel Hill: University of North Carolina Press, 2000 [1985]), pp. 184-202.

[10] I have written about the “generalist gaze” in P. E. Stepansky, The Last Family Doctor:  Remembering my Father’s Medicine (Montclair, NJ: Keynote Books, 2011), pp. 62-66, and more recently in P. E. Stepansky, “When generalist values meant general practice: family medicine in post-WWII America” (precirculated paper, American Association for the History of Medicine, Atlanta, GA, May 16-19, 2013).

[11] Therapeutic touch was devised and promulgated by the nursing educator Dolores Krieger in publications of the 1970s and 80s, e.g., “Therapeutic touch:  the imprimatur of nursing,” Amer. J. Nursing, 75:785-787, 1975; The Therapeutic Touch (NY: Prentice Hall, 1985); and Living the Therapeutic Touch (NY: Dodd, Mead, 1987).  I share the viewpoint of Therese Meehan, who sees the technique as a risk-free nursing intervention capable of potentiating a powerful placebo effect (T. C. Meehan, “Therapeutic touch as a nursing intervention,” J. Advanced Nursing, 28:117-125, 1998).

[12] For a fairly recent examination of transgressive touch and its ramifications, see G. O. Gabbard & E. P. Lester, Boundary Violations in Psychoanalysis (Arlington, VA: Amer. Psychiatric Pub., 2002). 

Copyright © 2013 by Paul E. Stepansky.  All rights reserved.

An Irony of War

“There are two groups of people in warfare – those organized to inflict and those organized to repair wounds – and there is little doubt but that in all wars, and in this one in particular, the former have been better prepared for their jobs” (Milit. Surg., 38:601, 1916).  So observed Harvey Cushing, the founder of modern neurosurgery, a year before America’s entry into World War I.  Cushing’s judgment is just, and yet throughout history “those organized to repair wounds” have risen to the exigencies  of the war at hand.  In point of fact, warfare has spurred physicians, surgeons, and researchers to major, sometimes spectacular, advances, and their scientific and clinical victories are bequeathed  to civilian populations that inherit the peace.  Out of human destructiveness emerge potent new strategies of protection, remediation, and self-preservation.  Call it an irony of war.

Nor are these medical and surgical gifts limited to the era of modern warfare.  The French army surgeon Jean Louis Petit invented the screw tourniquet in 1718; it made possible leg amputation above the knee.  The Napoleonic Wars of the early nineteenth century brought us the first field hospitals along with battlefield nursing and ambulances.  The latter were of course horse-drawn affairs, but they were exceedingly fast and maneuverable and were termed “flying ambulances.”  The principle of triage — treating the wounded, regardless of rank, according to severity of injury and urgency of need – is not a product of twentieth-century disasters.  It was devised by Dominique Jean Larrey, Napoleon’s surgeon-in-chief from 1797 to 1815.

The American Civil War witnessed the further development of field hospitals and the acceptance, often grudging, especially among southern surgeons, of female nurses tending to savaged male bodies.  Hospital-based training programs for nurses were a product of wartime experience.  Civil War surgeons themselves broached the idea shortly after the peace, and the first such programs opened  in New York, Boston, and New Haven hospitals in 1873.  The dawning appreciation of the relationship between sanitation and prevention of infection, which would blossom into the “sanitary science” of the 1870s and 1880s, was another Civil War legacy.

And then there were the advances, surgical and technological, in amputation.  They included the use of the flexible chain saw to spare nerves and muscles and even, in many cases of comminuted fracture, to avoid amputation entirely.  The development of more or less modern vascular ligation – devised on the battlefield to tie off major arteries extending from the stumps of severed limbs – is another achievement of Civil War surgeons.  Actually, they rediscovered ligation, since the French military surgeon Ambroise Paré employed it following battlefield amputation in the mid-sixteenth century, and he in turn was reviving a practice employed in the Alexandrian Era of the fourth century B.C.

In 1900 Karl Landsteiner, a Viennese pathologist and immunologist, first described the ABO system of blood groups, founding the field of immunohematology.  As a result, World War I gave us blood depots – forerunners of the modern blood bank – that made possible blood transfusions among wounded soldiers in the Army Medical Corps in France.  The First World War also pushed medicine further along the path to modern wound management, including the treatment of cellulitic wound infections, i.e., bacterial skin infections that followed soft tissue trauma.  Battlefield surgeons were quick to appreciate the need for thorough wound debridement and delayed closure in treating contaminated war wounds.  The prevalence of central nervous system injuries – a tragic byproduct of trench warfare in which soldiers’ heads peered anxiously above the parapets – led to “profound insights into central nervous system form and function.” The British neurologist Gordon Holmes provided elaborate descriptions of spinal transections (crosswise severing of the cord) for every segment of the spinal cord, whereas Cushing, performing eight neurosurgeries a day, “rose to the challenge of refining the treatment of survivors of penetrating head wounds” (Arch. Neurol., 51:712, 1994).  His work from 1917 “lives today” (ANZ J. Surg., 74:75, 2004).

No less momentous was the development of reconstructive surgery by inventive surgeons (led by the New Zealand ENT surgeon Harold Gillies) and dentists (led by the French-American Charles Valadier) unwilling to accept the gross disfigurement of downed pilots who crawled away from smoking wreckages with their lives, but not their faces, intact.  A signal achievement of wartime experience with burn and gunshot victims was Gillies’s Plastic Surgery of the Face of 1920; another was the founding of the American Association of Plastic Surgeons a year later.  After the war, be it noted, the pioneering reconstructive surgeons refused to place their techniques at the disposal of healthy women (and less frequently healthy men) desirous of facial enhancement; reconstructive facial surgery went into short-lived hibernation.  One reason reconstructive surgeons morphed into cosmetic surgeons was the psychiatrization of facial imperfection via Freudian and especially Adlerian notions of the “inferiority complex,” with its allegedly life-deforming ramifications.  So nose jobs became all the rage in the 1930s, to be joined by facelifts in the postwar 40s. (Elizabeth Haiken’s book Venus Envy: A History of Cosmetic Surgery [1997] is illuminating on all these issues.)

The advances of World War II are legion.  Among the most significant was the development of new, or significantly improved, vaccines for 10 of the 28 vaccine-preventable diseases identified in the twentieth century (J. Pub. Health Pol., 27:38, 2006); new vaccines for influenza, pneumococcal pneumonia, and plague were among them.  There were also new treatments for malaria and the mass production of penicillin in time for D-Day.  It was during WWII that American scientists learned to separate blood plasma into its constituents (albumin, globulins, and clotting factors), an essential advance in the treatment of shock and control of bleeding.

No less staggering were the surgical advances that occurred during the war. Hugh Cairns, Cushing’s favorite student, developed techniques for the repair of the skull base and laid the foundation of modern craniofacial surgery by bringing together neurosurgeons, plastic surgeons, and ophthalmic surgeons in mobile units referred to as “the trinity.”   There were also major advances in fracture and wound care along with the development of hand surgery as a surgical specialty.   Wartime treatment experience with extreme stress, battlefield trauma, and somatization (then termed, in Freudian parlance, “conversion reactions”) paved the way for the blossoming of psychosomatic medicine in the 1950s and 1960s.

The drum roll hardly ends with World War II.  Korea gave us the first air ambulance service.  Vietnam gave us Huey helicopters for evacuation of wounded soldiers.  (Now all trauma centers have heliports.)  Prior to evacuation, these soldiers received advanced, often life-saving, care from medical corpsmen who opened surgical airways and performed thoracic needle decompressions and shock resuscitation; thus was born our modern system of prehospital emergency care by onsite EMTs and paramedics.  When these corpsmen returned to the States, they formed the original candidate pool for Physician Assistant training programs, the first of which opened its doors at Duke University Medical Center in 1965.  Vietnam also gave us major advances in vascular surgery, recorded for surgical posterity in the “Vietnam Vascular Registry,” a database with records of over 8000 vascular wound cases contributed by over 600 battlefield surgeons.

The medical and surgical yield of recent and ongoing wars in the Persian Gulf will be recorded in years to come.  Already, these wars have provided two advances for which all may give thanks:  portable intensive care units (“Life Support for Trauma and Transport”) and HemCon bandages.  The latter, made from chitosan derived from shrimp shells, stop severe bleeding instantaneously.

Now, of course, with another century of war under our belt and the ability to play computer-assisted war games, we are better able to envision the horrific possibilities of wars yet to come.  In the years leading up to World War I, American surgeons – even those, like Harvey Cushing, who braced themselves for war – had no idea of the human wreckage they would encounter in French field hospitals.  Their working knowledge of war wounds relied on the Boer War (1899-1902), a distinctly nineteenth-century affair, militarily speaking, fought on the open veldt of South Africa, not in trenches in the overly fertilized, bacteria-saturated soil of France.  Now military planners can turn to databases that gather together the medical-surgical lessons of two World Wars, Korea, Vietnam, Iraq, Afghanistan, and any number of regional conflicts.

Military simulations have already been broadened to include political and social factors.  But military planners should also be alert to possibilities of mutilation, disfigurement, multiple-organ damage, and drug-resistant infection only dimly imagined.  Perhaps they can broaden their simulations to include the medical and surgical contingencies of future wars and get bench scientists, clinical researchers, and surgeons to work on them right away.  Lucky us.

Copyright © 2012 by Paul E. Stepansky.  All rights reserved.

Primary Care/Primarily Caring (IV)

If it is little known in medical circles that World War II “made” American psychiatry, it is even less well known that the war made psychiatry an integral part of general medicine in the postwar decades.  Under the leadership of the psychoanalyst (and as of the war, Brigadier General) William Menninger, Director of Neuropsychiatry in the Office of the Surgeon General, psychoanalytic psychiatry guided the armed forces in tending to soldiers who succumbed to combat fatigue, aka war neuroses, and getting some 60% of them back to their units in record time.  But it did so less because of the relatively small number of trained psychiatrists available to the armed forces than through the efforts of the General Medical Officers (GMOs), the psychiatric foot soldiers of the war.  These GPs, with at most three months of psychiatric training under military auspices, made up 1,600 of the Army’s 2,400-member neuropsychiatry service (Am. J. Psychiatry, 103:580, 1946).

The GPs carried the psychiatric load, and by all accounts they did a remarkable job.  Of course, it was the psychoanalytic brass – William and Karl Menninger, Roy Grinker, John Appel, Henry Brosin, Franklin Ebaugh, and others – who wrote the papers and books celebrating psychiatry’s service to the nation at war.  But they all knew that the GPs were the real heroes.  John Milne Murray, the Army Air Force’s chief neuropsychiatrist, lauded them as the “junior psychiatrists” whose training had been entirely “on the job” and whose ranks were destined to swell under the VA program of postwar psychiatric care (Am. J. Psychiatry, 103:594, 1947).

The splendid work of the GMOs encouraged expectations that they would help shoulder the nation’s psychiatric burden after the war.  The psychiatrist-psychoanalyst Roy Grinker, coauthor with John Spiegel of the war’s enduring contribution to military psychiatry, Men Under Stress (1945), was under no illusion about the ability of trained psychiatrists to cope with the influx of returning GIs, a great many “angry, regressed, anxiety-ridden, dependent men” among them (Men Under Stress, p. 450).  “We shall never have enough psychiatrists to treat all the psychosomatic problems,” he remarked in 1946, when the American Psychiatric Association boasted all of 4,000 members.  And he continued:  “Until sufficient psychiatrists are produced and more internists and practitioners make time available for the treatment of psychosomatic syndromes, we must use heroic shortcuts in therapy which can be applied by all medical men with little special training” (Psychosom. Med., 9:100-101, 1947).

Grinker was seconded by none other than William Menninger, who remarked after the war that “the majority of minor psychiatry will be practiced by the general physician and the specialists in other fields” (Am. J. Psychiatry, 103:584, 1947).  As to the ability of stateside GPs to manage the “neurotic” veterans, Lauren Smith, Psychiatrist-in-Chief to the Institute of Pennsylvania Hospital prior to assuming his wartime duties, offered a vote of confidence two years earlier.  The majority of returning veterans would “present” with psychoneuroses rather than major psychiatric illness, and most of them “can be treated successfully by the physician in general practice if he is practical in being sympathetic and understanding, especially if his knowledge of psychiatric concepts is improved and formalized by even a minimum of reading in today’s psychiatric literature”  (JAMA, 129:192, 1945).

These appraisals, enlarged by the Freudian sensibility that saturated popular American culture in the postwar years, led to the psychiatrization of American general practice in the 1950s and 60s.  Just as the GMOs had been the foot soldiers in the campaign to manage combat stress, so GPs of the postwar years were expected to lead the charge against the ever-growing number of “functional illnesses” presented by their patients (JAMA, 152:1192, 1953; JAMA, 156:585, 1954).  Surely these patients were not all destined for the analyst’s couch.  And in truth they were usually better off in the hands of their GPs, a point underscored by Robert Needles in his address to the AMA’s Section on General Practice in June of 1954.  When it came to functional and nervous illnesses, Needles lectured, “The careful physician, using time, tact, and technical aids, and teaching the patient the signs and meanings of his symptoms, probably does the most satisfactory job” (JAMA, 156:586, 1954).

Many generalists of the time, my father, William Stepansky, among them, practiced psychiatry.  Indeed they viewed psychiatry, which in the late 40s, 50s, and 60s typically meant psychoanalytically informed psychotherapy, as intrinsic to their work.  My father counseled patients from the time he set out his shingle in 1953.  Well-read in the psychiatric literature of his time and additionally interested in psychopharmacology, he supplemented medical school and internship with basic and advanced-level graduate courses on psychodynamics in medical practice.  Appointed staff research clinician at McNeal Laboratories in 1959, he conducted and published (Cur. Ther. Res. Clin. Exp., 2:144, 1960) clinical research on McNeal’s valmethamide, an early anti-anxiety agent.  Beginning in the 1960s, he attended case conferences at Norristown State Hospital (in exchange for which he gave his services, gratis, as a medical consultant).  And he participated in clinical drug trials as a member of the Psychopharmacology Research Unit of the University of Pennsylvania’s Department of Psychiatry, sharing authorship of several publications that came out of the unit.  In The Last Family Doctor, my tribute to him and his cohort of postwar GPs, I wrote:

“The constraints of my father’s practice make it impossible for him to provide more than supportive care, but it is expert support framed by deep psychodynamic understanding and no less valuable to his patients owing to the relative brevity of 30-minute ‘double’ sessions.  Saturday mornings and early afternoons, when his patients are not at work, are especially reserved for psychotherapy.  Often, as well, the last appointment on weekday evenings is given to a patient who needs to talk to him.  He counsels many married couples having difficulties.  Sometimes he sees the husband and wife individually; sometimes he sees them together in couples therapy.  He counsels the occasional alcoholic who comes to him.  He is there for whoever seeks his counsel, and a considerable amount of his counseling, I learn from [his nurse] Connie Fretz, is provided gratis.”

To be sure, this was family medicine of a different era.  Today primary care physicians (PCPs) lack the motivation, not to mention the time, to become frontline psychotherapists.  Nor would their credentialing organizations (or their accountants) look kindly on scheduling double-sessions for office psychotherapy and then billing the patient for a simple office visit.  The time constraints under which PCPs typically operate, the pressing need to maintain practice “flow” in a climate of regulation, third-party mediation, and bureaucratic excrescences of all sorts – these things make it more and more difficult for physicians to summon the patience to take in, much less to co-construct and/or psychotherapeutically reconfigure, their patients’ illness narratives.

But this is largely beside the point.  Contemporary primary care medicine, in lockstep with psychiatry, has veered away from psychodynamically informed history-taking and office psychotherapy altogether.  For PCPs and nonanalytic psychiatrists alike – and certainly there are exceptions – the postwar generation’s mandate to practice “minor psychiatry,” which included an array of supportive, psychoeducative, and psychodynamic interventions, has effectively shrunk to the simple act of prescribing psychotropic medication.

At most, PCPs may aspire to become, in the words of Howard Brody, “narrative physicians” able to empathize with their patients and embrace a “compassionate vulnerability” toward their suffering.  But even this has become a difficult feat.  Brody, a family physician and bioethicist, remarks that respectful attentiveness to the patient’s own story or “illness narrative” represents a sincere attempt “to develop over time into a certain sort of person – a healing sort of person – for whom the primary focus of attention is outward, toward the experience and suffering of the patient, and not inward, toward the physician’s own preconceived agenda” (Lit. & Med., 13:88, 1994; my emphasis).  The attempt is no less praiseworthy than the goal.  But where, pray tell, does the time come from?  The problem, or better, the problematic, has to do with the driven structure of contemporary primary care, which makes it harder and harder for physicians to enter into a world of open-ended storytelling that over time provides entry to the patient’s psychological and psychosocial worlds.

Whether or not most PCPs even want to know their patients in psychosocially (much less psychodynamically) salient ways is an open question.  Back in the early 90s, primary care educators recommended special training in “psychosocial skills” in an effort to remedy the disinclination of primary care residents to address the psychosocial aspects of medical care.  Survey research of the time showed that most residents not only devalued psychosocial care, but also doubted their competence to provide it (J. Gen. Int. Med., 7:26, 1992; Acad. Med., 69:48, 1994).

Perhaps things have improved a bit since then with the infusion of courses in the medical humanities into some medical school curricula and focal training in “patient and relationship-centered medicine” in certain residency programs.  But if narrative listening and relationship-centered practice are to be more than academic exercises, they must be undergirded by a clinical identity in which relational knowing is constitutive, not superadded in the manner of an elective.  Psychodynamic psychiatry was such a constituent in the general medicine that emerged after World War II.  If it has become largely irrelevant to contemporary primary care, what can take its place?  Are there other pathways through which PCPs, even within the structural constraints of contemporary practice, may enter into their patients’ stories?

Copyright © 2011 by Paul E. Stepansky.  All rights reserved.