
Anti-vaxxers in Free Fall

I read a news story in which a man is dying of Covid-19 in the hospital.  He is asked whether he regrets not getting vaccinated and rallies enough to reply, “No, I don’t believe in the vaccine.”  So what then does he believe in?  Systemic viral infection, suffering, and death?  If you don’t believe in vaccination, you don’t believe in modern medicine in toto.  You don’t believe in bacteriology, virology, cellular biology, microbiology, or immunology.  What then is left to prevent, diagnose, and treat disease?  Trump-ish medievalism, mysticism, shamanism, divine intervention?

A study by researchers at Harvard’s Brigham and Women’s Hospital used natural language processing to comb through 5,307 electronic patient records of adult type 2 diabetics living in Massachusetts and followed by their primary care physicians between 2000 and 2014.  They found that 43% (2,267) of patients refused to begin insulin therapy when their doctors recommended it.  Further, diabetics who declined the recommendation not only had higher blood sugar levels than those who began insulin, but had greater difficulty achieving glycemic control later on.[1]  So what do the insulin-declining diabetics believe in?  Chronic heart and kidney disease, blindness, and amputation – the all but inevitable sequelae of poorly managed diabetes?

The problem, really an epistemological problem, is that such people apparently have no beliefs at all – unless one imputes to them belief in disease, suffering, and death, and, in the case of Covid vaccine resisters, the prerogative to inflict them on others.  Such nonbelief is not tantamount to a scientifically specious belief system that unintentionally harms others.  During the yellow fever epidemic that left Philadelphia in ruins in 1793, Dr. Benjamin Rush, highly acclaimed throughout the newborn nation, set about his curative missions by draining his patients, in successive bleedings, of up to four pints of blood while simultaneously purging them (i.e., evacuating their bowels) with copious doses of toxic mercury.

Rush’s “Great Purge,” adopted by his followers, added hundreds, perhaps thousands, to the death toll in Philadelphia alone.  But at least Rush’s “system” derived from a belief system.  He did in fact find a theoretical rationale for his regimen in an essay by the Virginia physician and mapmaker John Mitchell.  Describing an outbreak of yellow fever in Virginia in 1741, Mitchell noted that the “abdominal viscera were filled with blood, and must be cleaned out by immediate evacuation.”[2]  Bleeding, of course, was conventional treatment for all manner of disease in 1793, so Mitchell’s recommendation came as no surprise.  Taken in conjunction with the system of mercuric purges employed by Dr. Thomas Young during the Revolutionary War, Mitchell’s essay gave Rush all the grounding he required for a ruinously misguided campaign that greatly extended the recovery time of those it did not kill.  But, yes, he had his theory, and he believed in it.

In the early 19th century, Napoleon, sweeping through Europe, conquers the north Italian province of Bolzano, which in 1807 he incorporates into Bavaria.  Two years later, when the Bavarian government mandates smallpox vaccination for all residents, the newly absorbed Italians launch an armed revolt, partly because they believe vaccination will inject Protestantism into their Catholic veins.[3]

All right, it is a nonsensical belief, even in 1809, but it is still a belief of sorts.  It is epistemically flawed, because it fails to stipulate what exactly makes a substance inherently Protestant; nor does it posit a mechanism of transmission whereby a Protestant essence seeps into liquid smallpox vaccine in the first place.  In the realm of ethics, it suggests that the possibility of death pales alongside the certainty of spiritual contamination by a fluid that, however neutral in life-saving potency, is injected by a Protestant hand.

Only slightly less ridiculous to modern ears is the mid-19th-century belief that general anesthesia via ether or chloroform, introduced by James Young Simpson in 1847, must be withheld from women giving birth.  The reason?  Genesis 3.16 enjoins women to bring forth new life in suffering.  Forget that the belief was espoused solely by certain men of the cloth and male physicians,[4] and rested on a highly questionable rendering of the biblical Hebrew.  Forget as well that, for Christians, Christ’s death redeemed humankind, relieving women of the need to relive the primal curse.  Bear in mind further that the alleged curse would also forbid, inter alia, use of forceps, caesarean operations, and embryotomy.  A woman with a contracted pelvis would die undelivered because she was guilty of a sin over which she had no control – that of having a contracted pelvis.[5]

In a secular nation whose founding documents assert everyone’s right to pursue happiness in his or her own pain-free terms, we see the primal curse as archaic misogynist drivel, no less absurd than the belief that the Bible, through some preternatural time warp, forbids vaccination.  But, hey, it’s a free country, and if a mid-19th-century or early-21st-century man chooses to believe that anesthesia permits men to escape pain whenever possible but women only in male-sanctioned circumstances, so be it.  It is a belief.

Now it’s 1878, and the worst yellow fever epidemic in American history is sweeping across the lower Mississippi Valley, taking lives and destroying commerce in New Orleans, Memphis, and the surrounding cities and towns to which refugees are streaming.  The epidemic will reach the Ohio Valley, bringing deadly Yellow Jack to Indiana, Illinois, and Ohio.  Koch’s monograph on the bacteriology of sepsis (wound infection) has been published that very year, and neither his work nor that of Lister is universally accepted in the American South.  Nor would its precepts have counted for much in the face of a viral (not bacterial) invader carried up the Mississippi from Havana.

What can city boards of health do in the face of massive viral infection, suffering, and death?  Beyond imposing stringent new sanitary measures, they can quarantine ships arriving in their harbors until all infected crew members have either died or been removed and isolated.  This will prevent the newly infected from infecting others and crippling cities still further – assuming, that is, a belief system in which yellow fever is contagious and spread from person to person.

But in 1878 Memphis, where by September the epidemic is claiming 200 lives a day, this “modern” belief is widely contested among the city’s physicians.  Some are contagionists, who believe that disease is caused by invisible entities that are transmissible.  But others, greater in number, favor the long-held theory that infectious disease results from “miasma” or bad air – air rendered toxic by decaying plant and animal matter in the soil.  If you believe miasma causes disease, then you’re hard-pressed to understand how quarantining ships laden with sick people will do anything to control the epidemic.

This is precisely the position of the 32 Memphis physicians who defeat the city council’s plan to institute a quarantine and set up a quarantine station.  Quarantine is pointless in the face of bad air.  The city’s only recourse, so hold the 32, is to alter the “epidemic constitution” of the atmosphere by inundating it with smoke.  Cannon blasts and blazing barrels of tar up and down city streets – that’s the ticket to altering the atmospheric conditions that create infectious disease.[6]

The miasmic theory of disease retained a medical following throughout the 1870s, after which it disappeared in the wake of bacteriology.  But in Memphis in 1878, bad air was still a credible theory in which physicians could plausibly believe.  And this matter of reasonable belief – reasonable for a particular time and place – takes us back to the hospitalized Covid patient of 2021 who, with virtually his last breath, defends his decision to remain unvaccinated because he doesn’t believe in the vaccine.  What is the knowledge base that sustains his disbelief?  There isn’t any.  He has no beliefs, informed or otherwise, about bacteriology, virology, cellular biology, or immunology.  At best, he has decided to accept what someone equally belief-less has told him about Covid vaccination, whether personally, in print, or over the internet.

It is no different among the 43% of Massachusetts diabetics who, nearly a century after Banting and Best’s miraculous discovery, declined insulin therapy when their doctors recommended it.  Their disbelief is actually a nonbelief because it is groundless.  For some, perhaps, the refusal falls back on a psychological inability to accept that one is diabetic enough to warrant insulin.  They resist the perceived stigma of being insulin-dependent diabetics.[7]  Here at least the grounds of refusal are intelligible and remediable.  An insulin phobia does not sustain real-world belief; it is an impediment to such belief in relation to diabetes and insulin, illness and long-term health, lesser and greater life expectancy.

Back in the present, I read another news story in which two unvaccinated hospital nurses explain to a journalist that they have refused Covid vaccination because the vaccines’ effectiveness is based on “junk data.”  Really?  Here there is the glimmering of a belief system, since scientific data can be more or less robust, more or less supportive of one or another course of action.

But what exactly makes Covid vaccine data worthless, i.e., junk?  And how have these two nurses acquired the expertise in epidemiology, population statistics, and data analysis to pass judgment on data deemed credible and persuasive by scientists at Pfizer, Moderna, Johnson & Johnson, the CDC, and the WHO?  And how, pray tell, have they gained access to these data?  Like all opponents of vaccine science, they pontificate out of ignorance, as if the mere act of utterance confers truth-value on what is being uttered.  It’s an extreme example of asserting as fact what remains to be demonstrated (the fallacy of petitio principii), the legacy of an ex-president who elevated pathological lying to a political art form.

Even the nurses pale alongside the anti-vax protester who is pictured in a news photo holding a sign that reads, “Vaccines Kill.”[8]  Whom do they kill and under what circumstances?  Does he mean all vaccines are deadly and kill people all the time, or just certain vaccines, such as the Covid vaccine?  But what does it matter?  The sign holder doesn’t know anything about any vaccines.  Does he really believe that everything we know about the history of vaccine science from the time of Jenner is bogus, and that children who once died from smallpox, cholera, yellow fever, diphtheria, pertussis, typhoid, typhus, tetanus, and polio are still dying in droves, now from the vaccines they receive to protect them from these infectious diseases during the earliest years of life?  Is the demographic fact that, owing to vaccination and other public health measures, life expectancy in the U.S. has increased from 47 in 1900 to 77 in 2021 also based on junk data?  In my essay “Anti-vaccinationism, American Style,” I provide statistics on the total elimination in the U.S. of smallpox and diphtheria and the virtual elimination of polio.  Were my claims also based on junk data?  If so, I’d appreciate being directed to the data that belie these facts and demonstrate that, in point of fact, vaccines kill.

Maybe the man with the sign has an acquaintance who believes a vaccine made him sick.  Perhaps someone in his internet chat group heard of someone else who became ill, or allegedly died, after receiving a vaccine.  Of course, death can follow vaccination without being caused by it.  Do we then assume that the man with the sign and like-minded protesters are well-versed in the difference between causation and correlation in scientific explanation?

We know that for a tiny number of individuals aspirin kills.[9]  So why doesn’t the man hold up a sign that reads, “Aspirin Kills”?  Here at least he would be calling attention to a scientific fact that people with GI conditions should be aware of.  We know that sugary drinks have been linked to 25,000 deaths in the U.S. each year.  Why not a sign, “Soda Kills”?  It would at least be based on science.  He chooses not to proclaim the lethality of aspirin or soda because he cares no more about aspirin- or soda-related deaths than about Covid-related deaths.  If he did, then, like the two nurses with their junk data and the Covid patient announcing disbelief in Covid vaccination on his deathbed, he would have to anchor his belief in consensually accepted scientific facts – a belief that someone, anyone, might find believable.

He is no different from other American anti-vaxxers I read about in the paper.  They are the epistemological Luddites of our time, intent on wrecking the scientific machinery of disease prevention, despite profound ignorance of vaccine science and its impact on human affairs since the late 18th century.  Indeed, they see no need to posit grounds of belief of any kind, since their anger – at Covid, at Big Government, at Big Science, at Big Medicine, at Big Experts – fills the epistemic void.  It fuels what they offer in place of the science of disease prevention:  the machinery of misinformation that is their stock in trade.

And therein is the source of their impotence.  They have fallen into an anti-knowledge black hole, and they struggle to fashion an existence out of anger that – to push the anti-matter trope a little further – repels rational thought.  Their contrarian charge is small solace for the heightened risks of disease, suffering, and death they incur and, far less conscionably, impose on the rest of us.

______________________

[1] N. Hosomura, S. Malmasi, et al., “Decline of Insulin Therapy and Delays in Insulin Initiation in People with Uncontrolled Diabetes Mellitus,” Diabetic Med., 34:1599-1602, 2017.

[2] J. H. Powell, Bring Out Your Dead:  The Great Plague of Yellow Fever in Philadelphia in 1793 (Philadelphia: Univ. of Pennsylvania Press, 1949), 76-78.

[3] My thanks to my friend Marty Meyers for bringing to my attention this event of 1809, as reported by Emma Bubola, “In Italy’s Alps, Traditional Medicine Flourishes, as Does Covid,” New York Times, December 16, 2021.

[4] With reason, wrote Elizabeth Cady Stanton in The Woman’s Bible (1895), “The Bible and the Church have been the greatest stumbling blocks in the way of women’s emancipation.”

[5] For a fuller examination of the 19th-century debate on the use of general anesthesia during childbirth, see Judith Walzer Leavitt, Brought to Bed: Childbearing in America, 1750-1950 (New York: Oxford University Press, 1986), ch. 5.

[6] On the measures taken to combat the epidemic in Memphis, including the rift between contagionist and noncontagionist physicians, see John H. Ellis, Yellow Fever and Public Health in the New South (Lexington: Univ. Press of Kentucky, 1992), ch. 3.

[7] A. Hussein, A. Mostafa, et al., “The Perceived Barriers to Insulin Therapy among Type 2 Diabetic Patients,” African Health Sciences, 19:1638-1646, 2019.

[8] Now, sadly, we have gone from hand-written “Vaccines Kill” signs to highway billboards, e.g., https://www.kxxv.com/hometown/mclennan-county/a-new-billboard-in-west-claims-vaccines-kill.

[9] Patients prescribed aspirin before developing a GI bleed or perforation are prominent among those killed by aspirin.  See A. Lanas, M. A. Perez-Aisa, et al., “A Nationwide Study of Mortality Associated with Hospital Admission Due to Severe Gastrointestinal Events and Those Associated with Nonsteroidal Antiinflammatory Drug Use,” Am. J. Gastroenterol., 100:1685-1693, 2005; S. Straube, M. R. Tramèr, et al., “Mortality with Upper Gastrointestinal Bleeding and Perforation,” BMC Gastroenterol., 8:41, 2009.

Humanitas, History, Empathy

In the nineteenth century, no one was devising courses, workshops, or coding schemes to foster empathic care-giving.  In both Europe and America, students were expected to learn medicine’s existential lessons in the manner they long had:  through mastery of Latin and immersion in ancient writings.  This fact should not surprise us:  knowledge of Latin was the great nineteenth-century signpost of general knowledge.  It was less an index of education achieved than testimony to educability per se.  As such, it was an aspect of cultural endowment essential to anyone aspiring to a learned profession.

I have written elsewhere about the relationship of training in the classics to medical literacy throughout the century.[1]  Here I focus on the “felt” aspect of this cultural endowment: the relationship of classical training to the kind of Humanitas (humanity) that was foundational to empathic caregiving.

The conventional argument has it that the role of Latin in medicine progressively diminished throughout the second half of the nineteenth century, as experimental medicine and laboratory science took hold, first in Germany and Austria, then in France, and finally in Britain and the United States, and transformed the nature of medical training.  During this time, physicians who valued classical learning, so the argument goes, were the older men who clung to what Christopher Lawrence terms “an epistemology of individual experience.”  In Britain, aficionados of the classics were the older, hospital-based physicians who sought to circumscribe the role of science in clinical practice.  Like their younger colleagues, they used the rhetoric of science to bolster their authority but, unlike the younger men, they “resisted the wholesale conversion of bedside practice into a science – any science.”  For these men, clinical medicine might well be based on science, but its actual practice was “an art which necessitated that its practitioners be the most cultured of men and the most experienced reflectors on the human condition.”[2]

For Lawrence, classical learning signified the gentleman-physician’s linkage of bedside practice with the breadth of wisdom associated with general medicine; as such, it left him “immune from sins begotten by the narrowness of specialization.”  In America, I believe, the situation was different.  Here the classics did not (or did not only) sustain an older generation intent on dissociating scientific advance from clinical practice.  Rather, in the final decades of the century, the classics sustained the most progressive of our medical educators in their efforts to resist the dehumanization of sick people inherent in specialization and procedural medicine.  Medical educators embraced experimental medicine and laboratory science, to be sure, but they were also intent on molding physicians whose sense of professional self transcended the scientific rendering of the clinical art.  Seen thus, the classics were more than a pathway to the literacy associated with professional understanding and communication; they were also a humanizing strategy for revivifying the Hippocratic Oath in the face of malfunctioning physiological systems and diseased organs.

Consider the case of Johns Hopkins Medical College, which imported the continental, experimental model to the United States and thereby became the country’s first modern medical school in 1893.  In the medical value assigned to the classics, three of Hopkins’ four founding fathers were second to none.  William Welch, the pathologist who headed the founding group of professors (subsequently known as “The Big Four”), only reluctantly began medical training in 1872, since it meant abandoning his first ambition:  to become a Greek tutor and ultimately a professor of classics at his alma mater, Yale University.  Welch’s love of the classics, especially Greek literature and history, spanned his lifetime.  “Everything that moves in the modern world has its roots in Greece,” he opined in 1907.

William Osler, the eminent Professor of Medicine who hailed from the Canadian woodlands north of Toronto, began his education as a rambunctious student at the Barrie Grammar School, where he and two friends earned the appellation “Barrie’s Bad Boys.”  On occasion, the little band would give way to “a zeal for study” that led them after lights-out to “jump out of our dormitory window some six feet above the ground and study our Xenophon, Virgil or Caesar by the light of the full moon.”  Osler moved on to the Trinity College School where, in a curriculum overripe with Latin and the classics, he finished first in his class and received the Chancellor’s Prize of 1866.  Two years later, he capped his premedical education at Trinity College with examination papers on Euclid, Greek (Medea and Hippolytus), Latin Prose, Roman History, Pass Latin (Terence), and Classics (Honours).[3]  Ever mindful of his classical training, Osler not only urged his Hopkins students “to read widely outside of medicine,” but admonished them to “Start at once a bed-side library and spend the last half hour of the day in communion with the saints of humanity,” among whom he listed Plutarch, Marcus Aurelius, Plato, and Epictetus.[4]

When Howard Kelly, the first Hopkins Professor of Gynecology and arguably the foremost abdominal surgeon of his time, began college in 1873, he was awarded the University of Pennsylvania’s matriculate Latin Prize for his thesis, “The Elements of Latin Prose Composition.”  Kelly, like Welch and Osler, was a lifetime lover of the classics, and he relished summer vacations, when he could “catch up on his Virgil and other classics.”[5]

Of the fourth Hopkins founding father, the reclusive, morphine-addicted surgeon William Stewart Halsted, there is no evidence of a life-long passion for the ancients, though his grounding in Latin and Greek at Phillips Academy, which he attended from 1863 to 1869, was typically rigorous.  Far more impressive bona fides belong to one of Halsted’s early trainees, Harvey Cushing, who came to Hopkins in 1897 and became the hospital’s resident surgeon in 1898.  Cushing, the founder of modern neurosurgery, entered Yale in 1887, where he began his college career “walking familiarly in the classics” with courses that included “geometry, Livy, Homer, Cicero, German, Algebra, and Greek prose.”  In February 1888, he wrote his father that Yale was giving him and his friends “our fill of Cicero.  We have read the Senectute and Amicitia and are reading his letters to Atticus, which are about the hardest Latin prose, and now we have to start in on the orations.”[6]

In the early twentieth century, Latin, no less than high culture in general, fell by the wayside in the effort to create modern “scientific” doctors.  By the 1920s, medical schools had assumed their modern “corporate” form, providing an education that was standardized and mechanized in the manner of factory production.  “The result of specialization,” Kenneth Ludmerer has observed, “was a crowded, highly structured curriculum in which subjects were taught as a series of isolated disciplines rather than as integrated branches of medicine.”[7]  Absent such integration, the very possibility of a holistic grasp of sick people, enriched by study of the classics, was relinquished.

The elimination of Latin from the premed curriculum made eminently good sense to twentieth-century medical educators.  But it was not only the language that went by the wayside.  Gone as well was familiarity with the broader body of myth, literature, and history to which the language gave access.  Gone, that is, was the kind of training that sustained holistic, perhaps even empathic, doctoring.

When in the fall of 1890 – a year after the opening of Johns Hopkins Hospital – Osler and Welch founded the Johns Hopkins Hospital Historical Club, it was with the explicit understanding that medical history, beginning with the Hippocratic and Galenic writings, was a humanizing building block in the formation of a medical identity.  The first year of monthly meetings was devoted exclusively to Greek medicine, with over half of the 15 presentations dealing with Hippocrates.  Osler’s two talks dealt, respectively, with “The Aphorisms of Hippocrates” and “Physic and Physicians as Depicted in Plato.”  Over the next three years, the Club’s focus broadened to biography, with Osler himself presenting essays on seven different American physicians, John Morgan, Thomas Bond, Nathan Smith, and William Beaumont among them.  His colleagues introduced the club to other medical notables, European and American, and explored topics in the history of the specialties, including the history of trephining, the history of lithotomy in women, and the ancient history of rhinoscopy.[8]

The collective delving into the history of medicine that took place within the Hopkins Medical History Club not only broadened the horizons of the participants, residents among them.  It also promoted a comfortable fellowship conducive to patient-centered medicine.  The Hopkins professors and their occasional guests were not only leading lights in their respective specialties, but Compleat Physicians deeply immersed in the humanities.  Residents and students who attended the meetings of the Club saw their teachers as engaged scholars; they beheld professors who, during the first several years of meetings, introduced them, inter alia, to “The Royal Touch for Scrofula in England,” “The Medicine of Shakespeare,” “The Plagues and Pestilences of the Old Testament,” and “An Old English Medical Poem by Abraham Cowley.”  Professors familiar with doctor-patient relationships throughout history were the very type of positive role models that contemporary medical educators search for in their efforts to counter a “hidden curriculum” that pulls students away from patient-centered values and into a culture of academic hierarchies, cynical mixed messages, and commercialism.[9]

Medical history clubs were not uncommon in the early decades of the twentieth century.  The Hopkins Club, along with the New York-based Charaka Club founded in 1899, had staying power.  In 1939, the third meeting of the Hopkins Club, which presented a play adapted by Hopkins’ medical librarian Sanford Larkey from William Bullein’s “A Dialogue Against the Fever Pestilence” (1564), drew a crowd of 460.  The following year, when the Hopkins Club celebrated its fiftieth anniversary, Baltimore alone boasted two other medical history clubs: the Osler Society of the Medical and Chirurgical Faculty of the State of Maryland and the Cordell Society of the University of Maryland.[10]

Although medical history clubs are a thing of the past, we see faint echoes of their milieu in contemporary medical student and resident support groups, some modeled on the Balint groups developed by Michael and Enid Balint at London’s Tavistock Clinic in the 1950s.[11]  All such groups seek to provide a safe space for shared reflection and self-examination in relation to physician-patient relationships.  In the late-nineteenth and early-twentieth centuries, history clubs filled this space with topics in medical history.  Their meetings broadened the care-giving sensibility of young physicians by exposing them to pain and suffering, to plagues and pestilences, far beyond the misery of everyday rounds.  Medical history and the broadened “medical self” it evokes and nurtures – now there’s a pathway to empathy.


[1] P. E. Stepansky, “Humanitas: Nineteenth-Century Physicians and the Classics,” presented to the Richardson History of Psychiatry Research Seminar, Weill Cornell Medical College, New York, NY, October 3, 2007.

[2] C. Lawrence, “Incommunicable knowledge: science, technology and the clinical art in Britain, 1850-1914,” J. Contemp. Hist., 20:503-520, 1985, quoted at pp. 504-505, 507.

[3] S. Flexner & J. T. Flexner, William Henry Welch and the Heroic Age of American Medicine (Baltimore:  Johns Hopkins University Press, 1968 [1941]), pp. 63-65, 419-420; H. Cushing, The Life of Sir William Osler (London: Oxford University Press, 1940), pp. 25, 39, 52.

[4] W. Osler, Aequanimitas, with other Addresses to Medical Students, Nurses and Practitioners of Medicine, 3rd edition (New York: McGraw-Hill, 1906), pp. 367, 463; L. F. Barker, Time and the Physician (New York: Putnam, 1942), p. 86.

[5] A. W. Davis, Dr. Kelly of Hopkins: Surgeon, Scientist, Christian (Baltimore: Johns Hopkins University Press, 1959),  pp. 17, 21.

[6] David Linn Edsall, who, as Dean of Harvard Medical School and of the Harvard School of Public Health during the 1920s, engineered Harvard’s progressive transformation, entered Princeton the same year (1887) Cushing entered Yale.  Edsall came to Princeton “a serious-minded young classicist” intent on a career in the classics.  See J. C. Aub & R. K. Hapgood, Pioneer in Modern Medicine: David Linn Edsall of Harvard (Cambridge: Harvard Medical Alumni Association, 1970), p. 7.  On Cushing and the classics, see E. H. Thomson, Harvey Cushing: Surgeon, Author, Artist (New York: Schuman, 1950), p. 20.

[7] K. M. Ludmerer, Learning to Heal: The Development of American Medical Education (New York:  Basic Books, 1985), pp. 256-57, 262.

[8] V. A. McKusick, “The minutes of the Johns Hopkins medical history club, 1890 to 1894,” Bull. Hist. Med., 27:177-181, 1953.

[9] F. W. Hafferty, “Beyond curriculum reform: confronting medicine’s hidden curriculum,” Acad. Med., 73:403-407, 1998; J. Coulehan, “Today’s professionalism: engaging the mind but not the heart,” Acad. Med., 80:892-898, 2005; P. Haidet & H. F. Stein, “The role of the student-teacher relationship in the formation of physicians: the hidden curriculum as process,” J. Gen. Int. Med., 21(suppl):S16-S20, 2005; S. Weissman, “Faculty empathy and the hidden curriculum” [letter to the editor], Acad. Med., 87:389, 2012.

[10] O. Temkin, “The Johns Hopkins medical history club,” Bull. Hist. Med., 7:809, 1939; W.R.B., “Johns Hopkins medical history club,” BMJ, 1:1036, 1939.

[11] K. M. Markakis, et al., “The path to professionalism: cultivating humanistic values and attitudes in residency training,” Acad. Med., 75:141-150, 2000; M. Hojat, “Ten approaches for enhancing empathy in health and human services cultures,” J. Health Hum. Serv. Adm., 31:412-450, 2009; K. Treadway & N. Chatterjee, “Into the water – the clinical clerkships,” NEJM, 364:1190-1193, 2011.  On contemporary Balint groups, see A. L. Turner & P. L. Malm, “A preliminary investigation of Balint and non-Balint behavioral medicine training,” Fam. Med., 36:114-117, 2004; D. Kjeldmand, et al., “Balint training makes GPs thrive better in their job,” Pat. Educ. Couns., 55:230-235, 2004; K. P. Cataldo, et al., “Association between Balint training and physician empathy and work satisfaction,” Fam. Med., 37:328-331, 2005.

Copyright © 2012 by Paul E. Stepansky.  All rights reserved.

Medical Freedom, Then and Now

“A nation’s liberties seem to depend upon headier and heartier attributes than the liberty to die without medical care.”
~Milton Mayer, “The Dogged Retreat of the Doctors” (1949)

Conservative Supreme Court justices who voice grave skepticism about the constitutionality of the Patient Protection and Affordable Care Act of 2010 would have been better suited to judicial service in the decades following the Revolutionary War.  Issues of health, illness, freedom, and tyranny were much simpler then.  Liberty, as understood by our founding fathers, operated only in the interlacing realms of politics and religion.  How could it have been otherwise?  Medical intervention did not affect the course of illness; it did not enable people to feel better and live longer and more productive lives.  With the exception of smallpox inoculation, which George Washington made mandatory among Continental troops in the winter of 1777, governmental intrusion into the health of its citizenry was nonexistent, even nonsensical.

Until roughly the eighth decade of the nineteenth century, you got sick, you recovered (often despite doctoring), you lingered on in sickness, or you died.  Antebellum (pre-Civil War) medicine relied on a variation of Galenic medicine developed in the eighteenth century by the Scottish physician William Cullen and his student John Brown.  According to Cullen’s system, all diseases were really variations of a single disease that consisted of too much tension or excitability (and secondarily too little tension or excitability) in the blood vessels.  Revolutionary-era and antebellum physicians sought to restore a natural balance by giving “overstimulated” patients (read: feverish, agitated, pain-ridden patients) large doses of toxic mercury compounds like calomel to induce diarrhea; emetics like ipecac and tobacco to induce vomiting; and by bleeding patients to the point of fainting (i.e., syncope).  It was not a pretty business.

Antebellum Americans did not have to worry about remedies for specific illnesses.  Except for smallpox vaccine and antimalarial cinchona tree bark (from which quinine was isolated in 1820), none existed.  Nor did they have to worry about long-term medical interventions for chronic conditions – bacterial infections, especially those that came in epidemic waves every two or three years, had no more opportunity to become chronic than diabetes, heart disease, or cancer.

Medical liberty, enshrined during the Jacksonian era, meant being free to pick and choose your doctor without any state interference.  So liberty-loving Americans picked and chose among calomel-dosing, bloodletting-to-syncope “regulars,” homeopaths, herbalists, botanical practitioners (Thomsonians), eclectics, hydropaths, phrenologists, Christian Scientists, folk healers, and faith healers.   State legislatures stood on the sidelines and applauded this instantiation of pure democracy.  By midcentury, 15 states had rescinded medical licensing laws; the rest gutted their laws and left them unenforced.  Americans were free to enjoy medical anarchy.

Now, mercifully, our notion of liberty has been reconfigured by two centuries of medical progress.  We don’t just get sick and die.  We get sick and get medical help, and, mirabile dictu, the help actually helps.  In antebellum America, deaths of young people under 20 accounted for half of all deaths.  Now our children don’t die of smallpox, cholera, yellow fever, dysentery, typhoid, and pulmonary and respiratory infections before they reach maturity.  Diphtheria no longer stalks them during the warm summer months.  When they get sick in early life, their parents take them to the doctor and they almost always get better.  Their parents, on the other hand, especially after reaching middle age, don’t always get better.  So they get ongoing medical attention to help them live longer and more comfortably with chronic conditions like diabetes, coronary heart disease, inflammatory bowel disease, Parkinson’s, and many forms of cancer.

When our framers drafted the Constitution, the idea of being free to live a productive and relatively comfortable life with long-term illness didn’t compute.  You died from diabetes, cancer, bowel obstruction, neurodegenerative disease, and any major infection (including, among young women, the infection that often followed childbirth).  A major heart attack usually killed you.  You didn’t receive dialysis and possibly a kidney transplant when you entered kidney failure.  Major surgery, performed on the kitchen table if you were of means or in a bacteria-infested, dimly lit, unventilated public hospital if you weren’t, was all but nonexistent because it invariably resulted in massive blood loss, infection, and death.

So, yes, our framers intended our citizenry to be free of government interference, including an obligatory mandate to subsidize health care for millions of uninsured and underserved Americans.  But then the framers never envisioned a world in which freedom could be safeguarded and extended by access to expert care that relieved suffering, effected cure, and prolonged life.  Nor could they envision the progressive income tax, compulsory vaccination, publicly supported clinics, mass screening for TB, diabetes, and syphilis, and Medicare.  Throughout the antebellum era, when regular physicians were reviled by the public and when neither regulars nor “alternative” practitioners could stem the periodic waves of cholera, yellow fever, and malaria that decimated local populations, it mattered little who provided one’s doctoring. Many, like the thousands who paid $20.00 for the right to practice Samuel Thomson’s do-it-yourself botanical system, chose to doctor themselves.

Opponents of the Affordable Care Act seem challenged by the very idea of progress.  Their consideration of liberty invokes an eighteenth-century political frame of reference to deprive Americans of a kind of liberty associated with a paradigm shift that arose in the 1880s and 1890s.  It was only then that American medicine began its transition to what we think of as modern medicine. Listerian antisepsis (and then asepsis); laboratory research in bacteriology, immunology, and pharmacology; laboratory development of specific remedies for specific illnesses; implementation of public health measures informed by bacteriology; modern medical education beginning with the opening of Johns Hopkins Medical College in 1893; and, yes, government regulation to safeguard the public from incompetent practitioners and toxic, sometimes fatal, medications – all were part of the transition.

“We hold these truths to be self-evident,” Jefferson begins the second paragraph of the Declaration of Independence, “that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.”  What Jefferson didn’t stipulate – what he couldn’t stipulate in his time and place – was the hierarchical relationship among these rights.  Now, in the twenty-first century, we are able to go beyond an eighteenth-century mindset in which “life, liberty, and the pursuit of happiness” functions as a noun phrase whose unitary import derives from the political tyrannies of King George III and the British Parliament.  Now we can place life at the base of the pyramid and declare that quality of life is indelibly linked to liberty and the pursuit of happiness.  To the extent that quality of life is diminished through disease and dysfunction, liberty and the pursuit of happiness are necessarily compromised.  In 2012, health is life; it is life poised to exercise liberty and pursue happiness to the fullest.

Why is it unconstitutional to obligate all citizens to participate in a health plan, either directly or through a mandate, that safeguards the right of people to efficacious health care regardless of their financial circumstances, their employment status, and their preexisting medical conditions?  What is it about the term “mandate” that is constitutionally questionable?  When you buy a house in this country, you pay local property taxes that support the local public schools.  (If you’re a renter, your landlord pays your share of the tax out of your rent.)  The property tax functions like the mandate:  its financial impact falls on you whether or not you directly benefit from the system it sustains.  To wit, you pay the tax whether or not you choose to send your children to the public schools, indeed, whether or not you have children.  You are obligated to subsidize the public education of children other than your own because public education, for all its failings, has been declared a public good by the polity of which you are a part.

It is inconceivable that the founding fathers would have found unconstitutional a law that extended life-promoting health care to the roughly 50 million Americans who lack health insurance.  The founding fathers declared that citizens – well, white, propertied males, at least – were entitled to life consistent with the demands and entitlements of representative democracy; their pledge, their Declaration, was not in support of a compromised life that limited the ability to fulfill those demands and enjoy those entitlements.

Of course, adult citizens may repudiate mainstream health care on the basis of their own philosophical or religious predilections.  Fine.  Americans who wish to pursue health outside the medical mainstream or, in the manner of medieval Christians, to disavow corporeal well-being altogether, are free to do so.  But they should not be allowed to undermine social and political arrangements, codified in law, that support everyone else’s right to pursue life and happiness through twenty-first century medicine.

The concept of medical freedom dominated the antebellum period and resurfaced during the early twentieth century, when compulsory childhood vaccination and Oklahoma Senator Robert Owen’s proposed legislation to create a federal department of public health spurred the formation of the Anti-Vaccination League of America, the American Medical Liberty League, and the National League for Medical Freedom.   According to these groups, medical freedom was incompatible not only with compulsory vaccination, but also with the medical examination of school children, premarital syphilis tests, and municipal campaigns against diphtheria.  In the 1910s, failure to detect and treat contagious bacterial disease was a small price to pay for freedom from what medical libertarians derided as “allopathic knowledge.”   These last gasps of the Jacksonian impulse were gone by 1930, by which time it was universally accepted that scientific medicine was, well, scientific, and, as such, something more than one medical sect among many.

After World War II, when the American Medical Association mounted its holy crusade against President Harry Truman’s proposal for national health care, “medical liberty” came into vogue once more, though its meaning had changed.  In antebellum America and again in the 1910s, it signified freedom to cast off the oppressive weight of “regular” medicine and pick and choose among the many alternative sects.  In the late 1940s, it signified freedom from federally funded health care, which would contaminate the sacrosanct doctor-patient relationship.  For the underserved, such freedom safeguarded the right to remain untreated.  The AMA’s legerdemain elicited ridicule by many, the prominent journalist Milton Mayer among them.  “Millions of Americans,” Mayer wrote in Harper’s in 1949, “geographically or economically isolated, now have access to one doctor or none.  The AMA would preserve their present freedom of choice.”  In 1960, the medical reporter Selig Greenberg mocked medical free choice as a “hoary slogan” based on “the fatuous assumption that shopping around for a doctor without competent guidance and paying him on a piecemeal basis somehow guarantees a close relationship and high-quality medical care.”[1]

Now the very notion of medical freedom has an archaic ring.  We no longer seek freedom from the clutches of mainstream medicine; now we seek freedom to avail ourselves of what mainstream medicine has to offer.  At this singular moment in history, in a fractionated society represented by a bitterly divided Congress, access to health care will be expanded and safeguarded, however imperfectly, by the Affordable Care Act.  Those who opt out of the Act should pay a price, because they remain part of a society committed to health as a superordinate value without which liberty and the pursuit of happiness are enfeebled.  To argue about whether the price of nonparticipatory citizenship in the matter of health care can be a tax but not a mandate is obfuscating wordplay.  And the health and well-being of we the people should not be a matter of wordplay.


[1] Milton Mayer, “The Dogged Retreat of the Doctors,” Harper’s Magazine, 199:25-37, 1949, quoted at pp. 32, 35; Selig Greenberg, “The Decline of the Healing Art,” Harper’s Magazine, 221:132-137, 1960, quoted at p. 134.

Copyright © 2012 by Paul E. Stepansky.  All rights reserved.

Medical Toys, Old and New

“The plethora of tests available to the young clinician has significantly eroded the skills necessary to obtain adequate histories and careful physical examinations.  Day in and day out, I encounter egregious examples of misdiagnosis engendered by inadequacies in these skills.”
~William Silen, M.D., “The Case for Paying Closer Attention to Our Patients” (1996)

“Treat the Patient, Not the CT Scan,” adjures Abraham Verghese in a New York Times op-ed piece of February 26, 2011.  Verghese targets American medicine’s overreliance on imaging tests, but, like others before him, he is really addressing the mindset that fosters such overreliance.  Preclinical medical students, he reminds us, all learn physical examination and diagnosis, but their introduction to the art dissipates under the weight of diagnostic tests and specialist procedures during their clinical years.  “Then,” he writes, “they discover that the currency on the ward seems to be ‘throughput’ – getting tests ordered and getting results, having procedures like colonoscopies done expeditiously, calling in specialists, arranging discharge.”  In the early 90s, William Silen, Harvard’s Johnson and Johnson Distinguished Professor of Surgery,[1] made the same point with greater verve.  In one of his wonderful unpublished pieces, “Lumps and Bumps,” he remarked that “the modern medical student, and most physicians, have been so far removed from physical diagnosis, that they simply do not accept that a mass is a mass is a mass unless the CT scan or ultrasound tells them it is there.”

Verghese and Silen get no argument from me on the clinical limitations and human failings associated with technology-driven medicine.  But these concerns are hardly unique to an era of CT scans and MRIs.  There is a long history of concern about overreliance on new technologies; Silen has a delightfully pithy, unpublished piece on the topic that is simply titled, “New Toys.”

One limitation of such critiques is the failure to recognize that not all “toys” are created equal.  Some new toys become old toys, at which point they cease being toys altogether and simply become part of the armamentarium that the physician brings to the task of physical examination and diagnosis.  For example, we have long since stopped thinking of x-ray units, EKG machines, blood pressure meters (i.e., sphygmomanometers), and stethoscopes as “new toys” that militate against the acquisition of hands-on clinical skill.

But it was not always so.  When x-rays became available in 1896, clinical surgeons were aghast.  What kind of images were these?  Surely not photographic images in the reliably objectivistic late-nineteenth-century sense of the term.  The images were wavy, blurry, and imprecise, vulnerable to changes in the relative location of the camera, the x-ray tube, and the object under investigation.  That such monstrously opaque images might count as illustrative evidence in courts of law, that they might actually be turned against the surgeon and his “expert opinion” – what was the world coming to?  Military surgeons quickly saw the usefulness of x-rays for locating bullets and shrapnel, but their civilian colleagues remained suspicious of the new technology for a decade or more after its invention.  No fools, they resorted to x-rays only when they felt threatened by malpractice suits.

Well before the unsettling advent of x-ray photography, post-Civil War physician-educators were greatly concerned about the use of mechanical pulse-reading instruments.  These ingenious devices, so they held, would discourage young physicians from learning to appreciate the subtle diagnostic indicators embedded in the pulse.  And absent such appreciation, which came only from prolonged training of their fingertips, they could never acquire the diagnostic acumen of their seniors, much less the great pulse readers of the day.

Thus they cautioned students and young colleagues to avoid the instruments.  It was only through “the habit of discriminating pulses instinctively” that the physician acquired “valuable truths . . . which he can apply to practice.”  So inveighed the pioneering British physiologist John Burdon-Sanderson in 1867.  His judgment was shared by a generation of senior British and American clinicians for whom the trained finger remained a more reliable measure of radial pulse than the sphygmograph’s arcane tracings.  In The Pulse, his manual of 1890, William Broadbent cautioned his readers to avoid the sphygmograph, since interpretation of its tracings could “twist facts in the desired direction.”  Physicians should “eschew instrumental aids and educate the finger,” echoed Graham Steell in The Use of the Sphygmograph in Medicine at the century’s close.[2]

Lower still on the totem pole of medical technology – indeed, about as low down as one can get – is the stethoscope, “invented” by René Laennec in 1816 and first employed by him in the wards of Paris’s Hôpital Necker.  In 1898, James Mackenzie, the founder of modern cardiology, relied on the stethoscope, used in conjunction with his own refinement of the Dudgeon sphygmograph of 1881 (i.e., the Mackenzie polygraph of 1892), to identify what we now term atrial fibrillation.  In the years to follow, Mackenzie, a master of instrumentation, became the principal exponent of what historians refer to as the “new cardiology.”  His “New Methods of Studying Affections of the Heart,” a series of articles published in the British Medical Journal in 1905, signaled a revolution in understanding cardiac function.  “No man,” remarked his first biographer, R. McNair Wilson, in 1926, “ever used a stethoscope with a higher degree of expertness.”  And yet this same Mackenzie lambasted the stethoscope as the instrument that had “not only for one hundred years hampered the progress of knowledge of heart affections, but had done more harm than good, in that many people had had the tenor of their lives altered, had been forbidden to undertake duties for which they were perfectly competent, and had been subject to unnecessary treatment because of its findings.”[3]

Why did Mackenzie come to feel this way?  The problem with the stethoscope was that the auscultatory sounds it “discovered,” while diagnostically illuminating, could cloud clinical judgment and lead to unnecessary treatments, including draconian restrictions of lifestyle.  For Mackenzie, sphygmomanometers were essentially educational aids that would corroborate what medical students were learning to discern through their senses.  And, of course, he allowed for the importance of such gadgetry in research.  His final refinement of pulse-reading instrumentation, the ink jet polygraph of 1902, was just such a tool.  But it was never intended for generalists, whose education of the senses was expected to be adequate to the meaning of heart sounds.  Nor was Mackenzie a fan of the EKG when it found its way into hospitals after 1905.  He perceived it as yet another “new toy” that provided no more diagnostic information than the stethoscope and ink jet polygraph.  And for at least the first 15 years of the machine’s use, he was right.

Now, of course, the stethoscope, the sphygmomanometer, and, for adults of a certain age, the EKG machine are integral to the devalued art of physical examination.  Critics who bemoan the overuse of CT scans and MRIs, of echocardiography and angiography, would be happy indeed if medical students and residents spent more time examining patients and learning all that can be learned from stethoscopes, blood pressure monitoring, and baseline EKGs.  But more than a century ago these instrumental prerequisites of physical examination and diagnosis were themselves new toys, and educators were wary of what medical students would lose by relying on them at the expense of educating their senses.  Now educators worry about what students lose by not relying on them.

Toys aside, I too hope that those elements of physical diagnosis that fall back on one tool of exquisite sensitivity – the human hand – will not be lost among reams of lab results and diagnostic studies.  One shudders at the thought of a clinical medicine utterly bereft of the laying on of hands, which is not only an instrument of diagnosis but also an amplifier of therapy.  The great pulse readers of the late nineteenth century are long gone and of interest only to a handful of medical historians.  Will the same be true, a century hence, of the great palpators of the late twentieth?


[1] I worked as Dr. Silen’s editor in 2000-2001, during which time I was privileged to read his unpublished lectures, addresses, and general-interest medical essays as preparation for helping him organize his memoirs.  Sadly, the memoirs project never materialized.

[2] In this paragraph, I am guided especially by two exemplary studies: Christopher Lawrence, “Incommunicable Knowledge: Science, Technology and the Clinical Art in Britain, 1850-1914,” J. Contemp. Hist., 20:503-520, 1985, and Hughes Evans, “Losing Touch: The Controversy Over the Introduction of Blood Pressure Instruments in Medicine,” Tech. Cult., 34:784-807, 1993.  Broadbent and Steell are quoted from Lawrence, p. 516.

[3] R. McNair Wilson, The Beloved Physician: Sir James Mackenzie (New York:  Macmillan, 1926), pp. 103-104. A more recent, detailed account of Mackenzie’s life and career is Alex Mair, Sir James Mackenzie, M.D., 1853-1925 – General Practitioner (London: Royal College of General Practitioners, 1986).

Copyright © 2012 by Paul E. Stepansky.  All rights reserved.

My Doctor, My Friend

In a piece several months ago in the Boston Globe, “Blurred Boundaries Between Doctor and Patient,” columnist and primary care internist Suzanne Koven writes movingly of her patient Emma, whom Koven befriended over the last 15 years of Emma’s life.  “Emma and I met frequently to gossip, talk about books and politics, and trade stories about our lives,” she remarks.  “She came to my house for dinner several times, and my husband and kids joined me at her 90th birthday party.  When, at 92, Emma moved reluctantly into a nursing home, I brought her the bagels and lox she craved – rich, salty treats her doctor had long discouraged her from eating.  Here’s the funny part:  I was that doctor.”

Koven writes perceptively of her initial concern with doctor-patient boundaries (heightened, she admits, by her status as “a young female physician”), her ill-fated efforts to maintain her early ideal of professional detachment, and, as with Emma, her eventual understanding that the roles of physician and friend could be for the most part “mutually reinforcing.”

As a historian of medicine interested in the doctor-patient relationship, I reacted to Koven’s piece appreciatively but, as I confessed to her, sadly.  For her initial concern with “blurred boundaries” and her realization after years of practice about the compatibility of friendship with primary medical care only underscore the fragmented and depersonalized world of contemporary medicine, primary care included.  By this, I mean that the quality of intimacy that grows out of most doctoring has become so shallow that we are led to scrutinize doctor-patient “friendship” as a problematic (Is it good?  Is it bad?  Should there be limits to it?) and celebrate instances of such friendship as signal achievements.   Psychoanalysts, be it noted, have been pondering these questions in their literature for decades, but they at least have the excuse of their method, which centrally implicates the analysis and resolution of transference with patients who tend to become inordinately dependent on them.

My father, William Stepansky, like many of the WWII generation, befriended his patients, but he befriended them as their doctor.  That is, he understood his medicine to include human provisions of a loving and Hippocratic sort.  Friendly two-way extramedical queries about his family, contact at community events, attendance at local weddings and other receptions – these were not boundary-testing land mines but aspects of community-embedded caregiving.  But here’s the rub:  My father befriended his patients as their doctor; his friendship was simply the caring dimension of his care-giving.  What, after all, did he have in common with the vast majority of his patients?  They were Protestants and Catholics, members of the Rotary and Kiwanis Clubs who attended the local churches and coached little league baseball and Pop Warner football.  He was an intellectual East European Jew, a serious lifelong student of the violin whose leisure time was spent practicing, reading medical journals, and tending to his lawn.

And yet to his patients, he was always a special friend, though he himself would admit nothing special about it:  his friendship was simply the human expression of his calling.  He did not (to my knowledge) bring anyone bagels and lox or pay visits to chat about books or politics, but he provided treatment (including ongoing supportive psychotherapy) at no charge, accepted payment in kind, and visited patients in their homes when they became too elderly or infirm to come to the office.  Other routine “friendly” gestures included charging for a single visit when a mother brought a brood of sick children to the office during the cold season.  And when elderly patients became terminally ill, they did not have to ask – he simply began visiting them regularly in their homes to provide what comfort he could and to let them know they were on his mind.

When he announced his impending retirement to his patients in the fall of 1990, his farewell letter began “Dear Friend” and then expressed regret at “leaving many patients with whom I have shared significant life experience from which many long-term friendships have evolved.”  “It has been a privilege to serve as your physician for these many years,” he concluded.  “Your confidence and friendship have meant much to me.”  When, in my research for The Last Family Doctor, I sifted through the bags of cards and letters that followed this announcement, I was struck by the number of patients who not only reciprocated my father’s sentiment but summoned the words to convey deep gratitude for the gift of their doctor’s friendship.

In our own era of fragmented multispecialty care, hemmed in by patient rights, defensive medicine, and concerns about boundary violations, it is far from easy for a physician to “friend” a patient as physician, to be and remain a physician-friend.  Furthermore, physicians now wrestle with the ethical implications of “friending” in ways that are increasingly dissociated from a medical identity.  Many choose to forgo professional distance at the close of a workday.  No less than the rest of us, physicians seek multicolored self-states woven of myriad connective threads; no less than the rest of us, they are the Children of Facebook.

But there is a downside to this diffusion of connective energy.  When, as a society, we construe the friendship of doctors as extramedical, when we pull it into the arena of depersonalized connecting fostered by social media, we risk marginalizing the deeper kind of friendship associated with the medical calling: the physician’s nurturing love of the patient.  And we lose sight of the fact that, until the final two decades of the 19th century, when advances in cellular biology, experimental physiology, bacteriology, and pharmacology ushered in an era of specific remedies for specific ailments, most effective doctoring – excluding only a limited number of surgeries – amounted to little more than just such friendship, such comfortable and comforting “friending” of sick and suffering people.

And this takes us back to Suzanne Koven, who imputes the “austere façade” of her medical youth to those imposing 19th-century role models “whose oil portraits lined the walls of the hospital [MGH] in which I did my medical training.”  Among the grim visages that stared down from on high was that of the illustrious James Jackson, Sr., who brought Jenner’s technique of smallpox vaccination to the shores of Boston in 1800, became Harvard’s second Hersey Professor of the Theory and Practice of Medicine in 1812, and was a driving force in the founding of MGH, which opened its doors in 1821.  Koven cites a passage from the second of Jackson’s Letters to a Young Physician (1855) in which he urges his young colleague to “abstain from all levity” and “never exact attention to himself.”

But why should absence of levity and focal concern with the patient be tantamount to indifference, coolness, the withholding of physicianly friendship?  Was Jackson really so forbidding a role model?  Composing his Letters in the wake of the cholera epidemic of 1848, when “regular” remedies such as bleeding and purging proved futile and only heightened the suffering of thousands, Jackson counseled modesty when it came to therapeutic pretensions.  He abjured the use of drugs “as much as possible,” and added that “the true physician takes care of his patient without claiming to control the disease in all cases.”  Indeed, he sought to restore “cure” to its original Latin meaning, curare, the sense in which “to cure” meant “to take care.”  “The physician,” he instructed his protégé,

“may do very much for the welfare of the sick, more than others can do, although he does not, even in the major part of cases, undertake to control and overcome the disease by art.  It was with these views that I never reported any patients cured at our hospital.  Those who recovered their health before they left the house were reported as well, not implying that they were made so by the active treatment they had received there.  But it was to be understood that all patients received in that house were to be cured, that is, taken care of” [Letters to a Young Physician, p. 16, Jackson’s emphasis].

And then he moved on to the narrowing of vision that safeguarded the physician’s caring values, his cura:

“You must not mistake me.  We are not called upon to forget ourselves in our regard for others.  We do not engage in practice merely from philanthropy.  We are justified in looking for both profit and honor, if we give our best services to our patients; only we must not be thinking of these when at the bedside.  There the welfare of the sick must occupy us entirely” [Letters to a Young Physician, pp. 22-23].

Koven sees the Hippocratic commitment that lies beneath Jackson’s stern glance and, with the benefit of hindsight, links it to her friendship with Emma. “As mutually affectionate as our friendship was,” she concludes, “her health and comfort were always its purpose.”  Indeed.  For my father and any number of caring generalists, friendship was prerequisite to clinical knowing and foundational to clinical caring.  It was not extramural, not reserved for special patients, but a way of being with all patients.  And this friendship for his patients, orbiting around a sensibility of cura and a wide range of procedural activities, was not a heavy thing, leaden with solemnity.  It was musical.  It danced.

In the early 60s, he returns from a nursing home where he has just visited a convalescing patient.  I am his traveling companion during afternoon house calls, and I greet him on his return to the car.  He looks at me and with a sly grin remarks that he has just added “medicinal scotch” to the regimen of this elderly gentleman, who sorely missed his liquor and was certain a little imbibing would move his rehab right along.  It was a warmly caring gesture worthy of Osler, that lover of humanity, student of the classics, and inveterate practical joker.  And a generation before Osler, the elder Jackson would have smiled.  Immediately after cautioning the young physician to “abstain from all levity,” he added: “He should, indeed, be cheerful, and, under proper circumstances, he may indulge in vivacity and in humor, if he has any.  But all this should be done with reference to the actual state of feeling of the patient and of his friends.”  Just so.

Copyright © 2012 by Paul E. Stepansky.  All rights reserved.