
The Hunt For Caring Med Students

The MCATs, new and improved, will save us!  The overhauled medical school admissions test, which was approved by the Association of American Medical Colleges last February and will take effect in 2015, will devote almost half its questions to the social sciences and critical reasoning, with the latter including reading passages addressing cross-cultural issues and medical ethics.  According to Darrell G. Kirch, President of the AAMC, the new version of the test will aid medical schools in finding students “who you and I would want as our doctors.  Being good doctors isn’t just about understanding science, it’s about understanding people.”[1]

To which I reply:  Will wonders never cease?  We’re going to help medical schools make humanistic doctors with better people skills by making sure premed students are exposed to humanistic medicine as it filters through introductory psychology and sociology courses.  Had AAMC personnel perused a sampling of introductory psychology and sociology syllabi, they might have paused before deciding to cultivate this new skill set through introductory social science courses, which, in this day and age, devote little time to theories of personality, family structure and dynamics, psychosocial development, and psychodynamics – the very topics that engaged me when I studied introductory psychology in the fall of 1969.  Still less do today’s introductory social science courses permit psychosocial and ethical consideration of health-related issues; for the latter, one seeks out upper-class courses in medical sociology, medical anthropology, and, of course, medical ethics.

If it’s a matter of choosing general nonscience courses that frame some of the ethical and cross-cultural (and racial and gender-related) issues tomorrow’s physicians will face, introductory courses in moral philosophy and/or ethics would be far more to the point.  But I am a historian and my own bias is clear:  At the top of my list of horizon-broadening and humanizing courses would be surveys of nineteenth- and twentieth-century medicine in its cultural, political, and institutional aspects.  I offer two such seminars to upper-class history majors at my university under the titles “Medicine and Society: From Antebellum America to the Present” and “Women, Their Bodies, Their Health, and Their Doctors: America, 1850 to the Present.”  Both seminars address doctor-patient relationships over the past two centuries, a topic at the heart of the social history of medicine.

But let’s face it.  Requiring premed students to take a few additional courses is a gesture – something more than an empty gesture but still a weak gesture.  There is every reason to believe that students who spend their undergraduate years stuffing their brains with biology, organic chemistry, and physics will  approach the social science component of premed studies in the same task-oriented way.  The nonscience courses will simply be another hurdle to overcome.  Premed students will take introductory psychology and sociology to learn what they need to know to do credibly well on the MCATs.  And, for most of them, that will be that.  Premed education will continue to be an intellectual variant of survivor TV:  making the grade(s), surviving the cut, and moving on to the next round of competition.

The overhaul of the MCAT is premised on the same fallacy that persuades medical educators they can “teach” empathy to medical students through dramatizations, workshops, and the like.  The fallacy is that physicianly caring, especially caring heightened by empathy, is a cognitive skill that can be instilled through one-time events or curricular innovations.  But empathy cannot be taught, not really.  It is an inborn sensibility associated with personality and temperament.   It is not an emotion (like rage, anger, joy) but an emotional aptitude that derives from the commensurability of one’s own feeling states with the feeling states of others.  The aptitude is two-fold:  It signifies (1) that one has lived a sufficiently rich emotional life to have a range of emotions available for identificatory purposes; and (2) that one is sufficiently disinhibited to access one’s own emotions, duly modulated, to feel what the patient or client is feeling in the here and now of the clinical encounter.  Empathy does not occur in a vacuum; it always falls back on the range, intensity, and retrievability of one’s own emotional experiences.  For this reason, Heinz Kohut, who believed empathy was foundational to the psychoanalytic method, characterized it as “vicarious introspection,” the extension of one’s own introspection (and associated feelings) to encompass the introspection (and associated feelings) of another.

Everyone possesses this ability to one degree or another; extreme situations elicit empathy even in those who otherwise live self-absorbed, relationally parched lives.  This is why psychologists who present medical students with skits or film clips of the elderly in distressing situations find the students score higher on empathy scales administered immediately after viewing such dramatizations.  But the “improvement” is short-lived.[2]  An ongoing (read: characterological) predisposition to engage others in caring and comprehending ways cannot result from what one team of researchers breezily terms “empathy interventions.”[3]

If one seeks to mobilize a preexisting aptitude for empathic caregiving, there are much better ways of doing it than adding introductory psychology and sociology courses to the premed curriculum.  Why not give premed students sustained contact with patients and their families in settings conducive to an emotional connection?  Let’s introduce them to messy and distressing “illness narratives” in a way that is more than didactic.  Let’s place them in situations in which these narratives intersect with their own lived experience.  To wit, let’s have all premed students spend the summer following their junior year as premed volunteers in one of three settings:  pediatric cancer wards; recovery and rehab units in VA hospitals; and public geriatric facilities, especially the Alzheimer’s units of such facilities.

I recommend eight weeks of full-time work before the beginning of senior year.  Routine volunteer duties would be supplemented by time set aside for communication – with doctors, nurses, and aides, but especially with patients and their families.  Students would be required to keep journals with daily entries that recorded their experience – especially how it affected (or didn’t affect) them personally and changed (or didn’t change) their vision of medicine and medical practice.  These journals, in turn, would be included with their senior-year applications to medical school.  Alternatively, the journals would be the basis for an essay on doctor-patient relationships informed by their summer field work.

I mean, if medical educators want to jumpstart the humane sensibility of young doctors-to-be, why not go the whole nine yards and expose these scientifically minded young people to aspects of the human condition that will stretch them emotionally?  Emotional stretching will not make them empathic; indeed, it may engender the same defenses that medical students, especially in the third year, develop to ward off emotional flooding when they encounter seriously ill patients.[4]  But apart from the emotions spurred or warded off by daily exposure to children with cancer, veterans without limbs, and elderly people with dementias, the experience will have a psychoeducational yield:  It will provide incoming med students with a broadened range of feeling states that will be available to them in the years ahead.  As such, their summer in the trenches will lay a foundation for clinical people skills far more durable than what they can glean from introductory psychology and sociology texts.

Those premed students of caring temperament will be pulled in an “empathic” direction; they will have an enlarged reservoir of life experiences to draw on when they try to connect with their patients during medical school and beyond.  Those budding scientists who are drawn to medicine in its research or data-centric “managerial” dimension[5] will at least have a broadened awareness of the suffering humanity that others must tend to.  Rather than reaching for the grand prize (viz., a generation of empathic caregivers), the AAMC might lower its sights and help medical schools create physicians who, even in technologically driven specialties and subspecialties, evince a little more sensitivity.  In their case, this might simply mean understanding that many patients need doctors who are not like them.  A small victory is better than a Pyrrhic victory.


[1] Elisabeth Rosenthal, “Molding a New Med Student,” Education/Life Supplement, New York Times, April 15, 2012, pp. 20-22.

[2] Lon J. Van Winkle, Nancy Fjortoft, & Mohammadreza Hojat, “Impact of a Workshop About Aging on the Empathy Scores of Pharmacy and Medical Students,” Amer. J. Pharmaceut. Ed., 76:1-5, 2012.

[3] Sarah E. Wilson, Julie Prescott, & Gordon Becket, “Empathy Levels in First- and Third-Year Students in Health and Non-Health Disciplines,” Amer. J. Pharmaceut. Ed., 76:1-4, 2012.

[4] Eric R. Marcus, “Empathy, Humanism, and the Professionalization Process of Medical Education,” Acad. Med., 74:1211-1215, 1999;  Mohammadreza Hojat, et al., “The Devil is in the Third Year: A Longitudinal Study of Erosion of Empathy in Medical School,” Acad. Med., 84:1182-1191, 2009.

[5] Beverly Woodward, “Confidentiality, Consent and Autonomy in the Physician-Patient Relationship,” Health Care Analysis, 9:337-351, 2001.

Copyright © 2012 by Paul E. Stepansky.  All rights reserved.


Medical Freedom, Then and Now

“A nation’s liberties seem to depend upon headier and heartier attributes than the liberty to die without medical care.”
~Milton Mayer, “The Dogged Retreat of the Doctors” (1949)

Conservative Supreme Court justices who voice grave skepticism about the constitutionality of the Patient Protection and Affordable Care Act of 2010 would have been better suited to judicial service in the decades following the Revolutionary War.  Issues of health, illness, freedom, and tyranny were much simpler then.  Liberty, as understood by our founding fathers, operated only in the interlacing realms of politics and religion.  How could it have been otherwise?  Medical intervention did not affect the course of illness; it did not enable people to feel better and live longer and more productive lives.  With the exception of smallpox inoculation, which George Washington made mandatory among colonial troops in the winter of 1777, governmental intrusion into the health of the citizenry was nonexistent, even nonsensical.

Until roughly the eighth decade of the nineteenth century, you got sick, you recovered (often despite doctoring), you lingered on in sickness, or you died.  Antebellum (pre-Civil War) medicine relied on a variation of Galenic medicine developed in the eighteenth century by the Scottish physician William Cullen and his student John Brown.  According to Cullen’s system, all diseases were really variations of a single disease that consisted of too much tension or excitability (and secondarily too little tension or excitability) in the blood vessels.  Revolutionary-era and antebellum physicians sought to restore a natural balance by giving “overstimulated” patients (read: feverish, agitated, pain-ridden patients) large doses of toxic mercury compounds like calomel to induce diarrhea; emetics like ipecac and tobacco to induce vomiting; and by bleeding patients to the point of fainting (i.e., syncope).  It was not a pretty business.

Antebellum Americans did not have to worry about remedies for specific illnesses.  Except for smallpox vaccine and antimalarial cinchona tree bark (from which quinine was isolated in 1820), none existed.  Nor did they have to worry about long-term medical interventions for chronic conditions – bacterial infections, especially those that came in epidemic waves every two or three years, had no more opportunity to become chronic than diabetes, heart disease, or cancer.

Medical liberty, enshrined during the Jacksonian era, meant being free to pick and choose your doctor without any state interference.  So liberty-loving Americans picked and chose among calomel-dosing, bloodletting-to-syncope “regulars,” homeopaths, herbalists, botanical practitioners (Thomsonians), eclectics, hydropaths, phrenologists, Christian Scientists, folk healers, and faith healers.   State legislatures stood on the sidelines and applauded this instantiation of pure democracy.  By midcentury, 15 states had rescinded medical licensing laws; the rest gutted their laws and left them unenforced.  Americans were free to enjoy medical anarchy.

Now, mercifully, our notion of liberty has been reconfigured by two centuries of medical progress.  We don’t just get sick and die.  We get sick and get medical help, and, mirabile dictu, the help actually helps.  In antebellum America, deaths of young people under 20 accounted for half of all deaths.  Now our children don’t die of smallpox, cholera, yellow fever, dysentery, typhoid, and pulmonary and respiratory infections before they reach maturity.  Diphtheria no longer stalks them during the warm summer months.  When they get sick in early life, their parents take them to the doctor and they almost always get better.  Their parents, on the other hand, especially after reaching middle age, don’t always get better.  So they get ongoing medical attention to help them live longer and more comfortably with chronic conditions like diabetes, coronary heart disease, inflammatory bowel disease, Parkinson’s, and many forms of cancer.

When our framers drafted the Constitution, the idea of being free to live a productive and relatively comfortable life with long-term illness didn’t compute.  You died from diabetes,  cancer, bowel obstruction, neurodegenerative disease, and any major infection (including, among young women, the infection that often followed childbirth).  A major heart attack usually killed you.  You didn’t receive dialysis and possibly a kidney transplant when you entered kidney failure.  Major surgery, performed on the kitchen table if you were of means or in a bacteria-infested, dimly lit, unventilated public hospital if you weren’t, was all but nonexistent because it invariably resulted in massive blood loss, infection, and death.

So, yes, our framers intended our citizenry to be free of government interference, including a mandate to subsidize health care for millions of uninsured and underserved Americans.  But then the framers never envisioned a world in which freedom could be safeguarded and extended by access to expert care that relieved suffering, effected cure, and prolonged life.  Nor could they envision the progressive income tax, compulsory vaccination, publicly supported clinics, mass screening for TB, diabetes, and syphilis, and Medicare.  Throughout the antebellum era, when regular physicians were reviled by the public and when neither regulars nor “alternative” practitioners could stem the periodic waves of cholera, yellow fever, and malaria that decimated local populations, it mattered little who provided one’s doctoring.  Many, like the thousands who paid $20.00 for the right to practice Samuel Thomson’s do-it-yourself botanical system, chose to doctor themselves.

Opponents of the Affordable Care Act seem challenged by the very idea of progress.  Their consideration of liberty invokes an eighteenth-century political frame of reference to deprive Americans of a kind of liberty associated with a paradigm shift that arose in the 1880s and 1890s.  It was only then that American medicine began its transition to what we think of as modern medicine.  Listerian antisepsis (and then asepsis); laboratory research in bacteriology, immunology, and pharmacology; laboratory development of specific remedies for specific illnesses; implementation of public health measures informed by bacteriology; modern medical education beginning with the opening of the Johns Hopkins School of Medicine in 1893; and, yes, government regulation to safeguard the public from incompetent practitioners and toxic, sometimes fatal, medications – all were part of the transition.

“We hold these truths to be self-evident,” Jefferson begins the second paragraph of the Declaration of Independence, “that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.”  What Jefferson didn’t stipulate – what he couldn’t stipulate in his time and place – was the hierarchical relationship among these rights.  Now, in the twenty-first century, we are able to go beyond an eighteenth-century mindset in which “life, liberty, and the pursuit of happiness” functions as a noun phrase whose unitary import derives from the political tyrannies of King George III and the British Parliament.  Now we can place life at the base of the pyramid and declare that quality of life is indelibly linked to liberty and the pursuit of happiness.  To the extent that quality of life is diminished through disease and dysfunction, liberty and the pursuit of happiness are necessarily compromised.  In 2012, health is life; it is life poised to exercise liberty and pursue happiness to the fullest.

Why is it unconstitutional to obligate all citizens to participate in a health plan, either directly or through a mandate, that safeguards the right of people to efficacious health care regardless of their financial circumstances, their employment status, and their preexisting medical conditions?  What is it about the term “mandate” that is constitutionally questionable?  When you buy a house in this country, you pay local property taxes that support the local public schools.  (If you’re a renter, your landlord pays your share of the tax out of your rent.)  The property tax functions like the mandate:  It has a differential financial impact on people depending on whether they directly benefit from the system sustained by the tax.  To wit, you pay the tax whether or not you choose to send your children to the public schools, indeed, whether or not you have children.  You are obligated to subsidize the public education of children other than your own because public education, for all its failings, has been declared a public good by the polity of which you are a part.

It is inconceivable that the founding fathers would have found unconstitutional a law that extended life-promoting health care to the roughly 50 million Americans who lack health insurance.  The founding fathers declared that citizens – well, white, propertied males, at least – were entitled to life consistent with the demands and entitlements of representative democracy; their pledge, their Declaration, was not in support of a compromised life that limited the ability to fulfill those demands and enjoy those entitlements.

Of course, adult citizens may repudiate mainstream health care on the basis of their own philosophical or religious  predilections.  Fine.  Americans who wish to pursue health outside the medical mainstream or, in the manner of medieval Christians, to disavow corporeal well-being altogether, are free to do so.  But they should not be allowed to undermine social and political arrangements, codified in law, that support everyone else’s right to pursue life and happiness through twenty-first century medicine.

The concept of medical freedom dominated the antebellum period and resurfaced during the early twentieth century, when compulsory childhood vaccination and Oklahoma Senator Robert Owen’s proposed legislation to create a federal department of public health spurred the formation of the Anti-Vaccination League of America, the American Medical Liberty League, and the National League for Medical Freedom.   According to these groups, medical freedom was incompatible not only with compulsory vaccination, but also with the medical examination of school children, premarital syphilis tests, and municipal campaigns against diphtheria.  In the 1910s, failure to detect and treat contagious bacterial disease was a small price to pay for freedom from what medical libertarians derided as “allopathic knowledge.”   These last gasps of the Jacksonian impulse were gone by 1930, by which time it was universally accepted that scientific medicine was, well, scientific, and, as such, something more than one medical sect among many.

After World War II, when the American Medical Association mounted its holy crusade against President Harry Truman’s proposal for national health care, “medical liberty” came into vogue once more, though its meaning had changed.  In antebellum America and again in the 1910s, it signified freedom to cast off the oppressive weight of “regular” medicine and pick and choose among the many alternative sects.  In the late 1940s, it signified freedom from federally funded health care, which would contaminate the sacrosanct doctor-patient relationship.  For the underserved, such freedom safeguarded the right to remain untreated.  The AMA’s legerdemain elicited ridicule from many, the prominent journalist Milton Mayer among them.  “Millions of Americans,” Mayer wrote in Harper’s in 1949, “geographically or economically isolated, now have access to one doctor or none.  The AMA would preserve their present freedom of choice.”  In 1960, the medical reporter Selig Greenberg mocked medical free choice as a “hoary slogan” based on “the fatuous assumption that shopping around for a doctor without competent guidance and paying him on a piecemeal basis somehow guarantees a close relationship and high-quality medical care.”[1]

Now the very notion of medical freedom has an archaic ring.  We no longer seek freedom from the clutches of mainstream medicine; now we seek freedom to avail ourselves of what mainstream medicine has to offer.  At this singular moment in history, in a fractionated society represented by a bitterly divided Congress, access to health care will be expanded and safeguarded, however imperfectly, by the Affordable Care Act.  Those who opt out of the Act should pay a price, because they remain part of a society committed to health as a superordinate value without which liberty and the pursuit of happiness are enfeebled.  To argue about whether the price of nonparticipatory citizenship in the matter of health care can be a tax but not a mandate is obfuscating wordplay.  And the health and well-being of we the people should not be a matter of wordplay.


[1] Milton Mayer, “The Dogged Retreat of the Doctors,” Harper’s Magazine, 199:25-37, 1949, quoted at pp. 32, 35; Selig Greenberg, “The Decline of the Healing Art,” Harper’s Magazine, 221:132-137, 1960, quoted at p. 134.

Copyright © 2012 by Paul E. Stepansky.  All rights reserved.