
Anti-vaccinationism, American Style

Here is an irony: America’s staggering production of generations of scientific brainpower coexists with many Americans’ deep skepticism about science. Donald Trump, a prideful scientific illiterate, rode to power on the back of many others who, like him, were skeptical about science and especially about the role of scientific experts in modern life. He maintains their allegiance still.

Why does this surprise us? Anti-intellectualism was burned into the national character early in American history. Those skeptical of this claim should read Richard Hofstadter’s brilliant twin studies of the 1960s, Anti-Intellectualism in American Life and The Paranoid Style in American Politics. From the beginning of the American Experiment, democracy was antithetical to so-called European “elitism,” and this ethos gained expression, inter alia, in antebellum medicine.

The Founding Fathers, an intellectual elite in defense of democracy, were not part of the movement away from science. When Benjamin Waterhouse introduced Edward Jenner’s smallpox vaccine to America in 1800, Washington, Adams, and Jefferson hailed it as the greatest discovery of modern medicine. They appreciated the severity of smallpox, which had ravaged the Continental Army during the War of Independence. Indeed, Washington was so desperate to rein in its decimation of his troops that, in 1777, he inoculated his entire army with pus from active smallpox lesions, knowing that the resulting infections would be milder and far less likely to cause fatalities than smallpox naturally contracted. When Jefferson became president in 1801, he pledged to introduce the vaccine to the American public, because “it will be a great service indeed rendered to human nature to strike off the catalogue of its evils so great a one as the smallpox.” Not to be outdone in support of Jenner’s miraculous discovery, Jefferson’s successor, James Madison, signed into law in 1813 “An Act to Encourage Vaccination.” Among its provisions was the requirement that the U.S. postal service “carry mail containing vaccine materials free of charge.”[1]

But this appreciation of the vaccine was short-lived, and Jefferson’s hope that the value of vaccination would seep into public consciousness was never realized. In Jacksonian America, the Founding Fathers’ belief that medical progress safeguarded democracy gave way to something far less enlightened: democracy now meant that everyone could be, indeed should be, his own doctor. Most Americans had no need for those with university educations, much less clinical experience in governmentally managed public hospitals. Jacksonian America emerged as what the historian Joseph Kett termed the “Dark Age of the profession.”[2] During this time, the nation laid claim to a medical elite only because a few moneyed medical intelligentsia – John Collins Warren, Valentine Mott, Philip Syng Physick, William Gibson, and David Hosack, among them – found their way to European medical centers in London, Edinburgh, and, somewhat later, Paris.

Otherwise, it was every man for himself, which usually meant every woman for herself and her family. Homeopaths, herbalists, Thomsonians, eclectics, hydropaths, phrenologists, Christian Scientists, folk healers, faith healers, uroscopians, chrono-thermalists – each exemplified the democratic mind in action.[3] Sad to say, homegrown “regular” American medicine of the day, with its reliance on depletive (bleeding, vomiting, purging) and stimulative (alcohol, quinine) treatments, was no better and often worse. The belief, Galenic in origin, that all diseases were variants of the same global type of bodily dysregulation is startlingly close to Donald Trump’s holistic medieval approach to bodily infection and its treatment.

The birth of scientific medicine in the decades following the Civil War could not still the ardor of America’s scientific illiterati. The development of animal blood-derived serums (antitoxins), forerunners of modern antibiotics, was anathema to many. Among them were religionists, mainly Christian, for whom injecting the blood product of a horse or sheep into the human body was not only repugnant but sinful. Better to let children be stricken with smallpox, diphtheria, and tetanus, sometimes to the point of death, than violate what they construed as divine strictures – strictures, be it noted, not intimated, much less codified, in the body of doctrine of any of the five major world religions.[4]

Antivaccinationists of the early 20th century were an unhappy lot.  They were unhappy about the proliferation of medicines (“biologics”) for treating illness.  And they deeply resented the intrusion of the State into domains of parental decision-making in the form of newly empowered social workers, visiting nurses, and educators.  In fact, antivaccinationism was part and parcel of resistance to all things progressive, including scientific medicine.[5]  Holdovers from the free-wheeling anything-goes medicine of antebellum America – especially devotees of homeopathy and, of late, chiropractic – were prominent in its ranks.    

Now, in the face of a global pandemic no less lethal than the Great Influenza of 1918-1919, we hear the same irrational musings about the dangers of vaccines that animated the scientific illiterati at the turn of the 20th century. For the foes of public health, any misstep in the manufacture or storage of smallpox vaccine – a much greater possibility over a century ago than today – was enough to condemn vaccination outright. In 1901, smallpox vaccination of school children in Camden, NJ, led to an outbreak of 100 cases of tetanus, with nine deaths. Historians believe that, in all probability, the outbreak resulted not from a contaminated batch of vaccine but rather from poor care of the vaccination site. But Congress accepted the possibility of contamination, and the incident led to passage of the Biologics Control Act of 1902.[6] Henceforth every manufacturer of vaccine had to be licensed by the Secretary of the Treasury (relying on the PHS Laboratory of Hygiene), and each package of vaccine had to be properly labeled and dated and was subject to inspection.[7]

And this leads to a second irony: the more preventive medicine advanced, incorporating additional safeguards into vaccine production, storage, and administration, the greater the resistance of the illiterati. Throughout the 20th century and right down to the present, the antebellum notion of science-free “medical freedom” has continued to hold sway. Then and now, it means the right to put children at risk for major infectious diseases that could result in death – and the right, further, to pass disease, possibly severe and occasionally fatal, on to others.

It follows that, then and now, the science illiterati are skeptical of, if not distressed by, the State’s commitment to public health. It was Oklahoma Senator Robert Owen’s proposed legislation of 1910 to combine five federal departments into a cabinet-level Department of Public Health that pushed the opponents of medical “tyranny” onward. The Anti-Vaccination League of America, formed in 1908, was joined by the National League for Medical Freedom in 1910 and, eight years later, by the American Medical Liberty League. With all three groups in the field, anti-Progressivism was in full swing. “Medical freedom” exempted children not only from compulsory vaccination but also from medical examinations at school. Further, young adults were not to be subjected to premarital syphilis tests. Nor did the groups’ expansive view of medical tyranny flinch in the face of public education about communicable disease: municipal campaigns against diphtheria were to be forbidden entirely.

With the death of the founders of the Anti-Vaccination League (Charles Higgins) and the American Medical Liberty League (Lora Little) in 1929 and 1931, respectively, antivaccinationism underwent a dramatic decline.  The Jacksonian impulse that fueled the movement simply petered out, and by the later ‘30s, Americans finally grasped that mainstream medicine was not simply another medical sect. It was the real deal:  a medicine grounded in laboratory research that effectively immunized against disease, promoted relief and cure of those already infected, and thereby saved lives.

But was the embrace of scientific healing really universal? A pinnacle of life-depriving anti-science occurred well beyond the 1930s. Consider the belief of some Christian sects that certain life-saving medical interventions must be withheld from children on religious grounds. It was only in 1982, more than 90 years after von Behring’s discovery of diphtheria antitoxin launched the era of serum therapy, that criminal charges were first brought against parents who had withheld necessary treatment from their children. Of the 58 cases of such parental withholding of care, 55 involved fatalities.[8] Child deaths among Christian Scientists alone included untreated diabetes (leading to diabetic ketoacidosis), bacterial meningitis, and pneumonia. Now things are better for the children, since even U.S. courts that have overturned parents’ criminal convictions have come around to the mainstream view that religious exemption laws are not a defense against criminal neglect – a fine insight for the judiciary to have arrived at more than a century after serum therapy scored major triumphs in the treatment of rabies, diphtheria, tetanus, pneumococcal pneumonia, and meningococcal meningitis.

Should vaccination for the Covid-19 virus be a requirement for attendance in public and private schools? How can the question even be asked? As early as 1827, a Boston school committee ordered teachers to require entering students to give evidence of smallpox vaccination.[9] Statewide vaccination requirements for smallpox followed in Massachusetts in 1855, New York in 1862, Connecticut in 1872, and Pennsylvania in 1895. And the inoculations were effective across the board: they quickly brought smallpox outbreaks already underway under control, and they prevented their recurrence. These laws and those that followed were upheld by the Supreme Court in 1922 in Zucht v. King.[10]

Twentieth-century vaccines were developed for pertussis in 1914, diphtheria in 1926, and tetanus in 1938. In 1948 the three were combined and given to infants and toddlers at regular intervals as the DTP vaccine. There was no hue and cry in 1948 or in the years that followed. And yet the same fear of vaccination that the New York State Health Department had to overcome when it launched its statewide drive to immunize children against diphtheria now renders a new generation of parents resistant to mandatory Covid-19 vaccination for their own children.

Bear in mind that the anti-science rhetoric of today’s illiterati can be mobilized just as easily to resist DTP or any subsequent vaccine administered to their children. Why subject a child to DTP vaccination? Perhaps combining three different vaccines into one injection entails heightened risks. Perhaps the batch of vaccine in the hands of one’s own doctor has been contaminated. Perhaps one’s child will be among the minuscule number who have a minor allergic reaction. And, after all, children who contract diphtheria, pertussis, and/or tetanus will hardly die from their infections, especially with the use of antibiotics. Why inject foreign matter into healthy infants? – the very argument adduced by the opponents of diphtheria vaccine a century ago.

The problem with antivaccinationist rhetoric in the 21st century is that its proponents are all beneficiaries of more than a century of mandatory vaccination policy. If they lived in a society bereft of vaccines – or, for the unvaccinated, of the immunity conferred by the vast herd of immunes – they would have led very different lives. Indeed, some would not be here to celebrate solipsism masquerading as individualism. Their specious intuitions about the risks of vaccination are profoundly anti-social, since they compromise the public’s health. Parents who decide not to vaccinate their children put the entire community at risk. The community includes not only their own children but all those who desire protection yet cannot receive it: children too young to be vaccinated, those with actual medical contraindications to vaccination, and the minuscule number who have been vaccinated but remain unprotected.[11]

Nor is it strictly a matter of providing equal protection to individuals who seek, but cannot receive, the protection afforded by compulsory vaccination. In a secular society, religious objections to vaccination pale alongside the health of the community. Whether framed in terms of a “compelling state interest” in mitigating a health threat (Sherbert v. Verner [1963]) or the individual’s obligation to comply with “valid and neutral laws of general applicability” whatever their incidental religious implications (Employment Division, Department of Human Resources of Oregon v. Smith [1990]), the U.S. Supreme Court has consistently held that mandatory vaccination laws need not allow religious exemptions of any kind.

Antivaccinationists might bear in mind a few particulars as they align themselves with the infectious dark ages.  Between 1900 and 1904, an average of 48,164 cases of smallpox and 1,528 smallpox deaths were reported each year. With the arrival of compulsory vaccination in schools, the rate fell drastically and outbreaks of smallpox ended in 1929. The last case of smallpox in the U.S. was reported in 1949.[12]  

Among American children, diphtheria was a major cause of illness and death through 1921, when 206,000 cases and 15,520 deaths were recorded. Before Emil von Behring’s diphtheria antitoxin became available in 1894 to treat infected children, the death rate among children struck down, especially during the hot summer months, could reach 50%. Within several years, use of the antitoxin brought it down to 15%.[13] Then, in the late 1920s, diphtheria immunization was introduced, and diphtheria rates fell dramatically, both in the U.S. and in other countries that vaccinated widely. Between 2004 and 2008, no cases of diphtheria were recorded in the U.S.[14]

Between 1951 and 1954, paralytic polio cases in the United States averaged 16,316 a year, of which 1,879 resulted in death. Then science came to the rescue. Jonas Salk’s dead-poliovirus vaccine became available in 1955, and Albert Sabin’s live-poliovirus variant four years later. By 1962, there were fewer than 1,000 cases a year and, in every year thereafter, fewer than 100 cases.[15]

Now, alas, some parents still worry that the measles component of the MMR (measles, mumps, rubella) vaccine, available since 1971, may lead to childhood autism. Why? It is a disease-promoting mythology of the illiterati, to be resisted at all costs. Autism is a neuro-developmental disorder with a strong genetic component; its genesis is during the first year of life, before the vaccine is even administered. None of the epidemiologists who have studied the issue has found any evidence whatsoever of an association, not among normal children and not among high-risk children with autistic siblings.[16] The fact is that children who do not receive a measles vaccine have been found 35 times more likely to contract measles than the vaccinated.[17] And measles is no laughing matter. When contracted later in life, measles and mumps are serious and can be deadly. They were among the major systemic infections that felled soldiers during the Civil War, the Spanish-American War, the Anglo-Boer War, and World War I.[18]

All of which leads to a conclusion in the form of an admonishment. Accept the fact that you live in a secular society governed by law and a network of agencies, commissions, and departments lawfully enjoined to safeguard public health. Do your part to sustain the social contract that came into existence when the Founding Fathers, elitists molded by European thought who had imbibed the social contractualism of John Locke, wrote the American Constitution.

Vaccination is a gift that modern science bestows on all of us – vaccination proponents and opponents alike. When one of the two FDA-authorized Covid-19 vaccines comes to a clinic or storefront near you, run, don’t walk, to get your and your children’s shots. Give thanks to the extraordinarily gifted scientists at Pfizer and Moderna who created these vaccines and demonstrated their effectiveness and safety. Make sure that everyone’s children grow up, paraphrasing the U.S. Army’s old recruiting slogan, to be all they can be.


[1] Dan Liebowitz, Smallpox Vaccination: An Early Start of Modern Medicine in America, J. Community Hosp. Intern. Med. Perspect., 7:61-63, 2017 (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5463674).

[2] Joseph F. Kett, The Formation of the American Medical Profession: The Role of Institutions, 1780-1860 (New Haven: Yale University Press, 1968), p. vii. 

[3] Robert E. Riegel, Young America, 1830-1840 (Westport, CT: Greenwood Press, 1973 [1949]), pp. 314-315, quoted at 314.

[4] John D. Grabenstein, “What the World’s Religions Teach, As Applied to Vaccines and Immune Globulins,” Vaccine, 31:2011-2023, 2013.

[5] James Colgrove, “’Science in a Democracy’: The Contested Status of Vaccination in the Progressive Era and the 1920s,” Isis, 96:167-191, 2005.

[6]  Harry F. Dowling, Fighting Infection: Conquests of the Twentieth Century (Cambridge, MA: Harvard University Press, 1977), 38; Harry M. Marks, The Progress of Experiment: Science and Therapeutic Reform in the United States, 1900-1990 (Cambridge: Cambridge University Press, 1997), 73-74.

[7] Jonathan Liebenau, Medical Science and Medical Industry: The Formation of the American Pharmaceutical Industry (Baltimore: Johns Hopkins University Press, 1987), 89-90.

[8] Janna C. Merrick, “Spiritual Healing, Sick Kids and the Law: Inequities in the American Healthcare System,” Amer. J. Law & Med., 29:269-300, 2003, at 280.

[9] John Duffy, “School Vaccination: The Precursor to School Medical Inspection,” J. Hist. Med. & Allied Sci., 33:344-355, 1978.

[10] Kevin M. Malone & Alan R. Hinman, “Vaccination Mandates: The Public Health Imperative and Individual Rights,” in Law in Public Health Practice (2009), 262-284, at 272.

[11] Alan R. Hinman, et al., “Childhood Immunization: Laws that Work,” J. Law, Med. & Ethics, 30(suppl):122-127, 2002.

[12] Frank Fenner, et al., Smallpox and its Eradication (Geneva: World Health Organization, 1988).

[13] Karie Youngdahl, “Early Uses of Diphtheria Antitoxin in the United States,” The History of Vaccines, August 2, 2010 (https://www.historyofvaccines.org/content/blog/…).

[14] Epidemiology and Prevention of Vaccine-Preventable Diseases, 11th Edition (The Pink Book). National Immunization Program, Centers for Disease Control and Prevention (http://www.cdc.gov/vaccines/Pubs/pinkbook/downloads/dip.pdf); Diphtheria. WHO, Regional Office for the Western Pacific (http://www.wpro.who.int/health_topics/diphtheria).

[15] CDC. Annual summary 1980: Reported Morbidity and Mortality in the United States. MMWR 1981;29; CDC, Reported Incidence of Notifiable Diseases in the United States, 1960. MMWR 1961;9.

[16] Frank DeStefano & Tom T. Shimabukuro, “The MMR Vaccine and Autism,” Ann. Rev. Virol., 6:585-600, 2019.

[17] Hinman, op. cit. (note 11).

[18] Paul E. Stepansky, Easing Pain on the Western Front:  American Nurses of the Great War and the Birth of Modern Nursing Practice (Jefferson, NC:  McFarland, 2020), 36, 50, 96, 144.

 

Copyright © 2020 by Paul E. Stepansky.  All rights reserved. The author kindly requests that educators using his blog essays in their courses and seminars let him know via info[at]keynote-books.com.

Medical Freedom, Then and Now

“A nation’s liberties seem to depend upon headier and heartier attributes than the liberty to die without medical care.”
~Milton Mayer, “The Dogged Retreat of the Doctors” (1949)

Conservative Supreme Court justices who voice grave skepticism about the constitutionality of the Patient Protection and Affordable Care Act of 2010 would have been better suited to judicial service in the decades following the Revolutionary War. Issues of health, illness, freedom, and tyranny were much simpler then. Liberty, as understood by our founding fathers, operated only in the interlacing realms of politics and religion. How could it have been otherwise? Medical intervention did not affect the course of illness; it did not enable people to feel better and live longer and more productive lives. With the exception of smallpox inoculation, which George Washington made mandatory among Continental troops in the winter of 1777, governmental intrusion into the health of the citizenry was nonexistent, even nonsensical.

Until roughly the eighth decade of the nineteenth century, you got sick, you recovered (often despite doctoring), you lingered on in sickness, or you died. Antebellum (pre-Civil War) medicine relied on a variation of Galenic medicine developed in the eighteenth century by the Scottish physician William Cullen and his student John Brown. According to Cullen’s system, all diseases were really variations of a single disease that consisted of too much tension or excitability (and secondarily too little tension or excitability) in the blood vessels. Revolutionary-era and antebellum physicians sought to restore a natural balance by giving “overstimulated” patients (read: feverish, agitated, pain-ridden patients) large doses of toxic mercury compounds like calomel to induce diarrhea; by administering emetics like ipecac and tobacco to induce vomiting; and by bleeding patients to the point of fainting (i.e., syncope). It was not a pretty business.

Antebellum Americans did not have to worry about remedies for specific illnesses.  Except for smallpox vaccine and antimalarial cinchona tree bark (from which quinine was isolated in 1820), none existed.  Nor did they have to worry about long-term medical interventions for chronic conditions – bacterial infections, especially those that came in epidemic waves every two or three years, had no more opportunity to become chronic than diabetes, heart disease, or cancer.

Medical liberty, enshrined during the Jacksonian era, meant being free to pick and choose your doctor without any state interference.  So liberty-loving Americans picked and chose among calomel-dosing, bloodletting-to-syncope “regulars,” homeopaths, herbalists, botanical practitioners (Thomsonians), eclectics, hydropaths, phrenologists, Christian Scientists, folk healers, and faith healers.   State legislatures stood on the sidelines and applauded this instantiation of pure democracy.  By midcentury, 15 states had rescinded medical licensing laws; the rest gutted their laws and left them unenforced.  Americans were free to enjoy medical anarchy.

Now, mercifully, our notion of liberty has been reconfigured by two centuries of medical progress. We don’t just get sick and die. We get sick and get medical help, and, mirabile dictu, the help actually helps. In antebellum America, deaths of young people under 20 accounted for half of all deaths. Now our children don’t die of smallpox, cholera, yellow fever, dysentery, typhoid, and pulmonary and respiratory infections before they reach maturity. Diphtheria no longer stalks them during the warm summer months. When they get sick in early life, their parents take them to the doctor and they almost always get better. Their parents, on the other hand, especially after reaching middle age, don’t always get better. So they get ongoing medical attention to help them live longer and more comfortably with chronic conditions like diabetes, coronary heart disease, inflammatory bowel disease, Parkinson’s, and many forms of cancer.

When our framers drafted the Constitution, the idea of being free to live a productive and relatively comfortable life with long-term illness didn’t compute. You died from diabetes, cancer, bowel obstruction, neurodegenerative disease, and any major infection (including, among young women, the infection that often followed childbirth). A major heart attack usually killed you. You didn’t receive dialysis and possibly a kidney transplant when you entered kidney failure. Major surgery, performed on the kitchen table if you were of means or in a bacteria-infested, dimly lit, unventilated public hospital if you weren’t, was all but nonexistent because it invariably resulted in massive blood loss, infection, and death.

So, yes, our framers intended our citizenry to be free of government interference, including an obligatory mandate to subsidize health care for millions of uninsured and underserved Americans. But then the framers never envisioned a world in which freedom could be safeguarded and extended by access to expert care that relieved suffering, effected cure, and prolonged life. Nor could they envision the progressive income tax, compulsory vaccination, publicly supported clinics, mass screening for TB, diabetes, and syphilis, and Medicare. Throughout the antebellum era, when regular physicians were reviled by the public and when neither regulars nor “alternative” practitioners could stem the periodic waves of cholera, yellow fever, and malaria that decimated local populations, it mattered little who provided one’s doctoring. Many, like the thousands who paid $20.00 for the right to practice Samuel Thomson’s do-it-yourself botanical system, chose to doctor themselves.

Opponents of the Affordable Care Act seem challenged by the very idea of progress. Their conception of liberty invokes an eighteenth-century political frame of reference to deprive Americans of a kind of liberty associated with a paradigm shift that arose in the 1880s and 1890s. It was only then that American medicine began its transition to what we think of as modern medicine. Listerian antisepsis (and then asepsis); laboratory research in bacteriology, immunology, and pharmacology; laboratory development of specific remedies for specific illnesses; implementation of public health measures informed by bacteriology; modern medical education, beginning with the opening of the Johns Hopkins School of Medicine in 1893; and, yes, government regulation to safeguard the public from incompetent practitioners and toxic, sometimes fatal, medications – all were part of the transition.

“We hold these truths to be self-evident,” Jefferson begins the second paragraph of the Declaration of Independence, “that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.”  What Jefferson didn’t stipulate – what he couldn’t stipulate in his time and place – was the hierarchical relationship among these rights.  Now, in the twenty-first century, we are able to go beyond an eighteenth-century mindset in which “life, liberty, and the pursuit of happiness” functions as a noun phrase whose unitary import derives from the political tyrannies of King George III and the British Parliament.  Now we can place life at the base of the pyramid and declare that quality of life is indelibly linked to liberty and the pursuit of happiness.  To the extent that quality of life is diminished through disease and dysfunction, liberty and the pursuit of happiness are necessarily compromised.  In 2012, health is life; it is life poised to exercise liberty and pursue happiness to the fullest.

Why is it unconstitutional to obligate all citizens to participate in a health plan, either directly or through a mandate, that safeguards the right of people to efficacious health care regardless of their financial circumstances, their employment status, and their preexisting medical conditions?  What is it about the term “mandate” that is constitutionally questionable?  When you buy a house in this country, you pay local property taxes that support the local public schools.  (If you’re a renter, your landlord pays your share of the tax out of your rent.)  The property tax functions like the mandate:  It has a differential financial impact on people depending on whether they directly benefit from the system sustained by the tax.  To wit, you pay the tax whether or not you choose to send your children to the public schools, indeed, whether or not you have children.  You are obligated to subsidize the public education of children other than your own because public education, for all its failings, has been declared a public good by the polity of which you are a part.

It is inconceivable that the founding fathers would have found unconstitutional a law that extended life-promoting health care to the roughly 50 million Americans who lack health insurance.  The founding fathers declared that citizens – well, white, propertied males, at least – were entitled to life consistent with the demands and entitlements of representative democracy; their pledge, their Declaration, was not in support of a compromised life that limited the ability to fulfill those demands and enjoy those entitlements.

Of course, adult citizens may repudiate mainstream health care on the basis of their own philosophical or religious predilections. Fine. Americans who wish to pursue health outside the medical mainstream or, in the manner of medieval Christians, to disavow corporeal well-being altogether, are free to do so. But they should not be allowed to undermine social and political arrangements, codified in law, that support everyone else’s right to pursue life and happiness through twenty-first century medicine.

The concept of medical freedom dominated the antebellum period and resurfaced during the early twentieth century, when compulsory childhood vaccination and Oklahoma Senator Robert Owen’s proposed legislation to create a federal department of public health spurred the formation of the Anti-Vaccination League of America, the American Medical Liberty League, and the National League for Medical Freedom.   According to these groups, medical freedom was incompatible not only with compulsory vaccination, but also with the medical examination of school children, premarital syphilis tests, and municipal campaigns against diphtheria.  In the 1910s, failure to detect and treat contagious bacterial disease was a small price to pay for freedom from what medical libertarians derided as “allopathic knowledge.”   These last gasps of the Jacksonian impulse were gone by 1930, by which time it was universally accepted that scientific medicine was, well, scientific, and, as such, something more than one medical sect among many.

After World War II, when the American Medical Association mounted its holy crusade against President Harry Truman’s proposal for national health care, “medical liberty” came into vogue once more, though its meaning had changed. In antebellum America and again in the 1910s, it signified freedom to cast off the oppressive weight of “regular” medicine and pick and choose among the many alternative sects. In the late 1940s, it signified freedom from federally funded health care, which would contaminate the sacrosanct doctor-patient relationship. For the underserved, such freedom safeguarded the right to remain untreated. The AMA’s legerdemain elicited ridicule from many, the prominent journalist Milton Mayer among them. “Millions of Americans,” Mayer wrote in Harper’s in 1949, “geographically or economically isolated, now have access to one doctor or none. The AMA would preserve their present freedom of choice.” In 1960, the medical reporter Selig Greenberg mocked medical free choice as a “hoary slogan” based on “the fatuous assumption that shopping around for a doctor without competent guidance and paying him on a piecemeal basis somehow guarantees a close relationship and high-quality medical care.”[1]

Now the very notion of medical freedom has an archaic ring. We no longer seek freedom from the clutches of mainstream medicine; now we seek freedom to avail ourselves of what mainstream medicine has to offer. At this singular moment in history, in a fractionated society represented by a bitterly divided Congress, access to health care will be expanded and safeguarded, however imperfectly, by the Affordable Care Act. Those who opt out of the Act should pay a price, because they remain part of a society committed to health as a superordinate value without which liberty and the pursuit of happiness are enfeebled. To argue about whether the price of nonparticipatory citizenship in the matter of health care can be a tax but not a mandate is obfuscating wordplay. And the health and well-being of we the people should not be a matter of wordplay.


[1] Milton Mayer, “The Dogged Retreat of the Doctors,” Harper’s Magazine, 199:25-37, 1949, quoted at pp. 32, 35; Selig Greenberg, “The Decline of the Healing Art,” Harper’s Magazine, 221:132-137, 1960, quoted at p. 134.

Copyright © 2012 by Paul E. Stepansky.  All rights reserved.