Psychotropic Serendipities

Serendipities abound in the history of medicine, in our own time no less than in the past.  In the 15 years that followed the end of World War II, a period of special interest to me, the discovery of what we now consider modern psychiatric (or psychotropic) drugs is a striking case in point.

Researchers in the final years of the war and immediately thereafter were hardly looking for psychotropics.  They were looking for other things:  improved antihistamines; preservatives that would permit penicillin to hold up during transport to troops in Europe and Asia; and the development of antibiotics effective against penicillin-resistant microorganisms like the tubercle bacilli that caused tuberculosis.

When Frank Berger, a Czechoslovakian bacteriologist, fled to England in 1939, he found work as a refugee camp physician.  Then, in 1943, he was hired by a government laboratory in London and joined in the work that engaged so many British scientists of the time:  the purification and industrial production of penicillin.  Berger’s particular assignment was the search for a penicillin preservative; he was especially interested in finding an agent that would prevent the breakdown of penicillin during shipment by penicillinase, the penicillin-destroying enzyme produced by certain bacteria.  And with the synthesis of mephenesin in 1945, he achieved success – and then some.  Mephenesin not only preserved penicillin but, in small-scale animal toxicity trials, revealed something else:  On injection into mice, rats, and guinea pigs, the preservative produced deep muscle relaxation, a sleep-like state that Berger described in 1946 as “tranquillization.”[1]

Berger emigrated to the United States in 1947 and, after a brief stint at the University of Rochester Medical School, became Director of Laboratories at Carter-Wallace in Cranbury, New Jersey.  There, joined by the chemist Bernard Ludwig, he developed a more potent and more slowly metabolized form of mephenesin.  The drug was meprobamate, the first minor tranquilizer, for which a patent was finally granted in 1955.  Released by Carter-Wallace as Miltown and by Wyeth (a licensee) as Equanil, it took the American market by storm.  In 1956, it leaped from less than 1% to nearly 70% of new tranquilizer prescriptions; in 1957 more than 35 million prescriptions were filled, the equivalent of one per second.  Meprobamate single-handedly transformed American medicine by transmuting the everyday stresses and strains of Everyman (and Everywoman) into pharmacologically treatable anxiety.  For general practitioners in particular it was a godsend.  “If generalists could not psychoanalyze their troubled patients,” the historian David Herzberg has observed, “they could at least ease worries with a pill, possibly preventing a minor condition from worsening into serious mental illness.”[2]  Not bad for a penicillin preservative.

In 1952, at the very time Berger was observing the “tranquillization” of small rodents injected with meprobamate, Henri-Marie Laborit, a French naval surgeon working at the Val-de-Grâce military hospital in Paris, published his first article on the usefulness of chlorpromazine (CPZ), a chlorinated form of the antihistamine promazine, in surgical practice.  Laborit, who was working on the development of “artificial hibernation” as an anesthetic technique, found that the drug not only calmed surgical patients prior to the administration of anesthesia but also prevented them from lapsing into shock during and after their operations.  The drug had been synthesized by the Rhône-Poulenc chemist Paul Charpentier at the end of 1950.  Charpentier was searching for an improved antihistamine, but he quickly saw the drug’s possible usefulness as a potentiator of general anesthesia,[3] which indeed it proved to be.

Impressed with the drug’s effectiveness (in combination with other drugs as a “lytic cocktail”) in inducing relaxation – what he termed “euphoric quietude” – and in preventing shock, Laborit encouraged his colleague Joseph Hamon to try it on psychiatric patients.  It was subsequently taken up by the French psychiatrists Jean Delay and Pierre Deniker, who administered it to patients at the Sainte-Anne mental hospital in Paris.  In six journal articles published in the spring and summer of 1952, they reported encouraging results, characterizing their patients’ slowed motor activity and emotional indifference as a “neuroleptic syndrome” (from the Greek for “that which takes hold of the nerves”).  Thus was born, in retrospect, the first major tranquilizer, a drug far more effective than its predecessors (including morphine and scopolamine in combination) in controlling extreme agitation and relieving psychotic delusions and hallucinations.[4]

But only in retrospect.  At the time of the preliminary trials, the primary application of chlorpromazine remained unclear.  Rhône-Poulenc conducted clinical trials for a number of applications of the drug: to induce “hibernation” during surgery; as an anesthetic; as an antinausea drug (antiemetic) for seasickness; and as a treatment for burns, stress, infections, obesity, Parkinson’s disease, and epilepsy.  When Smith, Kline & French became the American licensee of the drug in early 1953, it planned to market it to American surgeons and psychiatrists alike, and it also took pains to license the drug as an antiemetic.  Only at the end of 1953 did it recognize the primary psychiatric use of the drug, which it released in May 1954 as Thorazine.

Of course, the birth of modern antibiotic therapy begins with penicillin – the first of the wartime “miracle drugs.”  And a miracle drug it was, with an antibacterial spectrum that encompassed strep and staph infections, pneumonia, syphilis, and gonorrhea.  But penicillin did not touch the tubercle bacillus, the acid-fast organism that caused tuberculosis.

The first wonder drug effective against TB was streptomycin, an antibiotic derived from an actinomycete (a soil-dwelling, filamentous bacterium) and discovered by Selman Waksman and his doctoral student Albert Schatz at the Rutgers Agricultural Experiment Station in 1943.  Eight years later, in 1951, chemists working at Bayer Laboratories in Wuppertal, Germany, at Hoffmann-La Roche in Basel, Switzerland, and at the Squibb Institute for Medical Research in New Brunswick, New Jersey simultaneously discovered a drug that was not only more effective in treating TB than streptomycin but also easier to administer and less likely to cause serious side effects.  It was isoniazid, the final wonder drug in the war against TB.  In combination with streptomycin, it was more effective than either drug alone and less likely to elicit treatment-resistant strains of the tubercle bacillus.

But here’s the thing:  A side effect of isoniazid was its mood-elevating (or, in the lingo of the day, “psycho-stimulative”) effect.  Researchers conducting trials at Baylor University, the University of Minnesota, and Spain’s University of Cordoba reached the same conclusion:  The mood-elevating effect of isoniazid treatment among TB patients pointed to psychiatry as the primary site of its use.  Back in New York, Nathan Kline, an assistant professor of psychiatry at Columbia’s College of Physicians and Surgeons, learned about the “psycho-stimulative” effect of the drug from a report about animal experiments conducted at the Warner-Lambert Research Laboratories in Morris Plains, New Jersey.  Shortly thereafter, he began his own trial of iproniazid, a more potent derivative of isoniazid, with patients at Rockland State Hospital in Orangeburg, New York, and published a report of his findings in 1957.

Within a year of Kline’s report, iproniazid – marketed since 1952 as the antitubercular agent Marsilid – had been given to over 400,000 depressed patients, even though it was never licensed as an antidepressant.  It was withdrawn from the U.S. market in 1961 owing to an alleged linkage to jaundice and liver damage.  But iproniazid retains its place of honor among psychotropic serendipities:  It was the first of the monoamine oxidase inhibitors (MAOIs), potent antidepressants whose contemporary formulations (Marplan, Nardil) are used to treat atypical depressions, i.e., depressions refractory to newer and more benign antidepressants like the omnipresent SSRIs.[5]

Nathan Kline was likewise at hand to steer another ostensibly nonpsychotropic drug into psychiatric usage.  In 1952, reports of Rauwolfia serpentina, a plant root used in India for hypertension (high blood pressure), snakebite, and “insanity,” reached the West and led to interest in the root’s potential as an antihypertensive.  A year later, chemists at the New Jersey headquarters of the Swiss pharmaceutical firm Ciba (later Ciba-Geigy and now Novartis) isolated an active alkaloid, reserpine, from the root, and Kline, ever ready with the mental patients at Rockland State Hospital, obtained a sample to try on the hospital’s psychotic patients.

Kline’s results were encouraging.  In short order, he was touting  reserpine as an “effective sedative for use in mental hospitals,” a finding reaffirmed later that year at a symposium at Ciba’s American headquarters in Summit, New Jersey, where staff pharmacologist F. F. Yonkman first used the term “tranquilizer” to characterize the drug’s mixture of sedation and well-being.[6]  As a major tranquilizer, reserpine never caught on like chlorpromazine, even though, throughout the 1950s, it “was far more frequently mentioned in the scientific literature than chlorpromazine.”[7]

So there we have it: the birth of modern psychopharmacology in the postwar era from research into penicillin preservatives, antihistamines, antibiotics, and antihypertensives.  Of course, serendipities operate in both directions:  drugs initially released as psychotropics sometimes fail miserably, only to find their worth outside of psychiatry.  We need only remember the history of thalidomide, released by the German firm Chemie Grünenthal in 1957 as a sedative effective in treating anxiety, tension states, insomnia, and nausea.  This psychotropic found its initial market among pregnant women who took the drug to relieve first-trimester morning sickness.  Unbeknownst to the manufacturer, the drug crossed the placental barrier and, tragically, compromised the pregnancies of many of these women.  Users of thalidomide delivered some 10,000 grossly deformed infants in Europe and Japan – “flipper” babies with truncated limbs.  Only 40% of these infants survived.

This sad episode is well known among historians, as is the heroic resistance of the FDA’s Frances Kelsey, who in 1960 fought off pressure from FDA administrators and executives at Richardson-Merrell, the American distributor, to release the drug in the U.S.  Less well known, perhaps, is the relicensing of the drug by the FDA in 1998 (as Thalomid) for a totally nonpsychotropic usage: the treatment of certain complications of leprosy.  Prescribed off-label, it also proved helpful in treating AIDS wasting syndrome.  And beginning in the 2000s, it was used in combination with another drug, dexamethasone, to treat multiple myeloma (a cancer of the plasma cells).  It received FDA approval as an anticancer agent in 2006.[8]

Seen thusly, serendipities are often rescue operations, the retrieving and reevaluating of long-abandoned industrial chemicals and of medications deemed inadequate for their intended purpose.  Small wonder that Librium, the first of the benzodiazepine class of minor tranquilizers, the drug that replaced meprobamate as the GP’s drug of choice in 1960, began its life as a new dye (benzheptoxidiazine) synthesized by the German chemists K. von Auwers and F. von Meyenburg in 1891.  In the 1930s the Polish-American chemist Leo Sternbach returned to the chemical and synthesized related compounds in the continuing search for new “dyestuffs.”  Then, 20 years later, Sternbach, now a chemist at Hoffmann-La Roche in Nutley, New Jersey, returned to these compounds one final time to see if any of them might have psychiatric applications.  He found nothing of promise, but in 1957 a coworker, undertaking a spring cleaning of the lab, came upon a variant that Sternbach had missed.  It turned out to be Librium.[9]  All hail to the resourceful minds that return to the dyes of yesteryear in search of the psychotropics of tomorrow – and to those who clean their labs with eyes wide open.

______________________

[1] F. M. Berger & W. Bradley, “The pharmacological properties of α:β-dihydroxy-γ-(2-methylphenoxy)propane (Myanesin),” Brit. J. Pharmacol. Chemother., 1:265-272, 1946, at p. 265.

[2] D. Herzberg, Happy Pills in America: From Miltown to Prozac (Baltimore: Johns Hopkins, 2009), p. 35.  Cf. A. Tone, The Age of Anxiety: A History of America’s Turbulent Affair with Tranquilizers  (NY:  Basic Books, 2009), pp. 90-91.

[3] P. Charpentier, et al., “Recherches sur les diméthylaminopropyl –N phénothiazines substituées,” Comptes Rendus de l’Académie des Sciences, 235:59-60, 1952.

[4] On the discovery and early uses of chlorpromazine, see D. Healy, The Creation of Psychopharmacology (Cambridge: Harvard, 2002), pp. 77-101; F. López-Muñoz, et al., “History of the discovery and clinical introduction of chlorpromazine,” Ann. Clin. Psychiat., 17:113-135, 2005; and T. A. Ban, “Fifty years chlorpromazine: a historical perspective,” Neuropsychiat. Dis. & Treat., 3:495-500, 2007.

[5] On the development and marketing of isoniazid, see H. F. Dowling, Fighting Infection: Conquests of the Twentieth Century (Cambridge: Harvard, 1977), p. 168; F. Ryan, The Forgotten Plague: How the Battle Against Tuberculosis was Won and Lost (Boston: Little, Brown, 1992), p. 363; F. López-Muñoz, et al., “On the clinical introduction of monoamine oxidase inhibitors, tricyclics, and tetracyclics. Part II: tricyclics and tetracyclics,” J. Clin. Psychopharm., 28:1-4, 2008; and Tone, Age of Anxiety, pp. 128-29.

[6] E. S. Valenstein, Blaming the Brain: The Truth about Drugs and Mental Health  (NY:  Free Press, 1998), p. 69; D. Healy, The Antidepressant Era (Cambridge: Harvard, 1997),  pp. 59-70;  D. Healy, Creation of Psychopharmacology, pp. 103-05.

[7] Healy, Creation of Psychopharmacology, p. 106.

[8] P. J. Hilts provides a readable overview of the thalidomide crisis in Protecting America’s Health:  The FDA, Business, and One Hundred Years of Regulation (NY:  Knopf, 2003), pp. 144-65.  On the subsequent relicensing of thalidomide for the treatment of leprosy in 1998 and its extensive off-label use, see S. Timmermans & M. Berg, The Gold Standard:  The Challenge of Evidence-Based Medicine and Standardization in Health Care. (Phila: Temple University Press, 2003), pp. 188-92.

[9] On the discovery of Librium, see Valenstein, Blaming the Brain, pp. 54-56; A. Tone, “Listening to the past: history, psychiatry, and anxiety,” Canad. J. Psychiat., 50:373-380, 2005, at p. 377; and Tone, Age of Anxiety, pp. 126-40.

Copyright © 2014 by Paul E. Stepansky.  All rights reserved.

An Irony of War

“There are two groups of people in warfare – those organized to inflict and those organized to repair wounds – and there is little doubt but that in all wars, and in this one in particular, the former have been better prepared for their jobs” (Milit. Surg., 38:601, 1916).  So observed Harvey Cushing, the founder of modern neurosurgery, a year before America’s entry into World War I.  Cushing’s judgment is just, and yet throughout history “those organized to repair wounds” have risen to the exigencies of the war at hand.  In point of fact, warfare has spurred physicians, surgeons, and researchers to major, sometimes spectacular, advances, and their scientific and clinical victories are bequeathed to civilian populations that inherit the peace.  Out of human destructiveness emerge potent new strategies of protection, remediation, and self-preservation.  Call it an irony of war.

Nor are these medical and surgical gifts limited to the era of modern warfare.  The French army surgeon Jean Louis Petit invented the screw tourniquet in 1718; it made possible leg amputation above the knee.  The Napoleonic Wars of the early nineteenth century brought us the first field hospitals along with battlefield nursing and ambulances.  The latter were of course horse-drawn affairs, but they were exceedingly fast and maneuverable and were termed “flying ambulances.”  The principle of triage — treating the wounded, regardless of rank, according to severity of injury and urgency of need – is not a product of twentieth-century disasters.  It was devised by Dominique Jean Larrey, Napoleon’s surgeon-in-chief from 1797 to 1815.

The American Civil War witnessed the further development of field hospitals and the acceptance, often grudging, especially among southern surgeons, of female nurses tending to savaged male bodies.  Hospital-based training programs for nurses were a product of wartime experience.  Civil War surgeons themselves broached the idea shortly after the peace, and the first such programs opened  in New York, Boston, and New Haven hospitals in 1873.  The dawning appreciation of the relationship between sanitation and prevention of infection, which would blossom into the “sanitary science” of the 1870s and 1880s, was another Civil War legacy.

And then there were the advances, surgical and technological, in amputation.  They included the use of the flexible chain saw to spare nerves and muscles and even, in many cases of comminuted fracture, to avoid amputation entirely.  The introduction of more or less modern vascular ligation – devised on the battlefield to tie off major arteries extending from the stumps of severed limbs – is another achievement of Civil War surgeons.  Actually, they rediscovered ligation, since the French military surgeon Ambroise Paré employed it following battlefield amputation in the mid-sixteenth century, and he in turn was reviving a practice employed in the Alexandrian Era of the fourth century B.C.

In 1900 Karl Landsteiner, a Viennese pathologist and immunologist, first described the ABO system of blood groups, founding the field of immunohematology.  His discovery bore fruit in World War I, which gave us the blood depots that made possible blood transfusions among wounded soldiers in the Army Medical Corps in France.  The First World War also pushed medicine further along the path to modern wound management, including the treatment of cellulitic wound infections, i.e., bacterial skin infections that followed soft tissue trauma.  Battlefield surgeons were quick to appreciate the need for thorough wound debridement and delayed closure in treating contaminated war wounds.  The prevalence of central nervous system injuries – a tragic byproduct of trench warfare in which soldiers’ heads peered anxiously above the parapets – led to “profound insights into central nervous system form and function.”  The British neurologist Gordon Holmes provided elaborate descriptions of transections (crosswise severings) of the spinal cord for every spinal segment, whereas Cushing, performing eight neurosurgeries a day, “rose to the challenge of refining the treatment of survivors of penetrating head wounds” (Arch. Neurol., 51:712, 1994).  His work from 1917 “lives today” (ANZ J. Surg., 74:75, 2004).

No less momentous was the development of reconstructive surgery by inventive surgeons (led by the New Zealand ENT surgeon Harold Gillies) and dentists (led by the French-American Charles Valadier) unwilling to accept the gross disfigurement of downed pilots who crawled away from smoking wreckages with their lives, but not their faces, intact.  A signal achievement of wartime experience with burn and gunshot victims was Gillies’s Plastic Surgery of the Face of 1920; another was the founding of the American Association of Plastic Surgeons a year later.  After the war, be it noted, the pioneering reconstructive surgeons refused to place their techniques at the disposal of healthy women (and less frequently healthy men) desirous of facial enhancement; reconstructive facial surgery went into short-lived hibernation.  One reason reconstructive surgeons morphed into cosmetic surgeons was the psychiatrization of facial imperfection via Freudian and especially Adlerian notions of the “inferiority complex,” with its allegedly life-deforming ramifications.  So nose jobs became all the rage in the 1930s, to be joined by facelifts in the postwar 40s. (Elizabeth Haiken’s book Venus Envy: A History of Cosmetic Surgery [1997] is illuminating on all these issues.)

The advances of World War II are legion.  Among the most significant was the development of new vaccines, or the significant improvement of existing ones, for 10 of the 28 vaccine-preventable diseases identified in the twentieth century (J. Pub. Health Pol., 27:38, 2006); new vaccines for influenza, pneumococcal pneumonia, and plague were among them.  There were also new treatments for malaria and the mass production of penicillin in time for D-Day.  It was during WWII that American scientists learned to separate blood plasma into its constituents (albumin, globulins, and clotting factors), an essential advance in the treatment of shock and control of bleeding.

No less staggering were the surgical advances that occurred during the war. Hugh Cairns, Cushing’s favorite student, developed techniques for the repair of the skull base and laid the foundation of modern craniofacial surgery by bringing together neurosurgeons, plastic surgeons, and ophthalmic surgeons in mobile units referred to as “the trinity.”   There were also major advances in fracture and wound care along with the development of hand surgery as a surgical specialty.   Wartime treatment experience with extreme stress, battlefield trauma, and somatization (then termed, in Freudian parlance, “conversion reactions”) paved the way for the blossoming of psychosomatic medicine in the 1950s and 1960s.

The drum roll hardly ends with World War II.  Korea gave us the first air ambulance service.  Vietnam gave us Huey helicopters for evacuation of wounded soldiers.  (Now all trauma centers have heliports.)  Prior to evacuation, these soldiers received advanced, often life-saving, care from medical corpsmen who opened surgical airways and performed thoracic needle decompressions and shock resuscitation; thus was born our modern system of prehospital emergency care by onsite EMTs and paramedics.  When these corpsmen returned to the States, they formed the original candidate pool for Physician Assistant training programs, the first of which opened its doors at Duke University Medical Center in 1965.  Vietnam also gave us major advances in vascular surgery, recorded for surgical posterity in the “Vietnam Vascular Registry,” a database with records of over 8000 vascular wound cases contributed by over 600 battlefield surgeons.

The medical and surgical yield of recent and ongoing wars in the Persian Gulf will be recorded in years to come.  Already, these wars have provided two advances for which all may give thanks:  portable intensive care units (“Life Support for Trauma and Transport”) and HemCon bandages.  The latter, made from chitosan, a substance derived from shrimp shells, stop severe bleeding almost instantaneously.

Now, of course, with another century of war under our belt and the ability to play computer-assisted war games, we are better able to envision the horrific possibilities of wars yet to come.  In the years leading up to World War I, American surgeons – even those, like Harvey Cushing, who braced themselves for war – had no idea of the human wreckage they would encounter in French field hospitals.  Their working knowledge of war wounds relied on the Boer War (1899-1902), a distinctly nineteenth-century affair, militarily speaking, fought on the open veldt of South Africa, not in trenches in the overly fertilized, bacteria-saturated soil of France.  Now military planners can turn to databases that gather together the medical-surgical lessons of two World Wars, Korea, Vietnam, Iraq, Afghanistan, and any number of regional conflicts.

Military simulations have already been broadened to include political and social factors.  But military planners should also be alert to possibilities of mutilation, disfigurement, multiple-organ damage, and drug-resistant infection only dimly imagined.  Perhaps they can broaden their simulations to include the medical and surgical contingencies of future wars and get bench scientists, clinical researchers, and surgeons to work on them right away.  Lucky us.

Copyright © 2012 by Paul E. Stepansky.  All rights reserved.