“There are two groups of people in warfare – those organized to inflict and those organized to repair wounds – and there is little doubt but that in all wars, and in this one in particular, the former have been better prepared for their jobs” (Milit. Surg., 38:601, 1916). So observed Harvey Cushing, the founder of modern neurosurgery, a year before America’s entry into World War I. Cushing’s judgment is just, and yet throughout history “those organized to repair wounds” have risen to the exigencies of the war at hand. In point of fact, warfare has spurred physicians, surgeons, and researchers to major, sometimes spectacular, advances, and their scientific and clinical victories are bequeathed to civilian populations that inherit the peace. Out of human destructiveness emerge potent new strategies of protection, remediation, and self-preservation. Call it an irony of war.
Nor are these medical and surgical gifts limited to the era of modern warfare. The French army surgeon Jean Louis Petit invented the screw tourniquet in 1718; it made possible leg amputation above the knee. The Napoleonic Wars of the early nineteenth century brought us the first field hospitals along with battlefield nursing and ambulances. The latter were of course horse-drawn affairs, but they were exceedingly fast and maneuverable and were termed “flying ambulances.” The principle of triage – treating the wounded, regardless of rank, according to severity of injury and urgency of need – is not a product of twentieth-century disasters. It was devised by Dominique Jean Larrey, Napoleon’s surgeon-in-chief from 1797 to 1815.
The American Civil War witnessed the further development of field hospitals and the acceptance, often grudging, especially among southern surgeons, of female nurses tending to savaged male bodies. Hospital-based training programs for nurses were a product of wartime experience. Civil War surgeons themselves broached the idea shortly after the peace, and the first such programs opened in New York, Boston, and New Haven hospitals in 1873. The dawning appreciation of the relationship between sanitation and prevention of infection, which would blossom into the “sanitary science” of the 1870s and 1880s, was another Civil War legacy.
And then there were the advances, surgical and technological, in amputation. They included the use of the flexible chain saw to spare nerves and muscles and even, in many cases of comminuted fracture, to avoid amputation entirely. More or less modern vascular ligation – devised on the battlefield to tie off major arteries extending from the stumps of severed limbs – is another achievement of Civil War surgeons. Actually, they rediscovered ligation, since the French military surgeon Ambroise Paré employed it following battlefield amputation in the mid-sixteenth century, and he in turn was reviving a practice employed in the Alexandrian Era of the fourth century B.C.
In 1900 Karl Landsteiner, a Viennese pathologist and immunologist, first described the ABO system of blood groups, founding the field of immunohematology. As a result, World War I gave us blood banks that made possible blood transfusions among wounded soldiers in the Army Medical Corps in France. The First World War also pushed medicine further along the path to modern wound management, including the treatment of cellulitic wound infections, i.e., bacterial skin infections that followed soft tissue trauma. Battlefield surgeons were quick to appreciate the need for thorough wound debridement and delayed closure in treating contaminated war wounds. The prevalence of central nervous system injuries – a tragic byproduct of trench warfare in which soldiers’ heads peered anxiously above the parapets – led to “profound insights into central nervous system form and function.” The British neurologist Gordon Holmes provided elaborate descriptions of spinal transections (complete crosswise severing of the cord) for every segment of the spinal cord, while Cushing, performing eight neurosurgeries a day, “rose to the challenge of refining the treatment of survivors of penetrating head wounds” (Arch. Neurol., 51:712, 1994). His work from 1917 “lives today” (ANZ J. Surg., 74:75, 2004).
No less momentous was the development of reconstructive surgery by inventive surgeons (led by the New Zealand ENT surgeon Harold Gillies) and dentists (led by the French-American Charles Valadier) unwilling to accept the gross disfigurement of downed pilots who crawled away from smoking wreckages with their lives, but not their faces, intact. A signal achievement of wartime experience with burn and gunshot victims was Gillies’s Plastic Surgery of the Face of 1920; another was the founding of the American Association of Plastic Surgeons a year later. After the war, be it noted, the pioneering reconstructive surgeons refused to place their techniques at the disposal of healthy women (and less frequently healthy men) desirous of facial enhancement; reconstructive facial surgery went into short-lived hibernation. One reason reconstructive surgeons morphed into cosmetic surgeons was the psychiatrization of facial imperfection via Freudian and especially Adlerian notions of the “inferiority complex,” with its allegedly life-deforming ramifications. So nose jobs became all the rage in the 1930s, to be joined by facelifts in the postwar 1940s. (Elizabeth Haiken’s book Venus Envy: A History of Cosmetic Surgery is illuminating on all these issues.)
The advances of World War II are legion. Among the most significant was the development or marked improvement of vaccines for 10 of the 28 vaccine-preventable diseases identified in the twentieth century (J. Pub. Health Pol., 27:38, 2006); new vaccines for influenza, pneumococcal pneumonia, and plague were among them. There were also new treatments for malaria and the mass production of penicillin in time for D-Day. It was during WWII that American scientists learned to separate blood plasma into its constituents (albumin, globulins, and clotting factors), an essential advance in the treatment of shock and control of bleeding.
No less staggering were the surgical advances that occurred during the war. Hugh Cairns, Cushing’s favorite student, developed techniques for the repair of the skull base and laid the foundation of modern craniofacial surgery by bringing together neurosurgeons, plastic surgeons, and ophthalmic surgeons in mobile units referred to as “the trinity.” There were also major advances in fracture and wound care along with the development of hand surgery as a surgical specialty. Wartime treatment experience with extreme stress, battlefield trauma, and somatization (then termed, in Freudian parlance, “conversion reactions”) paved the way for the blossoming of psychosomatic medicine in the 1950s and 1960s.
The drum roll hardly ends with World War II. Korea gave us the first air ambulance service. Vietnam gave us Huey helicopters for evacuation of wounded soldiers. (Now all trauma centers have heliports.) Prior to evacuation, these soldiers received advanced, often life-saving, care from medical corpsmen who opened surgical airways and performed thoracic needle decompressions and shock resuscitation; thus was born our modern system of prehospital emergency care by onsite EMTs and paramedics. When these corpsmen returned to the States, they formed the original candidate pool for Physician Assistant training programs, the first of which opened its doors at Duke University Medical Center in 1965. Vietnam also gave us major advances in vascular surgery, recorded for surgical posterity in the “Vietnam Vascular Registry,” a database with records of over 8000 vascular wound cases contributed by over 600 battlefield surgeons.
The medical and surgical yield of recent and ongoing wars in the Persian Gulf will be recorded in years to come. Already, these wars have provided two advances for which all may give thanks: portable intensive care units (“Life Support for Trauma and Transport”) and Hem-Con bandages. The latter, made from chitosan, an extract of shrimp shells, stop severe bleeding instantaneously.
Now, of course, with another century of war under our belt and the ability to play computer-assisted war games, we are better able to envision the horrific possibilities of wars yet to come. In the years leading up to World War I, American surgeons – even those, like Harvey Cushing, who braced themselves for war – had no idea of the human wreckage they would encounter in French field hospitals. Their working knowledge of war wounds relied on the Boer War (1899-1902), a distinctly nineteenth-century affair, militarily speaking, fought on the dry veldt of South Africa, not in trenches in the overly fertilized, bacteria-saturated soil of France. Now military planners can turn to databases that gather together the medical-surgical lessons of two World Wars, Korea, Vietnam, Iraq, Afghanistan, and any number of regional conflicts.
Military simulations have already been broadened to include political and social factors. But military planners should also be alert to possibilities of mutilation, disfigurement, multiple-organ damage, and drug-resistant infection only dimly imagined. Perhaps they can broaden their simulations to include the medical and surgical contingencies of future wars and get bench scientists, clinical researchers, and surgeons to work on them right away. Lucky us.