summary:
This article offers a historical perspective on diversity, equity, and inclusion initiatives in health professions. Historians have highlighted how workforce shortages have facilitated increased gender diversity in male-dominated scientific and clinical occupations. Less attention has been given to manpower as a motivator for enhancing racial/ethnic diversity. I explore the history of minority recruitment, retention, and inclusion initiatives in occupational therapy and physical therapy after 1970 and examine the evolving ways in which the longstanding underrepresentation of racial/ethnic minority health professions students and practitioners was recognized, mobilized, and instrumentalized in each field. I argue that broad-based manpower concerns, though often compelling initial motivators for action, were insufficient for sustaining successful and long-term minority initiatives, due to constant shifts in job market demand. Instead, this article shows that annual and institutionalized minority-specific awards and fundraisers were the most effective strategies for maintaining minority recruitment initiatives over multiple decades.
Over the second half of the nineteenth century, thousands of Americans were admitted to schools for so-called idiotic children, later known as institutions for the feebleminded and linked to the eugenics movement. While idiocy is often presumed to be the antecedent of intellectual disability, an analysis of the stories of three hundred children admitted to one such institution over a forty-year period demonstrates an unexpected diversity of appearances, abilities, and behaviors. Within the walls of the institution, the category of idiocy encompassed children whose perceived abilities deviated from the expectations of their social position. Families further shaped the diagnosis of idiocy by negotiating the timing of admission for their children, influenced not only by personal factors but also by shifting educational and employment opportunities and by cultural tolerance of diversity. Consequently, idiocy became the broadest descriptor of disability during the nineteenth century.
Following the medical breakthroughs of Pasteur and Koch after 1880, simians became pivotal to laboratory research aimed at developing vaccines and cultivating microbes through the technique of serial passage. These innovations fueled research on multiple diseases and unleashed a demand for simians, which died easily in captivity. European and American colonial expansion facilitated a burgeoning market for laboratory animals that intensified hunting for live specimens. This demand created novel opportunities for disease transfers and viral recombinations as simians of different species were confined in precarious settings. As laboratories moved into the colonies for research into a variety of diseases, notably syphilis, sleeping sickness, and malaria, the simian market intensified further. While researchers expected that colonial laboratories would offer more natural environments than their metropolitan affiliates, amassing apes, people, microbes, and insects at close quarters instead created unnatural conditions that may have facilitated the spread of undetectable diseases.
Jean-Martin Charcot (1825-1893), the leading neurologist of his time, is best remembered for his studies on hysteria presented in clinical lectures at the Paris Salpêtrière hospital. His development of the concept of traumatic male hysteria, in patients who had suffered only slight physical damage in accidents, led him to advance a psychological explanation for hysteria. Traumatic hysteria provides the context for a close reading of Charcot's "last words," based upon a final unpublished lesson from 1893. This case history concerns a seventeen-year-old Parisian artisan whose various signs of hysteria developed following a dream in which he imagined himself the victim of a violent assault. Charcot identifies the dream/nightmare as the "original" feature determining traumatic hysteria. The dream sets in motion an overwhelming of consciousness, followed by a susceptibility to "autosuggestion" that produces somatic signs of hysteria. Charcot's final lesson on dreams thus culminates his study of the psychological basis of traumatic hysteria.
Physicians in the twentieth century routinely used episiotomy, a cut made during childbirth, to facilitate labor, relying on the evidence of their own experience that it was useful. But physicians were not alone in producing evidence regarding episiotomy and its repair. Here I consider how three groups (male physicians, husbands, and laboring women) were involved in creating evidence and circulating knowledge about episiotomies, specifically about the so-called "husband's stitch," a repair intended to sexually benefit men. In doing so, I consider the meanings of evidence within medicine, lay women's use of evidence as a basis for challenging the hegemony of medicine, and how medical knowledge is produced and shared among physicians and non-physicians.
Systemic lupus erythematosus (SLE) is an autoimmune disorder that affects mostly women, and disproportionately Black women. Until the 1940s, SLE was rarely diagnosed in Black Americans, reflecting racist medical beliefs about Black immunity. In the 1940s and 1950s, SLE and its treatment were part of a patriarchal narrative of American industrialization. By the 1960s, newer diagnostic techniques increased recognition of SLE, especially among Black women; medical thinking about SLE shifted from external causes like infection or allergy to autoimmunity, which emphasized biological, genetically determined racial difference. In the 1970s and 1980s, an advocacy structure crystallized around memoirs by women with SLE, which emphasized the experiences of able-bodied, economically privileged white women, while Black feminist health discourse and SLE narratives by Black authors grappled with SLE's more complicated intersections. Throughout the twentieth century, SLE embodied immunity as a gendered, racialized, and culturally invested process.