But any such suggestion has no basis in fact. Instead, given the chance to prepare hospitals and health-care workers for the expected influx of covid-19 patients, the Trump administration did not take action to build up supplies of the vital equipment experts knew would be needed. Indeed, the administration has so far refused to use the Defense Production Act, or DPA, to ramp up production of even fairly basic but essential medical supplies, despite many urgent calls to do so. And now there is a shortage of personal protective equipment (PPE), such as masks and other gear for health-care workers, which threatens the lives of these vital professionals and may overwhelm the health-care system even more rapidly.
There is an urgent need to get essential protective gear to the people who need it. Without enough PPE on hand, public health officials are turning to donations from laboratories, tattoo and nail parlors, and construction companies. U.S. manufacturers such as 3M and Honeywell have announced plans to produce protective gear. There have even been calls for laypeople to sew masks at home, whether from donated medical-grade fabric or from their own stores of material, using everything from vacuum bags to air-conditioning filters, for clinicians to use.
But why are simple, low-tech and relatively low-cost items such as PPE so hard to find right now in the first place? In part, this shortage can be explained by global production chains, just-in-time manufacturing strategies and poor public health planning. But the rapid-onset shortage of PPE in the United States during the coronavirus pandemic exposes a much bigger structural and cultural problem: the technological imperative in U.S. health care. Over the past century, a deeply ingrained cultural preference has developed for high-tech, high-cost health-care interventions, which in turn drives supply chains toward expensive, novel interventions rather than our very real, far less sexy need for day-to-day, low-tech or old-tech materials. This technological imperative is so institutionalized that we as a society barely even notice it, much less question it, but the dangers are on display today.
In the 19th century, hospitals lacked the technology we associate with health care today. Instead, they were institutions of last resort, places for sick individuals whose communities had abandoned them to the care of strangers. In 1835, Philadelphia hospitals and almshouses organized women’s wards not by diagnosis, but by what kind of useful work patients — then known as inmates — could do for the hospital: “aged and helpless women in bad health; aged and helpless women who can sew and knit; aged and helpless women who are good sewers” and, finally, “spinners.” The patients in these early American institutions were charity patients, expected to stay a long time. And so, if they were physically able, they were often pressed into direct service to keep the hospital running.
Spinning wool into thread was about the most valuable work a patient could do at the time. Thread was needed for making and repairing bedsheets, curtains and nurses’ uniforms, to name just a few items essential to the functioning of these early institutions. Surplus thread could be stored and saved for future use or even sold to support the work of the hospital.
As decades went by, medicine and hospital care changed dramatically. As both public and private investment in science and medicine yielded novel and effective therapeutics — think vaccines, antibiotics and organ transplants, to name just a few — the United States’ private system of health-care provisioning meant that medicine rapidly became big business in the country. American medicine came to be characterized by its new and expensive technologies, ideally housed in new and expensive hospital buildings.
Dazzled and inspired by success stories, advertisements and their own experiences, many Americans adopted a presumption that newer, more expensive technologies necessarily equaled better care. In that moment, with hospitals no longer a last resort but a vaunted destination for care, it seemed increasingly absurd to imagine that patients or neighboring laypeople could play a part in creating the materials or machinery found inside. Though local women were, and still are, called on to volunteer to knit hats for newborns, sew puppets for children or devise and knit cannula sleeves for dementia patients, in the eyes of the modern hospital and its administrators, the value of this labor paled in comparison to the therapeutic or economic potential of other interventions and technologies.
During the 20th century, the expanding American middle class repeatedly demonstrated a preference for, and excitement about, newer, more expensive technologies. This was not true everywhere; many health-care consumers in other countries were, and remain, skeptical of new technology, preferring to wait until a new method has been perfected before trying it out.
But middle-class Americans consistently chose more novel and expensive technology, even when it was untested, over wider distribution of less expensive technology to all. This was especially true for three key 20th-century therapeutic innovations: the iron lung machine for the treatment of polio, the development of kidney dialysis and the invention of other respirator treatments, such as ventilators. During this era of rapid innovation in U.S. medicine and the building up of hospitals, laboratories, medical device makers and pharmaceutical companies, sick Americans, particularly those with economic means, were recast: no longer known as inmates, like their 19th-century counterparts, but instead as health-care consumers. Consumers' decisions in the health-care arena drove not just care decisions but also big revenue. Basics such as PPE couldn't garner the same excitement as new equipment.
As consumers, many of us are conditioned to believe that higher-tech interventions mean we are getting better care. An MRI exam seems like better diagnostic care for a minor injury than an X-ray, which in turn seems better than a physical exam alone. In turn, our doctors' offices are incentivized to invest in and offer these services, even when they are not medically necessary. This cycle of providing ever-higher-tech treatments and diagnostics drives high utilization rates and high health-care costs in the United States, and too often it does so without evidence of better outcomes.
There is nothing inherently wrong with liking new technologies: Our medical technology industries and medical device manufacturers have created many effective, lifesaving therapeutics, produced hundreds of thousands of jobs and added tens of millions of dollars to our economy every year. But if we permit ourselves to be romanced by the lure of shiny new tech in health care, directing our resources and innovation efforts toward the new and the complex without really questioning whether we are using those resources wisely or effectively, we risk neglecting older but necessary and effective technologies, such as PPE and ventilators (once a new technology, now an intervention almost 100 years old), and with them, lives.
The PPE shortage for health-care workers facing down covid-19 in our hospitals and clinics is the latest harrowing installment in a long trend of prioritizing newer, higher-tech, more profitable ventures at the expense of other sectors of the U.S. health-care system. We must all act now, in as many ways as needed, using advocacy, creativity and flexibility, to solve the urgent #GetMePPE crisis in the United States. And if we wish to avoid repeating this mistake, moving forward we must make largely invisible social institutions, such as the technological imperative in U.S. health care, visible. We must create policies that encourage Americans to be aware of our costly and sometimes wrongheaded affinity for new technology, to question assumptions and to ensure that we always, always have ample supplies of low-tech, low-cost medical equipment, such as PPE, ready when we need it.