Some excerpts from an article in The Atlantic:
Almost two years ago, my father was killed by a hospital-borne infection in the intensive-care unit of a well-regarded nonprofit hospital in New York City. Dad had just turned 83, and he had a variety of the ailments common to men of his age. But he was still working on the day he walked into the hospital with pneumonia. Within 36 hours, he had developed sepsis. Over the next five weeks in the ICU, a wave of secondary infections, also acquired in the hospital, overwhelmed his defenses. My dad became a statistic—merely one of the roughly 100,000 Americans whose deaths are caused or influenced by infections picked up in hospitals. One hundred thousand deaths: more than double the number of people killed in car crashes, five times the number killed in homicides, 20 times the total number of our armed forces killed in Iraq and Afghanistan. Another victim in a building American tragedy.
…
I’m a businessman, and in no sense a health-care expert. But the persistence of bad industry practices—from long lines at the doctor’s office to ever-rising prices to astonishing numbers of preventable deaths—seems beyond all normal logic, and must have an underlying cause. There needs to be a business reason why an industry, year in and year out, would be able to get away with poor customer service, unaffordable prices, and uneven results—a reason my father and so many others are unnecessarily killed.
…
The housing bubble offers some important lessons for health-care policy. The claim that something—whether housing or health care—is an undersupplied social good is commonly used to justify government intervention, and policy makers have long striven to make housing more affordable. But by making housing investments eligible for special tax benefits and subsidized borrowing rates, the government has stimulated not only the construction of more houses but also the willingness of people to borrow and spend more on houses than they otherwise would have. The result is now tragically clear.
…
Comprehensive health insurance is such an ingrained element of our thinking that we forget its rise to dominance is relatively recent. Modern group health insurance was introduced in 1929, and employer-based insurance began to blossom during World War II, when wage freezes prompted employers to expand other benefits as a way of attracting workers. Still, as late as 1954, only a minority of Americans had health insurance. That’s when Congress passed a law making employer contributions to employee health plans tax-deductible without making the resulting benefits taxable to employees. This seemingly minor tax benefit not only encouraged the spread of catastrophic insurance, but had the accidental effect of making employer-funded health insurance the most affordable option (after taxes) for financing pretty much any type of health care. There was nothing natural or inevitable about the way our system developed: employer-based, comprehensive insurance crowded out alternative methods of paying for health-care expenses only because of a poorly considered tax benefit passed half a century ago.
…
Insurance is probably the most complex, costly, and distortional method of financing any activity; that’s why it is otherwise used to fund only rare, unexpected, and large costs. Imagine sending your weekly grocery bill to an insurance clerk for review, and having the grocer reimbursed by the insurer to whom you’ve paid your share. An expensive and wasteful absurdity, no?
Is this really a big problem for our health-care system? Well, for every two doctors in the U.S., there is now one health-insurance employee—more than 470,000 in total. In 2006, it cost almost $500 per person just to administer health insurance. Much of this enormous cost would simply disappear if we paid routine and predictable health-care expenditures the way we pay for everything else—by ourselves.
…
Moral hazard has fostered an accidental collusion between providers benefiting from higher costs and patients who don’t fully bear them. In this environment, trying to control costs is awfully tough. When Medicare cut reimbursement rates in 2005 on chemotherapy and anemia drugs, for instance, it saved almost 20 percent of the previously billed costs. But Medicare’s total cancer-treatment costs actually rose almost immediately. As The New York Times reported, some physicians believed their colleagues simply performed more treatments, particularly higher-profit ones.
Want further evidence of moral hazard? The average insured American and the average uninsured American spend very similar amounts of their own money on health care each year—$654 and $583, respectively. But they spend wildly different amounts of other people’s money—$3,809 and $1,103, respectively. Sometimes the uninsured do not get highly beneficial treatments because they cannot afford them at today’s prices—something any reform must address. But likewise, insured patients often get only marginally beneficial (or even outright unnecessary) care at mind-boggling cost. If it’s true that the insurance system leads us to focus on only our direct share of costs—rather than the total cost to society—it’s not surprising that insured families and uninsured ones would make similar decisions as to how much of their own money to spend on care, but very different decisions on the total amount to consume.
…
In 2007, employer-based health insurance cost, on average, more than $12,000 per family, up 78 percent since 2001. I’ve run several companies and company divisions of various sizes over the course of my career, so I can confidently tell you that raises (and even entry-level hiring) are tightly limited by rising health-care costs. You may think your employer is paying for your health care, but in fact your company’s share of the insurance premium comes out of your potential wage increase. Where else could it come from?
Let’s say you’re a 22-year-old single employee at my company today, starting out at a $30,000 annual salary. Let’s assume you’ll get married in six years, support two children for 20 years, retire at 65, and die at 80. Now let’s make a crazy assumption: insurance premiums, Medicare taxes and premiums, and out-of-pocket costs will grow no faster than your earnings—say, 3 percent a year. By the end of your working days, your annual salary will be up to $107,000. And over your lifetime, you and your employer together will have paid $1.77 million for your family’s health care. $1.77 million! And that’s only after assuming the taming of costs! In recent years, health-care costs have actually grown 2 to 3 percent faster than the economy. If that continues, your 22-year-old self is looking at an additional $2 million or so in expenses over your lifetime—roughly $4 million in total.
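The compounding here is easy to check. Below is a minimal sketch: the $30,000 starting salary, the 3 percent growth rate, and the ages come from the paragraph above, but the $11,000 starting annual health outlay is an invented stand-in, since the excerpt does not itemize the author's actual inputs (premiums, Medicare taxes and premiums, and out-of-pocket costs).

```python
# Minimal sketch of the compounding above. Salary figures come from the
# text; the $11,000 starting annual health outlay is an INVENTED stand-in
# for premiums + Medicare taxes + out-of-pocket costs, which the excerpt
# does not itemize.

def lifetime_total(start, rate, years):
    """Sum of an annual payment that grows at `rate` for `years` years."""
    total, annual = 0.0, float(start)
    for _ in range(years):
        total += annual
        annual *= 1 + rate
    return total

print(f"salary at 65: ${30_000 * 1.03 ** 43:,.0f}")  # ~$107,000, as in the text

years = 80 - 22   # costs run from age 22 through death at 80
print(f"costs at 3.0%: ${lifetime_total(11_000, 0.030, years):,.0f}")  # ~$1.7 million
print(f"costs at 5.5%: ${lifetime_total(11_000, 0.055, years):,.0f}")  # ~$4.3 million
```

With those assumed inputs, the totals land in the same neighborhood as the article's $1.77 million and roughly $4 million figures, which is really the point: over a working lifetime, the growth rate matters far more than the starting number.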
…
Whatever their histories, nearly all developed countries are now struggling with rapidly rising health-care costs, including those with single-payer systems. From 2000 to 2005, per capita health-care spending in Canada grew by 33 percent, in France by 37 percent, in the U.K. by 47 percent—all comparable to the 40 percent growth experienced by the U.S. in that period. Cost control by way of bureaucratic price controls has its limits.
…
Health care is an exceptionally heavily regulated industry. Health-insurance companies are regulated by states, which limits interstate competition. And many of the materials, machines, and even software programs used by health-care facilities must be licensed by state or federal authorities, or approved for use by Medicare; these requirements form large barriers to entry for both new facilities and new vendors that could equip and supply them.
Many health-care regulations are justified as safety precautions. But many also result from attempts to redress the distortions that our system of financing health care has created. And whatever their purpose, almost all of these regulations can be shaped over time by the powerful institutions that dominate the health-care landscape, and that are often looking to protect themselves from competition.
Take the ongoing battle between large integrated hospitals and specialty clinics (for cardiac surgery, orthopedics, maternity, etc.). The economic threat posed by these facilities is well illustrated by a recent battle in Loma Linda, California. When a group of doctors proposed a 28-bed private specialty facility, the local hospitals protested to the city council that it was unnecessary, and launched a publicity campaign to try to block it; the council backed the facility anyway. So the nonprofit Loma Linda University Medical Center simply bought the new facility for $80 million in 2008. Traditional hospitals got Congress to include an 18-month moratorium on new specialty hospitals in the 2003 Medicare law, and a second six-month ban in 2005.
…
Consider the oft-quoted “statistic” that emergency-room care is the most expensive form of treatment. Has anyone who believes this ever actually been to an emergency room? My sister is an emergency-medicine physician; unlike most other specialists, ER docs usually work on scheduled shifts and are paid fixed salaries that place them in the lower ranks of physician compensation. The doctors and other workers are hardly underemployed: typically, ERs are unbelievably crowded. They have access to the facilities and equipment of the entire hospital, but require very few dedicated resources of their own. They benefit from the group buying power of the entire institution. No expensive art decorates the walls, and the waiting rooms resemble train-station waiting areas. So what exactly makes an ER more expensive than other forms of treatment?
Perhaps it’s the accounting. Since charity care, which is often performed in the ER, is one justification for hospitals’ protected place in law and regulation, it’s in hospitals’ interest to shift costs from overhead and other parts of the hospital to the ER, so that the costs of charity care—the public service that hospitals are providing—will appear to be high. Hospitals certainly lose money on their ERs; after all, many of their customers pay nothing. But to argue that ERs are costly compared with other treatment options, hospitals need to claim expenses well beyond the marginal (or incremental) cost of serving ER patients.
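To make the marginal-versus-allocated-cost point concrete, here is a toy calculation with entirely invented numbers. The reported cost of an ER visit swings widely with the accounting choice of how much shared overhead to book against the ER, even though the true marginal cost of treating one more patient never changes.

```python
# Toy illustration (all numbers invented) of the overhead-allocation point:
# the reported ER cost per visit depends heavily on how much shared overhead
# the hospital chooses to book against the ER, while the marginal cost of
# treating one more ER patient stays the same.
er_visits       = 50_000
marginal_cost   = 400            # supplies + incremental staff time per visit
shared_overhead = 100_000_000    # building, equipment, administration

for er_share in (0.05, 0.25):    # fraction of overhead booked to the ER
    reported = marginal_cost + shared_overhead * er_share / er_visits
    print(f"overhead share {er_share:.0%}: reported cost/visit = ${reported:,.0f}")
# overhead share 5%:  reported cost/visit = $500
# overhead share 25%: reported cost/visit = $900
```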
In a recent IRS survey of almost 500 nonprofit hospitals, nearly 60 percent reported providing charity care equal to less than 5 percent of their total revenue, and about 20 percent reported providing less than 2 percent. Analyzing data from the American Hospital Directory, The Wall Street Journal found that the 50 largest nonprofit hospitals or hospital systems made a combined “net income” (that is, profit) of $4.27 billion in 2006, nearly eight times their profits five years earlier.
…
Here’s a wonderful example of price opacity. Advocates for the uninsured complain that hospitals charge uninsured patients, on average, 2.5 times the amount charged to insured patients. Hospitals defend themselves by contending that they earn from uninsured patients only 25 percent of the amount they do from insured ones. Both statements appear to be true!
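A quick sketch, again with invented numbers, shows how both claims can hold at once: hospitals bill the uninsured at list prices well above negotiated insurer rates, but collect only a small fraction of those bills.

```python
# Invented illustrative numbers, not from the article: how both claims
# about uninsured pricing can be true simultaneously.
insured_price   = 1_000          # negotiated insurer rate, collected in full
uninsured_bill  = 2_500          # list price billed to the uninsured
collection_rate = 0.10           # assume only 10% of uninsured bills get paid

uninsured_revenue = uninsured_bill * collection_rate
print(uninsured_bill / insured_price)     # 2.5  -> "charges 2.5 times as much"
print(uninsured_revenue / insured_price)  # 0.25 -> "earns only 25 percent"
```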
…
Ten days after my father’s death, the hospital sent my mother a copy of the bill for his five-week stay: $636,687.75. He was charged $11,590 per night for his ICU room; $7,407 per night for a semiprivate room before he was moved to the ICU; $145,432 for drugs; $41,696 for respiratory services. Even the most casual effort to compare these prices to marginal costs or to the costs of off-the-shelf components demonstrates the absurdity of these numbers, but why should my mother care? Her share of the bill was only $992; the balance, undoubtedly at some huge discount, was paid by Medicare.