A key issue on the stage throughout the first two rounds of Democratic debates has been health care reform, with much focus on universal access. The United States is, after all, the "only major country on Earth" that does not offer health care to all of its citizens, as Sen. Bernie Sanders, I-Vt., has put it. But this critical question remains unaddressed: Can universal access to health care really solve the problem of health inequality in the United States?

The answer is no: that is the disappointing lesson learned by most European countries with universal access. Perhaps the most telling example is Britain's National Health Service (NHS), created on the heels of World War II. The NHS's promise of "equal access for equal need" was meant, in part, to tackle the vastly unequal health outcomes and life expectancies of Brits across the socioeconomic spectrum. But over subsequent decades, it became clear that the NHS had left health inequalities basically untouched.

That's because health care systems are designed around the care of the individual. They were never designed to manage some of the most critical causes of poor health, which have socioeconomic origins. Poverty, homelessness, and inadequate housing and nutrition are improved by greater access to resources, not by more doctor visits. In other words, universal access is not enough. To address health inequality, we must revise a fundamental and deep-rooted misunderstanding of what good health is and what it requires.

In the U.S., we owe this misunderstanding chiefly to professional medical organizations, which set about selling a new idea of health care to potential patients in the 1920s. Before then, a motley crew of practitioners dominated medical work in the United States. Their training ranged from a few months of casual lecture attendance to many years of intensive hospital work, and their clinical facilities varied from state-of-the-art hospitals to dilapidated structures packed to the rafters with beds. The public was not especially impressed by the vastly varying practices and outcomes that resulted.

This inconsistency began to decline, though, in the early 20th century. The Carnegie Foundation's famous Flexner Report of 1910 began to push "irregular" medical schools out of existence and standardize medical training. In the 1920s, organizations such as the American College of Surgeons began to do the same for hospitals, establishing a minimum standard that hospitals had to meet to gain the ACS's coveted stamp of approval.

For this new standardized health care to succeed, it needed a more consistent stream of "consumers": That is, these new "regular" doctors needed patients. The ACS partly solved this problem by extracting promises from major industrial organizations to use only ACS-approved hospitals for workers injured on the job. But a question remained: How could doctors expand their patient base and encourage individuals to voluntarily show up to hospitals and trust a stranger to address their health woes?

Customer loyalty had to be built, as industrial titans of the same period, such as Henry Ford, well understood. And building that loyalty is exactly what medical organizations did, in the process creating a new understanding of health as personal, private and individually focused.

In 1923, the American Medical Association launched "Hygeia," a journal that aimed to endear the public to medical practitioners by teaching readers to appreciate how medicine could improve the lives of individuals. Likewise, over this same period, the ACS organized a series of "health rallies" across the country, showcasing its standardized, regulated brand of health care as one that would revolutionize personal well-being.

The result? Personal aches and pains became a doctor's business. Have an "ache in your back"? Go to the doctor. Worried you have cancer? Don't, because it's curable — provided, an ACS representative would prudently add, that a doctor catches it early. And speaking of that, why not see a doctor regularly, just for a checkup? The message: Standardized medicine was individualized medicine, and individualized medicine made you — not your community, your state or your nation — feel better.

These appeals to individuals' health quickly succeeded. And the system worked: Individual health outcomes improved. By spurring the regulation of medical institutions and practices, the ACS could guarantee a decent standard of individual care, one that was also more widely accessible and lifesaving in moments of medical distress.

But at the same time this new standard of care was improving outcomes for some, the movement also promoted a persistent, narrowing vision of what health care could and should consist of. Since then, Americans have internalized the lesson that health care is a private, personal and individual affair to such a degree that medicine seems somehow a panacea for all that ails us.

As a result, when policymakers and politicians contemplate how to make everyone across the socioeconomic spectrum healthier, they reach for a severely limited tool kit. Rather than considering problems as they exist, our society looks to ready-made solutions based on what the health care system, as currently constituted, provides.

The deficiencies of the current system make it nearly inevitable that equality-minded individuals will call for proposals such as Medicare for All. But by tackling only the inequalities embedded within the current system, we leave the larger problems of health inequality intact.

Recognizing the historical contingency that has fostered faith in universal access as the righter of all health care wrongs ought to shake our fixation on individuals' health. More "health care" is not the right remedy for every problem. Instead, policymakers, politicians and voters interested in addressing enduring inequality should seek out the more creative, more thoughtful and more fitting solutions that emerge when we understand how the current, dominant system of health care came to be — mostly for the better but, occasionally, also for the worse — just a century ago.

Caitjan Gainty is a historian of medicine in the department of history at King's College London. She is working on a book that investigates how American medicine's embrace of the industrial values of the early 20th-century factory created the health care system that we know today. She wrote this article for the Washington Post.