If you want to see just how long an academic institution can tolerate a string of slow, festering research scandals, let me invite you to the University of Minnesota, where I teach medical ethics.

Over the past 25 years, our Psychiatry Department has been party to the following disgraces: a felony conviction and a Food and Drug Administration research disqualification for a psychiatrist guilty of fraud in a drug study; the FDA disqualification of another psychiatrist, for enrolling illiterate Hmong refugees in a drug study without their consent; the suspended license of yet another psychiatrist, who was charged with “reckless, if not willful, disregard” for dozens of patients; and, in 2004, the discovery, in a halfway house bathroom, of the near-decapitated corpse of Dan Markingson, a seriously mentally ill young man under an involuntary commitment order who committed suicide after enrolling, over the objections of his mother, in an industry-funded antipsychotic study run by members of the department. And those, unfortunately, are just the highlights.

The problem extends well beyond the Psychiatry Department and into the university administration. Rather than dealing forthrightly with these ethical breaches, university officials have seemed more interested in covering up wrongdoing with a variety of underhanded tactics. The Star Tribune revealed, for example, that in the felony case, university officials hid an internal investigation of the fraud from federal investigators for nearly four years.

I hope the situation at the University of Minnesota is exceptional. But I know that at least one underlying cause of our problems is not limited to us: namely, the antiquated bureaucratic apparatus of institutional review boards, or IRBs, which are supposed to protect subjects of medical experimentation. Indeed, whether other institutions have seen the kinds of abuses that have emerged at the University of Minnesota is difficult to know, precisely because the current research oversight system is inadequate to detect them.

The current IRB system arose in the 1970s. At the time, many reformers believed the main threat to research subjects came from overambitious government and university researchers who might be tempted to overlook the welfare of research subjects.

As a result, the scheme put in place for protecting subjects was not a formal regulatory system but essentially an honor code. Under the IRB system, medical research studies are evaluated — on paper — by a panel of academic volunteers. IRBs do not usually monitor research as it is taking place. They rarely see a research subject or even a researcher face to face. Instead, they simply trust researchers to tell the truth, report mishaps honestly and conduct their studies in the way they claim to be conducting them.

These days, of course, medical research is not just a scholarly affair. It is also a global, multibillion-dollar business enterprise, powered by the pharmaceutical and medical-device industries. The ethical problem today is not merely that these corporations have plenty of money to grease the wheels of university research. It’s also that researchers themselves are often given powerful financial incentives to do unethical things: pressure vulnerable subjects to enroll in studies, fudge diagnoses to recruit otherwise ineligible subjects and keep subjects in studies even when they are doing poorly.

In what other potentially dangerous industry do we rely on an honor code to keep people safe? Imagine if inspectors never actually set foot in meatpacking plants or coal mines, but gave approvals based entirely on paperwork filled out by the owners.

With so much money at stake in drug research, research subjects need a full-blown regulatory system. IRBs should be replaced with oversight bodies that are fully independent — both financially and institutionally — of the research they are overseeing. These bodies must have the staffing and the authority to monitor research on the ground. And they must have the power to punish researchers who break the rules and institutions that cover up wrongdoing.

Here at the University of Minnesota, we have reached a critical point. Two months ago, after two blistering external investigations, university officials finally agreed to suspend recruitment for psychiatric drug studies. Yet they still refuse to admit any serious wrongdoing.

An honor code is a fragile thing. All of the parts have to be in place: pride in the integrity of an institution, vigilant self-policing, a collective sense of shame when the code is violated and a willingness to punish those who break it. At the University of Minnesota, we have very few of those things. And so without sustained, relentless pressure from the outside, I am afraid we are doomed to more of the same.


Carl Elliott, a professor at the Center for Bioethics at the University of Minnesota, is the author of “White Coat, Black Hat: Adventures on the Dark Side of Medicine.” He wrote this article for the New York Times.