U researchers are refining simulators that allow for virtual practice.
Simulators aren’t just for pilots anymore.
In complex cases ranging from enlarged prostates to brain tumors, physicians at the University of Minnesota are increasingly using virtual-reality simulators to perfect their surgical techniques. And, in what may be the most significant change in surgical training since the early 1900s, they are working with local medical device companies to develop new generations of software to train the next generation of medical students.
The researchers hope to build anatomical models so lifelike that medical residents will get hands-on experience and learn from their mistakes without harming patients, said Dr. Robert M. Sweet, director of the U’s Medical School Simulation Programs.
As the technology improves, Sweet said, surgeons will be able to use medical imaging devices like MRIs to create custom, virtual models of their patients’ diseased organs — and eventually practice tricky procedures before ever cutting the patient open.
“Have you ever seen a pitcher not warm up before their first pitch, or a musician not warm up before they go on stage? Never!” Sweet said. “Why would a surgeon be any different?”
An added benefit: Simulators collect data that can be used to research surgical techniques and detect common errors. For instance, a 2011 study using a virtual-reality trainer for laparoscopic surgery found a “hangover effect” — degraded performance by surgeons who had drunk to the point of intoxication the night before an operation.
The team of scientists, physicians and computer experts who are driving the effort say the U and its partners in the Minnesota medical device industry are uniquely positioned to become leaders in the field.
For example, the U developed the software for American Medical Systems, a Minnetonka company whose urology simulator is being adopted globally. And the U is among just 10 centers worldwide whose residents are working to refine their neurosurgery skills with a device under development by the National Research Council of Canada.
Until simulators came along about 15 years ago, the only way for surgeons to get hands-on experience was to cut into living patients under the watchful eye of a mentor.
Sweet, 44, said the idea for simulator training came to him as he was learning prostate surgery during his third year of residency at the University of Washington.
“Being from the video game generation, I thought that there might be a good way to do it with a video game,” he said.
Sweet dropped by the school’s Human Interface Technology Lab, and together they built one. But early simulators were crude compared with those being developed now.
Sweet attributes some of the improvements to information in the U’s “one-of-a-kind” Human Tissues Properties Database.
“When a patient dies, we get consent to harvest little bits of tissue. Not whole organs, just little bits of tissue. And we rapidly run them through tests. Mechanical testing. Electrical testing. Thermal testing. Optical testing. You need to understand the object you’re simulating,” he said.
Sweet oversees the U’s training center, called SimPORTAL, and its research unit, called the Center for Research in Education and Simulation Technologies (CREST). Yunhe Shen, an assistant professor with a background in biomedical engineering, is in charge of developing algorithms that provide users with instant feedback that mimics what surgeons would feel and see if they were operating on a live patient.