Serious Games Adding The Sense Of Touch To Sim Surgery


Via: Rensselaer Polytechnic Institute - Adding the Sense of Touch to Virtual Environments 

Suvranu De, assistant professor of mechanical, aerospace, and nuclear engineering, leads a team that is combining the sense of touch with 3-D computer models of organs to create a new approach to training surgeons: a virtual simulator that lets surgeons touch, feel, and manipulate computer-generated 3-D tissues and organs.

A screen shot simulating smoke generation from cauterization on a model of the stomach.
(Credit: Rensselaer/Suvranu De)

Digital Surgery With Touch Feedback Could Improve Medical Training

Combining the sense of touch with 3-D computer models of organs, Rensselaer researchers are developing a new approach to training surgeons, much as pilots learn to fly on flight simulators. With collaborators at Harvard Medical School, Albany Medical Center, and the Massachusetts Institute of Technology, the team is developing a virtual simulator that will allow surgeons to touch, feel, and manipulate computer-generated organs with actual tool handles used in minimally invasive surgery (MIS).

MIS allows doctors to perform operations through small incisions with long, slender instruments and video cameras, which can result in minimal postoperative pain, less blood loss, lower risk of complications, and a shorter hospital stay. The number of MIS procedures has grown dramatically in recent years, but despite its many advantages, the technique deprives surgeons of the depth perception, dexterity, sense of touch, and hand-eye coordination that they are accustomed to in open surgeries.

“The most important single factor that determines the success of a surgical procedure is the skill of the surgeon,” said Suvranu De, assistant professor of mechanical, aerospace, and nuclear engineering and director of the Advanced Computational Research Lab at Rensselaer.

De and his colleagues at Rensselaer are seeking to improve surgical training by developing a new type of virtual simulator. Based on the science of haptics — the study of sensing through touch — the new simulator will provide an immersive environment for surgeons to touch, feel, and manipulate computer-generated 3-D tissues and organs with tool handles used in actual surgery.

Such a simulator could standardize the assessment of surgical skills and reduce the need for the cadavers and animals currently used in training, according to De.

The work is supported by a four-year, $1.4 million grant from the National Institutes of Health (NIH) that began in the summer of 2006.

Surgical simulators, even more than flight simulators, depend on intense computation. To render realistic touch feedback from a surgical probe moving through soft tissue, the researchers must develop efficient computer models that run about 30 times faster than real-time graphics, solving complex sets of partial differential equations roughly a thousand times per second, De said.
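The article gives only those two figures, but the gap between a roughly 1 kHz haptic loop and a roughly 30 Hz graphics loop can be illustrated with a small sketch. The Python example below is a toy illustration under those assumptions, not the team's actual simulator; the spring-like tissue_force function and all numerical values are hypothetical stand-ins for the tissue-deformation equations described above.

```python
import math

HAPTIC_HZ = 1000       # touch-feedback update rate (about 1 kHz)
GRAPHICS_HZ = 30       # visual update rate (about 30 Hz)
STIFFNESS = 400.0      # N/m, placeholder tissue stiffness (hypothetical)

def tissue_force(penetration_m: float) -> float:
    """Toy spring model standing in for a real PDE-based tissue solve."""
    return STIFFNESS * max(penetration_m, 0.0)

def run(duration_s: float = 0.1) -> None:
    dt = 1.0 / HAPTIC_HZ
    next_frame = 0.0
    t = 0.0
    frames = 0
    haptic_steps = 0
    while t < duration_s:
        # Haptic step: recompute the feedback force every millisecond.
        depth = 0.002 * abs(math.sin(2.0 * math.pi * t))  # fake probe motion
        force = tissue_force(depth)
        haptic_steps += 1

        # Graphics step: redraw only once every 1/30 of a second.
        if t >= next_frame:
            frames += 1
            next_frame += 1.0 / GRAPHICS_HZ

        t += dt
    print(f"{haptic_steps} force updates for {frames} drawn frames "
          f"(last force: {force:.2f} N)")

if __name__ == "__main__":
    run()
```

Running the sketch shows on the order of 30 force updates per drawn frame, which is why the tissue model itself, not the graphics, sets the computational budget.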

The major challenge for current technologies is simulating soft biological tissues, according to De. Such tissues are heterogeneous and viscoelastic, meaning they exhibit characteristics of both solids and liquids, similar to chewing gum or Silly Putty. Surgical procedures such as cutting and cauterizing are almost impossible to simulate with traditional techniques. To overcome these barriers, De's group has developed a new computational tool called the Point-Associated Finite Field (PAFF) approach, which models human tissue as a collection of particles with distinct, overlapping zones of influence that produce coordinated, elastic movements.
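The article does not give the PAFF equations, but the idea of particles with overlapping zones of influence can be sketched in a few lines. The following toy Python example is an assumption-laden illustration, not the actual PAFF formulation: the kernel function, support radius, and one-dimensional strip of particles are all hypothetical, and a real implementation would solve the full elasticity equations in 3-D.

```python
# Toy sketch of a point-based tissue model: each particle has a zone of
# influence that overlaps its neighbours', so pressing on one point produces
# a coordinated, elastic deformation of the surrounding region.

import numpy as np

N = 50                 # particles along a 1-D strip of "tissue"
H = 0.15               # support radius defining each particle's zone of influence
positions = np.linspace(0.0, 1.0, N)   # rest positions of the particles
displacements = np.zeros(N)            # current displacement of each particle

def kernel(r: np.ndarray, h: float) -> np.ndarray:
    """Smooth weight that decays to zero at the edge of the zone of influence."""
    q = np.clip(np.abs(r) / h, 0.0, 1.0)
    return (1.0 - q) ** 2

def poke(point: float, amount: float) -> None:
    """Displace the tissue at `point`; neighbours move in proportion to overlap."""
    w = kernel(positions - point, H)
    displacements[:] += amount * w / w.max()

poke(0.5, 0.02)                          # press the middle of the strip inward
print(np.round(displacements[20:30], 4)) # particles nearest the poke move most
```

Because every particle's zone of influence overlaps those of its neighbours, a disturbance spreads smoothly instead of affecting a single point, which is the qualitative behaviour the PAFF description above is after.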

The team also plans to develop a prototype technology that will be tested by surgeons and surgical residents at the Carl J. Shapiro Simulation and Skills Center at Beth Israel Deaconess Medical Center, a teaching hospital of Harvard Medical School. Researchers at the Human Performance Institute at the University of Texas, Arlington, will assist the team in the validation process.

After developing a successful prototype, De hopes to apply the model to a much wider class of medical procedures. “The grand vision,” he said, “is to develop a palpable human — a giant database of human anatomy that provides real-time interactivity for a variety of uses, from teaching anatomy to evaluating injuries in a variety of scenarios. In the long run, a better simulator could even help in the design of new surgical tools and techniques.”