If you've ever walked through the underpass of Hackerman Hall and wondered what was going on in the mock operating room, what you were seeing is a state-of-the-art robotic surgery facility. Even more impressive is the work being done inside the building, around Hopkins and around the world. Robotically assisted surgery has become an increasingly prominent field in research and in practice, led in part by Hopkins's interdepartmental collaborations with industry.
In a new paper published online last month in The International Journal of Medical Robotics and Computer Assisted Surgery, researchers present a system for minimally invasive surgery that aids surgeons with both haptic and visual feedback. Haptics refers to the sense of touch, specifically the tactile perception of objects. The new work examines the case of prostate cancer, motivated by rising concerns over the effectiveness of robot-assisted surgery for this application.
The paper was written jointly by researchers at Hopkins and at the Robotics and ElectroMechanical Systems division of Intelligent Automation, Inc., located in Rockville, Md.
There are several components to the system. First, a 3D map is generated for surgeons to reference while operating, overlaid with information about the material properties of tissue to help identify tumors. Stereoscopic cameras placed at known positions allow their images to be correlated into a 3D "cloud" of image data. This data set is statistically cleaned and enhanced by a computer and projected for the surgeon to see.
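The core of correlating two camera views into 3D data is stereo triangulation: the same point appears at slightly different horizontal positions in each image, and that offset (disparity) determines depth. A minimal sketch of that calculation follows; the focal length, baseline, and disparity values are illustrative placeholders, not figures from the paper.

```python
# Sketch of stereo triangulation: converting the pixel disparity between two
# calibrated, rectified cameras into depth, the basic step in building a 3D
# point cloud. Camera parameters here are illustrative, not from the paper.

def depth_from_disparity(disparity_px, focal_length_px=800.0, baseline_m=0.05):
    """Depth (in meters) of a point seen by both cameras.

    For rectified stereo pairs, depth = f * B / d, where f is the focal
    length in pixels, B is the distance between camera centers, and d is
    the horizontal pixel offset of the same point in the two images.
    """
    if disparity_px <= 0:
        raise ValueError("point must be visible in both views with positive disparity")
    return focal_length_px * baseline_m / disparity_px

# A feature matched with a 20-pixel disparity:
print(depth_from_disparity(20.0))  # → 2.0 (meters)
```

Repeating this for every matched feature yields the raw point cloud, which is then statistically filtered before display.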
The system also uses force-feedback mechanisms to stop surgeons from entering inoperable regions. To do this, a region where the surgeon should not cut, such as an area too deep within an organ, is first defined on the computer.
The system then creates a "virtual wall" around the area: if the surgeon tries to enter that region, they feel an artificially generated force, like that of a strong spring, resisting motion in that direction. Combined, these subsystems create an operating environment that improves on current practice.
"By integrating computer vision with haptic feedback and graphical displays, we have generated virtual fixtures and augmented reality displays for surgical applications that are very easy and intuitive to use," co-author Allison Okamura wrote in an email to the News-Letter. Okamura is a former professor of Mechanical Engineering at Hopkins and is now at Stanford University.
The next step is further verification of the system's ability to locate tumors. "We would like to compare the effectiveness of the graphical display of hard lumps to manual identification of lumps with the bare finger," Okamura wrote. Once the system is proven, Okamura hopes to construct a stand-alone surgical unit incorporating these new technologies.
"We would like to apply these concepts to a real surgical robot and have surgeons test the system," she wrote. These feedback and visual displays coupled with a range of surgical tools on multiple-degree-of-freedom robotic arms could give surgeons their best capability yet for minimally invasive surgery.