Augmented Reality Visualization During Laparoscopic Radical Prostatectomy

dc.contributor.author: Simpfendoerfer, Tobias
dc.contributor.author: Baumhauer, Matthias
dc.contributor.author: Mueller, Michael
dc.contributor.author: Gutt, Carsten N.
dc.contributor.author: Meinzer, Hans-Peter
dc.contributor.author: Rassweiler, Jens J.
dc.contributor.author: Guven, Selcuk
dc.date.accessioned: 2020-03-26T18:13:50Z
dc.date.available: 2020-03-26T18:13:50Z
dc.date.issued: 2011
dc.department: Selçuk Üniversitesi
dc.description.abstract: Purpose: We present an augmented reality (AR) navigation system that superimposes virtual organ models generated from transrectal ultrasonography (TRUS) onto the real laparoscopic video during radical prostatectomy. By providing this additional information about the underlying anatomy, the system supports surgeons in their intraoperative decisions. This work reports the system's first in vivo application. Materials and Methods: The system uses custom-developed needles with colored heads that are inserted into the prostate as soon as the organ surface is exposed. These navigation aids are segmented once in three-dimensional (3D) TRUS data acquired immediately after needle placement and are then continuously tracked in the laparoscopic video images by the surgical navigation system. The navigation system traces the navigation aids in real time and computes a registration between the TRUS image and the laparoscopic video from the two-dimensional-three-dimensional (2D-3D) point correspondences. With this registration, the system superimposes TRUS-based 3D information on an additional AR monitor placed next to the standard laparoscopic screen. Surgical navigation guidance was provided until the prostate was detached from the rectal wall. Finally, the navigation aids were removed together with the specimen inside the specimen bag. Results: The initial human in vivo application of the surgical navigation system was successful. No complications occurred, the prostate was removed together with the navigation aids, and the system supported the surgeons as intended with a real-time AR visualization. In case of tissue deformation, changes in the spatial configuration of the navigation aids are detected, which prevents the system from displaying an erroneous navigation visualization. Conclusions: The feasibility of the navigation system was demonstrated in this first in vivo application. TRUS information could be superimposed via AR in real time. To demonstrate a benefit for the patient, results from a larger number of trials are needed.
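The abstract describes registering the TRUS volume to the laparoscopic camera from 2D-3D point correspondences of the inserted navigation aids, then projecting TRUS-based models into the video and suppressing the overlay when the aids' spatial configuration changes. The authors' implementation is part of the C++ MITK framework and is not reproduced here; the following is only a minimal Python/OpenCV sketch of the general idea, with hypothetical fiducial coordinates, an assumed pinhole camera calibration, and an arbitrary deformation tolerance.

```python
import numpy as np
import cv2

# --- Hypothetical inputs (illustrative values only, not from the paper) ---
# 3D positions of the colored needle heads segmented in the TRUS volume (mm).
trus_points_3d = np.array([
    [ 10.0, 22.0,   5.0],
    [ -8.0, 18.0,   7.0],
    [  3.0, 30.0,  12.0],
    [ -2.0, 25.0,  -4.0],
    [  6.0, 15.0,   9.0],
    [-11.0, 28.0,   2.0],
], dtype=np.float64)

# Assumed intrinsic calibration of the laparoscope (pinhole model, no distortion).
camera_matrix = np.array([
    [800.0,   0.0, 320.0],
    [  0.0, 800.0, 240.0],
    [  0.0,   0.0,   1.0],
])
dist_coeffs = np.zeros(5)

# For this self-contained sketch, simulate the 2D detections of the needle heads
# in the laparoscopic image by projecting them with a known ("true") pose.
true_rvec = np.array([0.1, -0.2, 0.05])
true_tvec = np.array([5.0, -3.0, 120.0])
image_points_2d, _ = cv2.projectPoints(trus_points_3d, true_rvec, true_tvec,
                                       camera_matrix, dist_coeffs)
image_points_2d = image_points_2d.reshape(-1, 2)

# Registration step: estimate the rigid TRUS-to-camera transform from the
# 2D-3D point correspondences (a perspective-n-point problem).
ok, rvec, tvec = cv2.solvePnP(trus_points_3d, image_points_2d,
                              camera_matrix, dist_coeffs)
assert ok, "pose estimation failed"

# With the recovered pose, any TRUS-segmented structure (e.g. a prostate
# surface model) can be projected into the video frame for the AR overlay.
overlay_2d, _ = cv2.projectPoints(trus_points_3d, rvec, tvec,
                                  camera_matrix, dist_coeffs)

# Consistency check in the spirit of the abstract: if the pairwise distances
# between the navigation aids change relative to the configuration captured at
# TRUS acquisition time, tissue has deformed and the overlay should be hidden.
def pairwise_distances(pts):
    return np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

reference = pairwise_distances(trus_points_3d)
current = pairwise_distances(trus_points_3d)  # live system: re-estimated points
if np.max(np.abs(current - reference)) > 2.0:  # mm tolerance (arbitrary)
    print("Navigation aids moved relative to each other: suppress AR overlay")
else:
    print("Configuration consistent: show AR overlay")
```

The simulated image points and the rigid-registration assumption are simplifications; the paper's system additionally tracks the colored needle heads in the video stream in real time, which this sketch does not attempt.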
dc.description.sponsorship: German Research Foundation (DFG)
dc.description.sponsorship: This work was conducted in part within the setting of the "Research Group 1126: Intelligent Surgery - Development of New Computer-Based Methods for the Future Workplace in Surgery" funded by the German Research Foundation (DFG). The presented software is part of the open-source framework MITK (Medical Imaging Interaction Toolkit, www.mitk.org).
dc.identifier.doi: 10.1089/end.2010.0724
dc.identifier.endpage: 1845
dc.identifier.issn: 0892-7790
dc.identifier.issue: 12
dc.identifier.pmid: 21970336
dc.identifier.scopusquality: Q1
dc.identifier.startpage: 1841
dc.identifier.uri: https://dx.doi.org/10.1089/end.2010.0724
dc.identifier.uri: https://hdl.handle.net/20.500.12395/26181
dc.identifier.volume: 25
dc.identifier.wos: WOS:000298079100005
dc.identifier.wosquality: Q2
dc.indekslendigikaynak: Web of Science
dc.indekslendigikaynak: Scopus
dc.indekslendigikaynak: PubMed
dc.language.iso: en
dc.publisher: MARY ANN LIEBERT INC
dc.relation.ispartof: JOURNAL OF ENDOUROLOGY
dc.relation.publicationcategory: Article - International Peer-Reviewed Journal - Institutional Faculty Member
dc.rights: info:eu-repo/semantics/closedAccess
dc.selcuk: 20240510_oaig
dc.title: Augmented Reality Visualization During Laparoscopic Radical Prostatectomy
dc.type: Article
