Real-Time Segmentation of Non-rigid Surgical Tools Based on Deep Learning and Tracking

Cited by: 69
Authors
Garcia-Peraza-Herrera, Luis C. [1 ]
Li, Wenqi [1 ]
Gruijthuijsen, Caspar [4 ]
Devreker, Alain [4 ]
Attilakos, George [3 ]
Deprest, Jan [5 ]
Vander Poorten, Emmanuel [4 ]
Stoyanov, Danail [2 ]
Vercauteren, Tom [1 ]
Ourselin, Sebastien [1 ]
Affiliations
[1] UCL, CMIC, Translat Imaging Grp, London, England
[2] UCL, CMIC, Surg Robot Vis Grp, London, England
[3] Univ Coll London Hosp, London, England
[4] Katholieke Univ Leuven, Leuven, Belgium
[5] Univ Ziekenhuis Leuven, Leuven, Belgium
Source
COMPUTER-ASSISTED AND ROBOTIC ENDOSCOPY | 2017 / Vol. 10170
Funding
Wellcome Trust (UK); Engineering and Physical Sciences Research Council (UK);
Keywords
APPEARANCE;
DOI
10.1007/978-3-319-54057-3_8
Chinese Library Classification
TP301 [Theory and Methods];
Discipline Code
080201 [Mechanical Manufacturing and Automation];
Abstract
Real-time tool segmentation is an essential component in computer-assisted surgical systems. We propose a novel real-time automatic method based on Fully Convolutional Networks (FCN) and optical flow tracking. Our method exploits the ability of deep neural networks to produce accurate segmentations of highly deformable parts along with the high speed of optical flow. Furthermore, the pre-trained FCN can be fine-tuned on a small amount of medical images without the need to hand-craft features. We validated our method using existing and new benchmark datasets, covering both ex vivo and in vivo real clinical cases where different surgical instruments are employed. Two versions of the method are presented, non-real-time and real-time. The former, using only deep learning, achieves a balanced accuracy of 89.6% on a real clinical dataset, outperforming the (non-real-time) state of the art by 3.8 percentage points. The latter, a combination of deep learning with optical flow tracking, yields an average balanced accuracy of 78.2% across all the validated datasets.
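The real-time variant described in the abstract interleaves slow FCN segmentations with fast optical-flow tracking: between network updates, the current tool mask is warped along the estimated motion field. The snippet below is a minimal illustrative sketch of that mask-propagation step only, not the authors' implementation; the function name `propagate_mask`, the NumPy mask/flow representation, and the toy uniform flow field are all assumptions made for illustration.

```python
import numpy as np

def propagate_mask(mask, flow):
    """Forward-warp a binary tool mask to the next frame using a dense
    optical-flow field, where flow[y, x] = (dx, dy) in pixels.

    Illustrative nearest-neighbour warp; a production tracker would use
    a proper flow estimator (e.g. from a vision library) and smoothing.
    """
    h, w = mask.shape
    out = np.zeros_like(mask)
    ys, xs = np.nonzero(mask)                      # pixels labelled "tool"
    dx = flow[ys, xs, 0]
    dy = flow[ys, xs, 1]
    nx = np.clip(np.round(xs + dx).astype(int), 0, w - 1)
    ny = np.clip(np.round(ys + dy).astype(int), 0, h - 1)
    out[ny, nx] = 1                                # carry labels forward
    return out

# Toy example: a 2-pixel "tool" under uniform rightward motion of 1 px.
mask = np.zeros((4, 4), dtype=np.uint8)
mask[1, 1] = mask[2, 1] = 1
flow = np.zeros((4, 4, 2), dtype=np.float32)
flow[..., 0] = 1.0
next_mask = propagate_mask(mask, flow)
```

In the pipeline sketched by the abstract, `next_mask` would stand in for the segmentation on intermediate frames until the FCN produces its next (slower but more accurate) prediction.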
Pages: 84-95
Page count: 12