Surgical Endoscopy and Other Interventional Techniques
Real-time automatic surgical phase recognition in laparoscopic sigmoidectomy using the convolutional neural network-based deep learning approach

Daichi Kitaguchi1,2,3 · Nobuyoshi Takeshita1,2 · Hiroki Matsuzaki2 · Hiroaki Takano2 · Yohei Owada3 · Tsuyoshi Enomoto3 · Tatsuya Oda3 · Hirohisa Miura4 · Takahiro Yamanashi4 · Masahiko Watanabe4 · Daisuke Sato5 · Yusuke Sugomori5 · Seigo Hara5 · Masaaki Ito1,2

Received: 29 March 2019 / Accepted: 23 November 2019
© Springer Science+Business Media, LLC, part of Springer Nature 2019
Abstract

Background  Automatic surgical workflow recognition is a key component for developing context-aware computer-assisted surgery (CA-CAS) systems. However, automatic surgical phase recognition focused on colorectal surgery has not been reported. We aimed to develop a deep learning model for automatic surgical phase recognition based on laparoscopic sigmoidectomy (Lap-S) videos that could be used for real-time phase recognition, and to clarify the accuracy of automatic surgical phase and action recognition using visual information.

Methods  The dataset contained 71 Lap-S cases. The video data were divided into static image frames at intervals of 1/30 s. Every Lap-S video was manually divided into 11 surgical phases (Phases 0–10), and each surgical action was manually annotated on every frame. The model was generated from the training data and validated on a set of unseen test data. Convolutional neural network (CNN)-based deep learning was used.

Results  The average surgical time was 175 min (± 43 min SD), and the duration of each surgical phase also varied widely between cases. Each surgery started in the first phase (Phase 0) and ended in the last phase (Phase 10), and phase transitions occurred 14 (± 2 SD) times per procedure on average. The accuracy of automatic surgical phase recognition was 91.9%, and the accuracies of automatic surgical action recognition for extracorporeal action and irrigation were 89.4% and 82.5%, respectively. Moreover, the system performed real-time automatic surgical phase recognition at 32 fps.

Conclusions  The CNN-based deep learning approach enabled the recognition of surgical phases and actions in 71 Lap-S cases based on manually annotated data. The system performed automatic surgical phase recognition and automatic target surgical action recognition with high accuracy. Moreover, this study showed the feasibility of real-time automatic surgical phase recognition at a high frame rate.

Keywords  Phase recognition · Real-time automatic recognition · Surgical action recognition · Laparoscopic sigmoidectomy · Convolutional neural network · Deep learning
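As a concrete illustration of the per-frame pipeline described in the Methods (static frames sampled every 1/30 s and classified into one of 11 phases), the following is a minimal sketch. It assumes OpenCV for frame extraction and a torchvision ResNet-50 fine-tuned to 11 classes; the paper's actual backbone, framework, and preprocessing are not stated in the abstract, so these choices are illustrative assumptions only.

```python
# Sketch of a per-frame surgical phase classifier for 11 phases (Phases 0-10).
# Backbone (ResNet-50), libraries (OpenCV, PyTorch/torchvision), and
# preprocessing are assumptions, not details taken from the paper.
import cv2
import torch
import torch.nn as nn
from torchvision import models, transforms

NUM_PHASES = 11  # Phases 0-10

def build_phase_classifier() -> nn.Module:
    """CNN with its final layer replaced by an 11-way phase head."""
    model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
    model.fc = nn.Linear(model.fc.in_features, NUM_PHASES)
    return model

# Standard ImageNet-style preprocessing (assumed).
preprocess = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def extract_frames(video_path: str):
    """Yield static image frames from a laparoscopic video.

    Reading every decoded frame of a 30-fps recording corresponds to the
    1/30-s sampling interval described in the Methods.
    """
    cap = cv2.VideoCapture(video_path)
    try:
        while True:
            ok, frame_bgr = cap.read()
            if not ok:
                break
            # OpenCV decodes to BGR; convert to RGB before feeding the CNN.
            yield cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    finally:
        cap.release()

@torch.no_grad()
def classify_video(video_path: str, model: nn.Module, device: str = "cpu"):
    """Return the predicted phase index (0-10) for each frame of the video."""
    model.eval().to(device)
    phases = []
    for frame in extract_frames(video_path):
        x = preprocess(frame).unsqueeze(0).to(device)
        phases.append(int(model(x).argmax(dim=1)))
    return phases
```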
Automatic surgical workflow recognition is a key component for developing context-aware computer-assisted surgery (CA-CAS) systems [1]. CAS systems play a vital role in current surgical procedures. During surgery, these systems afford the visualization of pre- and intra-operative information about th
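Relatedly, the Results report real-time phase recognition at 32 fps. The sketch below shows one way such per-frame throughput could be measured during live inference; the capture source, device, and the model/preprocess objects (e.g., from the sketch above) are assumptions rather than details from the paper.

```python
# Minimal sketch of real-time per-frame inference with a running frame-rate
# check, in the spirit of the 32-fps figure reported in the Results.
# `model` and `preprocess` are assumed to come from a phase-classifier setup
# such as the sketch above; the capture source and device are placeholders.
import time

import cv2
import torch

@torch.no_grad()
def run_realtime(model, preprocess, capture_source=0, device="cuda",
                 report_every=100):
    """Classify frames from a live source and print the running frame rate."""
    model.eval().to(device)
    cap = cv2.VideoCapture(capture_source)  # capture-card index or file path
    n_frames, t_start = 0, time.perf_counter()
    try:
        while True:
            ok, frame_bgr = cap.read()
            if not ok:
                break
            rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
            x = preprocess(rgb).unsqueeze(0).to(device)
            phase = int(model(x).argmax(dim=1))  # predicted phase index 0-10
            n_frames += 1
            if n_frames % report_every == 0:
                fps = n_frames / (time.perf_counter() - t_start)
                print(f"frame {n_frames}: phase {phase}, {fps:.1f} fps")
    finally:
        cap.release()
```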