Surgical Activity Recognition with Transformer Networks


Kabatas S. N., SARIKAYA D.

29th IEEE Conference on Signal Processing and Communications Applications, SIU 2021, Virtual, Istanbul, Türkiye, 9 - 11 June 2021

  • Publication Type: Conference Paper / Full Text Paper
  • DOI Number: 10.1109/siu53274.2021.9477969
  • Publication City: Virtual, Istanbul
  • Publication Country: Türkiye
  • Keywords: Robot Assisted Surgery (RAS), kinematic data, deep learning, transformer network, surgical gesture recognition
  • Gazi University Affiliated: Yes

Abstract

© 2021 IEEE. Automatic classification and recognition of surgical activities is an important step towards providing feedback in surgical training and preventing adverse events and medical errors in surgeries. Kinematic data recorded from surgical robots contains information about the surgeon's movements. Using this data, we can model and recognize surgical gestures automatically. In this study, the Transformer model, which has shown better performance than Recurrent Neural Networks (RNNs) on time series data, is used to recognize surgical gestures from kinematic data. The model trained in this study is compared with the Long Short-Term Memory (LSTM) model, which is widely used in the literature. The average accuracy of the Transformer model on the JHU-ISI Gesture and Skill Assessment Working Set (JIGSAWS) is 0.77. According to the results, the Transformer model is comparable to state-of-the-art LSTM models, outperforms the LSTM model we developed in this study as a benchmark, and has a lower standard deviation. To our knowledge, our study is the first to use a Transformer model for surgical activity recognition with kinematic data. Our experiments show the promise of Transformer networks in this domain.
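
To illustrate the general setup described in the abstract, the sketch below shows a Transformer encoder classifying surgical gestures per time step from kinematic sequences. It is a minimal illustration under assumed settings, not the authors' architecture: the input dimension (76 kinematic variables per frame, as in the JIGSAWS recordings), the number of gesture classes (15), and all model hyperparameters are assumptions chosen only for the example.

# Minimal sketch (assumptions, not the paper's exact model): a Transformer encoder
# over kinematic sequences with a per-timestep gesture classifier.
import torch
import torch.nn as nn

class GestureTransformer(nn.Module):
    def __init__(self, n_features=76, n_classes=15, d_model=128, n_heads=4,
                 n_layers=2, max_len=2000):
        super().__init__()
        # Project raw kinematic features to the model dimension
        self.input_proj = nn.Linear(n_features, d_model)
        # Learned positional encoding (assumed choice; sinusoidal would also work)
        self.pos_embed = nn.Parameter(torch.zeros(1, max_len, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # One gesture logit vector per time step
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, x):  # x: (batch, time, n_features)
        h = self.input_proj(x) + self.pos_embed[:, :x.size(1)]
        h = self.encoder(h)            # self-attention over the whole sequence
        return self.classifier(h)      # (batch, time, n_classes)

# Example: a batch of 2 sequences, 500 time steps, 76 kinematic variables each
logits = GestureTransformer()(torch.randn(2, 500, 76))
print(logits.shape)  # torch.Size([2, 500, 15])

Unlike an LSTM, which processes the sequence step by step, the encoder attends to all time steps at once, which is the property the abstract credits for its competitive performance on time series data.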