
Volume 11 Issue 12 (December 2024)

S.No.  Title & Authors  Page No.
1

Title: Bi-DLKA Unet: Merging Bi-level Routing Attention and Deformable Large Kernel Attention for Medical Image Segmentation

Authors: Chunfei Liu, Baoshan Sun

Abstract:

In medical image segmentation, deep learning methods are widely recognized for their efficiency, and Transformer-based architectures have proven particularly effective. However, these architectures typically incur high computational complexity and substantial memory cost because the self-attention mechanism computes relationships between all pairs of tokens. Numerous recent studies address this issue by introducing sparse attention mechanisms, but these rely on handcrafted, content-agnostic sparsity patterns and still struggle to capture long-range dependencies accurately. In this study, we propose a novel network architecture, Bi-DLKA Unet, which incorporates dynamic, content-aware sparse attention through Bi-level Routing Attention, thereby allocating computational resources to the local feature regions that contribute most to the predictions. Within the encoder-decoder, we integrate BiFormer blocks designed to enhance semantic information extraction and feature-map resolution restoration. In the skip connections, we introduce the Deformable Large Kernel Attention module, which combines the strengths of large convolutional kernels and deformable convolutions, allowing the model to better extract image features without the high computational cost typical of traditional attention mechanisms. We rigorously evaluated the proposed Bi-DLKA Unet on the publicly available Synapse multi-organ segmentation dataset, where it achieved statistically significant improvements in the Dice coefficient, surpassing state-of-the-art methods such as MISSFormer by 0.51%.
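The core idea behind the Bi-level Routing Attention the abstract describes is a two-stage, content-aware sparsification: tokens are grouped into regions, a coarse region-to-region affinity selects the top-k most relevant regions for each query region, and dense attention is then computed only over tokens gathered from those routed regions. The sketch below is a minimal, hypothetical NumPy illustration of that routing pattern, not the authors' implementation: the learned Q/K/V projections, multi-head structure, and 2D region partitioning of the real BiFormer block are omitted, and the function name and parameters are illustrative only.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bi_level_routing_attention(x, num_regions=4, topk=2):
    """Illustrative sketch of bi-level (coarse-to-fine) routed attention.

    x: (N, C) token features, with N divisible by num_regions.
    Coarse step: route each region to its top-k most related regions.
    Fine step: attend only over tokens gathered from those regions,
    so cost scales with topk/num_regions instead of all N tokens.
    Identity maps stand in for the learned Q/K/V projections.
    """
    N, C = x.shape
    r = N // num_regions                         # tokens per region
    q, k, v = x, x, x                            # placeholder projections
    # Region-level descriptors: mean-pool the tokens in each region.
    qr = q.reshape(num_regions, r, C).mean(axis=1)   # (R, C)
    kr = k.reshape(num_regions, r, C).mean(axis=1)   # (R, C)
    # Coarse affinity between regions, then top-k routing indices.
    affinity = qr @ kr.T                              # (R, R)
    route = np.argsort(-affinity, axis=1)[:, :topk]   # (R, topk)
    out = np.empty_like(x)
    for i in range(num_regions):
        # Gather keys/values only from the routed regions.
        idx = np.concatenate(
            [np.arange(j * r, (j + 1) * r) for j in route[i]])
        kg, vg = k[idx], v[idx]                       # (topk*r, C)
        qi = q[i * r:(i + 1) * r]                     # (r, C)
        # Fine-grained scaled dot-product attention on the gathered set.
        attn = softmax(qi @ kg.T / np.sqrt(C), axis=-1)
        out[i * r:(i + 1) * r] = attn @ vg
    return out
```

Setting topk equal to num_regions recovers dense attention over all tokens, which is the trade-off the paper exploits: smaller topk prunes computation while the routing step keeps the retained attention content-dependent rather than fixed in advance.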

Pages: 1-7