Camouflaged Object Segmentation with Omni Perception
Written by: Wei, Xiaopeng; Yang, Xin; Wang, Yang; Xu, Ke; Zhou, Yunduo; Piao, Haiyin; Mei, Haiyang
Abstract: Camouflaged object segmentation (COS) is a highly challenging task because candidate objects deceptively resemble their noisy backgrounds. Most existing state-of-the-art methods mimic the first-positioning-then-focus mechanism of predators, yet they still fail to position camouflaged objects in cluttered scenes or to delineate their boundaries. The key reason is that these methods lack a comprehensive understanding of the scene when they spot and focus on the objects, so they are easily distracted by local surroundings. An ideal COS model should process local and global information simultaneously, i.e., maintain omni perception of the scene throughout the whole segmentation process. To this end, we propose to learn omni perception for the first-positioning-then-focus COS scheme. Specifically, we propose an omni perception network (OPNet) with two novel modules, i.e., the pyramid positioning module (PPM) and the dual focus module (DFM). They integrate local features and global representations to accurately position the camouflaged objects and to focus on their boundaries, respectively. Extensive experiments demonstrate that our method, which runs at 54 fps, significantly outperforms 15 cutting-edge models on 4 challenging datasets under 4 standard metrics. The code will be made publicly available.
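As a rough illustration of the first-positioning-then-focus idea summarized in the abstract, the sketch below pairs a positioning stage (local features fused with a pooled global context to produce a coarse location map) with a focus stage (boundary refinement from foreground- and background-weighted cues). All module names, layer choices, and tensor shapes are assumptions made for illustration only; this is not the paper's OPNet implementation (PyTorch assumed).

import torch
import torch.nn as nn
import torch.nn.functional as F

class PyramidPositioning(nn.Module):
    # Illustrative stand-in for a positioning module: fuses a local feature
    # map with a pooled global context to predict a coarse location map.
    def __init__(self, channels=64):
        super().__init__()
        self.global_proj = nn.Conv2d(channels, channels, 1)
        self.fuse = nn.Conv2d(channels * 2, channels, 3, padding=1)
        self.predict = nn.Conv2d(channels, 1, 1)

    def forward(self, feat):
        g = F.adaptive_avg_pool2d(feat, 1)           # global representation
        g = self.global_proj(g).expand_as(feat)      # broadcast to map size
        fused = self.fuse(torch.cat([feat, g], dim=1))
        return fused, self.predict(fused)            # features, coarse map

class DualFocus(nn.Module):
    # Illustrative stand-in for a focus module: refines the coarse map from
    # foreground- and background-weighted features jointly, which emphasizes
    # the object boundary region.
    def __init__(self, channels=64):
        super().__init__()
        self.refine = nn.Sequential(
            nn.Conv2d(channels * 2 + 1, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, 1, 1),
        )

    def forward(self, feat, coarse_map):
        attn = torch.sigmoid(coarse_map)
        fg = feat * attn                              # likely-object cues
        bg = feat * (1.0 - attn)                      # surrounding cues
        return self.refine(torch.cat([fg, bg, coarse_map], dim=1))

if __name__ == "__main__":
    feat = torch.randn(1, 64, 44, 44)                 # assumed backbone feature
    positioning, focus = PyramidPositioning(), DualFocus()
    fused, coarse = positioning(feat)
    fine = focus(fused, coarse)
    print(coarse.shape, fine.shape)                   # both (1, 1, 44, 44)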
Language:
English
Theme:
Computer science
Keywords:
perception
Segmentation (computer science)
Camouflage