Bilateral Reference for High-Resolution Dichotomous Image Segmentation

Peng Zheng, Dehong Gao, Deng-Ping Fan, Li Liu, Jorma Laaksonen, Wanli Ouyang, Nicu Sebe

Abstract

We introduce a novel bilateral reference framework (BiRefNet) for high-resolution dichotomous image segmentation (DIS). It comprises two essential components: the localization module (LM) and the reconstruction module (RM) with our proposed bilateral reference (BiRef). The LM aids in object localization using global semantic information. Within the RM, we utilize BiRef for the reconstruction process, where hierarchical patches of images provide the source reference and gradient maps serve as the target reference. These components collaborate to generate the final predicted maps. We also introduce auxiliary gradient supervision to enhance focus on regions with finer details. Furthermore, we outline practical training strategies tailored for DIS to improve map quality and the training process. To validate the general applicability of our approach, we conduct extensive experiments on four tasks to show that BiRefNet exhibits remarkable performance, outperforming task-specific cutting-edge methods across all benchmarks. Our codes are available at https://github.com/ZhengPeng7/BiRefNet.
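
To make the abstract's architecture concrete, below is a minimal, hypothetical sketch of the bilateral-reference idea: a localization module works on downsampled global semantics, and a reconstruction block fuses its features with two references, the high-resolution image patch (source reference) and its gradient map (target reference), while an auxiliary head predicts gradients. Module names, channel sizes, and the Sobel-based gradient are assumptions for illustration, not the official BiRefNet implementation (see the repository linked above for that).

```python
# Hypothetical sketch of the bilateral-reference (BiRef) idea; not the official code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LocalizationModule(nn.Module):
    """Coarse object localization from downsampled, global semantic features."""
    def __init__(self, in_ch=3, feat_ch=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, feat_ch, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, feat_ch, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        self.head = nn.Conv2d(feat_ch, 1, 1)

    def forward(self, x):
        feats = self.encoder(x)
        return feats, self.head(feats)  # semantic features + coarse localization map


class BiRefBlock(nn.Module):
    """Reconstruction step guided by two references:
    - source reference: the high-resolution image patch (original detail),
    - target reference: its gradient map (fine structure to recover)."""
    def __init__(self, feat_ch=64):
        super().__init__()
        self.fuse = nn.Sequential(
            nn.Conv2d(feat_ch + 3 + 1, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.mask_head = nn.Conv2d(feat_ch, 1, 1)
        self.grad_head = nn.Conv2d(feat_ch, 1, 1)  # auxiliary gradient supervision

    def forward(self, feats, image_patch, grad_patch):
        feats = F.interpolate(feats, size=image_patch.shape[-2:],
                              mode="bilinear", align_corners=False)
        fused = self.fuse(torch.cat([feats, image_patch, grad_patch], dim=1))
        return self.mask_head(fused), self.grad_head(fused)


def sobel_gradient(img):
    """Approximate the target reference (gradient map) with a Sobel filter."""
    gray = img.mean(dim=1, keepdim=True)
    kx = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]],
                      device=img.device).view(1, 1, 3, 3)
    ky = kx.transpose(-1, -2)
    gx = F.conv2d(gray, kx, padding=1)
    gy = F.conv2d(gray, ky, padding=1)
    return torch.sqrt(gx ** 2 + gy ** 2 + 1e-6)


if __name__ == "__main__":
    image = torch.rand(1, 3, 512, 512)            # stand-in for a high-resolution input
    lm, rm = LocalizationModule(), BiRefBlock()
    feats, coarse = lm(image)                      # global localization
    mask, grad = rm(feats, image, sobel_gradient(image))
    print(coarse.shape, mask.shape, grad.shape)
```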

Code Repositories

zhengpeng7/birefnet (official, PyTorch)
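
For reference, here is a hedged inference sketch assuming the released weights published under ZhengPeng7/BiRefNet on Hugging Face; the exact entry points, preprocessing, and output format may differ between releases, so the repository README remains the authoritative usage guide.

```python
# Hedged inference sketch; assumes the Hugging Face export of BiRefNet is available.
import torch
from PIL import Image
from torchvision import transforms
from transformers import AutoModelForImageSegmentation

birefnet = AutoModelForImageSegmentation.from_pretrained(
    "ZhengPeng7/BiRefNet", trust_remote_code=True
).eval()

preprocess = transforms.Compose([
    transforms.Resize((1024, 1024)),  # BiRefNet targets high-resolution inputs
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

image = Image.open("example.jpg").convert("RGB")  # hypothetical input path
with torch.no_grad():
    outputs = birefnet(preprocess(image).unsqueeze(0))

# The remote-code model returns a list of side outputs; the last one is assumed
# here to be the final high-resolution logit map.
mask = outputs[-1].sigmoid().squeeze().cpu()
Image.fromarray((mask.numpy() * 255).astype("uint8")).resize(image.size).save("mask.png")
```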

Benchmarks

Benchmark | Methodology | Metrics
(A simplified sketch of how MAE and max F-measure are computed appears after this table.)
camouflaged-object-segmentation-on-camo | BiRefNet
MAE: 0.030
S-Measure: 0.904
Weighted F-Measure: 0.890
camouflaged-object-segmentation-on-chameleon | BiRefNet
MAE: 0.015
S-measure: 0.932
weighted F-measure: 0.914
camouflaged-object-segmentation-on-cod | BiRefNet
MAE: 0.014
S-Measure: 0.913
Weighted F-Measure: 0.874
camouflaged-object-segmentation-on-nc4k | BiRefNet
MAE: 0.023
S-measure: 0.914
weighted F-measure: 0.894
dichotomous-image-segmentation-on-dis-te1 | BiRefNet
E-measure: 0.908
HCE: 106
MAE: 0.038
S-Measure: 0.882
max F-Measure: 0.855
weighted F-measure: 0.814
dichotomous-image-segmentation-on-dis-te2 | BiRefNet
E-measure: 0.935
HCE: 265
MAE: 0.035
S-Measure: 0.904
max F-Measure: 0.898
weighted F-measure: 0.863
dichotomous-image-segmentation-on-dis-te3 | BiRefNet
E-measure: 0.952
HCE: 573
MAE: 0.030
S-Measure: 0.918
max F-Measure: 0.923
weighted F-measure: 0.891
dichotomous-image-segmentation-on-dis-te4 | BiRefNet
E-measure: 0.937
HCE: 2746
MAE: 0.040
S-Measure: 0.898
max F-Measure: 0.900
weighted F-measure: 0.861
dichotomous-image-segmentation-on-dis-vd | BiRefNet
E-measure: 0.928
HCE: 1006
MAE: 0.038
S-Measure: 0.898
max F-Measure: 0.889
weighted F-measure: 0.853
rgb-salient-object-detection-on-davis-s | BiRefNet (DUTS, HRSOD)
F-measure: 0.976
MAE: 0.006
S-measure: 0.973
rgb-salient-object-detection-on-davis-s | BiRefNet (DUTS, HRSOD, UHRSD)
F-measure: 0.979
MAE: 0.006
S-measure: 0.975
rgb-salient-object-detection-on-davis-s | BiRefNet (DUTS)
F-measure: 0.966
MAE: 0.008
S-measure: 0.967
rgb-salient-object-detection-on-davis-s | BiRefNet (HRSOD, UHRSD)
F-measure: 0.980
MAE: 0.006
S-measure: 0.976
rgb-salient-object-detection-on-davis-s | BiRefNet (DUTS, UHRSD)
F-measure: 0.977
MAE: 0.006
S-measure: 0.975
rgb-salient-object-detection-on-hrsod | BiRefNet (DUTS, UHRSD)
MAE: 0.014
S-Measure: 0.959
max F-Measure: 0.958
rgb-salient-object-detection-on-hrsod | BiRefNet (DUTS, HRSOD)
MAE: 0.011
S-Measure: 0.962
max F-Measure: 0.963
rgb-salient-object-detection-on-hrsod | BiRefNet (DUTS)
MAE: 0.014
S-Measure: 0.957
max F-Measure: 0.958
rgb-salient-object-detection-on-hrsod | BiRefNet (HRSOD, UHRSD)
MAE: 0.016
S-Measure: 0.956
max F-Measure: 0.953
rgb-salient-object-detection-on-hrsod | BiRefNet (DUTS, HRSOD, UHRSD)
MAE: 0.013
S-Measure: 0.962
max F-Measure: 0.961
rgb-salient-object-detection-on-uhrsd | BiRefNet (HRSOD, UHRSD)
MAE: 0.019
S-Measure: 0.952
max F-Measure: 0.958
rgb-salient-object-detection-on-uhrsd | BiRefNet (DUTS, UHRSD)
MAE: 0.019
S-Measure: 0.952
max F-Measure: 0.960
rgb-salient-object-detection-on-uhrsd | BiRefNet (DUTS, HRSOD)
MAE: 0.024
S-Measure: 0.937
max F-Measure: 0.942
rgb-salient-object-detection-on-uhrsd | BiRefNet (DUTS, HRSOD, UHRSD)
MAE: 0.016
S-Measure: 0.957
max F-Measure: 0.963
rgb-salient-object-detection-on-uhrsd | BiRefNet (DUTS)
MAE: 0.030
S-Measure: 0.931
max F-Measure: 0.933
salient-object-detection-on-dut-omron | BiRefNet (HRSOD, UHRSD)
F-measure: 0.810
MAE: 0.040
S-Measure: 0.864
Weighted F-Measure: 0.790
mean E-Measure: 0.879
mean F-Measure: 0.801
salient-object-detection-on-dut-omron | BiRefNet (DUTS, UHRSD)
F-measure: 0.837
MAE: 0.036
S-Measure: 0.881
Weighted F-Measure: 0.815
mean E-Measure: 0.896
mean F-Measure: 0.825
salient-object-detection-on-dut-omron | BiRefNet (DUTS, HRSOD)
F-measure: 0.818
MAE: 0.040
S-Measure: 0.868
Weighted F-Measure: 0.800
mean E-Measure: 0.882
mean F-Measure: 0.809
salient-object-detection-on-dut-omron | BiRefNet (DUTS, HRSOD, UHRSD)
F-measure: 0.839
MAE: 0.038
S-Measure: 0.882
Weighted F-Measure: 0.815
mean E-Measure: 0.896
mean F-Measure: 0.825
salient-object-detection-on-dut-omron | BiRefNet (DUTS)
F-measure: 0.813
MAE: 0.040
S-Measure: 0.868
Weighted F-Measure: 0.792
mean E-Measure: 0.878
mean F-Measure: 0.802
salient-object-detection-on-duts-te | BiRefNet (DUTS, UHRSD)
MAE: 0.018
S-Measure: 0.942
Weighted F-Measure: 0.919
max F-measure: 0.942
mean E-Measure: 0.961
mean F-Measure: 0.925
salient-object-detection-on-duts-te | BiRefNet (DUTS)
MAE: 0.019
S-Measure: 0.939
Weighted F-Measure: 0.913
max F-measure: 0.937
mean E-Measure: 0.958
mean F-Measure: 0.919
salient-object-detection-on-duts-te | BiRefNet (DUTS, HRSOD, UHRSD)
MAE: 0.018
S-Measure: 0.944
Weighted F-Measure: 0.920
max F-measure: 0.943
mean E-Measure: 0.962
mean F-Measure: 0.925
salient-object-detection-on-duts-te | BiRefNet (DUTS, HRSOD)
MAE: 0.018
S-Measure: 0.938
Weighted F-Measure: 0.918
max F-measure: 0.935
mean E-Measure: 0.960
mean F-Measure: 0.923
salient-object-detection-on-duts-te | BiRefNet (HRSOD, UHRSD)
MAE: 0.020
S-Measure: 0.933
Weighted F-Measure: 0.907
max F-measure: 0.928
mean E-Measure: 0.954
mean F-Measure: 0.913
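
As referenced above, here is a simplified illustration of two of the reported metrics, MAE and max F-measure, assuming binary ground-truth maps and prediction maps in [0, 1] with the common F-beta convention (beta squared = 0.3). The published numbers come from the standard DIS/SOD evaluation toolboxes, which also implement S-measure, E-measure, weighted F-measure, and HCE.

```python
# Simplified metric sketch (MAE and max F-measure); official evaluations use
# the standard DIS/SOD toolboxes rather than this illustration.
import numpy as np


def mae(pred: np.ndarray, gt: np.ndarray) -> float:
    """Mean absolute error between a prediction map and a binary ground truth."""
    return float(np.abs(pred - gt).mean())


def max_f_measure(pred: np.ndarray, gt: np.ndarray, beta2: float = 0.3) -> float:
    """Maximum F-beta score over 256 uniform binarization thresholds."""
    gt_pos = gt > 0.5
    best = 0.0
    for t in np.linspace(0.0, 1.0, 256):
        binarized = pred >= t
        tp = np.logical_and(binarized, gt_pos).sum()
        precision = tp / (binarized.sum() + 1e-8)
        recall = tp / (gt_pos.sum() + 1e-8)
        f = (1 + beta2) * precision * recall / (beta2 * precision + recall + 1e-8)
        best = max(best, float(f))
    return best


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gt = (rng.random((256, 256)) > 0.7).astype(np.float32)       # synthetic ground truth
    pred = np.clip(gt + 0.1 * rng.standard_normal((256, 256)), 0, 1)
    print(f"MAE: {mae(pred, gt):.3f}, maxF: {max_f_measure(pred, gt):.3f}")
```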
