
Abstract
Salient object detection (SOD) has received considerable attention in recent years, yet it has been studied far less on high-resolution (HR) images. Unfortunately, producing HR images with pixel-level annotations is considerably more time-consuming and labor-intensive than producing low-resolution (LR) images and their annotations. We therefore propose an image pyramid-based SOD framework, the Inverse Saliency Pyramid Reconstruction Network (InSPyReNet), which makes HR predictions without requiring any HR dataset. InSPyReNet is designed to produce a strict image pyramid structure of the saliency map, which allows multiple results to be combined via pyramid-based image blending. For HR prediction, we design a pyramid blending method that synthesizes two image pyramids from different scales of the same image, thereby overcoming the effective receptive field (ERF) discrepancy. Extensive evaluation on public LR and HR SOD benchmarks shows that InSPyReNet surpasses state-of-the-art (SotA) methods on various SOD metrics as well as boundary accuracy.
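The core idea is Laplacian-pyramid-style blending of saliency predictions made at two scales of the same image: the LR-scale prediction supplies the global (coarse) levels, while the HR-scale prediction supplies the fine detail levels. The snippet below is a minimal PyTorch sketch of this general blending scheme, not the authors' implementation; it assumes both predictions have already been resized to the same output resolution and builds the pyramids with average pooling and bilinear upsampling.

```python
import torch
import torch.nn.functional as F

def laplacian_pyramid(x, levels=3):
    """Decompose a map into per-level high-frequency residuals plus the coarsest base."""
    pyr, cur = [], x
    for _ in range(levels):
        down = F.avg_pool2d(cur, kernel_size=2)
        up = F.interpolate(down, size=cur.shape[-2:], mode='bilinear', align_corners=False)
        pyr.append(cur - up)   # detail (Laplacian) residual at this scale
        cur = down
    pyr.append(cur)            # coarsest (Gaussian) base
    return pyr

def reconstruct(pyr):
    """Collapse a Laplacian pyramid back to full resolution by upsample-and-add."""
    cur = pyr[-1]
    for lap in reversed(pyr[:-1]):
        cur = F.interpolate(cur, size=lap.shape[-2:], mode='bilinear', align_corners=False) + lap
    return cur

def blend_pyramids(pyr_lr, pyr_hr, coarse_levels=2):
    """Toy blending rule: global structure from the LR-scale prediction (coarse levels),
    fine boundary detail from the HR-scale prediction (remaining levels)."""
    blended = pyr_hr[:-coarse_levels] + pyr_lr[-coarse_levels:]
    return reconstruct(blended).clamp(0, 1)

# Hypothetical example: two saliency predictions for the same image, resized to a common size.
sal_lr = torch.rand(1, 1, 512, 512)   # prediction from the LR-scale branch, upsampled
sal_hr = torch.rand(1, 1, 512, 512)   # prediction from the HR-scale branch
fused = blend_pyramids(laplacian_pyramid(sal_lr), laplacian_pyramid(sal_hr))
print(fused.shape)  # torch.Size([1, 1, 512, 512])
```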
Code Repositories
- plemeri/transparent-background (official, PyTorch)
- plemeri/inspyrenet (official, PyTorch)
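The plemeri/transparent-background repository packages InSPyReNet for off-the-shelf background removal and is installable via pip. The snippet below is a usage sketch based on that repository's README; the class and method names (Remover, process) and the return type are assumptions that should be checked against the installed version.

```python
# pip install transparent-background   (package from the plemeri/transparent-background repo)
import numpy as np
from PIL import Image
from transparent_background import Remover   # class name per the repo README; verify against your version

remover = Remover()                            # loads a pretrained InSPyReNet checkpoint
img = Image.open('input.jpg').convert('RGB')   # 'input.jpg' is a placeholder path
out = remover.process(img)                     # background-removed result; return type may vary by version
if isinstance(out, np.ndarray):                # some releases return a NumPy array instead of a PIL image
    out = Image.fromarray(out)
out.save('output.png')
```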
Benchmarks
| Benchmark | Method | Metrics |
|---|---|---|
| dichotomous-image-segmentation-on-dis-te1 | InSPyReNet (HR scale) | E-measure: 0.894 HCE: 110 MAE: 0.045 S-Measure: 0.873 max F-Measure: 0.845 weighted F-measure: 0.788 |
| dichotomous-image-segmentation-on-dis-te1 | InSPyReNet | HCE: 148 S-Measure: 0.862 max F-Measure: 0.834 |
| dichotomous-image-segmentation-on-dis-te2 | InSPyReNet (HR scale) | HCE: 255 S-Measure: 0.905 max F-Measure: 0.894 |
| dichotomous-image-segmentation-on-dis-te2 | InSPyReNet | E-measure: 0.925 HCE: 316 MAE: 0.038 S-Measure: 0.893 max F-Measure: 0.881 weighted F-measure: 0.834 |
| dichotomous-image-segmentation-on-dis-te3 | InSPyReNet (HR scale) | E-measure: 0.938 HCE: 522 MAE: 0.034 S-Measure: 0.918 max F-Measure: 0.919 weighted F-measure: 0.871 |
| dichotomous-image-segmentation-on-dis-te3 | InSPyReNet | E-measure: 0.938 HCE: 582 MAE: 0.038 S-Measure: 0.902 max F-Measure: 0.904 weighted F-measure: 0.856 |
| dichotomous-image-segmentation-on-dis-te4 | InSPyReNet | E-measure: 0.926 HCE: 2243 MAE: 0.046 S-Measure: 0.891 max F-Measure: 0.892 weighted F-measure: 0.840 |
| dichotomous-image-segmentation-on-dis-te4 | InSPyReNet (HR scale) | E-measure: 0.926 HCE: 2336 MAE: 0.042 S-Measure: 0.905 max F-Measure: 0.905 weighted F-measure: 0.848 |
| dichotomous-image-segmentation-on-dis-vd | InSPyReNet (HR scale) | HCE: 904 S-Measure: 0.900 max F-Measure: 0.889 |
| dichotomous-image-segmentation-on-dis-vd | InSPyReNet | E-measure: 0.921 HCE: 905 MAE: 0.043 S-Measure: 0.887 max F-Measure: 0.876 weighted F-measure: 0.826 |
| rgb-salient-object-detection-on-davis-s | InSPyReNet (DUTS, HRSOD) | F-measure: 0.976 S-measure: 0.972 mBA: 0.770 |
| rgb-salient-object-detection-on-davis-s | InSPyReNet | F-measure: 0.959 MAE: 0.009 S-measure: 0.962 mBA: 0.743 |
| rgb-salient-object-detection-on-hrsod | InSPyReNet (HRSOD, UHRSD) | MAE: 0.018 S-Measure: 0.956 mBA: 0.771 max F-Measure: 0.956 |
| rgb-salient-object-detection-on-hrsod | InSPyReNet (DUTS, HRSOD) | MAE: 0.014 S-Measure: 0.960 mBA: 0.766 max F-Measure: 0.957 |
| rgb-salient-object-detection-on-hrsod | InSPyReNet | MAE: 0.016 S-Measure: 0.952 mBA: 0.738 max F-Measure: 0.949 |
| rgb-salient-object-detection-on-uhrsd | InSPyReNet (HRSOD, UHRSD) | MAE: 0.020 S-Measure: 0.953 mBA: 0.812 max F-Measure: 0.957 |
| rgb-salient-object-detection-on-uhrsd | InSPyReNet (DUTS, HRSOD) | S-Measure: 0.936 mBA: 0.785 |
| rgb-salient-object-detection-on-uhrsd | InSPyReNet | MAE: 0.029 S-Measure: 0.932 mBA: 0.741 max F-Measure: 0.938 |
| salient-object-detection-on-dut-omron | InSPyReNet | F-measure: 0.832 MAE: 0.045 S-Measure: 0.875 |
| salient-object-detection-on-duts-te | InSPyReNet | MAE: 0.024 S-Measure: 0.931 max F-measure: 0.892 |
| salient-object-detection-on-ecssd | InSPyReNet | F-measure: 0.96 MAE: 0.031 S-Measure: 0.936 |
| salient-object-detection-on-hku-is | InSPyReNet | F-measure: 0.955 MAE: 0.028 S-Measure: 0.944 |
| salient-object-detection-on-pascal-s | InSPyReNet | F-measure: 0.893 MAE: 0.048 S-Measure: 0.893 |
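For reference, the benchmark metrics above follow standard SOD definitions. The sketch below shows the textbook formulas for MAE and max F-measure on a single prediction/ground-truth pair (benchmark toolkits may differ in details such as the number of thresholds); S-measure, E-measure, weighted F-measure, mBA, and HCE involve more elaborate computations and are omitted here.

```python
import numpy as np

def mae(pred, gt):
    """Mean absolute error between a saliency map and a binary mask, both in [0, 1]."""
    return np.abs(pred - gt).mean()

def max_f_measure(pred, gt, beta2=0.3, steps=255):
    """Max F-measure: binarize the prediction at many thresholds and keep the best F-beta score.
    beta2 = beta^2 = 0.3 is the value conventionally used in SOD benchmarks."""
    best = 0.0
    for t in np.linspace(0.0, 1.0, steps):
        binary = pred >= t
        tp = np.logical_and(binary, gt > 0.5).sum()
        precision = tp / (binary.sum() + 1e-8)
        recall = tp / ((gt > 0.5).sum() + 1e-8)
        f = (1 + beta2) * precision * recall / (beta2 * precision + recall + 1e-8)
        best = max(best, f)
    return best

# Hypothetical example with random data; pred and gt are H x W arrays in [0, 1].
pred = np.random.rand(256, 256)
gt = (np.random.rand(256, 256) > 0.5).astype(np.float32)
print(mae(pred, gt), max_f_measure(pred, gt))
```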