Reading Notes: SAFNet

Paper: Synthetic Aperture Radar Image Change Detection via Siamese Adaptive Fusion Network

Paper
Abstract: Synthetic aperture radar (SAR) image change detection is a key yet challenging task in remote sensing image analysis. The task is nontrivial for the following reasons. First, the inherent speckle noise in SAR images inevitably degrades the neural network through the accumulation of gradient errors. Second, the correlations among different levels or scales of feature maps are hard to capture by simple summation or concatenation. To this end, we propose a Siamese adaptive fusion network for SAR image change detection. More specifically, a two-branch CNN is employed to extract high-level semantic features from the multitemporal SAR images. Furthermore, an adaptive fusion module is designed to adaptively combine the multiscale responses of the convolutional layers, so that complementary information is exploited and feature learning for change detection is further improved. In addition, a correlation layer is designed to further explore the correlation between the multitemporal images. Finally, the robust feature representation is fed to a fully connected layer with softmax for classification. Experimental results on four real SAR datasets show that the proposed method performs favorably against several state-of-the-art methods.

Given two co-registered SAR images I1 and I2 captured at different times, the goal is to generate a change map that represents the change information between the two images. The proposed change detection method consists of two steps: first, high-level semantic features of the multitemporal images are extracted by a two-branch network, with a similarity metric used to optimize the feature extraction process; second, a correlation layer is used to integrate the features for classification and change map generation.
[Figure 1]
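To make the pipeline above concrete, here is a minimal PyTorch sketch of the overall idea: two weight-sharing branches extract features from a pair of 7x7 patches, a correlation step combines them, and a fully connected layer with softmax performs the changed/unchanged classification. The class names (SiameseBranch, SAFNetSketch), layer sizes, and the element-wise product used as a stand-in for the correlation layer are my own assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class SiameseBranch(nn.Module):
    """Weight-shared CNN branch for a single 7x7 SAR patch (layer sizes are illustrative)."""
    def __init__(self, in_ch=1):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_ch, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),            # pool to a 32-d descriptor
        )

    def forward(self, x):
        return self.features(x).flatten(1)      # (B, 32)

class SAFNetSketch(nn.Module):
    """Two shared-weight branches + a simple correlation step + softmax classifier."""
    def __init__(self):
        super().__init__()
        self.branch = SiameseBranch()            # same weights applied to I1 and I2 patches
        self.classifier = nn.Linear(32 * 3, 2)   # changed / unchanged

    def forward(self, x1, x2):
        f1, f2 = self.branch(x1), self.branch(x2)
        corr = f1 * f2                           # crude stand-in for the correlation layer
        feats = torch.cat([f1, f2, corr], dim=1)
        return self.classifier(feats)            # logits; softmax is applied in the loss

# usage on a batch of multitemporal 7x7 patch pairs (shapes match the training log below)
model = SAFNetSketch()
logits = model(torch.randn(4, 1, 7, 7), torch.randn(4, 1, 7, 7))   # (4, 2)
```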

A. Feature Extraction and Reliable Sample Generation

B. Two-Branch Adaptive Fusion Network

[Figure 2]
[Figure 3]


[Figure 4]
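The abstract states that the adaptive fusion module combines the multiscale responses of the convolutional layers adaptively rather than by plain summation or concatenation. The sketch below illustrates one way to realize that idea: a small gate predicts one softmax-normalized weight per scale and the resized feature maps are blended with those weights. The AdaptiveFusion class and its gating design are assumptions for illustration, not the module defined in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveFusion(nn.Module):
    """Sketch: adaptively weight multi-scale feature maps instead of summing or
    concatenating them. The gating design is an assumption."""
    def __init__(self, channels, num_scales):
        super().__init__()
        # predict one weight per scale from the pooled, concatenated maps
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels * num_scales, num_scales, kernel_size=1),
        )

    def forward(self, feats):
        # feats: list of (B, C, h_i, w_i) responses from different conv stages
        size = feats[0].shape[-2:]
        feats = [F.interpolate(f, size=size, mode="bilinear", align_corners=False)
                 for f in feats]
        weights = torch.softmax(self.gate(torch.cat(feats, dim=1)), dim=1)  # (B, S, 1, 1)
        return sum(weights[:, i:i + 1] * feats[i] for i in range(len(feats)))

# example: fuse three 32-channel responses of different spatial sizes
fusion = AdaptiveFusion(channels=32, num_scales=3)
maps = [torch.randn(2, 32, s, s) for s in (7, 5, 3)]
fused = fusion(maps)   # (2, 32, 7, 7)
```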

C. Optimization of SAFNet and Change Map Generation
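The notes do not reproduce the exact objective, but the abstract mentions both a similarity measure that guides feature extraction and a softmax classifier. A common way to combine the two is cross-entropy on the classifier output plus a contrastive term on the two branch embeddings; the function below is a hypothetical sketch in that spirit, with the name safnet_loss and the margin/alpha values chosen only for illustration.

```python
import torch.nn.functional as F

def safnet_loss(logits, f1, f2, labels, margin=1.0, alpha=0.1):
    """Hypothetical combined objective: cross-entropy on the softmax classifier
    plus a contrastive similarity term on the two branch embeddings.
    labels: 1 = changed, 0 = unchanged; margin and alpha are assumed values."""
    ce = F.cross_entropy(logits, labels)
    dist = F.pairwise_distance(f1, f2)                     # per-pair feature distance
    # pull unchanged pairs together, push changed pairs beyond the margin
    contrastive = ((1 - labels.float()) * dist.pow(2)
                   + labels.float() * F.relu(margin - dist).pow(2)).mean()
    return ce + alpha * contrastive
```

At inference, every pixel's patch pair is classified and the per-pixel predictions are assembled row by row into the final change map, which is what the "row N handling" messages in the log below correspond to.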

Code

Program
Result figure:
[Figure 5]
Training process

(291, 306, 1)
torch.Size([20220, 1, 7, 7])
torch.Size([5055, 1, 7, 7])
torch.Size([89046, 1, 7, 7])
Creating dataloader
[Epoch: 1]   [loss avg: 62.1554]   [current loss: 0.1116]
98.22
Save model!
[Epoch: 2]   [loss avg: 9.4708]   [current loss: 0.0884]
99.17
Save model!
[Epoch: 3]   [loss avg: 4.9710]   [current loss: 0.0891]
99.39
Save model!
[Epoch: 4]   [loss avg: 3.1971]   [current loss: 0.0264]
99.41
Save model!
[Epoch: 5]   [loss avg: 2.2624]   [current loss: 0.2599]
99.47
Save model!
[Epoch: 6]   [loss avg: 1.7266]   [current loss: 0.1351]
99.49
Save model!
[Epoch: 7]   [loss avg: 1.3886]   [current loss: 0.0148]
99.56
Save model!
[Epoch: 8]   [loss avg: 1.1441]   [current loss: 0.0494]
99.45
[Epoch: 9]   [loss avg: 0.9503]   [current loss: 0.0267]
99.58
Save model!
[Epoch: 10]   [loss avg: 0.8380]   [current loss: 0.0653]
99.60
Save model!
[Epoch: 11]   [loss avg: 0.7409]   [current loss: 0.0200]
99.56
[Epoch: 12]   [loss avg: 0.6333]   [current loss: 0.0115]
99.53
[Epoch: 13]   [loss avg: 0.5591]   [current loss: 0.0396]
99.62
Save model!
[Epoch: 14]   [loss avg: 0.4983]   [current loss: 0.0754]
99.68
Save model!
[Epoch: 15]   [loss avg: 0.4427]   [current loss: 0.0117]
99.62
[Epoch: 16]   [loss avg: 0.3919]   [current loss: 0.0203]
99.66
[Epoch: 17]   [loss avg: 0.3609]   [current loss: 0.0141]
99.68
Save model!
[Epoch: 18]   [loss avg: 0.3322]   [current loss: 0.0634]
99.70
Save model!
[Epoch: 19]   [loss avg: 0.3029]   [current loss: 0.0582]
99.68
[Epoch: 20]   [loss avg: 0.2745]   [current loss: 0.0035]
99.74
Save model!
[Epoch: 21]   [loss avg: 0.2551]   [current loss: 0.0881]
99.76
Save model!
[Epoch: 22]   [loss avg: 0.2339]   [current loss: 0.0085]
99.70
[Epoch: 23]   [loss avg: 0.2133]   [current loss: 0.0436]
99.78
Save model!
[Epoch: 24]   [loss avg: 0.1967]   [current loss: 0.0185]
99.74
[Epoch: 25]   [loss avg: 0.1841]   [current loss: 0.0988]
99.76
[Epoch: 26]   [loss avg: 0.1710]   [current loss: 0.0111]
99.74
[Epoch: 27]   [loss avg: 0.1548]   [current loss: 0.0191]
99.80
Save model!
[Epoch: 28]   [loss avg: 0.1451]   [current loss: 0.0519]
99.80
Save model!
[Epoch: 29]   [loss avg: 0.1375]   [current loss: 0.0782]
99.72
[Epoch: 30]   [loss avg: 0.1281]   [current loss: 0.0093]
99.80
Save model!
[Epoch: 31]   [loss avg: 0.1190]   [current loss: 0.0180]
99.78
[Epoch: 32]   [loss avg: 0.1185]   [current loss: 0.0368]
99.76
[Epoch: 33]   [loss avg: 0.1084]   [current loss: 0.0078]
99.68
[Epoch: 34]   [loss avg: 0.1017]   [current loss: 0.0140]
99.78
[Epoch: 35]   [loss avg: 0.0948]   [current loss: 0.0034]
99.82
Save model!
[Epoch: 36]   [loss avg: 0.0876]   [current loss: 0.0123]
99.82
Save model!
[Epoch: 37]   [loss avg: 0.0840]   [current loss: 0.0113]
99.74
[Epoch: 38]   [loss avg: 0.0842]   [current loss: 0.0258]
99.78
[Epoch: 39]   [loss avg: 0.0755]   [current loss: 0.0229]
99.78
[Epoch: 40]   [loss avg: 0.0726]   [current loss: 0.0021]
99.74
[Epoch: 41]   [loss avg: 0.0684]   [current loss: 0.0217]
99.82
Save model!
[Epoch: 42]   [loss avg: 0.0680]   [current loss: 0.0154]
99.78
[Epoch: 43]   [loss avg: 0.0610]   [current loss: 0.0129]
99.78
[Epoch: 44]   [loss avg: 0.0616]   [current loss: 0.0146]
99.82
Save model!
[Epoch: 45]   [loss avg: 0.0594]   [current loss: 0.0128]
99.82
Save model!
[Epoch: 46]   [loss avg: 0.0576]   [current loss: 0.0030]
99.78
[Epoch: 47]   [loss avg: 0.0520]   [current loss: 0.0830]
99.84
Save model!
[Epoch: 48]   [loss avg: 0.0513]   [current loss: 0.0070]
99.80
[Epoch: 49]   [loss avg: 0.0498]   [current loss: 0.0019]
99.86
Save model!
[Epoch: 50]   [loss avg: 0.0457]   [current loss: 0.0049]
99.80
94.36
The final accuracy is  94.36134132920064
... ... row  0  handling ... ...
... ... row  20  handling ... ...
... ... row  40  handling ... ...
... ... row  60  handling ... ...
... ... row  80  handling ... ...
... ... row  100  handling ... ...
... ... row  120  handling ... ...
... ... row  140  handling ... ...
... ... row  160  handling ... ...
... ... row  180  handling ... ...
... ... row  200  handling ... ...
... ... row  220  handling ... ...
... ... row  240  handling ... ...
... ... row  260  handling ... ...
... ... row  280  handling ... ...
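The log above follows a fixed pattern: per-epoch loss statistics, a validation accuracy, and "Save model!" whenever the accuracy improves, followed by the final test accuracy and the row-by-row change map generation. The loop below is a hypothetical reconstruction of the script that would print such a log; model, train_loader, val_loader, and evaluate() are assumed to exist, and the exact loss-averaging convention of the original script may differ.

```python
import torch

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = torch.nn.CrossEntropyLoss()
best_acc = 0.0

for epoch in range(1, 51):
    model.train()
    loss_sum, n_batches = 0.0, 0
    for x1, x2, y in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(x1, x2), y)
        loss.backward()
        optimizer.step()
        loss_sum += loss.item()
        n_batches += 1
    print(f"[Epoch: {epoch}]   [loss avg: {loss_sum / n_batches:.4f}]   "
          f"[current loss: {loss.item():.4f}]")

    acc = evaluate(model, val_loader)       # accuracy on the validation patches
    print(f"{acc:.2f}")
    if acc > best_acc:                       # keep only the best checkpoint
        best_acc = acc
        torch.save(model.state_dict(), "best_model.pth")
        print("Save model!")
```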

Run calculate_result.m in MATLAB

[Figure 6]
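For SAR change detection, evaluation scripts of this kind typically report false positives, false negatives, overall error, the percentage of correct classification (PCC), and the kappa coefficient. The Python function below is a sketch of those standard metrics, which calculate_result.m presumably computes; the function name and exact output set are assumptions.

```python
import numpy as np

def change_detection_metrics(pred, gt):
    """Standard change-detection metrics on binary maps (1 = changed)."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    fp = np.sum(pred & ~gt)            # false positives
    fn = np.sum(~pred & gt)            # false negatives
    oe = fp + fn                       # overall error
    n = gt.size
    pcc = 1.0 - oe / n                 # percentage of correct classification
    # kappa coefficient
    tp = np.sum(pred & gt)
    tn = np.sum(~pred & ~gt)
    pre = ((tp + fp) * (tp + fn) + (tn + fn) * (tn + fp)) / (n * n)
    kappa = (pcc - pre) / (1.0 - pre)
    return fp, fn, oe, pcc, kappa
```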
