Implementing Classic GANs in PyTorch

Table of Contents

    • GAN
      • Dataset
        • Question:
    • DCGAN
    • LSGAN
    • WGAN
        • Retraining WGAN with n_d set to 5, 3, 1, etc.
    • WGAN-GP

GAN

A GAN consists of two parts: the generator network (Generator) and the discriminator network (Discriminator).

  • The generator (Generator) maps randomly sampled noise z through several linear layers to produce an image. Note that the generator's last layer is a Tanh, so the generated images take values in [-1, 1]; correspondingly, we also normalize the real images to [-1, 1].
  • The discriminator (Discriminator) is a binary classifier: it passes an image through several linear layers to produce the probability that the image is "real" rather than "generated". Hence the last layer of the Discriminator is a sigmoid, which outputs the probability that the image is real.

In all the networks we use LeakyReLU as the activation function, except for the last layers of G and D. We also insert BatchNormalization between layers (in this vanilla GAN, only in the generator).

import torch
import numpy as np
import torch.nn as nn
import torch.optim as optim
import torchvision
import torchvision.transforms as transforms
import matplotlib.pyplot as plt
%matplotlib inline
# Generator
class Generator(nn.Module):
    def __init__(self, image_size=32, latent_dim=100, output_channel=1):
        """
        image_size: image width and height
        latent_dim: the dimension of the random noise z
        output_channel: the channel of generated image, for example, 1 for gray image, 3 for RGB image
        """
        super(Generator, self).__init__()
        self.latent_dim = latent_dim
        self.output_channel = output_channel
        self.image_size = image_size
        
        # Linear layer: latent_dim -> 128 -> 256 -> 512 -> 1024 -> output_channel * image_size * image_size -> Tanh
        self.model = nn.Sequential(
            nn.Linear(latent_dim, 128),
            nn.BatchNorm1d(128),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Linear(128, 256),
            nn.BatchNorm1d(256),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Linear(256, 512),
            nn.BatchNorm1d(512),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Linear(512, 1024),
            nn.BatchNorm1d(1024),
            nn.LeakyReLU(0.2, inplace=True),
            
            nn.Linear(1024, output_channel * image_size * image_size),
            nn.Tanh()
        )

    def forward(self, z):
        img = self.model(z)
        img = img.view(img.size(0), self.output_channel, self.image_size, self.image_size)
        return img

# Discriminator
class Discriminator(nn.Module):
    def __init__(self, image_size=32, input_channel=1):
        """
        image_size: image width and height
        input_channel: the channel of input image, for example, 1 for gray image, 3 for RGB image
        """
        super(Discriminator, self).__init__()
        self.image_size = image_size
        self.input_channel = input_channel
        
        # Linear layer: input_channel * image_size * image_size -> 1024 -> 512 -> 256 -> 1 -> Sigmoid
        self.model = nn.Sequential(
            nn.Linear(input_channel * image_size * image_size, 1024),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Linear(1024, 512),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Linear(512, 256),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Linear(256, 1),
            nn.Sigmoid(),
        )

    def forward(self, img):
        img_flat = img.view(img.size(0), -1)
        out = self.model(img_flat)
        return out

Dataset

Before training our GAN, let's first introduce the datasets available for this experiment. We provide two datasets to experiment with.

  • MNIST handwritten digits (3 classes). To speed up training, we provide a simplified 2-class version of MNIST containing only the digits 0 and 2, with 1000 images per class. The images are 28*28 single-channel grayscale (we resize them to 32*32). For a GAN we do not need a test set. This is the main training dataset in this experiment.

  • Indoor furniture dataset. To speed up training, we trimmed it down to a single class (chairs), 500 images in total. The images are 32*32 3-channel color images.

Below are two functions that load the datasets. Note that we normalize all images to [-1, 1].

torchvision.transforms.Grayscale(num_output_channels=1)

Converts an image to grayscale.

Parameter: num_output_channels (int): the number of output channels (1 or 3)

Returns: a grayscale version of the input image. If num_output_channels=1, the returned image is single-channel; if num_output_channels=3, the returned image has 3 channels with r = g = b

Return type: PIL Image

def load_mnist_data():
    """
    load the 2-class MNIST dataset (digits 0 and 2)
    """
    
    transform = torchvision.transforms.Compose([
        # convert to a 1-channel gray image since we read the images in RGB mode
        transforms.Grayscale(1),
        # resize image from 28 * 28 to 32 * 32
        transforms.Resize(32),
        transforms.ToTensor(),
        # normalize with mean=0.5 std=0.5
        transforms.Normalize(mean=(0.5, ), 
                             std=(0.5, ))
        ])
    
    train_dataset = torchvision.datasets.ImageFolder(root='./data/mnist', transform=transform)
    
    return train_dataset

def load_furniture_data():
    """
    load furniture dataset 
    """
    transform = torchvision.transforms.Compose([
        transforms.ToTensor(),
        # normalize with mean=0.5 std=0.5
        transforms.Normalize(mean=(0.5, 0.5, 0.5), 
                             std=(0.5, 0.5, 0.5))
        ])
    train_dataset = torchvision.datasets.ImageFolder(root='./data/household_furniture', transform=transform)
    return train_dataset

Let's look at 20 random real images from each of the two datasets.

def denorm(x):
    # restore normalized data to the original range
    # denormalize
    # Normalize maps x to y = (x - 0.5) / 0.5 = 2x - 1, so x = (y + 1) / 2
    out = (x + 1) / 2
    # clamp limits the values to the given interval [min, max]
    return out.clamp(0, 1)
def show(img):
    # display an image grid
    npimg = img.numpy()
    # nearest-neighbor interpolation: each output pixel takes the value of the nearest input pixel
    plt.imshow(np.transpose(npimg, (1,2,0)), interpolation='nearest')
    plt.pause(0.05)
# show mnist real data
train_dataset = load_mnist_data()
trainloader = torch.utils.data.DataLoader(train_dataset, batch_size=20, shuffle=True)
show(torchvision.utils.make_grid(denorm(next(iter(trainloader))[0]), nrow=5))
# show furniture real data
train_dataset = load_furniture_data()
trainloader = torch.utils.data.DataLoader(train_dataset, batch_size=20, shuffle=True)
show(torchvision.utils.make_grid(denorm(next(iter(trainloader))[0]), nrow=5))

[real samples from the MNIST dataset]

[real samples from the furniture dataset]

The following code implements one epoch of GAN training.

Roughly speaking, each GAN training step has two parts. First, feed random noise z to G to generate images, then feed both real images and G's generated images to D and backpropagate the corresponding loss to optimize D. Then generate images with G again, feed them to D, and backpropagate the corresponding loss to optimize G.

The optimization objectives of the vanilla GAN for D and G are:

$$\max_D \; \mathbb{E}_{x\sim P_r}\big[\log D(x)\big] + \mathbb{E}_{z\sim P(z)}\big[\log\big(1 - D(G(z))\big)\big]$$

$$\min_G \; \mathbb{E}_{z\sim P(z)}\big[\log\big(1 - D(G(z))\big)\big]$$
Note that the formulas above are the optimization objectives of G and D; in the actual implementation, we realize these objectives through loss functions. Both objectives can be implemented with the Binary Cross Entropy (BCE) loss:

$$BCE(p_i, y_i) = -\big(y_i \log p_i + (1 - y_i)\log(1 - p_i)\big)$$

where $p_i$ and $y_i$ are the model's prediction and the image's true label (1 for real, 0 for fake). For D, maximizing its objective is achieved by minimizing a BCE loss in which real images $x \sim P_r$ are labeled 1 and generated images (from $z \sim P(z)$) are labeled 0 (activating the $(1 - y_i)$ term). This loss function is exactly D's objective with a negative sign.

For G, we likewise minimize a BCE loss, simply labeling the generated images (from $z \sim P(z)$) as 1. This loss is consistent with G's objective: the smaller $-\log D(G(z))$ becomes, the larger $\log D(G(z))$, and hence the smaller $\log(1 - D(G(z)))$.
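Substituting these labels into the BCE loss gives the two losses implemented in the training code below (averaged over a batch):

$$L_D = -\big(\log D(x) + \log(1 - D(G(z)))\big), \qquad L_G = -\log D(G(z))$$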

def train(trainloader, G, D, G_optimizer, D_optimizer, loss_func, device, z_dim):
    """
    train a GAN with model G and D in one epoch
    Args:
        trainloader: data loader to train
        G: model Generator
        D: model Discriminator
        G_optimizer: optimizer of G (e.g. Adam, SGD)
        D_optimizer: optimizer of D (e.g. Adam, SGD)
        loss_func: loss function to train G and D. For example, Binary Cross Entropy(BCE) loss function
        device: cpu or cuda device
        z_dim: the dimension of random noise z
    """
    # set train mode
    D.train()
    G.train()
    
    D_total_loss = 0
    G_total_loss = 0
    
    
    for i, (x, _) in enumerate(trainloader):
        # real label and fake label
        y_real = torch.ones(x.size(0), 1).to(device)
        y_fake = torch.zeros(x.size(0), 1).to(device)
        # a batch of real images
        x = x.to(device)
        # a batch of z_dim-dimensional random noise vectors
        z = torch.rand(x.size(0), z_dim).to(device)

        # update D network
        # D optimizer zero grads
        D_optimizer.zero_grad()
        
        # D real loss from real images
        d_real = D(x)
        d_real_loss = loss_func(d_real, y_real)
        
        # D fake loss from fake images generated by G
        g_z = G(z)
        d_fake = D(g_z)
        d_fake_loss = loss_func(d_fake, y_fake)
        
        # D backward and step
        d_loss = d_real_loss + d_fake_loss
        d_loss.backward()
        D_optimizer.step()

        # update G network
        # G optimizer zero grads
        G_optimizer.zero_grad()
        
        # G loss
        g_z = G(z)
        d_fake = D(g_z)
        g_loss = loss_func(d_fake, y_real)
        
        # G backward and step
        g_loss.backward()
        G_optimizer.step()
        
        D_total_loss += d_loss.item()
        G_total_loss += g_loss.item()
    
    return D_total_loss / len(trainloader), G_total_loss / len(trainloader)

Once the model is trained, we want to inspect the images G generates; the visualize_results code below implements this. Note that the generated images lie in [-1, 1], so we must denormalize (denorm) them back to [0, 1].

def visualize_results(G, device, z_dim, result_size=20):
    G.eval()
    z = torch.rand(result_size, z_dim).to(device)
    g_z = G(z)
    show(torchvision.utils.make_grid(denorm(g_z.detach().cpu()), nrow=5))

Everything is ready, so let's try training a basic GAN. Here the run_gan function calls train and visualize_results to train our GAN.

def run_gan(trainloader, G, D, G_optimizer, D_optimizer, loss_func, n_epochs, device, latent_dim):
    d_loss_hist = []
    g_loss_hist = []

    for epoch in range(n_epochs):
        d_loss, g_loss = train(trainloader, G, D, G_optimizer, D_optimizer, loss_func, device, 
                               z_dim=latent_dim)
        print('Epoch {}: Train D loss: {:.4f}, G loss: {:.4f}'.format(epoch, d_loss, g_loss))

        d_loss_hist.append(d_loss)
        g_loss_hist.append(g_loss)

        if epoch == 0 or (epoch + 1) % 10 == 0:
            visualize_results(G, device, latent_dim) 
    
    return d_loss_hist, g_loss_hist

With the hyperparameters set, we can start training! Let's try it on the 2-class MNIST dataset.

# hyper params

# z dim
latent_dim = 100

# image size and channel
image_size=32
image_channel=1

# Adam lr and betas
learning_rate = 0.0002
betas = (0.5, 0.999)

# epochs and batch size
n_epochs = 100
batch_size = 32

# device : cpu or cuda:0/1/2/3
device = torch.device('cuda:0')

# mnist dataset and dataloader
train_dataset = load_mnist_data()
trainloader = torch.utils.data.DataLoader(train_dataset, batch_size=batch_size, shuffle=True)

# use BCELoss as loss function
bceloss = nn.BCELoss().to(device)

# G and D model
G = Generator(image_size=image_size, latent_dim=latent_dim, output_channel=image_channel).to(device)
D = Discriminator(image_size=image_size, input_channel=image_channel).to(device)

# G and D optimizer, use Adam or SGD
G_optimizer = optim.Adam(G.parameters(), lr=learning_rate, betas=betas)
D_optimizer = optim.Adam(D.parameters(), lr=learning_rate, betas=betas)
d_loss_hist, g_loss_hist = run_gan(trainloader, G, D, G_optimizer, D_optimizer, bceloss, 
                                   n_epochs, device, latent_dim)
    
Epoch 0: Train D loss: 1.1791, G loss: 0.7463

[generated samples]

Epoch 1: Train D loss: 1.2470, G loss: 0.9145
Epoch 2: Train D loss: 1.2323, G loss: 1.0148
Epoch 3: Train D loss: 1.1846, G loss: 1.0341
Epoch 4: Train D loss: 1.1817, G loss: 1.0510
Epoch 5: Train D loss: 1.1295, G loss: 1.1367
Epoch 6: Train D loss: 1.0995, G loss: 1.1915
Epoch 7: Train D loss: 1.0892, G loss: 1.2255
Epoch 8: Train D loss: 1.1462, G loss: 1.1665
Epoch 9: Train D loss: 1.1051, G loss: 1.2254

[generated samples]

Epoch 10: Train D loss: 1.1198, G loss: 1.2484
Epoch 11: Train D loss: 1.1132, G loss: 1.2158
Epoch 12: Train D loss: 1.0997, G loss: 1.2686
Epoch 13: Train D loss: 1.1104, G loss: 1.2343
Epoch 14: Train D loss: 1.1033, G loss: 1.3125
Epoch 15: Train D loss: 1.0886, G loss: 1.3325
Epoch 16: Train D loss: 1.1316, G loss: 1.3003
Epoch 17: Train D loss: 1.1022, G loss: 1.2800
Epoch 18: Train D loss: 1.0752, G loss: 1.3035
Epoch 19: Train D loss: 1.1293, G loss: 1.2889

[generated samples]

Epoch 20: Train D loss: 1.0665, G loss: 1.3355
Epoch 21: Train D loss: 1.1173, G loss: 1.2758
Epoch 22: Train D loss: 1.1187, G loss: 1.2723
Epoch 23: Train D loss: 1.1315, G loss: 1.2449
Epoch 24: Train D loss: 1.1654, G loss: 1.1570
Epoch 25: Train D loss: 1.1272, G loss: 1.2180
Epoch 26: Train D loss: 1.1286, G loss: 1.2215
Epoch 27: Train D loss: 1.1519, G loss: 1.2276
Epoch 28: Train D loss: 1.1280, G loss: 1.2116
Epoch 29: Train D loss: 1.1409, G loss: 1.2337

[generated samples]

Epoch 30: Train D loss: 1.1258, G loss: 1.2557
Epoch 31: Train D loss: 1.1276, G loss: 1.2434
Epoch 32: Train D loss: 1.1216, G loss: 1.2604
Epoch 33: Train D loss: 1.1229, G loss: 1.2451
Epoch 34: Train D loss: 1.1342, G loss: 1.2303
Epoch 35: Train D loss: 1.1206, G loss: 1.2892
Epoch 36: Train D loss: 1.0959, G loss: 1.2749
Epoch 37: Train D loss: 1.1147, G loss: 1.2429
Epoch 38: Train D loss: 1.1647, G loss: 1.2313
Epoch 39: Train D loss: 1.1382, G loss: 1.2000

[generated samples]

Epoch 40: Train D loss: 1.1356, G loss: 1.2547
Epoch 41: Train D loss: 1.1650, G loss: 1.1780
Epoch 42: Train D loss: 1.1426, G loss: 1.1804
Epoch 43: Train D loss: 1.1694, G loss: 1.1753
Epoch 44: Train D loss: 1.1547, G loss: 1.1658
Epoch 45: Train D loss: 1.1547, G loss: 1.1880
Epoch 46: Train D loss: 1.1659, G loss: 1.1827
Epoch 47: Train D loss: 1.1749, G loss: 1.1926
Epoch 48: Train D loss: 1.1759, G loss: 1.1534
Epoch 49: Train D loss: 1.1622, G loss: 1.1594

[generated samples]

Epoch 50: Train D loss: 1.1614, G loss: 1.1997
Epoch 51: Train D loss: 1.1416, G loss: 1.2006
Epoch 52: Train D loss: 1.1615, G loss: 1.1860
Epoch 53: Train D loss: 1.1517, G loss: 1.1677
Epoch 54: Train D loss: 1.1552, G loss: 1.1959
Epoch 55: Train D loss: 1.1523, G loss: 1.2024
Epoch 56: Train D loss: 1.1464, G loss: 1.2215
Epoch 57: Train D loss: 1.1580, G loss: 1.1840
Epoch 58: Train D loss: 1.1540, G loss: 1.2202
Epoch 59: Train D loss: 1.1497, G loss: 1.1969

[generated samples]

Epoch 60: Train D loss: 1.1559, G loss: 1.1503
Epoch 61: Train D loss: 1.1472, G loss: 1.2118
Epoch 62: Train D loss: 1.1392, G loss: 1.2124
Epoch 63: Train D loss: 1.1505, G loss: 1.2236
Epoch 64: Train D loss: 1.1459, G loss: 1.1933
Epoch 65: Train D loss: 1.1541, G loss: 1.2189
Epoch 66: Train D loss: 1.1176, G loss: 1.2387
Epoch 67: Train D loss: 1.1360, G loss: 1.2510
Epoch 68: Train D loss: 1.1477, G loss: 1.2275
Epoch 69: Train D loss: 1.1323, G loss: 1.2271

[generated samples]

Epoch 70: Train D loss: 1.1221, G loss: 1.2880
Epoch 71: Train D loss: 1.1262, G loss: 1.2353
Epoch 72: Train D loss: 1.1372, G loss: 1.2594
Epoch 73: Train D loss: 1.1333, G loss: 1.2870
Epoch 74: Train D loss: 1.1168, G loss: 1.2889
Epoch 75: Train D loss: 1.1332, G loss: 1.2653
Epoch 76: Train D loss: 1.0919, G loss: 1.3116
Epoch 77: Train D loss: 1.1028, G loss: 1.3412
Epoch 78: Train D loss: 1.1301, G loss: 1.2754
Epoch 79: Train D loss: 1.1025, G loss: 1.3306

[generated samples]

Epoch 80: Train D loss: 1.1032, G loss: 1.2856
Epoch 81: Train D loss: 1.1018, G loss: 1.3074
Epoch 82: Train D loss: 1.0834, G loss: 1.3427
Epoch 83: Train D loss: 1.0781, G loss: 1.3659
Epoch 84: Train D loss: 1.0777, G loss: 1.3278
Epoch 85: Train D loss: 1.0783, G loss: 1.3492
Epoch 86: Train D loss: 1.0856, G loss: 1.3677
Epoch 87: Train D loss: 1.0802, G loss: 1.4034
Epoch 88: Train D loss: 1.0590, G loss: 1.3827
Epoch 89: Train D loss: 1.0399, G loss: 1.4526

[generated samples]

Epoch 90: Train D loss: 1.0494, G loss: 1.4073
Epoch 91: Train D loss: 1.0545, G loss: 1.4035
Epoch 92: Train D loss: 1.0442, G loss: 1.4618
Epoch 93: Train D loss: 1.0488, G loss: 1.4411
Epoch 94: Train D loss: 1.0459, G loss: 1.4418
Epoch 95: Train D loss: 1.0192, G loss: 1.4603
Epoch 96: Train D loss: 1.0118, G loss: 1.4805
Epoch 97: Train D loss: 1.0085, G loss: 1.5281
Epoch 98: Train D loss: 0.9990, G loss: 1.5310
Epoch 99: Train D loss: 1.0001, G loss: 1.5491

[generated samples]

After training, let's look at the images G generates. Even this simple GAN produces decent results on such a simple dataset, although plenty of flaws remain; for example, the generated digits show many odd snowflake-like artifacts.

Let's look at the loss curves of G and D (run the code below).

def loss_plot(d_loss_hist, g_loss_hist):
    x = range(len(d_loss_hist))

    plt.plot(x, d_loss_hist, label='D_loss')
    plt.plot(x, g_loss_hist, label='G_loss')

    plt.xlabel('Epoch')
    plt.ylabel('Loss')

    plt.legend(loc=4)
    plt.grid(True)
    plt.tight_layout()

    plt.show()
loss_plot(d_loss_hist, g_loss_hist)

[loss curves]

Question:

Observe the loss curves of G and D. Compared with the loss curves of the CNNs trained earlier, what is different? Briefly explain what you think causes the difference.

Answer:

Once the discriminator is trained well enough, D's loss essentially measures the JS divergence between the real and generated distributions in a high-dimensional space. Since the two distributions have almost no overlap in high dimensions, the JS divergence cannot properly measure the distance between them; the gradients vanish, so the loss decreases slowly and the descent is unstable. G's loss, on the other hand, essentially computes the KL divergence between the generated and real distributions minus twice their JS divergence, and these two terms pull in opposite directions; because the JS divergence was successfully driven down while training the discriminator, G's loss can actually increase. (Reference: https://zhuanlan.zhihu.com/p/25071913 )
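To make this concrete, here is the standard derivation sketch behind the claim (following the analysis in the reference above), assuming an optimal discriminator $D^*(x) = \frac{P_r(x)}{P_r(x) + P_g(x)}$:

$$\mathbb{E}_{x\sim P_g}\big[-\log D^*(x)\big] = KL(P_g \,\|\, P_r) - 2\,JS(P_r \,\|\, P_g) + 2\log 2 + \mathbb{E}_{x\sim P_r}\big[\log D^*(x)\big]$$

The last two terms do not depend on G, so minimizing this loss amounts to minimizing $KL(P_g\|P_r) - 2\,JS(P_r\|P_g)$, which is exactly the contradictory objective described above.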

DCGAN

In DCGAN (Deep Convolutional GAN), the biggest change is replacing the fully connected layers with CNNs. The generator G uses stride-2 transposed convolutions to generate the image while enlarging its spatial size, and the discriminator D uses stride-2 convolutions to downsample the image. In addition, DCGAN inserts BatchNormalization between layers (which we already added in our vanilla GAN), uses ReLU activations in G, and LeakyReLU activations in D.

def initialize_weights(net):
    for m in net.modules():
        if isinstance(m, nn.Conv2d):
            m.weight.data.normal_(0, 0.02)
            m.bias.data.zero_()
        elif isinstance(m, nn.ConvTranspose2d):
            m.weight.data.normal_(0, 0.02)
            m.bias.data.zero_()
        elif isinstance(m, nn.Linear):
            m.weight.data.normal_(0, 0.02)
            m.bias.data.zero_()
            
def show_weights_hist(data):
    plt.hist(data, bins=100, density=True, facecolor="blue", edgecolor="black", alpha=0.7)
    plt.xlabel("weights")
    plt.ylabel("frequency")
    plt.title("D weights")
    plt.show()

nn.ConvTranspose2d(in_channels, out_channels, kernel_size, stride=1, padding=0, output_padding=0, groups=1, bias=True, dilation=1)

  • in_channels (int) - number of channels of the input
  • out_channels (int) - number of channels produced by the convolution
  • kernel_size (int or tuple) - size of the convolution kernel
  • stride (int or tuple, optional) - stride of the convolution, i.e. the factor by which the input is upsampled
  • padding (int or tuple, optional) - implicit zero padding; it reduces the output height and width by 2*padding
  • output_padding (int or tuple, optional) - additional size added to one side of the output; it increases the height and width by output_padding
  • groups (int, optional) - number of blocked connections from input channels to output channels
  • bias (bool, optional) - if bias=True, adds a learnable bias
  • dilation (int or tuple, optional) - spacing between kernel elements
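As a quick sanity check (a minimal snippet, not part of the original code): with kernel_size=4, stride=2, padding=1, the output size is o = s(i-1) - 2p + k = 2(i-1) - 2 + 4 = 2i, so each transposed convolution used below exactly doubles the spatial size.

import torch
import torch.nn as nn

# kernel 4, stride 2, padding 1: o = 2*(4-1) - 2*1 + 4 = 8, i.e. H and W double
deconv = nn.ConvTranspose2d(512, 256, kernel_size=4, stride=2, padding=1)
x = torch.randn(1, 512, 4, 4)
print(deconv(x).shape)  # torch.Size([1, 256, 8, 8])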
class DCGenerator(nn.Module):
    def __init__(self, image_size=32, latent_dim=64, output_channel=1):
        super(DCGenerator, self).__init__()
        self.image_size = image_size
        self.latent_dim = latent_dim
        self.output_channel = output_channel
        
        self.init_size = image_size // 8
        # image_size was divided by 2 three times, so the transposed convolutions double it three times to recover it
        
        # fc: Linear -> BN -> ReLU
        self.fc = nn.Sequential(
            nn.Linear(latent_dim, 512 * self.init_size ** 2),
            nn.BatchNorm1d(512 * self.init_size ** 2),
            nn.ReLU(inplace=True)
        )
        
        # deconv: ConvTranspose2d(4, 2, 1) -> BN -> ReLU -> 
        #         ConvTranspose2d(4, 2, 1) -> BN -> ReLU -> 
        #         ConvTranspose2d(4, 2, 1) -> Tanh
        # output size: o = s(i-1) - 2p + k = 2(i-1) - 2 + 4 = 2i (doubles each layer)
        self.deconv = nn.Sequential(
            nn.ConvTranspose2d(512, 256, 4, stride=2, padding=1),
            nn.BatchNorm2d(256),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1),
            nn.BatchNorm2d(128),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(128, output_channel, 4, stride=2, padding=1),
            nn.Tanh(),
        )
        initialize_weights(self)

    def forward(self, z):
        out = self.fc(z)
        out = out.view(out.shape[0], 512, self.init_size, self.init_size)
        img = self.deconv(out)
        return img


class DCDiscriminator(nn.Module):
    def __init__(self, image_size=32, input_channel=1, sigmoid=True):
        super(DCDiscriminator, self).__init__()
        self.image_size = image_size
        self.input_channel = input_channel
        self.fc_size = image_size // 8
        
        # conv: Conv2d(3,2,1) -> LeakyReLU 
        #       Conv2d(3,2,1) -> BN -> LeakyReLU 
        #       Conv2d(3,2,1) -> BN -> LeakyReLU 
        self.conv = nn.Sequential(
            nn.Conv2d(input_channel, 128, 3, 2, 1),
            nn.LeakyReLU(0.2),
            nn.Conv2d(128, 256, 3, 2, 1),
            nn.BatchNorm2d(256),
            nn.LeakyReLU(0.2),
            nn.Conv2d(256, 512, 3, 2, 1),
            nn.BatchNorm2d(512),
            nn.LeakyReLU(0.2),
        )
        
        # fc: Linear -> Sigmoid
        self.fc = nn.Sequential(
            nn.Linear(512 * self.fc_size * self.fc_size, 1),
        )
        if sigmoid:
            self.fc.add_module('sigmoid', nn.Sigmoid())
        initialize_weights(self)
        
        

    def forward(self, img):
        out = self.conv(img)
        out = out.view(out.shape[0], -1)
        out = self.fc(out)

        return out

We now train DCGAN on the same MNIST dataset.

# hyper params

# z dim
latent_dim = 100

# image size and channel
image_size=32
image_channel=1

# Adam lr and betas
learning_rate = 0.0002
betas = (0.5, 0.999)

# epochs and batch size
n_epochs = 100
batch_size = 32

# device : cpu or cuda:0/1/2/3
device = torch.device('cuda:0')

# mnist dataset and dataloader
train_dataset = load_mnist_data()
trainloader = torch.utils.data.DataLoader(train_dataset, batch_size=batch_size, shuffle=True)

# use BCELoss as loss function
bceloss = nn.BCELoss().to(device)

# G and D model, use DCGAN
G = DCGenerator(image_size=image_size, latent_dim=latent_dim, output_channel=image_channel).to(device)
D = DCDiscriminator(image_size=image_size, input_channel=image_channel).to(device)

# G and D optimizer, use Adam or SGD
G_optimizer = optim.Adam(G.parameters(), lr=learning_rate, betas=betas)
D_optimizer = optim.Adam(D.parameters(), lr=learning_rate, betas=betas)
d_loss_hist, g_loss_hist = run_gan(trainloader, G, D, G_optimizer, D_optimizer, bceloss, 
                                   n_epochs, device, latent_dim)
Epoch 0: Train D loss: 0.3058, G loss: 4.7598

[generated samples]

Epoch 1: Train D loss: 0.2405, G loss: 6.3398
Epoch 2: Train D loss: 0.1296, G loss: 5.1858
Epoch 3: Train D loss: 0.3112, G loss: 5.0476
Epoch 4: Train D loss: 0.2729, G loss: 4.0073
Epoch 5: Train D loss: 0.5681, G loss: 3.3789
Epoch 6: Train D loss: 0.5043, G loss: 2.7180
Epoch 7: Train D loss: 0.6750, G loss: 2.6523
Epoch 8: Train D loss: 0.4656, G loss: 2.4621
Epoch 9: Train D loss: 0.7226, G loss: 2.3971

[generated samples]

Epoch 10: Train D loss: 0.4275, G loss: 2.5623
Epoch 11: Train D loss: 0.5590, G loss: 2.6006
Epoch 12: Train D loss: 0.7402, G loss: 2.3961
Epoch 13: Train D loss: 0.8114, G loss: 1.7695
Epoch 14: Train D loss: 0.5556, G loss: 2.2795
Epoch 15: Train D loss: 0.5343, G loss: 2.3767
Epoch 16: Train D loss: 0.5401, G loss: 2.4362
Epoch 17: Train D loss: 0.6157, G loss: 2.4186
Epoch 18: Train D loss: 0.6249, G loss: 2.4676
Epoch 19: Train D loss: 0.5846, G loss: 2.3182

[generated samples]

Epoch 20: Train D loss: 0.4627, G loss: 2.6083
Epoch 21: Train D loss: 0.5023, G loss: 2.6305
Epoch 22: Train D loss: 0.4536, G loss: 2.6421
Epoch 23: Train D loss: 0.4860, G loss: 2.8043
Epoch 24: Train D loss: 0.4305, G loss: 2.7437
Epoch 25: Train D loss: 0.5044, G loss: 2.8343
Epoch 26: Train D loss: 0.4247, G loss: 2.9332
Epoch 27: Train D loss: 0.3002, G loss: 2.9579
Epoch 28: Train D loss: 0.2449, G loss: 3.3253
Epoch 29: Train D loss: 0.5332, G loss: 3.1328

[generated samples]

Epoch 30: Train D loss: 0.3761, G loss: 3.2301
Epoch 31: Train D loss: 0.3427, G loss: 3.3083
Epoch 32: Train D loss: 0.3705, G loss: 3.4713
Epoch 33: Train D loss: 0.1670, G loss: 3.5174
Epoch 34: Train D loss: 0.3841, G loss: 3.6162
Epoch 35: Train D loss: 0.3142, G loss: 3.6708
Epoch 36: Train D loss: 0.3937, G loss: 3.3291
Epoch 37: Train D loss: 0.1097, G loss: 3.9710
Epoch 38: Train D loss: 0.5398, G loss: 3.4160
Epoch 39: Train D loss: 0.0967, G loss: 4.0094

[generated samples]

Epoch 40: Train D loss: 0.0742, G loss: 4.3840
Epoch 41: Train D loss: 0.6214, G loss: 3.4211
Epoch 42: Train D loss: 0.1369, G loss: 3.9117
Epoch 43: Train D loss: 0.0708, G loss: 4.3135
Epoch 44: Train D loss: 0.0531, G loss: 4.6238
Epoch 45: Train D loss: 0.0587, G loss: 4.7016
Epoch 46: Train D loss: 0.7416, G loss: 3.1067
Epoch 47: Train D loss: 0.2149, G loss: 3.9059
Epoch 48: Train D loss: 0.0592, G loss: 4.4667
Epoch 49: Train D loss: 0.0538, G loss: 4.7051

[generated samples]

Epoch 50: Train D loss: 0.0454, G loss: 4.8818
Epoch 51: Train D loss: 1.1197, G loss: 2.5505
Epoch 52: Train D loss: 0.5909, G loss: 2.8026
Epoch 53: Train D loss: 0.3929, G loss: 3.2649
Epoch 54: Train D loss: 0.3204, G loss: 3.6944
Epoch 55: Train D loss: 0.1150, G loss: 4.2222
Epoch 56: Train D loss: 0.0568, G loss: 4.5714
Epoch 57: Train D loss: 0.0393, G loss: 4.9418
Epoch 58: Train D loss: 0.0329, G loss: 4.9977
Epoch 59: Train D loss: 0.6212, G loss: 3.4983

[generated samples]

Epoch 60: Train D loss: 0.1130, G loss: 4.2528
Epoch 61: Train D loss: 0.0420, G loss: 4.8189
Epoch 62: Train D loss: 0.0340, G loss: 5.1217
Epoch 63: Train D loss: 0.0257, G loss: 5.3322
Epoch 64: Train D loss: 0.0217, G loss: 5.5109
Epoch 65: Train D loss: 0.0229, G loss: 5.4668
Epoch 66: Train D loss: 0.0241, G loss: 5.6308
Epoch 67: Train D loss: 0.0200, G loss: 5.6641
Epoch 68: Train D loss: 1.1368, G loss: 3.4615
Epoch 69: Train D loss: 0.6017, G loss: 2.3565

[generated samples]

Epoch 70: Train D loss: 0.2794, G loss: 3.6240
Epoch 71: Train D loss: 0.4002, G loss: 3.7691
Epoch 72: Train D loss: 0.0903, G loss: 4.4110
Epoch 73: Train D loss: 0.0352, G loss: 4.9389
Epoch 74: Train D loss: 0.0373, G loss: 5.2859
Epoch 75: Train D loss: 0.5994, G loss: 4.0492
Epoch 76: Train D loss: 0.5804, G loss: 2.7458
Epoch 77: Train D loss: 0.3309, G loss: 3.6591
Epoch 78: Train D loss: 0.0637, G loss: 4.3764
Epoch 79: Train D loss: 0.0337, G loss: 4.9487

[generated samples]

Epoch 80: Train D loss: 0.0246, G loss: 5.2609
Epoch 81: Train D loss: 0.0216, G loss: 5.4751
Epoch 82: Train D loss: 0.0177, G loss: 5.6254
Epoch 83: Train D loss: 0.0168, G loss: 5.7373
Epoch 84: Train D loss: 0.0147, G loss: 5.8803
Epoch 85: Train D loss: 0.0149, G loss: 5.9149
Epoch 86: Train D loss: 0.0130, G loss: 6.0219
Epoch 87: Train D loss: 0.0114, G loss: 6.1498
Epoch 88: Train D loss: 0.0098, G loss: 6.1600
Epoch 89: Train D loss: 0.0106, G loss: 6.3201

[generated samples]

Epoch 90: Train D loss: 0.0099, G loss: 6.3109
Epoch 91: Train D loss: 0.8348, G loss: 3.1048
Epoch 92: Train D loss: 0.3101, G loss: 3.6857
Epoch 93: Train D loss: 0.3013, G loss: 3.6529
Epoch 94: Train D loss: 0.0576, G loss: 4.8273
Epoch 95: Train D loss: 0.0280, G loss: 5.2751
Epoch 96: Train D loss: 0.0242, G loss: 5.5182
Epoch 97: Train D loss: 0.0188, G loss: 5.8019
Epoch 98: Train D loss: 0.0140, G loss: 5.9357
Epoch 99: Train D loss: 0.0132, G loss: 6.1122

[generated samples]

loss_plot(d_loss_hist, g_loss_hist)

[loss curves]

As we can see, DCGAN's generated images are noticeably better than those of the purely linear GAN. Next, let's try training DCGAN on the furniture dataset.

# RGB image channel = 3
image_channel=3

# epochs
n_epochs = 300

# furniture dataset and dataloader
train_dataset = load_furniture_data()
trainloader = torch.utils.data.DataLoader(train_dataset, batch_size=batch_size, shuffle=True)

# G and D model, use DCGAN
G = DCGenerator(image_size=image_size, latent_dim=latent_dim, output_channel=image_channel).to(device)
D = DCDiscriminator(image_size=image_size, input_channel=image_channel).to(device)

# G and D optimizer, use Adam or SGD
G_optimizer = optim.Adam(G.parameters(), lr=learning_rate, betas=betas)
D_optimizer = optim.Adam(D.parameters(), lr=learning_rate, betas=betas)

d_loss_hist, g_loss_hist = run_gan(trainloader, G, D, G_optimizer, D_optimizer, bceloss, 
                                   n_epochs, device, latent_dim)
Epoch 0: Train D loss: 0.8721, G loss: 3.5691

[generated samples]

Epoch 1: Train D loss: 0.4196, G loss: 5.5890
Epoch 2: Train D loss: 0.3538, G loss: 6.5387
Epoch 3: Train D loss: 0.1889, G loss: 5.9494
Epoch 4: Train D loss: 0.0938, G loss: 5.0065
Epoch 5: Train D loss: 0.1333, G loss: 5.8194
Epoch 6: Train D loss: 0.3067, G loss: 6.1847
Epoch 7: Train D loss: 0.2274, G loss: 5.6500
Epoch 8: Train D loss: 0.1881, G loss: 6.3358
Epoch 9: Train D loss: 0.2102, G loss: 5.1835

[generated samples]

Epoch 10: Train D loss: 0.3018, G loss: 4.8081
Epoch 11: Train D loss: 0.5444, G loss: 3.7089
Epoch 12: Train D loss: 0.3338, G loss: 3.8421
Epoch 13: Train D loss: 0.1828, G loss: 4.3349
Epoch 14: Train D loss: 0.3425, G loss: 4.4118
Epoch 15: Train D loss: 0.4448, G loss: 4.5843
Epoch 16: Train D loss: 0.3628, G loss: 4.3812
Epoch 17: Train D loss: 0.5384, G loss: 4.9013
Epoch 18: Train D loss: 0.2033, G loss: 4.3718
Epoch 19: Train D loss: 0.3340, G loss: 5.0024

[generated samples]

Epoch 20: Train D loss: 0.3366, G loss: 5.0892
Epoch 21: Train D loss: 0.2694, G loss: 4.7501
Epoch 22: Train D loss: 0.2114, G loss: 4.8818
Epoch 23: Train D loss: 0.4150, G loss: 5.6769
Epoch 24: Train D loss: 0.1959, G loss: 4.7393
Epoch 25: Train D loss: 0.4434, G loss: 6.2860
Epoch 136: Train D loss: 0.3719, G loss: 3.1800
Epoch 137: Train D loss: 0.3459, G loss: 3.3916
Epoch 138: Train D loss: 0.2580, G loss: 3.2348
Epoch 139: Train D loss: 0.2596, G loss: 3.3438

[generated samples]

Epoch 140: Train D loss: 0.3937, G loss: 3.4812
Epoch 141: Train D loss: 0.3171, G loss: 3.2285
Epoch 142: Train D loss: 0.2634, G loss: 3.2658
Epoch 143: Train D loss: 0.2668, G loss: 3.4240
Epoch 144: Train D loss: 0.2951, G loss: 3.3313
Epoch 145: Train D loss: 0.3694, G loss: 3.5713
Epoch 146: Train D loss: 0.3826, G loss: 3.3729
Epoch 147: Train D loss: 0.2454, G loss: 3.4306
Epoch 148: Train D loss: 0.2453, G loss: 3.3822
Epoch 149: Train D loss: 0.2130, G loss: 3.3257

[generated samples]

Epoch 150: Train D loss: 0.2878, G loss: 3.3489
Epoch 151: Train D loss: 0.3113, G loss: 3.5103
Epoch 152: Train D loss: 0.2099, G loss: 3.4619
Epoch 153: Train D loss: 0.1964, G loss: 3.4952
Epoch 154: Train D loss: 0.1842, G loss: 3.5244
Epoch 155: Train D loss: 0.2597, G loss: 3.4327
Epoch 156: Train D loss: 0.2785, G loss: 3.3303
Epoch 157: Train D loss: 0.4728, G loss: 3.8587
Epoch 158: Train D loss: 0.3232, G loss: 3.5044
Epoch 159: Train D loss: 0.1993, G loss: 3.5146

Epoch 160: Train D loss: 0.1683, G loss: 3.4123
Epoch 161: Train D loss: 0.2157, G loss: 3.4506
Epoch 162: Train D loss: 0.1826, G loss: 3.5759
Epoch 163: Train D loss: 0.1776, G loss: 3.4413
Epoch 164: Train D loss: 0.1484, G loss: 3.6285
Epoch 165: Train D loss: 0.1908, G loss: 3.6796
Epoch 166: Train D loss: 0.6024, G loss: 4.2387
Epoch 167: Train D loss: 0.7751, G loss: 3.6685
Epoch 168: Train D loss: 0.2835, G loss: 3.3991
Epoch 169: Train D loss: 0.1918, G loss: 3.3879

Epoch 170: Train D loss: 0.1864, G loss: 3.4544
Epoch 171: Train D loss: 0.1903, G loss: 3.5601
Epoch 172: Train D loss: 0.1863, G loss: 3.6867
Epoch 173: Train D loss: 0.1626, G loss: 3.6987
Epoch 174: Train D loss: 0.2146, G loss: 3.8739
Epoch 175: Train D loss: 0.1788, G loss: 3.5217
Epoch 176: Train D loss: 0.1689, G loss: 3.6545
Epoch 177: Train D loss: 0.1642, G loss: 3.7946
Epoch 178: Train D loss: 0.1430, G loss: 3.7982
Epoch 179: Train D loss: 0.1414, G loss: 3.6554

Epoch 180: Train D loss: 0.1761, G loss: 3.8872
Epoch 181: Train D loss: 0.1618, G loss: 3.7334
Epoch 182: Train D loss: 0.1425, G loss: 3.8345
Epoch 183: Train D loss: 0.1391, G loss: 3.7518
Epoch 184: Train D loss: 0.2055, G loss: 3.9726
Epoch 185: Train D loss: 0.1413, G loss: 3.8630
Epoch 186: Train D loss: 0.1260, G loss: 3.7982
Epoch 187: Train D loss: 0.1130, G loss: 3.9749
Epoch 188: Train D loss: 0.1039, G loss: 3.9076
Epoch 189: Train D loss: 0.1292, G loss: 3.9768

Epoch 190: Train D loss: 1.5367, G loss: 4.4487
Epoch 191: Train D loss: 0.9981, G loss: 4.0753
Epoch 192: Train D loss: 0.4184, G loss: 3.7332
Epoch 193: Train D loss: 0.2579, G loss: 3.6235
Epoch 194: Train D loss: 0.1839, G loss: 3.4661
Epoch 195: Train D loss: 0.1483, G loss: 3.5572
Epoch 196: Train D loss: 0.1427, G loss: 3.5212
Epoch 197: Train D loss: 0.1157, G loss: 3.6499
Epoch 198: Train D loss: 0.1176, G loss: 3.6017
Epoch 199: Train D loss: 0.1132, G loss: 3.7423

[generated samples]

Epoch 200: Train D loss: 0.1187, G loss: 3.7492
Epoch 201: Train D loss: 0.1031, G loss: 3.8457
Epoch 202: Train D loss: 0.1106, G loss: 3.7987
Epoch 203: Train D loss: 0.1017, G loss: 3.8125
Epoch 204: Train D loss: 0.1062, G loss: 3.8884
Epoch 205: Train D loss: 0.0986, G loss: 3.8442
Epoch 206: Train D loss: 0.0988, G loss: 3.9019
Epoch 207: Train D loss: 0.0992, G loss: 3.9403
Epoch 208: Train D loss: 0.0858, G loss: 3.8984
Epoch 209: Train D loss: 0.0870, G loss: 4.0242

Epoch 210: Train D loss: 0.0898, G loss: 4.0558
Epoch 211: Train D loss: 0.0857, G loss: 4.0391
Epoch 212: Train D loss: 0.0936, G loss: 4.1636
Epoch 213: Train D loss: 0.0916, G loss: 4.1066
Epoch 214: Train D loss: 0.0798, G loss: 4.0738
Epoch 215: Train D loss: 0.0766, G loss: 4.0864
Epoch 216: Train D loss: 0.1023, G loss: 4.1615
Epoch 217: Train D loss: 0.0845, G loss: 4.1678
Epoch 218: Train D loss: 0.0939, G loss: 4.1595
Epoch 219: Train D loss: 0.0922, G loss: 4.1458

[generated samples]

Epoch 220: Train D loss: 0.1856, G loss: 4.6638
Epoch 221: Train D loss: 2.2310, G loss: 4.4371
Epoch 222: Train D loss: 0.7489, G loss: 4.3505
Epoch 223: Train D loss: 0.3328, G loss: 3.7174
Epoch 224: Train D loss: 0.2318, G loss: 3.6495
Epoch 225: Train D loss: 0.1736, G loss: 3.6866
Epoch 226: Train D loss: 0.1321, G loss: 3.7295
Epoch 227: Train D loss: 0.1178, G loss: 3.7200
Epoch 228: Train D loss: 0.1053, G loss: 3.8374
Epoch 229: Train D loss: 0.0958, G loss: 3.8854

[generated samples]

Epoch 230: Train D loss: 0.0939, G loss: 3.8658
Epoch 231: Train D loss: 0.0898, G loss: 3.9437
Epoch 232: Train D loss: 0.0867, G loss: 3.9250
Epoch 233: Train D loss: 0.0776, G loss: 4.0148
Epoch 234: Train D loss: 0.0862, G loss: 4.0274
Epoch 235: Train D loss: 0.0749, G loss: 4.1446
Epoch 236: Train D loss: 0.0717, G loss: 4.1305
Epoch 237: Train D loss: 0.0782, G loss: 4.1080
Epoch 238: Train D loss: 0.0732, G loss: 4.1991
Epoch 239: Train D loss: 0.0681, G loss: 4.2126

[generated samples]

Epoch 240: Train D loss: 0.0693, G loss: 4.2196
Epoch 241: Train D loss: 0.0593, G loss: 4.2899
Epoch 242: Train D loss: 0.0661, G loss: 4.1928
Epoch 243: Train D loss: 0.0600, G loss: 4.2429
Epoch 244: Train D loss: 0.0719, G loss: 4.3311
Epoch 245: Train D loss: 0.0623, G loss: 4.1954
Epoch 246: Train D loss: 0.0597, G loss: 4.3905
Epoch 247: Train D loss: 0.0675, G loss: 4.4015
Epoch 248: Train D loss: 0.0623, G loss: 4.3965
Epoch 249: Train D loss: 0.0596, G loss: 4.4253

[generated samples]

Epoch 250: Train D loss: 0.0616, G loss: 4.4121
Epoch 251: Train D loss: 0.0593, G loss: 4.4328
Epoch 252: Train D loss: 0.0592, G loss: 4.7691
Epoch 253: Train D loss: 0.0516, G loss: 4.5762
Epoch 254: Train D loss: 0.0524, G loss: 4.5352
Epoch 255: Train D loss: 0.0527, G loss: 4.5172
Epoch 256: Train D loss: 0.0525, G loss: 4.5847
Epoch 257: Train D loss: 0.0551, G loss: 4.5511
Epoch 258: Train D loss: 0.0553, G loss: 4.5863
Epoch 259: Train D loss: 0.0563, G loss: 4.6744

Epoch 260: Train D loss: 0.0567, G loss: 4.6050
Epoch 261: Train D loss: 0.0520, G loss: 4.6238
Epoch 262: Train D loss: 0.0524, G loss: 4.7029
Epoch 263: Train D loss: 0.0510, G loss: 4.6646
Epoch 264: Train D loss: 0.0504, G loss: 4.7371
Epoch 265: Train D loss: 1.3409, G loss: 5.1950
Epoch 266: Train D loss: 1.8008, G loss: 3.7726
Epoch 267: Train D loss: 0.6327, G loss: 4.3772
Epoch 268: Train D loss: 0.3427, G loss: 3.8794
Epoch 269: Train D loss: 0.1973, G loss: 3.8747

[generated samples]

Epoch 270: Train D loss: 0.1299, G loss: 3.7844
Epoch 271: Train D loss: 0.1153, G loss: 3.7996
Epoch 272: Train D loss: 0.0911, G loss: 4.0014
Epoch 273: Train D loss: 0.0983, G loss: 4.3976
Epoch 274: Train D loss: 0.0821, G loss: 4.0610
Epoch 275: Train D loss: 0.0736, G loss: 4.1553
Epoch 276: Train D loss: 0.0687, G loss: 4.1460
Epoch 277: Train D loss: 0.0681, G loss: 4.1953
Epoch 278: Train D loss: 0.0592, G loss: 4.2686
Epoch 279: Train D loss: 0.0564, G loss: 4.3348

[generated samples]

Epoch 280: Train D loss: 0.0544, G loss: 4.4811
Epoch 281: Train D loss: 0.0558, G loss: 4.4769
Epoch 282: Train D loss: 0.0538, G loss: 4.3963
Epoch 283: Train D loss: 0.0518, G loss: 4.4596
Epoch 284: Train D loss: 0.0478, G loss: 4.4500
Epoch 285: Train D loss: 0.0488, G loss: 4.5309
Epoch 286: Train D loss: 0.0499, G loss: 4.5284
Epoch 287: Train D loss: 0.0463, G loss: 4.5306
Epoch 288: Train D loss: 0.0500, G loss: 4.5908
Epoch 289: Train D loss: 0.0461, G loss: 4.5983

[generated samples]

Epoch 290: Train D loss: 0.0473, G loss: 4.6707
Epoch 291: Train D loss: 0.0398, G loss: 4.6382
Epoch 292: Train D loss: 0.0438, G loss: 4.7286
Epoch 293: Train D loss: 0.0440, G loss: 4.6879
Epoch 294: Train D loss: 0.0424, G loss: 4.7704
Epoch 295: Train D loss: 0.0376, G loss: 4.8003
Epoch 296: Train D loss: 0.0424, G loss: 4.6733
Epoch 297: Train D loss: 0.0367, G loss: 4.8391
Epoch 298: Train D loss: 0.0443, G loss: 4.9735
Epoch 299: Train D loss: 0.0446, G loss: 4.9586

[generated samples]

loss_plot(d_loss_hist, g_loss_hist)

[loss curves]

LSGAN

LSGAN (Least Squares GAN) replaces the GAN loss with an L2 (least-squares) loss. With real images labeled 1 and generated images labeled 0, the optimization objectives of D and G become:

$$\min_D \; \tfrac{1}{2}\,\mathbb{E}_{x\sim P_r}\big[(D(x) - 1)^2\big] + \tfrac{1}{2}\,\mathbb{E}_{z\sim P(z)}\big[D(G(z))^2\big]$$

$$\min_G \; \tfrac{1}{2}\,\mathbb{E}_{z\sim P(z)}\big[\big(D(G(z)) - 1\big)^2\big]$$

class L2Loss(nn.Module):
    def __init__(self):
        super(L2Loss, self).__init__()
    
    def forward(self, input_, target):
        """
        input_: (batch_size, 1) predictions
        target: (batch_size, 1) labels, 1 or 0
        """
        return torch.sum((input_-target)**2)
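As a side note, this L2Loss is equivalent to PyTorch's built-in nn.MSELoss with reduction='sum'; here is a quick check (a minimal sketch, assuming the L2Loss class above has been defined):

import torch
import torch.nn as nn

# L2Loss sums (input - target)^2 over the batch, exactly MSELoss with reduction='sum'
l2 = L2Loss()
mse = nn.MSELoss(reduction='sum')
pred = torch.rand(8, 1)
target = torch.ones(8, 1)
assert torch.allclose(l2(pred, target), mse(pred, target))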

With L2Loss written, we use it to train DCGAN on the MNIST dataset.

# hyper params

# z dim
latent_dim = 100

# image size and channel
image_size=32
image_channel=1

# Adam lr and betas
learning_rate = 0.0002
betas = (0.5, 0.999)

# epochs and batch size
n_epochs = 100
batch_size = 32

# device : cpu or cuda:0/1/2/3
device = torch.device('cuda:0')

# mnist dataset and dataloader
train_dataset = load_mnist_data()
trainloader = torch.utils.data.DataLoader(train_dataset, batch_size=batch_size, shuffle=True)

# use L2Loss as loss function
l2loss = L2Loss().to(device)

# G and D model, use DCGAN
G = DCGenerator(image_size=image_size, latent_dim=latent_dim, output_channel=image_channel).to(device)
D = DCDiscriminator(image_size=image_size, input_channel=image_channel).to(device)

# G and D optimizer, use Adam or SGD
G_optimizer = optim.Adam(G.parameters(), lr=learning_rate, betas=betas)
D_optimizer = optim.Adam(D.parameters(), lr=learning_rate, betas=betas)
d_loss_hist, g_loss_hist = run_gan(trainloader, G, D, G_optimizer, D_optimizer, l2loss, n_epochs, device, 
                                   latent_dim)
loss_plot(d_loss_hist, g_loss_hist)
Epoch 0: Train D loss: 2.5713, G loss: 30.0346

[generated samples]

Epoch 1: Train D loss: 0.0041, G loss: 31.7366
Epoch 2: Train D loss: 0.2194, G loss: 31.5888
Epoch 3: Train D loss: 0.0903, G loss: 30.7643
Epoch 4: Train D loss: 2.3371, G loss: 30.2593
Epoch 5: Train D loss: 0.2688, G loss: 31.6666
Epoch 6: Train D loss: 0.0017, G loss: 31.7426
Epoch 7: Train D loss: 2.9023, G loss: 31.7087
Epoch 8: Train D loss: 0.8844, G loss: 31.6255
Epoch 9: Train D loss: 0.4305, G loss: 31.5096

Epoch 10: Train D loss: 2.6566, G loss: 30.1996
Epoch 11: Train D loss: 8.4883, G loss: 26.8622
Epoch 12: Train D loss: 9.6084, G loss: 22.9372
Epoch 13: Train D loss: 7.3964, G loss: 21.5906
Epoch 14: Train D loss: 7.2310, G loss: 20.5919
Epoch 15: Train D loss: 7.6952, G loss: 20.2568
Epoch 16: Train D loss: 7.6550, G loss: 20.1319
Epoch 17: Train D loss: 7.0190, G loss: 20.4484
Epoch 18: Train D loss: 7.1121, G loss: 20.3034
Epoch 19: Train D loss: 8.0103, G loss: 20.4453

[generated samples]

Epoch 20: Train D loss: 6.7117, G loss: 20.6895
Epoch 21: Train D loss: 6.6646, G loss: 20.7064
Epoch 22: Train D loss: 6.4605, G loss: 21.1456
Epoch 23: Train D loss: 6.0944, G loss: 21.6049
Epoch 24: Train D loss: 5.8689, G loss: 21.8934
Epoch 25: Train D loss: 5.3858, G loss: 22.3251
Epoch 26: Train D loss: 5.1545, G loss: 22.6144
Epoch 27: Train D loss: 4.6969, G loss: 23.2359
Epoch 28: Train D loss: 3.8833, G loss: 23.6799
Epoch 29: Train D loss: 3.3399, G loss: 24.5034

[generated samples]

Epoch 30: Train D loss: 3.6209, G loss: 24.6793
Epoch 31: Train D loss: 2.7142, G loss: 25.9100
Epoch 32: Train D loss: 2.4011, G loss: 26.1022
Epoch 33: Train D loss: 2.6400, G loss: 26.1017
Epoch 34: Train D loss: 1.4426, G loss: 27.0691
Epoch 35: Train D loss: 0.8127, G loss: 27.9928
Epoch 36: Train D loss: 5.5952, G loss: 26.1908
Epoch 37: Train D loss: 2.4507, G loss: 26.7684
Epoch 38: Train D loss: 1.0612, G loss: 27.7660
Epoch 39: Train D loss: 3.2969, G loss: 27.0381

[generated samples]

Epoch 40: Train D loss: 2.5997, G loss: 26.7446
Epoch 41: Train D loss: 1.3786, G loss: 28.0493
Epoch 42: Train D loss: 0.5518, G loss: 28.6288
Epoch 43: Train D loss: 0.5213, G loss: 28.9280
Epoch 44: Train D loss: 0.3842, G loss: 29.0906
Epoch 45: Train D loss: 3.9250, G loss: 26.8266
Epoch 46: Train D loss: 0.7995, G loss: 28.5540
Epoch 47: Train D loss: 2.5561, G loss: 28.2141
Epoch 48: Train D loss: 1.1729, G loss: 28.3446
Epoch 49: Train D loss: 0.2631, G loss: 29.4365

[generated samples]

Epoch 50: Train D loss: 0.2615, G loss: 29.7149
Epoch 51: Train D loss: 14.7312, G loss: 30.3422
Epoch 52: Train D loss: 31.7424, G loss: 31.7460
Epoch 53: Train D loss: 31.7272, G loss: 31.7460
Epoch 54: Train D loss: 15.0306, G loss: 28.9960
Epoch 55: Train D loss: 0.8352, G loss: 29.1654
Epoch 56: Train D loss: 0.3542, G loss: 29.5590
Epoch 57: Train D loss: 0.1783, G loss: 29.9503
Epoch 58: Train D loss: 2.4319, G loss: 29.5885
Epoch 59: Train D loss: 10.6588, G loss: 28.1815

[generated samples]

Epoch 60: Train D loss: 0.7104, G loss: 28.9110
Epoch 61: Train D loss: 0.2065, G loss: 29.6716
Epoch 62: Train D loss: 0.1604, G loss: 30.0357
Epoch 63: Train D loss: 27.3840, G loss: 31.3773
Epoch 64: Train D loss: 31.3364, G loss: 31.4454
Epoch 65: Train D loss: 4.0272, G loss: 28.0788
Epoch 66: Train D loss: 0.3504, G loss: 29.6956
Epoch 67: Train D loss: 0.2134, G loss: 29.8739
Epoch 68: Train D loss: 0.1955, G loss: 29.9440
Epoch 69: Train D loss: 0.0010, G loss: 31.7400

[generated samples]

Epoch 70: Train D loss: 0.0003, G loss: 31.7321
Epoch 71: Train D loss: 0.1174, G loss: 31.0762
Epoch 72: Train D loss: 12.1916, G loss: 30.8252
Epoch 73: Train D loss: 31.6777, G loss: 31.7460
Epoch 74: Train D loss: 14.9836, G loss: 27.6801
Epoch 75: Train D loss: 1.0681, G loss: 29.3340
Epoch 76: Train D loss: 2.5895, G loss: 28.9571
Epoch 77: Train D loss: 0.2752, G loss: 29.8682
Epoch 78: Train D loss: 0.9716, G loss: 29.7243
Epoch 79: Train D loss: 2.5968, G loss: 28.9623

[generated samples]

Epoch 80: Train D loss: 0.1984, G loss: 30.0664
Epoch 81: Train D loss: 0.1467, G loss: 30.2326
Epoch 82: Train D loss: 0.1388, G loss: 30.3244
Epoch 83: Train D loss: 0.0915, G loss: 30.5168
Epoch 84: Train D loss: 0.0928, G loss: 30.5631
Epoch 85: Train D loss: 0.0542, G loss: 30.6784
Epoch 86: Train D loss: 0.0565, G loss: 30.6442
Epoch 87: Train D loss: 0.0479, G loss: 30.6865
Epoch 88: Train D loss: 0.0000, G loss: 31.7448
Epoch 89: Train D loss: 0.0000, G loss: 31.7446

[generated samples]

Epoch 90: Train D loss: 0.0000, G loss: 31.7444
Epoch 91: Train D loss: 0.0000, G loss: 31.7439
Epoch 92: Train D loss: 0.0000, G loss: 31.7431
Epoch 93: Train D loss: 0.0000, G loss: 31.7410
Epoch 94: Train D loss: 0.0000, G loss: 31.7216
Epoch 95: Train D loss: 16.1862, G loss: 26.8639
Epoch 96: Train D loss: 7.4888, G loss: 26.9355
Epoch 97: Train D loss: 0.8308, G loss: 29.2169
Epoch 98: Train D loss: 1.8946, G loss: 29.2336
Epoch 99: Train D loss: 1.0229, G loss: 29.6227

[generated samples]

[loss curves]

WGAN

GANs still suffer from unstable training and mode collapse (which can be understood as extremely low diversity in the generated images; our datasets may not necessarily expose this). WGAN (Wasserstein GAN) replaces the JS divergence that the standard GAN implicitly fits with the Wasserstein distance, which alleviates the instability and mode collapse problems to some extent.
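For reference, the Wasserstein-1 distance between the real distribution $P_r$ and the generated distribution $P_g$ is defined as

$$W(P_r, P_g) = \inf_{\gamma \in \Pi(P_r, P_g)} \mathbb{E}_{(x, y)\sim\gamma}\big[\lVert x - y \rVert\big]$$

where $\Pi(P_r, P_g)$ is the set of all joint distributions whose marginals are $P_r$ and $P_g$; intuitively, it is the minimum cost of transporting the mass of one distribution onto the other.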

The optimization objective of the WGAN discriminator (critic) becomes: subject to a Lipschitz continuity condition (which we can satisfy by restricting the weights w to a fixed range), maximize

$$\mathbb{E}_{x\sim P_r}\big[D(x)\big] - \mathbb{E}_{z\sim P(z)}\big[D(G(z))\big]$$

which then approximates the Wasserstein distance between the real and generated distributions. The loss functions of D and G therefore become:

$$L_D = \mathbb{E}_{z\sim P(z)}\big[D(G(z))\big] - \mathbb{E}_{x\sim P_r}\big[D(x)\big]$$

$$L_G = -\,\mathbb{E}_{z\sim P(z)}\big[D(G(z))\big]$$

Concretely, the implementation of WGAN involves three main changes:

  • the sigmoid is removed from the last layer of the discriminator D
  • the losses of the generator G and the discriminator no longer use log
  • after every update of D, the absolute values of its parameters are clipped to a fixed constant c

We therefore mainly rewrite the WGAN training function. The network architecture here is the DCGAN with the sigmoid removed (note that sigmoid=False is passed when constructing D to drop the final sigmoid layer).

Below is the WGAN implementation. Two parameters are added: n_d, the number of D updates per G update, and weight_clip, the clipping constant.

def wgan_train(trainloader, G, D, G_optimizer, D_optimizer, device, z_dim, n_d=2, weight_clip=0.01):
    
    """
    n_d: the number of iterations of D update per G update iteration
    weight_clip: the weight-clipping constant
    """
    
    D.train()
    G.train()
    
    D_total_loss = 0
    G_total_loss = 0
    
    for i, (x, _) in enumerate(trainloader):
        
        x = x.to(device)
        
        # update D network
        # D optimizer zero grads
        D_optimizer.zero_grad()
        
        # D real loss from real images
        d_real = D(x)
        d_real_loss = - d_real.mean()
        
        # D fake loss from fake images generated by G
        z = torch.rand(x.size(0), z_dim).to(device)
        g_z = G(z)
        d_fake = D(g_z)
        d_fake_loss = d_fake.mean()
        
        # D backward and step
        d_loss = d_real_loss + d_fake_loss
        d_loss.backward()
        D_optimizer.step()
        
        # D weight clip
        for params in D.parameters():
            params.data.clamp_(-weight_clip, weight_clip)
            
        D_total_loss += d_loss.item()

        # update G network
        if (i + 1) % n_d == 0:
            # G optimizer zero grads
            G_optimizer.zero_grad()

            # G loss
            g_z = G(z)
            d_fake = D(g_z)
            g_loss = - d_fake.mean()

            # G backward and step
            g_loss.backward()
            G_optimizer.step()
            
            G_total_loss += g_loss.item()
    
    # G is updated once every n_d iterations, so multiply by n_d to average per G update
    return D_total_loss / len(trainloader), G_total_loss * n_d / len(trainloader)
def run_wgan(trainloader, G, D, G_optimizer, D_optimizer, n_epochs, device, latent_dim, n_d, weight_clip):
    d_loss_hist = []
    g_loss_hist = []

    for epoch in range(n_epochs):
        d_loss, g_loss = wgan_train(trainloader, G, D, G_optimizer, D_optimizer, device, 
                               z_dim=latent_dim, n_d=n_d, weight_clip=weight_clip)
        print('Epoch {}: Train D loss: {:.4f}, G loss: {:.4f}'.format(epoch, d_loss, g_loss))

        d_loss_hist.append(d_loss)
        g_loss_hist.append(g_loss)

        if epoch == 0 or (epoch + 1) % 10 == 0:
            visualize_results(G, device, latent_dim) 
    
    return d_loss_hist, g_loss_hist

Next, let's run run_wgan on our furniture (chair) dataset and see how it performs.

# hyper params

# z dim
latent_dim = 100

# image size and channel
image_size=32
image_channel=3

# Adam lr and betas
learning_rate = 0.0002
betas = (0.5, 0.999)

# epochs and batch size
n_epochs = 300
batch_size = 32

# n_d: the number of iterations of D update per G update iteration
n_d = 2
weight_clip=0.01

# device : cpu or cuda:0/1/2/3
device = torch.device('cuda:0')

# furniture dataset and dataloader
train_dataset = load_furniture_data()
trainloader = torch.utils.data.DataLoader(train_dataset, batch_size=batch_size, shuffle=True)

# G and D model, use DCGAN, note that sigmoid is removed in D
G = DCGenerator(image_size=image_size, latent_dim=latent_dim, output_channel=image_channel).to(device)
D = DCDiscriminator(image_size=image_size, input_channel=image_channel, sigmoid=False).to(device)

# G and D optimizer, use Adam or SGD
G_optimizer = optim.Adam(G.parameters(), lr=learning_rate, betas=betas)
D_optimizer = optim.Adam(D.parameters(), lr=learning_rate, betas=betas)

d_loss_hist, g_loss_hist = run_wgan(trainloader, G, D, G_optimizer, D_optimizer, n_epochs, device, 
                                    latent_dim, n_d, weight_clip)
Epoch 0: Train D loss: -0.0369, G loss: 0.0003

[generated samples]

Epoch 1: Train D loss: -0.0557, G loss: 0.0180
Epoch 2: Train D loss: -0.1094, G loss: 0.0761
Epoch 3: Train D loss: -0.2066, G loss: 0.1717
Epoch 4: Train D loss: -0.3481, G loss: 0.2569
Epoch 5: Train D loss: -0.5345, G loss: 0.3350
Epoch 6: Train D loss: -0.6861, G loss: 0.3968
Epoch 7: Train D loss: -0.8040, G loss: 0.4546
Epoch 8: Train D loss: -0.8856, G loss: 0.4914
Epoch 9: Train D loss: -0.8868, G loss: 0.5160

[generated samples]

Epoch 10: Train D loss: -0.4947, G loss: 0.4029
Epoch 11: Train D loss: -0.4846, G loss: 0.3979
Epoch 12: Train D loss: -0.6611, G loss: 0.4521
Epoch 13: Train D loss: -0.6089, G loss: 0.4519
Epoch 14: Train D loss: -0.6156, G loss: 0.4615
Epoch 15: Train D loss: -0.6843, G loss: 0.4576
Epoch 16: Train D loss: -0.6772, G loss: 0.4790
Epoch 17: Train D loss: -0.7228, G loss: 0.4852
Epoch 18: Train D loss: -0.7127, G loss: 0.4807
Epoch 19: Train D loss: -0.6863, G loss: 0.4887

[generated samples]

Epoch 20: Train D loss: -0.6695, G loss: 0.4762
Epoch 21: Train D loss: -0.7215, G loss: 0.5057
Epoch 22: Train D loss: -0.6358, G loss: 0.4569
Epoch 23: Train D loss: -0.6306, G loss: 0.4669
Epoch 24: Train D loss: -0.7047, G loss: 0.4567
Epoch 25: Train D loss: -0.6638, G loss: 0.4459
Epoch 26: Train D loss: -0.5874, G loss: 0.4428
Epoch 27: Train D loss: -0.7162, G loss: 0.4674
Epoch 28: Train D loss: -0.6620, G loss: 0.4347
Epoch 29: Train D loss: -0.6523, G loss: 0.4242

[generated samples]

Epoch 30: Train D loss: -0.7162, G loss: 0.4577
Epoch 31: Train D loss: -0.6678, G loss: 0.4401
Epoch 32: Train D loss: -0.5838, G loss: 0.3992
Epoch 33: Train D loss: -0.7047, G loss: 0.4644
Epoch 34: Train D loss: -0.6482, G loss: 0.4241
Epoch 35: Train D loss: -0.6864, G loss: 0.4568
Epoch 36: Train D loss: -0.6528, G loss: 0.4220
Epoch 37: Train D loss: -0.6719, G loss: 0.4227
Epoch 38: Train D loss: -0.6081, G loss: 0.4174
Epoch 39: Train D loss: -0.6437, G loss: 0.4257

[generated samples]

Epoch 40: Train D loss: -0.5848, G loss: 0.4267
Epoch 41: Train D loss: -0.6626, G loss: 0.4272
Epoch 42: Train D loss: -0.6374, G loss: 0.3932
Epoch 43: Train D loss: -0.6313, G loss: 0.3741
Epoch 44: Train D loss: -0.6392, G loss: 0.4163
Epoch 45: Train D loss: -0.6255, G loss: 0.4248
Epoch 46: Train D loss: -0.5944, G loss: 0.3938
Epoch 47: Train D loss: -0.6586, G loss: 0.4202
Epoch 48: Train D loss: -0.6064, G loss: 0.3939
Epoch 49: Train D loss: -0.5512, G loss: 0.3669

[generated samples]

Epoch 50: Train D loss: -0.6243, G loss: 0.4187
Epoch 51: Train D loss: -0.5807, G loss: 0.3558
Epoch 52: Train D loss: -0.5851, G loss: 0.3578
Epoch 53: Train D loss: -0.6291, G loss: 0.4053
Epoch 54: Train D loss: -0.5735, G loss: 0.3418
Epoch 55: Train D loss: -0.5539, G loss: 0.3507
Epoch 56: Train D loss: -0.6016, G loss: 0.3790
Epoch 57: Train D loss: -0.5606, G loss: 0.3528
Epoch 58: Train D loss: -0.5744, G loss: 0.3593
Epoch 59: Train D loss: -0.5846, G loss: 0.3592

[generated samples]

Epoch 60: Train D loss: -0.5997, G loss: 0.3672
Epoch 61: Train D loss: -0.5742, G loss: 0.3324
Epoch 62: Train D loss: -0.5576, G loss: 0.3376
Epoch 63: Train D loss: -0.5752, G loss: 0.3463
Epoch 64: Train D loss: -0.5768, G loss: 0.3539
Epoch 65: Train D loss: -0.5477, G loss: 0.3231
Epoch 66: Train D loss: -0.6136, G loss: 0.3773
Epoch 67: Train D loss: -0.5167, G loss: 0.3272
Epoch 68: Train D loss: -0.6060, G loss: 0.3476
Epoch 69: Train D loss: -0.5631, G loss: 0.3674

[generated samples]

Epoch 70: Train D loss: -0.5560, G loss: 0.3200
Epoch 71: Train D loss: -0.5753, G loss: 0.3563
Epoch 72: Train D loss: -0.5690, G loss: 0.3248
Epoch 73: Train D loss: -0.5598, G loss: 0.3016
Epoch 74: Train D loss: -0.5364, G loss: 0.2877
Epoch 75: Train D loss: -0.5405, G loss: 0.3384
Epoch 76: Train D loss: -0.5291, G loss: 0.2984
Epoch 77: Train D loss: -0.5375, G loss: 0.3141
Epoch 78: Train D loss: -0.5082, G loss: 0.2881
Epoch 79: Train D loss: -0.5320, G loss: 0.3043

[generated samples]

Epoch 80: Train D loss: -0.5015, G loss: 0.2818
Epoch 81: Train D loss: -0.5276, G loss: 0.3145
Epoch 82: Train D loss: -0.5075, G loss: 0.2845
Epoch 83: Train D loss: -0.5415, G loss: 0.2964
Epoch 84: Train D loss: -0.4852, G loss: 0.2518
Epoch 85: Train D loss: -0.5336, G loss: 0.2975
Epoch 86: Train D loss: -0.5094, G loss: 0.3462
Epoch 87: Train D loss: -0.5191, G loss: 0.2883
Epoch 88: Train D loss: -0.4903, G loss: 0.3478
Epoch 89: Train D loss: -0.5118, G loss: 0.3263

[generated samples]

Epoch 90: Train D loss: -0.5094, G loss: 0.2908
Epoch 91: Train D loss: -0.5259, G loss: 0.2910
Epoch 92: Train D loss: -0.5095, G loss: 0.2947
Epoch 93: Train D loss: -0.5207, G loss: 0.3252
Epoch 94: Train D loss: -0.5074, G loss: 0.3099
Epoch 95: Train D loss: -0.4576, G loss: 0.2860
Epoch 96: Train D loss: -0.4991, G loss: 0.3000
Epoch 97: Train D loss: -0.4898, G loss: 0.2971
Epoch 98: Train D loss: -0.5090, G loss: 0.3430
Epoch 99: Train D loss: -0.4786, G loss: 0.3358

[generated samples]

Epoch 100: Train D loss: -0.4968, G loss: 0.3189
Epoch 101: Train D loss: -0.4885, G loss: 0.3020
Epoch 102: Train D loss: -0.5104, G loss: 0.3330
Epoch 103: Train D loss: -0.4599, G loss: 0.2476
Epoch 104: Train D loss: -0.4808, G loss: 0.3091
Epoch 105: Train D loss: -0.4738, G loss: 0.3218
Epoch 106: Train D loss: -0.4660, G loss: 0.2918
Epoch 107: Train D loss: -0.4623, G loss: 0.2971
Epoch 108: Train D loss: -0.4908, G loss: 0.3011
Epoch 109: Train D loss: -0.4537, G loss: 0.2483

[generated samples]

Epoch 110: Train D loss: -0.4907, G loss: 0.3256
Epoch 111: Train D loss: -0.4589, G loss: 0.2906
Epoch 112: Train D loss: -0.4931, G loss: 0.3398
Epoch 113: Train D loss: -0.4186, G loss: 0.2934
Epoch 114: Train D loss: -0.4393, G loss: 0.2338
Epoch 115: Train D loss: -0.4840, G loss: 0.2749
Epoch 116: Train D loss: -0.4541, G loss: 0.2995
Epoch 117: Train D loss: -0.4686, G loss: 0.2716
Epoch 118: Train D loss: -0.4514, G loss: 0.2964
Epoch 119: Train D loss: -0.4573, G loss: 0.2670

Epoch 120: Train D loss: -0.4598, G loss: 0.2822
Epoch 121: Train D loss: -0.4504, G loss: 0.3012
Epoch 122: Train D loss: -0.4783, G loss: 0.2460
Epoch 123: Train D loss: -0.4346, G loss: 0.2514
Epoch 124: Train D loss: -0.4541, G loss: 0.2833
Epoch 125: Train D loss: -0.4898, G loss: 0.3164
Epoch 126: Train D loss: -0.4401, G loss: 0.3315
Epoch 127: Train D loss: -0.4626, G loss: 0.2726
Epoch 128: Train D loss: -0.4424, G loss: 0.2563
Epoch 129: Train D loss: -0.4651, G loss: 0.3307

[generated samples]

Epoch 130: Train D loss: -0.4532, G loss: 0.2641
Epoch 131: Train D loss: -0.4394, G loss: 0.2383
Epoch 132: Train D loss: -0.4529, G loss: 0.3050
Epoch 133: Train D loss: -0.4158, G loss: 0.2999
Epoch 134: Train D loss: -0.4788, G loss: 0.2793
Epoch 135: Train D loss: -0.4370, G loss: 0.2687
Epoch 136: Train D loss: -0.4586, G loss: 0.2786
Epoch 137: Train D loss: -0.4309, G loss: 0.2369
Epoch 138: Train D loss: -0.4596, G loss: 0.3155
Epoch 139: Train D loss: -0.4140, G loss: 0.2369

[generated samples]

Epoch 140: Train D loss: -0.4491, G loss: 0.2790
Epoch 141: Train D loss: -0.4302, G loss: 0.2765
Epoch 142: Train D loss: -0.4799, G loss: 0.3198
Epoch 143: Train D loss: -0.4318, G loss: 0.2569
Epoch 144: Train D loss: -0.4512, G loss: 0.3154
Epoch 145: Train D loss: -0.4302, G loss: 0.2465
Epoch 146: Train D loss: -0.4581, G loss: 0.3399
Epoch 147: Train D loss: -0.4274, G loss: 0.2863
Epoch 148: Train D loss: -0.4526, G loss: 0.2655
Epoch 149: Train D loss: -0.4482, G loss: 0.2845

[generated samples]

Epoch 150: Train D loss: -0.4177, G loss: 0.2716
Epoch 151: Train D loss: -0.4417, G loss: 0.2212
Epoch 152: Train D loss: -0.4274, G loss: 0.2426
Epoch 153: Train D loss: -0.4638, G loss: 0.3161
Epoch 154: Train D loss: -0.4304, G loss: 0.2718
Epoch 155: Train D loss: -0.4464, G loss: 0.2687
Epoch 156: Train D loss: -0.4360, G loss: 0.3259
Epoch 157: Train D loss: -0.4451, G loss: 0.2939
Epoch 158: Train D loss: -0.4403, G loss: 0.3331
Epoch 159: Train D loss: -0.4245, G loss: 0.2776

Epoch 160: Train D loss: -0.4339, G loss: 0.2207
Epoch 161: Train D loss: -0.4448, G loss: 0.2611
Epoch 162: Train D loss: -0.4289, G loss: 0.3069
Epoch 163: Train D loss: -0.4338, G loss: 0.2636
Epoch 164: Train D loss: -0.4074, G loss: 0.2566
Epoch 165: Train D loss: -0.4549, G loss: 0.2751
Epoch 166: Train D loss: -0.4009, G loss: 0.2482
Epoch 167: Train D loss: -0.4322, G loss: 0.2830
Epoch 168: Train D loss: -0.3914, G loss: 0.3043
Epoch 169: Train D loss: -0.4282, G loss: 0.2439

[generated samples]

Epoch 170: Train D loss: -0.4264, G loss: 0.2564
Epoch 171: Train D loss: -0.4143, G loss: 0.2825
Epoch 172: Train D loss: -0.4305, G loss: 0.2785
Epoch 173: Train D loss: -0.4131, G loss: 0.2262
Epoch 174: Train D loss: -0.4246, G loss: 0.2442
Epoch 175: Train D loss: -0.4212, G loss: 0.2144
Epoch 176: Train D loss: -0.4614, G loss: 0.3215
Epoch 177: Train D loss: -0.3726, G loss: 0.2268
Epoch 178: Train D loss: -0.4435, G loss: 0.2872
Epoch 179: Train D loss: -0.3947, G loss: 0.2036

[generated samples]

Epoch 180: Train D loss: -0.4263, G loss: 0.2507
Epoch 181: Train D loss: -0.3798, G loss: 0.2768
Epoch 182: Train D loss: -0.4332, G loss: 0.2744
Epoch 183: Train D loss: -0.4212, G loss: 0.3262
Epoch 184: Train D loss: -0.4099, G loss: 0.2530
Epoch 185: Train D loss: -0.4127, G loss: 0.3350
Epoch 186: Train D loss: -0.4015, G loss: 0.2160
Epoch 187: Train D loss: -0.4028, G loss: 0.2865
Epoch 188: Train D loss: -0.4642, G loss: 0.3220
Epoch 189: Train D loss: -0.3634, G loss: 0.2521

[generated samples]

Epoch 190: Train D loss: -0.4257, G loss: 0.2675
Epoch 191: Train D loss: -0.4006, G loss: 0.2758
Epoch 192: Train D loss: -0.4329, G loss: 0.3320
Epoch 193: Train D loss: -0.4086, G loss: 0.2827
Epoch 194: Train D loss: -0.4102, G loss: 0.2510
Epoch 195: Train D loss: -0.4385, G loss: 0.3492
Epoch 196: Train D loss: -0.3770, G loss: 0.2770
Epoch 197: Train D loss: -0.4503, G loss: 0.3169
Epoch 198: Train D loss: -0.3788, G loss: 0.2843
Epoch 199: Train D loss: -0.4032, G loss: 0.2338

Epoch 200: Train D loss: -0.4082, G loss: 0.2617
Epoch 201: Train D loss: -0.3875, G loss: 0.2342
Epoch 202: Train D loss: -0.3914, G loss: 0.3006
Epoch 203: Train D loss: -0.3950, G loss: 0.2643
Epoch 204: Train D loss: -0.4181, G loss: 0.2452
Epoch 205: Train D loss: -0.3911, G loss: 0.2484
Epoch 206: Train D loss: -0.4069, G loss: 0.2388
Epoch 207: Train D loss: -0.4138, G loss: 0.3130
Epoch 208: Train D loss: -0.3849, G loss: 0.2519
Epoch 209: Train D loss: -0.4043, G loss: 0.3063

[image: generated samples]

Epoch 210: Train D loss: -0.3868, G loss: 0.2552
Epoch 211: Train D loss: -0.3854, G loss: 0.2099
Epoch 212: Train D loss: -0.3885, G loss: 0.2260
Epoch 213: Train D loss: -0.3831, G loss: 0.2843
Epoch 214: Train D loss: -0.3913, G loss: 0.2981
Epoch 215: Train D loss: -0.4026, G loss: 0.2615
Epoch 216: Train D loss: -0.3571, G loss: 0.2077
Epoch 217: Train D loss: -0.4265, G loss: 0.3552
Epoch 218: Train D loss: -0.3766, G loss: 0.2846
Epoch 219: Train D loss: -0.4187, G loss: 0.3177

Epoch 220: Train D loss: -0.3770, G loss: 0.2534
Epoch 221: Train D loss: -0.3877, G loss: 0.2228
Epoch 222: Train D loss: -0.3798, G loss: 0.2793
Epoch 223: Train D loss: -0.3638, G loss: 0.2602
Epoch 224: Train D loss: -0.3892, G loss: 0.2791
Epoch 225: Train D loss: -0.4137, G loss: 0.3361
Epoch 226: Train D loss: -0.3650, G loss: 0.2430
Epoch 227: Train D loss: -0.3672, G loss: 0.1967
Epoch 228: Train D loss: -0.4273, G loss: 0.3011
Epoch 229: Train D loss: -0.3801, G loss: 0.3057

[image: generated samples]

Epoch 230: Train D loss: -0.3953, G loss: 0.2356
Epoch 231: Train D loss: -0.3941, G loss: 0.2907
Epoch 232: Train D loss: -0.3698, G loss: 0.2256
Epoch 233: Train D loss: -0.3977, G loss: 0.3323
Epoch 234: Train D loss: -0.3693, G loss: 0.2528
Epoch 235: Train D loss: -0.3851, G loss: 0.2325
Epoch 236: Train D loss: -0.3620, G loss: 0.2738
Epoch 237: Train D loss: -0.3822, G loss: 0.2258
Epoch 238: Train D loss: -0.3972, G loss: 0.2712
Epoch 239: Train D loss: -0.3509, G loss: 0.2222

Epoch 240: Train D loss: -0.3718, G loss: 0.2497
Epoch 241: Train D loss: -0.3656, G loss: 0.2088
Epoch 242: Train D loss: -0.3711, G loss: 0.2579
Epoch 243: Train D loss: -0.3786, G loss: 0.2283
Epoch 244: Train D loss: -0.3598, G loss: 0.2810
Epoch 245: Train D loss: -0.3684, G loss: 0.2485
Epoch 246: Train D loss: -0.3720, G loss: 0.2799
Epoch 247: Train D loss: -0.3744, G loss: 0.2398
Epoch 248: Train D loss: -0.3626, G loss: 0.2804
Epoch 249: Train D loss: -0.3728, G loss: 0.2394

[image: generated samples]

Epoch 250: Train D loss: -0.3541, G loss: 0.2554
Epoch 251: Train D loss: -0.3710, G loss: 0.1906
Epoch 252: Train D loss: -0.3620, G loss: 0.2561
Epoch 253: Train D loss: -0.3598, G loss: 0.2646
Epoch 254: Train D loss: -0.3700, G loss: 0.2938
Epoch 255: Train D loss: -0.3635, G loss: 0.2967
Epoch 256: Train D loss: -0.3487, G loss: 0.2272
Epoch 257: Train D loss: -0.3522, G loss: 0.2308
Epoch 258: Train D loss: -0.3523, G loss: 0.2648
Epoch 259: Train D loss: -0.3381, G loss: 0.2113

Epoch 260: Train D loss: -0.3678, G loss: 0.2297
Epoch 261: Train D loss: -0.3510, G loss: 0.2529
Epoch 262: Train D loss: -0.3690, G loss: 0.2061
Epoch 263: Train D loss: -0.3663, G loss: 0.2076
Epoch 264: Train D loss: -0.3791, G loss: 0.2091
Epoch 265: Train D loss: -0.3502, G loss: 0.2412
Epoch 266: Train D loss: -0.3299, G loss: 0.2167
Epoch 267: Train D loss: -0.3648, G loss: 0.2762
Epoch 268: Train D loss: -0.3584, G loss: 0.2302
Epoch 269: Train D loss: -0.3524, G loss: 0.2647

Epoch 270: Train D loss: -0.3458, G loss: 0.2113
Epoch 271: Train D loss: -0.3380, G loss: 0.2069
Epoch 272: Train D loss: -0.3513, G loss: 0.2369
Epoch 273: Train D loss: -0.3561, G loss: 0.2837
Epoch 274: Train D loss: -0.3648, G loss: 0.2519
Epoch 275: Train D loss: -0.3474, G loss: 0.2281
Epoch 276: Train D loss: -0.3435, G loss: 0.2756
Epoch 277: Train D loss: -0.3345, G loss: 0.2143
Epoch 278: Train D loss: -0.3532, G loss: 0.1934
Epoch 279: Train D loss: -0.3511, G loss: 0.2116

Epoch 280: Train D loss: -0.3500, G loss: 0.2431
Epoch 281: Train D loss: -0.3691, G loss: 0.2747
Epoch 282: Train D loss: -0.3426, G loss: 0.2600
Epoch 283: Train D loss: -0.3314, G loss: 0.2253
Epoch 284: Train D loss: -0.3624, G loss: 0.2619
Epoch 285: Train D loss: -0.3626, G loss: 0.2775
Epoch 286: Train D loss: -0.3702, G loss: 0.2960
Epoch 287: Train D loss: -0.3490, G loss: 0.2210
Epoch 288: Train D loss: -0.3541, G loss: 0.2102
Epoch 289: Train D loss: -0.3493, G loss: 0.2054

Epoch 290: Train D loss: -0.3177, G loss: 0.2641
Epoch 291: Train D loss: -0.3307, G loss: 0.2126
Epoch 292: Train D loss: -0.3427, G loss: 0.1790
Epoch 293: Train D loss: -0.3579, G loss: 0.2586
Epoch 294: Train D loss: -0.3286, G loss: 0.1560
Epoch 295: Train D loss: -0.3414, G loss: 0.2358
Epoch 296: Train D loss: -0.3369, G loss: 0.2336
Epoch 297: Train D loss: -0.3583, G loss: 0.2262
Epoch 298: Train D loss: -0.3370, G loss: 0.2112
Epoch 299: Train D loss: -0.3263, G loss: 0.1956

We know from the theory of WGAN that the negative of D_loss estimates the Wasserstein distance between the generated and real data distributions: the smaller it is, the more similar the two distributions are and the better the GAN has been trained. Its value therefore gives us a concrete metric for monitoring GAN training.

Run the code below to inspect the WGAN loss curves. Overall, the negative of D_loss decreases as the number of epochs grows, and at the same time the generated samples get closer and closer to the real data, which is consistent with the theory behind WGAN.
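Written out, the critic loss being logged here is the standard WGAN objective (shown for reference; the sign convention matches the d_real_loss / d_fake_loss terms used in the training code):

$$
L_D = -\mathbb{E}_{x \sim P_r}\left[D(x)\right] + \mathbb{E}_{z}\left[D(G(z))\right], \qquad -L_D \approx W(P_r, P_g)
$$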

loss_plot(d_loss_hist, g_loss_hist)

[image: loss curves]
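To read the Wasserstein-distance estimate directly off a plot, one can simply negate the recorded discriminator losses. A minimal sketch, assuming d_loss_hist holds the per-epoch average D losses returned by run_wgan:

import matplotlib.pyplot as plt

# -D_loss approximates the Wasserstein distance between the real and
# generated distributions, so plot its negation epoch by epoch
w_dist_est = [-d for d in d_loss_hist]
plt.plot(w_dist_est)
plt.xlabel("epoch")
plt.ylabel("estimated Wasserstein distance")
plt.title("WGAN: -D_loss per epoch")
plt.show()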

Next, run the following two cells to visualize the distribution of the trained WGAN discriminator's parameters.

def show_weights_hist(data):
    # plot a normalized histogram of all discriminator weights
    # (density=True replaces the deprecated normed=1 kwarg)
    plt.hist(data, bins=100, density=True, facecolor="blue", edgecolor="black", alpha=0.7)
    plt.xlabel("weights")
    plt.ylabel("frequency")
    plt.title("D weights")
    plt.show()

def show_d_params(D):
    # flatten every parameter tensor of D into one long list of scalars
    plist = []
    for params in D.parameters():
        plist.extend(params.cpu().data.view(-1).numpy())
    show_weights_hist(plist)

show_d_params(D)

[image: histogram of D's weights]

As expected, all parameters are clipped into [-c, c], and most of them pile up near -c and c.
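For reference, this histogram is the direct result of the weight clipping performed inside run_wgan after every discriminator update; a minimal sketch of that step, assuming weight_clip is the threshold c used above:

# clamp every parameter of D into [-c, c] after each D update;
# this is exactly what concentrates the weights at the two boundaries
for p in D.parameters():
    p.data.clamp_(-weight_clip, weight_clip)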

Retraining WGAN with n_d set to 5, 3 and 1

Comparing the three runs below, n_d = 1 gives the best results, i.e., one discriminator update followed by one generator update. I suspect this is because the discriminator is far easier to train than the generator, so a 1:1 schedule keeps the training progress of the two networks balanced (the sketch below shows how n_d controls this ratio).
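Inside run_wgan the ratio is applied roughly as follows (d_step and g_step are hypothetical helpers standing in for the actual update code):

# one pass over the data: D is updated on every minibatch,
# G only on every n_d-th minibatch
for i, (x, _) in enumerate(trainloader):
    d_step(x)              # hypothetical helper: one D update plus weight clipping
    if (i + 1) % n_d == 0:
        g_step()           # hypothetical helper: one G update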

n_d = 1

# G and D model, use DCGAN, note that sigmoid is removed in D
G = DCGenerator(image_size=image_size, latent_dim=latent_dim, output_channel=image_channel).to(device)
D = DCDiscriminator(image_size=image_size, input_channel=image_channel, sigmoid=False).to(device)

# G and D optimizer, use Adam or SGD
G_optimizer = optim.Adam(G.parameters(), lr=learning_rate, betas=betas)
D_optimizer = optim.Adam(D.parameters(), lr=learning_rate, betas=betas)

d_loss_hist, g_loss_hist = run_wgan(trainloader, G, D, G_optimizer, D_optimizer, n_epochs, device, 
                                    latent_dim, n_d, weight_clip)

loss_plot(d_loss_hist, g_loss_hist)
Epoch 0: Train D loss: -0.0182, G loss: -0.0070

[image: generated samples]

Epoch 1: Train D loss: -0.0159, G loss: -0.0039
Epoch 2: Train D loss: -0.0325, G loss: 0.0160
Epoch 3: Train D loss: -0.0498, G loss: 0.0458
Epoch 4: Train D loss: -0.1249, G loss: 0.1022
Epoch 5: Train D loss: -0.0865, G loss: 0.1156
Epoch 6: Train D loss: -0.0540, G loss: 0.0653
Epoch 7: Train D loss: -0.0501, G loss: 0.0820
Epoch 8: Train D loss: -0.0925, G loss: 0.1034
Epoch 9: Train D loss: -0.1572, G loss: 0.1668

[image: generated samples]

Epoch 10: Train D loss: -0.1910, G loss: 0.1957
Epoch 11: Train D loss: -0.1417, G loss: 0.2013
Epoch 12: Train D loss: -0.2383, G loss: 0.2058
Epoch 13: Train D loss: -0.1865, G loss: 0.2131
Epoch 14: Train D loss: -0.2484, G loss: 0.2400
Epoch 15: Train D loss: -0.2469, G loss: 0.2686
Epoch 16: Train D loss: -0.1967, G loss: 0.2310
Epoch 17: Train D loss: -0.3066, G loss: 0.2467
Epoch 18: Train D loss: -0.2017, G loss: 0.2228
Epoch 19: Train D loss: -0.2556, G loss: 0.2559

[image: generated samples]

Epoch 20: Train D loss: -0.2782, G loss: 0.2884
Epoch 21: Train D loss: -0.3246, G loss: 0.3078
Epoch 22: Train D loss: -0.3518, G loss: 0.3165
Epoch 23: Train D loss: -0.3516, G loss: 0.3314
Epoch 24: Train D loss: -0.4098, G loss: 0.3007
Epoch 25: Train D loss: -0.3924, G loss: 0.3193
Epoch 26: Train D loss: -0.2917, G loss: 0.2550
Epoch 27: Train D loss: -0.3090, G loss: 0.2907
Epoch 28: Train D loss: -0.3129, G loss: 0.2922
Epoch 29: Train D loss: -0.3216, G loss: 0.2930

[image: generated samples]

Epoch 30: Train D loss: -0.3486, G loss: 0.2942
Epoch 31: Train D loss: -0.3000, G loss: 0.2695
Epoch 32: Train D loss: -0.3124, G loss: 0.2745
Epoch 33: Train D loss: -0.3273, G loss: 0.3266
Epoch 34: Train D loss: -0.3082, G loss: 0.2772
Epoch 35: Train D loss: -0.3180, G loss: 0.2811
Epoch 36: Train D loss: -0.3688, G loss: 0.3004
Epoch 37: Train D loss: -0.3516, G loss: 0.3144
Epoch 38: Train D loss: -0.3937, G loss: 0.3063
Epoch 39: Train D loss: -0.4009, G loss: 0.3059

[image: generated samples]

Epoch 40: Train D loss: -0.3389, G loss: 0.2873
Epoch 41: Train D loss: -0.3691, G loss: 0.3021
Epoch 42: Train D loss: -0.3820, G loss: 0.3264
Epoch 43: Train D loss: -0.3817, G loss: 0.2932
Epoch 44: Train D loss: -0.4297, G loss: 0.3235
Epoch 45: Train D loss: -0.3897, G loss: 0.3043
Epoch 46: Train D loss: -0.3980, G loss: 0.2997
Epoch 47: Train D loss: -0.3736, G loss: 0.2793
Epoch 48: Train D loss: -0.3197, G loss: 0.2732
Epoch 49: Train D loss: -0.2841, G loss: 0.2598

[image: generated samples]

Epoch 50: Train D loss: -0.3379, G loss: 0.2851
Epoch 51: Train D loss: -0.3437, G loss: 0.2720
Epoch 52: Train D loss: -0.3731, G loss: 0.3267
Epoch 53: Train D loss: -0.3377, G loss: 0.2398
Epoch 54: Train D loss: -0.3265, G loss: 0.2670
Epoch 55: Train D loss: -0.2987, G loss: 0.2443
Epoch 56: Train D loss: -0.3268, G loss: 0.2418
Epoch 57: Train D loss: -0.3325, G loss: 0.2758
Epoch 58: Train D loss: -0.3191, G loss: 0.2743
Epoch 59: Train D loss: -0.3334, G loss: 0.2791

[image: generated samples]

Epoch 60: Train D loss: -0.3353, G loss: 0.2931
Epoch 61: Train D loss: -0.3305, G loss: 0.2575
Epoch 62: Train D loss: -0.3491, G loss: 0.2574
Epoch 63: Train D loss: -0.3371, G loss: 0.2534
Epoch 64: Train D loss: -0.3626, G loss: 0.2549
Epoch 65: Train D loss: -0.3577, G loss: 0.2753
Epoch 66: Train D loss: -0.3148, G loss: 0.2603
Epoch 67: Train D loss: -0.3061, G loss: 0.2508
Epoch 68: Train D loss: -0.3204, G loss: 0.2522
Epoch 69: Train D loss: -0.3572, G loss: 0.2807

[image: generated samples]

Epoch 70: Train D loss: -0.3155, G loss: 0.2541
Epoch 71: Train D loss: -0.3042, G loss: 0.2538
Epoch 72: Train D loss: -0.3328, G loss: 0.2629
Epoch 73: Train D loss: -0.3426, G loss: 0.2580
Epoch 74: Train D loss: -0.2983, G loss: 0.2352
Epoch 75: Train D loss: -0.3388, G loss: 0.2626
Epoch 76: Train D loss: -0.3185, G loss: 0.2660
Epoch 77: Train D loss: -0.3089, G loss: 0.2504
Epoch 78: Train D loss: -0.3151, G loss: 0.2466
Epoch 79: Train D loss: -0.3097, G loss: 0.2651

[image: generated samples]

Epoch 80: Train D loss: -0.2894, G loss: 0.2483
Epoch 81: Train D loss: -0.3101, G loss: 0.2593
Epoch 82: Train D loss: -0.3015, G loss: 0.2221
Epoch 83: Train D loss: -0.3140, G loss: 0.2486
Epoch 84: Train D loss: -0.3043, G loss: 0.2315
Epoch 85: Train D loss: -0.2930, G loss: 0.2406
Epoch 86: Train D loss: -0.3057, G loss: 0.2564
Epoch 87: Train D loss: -0.3217, G loss: 0.2585
Epoch 88: Train D loss: -0.2781, G loss: 0.2280
Epoch 89: Train D loss: -0.3016, G loss: 0.2530

Epoch 90: Train D loss: -0.2845, G loss: 0.2297
Epoch 91: Train D loss: -0.2659, G loss: 0.2401
Epoch 92: Train D loss: -0.2926, G loss: 0.2237
Epoch 93: Train D loss: -0.3072, G loss: 0.2614
Epoch 94: Train D loss: -0.3003, G loss: 0.2295
Epoch 95: Train D loss: -0.3211, G loss: 0.2603
Epoch 96: Train D loss: -0.2616, G loss: 0.2169
Epoch 97: Train D loss: -0.2807, G loss: 0.2167
Epoch 98: Train D loss: -0.3015, G loss: 0.2267
Epoch 99: Train D loss: -0.2735, G loss: 0.2308

Epoch 100: Train D loss: -0.2716, G loss: 0.2101
Epoch 101: Train D loss: -0.3056, G loss: 0.2570
Epoch 102: Train D loss: -0.2684, G loss: 0.2301
Epoch 103: Train D loss: -0.2500, G loss: 0.2290
Epoch 104: Train D loss: -0.3124, G loss: 0.2547
Epoch 105: Train D loss: -0.2400, G loss: 0.2191
Epoch 106: Train D loss: -0.2538, G loss: 0.2180
Epoch 107: Train D loss: -0.3012, G loss: 0.2366
Epoch 108: Train D loss: -0.2810, G loss: 0.2319
Epoch 109: Train D loss: -0.2722, G loss: 0.2347

Epoch 110: Train D loss: -0.2816, G loss: 0.2440
Epoch 111: Train D loss: -0.2706, G loss: 0.2395
Epoch 112: Train D loss: -0.2850, G loss: 0.2090
Epoch 113: Train D loss: -0.2924, G loss: 0.2388
Epoch 114: Train D loss: -0.2492, G loss: 0.2396
Epoch 115: Train D loss: -0.2805, G loss: 0.2439
Epoch 116: Train D loss: -0.2917, G loss: 0.2182
Epoch 117: Train D loss: -0.2489, G loss: 0.2055
Epoch 118: Train D loss: -0.2817, G loss: 0.2496
Epoch 119: Train D loss: -0.2766, G loss: 0.2123

Epoch 120: Train D loss: -0.3147, G loss: 0.2554
Epoch 121: Train D loss: -0.2099, G loss: 0.1905
Epoch 122: Train D loss: -0.2907, G loss: 0.2443
Epoch 123: Train D loss: -0.2671, G loss: 0.2187
Epoch 124: Train D loss: -0.2602, G loss: 0.2262
Epoch 125: Train D loss: -0.2674, G loss: 0.2054
Epoch 126: Train D loss: -0.2793, G loss: 0.2151
Epoch 127: Train D loss: -0.2728, G loss: 0.2125
Epoch 128: Train D loss: -0.2760, G loss: 0.2082
Epoch 129: Train D loss: -0.2906, G loss: 0.2200

Epoch 130: Train D loss: -0.2442, G loss: 0.2110
Epoch 131: Train D loss: -0.2671, G loss: 0.2449
Epoch 132: Train D loss: -0.2847, G loss: 0.2403
Epoch 133: Train D loss: -0.2616, G loss: 0.2112
Epoch 134: Train D loss: -0.2658, G loss: 0.2312
Epoch 135: Train D loss: -0.2687, G loss: 0.2113
Epoch 136: Train D loss: -0.2742, G loss: 0.2060
Epoch 137: Train D loss: -0.2987, G loss: 0.2461
Epoch 138: Train D loss: -0.2429, G loss: 0.1936
Epoch 139: Train D loss: -0.2795, G loss: 0.2319

[image: generated samples]

Epoch 140: Train D loss: -0.2551, G loss: 0.2186
Epoch 141: Train D loss: -0.2727, G loss: 0.2181
Epoch 142: Train D loss: -0.2803, G loss: 0.2147
Epoch 143: Train D loss: -0.2403, G loss: 0.2014
Epoch 144: Train D loss: -0.2687, G loss: 0.2081
Epoch 145: Train D loss: -0.2655, G loss: 0.2216
Epoch 146: Train D loss: -0.2570, G loss: 0.2044
Epoch 147: Train D loss: -0.2612, G loss: 0.2048
Epoch 148: Train D loss: -0.2831, G loss: 0.2123
Epoch 149: Train D loss: -0.2485, G loss: 0.2019

[image: generated samples]

Epoch 150: Train D loss: -0.2663, G loss: 0.2102
Epoch 151: Train D loss: -0.2605, G loss: 0.2043
Epoch 152: Train D loss: -0.2713, G loss: 0.2101
Epoch 153: Train D loss: -0.2704, G loss: 0.1962
Epoch 154: Train D loss: -0.2837, G loss: 0.2128
Epoch 155: Train D loss: -0.2650, G loss: 0.1856
Epoch 156: Train D loss: -0.2568, G loss: 0.1991
Epoch 157: Train D loss: -0.2409, G loss: 0.2089
Epoch 158: Train D loss: -0.2543, G loss: 0.2077
Epoch 159: Train D loss: -0.2465, G loss: 0.2075

Epoch 160: Train D loss: -0.2664, G loss: 0.2036
Epoch 161: Train D loss: -0.2732, G loss: 0.1882
Epoch 162: Train D loss: -0.2508, G loss: 0.2153
Epoch 163: Train D loss: -0.2591, G loss: 0.2019
Epoch 164: Train D loss: -0.2524, G loss: 0.2327
Epoch 165: Train D loss: -0.2530, G loss: 0.2250
Epoch 166: Train D loss: -0.2331, G loss: 0.1909
Epoch 167: Train D loss: -0.2497, G loss: 0.2172
Epoch 168: Train D loss: -0.2500, G loss: 0.2109
Epoch 169: Train D loss: -0.2312, G loss: 0.1952

[image: generated samples]

Epoch 170: Train D loss: -0.2475, G loss: 0.2165
Epoch 171: Train D loss: -0.2167, G loss: 0.1974
Epoch 172: Train D loss: -0.2526, G loss: 0.2084
Epoch 173: Train D loss: -0.2438, G loss: 0.1823
Epoch 174: Train D loss: -0.2526, G loss: 0.2057
Epoch 175: Train D loss: -0.2338, G loss: 0.1805
Epoch 176: Train D loss: -0.2380, G loss: 0.1999
Epoch 177: Train D loss: -0.2504, G loss: 0.2079
Epoch 178: Train D loss: -0.2323, G loss: 0.2171
Epoch 179: Train D loss: -0.2267, G loss: 0.2106

Epoch 180: Train D loss: -0.2334, G loss: 0.2112
Epoch 181: Train D loss: -0.2504, G loss: 0.2015
Epoch 182: Train D loss: -0.2386, G loss: 0.2003
Epoch 183: Train D loss: -0.2579, G loss: 0.2225
Epoch 184: Train D loss: -0.2329, G loss: 0.1926
Epoch 185: Train D loss: -0.2448, G loss: 0.1943
Epoch 186: Train D loss: -0.2410, G loss: 0.1972
Epoch 187: Train D loss: -0.2147, G loss: 0.2089
Epoch 188: Train D loss: -0.2481, G loss: 0.2071
Epoch 189: Train D loss: -0.2226, G loss: 0.1883

[image: generated samples]

Epoch 190: Train D loss: -0.2166, G loss: 0.1877
Epoch 191: Train D loss: -0.2306, G loss: 0.1673
Epoch 192: Train D loss: -0.2411, G loss: 0.1936
Epoch 193: Train D loss: -0.2090, G loss: 0.1795
Epoch 194: Train D loss: -0.2241, G loss: 0.1926
Epoch 195: Train D loss: -0.2321, G loss: 0.1997
Epoch 196: Train D loss: -0.2337, G loss: 0.1896
Epoch 197: Train D loss: -0.2472, G loss: 0.1888
Epoch 198: Train D loss: -0.2187, G loss: 0.1842
Epoch 199: Train D loss: -0.2410, G loss: 0.1910

Epoch 200: Train D loss: -0.2122, G loss: 0.2000
Epoch 201: Train D loss: -0.2195, G loss: 0.1505
Epoch 202: Train D loss: -0.2261, G loss: 0.2015
Epoch 203: Train D loss: -0.2431, G loss: 0.1929
Epoch 204: Train D loss: -0.1865, G loss: 0.2117
Epoch 205: Train D loss: -0.2178, G loss: 0.1990
Epoch 206: Train D loss: -0.2218, G loss: 0.2107
Epoch 207: Train D loss: -0.2072, G loss: 0.1877
Epoch 208: Train D loss: -0.2300, G loss: 0.1616
Epoch 209: Train D loss: -0.2283, G loss: 0.2229

[image: generated samples]

Epoch 210: Train D loss: -0.2135, G loss: 0.1653
Epoch 211: Train D loss: -0.2005, G loss: 0.1861
Epoch 212: Train D loss: -0.1969, G loss: 0.1439
Epoch 213: Train D loss: -0.2162, G loss: 0.1740
Epoch 214: Train D loss: -0.2244, G loss: 0.1757
Epoch 215: Train D loss: -0.2157, G loss: 0.1904
Epoch 216: Train D loss: -0.2338, G loss: 0.1971
Epoch 217: Train D loss: -0.2309, G loss: 0.1905
Epoch 218: Train D loss: -0.2319, G loss: 0.1956
Epoch 219: Train D loss: -0.2072, G loss: 0.1821

Epoch 220: Train D loss: -0.2217, G loss: 0.1682
Epoch 221: Train D loss: -0.2222, G loss: 0.1610
Epoch 222: Train D loss: -0.2084, G loss: 0.1770
Epoch 223: Train D loss: -0.2231, G loss: 0.1770
Epoch 224: Train D loss: -0.2287, G loss: 0.1664
Epoch 225: Train D loss: -0.2025, G loss: 0.1719
Epoch 226: Train D loss: -0.2255, G loss: 0.2111
Epoch 227: Train D loss: -0.2138, G loss: 0.1751
Epoch 228: Train D loss: -0.2281, G loss: 0.1734
Epoch 229: Train D loss: -0.2329, G loss: 0.2143

Epoch 230: Train D loss: -0.2108, G loss: 0.1704
Epoch 231: Train D loss: -0.2094, G loss: 0.1904
Epoch 232: Train D loss: -0.2337, G loss: 0.2201
Epoch 233: Train D loss: -0.2001, G loss: 0.1507
Epoch 234: Train D loss: -0.2368, G loss: 0.1695
Epoch 235: Train D loss: -0.2102, G loss: 0.1928
Epoch 236: Train D loss: -0.2278, G loss: 0.1856
Epoch 237: Train D loss: -0.2165, G loss: 0.1752
Epoch 238: Train D loss: -0.2195, G loss: 0.1767
Epoch 239: Train D loss: -0.2063, G loss: 0.1984

Epoch 240: Train D loss: -0.2117, G loss: 0.1691
Epoch 241: Train D loss: -0.1951, G loss: 0.1645
Epoch 242: Train D loss: -0.2185, G loss: 0.1849
Epoch 243: Train D loss: -0.1912, G loss: 0.1975
Epoch 244: Train D loss: -0.2064, G loss: 0.1826
Epoch 245: Train D loss: -0.2058, G loss: 0.1761
Epoch 246: Train D loss: -0.2058, G loss: 0.1964
Epoch 247: Train D loss: -0.1886, G loss: 0.1491
Epoch 248: Train D loss: -0.2380, G loss: 0.2016
Epoch 249: Train D loss: -0.1905, G loss: 0.1639

Epoch 250: Train D loss: -0.2091, G loss: 0.1662
Epoch 251: Train D loss: -0.2091, G loss: 0.1943
Epoch 252: Train D loss: -0.2288, G loss: 0.1891
Epoch 253: Train D loss: -0.2051, G loss: 0.1706
Epoch 254: Train D loss: -0.2025, G loss: 0.1776
Epoch 255: Train D loss: -0.2017, G loss: 0.2068
Epoch 256: Train D loss: -0.2255, G loss: 0.1873
Epoch 257: Train D loss: -0.2229, G loss: 0.1915
Epoch 258: Train D loss: -0.2085, G loss: 0.1562
Epoch 259: Train D loss: -0.2008, G loss: 0.1809

Epoch 260: Train D loss: -0.2147, G loss: 0.2080
Epoch 261: Train D loss: -0.2232, G loss: 0.1709
Epoch 262: Train D loss: -0.2285, G loss: 0.1834
Epoch 263: Train D loss: -0.2003, G loss: 0.1878
Epoch 264: Train D loss: -0.1990, G loss: 0.1749
Epoch 265: Train D loss: -0.2169, G loss: 0.2041
Epoch 266: Train D loss: -0.1845, G loss: 0.1373
Epoch 267: Train D loss: -0.2032, G loss: 0.1662
Epoch 268: Train D loss: -0.2112, G loss: 0.1467
Epoch 269: Train D loss: -0.2017, G loss: 0.1613

Epoch 270: Train D loss: -0.2164, G loss: 0.1894
Epoch 271: Train D loss: -0.1943, G loss: 0.1676
Epoch 272: Train D loss: -0.2001, G loss: 0.1614
Epoch 273: Train D loss: -0.1856, G loss: 0.1941
Epoch 274: Train D loss: -0.1890, G loss: 0.1458
Epoch 275: Train D loss: -0.1916, G loss: 0.1645
Epoch 276: Train D loss: -0.1973, G loss: 0.1968
Epoch 277: Train D loss: -0.2100, G loss: 0.2010
Epoch 278: Train D loss: -0.2052, G loss: 0.1787
Epoch 279: Train D loss: -0.1834, G loss: 0.1414

Epoch 280: Train D loss: -0.1791, G loss: 0.1570
Epoch 281: Train D loss: -0.1975, G loss: 0.1735
Epoch 282: Train D loss: -0.1917, G loss: 0.1693
Epoch 283: Train D loss: -0.1993, G loss: 0.1682
Epoch 284: Train D loss: -0.1696, G loss: 0.1724
Epoch 285: Train D loss: -0.2005, G loss: 0.1656
Epoch 286: Train D loss: -0.1988, G loss: 0.1789
Epoch 287: Train D loss: -0.2005, G loss: 0.1648
Epoch 288: Train D loss: -0.1865, G loss: 0.1314
Epoch 289: Train D loss: -0.2028, G loss: 0.2210

Epoch 290: Train D loss: -0.1877, G loss: 0.1620
Epoch 291: Train D loss: -0.1938, G loss: 0.1862
Epoch 292: Train D loss: -0.1458, G loss: 0.1050
Epoch 293: Train D loss: -0.2081, G loss: 0.2041
Epoch 294: Train D loss: -0.1423, G loss: 0.1373
Epoch 295: Train D loss: -0.1750, G loss: 0.1569
Epoch 296: Train D loss: -0.1765, G loss: 0.1841
Epoch 297: Train D loss: -0.1718, G loss: 0.1302
Epoch 298: Train D loss: -0.2015, G loss: 0.2108
Epoch 299: Train D loss: -0.1822, G loss: 0.1689

[image: generated samples and loss curves]

n_d = 3

# G and D model, use DCGAN, note that sigmoid is removed in D
G = DCGenerator(image_size=image_size, latent_dim=latent_dim, output_channel=image_channel).to(device)
D = DCDiscriminator(image_size=image_size, input_channel=image_channel, sigmoid=False).to(device)

# G and D optimizer, use Adam or SGD
G_optimizer = optim.Adam(G.parameters(), lr=learning_rate, betas=betas)
D_optimizer = optim.Adam(D.parameters(), lr=learning_rate, betas=betas)

d_loss_hist, g_loss_hist = run_wgan(trainloader, G, D, G_optimizer, D_optimizer, n_epochs, device, 
                                    latent_dim, n_d, weight_clip)

loss_plot(d_loss_hist, g_loss_hist)
Epoch 0: Train D loss: 0.0115, G loss: 0.0028

[image: generated samples]

Epoch 1: Train D loss: -0.0849, G loss: 0.0287
Epoch 2: Train D loss: -0.2092, G loss: 0.1247
Epoch 3: Train D loss: -0.3877, G loss: 0.2280
Epoch 4: Train D loss: -0.5400, G loss: 0.3050
Epoch 5: Train D loss: -0.6341, G loss: 0.3542
Epoch 6: Train D loss: -0.7792, G loss: 0.4113
Epoch 7: Train D loss: -0.8921, G loss: 0.4566
Epoch 8: Train D loss: -0.9774, G loss: 0.4867
Epoch 9: Train D loss: -1.0469, G loss: 0.5187

[image: generated samples]

Epoch 10: Train D loss: -1.0285, G loss: 0.5216
Epoch 11: Train D loss: -0.9615, G loss: 0.5114
Epoch 12: Train D loss: -0.9017, G loss: 0.4891
Epoch 13: Train D loss: -0.8909, G loss: 0.4479
Epoch 14: Train D loss: -1.0019, G loss: 0.5253
Epoch 15: Train D loss: -0.8964, G loss: 0.5183
Epoch 16: Train D loss: -0.8387, G loss: 0.4386
Epoch 17: Train D loss: -0.9418, G loss: 0.4687
Epoch 18: Train D loss: -0.9051, G loss: 0.4914
Epoch 19: Train D loss: -1.0527, G loss: 0.5396

[image: generated samples]

Epoch 20: Train D loss: -0.9699, G loss: 0.5065
Epoch 21: Train D loss: -0.9157, G loss: 0.4821
Epoch 22: Train D loss: -1.0009, G loss: 0.5267
Epoch 23: Train D loss: -0.9863, G loss: 0.5208
Epoch 24: Train D loss: -0.8917, G loss: 0.4812
Epoch 25: Train D loss: -0.8956, G loss: 0.4888
Epoch 26: Train D loss: -0.9106, G loss: 0.4834
Epoch 27: Train D loss: -0.9318, G loss: 0.5039
Epoch 28: Train D loss: -0.8434, G loss: 0.4712
Epoch 29: Train D loss: -0.8285, G loss: 0.4837

[image: generated samples]

Epoch 30: Train D loss: -0.8983, G loss: 0.4685
Epoch 31: Train D loss: -0.7217, G loss: 0.3685
Epoch 32: Train D loss: -0.7127, G loss: 0.4416
Epoch 33: Train D loss: -0.7619, G loss: 0.3957
Epoch 34: Train D loss: -0.7324, G loss: 0.3847
Epoch 35: Train D loss: -0.8370, G loss: 0.4685
Epoch 36: Train D loss: -0.7254, G loss: 0.4249
Epoch 37: Train D loss: -0.7606, G loss: 0.3881
Epoch 38: Train D loss: -0.8475, G loss: 0.4369
Epoch 39: Train D loss: -0.7574, G loss: 0.4178

[image: generated samples]

Epoch 40: Train D loss: -0.7793, G loss: 0.4372
Epoch 41: Train D loss: -0.7609, G loss: 0.4106
Epoch 42: Train D loss: -0.8020, G loss: 0.4627
Epoch 43: Train D loss: -0.7828, G loss: 0.4223
Epoch 44: Train D loss: -0.7864, G loss: 0.4156
Epoch 45: Train D loss: -0.8003, G loss: 0.4536
Epoch 46: Train D loss: -0.8051, G loss: 0.4268
Epoch 47: Train D loss: -0.7091, G loss: 0.3992
Epoch 48: Train D loss: -0.8230, G loss: 0.4182
Epoch 49: Train D loss: -0.7692, G loss: 0.4265

[image: generated samples]

Epoch 50: Train D loss: -0.7436, G loss: 0.3889
Epoch 51: Train D loss: -0.7924, G loss: 0.4587
Epoch 52: Train D loss: -0.7245, G loss: 0.4406
Epoch 53: Train D loss: -0.8386, G loss: 0.4729
Epoch 54: Train D loss: -0.7688, G loss: 0.4302
Epoch 55: Train D loss: -0.8514, G loss: 0.4562
Epoch 56: Train D loss: -0.7149, G loss: 0.4728
Epoch 57: Train D loss: -0.7803, G loss: 0.4096
Epoch 58: Train D loss: -0.7519, G loss: 0.3558
Epoch 59: Train D loss: -0.7757, G loss: 0.4407

[image: generated samples]

Epoch 60: Train D loss: -0.7444, G loss: 0.3916
Epoch 61: Train D loss: -0.7294, G loss: 0.4036
Epoch 62: Train D loss: -0.6996, G loss: 0.4092
Epoch 63: Train D loss: -0.6151, G loss: 0.2592
Epoch 64: Train D loss: -0.7241, G loss: 0.3998
Epoch 65: Train D loss: -0.7103, G loss: 0.4065
Epoch 66: Train D loss: -0.6855, G loss: 0.3740
Epoch 67: Train D loss: -0.6014, G loss: 0.2923
Epoch 68: Train D loss: -0.7698, G loss: 0.4409
Epoch 69: Train D loss: -0.6474, G loss: 0.3609

[image: generated samples]

Epoch 70: Train D loss: -0.7633, G loss: 0.4385
Epoch 71: Train D loss: -0.6585, G loss: 0.3993
Epoch 72: Train D loss: -0.7166, G loss: 0.3943
Epoch 73: Train D loss: -0.7145, G loss: 0.3874
Epoch 74: Train D loss: -0.6999, G loss: 0.3982
Epoch 75: Train D loss: -0.6753, G loss: 0.3795
Epoch 76: Train D loss: -0.6380, G loss: 0.2932
Epoch 77: Train D loss: -0.6757, G loss: 0.4047
Epoch 78: Train D loss: -0.6589, G loss: 0.3766
Epoch 79: Train D loss: -0.6796, G loss: 0.3769

[image: generated samples]

Epoch 80: Train D loss: -0.6832, G loss: 0.3994
Epoch 81: Train D loss: -0.6753, G loss: 0.3706
Epoch 82: Train D loss: -0.6589, G loss: 0.3934
Epoch 83: Train D loss: -0.6982, G loss: 0.4058
Epoch 84: Train D loss: -0.6773, G loss: 0.3891
Epoch 85: Train D loss: -0.6343, G loss: 0.3287
Epoch 86: Train D loss: -0.6902, G loss: 0.4246
Epoch 87: Train D loss: -0.6273, G loss: 0.3418
Epoch 88: Train D loss: -0.6654, G loss: 0.3764
Epoch 89: Train D loss: -0.6256, G loss: 0.3525

[image: generated samples]

Epoch 90: Train D loss: -0.6252, G loss: 0.2638
Epoch 91: Train D loss: -0.6839, G loss: 0.4056
Epoch 92: Train D loss: -0.6672, G loss: 0.4038
Epoch 93: Train D loss: -0.6162, G loss: 0.3254
Epoch 94: Train D loss: -0.6786, G loss: 0.3844
Epoch 95: Train D loss: -0.6440, G loss: 0.3848
Epoch 96: Train D loss: -0.6216, G loss: 0.3360
Epoch 97: Train D loss: -0.6514, G loss: 0.3452
Epoch 98: Train D loss: -0.6280, G loss: 0.3378
Epoch 99: Train D loss: -0.6389, G loss: 0.3781

Epoch 100: Train D loss: -0.6247, G loss: 0.3755
Epoch 101: Train D loss: -0.5936, G loss: 0.2273
Epoch 102: Train D loss: -0.6509, G loss: 0.3720
Epoch 103: Train D loss: -0.6235, G loss: 0.3705
Epoch 104: Train D loss: -0.6215, G loss: 0.3137
Epoch 105: Train D loss: -0.6137, G loss: 0.3555
Epoch 106: Train D loss: -0.6364, G loss: 0.3166
Epoch 107: Train D loss: -0.5826, G loss: 0.3726
Epoch 108: Train D loss: -0.6304, G loss: 0.3509
Epoch 109: Train D loss: -0.6262, G loss: 0.3693

Epoch 110: Train D loss: -0.6152, G loss: 0.3723
Epoch 111: Train D loss: -0.5603, G loss: 0.3244
Epoch 112: Train D loss: -0.6228, G loss: 0.3292
Epoch 113: Train D loss: -0.5751, G loss: 0.3592
Epoch 114: Train D loss: -0.6341, G loss: 0.3237
Epoch 115: Train D loss: -0.5957, G loss: 0.3302
Epoch 116: Train D loss: -0.5450, G loss: 0.2193
Epoch 117: Train D loss: -0.6332, G loss: 0.3558
Epoch 118: Train D loss: -0.5991, G loss: 0.3779
Epoch 119: Train D loss: -0.5378, G loss: 0.3330

Epoch 120: Train D loss: -0.6033, G loss: 0.3438
Epoch 121: Train D loss: -0.5573, G loss: 0.3457
Epoch 122: Train D loss: -0.5730, G loss: 0.3430
Epoch 123: Train D loss: -0.5930, G loss: 0.3696
Epoch 124: Train D loss: -0.5787, G loss: 0.3108
Epoch 125: Train D loss: -0.5548, G loss: 0.3351
Epoch 126: Train D loss: -0.6151, G loss: 0.3816
Epoch 127: Train D loss: -0.5616, G loss: 0.3556
Epoch 128: Train D loss: -0.5305, G loss: 0.3388
Epoch 129: Train D loss: -0.5902, G loss: 0.3435

Epoch 130: Train D loss: -0.5366, G loss: 0.2469
Epoch 131: Train D loss: -0.5515, G loss: 0.2755
Epoch 132: Train D loss: -0.5506, G loss: 0.2604
Epoch 133: Train D loss: -0.5457, G loss: 0.3191
Epoch 134: Train D loss: -0.5214, G loss: 0.2295
Epoch 135: Train D loss: -0.5769, G loss: 0.3087
Epoch 136: Train D loss: -0.5200, G loss: 0.2559
Epoch 137: Train D loss: -0.5777, G loss: 0.3618
Epoch 138: Train D loss: -0.5186, G loss: 0.2800
Epoch 139: Train D loss: -0.5217, G loss: 0.3243

Epoch 140: Train D loss: -0.5597, G loss: 0.3368
Epoch 141: Train D loss: -0.5527, G loss: 0.3299
Epoch 142: Train D loss: -0.5231, G loss: 0.2991
Epoch 143: Train D loss: -0.5329, G loss: 0.2732
Epoch 144: Train D loss: -0.5256, G loss: 0.2855
Epoch 145: Train D loss: -0.5587, G loss: 0.3552
Epoch 146: Train D loss: -0.4973, G loss: 0.2657
Epoch 147: Train D loss: -0.5437, G loss: 0.3290
Epoch 148: Train D loss: -0.5680, G loss: 0.3651
Epoch 149: Train D loss: -0.5231, G loss: 0.3805

Epoch 150: Train D loss: -0.5349, G loss: 0.2985
Epoch 151: Train D loss: -0.5348, G loss: 0.2984
Epoch 152: Train D loss: -0.5246, G loss: 0.3332
Epoch 153: Train D loss: -0.5357, G loss: 0.2987
Epoch 154: Train D loss: -0.5090, G loss: 0.3046
Epoch 155: Train D loss: -0.5364, G loss: 0.2304
Epoch 156: Train D loss: -0.5003, G loss: 0.2755
Epoch 157: Train D loss: -0.5618, G loss: 0.3661
Epoch 158: Train D loss: -0.5135, G loss: 0.3532
Epoch 159: Train D loss: -0.5130, G loss: 0.3264

Epoch 160: Train D loss: -0.5334, G loss: 0.2451
Epoch 161: Train D loss: -0.5537, G loss: 0.3472
Epoch 162: Train D loss: -0.4741, G loss: 0.1592
Epoch 163: Train D loss: -0.5463, G loss: 0.2963
Epoch 164: Train D loss: -0.5277, G loss: 0.3333
Epoch 165: Train D loss: -0.4628, G loss: 0.2999
Epoch 166: Train D loss: -0.5380, G loss: 0.2680
Epoch 167: Train D loss: -0.4609, G loss: 0.1788
Epoch 168: Train D loss: -0.5778, G loss: 0.3605
Epoch 169: Train D loss: -0.4881, G loss: 0.2942

Epoch 170: Train D loss: -0.5424, G loss: 0.3343
Epoch 171: Train D loss: -0.5350, G loss: 0.3464
Epoch 172: Train D loss: -0.4992, G loss: 0.2575
Epoch 173: Train D loss: -0.5071, G loss: 0.2116
Epoch 174: Train D loss: -0.5370, G loss: 0.3021
Epoch 175: Train D loss: -0.5177, G loss: 0.3543
Epoch 176: Train D loss: -0.4932, G loss: 0.2682
Epoch 177: Train D loss: -0.5349, G loss: 0.2949
Epoch 178: Train D loss: -0.5051, G loss: 0.2636
Epoch 179: Train D loss: -0.4992, G loss: 0.2721

Epoch 180: Train D loss: -0.5420, G loss: 0.3494
Epoch 181: Train D loss: -0.4919, G loss: 0.3713
Epoch 182: Train D loss: -0.4869, G loss: 0.2652
Epoch 183: Train D loss: -0.4866, G loss: 0.2572
Epoch 184: Train D loss: -0.5344, G loss: 0.3539
Epoch 185: Train D loss: -0.4857, G loss: 0.3095
Epoch 186: Train D loss: -0.5102, G loss: 0.3381
Epoch 187: Train D loss: -0.4874, G loss: 0.2951
Epoch 188: Train D loss: -0.5002, G loss: 0.2418
Epoch 189: Train D loss: -0.4807, G loss: 0.2385

Epoch 190: Train D loss: -0.5155, G loss: 0.3445
Epoch 191: Train D loss: -0.5122, G loss: 0.3259
Epoch 192: Train D loss: -0.4799, G loss: 0.2169
Epoch 193: Train D loss: -0.5173, G loss: 0.3011
Epoch 194: Train D loss: -0.5009, G loss: 0.2789
Epoch 195: Train D loss: -0.4596, G loss: 0.2777
Epoch 196: Train D loss: -0.5154, G loss: 0.2718
Epoch 197: Train D loss: -0.4872, G loss: 0.2967
Epoch 198: Train D loss: -0.4751, G loss: 0.3050
Epoch 199: Train D loss: -0.5244, G loss: 0.2959

Epoch 200: Train D loss: -0.4821, G loss: 0.3262
Epoch 201: Train D loss: -0.4558, G loss: 0.2070
Epoch 202: Train D loss: -0.4959, G loss: 0.3048
Epoch 203: Train D loss: -0.4879, G loss: 0.2530
Epoch 204: Train D loss: -0.4395, G loss: 0.1670
Epoch 205: Train D loss: -0.5073, G loss: 0.3102
Epoch 206: Train D loss: -0.4784, G loss: 0.2495
Epoch 207: Train D loss: -0.4883, G loss: 0.2673
Epoch 208: Train D loss: -0.4747, G loss: 0.2367
Epoch 209: Train D loss: -0.5200, G loss: 0.3213

Epoch 210: Train D loss: -0.4664, G loss: 0.3219
Epoch 211: Train D loss: -0.4904, G loss: 0.2710
Epoch 212: Train D loss: -0.4840, G loss: 0.3310
Epoch 213: Train D loss: -0.5050, G loss: 0.3159
Epoch 214: Train D loss: -0.4826, G loss: 0.3501
Epoch 215: Train D loss: -0.4954, G loss: 0.3450
Epoch 216: Train D loss: -0.4956, G loss: 0.2996
Epoch 217: Train D loss: -0.4661, G loss: 0.3677
Epoch 218: Train D loss: -0.4778, G loss: 0.2072
Epoch 219: Train D loss: -0.4701, G loss: 0.3158

Epoch 220: Train D loss: -0.5315, G loss: 0.3198
Epoch 221: Train D loss: -0.4317, G loss: 0.2696
Epoch 222: Train D loss: -0.5121, G loss: 0.3345
Epoch 223: Train D loss: -0.4901, G loss: 0.3473
Epoch 224: Train D loss: -0.4410, G loss: 0.3118
Epoch 225: Train D loss: -0.5101, G loss: 0.2783
Epoch 226: Train D loss: -0.4633, G loss: 0.3348
Epoch 227: Train D loss: -0.4627, G loss: 0.2943
Epoch 228: Train D loss: -0.4460, G loss: 0.2120
Epoch 229: Train D loss: -0.4821, G loss: 0.3281

Epoch 230: Train D loss: -0.4927, G loss: 0.2944
Epoch 231: Train D loss: -0.4566, G loss: 0.2749
Epoch 232: Train D loss: -0.4568, G loss: 0.2780
Epoch 233: Train D loss: -0.4839, G loss: 0.2552
Epoch 234: Train D loss: -0.4454, G loss: 0.2397
Epoch 235: Train D loss: -0.4916, G loss: 0.3104
Epoch 236: Train D loss: -0.4568, G loss: 0.2490
Epoch 237: Train D loss: -0.4682, G loss: 0.3689
Epoch 238: Train D loss: -0.4643, G loss: 0.3490
Epoch 239: Train D loss: -0.4417, G loss: 0.1772

Epoch 240: Train D loss: -0.4678, G loss: 0.2610
Epoch 241: Train D loss: -0.4616, G loss: 0.2496
Epoch 242: Train D loss: -0.4932, G loss: 0.3001
Epoch 243: Train D loss: -0.4238, G loss: 0.2434
Epoch 244: Train D loss: -0.4829, G loss: 0.3002
Epoch 245: Train D loss: -0.4240, G loss: 0.2755
Epoch 246: Train D loss: -0.5055, G loss: 0.3238
Epoch 247: Train D loss: -0.4266, G loss: 0.1758
Epoch 248: Train D loss: -0.4706, G loss: 0.2863
Epoch 249: Train D loss: -0.4935, G loss: 0.2461

Epoch 250: Train D loss: -0.4247, G loss: 0.2628
Epoch 251: Train D loss: -0.4966, G loss: 0.3026
Epoch 252: Train D loss: -0.4171, G loss: 0.2561
Epoch 253: Train D loss: -0.4626, G loss: 0.2328
Epoch 254: Train D loss: -0.4524, G loss: 0.2890
Epoch 255: Train D loss: -0.4441, G loss: 0.3211
Epoch 256: Train D loss: -0.4232, G loss: 0.1900
Epoch 257: Train D loss: -0.4441, G loss: 0.1332
Epoch 258: Train D loss: -0.4717, G loss: 0.3172
Epoch 259: Train D loss: -0.4622, G loss: 0.2089

[image: generated samples]

Epoch 260: Train D loss: -0.4311, G loss: 0.3081
Epoch 261: Train D loss: -0.4779, G loss: 0.3142
Epoch 262: Train D loss: -0.4163, G loss: 0.3010
Epoch 263: Train D loss: -0.4488, G loss: 0.2603
Epoch 264: Train D loss: -0.4439, G loss: 0.3693
Epoch 265: Train D loss: -0.4231, G loss: 0.2526
Epoch 266: Train D loss: -0.4671, G loss: 0.2866
Epoch 267: Train D loss: -0.4331, G loss: 0.2889
Epoch 268: Train D loss: -0.4832, G loss: 0.3141
Epoch 269: Train D loss: -0.4374, G loss: 0.2532

Epoch 270: Train D loss: -0.4528, G loss: 0.2389
Epoch 271: Train D loss: -0.4498, G loss: 0.3348
Epoch 272: Train D loss: -0.4597, G loss: 0.2681
Epoch 273: Train D loss: -0.4630, G loss: 0.2154
Epoch 274: Train D loss: -0.4551, G loss: 0.2361
Epoch 275: Train D loss: -0.4500, G loss: 0.2494
Epoch 276: Train D loss: -0.4627, G loss: 0.3424
Epoch 277: Train D loss: -0.4245, G loss: 0.1571
Epoch 278: Train D loss: -0.5025, G loss: 0.3117
Epoch 279: Train D loss: -0.4652, G loss: 0.3430

Epoch 280: Train D loss: -0.4332, G loss: 0.3182
Epoch 281: Train D loss: -0.4416, G loss: 0.3479
Epoch 282: Train D loss: -0.4448, G loss: 0.2879
Epoch 283: Train D loss: -0.4374, G loss: 0.2657
Epoch 284: Train D loss: -0.4536, G loss: 0.3117
Epoch 285: Train D loss: -0.4366, G loss: 0.3271
Epoch 286: Train D loss: -0.4170, G loss: 0.1628
Epoch 287: Train D loss: -0.4373, G loss: 0.1984
Epoch 288: Train D loss: -0.4241, G loss: 0.2815
Epoch 289: Train D loss: -0.4520, G loss: 0.2082

Epoch 290: Train D loss: -0.4308, G loss: 0.2137
Epoch 291: Train D loss: -0.4374, G loss: 0.1940
Epoch 292: Train D loss: -0.4687, G loss: 0.3391
Epoch 293: Train D loss: -0.4192, G loss: 0.2842
Epoch 294: Train D loss: -0.4711, G loss: 0.3331
Epoch 295: Train D loss: -0.4143, G loss: 0.2124
Epoch 296: Train D loss: -0.4799, G loss: 0.2863
Epoch 297: Train D loss: -0.4379, G loss: 0.2810
Epoch 298: Train D loss: -0.4576, G loss: 0.3339
Epoch 299: Train D loss: -0.4042, G loss: 0.2448

[image: generated samples and loss curves]

n_d = 5

# G and D model, use DCGAN, note that sigmoid is removed in D
G = DCGenerator(image_size=image_size, latent_dim=latent_dim, output_channel=image_channel).to(device)
D = DCDiscriminator(image_size=image_size, input_channel=image_channel, sigmoid=False).to(device)

# G and D optimizer, use Adam or SGD
G_optimizer = optim.Adam(G.parameters(), lr=learning_rate, betas=betas)
D_optimizer = optim.Adam(D.parameters(), lr=learning_rate, betas=betas)

d_loss_hist, g_loss_hist = run_wgan(trainloader, G, D, G_optimizer, D_optimizer, n_epochs, device, 
                                    latent_dim, n_d, weight_clip)

loss_plot(d_loss_hist, g_loss_hist)
Epoch 0: Train D loss: -0.0498, G loss: 0.0118

[image: generated samples]

Epoch 1: Train D loss: -0.1303, G loss: 0.0704
Epoch 2: Train D loss: -0.3252, G loss: 0.1906
Epoch 3: Train D loss: -0.5369, G loss: 0.2909
Epoch 4: Train D loss: -0.7171, G loss: 0.3722
Epoch 5: Train D loss: -0.8520, G loss: 0.4342
Epoch 6: Train D loss: -0.9347, G loss: 0.4683
Epoch 7: Train D loss: -1.0296, G loss: 0.5066
Epoch 8: Train D loss: -1.1149, G loss: 0.5377
Epoch 9: Train D loss: -1.1570, G loss: 0.5572

[image: generated samples]

Epoch 10: Train D loss: -0.9942, G loss: 0.5635
Epoch 11: Train D loss: -1.2075, G loss: 0.5844
Epoch 12: Train D loss: -1.2513, G loss: 0.6020
Epoch 13: Train D loss: -1.2834, G loss: 0.6129
Epoch 14: Train D loss: -1.2965, G loss: 0.6197
Epoch 15: Train D loss: -1.2924, G loss: 0.6268
Epoch 16: Train D loss: -1.2905, G loss: 0.6274
Epoch 17: Train D loss: -1.2297, G loss: 0.6119
Epoch 18: Train D loss: -1.2333, G loss: 0.6283
Epoch 19: Train D loss: -1.1277, G loss: 0.5800

[image: generated samples]

Epoch 20: Train D loss: -1.1618, G loss: 0.6162
Epoch 21: Train D loss: -1.2675, G loss: 0.6286
Epoch 22: Train D loss: -1.2346, G loss: 0.6221
Epoch 23: Train D loss: -1.3001, G loss: 0.6336
Epoch 24: Train D loss: -1.2687, G loss: 0.6253
Epoch 25: Train D loss: -1.2169, G loss: 0.6226
Epoch 26: Train D loss: -1.2275, G loss: 0.6279
Epoch 27: Train D loss: -1.0138, G loss: 0.5700
Epoch 28: Train D loss: -1.0239, G loss: 0.5825
Epoch 29: Train D loss: -1.1723, G loss: 0.5944

[image: generated samples]

Epoch 30: Train D loss: -1.2025, G loss: 0.6059
Epoch 31: Train D loss: -1.2188, G loss: 0.6016
Epoch 32: Train D loss: -1.0934, G loss: 0.6026
Epoch 33: Train D loss: -1.2112, G loss: 0.6125
Epoch 34: Train D loss: -1.2045, G loss: 0.5984
Epoch 35: Train D loss: -1.1119, G loss: 0.5849
Epoch 36: Train D loss: -1.1861, G loss: 0.5869
Epoch 37: Train D loss: -1.1273, G loss: 0.5839
Epoch 38: Train D loss: -1.1051, G loss: 0.5551
Epoch 39: Train D loss: -1.1286, G loss: 0.5817

[image: generated samples]

Epoch 40: Train D loss: -1.0711, G loss: 0.5609
Epoch 41: Train D loss: -1.0567, G loss: 0.5515
Epoch 42: Train D loss: -0.9367, G loss: 0.5237
Epoch 43: Train D loss: -1.0113, G loss: 0.5598
Epoch 44: Train D loss: -0.9749, G loss: 0.5267
Epoch 45: Train D loss: -0.9543, G loss: 0.5109
Epoch 46: Train D loss: -1.0083, G loss: 0.5194
Epoch 47: Train D loss: -0.9908, G loss: 0.5217
Epoch 48: Train D loss: -1.0203, G loss: 0.5546
Epoch 49: Train D loss: -1.0030, G loss: 0.5567

[image: generated samples]

Epoch 50: Train D loss: -1.0205, G loss: 0.5295
Epoch 51: Train D loss: -0.9432, G loss: 0.5310
Epoch 52: Train D loss: -0.9623, G loss: 0.5147
Epoch 53: Train D loss: -0.9891, G loss: 0.5395
Epoch 54: Train D loss: -0.9674, G loss: 0.5117
Epoch 55: Train D loss: -0.9610, G loss: 0.5144
Epoch 56: Train D loss: -0.9394, G loss: 0.5026
Epoch 57: Train D loss: -0.9432, G loss: 0.4979
Epoch 58: Train D loss: -0.8897, G loss: 0.4692
Epoch 59: Train D loss: -0.9506, G loss: 0.5228

[image: generated samples]

Epoch 60: Train D loss: -0.9208, G loss: 0.5056
Epoch 61: Train D loss: -0.9101, G loss: 0.4958
Epoch 62: Train D loss: -0.8413, G loss: 0.5095
Epoch 63: Train D loss: -0.9268, G loss: 0.4940
Epoch 64: Train D loss: -0.7816, G loss: 0.4402
Epoch 65: Train D loss: -0.9127, G loss: 0.5004
Epoch 66: Train D loss: -0.9370, G loss: 0.5245
Epoch 67: Train D loss: -0.8447, G loss: 0.4552
Epoch 68: Train D loss: -0.8280, G loss: 0.4825
Epoch 69: Train D loss: -0.9060, G loss: 0.4782

[image: generated samples]

Epoch 70: Train D loss: -0.9183, G loss: 0.4999
Epoch 71: Train D loss: -0.8661, G loss: 0.4900
Epoch 72: Train D loss: -0.8760, G loss: 0.4230
Epoch 73: Train D loss: -0.7779, G loss: 0.4799
Epoch 74: Train D loss: -0.9086, G loss: 0.5033
Epoch 75: Train D loss: -0.8061, G loss: 0.4596
Epoch 76: Train D loss: -0.8455, G loss: 0.4589
Epoch 77: Train D loss: -0.8155, G loss: 0.4482
Epoch 78: Train D loss: -0.8859, G loss: 0.4782
Epoch 79: Train D loss: -0.8398, G loss: 0.4290

[image: generated samples]

Epoch 80: Train D loss: -0.8457, G loss: 0.4550
Epoch 81: Train D loss: -0.8206, G loss: 0.4336
Epoch 82: Train D loss: -0.8132, G loss: 0.4383
Epoch 83: Train D loss: -0.8790, G loss: 0.4850
Epoch 84: Train D loss: -0.8539, G loss: 0.4894
Epoch 85: Train D loss: -0.8460, G loss: 0.4521
Epoch 86: Train D loss: -0.8292, G loss: 0.4823
Epoch 87: Train D loss: -0.8681, G loss: 0.4889
Epoch 88: Train D loss: -0.7984, G loss: 0.4585
Epoch 89: Train D loss: -0.8420, G loss: 0.4467

[image: generated samples]

Epoch 90: Train D loss: -0.6950, G loss: 0.3452
Epoch 91: Train D loss: -0.9381, G loss: 0.5037
Epoch 92: Train D loss: -0.6778, G loss: 0.3609
Epoch 93: Train D loss: -0.9316, G loss: 0.4965
Epoch 94: Train D loss: -0.7886, G loss: 0.4451
Epoch 95: Train D loss: -0.7759, G loss: 0.3888
Epoch 96: Train D loss: -0.8151, G loss: 0.4879
Epoch 97: Train D loss: -0.7749, G loss: 0.4183
Epoch 98: Train D loss: -0.7832, G loss: 0.4181
Epoch 99: Train D loss: -0.8032, G loss: 0.4556

[image: generated samples]

Epoch 100: Train D loss: -0.8334, G loss: 0.4964
Epoch 101: Train D loss: -0.8033, G loss: 0.4483
Epoch 102: Train D loss: -0.7647, G loss: 0.4101
Epoch 103: Train D loss: -0.8006, G loss: 0.4228
Epoch 104: Train D loss: -0.7677, G loss: 0.4692
Epoch 105: Train D loss: -0.8417, G loss: 0.4730
Epoch 106: Train D loss: -0.7593, G loss: 0.4531
Epoch 107: Train D loss: -0.7748, G loss: 0.4201
Epoch 108: Train D loss: -0.7863, G loss: 0.4877
Epoch 109: Train D loss: -0.8630, G loss: 0.4825

[image: generated samples]

Epoch 110: Train D loss: -0.7701, G loss: 0.4323
Epoch 111: Train D loss: -0.7747, G loss: 0.4315
Epoch 112: Train D loss: -0.7496, G loss: 0.4397
Epoch 113: Train D loss: -0.8011, G loss: 0.3997
Epoch 114: Train D loss: -0.7878, G loss: 0.3442
Epoch 115: Train D loss: -0.7460, G loss: 0.4705
Epoch 116: Train D loss: -0.8168, G loss: 0.4377
Epoch 117: Train D loss: -0.7708, G loss: 0.4669
Epoch 118: Train D loss: -0.7941, G loss: 0.4279
Epoch 119: Train D loss: -0.7487, G loss: 0.4724

Epoch 120: Train D loss: -0.7531, G loss: 0.4362
Epoch 121: Train D loss: -0.7665, G loss: 0.4447
Epoch 122: Train D loss: -0.7073, G loss: 0.3246
Epoch 123: Train D loss: -0.7415, G loss: 0.3181
Epoch 124: Train D loss: -0.8371, G loss: 0.4748
Epoch 125: Train D loss: -0.6830, G loss: 0.3035
Epoch 126: Train D loss: -0.7337, G loss: 0.4533
Epoch 127: Train D loss: -0.7250, G loss: 0.4284
Epoch 128: Train D loss: -0.7476, G loss: 0.3011
Epoch 129: Train D loss: -0.7296, G loss: 0.4449

[image: generated samples]

Epoch 130: Train D loss: -0.7725, G loss: 0.4172
Epoch 131: Train D loss: -0.7151, G loss: 0.4161
Epoch 132: Train D loss: -0.7459, G loss: 0.4034
Epoch 133: Train D loss: -0.7157, G loss: 0.3320
Epoch 134: Train D loss: -0.7521, G loss: 0.4033
Epoch 135: Train D loss: -0.6922, G loss: 0.2842
Epoch 136: Train D loss: -0.7865, G loss: 0.4374
Epoch 137: Train D loss: -0.7560, G loss: 0.4191
Epoch 138: Train D loss: -0.7504, G loss: 0.3817
Epoch 139: Train D loss: -0.6792, G loss: 0.3853

Epoch 140: Train D loss: -0.8732, G loss: 0.4929
Epoch 141: Train D loss: -0.6861, G loss: 0.3828
Epoch 142: Train D loss: -0.7873, G loss: 0.4018
Epoch 143: Train D loss: -0.6585, G loss: 0.3191
Epoch 144: Train D loss: -0.8468, G loss: 0.4863
Epoch 145: Train D loss: -0.6688, G loss: 0.4658
Epoch 146: Train D loss: -0.7232, G loss: 0.3666
Epoch 147: Train D loss: -0.7330, G loss: 0.4593
Epoch 148: Train D loss: -0.7241, G loss: 0.3961
Epoch 149: Train D loss: -0.7380, G loss: 0.3940

Epoch 150: Train D loss: -0.6803, G loss: 0.3510
Epoch 151: Train D loss: -0.7470, G loss: 0.4280
Epoch 152: Train D loss: -0.7362, G loss: 0.3690
Epoch 153: Train D loss: -0.7216, G loss: 0.3798
Epoch 154: Train D loss: -0.7178, G loss: 0.4439
Epoch 155: Train D loss: -0.7250, G loss: 0.3090
Epoch 156: Train D loss: -0.7732, G loss: 0.4513
Epoch 157: Train D loss: -0.7239, G loss: 0.3663
Epoch 158: Train D loss: -0.6600, G loss: 0.2951
Epoch 159: Train D loss: -0.7248, G loss: 0.3944

Epoch 160: Train D loss: -0.7476, G loss: 0.3511
Epoch 161: Train D loss: -0.6630, G loss: 0.3017
Epoch 162: Train D loss: -0.7118, G loss: 0.3148
Epoch 163: Train D loss: -0.7102, G loss: 0.3027
Epoch 164: Train D loss: -0.6579, G loss: 0.3612
Epoch 165: Train D loss: -0.7652, G loss: 0.3983
Epoch 166: Train D loss: -0.6637, G loss: 0.2630
Epoch 167: Train D loss: -0.7033, G loss: 0.3768
Epoch 168: Train D loss: -0.7261, G loss: 0.3822
Epoch 169: Train D loss: -0.7256, G loss: 0.3892

Epoch 170: Train D loss: -0.6815, G loss: 0.3225
Epoch 171: Train D loss: -0.6621, G loss: 0.3520
Epoch 172: Train D loss: -0.7397, G loss: 0.4141
Epoch 173: Train D loss: -0.7067, G loss: 0.3764
Epoch 174: Train D loss: -0.7317, G loss: 0.4211
Epoch 175: Train D loss: -0.6938, G loss: 0.3156
Epoch 176: Train D loss: -0.6960, G loss: 0.3748
Epoch 177: Train D loss: -0.7222, G loss: 0.4367
Epoch 178: Train D loss: -0.6672, G loss: 0.3683
Epoch 179: Train D loss: -0.7037, G loss: 0.3822

Epoch 180: Train D loss: -0.6948, G loss: 0.4413
Epoch 181: Train D loss: -0.6633, G loss: 0.3511
Epoch 182: Train D loss: -0.6615, G loss: 0.4381
Epoch 183: Train D loss: -0.7216, G loss: 0.4092
Epoch 184: Train D loss: -0.6599, G loss: 0.3071
Epoch 185: Train D loss: -0.6838, G loss: 0.4535
Epoch 186: Train D loss: -0.7259, G loss: 0.3740
Epoch 187: Train D loss: -0.6939, G loss: 0.2970
Epoch 188: Train D loss: -0.6751, G loss: 0.3970
Epoch 189: Train D loss: -0.7300, G loss: 0.3603

Epoch 190: Train D loss: -0.6959, G loss: 0.4795
Epoch 191: Train D loss: -0.6309, G loss: 0.3895
Epoch 192: Train D loss: -0.6581, G loss: 0.4188
Epoch 193: Train D loss: -0.6867, G loss: 0.3905
Epoch 194: Train D loss: -0.7255, G loss: 0.4369
Epoch 195: Train D loss: -0.6657, G loss: 0.3930
Epoch 196: Train D loss: -0.6872, G loss: 0.4678
Epoch 197: Train D loss: -0.6756, G loss: 0.3764
Epoch 198: Train D loss: -0.6562, G loss: 0.3925
Epoch 199: Train D loss: -0.6776, G loss: 0.3102

[image: generated samples]

Epoch 200: Train D loss: -0.6722, G loss: 0.3223
Epoch 201: Train D loss: -0.6819, G loss: 0.3834
Epoch 202: Train D loss: -0.7003, G loss: 0.3767
Epoch 203: Train D loss: -0.6241, G loss: 0.3667
Epoch 204: Train D loss: -0.6869, G loss: 0.4294
Epoch 205: Train D loss: -0.6736, G loss: 0.4085
Epoch 206: Train D loss: -0.6643, G loss: 0.3876
Epoch 207: Train D loss: -0.6726, G loss: 0.3457
Epoch 208: Train D loss: -0.6147, G loss: 0.2352
Epoch 209: Train D loss: -0.6277, G loss: 0.3010

Epoch 210: Train D loss: -0.7033, G loss: 0.4098
Epoch 211: Train D loss: -0.6933, G loss: 0.4020
Epoch 212: Train D loss: -0.6218, G loss: 0.3895
Epoch 213: Train D loss: -0.6677, G loss: 0.3395
Epoch 214: Train D loss: -0.6663, G loss: 0.4399
Epoch 215: Train D loss: -0.6661, G loss: 0.4621
Epoch 216: Train D loss: -0.6264, G loss: 0.3585
Epoch 217: Train D loss: -0.6356, G loss: 0.3457
Epoch 218: Train D loss: -0.6422, G loss: 0.3604
Epoch 219: Train D loss: -0.6284, G loss: 0.2832

Epoch 220: Train D loss: -0.6343, G loss: 0.3843
Epoch 221: Train D loss: -0.6317, G loss: 0.4756
Epoch 222: Train D loss: -0.6197, G loss: 0.3029
Epoch 223: Train D loss: -0.6345, G loss: 0.3948
Epoch 224: Train D loss: -0.6453, G loss: 0.3045
Epoch 225: Train D loss: -0.6744, G loss: 0.4691
Epoch 226: Train D loss: -0.6615, G loss: 0.3710
Epoch 227: Train D loss: -0.6276, G loss: 0.3870
Epoch 228: Train D loss: -0.6479, G loss: 0.3536
Epoch 229: Train D loss: -0.6284, G loss: 0.3681

Epoch 230: Train D loss: -0.6199, G loss: 0.4000
Epoch 231: Train D loss: -0.7007, G loss: 0.4761
Epoch 232: Train D loss: -0.6447, G loss: 0.3475
Epoch 233: Train D loss: -0.5693, G loss: 0.3419
Epoch 234: Train D loss: -0.6634, G loss: 0.4123
Epoch 235: Train D loss: -0.6644, G loss: 0.4239
Epoch 236: Train D loss: -0.5915, G loss: 0.3670
Epoch 237: Train D loss: -0.6020, G loss: 0.3126
Epoch 238: Train D loss: -0.5968, G loss: 0.2184
Epoch 239: Train D loss: -0.6230, G loss: 0.3268

Epoch 240: Train D loss: -0.6126, G loss: 0.3568
Epoch 241: Train D loss: -0.6470, G loss: 0.3765
Epoch 242: Train D loss: -0.5864, G loss: 0.3381
Epoch 243: Train D loss: -0.5856, G loss: 0.3126
Epoch 244: Train D loss: -0.6790, G loss: 0.3583
Epoch 245: Train D loss: -0.5710, G loss: 0.1709
Epoch 246: Train D loss: -0.6388, G loss: 0.4037
Epoch 247: Train D loss: -0.7029, G loss: 0.4048
Epoch 248: Train D loss: -0.6213, G loss: 0.4806
Epoch 249: Train D loss: -0.6230, G loss: 0.4364

Epoch 250: Train D loss: -0.5627, G loss: 0.2158
Epoch 251: Train D loss: -0.6097, G loss: 0.3876
Epoch 252: Train D loss: -0.6769, G loss: 0.3288
Epoch 253: Train D loss: -0.5295, G loss: 0.4380
Epoch 254: Train D loss: -0.6278, G loss: 0.3596
Epoch 255: Train D loss: -0.6062, G loss: 0.3896
Epoch 256: Train D loss: -0.6067, G loss: 0.4381
Epoch 257: Train D loss: -0.5900, G loss: 0.1975
Epoch 258: Train D loss: -0.6189, G loss: 0.2696
Epoch 259: Train D loss: -0.6229, G loss: 0.4171

[image: generated samples]

Epoch 260: Train D loss: -0.6049, G loss: 0.3565
Epoch 261: Train D loss: -0.5734, G loss: 0.3521
Epoch 262: Train D loss: -0.6421, G loss: 0.4335
Epoch 263: Train D loss: -0.5919, G loss: 0.3178
Epoch 264: Train D loss: -0.5892, G loss: 0.3091
Epoch 265: Train D loss: -0.5760, G loss: 0.1836
Epoch 266: Train D loss: -0.6256, G loss: 0.3794
Epoch 267: Train D loss: -0.6587, G loss: 0.4308
Epoch 268: Train D loss: -0.5421, G loss: 0.2269
Epoch 269: Train D loss: -0.7041, G loss: 0.4481

[image: generated samples]

Epoch 270: Train D loss: -0.6716, G loss: 0.4032
Epoch 271: Train D loss: -0.5258, G loss: 0.2325
Epoch 272: Train D loss: -0.6333, G loss: 0.4433
Epoch 273: Train D loss: -0.6654, G loss: 0.3966
Epoch 274: Train D loss: -0.5510, G loss: 0.2878
Epoch 275: Train D loss: -0.5713, G loss: 0.3841
Epoch 276: Train D loss: -0.5891, G loss: 0.3344
Epoch 277: Train D loss: -0.6262, G loss: 0.3101
Epoch 278: Train D loss: -0.6261, G loss: 0.3268
Epoch 279: Train D loss: -0.5313, G loss: 0.2680

Epoch 280: Train D loss: -0.5970, G loss: 0.2476
Epoch 281: Train D loss: -0.6034, G loss: 0.3639
Epoch 282: Train D loss: -0.6163, G loss: 0.3793
Epoch 283: Train D loss: -0.5957, G loss: 0.3937
Epoch 284: Train D loss: -0.5650, G loss: 0.3543
Epoch 285: Train D loss: -0.5554, G loss: 0.1889
Epoch 286: Train D loss: -0.6230, G loss: 0.4319
Epoch 287: Train D loss: -0.6013, G loss: 0.2979
Epoch 288: Train D loss: -0.5942, G loss: 0.3613
Epoch 289: Train D loss: -0.5624, G loss: 0.3098

[image: generated samples]

Epoch 290: Train D loss: -0.6021, G loss: 0.3838
Epoch 291: Train D loss: -0.6086, G loss: 0.4250
Epoch 292: Train D loss: -0.5721, G loss: 0.3718
Epoch 293: Train D loss: -0.5672, G loss: 0.3573
Epoch 294: Train D loss: -0.5760, G loss: 0.1761
Epoch 295: Train D loss: -0.5797, G loss: 0.4052
Epoch 296: Train D loss: -0.5579, G loss: 0.1919
Epoch 297: Train D loss: -0.5588, G loss: 0.2167
Epoch 298: Train D loss: -0.5803, G loss: 0.2816
Epoch 299: Train D loss: -0.5546, G loss: 0.3150

[image: generated samples and loss curves]

WGAN-GP

In WGAN the discriminator's weights have to be clipped, and experiments show that deeper WGANs have trouble converging.

The rough reasons are as follows:

  1. Experiments show that most weights end up sitting exactly at -c or c, which means the majority of parameters can effectively take only two values. For a deep neural network this is a waste of its powerful fitting capacity.
  2. Clipping easily causes vanishing or exploding gradients. The discriminator is a multi-layer network: if the clip threshold is set slightly too small, the gradient shrinks a little at every layer and decays exponentially with depth; if it is set slightly too large, the gradient explodes instead.

WGAN-GP therefore replaces clipping with a gradient penalty.
Since the Lipschitz constraint only requires the discriminator's gradient norm to stay below K, it can be enforced directly with an extra loss term, and the improved objective for D becomes:
$$
L_D = \mathbb{E}_{\tilde{x} \sim P_g}\left[D(\tilde{x})\right] - \mathbb{E}_{x \sim P_r}\left[D(x)\right] + \lambda \, \mathbb{E}_{\hat{x} \sim P_{\hat{x}}}\left[\left(\left\lVert \nabla_{\hat{x}} D(\hat{x}) \right\rVert_2 - 1\right)^2\right]
$$

where $\hat{x} = \epsilon x + (1 - \epsilon)\tilde{x}$ with $\epsilon \sim U[0, 1]$, i.e. $\hat{x}$ is sampled uniformly along straight lines between real and generated samples.

Below is the concrete implementation of WGAN-GP. As with WGAN, we only implement its training code and reuse the DCGAN models directly; the three terms of the objective above correspond to d_fake_loss, d_real_loss and lambda_ * gradient_penalty in the code.

import torch.autograd as autograd

def wgan_gp_train(trainloader, G, D, G_optimizer, D_optimizer, device, z_dim, lambda_=10, n_d=2):
    D.train()
    G.train()

    D_total_loss = 0
    G_total_loss = 0

    for i, (x, _) in enumerate(trainloader):
        x = x.to(device)

        # update D network
        # D optimizer zero grads
        D_optimizer.zero_grad()

        # D real loss from real images
        d_real = D(x)
        d_real_loss = - d_real.mean()

        # D fake loss from fake images generated by G
        z = torch.rand(x.size(0), z_dim).to(device)
        g_z = G(z)
        d_fake = D(g_z)
        d_fake_loss = d_fake.mean()

        # D gradient penalty: interpolate between real and fake samples,
        # x_hat = epsilon * x + (1 - epsilon) * g_z with epsilon ~ U[0, 1],
        # so x_hat lies on the straight line between x and g_z
        epsilon = torch.rand(x.size(0), 1, 1, 1).to(device)
        x_hat = epsilon * x + (1 - epsilon) * g_z
        x_hat.requires_grad_(True)

        y_hat = D(x_hat)
        # compute the gradient of y_hat with respect to x_hat
        gradients = autograd.grad(outputs=y_hat, inputs=x_hat,
                                  grad_outputs=torch.ones_like(y_hat),
                                  create_graph=True, retain_graph=True, only_inputs=True)[0]
        # penalize the deviation of the per-sample gradient norm from 1
        gradient_penalty = torch.mean((gradients.view(gradients.size(0), -1).norm(p=2, dim=1) - 1) ** 2)

        # D backward and step
        d_loss = d_real_loss + d_fake_loss + lambda_ * gradient_penalty
        d_loss.backward()
        D_optimizer.step()

        D_total_loss += d_loss.item()

        # update G network once every n_d minibatches
        if (i + 1) % n_d == 0:
            # G optimizer zero grads
            G_optimizer.zero_grad()

            # G loss
            g_z = G(z)
            d_fake = D(g_z)
            g_loss = - d_fake.mean()

            # G backward and step
            g_loss.backward()
            G_optimizer.step()

            G_total_loss += g_loss.item()

    return D_total_loss / len(trainloader), G_total_loss * n_d / len(trainloader)
# hyper params

# z dim
latent_dim = 100

# image size and channel
image_size=32
image_channel=3

# Adam lr and betas
learning_rate = 0.0002
betas = (0.5, 0.999)

# epochs and batch size
n_epochs = 300
batch_size = 32

# device : cpu or cuda:0/1/2/3
device = torch.device('cuda:0')

# n_d: train D
n_d = 2
lambda_ = 10

# furniture dataset and dataloader
train_dataset = load_furniture_data()
trainloader = torch.utils.data.DataLoader(train_dataset, batch_size=batch_size, shuffle=True)

# G and D model, use DCGAN, note that sigmoid is removed in D
G = DCGenerator(image_size=image_size, latent_dim=latent_dim, output_channel=image_channel).to(device)
D = DCDiscriminator(image_size=image_size, input_channel=image_channel, sigmoid=False).to(device)

# G and D optimizer, use Adam or SGD
G_optimizer = optim.Adam(G.parameters(), lr=learning_rate, betas=betas)
D_optimizer = optim.Adam(D.parameters(), lr=learning_rate, betas=betas)

d_loss_hist = []
g_loss_hist = []

for epoch in range(n_epochs):
    d_loss, g_loss = wgan_gp_train(trainloader, G, D, G_optimizer, D_optimizer, device, 
                           z_dim=latent_dim, lambda_=lambda_, n_d=n_d)
    print('Epoch {}: Train D loss: {:.4f}, G loss: {:.4f}'.format(epoch, d_loss, g_loss))
    
    d_loss_hist.append(d_loss)
    g_loss_hist.append(g_loss)
    
    if epoch == 0 or (epoch + 1) % 10 == 0:
        visualize_results(G, device, latent_dim)
Epoch 0: Train D loss: 2.3375, G loss: 1.6076

[generated samples after epoch 0]

Epoch 1: Train D loss: -7.0229, G loss: 7.2572
Epoch 2: Train D loss: -12.6825, G loss: 15.3234
Epoch 3: Train D loss: -17.3524, G loss: 20.3854
Epoch 4: Train D loss: -24.7355, G loss: 23.2378
Epoch 5: Train D loss: -20.9602, G loss: 21.4061
Epoch 6: Train D loss: -15.5647, G loss: 18.6234
Epoch 7: Train D loss: -13.5252, G loss: 15.8746
Epoch 8: Train D loss: -13.6425, G loss: 16.4975
Epoch 9: Train D loss: -13.2946, G loss: 17.4806

[generated samples after epoch 9]

Epoch 10: Train D loss: -12.6535, G loss: 18.0911
Epoch 11: Train D loss: -12.6453, G loss: 18.6485
Epoch 12: Train D loss: -12.1076, G loss: 18.9110
Epoch 13: Train D loss: -12.3911, G loss: 18.0021
Epoch 14: Train D loss: -12.1081, G loss: 18.4543
Epoch 15: Train D loss: -10.8537, G loss: 18.2820
Epoch 16: Train D loss: -10.4333, G loss: 18.4033
Epoch 17: Train D loss: -10.7065, G loss: 18.2367
Epoch 18: Train D loss: -11.0685, G loss: 18.3941
Epoch 19: Train D loss: -10.6738, G loss: 18.5614

[generated samples after epoch 19]

Epoch 20: Train D loss: -10.5931, G loss: 19.2672
Epoch 21: Train D loss: -11.5785, G loss: 19.7935
Epoch 22: Train D loss: -10.2580, G loss: 20.9359
Epoch 23: Train D loss: -11.0469, G loss: 20.5477
Epoch 24: Train D loss: -12.0539, G loss: 21.4792
Epoch 25: Train D loss: -7.4531, G loss: 21.8701
Epoch 26: Train D loss: -10.1882, G loss: 20.7676
Epoch 27: Train D loss: -11.5679, G loss: 22.7503
Epoch 28: Train D loss: -9.7243, G loss: 21.6977
Epoch 29: Train D loss: -6.8135, G loss: 21.4995

[generated samples after epoch 29]

Epoch 30: Train D loss: -3.8788, G loss: 23.3653
Epoch 31: Train D loss: -6.7727, G loss: 26.0506
Epoch 32: Train D loss: -7.8814, G loss: 24.3415
Epoch 33: Train D loss: -8.5027, G loss: 24.1361
Epoch 34: Train D loss: -8.8884, G loss: 23.4721
Epoch 35: Train D loss: -8.9071, G loss: 23.6977
Epoch 36: Train D loss: -9.1270, G loss: 23.6131
Epoch 37: Train D loss: -9.2372, G loss: 24.9809
Epoch 38: Train D loss: -8.9897, G loss: 23.9358
Epoch 39: Train D loss: -8.8727, G loss: 24.8954

[generated samples after epoch 39]

Epoch 40: Train D loss: -8.0116, G loss: 23.0790
Epoch 41: Train D loss: -7.1674, G loss: 24.1048
Epoch 42: Train D loss: -7.0950, G loss: 25.1263
Epoch 43: Train D loss: -7.2116, G loss: 24.0249
Epoch 44: Train D loss: -7.1526, G loss: 23.1523
Epoch 45: Train D loss: -7.0267, G loss: 22.5925
Epoch 46: Train D loss: -6.8523, G loss: 23.7724
Epoch 47: Train D loss: -6.1514, G loss: 23.5108
Epoch 48: Train D loss: -6.3261, G loss: 22.2509
Epoch 49: Train D loss: -5.9133, G loss: 23.4143

[generated samples after epoch 49]

Epoch 50: Train D loss: -4.8301, G loss: 22.4895
Epoch 51: Train D loss: -5.4982, G loss: 23.5148
Epoch 52: Train D loss: -5.7327, G loss: 23.0237
Epoch 53: Train D loss: -5.3672, G loss: 24.5981
Epoch 54: Train D loss: -5.8419, G loss: 24.1544
Epoch 55: Train D loss: -5.6617, G loss: 22.1073
Epoch 56: Train D loss: -6.1638, G loss: 23.3523
Epoch 57: Train D loss: -5.3853, G loss: 22.8904
Epoch 58: Train D loss: -2.4469, G loss: 21.8208
Epoch 59: Train D loss: -3.5256, G loss: 24.0280

[generated samples after epoch 59]

Epoch 60: Train D loss: -5.3344, G loss: 27.0730
Epoch 61: Train D loss: -5.4392, G loss: 21.9599
Epoch 62: Train D loss: -5.2358, G loss: 24.5099
Epoch 63: Train D loss: -5.7057, G loss: 23.1977
Epoch 64: Train D loss: -5.6823, G loss: 22.8753
Epoch 65: Train D loss: -5.1993, G loss: 23.6504
Epoch 66: Train D loss: -5.3119, G loss: 23.8633
Epoch 67: Train D loss: -5.3521, G loss: 24.2901
Epoch 68: Train D loss: -5.4217, G loss: 23.8531
Epoch 69: Train D loss: -5.4006, G loss: 24.5350

[generated samples after epoch 69]

Epoch 70: Train D loss: -5.2107, G loss: 25.2159
Epoch 71: Train D loss: -6.3511, G loss: 24.4599
Epoch 72: Train D loss: -5.6472, G loss: 23.6516
Epoch 73: Train D loss: -5.6285, G loss: 24.9873
Epoch 74: Train D loss: -5.5940, G loss: 24.5002
Epoch 75: Train D loss: -5.8774, G loss: 25.4834
Epoch 76: Train D loss: -5.9048, G loss: 23.9710
Epoch 77: Train D loss: -5.1310, G loss: 23.3302
Epoch 78: Train D loss: -1.0480, G loss: 23.2426
Epoch 79: Train D loss: -1.7491, G loss: 24.9195

[generated samples after epoch 79]

Epoch 80: Train D loss: -2.5135, G loss: 25.6086
Epoch 81: Train D loss: -3.4346, G loss: 26.2362
Epoch 82: Train D loss: -3.9430, G loss: 26.8000
Epoch 83: Train D loss: -4.2262, G loss: 26.7070
Epoch 84: Train D loss: -4.5304, G loss: 26.7027
Epoch 85: Train D loss: -5.1061, G loss: 26.6735
Epoch 86: Train D loss: -5.4108, G loss: 26.9123
Epoch 87: Train D loss: -5.3472, G loss: 26.7057
Epoch 88: Train D loss: -5.5203, G loss: 28.1240
Epoch 89: Train D loss: -5.7020, G loss: 26.9564

[generated samples after epoch 89]

Epoch 90: Train D loss: -5.9846, G loss: 29.0245
Epoch 91: Train D loss: -6.1918, G loss: 27.8217
Epoch 92: Train D loss: -5.7079, G loss: 28.7783
Epoch 93: Train D loss: -5.5706, G loss: 30.4015
Epoch 94: Train D loss: -5.6908, G loss: 27.9081
Epoch 95: Train D loss: -5.9386, G loss: 29.9694
Epoch 96: Train D loss: -6.6543, G loss: 28.9369
Epoch 97: Train D loss: -5.7934, G loss: 31.1125
Epoch 98: Train D loss: -5.9300, G loss: 29.5650
Epoch 99: Train D loss: -6.4245, G loss: 31.2284

[generated samples after epoch 99]

Epoch 100: Train D loss: -5.6557, G loss: 28.2230
Epoch 101: Train D loss: -6.7430, G loss: 30.7428
Epoch 102: Train D loss: -6.2389, G loss: 31.9553
Epoch 103: Train D loss: -6.9982, G loss: 28.7615
Epoch 104: Train D loss: -4.3206, G loss: 30.3946
Epoch 105: Train D loss: -6.2643, G loss: 30.9668
Epoch 106: Train D loss: -5.4260, G loss: 31.3210
Epoch 107: Train D loss: -4.1328, G loss: 29.2331
Epoch 108: Train D loss: -3.0862, G loss: 29.3588
Epoch 109: Train D loss: -5.2220, G loss: 32.4586

[generated samples after epoch 109]

Epoch 110: Train D loss: -6.1161, G loss: 31.5719
Epoch 111: Train D loss: -6.0992, G loss: 31.7885
Epoch 112: Train D loss: -6.3759, G loss: 31.3707
Epoch 113: Train D loss: -6.8123, G loss: 32.2436
Epoch 114: Train D loss: -6.1770, G loss: 30.8075
Epoch 115: Train D loss: -6.5573, G loss: 31.9127
Epoch 116: Train D loss: -6.6594, G loss: 31.6124
Epoch 117: Train D loss: -5.4032, G loss: 32.9919
Epoch 118: Train D loss: -6.2448, G loss: 34.8807
Epoch 119: Train D loss: -6.3115, G loss: 32.0648

[generated samples after epoch 119]

Epoch 120: Train D loss: -6.6409, G loss: 32.1153
Epoch 121: Train D loss: -6.7388, G loss: 32.1203
Epoch 122: Train D loss: -5.8512, G loss: 33.9175
Epoch 123: Train D loss: -6.1638, G loss: 31.9460
Epoch 124: Train D loss: -6.6893, G loss: 32.3351
Epoch 125: Train D loss: -6.6725, G loss: 32.5981
Epoch 126: Train D loss: -1.0918, G loss: 30.5706
Epoch 127: Train D loss: -0.9223, G loss: 29.5487
Epoch 128: Train D loss: -1.8801, G loss: 30.5785
Epoch 129: Train D loss: -2.1809, G loss: 33.0878

[generated samples after epoch 129]

Epoch 130: Train D loss: -2.4127, G loss: 31.3984
Epoch 131: Train D loss: -3.2557, G loss: 32.2457
Epoch 132: Train D loss: -3.5636, G loss: 34.5859
Epoch 133: Train D loss: -4.2362, G loss: 33.3747
Epoch 134: Train D loss: -4.6183, G loss: 34.2370
Epoch 135: Train D loss: -5.1151, G loss: 34.3615
Epoch 136: Train D loss: -5.3290, G loss: 34.6819
Epoch 137: Train D loss: -5.4942, G loss: 34.0987
Epoch 138: Train D loss: -6.2746, G loss: 35.9311
Epoch 139: Train D loss: -5.7736, G loss: 32.7043

[generated samples after epoch 139]

Epoch 140: Train D loss: -5.9902, G loss: 36.1289
Epoch 141: Train D loss: -5.1590, G loss: 34.8565
Epoch 142: Train D loss: -6.1068, G loss: 35.5381
Epoch 143: Train D loss: -5.7384, G loss: 36.0594
Epoch 144: Train D loss: -5.8503, G loss: 36.7146
Epoch 145: Train D loss: -5.3819, G loss: 33.3955
Epoch 146: Train D loss: -6.1108, G loss: 35.4723
Epoch 147: Train D loss: -5.9595, G loss: 35.1426
Epoch 148: Train D loss: -7.8403, G loss: 36.2727
Epoch 149: Train D loss: -2.9989, G loss: 34.7234

[generated samples after epoch 149]

Epoch 150: Train D loss: -6.0890, G loss: 36.6619
Epoch 151: Train D loss: -5.7360, G loss: 35.2319
Epoch 152: Train D loss: -2.5470, G loss: 37.6325
Epoch 153: Train D loss: -4.3428, G loss: 33.2738
Epoch 154: Train D loss: -5.1647, G loss: 38.7267
Epoch 155: Train D loss: -5.9234, G loss: 35.9941
Epoch 156: Train D loss: -5.2527, G loss: 38.0740
Epoch 157: Train D loss: -6.3392, G loss: 38.6860
Epoch 158: Train D loss: -7.0542, G loss: 36.5369
Epoch 159: Train D loss: -6.0224, G loss: 37.9246

[generated samples after epoch 159]

Epoch 160: Train D loss: -5.9971, G loss: 35.8993
Epoch 161: Train D loss: -6.4151, G loss: 36.4551
Epoch 162: Train D loss: -6.1822, G loss: 37.9584
Epoch 163: Train D loss: -7.4103, G loss: 36.2408
Epoch 164: Train D loss: -6.0805, G loss: 38.4836
Epoch 165: Train D loss: -6.7855, G loss: 36.2980
Epoch 166: Train D loss: -5.8253, G loss: 36.4480
Epoch 167: Train D loss: -6.1184, G loss: 38.1128
Epoch 168: Train D loss: -5.0192, G loss: 35.4841
Epoch 169: Train D loss: -4.8776, G loss: 37.3828

[generated samples after epoch 169]

Epoch 170: Train D loss: -6.8094, G loss: 38.7482
Epoch 171: Train D loss: -6.4655, G loss: 37.0877
Epoch 172: Train D loss: -6.9732, G loss: 37.6592
Epoch 173: Train D loss: -6.6657, G loss: 35.4193
Epoch 174: Train D loss: -5.6038, G loss: 39.1525
Epoch 175: Train D loss: -7.1182, G loss: 37.8428
Epoch 176: Train D loss: -5.6907, G loss: 37.4807
Epoch 177: Train D loss: -6.4822, G loss: 38.9060
Epoch 178: Train D loss: -4.6822, G loss: 34.2815
Epoch 179: Train D loss: -5.7164, G loss: 36.4346

[generated samples after epoch 179]

Epoch 180: Train D loss: -6.4657, G loss: 38.0587
Epoch 181: Train D loss: -6.0777, G loss: 36.7933
Epoch 182: Train D loss: -1.6276, G loss: 33.8338
Epoch 183: Train D loss: -0.0780, G loss: 28.6731
Epoch 184: Train D loss: 0.1379, G loss: 29.2441
Epoch 185: Train D loss: -0.8350, G loss: 32.2009
Epoch 186: Train D loss: -1.4214, G loss: 30.4018
Epoch 187: Train D loss: -1.2807, G loss: 31.7758
Epoch 188: Train D loss: -1.3287, G loss: 32.6561
Epoch 189: Train D loss: -1.5514, G loss: 30.5654

[generated samples after epoch 189]

Epoch 190: Train D loss: -1.9065, G loss: 31.9897
Epoch 191: Train D loss: -2.0409, G loss: 32.9744
Epoch 192: Train D loss: -2.4405, G loss: 32.8510
Epoch 193: Train D loss: -2.6346, G loss: 33.1665
Epoch 194: Train D loss: -2.8928, G loss: 34.1775
Epoch 195: Train D loss: -3.2104, G loss: 34.5349
Epoch 196: Train D loss: -3.2243, G loss: 35.6762
Epoch 197: Train D loss: -3.5349, G loss: 37.3179
Epoch 198: Train D loss: -3.8284, G loss: 37.1046
Epoch 199: Train D loss: -3.9038, G loss: 37.2970

[generated samples after epoch 199]

Epoch 200: Train D loss: -4.1748, G loss: 37.6725
Epoch 201: Train D loss: -4.2250, G loss: 38.6212
Epoch 202: Train D loss: -4.5129, G loss: 38.5633
Epoch 203: Train D loss: -4.6203, G loss: 38.2163
Epoch 204: Train D loss: -4.8810, G loss: 39.5521
Epoch 205: Train D loss: -4.9646, G loss: 39.0926
Epoch 206: Train D loss: -5.1082, G loss: 40.1284
Epoch 207: Train D loss: -5.4029, G loss: 39.0202
Epoch 208: Train D loss: -5.3376, G loss: 41.1535
Epoch 209: Train D loss: -5.4198, G loss: 40.0758

[generated samples after epoch 209]

Epoch 210: Train D loss: -5.6772, G loss: 40.3948
Epoch 211: Train D loss: -5.5501, G loss: 41.9808
Epoch 212: Train D loss: -6.0014, G loss: 40.9326
Epoch 213: Train D loss: -5.9291, G loss: 41.1029
Epoch 214: Train D loss: -5.5610, G loss: 42.2327
Epoch 215: Train D loss: -6.1778, G loss: 40.6025
Epoch 216: Train D loss: -6.2397, G loss: 41.4627
Epoch 217: Train D loss: -6.1380, G loss: 42.4489
Epoch 218: Train D loss: -6.1426, G loss: 41.0780
Epoch 219: Train D loss: -6.4218, G loss: 41.1462

[generated samples after epoch 219]

Epoch 220: Train D loss: -6.0231, G loss: 42.4741
Epoch 221: Train D loss: -6.5885, G loss: 42.6016
Epoch 222: Train D loss: -6.4288, G loss: 40.1520
Epoch 223: Train D loss: -5.7565, G loss: 42.6967
Epoch 224: Train D loss: -6.6304, G loss: 41.1387
Epoch 225: Train D loss: -5.6784, G loss: 41.8259
Epoch 226: Train D loss: -6.3118, G loss: 42.7625
Epoch 227: Train D loss: -6.4658, G loss: 41.9857
Epoch 228: Train D loss: -6.4723, G loss: 40.6945
Epoch 229: Train D loss: -6.9230, G loss: 43.5592

[generated samples after epoch 229]

Epoch 230: Train D loss: -6.4957, G loss: 41.7938
Epoch 231: Train D loss: -6.5154, G loss: 40.0763
Epoch 232: Train D loss: -5.4352, G loss: 43.1833
Epoch 233: Train D loss: -5.4309, G loss: 39.1739
Epoch 234: Train D loss: -1.8059, G loss: 39.8952
Epoch 235: Train D loss: -4.3360, G loss: 39.3909
Epoch 236: Train D loss: -5.1035, G loss: 42.6123
Epoch 237: Train D loss: -5.8806, G loss: 42.4864
Epoch 238: Train D loss: -6.5246, G loss: 42.0718
Epoch 239: Train D loss: -6.3737, G loss: 43.3795

[generated samples after epoch 239]

Epoch 240: Train D loss: -7.3377, G loss: 40.9893
Epoch 241: Train D loss: -6.2426, G loss: 43.0480
Epoch 242: Train D loss: -7.7458, G loss: 42.2035
Epoch 243: Train D loss: -6.6748, G loss: 42.1781
Epoch 244: Train D loss: -7.4434, G loss: 43.7804
Epoch 245: Train D loss: -6.7938, G loss: 40.1896
Epoch 246: Train D loss: -7.6131, G loss: 42.1748
Epoch 247: Train D loss: -7.0010, G loss: 45.4082
Epoch 248: Train D loss: -6.8232, G loss: 39.0382
Epoch 249: Train D loss: -7.1529, G loss: 42.7597

[generated samples after epoch 249]

Epoch 250: Train D loss: -6.9998, G loss: 42.6621
Epoch 251: Train D loss: -7.2467, G loss: 41.1587
Epoch 252: Train D loss: -6.9345, G loss: 39.4910
Epoch 253: Train D loss: -5.4505, G loss: 40.6268
Epoch 254: Train D loss: -6.6528, G loss: 44.8819
Epoch 255: Train D loss: -5.2694, G loss: 38.0317
Epoch 256: Train D loss: -4.3088, G loss: 41.4750
Epoch 257: Train D loss: -6.6831, G loss: 41.6762
Epoch 258: Train D loss: -5.2940, G loss: 38.6744
Epoch 259: Train D loss: -6.6510, G loss: 41.2500

Epoch 260: Train D loss: -7.0832, G loss: 41.9869
Epoch 261: Train D loss: -6.0355, G loss: 39.1222
Epoch 262: Train D loss: -2.9520, G loss: 38.4937
Epoch 263: Train D loss: -5.9696, G loss: 40.8856
Epoch 264: Train D loss: -6.4665, G loss: 44.0659
Epoch 265: Train D loss: -7.8549, G loss: 43.4135
Epoch 266: Train D loss: -7.9603, G loss: 41.5108
Epoch 267: Train D loss: -6.2329, G loss: 42.8211
Epoch 268: Train D loss: -2.7879, G loss: 40.3899
Epoch 269: Train D loss: -5.1850, G loss: 41.8479

[generated samples after epoch 269]

Epoch 270: Train D loss: -6.5687, G loss: 44.4683
Epoch 271: Train D loss: -8.5260, G loss: 43.1028
Epoch 272: Train D loss: -3.3616, G loss: 44.9901
Epoch 273: Train D loss: -5.6846, G loss: 38.9040
Epoch 274: Train D loss: -6.5341, G loss: 44.3343
Epoch 275: Train D loss: -6.8587, G loss: 45.0697
Epoch 276: Train D loss: -5.7049, G loss: 42.0923
Epoch 277: Train D loss: -1.3625, G loss: 39.3204
Epoch 278: Train D loss: -1.8208, G loss: 37.7670
Epoch 279: Train D loss: -3.0697, G loss: 37.6188

[generated samples after epoch 279]

Epoch 280: Train D loss: -3.8869, G loss: 40.4507
Epoch 281: Train D loss: -4.7943, G loss: 41.7112
Epoch 282: Train D loss: -5.5414, G loss: 41.6876
Epoch 283: Train D loss: -5.9564, G loss: 42.6991
Epoch 284: Train D loss: -5.8231, G loss: 41.8769
Epoch 285: Train D loss: -6.7429, G loss: 42.7879
Epoch 286: Train D loss: -6.2128, G loss: 43.8983
Epoch 287: Train D loss: -6.9813, G loss: 42.8210
Epoch 288: Train D loss: -6.8439, G loss: 45.0894
Epoch 289: Train D loss: -6.7164, G loss: 43.0432

Epoch 290: Train D loss: -7.3504, G loss: 45.2013
Epoch 291: Train D loss: -6.9635, G loss: 42.3681
Epoch 292: Train D loss: -7.5860, G loss: 43.8509
Epoch 293: Train D loss: -6.6981, G loss: 45.6519
Epoch 294: Train D loss: -7.7940, G loss: 44.9397
Epoch 295: Train D loss: -6.6731, G loss: 43.2752
Epoch 296: Train D loss: -7.2289, G loss: 46.4379
Epoch 297: Train D loss: -7.8480, G loss: 42.9881
Epoch 298: Train D loss: -7.4628, G loss: 47.0468
Epoch 299: Train D loss: -7.4644, G loss: 44.8931

As before, let's look at the loss curves and the distribution of D's parameters.

loss_plot(d_loss_hist, g_loss_hist)

[D and G loss curves over 300 epochs]

show_d_params(D)

[histogram of D's parameters]
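
show_d_params was defined earlier in the notebook; conceptually it only needs to flatten all of D's weights and plot a histogram, as in the sketch below (an illustration of the idea, not necessarily the original helper):

def show_d_params_sketch(D):
    # gather every parameter of D into one flat vector on the CPU
    params = torch.cat([p.data.view(-1).cpu() for p in D.parameters()])
    plt.hist(params.numpy(), bins=100)
    plt.xlabel('parameter value')
    plt.ylabel('count')
    plt.show()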

For the same number of epochs, the images generated by WGAN fall short of WGAN-GP's in both diversity and realism, yet WGAN's loss curves converge noticeably more stably, while WGAN-GP's generator loss does not even trend in the right direction. In WGAN, D's parameters are distributed with peaks at both ends and a dip in the middle, concentrated at the two clipping endpoints; in WGAN-GP they follow a roughly normal distribution, which is far more reasonable.
