GAN Study Notes

1. The Original GAN

1.1 The Original Loss Function

1.1.1 Formulation 1 (Reference 1, Reference 2)

min_G max_D V(D, G) = E_{x∼Pdata(x)}[log D(x)] + E_{z∼P(z)}[log(1 − D(G(z)))]

1.1.2 Formulation 2

[Figure: Formulation 2 of the GAN loss]

 where,

  • G = Generator
  • D = Discriminator
  • Pdata(x) = distribution of real data
  • P(z) = prior distribution of the Generator's input noise
  • x = sample from Pdata(x)
  • z = sample from P(z)
  • D(x) = Discriminator's output for sample x
  • G(z) = Generator's output for noise z
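Using the definitions above, the value function can be estimated numerically from sampled batches. This is a minimal, framework-free sketch with toy stand-ins for D and G (the lambdas, distributions, and batch size below are illustrative assumptions, not part of the original notes):

```python
import numpy as np

# Monte Carlo estimate of the vanilla GAN value function
#   V(D, G) = E_{x~Pdata(x)}[log D(x)] + E_{z~P(z)}[log(1 - D(G(z)))]
# D and G here are toy stand-ins, not trained networks.

def gan_value(d, g, real_samples, noise_samples):
    """Estimate V(D, G) from a batch of real samples x and noise samples z."""
    eps = 1e-12  # guard against log(0)
    real_term = np.mean(np.log(d(real_samples) + eps))
    fake_term = np.mean(np.log(1.0 - d(g(noise_samples)) + eps))
    return real_term + fake_term

# Toy stand-ins: D is a sigmoid of its input, G shifts the noise.
d = lambda x: 1.0 / (1.0 + np.exp(-x))
g = lambda z: z + 2.0

rng = np.random.default_rng(0)
x = rng.normal(2.0, 1.0, size=1000)  # "real" data
z = rng.normal(0.0, 1.0, size=1000)  # prior noise

v = gan_value(d, g, x, z)
```

Both expectation terms are logarithms of values in (0, 1), so the estimate is always negative; D tries to push it up, G tries to pull it down.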

 1.1.3 Formulation 3 (Reference 3)

 1.2 Wasserstein Loss (Reference)
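The Wasserstein formulation replaces the log-loss with unbounded critic scores: the critic f maximizes E[f(x)] − E[f(G(z))] while being kept (approximately) Lipschitz, e.g. via weight clipping in the original WGAN. A minimal sketch with an illustrative linear critic (all names and values below are assumptions):

```python
import numpy as np

# Sketch of the Wasserstein (WGAN) losses. The critic f outputs unbounded
# real-valued scores (no sigmoid); the Lipschitz constraint is not enforced
# here and would be handled separately (e.g. weight clipping).

def critic_loss(f, real, fake):
    # Critic maximizes E[f(x)] - E[f(G(z))]; written as a loss, we negate it.
    return -(np.mean(f(real)) - np.mean(f(fake)))

def generator_loss(f, fake):
    # Generator minimizes -E[f(G(z))], i.e. pushes critic scores up on fakes.
    return -np.mean(f(fake))

f = lambda x: 0.5 * x            # toy linear critic (Lipschitz constant 0.5)
real = np.array([2.0, 3.0])      # illustrative "real" scores' inputs
fake = np.array([0.0, 1.0])      # illustrative generated inputs

cl = critic_loss(f, real, fake)
gl = generator_loss(f, fake)
```

Because the scores are unbounded, these losses do not saturate the way the log-loss does, which is the main practical motivation for the Wasserstein objective.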

2. Conditional GAN (CGAN)

2.1 Formulation 1:

The Discriminator has two tasks:

  • Discriminator has to correctly label real images which are coming from training data set as “real”.
  • Discriminator has to correctly label generated images which are coming from Generator as “fake”.

We need to calculate two losses for the Discriminator. The sum of the “fake” image loss and the “real” image loss is the overall Discriminator loss. So the loss function of the Discriminator aims at minimizing the error of predicting real images coming from the dataset and fake images coming from the Generator, given their one-hot labels.
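The two-part Discriminator loss described above can be sketched as binary cross-entropy with target 1 for real images and target 0 for generated ones (a framework-free sketch; the function names and example scores are illustrative):

```python
import numpy as np

# Sketch of the two-part discriminator loss: BCE on real outputs against
# label 1, plus BCE on fake outputs against label 0.

def bce(pred, target, eps=1e-12):
    """Binary cross-entropy over a batch of probabilities."""
    pred = np.clip(pred, eps, 1.0 - eps)
    return -np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred))

def discriminator_loss(d_real_out, d_fake_out):
    real_loss = bce(d_real_out, np.ones_like(d_real_out))   # real -> "real"
    fake_loss = bce(d_fake_out, np.zeros_like(d_fake_out))  # fake -> "fake"
    return real_loss + fake_loss                            # overall D loss

# Illustrative discriminator outputs: fairly confident on both batches.
loss = discriminator_loss(np.array([0.9, 0.8]), np.array([0.1, 0.2]))
```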

The Generator network has one task:

  • To create an image that looks as “real” as possible to fool the Discriminator.

The loss function of the Generator therefore penalizes the Discriminator's correct predictions on fake images, conditioned on the specified one-hot labels.
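The Generator objective described above can be sketched by scoring the Discriminator's output on fakes against the "real" label, which is the common non-saturating form (an illustrative sketch, not necessarily the exact formulation in the referenced write-up):

```python
import numpy as np

# Sketch of the generator loss: the generator is rewarded when the
# discriminator labels its fakes as "real" (target 1).

def bce(pred, target, eps=1e-12):
    pred = np.clip(pred, eps, 1.0 - eps)
    return -np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred))

def generator_loss(d_fake_out):
    # Discriminator outputs on fakes are scored against the "real" label 1.
    return bce(d_fake_out, np.ones_like(d_fake_out))

# The better the fakes fool D, the lower the generator loss:
fooled = generator_loss(np.array([0.90, 0.95]))  # D thinks fakes are real
caught = generator_loss(np.array([0.10, 0.05]))  # D detects the fakes
```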

[Figure: CGAN architecture, with the label y fed to both the Generator and the Discriminator]

  • The conditioning is performed by feeding y into both the discriminator and the generator as an additional input layer.
  • In the generator, the prior input noise p_z(z) and y are combined in a joint hidden representation.
  • In the discriminator, x and y are presented as inputs to a discriminative function.
  • The objective function of the two-player minimax game becomes:

min_G max_D V(D, G) = E_{x∼Pdata(x)}[log D(x|y)] + E_{z∼P(z)}[log(1 − D(G(z|y)))]
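The conditioning in the bullets above can be sketched as simple concatenation of the one-hot label y with the noise z (for the generator) and with the image x (for the discriminator). All dimensions below are illustrative assumptions:

```python
import numpy as np

# Sketch of CGAN conditioning: the one-hot label y is appended to the
# generator's noise input and to the discriminator's image input.

def one_hot(labels, num_classes):
    """Encode integer class labels as one-hot row vectors."""
    return np.eye(num_classes)[labels]

batch, z_dim, x_dim, num_classes = 4, 100, 784, 10  # illustrative sizes

rng = np.random.default_rng(0)
z = rng.standard_normal((batch, z_dim))   # prior noise for G
x = rng.standard_normal((batch, x_dim))   # flattened images for D
y = one_hot(np.array([3, 1, 4, 1]), num_classes)

g_input = np.concatenate([z, y], axis=1)  # generator sees (z, y) jointly
d_input = np.concatenate([x, y], axis=1)  # discriminator sees (x, y) jointly
```

Real networks typically embed y rather than concatenating it raw, but the joint-representation idea is the same.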

2.2 Formulation 2:

where p(c) is a probability distribution over classes, p_data(x|c) is the probability distribution of real images of class c, and p_g(x|c) is the probability distribution of images generated by the Generator when given class label c.

2.3 Formulation 3 (Reference)
