Question: (25 points) The minimax loss $L_G$ suffers from the vanishing
gradient problem. In terms of the discriminator's logits $h$, the minimax
loss is

$$L_G^{\text{minimax}}(\theta; \phi) = \mathbb{E}_{z \sim \mathcal{N}(0, I)}\left[\log\left(1 - \sigma(h_\phi(G_\theta(z)))\right)\right]$$
Show that the derivative of $L_G^{\text{minimax}}$ with respect to $\theta$ is approximately 0 if
$D(G(z)) \approx 0$, or equivalently, if $h(G(z)) \to -\infty$. You may use the fact that
$\sigma'(x) = \sigma(x)(1 - \sigma(x))$. Why is this problematic for training the generator
when the discriminator successfully identifies a fake sample $G(z)$?
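A sketch of the requested derivative, assuming $D = \sigma \circ h$ and writing $h = h_\phi(G_\theta(z))$ for brevity:

```latex
\begin{align*}
\nabla_\theta L_G^{\text{minimax}}
  &= \mathbb{E}_{z \sim \mathcal{N}(0, I)}\!\left[\nabla_\theta \log\big(1 - \sigma(h)\big)\right] \\
  &= \mathbb{E}_{z \sim \mathcal{N}(0, I)}\!\left[\frac{-\sigma'(h)}{1 - \sigma(h)}\,\nabla_\theta h\right] \\
  &= \mathbb{E}_{z \sim \mathcal{N}(0, I)}\!\left[\frac{-\sigma(h)\big(1 - \sigma(h)\big)}{1 - \sigma(h)}\,\nabla_\theta h\right]
   = -\,\mathbb{E}_{z \sim \mathcal{N}(0, I)}\!\left[\sigma(h)\,\nabla_\theta h\right].
\end{align*}
```

Since $\sigma(h) = D(G(z)) \approx 0$ for a confidently rejected fake, the gradient is approximately 0 regardless of $\nabla_\theta h$, so the generator receives almost no learning signal exactly when it most needs one.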
(25 points) To solve this vanishing gradient problem, we usually replace
$L_G^{\text{minimax}}$ with other loss functions such as the non-saturating loss $L_G^{\text{nsgan}}$ [1];
more forms of loss functions can be found in [2]. You may plot different
loss functions, including the minimax loss and the non-saturating loss, to show the
contrast. You also need to explain why the non-saturating loss avoids the vanishing
gradient problem.

$$L_G^{\text{nsgan}}(\theta; \phi) = -\mathbb{E}_{z \sim \mathcal{N}(0, I)}\left[\log D(G_\theta(z))\right]$$
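As a quick numerical check (a minimal sketch, assuming the sigmoid discriminator $D = \sigma \circ h$ from the problem statement), the gradient of each loss with respect to the logit $h$ can be compared directly: the minimax factor is $-\sigma(h)$, while the non-saturating factor is $-(1 - \sigma(h))$.

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def minimax_grad_factor(h: float) -> float:
    # d/dh log(1 - sigmoid(h)) = -sigmoid(h): vanishes as h -> -inf (D ~ 0)
    return -sigmoid(h)

def nsgan_grad_factor(h: float) -> float:
    # d/dh [-log sigmoid(h)] = -(1 - sigmoid(h)): stays near -1 as h -> -inf
    return -(1.0 - sigmoid(h))

# When the discriminator confidently rejects a fake (h very negative, D ~ 0),
# the minimax gradient collapses while the non-saturating gradient does not.
for h in (-10.0, -2.0, 0.0, 2.0):
    print(f"h={h:6.1f}  minimax={minimax_grad_factor(h):+.5f}  "
          f"nsgan={nsgan_grad_factor(h):+.5f}")
```

At $h = -10$ the minimax factor is about $-4.5 \times 10^{-5}$ while the non-saturating factor is about $-1$, which is the contrast the plot in your answer should exhibit.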