Check Your Understanding: GAN Anatomy
True or false: the discriminator network and generator network influence
each other solely through the data produced by the generator and the labels
produced by the discriminator. When it comes to backpropagation, they
are separate networks.
True
Incorrect. During generator training, gradients propagate through the
discriminator network to the generator network (although the discriminator
does not update its weights during generator training).
False
Correct. During generator training, gradients propagate through the
discriminator network to the generator network (although the discriminator
does not update its weights during generator training), so the weights in
the discriminator network influence the updates to the generator network.
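
To make this gradient flow concrete, here is a minimal PyTorch sketch (not part of the course's reference code) of a single generator training step. The layer sizes, batch size, and learning rate are arbitrary placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy networks; shapes and sizes are illustrative only.
generator = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
discriminator = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))

# The optimizer holds only the generator's parameters.
g_optimizer = torch.optim.Adam(generator.parameters(), lr=1e-3)

noise = torch.randn(8, 16)
fake = generator(noise)        # generated samples
logits = discriminator(fake)   # fake samples scored by the discriminator

# The generator "wants" the discriminator to label its output as real (1).
g_loss = F.binary_cross_entropy_with_logits(logits, torch.ones_like(logits))

g_optimizer.zero_grad()
g_loss.backward()   # gradients flow back through D's layers into G's layers
g_optimizer.step()  # ...but only G's weights change, because g_optimizer
                    # was built from generator.parameters() alone
```

The key point is that `backward()` traverses the discriminator's layers to reach the generator, yet the discriminator's weights stay fixed because its parameters are not in the optimizer being stepped.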
True or false: a typical GAN trains the generator and the discriminator
simultaneously.
True
Incorrect. A typical GAN alternates between training the discriminator
and training the generator. There is some
[research](https://arxiv.org/abs/1706.04156) on training the
generator and discriminator simultaneously.
False
Correct. A typical GAN alternates between training the discriminator
and training the generator.
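
The alternating schedule might look like the following self-contained PyTorch sketch. The toy "real" data, layer sizes, and hyperparameters are illustrative assumptions, not a recommended configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

G = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))
g_opt = torch.optim.Adam(G.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(D.parameters(), lr=1e-3)

for step in range(1000):
    real = torch.randn(64, 2) + torch.tensor([3.0, 3.0])  # toy "real" data
    noise = torch.randn(64, 16)

    # Phase 1: train the discriminator. The generated batch is detached,
    # so this phase does not touch the generator's weights.
    fake = G(noise).detach()
    d_loss = (F.binary_cross_entropy_with_logits(D(real), torch.ones(64, 1))
              + F.binary_cross_entropy_with_logits(D(fake), torch.zeros(64, 1)))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Phase 2: train the generator. Gradients flow through D, but only
    # g_opt (which holds G's parameters) applies an update.
    fake = G(torch.randn(64, 16))
    g_loss = F.binary_cross_entropy_with_logits(D(fake), torch.ones(64, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()
```

Each iteration runs the two phases back to back rather than optimizing both networks in a single combined step.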
True or false: a GAN always uses the same loss function for both
discriminator and generator training.
True
Incorrect. While it's possible for a GAN to use the same loss for
both generator and discriminator training (or the same loss differing only
in sign), it's not required. In fact, it's more common to use different
losses for the discriminator and the generator.
False
Correct. While it's possible for a GAN to use the same loss for
both generator and discriminator training (or the same loss differing only
in sign), it's not required. In fact, it's more common to use different
losses for the discriminator and the generator.
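
As one illustration of how the two losses can differ, here is a hedged sketch contrasting the minimax discriminator loss, the sign-flipped minimax generator loss, and the more common non-saturating generator loss. It assumes `d_real` and `d_fake` are the discriminator's sigmoid outputs (probabilities) for real and generated batches; the function names are illustrative, not from any particular library.

```python
import torch

def discriminator_loss(d_real, d_fake):
    # Standard minimax discriminator loss:
    # maximize log D(real) + log(1 - D(fake)), written here as a loss to minimize.
    return -(torch.log(d_real) + torch.log(1.0 - d_fake)).mean()

def generator_loss_minimax(d_fake):
    # "Same loss differing only in sign": the generator minimizes
    # log(1 - D(fake)). This tends to saturate early in training,
    # when the discriminator easily rejects generated samples.
    return torch.log(1.0 - d_fake).mean()

def generator_loss_non_saturating(d_fake):
    # A different, more commonly used choice: the generator maximizes
    # log D(fake) instead, which gives stronger gradients when the
    # generator is still weak.
    return -torch.log(d_fake).mean()
```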