The best methods for balancing the training of a conditional GAN with class labels are:
- Class-Specific Losses: You can add a classification loss for both the generator and the discriminator to ensure correct label conditioning.
- Label Smoothing: You can use soft targets for real samples (e.g., 0.9 instead of 1.0) to prevent the discriminator from becoming overly confident.
- Two Time-Scale Update Rule (TTUR): To stabilize training across classes, you can use different learning rates for the generator and discriminator (typically a higher rate for the discriminator).
- Balanced Batches: You can also sample batches with a balanced class distribution to prevent mode collapse or bias toward dominant classes.
- Auxiliary Classifier GAN (AC-GAN): It adds a separate classifier head to the discriminator to predict class labels, improving label consistency and sample quality.
These strategies complement each other: class-specific losses, label smoothing, TTUR, balanced batches, and an AC-GAN-style auxiliary classifier can all be applied in the same training loop.
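Here is a minimal PyTorch sketch combining the strategies above on a toy dataset. The model sizes, dimensions, the 0.9 smoothing target, and the helper `balanced_batch` are illustrative assumptions, not a definitive recipe:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
NUM_CLASSES, LATENT_DIM, DATA_DIM = 3, 8, 16

class Generator(nn.Module):
    """Conditions on the class label via a learned embedding."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(NUM_CLASSES, LATENT_DIM)
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM * 2, 32), nn.ReLU(), nn.Linear(32, DATA_DIM))
    def forward(self, z, y):
        return self.net(torch.cat([z, self.embed(y)], dim=1))

class Discriminator(nn.Module):
    """AC-GAN style: one real/fake head plus an auxiliary class head."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(DATA_DIM, 32), nn.ReLU())
        self.adv_head = nn.Linear(32, 1)            # real/fake logit
        self.cls_head = nn.Linear(32, NUM_CLASSES)  # auxiliary classifier
    def forward(self, x):
        h = self.body(x)
        return self.adv_head(h), self.cls_head(h)

G, D = Generator(), Discriminator()

# TTUR: discriminator gets a larger learning rate than the generator.
opt_G = torch.optim.Adam(G.parameters(), lr=1e-4, betas=(0.5, 0.999))
opt_D = torch.optim.Adam(D.parameters(), lr=4e-4, betas=(0.5, 0.999))

def balanced_batch(per_class=4):
    """Sample an equal number of toy 'real' examples per class (assumed helper)."""
    y = torch.arange(NUM_CLASSES).repeat_interleave(per_class)
    x = torch.randn(len(y), DATA_DIM) + y.float().unsqueeze(1)
    return x, y

for step in range(2):
    real_x, real_y = balanced_batch()
    batch = len(real_y)

    # --- Discriminator step ---
    z = torch.randn(batch, LATENT_DIM)
    fake_y = torch.randint(0, NUM_CLASSES, (batch,))
    fake_x = G(z, fake_y).detach()
    adv_real, cls_real = D(real_x)
    adv_fake, _ = D(fake_x)
    # One-sided label smoothing: real targets of 0.9 instead of 1.0.
    d_loss = (F.binary_cross_entropy_with_logits(adv_real, torch.full_like(adv_real, 0.9))
              + F.binary_cross_entropy_with_logits(adv_fake, torch.zeros_like(adv_fake))
              + F.cross_entropy(cls_real, real_y))  # class-specific loss on real data
    opt_D.zero_grad(); d_loss.backward(); opt_D.step()

    # --- Generator step ---
    fake_x = G(z, fake_y)
    adv, cls = D(fake_x)
    g_loss = (F.binary_cross_entropy_with_logits(adv, torch.ones_like(adv))
              + F.cross_entropy(cls, fake_y))  # generator also pays the class loss
    opt_G.zero_grad(); g_loss.backward(); opt_G.step()

print(f"d_loss={d_loss.item():.3f} g_loss={g_loss.item():.3f}")
```

Note the `.detach()` in the discriminator step: it blocks gradients from flowing into the generator during the discriminator update, so the two time scales stay independent.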
By combining these methods, you can keep the training of a conditional GAN balanced across class labels.