In Generative Adversarial Network (GAN) training, one needs to train a Generator network and a Discriminator network alternately. On the ImageNet-1K task, people have traditionally used the ADAM optimizer and a ResNet-50 model for both generator and discriminator training. In this setup, the generator and discriminator each produce 25 million gradients in every mini-batch iteration.

1. [1 point] What is the size (in MBs) of the trainable parameters (single precision) of the generator and discriminator?

2. [2 points] Recently, some researchers have found that the Adagrad optimizer has a better theoretical guarantee for GAN training and decide to use the Adagrad optimizer instead. Can you estimate
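A minimal sketch of the arithmetic behind question 1, assuming the ~25 million parameters per network stated in the problem and 4 bytes per single-precision (float32) parameter; the helper name `param_size_mb` is illustrative, not from the problem:

```python
def param_size_mb(num_params: int, bytes_per_param: int = 4) -> float:
    """Size of a parameter set in decimal MB (1 MB = 10**6 bytes).

    Use 2**20 as the divisor instead if binary MiB are wanted.
    """
    return num_params * bytes_per_param / 1e6

# Each network (generator or discriminator) has ~25 million
# single-precision trainable parameters, per the problem statement.
PARAMS_PER_NETWORK = 25_000_000

per_network = param_size_mb(PARAMS_PER_NETWORK)
print(f"per network:   {per_network:.1f} MB")        # 100.0 MB
print(f"both networks: {2 * per_network:.1f} MB")    # 200.0 MB
```

With binary MiB (dividing by 2**20) the per-network figure is about 95.4 MiB instead; either convention is typically accepted if stated.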
