Sadhana, Volume 47, Article ID 0033, Published: 10 February 2022
Comparative analysis of CycleGAN and AttentionGAN on face aging application
NEHA SHARMA, REECHA SHARMA, NEERU JINDAL
Recently, there has been incredible progress in the arena of machine learning with generative adversarial network (GAN) methods. These methods synthesize new data from input images that are highly realistic at the output. One of their applications in image-to-image translation is the face aging task. In the face aging process, new face images are synthesized with the help of the input images and desired target images. Face aging can be beneficial in several domains, such as biometric systems for face recognition with age progression, forensics for helping to find missing children, entertainment, and many more. Nowadays, several GANs are available for face aging applications, and this paper presents an in-depth comparison of two frequently used image-to-image translation GANs: CycleGAN (Cycle-Consistent Adversarial Network) and AttentionGAN (Attention-Guided Generative Adversarial Network). The first model (CycleGAN) comprises two generators and two discriminators, and converts an image from one domain to another without the need for a paired image dataset. The second, AttentionGAN, generates attention masks and content masks that are multiplied with the generated output in one domain to produce a highly realistic image in another domain. For comparison, the two models are trained on two datasets: CelebA-HQ (CelebFaces Attributes high-quality dataset) and FFHQ (Flickr Faces HQ). Efficacy is evaluated quantitatively with identity preservation and five image quality assessment metrics, and qualitatively with a perceptual study on synthesized images, face aging signs, and robustness. It is concluded that, overall, CycleGAN performs better than AttentionGAN. In the future, a more critical comparison can be performed on a number of GANs for face aging applications.
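The two architectural ideas contrasted in the abstract can be illustrated with a minimal sketch. The snippet below, written against PyTorch, shows a cycle-consistency term of the kind CycleGAN uses and an attention-mask composition of the kind AttentionGAN uses; the generator names (G_young2old, G_old2young), the 3+1 output-channel layout, and the loss weight are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (PyTorch assumed) of the two ideas compared in the paper:
# CycleGAN's cycle-consistency between two unpaired domains, and
# AttentionGAN's mask-based composition of the generator output.
# G_young2old, G_old2young, and the channel layout are hypothetical names,
# not taken from the authors' code.

import torch
import torch.nn.functional as F

def cycle_consistency_loss(G_young2old, G_old2young, real_young, real_old, lam=10.0):
    """CycleGAN: translating to the other domain and back should recover the input."""
    rec_young = G_old2young(G_young2old(real_young))   # young -> old -> young
    rec_old = G_young2old(G_old2young(real_old))       # old -> young -> old
    return lam * (F.l1_loss(rec_young, real_young) + F.l1_loss(rec_old, real_old))

def attention_compose(generator_out, source_image):
    """AttentionGAN: blend generated content with the source via a learned attention mask.

    Assumes the generator outputs 4 channels: 3 content channels and 1 attention channel.
    """
    content = torch.tanh(generator_out[:, :3])          # synthesized content (aged face)
    attention = torch.sigmoid(generator_out[:, 3:4])    # where to edit (0 = keep source)
    return attention * content + (1.0 - attention) * source_image
```

The mask-based composition keeps unedited regions of the source image intact, which is one reason attention-guided models are often discussed in the context of the identity-preservation metric used in the comparison above.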