Paper Explained: TransGAN -- Two Transformers can make One Strong GAN
Most NLP tasks today are solved with the Transformer network or one of its variants; Transformers have become an integral part of the NLP ecosystem over the past few years because of their reusability. Many multi-modal tasks also use a Transformer somewhere, yet they are still not CNN-free: computer vision pipelines that incorporate Transformers typically keep a CNN backbone for feature extraction. TransGAN breaks from this pattern with a pure transformer-based architecture for training a GAN for image synthesis.
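The core idea of a convolution-free generator can be illustrated with a minimal sketch: image positions are treated as a sequence of tokens and mixed with self-attention instead of convolutions. This is a toy NumPy illustration of single-head self-attention over pixel tokens, not the paper's actual architecture; all names, shapes, and initializations here are assumptions for demonstration only.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(tokens, Wq, Wk, Wv):
    # tokens: (N, d) — N pixel tokens, each a d-dim embedding.
    # Every token attends to every other token, so spatial
    # information is mixed globally, with no convolution involved.
    Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
    weights = softmax(Q @ K.T / np.sqrt(Q.shape[-1]))
    return weights @ V

rng = np.random.default_rng(0)
d = 16                                # embedding dimension (hypothetical)
N = 8 * 8                             # an 8x8 grid flattened to 64 tokens
z = rng.standard_normal((N, d))       # tokens seeded from a latent input
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))

out = self_attention(z, Wq, Wk, Wv)
print(out.shape)  # (64, 16) — same token grid, globally mixed features
```

Stacking such attention layers (plus feed-forward blocks and progressive upsampling of the token grid) is the general recipe for building an image generator without any CNN components.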
Mar-23-2021, 20:46:38 GMT