A GAN / style paper review, roughly 70% complete:
- review spark-in.me/post/gan-paper-revie
- TLDR - author.spark-in.me/gan-list.html
I did not crack the math in the Wasserstein GAN paper, though.
Also, a friend of mine focused on GANs for ~6 months. Below is the gist of his work:
- GANs are notoriously difficult and tricky to train, even with a Wasserstein loss
- The most photo-realistic papers use custom regularization techniques and very sophisticated training regimes
- Seemingly photo-realistic GANs (with progressive growing)
-- are tricky to train
-- require 2-3x the time to train the GAN itself, plus an additional 3-6x for the growing stage
- the end result may be completely unpredictable despite all the effort
- most GANs are not viable in production / mobile applications
- in practice they visually perform much WORSE than style transfer
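The Wasserstein loss mentioned above is simple at its core: the critic tries to maximize the gap between its scores on real and fake samples, and the generator tries to push its fakes' scores up. A minimal numpy sketch (function names are mine; a real setup would also need a Lipschitz constraint, e.g. weight clipping or a gradient penalty):

```python
import numpy as np

def critic_wasserstein_loss(scores_real, scores_fake):
    """WGAN critic loss: the critic maximizes E[D(real)] - E[D(fake)],
    i.e. minimizes the negative of that difference."""
    return np.mean(scores_fake) - np.mean(scores_real)

def generator_wasserstein_loss(scores_fake):
    """WGAN generator loss: the generator maximizes E[D(fake)]."""
    return -np.mean(scores_fake)

# Toy critic scores: reals rated higher than fakes, so the critic loss is low.
real = np.array([0.9, 1.1, 1.0])
fake = np.array([-0.5, -0.4, -0.6])
print(critic_wasserstein_loss(real, fake))  # -1.5
```

Unlike the original GAN's log-loss, these scores are unbounded, which is part of why the loss correlates better with sample quality but still does not make training easy.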
Training TLDR trick:
- Use a DCGAN just for training latent-space variables, without any domain transfer
- Use CycleGAN + Wasserstein loss for domain transfer
- Use growing for photo-realism
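The "growing" in the last step refers to progressively adding higher-resolution layers during training. The key mechanic is a fade-in: the new stage's output is blended with the upsampled output of the previous stage while a coefficient alpha ramps from 0 to 1. A minimal sketch under my own assumptions (nearest-neighbour upsampling, single-channel images, function names are mine):

```python
import numpy as np

def upsample_nn(img):
    """Nearest-neighbour 2x upsample of a (H, W) image."""
    return img.repeat(2, axis=0).repeat(2, axis=1)

def fade_in(old_output, new_output, alpha):
    """Progressive-growing fade-in: blend the upsampled output of the
    previous (lower-resolution) stage with the new stage's output.
    alpha ramps from 0 to 1 as training of the new stage progresses."""
    return (1.0 - alpha) * upsample_nn(old_output) + alpha * new_output

low = np.ones((2, 2))    # output of the old 2x2 stage
high = np.zeros((4, 4))  # output of the new 4x4 stage
print(fade_in(low, high, 0.25))  # 4x4 array, every entry 0.75
```

The fade-in is what makes growing stable: the new layers start as a small perturbation of an already-trained lower-resolution network instead of random noise, though it is also what multiplies the training time.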
As for using them for latent space algebra - I will do this project this year.
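Latent-space algebra here means vector arithmetic on latent codes, as in the famous DCGAN example: "smiling woman" - "neutral woman" + "neutral man" decodes to a smiling man. A toy numpy sketch with made-up 3-d vectors (in a real setup each vector would be an average of z-codes whose decodings share the attribute):

```python
import numpy as np

# Hypothetical averaged latent codes; the values are illustrative only.
z_smiling_woman = np.array([1.0, 2.0, 0.5])
z_neutral_woman = np.array([1.0, 0.0, 0.5])
z_neutral_man   = np.array([-1.0, 0.0, 0.5])

# Classic latent arithmetic: the result should decode to a smiling man.
z_smiling_man = z_smiling_woman - z_neutral_woman + z_neutral_man
print(z_smiling_man)
```

The arithmetic itself is trivial; the hard part is training a generator whose latent space is smooth enough for the decoded result to be meaningful.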
In the review I list useful / influential GAN papers, plus papers on CNN training with sparse unsupervised data and latent-space operations.
Author's articles - http://spark-in.me/author/snakers41
Blog - http://spark-in.me
A US$1 million prize, US-citizens-only Kaggle challenge ... for just stacking ResNets?
America is fucked up bad...
Also notice the shake-up and the top scores:
- Public goo.gl/2utoDC
- Private goo.gl/GXpnWe