Spark in me - Internet, data science, math, deep learning, philo

snakers4 @ telegram, 1365 members, 1673 posts since 2016

All this - lost like tears in rain.

Data science, deep learning, sometimes a bit of philosophy and math. No bs.

Our website
Our chat
DS courses review

January 10, 03:20

A ~70% complete review of GAN / style transfer papers:

- review

- TLDR -

I did not crack the math in the Wasserstein GAN paper, though.

Also, a friend of mine focused on GANs for ~6 months. Below is the gist of his findings:

- GANs are notoriously difficult and tricky to train, even with Wasserstein loss

- The most photo-realistic papers use custom regularization techniques and very sophisticated training regimes

- Seemingly photo-realistic GANs (with progressive growing)

-- are tricky to train

-- require 2-3x the time to train the GAN itself, plus an additional 3-6x for progressive growing

- the end result may be completely unpredictable despite all the effort

- most GANs are not viable in production / mobile applications

- in practice they perform visually much WORSE than style transfer

TLDR training trick:

- Use a DCGAN just for training latent space variables, without any domain transfer

- Use CycleGAN + Wasserstein loss for domain transfer

- Use growing for photo-realism
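The Wasserstein loss mentioned above boils down to the critic scoring real samples higher than fakes, with no sigmoid or log terms. A minimal numpy sketch of the two objectives (function names are mine, not from any particular library; weight clipping / gradient penalty is omitted):

```python
import numpy as np

def critic_loss(d_real, d_fake):
    # WGAN critic maximizes D(real) - D(fake);
    # written as a loss to minimize, signs flip:
    return np.mean(d_fake) - np.mean(d_real)

def generator_loss(d_fake):
    # the generator tries to push the critic's score on fakes up
    return -np.mean(d_fake)

# toy critic outputs: reals scored 1.0, fakes scored 0.0
print(critic_loss(np.array([1.0, 1.0]), np.array([0.0, 0.0])))  # -1.0
print(generator_loss(np.array([0.0, 0.0])))                     # -0.0
```

In a real training loop the critic step runs several times per generator step, and the raw scores come from an unbounded (no sigmoid) output layer.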

As for using GANs for latent space algebra - I will tackle that project this year.
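Latent space algebra here means word2vec-style vector arithmetic on the generator's input codes. A hypothetical sketch (the attribute vectors are random stand-ins; in practice they would be mean latent codes of labeled samples, decoded back through the trained generator):

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 128  # latent dimensionality, an arbitrary choice here

# hypothetical mean latent codes for three attribute groups
z_smiling_woman = rng.normal(size=dim)
z_neutral_woman = rng.normal(size=dim)
z_neutral_man = rng.normal(size=dim)

# classic "smiling woman - neutral woman + neutral man" arithmetic:
z_smiling_man = z_smiling_woman - z_neutral_woman + z_neutral_man
# feeding z_smiling_man to the trained generator should decode
# an image combining both attributes
```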



GAN paper list and review

In it I list useful / influential GAN papers, plus papers on CNN training with sparse unsupervised data and on latent space operations.

A US$1 million prize, US-citizen-exclusive Kaggle challenge ... for just stacking ResNets?


America is fucked up bad...

Also notice the shake-up and the top scores:

- Public

- Private