Spark in me - Internet, data science, math, deep learning, philosophy

snakers4 @ telegram, 1252 members, 1404 posts since 2016

All this - lost like tears in rain.

Internet, data science, math, deep learning, philosophy. No bs.

Our website
- spark-in.me
Our chat
- goo.gl/WRm93d
DS courses review
- goo.gl/5VGU5A
- goo.gl/YzVUKf

January 10, 03:20

A GAN / style transfer paper review, about 70% complete:

- review spark-in.me/post/gan-paper-review

- TLDR - author.spark-in.me/gan-list.html

I did not fully crack the math in the Wasserstein GAN paper, though.

Also, a friend of mine has focused on GANs for ~6 months. Below is the gist of his findings:

- GANs are notoriously difficult to train, even with the Wasserstein loss (a minimal loss sketch follows after this list)

- The most photo-realistic papers use custom regularization techniques and very sophisticated training regimes

- Seemingly photo-realistic GANs (with progressive growing):

-- are tricky to train

-- require 2-3x the time to train the GAN itself, plus an additional 3-6x for the growing stages

- the end result may be completely unpredictable despite all the effort

- most GANs are not viable in production / mobile applications

- in practice they visually perform much WORSE than style transfer
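
For reference, a minimal sketch of the Wasserstein (WGAN) losses mentioned above. PyTorch is assumed; the critic / generator modules and the clipping constant are placeholders of mine, not code from the papers or from my friend's work:

import torch

def wgan_losses(critic, generator, real_images, z):
    # The critic approximates the Wasserstein distance: maximize D(real) - D(fake),
    # i.e. minimize the negation below
    fake_images = generator(z).detach()
    critic_loss = critic(fake_images).mean() - critic(real_images).mean()

    # The generator tries to push the critic score on its samples up
    gen_loss = -critic(generator(z)).mean()
    return critic_loss, gen_loss

def clip_critic_weights(critic, c=0.01):
    # The original WGAN enforces the Lipschitz constraint by weight clipping;
    # WGAN-GP replaces this with a gradient penalty
    for p in critic.parameters():
        p.data.clamp_(-c, c)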

Training TLDR tricks

- Use a DCGAN just to train the latent space variables, without any domain transfer

- Use CycleGAN + Wasserstein loss for domain transfer

- Use progressive growing for photo-realism (a rough fade-in sketch follows below)
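
To make the growing point concrete - a rough PyTorch sketch of the fade-in trick from progressive growing, where a new higher-resolution block is blended in gradually. The module names and channel sizes are my own illustration, not the actual ProGAN code:

import torch
import torch.nn as nn
import torch.nn.functional as F

class FadeInBlock(nn.Module):
    # Blends the upsampled low-res RGB output with the output of a newly
    # added higher-resolution block; alpha ramps from 0 to 1 during training
    def __init__(self, old_to_rgb, new_block, new_to_rgb):
        super().__init__()
        self.old_to_rgb = old_to_rgb    # RGB head of the previous resolution
        self.new_block = new_block      # new conv block, assumed to upsample 2x
        self.new_to_rgb = new_to_rgb    # RGB head of the new resolution

    def forward(self, x, alpha):
        low = F.interpolate(self.old_to_rgb(x), scale_factor=2, mode='nearest')
        high = self.new_to_rgb(self.new_block(x))
        return (1 - alpha) * low + alpha * high

# Toy usage - arbitrary channel sizes, just to show the shapes
old_to_rgb = nn.Conv2d(64, 3, 1)
new_block = nn.Sequential(nn.Upsample(scale_factor=2),
                          nn.Conv2d(64, 32, 3, padding=1),
                          nn.LeakyReLU(0.2))
new_to_rgb = nn.Conv2d(32, 3, 1)
fade = FadeInBlock(old_to_rgb, new_block, new_to_rgb)
out = fade(torch.randn(1, 64, 8, 8), alpha=0.3)   # -> [1, 3, 16, 16]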

As for using GANs for latent space algebra - I will tackle that project this year; a toy example of what I mean is below.
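
By latent space algebra I mean things like the vector arithmetic / interpolation from the DCGAN paper. A toy sketch - the generator and latent vectors here are placeholders, not an existing project:

import torch

def latent_arithmetic(generator, z_glasses, z_no_glasses, z_target):
    # e.g. (man with glasses) - (man) + (woman) ~ woman with glasses
    z = z_glasses - z_no_glasses + z_target
    return generator(z.unsqueeze(0))

def latent_interpolation(generator, z_start, z_end, steps=8):
    # Linear walk between two latent points, decoded by the generator
    alphas = torch.linspace(0.0, 1.0, steps).view(-1, 1)
    z = (1 - alphas) * z_start + alphas * z_end
    return generator(z)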

#deep_learning

#data_science

GAN paper list and review

In this post I list useful / influential GAN papers, plus papers related to training CNNs on sparse / unsupervised data and latent space operations.

Author's articles - http://spark-in.me/author/snakers41
Blog - http://spark-in.me


A US$1 million, US-citizens-only Kaggle challenge ... for just stacking ResNets?

- www.kaggle.com/c/passenger-screening-algorithm-challenge/discussion/45805

America is fucked up bad...

Also notice the shake-up and top scores

- Public goo.gl/2utoDC

- Private goo.gl/GXpnWe

#data_science

#sick_sad_worlds