PyTorch DP / DDP / model parallel
Finally, they made proper tutorials:
Model parallel = parts of the same model live on different devices
Data Parallel (DP) = a wrapper to use multiple GPUs within a single parent process
Distributed Data Parallel (DDP) = multiple processes spawned on the same machine or across a cluster
Minimal sketches of all three modes follow below.
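A rough sketch of model parallelism, assuming two GPUs; the model, layer sizes, and class name are made up for illustration:

```python
import torch
import torch.nn as nn

class TwoGPUModel(nn.Module):
    """Hypothetical toy model split by hand across two GPUs."""
    def __init__(self):
        super().__init__()
        self.part1 = nn.Linear(128, 256).to("cuda:0")  # first half on GPU 0
        self.part2 = nn.Linear(256, 10).to("cuda:1")   # second half on GPU 1

    def forward(self, x):
        # Intermediate activations are moved between devices explicitly
        h = torch.relu(self.part1(x.to("cuda:0")))
        return self.part2(h.to("cuda:1"))

model = TwoGPUModel()
out = model(torch.randn(32, 128))  # output (and any loss) lives on cuda:1
```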
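DP in its simplest form — one process, the model replicated on every visible GPU (the toy model here is an assumption, not from the tutorial):

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10))
model = nn.DataParallel(model.cuda())  # replicates the module on all visible GPUs

batch = torch.randn(64, 128).cuda()
out = model(batch)  # the batch is scattered along dim 0, outputs gathered back
```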
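And a single-machine DDP sketch, one process per GPU; the address, port, and tiny training step are placeholder assumptions:

```python
import os
import torch
import torch.nn as nn
import torch.distributed as dist
import torch.multiprocessing as mp
from torch.nn.parallel import DistributedDataParallel as DDP

def worker(rank, world_size):
    os.environ["MASTER_ADDR"] = "127.0.0.1"  # single-machine setup
    os.environ["MASTER_PORT"] = "29500"
    dist.init_process_group("nccl", rank=rank, world_size=world_size)

    model = nn.Linear(128, 10).to(rank)
    ddp_model = DDP(model, device_ids=[rank])  # gradients all-reduced across processes

    opt = torch.optim.SGD(ddp_model.parameters(), lr=0.01)
    loss = ddp_model(torch.randn(32, 128).to(rank)).sum()
    loss.backward()
    opt.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    world_size = torch.cuda.device_count()
    mp.spawn(worker, args=(world_size,), nprocs=world_size, join=True)
```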
The State of ML, end of 2018, in Russian
A quite down-to-earth and clever lecture.
Some nice examples for TTS and some interesting forecasts (some of which have already come true).