January 29, 2018

Classic / basic CNN papers

Aggregated Residual Transformations for Deep Neural Networks (ResNeXt)

- Authors: Saining Xie, Ross Girshick, Piotr Dollár, Zhuowen Tu, Kaiming He

- Link arxiv.org/abs/1611.05431

- ResNet and VGG improve accuracy by going deeper

- Inception nets go wider instead. Despite their efficiency, they are hard to design and re-purpose - each module has many hand-tuned hyperparameters

- Key idea: replace the bottleneck's 3x3 convolution with a grouped convolution, splitting it into many parallel paths of the same topology (the number of paths is called "cardinality")

- illustrations

-- basic building block goo.gl/L8PjUF

-- same block in terms of group convolutions goo.gl/fZKmgf

-- overall architecture goo.gl/WWSxRv

-- performance - goo.gl/vgLN8G - roughly +1% top-1 vs ResNet at comparable complexity
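
The "same complexity, more paths" claim is easy to check with parameter counts. A minimal sketch (channel sizes taken from the paper's ResNet-50 bottleneck and ResNeXt-50 32x4d block): a grouped conv divides the 3x3 weight count by the number of groups, which is what lets cardinality go up to 32 without the block getting more expensive.

```python
def conv_params(c_in, c_out, k, groups=1):
    # Each output channel only sees c_in // groups input channels,
    # so a grouped conv has 1/groups the weights of a dense conv.
    return (c_in // groups) * c_out * k * k

# ResNet-50 bottleneck: 256 -> 64 (1x1) -> 64 (3x3) -> 256 (1x1)
resnet = (conv_params(256, 64, 1)
          + conv_params(64, 64, 3)
          + conv_params(64, 256, 1))

# ResNeXt-50 32x4d block: 256 -> 128 (1x1) -> 128 (3x3, groups=32) -> 256 (1x1)
resnext = (conv_params(256, 128, 1)
           + conv_params(128, 128, 3, groups=32)
           + conv_params(128, 256, 1))

print(resnet, resnext)  # 69632 70144 - near-identical cost, 32 parallel paths
```

In frameworks this is a one-argument change (e.g. the `groups` parameter of a 2D convolution layer), so the ResNeXt block is just a ResNet bottleneck with wider middle channels and `groups=cardinality`.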

#data_science

#deep_learning