Nameless rookie

keep foolish



[Paper Reading] ShuffleNet: An Extremely Efficient Convolutional Neural Network for Mobile Devices

Posted on 2018-12-11
Word count: 5k | Reading time ≈ 5 min.

Title: ShuffleNet: An Extremely Efficient Convolutional Neural Network for Mobile Devices
Authors: the Face++ (Megvii) team
Venue: CVPR, 2018
Abstract: The authors propose ShuffleNet, a highly efficient network architecture designed for mobile devices with limited compute. The network relies on two operations, pointwise group convolution and channel shuffle, to greatly reduce computation while matching the accuracy of state-of-the-art models. On ImageNet and MS COCO, ShuffleNet outperforms other state-of-the-art models.
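The channel shuffle operation itself is just a reshape, transpose, and reshape over the channel axis. A minimal numpy sketch (the NCHW layout and the toy channel count are illustrative choices, not from the paper's code):

```python
import numpy as np

def channel_shuffle(x, groups):
    """Reshape the channel axis into (groups, channels_per_group),
    transpose the two sub-axes, and flatten back, so that channels
    from different groups become interleaved."""
    n, c, h, w = x.shape
    assert c % groups == 0
    x = x.reshape(n, groups, c // groups, h, w)
    x = x.transpose(0, 2, 1, 3, 4)
    return x.reshape(n, c, h, w)

# 6 channels in 2 groups: [0,1,2 | 3,4,5] -> [0,3,1,4,2,5]
x = np.arange(6).reshape(1, 6, 1, 1)
print(channel_shuffle(x, 2).flatten())  # [0 3 1 4 2 5]
```

This interleaving is what lets information flow between the otherwise isolated groups of a pointwise group convolution.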

Read more »

[Paper Reading] MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications

Posted on 2018-12-04
Word count: 3.7k | Reading time ≈ 3 min.

Title: MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Authors: the Google team
Abstract: MobileNet is an efficient model proposed for mobile and embedded devices. It builds lightweight deep neural networks from depthwise separable convolutions, cutting computation dramatically while remaining competitive with other networks.
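The computational saving can be verified with simple mult-add counting: a depthwise separable convolution costs roughly 1/N + 1/k² of a standard k×k convolution with N output channels. A sketch with an illustrative layer size (the 512-channel, 14×14 figures are just a typical example, not a specific MobileNet layer):

```python
def conv_cost(k, m, n, f):
    """Mult-adds of a standard k x k convolution:
    m input channels, n output channels, f x f feature map."""
    return k * k * m * n * f * f

def depthwise_separable_cost(k, m, n, f):
    """Depthwise k x k conv (one filter per input channel)
    followed by a 1x1 pointwise conv that mixes channels."""
    return k * k * m * f * f + m * n * f * f

# Example layer: 3x3 kernel, 512 -> 512 channels, 14x14 map.
std = conv_cost(3, 512, 512, 14)
sep = depthwise_separable_cost(3, 512, 512, 14)
print(sep / std)  # == 1/512 + 1/9, roughly 8-9x fewer mult-adds
```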

Read more »

[Paper Reading] Densely Connected Convolutional Networks

Posted on 2018-12-04 | Edited on 2018-12-11
Word count: 2.9k | Reading time ≈ 3 min.

Title: Densely Connected Convolutional Networks
Authors: Gao Huang, Zhuang Liu, et al.
Venue: CVPR, 2017
Abstract: Observing some redundancy in the ResNet architecture, the authors propose connecting every layer in the network to every other layer, rather than only linking blocks the way ResNet does. The resulting DenseNet has fewer parameters and fewer layers than ResNet, yet performs better.
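In a dense block, each layer takes the concatenation of all earlier feature maps as input, so input width grows linearly with the growth rate. A small sketch (the 64-channel input and growth rate 32 are illustrative values, not a specific DenseNet configuration):

```python
def dense_block_input_channels(k0, growth_rate, num_layers):
    """Layer l of a dense block sees the concatenated outputs of all
    preceding layers: k0 + (l - 1) * growth_rate input channels,
    where k0 is the channel count entering the block."""
    return [k0 + (l - 1) * growth_rate for l in range(1, num_layers + 1)]

# 6-layer block, 64 input channels, growth rate k = 32.
print(dense_block_input_channels(64, 32, 6))  # [64, 96, 128, 160, 192, 224]
```

Because each layer only adds `growth_rate` new channels, the per-layer filters stay narrow, which is where the parameter savings over ResNet come from.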

Read more »

[Paper Reading] Aggregated Residual Transformations for Deep Neural Networks

Posted on 2018-12-03 | Edited on 2018-12-11
Word count: 2.6k | Reading time ≈ 2 min.

Title: Aggregated Residual Transformations for Deep Neural Networks
Authors: Saining Xie, Kaiming He
Venue: CVPR, 2017
Abstract: Beyond depth and width, the authors introduce cardinality as a way to improve network performance, and restructure ResNet into ResNeXt, a highly modularized network. Compared with ResNet and Inception, ResNeXt has few hyperparameters and needs no finely hand-crafted design, while using fewer parameters and less computation than ResNet.
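Cardinality is raised at roughly constant cost because the 3×3 convolution is grouped. A parameter-counting sketch comparing a ResNet bottleneck with a ResNeXt "32x4d" block (biases and BN omitted; the 256-channel sizes follow the common bottleneck example, used here for illustration):

```python
def bottleneck_params(c_in, width, c_out, groups=1):
    """Parameters of a 1x1 -> 3x3(grouped) -> 1x1 bottleneck.
    A grouped conv runs `groups` independent convolutions over channel
    slices, shrinking the 3x3 cost by a factor of `groups`."""
    return (c_in * width                        # 1x1 reduce
            + 3 * 3 * (width // groups) * width # 3x3, grouped
            + width * c_out)                    # 1x1 expand

resnet = bottleneck_params(256, 64, 256, groups=1)     # plain bottleneck
resnext = bottleneck_params(256, 128, 256, groups=32)  # cardinality 32, width 4
print(resnet, resnext)  # 69632 70144 -- nearly identical budgets
```

Doubling the inner width while grouping by 32 keeps the parameter count essentially unchanged, which is the "more cardinality at the same complexity" trade the abstract refers to.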

Read more »

[Paper Reading] Wide Residual Networks

Posted on 2018-12-03 | Edited on 2018-12-11
Word count: 1.6k | Reading time ≈ 1 min.

Title: Wide Residual Networks
Authors: Sergey Zagoruyko, Nikos Komodakis
Venue: BMVC, 2016
Abstract: Whereas traditional ResNets pursue depth, the authors start from width (the number of channels) and show that simply widening the network's channels can match the accuracy of much deeper ResNets, while training faster.
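Widening by a factor k multiplies the parameters of each block by roughly k². A quick sketch (two 3×3 convolutions per basic block, biases and BN ignored; the 64-channel base is an illustrative value):

```python
def conv_params(k, c_in, c_out):
    """Parameters of a k x k convolution (no bias)."""
    return k * k * c_in * c_out

def wrn_basic_block_params(channels, widen=1):
    """A WRN basic block is two 3x3 convolutions; the widening factor
    multiplies the channel count, so parameters grow as widen**2."""
    c = channels * widen
    return conv_params(3, c, c) + conv_params(3, c, c)

base = wrn_basic_block_params(64, widen=1)
wide = wrn_basic_block_params(64, widen=4)
print(wide // base)  # 16 == 4**2
```

The quadratic growth means a wide, shallow network can hold as many parameters as a deep, thin one with far fewer sequential layers, which is what makes it faster to train.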

Read more »

[Paper Reading] Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning

Posted on 2018-12-03 | Edited on 2018-12-11
Word count: 584 | Reading time ≈ 1 min.

Title: Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning
Authors: Christian Szegedy, Sergey Ioffe
Venue: AAAI, 2017
Abstract: The authors refine Inception-v3 into Inception-v4, and also combine Inception modules with residual connections. The two resulting networks perform comparably.

Read more »

[Paper Reading] Rethinking the Inception Architecture for Computer Vision

Posted on 2018-12-03 | Edited on 2018-12-11
Word count: 1.4k | Reading time ≈ 1 min.

Title: Rethinking the Inception Architecture for Computer Vision
Authors: Christian Szegedy et al.
Venue: CVPR, 2016
Abstract: The Inception authors further optimize the architecture, introducing asymmetric convolutions that improve both training speed and accuracy; this version is known as Inception-v3.
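The asymmetric factorization replaces an n×n convolution with a 1×n followed by an n×1. A parameter-counting sketch (192 channels matches the paper's 17×17-grid modules, but the equal intermediate channel count is a simplifying assumption here):

```python
def square_conv_params(n, c_in, c_out):
    """Parameters of an n x n convolution (no bias)."""
    return n * n * c_in * c_out

def asymmetric_conv_params(n, c_in, c_out):
    """Factor n x n into 1 x n followed by n x 1; for simplicity the
    intermediate channel count is taken equal to c_out."""
    return n * c_in * c_out + n * c_out * c_out

# 7x7 convolution on 192 channels.
full = square_conv_params(7, 192, 192)
asym = asymmetric_conv_params(7, 192, 192)
print(asym / full)  # 2/7 of the parameters (and mult-adds per position)
```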

Read more »

[Paper Reading] Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift

Posted on 2018-12-01 | Edited on 2018-12-11
Word count: 3.3k | Reading time ≈ 3 min.

Title: Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
Authors: Sergey Ioffe, Christian Szegedy
Venue: ICML, 2015
Abstract: The Inception authors propose batch normalization to address internal covariate shift, making networks train faster. Adding it to their own Inception network lowers the ImageNet error rate further; the resulting network is known as Inception-BN.
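The training-time transform normalizes each feature over the mini-batch, then restores expressiveness with learnable scale and shift. A minimal numpy sketch of the forward pass (inference-time running statistics are omitted):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize each feature to zero mean / unit variance over the
    batch axis, then scale by gamma and shift by beta."""
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.random.default_rng(0).normal(5.0, 3.0, size=(64, 10))
y = batch_norm(x, gamma=np.ones(10), beta=np.zeros(10))
print(np.allclose(y.mean(axis=0), 0),
      np.allclose(y.std(axis=0), 1, atol=1e-2))  # True True
```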

Read more »

[Paper Reading] Identity Mappings in Deep Residual Networks

Posted on 2018-12-01 | Edited on 2018-12-11
Word count: 3k | Reading time ≈ 3 min.

Title: Identity Mappings in Deep Residual Networks
Authors: Kaiming He et al.
Venue: ECCV, 2016
Abstract: The authors modify ResNet-v1 and obtain improved performance; the revised network is known as ResNet-v2.

Read more »

[Paper Reading] Deep Residual Learning for Image Recognition

Posted on 2018-11-30 | Edited on 2018-12-11
Word count: 4.4k | Reading time ≈ 4 min.

Title: Deep Residual Learning for Image Recognition
Authors: Kaiming He et al.
Venue: CVPR, 2016
Abstract: This is the legendary ResNet. The authors propose residual learning to solve the degradation problem of deep networks, push depth to an unprecedented 152 layers, and sweep every ILSVRC 2015 track (classification, detection, localization).
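The core idea fits in one line: instead of learning a mapping H(x) directly, the block learns the residual F(x) = H(x) - x and outputs F(x) + x. A minimal sketch (the toy `layer_fn` stands in for the block's convolutional layers):

```python
import numpy as np

def residual_block(x, layer_fn):
    """Residual learning: output F(x) + x. If the optimal mapping is
    close to the identity, F only has to learn a small perturbation."""
    return layer_fn(x) + x

# If the layers learn to output zero, the block is an exact identity map,
# so adding blocks cannot, in principle, raise the training error.
x = np.array([1.0, 2.0, 3.0])
print(residual_block(x, lambda v: np.zeros_like(v)))  # [1. 2. 3.]
```

This is why the degradation problem (deeper plain networks training worse) largely disappears with skip connections.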

Read more »
Vincent Ho

Get used to being ignored.

50 posts
1 category
2 tags
RSS
© 2018 – 2019 Vincent | Total word count: 215k | Total reading time ≈ 3:15
Powered by Hexo | Theme – NexT.Mist
Thank you so much for visiting the site.