# Deep Residual Networks
1-crop validation error on ImageNet (center 224x224 crop from resized image with shorter side=256):

| model | top-1 | top-5 |
|-------|-------|-------|
| VGG-16 | 28.5% | 9.9% |
| ResNet-50 | 24.7% | 7.8% |
| ResNet-101 | 23.6% | 7.1% |
| ResNet-152 | 23.0% | 6.7% |
10-crop validation error on ImageNet (averaging the softmax scores of 10 224x224 crops from the resized image with shorter side=256), the same as reported in the paper:
| model | top-1 | top-5 |
|-------|-------|-------|
| ResNet-50 | 22.9% | 6.7% |
| ResNet-101 | 21.8% | 6.1% |
| ResNet-152 | 21.4% | 5.7% |
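The 10-crop protocol above can be sketched in NumPy as follows. The `forward` function is a placeholder standing in for the trained network (not part of this repo): it takes a batch of crops and returns raw class scores.

```python
import numpy as np

def ten_crops(img, crop=224):
    """Return the 10 evaluation crops: the four corners and the center
    of `img` (an H x W x C array whose shorter side is already 256),
    plus the horizontal mirror of each."""
    h, w = img.shape[:2]
    # top-left corners of the five un-mirrored crops
    offsets = [(0, 0), (0, w - crop), (h - crop, 0),
               (h - crop, w - crop), ((h - crop) // 2, (w - crop) // 2)]
    crops = [img[y:y + crop, x:x + crop] for y, x in offsets]
    crops += [c[:, ::-1] for c in crops]      # horizontal flips
    return np.stack(crops)                    # shape (10, crop, crop, C)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def ten_crop_predict(img, forward):
    """Average the softmax scores over the 10 crops."""
    scores = softmax(forward(ten_crops(img)))
    return scores.mean(axis=0)                # averaged class probabilities
```

The averaging happens on softmax probabilities, not raw scores, matching the caption above.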
## Third-party re-implementations
Deep residual networks are very easy to implement and train. We also recommend the following third-party re-implementations and extensions:
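As a minimal sketch of why the implementation is simple: a residual block computes y = ReLU(F(x) + x), so the stacked layers only learn a residual on top of an identity shortcut. The names below are illustrative, not taken from any of the implementations listed here, and fully connected layers stand in for the convolutions of the actual networks.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, w1, w2):
    """y = ReLU(F(x) + x): two weight layers form the residual F,
    and the identity shortcut adds the input back before the final ReLU."""
    out = relu(x @ w1)        # first weight layer + ReLU
    out = out @ w2            # second weight layer (no ReLU yet)
    return relu(out + x)      # add the identity shortcut, then ReLU

# with zero weights the residual F vanishes and the block reduces
# to ReLU applied to the identity mapping
x = np.array([[1.0, -2.0, 3.0]])
w = np.zeros((3, 3))
y = residual_block(x, w, w)
```

Because the shortcut carries the signal unchanged, stacking many such blocks does not degrade the identity mapping, which is what makes very deep networks trainable.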
- By Facebook AI Research (FAIR), with training code in Torch and pre-trained ResNet-18/34/50/101 models for ImageNet: blog, code
- Torch, CIFAR-10, with ResNet-20 to ResNet-110, training code, and curves: code
- Lasagne, CIFAR-10, with ResNet-32 and ResNet-56, and training code: code
- Neon, CIFAR-10, with pre-trained ResNet-32 to ResNet-110 models, training code, and curves: code
- Torch, MNIST, 100 layers: blog, code
- A winning entry in Kaggle's right whale recognition challenge: blog, code
- Neon, Place2 (mini), 40 layers: blog, code
- MatConvNet, CIFAR-10, with ResNet-20 to ResNet-110, training code, and curves: code
- TensorFlow, CIFAR-10, with ResNet-32/110/182, training code, and curves: code
- MatConvNet, reproducing CIFAR-10 and ImageNet experiments (supporting official MatConvNet), training code, and curves: blog, code
- Keras, ResNet-50: code
Converters:

- MatConvNet: url
- TensorFlow: url