Keras Applications is the applications module of the Keras deep learning library. It provides model definitions and pre-trained weights for a number of popular architectures, such as VGG16, ResNet50, Xception, MobileNet, and more.
Read the documentation at: https://keras.io/applications/
Keras Applications may be imported directly from an up-to-date installation of Keras:

```python
from keras import applications
```

Keras Applications is compatible with Python 2.7-3.6 and is distributed under the MIT license.
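As a quick illustration of the module, the following is a minimal sketch that classifies a single image with a pre-trained ResNet50. The file name `elephant.jpg` is only a placeholder for a local image of your own.

```python
import numpy as np
from keras.applications.resnet50 import ResNet50, preprocess_input, decode_predictions
from keras.preprocessing import image

# Load ResNet50 with ImageNet weights and the classification head.
model = ResNet50(weights='imagenet')

# 'elephant.jpg' is a placeholder path; substitute any local image.
img = image.load_img('elephant.jpg', target_size=(224, 224))
x = image.img_to_array(img)
x = np.expand_dims(x, axis=0)   # add the batch dimension
x = preprocess_input(x)         # apply the preprocessing the model expects

preds = model.predict(x)
# decode_predictions maps class indices to (synset, label, probability) tuples.
print(decode_predictions(preds, top=5)[0])
```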
The top-1 and top-5 accuracies below were obtained with Keras Applications on the TensorFlow backend, using the 2012 ILSVRC ImageNet validation set, and may differ slightly from the originally reported results (a sketch of how such top-k accuracies can be computed follows the table).

- Input: input size fed into the model
- Top-1: top-1 accuracy with a single center crop
- Top-5: top-5 accuracy with a single center crop
- Size: rounded number of parameters with `include_top=True`
- Stem: rounded number of parameters with `include_top=False` (see the sketch right after this list)
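To make the Size and Stem columns concrete, here is a minimal sketch that counts parameters for VGG16 with and without the classification head, using the standard `count_params()` method of a Keras model; the results should land near the 138.4M / 14.7M figures in the table.

```python
from keras.applications.vgg16 import VGG16

# Full model, including the fully connected classifier on top ("Size").
full = VGG16(weights=None, include_top=True)
print('Size (include_top=True): %.1fM' % (full.count_params() / 1e6))

# Convolutional stem only ("Stem"); useful as a feature extractor or fine-tuning base.
stem = VGG16(weights=None, include_top=False)
print('Stem (include_top=False): %.1fM' % (stem.count_params() / 1e6))
```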
Model | Input | Top-1 | Top-5 | Size | Stem | References |
---|---|---|---|---|---|---|
VGG16 | 224 | 71.268 | 90.050 | 138.4M | 14.7M | [paper] [tf-models] |
VGG19 | 224 | 71.256 | 89.988 | 143.7M | 20.0M | [paper] [tf-models] |
ResNet50 | 224 | 74.928 | 92.060 | 25.6M | 23.6M | [paper] [tf-models] [torch] [caffe] |
ResNet101 | 224 | 76.420 | 92.786 | 44.7M | 42.7M | [paper] [tf-models] [torch] [caffe] |
ResNet152 | 224 | 76.604 | 93.118 | 60.4M | 58.4M | [paper] [tf-models] [torch] [caffe] |
ResNet50V2 | 299 | 75.960 | 93.034 | 25.6M | 23.6M | [paper] [tf-models] [torch] |
ResNet101V2 | 299 | 77.234 | 93.816 | 44.7M | 42.6M | [paper] [tf-models] [torch] |
ResNet152V2 | 299 | 78.032 | 94.162 | 60.4M | 58.3M | [paper] [tf-models] [torch] |
ResNeXt50 | 224 | 77.740 | 93.810 | 25.1M | 23.0M | [paper] [torch] |
ResNeXt101 | 224 | 78.730 | 94.294 | 44.3M | 42.3M | [paper] [torch] |
InceptionV3 | 299 | 77.898 | 93.720 | 23.9M | 21.8M | [paper] [tf-models] |
InceptionResNetV2 | 299 | 80.256 | 95.252 | 55.9M | 54.3M | [paper] [tf-models] |
Xception | 299 | 79.006 | 94.452 | 22.9M | 20.9M | [paper] |
MobileNet(alpha=0.25) | 224 | 51.582 | 75.792 | 0.5M | 0.2M | [paper] [tf-models] |
MobileNet(alpha=0.50) | 224 | 64.292 | 85.624 | 1.3M | 0.8M | [paper] [tf-models] |
MobileNet(alpha=0.75) | 224 | 68.412 | 88.242 | 2.6M | 1.8M | [paper] [tf-models] |
MobileNet(alpha=1.0) | 224 | 70.424 | 89.504 | 4.3M | 3.2M | [paper] [tf-models] |
MobileNetV2(alpha=0.35) | 224 | 60.086 | 82.432 | 1.7M | 0.4M | [paper] [tf-models] |
MobileNetV2(alpha=0.50) | 224 | 65.194 | 86.062 | 2.0M | 0.7M | [paper] [tf-models] |
MobileNetV2(alpha=0.75) | 224 | 69.532 | 89.176 | 2.7M | 1.4M | [paper] [tf-models] |
MobileNetV2(alpha=1.0) | 224 | 71.336 | 90.142 | 3.5M | 2.3M | [paper] [tf-models] |
MobileNetV2(alpha=1.3) | 224 | 74.680 | 92.122 | 5.4M | 3.8M | [paper] [tf-models] |
MobileNetV2(alpha=1.4) | 224 | 75.230 | 92.422 | 6.2M | 4.4M | [paper] [tf-models] |
MobileNetV3(small) | 224 | 68.076 | 87.800 | 2.6M | 0.9M | [paper] [tf-models] |
MobileNetV3(large) | 224 | 75.556 | 92.708 | 5.5M | 3.0M | [paper] [tf-models] |
DenseNet121 | 224 | 74.972 | 92.258 | 8.1M | 7.0M | [paper] [torch] |
DenseNet169 | 224 | 76.176 | 93.176 | 14.3M | 12.6M | [paper] [torch] |
DenseNet201 | 224 | 77.320 | 93.620 | 20.2M | 18.3M | [paper] [torch] |
NASNetLarge | 331 | 82.498 | 96.004 | 93.5M | 84.9M | [paper] [tf-models] |
NASNetMobile | 224 | 74.366 | 91.854 | 7.7M | 4.3M | [paper] [tf-models] |
EfficientNet-B0 | 224 | 77.190 | 93.492 | 5.3M | 4.0M | [paper] [tf-tpu] |
EfficientNet-B1 | 240 | 79.134 | 94.448 | 7.9M | 6.6M | [paper] [tf-tpu] |
EfficientNet-B2 | 260 | 80.180 | 94.946 | 9.2M | 7.8M | [paper] [tf-tpu] |
EfficientNet-B3 | 300 | 81.578 | 95.676 | 12.3M | 10.8M | [paper] [tf-tpu] |
EfficientNet-B4 | 380 | 82.960 | 96.260 | 19.5M | 17.7M | [paper] [tf-tpu] |
EfficientNet-B5 | 456 | 83.702 | 96.710 | 30.6M | 28.5M | [paper] [tf-tpu] |
EfficientNet-B6 | 528 | 84.082 | 96.898 | 43.3M | 41.0M | [paper] [tf-tpu] |
EfficientNet-B7 | 600 | 84.430 | 96.840 | 66.7M | 64.1M | [paper] [tf-tpu] |
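The Top-1 and Top-5 columns are plain top-k accuracies over the validation set. As a hedged illustration (not the exact evaluation script used for the table), top-k accuracy can be computed from a batch of predicted class probabilities like this:

```python
import numpy as np

def top_k_accuracy(y_true, y_prob, k=5):
    """Fraction of samples whose true label is among the k highest-scoring classes.

    y_true: integer class labels, shape (n_samples,)
    y_prob: predicted class probabilities, shape (n_samples, n_classes)
    """
    # Indices of the k largest probabilities per sample (order within the k does not matter).
    top_k = np.argsort(y_prob, axis=1)[:, -k:]
    hits = np.any(top_k == y_true[:, None], axis=1)
    return float(np.mean(hits))

# Toy usage with random scores; a real evaluation would loop over the ImageNet validation images.
y_true = np.array([0, 2, 1])
y_prob = np.random.rand(3, 1000)
print(top_k_accuracy(y_true, y_prob, k=1), top_k_accuracy(y_true, y_prob, k=5))
```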
Reference implementations from the community, built on top of Keras:

- SSD by @rykov8 [paper]
- YOLOv2 by @allanzelener [paper]
- YOLOv3 by @qqwweee [paper]
- Mask RCNN by @matterport [paper]
- U-Net by @zhixuhao [paper]
- RetinaNet by @fizyr [paper]
- keras-rl by @keras-rl
- RocAlphaGo by @Rochester-NRT [paper]
- Keras-GAN by @eriklindernoren