
"""MobileNet v3 models for Keras.
The following table describes the performance of MobileNets
(MACs stands for Multiply-Adds):

| Classification Checkpoint                 | MACs (M) | Parameters (M) | Top-1 Accuracy | Pixel 1 CPU (ms) |
|-------------------------------------------|----------|----------------|----------------|------------------|
| [mobilenet_v3_large_1.0_224]              | 217      | 5.4            | 75.2           | 51.2             |
| [mobilenet_v3_large_0.75_224]             | 155      | 4.0            | 73.3           | 39.8             |
| [mobilenet_v3_large_minimalistic_1.0_224] | 209      | 3.9            | 72.3           | 44.1             |
| [mobilenet_v3_small_1.0_224]              | 66       | 2.9            | 67.5           | 15.8             |
| [mobilenet_v3_small_0.75_224]             | 44       | 2.4            | 65.4           | 12.8             |
| [mobilenet_v3_small_minimalistic_1.0_224] | 65       | 2.0            | 61.9           | 12.2             |

The weights for all six models are obtained and translated
from the TensorFlow checkpoints found [here]
(https://github.com/tensorflow/models/tree/master/research/slim/nets/mobilenet/README.md).
# Reference
This file contains building code for MobileNetV3, based on
[Searching for MobileNetV3]
(https://arxiv.org/pdf/1905.02244.pdf) (ICCV 2019)
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import os
import warnings

from . import get_submodules_from_kwargs
from . import imagenet_utils
from .imagenet_utils import decode_predictions
from .imagenet_utils import _obtain_input_shape

BASE_WEIGHT_PATH = ('https://github.com/DrSlink/mobilenet_v3_keras/releases/download/v1.0/')

backend = None
layers = None
models = None
keras_utils = None


def preprocess_input(x, **kwargs):
    """Preprocesses a numpy array encoding a batch of images.
    # Arguments
        x: a 4D numpy array consisting of RGB values within [0, 255].
    # Returns
        Preprocessed array.
    """
    return imagenet_utils.preprocess_input(x, mode='tf', **kwargs)
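For reference, `mode='tf'` preprocessing simply rescales pixel values from [0, 255] to [-1, 1]. A minimal NumPy sketch of the same scaling (the helper name here is illustrative, not part of this module):

```python
import numpy as np

def tf_mode_preprocess(x):
    # Equivalent scaling to imagenet_utils.preprocess_input(x, mode='tf'):
    # map RGB values from [0, 255] to [-1, 1].
    x = np.asarray(x, dtype=np.float32)
    return x / 127.5 - 1.0

batch = np.zeros((1, 224, 224, 3), dtype=np.float32)
batch[..., 1] = 127.5
batch[..., 2] = 255.0
out = tf_mode_preprocess(batch)
print(out.min(), out.max())  # -1.0 1.0
```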


def hard_sigmoid(x):
    return layers.ReLU(6.)(x + 3.) * (1. / 6.)


def hard_swish(x):
    return layers.Multiply()([layers.Activation(hard_sigmoid)(x), x])
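These piecewise-linear activations follow the paper's definitions: hard_sigmoid(x) = ReLU6(x + 3) / 6 and hard_swish(x) = x * hard_sigmoid(x). A backend-free NumPy sketch of the same math (function names here are mine):

```python
import numpy as np

def np_hard_sigmoid(x):
    # ReLU6(x + 3) / 6: clips (x + 3) to [0, 6], then scales to [0, 1]
    return np.minimum(np.maximum(x + 3.0, 0.0), 6.0) / 6.0

def np_hard_swish(x):
    # x * hard_sigmoid(x): ~0 for large negative x, ~x for large positive x
    return x * np_hard_sigmoid(x)

x = np.array([-4.0, -3.0, 0.0, 3.0, 4.0])
print(np_hard_sigmoid(x))  # [0.  0.  0.5 1.  1. ]
print(np_hard_swish(x))    # [-0. -0.  0.  3.  4. ]
```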


def _activation(x, name='relu'):
    if name == 'relu':
        return layers.ReLU()(x)
    elif name == 'hardswish':
        return hard_swish(x)
    else:
        raise ValueError('Unknown activation: ' + str(name))


# This function is taken from the original tf repo.
# It ensures that all layers have a channel number that is divisible by 8
# It can be seen here:
# https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py


def _make_divisible(v, divisor, min_value=None):
    if min_value is None:
        min_value = divisor
    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)
    # Make sure that round down does not go down by more than 10%.
    if new_v < 0.9 * v:
        new_v += divisor
    return new_v
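The rounding rule can be exercised on its own. A self-contained sketch (reimplemented here so the snippet runs standalone; the name is illustrative):

```python
def make_divisible_sketch(v, divisor=8, min_value=None):
    # Round v to the nearest multiple of divisor, but never end up
    # more than 10% below the requested value.
    if min_value is None:
        min_value = divisor
    new_v = max(min_value, int(v + divisor / 2) // divisor * divisor)
    if new_v < 0.9 * v:
        new_v += divisor
    return new_v

print(make_divisible_sketch(37))  # 40: rounds up to the nearest multiple of 8
print(make_divisible_sketch(33))  # 32: rounding down by ~3% stays within the 10% budget
print(make_divisible_sketch(10))  # 16: rounding down to 8 would lose 20%, so round up
```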