TensorFlow 2.0 YOLO: Model Building and Training
This post wraps up the YOLOv2 code; download links for the code and related materials are at the end of the blog. The article only covers building and training the model — some of the functions used in the code were introduced in the previous posts, which are worth a look if you are interested.
Model Building
For YOLO's backbone we build on Darknet-19 and initialize it from its pretrained weights.

- Darknet-19
import tensorflow as tf
from tensorflow.keras import layers
import tensorflow.keras.backend as K

class SpaceToDepth(layers.Layer):
    def __init__(self, block_size, **kwargs):
        self.block_size = block_size
        super(SpaceToDepth, self).__init__(**kwargs)

    def call(self, inputs):
        # Rearrange each block_size x block_size spatial block into the
        # channel dimension (the YOLOv2 "reorg"/passthrough operation).
        x = inputs
        batch, height, width, depth = K.int_shape(x)
        batch = -1  # keep the batch dimension dynamic
        reduced_height = height // self.block_size
        reduced_width = width // self.block_size
        y = K.reshape(x, (batch, reduced_height, self.block_size,
                          reduced_width, self.block_size, depth))
        z = K.permute_dimensions(y, (0, 1, 3, 2, 4, 5))
        t = K.reshape(z, (batch, reduced_height, reduced_width,
                          depth * self.block_size ** 2))
        return t

    def compute_output_shape(self, input_shape):
        shape = (input_shape[0],
                 input_shape[1] // self.block_size,
                 input_shape[2] // self.block_size,
                 input_shape[3] * self.block_size ** 2)
        return tf.TensorShape(shape)
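The reshape/transpose sequence in `call` is easier to trace on a tiny NumPy array; a minimal sketch of the same reorg with `block_size=2`, outside of any Keras layer:

```python
import numpy as np

block_size = 2
# 1x4x4x1 input: the value at (h, w) is 4*h + w, so blocks are easy to trace.
x = np.arange(16, dtype=np.float32).reshape(1, 4, 4, 1)
b, h, w, d = x.shape
rh, rw = h // block_size, w // block_size

y = x.reshape(b, rh, block_size, rw, block_size, d)
z = y.transpose(0, 1, 3, 2, 4, 5)          # group each 2x2 spatial block together
t = z.reshape(b, rh, rw, d * block_size ** 2)

print(t.shape)        # (1, 2, 2, 4)
print(t[0, 0, 0])     # [0. 1. 4. 5.] -- the top-left 2x2 block of the input
```

Each output "pixel" now carries a full 2x2 input block in its channels, which is exactly how YOLOv2's passthrough connection lets fine-grained features reach the 13x13 detection head.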
# Darknet-19 forward network
input_image = layers.Input((IMGSZ, IMGSZ, 3), dtype='float32')
# Layer 1
x = layers.Conv2D(32, (3, 3), strides=(1, 1), padding='same', name='conv_1', use_bias=False)(input_image)
x = layers.BatchNormalization(name='norm_1')(x)
x = layers.LeakyReLU(alpha=0.1)(x)
x = layers.MaxPooling2D(pool_size=(2, 2))(x)
# Layer 2
x = layers.Conv2D(64, (3, 3), strides=(1, 1), padding='same', name='conv_2', use_bias=False)(x)
x = layers.BatchNormalization(name='norm_2')(x)
x = layers.LeakyReLU(alpha=0.1)(x)
x = layers.MaxPooling2D(pool_size=(2, 2))(x)
# Layer 3
x = layers.Conv2D(128, (3, 3), strides=(1, 1), padding='same', name='conv_3', use_bias=False)(x)
x = layers.BatchNormalization(name='norm_3')(x)
x = layers.LeakyReLU(alpha=0.1)(x)
# Layer 4
x = layers.Conv2D(64, (1, 1), strides=(1, 1), padding='same', name='conv_4', use_bias=False)(x)
x = layers.BatchNormalization(name='norm_4')(x)
x = layers.LeakyReLU(alpha=0.1)(x)
# Layer 5
x = layers.Conv2D(128, (3, 3), strides=(1, 1), padding='same', name='conv_5', use_bias=False)(x)
x = layers.BatchNormalization(name='norm_5')(x)
x = layers.LeakyReLU(alpha=0.1)(x)
# ... subsequent Darknet-19 layers follow the same Conv-BN-LeakyReLU pattern
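The full Darknet-19 backbone contains five 2x2 max-pools, each halving the spatial resolution. Assuming `IMGSZ = 416` (YOLOv2's default input size — an assumption here, since the constant is defined elsewhere in the repo), a quick check of the resulting grid size:

```python
# Assuming IMGSZ = 416; five 2x2 max-pools each halve the spatial resolution.
IMGSZ = 416
size = IMGSZ
for pool in range(5):
    size //= 2
print(size)  # 13 -- the 13x13 YOLOv2 output grid
```

This is why the final feature map is 13x13: the network's total stride is 2**5 = 32, and 416 / 32 = 13.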

This article walks through building and training a YOLOv2 object detection model in TensorFlow 2.0, covering the design of the model architecture, weight initialization, the loss computation, and the training process. The code shows how each layer of the model is constructed and how pretrained weights are used for initialization.