I've recently been working on on-device learning and needed some basic examples to study from, so I built an AlexNet. The original AlexNet performs well on CIFAR-10, but falls short on CIFAR-100, so I modified the code a bit, adding batch normalization and dropout. I then noticed some overfitting, so I also applied data augmentation. Finally, to run the model on device, I converted it to a TFLite version. The code is as follows:
import numpy as np
import tensorflow as tf
# Load the CIFAR-100 dataset
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar100.load_data()
# Normalize the pixel values between 0 and 1
x_train = x_train.astype('float32') / 255.0
x_test = x_test.astype('float32') / 255.0
# Convert the labels to one-hot encoding
y_train = tf.keras.utils.to_categorical(y_train, 100)
y_test = tf.keras.utils.to_categorical(y_test, 100)
# Define the AlexNet-style model (adapted for 32x32 inputs)
model = tf.keras.models.Sequential([
    tf.keras.layers.Conv2D(96, kernel_size=3, activation='relu', input_shape=(32, 32, 3)),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.MaxPooling2D(pool_size=2),
    tf.keras.layers.Dropout(0.25),
    tf.keras.layers.Conv2D(256, kernel_size=3, activation='relu'),
    # The listing is truncated at this point in the source; the layers below
    # are a reconstruction following the same conv/BN/pool/dropout pattern.
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.MaxPooling2D(pool_size=2),
    tf.keras.layers.Dropout(0.25),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(512, activation='relu'),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(100, activation='softmax')
])
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
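The data augmentation step mentioned above is not visible in the excerpt, so here is a minimal sketch using Keras' `ImageDataGenerator`; the shift and flip parameters are my own illustrative assumptions, not the author's values, and a tiny random batch stands in for the CIFAR-100 arrays:

```python
import numpy as np
import tensorflow as tf

# Augmentation pipeline: small translations plus horizontal flips
# (parameter values are illustrative assumptions, not from the article).
datagen = tf.keras.preprocessing.image.ImageDataGenerator(
    width_shift_range=0.1,
    height_shift_range=0.1,
    horizontal_flip=True,
)

# Tiny random batch standing in for the x_train / y_train arrays above.
x_train = np.random.rand(8, 32, 32, 3).astype('float32')
y_train = tf.keras.utils.to_categorical(np.random.randint(0, 100, 8), 100)

# datagen.flow yields augmented batches on the fly; during training you would
# pass it to model.fit, e.g.:
#   model.fit(datagen.flow(x_train, y_train, batch_size=128), epochs=50)
x_aug, y_aug = next(iter(datagen.flow(x_train, y_train, batch_size=8)))
print(x_aug.shape, y_aug.shape)  # augmented images keep the input shape
```

Feeding the generator to `model.fit` means each epoch sees a freshly perturbed copy of the training set, which is what counters the overfitting noted above.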
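The final TFLite conversion is also cut off in the excerpt; a minimal sketch with `tf.lite.TFLiteConverter` looks like the following. A trivial stand-in model is used here so the snippet is self-contained (in the article it would be the trained AlexNet), and the output file name is my choice:

```python
import tensorflow as tf

# Trivial stand-in model; in the article this would be the trained AlexNet.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(100, activation='softmax', input_shape=(10,)),
])

# Convert the Keras model to a TFLite flatbuffer for on-device inference.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # optional: shrinks the model
tflite_model = converter.convert()

# Write the flatbuffer to disk; this file is what gets deployed to the device.
with open('alexnet_cifar100.tflite', 'wb') as f:
    f.write(tflite_model)
```

`Optimize.DEFAULT` enables dynamic-range quantization, which trades a little accuracy for a smaller, faster model; drop that line if you want a plain float32 conversion.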
