CS231n Assignment 1: Workflow and Selected Notes

This post walks through the first assignment of the CS231n course, covering dataset loading, visualization, splitting the training set, and implementations of the k-Nearest-Neighbor, Support Vector Machine, and Softmax classifiers. It focuses on the loss functions, gradient computation, and the cross-validation procedure used to select the best hyperparameters, and finally evaluates each classifier by its prediction accuracy.


This post is only meant as a personal memo.

The basic steps of Assignment 1 are:

1. Load the dataset and print the shapes

# Imports needed by the snippets below (in the assignment notebooks these live in an earlier setup cell).
import numpy as np
import matplotlib.pyplot as plt
from cs231n.data_utils import load_CIFAR10

# Load the raw CIFAR-10 data.
cifar10_dir = 'cs231n/datasets/cifar-10-batches-py'

# Cleaning up variables to prevent loading data multiple times (which may cause memory issue)
try:
   del X_train, y_train
   del X_test, y_test
   print('Clear previously loaded data.')
except:
   pass

X_train, y_train, X_test, y_test = load_CIFAR10(cifar10_dir)

# As a sanity check, we print out the size of the training and test data.
print('Training data shape: ', X_train.shape)
print('Training labels shape: ', y_train.shape)
print('Test data shape: ', X_test.shape)
print('Test labels shape: ', y_test.shape)
Output:

Clear previously loaded data.
Training data shape:  (50000, 32, 32, 3)
Training labels shape:  (50000,)
Test data shape:  (10000, 32, 32, 3)
Test labels shape:  (10000,)
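
The classifiers listed in the summary operate on flattened row vectors rather than 32x32x3 arrays, and the summary also mentions splitting the training set. Below is a minimal preprocessing sketch, assuming the usual 49000/1000 train/validation split from the assignment notebooks; the names num_training, num_validation, X_val, X_train_rows and so on are mine, not necessarily the notebook's.

# Assumed split sizes; the exact numbers are set in the assignment notebook.
num_training = 49000
num_validation = 1000

# Hold out the last images of the training set as a validation set.
mask = range(num_training, num_training + num_validation)
X_val, y_val = X_train[mask], y_train[mask]

mask = range(num_training)
X_train_sub, y_train_sub = X_train[mask], y_train[mask]

# Flatten each 32x32x3 image into a 3072-dimensional row vector.
X_train_rows = X_train_sub.reshape(X_train_sub.shape[0], -1)
X_val_rows = X_val.reshape(X_val.shape[0], -1)
X_test_rows = X_test.reshape(X_test.shape[0], -1)
print(X_train_rows.shape, X_val_rows.shape, X_test_rows.shape)
# (49000, 3072) (1000, 3072) (10000, 3072)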

2. Visualize a few examples with subplot

# Visualize some examples from the dataset.
# We show a few examples of training images from each class.
classes = ['plane', 'car', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse', 'ship', 'truck']
num_classes = len(classes)
samples_per_class = 7
for y, cls in enumerate(classes):
    idxs = np.flatnonzero(y_train == y)
    idxs = np.random.choice(idxs, samples_per_class, replace=False)
    for i, idx in enumerate(idxs):
        plt_idx = i * num_classes + y + 1
        plt.subplot(samples_per_class, num_classes, plt_idx)
        # Show the image, hide the axes, and label the top row with the class name.
        plt.imshow(X_train[idx].astype('uint8'))
        plt.axis('off')
        if i == 0:
            plt.title(cls)
plt.show()
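
The summary above also mentions the k-Nearest-Neighbor classifier, whose core is a fully vectorized L2 distance matrix between test and training rows. The sketch below is only an illustration under the preprocessing assumed earlier; the helper names compute_distances_no_loops and predict_labels echo the assignment's KNearestNeighbor class, but the free-function form and the choice k=5 are assumptions here, not the assignment's exact code.

def compute_distances_no_loops(X, X_train):
    # ||x - y||^2 = ||x||^2 - 2 x.y + ||y||^2, computed for all pairs at once.
    test_sq = np.sum(X ** 2, axis=1, keepdims=True)   # (num_test, 1)
    train_sq = np.sum(X_train ** 2, axis=1)           # (num_train,)
    cross = X @ X_train.T                             # (num_test, num_train)
    return np.sqrt(test_sq - 2 * cross + train_sq)

def predict_labels(dists, y_train, k=5):
    # For each test row, vote among the labels of the k closest training points.
    y_pred = np.zeros(dists.shape[0], dtype=y_train.dtype)
    for i in range(dists.shape[0]):
        closest_y = y_train[np.argsort(dists[i])[:k]]
        y_pred[i] = np.bincount(closest_y).argmax()
    return y_pred

The test accuracy is then just np.mean(y_pred == y_test), and the cross-validation mentioned in the summary repeats this over folds of the training data for several candidate values of k to pick the best one.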