Learning the MNIST Dataset with a DBN

This tutorial runs the pretraining and fine-tuning of a Deep Belief Network (DBN) using TensorFlow code from GitHub. The script downloads the MNIST training data and then pretrains several hidden layers one at a time, in a greedy layer-wise fashion.
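During pretraining, each layer of a DBN is trained as a Restricted Boltzmann Machine (RBM). To illustrate what each "Pretraing layer" step in the log below is doing, here is a minimal NumPy sketch of a single binary RBM updated with one step of contrastive divergence (CD-1). All names and hyperparameters here are illustrative and not taken from the repository's `dbn.py`.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Binary-binary RBM trained with one step of contrastive divergence (CD-1)."""

    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible bias
        self.b_h = np.zeros(n_hidden)    # hidden bias
        self.lr = lr

    def cd1(self, v0):
        # Positive phase: hidden probabilities given the data.
        p_h0 = sigmoid(v0 @ self.W + self.b_h)
        h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
        # Negative phase: one Gibbs step back to the visible units.
        p_v1 = sigmoid(h0 @ self.W.T + self.b_v)
        p_h1 = sigmoid(p_v1 @ self.W + self.b_h)
        # Parameter updates from the difference of the two phases.
        n = v0.shape[0]
        self.W += self.lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / n
        self.b_v += self.lr * (v0 - p_v1).mean(axis=0)
        self.b_h += self.lr * (p_h0 - p_h1).mean(axis=0)
        # Reconstruction cross-entropy, analogous to the per-epoch "cost" in the log.
        eps = 1e-10
        return -np.mean(np.sum(v0 * np.log(p_v1 + eps)
                               + (1 - v0) * np.log(1 - p_v1 + eps), axis=1))
```

Calling `cd1` repeatedly on the same batch should drive the reconstruction cost down, which is the trend visible in the pretraining log.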

We use TensorFlow code from GitHub; data download and training are handled in one go.

Download the code

#git clone https://github.com/xiaohu2015/DeepLearning_tutorials.git

Run

#python dbn.py

Output

Successfully downloaded train-images-idx3-ubyte.gz 9912422 bytes.
Extracting MNIST_data/train-images-idx3-ubyte.gz
Successfully downloaded train-labels-idx1-ubyte.gz 28881 bytes.
Extracting MNIST_data/train-labels-idx1-ubyte.gz
Successfully downloaded t10k-images-idx3-ubyte.gz 1648877 bytes.
Extracting MNIST_data/t10k-images-idx3-ubyte.gz
Successfully downloaded t10k-labels-idx1-ubyte.gz 4542 bytes.
Extracting MNIST_data/t10k-labels-idx1-ubyte.gz
2018-07-09 21:31:32.912218: I tensorflow/core/platform/cpu_feature_guard.cc:140] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
Starting pretraining...

    Pretraing layer 0 Epoch 0 cost: 82.7562146932
    Pretraing layer 0 Epoch 1 cost: 71.4445186615
    Pretraing layer 0 Epoch 2 cost: 68.2782074252
    Pretraing layer 0 Epoch 3 cost: 66.6253933924
    Pretraing layer 0 Epoch 4 cost: 65.5671972968
    Pretraing layer 0 Epoch 5 cost: 64.8149403104
    Pretraing layer 0 Epoch 6 cost: 64.2372861134
    Pretraing layer 0 Epoch 7 cost: 63.7739256183
    Pretraing layer 0 Epoch 8 cost: 63.3849852024
    Pretraing layer 0 Epoch 9 cost: 63.0742568346
    Pretraing layer 1 Epoch 0 cost: 127.297717382
    Pretraing layer 1 Epoch 1 cost: 115.47870919
    Pretraing layer 1 Epoch 2 cost: 112.256735805
    Pretraing layer 1 Epoch 3 cost: 110.447800154
    Pretraing layer 1 Epoch 4 cost: 109.252376036
    Pretraing layer 1 Epoch 5 cost: 108.416442469
    Pretraing layer 1 Epoch 6 cost: 107.767917571
    Pretraing layer 1 Epoch 7 cost: 107.259497833
    Pretraing layer 1 Epoch 8 cost: 106.867456998
    Pretraing layer 1 Epoch 9 cost: 106.534078383
    Pretraing layer 2 Epoch 0 cost: 36.3754562274
    Pretraing layer 2 Epoch 1 cost: 32.9483315901
    Pretraing layer 2 Epoch 2 cost: 32.2201649111
    Pretraing layer 2 Epoch 3 cost: 31.8175771783
    Pretraing layer 2 Epoch 4 cost: 31.5295569212
    Pretraing layer 2 Epoch 5 cost: 31.3358987011
    Pretraing layer 2 Epoch 6 cost: 31.1839012215
    Pretraing layer 2 Epoch 7 cost: 31.056089396
    Pretraing layer 2 Epoch 8 cost: 30.9713914368
    Pretraing layer 2 Epoch 9 cost: 30.8702448134

The pretraining process ran for 10.1000482996 minutes
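The log shows three RBMs trained in sequence: once layer 0 is trained, its hidden activations become the training data for layer 1, and so on. The greedy layer-wise loop can be sketched as follows (layer sizes, epoch count, and learning rate are illustrative, not the repository's values):

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden, epochs=5, lr=0.1):
    """Train one binary RBM with full-batch CD-1; return its weights and hidden bias."""
    n_visible = data.shape[1]
    W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
    b_v = np.zeros(n_visible)
    b_h = np.zeros(n_hidden)
    for _ in range(epochs):
        p_h0 = sigmoid(data @ W + b_h)                       # positive phase
        h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
        p_v1 = sigmoid(h0 @ W.T + b_v)                       # one Gibbs step
        p_h1 = sigmoid(p_v1 @ W + b_h)
        n = data.shape[0]
        W += lr * (data.T @ p_h0 - p_v1.T @ p_h1) / n
        b_v += lr * (data - p_v1).mean(axis=0)
        b_h += lr * (p_h0 - p_h1).mean(axis=0)
    return W, b_h

def pretrain_stack(data, layer_sizes):
    """Greedy layer-wise pretraining: each RBM consumes the previous layer's features."""
    layers, x = [], data
    for n_hidden in layer_sizes:
        W, b_h = train_rbm(x, n_hidden)
        layers.append((W, b_h))
        x = sigmoid(x @ W + b_h)   # hidden activations feed the next RBM
    return layers
```

The returned weight matrices then initialize the feed-forward network that the fine-tuning stage trains end to end.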

Start finetuning...

    Epoch 0 cost: 0.444020345374, validation accuacy: 0.935400009155
    Epoch 1 cost: 0.228727179088, validation accuacy: 0.947200000286
    Epoch 2 cost: 0.192814317359, validation accuacy: 0.956600010395
    Epoch 3 cost: 0.172022930289, validation accuacy: 0.958800017834
    Epoch 4 cost: 0.15778082135, validation accuacy: 0.962999999523
    Epoch 5 cost: 0.146718783832, validation accuacy: 0.963800013065
    Epoch 6 cost: 0.137771496312, validation accuacy: 0.965600013733
    Epoch 7 cost: 0.130382152335, validation accuacy: 0.9674000144
    Epoch 8 cost: 0.124080524492, validation accuacy: 0.969399988651
    Epoch 9 cost: 0.118702218932, validation accuacy: 0.970399975777

The finetuning process ran for 1.72220735153 minutes
[Finished in 780.0s]
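Fine-tuning places a softmax output layer on top of the pretrained stack and trains the whole network by backpropagation on the labels, tracking validation accuracy each epoch as in the log above. As a minimal sketch of just the supervised part, here is plain softmax regression by gradient descent over fixed features; the function and data are hypothetical, not the repository's code:

```python
import numpy as np

rng = np.random.default_rng(2)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def finetune_head(x_train, y_train, x_val, y_val, n_classes, epochs=50, lr=0.5):
    """Train a softmax output layer by gradient descent, recording validation accuracy."""
    n_features = x_train.shape[1]
    W = np.zeros((n_features, n_classes))
    b = np.zeros(n_classes)
    onehot = np.eye(n_classes)[y_train]
    accs = []
    for _ in range(epochs):
        p = softmax(x_train @ W + b)
        grad = (p - onehot) / len(x_train)   # gradient of mean cross-entropy wrt logits
        W -= lr * x_train.T @ grad
        b -= lr * grad.sum(axis=0)
        accs.append(np.mean(np.argmax(x_val @ W + b, axis=1) == y_val))
    return W, b, accs
```

In the real script the gradient also flows back into the pretrained layers, which is why fine-tuning pushes validation accuracy well beyond what the frozen features alone would give.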



Code provided by Ruslan Salakhutdinov and Geoff Hinton. Permission is granted for anyone to copy, use, modify, or distribute this program and accompanying programs and documents for any purpose, provided this copyright notice is retained and prominently displayed, along with a note saying that the original programs are available from our web page. The programs and documents are distributed without any warranty, express or implied. As the programs were written for research purposes only, they have not been tested to the degree that would be advisable in any important application. All use of these programs is entirely at the user's own risk.

How to make it work:

1. Create a separate directory and download all these files into the same directory.
2. Download the following 4 files from http://yann.lecun.com/exdb/mnist:
   - train-images-idx3-ubyte.gz
   - train-labels-idx1-ubyte.gz
   - t10k-images-idx3-ubyte.gz
   - t10k-labels-idx1-ubyte.gz
3. Unzip these 4 files by executing:
   - gunzip train-images-idx3-ubyte.gz
   - gunzip train-labels-idx1-ubyte.gz
   - gunzip t10k-images-idx3-ubyte.gz
   - gunzip t10k-labels-idx1-ubyte.gz
   If unzipping with WinZip, make sure the file names have not been changed by WinZip.
4. Download the Conjugate Gradient code minimize.m.
5. Download Autoencoder_Code.tar, which contains 13 files, OR download each of the following 13 files separately for training an autoencoder and a classification model:
   - mnistdeepauto.m: Main file for training a deep autoencoder
   - mnistclassify.m: Main file for training a classification model
   - converter.m: Converts raw MNIST digits into MATLAB format
   - rbm.m: Trains an RBM with binary hidden and binary visible units
   - rbmhidlinear.m: Trains an RBM with Gaussian hidden and binary visible units
   - backprop.m: Backpropagation for fine-tuning an autoencoder
   - backpropclassify.m: Backpropagation for classification using the "encoder" network
   - CG_MNIST.m: Conjugate gradient optimization for fine-tuning an autoencoder
   - CG_CLASSIFY_INIT.m: Co