Neural network tuning: the effect of batch size on network performance

This article experimentally examines how the batch size affects the performance of a specific neural network (an 81*60*2 architecture) on the MNIST dataset, with the learning rate fixed at 0.1 and 10,000 iterations per batch. The results show that as the batch size increases, performance first improves and then levels off, and that once the batch size exceeds 50 the network's performance becomes noticeably more stable.

The network used here has an 81*60*2 structure. The learning rate is fixed at 0.1, the noise ratio is held at 0, the number of batches is 200, each batch is trained for 10,000 iterations, and a test is run every 10 batches. The batch size is then varied step by step to observe its effect on network performance.

The experimental data are the digits 0 and 1 from the MNIST dataset: the training set contains 5,863 images of 0 and 6,677 images of 1, and the test set contains 980 images of 0 and 1,135 images of 1. Each image is reduced by 1/3 pooling from 28*28 to 9*9.
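The pooling step is not shown in the article; the following is a minimal sketch of one way the "1/3 pooling" from 28*28 to 9*9 could be done, assuming 3*3 average pooling on the image cropped to 27*27 (the function name and the use of NumPy are my own assumptions):

```python
import numpy as np

def pool_3x3(img28):
    """Downsample a 28*28 grayscale image to 9*9 by 3*3 average pooling.

    The image is cropped to 27*27 so that it divides evenly into a 9*9
    grid of 3*3 blocks; each block is replaced by its mean value.
    """
    img27 = img28[:27, :27]             # drop the last row and column
    blocks = img27.reshape(9, 3, 9, 3)  # 9*9 grid of 3*3 blocks
    return blocks.mean(axis=(1, 3))     # average each block -> 9*9

# Example: a random array stands in for a 28*28 MNIST image here.
pooled = pool_3x3(np.random.rand(28, 28))
print(pooled.shape)   # (9, 9)
```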

The relationship between the number of batches (500), the number of iterations (10,000), and the batch size is 500 * 10000 * batchsize. For example, with batchsize = 100, each of the 100 samples in the batch is run through the network once, the resulting errors are accumulated and averaged, and the weights are updated; this process is repeated for 10,000 iterations, and 500 such batches of samples are drawn.
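To make this batch/iteration scheme concrete, here is a minimal sketch of the update rule described above for an 81-60-2 network. Only the "average the error over the batch, update the weights, repeat it=10000 times per batch" structure comes from the text; the sigmoid activation, squared-error gradient, initialization, and all names are my own assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_one_batch(X, Y, W1, W2, lr=0.1, iterations=10000):
    """Repeat `iterations` times: run every sample of the batch through the
    81-60-2 network, average the output error over the batch, and apply one
    gradient-descent update to the weights."""
    for _ in range(iterations):
        # forward pass for the whole batch (rows of X are 9*9 images flattened to 81)
        h = sigmoid(X @ W1)                      # hidden layer, shape (batch, 60)
        y = sigmoid(h @ W2)                      # output layer, shape (batch, 2)

        # backward pass on squared error, gradients averaged over the batch
        d_out = (y - Y) * y * (1 - y)            # (batch, 2)
        d_hid = (d_out @ W2.T) * h * (1 - h)     # (batch, 60)
        W2 -= lr * (h.T @ d_out) / len(X)
        W1 -= lr * (X.T @ d_hid) / len(X)
    return W1, W2

# Hypothetical example: one batch of 100 samples (batchsize = 100).
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(81, 60))
W2 = rng.normal(scale=0.1, size=(60, 2))
X = rng.random((100, 81))                        # stand-in for pooled 9*9 images
Y = np.eye(2)[rng.integers(0, 2, 100)]           # one-hot labels for digits 0 and 1
W1, W2 = train_one_batch(X, Y, W1, W2, iterations=100)   # fewer iterations for the demo
```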

The results are summarized in the table below.


Network structure: 81*60*2 (all runs)
Training set: full sample set    Test set: full test set
Learning rate: 0.1    Iterations per batch: it=10000

Batch size   z=2       z=5       z=10      z=20      z=30      z=50      z=100     z=200
Mean         0.660364  0.778482  0.780908  0.843712  0.822326  0.869144  0.861909  0.896198
Std dev      0.075108  0.028592  0.053443  0.046167  0.044881  0.006091  0.0064    0.00622
Max          0.745626  0.808511  0.82695   0.87234   0.871395  0.882742  0.895508  0.909693


As the batch size grows, the mean accuracy also grows, from about 0.66 up to about 0.89. Judging by the standard deviation, the network's overall performance becomes relatively stable once the batch size reaches 50 or more. Because each batch is trained for 10,000 iterations, the network is already close to its maximum performance after the first few batches, and performance changes little as further batches are added.
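The mean, standard deviation, and maximum reported in the tables are taken over the per-test accuracy readings (the raw numbers listed further below). A minimal sketch of how one such summary row can be computed, using the first five readings of the z=50 column from the noisy-test-set table as example input:

```python
import numpy as np

# First five per-test accuracies of the z=50 column (test-set noise 10%).
accuracies = np.array([0.867139, 0.863830, 0.866194, 0.858156, 0.869031])

print("mean   ", accuracies.mean())   # average accuracy over the tests
print("std dev", accuracies.std())    # population standard deviation
print("max    ", accuracies.max())    # best single test accuracy
# Note: whether the article uses the population or the sample standard
# deviation is not stated; np.std defaults to the population form.
```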

(Figure: per-test accuracy for batchsize = 5.)



*******************************************************************************************************************

It was later discovered that in this set of runs the training-set noise ratio was 0 while the test-set noise ratio was 10%; the noise on the test set was added by accident.
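The article never says how the noise is applied, so the following is only a guess at what a 10% "noise ratio" on the test set might mean: a fixed fraction of each image's pixels is overwritten with random values. The function and every detail of the mechanism are assumptions:

```python
import numpy as np

def add_pixel_noise(img, ratio=0.10, rng=None):
    """Overwrite `ratio` of the pixels of one image with uniform random values.

    This is only one possible reading of the article's "noise ratio";
    the actual noise mechanism used by the author is not described.
    """
    rng = rng or np.random.default_rng()
    noisy = img.copy()
    idx = rng.choice(img.size, size=int(ratio * img.size), replace=False)
    noisy.flat[idx] = rng.random(len(idx))   # replace the chosen pixels
    return noisy

# Example on a pooled 9*9 image with values in [0, 1].
noisy = add_pixel_noise(np.random.rand(9, 9), ratio=0.10)
```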

The figures below come from runs in which both the training-set and the test-set noise ratios are 0.


Network structure: 81*60*2 (all runs)
Training set: full sample set    Test set: full test set
Learning rate: ret=0.1    Iterations per batch: it=10000
Training-set noise ratio: zx=0    Test-set noise ratio: zy=0

Batch size   z=2       z=5       z=10      z=20      z=30      z=50      z=100     z=200
Mean         0.670057  0.641154  0.802356  0.860605  0.874305  0.899745  0.873746  0.806339
Std dev      0.055662  0.095961  0.041987  0.017203  0.059181  0.04522   0.080284  0.102181
Max          0.729423  0.754494  0.833491  0.877483  0.888363  0.911542  0.89404   0.906812


Comparison of the two sets of results:

Batch size               z=2       z=5       z=10      z=20      z=30      z=50      z=100     z=200
Mean (0% / 10% noise)    0.660364  0.778482  0.780908  0.843712  0.822326  0.869144  0.861909  0.896198
Std dev                  0.075108  0.028592  0.053443  0.046167  0.044881  0.006091  0.0064    0.00622
Max                      0.745626  0.808511  0.82695   0.87234   0.871395  0.882742  0.895508  0.909693
Mean (0% / 0% noise)     0.670057  0.641154  0.802356  0.860605  0.874305  0.899745  0.873746  0.806339
Std dev                  0.055662  0.095961  0.041987  0.017203  0.059181  0.04522   0.080284  0.102181
Max                      0.729423  0.754494  0.833491  0.877483  0.888363  0.911542  0.89404   0.906812

(The "0% / 10%" and "0% / 0%" labels give the training-set / test-set noise ratios.)



Comparing the two, the maximum accuracy with a noisy test set (about 0.89) is lower than that without noise (about 0.91). However, for batch sizes above 50 the standard deviation with a noisy test set is far smaller than without noise (0.006091 < 0.04522), suggesting that once the batch size exceeds 50, the network's performance on the noisy test set is considerably more stable.





Below are the raw per-test accuracy readings for the runs with training-set noise ratio 0 and test-set noise ratio 10% (one column per batch size):

z=2       z=5       z=10      z=20      z=30      z=50      z=100     z=200
0.5470450.5981090.7929080.7338060.8713950.8671390.8955080.900709
0.6222220.7721040.8127660.8643030.7905440.863830.8548460.897872
0.633570.7933810.7947990.8685580.7257680.8661940.8619390.883215
0.7304960.803310.796690.7640660.8591020.8581560.8591020.892199
0.6382980.7981090.5375890.8680850.8657210.8690310.8628840.88227
0.4633570.8066190.8080380.6534280.8619390.8761230.8671390.901655
0.6387710.8085110.8080380.8307330.8539010.8609930.8591020.900236
0.7356970.7749410.8104020.872340.8643030.8690310.857210.898345
0.6278960.781560.8137120.6529550.8312060.8737590.8609930.892671
0.6349880.7943260.7560280.8624110.8609930.875650.8652480.899764
0.7295510.7895980.7971630.8330970.8657210.8614660.8591020.892199
0.6330970.7995270.7446810.8704490.8397160.872340.8581560.88227
0.7229310.7877070.7531910.8680850.8349880.8666670.8628840.895508
0.7456260.8023640.7408980.8657210.7886520.8780140.8609930.889362
0.6316780.7952720.8018910.8671390.8378250.8718680.8567380.886525
0.7238770.7858160.7498820.8685580.8014180.8680850.8520090.900236
0.6293140.6770690.8037830.8633570.7635930.8586290.8633570.9026
0.7229310.7602840.8075650.8591020.7574470.8709220.8633570.903546
0.7286050.7621750.8061470.8321510.8628840.8718680.8666670.900709
0.6231680.7990540.7995270.8666670.8416080.863830.8633570.900236
0.7356970.7323880.8203310.8614660.8160760.8718680.8695040.899291
0.7257680.7962170.8179670.8250590.8496450.8624110.8643030.9026
0.7248230.7673760.7484630.8345150.830260.8680850.8619390.903546
0.4633570.7650120.7522460.8704490.8581560.8685580.8609930.898818
0.7300240.8037830.8132390.8553190.8581560.8780140.8619390.909693
0.7375890.7957450.7314420.8633570.8586290.8647750.8680850.898818
0.7314420.7867610.8014180.8671390.8364070.8652480.8633570.899764
0.7257680.8037830.8132390.8628840.8548460.875650.8685580.8974
0.7300240.7947990.80.8288420.8387710.8827420.8591020.901182
0.6293140.7976360.8165480.8401890.8520090.872340.8557920.901182
0.6307330.7659570.7914890.8657210.8326240.8704490.8661940.897872
0.4633570.7635930.8070920.8326240.8586290.8747040.857210.904019
0.6288420.7985820.7418440.860520.8160760.8742320.8628840.894563
0.6250590.7271870.742790.8666670.8624110.8657210.8586290.892199
0.4633570.7952720.7957450.8666670.8283690.8737590.8576830.889835
0.6260050.7839240.8179670.8609930.8624110.8619390.8671390.886998
0.736170.7895980.5432620.8685580.8307330.8775410.8624110.901182
0.6293140.7910170.8160760.8236410.7460990.8765960.8600470.895035
0.55130.7801420.7399530.8713950.8609930.8732860.8529550.893617
0.6330970.7985820.6543740.8628840.7456260.8690310.8548460.891726
0.727660.7758870.7541370.8255320.7390070.8765960.8647750.885579
0.6387710.7853430.8004730.8619390.7635930.8699760.8553190.886525
0.6293140.7990540.8170210.8619390.7565010.8557920.8619390.899764
0.6198580.8004730.8198580.8382980.7413710.8747040.8586290.899764
0.7304960.7583920.8066190.8595740.7257680.8614660.8576830.895508
0.6354610.7976360.8137120.8666670.7437350.8619390.8666670.894563
0.626950.7758870.736170.8614660.8170210.8680850.8600470.89409
0.6222220.69740.736170.8505910.8695040.8576830.8624110.900236
0.6340430.7182030.7981090.833097    
0.7323880.7952720.8156030.850118    
0.5366430.7952720.7484630.866194    
0.7314420.7659570.722931     
0.6174940.7725770.821749     
0.6349880.8042550.805201     
0.7267140.7829790.805674     
0.7371160.7981090.739007     
0.7205670.796690.805201     
0.5366430.7981090.799527     
0.7333330.7872340.797163     
0.6189130.80.799527     
0.6321510.7877070.807565     
0.573050.7976360.821749     
0.5825060.7853430.806619     
0.7390070.7872340.796217     
0.6354610.7947990.728605     
0.6226950.7598110.793381     
0.7323880.7593380.78818     
0.6382980.7829790.813239     
0.6330970.7739950.817967     
0.7219860.7810870.735697     
0.7352250.7702130.786288     
0.4633570.7659570.791962     
0.7234040.7711580.79669     
0.7442080.7654850.799054     
0.6312060.7669030.787234     
0.6316780.7650120.791489     
0.6312060.7721040.799527     
0.7295510.7631210.821749     
0.6425530.7929080.729551     
0.4633570.8018910.816076     
0.7319150.8009460.808983     
0.6217490.7839240.802837     
0.7423170.788180.809929     
0.7238770.7919620.821749     
0.7366430.7957450.809929     
0.6368790.7914890.795272     
0.6278960.8037830.801418     
0.7380610.7536640.80331     
0.6345150.7654850.808983     
0.7342790.7612290.800946     
0.6416080.7943260.788652     
0.7309690.7919620.815603     
0.6387710.7895980.820804     
0.6345150.7758870.550827     
0.736170.7650120.82695     
0.6293140.7739950.797636     
0.7248230.7721040.660047     
0.7295510.7631210.742317     
0.7333330.7607570.792908     
0.727660.7484630.759338     



Below are the raw per-test accuracy readings for the runs in which both the training-set and the test-set noise ratios are 0 (one column per batch size):

z=2       z=5       z=10      z=20      z=30      z=50      z=100     z=200
0.5548720.6636710.8334910.8661310.850520.9077580.894040.906812
0.7275310.4635760.7961210.8751180.8883630.9115420.8859980.889309
0.7275310.5364240.8311260.8760640.8883630.9068120.8864710.862819
0.6367080.7152320.8207190.8448440.8864710.9068120.8930940.862346
0.6608330.7478710.7885530.8363290.8552510.856670.8864710.862819
0.7123940.7459790.8311260.8666040.8855250.9058660.8859980.862346
0.7204350.5288550.7885530.8755910.8869440.8964050.8926210.83018
0.6721850.6144750.7786190.8765370.8883630.9077580.8902550.83018
0.6636710.4635760.8065280.850520.8883630.9077580.8897820.83018
0.643330.6929990.8325450.8751180.8836330.9077580.8888360.83018
0.6721850.4635760.7885530.8741720.8883630.8964050.4910120.83018
0.6627250.7251660.8311260.8746450.8883630.9077580.878430.83018
0.6627250.6920530.8202460.8519390.8855250.9077580.8864710.83018
0.6627250.7483440.8330180.8751180.8883630.9068120.8916750.830653
0.7294230.4654680.7885530.845790.8883630.9077580.8888360.83018
0.7275310.7199620.7885530.8472090.8883630.9068120.8916750.83018
0.6873230.6996220.5586570.8755910.8869440.9077580.8916750.830653
0.6622520.7384110.7923370.8514660.8883630.9068120.8893090.83018
0.7294230.605960.7421950.8514660.8864710.9077580.8916750.83018
0.568590.4635760.8330180.8774830.8850520.9068120.8864710.463576
0.6721850.7360450.8330180.8774830.8789030.9077580.8859980.83018
0.531220.7384110.7937560.8741720.8864710.9068120.8859980.830653
0.6045410.7175970.8202460.8736990.8883630.9077580.8926210.830653
0.6636710.5444650.8330180.8703880.8803220.9077580.8926210.463576
0.6627250.7062440.8330180.8609270.8632920.9068120.8921480.83018
0.6627250.7105010.8070010.8689690.8803220.9077580.8916750.83018
0.6636710.5444650.8202460.8557240.8789030.9077580.8916750.83018
0.6627250.6466410.8334910.8197730.8803220.9077580.8921480.83018
0.4678330.6466410.8330180.8708610.8789030.9077580.8921480.83018
0.6608330.5245980.8330180.8557240.8789030.9077580.8921480.83018
0.6636710.6244090.8065280.8727530.8789030.9077580.8916750.83018
0.7275310.5652790.8330180.8150430.8789030.9077580.8916750.83018
0.6636710.5510880.8330180.8557240.8789030.9077580.8926210.830653
0.6721850.605960.803690.8543050.8789030.9077580.8916750.830653
0.7294230.7119210.8051090.8741720.8789030.9077580.8921480.830653
0.7204350.6830650.7483440.8339640.8789030.9077580.8921480.830653
0.6608330.6830650.7876060.8509930.8789030.9077580.8930940.464049
0.6608330.5444650.8051090.8543050.8869440.9077580.8916750.830653
0.6641440.7544940.7899720.8325450.8789030.9077580.8921480.830653
0.7237460.6830650.7682120.8741720.8869440.9077580.8921480.830653
0.7294230.7483440.8032170.8136230.8855250.9077580.8921480.830653
0.7275310.5444650.7488170.8434250.8869440.9077580.8921480.830653
0.7275310.7455060.8051090.8751180.8855250.9077580.8869440.830653
0.632450.7152320.8051090.8736990.8869440.9077580.8916750.830653
0.601230.7436140.8051090.8699150.8883630.9077580.8864710.464049
0.7294230.5444650.803690.8741720.8883630.5875120.8916750.830653
0.6608330.7455060.803690.8438980.8869440.9077580.8864710.830653


