This example uses the Pima Indians Diabetes dataset, which has eight input attributes plus one output:
(1) Number of pregnancies
(2) Plasma glucose concentration at 2 hours in an oral glucose tolerance test
(3) Diastolic blood pressure
(4) Triceps skinfold thickness
(5) 2-hour serum insulin
(6) Body mass index
(7) Diabetes pedigree function
(8) Age
(9) Whether the subject has diabetes
The ninth item is the target output (the label).
Dataset download: dataset download link
The code is as follows:
from keras.models import Sequential
from keras.layers import Dense
import numpy as np

# Fix the random seed for reproducibility
np.random.seed(7)

# Load the dataset, skipping the header row
dataset = np.loadtxt(r'F:\Python\pycharm\keras_deeplearning\datasets\PimaIndiansdiabetes.csv',
                     delimiter=',', skiprows=1)

# Split into input and output variables: columns 0-7 are the inputs x, column 8 is the output Y
x = dataset[:, 0:8]
Y = dataset[:, 8]

# Build the model. The first argument of Dense is the number of neurons;
# input_dim appears only on the first layer and gives the number of input features;
# activation is the activation function: relu for the first two layers,
# and sigmoid for the output layer of a binary classifier.
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

# Compile the model. For binary classification the loss is binary cross-entropy;
# the Adam optimizer is an effective gradient-descent algorithm.
model.compile(loss='binary_crossentropy', optimizer='adam',
              metrics=['accuracy'])

# Train the model. epochs is the number of full passes over the dataset;
# batch_size is the number of samples used per weight update.
model.fit(x=x, y=Y, epochs=150, batch_size=10)

# Evaluate the model. For simplicity we evaluate on the training set here.
scores = model.evaluate(x=x, y=Y)
print('\n%s: %.2f%%' % (model.metrics_names[1], scores[1] * 100))
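As an aside, the sigmoid output and binary cross-entropy loss chosen above can be sketched in plain NumPy. This is only an illustration of the math (the function names and toy values here are my own); Keras computes both internally:

```python
import numpy as np

def sigmoid(z):
    # Squashes any real value into (0, 1), interpreted as P(y = 1)
    return 1.0 / (1.0 + np.exp(-z))

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    # Mean of -[y*log(p) + (1-y)*log(1-p)]; eps guards against log(0)
    p = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

# Toy labels and raw layer outputs (logits), purely for demonstration
y_true = np.array([1.0, 0.0, 1.0, 0.0])
logits = np.array([2.0, -1.5, 0.3, -0.8])

probs = sigmoid(logits)
loss = binary_crossentropy(y_true, probs)
print(probs.round(3), round(float(loss), 4))
```

Confident correct predictions (a probability near 1 when the label is 1) contribute little to the loss, while confident wrong ones are penalized heavily, which is why this loss pairs naturally with a sigmoid output.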
Output:
Using TensorFlow backend.
Epoch 1/150
2018-10-29 16:26:09.934286: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX AVX2
2018-10-29 16:26:09.935682: I tensorflow/core/common_runtime/process_util.cc:69] Creating new thread pool with default inter op setting: 8. Tune using inter_op_parallelism_threads for best performance.
10/768 [..............................] - ETA: 31s - loss: 1.6118 - acc: 0.9000
500/768 [==================>...........] - ETA: 0s - loss: 4.6762 - acc: 0.6260
768/768 [==============================] - 0s 644us/step - loss: 3.6799 - acc: 0.5964
Epoch 2/150
10/768 [..............................] - ETA: 0s - loss: 0.4365 - acc: 0.8000
520/768 [===================>..........] - ETA: 0s - loss: 0.9445 - acc: 0.6038
768/768 [==============================] - 0s 100us/step - loss: 0.9296 - acc: 0.6016
Epoch 3/150
10/768 [..............................] - ETA: 0s - loss: 0.7402 - acc: 0.9000
520/768 [===================>..........] - ETA: 0s - loss: 0.7585 - acc: 0.6481
768/768 [==============================] - 0s 100us/step - loss: 0.7461 - acc: 0.6380
Epoch 4/150
10/768 [..............................] - ETA: 0s - loss: 0.7285 - acc: 0.8000
490/768 [==================>...........] - ETA: 0s - loss: 0.6964 - acc: 0.6449
768/768 [==============================] - 0s 103us/step - loss: 0.7103 - acc: 0.6549
Epoch 5/150
.
.
.
.
.
.
.
10/768 [..............................] - ETA: 0s - loss: 0.3279 - acc: 0.9000
520/768 [===================>..........] - ETA: 0s - loss: 0.4545 - acc: 0.7981
768/768 [==============================] - 0s 100us/step - loss: 0.4701 - acc: 0.7826
Epoch 142/150
10/768 [..............................] - ETA: 0s - loss: 0.5173 - acc: 0.7000
510/768 [==================>...........] - ETA: 0s - loss: 0.4799 - acc: 0.7745
768/768 [==============================] - 0s 102us/step - loss: 0.4800 - acc: 0.7734
Epoch 143/150
10/768 [..............................] - ETA: 0s - loss: 0.4776 - acc: 0.6000
510/768 [==================>...........] - ETA: 0s - loss: 0.4408 - acc: 0.8020
768/768 [==============================] - 0s 104us/step - loss: 0.4720 - acc: 0.7760
Epoch 144/150
10/768 [..............................] - ETA: 0s - loss: 0.6618 - acc: 0.6000
510/768 [==================>...........] - ETA: 0s - loss: 0.4592 - acc: 0.8039
768/768 [==============================] - 0s 100us/step - loss: 0.4738 - acc: 0.7786
Epoch 145/150
10/768 [..............................] - ETA: 0s - loss: 0.6643 - acc: 0.6000
510/768 [==================>...........] - ETA: 0s - loss: 0.4932 - acc: 0.7569
768/768 [==============================] - 0s 102us/step - loss: 0.4862 - acc: 0.7630
Epoch 146/150
10/768 [..............................] - ETA: 0s - loss: 0.5036 - acc: 0.7000
500/768 [==================>...........] - ETA: 0s - loss: 0.4992 - acc: 0.7580
768/768 [==============================] - 0s 103us/step - loss: 0.4918 - acc: 0.7708
Epoch 147/150
10/768 [..............................] - ETA: 0s - loss: 0.7728 - acc: 0.6000
510/768 [==================>...........] - ETA: 0s - loss: 0.4734 - acc: 0.7824
768/768 [==============================] - 0s 100us/step - loss: 0.4818 - acc: 0.7799
Epoch 148/150
10/768 [..............................] - ETA: 0s - loss: 0.3585 - acc: 0.8000
510/768 [==================>...........] - ETA: 0s - loss: 0.4805 - acc: 0.7745
768/768 [==============================] - 0s 102us/step - loss: 0.4686 - acc: 0.7812
Epoch 149/150
10/768 [..............................] - ETA: 0s - loss: 0.5303 - acc: 0.7000
500/768 [==================>...........] - ETA: 0s - loss: 0.4795 - acc: 0.7660
768/768 [==============================] - 0s 101us/step - loss: 0.4722 - acc: 0.7643
Epoch 150/150
10/768 [..............................] - ETA: 0s - loss: 0.3492 - acc: 0.9000
490/768 [==================>...........] - ETA: 0s - loss: 0.4477 - acc: 0.7980
768/768 [==============================] - 0s 103us/step - loss: 0.4769 - acc: 0.7799
32/768 [>.............................] - ETA: 1s
768/768 [==============================] - 0s 81us/step
acc: 78.12%
At the end we can see the accuracy.
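Note that evaluating on the training data, as done above for simplicity, overestimates real-world accuracy. A minimal sketch of a hold-out split in plain NumPy (using a randomly generated stand-in array of the same shape as the Pima data, since the real CSV is not bundled here) could look like this:

```python
import numpy as np

np.random.seed(7)

# Hypothetical stand-in for the (768, 9) array loaded by np.loadtxt above
dataset = np.random.rand(768, 9)
dataset[:, 8] = (dataset[:, 8] > 0.65).astype(float)  # fake binary labels

# Shuffle row indices and hold out 20% of the rows for testing
idx = np.random.permutation(len(dataset))
split = int(0.8 * len(dataset))
train, test = dataset[idx[:split]], dataset[idx[split:]]

x_train, y_train = train[:, 0:8], train[:, 8]
x_test, y_test = test[:, 0:8], test[:, 8]
print(x_train.shape, x_test.shape)  # (614, 8) (154, 8)
```

With a split like this you would call model.fit on x_train/y_train and model.evaluate on x_test/y_test, so the reported accuracy reflects data the network has never seen.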