
Machine Learning Notes
Vace___yun
Machine Learning ----- Andrew Ng Course Exercises: ex5
Bias and variance: predicting dam outflow from changes in reservoir water level.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import loadmat
from scipy.optimize import minimize

data = loadmat('ex5data1.mat')
data.keys()
# dict_keys(['__header__', '__version__', '__globals__', 'X', '...
```

(Original post: 2021-08-18 18:10:59)
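The ex5 bias/variance analysis compares training and validation error as the training set grows. Since `ex5data1.mat` is not available here, a minimal sketch on synthetic data, with a closed-form least-squares fit standing in for the exercise's `minimize()` call (all names and data below are illustrative):

```python
import numpy as np

def linear_cost(X, y, theta):
    # Unregularized squared-error cost: J = (1/2m) * sum((X@theta - y)^2)
    m = len(X)
    return np.sum(np.power(X @ theta - y, 2)) / (2 * m)

def fit_least_squares(X, y):
    # Closed-form least squares; a stand-in for iterative optimization
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Synthetic "water level -> outflow" style data (illustrative only)
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 30)
y = 2 * x + 1 + rng.normal(0, 0.1, size=30)
X = np.insert(x.reshape(-1, 1), 0, values=1, axis=1)  # bias column first

# Learning curve: train on the first m points, validate on a held-out slice
X_val, y_val = X[20:], y[20:]
train_err, val_err = [], []
for m in range(2, 20):
    theta = fit_least_squares(X[:m], y[:m])
    train_err.append(linear_cost(X[:m], y[:m], theta))
    val_err.append(linear_cost(X_val, y_val, theta))
# Tiny training sets fit perfectly (high variance); with more data the
# two errors converge for this well-specified linear model.
```

Plotting `train_err` and `val_err` against `m` gives the learning curve the exercise asks for.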
Machine Learning ----- Andrew Ng Course Exercises: ex4
BP (backpropagation).

```python
import numpy as np
import scipy.io as sio
import matplotlib.pyplot as plt
from scipy.optimize import minimize

data = sio.loadmat('ex4data1.mat')
raw_X = data['X']
raw_y = data['y']
X = np.insert(raw_X, 0, values=1, axis=1)  # add a bias column of 1s to the input layer
```

1. One-hot encode y: on...

(Original post: 2021-08-18 11:23:39)
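The one-hot step the preview cuts off at can be sketched as follows. This is a generic NumPy construction, not necessarily the post's exact helper; it assumes the ex4 label convention where classes run 1..10 and digit 0 is stored as label 10:

```python
import numpy as np

def one_hot(y, num_classes=10):
    """Map integer labels in 1..num_classes to one-hot rows."""
    m = len(y)
    out = np.zeros((m, num_classes))
    # label k lights up column k-1 (digit 0 is stored as label 10 in ex4)
    out[np.arange(m), y.flatten() - 1] = 1
    return out

y = np.array([1, 2, 10])
Y = one_hot(y)  # shape (3, 10), one 1 per row
```

The resulting `(m, 10)` matrix is what the backpropagation cost compares against the network's output layer.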
Machine Learning ----- Andrew Ng Course Exercises: ex3
This part asks you to implement recognition of handwritten digits (0 to 9). You will extend the earlier logistic regression and apply it to one-vs-all classification.

ex3: neural network, forward propagation.

```python
import numpy as np
import scipy.io as sio

data = sio.loadmat('ex3data1.mat')
raw_X = data['X']
raw_y = data['y']
X = np.insert(raw_X, 0, values=1, axis=1)
X.shape
# (5000, 401)
y = ra...
```

(Original post: 2021-08-18 10:46:58)
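Forward propagation through the exercise's one-hidden-layer network can be sketched as below. The pretrained weights (`ex3weights.mat`) are not available here, so random placeholder matrices of the ex3 shapes (25x401 and 10x26) are used; only the shapes and the flow are the point:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def forward(X, theta1, theta2):
    # X already has the bias column inserted, shape (m, 401) in ex3
    a1 = X
    z2 = a1 @ theta1.T
    a2 = np.insert(sigmoid(z2), 0, values=1, axis=1)  # add bias to hidden layer
    z3 = a2 @ theta2.T
    return sigmoid(z3)  # (m, 10): one score per digit class

# Placeholder inputs and weights (illustrative, not ex3weights.mat)
rng = np.random.default_rng(0)
X = np.insert(rng.normal(size=(5, 400)), 0, values=1, axis=1)  # (5, 401)
theta1 = rng.normal(scale=0.1, size=(25, 401))
theta2 = rng.normal(scale=0.1, size=(10, 26))
h = forward(X, theta1, theta2)
```

Prediction then takes `np.argmax(h, axis=1) + 1` to recover the 1..10 label convention.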
Machine Learning ----- Andrew Ng Course Exercises: ex2
ex2: linearly separable data.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

data = pd.read_csv('ex2data1.txt', names=['Exam 1', 'Exam 2', 'Accepted'])
data.head()
fig, ax = plt.subplots()
ax.scatter(data[data['Accepted']==0]['Exam 1'], data[data...
```

(Original post: 2021-08-17 20:58:54)
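The classifier ex2 fits to this data is logistic regression. A minimal sketch of its hypothesis and cross-entropy cost on a tiny hand-made example (variable names and data are illustrative, not the post's):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def logistic_cost(X, y, theta):
    # J = -(1/m) * sum(y*log(h) + (1-y)*log(1-h))
    h = sigmoid(X @ theta)
    m = len(X)
    return -np.sum(y * np.log(h) + (1 - y) * np.log(1 - h)) / m

X = np.array([[1.0, 2.0], [1.0, -2.0]])  # bias column + one feature
y = np.array([1.0, 0.0])
theta = np.zeros(2)
cost0 = logistic_cost(X, y, theta)  # at theta = 0, h = 0.5, so J = log(2)
```

The `log(2)` value at `theta = 0` is a standard sanity check before handing the cost to an optimizer.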
matplotlib: personal summary
```python
fig, ax = plt.subplots(figsize=(12, 8))
ax.plot(x, f, 'r', label='Prediction')
ax.scatter(data.Population, data.Profit, label='Training Data')
ax.legend(loc=2)
ax.set_xlabel('Population')
ax.set_ylabel('Profit')
ax.set_title('Predicted Profit vs. Population...
```

(Original post: 2021-08-05 21:31:51)
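The snippet above assumes `x`, `f`, and `data` already exist. A self-contained version on made-up data, using the non-interactive Agg backend so it runs headless (the full title string is my own completion of the truncated one):

```python
import matplotlib
matplotlib.use('Agg')  # headless backend, no display needed
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

# Dummy data in the shape the ex1 plot expects (illustrative only)
data = pd.DataFrame({'Population': np.linspace(5, 25, 20),
                     'Profit': np.linspace(5, 25, 20) * 1.2 - 4})
x = data.Population
f = 1.2 * x - 4  # a pretend fitted line

fig, ax = plt.subplots(figsize=(12, 8))
ax.plot(x, f, 'r', label='Prediction')
ax.scatter(data.Population, data.Profit, label='Training Data')
ax.legend(loc=2)  # loc=2 places the legend in the upper-left corner
ax.set_xlabel('Population')
ax.set_ylabel('Profit')
ax.set_title('Predicted Profit vs. Population Size')
```

`loc=2` is the older numeric spelling of `loc='upper left'`; both are accepted.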
Pandas: personal summary
Continuously updated.

```python
pd.read_csv()
# draw a scatter plot
data.plot(kind='scatter', x='Population', y='Profit', figsize=(12, 8))
data.iloc
data.shape
```

(Original post: 2021-08-05 21:31:37)
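The calls listed above can be run end-to-end on an inline CSV; a sketch with the column names from the ex1 example (the numeric values are made up):

```python
import io
import pandas as pd

csv_text = """Population,Profit
6.1101,17.592
5.5277,9.1302
8.5186,13.662
"""
data = pd.read_csv(io.StringIO(csv_text))

print(data.shape)    # (3, 2): rows x columns
print(data.iloc[0])  # positional indexing: first row as a Series
# data.plot(kind='scatter', x='Population', y='Profit', figsize=(12, 8))
# (plotting left commented out so the sketch runs without a display)
```

`iloc` indexes by integer position, while plain `[]`/`loc` index by label; keeping the two straight avoids most off-by-one surprises.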
Numpy: personal summary
np.power()

x and y are single numbers:

```python
import numpy as np
print(np.power(2, 3))
# 8
```

x is a list, y a number:

```python
print(np.power([2, 3, 4], 3))
# [ 8 27 64]
```

x is a number, y a list:

```python
print(np.power(2, [2, 3, 4]))
# [ 4  8 16]
```

np.sum()

```python
import numpy as np
a = np.array([[[1,2,3,2],[1,2,3,1],[2,3,4,...
```

(Original post: 2021-08-05 21:31:14)
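The `np.power` cases above are all instances of NumPy broadcasting, and the truncated `np.sum` example is about the `axis` argument; both can be checked directly (the 3-D array here is my own, not the post's):

```python
import numpy as np

# scalar ** scalar
assert np.power(2, 3) == 8
# array ** scalar: the exponent broadcasts across the array
assert (np.power([2, 3, 4], 3) == np.array([8, 27, 64])).all()
# scalar ** array: the base broadcasts instead
assert (np.power(2, [2, 3, 4]) == np.array([4, 8, 16])).all()

# np.sum on a 3-D array: axis selects which dimension collapses
a = np.arange(24).reshape(2, 3, 4)
total = np.sum(a)              # 276: sum of every element
by_axis0 = np.sum(a, axis=0)   # shape (3, 4): the two 3x4 slices added
```

Without `axis`, `np.sum` flattens everything; with `axis=k`, dimension `k` disappears from the result's shape.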
Machine Learning ----- Andrew Ng Course Exercises: ex1
```python
def computeCost(X, y, theta):
    inner = np.power(X @ theta - y, 2)
    return np.sum(inner) / (2 * len(X))

def gradientDescent(X, y, theta, alpha, iters):
    costs = []
    for i in range(iters):
        theta = theta - (X.T @ (X @ theta - y)) * alpha / len(X)
        ...
```

(Original post: 2021-08-05 20:34:12)
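The two functions above can be exercised on tiny synthetic data to confirm the cost decreases. The loop body after the `theta` update is cut off in the preview, so the appended lines and return value below are my guess at the usual completion, not the post's verbatim code:

```python
import numpy as np

def computeCost(X, y, theta):
    inner = np.power(X @ theta - y, 2)
    return np.sum(inner) / (2 * len(X))

def gradientDescent(X, y, theta, alpha, iters):
    costs = []
    for i in range(iters):
        # Batch gradient step for linear regression
        theta = theta - (X.T @ (X @ theta - y)) * alpha / len(X)
        costs.append(computeCost(X, y, theta))  # assumed completion
    return theta, costs                          # assumed completion

# Noise-free line y = 3x + 1; bias column first, as in ex1
x = np.linspace(0, 1, 50)
X = np.insert(x.reshape(-1, 1), 0, values=1, axis=1)
y = 3 * x + 1
theta, costs = gradientDescent(X, y, np.zeros(2), alpha=0.5, iters=500)
# costs should fall monotonically toward 0, and theta toward (1, 3)
```

On clean linear data the recovered `theta` should match the generating coefficients closely, which makes this a convenient regression test for the update rule's signs.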