
Machine Learning
daguge1
"Affection springs from one knows not where, and runs ever deeper." (blog motto)
Stanford Machine Learning, Part 1: Linear Regression

Code summary — computeCost, the cost function (the listing cut the snippet off mid-comment; completed here with the standard unregularized implementation):

function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y
m = length(y);                            % number of training examples
J = sum((X * theta - y) .^ 2) / (2 * m);  % J = (1/2m) * sum of squared errors
end

Original post, 2015-07-13 14:17:06 · 310 views · 0 comments
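The computeCost snippet above is the unregularized least-squares cost J(θ) = (1/2m) Σ(Xθ − y)². A NumPy port of the same formula (the function name and the toy data here are illustrative, not from the post):

```python
import numpy as np

def compute_cost(X, y, theta):
    """Linear-regression cost J = (1/2m) * sum((X @ theta - y)^2)."""
    m = len(y)
    residuals = X @ theta - y
    return residuals @ residuals / (2 * m)

# Toy data: points on the line y = 1 + 2x, with a bias column of ones in X.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([1.0, 3.0, 5.0])
print(compute_cost(X, y, np.array([1.0, 2.0])))  # 0.0  (perfect fit)
print(compute_cost(X, y, np.array([0.0, 0.0])))  # (1 + 9 + 25) / 6 = 35/6
```

The dot-product form `residuals @ residuals` is the same vectorization trick the MATLAB `sum((X*theta - y).^2)` uses: no loop over examples.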
K-means Algorithm

% N    - number of clusters to partition the data into
% data - unlabeled input data, one point per row
% u    - center of each cluster
% re   - the data with cluster labels attached
function [u, re] = KMeans(data, N)
    [m, n] = size(data);  % m = number of points, n = number of dimensions
    ma = zeros(n);        % maximum along each dimension
    mi = zeros(n);        % minimum along each dimension
    u ...                 % (snippet truncated in the listing)

Original post, 2015-07-27 14:20:15 · 384 views · 0 comments
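The MATLAB snippet above breaks off right after computing the per-dimension min/max (used there to draw random initial centers). A compact NumPy sketch of the full Lloyd iteration the post is building toward; to keep it deterministic, this version seeds the centers with evenly spaced data points rather than random draws, which is my simplification, not the post's:

```python
import numpy as np

def kmeans(data, N, iters=100):
    """Partition the m rows of data into N clusters.
    Returns (centers, labels) - the analogue of the post's (u, re)."""
    idx = np.linspace(0, len(data) - 1, N).astype(int)
    centers = data[idx].astype(float)          # deterministic initial centers
    labels = np.zeros(len(data), dtype=int)
    for _ in range(iters):
        # Assignment step: each point goes to its nearest center.
        dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each center to the mean of its assigned points.
        new_centers = np.array([data[labels == k].mean(axis=0)
                                if np.any(labels == k) else centers[k]
                                for k in range(N)])
        if np.allclose(new_centers, centers):  # converged
            break
        centers = new_centers
    return centers, labels

# Two well-separated blobs; the labels recover them exactly.
pts = np.vstack([np.zeros((5, 2)), np.full((5, 2), 10.0)])
centers, labels = kmeans(pts, 2)
```

The broadcasting in the assignment step (`data[:, None, :] - centers[None, :, :]`) computes all point-to-center distances at once, replacing the nested loops a direct MATLAB translation would use.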
AHP (Analytic Hierarchy Process)

clc; clear;
A = [1   3/4 3/9 3/8 3/6;
     4/3 1   4/9 4/8 4/6;
     9/3 9/4 1   9/8 9/6;
     8/3 8/4 8/9 1   8/6;
     6/3 6/4 6/9 6/8 1]  % pairwise comparison matrix A; only A needs to be changed
[m, n] = size(A);
% (snippet truncated in the listing)

Original post, 2015-07-27 14:22:51 · 1493 views · 0 comments
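Notice that the post's comparison matrix has entries A[i][j] = w[i]/w[j] for w = (3, 4, 9, 8, 6), so it is perfectly consistent. The truncated script presumably goes on to extract weights and check consistency; the standard AHP computation, sketched in NumPy (the RI value is Saaty's published random index for n = 5):

```python
import numpy as np

# Rebuild the post's 5x5 pairwise comparison matrix from its weight vector.
w = np.array([3.0, 4.0, 9.0, 8.0, 6.0])
A = w[:, None] / w[None, :]          # A[i, j] = w[i] / w[j]

# AHP weights: the normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = eigvals.real.argmax()
lam_max = eigvals.real[k]            # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
# RI = 1.12 is Saaty's random index for n = 5; CR < 0.1 is acceptable.
n = A.shape[0]
CI = (lam_max - n) / (n - 1)
CR = CI / 1.12
```

Because this A is consistent, lambda_max equals n = 5 (up to rounding), CR is essentially zero, and the recovered weights are exactly w normalized to sum 1.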
Machine Learning, Week 4: One-vs-All and Neural Networks

1. Vectorizing Logistic Regression

function [J, grad] = lrCostFunction(theta, X, y, lambda)
%LRCOSTFUNCTION Compute cost and gradient for logistic regression with
%regularization
%   J = LRCOSTFUN... (snippet truncated in the listing)

Original post, 2015-07-20 22:14:38 · 558 views · 0 comments
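The lrCostFunction header above describes the regularized logistic-regression cost and gradient; the body is cut off in the listing. A NumPy sketch of the standard vectorized computation the exercise calls for (the example matrix and labels are mine, not from the post):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lr_cost_function(theta, X, y, lam):
    """Regularized logistic-regression cost and gradient, no explicit loops.
    The bias term theta[0] is excluded from the regularization penalty."""
    m = len(y)
    h = sigmoid(X @ theta)
    J = (-y @ np.log(h) - (1 - y) @ np.log(1 - h)) / m \
        + (lam / (2 * m)) * np.sum(theta[1:] ** 2)
    grad = X.T @ (h - y) / m
    grad[1:] += (lam / m) * theta[1:]
    return J, grad

# At theta = 0 every prediction is 0.5, so the cost is log(2) regardless of y,
# and the regularization term contributes nothing.
X = np.array([[1.0, 2.0], [1.0, -1.0], [1.0, 0.5]])
y = np.array([1.0, 0.0, 1.0])
J, grad = lr_cost_function(np.zeros(2), X, y, lam=1.0)
```

The key vectorizations are `X @ theta` for all hypotheses at once and `X.T @ (h - y)` for all gradient components at once, mirroring the matrix form the MATLAB assignment asks for.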