fminunc() [Coursera] Machine Learning - Stanford University - Andrew Ng

Advanced Optimization

fminunc() in Octave

costFunction.m

function [jVal, gradient] = costFunction(theta)
  % cost: J(theta) = (theta1 - 5)^2 + (theta2 - 5)^2, minimized at theta = [5; 5]
  jVal = (theta(1)-5)^2 + (theta(2)-5)^2;
  % gradient: partial derivatives of J with respect to theta1 and theta2
  gradient = zeros(2,1);
  gradient(1) = 2*(theta(1)-5);
  gradient(2) = 2*(theta(2)-5);
endfunction
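
Before handing costFunction to fminunc, it can be sanity-checked directly at the prompt. For theta = [0; 0] the cost is 50 and both gradient components are -10 (a minimal check; the exact output formatting below is approximate):

>> [j, g] = costFunction([0; 0])
j = 50
g =

  -10
  -10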

Octave prompt

>> options = optimset('GradObj', 'on', 'MaxIter', 100);
>> initialTheta = zeros(2,1);
>> initialTheta
initialTheta =

   0
   0

>> [optTheta, functionVal, exitFlag] = fminunc(@costFunction, initialTheta, options)
optTheta =

   5.0000
   5.0000

functionVal =    1.5777e-30
exitFlag =  1
>> %% exitFlag == 1 means the solver converged
>> %% initialTheta must be a vector with at least 2 elements (fminunc does not accept a scalar theta)
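
The same pattern carries over to logistic regression, which is the point of the lecture: write a cost function that returns both J(theta) and its gradient, then pass it to fminunc through an anonymous function so that theta is the only free variable. A minimal sketch, assuming a design matrix X (m x n) and a label vector y (m x 1 with values in {0,1}) that are not part of the notes above:

function [jVal, gradient] = lrCostFunction(theta, X, y)
  % logistic regression cost and gradient (unregularized)
  m = length(y);
  h = 1 ./ (1 + exp(-X * theta));                        % sigmoid hypothesis
  jVal = -(1/m) * (y' * log(h) + (1 - y)' * log(1 - h)); % cross-entropy cost
  gradient = (1/m) * X' * (h - y);                       % n x 1 gradient vector
endfunction

>> options = optimset('GradObj', 'on', 'MaxIter', 100);
>> initialTheta = zeros(size(X, 2), 1);
>> [optTheta, functionVal, exitFlag] = fminunc(@(t) lrCostFunction(t, X, y), initialTheta, options)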

 
