Neural Networks for Machine Learning, Programming Assignment 1: The perceptron learning algorithm

This post describes a programming assignment whose task is to implement the perceptron learning algorithm. Starting from the provided starter code, you fill in the missing pieces to create a working perceptron implementation. The post lists the required datasets and instructions for running the code, and includes several questions on linear separability, the learning rate, and related topics.


Programming Assignment 1: The perceptron learning algorithm

Warning: The hard deadline has passed. You can attempt it, but you will not get credit for it. You are welcome to try it as a learning exercise.

DISCLAIMER: Before beginning the actual quiz portion of the assignment, download the code and fill in the missing pieces. You do not have to submit the quiz until the deadline.

In this assignment you will take the provided starter code and fill in the missing details in order to create a working perceptron implementation.

To start, download the following code files:

And the following datasets:

Attention: some people have notified us that the provided datasets do not load under some versions of Octave. We are providing the same datasets in a different format that will hopefully work with more versions. You can find these files below.

And the following datasets:

For those who want to download all of the files together in a zip archive, you can get them here: Assignment1.zip

To run the code, you first need to load a dataset. To do so enter the following command in the Octave console (to load dataset 1):

load dataset1

This should load 4 variables:

  • neg_examples_nobias - The matrix containing the examples belonging to class 0.
  • pos_examples_nobias - The matrix containing the examples belonging to class 1.
  • w_init - Some initial weight vector.
  • w_gen_feas - A generously feasible weight vector (empty if one doesn't exist).

The variables have _nobias appended to their names because they do not have an additional column of 1's appended to them. The learn_perceptron.m code appends that bias column automatically, as the sketch below illustrates.
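For reference, this is a minimal sketch of the bias-appending step, assuming illustrative variable names (num_neg_examples and num_pos_examples are not necessarily what the starter code uses):

    % Append a column of 1's (the bias input) to each example matrix.
    num_neg_examples = size(neg_examples_nobias, 1);
    num_pos_examples = size(pos_examples_nobias, 1);
    neg_examples = [neg_examples_nobias, ones(num_neg_examples, 1)];
    pos_examples = [pos_examples_nobias, ones(num_pos_examples, 1)];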

Now that you have loaded a dataset, you can run the algorithm by entering the following at the Octave console:

    w = learn_perceptron(neg_examples_nobias,pos_examples_nobias,w_init,w_gen_feas)

This will start the algorithm and plot the results as it proceeds. Until the algorithm converges you can keep pressing enter to run the next iteration. Pressing 'q' will terminate the program. At each iteration it should produce a plot with the panels described below.

The top-left plot shows the data points. The circles represent one class while the squares represent the other. The line shows the decision boundary of the perceptron using the current set of weights. The green examples are those that are correctly classified, while the red ones are incorrectly classified. The top-right plot shows the number of mistakes made by the perceptron so far. If a generously feasible weight vector is provided (and not empty), then the bottom-left plot shows the distance of the learned weight vectors to the generously feasible weight vector.
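For intuition, the decision boundary in the top-left panel is the set of points where the weighted sum of the inputs (including the bias) is exactly zero. The sketch below shows one way to plot such a line for 2-D inputs with an appended bias; it is illustrative and not necessarily how the provided plotting code works:

    % Sketch: plot the boundary w(1)*x + w(2)*y + w(3) = 0 for 2-D inputs,
    % where w(3) is the bias weight. Assumes w(2) is nonzero so we can
    % solve for y.
    xs = linspace(-1, 1, 100);
    ys = -(w(1) * xs + w(3)) / w(2);
    plot(xs, ys, 'k-');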

Currently, the code doesn't do any learning. It is your job to fill this part in. Specifically, you need to fill in the lines in learn_perceptron.m marked %YOUR CODE HERE (lines 114 and 122). When you are finished, use this program to help you answer the questions below.
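As a starting point, one full pass of the standard perceptron update from lecture looks roughly like the sketch below. It assumes the bias-augmented matrices neg_examples and pos_examples and a column weight vector w; adapt the details to whatever the starter code actually uses at those lines:

    % Sketch of one pass over the data with learning rate 1.
    for i = 1:size(neg_examples, 1)
        x = neg_examples(i, :)';   % column vector, bias included
        if (w' * x >= 0)           % fires on a negative example: mistake
            w = w - x;
        end
    end
    for i = 1:size(pos_examples, 1)
        x = pos_examples(i, :)';
        if (w' * x < 0)            % silent on a positive example: mistake
            w = w + x;
        end
    end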



Question 1

Which of the provided datasets are linearly separable? Check all that apply.

Question 2

True or false: if the dataset is not linearly separable, then it is possible for the number of classification errors to increase during learning.

Question 3

True or false: if a generously feasible region exists, then the distance between the current weight vector and any weight vector in the feasible region will monotonically decrease as the learning proceeds.

Question 4

The perceptron algorithm as implemented and described in class implicitly uses a learning rate of 1. We can modify the algorithm to use a different learning rate $\alpha$, so that the update rule for an input $x$ and target $t$ becomes:

$$w^{(t)} \leftarrow w^{(t-1)} + \alpha \, (t - \text{prediction}) \, x$$

where $\text{prediction}$ is the decision made by the perceptron using the current weight vector $w^{(t-1)}$, given by:

$$\text{prediction} = \begin{cases} 1 & \text{if } w^T x \ge 0 \\ 0 & \text{otherwise} \end{cases}$$

True or false: if we use a learning rate of 2, then the perceptron algorithm will always converge to a solution for linearly separable datasets.
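In code, the modified rule amounts to a one-line change. Here is a sketch for a single bias-augmented example x (a column vector) with 0/1 target t; the names alpha, x, t, and w are illustrative:

    % Perceptron update with an explicit learning rate alpha.
    alpha = 2;
    prediction = (w' * x >= 0);           % 1 if the unit fires, else 0
    w = w + alpha * (t - prediction) * x; % no change when prediction == t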

Question 5

According to the code, how many iterations does it take for the perceptron to converge to a solution on dataset 3 using the provided initial weight vector w_init?
Note: the program will output
Number of errors in iteration x: 0
You simply need to report x.
