Original: Ian Goodfellow
CUDA: CUDA is a parallel computing platform and application programming interface (API) model created by Nvidia. It allows software developers and engineers to use a CUDA-enabled graphics processing unit (GPU) for general-purpose processing, an approach termed…
2021-07-31 11:57:09
320
Original: Pieter Abbeel
Deep reinforcement learning; reinforcement learning; autonomous helicopter flight; ImageNet (Geoffrey Hinton, Toronto team); AlexNet. Supervised learning is about learning an input-to-output mapping. Reinforcement learning: where does the data even come fr…
2021-07-28 12:47:08
344
Original: Geoffrey Hinton
The notes from Andrew Ng's interview with Geoffrey Hinton, the godfather of deep learning. You can definitely see many advanced techniques in deep learning and some good advice on how to start with your deep learning! Restricted Boltzmann…
2021-07-27 11:52:56
366
Original: Deep Neural Networks - Parameters vs Hyperparameters
Being effective in developing your deep NN requires that you organize not only your parameters well, but also your hyperparameters. So, what are hyperparameters? Parameters. Hyperparameters. Learning rate: it determines how the parameters evolve…
2021-07-25 18:58:58
126
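The excerpt above distinguishes parameters (learned from data) from hyperparameters such as the learning rate. A minimal NumPy sketch of how the learning rate hyperparameter controls how the parameters evolve (the function name and values here are mine, for illustration):

```python
import numpy as np

# Parameters: learned from data. Hyperparameters: chosen by you, e.g. the
# learning rate alpha, which controls how fast the parameters evolve.
def gradient_step(w, grad, alpha):
    """One gradient-descent update: w := w - alpha * grad."""
    return w - alpha * grad

w = np.array([1.0, -2.0])
grad = np.array([0.5, -0.5])

small = gradient_step(w, grad, alpha=0.01)   # cautious update
large = gradient_step(w, grad, alpha=1.0)    # aggressive update
print(small)  # [ 0.995 -1.995]
print(large)  # [ 0.5 -1.5]
```

Same gradient, very different parameter trajectories: that is why the learning rate is tuned rather than learned.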
Original: Deep Neural Networks - Building blocks of deep neural networks
Take the deep NN of figure-1 as the example. Figure-2 shows the building blocks of this deep NN, with the figure generalized for a NN with L layers.
2021-07-22 12:50:44
127
Original: Deep Neural Networks - Forward Propagation in a Deep Network
Take the deep NN in figure-1 as an example: first for a single training example x, concretely; then for multiple (m) training examples…
2021-07-21 10:10:30
167
1
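As a quick illustration of the forward-propagation recurrence the excerpt refers to, vectorized over m examples (Z[l] = W[l] A[l-1] + b[l], A[l] = g(Z[l])), here is a minimal NumPy sketch; the layer sizes and the ReLU activation are assumptions for illustration, not taken from the post:

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def forward(X, Ws, bs):
    """Forward propagation for an L-layer network on m examples at once.
    X: (n0, m); Ws[l]: (n_l, n_{l-1}); bs[l]: (n_l, 1)."""
    A = X
    for W, b in zip(Ws, bs):
        Z = W @ A + b          # Z[l] = W[l] A[l-1] + b[l]
        A = relu(Z)            # A[l] = g(Z[l])
    return A

rng = np.random.default_rng(0)
sizes = [3, 4, 2, 1]                      # n0 .. nL (assumed, for illustration)
Ws = [rng.standard_normal((sizes[l + 1], sizes[l])) * 0.01 for l in range(3)]
bs = [np.zeros((sizes[l + 1], 1)) for l in range(3)]
X = rng.standard_normal((3, 5))           # m = 5 training examples
print(forward(X, Ws, bs).shape)           # (1, 5)
```

The one explicit loop that remains is over layers, which the course notes is unavoidable.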
Original: Deep Neural Networks - Deep L-layer Neural network
The notes from studying the Coursera class "Neural Networks & Deep Learning" by Andrew Ng, section 4.1 "Deep L-layer Neural network". It shows what a deep NN looks like and the notation used to denote and compute the NN. Share it with you and hope it helps!
2021-07-20 22:30:53
203
Original: One hidden layer Neural Network - Random Initialization
When you train your NN, it's important to initialize the weights randomly. For logistic regression, it's OK to initialize the weights to 0; but for a NN, if you initialize all the weights to 0 and then apply gradient descent, it won't work!
2021-07-12 10:51:19
103
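A minimal sketch of the point in the excerpt above: zero initialization leaves every hidden unit computing the same thing, while small random weights break the symmetry. The 0.01 scale follows the course's usual convention, but the layer sizes and numbers here are illustrative:

```python
import numpy as np

n_x, n_h = 2, 4   # input size, hidden units (illustrative)

# Zero initialization: every hidden unit computes the same function, and
# gradient descent keeps them identical forever (symmetry is never broken).
W1_zero = np.zeros((n_h, n_x))

# Random initialization breaks symmetry; the 0.01 keeps pre-activations in the
# steep region of sigmoid/tanh so gradients don't start out tiny.
rng = np.random.default_rng(1)
W1 = rng.standard_normal((n_h, n_x)) * 0.01
b1 = np.zeros((n_h, 1))   # biases may safely start at zero

x = np.array([[1.0], [2.0]])
print((W1_zero @ x).ravel())   # all rows identical: [0. 0. 0. 0.]
print(np.unique(W1 @ x).size)  # 4 distinct pre-activations
```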
Original: One hidden layer Neural Network - Gradient descent for neural networks
The notes from studying the Coursera class "Neural Networks & Deep Learning" by Andrew Ng, section 3.9 "Gradient descent for neural networks". It shows the computation graph for a NN and how to compute back propagation of the NN when there is one and multiple…
2021-07-12 10:09:06
112
Original: One hidden layer Neural Network - Derivatives of activation functions
When you implement back propagation for your NN, you need to compute the slope/derivative of the activation function. Let's take a look at how to compute the slope of those activation functions. Sigmoid function (figure-1): we have the following deriv…
2021-07-09 12:15:39
84
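For the sigmoid case mentioned in the excerpt above, the slope is g'(z) = g(z)(1 - g(z)). A small NumPy check of that formula against a numerical derivative (a sketch, not code from the post):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_slope(z):
    g = sigmoid(z)
    return g * (1.0 - g)      # g'(z) = g(z) * (1 - g(z))

# Check the formula against a central-difference numerical derivative.
z = np.array([-2.0, 0.0, 3.0])
h = 1e-6
numeric = (sigmoid(z + h) - sigmoid(z - h)) / (2 * h)
print(np.allclose(sigmoid_slope(z), numeric))  # True
print(sigmoid_slope(0.0))                      # 0.25
```

The reuse of g(z) is the practical payoff: in back propagation the activation is already cached from the forward pass, so the slope costs one multiply.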
Original: One hidden layer Neural Network - Activation functions
When building a NN, one of the choices to make is which activation functions to use in the hidden layers as well as the output layer. Besides the sigmoid activation function, other choices can sometimes work much better. tanh function: als…
2021-07-07 12:29:59
138
Original: One hidden layer Neural Network - Vectorizing across multiple examples
The notes from studying the Coursera class "Neural Networks & Deep Learning" by Andrew Ng, section 3.4 "Vectorizing across multiple examples". It shows how to compute the NN output via vectorization when there are multiple training examples. Share it with…
2021-07-06 15:33:04
132
Original: One hidden layer Neural Network - Computing a Neural Network's Output
Let's see how the Neural Network computes its output. It's like logistic regression, but repeated a lot of times! Figure-1 shows how to compute the output of logistic regression. Figure-2 shows how to compute the activation units of the hidden la…
2021-07-05 15:09:03
93
Original: One hidden layer Neural Network - Neural Network Representation
The notes from studying the Coursera class "Neural Networks & Deep Learning" by Andrew Ng, section 3.2 "Neural Network Representation". Share it with you and hope it helps! Figure-1 shows the names of the different parts of a Neura…
2021-07-05 14:45:55
141
Original: One hidden layer Neural Network - Neural Networks Overview
Let's give a quick overview of how you implement your neural network. Figure-1 (logistic regression) shows how we compute logistic regression using a computation graph. Figure-2 (neural network) shows what the Neural Network looks like. We'll u…
2021-07-05 12:45:41
102
Original: Basics of Neural Network Programming - Vectorizing Logistic Regression and its Gradient Computation
Let's talk about how to vectorize the implementation of logistic regression. With that, we can implement a single iteration of gradient descent with respect to the entire training set without using even a single explicit for loop. In gradient descent,…
2021-06-23 11:49:32
105
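The vectorized iteration the excerpt above describes can be sketched as follows, assuming the course's usual shapes (X of shape (n, m), Y of shape (1, m)); the toy data and function names are mine:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_step(w, b, X, Y, alpha):
    """One full gradient-descent iteration, vectorized over all m examples.
    X: (n, m), Y: (1, m), w: (n, 1). No explicit loop over examples."""
    m = X.shape[1]
    A = sigmoid(w.T @ X + b)        # predictions for all m examples at once
    dZ = A - Y                      # (1, m)
    dw = X @ dZ.T / m               # (n, 1)
    db = dZ.mean()
    return w - alpha * dw, b - alpha * db

rng = np.random.default_rng(0)
X = rng.standard_normal((3, 100))
Y = (X.sum(axis=0, keepdims=True) > 0).astype(float)  # separable toy labels
w, b = np.zeros((3, 1)), 0.0
for _ in range(200):                # loop over iterations, not examples
    w, b = logistic_step(w, b, X, Y, alpha=0.5)
acc = ((sigmoid(w.T @ X + b) > 0.5) == Y).mean()
print(acc > 0.9)  # True
```

The only remaining for loop is over gradient-descent iterations, which cannot be vectorized away.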
Original: Basics of Neural Network Programming - More vectorization examples
The rule of thumb to keep in mind is: when you program your neural networks or logistic regression, whenever possible, avoid explicit for loops. Let's look at another example. Figure-1 shows two different ways to compute the result where u and v are vec…
2021-06-22 18:57:22
88
Original: Basics of Neural Network Programming - Vectorization
Vectorization is the art of getting rid of the for loops in your code. The ability to perform vectorization has become a key skill. Figure-1 shows two different ways to calculate…
2021-06-22 17:55:52
139
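A minimal sketch of the kind of comparison the excerpt above alludes to: the same dot product computed with an explicit for loop and with np.dot (the array sizes are arbitrary):

```python
import time

import numpy as np

n = 1_000_000
rng = np.random.default_rng(0)
a = rng.standard_normal(n)
b = rng.standard_normal(n)

# Version 1: explicit for loop.
t0 = time.time()
c_loop = 0.0
for i in range(n):
    c_loop += a[i] * b[i]
t_loop = time.time() - t0

# Version 2: vectorized.
t0 = time.time()
c_vec = np.dot(a, b)
t_vec = time.time() - t0

print(np.isclose(c_loop, c_vec))  # True: same result
print(t_vec < t_loop)             # True: the vectorized version is far faster
```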
Original: Basics of Neural Network Programming - Gradient descent on m examples
Last class, you saw how to compute derivatives and implement gradient descent with respect to just one training example for logistic regression. Now, we'll do it for m training examples. Recap the cost function of logistic regression. And, accordin…
2021-06-21 20:38:10
117
Original: Basics of Neural Network Programming - Logistic Regression Gradient descent
These are the notes from studying the class "Neural Networks & Deep Learning", section "Logistic Regression Gradient Descent". Share it with anyone who is interested. The following two figures show how to calculate derivatives for logistic regression when one tra…
2021-06-21 19:34:00
98
Original: Basics of Neural Network Programming - Derivatives with a Computation Graph
In the last class, we worked through an example of using a computation graph to compute the function J.
2021-06-20 12:09:20
96
Original: Basics of Neural Network Programming - Computation Graph
The computations of a neural network are organized in terms of a forward propagation step in which we compute the
2021-06-20 10:48:42
105
Original: Basics of Neural Network Programming - More derivatives examples
Let's see more complex examples where the slope of the function can be different at different points of the function.
2021-06-17 12:53:34
78
Original: Basics of Neural Network Programming - Derivatives
Let's try to get an intuitive understanding of calculus and derivatives.
2021-06-17 12:13:39
728
Original: Basics of Neural Network Programming - Gradient Descent
Let's talk about how you can use gradient descent to train/learn the parameters w and b on your training set.
2021-06-16 10:31:56
104
Original: Basics of Neural Network Programming - Logistic Regression cost function
For logistic regression, to train the parameters w and b, we need to define a cost function
2021-06-11 12:44:05
95
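The cost function referred to in the excerpt above is the cross-entropy cost, J(w, b) = -(1/m) Σ [y log ŷ + (1 - y) log(1 - ŷ)]. A minimal NumPy sketch (the example predictions are mine):

```python
import numpy as np

def cost(A, Y):
    """Cross-entropy cost for logistic regression.
    A: predictions y-hat in (0, 1); Y: labels in {0, 1}; both shape (1, m)."""
    m = Y.shape[1]
    return -(Y * np.log(A) + (1 - Y) * np.log(1 - A)).sum() / m

Y = np.array([[1.0, 0.0]])
good = np.array([[0.9, 0.1]])   # confident and correct -> low cost
bad = np.array([[0.1, 0.9]])    # confident and wrong   -> high cost
print(round(cost(good, Y), 4))  # 0.1054
print(round(cost(bad, Y), 4))   # 2.3026
```

Unlike squared error, this cost is convex in (w, b) for logistic regression, which is why the course prefers it.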
Original: Basics of Neural Network Programming - Logistic Regression
Logistic Regression: logistic regression is a learning algorithm used in a supervised learning problem where the outputs y are all either zero or one. The goal of logistic regression is to minimize the error between its predictions and the training data. Examp…
2021-06-10 23:37:55
99
Original: Basics of Neural Network Programming - Binary Classification
Binary Classification: in a binary classification problem, the result is a discrete value output. For example: an account hacked (1) or compromised (0); a tumor malignant (1) or benign (0). Example: Cat vs Non-Cat. The goal is to train a classifier where the input…
2021-06-09 12:10:04
131
Original: Introduction to Deep Learning - About this Course
The following are the five courses of this specialization: Neural Networks and Deep Learning; Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization; Structuring your Machine Learning Project; Convolutional Neural Networks; Natu…
2021-05-26 13:10:22
154
Original: Introduction to Deep Learning - Why is deep learning taking off?
If the basic technical ideas behind deep learning and neural networks have been around for decades, why are they only just now taking off? In this class, let's go over some of the main drivers behind the rise of deep learning. This will help you better spot…
2021-05-25 14:30:34
139
Original: Introduction to Deep Learning - Supervised Learning with Neural Networks
It turns out that so far almost all the economic value created by neural networks has been through supervised learning. Let's see what that means and we'll go through some examples in this class.In supervised learning, you have some input x, and you w.
2021-05-18 08:19:24
126
Original: Introduction to Deep Learning - What is a Neural Network?
The term deep learning refers to training neural networks, sometimes very large neural networks. So what exactly is a neural network? Let's start with the housing price prediction example of figure-1. Say you have a data set with six houses. You know the size o…
2021-05-15 16:04:53
207
Original: Application example: Photo OCR - Ceiling analysis: What part of the pipeline to work on next
When developing a machine learning system, one of the most valuable resources is your time as the developer, in terms of picking what to work on next. What you really want to avoid is spending a lot of time working on some component only to realize, after we…
2021-05-11 12:29:59
150
Original: Application example: Photo OCR - Getting lots of data: Artificial data synthesis
One of the most reliable ways to get a high-performance machine learning system is to take a low-bias learning algorithm and train it on a massive training set. But where do you get so much training data from? It turns out that in machine learning ther…
2021-05-06 09:49:32
298
3
Original: Application example: OCR - Sliding windows
In this class, let's talk about how the individual components of the photo OCR pipeline work. In particular, we'll center the discussion around what is called a sliding windows classifier. Figure-1: the first stage of the photo OCR pipeline is 'Text detection'.
2021-04-22 09:32:59
153
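A minimal sketch of the sliding-windows enumeration the excerpt above describes: step a fixed-size window across the image with some stride, collecting every patch that the text/no-text classifier would then score (the window size, stride, and image shape here are arbitrary):

```python
import numpy as np

def sliding_windows(img, win=(4, 4), stride=2):
    """Yield every (row, col, patch) a classifier would score.
    In the text-detection stage each patch is fed to a text/no-text
    classifier; here we only enumerate the windows."""
    H, W = img.shape
    h, w = win
    for r in range(0, H - h + 1, stride):
        for c in range(0, W - w + 1, stride):
            yield r, c, img[r:r + h, c:c + w]

img = np.zeros((10, 12))
patches = list(sliding_windows(img))
print(len(patches))         # 4 row positions * 5 col positions = 20
print(patches[0][2].shape)  # (4, 4)
```

In the real pipeline this is repeated at several window scales, and nearby positive windows are merged before the character-segmentation stage.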
Original: How to Move a Live WordPress Site to Localhost on Windows?
Contents: Step 1: Download and Install MAMP on your PC; Step 2: Run MAMP on your PC; Step 3: Create a New Database for Your WordPress Test Site; Step 4: Download WordPress to your PC; Step 5: Move WordPress into MAMP htdocs; Step 6: Install WordPress Locally on…
2021-04-20 23:47:19
280
Original: Application example: Photo OCR - Problem description and pipeline
We'll talk about photo OCR next, for three reasons: to show an example of how a complex machine learning system can be put together; to talk about the concept of a machine learning pipeline and how to allocate resources when trying to decide what to do next; and to tell…
2021-04-14 09:27:56
86
Original: Large scale machine learning - Map-reduce and data parallelism
In this class, let's talk about a different approach to large scale machine learning called the Map-reduce approach. It is at least as important as, or even more important than, stochastic gradient descent. By using this idea, you might be able t…
2021-04-08 08:27:26
98
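The map-reduce idea from the excerpt above: split the training set across machines, let each compute the gradient sum over its own slice (map), then add the partial sums on a central server (reduce). A minimal single-process sketch using threads to stand in for machines, on a least-squares gradient (all names and sizes are mine, for illustration):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def partial_gradient(chunk):
    """'Map' step: one machine computes the gradient sum over its slice."""
    X_part, y_part, w_part = chunk
    return X_part.T @ (X_part @ w_part - y_part)

rng = np.random.default_rng(0)
X = rng.standard_normal((400, 3))
y = rng.standard_normal(400)
w = np.zeros(3)

# Split the 400 examples across 4 "machines", then 'reduce' by summing.
chunks = [(X[i:i + 100], y[i:i + 100], w) for i in range(0, 400, 100)]
with ThreadPoolExecutor(max_workers=4) as pool:
    parts = list(pool.map(partial_gradient, chunks))
distributed = sum(parts) / 400

centralized = X.T @ (X @ w - y) / 400
print(np.allclose(distributed, centralized))  # True: same gradient either way
```

This works because the batch gradient is a sum over examples, so it splits cleanly across machines; the same decomposition applies to any cost expressed as a sum over the training set.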
Original: Large scale machine learning - Online learning
In this class, let's talk about a new large scale machine learning setting called the online learning setting. It allows us to model problems where we have a continuous flood of data coming in and we would like the algorithm to learn from that. Today, many…
2021-03-29 08:30:09
135
Original: Large scale machine learning - Stochastic gradient descent convergence
In this class, let's talk about: how to make sure the stochastic gradient descent algorithm is converging well while we're running it, and how to tune the learning rate for the algorithm. When we were using batch gradient descent, our standard w…
2021-03-23 13:01:52
197
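One convergence check from this class is to average the per-example cost over the last N examples (computed just before each update) and watch that average trend downward. A minimal sketch on a toy least-squares problem; the problem, window size, and learning rate are assumptions for illustration:

```python
import numpy as np

def sgd_with_monitoring(X, y, alpha=0.01, window=100):
    """Plain SGD on least-squares, tracking the cost averaged over the last
    `window` examples -- the convergence check described in the class."""
    w = np.zeros(X.shape[1])
    recent, averages = [], []
    for x_i, y_i in zip(X, y):
        err = x_i @ w - y_i
        recent.append(0.5 * err ** 2)      # cost on this example, pre-update
        w -= alpha * err * x_i             # stochastic gradient step
        if len(recent) == window:
            averages.append(np.mean(recent))
            recent = []
    return w, averages

rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.01 * rng.standard_normal(2000)
w, avgs = sgd_with_monitoring(X, y)
print(avgs[0] > avgs[-1])  # True: the averaged cost trends downward
```

If the averaged cost instead trends upward, the usual remedy from the class is a smaller learning rate (or a decaying one for convergence to the minimum).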