
Artificial intelligence, machine learning, deep learning
Average article quality score: 89
深海里的鱼(・ω<)★
【Andrew Ng Deep Learning】Trigger Word Detection
Trigger Word Detection. Welcome to the final programming assignment of this specialization! In this week's videos, you learned about applying deep learning to speech recognition. In this assignment, you will construct a speech dataset and implement an algor… Original 2022-04-19 21:53:15 · 444 views · 0 comments -
【Andrew Ng Deep Learning】Neural Machine Translation
Neural Machine Translation. Welcome to your first programming assignment for this week! You will build a Neural Machine Translation (NMT) model to translate human-readable dates ("25th of June, 2009") into machine-readable dates ("2009-06-25"). You will do… Original 2022-04-19 15:09:19 · 1568 views · 0 comments -
【Andrew Ng Deep Learning】05_week3_quiz Sequence models & Attention mechanism
(1) Consider using this encoder-decoder model for machine translation. This model is a "conditional language model" in the sense that the encoder portion (shown in green) is modeling the probability of the input sentence x. [A] True [B] False. Answer: B. Explanation: the input is the sentence x… Original 2022-04-17 21:07:04 · 1883 views · 0 comments -
【Andrew Ng Deep Learning】Emojify
Emojify! Welcome to the second assignment of Week 2. You are going to use word vector representations to build an Emojifier. Have you ever wanted to make your text messages more expressive? Your emojifier app will help you do that. So rather than writing "… Original 2022-04-15 16:44:46 · 1368 views · 0 comments -
【Andrew Ng Deep Learning】Operations on word vectors
Operations on word vectors. Welcome to your first assignment of this week! Because word embeddings are very computationally expensive to train, most ML practitioners will load a pre-trained set of embeddings. After this assignment you will be able to: load p… Original 2022-04-14 19:59:42 · 424 views · 0 comments -
【Andrew Ng Deep Learning】05_week2_quiz Natural Language Processing & Word Embeddings
(1) Suppose you learn a word embedding for a vocabulary of 10000 words. Then the embedding vectors should be 10000-dimensional, so as to capture the full range of variation and meaning in those words. [A] True [B] False. Answer: B. Explanation: note the difference from one-hot encoding. (2) What is t-SN… Original 2022-04-13 21:45:11 · 1082 views · 0 comments -
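The answer above turns on the difference between one-hot vectors and learned embeddings: one-hot vectors have the vocabulary's dimensionality, while embeddings are dense and much lower-dimensional. A minimal PyTorch sketch of that contrast (the 300-dimensional embedding size and the word index are illustrative assumptions, not from the quiz):

```python
import torch
from torch import nn

vocab_size = 10000       # vocabulary size from the quiz statement
embedding_dim = 300      # illustrative; typical embeddings are 50-300 dimensional, far below 10000
word_index = 42          # arbitrary example word

# One-hot representation: dimensionality equals the vocabulary size.
one_hot = torch.zeros(vocab_size)
one_hot[word_index] = 1.0

# Learned embedding: each word maps to a dense, low-dimensional vector.
embedding = nn.Embedding(vocab_size, embedding_dim)
dense = embedding(torch.tensor([word_index]))

print(one_hot.shape)     # torch.Size([10000])
print(dense.shape)       # torch.Size([1, 300])
```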
【Andrew Ng Deep Learning】Improvise a Jazz Solo with an LSTM Network
Improvise a Jazz Solo with an LSTM Network. Welcome to your final programming assignment of this week! In this notebook, you will implement a model that uses an LSTM to generate music. You will even be able to listen to your own music at the end of the assi… Original 2022-04-09 19:03:48 · 801 views · 0 comments -
【Andrew Ng Deep Learning】Character level language model - Dinosaurus land
Character level language model - Dinosaurus land. Welcome to Dinosaurus Island! 65 million years ago, dinosaurs existed, and in this assignment they are back. You are in charge of a special task. Leading biology researchers are creating new breeds of dinosa… Original 2022-04-07 21:36:39 · 525 views · 0 comments -
【Andrew Ng Deep Learning】Building your Recurrent Neural Network - Step by Step
Building your Recurrent Neural Network - Step by Step. Welcome to Course 5's first assignment! In this assignment, you will implement your first Recurrent Neural Network in numpy. Recurrent Neural Networks (RNN) are very effective for Natural Language Proce… Original 2022-04-03 18:33:33 · 1501 views · 0 comments -
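Since the excerpt is cut off, here is a minimal numpy sketch of the kind of single RNN forward step that assignment builds (the function name and the weight names Wax, Waa, Wya, ba, by follow the common course convention and are assumptions here, not copied from the post):

```python
import numpy as np

def rnn_cell_forward(xt, a_prev, parameters):
    """One forward step of a vanilla RNN cell.

    xt:     input at time step t,        shape (n_x, m)
    a_prev: hidden state at time t - 1,  shape (n_a, m)
    """
    Wax, Waa, Wya = parameters["Wax"], parameters["Waa"], parameters["Wya"]
    ba, by = parameters["ba"], parameters["by"]

    # New hidden state: tanh of the projected input plus the projected previous state.
    a_next = np.tanh(Wax @ xt + Waa @ a_prev + ba)

    # Prediction at this time step: softmax over the output scores.
    z = Wya @ a_next + by
    yt_pred = np.exp(z) / np.sum(np.exp(z), axis=0, keepdims=True)

    cache = (a_next, a_prev, xt, parameters)  # kept for the backward pass
    return a_next, yt_pred, cache
```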
【Andrew Ng Deep Learning】05_week1_quiz Recurrent Neural Networks
(1) Suppose your training examples are sentences (sequences of words). Which of the following refers to the j-th word in the i-th training example? [A] $x^{(i)<j>}$ [B] $x^{<i>(j)}$ [C] $x^{(j)<i>}$… Original 2022-03-22 21:06:18 · 446 views · 0 comments -
【Andrew Ng Deep Learning】Deep Learning & Art: Neural Style Transfer
Deep Learning & Art: Neural Style Transfer. Welcome to the second assignment of this week. In this assignment, you will learn about Neural Style Transfer. This algorithm was created by Gatys et al. (2015) (https://arxiv.org/abs/1508.06576). In this assi… Original 2022-03-18 13:20:13 · 453 views · 0 comments -
【Andrew Ng Deep Learning】Face Recognition for the Happy House
Face Recognition for the Happy House. Welcome to the first assignment of week 4! Here you will build a face recognition system. Many of the ideas presented here are from FaceNet. In lecture, we also talked about DeepFace. Face recognition problems commonly… Original 2022-03-17 18:24:56 · 362 views · 0 comments -
【Andrew Ng Deep Learning】04_week4_quiz Special applications: Face recognition & Neural style transfer
(1) Face verification requires comparing a new picture against one person's face, whereas face recognition requires comparing a new picture against K persons' faces. [A] True [B] False. Answer: A. Explanation: face verification checks whether the picture is a particular person; face recognition identifies which person it is. (2) Why do we learn a function d(img1,… Original 2022-03-17 00:53:20 · 519 views · 0 comments -
【Andrew Ng Deep Learning】Residual Networks (PyTorch)
Link to the Keras version. Imports: import torch; from torch import nn; from torch import optim; import torch.nn.functional as F; from torch.utils.data import Dataset; from torch.utils.data import DataLoader; from resnets_utils import *. Dataset class: class MyDataset(Dataset): def __i… Original 2022-03-15 07:20:16 · 1665 views · 0 comments -
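The excerpt cuts off at the start of MyDataset. A minimal sketch of what such a Dataset wrapper typically looks like (the constructor arguments and the normalization/permutation choices are assumptions, not the post's exact code):

```python
import torch
from torch.utils.data import Dataset

class MyDataset(Dataset):
    """Wraps image arrays and labels so a DataLoader can draw mini-batches."""

    def __init__(self, images, labels):
        # Convert HWC uint8 images in [0, 255] to CHW float tensors in [0, 1].
        self.images = torch.tensor(images, dtype=torch.float32).permute(0, 3, 1, 2) / 255.0
        self.labels = torch.tensor(labels, dtype=torch.long)

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, idx):
        return self.images[idx], self.labels[idx]
```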
【Andrew Ng Deep Learning】the Happy House (PyTorch)
Link to the Keras version. Imports: import torch; from torch import nn; from torch import optim; from torch.utils.data import Dataset; from torch.utils.data import DataLoader; import torch.nn.functional as F; from kt_utils import *. Dataset class, used to split the data into mini-batches: class MyDataset(Dataset):… Original 2022-03-15 04:40:29 · 457 views · 0 comments -
【Andrew Ng Deep Learning】Autonomous driving - Car detection
Autonomous driving - Car detection. Welcome to your week 3 programming assignment. You will learn about object detection using the very powerful YOLO model. Many of the ideas in this notebook are described in the two YOLO papers: Redmon et al., 2016 (https:… Original 2022-03-14 20:59:35 · 845 views · 0 comments -
【Andrew Ng Deep Learning】04_week3_quiz Detection algorithms
(1) You are building a 3-class object classification and localization algorithm. The classes are: pedestrian (c=1), car (c=2), motorcycle (c=3). What would be the label for the following image? Recall $y = [p_c, b_x, b_y, b_h, b_w, c_1, c_2, c_3]$… Original 2022-03-14 07:36:48 · 4205 views · 0 comments -
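As a worked illustration of that label encoding (the box coordinates below are made-up placeholders, not the quiz's answer figures): for an image containing a single car (c = 2), $p_c = 1$, the box entries hold the car's bounding box, and the class part is one-hot; when no object is present, $p_c = 0$ and the remaining entries are "don't care":

```latex
y_{\text{car}} =
\begin{bmatrix} 1 \\ 0.5 \\ 0.6 \\ 0.3 \\ 0.4 \\ 0 \\ 1 \\ 0 \end{bmatrix}
\qquad
y_{\text{no object}} =
\begin{bmatrix} 0 \\ ? \\ ? \\ ? \\ ? \\ ? \\ ? \\ ? \end{bmatrix}
```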
【Andrew Ng Deep Learning】Residual Networks
Residual Networks. Welcome to the second assignment of this week! You will learn how to build very deep convolutional networks, using Residual Networks (ResNets). In theory, very deep networks can represent very complex functions; but in practice, they are… Original 2022-03-13 12:22:27 · 386 views · 0 comments -
【Andrew Ng Deep Learning】Keras tutorial - the Happy House
Keras tutorial - the Happy House. Welcome to the first assignment of week 2. In this assignment, you will: learn to use Keras, a high-level neural networks API (programming framework), written in Python and capable of running on top of several lower-level… Original 2022-03-12 16:48:31 · 574 views · 0 comments -
【Andrew Ng Deep Learning】04_week2_quiz Deep convolutional models
(1) Which of the following do you typically see as you move to deeper layers in a ConvNet? [A] $n_H$ and $n_W$ decrease, while $n_C$ increases. [B] $n_H$ and $n_W$ decrease, while $n_C$ also decreases. [C] $n_H$ and $n_W$ increase, w… Original 2022-03-12 06:40:59 · 1740 views · 0 comments -
【Andrew Ng Deep Learning】Convolutional Neural Networks: Application (PyTorch)
Link to the original TensorFlow version. Import the required packages: import torch; from torch import nn; from torch import optim; from cnn_utils import *. Flatten class: since early versions of PyTorch did not provide an nn.Flatten class, one has to be written by hand here: class Flatten(nn.Module): def __init__(self, start_dim=1, end_dim=-1): super(Flatten, sel… Original 2022-03-10 18:43:42 · 1177 views · 0 comments -
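The excerpt stops inside the hand-written Flatten module. A minimal sketch of how such a drop-in replacement is typically completed (an assumed reconstruction, not the post's exact code):

```python
import torch
from torch import nn

class Flatten(nn.Module):
    """Drop-in replacement for nn.Flatten on PyTorch versions that lack it."""

    def __init__(self, start_dim=1, end_dim=-1):
        super(Flatten, self).__init__()
        self.start_dim = start_dim
        self.end_dim = end_dim

    def forward(self, x):
        # Collapse the dimensions from start_dim through end_dim into one,
        # e.g. (N, C, H, W) -> (N, C*H*W) with the defaults.
        return torch.flatten(x, self.start_dim, self.end_dim)
```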
【Andrew Ng Deep Learning】Convolutional Neural Networks: Application (TensorFlow)
Convolutional Neural Networks: Application. Note: results may differ depending on the TensorFlow version. Welcome to Course 4's second assignment! In this notebook, you will: implement helper functions that you will use when implementing a TensorFlow model; implement a fully functioning C… Original 2022-03-10 15:10:17 · 369 views · 0 comments -
【Andrew Ng Deep Learning】Convolutional Neural Networks: Step by Step
Convolutional Neural Networks: Step by Step. Welcome to Course 4's first assignment! In this assignment, you will implement convolutional (CONV) and pooling (POOL) layers in numpy, including both forward propagation and (optionally) backward propagation. No… Original 2022-03-10 02:23:14 · 523 views · 0 comments -
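A minimal numpy sketch of the single convolution step that assignment builds on (the function and argument names follow the usual course convention and are assumptions here): one filter applied to one slice of the input is just an element-wise product, a sum, and a bias.

```python
import numpy as np

def conv_single_step(a_slice_prev, W, b):
    """Apply one filter W of shape (f, f, n_C_prev) to one slice of the
    previous layer's activations (same shape); b has shape (1, 1, 1)."""
    s = a_slice_prev * W    # element-wise product
    Z = np.sum(s)           # sum over all entries of the volume
    Z = Z + float(b)        # add the scalar bias
    return Z
```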
【Andrew Ng Deep Learning】04_week1_quiz The basics of ConvNets
(1) What do you think applying this filter to a grayscale image will do? $\left| \begin{matrix} 0 & 1 & -1 & 0 \\ 1 & 3 & -3 & -1 \\ 1 & 3 & -3 & -1 \\ 0 & 1 & -1 & 0 \end{matrix} \right|$… Original 2022-03-09 16:10:07 · 3344 views · 0 comments -
Backpropagation through the pooling layer
While studying the backpropagation algorithm for convolutional neural networks, the blogger ran into a question: a pooling layer has no convolution kernel, so what does it do during backpropagation, and which parameters does it update? One blogger notes that pooling layers generally have no parameters, so during backpropagation you only need to take derivatives with respect to the input; no weight update is performed. The computation does, however, have to distinguish between Max and Average pooling. Let's look at the forward and backward passes of a pooling layer. 1 Max-P… Reposted 2022-03-09 15:28:02 · 2662 views · 4 comments -
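A minimal numpy sketch of the two cases described above (the helper names are the usual course-style convention, assumed rather than taken from the post): max pooling routes the whole upstream gradient to the position that held the maximum, while average pooling spreads it evenly over the window.

```python
import numpy as np

def create_mask_from_window(x):
    """Mask that is True at the position of the maximum of window x."""
    return x == np.max(x)

def distribute_value(dz, shape):
    """Spread a scalar gradient dz evenly over a window of the given shape."""
    n_h, n_w = shape
    return np.full(shape, dz / (n_h * n_w))

# Backward pass through one 2x2 window with upstream gradient dz:
window = np.array([[1.0, 4.0],
                   [2.0, 3.0]])
dz = 5.0

dA_max = create_mask_from_window(window) * dz   # all of dz goes to the max entry
dA_avg = distribute_value(dz, window.shape)     # dz split evenly: 1.25 per entry
print(dA_max)
print(dA_avg)
```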
【Andrew Ng Deep Learning】03_week2_quiz Autonomous driving (case study)
(1) To help you practice strategies for machine learning, this week we'll present another scenario and ask how you would act. We think this "simulator" of working on a machine learning project will give a sense of what leading a machine learning project… Original 2022-03-07 03:55:20 · 1421 views · 0 comments -
【Andrew Ng Deep Learning】03_week1_quiz Bird recognition in the city of Peacetopia (case study)
(1) Problem Statement. This example is adapted from a real production application, but with details disguised to protect confidentiality. You are a famous researcher in the City of Peacetopia. The people of Peacetopia have a common characteristic: they are… Original 2022-03-06 00:13:34 · 993 views · 0 comments -
The PyTorch Framework
PyTorch. Reference: the original post, TensorFlow Framework. 1 - Exploring the PyTorch Library: import math; import numpy as np; import h5py; import matplotlib.pyplot as plt; import torch; from torch import nn; from torch.nn import functional as F; from tf_utils import load_dataset, random_mini_batc… Original 2022-03-03 20:53:31 · 749 views · 0 comments -
【Andrew Ng Deep Learning】TensorFlow Tutorial
TensorFlow Tutorial. Welcome to this week's programming assignment. Until now, you've always used numpy to build neural networks. Now we will step you through a deep learning framework that will allow you to build neural networks more easily. Machine learni… Original 2022-03-03 14:09:08 · 763 views · 0 comments -
【Andrew Ng Deep Learning】02_week3_quiz Hyperparameter tuning, Batch Normalization, Programming Frameworks
(1) If searching among a large number of hyperparameters, you should try values in a grid rather than random values, so that you can carry out the search more systematically and not rely on chance. True or False? Answer: False. Explanation: with many hyperparameters, you don't know in advance which ones matter most. The example from the course: in… Original 2022-03-02 16:48:01 · 743 views · 0 comments -
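A minimal sketch of the random-search idea behind that answer (the particular hyperparameters and ranges are illustrative assumptions): sampling at random, with the learning rate drawn in log-space, tries many more distinct values of each individual hyperparameter than a fixed grid does.

```python
import numpy as np

rng = np.random.default_rng(0)
trials = []
for _ in range(25):
    # Learning rate sampled uniformly in log-space between 1e-4 and 1e-1.
    learning_rate = 10 ** rng.uniform(-4, -1)
    # Other hyperparameters drawn from plausible illustrative ranges.
    beta = rng.uniform(0.9, 0.999)              # momentum coefficient
    hidden_units = int(rng.integers(50, 300))   # layer width
    trials.append((learning_rate, beta, hidden_units))

# Unlike a coarse grid, every trial probes a distinct learning rate.
for lr, beta, units in trials[:5]:
    print(f"lr={lr:.5f}  beta={beta:.3f}  hidden_units={units}")
```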
【Andrew Ng Deep Learning】Optimization Methods
Optimization Methods. Until now, you've always used Gradient Descent to update the parameters and minimize the cost. In this notebook, you will learn more advanced optimization methods that can speed up learning and perhaps even get you to a better final va… Original 2022-03-01 02:19:25 · 503 views · 0 comments -
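One of the methods that notebook covers is gradient descent with momentum; a minimal sketch of a single momentum update (the dictionary layout W1, b1, ... and the function name follow the usual course convention and are assumptions here):

```python
def update_parameters_with_momentum(parameters, grads, v, beta=0.9, learning_rate=0.01):
    """One momentum step for parameters stored as W1, b1, W2, b2, ..."""
    L = len(parameters) // 2  # number of layers
    for l in range(1, L + 1):
        # Exponentially weighted average of the gradients.
        v["dW" + str(l)] = beta * v["dW" + str(l)] + (1 - beta) * grads["dW" + str(l)]
        v["db" + str(l)] = beta * v["db" + str(l)] + (1 - beta) * grads["db" + str(l)]
        # Step in the direction of the smoothed gradient.
        parameters["W" + str(l)] -= learning_rate * v["dW" + str(l)]
        parameters["b" + str(l)] -= learning_rate * v["db" + str(l)]
    return parameters, v
```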
【Andrew Ng Deep Learning】02_week2_quiz Optimization algorithms
(1) Which notation would you use to denote the 3rd layer's activations when the input is the 7th example from the 8th mini-batch? [A] $a^{[3]\{7\}(8)}$ [B] $a^{[8]\{7\}(3)}$ [C] $a^{[8]\{3\}(7)}$ [D] a^{[3]… Original 2022-02-28 03:16:49 · 854 views · 0 comments -
【Andrew Ng Deep Learning】Gradient Checking
Gradient Checking. Welcome to the final assignment for this week! In this assignment you will learn to implement and use gradient checking. You are part of a team working to make mobile payments available globally, and are asked to build a deep learning mod… Original 2022-02-25 12:46:15 · 443 views · 0 comments -
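A minimal sketch of the core of gradient checking (the toy cost J and its analytic gradient below are made-up stand-ins for illustration): compare the backprop gradient with a two-sided finite-difference approximation and look at their relative difference.

```python
import numpy as np

def J(theta, x):
    return theta * x     # toy cost used only for illustration

def dJ_dtheta(theta, x):
    return x             # analytic ("backprop") gradient of the toy cost

def gradient_check(theta, x, epsilon=1e-7):
    # Two-sided numerical approximation of the derivative.
    grad_approx = (J(theta + epsilon, x) - J(theta - epsilon, x)) / (2 * epsilon)
    grad = dJ_dtheta(theta, x)
    # Relative difference; values around 1e-7 or smaller suggest the gradient is correct.
    numerator = np.linalg.norm(grad - grad_approx)
    denominator = np.linalg.norm(grad) + np.linalg.norm(grad_approx)
    return numerator / denominator

print(gradient_check(theta=4.0, x=3.0))  # should be very close to 0
```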
【Andrew Ng Deep Learning】Regularization
Regularization. Welcome to the second assignment of this week. Deep Learning models have so much flexibility and capacity that overfitting can be a serious problem if the training dataset is not big enough. Sure, it does well on the training set, but the le… Original 2022-02-24 20:50:15 · 343 views · 0 comments -
【Andrew Ng Deep Learning】Initialization
Initialization. Welcome to the first assignment of "Improving Deep Neural Networks". Training your neural network requires specifying an initial value of the weights. A well-chosen initialization method will help learning. If you completed the previous cour… Original 2022-02-23 20:31:53 · 333 views · 0 comments -
【Andrew Ng Deep Learning】02_week1_quiz Practical aspects of deep learning
(1) If you have 10,000,000 examples, how would you split the train/dev/test set? [A] 98% train, 1% dev, 1% test [B] 33% train, 33% dev, 33% test [C] 60% train, 20% dev, 20% test. Answer: A. Explanation: see video 1.1, Train/dev/test sets. (2) The dev and test set should: [A] Come from… Original 2022-02-23 16:22:17 · 1043 views · 0 comments -
A very vivid (mo xing) activation function drill
Reposted 2022-02-17 22:29:28 · 153 views · 0 comments -
【Andrew Ng Deep Learning】Deep Neural Network for Image Classification: Application
Deep Neural Network for Image Classification: Application. When you finish this, you will have finished the last programming assignment of Week 4, and also the last programming assignment of this course! You will use the functions you'd implemented in the p… Original 2022-02-17 19:40:02 · 733 views · 0 comments -
【Andrew Ng Deep Learning】Building your Deep Neural Network: Step by Step
Building your Deep Neural Network: Step by Step. Welcome to your week 4 assignment (part 1 of 2)! You have previously trained a 2-layer Neural Network (with a single hidden layer). This week, you will build a deep neural network, with as many layers as you… Original 2022-02-16 23:40:35 · 478 views · 0 comments -
【Andrew Ng Deep Learning】01_week4_quiz Key concepts on Deep Neural Networks
(1) What is the "cache" used for in our implementation of forward propagation and backward propagation? [A] It is used to keep track of the hyperparameters that we are searching over, to speed up computation. [B] We use it to pass variables computed during fo… Original 2022-02-16 18:33:46 · 761 views · 0 comments