Original: 268. Missing Number
Given an array containing n distinct numbers taken from 0, 1, 2, ..., n, find the one that is missing from the array. For example, given nums = [0, 1, 3], return 2. Follow up: could you implement a solution using only O(1) extra space complexity and O(n) runtime complexity? A sketch follows this entry.
2020-10-26 11:21:17
184
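The follow-up above asks for O(1) extra space and O(n) time. A minimal sketch of one way to satisfy it, using the XOR trick (the function name `missing_number` is mine, not from the post):

```python
def missing_number(nums):
    """Return the one value in 0..n missing from nums; O(n) time, O(1) extra space."""
    missing = len(nums)  # start with n, the one index the loop below never visits
    for i, x in enumerate(nums):
        # XOR cancels every value that appears both as an index and as an element,
        # leaving only the missing number.
        missing ^= i ^ x
    return missing

print(missing_number([0, 1, 3]))  # prints 2
```

The same complexity bounds can also be met with the arithmetic-series sum n*(n+1)//2 minus sum(nums); XOR avoids any concern about overflow in fixed-width languages.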
Original: A roundup of problems encountered while using TensorFlow
1. TensorBoard reports "no scalar data was found". Solution link: https://github.com/tensorflow/tensorflow/issues/2353. When using tf.summary to record the validation-set loss, "no scalar data was found" kept appearing. with tf.Session(graph=g) as sess: ... (a minimal sketch follows this entry)
2019-06-07 08:10:31
266
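A minimal sketch of what the fix usually looks like in TensorFlow 1.x (the style the post uses, with tf.Session): the scalar summary op must actually be evaluated and the result written through a FileWriter whose logdir matches the one passed to TensorBoard. Names such as `val_loss` and the logdir are illustrative, not taken from the original post.

```python
import tensorflow as tf  # assumes TensorFlow 1.x, as in the original post

g = tf.Graph()
with g.as_default():
    val_loss = tf.placeholder(tf.float32, shape=(), name="val_loss")  # illustrative
    loss_summary = tf.summary.scalar("validation_loss", val_loss)

with tf.Session(graph=g) as sess:
    # The FileWriter's logdir must match the directory given to `tensorboard --logdir`.
    writer = tf.summary.FileWriter("logs/val", sess.graph)
    for step in range(100):
        loss_value = 1.0 / (step + 1)  # stand-in for a real validation loss
        summary = sess.run(loss_summary, feed_dict={val_loss: loss_value})
        writer.add_summary(summary, global_step=step)  # without this, no scalars are written
    writer.flush()
    writer.close()
```

Forgetting to run the summary op, or pointing TensorBoard at a different directory than the FileWriter, are the two usual causes of the "no scalar data was found" message.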
Original: Mask R-CNN, a close reading
1. Introduction. In principle Mask R-CNN is an intuitive extension of Faster R-CNN, yet constructing the mask branch properly is critical for good results. Most importantly, Faster R-CNN was not designed for...
2018-05-11 17:55:22
433
Reposted: Welcome to the CSDN-markdown editor 2
Welcome to writing blog posts with the Markdown editor. This Markdown editor is adapted from StackEdit; writing posts with it brings a brand-new experience: Markdown and extended Markdown, concise syntax, code-block highlighting, image links and image upload, LaTeX math formulas, UML sequence diagrams and flowcharts, offline writing, import and export of Markdown files, and a rich set of keyboard shortcuts. Shortcuts: bold Ctrl + B, italic Ctrl + I, quote Ctrl
2017-12-22 16:00:14
209
Original: deeplearning.ai-lecture4-week1-Convolutional Neural Networks: Step by Step
Convolutional Neural Networks: Step by Step. Welcome to Course 4's first assignment! In this assignment, you will implement convolutional (CONV) and pooling (POOL) layers in numpy, including both forward propagation... A sketch of the single convolution step follows this entry.
2017-12-21 21:09:07
254
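A minimal numpy sketch of the building block this assignment centers on: applying one filter to one slice of the previous layer's activations. The name `conv_single_step` follows the assignment's convention, but the body here is my own sketch, not the notebook's reference solution.

```python
import numpy as np

def conv_single_step(a_slice_prev, W, b):
    """Apply one filter (W, b) to a single slice of the previous activation.

    a_slice_prev, W: arrays of shape (f, f, n_C_prev); b: array of shape (1, 1, 1).
    """
    s = a_slice_prev * W   # element-wise product between the slice and the filter
    Z = np.sum(s)          # sum over all entries of the product
    Z = Z + b.item()       # add the bias as a plain scalar
    return Z

# Tiny usage example with random data
np.random.seed(1)
a_slice = np.random.randn(3, 3, 4)
W = np.random.randn(3, 3, 4)
b = np.random.randn(1, 1, 1)
print(conv_single_step(a_slice, W, b))
```

The full CONV forward pass then slides this step over every spatial position and every filter; pooling layers replace the multiply-and-sum with a max or mean over each slice.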
Reposted: deeplearning.ai-lecture1-building deep neural network steps
This assignment mainly implements some "helper functions" in preparation for building a two-layer network and an L-layer network. The steps for implementing a two-layer or deep network are as follows. Step 1: initialize the parameters of a two-layer network and of an L-layer network. Step 2: implement forward propagation: 1. complete the linear part of the forward pass, i.e. compute Z[l]; 2. implement the ReLU and sigmoid activation functions (a sketch of these helpers follows this entry).
2017-12-12 20:11:00
210
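A short sketch of the two helpers named in the steps above: the linear part that computes Z[l] = W[l] A[l-1] + b[l], plus the ReLU and sigmoid activations. The function names follow the assignment's convention; the bodies are an independent sketch.

```python
import numpy as np

def linear_forward(A_prev, W, b):
    """Linear part of one layer's forward pass: Z = W . A_prev + b."""
    Z = np.dot(W, A_prev) + b
    cache = (A_prev, W, b)   # kept so backpropagation can reuse these values
    return Z, cache

def relu(Z):
    """ReLU activation; also returns Z as the cache for the backward pass."""
    return np.maximum(0, Z), Z

def sigmoid(Z):
    """Sigmoid activation; also returns Z as the cache for the backward pass."""
    return 1 / (1 + np.exp(-Z)), Z
```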
Original: deeplearning.ai-lecture1-building deep neural network-summary
First, a summary map. 1. Parameter initialization for an L-layer network: returns the per-layer parameters W(1)…W(L-1). def initialize_parameters_deep(layer_dims): for l in range(1, L): parameters['W'+str(l)] = np.random.randn(layer_dims[l], layer_dims[ ... (a runnable sketch follows this entry)
2017-12-12 19:56:38
395
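The snippet above is cut off by the listing. A runnable sketch of the same idea (small random weights, zero biases); the scaling factor 0.01 and the seed are conventional choices, not necessarily those of the original post.

```python
import numpy as np

def initialize_parameters_deep(layer_dims):
    """layer_dims: list of layer sizes, e.g. [n_x, n_h1, ..., n_y]."""
    np.random.seed(3)
    parameters = {}
    L = len(layer_dims)  # number of layers, counting the input layer
    for l in range(1, L):
        parameters['W' + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
        parameters['b' + str(l)] = np.zeros((layer_dims[l], 1))
    return parameters

params = initialize_parameters_deep([5, 4, 3])
print(params['W1'].shape, params['b1'].shape)  # (4, 5) (4, 1)
```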
Original: deeplearning.ai-lecture2-week1-regularization-homework
Regularization. Welcome to the second assignment of this week. Deep Learning models have so much flexibility and capacity that overfitting can be a serious problem if the training dataset is not big enough... (a sketch of the L2 cost term follows this entry)
2017-12-12 19:52:01
402
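The standard remedy explored in this assignment is L2 regularization: add a penalty proportional to the squared Frobenius norms of the weight matrices to the cross-entropy cost. A hedged sketch, with my own variable names:

```python
import numpy as np

def compute_cost_with_l2(cross_entropy_cost, parameters, lambd, m):
    """Add (lambd / (2*m)) * sum_l ||W[l]||_F^2 to an already computed cost.

    parameters: dict with keys 'W1', 'W2', ...; m: number of training examples.
    """
    l2_term = 0.0
    l = 1
    while 'W' + str(l) in parameters:
        l2_term += np.sum(np.square(parameters['W' + str(l)]))
        l += 1
    return cross_entropy_cost + (lambd / (2 * m)) * l2_term
```

During backpropagation the same penalty adds (lambd / m) * W[l] to each dW[l].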
Original: deeplearning.ai-lecture2-week1-Initialization-homework
Initialization. Welcome to the first assignment of "Improving Deep Neural Networks". Training your neural network requires specifying an initial value of the weights. A well-chosen initialization method... (a sketch of He initialization follows this entry)
2017-12-12 19:50:14
560
1
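One of the initializations compared in that assignment is He initialization, which scales each weight matrix by sqrt(2 / fan_in) rather than a fixed small constant. A minimal sketch, assuming the same layer_dims convention as above:

```python
import numpy as np

def initialize_parameters_he(layer_dims):
    """He initialization: weights scaled by sqrt(2 / fan_in), biases start at zero."""
    parameters = {}
    L = len(layer_dims)
    for l in range(1, L):
        parameters['W' + str(l)] = (np.random.randn(layer_dims[l], layer_dims[l - 1])
                                    * np.sqrt(2.0 / layer_dims[l - 1]))
        parameters['b' + str(l)] = np.zeros((layer_dims[l], 1))
    return parameters
```

The sqrt(2 / fan_in) factor keeps activation variances roughly stable across ReLU layers, which is why it converges much better than all-zeros or large random initialization.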
Original: deeplearning.ai-lecture2-week1-Gradient Checking-homework
Gradient Checking. Welcome to the final assignment for this week! In this assignment you will learn to implement and use gradient checking. You are part of a team working to make mobile payments available... (a sketch follows this entry)
2017-12-12 19:48:38
570
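The core of gradient checking is comparing the analytic gradient to a centered finite difference of the cost. A one-dimensional sketch under my own names (the assignment generalizes this to a full parameter vector):

```python
def gradient_check(f, grad_f, theta, epsilon=1e-7):
    """Compare the analytic gradient grad_f(theta) to a centered finite difference of f."""
    grad_approx = (f(theta + epsilon) - f(theta - epsilon)) / (2 * epsilon)
    grad = grad_f(theta)
    # Relative difference; roughly 1e-7 or smaller means the gradient is almost surely correct.
    return abs(grad - grad_approx) / (abs(grad) + abs(grad_approx))

# Example: J(theta) = theta**2, dJ/dtheta = 2*theta
print(gradient_check(lambda t: t ** 2, lambda t: 2 * t, 3.0))
```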
Original: deeplearning.ai-lecture2-week3-Tensorflow Tutorial-homework
TensorFlow Tutorial. Welcome to this week's programming assignment. Until now, you've always used numpy to build neural networks. Now we will step you through a deep learning framework that will allow...
2017-12-12 19:44:18
1018
Original: deeplearning.ai lecture2-week2-optimization methods
Optimization Methods. Until now, you've always used Gradient Descent to update the parameters and minimize the cost. In this notebook, you will learn more advanced optimization... (a sketch of the momentum update follows this entry)
2017-12-11 19:51:11
987
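One of the "more advanced" optimizers covered there is gradient descent with momentum. A hedged sketch of the update step, assuming the usual parameter dictionary layout from the earlier assignments and a velocity dict v pre-initialized to zeros with keys 'dW1', 'db1', ...:

```python
def update_parameters_with_momentum(parameters, grads, v, beta=0.9, learning_rate=0.01):
    """Momentum update: v = beta*v + (1-beta)*grad, then param -= learning_rate * v."""
    L = len(parameters) // 2   # assumes keys 'W1','b1', ..., 'WL','bL'
    for l in range(1, L + 1):
        for key in ('W', 'b'):
            name = key + str(l)
            v['d' + name] = beta * v['d' + name] + (1 - beta) * grads['d' + name]
            parameters[name] = parameters[name] - learning_rate * v['d' + name]
    return parameters, v
```

Adam combines this velocity term with an RMSprop-style scaling of the learning rate per parameter, which is why it is usually the notebook's best performer.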
Reposted: A Review on Deep Learning Techniques Applied to Semantic Segmentation (translation) - (1)
Original links: http://blog.youkuaiyun.com/u011771047/article/details/72779221 and http://blog.youkuaiyun.com/u014451076/article/details/71101850. This part covers: the abstract, 1. Introduction, and 2. Terminology and background concepts. Abstract: Image semantic segmentation is attracting more and more attention from computer vision and machine learning researchers. More and more emerging...
2017-11-16 10:51:29
1640