Caffe
visionfans
Research interests: computer vision, pattern recognition, machine learning
Articles in this column
fully connected layer as 1x1 convolution
Quoting LeCun's post: In Convolutional Nets, there is no such thing as "fully-connected layers". There are only convolution layers with 1x1 convolution kernels and a full connection table. It's a …
Original · 2015-09-07 19:07:53 · 2177 views · 0 comments
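A minimal standalone sketch of the equivalence (my own illustration, not code from LeCun's post): applying the FC weight matrix independently at every spatial position is exactly a 1x1 convolution with a full connection table across channels; for a 1x1 input the two computations are literally the same dot products.

    #include <cstdio>

    int main() {
        const int Ci = 2, Co = 2, H = 2, W = 2;   // channels in/out, spatial dims
        float x[Ci][H][W] = {{{1, 2}, {3, 4}},    // input feature map
                             {{5, 6}, {7, 8}}};
        float Wt[Co][Ci] = {{0.1f, 0.2f},         // FC weights == 1x1 conv kernels
                            {0.3f, 0.4f}};
        // 1x1 convolution with a full connection table: each output channel
        // mixes all input channels at the same spatial location. For H=W=1
        // this loop computes exactly the fully connected layer y = Wt * x.
        for (int o = 0; o < Co; ++o)
            for (int h = 0; h < H; ++h)
                for (int w = 0; w < W; ++w) {
                    float y = 0.f;
                    for (int i = 0; i < Ci; ++i) y += Wt[o][i] * x[i][h][w];
                    std::printf("y[%d][%d][%d] = %.2f\n", o, h, w, y);
                }
        return 0;
    }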
Caffe blob reshape
Release original memory and hold new:

    template <typename Dtype>
    void Blob<Dtype>::Reshape(const vector<int>& shape) {
      CHECK_LE(shape.size(), kMaxBlobAxes);
      count_ = 1;
      shape_.resize(shape.size()) …

Original · 2015-09-12 21:54:11 · 5332 views · 0 comments
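For context, the function continues roughly as below (reconstructed from Caffe's src/caffe/blob.cpp of that era, so treat the details as approximate). The key point behind "release and hold": resetting the shared_ptr drops the old SyncedMemory and allocates a fresh one.

    // ...continuation sketch, approximate:
      for (int i = 0; i < shape.size(); ++i) {
        CHECK_GE(shape[i], 0);
        count_ *= shape[i];
        shape_[i] = shape[i];
      }
      if (count_ > capacity_) {
        capacity_ = count_;
        // Resetting the shared_ptr releases the original memory and
        // holds a newly allocated buffer of the required size.
        data_.reset(new SyncedMemory(capacity_ * sizeof(Dtype)));
        diff_.reset(new SyncedMemory(capacity_ * sizeof(Dtype)));
      }
    }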
Notes on Caffe layers
… to finish.
References:
- Developing new layers
- Switch LayerType to string for extensibility #1685
- Layer type is a string #1694
Original · 2015-09-12 13:32:33 · 800 views · 0 comments
caffe layer `EltwiseLayer`
- coeff must either be empty or have the same size as bottom
- coeff is only used with the SUM operation
- top and all bottoms have the same size
- operations: top[0] = PROD(bottom[:]) or top[0] = SUM(coeff[i] * bottom[i])
Original · 2015-11-04 00:26:04 · 2794 views · 0 comments
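A plain C++ sketch of the SUM semantics above (illustrative only, not Caffe's implementation), with coeff defaulting to all ones when empty:

    #include <cassert>
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    // Sketch of EltwiseLayer's SUM forward over equally-sized bottoms:
    // top[j] = sum_i coeff[i] * bottom[i][j]; empty coeff means all ones.
    std::vector<float> eltwise_sum(const std::vector<std::vector<float> >& bottoms,
                                   std::vector<float> coeff) {
      assert(!bottoms.empty());
      if (coeff.empty()) coeff.assign(bottoms.size(), 1.f);  // default: all 1s
      assert(coeff.size() == bottoms.size());  // empty or same size as bottom
      std::vector<float> top(bottoms[0].size(), 0.f);
      for (std::size_t i = 0; i < bottoms.size(); ++i) {
        assert(bottoms[i].size() == top.size());  // all blobs the same size
        for (std::size_t j = 0; j < top.size(); ++j)
          top[j] += coeff[i] * bottoms[i][j];
      }
      return top;
    }

    int main() {
      std::vector<std::vector<float> > bottoms = {{1.f, 2.f}, {3.f, 4.f}};
      std::vector<float> top = eltwise_sum(bottoms, {0.5f, 2.f});
      std::printf("%g %g\n", top[0], top[1]);  // 6.5 9
      return 0;
    }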
caffe training tricks
- Implement the model that won the classification task of ImageNet 2013 #33
- choosing batch sizes and tuning sgd #218
- How to train imagenet with reduced memory and batch size? #430
Originally base_lr = 0…
Original · 2015-11-01 20:39:08 · 956 views · 0 comments
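Issue #430 revolves around adjusting base_lr after shrinking the batch; the excerpt cuts off before the numbers. A commonly used heuristic (an assumption on my part, not text from the article) is to scale the learning rate linearly with the batch size:

    #include <cstdio>

    // Hypothetical helper: linearly rescale a reference learning rate when
    // the batch size changes (a common heuristic, not prescribed by Caffe).
    float scaled_lr(float base_lr, int base_batch, int new_batch) {
        return base_lr * static_cast<float>(new_batch) / base_batch;
    }

    int main() {
        // e.g. a recipe tuned for batch 256, reproduced with batch 64:
        std::printf("base_lr = %g\n", scaled_lr(0.01f, 256, 64));  // 0.0025
        return 0;
    }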
Caffe cuDNN max-pooling with in-place dropout
- dropout in place incompatible with max pooling #117
- Fix Max pooling layer to use a mask #162
- Finish up max pooling with a mask from @sguada #448
- In-place computation can break gradient computation #201
Original · 2015-11-06 22:24:35 · 957 views · 0 comments
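The common thread of these issues: a mask-free max-pooling backward pass recovers the argmax by comparing the input to the pooled output, so a later in-place layer (such as dropout) that edits the output corrupts gradient routing; storing an explicit mask (#162, #448) fixes it. An illustrative sketch (not Caffe's actual kernels):

    #include <cstddef>
    #include <cstdio>
    #include <vector>

    // (a) Mask-free backward: recompute the argmax by comparing each input
    // to the pooled output. This silently breaks if a later in-place layer
    // has modified `top` -- the equality test matches nothing, and the
    // gradient is dropped.
    void pool_backward_by_compare(const std::vector<float>& bottom,
                                  float top, float top_diff,
                                  std::vector<float>& bottom_diff) {
        for (std::size_t i = 0; i < bottom.size(); ++i)
            if (bottom[i] == top) { bottom_diff[i] += top_diff; break; }
    }

    // (b) Mask-based backward: remember the argmax index from the forward
    // pass, so later in-place edits to `top` cannot affect routing.
    void pool_backward_by_mask(std::size_t argmax, float top_diff,
                               std::vector<float>& bottom_diff) {
        bottom_diff[argmax] += top_diff;
    }

    int main() {
        std::vector<float> bottom = {1.f, 9.f, 3.f};     // one pooling window
        std::vector<float> da(3, 0.f), db(3, 0.f);
        // Forward found max 9 at index 1; in-place dropout then zeroed the
        // pooled output, so the compare-based backward routes nothing.
        pool_backward_by_compare(bottom, /*top=*/0.f, /*top_diff=*/1.f, da);
        pool_backward_by_mask(/*argmax=*/1, /*top_diff=*/1.f, db);
        std::printf("compare: %g %g %g | mask: %g %g %g\n",
                    da[0], da[1], da[2], db[0], db[1], db[2]);
        return 0;
    }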
caffe layer
See Caffe/wiki/Development, and read it alongside ArgMaxLayer as a worked example. The layer's declaration goes into one of:
- common_layers.hpp
- data_layers.hpp
- loss_layers.hpp
- neuron_layers.hpp
- vision_layers.hpp
The following functions must be implemented:
- LayerSetUp
- Re…
Original · 2015-10-26 23:34:02 · 1301 views · 0 comments
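To make those pieces concrete, a hypothetical skeleton of such a declaration (an assumption based on the wiki and ArgMaxLayer, not code from the article; Reshape and the forward/backward methods are filled in on the guess that the truncated list continues with the usual required functions):

    // Hypothetical declaration; would live in one of the headers above.
    #include <vector>
    #include "caffe/blob.hpp"
    #include "caffe/layer.hpp"
    #include "caffe/proto/caffe.pb.h"

    namespace caffe {

    template <typename Dtype>
    class MyNewLayer : public Layer<Dtype> {
     public:
      explicit MyNewLayer(const LayerParameter& param)
          : Layer<Dtype>(param) {}
      // One-time setup: parse layer parameters, validate bottom/top counts.
      virtual void LayerSetUp(const std::vector<Blob<Dtype>*>& bottom,
                              const std::vector<Blob<Dtype>*>& top);
      // Shape the top blobs from the bottom shapes; runs before each Forward.
      virtual void Reshape(const std::vector<Blob<Dtype>*>& bottom,
                           const std::vector<Blob<Dtype>*>& top);

     protected:
      virtual void Forward_cpu(const std::vector<Blob<Dtype>*>& bottom,
                               const std::vector<Blob<Dtype>*>& top);
      virtual void Backward_cpu(const std::vector<Blob<Dtype>*>& top,
                                const std::vector<bool>& propagate_down,
                                const std::vector<Blob<Dtype>*>& bottom);
    };

    }  // namespace caffe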
caffe loss NaN
by Yangqing on 14 May 2014: For a sanity check, try running with a learning rate of 0 to see if any NaN errors pop up (they shouldn't, since no learning takes place). If data is not initialized well, it …
Original · 2015-11-01 20:28:55 · 2641 views · 0 comments
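As a companion to that check, a small helper (my own sketch, not from the post or from Caffe) that scans a raw buffer, e.g. a blob's cpu_data(), for non-finite values:

    #include <cmath>
    #include <cstdio>

    // Scan a raw float buffer for NaN/Inf; return the index of the first
    // offender, or -1 if the buffer is clean.
    long first_nonfinite(const float* data, long count) {
        for (long i = 0; i < count; ++i)
            if (!std::isfinite(data[i])) return i;
        return -1;
    }

    int main() {
        float sample[] = {0.5f, -1.0f, NAN, 2.0f};
        long bad = first_nonfinite(sample, 4);
        if (bad >= 0) std::printf("non-finite value at index %ld\n", bad);
        return 0;
    }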