AI Resources Digest: Issue 5 (2017-01-09)


1.【Blog】Matching Networks for One Shot Learning

Summary:

This is a paper on one-shot learning, where we’d like to learn a class from very few (or indeed, one) training examples. For example, it suffices to show a child a single giraffe, not a few hundred thousand, before it can recognize more giraffes.

This paper falls into the “duh, of course” category: something very interesting and powerful, but somehow obvious only in retrospect. I like it.
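The core mechanism behind matching networks is a differentiable nearest-neighbour: the prediction for a test example is an attention-weighted combination over the labels of the small support set. A minimal pure-Python sketch of that idea, with hypothetical toy embeddings and simple cosine-similarity attention (the paper embeds examples with learned networks; that part is omitted here):

```python
import math

def cosine(u, v):
    # Cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def matching_predict(query, support):
    # support: list of (embedding, label) pairs -- the few "shots".
    # Attention weights: softmax over cosine similarities.
    sims = [cosine(query, emb) for emb, _ in support]
    m = max(sims)
    exps = [math.exp(s - m) for s in sims]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Weighted vote over the support-set labels.
    scores = {}
    for w, (_, label) in zip(weights, support):
        scores[label] = scores.get(label, 0.0) + w
    return max(scores, key=scores.get)

# One "shot" per class: a single giraffe and a single zebra embedding.
support = [([1.0, 0.1], "giraffe"), ([0.1, 1.0], "zebra")]
print(matching_predict([0.9, 0.2], support))  # -> giraffe
```

Because the whole pipeline is made of differentiable operations, the embedding function feeding this attention step can be trained end-to-end, which is what makes the approach work beyond a fixed metric.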

Link: https://github.com/karpathy/paper-notes/blob/master/matching_networks.md


2.【Blog】Building Machine Learning Estimator in TensorFlow

Summary:

The purpose of this post is to help you better understand the underlying principles of estimators in TensorFlow Learn, and to point out some tips and hints for building your own estimator suited to your particular application. It will be helpful if you have ever wondered how everything works internally and felt overwhelmed by the large codebase.
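The estimator pattern the post dissects boils down to a small contract: `fit()` learns parameters from data, `predict()` applies them. A hypothetical pure-Python sketch of that contract (this is an illustration of the pattern, not the actual TensorFlow Learn base class):

```python
class Estimator:
    """Minimal sketch of the estimator contract: fit() learns
    parameters, predict() applies them. Hypothetical, not the
    real TensorFlow Learn API."""

    def fit(self, x, y):
        raise NotImplementedError

    def predict(self, x):
        raise NotImplementedError


class MeanEstimator(Estimator):
    # Toy estimator: always predicts the mean of the training targets.
    def fit(self, x, y):
        self.mean_ = sum(y) / len(y)
        return self  # returning self allows fluent chaining

    def predict(self, x):
        return [self.mean_ for _ in x]


est = MeanEstimator().fit([1, 2, 3], [2.0, 4.0, 6.0])
print(est.predict([10, 20]))  # [4.0, 4.0]
```

The value of the abstraction is that training loops, input pipelines, and checkpointing can be written once against the base contract, while each concrete estimator only supplies its model logic.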

Link: http://terrytangyuan.github.io/2016/07/08/understand-and-build-tensorflow-estimator/


3.【Resource】Deep Learning Resources

Summary:

Deep learning resources that I have bookmarked here for reading and self-study.

Link: https://github.com/YajunHuang/DL-learning-resources


4.【Blog】Unfolding RNNs: RNN Concepts and Architectures

Summary:

RNN is one of those toys that eluded me for a long time; I just couldn’t figure out how to make it work. Ever since I read Andrej Karpathy’s blog post on the Unreasonable Effectiveness of RNNs, I have been fascinated by what RNNs are capable of, and at the same time confused by how they actually worked. I couldn’t follow his code for text generation (language modeling). Then I came across Denny Britz’s blog, from which I understood exactly how they worked and how to build them. This blog post is addressed to my past self, who was confused about the internals of RNNs. Through this post, I hope to help people interested in RNNs develop a basic understanding of what they are, how they work, their different variants, and their applications.
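The "unfolding" in the post's title refers to applying the same recurrence at every time step, threading a hidden state through the sequence. A minimal sketch with scalar weights (toy values chosen here for illustration; real RNNs use weight matrices):

```python
import math

def rnn_step(x_t, h_prev, w_xh, w_hh, b_h):
    # One recurrence step: h_t = tanh(W_xh * x_t + W_hh * h_prev + b)
    # (scalar weights keep the unrolling easy to follow).
    return math.tanh(w_xh * x_t + w_hh * h_prev + b_h)

def unfold(xs, w_xh=0.5, w_hh=0.8, b_h=0.0):
    # "Unfolding" the RNN: the same cell is applied at each time step,
    # with the hidden state carried forward through the sequence.
    h = 0.0
    states = []
    for x_t in xs:
        h = rnn_step(x_t, h, w_xh, w_hh, b_h)
        states.append(h)
    return states

states = unfold([1.0, 0.0, 0.0])
# The input at t=0 keeps influencing later hidden states via w_hh,
# even though the later inputs are zero.
print(states)
```

This shared-weights-over-time structure is exactly what backpropagation through time unrolls when training, and it is why RNNs can, in principle, carry information across time steps.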

Link: http://suriyadeepan.github.io/2017-01-07-unfolding-rnn/


5.【Code】Neural Variational Document Model

Summary:

TensorFlow implementation of Neural Variational Inference for Text Processing.

This implementation contains:

  • Neural Variational Document Model
    1. Variational inference framework for generative model of text
    2. Combines a stochastic document representation with a bag-of-words generative model
  • Neural Answer Selection Model (in progress)
    1. Variational inference framework for conditional generative model of text
    2. Combines LSTM embeddings with an attention mechanism to extract the semantics between question and answer
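The "stochastic document representation" in such variational models is typically drawn with the reparameterization trick, so gradients can flow through the sampling step. A minimal sketch with hypothetical toy values (the real model produces the mean and log-variance from an inference network over the bag of words):

```python
import math
import random

def reparameterize(mu, log_sigma, rng):
    # Reparameterization trick used in neural variational inference:
    # z = mu + sigma * eps, with eps ~ N(0, 1), so the sample is a
    # deterministic, differentiable function of mu and sigma.
    return [m + math.exp(ls) * rng.gauss(0.0, 1.0)
            for m, ls in zip(mu, log_sigma)]

rng = random.Random(0)
# A 3-d stochastic "document representation" drawn from q(z | doc);
# the mean and log-sigma values below are made up for illustration.
z = reparameterize([0.0, 1.0, -1.0], [-2.0, -2.0, -2.0], rng)
print(z)
```

Conditioning the bag-of-words decoder on a sample `z` like this, rather than on a fixed vector, is what makes the document model generative and trainable with the variational lower bound.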

Code: https://github.com/carpedm20/variational-text-tensorflow

