Artificial Intelligence Resource Digest: Issue 33 (20170214)

Author: chen_h
WeChat & QQ: 862251340
WeChat public account: coderpai


1. [Code] A TensorFlow Implementation of Character Level Neural Machine Translation Using the Quasi-RNN

Summary:

In Bradbury et al., 2016 (hereafter, the Paper), the authors introduce a new neural network model which they call the Quasi-RNN. Essentially, it tries to get the benefits of both CNNs and RNNs by combining them. The authors conducted three experiments to evaluate the performance of the Q-RNN; character-level machine translation is one of them. After the Paper was published, some enthusiasts tried to reproduce the experiments, as the authors didn't release their source code. Now I'm happy to be one of them. To the best of my knowledge, this is the first TensorFlow implementation of character-level machine translation based on the Paper.
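For a rough sense of how the Quasi-RNN combines the two, here is a minimal NumPy sketch of a single QRNN layer with f-pooling, following the recurrence in the Paper (the shapes, names, and toy data are illustrative assumptions, not code from this repo):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def qrnn_layer(X, Wz, Wf, width=2):
    """Quasi-RNN layer sketch: convolve over time like a CNN,
    then pool sequentially like an RNN ('f-pooling').
    X: (T, d_in); Wz, Wf: (width * d_in, d_h)."""
    T, d_in = X.shape
    d_h = Wz.shape[1]
    # Pad on the left so the convolution is causal (no future leakage)
    Xp = np.vstack([np.zeros((width - 1, d_in)), X])
    # Masked convolution: each output sees only current and past inputs
    windows = np.stack([Xp[t:t + width].ravel() for t in range(T)])
    Z = np.tanh(windows @ Wz)      # candidate vectors, computed in parallel
    F = sigmoid(windows @ Wf)      # forget gates, computed in parallel
    # Sequential but purely elementwise f-pooling
    H = np.zeros((T, d_h))
    h = np.zeros(d_h)
    for t in range(T):
        h = F[t] * h + (1 - F[t]) * Z[t]
        H[t] = h
    return H

rng = np.random.default_rng(0)
T, d_in, d_h, width = 6, 4, 3, 2
H = qrnn_layer(rng.normal(size=(T, d_in)),
               rng.normal(size=(width * d_in, d_h)),
               rng.normal(size=(width * d_in, d_h)))
print(H.shape)  # (6, 3)
```

Because Z and F come from convolutions, they are computed for all time steps at once; only the cheap elementwise pooling loop is sequential, which is where the speedup over a conventional RNN comes from.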

Original link: https://github.com/Kyubyong/quasi-rnn


2. [Blog] Unfolding RNNs II

Summary:

The first article in this series focused on the general mechanism of RNNs, their architectures, variants, and applications. The objective was to abstract away the details and illustrate the high-level concepts in RNNs. Naturally, the next step is to dive into the details. In this article, we will follow a bottom-up approach, starting with the basic recurrent operation and building up to a complete neural network that performs language modeling.

As we have seen in the previous article, RNNs consist of states, which are updated at every time step. The state at time step *t* is essentially a summary of the information in the input sequence up to *t*. At each *t*, information flows from the current input and the previous state to the current state. This flow of information can be controlled; this is called the **gating** mechanism. Conceptually, a gate is a structure that selectively allows the flow of information from one point to another. In this context, we can employ multiple gates to control information flow from the input to the current state, from the previous state to the current state, and from the current state to the output. Based on how gates are employed to control the information flow, we have multiple variants of RNNs.
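To make the gating idea concrete, here is a minimal NumPy sketch of one GRU-style gated update, a common gated RNN variant (the weight names and toy dimensions are illustrative assumptions, not taken from the article):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, W_z, U_z, W_r, U_r, W_h, U_h):
    """One GRU time step: gates decide how much of the previous
    state to keep and how much new information to let in."""
    z = sigmoid(x_t @ W_z + h_prev @ U_z)   # update gate: input -> state flow
    r = sigmoid(x_t @ W_r + h_prev @ U_r)   # reset gate: previous state -> candidate
    h_tilde = np.tanh(x_t @ W_h + (r * h_prev) @ U_h)  # candidate state
    return (1 - z) * h_prev + z * h_tilde   # gated blend of old and new

# Toy dimensions: 4-dim input, 3-dim hidden state
rng = np.random.default_rng(0)
d_in, d_h = 4, 3
params = [rng.normal(size=s) for s in [(d_in, d_h), (d_h, d_h)] * 3]
h = np.zeros(d_h)
for x in rng.normal(size=(5, d_in)):        # unroll over 5 time steps
    h = gru_step(x, h, *params)
print(h)
```

The update gate `z` controls the input-to-state and previous-state-to-state flows, and the reset gate `r` controls how much of the previous state feeds the candidate, which is exactly the selective-flow idea described above.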

Original link: http://suriyadeepan.github.io/2017-02-13-unfolding-rnn-2/


3. [Resource] Open Source Deep Learning Curriculum

Summary:

This open-source deep learning curriculum is meant to be a starting point for everyone interested in seriously studying the field. Plugging into the stream of research papers, tutorials, and books about deep learning mid-stream, it is easy to feel overwhelmed, with no clear idea of where to start. Recognizing that all knowledge is hierarchical, with advanced concepts building on more fundamental ones, I strove to put together a list of resources that forms a logical progression from fundamental to advanced.

Few universities offer an education that is on par with what you can find online these days. The people pioneering the field from industry and academia so openly and competently share their knowledge that the best curriculum is an open source one.

Original link: http://www.deeplearningweekly.com/pages/open_source_deep_learning_curriculum


4. [Blog] Bayesian Linear Regression (in PyMC) - a different way to think about regression

Summary:

In this blog post, I’ll approach this problem from a Bayesian point of view. Ordinary linear regression (as taught in introductory statistics textbooks) offers a recipe which works great under a few circumstances, but has a variety of weaknesses. These weaknesses include an extreme sensitivity to outliers, an inability to incorporate priors, and little ability to quantify uncertainty.

Bayesian linear regression (BLR) offers a very different way to think about things. Combined with some computation (and note - computationally it’s a LOT harder than ordinary least squares), one can easily formulate and solve a very flexible model that addresses most of the problems with ordinary least squares.
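For a sense of what this looks like in code, here is a minimal PyMC3 sketch of a Bayesian linear regression on synthetic data (an illustrative model with made-up priors and data, not the post's own example):

```python
import numpy as np
import pymc3 as pm

# Synthetic data: y = 2x + 1 plus noise (illustrative only)
rng = np.random.RandomState(0)
x = rng.uniform(-3, 3, size=100)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=100)

with pm.Model() as model:
    # Priors encode what we believe before seeing the data
    slope = pm.Normal('slope', mu=0, sd=10)
    intercept = pm.Normal('intercept', mu=0, sd=10)
    sigma = pm.HalfNormal('sigma', sd=1)
    # Likelihood: observed y given the linear predictor
    pm.Normal('y_obs', mu=slope * x + intercept, sd=sigma, observed=y)
    # MCMC sampling: this is the computationally hard part
    trace = pm.sample(1000, tune=1000)

# Posterior summaries quantify uncertainty directly
print(trace['slope'].mean(), trace['slope'].std())
```

The posterior draws in `trace` give full distributions over the parameters rather than point estimates, which is exactly the uncertainty quantification the post says ordinary least squares lacks.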

Original link: https://www.chrisstucchio.com/blog/2017/bayesian_linear_regression.html


5. [NLP & Brain] This is your brain on sentences

Summary:

Researchers at the University of Rochester have, for the first time, decoded and predicted the brain activity patterns of word meanings within sentences, and successfully predicted what the brain patterns would be for new sentences.

The study used functional magnetic resonance imaging (fMRI) to measure human brain activation. “Using fMRI data, we wanted to know if given a whole sentence, can we filter out what the brain’s representation of a word is—that is to say, can we break the sentence apart into its word components, then take the components and predict what they would look like in a new sentence,” said Andrew Anderson, a research fellow who led the study as a member of the lab of **Rajeev Raizada, assistant professor of brain and cognitive sciences** at Rochester.

“We found that we can predict brain activity patterns—not perfectly [on average 70% correct], but significantly better than chance,” said Anderson. The study is published in the journal **Cerebral Cortex**.

Anderson and his colleagues say the study makes key advances toward understanding how information is represented throughout the brain. “First, we introduced a method for predicting the neural patterns of words within sentences—which is a more complex problem than has been addressed by previous studies, which have almost all focused on single words,” Anderson said. “And second, we devised a novel approach to map semantic characteristics of words that we then correlated to neural activity patterns.”
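A "better than chance" figure in decoding studies like this is often computed with a leave-two-out matching test: predictions for two held-out items should correlate better with their own true brain patterns than with each other's. Here is a hypothetical NumPy sketch of that test (illustrative only; not the study's actual pipeline or data):

```python
import numpy as np

def matching_accuracy(pred, true):
    """Leave-two-out style check: for each pair of items, does each
    predicted pattern correlate better with its own true pattern
    than with the other item's pattern?"""
    n = len(pred)
    correct, total = 0, 0
    for i in range(n):
        for j in range(i + 1, n):
            own = (np.corrcoef(pred[i], true[i])[0, 1]
                   + np.corrcoef(pred[j], true[j])[0, 1])
            swapped = (np.corrcoef(pred[i], true[j])[0, 1]
                       + np.corrcoef(pred[j], true[i])[0, 1])
            correct += own > swapped
            total += 1
    return correct / total

# Toy example: predictions that weakly track the true patterns
rng = np.random.default_rng(0)
true = rng.normal(size=(10, 50))           # 10 items x 50 voxels (made up)
pred = true + rng.normal(scale=2.0, size=true.shape)
print(matching_accuracy(pred, true))       # above 0.5 = better than chance
```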

Original link: http://www.rochester.edu/newscenter/this-is-your-brain-on-sentences/#.WKFxXwJXqLM.facebook

