Artificial Intelligence Resource Library: Issue 14 (20170123)


1. [Code] Music Auto-Tagging in Keras

Summary:

Music auto-tagging models and trained weights in keras/theano
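For a sense of what such a model looks like, below is a minimal, generic Keras sketch of a music auto-tagger: a small CNN over log-mel spectrograms with a sigmoid multi-label output. It only illustrates the task; the repository's own architectures and pre-trained weights differ, so see the linked README for the real models.

```python
# A minimal, generic sketch of a Keras music auto-tagging model (not the
# repository's architecture): a small CNN over log-mel spectrograms with a
# sigmoid output so that several tags can be active at once.
from tensorflow import keras
from tensorflow.keras import layers

N_MELS, N_FRAMES, N_TAGS = 96, 1366, 50  # assumed spectrogram size and tag count

model = keras.Sequential([
    layers.Input(shape=(N_MELS, N_FRAMES, 1)),             # log-mel spectrogram
    layers.Conv2D(32, 3, padding="same", activation="relu"),
    layers.MaxPooling2D(pool_size=(2, 4)),
    layers.Conv2D(64, 3, padding="same", activation="relu"),
    layers.MaxPooling2D(pool_size=(2, 4)),
    layers.GlobalAveragePooling2D(),
    layers.Dense(N_TAGS, activation="sigmoid"),             # multi-label tags
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```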

Original link: https://github.com/keunwoochoi/music-auto_tagging-keras


2. [Paper] Deep Patient: An Unsupervised Representation to Predict the Future of Patients from the Electronic Health Records

Summary:

Secondary use of electronic health records (EHRs) promises to advance clinical research and better inform clinical decision making. Challenges in summarizing and representing patient data prevent widespread practice of predictive modeling using EHRs. Here we present a novel unsupervised deep feature learning method to derive a general-purpose patient representation from EHR data that facilitates clinical predictive modeling. In particular, a three-layer stack of denoising autoencoders was used to capture hierarchical regularities and dependencies in the aggregated EHRs of about 700,000 patients from the Mount Sinai data warehouse. The result is a representation we name “deep patient”. We evaluated this representation as broadly predictive of health states by assessing the probability of patients to develop various diseases. We performed evaluation using 76,214 test patients comprising 78 diseases from diverse clinical domains and temporal windows. Our results significantly outperformed those achieved using representations based on raw EHR data and alternative feature learning strategies. Prediction performance for severe diabetes, schizophrenia, and various cancers were among the top performing. These findings indicate that deep learning applied to EHRs can derive patient representations that offer improved clinical predictions, and could provide a machine learning framework for augmenting clinical decision systems.
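As a rough illustration of the approach described above, the sketch below greedily stacks denoising autoencoders and keeps the deepest hidden layer as the patient representation. It is not the authors' code: the layer size, noise level, and toy EHR matrix are illustrative assumptions, and a downstream disease classifier would consume the resulting features.

```python
# A rough sketch (not the authors' code) of the "deep patient" idea: greedily
# stack denoising autoencoders on EHR features and keep the deepest hidden
# layer as the patient representation. Layer size, noise level, and the toy
# EHR matrix below are illustrative assumptions.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def train_denoising_layer(x, hidden_dim=500, noise=0.05, epochs=5):
    """Train one denoising autoencoder and return the encoded features."""
    inp = layers.Input(shape=(x.shape[1],))
    corrupted = layers.Dropout(noise)(inp)       # approximate masking noise
    hidden = layers.Dense(hidden_dim, activation="sigmoid")(corrupted)
    recon = layers.Dense(x.shape[1], activation="sigmoid")(hidden)
    autoencoder = keras.Model(inp, recon)
    autoencoder.compile(optimizer="adam", loss="binary_crossentropy")
    autoencoder.fit(x, x, epochs=epochs, batch_size=128, verbose=0)
    return keras.Model(inp, hidden).predict(x, verbose=0)

# Hypothetical EHR matrix: patients x clinical descriptors scaled to [0, 1].
ehr = np.random.rand(1000, 300).astype("float32")

# Three stacked layers, as in the paper; the output is the patient
# representation fed to a downstream disease-prediction model.
features = ehr
for _ in range(3):
    features = train_denoising_layer(features)
```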

Original link: http://www.nature.com/articles/srep26094


3. [Blog] Deriving the Gradient for the Backward Pass of Batch Normalization

Summary:

I recently sat down to work on assignment 2 of Stanford’s CS231n. It’s lengthy and definitely a step up from the first assignment, but the insight you gain is tremendous.

Anyway, at one point in the assignment, we were tasked with implementing a Batch Normalization layer in our fully-connected net which required writing a forward and backward pass.

The forward pass is relatively simple since it only requires standardizing the input features (zero mean and unit standard deviation). The backwards pass, on the other hand, is a bit more involved. It can be done in 2 different ways:

  • staged computation: we can break up the function into several parts, derive local gradients for them, and finally multiply them with the chain rule.
  • gradient derivation: basically, you have to do a “pen and paper” derivation of the gradient with respect to the inputs.

It turns out that the second option is faster, albeit nastier, and after struggling for a few hours, I finally got it to work. This post is mainly a clear summary of the derivation along with my thought process, and I hope it can provide others with insight and intuition about the chain rule. There is a similar tutorial online already (but I couldn’t follow along very well), so if you want to check it out, head over to Clément Thorey’s Blog.

Finally, I’ve summarized the original research paper and accompanied it with a small numpy implementation which you can view on my Github. With that being said, let’s jump right into the blog.
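For reference, here is a minimal numpy sketch of such a backward pass. It uses the standard closed-form result of the derivation; the cache layout is an assumption, so it is not necessarily identical to the author's implementation linked above.

```python
# A minimal numpy sketch of the batch-norm backward pass using the standard
# closed-form gradient; the cache layout (x_hat, gamma, istd) is an assumption,
# not necessarily the assignment's or the author's exact interface.
import numpy as np

def batchnorm_backward(dout, cache):
    x_hat, gamma, istd = cache    # normalized input, scale, 1 / sqrt(var + eps)
    N = dout.shape[0]

    # Gradients of the learnable scale and shift parameters.
    dgamma = np.sum(dout * x_hat, axis=0)
    dbeta = np.sum(dout, axis=0)

    # Gradient with respect to the layer input, obtained via the chain rule.
    dx = (gamma * istd / N) * (
        N * dout
        - np.sum(dout, axis=0)
        - x_hat * np.sum(dout * x_hat, axis=0)
    )
    return dx, dgamma, dbeta
```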

Original link: https://kevinzakka.github.io/2016/09/14/batch_normalization/


4. [Code] Visual Debugger for Deep Learning, Built on TensorFlow

Summary:

TensorDebugger (TDB) is a visual debugger for deep learning. It extends TensorFlow (Google’s Deep Learning framework) with breakpoints + real-time visualization of the data flowing through the computational graph.
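Below is a hypothetical sketch of evaluating a graph under TDB instead of calling session.run directly; the name and signature of tdb.debug are recalled from the project README and should be verified against the repository before use.

```python
# Hypothetical sketch of evaluating a graph under TDB instead of session.run;
# tdb.debug and its signature are assumptions based on the project README and
# should be checked against the repository.
import tensorflow as tf   # TDB targets the graph-and-session TensorFlow API
import tdb                # assumes the package from the linked repo is installed

x = tf.placeholder(tf.float32, shape=[None, 10])
loss = tf.reduce_mean(tf.square(x))

with tf.Session() as sess:
    # Evaluate through the debugger so breakpoints and attached plot ops can
    # pause execution and visualize intermediate values in the graph.
    status, result = tdb.debug([loss], feed_dict={x: [[1.0] * 10]}, session=sess)
```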

Original link: https://github.com/ericjang/tdb


5. [Resource] Open Learning

Summary:

For someone who does a lot of autonomous learning, the Internet these days offers a huge number of possibilities to read and learn about almost any topic you can think of. The bigger problem may be filtering useful, good content out of the nearly infinite number of sources. Inspired by a colleague, I will try to keep a record of whatever I read or watched and can recommend on specific topics. I will also try to add links that I studied in the past but that may help any interested reader (or myself, as a lookup). Most of it will be about machine learning in general, and more specifically about computer vision and image classification, as my master's thesis is related to these topics. But from time to time I might also add some more fun-related topics.

Original link: http://kratzert.github.io/openlearning.html

