Author: chen_h
WeChat ID & QQ: 862251340
WeChat official account: coderpai
1. [Code] Visual Question Answering in Pytorch
Introduction:

This repo was made by Remi Cadene (LIP6) and Hedi Ben-Younes (LIP6-Heuritech), two PhD students working on VQA at UPMC-LIP6, and their professors Matthieu Cord (LIP6) and Nicolas Thome (LIP6-CNAM). We developed this code as part of a research paper called MUTAN: Multimodal Tucker Fusion for VQA, which is (as far as we know) the current state of the art on the VQA-1 dataset.
The goal of this repo is twofold:
- to make it easier to reproduce our results,
- to provide an efficient and modular code base to the community for further research on other VQA datasets.
If you have any questions about our code or model, don't hesitate to contact us or to submit an issue. Pull requests are welcome!
Original link: https://github.com/Cadene/vqa.pytorch
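For readers who want a feel for the Tucker-fusion idea behind MUTAN, the sketch below is a deliberately simplified, hypothetical PyTorch module, not the repo's actual implementation: question and image features are each projected into a shared low-dimensional space, fused with an element-wise product, and mapped to answer scores. All layer sizes are placeholder values.

```python
# A minimal, illustrative sketch (not the authors' implementation) of the
# Tucker-style bilinear fusion idea behind MUTAN. Dimensions are hypothetical.
import torch
import torch.nn as nn

class TuckerFusionSketch(nn.Module):
    def __init__(self, q_dim=2400, v_dim=2048, hidden=512, n_answers=2000):
        super().__init__()
        self.q_proj = nn.Linear(q_dim, hidden)   # projection for the question mode
        self.v_proj = nn.Linear(v_dim, hidden)   # projection for the image mode
        self.out = nn.Linear(hidden, n_answers)  # projection onto answer scores

    def forward(self, q, v):
        # An element-wise product in the projected space approximates the
        # full bilinear interaction at a fraction of the parameter cost.
        fused = torch.tanh(self.q_proj(q)) * torch.tanh(self.v_proj(v))
        return self.out(fused)

# Example usage with random tensors standing in for real features.
model = TuckerFusionSketch()
scores = model(torch.randn(4, 2400), torch.randn(4, 2048))  # shape (4, 2000)
```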
2. [Blog] Why Does Deep Learning Not Have a Local Minimum?
Introduction:

Yes, there is a 'theoretical justification', and it has taken a couple of decades to flesh it out.
I will first point out, however, that it has been observed in practice. This was pointed out by LeCun in his early work on LeNet, and is actually discussed in the 'orange book', "Pattern Classification" by Richard O. Duda, Peter E. Hart, and David G. Stork.
Original link: http://www.kdnuggets.com/2017/06/deep-learning-local-minimum.html
3. [Blog] Graph-based machine learning: Part I
Introduction:
During the seven-week Insight Data Engineering Fellows Program, recent grads and experienced software engineers learn the latest open source technologies by building a data platform to handle large, real-time datasets.
Sebastien Dery (now a Data Science Engineer at Yewno) discusses his project on community detection on large datasets.
Original link: https://blog.insightdatascience.com/graph-based-machine-learning-6e2bd8926a0
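As a generic illustration of community detection (not the scalable pipeline described in the post), the short networkx example below finds communities in a toy graph with the greedy modularity algorithm; the graph and node names are made up for the example.

```python
# A small, self-contained illustration of graph-based community detection,
# using networkx's greedy modularity algorithm on a toy graph.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy graph: two tightly knit clusters joined by a single weak bridge.
G = nx.Graph()
G.add_edges_from([
    ("a", "b"), ("b", "c"), ("a", "c"),  # cluster 1
    ("d", "e"), ("e", "f"), ("d", "f"),  # cluster 2
    ("c", "d"),                          # bridge between the clusters
])

communities = greedy_modularity_communities(G)
for i, nodes in enumerate(communities):
    print(f"community {i}: {sorted(nodes)}")
```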
4. [Blog] Deep Learning the Stock Market
Introduction:

In the past few months I’ve been fascinated with “Deep Learning”, especially its applications to language and text. I’ve spent the bulk of my career in financial technologies, mostly in algorithmic trading and alternative data services. You can see where this is going.
I wrote this to get my ideas straight in my head. While I’ve become a “Deep Learning” enthusiast, I don’t have too many opportunities to brain dump an idea in most of its messy glory. I think that a decent indication of a clear thought is the ability to articulate it to people not from the field. I hope that I’ve succeeded in doing that and that my articulation is also a pleasurable read.
Original link: https://medium.com/@TalPerry/deep-learning-the-stock-market-df853d139e02
5. [Paper] Deep Learning in Trading
Introduction:
Current state of the art
LSTM is the holy grail of sequence prediction.
A major part of financial modelling is sequence prediction - whether that's volatility models, volume models, or the toughest one of all: return prediction models. The underlying task in such problems is: given a sequence of values, can we predict the next number in the sequence?
LSTM models naturally fit these criteria because of their recursive nature. Additionally, the hidden state and the memory cell tremendously help retain the useful features of the sequence. Feature engineering is a thing of the past in the era of neural networks.
Neural networks are really good at coming up with features on their own. A number of people in finance work day-in, day-out on coming up with features. Neural nets are poised to take over this segment of the market. Neural networks also provide an easy way to combine market data and other data sources.
Since neural nets work in a latent space, it's super easy to combine your market data input with other data sources you might have. That can be anything from sentiment analysis and summaries of SEC filings to visual or audio inputs. Additionally, neural networks make it easy to do multivariate modelling where there are a lot of relationships between inputs and the data has a time-varying nature. It's important to understand when neural networks do not work: they don't work if you don't have enough data.
Small datasets are bottlenecks when it comes to convergence. Large datasets come with computation problems.
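To make the "predict the next number in the sequence" setup concrete, here is a minimal, hypothetical PyTorch sketch that trains a single-layer LSTM on a synthetic sine wave. The window size, hidden size, and training length are arbitrary illustrative choices, not recommendations for real market data.

```python
# A minimal sketch of next-value prediction with an LSTM on synthetic data.
import torch
import torch.nn as nn

class NextValueLSTM(nn.Module):
    def __init__(self, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                 # x: (batch, seq_len, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])   # predict from the last hidden state

# Build (window -> next value) pairs from a synthetic sine-wave series.
series = torch.sin(torch.linspace(0, 20, 400))
window = 30
X = torch.stack([series[i:i + window] for i in range(len(series) - window)]).unsqueeze(-1)
y = series[window:].unsqueeze(-1)

model = NextValueLSTM()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

print("final training loss:", loss.item())
```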



