
big code: Big Data from Real-World Programming
big code: code generation, code completion, and other related tasks and papers
大黄老鼠
big code: Code Completion with Neural Attention and Pointer Networks — Source Code Walkthrough
Code and data: the source code for this paper is at https://github.com/jack57lee/neuralCodeCompletion; the dataset is at http://plml.ethz.ch/; the preprocessed data is at https://drive.google.com/open?id=1EZZuL8Rl3tatvxpIClvO_a8JD_Oid_oY. Vanilla LSTM: we start by analyzing the simplest model, the vanilla LSTM; its code is at https://github.com/jack57lee/neuralCodeCompletion (Original post, 2020-10-14)
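As background for the vanilla-LSTM walkthrough above, here is a minimal sketch of a single LSTM step in plain Python (the actual repo uses a framework implementation; the scalar hidden size and the tiny weight values below are made up purely for illustration).

```python
# One vanilla-LSTM recurrence step with scalar input/state (hidden size 1).
# Illustrative only: real models use vector states and learned weights.
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W):
    """Compute one LSTM cell step from input x and previous (h, c)."""
    i = sigmoid(W["wi"][0] * x + W["wi"][1] * h_prev + W["bi"])    # input gate
    f = sigmoid(W["wf"][0] * x + W["wf"][1] * h_prev + W["bf"])    # forget gate
    o = sigmoid(W["wo"][0] * x + W["wo"][1] * h_prev + W["bo"])    # output gate
    g = math.tanh(W["wg"][0] * x + W["wg"][1] * h_prev + W["bg"])  # candidate
    c = f * c_prev + i * g        # new cell state
    h = o * math.tanh(c)          # new hidden state
    return h, c

# Made-up weights, just so the loop runs.
W = {"wi": (0.5, 0.1), "bi": 0.0, "wf": (0.3, 0.2), "bf": 1.0,
     "wo": (0.4, 0.1), "bo": 0.0, "wg": (0.6, 0.3), "bg": 0.0}

h, c = 0.0, 0.0
for x in [1.0, 0.5, -0.2]:        # a toy input sequence
    h, c = lstm_step(x, h, c, W)
print(h, c)
```

For code completion, each x would be the embedding of the current token and h would feed a softmax over the next-token vocabulary.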
big code: Code Completion with Neural Attention and Pointer Networks [IJCAI 2018] [CCF Class-A conference]
Paper: Code Completion with Neural Attention and Pointer Networks. Author: Jian Li. Affiliation: The Chinese University of Hong Kong. Venue: IJCAI 2018. Note: IJCAI is a CCF-recommended Class-A international conference (artificial intelligence); its full name is the International Joint Conference on Artificial Intelligence. Data: the snippet x = 7 / print x+1 is represented as a JSON AST: [{"type… (Original post, 2020-05-26)
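To make the JSON-AST data format above concrete, here is a hedged sketch of flattening a snippet into a list of {"type": ...} nodes using only Python's stdlib ast module. The exact field names and node ordering in the paper's dataset may differ; this only illustrates the idea (the paper's print x+1 is Python 2 syntax, so the sketch parses x = 7 alone).

```python
# Flatten a Python snippet into a list of {"type": ...} node records,
# loosely mirroring the JSON-AST format used by AST-based completion datasets.
import ast
import json

def flatten_ast(code):
    nodes = []
    for node in ast.walk(ast.parse(code)):
        entry = {"type": type(node).__name__}
        # Leaves carry a value: constants keep their literal, names their id.
        if isinstance(node, ast.Constant):
            entry["value"] = node.value
        elif isinstance(node, ast.Name):
            entry["value"] = node.id
        nodes.append(entry)
    return nodes

print(json.dumps(flatten_ast("x = 7")))
```

A completion model then predicts the next (type, value) pair given the flattened prefix.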
big code: Open Vocabulary Learning on Source Code with a Graph–Structured Cache [ICLR 2019]
Paper: Open Vocabulary Learning on Source Code with a Graph-Structured Cache. Author: Milan Cvitkovic. Affiliation: California Institute of Technology. Venue: ICLR 2019. By the same author as Deep Learning On Code with an Unbounded Vocabulary [EasyChair 2018], with similar content; it can be regarded as the formal version. The model adds quite a bit compared with the earlier paper… (Original post, 2020-05-23)
big code: Deep Learning On Code with an Unbounded Vocabulary [EasyChair 2018]
Paper: Deep Learning On Code with an Unbounded Vocabulary. Author: Milan Cvitkovic. Affiliation: California Institute of Technology (Caltech), Amazon AI. Venue: EasyChair 2018. Model: convert the source code to an AST; add various kinds of edges on top of the AST, such as data-flow and control-flow edges (the paper's focus); add edges between variable nodes and their subtokens; train with a GGNN. Results: FILL-IN-THE-BLANK, Fixed… (Original post, 2020-05-21)
big code: Tasks, Datasets, Websites, and Other Resources (continuously updated)
Fill-In-The-Blank: blank out one occurrence of a variable in a code snippet and predict its name. The red part is removed while the green part is kept (the colors refer to a figure in the original post), and the model must predict the variable n. Related paper: Deep Learning On Code with an Unbounded Vocabulary [EasyChair 2018]. Variable Naming: blank out every occurrence of one variable name in a snippet and generate that name. The green parts are removed, and the model must predict that the variable name is expectedsLength. Related paper: Deep Learning O… (Original post, 2020-05-21)
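The Fill-In-The-Blank setup above can be sketched as a small data-preparation step: pick one occurrence of a variable, replace it with a blank token, and keep the name as the prediction target. The helper name make_fitb_example and the <BLANK> token are illustrative only, not from any paper's actual pipeline.

```python
# Build a Fill-In-The-Blank training example: mask one standalone occurrence
# of a variable and return (masked code, target name).
import re

def make_fitb_example(code, var, occurrence=0):
    """Replace the `occurrence`-th whole-word use of `var` with <BLANK>."""
    spans = [m.span() for m in re.finditer(rf"\b{re.escape(var)}\b", code)]
    start, end = spans[occurrence]
    return code[:start] + "<BLANK>" + code[end:], var

code = "n = len(xs)\nfor i in range(n):\n    print(xs[i])"
masked, target = make_fitb_example(code, "n", occurrence=1)
print(masked)
print("target:", target)
```

Variable Naming differs only in masking every occurrence of the name instead of one.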
big code: A Brief History of Code Completion/Suggestion
Models: statistical language models (n-gram and RNN): Code Completion with Statistical Language Models [ACM SIGPLAN Notices 2014]; RNN: Toward Deep Learning Software Repositories [MSR 2015]; decision trees: Probabilistic Model for Code with Decision Trees [OOPSLA 2016]; LSTM: Neural Code Comp… (Original post, 2020-05-19)
big code: Toward Deep Learning Software Repositories [MSR 2015]
Paper: Toward Deep Learning Software Repositories. Author: Martin White. Affiliation: College of William & Mary. Venue: MSR 2015. Model: a language model $p(s) = \prod_{i=1}^{m} p(w_i \mid w_1^{i-1}) \approx \prod_{i=1}^{m} p(w_i \mid w_{i-n+1}^{i-1})$… (Original post, 2020-05-16)
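The n-gram factorization in the summary above can be computed directly from corpus counts. A minimal sketch, assuming maximum-likelihood estimates with no smoothing (so every history in the query must appear in the corpus):

```python
# Estimate p(w_i | w_{i-n+1}^{i-1}) from counts and multiply over a sequence,
# i.e. the n-gram approximation of the language-model probability p(s).
from collections import Counter

def ngram_prob(tokens, corpus, n=2):
    # Counts of (n-1)-token histories and of full n-grams.
    ctx = Counter(tuple(corpus[i:i + n - 1]) for i in range(len(corpus) - n + 2))
    full = Counter(tuple(corpus[i:i + n]) for i in range(len(corpus) - n + 1))
    p = 1.0
    for i in range(n - 1, len(tokens)):
        history = tuple(tokens[i - n + 1:i])
        # No smoothing: raises ZeroDivisionError for unseen histories.
        p *= full[history + (tokens[i],)] / ctx[history]
    return p

corpus = "x = 7 print x + 1".split()
prob = ngram_prob(["x", "=", "7"], corpus, n=2)
print(prob)
```

Here p(= | x) = 1/2 (the token x appears twice, once followed by =) and p(7 | =) = 1, so the sequence probability is 0.5.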
big code: Learning python code suggestion with a sparse pointer network [ICLR 2017]
Paper: Learning Python Code Suggestion with a Sparse Pointer Network. Author: Avishkar Bhoopchand. Affiliation: University College London. Venue: ICLR 2017. Model: a neural language model over a code token sequence $S = a_1, \ldots, a_N$… (Original post, 2020-05-15)
big code: Neural Code Completion [ICLR 2017]
Paper: Neural Code Completion. Authors: Chang Liu, Xin Wang. Affiliation: University of California, Berkeley. Venue: ICLR 2017 (oddly, OpenReview shows it as rejected). Model: node embedding $E_i = A N_i + B T_i$; LSTM gates $\begin{pmatrix} q \\ f \\ o \\ g \end{pmatrix} = \begin{pmatrix} \sigma \\ \sigma \\ \sigma \\ \tanh \end{pmatrix} P_{J,2J} \begin{pmatrix} x_i \\ h_{i-1} \end{pmatrix}$… (Original post, 2020-05-14)
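In the embedding equation above, $N_i$ and $T_i$ are one-hot indicators of a node's nonterminal and terminal symbol, so $E_i = A N_i + B T_i$ reduces to adding one column of $A$ to one column of $B$. A minimal sketch with made-up matrix sizes and values (not the paper's actual dimensions):

```python
# E_i = A*N_i + B*T_i with one-hot N_i, T_i: select column n_idx of A
# and column t_idx of B, and add them elementwise.
def embed(A, B, n_idx, t_idx):
    dim = len(A)  # embedding dimension (number of rows)
    return [A[d][n_idx] + B[d][t_idx] for d in range(dim)]

A = [[0.1, 0.2],                    # 2 x (num nonterminal types = 2)
     [0.3, 0.4]]
B = [[1.0, 2.0, 3.0],               # 2 x (num terminal types = 3)
     [4.0, 5.0, 6.0]]

E = embed(A, B, n_idx=1, t_idx=2)   # nonterminal 1, terminal 2
print(E)
```

The resulting vector E then serves as the input x_i to the LSTM recurrence in the gate equation.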
big code: Reproducing the code2seq Paper — Generating Sequences from Structured Representations of Code
This reproduction is actually someone else's PyTorch implementation, hosted on GitHub. code2seq reproduction data: test|reset test,Nm0|MarkerExpr|Mth|Void1,void test,Nm0|MarkerExpr|Mth|Nm2,METHOD_NAME void,Void1|Mth|Nm2,METHOD_NAME — the data is stored one example per line, with items separated by spaces. The first item, test|reset, is the method name, split by vertical bars |… (Original post, 2020-04-30)
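The line format described above can be parsed in a few lines: split on spaces, treat the first item as the pipe-separated method-name subtokens, and each remaining item as a comma-separated (terminal, AST path, terminal) context. The function name parse_example is illustrative, not from the reproduction's actual code.

```python
# Parse one line of the code2seq data format: "target ctx ctx ctx ..."
# where target = subtok|subtok and ctx = terminal,node|node|...,terminal.
def parse_example(line):
    items = line.split(" ")
    target = items[0].split("|")                       # method-name subtokens
    contexts = [tuple(item.split(",")) for item in items[1:]]
    return target, contexts

line = ("test|reset test,Nm0|MarkerExpr|Mth|Void1,void "
        "test,Nm0|MarkerExpr|Mth|Nm2,METHOD_NAME void,Void1|Mth|Nm2,METHOD_NAME")
target, contexts = parse_example(line)
print(target)       # the subtokenized method name
print(contexts[0])  # first (terminal, path, terminal) triple
```

Note that only the top-level split is on commas; the path in the middle of each context keeps its internal | separators, which are later split again into individual AST nodes.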
big code: code2seq Generating Sequences from Structured Representations of Code
code2seq paper: code2seq: Generating Sequences from Structured Representations of Code. Author: Alon. Affiliation: Technion, Israel Institute of Technology. Venue: ICLR 2019. Title/abstract/introduction/conclusion: seq2seq in NMT has achieved… (Original post, 2020-04-08)