
CS224N
lirt15
CS224N notes_chapter2_word2vec
Lecture 2: word2vec. 1. Word meaning: the idea that is represented by a word, phrase, writing, art, etc. How do we get usable meaning in a computer? Common answer: a taxonomy (classification system) like WordNet that has hypernyms... (posted 2019-07-01)
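As context for the WordNet answer the preview mentions, here is a minimal sketch of walking a word's hypernym ("is-a") chain with NLTK's WordNet interface; it assumes `nltk` is installed and downloads the corpus on first run.

```python
# Minimal sketch: climb the hypernym chain for one sense of "panda".
import nltk
nltk.download("wordnet", quiet=True)
from nltk.corpus import wordnet as wn

synset = wn.synsets("panda")[0]      # first sense of "panda"
while synset.hypernyms():
    synset = synset.hypernyms()[0]   # step up the taxonomy
    print(synset.name())             # procyonid -> carnivore -> ... -> entity
```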
CS224N notes_chapter11_Review GRU & LSTM
Lecture 11: Review of GRU & LSTM. The original video also covers some other MT topics, which I skip here. GRU idea: perhaps we could use shortcut connections to keep the model from vanishing gradients -> adaptive shortcut connections ($u_t$). f(h... (posted 2019-07-06)
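Since the preview cuts off mid-formula, here is a minimal NumPy sketch of the GRU step it is describing, with the update gate $u_t$ playing the adaptive-shortcut role; the matrix names are illustrative, not from the notes.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x, h_prev, Wu, Uu, Wr, Ur, Wh, Uh):
    u = sigmoid(Wu @ x + Uu @ h_prev)              # update gate u_t: the adaptive shortcut
    r = sigmoid(Wr @ x + Ur @ h_prev)              # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev))  # candidate hidden state
    return u * h_prev + (1 - u) * h_tilde          # keep old state vs. take new content

rng = np.random.default_rng(0)
d, dh = 4, 3
Wu, Wr, Wh = (rng.normal(size=(dh, d)) for _ in range(3))
Uu, Ur, Uh = (rng.normal(size=(dh, dh)) for _ in range(3))
h = gru_step(rng.normal(size=d), np.zeros(dh), Wu, Uu, Wr, Ur, Wh, Uh)
```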
CS224N notes_chapter9_machine translation & LSTM & GRU
Lecture 9: machine translation & LSTM & GRU. Current statistical machine translation systems. Parallel corpus: lots of sentences translated from one language to another. Source language, e.g. French; target language... (posted 2019-07-05)
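The preview stops before any formulas; as standard background (not quoted from the post), the noisy-channel decision rule behind classical statistical MT is

$$\hat{e} = \arg\max_e \, p(e \mid f) = \arg\max_e \, p(f \mid e)\, p(e)$$

where $f$ is the source (e.g. French) sentence, $e$ the target sentence, $p(f \mid e)$ the translation model estimated from the parallel corpus, and $p(e)$ the target-side language model.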
CS224N notes_chapter8_RNN & LM
Lecture 8: RNN & LM. Language model: a language model computes a probability for a sequence of words, $P(w_1, w_2, \dots, w_T)$. Useful for machine translation. Word ordering: p(the... (posted 2019-07-05)
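For completeness, the textbook chain-rule factorization that an RNN LM models, one conditional per time step (a standard identity, not from the truncated preview):

$$P(w_1, w_2, \dots, w_T) = \prod_{t=1}^{T} P(w_t \mid w_1, \dots, w_{t-1})$$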
CS224N notes_chapter14_Recursive Neural Network
Lecture 14: Tree RNNs and phrase-level syntactic parsing. Language understanding requires being able to understand bigger things from knowing about smaller parts. Language can be represented in a recursive way. For example, noun phrases... (posted 2019-07-11)
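A minimal NumPy sketch of the recursive idea: a parent phrase vector is composed from its two children with a shared weight matrix. The names (`compose`, `W`, `b`) are illustrative, not from the notes.

```python
import numpy as np

def compose(c1, c2, W, b):
    # Parent vector from child vectors: p = tanh(W [c1; c2] + b)
    return np.tanh(W @ np.concatenate([c1, c2]) + b)

d = 4
rng = np.random.default_rng(0)
W, b = rng.normal(size=(d, 2 * d)), np.zeros(d)
the, cat = rng.normal(size=d), rng.normal(size=d)
noun_phrase = compose(the, cat, W, b)  # "the cat" gets a vector of the same size d
```

Because the output has the same dimension as the inputs, the same `compose` can be applied again one level up the tree, which is exactly what makes the representation recursive.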
CS224N notes_chapter6_syntax grammar and dependency parsing
Lecture 6: syntax, grammar, and dependency parsing. # Honestly, I didn't really understand this lecture. 1. Syntactic Structure: Constituency and Dependency. Constituency = phrase structure grammar = context-free grammars (CFGs). Phrase structure organizes... (posted 2019-07-04)
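To make the CFG idea concrete, a minimal sketch using NLTK's grammar and chart parser; the toy grammar itself is mine, not from the lecture.

```python
import nltk

grammar = nltk.CFG.fromstring("""
    S  -> NP VP
    NP -> Det N
    VP -> V NP
    Det -> 'the'
    N  -> 'dog' | 'cat'
    V  -> 'chased'
""")
parser = nltk.ChartParser(grammar)
for tree in parser.parse("the dog chased the cat".split()):
    print(tree)  # (S (NP (Det the) (N dog)) (VP (V chased) (NP (Det the) (N cat))))
```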
CS224N notes_chapter13_CNN
Lecture 13: CNN. From RNN to CNN: an RNN can only capture a phrase given its left-side context. # That is, to get the RNN's output for some input vector, you have to run everything before it through the network first; you can't just pull out one piece. Main CNN idea: compute vectors for every possible phrase. R... (posted 2019-07-10)
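A minimal NumPy sketch of that main idea: one filter slides over every window of $k$ consecutive word vectors, producing one value per candidate phrase independently of the rest of the sentence. Function and variable names are illustrative.

```python
import numpy as np

def conv1d_phrases(X, w, b, k=2):
    # X: (T, d) word vectors; w: (k*d,) filter; one activation per k-word window.
    T = X.shape[0]
    return np.array([np.tanh(w @ X[t:t + k].ravel() + b) for t in range(T - k + 1)])

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))                      # 5 words, 3-dim embeddings
scores = conv1d_phrases(X, rng.normal(size=6), 0.0)
print(scores.shape)                              # (4,): one value per bigram phrase
```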
CS224N notes_chapter5_Backpropagation
Lecture 5: Backpropagation. From a one-layer NN to a multi-layer NN. 2-layer case:
$$\begin{aligned}
x &= z^{(1)} = a^{(1)} \\
z^{(2)} &= W^{(1)} x + b^{(1)} \\
a^{(2)} &= f(z^{(2)}) \\
z^{(3)} &= W^{(2)} a^{(2)} + b^{(2)} \\
a^{(3)} &= f(z^{(3)}) \\
s &= U^T a^{(3)}
\end{aligned}$$
... (posted 2019-07-03)
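A minimal NumPy sketch of this forward pass and backprop through it, assuming $f = \tanh$ and a scalar score $s$; the shapes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d, h = 3, 4
x = rng.normal(size=d)
W1, b1 = rng.normal(size=(h, d)), np.zeros(h)
W2, b2 = rng.normal(size=(h, h)), np.zeros(h)
U = rng.normal(size=h)

# Forward: z2 = W1 x + b1; a2 = f(z2); z3 = W2 a2 + b2; a3 = f(z3); s = U^T a3
z2 = W1 @ x + b1; a2 = np.tanh(z2)
z3 = W2 @ a2 + b2; a3 = np.tanh(z3)
s = U @ a3

# Backward: deltas flow from s back toward the input (ds/ds = 1, tanh' = 1 - tanh^2).
delta3 = U * (1 - a3**2)                 # ds/dz3
grad_W2 = np.outer(delta3, a2)           # ds/dW2
delta2 = (W2.T @ delta3) * (1 - a2**2)   # ds/dz2, reusing delta3
grad_W1 = np.outer(delta2, x)            # ds/dW1
```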
CS224N notes_chapter4_Word window classification and Neural Network
Lecture 4: word window classification and neural networks. Classification background. Notation: input $x_i$, words/context windows/sentences/docs, etc.; output $y_i$, labels such as sentiment/NER/other words, etc.; $i = 1, 2, \dots, N$. # For convenience, in what follows I... (posted 2019-07-02)
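A minimal sketch of window classification: concatenate the word vectors in a context window into one input $x_i$ and apply a softmax classifier over labels $y_i$. All names and sizes here are illustrative.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def classify_window(E, window_ids, W, b):
    # E: (V, d) embedding matrix; window_ids: indices of the words in the window.
    x = E[window_ids].ravel()    # x_i: concatenated window vector
    return softmax(W @ x + b)    # distribution over labels y_i (e.g. NER tags)

rng = np.random.default_rng(0)
E = rng.normal(size=(10, 4))                      # toy vocabulary of 10 words
W, b = rng.normal(size=(3, 3 * 4)), np.zeros(3)   # window size 3, 3 labels
print(classify_window(E, [2, 5, 7], W, b))        # probabilities summing to 1
```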
CS224N notes_chapter3_Deeper Look at Word Vectors
Lecture 3: Deeper Look at Word Vectors. Negative Sampling. First, we need to review the skip-gram model: $p(w_{t+j} \mid w_t) = \frac{\exp(u_o^T v_c)}{\sum_{w=1}^{V} \exp(u_w^T v_c)}$... (posted 2019-07-01)
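Since the preview cuts off here, for reference this is the standard negative-sampling objective that replaces that expensive softmax (the textbook formulation, not quoted from the post): maximize

$$J_t(\theta) = \log \sigma(u_o^T v_c) + \sum_{k=1}^{K} \mathbb{E}_{w_k \sim P_n(w)} \left[ \log \sigma(-u_{w_k}^T v_c) \right]$$

i.e. push up the score of the observed (center, outside) pair and push down the scores of $K$ sampled "negative" words, avoiding the sum over the whole vocabulary $V$.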
CS224N notes_chapter15_Coreference Resolution
Lecture 15: Coreference Resolution. Idea: identify all noun phrases that refer # In plain terms: figure out whom each noun phrase refers to. For example: John loves his wife. He prepares breakfast for her every day. We know that "his" and "He" both co-refer to John. Noun p... (posted 2019-07-11)
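A minimal sketch of the mention-pair view of coreference: consider every pair of mentions and link the pairs a scoring function accepts. The toy pronoun/string-match rule below is purely illustrative and far weaker than a real resolver.

```python
def corefer(m1, m2):
    # Toy rule: a pronoun may co-refer with any preceding mention;
    # otherwise require an exact string match.
    pronouns = {"he", "his", "him", "she", "her", "they"}
    return m2.lower() in pronouns or m1.lower() == m2.lower()

mentions = ["John", "his", "wife", "He", "her"]  # from "John loves his wife. He ..."
links = [(a, b) for i, a in enumerate(mentions)
         for b in mentions[i + 1:] if corefer(a, b)]
print(links)  # candidate co-referring pairs, e.g. ('John', 'his'), ('John', 'He')
```

A real system would replace `corefer` with a learned pairwise score and then cluster the proposed links into entities.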