Input: a sentence
Output: a host of language-processing predictions: part-of-speech tags, chunks, named entity tags, semantic roles, semantically similar words, and the likelihood of the sentence under a language model
The network is trained jointly on all these tasks using weight-sharing, an instance of multitask learning (using labeled data).
A single deep neural network is trained jointly on all of these tasks.
Semi-supervised learning is used for the shared tasks (using unlabeled text) ---> this is where the language model comes in.
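The language model is the piece that exploits unlabeled text: it only has to tell real text from corrupted text, so no annotation is needed. Below is a minimal sketch of one way to train it, with a pairwise ranking loss that scores a genuine window above the same window whose middle word has been replaced by a random word (PyTorch assumed; WindowScorer, ranking_loss, and all sizes are illustrative, not the paper's exact configuration).

```python
import torch
import torch.nn as nn

class WindowScorer(nn.Module):
    """Scores a window of word ids: high for real text, low for corrupted text."""
    def __init__(self, vocab_size, embed_dim, window, hidden_dim=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.score = nn.Sequential(nn.Linear(window * embed_dim, hidden_dim),
                                   nn.Tanh(), nn.Linear(hidden_dim, 1))

    def forward(self, word_ids):                      # (batch, window)
        return self.score(self.embed(word_ids).flatten(1)).squeeze(1)

def ranking_loss(scorer, real_windows, vocab_size):
    """Replace the middle word with a random one and ask for a margin of 1
    between the score of the real window and the corrupted one."""
    corrupted = real_windows.clone()
    mid = real_windows.size(1) // 2
    corrupted[:, mid] = torch.randint(0, vocab_size, (real_windows.size(0),))
    return torch.clamp(1 - scorer(real_windows) + scorer(corrupted), min=0).mean()

scorer = WindowScorer(vocab_size=30000, embed_dim=50, window=5)
fake_batch = torch.randint(0, 30000, (8, 5))   # stands in for windows of real text
print(ranking_loss(scorer, fake_batch, vocab_size=30000))
```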
Natural language processing tasks:
Part-of-speech tagging; chunking; named entity recognition; semantic role labeling; language modeling; semantically related words.
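To make the weight-sharing concrete, here is a minimal sketch (PyTorch assumed; SharedWordEncoder, TaskHead, the two example tasks and all layer sizes are illustrative, not the paper's actual setup): every task looks words up in the same embedding table, keeps its own output layers on top, and joint training simply alternates mini-batches between tasks so that each task's gradients update the shared embeddings.

```python
import torch
import torch.nn as nn

class SharedWordEncoder(nn.Module):
    """Lookup table shared by every task: the word embeddings are the shared weights."""
    def __init__(self, vocab_size, embed_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)

    def forward(self, word_ids):                  # (batch, window)
        return self.embed(word_ids).flatten(1)    # (batch, window * embed_dim)

class TaskHead(nn.Module):
    """Task-specific layers stacked on top of the shared encoder."""
    def __init__(self, in_dim, hidden_dim, num_tags):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.Tanh(),
                                 nn.Linear(hidden_dim, num_tags))

    def forward(self, features):
        return self.net(features)

vocab_size, embed_dim, window = 30000, 50, 5
encoder = SharedWordEncoder(vocab_size, embed_dim)
heads = {"pos": TaskHead(window * embed_dim, 100, 45),     # illustrative tag-set sizes
         "chunk": TaskHead(window * embed_dim, 100, 23)}
params = list(encoder.parameters()) + [p for h in heads.values() for p in h.parameters()]
optimizer = torch.optim.SGD(params, lr=0.01)
loss_fn = nn.CrossEntropyLoss()

def train_step(task, word_ids, labels):
    """One mini-batch for one task; gradients also flow into the shared embeddings."""
    optimizer.zero_grad()
    logits = heads[task](encoder(word_ids))
    loss = loss_fn(logits, labels)
    loss.backward()
    optimizer.step()
    return loss.item()

# e.g. train_step("pos", torch.randint(0, vocab_size, (32, window)), torch.randint(0, 45, (32,)))
```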
The architecture of the deep neural network
In order to handle sequences of variable length, the simplest solution is to use a window approach:
Consider a window of fixed size ksz around each word we want to label. (This is not enough for semantic role labeling, where the relevant verb may fall outside the window.)
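A sketch of the window extraction itself, in plain Python (the padding token and the helper's name are my own, not from the paper):

```python
def windows(tokens, ksz, pad="PADDING"):
    """Return one window of ksz tokens centred on each position of the sentence."""
    assert ksz % 2 == 1, "use an odd ksz so the word to label sits in the middle"
    half = ksz // 2
    padded = [pad] * half + list(tokens) + [pad] * half
    return [padded[i:i + ksz] for i in range(len(tokens))]

print(windows(["the", "cat", "sat"], ksz=3))
# [['PADDING', 'the', 'cat'], ['the', 'cat', 'sat'], ['cat', 'sat', 'PADDING']]
```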
Time-Delay Neural Networks (TDNNs)
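A TDNN convolves across the whole sentence and then takes a max over time, so sentences of any length are mapped to a fixed-size feature vector that the rest of the network can consume; this is what rescues tasks such as semantic role labeling, where the window approach is not enough. A minimal sketch, assuming PyTorch (kernel size and layer widths are illustrative):

```python
import torch
import torch.nn as nn

class TDNN(nn.Module):
    """Convolve over the whole sentence, then max-pool over time, so
    sentences of any length map to a fixed-size feature vector."""
    def __init__(self, embed_dim, feature_dim, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv1d(embed_dim, feature_dim, kernel_size,
                              padding=kernel_size // 2)

    def forward(self, embeddings):            # (batch, seq_len, embed_dim)
        x = embeddings.transpose(1, 2)        # (batch, embed_dim, seq_len)
        x = torch.tanh(self.conv(x))          # (batch, feature_dim, seq_len)
        return x.max(dim=2).values            # (batch, feature_dim)

sent = torch.randn(1, 17, 50)                 # one 17-word sentence, 50-dim embeddings
print(TDNN(50, 100)(sent).shape)              # torch.Size([1, 100])
```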
