Natural Language Processing: Deep Neural Networks with Multitask Learning

Input: a sentence

Output: a host of language-processing predictions (part-of-speech tags, chunks, and the other tasks listed below), together with a language model.

The network is trained jointly on all of these tasks using weight sharing, an instance of multitask learning (the supervised tasks use labeled data).

A single deep neural network is shared across all of these tasks and trained on them jointly.

Semi-supervised learning is used for one of the shared tasks, the language model, which is trained on unlabeled text.
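A minimal PyTorch sketch of this setup (the framework, the sizes, and the function names here are illustrative assumptions, not the paper's original implementation): a lookup table and hidden layer are shared by every task, each task adds its own output layer, the supervised tasks train on labeled windows, and the language model trains on unlabeled windows with a pairwise ranking loss that scores a real window above a corrupted one.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical sizes; the real vocabulary and tag sets come from the corpora.
VOCAB, EMB, WIN, HID, POS_TAGS, CHUNK_TAGS = 10000, 50, 5, 100, 45, 23

class SharedEncoder(nn.Module):
    """Lookup table + hidden layer shared by every task (the weight sharing)."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMB)      # shared word features
        self.hidden = nn.Linear(WIN * EMB, HID)    # shared hidden layer
    def forward(self, windows):                    # windows: (batch, WIN) word ids
        x = self.embed(windows).view(windows.size(0), -1)
        return torch.tanh(self.hidden(x))

shared = SharedEncoder()
pos_head = nn.Linear(HID, POS_TAGS)     # task-specific output layers
chunk_head = nn.Linear(HID, CHUNK_TAGS)
lm_head = nn.Linear(HID, 1)             # language model: scalar plausibility score

params = list(shared.parameters()) + list(pos_head.parameters()) \
       + list(chunk_head.parameters()) + list(lm_head.parameters())
opt = torch.optim.SGD(params, lr=0.01)

def supervised_step(head, windows, labels):
    """One labeled-data step for a supervised task such as POS tagging."""
    loss = F.cross_entropy(head(shared(windows)), labels)
    opt.zero_grad(); loss.backward(); opt.step()

def language_model_step(real_windows):
    """Semi-supervised step on unlabeled text: rank a real window above a
    corrupted copy whose centre word is replaced by a random word."""
    corrupted = real_windows.clone()
    corrupted[:, WIN // 2] = torch.randint(0, VOCAB, (real_windows.size(0),))
    s_real = lm_head(shared(real_windows))
    s_fake = lm_head(shared(corrupted))
    loss = F.relu(1 - s_real + s_fake).mean()   # pairwise margin ranking loss
    opt.zero_grad(); loss.backward(); opt.step()

# Joint training alternates minibatches between tasks; every update touches the
# shared encoder, which is what the weight sharing above refers to.
pos_batch = torch.randint(0, VOCAB, (32, WIN))
supervised_step(pos_head, pos_batch, torch.randint(0, POS_TAGS, (32,)))
language_model_step(torch.randint(0, VOCAB, (32, WIN)))
```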


Natural language processing tasks:

     Part of speech tagging; Chunking; Named entity recognition; Semantic role labeling; Language model; Semantically related words.

The architecture of the deep neural network

[Figure 1: the deep neural network architecture]


In order to handle sequences of variable length, the simplest solution is to use a window approach:

Consider a window of fixed size ksz around each word we want to label. (This is not sufficient on its own; see the TDNN discussion below.)
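A small sketch of this window extraction (plain Python; the PAD token and the ksz default are assumptions for illustration):

```python
def windows(sentence, ksz=5, pad="<PAD>"):
    """Return one fixed-size window of ksz tokens centred on each word."""
    assert ksz % 2 == 1, "use an odd window so the target word sits in the centre"
    half = ksz // 2
    padded = [pad] * half + list(sentence) + [pad] * half
    return [padded[i:i + ksz] for i in range(len(sentence))]

# windows(["the", "cat", "sat"], ksz=3)
# -> [['<PAD>', 'the', 'cat'], ['the', 'cat', 'sat'], ['cat', 'sat', '<PAD>']]
```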

Time-Delay Neural Networks (TDNNs): a fixed window fails when the word that matters (for example, the predicate in semantic role labeling) falls outside it, so a TDNN instead applies a convolution over the whole sentence and takes a max over time, giving a fixed-size representation for any sentence length.
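A minimal sketch of the TDNN idea (again in PyTorch, as an assumption): a 1-D convolution slides over the whole embedded sentence and a max over time collapses the output to a fixed-size vector, so sentences of different lengths yield features of the same size.

```python
import torch
import torch.nn as nn

EMB, FEAT, KSZ = 50, 100, 5                     # illustrative sizes

embed = nn.Embedding(10000, EMB)
conv = nn.Conv1d(EMB, FEAT, kernel_size=KSZ, padding=KSZ // 2)

def tdnn_features(word_ids):
    """word_ids: LongTensor of shape (sentence_length,). Returns a (FEAT,) vector."""
    x = embed(word_ids).t().unsqueeze(0)        # (1, EMB, sentence_length)
    h = torch.tanh(conv(x))                     # convolution over time (the TDNN)
    return h.max(dim=2).values.squeeze(0)       # max over time -> fixed size

short = tdnn_features(torch.randint(0, 10000, (7,)))
long_ = tdnn_features(torch.randint(0, 10000, (40,)))
assert short.shape == long_.shape == (FEAT,)    # same size regardless of length
```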




