About word2vec

Alright, I'm finally starting to learn NLP. Some useful resources:

http://mccormickml.com/2016/04/19/word2vec-tutorial-the-skip-gram-model/

https://cs224d.stanford.edu/lecture_notes/notes1.pdf


[1] Mikolov T, Chen K, Corrado G, et al. Efficient Estimation of Word Representations in Vector Space[J]. arXiv preprint arXiv:1301.3781, 2013. (This paper introduces the two models: CBOW and Skip-gram.)

[2] Mikolov T, Sutskever I, Chen K, et al. Distributed Representations of Words and Phrases and their Compositionality[C]. Advances in Neural Information Processing Systems 26, 2013: 3111-3119. (This paper proposes several improvements addressing the high computational cost of the Skip-gram model.)

[3] Presentation on Word2Vec. (Mikolov's slides from the NIPS 2013 workshop.)
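To make the Skip-gram idea from [1] concrete: each word in a sentence is used as a "center" word that predicts the words within a small window around it, and these (center, context) pairs become the training examples. Below is a minimal sketch of that pair-generation step; the example sentence, window size, and function name are my own illustrative choices, not from the papers.

```python
# Sketch of Skip-gram training-pair generation (illustrative, not the
# original word2vec code): each center word is paired with every word
# within `window` positions on either side of it.

def skip_gram_pairs(tokens, window=2):
    """Yield (center, context) pairs for a list of tokens."""
    pairs = []
    for i, center in enumerate(tokens):
        # Context spans up to `window` words on each side of position i.
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

sentence = "the quick brown fox jumps".split()
pairs = skip_gram_pairs(sentence, window=2)
print(pairs[:4])
# -> [('the', 'quick'), ('the', 'brown'), ('quick', 'the'), ('quick', 'brown')]
```

The model in [1] then learns word vectors by training a shallow network to predict the context word from the center word over all such pairs; [2]'s improvements (e.g. negative sampling) make that prediction step much cheaper.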
