Text classification-FastText

1. Getting and preparing the data

Each line of the file contains the label(s), followed by the corresponding sentence:

>> head cooking.stackexchange.txt

__label__sauce __label__cheese How much does potato starch affect a cheese sauce recipe?
__label__food-safety __label__acidity Dangerous pathogens capable of growing in acidic environments
__label__cast-iron __label__stove How do I cover up the white spots on my cast iron stove?
__label__restaurant Michelin Three Star Restaurant; but if the chef is not there
__label__knife-skills __label__dicing Without knife skills, how can I quickly and accurately dice vegetables?

Before training, split the data into a training set and a validation set (about 4:1):

>> wc cooking.stackexchange.txt

  15404  169582 1401900 cooking.stackexchange.txt

>> head -n 12404 cooking.stackexchange.txt > cooking.train

>> tail -n 3000 cooking.stackexchange.txt > cooking.valid

2. Our first classifier

Train the classifier:

>> ./fasttext supervised -input cooking.train -output model_cooking

Read 0M words

Number of words:  14543

Number of labels: 735

Progress: 100.0% words/sec/thread:  90012 lr:  0.000000 loss: 10.222594 ETA:  0h 0m

-input specifies the training data file.

-output specifies where to save the model.

At the end of training, a model_cooking.bin file containing the trained classifier is saved.
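If you prefer working from Python, the same training step can be run through the fastText Python bindings (the fasttext pip package); a minimal sketch, assuming cooking.train is in the working directory:

import fasttext

# Train a supervised classifier; equivalent to:
#   ./fasttext supervised -input cooking.train -output model_cooking
model = fasttext.train_supervised(input="cooking.train")

# Save the trained classifier so it can be reloaded later
model.save_model("model_cooking.bin")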


Test the classifier interactively:

>> ./fasttext predict model_cooking.bin -

>> Which baking dish is best to bake a banana bread ?

__label__baking

>> Why not put knives in the dishwasher?

__label__food-safety
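For reference, the same queries can be issued from Python against the saved model; a small sketch (the predicted labels depend on how the model was trained, the values below just mirror the session above):

import fasttext

model = fasttext.load_model("model_cooking.bin")

# predict() returns a tuple of (labels, probabilities)
labels, probs = model.predict("Which baking dish is best to bake a banana bread ?")
print(labels)  # e.g. ('__label__baking',)

labels, probs = model.predict("Why not put knives in the dishwasher?")
print(labels)  # e.g. ('__label__food-safety',)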

Evaluate the model on the validation set:

>> ./fasttext test model_cooking.bin cooking.valid

N 3000

P@1 0.138

R@1 0.0595

Number of examples: 3000

P@1 is the precision at one and R@1 is the recall at one.

To compute the precision and recall at five (P@5 and R@5), pass 5 as an extra argument:

>> ./fasttext test model_cooking.bin cooking.valid 5

N 3000

P@5 0.0677

R@5 0.146

Number of examples: 3000
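In Python, the same evaluation is a single call: test() returns the number of examples, the precision at k, and the recall at k. A minimal sketch, assuming model_cooking.bin and cooking.valid exist:

import fasttext

model = fasttext.load_model("model_cooking.bin")

# Precision and recall at one (the default, k=1)
n, p_at_1, r_at_1 = model.test("cooking.valid")

# Precision and recall at five
n, p_at_5, r_at_5 = model.test("cooking.valid", k=5)
print(n, p_at_5, r_at_5)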

3. Advanced readers: precision and recall

The precision is the number of correct labels among the labels predicted by fastText.

The recall is the number of labels that successfully were predicted, among all the real labels.
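To make these definitions concrete, here is a tiny helper that computes precision and recall at k for a single example; the label sets below are made up purely for illustration:

def precision_recall_at_k(predicted, true_labels, k):
    # predicted: labels ranked by the model; true_labels: the gold labels
    top_k = predicted[:k]
    correct = len(set(top_k) & set(true_labels))
    return correct / k, correct / len(true_labels)

# Hypothetical question with three true labels, where only one of the
# model's top five predictions is correct: P@5 = 0.2, R@5 = 0.33
p, r = precision_recall_at_k(
    ["__label__food-safety", "__label__baking", "__label__equipment",
     "__label__substitutions", "__label__bread"],
    ["__label__equipment", "__label__cleaning", "__label__knives"],
    k=5,
)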

4. Making the model better

4.1 Preprocessing the data

>> cat cooking.stackexchange.txt | sed -e "s/\([.\!?,'/()]\)/ \1 /g" | tr "[:upper:]" "[:lower:]" > cooking.preprocessed.txt

>> head -n 12404 cooking.preprocessed.txt > cooking.train

>> tail -n 3000 cooking.preprocessed.txt > cooking.valid
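If you would rather not depend on sed and tr, the same normalization (putting spaces around punctuation and lowercasing) can be done in Python; a rough equivalent of the pipeline above:

import re

# Roughly mirrors: sed -e "s/\([.\!?,'/()]\)/ \1 /g" | tr "[:upper:]" "[:lower:]"
def preprocess(line):
    line = re.sub(r"([.!?,'/()])", r" \1 ", line)
    return line.lower()

with open("cooking.stackexchange.txt") as src, open("cooking.preprocessed.txt", "w") as dst:
    for line in src:
        dst.write(preprocess(line))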

Train the model on the preprocessed data:

>> ./fasttext supervised -input cooking.train -output model_cooking

Read 0M words

Number of words: 8952

Number of labels: 735

Progress: 100.0% words/sec/thread: 101142   lr: 0.000000   loss: 11.018550   ETA: 0h 0m


>> ./fasttext test model_cooking.bin cooking.valid

N 3000

P@1 0.172

R@1 0.0744

Number of examples: 3000

4.2 More epochs and larger learning rate

By default, fastText trains for only 5 epochs.

Use the -epoch option to change the number of epochs:

>> ./fasttext supervised -input cooking.train -output model_cooking -epoch 25

Read 0M words

Number of words: 8952

Number of labels: 735

Progress: 100.0% words/sec/thread: 92990   lr: 0.000000   loss: 7.257324   ETA: 0h 0m


>> ./fasttext test model_cooking.bin cooking.valid

N 3000

P@1 0.514

R@1 0.222

Number of examples: 3000

Another way to speed up learning is to increase the learning rate with the -lr option:

>> ./fasttext supervised -input cooking.train -output model_cooking -lr 1.0

Read 0M words

Number of words: 8952

Number of labels: 735

Progress: 100.0% words/sec/thread: 91682     lr: 0.000000     loss: 6.346271     ETA: 0h 0m


>> ./fasttext test model_cooking.bin cooking.valid

N 3000

P@1 0.579

R@1 0.25

Number of examples: 3000


4.3 Word n-grams

Train the model with word bigrams:

>> ./fasttext supervised -input cooking.train -output model_cooking -lr 1.0 -epoch 25 -wordNgrams 2

Read 0M words

Number of words: 8952

Number of labels: 735

Progress: 100.0% words/sec/thread: 93126    lr: 0.000000     loss: 3.139972     ETA: 0h 0m

>> ./fasttext test model_cooking.bin cooking.valid

N 3000

P@1 0.61

R@1 0.264

Number of examples: 3000

Several ways to improve the model's accuracy (combined in the Python sketch after this list):

preprocessing the data;

changing the number of epochs (using the option -epoch, standard range [5 - 50]);

changing the learning rate (using the option -lr, standard range [0.1 - 1.0]);

using word n-grams (using the option -wordNgrams, standard range [1 - 5]).
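Putting those options together with the Python bindings (the keyword arguments mirror the CLI flags, and the values are the ones used above):

import fasttext

# Preprocessed training data, 25 epochs, learning rate 1.0, word bigrams
model = fasttext.train_supervised(
    input="cooking.train",
    epoch=25,
    lr=1.0,
    wordNgrams=2,
)
print(model.test("cooking.valid"))  # (N, P@1, R@1)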

5. Scaling things up

Using the hierarchical softmax makes training much faster. This can be done with the option -loss hs:

>> ./fasttext supervised -input cooking.train -output model_cooking -lr 1.0 -epoch 25 -wordNgrams 2 -bucket 200000 -dim 50 -loss hs

Read 0M words

Number of words:  8952

Number of labels: 735

Progress: 100.0% words/sec/thread: 2139399 lr:  0.000000 loss:  2.142308 ETA:  0h 0m
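The equivalent call through the Python bindings, again as a sketch with the same hyperparameters:

import fasttext

# The hierarchical softmax loss ("hs") trades a little accuracy for much faster training
model = fasttext.train_supervised(
    input="cooking.train",
    lr=1.0,
    epoch=25,
    wordNgrams=2,
    bucket=200000,
    dim=50,
    loss="hs",
)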
