Facilitating the process of adjusting parameters with easy.py

Running easy.py on Linux

reference link

# Count how many samples in a1a.t are labeled "+1" and how many are labeled "-1".
count_total = 0
count_pos = 0
with open("a1a.t", "r", encoding="utf-8", errors="ignore") as f:
    for line in f:
        count_total += 1
        if line.split(' ')[0] == "+1":
            count_pos += 1
print("count of -1 labels = %d" % (count_total - count_pos))
print("count of +1 labels = %d" % count_pos)

The code above reports how many samples in a1a.t are labeled "+1" and how many are labeled "-1".
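
A shorter equivalent, assuming the same file format (the label is the first whitespace-separated token on each line), is a one-pass count with collections.Counter:

# Same counts as above, collected in one pass with collections.Counter.
from collections import Counter

with open("a1a.t", "r", encoding="utf-8", errors="ignore") as f:
    label_counts = Counter(line.split(' ', 1)[0] for line in f if line.strip())
print(label_counts)   # e.g. Counter({'-1': ..., '+1': ...})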

# Count the predicted labels in a1a.t.predict (svm-predict writes one label per line: "1" or "-1").
with open("a1a.t.predict", "r", encoding="utf-8", errors="ignore") as f:
    lines = f.readlines()
count_total = len(lines)            # 30956 predictions for the a1a.t test set
count_pos = lines.count('1\n')      # predictions of class +1
print("predicted -1 = {}".format(count_total - count_pos))
print("predicted +1 = %d" % count_pos)

This gives the number of test samples predicted as each class.
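
Going one step further, the true labels and the predictions can be compared directly. A minimal sketch, assuming a1a.t and a1a.t.predict sit in the current directory and have matching line order:

# Recompute the accuracy that svm-predict reports by comparing the two files line by line.
# a1a.t stores labels as "+1"/"-1"; svm-predict writes them as "1"/"-1".
with open("a1a.t", "r", encoding="utf-8", errors="ignore") as f:
    y_true = [int(line.split(' ')[0]) for line in f if line.strip()]
with open("a1a.t.predict", "r", encoding="utf-8", errors="ignore") as f:
    y_pred = [int(line) for line in f if line.strip()]

correct = sum(t == p for t, p in zip(y_true, y_pred))
print("Accuracy = %.4f%% (%d/%d)" % (100.0 * correct / len(y_true), correct, len(y_true)))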

step_1: download the two dataset files svmguide1 and svmguide1.t into the directory that contains easy.py, like the following:

[screenshot: directory listing with easy.py, svmguide1 and svmguide1.t]
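
If you prefer to fetch the files from a script, here is a minimal download sketch; the URLs below are my assumption of the usual locations on the LIBSVM datasets page, so adjust them if the page has moved:

# Download svmguide1 and svmguide1.t into the current directory (next to easy.py).
import urllib.request

BASE = "https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/binary/"
for name in ("svmguide1", "svmguide1.t"):
    urllib.request.urlretrieve(BASE + name, name)
    print("downloaded", name)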

step_2: run the command python3 easy.py svmguide1 svmguide1.t; I got the following error:

[screenshot: error output from easy.py]

The fix for the problem above is sudo apt install python-is-python3, but the following error still appeared:

[screenshot: second error output]

This was quite mysterious to me. After I restarted the computer and ran python easy.py a1a a1a.t, it worked:

python easy.py a1a a1a.t
Scaling training data...
WARNING: original #nonzeros 22249
       > new      #nonzeros 181365
If feature values are non-negative and sparse, use -l 0 rather than the default -l -1
Cross validation...
Best c=512.0, g=3.0517578125e-05 CV rate=83.3022
Training...
Output model: a1a.model
Scaling testing data...
WARNING: feature index 12 appeared in file a1a.t was not seen in the scaling factor file a1a.range. The feature is scaled to 0.
WARNING: feature index 60 appeared in file a1a.t was not seen in the scaling factor file a1a.range. The feature is scaled to 0.
WARNING: feature index 89 appeared in file a1a.t was not seen in the scaling factor file a1a.range. The feature is scaled to 0.
WARNING: feature index 96 appeared in file a1a.t was not seen in the scaling factor file a1a.range. The feature is scaled to 0.
WARNING: feature index 111 appeared in file a1a.t was not seen in the scaling factor file a1a.range. The feature is scaled to 0.
WARNING: feature index 116 appeared in file a1a.t was not seen in the scaling factor file a1a.range. The feature is scaled to 0.
WARNING: feature index 120 appeared in file a1a.t was not seen in the scaling factor file a1a.range. The feature is scaled to 0.
WARNING: feature index 121 appeared in file a1a.t was not seen in the scaling factor file a1a.range. The feature is scaled to 0.
WARNING: feature index 122 appeared in file a1a.t was not seen in the scaling factor file a1a.range. The feature is scaled to 0.
WARNING: feature index 123 appeared in file a1a.t was not seen in the scaling factor file a1a.range. The feature is scaled to 0.
WARNING: original #nonzeros 429343
       > new      #nonzeros 3498028
If feature values are non-negative and sparse, use -l 0 rather than the default -l -1
Testing...
Accuracy = 84.3358% (26107/30956) (classification)
Output prediction: a1a.t.predict
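
As a side note on the "-l 0" warnings above: easy.py chains svm-scale, a grid search, svm-train and svm-predict for you, and the scaling step can be redone by hand with -l 0. The following is a rough sketch of that pipeline, assuming the LIBSVM binaries are on the PATH and reusing the c and g values found above:

# Rerun the pipeline manually, scaling to [0, 1] (-l 0) as suggested for non-negative sparse features.
import subprocess

def run(cmd, out_file=None):
    # Run one LIBSVM command, optionally redirecting stdout to a file (svm-scale writes to stdout).
    if out_file is None:
        subprocess.run(cmd, check=True)
    else:
        with open(out_file, "w") as out:
            subprocess.run(cmd, stdout=out, check=True)

run(["svm-scale", "-l", "0", "-s", "a1a.range", "a1a"], "a1a.scale")   # scale training data
run(["svm-scale", "-r", "a1a.range", "a1a.t"], "a1a.t.scale")          # scale test data with the same factors
run(["svm-train", "-c", "512", "-g", "3.0517578125e-05", "a1a.scale", "a1a.model"])
run(["svm-predict", "a1a.t.scale", "a1a.model", "a1a.t.predict"])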

Running easy.py on Windows

  1. You can read this blog to see how to use LIBSVM on Windows.
  2. Here is a mistake I made when using easy.py.
  3. "gnuplot executable not found": I have to say this is a tricky one. Even after I had installed gnuplot on the computer, the error did not go away. It turned out that the script's default path points to pgnuplot.exe instead of gnuplot.exe (see the sketch after this list for the line to edit).
  4. After that, easy.py ran, but the execution time was so long that I could not tell what the problem was, so I planned to let it run overnight.
  5. Next I show the result of grid.py, which searches for suitable parameters for my dataset.
  6. I ran the command python grid.py ..\heart_scale and the following is the result:
    [screenshot: grid.py output]
  7. It is not very fast, so all I can do is wait for it.
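
For item 3, the fix is to edit the executable paths near the top of easy.py (and grid.py). The variable names below match the LIBSVM tools scripts as I remember them, and the gnuplot path is only an example for a local install, so treat this as a sketch and check your own copy:

# Windows section near the top of easy.py: point gnuplot_exe at the installed gnuplot.exe
# instead of the default pgnuplot.exe (paths are examples; adjust to your machine).
svmscale_exe = r"..\windows\svm-scale.exe"
svmtrain_exe = r"..\windows\svm-train.exe"
svmpredict_exe = r"..\windows\svm-predict.exe"
gnuplot_exe = r"C:\Program Files\gnuplot\bin\gnuplot.exe"
grid_py = r".\grid.py"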

The following is the content of my ML assignment.

  1. I ran svm-train a1a and got the following results:
*
optimization finished, #iter = 537
nu = 0.460270
obj = -673.031415, rho = 0.628337
nSV = 754, nBSV = 722
Total nSV = 754
  2. I ran .\svm-predict .\a1a.t .\a1a.model a1a.t.predict and got the following accuracy:
Accuracy = 83.5864% (25875/30956) (classification)
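
The same train/predict pair can also be reproduced from Python. A minimal sketch, assuming the LIBSVM Python bindings are available (for example via the libsvm-official package on pip, or the python/ directory bundled with the LIBSVM source, both of which provide svmutil):

# Train on a1a with default parameters (like plain svm-train) and evaluate on a1a.t.
from libsvm.svmutil import svm_read_problem, svm_train, svm_predict

y_train, x_train = svm_read_problem("a1a")
y_test, x_test = svm_read_problem("a1a.t")

model = svm_train(y_train, x_train)                      # defaults: C-SVC, RBF kernel, C = 1
p_labels, p_acc, p_vals = svm_predict(y_test, x_test, model)
# p_acc[0] is the accuracy and should match the value reported by svm-predict above.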
