OpenCV Machine Learning: Support Vector Machine (SVM)

OpenCV3 Java machine learning: a usage roundup

import org.opencv.core.Core;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.TermCriteria;
import org.opencv.ml.Ml;
import org.opencv.ml.SVM;
import org.opencv.ml.TrainData;

// Named SvmDemo so the class does not shadow org.opencv.ml.SVM.
public class SvmDemo {

    static {
        // Load the OpenCV native library before any OpenCV call.
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
    }
    public static void run() {
        // Training data: two features per sample, height and weight.
        float[] trainingData = { 186, 80, 185, 81, 160, 50, 161, 48 };
        // Training labels: the first two samples are boys (0), the last two girls (1).
        // Different ML algorithms expect labels in different formats, so the
        // labels are kept in three forms.
        float[] labels = { 0f, 0f, 1f, 1f };
        int[] labels2 = { 0, 0, 1, 1 };
        float[] labels3 = { 0, 0, 1, 1 };
        // Test data: a boy first, then a girl.
        float[] test = { 184, 79, 159, 50 };

        Mat trainingDataMat = new Mat(4, 2, CvType.CV_32FC1);    
        trainingDataMat.put(0, 0, trainingData);       

        Mat labelsMat2 = new Mat(4, 1, CvType.CV_32SC1);    
        labelsMat2.put(0, 0, labels2);     

        Mat sampleMat = new Mat(2, 2, CvType.CV_32FC1);    
        sampleMat.put(0, 0, test);    

        MySvm(trainingDataMat, labelsMat2, sampleMat);
    }   

    // SVM: train a support vector machine and predict on the test data.
    public static Mat MySvm(Mat trainingData, Mat labels, Mat testData) {

        SVM svm = SVM.create();
        // Configure the SVM trainer parameters.
        TermCriteria criteria = new TermCriteria(TermCriteria.EPS + TermCriteria.MAX_ITER, 1000, 0);
        svm.setTermCriteria(criteria); // termination criteria of the iterative training algorithm
        svm.setKernel(SVM.LINEAR);     // use one of the predefined kernels
        svm.setType(SVM.C_SVC);        // type of SVM; the default is SVM.C_SVC
        svm.setGamma(0.5);             // kernel function parameter
        svm.setNu(0.5);                // parameter of the SVM optimization problem
        svm.setC(1);                   // parameter C of the SVM optimization problem

        TrainData td = TrainData.create(trainingData, Ml.ROW_SAMPLE, labels); // wrap the training data
        boolean success = svm.train(td.getSamples(), Ml.ROW_SAMPLE, td.getResponses()); // train the model
        System.out.println("Svm training result: " + success);
        // svm.save(filename); // save the trained model

        // Predict on the test data.
        Mat responseMat = new Mat();
        svm.predict(testData, responseMat, 0);
        System.out.println("SVM responseMat:\n" + responseMat.dump());
        for (int i = 0; i < responseMat.rows(); i++) {
            if (responseMat.get(i, 0)[0] == 0)
                System.out.println("Boy\n");
            if (responseMat.get(i, 0)[0] == 1)
                System.out.println("Girl\n");
        }
        return responseMat;
    }

    public static void main(String[] args) {
        run();
    }
}

Result:

Svm training result: true
SVM responseMat:
[0;
 1]
Boy

Girl

Methods:

SVM.create()

A static method that creates an empty model. Use the train method to train it.

class TermCriteria: officially defined as the class specifying the termination criteria for iterative algorithms.

TermCriteria(int type, int maxCount, double epsilon)

  • type - the type of termination criterion: COUNT, EPS, or COUNT + EPS.
      ◦ TermCriteria.COUNT - the maximum number of iterations or elements to compute
      ◦ TermCriteria.MAX_ITER - same as COUNT
      ◦ TermCriteria.EPS - the desired accuracy or change in parameters at which the iterative algorithm stops
  • maxCount - the maximum number of iterations or elements to compute
  • epsilon - the required accuracy

Default: TermCriteria(TermCriteria.MAX_ITER + TermCriteria.EPS, 1000, FLT_EPSILON)

FLT_EPSILON is the value given in the API documentation, but it is a C/C++ constant: the smallest difference a single-precision float can distinguish. Java has no constant of that name.
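Java has no FLT_EPSILON, but Math.ulp(1.0f) yields the same value. A minimal sketch (the TermCriteria line is left as a comment since it needs the OpenCV classes on the classpath):

```java
public class FltEpsilonDemo {
    public static void main(String[] args) {
        // C's FLT_EPSILON is the gap between 1.0f and the next
        // representable float; in Java that is Math.ulp(1.0f).
        float fltEpsilon = Math.ulp(1.0f);
        System.out.println(fltEpsilon); // prints 1.1920929E-7

        // With OpenCV on the classpath, the documented default would read:
        // new TermCriteria(TermCriteria.MAX_ITER + TermCriteria.EPS, 1000, fltEpsilon);
    }
}
```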

void setKernel(int kernelType)

Initializes the SVM with one of the predefined kernels.

  • kernelType - see the official API for details.
  • CUSTOM - returned by SVM::getKernelType when a custom kernel has been set.
  • LINEAR - linear kernel. No mapping is done; linear discrimination (or regression) is done in the original feature space. It is the fastest option. K(x_i, x_j) = x_i^T x_j.
  • POLY - polynomial kernel: K(x_i, x_j) = (γ x_i^T x_j + coef0)^degree, γ > 0.
  • RBF - radial basis function (RBF) kernel, a good choice in most cases: K(x_i, x_j) = exp(-γ ||x_i - x_j||²), γ > 0.
  • SIGMOID - sigmoid kernel: K(x_i, x_j) = tanh(γ x_i^T x_j + coef0).
  • CHI2 - exponential chi-square kernel, similar to the RBF kernel: K(x_i, x_j) = exp(-γ χ²(x_i, x_j)), χ²(x_i, x_j) = Σ (x_i - x_j)² / (x_i + x_j), γ > 0.
  • INTER - histogram intersection kernel, a fast kernel: K(x_i, x_j) = Σ min(x_i, x_j).
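The kernel formulas above are easy to check in plain Java, independent of OpenCV. The sketch below implements three of them directly (gamma here is an illustrative value, not an OpenCV default):

```java
public class KernelDemo {
    // Linear kernel: K(xi, xj) = xi^T xj
    static double linear(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) s += a[i] * b[i];
        return s;
    }

    // RBF kernel: K(xi, xj) = exp(-gamma * ||xi - xj||^2)
    static double rbf(double[] a, double[] b, double gamma) {
        double d2 = 0;
        for (int i = 0; i < a.length; i++) {
            double d = a[i] - b[i];
            d2 += d * d;
        }
        return Math.exp(-gamma * d2);
    }

    // Histogram intersection kernel: K(xi, xj) = sum of min(xi, xj)
    static double inter(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) s += Math.min(a[i], b[i]);
        return s;
    }

    public static void main(String[] args) {
        double[] x = {1, 2}, y = {3, 4};
        System.out.println(linear(x, y)); // 1*3 + 2*4 = 11.0
        System.out.println(rbf(x, x, 0.5)); // identical vectors -> exp(0) = 1.0
        System.out.println(inter(x, y)); // min(1,3) + min(2,4) = 3.0
    }
}
```

Note that the RBF kernel of any vector with itself is always 1, which is why RBF is a safe default when the feature scales differ.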

void setType(int val)

The type of SVM; the default is SVM.C_SVC.

  • C_SVC - C-Support Vector Classification. n-class classification (n ≥ 2); allows imperfect separation of classes with the penalty multiplier C for outliers.
  • NU_SVC - ν-Support Vector Classification. n-class classification with possible imperfect separation. The parameter ν (in the range 0..1; the larger the value, the smoother the decision boundary) is used instead of C.
  • ONE_CLASS - Distribution estimation (one-class SVM). All the training data are from the same class; the SVM builds a boundary that separates the class from the rest of the feature space.
  • EPS_SVR - ϵ-Support Vector Regression. The distance between feature vectors from the training set and the fitted hyperplane must be less than p. For outliers the penalty multiplier C is used.
  • NU_SVR - ν-Support Vector Regression. ν is used instead of p. See [31] for details.
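As a sketch of switching types (assuming the OpenCV Java bindings and native library are set up as in the example above): NU_SVC replaces the C penalty with ν, so only the setter calls change. This is a configuration fragment, not a standalone program:

```java
// NU_SVC uses nu in (0, 1] instead of the C penalty of C_SVC.
SVM svm = SVM.create();
svm.setType(SVM.NU_SVC);
svm.setKernel(SVM.RBF);
svm.setGamma(0.5); // RBF kernel parameter
svm.setNu(0.5);    // larger nu -> smoother decision boundary
svm.setTermCriteria(new TermCriteria(TermCriteria.MAX_ITER + TermCriteria.EPS, 1000, 1e-6));
```

The rest of the training and prediction code stays exactly as in the C_SVC example.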
