1. Brute-Force Matching
Brute-force matching is a descriptor-matching method that compares the keypoint descriptors of two images and finds what they have in common. It is called brute-force because the algorithm involves essentially no optimization: every descriptor in the first set is compared against every descriptor in the second set.
The following example extracts feature descriptors with ORB (other algorithms could be used as well), matches them with the brute-force matcher, and displays the matching result.
# -*- coding: utf-8 -*-
"""
Created on Sun Jun 24 12:10:50 2018
@author: lu
"""
import cv2
# query and test images
img1 = cv2.imread('./manowar_logo.png',0)
img2 = cv2.imread('./manowar_single.jpg',0)
# create the ORB detector
orb = cv2.ORB_create()
kp1, des1 = orb.detectAndCompute(img1,None)
kp2, des2 = orb.detectAndCompute(img2,None)
# brute-force matcher; crossCheck=True keeps only mutually-best matches
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
# match descriptors with the match function
matches1 = bf.match(des1,des2)
#Sort by distance.
matches1 = sorted(matches1, key = lambda x:x.distance)
img3 = cv2.drawMatches(img1, kp1, img2, kp2, matches1[:40], None, flags=2)
# display the matched image
cv2.imshow("matchBF",img3)
cv2.waitKey()
cv2.destroyAllWindows()
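Each element returned by match is a cv2.DMatch object; the distance attribute used as the sort key above measures descriptor dissimilarity (Hamming distance for ORB), while queryIdx and trainIdx index into the two keypoint lists. A minimal inspection sketch, reusing matches1, kp1, and kp2 from the script above:
# inspect the best match returned by bf.match (a cv2.DMatch object)
best = matches1[0]
print(best.distance)   # Hamming distance between the two ORB descriptors (lower is better)
print(best.queryIdx)   # index into kp1 (keypoints of the query image img1)
print(best.trainIdx)   # index into kp2 (keypoints of the train image img2)
print(kp1[best.queryIdx].pt, kp2[best.trainIdx].pt)  # pixel coordinates of the matched pair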
2. K-Nearest-Neighbor Matching
The difference between the match function and the knnMatch function:
match returns the single best match for each descriptor, whereas knnMatch returns the k best matches, which the developer can then post-process. For example, you can iterate over the results and apply a ratio test to filter out matches that do not satisfy a user-defined condition (a ratio-test sketch follows the script below).
# -*- coding: utf-8 -*-
"""
Created on Sun Jun 24 12:10:50 2018
@author: lu
"""
import cv2
# query and test images
img1 = cv2.imread('./manowar_logo.png',0)
img2 = cv2.imread('./manowar_single.jpg',0)
# create the ORB detector
orb = cv2.ORB_create()
kp1, des1 = orb.detectAndCompute(img1,None)
kp2, des2 = orb.detectAndCompute(img2,None)
# create the brute-force matcher (Hamming norm for ORB's binary descriptors)
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
# match with the knnMatch function
matches2 = bf.knnMatch(des1, des2, k=1)
'''
Python: cv2.drawMatchesKnn(img1, keypoints1, img2, keypoints2, matches1to2[, outImg[, matchColor[, singlePointColor[, matchesMask[, flags]]]]]) → outImg
Parameters:
img1 – First source image.
keypoints1 – Keypoints from the first source image.
img2 – Second source image.
keypoints2 – Keypoints from the second source image.
matches1to2 – Matches from the first image to the second one, which means that keypoints1[i] has a corresponding point in keypoints2[matches[i]] .
outImg – Output image. Its content depends on the flags value defining what is drawn in the output image. See possible flags bit values below.
matchColor – Color of matches (lines and connected keypoints). If matchColor==Scalar::all(-1) , the color is generated randomly.
singlePointColor – Color of single keypoints (circles), which means that keypoints do not have the matches. If singlePointColor==Scalar::all(-1) , the color is generated randomly.
matchesMask – Mask determining which matches are drawn. If the mask is empty, all matches are drawn.
flags – Flags setting drawing features. Possible flags bit values are defined by DrawMatchesFlags.
'''
img4 = cv2.drawMatchesKnn(img1, kp1, img2, kp2, matches2, None, flags=2)
cv2.imshow("matchKNN",img4)
cv2.waitKey()
cv2.destroyAllWindows()
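As noted above, knnMatch enables ratio-test filtering. A minimal sketch of that post-processing with the brute-force matcher, assuming crossCheck is left disabled (it must be, for k=2 to return two neighbors per descriptor); the 0.75 threshold is a common choice, not something fixed by this example:
# ratio test on knnMatch results: keep a match only if it clearly beats the runner-up
bf2 = cv2.BFMatcher(cv2.NORM_HAMMING)      # no crossCheck, so k=2 returns two neighbors
pairs = bf2.knnMatch(des1, des2, k=2)
good = [m for m, n in pairs if m.distance < 0.75 * n.distance]
img5 = cv2.drawMatches(img1, kp1, img2, kp2, good, None, flags=2)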
3. FLANN-Based Matching
FLANN stands for Fast Library for Approximate Nearest Neighbors. FLANN has a more permissive license than SIFT or SURF, so it can be used freely in projects.
# -*- coding: utf-8 -*-
"""
Created on Sun Jun 24 12:45:21 2018
@author: lu
"""
import numpy as np
import cv2
from matplotlib import pyplot as plt
queryImage = cv2.imread('./bathory_album.jpg',0)
trainingImage = cv2.imread('./bathory_vinyls.jpg',0)
# create SIFT and detect/compute
sift = cv2.xfeatures2d.SIFT_create()
kp1, des1 = sift.detectAndCompute(queryImage,None)
kp2, des2 = sift.detectAndCompute(trainingImage,None)
# FLANN matcher parameters
FLANN_INDEX_KDTREE = 0
indexParams = dict(algorithm = FLANN_INDEX_KDTREE, trees = 5)
searchParams = dict(checks=50) # or pass empty dictionary
'''
https://docs.opencv.org/3.0-beta/doc/py_tutorials/py_feature2d/py_matcher/py_matcher.html?highlight=flannbasedmatcher
FLANN stands for Fast Library for Approximate Nearest Neighbors. It contains a collection of algorithms optimized
for fast nearest neighbor search in large datasets and for high dimensional features.
It works faster than BFMatcher for large datasets. We will see the second example with the FLANN-based matcher.
indexParams:
First one is IndexParams. For various algorithms, the information to be passed is explained in FLANN docs.
As a summary, for algorithms like SIFT, SURF etc. you can pass following:
index_params = dict(algorithm = FLANN_INDEX_KDTREE, trees = 5)
searchParams:
Second dictionary is the SearchParams. It specifies the number of times the trees
in the index should be recursively traversed. Higher values give better precision, but also take more time. If you want to change the value, pass search_params = dict(checks=100).
'''
flann = cv2.FlannBasedMatcher(indexParams,searchParams)
matches = flann.knnMatch(des1,des2,k=2)
# prepare an empty mask to draw good matches
matchesMask = [[0,0] for i in range(len(matches))]
# David G. Lowe's ratio test, populate the mask
for i, (m, n) in enumerate(matches):
    if m.distance < 0.7 * n.distance:
        matchesMask[i] = [1, 0]
drawParams = dict(matchColor = (0,255,0),
singlePointColor = (255,0,0),
matchesMask = matchesMask,
flags = 0)
resultImage = cv2.drawMatchesKnn(queryImage,kp1,trainingImage,kp2,matches,None,**drawParams)
# matplotlib interprets arrays as RGB while OpenCV uses BGR, so colors may appear swapped
plt.imshow(resultImage), plt.show()
# display the matched image with OpenCV as well
cv2.imshow("resultImage", resultImage)
cv2.waitKey()
cv2.destroyAllWindows()
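The kd-tree index used above is intended for float descriptors such as SIFT's. For binary descriptors such as ORB's, the FLANN tutorial linked above suggests an LSH index instead; a minimal sketch (the parameter values follow the tutorial's recommendations and may need tuning):
# FLANN with an LSH index, suitable for binary descriptors such as ORB's
# (the SIFT descriptors above are float vectors; use the kd-tree index for those)
FLANN_INDEX_LSH = 6
indexParamsLSH = dict(algorithm = FLANN_INDEX_LSH,
                      table_number = 6,       # number of hash tables
                      key_size = 12,          # hash key size in bits
                      multi_probe_level = 1)  # neighboring buckets to probe
flannLSH = cv2.FlannBasedMatcher(indexParamsLSH, dict(checks=50))
# flannLSH.knnMatch(...) can then be used exactly as above, on ORB descriptors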