Obtaining a Disparity Map with MATLAB: Generating Disparity Maps in MATLAB

【Example Overview】

Binocular stereo vision: generating a disparity map with the block matching method, implemented in MATLAB.

Dbasic = zeros(size(leftI), 'single');
disparityRange = 15;
% Selects (2*halfBlockSize+1)-by-(2*halfBlockSize+1) block.
halfBlockSize = 3;
blockSize = 2*halfBlockSize + 1;
% Allocate space for all template matcher System objects.
tmats = cell(blockSize);

% Scan over all rows.
for m = 1:size(leftI,1)
    % Set min/max row bounds for the image block.
    minr = max(1, m - halfBlockSize);
    maxr = min(size(leftI,1), m + halfBlockSize);
    % Scan over all columns.
    for n = 1:size(leftI,2)
        minc = max(1, n - halfBlockSize);
        maxc = min(size(leftI,2), n + halfBlockSize);
        % Compute disparity bounds.
        mind = max(-disparityRange, 1 - minc);
        maxd = min( disparityRange, size(leftI,2) - maxc);

        % Construct template and region of interest.
        template = rightI(minr:maxr, minc:maxc);
        templateCenter = floor((size(template) + 1) / 2);
        roi = [minr + templateCenter(1) - 2 ...
               minc + templateCenter(2) + mind - 2 ...
               1 maxd - mind + 1];
        % Look up the proper TemplateMatcher object; create it if empty.
        if isempty(tmats{size(template,1), size(template,2)})
            tmats{size(template,1), size(template,2)} = ...
                video.TemplateMatcher('ROIInputPort', true);
        end
        thisTemplateMatcher = tmats{size(template,1), size(template,2)};

        % Run the TemplateMatcher object.
        loc = step(thisTemplateMatcher, leftI, template, roi);
        Dbasic(m,n) = loc(2) - roi(2) + mind;
    end
end

In the results below, basic block matching does well, as the correct shape of the stereo scene is recovered. However, there are noisy patches and bad depth estimates everywhere, especially on the ceiling. These occur when no strong image features appear inside the 7-by-7-pixel windows being compared; the matching process is then subject to noise, since each pixel chooses its disparity independently of all other pixels.

For display purposes, we saturate the depth map to have only positive values. In general, slight angular misalignment of the stereo cameras used for image acquisition can allow both positive and negative disparities to appear validly in the depth map. In this case, however, the stereo cameras were nearly perfectly parallel, so the true disparities have only one sign. Thus this correction is valid.
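A minimal sketch of that saturation, using a hypothetical display-only copy so the raw estimates are left untouched:

% Hypothetical display-only copy with negative disparities clamped to zero.
DbasicDisplay = max(Dbasic, 0);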

figure(3), clf;
imshow(Dbasic, []), axis image, colormap('jet'), colorbar;
caxis([0 disparityRange]);
title('Depth map from basic block matching');

[Figure: Depth map from basic block matching]

Step 3. Sub-pixel estimation

The disparity estimates returned by block matching are all integer-valued, so the above depth map exhibits contouring effects: there are no smooth transitions between regions of different disparity. This can be ameliorated by incorporating sub-pixel computation into the matching metric. Previously we took only the location of the minimum cost as the disparity, but now we also take into consideration the two neighboring cost values. We fit a parabola to these three values and analytically solve for its minimum to obtain the sub-pixel correction.
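As a minimal numeric sketch of this parabola fit (the values c1, c2, c3 and d below are made-up examples, not taken from the image data):

% c1, c2, c3 are matching costs at disparities d-1, d, d+1 (c2 is the minimum).
c1 = 12.0; c2 = 9.5; c3 = 11.0;              % example cost values
d  = 5;                                      % integer disparity of the minimum
dSub = d - 0.5*(c3 - c1)/(c1 - 2*c2 + c3);   % sub-pixel corrected disparity

This is the same correction applied per pixel in the code below, where a2 holds the cost of the best match and its neighbors.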

DbasicSubpixel = zeros(size(leftI), 'single');
tmats = cell(2*halfBlockSize + 1);
for m = 1:size(leftI,1)
    % Set min/max row bounds for the image block.
    minr = max(1, m - halfBlockSize);
    maxr = min(size(leftI,1), m + halfBlockSize);
    % Scan over all columns.
    for n = 1:size(leftI,2)
        minc = max(1, n - halfBlockSize);
        maxc = min(size(leftI,2), n + halfBlockSize);
        % Compute disparity bounds.
        mind = max(-disparityRange, 1 - minc);
        maxd = min( disparityRange, size(leftI,2) - maxc);

        % Construct template and region of interest.
        template = rightI(minr:maxr, minc:maxc);
        templateCenter = floor((size(template) + 1) / 2);
        roi = [minr + templateCenter(1) - 2 ...
               minc + templateCenter(2) + mind - 2 ...
               1 maxd - mind + 1];
        % Look up the proper TemplateMatcher object; create it if empty.
        if isempty(tmats{size(template,1), size(template,2)})
            tmats{size(template,1), size(template,2)} = ...
                video.TemplateMatcher('ROIInputPort', true, ...
                    'BestMatchNeighborhoodOutputPort', true);
        end
        thisTemplateMatcher = tmats{size(template,1), size(template,2)};

        % Run the TemplateMatcher object.
        [loc, a2] = step(thisTemplateMatcher, leftI, template, roi);
        ix = single(loc(2) - roi(2) + mind);

        % Sub-pixel refinement of the index.
        DbasicSubpixel(m,n) = ix - 0.5 * (a2(2,3) - a2(2,1)) ...
            / (a2(2,1) - 2*a2(2,2) + a2(2,3));
    end
end

Re-running block matching with this refinement, we achieve the result below, where the contouring effects are mostly removed and the disparity estimates are correctly refined. This is especially evident along the walls.

figure(1), clf;
imshow(DbasicSubpixel, []), axis image, colormap('jet'), colorbar;
caxis([0 disparityRange]);
title('Basic block matching with sub-pixel accuracy');

[Figure: Basic block matching with sub-pixel accuracy]

Step 4. Dynamic programming

As mentioned above, basic block matching creates a noisy disparity image. This can be improved by introducing a smoothness constraint. Basic block matching chooses the optimal disparity for each pixel based on its own cost function alone. Now we allow a pixel to have a disparity with a possibly sub-optimal local cost, provided that extra cost is offset by increased agreement in disparity with the pixel's neighbors. In particular, we constrain each disparity estimate to lie within 3 values of its neighbors' disparities, where its neighbors are the adjacent pixels along an image row. The problem of finding the optimal disparity estimates for a row of pixels then becomes one of finding the "optimal path" from one side of the image to the other. To find this optimal path, we use the underlying block matching metric as the cost function and constrain the disparities to change only by a limited amount between adjacent pixels. This is a problem that can be solved efficiently using dynamic programming [3, 4].
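Purely as an illustration of the idea, and not the vectorized scheme used in the code below, a minimal scanline DP sketch on a toy cost matrix might look like this; the names cost, penalty, acc, and back, and all numbers, are hypothetical:

% Toy scanline DP: cost(n,k) is the matching cost of disparity index k at
% column n; changing disparity between neighboring columns is penalized.
cost = single([3 1 4; 2 2 1; 5 1 2]);   % 3 columns x 3 disparity indices
penalty = 0.5;                          % cost per unit change in disparity index
nCols = size(cost,1);  nDisp = size(cost,2);
acc  = cost;                            % accumulated cost along the scanline
back = zeros(nCols, nDisp);             % back-pointers to recover the path
for n = 2:nCols
    for k = 1:nDisp
        trans = acc(n-1,:) + penalty*abs((1:nDisp) - k);  % transition costs
        [v, back(n,k)] = min(trans);
        acc(n,k) = cost(n,k) + v;
    end
end
% Trace back the optimal disparity index for each column.
path = zeros(1, nCols);
[~, path(end)] = min(acc(end,:));
for n = nCols-1:-1:1
    path(n) = back(n+1, path(n+1));
end

The example's own code below does the same thing per image row, but vectorizes the transitions and restricts the disparity change between adjacent pixels to at most 3.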

Ddynamic = zeros(size(leftI), 'single');
finf = 1e3;   % False infinity
disparityCost = finf * ones(size(leftI,2), 2*disparityRange + 1, 'single');
disparityPenalty = 0.5;   % Penalty for disparity disagreement between pixels
% Scan over all rows.
for m = 1:size(leftI,1)
    disparityCost(:) = finf;
    % Set min/max row bounds for the image block.
    minr = max(1, m - halfBlockSize);
    maxr = min(size(leftI,1), m + halfBlockSize);
    % Scan over all columns.
    for n = 1:size(leftI,2)
        minc = max(1, n - halfBlockSize);
        maxc = min(size(leftI,2), n + halfBlockSize);
        % Compute disparity bounds.
        mind = max(-disparityRange, 1 - minc);
        maxd = min( disparityRange, size(leftI,2) - maxc);
        % Compute and save all matching costs.
        for d = mind:maxd
            disparityCost(n, d + disparityRange + 1) = ...
                sum(sum(abs(leftI(minr:maxr, (minc:maxc) + d) ...
                    - rightI(minr:maxr, minc:maxc))));
        end
    end

    % Process scanline disparity costs with dynamic programming.
    optimalIndices = zeros(size(disparityCost), 'single');
    cp = disparityCost(end, :);
    for j = size(disparityCost,1)-1:-1:1
        % False infinity for this level.
        cfinf = (size(disparityCost,1) - j + 1) * finf;
        % Construct matrix for finding the optimal move for each column
        % individually.
        [v, ix] = min([cfinf cfinf cp(1:end-4) + 3*disparityPenalty;
                       cfinf cp(1:end-3) + 2*disparityPenalty;
                       cp(1:end-2) + disparityPenalty;
                       cp(2:end-1);
                       cp(3:end) + disparityPenalty;
                       cp(4:end) + 2*disparityPenalty cfinf;
                       cp(5:end) + 3*disparityPenalty cfinf cfinf], [], 1);
        cp = [cfinf disparityCost(j, 2:end-1) + v cfinf];
        % Record optimal routes.
        optimalIndices(j, 2:end-1) = (2:size(disparityCost,2)-1) + (ix - 4);
    end
    % Recover the optimal route.
    [~, ix] = min(cp);
    Ddynamic(m, 1) = ix;
    for k = 1:size(Ddynamic,2)-1
        Ddynamic(m, k+1) = optimalIndices(k, ...
            max(1, min(size(optimalIndices,2), round(Ddynamic(m,k)))));
    end
end
Ddynamic = Ddynamic - disparityRange - 1;

The image below shows the stereo result refined by applying dynamic programming to each row individually. Dynamic programming does introduce errors of its own by blurring the edges around object boundaries due to the smoothness constraint. Also, it does nothing to smooth "between" rows, which is why a striation pattern appears on the left side of the foreground chair. Despite these limitations, the result is significantly improved: the noise along the walls and ceiling is nearly completely removed, and many of the foreground objects are better reconstructed.

figure(3), clf;
imshow(Ddynamic, []), axis image, colormap('jet'), colorbar;
caxis([0 disparityRange]);
title('Block matching with dynamic programming');

[Figure: Block matching with dynamic programming]

Step 5. Image Pyramiding

While dynamic programming can improve the accuracy of the stereo image, basic block matching is still an expensive operation, and dynamic programming only adds to the burden. One solution is to use image pyramiding and telescopic search to guide the block matching [5, 7]. With the full-size image, we had to search over a +/- 15-pixel range to properly detect the disparities in the image. Had we down-sized the image by a factor of two, this search could have been reduced to +/- 7 pixels on an image with a quarter of the area, so this step would cost roughly a factor of 8 less (a quarter of the pixels times half the search range). We then use the disparity estimates from the down-sized image to seed the search on the larger image, so we only need to search over a smaller range of disparities.

The example below performs this telescoping stereo matching using a four-level image pyramid. We use the Pyramid and GeometricScaler System objects, and the preceding block matching code has been wrapped into the function vipstereo_blockmatch.m for simplicity. The disparity search range is only +/- 3 pixels at each level, making it over 5x faster to compute than basic block matching, yet the results compare favorably.

% Construct a four-level pyramid.
pyramids = cell(1, 4);
pyramids{1}.L = leftI;
pyramids{1}.R = rightI;
for i = 2:length(pyramids)
    hPyr = video.Pyramid('PyramidLevel', 1);
    pyramids{i}.L = single(step(hPyr, pyramids{i-1}.L));
    pyramids{i}.R = single(step(hPyr, pyramids{i-1}.R));
end
% Declare the original search radius as +/- 3 disparities for every pixel.
smallRange = single(3);
disparityMin = repmat(-smallRange, size(pyramids{end}.L));
disparityMax = repmat( smallRange, size(pyramids{end}.L));
% Do telescoping search over the pyramid levels.
for i = length(pyramids):-1:1
    Dpyramid = vipstereo_blockmatch(pyramids{i}.L, pyramids{i}.R, ...
        disparityMin, disparityMax, ...
        false, true, 3);
    if i > 1
        % Scale disparity values for the next level.
        hGsca = video.GeometricScaler( ...
            'InterpolationMethod', 'Nearest neighbor', ...
            'SizeMethod', 'Number of output rows and columns', ...
            'Size', size(pyramids{i-1}.L));
        Dpyramid = 2 * step(hGsca, Dpyramid);
        % Maintain a search radius of +/- smallRange.
        disparityMin = Dpyramid - smallRange;
        disparityMax = Dpyramid + smallRange;
    end
end

figure(3), clf;
imshow(Dpyramid, []), colormap('jet'), colorbar, axis image;
caxis([0 disparityRange]);
title('Four-level pyramid block matching');

[Figure: Four-level pyramid block matching]

Step 6. Combined pyramiding and dynamic programming

Finally, we merge the above techniques and run dynamic programming along with image pyramiding, where the dynamic programming is run on the disparity estimates output by every pyramid level. The results compare well with the highest-quality results we have obtained so far, and are still achieved at a reduced computational burden versus basic block matching.

It is also possible to use sub-pixel methods with dynamic programming, and we show the results of all three techniques combined in the second image. As before, sub-pixel estimation reduces contouring effects and clearly improves accuracy. The previous code has been bundled into vipstereo_blockmatch_combined.m, which exposes all of the options previously presented as parameter-value pairs.

DpyramidDynamic = vipstereo_blockmatch_combined(leftI, rightI, ...
    'NumPyramids', 3, 'DisparityRange', 4, 'DynamicProgramming', true);

figure(3), clf;
imshow(DpyramidDynamic, []), axis image, colorbar, colormap('jet');
caxis([0 disparityRange]);
title('3-level pyramid with dynamic programming');

DdynamicSubpixel = vipstereo_blockmatch_combined(leftI, rightI, ...
    'NumPyramids', 3, 'DisparityRange', 4, 'DynamicProgramming', true, ...
    'Subpixel', true);

figure(4), clf;
imshow(DdynamicSubpixel, []), axis image, colormap('jet'), colorbar;
caxis([0 disparityRange]);
title('Pyramid with dynamic programming and sub-pixel accuracy');

[Figure: 3-level pyramid with dynamic programming]

[Figure: Pyramid with dynamic programming and sub-pixel accuracy]

Step 7. Backprojection

With a stereo depth map and knowledge of the intrinsic parameters of the camera, it is possible to backproject image pixels into 3D points [1, 2]. One way to compute the camera intrinsics is with the MATLAB Camera Calibration Toolbox [6] from the California Institute of Technology. Such a tool produces an intrinsics matrix, K, of the form:

K = [focal_length_x  skew_x           camera_center_x;
     0               focal_length_y   camera_center_y;
     0               0                1];
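As an illustrative sketch, not part of the original example, of how K and the stereo baseline turn a pixel and its disparity into a 3D point under the usual rectified pinhole model; all numeric values below are made-up:

% Hypothetical backprojection of one pixel; fx, fy, cx, cy come from K,
% B is the stereo baseline in meters, d is the disparity in pixels.
fx = 640; fy = 640; cx = 320; cy = 240;   % example intrinsics
B  = 0.12;                                % example baseline (m)
u  = 400; v = 260; d = 15;                % example pixel and its disparity
Z  = fx * B / d;                          % depth along the optical axis
X  = (u - cx) * Z / fx;                   % lateral position
Y  = (v - cy) * Z / fy;                   % vertical position
P  = [X; Y; Z];                           % reconstructed 3D point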

【Example Screenshots】

【Core Code】
