1. Laplacian of Gaussian (LoG)
As the Laplace operator responds to noise (isolated, out-of-range pixels) as well as to edges, it may be desirable to smooth the image first by convolution with a Gaussian kernel of width $\sigma$ before applying the Laplace operator. The two operations can be combined into a single filter, the Laplacian of Gaussian (LoG), which up to a normalizing constant is
$$\nabla^2 G_\sigma(x,y)=\frac{x^2+y^2-2\sigma^2}{\sigma^4}\,e^{-(x^2+y^2)/2\sigma^2}$$
The Gaussian $G(x)$ and its first and second derivatives $G'(x)$ and $G''(x)$ are shown here:
This 2-D LoG can be approximated by a 5 by 5 convolution kernel, such as the commonly used

     0   0   1   0   0
     0   1   2   1   0
     1   2 -16   2   1
     0   1   2   1   0
     0   0   1   0   0
Kernels of other sizes can be obtained by approximating the continuous expression of the LoG given above. However, make sure that the sum (or average) of all elements of the kernel is zero (as with the Laplace kernel), so that the convolution result over a homogeneous region is always zero.
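This construction can be sketched as follows (the function name and its defaults are illustrative choices, not from the lecture): sample the continuous LoG on an odd-sized grid, then shift the samples so they sum to zero.

```python
import numpy as np

def log_kernel(size=9, sigma=1.4):
    """Sample the continuous LoG on an odd-sized grid, then shift to zero sum."""
    r = size // 2
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    s2 = sigma ** 2
    k = (x**2 + y**2 - 2 * s2) / s2**2 * np.exp(-(x**2 + y**2) / (2 * s2))
    return k - k.mean()            # zero-sum: homogeneous regions convolve to zero

k = log_kernel()
print(abs(k.sum()) < 1e-12)        # True: a flat region gives zero response
```

Subtracting the mean is one simple way to enforce the zero-sum requirement; rounding a sampled kernel to integers (as in the 5 by 5 example) requires re-balancing the center element instead.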
The edges in the image can be obtained by these steps: convolve the image with the LoG kernel (equivalently, smooth with the Gaussian and then apply the Laplace operator), detect the zero-crossings of the result, and optionally threshold them to keep only the strong ones.
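These steps can be sketched in a minimal demo (the `conv2d` helper is hypothetical, written out here rather than taken from a library, and the 5x5 integer kernel is one commonly quoted LoG approximation):

```python
import numpy as np

# A commonly quoted 5x5 LoG approximation (zero-sum, so flat regions give 0).
LOG5 = np.array([[0, 0,   1, 0, 0],
                 [0, 1,   2, 1, 0],
                 [1, 2, -16, 2, 1],
                 [0, 1,   2, 1, 0],
                 [0, 0,   1, 0, 0]], dtype=float)

def conv2d(img, k):
    """Naive 'valid' correlation; the kernel is symmetric, so it equals convolution."""
    kh, kw = k.shape
    H, W = img.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

img = np.zeros((11, 11))
img[:, 6:] = 1.0                                   # vertical step edge
r = conv2d(img, LOG5)
zc = np.sign(r[:, :-1]) * np.sign(r[:, 1:]) < 0    # horizontal zero-crossings
print(np.unique(np.where(zc)[1]))                  # a single column of crossings at the edge
```

A real implementation would pad the image and threshold the zero-crossings by local contrast, but the sign-change test above is the essential step.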
2. Difference of Gaussian (DoG)
Similar to the Laplacian of Gaussian, the image is first smoothed by convolution with Gaussian kernels of two different widths $\sigma_1 < \sigma_2$; the DoG is the difference of the two smoothed images, i.e., the image convolved with $G_{\sigma_1}-G_{\sigma_2}$.
As the difference between two differently low-pass filtered images, the DoG is actually a band-pass filter: it removes the high-frequency components representing noise, as well as some low-frequency components representing the homogeneous areas in the image. The frequency components in the passband are assumed to be associated with the edges in the image.
The discrete convolution kernel for DoG can be obtained by approximating the continuous expression of DoG given above. Again, it is necessary for the sum or average of all elements of the kernel matrix to be zero.
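One way to satisfy the zero-sum requirement is sketched below (the function name, size, and the $\sigma_2/\sigma_1 = 1.6$ ratio are illustrative choices): normalizing each sampled Gaussian to unit sum makes their difference sum to zero automatically.

```python
import numpy as np

def dog_kernel(size=9, sigma1=1.0, sigma2=1.6):
    """Difference of two sampled Gaussians; each is normalized to unit sum,
    so their difference sums to zero (flat regions give zero response)."""
    r = size // 2
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    def g(s):
        k = np.exp(-(x**2 + y**2) / (2 * s * s))
        return k / k.sum()
    return g(sigma1) - g(sigma2)

k = dog_kernel()
print(abs(k.sum()) < 1e-12)   # True
print(k[4, 4] > 0)            # True: positive center, negative surround
```

The positive-center, negative-surround shape is what makes the DoG resemble an inverted LoG, as the comparison of the two curves below suggests.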
Comparing this plot with the previous one, we see that the DoG curve is very similar to the LoG curve. Also, as in the LoG case, the edges in the image can be obtained by convolving the image with the DoG kernel and then detecting the zero-crossings of the result:
Edge detection by DoG operator:
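A one-dimensional sketch of DoG edge detection (the signal, widths, and the small threshold that suppresses floating-point noise in flat regions are all illustrative):

```python
import numpy as np

x = np.arange(-4, 5)                                  # kernel support
g = lambda s: np.exp(-x**2 / (2 * s * s))
dog = g(1.0) / g(1.0).sum() - g(1.6) / g(1.6).sum()   # zero-sum 1-D DoG kernel

signal = np.zeros(30)
signal[15:] = 1.0                                     # step edge
r = np.convolve(signal, dog, mode='same')
r[np.abs(r) < 1e-9] = 0.0                             # flat regions: exactly zero
zc = np.where(np.sign(r[:-1]) * np.sign(r[1:]) < 0)[0]
print(zc)                                             # one zero-crossing, at the step
```

The response is negative just before the step and positive just after it, so the single sign change locates the edge.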
Lecture notes URL: http://fourier.eng.hmc.edu/e161/lectures/gradient/gradient.html
Background:
1.1 Gradient
The gradient (denoted $\nabla$, also called the Hamilton operator) is a vector operator applied to any N-dimensional scalar function $f(\mathbf{x})$, where $\mathbf{x}=[x_1,\dots,x_N]^T$ is an N-D vector variable. For example, when $N=3$, $f(x,y,z)$ may represent temperature, concentration, or pressure in 3-D space. The gradient of this N-D function is the vector of its partial derivatives:
$$\nabla f=\left[\frac{\partial f}{\partial x_1},\dots,\frac{\partial f}{\partial x_N}\right]^T$$
In image processing we only consider the 2-D field $f(x,y)$:
$$\nabla f=\left[\frac{\partial f}{\partial x},\frac{\partial f}{\partial y}\right]^T$$
Now we show that $f$ increases most rapidly along the direction of $\nabla f$, and that the rate of increase equals the magnitude $|\nabla f|$.
Consider the directional derivative of $f$ along an arbitrary unit vector $\mathbf{n}$:
$$\frac{\partial f}{\partial \mathbf{n}}=\nabla f\cdot\mathbf{n}=|\nabla f|\cos\theta$$
where $\theta$ is the angle between $\nabla f$ and $\mathbf{n}$. From $\cos\theta\le 1$, we can also get that the directional derivative is maximized when $\theta=0$, i.e., when $\mathbf{n}$ points along $\nabla f$, and its maximum value is $|\nabla f|$.
For discrete digital images, the derivative in the gradient operation becomes a difference between neighboring pixels:
$$\frac{\partial f}{\partial x}\approx f(x+1,y)-f(x,y),\qquad \frac{\partial f}{\partial y}\approx f(x,y+1)-f(x,y)$$
Two steps for finding the discrete gradient of a digital image: first obtain the differences $\Delta_x f$ and $\Delta_y f$ along the two directions, then combine them into the gradient magnitude $|\nabla f|=\sqrt{(\Delta_x f)^2+(\Delta_y f)^2}$ and direction $\theta=\tan^{-1}(\Delta_y f/\Delta_x f)$.
The differences in the two directions $x$ and $y$ can be obtained by convolution with difference kernels such as $[\,-1\ \ 1\,]$ and its transpose $[\,-1\ \ 1\,]^T$:
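The two steps can be sketched with plain first differences (the helper name is illustrative; Prewitt or Sobel kernels add smoothing but follow the same pattern):

```python
import numpy as np

def gradient(img):
    """Discrete gradient via first differences, cropped to a common shape."""
    dx = img[:, 1:] - img[:, :-1]    # difference along x (kernel [-1 1])
    dy = img[1:, :] - img[:-1, :]    # difference along y
    gx, gy = dx[:-1, :], dy[:, :-1]  # overlapping region of the two results
    mag = np.hypot(gx, gy)           # gradient magnitude
    ang = np.arctan2(gy, gx)         # gradient direction
    return mag, ang

img = np.zeros((5, 6))
img[:, 3:] = 1.0                     # vertical step edge
mag, ang = gradient(img)
print(mag[0])                        # magnitude peaks at the edge column
```

For the step edge above, the magnitude is nonzero only at the column where the gray level jumps, and the direction there is $0$, i.e., pointing across the edge.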
In the 2-D case, the Laplace operator is the sum of the two second-order differences in both dimensions:
$$\nabla^2 f=\frac{\partial^2 f}{\partial x^2}+\frac{\partial^2 f}{\partial y^2}\approx f(x+1,y)+f(x-1,y)+f(x,y+1)+f(x,y-1)-4f(x,y)$$
which corresponds to the convolution kernel

     0  1  0
     1 -4  1
     0  1  0
The gradient operation is an effective detector for sharp edges, where the pixel gray levels change very rapidly over space. But when the gray levels change slowly from dark to bright (red in the figure below), the gradient operation produces a very wide edge (green in the figure). In this case it is helpful to consider the Laplace operation instead. The second-order derivative of the wide edge (blue in the figure) has a zero crossing in the middle of the edge, so the location of the edge can be obtained by detecting the zero-crossings of the second-order difference of the image.
One dimensional example:
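A small numeric version of such a one-dimensional example (the ramp signal here is made up for illustration):

```python
import numpy as np

f = np.array([0, 0, 1, 2, 3, 4, 5, 6, 6, 6], dtype=float)  # slow dark-to-bright ramp
d1 = np.diff(f)        # first difference: a wide plateau over the whole ramp
d2 = np.diff(f, n=2)   # second difference: + spike, zeros, - spike
print(d1.tolist())     # [0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0]
print(d2.tolist())     # [1.0, 0.0, 0.0, 0.0, 0.0, 0.0, -1.0, 0.0]
```

The first difference is nonzero across the entire ramp (a wide edge), while the second difference changes sign between its positive and negative spikes, locating the edge at the middle of the ramp.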
In the two-dimensional example, the image is on the left; the two Laplace kernels generate two similar results, with the zero-crossings shown on the right:
Edge detection by Laplace operator followed by zero-crossing detection: