The first approach is Gaussian blur with Core Image. The Core Image API was introduced in iOS 5.0 and lives in CoreImage.framework. On both iOS and OS X, Core Image ships a large number of filters: more than 120 on OS X and more than 90 on iOS.
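As a side note, the set of filters available on the current platform can be listed at runtime; a minimal sketch using the built-in kCICategoryBuiltIn category:

    #import <CoreImage/CoreImage.h>

    // Print every built-in Core Image filter available on this platform
    // (90+ names on iOS, 120+ on OS X, matching the counts above).
    NSArray *filterNames = [CIFilter filterNamesInCategory:kCICategoryBuiltIn];
    NSLog(@"%lu filters available", (unsigned long)filterNames.count);
    for (NSString *name in filterNames) {
        NSLog(@"%@", name);
    }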
The blur method itself is as follows:
    #import <CoreImage/CoreImage.h>

    + (UIImage *)coreTeachImage:(UIImage *)image withCoreNumber:(CGFloat)coreNumber {
        CIContext *context = [CIContext contextWithOptions:nil];
        CIImage *inputImage = [CIImage imageWithCGImage:image.CGImage];
        // Set up the filter
        CGImageRef outImage = NULL;
        if (NSFoundationVersionNumber >= 1140.11) { // NSFoundationVersionNumber_iOS_8_0
            // Clamp the edges first so the blur does not fade to transparent at the borders
            CIImage *result = [inputImage imageByClampingToExtent];
            result = [result imageByApplyingFilter:@"CIGaussianBlur" withInputParameters:@{
                kCIInputRadiusKey : @(coreNumber)
            }];
            // Clamping makes the extent infinite, so crop back to the original rect
            result = [result imageByCroppingToRect:inputImage.extent];
            outImage = [context createCGImage:result fromRect:[result extent]];
        } else {
            // Pre-iOS 8 fallback: configure the CIGaussianBlur filter by hand
            CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
            [filter setValue:inputImage forKey:kCIInputImageKey];
            [filter setValue:@(coreNumber) forKey:kCIInputRadiusKey];
            CIImage *result = [filter valueForKey:kCIOutputImageKey];
            outImage = [context createCGImage:result fromRect:[inputImage extent]];
        }
        UIImage *blurImage = [UIImage imageWithCGImage:outImage];
        CGImageRelease(outImage);
        return blurImage;
    }
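A minimal call-site sketch (ImageBlurHelper is a hypothetical name for whatever class declares this method, and the asset name is a placeholder):

    // Blur a bundled image with a radius of 5, as in the screenshot below.
    UIImage *source  = [UIImage imageNamed:@"sample.jpg"];   // placeholder asset name
    UIImage *blurred = [ImageBlurHelper coreTeachImage:source withCoreNumber:5.0];
    self.imageView.image = blurred;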
The result looks like this (with coreNumber set to 5):
In actual project development, however, this approach ran into problems.
The second method uses the system Accelerate.framework. Accelerate is a library mainly for digital signal processing and for the vector and matrix math used in image processing. Since an image can be viewed as vector or matrix data, and Accelerate provides efficient math APIs, it naturally lends itself to all kinds of image processing. The blur here uses the vImageBoxConvolve_ARGB8888 function (a box convolution, which approximates a Gaussian blur). The code is as follows:
    #import <Accelerate/Accelerate.h>

    + (UIImage *)vagueImage:(UIImage *)image withVagueNumber:(CGFloat)number {
        if (number < 0.f || number > 1.f) {
            number = 0.5f;
        }
        // The box kernel size must be odd
        int boxSize = (int)(number * 40);
        boxSize = boxSize - (boxSize % 2) + 1;
        CGImageRef img = image.CGImage;
        vImage_Buffer inBuffer, outBuffer;
        vImage_Error error;
        void *pixelBuffer;
        // Get the raw bitmap data out of the CGImage
        CGDataProviderRef inProvider = CGImageGetDataProvider(img);
        CFDataRef inBitmapData = CGDataProviderCopyData(inProvider);
        // Describe the source buffer taken from the CGImage
        inBuffer.width = CGImageGetWidth(img);
        inBuffer.height = CGImageGetHeight(img);
        inBuffer.rowBytes = CGImageGetBytesPerRow(img);
        inBuffer.data = (void *)CFDataGetBytePtr(inBitmapData);
        // Allocate the destination buffer with the same dimensions
        pixelBuffer = malloc(CGImageGetBytesPerRow(img) * CGImageGetHeight(img));
        if (pixelBuffer == NULL) {
            NSLog(@"No pixelbuffer");
        }
        outBuffer.data = pixelBuffer;
        outBuffer.width = CGImageGetWidth(img);
        outBuffer.height = CGImageGetHeight(img);
        outBuffer.rowBytes = CGImageGetBytesPerRow(img);
        // Run the box convolution; kvImageEdgeExtend replicates edge pixels
        error = vImageBoxConvolve_ARGB8888(&inBuffer, &outBuffer, NULL, 0, 0,
                                           boxSize, boxSize, NULL, kvImageEdgeExtend);
        if (error) {
            NSLog(@"error from convolution %ld", error);
        }
        // Wrap the blurred bytes back into a CGImage / UIImage
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef ctx = CGBitmapContextCreate(outBuffer.data,
                                                 outBuffer.width,
                                                 outBuffer.height,
                                                 8,
                                                 outBuffer.rowBytes,
                                                 colorSpace,
                                                 kCGImageAlphaNoneSkipLast);
        CGImageRef imageRef = CGBitmapContextCreateImage(ctx);
        UIImage *returnImage = [UIImage imageWithCGImage:imageRef];
        // Clean up
        CGContextRelease(ctx);
        CGColorSpaceRelease(colorSpace);
        free(pixelBuffer);
        CFRelease(inBitmapData);
        CGImageRelease(imageRef);
        return returnImage;
    }
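Because vImageBoxConvolve_ARGB8888 runs on the CPU, for large images it is worth moving the call off the main thread. A minimal sketch, again using the hypothetical ImageBlurHelper class name:

    // Blur on a background queue, then update the UI on the main queue.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        UIImage *blurred = [ImageBlurHelper vagueImage:largeImage withVagueNumber:0.4];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.imageView.image = blurred;
        });
    });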
The result (with number set to 0.4):
The last option is a third-party library: one such class is the UIImageView+LBBlurredImage category. In project development we found that it can be somewhat slow when loading large images.
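For reference, a typical call to that category looks roughly like the sketch below; it assumes the setImageToBlur:blurRadius:completionBlock: method the category is commonly shipped with, so verify it against the version you actually pull in, and the asset name is a placeholder.

    #import "UIImageView+LBBlurredImage.h"

    // Renders a blurred copy of the image into the image view asynchronously;
    // with large images this render step is where the slowness shows up.
    [self.blurredImageView setImageToBlur:[UIImage imageNamed:@"background.jpg"]
                               blurRadius:10
                          completionBlock:nil];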