I was recently building a forum feature where posting requires picking images from the photo album and uploading them. Since each post allows at most 9 images, the images need to be compressed. At first I used the compression methods I had always relied on:
// Compress image quality: re-encode as JPEG at the given quality factor
+ (UIImage *)reduceImage:(UIImage *)image percent:(float)percent
{
    NSData *imageData = UIImageJPEGRepresentation(image, percent);
    UIImage *newImage = [UIImage imageWithData:imageData];
    return newImage;
}
// Compress image dimensions: redraw the image into a smaller bitmap context
+ (UIImage *)imageWithImageSimple:(UIImage *)image scaledToSize:(CGSize)newSize
{
    // Create a graphics image context
    UIGraphicsBeginImageContext(newSize);
    // Draw the image into the new size
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    // Get the new image from the context
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    // End the context
    UIGraphicsEndImageContext();
    // Return the new image
    return newImage;
}
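A minimal usage sketch of the two helpers above, chaining the size reduction and the quality reduction. The class name ImageUtils, the 800-pixel target, and the 0.5 quality factor are assumptions for illustration; the picked image is assumed to come from the UIImagePickerController delegate:
// Inside imagePickerController:didFinishPickingMediaWithInfo: (ImageUtils is an assumed utility class)
UIImage *picked = info[UIImagePickerControllerOriginalImage];
// Scale so the longer edge is at most 800 px, preserving the aspect ratio
CGFloat maxSide = 800.0;
CGFloat ratio = MIN(maxSide / picked.size.width, maxSide / picked.size.height);
if (ratio > 1.0) ratio = 1.0; // never upscale small images
CGSize targetSize = CGSizeMake(picked.size.width * ratio, picked.size.height * ratio);
UIImage *scaled = [ImageUtils imageWithImageSimple:picked scaledToSize:targetSize];
// Then drop the JPEG quality to shrink the data before upload
UIImage *compressed = [ImageUtils reduceImage:scaled percent:0.5];
NSLog(@"compressed size: %@", NSStringFromCGSize(compressed.size));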
The methods above are common, but they have to load the whole image into memory to process it; once many images are processed you get memory warnings and the app crashes. After a lot of digging I finally found a solution in a blog post:
static size_t getAssetBytesCallback(void *info, void *buffer, off_t position, size_t count) {
    ALAssetRepresentation *rep = (__bridge id)info;
    NSError *error = nil;
    size_t countRead = [rep getBytes:(uint8_t *)buffer fromOffset:position length:count error:&error];
    if (countRead == 0 && error) {
        // We have no way of passing this info back to the caller, so we log it, at least.
        NSLog(@"thumbnailForAsset:maxPixelSize: got an error reading an asset: %@", error);
    }
    return countRead;
}
static void releaseAssetCallback(void *info) {
    // The info here is an ALAssetRepresentation which we CFRetain in thumbnailForAsset:maxPixelSize:.
    // This release balances that retain.
    CFRelease(info);
}
// Returns a UIImage for the given asset, with size length at most the passed size.
// The resulting UIImage will be already rotated to UIImageOrientationUp, so its CGImageRef
// can be used directly without additional rotation handling.
// This is done synchronously, so you should call this method on a background queue/thread.
- (UIImage *)thumbnailForAsset:(ALAsset *)asset maxPixelSize:(NSUInteger)size {
    NSParameterAssert(asset != nil);
    NSParameterAssert(size > 0);

    ALAssetRepresentation *rep = [asset defaultRepresentation];

    CGDataProviderDirectCallbacks callbacks = {
        .version = 0,
        .getBytePointer = NULL,
        .releaseBytePointer = NULL,
        .getBytesAtPosition = getAssetBytesCallback,
        .releaseInfo = releaseAssetCallback,
    };

    CGDataProviderRef provider = CGDataProviderCreateDirect((void *)CFBridgingRetain(rep), [rep size], &callbacks);
    CGImageSourceRef source = CGImageSourceCreateWithDataProvider(provider, NULL);

    CGImageRef imageRef = CGImageSourceCreateThumbnailAtIndex(source, 0, (__bridge CFDictionaryRef) @{
        (NSString *)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (NSString *)kCGImageSourceThumbnailMaxPixelSize : @(size),
        (NSString *)kCGImageSourceCreateThumbnailWithTransform : @YES,
    });
    CFRelease(source);
    CFRelease(provider);

    if (!imageRef) {
        return nil;
    }

    UIImage *toReturn = [UIImage imageWithCGImage:imageRef];
    CFRelease(imageRef);

    return toReturn;
}
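A hedged usage sketch: the method above runs synchronously, so do the decoding off the main thread. Here the ALAsset is assumed to be looked up through an ALAssetsLibrary instance; assetURL, self.assetsLibrary, self.imageView, and the 600-pixel size are all placeholders, not part of the original code:
[self.assetsLibrary assetForURL:assetURL resultBlock:^(ALAsset *asset) {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // Decode a thumbnail no larger than 600 px on its long edge, off the main thread
        UIImage *thumb = [self thumbnailForAsset:asset maxPixelSize:600];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.imageView.image = thumb; // back on the main thread for UI work
        });
    });
} failureBlock:^(NSError *error) {
    NSLog(@"assetForURL failed: %@", error);
}];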
Caching images locally comes up in many scenarios. If you only need to store file information, a plist file or a database solves the problem easily, but storing the images themselves in the sandbox is less straightforward. Here are two simple ways to save an image to the sandbox.
(1) Convert the image to a base64 string, store it in a database or a plist file, and decode it back when needed.
// Get the sandbox home path
NSString *path_sandox = NSHomeDirectory();
// Build a path for the plist file
NSString *newPath = [path_sandox stringByAppendingPathComponent:@"Documents/pic.plist"];
NSMutableArray *arr = [[NSMutableArray alloc] init];
UIImage *image = [UIImage imageNamed:@"1.png"];
/*
 Convert the image to a base64 string.
 On iOS there are two simple ways to get an image's data: UIImageJPEGRepresentation and UIImagePNGRepresentation.
 UIImageJPEGRepresentation takes two parameters, the image and a compression quality, while UIImagePNGRepresentation takes only the image.
 In practice, UIImagePNGRepresentation(image) returns noticeably more data than UIImageJPEGRepresentation(image, 1.0).
 For example, for the same scene shot with the camera, UIImagePNGRepresentation() returned about 199 KB,
 while UIImageJPEGRepresentation(image, 1.0) returned about 140 KB, more than 50 KB less.
 If high fidelity is not required, lowering the second parameter of UIImageJPEGRepresentation shrinks the data dramatically.
 For the same photo, UIImageJPEGRepresentation(image, 1.0) returned about 140 KB,
 but with a lower quality factor, UIImageJPEGRepresentation(image, 0.5) returned only a little over 11 KB,
 and visually the quality did not drop noticeably. So when reading image data,
 prefer UIImageJPEGRepresentation and tune the quality factor to your use case to keep the data small.
 */
NSData *_data = UIImageJPEGRepresentation(image, 1.0f);
// Convert the image data to a base64 string
NSString *strimage64 = [_data base64EncodedStringWithOptions:0];
[arr addObject:strimage64];
// Write it into the plist file
if ([arr writeToFile:newPath atomically:YES]) {
    NSLog(@"write succeeded");
}
// You can open the plist under the sandbox path to inspect the stored image data.
// That takes care of saving; when the image is needed, turn the stored string back into an image:
// NSData *_decodedImageData = [[NSData alloc] initWithBase64Encoding:strimage64]; // pre-iOS 7 API
NSData *_decodedImageData = [[NSData alloc] initWithBase64EncodedString:strimage64 options:NSDataBase64DecodingIgnoreUnknownCharacters];
UIImage *_decodedImage = [UIImage imageWithData:_decodedImageData];
// Check that the image was restored
NSLog(@"===Decoded image size: %@", NSStringFromCGSize(_decodedImage.size));
// Print the sandbox path so you can find the file
NSLog(@"imgPath = %@", NSHomeDirectory());
(2) Save the image directly into the sandbox and store its path; when the image is needed, first fetch the path and then load the image from it.
// Get the image
UIImage *image2 = [UIImage imageNamed:@"1.png"];
NSString *path_document = NSHomeDirectory();
// Build a storage path for the image
NSString *imagePath = [path_document stringByAppendingString:@"/Documents/pic.png"];
// Write the image straight to that path (you should also persist imagePath so it can be used later to load the image)
[UIImagePNGRepresentation(image2) writeToFile:imagePath atomically:YES];
// Next time, load the image directly from the stored path
UIImage *getimage2 = [UIImage imageWithContentsOfFile:imagePath];
NSLog(@"image2 size is %@", NSStringFromCGSize(getimage2.size));
Common sandbox paths
// Home directory
NSString *homeDirectory = NSHomeDirectory();
// Documents directory
NSArray *documentPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentPath = [documentPaths objectAtIndex:0];
// Caches directory
NSArray *cachePaths = NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES);
NSString *cachePath = [cachePaths objectAtIndex:0];
// Library directory
NSArray *libraryPaths = NSSearchPathForDirectoriesInDomains(NSLibraryDirectory, NSUserDomainMask, YES);
NSString *libraryPath = [libraryPaths objectAtIndex:0];
// NSUserDefaults is stored under Library/Preferences
NSUserDefaults *users = [NSUserDefaults standardUserDefaults];
[users setBool:YES forKey:@"login"];
[users setValue:value forKey:@"userID"];   // `value` stands for whatever user ID you want to persist
NSLog(@"%@", NSHomeDirectory());
[users synchronize];
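And reading those values back later, a minimal sketch using the same keys:
BOOL isLoggedIn = [[NSUserDefaults standardUserDefaults] boolForKey:@"login"];
id userID = [[NSUserDefaults standardUserDefaults] objectForKey:@"userID"];
NSLog(@"login = %d, userID = %@", isLoggedIn, userID);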
Reference blog posts:
1. http://blog.csdn.net/u012716788/article/details/49564027
2. http://blog.csdn.net/apple_app/article/details/38847357