The main classes in this module are the following:
AFURLRequestSerialization (protocol)
AFHTTPRequestSerializer (root class)
    AFMultipartFormData (multipart form, protocol)
    AFJSONRequestSerializer
    AFPropertyListRequestSerializer
AFURLResponseSerialization (protocol)
AFHTTPResponseSerializer (root class)
    AFJSONResponseSerializer (the default)
    AFXMLParserResponseSerializer
    AFXMLDocumentResponseSerializer (macOS only)
    AFPropertyListResponseSerializer
    AFImageResponseSerializer (an important one)
    AFCompoundResponseSerializer
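Both serializer families plug into AFHTTPSessionManager through its requestSerializer and responseSerializer properties. As a quick sketch of how they can be swapped (this snippet is illustrative and not part of the article's example):

// Minimal sketch: swapping serializers on a session manager
AFHTTPSessionManager *manager = [AFHTTPSessionManager manager];
// Encode POST parameters as a JSON body instead of the default form-urlencoded body
manager.requestSerializer = [AFJSONRequestSerializer serializer];
// Hand back the raw NSData instead of parsed JSON (AFJSONResponseSerializer is the default)
manager.responseSerializer = [AFHTTPResponseSerializer serializer];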
Example
We introduce this module with an example that uploads an image.
- (void)multipart {
    // 1. Use the AFHTTPSessionManager interface
    AFHTTPSessionManager *manager = [AFHTTPSessionManager manager];
    NSDictionary *dic = @{@"businessType": @"CC_USER_CENTER",
                          @"fileType": @"image",
                          @"file": @"img.jpeg"
                          };
    [manager POST:@"http://114.215.186.169:9002/api/demo/test/file" parameters:dic constructingBodyWithBlock:^(id<AFMultipartFormData> _Nonnull formData) {
        // 2. Attach the file(s) to upload inside this block
        NSString *path = [[NSBundle mainBundle] pathForResource:@"1" ofType:@"png"];
        // Append the local image to formData, specifying the part name
        [formData appendPartWithFileURL:[NSURL fileURLWithPath:path] name:@"file" error:nil];
        // Or use this interface instead, specifying both name and filename
        // NSData *picdata = [NSData dataWithContentsOfFile:path];
        // [formData appendPartWithFileData:picdata name:@"image" fileName:@"image.jpg" mimeType:@"image/jpeg"];
    } progress:^(NSProgress * _Nonnull uploadProgress) {
        NSLog(@"progress --- %@", uploadProgress.localizedDescription);
    } success:^(NSURLSessionDataTask * _Nonnull task, id _Nullable responseObject) {
        NSLog(@"responseObject-------%@", responseObject);
        dispatch_async(dispatch_get_main_queue(), ^{
        });
    } failure:^(NSURLSessionDataTask * _Nullable task, NSError * _Nonnull error) {
        NSLog(@"Error-------%@", error);
    }];
}
When POSTing data to a server, the request generally consists of three parts: the request line, the request headers, and the request body. Reading a large body from a file, a socket, or an NSData into memory all at once is very expensive, so to keep memory from ballooning AFNetworking streams the body in chunks instead of building the whole thing in memory.
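The principle can be sketched with plain Foundation APIs: instead of materializing the whole body via HTTPBody, the body is supplied as a stream that the URL loading system drains piece by piece. This is only a simplified illustration; AFNetworking's actual implementation builds its own AFMultipartBodyStream. The file path and URL below are placeholders:

// Sketch: supply a large request body as a stream rather than an in-memory NSData
NSString *filePath = @"/tmp/large-upload.dat"; // placeholder path
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"http://example.com/upload"]]; // placeholder URL
request.HTTPMethod = @"POST";
// The stream is consumed in chunks while the request is being sent,
// so the whole file never has to sit in memory at once
request.HTTPBodyStream = [NSInputStream inputStreamWithFileAtPath:filePath];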
[AFHTTPSessionManager manager] works the same as before: it just performs initialization. The [manager POST:...] method, however, does quite a lot: it takes the data we supply and assembles all three parts of the request for us, saving us a great deal of manual assembly.
Let's look at it in detail:
- (NSURLSessionDataTask *)POST:(NSString *)URLString
                    parameters:(id)parameters
     constructingBodyWithBlock:(void (^)(id formData))block
                      progress:(nullable void (^)(NSProgress * _Nonnull))uploadProgress
                       success:(void (^)(NSURLSessionDataTask *task, id responseObject))success
                       failure:(void (^)(NSURLSessionDataTask *task, NSError *error))failure
{
    NSError *serializationError = nil;
    NSMutableURLRequest *request = [self.requestSerializer multipartFormRequestWithMethod:@"POST" URLString:[[NSURL URLWithString:URLString relativeToURL:self.baseURL] absoluteString] parameters:parameters constructingBodyWithBlock:block error:&serializationError];
    if (serializationError) {
        if (failure) {
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wgnu"
            dispatch_async(self.completionQueue ?: dispatch_get_main_queue(), ^{
                failure(nil, serializationError);
            });
#pragma clang diagnostic pop
        }

        return nil;
    }

    __block NSURLSessionDataTask *task = [self uploadTaskWithStreamedRequest:request progress:uploadProgress completionHandler:^(NSURLResponse * __unused response, id responseObject, NSError *error) {
        if (error) {
            if (failure) {
                failure(task, error);
            }
        } else {
            if (success) {
                success(task, responseObject);
            }
        }
    }];

    [task resume];

    return task;
}
requestSerialization
Stepping into the POST method, we can see it does two things: it generates the request, and it wraps that request in a task and returns it. Creating the task from the request is mostly the session manager's job, which was covered in the previous article. Below we focus on how the request body is generated. The source is as follows:
// Build a multipart form request; the request body is assembled through the `AFMultipartFormData`-typed formData object
- (NSMutableURLRequest *)multipartFormRequestWithMethod:(NSString *)method
                                              URLString:(NSString *)URLString
                                             parameters:(NSDictionary *)parameters
                              constructingBodyWithBlock:(void (^)(id formData))block
                                                  error:(NSError *__autoreleasing *)error
{
    NSParameterAssert(method);
    // method must not be GET or HEAD
    NSParameterAssert(![method isEqualToString:@"GET"] && ![method isEqualToString:@"HEAD"]);

    NSMutableURLRequest *mutableRequest = [self requestWithMethod:method URLString:URLString parameters:nil error:error];

    // Initialize an AFStreamingMultipartFormData object with initWithURLRequest:stringEncoding:,
    // mainly so the bodyStream can be built
    __block AFStreamingMultipartFormData *formData = [[AFStreamingMultipartFormData alloc] initWithURLRequest:mutableRequest stringEncoding:NSUTF8StringEncoding];

    if (parameters) {
        for (AFQueryStringPair *pair in AFQueryStringPairsFromDictionary(parameters)) {
            NSData *data = nil;
            if ([pair.value isKindOfClass:[NSData class]]) {
                data = pair.value;
            } else if ([pair.value isEqual:[NSNull null]]) {
                data = [NSData data];
            } else {
                // NSLog normally goes through description, which prints the address,
                // but description can be overridden to print whatever representation we want
                data = [[pair.value description] dataUsingEncoding:self.stringEncoding];
            }

            if (data) {
                // This is the core of building the bodyStream (although
                // requestByFinalizingMultipartFormData will still do a little more work later).
                // Builds the part's headers and body from data and name; explained in detail below
                [formData appendPartWithFormData:data name:[pair.field description]];
            }
        }
    }

    // As in the example above, this block simply appends more data to formData
    if (block) {
        block(formData);
    }

    // Final touches, e.g. installing the multipart request's bodyStream and its
    // multipart-specific Content-Type; also explained in detail below
    return [formData requestByFinalizingMultipartFormData];
}
Differences between an ordinary POST and a multipart POST:
1. the Content-Type header;
2. HTTPBody vs. HTTPBodyStream.
In short, request serialization does four things: 1. dynamically observes our serializer properties; 2. sets the request headers; 3. builds the query string (a simplified sketch follows below); 4. streams the upload in chunks.
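As for "builds the query string", the library flattens the parameters into percent-escaped field=value pairs joined by &. Here is a simplified sketch of the idea; AFNetworking's real AFQueryStringFromParameters also handles nested dictionaries and arrays, sorts keys, and escapes a stricter character set. SimpleQueryString below is an illustrative helper, not a library API:

// Illustrative helper: flatten a flat NSDictionary into a percent-escaped query string
static NSString * SimpleQueryString(NSDictionary<NSString *, NSString *> *parameters) {
    NSMutableArray<NSString *> *pairs = [NSMutableArray array];
    NSCharacterSet *allowed = [NSCharacterSet URLQueryAllowedCharacterSet];
    [parameters enumerateKeysAndObjectsUsingBlock:^(NSString *field, NSString *value, BOOL *stop) {
        NSString *escapedField = [field stringByAddingPercentEncodingWithAllowedCharacters:allowed];
        NSString *escapedValue = [value stringByAddingPercentEncodingWithAllowedCharacters:allowed];
        [pairs addObject:[NSString stringWithFormat:@"%@=%@", escapedField, escapedValue]];
    }];
    return [pairs componentsJoinedByString:@"&"];
}

// Usage: SimpleQueryString(@{@"fileType": @"image"}) returns @"fileType=image"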
responseSerialization
For response serialization, one point worth noting concerns images: the image data we download is compressed (PNG or JPEG). If we do not decompress (decode) it ourselves, the decoding happens lazily when the image is first rendered, which means it runs on the main thread and seriously hurts performance. Fortunately, AFNetworking already does this decompression for us.
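This up-front decoding is driven by AFImageResponseSerializer; on iOS its automaticallyInflatesResponseImage property (which defaults to YES) controls whether the downloaded data is inflated before being handed back. A small configuration sketch:

AFHTTPSessionManager *manager = [AFHTTPSessionManager manager];
AFImageResponseSerializer *imageSerializer = [AFImageResponseSerializer serializer];
// Defaults to YES on iOS: decode the image data as part of serialization,
// off the main thread, instead of lazily at render time
imageSerializer.automaticallyInflatesResponseImage = YES;
manager.responseSerializer = imageSerializer;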
Below is the core image-inflation code, AFInflatedImageFromResponseWithDataAtScale. It is a bit long.
static UIImage * AFInflatedImageFromResponseWithDataAtScale(NSHTTPURLResponse *response, NSData *data, CGFloat scale) {
    if (!data || [data length] == 0) {
        return nil;
    }

    CGImageRef imageRef = NULL;
    // CoreGraphics wrapper around the data
    CGDataProviderRef dataProvider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);

    // Decode according to the image format
    if ([response.MIMEType isEqualToString:@"image/png"]) {
        imageRef = CGImageCreateWithPNGDataProvider(dataProvider, NULL, true, kCGRenderingIntentDefault);
    } else if ([response.MIMEType isEqualToString:@"image/jpeg"]) {
        imageRef = CGImageCreateWithJPEGDataProvider(dataProvider, NULL, true, kCGRenderingIntentDefault);

        if (imageRef) {
            CGColorSpaceRef imageColorSpace = CGImageGetColorSpace(imageRef);
            CGColorSpaceModel imageColorSpaceModel = CGColorSpaceGetModel(imageColorSpace);

            // CGImageCreateWithJPEGDataProvider does not properly handle CMYK, so fall back to AFImageWithDataAtScale
            // CMYK is the subtractive color model used for printing and cannot be converted to a bitmap here,
            // unlike RGB, the color model used for displays
            if (imageColorSpaceModel == kCGColorSpaceModelCMYK) {
                CGImageRelease(imageRef);
                imageRef = NULL;
            }
        }
    }

    CGDataProviderRelease(dataProvider);

    // Create a UIImage from the raw data, still in its original (compressed) format
    UIImage *image = AFImageWithDataAtScale(data, scale);
    if (!imageRef) {
        // imageRef is NULL: the data was not decoded above (not PNG/JPEG, or decoding failed).
        // If the image is animated or nil, return the original image directly
        if (image.images || !image) {
            return image;
        }

        // Otherwise copy the CGImage out of the UIImage so it can be inflated below
        imageRef = CGImageCreateCopy([image CGImage]);
        if (!imageRef) {
            return nil;
        }
    }

    size_t width = CGImageGetWidth(imageRef);
    size_t height = CGImageGetHeight(imageRef);
    // Bits per color component of the image
    size_t bitsPerComponent = CGImageGetBitsPerComponent(imageRef);

    // If the image has more than 8 bits per component or more than 1024*1024 pixels,
    // it is too large to inflate here; return the original image
    if (width * height > 1024 * 1024 || bitsPerComponent > 8) {
        CGImageRelease(imageRef);
        return image;
    }

    // Gather the information needed for the bitmap context
    // CGImageGetBytesPerRow() calculates incorrectly in iOS 5.0, so defer to CGBitmapContextCreate
    size_t bytesPerRow = 0;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGColorSpaceModel colorSpaceModel = CGColorSpaceGetModel(colorSpace);
    CGBitmapInfo bitmapInfo = CGImageGetBitmapInfo(imageRef);

    // For the RGB color model, adjust the bitmap info according to whether the pixels carry an alpha channel
    if (colorSpaceModel == kCGColorSpaceModelRGB) {
        uint32_t alpha = (bitmapInfo & kCGBitmapAlphaInfoMask);
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wassign-enum"
        if (alpha == kCGImageAlphaNone) {
            bitmapInfo &= ~kCGBitmapAlphaInfoMask;
            bitmapInfo |= kCGImageAlphaNoneSkipFirst;
        } else if (!(alpha == kCGImageAlphaNoneSkipFirst || alpha == kCGImageAlphaNoneSkipLast)) {
            bitmapInfo &= ~kCGBitmapAlphaInfoMask;
            bitmapInfo |= kCGImageAlphaPremultipliedFirst;
        }
#pragma clang diagnostic pop
    }

    // Create the bitmap context
    CGContextRef context = CGBitmapContextCreate(NULL, width, height, bitsPerComponent, bytesPerRow, colorSpace, bitmapInfo);

    CGColorSpaceRelease(colorSpace);

    if (!context) {
        CGImageRelease(imageRef);
        return image;
    }

    // Draw the image into the context; this is where the actual decoding happens
    CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, width, height), imageRef);
    // Extract the decoded (bitmap) image
    CGImageRef inflatedImageRef = CGBitmapContextCreateImage(context);

    CGContextRelease(context);

    // Wrap it in a UIImage
    UIImage *inflatedImage = [[UIImage alloc] initWithCGImage:inflatedImageRef scale:scale orientation:image.imageOrientation];
    CGImageRelease(inflatedImageRef);
    CGImageRelease(imageRef);

    return inflatedImage;
}
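With AFImageResponseSerializer in place, the object delivered to the success block is already an inflated UIImage, so handing it to UIKit costs no extra decoding on the main thread. A usage sketch (the URL and the imageView property are placeholders):

[manager GET:@"http://example.com/avatar.png" // placeholder URL
  parameters:nil
    progress:nil
     success:^(NSURLSessionDataTask * _Nonnull task, id _Nullable responseObject) {
         // responseObject is an already-decoded UIImage thanks to the inflation above;
         // self.imageView is a placeholder image view
         self.imageView.image = (UIImage *)responseObject;
     } failure:^(NSURLSessionDataTask * _Nullable task, NSError * _Nonnull error) {
         NSLog(@"Error: %@", error);
     }];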