Since we have a requirement for WeChat-style short videos, let's skip the chatter and get straight to it.
There are roughly three approaches: UIImagePickerController, AVCaptureSession + AVCaptureMovieFileOutput, and AVCaptureSession + AVAssetWriter. I'll walk through them from simplest to hardest, then cover the various pitfalls along the way, especially on iPhone X and iPad.
UIImagePickerController
This is currently the simplest way to capture video, though the degree of customization drops accordingly.
1. Create a UIImagePickerController and configure its sourceType, mediaTypes, delegate, and so on:
UIImagePickerController *systemImagePickerVc = [[UIImagePickerController alloc] init];
systemImagePickerVc.delegate = self;
systemImagePickerVc.sourceType = UIImagePickerControllerSourceTypeCamera; // must be set before cameraCaptureMode below
systemImagePickerVc.navigationBar.barTintColor = viewController.navigationController.navigationBar.barTintColor;
systemImagePickerVc.navigationBar.tintColor = viewController.navigationController.navigationBar.tintColor;
systemImagePickerVc.videoMaximumDuration = 10;
systemImagePickerVc.mediaTypes = @[(NSString *)kUTTypeImage, (NSString *)kUTTypeMovie];
systemImagePickerVc.videoQuality = UIImagePickerControllerQualityTypeHigh;
systemImagePickerVc.cameraCaptureMode = UIImagePickerControllerCameraCaptureModePhoto;
systemImagePickerVc.modalPresentationStyle = UIModalPresentationOverCurrentContext;
systemImagePickerVc.modalTransitionStyle = UIModalTransitionStyleCrossDissolve; // fades in instead of sliding up
[viewController presentViewController:systemImagePickerVc animated:YES completion:nil];
Then do the work in the delegate: pull out the image or video and process it however you like:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *type = [info objectForKey:UIImagePickerControllerMediaType];
    if ([type isEqualToString:(NSString *)kUTTypeImage]) {
        UIImage *image = info[UIImagePickerControllerOriginalImage]; // the captured photo
    } else if ([type isEqualToString:(NSString *)kUTTypeMovie]) {
        NSURL *videoURL = info[UIImagePickerControllerMediaURL]; // file URL of the recorded video
    }
}
2. UIImagePickerController supports a custom UI: hide the default camera controls, build your own view with the controls you want, and lay it over the camera preview:
UIView *cameraOverlayView = [UIView new];
picker.showsCameraControls = NO;
picker.cameraOverlayView = cameraOverlayView;
Summary: UIImagePickerController is fairly bare-bones, but trivial to adopt. If the camera isn't a core feature of your app, consider this as the simple route; there are essentially no pitfalls.
AVCaptureSession + AVCaptureMovieFileOutput
I. Video recording
1. First, create the AVCaptureSession:
self.captureSession = ({
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    if (IS_IPHONEX) {
        if ([session canSetSessionPreset:AVCaptureSessionPreset1920x1080]) {
            [session setSessionPreset:AVCaptureSessionPreset1920x1080];
        }
    } else {
        if ([session canSetSessionPreset:AVCaptureSessionPresetHigh]) {
            [session setSessionPreset:AVCaptureSessionPresetHigh];
        }
    }
    session;
});
2. Add the video and audio inputs:
// Video input
AVCaptureDevice *captureDevice = [self getCameraDeviceWithPosition:_cameraDevice];
NSError *error = nil;
self.captureDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:captureDevice error:&error];
if (error) {
    PRINT("captureDeviceInput error:%@", error.localizedDescription)
    return NO;
}
// Audio input
AVCaptureDevice *audioCaptureDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
error = nil;
AVCaptureDeviceInput *audioCaptureDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:audioCaptureDevice error:&error];
if (error) {
    PRINT("audioCaptureDeviceInput error:%@", error.localizedDescription)
    return NO;
}
// Add the inputs to the session
if ([self.captureSession canAddInput:self.captureDeviceInput]) {
    [self.captureSession addInput:self.captureDeviceInput];
    [self.captureSession addInput:audioCaptureDeviceInput];
    // Video stabilization (note: this connection only exists once the
    // movie file output from step 3 has been added to the session)
    AVCaptureConnection *connection = [self.captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([connection isVideoStabilizationSupported]) {
        connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeCinematic;
    }
}
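The getCameraDeviceWithPosition: helper used above isn't shown in the original; a minimal sketch, using the same pre-iOS-10 device enumeration API as the audio input:

// Hypothetical helper: returns the camera device at the requested position.
- (AVCaptureDevice *)getCameraDeviceWithPosition:(AVCaptureDevicePosition)position {
    NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *camera in cameras) {
        if (camera.position == position) {
            return camera;
        }
    }
    return nil;
}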
3. Add the movie output and the still-image output:
// Movie file output
self.captureMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([self.captureSession canAddOutput:self.captureMovieFileOutput]) {
    [self.captureSession addOutput:self.captureMovieFileOutput];
}
// Still-image output
self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[self.stillImageOutput setOutputSettings:outputSettings];
if ([self.captureSession canAddOutput:self.stillImageOutput]) {
    [self.captureSession addOutput:self.stillImageOutput];
}
4. Finally, attach the session to a layer for display:
// Create the video preview layer
self.captureVideoPreviewLayer = ({
    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
    previewLayer.frame = self.AVCaptureBackgroundView.bounds;
    if (IS_IPAD) {
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    } else {
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    }
    // Use the local previewLayer here: self.captureVideoPreviewLayer is still nil inside its own initializer block
    if ([previewLayer.connection isVideoOrientationSupported]) {
        [previewLayer.connection setVideoOrientation:[self avOrientationForDeviceOrientation:[UIDevice currentDevice].orientation]];
    }
    [self.AVCaptureBackgroundView.layer insertSublayer:previewLayer atIndex:0];
    previewLayer;
});
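The avOrientationForDeviceOrientation: helper used here (and again when recording starts) isn't shown in the original; a plausible sketch. Note that the two landscape cases are swapped between UIDeviceOrientation and AVCaptureVideoOrientation:

// Hypothetical mapping helper: device-left corresponds to video-right and vice versa.
- (AVCaptureVideoOrientation)avOrientationForDeviceOrientation:(UIDeviceOrientation)deviceOrientation {
    switch (deviceOrientation) {
        case UIDeviceOrientationPortraitUpsideDown:
            return AVCaptureVideoOrientationPortraitUpsideDown;
        case UIDeviceOrientationLandscapeLeft:
            return AVCaptureVideoOrientationLandscapeRight;
        case UIDeviceOrientationLandscapeRight:
            return AVCaptureVideoOrientationLandscapeLeft;
        default:
            return AVCaptureVideoOrientationPortrait;
    }
}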
5. Start recording:
// Get the connection for the movie file output
AVCaptureConnection *connection = [self.captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
// Start or stop based on the output's current state
if (![self.captureMovieFileOutput isRecording]) {
    // Begin a background task if multitasking is supported
    if ([[UIDevice currentDevice] isMultitaskingSupported]) {
        // self.backgroundTaskIdentifier = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil];
    }
    if (self.saveVideoUrl) {
        [[NSFileManager defaultManager] removeItemAtURL:self.saveVideoUrl error:nil];
    }
    if ([connection isVideoOrientationSupported]) {
        connection.videoOrientation = [self avOrientationForDeviceOrientation:[UIDevice currentDevice].orientation];
    }
    if ([connection isVideoMirroringSupported]) {
        AVCaptureDevicePosition currentPosition = [[self.captureDeviceInput device] position];
        if (currentPosition == AVCaptureDevicePositionUnspecified || currentPosition == AVCaptureDevicePositionFront) {
            connection.videoMirrored = YES;
        } else {
            connection.videoMirrored = NO;
        }
    }
    NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"myMovie.mov"];
    PRINT("save path is :%@", outputFilePath)
    [self.captureMovieFileOutput startRecordingToOutputFileURL:[NSURL fileURLWithPath:outputFilePath] recordingDelegate:self];
} else {
    [self.captureMovieFileOutput stopRecording];
}
Stopping is simply the [self.captureMovieFileOutput stopRecording]; in the else branch.
6. Retrieve the file in the AVCaptureFileOutputRecordingDelegate methods. There are two callbacks because the actual start of capture lags behind the startRecording call (and the same delay applies to stopping), so the handling generally belongs in the delegate:
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections {
    PRINT("begin record")
}
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
    PRINT("end record")
}
II. Taking photos
The recording setup above already added an AVCaptureStillImageOutput, so we grab the photo directly from it.
AVCaptureConnection *imageConnection = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
if (!imageConnection) {
    return;
}
if ([imageConnection isVideoOrientationSupported]) {
    imageConnection.videoOrientation = [self avOrientationForDeviceOrientation:[UIDevice currentDevice].orientation];
}
if ([imageConnection isVideoMirroringSupported]) {
    AVCaptureDevicePosition currentPosition = [[self.captureDeviceInput device] position];
    if (currentPosition == AVCaptureDevicePositionUnspecified || currentPosition == AVCaptureDevicePositionFront) {
        imageConnection.videoMirrored = YES;
    } else {
        imageConnection.videoMirrored = NO;
    }
}
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:imageConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (imageDataSampleBuffer == NULL) {
        return;
    }
    [self stopSession];
    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
    UIImage *image = [UIImage imageWithData:imageData];
    if (!image) {
        return;
    }
    self.resultImage = [image fixOrientation];
}];
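fixOrientation here is a UIImage category that isn't shown in the original; a common minimal implementation redraws the image so its orientation metadata becomes "up":

// UIImage+FixOrientation (sketch): drawInRect honors imageOrientation,
// so redrawing bakes the rotation into the pixels themselves.
- (UIImage *)fixOrientation {
    if (self.imageOrientation == UIImageOrientationUp) {
        return self;
    }
    UIGraphicsBeginImageContextWithOptions(self.size, NO, self.scale);
    [self drawInRect:CGRectMake(0, 0, self.size.width, self.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}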
III. Pitfalls
That's the overall flow. Now for the pitfalls:
1. The biggest pitfall: careful readers will have noticed the IS_IPAD and IS_IPHONEX macros, and that we can't get the full-screen look of a 6s-class device. The camera sensor's dimensions don't match the screen's (i.e. mainScreen.bounds), so the photo or video you capture differs in size and extent from what you saw while shooting; specifically, iPhone X and iPad both capture more than what was visible on screen. That hurts both user experience and privacy, since users may capture things they never meant to have in frame. The workaround here is to show the user exactly the sensor's frame before shooting: because of the aspect-fit mode, the pre-capture preview is not full screen on iPhone X or iPad. There are two complete fixes: (1) keep the full-screen preview while shooting and crop the video or photo afterwards (slow, poor experience); (2) use the third approach, AVAssetWriter, and process every frame yourself. That approach solves it cleanly and is covered later.
2. The front camera is automatically mirrored by the system, so you mirror it once more to flip it back. If the user switches cameras, the mirroring must be reconfigured, which is why the code above sets it at the moment recording starts.
3. Audio-session usage: with this screen open, press home and you'll see a red flash in the status bar; that's the microphone-in-use indicator. QQ behaves the same way, but WeChat doesn't, presumably because it separates audio and video capture and merges them afterwards. This affects audio use elsewhere in your app: if you have voice calls, for instance, you need to deactivate the audio session (audioSession setActive:NO) before the other audio use begins, and since deactivation itself takes time, getting the timing of that logic right matters.
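A minimal sketch of that hand-off, assuming the capture session has already been stopped first (the NotifyOthersOnDeactivation option lets other apps' interrupted audio resume):

NSError *error = nil;
// Release the audio hardware so other audio (e.g. an in-app call) can start.
[[AVAudioSession sharedInstance] setActive:NO
                               withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation
                                     error:&error];
if (error) {
    PRINT("audio session deactivation error:%@", error.localizedDescription)
}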
4. stopRunning and startRunning block the calling thread, so dispatch them onto a background serial queue. They also take effect with a delay rather than instantly; success is signaled by the AVCaptureSessionDidStartRunningNotification and AVCaptureSessionDidStopRunningNotification notifications. Wrap them up for reuse:
- (void)stopRunningSession:(AVCaptureSession *)session completion:(void(^)(BOOL success))completion {
    if (!session || ![session isRunning]) {
        if (completion) {
            completion(YES);
        }
        return;
    }
    __block NSObject *stopRunningOKObserver = nil;
    stopRunningOKObserver = [[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionDidStopRunningNotification object:session queue:[NSOperationQueue mainQueue] usingBlock:^(NSNotification * _Nonnull note) {
        if (note.object == session) {
            PRINT("session stopRunning success")
            if (completion) {
                completion(YES);
            }
            [[NSNotificationCenter defaultCenter] removeObserver:stopRunningOKObserver];
            stopRunningOKObserver = nil;
        }
    }];
    dispatch_async(self.sessionHandleQueue, ^{
        PRINT("session stopRunning")
        [session stopRunning];
    });
}
- (void)startRunningSession:(AVCaptureSession *)session completion:(void(^)(BOOL success))completion {
    if (!session || [session isRunning]) {
        if (completion) {
            completion(NO);
        }
        return;
    }
    __block NSObject *startRunningOKObserver = nil;
    startRunningOKObserver = [[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionDidStartRunningNotification object:session queue:[NSOperationQueue mainQueue] usingBlock:^(NSNotification * _Nonnull note) {
        if (note.object == session) {
            if (completion) {
                completion(YES);
            }
            [[NSNotificationCenter defaultCenter] removeObserver:startRunningOKObserver];
            startRunningOKObserver = nil;
        }
    }];
    dispatch_async(self.sessionHandleQueue, ^{
        [session startRunning];
    });
}
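One possible way to use the wrapper (a sketch; AVCaptureBackgroundView is the preview's host view from earlier):

// Reveal the preview only once the session is actually running,
// which sidesteps the startup flash described in pitfall 5 below.
[self startRunningSession:self.captureSession completion:^(BOOL success) {
    if (success) {
        self.AVCaptureBackgroundView.hidden = NO;
    }
}];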
5. Solving pitfall 4 surfaces a new one. Because [self.captureSession startRunning] and [self.captureSession stopRunning] take effect with a delay, the preview goes black (or flashes) when starting or stopping. Handle the timing carefully, for example by showing the view hosting the preview layer only after startup has completed (which is why some apps' scan screens open with an animation, or simply a brief black frame). Also, if you want to freeze the picture mid-session, i.e. something like [self.captureSession pauseRunning] (an API that doesn't exist), you can disable the connection instead: self.captureVideoPreviewLayer.connection.enabled = NO; and self.captureVideoPreviewLayer.connection.enabled = YES; give you pause and resume.
6. Similar to pitfall 5: switching between front and back cameras involves removeInput/addInput operations, and the switch flashes or blacks out. You can cover it with a transition animation; WeChat, for example, lays a blur effect over the preview to mask the glitch. See the sketch after the code below.
[self.captureSession beginConfiguration];
// Remove the old input
[self.captureSession removeInput:self.captureDeviceInput];
// Add the new one
if ([self.captureSession canAddInput:newCaptureDeviceInput]) {
    [self.captureSession addInput:newCaptureDeviceInput];
    self.captureDeviceInput = newCaptureDeviceInput;
}
[self.captureSession commitConfiguration];
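A sketch of that masking trick, hypothetically wrapping the input swap above with a blur overlay on the preview's host view:

// Cover the preview before reconfiguring, then fade the cover away.
UIVisualEffectView *blurView = [[UIVisualEffectView alloc] initWithEffect:[UIBlurEffect effectWithStyle:UIBlurEffectStyleLight]];
blurView.frame = self.AVCaptureBackgroundView.bounds;
[self.AVCaptureBackgroundView addSubview:blurView];
// ... beginConfiguration / input swap / commitConfiguration as above ...
[UIView animateWithDuration:0.25 animations:^{
    blurView.alpha = 0;
} completion:^(BOOL finished) {
    [blurView removeFromSuperview];
}];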
7. There's also focus, flash, white balance and so on when taking photos; I'll spare you that code.
Summary: this approach is good for quickly building your own photo/recording UI, but it falls short on iPhone X-class devices, and with more notch screens on the way it isn't ideal. It does offer some extra configuration, such as stopping recording after a duration, at a given file size, or when the device's free disk space drops below a threshold (see the sketch below). If you need more than that, like custom audio/video compression rates, or processing the samples before they're written to file, read on to the third approach.
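Those limits correspond to real AVCaptureFileOutput properties; a minimal sketch of configuring them (the numbers are arbitrary examples):

// Stop conditions on the movie file output; when one is hit, recording
// ends and the didFinishRecording delegate fires with a descriptive error.
self.captureMovieFileOutput.maxRecordedDuration = CMTimeMake(10, 1);   // cap at 10 seconds
self.captureMovieFileOutput.maxRecordedFileSize = 50 * 1024 * 1024;    // cap at 50 MB
self.captureMovieFileOutput.minFreeDiskSpaceLimit = 100 * 1024 * 1024; // stop below 100 MB free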
The AVCaptureMovieFileOutput demo is here, including post-capture image cropping, mosaic, doodling, and more.
AVCaptureSession + AVAssetWriter
The most complete solution, and also the most work.
1. First remove the AVCaptureMovieFileOutput created in the second approach, then add audio and video data outputs. Note the videoQueue, a single serial queue shared by both outputs, and the two new delegate protocols AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureAudioDataOutputSampleBufferDelegate.
@property (nonatomic, strong) dispatch_queue_t videoQueue;
@property (nonatomic, strong) AVCaptureVideoDataOutput *videoOutput;
@property (nonatomic, strong) AVCaptureAudioDataOutput *audioOutput;
// The shared serial callback queue (creation not shown in the original; any serial queue works)
self.videoQueue = dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL);
// Video data output
self.videoOutput = [[AVCaptureVideoDataOutput alloc] init];
[self.videoOutput setSampleBufferDelegate:self queue:self.videoQueue];
if ([self.captureSession canAddOutput:self.videoOutput]) {
    [self.captureSession addOutput:self.videoOutput];
    [self.captureSession beginConfiguration];
    AVCaptureConnection *connection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([connection isVideoStabilizationSupported]) {
        connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeCinematic;
    }
    if ([connection isVideoMirroringSupported]) {
        AVCaptureDevicePosition currentPosition = [[self.captureDeviceInput device] position];
        if (currentPosition == AVCaptureDevicePositionUnspecified || currentPosition == AVCaptureDevicePositionFront) {
            connection.videoMirrored = YES;
        } else {
            connection.videoMirrored = NO;
        }
    }
    [self.captureSession commitConfiguration];
}
// Audio data output
self.audioOutput = [[AVCaptureAudioDataOutput alloc] init];
[self.audioOutput setSampleBufferDelegate:self queue:self.videoQueue];
if ([self.captureSession canAddOutput:self.audioOutput]) {
    [self.captureSession addOutput:self.audioOutput];
}
At this point you get real-time callbacks in the delegate method. If you return early in the delegate, that frame is simply never written, so with a shouldWrite flag you can get Douyin-style recording: record half, pause, move somewhere else, and record the rest (see the sketch after the code below).
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    @autoreleasepool {
        // Video
        if (connection == [self.videoOutput connectionWithMediaType:AVMediaTypeVideo]) {
            if (!self.assetWriteManager.outputVideoFormatDescription) {
                @synchronized(self) {
                    CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
                    self.assetWriteManager.outputVideoFormatDescription = formatDescription;
                }
            } else {
                @synchronized(self) {
                    if (self.assetWriteManager.writeState == FMRecordStateRecording) {
                        [self.assetWriteManager appendSampleBuffer:sampleBuffer ofMediaType:AVMediaTypeVideo];
                    }
                }
            }
        }
        // Audio
        if (connection == [self.audioOutput connectionWithMediaType:AVMediaTypeAudio]) {
            if (!self.assetWriteManager.outputAudioFormatDescription) {
                @synchronized(self) {
                    CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
                    self.assetWriteManager.outputAudioFormatDescription = formatDescription;
                }
            }
            @synchronized(self) {
                if (self.assetWriteManager.writeState == FMRecordStateRecording) {
                    [self.assetWriteManager appendSampleBuffer:sampleBuffer ofMediaType:AVMediaTypeAudio];
                }
            }
        }
    }
}
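The shouldWrite flag mentioned above isn't in the original code; a minimal sketch of the gate the text describes is just an early return at the top of the callback, so paused frames are never handed to the writer:

// Hypothetical pause gate, checked before any of the handling above:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (!self.shouldWrite) {
        return; // while paused, drop the frame; toggle the flag to resume
    }
    // ... the video/audio handling shown above ...
}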
That self.assetWriteManager is the focus of what follows.
2. For convenience, wrap the AVAssetWriter and its inputs in an AVAssetWriteManager, containing:
@property (nonatomic, strong) dispatch_queue_t writeQueue; // single serial queue for all writer operations
@property (nonatomic, strong) NSURL *videoUrl; // destination file URL the frames are written to
@property (nonatomic, strong) AVAssetWriter *assetWriter; // the writer that drives everything
@property (nonatomic, strong) AVAssetWriterInput *assetWriterVideoInput;
@property (nonatomic, strong) AVAssetWriterInput *assetWriterAudioInput;
@property (nonatomic, strong) NSDictionary *videoCompressionSettings;
@property (nonatomic, strong) NSDictionary *audioCompressionSettings;
@property (nonatomic, assign) BOOL canWrite; // gate that enables or blocks writing
@property (nonatomic, assign) CGSize outputSize; // video dimensions used when writing
Then initialize the writer. When configuring the outputSettings for the videoInput, the AVVideoWidthKey and AVVideoHeightKey parameters specify the video's width and height; this is the key to fixing pitfall 1 from the previous approach.
- (void)setUpWriterWithIsFront:(BOOL)isFront {
    self.assetWriter = [AVAssetWriter assetWriterWithURL:self.videoUrl fileType:AVFileTypeMPEG4 error:nil];
    // Total pixels of the written video
    NSInteger numPixels = self.outputSize.width * self.outputSize.height;
    // Bits per pixel
    CGFloat bitsPerPixel = 6.0;
    NSInteger bitsPerSecond = numPixels * bitsPerPixel;
    // Bitrate and frame-rate settings
    NSDictionary *compressionProperties = @{AVVideoAverageBitRateKey:@(bitsPerSecond), AVVideoExpectedSourceFrameRateKey:@(30), AVVideoMaxKeyFrameIntervalKey:@(30), AVVideoProfileLevelKey:AVVideoProfileLevelH264BaselineAutoLevel};
    // Video settings
    UIDeviceOrientation orientation = [UIDevice currentDevice].orientation;
    CGFloat widthKey = self.outputSize.height;
    CGFloat heightKey = self.outputSize.width;
    if (IS_IPAD && UIDeviceOrientationIsLandscape(orientation)) {
        widthKey = self.outputSize.width;
        heightKey = self.outputSize.height;
    } // this is a pitfall, explained below
    self.videoCompressionSettings = @{AVVideoCodecKey:AVVideoCodecH264, AVVideoScalingModeKey:AVVideoScalingModeResizeAspectFill, AVVideoWidthKey:@(widthKey), AVVideoHeightKey:@(heightKey), AVVideoCompressionPropertiesKey:compressionProperties};
    _assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:self.videoCompressionSettings];
    _assetWriterVideoInput.expectsMediaDataInRealTime = YES;
    // Front and back cameras need different rotations (see pitfall 5 below)
    switch (orientation) {
        case UIDeviceOrientationPortraitUpsideDown:
            _assetWriterVideoInput.transform = CGAffineTransformMakeRotation(isFront ? M_PI_2 : (M_PI + M_PI_2));
            break;
        case UIDeviceOrientationLandscapeLeft:
            _assetWriterVideoInput.transform = CGAffineTransformMakeRotation(isFront ? -M_PI : 0);
            break;
        case UIDeviceOrientationLandscapeRight:
            _assetWriterVideoInput.transform = CGAffineTransformMakeRotation(isFront ? 0 : M_PI);
            break;
        default:
            _assetWriterVideoInput.transform = CGAffineTransformMakeRotation(isFront ? M_PI + M_PI_2 : M_PI_2);
            break;
    }
    // Audio settings
    self.audioCompressionSettings = @{AVEncoderBitRatePerChannelKey:@(28000), AVFormatIDKey:@(kAudioFormatMPEG4AAC), AVNumberOfChannelsKey:@(1), AVSampleRateKey:@(22050)};
    _assetWriterAudioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:self.audioCompressionSettings];
    _assetWriterAudioInput.expectsMediaDataInRealTime = YES;
    if ([_assetWriter canAddInput:_assetWriterVideoInput]) {
        [_assetWriter addInput:_assetWriterVideoInput];
    } else {
        PRINT("AssetWriter videoInput append Failed")
    }
    if ([_assetWriter canAddInput:_assetWriterAudioInput]) {
        [_assetWriter addInput:_assetWriterAudioInput];
    } else {
        PRINT("AssetWriter audioInput Append Failed")
    }
    self.writeState = FMRecordStateRecording;
}
Then do the writing as each frame arrives:
// Append incoming data
- (void)appendSampleBuffer:(CMSampleBufferRef)sampleBuffer ofMediaType:(NSString *)mediaType {
    if (sampleBuffer == NULL) {
        PRINT("empty sampleBuffer")
        return;
    }
    @synchronized(self) {
        if (self.writeState < FMRecordStateRecording) {
            return;
        }
    }
    CFRetain(sampleBuffer);
    dispatch_async(self.writeQueue, ^{
        @autoreleasepool {
            @synchronized(self) {
                if (self.writeState > FMRecordStateRecording) {
                    CFRelease(sampleBuffer);
                    return;
                }
            }
            // Start the writer session on the first video frame
            if (!self.canWrite && [mediaType isEqualToString:AVMediaTypeVideo]) {
                [self.assetWriter startWriting];
                [self.assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
                self.canWrite = YES;
            }
            // Append video data
            if ([mediaType isEqualToString:AVMediaTypeVideo]) {
                if (self.assetWriterVideoInput.readyForMoreMediaData) {
                    BOOL success = [self.assetWriterVideoInput appendSampleBuffer:sampleBuffer];
                    if (!success) {
                        @synchronized (self) {
                            [self stopWrite];
                            [self destroyWrite];
                        }
                    }
                }
            }
            // Append audio data
            if ([mediaType isEqualToString:AVMediaTypeAudio]) {
                if (self.assetWriterAudioInput.readyForMoreMediaData) {
                    BOOL success = [self.assetWriterAudioInput appendSampleBuffer:sampleBuffer];
                    if (!success) {
                        @synchronized (self) {
                            [self stopWrite];
                            [self destroyWrite];
                        }
                    }
                }
            }
            CFRelease(sampleBuffer);
        }
    });
}
When you want to finish, call finishWriting and relay the result through the delegate:
- (void)stopWrite {
    self.writeState = FMRecordStateFinish;
    __weak __typeof(self)weakSelf = self;
    if (_assetWriter && _assetWriter.status == AVAssetWriterStatusWriting) {
        dispatch_async(self.writeQueue, ^{
            [_assetWriter finishWritingWithCompletionHandler:^{
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (weakSelf.delegate && [weakSelf.delegate respondsToSelector:@selector(finishWritingWithURL:)]) {
                        [weakSelf.delegate finishWritingWithURL:weakSelf.videoUrl];
                    }
                });
            }];
        });
    } else {
        if (_assetWriter) {
            PRINT("assetWriter wrong, status:%ld", (long)_assetWriter.status)
        } else {
            PRINT("assetWriter not exist")
        }
        if (weakSelf.delegate && [weakSelf.delegate respondsToSelector:@selector(finishWritingWithURL:)]) {
            [weakSelf.delegate finishWritingWithURL:weakSelf.videoUrl];
        }
    }
}
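destroyWrite is called above but its body isn't shown; a minimal sketch, assuming it just resets the writer state so the manager can be set up again (FMRecordStateInit is a hypothetical "before recording" state, consistent with the earlier < / > FMRecordStateRecording comparisons):

- (void)destroyWrite {
    self.assetWriter = nil;
    self.assetWriterVideoInput = nil;
    self.assetWriterAudioInput = nil;
    self.canWrite = NO;
    self.writeState = FMRecordStateInit; // hypothetical initial state, ordered before FMRecordStateRecording
}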
That's the overall flow, but the pitfalls in AVAssetWriter are anything but few...
Pitfalls
1. As the comment in the writer setup above says, AVVideoWidthKey and AVVideoHeightKey are a trap. The width and height you assign are swapped relative to what you'd expect: video resolution is conventionally quoted as width × height from the viewer's perspective, but the device's default orientation is landscape left, i.e. the device rotated 90°, so the actual video resolution is height × width, the reverse of the convention. On iPad in landscape the device has already rotated those 90°, so the two agree again, hence the patch:
if (IS_IPAD && UIDeviceOrientationIsLandscape(orientation)) {
    widthKey = self.outputSize.width;
    heightKey = self.outputSize.height;
}
2. About videoUrl, the destination path for the written data: no file may already exist at that path. You must remove any existing file before writing starts, otherwise the writer crashes with status == AVAssetWriterStatusFailed.
- (BOOL)checkPathUrl:(NSURL *)url {
    if (!url) {
        return NO;
    }
    if ([[NSFileManager defaultManager] fileExistsAtPath:[url path]]) {
        return [[NSFileManager defaultManager] removeItemAtPath:[url path] error:nil];
    }
    return YES;
}
3. When starting to write, make sure startWriting is called before startSessionAtSourceTime.
4. Watch your threading, and don't forget to release each retained sampleBuffer.
5. When initializing the writer, pass in the current camera position. This hides a huge pitfall: the front and back cameras rotate in opposite directions, and you have to compensate so video recorded in landscape on iPhone comes out the right way up.
6. Back in the session view controller where self.assetWriteManager is driven: re-initialize the assetWriteManager each time recording starts, otherwise you'll hit errors with status == AVAssetWriterStatusUnknown.
self.assetWriteManager = [[JTAVAssetWriteManager alloc] initWithURL:[NSURL fileURLWithPath:outputFilePath] outputSize:self.outputSize];
self.assetWriteManager.delegate = self;
[self.assetWriteManager startWriteWithIsFront:isFront];
7. To increase sharpness, raise the outputSize (bounded by what the device's camera supports), or raise the bitrate.
P.S. A demo of this third approach will be added later.
Summary: this approach is undeniably precise; every setting is customizable, it's just more work. If your requirements are high, use this one. It also extends well: features like record-pause-resume are easy to add later. And since you receive the sample buffers in real time, things like watermarking become straightforward to do on the fly, rather than in a slow post-processing pass after recording.
Finally, a side-by-side comparison shot of the three approaches.
References:
About AVFoundation
Capturing Video on iOS
Camera Capture on iOS
AVCaptureSession pause