Custom Camera and Video Recording UI

Things have been quiet at work lately, and I noticed our project contains a short-video capture screen modeled on the one in WeChat Moments. It looked interesting, so I spent a few days digging into AVFoundation. These are my notes from that study; feedback and discussion are welcome.

If the designers only care about functionality and have no particular requirements for the camera screen, the system camera (UIImagePickerController) is usually enough for taking photos or recording video. But when the interface has to look a certain way, you need a custom capture UI, and the system camera no longer cuts it: you have to work with AVFoundation directly, which touches some lower-level parts of the system. I studied this on and off for several days, but there is far too much material to cover in one post, so here I will only record basic photo capture and video recording; for anything beyond that you will have to consult the documentation yourself (such is the programmer's life). Let's get to it.

Before working with AVFoundation there are a few classes you should know up front. Once you understand these, implementing basic photo capture and video recording is straightforward:

 AVCaptureSession: the capture session for media (audio and video). It routes the captured audio/video data to the output objects; one AVCaptureSession can have multiple inputs and outputs. (That is the formal description; in plain terms, it is a session connecting the inputs to the outputs: from the moment you start taking a photo or recording until the photo or video comes out, the data flow in between is managed by this object.)
 AVCaptureDevice: an input device such as the microphone or a camera. Through this object you can configure physical device properties (camera focus, white balance, and so on).
 AVCaptureDeviceInput: the input-management object for a device. You create an AVCaptureDeviceInput from an AVCaptureDevice, and that object is added to the AVCaptureSession. (One input device corresponds to one input object, which then goes into the session.)
 AVCaptureOutput: the output-management object that receives the captured data. You rarely use it directly; instead you use its subclasses AVCaptureStillImageOutput and AVCaptureMovieFileOutput. (Like the input, it also has to be added to the session.)
 AVCaptureVideoPreviewLayer: the camera preview layer, a subclass of CALayer. It lets you see the photo or video feed in real time; creating it requires the corresponding AVCaptureSession. (Think of it as the layer displayed while you shoot.)
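Putting these five pieces together, the basic wiring can be sketched as follows (a minimal sketch, not the full code from this post; `previewView` is a hypothetical view hosting the preview layer):

```objectivec
// Minimal sketch: wire device -> input -> session -> output, plus a preview layer.
AVCaptureSession *session = [[AVCaptureSession alloc] init];

// Device wrapped in an input object, added to the session
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
if ([session canAddInput:input]) {
    [session addInput:input];
}

// Output object, added to the session
AVCaptureStillImageOutput *output = [[AVCaptureStillImageOutput alloc] init];
if ([session canAddOutput:output]) {
    [session addOutput:output];
}

// Preview layer bound to the session and attached to some host view
AVCaptureVideoPreviewLayer *preview = [AVCaptureVideoPreviewLayer layerWithSession:session];
preview.frame = previewView.bounds;
[previewView.layer addSublayer:preview];

// Start the data flow
[session startRunning];
```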

That is the background you need; with it, the code below should be much easier to follow (and if my explanations weren't clear enough, I did my best). The most direct thing is to look at the code, so let's start with photo capture. First, the properties:

@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, strong) AVCaptureDevice *device;
@property (nonatomic, strong) AVCaptureDeviceInput *input;
@property (nonatomic, strong) AVCaptureStillImageOutput *imageOutput;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *videoLayer;
@property (nonatomic, strong) UIImageView *focusImage; // focus frame

And the initialization code:

- (void)viewDidLoad {
    [super viewDidLoad];
    // Add the focus frame
    self.focusImage = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 60, 60)];
    self.focusImage.center = self.centerView.center;
    [self.focusImage setImage:[UIImage imageNamed:@"边框"]];
    [self.centerView addSubview:_focusImage];
    self.centerView.layer.masksToBounds = YES; // clip the focus frame when it moves outside centerView
    // Create the session
    _session = [[AVCaptureSession alloc] init];
    // Set the resolution
    if ([self.session canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
        [self.session setSessionPreset:AVCaptureSessionPreset1280x720];
    }
    // Get the capture device
    NSArray *deviceArray = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in deviceArray) {
        if (device.position == AVCaptureDevicePositionBack) {
            NSLog(@"Got the back camera");
            _device = device;
        }
        if (device.position == AVCaptureDevicePositionFront) {
            NSLog(@"Front camera exists but is not used");
        }
    }
    NSError *error = nil;
    // Create the device input
    _input = [[AVCaptureDeviceInput alloc] initWithDevice:_device error:&error];
    // Add the input to the session
    if ([_session canAddInput:_input]) {
        [_session addInput:_input];
    }
    // Create the output
    _imageOutput = [[AVCaptureStillImageOutput alloc] init];
    // Add the output to the session
    if ([_session canAddOutput:_imageOutput]) {
        [_session addOutput:_imageOutput];
    }
    // Create the photo preview layer
    _videoLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_session];
    _videoLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    CALayer *layer = self.centerView.layer;
    _videoLayer.frame = layer.bounds;
    _videoLayer.masksToBounds = YES;
    [layer insertSublayer:_videoLayer below:_focusImage.layer]; // insert the preview layer below the focus frame, or the frame won't show
    [_session startRunning];
    [self addGestureTap]; // add the tap gesture used for focusing
    [self addNotificationToCaptureDevice:_device]; // register for device notifications
}

When the shutter button is tapped:

- (IBAction)takePhoto:(UIButton *)sender {
    // Get the connection for the output
    AVCaptureConnection *connection = [self.imageOutput connectionWithMediaType:AVMediaTypeVideo];
    [self.imageOutput captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer) {
            NSData *data = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [UIImage imageWithData:data];
            // Pass self and the callback selector so the image:didFinishSavingWithError:contextInfo:
            // method below is actually invoked after saving
            UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
        }
    }];
}

- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
    if (error != nil) {
        // show error message
        NSLog(@"Failed to save the photo");
        NSLog(@"%@", error);
    } else {
        NSLog(@"Photo saved successfully");
    }
}

If you don't need exposure control, camera switching, or focus, the basic photo feature is already done at this point. The code above uses the back camera, but you can of course change that. I learned this from other people's blog posts as well (the links are at the end, I promise), and when I added tap-to-focus I hit one spot where my code differs from the original post. Here is the focus code:

#pragma mark Add the tap gesture used for focusing
-(void)addGestureTap
{
    UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapScreen:)];
    [self.centerView addGestureRecognizer:tap];
}

-(void)tapScreen:(UITapGestureRecognizer *)tap
{
    // Location of the tap within the capture area
    CGPoint point = [tap locationInView:self.centerView];
    // Convert the UI coordinate to a camera coordinate
    CGPoint cameraPoint = [self.videoLayer captureDevicePointOfInterestForPoint:point]; // in practice, using this converted point to position the focus frame puts it in the wrong place
    [self setFocusCursorWithPoint:point]; // animate the focus frame at the tapped position
    [self focusWithMode:AVCaptureFocusModeAutoFocus exposureMode:AVCaptureExposureModeAutoExpose atPoint:cameraPoint];
}

// Animate the focus frame at the tapped position
-(void)setFocusCursorWithPoint:(CGPoint)point
{
    self.focusImage.center = point;
    self.focusImage.alpha = 1;
    self.focusImage.transform = CGAffineTransformScale(self.focusImage.transform, 1.5, 1.5);
    [UIView animateWithDuration:1.0 animations:^{
        self.focusImage.transform = CGAffineTransformIdentity; // shrink back from 1.5x to the original size
    } completion:^(BOOL finished) {
        self.focusImage.alpha = 0;
    }];
}
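The `focusWithMode:exposureMode:atPoint:` helper called in `tapScreen:` is not shown in the original post; a minimal sketch of it might look like this (the device must be locked before its hardware properties can be changed, and each capability should be checked before use):

```objectivec
// Sketch of the focus/exposure helper referenced in tapScreen:.
-(void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point
{
    NSError *error = nil;
    // Hardware properties can only be changed while the device is locked for configuration.
    if (![self.device lockForConfiguration:&error]) {
        NSLog(@"Failed to lock device: %@", error);
        return;
    }
    // Set the point of interest first, then the mode, so the new mode uses the new point.
    if ([self.device isFocusPointOfInterestSupported]) {
        self.device.focusPointOfInterest = point;
    }
    if ([self.device isFocusModeSupported:focusMode]) {
        self.device.focusMode = focusMode;
    }
    if ([self.device isExposurePointOfInterestSupported]) {
        self.device.exposurePointOfInterest = point;
    }
    if ([self.device isExposureModeSupported:exposureMode]) {
        self.device.exposureMode = exposureMode;
    }
    [self.device unlockForConfiguration];
}
```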

Notice the comment inside `tapScreen:`. In `setFocusCursorWithPoint:` I pass `point` rather than the converted camera coordinate, because the converted coordinate puts the focus frame in the wrong place. (The camera coordinate returned by `captureDevicePointOfInterestForPoint:` is normalized to the range 0 to 1, so it is only meant to be passed to the capture device, not used for view layout; if you have a better explanation, let me know.) One more thing to watch: the preview layer must be inserted below the focus frame, or the frame will not be visible. Exposure, flash, and camera switching are really just property changes, so I won't cover them here; the reference blogs at the end go into far more detail than I do.
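As an example of such a property change, switching between the front and back cameras amounts to swapping the session's video input. A rough sketch, assuming the `session`, `device`, and `input` properties defined earlier (this method is not in the original post):

```objectivec
// Sketch: switch cameras by replacing the session's video input.
- (void)switchCamera {
    AVCaptureDevicePosition target =
        (self.device.position == AVCaptureDevicePositionBack)
            ? AVCaptureDevicePositionFront
            : AVCaptureDevicePositionBack;
    // Find the device on the other side
    AVCaptureDevice *newDevice = nil;
    for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (d.position == target) { newDevice = d; break; }
    }
    if (!newDevice) return;
    NSError *error = nil;
    AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:newDevice error:&error];
    if (!newInput) return;
    [self.session beginConfiguration];      // batch the input changes
    [self.session removeInput:self.input];  // drop the old camera input
    if ([self.session canAddInput:newInput]) {
        [self.session addInput:newInput];
        self.device = newDevice;
        self.input = newInput;
    } else {
        [self.session addInput:self.input]; // roll back if the new input is rejected
    }
    [self.session commitConfiguration];
}
```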

That covers photos; now let's talk about video recording. Compared with photo capture, recording adds one more device, the microphone, along with its own input object, and the output object becomes AVCaptureMovieFileOutput. The recording itself is driven through delegation: you implement the AVCaptureFileOutputRecordingDelegate methods. Here is the code:

@interface RecordVideoViewController () <AVCaptureFileOutputRecordingDelegate>

@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, strong) AVCaptureDevice *videoDevice;
@property (nonatomic, strong) AVCaptureDevice *audioDevice;
@property (nonatomic, strong) AVCaptureDeviceInput *videoInput;
@property (nonatomic, strong) AVCaptureDeviceInput *audioInput;
@property (nonatomic, strong) AVCaptureMovieFileOutput *movieFileOutput;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *videoLayer;
@property (nonatomic, assign) UIBackgroundTaskIdentifier backgroundTaskIdentifier;
@end

@implementation RecordVideoViewController

-(void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    self.navigationController.navigationBarHidden = YES;
    _session = [[AVCaptureSession alloc] init];
    if ([_session canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
        [_session setSessionPreset:AVCaptureSessionPreset1280x720];
    }
    NSArray *deviceArray = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in deviceArray) {
        if (device.position == AVCaptureDevicePositionBack) {
            _videoDevice = device;
        }
    }
    _audioDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
    NSError *error = nil;
    _videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:_videoDevice error:&error];
    _audioInput = [[AVCaptureDeviceInput alloc] initWithDevice:_audioDevice error:&error];
    if ([_session canAddInput:_videoInput]) {
        [_session addInput:_videoInput];
    }
    if ([_session canAddInput:_audioInput]) {
        [_session addInput:_audioInput];
    }
    _movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    if ([_session canAddOutput:_movieFileOutput]) {
        [_session addOutput:_movieFileOutput];
    }
    AVCaptureConnection *connection = [_movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    // Video stabilization: this API only exists on iOS 8 and later, so add a system-version check
    if ([connection isVideoStabilizationSupported]) {
        connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeCinematic; // iOS 8+ only; guard with a version check
    }
    _videoLayer = [AVCaptureVideoPreviewLayer layerWithSession:_session];
    _videoLayer.frame = _centerView.bounds;
    _centerView.layer.masksToBounds = YES;
    _videoLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [_centerView.layer addSublayer:_videoLayer];
    _RecordButton.selected = NO;
}
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view from its nib.
}

-(void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    [self.session startRunning];
}
-(void)viewDidDisappear:(BOOL)animated
{
    [super viewDidDisappear:animated];
    self.navigationController.navigationBarHidden = NO;
    [self.session stopRunning];
}

- (IBAction)RecordVideoClick:(UIButton *)sender {
    _RecordButton.selected = !_RecordButton.selected; // toggle the button state to switch its title
    AVCaptureConnection *captureConnection = [self.movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if (![self.movieFileOutput isRecording]) {
        // Start a background task if multitasking is supported
        if ([[UIDevice currentDevice] isMultitaskingSupported]) {
            self.backgroundTaskIdentifier = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil];
        }
        // Keep the recorded video's orientation in sync with the preview layer
        captureConnection.videoOrientation = [self.videoLayer connection].videoOrientation;
        // Build the temporary output file path (stringByAppendingPathComponent: handles the separator)
        NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"mMovie.mov"];
        NSURL *fileUrl = [NSURL fileURLWithPath:outputFilePath];
        // Start recording and set the delegate
        [self.movieFileOutput startRecordingToOutputFileURL:fileUrl recordingDelegate:self];
    } else {
        [self.movieFileOutput stopRecording];
    }
}

#pragma mark Recording delegate
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections
{
    NSLog(@"Recording started");
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    NSLog(@"Recording finished");
    UIBackgroundTaskIdentifier lastBackgroundTaskIdentifier = self.backgroundTaskIdentifier;
    self.backgroundTaskIdentifier = UIBackgroundTaskInvalid;
    ALAssetsLibrary *assetLibrary = [[ALAssetsLibrary alloc] init];
    [assetLibrary writeVideoAtPathToSavedPhotosAlbum:outputFileURL completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"Error saving the video to the photo album");
        } else {
            NSLog(@"Video saved to the photo album");
        }
        if (lastBackgroundTaskIdentifier != UIBackgroundTaskInvalid) {
            [[UIApplication sharedApplication] endBackgroundTask:lastBackgroundTaskIdentifier];
        }
        // Remove the temporary file
        NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"mMovie.mov"];
        if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath]) {
            [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];
        }
    }];
}
@end

There is only one thing to watch out for above, flagged in the code comments: the video stabilization property was introduced in iOS 8, so you must add a system-version check. I didn't add one in the code; be sure to add it yourself, or the app will crash on iOS 7 (ancient as that system is now).
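One common way to add that guard (checking `systemVersion` is a simple approach; the iOS 7 fallback uses the older boolean property on the connection):

```objectivec
// Guard the iOS 8-only stabilization API so iOS 7 doesn't crash.
if ([[[UIDevice currentDevice] systemVersion] floatValue] >= 8.0) {
    if ([connection isVideoStabilizationSupported]) {
        connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeCinematic;
    }
} else {
    // iOS 7 fallback: the older boolean stabilization property.
    connection.enablesVideoStabilizationWhenAvailable = YES;
}
```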

With that, the most basic photo capture and video recording are both written; most of what remains is tweaking device properties. As you can see from the code, it is actually quite simple and boils down to a few steps:

  1. Create the session (AVCaptureSession);
  2. Get the device (AVCaptureDevice);
  3. Create an input object (AVCaptureDeviceInput) from the device;
  4. Add the input object to the session;
  5. Create an output object, AVCaptureStillImageOutput for photos or AVCaptureMovieFileOutput for video;
  6. Likewise, add the output object to the session;
  7. Create the preview layer and insert it;
  8. Start the session;
  9. Stop the session when done.

Those are the basic steps; if I missed anything, additions are welcome. Below are the two blog posts I mainly learned from (some of you are no doubt tired of my rambling by now):
Primary reference blog
Secondary reference blog
