AVFoundation - Media Capture

I. The Capture Session: AVCaptureSession

AVCaptureSession connects input and output resources. A capture session manages the data streams coming from physical devices (such as cameras and microphones) and routes them to one or more destinations. The input and output wiring can be configured dynamically, which lets developers reconfigure the capture environment while a session is running.

A capture session can be configured with a session preset, which controls the format and quality of the captured data. The default preset is AVCaptureSessionPresetHigh.
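As a minimal sketch, a preset can be selected like this; availability should be checked first, since not every device supports every preset:

```objectivec
AVCaptureSession *session = [[AVCaptureSession alloc] init];
// Prefer 720p when the device supports it; otherwise keep the default high-quality preset.
if ([session canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    session.sessionPreset = AVCaptureSessionPreset1280x720;
} else {
    session.sessionPreset = AVCaptureSessionPresetHigh;
}
```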

II. Capture Devices

AVCaptureDevice provides access to the system's capture devices. Its most commonly used method is defaultDeviceWithMediaType:.

AVCaptureDevice defines the interface to physical devices such as cameras and microphones. These devices are built into Macs and iPhones, but they can also be external devices such as digital still or video cameras. AVCaptureDevice offers a large set of methods for controlling the physical hardware, for example the camera's focus, exposure, white balance, and flash.

III. Capture Device Inputs

Before a capture device can be used, it has to be added to the session. A capture device cannot be added to an AVCaptureSession directly; it must be wrapped in an AVCaptureDeviceInput instance, which acts as a bridge between the data the device produces and the capture session. An AVCaptureDeviceInput is created like this:

NSError *error;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];

IV. Capture Device Outputs

AVCaptureOutput is an abstract base class used to route the data a capture session produces to output destinations. AVFoundation defines a number of concrete subclasses of it; the high-level ones are:

AVCaptureStillImageOutput: captures still images.

AVCaptureMovieFileOutput: captures audio and video to a movie file.

The lower-level output classes, AVCaptureAudioDataOutput and AVCaptureVideoDataOutput, give direct access to the digital samples captured by the hardware, which makes real-time processing of audio and video possible.

V. Capture Connections

AVCaptureConnection: a connection between a capture input and a capture output. It can be used to enable or disable the data flow for a given input or output, and also to monitor the average and peak power levels in the audio channels.
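As a sketch, the audio levels can be read from the connection's channels (here `movieOutput` is assumed to be an AVCaptureMovieFileOutput already attached to a running session):

```objectivec
AVCaptureConnection *connection = [movieOutput connectionWithMediaType:AVMediaTypeAudio];
connection.enabled = YES;    // enable (or disable) the data flow over this connection
for (AVCaptureAudioChannel *channel in connection.audioChannels) {
    // Levels are in decibels; 0 dB is full scale.
    NSLog(@"average: %.1f dB, peak: %.1f dB", channel.averagePowerLevel, channel.peakHoldLevel);
}
```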

VI. Capture Preview

AVCaptureVideoPreviewLayer: provides a real-time preview of the captured data. It is similar to AVPlayerLayer, but tailored to camera capture. Like AVPlayerLayer, it supports the concept of video gravity, which controls how the video content is scaled and stretched within the layer:

AVLayerVideoGravityResizeAspect: preserves the video's aspect ratio.

AVLayerVideoGravityResizeAspectFill: preserves the aspect ratio and fills the entire layer, cropping the video as needed.

AVLayerVideoGravityResize: fills the layer, distorting the video.
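A sketch of wiring up a preview layer (assuming `session` is a configured AVCaptureSession and `view` is the hosting UIView):

```objectivec
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.frame = view.bounds;
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;    // fill the layer, cropping as needed
[view.layer addSublayer:previewLayer];
```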

// Creating a capture session

NSError *error;
AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if ([session canAddInput:input]) {
    [session addInput:input];
}
AVCaptureStillImageOutput *output = [[AVCaptureStillImageOutput alloc] init];
if ([session canAddOutput:output]) {
    [session addOutput:output];
}

VII. Key Points

1. Coordinate-space conversion. The capture device's coordinate space differs from the screen's: it is defined relative to the camera sensor, it does not rotate with the device, and it runs from (0, 0) at the top-left corner to (1, 1) at the bottom-right.

2. AVCaptureVideoPreviewLayer provides two methods for converting between the two coordinate spaces:

    captureDevicePointOfInterestForPoint: takes a point in screen (layer) coordinates and returns the corresponding point in device coordinates.

    pointForCaptureDevicePointOfInterest: takes a point in device coordinates and returns the corresponding point in screen coordinates.

    These conversions are typically needed for tap-to-focus and tap-to-expose.
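For example, a tap handler might look like the following; the `previewLayer` and `cameraController` properties are assumptions used for illustration, not names from the original code:

```objectivec
- (void)handleTap:(UITapGestureRecognizer *)recognizer {
    // Convert the tap from layer (screen) coordinates to the device's coordinate space.
    CGPoint layerPoint = [recognizer locationInView:recognizer.view];
    CGPoint devicePoint = [self.previewLayer captureDevicePointOfInterestForPoint:layerPoint];
    [self.cameraController focusAtPoint:devicePoint];
}
```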

3. Setting up the capture session

// CameraController

- (BOOL)setupSession {

    NSError *error;

    // Create the capture session.
    self.captureSession = [[AVCaptureSession alloc] init];

    // Get a pointer to the default video capture device; on iPhone this returns the back camera.
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (videoInput && [self.captureSession canAddInput:videoInput]) {
        [self.captureSession addInput:videoInput];
        self.activeVideoInput = videoInput;
    } else {
        return NO;
    }

    // Create the audio capture device input.
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
    if (audioInput && [self.captureSession canAddInput:audioInput]) {
        [self.captureSession addInput:audioInput];
    } else {
        return NO;
    }

    // Output for capturing still images.
    self.imageOutput = [[AVCaptureStillImageOutput alloc] init];
    self.imageOutput.outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG};
    if ([self.captureSession canAddOutput:self.imageOutput]) {
        [self.captureSession addOutput:self.imageOutput];
    }

    // Output for saving movies to the file system.
    self.movieOutput = [[AVCaptureMovieFileOutput alloc] init];
    if ([self.captureSession canAddOutput:self.movieOutput]) {
        [self.captureSession addOutput:self.movieOutput];
    }

    self.videoQueue = dispatch_queue_create("com.videoqueue", NULL);

    return YES;
}

4. Starting and stopping the session. Before the capture session can be used it has to be started, which sets the data flow in motion and puts the session in a state where it is ready to capture images and video. a: Check whether the session is already running; if not, call startRunning. This is a synchronous call that takes some time, so dispatch it asynchronously onto videoQueue rather than blocking the main thread. b: stopRunning stops the data flow; it is also a synchronous call, so invoke it asynchronously as well.

- (void)startSession {    //a
    if (![self.captureSession isRunning]) {
        dispatch_async(self.videoQueue, ^{
            [self.captureSession startRunning];
        });
    }
}

- (void)stopSession {    //b
    if ([self.captureSession isRunning]) {
        dispatch_async(self.videoQueue, ^{
            [self.captureSession stopRunning];
        });
    }
}

5. Switching cameras

- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if (device.position == position) {
            return device;
        }
    }
    return nil;    // outside the loop, so every device gets checked
}

- (AVCaptureDevice *)activeCamera {
    return self.activeVideoInput.device;
}

- (AVCaptureDevice *)inactiveCamera {
    AVCaptureDevice *device = nil;
    if (self.cameraCount > 1) {
        if ([self activeCamera].position == AVCaptureDevicePositionBack) {
            device = [self cameraWithPosition:AVCaptureDevicePositionFront];
        } else {
            device = [self cameraWithPosition:AVCaptureDevicePositionBack];
        }
    }
    return device;
}

- (BOOL)canSwitchCameras {
    return self.cameraCount > 1;
}

- (NSUInteger)cameraCount {
    return [[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] count];
}

Switching between the front and back cameras requires reconfiguring the capture session. Fortunately, AVCaptureSession can be reconfigured dynamically, so there is no need to worry about the cost of stopping and restarting the session. However, any changes made to the session should be wrapped in beginConfiguration and commitConfiguration so that they are committed as a single atomic change.

- (BOOL)switchCameras {
    if (![self canSwitchCameras]) {
        return NO;
    }
    NSError *error;
    AVCaptureDevice *videoDevice = [self inactiveCamera];
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (videoInput) {
        [self.captureSession beginConfiguration];    // mark the start of the atomic configuration change
        [self.captureSession removeInput:self.activeVideoInput];
        if ([self.captureSession canAddInput:videoInput]) {
            [self.captureSession addInput:videoInput];
            self.activeVideoInput = videoInput;
        } else {
            [self.captureSession addInput:self.activeVideoInput];    // restore the old input if the new one cannot be added
        }
        // Commit all the batched changes as a single atomic modification of the session.
        [self.captureSession commitConfiguration];
    } else {
        return NO;    // an error could be reported to the caller here
    }
    return YES;
}

6. Configuring capture devices. AVCaptureDevice defines many methods that let developers control the camera, in particular independently adjusting and locking the camera's focus, exposure, and white balance. Focus and exposure also support point-of-interest settings, which make tap-to-focus and tap-to-expose possible. AVCaptureDevice additionally controls the device's LED, which can act as a flash for still photos or as a torch. Whenever you modify a camera setting, first check that the device supports the change, otherwise an exception is raised; not every camera supports every feature. For example, front cameras do not support focus adjustment, while back cameras support full-range autofocus.

AVCaptureDevice *device = ...;

if ([device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
    // When the change is supported, the pattern is: lock the device for configuration,
    // perform the required changes, then unlock.
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        device.focusMode = AVCaptureFocusModeAutoFocus;
        [device unlockForConfiguration];
    } else {
        // handle the error
    }
}

7. Adjusting focus. a: Ask the camera whether it supports focus at a point of interest. b: Accept a point that has already been converted from screen coordinates to device coordinates. c: Confirm that both point-of-interest focus and autofocus mode are supported; this mode performs a single-scan autofocus at the given point.

Implementing tap-to-focus:

- (BOOL)cameraSupportsTapToFocus {    //a
    return [[self activeCamera] isFocusPointOfInterestSupported];
}

- (void)focusAtPoint:(CGPoint)point {    //b
    AVCaptureDevice *device = [self activeCamera];
    if (device.isFocusPointOfInterestSupported && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {    //c
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.focusPointOfInterest = point;
            device.focusMode = AVCaptureFocusModeAutoFocus;    // trigger the single-scan autofocus at the new point
            [device unlockForConfiguration];
        } else {
            // handle the error
        }
    }
}

8. Tap to expose. a: Check whether the device supports exposure at a point of interest. b: Determine whether the device supports locked exposure mode; if so, use KVO to observe the device's adjustingExposure property. Observing this property tells us when the exposure adjustment finishes, which gives us the chance to lock the exposure at that point. c: Confirm the device is no longer adjusting exposure and that its exposureMode can be set to AVCaptureExposureModeLocked.

- (BOOL)cameraSupportsTapToExpose {    //a
    return [[self activeCamera] isExposurePointOfInterestSupported];
}

static const NSString *THCameraAdjustingExposureContext;

- (void)exposeAtPoint:(CGPoint)point {
    AVCaptureDevice *device = [self activeCamera];
    AVCaptureExposureMode exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    if (device.isExposurePointOfInterestSupported && [device isExposureModeSupported:exposureMode]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.exposurePointOfInterest = point;
            device.exposureMode = exposureMode;
            if ([device isExposureModeSupported:AVCaptureExposureModeLocked]) {    //b
                [device addObserver:self
                         forKeyPath:@"adjustingExposure"
                            options:NSKeyValueObservingOptionNew
                            context:&THCameraAdjustingExposureContext];
            }
            [device unlockForConfiguration];
        } else {
            // handle the error
        }
    }
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if (context == &THCameraAdjustingExposureContext) {
        AVCaptureDevice *device = (AVCaptureDevice *)object;
        if (!device.isAdjustingExposure && [device isExposureModeSupported:AVCaptureExposureModeLocked]) {    //c
            [object removeObserver:self
                        forKeyPath:@"adjustingExposure"
                           context:&THCameraAdjustingExposureContext];
            dispatch_async(dispatch_get_main_queue(), ^{
                NSError *error;
                if ([device lockForConfiguration:&error]) {
                    device.exposureMode = AVCaptureExposureModeLocked;
                    [device unlockForConfiguration];
                } else {
                    // handle the error
                }
            });
        }
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}

9. Resetting focus and exposure

- (void)resetFocusAndExposureMode {
    AVCaptureDevice *device = [self activeCamera];
    AVCaptureFocusMode focusMode = AVCaptureFocusModeContinuousAutoFocus;
    BOOL canResetFocus = [device isFocusPointOfInterestSupported] && [device isFocusModeSupported:focusMode];
    AVCaptureExposureMode exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    BOOL canResetExposure = [device isExposurePointOfInterestSupported] && [device isExposureModeSupported:exposureMode];
    CGPoint centerPoint = CGPointMake(0.5f, 0.5f);
    NSError *error;
    if ([device lockForConfiguration:&error]) {
        if (canResetFocus) {
            device.focusMode = focusMode;
            device.focusPointOfInterest = centerPoint;
        }
        if (canResetExposure) {
            device.exposureMode = exposureMode;
            device.exposurePointOfInterest = centerPoint;
        }
        [device unlockForConfiguration];
    } else {
        // handle the error
    }
}

10. Adjusting flash and torch modes. AVCaptureDevice lets developers change the camera's flash and torch modes. The LED on the back of the device serves as the flash when capturing still photos and as a continuous light source (torch) when recording video. The capture device's flashMode and torchMode properties can be set to one of three values: AVCapture(Flash|Torch)ModeOn, Off, or Auto.

- (BOOL)cameraHasFlash {
    return [[self activeCamera] hasFlash];
}

- (AVCaptureFlashMode)flashMode {
    return [[self activeCamera] flashMode];
}

- (void)setFlashMode:(AVCaptureFlashMode)flashMode {
    AVCaptureDevice *device = [self activeCamera];
    if ([device isFlashModeSupported:flashMode]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.flashMode = flashMode;
            [device unlockForConfiguration];
        } else {
            // handle the error
        }
    }
}

- (BOOL)cameraHasTorch {
    return [[self activeCamera] hasTorch];
}

- (AVCaptureTorchMode)torchMode {
    return [[self activeCamera] torchMode];
}

- (void)setTorchMode:(AVCaptureTorchMode)torchMode {
    AVCaptureDevice *device = [self activeCamera];
    if ([device isTorchModeSupported:torchMode]) {
        NSError *error;
        if ([device lockForConfiguration:&error]) {
            device.torchMode = torchMode;
            [device unlockForConfiguration];
        } else {
            // handle the error
        }
    }
}

11. Capturing still images. In the setupSession implementation we added an AVCaptureStillImageOutput instance, an AVCaptureOutput subclass used for capturing still images, to the capture session. a: Get a pointer to the AVCaptureConnection currently used by the AVCaptureStillImageOutput; when looking up the still-image output's connection, the AVMediaTypeVideo media type is usually passed.

- (void)captureStillImage {
    AVCaptureConnection *connection = [self.imageOutput connectionWithMediaType:AVMediaTypeVideo];    //a
    if (connection.isVideoOrientationSupported) {
        connection.videoOrientation = [self currentVideoOrientation];
    }
    id handler = ^(CMSampleBufferRef sampleBuffer, NSError *error) {
        if (sampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
            UIImage *image = [[UIImage alloc] initWithData:imageData];
            [self writeImageToAssetsLibrary:image];    // hand the image off, e.g. to the Assets Library
        } else {
            // handle the error
        }
    };
    [self.imageOutput captureStillImageAsynchronouslyFromConnection:connection completionHandler:handler];
}

- (AVCaptureVideoOrientation)currentVideoOrientation {
    AVCaptureVideoOrientation orientation;
    switch ([UIDevice currentDevice].orientation) {
        case UIDeviceOrientationPortrait:
            orientation = AVCaptureVideoOrientationPortrait;
            break;
        case UIDeviceOrientationLandscapeRight:
            // Device landscape-right corresponds to capture landscape-left, and vice versa.
            orientation = AVCaptureVideoOrientationLandscapeLeft;
            break;
        case UIDeviceOrientationPortraitUpsideDown:
            orientation = AVCaptureVideoOrientationPortraitUpsideDown;
            break;
        default:
            orientation = AVCaptureVideoOrientationLandscapeRight;
            break;
    }
    return orientation;
}

12. Using the Assets Library framework. The Assets Library lets developers work with the user's photo and video library.

ALAuthorizationStatus status = [ALAssetsLibrary authorizationStatus];
if (status == ALAuthorizationStatusDenied) {
    // without access
} else {
    // perform authorized access to the library
}

- (void)writeImageToAssetsLibrary:(UIImage *)image {    // write the image to the photo library
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeImageToSavedPhotosAlbum:image.CGImage
                              orientation:(ALAssetOrientation)image.imageOrientation
                          completionBlock:^(NSURL *assetURL, NSError *error) {
        // ...
    }];
}

13. Video capture. When AVCaptureMovieFileOutput begins recording, it writes a minimal header at the front of the file; as the recording proceeds, fragments are written at a regular interval, gradually building up the complete header. The interval can be changed through the capture output's movieFragmentInterval property. a: Report the AVCaptureMovieFileOutput recording state. b: Video stabilization; enabling it when supported can noticeably improve the quality of captured video. c: Smooth autofocus mode slows down the camera's focusing speed. Normally, when the user moves the camera it tries to refocus quickly, which produces a pulsing effect in the captured video; smooth autofocus lowers the focusing speed for a more natural-looking recording.

- (BOOL)isRecording {    //a
    return self.movieOutput.isRecording;
}

- (void)startRecording {
    if (![self isRecording]) {
        AVCaptureConnection *videoConnection = [self.movieOutput connectionWithMediaType:AVMediaTypeVideo];
        if ([videoConnection isVideoOrientationSupported]) {
            videoConnection.videoOrientation = [self currentVideoOrientation];
        }
        if ([videoConnection isVideoStabilizationSupported]) {    //b
            videoConnection.enablesVideoStabilizationWhenAvailable = YES;
        }
        AVCaptureDevice *device = [self activeCamera];
        if (device.isSmoothAutoFocusSupported) {    //c
            NSError *error;
            if ([device lockForConfiguration:&error]) {
                device.smoothAutoFocusEnabled = YES;
                [device unlockForConfiguration];
            } else {
                // handle the error
            }
        }
        self.outputURL = [self uniqueURL];
        [self.movieOutput startRecordingToOutputFileURL:self.outputURL recordingDelegate:self];
    }
}

- (NSURL *)uniqueURL {
    NSFileManager *fileManager = [NSFileManager defaultManager];
    // temporaryDirectoryWithTemplateString: is a custom NSFileManager category helper,
    // not a standard Foundation method.
    NSString *dirPath = [fileManager temporaryDirectoryWithTemplateString:@"temp"];
    if (dirPath) {
        NSString *filePath = [dirPath stringByAppendingPathComponent:@"1.mov"];
        return [NSURL fileURLWithPath:filePath];
    }
    return nil;
}

- (void)stopRecording {
    if ([self isRecording]) {
        [self.movieOutput stopRecording];
    }
}

14. Implementing the AVCaptureFileOutputRecordingDelegate protocol. a: Before writing to the Assets Library, check that the video can actually be written there. b: Set the thumbnail width and let the height follow from the video's aspect ratio; also set appliesPreferredTrackTransform to YES so that the generated thumbnail takes the video's transform (e.g. its orientation) into account.

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
    if (error) {
        [self.delegate mediaCaptureFailedWithError:error];
    } else {
        [self writeVideoToAssetsLibrary:[self.outputURL copy]];
    }
    self.outputURL = nil;
}

- (void)writeVideoToAssetsLibrary:(NSURL *)videoURL {
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:videoURL]) {    //a
        ALAssetsLibraryWriteVideoCompletionBlock completionBlock;
        completionBlock = ^(NSURL *assetURL, NSError *error) {
            if (error) {
                [self.delegate assetLibraryWriteFailedWithError:error];
            } else {
                [self generateThumbnailForVideoAtURL:videoURL];
            }
        };
        [library writeVideoAtPathToSavedPhotosAlbum:videoURL completionBlock:completionBlock];
    }
}

- (void)generateThumbnailForVideoAtURL:(NSURL *)videoURL {
    dispatch_async(self.videoQueue, ^{
        AVAsset *asset = [AVAsset assetWithURL:videoURL];
        AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
        imageGenerator.maximumSize = CGSizeMake(100.0f, 0.0f);    //b: width 100; a height of 0 preserves the aspect ratio
        imageGenerator.appliesPreferredTrackTransform = YES;
        CGImageRef imageRef = [imageGenerator copyCGImageAtTime:kCMTimeZero actualTime:NULL error:nil];
        UIImage *image = [UIImage imageWithCGImage:imageRef];
        CGImageRelease(imageRef);
        dispatch_async(dispatch_get_main_queue(), ^{
            // hand the thumbnail off to the UI
        });
    });
}
