iOS Video Capture (Adapting to the Device's Highest Supported Resolution)

Preface

Video conferencing article series
Demo address

Main text

Real-time video stream capture is done with the AVCaptureSession class from the AVFoundation framework.

Create an AVCaptureSession -> set up the input and output objects -> link the input and output through an AVCaptureConnection -> process the video data delivered in the delegate callback.
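The snippets in this post reference a number of properties (videoSession, videoDevice, videoInput, videoOutput, videoConnection, delegate, isComed, startTime, timeStamp) and a delegate callback without showing their declarations. Roughly the following interface is assumed; the class name, protocol name, property types and the start method are my own placeholders, and only backWithVideobuffer:andTimeStamp: appears in the post itself.

#import <AVFoundation/AVFoundation.h>

// Hypothetical delegate protocol for handing captured frames to the next stage (encoder, renderer, ...)
@protocol VideoCaptureDelegate <NSObject>
- (void)backWithVideobuffer:(CMSampleBufferRef)sampleBuffer andTimeStamp:(uint32_t)timeStamp;
@end

// Hypothetical capture class wrapping the AVCaptureSession setup shown below
@interface VideoCapture : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>

@property (nonatomic, strong) AVCaptureSession *videoSession;
@property (nonatomic, strong) AVCaptureDevice *videoDevice;
@property (nonatomic, strong) AVCaptureDeviceInput *videoInput;
@property (nonatomic, strong) AVCaptureVideoDataOutput *videoOutput;
@property (nonatomic, strong) AVCaptureConnection *videoConnection;

@property (nonatomic, weak) id<VideoCaptureDelegate> delegate;
@property (nonatomic, assign) BOOL isComed;        // whether the first frame has arrived yet
@property (nonatomic, assign) NSInteger startTime; // capture start time in microseconds
@property (nonatomic, assign) uint32_t timeStamp;  // timestamp of the latest frame (assumed 90 kHz units)

// Placeholder entry point; `position` in the setup code below is assumed to come from a parameter like this
- (void)startCaptureWithPosition:(AVCaptureDevicePosition)position;

@end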

Key point

The maximum resolution supported by the front and rear cameras differs between iPhone models, so you must check what the device supports before setting the session preset; otherwise the session will not capture any data.

Capture and configuration

    // Create the AVCaptureSession
    self.videoSession = [[AVCaptureSession alloc] init];
    // Begin configuration
    [self.videoSession beginConfiguration];
    // Get the capture device (front or rear camera) first, so the preset check can query it
    self.videoDevice = [self cameraWithPosition:position];
    // Pick the highest session preset the device supports
    [self setCaptureSessionPreset];
    // Create the capture input from the device
    self.videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:self.videoDevice error:nil];
    // Create the video data output
    self.videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    // Drop frames that arrive late instead of queueing them up
    [self.videoOutput setAlwaysDiscardsLateVideoFrames:YES];
    // Pixel format: NV12 (420YpCbCr8 bi-planar, full range)
    self.videoOutput.videoSettings = [NSDictionary dictionaryWithObject:@(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
                                                                  forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    // Deliver sample buffers to the delegate on a serial queue (required so frames arrive in order)
    dispatch_queue_t processQueue = dispatch_queue_create("video.capture.queue", DISPATCH_QUEUE_SERIAL);
    [self.videoOutput setSampleBufferDelegate:self queue:processQueue];
    // Add the input and output objects to the session
    if ([self.videoSession canAddInput:self.videoInput])
    {
        [self.videoSession addInput:self.videoInput];
    }
    if ([self.videoSession canAddOutput:self.videoOutput])
    {
        [self.videoSession addOutput:self.videoOutput];
    }
    // The AVCaptureConnection linking the input's video port to the output
    _videoConnection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
    // Video orientation
    [_videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
    // Enable video stabilization if the connection supports it
    if ([_videoConnection isVideoStabilizationSupported]) {
        _videoConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
    }
    // Scale-and-crop (zoom) factor
    _videoConnection.videoScaleAndCropFactor = _videoConnection.videoMaxScaleAndCropFactor;
    [self.videoSession commitConfiguration];
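
The setup above calls a cameraWithPosition: helper that isn't shown in the post. A minimal sketch of what it might look like (my assumption; on iOS 10+ AVCaptureDeviceDiscoverySession is the preferred way to find cameras):

// Returns the camera at the requested position (front or back), or nil if none is found
- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position
{
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if (device.position == position) {
            return device;
        }
    }
    return nil;
}

Also note that after commitConfiguration the session still has to be started with [self.videoSession startRunning] (ideally off the main thread, and only once camera permission has been granted); without it the delegate callback further down never fires.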

Automatically adapting to the device's highest supported resolution


// Walk down from the highest preset to the lowest and pick the first one the device supports
- (void)setCaptureSessionPreset
{
    NSArray *presets = @[AVCaptureSessionPreset1920x1080,
                         AVCaptureSessionPreset1280x720,
                         AVCaptureSessionPresetiFrame960x540,
                         AVCaptureSessionPreset640x480,
                         AVCaptureSessionPreset352x288];
    for (NSString *preset in presets) {
        if ([self.videoDevice supportsAVCaptureSessionPreset:preset]) {
            self.videoSession.sessionPreset = preset;
            return;
        }
    }
    NSLog(@"No supported session preset found for this device");
}
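
Note that supportsAVCaptureSessionPreset: asks the device. AVCaptureSession also offers canSetSessionPreset:, which asks the session itself and takes the currently attached inputs into account, so it is most meaningful once the device input has been added. As an alternative, a sketch of the same top-down fallback written against the session:

// Alternative: ask the session (with its current inputs) instead of the device
- (void)setCaptureSessionPresetUsingSession
{
    NSArray *presets = @[AVCaptureSessionPreset1920x1080,
                         AVCaptureSessionPreset1280x720,
                         AVCaptureSessionPresetiFrame960x540,
                         AVCaptureSessionPreset640x480,
                         AVCaptureSessionPreset352x288];
    for (NSString *preset in presets) {
        if ([self.videoSession canSetSessionPreset:preset]) {
            self.videoSession.sessionPreset = preset;
            return;
        }
    }
}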

Callback

// Video capture data callback, invoked once per captured frame
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (connection == self.videoConnection)
    {
        // First frame: record the capture start time
        if (!self.isComed)
        {
            // Start timestamp in microseconds
            self.startTime = [[NSDate date] timeIntervalSince1970] * 1000 * 1000;
            self.isComed = YES;
            self.timeStamp = 0;
            NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
            [defaults setInteger:self.startTime forKey:@"startTime"];
        }
        else
        {
            // Elapsed microseconds * 0.09 converts the elapsed time to 90 kHz clock units (the clock commonly used for RTP video timestamps)
            self.timeStamp = ([[NSDate date] timeIntervalSince1970] * 1000 * 1000 - self.startTime) * 0.09;
        }
        // Hand the sample buffer and its timestamp to the delegate
        [self.delegate backWithVideobuffer:sampleBuffer andTimeStamp:self.timeStamp];
    }
}
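
On the other side, the delegate receives the raw CMSampleBufferRef. A hypothetical implementation of backWithVideobuffer:andTimeStamp: showing how the NV12 data configured via kCVPixelBufferPixelFormatTypeKey can be read out before encoding or rendering (only the method name comes from the post; the body is a sketch):

// Hypothetical delegate implementation: reads the NV12 planes out of the sample buffer
- (void)backWithVideobuffer:(CMSampleBufferRef)sampleBuffer andTimeStamp:(uint32_t)timeStamp
{
    // The pixel data configured with kCVPixelBufferPixelFormatTypeKey lives in a CVPixelBuffer
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) {
        return;
    }
    // Lock the base address before touching the pixel memory
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    // NV12 is bi-planar: plane 0 is Y, plane 1 is interleaved CbCr
    uint8_t *yPlane  = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    uint8_t *uvPlane = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
    size_t yStride   = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
    size_t uvStride  = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);

    // ...hand the planes, strides and timestamp to the encoder / renderer here...
    NSLog(@"frame %zux%zu ts %u, Y %p stride %zu, CbCr %p stride %zu",
          width, height, timeStamp, yPlane, yStride, uvPlane, uvStride);

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}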

Finally

Demo address
Next up: video rendering
Questions and corrections are welcome.
