Video Stream Processing (Real-Time Beautification and Filters) with Simple Core Image Rendering

Main idea: capture frames from the camera to obtain a video stream, apply beautification/filter processing to each frame, then hand the processed frames to Core Image for rendering.

Video capture: the AVFoundation framework (AVFoundation/AVFoundation.h)

Notes:

 AVCaptureDevice is the interface to the camera hardware. It is used to control hardware features such as the lens position, exposure, and flash.

 AVCaptureOutput is an abstract class describing the result of a capture session. Three concrete subclasses of interest here:

 AVCaptureStillImageOutput captures still images
 AVCaptureMetadataOutput enables detection of faces and QR codes
 AVCaptureVideoDataOutput provides raw frames for real-time processing and preview
 AVCaptureSession manages the data flow between inputs and outputs, and generates runtime errors when something goes wrong.
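Before wiring any of this together, the app must also be authorized to use the camera. A minimal sketch of the check (requestAccessForMediaType: invokes its handler on an arbitrary queue, so hop back to the main queue before touching UI):

AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
if (status == AVAuthorizationStatusNotDetermined) {
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
        dispatch_async(dispatch_get_main_queue(), ^{
            if (granted) {
                [self setupCaptureSession];   // the setup method shown below
            }
        });
    }];
}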

Notes:

 AVCaptureVideoPreviewLayer is a CALayer subclass that automatically displays the live image produced by the camera. It also has several utility methods for converting coordinates on the layer into device coordinates. It looks like an output, but it is not one; moreover, it owns its session (whereas outputs are owned by the session).


AVCaptureSession controls the flow of video data from the input device (AVCaptureDeviceInput) into the output buffers (AVCaptureOutput). Once an AVCaptureSession has been started, it collects data from its inputs and delivers it to the output buffers at the appropriate times.


 AVCaptureVideoPreviewLayer displays the raw data from the input device by default. To apply real-time filters or draw extra content on top of this layer, you have to fetch the frame data from the video output buffer, process it, and then output the resulting pixels to another layer or to an OpenGL context.
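The sample code below converts each frame to a UIImage through a CGBitmapContext. Core Image can also wrap the pixel buffer directly, which skips that per-frame CPU copy; a minimal sketch of this variant (same CIHueAdjust filter as used later):

// Inside captureOutput:didOutputSampleBuffer:fromConnection:
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIFilter *hue  = [CIFilter filterWithName:@"CIHueAdjust"];
[hue setValue:frame forKey:kCIInputImageKey];
[hue setValue:@1.0f forKey:kCIInputAngleKey];
CIImage *filtered = hue.outputImage;   // hand this to the rendering view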

If you only capture the raw camera feed, without any processing, the result looks like this:
[Figure 1: unprocessed camera preview]
The code:

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    [self setupCaptureSession];
}

- (void)setupCaptureSession
{
    // Initialize the GPUView (used later for Core Image rendering)
    _gpuView = [[GPUView alloc] initWithFrame:self.view.bounds];
    [self.view addSubview:_gpuView];
    self.view.backgroundColor = [UIColor redColor];

    NSError *error = nil;

    // Create the session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPresetMedium;

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice
                               defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"Failed to create device input: %@", error);
        return;
    }
    [session addInput:input];

    // Create a VideoDataOutput and add it to the session (no autorelease under ARC)
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [session addOutput:output];

    // Configure your output. Sample buffers are delivered on this serial queue.
    // Under ARC (iOS 6+) dispatch objects need no dispatch_release.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];

    // Specify the pixel format
    output.videoSettings =
    @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

    // If you wish to cap the frame rate to a known value, such as 15 fps,
    // configure the device (output.minFrameDuration has long been deprecated).
    if ([device lockForConfiguration:&error]) {
        device.activeVideoMinFrameDuration = CMTimeMake(1, 15);
        [device unlockForConfiguration];
    }

    // Show the raw camera feed in a preview layer (it retains the session)
    preLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    preLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    preLayer.frame = [UIScreen mainScreen].bounds;
    [self.view.layer addSublayer:preLayer];

    // Start the session running to start the flow of data
    [session startRunning];
}
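Note that -[AVCaptureSession startRunning] is a blocking call and can take a noticeable amount of time, so Apple recommends not invoking it on the main thread. A small sketch of that adjustment (the queue name is illustrative):

dispatch_queue_t sessionQueue = dispatch_queue_create("sessionQueue", DISPATCH_QUEUE_SERIAL);
dispatch_async(sessionQueue, ^{
    [session startRunning];
});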
To render the frames on the GPU with Core Image, adopt AVCaptureVideoDataOutputSampleBufferDelegate and implement its frame callback (note that it is didOutputSampleBuffer:, not didDropSampleBuffer:, which merely reports dropped frames):

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection;

Each incoming sampleBuffer is then processed. The results look like this:
[Figures 2 and 3: live camera frames rendered through the CIHueAdjust filter]

The complete code:

#import "ViewController.h"
#import 
//#import 
#import "GPUView.h"
@interface ViewController ()<AVCaptureVideoDataOutputSampleBufferDelegate>
{
    AVCaptureVideoPreviewLayer *preLayer;
}
// Strong reference keeps the session alive under ARC (see setupCaptureSession)
@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, strong) CIFilter  *ciFilter2;
@property (nonatomic, strong) GPUView   *gpuView;

@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    [self setupCaptureSession];
}

- (void)setupCaptureSession
{
    // Initialize the GPUView that will render the filtered frames
    _gpuView = [[GPUView alloc] initWithFrame:self.view.bounds];
    [self.view addSubview:_gpuView];
    self.view.backgroundColor = [UIColor redColor];

    NSError *error = nil;

    // Create the session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPresetMedium;

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice
                               defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"Failed to create device input: %@", error);
        return;
    }
    [session addInput:input];

    // Create a VideoDataOutput and add it to the session (no autorelease under ARC)
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [session addOutput:output];

    // Configure your output. Sample buffers are delivered on this serial queue.
    // Under ARC (iOS 6+) dispatch objects need no dispatch_release.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];

    // Specify the pixel format
    output.videoSettings =
    @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

    // If you wish to cap the frame rate to a known value, such as 15 fps,
    // configure the device (output.minFrameDuration has long been deprecated).
    if ([device lockForConfiguration:&error]) {
        device.activeVideoMinFrameDuration = CMTimeMake(1, 15);
        [device unlockForConfiguration];
    }

    // Keep a strong reference; with the preview layer disabled below, nothing
    // else retains the session, and ARC would otherwise deallocate it as soon
    // as this method returns.
    self.session = session;

    // The preview layer is intentionally disabled here: frames are rendered
    // by GPUView after filtering instead.
//    preLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
//    preLayer.videoGravity = AVLayerVideoGravityResizeAspect;
//    preLayer.frame = [UIScreen mainScreen].bounds;
//    [self.view.layer addSublayer:preLayer];

    // Start the session running to start the flow of data
    [session startRunning];
}

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Runs on the background queue configured in setupCaptureSession
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

    CIImage *ciimage = [[CIImage alloc] initWithImage:image];
    _ciFilter2 = [CIFilter filterWithName:@"CIHueAdjust"];
    [_ciFilter2 setValue:ciimage forKey:kCIInputImageKey];
    // Note: setValue:forKey:, not forKeyPath:
    [_ciFilter2 setValue:@1.0f forKey:kCIInputAngleKey];
    CIImage *outputImage = [_ciFilter2 outputImage];
    [_gpuView drawCIImage:outputImage];
}

// Create a UIImage from sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data.
    // (Little-endian 32-bit with premultiplied-first alpha matches the
    // kCVPixelFormatType_32BGRA layout requested in videoSettings.)
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}

@end

GPUView, the Core Image rendering view used above:

//
//  GPUView.m
//  OpenGL_ES_1
//
//  Created by fsk-0-1-n on 16/9/8.
//  Copyright © 2016 Xoxo. All rights reserved.
//

#import "GPUView.h"

@interface GPUView ()

@property (nonatomic, assign)  CGRect     rectInPixels;
@property (nonatomic, strong)  CIContext *context;
@property (nonatomic, strong)  GLKView   *showView;

@end

@implementation GPUView

- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self)
    {
        // Create the OpenGL ES 2.0 rendering context
        EAGLContext *eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];

        // Initialize the GLKView with that context and bind its drawable
        _showView = [[GLKView alloc] initWithFrame:frame context:eaglContext];
        [_showView bindDrawable];

        // Add it to the view hierarchy
        [self addSubview:_showView];

        // Create a CIContext backed by the same OpenGL ES context; passing
        // NSNull as the working color space disables color management,
        // which is faster for real-time rendering.
        _context = [CIContext contextWithEAGLContext:eaglContext
                                             options:@{kCIContextWorkingColorSpace : [NSNull null]}];

        // Define the drawing area (drawableWidth/Height are in pixels, not points)
        _rectInPixels = CGRectMake(0.0, 0.0, _showView.drawableWidth, _showView.drawableHeight);
    }
    return self;
}

- (void)drawCIImage:(CIImage *)ciImage
{
    // Render the CIImage into the GLKView's drawable, entirely on the GPU
    [_context drawImage:ciImage
                 inRect:_rectInPixels
               fromRect:[ciImage extent]];
    // Alternative: convert the CIImage to a UIImage on the CPU
//    CGImageRef cgimg = [_context createCGImage:ciImage fromRect:[ciImage extent]];
//    UIImage *newImg = [UIImage imageWithCGImage:cgimg];
//    CGImageRelease(cgimg);
    // Present the rendered frame
    [_showView display];
}

@end
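As written, drawCIImage: stretches every frame to fill the whole drawable, so the image distorts whenever the camera frame and the view have different aspect ratios. A sketch of an aspect-fit rect that could be passed to drawImage:inRect:fromRect: in place of _rectInPixels (an addition of mine, not part of the original view):

- (CGRect)aspectFitRectForImage:(CIImage *)ciImage
{
    CGRect extent = [ciImage extent];
    CGFloat scale = MIN(_rectInPixels.size.width  / extent.size.width,
                        _rectInPixels.size.height / extent.size.height);
    CGFloat w = extent.size.width  * scale;
    CGFloat h = extent.size.height * scale;
    // Center the scaled image inside the drawable
    return CGRectMake((_rectInPixels.size.width  - w) / 2.0,
                      (_rectInPixels.size.height - h) / 2.0,
                      w, h);
}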
