iOS Live Streaming (3): Capturing Audio/Video with GPUImage and Writing It to a File

The previous article covered image capture with the GPUImage image-processing library, which avoids the verbose code of using AVFoundation (AVKit) directly and lets you apply beautification and filters quickly without knowing OpenGL ES. This article covers how to combine multiple filters, record video, and save it to the local sandbox.

This article assumes you have already integrated GPUImage as described in the previous article.
1. Declare the necessary properties

   fileprivate lazy var camera : GPUImageVideoCamera = GPUImageVideoCamera(sessionPreset: AVCaptureSession.Preset.high.rawValue, cameraPosition: .back)
    
    //Preview view that displays the camera feed in real time
    fileprivate lazy var showView = GPUImageView(frame: view.bounds)
    
    //Filters
    let bilateralFilter = GPUImageBilateralFilter()   //bilateral blur (skin smoothing)
    let exposureFilter = GPUImageExposureFilter()     //exposure
    let brightnessFilter = GPUImageBrightnessFilter() //brightness (whitening)
    let satureationFilter = GPUImageSaturationFilter()//saturation
    
    //Movie writer that encodes the filtered video to a file
    fileprivate lazy var movieWriter : GPUImageMovieWriter = {
        [unowned self] in
        let writer = GPUImageMovieWriter(movieURL: self.fileURL, size: self.view.bounds.size)
        return writer!
    }()
    
    //Sandbox URL for the recorded video file
    fileprivate lazy var fileURL : URL = {
        [unowned self] in
        return URL(fileURLWithPath: "\(NSTemporaryDirectory())movie\(arc4random()).mp4")
    }()
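GPUImageMovieWriter is backed by AVAssetWriter, which fails if a file already exists at the target URL. Since the URL above contains arc4random(), collisions are unlikely, but if you switch to a fixed path you should delete any leftover file first. A minimal sketch (the helper name is my own, not part of the original demo):

    //Remove a previously recorded file before reusing the same URL
    //(hypothetical helper; only needed if the path is not randomized)
    fileprivate func removeExistingFile(at url: URL) {
        if FileManager.default.fileExists(atPath: url.path) {
            try? FileManager.default.removeItem(at: url)
        }
    }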

2. Configure everything in viewDidLoad, then start capturing and recording

override func viewDidLoad() {
        super.viewDidLoad()
        //Output the video in portrait orientation
        camera.outputImageOrientation = .portrait
        //Mirror the output of the front-facing camera horizontally
        camera.horizontallyMirrorFrontFacingCamera = true

        //Insert the preview view at the bottom of the view hierarchy
        view.insertSubview(showView, at: 0)
        
        //Build the filter group
        let filterGroup = getGroupFilters()
        //Set up the GPUImage processing chain
        camera.addTarget(filterGroup)
        filterGroup.addTarget(showView)

        //Start capturing
        camera.startCapture()
        
        //Configure the movie writer
        movieWriter.encodingLiveVideo = true
        filterGroup.addTarget(movieWriter)
        //Requires the view controller to conform to GPUImageVideoCameraDelegate
        camera.delegate = self
        camera.audioEncodingTarget = movieWriter
        movieWriter.startRecording()
    }
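Note that on iOS 10 and later the app also needs NSCameraUsageDescription and NSMicrophoneUsageDescription entries in Info.plist, otherwise it will be terminated when the capture session starts. If you want to check camera authorization in code before calling startCapture(), a sketch along these lines works (placement is up to you; AVFoundation must be imported):

    //Optional pre-check: ask for camera permission before starting capture
    //(sketch; microphone access can be requested the same way with .audio)
    AVCaptureDevice.requestAccess(for: .video) { granted in
        guard granted else { return }   //handle denial as needed
        DispatchQueue.main.async {
            self.camera.startCapture()
        }
    }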

3. The getGroupFilters method used in the previous step stacks several filter effects. GPUImage processes frames as a chain: objects are added to the chain with addTarget, and once one target has processed a frame it hands the result on to the next target, forming the GPUImage processing chain (an equivalent chain built without a filter group is sketched after the code below).

fileprivate func getGroupFilters() -> GPUImageFilterGroup {
        //Create the filter group
        let filterGroup = GPUImageFilterGroup()
        
        //Chain the filters together
        bilateralFilter.addTarget(brightnessFilter)
        brightnessFilter.addTarget(exposureFilter)
        exposureFilter.addTarget(satureationFilter)
        
        //Set the group's initial and terminal filters
        filterGroup.initialFilters = [bilateralFilter]
        filterGroup.terminalFilter = satureationFilter
        
        return filterGroup
    }
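For reference, the same chain could be built without a GPUImageFilterGroup by adding the targets one after another; the group simply packages this sub-chain so the camera only needs a single target. A rough equivalent (a sketch, not from the original demo):

    //Equivalent chain without a filter group:
    //camera -> bilateral -> brightness -> exposure -> saturation -> showView / movieWriter
    camera.addTarget(bilateralFilter)
    bilateralFilter.addTarget(brightnessFilter)
    brightnessFilter.addTarget(exposureFilter)
    exposureFilter.addTarget(satureationFilter)
    satureationFilter.addTarget(showView)
    satureationFilter.addTarget(movieWriter)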

4. Add a button to the interface; tapping it stops capture and recording, then plays back the recorded video with AVPlayerViewController (a variant that waits for the writer to finish the file is sketched after the code).

    @IBAction func clickPlay(_ sender: Any) {
        print(fileURL)
        camera.stopCapture()
        showView.removeFromSuperview()
        movieWriter.finishRecording()
        let playerVc = AVPlayerViewController()
        playerVc.player = AVPlayer(url: fileURL)
        present(playerVc, animated: true, completion: nil)
    }
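finishRecording() returns before the file is fully finalized, so presenting the player immediately can occasionally hit an incomplete file. GPUImageMovieWriter also exposes finishRecordingWithCompletionHandler:, which I assume imports into Swift as finishRecording(completionHandler:); a sketch of the same playback step using it:

    //Sketch: finish writing first, then present the player once the file is complete
    movieWriter.finishRecording(completionHandler: {
        DispatchQueue.main.async {
            let playerVc = AVPlayerViewController()
            playerVc.player = AVPlayer(url: self.fileURL)
            self.present(playerVc, animated: true, completion: nil)
        }
    })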

Running result: (screenshot)
Demo code: https://github.com/dolacmeng/LiveGPUImageDemo/tree/record
