iOS (Swift): Beauty Filters, GPUImage, and FaceU — Assorted Notes

I recently revisited beauty-filter integration, so I'm writing it down here.
The three pillars of live streaming: capture -> beautify -> push/pull streaming.
This post only gives a brief walkthrough of the beauty-filter pipeline, plus a few demo tests.

Basic Concepts

  • CMSampleBuffer: the CM prefix means it belongs to the Core Media framework. It wraps an image plus metadata beyond the image itself, such as a CMTime. CVPixelBuffer is declared as typealias CVPixelBuffer = CVImageBuffer and is pure image data, so converting a CMSampleBuffer into a CVPixelBuffer is trivial via public func CMSampleBufferGetImageBuffer(_ sbuf: CMSampleBuffer) -> CVImageBuffer?. The reverse conversion is also possible, but some of that metadata is lost (a minimal sketch of both conversions follows this list).
  • OpenGL is a graphics API specification, an abstract interface whose low-level implementation differs per platform. On iOS the implementation is OpenGL ES, a C-style API that is hard to read without dedicated study. We don't need to know its internals here, and this post won't (and can't) go that deep.
  • GPUImage is an open-source framework that wraps OpenGL ES in a more object-oriented API and ships with a large number of filters.
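Here is a minimal sketch of both directions of that conversion. The forward direction is a single call; the reverse direction has to rebuild the format description and timing info by hand, which is where the extra metadata gets lost. The helper names are mine, not system API.

import CoreMedia
import CoreVideo

// CMSampleBuffer -> CVPixelBuffer: one call, the image data is shared, nothing is lost.
func pixelBuffer(from sampleBuffer: CMSampleBuffer) -> CVPixelBuffer? {
    // CVPixelBuffer is a typealias of CVImageBuffer, so the result is usable directly.
    return CMSampleBufferGetImageBuffer(sampleBuffer)
}

// CVPixelBuffer -> CMSampleBuffer: we must rebuild the format description and supply
// timing info ourselves, which is why metadata from the original buffer is lost.
func sampleBuffer(from pixelBuffer: CVPixelBuffer, timestamp: CMTime) -> CMSampleBuffer? {
    var formatDescription: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelBuffer,
                                                 formatDescriptionOut: &formatDescription)
    guard let format = formatDescription else { return nil }

    var timing = CMSampleTimingInfo(duration: .invalid,
                                    presentationTimeStamp: timestamp,
                                    decodeTimeStamp: .invalid)
    var result: CMSampleBuffer?
    CMSampleBufferCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                       imageBuffer: pixelBuffer,
                                       dataReady: true,
                                       makeDataReadyCallback: nil,
                                       refcon: nil,
                                       formatDescription: format,
                                       sampleTiming: &timing,
                                       sampleBufferOut: &result)
    return result
}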

Prerequisites

During video capture, no matter how the parameters are configured or how the result is forwarded, the native AVCaptureVideoDataOutputSampleBufferDelegate method optional func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) is always where the captured data arrives.
What beautification processes is that captured CMSampleBuffer (or its CVPixelBuffer), and what we ultimately upload when streaming is the CVPixelBuffer that comes out of the beauty pipeline. Taking Agora's push-stream API as an example:

/**
 * Pushes the external video frame.
 *
 * This method pushes the video frame using the AgoraVideoFrame class and
 * passes it to the Agora SDK with the `format` parameter in AgoraVideoFormat.
 *
 * Call {@link setExternalVideoSource:useTexture:pushMode: setExternalVideoSource}
 * and set the `pushMode` parameter as `YES` before calling this method.
 * @note
 * In the Communication profile, this method does not support pushing textured
 * video frames.
 * @param frame Video frame containing the SDK's encoded video data to be
 * pushed: #AgoraVideoFrame.
 * @return
 * - `YES`: Success.
 * - `NO`: Failure.
 */
- (BOOL)pushExternalVideoFrame:(AgoraVideoFrame * _Nonnull)frame;

// Calling it from our own code
func pushVideoFrame(_ buffer: CVPixelBuffer, timestamp: CMTime) -> Bool {
    let frame: AgoraVideoFrame = AgoraVideoFrame()
    frame.format = 12            // 12 = iOS texture (CVPixelBuffer) in AgoraVideoFrame's format values
    frame.rotation = 0
    frame.textureBuf = buffer    // the beautified CVPixelBuffer
    frame.time = timestamp       // presentation timestamp from the capture callback
    return rtcEngine.pushExternalVideoFrame(frame)
}

In the end, we build an AgoraVideoFrame and push it.
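To make the whole flow concrete, here is a rough sketch of where that call sits: the capture delegate hands us the raw frame, a beauty stage (GPUImage, FaceUnity, or whatever you use) processes it, and the result goes to pushVideoFrame. CaptureHandler and its two closures are hypothetical stand-ins I made up for the sketch, not part of any SDK.

import AVFoundation

// Rough sketch of the pipeline: capture -> beautify -> push.
final class CaptureHandler: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    // Hypothetical beauty stage: takes the raw buffer, returns the processed one.
    var beautify: (CVPixelBuffer) -> CVPixelBuffer = { $0 }
    // Injected from the object that owns the Agora engine, e.g. pushVideoFrame(_:timestamp:) above.
    var push: (CVPixelBuffer, CMTime) -> Bool = { _, _ in false }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // The raw frame from the camera.
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

        // 1. Beautify (GPUImage, FaceUnity, ...).
        let processed = beautify(pixelBuffer)

        // 2. Push the processed CVPixelBuffer to the streaming SDK.
        _ = push(processed, timestamp)
    }
}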

Testing the GPUImage Framework


This test only covers some of GPUImage's effect filters, not every feature; the test source code is attached at the end.

GPUImage Summary

GPUImage ships with well over a hundred filters, and explanations for each are easy to find online. If you read the source you will also see the OpenGL code behind them: in a given filter, identifiers starting with texture2D or gl are OpenGL ES (GLSL) code. If you want to go deeper into graphics, study OpenGL systematically, but be warned: it is genuinely hard, and the parameters are extremely fiddly to tune.
My attitude toward beauty filters is pragmatic: know roughly how they work, know where in the pipeline to hook in, and have a sane strategy for parameter tuning, for example a serial queue to throttle slider updates (sketched right below); above all, keep your own app smooth.
If you think "GPUImage is enough, I don't need a third-party beauty SDK", that is a big mistake. GPUImage does give us plenty of fun filters, but beautification is a specialized niche: the community beauty filter I found online does not support fine-grained tweaks such as eye enlargement, face slimming, chin reshaping, eye-corner opening, or philtrum shortening. Paid vendors like FaceUnity exist precisely because they have researched beautification in depth, and on top of that there is face tracking plus prop/sticker rendering. A product-focused company could spend years building that in-house and still not finish, let alone make money from it.
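As an illustration of that "serial queue for the slider" idea, here is a minimal throttler sketch of my own (not something GPUImage provides): slider callbacks dump their value onto one serial queue, intermediate values are dropped, and only the latest value is applied every ~50 ms.

import Foundation
import CoreGraphics

// Minimal sketch of throttling slider-driven filter tweaks on a serial queue,
// so the render pipeline is not hit by every single slider tick.
final class FilterParamThrottler {
    private let queue = DispatchQueue(label: "filter.param.queue")
    private var pendingValue: CGFloat?

    /// `apply` is called on the serial queue with only the most recent value.
    func setValue(_ value: CGFloat, apply: @escaping (CGFloat) -> Void) {
        queue.async {
            let alreadyScheduled = self.pendingValue != nil
            self.pendingValue = value
            guard !alreadyScheduled else { return }   // a flush is already pending
            self.queue.asyncAfter(deadline: .now() + 0.05) {
                if let latest = self.pendingValue {
                    self.pendingValue = nil
                    apply(latest)
                }
            }
        }
    }
}

// Hypothetical usage from a slider callback:
// throttler.setValue(v) { value in DispatchQueue.main.async { controller.sliderValueChanged(value) } }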

Uploading GPUImage's buffer

In my tests I used GPUImageVideoCamera for the effects integration, but that class only exposes a GPUImageVideoCameraDelegate, and its method func willOutputSampleBuffer(_ sampleBuffer: CMSampleBuffer!) fires before the filters run. What we need is the CMSampleBuffer after beautification. After some digging I found GPUImageStillCamera, which can output the filtered result in various forms (PNG, UIImage, JPEG, and so on), convenient for photo capture and image saving; but what we want is the processed sample buffer itself.


Its .m implementation indicates that you have to modify the source, and doing so breaks the image/file outputs; still, it is the only way I currently know to get the buffer in real time. If you know another way, please let me know.
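For reference, a route I have not benchmarked but that stays inside GPUImage's public API is GPUImageRawDataOutput: hang it off the end of the filter chain and rebuild a CVPixelBuffer from its BGRA bytes. The sketch below assumes a BGRA output whose size matches your session preset; the helper name and the copy strategy are mine.

import Foundation
import UIKit
import CoreVideo
import GPUImage

// Sketch: tap the *filtered* frames via GPUImageRawDataOutput and rebuild a CVPixelBuffer.
// Note the CPU copy has a cost; measure before shipping.
func attachRawOutput(to lastFilter: GPUImageOutput, size: CGSize,
                     handler: @escaping (CVPixelBuffer) -> Void) -> GPUImageRawDataOutput {
    let rawOutput = GPUImageRawDataOutput(imageSize: size, resultsInBGRAFormat: true)
    lastFilter.addTarget(rawOutput)

    rawOutput?.newFrameAvailableBlock = { [weak rawOutput] in
        guard let rawOutput = rawOutput else { return }
        rawOutput.lockFramebufferForReading()
        defer { rawOutput.unlockFramebufferAfterReading() }

        guard let bytes = rawOutput.rawBytesForImage else { return }
        let srcBytesPerRow = Int(rawOutput.bytesPerRowInOutput())

        var pixelBuffer: CVPixelBuffer?
        CVPixelBufferCreate(kCFAllocatorDefault, Int(size.width), Int(size.height),
                            kCVPixelFormatType_32BGRA, nil, &pixelBuffer)
        guard let buffer = pixelBuffer else { return }

        CVPixelBufferLockBaseAddress(buffer, [])
        if let dest = CVPixelBufferGetBaseAddress(buffer) {
            // Copy row by row in case the destination stride differs from the source.
            let destBytesPerRow = CVPixelBufferGetBytesPerRow(buffer)
            for row in 0..<Int(size.height) {
                memcpy(dest + row * destBytesPerRow,
                       bytes + row * srcBytesPerRow,
                       min(srcBytesPerRow, destBytesPerRow))
            }
        }
        CVPixelBufferUnlockBaseAddress(buffer, [])

        handler(buffer)   // e.g. pushVideoFrame(buffer, timestamp: ...)
    }
    return rawOutput!
}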

Buffer processing with the Nama library

Nama is FaceUnity's beauty SDK, and it is what my company currently uses. Pricing aside, integration is fairly straightforward and the demo effects are very rich.
Processing again happens inside AVCaptureVideoDataOutputSampleBufferDelegate:


This is the processing from the FaceUnity demo: it runs a series of operations on the CVPixelBuffer, and afterwards we can synchronously upload the buffer to the server.
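Schematically, that step looks like the sketch below. The actual FURenderer render call (renderPixelBuffer:withFrameId:items:itemCount: in the demo I integrated) differs between Nama SDK versions, so it is represented here by the hypothetical renderWithFaceUnity closure; the surrounding code is plain Core Media.

import CoreMedia
import CoreVideo

// Schematic sketch of the FaceUnity processing stage. `renderWithFaceUnity` is a
// hypothetical wrapper for FURenderer's render call -- substitute the exact method
// from FURenderer.h in your SDK version.
final class FaceUnityStage {
    /// Item handles loaded from the purchased .bundle files (see the next section).
    var items: [Int32] = []
    private var frameId: Int32 = 0

    /// (pixelBuffer, frameId, itemsPointer, itemCount) -> processed pixelBuffer
    var renderWithFaceUnity: (CVPixelBuffer, Int32, UnsafeMutablePointer<Int32>, Int32) -> CVPixelBuffer
        = { buffer, _, _, _ in buffer }

    /// Call this from captureOutput(_:didOutput:from:); the returned buffer is what
    /// gets handed to pushVideoFrame(_:timestamp:).
    func process(_ sampleBuffer: CMSampleBuffer) -> CVPixelBuffer? {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
        guard !items.isEmpty else { return pixelBuffer }
        frameId += 1
        return items.withUnsafeMutableBufferPointer { ptr in
            renderWithFaceUnity(pixelBuffer, frameId, ptr.baseAddress!, Int32(ptr.count))
        }
    }
}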

Nama's "item handle" concept

In the demo, FaceUnity introduces the concept of an item handle (道具句柄):

NSString *path = [[NSBundle mainBundle] pathForResource:@"tiara" ofType:@"bundle"];
int itemHandle = [FURenderer itemWithContentsOfFile:path];

An item handle is a loaded prop/effect bundle. When rendering the captured buffer, we pass in items: &items, the address of an array of item handles. FaceU's service is sold per feature: you can buy beautification, AR, stickers, and so on, and each feature corresponds to a bundle. Bundles are loaded from files into item handles, and we pass the array of every item handle we want rendered. Setting a beauty parameter is just a call on the beauty item handle:
FURenderer.itemSetParam(self.items[Handle.beauty.rawValue], withName: param.key, value: tempValue). My company only bought the beauty item, which is basically enough for us.
With that, the beauty effect is rendered in real time during capture. The remaining pieces (initialization, authentication, and so on) come with complete integration code when you purchase FaceU, including even the beauty-panel UI code, so the service is fairly complete.
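Pieced together from the two calls quoted above, handle loading plus the beauty-parameter call might look like the sketch below. The Swift-imported names (item(withContentsOfFile:) and itemSetParam(_:withName:value:)) are how the Objective-C header imports for me and may differ slightly per SDK version; the Handle enum, bundle name, and parameter key are just examples.

import Foundation

// Sketch of loading item handles and tuning the beauty item. FURenderer comes from the
// Nama SDK (bridging header or module, depending on how you integrate it).
enum Handle: Int {
    case beauty = 0
    // one slot per purchased feature bundle (stickers, AR, ...)
}

final class FaceUnityItems {
    /// One handle per loaded bundle; the whole array goes into the render call's `items`.
    private(set) var items = [Int32](repeating: 0, count: 1)

    /// Load a .bundle file into its slot, turning it into an item handle.
    func load(bundleNamed name: String, into slot: Handle) {
        guard let path = Bundle.main.path(forResource: name, ofType: "bundle") else { return }
        items[slot.rawValue] = FURenderer.item(withContentsOfFile: path)
    }

    /// Set a parameter on the beauty item, e.g. setBeautyParam("blur_level", 4.5).
    func setBeautyParam(_ key: String, _ value: Double) {
        FURenderer.itemSetParam(items[Handle.beauty.rawValue], withName: key, value: value)
    }
}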

GPUImage Effect Showcase


The beautify filter comes from: click here

GPUImage Test Code

Below is the GPUImage test code I used; it is just a very simple effects test.


Page

The Page inherits from UIViewController and is responsible for layout and rendering; it contains no business-logic code.

class GPUImageTestPage: BasePage {
    //MARK:- --------------------------------------infoProperty
    lazy var controller = GPUImageTestController(self)
    //MARK:- --------------------------------------UIProperty
    let interactionView = GPUInteractionView()
    //MARK:- --------------------------------------system
    override func commonInit() {
        super.commonInit()
        Get.put(controller)
    }
    override func viewDidLoad() {
        super.viewDidLoad()
        title = "美颜测试"
        interactionView.add(to: self.view)
    }
    
    func finishInitial() {
        self.view.insertSubview(controller._outputView, at: 0)
        updateUI()
    }
    
    func updateUI() {
        interactionView.updateUI()
        view.setNeedsLayout()
        view.layoutIfNeeded()
    }
    
    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        controller._outputView.pin.all()
        interactionView.pin.all()
    }
}

Controller

All of the page's logic lives here: GPUImageVideoCamera setup, adding filters, filter selection, and slider-driven value changes (supported by some filters).

//
//  GPUImageTestController.swift
//  quick
//
//  Created by suyikun on 2022/3/4.
//

import Foundation
import GPUImage

class GPUImageTestController: GetController {
    
    var page: GPUImageTestPage? { _page as? GPUImageTestPage }
    
    // MARK: - --------------------------------------infoproperty
    var openSelect: Bool = false
    
    var filterList: [FilterModel] = []
    
    var currentFilter: FilterModel?
    
    private var filterBag: [FilterType: GPUImageOutput&GPUImageInput] = [:]
    
    var filterQueue: DispatchQueue = DispatchQueue(label: "gpu.image.queue", qos: DispatchQoS.default, attributes: [], autoreleaseFrequency: DispatchQueue.AutoreleaseFrequency.workItem, target: nil)
    // MARK: - --------------------------------------uiProperty
    
    var _videoCamera: GPUImageVideoCamera!
    var _outputView: GPUImageView = GPUImageView()
    private var _beautifyFilter: GPUImageBeautifyFilter?
    
    override func onViewDidLoad() {
        super.onViewDidLoad()
        initList()
        initialCamera()
    }
    
    func initList() {
        filterList = [
            FilterModel(type: .eGPUImageStretchDistortionFilter, title: "哈哈镜", defaultValue: 0.5),
            FilterModel(type: .eBrightness, title: "亮度", defaultValue: 0.5),
            FilterModel(type: .eExposure, title: "对比度", defaultValue: 0.5),
            FilterModel(type: .eGPUImageGrayscaleFilter, title: "灰度", defaultValue: 0.5),
            FilterModel(type: .eInvert, title: "反色", defaultValue: 0.5),
            FilterModel(type: .eBeautify, title: "美颜", defaultValue: 0.5),
            FilterModel(type: .eMonochrome, title: "黑白", defaultValue: 0.5),
            FilterModel(type: .eGPUImageSepiaFilter, title: "褐色", defaultValue: 0.5),
            FilterModel(type: .eGPUImageGammaFilter, title: "伽马", defaultValue: 0.5),
            FilterModel(type: .eGPUImageSketchFilter, title: "素描", defaultValue: 0.5),
            FilterModel(type: .eGPUImageToonFilter, title: "卡通效果", defaultValue: 0.5),
//            FilterModel(type: .eGPUImageMosaicFilter, title: "黑白马赛克", defaultValue: 0.5),
        ]
        filterBag = [
            .eGPUImageStretchDistortionFilter: GPUImageStretchDistortionFilter(),
            .eInvert : GPUImageColorInvertFilter(),
            .eExposure : GPUImageExposureFilter(),
            .eGPUImageGrayscaleFilter: GPUImageGrayscaleFilter(),
            .eBeautify: GPUImageBeautifyFilter(),
            .eBrightness : GPUImageBrightnessFilter(),
            .eMonochrome: GPUImageMonochromeFilter(),
            .eGPUImageSepiaFilter: GPUImageSepiaFilter(),
            .eGPUImageGammaFilter: GPUImageGammaFilter(),
            .eGPUImageSketchFilter: GPUImageSketchFilter(),
            .eGPUImageToonFilter: GPUImageToonFilter()
        ]
        page?.updateUI()
    }
    
    func initialCamera() {
        _videoCamera = GPUImageVideoCamera(sessionPreset: AVCaptureSession.Preset.iFrame1280x720.rawValue, cameraPosition: .front)
        _videoCamera.outputImageOrientation = .portrait
        _videoCamera.horizontallyMirrorFrontFacingCamera = true
        _videoCamera.frameRate = 25
        _videoCamera.delegate = self
        
        
        openSelect = true
        _videoCamera.addTarget(_outputView)
        _videoCamera.startCapture()
        DispatchQueue.main.async {
            self.page?.finishInitial()
        }
        
    }
    
    // MARK: - --------------------------------------action
    func cameraSelect() {
        openSelect = !openSelect
        if openSelect {
            _videoCamera.resumeCameraCapture()
        } else {
            _videoCamera.pauseCapture()
        }
        page?.updateUI()
    }
    
    func selectCurrentFilter(_ model: FilterModel?) {
        guard let model = model else { return }
        
        llog(model)
        if model.isSelected {
            model.isSelected = false
            currentFilter = nil
        } else {
            currentFilter = model
            if model.type.priority == 1 { // deselect other filters with the same priority
                filterList.filter({ $0.type.priority == 1 }).forEach { $0.isSelected = false }
            }
            model.isSelected = true
        }
        
        page?.updateUI()
        
        
        _videoCamera.removeAllTargets()
        let list = filterList.filter { $0.isSelected == true }
        var lastFilter: (GPUImageOutput&GPUImageInput)?
        for filterModel in list {
            let filter = filterBag[filterModel.type]
            if let last = lastFilter {
                last.addTarget(filter)
                lastFilter = filter
            } else {
                _videoCamera.addTarget(filter)
                lastFilter = filter
            }
        }
        
        (lastFilter ?? _videoCamera).addTarget(_outputView)
    }
    
    /// Slider callback: maps the 0...1 slider value onto each filter's parameter range
    func sliderValueChanged(_ v: CGFloat) {
        guard let fModel = currentFilter else { return }
        fModel.value = v
        let currentFilter = filterBag[fModel.type]
        if let f = currentFilter as? GPUImageStretchDistortionFilter { // stretch distortion ("funhouse mirror")
            f.center = MakePoint(v, 0.5)
        } else if let f = currentFilter as? GPUImageExposureFilter { // exposure: -10.0 to 10.0
            f.exposure = v * 20 - 10
        } else if let f = currentFilter as? GPUImageBrightnessFilter { // brightness: -1.0 to 1.0
            f.brightness = v * 2 - 1
        } else if let f = currentFilter as? GPUImageGammaFilter { // gamma: 0.0 to 3.0
            f.gamma = v * 3
        } else if let f = currentFilter as? GPUImageSaturationFilter { // saturation: 0.0 to 2.0
            f.saturation = v * 2
        }
    }
    
    private func getFilter(_ model: FilterModel) -> GPUImageOutput&GPUImageInput {
        if let filter = filterBag[model.type] {
            return filter
        } else {
            assertionFailure("无初始话的滤镜")
            return GPUImageExposureFilter()
        }
    }
    
    override func onDispose() {
        super.onDispose()
        _videoCamera.stopCapture()
    }
}
extension GPUImageTestController: GPUImageVideoCameraDelegate {
    func willOutputSampleBuffer(_ sampleBuffer: CMSampleBuffer!) {
        // Note: this delegate fires *before* the filters run, so cvbuffer is the raw camera frame.
        guard let cvbuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
//        llog(cvbuffer)
    }
}

View


class GPUInteractionView: DXView {
    
    var c: GPUImageTestController? {
        Get.find(GPUImageTestController.self)
    }
    
    override func commonInit() {
        super.commonInit()
        
//        rootFlex.column.alignItems(.center)
        
        listView.add(to: self).snap {
            $0.left.equalTo(20)
            $0.top.equalTo(Screen.navigationHeight + 20)
            $0.width.equalTo(80)
            $0.bottom.equalTo(-Screen.safeArea.bottom-200)
        }
        
        sliderView.add(to: self).snap {
            $0.left.right.equalToSuperview()
            $0.bottom.equalToSuperview()
            $0.height.equalTo(200)
        }
    }
    
    override func allEvents() {
        super.allEvents()
        
        openButton.r.touchUpInside.observeValues {[weak self] _ in
            guard let self = self, let c = self.c else { return }
            c.cameraSelect()
        }
    }
    
    func updateUI() {
        guard let c = c else {
            return
        }
        openButton.isSelected = c.openSelect
        
        listView.updateUI()
        sliderView.updateUI()
    }
    
    // MARK: - --------------------------------------lazy
    
    lazy var listView: GPUFilterListView = GPUFilterListView()
    lazy var sliderView: GPUFilterSliderView = GPUFilterSliderView()
    lazy var openButton: UIButton = UIButton().then {
        $0.setTitle("开播", for: .normal)
        $0.setTitle("关播", for: .selected)
        $0.layer.cornerRadius = 22
        $0.layer.masksToBounds = true
        $0.titleLabel?.font = .regular(18)
        $0.setTitleColor(.white, for: .normal)
        $0.backgroundColor = .female
    }
}
//
//  GPUFilterListView.swift
//  quick
//
//  Created by suyikun on 2022/3/4.
//

import Foundation
import UIKit

class GPUFilterListView: DXView, UICollectionViewDelegateFlowLayout, UICollectionViewDataSource {
    
    var c: GPUImageTestController? {
        Get.find(GPUImageTestController.self)
    }
    
    var layout: UICollectionViewFlowLayout {
        UICollectionViewFlowLayout().then {
            $0.itemSize = MakeSize(70, 35)
            $0.scrollDirection = .vertical
            $0.minimumLineSpacing = 10
            $0.minimumInteritemSpacing = 10
        }
    }
    lazy var listView = UICollectionView(frame: .zero, collectionViewLayout: layout)
    
    override func commonInit() {
        super.commonInit()
        
        rootFlex.addItem(listView).position(.absolute).all(0)
        listView.dataSource = self
        listView.delegate = self
        listView.registReusable(FilterSelectCell.self)
        listView.backgroundColor = .clear
    }
    
    func updateUI() {
        listView.reloadData()
    }
    
    func collectionView(_ collectionView: UICollectionView, numberOfItemsInSection section: Int) -> Int {
        c?.filterList.count ?? 0
    }
    
    func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
        let cell = listView.reuseCell(for: indexPath, cellType: FilterSelectCell.self)
        cell.update(model: c?.filterList[safe: indexPath.item])
        return cell
    }
    
    func collectionView(_ collectionView: UICollectionView, didSelectItemAt indexPath: IndexPath) {
        guard let c = c, let model = c.filterList[safe: indexPath.item] else { return }
        c.selectCurrentFilter(model)
    }
}

class FilterSelectCell: UICollectionViewCell {
    
    let flexContainer = UIView()
    
    let lblTitle = UILabel().then {
        $0.textColor = .white
        $0.numberOfLines = 0
        $0.textAlignment = .center
    }
    
    override init(frame: CGRect) {
        super.init(frame: frame)
        commonInit()
    }
    
    required init?(coder: NSCoder) {
        super.init(coder: coder)
        commonInit()
    }
    
    override func awakeFromNib() {
        super.awakeFromNib()
        commonInit()
    }
    
    func commonInit() {
        contentView.backgroundColor = .clear
        backgroundColor = .clear
        
        flexContainer.add(to: contentView)
        flexContainer.flex.hvCenter.define {
            $0.addItem(lblTitle)
        }
        flexContainer.isUserInteractionEnabled = false
        flexContainer.cornerRadius = 5
    }
    
    func update(model: AnyObject?) {
        guard let model = model as? FilterModel else { return }
        lblTitle.text = model.title
        lblTitle.flex.markDirty()
        if model.isSelected {
            contentView.backgroundColor = .male
            lblTitle.textColor = .white
        } else {
            contentView.backgroundColor = .white
            lblTitle.textColor = .title
        }
        setNeedsLayout()
    }
    
    override func layoutSubviews() {
        super.layoutSubviews()
        flexContainer.frame = contentView.bounds
        flexContainer.flex.layout()
    }
}

Model

//
//  FilterModel.swift
//  quick
//
//  Created by suyikun on 2022/3/4.
//

import Foundation
import HandyJSON
import GPUImage

class FilterModel: NSObject, HandyJSON {
    required override init() {}
    
    /// Filter type (maps to a GPUImage filter class)
    var type: FilterType = .eInvert
    /// Display name shown in the UI
    var title: String = ""
    /// Current parameter value
    var value: CGFloat?
    /// Default value
    var defaultValue: CGFloat = 0
    
    var is101: Bool = false
    
    // MARK: - --------------------------------------custom
    /// Whether this filter is currently selected
    var isSelected: Bool = false
    
    init(type: FilterType, title: String, value: CGFloat? = nil, defaultValue: CGFloat = 0, is101: Bool = false) {
        self.type = type
        self.title = title
        if let value = value {
            self.value = value
        }
        self.defaultValue = defaultValue
        self.is101 = is101
    }
    
    override var description: String {
        self.toJSONString() ?? ""
    }
}

enum FilterType: Int, HandyJSONEnum, Equatable {
    /// Stretch distortion ("funhouse mirror")
    case eGPUImageStretchDistortionFilter
    /// Brightness
    case eBrightness
    /// Exposure (the UI labels it "contrast")
    case eExposure
    /// Color invert
    case eInvert
    /// Beautify
    case eBeautify
    /// Monochrome
    case eMonochrome
    /// Sepia (vintage)
    case eGPUImageSepiaFilter
    /// Grayscale
    case eGPUImageGrayscaleFilter
    /// Gamma
    case eGPUImageGammaFilter
    /// Sketch
    case eGPUImageSketchFilter
    /// Toon (cartoon)
    case eGPUImageToonFilter
    
    var canSlider: Bool {
        self == .eBrightness
        || self == .eExposure
        || self == .eGPUImageStretchDistortionFilter
        || self == .eGPUImageGammaFilter
    }
    /// Priority (selected filters with the same priority replace each other)
    var priority: Int {
        return 1
//        switch self {
//        case .eBrightness, .eExposure:
//            return 2
//        default:
//            return 1
//        }
    }
    static func == (lhs: Self, rhs: Self) -> Bool {
        lhs.rawValue == rhs.rawValue
    }
}
