Converting a YUV420 CVPixelBuffer to RGB

The CVPixelBuffer extracted from ARKit is in YUV420 (bi-planar YpCbCr) format. In many cases we need to convert it to RGB before doing any further processing. Here we use a function from the Accelerate framework (vImage) to perform the conversion:

Declaration
func vImageConvert_420Yp8_CbCr8ToARGB8888(_ srcYp: UnsafePointer<vImage_Buffer>, _ srcCbCr: UnsafePointer<vImage_Buffer>, _ dest: UnsafePointer<vImage_Buffer>, _ info: UnsafePointer<vImage_YpCbCrToARGB>, _ permuteMap: UnsafePointer<UInt8>!, _ alpha: UInt8, _ flags: vImage_Flags) -> vImage_Error
Parameters
srcYp
A pointer to the vImage buffer that references the source Yp plane.

srcCbCr
A pointer to the vImage buffer that references the source CbCr plane.

dest
A pointer to the vImage buffer that references 8-bit ARGB interleaved destination pixels.

info
A pointer to a vImage_YpCbCrToARGB structure that contains the conversion coefficients and prebias values.

permuteMap
An array of four 8-bit integers with the values 0, 1, 2, and 3, in some order. Each value specifies a channel from the source image that should be copied to that channel in the destination image. 0 denotes the alpha channel, 1 the red channel, 2 the green channel, and 3 the blue channel. Pass nil for the identity map, i.e. plain ARGB output.

alpha
A value for the alpha channel in dest.

flags
The options to use when performing the operation. If you plan to perform your own tiling or use multithreading, pass kvImageDoNotTile.
Return Value
kvImageNoError if the conversion succeeded; otherwise, one of the error codes described in Data Types and Constants.
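As an illustration of permuteMap, suppose downstream code expects BGRA byte order rather than ARGB. The sketch below is a hypothetical helper (not part of the original listing); it assumes the source buffers and the info structure are prepared exactly as in the full code that follows:

```swift
import Accelerate

/// Hypothetical variant of the conversion that emits BGRA instead of ARGB.
/// The buffers and `info` are assumed to be set up as in the full listing below.
func convertToBGRA(_ srcYp: inout vImage_Buffer,
                   _ srcCbCr: inout vImage_Buffer,
                   _ dest: inout vImage_Buffer,
                   _ info: inout vImage_YpCbCrToARGB) -> vImage_Error {
    // permuteMap[i] names the source channel (0 = A, 1 = R, 2 = G, 3 = B)
    // that is copied into destination channel i, so [3, 2, 1, 0] emits B,G,R,A.
    let permuteMap: [UInt8] = [3, 2, 1, 0]
    return vImageConvert_420Yp8_CbCr8ToARGB8888(&srcYp, &srcCbCr, &dest,
                                                &info, permuteMap, 255,
                                                vImage_Flags(kvImageNoFlags))
}
```
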
Full code:
import UIKit
import Accelerate

func pixelBufferToImage(pixelBuffer: CVPixelBuffer) -> UIImage {
    //01
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    //02
    defer {
        CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly)
    }

    //03
    let lumaBaseAddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0)
    //04
    let lumaWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
    let lumaHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
    let lumaRowBytes = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
    var sourceLumaBuffer = vImage_Buffer(data: lumaBaseAddress, height: vImagePixelCount(lumaHeight), width: vImagePixelCount(lumaWidth), rowBytes: lumaRowBytes)

    let chromaBaseAddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1)
    let chromaWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 1)
    let chromaHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 1)
    let chromaRowBytes = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1)
    var sourceChromaBuffer = vImage_Buffer(data: chromaBaseAddress, height: vImagePixelCount(chromaHeight), width: vImagePixelCount(chromaWidth), rowBytes: chromaRowBytes)

    //05
    guard let rawRGBBuffer = malloc(lumaWidth * lumaHeight * 4) else {
        return UIImage()
    }
    var rgbBuffer = vImage_Buffer(data: rawRGBBuffer, height: vImagePixelCount(lumaHeight), width: vImagePixelCount(lumaWidth), rowBytes: lumaWidth * 4)

    //06
    guard var conversionInfoYpCbCrToARGB = _conversionInfoYpCbCrToARGB else {
        free(rawRGBBuffer)
        return UIImage()
    }

    //07
    guard vImageConvert_420Yp8_CbCr8ToARGB8888(&sourceLumaBuffer, &sourceChromaBuffer, &rgbBuffer, &conversionInfoYpCbCrToARGB, nil, 255, vImage_Flags(kvImageNoFlags)) == kvImageNoError else {
        free(rawRGBBuffer)
        return UIImage()
    }

    //08
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let ctx = CGContext(data: rgbBuffer.data, width: lumaWidth, height: lumaHeight, bitsPerComponent: 8, bytesPerRow: rgbBuffer.rowBytes, space: colorSpace, bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)

    let imageRef = ctx!.makeImage()!
    let uiimage = UIImage(cgImage: imageRef)

    //09
    free(rawRGBBuffer)
    return uiimage
}

private var _conversionInfoYpCbCrToARGB: vImage_YpCbCrToARGB? = {
    var pixelRange = vImage_YpCbCrPixelRange(Yp_bias: 16, CbCr_bias: 128, YpRangeMax: 235, CbCrRangeMax: 240, YpMax: 235, YpMin: 16, CbCrMax: 240, CbCrMin: 16)
    var infoYpCbCrToARGB = vImage_YpCbCrToARGB()
    guard vImageConvert_YpCbCrToARGB_GenerateConversion(kvImage_YpCbCrToARGBMatrix_ITU_R_601_4!, &pixelRange, &infoYpCbCrToARGB, kvImage420Yp8_CbCr8, kvImageARGB8888, vImage_Flags(kvImageNoFlags)) == kvImageNoError else {
        return nil
    }
    return infoYpCbCrToARGB
}()
Code walkthrough

01. When accessing the raw data inside the buffer (whether reading or writing), you must lock the base address first and unlock it when you are done:

CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly) 
CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly)

02. The code inside the defer block runs just before the function returns:

defer {
    CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly)
}

03. The CVPixelBuffer obtained from an ARFrame in ARKit is in YUV420 format. In this format the Y (luminance) plane and the interleaved CbCr (chrominance) plane are stored separately; their base addresses can be obtained as follows:

let lumaBaseAddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0)
let chromaBaseAddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1)
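Before treating plane 0 as Y and plane 1 as CbCr, it can be worth verifying that the buffer really is bi-planar YUV420. This small check is an addition not in the original article, using only Core Video APIs:

```swift
import CoreVideo

/// Sketch: confirm the pixel buffer is one of the bi-planar 4:2:0 YpCbCr
/// formats before reading its planes.
func isBiPlanarYUV420(_ pixelBuffer: CVPixelBuffer) -> Bool {
    let format = CVPixelBufferGetPixelFormatType(pixelBuffer)
    return format == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
        || format == kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
}
```
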

04. Create the vImage_Buffers. The conversion function takes three vImage buffers; the first two must be initialized from the Y-plane and CbCr-plane addresses:
srcYp:

let lumaWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
let lumaHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
let lumaRowBytes = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
var sourceLumaBuffer = vImage_Buffer(data: lumaBaseAddress, height: vImagePixelCount(lumaHeight), width: vImagePixelCount(lumaWidth), rowBytes: lumaRowBytes)

srcCbCr:

let chromaWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 1)
let chromaHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 1)
let chromaRowBytes = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1)
var sourceChromaBuffer = vImage_Buffer(data: chromaBaseAddress, height: vImagePixelCount(chromaHeight), width: vImagePixelCount(chromaWidth), rowBytes: chromaRowBytes)

05. The third vImage buffer, dest, must be backed by freshly allocated memory:

guard let rawRGBBuffer = malloc(lumaWidth * lumaHeight * 4) else {
    return UIImage()
}
var rgbBuffer = vImage_Buffer(data: rawRGBBuffer, height: vImagePixelCount(lumaHeight), width: vImagePixelCount(lumaWidth), rowBytes: lumaWidth * 4)
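As an aside, the destination memory does not have to come from malloc; a sketch of the same setup using Swift's own allocator (an alternative I am suggesting, not part of the original article) could look like this:

```swift
import Accelerate

/// Sketch: allocate an ARGB8888 destination buffer with Swift's allocator.
/// Pair this with `buffer.data.deallocate()` when finished (not `free`).
func makeARGBDestination(width: Int, height: Int) -> vImage_Buffer {
    let bytesPerRow = width * 4
    let raw = UnsafeMutableRawPointer.allocate(byteCount: height * bytesPerRow,
                                               alignment: MemoryLayout<UInt8>.alignment)
    return vImage_Buffer(data: raw,
                         height: vImagePixelCount(height),
                         width: vImagePixelCount(width),
                         rowBytes: bytesPerRow)
}
```
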

06. The fourth parameter, info, holds the conversion coefficients. Since it never changes, it is generated once in a lazily initialized property:

guard var conversionInfoYpCbCrToARGB = _conversionInfoYpCbCrToARGB else {
    free(rawRGBBuffer)
    return UIImage()
}

private var _conversionInfoYpCbCrToARGB: vImage_YpCbCrToARGB? = {
    var pixelRange = vImage_YpCbCrPixelRange(Yp_bias: 16, CbCr_bias: 128, YpRangeMax: 235, CbCrRangeMax: 240, YpMax: 235, YpMin: 16, CbCrMax: 240, CbCrMin: 16)
    var infoYpCbCrToARGB = vImage_YpCbCrToARGB()
    guard vImageConvert_YpCbCrToARGB_GenerateConversion(kvImage_YpCbCrToARGBMatrix_ITU_R_601_4!, &pixelRange, &infoYpCbCrToARGB, kvImage420Yp8_CbCr8, kvImageARGB8888, vImage_Flags(kvImageNoFlags)) == kvImageNoError else {
        return nil
    }
    return infoYpCbCrToARGB
}()
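Note that the pixel range above describes video-range (16-235) YpCbCr. ARKit's captured images are usually full range (kCVPixelFormatType_420YpCbCr8BiPlanarFullRange), so if the output colors look slightly washed out or over-contrasted, a full-range setup may fit better. The following is a sketch of that alternative, with range values taken from Apple's vImage documentation for full-range 8-bit YpCbCr:

```swift
import Accelerate

// Sketch: conversion info for full-range (0-255) YpCbCr, ITU-R BT.601.
private var _fullRangeConversionInfo: vImage_YpCbCrToARGB? = {
    // Full-range biases and limits per Apple's vImage documentation.
    var pixelRange = vImage_YpCbCrPixelRange(Yp_bias: 0, CbCr_bias: 128,
                                             YpRangeMax: 255, CbCrRangeMax: 255,
                                             YpMax: 255, YpMin: 1,
                                             CbCrMax: 255, CbCrMin: 0)
    var info = vImage_YpCbCrToARGB()
    guard vImageConvert_YpCbCrToARGB_GenerateConversion(
        kvImage_YpCbCrToARGBMatrix_ITU_R_601_4!,
        &pixelRange, &info,
        kvImage420Yp8_CbCr8, kvImageARGB8888,
        vImage_Flags(kvImageNoFlags)) == kvImageNoError else {
        return nil
    }
    return info
}()
```
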

07. Perform the conversion:

guard vImageConvert_420Yp8_CbCr8ToARGB8888(&sourceLumaBuffer, &sourceChromaBuffer, &rgbBuffer, &conversionInfoYpCbCrToARGB, nil, 255, vImage_Flags(kvImageNoFlags)) == kvImageNoError else {
    free(rawRGBBuffer)
    return UIImage()
}

At this point the conversion to RGB is complete.
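With everything in place, converting a camera frame from an ARKit session is a single call. A usage sketch, assuming the function lives in an ARSessionDelegate:

```swift
import ARKit

// Sketch: convert each camera frame inside the ARSessionDelegate callback.
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let image = pixelBufferToImage(pixelBuffer: frame.capturedImage)
    // ... use `image` for display, saving, etc. ...
}
```
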

08. Here we wrap the RGB buffer in a UIImage. Note the bitmapInfo value: the data is ARGB with the alpha byte first and set to an opaque 255, so CGImageAlphaInfo.noneSkipFirst is the matching option:

let colorSpace = CGColorSpaceCreateDeviceRGB()
let ctx = CGContext(data: rgbBuffer.data, width: lumaWidth, height: lumaHeight, bitsPerComponent: 8, bytesPerRow: rgbBuffer.rowBytes, space: colorSpace, bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)

let imageRef = ctx!.makeImage()!
let uiimage = UIImage(cgImage: imageRef)

09. Finally, don't forget to release rawRGBBuffer. Since it was allocated with malloc, it must be released with free:

free(rawRGBBuffer)
Timing tests

In ARKit, the camera capture resolution can be set on the ARWorldTrackingConfiguration:

worldTrackingSessionConfiguration.videoFormat = ARWorldTrackingConfiguration.supportedVideoFormats[0]
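If conversion cost matters more than image quality, you might pick the smallest supported format instead of index 0. This selection logic is my own sketch, not from the original article; `worldTrackingSessionConfiguration` is the configuration object shown above:

```swift
import ARKit

// Sketch: choose the supported video format with the fewest pixels
// to minimize per-frame conversion cost.
let formats = ARWorldTrackingConfiguration.supportedVideoFormats
let smallest = formats.min { a, b in
    a.imageResolution.width * a.imageResolution.height <
    b.imageResolution.width * b.imageResolution.height
}
if let format = smallest {
    worldTrackingSessionConfiguration.videoFormat = format
}
```
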

On iPhone X there are three resolutions to choose from:

ARWorldTrackingConfiguration.supportedVideoFormats[0]   1920*1440
ARWorldTrackingConfiguration.supportedVideoFormats[1]   1920*1080
ARWorldTrackingConfiguration.supportedVideoFormats[2]   1024*768

Timing results:

Resolution    Steps 01-07    Step 08
1920*1440     2.58 ms        4.36 ms
1920*1080     2.03 ms        3.05 ms
1024*768      1.4 ms         1.4 ms
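Timings like these can be reproduced with a simple wall-clock measurement around the call. A minimal sketch, where `frame` is assumed to be an ARFrame from the session:

```swift
import CoreFoundation

// Sketch: time the conversion of a single frame, in milliseconds.
let start = CFAbsoluteTimeGetCurrent()
_ = pixelBufferToImage(pixelBuffer: frame.capturedImage)
let elapsedMs = (CFAbsoluteTimeGetCurrent() - start) * 1000
print("conversion: \(elapsedMs) ms")
```
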

In the vast majority of cases only steps 01-07 are needed; step 08 can be skipped.

Other notes

When we use the machine-learning image-analysis APIs in the Vision framework, the CVPixelBuffer is converted to RGB automatically.
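For example, a Vision request can consume the YUV buffer directly and Vision handles the color conversion internally. A sketch of this, where `model` is an assumed, already-loaded VNCoreMLModel:

```swift
import Vision

// Sketch: Vision accepts the YUV420 pixel buffer as-is;
// `model` is an assumed, already-loaded VNCoreMLModel.
func classify(_ pixelBuffer: CVPixelBuffer, with model: VNCoreMLModel) {
    let request = VNCoreMLRequest(model: model)
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```
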

If all we need is grayscale data, we can use the Y-plane data directly.
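A grayscale UIImage can be built from the Y plane alone, skipping the YUV-to-RGB conversion entirely. This helper is my own sketch of that idea, not code from the original article:

```swift
import UIKit

/// Sketch: wrap the luminance (Y) plane in a grayscale CGImage directly.
func grayscaleImage(from pixelBuffer: CVPixelBuffer) -> UIImage? {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0) else {
        return nil
    }
    let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
    let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
    let rowBytes = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)

    // One 8-bit component per pixel, no alpha, device gray color space.
    let ctx = CGContext(data: base, width: width, height: height,
                        bitsPerComponent: 8, bytesPerRow: rowBytes,
                        space: CGColorSpaceCreateDeviceGray(),
                        bitmapInfo: CGImageAlphaInfo.none.rawValue)
    guard let cgImage = ctx?.makeImage() else { return nil }
    return UIImage(cgImage: cgImage)
}
```
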
