Capturing a Frame from an HLS (m3u8) Stream with AVPlayer

AVPlayer's API is genuinely unfriendly: a single frame-capture problem took me six hours to fully solve. Searching online turns up two approaches: using AVPlayerItemVideoOutput, and using AVAssetImageGenerator.

AVAssetImageGenerator handles capture for non-HLS sources (i.e. not segmented streams), such as MPEG-4 files. At first, because I hadn't constrained the time tolerance, the captured frame always lagged the requested time by 1–2 seconds:

AVAsset *asset = self.playerItem.asset;
self.imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
self.imageGenerator.appliesPreferredTrackTransform = YES;
// Note: request frame-accurate capture; this is the same precision model as seekToTime's tolerances
self.imageGenerator.requestedTimeToleranceBefore = kCMTimeZero;
self.imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;
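
For reference, precise seeking uses the same zero-tolerance idea; a minimal sketch, assuming self.player is the AVPlayer driving playback:

[self.player seekToTime:time
        toleranceBefore:kCMTimeZero
         toleranceAfter:kCMTimeZero
      completionHandler:^(BOOL finished) {
          // finished is YES once the player has landed exactly on `time`
      }];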

// Capture: the frame the generator returns may not match the requested time, so retry a few times
// (`time` is the requested CMTime; `timestamp` is the same moment in seconds)
CGImageRef videoImage = NULL;
UIImage *captureImage = nil;
int attemptNumber = 0;
BOOL success = NO;
while (attemptNumber < 3 && !success) {
    CMTime actualTime;
    NSError *error = nil;
    videoImage = [self.imageGenerator copyCGImageAtTime:time actualTime:&actualTime error:&error];
    if (error || videoImage == NULL) {
        return nil;
    }
    captureImage = [UIImage imageWithCGImage:videoImage];
    CGImageRelease(videoImage);
    float actual = CMTimeGetSeconds(actualTime);
    if (isFloatEqual(timestamp, actual)) {
        success = YES;
    } else {
        attemptNumber++;
    }
}
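
The isFloatEqual helper above is app code not shown here; one possible definition (the 100 ms tolerance is my assumption):

// Hypothetical helper: treat two timestamps as equal if they are within 100 ms
static inline BOOL isFloatEqual(float a, float b) {
    return fabsf(a - b) < 0.1f;
}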

You can also use generateCGImagesAsynchronouslyForTimes:completionHandler: to fetch frames for several time points asynchronously.
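
A minimal sketch of that asynchronous variant, reusing self.imageGenerator from above (the sample times are made up):

NSArray<NSValue *> *times = @[[NSValue valueWithCMTime:CMTimeMake(1, 1)],
                              [NSValue valueWithCMTime:CMTimeMake(2, 1)]];
[self.imageGenerator generateCGImagesAsynchronouslyForTimes:times
                                          completionHandler:^(CMTime requestedTime,
                                                              CGImageRef image,
                                                              CMTime actualTime,
                                                              AVAssetImageGeneratorResult result,
                                                              NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded && image != NULL) {
        UIImage *frame = [UIImage imageWithCGImage:image];
        // hand `frame` off to the main queue for any UI work
    }
}];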

AVPlayerItemVideoOutput, by contrast, is what handles HLS (m3u8) capture. At first I paused the video before capturing and everything worked, but capturing during playback hit the buffer-returns-NULL problem described in this article. After a lot of digging I found this answer, which says:

1. Create player & player item, dispatch queue and display link
2. Register observer for AVPlayerItem status key
3. On status AVPlayerStatusReadyToPlay, create AVPlayerItemVideoOutput and start display link

I duly moved the construction and registration of the AVPlayerItemVideoOutput to after AVPlayerStatusReadyToPlay, and it still didn't work. Only when an early objc.io article demonstrated how to use CADisplayLink did I understand what "create AVPlayerItemVideoOutput and start display link" actually meant. A rough sketch of steps 1 and 2 follows; my working code for step 3 comes after it.
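
This sketch of steps 1 and 2 is mine, not the answer's code; the property names and videoURL are illustrative:

// Step 1: create the player and item (the display link itself is created in the next snippet)
self.playerItem = [AVPlayerItem playerItemWithURL:videoURL];
self.player = [AVPlayer playerWithPlayerItem:self.playerItem];

// Step 2: observe the item's status key
[self.playerItem addObserver:self
                  forKeyPath:@"status"
                     options:NSKeyValueObservingOptionNew
                     context:nil];

// Step 3 then runs from the KVO callback:
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if (object == self.playerItem && [keyPath isEqualToString:@"status"]) {
        // the ready-to-play branch below is evaluated here
    }
}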

// Create the output once the item reaches AVPlayerItemStatusReadyToPlay
if (self.config.isVideoHLSFormat
    && self.videoOutput == nil
    && self.playerItem.status == AVPlayerItemStatusReadyToPlay) {
    NSDictionary *settings = @{(id)kCVPixelBufferPixelFormatTypeKey:
                                   @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)};
    self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
    [self.playerItem addOutput:self.videoOutput];
    // NSRunLoopCommonModes keeps the display link firing even while the UI is tracking touches
    self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(displayLinkCallback:)];
    [self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}


// On each display-link tick, cache the most recent pixel buffer
- (void)displayLinkCallback:(CADisplayLink *)sender {
    CMTime time = [self.videoOutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([self.videoOutput hasNewPixelBufferForItemTime:time]) {
        // copyPixelBufferForItemTime returns a +1 retained buffer; release the previous one to avoid leaking
        if (self.lastSnapshotPixelBuffer) {
            CVPixelBufferRelease(self.lastSnapshotPixelBuffer);
        }
        self.lastSnapshotPixelBuffer = [self.videoOutput copyPixelBufferForItemTime:time itemTimeForDisplay:NULL];
    }
}

// Capture: ask the output for a buffer at the current time
CVPixelBufferRef buffer = [self.videoOutput copyPixelBufferForItemTime:time itemTimeForDisplay:NULL];
BOOL ownsBuffer = (buffer != NULL);
// Fall back to the cached buffer so capturing while paused still returns a frame
if (buffer == NULL && self.lastSnapshotPixelBuffer) {
    buffer = self.lastSnapshotPixelBuffer;
}
if (buffer) {
    // Wrap the pixel buffer in a CIImage
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
    // Render it into a CGImage
    CIContext *context = [CIContext contextWithOptions:nil];
    size_t width = CVPixelBufferGetWidth(buffer);
    size_t height = CVPixelBufferGetHeight(buffer);
    CGImageRef videoImage = [context createCGImage:ciImage
                                          fromRect:CGRectMake(0, 0, width, height)];
    // Produce the final UIImage
    captureImage = [UIImage imageWithCGImage:videoImage];
    CGImageRelease(videoImage);
}
// Only release buffers we copied ourselves; the cached one is released during cleanup
if (ownsBuffer) {
    CVPixelBufferRelease(buffer);
}

My handling of the captured image is kept simple: I just compress it with UIImageJPEGRepresentation(image, 0.5), though you can of course do more elaborate processing.
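
A minimal sketch of that compression step, assuming captureImage is the UIImage from above and outputPath is a hypothetical destination:

NSData *jpegData = UIImageJPEGRepresentation(captureImage, 0.5);
[jpegData writeToFile:outputPath atomically:YES];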

Finally, when rebuilding the playerItem, don't forget to clean up these resources:

[self.playerItem removeOutput:self.videoOutput];
self.videoOutput = nil;
[self.displayLink invalidate];
self.displayLink = nil;
// Release the cached buffer (it was copied with +1 retain in the display-link callback)
if (self.lastSnapshotPixelBuffer) {
    CVPixelBufferRelease(self.lastSnapshotPixelBuffer);
    self.lastSnapshotPixelBuffer = NULL;
}

One last question I have no answer for: I don't know how to build image previews over the buffered range of an HLS video using AVPlayer's native APIs.
