iOS: AVFoundation video - fast, slow, and reverse playback

Screenshots (animated GIFs in the original post): Normal.gif, Slow.gif, Fast.gif, Reverse.gif

Video preview playback

Import the framework: #import <AVFoundation/AVFoundation.h>
Use AVPlayer to play the video; its rate property controls fast and slow playback.
Use these AVPlayerItem properties to check which rates the item supports:
canPlayReverse: supports reverse playback at rate -1.0
canPlaySlowReverse: supports slow reverse playback (rates between -1.0 and 0.0)
canPlaySlowForward: supports slow forward playback
canPlayFastForward: supports playback faster than normal

Core code:

Note: reverse playback has to start from the end, so seek to the video's duration before setting a negative rate. Setting any nonzero rate starts playback, so a separate play call is not needed.

- (void)clickAction:(UIButton *)sender{
    
    switch (sender.tag) {
        case 0: // reverse
            if ([self.avplayer.currentItem canPlayReverse]) {
                // Reverse playback has to start from the end of the video
                __weak typeof(self) weakSelf = self;
                [self.avplayer seekToTime:self.avplayer.currentItem.duration
                          toleranceBefore:kCMTimeZero
                           toleranceAfter:kCMTimeZero
                        completionHandler:^(BOOL finished) {
                    weakSelf.avplayer.rate = -1.0;
                }];
            }else{
                [self showAlert:@"Reverse playback is not supported"];
            }
            break;
        case 1: // normal speed
            [self.avplayer seekToTime:kCMTimeZero];
            self.avplayer.rate = 1.0;
            break;
        case 2: // half speed
            if ([self.avplayer.currentItem canPlaySlowForward]) {
                [self.avplayer seekToTime:kCMTimeZero];
                self.avplayer.rate = 0.5;
            }else{
                [self showAlert:@"0.5x playback is not supported"];
            }
            break;
        case 3: // double speed
            if ([self.avplayer.currentItem canPlayFastForward]) {
                [self.avplayer seekToTime:kCMTimeZero];
                self.avplayer.rate = 2.0;
            }else{
                [self showAlert:@"2x playback is not supported"];
            }
            break;
            
        default:
            break;
    }
}

Generating a processed video file

To apply a speed change to the video file itself, the approach I use is AVAssetReader + AVAssetWriter.

Reverse: my first idea was to have AVAssetWriter write the CMSampleBufferRefs from back to front, but Apple does not appear to support that; writing has to proceed forward from time zero. Instead, I use the resetForReadingTimeRanges: method, which accepts an array of time ranges. Iterating that array in reverse and calling copyNextSampleBuffer pulls CMSampleBufferRefs out from the end of the video toward the beginning.

To use resetForReadingTimeRanges:, the AVAssetReaderTrackOutput must have supportsRandomAccess set to YES, so that it is not limited to sequential reads.
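As a sketch, the reader and the nextVideoSample helper that the snippets in this post rely on could be set up like this. The property names (_reader, _readerTrackOutput_video) and the pixel-format choice are assumptions, not taken from the original code:

```objectivec
// Illustrative setup; _reader and _readerTrackOutput_video are assumed ivars,
// and 32BGRA is one reasonable output format, not the only option.
- (void)setupReader {
    AVAssetTrack *videoTrack = [self.videoAsset tracksWithMediaType:AVMediaTypeVideo].firstObject;

    NSError *error = nil;
    _reader = [AVAssetReader assetReaderWithAsset:self.videoAsset error:&error];

    NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    _readerTrackOutput_video = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
                                                                          outputSettings:settings];
    // Must be YES before resetForReadingTimeRanges: can be used
    _readerTrackOutput_video.supportsRandomAccess = YES;

    [_reader addOutput:_readerTrackOutput_video];
    [_reader startReading];
}

// Assumed thin wrapper around copyNextSampleBuffer; returns NULL once the current range is exhausted
- (CMSampleBufferRef)nextVideoSample {
    return [_readerTrackOutput_video copyNextSampleBuffer];
}
```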

First, collect the array of sample timestamps and the array of time ranges:

- (void)setclipTimeRangeArray{
    CMSampleBufferRef sample;
    CMTime presentationTime = kCMTimeZero;
    CMTime startTime = kCMTimeZero;
    CMTime endTime = kCMTimeZero;
    NSUInteger processIndex = 0;
    
    self.clipTimeRangeArray = [NSMutableArray array];
    self.sampleTimeArray = [NSMutableArray array];
    
    // Group every 10 samples into one time range
    while((sample = [self nextVideoSample])) {
        // Presentation timestamp of this sample
        presentationTime = CMSampleBufferGetPresentationTimeStamp(sample);
        NSValue *presentationValue = [NSValue valueWithBytes:&presentationTime objCType:@encode(CMTime)];
        [self.sampleTimeArray addObject:presentationValue];
        
        CFRelease(sample);
        sample = NULL;
                
        if (processIndex == 0) {
            // startTime carries over from where the previous range ended, keeping the ranges contiguous
            processIndex ++;
            
        } else if (processIndex == 9) {
            endTime = presentationTime;
            
            CMTimeRange timeRange = CMTimeRangeMake(startTime, CMTimeSubtract(endTime, startTime));
            NSValue *timeRangeValue = [NSValue valueWithCMTimeRange:timeRange];
            [self.clipTimeRangeArray addObject:timeRangeValue];
            
            processIndex = 0;
            startTime = presentationTime;
            endTime = kCMTimeZero;
            
        } else {
            processIndex ++;
        }
    }
    // Build the time range for a trailing group with fewer than 10 frames
    if (CMTIME_COMPARE_INLINE(kCMTimeZero, !=, startTime) && CMTIME_COMPARE_INLINE(kCMTimeZero, ==, endTime)) {
        
        endTime = presentationTime;
        
        // Special-case a final group that contains only one frame
        if (CMTIME_COMPARE_INLINE(endTime, ==, startTime) &&
            processIndex == 1) {
            startTime = CMTimeSubtract(startTime, CMTimeMake(1, (int32_t)self.videoAsset.tracks[0].nominalFrameRate));
        }
        
        CMTimeRange timeRange = CMTimeRangeMake(startTime, CMTimeSubtract(endTime, startTime));
        NSValue *timeRangeValue = [NSValue valueWithCMTimeRange:timeRange];
        [self.clipTimeRangeArray addObject:timeRangeValue];
    }
}

For each time range (iterated from the last range to the first), read out its CMSampleBufferRefs, then look up a timestamp by position for each one:
CMSampleBufferRefs come out back -> front
pts are assigned front -> back
so the reversed frames end up with ascending presentation times.

// Read the video samples in reverse order
- (void)nextReverseVideoSample:(void(^)(CMSampleBufferRef buffer, CMTime pts_reverse))block{
    
    __block NSInteger index = 0;
    NSMutableArray *bufferCaches = [NSMutableArray array];
    NSMutableArray *ptsCaches = [NSMutableArray array];
    
    // Walk the clip ranges from the end of the video toward the beginning
    [self.clipTimeRangeArray enumerateObjectsWithOptions:NSEnumerationReverse usingBlock:^(id  _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {

        CMSampleBufferRef buffer;
        [_readerTrackOutput_video resetForReadingTimeRanges:@[obj]];

        while ((buffer = [self nextVideoSample])) {
            // __bridge_transfer hands ownership of the +1 buffer to ARC, avoiding a leak
            [bufferCaches addObject:(__bridge_transfer id)buffer];
            [ptsCaches addObject:self.sampleTimeArray[index]];
            index++;
        }
        // Emit this clip's buffers back to front, paired with ascending timestamps
        [bufferCaches enumerateObjectsWithOptions:NSEnumerationReverse usingBlock:^(id  _Nonnull buf, NSUInteger bufIdx, BOOL * _Nonnull innerStop) {

            CMTime pts = [ptsCaches[ptsCaches.count - bufIdx - 1] CMTimeValue];
            NSLog(@"reversed pts: %f", CMTimeGetSeconds(pts));

            block((__bridge CMSampleBufferRef)buf, pts);
        }];
        [bufferCaches removeAllObjects];
        [ptsCaches removeAllObjects];
    }];
}

Appending frames

// Append one frame to the writer
- (void)pushVideoBuffer:(CVPixelBufferRef)buffer pts:(CMTime)pts{
    
    // Simple back-pressure: wait until the writer input can accept more data,
    // instead of silently dropping the frame
    while (!self.assetWriterInput_video.readyForMoreMediaData) {
        [NSThread sleepForTimeInterval:0.005];
    }
    
    if (buffer) {
        NSLog(@"Appending frame at %f", CMTimeGetSeconds(pts));
        [_adaptor appendPixelBuffer:buffer withPresentationTime:pts];
        // This method takes ownership of the retained pixel buffer it is passed
        CFRelease(buffer);
        buffer = NULL;
    }
}
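The writer-side objects used above (self.writer, self.assetWriterInput_video, _adaptor) are not shown in the original snippets. A minimal sketch of how they might be created; outputURL, videoSize, and the method name are assumptions:

```objectivec
// Illustrative writer setup; ivar names mirror the snippet above.
- (void)setupWriterWithURL:(NSURL *)outputURL size:(CGSize)videoSize {
    NSError *error = nil;
    _writer = [AVAssetWriter assetWriterWithURL:outputURL fileType:AVFileTypeMPEG4 error:&error];

    NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecTypeH264,
                                     AVVideoWidthKey  : @(videoSize.width),
                                     AVVideoHeightKey : @(videoSize.height) };
    _assetWriterInput_video = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                 outputSettings:videoSettings];
    // We pace appends ourselves via readyForMoreMediaData, not real-time capture
    _assetWriterInput_video.expectsMediaDataInRealTime = NO;

    _adaptor = [AVAssetWriterInputPixelBufferAdaptor
                assetWriterInputPixelBufferAdaptorWithAssetWriterInput:_assetWriterInput_video
                sourcePixelBufferAttributes:nil];

    [_writer addInput:_assetWriterInput_video];
    [_writer startWriting];
    [_writer startSessionAtSourceTime:kCMTimeZero];
}
```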

Slow and fast playback

To speed a video up or slow it down, all you change is each frame's presentation timestamp.
Slow: stretch the timestamps (for example, at 0.5x speed a frame originally at 2.0 s moves to 4.0 s).
Fast: compress the timestamps.

// slowValue is the target playback speed: 0.5 = half speed, 2.0 = double speed
- (void)nextSpeedChangeFromValue:(float)slowValue VideoSample:(void(^)(CMSampleBufferRef buffer, CMTime pts))block{
    __block NSInteger index = 0;
    NSMutableArray *bufferCaches = [NSMutableArray array];
    NSMutableArray *ptsCaches = [NSMutableArray array];
    
    [self.clipTimeRangeArray enumerateObjectsUsingBlock:^(id  _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {
        CMSampleBufferRef buffer;
        [_readerTrackOutput_video resetForReadingTimeRanges:@[obj]];

        while ((buffer = [self nextVideoSample])) {
            // __bridge_transfer hands ownership of the +1 buffer to ARC, avoiding a leak
            [bufferCaches addObject:(__bridge_transfer id)buffer];
            [ptsCaches addObject:self.sampleTimeArray[index]];
            index++;
        }
             
        [bufferCaches enumerateObjectsUsingBlock:^(id  _Nonnull buf, NSUInteger bufIdx, BOOL * _Nonnull innerStop) {
            
            CMTime pts = [ptsCaches[bufIdx] CMTimeValue];
            // Rescale the timestamp: dividing by the speed stretches (slow) or compresses (fast) it
            pts = CMTimeMultiplyByFloat64(pts, 1.0 / slowValue);
            block((__bridge CMSampleBufferRef)buf, pts);
        }];
        [bufferCaches removeAllObjects];
        [ptsCaches removeAllObjects];
    }];
}
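Putting the pieces together, a reverse export might be driven as follows. This is a sketch: it assumes the reader and writer have already been set up, and that pushVideoBuffer:pts: takes ownership of a retained pixel buffer as in the snippet above.

```objectivec
// Illustrative driver for a reverse export; method names come from the snippets in this post.
[self setclipTimeRangeArray]; // drains the reader and records timestamps and ranges

[self nextReverseVideoSample:^(CMSampleBufferRef buffer, CMTime pts) {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(buffer);
    // pushVideoBuffer:pts: releases the buffer it is handed, so retain it first
    CVPixelBufferRetain(pixelBuffer);
    [self pushVideoBuffer:pixelBuffer pts:pts];
}];

[self.assetWriterInput_video markAsFinished];
[self.writer finishWritingWithCompletionHandler:^{
    NSLog(@"Finished writing the reversed video");
}];
```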

GitHub:
https://github.com/qw9685/rateVideo
