AVFoundation Media Creation and Editing

1 Composing and Editing Media

AVFoundation provides a rich set of APIs for building non-linear, non-destructive editing tools and applications.

1.1 Core Composition Classes


The core class for composing media is AVComposition. Whereas an AVAsset maps one-to-one onto a concrete media file, a composition is more like a recipe: it describes how multiple assets should be arranged and presented. AVComposition does not conform to NSCoding, so it cannot be persisted directly (for example, to a database); instead you persist only the properties needed to rebuild it and recreate the composition on demand.

1.2 Working with Time

CMTime
To preserve precision, AVFoundation represents media time with the CMTime struct, which contains four fields: value, timescale, flags, and epoch. value is a 64-bit signed integer and timescale a 32-bit signed integer; the time in seconds equals value / timescale. flags indicates whether the value is valid, indefinite, or has been rounded. For video assets, timescale is commonly set to 600, a multiple of common video frame rates; for audio it is usually the sample rate, such as 44100 (44.1 kHz). Add and subtract times with CMTimeAdd and CMTimeSubtract, and compare them with the CMTIME_COMPARE_INLINE macro.
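A minimal sketch of these CMTime operations (the values are arbitrary):

#import <CoreMedia/CoreMedia.h>

CMTime fiveSeconds = CMTimeMake(3000, 600);              // 3000 / 600 = 5 seconds
CMTime halfSecond  = CMTimeMake(300, 600);               // 300 / 600  = 0.5 seconds

CMTime sum  = CMTimeAdd(fiveSeconds, halfSecond);        // 5.5 seconds
CMTime diff = CMTimeSubtract(fiveSeconds, halfSecond);   // 4.5 seconds

if (CMTIME_COMPARE_INLINE(sum, >, diff)) {
    NSLog(@"sum is longer: %f seconds", CMTimeGetSeconds(sum));
}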

CMTimeRange
CMTimeRange represents a range of time (a start time plus a duration). It can be created with CMTimeRangeMake or with CMTimeRangeFromTimeToTime. Use CMTimeRangeGetIntersection to intersect two ranges and CMTimeRangeGetUnion to take their union.
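For example (again with arbitrary values):

CMTimeRange rangeA = CMTimeRangeMake(kCMTimeZero, CMTimeMake(600, 600));                     // 0s–1s
CMTimeRange rangeB = CMTimeRangeFromTimeToTime(CMTimeMake(300, 600), CMTimeMake(900, 600));  // 0.5s–1.5s

CMTimeRange intersection = CMTimeRangeGetIntersection(rangeA, rangeB);  // 0.5s–1s
CMTimeRange unionRange   = CMTimeRangeGetUnion(rangeA, rangeB);         // 0s–1.5s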

1.3 AVURLAsset

For playback you typically create an AVAsset directly, but when editing media you should create its concrete subclass AVURLAsset and pass AVURLAssetPreferPreciseDurationAndTimingKey with a value of @YES in the options dictionary. This makes the asset's duration and other timing-related properties more accurate later on, at the cost of some extra loading overhead.
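A minimal sketch; the resource name clip.mov is a placeholder for whatever asset you are editing:

NSURL *assetURL = [[NSBundle mainBundle] URLForResource:@"clip" withExtension:@"mov"];
NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey : @YES};
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetURL options:options];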

1.4 Composing Media


First, define the classes used to build a composition. CompositionBuilderFactory manages CompositionBuilder objects; a CompositionBuilder builds the actual composition and wraps the resulting AVComposition in a THBasicComposition; THBasicComposition is responsible for returning a playable AVPlayerItem or a preconfigured export session. THCompositionExporter holds a THBasicComposition and exports the composition to a movie file.

THTimeline represents a timeline made up of multiple THTimelineItem objects; each THTimelineItem represents one asset on the timeline and records where that asset sits on it. To make loading media easier, the subclass THMediaItem wraps an AVAsset and can load its tracks and other properties in preparation for editing. The THVideoItem and THAudioItem subclasses distinguish the media types.

THTimeline

typedef NS_ENUM(NSInteger, THTrack) {
    THVideoTrack = 0,
    THTitleTrack,
    THCommentaryTrack,
    THMusicTrack
};

@interface THTimeline : NSObject
@property (strong, nonatomic) NSArray *videos;
@property (strong, nonatomic) NSArray *transitions;
@property (strong, nonatomic) NSArray *titles;
@property (strong, nonatomic) NSArray *voiceOvers;
@property (strong, nonatomic) NSArray *musicItems;

- (BOOL)isSimpleTimeline;
@end

THTimelineItem

@interface THTimelineItem : NSObject
@property (nonatomic) CMTimeRange timeRange;
@property (nonatomic) CMTime startTimeInTimeline;
@end

THMediaItem

typedef void(^THPreparationCompletionBlock)(BOOL complete);

@interface THMediaItem : THTimelineItem
@property (strong, nonatomic) AVAsset *asset;
@property (nonatomic, readonly) BOOL prepared;
@property (nonatomic, readonly) NSString *mediaType;
@property (nonatomic, copy, readonly) NSString *title;

- (id)initWithURL:(NSURL *)url;
// Asynchronously preloads the @"tracks", @"duration", and @"commonMetadata" keys
// (declared elsewhere as AVAssetTracksKey, AVAssetDurationKey, and AVAssetCommonMetadataKey).
- (void)prepareWithCompletionBlock:(THPreparationCompletionBlock)completionBlock;
- (void)performPostPrepareActionsWithCompletionBlock:(THPreparationCompletionBlock)completionBlock;
- (BOOL)isTrimmed;
- (AVPlayerItem *)makePlayable;
@end

THAudioItem

@interface THAudioItem : THMediaItem
@property (strong, nonatomic) NSArray *volumeAutomation;

+ (id)audioItemWithURL:(NSURL *)url;
@end

THVideoItem

@interface THVideoItem : THMediaItem
@property (strong, nonatomic) NSArray *thumbnails;
@property (strong, nonatomic) THVideoTransition *startTransition;
@property (strong, nonatomic) THVideoTransition *endTransition;
@property (nonatomic, readonly) CMTimeRange playthroughTimeRange;
@property (nonatomic, readonly) CMTimeRange startTransitionTimeRange;
@property (nonatomic, readonly) CMTimeRange endTransitionTimeRange;

+ (id)videoItemWithURL:(NSURL *)url;
@end

1 - Defining the THComposition protocol

@protocol THComposition <NSObject>
- (AVPlayerItem *)makePlayable;
- (AVAssetExportSession *)makeExportable;
@end

2 - Implementing THBasicComposition

@interface THBasicComposition : NSObject <THComposition>
@property (strong, readonly, nonatomic) AVComposition *composition;

+ (instancetype)compositionWithComposition:(AVComposition *)composition;
- (instancetype)initWithComposition:(AVComposition *)composition;
@end

@interface THBasicComposition ()
@property (strong, nonatomic) AVComposition *composition;
@end

@implementation THBasicComposition
+ (id)compositionWithComposition:(AVComposition *)composition {
    return [[self alloc] initWithComposition:composition];
}

- (id)initWithComposition:(AVComposition *)composition {
    if (self = [super init]) {
        _composition = composition;
    }
    return self;
}

- (AVPlayerItem *)makePlayable {
    return [AVPlayerItem playerItemWithAsset:[self.composition copy]];
}

- (AVAssetExportSession *)makeExportable {
    NSString *preset = AVAssetExportPresetHighestQuality;
    return [AVAssetExportSession exportSessionWithAsset:[self.composition copy] presetName:preset];
}
@end

3 - Defining the THCompositionBuilder protocol

@protocol THCompositionBuilder <NSObject>
- (id <THComposition>)buildComposition;
@end

4 - Implementing THBasicCompositionBuilder

@interface THBasicCompositionBuilder : NSObject <THCompositionBuilder>
- (id)initWithTimeline:(THTimeline *)timeline;
@end

@interface THBasicCompositionBuilder ()
@property (strong, nonatomic) THTimeline *timeline;
@property (strong, nonatomic) AVMutableComposition *composition;
@end

@implementation THBasicCompositionBuilder
- (id)initWithTimeline:(THTimeline *)timeline {
    if (self = [super init]) {
        _timeline = timeline;
    }
    return self;
}

- (id <THComposition>)buildComposition {
    self.composition = [AVMutableComposition composition];
    [self addCompositionTrackOfType:AVMediaTypeVideo withMediaItems:self.timeline.videos];
    [self addCompositionTrackOfType:AVMediaTypeAudio withMediaItems:self.timeline.voiceOvers];
    [self addCompositionTrackOfType:AVMediaTypeAudio withMediaItems:self.timeline.musicItems];
    return [THBasicComposition compositionWithComposition:self.composition];
}

- (void)addCompositionTrackOfType:(NSString *)mediaType
                   withMediaItems:(NSArray *)mediaItems {
    if (!THIsEmpty(mediaItems)) {
        // With kCMPersistentTrackID_Invalid, AVFoundation automatically assigns and manages track IDs (1 through n)
        CMPersistentTrackID trackID = kCMPersistentTrackID_Invalid;
        AVMutableCompositionTrack *compositionTrack = [self.composition addMutableTrackWithMediaType:mediaType preferredTrackID:trackID];
        CMTime cursorTime = kCMTimeZero;
        for (THMediaItem *item in mediaItems) {
            // Video, audio, and voice-over items all carry a startTimeInTimeline property. Video and audio clips
            // must be contiguous, so theirs is kCMTimeInvalid; voice-over clips can be inserted anywhere (and may
            // be non-contiguous), so theirs holds an explicit time.
            if (CMTIME_COMPARE_INLINE(item.startTimeInTimeline, !=, kCMTimeInvalid)) {
                cursorTime = item.startTimeInTimeline;
            }
            AVAssetTrack *assetTrack = [item.asset tracksWithMediaType:mediaType].firstObject;
            [compositionTrack insertTimeRange:item.timeRange ofTrack:assetTrack atTime:cursorTime error:nil];
            cursorTime = CMTimeAdd(cursorTime, item.timeRange.duration);
        }
    }
}
@end

1.5 Exporting Media

**Initialization**
@interface THCompositionExporter : NSObject
@property (nonatomic) BOOL exporting;
@property (nonatomic) CGFloat progress;

- (instancetype)initWithComposition:(id <THComposition>)composition;
- (void)beginExport;
@end

@interface THCompositionExporter ()
@property (strong, nonatomic) id <THComposition> composition;
@property (strong, nonatomic) AVAssetExportSession *exportSession;
@end

@implementation THCompositionExporter
- (instancetype)initWithComposition:(id <THComposition>)composition {
    if (self = [super init]) {
        _composition = composition;
    }
    return self;
}

- (void)beginExport {
    self.exportSession = [self.composition makeExportable];
    self.exportSession.outputURL = [self exportURL];
    self.exportSession.outputFileType = AVFileTypeMPEG4;
    
    [self.exportSession exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            AVAssetExportSessionStatus status = self.exportSession.status;
            if (status == AVAssetExportSessionStatusCompleted) {
                [self writeExportedVideoToAssetsLibrary];
            } else {
                [UIAlertView showAlertWithTitle:@"Export falied" message:@"The requested export failed"];
            }
        });
    }];
    
    self.exporting = YES;
    [self monitorExportProgress];
}

- (void)monitorExportProgress {
    double delayInSeconds = 0.1;
    int64_t delta = (int64_t)(delayInSeconds * NSEC_PER_SEC);
    dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, delta);
    dispatch_after(popTime, dispatch_get_main_queue(), ^{
        AVAssetExportSessionStatus status = self.exportSession.status;
        if (status == AVAssetExportSessionStatusExporting) {
            self.progress = self.exportSession.progress;
            [self monitorExportProgress];
        } else {
            self.exporting = NO;
        }
    });
}

- (void)writeExportedVideoToAssetsLibrary {
    NSURL *exportURL = self.exportSession.outputURL;
    NSError *error = nil;
    __block PHObjectPlaceholder *createdAsset = nil;
    [[PHPhotoLibrary sharedPhotoLibrary] performChangesAndWait:^{
        createdAsset = [PHAssetCreationRequest creationRequestForAssetFromVideoAtFileURL:exportURL].placeholderForCreatedAsset;
    } error:&error];
    if (error || !createdAsset) {
        NSString *message = @"Unable to write to Photos Library";
        [UIAlertView showAlertWithTitle:@"Write Failed" message:message];
    }
    [[NSFileManager defaultManager] removeItemAtURL:exportURL error:nil];
}

- (NSURL *)exportURL {
    NSString *filePath = nil;
    NSUInteger count = 0;
    do {
        filePath = NSTemporaryDirectory();
        NSString *numberString = count > 0 ?
            [NSString stringWithFormat:@"-%lu", (unsigned long)count] : @"";
        NSString *fileNameString =
            [NSString stringWithFormat:@"Masterpiece%@.mp4", numberString]; // extension should match outputFileType (AVFileTypeMPEG4)
        filePath = [filePath stringByAppendingPathComponent:fileNameString];
        count++;
    } while ([[NSFileManager defaultManager] fileExistsAtPath:filePath]);
    return [NSURL fileURLWithPath:filePath];
}
@end

2 Mixing Audio

When a composition has multiple audio tracks, such as a music track and a voice-over track, you usually want the background music to duck while the voice-over plays. In AVFoundation, AVAudioMix and its related classes control the volume of audio tracks. Track volume ranges from 0.0 to 1.0, and the default behavior is for every track to play at full volume (1.0). AVMutableAudioMixInputParameters provides one method to set the volume immediately at a given time and another to ramp the volume from one value to another over a time range. An AVAudioMix can be applied to a composition, to a regular local asset, or to an export session, controlling both playback and output.
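A minimal sketch of ducking a music track under a voice-over; musicTrack and playerItem are assumed to come from the surrounding composition code, and the times are arbitrary:

AVMutableAudioMixInputParameters *parameters =
    [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:musicTrack];

// Start at full volume, then ramp down to 20% over one second starting at t = 2s.
[parameters setVolume:1.0f atTime:kCMTimeZero];
[parameters setVolumeRampFromStartVolume:1.0f
                             toEndVolume:0.2f
                               timeRange:CMTimeRangeMake(CMTimeMake(2, 1), CMTimeMake(1, 1))];

AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = @[parameters];
playerItem.audioMix = audioMix;   // or: exportSession.audioMix = audioMix;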



1 - THAudioMixComposition, which provides the playable item and the export session

@interface THAudioMixComposition : NSObject <THComposition>
@property (strong, nonatomic, readonly) AVAudioMix *audioMix;
@property (strong, nonatomic, readonly) AVComposition *composition;

+ (instancetype)compositionWithComposition:(AVComposition *)composition
                                  audioMix:(AVAudioMix *)audioMix;
- (instancetype)initWithComposition:(AVComposition *)composition
                           audioMix:(AVAudioMix *)audioMix;
@end

@interface THAudioMixComposition ()
@property (strong, nonatomic) AVAudioMix *audioMix;
@property (strong, nonatomic) AVComposition *composition;
@end

@implementation THAudioMixComposition
+ (instancetype)compositionWithComposition:(AVComposition *)composition audioMix:(AVAudioMix *)audioMix {
    return [[self alloc] initWithComposition:composition audioMix:audioMix];
}

- (instancetype)initWithComposition:(AVComposition *)composition audioMix:(AVAudioMix *)audioMix {
    if (self = [super init]) {
        _composition = composition;
        _audioMix = audioMix;
    }
    return self;
}

- (AVPlayerItem *)makePlayable {
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:[self.composition copy]];
    playerItem.audioMix = self.audioMix;
    return playerItem;
}

- (AVAssetExportSession *)makeExportable {
    NSString *preset = AVAssetExportPresetHighestQuality;
    AVAssetExportSession *session = [AVAssetExportSession exportSessionWithAsset:[self.composition copy] presetName:preset];
    session.audioMix = self.audioMix;
    return session;
}
@end

2 - THAudioMixCompositionBuilder, which builds the composition

@interface THAudioMixCompositionBuilder : NSObject <THCompositionBuilder>
- (id)initWithTimeline:(THTimeline *)timeline;
@end

@interface THAudioMixCompositionBuilder ()
@property (strong, nonatomic) THTimeline *timeline;
@property (strong, nonatomic) AVMutableComposition *composition;
@end

@implementation THAudioMixCompositionBuilder
- (id)initWithTimeline:(THTimeline *)timeLine {
    if (self = [super init]) {
        _timeline = timeLine;
    }
    return self;
}

- (id <THComposition>)buildComposition {
    self.composition = [AVMutableComposition composition];
    [self addCompositionTrackOfType:AVMediaTypeVideo withMediaItems:self.timeline.videos];
    [self addCompositionTrackOfType:AVMediaTypeAudio withMediaItems:self.timeline.voiceOvers];
    AVMutableCompositionTrack *musicTrack = [self addCompositionTrackOfType:AVMediaTypeAudio withMediaItems:self.timeline.musicItems];
    AVAudioMix *audioMix = [self buildAudioMixWithTrack:musicTrack];
    return [THAudioMixComposition compositionWithComposition:self.composition.copy audioMix:audioMix];
}

- (AVMutableCompositionTrack *)addCompositionTrackOfType:(NSString *)type withMediaItems:(NSArray *)mediaItems {
    if (!THIsEmpty(mediaItems)) {
        CMPersistentTrackID trackID = kCMPersistentTrackID_Invalid;
        AVMutableCompositionTrack *compositionTrack = [self.composition addMutableTrackWithMediaType:type preferredTrackID:trackID];
        CMTime cursorTime = kCMTimeZero;
        
        for (THMediaItem *item in mediaItems) {
            if (CMTIME_COMPARE_INLINE(item.startTimeInTimeline, !=, kCMTimeInvalid)) {
                cursorTime = item.startTimeInTimeline;
            }
            AVAssetTrack *assetTrack = [[item.asset tracksWithMediaType:type] firstObject];
            [compositionTrack insertTimeRange:item.timeRange ofTrack:assetTrack atTime:cursorTime error:nil];
            cursorTime = CMTimeAdd(cursorTime, item.timeRange.duration);
        }
        return compositionTrack;
    }
    return nil;
}

- (AVAudioMix *)buildAudioMixWithTrack:(AVMutableCompositionTrack *)track {
    THAudioItem *item = [self.timeline.musicItems firstObject];
    if (item) {
        AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
        AVMutableAudioMixInputParameters *parameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
        for (THVolumeAutomation *automation in item.volumeAutomation) {
            [parameters setVolumeRampFromStartVolume:automation.startVolume toEndVolume:automation.endVolume timeRange:automation.timeRange];
        }
        audioMix.inputParameters = @[parameters];
        return audioMix;
    }
    return nil;
}
@end

3 Creating Video Transitions and Combining Video Tracks

3.1 Core Classes


In AVFoundation, AVVideoComposition is the core class that describes how the video tracks of a composition should be presented. It can be set on an AVPlayerItem or an AVAssetExportSession to play back or export a particular arrangement of tracks. An AVVideoComposition consists of multiple instructions; each instruction covers a time range (timeRange), and each object in its layerInstructions array describes how the corresponding video track is presented during that range.

In practice you first build an AVComposition and add the necessary media to its tracks. With multiple video tracks, AVFoundation inserts an empty segment wherever a track has no media data when the composition is played or exported, and the exported movie contains only a single video track. The Apple Developer site includes an AVCompositionDebugView sample app that visualizes track layout, which helps with debugging.

If an AVComposition contains multiple video tracks but no AVVideoComposition is configured, only the first video track is rendered during playback, and the same applies to export. There are two ways to create an AVVideoComposition.

1) Manual creation: create the AVVideoComposition directly, then give it an array of instructions, each carrying its time range, and give each instruction its layerInstructions describing how each track's layer is presented. Set the composition's renderSize, frameDuration, and renderScale; renderScale is usually left at the default 1.0, and frameDuration is typically set to match the media's native frame rate.

2) Quick creation: use AVMutableVideoComposition's videoCompositionWithPropertiesOfAsset: class method. A videoComposition created this way includes the following defaults (a short sketch follows the list below):

  • instructions: a complete set of composition and layer instructions based on the composition's video tracks and the spatial layout of the segments they contain. The default layer instructions present each track full-frame, so the instructions and layer instructions covering transitions must be reconfigured.
  • renderSize: set to the AVComposition's naturalSize, or, if that is empty, to the largest video dimensions found in the composition.
  • frameDuration: derived from the highest nominalFrameRate among the composition's tracks; if every track reports 0, it defaults to 1/30 second (30 FPS).
  • renderScale: always 1.0.

3.2 Implementation

3.2.1 TransitionComposition
@interface THTransitionComposition : NSObject <THComposition>
@property (strong, nonatomic, readonly) AVComposition *composition;
@property (strong, nonatomic, readonly) AVVideoComposition *videoComposition;
@property (strong, nonatomic, readonly) AVAudioMix *audioMix;

- (id)initWithComposition:(AVComposition *)composition videoComposition:(AVVideoComposition *)videoComposition audioMix:(AVAudioMix *)audioMix;
@end


@implementation THTransitionComposition
- (id)initWithComposition:(AVComposition *)composition videoComposition:(AVVideoComposition *)videoComposition audioMix:(AVAudioMix *)audioMix {
    if (self = [super init]) {
        _composition = composition;
        _videoComposition = videoComposition;
        _audioMix = audioMix;
    }
    return self;
}

- (AVPlayerItem *)makePlayable {
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:[self.composition copy]];
    playerItem.audioMix = self.audioMix;
    playerItem.videoComposition = self.videoComposition;
    return playerItem;
}

- (AVAssetExportSession *)makeExportable {
    NSString *preset = AVAssetExportPresetHighestQuality;
    AVAssetExportSession *session = [AVAssetExportSession exportSessionWithAsset:[self.composition copy] presetName:preset];
    session.audioMix = self.audioMix;
    session.videoComposition = self.videoComposition;
    return session;
}
@end
3.2.2 TransitionCompositionBuilder
@interface THTransitionCompositionBuilder : NSObject <THCompositionBuilder>
- (id)initWithTimeline:(THTimeline *)timeline;
@end


@interface THTransitionCompositionBuilder()
@property (nonatomic, strong) THTimeline *timeline;
@property (nonatomic, strong) AVMutableComposition *composition;
@property (nonatomic, weak) AVMutableCompositionTrack *musicTrack;
@end

@implementation THTransitionCompositionBuilder
- (id)initWithTimeline:(THTimeline *)timeline {
    if (self = [super init]) {
        _timeline = timeline;
    }
    return self;
}

- (id <THComposition>)buildComposition {
    self.composition = [AVMutableComposition composition];
    [self buildCompositionTracks];
    AVVideoComposition *videoComposition = [self buildVideoComposition];
    AVAudioMix *audioMix = [self buildAudioMix];
    return [[THTransitionComposition alloc] initWithComposition:self.composition.copy videoComposition:videoComposition audioMix:audioMix];
}

- (void)buildCompositionTracks {} // implemented below

- (AVVideoComposition *)buildVideoComposition {
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:self.composition];
    NSArray *transitionInstructions = [self transitionInstructionsInVideoComposition:videoComposition];
    for (THTransitionInstructions *instructions in transitionInstructions) {
        CMTimeRange timeRange = instructions.compositionInstruction.timeRange;
        AVMutableVideoCompositionLayerInstruction *fromLayer = instructions.fromLayerInstruction;
        AVMutableVideoCompositionLayerInstruction *toLayer = instructions.toLayerInstruction;
        THVideoTransitionType type = instructions.transition.type;
        
        // Transition animations; the full handling is shown in the final version below
        if (type == THVideoTransitionTypeDissolve) {
        } else if (type == THVideoTransitionTypePush) {
        } else if (type == THVideoTransitionTypeWipe) {
        }

        instructions.compositionInstruction.layerInstructions = @[fromLayer,toLayer];
    }
    return videoComposition;
}

- (AVMutableCompositionTrack *)addCompositionTrackOfType:(NSString *)type withMediaItems:(NSArray *)mediaItems {
    if (!THIsEmpty(mediaItems)) {
        CMPersistentTrackID trackID = kCMPersistentTrackID_Invalid;
        AVMutableCompositionTrack *compositionTrack = [self.composition addMutableTrackWithMediaType:type preferredTrackID:trackID];
        CMTime cursorTime = kCMTimeZero;
        
        for (THMediaItem *item in mediaItems) {
            if (CMTIME_COMPARE_INLINE(item.startTimeInTimeline, !=, kCMTimeInvalid)) {
                cursorTime = item.startTimeInTimeline;
            }
            AVAssetTrack *assetTrack = [[item.asset tracksWithMediaType:type] firstObject];
            [compositionTrack insertTimeRange:item.timeRange ofTrack:assetTrack atTime:cursorTime error:nil];
            cursorTime = CMTimeAdd(cursorTime, item.timeRange.duration);
        }
        return compositionTrack;
    }
    return nil;
}

- (AVAudioMix *)buildAudioMix {} // implemented below

- (NSArray *)transitionInstructionsInVideoComposition:(AVVideoComposition *)videoComposition {} // implemented below
@end

Building the video tracks

- (void)buildCompositionTracks {
    CMPersistentTrackID trackID = kCMPersistentTrackID_Invalid;
    AVMutableCompositionTrack *compositionTrackA = [self.composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:trackID];
    AVMutableCompositionTrack *compositionTrackB = [self.composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:trackID];
    NSArray *videoTracks = @[compositionTrackA,compositionTrackB];
    
    CMTime cursorTime = kCMTimeZero;
    CMTime transitionDuration = kCMTimeZero;
    if (!THIsEmpty(self.timeline.transitions)) {
        transitionDuration = THDefaultTransitionDuration;
    }
    NSArray *videos = self.timeline.videos;
    for (NSUInteger i = 0; i < videos.count; i++) {
        NSUInteger trackIndex = i % 2;   // alternate clips between tracks A and B
        THVideoItem *item = videos[i];
        AVMutableCompositionTrack *currentTrack = videoTracks[trackIndex];
        AVAssetTrack *assetTrack = [[item.asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        [currentTrack insertTimeRange:item.timeRange ofTrack:assetTrack atTime:cursorTime error:nil];
        
        // Advance the cursor, then pull it back by the transition duration so
        // consecutive clips overlap for the length of the transition.
        cursorTime = CMTimeAdd(cursorTime, item.timeRange.duration);
        cursorTime = CMTimeSubtract(cursorTime, transitionDuration);
    }
    
    [self addCompositionTrackOfType:AVMediaTypeAudio withMediaItems:self.timeline.voiceOvers];
    NSArray *musicItems = self.timeline.musicItems;
    self.musicTrack = [self addCompositionTrackOfType:AVMediaTypeAudio withMediaItems:musicItems];
}

Mixing the audio track

- (AVAudioMix *)buildAudioMix {
    NSArray *items = self.timeline.musicItems;
    if (items.count == 1) {
        THAudioItem *item = self.timeline.musicItems[0];
        AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
        AVMutableAudioMixInputParameters *parameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:self.musicTrack];
        for (THVolumeAutomation *automation in item.volumeAutomation) {
            [parameters setVolumeRampFromStartVolume:automation.startVolume toEndVolume:automation.endVolume timeRange:automation.timeRange];
        }
        audioMix.inputParameters = @[parameters];
        return audioMix;
    }
    return nil;
}

Retrieving the transition instructions

- (NSArray *)transitionInstructionsInVideoComposition:(AVVideoComposition *)videoComposition {
    NSMutableArray *transitionInstructions = [NSMutableArray array];
    int layerInstructionIndex = 1;
    NSArray *compositionInstructions = videoComposition.instructions;
    for (AVMutableVideoCompositionInstruction *vci in compositionInstructions) {
        // Instructions with two layer instructions are the overlapping (transition) regions.
        if (vci.layerInstructions.count == 2) {
            THTransitionInstructions *instructions = [[THTransitionInstructions alloc] init];
            instructions.compositionInstruction = vci;
            // The from/to roles alternate between tracks A and B from one transition to the next.
            instructions.fromLayerInstruction = (AVMutableVideoCompositionLayerInstruction *)vci.layerInstructions[1 - layerInstructionIndex];
            instructions.toLayerInstruction = (AVMutableVideoCompositionLayerInstruction *)vci.layerInstructions[layerInstructionIndex];
            [transitionInstructions addObject:instructions];
            layerInstructionIndex = layerInstructionIndex == 1 ? 0 : 1;
        }
    }
    
    NSArray *transitions = self.timeline.transitions;
    if (THIsEmpty(transitions)) {
        return transitionInstructions;
    }
    
    NSAssert(transitionInstructions.count == transitions.count, @"Instruction count and transition count do not match.");
    for (NSUInteger i = 0 ; i < transitionInstructions.count; i++) {
        THTransitionInstructions *tis = transitionInstructions[i];
        tis.transition = self.timeline.transitions[i];
    }
    return transitionInstructions;
}

Implementing the transition effects

- (AVVideoComposition *)buildVideoComposition {
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:self.composition];
    NSArray *transitionInstructions = [self transitionInstructionsInVideoComposition:videoComposition];
    for (THTransitionInstructions *instructions in transitionInstructions) {
        CMTimeRange timeRange = instructions.compositionInstruction.timeRange;
        AVMutableVideoCompositionLayerInstruction *fromLayer = instructions.fromLayerInstruction;
        AVMutableVideoCompositionLayerInstruction *toLayer = instructions.toLayerInstruction;
        THVideoTransitionType type = instructions.transition.type;
        
        if (type == THVideoTransitionTypeDissolve) {
            [fromLayer setOpacityRampFromStartOpacity:1.0 toEndOpacity:0.0 timeRange:timeRange];
        } else if (type == THVideoTransitionTypePush) {
            CGAffineTransform identityTransform = CGAffineTransformIdentity;
            CGFloat videoWidth = videoComposition.renderSize.width;
            CGAffineTransform fromDestTransform = CGAffineTransformMakeTranslation(-videoWidth, 0.0f);
            CGAffineTransform toStartTransform = CGAffineTransformMakeTranslation(videoWidth, 0.0);
            [fromLayer setTransformRampFromStartTransform:identityTransform toEndTransform:fromDestTransform timeRange:timeRange];
            [toLayer setTransformRampFromStartTransform:toStartTransform toEndTransform:identityTransform timeRange:timeRange];
        } else if (type == THVideoTransitionTypeWipe) {
            CGFloat videoWidth = videoComposition.renderSize.width;
            CGFloat videoHeight = videoComposition.renderSize.height;
            
            CGRect startRect = CGRectMake(0.0f, 0.0f, videoWidth, videoHeight);
            CGRect endRect = CGRectMake(0.0f, videoHeight, videoWidth, 0.0f);
            [fromLayer setCropRectangleRampFromStartCropRectangle:startRect toEndCropRectangle:endRect timeRange:timeRange];
        }
        }
        instructions.compositionInstruction.layerInstructions = @[fromLayer,toLayer];
    }
    return videoComposition;
}

4 Animating Layer Content

When processing video you often want to add overlays such as watermarks, titles, or lower thirds. This requires combining the AVFoundation and Core Animation frameworks.

4.1 Core Animation Overview

Core Animation is a GPU-backed, hardware-accelerated rendering and animation framework. It is built around two main parts, layers and animations. Core Animation deserves a write-up of its own; see also Nick Lockwood's book iOS Core Animation.

  • Layers: the base class is CALayer, which manages visual content on screen, typically an image or Bézier path content. A layer also has its own visual properties, such as background color. CATextLayer and CAShapeLayer render text and Bézier curves respectively.
  • Animations: the base class is the abstract CAAnimation, with concrete subclasses such as CABasicAnimation and CAKeyframeAnimation for different needs. Note that CAAnimationGroup is best avoided when working with AVFoundation.

4.2 Using Core Animation

4.2.1 Outside of AVFoundation

Normally, Core Animation renders animations against the host time, which starts counting when the system boots and only moves forward.


4.2.2 With AVFoundation

For real-time playback with AVFoundation, add an AVSynchronizedLayer to the preview view and associate it with the corresponding AVPlayerItem. All animations on the sublayers of the AVSynchronizedLayer then run against that AVPlayerItem's time, so they respond correctly when the item is paused or scrubbed backwards.
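A minimal sketch; playerView (the view hosting the AVPlayerLayer) and titleLayer (a layer carrying the Core Animation animations) are assumptions here:

AVSynchronizedLayer *syncLayer =
    [AVSynchronizedLayer synchronizedLayerWithPlayerItem:playerItem];
syncLayer.frame = playerView.bounds;
[syncLayer addSublayer:titleLayer];          // animations on titleLayer now follow playerItem's time
[playerView.layer addSublayer:syncLayer];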


For export, AVFoundation provides AVVideoCompositionCoreAnimationTool, a utility class that composites the video layer and the animation layers together.


AVVideoCompositionCoreAnimationTool renders the composed video frames into the video layer and renders the animation effects along with them, but two caveats apply when using it (a short sketch follows the list below):

  • Don't remove animations: Core Animation's default behavior is to run an animation and then remove it, but AVFoundation may render an animation repeatedly, so removedOnCompletion must be set to NO.
  • Don't use a begin time of 0.0: a beginTime of 0.0 is converted to the current host time, CACurrentMediaTime(), which is not tied to the AVPlayerItem's timeline, so the animation would never run; use the AVCoreAnimationBeginTimeAtZero constant instead.
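A minimal sketch applying both rules to a hypothetical overlay animation (overlayLayer is an assumption):

CABasicAnimation *fade = [CABasicAnimation animationWithKeyPath:@"opacity"];
fade.fromValue = @1.0;
fade.toValue = @0.0;
fade.duration = 1.0;
fade.removedOnCompletion = NO;                    // keep the animation; frames may be rendered repeatedly
fade.beginTime = AVCoreAnimationBeginTimeAtZero;  // never 0.0, which maps to CACurrentMediaTime()
[overlayLayer addAnimation:fade forKey:nil];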

4.3 Creating Animations in a Video Composition

To tie the previous sections together, the THTimelineItem subclass THTitleItem is responsible for building and managing the animated layers to be rendered.


4.3.1 Initializing THTitleItem
@interface THTitleItem : THTimelineItem
@property (copy, nonatomic) NSString *identifier;
@property (nonatomic) BOOL animateImage;
@property (nonatomic) BOOL useLargeFont;

+ (instancetype)titleItemWithText:(NSString *)text image:(UIImage *)image;
- (instancetype)initWithText:(NSString *)text image:(UIImage *)image;
- (CALayer *)buildLayer;
@end

@interface THTitleItem ()
@property (copy, nonatomic) NSString *text;
@property (strong, nonatomic) UIImage *image;
@property (nonatomic) CGRect bounds;
@end

@implementation THTitleItem
+ (instancetype)titleItemWithText:(NSString *)text image:(UIImage *)image {
    return [[self alloc] initWithText:text image:image];
}

- (instancetype)initWithText:(NSString *)text image:(UIImage *)image {
    if (self = [super init]) {
        _text = text;
        _image = image;
        _bounds = TH720pVideoRect;
    }
    return self;
}

- (CALayer *)buildLayer {
    CALayer *presentLayer = [CALayer layer];
    presentLayer.frame = self.bounds;
    presentLayer.opacity = 0.0f;
    CALayer *imageLayer = [self makeImageLayer];
    [presentLayer addSublayer:imageLayer];
    CALayer *textLayer = [self makeTextLayer];
    [presentLayer addSublayer:textLayer];
    
    CAAnimation *fadeInFadeOutAnimation = [self makeFadeInFadeOutAnimation];
    [presentLayer addAnimation:fadeInFadeOutAnimation forKey:nil];
    
    if (self.animateImage) {
        presentLayer.sublayerTransform = THMakePerspectiveTransform(1000);
        CAAnimation *spinAnimation = [self make3DSpinAnimation];
        NSTimeInterval offset = spinAnimation.beginTime + spinAnimation.duration - 0.5f;
        CAAnimation *popAnimation = [self makePopAnimationWithTimingOffset:offset];
        [imageLayer addAnimation:spinAnimation forKey:nil];
        [imageLayer addAnimation:popAnimation forKey:nil];
    }
    return presentLayer;
}

- (CALayer *)makeImageLayer {
    CGSize imageSize = self.image.size;
    CALayer *layer = [CALayer layer];
    layer.contents = (id)self.image.CGImage;
    // Apply antialiasing to the image's edges
    layer.allowsEdgeAntialiasing = YES;
    layer.bounds = CGRectMake(0.0f, 0.0f, imageSize.width, imageSize.height);
    layer.position = CGPointMake(CGRectGetMidX(self.bounds)-20.0f, 270.0f);
    return layer;
}

- (CALayer *)makeTextLayer {
    CGFloat fontSize = self.useLargeFont ? 64.0f : 54.0f;
    UIFont *font = [UIFont fontWithName:@"GillSans-Bold" size:fontSize];
    NSDictionary *attrs = @{NSFontAttributeName : font, NSForegroundColorAttributeName : (id)[UIColor whiteColor].CGColor};
    NSAttributedString *string = [[NSAttributedString alloc] initWithString:self.text attributes:attrs];
    
    CGSize textSize = [self.text sizeWithAttributes:attrs];
    CATextLayer *layer = [CATextLayer layer];
    layer.string = string;
    layer.bounds = CGRectMake(0.0f, 0.0f, textSize.width, textSize.height);
    layer.position = CGPointMake(CGRectGetMidX(self.bounds), 470.0f);
    layer.backgroundColor = [UIColor clearColor].CGColor;
    return layer;
}

static CATransform3D THMakePerspectiveTransform(CGFloat eyePosition) {
    CATransform3D transform = CATransform3DIdentity;
    transform.m34 = -1.0 / eyePosition;
    return transform;
}
@end
4.3.2 Adding the animations
- (CAAnimation *)makeFadeInFadeOutAnimation {
    CAKeyframeAnimation *animation = [CAKeyframeAnimation animationWithKeyPath:@"opacity"];
    animation.values = @[@0.0f, @1.0, @1.0, @0.0];
    animation.keyTimes = @[@0.0, @0.2, @0.8, @1.0];
    // Assumes startTimeInTimeline is non-zero; a beginTime of 0.0 would be remapped to the
    // current host time (use AVCoreAnimationBeginTimeAtZero in that case).
    animation.beginTime = CMTimeGetSeconds(self.startTimeInTimeline);
    animation.duration = CMTimeGetSeconds(self.timeRange.duration);
    animation.removedOnCompletion = NO;
    return animation;
}

- (CAAnimation *)make3DSpinAnimation {
    CABasicAnimation *animation = [CABasicAnimation animationWithKeyPath:@"transform.rotation.y"];
    animation.toValue = @((4*M_PI) * -1);
    animation.beginTime = CMTimeGetSeconds(self.startTimeInTimeline) + 0.2;
    animation.duration = CMTimeGetSeconds(self.timeRange.duration) * 0.4;
    animation.removedOnCompletion = NO;
    animation.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionEaseInEaseOut];
    return animation;
}

- (CAAnimation *)makePopAnimationWithTimingOffset:(NSTimeInterval)offset {
    CABasicAnimation *animation = [CABasicAnimation animationWithKeyPath:@"transform.scale"];
    animation.toValue = @1.3f;
    animation.beginTime = offset;
    animation.duration = 0.35f;
    animation.autoreverses = YES;
    animation.removedOnCompletion = NO;
    animation.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionEaseInEaseOut];
    return animation;
}
@end
4.3.3 THOverlayComposition: producing the exportable session and playable player item
@interface THOverlayComposition : NSObject <THComposition>
@property (strong, nonatomic, readonly) AVComposition *composition;
@property (strong, nonatomic, readonly) AVVideoComposition *videoComposition;
@property (strong, nonatomic, readonly) AVAudioMix *audioMix;
@property (strong, nonatomic, readonly) CALayer *titleLayer;

- (id)initWithComposition:(AVComposition *)composition
         videoComposition:(AVVideoComposition *)videoComposition
                 audioMix:(AVAudioMix *)audioMix
               titleLayer:(CALayer *)titleLayer;
@end


- (AVPlayerItem *)makePlayable {
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:[self.composition copy]];
    playerItem.videoComposition = self.videoComposition;
    playerItem.audioMix = self.audioMix;
    if (self.titleLayer) {
        AVSynchronizedLayer *syncLayer = [AVSynchronizedLayer synchronizedLayerWithPlayerItem:playerItem];
        [syncLayer addSublayer:self.titleLayer];
        // syncLayer is not a built-in AVPlayerItem property; the sample adds it in a category
        // (via an associated object) so the presenting controller can install it in the view hierarchy.
        playerItem.syncLayer = syncLayer;
    }
    return playerItem;
}

- (AVAssetExportSession *)makeExportable {
    if (self.titleLayer) {
        CALayer *animationLayer = [CALayer layer];
        animationLayer.frame = TH720pVideoRect;
        CALayer *videoLayer = [CALayer layer];
        videoLayer.frame = TH720pVideoRect;
        [animationLayer addSublayer:videoLayer];
        [animationLayer addSublayer:self.titleLayer];
        // Flip the layer geometry; otherwise the images and text render upside down
        animationLayer.geometryFlipped = YES;
        
        AVVideoCompositionCoreAnimationTool *animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:animationLayer];
        AVMutableVideoComposition *mvc = (AVMutableVideoComposition *)self.videoComposition;
        mvc.animationTool = animationTool;
    }
    
    NSString *presetName = AVAssetExportPresetHighestQuality;
    AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:[self.composition copy] presetName:presetName];
    session.audioMix = self.audioMix;
    session.videoComposition = self.videoComposition;
    return session;
}
4.3.4 THOverlayCompositionBuilder: building the composition object

Most of the logic here mirrors the previous section, so only the differences are shown.

@interface THOverlayCompositionBuilder : NSObject <THCompositionBuilder>
- (id)initWithTimeline:(THTimeline *)timeline;
@end

- (id <THComposition>)buildComposition {
    self.composition = [AVMutableComposition composition];
    [self buildCompositionTracks];
    AVVideoComposition *videoComposition = [self buildVideoComposition];
    return [[THOverlayComposition alloc] initWithComposition:self.composition.copy videoComposition:videoComposition audioMix:[self buildAudioMix] titleLayer:[self buildTitleLayer]];
}

- (CALayer *)buildTitleLayer {
    if (!THIsEmpty(self.timeline.titles)) {
        CALayer *titleLayer = [CALayer layer];
        titleLayer.bounds = TH720pVideoRect;
        titleLayer.position = CGPointMake(CGRectGetMidX(TH720pVideoRect), CGRectGetMidY(TH720pVideoRect));
        
        for (THTitleItem *titleItem in self.timeline.titles) {
            [titleLayer addSublayer:[titleItem buildLayer]];
        }
        return titleLayer;
    }
    return nil;
}
