iOS Audio Merging, Audio/Video Merging, and Trimming

In iOS, audio merging means combining two separate sound files into a single output file. Audio/video merging is useful when a video has no sound and you want to mux an audio track into it; ideally the audio and video durations match, so the resulting file plays back more naturally. Trimming is the inverse of merging, and lets you process audio and video at a finer granularity~

CMTime Overview

Merging and trimming require precise timing, and every intermediate time is represented as a CMTime. Its definition:

typedef struct
{
 CMTimeValue value;  /*! @field value The value of the CMTime. value/timescale = seconds. */
 CMTimeScale timescale; /*! @field timescale The timescale of the CMTime. value/timescale = seconds.  */
 CMTimeFlags flags;  /*! @field flags The flags, eg. kCMTimeFlags_Valid, kCMTimeFlags_PositiveInfinity, etc. */
 CMTimeEpoch epoch;  /*! @field epoch Differentiates between equal timestamps that are actually different because
             of looping, multi-item sequencing, etc.  
             Will be used during comparison: greater epochs happen after lesser ones. 
             Additions/subtraction is only possible within a single epoch,
             however, since epoch length may be unknown/variable. */
} CMTime;

value is the numerator and timescale the denominator: value/timescale gives the time in seconds. The snippet below shows several common CMTime operations, printed with CMTimeShow:

    CMTime startTime = CMTimeMake(13, 100);   // 13/100 = 0.13 seconds
    CMTimeShow(startTime);
    
    CMTime endTime = CMTimeMake(40, 100);     // 40/100 = 0.40 seconds
    CMTimeShow(endTime);
    
    CMTime addTime = CMTimeAdd(startTime, endTime);
    CMTimeShow(addTime);
    
    // Subtracting a larger time from a smaller one yields a negative CMTime
    CMTime subTime = CMTimeSubtract(startTime, endTime);
    CMTimeShow(subTime);
    
    // CMTimeRangeMake takes (start, duration)
    CMTimeRange timeRange = CMTimeRangeMake(startTime, endTime);
    CMTimeRangeShow(timeRange);
    
    // CMTimeRangeFromTimeToTime takes (start, end) and computes the duration
    CMTimeRange fromRange = CMTimeRangeFromTimeToTime(startTime, endTime);
    CMTimeRangeShow(fromRange);
    
    // Intersection of the two ranges
    CMTimeRange intersectionRange = CMTimeRangeGetIntersection(timeRange, fromRange);
    CMTimeRangeShow(intersectionRange);
    // Union of the two ranges
    CMTimeRange unionRange = CMTimeRangeGetUnion(timeRange, fromRange);
    CMTimeRangeShow(unionRange);
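
Where you need plain seconds (for UI display or slider math, say), CoreMedia, which AVFoundation already pulls in, converts in both directions; a minimal sketch:

    CMTime time = CMTimeMake(13, 100);
    // CMTimeGetSeconds collapses value/timescale into a Float64
    Float64 seconds = CMTimeGetSeconds(time);               // 0.13
    // CMTimeMakeWithSeconds goes the other way; 600 is a commonly used timescale
    CMTime roundTrip = CMTimeMakeWithSeconds(seconds, 600);
    CMTimeShow(roundTrip);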

Audio Merging

Merging only requires the AVFoundation framework for the file operations, with one limitation: audio-only exports from AVAssetExportSession come out in M4A format.
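
If you are unsure which presets and container types a given asset supports, you can query AVAssetExportSession before exporting. A small sketch; audioURL here is a placeholder for any file URL of yours:

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:audioURL options:nil];
    // Presets that can actually be used with this asset
    NSArray<NSString *> *presets = [AVAssetExportSession exportPresetsCompatibleWithAsset:asset];
    NSLog(@"Compatible presets: %@", presets);
    
    AVAssetExportSession *session = [[AVAssetExportSession alloc] initWithAsset:asset
                                                                     presetName:AVAssetExportPresetAppleM4A];
    // File types this preset can write; outputFileType must be one of these
    NSLog(@"Supported file types: %@", session.supportedFileTypes);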
The audio merge invocation:

    NSString *wayPath = [[NSBundle mainBundle] pathForResource:@"MyWay" ofType:@"mp3"];
    NSString *easyPath = [[NSBundle mainBundle] pathForResource:@"Easy" ofType:@"mp3"];
    NSMutableArray *dataArr = [NSMutableArray array];
    [dataArr addObject:[NSURL fileURLWithPath:wayPath]];
    [dataArr addObject:[NSURL fileURLWithPath:easyPath]];
    NSString *destPath = [[self composeDir] stringByAppendingString:@"FlyElephant.m4a"];
    
    if ([[NSFileManager defaultManager] fileExistsAtPath:destPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:destPath error:nil];
    }
    [self audioMerge:dataArr destUrl:[NSURL fileURLWithPath:destPath]];

The core code:

- (void)audioMerge:(NSMutableArray *)dataSource destUrl:(NSURL *)destUrl {
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    
    // Insertion point for the next clip; starts at zero
    CMTime beginTime = kCMTimeZero;
    // A single audio track that every source file is appended to
    AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    
    NSError *error = nil;
    for (NSURL *sourceURL in dataSource) {
        // Load the source audio file
        AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:sourceURL options:nil];
        // Take the whole file: from zero through its duration
        CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
        // ofTrack: the first audio track of the source file
        BOOL success = [compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject] atTime:beginTime error:&error];
        
        if (!success) {
            NSLog(@"Error: %@", error);
        }
        // Advance the insertion point past the clip just added
        beginTime = CMTimeAdd(beginTime, audioAsset.duration);
    }
    // presetName and outputFileType must correspond; export the merged audio
    AVAssetExportSession *assetExportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetAppleM4A];
    assetExportSession.outputURL = destUrl;
    assetExportSession.outputFileType = AVFileTypeAppleM4A;
    assetExportSession.shouldOptimizeForNetworkUse = YES;
    [assetExportSession exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            if (assetExportSession.status == AVAssetExportSessionStatusCompleted) {
                NSLog(@"Export completed: %@", destUrl);
            } else {
                NSLog(@"Export failed: %@", assetExportSession.error);
            }
        });
    }];
}
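
To sanity-check the result, you can play the exported file back once the export completes. A minimal sketch; in real code keep a strong reference to the player (e.g. a property), or ARC will deallocate it and playback stops:

    NSError *playError = nil;
    // destUrl is the file URL that was handed to the export session above
    AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:destUrl error:&playError];
    if (player) {
        [player prepareToPlay];
        [player play];
    } else {
        NSLog(@"Playback error: %@", playError);
    }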

Audio/Video Merging

Merging audio into video works much like audio merging. The invocation:

    NSString *wayPath = [[NSBundle mainBundle] pathForResource:@"MyWay" ofType:@"mp3"];
    NSString *easyPath = [[NSBundle mainBundle] pathForResource:@"Way" ofType:@"mp4"];
    
    NSString *destPath = [[self composeDir] stringByAppendingString:@"FlyElephant.mp4"];
    
    if ([[NSFileManager defaultManager] fileExistsAtPath:destPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:destPath error:nil];
    }
    [self audioVideoMerge:[NSURL fileURLWithPath:wayPath] videoUrl:[NSURL fileURLWithPath:easyPath] destUrl:[NSURL fileURLWithPath:destPath]];

The core code:

- (void)audioVideoMerge:(NSURL *)audioUrl videoUrl:(NSURL *)videoUrl destUrl:(NSURL *)destUrl {
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    NSError *error;
    
    // Audio track: insert the whole audio file at time zero
    AVMutableCompositionTrack *audioCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audioUrl options:nil];
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
    [audioCompositionTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject] atTime:kCMTimeZero error:&error];
    
    // Video track: insert the whole video file at time zero
    AVMutableCompositionTrack *videoCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    [videoCompositionTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject] atTime:kCMTimeZero error:&error];
    
    // presetName and outputFileType must correspond; MPEG-4 matches the .mp4 destination
    AVAssetExportSession *assetExportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
    assetExportSession.outputURL = destUrl;
    assetExportSession.outputFileType = AVFileTypeMPEG4;
    assetExportSession.shouldOptimizeForNetworkUse = YES;
    [assetExportSession exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            if (assetExportSession.status == AVAssetExportSessionStatusCompleted) {
                NSLog(@"Export completed: %@", destUrl);
            } else {
                NSLog(@"Export failed: %@", assetExportSession.error);
            }
        });
    }];
}
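
As noted at the top, the result looks best when the audio and video durations match. If they may differ, one option is to clamp the audio insert to the shorter of the two durations so no stray audio tail or silent video remains. A hedged sketch of that variation, replacing the audio insert call inside the method above:

    // Clamp the audio range to whichever asset is shorter
    CMTime shared = CMTimeMinimum(audioAsset.duration, videoAsset.duration);
    CMTimeRange clampedRange = CMTimeRangeMake(kCMTimeZero, shared);
    [audioCompositionTrack insertTimeRange:clampedRange
                                   ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject]
                                    atTime:kCMTimeZero
                                     error:&error];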

Audio Trimming

Trimming is simpler than merging, since only a single file is involved:

    NSString *inputPath = [[self composeDir] stringByAppendingString:@"FlyElephant.m4a"];
    
    [self audioCrop:[NSURL fileURLWithPath:inputPath] startTime:CMTimeMake(30, 1) endTime:CMTimeMake(48, 1)];

The core code:

- (void)audioCrop:(NSURL *)url startTime:(CMTime)startTime endTime:(CMTime)endTime {
    
    NSString *outputPath = [[self composeDir] stringByAppendingPathComponent:@"Crop.m4a"];
    NSURL *audioFileOutput = [NSURL fileURLWithPath:outputPath];
    
    // Remove any previous output; the export session will not overwrite an existing file
    [[NSFileManager defaultManager] removeItemAtURL:audioFileOutput error:NULL];
    AVAsset *asset = [AVAsset assetWithURL:url];
    
    AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:asset
                                                                            presetName:AVAssetExportPresetAppleM4A];
    // Only the segment between startTime and endTime is exported
    CMTimeRange exportTimeRange = CMTimeRangeFromTimeToTime(startTime, endTime);
    
    exportSession.outputURL = audioFileOutput;
    exportSession.outputFileType = AVFileTypeAppleM4A;
    exportSession.timeRange = exportTimeRange;
    
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (AVAssetExportSessionStatusCompleted == exportSession.status) {
            NSLog(@" FlyElephant \n %@", outputPath);
        } else if (AVAssetExportSessionStatusFailed == exportSession.status) {
            NSLog(@"FlyElephant error: %@", exportSession.error.localizedDescription);
        }
    }];
}
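
The same timeRange property trims video as well; only the preset and file type change. A minimal sketch under that assumption, with the names videoCrop and videoFileOutput chosen for illustration:

- (void)videoCrop:(NSURL *)url startTime:(CMTime)startTime endTime:(CMTime)endTime {
    NSString *outputPath = [[self composeDir] stringByAppendingPathComponent:@"Crop.mp4"];
    NSURL *videoFileOutput = [NSURL fileURLWithPath:outputPath];
    [[NSFileManager defaultManager] removeItemAtURL:videoFileOutput error:NULL];
    
    AVAsset *asset = [AVAsset assetWithURL:url];
    AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:asset
                                                                            presetName:AVAssetExportPresetMediumQuality];
    exportSession.outputURL = videoFileOutput;
    exportSession.outputFileType = AVFileTypeMPEG4;
    // Same mechanism as the audio crop: export only the chosen segment
    exportSession.timeRange = CMTimeRangeFromTimeToTime(startTime, endTime);
    
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            NSLog(@"Video crop completed: %@", outputPath);
        } else {
            NSLog(@"Video crop error: %@", exportSession.error);
        }
    }];
}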

The file-path helper:

- (NSString *)composeDir {
    NSString *cacheDir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) firstObject];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    NSString *composeDir = [NSString stringWithFormat:@"%@/AudioCompose/", cacheDir];
    BOOL isDir = NO;
    BOOL existed = [fileManager fileExistsAtPath:composeDir isDirectory:&isDir];
    if (!existed || !isDir) {
        [fileManager createDirectoryAtPath:composeDir withIntermediateDirectories:YES attributes:nil error:nil];
    }
    return composeDir;
}

Those are the basic operations for merging and trimming audio and video. If anything is unclear, feel free to discuss in the comments~
