[iOS] Real-Time Waveforms for Music Playback and Recording

Overview:

  • This article covers drawing a real-time waveform from two sources: music decoded by a player, and the CMSampleBufferRef data captured live from the microphone.
  • The decibel calculation from a CMSampleBufferRef uses the Objective-C library GetVolumeLevelsFromeSampleBuffer; a Swift version, SampleBufferToVolumeLevels, is provided as a companion.
  • The music player is KSYMediaPlayer_iOS, which exposes a real-time buffer data callback, making it convenient for this demo.
  • The waveform is drawn with DYwaver, a modified version of kevinzhow/Waver.
  • If you have questions, or any comments or suggestions on the text below, you can leave a message at the end of the article, message me on Weibo (阳眼的熊1993), email me at [email protected], or visit my Github.
  • A demo is linked at the end of the article.
  • Full music/video effect: link link_2
    Waveform effect

A quick look at the code

  • Pod the required libraries
pod 'GetVolumeLevelsFromeSampleBuffer', :git => 'https://github.com/doubleYang1020/GetVolumeLevelsFromeSampleBuffer.git'
pod 'KSYMediaPlayer_iOS'
pod 'DYwaver', :git => 'https://github.com/doubleYang1020/DYwaver.git'
  • Import the headers
#import <KSYMediaPlayer/KSYMediaPlayer.h>
#import "SampleBufferToVolumeLevelsEngine.h"
#import "DYwaver.h"
  • Initialize the player. `+(float)getVolumeLevelsFromeSampleBuffer:(CMSampleBufferRef)sampleBuffer;` converts a sample buffer into a volume value.
-(void)createMediaPlayer{
    __weak typeof(self) weakSelf = self;
    _player = [[KSYMoviePlayerController alloc] initWithContentURL:[NSURL URLWithString:@"http://music.fffffive.com/1500458888991.mp3"]];
    [_player setShouldAutoplay:true];
    [_player setShouldLoop:true];
    // Called for every decoded audio buffer; convert it to a volume level.
    _player.audioDataBlock = ^(CMSampleBufferRef buf){
        float volume = [SampleBufferToVolumeLevelsEngine getVolumeLevelsFromeSampleBuffer:buf];
        weakSelf.audioVolume = volume;
    };
    [_player prepareToPlay];
}
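The helper above boils a CMSampleBufferRef down to a single volume number. A minimal Swift sketch of the same idea, assuming 16-bit interleaved PCM (the published library's actual implementation may differ), could look like:

```swift
import AVFoundation

// Sketch: average the magnitudes of the 16-bit PCM samples in the buffer.
// This is an assumed implementation, not the library's verbatim code.
func volumeLevel(from sampleBuffer: CMSampleBuffer) -> Float {
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return 0 }
    var length = 0
    var dataPointer: UnsafeMutablePointer<Int8>?
    guard CMBlockBufferGetDataPointer(blockBuffer,
                                      atOffset: 0,
                                      lengthAtOffsetOut: nil,
                                      totalLengthOut: &length,
                                      dataPointerOut: &dataPointer) == kCMBlockBufferNoErr,
          let pointer = dataPointer else { return 0 }

    let sampleCount = length / MemoryLayout<Int16>.size
    guard sampleCount > 0 else { return 0 }

    var sum: Float = 0
    pointer.withMemoryRebound(to: Int16.self, capacity: sampleCount) { samples in
        for i in 0..<sampleCount {
            sum += abs(Float(samples[i]))
        }
    }
    return sum / Float(sampleCount)
}
```

Averaging magnitudes (rather than taking a peak) keeps the waveform from spiking on single loud samples.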
  • Initialize the DYwaver view
-(void)createWaverView{
    __weak typeof(self) weakSelf = self;
    UIColor* redColor = [UIColor colorWithRed:255.0/255.0 green:0.0/255.0 blue:0.0/255.0 alpha:1.0];
    UIColor* orangeColor = [UIColor colorWithRed:255.0/255.0 green:165.0/255.0 blue:0.0/255.0 alpha:1.0];
    UIColor* yellowColor = [UIColor colorWithRed:255.0/255.0 green:255.0/255.0 blue:0.0/255.0 alpha:1.0];
    UIColor* greenColor = [UIColor colorWithRed:0.0/255.0 green:255.0/255.0 blue:0.0/255.0 alpha:1.0];
    UIColor* cyanColor = [UIColor colorWithRed:0.0/255.0 green:127.0/255.0 blue:255.0/255.0 alpha:1.0];
    UIColor* blueColor = [UIColor colorWithRed:0.0/255.0 green:0.0/255.0 blue:255.0/255.0 alpha:1.0];
    UIColor* purpleColor = [UIColor colorWithRed:139.0/255.0 green:0.0/255.0 blue:255.0/255.0 alpha:1.0];
    NSArray* colorsAry = [NSArray arrayWithObjects:redColor,orangeColor,yellowColor,greenColor,cyanColor,blueColor,purpleColor, nil];
    _waver = [[Waver alloc] initWithFrame:CGRectMake(0, [UIScreen mainScreen].bounds.size.height / 2.0 - 40.0, [UIScreen mainScreen].bounds.size.width, 80.0) andNumberOfWaves:7 andWavesColors:colorsAry andDecorativeWavesWidth:1.0];
    [_waver setUserInteractionEnabled:false];
    [self.view addSubview:_waver];
    __weak Waver * weakWaver = _waver;
    _waver.waverLevelCallback = ^(Waver * waver) {
        if (weakSelf.audioVolume) {
            float normalizedValue = pow(10, weakSelf.audioVolume/40000) - 1.0;
            weakWaver.level = normalizedValue * 1.0;
        }
    };
}
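The waver callback maps the helper's raw volume to a wave level with `pow(10, volume/40000) - 1`. Pulled out as a standalone sketch (the function name is illustrative; only the formula comes from the demo):

```swift
import Foundation

// The constant 40000 is this demo's own tuning value: a silent buffer
// (volume 0) maps to level 0, and the level grows exponentially as the
// average sample magnitude rises.
func waveLevel(fromVolume volume: Float) -> Float {
    return pow(10, volume / 40000) - 1.0
}

// For example, a quiet buffer with average magnitude 2000 yields
// pow(10, 0.05) - 1 ≈ 0.12, while a loud one at 20000 yields
// pow(10, 0.5) - 1 ≈ 2.16.
```

The exponential curve compresses quiet passages toward zero, so the waveform visibly reacts mostly to louder audio.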

How the real-time recording waveform works

  • Get the audio buffers in the AVCaptureAudioDataOutputSampleBufferDelegate callback
    optional public func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)

Note that AVCaptureVideoDataOutputSampleBufferDelegate delivers frames through this same callback method, so you must check whether the sample buffer actually comes from the audio output:

let isAudio = output is AVCaptureAudioDataOutput
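Putting the capture side together, a minimal session sketch (class and queue names are illustrative, not from the demo) that attaches an AVCaptureAudioDataOutput so the delegate method above starts receiving audio sample buffers:

```swift
import AVFoundation

final class AudioCaptureController: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "audio.capture.queue")

    func start() {
        // Wire the default microphone into the session.
        guard let device = AVCaptureDevice.default(for: .audio),
              let input = try? AVCaptureDeviceInput(device: device) else { return }
        if session.canAddInput(input) { session.addInput(input) }

        // The audio data output calls the delegate on a background queue.
        let output = AVCaptureAudioDataOutput()
        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // If a video data output shares this delegate method, filter first.
        guard output is AVCaptureAudioDataOutput else { return }
        // ... convert sampleBuffer to a volume level here ...
    }
}
```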
  • Once you have the audio sample buffer as above, convert it to a volume with SampleBufferToVolumeLevels; from there the approach is the same as the player case.
    self.waver.waverLevelCallback = { [weak self] waver in
      guard let volume = self?.audioVolume else {
        return
      }
      let normalizedValue = pow(10, volume / 40000) - 1.0
      waver?.level = CGFloat(normalizedValue)
    }
  • Demo link:
